WorldWideScience

Sample records for lsst camera optics

  1. Mechanical Design of the LSST Camera

    Energy Technology Data Exchange (ETDEWEB)

    Nordby, Martin; Bowden, Gordon; Foss, Mike; Guiffre, Gary; /SLAC; Ku, John; /Unlisted; Schindler, Rafe; /SLAC

    2008-06-13

    The LSST camera is a tightly packaged, hermetically sealed system that is cantilevered into the main beam of the LSST telescope. It comprises three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2-gigapixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce the impact on image stability. Focal plane and optics motions must also be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat will be detailed.

  2. The LSST Camera 500 watt -130 degC Mixed Refrigerant Cooling System

    Energy Technology Data Exchange (ETDEWEB)

    Bowden, Gordon B.; Langton, Brian J.; /SLAC; Little, William A.; /MMR-Technologies, Mountain View, CA; Powers, Jacob R; Schindler, Rafe H.; /SLAC; Spektor, Sam; /MMR-Technologies, Mountain View, CA

    2014-05-28

    The LSST Camera has a higher cryogenic heat load than previous CCD telescope cameras due to its large size (634 mm diameter focal plane, 3.2 gigapixels) and its close-coupled front-end electronics operating at low temperature inside the cryostat. Various refrigeration technologies were considered for this telescope/camera environment, and MMR-Technologies' mixed-refrigerant technology was chosen; a collaboration with that company was started in 2009. The system, based on a cluster of Joule-Thomson refrigerators running a special blend of mixed refrigerants, is described. Both the advantages and problems of applying this technology to telescope camera refrigeration are discussed. Test results from a prototype refrigerator running in a realistic telescope configuration are reported, and current and future stages of the development program are described. (auth)

  3. The European perspective for LSST

    Science.gov (United States)

    Gangler, Emmanuel

    2017-06-01

    LSST is a next-generation telescope that will produce an unprecedented data flow. The project goal is to deliver data products such as images and catalogs, enabling scientific analysis for a wide community of users. As a large-scale survey, LSST data will be complementary to those of other facilities in a wide range of scientific domains, including data from ESA or ESO. European countries have invested in LSST since 2007, in the construction of the camera as well as in the computing effort. The latter will be instrumental in designing the next step: how to distribute LSST data to Europe. Astroinformatics challenges for LSST thus include not only the analysis of LSST big data, but also the practical efficiency of data access.

  4. LSST camera readout chip ASPIC: test tools

    Science.gov (United States)

    Antilogus, P.; Bailly, Ph; Jeglot, J.; Juramy, C.; Lebbolo, H.; Martin, D.; Moniez, M.; Tocut, V.; Wicek, F.

    2012-02-01

    The LSST camera will have more than 3000 video-processing channels. The readout of this large focal plane requires a very compact readout chain. The Correlated Double Sampling (CDS) technique, which is generally used for CCD signal readout, is also adopted for this application and implemented with the so-called dual-slope integrator method. We have designed and implemented an ASIC for LSST: the Analog Signal Processing asIC (ASPIC). The goal is to amplify the signal close to the output, in order to maximize the signal-to-noise ratio, and to send differential outputs to the digitization stage. Other requirements are that each chip should process the output of half a CCD, that is, 8 channels, and should operate at 173 K. A specific back-end board has been designed especially for lab test purposes. It manages the clock signals, digitizes the analog differential outputs of the ASPIC, and stores the data in memory. It contains 8 ADCs (18 bits), 512 kwords of memory, and a USB interface. An FPGA manages all signals to and from the components on the board and generates the timing sequence for the ASPIC; its firmware is written in Verilog and VHDL. Internal registers define the various test parameters of the ASPIC, and a LabVIEW GUI loads or updates these registers and checks for proper operation. Several series of tests, including linearity, noise, and crosstalk, have been performed over the past year to characterize the ASPIC at room and cold temperatures. At present, the ASPIC, back-end board, and CCD detectors are being integrated to characterize the whole readout chain.
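The dual-slope integrator realizes Correlated Double Sampling by integrating the reset (baseline) level with one polarity and the video level with the opposite polarity, so any offset common to both windows (e.g. the kTC reset level) cancels. A minimal numerical sketch of that differencing idea; the function name and sample values are illustrative, not the ASPIC design values:

```python
def dual_slope_cds(reset_samples, signal_samples):
    """Average the reset window and the signal window, then difference
    them; an offset present in both windows drops out of the result."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(signal_samples) - mean(reset_samples)

# A pixel whose true signal is 100 units on top of a 7-unit common offset:
offset = 7.0
reset = [offset] * 8            # samples taken during the reset window
signal = [offset + 100.0] * 8   # samples taken during the signal window
print(dual_slope_cds(reset, signal))  # → 100.0
```

The offset value never appears in the output, which is the point of CDS: slow drifts and reset noise common to both integration windows are rejected.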

  5. LSST camera readout chip ASPIC: test tools

    International Nuclear Information System (INIS)

    Antilogus, P; Bailly, Ph; Juramy, C; Lebbolo, H; Martin, D; Jeglot, J; Moniez, M; Tocut, V; Wicek, F

    2012-01-01

    The LSST camera will have more than 3000 video-processing channels. The readout of this large focal plane requires a very compact readout chain. The Correlated Double Sampling (CDS) technique, which is generally used for CCD signal readout, is also adopted for this application and implemented with the so-called dual-slope integrator method. We have designed and implemented an ASIC for LSST: the Analog Signal Processing asIC (ASPIC). The goal is to amplify the signal close to the output, in order to maximize the signal-to-noise ratio, and to send differential outputs to the digitization stage. Other requirements are that each chip should process the output of half a CCD, that is, 8 channels, and should operate at 173 K. A specific back-end board has been designed especially for lab test purposes. It manages the clock signals, digitizes the analog differential outputs of the ASPIC, and stores the data in memory. It contains 8 ADCs (18 bits), 512 kwords of memory, and a USB interface. An FPGA manages all signals to and from the components on the board and generates the timing sequence for the ASPIC; its firmware is written in Verilog and VHDL. Internal registers define the various test parameters of the ASPIC, and a LabVIEW GUI loads or updates these registers and checks for proper operation. Several series of tests, including linearity, noise, and crosstalk, have been performed over the past year to characterize the ASPIC at room and cold temperatures. At present, the ASPIC, back-end board, and CCD detectors are being integrated to characterize the whole readout chain.

  6. LSST telescope and site status

    Science.gov (United States)

    Gressler, William J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) Project received its construction authorization from the National Science Foundation in August 2014. The Telescope and Site (T and S) group has made considerable progress toward completing the subsystems required to support the scope of the LSST science mission. The LSST goal is to conduct a wide, fast, deep survey via a 3-mirror, wide-field-of-view optical design, a 3.2-Gpixel camera, and an automated data processing system. The summit facility is currently under construction on Cerro Pachón in Chile, with major vendor subsystem deliveries and integration planned over the next several years. This paper summarizes the status of the activities of the T and S group, tasked with design, analysis, and construction of the summit and base facilities and the infrastructure necessary to control the survey, capture the light, and calibrate the data. All major telescope work-package procurements have been awarded to vendors and are in varying stages of design and fabrication maturity. The unique M1M3 primary/tertiary mirror polishing effort is complete, and the mirror now resides in storage awaiting future testing. Significant progress has been achieved on all the major telescope subsystems, including the summit facility, telescope mount assembly, dome, hexapod and rotator systems, coating plant, base facility, and the calibration telescope. In parallel, in-house efforts, including the software needed to control the observatory such as the scheduler and the active optics control, have also seen substantial advancement. The progress and status of these subsystems and future LSST plans during this construction phase are presented.

  7. LSST and the Epoch of Reionization Experiments

    Science.gov (United States)

    Ivezić, Željko

    2018-05-01

    The Large Synoptic Survey Telescope (LSST), a next-generation astronomical survey sited on Cerro Pachón in Chile, will provide an unprecedented amount of imaging data for studies of the faint optical sky. The LSST system includes an 8.4m (6.7m effective) primary mirror and a 3.2-gigapixel camera with a 9.6 sq. deg. field of view. This system will enable about 10,000 sq. deg. of sky to be covered twice per night, every three to four nights on average, with a typical 5-sigma depth for point sources of r = 24.5 (AB). With over 800 observations in the ugrizy bands over a 10-year period, these data will enable coadded images reaching r = 27.5 (about 5 magnitudes deeper than SDSS) as well as studies of faint time-domain astronomy. The measured properties of newly discovered and known astrometric and photometric transients will be publicly reported within 60 sec of closing the shutter. The resulting hundreds of petabytes of imaging data for about 40 billion objects will be used for scientific investigations ranging from the properties of near-Earth asteroids to characterizations of dark matter and dark energy. For example, simulations estimate that LSST will discover about 1,000 quasars at redshifts exceeding 7; this sample will place tight constraints on the cosmic environment at the end of the reionization epoch. In addition to a brief introduction to LSST, I review the value of LSST data in support of epoch of reionization experiments and discuss how international participants can join LSST.
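The quoted coadded depth follows from the usual idealized stacking rule: N equal-depth exposures improve the point-source signal-to-noise by sqrt(N), i.e. the 5-sigma limit deepens by 2.5*log10(sqrt(N)) = 1.25*log10(N) magnitudes. A sketch of that arithmetic, where the assumed ~184 r-band visits (one band's share of the ~800 total) is a hypothetical number, not an LSST specification:

```python
import math

def coadded_depth(m_single, n_visits):
    """Idealized 5-sigma coadded depth: stacking n_visits equal exposures
    improves S/N by sqrt(n_visits), i.e. 1.25*log10(n_visits) mag."""
    return m_single + 1.25 * math.log10(n_visits)

# Single-visit depth r = 24.5 and an assumed ~184 r-band visits:
print(round(coadded_depth(24.5, 184), 1))  # → 27.3
```

The idealized result (~27.3) is close to the quoted final depth of r = 27.5; real coadds also depend on per-visit conditions and systematics, so this is only an order-of-magnitude check.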

  8. LSST beam simulator

    International Nuclear Information System (INIS)

    Tyson, J A; Klint, M; Sasian, J; Claver, C; Muller, G; Gilmor, K

    2014-01-01

    It is always important to test new imagers for a mosaic camera before device acceptance and construction of the mosaic. This is particularly true of the LSST CCDs because of the fast beam illumination: at long wavelengths there can be significant beam divergence (defocus) inside the silicon because of the long absorption length for photons near the band gap. Moreover, realistic sky scenes need to be projected onto the CCD focal plane. Thus, we need to design and build an f/1.2 re-imaging system. The system must simulate the entire LSST operation, including a sky with galaxies and stars with approximately black-body spectra superimposed on a spatially diffuse night-sky emission with its complex spectral features.

  9. LSST Painting Risk Evaluation Memo

    Energy Technology Data Exchange (ETDEWEB)

    Wolfe, Justin E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-11-10

    The optics subsystem is required to paint the edges of the optics black where possible. Due to the risks in applying the paint, LSST requests a review of the impact of removing this requirement for the filters and L3.

  10. Big Software for Big Data: Scaling Up Photometry for LSST (Abstract)

    Science.gov (United States)

    Rawls, M.

    2017-06-01

    (Abstract only) The Large Synoptic Survey Telescope (LSST) will capture mosaics of the sky every few nights, each containing more data than your computer's hard drive can store. As a result, the software to process these images is as critical to the science as the telescope and the camera. I discuss the algorithms and software being developed by the LSST Data Management team to handle such a large volume of data. All of our work is open source and available to the community. Once LSST comes online, our software will produce catalogs of objects and a stream of alerts. These will bring exciting new opportunities for follow-up observations and collaborations with LSST scientists.

  11. The Future of the Time Domain with LSST

    Science.gov (United States)

    Walkowicz, Lucianne M.

    2012-04-01

    In the coming decade, LSST's combination of all-sky coverage, consistent long-term monitoring, and flexible criteria for event identification will revolutionize studies of a wide variety of astrophysical phenomena. Time-domain science with LSST encompasses objects both familiar and exotic, from classical variables within our Galaxy to explosive cosmological events. Increased sample sizes of known-but-rare observational phenomena will quantify their distributions for the first time, thus challenging existing theories. Perhaps most excitingly, LSST will provide the opportunity to sample previously untouched regions of parameter space. LSST will generate `alerts' within 60 seconds of detecting a new transient, permitting the community to follow up unusual events in greater detail. However, follow-up will remain a challenge, as the volume of transients will easily saturate available spectroscopic resources. Characterization of events and access to appropriate ancillary data (e.g. from prior observations, either in the optical or in other passbands) will be of the utmost importance in prioritizing follow-up observations. The incredible scientific opportunities and unique challenges afforded by LSST demand organization, forethought, and creativity from the astronomical community. To learn more about the telescope specifics and survey design, as well as to obtain an overview of the variety of scientific investigations that LSST will enable, readers are encouraged to look at the LSST Science Book: http://www.lsst.org/lsst/scibook. Organizational details of the LSST science collaborations and management may be found at http://www.lsstcorp.org.

  12. The LSST operations simulator

    Science.gov (United States)

    Delgado, Francisco; Saha, Abhijit; Chandrasekharan, Srinivasan; Cook, Kem; Petry, Catherine; Ridgway, Stephen

    2014-08-01

    The Operations Simulator for the Large Synoptic Survey Telescope (LSST; http://www.lsst.org) allows the planning of LSST observations that obey explicit science-driven observing specifications, patterns, schemata, and priorities, while optimizing against the constraints imposed by the design-specific opto-mechanical performance of the telescope facility, site-specific conditions, and scheduled and unscheduled downtime. It has a detailed model to simulate external conditions using real weather-history data from the site, a fully parameterized kinematic model for the internal conditions of the telescope, camera, and dome, and it serves as a prototype for an automatic scheduler for real-time survey operations with LSST. The Simulator is a critical tool that has been key since very early in the project to help validate the design parameters of the observatory against the science requirements and the goals of specific science programs. A simulation run records the characteristics of all observations (e.g., epoch, sky position, seeing, sky brightness) in a MySQL database, which can be queried for any desired purpose. Derivative information digests of the observing history are made with an analysis package called Simulation Survey Tools for Analysis and Reporting (SSTAR). Merit functions and metrics have been designed to examine how suitable a specific simulation run is for several different science applications. Software to efficiently compare the efficacy of different survey strategies for a wide variety of science applications, using such a growing set of metrics, is under development. A recent restructuring of the code allows us to (a) use "look-ahead" strategies that avoid cadence sequences that cannot be completed due to observing constraints, and (b) examine alternate optimization strategies, so that the most efficient scheduling algorithm(s) can be identified and used: even few-percent efficiency gains will yield substantive scientific returns.
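Because a simulation run lands in an ordinary relational table, metrics reduce to plain SQL over the observation history. A sketch of that workflow using SQLite in place of MySQL, with a hypothetical cut-down schema (the real OpSim output has many more columns and different names):

```python
import sqlite3

# Hypothetical cut-down schema; the real OpSim database differs.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE observations (
    epoch REAL, ra REAL, dec REAL, band TEXT,
    seeing REAL, sky_brightness REAL)""")
db.executemany(
    "INSERT INTO observations VALUES (?, ?, ?, ?, ?, ?)",
    [(59853.1, 10.2, -30.1, 'r', 0.8, 21.2),
     (59853.2, 10.2, -30.1, 'g', 0.9, 21.9),
     (59856.1, 10.2, -30.1, 'r', 1.1, 21.0)])

# A metric-style query: visit count and mean seeing per band.
for band, n_visits, mean_seeing in db.execute(
        "SELECT band, COUNT(*), AVG(seeing) FROM observations "
        "GROUP BY band ORDER BY band"):
    print(band, n_visits, round(mean_seeing, 2))
```

Analysis tools like SSTAR operate in essentially this way: they never touch the scheduler itself, only the recorded observing history.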

  13. LSST summit facility construction progress report: reacting to design refinements and field conditions

    Science.gov (United States)

    Barr, Jeffrey D.; Gressler, William; Sebag, Jacques; Seriche, Jaime; Serrano, Eduardo

    2016-07-01

    The civil work, site infrastructure, and buildings for the summit facility of the Large Synoptic Survey Telescope (LSST) are among the first major elements that need to be designed, bid, and constructed to support the subsequent integration of the dome, telescope, optics, camera, and supporting systems. As the contracts for those other major subsystems now move forward under the management of the LSST Telescope and Site (T and S) team, there has been inevitable and beneficial evolution in their designs, which has resulted in significant modifications to the facility and infrastructure. The earliest design requirements for the LSST summit facility were first documented in 2005, its contracted full design was initiated in 2010, and construction began in January 2015. During that entire development period, and extending now roughly halfway through construction, there continue to be necessary modifications to the facility design, resulting from the refinement of interfaces to other major elements of the LSST project and now, during construction, from unanticipated field conditions. Changes from evolving interfaces have principally involved the telescope mount, the dome, and the mirror handling/coating facilities, and have included significant variations in mass, dimensions, heat loads, and anchorage conditions. Modifications related to field conditions have included specifying and testing alternative methods of excavation and contending with the lack of competent rock substrate where it was predicted to be. While these and other necessary changes are somewhat specific to the LSST project and site, they also exemplify inherent challenges related to the typical timeline for the design and construction of astronomical observatory support facilities relative to the overall development of the project.

  14. Scientific Synergy between LSST and Euclid

    Science.gov (United States)

    Rhodes, Jason; Nichol, Robert C.; Aubourg, Éric; Bean, Rachel; Boutigny, Dominique; Bremer, Malcolm N.; Capak, Peter; Cardone, Vincenzo; Carry, Benoît; Conselice, Christopher J.; Connolly, Andrew J.; Cuillandre, Jean-Charles; Hatch, N. A.; Helou, George; Hemmati, Shoubaneh; Hildebrandt, Hendrik; Hložek, Renée; Jones, Lynne; Kahn, Steven; Kiessling, Alina; Kitching, Thomas; Lupton, Robert; Mandelbaum, Rachel; Markovic, Katarina; Marshall, Phil; Massey, Richard; Maughan, Ben J.; Melchior, Peter; Mellier, Yannick; Newman, Jeffrey A.; Robertson, Brant; Sauvage, Marc; Schrabback, Tim; Smith, Graham P.; Strauss, Michael A.; Taylor, Andy; Von Der Linden, Anja

    2017-12-01

    Euclid and the Large Synoptic Survey Telescope (LSST) are poised to dramatically change the astronomy landscape early in the next decade. The combination of high-cadence, deep, wide-field optical photometry from LSST with high-resolution, wide-field optical photometry and near-infrared photometry and spectroscopy from Euclid will be powerful for addressing a wide range of astrophysical questions. We explore Euclid/LSST synergy, ignoring the political issues associated with data access to focus on the scientific, technical, and financial benefits of coordination. We focus primarily on dark energy cosmology, but also discuss galaxy evolution, transient objects, solar system science, and galaxy cluster studies. We concentrate on synergies that require coordination in cadence or survey overlap, or that would benefit from pixel-level co-processing beyond the scope of what is currently planned, rather than scientific programs that could be accomplished at the catalog level without coordination in data processing or survey strategies. We provide two quantitative examples of scientific synergies: the decrease in photo-z errors (benefiting many science cases) when high-resolution Euclid data are used for LSST photo-z determination, and the resulting increase in weak-lensing signal-to-noise ratio from smaller photo-z errors. We briefly discuss other areas of coordination, including high-performance computing resources and calibration data. Finally, we address concerns about the loss of independence and potential cross-checks between the two missions and the potential consequences of not collaborating.

  15. Management evolution in the LSST project

    Science.gov (United States)

    Sweeney, Donald; Claver, Charles; Jacoby, Suzanne; Kantor, Jeffrey; Krabbendam, Victor; Kurita, Nadine

    2010-07-01

    The Large Synoptic Survey Telescope (LSST) project has evolved from just a few staff members in 2003 to about 100 in 2010, and the affiliation of four founding institutions has grown to 32 universities, government laboratories, and industrial partners. The public-private collaboration aims to complete the estimated $450M observatory in the 2017 timeframe. During the design phase of the project, from 2003 to the present, the management structure has been remarkably stable, while the funding levels, staffing levels, and scientific-community participation have grown dramatically. The LSSTC has introduced the project controls and tools required to manage the LSST's complex funding model, technical structure, and distributed workforce. Project controls have been configured to comply with the requirements of federal funding agencies. Some of these tools, for risk management, configuration control, and resource-loaded scheduling, have been effective, and others have not. Technical tasks associated with building the LSST are distributed into three subsystems: Telescope & Site, Camera, and Data Management. Each subsystem has its own experienced Project Manager and System Scientist. Delegation of authority is enabling and effective; it encourages a strong sense of ownership within the project. At the project level, subsystem management follows the principle that there is one Board of Directors, Director, and Project Manager with overall authority.

  16. LSST: Education and Public Outreach

    Science.gov (United States)

    Bauer, Amanda; Herrold, Ardis; LSST Education and Public Outreach Team

    2018-01-01

    The Large Synoptic Survey Telescope (LSST) will conduct a 10-year wide, fast, and deep survey of the night sky starting in 2022. LSST Education and Public Outreach (EPO) will enable public access to a subset of LSST data so anyone can explore the universe and be part of the discovery process. LSST EPO aims to facilitate a pathway from entry-level exploration of astronomical imagery to more sophisticated interaction with LSST data using tools similar to those professional astronomers use. To deliver data to the public, LSST EPO is creating an online Portal to serve as the main hub of EPO activities. The Portal will host an interactive Skyviewer, access to LSST data for educators and the public through online Jupyter notebooks, original multimedia for informal science centers and planetariums, and citizen science projects that use LSST data. LSST EPO will engage with the Chilean community through Spanish-language components of the Portal and will partner with organizations serving underrepresented groups in STEM.

  17. Dark Energy Studies with LSST Image Simulations, Final Report

    International Nuclear Information System (INIS)

    Peterson, John Russell

    2016-01-01

    This grant funded the development and dissemination of the Photon Simulator (PhoSim) for the purpose of studying dark energy at high precision with the upcoming Large Synoptic Survey Telescope (LSST) astronomical survey. The work was in collaboration with the LSST Dark Energy Science Collaboration (DESC). Several detailed physics improvements were made in the optics, atmosphere, and sensor, a number of validation studies were performed, and a significant number of usability features were implemented. Future work in DESC will use PhoSim as the image simulation tool for data challenges used by the analysis groups.

  18. Investigating interoperability of the LSST data management software stack with Astropy

    Science.gov (United States)

    Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15 TB per night, and the requirements both to issue alerts on transient sources within 60 seconds of observing and to create annual data releases, mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open-source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community about how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.

  19. Characterization of Prototype LSST CCDs

    Energy Technology Data Exchange (ETDEWEB)

    OCONNOR,P.; FRANK, J.; GEARY, J.C.; GILMORE, D.K.; KOTOV, I.; RADEKA, V.; TAKACS, P.; TYSON, J.A.

    2008-06-23

    The ambitious science goals of the Large Synoptic Survey Telescope (LSST) will be achieved in part by a wide-field imager that will deliver a new level of performance in terms of area, speed, and sensitivity. The instrument performance is dominated by the focal plane sensors, which are now in development. These new-generation sensors will make use of advanced semiconductor technology and will be complemented by a highly integrated electronics package located inside the cryostat. A test laboratory has been set up at Brookhaven National Laboratory (BNL) to characterize prototype sensors and to develop test and assembly techniques for eventual integration of production sensors and electronics into modules that will form the final focal plane. As described in [1], the key requirements for LSST sensors are wideband quantum efficiency (QE) extending beyond 1 μm in the red, control of the point spread function (PSF), and fast readout using multiple amplifiers per chip operated in parallel. In addition, LSST's fast optical system (f/1.25) places severe constraints on focal plane flatness. At the chip level this involves packaging techniques to minimize warpage of the silicon die, and at the mosaic level careful assembly and metrology to achieve high coplanarity of the sensor tiles. In view of the long lead time to develop the needed sensor technology, LSST undertook a study program with several vendors to fabricate and test devices which address the most critical performance features [2]. The remainder of this paper presents key results of this study program. Section 2 summarizes the sensor requirements and the results of design optimization studies, and Section 3 presents the sensor development plan. In Section 4 we describe the test bench at BNL. Section 5 reports measurement results obtained to date on devices fabricated by several vendors. Section 6 presents a summary of the paper and an outlook for future work.

  20. Machine Learning-based Transient Brokers for Real-time Classification of the LSST Alert Stream

    Science.gov (United States)

    Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika; ANTARES Collaboration

    2018-01-01

    The number of transient events discovered by wide-field time-domain surveys already far outstrips the combined follow-up resources of the astronomical community. This number will only increase as we progress toward the commissioning of the Large Synoptic Survey Telescope (LSST), breaking the community's current follow-up paradigm. Transient brokers, software to sift through, characterize, annotate, and prioritize events for follow-up, will be a critical tool for managing alert streams in the LSST era. Developing the algorithms that underlie the brokers, and obtaining simulated LSST-like datasets prior to LSST commissioning to train and test them, are formidable though not insurmountable challenges. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. We have been developing completely automated methods to characterize and classify variable and transient events from their multiband optical photometry. We describe the hierarchical ensemble machine learning algorithm we are developing and test its performance on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, and we report our progress toward incorporating these methods into a real-time event broker working on live alert streams from time-domain surveys.

  1. Investigating the Bright End of LSST Photometry

    Science.gov (United States)

    Ojala, Elle; Pepper, Joshua; LSST Collaboration

    2018-01-01

    The Large Synoptic Survey Telescope (LSST) will begin operations in 2022, conducting a wide-field, synoptic, multiband survey of the southern sky. Some fraction of objects at the bright end of the magnitude regime observed by LSST will overlap with other wide-sky surveys, allowing for calibration and cross-checking between surveys. The LSST is optimized for observations of very faint objects, so much of this data overlap will consist of saturated images. This project provides the first in-depth analysis of saturation in LSST images. Using the PhoSim package to create simulated LSST images, we evaluate the saturation properties of several types of stars to determine the brightness limitations of LSST. We also collect metadata from many wide-field photometric surveys to provide cross-survey accounting and comparison. Additionally, we evaluate the accuracy of the PhoSim modeling parameters to determine the reliability of the software. These efforts will allow us to determine the expected usable data overlap between bright-end LSST images and faint-end images in other wide-sky surveys. Our next steps are developing methods to extract photometry from saturated images. This material is based upon work supported in part by the National Science Foundation through Cooperative Agreement 1258333 managed by the Association of Universities for Research in Astronomy (AURA), and the Department of Energy under Contract No. DE-AC02-76SF00515 with the SLAC National Accelerator Laboratory. Additional LSST funding comes from private donations, grants to universities, and in-kind support from LSSTC Institutional Members. Thanks to NSF grant PHY-135195 and the 2017 LSSTC Grant Award #2017-UG06 for making this project possible.

  2. LSST Resources for the Community

    Science.gov (United States)

    Jones, R. Lynne

    2011-01-01

    LSST will generate 100 petabytes of images and 20 petabytes of catalogs, covering 18,000-20,000 square degrees sampled every few days over a total of ten years -- all publicly available and exquisitely calibrated. The primary access to this data will be through Data Access Centers (DACs). DACs will provide access to catalogs of sources (single detections from individual images) and objects (associations of sources from multiple images). Simple user interfaces or direct SQL queries at the DAC can return user-specified portions of data from catalogs or images. More complex manipulations of the data, such as calculating multi-point correlation functions or creating alternative photo-z measurements on terabyte-scale data, can be completed with the DAC's own resources. Even more data-intensive computations requiring access to large numbers of image pixels at petabyte scale could also be conducted at the DAC, using compute resources allocated in a manner similar to a TAC. DAC resources will be available to all individuals in member countries or institutes and LSST science collaborations. DACs will also assist investigators with requests for allocations at national facilities such as the Petascale Computing Facility, TeraGrid, and Open Science Grid. Using data on this scale requires new approaches to accessibility and analysis which are being developed through interactions with the LSST Science Collaborations. We are producing simulated images (as might be acquired by LSST) based on models of the universe and generating catalogs from these images (as well as from the base model) using the LSST data management framework in a series of data challenges. The resulting images and catalogs are being made available to the science collaborations to verify the algorithms and develop user interfaces. All LSST software is open source and available online, including preliminary catalog formats. We encourage feedback from the community.

  3. Formal Education with LSST

    Science.gov (United States)

    Herrold, Ardis; Bauer, Amanda, Dr.; Peterson, J. Matt; Large Synoptic Survey Telescope Education and Public Outreach Team

    2018-01-01

    The Large Synoptic Survey Telescope will usher in a new age of astronomical data exploration for science educators and students. LSST data sets will be large, deep, and dynamic, and will establish a time-domain record that will extend over a decade. They will be used to provide engaging, relevant learning experiences. The EPO Team will develop online investigations using authentic LSST data that offer varying levels of challenge and depth by the start of telescope operations, slated to begin in 2022. The topics will cover common introductory astronomy concepts, and will align with the four science domains of LSST: The Milky Way, the changing sky (transients), solar system (moving) objects, and dark matter and dark energy. Online Jupyter notebooks will make LSST data easily available to access and analyze by students at the advanced middle school through college levels. Using online notebooks will circumvent common obstacles caused by firewalls, bandwidth issues, and the need to download software, as they will be accessible from any computer or tablet with internet access. Although the LSST EPO Jupyter notebooks are Python-based, a knowledge of programming will not be required to use them. Each topical investigation will include teacher and student versions of Jupyter notebooks, instructional videos, and access to a suite of support materials including a forum, and professional development training and tutorial videos. Jupyter notebooks will contain embedded widgets to process data, eliminating the need to use external spreadsheets and plotting software. Students will be able to analyze data by using some of the existing modules already developed for professional astronomers. This will shorten the time needed to conduct investigations and will shift the emphasis to understanding the underlying science themes, which is often lost with novice learners.

  4. Compact Optical Technique for Streak Camera Calibration

    International Nuclear Information System (INIS)

    Bell, P; Griffith, R; Hagans, K; Lerche, R; Allen, C; Davies, T; Janson, F; Justin, R; Marshall, B; Sweningsen, O

    2004-01-01

    The National Ignition Facility (NIF) is under construction at the Lawrence Livermore National Laboratory (LLNL) for the U.S. Department of Energy Stockpile Stewardship Program. Optical streak cameras are an integral part of the experimental diagnostics instrumentation. To accurately reduce data from the streak cameras a temporal calibration is required. This article describes a technique for generating trains of precisely timed short-duration optical pulses (optical comb generators) that are suitable for temporal calibrations. These optical comb generators (Figure 1) are used with the LLNL optical streak cameras. They are small, portable light sources that produce a series of temporally short, uniformly spaced, optical pulses. Comb generators have been produced with 0.1, 0.5, 1, 3, 6, and 10-GHz pulse trains of 780-nm wavelength light with individual pulse durations of ∼25-ps FWHM. Signal output is via a fiber-optic connector. Signal is transported from comb generator to streak camera through multi-mode, graded-index optical fibers. At the NIF, ultra-fast streak cameras are used by the Laser Fusion Program experimentalists to record fast transient optical signals. Their temporal resolution is unmatched by any other transient recorder. Their ability to spatially discriminate an image along the input slit allows them to function as a one-dimensional image recorder, time-resolved spectrometer, or multichannel transient recorder. Depending on the choice of photocathode, they can be made sensitive to photon energies from 1.1 eV to 30 keV and beyond. Comb generators perform two important functions for LLNL streak-camera users. First, comb generators are used as precision time-mark generators for calibrating streak camera sweep rates. Accuracy is achieved by averaging many streak camera images of comb generator signals. Time-base calibrations with portable comb generators are easily done both in the calibration laboratory and in situ. Second, comb signals are applied

  5. Examining the Potential of LSST to Contribute to Exoplanet Discovery

    Science.gov (United States)

    Lund, Michael B.; Pepper, Joshua; Jacklin, Savannah; Stassun, Keivan G.

    2018-01-01

    The Large Synoptic Survey Telescope (LSST), currently under construction in Chile with scheduled first light in 2019, will be one of the major sources of data in the next decade and is one of the top priorities expressed in the last Decadal Survey. As LSST is intended to cover a range of science questions, the LSST community is still working on optimizing the observing strategy of the survey. With a survey area that will cover half the sky in 6 bands, providing photometric data on billions of stars from 16th to 24th magnitude, LSST can be leveraged to help contribute to exoplanet science. In particular, LSST has the potential to detect exoplanets around stellar populations that are not usually included in transiting exoplanet searches. This includes searching for exoplanets around red and white dwarfs, stars in the galactic plane and bulge, stellar clusters, and potentially even the Magellanic Clouds. In probing these varied stellar populations, relative exoplanet frequency can be examined, and in turn LSST may be able to provide fresh insight into how stellar environment plays a role in planetary formation rates. Our initial work on this project has been to demonstrate that even with the limitations of the LSST cadence, exoplanets would be recoverable and detectable in the LSST photometry, and to show that exoplanets are indeed worth including in discussions of the variable sources that LSST can contribute to. We have continued to expand this work to examine exoplanets around stars belonging to various stellar populations, both to show the types of systems that LSST is capable of discovering and to determine the potential exoplanet yields using standard algorithms already implemented in transiting exoplanet searches, as well as how changes to LSST's observing schedule may impact both of these results.

  6. Compact Optical Technique for Streak Camera Calibration

    International Nuclear Information System (INIS)

    Curt Allen; Terence Davies; Frans Janson; Ronald Justin; Bruce Marshall; Oliver Sweningsen; Perry Bell; Roger Griffith; Karla Hagans; Richard Lerche

    2004-01-01

    The National Ignition Facility is under construction at the Lawrence Livermore National Laboratory for the U.S. Department of Energy Stockpile Stewardship Program. Optical streak cameras are an integral part of the experimental diagnostics instrumentation. To accurately reduce data from the streak cameras a temporal calibration is required. This article describes a technique for generating trains of precisely timed short-duration optical pulses that are suitable for temporal calibrations

  7. Reliable and repeatable characterization of optical streak cameras

    International Nuclear Information System (INIS)

    Charest, Michael R. Jr.; Torres, Peter III; Silbernagel, Christopher T.; Kalantar, Daniel H.

    2008-01-01

    Optical streak cameras are used as primary diagnostics for a wide range of physics and laser experiments at facilities such as the National Ignition Facility. To meet the strict accuracy requirements needed for these experiments, the systematic nonlinearities of the streak cameras (attributed to nonlinearities in the optical and electrical components that make up the streak camera system) must be characterized. In some cases the characterization information is used as a guide to help determine how experiment data should be taken. In other cases, the characterization data are applied to the raw data images to correct for the nonlinearities. In order to characterize an optical streak camera, a specific set of data is collected, where the response to defined inputs is recorded. A set of analysis software routines has been developed to extract information such as spatial resolution, dynamic range, and temporal resolution from this data set. The routines are highly automated, requiring very little user input and thus provide very reliable and repeatable results that are not subject to interpretation. An emphasis on quality control has been placed on these routines due to the high importance of the camera characterization information.

  8. Reliable and Repeatable Characterization of Optical Streak Cameras

    International Nuclear Information System (INIS)

    Kalantar, D; Charest, M; Torres III, P; Charest, M

    2008-01-01

    Optical streak cameras are used as primary diagnostics for a wide range of physics and laser experiments at facilities such as the National Ignition Facility (NIF). To meet the strict accuracy requirements needed for these experiments, the systematic nonlinearities of the streak cameras (attributed to nonlinearities in the optical and electrical components that make up the streak camera system) must be characterized. In some cases the characterization information is used as a guide to help determine how experiment data should be taken. In other cases, the characterization data are applied to the raw data images to correct for the nonlinearities. In order to characterize an optical streak camera, a specific set of data is collected, where the response to defined inputs is recorded. A set of analysis software routines has been developed to extract information such as spatial resolution, dynamic range, and temporal resolution from this data set. The routines are highly automated, requiring very little user input and thus provide very reliable and repeatable results that are not subject to interpretation. An emphasis on quality control has been placed on these routines due to the high importance of the camera characterization information

  9. Reliable and Repeatable Characterization of Optical Streak Cameras

    Energy Technology Data Exchange (ETDEWEB)

    Michael Charest Jr., Peter Torres III, Christopher Silbernagel, and Daniel Kalantar

    2008-10-31

    Optical streak cameras are used as primary diagnostics for a wide range of physics and laser experiments at facilities such as the National Ignition Facility (NIF). To meet the strict accuracy requirements needed for these experiments, the systematic nonlinearities of the streak cameras (attributed to nonlinearities in the optical and electrical components that make up the streak camera system) must be characterized. In some cases the characterization information is used as a guide to help determine how experiment data should be taken. In other cases, the characterization data are applied to the raw data images to correct for the nonlinearities. In order to characterize an optical streak camera, a specific set of data is collected, where the response to defined inputs is recorded. A set of analysis software routines has been developed to extract information such as spatial resolution, dynamic range, and temporal resolution from this data set. The routines are highly automated, requiring very little user input and thus provide very reliable and repeatable results that are not subject to interpretation. An emphasis on quality control has been placed on these routines due to the high importance of the camera characterization information.

  10. Reliable and Repeatable Characterization of Optical Streak Cameras

    Energy Technology Data Exchange (ETDEWEB)

    Kalantar, D; Charest, M; Torres III, P; Charest, M

    2008-05-06

    Optical streak cameras are used as primary diagnostics for a wide range of physics and laser experiments at facilities such as the National Ignition Facility (NIF). To meet the strict accuracy requirements needed for these experiments, the systematic nonlinearities of the streak cameras (attributed to nonlinearities in the optical and electrical components that make up the streak camera system) must be characterized. In some cases the characterization information is used as a guide to help determine how experiment data should be taken. In other cases, the characterization data are applied to the raw data images to correct for the nonlinearities. In order to characterize an optical streak camera, a specific set of data is collected, where the response to defined inputs is recorded. A set of analysis software routines has been developed to extract information such as spatial resolution, dynamic range, and temporal resolution from this data set. The routines are highly automated, requiring very little user input and thus provide very reliable and repeatable results that are not subject to interpretation. An emphasis on quality control has been placed on these routines due to the high importance of the camera characterization information.

  11. Spurious Shear in Weak Lensing with LSST

    Energy Technology Data Exchange (ETDEWEB)

    Chang, C.; Kahn, S.M.; Jernigan, J.G.; Peterson, J.R.; AlSayyad, Y.; Ahmad, Z.; Bankert, J.; Bard, D.; Connolly, A.; Gibson, R.R.; Gilmore, K.; Grace, E.; Hannel, M.; Hodge, M.A.; Jee, M.J.; Jones, L.; Krughoff, S.; Lorenz, S.; Marshall, P.J.; Marshall, S.; Meert, A.

    2012-09-19

    The complete 10-year survey from the Large Synoptic Survey Telescope (LSST) will image ∼20,000 square degrees of sky in six filter bands every few nights, bringing the final survey depth to r ∼ 27.5, with over 4 billion well-measured galaxies. To take full advantage of this unprecedented statistical power, the systematic errors associated with weak lensing measurements need to be controlled to a level similar to the statistical errors. This work is the first attempt to quantitatively estimate the absolute level and statistical properties of the systematic errors on weak lensing shear measurements due to the most important physical effects in the LSST system via high fidelity ray-tracing simulations. We identify and isolate the different sources of algorithm-independent, additive systematic errors on shear measurements for LSST and predict their impact on the final cosmic shear measurements using conventional weak lensing analysis techniques. We find that the main source of the errors comes from an inability to adequately characterise the atmospheric point spread function (PSF) due to its high frequency spatial variation on angular scales smaller than ∼10′ in the single short exposures, which propagates into a spurious shear correlation function at the 10⁻⁴ to 10⁻³ level on these scales. With the large multi-epoch dataset that will be acquired by LSST, the stochastic errors average out, bringing the final spurious shear correlation function to a level very close to the statistical errors. Our results imply that the cosmological constraints from LSST will not be severely limited by these algorithm-independent, additive systematic effects.
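The spurious shear correlation function quoted in this record is a binned two-point statistic of the shear field. A brute-force sketch on toy data (uncorrelated scalar "shear" values at random positions, so the correlation is consistent with zero, illustrating how purely stochastic errors average out):

```python
# Brute-force binned two-point correlation of a toy shear field. A real
# cosmic-shear estimator works with the two spin-2 shear components; this
# scalar version only illustrates the binning logic.
import numpy as np

rng = np.random.default_rng(3)
n = 1000
pos = rng.uniform(0.0, 1.0, size=(n, 2))    # positions in a 1 deg^2 patch
shear = rng.normal(0.0, 1e-3, size=n)       # uncorrelated toy shear values

# All pairwise separations and shear products.
d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
prod = shear[:, None] * shear[None, :]

bins = np.linspace(0.01, 0.5, 11)           # angular bins (degrees)
idx = np.digitize(d, bins)                  # diagonal (d = 0) falls below bin 1
xi = np.array([prod[idx == k].mean() for k in range(1, len(bins))])

# With uncorrelated shear, xi is consistent with zero in every bin.
print(xi)
```

Production analyses use tree-based pair counting rather than the O(n²) matrices above.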

  12. Optical registration of spaceborne low light remote sensing camera

    Science.gov (United States)

    Li, Chong-yang; Hao, Yan-hui; Xu, Peng-mei; Wang, Dong-jie; Ma, Li-na; Zhao, Ying-long

    2018-02-01

    Spaceborne low-light remote sensing cameras impose high-precision requirements on optical registration. In this paper, dual-channel optical registration of a CCD and an EMCCD is achieved with a high-magnification optical registration system. A system-integration registration scheme, together with an analysis of its accuracy, is proposed for a spaceborne low-light remote sensing camera with short focal depth and wide field of view, including an analysis of the parallel misalignment of the CCD. Actual registration results show that the imaging is clear and that the MTF and registration accuracy meet requirements, providing an important guarantee for obtaining high-quality image data on orbit.

  13. Photometric classification and redshift estimation of LSST Supernovae

    Science.gov (United States)

    Dai, Mi; Kuhlmann, Steve; Wang, Yun; Kovacs, Eve

    2018-04-01

    Supernova (SN) classification and redshift estimation using photometric data only have become very important for the Large Synoptic Survey Telescope (LSST), given the large number of SNe that LSST will observe and the impossibility of spectroscopically following up all the SNe. We investigate the performance of a SN classifier that uses SN colors to classify LSST SNe with the Random Forest classification algorithm. Our classifier results in an AUC of 0.98, which represents excellent classification. We are able to obtain a photometric SN sample containing 99% SNe Ia by choosing a probability threshold. We estimate the photometric redshifts (photo-z) of SNe in our sample by fitting the SN light curves using the SALT2 model with nested sampling. We obtain a mean bias of 0.012 with σ((z_phot − z_spec)/(1 + z_spec)) = 0.0294 without using a host-galaxy photo-z prior, and a mean bias of 0.0017 with σ((z_phot − z_spec)/(1 + z_spec)) = 0.0116 using a host-galaxy photo-z prior. Assuming a flat ΛCDM model with Ωm = 0.3, we obtain Ωm = 0.305 ± 0.008 (statistical errors only) using the simulated LSST sample of photometric SNe Ia (with intrinsic scatter σint = 0.11) derived using our methodology without a host-galaxy photo-z prior. Our method will help boost the power of SNe from the LSST as cosmological probes.
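A color-based Random Forest classifier of the kind described can be sketched with scikit-learn on synthetic data; the Gaussian "color" clouds below are illustrative stand-ins, not the paper's training set or feature definitions.

```python
# Sketch of color-based SN classification with a random forest; the
# synthetic "colors" (e.g. g-r, r-i, i-z) are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
# Two toy populations in color space: Ia-like and core-collapse-like.
ia = rng.normal([0.0, 0.1, 0.0], 0.15, size=(n, 3))
cc = rng.normal([0.4, 0.3, 0.2], 0.25, size=(n, 3))
X = np.vstack([ia, cc])
y = np.concatenate([np.ones(n), np.zeros(n)])   # 1 = SN Ia

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
prob_ia = clf.predict_proba(X_te)[:, 1]
auc = roc_auc_score(y_te, prob_ia)
print("AUC:", auc)

# A high-purity photometric Ia sample follows from a probability threshold.
ia_sample = X_te[prob_ia > 0.9]
```

The probability-threshold step at the end mirrors how the paper selects its 99%-pure photometric Ia sample.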

  14. Photometric Redshifts with the LSST: Evaluating Survey Observing Strategies

    Science.gov (United States)

    Graham, Melissa L.; Connolly, Andrew J.; Ivezić, Željko; Schmidt, Samuel J.; Jones, R. Lynne; Jurić, Mario; Daniel, Scott F.; Yoachim, Peter

    2018-01-01

    In this paper we present and characterize a nearest-neighbors color-matching photometric redshift estimator that features a direct relationship between the precision and accuracy of the input magnitudes and the output photometric redshifts. This aspect makes our estimator an ideal tool for evaluating the impact of changes to LSST survey parameters that affect the measurement errors of the photometry, which is the main motivation of our work (i.e., it is not intended to provide the “best” photometric redshifts for LSST data). We show how the photometric redshifts will improve with time over the 10 year LSST survey and confirm that the nominal distribution of visits per filter provides the most accurate photo-z results. The LSST survey strategy naturally produces observations over a range of airmass, which offers the opportunity of using an SED- and z-dependent atmospheric effect on the observed photometry as a color-independent redshift indicator. We show that measuring this airmass effect and including it as a prior has the potential to improve the photometric redshifts and can ameliorate extreme outliers, but that it will only be adequately measured for the brightest galaxies, which limits its overall impact on LSST photometric redshifts. We furthermore demonstrate how this airmass effect can induce a bias in the photo-z results, and caution against survey strategies that prioritize high-airmass observations for the purpose of improving this prior. Ultimately, we intend for this work to serve as a guide for the expectations and preparations of the LSST science community with regard to the minimum quality of photo-z as the survey progresses.
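A nearest-neighbors color-matching photo-z estimator in this spirit can be sketched as a k-nearest-neighbors regression in color space; the mapping from redshift to colors below is a synthetic assumption used only to exercise the machinery.

```python
# Minimal sketch of a nearest-neighbors color-matching photo-z estimator:
# assign each target a distance-weighted average of the redshifts of its
# nearest training neighbors in color space. The redshift-to-color mapping
# here is synthetic and purely illustrative.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

def colors(z, noise, rng):
    """Toy mapping from redshift to five broadband colors, plus noise."""
    base = np.stack([np.sin(z + k) for k in range(5)], axis=1)
    return base + rng.normal(0.0, noise, (len(z), 5))

z_train = rng.uniform(0.0, 3.0, 5000)
X_train = colors(z_train, 0.02, rng)

knn = KNeighborsRegressor(n_neighbors=10, weights="distance")
knn.fit(X_train, z_train)

z_true = rng.uniform(0.1, 2.9, 1000)
z_phot = knn.predict(colors(z_true, 0.02, rng))

# Standard photo-z quality metrics: bias and scatter of dz/(1+z).
dz = (z_phot - z_true) / (1 + z_true)
print("bias:", dz.mean(), "scatter:", dz.std())
```

Because the output precision tracks the input photometric noise directly, widening `noise` above degrades `dz.std()` in step, which is the property the paper exploits to compare survey strategies.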

  15. The Effects of Commercial Airline Traffic on LSST Observing Efficiency

    Science.gov (United States)

    Gibson, Rose; Claver, Charles; Stubbs, Christopher

    2016-01-01

    The Large Synoptic Survey Telescope (LSST) is a ten-year survey that will map the southern sky in six different filters 800 times before the end of its run. In this paper, we explore the primary effect of airline traffic on scheduling the LSST observations, in addition to the secondary effect of condensation trails, or contrails, created by the presence of the aircraft. The large national investment being made in LSST implies that small improvements in observing efficiency through aircraft and contrail avoidance can result in a significant improvement in the quality of the survey and its science. We have used the Automatic Dependent Surveillance-Broadcast (ADS-B) signals received from commercial aircraft to monitor and record activity over the LSST site. We installed an ADS-B ground station on Cerro Pachón, Chile, consisting of a 1090 MHz antenna on the Andes Lidar Observatory feeding an RTL2832U software-defined radio. We used dump1090 to convert the received ADS-B telemetry into Basestation format, and found that during the busiest time of the night only 4 signals were being received each minute on average, which will have a very small direct effect, if any, on the LSST observing scheduler. As part of future studies we will examine the effects of contrails on LSST observations. Gibson was supported by the NOAO/KPNO Research Experiences for Undergraduates (REU) Program, which is funded by the National Science Foundation Research Experience for Undergraduates Program (AST-1262829).
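The messages-per-minute statistic reported here reduces to bucketing timestamped log records by minute. The sketch below assumes a toy (date, time) field layout loosely modeled on Basestation-style logs; real dump1090 output should be checked field by field before parsing.

```python
# Sketch: average ADS-B messages per minute from a timestamped log.
# The (date, time) layout below is an assumed toy format for illustration,
# not verified dump1090/Basestation output.
from collections import Counter
from datetime import datetime

rows = [  # (date, time) pairs as they might appear in such a log
    ("2015/01/10", "03:12:05.123"),
    ("2015/01/10", "03:12:41.900"),
    ("2015/01/10", "03:13:02.004"),
    ("2015/01/10", "03:13:59.500"),
]

per_minute = Counter()
for date, time in rows:
    ts = datetime.strptime(date + " " + time, "%Y/%m/%d %H:%M:%S.%f")
    # Truncate to the minute to form the bucket key.
    per_minute[ts.replace(second=0, microsecond=0)] += 1

rate = sum(per_minute.values()) / len(per_minute)
print("mean messages/minute:", rate)  # 2.0 for this toy log
```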

  16. Opportunities and challenges for time domain astronomy with LSST

    Science.gov (United States)

    Ivezic, Zeljko

    2014-01-01

    The Large Synoptic Survey Telescope (LSST) will enable faint optical time-domain astronomy by carrying out an imaging survey covering the sky that is visible from Cerro Pachon in Northern Chile. Of the order thousand 9.6 sq. deg. images (3.2 Gigapix) will be obtained per night using pairs of 15-second back-to-back exposures, with typical 5-sigma depth for point sources of 24.5 (AB). With close to 1000 observations of a 18,000 sq. deg. region in ugrizy bands over a 10-year period, these data will enable a deep stack across half the sky reaching five magnitudes deeper than the SDSS survey ( 27.5, 5 sigma, point source), and with twice as good seeing (0.7 arcsec median seeing in the r band). The measured and archived properties of newly discovered and known astrometric and photometric transients will be publicly reported within 60 sec after closing the shutter. Automated classification of the expected several million alerts per night, and selection of transient events requiring immediate follow-up, is an outstanding problem for the community. These data will represent a treasure trove for follow-up programs using other ground and space-based telescopes, such as fast-response fast-cadence photometric observations and spectroscopy, as well as for facilities operating at non-optical wavelengths and for gravitational wave programs. I will describe the relevant data products to be delivered by LSST and will summarize challenges that will need to be addressed by the community at large.

  17. Flat-field response and geometric distortion measurements of optical streak cameras

    International Nuclear Information System (INIS)

    Montgomery, D.S.; Drake, R.P.; Jones, B.A.; Wiedwald, J.D.

    1987-08-01

    To accurately measure pulse amplitude, shape, and relative time histories of optical signals with an optical streak camera, it is necessary to correct each recorded image for spatially-dependent gain nonuniformity and geometric distortion. Gain nonuniformities arise from sensitivity variations in the streak-tube photocathode, phosphor screen, image-intensifier tube, and image recording system. These nonuniformities may be severe, and have been observed to be on the order of 100% for some LLNL optical streak cameras. Geometric distortion due to optical couplings, electron-optics, and sweep nonlinearity not only affects pulse position and timing measurements, but affects pulse amplitude and shape measurements as well. By using a 1.053-μm, long-pulse, high-power laser to generate a spatially and temporally uniform source as input to the streak camera, the combined effects of flat-field response and geometric distortion can be measured under the normal dynamic operation of cameras with S-1 photocathodes. Additionally, by using the same laser system to generate a train of short pulses that can be spatially modulated at the input of the streak camera, we can effectively create a two-dimensional grid of equally-spaced pulses. This allows a dynamic measurement of the geometric distortion of the streak camera. We will discuss the techniques involved in performing these calibrations, will present some of the measured results for LLNL optical streak cameras, and will discuss software methods to correct for these effects. 6 refs., 6 figs
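The flat-field correction described above is, at its core, a pixel-by-pixel division of each recorded image by a normalized flat-field image built from the uniform-source exposure. A minimal NumPy sketch with synthetic gain nonuniformity (geometric distortion is not modeled here):

```python
# Minimal flat-field correction sketch: divide the raw image by a
# normalized flat-field response measured with a spatially uniform source.
# The gain pattern and signals are synthetic.
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64)
gain = 1.0 + 0.5 * rng.random(shape)   # fixed ~50% pixel-to-pixel gain pattern

# Calibration shot: a spatially uniform source recorded through the system,
# normalized to unit mean to form the flat-field image.
flat = (1000.0 * gain) / (1000.0 * gain).mean()

# Experiment shot: a structured signal seen through the same gain pattern.
signal = 500.0 * np.outer(np.hanning(64), np.hanning(64))
raw = signal * gain

corrected = raw / flat   # flat-field correction, pixel by pixel

# Up to an overall scale (the mean gain), the input signal is recovered.
print(np.allclose(corrected, signal * gain.mean()))  # True
```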

  18. On the Detectability of Planet X with LSST

    Science.gov (United States)

    Trilling, David E.; Bellm, Eric C.; Malhotra, Renu

    2018-06-01

    Two planetary mass objects in the far outer solar system—collectively referred to here as Planet X—have recently been hypothesized to explain the orbital distribution of distant Kuiper Belt Objects. Neither planet is thought to be exceptionally faint, but the sky locations of these putative planets are poorly constrained. Therefore, a wide area survey is needed to detect these possible planets. The Large Synoptic Survey Telescope (LSST) will carry out an unbiased, large area (around 18,000 deg²), deep (limiting magnitude of individual frames of 24.5) survey (the “wide-fast-deep (WFD)” survey) of the southern sky beginning in 2022, and it will therefore be an important tool in searching for these hypothesized planets. Here, we explore the effectiveness of LSST as a search platform for these possible planets. Assuming the current baseline cadence (which includes the WFD survey plus additional coverage), we estimate that LSST will confidently detect or rule out the existence of Planet X in 61% of the entire sky. At orbital distances up to ∼75 au, Planet X could simply be found in the normal nightly moving object processing; at larger distances, it will require custom data processing. We also discuss the implications of a nondetection of Planet X in LSST data.

  19. A Euclid, LSST and WFIRST Joint Processing Study

    Science.gov (United States)

    Chary, Ranga-Ram; Joint Processing Working Group

    2018-01-01

    Euclid, LSST and WFIRST are the flagship cosmological projects of the next decade. By mapping several thousand square degrees of sky and covering the electromagnetic spectrum from the optical to the NIR with (sub-)arcsec resolution, these projects will provide exciting new constraints on the nature of dark energy and dark matter. The ultimate cosmological, astrophysical and time-domain science yield from these missions, which will detect several billions of sources, requires joint processing at the pixel-level. Three U.S. agencies (DOE, NASA and NSF) are supporting an 18-month study which aims to 1) assess the optimal techniques to combine these, and ancillary data sets at the pixel level; 2) investigate options for an interface that will enable community access to the joint data products; and 3) identify the computing and networking infrastructure to properly handle and manipulate these large datasets together. A Joint Processing Working Group (JPWG) is carrying out this study and consists of US-based members from the community and science/data processing centers of each of these projects. Coordination with European partners is envisioned in the future and European Euclid members are involved in the JPWG as observers. The JPWG will scope the effort and resources required to build up the capabilities to support scientific investigations using joint processing in time for the start of science surveys by LSST and Euclid.

  20. Reliable and Repeatable Characterization of Optical Streak Cameras

    International Nuclear Information System (INIS)

    Michael R. Charest, Peter Torres III, Christopher Silbernagel

    2008-01-01

    Optical streak cameras are used as primary diagnostics for a wide range of physics and laser performance verification experiments at the National Ignition Facility (NIF). To meet the strict accuracy requirements needed for these experiments, the systematic nonlinearities of the streak cameras (attributed to nonlinearities in the optical and electronic components that make up the streak camera system) must be characterized. In some cases the characterization information is used as a guide to help determine how experiment data should be taken. In other cases the characterization data is used to 'correct' data images, to remove some of the nonlinearities. In order to obtain these camera characterizations, a specific data set is collected where the response to specific known inputs is recorded. A set of analysis software routines has been developed to extract information such as spatial resolution, dynamic range, temporal resolution, etc., from this data set. The routines are highly automated, requiring very little user input and thus provide very reliable and repeatable results that are not subject to interpretation. An emphasis on quality control has been placed on these routines due to the high importance of the camera characterization information

  1. The Impact of Gaia and LSST on Binaries and Exoplanets

    DEFF Research Database (Denmark)

    Eyer, L.; Dubath, P.; Mowlavi, N.

    2012-01-01

    Two upcoming large scale surveys, the ESA Gaia and LSST projects, will bring a new era in astronomy. The number of binary systems that will be observed and detected by these projects is enormous, estimations range from millions for Gaia to several tens of millions for LSST. We review some tools...

  2. Optical Comb Generation for Streak Camera Calibration for Inertial Confinement Fusion Experiments

    International Nuclear Information System (INIS)

    Ronald Justin; Terence Davies; Frans Janson; Bruce Marshall; Perry Bell; Daniel Kalantar; Joseph Kimbrough; Stephen Vernon; Oliver Sweningsen

    2008-01-01

    The National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL) is coming on-line to support physics experimentation for the U.S. Department of Energy (DOE) programs in Inertial Confinement Fusion (ICF) and Stockpile Stewardship (SS). Optical streak cameras are an integral part of the experimental diagnostics instrumentation at NIF. To accurately reduce streak camera data a highly accurate temporal calibration is required. This article describes a technique for simultaneously generating a precise +/- 2 ps optical marker pulse (fiducial reference) and trains of precisely timed, short-duration optical pulses (so-called 'comb' pulse trains) that are suitable for the timing calibrations. These optical pulse generators are used with the LLNL optical streak cameras. They are small, portable light sources that, in the comb mode, produce a series of temporally short, uniformly spaced optical pulses, using a laser diode source. Comb generators have been produced with pulse-train repetition rates up to 10 GHz at 780 nm, and somewhat lower frequencies at 664 nm. Individual pulses can be as short as 25-ps FWHM. Signal output is via a fiber-optic connector on the front panel of the generator box. The optical signal is transported from comb generator to streak camera through multi-mode, graded-index optical fiber

  3. Flat-field response and geometric distortion measurements of optical streak cameras

    International Nuclear Information System (INIS)

    Montgomery, D.S.; Drake, R.P.; Jones, B.A.; Wiedwald, J.D.

    1987-01-01

    To accurately measure the pulse amplitude, shape, and relative time histories of optical signals with an optical streak camera, it is necessary to correct each recorded image for spatially dependent gain nonuniformity and geometric distortion. Gain nonuniformities arise from sensitivity variations in the streak-tube photocathode, phosphor screen, image-intensifier tube, and image recording system. By using a 1.053-μm, long-pulse, high-power laser to generate a spatially and temporally uniform source as input to the streak camera, the combined effects of flat-field response and geometric distortion can be measured under the normal dynamic operation of cameras with S-1 photocathodes. Additionally, by using the same laser system to generate a train of short pulses that can be spatially modulated at the input of the streak camera, the authors can create a two-dimensional grid of equally spaced pulses. This allows a dynamic measurement of the geometric distortion of the streak camera. The authors discuss the techniques involved in performing these calibrations, present some of the measured results for LLNL optical streak cameras, and discuss software methods to correct for these effects.
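    The flat-field correction described above amounts to dividing each recorded pixel by a normalized per-pixel gain map measured with the uniform source. A minimal sketch in Python; the gain map, image values, and function name are illustrative, not from the paper:

```python
def flat_field_correct(image, flat, eps=1e-12):
    """Divide each pixel by the normalized flat-field gain."""
    # Normalize the flat so that its mean gain is 1.0.
    n = len(flat) * len(flat[0])
    mean_gain = sum(sum(row) for row in flat) / n
    return [
        [pix / max(f / mean_gain, eps) for pix, f in zip(img_row, flat_row)]
        for img_row, flat_row in zip(image, flat)
    ]

# Simulated 2x3 detector: the true signal is uniform (100 counts), but the
# recorded image is modulated by a per-pixel gain map.
gain = [[0.8, 1.0, 1.2],
        [1.1, 0.9, 1.0]]
recorded = [[100.0 * g for g in row] for row in gain]
corrected = flat_field_correct(recorded, gain)  # uniform again after correction
```

    Geometric distortion would be handled by a separate resampling step using the measured pulse-grid positions; only the gain correction is sketched here.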

  4. SHOK—The First Russian Wide-Field Optical Camera in Space

    Science.gov (United States)

    Lipunov, V. M.; Gorbovskoy, E. S.; Kornilov, V. G.; Panasyuk, M. I.; Amelushkin, A. M.; Petrov, V. L.; Yashin, I. V.; Svertilov, S. I.; Vedenkin, N. N.

    2018-02-01

    Two fast, fixed, very wide-field SHOK cameras are installed onboard the Lomonosov spacecraft. The main goal of this experiment is the observation of GRB optical emission before, synchronously with, and after the gamma-ray emission. The field of view of each of the cameras is placed in the gamma-ray burst detection area of the other devices located onboard the Lomonosov spacecraft. SHOK provides measurements of optical emissions with a magnitude limit of ˜ 9-10m on a single frame with an exposure of 0.2 seconds. The device is designed for continuous sky monitoring at optical wavelengths in a very wide field of view (1000 square degrees per camera), and for the detection and localization of fast time-varying (transient) optical sources on the celestial sphere, including provisional and synchronous time recording of optical emissions from the gamma-ray burst error boxes detected by the BDRG device and implemented by a control signal (alert trigger) from the BDRG. The Lomonosov spacecraft has two identical devices, SHOK1 and SHOK2. The core of each SHOK device is a fast 11-megapixel CCD. Each SHOK device is a monoblock consisting of an optical-emission observation node, an electronics node, elements of the mechanical construction, and the body.

  5. Exact optics - III. Schwarzschild's spectrograph camera revised

    Science.gov (United States)

    Willstrop, R. V.

    2004-03-01

    Karl Schwarzschild identified a system of two mirrors, each defined by conic sections, free of third-order spherical aberration, coma and astigmatism, and with a flat focal surface. He considered it impractical, because the field was too restricted. This system was rediscovered as a quadratic approximation to one of Lynden-Bell's `exact optics' designs which have wider fields. Thus the `exact optics' version has a moderate but useful field, with excellent definition, suitable for a spectrograph camera. The mirrors are strongly aspheric in both the Schwarzschild design and the exact optics version.

  6. Optomechanical stability design of space optical mapping camera

    Science.gov (United States)

    Li, Fuqiang; Cai, Weijun; Zhang, Fengqin; Li, Na; Fan, Junjie

    2018-01-01

    According to the interior orientation elements and imaging quality requirements of the mapping application, and based on an off-axis three-mirror anastigmat (TMA) system, the high-stability optomechanical design of a space optical mapping camera is introduced in this paper. The configuration is a coaxial TMA system used off-axis. First, the overall optical arrangement is described and an overview of the optomechanical packaging is provided. Zerodur glass, carbon-fiber composite, and carbon-fiber-reinforced silicon carbide (C/SiC) are widely used in the optomechanical structure because their low coefficients of thermal expansion (CTE) reduce the thermal sensitivity of the mirrors and focal plane. Flexible and unloading supports are used in the reflector and camera supporting structures. The use of epoxy structural adhesives for bonding optics to the metal structure is also introduced. The primary mirror is mounted by means of a three-point ball-joint flexure system attached to the back of the mirror. Then, to predict flexural displacements due to gravity, static finite element analysis (FEA) is performed on the primary mirror. The peak-to-valley (PV) and root-mean-square (RMS) wavefront errors are measured before and after assembly. A dynamic FEA of the whole optical arrangement is also carried out to investigate the optomechanical performance. Finally, to evaluate the stability of the design, thermal vacuum and vibration tests are carried out, with the Modulation Transfer Function (MTF) and the elements of interior orientation used as evaluation indices. The MTF, focal distance, and position of the principal point of the optical system are measured before and after the thermal vacuum and vibration tests, and the results are as expected.

  7. Optical camera system for radiation field

    International Nuclear Information System (INIS)

    Maki, Koichi; Senoo, Makoto; Takahashi, Fuminobu; Shibata, Keiichiro; Honda, Takuro.

    1995-01-01

    An infrared camera comprises a filter that transmits only infrared rays at a specific wavelength, such as far infrared rays, and a lens used exclusively for infrared rays. A photoelectric image converter with a built-in infrared emitter, comprising an infrared-ray-emitting device, a focusing lens, and a semiconductor image pick-up plate, is disposed at a place with a low gamma-ray dose rate. Infrared rays emitted from an objective member pass through the lens system of the camera, and real images are formed by way of the filter. They are transferred by image fibers, introduced to the photoelectric image converter, and focused on the image pick-up plate by the image-forming lens. Further, they are converted into electric signals, introduced to a display, and monitored. With such a constitution, an optical material used exclusively for infrared rays, for example ZnSe, can be used for the lens system and the optical transmission system. Accordingly, the camera can be used in a radiation field with a high gamma-ray dose rate around the periphery of the reactor container. (I.N.)

  8. The fly's eye camera system

    Science.gov (United States)

    Mészáros, L.; Pál, A.; Csépány, G.; Jaskó, A.; Vida, K.; Oláh, K.; Mezö, G.

    2014-12-01

    We introduce the Fly's Eye Camera System, an all-sky monitoring device intended to perform time-domain astronomy. This camera system design will provide complementary data sets for other synoptic sky surveys such as LSST or Pan-STARRS. The effective field of view is obtained by 19 cameras arranged in a spherical mosaic form. These individual cameras stand on a hexapod mount that is fully capable of sidereal tracking for the subsequent exposures. This platform has many advantages. First of all, it requires only one type of moving component and does not include unique parts. Hence this design not only eliminates problems implied by unique elements, but the redundancy of the hexapod also allows smooth operation even if one or two of the legs are stuck. In addition, it can calibrate itself using observed stars independently of both the geographical location (including the northern and southern hemispheres) and the polar alignment of the full mount. All mechanical elements and electronics were designed at our institute, Konkoly Observatory. Currently, our instrument is in its testing phase, with an operating hexapod and a reduced number of cameras.

  9. Designing for Peta-Scale in the LSST Database

    Science.gov (United States)

    Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.

    2007-10-01

    The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. To do this in real-time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements, horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.

  10. A study on the optimization of optical guide of gamma camera detector

    International Nuclear Information System (INIS)

    Chung, Yong Hyun; Cho, Gyu Seong; Kim, Ho Kyung; Lee, Wan No; Kim, Young Soo

    2000-01-01

    An optical guide, which is a light guide located between the NaI(Tl) scintillation crystal and the array of photomultiplier tubes (PMTs) in a gamma camera detector system, is an essential component for delivering the spatial information recorded in the scintillator to the PMTs. Without the optical guide, spatial information within the range of a single PMT could not be obtained. For the design of an optimal optical guide, it is necessary to characterize its properties, especially the sensitivity and spatial resolution of the detector. In this study, the thickness and the refractive index of the optical guide, which affect not only the sensitivity but also the spatial resolution of the gamma camera detector, were investigated using Monte Carlo simulation. A 12″ × 12″ × 3/8″ NaI(Tl) crystal and 23 PMTs, each 5″ in diameter, were considered as the gamma camera detector components. Interactions of optical photons in the scintillator and the optical guide were simulated using the commercial code DETECT97, and the spatial resolution, mainly interfered with by the intrinsic inward distortion within the PMT, was investigated using our own ANGER program, which was developed to calculate the positions of incident photons in the gamma camera. From the simulation results, it was found that an optical guide with a refractive index of 1.6 and a thickness of 10 mm gives maximum sensitivity and minimum spatial distortion, respectively.
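    The position arithmetic behind an ANGER-style program can be sketched as classic Anger logic: the interaction position is estimated as the signal-weighted centroid of the PMT positions. The PMT coordinates and signal values below are hypothetical; the paper's actual code and detector layout are not reproduced:

```python
def anger_position(pmt_xy, signals):
    """Signal-weighted centroid of PMT positions (Anger logic)."""
    total = sum(signals)
    x = sum(s * p[0] for s, p in zip(signals, pmt_xy)) / total
    y = sum(s * p[1] for s, p in zip(signals, pmt_xy)) / total
    return x, y

# Three hypothetical PMTs on a line, 50 mm apart; the share of scintillation
# light seen by each tube encodes the interaction position.
pmts = [(-50.0, 0.0), (0.0, 0.0), (50.0, 0.0)]
signals = [10.0, 60.0, 30.0]
x, y = anger_position(pmts, signals)  # centroid pulled toward the brighter side
```

    The "inward distortion" the abstract mentions shows up in exactly this estimator: because the PMT array is finite, events near the edge are pulled toward the detector centre.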

  11. Radiation-resistant optical sensors and cameras; Strahlungsresistente optische Sensoren und Kameras

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, G. [Imaging and Sensing Technology, Bonn (Germany)

    2008-02-15

    The introduction of video technology, i.e. 'TV', in the nuclear field was considered at an early stage. Possibilities to view spaces in nuclear facilities by means of radiation-resistant optical sensors or cameras are presented. These systems are to enable operators to monitor and control visually the processes occurring within such spaces. Camera systems are used, e.g., for remote surveillance of critical components in nuclear power plants and nuclear facilities, and thus also contribute to plant safety. A different application of radiation-resistant optical systems is the visual inspection of, e.g., reactor pressure vessels and the tracing of small parts inside a reactor. Camera systems are also employed in the remote disassembly of radioactively contaminated old plants. Unfortunately, the niche market for radiation-resistant camera systems makes it unlikely that research funds will become available for the development of new radiation-resistant optical systems for picture taking and viewing. Current efforts are devoted mainly to improvements in image evaluation and image quality. Other items on the agendas of manufacturers are the reduction in camera size, which is limited by the size of picture tubes, and the increased use of commercial CCD cameras together with adequate shielding or improved lenses. Consideration is also being given to the use of peripheral equipment and to data transmission by LAN, WAN, or Internet links to remote locations. (orig.)

  12. Compact optical technique for streak camera calibration

    International Nuclear Information System (INIS)

    Bell, Perry; Griffith, Roger; Hagans, Karla; Lerche, Richard; Allen, Curt; Davies, Terence; Janson, Frans; Justin, Ronald; Marshall, Bruce; Sweningsen, Oliver

    2004-01-01

    To produce accurate data from optical streak cameras requires accurate temporal calibration sources. We have reproduced an older technology for generating optical timing marks that had been lost due to component availability. Many improvements have been made which allow the modern units to service a much larger need. Optical calibrators are now available that produce optical pulse trains of 780 nm wavelength light at frequencies ranging from 0.1 to 10 GHz, with individual pulse widths of approximately 25 ps full width half maximum. Future plans include the development of single units that produce multiple frequencies to cover a wide temporal range, and that are fully controllable via an RS232 interface

  13. Compact optical technique for streak camera calibration

    Science.gov (United States)

    Bell, Perry; Griffith, Roger; Hagans, Karla; Lerche, Richard; Allen, Curt; Davies, Terence; Janson, Frans; Justin, Ronald; Marshall, Bruce; Sweningsen, Oliver

    2004-10-01

    To produce accurate data from optical streak cameras requires accurate temporal calibration sources. We have reproduced an older technology for generating optical timing marks that had been lost due to component availability. Many improvements have been made which allow the modern units to service a much larger need. Optical calibrators are now available that produce optical pulse trains of 780 nm wavelength light at frequencies ranging from 0.1 to 10 GHz, with individual pulse widths of approximately 25 ps full width half maximum. Future plans include the development of single units that produce multiple frequencies to cover a wide temporal range, and that are fully controllable via an RS232 interface.

  14. Method used to test the imaging consistency of binocular camera's left-right optical system

    Science.gov (United States)

    Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui

    2016-09-01

    For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor influencing the overall imaging consistency. Conventional optical-system testing procedures lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method for measuring the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table, and a CMOS camera has been established. First, the left and right optical systems capture images at normal exposure times under the same conditions. Second, a contour image is obtained from a multiple-threshold segmentation result, and the boundary is determined using the slope of contour lines near the pseudo-contour line. Third, a gray-level constraint based on the corresponding coordinates of the left and right images is established, and the imaging consistency is evaluated through the standard deviation σ of the imaging grayscale difference D(x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for imaging consistency testing of binocular cameras. When the 3σ distribution of the imaging gray difference D(x, y) between the left and right optical systems of the binocular camera does not exceed 5%, the design requirements are considered to have been achieved. This method is effective and paves the way for imaging consistency testing of binocular cameras.
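    The consistency metric described above, the standard deviation σ of the grayscale difference D(x, y), can be sketched as follows; the 8-bit full scale and the tiny sample frames are assumptions for illustration, not data from the paper:

```python
import statistics

def consistency_sigma(left, right, full_scale=255.0):
    """Std. deviation of the per-pixel grayscale difference D(x, y),
    expressed as a fraction of full scale (8-bit assumed here)."""
    diffs = [(l - r) / full_scale
             for lrow, rrow in zip(left, right)
             for l, r in zip(lrow, rrow)]
    return statistics.pstdev(diffs)

# Tiny sample frames captured by the left and right optical systems.
left  = [[120, 122], [118, 121]]
right = [[119, 123], [118, 120]]
sigma = consistency_sigma(left, right)
meets_spec = 3 * sigma <= 0.05   # the 3-sigma <= 5% criterion from the abstract
```

    In the paper the comparison is restricted to corresponding coordinates found via the contour/boundary step; the sketch assumes the two frames are already registered.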

  15. Crosstalk in multi-output CCDs for LSST

    International Nuclear Information System (INIS)

    O'Connor, P.

    2015-01-01

    LSST's compact, low-power focal plane will be subject to electronic crosstalk with some unique signatures due to its readout geometry. This note describes the crosstalk mechanisms, ongoing characterization of prototypes, and implications for the observing cadence.

  16. Optimizing the LSST Dither Pattern for Survey Uniformity

    Science.gov (United States)

    Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter; Carroll, Christopher M.; LSST Dark Energy Science Collaboration

    2015-01-01

    The Large Synoptic Survey Telescope (LSST) will gather detailed data of the southern sky, enabling unprecedented study of Baryonic Acoustic Oscillations, which are an important probe of dark energy. These studies require a survey with highly uniform depth, and we aim to find an observation strategy that optimizes this uniformity. We have shown that in the absence of dithering (large telescope-pointing offsets), the LSST survey will vary significantly in depth. Hence, we implemented various dithering strategies, including random and repulsive random pointing offsets and spiral patterns with the spiral reaching completion in either a few months or the entire ten-year run. We employed three different implementations of dithering strategies: a single offset assigned to all fields observed on each night, offsets assigned to each field independently whenever the field is observed, and offsets assigned to each field only when the field is observed on a new night. Our analysis reveals that large dithers are crucial to guarantee survey uniformity and that assigning dithers to each field independently whenever the field is observed significantly increases this uniformity. These results suggest paths towards an optimal observation strategy that will enable LSST to achieve its science goals. We gratefully acknowledge support from the National Science Foundation REU program at Rutgers, PHY-1263280, and the Department of Energy, DE-SC0011636.
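    The effect described above, that large dithers smooth out the depth variations of a fixed field tessellation, can be illustrated with a toy one-dimensional model. This is not the LSST scheduler; the field width, spacing, strip length, and visit counts are illustrative assumptions:

```python
import random
import statistics

random.seed(42)

FIELD = 3.5    # field width in degrees (roughly the LSST field of view)
STEP = 3.0     # spacing of field centres, so adjacent fields overlap slightly
NBINS = 140    # 0.25-degree sky bins across a 35-degree strip
NIGHTS = 200

def visit_counts(max_dither):
    """Count visits per sky bin; each night shifts all fields by one offset."""
    counts = [0] * NBINS
    starts = [i * STEP for i in range(12)]   # left edges of 12 fields
    for _ in range(NIGHTS):
        off = random.uniform(-max_dither, max_dither) if max_dither else 0.0
        for s in starts:
            for b in range(NBINS):
                x = (b + 0.5) * 0.25         # bin centre in degrees
                if s + off <= x < s + off + FIELD:
                    counts[b] += 1
    return counts

def cv(counts):
    """Coefficient of variation of depth: lower means more uniform."""
    return statistics.pstdev(counts) / statistics.mean(counts)

interior = slice(12, 128)   # ignore the strip edges, where dithers lose depth
undithered = visit_counts(0.0)[interior]
dithered = visit_counts(FIELD / 2)[interior]   # offsets up to half a field
```

    Without dithering, bins in the overlap regions are visited twice as often as the rest; random offsets spread the overlaps around and shrink the depth variation in the survey interior, at the cost of shallower edges.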

  17. The multi-camera optical surveillance system (MOS)

    International Nuclear Information System (INIS)

    Otto, P.; Wagner, H.; Richter, B.; Gaertner, K.J.; Laszlo, G.; Neumann, G.

    1991-01-01

    The transition from film camera to video surveillance systems, in particular the implementation of high capacity multi-camera video systems, results in a large increase in the amount of recorded scenes. Consequently, there is a substantial increase in the manpower requirements for review. Moreover, modern microprocessor controlled equipment facilitates the collection of additional data associated with each scene. Both the scene and the annotated information have to be evaluated by the inspector. The design of video surveillance systems for safeguards necessarily has to account for both appropriate recording and reviewing techniques. An aspect of principal importance is that the video information is stored on tape. Under the German Support Programme to the Agency a technical concept has been developed which aims at optimizing the capabilities of a multi-camera optical surveillance (MOS) system including the reviewing technique. This concept is presented in the following paper including a discussion of reviewing and reliability

  18. Camera System MTF: combining optic with detector

    Science.gov (United States)

    Andersen, Torben B.; Granger, Zachary A.

    2017-08-01

    MTF is one of the most common metrics used to quantify the resolving power of an optical component. Extensive literature is dedicated to describing methods to calculate the Modulation Transfer Function (MTF) for stand-alone optical components such as a camera lens or telescope, and some literature addresses approaches to determine an MTF for combination of an optic with a detector. The formulations pertaining to a combined electro-optical system MTF are mostly based on theory, and assumptions that detector MTF is described only by the pixel pitch which does not account for wavelength dependencies. When working with real hardware, detectors are often characterized by testing MTF at discrete wavelengths. This paper presents a method to simplify the calculation of a polychromatic system MTF when it is permissible to consider the detector MTF to be independent of wavelength.
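    A common way to sketch the combination discussed above is to multiply a diffraction-limited optics MTF by the pixel-aperture (sinc) detector MTF, then weight the product over the measurement wavelengths. The f-number, pixel pitch, and spectral weights below are illustrative assumptions, not values from the paper:

```python
import math

def mtf_optics(f, wavelength_mm, fnum):
    """Diffraction-limited MTF of a circular-pupil optic at f cycles/mm."""
    fc = 1.0 / (wavelength_mm * fnum)        # diffraction cutoff frequency
    if f >= fc:
        return 0.0
    x = f / fc
    return (2.0 / math.pi) * (math.acos(x) - x * math.sqrt(1.0 - x * x))

def mtf_detector(f, pitch_mm):
    """Pixel-aperture (sinc) MTF for a 100% fill-factor detector."""
    arg = math.pi * f * pitch_mm
    return 1.0 if arg == 0.0 else abs(math.sin(arg) / arg)

def system_mtf(f, wavelengths_mm, weights, fnum, pitch_mm):
    """Spectrally weighted average of optics MTF x detector MTF."""
    total = sum(w * mtf_optics(f, lam, fnum) * mtf_detector(f, pitch_mm)
                for w, lam in zip(weights, wavelengths_mm))
    return total / sum(weights)

# Illustrative f/2 optic with a 10-micron pixel at three test wavelengths.
waves = [450e-6, 550e-6, 650e-6]            # wavelengths in mm
weights = [0.25, 0.5, 0.25]                 # relative spectral weighting
nyquist = 1.0 / (2 * 10e-3)                 # 50 cyc/mm detector Nyquist
m_sys = system_mtf(nyquist, waves, weights, fnum=2.0, pitch_mm=10e-3)
```

    The simplification the paper argues for corresponds to pulling the detector factor out of the spectral sum, which is exact when the measured detector MTF is independent of wavelength, as in this sketch.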

  19. Multi-Wavelength Spectroscopy of Tidal Disruption Flares: A Legacy Sample for the LSST Era

    Science.gov (United States)

    Cenko, Stephen

    2017-08-01

    When a star passes within the sphere of disruption of a massive black hole, tidal forces will overcome self-gravity and unbind the star. While approximately half of the stellar debris is ejected at high velocities, the remaining material stays bound to the black hole and accretes, resulting in a luminous, long-lived transient known as a tidal disruption flare (TDF). In addition to serving as unique laboratories for accretion physics, TDFs offer the hope of measuring black hole masses in galaxies much too distant for resolved kinematic studies. In order to realize this potential, we must better understand the detailed processes by which the bound debris circularizes and forms an accretion disk. Spectroscopy is critical to this effort, as emission and absorption line diagnostics provide insight into the location and physical state (velocity, density, composition) of the emitting gas (in analogy with quasars). UV spectra are particularly critical, as most strong atomic features fall in this bandpass, and high-redshift TDF discoveries from LSST will sample rest-frame UV wavelengths. Here we propose to obtain a sequence of UV (HST) and optical (Gemini/GMOS) spectra for a sample of 5 TDFs discovered by the Zwicky Transient Facility, doubling the number of TDFs with UV spectra. Our observations will directly test models for the generation of the UV/optical emission (circularization vs. reprocessing) by searching for outflows and measuring densities, temperatures, and composition as a function of time. This effort is critical to developing the framework by which we can infer black hole properties (e.g., mass) from LSST TDF discoveries.

  20. TRANSITING PLANETS WITH LSST. II. PERIOD DETECTION OF PLANETS ORBITING 1 M⊙ HOSTS

    Energy Technology Data Exchange (ETDEWEB)

    Jacklin, Savannah [Department of Astrophysics and Planetary Science, Villanova University, Villanova, PA 19085 (United States); Lund, Michael B.; Stassun, Keivan G. [Department of Physics and Astronomy, Vanderbilt University, Nashville, TN 37235 (United States); Pepper, Joshua [Department of Physics, Lehigh University, Bethlehem, PA 18015 (United States)

    2015-07-15

    The Large Synoptic Survey Telescope (LSST) will photometrically monitor ∼10⁹ stars for 10 years. The resulting light curves can be used to detect transiting exoplanets. In particular, as demonstrated by Lund et al., LSST will probe stellar populations currently undersampled in most exoplanet transit surveys, including out to extragalactic distances. In this paper we test the efficiency of the box-fitting least-squares (BLS) algorithm for accurately recovering the periods of transiting exoplanets using simulated LSST data. We model planets with a range of radii orbiting a solar-mass star at a distance of 7 kpc, with orbital periods ranging from 0.5 to 20 days. We find that standard-cadence LSST observations will be able to reliably recover the periods of Hot Jupiters with periods shorter than ∼3 days; however, it will remain a challenge to confidently distinguish these transiting planets from false positives. At the same time, we find that the LSST deep-drilling cadence is extremely powerful: the BLS algorithm successfully recovers at least 30% of sub-Saturn-size exoplanets with orbital periods as long as 20 days, and a simple BLS power criterion robustly distinguishes ∼98% of these from photometric (i.e., statistical) false positives.
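    The core idea of the BLS algorithm tested above, phase-folding the light curve on trial periods and scoring a single in-transit box, can be sketched in a simplified form. This is not the Kovács et al. implementation or the authors' pipeline; the synthetic noiseless light curve and the trial grid are illustrative:

```python
def bls_best_period(times, flux, periods, nbins=50, max_width=5):
    """Return the trial period whose best-fitting phase box scores highest."""
    mean_flux = sum(flux) / len(flux)
    best_period, best_score = None, -1.0
    for p in periods:
        bins = [[] for _ in range(nbins)]
        for t, f in zip(times, flux):          # phase-fold and bin
            bins[int((t % p) / p * nbins)].append(f - mean_flux)
        for i in range(nbins):                 # slide a box of 1..max_width bins
            box = []
            for w in range(max_width):
                box += bins[(i + w) % nbins]
                if not box:
                    continue
                r = len(box) / len(flux)       # fraction of points in the box
                if 0.0 < r < 1.0:
                    s = sum(box)
                    score = s * s / (r * (1.0 - r))   # BLS "signal residue"
                    if score > best_score:
                        best_score, best_period = score, p
    return best_period

# Synthetic light curve: 2.5-day period, 0.2-day transits, 1% depth.
times = [0.04 * k for k in range(2500)]              # ~100 days of cadence
flux = [1.0 - (0.01 if (t % 2.5) < 0.2 else 0.0) for t in times]
trial_periods = [1.0 + 0.05 * j for j in range(61)]  # 1.0 to 4.0 days
found = bls_best_period(times, flux, trial_periods)
```

    At the correct trial period the transit points pile up in a few adjacent phase bins and the box score peaks; at wrong periods the dip smears over all phases. The paper's false-positive question is about how this peak compares to the score distribution of pure noise, which the sketch does not model.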

  1. EXPECTED LARGE SYNOPTIC SURVEY TELESCOPE (LSST) YIELD OF ECLIPSING BINARY STARS

    International Nuclear Information System (INIS)

    Prsa, Andrej; Pepper, Joshua; Stassun, Keivan G.

    2011-01-01

    In this paper, we estimate the eclipsing binary yield of the Large Synoptic Survey Telescope (LSST), which will survey ∼20,000 deg² of the southern sky during a period of 10 years in six photometric passbands to r ∼ 24.5. We generate a set of 10,000 eclipsing binary light curves sampled to the LSST time cadence across the whole sky, with added noise as a function of apparent magnitude. This set is passed to the analysis-of-variance period finder to assess the period recoverability rate, and the successfully phased light curves are passed to the artificial-intelligence-based pipeline ebai to assess the recoverability rate in terms of the eclipsing binaries' physical and geometric parameters. We find that, of the ∼24 million eclipsing binaries observed by LSST with a signal-to-noise ratio >10 over the mission lifetime, ∼28%, or 6.7 million, can be fully characterized by the pipeline. Of those, ∼25%, or 1.7 million, will be double-lined binaries, a true treasure trove for stellar astrophysics.

  2. Optical Design of the Camera for Transiting Exoplanet Survey Satellite (TESS)

    Science.gov (United States)

    Chrisp, Michael; Clark, Kristin; Primeau, Brian; Dalpiaz, Michael; Lennon, Joseph

    2015-01-01

    The optical design of the wide-field-of-view refractive camera (34° diagonal field) for the TESS payload is described. This fast f/1.4 cryogenic camera, operating at −75 °C, has no vignetting for maximum light gathering within the size and weight constraints. Four of these cameras capture full frames of star images for photometric searches of planet crossings. The optical design evolution from the initial Petzval design took advantage of Forbes aspheres to develop a hybrid design form. This maximized the correction from the two aspherics, resulting in a reduction of average spot size by sixty percent in the final design. An external long-wavelength-pass filter was replaced by an internal filter coating on a lens to save weight, and has been fabricated to meet the specifications. The stray light requirements were met by an extended lens hood baffle design, giving the necessary off-axis attenuation.

  3. Machine-learning-based Brokers for Real-time Classification of the LSST Alert Stream

    Science.gov (United States)

    Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika D.; Wang, Zhe; Lochner, Michelle; Matheson, Thomas; Saha, Abhijit; Yang, Shuo; Zhao, Zhenge; Kececioglu, John; Scheidegger, Carlos; Snodgrass, Richard T.; Axelrod, Tim; Jenness, Tim; Maier, Robert S.; Ridgway, Stephen T.; Seaman, Robert L.; Evans, Eric Michael; Singh, Navdeep; Taylor, Clark; Toeniskoetter, Jackson; Welch, Eric; Zhu, Songzhe; The ANTARES Collaboration

    2018-05-01

    The unprecedented volume and rate of transient events that will be discovered by the Large Synoptic Survey Telescope (LSST) demand that the astronomical community update its follow-up paradigm. Alert-brokers—automated software systems to sift through, characterize, annotate, and prioritize events for follow-up—will be critical tools for managing alert streams in the LSST era. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is one such broker. In this work, we develop a machine learning pipeline to characterize and classify variable and transient sources using only the available multiband optical photometry. We describe three illustrative stages of the pipeline, serving the three goals of early, intermediate, and retrospective classification of alerts. The first takes the form of variable versus transient categorization, the second a multiclass typing of the combined variable and transient data set, and the third a purity-driven subtyping of a transient class. Although several similar algorithms have proven themselves in simulations, we validate their performance on real observations for the first time. We quantitatively evaluate our pipeline on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, and demonstrate very competitive classification performance. We describe our progress toward adapting the pipeline developed in this work into a real-time broker working on live alert streams from time-domain surveys.

  4. Applying UV cameras for SO2 detection to distant or optically thick volcanic plumes

    Science.gov (United States)

    Kern, Christoph; Werner, Cynthia; Elias, Tamar; Sutton, A. Jeff; Lübcke, Peter

    2013-01-01

    Ultraviolet (UV) camera systems represent an exciting new technology for measuring two-dimensional sulfur dioxide (SO2) distributions in volcanic plumes. The high frame rate of the cameras allows the retrieval of SO2 emission rates at rates of 1 Hz or higher, thus allowing the investigation of high-frequency signals and making integrated and comparative studies with other high-data-rate volcano monitoring techniques possible. One drawback of the technique, however, is the limited spectral information recorded by the imaging systems. Here, a framework for simulating the sensitivity of UV cameras to various SO2 distributions is introduced. Both the wavelength-dependent transmittance of the optical imaging system and the radiative transfer in the atmosphere are modeled. The framework is then applied to study the behavior of different optical setups and used to simulate the response of these instruments to volcanic plumes containing varying SO2 and aerosol abundances located at various distances from the sensor. Results show that UV radiative transfer in and around distant and/or optically thick plumes typically leads to a lower sensitivity to SO2 than expected when assuming a standard Beer–Lambert absorption model. Furthermore, camera response is often non-linear in SO2 and dependent on distance to the plume and on plume aerosol optical thickness and single-scatter albedo. The model results are compared with camera measurements made at Kilauea Volcano (Hawaii), and a method for integrating moderate-resolution differential optical absorption spectroscopy data with UV imagery to retrieve improved SO2 column densities is discussed.
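    The baseline retrieval the authors compare against, a standard Beer–Lambert inversion, and the light-dilution bias they model can be sketched as follows; the cross-section value and dilution fraction are illustrative assumptions, not numbers from the paper:

```python
import math

SIGMA = 1.0e-18   # effective SO2 absorption cross-section, cm^2 (illustrative)

def apparent_column(i_measured, i_background):
    """Standard Beer-Lambert retrieval: N = ln(I0 / I) / sigma."""
    return math.log(i_background / i_measured) / SIGMA

true_column = 1.0e18                                  # molecules / cm^2
i0 = 1000.0                                           # clear-sky intensity
i_plume = i0 * math.exp(-SIGMA * true_column)         # ideal transmitted light

# Radiative-transfer "dilution": a fraction f of the measured light was
# scattered into the line of sight between plume and camera and never
# crossed the plume, partly filling in the absorption dip.
f = 0.3
i_diluted = (1.0 - f) * i_plume + f * i0

n_ideal = apparent_column(i_plume, i0)      # recovers the true column
n_diluted = apparent_column(i_diluted, i0)  # biased low for the same plume
```

    Because the dilution term grows with plume distance and aerosol scattering, the bias is distance- and plume-dependent, which is the non-linearity the abstract describes.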

  5. INTRODUCING NOVEL GENERATION OF HIGH ACCURACY CAMERA OPTICAL-TESTING AND CALIBRATION TEST-STANDS FEASIBLE FOR SERIES PRODUCTION OF CAMERAS

    Directory of Open Access Journals (Sweden)

    M. Nekouei Shahraki

    2015-12-01

    The recent advances in the field of computer vision have opened the door to many opportunities for applying these techniques and technologies in many fields and applications. The high demand for such systems in today's and future vehicles implies a high production volume of video cameras. These criteria make it critical to design test systems that deliver fast and accurate calibration and optical-testing capabilities. In this paper we introduce a new generation of test-stands delivering high calibration quality in single-shot calibration of fisheye surround-view cameras. The approach incorporates important geometric features from bundle-block calibration, delivers very high (sub-pixel) calibration accuracy, enables a very fast calibration procedure (a few seconds), and realizes autonomous calibration by machines. We used the geometrical shape of a spherical helix (a 3D spherical spiral) with special geometric characteristics: a uniform radius corresponding to uniform motion. This geometry was realized mechanically using three-dimensional truncated icosahedrons, which in practice allow the implementation of a spherical helix on multiple surfaces. Furthermore, the test-stand enables many other important optical tests, such as stray-light testing, allowing certain qualities of the camera's optical module to be evaluated.

  6. Image Mosaicking Approach for a Double-Camera System in the GaoFen2 Optical Remote Sensing Satellite Based on the Big Virtual Camera.

    Science.gov (United States)

    Cheng, Yufeng; Jin, Shuying; Wang, Mi; Zhu, Ying; Dong, Zhipeng

    2017-06-20

    The linear-array push-broom imaging mode is widely used for high-resolution optical satellites (HROS). Using double cameras attached to a high-rigidity support along with push-broom imaging is one method of enlarging the field of view while ensuring high resolution. High-accuracy image mosaicking is the key factor in the geometric quality of the complete stitched satellite imagery. This paper proposes a high-accuracy image mosaicking approach based on a big virtual camera (BVC) for the double-camera system on the GaoFen2 optical remote sensing satellite (GF2). A big virtual camera can be built according to the rigorous imaging model of a single camera; each single image strip obtained by each TDI-CCD detector can then be re-projected to the virtual detector of the big-virtual-camera coordinate system using forward projection and backward projection to obtain the corresponding single virtual image. After on-orbit calibration and relative orientation, the complete final virtual image can be obtained by stitching the single virtual images together based on their coordinate information on the big virtual detector image plane. The paper uses the concept of the big virtual camera to obtain a stitched image and the corresponding high-accuracy rational function model (RFM) for concurrent post-processing. Experiments verified that the proposed method can achieve seamless mosaicking while maintaining geometric accuracy.
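The forward-/backward-projection stitching idea can be sketched in one dimension, with simple affine pixel-to-ground models standing in for the rigorous imaging model of each camera (all coefficients and the scene are illustrative):

```python
import numpy as np

# Two overlapping 1-D "detector strips" imaging the same scene.
scene = lambda x: np.sin(0.05 * x) + 0.5 * np.sin(0.21 * x)

# Affine pixel-to-ground models stand in for the rigorous imaging
# model of each real camera (illustrative coefficients).
def ground_coord(pix, a, b):        # forward projection
    return a * pix + b

n = 200
left_g = ground_coord(np.arange(n), 1.0, 0.0)     # covers [0, 199]
right_g = ground_coord(np.arange(n), 1.0, 150.0)  # covers [150, 349]

left_img = scene(left_g)
right_img = scene(right_g)

# Virtual detector: one continuous grid; backward-project each virtual
# pixel into whichever strip sees it and interpolate its intensity.
virt_g = np.arange(0, 350)
virt_img = np.where(
    virt_g <= left_g[-1],
    np.interp(virt_g, left_g, left_img),
    np.interp(virt_g, right_g, right_img),
)

# Seamless: the stitched image equals the scene on the virtual grid.
err = np.max(np.abs(virt_img - scene(virt_g)))
print(err)
```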

  7. Technical assessment of Navitar Zoom 6000 optic and Sony HDC-X310 camera for MEMS presentations and training.

    Energy Technology Data Exchange (ETDEWEB)

    Diegert, Carl F.

    2006-02-01

    This report evaluates a newly-available, high-definition, video camera coupled with a zoom optical system for microscopic imaging of micro-electro-mechanical systems. We did this work to support configuration of three document-camera-like stations as part of an installation in a new Microsystems building at Sandia National Laboratories. The video display walls to be installed as part of these three presentation and training stations are of extraordinary resolution and quality. The new availability of a reasonably-priced, cinema-quality, high-definition video camera offers the prospect of filling these displays with full-motion imaging of Sandia's microscopic products at a quality substantially beyond the quality of typical video microscopes. Simple and robust operation of the microscope stations will allow the extraordinary-quality imaging to contribute to Sandia's day-to-day research and training operations. This report illustrates the disappointing image quality from a camera/lens system comprised of a Sony HDC-X310 high-definition video camera coupled to a Navitar Zoom 6000 lens. We determined that this Sony camera is capable of substantially more image quality than the Navitar optic can deliver. We identified an optical doubler lens from Navitar as the component of their optical system that accounts for a substantial part of the image quality problem. While work continues to incrementally improve performance of the Navitar system, we are also evaluating optical systems from other vendors to couple to this Sony camera.

  8. Motionless active depth from defocus system using smart optics for camera autofocus applications

    Science.gov (United States)

    Amin, M. Junaid; Riza, Nabeel A.

    2016-04-01

    This paper describes a motionless active Depth from Defocus (DFD) system design suited for long working range camera autofocus applications. The design consists of an active illumination module that projects a scene illuminating coherent conditioned optical radiation pattern which maintains its sharpness over multiple axial distances allowing an increased DFD working distance range. The imager module of the system responsible for the actual DFD operation deploys an electronically controlled variable focus lens (ECVFL) as a smart optic to enable a motionless imager design capable of effective DFD operation. An experimental demonstration is conducted in the laboratory which compares the effectiveness of the coherent conditioned radiation module versus a conventional incoherent active light source, and demonstrates the applicability of the presented motionless DFD imager design. The fast response and no-moving-parts features of the DFD imager design are especially suited for camera scenarios where mechanical motion of lenses to achieve autofocus action is challenging, for example, in the tiny camera housings in smartphones and tablets. Applications for the proposed system include autofocus in modern day digital cameras.
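For background, the textbook thin-lens relation underlying depth from defocus (not the paper's ECVFL design) links blur-circle size on the sensor to object distance; the focal length, aperture, and focus distance below are illustrative:

```python
# Thin-lens depth-from-defocus relation: blur-circle diameter on the
# sensor for an object at distance u when the lens is focused at u_f.
f = 0.05        # focal length, m (illustrative)
A = 0.01        # aperture diameter, m
u_f = 2.0       # focused object distance, m

def blur_diameter(u):
    v_f = 1 / (1 / f - 1 / u_f)       # sensor distance for focus at u_f
    v_u = 1 / (1 / f - 1 / u)         # image distance of the object
    return A * abs(v_u - v_f) / v_u   # similar triangles

# Objects farther from the focal plane defocus more...
d3, d5 = blur_diameter(3.0), blur_diameter(5.0)
print(d3, d5)
# ...and an object at the focused distance has zero blur.
print(blur_diameter(2.0))
```

In an active DFD system the measured blur of the projected pattern is inverted through a relation of this kind to estimate depth.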

  9. Recording of radiation-induced optical density changes in doped agarose gels with a CCD camera

    International Nuclear Information System (INIS)

    Tarte, B.J.; Jardine, P.A.; Van Doorn, T.

    1996-01-01

    Spatially resolved dose measurement with iron-doped agarose gels continues to be investigated for applications in radiotherapy dosimetry. Optical methods, rather than MRI, have previously been proposed for dose measurement with such gels and have been investigated using a spectrophotometer (Appleby A and Leghrouz A, Med Phys, 18:309-312, 1991). We have previously studied the use of a pencil-beam laser for such optical density measurement of gels and are currently investigating charge-coupled device (CCD) camera imaging for the same purpose, with the advantages of higher data-acquisition rates and potentially greater spatial resolution. The gels used in these studies were poured, irradiated and optically analysed in Perspex casts providing gel sections 1 cm thick and up to 20 cm x 30 cm in dimension. The gels were also infused with a metal-indicator dye (xylenol orange) to render the radiation-induced oxidation of the iron in the gel sensitive to optical radiation, specifically in the green spectral region. Data acquisition with the CCD camera involved illuminating the irradiated gel section with a diffuse white light source, with the light from the plane of the gel section focussed onto the CCD array with a manual zoom lens. The light was also filtered with a green colour-glass filter to maximise the contrast between unirradiated and irradiated gels. The CCD camera (EG and G Reticon MC4013) featured a 1024 x 1024 pixel array and was interfaced to a PC via a frame-grabber acquisition board with 8-bit resolution. The performance of the gel dosimeter was appraised in mapping of physical and dynamic wedged 6 MV X-ray fields. The results from the CCD camera detection system were compared with both ionisation chamber data and laser-based optical density measurements of the gels. Cross-beam profiles were extracted from each measurement system at a particular depth (e.g. 2.3 cm for the physical wedge field) for direct comparison.
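The measurement reduces to a per-pixel optical density, OD = log10(I_ref/I), followed by a dose calibration. A minimal sketch with synthetic frames standing in for CCD captures and invented calibration data:

```python
import numpy as np

# Optical density from transmission images: OD = log10(I_ref / I).
# Synthetic frames stand in for CCD captures (illustrative values).
rng = np.random.default_rng(1)
I_ref = np.full((64, 64), 200.0) + rng.normal(0, 1, (64, 64))  # blank gel
I_irr = I_ref * 10 ** (-0.30)                                  # OD 0.30 region

od = np.log10(I_ref / I_irr)
print(od.mean())   # ~0.30

# Dose response calibrated with a linear fit OD = a*D + b from gels
# given known doses (calibration pairs invented for illustration).
dose = np.array([0.0, 1.0, 2.0, 4.0, 8.0])         # Gy
od_cal = np.array([0.02, 0.10, 0.19, 0.37, 0.71])  # measured ODs
a, b = np.polyfit(dose, od_cal, 1)
dose_map = (od - b) / a                             # dose image in Gy
print(dose_map.mean())
```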

  10. Optical Characterization of the SPT-3G Camera

    Science.gov (United States)

    Pan, Z.; Ade, P. A. R.; Ahmed, Z.; Anderson, A. J.; Austermann, J. E.; Avva, J. S.; Thakur, R. Basu; Bender, A. N.; Benson, B. A.; Carlstrom, J. E.; Carter, F. W.; Cecil, T.; Chang, C. L.; Cliche, J. F.; Cukierman, A.; Denison, E. V.; de Haan, T.; Ding, J.; Dobbs, M. A.; Dutcher, D.; Everett, W.; Foster, A.; Gannon, R. N.; Gilbert, A.; Groh, J. C.; Halverson, N. W.; Harke-Hosemann, A. H.; Harrington, N. L.; Henning, J. W.; Hilton, G. C.; Holzapfel, W. L.; Huang, N.; Irwin, K. D.; Jeong, O. B.; Jonas, M.; Khaire, T.; Kofman, A. M.; Korman, M.; Kubik, D.; Kuhlmann, S.; Kuo, C. L.; Lee, A. T.; Lowitz, A. E.; Meyer, S. S.; Michalik, D.; Montgomery, J.; Nadolski, A.; Natoli, T.; Nguyen, H.; Noble, G. I.; Novosad, V.; Padin, S.; Pearson, J.; Posada, C. M.; Rahlin, A.; Ruhl, J. E.; Saunders, L. J.; Sayre, J. T.; Shirley, I.; Shirokoff, E.; Smecher, G.; Sobrin, J. A.; Stark, A. A.; Story, K. T.; Suzuki, A.; Tang, Q. Y.; Thompson, K. L.; Tucker, C.; Vale, L. R.; Vanderlinde, K.; Vieira, J. D.; Wang, G.; Whitehorn, N.; Yefremenko, V.; Yoon, K. W.; Young, M. R.

    2018-05-01

    The third-generation South Pole Telescope camera is designed to measure the cosmic microwave background across three frequency bands (centered at 95, 150 and 220 GHz) with ~16,000 transition-edge sensor (TES) bolometers. Each multichroic array element on a detector wafer has a broadband sinuous antenna that couples power to six TESs, one for each of the three observing bands and both polarizations, via lumped element filters. Ten detector wafers populate the detector array, which is coupled to the sky via a large-aperture optical system. Here we present the frequency band characterization with Fourier transform spectroscopy, measurements of optical time constants, beam properties, and optical and polarization efficiencies of the detector array. The detectors have frequency bands consistent with our simulations and have high average optical efficiency: 86%, 77% and 66% for the 95, 150 and 220 GHz detectors, respectively. The time constants of the detectors are mostly between 0.5 and 5 ms. The beam is round with the correct size, and the polarization efficiency is more than 90% for most of the bolometers.
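Band centers and widths of the kind quoted above can be derived from a measured spectral response. The definitions below (response-weighted mean frequency and integrated bandwidth) are common conventions, not necessarily the ones used by SPT-3G, and the idealized boxcar band is illustrative:

```python
import numpy as np

# Idealized 95 GHz band on a frequency grid (0.1 GHz spacing).
nu = np.linspace(60.0, 130.0, 701)                        # GHz
resp = np.where((nu > 80.05) & (nu < 109.95), 1.0, 0.0)   # boxcar response
dnu = nu[1] - nu[0]

# Band center: response-weighted mean frequency.
center = np.sum(nu * resp) / np.sum(resp)
# Bandwidth: integral of the response normalized by its peak.
width = np.sum(resp) * dnu / resp.max()
print(center, width)
```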

  11. Behavioral Model of High Performance Camera for NIF Optics Inspection

    International Nuclear Information System (INIS)

    Hackel, B M

    2007-01-01

    The purpose of this project was to develop software that models the behavior of the high-performance Spectral Instruments 1000-series charge-coupled device (CCD) camera located in the Final Optics Damage Inspection (FODI) system on the National Ignition Facility. NIF's target chamber will be mounted with 48 Final Optics Assemblies (FOAs) to convert the laser light from infrared to ultraviolet and focus it precisely on the target. Following a NIF shot, the optical components of each FOA must be carefully inspected for damage by the FODI to ensure proper laser performance during subsequent experiments. Rapid image capture and complex image processing (to locate damage sites) will reduce shot turnaround time, thus increasing the total number of experiments NIF can conduct during its 30-year lifetime. Development of these rapid processes necessitates extensive offline software automation, especially after the device has been deployed in the facility. Without access to the unique real device or an exact behavioral model, offline software testing is difficult. Furthermore, a software-based behavioral model allows many instances to run concurrently, letting multiple developers test their software at the same time. It is therefore beneficial to construct separate software that exactly mimics the behavior and response of the real SI-1000 camera.

  12. Report on the Radiation Effects Testing of the Infrared and Optical Transition Radiation Camera Systems

    Energy Technology Data Exchange (ETDEWEB)

    Holloway, Michael Andrew [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-04-20

    Presented in this report are the results of tests performed at Argonne National Laboratory in collaboration with Los Alamos National Laboratory to assess the reliability of the critical 99Mo production facility beam-monitoring diagnostics. The main components of the beam-monitoring systems are two cameras that will be exposed to radiation during accelerator operation. The purpose of this test was to assess the reliability of the cameras and related optical components when exposed to operational radiation levels. Both X-ray and neutron radiation could potentially damage camera electronics as well as optical components such as lenses and windows. This report covers the results of testing component reliability under X-ray radiation. With the information from this study we provide recommendations for implementing protective measures for the camera systems in order to minimize the occurrence of radiation-induced failure within a ten-month production run cycle.

  13. Multiple-aperture optical design for micro-level cameras using 3D-printing method

    Science.gov (United States)

    Peng, Wei-Jei; Hsu, Wei-Yao; Cheng, Yuan-Chieh; Lin, Wen-Lung; Yu, Zong-Ru; Chou, Hsiao-Yu; Chen, Fong-Zhi; Fu, Chien-Chung; Wu, Chong-Syuan; Huang, Chao-Tsung

    2018-02-01

    The design of an ultra-miniaturized camera fabricated by 3D printing directly onto a complementary metal-oxide-semiconductor (CMOS) imaging sensor is presented in this paper. The 3D-printed micro-optics are manufactured using femtosecond two-photon direct laser writing, whose sub-micron figure accuracy is suitable for the optical system. Because the size of a micro-level camera is approximately several hundred micrometers, its resolution is greatly reduced and is limited by the Nyquist frequency of the pixel pitch. To improve the resolution, a single lens can be replaced by multiple-aperture lenses with dissimilar fields of view (FOV); stitching sub-images with different FOVs then achieves high resolution within the central region of the image. The reason is that the angular resolution of a lens with a smaller FOV is higher than that of one with a larger FOV, so after stitching, the angular resolution of the central area can be several times that of the outer area. For the same image circle, the image quality of the central area of the multi-lens system is significantly superior to that of a single lens. Foveated imaging by FOV stitching breaks the resolution limit of the ultra-miniaturized imaging system, enabling applications such as biomedical endoscopy, optical sensing, and machine vision. In this study, an ultra-miniaturized camera with multi-aperture optics is designed and simulated for optimum optical performance.
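The angular-resolution argument is easy to quantify: for a fixed pixel count, a narrower FOV spreads over the same pixels and yields proportionally finer angular sampling in the stitched center. The numbers below are illustrative:

```python
# Angular sampling per pixel for two lenses sharing the same sensor.
n_pix = 400                           # pixels across the sensor (illustrative)
fov_wide, fov_narrow = 120.0, 40.0    # fields of view, degrees

res_wide = fov_wide / n_pix           # deg/pixel for the wide-FOV lens
res_narrow = fov_narrow / n_pix       # deg/pixel for the narrow-FOV lens

# Stitching the narrow-FOV image into the center gives 3x finer
# angular sampling there than the wide-FOV lens alone.
print(res_wide / res_narrow)
```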

  14. Euratom multi-camera optical surveillance system (EMOSS) - a digital solution

    International Nuclear Information System (INIS)

    Otto, P.; Wagner, H.G.; Taillade, B.; Pryck, C. de.

    1991-01-01

    In 1989 the Euratom Safeguards Directorate of the Commission of the European Communities drew up functional and draft technical specifications for a new fully digital multi-camera optical surveillance system. HYMATOM of Castries designed and built a prototype unit for laboratory and field tests. This paper reports on the system design and the first test results.

  15. Completely optical orientation determination for an unstabilized aerial three-line camera

    Science.gov (United States)

    Wohlfeil, Jürgen

    2010-10-01

    Aerial line cameras allow the fast acquisition of high-resolution images at low cost. Unfortunately, measuring the camera's orientation at the necessary rate and precision requires considerable effort unless extensive camera stabilization is used. But stabilization, too, entails high cost, weight, and power consumption. This contribution shows that it is possible to derive the complete absolute exterior orientation of an unstabilized line camera from its images and global position measurements. The presented approach is based on previous work on determining the relative orientation of subsequent lines using optical information from the remote sensing system. The relative orientation is used to pre-correct the line images, in which homologous points can then reliably be determined using the SURF operator. Together with the position measurements, these points are used to determine the absolute orientation from the relative orientations via bundle adjustment of a block of overlapping line images. The approach was tested on a flight with the DLR's RGB three-line camera MFC. To evaluate the precision of the resulting orientation, measurements from a high-end navigation system and ground control points are used.

  16. Development of an all-optical framing camera and its application on the Z-pinch.

    Science.gov (United States)

    Song, Yan; Peng, Bodong; Wang, Hong-Xing; Song, Guzhou; Li, Binkang; Yue, Zhiqin; Li, Yang; Sun, Tieping; Xu, Qing; Ma, Jiming; Sheng, Liang; Han, Changcai; Duan, Baojun; Yao, Zhiming; Yan, Weipeng

    2017-12-11

    An all-optical framing camera has been developed that measures the spatial profile of photon flux by using a laser beam to probe the refractive-index change in an indium phosphide semiconductor. The camera acquires two frames, with a time resolution of about 1.5 ns and an inter-frame separation of about 13 ns, by angularly multiplexing the probe beam onto the semiconductor. Its spatial resolution has been estimated to be about 140 μm, and its spectral response has been investigated theoretically over the 5 eV-100 keV range. The camera has been applied to investigating the imploding dynamics of a molybdenum planar wire-array Z-pinch on the 1-MA "QiangGuang-1" facility, and it provides an alternative diagnostic scheme for high-energy-density physics experiments.

  17. Optical design of the comet Shoemaker-Levy speckle camera

    Energy Technology Data Exchange (ETDEWEB)

    Bissinger, H. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    An optical design is presented in which the Lick 3-meter telescope and a bare-CCD speckle camera system were used to image the collision sites of comet Shoemaker-Levy 9 with the planet Jupiter. The brief overview includes the optical constraints and system layout. The choice of a Risley prism combination to compensate for time-dependent atmospheric chromatic changes is described. Plate-scale and signal-to-noise-ratio curves resulting from imaging reference stars are compared with theory, and uncorrected images of Jupiter's impact sites are compared with reconstructed ones. The results confirm that speckle imaging techniques can be used over an extended time period to image large extended objects.

  18. Fast force actuators for LSST primary/tertiary mirror

    Science.gov (United States)

    Hileman, Edward; Warner, Michael; Wiecha, Oliver

    2010-07-01

    The very short slew times and resulting high inertial loads imposed upon the Large Synoptic Survey Telescope (LSST) create new challenges for the primary mirror support actuators. Traditionally, large borosilicate mirrors are supported by pneumatic systems, and this is also the case for the LSST. These force-based actuators bear the weight of the mirror and provide active figure correction, but do not define the mirror position. A set of six locating actuators (hardpoints) arranged in a hexapod fashion serves to locate the mirror. The stringent dynamic requirements demand that the force actuators counteract dynamic forces on the hardpoints in real time during slewing to prevent excessive hardpoint loads. The support actuators must also maintain the prescribed forces accurately during tracking to maintain acceptable mirror figure. To meet these requirements, candidate pneumatic cylinders incorporating force feedback control and high-speed servo valves are being tested using custom instrumentation with automatic data recording. Comparative charts are produced showing details of friction, hysteresis cycles, operating bandwidth, and temperature dependency. Extremely low-power actuator controllers are being developed to avoid heat dissipation in critical portions of the mirror and also to allow for increased control capabilities at the actuator level, thus improving the safety, performance, and flexibility of the support system.

  19. Retinal axial focusing and multi-layer imaging with a liquid crystal adaptive optics camera

    International Nuclear Information System (INIS)

    Liu Rui-Xue; Zheng Xian-Liang; Li Da-Yu; Hu Li-Fa; Cao Zhao-Liang; Mu Quan-Quan; Xuan Li; Xia Ming-Liang

    2014-01-01

    With the help of adaptive optics (AO) technology, cellular-level imaging of the living human retina can be achieved. To reduce subject discomfort and avoid potential drug-induced complications, we attempted to image the retina with a dilated pupil and frozen accommodation, without drugs. An optimized liquid crystal adaptive optics camera was adopted for retinal imaging. A novel eye-stared system was used for stimulating accommodation and fixating the imaging area. The illumination sources and the imaging camera were moved in linkage for focusing on and imaging different layers. Four subjects with differing degrees of myopia were imaged. Based on the optical properties of the human eye, the eye-stared system reduced the defocus to less than the typical ocular depth of focus, so the illumination light could be projected onto a given retinal layer precisely. With the defocus compensated by the eye-stared system, the adopted 512 × 512 liquid crystal spatial light modulator (LC-SLM) corrector provided the spatial fidelity needed to fully compensate high-order aberrations. The Strehl ratio for a subject with −8 diopters of myopia was improved to 0.78, close to diffraction-limited imaging. By finely adjusting the axial displacement of the illumination sources and the imaging camera, cone photoreceptors, blood vessels and the nerve fiber layer were all imaged clearly.

  20. Wavefront analysis for plenoptic camera imaging

    International Nuclear Information System (INIS)

    Luan Yin-Sen; Xu Bing; Yang Ping; Tang Guo-Mao

    2017-01-01

    The plenoptic camera is a single-lens stereo camera that can retrieve the direction of light rays while detecting their intensity distribution. In this paper, to reveal more about plenoptic camera imaging, we present a wavefront analysis of the plenoptic camera from the standpoint of physical optics rather than the ray-tracing model of geometric optics. Specifically, the wavefront imaging model of a plenoptic camera is analyzed and simulated using scalar diffraction theory, and depth estimation is re-described on the basis of physical optics. We simulate a set of raw plenoptic images of an object scene, thereby validating the analysis and derivations; the difference between imaging analyses based on geometric optics and physical optics is also shown in the simulations.

  1. Reconstruction of data for an experiment using multi-gap spark chambers with six-camera optics

    International Nuclear Information System (INIS)

    Maybury, R.; Daley, H.M.

    1983-06-01

    A program has been developed to reconstruct spark positions in a pair of multi-gap optical spark chambers viewed by six cameras, which were used by a Rutherford Laboratory experiment. The procedure for correlating camera views to calculate spark positions is described. Calibration of the apparatus, and the application of time- and intensity-dependent corrections are discussed. (author)

  2. Sweep time performance of optic streak camera

    International Nuclear Information System (INIS)

    Wang Zhebin; Yang Dong; Zhang Huige

    2012-01-01

    The sweep-time performance of an optical streak camera (OSC) is of critical importance to its application. A systematic analysis of the full-screen sweep velocity shows that the traditional method, based on the averaged velocity and its nonlinearity, increases the uncertainty of the sweep time and cannot reflect the influence of the spatial distortion of the OSC. A refined method for determining the sweep time has therefore been developed with the aid of the full-screen sweep velocity and its uncertainty. Theoretical analysis and experimental study show that this method reduces the sweep-time uncertainty to within 1%, improving the accuracy of the sweep time and the reliability of OSC applications.
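The distinction between the averaged-velocity shortcut and using the full-screen velocity profile can be illustrated numerically: the sweep time is t = ∫ dx/v(x), which differs from (screen length)/(mean velocity) whenever the sweep is nonlinear. The velocity profile below is illustrative:

```python
import numpy as np

# Full-screen sweep-velocity profile (illustrative, mildly nonlinear).
x = np.linspace(0.0, 30.0, 3001)          # screen position, mm
v = 20.0 * (1 + 0.1 * (x / 30.0) ** 2)    # sweep velocity, mm/ns

dx = x[1] - x[0]
t_exact = np.sum(dx / v[:-1])             # numerical integral of dx / v(x)
t_avg = (x[-1] - x[0]) / v.mean()         # average-velocity approximation

print(t_exact, t_avg, abs(t_avg - t_exact) / t_exact)
```

For this gentle nonlinearity the two agree to better than a percent; stronger nonlinearity or spatial distortion widens the gap, which is the motivation for the full-screen treatment.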

  3. Seeing in a different light—using an infrared camera to teach heat transfer and optical phenomena

    Science.gov (United States)

    Pei Wong, Choun; Subramaniam, R.

    2018-05-01

    The infrared camera is a useful tool in physics education to ‘see’ in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.

  4. Seeing in a Different Light--Using an Infrared Camera to Teach Heat Transfer and Optical Phenomena

    Science.gov (United States)

    Wong, Choun Pei; Subramaniam, R.

    2018-01-01

    The infrared camera is a useful tool in physics education to 'see' in the infrared. In this paper, we describe four simple experiments that focus on phenomena related to heat transfer and optics that are encountered at undergraduate physics level using an infrared camera, and discuss the strengths and limitations of this tool for such purposes.

  5. Defect testing of large aperture optics based on high resolution CCD camera

    International Nuclear Information System (INIS)

    Cheng Xiaofeng; Xu Xu; Zhang Lin; He Qun; Yuan Xiaodong; Jiang Xiaodong; Zheng Wanguo

    2009-01-01

    A fast method for inspecting defects in large-aperture optics is introduced. Under uniform illumination by an LED source at grazing incidence, the images of defects on the surface of, and inside, large-aperture optics are enlarged by scattering. Images of the defects were acquired with a high-resolution CCD camera and a microscope, and the approximate mathematical relation between the viewed dimension and the real dimension of a defect was simulated. The approximate real dimension and location of every defect could thus be calculated from the high-resolution pictures.
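The viewed-to-real dimension mapping can be sketched as a calibration fit: defects of known size give apparent (scatter-enlarged) sizes, and the fitted relation is inverted for unknown defects. The calibration pairs below are invented for illustration and do not come from the paper:

```python
import numpy as np

# Calibration: known defect sizes vs. apparent sizes in the
# grazing-incidence scatter image (illustrative data).
real_um = np.array([5.0, 10.0, 20.0, 50.0, 100.0])      # known sizes, um
viewed_um = np.array([38.0, 52.0, 80.0, 165.0, 305.0])  # apparent sizes, um

# Approximate linear relation viewed = k * real + c, inverted for use.
k, c = np.polyfit(real_um, viewed_um, 1)

def real_size(viewed):
    """Estimate the real defect dimension from the viewed dimension."""
    return (viewed - c) / k

print(real_size(80.0))   # ~20 um
```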

  6. Simulating Galaxies and Active Galactic Nuclei in the LSST Image Simulation Effort

    NARCIS (Netherlands)

    Pizagno II, Jim; Ahmad, Z.; Bankert, J.; Bard, D.; Connolly, A.; Chang, C.; Gibson, R. R.; Gilmore, K.; Grace, E.; Hannel, M.; Jernigan, J. G.; Jones, L.; Kahn, S. M.; Krughoff, S. K.; Lorenz, S.; Marshall, S.; Shmakova, S. M.; Sylvestri, N.; Todd, N.; Young, M.

    We present an extragalactic source catalog, which includes galaxies and active galactic nuclei, that is used for the Large Synoptic Survey Telescope imaging simulation effort. The galaxies are taken from the De Lucia et al. (2006) semi-analytic modeling (SAM) of the Millennium Simulation. The LSST

  7. Detection of Double White Dwarf Binaries with Gaia, LSST and eLISA

    Science.gov (United States)

    Korol, V.; Rossi, E. M.; Groot, P. J.

    2017-03-01

    According to simulations, around 10^8 double degenerate white dwarf binaries (DWDs) are expected to be present in the Milky Way. Due to their intrinsic faintness, the detection of these systems is a challenge, and the total number of detected sources so far amounts to only a few tens. This will change in the next two decades with the advent of Gaia, the LSST and eLISA. We present an estimate of how many compact DWDs with orbital periods of less than a few hours we will be able to detect (1) through electromagnetic radiation with Gaia and the LSST and (2) through gravitational-wave radiation with eLISA. We find that the sample of simultaneous electromagnetic and gravitational-wave detections is expected to be substantial, and will provide a powerful tool for probing white dwarf astrophysics and the structure of the Milky Way, ushering these sources into the era of multi-messenger astronomy.

  8. Radiation Dose-Rate Extraction from the Camera Image of Quince 2 Robot System using Optical Character Recognition

    International Nuclear Information System (INIS)

    Cho, Jai Wan; Jeong, Kyung Min

    2012-01-01

    In the Japanese Quince 2 robot system, 7 CCD/CMOS cameras were used. Two CCD cameras are used for forward and backward monitoring of the surroundings during navigation, and two CCD (or CMOS) cameras monitor the status of the front-end and back-end motion mechanics, such as the flippers and crawlers. A CCD camera with wide-field-of-view optics monitors the status of the communication (VDSL) cable reel, and another two CCD cameras are assigned to reading the indicated values of the radiation dosimeter and instrumentation. The Quince 2 robot measured radiation on the unit 2 reactor building refueling floor of the Fukushima nuclear power plant. The CCD camera with the wide-field-of-view (fisheye) lens reads the indicator of the dosimeter loaded on the Quince 2 robot, which was sent to investigate the situation on the unit 2 reactor building refueling floor. The camera image carrying the gamma-ray dose-rate information is transmitted to the remote control site via a VDSL communication line, where the radiation level on the refueling floor can be perceived by monitoring the image. To build up a radiation profile of the surveyed refueling floor, the gamma-ray dose-rate information in the image must be converted to a numerical value. In this paper, we extract the gamma-ray dose-rate values on the unit 2 reactor building refueling floor using an optical character recognition method
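The character-recognition step can be illustrated with a minimal template-matching scheme on binary digit glyphs; this is a generic sketch, not the processing chain actually used for the Quince 2 imagery:

```python
import numpy as np

# Digit templates as 3x5 binary glyphs (seven-segment style, partial set).
GLYPHS = {
    "0": "111101101101111", "1": "010010010010010",
    "2": "111001111100111", "5": "111100111001111",
}

def glyph(s):
    """Turn a 15-character bit string into a 5x3 binary array."""
    return np.array(list(s), dtype=int).reshape(5, 3)

TEMPLATES = {d: glyph(s) for d, s in GLYPHS.items()}

def read_digit(img):
    """Classify a 5x3 binary patch as the nearest template
    (minimum Hamming distance over the known digits)."""
    return min(TEMPLATES, key=lambda d: np.abs(img - TEMPLATES[d]).sum())

# A noisy '5' with one flipped pixel is still recognized correctly.
noisy = glyph(GLYPHS["5"]).copy()
noisy[0, 0] ^= 1
print(read_digit(noisy))
```

A real pipeline would first locate and binarize the dosimeter display in the fisheye image before classifying each character.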

  9. Radiation Dose-Rate Extraction from the Camera Image of Quince 2 Robot System using Optical Character Recognition

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Wan; Jeong, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    In the Japanese Quince 2 robot system, 7 CCD/CMOS cameras were used. Two CCD cameras are used for forward and backward monitoring of the surroundings during navigation, and two CCD (or CMOS) cameras monitor the status of the front-end and back-end motion mechanics, such as the flippers and crawlers. A CCD camera with wide-field-of-view optics monitors the status of the communication (VDSL) cable reel, and another two CCD cameras are assigned to reading the indicated values of the radiation dosimeter and instrumentation. The Quince 2 robot measured radiation on the unit 2 reactor building refueling floor of the Fukushima nuclear power plant. The CCD camera with the wide-field-of-view (fisheye) lens reads the indicator of the dosimeter loaded on the Quince 2 robot, which was sent to investigate the situation on the unit 2 reactor building refueling floor. The camera image carrying the gamma-ray dose-rate information is transmitted to the remote control site via a VDSL communication line, where the radiation level on the refueling floor can be perceived by monitoring the image. To build up a radiation profile of the surveyed refueling floor, the gamma-ray dose-rate information in the image must be converted to a numerical value. In this paper, we extract the gamma-ray dose-rate values on the unit 2 reactor building refueling floor using an optical character recognition method

  10. Variability-based active galactic nucleus selection using image subtraction in the SDSS and LSST era

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yumi; Gibson, Robert R.; Becker, Andrew C.; Ivezić, Željko; Connolly, Andrew J.; Ruan, John J.; Anderson, Scott F. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); MacLeod, Chelsea L., E-mail: ymchoi@astro.washington.edu [Physics Department, U.S. Naval Academy, 572 Holloway Road, Annapolis, MD 21402 (United States)

    2014-02-10

    With upcoming all-sky surveys such as LSST poised to generate a deep digital movie of the optical sky, variability-based active galactic nucleus (AGN) selection will enable the construction of highly complete catalogs with minimum contamination. In this study, we generate g-band difference images and construct light curves (LCs) for QSO/AGN candidates listed in Sloan Digital Sky Survey Stripe 82 public catalogs compiled from different methods, including spectroscopy, optical colors, variability, and X-ray detection. Image differencing excels at identifying variable sources embedded in complex or blended emission regions such as Type II AGNs and other low-luminosity AGNs that may be omitted from traditional photometric or spectroscopic catalogs. To separate QSOs/AGNs from other sources using our difference image LCs, we explore several LC statistics and parameterize optical variability by the characteristic damping timescale (τ) and variability amplitude. By virtue of distinguishable variability parameters of AGNs, we are able to select them with high completeness of 93.4% and efficiency (i.e., purity) of 71.3%. Based on optical variability, we also select highly variable blazar candidates, whose infrared colors are consistent with known blazars. One-third of them are also radio detected. With the X-ray selected AGN candidates, we probe the optical variability of X-ray detected optically extended sources using their difference image LCs for the first time. A combination of optical variability and X-ray detection enables us to select various types of host-dominated AGNs. Contrary to the AGN unification model prediction, two Type II AGN candidates (out of six) show detectable variability on long-term timescales like typical Type I AGNs. This study will provide a baseline for future optical variability studies of extended sources.
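
The damping timescale and variability amplitude above are typically estimated from light-curve statistics. As a hedged, self-contained sketch (not the paper's pipeline), a first-order structure function is one common way to see whether variability grows with time separation, AGN-like, before fitting a damped-random-walk model:

```python
# Illustrative sketch: first-order structure function of a light curve.
# A rising structure function indicates correlated (AGN-like) variability;
# a flat one is consistent with a non-variable source plus noise.
def structure_function(times, mags, bins):
    """Mean |delta-mag| in each delta-t bin; bins is a list of (lo, hi) ranges."""
    sums = [0.0] * len(bins)
    counts = [0] * len(bins)
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            dt = abs(times[j] - times[i])
            dm = abs(mags[j] - mags[i])
            for k, (lo, hi) in enumerate(bins):
                if lo <= dt < hi:
                    sums[k] += dm
                    counts[k] += 1
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]

# A toy light curve whose scatter grows with time separation.
times = list(range(10))
mags = [20.0, 20.05, 19.98, 20.12, 20.2, 20.1, 20.3, 20.25, 20.4, 20.5]
sf = structure_function(times, mags, bins=[(0, 3), (3, 6), (6, 10)])
print(sf)  # increasing values across the three bins
```

In a real analysis the structure function (or a direct likelihood fit) yields the amplitude and timescale tau used for the selection cuts described in the abstract.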

  11. Trained neurons-based motion detection in optical camera communications

    Science.gov (United States)

    Teli, Shivani; Cahyadi, Willy Anugrah; Chung, Yeon Ho

    2018-04-01

    A concept of trained neurons-based motion detection (TNMD) in optical camera communications (OCC) is proposed. The proposed TNMD is based on neurons in a neural network that perform repetitive analysis in order to provide efficient and reliable motion detection in OCC. This efficient motion detection can be considered another functionality of OCC, in addition to the two traditional functionalities of illumination and communication. To verify the proposed TNMD, experiments were conducted on an indoor static downlink OCC, where a mobile phone front camera is employed as the receiver and an 8 × 8 red, green, and blue (RGB) light-emitting diode array as the transmitter. The motion is detected by observing the user's finger movement in the form of a centroid through the OCC link via the camera. Unlike conventional trained-neuron approaches, the proposed TNMD is trained not with the motion itself but with centroid data samples, thus providing more accurate detection and a far less complex detection algorithm. The experimental results demonstrate that the TNMD can detect all considered motions accurately, with acceptable bit error rate (BER) performance at transmission distances of up to 175 cm. In addition, while the TNMD is performed, a maximum data rate of 3.759 kbps over the OCC link is obtained. The OCC combined with the proposed TNMD can be considered an efficient indoor OCC system that provides illumination, communication, and motion detection in a convenient smart home environment.
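
As a hedged sketch of the centroid-based idea (the prototype values and classifier here are illustrative assumptions, not the authors' trained network), motion can be read off as the displacement of the bright-region centroid between frames, matched against trained displacement samples:

```python
# Illustrative sketch: track the intensity-weighted centroid of a finger/spot
# across frames and classify its displacement against trained prototypes.
def centroid(frame):
    """Intensity-weighted centroid (row, col) of a 2-D grayscale frame."""
    total = wr = wc = 0.0
    for r, row in enumerate(frame):
        for c, v in enumerate(row):
            total += v
            wr += r * v
            wc += c * v
    return (wr / total, wc / total)

def classify_motion(start, end, prototypes):
    """Nearest-prototype label for the displacement vector start -> end."""
    dr, dc = end[0] - start[0], end[1] - start[1]
    return min(prototypes,
               key=lambda label: (prototypes[label][0] - dr) ** 2 +
                                 (prototypes[label][1] - dc) ** 2)

# In the paper these would come from training samples of centroid motion;
# the values below are made up for the demo.
PROTOTYPES = {"left": (0, -2), "right": (0, 2), "up": (-2, 0), "down": (2, 0)}

frame_a = [[0, 9, 0], [0, 0, 0], [0, 0, 0]]   # bright spot at top
frame_b = [[0, 0, 0], [0, 0, 0], [0, 9, 0]]   # spot moved to bottom
motion = classify_motion(centroid(frame_a), centroid(frame_b), PROTOTYPES)
print(motion)  # down
```

Training on centroid samples rather than raw motion frames, as the abstract notes, keeps the classifier small: the input is a 2-D displacement, not an image sequence.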

  12. Optimization of a miniature short-wavelength infrared objective optics of a short-wavelength infrared to visible upconversion layer attached to a mobile-devices visible camera

    Science.gov (United States)

    Kadosh, Itai; Sarusi, Gabby

    2017-10-01

    The use of dual cameras in parallax in order to detect and create 3-D images in mobile devices has been increasing over the last few years. We propose a concept where the second camera operates in the short-wavelength infrared (SWIR, 1300 to 1800 nm) and thus has night vision capability, while preserving most of the other advantages of dual cameras in terms of depth and 3-D capabilities. In order to maintain commonality of the two cameras, we propose to attach to one of the cameras a SWIR-to-visible upconversion layer that converts the SWIR image into a visible image. For this purpose, the fore optics (the objective lenses) should be redesigned for the SWIR spectral range and for the additional upconversion layer that is attached to the mobile device's visible-range camera sensor (the CMOS sensor). This paper presents such a SWIR objective optical design and optimization that conforms and fits mechanically to the visible objective design, but with different lenses, in order to maintain commonality and serve as a proof of concept. Such a SWIR objective design is very challenging, since it requires mimicking the original visible mobile camera lenses' sizes and mechanical housing so that we can adhere to the visible optical and mechanical design. We present an in-depth feasibility study and the overall optical system performance of such a SWIR mobile-device camera fore-optics design.

  13. Optical character recognition of camera-captured images based on phase features

    Science.gov (United States)

    Diaz-Escobar, Julia; Kober, Vitaly

    2015-09-01

    Nowadays most digital information is obtained using mobile devices, especially smartphones. In particular, this brings the opportunity for optical character recognition in camera-captured images. For this reason many recognition applications have been developed recently, such as recognition of license plates, business cards, receipts and street signs; document classification; augmented reality; language translation; and so on. Camera-captured images are usually affected by geometric distortions, non-uniform illumination, shadows, and noise, which make the recognition task difficult for existing systems. It is well known that the Fourier phase carries much of the important information, independently of the Fourier magnitude. So, in this work we propose a phase-based recognition system exploiting phase-congruency features for illumination/scale invariance. The performance of the proposed system is tested in terms of misclassifications and false alarms with the help of computer simulation.
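
The "phase carries the information" observation can be demonstrated with phase-only correlation, a classic phase-based matching core related in spirit to (but simpler than) the phase-congruency features the paper uses. The sketch below is illustrative, not the paper's method: the Fourier magnitude is discarded, so the match is insensitive to a uniform brightness change.

```python
# Illustrative sketch: 1-D phase-only correlation. Discarding the Fourier
# magnitude makes the estimated shift invariant to uniform intensity scaling.
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def phase_correlation_shift(a, b):
    """Estimate the circular shift between two equal-length 1-D signals."""
    A, B = dft(a), dft(b)
    cross = []
    for x, y in zip(A, B):
        p = x * y.conjugate()
        cross.append(p / abs(p) if abs(p) > 1e-12 else 0)  # keep phase only
    corr = idft(cross)
    return max(range(len(corr)), key=lambda i: corr[i].real)

template = [0, 0, 1, 3, 1, 0, 0, 0]
scene = [0, 0, 0, 0, 0, 2, 6, 2]  # template shifted by 3 and twice as bright
print(phase_correlation_shift(scene, template))  # 3
```

The peak of the phase-only correlation lands at the shift regardless of the 2x brightness change, which is the kind of illumination invariance the abstract is after (the full system extends this to 2-D and to local phase congruency).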

  14. The Search for Transients and Variables in the LSST Pathfinder Survey

    Science.gov (United States)

    Gorsuch, Mary Katherine; Kotulla, Ralf

    2018-01-01

    This research was completed during participation in the NSF-REU program at the University of Wisconsin-Madison. Two fields of a few square degrees, close to the galactic plane, were imaged with the WIYN 3.5 meter telescope during the commissioning of the One Degree Imager (ODI) focal plane. These images were taken with repeated, shorter exposures in order to model an LSST-like cadence, with the aim of identifying transient and variable light sources. This was done by using Source Extractor to generate a catalog of all sources in each exposure, and inserting these data into a larger photometry database composed of all exposures for each field. A Python code was developed to analyze the data and isolate sources of interest from the large data set. We found some discrepancies in the data, which led to interesting results that we are looking into further. Variable and transient sources, while relatively well understood, are not numerous in current cataloging systems. Cataloging them will be a major undertaking of the Large Synoptic Survey Telescope (LSST), to which this project is a precursor. Locating these sources may give us a better understanding of where they are located and how they impact their surroundings.
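
The "isolate sources of interest" step can be sketched as follows. This is an assumed, minimal stand-in for the Python analysis described (the source IDs, magnitudes, and 3-sigma cut are all illustrative): flag sources whose epoch-to-epoch scatter exceeds what their photometric errors allow.

```python
# Illustrative sketch: flag candidate variables from a multi-exposure
# photometry table by comparing magnitude scatter to the quoted errors.
import statistics

def flag_variables(light_curves, errors, sigma_factor=3.0):
    """light_curves: {source_id: [mag per exposure]}; errors: typical error."""
    variables = []
    for src, mags in light_curves.items():
        if len(mags) < 3:
            continue  # too few epochs to judge variability
        scatter = statistics.stdev(mags)
        if scatter > sigma_factor * errors[src]:
            variables.append(src)
    return variables

light_curves = {
    "star_0012": [18.01, 18.02, 17.99, 18.00, 18.01],   # constant source
    "cand_0345": [17.5, 17.9, 16.8, 18.2, 17.1],        # strongly variable
}
errors = {"star_0012": 0.02, "cand_0345": 0.02}
print(flag_variables(light_curves, errors))  # ['cand_0345']
```

A real pipeline would pull the per-exposure rows out of the photometry database and cross-match positions first; the statistical cut itself is this simple.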

  15. Time-Of-Flight Camera, Optical Tracker and Computed Tomography in Pairwise Data Registration.

    Directory of Open Access Journals (Sweden)

    Bartlomiej Pycinski

    A growing number of medical applications, including minimally invasive surgery, depend on multi-modal or multi-sensor data processing. Fast and accurate 3D scene analysis, comprising data registration, seems to be crucial for the development of computer-aided diagnosis and therapy. The advancement of surface tracking systems based on optical trackers already plays an important role in surgical procedure planning. However, new modalities, like time-of-flight (ToF) sensors, widely explored in non-medical fields, are powerful and have the potential to become a part of computer-aided surgery set-ups. Connecting different acquisition systems promises to provide valuable support for operating room procedures. Therefore, a detailed analysis of the accuracy of such multi-sensor positioning systems is needed. We present a system combining pre-operative CT series with intra-operative ToF-sensor and optical tracker point clouds. The methodology comprises the optical sensor set-up and ToF-camera calibration procedures, data pre-processing algorithms, and a registration technique. The data pre-processing yields a surface in the case of CT, and point clouds for the ToF-sensor and marker-driven optical tracker representations of an object of interest. The applied registration technique is based on the Iterative Closest Point algorithm. The experiments validate the registration of each pair of modalities/sensors involving phantoms of four various human organs, in terms of the Hausdorff distance and mean absolute distance metrics. The best surface alignment was obtained for the CT and optical tracker combination, whereas the worst was for experiments involving the ToF-camera. The obtained accuracies encourage further development of multi-sensor systems. The presented discussion of the system's limitations and possible improvements, mainly related to the depth information produced by the ToF-sensor, is useful for computer-aided surgery developers.
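
The two alignment metrics named in the abstract are standard and easy to state concretely. A minimal sketch (assumed point clouds; brute-force nearest neighbours, fine for small clouds):

```python
# Illustrative sketch: Hausdorff distance and mean absolute distance between
# two 3-D point clouds, the metrics used to score each registered sensor pair.
import math

def _nearest(p, cloud):
    return min(math.dist(p, q) for q in cloud)

def hausdorff(a, b):
    """Symmetric Hausdorff distance: worst-case nearest-neighbour gap."""
    return max(max(_nearest(p, b) for p in a),
               max(_nearest(q, a) for q in b))

def mean_absolute_distance(a, b):
    """Average nearest-neighbour distance from cloud a onto cloud b."""
    return sum(_nearest(p, b) for p in a) / len(a)

# Toy data: a registered ToF cloud hovering ~0.1 above a CT surface patch,
# with one outlier point at 0.5.
ct_surface = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
tof_cloud = [(0, 0, 0.1), (1, 0, 0.1), (0, 1, 0.1), (1, 1, 0.5)]
print(round(hausdorff(ct_surface, tof_cloud), 3))               # 0.5
print(round(mean_absolute_distance(ct_surface, tof_cloud), 3))  # 0.2
```

Note the asymmetry in their sensitivity: the Hausdorff distance is dominated by the single outlier, while the mean absolute distance averages it away, which is why papers usually report both.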

  16. Streak camera measurements of laser pulse temporal dispersion in short graded-index optical fibers

    International Nuclear Information System (INIS)

    Lerche, R.A.; Phillips, G.E.

    1981-01-01

    Streak camera measurements were used to determine temporal dispersion in short (5 to 30 meter) graded-index optical fibers. Results show that 50-ps, 1.06-μm and 0.53-μm laser pulses can be propagated without significant dispersion when care is taken to prevent propagation of energy in fiber cladding modes

  17. Speed of sound and photoacoustic imaging with an optical camera based ultrasound detection system

    Science.gov (United States)

    Nuster, Robert; Paltauf, Guenther

    2017-07-01

    CCD camera based optical ultrasound detection is a promising alternative approach for high resolution 3D photoacoustic imaging (PAI). To fully exploit its potential and to achieve high image resolution, it is necessary to take the speed of sound (SOS) into account in the image reconstruction algorithm. Hence, this work presents the idea and a first implementation of adding speed-of-sound imaging to a previously developed camera-based PAI setup. The current setup provides SOS maps with a spatial resolution of 2 mm and an accuracy of the obtained absolute SOS values of about 1%. The proposed dual-modality setup has the potential to provide highly resolved and perfectly co-registered 3D photoacoustic and SOS images.

  18. TIFR Near Infrared Imaging Camera-II on the 3.6 m Devasthal Optical Telescope

    Science.gov (United States)

    Baug, T.; Ojha, D. K.; Ghosh, S. K.; Sharma, S.; Pandey, A. K.; Kumar, Brijesh; Ghosh, Arpan; Ninan, J. P.; Naik, M. B.; D’Costa, S. L. A.; Poojary, S. S.; Sandimani, P. R.; Shah, H.; Krishna Reddy, B.; Pandey, S. B.; Chand, H.

    The Tata Institute of Fundamental Research (TIFR) Near Infrared Imaging Camera-II (TIRCAM2) is a closed-cycle helium cryo-cooled imaging camera equipped with a Raytheon 512×512 pixel InSb Aladdin III quadrant focal plane array (FPA) sensitive to photons in the 1-5 μm wavelength band. In this paper, we present the performance of the camera on the newly installed 3.6 m Devasthal Optical Telescope (DOT), based on calibration observations carried out during 2017 May 11-14 and 2017 October 7-31. After the preliminary characterization, the camera was released to the Indian and Belgian astronomical community for science observations in 2017 May. The camera offers a field of view (FoV) of ~86.5″×86.5″ on the DOT with a pixel scale of 0.169″. The seeing at the telescope site in the near-infrared (NIR) bands is typically sub-arcsecond, with the best seeing of ~0.45″ realized in the NIR K-band on 2017 October 16. The camera is found to be capable of deep observations in the J, H and K bands, comparable to other 4 m class telescopes available worldwide. Another highlight of this camera is the observational capability for sources up to Wide-field Infrared Survey Explorer (WISE) W1-band (3.4 μm) magnitudes of 9.2 in the narrow L-band (nbL; λcen ~ 3.59 μm). Hence, the camera could be a good complementary instrument to observe the bright nbL-band sources that are saturated in the Spitzer Infrared Array Camera (IRAC) ([3.6] ≲ 7.92 mag) and the WISE W1-band ([3.4] ≲ 8.1 mag). Sources with strong polycyclic aromatic hydrocarbon (PAH) emission at 3.3 μm are also detected. Details of the observations and estimated parameters are presented in this paper.

  19. Commercialization of radiation tolerant camera

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Bum; Choi, Young Soo; Kim, Sun Ku; Lee, Jong Min; Cha, Bung Hun; Lee, Nam Ho; Byun, Eiy Gyo; Yoo, Seun Wook; Choi, Bum Ki; Yoon, Sung Up; Kim, Hyun Gun; Sin, Jeong Hun; So, Suk Il

    1999-12-01

    In this project, a radiation-tolerant camera that withstands a total dose of 10^6 - 10^8 rad was developed. To develop the camera, the radiation effects on camera components were examined and evaluated, and the camera configuration was studied. Based on the evaluation results, the components were selected and the design was performed. A vidicon tube was selected as the image sensor, and non-browning optics and a camera driving circuit were applied. The controllers needed for the CCTV camera system (lens, light, and pan/tilt controllers) were designed around the concept of remote control. Two types of radiation-tolerant camera were fabricated, for use in underwater and normal environments. (author)

  20. Commercialization of radiation tolerant camera

    International Nuclear Information System (INIS)

    Lee, Yong Bum; Choi, Young Soo; Kim, Sun Ku; Lee, Jong Min; Cha, Bung Hun; Lee, Nam Ho; Byun, Eiy Gyo; Yoo, Seun Wook; Choi, Bum Ki; Yoon, Sung Up; Kim, Hyun Gun; Sin, Jeong Hun; So, Suk Il

    1999-12-01

    In this project, a radiation-tolerant camera that withstands a total dose of 10^6 - 10^8 rad was developed. To develop the camera, the radiation effects on camera components were examined and evaluated, and the camera configuration was studied. Based on the evaluation results, the components were selected and the design was performed. A vidicon tube was selected as the image sensor, and non-browning optics and a camera driving circuit were applied. The controllers needed for the CCTV camera system (lens, light, and pan/tilt controllers) were designed around the concept of remote control. Two types of radiation-tolerant camera were fabricated, for use in underwater and normal environments. (author)

  1. Coaxial fundus camera for ophthalmology

    Science.gov (United States)

    de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.

    2015-09-01

    A fundus camera for ophthalmology is a high-definition device which needs to meet low-light illumination of the human retina, high resolution at the retina, and reflection-free imaging. Those constraints make its optical design very sophisticated, but the most difficult to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflections of the illumination, both in the objective and at the cornea, mask image quality, and a poor alignment makes the sophisticated optical design useless. In this work we developed a totally axial optical system for a non-mydriatic fundus camera. The illumination is performed by a LED ring, coaxial with the optical system and composed of IR or visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator, and CCD lens are coaxial, making the final alignment easy to perform. The CCD-plus-capture-lens module is a CCTV camera with built-in autofocus and zoom, added to a 175 mm focal length doublet corrected for infinity, making the system easy to operate and very compact.

  2. PhC-4 new high-speed camera with mirror scanning

    International Nuclear Information System (INIS)

    Daragan, A.O.; Belov, B.G.

    1979-01-01

    The optical system and construction of the high-speed PhC-4 photographic camera, a continuously operating type with mirror scanning, are described. The optical system of the camera is based on a four-sided rotating mirror, two optical inlets and two working sectors. The PhC-4 camera provides framing rates up to 600 thousand frames per second. (author)

  3. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.

  4. Performance Evaluations and Quality Validation System for Optical Gas Imaging Cameras That Visualize Fugitive Hydrocarbon Gas Emissions

    Science.gov (United States)

    Optical gas imaging (OGI) cameras have the unique ability to exploit the electromagnetic properties of fugitive chemical vapors to make invisible gases visible. This ability is extremely useful for industrial facilities trying to mitigate product losses from escaping gas and fac...

  5. The Sydney University PAPA camera

    Science.gov (United States)

    Lawson, Peter R.

    1994-04-01

    The Precision Analog Photon Address (PAPA) camera is a photon-counting array detector that uses optical encoding to locate photon events on the output of a microchannel plate image intensifier. The Sydney University camera is a 256x256 pixel detector which can operate at speeds greater than 1 million photons per second and produce individual photon coordinates with a deadtime of only 300 ns. It uses a new Gray coded mask-plate which permits a simplified optical alignment and successfully guards against vignetting artifacts.

  6. Properties of tree rings in LSST sensors

    International Nuclear Information System (INIS)

    Park, H.Y.; Tsybychev, D.; Nomerotski, A.

    2017-01-01

    Images of uniformly illuminated sensors for the Large Synoptic Survey Telescope have circular periodic patterns with an appearance similar to tree rings. These patterns are caused by circularly symmetric variations of the dopant concentration in the monocrystalline silicon boule, induced by the manufacturing process. The non-uniform charge density results in a parasitic electric field inside the silicon sensor, which may distort the shapes of astronomical sources. In this study we analyzed data from fifteen LSST sensors produced by ITL to determine the main parameters of the tree rings, amplitude and period, as well as their variability across the sensors tested at Brookhaven National Laboratory. The tree ring pattern has only a weak dependence on wavelength; however, the ring amplitude gets smaller as the wavelength gets longer, since longer wavelengths penetrate deeper into the silicon. The tree ring amplitude grows toward the outer part of the wafer, from 0.1 to 1.0%, indicating that the resistivity variation is larger at larger radii.
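
A hedged sketch of how such ring parameters can be extracted (an assumed, simplified version of this kind of analysis, not the study's code): azimuthally average a flat-field image about the wafer centre, then measure the ripple of the resulting radial profile.

```python
# Illustrative sketch: radial intensity profile of a flat-field image by
# azimuthal averaging about the boule centre; tree rings show up as a
# periodic ripple in the profile, whose size gives the ring amplitude.
import math

def radial_profile(image, center, bin_width=1.0):
    """Mean pixel value in concentric annuli around center (row, col)."""
    sums, counts = {}, {}
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            k = int(math.hypot(r - center[0], c - center[1]) / bin_width)
            sums[k] = sums.get(k, 0.0) + v
            counts[k] = counts.get(k, 0) + 1
    return [sums[k] / counts[k] for k in sorted(sums)]

def ring_amplitude(profile):
    """Peak-to-peak ripple of the profile, as a fraction of its mean."""
    mean = sum(profile) / len(profile)
    return (max(profile) - min(profile)) / mean

# Synthetic flat field: a 1% radial ripple of period ~4 px around (16, 16).
image = [[100.0 * (1 + 0.01 * math.sin(2 * math.pi *
                                       math.hypot(r - 16, c - 16) / 4))
          for c in range(33)] for r in range(33)]
profile = radial_profile(image, (16, 16))
amp = ring_amplitude(profile)
print(amp)  # of order the imprinted 1% modulation (binning attenuates it)
```

On real data the centre of symmetry lies off-sensor (at the boule axis), so it must be fitted first, and the ripple period is read off the profile rather than assumed.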

  7. Long wavelength infrared camera (LWIRC): a 10 micron camera for the Keck Telescope

    Energy Technology Data Exchange (ETDEWEB)

    Wishnow, E.H.; Danchi, W.C.; Tuthill, P.; Wurtz, R.; Jernigan, J.G.; Arens, J.F.

    1998-05-01

    The Long Wavelength Infrared Camera (LWIRC) is a facility instrument for the Keck Observatory designed to operate at the f/25 forward Cassegrain focus of the Keck I telescope. The camera operates over the wavelength band 7-13 μm using ZnSe transmissive optics. A set of filters, a circular variable filter (CVF), and a mid-infrared polarizer are available, as are three plate scales: 0.05″, 0.10″, and 0.21″ per pixel. The camera focal plane array and optics are cooled using liquid helium. The system has been refurbished with a 128×128 pixel Si:As detector array. The electronics readout system used to clock the array is compatible with both the hardware and software of the other Keck infrared instruments, NIRC and LWS. A new pre-amplifier/A-D converter has been designed and constructed which greatly decreases the system's susceptibility to noise.

  8. 200 ps FWHM and 100 MHz repetition rate ultrafast gated camera for optical medical functional imaging

    Science.gov (United States)

    Uhring, Wilfried; Poulet, Patrick; Hanselmann, Walter; Glazenborg, René; Zint, Virginie; Nouizi, Farouk; Dubois, Benoit; Hirschi, Werner

    2012-04-01

    The paper describes the realization of a complete optical imaging device for clinical applications such as brain functional imaging by time-resolved, spectroscopic diffuse optical tomography. The entire instrument is assembled in a single setup that includes a light source, an ultrafast time-gated intensified camera, and all the electronic control units. The light source is composed of four near-infrared laser diodes driven by a nanosecond electrical pulse generator working in a sequential mode at a repetition rate of 100 MHz. The resulting light pulses, at four wavelengths, are less than 80 ps FWHM. They are injected into a four-furcated optical fiber ended with a frontal light distributor to obtain a uniform illumination spot directed towards the head of the patient. Photons back-scattered by the subject are detected by the intensified CCD camera; they are resolved according to their time of flight inside the head. The very core of the intensified camera system is the image intensifier tube and its associated electrical pulse generator. The ultrafast generator produces 50 V pulses at a repetition rate of 100 MHz, with a width corresponding to the requested 200 ps gate. The photocathode and the micro-channel plate of the intensifier have been specially designed to enhance electromagnetic wave propagation and reduce the power loss and heat that are prejudicial to image quality. The whole instrumentation system is controlled by an FPGA-based module. The timing of the light pulses and the photocathode gating is precisely adjustable in steps of 9 ps. All the acquisition parameters are configurable via software through a USB plug, and the image data are transferred to a PC via an Ethernet link. The compactness of the device makes it well suited for bedside clinical applications.

  9. A daytime measurement of the lunar contribution to the night sky brightness in LSST's ugrizy bands-initial results

    Science.gov (United States)

    Coughlin, Michael; Stubbs, Christopher; Claver, Chuck

    2016-06-01

    We report measurements from which we determine the spatial structure of the lunar contribution to night sky brightness, taken at the LSST site on Cerro Pachon in Chile. We use an array of six photodiodes with filters that approximate the Large Synoptic Survey Telescope's u, g, r, i, z, and y bands. We use the sun as a proxy for the moon, and measure sky brightness as a function of zenith angle of the point on sky, zenith angle of the sun, and angular distance between the sun and the point on sky. We make a correction for the difference between the illumination spectrum of the sun and the moon. Since scattered sunlight totally dominates the daytime sky brightness, this technique allows us to cleanly determine the contribution to the (cloudless) night sky from backscattered moonlight, without contamination from other sources of night sky brightness. We estimate our uncertainty in the relative lunar night sky brightness vs. zenith and lunar angle to be between 0.3-0.7 mags depending on the passband. This information is useful in planning the optimal execution of the LSST survey, and perhaps for other astronomical observations as well. Although our primary objective is to map out the angular structure and spectrum of the scattered light from the atmosphere and particulates, we also make an estimate of the expected number of scattered lunar photons per pixel per second in LSST, and find values that are in overall agreement with previous estimates.

  10. Development and Optical Testing of the Camera, Hand Lens, and Microscope Probe with Scannable Laser Spectroscopy (CHAMP-SLS)

    Science.gov (United States)

    Mungas, Greg S.; Gursel, Yekta; Sepulveda, Cesar A.; Anderson, Mark; La Baw, Clayton; Johnson, Kenneth R.; Deans, Matthew; Beegle, Luther; Boynton, John

    2008-01-01

    Conducting high-resolution field microscopy with coupled laser spectroscopy that can selectively analyze the surface chemistry of individual pixels in a scene is an enabling capability for next-generation robotic and manned spaceflight missions, as well as civil and military applications. In the laboratory, we use a range of imaging and surface preparation tools that provide us with in-focus images, context imaging for identifying features that we want to investigate at high magnification, and surface-optical coupling that allows us to apply optical spectroscopic techniques for analyzing surface chemistry, particularly at high magnifications. The camera, hand lens, and microscope probe with scannable laser spectroscopy (CHAMP-SLS) is an imaging/spectroscopy instrument capable of imaging continuously from infinity down to high-resolution microscopy (a resolution of approximately 1 micron/pixel in the final camera format); the closer CHAMP-SLS is placed to a feature, the higher the resultant magnification. At hand-lens to microscopic magnifications, the imaged scene can be selectively interrogated with point spectroscopic techniques such as Raman spectroscopy, microscopic laser-induced breakdown spectroscopy (micro-LIBS), laser ablation mass spectrometry, fluorescence spectroscopy, and/or reflectance spectroscopy. This paper summarizes the optical design, development, and testing of the CHAMP-SLS optics.

  11. In Pursuit of LSST Science Requirements: A Comparison of Photometry Algorithms

    Science.gov (United States)

    Becker, Andrew C.; Silvestri, Nicole M.; Owen, Russell E.; Ivezić, Željko; Lupton, Robert H.

    2007-12-01

    We have developed an end-to-end photometric data-processing pipeline to compare current photometric algorithms commonly used on ground-based imaging data. This test bed is exceedingly adaptable and enables us to perform many research and development tasks, including image subtraction and co-addition, object detection and measurements, the production of photometric catalogs, and the creation and stocking of database tables with time-series information. This testing has been undertaken to evaluate existing photometry algorithms for consideration by a next-generation image-processing pipeline for the Large Synoptic Survey Telescope (LSST). We outline the results of our tests for four packages: the Sloan Digital Sky Survey's Photo package, DAOPHOT and ALLFRAME, DOPHOT, and two versions of Source Extractor (SExtractor). The ability of these algorithms to perform point-source photometry, astrometry, shape measurements, and star-galaxy separation and to measure objects at low signal-to-noise ratio is quantified. We also perform a detailed crowded-field comparison of DAOPHOT and ALLFRAME, and profile the speed and memory requirements in detail for SExtractor. We find that both DAOPHOT and Photo are able to perform aperture photometry to high enough precision to meet LSST's science requirements, but perform less adequately at PSF-fitting photometry. Photo performs the best at simultaneous point- and extended-source shape and brightness measurements. SExtractor is the fastest algorithm, and recent upgrades in the software yield high-quality centroid and shape measurements with little bias toward faint magnitudes. ALLFRAME yields the best photometric results in crowded fields.
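
For readers unfamiliar with the simplest operation being compared, here is a hedged sketch of fixed-radius aperture photometry with a local sky estimate (an illustrative toy, not any of the packages under test):

```python
# Illustrative sketch: aperture photometry. Sum counts inside a circular
# aperture and subtract a sky level estimated from a surrounding annulus.
import math
import statistics

def aperture_photometry(image, center, r_ap, r_in, r_out):
    """Background-subtracted counts inside r_ap; sky from the r_in..r_out annulus."""
    ap, sky = [], []
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            d = math.hypot(r - center[0], c - center[1])
            if d <= r_ap:
                ap.append(v)
            elif r_in <= d <= r_out:
                sky.append(v)
    background = statistics.median(sky)  # median resists contaminating stars
    return sum(ap) - background * len(ap)

# A flat sky of 10 counts/pixel with a 100-count "star" in the centre pixel.
image = [[10.0] * 15 for _ in range(15)]
image[7][7] += 100.0
flux = aperture_photometry(image, (7, 7), r_ap=3, r_in=5, r_out=7)
print(round(flux))  # 100
```

PSF-fitting photometry, where the abstract finds the packages differ more, replaces the hard-edged aperture with a weighted fit of a point-spread-function model, which is what wins in crowded fields.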

  12. CCD characterization and measurements automation

    International Nuclear Information System (INIS)

    Kotov, I.V.; Frank, J.; Kotov, A.I.; Kubanek, P.; O'Connor, P.; Prouza, M.; Radeka, V.; Takacs, P.

    2012-01-01

    Modern mosaic cameras have grown both in size and in number of sensors. The required volume of sensor testing and characterization has grown accordingly. For camera projects as large as the LSST, test automation becomes a necessity. A CCD testing and characterization laboratory was built and is in operation for the LSST project. Characterization of LSST study contract sensors has been performed. The characterization process and its automation are discussed, and results are presented. Our system automatically acquires images, populates a database with metadata information, and runs express analysis. This approach is illustrated with ⁵⁵Fe data analysis. ⁵⁵Fe data are used to measure gain, charge transfer efficiency and charge diffusion. Examples of express analysis results are presented and discussed.
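
The ⁵⁵Fe gain measurement mentioned above rests on a known physical constant: the Mn K-alpha X-ray (5.9 keV) liberates a fixed charge (about 1620 electrons) in silicon, so the gain follows from where the K-alpha peak lands in ADU. A hedged sketch (toy spectrum and peak finder, not the laboratory's express-analysis code):

```python
# Illustrative sketch: CCD gain from a 55Fe cluster-charge spectrum.
# The K-alpha peak position in ADU, divided into the known deposited charge,
# gives the system gain in electrons per ADU.
ELECTRONS_PER_KALPHA = 1620  # e- generated by a 5.9 keV photon in silicon

def gain_from_spectrum(cluster_adu, bin_width=10):
    """Locate the K-alpha peak by histogram mode; return gain in e-/ADU."""
    counts = {}
    for adu in cluster_adu:
        b = int(adu // bin_width)
        counts[b] = counts.get(b, 0) + 1
    peak_bin = max(counts, key=counts.get)
    peak_adu = (peak_bin + 0.5) * bin_width  # bin centre
    return ELECTRONS_PER_KALPHA / peak_adu

# Toy spectrum: most single-photon clusters near 405 ADU, plus stragglers.
spectrum = [401, 404, 405, 406, 409, 403, 407, 350, 520, 402]
print(round(gain_from_spectrum(spectrum), 2))  # 4.0
```

In a real analysis the clusters are first selected for full charge collection (single-pixel hits), and a Gaussian fit replaces the histogram mode; the same events also yield charge transfer efficiency and charge diffusion, as the abstract notes.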

  13. Transmission electron microscope CCD camera

    Science.gov (United States)

    Downing, Kenneth H.

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  14. Experimental Characterization of Close-Emitter Interference in an Optical Camera Communication System.

    Science.gov (United States)

    Chavez-Burbano, Patricia; Guerra, Victor; Rabadan, Jose; Rodríguez-Esparragón, Dionisio; Perez-Jimenez, Rafael

    2017-07-04

    Due to the massive insertion of embedded cameras in a wide variety of devices and the generalized use of LED lamps, Optical Camera Communication (OCC) has been proposed as a practical solution for future Internet of Things (IoT) and smart city applications. The influence of mobility, weather conditions, solar radiation interference, and external light sources on Visible Light Communication (VLC) schemes has been addressed in previous works. Some authors have studied the spatial intersymbol interference from close emitters within an OCC system; however, it has not been characterized or measured as a function of the transmitted wavelength. In this work, this interference is experimentally characterized, and the Normalized Power Signal to Interference Ratio (NPSIR) is proposed for easily determining the interference in other implementations, independently of the selected system devices. A set of experiments in a darkroom, working with RGB multi-LED transmitters and a general-purpose camera, was performed in order to obtain the NPSIR values and to validate the deduced equations for the 2D pixel representation of real distances. These parameters were used in the simulation of a wireless sensor network scenario in a small office, where the Bit Error Rate (BER) of the communication link was calculated. The experiments show that the interference of other close emitters, in terms of distance and wavelength, can be easily determined with the NPSIR. Finally, the simulation validates the applicability of the deduced equations for scaling the initial results to real scenarios.
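
    The abstract does not give the exact NPSIR normalization, but the underlying quantity is a power ratio between the pixels lit by the intended emitter and the spill-over from a close emitter. A heavily simplified illustration (region selection and values are ours; the paper's metric additionally normalizes for device-dependent factors):

    ```python
    import numpy as np

    def signal_to_interference_db(roi_signal, roi_interference):
        """Ratio (in dB) of mean pixel power in the region lit by the intended
        emitter to the mean spill-over power from a neighbouring emitter."""
        p_sig = float(np.mean(roi_signal))
        p_int = float(np.mean(roi_interference))
        return 10.0 * np.log10(p_sig / p_int)

    # Toy frame regions: bright intended spot vs. dim spill from a close LED
    signal_px = np.full((8, 8), 200.0)
    interference_px = np.full((8, 8), 2.0)
    ratio = signal_to_interference_db(signal_px, interference_px)   # -> 20.0 dB
    ```

    In the paper this ratio is measured per transmitted wavelength and per emitter separation, which is what lets the interference be predicted in other deployments.
    
    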

  15. Exploiting Auto-Collimation for Real-Time Onboard Monitoring of Space Optical Camera Geometric Parameters

    Science.gov (United States)

    Liu, W.; Wang, H.; Liu, D.; Miu, Y.

    2018-05-01

    Precise geometric parameters are essential to ensure the positioning accuracy of space optical cameras. However, state-of-the-art on-orbit calibration methods inevitably suffer from long update cycles and poor timeliness. To this end, in this paper we exploit the optical auto-collimation principle and propose a real-time onboard calibration scheme for monitoring key geometric parameters. Specifically, in the proposed scheme, auto-collimation devices are first designed by installing collimated light sources, area-array CCDs, and prisms inside the satellite payload system. Using these devices, changes in the geometric parameters are converted into changes in the spot image positions, and the variation of the geometric parameters can be derived by extracting and processing the spot images. An experimental platform is then set up to verify the feasibility and analyze the precision of the proposed scheme. The experimental results demonstrate that it is feasible to apply the optical auto-collimation principle to real-time onboard monitoring.
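
    The core image-processing step in such a scheme is locating the auto-collimation spot to sub-pixel precision, since a geometric change appears only as a small centroid shift. A minimal intensity-weighted centroid sketch (our own illustration, not the paper's algorithm):

    ```python
    import numpy as np

    def spot_centroid(img, threshold=0.0):
        """Sub-pixel (x, y) centroid of a spot image, intensity-weighted
        after subtracting a background threshold."""
        w = np.clip(img.astype(float) - threshold, 0.0, None)
        yy, xx = np.indices(img.shape)
        total = w.sum()
        return (w * xx).sum() / total, (w * yy).sum() / total

    # A tilt of the optics shows up as a centroid shift of the returned spot
    img = np.zeros((9, 9))
    img[4, 5] = 2.0
    img[4, 6] = 2.0
    x, y = spot_centroid(img)   # -> (5.5, 4.0)
    ```

    Comparing centroids between frames then gives the angular change via the known focal length of the collimating optics.
    
    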

  16. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.

    Science.gov (United States)

    Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.

  18. PROCEDURE ENABLING SIMULATION AND IN-DEPTH ANALYSIS OF OPTICAL EFFECTS IN CAMERA-BASED TIME-OF-FLIGHT SENSORS

    Directory of Open Access Journals (Sweden)

    M. Baumgart

    2018-05-01

    This paper presents a simulation approach for Time-of-Flight cameras to estimate sensor performance and accuracy, as well as to help in understanding experimentally discovered effects. The main scope is the detailed simulation of the optical signals. We use a ray-tracing-based approach with the optical path length as the master parameter for depth calculations. The procedure is described in detail with references to our implementation in Zemax OpticStudio and Python. Our simulation approach supports multiple and extended light sources and accounts for all effects within the geometrical optics model. In particular, multi-object reflection/scattering ray paths, translucent objects, and aberration effects (e.g. distortion caused by the ToF lens) are supported. The optical path length approach also enables the implementation of different ToF sensor types and transient imaging evaluations. The main features are demonstrated on a simple 3D test scene.

  19. Camera Calibration of Stereo Photogrammetric System with One-Dimensional Optical Reference Bar

    International Nuclear Information System (INIS)

    Xu, Q Y; Ye, D; Che, R S; Qi, X; Huang, Y

    2006-01-01

    To carry out precise measurement of large-scale complex workpieces, accurate calibration of the stereo photogrammetric system has become more and more important. This paper proposes a flexible and reliable camera calibration of the stereo photogrammetric system, based on quaternions, using a one-dimensional optical reference bar that has three small collinear infrared LED marks whose separations have been precisely calibrated. By moving the optical reference bar to a number of locations/orientations over the measurement volume, we calibrate the stereo photogrammetric system using the geometric constraint of the optical reference bar. The extrinsic parameter calibration process consists of linear parameter estimation based on quaternions and nonlinear refinement based on the maximum likelihood criterion. First, we linearly estimate the extrinsic parameters of the stereo photogrammetric system based on quaternions. Then, with the quaternion results as initial values, we refine the extrinsic parameters under the maximum likelihood criterion with the Levenberg-Marquardt algorithm. During calibration, we automatically control the light intensity and optimize the exposure time to obtain a uniform intensity profile of the image points at different distances and a higher S/N ratio. The experimental results prove that the proposed calibration method is flexible and valid, and obtains good results in application.
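
    The quaternion-based linear stage produces an orientation estimate that seeds the Levenberg-Marquardt refinement; the standard conversion from a unit quaternion (w, x, y, z) to a rotation matrix is the workhorse of that step. A self-contained sketch of that conversion (ours, for illustration):

    ```python
    import numpy as np

    def quat_to_rot(q):
        """Rotation matrix from a quaternion q = (w, x, y, z); q is
        normalized first, so any nonzero quaternion is accepted."""
        w, x, y, z = q / np.linalg.norm(q)
        return np.array([
            [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
            [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
            [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
        ])

    # 90-degree rotation about the z axis: maps the x axis onto the y axis
    R = quat_to_rot(np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)]))
    ```

    Parameterizing rotations this way avoids the gimbal-lock issues of Euler angles and keeps the linear estimation step well conditioned.
    
    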

  20. The GCT camera for the Cherenkov Telescope Array

    Science.gov (United States)

    Lapington, J. S.; Abchiche, A.; Allan, D.; Amans, J.-P.; Armstrong, T. P.; Balzer, A.; Berge, D.; Boisson, C.; Bousquet, J.-J.; Bose, R.; Brown, A. M.; Bryan, M.; Buchholtz, G.; Buckley, J.; Chadwick, P. M.; Costantini, H.; Cotter, G.; Daniel, M. K.; De Franco, A.; De Frondat, F.; Dournaux, J.-L.; Dumas, D.; Ernenwein, J.-P.; Fasola, G.; Funk, S.; Gironnet, J.; Graham, J. A.; Greenshaw, T.; Hervet, O.; Hidaka, N.; Hinton, J. A.; Huet, J.-M.; Jankowsky, D.; Jegouzo, I.; Jogler, T.; Kawashima, T.; Kraus, M.; Laporte, P.; Leach, S.; Lefaucheur, J.; Markoff, S.; Melse, T.; Minaya, I. A.; Mohrmann, L.; Molyneux, P.; Moore, P.; Nolan, S. J.; Okumura, A.; Osborne, J. P.; Parsons, R. D.; Rosen, S.; Ross, D.; Rowell, G.; Rulten, C. B.; Sato, Y.; Sayede, F.; Schmoll, J.; Schoorlemmer, H.; Servillat, M.; Sol, H.; Stamatescu, V.; Stephan, M.; Stuik, R.; Sykes, J.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Trichard, C.; Varner, G.; Vink, J.; Watson, J. J.; White, R.; Yamane, N.; Zech, A.; Zink, A.; Zorn, J.; CTA Consortium

    2017-12-01

    The Gamma Cherenkov Telescope (GCT) is one of the designs proposed for the Small Sized Telescope (SST) section of the Cherenkov Telescope Array (CTA). The GCT uses dual-mirror optics, resulting in a compact telescope with good image quality and a large field of view with a smaller, more economical, camera than is achievable with conventional single mirror solutions. The photon counting GCT camera is designed to record the flashes of atmospheric Cherenkov light from gamma and cosmic ray initiated cascades, which last only a few tens of nanoseconds. The GCT optics require that the camera detectors follow a convex surface with a radius of curvature of 1 m and a diameter of 35 cm, which is approximated by tiling the focal plane with 32 modules. The first camera prototype is equipped with multi-anode photomultipliers, each comprising an 8×8 array of 6×6 mm2 pixels to provide the required angular scale, adding up to 2048 pixels in total. Detector signals are shaped, amplified and digitised by electronics based on custom ASICs that provide digitisation at 1 GSample/s. The camera is self-triggering, retaining images where the focal plane light distribution matches predefined spatial and temporal criteria. The electronics are housed in the liquid-cooled, sealed camera enclosure. LED flashers at the corners of the focal plane provide a calibration source via reflection from the secondary mirror. The first GCT camera prototype underwent preliminary laboratory tests last year. In November 2015, the camera was installed on a prototype GCT telescope (SST-GATE) in Paris and was used to successfully record the first Cherenkov light of any CTA prototype, and the first Cherenkov light seen with such a dual-mirror optical system. A second full-camera prototype based on Silicon Photomultipliers is under construction. Up to 35 GCTs are envisaged for CTA.

  1. Streak camera recording of interferometer fringes

    International Nuclear Information System (INIS)

    Parker, N.L.; Chau, H.H.

    1977-01-01

    The use of an electronic high-speed camera in the streaking mode to record interference fringe motion from a velocity interferometer is discussed. Advantages of this method over the photomultiplier-tube-oscilloscope approach are delineated. Performance testing and data for the electronic streak camera are discussed. In the camera tests, the velocity profile of a Mylar flyer accelerated by an electrically exploded bridge, and the jump-off velocity of metal targets struck by these Mylar flyers, are measured. Advantages of the streak camera include portability, low cost, ease of operation and maintenance, simplified interferometer optics, and rapid data analysis.

  2. A Crowd-Sourcing Indoor Localization Algorithm via Optical Camera on a Smartphone Assisted by Wi-Fi Fingerprint RSSI.

    Science.gov (United States)

    Chen, Wei; Wang, Weiping; Li, Qun; Chang, Qiang; Hou, Hongtao

    2016-03-19

    Indoor positioning based on existing Wi-Fi fingerprints is becoming more and more common. Unfortunately, the Wi-Fi fingerprint is susceptible to multiple path interferences, signal attenuation, and environmental changes, which leads to low accuracy. Meanwhile, with the recent advances in charge-coupled device (CCD) technologies and the processing speed of smartphones, indoor positioning using the optical camera on a smartphone has become an attractive research topic; however, the major challenge is its high computational complexity; as a result, real-time positioning cannot be achieved. In this paper we introduce a crowd-sourcing indoor localization algorithm via an optical camera and orientation sensor on a smartphone to address these issues. First, we use Wi-Fi fingerprint based on the K Weighted Nearest Neighbor (KWNN) algorithm to make a coarse estimation. Second, we adopt a mean-weighted exponent algorithm to fuse optical image features and orientation sensor data as well as KWNN in the smartphone to refine the result. Furthermore, a crowd-sourcing approach is utilized to update and supplement the positioning database. We perform several experiments comparing our approach with other positioning algorithms on a common smartphone to evaluate the performance of the proposed sensor-calibrated algorithm, and the results demonstrate that the proposed algorithm could significantly improve accuracy, stability, and applicability of positioning.
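
    The coarse Wi-Fi stage described above is a weighted k-nearest-neighbour lookup in RSSI space: find the k stored fingerprints closest to the observed signal vector and average their known positions, weighted by inverse distance. A minimal sketch under that reading (names and the toy database are ours, not from the paper):

    ```python
    import numpy as np

    def kwnn_position(rssi, fingerprints, positions, k=3, eps=1e-6):
        """Coarse location estimate: inverse-distance-weighted average of the
        positions of the k nearest fingerprints in RSSI space."""
        d = np.linalg.norm(fingerprints - rssi, axis=1)
        idx = np.argsort(d)[:k]
        w = 1.0 / (d[idx] + eps)           # eps avoids division by zero
        return (positions[idx] * w[:, None]).sum(axis=0) / w.sum()

    # Toy database: three reference points, RSSI (dBm) from two access points
    fp = np.array([[-40.0, -70.0], [-55.0, -55.0], [-70.0, -40.0]])
    pos = np.array([[0.0, 0.0], [5.0, 0.0], [10.0, 0.0]])
    est = kwnn_position(np.array([-47.0, -62.0]), fp, pos, k=2)
    ```

    The paper then fuses this coarse estimate with image features and orientation-sensor data; the fingerprint stage alone is what keeps the image-matching search space small enough for a phone.
    
    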

  3. Cameras in mobile phones

    Science.gov (United States)

    Nummela, Ville; Viinikanoja, Jarkko; Alakarhu, Juha

    2006-04-01

    Camera phones are one of the fastest-growing consumer markets today. Over the past few years total volume has grown rapidly, and today millions of mobile phones with cameras are sold. At the same time, the resolution and functionality of the cameras have been growing from CIF toward DSC level. From the camera's point of view, the mobile world is an extremely challenging field. Cameras should have good image quality in a small size; they also need to be reliable, and their construction should be suitable for mass manufacturing. All components of the imaging chain should be well optimized in this environment. Image quality and usability are the most important parameters to the user. The current trend of adding more megapixels to cameras while using smaller pixels affects both. On the other hand, reliability and miniaturization, as well as cost, are key drivers for product development. In an optimized solution all parameters are in balance, but the process of finding the right trade-offs is not an easy task. In this paper, trade-offs related to optics and their effects on the image quality and usability of cameras are discussed. Key development areas from the mobile phone camera point of view are also listed.

  4. CCD-camera-based diffuse optical tomography to study ischemic stroke in preclinical rat models

    Science.gov (United States)

    Lin, Zi-Jing; Niu, Haijing; Liu, Yueming; Su, Jianzhong; Liu, Hanli

    2011-02-01

    Stroke, due to ischemia or hemorrhage, is a neurological deficit of the cerebrovasculature and is the third leading cause of death in the United States. More than 80 percent of strokes are ischemic, caused by blockage of an artery in the brain through thrombosis or arterial embolism. Hence, development of an imaging technique to monitor cerebral ischemia and the effect of anti-stroke therapy is essential. Near-infrared (NIR) optical tomography has great potential as a non-invasive imaging tool (due to its low cost and portability) for imaging embedded abnormal tissue, such as a dysfunctional area caused by ischemia. Moreover, NIR tomographic techniques have been successfully demonstrated in studies of cerebrovascular hemodynamics and brain injury. Compared to a fiber-based diffuse optical tomographic system, a CCD-camera-based system is more suitable for preclinical animal studies due to its simpler setup and lower cost. In this study, we have utilized the CCD-camera-based technique to image embedded inclusions based on tissue-phantom experimental data, and we obtain good reconstructed images with two recently developed algorithms: (1) the depth compensation algorithm (DCA) and (2) the globally convergent method (GCM). We demonstrate volumetric tomographic reconstructions from the tissue phantom; the technique has great potential for determining and monitoring the effect of anti-stroke therapies.

  5. Mixel camera--a new push-broom camera concept for high spatial resolution keystone-free hyperspectral imaging.

    Science.gov (United States)

    Høye, Gudrun; Fridman, Andrei

    2013-05-06

    Current high-resolution push-broom hyperspectral cameras introduce keystone errors to the captured data. Efforts to correct these errors in hardware severely limit the optical design, in particular with respect to light throughput and spatial resolution, while at the same time the residual keystone often remains large. The mixel camera solves this problem by combining a hardware component--an array of light mixing chambers--with a mathematical method that restores the hyperspectral data to its keystone-free form, based on the data that was recorded onto the sensor with large keystone. A Virtual Camera software, that was developed specifically for this purpose, was used to compare the performance of the mixel camera to traditional cameras that correct keystone in hardware. The mixel camera can collect at least four times more light than most current high-resolution hyperspectral cameras, and simulations have shown that the mixel camera will be photon-noise limited--even in bright light--with a significantly improved signal-to-noise ratio compared to traditional cameras. A prototype has been built and is being tested.

  6. BENCHMARKING THE OPTICAL RESOLVING POWER OF UAV BASED CAMERA SYSTEMS

    Directory of Open Access Journals (Sweden)

    H. Meißner

    2017-08-01

    UAV-based imaging and 3D object point generation is an established technology. Some UAV users try to address (very high) accuracy applications, i.e. inspection or monitoring scenarios. In order to guarantee such a level of detail and accuracy, high-resolving imaging systems are mandatory. Furthermore, image quality considerably impacts photogrammetric processing, as the tie point transfer, mandatory for forming the block geometry, fully relies on the radiometric quality of the images. Thus, empirical testing of radiometric camera performance is an important issue, in addition to the standard (geometric) calibration, which normally is covered primarily. Within this paper the resolving power of ten different camera/lens installations has been investigated. The selected systems represent different camera classes, such as DSLRs, system cameras, larger-format cameras and proprietary systems. As the systems have been tested in well-controlled laboratory conditions and objective quality measures have been derived, individual performance can be compared directly, thus representing a first benchmark on the radiometric performance of UAV cameras. The results have shown that not only the selection of an appropriate lens and camera body has an impact; in addition, the image pre-processing, i.e. the use of a specific debayering method, significantly influences the final resolving power.

  7. Aluminum-coated optical fibers as efficient infrared timing fiducial photocathodes for synchronizing x-ray streak cameras

    International Nuclear Information System (INIS)

    Koch, J.A.; MacGowan, B.J.

    1991-01-01

    The timing fiducial system at the Nova Two-Beam Facility allows time-resolved x-ray and optical streak camera data from laser-produced plasmas to be synchronized to within 30 ps. In this system, an Al-coated optical fiber is inserted into an aperture in the cathode plate of each streak camera. The coating acts as a photocathode for a low-energy pulse of 1ω (λ = 1.054 μm) light which is synchronized to the main Nova beam. The use of the fundamental (1ω) for this fiducial pulse has been found to offer significant advantages over the use of the 2ω second harmonic (λ = 0.53 μm). These advantages include brighter signals, greater reliability, and a higher relative damage threshold, allowing routine use without fiber replacement. The operation of the system is described, and experimental data and interpretations are discussed which suggest that the electron production in the Al film is due to thermionic emission. The results of detailed numerical simulations of the relevant thermal processes, undertaken to model the response of the coated fiber to 1ω laser pulses, are also presented, and give qualitative agreement with experimental data. Quantitative discrepancies between the modeling results and the experimental data are discussed, and suggestions for further research are given.

  8. Stereo Pinhole Camera: Assembly and experimental activities

    Directory of Open Access Journals (Sweden)

    Gilmário Barbosa Santos

    2015-05-01

    This work describes the assembly of a stereo pinhole camera for capturing stereo pairs of images and proposes experimental activities with it. A pinhole camera can be as sophisticated as you want, or so simple that it can be handcrafted from practically recyclable materials. This paper describes the practical use of the pinhole camera throughout history and today. Aspects of optics and geometry involved in building the stereo pinhole camera are presented with illustrations. Furthermore, experiments are proposed using the images obtained by the camera for 3D visualization through a pair of anaglyph glasses, and the estimation of relative depth by triangulation is discussed.
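
    The triangulation mentioned at the end reduces, for a rectified stereo pair, to the classic relation Z = f·B/d: depth equals focal length (in pixels) times the baseline between the two pinholes, divided by the disparity of the point between the two images. A sketch with hypothetical numbers:

    ```python
    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        """Depth of a scene point from a rectified stereo pair: Z = f * B / d."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return focal_px * baseline_m / disparity_px

    # Example: 800 px effective focal length, 6.5 cm pinhole separation,
    # and a point that shifts 10 px between the left and right images
    z = depth_from_disparity(800.0, 0.065, 10.0)   # -> 5.2 m
    ```

    The inverse dependence on disparity is why nearby objects (large d) are ranged much more precisely than distant ones, a point worth demonstrating in the proposed classroom activities.
    
    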

  9. Image quality testing of assembled IR camera modules

    Science.gov (United States)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are becoming more and more a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirements on the imaging performance of objectives and on the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like minimum resolvable temperature difference (MRTD), which give only a subjective overall test result. Parameters that can be measured include image quality via the modulation transfer function (MTF), for broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, and chief ray angle can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high-resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces, and, last but not least, its suitability for fully automated measurements in mass production.
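
    The MTF referred to here is, in one dimension, the normalized magnitude of the Fourier transform of the system's line spread function (LSF). A minimal sketch of that relationship (our illustration; production testers derive the LSF from slanted-edge or similar targets rather than taking it as given):

    ```python
    import numpy as np

    def mtf_from_lsf(lsf):
        """MTF as the normalized magnitude of the FFT of a line spread function."""
        lsf = np.asarray(lsf, dtype=float)
        m = np.abs(np.fft.rfft(lsf))
        return m / m[0]   # normalize so that MTF(0) = 1

    # A wider LSF (more blur) rolls off faster at every spatial frequency
    sharp = mtf_from_lsf([0, 0, 1, 0, 0, 0, 0, 0])          # ideal: MTF = 1 everywhere
    blurred = mtf_from_lsf([0, 0.25, 0.5, 0.25, 0, 0, 0, 0])
    ```

    Comparing such curves on- and off-axis is what reveals misalignment between objective and FPA: a tilted image plane shows up as asymmetric MTF degradation across the field.
    
    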

  10. Characterization of X-ray streak cameras for use on Nova

    International Nuclear Information System (INIS)

    Kalantar, D.H.; Bell, P.M.; Costa, R.L.; Hammel, B.A.; Landen, O.L.; Orzechowski, T.J.; Hares, J.D.; Dymoke-Bradshaw, A.K.L.

    1996-09-01

    There are many different types of measurements that require a continuous time history of x-ray emission, which can be provided with an x-ray streak camera. In order to properly analyze the images recorded with the x-ray streak cameras operated on Nova, it is important to account for the streak characterization of each camera. We have performed a number of calibrations of the streak cameras, both on the bench and with Nova disk target shots, where we use a time-modulated laser intensity profile (self-beating of the laser) on the target to generate an x-ray comb. We have measured the streak camera sweep direction and spatial offset, the curvature of the electron optics, the sweep rate, and the magnification and resolution of the electron optics.

  11. Reducing the Variance of Intrinsic Camera Calibration Results in the ROS Camera_Calibration Package

    Science.gov (United States)

    Chiou, Geoffrey Nelson

    The intrinsic calibration of a camera is the process in which the internal optical and geometric characteristics of the camera are determined. If accurate intrinsic parameters of a camera are known, the ray in 3D space that every point in the image lies on can be determined. Pairing with another camera allows the position of the points in the image to be calculated by intersection of the rays. Accurate intrinsics also allow the position and orientation of a camera relative to some world coordinate system to be calculated. These two reasons for having an accurate intrinsic calibration are especially important in the field of industrial robotics, where 3D cameras are frequently mounted on the ends of manipulators. In the ROS (Robot Operating System) ecosystem, the camera_calibration package is the default standard for intrinsic camera calibration. Several researchers from the Industrial Robotics & Automation division at Southwest Research Institute have noted that this package produces large variances in the intrinsic parameters of the camera when calibrating across multiple attempts. There are also open issues on this matter in their public repository that have not been addressed by the developers. In this thesis, we confirm that the camera_calibration package does indeed return different results across multiple attempts, test several possible hypotheses as to why, identify the reason, and provide a simple solution to fix the cause of the issue.
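
    To see why variance in the intrinsics matters, it helps to recall what the parameters do: in the standard pinhole model, the focal lengths (fx, fy) and principal point (cx, cy) in the intrinsic matrix K map a 3D point in the camera frame to a pixel. A distortion-free sketch (the numeric values are hypothetical):

    ```python
    import numpy as np

    def project(K, point_3d):
        """Project a 3D point in the camera frame to pixel coordinates using
        the pinhole model: u = fx*X/Z + cx, v = fy*Y/Z + cy."""
        X, Y, Z = point_3d
        u = K[0, 0] * X / Z + K[0, 2]
        v = K[1, 1] * Y / Z + K[1, 2]
        return u, v

    # fx = fy = 600 px, principal point at (320, 240)
    K = np.array([[600.0,   0.0, 320.0],
                  [  0.0, 600.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    u, v = project(K, (0.1, -0.05, 2.0))   # -> (350.0, 225.0)
    ```

    Run-to-run scatter in fx or cx translates directly into shifted rays, and hence into position errors when two such cameras triangulate a point, which is why the reported variance is a practical problem for manipulator-mounted cameras.
    
    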

  12. The development of large-aperture test system of infrared camera and visible CCD camera

    Science.gov (United States)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Infrared camera and CCD camera dual-band imaging systems are widely used in much equipment and many applications. Tested with the traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed in the test procedure. The large-aperture test system for infrared cameras and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame grabber, and computer, which reduces the cost and the time for installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator's focal position with changing environmental temperature, so the image quality of the wide-field collimator and the test accuracy are improved. Its performance matches that of comparable foreign systems at much lower cost. It will have a good market.
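
    The multiple-frame averaging mentioned above relies on a standard statistical fact: averaging N frames of uncorrelated noise reduces its standard deviation by a factor of √N. A quick numerical illustration (our own, with arbitrary noise parameters):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    truth = 100.0
    # 64 frames of a flat 100-count scene with Gaussian read noise, sigma = 5
    frames = truth + rng.normal(0.0, 5.0, size=(64, 256, 256))

    single_noise = frames[0].std()                 # ~5 counts
    averaged_noise = frames.mean(axis=0).std()     # ~5 / sqrt(64) = 0.625 counts
    ```

    The same principle underlies dark-frame stacking and flat-field construction; the practical limit is fixed-pattern noise, which averaging cannot remove and which the non-uniformity correction must handle instead.
    
    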

  13. Fluorescence-enhanced optical imaging in large tissue volumes using a gain-modulated ICCD camera

    International Nuclear Information System (INIS)

    Godavarty, Anuradha; Eppstein, Margaret J; Zhang, Chaoyang; Theru, Sangeeta; Thompson, Alan B; Gurfinkel, Michael; Sevick-Muraca, Eva M

    2003-01-01

    A novel image-intensified charge-coupled device (ICCD) imaging system has been developed to perform 3D fluorescence tomographic imaging in the frequency domain using near-infrared contrast agents. The imager is unique since it (i) employs a large tissue-mimicking phantom, which is shaped and sized to resemble a female breast and part of the extended chest-wall region, and (ii) enables rapid data acquisition in the frequency domain by using a gain-modulated ICCD camera. Diffusion model predictions are compared to experimental measurements using two different referencing schemes under two different experimental conditions of perfect and imperfect uptake of fluorescent agent into a target. From these experimental measurements, three-dimensional images of fluorescent absorption were reconstructed using a computationally efficient variant of the approximate extended Kalman filter algorithm. The current work represents the first time that 3D fluorescence-enhanced optical tomographic reconstructions have been achieved from experimental measurements of time-dependent light propagation on a clinically relevant breast-shaped tissue phantom using a gain-modulated ICCD camera.

  14. The Cosmic Evolution Through UV Spectroscopy (CETUS) Probe Mission Concept

    Science.gov (United States)

    Danchi, William; Heap, Sara; Woodruff, Robert; Hull, Anthony; Kendrick, Stephen E.; Purves, Lloyd; McCandliss, Stephan; Dodson, Kelly; Mehle, Greg; Burge, James; Valente, Martin; Rhee, Michael; Smith, Walter; Choi, Michael; Stoneking, Eric

    2018-01-01

    CETUS is a mission concept for an all-UV telescope with 3 scientific instruments: a wide-field camera, a wide-field multi-object spectrograph, and a point-source high- and medium-resolution spectrograph. It is primarily intended to work with other survey telescopes of the 2020s (e.g. E-ROSITA (X-ray), LSST, Subaru, WFIRST (optical-near-IR), and SKA (radio)) to solve major outstanding problems in astrophysics. In this poster presentation, we give an overview of the CETUS key science goals and a progress report on the CETUS mission and instrument design.

  15. Polarizing aperture stereoscopic cinema camera

    Science.gov (United States)

    Lipton, Lenny

    2012-07-01

    The art of stereoscopic cinematography has been held back because of the lack of a convenient way to reduce the stereo camera lenses' interaxial to less than the distance between the eyes. This article describes a unified stereoscopic camera and lens design that allows for varying the interaxial separation to small values using a unique electro-optical polarizing aperture design for imaging left and right perspective views onto a large single digital sensor, the size of the standard 35 mm frame, with the means to select left and right image information. Even with the added stereoscopic capability, the appearance of existing camera bodies will be unaltered.

  16. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    Science.gov (United States)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel
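
    The "discovered" versus "discoverable" bookkeeping described for the simulator's detection layer can be sketched as a simple filter over per-exposure detection records. This is a minimal illustration with hypothetical field names and thresholds, not the survey simulator's actual code:

```python
# Classify a simulated object as discovered, or record why it was missed,
# using tunable thresholds like those of the simulator's detection layer.
def classify_object(detections, min_detections=3, max_trail_arcsec=10.0,
                    min_rate=0.05, max_rate=15.0):
    """detections: list of dicts with per-exposure properties (hypothetical):
    'on_gap' -> landed in a focal-plane array gap
    'trail'  -> trail length in arcsec
    'rate'   -> apparent rate of motion in deg/day
    Returns 'discovered' or a reason the object was missed."""
    usable = [d for d in detections
              if not d['on_gap']
              and d['trail'] <= max_trail_arcsec
              and min_rate <= d['rate'] <= max_rate]
    if len(usable) >= min_detections:
        return 'discovered'
    if len(detections) == 0:
        return 'missed: never in field of view'
    return 'missed: too few usable detections'

obs = [{'on_gap': False, 'trail': 2.0, 'rate': 0.5},
       {'on_gap': True,  'trail': 2.0, 'rate': 0.5},   # lost in chip gap
       {'on_gap': False, 'trail': 2.1, 'rate': 0.5},
       {'on_gap': False, 'trail': 2.2, 'rate': 0.5}]
print(classify_object(obs))  # -> discovered
```

    Tallying the returned reasons over a model population divides it into the "discovered" and "discoverable" subsets the abstract describes, which is what informs possible survey design changes.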

  17. The Whole is Greater than the Sum of the Parts: Optimizing the Joint Science Return from LSST, Euclid and WFIRST

    Energy Technology Data Exchange (ETDEWEB)

    Jain, B. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Spergel, D. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Connolly, A. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Dell' antonio, I. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Frieman, J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Gawiser, E. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Gehrels, N. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Gladney, L. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Heitmann, K. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Helou, G. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Hirata, C. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Ho, S. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Ivezic, Z. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Jarvis, M. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Kahn, S. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Kalirai, J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Kim, A. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Lupton, R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Mandelbaum, R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Marshall, P. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Newman, J. A. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Postman, M. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Rhodes, J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Strauss, M. A. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Tyson, J. A. 
[Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Wood-Vesey, W. M. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-02-02

    The scientific opportunity offered by the combination of data from LSST, WFIRST and Euclid goes well beyond the science enabled by any one of the data sets alone. The range in wavelength, angular resolution and redshift coverage that these missions jointly span is remarkable. With major investments in LSST and WFIRST, and partnership with ESA in Euclid, the US has an outstanding scientific opportunity to carry out a combined analysis of these data sets. It is imperative for us to seize it and, together with our European colleagues, prepare for the defining cosmological pursuit of the 21st century. The main argument for conducting a single, high-quality reference co-analysis exercise and carefully documenting the results is the complexity and subtlety of systematics that define this co-analysis. Falling back on many small efforts by different teams in selected fields and for narrow goals will be inefficient, leading to significant duplication of effort.

  18. A television/still camera with common optical system for reactor inspection

    International Nuclear Information System (INIS)

    Hughes, G.; McBane, P.

    1976-01-01

    One of the problems of reactor inspection is to obtain permanent high-quality records. Video recordings provide a record of poor quality but known content. Still cameras can be used, but the frame content is not predictable. Efforts have been made to use T.V. viewing to align a still camera, but a simple combination does not provide the same frame size. The necessity to preset the still camera controls severely restricts the flexibility of operation. A camera has, therefore, been designed which allows a search operation using the T.V. system. When an anomaly is found, the still camera controls can be remotely set, an exact record obtained, and the search operation continued without removal from the reactor. An application of this camera in the environment of the blanket gas region above the sodium region in PFR at 150 °C is described.

  19. Towards next generation 3D cameras

    Science.gov (United States)

    Gupta, Mohit

    2017-03-01

    We are in the midst of a 3D revolution. Robots enabled by 3D cameras are beginning to autonomously drive cars, perform surgeries, and manage factories. However, when deployed in the real world, these cameras face several challenges that prevent them from measuring 3D shape reliably. These challenges include large lighting variations (bright sunlight to dark night), presence of scattering media (fog, body tissue), and optically complex materials (metal, plastic). Due to these factors, 3D imaging is often the bottleneck in widespread adoption of several key robotics technologies. I will talk about our work on developing 3D cameras based on time-of-flight and active triangulation that address these long-standing problems. This includes designing `all-weather' cameras that can perform high-speed 3D scanning in harsh outdoor environments, as well as cameras that recover the shape of objects with challenging material properties. These cameras are, for the first time, capable of measuring detailed 3D shape under such conditions, with applications including robotic inspection and assembly systems.

  20. Astronomy and the camera obscura

    Science.gov (United States)

    Feist, M.

    2000-02-01

    The camera obscura (from Latin meaning darkened chamber) is a simple optical device with a long history. In the form considered here, it can be traced back to 1550. It had its heyday during the Victorian era, when it was to be found at the seaside as a tourist attraction or sideshow. It was also used as an artist's drawing aid and, in 1620, the famous astronomer-mathematician Johannes Kepler used a small tent camera obscura to trace the scenery.

  1. Qualification Tests of Micro-camera Modules for Space Applications

    Science.gov (United States)

    Kimura, Shinichi; Miyasaka, Akira

    Visual capability is very important for space-based activities, for which small, low-cost space cameras are desired. Although cameras for terrestrial applications are continually being improved, little progress has been made on cameras used in space, which must be extremely robust to withstand harsh environments. This study focuses on commercial off-the-shelf (COTS) CMOS digital cameras because they are very small and are based on an established mass-market technology. Radiation and ultrahigh-vacuum tests were conducted on a small COTS camera that weighs less than 100 mg (including optics). This paper presents the results of the qualification tests for COTS cameras and for a small, low-cost COTS-based space camera.

  2. ARNICA, the Arcetri near-infrared camera: Astronomical performance assessment.

    Science.gov (United States)

    Hunt, L. K.; Lisi, F.; Testi, L.; Baffa, C.; Borelli, S.; Maiolino, R.; Moriondo, G.; Stanga, R. M.

    1996-01-01

    The Arcetri near-infrared camera ARNICA was built as a users' instrument for the Infrared Telescope at Gornergrat (TIRGO), and is based on a 256x256 NICMOS 3 detector. In this paper, we discuss ARNICA's optical and astronomical performance at the TIRGO and at the William Herschel Telescope on La Palma. Optical performance is evaluated in terms of plate scale, distortion, point spread function, and ghosting. Astronomical performance is characterized by camera efficiency, sensitivity, and spatial uniformity of the photometry.
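
    Plate scale, the first optical-performance figure evaluated above, follows from the small-angle relation between pixel pitch and effective focal length. A minimal sketch with made-up numbers (not ARNICA's published specifications):

```python
# Plate scale from the small-angle approximation:
# scale ["/pixel] = 206265 ["/rad] * pixel_pitch / focal_length.
def plate_scale_arcsec_per_pixel(pixel_pitch_um, focal_length_mm):
    return 206265.0 * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

# Hypothetical numbers: a 40 um pixel behind an f/20 beam on a 1.5 m telescope.
focal_length_mm = 20 * 1.5 * 1000   # f-ratio * aperture diameter
print(round(plate_scale_arcsec_per_pixel(40.0, focal_length_mm), 3))  # -> 0.275
```

    Comparing the measured plate scale against this ideal value across the field is one way distortion shows up in practice.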

  3. Optics for MUSIC: a new (sub)millimeter camera for the Caltech Submillimeter Observatory

    Science.gov (United States)

    Sayers, Jack; Czakon, Nicole G.; Day, Peter K.; Downes, Thomas P.; Duan, Ran P.; Gao, Jiansong; Glenn, Jason; Golwala, Sunil R.; Hollister, Matt I.; LeDuc, Henry G.; Mazin, Benjamin A.; Maloney, Philip R.; Noroozian, Omid; Nguyen, Hien T.; Schlaerth, James A.; Siegel, Seth; Vaillancourt, John E.; Vayonakis, Anastasios; Wilson, Philip R.; Zmuidzinas, Jonas

    2010-07-01

    We will present the design and implementation, along with calculations and some measurements of the performance, of the room-temperature and cryogenic optics for MUSIC, a new (sub)millimeter camera we are developing for the Caltech Submm Observatory (CSO). The design consists of two focusing elements in addition to the CSO primary and secondary mirrors: a warm off-axis elliptical mirror and a cryogenic (4K) lens. These optics will provide a 14 arcmin field of view that is diffraction limited in all four of the MUSIC observing bands (2.00, 1.33, 1.02, and 0.86 mm). A cold (4K) Lyot stop will be used to define the primary mirror illumination, which will be maximized while keeping spillover at the sub-1% level. The MUSIC focal plane will be populated with broadband phased antenna arrays that efficiently couple over a factor of approximately 3 in bandwidth, and each pixel on the focal plane will be read out via a set of four lumped-element filters that define the MUSIC observing bands (i.e., each pixel on the focal plane simultaneously observes in all four bands). Finally, a series of dielectric and metal-mesh low-pass filters have been implemented to reduce the optical power load on the MUSIC cryogenic stages to a quasi-negligible level while maintaining good transmission in-band.

  4. Intraocular camera for retinal prostheses: Refractive and diffractive lens systems

    Science.gov (United States)

    Hauer, Michelle Christine

    The focus of this thesis is on the design and analysis of refractive, diffractive, and hybrid refractive/diffractive lens systems for a miniaturized camera that can be surgically implanted in the crystalline lens sac and is designed to work in conjunction with current and future generation retinal prostheses. The development of such an intraocular camera (IOC) would eliminate the need for an external head-mounted or eyeglass-mounted camera. Placing the camera inside the eye would allow subjects to use their natural eye movements for foveation (attention) instead of more cumbersome head tracking, would notably aid in personal navigation and mobility, and would also be significantly more psychologically appealing from the standpoint of personal appearances. The capability for accommodation with no moving parts or feedback control is incorporated by employing camera designs that exhibit nearly infinite depth of field. Such an ultracompact optical imaging system requires a unique combination of refractive and diffractive optical elements and relaxed system constraints derived from human psychophysics. This configuration necessitates an extremely compact, short focal-length lens system with an f-number close to unity. Initially, these constraints appear highly aggressive from an optical design perspective. However, after careful analysis of the unique imaging requirements of a camera intended to work in conjunction with the relatively low pixellation levels of a retinal microstimulator array, it becomes clear that such a design is not only feasible, but could possibly be implemented with a single lens system.
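
    The "nearly infinite depth of field" argument above can be made concrete with the hyperfocal distance relation: focusing at the hyperfocal distance renders everything from half that distance to infinity acceptably sharp. The numbers below are illustrative assumptions, not the thesis design values:

```python
# Hyperfocal distance H = f^2 / (N * c) + f, where f is the focal length,
# N the f-number, and c the acceptable circle of confusion.
def hyperfocal_mm(focal_length_mm, f_number, coc_mm):
    f = focal_length_mm
    return f * f / (f_number * coc_mm) + f

# Hypothetical IOC-like values: 2 mm focal length, f/1.0, and a large
# circle of confusion (0.05 mm) reflecting the coarse pixellation of a
# retinal microstimulator array.
H = hyperfocal_mm(2.0, 1.0, 0.05)
print(H, H / 2)  # near limit of acceptable sharpness when focused at H
```

    With these assumed values everything beyond roughly 4 cm is in focus at once, which illustrates why relaxed resolution requirements make accommodation-free operation plausible.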

  5. Initial inflight calibration for Hayabusa2 optical navigation camera (ONC) for science observations of asteroid Ryugu

    Science.gov (United States)

    Suzuki, H.; Yamada, M.; Kouyama, T.; Tatsumi, E.; Kameda, S.; Honda, R.; Sawada, H.; Ogawa, N.; Morota, T.; Honda, C.; Sakatani, N.; Hayakawa, M.; Yokota, Y.; Yamamoto, Y.; Sugita, S.

    2018-01-01

    Hayabusa2, the first sample return mission to a C-type asteroid, was launched by the Japan Aerospace Exploration Agency (JAXA) on December 3, 2014 and will arrive at the asteroid in the middle of 2018 to collect samples from its surface, which may contain both hydrated minerals and organics. The optical navigation camera (ONC) system on board Hayabusa2 consists of three individual framing CCD cameras: ONC-T for a telescopic nadir view, ONC-W1 for a wide-angle nadir view, and ONC-W2 for a wide-angle slant view. The cameras will be used to observe the surface of Ryugu and to measure the global asteroid shape, local morphologies, and visible spectroscopic properties. Thus, image data obtained by the ONC will provide essential information for selecting landing (sampling) sites on the asteroid. This study reports the results of initial in-flight calibration based on observations of the Earth, Mars, the Moon, and stars to verify and characterize the optical performance of the ONC, such as flat-field sensitivity, spectral sensitivity, point-spread function (PSF), distortion, and stray light for ONC-T, and distortion for ONC-W1 and W2. We found some potential problems that may influence our science observations. These include changes in the flat-field sensitivity in all bands from the values measured in pre-flight calibration, and stray light that arises under certain spacecraft attitudes with respect to the sun. Countermeasures for these problems were evaluated using data obtained during the initial in-flight calibration. The results indicate that the error of spectroscopic measurements around 0.7 μm using the 0.55, 0.70, and 0.86 μm bands of ONC-T can be lower than 0.7% after these countermeasures and pixel binning. This result suggests that ONC-T should be able to detect the typical strength (∼3%) of the serpentine absorption band often found on CM chondrites and low albedo asteroids with ≥ 4
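
    The quoted 0.7 μm error budget refers to a band-depth measurement against a continuum spanned by the flanking 0.55 and 0.86 μm bands. A hedged sketch of that arithmetic (linear continuum interpolation; toy reflectance values, not ONC data):

```python
# Depth of the 0.7-um absorption relative to a straight-line continuum
# drawn between the 0.55-um and 0.86-um reflectances.
def band_depth_07um(r055, r070, r086):
    t = (0.70 - 0.55) / (0.86 - 0.55)   # continuum position at 0.70 um
    r_cont = r055 + t * (r086 - r055)   # interpolated continuum level
    return 1.0 - r070 / r_cont

# toy reflectances with a ~3% dip at 0.70 um
print(round(band_depth_07um(0.040, 0.0397, 0.042), 4))  # -> 0.0309
```

    A ~3% band depth measured with a photometric error below 0.7% per band is what makes the quoted serpentine-band detection plausible.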

  6. Evaluation of the optical cross talk level in the SiPMs adopted in ASTRI SST-2M Cherenkov Camera using EASIROC front-end electronics

    International Nuclear Information System (INIS)

    Impiombato, D; Giarrusso, S; Mineo, T; Agnetta, G; Biondo, B; Catalano, O; Gargano, C; Rosa, G La; Russo, F; Sottile, G; Belluso, M; Billotta, S; Bonanno, G; Garozzo, S; Marano, D; Romeo, G

    2014-01-01

    ASTRI (Astrofisica con Specchi a Tecnologia Replicante Italiana) is a flagship project of the Italian Ministry of Education, University and Research whose main goal is the design and construction of an end-to-end prototype of the Small Size Telescope of the Cherenkov Telescope Array. The prototype, named ASTRI SST-2M, will adopt a wide-field dual-mirror optical system in a Schwarzschild-Couder configuration to explore the VHE range of the electromagnetic spectrum. The camera at the focal plane is based on Silicon Photo-Multiplier detectors, an innovative solution for the detection of astronomical Cherenkov light. This contribution reports some preliminary results on the evaluation of the optical cross talk level among the SiPM pixels foreseen for the ASTRI SST-2M camera.

  7. CCD camera system for use with a streamer chamber

    International Nuclear Information System (INIS)

    Angius, S.A.; Au, R.; Crawley, G.C.; Djalali, C.; Fox, R.; Maier, M.; Ogilvie, C.A.; Molen, A. van der; Westfall, G.D.; Tickle, R.S.

    1988-01-01

    A system based on three charge-coupled-device (CCD) cameras is described here. It has been used to acquire images from a streamer chamber and consists of three identical subsystems, one for each camera. Each subsystem contains an optical lens, CCD camera head, camera controller, an interface between the CCD and a microprocessor, and a link to a minicomputer for data recording and on-line analysis. Image analysis techniques have been developed to enhance the quality of the particle tracks. Some steps have been made to automatically identify tracks and reconstruct the event. (orig.)

  8. Gamma ray camera

    International Nuclear Information System (INIS)

    Wang, S.-H.; Robbins, C.D.

    1979-01-01

    An Anger gamma ray camera is improved by the substitution of a gamma ray sensitive, proximity type image intensifier tube for the scintillator screen in the Anger camera. The image intensifier tube has a negatively charged flat scintillator screen, a flat photocathode layer, and a grounded, flat output phosphor display screen, all of which have the same dimension to maintain unit image magnification; all components are contained within a grounded metallic tube, with a metallic, inwardly curved input window between the scintillator screen and a collimator. The display screen can be viewed by an array of photomultipliers or solid state detectors. There are two photocathodes and two phosphor screens to give a two stage intensification, the two stages being optically coupled by a light guide. (author)

  9. World's fastest and most sensitive astronomical camera

    Science.gov (United States)

    2009-06-01

    The next generation of instruments for ground-based telescopes took a leap forward with the development of a new ultra-fast camera that can take 1500 finely exposed images per second even when observing extremely faint objects. The first 240x240 pixel images with the world's fastest high precision faint light camera were obtained through a collaborative effort between ESO and three French laboratories from the French Centre National de la Recherche Scientifique/Institut National des Sciences de l'Univers (CNRS/INSU). Cameras such as this are key components of the next generation of adaptive optics instruments of Europe's ground-based astronomy flagship facility, the ESO Very Large Telescope (VLT). "The performance of this breakthrough camera is without an equivalent anywhere in the world. The camera will enable great leaps forward in many areas of the study of the Universe," says Norbert Hubin, head of the Adaptive Optics department at ESO. OCam will be part of the second-generation VLT instrument SPHERE. To be installed in 2011, SPHERE will take images of giant exoplanets orbiting nearby stars. A fast camera such as this is needed as an essential component for the modern adaptive optics instruments used on the largest ground-based telescopes. Telescopes on the ground suffer from the blurring effect induced by atmospheric turbulence. This turbulence causes the stars to twinkle in a way that delights poets, but frustrates astronomers, since it blurs the finest details of the images. Adaptive optics techniques overcome this major drawback, so that ground-based telescopes can produce images that are as sharp as if taken from space. Adaptive optics is based on real-time corrections computed from images obtained by a special camera working at very high speeds. Nowadays, this means many hundreds of times each second. The new generation instruments require these

  10. The GISMO-2 Bolometer Camera

    Science.gov (United States)

    Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; hide

    2012-01-01

    We present the concept for the GISMO-2 bolometer camera, which we are building for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array; the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the full field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  11. Design of microcontroller based system for automation of streak camera

    International Nuclear Information System (INIS)

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.; Sharma, M. L.; Navathe, C. P.

    2010-01-01

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8 bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for various electrodes of the tubes are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LABVIEW based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.
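
    The ramp-generation arithmetic implied above can be sketched with an ideal integrator model. All component values and the deflection factor below are hypothetical, not the actual S-20 circuit parameters:

```python
# An ideal op-amp integrator turns a voltage step into a linear ramp with
# slope Vstep/(R*C); the streak sweep speed on the screen then follows
# from the deflection plates' sensitivity (volts per mm of deflection).
def ramp_slope_v_per_s(v_step, r_ohm, c_farad):
    return v_step / (r_ohm * c_farad)

def sweep_speed_mm_per_ns(slope_v_per_s, deflection_factor_v_per_mm):
    return slope_v_per_s / deflection_factor_v_per_mm * 1e-9

slope = ramp_slope_v_per_s(v_step=100.0, r_ohm=1e3, c_farad=1e-12)  # ~1e11 V/s
print(sweep_speed_mm_per_ns(slope, deflection_factor_v_per_mm=50.0))
```

    Swapping the timing components changes the slope and hence the time window captured across the streak tube's screen, which is why the design makes them selectable.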

  12. Design of microcontroller based system for automation of streak camera.

    Science.gov (United States)

    Joshi, M J; Upadhyay, J; Deshpande, P P; Sharma, M L; Navathe, C P

    2010-08-01

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8 bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for various electrodes of the tubes are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LABVIEW based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  13. Design of microcontroller based system for automation of streak camera

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.; Sharma, M. L.; Navathe, C. P. [Laser Electronics Support Division, RRCAT, Indore 452013 (India)

    2010-08-15

    A microcontroller based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8 bit MCS family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for various electrodes of the tubes are generated using dc-to-dc converters. A high voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LABVIEW based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  14. Prism-based single-camera system for stereo display

    Science.gov (United States)

    Zhao, Yue; Cui, Xiaoyu; Wang, Zhiguo; Chen, Hongsheng; Fan, Heyu; Wu, Teresa

    2016-06-01

    This paper combines a prism with a single camera and puts forward a low-cost method of stereo imaging. First, from the principles of geometrical optics we deduce the relationship between the prism single-camera system and a dual-camera system, and from the principles of binocular vision we deduce the relationship between binocular viewing and a dual-camera system. We can thus establish the relationship between the prism single-camera system and binocular viewing, and obtain the positions of prism, camera, and object that give the best stereo display. Finally, using the active shutter stereo glasses of NVIDIA Company, we realize the three-dimensional (3-D) display of the object. The experimental results show that the proposed approach can use the prism single-camera system to simulate the various observation manners of the eyes. A stereo imaging system designed by the proposed method can faithfully restore the 3-D shape of the photographed object.
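
    The geometric relationship alluded to above can be illustrated with a simplified thin-biprism model: each prism half deviates rays by roughly (n-1) times its apex angle, so an object at distance D is seen from two virtual viewpoints separated by about 2·D·tan(δ). This is a hedged first-order sketch (the paper derives the exact relationship), with made-up values:

```python
import math

# Thin-prism deviation: delta ~ (n - 1) * alpha for apex angle alpha.
def thin_prism_deviation_deg(apex_angle_deg, n=1.5):
    return (n - 1.0) * apex_angle_deg

# First-order equivalent stereo baseline for an object at distance D.
def equivalent_baseline_mm(object_distance_mm, deviation_deg):
    return 2.0 * object_distance_mm * math.tan(math.radians(deviation_deg))

delta = thin_prism_deviation_deg(apex_angle_deg=4.0)    # 2 degrees
print(round(equivalent_baseline_mm(1000.0, delta), 1))  # -> 69.8 (at 1 m)
```

    Note that in this simplified model the effective baseline grows with object distance, one reason the full geometric derivation in the paper is needed to place prism, camera, and object for the best stereo effect.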

  15. Modelling of the over-exposed pixel area of CCD cameras caused by laser dazzling

    NARCIS (Netherlands)

    Benoist, K.W.; Schleijpen, R.M.A.

    2014-01-01

    A simple model has been developed and implemented in Matlab code, predicting the over-exposed pixel area of cameras caused by laser dazzling. Inputs of this model are the laser irradiance on the front optics of the camera, the Point Spread Function (PSF) of the used optics, the integration time of

  16. Preflight Calibration Test Results for Optical Navigation Camera Telescope (ONC-T) Onboard the Hayabusa2 Spacecraft

    Science.gov (United States)

    Kameda, S.; Suzuki, H.; Takamatsu, T.; Cho, Y.; Yasuda, T.; Yamada, M.; Sawada, H.; Honda, R.; Morota, T.; Honda, C.; Sato, M.; Okumura, Y.; Shibasaki, K.; Ikezawa, S.; Sugita, S.

    2017-07-01

    The optical navigation camera telescope (ONC-T) is a telescopic framing camera with seven colors onboard the Hayabusa2 spacecraft launched on December 3, 2014. The main objectives of this instrument are to optically navigate the spacecraft to asteroid Ryugu and to conduct multi-band mapping of the asteroid. We conducted performance tests of the instrument before its installation on the spacecraft. We evaluated the dark current and bias level, and obtained data on the dependence of the dark current on the temperature of the charge-coupled device (CCD). The bias level depends strongly on the temperature of the electronics package but only weakly on the CCD temperature. The dark-reference data, which are obtained simultaneously with observation data, can be used to estimate the dark current and bias level. A long front hood is used on ONC-T to reduce stray light at the expense of flatness in the peripheral area of the field of view (FOV). The central area of the FOV has flat sensitivity, and the limb darkening has been measured with an integrating sphere. The ONC-T has a wheel with seven bandpass filters and a panchromatic glass window. We measured the spectral sensitivity using an integrating sphere and obtained the sensitivity of all the pixels. We also measured the point-spread function using a star simulator. Measurement results indicate that the full width at half maximum is less than two pixels for all the bandpass filters and across the temperature range expected in the mission phase, except for short periods of time during touchdowns.
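
    The dark-current and bias corrections described above can be sketched with a common rule of thumb: CCD dark current roughly doubles every ~6 °C, and the per-frame dark-reference data then let bias and dark charge be removed frame by frame. All parameters below are illustrative assumptions, not ONC-T calibration values:

```python
import numpy as np

# Dark current model: d(T) = d0 * 2^((T - T0) / doubling_interval).
def dark_current_e_per_s(temp_c, d0=1.0, t0=20.0, doubling_c=6.0):
    """Dark current at temp_c, given d0 e-/s at reference temperature t0."""
    return d0 * 2.0 ** ((temp_c - t0) / doubling_c)

# Remove bias level and accumulated dark charge from a raw frame.
def calibrate(raw, bias, dark_rate, exposure_s):
    return raw - bias - dark_rate * exposure_s

raw = np.array([520.0, 540.0])
frame = calibrate(raw, bias=500.0,
                  dark_rate=dark_current_e_per_s(8.0),  # 0.25 e-/s at 8 degC
                  exposure_s=10.0)
print(frame)  # -> [17.5 37.5]
```

    The strong temperature dependence is why estimating the dark rate from simultaneously acquired dark-reference pixels is preferable to relying on a fixed pre-flight value.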

  17. Analysis of Brown camera distortion model

    Science.gov (United States)

    Nowakowski, Artur; Skarbek, Władysław

    2013-10-01

    Contemporary image acquisition devices introduce optical distortion into images, resulting in pixel displacement that needs to be compensated for in many computer vision applications. The distortion is usually modeled by the Brown distortion model, whose parameters can be included in the camera calibration task. In this paper we describe the original model and its dependencies, and analyze orthogonality with respect to radius for its decentering distortion component. We also report experiments with the camera calibration algorithm included in the OpenCV library; in particular, the stability of the distortion parameter estimation is evaluated.
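
    The Brown model combines radial terms in even powers of the radius with the decentering (tangential) terms analyzed in the paper. A minimal sketch in the k1, k2, k3, p1, p2 convention used by OpenCV:

```python
# Map ideal normalized coordinates (x, y) to distorted coordinates using
# the Brown model: radial polynomial plus decentering (tangential) terms.
def brown_distort(x, y, k1, k2, k3, p1, p2):
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return xd, yd

# Pure radial barrel distortion (k1 < 0) moves points toward the center.
xd, yd = brown_distort(0.5, 0.0, k1=-0.2, k2=0.0, k3=0.0, p1=0.0, p2=0.0)
print(xd, yd)
```

    With the decentering coefficients p1 and p2 set to zero the displacement is purely radial; it is the decentering component whose orthogonality with respect to radius the paper analyzes.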

  18. Lensless Photoluminescence Hyperspectral Camera Employing Random Speckle Patterns.

    Czech Academy of Sciences Publication Activity Database

    Žídek, Karel; Denk, Ondřej; Hlubuček, Jiří

    2017-01-01

    Roč. 7, č. 1 (2017), č. článku 15309. ISSN 2045-2322 R&D Projects: GA MŠk(CZ) LO1206; GA ČR(CZ) GJ17-26284Y Institutional support: RVO:61389021 Keywords : compressed sensing * photoluminescence imaging * laser speckles * single-pixel camera Subject RIV: BH - Optics, Masers, Lasers OBOR OECD: Optics (including laser optics and quantum optics) Impact factor: 4.259, year: 2016 https://www.nature.com/articles/s41598-017-14443-4

  19. Radiometric calibration of digital cameras using neural networks

    Science.gov (United States)

    Grunwald, Michael; Laube, Pascal; Schall, Martin; Umlauf, Georg; Franz, Matthias O.

    2017-08-01

    Digital cameras are used in a large variety of scientific and industrial applications. For most applications, the acquired data should represent the real light intensity per pixel as accurately as possible. However, digital cameras are subject to physical, electronic and optical effects that lead to errors and noise in the raw image. Temperature-dependent dark current, read noise, optical vignetting or different sensitivities of individual pixels are examples of such effects. The purpose of radiometric calibration is to improve the quality of the resulting images by reducing the influence of the various types of errors on the measured data and thus improving the quality of the overall application. In this context, we present a specialized neural network architecture for radiometric calibration of digital cameras. Neural networks are used to learn a temperature- and exposure-dependent mapping from observed gray-scale values to true light intensities for each pixel. In contrast to classical flat-fielding, neural networks have the potential to model nonlinear mappings which allows for accurately capturing the temperature dependence of the dark current and for modeling cameras with nonlinear sensitivities. Both scenarios are highly relevant in industrial applications. The experimental comparison of our network approach to classical flat-fielding shows a consistently higher reconstruction quality, also for linear cameras. In addition, the calibration is faster than previous machine learning approaches based on Gaussian processes.
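
    The classical flat-fielding baseline that the network approach is compared against is a per-pixel linear correction: subtract a dark frame and divide by a normalized flat field. A minimal sketch (generic textbook procedure, not the authors' implementation):

```python
import numpy as np

# Classical flat-field correction: remove the dark/bias offset, then
# divide by each pixel's relative sensitivity measured from a flat frame.
def flat_field_correct(raw, dark, flat):
    gain = flat - dark
    gain = gain / gain.mean()       # normalize the flat response
    return (raw - dark) / gain

raw  = np.array([[110.0, 220.0]])
dark = np.array([[10.0, 20.0]])
flat = np.array([[60.0, 120.0]])    # second pixel is twice as sensitive
print(flat_field_correct(raw, dark, flat))  # both pixels map to 150.0
```

    Because this correction is linear in the raw value and uses fixed dark and flat frames, it cannot capture temperature- or exposure-dependent nonlinearities, which is exactly the gap the learned mapping targets.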

  20. Camera-marker and inertial sensor fusion for improved motion tracking

    NARCIS (Netherlands)

    Roetenberg, D.; Veltink, P.H.

    2005-01-01

    A method for combining a camera-marker based motion analysis system with miniature inertial sensors is proposed. It is used to fill gaps of optical data and can increase the data rate of the optical system.

  1. Comparison of low-cost handheld retinal camera and traditional table top retinal camera in the detection of retinal features indicating a risk of cardiovascular disease

    Science.gov (United States)

    Joshi, V.; Wigdahl, J.; Nemeth, S.; Zamora, G.; Ebrahim, E.; Soliz, P.

    2018-02-01

    Retinal abnormalities associated with hypertensive retinopathy are useful in assessing the risk of cardiovascular disease, heart failure, and stroke. Assessing these risks as part of primary care can lead to a decrease in the incidence of cardiovascular disease-related deaths. Primary care is a resource-limited setting where low-cost retinal cameras may bring needed help without compromising care. We compared a low-cost handheld retinal camera to a traditional tabletop retinal camera with respect to their optical characteristics and performance in detecting hypertensive retinopathy. A retrospective dataset of N=40 subjects (28 with hypertensive retinopathy, 12 controls) was used from a clinical study conducted at a primary care clinic in Texas. Non-mydriatic retinal fundus images were acquired using a Pictor Plus handheld camera (Volk Optical Inc.) and a Canon CR1-Mark II tabletop camera (Canon USA) during the same encounter. The images from each camera were graded by a licensed optometrist according to the universally accepted Keith-Wagener-Barker Hypertensive Retinopathy Classification System, three weeks apart to minimize memory bias. The sensitivity of the handheld camera to detect any level of hypertensive retinopathy was 86% compared to the Canon. Insufficient photographer skill produced 70% of the false negative cases. The other 30% were due to the handheld camera's insufficient spatial resolution to resolve vascular changes such as minor A/V nicking and copper wiring, but these were associated with non-referable disease. Physician evaluation of the performance of the handheld camera indicates it is sufficient to provide high-risk patients with adequate follow-up and management.
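The reported 86% sensitivity is a standard true-positive rate. A minimal sketch of the arithmetic, with hypothetical counts chosen only for illustration (24 of 28 diseased eyes detected; the abstract does not give the actual tallies):

```python
def sensitivity(tp, fn):
    """Sensitivity (true-positive rate) = TP / (TP + FN)."""
    return tp / (tp + fn)

# Hypothetical counts: 24 of the 28 hypertensive-retinopathy cases flagged
# on the handheld images, 4 missed.
rate = sensitivity(24, 4)   # about 0.86
```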

  2. Sedimentological Investigations of the Martian Surface using the Mars 2001 Robotic Arm Camera and MECA Optical Microscope

    Science.gov (United States)

    Rice, J. W., Jr.; Smith, P. H.; Marshall, J. R.

    1999-01-01

    The first microscopic sedimentological studies of the Martian surface will commence with the landing of the Mars Polar Lander (MPL) on December 3, 1999. The Robotic Arm Camera (RAC) has a resolution of 25 µm/pixel, which will permit detailed micromorphological analysis of surface and subsurface materials. The Robotic Arm will be able to dig up to 50 cm below the surface. The walls of the trench will also be inspected by the RAC to look for evidence of stratigraphic and/or sedimentological relationships. The 2001 Mars Lander will build upon and expand the sedimentological research begun by the RAC on MPL. This will be accomplished by: (1) Macroscopic (dm to cm): Descent Imager, Pancam, RAC; (2) Microscopic (mm to µm): RAC, MECA Optical Microscope (Figure 2), AFM. This paper will focus on investigations that can be conducted by the RAC and MECA Optical Microscope.

  3. A USB 2.0 computer interface for the UCO/Lick CCD cameras

    Science.gov (United States)

    Wei, Mingzhi; Stover, Richard J.

    2004-09-01

    The new UCO/Lick Observatory CCD camera uses a 200 MHz fiber optic cable to transmit image data and an RS232 serial line for low speed bidirectional command and control. Increasingly RS232 is a legacy interface supported on fewer computers. The fiber optic cable requires either a custom interface board that is plugged into the mainboard of the image acquisition computer to accept the fiber directly or an interface converter that translates the fiber data onto a widely used standard interface. We present here a simple USB 2.0 interface for the UCO/Lick camera. A single USB cable connects to the image acquisition computer and the camera's RS232 serial and fiber optic cables plug into the USB interface. Since most computers now support USB 2.0 the Lick interface makes it possible to use the camera on essentially any modern computer that has the supporting software. No hardware modifications or additions to the computer are needed. The necessary device driver software has been written for the Linux operating system which is now widely used at Lick Observatory. The complete data acquisition software for the Lick CCD camera is running on a variety of PC style computers as well as an HP laptop.

  4. Picosecond x-ray streak cameras

    Science.gov (United States)

    Averin, V. I.; Bryukhnevich, Gennadii I.; Kolesov, G. V.; Lebedev, Vitaly B.; Miller, V. A.; Saulevich, S. V.; Shulika, A. N.

    1991-04-01

    The first multistage image converter with an X-ray photocathode (UMI-93SR) was designed at VNIIOFI in 1974 [1]. Experiments carried out at IOFAN showed that X-ray electron-optical cameras using the tube provided temporal resolution down to 12 picoseconds [2]. Later work developed into the creation of separate streak and intensifying tubes. Thus, the PV-003R tube was built on the basis of the UMI-93SR design, fiber-optically connected to a PMU-2V image intensifier carrying a microchannel plate.

  5. MOSS spectroscopic camera for imaging time resolved plasma species temperature and flow speed

    International Nuclear Information System (INIS)

    Michael, Clive; Howard, John

    2000-01-01

    A MOSS (Modulated Optical Solid-State) spectroscopic camera has been devised to monitor the spatial and temporal variations of the temperatures and flow speeds of plasma ion species by measuring the Doppler broadening of specified spectral lines. As opposed to a single-channel MOSS spectrometer, the camera images light from the plasma onto an array of light detectors, enabling 2D imaging of plasma ion temperatures and flow speeds. In addition, compared to a conventional grating spectrometer, the MOSS camera shows excellent light-collecting performance, which improves the signal-to-noise ratio and the time resolution. The present paper first describes the basic elements of MOSS spectroscopy, then the MOSS camera, with an emphasis on the optical system for 2D imaging. (author)

  6. MOSS spectroscopic camera for imaging time resolved plasma species temperature and flow speed

    Energy Technology Data Exchange (ETDEWEB)

    Michael, Clive; Howard, John [Australian National Univ., Plasma Research Laboratory, Canberra (Australia)

    2000-03-01

    A MOSS (Modulated Optical Solid-State) spectroscopic camera has been devised to monitor the spatial and temporal variations of the temperatures and flow speeds of plasma ion species by measuring the Doppler broadening of specified spectral lines. As opposed to a single-channel MOSS spectrometer, the camera images light from the plasma onto an array of light detectors, enabling 2D imaging of plasma ion temperatures and flow speeds. In addition, compared to a conventional grating spectrometer, the MOSS camera shows excellent light-collecting performance, which improves the signal-to-noise ratio and the time resolution. The present paper first describes the basic elements of MOSS spectroscopy, then the MOSS camera, with an emphasis on the optical system for 2D imaging. (author)
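The Doppler-broadening measurement underlying the MOSS camera relates the width of an emission line to the ion temperature via T = m c² (Δλ/λ₀)² / (8 ln 2 k_B) for a Gaussian line of FWHM Δλ. A minimal sketch of that relation; the CIII line choice and the 0.02 nm width are hypothetical example values, not results from the paper:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
C   = 2.99792458e8   # speed of light, m/s

def ion_temperature(lambda0_nm, fwhm_nm, ion_mass_kg):
    """Ion temperature (K) from the Doppler FWHM of a Gaussian spectral line:
    T = m c^2 (dlambda/lambda0)^2 / (8 ln 2 k_B)."""
    ratio = fwhm_nm / lambda0_nm
    return ion_mass_kg * C**2 * ratio**2 / (8 * math.log(2) * K_B)

# Hypothetical example: a CIII 464.9 nm line with a 0.02 nm Doppler FWHM;
# carbon mass taken as 12 atomic mass units.
m_c = 12 * 1.66053906660e-27
T = ion_temperature(464.9, 0.02, m_c)          # kelvin
T_eV = T * K_B / 1.602176634e-19               # same temperature in eV
```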

  7. Soft x-ray streak cameras

    International Nuclear Information System (INIS)

    Stradling, G.L.

    1988-01-01

    This paper is a discussion of the development and of the current state of the art in picosecond soft x-ray streak camera technology. Accomplishments from a number of institutions are discussed. X-ray streak cameras vary from standard visible streak camera designs in the use of an x-ray transmitting window and an x-ray sensitive photocathode. The spectral sensitivity range of these instruments includes portions of the near UV and extends from the subkilovolt x-ray region to several tens of kilovolts. Attendant challenges encountered in the design and use of x-ray streak cameras include the accommodation of high-voltage and vacuum requirements, as well as manipulation of a photocathode structure which is often fragile. The x-ray transmitting window is generally too fragile to withstand atmospheric pressure, necessitating active vacuum pumping and a vacuum line of sight to the x-ray signal source. Because of the difficulty of manipulating x-ray beams with conventional optics, as is done with visible light, the size of the photocathode sensing area, access to the front of the tube, the ability to insert the streak tube into a vacuum chamber and the capability to trigger the sweep with very short internal delay times are issues uniquely relevant to x-ray streak camera use. The physics of electron imaging may place more stringent limitations on the temporal and spatial resolution obtainable with x-ray photocathodes than with the visible counterpart. Other issues which are common to the entire streak camera community also concern the x-ray streak camera users and manufacturers.

  8. Demonstration of the CDMA-mode CAOS smart camera.

    Science.gov (United States)

    Riza, Nabeel A; Mazhar, Mohsin A

    2017-12-11

    Demonstrated is the code division multiple access (CDMA)-mode coded access optical sensor (CAOS) smart camera suited for bright target scenarios. Deploying a silicon CMOS sensor and a silicon point detector within a digital micro-mirror device (DMD)-based spatially isolating hybrid camera design, this smart imager first engages the DMD staring mode with a controlled factor-of-200 optical attenuation of the scene irradiance to provide a classic unsaturated CMOS sensor-based image for target intelligence gathering. Next, this CMOS sensor image data is used to acquire a more robust un-attenuated true target image of a focused zone using the time-modulated CDMA mode of the CAOS camera. Using four different bright-light test target scenes, successfully demonstrated is a proof-of-concept visible-band CAOS smart camera operating in the CDMA mode using Walsh-design CAOS pixel codes of up to 4096 bits at a maximum 10 kHz code bit rate, giving a 0.4096 s CAOS frame acquisition time. A 16-bit analog-to-digital converter (ADC) with time-domain correlation digital signal processing (DSP) generates the CDMA-mode images with a 3600 CAOS pixel count and a best spatial resolution of one micro-mirror pixel, 13.68 μm on a side. The CDMA mode of the CAOS smart camera is suited for applications where robust high dynamic range (DR) imaging is needed for un-attenuated, un-spoiled, bright, spectrally diverse targets.
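The CDMA encode/decode idea can be illustrated with Sylvester-construction Hadamard (Walsh-family) codes. This numpy sketch uses idealized bipolar ±1 codes and a hypothetical 8-pixel scene; in an actual DMD system the ±1 codes would be realized as complementary on/off mirror measurements, and the hardware details differ from this toy model.

```python
import numpy as np

def hadamard(n):
    """Sylvester-construction Hadamard matrix; n must be a power of two."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# Each CAOS "pixel" is assigned one row of the Hadamard matrix as its time
# code. The single point detector records the sum of all coded pixel signals;
# correlating the detector time series with each code recovers that pixel.
n_codes = 8
H = hadamard(n_codes)
pixels = np.array([5.0, 0.0, 3.0, 1.0, 0.0, 2.0, 0.0, 4.0])  # hypothetical scene
detector = H.T @ pixels            # detector time series, one sample per code bit
decoded = H @ detector / n_codes   # correlation decode (rows are orthogonal)
```

Orthogonality of the rows (H Hᵀ = nI) is what lets every pixel be read out simultaneously yet separated exactly.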

  9. A Simple Spectrophotometer Using Common Materials and a Digital Camera

    Science.gov (United States)

    Widiatmoko, Eko; Widayani; Budiman, Maman; Abdullah, Mikrajuddin; Khairurrijal

    2011-01-01

    A simple spectrophotometer was designed using cardboard, a DVD, a pocket digital camera, a tripod and a computer. The DVD was used as a diffraction grating and the camera as a light sensor. The spectrophotometer was calibrated using a reference light prior to use. The spectrophotometer was capable of measuring optical wavelengths with a…
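The calibration against a reference light in such a DVD-grating spectrophotometer typically amounts to a linear pixel-to-wavelength fit. A minimal sketch; the mercury reference lines and pixel positions below are hypothetical, not taken from the paper:

```python
import numpy as np

# Hypothetical calibration: two known mercury lines (435.8 nm and 546.1 nm)
# observed at camera pixel columns 212 and 540.
ref_px = np.array([212.0, 540.0])
ref_nm = np.array([435.8, 546.1])
slope, offset = np.polyfit(ref_px, ref_nm, 1)   # linear dispersion fit

def pixel_to_wavelength(px):
    """Map a camera pixel column to wavelength (nm) via the linear fit."""
    return slope * px + offset
```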

  10. Digital holography using a digital photo-camera

    Czech Academy of Sciences Publication Activity Database

    Sekanina, H.; Pospíšil, Jaroslav

    2002-01-01

    Roč. 49, č. 13 (2002), s. 2083-2092 ISSN 0950-0340 Institutional research plan: CEZ:AV0Z1010921 Keywords : digital holography * photo-camera Subject RIV: BH - Optics, Masers, Lasers Impact factor: 1.717, year: 2002

  11. Cheap streak camera based on the LD-S-10 intensifier tube

    Science.gov (United States)

    Dashevsky, Boris E.; Krutik, Mikhail I.; Surovegin, Alexander L.

    1992-01-01

    Basic properties of a new streak camera and its test results are reported. To intensify images on its screen, we employed modular G1 tubes, the LD-A-1.0 and LD-A-0.33, enabling magnification of 1.0 and 0.33, respectively. If necessary, the LD-A-0.33 tube may be substituted by any other image intensifier of the LD-A series, the choice to be determined by the size of the CCD matrix with fiber-optical windows. The reported camera employs a 12.5-mm-long CCD strip consisting of 1024 pixels, each 12 × 500 micrometers in size. Registered radiation was imaged on a 5 × 0.04 mm slit diaphragm tightly connected with the LD-S-10 fiber-optical input window. Electrons escaping the cathode are accelerated in a 5 kV electric field and focused onto a phosphor screen covering a fiber-optical plate as they travel between deflection plates. Sensitivity of the latter was 18 V/mm, which implies that the total deflecting voltage was 720 V per 40 mm of the screen surface, since reversed-polarity scan pulses of +360 V and -360 V were applied across the deflection plates. The streak camera provides full scan times over the screen of 15, 30, 50, 100, 250, and 500 ns. Timing of the electrically or optically driven camera was done using a 10 ns step-controlled-delay (0–500 ns) circuit.

  12. Periscope-camera system for visible and infrared imaging diagnostics on TFTR

    International Nuclear Information System (INIS)

    Medley, S.S.; Dimock, D.L.; Hayes, S.; Long, D.; Lowrance, J.L.; Mastrocola, V.; Renda, G.; Ulrickson, M.; Young, K.M.

    1985-05-01

    An optical diagnostic consisting of a periscope which relays images of the torus interior to an array of cameras is used on the Tokamak Fusion Test Reactor (TFTR) to view plasma discharge phenomena and inspect vacuum vessel internal structures in both visible and near-infrared wavelength regions. Three periscopes view through 20-cm-diameter fused-silica windows which are spaced around the torus midplane to provide a viewing coverage of approximately 75% of the vacuum vessel internal surface area. The periscopes have f/8 optics and motor-driven controls for focusing, magnification selection (5°, 20°, and 60° field of view), elevation and azimuth setting, mast rotation, filter selection, iris aperture, and viewing port selection. The four viewing ports on each periscope are equipped with multiple imaging devices which include: (1) an inspection eyepiece, (2) standard (RCA TC2900) and fast (RETICON) framing rate television cameras, (3) a PtSi CCD infrared imaging camera, (4) a 35 mm Nikon F3 still camera, or (5) a 16 mm Locam II movie camera with variable framing rate up to 500 fps. Operation of the periscope-camera system is controlled either locally or remotely through a computer-CAMAC interface. A description of the equipment and examples of its application are presented.

  13. Periscope-camera system for visible and infrared imaging diagnostics on TFTR

    International Nuclear Information System (INIS)

    Medley, S.S.; Dimock, D.L.; Hayes, S.; Long, D.; Lowrance, J.L.; Mastrocola, V.; Renda, G.; Ulrickson, M.; Young, K.M.

    1985-01-01

    An optical diagnostic consisting of a periscope which relays images of the torus interior to an array of cameras is used on the Tokamak Fusion Test Reactor (TFTR) to view plasma discharge phenomena and inspect the vacuum vessel internal structures in both the visible and near-infrared wavelength regions. Three periscopes view through 20-cm-diam fused-silica windows which are spaced around the torus midplane to provide a viewing coverage of approximately 75% of the vacuum vessel internal surface area. The periscopes have f/8 optics and motor-driven controls for focusing, magnification selection (5°, 20°, and 60° field of view), elevation and azimuth setting, mast rotation, filter selection, iris aperture, and viewing port selection. The four viewing ports on each periscope are equipped with multiple imaging devices which include: (1) an inspection eyepiece, (2) standard (RCA TC2900) and fast (RETICON) framing rate television cameras, (3) a PtSi CCD infrared imaging camera, (4) a 35-mm Nikon F3 still camera, or (5) a 16-mm Locam II movie camera with variable framing rate up to 500 fps. Operation of the periscope-camera system is controlled either locally or remotely through a computer-CAMAC interface. A description of the equipment and examples of its application are presented.

  14. Camera Traps Can Be Heard and Seen by Animals

    Science.gov (United States)

    Meek, Paul D.; Ballard, Guy-Anthony; Fleming, Peter J. S.; Schaefer, Michael; Williams, Warwick; Falzon, Greg

    2014-01-01

    Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used are considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable so it is important to understand why the animals are disturbed. We conducted laboratory based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammals species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals’ hearing and produce illumination that can be seen by many species. PMID:25354356

  15. Camera traps can be heard and seen by animals.

    Directory of Open Access Journals (Sweden)

    Paul D Meek

    Full Text Available Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used are considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable so it is important to understand why the animals are disturbed. We conducted laboratory based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammals species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals' hearing and produce illumination that can be seen by many species.

  16. Ultra-fast framing camera tube

    Science.gov (United States)

    Kalibjian, Ralph

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  17. Dark Energy Camera for Blanco

    Energy Technology Data Exchange (ETDEWEB)

    Binder, Gary A.; /Caltech /SLAC

    2010-08-25

    In order to make accurate measurements of dark energy, a system is needed to monitor the focus and alignment of the Dark Energy Camera (DECam) to be located on the Blanco 4m Telescope for the upcoming Dark Energy Survey. One new approach under development is to fit out-of-focus star images to a point spread function from which information about the focus and tilt of the camera can be obtained. As a first test of a new algorithm using this idea, simulated star images produced from a model of DECam in the optics software Zemax were fitted. Then, real images from the Mosaic II imager currently installed on the Blanco telescope were used to investigate the algorithm's capabilities. A number of problems with the algorithm were found, and more work is needed to understand its limitations and improve its capabilities so it can reliably predict camera alignment and focus.
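One simple way a defocused star image can be quantified, loosely in the spirit of the fitting idea above (though not the paper's actual PSF-fitting algorithm), is via intensity-weighted second moments: a defocused star grows wider, so this width tracks the focus error. The synthetic Gaussian "stars" below are illustrative only.

```python
import numpy as np

def image_width(img):
    """RMS radius of a star image from intensity-weighted second moments."""
    img = np.asarray(img, dtype=float)
    ys, xs = np.indices(img.shape)
    total = img.sum()
    cx = (xs * img).sum() / total        # intensity-weighted centroid
    cy = (ys * img).sum() / total
    var = (((xs - cx)**2 + (ys - cy)**2) * img).sum() / total
    return np.sqrt(var)

def gaussian_star(sigma, size=31):
    """Synthetic circular Gaussian star centred on a size x size grid."""
    ys, xs = np.indices((size, size))
    r2 = (xs - size // 2)**2 + (ys - size // 2)**2
    return np.exp(-r2 / (2 * sigma**2))

w_focused = image_width(gaussian_star(1.0))    # near sqrt(2) px for sigma=1
w_defocused = image_width(gaussian_star(3.0))  # roughly 3x wider
```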

  18. Gamma camera with reflectivity mask

    International Nuclear Information System (INIS)

    Stout, K.J.

    1980-01-01

    In accordance with the present invention there is provided a radiographic camera comprising: a scintillator; a plurality of photodetectors positioned to face said scintillator; a plurality of masked regions formed upon a face of said scintillator opposite said photodetectors and positioned coaxially with respective ones of said photodetectors for decreasing the amount of internal reflection of optical photons generated within said scintillator. (auth)

  19. Temperature measurement with industrial color camera devices

    Science.gov (United States)

    Schmidradler, Dieter J.; Berndorfer, Thomas; van Dyck, Walter; Pretschuh, Juergen

    1999-05-01

    This paper discusses color camera based temperature measurement. Usually, visual imaging and infrared image sensing are treated as two separate disciplines. We will show that a well selected color camera device may be a cheaper, more robust and more sophisticated solution for optical temperature measurement in several cases. Herein, only implementation fragments and important restrictions for the sensing element will be discussed. Our aim is to draw the reader's attention to the use of visual image sensors for measuring thermal radiation and temperature, and to give reasons for the need for improved technologies for infrared camera devices. With AVL List, our industrial partner, we successfully used the proposed sensor to perform temperature measurement of flames inside the combustion chamber of diesel engines, which finally led to the presented insights.

  20. EDICAM (Event Detection Intelligent Camera)

    Energy Technology Data Exchange (ETDEWEB)

    Zoletnik, S. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Szabolics, T., E-mail: szabolics.tamas@wigner.mta.hu [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Kocsis, G.; Szepesi, T.; Dunai, D. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary)

    2013-10-15

    Highlights: ► We present EDICAM's hardware modules. ► We present EDICAM's main design concepts. ► This paper will describe EDICAM firmware architecture. ► Operation principles description. ► Further developments. -- Abstract: A new type of fast framing camera has been developed for fusion applications by the Wigner Research Centre for Physics during the last few years. A new concept was designed for intelligent event driven imaging which is capable of focusing image readout to Regions of Interests (ROIs) where and when predefined events occur. At present these events mean intensity changes and external triggers but in the future more sophisticated methods might also be defined. The camera provides 444 Hz frame rate at full resolution of 1280 × 1024 pixels, but monitoring of smaller ROIs can be done in the 1–116 kHz range even during exposure of the full image. Keeping space limitations and the harsh environment in mind the camera is divided into a small Sensor Module and a processing card interconnected by a fast 10 Gbit optical link. This camera hardware has been used for passive monitoring of the plasma in different devices for example at ASDEX Upgrade and COMPASS with the first version of its firmware. The new firmware and software package is now available and ready for testing the new event processing features. This paper will present the operation principle and features of the Event Detection Intelligent Camera (EDICAM). The device is intended to be the central element in the 10-camera monitoring system of the Wendelstein 7-X stellarator.
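The intensity-change event trigger on a region of interest can be sketched as a frame-difference threshold. This is a deliberate simplification of EDICAM's actual firmware logic; the frames, ROI bounds and threshold below are hypothetical.

```python
import numpy as np

def roi_event(prev_frame, frame, roi, threshold):
    """Flag an intensity-change event in a region of interest.
    roi = (y0, y1, x0, x1); the event fires when the mean absolute change
    inside the ROI exceeds the threshold."""
    y0, y1, x0, x1 = roi
    diff = np.abs(frame[y0:y1, x0:x1].astype(float) -
                  prev_frame[y0:y1, x0:x1].astype(float))
    return bool(diff.mean() > threshold)

prev = np.zeros((8, 8))
cur = np.zeros((8, 8))
cur[2:4, 2:4] = 50.0                                    # bright flash in the ROI
fired = roi_event(prev, cur, (2, 4, 2, 4), threshold=10.0)   # event fires
```

In the real camera such a trigger would then focus the readout on the ROI at kHz rates while the full frame continues exposing.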

  1. Multi-MGy Radiation Hardened Camera for Nuclear Facilities

    International Nuclear Information System (INIS)

    Girard, Sylvain; Boukenter, Aziz; Ouerdane, Youcef; Goiffon, Vincent; Corbiere, Franck; Rolando, Sebastien; Molina, Romain; Estribeau, Magali; Avon, Barbara; Magnan, Pierre; Paillet, Philippe; Duhamel, Olivier; Gaillardin, Marc; Raine, Melanie

    2015-01-01

    electronics will be exposed. Another important element of the camera is the optical system that transports the image from the scene to the image sensor. This arrangement of glass-based lenses is affected by radiation through two mechanisms: radiation-induced absorption and radiation-induced refractive index changes. The first limits the signal-to-noise ratio of the image, whereas the second directly affects the resolution of the camera. We'll present at the conference a coupled simulation/experiment study of these effects for various commercial glasses and present a vulnerability study of typical optical systems at MGy doses. The last very important part of the camera is the illumination system, which can be based on various emitting-device technologies such as LEDs, SLEDs or lasers. The most promising solutions for high radiation doses will be presented at the conference. In addition to this hardening-by-component approach, the global radiation tolerance of the camera can be drastically improved by working at the system level, combining innovative approaches, e.g. for the optical and illumination systems. We'll present at the conference the developed approach allowing the camera lifetime to be extended up to the MGy dose range. (authors)

  2. Multi-MGy Radiation Hardened Camera for Nuclear Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Girard, Sylvain; Boukenter, Aziz; Ouerdane, Youcef [Universite de Saint-Etienne, Lab. Hubert Curien, UMR-CNRS 5516, F-42000 Saint-Etienne (France); Goiffon, Vincent; Corbiere, Franck; Rolando, Sebastien; Molina, Romain; Estribeau, Magali; Avon, Barbara; Magnan, Pierre [ISAE, Universite de Toulouse, F-31055 Toulouse (France); Paillet, Philippe; Duhamel, Olivier; Gaillardin, Marc; Raine, Melanie [CEA, DAM, DIF, F-91297 Arpajon (France)

    2015-07-01

    electronics will be exposed. Another important element of the camera is the optical system that transports the image from the scene to the image sensor. This arrangement of glass-based lenses is affected by radiation through two mechanisms: radiation-induced absorption and radiation-induced refractive index changes. The first limits the signal-to-noise ratio of the image, whereas the second directly affects the resolution of the camera. We'll present at the conference a coupled simulation/experiment study of these effects for various commercial glasses and present a vulnerability study of typical optical systems at MGy doses. The last very important part of the camera is the illumination system, which can be based on various emitting-device technologies such as LEDs, SLEDs or lasers. The most promising solutions for high radiation doses will be presented at the conference. In addition to this hardening-by-component approach, the global radiation tolerance of the camera can be drastically improved by working at the system level, combining innovative approaches, e.g. for the optical and illumination systems. We'll present at the conference the developed approach allowing the camera lifetime to be extended up to the MGy dose range. (authors)

  3. Opto-mechanical design of the G-CLEF flexure control camera system

    Science.gov (United States)

    Oh, Jae Sok; Park, Chan; Kim, Jihun; Kim, Kang-Min; Chun, Moo-Young; Yu, Young Sam; Lee, Sungho; Nah, Jakyoung; Park, Sung-Joon; Szentgyorgyi, Andrew; McMuldroch, Stuart; Norton, Timothy; Podgorski, William; Evans, Ian; Mueller, Mark; Uomoto, Alan; Crane, Jeffrey; Hare, Tyson

    2016-08-01

    The GMT-Consortium Large Earth Finder (G-CLEF) is the first light instrument of the Giant Magellan Telescope (GMT). The G-CLEF is a fiber-fed, optical-band echelle spectrograph that is capable of extremely precise radial velocity measurement. KASI (Korea Astronomy and Space Science Institute) is responsible for the Flexure Control Camera (FCC) included in the G-CLEF Front End Assembly (GCFEA). The FCC is a kind of guide camera, which monitors the field images focused on a fiber mirror to control the flexure and the focus errors within the GCFEA. The FCC consists of five optical components: a collimator including triple lenses for producing a pupil, neutral density filters allowing use of a much brighter star as a target or a guide, a tent prism as a focus analyzer for measuring the focus offset at the fiber mirror, a reimaging camera with three pairs of lenses for focusing the beam on a CCD focal plane, and a CCD detector for capturing the image on the fiber mirror. In this article, we present the optical and mechanical FCC designs, which have been modified after the PDR in April 2015.

  4. Calibration of Low Cost RGB and NIR Uav Cameras

    Science.gov (United States)

    Fryskowska, A.; Kedzierski, M.; Grochala, A.; Braula, A.

    2016-06-01

    Non-metric digital cameras are widely used in photogrammetric studies. The increase in the resolution and quality of images obtained by non-metric cameras allows their use in low-cost UAV and terrestrial photogrammetry. Imagery acquired with non-metric cameras can be used for 3D modeling of objects or landscapes, reconstruction of historical sites, generation of digital elevation models (DTM) and orthophotos, or in the assessment of accidents. Non-metric digital cameras are characterized by instability and unknown interior orientation parameters; therefore, the use of these devices requires prior calibration. The calibration research was conducted using a non-metric camera, different calibration tests and various software packages. The first part of the paper contains a brief theoretical introduction, including basic definitions such as the construction of non-metric cameras and a description of different optical distortions. The second part of the paper covers the camera calibration process and details of the calibration methods and models that were used. Sony NEX-5 camera calibration was done using the following software: Image Master Calib, the Matlab Camera Calibrator application and Agisoft Lens. 2D test fields were used for the study. As part of the research, a comparative analysis of the results was performed.
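A core piece of such interior-orientation calibration is the radial distortion model; estimating its coefficients (together with focal length and principal point) is what the tools named above do. A minimal numpy sketch of the Brown-Conrady radial term with hypothetical coefficients, for illustration only:

```python
import numpy as np

def apply_radial_distortion(xy, k1, k2):
    """Brown-Conrady radial term: x_d = x * (1 + k1*r^2 + k2*r^4),
    with (x, y) in normalized image coordinates centred on the principal
    point. Calibration estimates k1, k2 so the distortion can be undone."""
    xy = np.asarray(xy, dtype=float)
    r2 = (xy**2).sum(axis=-1, keepdims=True)   # squared radius per point
    return xy * (1 + k1 * r2 + k2 * r2**2)

# Hypothetical barrel-distortion coefficients
pts = np.array([[0.0, 0.0], [0.5, 0.0], [0.5, 0.5]])
distorted = apply_radial_distortion(pts, k1=-0.2, k2=0.05)
```

Points at the principal point are unmoved; off-axis points are pulled inward for negative k1, which is the classic barrel distortion of wide-angle consumer lenses.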

  5. Liquid lens: advances in adaptive optics

    Science.gov (United States)

    Casey, Shawn Patrick

    2010-12-01

    'Liquid lens' technologies promise significant advances in machine vision and optical communications systems. Adaptations for machine vision, human vision correction, and optical communications exemplify the versatile nature of this technology. Liquid lens elements allow the cost-effective implementation of optical velocity measurement. The project consists of a custom image processor, camera, and interface. The images are passed to customized pattern recognition and optical character recognition algorithms. A single camera is used for both speed detection and object recognition.

  6. NSTX Tangential Divertor Camera

    International Nuclear Information System (INIS)

    Roquemore, A.L.; Ted Biewer; Johnson, D.; Zweben, S.J.; Nobuhiro Nishino; Soukhanovskii, V.A.

    2004-01-01

    Strong magnetic field shear around the divertor x-point is numerically predicted to lead to strong spatial asymmetries in turbulence-driven particle fluxes. To visualize the turbulence and the associated impurity line emission near the lower x-point region, a new tangential observation port has recently been installed on NSTX. A reentrant sapphire window with a movable in-vessel mirror images the divertor region from the center stack out to R ≈ 80 cm and views the x-point for most plasma configurations. A coherent fiber-optic bundle transmits the image through a remotely selected filter to a fast camera, for example a 40,500 frames/sec Photron CCD camera. A gas puffer located in the lower inboard divertor will localize the turbulence in the region near the x-point. The edge fluid and turbulence codes UEDGE and BOUT will be used to interpret impurity and deuterium emission fluctuation measurements in the divertor.

  7. Optical methods for the optimization of system SWaP-C using aspheric components and advanced optical polymers

    Science.gov (United States)

    Zelazny, Amy; Benson, Robert; Deegan, John; Walsh, Ken; Schmidt, W. David; Howe, Russell

    2013-06-01

    We describe the benefits to camera system SWaP-C associated with the use of aspheric molded glasses and optical polymers in the design and manufacture of optical components and elements. Both camera objectives and display eyepieces, typical for night vision man-portable EO/IR systems, are explored. We discuss optical trade-offs, system performance, and cost reductions associated with this approach in both visible and non-visible wavebands, specifically NIR and LWIR. Example optical models are presented, studied, and traded using this approach.

  8. Low-cost mobile phone microscopy with a reversed mobile phone camera lens.

    Directory of Open Access Journals (Sweden)

    Neil A Switz

    Full Text Available The increasing capabilities and ubiquity of mobile phones and their associated digital cameras offer the possibility of extending low-cost, portable diagnostic microscopy to underserved and low-resource areas. However, mobile phone microscopes created by adding magnifying optics to the phone's camera module have been unable to make use of the full image sensor due to the specialized design of the embedded camera lens, exacerbating the tradeoff between resolution and field of view inherent to optical systems. This tradeoff is acutely felt for diagnostic applications, where the speed and cost of image-based diagnosis is related to the area of the sample that can be viewed at sufficient resolution. Here we present a simple and low-cost approach to mobile phone microscopy that uses a reversed mobile phone camera lens added to an intact mobile phone to enable high quality imaging over a significantly larger field of view than standard microscopy. We demonstrate use of the reversed lens mobile phone microscope to identify red and white blood cells in blood smears and soil-transmitted helminth eggs in stool samples.

  9. Low-cost mobile phone microscopy with a reversed mobile phone camera lens.

    Science.gov (United States)

    Switz, Neil A; D'Ambrosio, Michael V; Fletcher, Daniel A

    2014-01-01

    The increasing capabilities and ubiquity of mobile phones and their associated digital cameras offer the possibility of extending low-cost, portable diagnostic microscopy to underserved and low-resource areas. However, mobile phone microscopes created by adding magnifying optics to the phone's camera module have been unable to make use of the full image sensor due to the specialized design of the embedded camera lens, exacerbating the tradeoff between resolution and field of view inherent to optical systems. This tradeoff is acutely felt for diagnostic applications, where the speed and cost of image-based diagnosis is related to the area of the sample that can be viewed at sufficient resolution. Here we present a simple and low-cost approach to mobile phone microscopy that uses a reversed mobile phone camera lens added to an intact mobile phone to enable high quality imaging over a significantly larger field of view than standard microscopy. We demonstrate use of the reversed lens mobile phone microscope to identify red and white blood cells in blood smears and soil-transmitted helminth eggs in stool samples.

  10. Time-resolved laser-excited Shpol'skii spectrometry with a fiber-optic probe and ICCD camera

    International Nuclear Information System (INIS)

    Bystol, Adam J.; Campiglia, Andres D.; Gillispie, Gregory D.

    2000-01-01

    Improved methodology for chemical analysis via laser-excited Shpol'skii spectrometry is reported. The complications of traditional methodology for measurements at liquid nitrogen temperature are avoided by freezing the distal end of a bifurcated fiber-optic probe directly into the sample matrix. Emission wavelength-time matrices were rapidly collected by automatically incrementing the gate delay of an intensified charge-coupled device (ICCD) camera relative to the laser excitation pulse. The excitation source is a compact frequency-doubled tunable dye laser whose bandwidth (<0.03 nm) is well matched for Shpol'skii spectroscopy. Data reproducibility for quantitative analysis purposes and analytical figures of merit are demonstrated for several polycyclic aromatic hydrocarbons at 77 K. Although not attempted in this study, time-resolved excitation-emission matrices could easily be collected with this instrumental system. (c) 2000 Society for Applied Spectroscopy
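
    The wavelength-time matrix acquisition amounts to stepping the ICCD gate delay after successive laser pulses and stacking the gated spectra. The toy model below reproduces that structure for a single exponentially decaying Gaussian emission band; all numbers (band position, width, lifetime, delay step) are illustrative assumptions, not values from the paper.

```python
import numpy as np

wavelengths = np.linspace(380.0, 420.0, 81)   # emission axis, nm
gate_delays = np.arange(0.0, 200.0, 10.0)     # gate delay steps, ns
tau = 45.0                                    # assumed luminescence lifetime, ns

# Gaussian emission band centered at 400 nm with 2 nm standard deviation.
spectrum = np.exp(-0.5 * ((wavelengths - 400.0) / 2.0) ** 2)

# Each row is the gated spectrum at one delay; intensity decays
# exponentially with the gate delay.
matrix = np.outer(np.exp(-gate_delays / tau), spectrum)
```

    Fitting the decay along the delay axis of such a matrix yields the lifetime, which is what enables time-resolved discrimination of spectrally overlapping analytes.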

  11. Illumination technique for the relative calibration of the ASTRI SST-2M camera

    Energy Technology Data Exchange (ETDEWEB)

    Rodeghiero, Gabriele, E-mail: gabriele.rodeghiero@studenti.unipd.it [Department of Physics and Astronomy, University of Padova, Vicolo dell' Osservatorio 5, 35100 PD (Italy); Catalano, Osvaldo; Segreto, Alberto [INAF IASF Palermo, Via Ugo La Malfa 153, 90146 PA (Italy); De Caprio, Vincenzo [INAF OACN, Salita Moiariello, 16, 80131 Napoli, NA (Italy); Giro, Enrico; Lessio, Luigi [INAF OAPD, Vicolo dell' Osservatorio 5, 35100 PD (Italy); Conconi, Paolo; Canestrari, Rodolfo [INAF OAB, Via E. Bianchi 46, 23807 Merate, LC (Italy)

    2014-11-11

    We present a new illumination technique for the relative gain calibration of the camera of the ASTRI SST-2M Cherenkov telescope. The camera illumination is achieved by means of an optical fiber that diffuses the light inside a protective PMMA window above the focal plane. We report the encouraging results of development tests carried out on two PMMA window prototypes illuminated by a standard optical fiber. We also checked the reliability of the method with a series of ray-tracing simulations for different scattering models and PMMA window shapes, finding good agreement with the experimental results.

  12. Illumination technique for the relative calibration of the ASTRI SST-2M camera

    International Nuclear Information System (INIS)

    Rodeghiero, Gabriele; Catalano, Osvaldo; Segreto, Alberto; De Caprio, Vincenzo; Giro, Enrico; Lessio, Luigi; Conconi, Paolo; Canestrari, Rodolfo

    2014-01-01

    We present a new illumination technique for the relative gain calibration of the camera of the ASTRI SST-2M Cherenkov telescope. The camera illumination is achieved by means of an optical fiber that diffuses the light inside a protective PMMA window above the focal plane. We report the encouraging results of development tests carried out on two PMMA window prototypes illuminated by a standard optical fiber. We also checked the reliability of the method with a series of ray-tracing simulations for different scattering models and PMMA window shapes, finding good agreement with the experimental results.

  13. A detailed comparison of single-camera light-field PIV and tomographic PIV

    Science.gov (United States)

    Shi, Shengxian; Ding, Junfei; Atkinson, Callum; Soria, Julio; New, T. H.

    2018-03-01

    This paper presents a comprehensive comparison between single-camera light-field particle image velocimetry (LF-PIV) and multi-camera tomographic particle image velocimetry (Tomo-PIV). Simulation studies were first performed using synthetic light-field and tomographic particle images, extensively examining the differences between the two techniques by varying key parameters such as the pixel to microlens ratio (PMR), the light-field to tomographic camera pixel ratio (LTPR), the particle seeding density and the number of tomographic cameras. The simulation results indicate that single-camera LF-PIV can achieve accuracy consistent with that of multi-camera Tomo-PIV, but requires a greater overall number of pixels. Experimental studies were then conducted by simultaneously measuring a low-speed jet flow with a single-camera LF-PIV system and a four-camera Tomo-PIV system. The experiments confirm that, given a sufficiently high pixel resolution, a single-camera LF-PIV system can indeed deliver volumetric velocity field measurements for an equivalent field of view with a spatial resolution commensurate with that of a multi-camera Tomo-PIV system, enabling accurate 3D measurements in applications where optical access is limited.

  14. The opto-cryo-mechanical design of the short wavelength camera for the CCAT Observatory

    Science.gov (United States)

    Parshley, Stephen C.; Adams, Joseph; Nikola, Thomas; Stacey, Gordon J.

    2014-07-01

    The CCAT observatory is a 25-m class Gregorian telescope designed for submillimeter observations that will be deployed at Cerro Chajnantor (~5600 m) in the high Atacama Desert region of Chile. The Short Wavelength Camera (SWCam) for CCAT is an integral part of the observatory, enabling the study of star formation at high and low redshifts. SWCam will be a facility instrument, available at first light and operating in the telluric windows at wavelengths of 350, 450, and 850 μm. In order to trace the large curvature of the CCAT focal plane, and to suit the available instrument space, SWCam is divided into seven sub-cameras, each configured to a particular telluric window. A fully refractive optical design in each sub-camera will produce diffraction-limited images. The material of choice for the optical elements is silicon, due to its excellent transmission in the submillimeter and its high index of refraction, enabling thin lenses of a given power. The cryostat's vacuum windows double as the sub-cameras' field lenses and are ~30 cm in diameter. The other lenses are mounted at 4 K. The sub-cameras will share a single cryostat providing thermal intercepts at 80, 15, 4, 1 and 0.1 K, with cooling provided by pulse tube cryocoolers and a dilution refrigerator. The use of the intermediate temperature stage at 15 K minimizes the load at 4 K and reduces operating costs. We discuss our design requirements, specifications, key elements and expected performance of the optical, thermal and mechanical design for the short wavelength camera for CCAT.

  15. Physical optics

    International Nuclear Information System (INIS)

    Kim Il Gon; Lee, Seong Su; Jang, Gi Wan

    2012-07-01

    This book covers physical optics: the properties and transmission of light; the mathematical description of waves, including harmonic and cylindrical waves; electromagnetic theory and light; the propagation of light via Fermat's principle and the Fresnel equations; geometrical optics I and II; optical instruments such as stops, eyeglasses and cameras; polarized light, including double refraction; interference, including interference by multiple reflections; diffraction; solid optics; crystal optics, including Faraday rotation and the Kerr effect; and the measurement of light. Each chapter has exercises.

  16. Physical optics

    Energy Technology Data Exchange (ETDEWEB)

    Kim Il Gon; Lee, Seong Su; Jang, Gi Wan

    2012-07-15

    This book covers physical optics: the properties and transmission of light; the mathematical description of waves, including harmonic and cylindrical waves; electromagnetic theory and light; the propagation of light via Fermat's principle and the Fresnel equations; geometrical optics I and II; optical instruments such as stops, eyeglasses and cameras; polarized light, including double refraction; interference, including interference by multiple reflections; diffraction; solid optics; crystal optics, including Faraday rotation and the Kerr effect; and the measurement of light. Each chapter has exercises.

  17. Space telescope phase B definition study. Volume 2A: Science instruments, f48/96 planetary camera

    Science.gov (United States)

    Grosso, R. P.; Mccarthy, D. J.

    1976-01-01

    The analysis and preliminary design of the f48/96 planetary camera for the space telescope are discussed. The camera design is for application to the axial module position of the optical telescope assembly.

  18. Influence of Digital Camera Errors on the Photogrammetric Image Processing

    Science.gov (United States)

    Sužiedelytė-Visockienė, Jūratė; Bručas, Domantas

    2009-01-01

    The paper deals with the calibration of the Canon EOS 350D digital camera, often used for photogrammetric 3D digitalisation and measurement of industrial and construction-site objects. During calibration, data on the optical and electronic parameters influencing image distortion were obtained, including the correction of the principal point, the focal length of the objective, and the radially symmetric and non-symmetric distortions. The calibration was performed with the Tcc software, which implements Chebyshev polynomials, using a special test field with marks whose coordinates are precisely known. The main task of the research was to determine how the camera calibration parameters influence the processing of images, i.e. the creation of the geometric model, the results of triangulation calculations and stereo-digitalisation. Two photogrammetric projects were created for this task: the first used uncorrected images, the second images corrected for the optical errors of the camera determined during calibration. The results of the image-processing analysis are shown in figures and tables, and conclusions are given.
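
    The Chebyshev-polynomial corrections applied by the Tcc software can be illustrated with the standard three-term recurrence. This is a generic sketch: the record does not describe the actual correction surfaces Tcc fits, so the final comment only indicates the typical form of such a model.

```python
def chebyshev_t(n, x):
    """Evaluate the Chebyshev polynomial T_n(x) on [-1, 1] via the
    three-term recurrence T_0 = 1, T_1 = x, T_n = 2x*T_{n-1} - T_{n-2}."""
    t_prev, t_curr = 1.0, x
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2.0 * x * t_curr - t_prev
    return t_curr

# A distortion correction is then typically modeled as a low-order sum,
# e.g. dx(x) = sum_i a_i * T_i(x), with the coefficients a_i estimated
# from the precisely known test-field marks.
```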

  19. Researches on hazard avoidance cameras calibration of Lunar Rover

    Science.gov (United States)

    Li, Chunyan; Wang, Li; Lu, Xin; Chen, Jihua; Fan, Shenghong

    2017-11-01

    Lunar Lander and Rover of China will be launched in 2013. It will accomplish the mission targets of lunar soft landing and patrol exploration. The Lunar Rover has a forward-facing stereo camera pair (Hazcams) for hazard avoidance. Hazcam calibration is essential for stereo vision. The Hazcam optics are f-theta fish-eye lenses with a 120°×120° horizontal/vertical field of view (FOV) and a 170° diagonal FOV. They introduce significant distortion, and the acquired images are quite warped, which renders conventional camera calibration algorithms ineffective. A photogrammetric calibration method for the geometric model of this type of fish-eye optical construction is investigated in this paper. In the method, the Hazcam model is represented by collinearity equations with interior orientation and exterior orientation parameters [1] [2]. For high-precision applications, the accurate calibration model is formulated with the radially symmetric distortion and the decentering distortion, as well as parameters modeling affinity and shear, based on the fish-eye deformation model [3] [4]. The proposed method has been applied to the stereo camera calibration system for the Lunar Rover.
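
    The f-theta (equidistant) fish-eye mapping mentioned above is what keeps a 170° diagonal FOV on a finite sensor: image height grows linearly with the field angle rather than with its tangent. A small comparison sketch (the focal length is an arbitrary assumption, not a Hazcam parameter):

```python
import math

def ftheta_radius(theta_deg, f_mm):
    # Equidistant (f-theta) fish-eye projection: r = f * theta (theta in rad).
    return f_mm * math.radians(theta_deg)

def rectilinear_radius(theta_deg, f_mm):
    # Ordinary perspective projection for comparison: r = f * tan(theta).
    return f_mm * math.tan(math.radians(theta_deg))

f = 4.0  # assumed focal length in mm
# At 85 deg (half of the 170 deg diagonal FOV) the perspective image
# height has grown enormous, while the f-theta height stays modest.
print(ftheta_radius(85.0, f), rectilinear_radius(85.0, f))
```

    The price of the compact mapping is the strong barrel distortion that the calibration model in the paper is built to handle.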

  20. Automatically assessing properties of dynamic cameras for camera selection and rapid deployment of video content analysis tasks in large-scale ad-hoc networks

    Science.gov (United States)

    den Hollander, Richard J. M.; Bouma, Henri; van Rest, Jeroen H. C.; ten Hove, Johan-Martijn; ter Haar, Frank B.; Burghouts, Gertjan J.

    2017-10-01

    Video analytics is essential for managing the large quantities of raw data produced by video surveillance systems (VSS) for the prevention, repression and investigation of crime and terrorism. Analytics is highly sensitive to changes in the scene and to changes in the optical chain, so a VSS with analytics needs careful configuration and prompt maintenance to avoid false alarms. However, there is a trend away from static VSS consisting of fixed CCTV cameras towards more dynamic VSS deployments over public/private multi-organization networks, consisting of a wider variety of visual sensors, including pan-tilt-zoom (PTZ) cameras, body-worn cameras and cameras on moving platforms. This trend will lead to more dynamic scenes and more frequent changes in the optical chain, creating structural problems for analytics. If these problems are not adequately addressed, analytics will not be able to continue to meet end users' evolving needs. In this paper, we present a three-part solution for managing the performance of complex analytics deployments. The first part is a register containing metadata describing relevant properties of the optical chain, such as intrinsic and extrinsic calibration, and parameters of the scene, such as lighting conditions or measures of scene complexity (e.g. number of people). The second part frequently assesses these parameters in the deployed VSS, stores changes in the register, and signals relevant changes in the setup to the VSS administrator. The third part uses the information in the register to dynamically configure analytics tasks based on VSS operator input. To support the feasibility of this solution, we give an overview of related state-of-the-art technologies for auto-calibration (self-calibration), scene recognition and lighting estimation in relation to person detection. The presented solution allows for rapid and robust deployment of Video Content Analysis (VCA) tasks in large-scale ad-hoc networks.

  1. Linking optical and infrared observations with gravitational wave sources through transient variability

    International Nuclear Information System (INIS)

    Stubbs, C W

    2008-01-01

    Optical and infrared observations have thus far detected more celestial cataclysms than have been seen in gravitational waves (GW). This argues that we should search for gravitational wave signatures that correspond to transient variables seen at optical wavelengths, at precisely known positions. There is an unknown time delay between the optical and gravitational transient, but knowing the source location precisely specifies the corresponding time delays across the gravitational antenna network as a function of the GW-to-optical arrival time difference. Optical searches should detect virtually all supernovae that are plausible gravitational radiation sources. The transient optical signature expected from merging compact objects is not as well understood, but there are good reasons to expect detectable transient optical/IR emission from most of these sources as well. The next generation of deep wide-field surveys (for example PanSTARRS and LSST) will be sensitive to subtle optical variability, but we need to fill the 'blind spots' that exist in the galactic plane and for optically bright transient sources. In particular, a galactic plane variability survey at λ ∼ 2 μm seems worthwhile. Science would benefit from closer coordination between the various optical survey projects and the gravitational wave community.

  2. Afocal viewport optics for underwater imaging

    Science.gov (United States)

    Slater, Dan

    2014-09-01

    A conventional camera can be adapted for underwater use by enclosing it in a sealed waterproof pressure housing with a viewport. The viewport, as an optical interface between water and air, must accommodate both the camera and the optical characteristics of water while also providing a high-pressure water seal. Limited hydrospace visibility drives a need for wide-angle viewports. Practical optical interfaces between seawater and air vary from simple flat-plate windows to complex water-contact lenses. This paper first provides a brief overview of the physical and optical properties of the ocean environment along with suitable optical materials. This is followed by a discussion of the characteristics of various afocal underwater viewport types, including flat windows, domes and the Ivanoff corrector lens, a derivative of a Galilean wide-angle camera adapter. Several new and interesting optical designs derived from the Ivanoff corrector lens are presented, including a pair of very compact afocal viewport lenses that are compatible with both in-water and in-air environments, and an afocal underwater hyper-hemispherical fisheye lens.
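
    The need for wide-angle viewports can be quantified with Snell's law at a flat port: refraction at the interface compresses the camera's field of view in water by the refractive index of seawater, n ≈ 1.33. The sketch below is a simplification that ignores the finite window thickness and treats the glass as a thin intermediate layer.

```python
import math

N_WATER = 1.33  # approximate refractive index of seawater

def in_water_half_angle(in_air_half_angle_deg, n=N_WATER):
    """Half field of view behind a flat port, from Snell's law:
    sin(theta_air) = n * sin(theta_water)."""
    s = math.sin(math.radians(in_air_half_angle_deg)) / n
    return math.degrees(math.asin(s))
```

    Even a lens with a 180° in-air field of view sees at most about 97° through a flat port (twice the critical angle, asin(1/1.33) ≈ 48.8°), which is why dome ports and corrector lenses such as the Ivanoff design matter for wide-angle underwater imaging.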

  3. New light field camera based on physical based rendering tracing

    Science.gov (United States)

    Chung, Ming-Han; Chang, Shan-Ching; Lee, Chih-Kung

    2014-03-01

    Even though light field technology was first invented more than 50 years ago, it did not gain popularity because of the limitations of the computing technology of the time. With the rapid advancement of computer technology over the last decade, this limitation has been lifted and light field technology has quickly returned to the research spotlight. In this paper, PBRT (Physical Based Rendering Tracing) was introduced to overcome the limitations of the traditional optical simulation approach to studying light field camera technology. More specifically, the traditional optical simulation approach can present light energy distributions but typically lacks the capability to render realistic scenes. By using PBRT, which was developed to create virtual scenes, 4D light field information was obtained for initial data analysis and calculation. This PBRT approach was also used to explore the potential of light field data calculation for creating realistic photos. Furthermore, we integrated optical experimental measurement results with PBRT in order to place the real measurements into the virtually created scenes. In other words, our approach established a link between the virtual scene and the real measurement results. Several images developed with the above-mentioned approaches were analyzed and discussed to verify the pros and cons of the newly developed PBRT-based light field camera technology. It is shown that this light field camera approach can circumvent the loss of spatial resolution associated with placing a micro-lens array in front of the image sensor. The operational constraints, performance metrics, computational resources needed, etc., associated with this newly developed light field camera technique are discussed in detail.

  4. The first GCT camera for the Cherenkov Telescope Array

    CERN Document Server

    De Franco, A.; Allan, D.; Armstrong, T.; Ashton, T.; Balzer, A.; Berge, D.; Bose, R.; Brown, A.M.; Buckley, J.; Chadwick, P.M.; Cooke, P.; Cotter, G.; Daniel, M.K.; Funk, S.; Greenshaw, T.; Hinton, J.; Kraus, M.; Lapington, J.; Molyneux, P.; Moore, P.; Nolan, S.; Okumura, A.; Ross, D.; Rulten, C.; Schmoll, J.; Schoorlemmer, H.; Stephan, M.; Sutcliffe, P.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Varner, G.; Watson, J.; Zink, A.

    2015-01-01

    The Gamma Cherenkov Telescope (GCT) is proposed to be part of the Small Size Telescope (SST) array of the Cherenkov Telescope Array (CTA). The GCT dual-mirror optical design allows the use of a compact camera roughly 0.4 m in diameter. The curved focal plane is equipped with 2048 pixels of ~0.2° angular size, resulting in a field of view of ~9°. The GCT camera is designed to record the flashes of Cherenkov light from electromagnetic cascades, which last only a few tens of nanoseconds. Modules based on custom ASICs provide the required fast electronics, facilitating sampling and digitisation as well as the first level of triggering. The first GCT camera prototype is currently being commissioned in the UK. On-telescope tests are planned for later this year. Here we give a detailed description of the camera prototype and present recent progress with testing and commissioning.
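
    The quoted numbers are self-consistent: 2048 pixels of ~0.2° each, arranged in a roughly square grid, span about 9° on a side. A back-of-envelope check (ignoring the focal-plane curvature and the non-square pixel layout of the real camera):

```python
import math

n_pixels = 2048
pixel_deg = 0.2  # approximate angular pixel size, degrees

# Treat the focal plane as an approximately square grid of pixels.
side_pixels = math.sqrt(n_pixels)   # about 45 pixels on a side
fov_deg = side_pixels * pixel_deg   # about 9 degrees across
print(round(fov_deg, 1))  # → 9.1
```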

  5. Optomechanical System Development of the AWARE Gigapixel Scale Camera

    Science.gov (United States)

    Son, Hui S.

    Electronic focal plane arrays (FPA) such as CMOS and CCD sensors have dramatically improved to the point that digital cameras have essentially phased out film (except in very niche applications such as hobby photography and cinema). However, the traditional method of mating a single lens assembly to a single detector plane, as required for film cameras, is still the dominant design used in cameras today. The use of electronic sensors and their ability to capture digital signals that can be processed and manipulated post acquisition offers much more freedom of design at system levels and opens up many interesting possibilities for the next generation of computational imaging systems. The AWARE gigapixel scale camera is one such computational imaging system. By utilizing a multiscale optical design, in which a large aperture objective lens is mated with an array of smaller, well corrected relay lenses, we are able to build an optically simple system that is capable of capturing gigapixel scale images via post acquisition stitching of the individual pictures from the array. Properly shaping the array of digital cameras allows us to form an effectively continuous focal surface using off the shelf (OTS) flat sensor technology. This dissertation details developments and physical implementations of the AWARE system architecture. It illustrates the optomechanical design principles and system integration strategies we have developed through the course of the project by summarizing the results of the two design phases for AWARE: AWARE-2 and AWARE-10. These systems represent significant advancements in the pursuit of scalable, commercially viable snapshot gigapixel imaging systems and should serve as a foundation for future development of such systems.

  6. Low-cost uncooled VOx infrared camera development

    Science.gov (United States)

    Li, Chuan; Han, C. J.; Skidmore, George D.; Cook, Grady; Kubala, Kenny; Bates, Robert; Temple, Dorota; Lannon, John; Hilton, Allan; Glukh, Konstantin; Hardy, Busbee

    2013-06-01

    The DRS Tamarisk® 320 camera, introduced in 2011, is a low-cost commercial camera based on 17 µm pixel pitch 320×240 VOx microbolometer technology. A higher-resolution 17 µm pixel pitch 640×480 Tamarisk® 640 has also been developed and is now in production, serving the commercial markets. Recently, under the DARPA-sponsored Low Cost Thermal Imager-Manufacturing (LCTI-M) program and an internal project, DRS is leading a team of industrial experts from FiveFocal, RTI International and MEMSCAP to develop a small-form-factor uncooled infrared camera for the military and commercial markets. The objective of the DARPA LCTI-M program is to develop a low-SWaP camera that costs less than US $500 at a production rate of 10,000 units per month. To meet this challenge, DRS is developing several innovative technologies, including a small-pixel-pitch 640×512 VOx uncooled detector, an advanced digital ROIC and low-power miniature camera electronics. In addition, DRS and its partners are developing innovative manufacturing processes to reduce production cycle time and costs, including wafer-scale optics and vacuum-packaging manufacturing and a 3-dimensional integrated camera assembly. This paper provides an overview of the DRS Tamarisk® project and LCTI-M-related uncooled technology development activities. Highlights of recent progress and challenges will also be discussed. It should be noted that BAE Systems and Raytheon Vision Systems are also participants in the DARPA LCTI-M program.

  7. Displacement and deformation measurement for large structures by camera network

    Science.gov (United States)

    Shang, Yang; Yu, Qifeng; Yang, Zhen; Xu, Zhiqiang; Zhang, Xiaohu

    2014-03-01

    A displacement and deformation measurement method for large structures based on a series-parallel camera network is presented. Taking the dynamic monitoring of a large-scale crane during a lifting operation as an example, a series-parallel camera network is designed and the corresponding measurement method is studied. The movement range of the crane body is small, while that of the crane arm is large. The displacement of the crane body, the displacement of the crane arm relative to the body, and the deformation of the arm are measured. Compared with a purely series or purely parallel camera network, the designed series-parallel network can measure not only the movement and displacement of a large structure but also the relative movement and deformation of parts of interest, using a relatively simple optical measurement system.

  8. Optical improvement for laser material processing

    Energy Technology Data Exchange (ETDEWEB)

    Bosman, J.; De Keijzer, M.A.; De Kok, C.J.G.M. [ECN Engineering and Services, Petten (Netherlands); Molenaar, R.; Kettelarij, H.

    2010-05-15

    The use of laser technology enables flexibility and new concepts in, for example, solar cell production and optical moulds. Laser technology is used in these cases not for the laser system itself but for the ability to tailor this type of energy to the demands of the production processes. To realize the full potential of laser technology, it can be enhanced by adding optical elements such as polarizers, cameras, lenses and sensors. Two such optical elements are presented here. The first is laser pulse energy attenuation, used to increase the controllability of laser processes. The second is a new camera optic that enables integrated alignment with respect to features on the product; this enables marking on existing features and automated compensation of scanner drift. These camera systems can be used for micro-welding of polymers and for the repair of existing markings in moulds.

  9. Phase camera experiment for Advanced Virgo

    International Nuclear Information System (INIS)

    Agatsuma, Kazuhiro; Beuzekom, Martin van; Schaaf, Laura van der; Brand, Jo van den

    2016-01-01

    We report on a study of the phase camera, a frequency-selective wave-front sensor for a laser beam. This sensor is used to monitor the sidebands produced by phase modulation in a gravitational wave (GW) detector. In the operation of GW detectors, the laser modulation/demodulation method is used to measure mirror displacements and for position control. This plays a significant role because the quality of the controls affects the noise level of the GW detector. The phase camera is able to monitor each sideband separately, which is a great benefit for the manipulation of these delicate controls. Overcoming mirror aberrations will also be an essential part of Advanced Virgo (AdV), a GW detector near Pisa. Low-frequency sidebands in particular can be greatly affected by aberrations in one of the interferometer cavities. The phase cameras allow such changes to be tracked, because the state of the sidebands gives information on mirror aberrations. A prototype of the phase camera has been developed and is currently being tested. The performance checks are almost complete and the installation of the optics at the AdV site has started. After installation and commissioning, the phase camera will be combined with a thermal compensation system consisting of CO2 lasers and compensation plates. In this paper, we focus on the prototype and show some limitations arising from the scanner performance. - Highlights: • The phase camera is being developed for a gravitational wave detector. • The scanner performance limits the operation speed and layout design of the system. • An operating range was found by measuring the frequency response of the scanner.

  10. Phase camera experiment for Advanced Virgo

    Energy Technology Data Exchange (ETDEWEB)

    Agatsuma, Kazuhiro, E-mail: agatsuma@nikhef.nl [National Institute for Subatomic Physics, Amsterdam (Netherlands); Beuzekom, Martin van; Schaaf, Laura van der [National Institute for Subatomic Physics, Amsterdam (Netherlands); Brand, Jo van den [National Institute for Subatomic Physics, Amsterdam (Netherlands); VU University, Amsterdam (Netherlands)

    2016-07-11

    We report on a study of the phase camera, a frequency-selective wave-front sensor for a laser beam. This sensor is used to monitor the sidebands produced by phase modulation in a gravitational wave (GW) detector. In the operation of GW detectors, the laser modulation/demodulation method is used to measure mirror displacements and to control mirror positions; this plays a significant role because the quality of the controls affects the noise level of the detector. The phase camera can monitor each sideband separately, which is a great benefit for these delicate controls. Overcoming mirror aberrations will also be an essential part of Advanced Virgo (AdV), a GW detector near Pisa. Low-frequency sidebands in particular can be strongly affected by aberrations in one of the interferometer cavities. The phase cameras allow such changes to be tracked, because the state of the sidebands carries information about the mirror aberrations. A prototype of the phase camera has been developed and is currently under test. The performance checks are almost complete, and installation of the optics at the AdV site has started. After installation and commissioning, the phase camera will be combined with a thermal compensation system consisting of CO₂ lasers and compensation plates. In this paper, we focus on the prototype and show some limitations arising from the scanner performance. - Highlights: • The phase camera is being developed for a gravitational wave detector. • The scanner performance limits the operation speed and layout design of the system. • An operating range was found by measuring the frequency response of the scanner.

  11. Medium-sized aperture camera for Earth observation

    Science.gov (United States)

    Kim, Eugene D.; Choi, Young-Wan; Kang, Myung-Seok; Kim, Ee-Eul; Yang, Ho-Soon; Rasheed, Ad. Aziz Ad.; Arshad, Ahmad Sabirin

    2017-11-01

    Satrec Initiative and ATSB have been developing a medium-sized aperture camera (MAC) as an Earth-observation payload for a small satellite. Designed as a push-broom high-resolution camera, it has one panchromatic and four multispectral channels. At a nominal altitude of 685 km, the ground sampling distance is 2.5 m for the panchromatic channel and 5 m for the multispectral channels. The 300 mm-aperture Cassegrain telescope contains two aspheric mirrors and two spherical correction lenses. In keeping with the philosophy of building a simple and cost-effective camera, the mirrors incorporate no light-weighting, and the linear CCDs are mounted on a single PCB with no beam splitters. MAC is the main payload of RazakSAT, to be launched in 2005. RazakSAT is a 180 kg satellite, including MAC, designed to provide high-resolution imagery with a 20 km swath width from a near-equatorial orbit (NEqO). The mission objective is to demonstrate the capability of a high-resolution remote-sensing satellite system in such an orbit. This paper gives an overview of the MAC and RazakSAT programmes and presents the current development status of MAC, focusing on key optical aspects of the qualification model.

  12. Investigation of an Autofocusing Method for Visible Aerial Cameras Based on Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    Zhichao Chen

    2016-01-01

    In order to realize autofocusing in an aerial camera, an autofocusing system is established and its characteristics, including the working principle, the opto-mechanical structure and the focus evaluation function, are investigated. The causes of defocus in an aviation camera are analyzed, and several autofocusing methods, along with appropriate focus evaluation functions based on image processing techniques, are introduced. The proposed autofocusing system is designed and implemented using two CMOS detectors. Experimental results show that the proposed method meets the focusing accuracy requirement of an aviation camera, achieving a maximum focusing error of less than half the depth of focus. The system designed in this paper can find the optical imaging focal plane in real time; as such, this novel design has great potential in practical engineering, especially aerospace applications.
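
The record does not say which focus evaluation function the authors chose; a common image-processing choice is the variance of a Laplacian response, which peaks when the image is sharpest. A minimal sketch (the function name and test pattern are illustrative, not from the paper):

```python
import numpy as np

def variance_of_laplacian(img):
    """Focus score: variance of a 3x3 Laplacian response.

    Sharp (in-focus) images have stronger high-frequency content,
    so the Laplacian response varies more across the frame.
    """
    img = np.asarray(img, dtype=float)
    # 3x3 Laplacian computed via shifted copies (no external dependencies)
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

# A sharp checkerboard should score higher than its 2x2 mean-filtered version
sharp = np.indices((32, 32)).sum(axis=0) % 2 * 1.0
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)
           + np.roll(np.roll(sharp, 1, 0), 1, 1)) / 4.0
```

In an autofocus loop this score would be evaluated at several focal-plane positions and the position maximizing it selected.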

  13. Optical absorption measurement system

    International Nuclear Information System (INIS)

    Draggoo, V.G.; Morton, R.G.; Sawicki, R.H.; Bissinger, H.D.

    1989-01-01

    This patent describes a non-intrusive method for measuring the temperature rise of optical elements under high-power laser loading in order to determine their absorption coefficient. The method comprises irradiating the optical element with a high-average-power laser beam, viewing the element with an infrared camera to determine the temperature distribution across it, and calculating the absorption of the element from the measured temperature rise.
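
The final calculation step can be sketched with a lumped energy balance: early in the irradiation, before conduction and radiation losses matter, the absorbed power is roughly m·c_p·dT/dt, and the absorbed fraction is that divided by the incident power. The function and the example numbers below are illustrative assumptions, not values from the patent:

```python
def absorption_from_temperature_rise(dT_dt, mass_kg, c_p, laser_power_w):
    """Estimate the fractional absorption of an optical element from the
    initial rate of temperature rise seen by an infrared camera.

    Lumped energy balance for the early transient (losses neglected):
        P_absorbed ~= m * c_p * dT/dt
    The absorbed fraction is P_absorbed / P_incident.
    """
    p_absorbed = mass_kg * c_p * dT_dt
    return p_absorbed / laser_power_w

# Illustrative example: a 0.05 kg fused-silica window (c_p ~ 740 J/(kg K))
# warming at 0.02 K/s under a 1 kW beam -> absorbed fraction ~ 7.4e-4
alpha = absorption_from_temperature_rise(0.02, 0.05, 740.0, 1000.0)
```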

  14. University of Arizona High Energy Physics Program at the Cosmic Frontier 2014-2016

    Energy Technology Data Exchange (ETDEWEB)

    abate, alex [Univ. of Arizona, Tucson, AZ (United States); cheu, elliott [Univ. of Arizona, Tucson, AZ (United States)

    2016-10-24

    This is the final technical report from the University of Arizona High Energy Physics program at the Cosmic Frontier covering the period 2014-2016. The work aims to advance the understanding of dark energy using the Large Synoptic Survey Telescope (LSST). Progress on the engineering design of the power supplies for the LSST camera is discussed. A variety of contributions to photometric redshift measurement uncertainties were studied. The effect of the intergalactic medium on the photometric redshift of very distant galaxies was evaluated. Computer code was developed realizing the full chain of calculations needed to accurately and efficiently run large-scale simulations.

  15. University of Arizona High Energy Physics Program at the Cosmic Frontier 2014-2016

    International Nuclear Information System (INIS)

    Abate, Alex; Cheu, Elliott

    2016-01-01

    This is the final technical report from the University of Arizona High Energy Physics program at the Cosmic Frontier covering the period 2014-2016. The work aims to advance the understanding of dark energy using the Large Synoptic Survey Telescope (LSST). Progress on the engineering design of the power supplies for the LSST camera is discussed. A variety of contributions to photometric redshift measurement uncertainties were studied. The effect of the intergalactic medium on the photometric redshift of very distant galaxies was evaluated. Computer code was developed realizing the full chain of calculations needed to accurately and efficiently run large-scale simulations.

  16. Graphic design of pinhole cameras

    Science.gov (United States)

    Edwards, H. B.; Chu, W. P.

    1979-01-01

    The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
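
As a numerical companion to the graphic technique, the pinhole transfer function can also be computed directly. The sketch below assumes a simplified square pinhole, so the geometric blur contributes a sinc and incoherent diffraction contributes a triangle function with cutoff d/(λf); their product exhibits the classic trade-off that fixes an optimum pinhole size near √(λf). The model and parameters are illustrative, not taken from the paper:

```python
import numpy as np

def pinhole_mtf(nu, d, wavelength, focal_length):
    """Combined MTF of a square pinhole camera at spatial frequency nu
    (cycles per unit length in the image plane).

    Geometric blur of a square pinhole of side d -> |sinc(d*nu)|;
    incoherent diffraction of a square aperture -> triangle function
    with cutoff nu_c = d / (lambda * f).  The total MTF is the product.
    """
    geometric = np.abs(np.sinc(d * nu))        # np.sinc(x) = sin(pi x)/(pi x)
    nu_c = d / (wavelength * focal_length)
    diffraction = np.clip(1.0 - nu / nu_c, 0.0, None)
    return geometric * diffraction

# Trade-off: at a fixed frequency the MTF peaks at an intermediate diameter,
# close to the classic sqrt(lambda * f) scale (~230 um for these numbers).
lam, f, nu = 550e-9, 0.1, 2e3          # 550 nm light, 100 mm focal length, 2 cyc/mm
diams = np.linspace(50e-6, 1000e-6, 200)
best = diams[np.argmax([pinhole_mtf(nu, d, lam, f) for d in diams])]
```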

  17. Measuring the spatial resolution of an optical system in an undergraduate optics laboratory

    Science.gov (United States)

    Leung, Calvin; Donnelly, T. D.

    2017-06-01

    Two methods of quantifying the spatial resolution of a camera are described, performed, and compared, with the objective of designing an imaging-system experiment for students in an undergraduate optics laboratory. With the goal of characterizing the resolution of a typical digital single-lens reflex (DSLR) camera, we motivate, introduce, and show agreement between traditional test-target contrast measurements and the technique of using Fourier analysis to obtain the modulation transfer function (MTF). The advantages and drawbacks of each method are compared. Finally, we explore the rich optical physics at work in the camera system by calculating the MTF as a function of wavelength and f-number. For example, we find that the Canon 40D demonstrates better spatial resolution at short wavelengths, in accordance with scalar diffraction theory, but is not diffraction-limited, being significantly affected by spherical aberration. The experiment and data analysis routines described here can be built and written in an undergraduate optics lab setting.
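
A minimal sketch of the Fourier-analysis route to the MTF, assuming the edge-spread function has already been extracted from a test image (the windowing choice and the synthetic Gaussian blur are illustrative, not the authors' exact procedure):

```python
import numpy as np
from math import erf

def mtf_from_edge(esf):
    """Estimate the MTF from a sampled edge-spread function (ESF):
    differentiate to get the line-spread function (LSF), taper it to
    suppress truncation ripple, then take the magnitude of its Fourier
    transform, normalised to 1 at zero frequency."""
    lsf = np.diff(np.asarray(esf, dtype=float))
    lsf *= np.hanning(lsf.size)
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]

# Synthetic check: an edge blurred by a Gaussian PSF.  A wider blur
# should make the MTF fall faster with frequency.
x = np.arange(-64, 64)
def blurred_edge(sigma):
    return np.array([0.5 * (1 + erf(xi / (sigma * np.sqrt(2)))) for xi in x])

mtf_sharp = mtf_from_edge(blurred_edge(1.0))
mtf_soft = mtf_from_edge(blurred_edge(3.0))
```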

  18. Invisible watermarking optical camera communication and compatibility issues of IEEE 802.15.7r1 specification

    Science.gov (United States)

    Le, Nam-Tuan

    2017-05-01

    Copyright protection and information security are two of the most important concerns for digital data, following the development of the internet and computer networks. As an important protection solution, watermarking technology has become one of the most active topics in industrial and academic research. Watermarking techniques can be classified into two categories: visible and invisible watermarking. The invisible technique has an advantage for user interaction because of its invisibility. Applying watermarking to communication is a challenge and a new direction for communication technology. In this paper we propose new research on communication technology using optical camera communication (OCC) based on invisible watermarking. Besides analyzing the performance of the proposed system, we also suggest the frame structure of the PHY and MAC layers for the IEEE 802.15.7r1 specification, which is a revision of the visible light communication (VLC) standard.

  19. Design principles and applications of a cooled CCD camera for electron microscopy.

    Science.gov (United States)

    Faruqi, A R

    1998-01-01

    Cooled CCD cameras offer a number of advantages in recording electron microscope images with CCDs rather than film, including immediate availability of the image in a digital format suitable for further computer processing, high dynamic range, excellent linearity, and a high detective quantum efficiency for recording electrons. In one important respect, however, film has superior properties: the spatial resolution of the CCD detectors tested so far (in terms of point spread function or modulation transfer function) is inferior to film, and a great deal of our effort has been spent in designing detectors with improved spatial resolution. Various instrumental contributions to spatial resolution have been analysed, and in this paper we discuss the contribution of the phosphor-fibre-optics system to this measurement. We have evaluated the performance of a number of detector components and parameters, e.g. different phosphors (and a scintillator) and optical coupling with lenses or fibre optics at various demagnification factors, to improve the detector performance. The camera described in this paper, which is based on this analysis, uses a tapered fibre-optics coupling between the phosphor and the CCD and is installed on a Philips CM12 electron microscope equipped for cryo-microscopy. The main use of the camera so far has been in recording electron diffraction patterns from two-dimensional crystals of bacteriorhodopsin--from the wild type and from different trapped states during the photocycle. As an example of the type of data obtained with the CCD camera, a two-dimensional Fourier projection map from the trapped O-state is also included. With faster computers, it will soon be possible to undertake this type of work on-line. Also, with improvements in detector size and resolution, CCD detectors, already ideal for diffraction, will be able to compete with film in the recording of high-resolution images.

  20. Space telescope phase B definition study. Volume 2A: Science instruments, f24 field camera

    Science.gov (United States)

    Grosso, R. P.; Mccarthy, D. J.

    1976-01-01

    The analysis and design of the F/24 field camera for the Space Telescope are discussed. The camera was designed for application to the radial bay of the optical telescope assembly and has an on-axis field of view of 3 arc-minutes by 3 arc-minutes.

  1. Automatic inference of geometric camera parameters and inter-camera topology in uncalibrated disjoint surveillance cameras

    Science.gov (United States)

    den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.

    2015-10-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier to the use of video analytics. Automating the calibration allows for a short configuration time and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large-scale surveillance systems. We present an autocalibration method based entirely on pedestrian detections in surveillance video from multiple non-overlapping cameras. In this paper, we describe the two main components of automatic calibration. The first is intra-camera geometry estimation, which yields estimates of the tilt angle, focal length and camera height, important for the conversion from pixels to meters and vice versa. The second is inter-camera topology inference, which yields an estimate of the distance between cameras, important for the spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
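
The pixels-to-meters conversion enabled by the intra-camera estimates can be sketched with a flat-ground pinhole model: once tilt, focal length and camera height are known, each image row maps to a ground distance. The function below is an illustrative assumption, not the authors' implementation:

```python
import math

def ground_distance(row_offset_px, focal_px, tilt_rad, cam_height_m):
    """Ground distance from the point beneath the camera to the ground
    point imaged `row_offset_px` pixels below the principal point, for a
    camera mounted at `cam_height_m` whose optical axis is tilted
    `tilt_rad` below horizontal.

    Flat-ground pinhole model: the viewing ray leaves the camera at an
    angle tilt + atan(row_offset / focal) below the horizon.
    """
    angle = tilt_rad + math.atan2(row_offset_px, focal_px)
    if angle <= 0:
        raise ValueError("ray does not intersect the ground plane")
    return cam_height_m / math.tan(angle)

# Illustrative numbers: 4 m mounting height, 1000 px focal length, 30 deg tilt
d_center = ground_distance(0, 1000, math.radians(30), 4.0)   # ~6.93 m
d_lower = ground_distance(200, 1000, math.radians(30), 4.0)  # nearer point
```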

  2. Low-cost and high-speed optical mark reader based on an intelligent line camera

    Science.gov (United States)

    Hussmann, Stephan; Chan, Leona; Fung, Celine; Albrecht, Martin

    2003-08-01

    Optical mark recognition (OMR) is thoroughly reliable and highly efficient, provided that high standards are maintained at both the planning and implementation stages. It is necessary to ensure that OMR forms are designed with due attention to data integrity checks, that the best use is made of the features built into the OMR reader, that data integrity is checked before the data are processed, and that the data are validated before processing. This paper describes the design and implementation of a prototype OMR system for marking multiple-choice tests automatically. Parameter testing was carried out before the platform and the multiple-choice answer sheet were designed. Position recognition and position verification methods have been developed and implemented in an intelligent line-scan camera: the position recognition process is implemented in a field-programmable gate array (FPGA), whereas the verification process is implemented in a microcontroller. The verified results are then sent to the graphical user interface (GUI) for answer checking and statistical analysis. At the end of the paper the proposed OMR system is compared with commercially available systems on the market.

  3. Deep Rapid Optical Follow-Up of Gravitational Wave Sources with the Dark Energy Camera

    Science.gov (United States)

    Cowperthwaite, Philip

    2018-01-01

    The detection of an electromagnetic counterpart to a gravitational wave detection by the Advanced LIGO and Virgo interferometers is one of the great observational challenges of our time. The large localization regions and potentially faint counterparts require the use of wide-field, large-aperture telescopes. As a result, the Dark Energy Camera, a 3.3 sq. deg. CCD imager on the 4-m Blanco telescope at CTIO in Chile, is the most powerful instrument for this task in the Southern Hemisphere. I will report on the results of our joint program between the community and members of the Dark Energy Survey to conduct rapid and efficient follow-up of gravitational wave sources. This includes systematic searches for optical counterparts, as well as developing an understanding of contaminating sources on timescales not normally probed by traditional untargeted supernova surveys. I will additionally comment on the immense science gains to be made by a joint detection and discuss future prospects from the standpoint of both next-generation wide-field telescopes and next-generation gravitational wave detectors.

  4. High-performance dual-speed CCD camera system for scientific imaging

    Science.gov (United States)

    Simpson, Raymond W.

    1996-03-01

    Traditionally, scientific camera systems were partitioned into a `camera head' containing the CCD and its support circuitry and a camera controller, which provided analog-to-digital conversion, timing, control, computer interfacing, and power. A new, unitized high-performance scientific CCD camera is described, with dual-speed readout at 1 × 10⁶ or 5 × 10⁶ pixels per second, 12-bit digital gray scale, high-performance thermoelectric cooling, and built-in composite video output. This camera provides all digital, analog, and cooling functions in a single compact unit. The new system incorporates the A/D converter, timing, control and computer interfacing in the camera, with the power supply remaining a separate remote unit. A 100 Mbyte/second serial link transfers data over copper or fiber media to a variety of host computers, including Sun, SGI, SCSI, PCI, EISA, and Apple Macintosh. Having all the digital and analog functions in the camera made it possible to modify this system for the Woods Hole Oceanographic Institution for use on a remotely controlled submersible vehicle. The oceanographic version achieves 16-bit dynamic range at 1.5 × 10⁵ pixels/second, can be operated at depths of 3 kilometers, and transfers data to the surface via a real-time fiber-optic link.

  5. Weather and atmosphere observation with the ATOM all-sky camera

    Directory of Open Access Journals (Sweden)

    Jankowsky Felix

    2015-01-01

    The Automatic Telescope for Optical Monitoring (ATOM) for H.E.S.S. is a 75 cm optical telescope that operates fully automatically. As there is no observer present during observations, an auxiliary all-sky camera serves as the weather monitoring system. This device takes an image of the whole sky every three minutes. The gathered data then undergo live analysis: an astrometric comparison with a theoretical night-sky model interprets the absence of stars as cloud coverage. The sky monitor also serves as a tool for the meteorological analysis of the observation site of the upcoming Cherenkov Telescope Array. This overview covers the design and benefits of the all-sky camera and gives an introduction to current efforts to integrate the device into the atmosphere analysis programme of H.E.S.S.
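
The cloud inference step can be sketched as follows, assuming a catalogue of expected star positions and a list of detected star positions in the all-sky image; the fraction of expected stars with no nearby detection is read as cloud coverage. This is an illustrative simplification, not the H.E.S.S. analysis code:

```python
def cloud_coverage(expected_stars, detected_positions, tolerance=2.0):
    """Crude cloud estimate: the fraction of catalogue stars expected
    above the horizon that have no detected star within `tolerance`
    pixels of their predicted position."""
    missing = 0
    for ex, ey in expected_stars:
        if not any((ex - dx) ** 2 + (ey - dy) ** 2 <= tolerance ** 2
                   for dx, dy in detected_positions):
            missing += 1
    return missing / len(expected_stars)

# 4 expected stars, 3 detected near their predicted spots -> 25% coverage
expected = [(10.0, 10.0), (50.0, 80.0), (120.0, 40.0), (200.0, 200.0)]
detected = [(10.5, 9.8), (49.0, 81.2), (119.7, 40.1)]
coverage = cloud_coverage(expected, detected)
```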

  6. Theoretical considerations on the possibility of using a television camera in scintigraphy

    International Nuclear Information System (INIS)

    Banget Mossaz, Gaston; Cezilly, Daniel; Paccard, Michel

    1969-04-01

    After a presentation of the principles of scintigraphy for the exploration of human organs, of the three main parts of a scintigraphic apparatus (collimator, scintillator and detector) and of their characteristics (resolving power, sensitivity, contrast), this paper describes the properties of gamma radiation interacting with matter, its absorption and detection, some statistical notions about gamma radiation (the Poisson law), and the properties of the collimator, the scintillator (sodium iodide) and the detector. The use of a television camera is then introduced, with issues concerning the limitations of a camera tube, the electron optics of the tube, camera tubes with brightness amplification, the case of a Vidicon tube, etc., and some considerations on the potential benefits of television cameras for resolution, contrast and sensitivity

  7. Gradient-Index Optics

    Science.gov (United States)

    2010-03-31

    Extending imaging and nonimaging design capabilities to incorporate manufacturable GRIN lenses can provide imaging lens systems that are compact. Subject terms: imaging optics, nonimaging optics, gradient-index optics, camera, concentrator.

  8. The iQID camera: An ionizing-radiation quantum imaging detector

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Brian W., E-mail: brian.miller@pnnl.gov [Pacific Northwest National Laboratory, Richland, WA 99352 (United States); College of Optical Sciences, The University of Arizona, Tucson, AZ 85719 (United States); Gregory, Stephanie J.; Fuller, Erin S. [Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Barrett, Harrison H.; Bradford Barber, H.; Furenlid, Lars R. [Center for Gamma-Ray Imaging, The University of Arizona, Tucson, AZ 85719 (United States); College of Optical Sciences, The University of Arizona, Tucson, AZ 85719 (United States)

    2014-12-11

    We have developed and tested a novel ionizing-radiation Quantum Imaging Detector (iQID). This scintillation-based detector was originally developed as a high-resolution gamma-ray imager, called BazookaSPECT, for use in single-photon emission computed tomography (SPECT). Recently, we have investigated the detector's response and imaging potential with other forms of ionizing radiation, including alpha particles, neutrons, beta particles, and fission fragments. The confirmed response to this broad range of ionizing radiation has prompted its new title. The principle of operation of the iQID camera involves coupling a scintillator to an image intensifier. The scintillation light generated by particle interactions is optically amplified by the intensifier and then re-imaged onto a CCD/CMOS camera sensor. The intensifier provides sufficient optical gain that practically any CCD/CMOS camera can be used to image ionizing radiation. The spatial location and energy of individual particles are estimated on an event-by-event basis in real time, using image analysis algorithms on high-performance graphics processing hardware. Distinguishing features of the iQID camera include portability, large active areas, excellent detection efficiency for charged particles, and high spatial resolution (tens of microns). Although modest, the energy resolution of the iQID is sufficient to discriminate between particle types. Additionally, spatial features of individual events can be used for particle discrimination. An important iQID imaging application that has recently been developed is real-time, single-particle digital autoradiography. We present the latest results and discuss potential applications.
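
The event-by-event position and energy estimation can be sketched as a threshold-and-centroid pass over a camera frame. The clustering and weighting below are an illustrative simplification, not the iQID production algorithm:

```python
import numpy as np

def locate_events(frame, threshold):
    """Threshold a frame, group contiguous bright pixels into events
    (4-connected flood fill), and return each event's intensity-weighted
    centroid together with its summed intensity as an energy estimate."""
    frame = np.asarray(frame, dtype=float)
    mask = frame > threshold
    visited = np.zeros_like(mask)
    events = []
    for seed in zip(*np.nonzero(mask)):
        if visited[seed]:
            continue
        stack, pixels = [seed], []
        visited[seed] = True
        while stack:                       # flood fill one event cluster
            r, c = stack.pop()
            pixels.append((r, c))
            for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if 0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1] \
                        and mask[rr, cc] and not visited[rr, cc]:
                    visited[rr, cc] = True
                    stack.append((rr, cc))
        w = np.array([frame[p] for p in pixels])
        rc = np.array(pixels, dtype=float)
        events.append((tuple(w @ rc / w.sum()), w.sum()))
    return events

# Two well-separated blobs -> two events with sensible centroids/energies
frame = np.zeros((16, 16))
frame[2, 3] = 10.0
frame[10:12, 8] = [4.0, 4.0]
found = locate_events(frame, 1.0)
```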

  9. Hubble Space Telescope, Faint Object Camera

    Science.gov (United States)

    1981-01-01

    This drawing illustrates the Hubble Space Telescope's (HST's) Faint Object Camera (FOC). The FOC reflects light down one of two optical pathways. The light enters a detector after passing through filters or through devices that can block out light from bright objects; light from bright objects is blocked out to enable the FOC to see background images. The detector intensifies the image, then records it much like a television camera. For faint objects, images can be built up over long exposure times. The total image is translated into digital data, transmitted to Earth, and then reconstructed. The purpose of the HST, the most complex and sensitive optical telescope ever made, is to study the cosmos from low-Earth orbit. By placing the telescope in space, astronomers are able to collect data free of the Earth's atmosphere. The HST detects objects 25 times fainter than the dimmest objects seen from Earth and provides astronomers with an observable universe 250 times larger than that visible from ground-based telescopes, perhaps as far away as 14 billion light-years. The HST views galaxies, stars, planets, comets, possibly other solar systems, and even unusual phenomena such as quasars, with 10 times the clarity of ground-based telescopes. The HST was deployed from the Space Shuttle Discovery (STS-31 mission) into Earth orbit in April 1990. The Marshall Space Flight Center had responsibility for the design, development, and construction of the HST. The Perkin-Elmer Corporation, in Danbury, Connecticut, developed the optical system and guidance sensors.

  10. Process simulation in digital camera system

    Science.gov (United States)

    Toadere, Florin

    2012-06-01

    The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to a numerical signal, along with color processing and rendering. We consider the image acquisition system to be linear, shift-invariant and axial, with light propagation orthogonal to the system. We use a spectral image-processing algorithm to simulate the radiometric properties of a digital camera. The algorithm takes into account the transmittances of the light source, lenses and filters, and the quantum efficiency of a CMOS (complementary metal-oxide-semiconductor) sensor. The optical part is characterized by multiple convolutions between the point spread functions of the optical components; we use a Cooke triplet, the aperture, the light fall-off and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal-to-noise ratio, dynamic range, analog-to-digital conversion and JPEG compression. We reconstruct the noisy, blurred image by blending differently exposed images to reduce photon shot noise; we also filter the fixed-pattern noise and sharpen the image. Then come the color-processing blocks: white balancing, color correction, gamma correction, and conversion from the XYZ color space to the RGB color space. For color reproduction we use an OLED (organic light-emitting diode) monitor. The analysis can be useful in assisting students and engineers with image-quality evaluation and imaging-system design. Many other configurations of blocks can be used in our analysis.
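
The final color-space conversion stage admits a compact sketch using the standard XYZ-to-sRGB matrix and the piecewise sRGB gamma encoding from IEC 61966-2-1; the paper's exact color-correction chain may differ:

```python
import numpy as np

# Standard D65 XYZ -> linear sRGB matrix (IEC 61966-2-1)
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

def xyz_to_srgb(xyz):
    """Convert an XYZ colour (values nominally in [0, 1]) to 8-bit sRGB:
    linear matrix transform, clip, then piecewise sRGB gamma encoding."""
    rgb_lin = np.clip(XYZ_TO_SRGB @ np.asarray(xyz, dtype=float), 0.0, 1.0)
    gamma = np.where(rgb_lin <= 0.0031308,
                     12.92 * rgb_lin,
                     1.055 * rgb_lin ** (1 / 2.4) - 0.055)
    return np.round(255 * gamma).astype(int)

# The D65 white point (X, Y, Z) ~ (0.9505, 1.0, 1.089) maps to (255, 255, 255)
white = xyz_to_srgb([0.9505, 1.0, 1.089])
```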

  11. Novel computer-based endoscopic camera

    Science.gov (United States)

    Rabinovitz, R.; Hai, N.; Abraham, Martin D.; Adler, Doron; Nissani, M.; Fridental, Ron; Vitsnudel, Ilia

    1995-05-01

    We have introduced a computer-based endoscopic camera which includes (a) unique real-time digital image processing to optimize image visualization, by reducing overexposed glare areas and brightening dark areas and by accentuating sharpness and fine structures, and (b) patient data documentation and management. The image processing is based on i Sight's iSP1000™ digital video processor chip and the patented Adaptive Sensitivity™ scheme for capturing and displaying images with a wide dynamic range of light, taking into account local neighborhood image conditions and global image statistics. It provides the medical user with the ability to view images under difficult lighting conditions, without losing details `in the dark' or in completely saturated areas. The patient data documentation and management allows storage of images (approximately 1 MB per full 24-bit color image) to any storage device installed in the camera, or to an external host via a network. The patient data included with every image describe essential information on the patient and the procedure. The operator can assign custom data descriptors and can search for stored images/data by typing any image descriptor. The camera optics has an extended zoom range of f = 20-45 mm, allowing control of the diameter of the field displayed on the monitor such that the complete field of view of the endoscope can be displayed over the whole area of the screen. All these features provide a versatile endoscopic camera with excellent image quality and documentation capabilities.

  12. Accurate and cost-effective MTF measurement system for lens modules of digital cameras

    Science.gov (United States)

    Chang, Gao-Wei; Liao, Chia-Cheng; Yeh, Zong-Mu

    2007-01-01

    For many years, the widening use of digital imaging products, e.g., digital cameras, has attracted much attention in the consumer electronics market. It is important, however, to measure and enhance the imaging performance of digital cameras relative to conventional (photographic-film) cameras. For example, diffraction arising from the miniaturization of the optical modules tends to decrease the image resolution. As a figure of merit, the modulation transfer function (MTF) has been broadly employed to estimate image quality. The objective of this paper is therefore to design and implement an accurate and cost-effective MTF measurement system for digital cameras. Once the MTF of the sensor array is known, that of the optical module can be obtained. In this approach, a spatial light modulator (SLM) is employed to modulate the spatial frequency of the light emitted from the light source. The modulated light passing through the camera under test is consecutively detected by the sensors, and the corresponding images formed by the camera are acquired by a computer and then processed by an algorithm that computes the MTF. Finally, an investigation of the measurement accuracy against various methods, such as the bar-target and spread-function methods, shows that our approach gives quite satisfactory results.

  13. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    Science.gov (United States)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular-sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and development of the retina-like sensor. Because of the retina-like sensor's special pixel distribution, image coordinate transformation and interpolation based on sub-pixel interpolation must be realized. The hardware platform is composed of the retina-like sensor camera, the rectangular-sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software was written in VC++ on Visual Studio 2010. Experimental results show that the system realizes acquisition and display for both cameras.
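
The sub-pixel interpolation step can be sketched with plain bilinear sampling, the usual building block when remapping a non-rectangular pixel layout onto a rectangular display grid (an illustrative stand-in for the authors' implementation):

```python
import numpy as np

def bilinear_sample(img, y, x):
    """Sample an image at a non-integer (sub-pixel) coordinate by
    bilinear interpolation of its four integer-grid neighbours."""
    img = np.asarray(img, dtype=float)
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1 = min(y0 + 1, img.shape[0] - 1)
    x1 = min(x0 + 1, img.shape[1] - 1)
    fy, fx = y - y0, x - x0
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bottom = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bottom

# On a linear ramp, bilinear interpolation reproduces the ramp exactly
ramp = np.arange(16.0).reshape(4, 4)     # pixel value = 4*row + col
val = bilinear_sample(ramp, 1.5, 2.25)   # expect 4*1.5 + 2.25 = 8.25
```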

  14. A contribution to the study of plasmas. An ultra rapid photometric camera

    International Nuclear Information System (INIS)

    Alpern, Marc R.

    1971-01-01

    The limitations of image converters in ultra-rapid photography are discussed, and an electronographic camera designed to make photometric measurements on plasmas is presented. The electron optics is then studied and the attainable performance, particularly in dynamic operation, is assessed. Finally, the experimental facts concerning the interaction between a laser beam and a thin layer of gold are established using this camera, and the complexity of the mechanisms involved in this interaction is shown. (author) [fr

  15. Miniaturisation of Pressure-Sensitive Paint Measurement Systems Using Low-Cost, Miniaturised Machine Vision Cameras.

    Science.gov (United States)

    Quinn, Mark Kenneth; Spinosa, Emanuele; Roberts, David A

    2017-07-25

    Measurements of pressure-sensitive paint (PSP) have been performed using new or non-scientific imaging technology based on machine vision tools. Machine vision camera systems are typically used for automated inspection or process monitoring. Such devices offer the benefits of lower cost and reduced size compared with typical scientific-grade cameras; however, their optical qualities and suitability have yet to be determined. This research intends to show relevant imaging characteristics and the applicability of such imaging technology for PSP. Camera performance is benchmarked against standard scientific imaging equipment, and subsequent PSP tests are conducted using a static calibration chamber. The findings demonstrate that machine vision technology can be used for PSP measurements, opening up the possibility of performing measurements on board small-scale models such as those used for wind tunnel testing, or measurements in confined spaces with limited optical access.
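Static-chamber PSP calibration conventionally rests on the Stern-Volmer relation I_ref/I = A + B·(P/P_ref); a minimal fitting sketch (the coefficients and data below are synthetic, not taken from the paper) is:

```python
import numpy as np

def fit_stern_volmer(p_ratio, i_ratio):
    """Least-squares fit of the Stern-Volmer relation
        I_ref / I = A + B * (P / P_ref)
    p_ratio : P / P_ref values from the calibration chamber
    i_ratio : corresponding I_ref / I intensity ratios
    Returns (A, B).
    """
    B, A = np.polyfit(p_ratio, i_ratio, 1)  # slope, intercept
    return A, B

def pressure_from_intensity(i_ratio, A, B):
    """Invert the calibration: P/P_ref = (I_ref/I - A) / B."""
    return (np.asarray(i_ratio, dtype=float) - A) / B

# synthetic calibration data with A = 0.3, B = 0.7
p = np.linspace(0.5, 1.5, 11)
i = 0.3 + 0.7 * p
A, B = fit_stern_volmer(p, i)
```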

  16. Miniaturisation of Pressure-Sensitive Paint Measurement Systems Using Low-Cost, Miniaturised Machine Vision Cameras

    Directory of Open Access Journals (Sweden)

    Mark Kenneth Quinn

    2017-07-01

    Measurements of pressure-sensitive paint (PSP) have been performed using new or non-scientific imaging technology based on machine vision tools. Machine vision camera systems are typically used for automated inspection or process monitoring. Such devices offer the benefits of lower cost and reduced size compared with typical scientific-grade cameras; however, their optical qualities and suitability have yet to be determined. This research intends to show relevant imaging characteristics and the applicability of such imaging technology for PSP. Camera performance is benchmarked against standard scientific imaging equipment, and subsequent PSP tests are conducted using a static calibration chamber. The findings demonstrate that machine vision technology can be used for PSP measurements, opening up the possibility of performing measurements on board small-scale models such as those used for wind tunnel testing, or measurements in confined spaces with limited optical access.

  17. X-ray imaging using digital cameras

    Science.gov (United States)

    Winch, Nicola M.; Edgar, Andrew

    2012-03-01

    The possibility of using the combination of a computed radiography (storage phosphor) cassette and a semiprofessional-grade digital camera for medical or dental radiography is investigated. We compare the performance of (i) a Canon 5D Mk II single-lens reflex camera with an f/1.4 lens and full-frame CMOS array sensor and (ii) a cooled CCD-based camera with a 1/3-frame sensor and the same lens system. Both systems are tested with 240 x 180 mm cassettes based on either powdered europium-doped barium fluoride bromide or needle-structure europium-doped cesium bromide. The modulation transfer function of both systems has been determined and falls to a value of 0.2 at around 2 lp/mm; it is limited by scattering of the light emitted from the storage phosphor rather than by the optics or sensor pixelation. The modulation transfer function for the CsBr:Eu2+ plate is bimodal, with a high-frequency wing attributed to the light-guiding behaviour of the needle structure. The detective quantum efficiency has been determined using a radioisotope source and is comparatively low, at 0.017 for the CMOS camera and 0.006 for the CCD camera, attributed to the poor light harvesting by the lens. The primary advantages of the method are portability, robustness, digital imaging, and low cost; the limitations are the low detective quantum efficiency, and hence signal-to-noise ratio, at medical doses, and the restricted range of plate sizes. Representative images taken with medical doses are shown and illustrate the potential use for portable basic radiography.

  18. The Alfred Nobel rocket camera. An early aerial photography attempt

    Science.gov (United States)

    Ingemar Skoog, A.

    2010-02-01

    Alfred Nobel (1833-1896), mainly known for his invention of dynamite and the creation of the Nobel Prizes, was an engineer and inventor active in many fields of science and engineering, e.g. chemistry, medicine, mechanics, metallurgy, optics, armoury and rocketry. Amongst his inventions in rocketry was the smokeless solid propellant ballistite (i.e. cordite) patented for the first time in 1887. As a very wealthy person he actively supported many Swedish inventors in their work. One of them was W.T. Unge, who was devoted to the development of rockets and their applications. Nobel and Unge held several rocket patents together and also jointly worked on various rocket applications. In mid-1896 Nobel applied for patents in England and France for "An Improved Mode of Obtaining Photographic Maps and Earth or Ground Measurements" using a photographic camera carried by a "…balloon, rocket or missile…". During the remainder of 1896 the mechanical design of the camera mechanism was pursued and cameras were manufactured. In April 1897 (after the death of Alfred Nobel) the first aerial photos were taken by these cameras. These might be the first documented aerial photos taken by a rocket-borne camera. Cameras and photos from 1897 have been preserved. Nobel not only developed the rocket-borne camera but also proposed methods for using the photographs taken for ground measurements and preparing maps.

  19. Security camera resolution measurements: Horizontal TV lines versus modulation transfer function measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Birch, Gabriel Carisle [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Griffin, John Clark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    The horizontal television lines (HTVL) metric has been the primary quantity used by division 6000 to characterize camera resolution for high-consequence security systems. This document shows that HTVL measurements are fundamentally insufficient as a metric of camera resolution, and proposes a quantitative, standards-based methodology: measuring the camera system modulation transfer function (MTF), the most common and accepted metric of resolution in the optical science community. Because HTVL calculations are easily misinterpreted or poorly defined, we present several scenarios in which HTVL is frequently reported and discuss their problems. The MTF metric is then discussed, and scenarios are presented with calculations showing the application of such a metric.

  20. Infrared detectors and test technology of cryogenic camera

    Science.gov (United States)

    Yang, Xiaole; Liu, Xingxin; Xing, Mailing; Ling, Long

    2016-10-01

    Cryogenic cameras, which are widely used in deep-space detection, cool down the optical system and support structure by cryogenic refrigeration technology, thereby improving sensitivity. The characteristics and design points of the infrared detector are discussed in combination with the camera's characteristics. At the same time, cryogenic-background test systems for the chip and for the detector assembly are established: the chip test system is based on a variable-temperature multilayer Dewar, and the assembly test system is based on a target and background simulator in a thermal-vacuum environment. The core of the test is establishing the cryogenic background. Non-uniformity, dead-pixel ratio, and noise from the test results are given finally. The establishment of the test systems supports the design and calculation of infrared systems.
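The non-uniformity and dead-pixel ratio reported from such a test can be computed from per-pixel responses under uniform blackbody illumination; a hedged sketch follows (the 50% dead-pixel threshold is an assumption for illustration, not the paper's criterion):

```python
import numpy as np

def array_statistics(response, dead_frac=0.5):
    """Evaluate an IR focal-plane array from its per-pixel response
    (mean signal under uniform illumination).

    Dead (ineffective) pixels: response below dead_frac * array mean.
    Non-uniformity: RMS deviation of live pixels / live mean, in percent.
    Returns (non_uniformity_pct, dead_pixel_ratio_pct).
    """
    r = np.asarray(response, dtype=float).ravel()
    dead = r < dead_frac * r.mean()
    live = r[~dead]
    nu = 100.0 * live.std() / live.mean()
    dead_ratio = 100.0 * dead.sum() / r.size
    return nu, dead_ratio
```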

  1. A novel simultaneous streak and framing camera without principle errors

    Science.gov (United States)

    Jingzhen, L.; Fengshan, S.; Ningwen, L.; Xiangdong, G.; Bin, H.; Qingyang, W.; Hongyi, C.; Yi, C.; Xiaowei, L.

    2018-02-01

    A novel simultaneous streak and framing camera with continuous access, whose complete information is far more important for the exact interpretation and precise evaluation of many detonation events and shockwave phenomena, has been developed. The camera, with a maximum imaging frequency of 2 × 10^6 fps and a maximum scanning velocity of 16.3 mm/μs, has fine imaging properties: an eigen resolution of over 40 lp/mm in the temporal direction and over 60 lp/mm in the spatial direction, zero framing-frequency principle error for framing records, and a maximum time resolving power of 8 ns with a scanning velocity nonuniformity of 0.136%-0.277% for streak records. Test data have verified the performance of the camera quantitatively. This camera, which simultaneously gains frames and a streak that are parallax-free and share an identical time base, is characterized by a plane optical system at oblique incidence (different from a space system), an innovative camera obscura without principle errors, and a high-velocity motor-driven beryllium-like rotating mirror made of high-strength aluminum alloy with a cellular lateral structure. Experiments demonstrate that the camera is very useful and reliable for taking high-quality pictures of detonation events.

  2. Camera Trajectory from Wide Baseline Images

    Science.gov (United States)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    angle θ of its corresponding rays w.r.t. the optical axis as θ = a·r / (1 + b·r²). After a successful calibration, we know the correspondence of the image points to the 3D optical rays in the coordinate system of the camera. The following steps aim at finding the transformation between the camera and the world coordinate systems, i.e. the pose of the camera in the 3D world, using 2D image matches. For computing 3D structure, we construct a set of tentative matches by detecting different affine covariant feature regions, including MSER, Harris Affine, and Hessian Affine, in the acquired images. These features are an alternative to the popular SIFT features and work comparably in our situation. Parameters of the detectors are chosen to limit the number of regions to one to two thousand per image. The detected regions are assigned local affine frames (LAFs) and transformed into standard positions w.r.t. their LAFs. Discrete Cosine Descriptors are computed for each region in the standard position. Finally, mutual distances of all regions in one image and all regions in the other image are computed as the Euclidean distances of their descriptors, and tentative matches are constructed by selecting the mutually closest pairs. As opposed to methods using short baseline images, simpler image features which are not affine covariant cannot be used, because the viewpoint can change a lot between consecutive frames. Furthermore, feature matching has to be performed on the whole frame because no assumptions on the proximity of the consecutive projections can be made for wide baseline images. This makes the feature detection, description, and matching much more time-consuming than for short baseline images and limits the usage to low frame rate sequences when operating in real time. Robust 3D structure can be computed by RANSAC, which searches for the largest subset of the set of tentative matches that is, within a predefined threshold ε, consistent with an epipolar geometry.
We use ordered sampling as
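The mutually-closest-pairs step described above can be sketched as follows; the descriptor arrays are illustrative placeholders for the Discrete Cosine Descriptors:

```python
import numpy as np

def mutual_nearest_matches(desc_a, desc_b):
    """Tentative matches as mutually closest descriptor pairs.

    desc_a : (Na, D) descriptors from image A
    desc_b : (Nb, D) descriptors from image B
    Returns list of (i, j) pairs where a_i and b_j are each other's
    nearest neighbour under Euclidean distance.
    """
    # pairwise squared Euclidean distances, shape (Na, Nb)
    d2 = ((desc_a[:, None, :] - desc_b[None, :, :]) ** 2).sum(-1)
    nn_ab = d2.argmin(axis=1)   # closest b for each a
    nn_ba = d2.argmin(axis=0)   # closest a for each b
    return [(i, j) for i, j in enumerate(nn_ab) if nn_ba[j] == i]
```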

  3. Detector construction for a scintillation camera

    International Nuclear Information System (INIS)

    Ashe, J.B.

    1977-01-01

    An improved transducer construction for a scintillation camera in which a light conducting element is equipped with a layer of moisture impervious material is described. A scintillation crystal is thereafter positioned in optical communication with the moisture impervious layer and the remaining surfaces of the scintillation crystal are encompassed by a moisture shield. Affixing the moisture impervious layer to the light conducting element prior to attachment of the scintillation crystal reduces the requirement for mechanical strength in the moisture impervious layer and thereby allows a layer of reduced thickness to be utilized. Preferably, photodetectors are also positioned in optical communication with the light conducting element prior to positioning the scintillation crystal in contact with the impervious layer. 13 claims, 4 figures

  4. Development of a SiPM Camera for a Schwarzschild-Couder Cherenkov Telescope for the Cherenkov Telescope Array

    CERN Document Server

    Otte, A N; Dickinson, H.; Funk, S.; Jogler, T.; Johnson, C.A.; Karn, P.; Meagher, K.; Naoya, H.; Nguyen, T.; Okumura, A.; Santander, M.; Sapozhnikov, L.; Stier, A.; Tajima, H.; Tibaldo, L.; Vandenbroucke, J.; Wakely, S.; Weinstein, A.; Williams, D.A.

    2015-01-01

    We present the development of a novel 11328 pixel silicon photomultiplier (SiPM) camera for use with a ground-based Cherenkov telescope with Schwarzschild-Couder optics as a possible medium-sized telescope for the Cherenkov Telescope Array (CTA). The finely pixelated camera samples air-shower images with more than twice the optical resolution of cameras that are used in current Cherenkov telescopes. Advantages of the higher resolution will be a better event reconstruction yielding improved background suppression and angular resolution of the reconstructed gamma-ray events, which is crucial in morphology studies of, for example, Galactic particle accelerators and the search for gamma-ray halos around extragalactic sources. Packing such a large number of pixels into an area of only half a square meter and having a fast readout directly attached to the back of the sensors is a challenging task. For the prototype camera development, SiPMs from Hamamatsu with through silicon via (TSV) technology are used. We give ...

  5. A time-resolved image sensor for tubeless streak cameras

    Science.gov (United States)

    Yasutomi, Keita; Han, SangMan; Seo, Min-Woong; Takasawa, Taishi; Kagawa, Keiichiro; Kawahito, Shoji

    2014-03-01

    This paper presents a time-resolved CMOS image sensor with draining-only modulation (DOM) pixels for tube-less streak cameras. Although the conventional streak camera has high time resolution, it requires high voltage and a bulky system due to its vacuum-tube structure. The proposed time-resolved imager with simple optics realizes a streak camera without any vacuum tubes. The proposed image sensor has DOM pixels, a delay-based pulse generator, and readout circuitry. The delay-based pulse generator, in combination with in-pixel logic, allows us to create and provide a short gating clock to the pixel array. A prototype time-resolved CMOS image sensor with the proposed pixel is designed and implemented using 0.11 μm CMOS image sensor technology. The image array has 30 (vertical) × 128 (memory length) pixels with a pixel pitch of 22.4 μm.

  6. Computational imaging with multi-camera time-of-flight systems

    KAUST Repository

    Shrestha, Shikhar

    2016-07-11

    Depth cameras are a ubiquitous technology used in a wide range of applications, including robotics and machine vision, human-computer interaction, autonomous vehicles, as well as augmented and virtual reality. In this paper, we explore the design and applications of phased multi-camera time-of-flight (ToF) systems. We develop a reproducible hardware system that allows for the exposure times and waveforms of up to three cameras to be synchronized. Using this system, we analyze waveform interference between multiple light sources in ToF applications and propose simple solutions to this problem. Building on the concept of orthogonal frequency design, we demonstrate state-of-the-art results for instantaneous radial velocity capture via Doppler time-of-flight imaging and we explore new directions for optically probing global illumination, for example by de-scattering dynamic scenes and by non-line-of-sight motion detection via frequency gating. © 2016 ACM.

  7. Initial Demonstration of 9-MHz Framing Camera Rates on the FAST UV Drive Laser Pulse Trains

    Energy Technology Data Exchange (ETDEWEB)

    Lumpkin, A. H. [Fermilab; Edstrom Jr., D. [Fermilab; Ruan, J. [Fermilab

    2016-10-09

    We report the configuration of a Hamamatsu C5680 streak camera as a framing camera to record transverse spatial information of green-component laser micropulses at 3- and 9-MHz rates for the first time. The latter is near the time scale of the ~7.5-MHz revolution frequency of the Integrable Optics Test Accelerator (IOTA) ring and its expected synchrotron radiation source temporal structure. The 2-D images are recorded with a Gig-E readout CCD camera. We also report a first proof of principle with an OTR source using the linac streak camera in a semi-framing mode.

  8. Small Orbital Stereo Tracking Camera Technology Development

    Science.gov (United States)

    Gagliano, L.; Bryan, T.; MacLeod, T.

    On-orbit small debris tracking and characterization is a technical gap in current national Space Situational Awareness, necessary to safeguard orbital assets and crew; untracked debris poses a major risk of MOD damage to the ISS and Exploration vehicles. In 2015 this technology was added to NASA's Office of the Chief Technologist roadmap. For missions flying in, assembled in, or staging from LEO, the physical threat to vehicle and crew must be quantified in order to design the proper level of MOD impact shielding and appropriate mission design restrictions, and the debris flux and size population must be verified against ground RADAR tracking. Use of the ISS for in-situ orbital debris tracking development provides attitude, power, data, and orbital access without a dedicated spacecraft or restricted operations on board a host vehicle as a secondary payload. The sensor is applicable to in-situ measurement of orbital debris flux and population in other orbits or on other vehicles, could enhance safety on and around the ISS, and some of its technologies are extensible to monitoring extraterrestrial debris as well. To help accomplish this, new technologies must be developed quickly. The Small Orbital Stereo Tracking Camera is one such up-and-coming technology. It consists of flying a pair of intensified megapixel telephoto cameras to evaluate orbital debris (OD) monitoring in proximity to the International Space Station. It will demonstrate on-orbit optical (in-situ) tracking of various-sized objects versus ground RADAR tracking and small-OD models. The cameras are based on flight-proven Advanced Video Guidance Sensor pixel-to-spot algorithms (Orbital Express) and military targeting cameras. By using twin cameras we can provide stereo images for ranging and mission redundancy. When pointed into the orbital velocity vector (RAM), objects approaching or near the stereo camera set can be differentiated from the stars moving upward in the background.
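Stereo ranging from the twin cameras follows the standard triangulation relation Z = f·B/d; a minimal sketch (the parameter values are illustrative, not mission specifications):

```python
def stereo_range(disparity_px, baseline_m, focal_px):
    """Range from stereo disparity: Z = f * B / d.

    disparity_px : pixel disparity of the target between the two cameras
    baseline_m   : separation between the cameras (m)
    focal_px     : focal length expressed in pixels
    Returns range in metres.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

Note that range resolution degrades quadratically with distance, which is why a long baseline and long focal length matter for debris at stand-off ranges.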

  9. An evolution of image source camera attribution approaches.

    Science.gov (United States)

    Jahanirad, Mehdi; Wahab, Ainuddin Wahid Abdul; Anuar, Nor Badrul

    2016-05-01

    researchers, are also critically analysed and further categorised into four different classes, namely, optical aberrations based, sensor camera fingerprints based, processing statistics based and processing regularities based, to present a classification. Furthermore, this paper aims to investigate the challenging problems, and the proposed strategies of such schemes based on the suggested taxonomy to plot an evolution of the source camera attribution approaches with respect to the subjective optimisation criteria over the last decade. The optimisation criteria were determined based on the strategies proposed to increase the detection accuracy, robustness and computational efficiency of source camera brand, model or device attribution. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Comparison of parameters of modern cooled and uncooled thermal cameras

    Science.gov (United States)

    Bareła, Jarosław; Kastek, Mariusz; Firmanty, Krzysztof; Krupiński, Michał

    2017-10-01

    During the design of a system employing thermal cameras one always faces the problem of choosing the camera type best suited for the task. In many cases the choice made is far from optimal, and there are several reasons for that. System designers often favor tried-and-tested solutions they are used to; they do not follow the latest developments in the field of infrared technology, and sometimes their choices are based on prejudice rather than facts. The paper presents the results of measurements of basic parameters of MWIR and LWIR thermal cameras, carried out in a specialized testing laboratory. The measured parameters are decisive in terms of the image quality generated by thermal cameras. All measurements were conducted according to current procedures and standards. However, the camera settings were not optimized for specific test conditions or parameter measurements; instead, the real settings used in normal camera operation were applied, to obtain realistic camera performance figures. For example, there were significant differences between measured values of noise parameters and the catalogue data provided by manufacturers, due to the application of edge-detection filters to increase detection and recognition ranges. The purpose of this paper is to help in choosing the optimal thermal camera for a particular application, answering the question of whether to opt for a cheaper microbolometer device or to apply a slightly better (in terms of specifications) yet more expensive cooled unit. Measurements and analysis were performed by qualified personnel with several dozen years of experience in both designing and testing thermal camera systems with both cooled and uncooled focal plane arrays. Cameras of similar array sizes and optics were compared, and for each tested group the best-performing devices were selected.

  11. Photographic zoom fisheye lens design for DSLR cameras

    Science.gov (United States)

    Yan, Yufeng; Sasian, Jose

    2017-09-01

    Photographic fisheye lenses with fixed focal length for cameras with different sensor formats have been well developed for decades. However, photographic fisheye lenses with variable focal length are rare on the market due in part to the greater design difficulty. This paper presents a large aperture zoom fisheye lens for DSLR cameras that produces both circular and diagonal fisheye imaging for 35-mm sensors and diagonal fisheye imaging for APS-C sensors. The history and optical characteristics of fisheye lenses are briefly reviewed. Then, a 9.2- to 16.1-mm F/2.8 to F/3.5 zoom fisheye lens design is presented, including the design approach and aberration control. Image quality and tolerance performance analysis for this lens are also presented.
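The defining difference between fisheye and conventional imaging can be illustrated with the ideal equidistant projection r = f·θ versus the rectilinear r = f·tan θ; this is a textbook sketch, and real fisheye designs (including zoom designs like the one above) deviate from the ideal equidistant model:

```python
import math

def image_height_mm(theta_deg, f_mm, model="equidistant"):
    """Radial image height for a field angle theta.

    equidistant fisheye: r = f * theta   (theta in radians)
    rectilinear lens   : r = f * tan(theta), diverging at 90 degrees
    """
    th = math.radians(theta_deg)
    if model == "equidistant":
        return f_mm * th
    return f_mm * math.tan(th)
```

The equidistant model keeps the image height finite even at a 90-degree half-field angle, which is why a fisheye can cover a full 180-degree field while a rectilinear lens cannot.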

  12. Establishing imaging sensor specifications for digital still cameras

    Science.gov (United States)

    Kriss, Michael A.

    2007-02-01

    Digital Still Cameras, DSCs, have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it a full-frame CCD, an interline CCD, a CMOS sensor, or the newer Foveon buried-photodiode sensor. There is a strong tendency for consumers to consider only the number of megapixels in a camera and not the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude, and dynamic range. This paper provides a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range, and exposure latitude based on the physical nature of the imaging optics and the sensor characteristics (including pixel size, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full-well capacity in terms of electrons per square centimeter). Examples are given for consumer, prosumer, and professional camera systems. Where possible, these results are compared to imaging systems currently on the market.
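The abstract's notion of intrinsic full-well capacity per unit area translates into per-pixel dynamic range as sketched below; this is a simplified model (ignoring dark current and photon shot noise) with illustrative numbers, not the paper's software:

```python
import math

def sensor_dynamic_range(pixel_pitch_um, full_well_per_cm2, read_noise_e):
    """Per-pixel full well from intrinsic capacity per area, and the
    resulting dynamic range.

    pixel_pitch_um    : pixel pitch in micrometres
    full_well_per_cm2 : intrinsic full-well capacity, electrons per cm^2
    read_noise_e      : read noise in electrons (RMS)
    Returns (full_well_electrons, dynamic_range_dB, dynamic_range_stops).
    """
    area_cm2 = (pixel_pitch_um * 1e-4) ** 2   # 1 um = 1e-4 cm
    full_well = full_well_per_cm2 * area_cm2
    dr = full_well / read_noise_e
    return full_well, 20 * math.log10(dr), math.log2(dr)
```

This makes concrete why shrinking pixels at a fixed per-area capacity costs dynamic range: halving the pitch quarters the full well while read noise stays roughly constant.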

  13. Payload topography camera of Chang'e-3

    International Nuclear Information System (INIS)

    Yu, Guo-Bin; Liu, En-Hai; Zhao, Ru-Jin; Zhong, Jie; Zhou, Xiang-Dong; Zhou, Wu-Lin; Wang, Jin; Chen, Yuan-Pei; Hao, Yong-Jie

    2015-01-01

    Chang'e-3 was China's first soft-landing lunar probe that achieved a successful roving exploration on the Moon. A topography camera functioning as the lander's “eye” was one of the main scientific payloads installed on the lander. It was composed of a camera probe, an electronic component that performed image compression, and a cable assembly. Its exploration mission was to obtain optical images of the lunar topography in the landing zone for investigation and research. It also observed rover movement on the lunar surface and finished taking pictures of the lander and rover. After starting up successfully, the topography camera obtained static images and video of rover movement from different directions, 360° panoramic pictures of the lunar surface around the lander from multiple angles, and numerous pictures of the Earth. All images of the rover, lunar surface, and the Earth were clear, and those of the Chinese national flag were recorded in true color. This paper describes the exploration mission, system design, working principle, quality assessment of image compression, and color correction of the topography camera. Finally, test results from the lunar surface are provided to serve as a reference for scientific data processing and application. (paper)

  14. Depth profile measurement with lenslet images of the plenoptic camera

    Science.gov (United States)

    Yang, Peng; Wang, Zhaomin; Zhang, Wei; Zhao, Hongying; Qu, Weijuan; Zhao, Haimeng; Asundi, Anand; Yan, Lei

    2018-03-01

    An approach for carrying out depth profile measurement of an object with a plenoptic camera is proposed. A single plenoptic image consists of multiple lenslet images. To begin with, these images are processed directly with a refocusing technique to obtain the depth map, without the need to align and decode the plenoptic image. Then, a linear depth calibration based on the optical structure of the plenoptic camera is applied for depth profile reconstruction. One significant improvement of the proposed method concerns the resolution of the depth map: unlike in the traditional method, the resolution is not limited by the number of microlenses inside the camera, and the depth map can be globally optimized. We validated the method with experiments on depth map reconstruction, depth calibration, and depth profile measurement, with the results indicating that the proposed approach is both efficient and accurate.
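The linear depth calibration step can be sketched with two reference planes at known depths fixing the map from refocus value to metric depth; the variable names and values below are illustrative assumptions, not the authors' calibration data:

```python
import numpy as np

def linear_depth_calibration(v1, d1, v2, d2):
    """Linear map depth = k*v + b from two reference planes:
    refocus values v1, v2 observed at known depths d1, d2."""
    k = (d2 - d1) / (v2 - v1)
    b = d1 - k * v1
    return k, b

def depth_profile(refocus_map, k, b):
    """Apply the calibration to a per-pixel refocus-value map."""
    return k * np.asarray(refocus_map, dtype=float) + b
```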

  15. Applying and extending ISO/TC42 digital camera resolution standards to mobile imaging products

    Science.gov (United States)

    Williams, Don; Burns, Peter D.

    2007-01-01

    There are no fundamental differences between today's mobile telephone cameras and consumer digital still cameras that suggest many existing ISO imaging performance standards do not apply. To the extent that they have lenses, color filter arrays, detectors, apertures, image processing, and are hand held, there really are no operational or architectural differences. Despite this, there are currently differences in the levels of imaging performance. These are driven by physical and economic constraints, and image-capture conditions. Several ISO standards for resolution, well established for consumer digital cameras, require care when applied to the current generation of cell phone cameras. In particular, accommodation of optical flare, shading non-uniformity and distortion are recommended. We offer proposals for the application of existing ISO imaging resolution performance standards to mobile imaging products, and suggestions for extending performance standards to the characteristic behavior of camera phones.

  16. Development and application of an automatic system for measuring the laser camera

    International Nuclear Information System (INIS)

    Feng Shuli; Peng Mingchen; Li Kuncheng

    2004-01-01

    Objective: To provide an automatic system for measuring the imaging quality of laser cameras. Methods: On a dedicated imaging workstation (SGI 540), the procedures were written in Matlab. An automatic measurement and analysis system for laser camera imaging quality was developed according to the laser camera image quality measurement standard of the International Electrotechnical Commission (IEC). The measurement system used digital signal processing theory, was based on the characteristics of digital images, and performed automatic measurement and analysis using the affiliated sample pictures of the laser camera. Results: All imaging-quality parameters of the laser camera, including the H-D and MTF curves, optical density at low, middle, and high resolution, various kinds of geometric distortion, maximum and minimum density, and the dynamic range of the gray scale, could be measured by this system. The system was applied to measure the laser cameras in 20 hospitals in Beijing. The results showed that the system could provide objective, quantitative data, accurately evaluate the imaging quality of a laser camera, and correct results obtained by manual measurement based on the affiliated sample pictures. Conclusion: The automatic measuring system is an effective and objective tool for testing laser camera quality, and it lays a foundation for future research.
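Several of the measured quantities above reduce to optical density, D = -log10(T); a minimal sketch follows (the IEC procedure involves calibrated test patterns, which this illustration does not reproduce):

```python
import numpy as np

def optical_density(transmittance):
    """Film optical density: D = -log10(T), with
    T = I_transmitted / I_incident."""
    return -np.log10(np.asarray(transmittance, dtype=float))

def grayscale_dynamic_range(transmittances):
    """Dynamic range of a gray-scale step wedge as D_max - D_min."""
    d = optical_density(transmittances)
    return float(d.max() - d.min())
```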

  17. Effect of camera temperature variations on stereo-digital image correlation measurements

    KAUST Repository

    Pan, Bing

    2015-11-25

    In laboratory and especially non-laboratory stereo-digital image correlation (stereo-DIC) applications, the extrinsic and intrinsic parameters of the cameras used in the system may change slightly due to the camera warm-up effect and possible variations in ambient temperature. Because these camera parameters are generally calibrated once prior to measurements and considered to be unaltered during the whole measurement period, the changes in these parameters unavoidably induce displacement/strain errors. In this study, the effect of temperature variations on stereo-DIC measurements is investigated experimentally. To quantify the errors associated with camera or ambient temperature changes, surface displacements and strains of a stationary optical quartz glass plate with near-zero thermal expansion were continuously measured using a regular stereo-DIC system. The results confirm that (1) temperature variations in the cameras and ambient environment have a considerable influence on the displacements and strains measured by stereo-DIC due to the slightly altered extrinsic and intrinsic camera parameters; and (2) the corresponding displacement and strain errors correlate with temperature changes. For the specific stereo-DIC configuration used in this work, the temperature-induced strain errors were estimated to be approximately 30–50 με/°C. To minimize the adverse effect of camera temperature variations on stereo-DIC measurements, two simple but effective solutions are suggested.

  18. Effect of camera temperature variations on stereo-digital image correlation measurements

    KAUST Repository

    Pan, Bing; Shi, Wentao; Lubineau, Gilles

    2015-01-01

    In laboratory and especially non-laboratory stereo-digital image correlation (stereo-DIC) applications, the extrinsic and intrinsic parameters of the cameras used in the system may change slightly due to the camera warm-up effect and possible variations in ambient temperature. Because these camera parameters are generally calibrated once prior to measurements and considered to be unaltered during the whole measurement period, the changes in these parameters unavoidably induce displacement/strain errors. In this study, the effect of temperature variations on stereo-DIC measurements is investigated experimentally. To quantify the errors associated with camera or ambient temperature changes, surface displacements and strains of a stationary optical quartz glass plate with near-zero thermal expansion were continuously measured using a regular stereo-DIC system. The results confirm that (1) temperature variations in the cameras and ambient environment have a considerable influence on the displacements and strains measured by stereo-DIC due to the slightly altered extrinsic and intrinsic camera parameters; and (2) the corresponding displacement and strain errors correlate with temperature changes. For the specific stereo-DIC configuration used in this work, the temperature-induced strain errors were estimated to be approximately 30–50 με/°C. To minimize the adverse effect of camera temperature variations on stereo-DIC measurements, two simple but effective solutions are suggested.

  19. High dynamic range image acquisition based on multiplex cameras

    Science.gov (United States)

    Zeng, Hairui; Sun, Huayan; Zhang, Tinghua

    2018-03-01

    High-dynamic-range (HDR) imaging is an important technology for photoelectric information acquisition: it provides a higher dynamic range and more image detail, and better reflects the light and color of the real environment. Current methods that synthesize an HDR image from a sequence of differently exposed images cannot adapt to dynamic scenes: they fail to overcome the effects of moving targets, resulting in ghosting artifacts. Therefore, a new HDR image acquisition method based on a multiplex camera system was proposed. First, sequences of differently exposed images were captured with the camera array, the deviation between images was obtained using a derivative optical-flow method based on color gradients, and the images were aligned. Then, an HDR image fusion weighting function was established by combining the inverse camera response function with the deviation between images, and was applied to generate an HDR image. The experiments show that the proposed method can effectively obtain HDR images in dynamic scenes and achieves good results.
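
The fusion step described above can be sketched as follows, assuming for simplicity a linear inverse camera response and a hat-shaped weighting function (both common defaults; the paper's actual weighting also folds in the inter-image deviation):

```python
import numpy as np

# Sketch of weighted HDR radiance fusion from an exposure sequence. Each
# pixel's contributions are weighted (mid-tones trusted most) and the inverse
# camera response function -- assumed linear here -- maps pixel values back
# to relative radiance. Function and variable names are illustrative.

def fuse_hdr(images, exposure_times):
    """images: list of float arrays in [0, 1]; returns a relative radiance map."""
    acc = np.zeros_like(images[0])
    wsum = np.zeros_like(images[0])
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # hat weight: 1 at mid-gray, 0 at extremes
        radiance = img / t                   # linear inverse-CRF assumption
        acc += w * radiance
        wsum += w
    return acc / np.maximum(wsum, 1e-8)

short = np.array([[0.1, 0.5]])
long_ = np.array([[0.4, 1.0]])   # second pixel saturates in the long exposure
hdr = fuse_hdr([short, long_], exposure_times=[0.01, 0.04])
print(hdr)  # [[10. 50.]]: the saturated pixel is recovered from the short frame
```

Note how the zero weight at saturation makes the long exposure drop out of the second pixel, which is exactly the mechanism that preserves highlight detail.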

  20. Electro-optic control of photographic imaging quality through ‘Smart Glass’ windows in optics demonstrations

    Science.gov (United States)

    Ozolinsh, Maris; Paulins, Paulis

    2017-09-01

    An experimental setup allowing the modeling of conditions in optical devices and in the eye at various degrees of scattering such as cataract pathology in human eyes is presented. The scattering in cells of polymer-dispersed liquid crystals (PDLCs) and ‘Smart Glass’ windows is used in the modeling experiments. Both applications are used as optical obstacles placed in different positions of the optical information flow pathway either directly on the stimuli demonstration computer screen or mounted directly after the image-formation lens of a digital camera. The degree of scattering is changed continuously by applying an AC voltage of up to 30–80 V to the PDLC cell. The setup uses a camera with 14 bit depth and a 24 mm focal length lens. Light-emitting diodes and diode-pumped solid-state lasers emitting radiation of different wavelengths are used as portable small-divergence light sources in the experiments. Image formation, optical system point spread function, modulation transfer functions, and system resolution limits are determined for such sample optical systems in student optics and optometry experimental exercises.

  1. Electro-optic control of photographic imaging quality through ‘Smart Glass’ windows in optics demonstrations

    International Nuclear Information System (INIS)

    Ozolinsh, Maris; Paulins, Paulis

    2017-01-01

    An experimental setup allowing the modeling of conditions in optical devices and in the eye at various degrees of scattering such as cataract pathology in human eyes is presented. The scattering in cells of polymer-dispersed liquid crystals (PDLCs) and ‘Smart Glass’ windows is used in the modeling experiments. Both applications are used as optical obstacles placed in different positions of the optical information flow pathway either directly on the stimuli demonstration computer screen or mounted directly after the image-formation lens of a digital camera. The degree of scattering is changed continuously by applying an AC voltage of up to 30–80 V to the PDLC cell. The setup uses a camera with 14 bit depth and a 24 mm focal length lens. Light-emitting diodes and diode-pumped solid-state lasers emitting radiation of different wavelengths are used as portable small-divergence light sources in the experiments. Image formation, optical system point spread function, modulation transfer functions, and system resolution limits are determined for such sample optical systems in student optics and optometry experimental exercises. (paper)

  2. Modulated electron-multiplied fluorescence lifetime imaging microscope: all-solid-state camera for fluorescence lifetime imaging.

    Science.gov (United States)

    Zhao, Qiaole; Schelen, Ben; Schouten, Raymond; van den Oever, Rein; Leenen, René; van Kuijk, Harry; Peters, Inge; Polderdijk, Frank; Bosiers, Jan; Raspe, Marcel; Jalink, Kees; Geert Sander de Jong, Jan; van Geest, Bert; Stoop, Karel; Young, Ian Ted

    2012-12-01

    We have built an all-solid-state camera that is directly modulated at the pixel level for frequency-domain fluorescence lifetime imaging microscopy (FLIM) measurements. This novel camera eliminates the need for an image intensifier through the use of an application-specific charge-coupled device design in a frequency-domain FLIM system. The first stage of evaluation for the camera has been carried out. Camera characteristics such as noise distribution, dark current influence, camera gain, sampling density, sensitivity, linearity of photometric response, and optical transfer function have been studied through experiments. We are able to perform lifetime measurements using our modulated electron-multiplied fluorescence lifetime imaging microscope (MEM-FLIM) camera for various objects, e.g., fluorescein solution, fixed green fluorescent protein (GFP) cells, and GFP-actin stained live cells. A detailed comparison of a conventional microchannel plate (MCP)-based FLIM system and the MEM-FLIM system is presented. The MEM-FLIM camera shows higher resolution and better image quality. The MEM-FLIM camera provides a new opportunity for performing frequency-domain FLIM.
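
For context, frequency-domain FLIM recovers lifetimes from the phase shift and demodulation of the emission relative to the modulated excitation. A minimal sketch of the standard phase and modulation estimators (textbook relations, not specific to the MEM-FLIM camera; the fluorescein lifetime and modulation frequency are illustrative):

```python
import math

# Standard frequency-domain FLIM estimators: a single-exponential fluorophore
# at angular frequency omega = 2*pi*f shows phase shift phi and demodulation m,
# from which the lifetime tau can be recovered in two independent ways.

def tau_phase(phi_rad, f_hz):
    """Phase lifetime: tau = tan(phi) / (2*pi*f)."""
    return math.tan(phi_rad) / (2 * math.pi * f_hz)

def tau_modulation(m, f_hz):
    """Modulation lifetime: tau = sqrt(1/m^2 - 1) / (2*pi*f)."""
    return math.sqrt(1.0 / m**2 - 1.0) / (2 * math.pi * f_hz)

# Round trip for fluorescein (tau ~ 4 ns) at 40 MHz modulation:
f = 40e6
tau = 4e-9
phi = math.atan(2 * math.pi * f * tau)                  # forward model
m = 1.0 / math.sqrt(1 + (2 * math.pi * f * tau) ** 2)   # forward model
print(tau_phase(phi, f), tau_modulation(m, f))  # both recover ~4e-9 s
```

For a single-exponential decay the two estimates agree; a mismatch between them is the usual first hint of multi-exponential behavior.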

  3. Feasibility study of a lens-coupled charge-coupled device gamma camera

    International Nuclear Information System (INIS)

    Lee, Hakjae; Jung, Youngjun; Kim, Jungmin; Bae, Seungbin; Lee, Kisung; Kang, Jungwon

    2011-01-01

    A charge-coupled device (CCD) is generally used in digital cameras as a light-collecting device, analogous to a photomultiplier tube (PMT). Because of their low sensitivity and very high dark current, CCDs have not been widely used in gamma imaging systems. However, recent technological breakthroughs have improved CCD sensitivity, and a Peltier cooling system can significantly reduce the dark current. In this study, we investigated the feasibility of a prototype CCD gamma camera consisting of a CsI scintillator, optical lenses, and a CCD module. Although electron-multiplying (EM) CCDs offer higher performance, we built a cost-effective system consisting of low-cost components compared with EMCCDs. Our prototype detector consists of a CsI scintillator, two optical lenses, and a conventional Peltier-cooled CCD. The performance of this detector was evaluated in terms of sensitivity, resolution, and the modulation transfer function (MTF). The sensitivity of the prototype detector showed excellent linearity. With a 1-mm-diameter pinhole collimator, the full width at half-maximum (FWHM) of a 1.1-mm Tc-99m line-source image was 2.85 mm. These results show that the developed prototype camera is feasible for small-animal gamma imaging.
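
The FWHM figure quoted above can be extracted from a line-source profile by interpolating the half-maximum crossings. A minimal sketch on a synthetic Gaussian profile (the grid and profile are illustrative, not the study's data):

```python
import numpy as np

# Sketch: FWHM of a line-source image profile via linear interpolation at the
# half-maximum crossings, as one might do when characterizing a
# pinhole-collimated gamma camera.

def fwhm(x, y):
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i0, i1 = above[0], above[-1]
    # interpolate the left and right half-maximum crossings
    xl = np.interp(half, [y[i0 - 1], y[i0]], [x[i0 - 1], x[i0]])
    xr = np.interp(half, [y[i1 + 1], y[i1]], [x[i1 + 1], x[i1]])
    return xr - xl

x = np.linspace(-10, 10, 2001)       # position, mm
sigma = 2.85 / 2.355                 # Gaussian with FWHM = 2.85 mm
y = np.exp(-0.5 * (x / sigma) ** 2)
print(round(fwhm(x, y), 2))          # recovers ~2.85
```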

  4. Mechanically assisted liquid lens zoom system for mobile phone cameras

    Science.gov (United States)

    Wippermann, F. C.; Schreiber, P.; Bräuer, A.; Berge, B.

    2006-08-01

    Camera systems with a small form factor are an integral part of today's mobile phones, which recently feature auto focus functionality. Ready-to-market solutions without moving parts have been developed using electrowetting technology. Besides virtually no deterioration, easy control electronics, and simple and therefore cost-effective fabrication, this type of liquid lens enables extremely fast settling times compared to mechanical approaches. As a next evolutionary step, mobile phone cameras will be equipped with zoom functionality. We present first-order considerations for the optical design of a miniaturized zoom system based on liquid lenses and compare it to its mechanical counterpart. We propose a design for a zoom lens with a zoom factor of 2.5, considering state-of-the-art commercially available liquid lens products. The lens possesses auto focus capability and is based on liquid lenses and one additional mechanical actuator. The combination of liquid lenses and a single mechanical actuator enables extremely short settling times of about 20 ms for the auto focus and a simplified mechanical system design, leading to lower production cost and longer lifetime. The camera system has a mechanical outline of 24 mm in length and 8 mm in diameter. The lens, with f/# 3.5, provides market-relevant optical performance and is designed for an image circle of 6.25 mm (1/2.8" format sensor).
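
The first-order behaviour of such a zoom can be sketched with the thin-lens combination formula: tuning the liquid-lens focal lengths at a fixed separation changes the effective focal length, and the zoom factor is the ratio of the extreme values. The focal lengths below are illustrative, not the paper's design:

```python
# First-order sketch: effective focal length (EFL) of two thin lenses
# separated by distance d, 1/f = 1/f1 + 1/f2 - d/(f1*f2). In a liquid-lens
# zoom, f1 and/or f2 are tuned electrically while d stays fixed; the zoom
# factor is the ratio of the extreme EFLs. Values are illustrative (mm).

def efl(f1, f2, d):
    return 1.0 / (1.0 / f1 + 1.0 / f2 - d / (f1 * f2))

wide = efl(f1=10.0, f2=-25.0, d=4.0)   # negative rear lens at weak power
tele = efl(f1=10.0, f2=-12.0, d=4.0)   # same lens tuned to stronger power
print(wide, tele, tele / wide)         # EFL grows, giving ~1.5x zoom here
```

A real design spreads the power change over more surfaces and adds the mechanical actuator described above to hold the image plane fixed across the zoom range.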

  5. Computational cameras for moving iris recognition

    Science.gov (United States)

    McCloskey, Scott; Venkatesha, Sharath

    2015-05-01

    Iris-based biometric identification is increasingly used for facility access and other security applications. Like all methods that exploit visual information, however, iris systems are limited by the quality of captured images. Optical defocus due to a small depth of field (DOF) is one such challenge, as is the acquisition of sharply-focused iris images from subjects in motion. This manuscript describes the application of computational motion-deblurring cameras to the problem of moving iris capture, from the underlying theory to system considerations and performance data.
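
A classic building block behind such computational deblurring is Wiener deconvolution with a known (e.g. coded-exposure) blur kernel. A minimal 1-D sketch under that assumption, not the system described in the manuscript:

```python
import numpy as np

# Sketch: Wiener deconvolution of 1-D horizontal motion blur. The blur kernel
# is assumed known (e.g. from a coded or parabolic exposure); `nsr` is a
# noise-to-signal tuning constant. Illustrative only.

def wiener_deblur(blurred, kernel, nsr=1e-3):
    n = blurred.shape[-1]
    H = np.fft.fft(kernel, n)
    G = np.fft.fft(blurred, axis=-1)
    F = G * np.conj(H) / (np.abs(H) ** 2 + nsr)   # Wiener filter in frequency domain
    return np.real(np.fft.ifft(F, axis=-1))

sharp = np.zeros(64)
sharp[30] = 1.0                                   # point target
kernel = np.ones(5) / 5.0                         # 5-pixel box (motion) blur
blurred = np.real(np.fft.ifft(np.fft.fft(sharp) * np.fft.fft(kernel, 64)))
restored = wiener_deblur(blurred, kernel)
print(int(np.argmax(restored)))                   # peak restored at index 30
```

The coded-exposure idea in the literature exists precisely to keep the kernel's spectrum away from zeros, so the division above stays well conditioned.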

  6. Soft x-ray camera for internal shape and current density measurements on a noncircular tokamak

    International Nuclear Information System (INIS)

    Fonck, R.J.; Jaehnig, K.P.; Powell, E.T.; Reusch, M.; Roney, P.; Simon, M.P.

    1988-05-01

    Soft x-ray measurements of the internal plasma flux surface shapes in principle allow a determination of the plasma current density distribution, and provide a necessary monitor of the degree of internal elongation of tokamak plasmas with a noncircular cross section. A two-dimensional, tangentially viewing, soft x-ray pinhole camera has been fabricated to provide internal shape measurements on the PBX-M tokamak. It consists of a scintillator at the focal plane of a foil-filtered pinhole camera, which is, in turn, fiber-optically coupled to an intensified framing video camera (Δt ≥ 3 msec). Automated data acquisition is performed on a stand-alone image-processing system, and data archiving and retrieval take place on an optical disk video recorder. The entire diagnostic is controlled via a PDP-11/73 microcomputer. The derivation of the poloidal emission distribution from the measured image is done by fitting to model profiles. 10 refs., 4 figs

  7. SFR test fixture for hemispherical and hyperhemispherical camera systems

    Science.gov (United States)

    Tamkin, John M.

    2017-08-01

    Optical testing of camera systems in volume production environments can often require expensive tooling and test fixturing. Wide field (fish-eye, hemispheric and hyperhemispheric) optical systems create unique challenges because of the inherent distortion, and difficulty in controlling reflections from front-lit high resolution test targets over the hemisphere. We present a unique design for a test fixture that uses low-cost manufacturing methods and equipment such as 3D printing and an Arduino processor to control back-lit multi-color (VIS/NIR) targets and sources. Special care with LED drive electronics is required to accommodate both global and rolling shutter sensors.

  8. Development of underwater camera using high-definition camera

    International Nuclear Information System (INIS)

    Tsuji, Kenji; Watanabe, Masato; Takashima, Masanobu; Kawamura, Shingo; Tanaka, Hiroyuki

    2012-01-01

    In order to reduce the time for core verification and visual inspection of BWR fuels, an underwater camera based on a high-definition camera has been developed. The resulting underwater camera has two lights, dimensions of 370 x 400 x 328 mm, and a weight of 20.5 kg. Using the camera, about six spent-fuel IDs can be identified at a time from a distance of 1 to 1.5 m, and a 0.3-mmφ pin-hole can be recognized at a 1.5 m distance with 20x zoom. Radiation fields below 15 Gy/h did not affect the images. (author)

  9. The development of high-speed 100 fps CCD camera

    International Nuclear Information System (INIS)

    Hoffberg, M.; Laird, R.; Lenkzsus, F.; Liu, C.; Rodricks, B.

    1997-01-01

    This paper describes the development of a high-speed CCD digital camera system. The system has been designed to use CCDs from various manufacturers with minimal modifications. The first camera built on this design utilizes a Thomson 512 x 512 pixel CCD as its sensor, which is read out from two parallel outputs at a speed of 15 MHz/pixel/output. The data undergo correlated double sampling, after which they are digitized into 12 bits. The throughput of the system translates into 60 MB/second, which is either stored directly in a PC or transferred to a custom-designed VXI module. The PC data-acquisition version of the camera can collect sustained data in real time, limited only by the memory installed in the PC. The VXI version of the camera, also controlled by a PC, stores 512 MB of real-time data before it must be read out to PC disk storage. The uncooled CCD can be used either with lenses for visible-light imaging or with a phosphor screen for X-ray imaging. This camera has been tested with a phosphor screen coupled to a fiber-optic faceplate for high-resolution, high-speed X-ray imaging. The camera is controlled through a custom event-driven user-friendly Windows package. The pixel clock speed can be changed from 1 to 15 MHz. The noise was measured to be 1.05 bits at a 13.3 MHz pixel clock. This paper will describe the electronics, software, and characterizations that have been performed using both visible and X-ray photons. (orig.)
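
The quoted readout numbers are mutually consistent, as a quick check shows (assuming the 12-bit samples are carried in 16-bit words, which matches the stated 60 MB/second):

```python
# Sanity-check sketch of the numbers quoted above: two parallel outputs read
# at 15 MHz/pixel each, with 12-bit samples stored in 16-bit words
# (an assumption), give ~60 MB/s; a 512 x 512 frame then reads out in
# ~8.7 ms, consistent with ~100 frames/s after overheads.

outputs = 2
pixel_rate_hz = 15e6            # per output
bytes_per_sample = 2            # 12 bits carried in a 16-bit word (assumption)

throughput_mb_s = outputs * pixel_rate_hz * bytes_per_sample / 1e6
frame_time_s = (512 * 512) / (outputs * pixel_rate_hz)
print(throughput_mb_s, 1.0 / frame_time_s)   # 60.0 MB/s, ~114 raw frames/s
```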

  10. A method of camera calibration in the measurement process with reference mark for approaching observation space target

    Science.gov (United States)

    Zhang, Hua; Zeng, Luan

    2017-11-01

    Binocular stereoscopic vision can be used for space-based close-range observation of space targets. In order to solve the problem that a traditional binocular vision system cannot work normally after a disturbance, an online calibration method for a binocular stereo measuring camera with a self-reference standard is proposed. The method uses an auxiliary optical imaging device to insert the image of a standard reference object into the edge of the main optical path, imaging it on the same focal plane as the target; this is equivalent to placing a standard reference inside the binocular imaging optical system. When the position of the system or the parameters of the imaging device are disturbed, the image of the standard reference changes accordingly in the imaging plane, while the position of the standard reference object itself does not change. The camera's external parameters can then be re-calibrated from the visual relationship of the standard reference object. The experimental results show that the maximum mean square error for the same object can be reduced from 72.88 mm to 1.65 mm when the right camera is deflected by 0.4° and the left camera is rotated by 0.2° in pitch. This method realizes online calibration of a binocular stereoscopic vision measurement system and can effectively improve the anti-jamming ability of the system.

  11. Scheimpflug camera combined with placido-disk corneal topography and optical biometry for intraocular lens power calculation.

    Science.gov (United States)

    Kirgiz, Ahmet; Atalay, Kurşat; Kaldirim, Havva; Cabuk, Kubra Serefoglu; Akdemir, Mehmet Orcun; Taskapili, Muhittin

    2017-08-01

    The purpose of this study was to compare the keratometry (K) values obtained by a Scheimpflug camera combined with placido-disk corneal topography (Sirius) and by optical biometry (Lenstar) for intraocular lens (IOL) power calculation before cataract surgery, and to evaluate the accuracy of the postoperative refraction. Fifty eyes of 40 patients were scheduled to have phacoemulsification with implantation of a posterior chamber intraocular lens. The IOL power was calculated using the SRK/T formula with Lenstar K and with K readings from Sirius. Simulated K (SimK) and K at the 3-, 5-, and 7-mm zones from Sirius were compared with the Lenstar K readings. The accuracy of these parameters was determined by calculating the mean absolute error (MAE). The mean Lenstar K value was 44.05 diopters (D) ±1.93 (SD), and SimK and K at the 3-, 5-, and 7-mm zones were 43.85 ± 1.91, 43.88 ± 1.9, 43.84 ± 1.9, and 43.66 ± 1.85 D, respectively. There was no statistically significant difference between the K readings (P = 0.901). When Lenstar was used for the corneal power measurements, the MAE was 0.42 ± 0.33 D; when SimK from Sirius was used, it was 0.37 ± 0.32 D, with the lowest MAE (0.36 ± 0.32 D) achieved with the 5-mm K measurement, but the difference was not statistically significant (P = 0.892). Of all the K readings from Sirius and Lenstar, the Sirius 5-mm zone K readings were the best at predicting a more precise IOL power. Corneal power measurements with a Scheimpflug camera combined with placido-disk corneal topography can be safely used for IOL power calculation.
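
The MAE criterion used above is simply the mean of the absolute differences between predicted and achieved postoperative refraction. A minimal sketch with illustrative values (not the study's data):

```python
import statistics

# Sketch: mean absolute error (MAE) of refraction prediction, the accuracy
# measure used above. Values are illustrative diopters, not the cohort's data.

def mean_absolute_error(predicted_d, achieved_d):
    return statistics.mean(abs(p - a) for p, a in zip(predicted_d, achieved_d))

predicted = [-0.25, 0.00, -0.50, 0.25]   # predicted postoperative refraction (D)
achieved  = [-0.75, 0.25, -0.25, 0.50]   # achieved spherical equivalent (D)
print(mean_absolute_error(predicted, achieved))  # 0.3125 D
```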

  12. Application of single-image camera calibration for ultrasound augmented laparoscopic visualization.

    Science.gov (United States)

    Liu, Xinyang; Su, He; Kang, Sukryool; Kane, Timothy D; Shekhar, Raj

    2015-03-01

    Accurate calibration of laparoscopic cameras is essential for enabling many surgical visualization and navigation technologies, such as the ultrasound-augmented visualization system that we have developed for laparoscopic surgery. In addition to accuracy and robustness, there is a practical need for a fast and easy camera calibration method that can be performed on demand in the operating room (OR). Conventional camera calibration methods are not suitable for OR use because they are lengthy and tedious: they require acquisition of multiple images of a target pattern in its entirety to produce satisfactory results. In this work, we evaluated the performance of a single-image camera calibration tool (rdCalib; Percieve3D, Coimbra, Portugal) featuring automatic detection of corner points in the image, whether partial or complete, of a custom target pattern. Intrinsic camera parameters of 5-mm and 10-mm standard Stryker® laparoscopes obtained using rdCalib and the well-accepted OpenCV camera calibration method were compared. Target registration error (TRE) as a measure of camera calibration accuracy for our optical tracking-based AR system was also compared between the two calibration methods. Based on our experiments, the single-image camera calibration yields consistent and accurate results (mean TRE = 1.18 ± 0.35 mm for the 5-mm scope and mean TRE = 1.13 ± 0.32 mm for the 10-mm scope), which are comparable to the results obtained using the OpenCV method with 30 images. The new single-image camera calibration method is promising for application to our augmented reality visualization system for laparoscopic surgery.

  13. Application of single-image camera calibration for ultrasound augmented laparoscopic visualization

    Science.gov (United States)

    Liu, Xinyang; Su, He; Kang, Sukryool; Kane, Timothy D.; Shekhar, Raj

    2015-03-01

    Accurate calibration of laparoscopic cameras is essential for enabling many surgical visualization and navigation technologies, such as the ultrasound-augmented visualization system that we have developed for laparoscopic surgery. In addition to accuracy and robustness, there is a practical need for a fast and easy camera calibration method that can be performed on demand in the operating room (OR). Conventional camera calibration methods are not suitable for OR use because they are lengthy and tedious: they require acquisition of multiple images of a target pattern in its entirety to produce satisfactory results. In this work, we evaluated the performance of a single-image camera calibration tool (rdCalib; Percieve3D, Coimbra, Portugal) featuring automatic detection of corner points in the image, whether partial or complete, of a custom target pattern. Intrinsic camera parameters of 5-mm and 10-mm standard Stryker® laparoscopes obtained using rdCalib and the well-accepted OpenCV camera calibration method were compared. Target registration error (TRE) as a measure of camera calibration accuracy for our optical tracking-based AR system was also compared between the two calibration methods. Based on our experiments, the single-image camera calibration yields consistent and accurate results (mean TRE = 1.18 ± 0.35 mm for the 5-mm scope and mean TRE = 1.13 ± 0.32 mm for the 10-mm scope), which are comparable to the results obtained using the OpenCV method with 30 images. The new single-image camera calibration method is promising for application to our augmented reality visualization system for laparoscopic surgery.
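
Reduced to its core, a TRE-style assessment projects known 3-D target points through the estimated camera model and compares against their reference positions. A synthetic sketch (the intrinsic matrices, points, and pixel-to-mm scale are all illustrative, not from either calibration tool):

```python
import numpy as np

# Sketch: TRE-style comparison of two pinhole intrinsic estimates. Known 3-D
# points are projected with a reference and an estimated intrinsic matrix,
# and the pixel discrepancies are scaled to mm at the target. Synthetic data.

def project(K, pts3d):
    """Pinhole projection of Nx3 points with intrinsics K; returns Nx2 pixels."""
    uv = (K @ pts3d.T).T
    return uv[:, :2] / uv[:, 2:3]

K_ref = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
K_est = np.array([[805.0, 0.0, 322.0], [0.0, 795.0, 239.0], [0.0, 0.0, 1.0]])

pts = np.array([[0.0, 0.0, 100.0], [10.0, -5.0, 120.0], [-8.0, 6.0, 90.0]])
err_px = np.linalg.norm(project(K_ref, pts) - project(K_est, pts), axis=1)
mm_per_px = 0.05                                  # assumed scale at the target
print(f"TRE = {err_px.mean() * mm_per_px:.2f} +/- {err_px.std() * mm_per_px:.2f} mm")
```

The full system in the paper additionally folds in the optical-tracking chain; this sketch isolates only the intrinsics' contribution.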

  14. Accuracy of Corneal Thickness by Swept-Source Optical Coherence Tomography and Scheimpflug Camera in Virgin and Treated Fuchs Endothelial Dystrophy.

    Science.gov (United States)

    Arnalich-Montiel, Francisco; Ortiz-Toquero, Sara; Auladell, Clara; Couceiro, Ana

    2018-06-01

    To assess intraobserver repeatability, intersession reproducibility, and agreement of swept-source Fourier-domain optical coherence tomography (SS-OCT) and the Scheimpflug camera in measuring corneal thickness in virgin and grafted eyes with Fuchs endothelial corneal dystrophy (FECD). Thirty-six control eyes, 35 FECD eyes, 30 FECD eyes with corneal edema, 25 Descemet stripping automated endothelial keratoplasty (DSAEK) eyes, and 29 Descemet membrane endothelial keratoplasty (DMEK) eyes were included. The apical center, pupillary center, and thinnest corneal thickness were determined in 3 consecutive images, and the measurements were repeated 2 weeks later. Repeatability and reproducibility coefficients, intraclass correlation coefficients, and 95% limits of agreement (LOA) between measurements were calculated. Agreement between devices was assessed using Bland-Altman analysis. Corneal thickness measurements were highly reproducible and repeatable with both systems. SS-OCT showed better repeatability in all corneal locations in the normal, FECD, FECD with edema, DSAEK, and DMEK groups (coefficient of variation ≤0.60%, ≤0.36%, ≤0.43%, ≤1.09%, and ≤0.48%, respectively) than the Scheimpflug camera (coefficient of variation ≤1.15%, ≤0.92%, ≤1.10%, ≤1.25%, and ≤1.14%, respectively). The between-session 95% LOA for SS-OCT was less than 3% for all groups except the FECD with edema group, and was almost double using the Scheimpflug camera. Differences between instruments were statistically significant in all groups and locations except one group (P ≤ 0.51); however, SS-OCT underestimated all measurements. SS-OCT provides more reproducible and repeatable measurements of corneal thickness than those obtained with the Scheimpflug camera in patients with FECD or an endothelial transplant. Variations between examinations higher than the 95% LOA observed in our study should raise awareness of changes in endothelial function.
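
The coefficient of variation used above to express repeatability is the standard deviation of repeated readings as a percentage of their mean. A minimal sketch with illustrative pachymetry values (not the study's data):

```python
import statistics

# Sketch: within-session coefficient of variation (CoV) from repeated
# corneal thickness readings, expressed as a percentage of the mean.
# Readings below are illustrative pachymetry values in micrometers.

def cov_percent(readings):
    return 100.0 * statistics.stdev(readings) / statistics.mean(readings)

three_scans = [541.0, 543.0, 542.0]   # three consecutive acquisitions
print(round(cov_percent(three_scans), 2))  # 0.18 (%), i.e. sub-percent repeatability
```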

  15. The use of optical scanning for analysis of casting shape

    Directory of Open Access Journals (Sweden)

    M. Wieczorowski

    2011-04-01

    In the paper, the use of optical scanning for inspection of casting shape and its accuracy is described. The optical system applied to the digitization of objects determines all dimensions and the shape of the inspected object. This technology is used in quality control and reverse engineering. The system is based on triangulation: the sensor head projects different patterns of fringes onto the measured object, and the scanner tracks their distribution with two cameras. Based on optical transform equations, a processing unit automatically and with remarkable accuracy calculates 3D coordinates for every pixel of the camera. Depending on camera resolution, the result of such a scan is a cloud of points with up to 5 million points for every image. In the paper, examples of applications for castings with different designations are presented.
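
The triangulation at the heart of such a scanner can be sketched as a linear (DLT) two-view intersection: each matched pixel contributes two rows to a homogeneous system whose least-squares solution is the 3D point. The projection matrices and point below are synthetic, not the scanner's calibration:

```python
import numpy as np

# Sketch: linear (DLT) triangulation of a 3-D point from a matched pixel in
# two calibrated views, the core operation of a fringe-projection scanner.

def triangulate(P1, P2, uv1, uv2):
    """P1, P2: 3x4 projection matrices; uv1, uv2: matched image points."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)        # least-squares null vector
    X = vt[-1]
    return X[:3] / X[3]                # de-homogenize

# Two ideal unit-focal cameras: identity pose and a 0.2 m baseline along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
X_true = np.array([0.05, -0.02, 1.0])
uv1 = X_true[:2] / X_true[2]
uv2 = (X_true[:2] + np.array([-0.2, 0.0])) / X_true[2]
print(triangulate(P1, P2, uv1, uv2))   # recovers ~[0.05, -0.02, 1.0]
```

With noisy pixels the same SVD solve returns the algebraic least-squares intersection, which is why dense per-pixel clouds of millions of points are computationally tractable.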

  16. Sub-Camera Calibration of a Penta-Camera

    Science.gov (United States)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras, consisting of a nadir and four inclined cameras, are becoming more and more popular, having the advantage of also imaging facades in built-up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi-platform photogrammetry, the inner orientation of the IGI Penta DigiCAM used has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed by the program system BLUH. With 4.1 million image points in 314 images and 3.9 million image points in 248 images, respectively, a dense matching was provided by Pix4Dmapper. With up to 19 and 29 images per object point, respectively, the images are well connected; nevertheless, the high number of images per object point is concentrated in the block centres, while the inclined images outside the block centre are satisfactorily but not very strongly connected. This leads to very high values for the Student test (T-test) of the finally used additional parameters, or in other words, the additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration by IGI, but there are still radial symmetric distortions for the inclined cameras as well, with a size exceeding 5 μm, even if described as negligible based on the laboratory calibration. Radial and tangential effects at the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen, especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but they can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With exception of the angular affinity the systematic image errors for corresponding

  17. A randomized comparison of laparoscopic, flexible endoscopic, and wired and wireless magnetic cameras on ex vivo and in vivo NOTES surgical performance.

    Science.gov (United States)

    Chang, Victoria C; Tang, Shou-Jiang; Swain, C Paul; Bergs, Richard; Paramo, Juan; Hogg, Deborah C; Fernandez, Raul; Cadeddu, Jeffrey A; Scott, Daniel J

    2013-08-01

    The influence of endoscopic video camera (VC) image quality on surgical performance has not been studied. Flexible endoscopes are used as substitutes for laparoscopes in natural orifice translumenal endoscopic surgery (NOTES), but their optics are originally designed for intralumenal use. Manipulable wired or wireless independent VCs might offer advantages for NOTES but are still under development. To measure the optical characteristics of 4 VC systems and to compare their impact on the performance of surgical suturing tasks. VC systems included a laparoscope (Storz 10 mm), a flexible endoscope (Olympus GIF 160), and 2 prototype deployable cameras (magnetic anchoring and guidance system [MAGS] Camera and PillCam). In a randomized fashion, the 4 systems were evaluated regarding standardized optical characteristics and surgical manipulations of previously validated ex vivo (fundamentals of laparoscopic surgery model) and in vivo (live porcine Nissen model) tasks; objective metrics (time and errors/precision) and combined surgeon (n = 2) performance were recorded. Subtle differences were detected for color tests, and field of view was variable (65°-115°). Suitable resolution was detected up to 10 cm for the laparoscope and MAGS camera but only at closer distances for the endoscope and PillCam. Compared with the laparoscope, surgical suturing performances were modestly lower for the MAGS camera and significantly lower for the endoscope (ex vivo) and PillCam (ex vivo and in vivo). This study documented distinct differences in VC systems that may be used for NOTES in terms of both optical characteristics and surgical performance. Additional work is warranted to optimize cameras for NOTES. Deployable systems may be especially well suited for this purpose.

  18. Relative camera localisation in non-overlapping camera networks using multiple trajectories

    NARCIS (Netherlands)

    John, V.; Englebienne, G.; Kröse, B.J.A.

    2012-01-01

    In this article we present an automatic camera calibration algorithm using multiple trajectories in a multiple camera network with non-overlapping field-of-views (FOV). Visible trajectories within a camera FOV are assumed to be measured with respect to the camera local co-ordinate system.

  19. Measurement methods and accuracy analysis of Chang'E-5 Panoramic Camera installation parameters

    Science.gov (United States)

    Yan, Wei; Ren, Xin; Liu, Jianjun; Tan, Xu; Wang, Wenrui; Chen, Wangli; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    Chang'E-5 (CE-5) is a lunar probe for the third phase of the China Lunar Exploration Project (CLEP), whose main scientific objectives are to implement lunar surface sampling and to return the samples to the Earth. To achieve these goals, investigation of the lunar surface topography and geological structure within the sampling area is extremely important. The Panoramic Camera (PCAM) is one of the payloads mounted on the CE-5 lander. It consists of two optical systems installed on a camera rotating platform. Optical images of the sampling area can be obtained by PCAM in the form of two-dimensional images, and a stereo image pair can be formed from the left and right PCAM images. Lunar terrain can then be reconstructed based on photogrammetry. The installation parameters of PCAM with respect to the CE-5 lander are critical for the calculation of the exterior orientation elements (EO) of PCAM images, which are used for lunar terrain reconstruction. In this paper, the types of PCAM installation parameters and the coordinate systems involved are defined. Measurement methods combining camera images and optical coordinate observations are studied for this work. Research contents such as the observation program and specific solution methods for the installation parameters are then introduced. The parametric solution accuracy is analyzed according to observations obtained in the PCAM scientific validation experiment, which is used to test the authenticity of the PCAM detection process, ground data processing methods, product quality, and so on. The analysis results show that the accuracy of the installation parameters affects the positional accuracy of corresponding image points of PCAM stereo images within 1 pixel. The measurement methods and parameter accuracy studied in this paper therefore meet the needs of engineering and scientific applications. Keywords: Chang'E-5 Mission; Panoramic Camera; Installation Parameters; Total Station; Coordinate Conversion
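
The role of the installation parameters can be illustrated by how rotations chain together: the camera-to-lander rotation from installation calibration composes with the lander's attitude to give the camera's exterior orientation. A toy sketch with illustrative angles and frame names, not the mission's actual conventions:

```python
import numpy as np

# Sketch: chaining installation parameters into exterior orientation. The
# camera-to-lander rotation (from installation calibration) composes with a
# lander-to-site rotation to give the camera's attitude in the site frame.
# Angles and frame names are illustrative assumptions.

def rot_z(a):
    """Rotation about the z-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

R_cam_to_lander = rot_z(np.radians(30.0))     # platform azimuth setting
R_lander_to_site = rot_z(np.radians(-10.0))   # lander yaw on the surface
R_cam_to_site = R_lander_to_site @ R_cam_to_lander

x_cam = np.array([1.0, 0.0, 0.0])
print(R_cam_to_site @ x_cam)   # camera x-axis expressed in the site frame
```

An error in the installation rotation propagates directly into the exterior orientation of every image, which is why the paper ties the allowed parameter error to a sub-pixel budget on the stereo image points.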

  20. Compton camera imaging and the cone transform: a brief overview

    Science.gov (United States)

    Terzioglu, Fatma; Kuchment, Peter; Kunyansky, Leonid

    2018-05-01

    While most of Radon transform applications to imaging involve integrations over smooth sub-manifolds of the ambient space, lately important situations have appeared where the integration surfaces are conical. Three of such applications are single scatter optical tomography, Compton camera medical imaging, and homeland security. In spite of the similar surfaces of integration, the data and the inverse problems associated with these modalities differ significantly. In this article, we present a brief overview of the mathematics arising in Compton camera imaging. In particular, the emphasis is made on the overdetermined data and flexible geometry of the detectors. For the detailed results, as well as other approaches (e.g. smaller-dimensional data or restricted geometry of detectors) the reader is directed to the relevant publications. Only a brief description and some references are provided for the single scatter optical tomography. This work was supported in part by NSF DMS grants 1211463 (the first two authors), 1211521 and 141877 (the third author), as well as a College of Science of Texas A&M University grant.

  1. STREAK CAMERA MEASUREMENTS OF THE APS PC GUN DRIVE LASER

    Energy Technology Data Exchange (ETDEWEB)

    Dooling, J. C.; Lumpkin, A. H.

    2017-06-25

    We report recent pulse-duration measurements of the APS PC Gun drive laser at both second harmonic and fourth harmonic wavelengths. The drive laser is a Nd:Glass-based chirped pulsed amplifier (CPA) operating at an IR wavelength of 1053 nm, twice frequency-doubled to obtain UV output for the gun. A Hamamatsu C5680 streak camera and an M5675 synchroscan unit are used for these measurements; the synchroscan unit is tuned to 119 MHz, the 24th subharmonic of the linac s-band operating frequency. Calibration is accomplished both electronically and optically. Electronic calibration utilizes a programmable delay line in the 119 MHz rf path. The optical delay uses an etalon with known spacing between reflecting surfaces and is coated for the visible, SH wavelength. IR pulse duration is monitored with an autocorrelator. Fitting the streak camera image projected profiles with Gaussians, UV rms pulse durations are found to vary from 2.1 ps to 3.5 ps as the IR varies from 2.2 ps to 5.2 ps.
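The optical calibration described above exploits an etalon of known thickness: successive reflections are delayed by twice the optical path through the etalon, and the pixel spacing of the replica pulses on the streak image gives the time-per-pixel scale. A minimal sketch; the fused-silica refractive index used as a default is an assumption, not a value from the report.

```python
C_MM_PER_PS = 0.299792458  # speed of light in mm/ps

def etalon_delay_ps(spacing_mm, refractive_index=1.457):
    """Round-trip delay between successive etalon reflections:
    dt = 2 n d / c (assumed fused-silica index by default)."""
    return 2.0 * refractive_index * spacing_mm / C_MM_PER_PS

def streak_calibration_ps_per_px(spacing_mm, peak_separation_px, n=1.457):
    """Time calibration from the measured pixel separation of replicas."""
    return etalon_delay_ps(spacing_mm, n) / peak_separation_px

# A 1 mm etalon produces replicas ~9.7 ps apart; 100 px apart on the
# streak image would imply ~0.097 ps per pixel.
cal = streak_calibration_ps_per_px(1.0, 100.0)
```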

  2. Multi-spectral CCD camera system for ocean water color and seacoast observation

    Science.gov (United States)

    Zhu, Min; Chen, Shiping; Wu, Yanlin; Huang, Qiaolin; Jin, Weiqi

    2001-10-01

    One of the Earth-observing instruments on the HY-1 satellite, which will be launched in 2001, is the multi-spectral CCD camera system developed by the Beijing Institute of Space Mechanics & Electricity (BISME), Chinese Academy of Space Technology (CAST). From a 798 km orbit, the system can provide images with 250 m ground resolution and a swath of 500 km. It is mainly used for coastal zone dynamic mapping and ocean water color monitoring, which include pollution of the offshore and coastal zone, plant cover, water color, ice, underwater terrain, suspended sediment, mudflats, soil, and total vapor. The multi-spectral camera system is composed of four monocolor CCD cameras, which are line-array, 'push-broom' scanning cameras, each responding to one of four spectral bands. The camera system adopts field-of-view registration; that is, each camera scans the same region at the same moment. Each camera contains optics, a focal plane assembly, electrical circuits, an installation structure, a calibration system, thermal control, and so on. The primary features of the camera system are: (1) offset of the central wavelength better than 5 nm; (2) degree of polarization less than 0.5%; (3) signal-to-noise ratio of about 1000; (4) dynamic range better than 2000:1; (5) registration precision better than 0.3 pixel; (6) quantization of 12 bits.

  3. Photometric Calibration of Consumer Video Cameras

    Science.gov (United States)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors; the present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used). To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to measure.
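One reason calibration past saturation works for point sources is that the clipped aperture sum keeps growing with source brightness: the saturated core spreads outward while the unsaturated wings continue to scale, so a monotonic calibration curve built from known sources can be inverted. The simulation below is an illustration of that idea only; the Gaussian PSF width, well depth, and flux range are arbitrary choices, not values from the method.

```python
import numpy as np

def clipped_aperture_sum(flux, sigma=2.0, size=41, full_well=255.0):
    """Aperture sum for a Gaussian point source clipped at saturation."""
    y, x = np.mgrid[:size, :size] - size // 2
    psf = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    psf /= psf.sum()
    return float(np.minimum(flux * psf, full_well).sum())

# Calibration curve from sources of known brightness: it remains strictly
# monotonic well past core saturation, so it can be inverted by interpolation.
fluxes = np.logspace(3, 7, 30)
sums = np.array([clipped_aperture_sum(f) for f in fluxes])
flux_estimate = np.interp(clipped_aperture_sum(5.0e5), sums, fluxes)
```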

  4. Design of an experimental four-camera setup for enhanced 3D surface reconstruction in microsurgery

    Directory of Open Access Journals (Sweden)

    Marzi Christian

    2017-09-01

    Full Text Available Future fully digital surgical visualization systems will enable a wide range of new options. Owing to optomechanical limitations, a main disadvantage of today’s surgical microscopes is their inability to provide arbitrary perspectives to more than two observers. In a fully digital microscopic system, multiple arbitrary views can be generated from a 3D reconstruction. Modern surgical microscopes allow the eyepieces to be replaced by cameras in order to record stereoscopic videos. A reconstruction from these videos can only contain the amount of detail the recording camera system gathers from the scene; covered surfaces can therefore result in a faulty reconstruction for deviating stereoscopic perspectives. By adding cameras that record the object from different angles, additional information about the scene is acquired, allowing the reconstruction to be improved. Our approach is to use a fixed four-camera setup as a front-end system to capture enhanced 3D topography of a pseudo-surgical scene. This experimental setup provides images for the reconstruction algorithms and for the generation of multiple observer stereo perspectives. The concept of the setup is based on the common main objective (CMO) principle of current surgical microscopes. These systems are well established and optically mature; furthermore, the CMO principle allows a more compact design and less calibration effort than cameras with separate optics. Behind the CMO, four pupils separate the four channels, each of which is recorded by its own camera. The designed system captures an area of approximately 28 mm × 28 mm with four cameras, allowing images from six different stereo perspectives to be processed. In order to verify the setup, it is modelled in silico. It can be used in further studies to test algorithms for 3D reconstruction from up to four perspectives and to provide information about the impact of additionally recorded perspectives on the enhancement of a reconstruction.

  5. Optics for mobile phone imaging

    Science.gov (United States)

    Vigier-Blanc, Emmanuelle E.

    2004-02-01

    Micro cameras for mobile phones require specific opto-electronic designs that use high-resolution micro-technologies to balance optical, electronic, and mechanical requirements. The purpose of this paper is to present the critical optical parameters of imaging optics embedded in mobile phones. We review the critical parameters of micro optical cameras from the user's point of view, and their interdependence and relative influence on the optical performance of the product: focal length, field of view, and array size; lens speed and depth of field, including what is hidden behind lens speed and how to trade off small aperture, production tolerances, sensitivity, good corner resolution, and large depth of field; relative illumination, the smooth fall-off of intensity toward the edge of the array; resolution, how to measure it, and the interaction of pixel size and small dimensions; sensitivity, ensuring the same sensitivity as the human eye under both twilight and midday sunny conditions; parasitic effects such as flare, glare, and ghost images, and how to avoid them; and how to match the sensor spectrum to the photopic eye curve through IR filtering and color balancing. We then balance these parameters and show how to match market needs while ensuring manufacturability.

  6. Determining fast orientation changes of multi-spectral line cameras from the primary images

    Science.gov (United States)

    Wohlfeil, Jürgen

    2012-01-01

    Fast orientation changes of airborne and spaceborne line cameras cannot always be avoided. In such cases it is essential to measure them with high accuracy to ensure a good quality of the resulting imagery products. Several approaches exist to support the orientation measurement by using optical information received through the main objective/telescope. In this article an approach is proposed that allows the determination of non-systematic orientation changes between every captured line. It does not require any additional camera hardware or onboard processing capabilities but the payload images and a rough estimate of the camera's trajectory. The approach takes advantage of the typical geometry of multi-spectral line cameras with a set of linear sensor arrays for different spectral bands on the focal plane. First, homologous points are detected within the heavily distorted images of different spectral bands. With their help a connected network of geometrical correspondences can be built up. This network is used to calculate the orientation changes of the camera with the temporal and angular resolution of the camera. The approach was tested with an extensive set of aerial surveys covering a wide range of different conditions and achieved precise and reliable results.

  7. Optical Arc-Length Sensor For TIG Welding

    Science.gov (United States)

    Smith, Matthew A.

    1990-01-01

    A proposed subsystem of a tungsten/inert-gas (TIG) welding system measures the length of the welding arc optically. The arc is viewed by a video camera in one of three alternative optical configurations, and the arc length is measured directly rather than inferred from the arc voltage.

  8. Application of phase matching autofocus in airborne long-range oblique photography camera

    Science.gov (United States)

    Petrushevsky, Vladimir; Guberman, Asaf

    2014-06-01

    The Condor2 long-range oblique photography (LOROP) camera is mounted in an aerodynamically shaped pod carried by a fast jet aircraft. The large-aperture, dual-band (EO/MWIR) camera is equipped with TDI focal plane arrays and provides high-resolution imagery of extended areas at long stand-off ranges, day and night. The front Ritchey-Chrétien optics are made of highly stable materials. However, the camera temperature varies considerably in flight conditions. Moreover, the composite-material structure of the reflective objective undergoes gradual dehumidification in the dry nitrogen atmosphere inside the pod, causing a small decrease in the structure's length. The temperature and humidity effects change the distance between the mirrors by just a few microns. The change is small, but it nevertheless alters the camera's infinity-focus setpoint significantly, especially in the EO band. To realize the resolution potential of the optics, optimal focus must be maintained continuously. In-flight best-focus calibration and temperature-based open-loop focus control give mostly satisfactory performance. To obtain even better focusing precision, a closed-loop phase-matching autofocus method was developed for the camera. The method makes use of an existing beam-sharer prism FPA arrangement, in which an aperture partition exists inherently in the area of overlap between adjacent detectors. The defocus is proportional to the image phase shift in the area of overlap. Low-pass filtering of the raw defocus estimate reduces random errors related to variable scene content, and the closed-loop control converges robustly to the precise focus position. The algorithm uses the temperature- and range-based focus prediction as an initial guess for the closed-loop phase-matching control. The autofocus algorithm achieves excellent results and works robustly under various conditions of scene illumination and contrast.
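The phase-matching principle, estimating defocus from the relative image shift in the overlap region between adjacent detectors, can be illustrated with a generic 1-D phase-correlation shift estimator. This is a textbook estimator standing in for the camera's actual algorithm, which is not published in detail here.

```python
import numpy as np

def phase_shift_1d(a, b):
    """Integer shift between two 1-D signals via phase correlation."""
    cross = np.fft.fft(a) * np.conj(np.fft.fft(b))
    cross /= np.abs(cross) + 1e-12          # keep phase information only
    corr = np.fft.ifft(cross).real
    shift = int(np.argmax(corr))
    if shift > len(a) // 2:                  # wrap to a signed shift
        shift -= len(a)
    return shift

rng = np.random.default_rng(0)
scene = rng.standard_normal(128)
shift = phase_shift_1d(np.roll(scene, 7), scene)  # stand-in for a defocus-induced shift
```

In the camera, a shift estimate like this (low-pass filtered over time to suppress scene-dependent noise) would drive the closed-loop focus correction.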

  9. An Online Tilt Estimation and Compensation Algorithm for a Small Satellite Camera

    Science.gov (United States)

    Lee, Da-Hyun; Hwang, Jai-hyuk

    2018-04-01

    In the case of a satellite camera designed to execute an Earth observation mission, even after a pre-launch precision alignment process has been carried out, misalignment will occur due to external factors during launch and in the operating environment. In particular, for high-resolution satellite cameras, which require submicron accuracy of alignment between optical components, misalignment is a major cause of image quality degradation. To compensate for this, most high-resolution satellite cameras undergo a precise realignment process called refocusing before and during operation. However, conventional Earth observation satellites only execute refocusing against de-space. Thus, this paper presents an online tilt estimation and compensation algorithm that can be utilized after de-space correction is executed. Although the sensitivity of optical performance degradation to misalignment is highest for de-space, the MTF can be increased further by correcting tilt after refocusing. The proposed algorithm estimates the amount of tilt from star images, and automatic tilt correction is carried out by a compensation mechanism that applies angular motion to the secondary mirror. Crucially, the algorithm runs as an online processing system, so it can operate without communication with the ground.

  10. Portable fiber-optic taper coupled optical microscopy platform

    Science.gov (United States)

    Wang, Weiming; Yu, Yan; Huang, Hui; Ou, Jinping

    2017-04-01

    The optical fiber taper coupled with CMOS offers high sensitivity, a compact structure, and low distortion in an imaging platform, so it is widely used in low-light, high-speed, and X-ray imaging systems. The coupled structure is also well suited to the needs of microscopy imaging. Toward this end, we developed a microscopic imaging platform based on the coupling of a cellphone camera module and a fiber-optic taper, for the measurement of human blood samples and Ascaris lumbricoides. The platform, weighing 70 grams, is based on the existing camera module of the smartphone and a fiber-optic array that provides a magnification factor of 6x. The top facet of the taper, on which samples are placed, serves as an irregular sampling grid for contact imaging. The magnified images of the sample, located on the bottom facet of the fiber, are then projected onto the CMOS sensor. This paper introduces the portable medical imaging system based on optical fiber coupling to CMOS and theoretically analyzes the feasibility of the system. The image data and processing results can either be stored in memory or transmitted to remote medical institutions for telemedicine. We validate the performance of this cellphone-based microscopy platform using human blood samples and a test target, achieving results comparable to a standard bench-top microscope.

  11. A survey of camera error sources in machine vision systems

    Science.gov (United States)

    Jatko, W. B.

    In machine vision applications, such as an automated inspection line, television cameras are commonly used to record scene intensity in a computer memory or frame buffer. Scene data from the image sensor can then be analyzed with a wide variety of feature-detection techniques. Many algorithms found in textbooks on image processing make the implicit simplifying assumption of an ideal input image with clearly defined edges and uniform illumination. The ideal image model is helpful to aid the student in understanding the principles of operation, but when these algorithms are blindly applied to real-world images the results can be unsatisfactory. This paper examines some common measurement errors found in camera sensors and their underlying causes, and possible methods of error compensation. The role of the camera in a typical image-processing system is discussed, with emphasis on the origination of signal distortions. The effects of such things as lighting, optics, and sensor characteristics are considered.

  12. Optical stimulator for vision-based sensors

    DEFF Research Database (Denmark)

    Rössler, Dirk; Pedersen, David Arge Klevang; Benn, Mathias

    2014-01-01

    We have developed an optical stimulator system for vision-based sensors. The stimulator is an efficient tool for stimulating a camera during on-ground testing with scenes representative of spacecraft flights. Such scenes include starry sky, planetary objects, and other spacecraft. The optical...

  13. Improved approach to characterizing and presenting streak camera performance

    International Nuclear Information System (INIS)

    Wiedwald, J.D.; Jones, B.A.

    1985-01-01

    The performance of a streak camera recording system is strongly linked to the technique used to amplify, detect and quantify the streaked image. At the Lawrence Livermore National Laboratory (LLNL) streak camera images have been recorded both on film and by fiber-optically coupling to charge-coupled devices (CCD's). During the development of a new process for recording these images (lens coupling the image onto a cooled CCD) the definitions of important performance characteristics such as resolution and dynamic range were re-examined. As a result of this development, these performance characteristics are now presented to the streak camera user in a more useful format than in the past. This paper describes how these techniques are used within the Laser Fusion Program at LLNL. The system resolution is presented as a modulation transfer function, including the seldom reported effects that flare and light scattering have at low spatial frequencies. Data are presented such that a user can adjust image intensifier gain and pixel averaging to optimize the useful dynamic range in any particular application

  14. Cosmology with the Large Synoptic Survey Telescope: an overview

    Science.gov (United States)

    Zhan, Hu; Tyson, J. Anthony

    2018-06-01

    The Large Synoptic Survey Telescope (LSST) is a high-étendue imaging facility being constructed atop Cerro Pachón in northern Chile. It is scheduled to begin science operations in 2022. With a large effective aperture, a novel three-mirror design achieving a seeing-limited field of view, and a 3.2-gigapixel camera, the LSST has the deep-wide-fast imaging capability necessary to carry out a survey in six passbands (ugrizy) to a deep coadded depth over 10 years, using most of its observational time. The remaining time will be devoted to considerably deeper and faster time-domain observations and smaller surveys. In total, each patch of the sky in the main survey will receive about 800 visits allocated across the six passbands. The huge volume of high-quality LSST data will provide a wide range of science opportunities and, in particular, open a new era of precision cosmology with unprecedented statistical power and tight control of systematic errors. In this review, we give a brief account of the LSST cosmology program with an emphasis on dark energy investigations. The LSST will address dark energy physics and cosmology in general by exploiting diverse precision probes, including large-scale structure, weak lensing, type Ia supernovae, galaxy clusters, and strong lensing. Combined with cosmic microwave background data, these probes form interlocking tests of the cosmological model and the nature of dark energy in the presence of various systematics. The LSST data products will be made available to the US and Chilean scientific communities and to international partners with no proprietary period. Close collaborations with contemporaneous imaging and spectroscopy surveys observing at a variety of wavelengths, resolutions, depths, and timescales will be a vital part of the LSST science program, which will not only enhance specific studies but, more importantly, also allow a more complete understanding of the Universe through different windows.
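For sky-limited imaging, the gain in point-source depth from stacking N comparable visits follows directly from S/N growing as the square root of N, i.e. the coadded limiting magnitude deepens by 1.25 log10(N). The single-visit depth and visit count below are illustrative placeholders, not official survey numbers.

```python
import math

def coadded_depth(single_visit_depth, n_visits):
    """5-sigma point-source depth of a stack of n equal-depth visits,
    assuming sky-limited, uncorrelated noise (S/N grows as sqrt(n)):
    m_coadd = m_single + 2.5 * log10(sqrt(n)) = m_single + 1.25 * log10(n)."""
    return single_visit_depth + 1.25 * math.log10(n_visits)

# Illustrative numbers: ~24.7 mag single-visit depth in one band,
# with ~184 visits accumulated in that band over the survey.
r_coadd = coadded_depth(24.7, 184)
```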

  15. SNAPSHOT SPECTRAL AND COLOR IMAGING USING A REGULAR DIGITAL CAMERA WITH A MONOCHROMATIC IMAGE SENSOR

    Directory of Open Access Journals (Sweden)

    J. Hauser

    2017-10-01

    Spectral imaging (SI) refers to the acquisition of the three-dimensional (3D) spectral cube of spatial and spectral data of a source object at a limited number of wavelengths in a given wavelength range. Snapshot spectral imaging (SSI) refers to the instantaneous acquisition (in a single shot) of the spectral cube, a process suitable for fast-changing objects. Known SSI devices exhibit large total track length (TTL), weight, and production costs, and relatively low optical throughput. We present a simple SSI camera based on a regular digital camera with (i) an added diffusing and dispersing phase-only static optical element at the entrance pupil (diffuser) and (ii) tailored compressed sensing (CS) methods for digital processing of the diffused and dispersed (DD) image recorded on the image sensor. The diffuser is designed to mix the spectral cube data spectrally and spatially and thus to enable convergence of its reconstruction by CS-based algorithms. In addition to performing SSI, this camera is capable of performing color imaging using a monochromatic or gray-scale image sensor without color filter arrays.
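The CS reconstruction step can be illustrated with the simplest sparse solver, ISTA, applied to a generic underdetermined system y = Ax. This is a stand-in for the paper's tailored algorithms, which reconstruct the full spectral cube from the diffused-and-dispersed image; the matrix sizes and sparsity below are arbitrary.

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam=0.05, n_iter=500):
    """Iterative shrinkage-thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
    return x

# Recover a 3-sparse signal from 60 random measurements of a length-200 vector.
rng = np.random.default_rng(1)
A = rng.standard_normal((60, 200)) / np.sqrt(60)
x_true = np.zeros(200)
x_true[[10, 50, 120]] = [1.0, -0.8, 0.6]
x_hat = ista(A, A @ x_true)
```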

  16. Strategy for the Development of a Smart NDVI Camera System for Outdoor Plant Detection and Agricultural Embedded Systems

    Directory of Open Access Journals (Sweden)

    Ali Akbar Zarezadeh

    2013-01-01

    The application of (smart) cameras for process control, mapping, and advanced imaging in agriculture has become an element of precision farming that facilitates the conservation of fertilizer, pesticides, and machine time. This technique additionally reduces the amount of energy required in terms of fuel. Although research activities have increased in this field, high camera prices reflect low adaptation to applications in all fields of agriculture. Smart, low-cost cameras adapted for agricultural applications can overcome this drawback. The normalized difference vegetation index (NDVI) for each image pixel is an applicable algorithm to discriminate plant information from the soil background, enabled by a large difference in reflectance between the near infrared (NIR) and the red channel optical frequency band. Two aligned charge coupled device (CCD) chips for the red and NIR channel are typically used, but they are expensive because of the precise optical alignment required. Therefore, much attention has been given to the development of alternative camera designs. In this study, the advantage of a smart one-chip camera design with NDVI image performance is demonstrated in terms of low cost and simplified design. The required assembly and pixel modifications are described, and new algorithms for establishing an enhanced NDVI image quality for data processing are discussed.

  17. Strategy for the development of a smart NDVI camera system for outdoor plant detection and agricultural embedded systems.

    Science.gov (United States)

    Dworak, Volker; Selbeck, Joern; Dammer, Karl-Heinz; Hoffmann, Matthias; Zarezadeh, Ali Akbar; Bobda, Christophe

    2013-01-24

    The application of (smart) cameras for process control, mapping, and advanced imaging in agriculture has become an element of precision farming that facilitates the conservation of fertilizer, pesticides, and machine time. This technique additionally reduces the amount of energy required in terms of fuel. Although research activities have increased in this field, high camera prices reflect low adaptation to applications in all fields of agriculture. Smart, low-cost cameras adapted for agricultural applications can overcome this drawback. The normalized difference vegetation index (NDVI) for each image pixel is an applicable algorithm to discriminate plant information from the soil background enabled by a large difference in the reflectance between the near infrared (NIR) and the red channel optical frequency band. Two aligned charge coupled device (CCD) chips for the red and NIR channel are typically used, but they are expensive because of the precise optical alignment required. Therefore, much attention has been given to the development of alternative camera designs. In this study, the advantage of a smart one-chip camera design with NDVI image performance is demonstrated in terms of low cost and simplified design. The required assembly and pixel modifications are described, and new algorithms for establishing an enhanced NDVI image quality for data processing are discussed.
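The per-pixel NDVI computation at the heart of this camera design is straightforward once registered red and NIR frames are available. The pixel values below are made-up reflectance counts used only for illustration.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Per-pixel normalized difference vegetation index:
    NDVI = (NIR - Red) / (NIR + Red), with eps guarding against
    division by zero in dark pixels."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Vegetation reflects strongly in NIR and absorbs red, so NDVI is near +1;
# bare soil reflects both bands similarly, so NDVI is near 0.
plant = ndvi(np.array([[200.0]]), np.array([[30.0]]))
soil = ndvi(np.array([[90.0]]), np.array([[80.0]]))
```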

  18. Self-Calibration Method Based on Surface Micromachining of Light Transceiver Focal Plane for Optical Camera

    Directory of Open Access Journals (Sweden)

    Jin Li

    2016-10-01

    In remote sensing photogrammetric applications, inner orientation parameter (IOP) calibration of the remote sensing camera is a prerequisite for determining image position. However, achieving such a calibration without temporal and spatial limitations remains a crucial but unresolved issue to date. The accuracy of the IOP calibration method of a remote sensing camera determines the performance of image positioning. In this paper, we propose a high-accuracy self-calibration method without temporal and spatial limitations for remote sensing cameras. Our method is based on an auto-collimating dichroic filter combined with a surface micromachining (SM) point-source focal plane. The proposed method can autonomously complete IOP calibration without the need for external reference targets. The SM procedure is used to manufacture a light transceiver focal plane, which integrates point sources, a splitter, and a complementary metal oxide semiconductor sensor. A dichroic filter is used to fabricate an auto-collimation light reflection element. The dichroic filter, splitter, and SM point-source focal plane are integrated into a camera to perform an integrated self-calibration. Experimental measurements confirm the effectiveness and convenience of the proposed method. Moreover, the method can achieve micrometer-level precision and can satisfactorily complete real-time calibration without temporal or spatial limitations.

  19. Mobile phone camera benchmarking: combination of camera speed and image quality

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2014-01-01

    When a mobile phone camera is tested and benchmarked, the significance of quality metrics is widely acknowledged, and methods also exist to evaluate camera speed; for example, ISO 15781 defines several measurements of camera system delays. However, the speed or rapidity metrics of a mobile phone's camera system have not been combined with the quality metrics, even though camera speed has become an increasingly important camera performance feature. This work comprises several tasks. First, the most important image quality metrics are collected from standards and papers. Second, the speed-related metrics of a mobile phone's camera system are collected from standards and papers, and novel speed metrics are identified. Third, combinations of the quality and speed metrics are validated using mobile phones on the market; the measurements are made against the application programming interfaces of different operating systems. Finally, the results are evaluated and conclusions are drawn. The result of this work is detailed benchmarking of mobile phone camera systems on the market. The paper also proposes a combined benchmarking metric that includes both quality and speed parameters.

  20. Optical metrology

    CERN Document Server

    Gåsvik, Kjell J

    2003-01-01

    New material on computerized optical processes, computerized ray tracing, the fast Fourier transform, fibre Bragg sensors, and temporal phase unwrapping. * New introductory sections to all chapters. * Detailed discussion of lasers and laser principles, including an introduction to radiometry and photometry. * Thorough coverage of the CCD camera.

  1. Optical analysis of a compound quasi-microscope for planetary landers

    Science.gov (United States)

    Wall, S. D.; Burcher, E. E.; Huck, F. O.

    1974-01-01

    A quasi-microscope concept, consisting of a facsimile camera augmented with an auxiliary lens as a magnifier, was previously introduced and analyzed. The performance achievable with this concept is primarily limited by a trade-off between resolution and object field; the approach leads to a limiting resolution of 20 microns when used with the Viking lander camera (which has an angular resolution of 0.04 deg). An optical system is analyzed which includes a field lens between the camera and the auxiliary lens to overcome this limitation. It is found that this system, referred to as a compound quasi-microscope, can provide improved resolution (to about 2 microns) and a larger object field. However, the improvement comes at the expense of increased complexity, special camera design requirements, and tighter tolerances on the distances between optical components.

  2. High spatial resolution infrared camera as ISS external experiment

    Science.gov (United States)

    Eckehard, Lorenz; Frerker, Hap; Fitch, Robert Alan

    The high spatial resolution infrared camera, an ISS external experiment for monitoring global climate change, uses ISS internal and external resources (e.g., data storage). The optical experiment will consist of an infrared camera for monitoring global climate changes from the ISS. The technology was evaluated by the German small-satellite mission BIRD and further developed in several ESA projects. Compared to BIRD, the presented instrument uses proven, advanced sensor technologies (ISS external) together with ISS on-board processing and storage capabilities (internal). The instrument will be equipped with serial interfaces for TM/TC and several relay commands for the power supply. For data processing and storage, a mass memory is required. Access to current attitude data is highly desirable in order to produce geo-referenced maps, if possible by on-board processing.

  3. First-order optical analysis of a quasi-microscope for planetary landers

    Science.gov (United States)

    Huck, F. O.; Sinclair, A. R.; Burcher, E. E.

    1973-01-01

    A first-order geometrical optics analysis of a facsimile camera augmented with an auxiliary lens as magnifier is presented. This concept, called quasi-microscope, bridges the gap between surface resolutions of the order of 1 to 10 mm which can be obtained directly with planetary lander cameras and resolutions of the order of 0.2 to 10 microns which can be obtained only with relatively complex microscopes. A facsimile camera was considered in the analysis; however, the analytical results can also be applied to television and film cameras. It was found that quasi-microscope resolutions in the range from 10 to 100 microns are obtainable with current state-of-the-art lander facsimile cameras. For the Viking lander camera having an angular resolution of 0.04 deg, which was considered as a specific example, the best achievable resolution would be about 20 microns. The preferred approach to increase the resolution of the quasi-microscope would be, if possible, through an increase in angular resolution of the camera. A twofold to threefold improvement in resolution could also be achieved with a special camera focus position, but this approach tends to require larger and heavier auxiliary optics.

  4. Application of X-ray CCD camera in X-ray spot diagnosis of rod-pinch diode

    International Nuclear Information System (INIS)

    Song Yan; Zhou Ming; Song Guzhou; Ma Jiming; Duan Baojun; Han Changcai; Yao Zhiming

    2015-01-01

    The pinhole imaging technique is widely used to measure the X-ray spot of a rod-pinch diode. An X-ray CCD camera, composed of film, a fiber optic taper and a CCD camera, was employed to replace the imaging system based on a scintillator, lens and CCD camera in the diagnosis of the X-ray spot. The resolution of the X-ray CCD camera was studied. The resolution is limited by the film and is 5 lp/mm in the test with a Pb resolution chart; the frequency is 1.5 lp/mm at an MTF of 0.5 in the test with an edge image. The resolution tests indicate that the X-ray CCD camera can meet the requirements of diagnosing an X-ray spot of about 1.5 mm when the pinhole imaging magnification is 0.5. Finally, the image of the X-ray spot was obtained and image restoration was performed in the diagnosis of the X-ray spot of the rod-pinch diode. (authors)
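
    The adequacy claim can be checked with a back-of-envelope calculation: the smallest feature the camera resolves in the image plane, referred back to the source plane through the pinhole magnification, must be well below the ~1.5 mm spot. A sketch using the abstract's figures:

```python
def object_plane_resolution_mm(camera_lp_per_mm, magnification):
    """Smallest resolvable feature referred back to the source plane:
    one half-period of the limiting spatial frequency, divided by the
    pinhole imaging magnification."""
    image_plane_res_mm = 1.0 / (2.0 * camera_lp_per_mm)
    return image_plane_res_mm / magnification

# 5 lp/mm camera limit and magnification 0.5, as in the abstract:
res = object_plane_resolution_mm(5.0, 0.5)  # 0.2 mm at the source plane
```

    At 0.2 mm against a ~1.5 mm spot, the camera comfortably resolves the spot structure.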

  5. The AOTF-Based NO2 Camera

    Science.gov (United States)

    Dekemper, E.; Fussen, D.; Vanhellemont, F.; Vanhamel, J.; Pieroux, D.; Berkenbosch, S.

    2017-12-01

    In an urban environment, nitrogen dioxide is emitted by a multitude of static and moving point sources (cars, industry, power plants, heating systems,…). Air quality models generally rely on a limited number of monitoring stations which do not capture the whole pattern, neither allow for full validation. So far, there has been a lack of instrument capable of measuring NO2 fields with the necessary spatio-temporal resolution above major point sources (power plants), or more extended ones (cities). We have developed a new type of passive remote sensing instrument aiming at the measurement of 2-D distributions of NO2 slant column densities (SCDs) with a high spatial (meters) and temporal (minutes) resolution. The measurement principle has some similarities with the popular filter-based SO2 camera (used in volcanic and industrial sulfur emissions monitoring) as it relies on spectral images taken at wavelengths where the molecule absorption cross section is different. But contrary to the SO2 camera, the spectral selection is performed by an acousto-optical tunable filter (AOTF) capable of resolving the target molecule's spectral features. A first prototype was successfully tested with the plume of a coal-firing power plant in Romania, revealing the dynamics of the formation of NO2 in the early plume. A lighter version of the NO2 camera is now being tested on other targets, such as oil refineries and urban air masses.
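
    The underlying two-wavelength retrieval is Beer-Lambert: the log-ratio of on-band and off-band intensities, referenced to a gas-free background, divided by the differential absorption cross section gives the slant column density. A hedged sketch with purely illustrative numbers (none of these values are from the paper):

```python
import math

def no2_scd(i_on, i_off, i0_on, i0_off, sigma_on, sigma_off):
    """Two-wavelength slant column density (molecules/cm^2) from
    Beer-Lambert: differential optical depth between the on- and
    off-absorption bands, referenced to a gas-free background,
    divided by the differential cross section."""
    tau = math.log((i_off / i_on) * (i0_on / i0_off))
    return tau / (sigma_on - sigma_off)

# Illustrative only: a plume pixel absorbing 10% more in the on-band,
# with a hypothetical differential cross section of 4e-19 cm^2.
scd = no2_scd(0.90, 1.00, 1.00, 1.00, 5.0e-19, 1.0e-19)
```

    The AOTF's role in the instrument is to make the two spectral selections narrow enough that the cross sections are well defined.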

  6. Iterative Refinement of Transmission Map for Stereo Image Defogging Using a Dual Camera Sensor

    Directory of Open Access Journals (Sweden)

    Heegwang Kim

    2017-12-01

    Full Text Available Recently, the stereo imaging-based image enhancement approach has attracted increasing attention in the field of video analysis. This paper presents a dual camera-based stereo image defogging algorithm. Optical flow is first estimated from the stereo foggy image pair, and the initial disparity map is generated from the estimated optical flow. Next, an initial transmission map is generated using the initial disparity map. Atmospheric light is then estimated using the color line theory. The defogged result is finally reconstructed using the estimated transmission map and atmospheric light. The proposed method can refine the transmission map iteratively. Experimental results show that the proposed method can successfully remove fog without color distortion. The proposed method can be used as a pre-processing step for an outdoor video analysis system and a high-end smartphone with a dual camera system.
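
    The final reconstruction step in this family of methods inverts the standard haze image-formation model I = J·t + A·(1 − t). A minimal numpy sketch of that inversion (the clamping constant t_min is a common illustrative choice, not a parameter from the paper):

```python
import numpy as np

def defog(image, transmission, atmospheric_light, t_min=0.1):
    """Invert the haze model I = J*t + A*(1 - t) for the scene
    radiance J.  The transmission map is clamped at t_min to avoid
    amplifying noise where the fog is dense."""
    t = np.clip(transmission, t_min, 1.0)[..., None]
    return (image - atmospheric_light) / t + atmospheric_light

# Toy example: uniform fog (t = 0.5) over a grey scene, white airlight
foggy = np.full((4, 4, 3), 0.6)
t_map = np.full((4, 4), 0.5)
clear = defog(foggy, t_map, np.array([1.0, 1.0, 1.0]))
```

    The paper's contribution sits upstream of this step: the stereo pair and optical flow supply the disparity from which the transmission map is iteratively refined.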

  7. Iterative Refinement of Transmission Map for Stereo Image Defogging Using a Dual Camera Sensor.

    Science.gov (United States)

    Kim, Heegwang; Park, Jinho; Park, Hasil; Paik, Joonki

    2017-12-09

    Recently, the stereo imaging-based image enhancement approach has attracted increasing attention in the field of video analysis. This paper presents a dual camera-based stereo image defogging algorithm. Optical flow is first estimated from the stereo foggy image pair, and the initial disparity map is generated from the estimated optical flow. Next, an initial transmission map is generated using the initial disparity map. Atmospheric light is then estimated using the color line theory. The defogged result is finally reconstructed using the estimated transmission map and atmospheric light. The proposed method can refine the transmission map iteratively. Experimental results show that the proposed method can successfully remove fog without color distortion. The proposed method can be used as a pre-processing step for an outdoor video analysis system and a high-end smartphone with a dual camera system.

  8. Comparative evaluation of consumer grade cameras and mobile phone cameras for close range photogrammetry

    Science.gov (United States)

    Chikatsu, Hirofumi; Takahashi, Yoji

    2009-08-01

    The authors have been concentrating on developing convenient 3D measurement methods using consumer grade digital cameras, and concluded that consumer grade digital cameras can be expected to become useful photogrammetric devices in various close range application fields. Meanwhile, mobile phone cameras with 10-megapixel sensors have appeared on the market in Japan. In these circumstances, the epoch-making question arises of whether mobile phone cameras can take the place of consumer grade digital cameras in close range photogrammetric applications. In order to evaluate the potential of mobile phone cameras in close range photogrammetry, a comparative evaluation between mobile phone cameras and consumer grade digital cameras is presented in this paper with respect to lens distortion, reliability, stability and robustness. Calibration tests for 16 mobile phone cameras and 50 consumer grade digital cameras were conducted indoors using a test target. Furthermore, the practicability of mobile phone cameras for close range photogrammetry was evaluated outdoors. This paper shows that mobile phone cameras are able to take the place of consumer grade digital cameras and to develop the market in digital photogrammetric fields.

  9. Multispectral calibration to enhance the metrology performance of C-mount camera systems

    Directory of Open Access Journals (Sweden)

    S. Robson

    2014-06-01

    Full Text Available Low cost monochrome camera systems based on CMOS sensors and C-mount lenses have been successfully applied to a wide variety of metrology tasks. For high accuracy work such cameras are typically equipped with ring lights to image retro-reflective targets as high contrast image features. Whilst algorithms for target image measurement and lens modelling are highly advanced, including separate RGB channel lens distortion correction, target image circularity compensation and a wide variety of detection and centroiding approaches, less effort has been directed towards optimising physical target image quality by considering optical performance in narrow wavelength bands. This paper describes an initial investigation to assess the effect of wavelength on camera calibration parameters for two different camera bodies and the same ‘C-mount’ wide angle lens. Results demonstrate the expected strong influence on principal distance, radial and tangential distortion, and also highlight possible trends in principal point, orthogonality and affinity parameters which are close to the parameter estimation noise level from the strong convergent self-calibrating image networks.
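
    The calibration parameters discussed here enter through the usual Brown-Conrady model, with each narrow wavelength band fitted independently. A sketch of the model with hypothetical per-band coefficients (the values below are purely illustrative, not results from the paper):

```python
def distort(x, y, k1, k2, p1, p2):
    """Brown-Conrady radial (k1, k2) plus tangential/decentering
    (p1, p2) distortion applied to normalised image coordinates.
    In a multispectral calibration each wavelength band gets its
    own parameter set."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return xd, yd

# The same field point through two hypothetical per-band parameter
# sets: the difference is what a narrow-band calibration absorbs.
xd_red, _ = distort(0.5, 0.0, -0.20, 0.05, 0.0, 0.0)
xd_blue, _ = distort(0.5, 0.0, -0.22, 0.05, 0.0, 0.0)
```

    The wavelength dependence reported for principal distance and the distortion terms shows up here as a per-band (k1, k2, p1, p2) set rather than one shared RGB model.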

  10. On the development of radiation tolerant surveillance camera from consumer-grade components

    Directory of Open Access Journals (Sweden)

    Klemen Ambrožič

    2017-01-01

    Full Text Available In this paper an overview of the process of designing a radiation tolerant surveillance camera from consumer grade components and commercially available particle shielding materials is given. This involves utilization of the Monte Carlo particle transport code MCNP6 and ENDF/B-VII.0 nuclear data libraries, as well as testing the physical electrical systems against γ radiation, utilizing JSI TRIGA mk. II fuel elements as γ-ray sources. A new aluminum 20 cm × 20 cm × 30 cm irradiation facility, with electrical power and a signal wire guide-tube to the reactor platform, was designed, constructed and used for irradiation of large electronic and optical component assemblies with activated fuel elements. Electronic components to be used in the camera were tested against γ-radiation in an independent manner to determine their radiation tolerance. Several camera designs were proposed and simulated using MCNP to determine incident particle and dose attenuation factors. Data obtained from the measurements and MCNP simulations will be used to finalize the design of 3 surveillance camera models with different radiation tolerances.
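
    Behind the simulated dose attenuation factors sits the simple narrow-beam exponential law; the MCNP models add geometry, spectra and scatter buildup on top of it. A toy sketch of the first-order trade-off (the attenuation coefficient below is hypothetical, not a value from the paper):

```python
import math

def attenuation_factor(mu_per_cm, thickness_cm):
    """Narrow-beam gamma attenuation I/I0 = exp(-mu * x): the
    first-order law behind the shielding-thickness trade-off.
    It ignores scatter buildup and geometry, which the paper's
    MCNP simulations account for."""
    return math.exp(-mu_per_cm * thickness_cm)

# Hypothetical linear attenuation coefficient of 0.6 /cm and a
# 5 cm shield wall: roughly a factor-20 dose reduction.
f = attenuation_factor(0.6, 5.0)
```
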

  11. On the development of radiation tolerant surveillance camera from consumer-grade components

    Science.gov (United States)

    Klemen, Ambrožič; Luka, Snoj; Lars, Öhlin; Jan, Gunnarsson; Niklas, Barringer

    2017-09-01

    In this paper an overview of the process of designing a radiation tolerant surveillance camera from consumer grade components and commercially available particle shielding materials is given. This involves utilization of the Monte Carlo particle transport code MCNP6 and ENDF/B-VII.0 nuclear data libraries, as well as testing the physical electrical systems against γ radiation, utilizing JSI TRIGA mk. II fuel elements as γ-ray sources. A new aluminum 20 cm × 20 cm × 30 cm irradiation facility, with electrical power and a signal wire guide-tube to the reactor platform, was designed, constructed and used for irradiation of large electronic and optical component assemblies with activated fuel elements. Electronic components to be used in the camera were tested against γ-radiation in an independent manner to determine their radiation tolerance. Several camera designs were proposed and simulated using MCNP to determine incident particle and dose attenuation factors. Data obtained from the measurements and MCNP simulations will be used to finalize the design of 3 surveillance camera models with different radiation tolerances.

  12. Applying image quality in cell phone cameras: lens distortion

    Science.gov (United States)

    Baxter, Donald; Goma, Sergio R.; Aleksic, Milivoje

    2009-01-01

    This paper describes the framework used in one of the pilot studies run under the I3A CPIQ initiative to quantify overall image quality in cell-phone cameras. The framework is based on a multivariate formalism which tries to predict overall image quality from individual image quality attributes, and was validated in a CPIQ pilot program. The pilot study focuses on image quality distortions introduced in the optical path of a cell-phone camera, which may or may not be corrected in the image processing path. The assumption is that the captured image is JPEG compressed and the cell-phone camera is set to 'auto' mode. Because the framework requires the individual attributes to be relatively perceptually orthogonal, the attributes used in the pilot study are lens geometric distortion (LGD) and lateral chromatic aberrations (LCA). The goal of this paper is to present the framework of this pilot project, from the definition of the individual attributes up to their quantification in JNDs of quality, a requirement of the multivariate formalism; both objective and subjective evaluations were therefore used. A major distinction of the objective part from the 'DSC imaging world' is that the LCA/LGD distortions found in cell-phone cameras rarely exhibit radial behavior, so a radial mapping/model cannot be used in this case.
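
    For orientation, the two attributes are classically quantified per field point as a relative radial displacement (LGD) and a radial channel separation (LCA); note the abstract's caveat that cell-phone distortions are rarely purely radial, so this is only the classical starting point. A sketch with made-up measurements:

```python
def local_geometric_distortion(r_distorted, r_ideal):
    """Lens geometric distortion in percent at one field point:
    positive = pincushion, negative = barrel."""
    return 100.0 * (r_distorted - r_ideal) / r_ideal

def lateral_ca(r_red, r_blue):
    """Lateral chromatic aberration as the radial separation (in
    pixels) between the red and blue images of the same point."""
    return abs(r_red - r_blue)

# Illustrative corner-point measurements (not from the study):
d = local_geometric_distortion(970.0, 1000.0)   # -3.0 % (barrel)
ca = lateral_ca(970.0, 972.5)                   # 2.5 px
```

    Mapping such objective values to JNDs of quality is then the subjective-calibration step the multivariate formalism requires.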

  13. Large Synoptic Survey Telescope: From Science Drivers to Reference Design

    Energy Technology Data Exchange (ETDEWEB)

    Ivezic, Z.; Axelrod, T.; Brandt, W.N.; Burke, D.L.; Claver, C.F.; Connolly, A.; Cook, K.H.; Gee, P.; Gilmore, D.K.; Jacoby, S.H.; Jones, R.L.; Kahn, S.M.; Kantor, J.P.; Krabbendam, V.; Lupton, R.H.; Monet, D.G.; Pinto, P.A.; Saha, A.; Schalk, T.L.; Schneider, D.P.; Strauss, Michael A.; /Washington U., Seattle, Astron. Dept. /LSST Corp. /Penn State U., Astron. Astrophys. /KIPAC, Menlo Park /NOAO, Tucson /LLNL, Livermore /UC, Davis /Princeton U., Astrophys. Sci. Dept. /Naval Observ., Flagstaff /Arizona U., Astron. Dept. - Steward Observ. /UC, Santa Cruz /Harvard U. /Johns Hopkins U. /Illinois U., Urbana

    2011-10-14

    In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachon in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg² region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. We describe how the
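
    The headline numbers are easy to sanity-check: collecting area times field of view gives the survey étendue, and the sky-coverage cadence fixes the nightly pointing rate. A quick check of the abstract's figures (ignoring field overlap and observing overheads):

```python
import math

# Figures from the abstract: 6.5 m effective aperture, 9.6 deg^2 field
effective_diameter_m = 6.5
fov_deg2 = 9.6

collecting_area_m2 = math.pi * (effective_diameter_m / 2.0) ** 2
etendue_m2deg2 = collecting_area_m2 * fov_deg2   # ~319 m^2 deg^2

# Covering 10,000 deg^2 every three nights in 9.6 deg^2 fields implies
# roughly this many pointings per night (no overlap, no overheads):
pointings_per_night = 10000.0 / fov_deg2 / 3.0   # ~347
```

    This étendue, far beyond earlier surveys, is what makes the deep-wide-fast mode feasible.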

  14. Large Synoptic Survey Telescope: From science drivers to reference design

    Directory of Open Access Journals (Sweden)

    Ivezić Ž.

    2008-01-01

    Full Text Available In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachon in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg² region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. We describe how the LSST

  15. Large Synoptic Survey Telescope: From Science Drivers To Reference Design

    Directory of Open Access Journals (Sweden)

    Ivezić, Ž.

    2008-06-01

    Full Text Available In the history of astronomy, major advances in our understanding of the Universe have come from dramatic improvements in our ability to accurately measure astronomical quantities. Aided by rapid progress in information technology, current sky surveys are changing the way we view and study the Universe. Next-generation surveys will maintain this revolutionary progress. We focus here on the most ambitious survey currently planned in the visible band, the Large Synoptic Survey Telescope (LSST). LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: constraining dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. It will be a large, wide-field ground-based system designed to obtain multiple images covering the sky that is visible from Cerro Pachón in Northern Chile. The current baseline design, with an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3,200 Megapixel camera, will allow about 10,000 square degrees of sky to be covered using pairs of 15-second exposures in two photometric bands every three nights on average. The system is designed to yield high image quality, as well as superb astrometric and photometric accuracy. The survey area will include 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will observe a 20,000 deg² region about 1000 times in the six bands during the anticipated 10 years of operation. These data will result in databases including 10 billion galaxies and a similar number of stars, and will serve the majority of science programs. The remaining 10% of the observing time will be allocated to special programs such as Very Deep and Very Fast time domain surveys. We

  16. Be Foil ''Filter Knee Imaging'' NSTX Plasma with Fast Soft X-ray Camera

    International Nuclear Information System (INIS)

    B.C. Stratton; S. von Goeler; D. Stutman; K. Tritz; L.E. Zakharov

    2005-01-01

    A fast soft x-ray (SXR) pinhole camera has been implemented on the National Spherical Torus Experiment (NSTX). This paper presents observations and describes the Be foil Filter Knee Imaging (FKI) technique for reconstructions of a m/n=1/1 mode on NSTX. The SXR camera has a wide-angle (28°) field of view of the plasma. The camera images nearly the entire diameter of the plasma and a comparable region in the vertical direction. SXR photons pass through a beryllium foil and are imaged by a pinhole onto a P47 scintillator deposited on a fiber optic faceplate. An electrostatic image intensifier demagnifies the visible image by 6:1 to match it to the size of the charge-coupled device (CCD) chip. A pair of lenses couples the image to the CCD chip.

  17. Precise measurement of a subpicosecond electron single bunch by the femtosecond streak camera

    International Nuclear Information System (INIS)

    Uesaka, M.; Ueda, T.; Kozawa, T.; Kobayashi, T.

    1998-01-01

    Precise measurement of a subpicosecond electron single bunch by the femtosecond streak camera is presented. The subpicosecond electron single bunch of energy 35 MeV was generated by the achromatic magnetic pulse compressor at the S-band linear accelerator of the Nuclear Engineering Research Laboratory (NERL), University of Tokyo. The electric charge per bunch is 0.5 nC, and the horizontal and vertical beam sizes are 3.3 and 5.5 mm (full width at half maximum, FWHM), respectively. The pulse shape of the electron single bunch is measured via Cherenkov radiation emitted in air by the femtosecond streak camera. The optical parameters of the measurement system were optimized based on extensive experiments and numerical analysis in order to achieve subpicosecond time resolution. Using the optimized optical measurement system, the subpicosecond pulse shape, its variation for different RF phases in the accelerating tube, the jitter of the total system and the correlation between measured streak images and calculated longitudinal phase space distributions were precisely evaluated. This measurement system will be used in several subpicosecond analyses for radiation physics and chemistry. (orig.)
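
    The basic streak-camera conversion from image to time is the streak extent along the sweep axis divided by the sweep speed; a full analysis would also deconvolve the static (zero-sweep) slit profile, which this sketch ignores. The numbers are illustrative, not from the paper:

```python
def bunch_duration_ps(streak_fwhm_mm, sweep_speed_mm_per_ps):
    """Bunch duration from a streak image: FWHM of the streak along
    the sweep axis divided by the sweep speed.  Ignores the static
    profile that a careful analysis would deconvolve out."""
    return streak_fwhm_mm / sweep_speed_mm_per_ps

# Illustrative: a 0.35 mm streak FWHM at a 0.5 mm/ps sweep speed
tau = bunch_duration_ps(0.35, 0.5)  # 0.7 ps
```
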

  18. Respiratory-Gated MRgHIFU in Upper Abdomen Using an MR-Compatible In-Bore Digital Camera

    Directory of Open Access Journals (Sweden)

    Vincent Auboiroux

    2014-01-01

    Full Text Available Objective. To demonstrate the technical feasibility and the potential interest of using a digital optical camera inside the MR magnet bore for monitoring the breathing cycle and subsequently gating the PRFS MR thermometry, MR-ARFI measurement, and MRgHIFU sonication in the upper abdomen. Materials and Methods. A digital camera was reengineered to remove its magnetic parts and was further equipped with a 7 m long USB cable. The system was electromagnetically shielded and operated inside the bore of a closed 3T clinical scanner. Suitable triggers were generated based on real-time motion analysis of the images produced by the camera (resolution 640×480 pixels, 30 fps. Respiratory-gated MR-ARFI prepared MRgHIFU ablation was performed in the kidney and liver of two sheep in vivo, under general anaesthesia and ventilator-driven forced breathing. Results. The optical device demonstrated very good MR compatibility. The current setup permitted the acquisition of motion artefact-free and high resolution MR 2D ARFI and multiplanar interleaved PRFS thermometry (average SNR 30 in liver and 56 in kidney. Microscopic histology indicated precise focal lesions with sharply delineated margins following the respiratory-gated HIFU sonications. Conclusion. The proof-of-concept for respiratory motion management in MRgHIFU using an in-bore digital camera has been validated in vivo.
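
    The camera frames are reduced to a one-dimensional breathing-motion score, and triggers fire when the signal enters the quiet plateau of the ventilator cycle. A minimal sketch of such a gating rule (the threshold and trace are synthetic; the paper does not describe its motion-analysis pipeline at this level of detail):

```python
import numpy as np

def gate_triggers(motion, threshold):
    """Indices at which to fire an acquisition trigger: one trigger
    per quiet period, on the falling edge into quiescence (samples
    where the breathing-motion score first drops below threshold)."""
    quiet = motion < threshold
    return np.flatnonzero(quiet[1:] & ~quiet[:-1]) + 1

# Synthetic breathing trace: two ventilator cycles, quiet below 0.2
t = np.linspace(0, 4 * np.pi, 200)
motion = 0.5 * (1 + np.sin(t))
trig = gate_triggers(motion, 0.2)
```

    Gating the PRFS thermometry and the HIFU sonication on such triggers is what keeps the acquisitions at a reproducible respiratory phase.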

  19. Digital optical correlator x-ray telescope alignment monitoring system

    Science.gov (United States)

    Lis, Tomasz; Gaskin, Jessica; Jasper, John; Gregory, Don A.

    2018-01-01

    The High-Energy Replicated Optics to Explore the Sun (HEROES) program is a balloon-borne x-ray telescope mission to observe hard x-rays (~20 to 70 keV) from the sun and multiple astrophysical targets. The payload consists of eight mirror modules with a total of 114 optics that are mounted on a 6-m-long optical bench. Each mirror module is complemented by a high-pressure xenon gas scintillation proportional counter. Attached to the payload is a camera that acquires star fields and then matches the acquired field to star maps to determine the pointing of the optical bench. Slight misalignments between the star camera, the optical bench, and the telescope elements attached to the optical bench may occur during flight due to mechanical shifts, thermal gradients, and gravitational effects. These misalignments can result in diminished imaging and reduced photon collection efficiency. To monitor these misalignments during flight, a supplementary Bench Alignment Monitoring System (BAMS) was added to the payload. BAMS hardware comprises two cameras mounted directly to the optical bench and rings of light-emitting diodes (LEDs) mounted onto the telescope components. The LEDs in these rings are mounted in a predefined, asymmetric pattern, and their positions are tracked using an optical/digital correlator. The BAMS analysis software is a digital adaption of an optical joint transform correlator. The aim is to enhance the observational proficiency of HEROES while providing insight into the magnitude of mechanically and thermally induced misalignments during flight. Results from a preflight test of the system are reported.
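
    Reduced to its digital essence, the joint transform correlator's job is a Fourier-domain cross-correlation whose peak location gives the LED pattern's displacement. A hedged numpy sketch of that principle (an illustration, not the BAMS flight software):

```python
import numpy as np

def correlation_shift(reference, live):
    """Locate the (dy, dx) displacement of a target pattern via the
    peak of the Fourier-domain cross-correlation -- the digital
    analogue of an optical joint transform correlator."""
    spectrum = np.conj(np.fft.fft2(reference)) * np.fft.fft2(live)
    corr = np.fft.ifft2(spectrum).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    dims = np.array(corr.shape)
    peak = (peak + dims // 2) % dims - dims // 2  # wrap to signed shifts
    return tuple(int(p) for p in peak)

# Synthetic LED ring: a few bright points on a dark 64x64 frame,
# then the same pattern shifted by (3, -2) pixels.
rng = np.random.default_rng(0)
ref = np.zeros((64, 64))
ref[rng.integers(0, 64, 8), rng.integers(0, 64, 8)] = 1.0
moved = np.roll(ref, (3, -2), axis=(0, 1))
shift = correlation_shift(ref, moved)
```

    The asymmetric LED arrangement matters here: it guarantees the correlation peak is unique, so a single peak search recovers the misalignment.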

  20. Gamma camera

    International Nuclear Information System (INIS)

    Tschunt, E.; Platz, W.; Baer, U.; Heinz, L.

    1978-01-01

    A gamma camera has a plurality of exchangeable collimators, one of which is mounted in the ray inlet opening of the camera, while the others are placed on separate supports. The supports are swingably mounted upon a column one above the other and can be swung through about 90° to a collimator exchange position. Each of the separate supports is swingable to a vertically aligned position, with limiting of the swinging movement and positioning of the support at the desired exchange position. The collimators are carried on the supports by means of a series of vertically disposed coil springs. Projections on the camera are movable from above into grooves of the collimator at the exchange position, whereupon the collimator is turned so that it is securely prevented from falling out of the camera head.

  1. Determination of feature generation methods for PTZ camera object tracking

    Science.gov (United States)

    Doyle, Daniel D.; Black, Jonathan T.

    2012-06-01

    Object detection and tracking using computer vision (CV) techniques have been widely applied to sensor fusion applications. Many papers continue to be written that speed up performance and increase the learning capability of artificially intelligent systems through improved algorithms, workload distribution, and information fusion. Military applications of real-time tracking systems are becoming more and more complex, with an ever increasing need for fusion and CV techniques to actively track and control dynamic systems. Examples include the use of metrology systems for tracking and measuring micro air vehicles (MAVs) and autonomous navigation systems for controlling MAVs. This paper seeks to determine which tracking algorithms best track a moving object using a pan/tilt/zoom (PTZ) camera, applicable to both of the examples presented. The feature generation algorithms compared in this paper are the trained Scale-Invariant Feature Transform (SIFT) and Speeded Up Robust Features (SURF), the Mixture of Gaussians (MoG) background subtraction method, the Lucas-Kanade optical flow method (2000) and the Farneback optical flow method (2003). The matching algorithm used in this paper for the trained feature generation algorithms is the Fast Library for Approximate Nearest Neighbors (FLANN). The BSD-licensed OpenCV library is used extensively to demonstrate the viability of each algorithm and its performance. Initial testing is performed on a sequence of images from a stationary camera. Further testing is performed on a sequence of images in which the PTZ camera moves in order to capture the moving object. Comparisons are made based upon accuracy, speed and memory.
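
    To illustrate the background-subtraction family of methods compared here, a single-Gaussian running model can be sketched in a few lines; it is a stripped-down cousin of the Mixture of Gaussians, which keeps several Gaussians per pixel. Parameters are illustrative:

```python
import numpy as np

def update_background(frame, mean, var, alpha=0.05, k=2.5):
    """One step of a single-Gaussian running background model:
    flag pixels more than k sigma from the running mean as
    foreground, then blend the new frame into the model."""
    fg = np.abs(frame - mean) > k * np.sqrt(var)
    mean = (1 - alpha) * mean + alpha * frame
    var = (1 - alpha) * var + alpha * (frame - mean) ** 2
    return fg, mean, var

# Static grey scene with one bright moving "object"
mean = np.full((32, 32), 10.0)
var = np.full((32, 32), 1.0)
frame = mean.copy()
frame[10:14, 10:14] = 200.0
fg, mean, var = update_background(frame, mean, var)
```

    The foreground mask is what a PTZ controller would then act on to re-centre the target.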

  2. Invention and validation of an automated camera system that uses optical character recognition to identify patient name mislabeled samples.

    Science.gov (United States)

    Hawker, Charles D; McCarthy, William; Cleveland, David; Messinger, Bonnie L

    2014-03-01

    Mislabeled samples are a serious problem in most clinical laboratories. Published error rates range from 0.39/1000 to as high as 1.12%. Standardization of bar codes and label formats has not yet achieved the needed improvement. The mislabel rate in our laboratory, although low compared with published rates, prompted us to seek a solution to achieve zero errors. To reduce or eliminate our mislabeled samples, we invented an automated device using 4 cameras to photograph the outside of a sample tube. The system uses optical character recognition (OCR) to look for discrepancies between the patient name in our laboratory information system (LIS) vs the patient name on the customer label. All discrepancies detected by the system's software then require human inspection. The system was installed on our automated track and validated with production samples. We obtained 1 009 830 images during the validation period, and every image was reviewed. OCR passed approximately 75% of the samples, and no mislabeled samples were passed. The 25% failed by the system included 121 samples actually mislabeled by patient name and 148 samples with spelling discrepancies between the patient name on the customer label and the patient name in our LIS. Only 71 of the 121 mislabeled samples detected by OCR were found through our normal quality assurance process. We have invented an automated camera system that uses OCR technology to identify potential mislabeled samples. We have validated this system using samples transported on our automated track. Full implementation of this technology offers the possibility of zero mislabeled samples in the preanalytic stage.
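
    The decision rule at the heart of such a system is a string comparison between the OCR'd label name and the LIS name, with anything below a similarity threshold routed to human inspection. A toy sketch (the normalisation and the 0.85 threshold are illustrative choices, not the published system's actual rules):

```python
import difflib

def name_discrepancy(lis_name, label_name, threshold=0.85):
    """Flag a sample for human review when the patient name read
    from the tube label differs from the LIS name.  Case and
    whitespace are normalised before comparison."""
    a = " ".join(lis_name.upper().split())
    b = " ".join(label_name.upper().split())
    ratio = difflib.SequenceMatcher(None, a, b).ratio()
    return ratio < threshold, ratio

flagged, _ = name_discrepancy("SMITH, JOHN", "JONES, MARY")   # review
ok, _ = name_discrepancy("Smith, John", "SMITH,  JOHN")       # passes
```

    Routing only low-similarity pairs to humans is what lets roughly 75% of samples pass untouched while no mislabeled sample slips through.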

  3. Quantitative assessment of optic nerve head pallor

    International Nuclear Information System (INIS)

    Vilser, W; Seifert, B U; Riemer, T; Nagel, E; Weisensee, J; Hammer, M

    2008-01-01

    Ischaemia, loss of neural tissue, glial cell activation and tissue remodelling are symptoms of anterior ischaemic as well as glaucomatous optic neuropathy leading to pallor of the optic nerve head. Here, we describe a simple method for the pallor measurement using a fundus camera equipped with a colour CCD camera and a special dual bandpass filter. The reproducibility of the determined mean pallor value was 11.7% (coefficient of variation for repeated measurements in the same subject); the variation over six healthy subjects was 14.8%. A significant difference between the mean pallor of an atrophic disc and that of the contralateral eye of the same individual was found. However, even the clinically unaffected eye showed a significantly increased pallor compared to the mean of the healthy control group. Thus, optic disc pallor measurement, as described here, may be helpful in the early detection and follow-up of optic neuropathy
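
    The reproducibility figures quoted are coefficients of variation: sample standard deviation over mean, in percent. A small sketch with made-up repeated pallor readings (not data from the study):

```python
import statistics

def coefficient_of_variation(values):
    """Reproducibility metric used in the abstract: sample standard
    deviation over the mean, expressed in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical repeated pallor readings for one subject
cv = coefficient_of_variation([0.50, 0.56, 0.44, 0.55, 0.48])
```
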

  4. Counting neutrons with a commercial S-CMOS camera

    Science.gov (United States)

    Patrick, Van Esch; Paolo, Mutti; Emilio, Ruiz-Martinez; Estefania, Abad Garcia; Marita, Mosconi; Jon, Ortega

    2018-01-01

    It is possible to detect individual flashes from thermal neutron impacts in a ZnS scintillator using a CMOS camera looking at the scintillator screen, together with off-line image processing. Preliminary results indicated that the efficiency of recognition could be improved by optimizing the light collection and the image processing. We report on this ongoing work, a result of the collaboration between ESS Bilbao and the ILL. The main progress to be reported concerns the on-line treatment of the imaging data: if this technology is to work on a genuine scientific instrument, all the processing must happen on line, to avoid accumulating large amounts of image data to be analyzed off line. An FPGA-based real-time full-deca mode VME-compatible CameraLink board has been developed at the SCI of the ILL, which is able to manage the data flow from the camera and convert it into a reasonable "neutron impact" data flow like that from a usual neutron counting detector. The main challenge of the endeavor is the optical light collection from the scintillator. While the light yield of a ZnS scintillator is a priori rather high, the amount of light collected with a photographic objective is small. Different scintillators and different light collection techniques have been experimented with, and results will be shown for different setups improving the light recuperation on the camera sensor. Improvements on the algorithm side will also be presented. The algorithms have to be efficient in their recognition of neutron signals and in their rejection of noise signals (internal and external to the camera), but also simple enough to be easily implemented in the FPGA. The path from the idea of detecting individual neutron impacts with a CMOS camera to a practical working instrument detector is challenging, and in this paper we will give an overview of the part of the road that has already been walked.
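
    The counting step such a pipeline must perform on line is conceptually simple: threshold a frame and count connected bright blobs. A pure-Python sketch of that step (the real on-line processing, as the abstract notes, also applies noise-rejection cuts that this sketch omits):

```python
import numpy as np
from collections import deque

def count_flashes(frame, threshold):
    """Count candidate neutron impacts in one frame: threshold the
    image, then count 4-connected bright blobs with a flood fill."""
    bright = frame > threshold
    seen = np.zeros_like(bright)
    count = 0
    for y, x in zip(*np.nonzero(bright)):
        if seen[y, x]:
            continue
        count += 1
        queue = deque([(y, x)])
        seen[y, x] = True
        while queue:
            cy, cx = queue.popleft()
            for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                           (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < bright.shape[0] and 0 <= nx < bright.shape[1]
                        and bright[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    queue.append((ny, nx))
    return count

# Two synthetic scintillation flashes on a dark 20x20 frame
frame = np.zeros((20, 20))
frame[3:5, 3:5] = 50.0
frame[12:14, 15:17] = 80.0
n = count_flashes(frame, 10.0)
```

    The FPGA implementation constraint mentioned in the abstract is exactly why the per-frame logic has to stay this simple.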

  5. Counting neutrons with a commercial S-CMOS camera

    Directory of Open Access Journals (Sweden)

    Patrick Van Esch

    2018-01-01

    Full Text Available It is possible to detect individual flashes from thermal neutron impacts in a ZnS scintillator using a CMOS camera looking at the scintillator screen, together with off-line image processing. Preliminary results indicated that the recognition efficiency could be improved by optimizing the light collection and the image processing. We report on this ongoing work, a result of the collaboration between ESS Bilbao and the ILL. The main progress reported here concerns the on-line treatment of the imaging data. If this technology is to work on a genuine scientific instrument, all the processing must happen on line, to avoid accumulating large amounts of image data to be analyzed off line. An FPGA-based real-time full-deca mode VME-compatible CameraLink board has been developed at the SCI of the ILL, which is able to manage the data flow from the camera and convert it into a reasonable “neutron impact” data flow resembling that of a usual neutron counting detector. The main challenge of the endeavor is the optical light collection from the scintillator. While the light yield of a ZnS scintillator is a priori rather high, the amount of light collected with a photographic objective is small. Different scintillators and different light collection techniques have been experimented with, and results will be shown for different setups improving the light recovery on the camera sensor. Improvements on the algorithm side will also be presented. The algorithms have to be efficient in their recognition of neutron signals and in their rejection of noise signals (internal and external to the camera), but also simple enough to be easily implemented in the FPGA. The path from the idea of detecting individual neutron impacts with a CMOS camera to a practical working instrument detector is challenging, and in this paper we give an overview of the part of the road that has already been walked.

  6. Quantification of atmospheric visibility with dual digital cameras during daytime and nighttime

    Directory of Open Access Journals (Sweden)

    K. Du

    2013-08-01

    Full Text Available A digital optical method, "DOM-Vis", was developed to measure atmospheric visibility. In this method, two digital pictures are taken of the same target at two different distances along the same straight line. The pictures are analyzed to determine the optical contrasts between the target and its sky background, and subsequently the visibility is calculated. A light transfer scheme for DOM-Vis was delineated, based upon which algorithms were developed for both daytime and nighttime scenarios. A series of field tests was carried out under different weather and meteorological conditions to study the impacts of such operational parameters as exposure, optical zoom, distance between the two camera locations, and distance to the target. The method was validated by comparing the DOM-Vis results with those measured using a co-located Vaisala® visibility meter. The visibility under which this study was carried out ranged from 1 to 20 km. This digital-photography-based method possesses a number of advantages compared with traditional methods: pre-calibration of the detector with a visibility meter is not required, and the application of DOM-Vis is independent of several factors, such as the exact distance to the target and several camera settings. These features make DOM-Vis more adaptive under a variety of field conditions.
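The two-distance contrast retrieval lends itself to a compact sketch. Assuming Koschmieder's law for daytime contrast attenuation (the abstract's nighttime algorithm differs) and the conventional 5 % contrast threshold, with illustrative function and parameter names:

```python
import math

def visibility_km(c_near, c_far, d_near_km, d_far_km, threshold=0.05):
    """Estimate visibility from target/sky contrasts at two distances.

    Koschmieder's law gives C(d) = C0 * exp(-sigma * d), so the
    extinction coefficient sigma follows from the contrast ratio, and
    visibility is the distance at which contrast falls to `threshold`
    (5 % by convention).
    """
    sigma = math.log(c_near / c_far) / (d_far_km - d_near_km)
    return -math.log(threshold) / sigma
```

Note that the absolute target distance cancels out of the extinction estimate; only the separation between the two camera positions enters, consistent with the independence claimed above.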

  7. The influence of flywheel micro vibration on space camera and vibration suppression

    Science.gov (United States)

    Li, Lin; Tan, Luyang; Kong, Lin; Wang, Dong; Yang, Hongbo

    2018-02-01

    We studied the impact of flywheel micro vibration on a high-resolution, space-borne integrated optical satellite. By testing the flywheel micro vibration with a six-component test bench, the flywheel disturbance data were acquired. A finite element model of the satellite was established and unit forces/torques were applied at the flywheel mounting position to obtain the micro vibration response of the camera. Integrated analysis of the two data sets showed that the influence of flywheel micro vibration on the camera is concentrated mainly around 60-80 Hz and 170-230 Hz; the largest angular displacement of the secondary mirror along the optical axis is 0.04″ and the maximum angular displacement perpendicular to the optical axis is 0.032″. After the design and installation of a vibration isolator, the maximum angular displacement of the secondary mirror is 0.011″, and the decay rate of the root mean square value of the angular displacement exceeds 50%, with a maximum of 96.78%. The whole satellite was suspended to simulate the on-orbit boundary condition; the imaging experiment results show that the image motion caused by the flywheel micro vibration is less than 0.1 pixel after installing the vibration isolator.
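The quoted decay rate of the RMS angular displacement is a simple before/after ratio; a minimal sketch with hypothetical sample values (names illustrative):

```python
import math

def rms(samples):
    """Root mean square of a sequence of displacement samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def decay_rate(before, after):
    """Fractional reduction of RMS angular displacement after isolation."""
    return 1.0 - rms(after) / rms(before)
```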

  8. Contributed Review: Camera-limits for wide-field magnetic resonance imaging with a nitrogen-vacancy spin sensor

    Science.gov (United States)

    Wojciechowski, Adam M.; Karadas, Mürsel; Huck, Alexander; Osterkamp, Christian; Jankuhn, Steffen; Meijer, Jan; Jelezko, Fedor; Andersen, Ulrik L.

    2018-03-01

    Sensitive, real-time optical magnetometry with nitrogen-vacancy centers in diamond relies on accurate imaging of small (≪10^-2) fractional fluorescence changes across the diamond sample. We discuss the limitations on magnetic field sensitivity resulting from the limited number of photoelectrons that a camera can record in a given time. Several types of camera sensors are analyzed, and the smallest measurable magnetic field change is estimated for each type. We show that most common sensors are of limited use in such applications, while certain highly specific cameras allow achieving nanotesla-level sensitivity in 1 s of combined exposure. Finally, we demonstrate the results obtained with a lock-in camera that paves the way for real-time, wide-field magnetometry at the nanotesla level and with a micrometer resolution.
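The photoelectron-counting limit analyzed above can be estimated from shot-noise statistics alone: the smallest detectable fractional fluorescence change over a combined exposure is roughly one over the square root of the total number of recorded photoelectrons. A sketch under the idealized assumption that every pixel fills to full well each frame (all names and numbers illustrative, not the paper's sensor survey):

```python
import math

def min_fractional_change(full_well_e, n_pixels, frames_per_s, exposure_s):
    """Shot-noise-limited smallest measurable fractional fluorescence
    change, assuming every pixel saturates its full well each frame.

    Real sensors collect fewer photoelectrons, so this is a lower bound.
    """
    n_e = full_well_e * n_pixels * frames_per_s * exposure_s
    return 1.0 / math.sqrt(n_e)
```

For a single pixel with a 30 ke⁻ full well at 100 frames per second, one second of exposure yields 3×10⁶ photoelectrons and a fractional-change floor near 6×10⁻⁴, illustrating why high-full-well, high-frame-rate sensors dominate this application.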

  9. Accurate estimation of camera shot noise in the real-time

    Science.gov (United States)

    Cheremkhin, Pavel A.; Evtikhiev, Nikolay N.; Krasnov, Vitaly V.; Rodin, Vladislav G.; Starikov, Rostislav S.

    2017-10-01

    Nowadays digital cameras are essential parts of various technological processes and daily tasks. They are widely used in optics and photonics, astronomy, biology and other fields of science and technology, such as control systems and video-surveillance monitoring. One of the main information limitations of photo- and videocameras is the noise of the photosensor pixels. A camera's photosensor noise can be divided into random and pattern components: temporal noise includes the random component, while spatial noise includes the pattern component. Temporal noise can be divided into signal-dependent shot noise and signal-independent dark temporal noise. For measurement of camera noise characteristics, the most widely used methods are standards (for example, the EMVA 1288 standard). They allow precise shot and dark temporal noise measurement but are difficult to implement and time-consuming. Earlier we proposed a method for measurement of the temporal noise of photo- and videocameras, based on the automatic segmentation of nonuniform targets (ASNT). Only two frames are sufficient for noise measurement with the modified method. In this paper, we registered frames and estimated the shot and dark temporal noises of cameras in real time using the modified ASNT method. Estimation was performed for the following cameras: the consumer photocamera Canon EOS 400D (CMOS, 10.1 MP, 12-bit ADC), the scientific camera MegaPlus II ES11000 (CCD, 10.7 MP, 12-bit ADC), the industrial camera PixeLink PL-B781F (CMOS, 6.6 MP, 10-bit ADC) and the video-surveillance camera Watec LCL-902C (CCD, 0.47 MP, external 8-bit ADC). Experimental dependencies of temporal noise on signal value are in good agreement with fitted curves based on a Poisson distribution, excluding areas near saturation. The time for registering and processing the frames used for temporal noise estimation was measured: using a standard computer, frames were registered and processed in a fraction of a second to several seconds. Also the
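The Poisson behaviour noted above means the temporal variance grows linearly with signal. A minimal photon-transfer-style line fit recovering the conversion gain (slope) and dark noise variance (intercept) — an illustrative stand-in, not the authors' ASNT implementation:

```python
def fit_photon_transfer(means, variances):
    """Least-squares fit of variance = gain * mean + dark_var.

    For Poisson-distributed shot noise the temporal variance grows
    linearly with the mean signal; the slope is the conversion gain
    and the intercept the dark temporal noise variance.
    """
    n = len(means)
    mx = sum(means) / n
    my = sum(variances) / n
    sxx = sum((x - mx) ** 2 for x in means)
    sxy = sum((x - mx) * (y - my) for x, y in zip(means, variances))
    gain = sxy / sxx
    dark_var = my - gain * mx
    return gain, dark_var
```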

  10. A Python Software Toolbox for the Analysis of SO2 Camera Data. Implications in Geosciences

    Directory of Open Access Journals (Sweden)

    Jonas Gliß

    2017-12-01

    Full Text Available Ultraviolet (UV) SO2 cameras have become a common tool to measure and monitor SO2 emission rates, mostly from volcanoes but also from anthropogenic sources (e.g., power plants or ships). Over the past decade, the analysis of UV SO2 camera data has seen many improvements. As a result, for many of the required analysis steps, several alternatives exist today (e.g., cell vs. DOAS based camera calibration; optical flow vs. cross-correlation based gas-velocity retrieval). This inspired the development of Pyplis (Python plume imaging software), an open-source software toolbox written in Python 2.7, which unifies the most prevalent methods from the literature within a single, cross-platform analysis framework. Pyplis comprises a vast collection of algorithms relevant for the analysis of UV SO2 camera data. These include several routines to retrieve plume background radiances as well as routines for cell and DOAS based camera calibration. The latter includes two independent methods to identify the DOAS field-of-view (FOV) within the camera images (based on (1) Pearson correlation and (2) the IFR inversion method). Plume velocities can be retrieved using an optical flow algorithm as well as signal cross-correlation. Furthermore, Pyplis includes a routine to perform a first order correction of the signal dilution effect (also referred to as light dilution). All required geometrical calculations are performed within a 3D model environment allowing for distance retrievals to plume and local terrain features on a pixel basis. SO2 emission rates can be retrieved simultaneously for an arbitrary number of plume intersections. Hence, Pyplis provides a state-of-the-art framework for more efficient and flexible analyses of UV SO2 camera data and, therefore, marks an important step forward towards more transparency, reliability and inter-comparability of the results. Pyplis has been extensively and successfully tested using data from several field campaigns. Here, the main features
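The cross-correlation velocity retrieval mentioned above can be sketched generically: find the time lag that best aligns SO2 column-density series recorded at two plume cross sections a known distance apart, then convert lag to speed. This is an illustrative stand-in, not Pyplis' API (function names are assumptions):

```python
def lag_of_max_correlation(a, b):
    """Sample lag at which series `b` best matches series `a`.

    Plain sliding dot product; adequate for an isolated pulse-like
    feature, whereas production code would normalize per overlap.
    """
    n = len(a)
    best_lag, best_score = 0, float("-inf")
    for lag in range(n):
        score = sum(a[i] * b[i + lag] for i in range(n - lag))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def plume_velocity(distance_m, lag, frame_interval_s):
    """Gas velocity implied by a lag between two cross sections."""
    return distance_m / (lag * frame_interval_s)
```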

  11. Design of the high resolution optical instrument for the Pleiades HR Earth observation satellites

    Science.gov (United States)

    Lamard, Jean-Luc; Gaudin-Delrieu, Catherine; Valentini, David; Renard, Christophe; Tournier, Thierry; Laherrere, Jean-Marc

    2017-11-01

    As part of its contribution to Earth observation from space, ALCATEL SPACE designed, built and tested the high resolution cameras for the European intelligence satellites HELIOS I and II. Through these programmes, ALCATEL SPACE has earned an international reputation; its capability and experience in high resolution instrumentation are recognised by most customers. Following the SPOT programme, it was decided to go ahead with the PLEIADES HR programme. PLEIADES HR is the optical high resolution component of a larger optical and radar multi-sensor system, ORFEO, which is developed in cooperation between France and Italy for dual civilian and defense use. ALCATEL SPACE has been entrusted by CNES with the development of the high resolution camera of the Earth observation satellites PLEIADES HR. The first optical satellite of the PLEIADES HR constellation will be launched in mid-2008; the second will follow in 2009. To minimize development costs, a mini satellite approach has been selected, leading to a compact concept for the camera design. The paper describes the design and performance budgets of this novel high resolution, large field of view optical instrument, with emphasis on its technological features. This new generation of camera represents a breakthrough in comparison with the previous SPOT cameras owing to a significant step in on-ground resolution, which approaches the capabilities of aerial photography. Recent advances in detector technology, optical fabrication and electronics make it possible for the PLEIADES HR camera to achieve its image quality performance goals while staying within weight and size restrictions normally considered suitable only for much lower performance systems. This camera design delivers superior performance using an innovative low power, low mass, scalable architecture, which provides a versatile approach for a variety of imaging requirements and allows for a wide number of possibilities of accommodation with a mini satellite.

  12. The image camera of the 17 m diameter air Cherenkov telescope MAGIC

    CERN Document Server

    Ostankov, A P

    2001-01-01

    The image camera of the 17 m diameter MAGIC telescope, an air Cherenkov telescope currently under construction for installation on the Canary island of La Palma, is described. The main goal of the experiment is to cover the unexplored energy window from approx 10 to approx 300 GeV in gamma-ray astrophysics. In its first phase, with a classical PMT camera, the MAGIC telescope is expected to reach an energy threshold of approx 30 GeV. The operational conditions, the special characteristics of the developed PMTs and their use with light concentrators, the fast signal transfer scheme using analog optical links, the trigger and DAQ organization, as well as the image reconstruction strategy are described. The different paths being explored towards future camera improvements, in particular the constraints on using silicon avalanche photodiodes and GaAsP hybrid photodetectors in air Cherenkov telescopes, are discussed.

  13. Optical system for laser triggering of PBFA II

    International Nuclear Information System (INIS)

    Hamil, R.A.; Seamons, L.O.; Schanwald, L.P.; Gerber, R.A.

    1985-01-01

    The PBFA II laser triggering optical system consists of nearly 300 optical components. These optics must be sufficiently precise to preserve the laser beam quality, as well as to distribute the energy of the UV laser beam equally to the 36 gas-filled 5.5 MV switches at precisely the same instant. Both the index variation and the cleanliness of the air along the laser path must be controlled. The manual alignment system is capable of alignment to better than the acceptable error of 200 microradians (laser to switches). A technique has been devised to ease the alignment procedure by using a special high gain video camera and a tool alignment telescope to view retroreflective tape targets having optical brightness gains over white surfaces of 10^3. The camera is a charge-coupled detector intensified by a double microchannel plate having an optical gain of between 10^4 and 10^5.

  14. SAAO's new robotic telescope and WiNCam (Wide-field Nasmyth Camera)

    Science.gov (United States)

    Worters, Hannah L.; O'Connor, James E.; Carter, David B.; Loubser, Egan; Fourie, Pieter A.; Sickafoose, Amanda; Swanevelder, Pieter

    2016-08-01

    The South African Astronomical Observatory (SAAO) is designing and manufacturing a wide-field camera for use on two of its telescopes. The initial concept was of a prime focus camera for the 74" telescope, an equatorial design made by Grubb Parsons, where it would employ a 61mmx61mm detector to cover a 23 arcmin diameter field of view. However, while in the design phase, SAAO embarked on the process of acquiring a bespoke 1-metre robotic alt-az telescope with a 43 arcmin field of view, which needs a homegrown instrument suite. The prime focus camera design was thus adapted for use on either telescope, increasing the detector size to 92mmx92mm. Since the camera will be mounted on the Nasmyth port of the new telescope, it was dubbed WiNCam (Wide-field Nasmyth Camera). This paper describes both WiNCam and the new telescope. Producing an instrument that can be swapped between two very different telescopes poses some unique challenges. At the Nasmyth port of the alt-az telescope there is ample circumferential space, while on the 74 inch the available envelope is constrained by the optical footprint of the secondary, if further obscuration is to be avoided. This forces the design into a cylindrical volume of 600mm diameter x 250mm height. The back focal distance is tightly constrained on the new telescope, shoehorning the shutter, filter unit, guider mechanism, a 10mm thick window and a tip/tilt mechanism for the detector into 100mm depth. The iris shutter and filter wheel planned for prime focus could no longer be accommodated. Instead, a compact shutter with a thickness of less than 20mm has been designed in-house, using a sliding curtain mechanism to cover an aperture of 125mmx125mm, while the filter wheel has been replaced with 2 peripheral filter cartridges (6 filters each) and a gripper to move a filter into the beam. We intend to use through-vacuum-wall PCB technology across the cryostat vacuum interface, instead of traditional hermetic connector-based wiring. This

  15. Gamma camera

    International Nuclear Information System (INIS)

    Tschunt, E.; Platz, W.; Baer, Ul; Heinz, L.

    1978-01-01

    A gamma camera has a plurality of exchangeable collimators, one of which is replaceably mounted in the ray inlet opening of the camera, while the others are placed on separate supports. The supports are swingably mounted upon a column, one above the other.

  16. Touch And Go Camera System (TAGCAMS) for the OSIRIS-REx Asteroid Sample Return Mission

    Science.gov (United States)

    Bos, B. J.; Ravine, M. A.; Caplinger, M.; Schaffner, J. A.; Ladewig, J. V.; Olds, R. D.; Norman, C. D.; Huish, D.; Hughes, M.; Anderson, S. K.; Lorenz, D. A.; May, A.; Jackman, C. D.; Nelson, D.; Moreau, M.; Kubitschek, D.; Getzandanner, K.; Gordon, K. E.; Eberhardt, A.; Lauretta, D. S.

    2018-02-01

    NASA's OSIRIS-REx asteroid sample return mission spacecraft includes the Touch And Go Camera System (TAGCAMS) three camera-head instrument. The purpose of TAGCAMS is to provide imagery during the mission to facilitate navigation to the target asteroid, confirm acquisition of the asteroid sample, and document asteroid sample stowage. The cameras were designed and constructed by Malin Space Science Systems (MSSS) based on requirements developed by Lockheed Martin and NASA. All three of the cameras are mounted to the spacecraft nadir deck and provide images in the visible part of the spectrum, 400-700 nm. Two of the TAGCAMS cameras, NavCam 1 and NavCam 2, serve as fully redundant navigation cameras to support optical navigation and natural feature tracking. Their boresights are aligned in the nadir direction with small angular offsets for operational convenience. The third TAGCAMS camera, StowCam, provides imagery to assist with and confirm proper stowage of the asteroid sample. Its boresight is pointed at the OSIRIS-REx sample return capsule located on the spacecraft deck. All three cameras have at their heart a 2592 × 1944 pixel complementary metal oxide semiconductor (CMOS) detector array that provides up to 12-bit pixel depth. All cameras also share the same lens design and a camera field of view of roughly 44° × 32° with a pixel scale of 0.28 mrad/pixel. The StowCam lens is focused to image features on the spacecraft deck, while both NavCam lens focus positions are optimized for imaging at infinity. A brief description of the TAGCAMS instrument and how it is used to support critical OSIRIS-REx operations is provided.
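The quoted pixel scale and detector format can be cross-checked with a small-angle calculation; the result lands near the stated "roughly 44° × 32°" field of view (the exact figure also depends on distortion and on how the FOV is defined):

```python
import math

def fov_deg(n_pixels, pixel_scale_mrad):
    """Field of view implied by detector extent and angular pixel scale,
    in the small-angle approximation."""
    return math.degrees(n_pixels * pixel_scale_mrad / 1000.0)
```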

  17. Development and characterization of a CCD camera system for use on six-inch manipulator systems

    International Nuclear Information System (INIS)

    Logory, L.M.; Bell, P.M.; Conder, A.D.; Lee, F.D.

    1996-01-01

    The Lawrence Livermore National Laboratory has designed, constructed, and fielded a compact CCD camera system for use on the Six Inch Manipulator (SIM) at the Nova laser facility. The camera system has been designed to directly replace the 35 mm film packages on all active SIM-based diagnostics. The unit's electronic package is constructed for small size and high thermal conductivity using proprietary printed circuit board technology, thus reducing the size of the overall camera and improving its performance when operated within the vacuum environment of the Nova laser target chamber. The camera has been calibrated and found to yield a linear response, with superior dynamic range and signal-to-noise levels compared to T-Max 3200 film, while providing real-time access to the data. Limiting factors related to fielding such devices on Nova will be discussed, in addition to planned improvements of the current design.

  18. OCAMS: The OSIRIS-REx Camera Suite

    Science.gov (United States)

    Rizk, B.; Drouet d'Aubigny, C.; Golish, D.; Fellows, C.; Merrill, C.; Smith, P.; Walker, M. S.; Hendershot, J. E.; Hancock, J.; Bailey, S. H.; DellaGiustina, D. N.; Lauretta, D. S.; Tanner, R.; Williams, M.; Harshman, K.; Fitzgibbon, M.; Verts, W.; Chen, J.; Connors, T.; Hamara, D.; Dowd, A.; Lowman, A.; Dubin, M.; Burt, R.; Whiteley, M.; Watson, M.; McMahon, T.; Ward, M.; Booher, D.; Read, M.; Williams, B.; Hunten, M.; Little, E.; Saltzman, T.; Alfred, D.; O'Dougherty, S.; Walthall, M.; Kenagy, K.; Peterson, S.; Crowther, B.; Perry, M. L.; See, C.; Selznick, S.; Sauve, C.; Beiser, M.; Black, W.; Pfisterer, R. N.; Lancaster, A.; Oliver, S.; Oquest, C.; Crowley, D.; Morgan, C.; Castle, C.; Dominguez, R.; Sullivan, M.

    2018-02-01

    The OSIRIS-REx Camera Suite (OCAMS) will acquire images essential to collecting a sample from the surface of Bennu. During proximity operations, these images will document the presence of satellites and plumes, record spin state, enable an accurate model of the asteroid's shape, and identify any surface hazards. They will confirm the presence of sampleable regolith on the surface, observe the sampling event itself, and image the sample head in order to verify its readiness to be stowed. They will document Bennu's history as an example of early solar system material, as a microgravity body with a planetesimal size-scale, and as a carbonaceous object. OCAMS is fitted with three cameras. The MapCam will record color images of Bennu as a point source on approach to the asteroid in order to connect Bennu's ground-based point-source observational record to later higher-resolution surface spectral imaging. The SamCam will document the sample site before, during, and after it is disturbed by the sample mechanism. The PolyCam, using its focus mechanism, will observe the sample site at sub-centimeter resolutions, revealing surface texture and morphology. While their imaging requirements divide naturally between the three cameras, they preserve a strong degree of functional overlap. OCAMS and the other spacecraft instruments will allow the OSIRIS-REx mission to collect a sample from a microgravity body on the same visit during which it was first optically acquired from long range, a useful capability as humanity reaches out to explore near-Earth, Main-Belt and Jupiter Trojan asteroids.

  19. A high precision recipe for correcting images distorted by a tapered fiber optic

    International Nuclear Information System (INIS)

    Islam, M Sirajul; Kitchen, M J; Lewis, R A; Uesugi, K

    2010-01-01

    Images captured with a tapered fiber optic camera show significant spatial distortion, mainly because the spatial orientation of the fiber bundles is not identical at each end of the taper. We present three different techniques for the automatic distortion correction of images acquired with a charge-coupled device (CCD) camera bonded to a tapered optical fiber. In this paper we report (i) a comparison of various methods for distortion correction, (ii) an extensive quantitative analysis of the techniques, and (iii) experiments carried out using a high resolution fiber optic camera. A pinhole array was used to find control points in the distorted image space. These control points were then associated with their known true coordinates. To apply geometric correction, three different approaches were investigated: global polynomial fitting, local polynomial fitting and triangulated interpolation. Sub-pixel accuracy was achieved in all approaches, but the experimental results reveal that triangulated interpolation gave the most satisfactory distortion correction. The effect of proper alignment of the mask with the fiber optic taper (FOT) camera was also investigated. It was found that the overall dewarping error is minimal when the mask is almost parallel to the CCD.
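The global polynomial approach, in its simplest first-order (affine) form, reduces to a linear least-squares fit from distorted control-point coordinates to their true positions. A self-contained sketch (the paper's fits are higher order and locally refined; names are illustrative):

```python
def fit_affine(src, dst):
    """Least-squares affine map (order-1 global polynomial) taking
    distorted control points `src` to their true positions `dst`.

    Solves x' = a*x + b*y + c and y' = d*x + e*y + f via the
    normal equations with design rows (x, y, 1).
    """
    def solve3(rows, rhs):
        # Gauss-Jordan elimination with partial pivoting on a 3x3 system.
        m = [row[:] + [r] for row, r in zip(rows, rhs)]
        for i in range(3):
            p = max(range(i, 3), key=lambda k: abs(m[k][i]))
            m[i], m[p] = m[p], m[i]
            for k in range(3):
                if k != i:
                    f = m[k][i] / m[i][i]
                    m[k] = [mk - f * mi for mk, mi in zip(m[k], m[i])]
        return [m[i][3] / m[i][i] for i in range(3)]

    ata = [[0.0] * 3 for _ in range(3)]
    atbx = [0.0] * 3
    atby = [0.0] * 3
    for (x, y), (xp, yp) in zip(src, dst):
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                ata[i][j] += row[i] * row[j]
            atbx[i] += row[i] * xp
            atby[i] += row[i] * yp
    return solve3(ata, atbx), solve3(ata, atby)
```

Triangulated interpolation, which the paper found most accurate, instead fits one such map per triangle of control points, so residual local distortion is absorbed piecewise.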

  20. Radiation-resistant camera tube

    International Nuclear Information System (INIS)

    Kuwahata, Takao; Manabe, Sohei; Makishima, Yasuhiro

    1982-01-01

    Toshiba has long manufactured black-and-white radiation-resistant camera tubes employing nonbrowning face-plate glass for ITV cameras used in nuclear power plants. Now, in response to increasing demand in the nuclear power field, the company is developing radiation-resistant single color-camera tubes incorporating a color-stripe filter for color ITV cameras used in radiation environments. Presented herein are the results of experiments on the characteristics of materials for single color-camera tubes and the prospects for commercialization of the tubes. (author)

  1. Common aperture multispectral spotter camera: Spectro XR

    Science.gov (United States)

    Petrushevsky, Vladimir; Freiman, Dov; Diamant, Idan; Giladi, Shira; Leibovich, Maor

    2017-10-01

    The Spectro XR™ is an advanced color/NIR/SWIR/MWIR 16'' payload recently developed by Elbit Systems / ELOP. The payload's primary sensor is a spotter camera with a common 7'' aperture. The sensor suite also includes an MWIR zoom, an EO zoom, a laser designator or rangefinder, a laser pointer / illuminator and a laser spot tracker. A rigid structure, vibration damping and 4-axis gimbals enable a high level of line-of-sight stabilization. The payload's features include a multi-target video tracker, precise boresight, a strap-on IMU, an embedded moving map, a geodetic calculations suite, and image fusion. The paper describes the main technical characteristics of the spotter camera. A visible-quality, all-metal front catadioptric telescope maintains optical performance over a wide range of environmental conditions. High-efficiency coatings separate the incoming light into EO, SWIR and MWIR band channels. Both the EO and SWIR bands have dual FOV and 3 spectral filters each. Several variants of focal plane array formats are supported. The common aperture design facilitates superior DRI performance in EO and SWIR in comparison to conventionally configured payloads. Special spectral calibration and color correction extend the effective range of color imaging. An advanced CMOS FPA and the low F-number of the optics facilitate low light performance. The SWIR band provides further atmospheric penetration, as well as a see-spot capability at especially long ranges, due to asynchronous pulse detection. The MWIR band has good sharpness over the entire field of view and (with a full HD FPA) delivers an amount of detail far exceeding that of VGA-equipped FLIRs. The Spectro XR offers a level of performance typically associated with larger and heavier payloads.

  2. The Brazilian wide field imaging camera (WFI) for the China/Brazil earth resources satellite: CBERS 3 and 4

    Science.gov (United States)

    Scaduto, L. C. N.; Carvalho, E. G.; Modugno, R. G.; Cartolano, R.; Evangelista, S. H.; Segoria, D.; Santos, A. G.; Stefani, M. A.; Castro Neto, J. C.

    2017-11-01

    The purpose of this paper is to present the optical system developed for the Wide Field Imaging Camera (WFI) that will be integrated into the CBERS 3 and 4 satellites (China-Brazil Earth Resources Satellite). This camera will be used for remote sensing of the Earth and is intended to operate at an altitude of 778 km. The optical system is designed for four spectral bands covering the range of wavelengths from blue to near infrared, and its field of view is ±28.63°, which covers 866 km with a ground resolution of 64 m at nadir. WFI has been developed through a consortium formed by Opto Eletrônica S. A. and Equatorial Sistemas. In particular, we present the optical analysis based on the Modulation Transfer Function (MTF) obtained during the Engineering Model (EM) phase and the optical tests performed to evaluate the requirements. Measurements of the optical system MTF have been performed using an interferometer at the wavelength of 632.8 nm, and global MTF tests (including the CCD and the signal processing electronics) have been performed using a collimator with a slit target. The results obtained showed that the performance of the optical system meets the requirements of the project.

  3. GRACE star camera noise

    Science.gov (United States)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.
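The auto-covariance function used above to expose the twice-per-rev error can be computed directly: a periodic error with period L samples shows up as a positive peak at lag L and a negative dip at lag L/2. A minimal sketch (names illustrative):

```python
import math

def autocovariance(x, lag):
    """Biased sample autocovariance of series `x` at integer `lag`."""
    n = len(x)
    mean = sum(x) / n
    return sum((x[i] - mean) * (x[i + lag] - mean)
               for i in range(n - lag)) / n
```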

  4. Design and evaluation of controls for drift, video gain, and color balance in spaceborne facsimile cameras

    Science.gov (United States)

    Katzberg, S. J.; Kelly, W. L., IV; Rowland, C. W.; Burcher, E. E.

    1973-01-01

    The facsimile camera is an optical-mechanical scanning device which has become an attractive candidate as an imaging system for planetary landers and rovers. This paper presents electronic techniques which permit the acquisition and reconstruction of high quality images with this device, even under varying lighting conditions. These techniques include a control for low frequency noise and drift, an automatic gain control, a pulse-duration light modulation scheme, and a relative spectral gain control. Taken together, these techniques allow the reconstruction of radiometrically accurate and properly balanced color images from facsimile camera video data. These techniques have been incorporated into a facsimile camera and reproduction system, and experimental results are presented for each technique and for the complete system.
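The automatic gain control described above can be sketched as a feedback loop that nudges the video gain until the mean signal level sits at a target. This is a generic illustration, not the flight electronics; the function name and loop rate are assumptions:

```python
def agc_step(gain, mean_level, target, rate=0.1):
    """One update of a simple automatic gain control loop: scale the
    video gain in proportion to the normalized level error."""
    return gain * (1.0 + rate * (target - mean_level) / target)
```

Iterating this on a scene twice as bright as the target drives the gain toward 0.5, the value at which the measured mean equals the target.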

  5. Prototypic Development and Evaluation of a Medium Format Metric Camera

    Science.gov (United States)

    Hastedt, H.; Rofallski, R.; Luhmann, T.; Rosenbauer, R.; Ochsner, D.; Rieke-Zapp, D.

    2018-05-01

    Engineering applications require high-precision 3D measurement techniques for object sizes that vary between small volumes (2-3 m in each direction) and large volumes (around 20 x 20 x 1-10 m). The requested precision in object space (1σ RMS) is defined to be within 0.1-0.2 mm for large volumes and less than 0.01 mm for small volumes. In particular, for large volume applications the availability of a metric camera would offer several advantages: 1) high-quality optical components and stabilisations allow for a stable interior geometry of the camera itself, 2) a stable geometry leads to a stable interior orientation that enables a priori camera calibration, 3) a higher resulting precision can be expected. With this article the development and accuracy evaluation of a new metric camera, the ALPA 12 FPS add|metric, will be presented. Its general accuracy potential is tested against calibrated lengths in a small volume test environment based on the German guideline VDI/VDE 2634.1 (2002). Maximum length measurement errors of less than 0.025 mm are achieved across the different scenarios tested. The accuracy potential for large volumes is estimated within a feasibility study on the application of photogrammetric measurements for the deformation estimation on a large wooden shipwreck in the German Maritime Museum. An accuracy of 0.2 mm-0.4 mm is reached for a length of 28 m (given by a distance from a lasertracker network measurement). All analyses have proven high stability of the interior orientation of the camera and indicate the applicability of a priori camera calibration for subsequent 3D measurements.
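The VDI/VDE 2634.1-style length measurement error quoted above is the largest deviation of measured lengths from their calibrated references; a one-line sketch with hypothetical values in millimetres:

```python
def max_length_error(measured, calibrated):
    """Maximum length measurement error: largest absolute deviation of
    measured lengths from their calibrated reference lengths."""
    return max(abs(m - c) for m, c in zip(measured, calibrated))
```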

  6. PROTOTYPIC DEVELOPMENT AND EVALUATION OF A MEDIUM FORMAT METRIC CAMERA

    Directory of Open Access Journals (Sweden)

    H. Hastedt

    2018-05-01

    Full Text Available Engineering applications require high-precision 3D measurement techniques for object sizes that vary between small volumes (2–3 m in each direction) and large volumes (around 20 x 20 x 1–10 m). The requested precision in object space (1σ RMS) is defined to be within 0.1–0.2 mm for large volumes and less than 0.01 mm for small volumes. In particular, focussing on large volume applications, the availability of a metric camera would offer several advantages: 1) high-quality optical components and stabilisations allow for a stable interior geometry of the camera itself, 2) a stable geometry leads to a stable interior orientation that enables an a priori camera calibration, 3) a higher resulting precision can be expected. With this article the development and accuracy evaluation of a new metric camera, the ALPA 12 FPS add|metric, are presented. Its general accuracy potential is tested against calibrated lengths in a small volume test environment based on the German Guideline VDI/VDE 2634.1 (2002). Maximum length measurement errors of less than 0.025 mm are achieved across the different scenarios tested. The accuracy potential for large volumes is estimated within a feasibility study on the application of photogrammetric measurements to the deformation estimation of a large wooden shipwreck in the German Maritime Museum. An accuracy of 0.2 mm–0.4 mm is reached over a length of 28 m (given by a distance from a lasertracker network measurement). All analyses have proven high stability of the interior orientation of the camera and indicate the applicability of a priori camera calibration for subsequent 3D measurements.

  7. Camera processing with chromatic aberration.

    Science.gov (United States)

    Korneliussen, Jan Tore; Hirakawa, Keigo

    2014-10-01

    Since the refractive index of the materials commonly used for lenses depends on the wavelength of light, practical camera optics fail to converge light to a single point on the image plane. Known as chromatic aberration, this phenomenon distorts image details by introducing magnification error, defocus blur, and color fringes. Though achromatic and apochromatic lens designs reduce chromatic aberration to a degree, they are complex and expensive, and they do not offer a perfect correction. In this paper, we propose a new postcapture processing scheme designed to overcome these problems computationally. Specifically, the proposed solution comprises a chromatic aberration-tolerant demosaicking algorithm and a post-demosaicking chromatic aberration correction. Experiments with simulated and real sensor data verify that the chromatic aberration is effectively corrected.
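
The paper's own pipeline works at the demosaicking stage; as a simpler illustration of the geometric component it corrects, lateral chromatic aberration can be approximated as a per-channel radial magnification and undone by rescaling the red and blue channels onto the green one. The scale factors below are hypothetical, and nearest-neighbour sampling stands in for proper interpolation:

```python
import numpy as np

def rescale_channel(ch, scale):
    """Nearest-neighbour radial rescale of one channel about the image centre."""
    h, w = ch.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    # For each output pixel, sample the source pixel at radius r / scale.
    sy = np.clip(np.round(cy + (yy - cy) / scale).astype(int), 0, h - 1)
    sx = np.clip(np.round(cx + (xx - cx) / scale).astype(int), 0, w - 1)
    return ch[sy, sx]

def correct_lateral_ca(rgb, r_scale=1.002, b_scale=0.998):
    """Align R and B to G by undoing their relative magnification (hypothetical factors)."""
    out = rgb.copy()
    out[..., 0] = rescale_channel(rgb[..., 0], r_scale)
    out[..., 2] = rescale_channel(rgb[..., 2], b_scale)
    return out
```

Real corrections estimate the per-channel scale (and higher-order distortion terms) from calibration targets or inter-channel correlation, and interpolate rather than taking the nearest neighbour.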

  8. [Evaluation of Iris Morphology Viewed through Stromal Edematous Corneas by Infrared Camera].

    Science.gov (United States)

    Kobayashi, Masaaki; Morishige, Naoyuki; Morita, Yukiko; Yamada, Naoyuki; Kobayashi, Motomi; Sonoda, Koh-Hei

    2016-02-01

    We previously reported that an infrared camera enables observation of iris morphology through edematous corneas in Peters' anomaly. The purpose of this study was to observe the iris morphology in eyes with bullous keratopathy or failed grafts with an infrared camera. Eleven subjects with bullous keratopathy or failed grafts (6 men and 5 women, mean age ± SD: 72.7 ± 13.0 years) were enrolled in this study. The iris morphology was observed by applying the visible light mode and the near infrared light mode of an infrared camera (MeibomPen). The detectability of pupil shapes, iris patterns and the presence of iridectomy was evaluated. Infrared mode observation enabled us to detect the pupil shape in 11 out of 11 cases, iris patterns in 3 out of 11 cases, and the presence of iridectomy in 9 out of 11 cases, although visible light mode observation could not detect any iris morphological changes. Applying infrared optics was valuable for observation of the iris morphology through stromal edematous corneas.

  9. X-ray topography with scintillators coupled to image intensifiers or camera tubes

    International Nuclear Information System (INIS)

    Beauvais, Yves; Mathiot, Alain.

    1978-01-01

    The possibility of imaging topographic figures in real time by using a thin scintillator coupled to either a high-gain image intensifier or a camera tube is investigated. The camera tube must have a high gain because of the low photon fluxes that are encountered in practice, and because of the relatively low quantum yield of thin phosphors. With conventional X-ray generators, the resolution is photon-noise limited. With more powerful generators like synchrotrons, real-time imaging appears possible, and the resolution is limited by the modulation transfer function of the image tube. Higher resolution can be reached by increasing the magnification between the screen and the image tube. When doing so, the input field is reduced and thinner phosphor screens must be used, resulting in a lower yield. Each time the magnification is doubled, the minimum required photon flux is multiplied by about 8, so that the advantages of increasing the magnification are rapidly limited, so far as real-time imaging is concerned. Because image tube resolution is mainly limited by the modulation transfer function of the phosphor for image intensifiers, and by that of the target for camera tubes, improvement of photocathode resolution can be obtained by magnifying electron optics. A zooming electron optic would permit the field and the resolution of the tube to be adapted to the observed subject. Unfortunately such tubes do not exist at present for this type of application and in the required size

  10. Determining the Position of Head and Shoulders in Neurological Practice with the use of Cameras

    Directory of Open Access Journals (Sweden)

    P. Kutílek

    2011-01-01

    Full Text Available The posture of the head and shoulders can be influenced negatively by many diseases of the nervous system, visual and vestibular systems. We have designed a system and a set of procedures for evaluating the inclination (roll, flexion (pitch and rotation (yaw of the head and the inclination (roll and rotation (yaw of the shoulders. A new computational algorithm allows non-invasive and non-contact head and shoulder position measurement using two cameras mounted opposite each other, and the displacement of the optical axis of the cameras is also corrected.

  11. High-speed two-frame gated camera for parameters measurement of Dragon-Ⅰ LIA

    International Nuclear Information System (INIS)

    Jiang Xiaoguo; Wang Yuan; Zhang Kaizhi; Shi Jinshui; Deng Jianjun; Li Jin

    2012-01-01

    A time-resolved measurement system capable of operating at very high speed is necessary for electron beam parameter diagnosis on the Dragon-Ⅰ linear induction accelerator (LIA). A two-frame gated camera system has been developed and put into operation. The camera system adopts the optical principle of splitting the imaging light beam into two parts in the imaging space of a lens with a long focal length. It includes a lens-coupled gated image intensifier, a CCD camera, and a high-speed shutter trigger device based on a large-scale field programmable gate array. The minimum exposure time for each image is about 3 ns, and the interval time between the two images can be adjusted with a step of about 0.5 ns. The exposure time and the interval time can be independently adjusted and can reach about 1 s. The camera system features good linearity, good response uniformity, an equivalent background illumination (EBI) as low as about 5 electrons per pixel per second, a large adjustment range of sensitivity, and excellent flexibility and adaptability in applications. The camera system can capture two frame images at one time with an image size of 1024 x 1024. It meets the measurement requirements of the Dragon-Ⅰ LIA. (authors)

  12. Gamma camera

    International Nuclear Information System (INIS)

    Schlosser, P.A.; Steidley, J.W.

    1980-01-01

    The design of a collimation system for a gamma camera for use in nuclear medicine is described. When used with a 2-dimensional position-sensitive radiation detector, the novel system can produce superior images to those of conventional cameras. The optimal thickness and positions of the collimators are derived mathematically. (U.K.)

  13. Picosecond camera

    International Nuclear Information System (INIS)

    Decroisette, Michel

    A Kerr cell activated by infrared pulses of a mode-locked Nd:glass laser acts as an ultra-fast periodic shutter with an opening time of a few picoseconds. Associated with an S.T.L. camera, it gives rise to a picosecond camera allowing us to study very fast effects [fr

  14. Camera, handlens, and microscope optical system for imaging and coupled optical spectroscopy

    Science.gov (United States)

    Mungas, Greg S. (Inventor); Boynton, John (Inventor); Sepulveda, Cesar A. (Inventor); Nunes de Sepulveda, legal representative, Alicia (Inventor); Gursel, Yekta (Inventor)

    2012-01-01

    An optical system comprising two lens cells, each lens cell comprising multiple lens elements, to provide imaging over a very wide image distance and within a wide range of magnification by changing the distance between the two lens cells. An embodiment also provides scannable laser spectroscopic measurements within the field-of-view of the instrument.

  15. Compact 3D Camera for Shake-the-Box Particle Tracking

    Science.gov (United States)

    Hesseling, Christina; Michaelis, Dirk; Schneiders, Jan

    2017-11-01

    Time-resolved 3D-particle tracking usually requires the time-consuming optical setup and calibration of 3 to 4 cameras. Here, a compact four-camera housing has been developed. The performance of the system using Shake-the-Box processing (Schanz et al. 2016) is characterized. It is shown that the stereo-base is large enough for sensible 3D velocity measurements. Results from successful experiments in water flows using LED illumination are presented. For large-scale wind tunnel measurements, an even more compact version of the system is mounted on a robotic arm. Once calibrated for a specific measurement volume, the necessity for recalibration is eliminated even when the system moves around. Co-axial illumination is provided through an optical fiber in the middle of the housing, illuminating the full measurement volume from one viewing direction. Helium-filled soap bubbles are used to ensure sufficient particle image intensity. This way, the measurement probe can be moved around complex 3D-objects. By automatic scanning and stitching of recorded particle tracks, the detailed time-averaged flow field of a full volume of cubic meters in size is recorded and processed. Results from an experiment at TU-Delft of the flow field around a cyclist are shown.

  16. Per-Pixel Coded Exposure for High-Speed and High-Resolution Imaging Using a Digital Micromirror Device Camera

    Directory of Open Access Journals (Sweden)

    Wei Feng

    2016-03-01

    Full Text Available High-speed photography is an important tool for studying rapid physical phenomena. However, low-frame-rate CCD (charge coupled device) or CMOS (complementary metal oxide semiconductor) cameras cannot effectively capture such rapid phenomena at high speed and high resolution. In this paper, we incorporate the hardware restrictions of existing image sensors, design the sampling functions, and implement a hardware prototype with a digital micromirror device (DMD) camera in which spatial and temporal information can be flexibly modulated. Combined with the optical model of the DMD camera, we theoretically analyze per-pixel coded exposure and propose a three-element median quicksort method to increase the temporal resolution of the imaging system. Theoretically, this approach can increase the temporal resolution several, or even hundreds, of times without increasing the bandwidth requirements of the camera. We demonstrate the effectiveness of our method via extensive examples and achieve a 100 fps (frames per second) gain in temporal resolution by using a 25 fps camera.

  17. On the accuracy potential of focused plenoptic camera range determination in long distance operation

    Science.gov (United States)

    Sardemann, Hannes; Maas, Hans-Gerd

    2016-04-01

    Plenoptic cameras have found increasing interest in optical 3D measurement techniques in recent years. While their basic principle is 100 years old, developments in digital photography, micro-lens fabrication technology and computer hardware have boosted the field and led to several commercially available ready-to-use cameras. Beyond their popular option of a posteriori image focusing or total-focus image generation, their basic ability to generate 3D information from single-camera imagery is a very beneficial option for certain applications. The paper will first present some fundamentals on the design and history of plenoptic cameras and will describe depth determination from plenoptic camera image data. It will then present an analysis of the depth determination accuracy potential of plenoptic cameras. While most research on plenoptic camera accuracy so far has focused on close-range applications, we will focus on mid and long ranges of up to 100 m. This range is especially relevant if plenoptic cameras are discussed as potential mono-sensorial range imaging devices in (semi-)autonomous cars or in mobile robotics. The results show the expected deterioration of depth measurement accuracy with depth. At depths of 30-100 m, which may be considered typical in autonomous driving, depth errors in the order of 3% (with peaks up to 10-13 m) were obtained from processing small point clusters on an imaged target. Outliers much higher than these values were observed in single-point analysis, stressing the necessity of spatial or spatio-temporal filtering of the plenoptic camera depth measurements. Despite these obviously large errors, a plenoptic camera may nevertheless be considered a valid option for real-time robotics applications like autonomous driving or unmanned aerial and underwater vehicles, where the accuracy requirements decrease with distance.
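
The quadratic growth of depth error with range that the abstract reports follows from first-order triangulation error propagation, sigma_Z ≈ Z²/(b·f)·sigma_d, for effective stereo base b, focal length f (in pixels) and disparity uncertainty sigma_d. A sketch with hypothetical numbers, not the camera parameters used in the paper:

```python
def depth_error(z_m, baseline_m, focal_px, disparity_std_px):
    """First-order depth uncertainty of a triangulating camera:
    sigma_Z = Z^2 / (b * f) * sigma_d (small-error approximation)."""
    return z_m ** 2 / (baseline_m * focal_px) * disparity_std_px

# A small effective stereo base makes the error explode with range:
for z in (10.0, 30.0, 100.0):
    print(z, depth_error(z, baseline_m=0.01, focal_px=5000.0, disparity_std_px=0.05))
# approx. 0.1 m, 0.9 m and 10 m respectively
```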

  18. Observations of the Perseids 2012 using SPOSH cameras

    Science.gov (United States)

    Margonis, A.; Flohrer, J.; Christou, A.; Elgner, S.; Oberst, J.

    2012-09-01

    The Perseids are one of the most prominent annual meteor showers, occurring every summer when the stream of dust particles originating from the Halley-type comet 109P/Swift-Tuttle intersects the orbital path of the Earth. The dense core of this stream passes Earth's orbit on the 12th of August, producing the maximum number of meteors. The Technical University of Berlin (TUB) and the German Aerospace Center (DLR) organize observing campaigns every summer to monitor Perseid activity. The observations are carried out using the Smart Panoramic Optical Sensor Head (SPOSH) camera system [0]. The SPOSH camera has been developed by DLR and Jena-Optronik GmbH under an ESA/ESTEC contract and is designed to image faint, short-lived phenomena on dark planetary hemispheres. The camera features a highly sensitive back-illuminated 1024x1024 CCD chip and a high dynamic range of 14 bits. The custom-made fish-eye lens offers a 120°x120° field-of-view (168° over the diagonal). Figure 1: A meteor captured simultaneously by the SPOSH cameras during the 2011 observing campaign in Greece. The horizon, including surrounding mountains, can be seen in the image corners as a result of the large FOV of the camera. The observations will be made on the Greek Peloponnese peninsula, monitoring the post-peak activity of the Perseids during a one-week period around the August New Moon (14th to 21st). Two SPOSH cameras will be deployed at two remote sites at high altitudes for the triangulation of meteor trajectories captured at both stations simultaneously. The observations during this time interval will give us the possibility to study the poorly-observed post-maximum branch of the Perseid stream and to compare the results with datasets from previous campaigns which covered different periods of this long-lived meteor shower. The acquired data will be processed using dedicated software for meteor data reduction developed at TUB and DLR. Assuming a successful campaign, statistics, trajectories

  19. Advantages of computer cameras over video cameras/frame grabbers for high-speed vision applications

    Science.gov (United States)

    Olson, Gaylord G.; Walker, Jo N.

    1997-09-01

    Cameras designed to work specifically with computers can have certain advantages in comparison to the use of cameras loosely defined as 'video' cameras. In recent years the camera type distinctions have become somewhat blurred, with a great presence of 'digital cameras' aimed more at the home markets. This latter category is not considered here. The term 'computer camera' herein is intended to mean one which has low level computer (and software) control of the CCD clocking. These can often be used to satisfy some of the more demanding machine vision tasks, and in some cases with a higher rate of measurements than video cameras. Several of these specific applications are described here, including some which use recently designed CCDs which offer good combinations of parameters such as noise, speed, and resolution. Among the considerations for the choice of camera type in any given application would be such effects as 'pixel jitter,' and 'anti-aliasing.' Some of these effects may only be relevant if there is a mismatch between the number of pixels per line in the camera CCD and the number of analog to digital (A/D) sampling points along a video scan line. For the computer camera case these numbers are guaranteed to match, which alleviates some measurement inaccuracies and leads to higher effective resolution.

  20. Family Of Calibrated Stereometric Cameras For Direct Intraoral Use

    Science.gov (United States)

    Curry, Sean; Moffitt, Francis; Symes, Douglas; Baumrind, Sheldon

    1983-07-01

    In order to study empirically the relative efficiencies of different types of orthodontic appliances in repositioning teeth in vivo, we have designed and constructed a pair of fixed-focus, normal case, fully-calibrated stereometric cameras. One is used to obtain stereo photography of single teeth, at a scale of approximately 2:1, and the other is designed for stereo imaging of the entire dentition, study casts, facial structures, and other related objects at a scale of approximately 1:8. Twin lenses simultaneously expose adjacent frames on a single roll of 70 mm film. Physical flatness of the film is ensured by the use of a spring-loaded metal pressure plate. The film is forced against a 3/16" optical glass plate upon which is etched an array of 16 fiducial marks which divide the film format into 9 rectangular regions. Using this approach, it has been possible to produce photographs which are undistorted for qualitative viewing and from which quantitative data can be acquired by direct digitization of conventional photographic enlargements. We are in the process of designing additional members of this family of cameras. All calibration and data acquisition and analysis techniques previously developed will be directly applicable to these new cameras.

  1. Divergence-ratio axi-vision camera (Divcam): A distance mapping camera

    International Nuclear Information System (INIS)

    Iizuka, Keigo

    2006-01-01

    A novel distance mapping camera, the divergence-ratio axi-vision camera (Divcam), is proposed. The decay rate of the illuminating light with distance, due to the divergence of the light, is used as the means of mapping distance. Resolutions of 10 mm over a range of meters and 0.5 mm over a range of decimeters were achieved. The special features of this camera are its high-resolution real-time operation, simplicity, compactness, light weight, portability, and yet low fabrication cost. The feasibility of various potential applications is also included
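
A hedged sketch of the divergence-ratio idea: if the scene is illuminated from two axial positions and intensity decays as 1/d², the ratio of the two returns depends only on distance. The function below assumes ideal inverse-square decay and a known axial offset; names and numbers are illustrative, not the paper's implementation:

```python
import math

def divcam_distance(i_near, i_far, offset_m):
    """Distance from the intensity ratio of two sources separated by offset_m
    along the optical axis, assuming ideal 1/d^2 decay:
    i_near / i_far = ((d + offset) / d)^2  =>  d = offset / (sqrt(ratio) - 1)."""
    r = math.sqrt(i_near / i_far)
    return offset_m / (r - 1.0)

# A target at 2.0 m with sources 0.1 m apart gives ratio (2.1/2.0)^2 = 1.1025:
print(divcam_distance(1.1025, 1.0, 0.1))  # ~ 2.0 m
```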

  2. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test

    Directory of Open Access Journals (Sweden)

    Bruno Roux

    2008-11-01

    Full Text Available The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera; and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.
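
The two corrections central to the study can be sketched as pixel-wise operations: a flat-field division to remove vignetting, followed by a normalised vegetation index that cancels a common illumination factor. This is a generic illustration, not the authors' exact processing chain:

```python
import numpy as np

def correct_vignetting(image, flat_field):
    """Divide by a normalised flat-field frame to remove the radial falloff."""
    flat = flat_field / flat_field.max()
    return image / np.clip(flat, 1e-6, None)

def ndvi(nir, red):
    """Normalised difference vegetation index; ratios like this are largely
    insensitive to an illumination factor shared by both bands."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# A scene of uniform radiance seen through a vignetting lens is flat again:
flat = np.array([[1.0, 0.8], [0.8, 0.6]])
recovered = correct_vignetting(5.0 * flat, flat)
print(recovered)  # all pixels ~ 5.0
```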

  3. Optical design and development of a snapshot light-field laryngoscope

    Science.gov (United States)

    Zhu, Shuaishuai; Jin, Peng; Liang, Rongguang; Gao, Liang

    2018-02-01

    The convergence of recent advances in optical fabrication and digital processing yields a new generation of imaging technology: light-field (LF) cameras, which bridge the realms of applied mathematics, optics, and high-performance computing. Herein, for the first time, we introduce the paradigm of LF imaging into laryngoscopy. The resultant probe can image the three-dimensional shape of the vocal folds within a single camera exposure. Furthermore, to improve the spatial resolution, we developed an image fusion algorithm, providing a simple solution to a long-standing problem in LF imaging.

  4. Thermal Cameras and Applications

    DEFF Research Database (Denmark)

    Gade, Rikke; Moeslund, Thomas B.

    2014-01-01

    Thermal cameras are passive sensors that capture the infrared radiation emitted by all objects with a temperature above absolute zero. This type of camera was originally developed as a surveillance and night vision tool for the military, but recently the price has dropped significantly, opening up a broader field of applications. Deploying this type of sensor in vision systems eliminates the illumination problems of normal greyscale and RGB cameras. This survey provides an overview of the current applications of thermal cameras. Applications include animals, agriculture, buildings, gas detection, industrial, and military applications, as well as detection, tracking, and recognition of humans. Moreover, this survey describes the nature of thermal radiation and the technology of thermal cameras.

  5. Parallelised photoacoustic signal acquisition using a Fabry-Perot sensor and a camera-based interrogation scheme

    Science.gov (United States)

    Saeb Gilani, T.; Villringer, C.; Zhang, E.; Gundlach, H.; Buchmann, J.; Schrader, S.; Laufer, J.

    2018-02-01

    Tomographic photoacoustic (PA) images acquired using a Fabry-Perot (FP) based scanner offer high resolution and image fidelity but can result in long acquisition times due to the need for raster scanning. To reduce the acquisition times, a parallelised camera-based PA signal detection scheme is developed. The scheme is based on using an sCMOS camera and FP interferometer (FPI) sensors with high homogeneity of optical thickness. PA signals were acquired using the camera-based setup and the signal-to-noise ratio (SNR) was measured. A comparison is made between the SNR of PA signals detected using 1) a photodiode in a conventional raster-scanning detection scheme and 2) an sCMOS camera in the parallelised detection scheme. The results show that the parallelised interrogation scheme has the potential to provide high-speed PA imaging.

  6. Distance and velocity estimation using optical flow from a monocular camera

    NARCIS (Netherlands)

    Ho, H.W.; de Croon, G.C.H.E.; Chu, Q.

    2016-01-01

    Monocular vision is increasingly used in Micro Air Vehicles for navigation. In particular, optical flow, inspired by flying insects, is used to perceive vehicles’ movement with respect to the surroundings or sense changes in the environment. However, optical flow does not directly provide us the

  7. Radiation camera exposure control

    International Nuclear Information System (INIS)

    Martone, R.J.; Yarsawich, M.; Wolczek, W.

    1976-01-01

    A system and method for governing the exposure of an image generated by a radiation camera to an image sensing camera is disclosed. The exposure is terminated in response to the accumulation of a predetermined quantity of radiation, defining a radiation density, occurring in a predetermined area. An index is produced which represents the value of that quantity of radiation whose accumulation causes the exposure termination. The value of the predetermined radiation quantity represented by the index is sensed so that the radiation camera image intensity can be calibrated to compensate for changes in exposure amounts due to desired variations in radiation density of the exposure, to maintain the detectability of the image by the image sensing camera notwithstanding such variations. Provision is also made for calibrating the image intensity in accordance with the sensitivity of the image sensing camera, and for locating the index for maintaining its detectability and causing the proper centering of the radiation camera image

  8. Distance and velocity estimation using optical flow from a monocular camera

    NARCIS (Netherlands)

    Ho, H.W.; de Croon, G.C.H.E.; Chu, Q.

    2017-01-01

    Monocular vision is increasingly used in micro air vehicles for navigation. In particular, optical flow, inspired by flying insects, is used to perceive vehicle movement with respect to the surroundings or sense changes in the environment. However, optical flow does not directly provide us the
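
Optical flow by itself only yields ratios of motion to distance; for example, flow divergence D = v/Z gives time-to-contact without revealing velocity v or distance Z individually. A minimal sketch of this scale ambiguity and of how one known metric quantity resolves it (illustrative, not the authors' estimator):

```python
def time_to_contact(divergence_per_s):
    """Flow divergence D = v/Z yields time-to-contact tau = Z/v = 1/D,
    with neither distance Z nor velocity v known individually."""
    return 1.0 / divergence_per_s

def resolve_scale(divergence_per_s, known_velocity_m_s):
    """One metric measurement (e.g. velocity from GPS/IMU) fixes the scale."""
    return known_velocity_m_s / divergence_per_s  # distance in metres

print(time_to_contact(0.5))      # 2.0 s to contact
print(resolve_scale(0.5, 10.0))  # 20.0 m away when travelling at 10 m/s
```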

  9. Event Detection Intelligent Camera: Demonstration of flexible, real-time data taking and processing

    Energy Technology Data Exchange (ETDEWEB)

    Szabolics, Tamás, E-mail: szabolics.tamas@wigner.mta.hu; Cseh, Gábor; Kocsis, Gábor; Szepesi, Tamás; Zoletnik, Sándor

    2015-10-15

    Highlights: • We present a description of EDICAM's operation principles. • Firmware test results. • Software test results. • Further developments. - Abstract: An innovative fast camera (EDICAM – Event Detection Intelligent CAMera) was developed by MTA Wigner RCP in the last few years. This new concept was designed for intelligent event-driven processing, able to detect predefined events and track objects in the plasma. The camera provides a moderate frame rate of 400 Hz at full frame resolution (1280 × 1024), and readout of smaller regions of interest can be done in the 1–140 kHz range even during exposure of the full image. One of the most important advantages of this hardware is a 10 Gbit/s optical link which ensures very fast communication and data transfer between the PC and the camera, enabling two levels of processing: primitive algorithms in the camera hardware and high-level processing in the PC. This camera hardware has successfully proven able to monitor the plasma in several fusion devices, for example at ASDEX Upgrade, KSTAR and COMPASS, with the first version of the firmware. A new firmware and software package is under development. It allows predefined events to be detected in real time, and the camera is therefore capable of changing its own operation or giving warnings, e.g. to the safety system of the experiment. The EDICAM system can handle a huge amount of data (up to TBs) at a high data rate (950 MB/s) and will be used as the central element of the 10-camera overview video diagnostic system of the Wendelstein 7-X (W7-X) stellarator. This paper presents key elements of the newly developed built-in intelligence, stressing the revolutionary new features and the results of tests of the different software elements.
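
The quoted figures are roughly self-consistent: full-frame readout at 400 Hz lands near the stated 950 MB/s link budget, assuming on the order of 2 bytes per pixel (an assumption; the abstract does not state the pixel depth):

```python
def data_rate_mb_s(width_px, height_px, fps, bytes_per_px=2):
    """Raw sensor data rate in MB/s (decimal megabytes)."""
    return width_px * height_px * fps * bytes_per_px / 1e6

print(data_rate_mb_s(1280, 1024, 400))  # ~1049 MB/s, near the quoted 950 MB/s
```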

  10. CCD Camera Lens Interface for Real-Time Theodolite Alignment

    Science.gov (United States)

    Wake, Shane; Scott, V. Stanley, III

    2012-01-01

    Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles, and to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in the position of components, and a theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter that attaches a CCD camera with lens to the eyepiece of a Leica Wild T3000 theodolite, enabling viewing on a connected monitor; it can thus be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

  11. Adapting Virtual Camera Behaviour

    DEFF Research Database (Denmark)

    Burelli, Paolo

    2013-01-01

    In a three-dimensional virtual environment, aspects such as narrative and interaction completely depend on the camera, since the camera defines the player’s point of view. Most research works in automatic camera control aim to take control of this aspect from the player in order to automatically generate...

  12. Cryogenic optical systems for the rapid infrared imager/spectrometer (RIMAS)

    Science.gov (United States)

    Capone, John I.; Content, David A.; Kutyrev, Alexander S.; Robinson, Frederick D.; Lotkin, Gennadiy N.; Toy, Vicki L.; Veilleux, Sylvain; Moseley, Samuel H.; Gehrels, Neil A.; Vogel, Stuart N.

    2014-07-01

    The Rapid Infrared Imager/Spectrometer (RIMAS) is designed to perform follow-up observations of transient astronomical sources at near infrared (NIR) wavelengths (0.9 - 2.4 microns). In particular, RIMAS will be used to perform photometric and spectroscopic observations of gamma-ray burst (GRB) afterglows to complement the Swift satellite's science goals. Upon completion, RIMAS will be installed on Lowell Observatory's 4.3 meter Discovery Channel Telescope (DCT) located in Happy Jack, Arizona. The instrument's optical design includes a collimator lens assembly, a dichroic to divide the wavelength coverage into two optical arms (0.9 - 1.4 microns and 1.4 - 2.4 microns respectively), and a camera lens assembly for each optical arm. Because the wavelength coverage extends out to 2.4 microns, all optical elements are cooled to ~70 K. Filters and transmission gratings are located on wheels prior to each camera, allowing the instrument to be quickly configured for photometry or spectroscopy. An athermal optomechanical design is being implemented to prevent the lenses from losing their room-temperature alignment as the system is cooled. The thermal expansion of the materials used in this design has been measured in the lab. Additionally, RIMAS has a guide camera consisting of four lenses to aid observers in passing light from target sources through spectroscopic slits. Efforts to align these optics are ongoing.

  13. A luminescence imaging system based on a CCD camera

    DEFF Research Database (Denmark)

    Duller, G.A.T.; Bøtter-Jensen, L.; Markey, B.G.

    1997-01-01

    Stimulated luminescence arising from naturally occurring minerals is likely to be spatially heterogeneous. Standard luminescence detection systems are unable to resolve this variability. Several research groups have attempted to use imaging photon detectors, or image intensifiers linked...... to photographic systems, in order to obtain spatially resolved data. However, the former option is extremely expensive and it is difficult to obtain quantitative data from the latter. This paper describes the use of a CCD camera for imaging both thermoluminescence and optically stimulated luminescence. The system...

  14. Registration of an on-axis see-through head-mounted display and camera system

    Science.gov (United States)

    Luo, Gang; Rensing, Noa M.; Weststrate, Evan; Peli, Eli

    2005-02-01

    An optical see-through head-mounted display (HMD) system integrating a miniature camera that is aligned with the user's pupil was developed and tested. Such an HMD system has potential value in many augmented reality applications, in which registration of the virtual display to the real scene is one of the critical aspects. The camera alignment to the user's pupil results in a simple yet accurate calibration and a low registration error across a wide range of depths. In practice, a small camera-eye misalignment may still occur in such a system due to inevitable variations in HMD wearing position with respect to the eye. The effects of such errors are measured. Calculation further shows that the registration error as a function of viewing distance behaves nearly the same for different virtual image distances, except for a shift. The impact of the prismatic effect of the display lens on registration is also discussed.
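
    The registration geometry described above can be sketched with a small-angle parallax model. The function and numbers below are hypothetical: the model simply assumes a lateral camera-eye offset and perfect calibration at a single distance.

```python
import math

def registration_error_deg(misalign_m, scene_dist_m, calib_dist_m):
    """Small-angle parallax (degrees) between the camera axis and the
    eye's line of sight to a point at scene_dist_m, for a lateral
    camera-eye offset misalign_m, after the overlay was calibrated to be
    exact at calib_dist_m."""
    return math.degrees(misalign_m * (1.0 / scene_dist_m - 1.0 / calib_dist_m))

# Example: 5 mm misalignment, calibrated at 2 m, object at 0.5 m
err = registration_error_deg(0.005, 0.5, 2.0)
```

    The error vanishes at the calibration distance and grows as the scene point moves away from it, which is why aligning the camera with the pupil (offset near zero) keeps registration low over a wide depth range.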

  15. Making Ceramic Cameras

    Science.gov (United States)

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  16. INFN Camera demonstrator for the Cherenkov Telescope Array

    CERN Document Server

    Ambrosi, G; Aramo, C.; Bertucci, B.; Bissaldi, E.; Bitossi, M.; Brasolin, S.; Busetto, G.; Carosi, R.; Catalanotti, S.; Ciocci, M.A.; Consoletti, R.; Da Vela, P.; Dazzi, F.; De Angelis, A.; De Lotto, B.; de Palma, F.; Desiante, R.; Di Girolamo, T.; Di Giulio, C.; Doro, M.; D'Urso, D.; Ferraro, G.; Ferrarotto, F.; Gargano, F.; Giglietto, N.; Giordano, F.; Giraudo, G.; Iacovacci, M.; Ionica, M.; Iori, M.; Longo, F.; Mariotti, M.; Mastroianni, S.; Minuti, M.; Morselli, A.; Paoletti, R.; Pauletta, G.; Rando, R.; Fernandez, G. Rodriguez; Rugliancich, A.; Simone, D.; Stella, C.; Tonachini, A.; Vallania, P.; Valore, L.; Vagelli, V.; Verzi, V.; Vigorito, C.

    2015-01-01

    The Cherenkov Telescope Array is a world-wide project for a new generation of ground-based Cherenkov telescopes of the Imaging class with the aim of exploring the highest energy region of the electromagnetic spectrum. With two planned arrays, one for each hemisphere, it will guarantee a good sky coverage in the energy range from a few tens of GeV to hundreds of TeV, with improved angular resolution and a sensitivity in the TeV energy region better by one order of magnitude than the currently operating arrays. In order to cover this wide energy range, three different telescope types are envisaged, with different mirror sizes and focal plane features. In particular, for the highest energies a possible design is a dual-mirror Schwarzschild-Couder optical scheme, with a compact focal plane. A silicon photomultiplier (SiPM) based camera is being proposed as a solution to match the dimensions of the pixel (angular size of ~ 0.17 degrees). INFN is developing a camera demonstrator made by 9 Photo Sensor Modules (PSMs...

  17. Single photon detection and localization accuracy with an ebCMOS camera

    Energy Technology Data Exchange (ETDEWEB)

    Cajgfinger, T. [CNRS/IN2P3, Institut de Physique Nucléaire de Lyon, Villeurbanne F-69622 (France); Dominjon, A., E-mail: agnes.dominjon@nao.ac.jp [Université de Lyon, Université de Lyon 1, Lyon 69003 France. (France); Barbier, R. [CNRS/IN2P3, Institut de Physique Nucléaire de Lyon, Villeurbanne F-69622 (France); Université de Lyon, Université de Lyon 1, Lyon 69003 France. (France)

    2015-07-01

    CMOS sensor technologies evolve very fast and today offer very promising solutions to existing issues faced by imaging camera systems. CMOS sensors are very attractive for fast and sensitive imaging thanks to their low pixel noise (1e-) and their suitability for backside illumination. The ebCMOS group of IPNL has produced a camera system dedicated to Low Light Level detection, based on a 640 kPixel ebCMOS with its acquisition system. After reviewing the principle of detection of an ebCMOS and the characteristics of our prototype, we compare our camera with other imaging systems. We compare the identification efficiency and the localization accuracy of a point source by four different photo-detection devices: the scientific CMOS (sCMOS), the Charge Coupled Device (CCD), the Electron Multiplying CCD (emCCD) and the Electron Bombarded CMOS (ebCMOS). Our ebCMOS camera is able to identify a single photon source in less than 10 ms with a localization accuracy better than 1 µm. We also report efficiency measurements and the false positive identification rate of the ebCMOS camera when identifying hundreds of single photon sources in parallel. About 700 spots are identified with a detection efficiency higher than 90% and a false positive percentage lower than 5%. With these measurements, we show that our target tracking algorithm can be implemented in real time at 500 frames per second under a photon flux of the order of 8000 photons per frame. These results demonstrate that the ebCMOS camera concept, with its single photon detection and target tracking algorithm, is one of the best devices for low light and fast applications such as bioluminescence imaging, quantum dot tracking or adaptive optics.
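
    Sub-micron localization of a point source is typically obtained by computing the sub-pixel centre of mass of the detected photon spot. The following is a minimal sketch of such a centroid estimator, not the IPNL target tracking algorithm itself:

```python
import numpy as np

def centroid(roi):
    """Sub-pixel (x, y) centre of mass of a background-subtracted spot
    in a small region of interest."""
    roi = np.clip(roi, 0, None)                # ignore negative residuals
    ys, xs = np.mgrid[0:roi.shape[0], 0:roi.shape[1]]
    total = roi.sum()
    return (xs * roi).sum() / total, (ys * roi).sum() / total

# Synthetic Gaussian photon spot centred at (12.3, 7.8), sigma = 1.5 px
ys, xs = np.mgrid[0:16, 0:24]
spot = np.exp(-((xs - 12.3) ** 2 + (ys - 7.8) ** 2) / (2 * 1.5 ** 2))
cx, cy = centroid(spot)
```

    With adequate signal-to-noise, the centroid recovers the spot position to a small fraction of a pixel, which is how pixel-scale sensors reach micron-scale localization through magnifying optics.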

  18. Uncertainties in cloud phase and optical thickness retrievals from the Earth Polychromatic Imaging Camera (EPIC)

    Science.gov (United States)

    Meyer, Kerry; Yang, Yuekui; Platnick, Steven

    2018-01-01

    This paper presents an investigation of the expected uncertainties of a single channel cloud optical thickness (COT) retrieval technique, as well as a simple cloud temperature threshold based thermodynamic phase approach, in support of the Deep Space Climate Observatory (DSCOVR) mission. DSCOVR cloud products will be derived from Earth Polychromatic Imaging Camera (EPIC) observations in the ultraviolet and visible spectra. Since EPIC is not equipped with a spectral channel in the shortwave or mid-wave infrared that is sensitive to cloud effective radius (CER), COT will be inferred from a single visible channel with the assumption of appropriate CER values for liquid and ice phase clouds. One month of Aqua MODIS daytime granules from April 2005 is selected for investigating cloud phase sensitivity, and a subset of these granules that has similar EPIC sun-view geometry is selected for investigating COT uncertainties. EPIC COT retrievals are simulated with the same algorithm as the operational MODIS cloud products (MOD06), except using fixed phase-dependent CER values. Uncertainty estimates are derived by comparing the single channel COT retrievals with the baseline bi-spectral MODIS retrievals. Results show that a single channel COT retrieval is feasible for EPIC. For ice clouds, single channel retrieval errors are minimal; for liquid clouds the error is mostly limited to within 10%, although it is larger for optically thin clouds. Uncertainties arising from cloud masking and cloud temperature retrievals are not considered in this study. PMID:29619116

  19. Camera Movement in Narrative Cinema

    DEFF Research Database (Denmark)

    Nielsen, Jakob Isak

    2007-01-01

    section unearths what characterizes the literature on camera movement. The second section of the dissertation delineates the history of camera movement itself within narrative cinema. Several organizational principles subtending the on-screen effect of camera movement are revealed in section two...... but they are not organized into a coherent framework. This is the task that section three meets in proposing a functional taxonomy for camera movement in narrative cinema. Two presumptions subtend the taxonomy: That camera movement actively contributes to the way in which we understand the sound and images on the screen......, commentative or valuative manner. 4) Focalization: associating the movement of the camera with the viewpoints of characters or entities in the story world. 5) Reflexive: inviting spectators to engage with the artifice of camera movement. 6) Abstract: visualizing abstract ideas and concepts. In order...

  20. Optical identification of sea-mines - Gated viewing three-dimensional laser radar

    DEFF Research Database (Denmark)

    Busck, Jens

    2005-01-01

    A gated viewing high accuracy mono-static laser radar has been developed for the purpose of improving the optical underwater sea-mine identification handled by the Navy. In the final stage of the sea-mine detection, classification and identification process the Navy applies a remote operated...... vehicle for optical identification of the bottom sea-mine. The experimental results of the thesis indicate that replacing the conventional optical video and spotlight system applied by the Navy with the gated viewing two- and three-dimensional laser radar can improve the underwater optical sea...... of the short laser pulses (0.5 ns), the high laser pulse repetition rate (32.4 kHz), the fast gating camera (0.2 ns), the short camera delay steps (0.1 ns), the applied optical single mode fiber, and the applied algorithm for three-dimensional imaging. The gated viewing laser radar system configuration...

  1. A design of a high speed dual spectrometer by single line scan camera

    Science.gov (United States)

    Palawong, Kunakorn; Meemon, Panomsak

    2018-03-01

    A spectrometer that can capture two orthogonal polarization components of a light beam is required for a polarization sensitive imaging system. Here, we describe the design and implementation of a high speed spectrometer for simultaneous capture of two orthogonal polarization components, i.e. the vertical and horizontal components, of a light beam. The design consists of a polarization beam splitter, two polarization-maintaining optical fibers, two collimators, a single line-scan camera, a focusing lens, and a reflection blaze grating. The alignment of the two beam paths was designed to be symmetrically incident on the blaze side and reverse blaze side of the reflection grating, respectively. The two diffracted beams were passed through the same focusing lens and focused on the single line-scan sensor of a CMOS camera. The two spectra of orthogonal polarization were imaged on 1000 pixels per spectrum. With the proposed setup, the amplitude and shape of the two detected spectra can be controlled by rotating the collimators. The technique for optical alignment of the spectrometer is presented and discussed. The two orthogonal polarization spectra can be simultaneously captured at a speed of 70,000 spectra per second. The high speed dual spectrometer can simultaneously detect two orthogonal polarizations, which is an important component for the development of polarization-sensitive optical coherence tomography. The performance of the spectrometer has been measured and analyzed.
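
    The mapping from wavelength to position on the line-scan sensor follows the grating equation. A minimal sketch, with hypothetical spectrometer parameters (1200 grooves/mm, 30° incidence, 100 mm focusing lens), not the authors' design values:

```python
import math

def pixel_position_mm(wavelength_nm, grooves_per_mm, incidence_deg,
                      focal_mm, center_wavelength_nm, order=1):
    """Position on the line-scan sensor (mm from the centre pixel) of a
    given wavelength, from the grating equation
        m * lambda = d * (sin(theta_i) + sin(theta_d)),
    imaged through a lens of focal length focal_mm."""
    d_nm = 1e6 / grooves_per_mm                      # groove spacing in nm
    ti = math.radians(incidence_deg)
    td = math.asin(order * wavelength_nm / d_nm - math.sin(ti))
    td0 = math.asin(order * center_wavelength_nm / d_nm - math.sin(ti))
    return focal_mm * math.tan(td - td0)

x_center = pixel_position_mm(840.0, 1200.0, 30.0, 100.0, 840.0)
x_850 = pixel_position_mm(850.0, 1200.0, 30.0, 100.0, 840.0)
```

    Dividing the resulting spectral span by the sensor pixel pitch gives the wavelength sampling per pixel, which sets the spectral resolution of each 1000-pixel spectrum.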

  2. Beam profile measurements on the advanced test accelerator using optical techniques

    International Nuclear Information System (INIS)

    Chong, Y.P.; Kalibjian, R.; Cornish, J.P.; Kallman, J.S.; Donnelly, D.

    1986-01-01

    Beam current density profiles of ATA have been measured both spatially and temporally using a number of diagnostics. An extremely important technique involves measuring optical emissions from either a target foil inserted into the beam path or gas atoms and molecules excited by beam electrons. This paper describes the detection of the optical emission. A 2-D gated television camera with a single or dual micro-channel-plate (MCP) detector for high gain provides excellent spatial and temporal resolution. Measurements are routinely made with resolutions of 1 mm and 5 ns respectively. The optical line of sight allows splitting part of the signal to a streak camera or photometer for even higher time resolution

  3. TRANSFORMATION ALGORITHM FOR IMAGES OBTAINED BY OMNIDIRECTIONAL CAMERAS

    Directory of Open Access Journals (Sweden)

    V. P. Lazarenko

    2015-01-01

    Omnidirectional optoelectronic systems find their application in areas where a wide viewing angle is critical. However, omnidirectional optoelectronic systems have a large distortion that makes their application more difficult. The paper compares the projection functions of traditional perspective lenses and omnidirectional wide angle fish-eye lenses with a viewing angle not less than 180°. This comparison proves that the distortion models of omnidirectional cameras cannot be described as a deviation from the classic pinhole camera model. To solve this problem, an algorithm for transforming omnidirectional images has been developed. The paper provides a brief comparison of the four calibration methods available in open source toolkits for omnidirectional optoelectronic systems. The geometrical projection model used for calibration of the omnidirectional optical system is given. The algorithm consists of three basic steps. At the first step, we calculate the field of view of a virtual pinhole PTZ camera. This field of view is characterized by an array of 3D points in the object space. At the second step, the array of corresponding pixels for these three-dimensional points is calculated. Then we calculate the projection function that expresses the relation between a given 3D point in the object space and a corresponding pixel point. In this paper we use a calibration procedure that provides the projection function for the calibrated instance of the camera. At the last step, the final image is formed pixel-by-pixel from the original omnidirectional image using the calculated array of 3D points and the projection function. The developed algorithm makes it possible to obtain an image for a part of the field of view of an omnidirectional optoelectronic system, with corrected distortion, from the original omnidirectional image. The algorithm is designed for operation with omnidirectional optoelectronic systems with both catadioptric and fish-eye lenses.
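
    The three steps above can be sketched for an ideal equidistant fisheye model (r = f·θ); the projection model and parameters here are illustrative stand-ins for the calibrated projection function of a real camera:

```python
import numpy as np

def fisheye_to_pinhole_map(W, H, fov_deg, f_fish, cx, cy, R=None):
    """Steps 1-2 of the algorithm: build the ray grid of a virtual
    pinhole (PTZ) camera, then project each ray through an ideal
    equidistant fisheye model r = f * theta to get source pixel
    coordinates in the omnidirectional image."""
    if R is None:
        R = np.eye(3)                    # orientation of the virtual camera
    f_pin = (W / 2) / np.tan(np.radians(fov_deg) / 2)
    u, v = np.meshgrid(np.arange(W) - W // 2, np.arange(H) - H // 2)
    rays = np.stack([u, v, np.full_like(u, f_pin, dtype=float)],
                    axis=-1) @ R.T
    theta = np.arccos(rays[..., 2] / np.linalg.norm(rays, axis=-1))
    phi = np.arctan2(rays[..., 1], rays[..., 0])
    map_x = cx + f_fish * theta * np.cos(phi)
    map_y = cy + f_fish * theta * np.sin(phi)
    return map_x, map_y

mx, my = fisheye_to_pinhole_map(W=101, H=101, fov_deg=60,
                                f_fish=300.0, cx=500.0, cy=500.0)
```

    Step 3 is then a per-pixel resampling of the omnidirectional image at (map_x, map_y), e.g. with bilinear interpolation; the optical axis of the virtual camera (R = identity) maps to the fisheye image centre (cx, cy).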

  4. Feasibility of integrating a multi-camera optical tracking system in intra-operative electron radiation therapy scenarios

    International Nuclear Information System (INIS)

    García-Vázquez, V; Marinetto, E; Santos-Miranda, J A; Calvo, F A; Desco, M; Pascau, J

    2013-01-01

    Intra-operative electron radiation therapy (IOERT) combines surgery and ionizing radiation applied directly to an exposed unresected tumour mass or to a post-resection tumour bed. The radiation is collimated and conducted by a specific applicator docked to the linear accelerator. The dose distribution in tissues to be irradiated and in organs at risk can be planned through a pre-operative computed tomography (CT) study. However, surgical retraction of structures and resection of a tumour affecting normal tissues significantly modify the patient's geometry. Therefore, the treatment parameters (applicator dimension, pose (position and orientation), bevel angle, and beam energy) may require the original IOERT treatment plan to be modified depending on the actual surgical scenario. We propose the use of a multi-camera optical tracking system to reliably record the actual pose of the IOERT applicator in relation to the patient's anatomy in an environment prone to occlusion problems. This information can be integrated in the radio-surgical treatment planning system in order to generate a real-time accurate description of the IOERT scenario. We assessed the accuracy of the applicator pose by performing a phantom-based study that resembled three real clinical IOERT scenarios. The error obtained (2 mm) was below the acceptance threshold for external radiotherapy practice, thus encouraging future implementation of this approach in real clinical IOERT scenarios. (paper)
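
    A phantom-based accuracy check like the one described reduces to comparing target points mapped by the tracked applicator pose against the same points mapped by a reference pose. A minimal sketch with toy poses and target coordinates (not the study's phantom geometry):

```python
import numpy as np

def target_registration_error(R_est, t_est, R_ref, t_ref, targets):
    """Mean distance between target points mapped by the tracked pose
    (R_est, t_est) and by the reference pose (R_ref, t_ref); units
    follow the inputs (mm here)."""
    p_est = targets @ R_est.T + t_est
    p_ref = targets @ R_ref.T + t_ref
    return float(np.mean(np.linalg.norm(p_est - p_ref, axis=1)))

# Toy check: a pure 1 mm translation error yields a 1 mm TRE everywhere
targets_mm = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 50.0, 0.0]])
tre = target_registration_error(np.eye(3), np.array([1.0, 0.0, 0.0]),
                                np.eye(3), np.zeros(3), targets_mm)
```

    Comparing this error against the clinical acceptance threshold (2 mm in the study) is what decides whether the tracked pose can feed the treatment planning system.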

  5. Detailed Morphological Changes of Foveoschisis in Patient with X-Linked Retinoschisis Detected by SD-OCT and Adaptive Optics Fundus Camera

    Directory of Open Access Journals (Sweden)

    Keiichiro Akeo

    2015-01-01

    Purpose. To report the morphological and functional changes associated with a regression of foveoschisis in a patient with X-linked retinoschisis (XLRS). Methods. A 42-year-old man with XLRS underwent genetic analysis and detailed ophthalmic examinations. Functional assessments included best-corrected visual acuity (BCVA), full-field electroretinograms (ERGs), and multifocal ERGs (mfERGs). Morphological assessments included fundus photography, spectral-domain optical coherence tomography (SD-OCT), and adaptive optics (AO) fundus imaging. After the baseline clinical data were obtained, topical dorzolamide was applied to the patient. The patient was followed for 24 months. Results. A reported RS1 gene mutation (P203L) was found in the patient. At the baseline, his decimal BCVA was 0.15 in the right and 0.3 in the left eye. Fundus photographs showed bilateral spoke wheel-appearing maculopathy. SD-OCT confirmed the foveoschisis in the left eye. The AO images of the left eye showed spoke wheel retinal folds, and the folds were thinner than those in the fundus photographs. During the follow-up period, the foveal thickness in the SD-OCT images and the number of retinal folds in the AO images were reduced. Conclusions. We have presented the detailed morphological changes of foveoschisis in a patient with XLRS detected by SD-OCT and an AO fundus camera. However, the findings do not indicate whether the changes were influenced by topical dorzolamide or reflect the natural history.

  6. Geometric optics theory and design of astronomical optical systems using Mathematica

    CERN Document Server

    Romano, Antonio

    2016-01-01

    This text, now in its second edition, presents the mathematical background needed to design many optical combinations that are used in astronomical telescopes and cameras. It uses a novel approach to third-order aberration theory based on Fermat’s principle and the use of particular optical paths (called stigmatic paths) instead of rays, allowing for easier derivation of third-order formulae. Each optical combination analyzed is accompanied by a downloadable Mathematica® notebook that automates its third-order design, eliminating the need for lengthy calculations. The essential aspects of an optical system with an axis of rotational symmetry are introduced first, along with a development of Gaussian optics from Fermat’s principle. A simpler approach to third-order monochromatic aberrations based on both Fermat’s principle and stigmatic paths is then described, followed by a new chapter on fifth-order aberrations and their classification. Several specific optical devices are discussed and analyzed, incl...

  7. O-6 Optical Property Degradation of the Hubble Space Telescope's Wide Field Camera-2 Pick Off Mirror

    Science.gov (United States)

    McNamara, Karen M.; Hughes, D. W.; Lauer, H. V.; Burkett, P. J.; Reed, B. B.

    2011-01-01

    Degradation in the performance of optical components can be greatly affected by exposure to the space environment. Many factors can contribute to such degradation including surface contaminants; outgassing; vacuum, UV, and atomic oxygen exposure; temperature cycling; or combinations of parameters. In-situ observations give important clues to degradation processes, but there are relatively few opportunities to correlate those observations with post-flight ground analyses. The return of instruments from the Hubble Space Telescope (HST) after its final servicing mission in May 2009 provided such an opportunity. Among the instruments returned from HST was the Wide-Field Planetary Camera-2 (WFPC-2), which had been exposed to the space environment for 16 years. This work focuses on identifying the sources of degradation in the performance of the Pick-off mirror (POM) from WFPC-2. Techniques including surface reflectivity measurements, spectroscopic ellipsometry, FTIR (and ATR-FTIR) analyses, SEM/EDS, X-ray photoelectron spectroscopy (XPS) with and without ion milling, and wet and dry physical surface sampling were performed. Destructive and contact analyses took place only after completion of the non-destructive measurements. Spectroscopic ellipsometry was then repeated to determine the extent of contaminant removal by the destructive techniques, providing insight into the nature and extent of polymerization of the contaminant layer.

  8. Assessment of skin wound healing with a multi-aperture camera

    Science.gov (United States)

    Nabili, Marjan; Libin, Alex; Kim, Loan; Groah, Susan; Ramella-Roman, Jessica C.

    2009-02-01

    A clinical trial was conducted at the National Rehabilitation Hospital on 15 individuals to assess whether Rheparan Skin, a bio-engineered component of the extracellular matrix of the skin, is effective at promoting healing of a variety of wounds. Along with standard clinical outcome measures, a spectroscopic camera was used to assess the efficacy of Rheparan Skin. Gauzes soaked with Rheparan Skin were placed on volunteers' wounds for 5 minutes twice weekly for four weeks. Images of the wounds were taken using a multi-spectral camera and a digital camera at baseline and weekly thereafter. Spectral images collected at different wavelengths were combined with optical skin models to quantify parameters of interest such as oxygen saturation (SO2), water content, and melanin concentration. A digital wound measurement system (VERG) was also used to measure the size of the wound. 9 of the 15 measured subjects showed a definitive improvement post treatment in the form of a decrease in wound area. 7 of these 9 individuals also showed an increase in oxygen saturation in the ulcerated area during the trial. A similar trend was seen in other metrics. Spectral imaging of skin wounds can be a valuable tool to establish wound-healing trends and to clarify healing mechanisms.
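
    Oxygen saturation from a multispectral image is commonly estimated by least-squares unmixing of oxy- and deoxyhemoglobin contributions in a (modified) Beer-Lambert model. The extinction values below are placeholders, not published coefficients, and the wavelengths are hypothetical:

```python
import numpy as np

# Placeholder extinction spectra at four hypothetical wavelengths;
# real values come from published HbO2/Hb extinction tables.
EPS_HBO2 = np.array([0.29, 0.32, 1.06, 1.20])
EPS_HB   = np.array([1.55, 0.78, 0.76, 0.43])

def oxygen_saturation(absorbance):
    """Least-squares fit of a modified Beer-Lambert model
    A(lambda) = eps_HbO2 * c_HbO2 + eps_Hb * c_Hb per pixel,
    returning SO2 = c_HbO2 / (c_HbO2 + c_Hb)."""
    E = np.column_stack([EPS_HBO2, EPS_HB])
    (c_hbo2, c_hb), *_ = np.linalg.lstsq(E, absorbance, rcond=None)
    return c_hbo2 / (c_hbo2 + c_hb)

# Synthetic check: a mixture that is 70% oxygenated
a = 0.7 * EPS_HBO2 + 0.3 * EPS_HB
so2 = oxygen_saturation(a)
```

    Applied pixel-by-pixel over the wound region, this yields the SO2 maps whose week-to-week changes the trial used as a healing metric.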

  9. VUV testing of science cameras at MSFC: QE measurement of the CLASP flight cameras

    Science.gov (United States)

    Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, B.; Beabout, D.; Stewart, M.

    2015-08-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras were built and tested for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint MSFC, National Astronomical Observatory of Japan (NAOJ), Instituto de Astrofisica de Canarias (IAC) and Institut D'Astrophysique Spatiale (IAS) sounding rocket mission. The CLASP camera design includes a frame-transfer e2v CCD57-10 512 × 512 detector, dual channel analog readout and an internally mounted cold block. At the flight CCD temperature of -20 C, the CLASP cameras exceeded the low-noise performance requirements, demonstrating the suitability of this design for UV, EUV and soft X-ray science cameras at MSFC.

  10. Neutron cameras for ITER

    International Nuclear Information System (INIS)

    Johnson, L.C.; Barnes, C.W.; Batistoni, P.

    1998-01-01

    Neutron cameras with horizontal and vertical views have been designed for ITER, based on systems used on JET and TFTR. The cameras consist of fan-shaped arrays of collimated flight tubes, with suitably chosen detectors situated outside the biological shield. The sight lines view the ITER plasma through slots in the shield blanket and penetrate the vacuum vessel, cryostat, and biological shield through stainless steel windows. This paper analyzes the expected performance of several neutron camera arrangements for ITER. In addition to the reference designs, the authors examine proposed compact cameras, in which neutron fluxes are inferred from ¹⁶N decay gammas in dedicated flowing water loops, and conventional cameras with fewer sight lines and more limited fields of view than in the reference designs. It is shown that the spatial sampling provided by the reference designs is sufficient to satisfy target measurement requirements and that some reduction in field of view may be permissible. The accuracy of measurements with ¹⁶N-based compact cameras is not yet established, and they fail to satisfy requirements for parameter range and time resolution by large margins.
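
    Recovering a radial emissivity profile from the chord-integrated signals of such a fan-shaped camera can be sketched as a linear inversion. The geometry matrix below is a toy example, not an ITER sight-line layout:

```python
import numpy as np

# Toy geometry matrix: L[i, j] = path length (m) of sight line i through
# radial shell j of the plasma; values are illustrative only.
L = np.array([[0.40, 0.00, 0.00],
              [0.25, 0.35, 0.00],
              [0.15, 0.20, 0.30],
              [0.10, 0.15, 0.45]])

true_emissivity = np.array([3.0, 2.0, 1.0])    # neutrons/m^3/s (scaled)
line_integrals = L @ true_emissivity           # what the collimators measure

# Recover the radial emissivity profile by least squares
emissivity, *_ = np.linalg.lstsq(L, line_integrals, rcond=None)
```

    The number and placement of sight lines determine how well-conditioned L is, which is why the paper's question of how many sight lines can be dropped is equivalent to asking how much the inversion can be degraded while still meeting the measurement requirements.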

  11. The optical design concept of SPICA-SAFARI

    Science.gov (United States)

    Jellema, Willem; Kruizinga, Bob; Visser, Huib; van den Dool, Teun; Pastor Santos, Carmen; Torres Redondo, Josefina; Eggens, Martin; Ferlet, Marc; Swinyard, Bruce; Dohlen, Kjetil; Griffin, Doug; Gonzalez Fernandez, Luis Miguel; Belenguer, Tomas; Matsuhara, Hideo; Kawada, Mitsunobu; Doi, Yasuo

    2012-09-01

    The Safari instrument on the Japanese SPICA mission is a zodiacal background limited imaging spectrometer offering a photometric imaging mode (R ≈ 2) and low (R = 100) and medium (R = 2000 at 100 μm) spectral resolution spectroscopy modes in three photometric bands covering the 34-210 μm wavelength range. The instrument utilizes Nyquist sampled filled arrays of very sensitive TES detectors providing a 2’x2’ instantaneous field of view. The all-reflective optical system of Safari is highly modular and consists of an input optics module containing the entrance shutter, a calibration source and a pair of filter wheels, followed by an interferometer and finally the camera bay optics accommodating the focal-plane arrays. The optical design is largely driven and constrained by volume, inviting a compact three-dimensional arrangement of the interferometer and camera bay optics without compromising the optical performance requirements associated with a diffraction- and background-limited spectroscopic imaging instrument. Central to the optics we present a flexible and compact non-polarizing Mach-Zehnder interferometer layout, with dual input and output ports, employing a novel FTS scan mechanism based on magnetic bearings and a linear motor. In this paper we discuss the conceptual design of the focal-plane optics and describe how we implement the optical instrument functions, define the photometric bands, deal with straylight control, diffraction and thermal emission in the long-wavelength limit, and interface to the large-format FPA arrays at one end and the SPICA telescope assembly at the other end.

  12. Automatic inference of geometric camera parameters and intercamera topology in uncalibrated disjoint surveillance cameras

    NARCIS (Netherlands)

    Hollander, R.J.M. den; Bouma, H.; Baan, J.; Eendebak, P.T.; Rest, J.H.C. van

    2015-01-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many

  13. Laser-based terahertz-field-driven streak camera for the temporal characterization of ultrashort processes

    Energy Technology Data Exchange (ETDEWEB)

    Schuette, Bernd

    2011-09-15

    In this work, a novel laser-based terahertz-field-driven streak camera is presented. It allows for a pulse length characterization of femtosecond (fs) extreme ultraviolet (XUV) pulses by a cross-correlation with terahertz (THz) pulses generated with a Ti:sapphire laser. The XUV pulses are emitted by a source of high-order harmonic generation (HHG) in which an intense near-infrared (NIR) fs laser pulse is focused into a gaseous medium. The design and characterization of a high-intensity THz source needed for the streak camera is also part of this thesis. The source is based on optical rectification of the same NIR laser pulse in a lithium niobate crystal. For this purpose, the pulse front of the NIR beam is tilted via a diffraction grating to achieve velocity matching between the NIR and THz beams within the crystal. For the temporal characterization of the XUV pulses, both HHG and THz beams are focused onto a gas target. The harmonic radiation creates photoelectron wavepackets which are then accelerated by the THz field depending on its phase at the time of ionization. This principle is adopted from a conventional streak camera and is now widely used in attosecond metrology. The streak camera presented here is an advancement of a terahertz-field-driven streak camera implemented at the Free Electron Laser in Hamburg (FLASH). The advantages of the laser-based streak camera lie in its compactness, cost efficiency and accessibility, while providing the same good quality of measurements as obtained at FLASH. In addition, its flexibility allows for a systematic investigation of streaked Auger spectra, which is presented in this thesis. With its fs time resolution, the terahertz-field-driven streak camera thereby bridges the gap between attosecond and conventional streak cameras. (orig.)
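
    The pulse-length retrieval behind such a streaking measurement can be sketched as a quadrature deconvolution of the streaked photoelectron line width, valid in the linear region of the THz vector potential. The numbers below are illustrative, not values from the thesis:

```python
import math

def xuv_duration_fs(streaked_width_eV, unstreaked_width_eV, streak_eV_per_fs):
    """XUV pulse duration (fs) from the broadening of the streaked
    photoelectron line:  sigma_t = sqrt(sigma_s^2 - sigma_0^2) / s,
    where s is the streaking speed (energy shift per unit time)."""
    broadening = streaked_width_eV ** 2 - unstreaked_width_eV ** 2
    return math.sqrt(max(broadening, 0.0)) / streak_eV_per_fs

# Example: 5 eV streaked vs 3 eV field-free width at 0.5 eV/fs
tau = xuv_duration_fs(5.0, 3.0, 0.5)
```

    The longer THz half-cycle (picoseconds, versus attoseconds for optical streaking) is what gives this scheme its femtosecond time window between attosecond metrology and conventional streak cameras.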


  15. Development of Single Optical Sensor Method for the Measurement Droplet Parameters

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Ho; Ahn, Tae Hwan; Yun, Byong Jo [Pusan National University, Busan (Korea, Republic of); Bae, Byoung Uhn; Kim, Kyoung Doo [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this study, we developed a single-tip optical fiber probe (S-TOP) sensor method to measure droplet parameters such as diameter, droplet fraction, and droplet velocity. To calibrate and validate the optical fiber sensor for these parameters, we conducted visualization experiments using a high-speed camera together with the optical sensor. To evaluate the performance of the S-TOP accurately, we repeated calibration experiments at a given droplet flow condition. Figure 3 shows the result of the calibration: in this graph, the x axis is the droplet velocity measured by visualization and the y axis is grd, D, which is obtained from the S-TOP. In summary, we have developed a single-tip optical probe sensor to measure droplet parameters. From the calibration experiments with the high-speed camera, we obtained the calibration curve for the droplet velocity. Additionally, the chord-length distribution of the droplets is measured by the optical probe.
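The velocity calibration described above amounts to fitting a curve between the camera-measured droplet velocity and the probe signal gradient, then inverting it. A minimal numpy sketch; the variable names and all numbers are illustrative assumptions, not the paper's data:

```python
import numpy as np

# Hypothetical calibration pairs: droplet velocity from the high-speed camera
# (m/s) versus the probe signal gradient grd,D from the S-TOP.
v_camera = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
grd_d = np.array([0.21, 0.39, 0.62, 0.78, 1.01])

# Least-squares linear calibration curve: grd,D = a*v + b.
a, b = np.polyfit(v_camera, grd_d, 1)

def velocity_from_gradient(g):
    """Invert the calibration curve to estimate droplet velocity from grd,D."""
    return (g - b) / a

print(velocity_from_gradient(0.50))
```

In practice the fitted curve would be applied to gradients measured by the probe alone, with the camera used only during calibration.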

  16. Development of Single Optical Sensor Method for the Measurement of Droplet Parameters

    International Nuclear Information System (INIS)

    Kim, Tae Ho; Ahn, Tae Hwan; Yun, Byong Jo; Bae, Byoung Uhn; Kim, Kyoung Doo

    2016-01-01

    In this study, we developed a single-tip optical fiber probe (S-TOP) sensor method to measure droplet parameters such as diameter, droplet fraction, and droplet velocity. To calibrate and validate the optical fiber sensor for these parameters, we conducted visualization experiments using a high-speed camera together with the optical sensor. To evaluate the performance of the S-TOP accurately, we repeated calibration experiments at a given droplet flow condition. Figure 3 shows the result of the calibration: in this graph, the x axis is the droplet velocity measured by visualization and the y axis is grd, D, which is obtained from the S-TOP. In summary, we have developed a single-tip optical probe sensor to measure droplet parameters. From the calibration experiments with the high-speed camera, we obtained the calibration curve for the droplet velocity. Additionally, the chord-length distribution of the droplets is measured by the optical probe.

  17. Evaluation of mobile phone camera benchmarking using objective camera speed and image quality metrics

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2014-11-01

    When a mobile phone camera is tested and benchmarked, the significance of image quality metrics is widely acknowledged. There are also existing methods to evaluate camera speed. However, speed or rapidity metrics of the mobile phone's camera system have not been combined with the quality metrics, even though camera speed has become an increasingly important performance feature. This work comprises several tasks. First, the most important image quality and speed-related metrics of a mobile phone's camera system are collected from standards and papers, and novel speed metrics are identified. Second, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are made against the application programming interfaces of different operating systems. Finally, the results are evaluated and conclusions are drawn. The paper defines a solution for combining different image quality and speed metrics into a single benchmarking score. A proposal for the combined benchmarking metric is evaluated using measurements of 25 mobile phone cameras on the market. The paper is a continuation of previous benchmarking work, expanded with visual noise measurement and updated with the latest mobile phone versions.
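Combining normalized quality and speed metrics into a single weighted score can be sketched as follows; the metric names, ranges and weights below are hypothetical illustrations, not the scheme actually proposed in the paper:

```python
def normalize(value, worst, best):
    """Map a raw metric onto [0, 1], where 1 is best.

    Works whether larger-is-better (best > worst) or smaller-is-better
    (best < worst, e.g. shutter lag in seconds)."""
    score = (value - worst) / (best - worst)
    return min(max(score, 0.0), 1.0)

def combined_score(metrics, weights):
    """Weighted sum of normalized quality and speed metrics."""
    total_w = sum(weights.values())
    return sum(weights[k] * metrics[k] for k in metrics) / total_w

# Hypothetical measurements for one phone.
phone = {
    "visual_noise": normalize(2.1, worst=10.0, best=0.0),  # lower is better
    "mtf50":        normalize(0.31, worst=0.0, best=0.5),  # higher is better
    "shutter_lag":  normalize(0.18, worst=1.0, best=0.0),  # seconds, lower is better
}
weights = {"visual_noise": 0.4, "mtf50": 0.4, "shutter_lag": 0.2}
print(round(combined_score(phone, weights), 3))
```

The choice of worst/best anchors and weights is exactly what such a benchmarking proposal has to justify and validate against real devices.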

  18. Instantaneous phase-shifting Fizeau interferometry with high-speed pixelated phase-mask camera

    Science.gov (United States)

    Yatagai, Toyohiko; Jackin, Boaz Jessie; Ono, Akira; Kiyohara, Kosuke; Noguchi, Masato; Yoshii, Minoru; Kiyohara, Motosuke; Niwa, Hayato; Ikuo, Kazuyuki; Onuma, Takashi

    2015-08-01

    A Fizeau interferometer with instantaneous phase-shifting ability using a Wollaston prism is designed. To measure dynamic phase changes of objects, a high-speed video camera with a shutter speed of 10^-5 s is used with a pixelated phase mask of 1024 × 1024 elements. The light source is a laser of wavelength 532 nm, which is split into orthogonal polarization states by passing through a Wollaston prism. By adjusting the tilt of the reference surface it is possible to make the reference and object beams, with orthogonal polarization states, coincide and interfere. The pixelated phase-mask camera then calculates the phase changes and hence the optical path length difference. Vibration of speakers and turbulence of air flow were successfully measured at 7,000 frames/s.
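A pixelated phase-mask sensor of this kind samples the interferogram at four polarization-induced phase shifts within each super-pixel, from which the wrapped phase follows by the standard four-bucket formula. A small numpy sketch with synthetic intensities (this is the generic formula, not necessarily the camera's exact processing):

```python
import numpy as np

def four_bucket_phase(i0, i90, i180, i270):
    """Wrapped phase from four intensity samples shifted by 0, 90, 180, 270 deg."""
    return np.arctan2(i270 - i90, i0 - i180)

# Synthetic check: intensities I_k = A + B*cos(phi + delta_k) for a known phase.
phi = 0.7
shifts = np.array([0.0, np.pi / 2, np.pi, 3 * np.pi / 2])
intensities = 1.0 + 0.5 * np.cos(phi + shifts)
print(four_bucket_phase(*intensities))  # recovers phi = 0.7
```

Applied per 2x2 super-pixel, the same arithmetic yields a full wrapped-phase map from a single camera frame, which is what enables single-shot measurement of vibrating or turbulent objects.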

  19. [Computer optical topography: a study of the repeatability of the results of human body model examination].

    Science.gov (United States)

    Sarnadskiĭ, V N

    2007-01-01

    The problem of repeatability of the results of examination of a plastic human body model is considered. The model was examined in 7 positions using an optical topograph for kyphosis diagnosis. The examination was performed under television camera monitoring. It was shown that variation of the model position in the camera view affected the repeatability of the results of topographic examination, especially if the model-to-camera distance was changed. A study of the repeatability of the results of optical topographic examination can help to increase the reliability of the topographic method, which is widely used for medical screening of children and adolescents.

  20. IMAGE CAPTURE WITH SYNCHRONIZED MULTIPLE-CAMERAS FOR EXTRACTION OF ACCURATE GEOMETRIES

    Directory of Open Access Journals (Sweden)

    M. Koehl

    2016-06-01

    This paper presents a project of recording and modelling tunnels, traffic circles and roads from multiple sensors. The aim is the representation and accurate 3D modelling of a selection of road infrastructures as dense point clouds in order to extract profiles and metrics from them. These models will be used for the sizing of infrastructures in order to simulate routes for exceptional convoy trucks. The objective is to extract directly from the point clouds the heights, widths and lengths of bridges and tunnels and the diameter of gyratories, and to highlight potential obstacles for a convoy. Light, mobile and fast acquisition approaches based on images and videos from a set of synchronized sensors have been tested in order to obtain usable point clouds. The presented solution is based on a combination of multiple low-cost cameras mounted on an on-board device allowing dynamic captures. An experimental device containing GoPro Hero4 cameras has been set up and used for tests in static and mobile acquisitions. Various configurations using multiple synchronized cameras have been tested, and these configurations are discussed in order to highlight the best operational configuration according to the shape of the acquired objects. As the precise calibration of each sensor and its optics is a major factor in the creation of accurate dense point clouds, and in order to reach the best quality available from such cameras, the internal parameters of the cameras' fisheye lenses have been estimated. Reference measurements were also made using a 3D TLS (Faro Focus 3D) to allow the accuracy assessment.

  1. Image Capture with Synchronized Multiple-Cameras for Extraction of Accurate Geometries

    Science.gov (United States)

    Koehl, M.; Delacourt, T.; Boutry, C.

    2016-06-01

    This paper presents a project of recording and modelling tunnels, traffic circles and roads from multiple sensors. The aim is the representation and accurate 3D modelling of a selection of road infrastructures as dense point clouds in order to extract profiles and metrics from them. These models will be used for the sizing of infrastructures in order to simulate routes for exceptional convoy trucks. The objective is to extract directly from the point clouds the heights, widths and lengths of bridges and tunnels and the diameter of gyratories, and to highlight potential obstacles for a convoy. Light, mobile and fast acquisition approaches based on images and videos from a set of synchronized sensors have been tested in order to obtain usable point clouds. The presented solution is based on a combination of multiple low-cost cameras mounted on an on-board device allowing dynamic captures. An experimental device containing GoPro Hero4 cameras has been set up and used for tests in static and mobile acquisitions. Various configurations using multiple synchronized cameras have been tested, and these configurations are discussed in order to highlight the best operational configuration according to the shape of the acquired objects. As the precise calibration of each sensor and its optics is a major factor in the creation of accurate dense point clouds, and in order to reach the best quality available from such cameras, the internal parameters of the cameras' fisheye lenses have been estimated. Reference measurements were also made using a 3D TLS (Faro Focus 3D) to allow the accuracy assessment.
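Estimating the internal parameters of a fisheye lens presupposes a projection model to fit. A minimal equidistant fisheye model with a polynomial distortion term is sketched below; the focal length, principal point and distortion coefficients are hypothetical values, not a GoPro Hero4 calibration:

```python
import numpy as np

def project_fisheye(X, Y, Z, f=440.0, cx=960.0, cy=540.0, k1=-0.01, k2=0.001):
    """Project a 3-D point with an equidistant fisheye model plus a polynomial
    distortion term: r = f * theta_d, theta_d = theta*(1 + k1*theta^2 + k2*theta^4).
    All parameter values here are illustrative assumptions."""
    theta = np.arctan2(np.hypot(X, Y), Z)              # angle off the optical axis
    theta_d = theta * (1 + k1 * theta**2 + k2 * theta**4)
    phi = np.arctan2(Y, X)                             # azimuth in the image plane
    return cx + f * theta_d * np.cos(phi), cy + f * theta_d * np.sin(phi)

u, v = project_fisheye(0.5, 0.2, 2.0)
print(u, v)
```

Calibration then amounts to choosing f, cx, cy and the k coefficients that minimize the reprojection error of known target points observed in many images.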

  2. Evaluation of the geometric stability and the accuracy potential of digital cameras — Comparing mechanical stabilisation versus parameterisation

    Science.gov (United States)

    Rieke-Zapp, D.; Tecklenburg, W.; Peipe, J.; Hastedt, H.; Haig, Claudia

    Recent tests on the geometric stability of several digital cameras that were not designed for photogrammetric applications have shown that the accomplished accuracies in object space are either limited or that the accuracy potential is not exploited to the fullest extent. A total of 72 calibrations were calculated with four different software products for eleven digital camera models with different hardware setups, some with mechanical fixation of one or more parts. The calibration procedure was chosen in accordance with a German guideline for the evaluation of optical 3D measuring systems [VDI/VDE, VDI/VDE 2634 Part 1, 2002. Optical 3D Measuring Systems-Imaging Systems with Point-by-point Probing. Beuth Verlag, Berlin]. All images were taken with ringflashes, which is considered a standard method for close-range photogrammetry. In cases where the flash was mounted to the lens, the force exerted on the lens tube and the camera mount greatly reduced the accomplished accuracy. Mounting the ringflash to the camera instead resulted in a large improvement of accuracy in object space. For standard calibration, the best accuracies in object space were accomplished with a Canon EOS 5D and a 35 mm Canon lens whose focusing tube was fixed with epoxy (47 μm maximum absolute length measurement error in object space). The fixation of the Canon lens was fairly easy and inexpensive, resulting in a sevenfold increase in accuracy compared with the same lens type without modification. A similar accuracy was accomplished with a Nikon D3 when mounting the ringflash to the camera instead of the lens (52 μm maximum absolute length measurement error in object space). Parameterisation of geometric instabilities by introduction of an image-variant interior orientation in the calibration process improved results for most cameras. In this case, a modified Alpa 12 WA yielded the best results (29 μm maximum absolute length measurement error in object space). Extending the parameter model with Fi

  3. Generalized free-space diffuse photon transport model based on the influence analysis of a camera lens diaphragm.

    Science.gov (United States)

    Chen, Xueli; Gao, Xinbo; Qu, Xiaochao; Chen, Duofang; Ma, Xiaopeng; Liang, Jimin; Tian, Jie

    2010-10-10

    The camera lens diaphragm is an important component in a noncontact optical imaging system and has a crucial influence on the images registered on the CCD camera. However, this influence has not been taken into account in existing free-space photon transport models. To model the photon transport process more accurately, a generalized free-space photon transport model is proposed. It combines Lambertian source theory with an analysis of the influence of the camera lens diaphragm to simulate the photon transport process in free space. In addition, the radiance theorem is adopted to establish the energy relationship between the virtual detector and the CCD camera. The accuracy and feasibility of the proposed model are validated with a Monte-Carlo-based free-space photon transport model and a physical phantom experiment. A comparison study with our previous hybrid radiosity-radiance-theorem-based model demonstrates the improved performance and potential of the proposed model for simulating the photon transport process in free space.

  4. An Efficient Pipeline Wavefront Phase Recovery for the CAFADIS Camera for Extremely Large Telescopes

    Directory of Open Access Journals (Sweden)

    Eduardo Magdaleno

    2009-12-01

    In this paper we show a fast, specialized hardware implementation of the wavefront phase recovery algorithm using the CAFADIS camera. The CAFADIS camera is a new plenoptic sensor patented by the Universidad de La Laguna (Canary Islands, Spain): international patent PCT/ES2007/000046 (WIPO publication number WO/2007/082975). It can simultaneously measure the wavefront phase and the distance to the light source in a real-time process. The pipeline algorithm is implemented using Field Programmable Gate Arrays (FPGAs). These devices present an architecture capable of handling the sensor output stream using a massively parallel approach, and they are efficient enough to resolve several Adaptive Optics (AO) problems in Extremely Large Telescopes (ELTs) in terms of processing-time requirements. The FPGA implementation of the wavefront phase recovery algorithm using the CAFADIS camera is based on the very fast computation of two-dimensional fast Fourier transforms (FFTs). We have therefore carried out a comparison between our novel FPGA 2D-FFT and other implementations.
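FFT-based wavefront recovery of this general kind reduces to a least-squares integration of measured slope maps in the Fourier domain, which is why fast 2D FFTs dominate the processing budget. A numpy sketch of such a reconstructor on a periodic grid (an illustration of the general approach, not the patented CAFADIS algorithm):

```python
import numpy as np

def reconstruct_wavefront(sx, sy):
    """Least-squares integration of x/y slope maps on a periodic grid via FFTs."""
    n, m = sx.shape
    kx = 2j * np.pi * np.fft.fftfreq(m)[None, :]   # spectral d/dx
    ky = 2j * np.pi * np.fft.fftfreq(n)[:, None]   # spectral d/dy
    denom = (kx * np.conj(kx) + ky * np.conj(ky)).real
    denom[0, 0] = 1.0                              # piston term is undetermined
    num = np.conj(kx) * np.fft.fft2(sx) + np.conj(ky) * np.fft.fft2(sy)
    phi = np.fft.ifft2(num / denom).real
    return phi - phi.mean()

# Synthetic check: a periodic wavefront and its exact spectral slopes.
n = 32
y, x = np.mgrid[0:n, 0:n]
phi_true = np.sin(2 * np.pi * x / n) + np.cos(2 * np.pi * y / n)
kx = 2j * np.pi * np.fft.fftfreq(n)[None, :]
ky = 2j * np.pi * np.fft.fftfreq(n)[:, None]
sx = np.fft.ifft2(kx * np.fft.fft2(phi_true)).real
sy = np.fft.ifft2(ky * np.fft.fft2(phi_true)).real
phi_rec = reconstruct_wavefront(sx, sy)
print(np.max(np.abs(phi_rec - phi_true)))  # numerically zero
```

Because the whole pipeline is a fixed sequence of FFTs and element-wise multiplies, it maps naturally onto a streaming FPGA datapath.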

  5. Optic Disc and Optic Cup Segmentation Methodologies for Glaucoma Image Detection: A Survey

    Directory of Open Access Journals (Sweden)

    Ahmed Almazroa

    2015-01-01

    Glaucoma is the second leading cause of loss of vision in the world. Examining the head of the optic nerve (the cup-to-disc ratio) is very important for diagnosing glaucoma and for patient monitoring after diagnosis. Images of the optic disc and optic cup are acquired by fundus cameras as well as by optical coherence tomography. Optic disc and optic cup segmentation techniques are used to isolate the relevant parts of the retinal image and to calculate the cup-to-disc ratio. The main objective of this paper is to review segmentation methodologies and techniques for the disc and cup boundaries, which are utilized to calculate the disc and cup geometrical parameters automatically and accurately, helping glaucoma professionals obtain a wide view of and more details about the optic nerve head structure using retinal fundus images. We provide a brief description of each technique, highlighting its classification and performance metrics. Current and future research directions are summarized and discussed.

  6. Optic Disc and Optic Cup Segmentation Methodologies for Glaucoma Image Detection: A Survey

    Science.gov (United States)

    Almazroa, Ahmed; Burman, Ritambhar; Raahemifar, Kaamran; Lakshminarayanan, Vasudevan

    2015-01-01

    Glaucoma is the second leading cause of loss of vision in the world. Examining the head of the optic nerve (the cup-to-disc ratio) is very important for diagnosing glaucoma and for patient monitoring after diagnosis. Images of the optic disc and optic cup are acquired by fundus cameras as well as by optical coherence tomography. Optic disc and optic cup segmentation techniques are used to isolate the relevant parts of the retinal image and to calculate the cup-to-disc ratio. The main objective of this paper is to review segmentation methodologies and techniques for the disc and cup boundaries, which are utilized to calculate the disc and cup geometrical parameters automatically and accurately, helping glaucoma professionals obtain a wide view of and more details about the optic nerve head structure using retinal fundus images. We provide a brief description of each technique, highlighting its classification and performance metrics. Current and future research directions are summarized and discussed. PMID:26688751
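Once disc and cup masks are available from any of the surveyed segmentation methods, the cup-to-disc ratio itself is a simple geometric computation. A sketch of the vertical CDR from binary masks, using toy masks rather than real fundus data:

```python
import numpy as np

def vertical_cdr(disc_mask, cup_mask):
    """Vertical cup-to-disc ratio from binary segmentation masks."""
    disc_rows = np.flatnonzero(disc_mask.any(axis=1))
    cup_rows = np.flatnonzero(cup_mask.any(axis=1))
    disc_height = disc_rows[-1] - disc_rows[0] + 1
    cup_height = cup_rows[-1] - cup_rows[0] + 1
    return cup_height / disc_height

# Toy masks standing in for segmenter output (not real fundus data).
disc = np.zeros((100, 100), dtype=bool)
disc[20:80, 30:90] = True                  # disc spans 60 rows
cup = np.zeros((100, 100), dtype=bool)
cup[35:65, 45:75] = True                   # cup spans 30 rows
print(vertical_cdr(disc, cup))  # 0.5
```

The hard part, and the focus of the survey, is producing reliable disc and cup masks in the first place; the ratio step is trivial by comparison.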

  7. Improving Situational Awareness in camera surveillance by combining top-view maps with camera images

    NARCIS (Netherlands)

    Kooi, F.L.; Zeeders, R.

    2009-01-01

    The goal of the experiment described is to improve today's camera surveillance in public spaces. Three designs with the camera images combined on a top-view map were compared to each other and to the current situation in camera surveillance. The goal was to test which design makes spatial

  8. Towed Optical Assessment Device (TOAD) Data to Support Benthic Habitat Mapping since 2001

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Optical validation data were collected using a Towed Optical Assessment Device (TOAD), an underwater sled equipped with an underwater digital video camera and...

  9. TV-acquired optical diagnostics systems on ATA

    International Nuclear Information System (INIS)

    Kalibjian, R.; Chong, Y.P.; Cornish, J.P.; Jackson, C.H.; Fessenden, T.J.

    1984-06-01

    The purpose of this paper is to report on optical system developments on the ATA and their applications to ATA beam characterization. Television (TV)-acquired optical diagnostics data provide spatial and temporal properties of the ATA beam that complement recorded information from other types of sensors, such as beam-wall current monitors, x-ray probes, and rf probes. The ATA beam operates (1) in the normal mode at 50 MeV and 10 kA at a 1 Hz rate, and (2) in the 1 kHz burst mode (for 10 pulses) at a 0.5 Hz rate. The beam has a 70 ns pulse width in vacuum propagation; however, beam-head erosion occurs in atmospheric propagation, limiting the pulse width to less than 50 ns. Various optical systems are used for ATA diagnostics. Optical imaging provides a convenient single-pulse measurement of the 2-dimensional profile of the beam intensity; it can also provide multiple 2-D frames in a single pulse. In some studies it may be desirable to study optical events with temporal resolution below 100 ps using 1-dimensional streak cameras. Spatially integrated data from phototube cameras can also be used for background measurements as well as for single-pixel monitoring. The optical line-of-sight (LOS) configurations have been made versatile to accommodate a large number of options for the various optical systems

  10. Control Design and Digital Implementation of a Fast 2-Degree-of-Freedom Translational Optical Image Stabilizer for Image Sensors in Mobile Camera Phones.

    Science.gov (United States)

    Wang, Jeremy H-S; Qiu, Kang-Fu; Chao, Paul C-P

    2017-10-13

    This study presents the design, digital implementation and performance validation of a lead-lag controller for a 2-degree-of-freedom (DOF) translational optical image stabilizer (OIS) installed with a digital image sensor in mobile camera phones. OIS is an important feature of modern commercial mobile camera phones; it aims to mechanically reduce the image blur caused by hand shaking while shooting photos. The OIS developed in this study is able to move the imaging lens by actuating its voice coil motors (VCMs) at the required speed to the position that significantly compensates for imaging blur caused by hand shaking. The proposed compensation is made possible by first establishing the exact, nonlinear equations of motion (EOMs) for the OIS, and then designing a simple lead-lag controller based on the established nonlinear EOMs for simple digital computation via a field-programmable gate array (FPGA) board in order to achieve fast response. Finally, experimental validation is conducted to show the favorable performance of the designed OIS: it is able to stabilize the lens holder to the desired position within 0.02 s, much less than previously reported times of around 0.1 s. Also, the resulting residual vibration is less than 2.2-2.5 μm, which is commensurate with the very small pixel size found in most commercial image sensors, thus significantly minimizing image blur caused by hand shaking.
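A lead-lag compensator of the kind described is typically discretized for FPGA or microcontroller execution with the Tustin (bilinear) transform, giving a first-order difference equation. A sketch with hypothetical gains (K, z, p and the sample time are illustrative, not the paper's identified values):

```python
class LeadLag:
    """Discrete lead-lag compensator C(s) = K*(s + z)/(s + p) via the Tustin
    (bilinear) transform. Gains and sample time are illustrative assumptions."""

    def __init__(self, K, z, p, dt):
        a = 2.0 / dt                       # Tustin: s -> a*(1 - q^-1)/(1 + q^-1)
        self.b0 = K * (a + z) / (a + p)
        self.b1 = K * (z - a) / (a + p)
        self.a1 = (p - a) / (a + p)
        self.u1 = self.y1 = 0.0            # previous input/output samples

    def step(self, u):
        """One sample period: y[n] = b0*u[n] + b1*u[n-1] - a1*y[n-1]."""
        y = self.b0 * u + self.b1 * self.u1 - self.a1 * self.y1
        self.u1, self.y1 = u, y
        return y

# Phase lead (z < p) sampled at 1 kHz; DC gain is K*z/p = 0.1.
ctrl = LeadLag(K=1.0, z=50.0, p=500.0, dt=1e-3)
print(ctrl.step(1.0))
```

The same multiply-accumulate structure maps directly onto fixed-point FPGA logic, which is what enables the fast update rates reported.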

  11. VUV Testing of Science Cameras at MSFC: QE Measurement of the CLASP Flight Cameras

    Science.gov (United States)

    Champey, Patrick R.; Kobayashi, Ken; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, B.; Beabout, D.; Stewart, M.

    2015-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras were built and tested for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The CLASP camera design includes a frame-transfer e2v CCD57-10 512x512 detector, dual-channel analog readout electronics and an internally mounted cold block. At the flight operating temperature of -20 C, the CLASP cameras achieved the low-noise performance requirements (≤25 e- read noise and ≤10 e-/sec/pix dark current), in addition to maintaining a stable gain of approximately 2.0 e-/DN. The e2v CCD57-10 detectors were coated with Lumogen-E to improve quantum efficiency (QE) at the Lyman-α wavelength. A vacuum ultraviolet (VUV) monochromator and a NIST-calibrated photodiode were employed to measure the QE of each camera. Four flight-like cameras were tested in a high-vacuum chamber, which was configured to run several tests intended to verify the QE, gain, read noise, dark current and residual non-linearity of the CCD. We present and discuss the QE measurements performed on the CLASP cameras. We also discuss the high-vacuum system outfitted for testing of UV and EUV science cameras at MSFC.

  12. Advanced CCD camera developments

    Energy Technology Data Exchange (ETDEWEB)

    Condor, A. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, briefly describing the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  13. On-Line High Dose-Rate Gamma Ray Irradiation Test of the CCD/CMOS Cameras

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Wan; Jeong, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    In this paper, test results of gamma ray irradiation of CCD/CMOS cameras are described. From the CAMS (containment atmospheric monitoring system) data of the Fukushima Dai-ichi nuclear power plant station, we found that the gamma ray dose rate when the hydrogen explosions occurred in reactors 1~3 was about 160 Gy/h. If it is assumed that an emergency response robot for the management of a severe accident at a nuclear power plant is sent into the reactor area to grasp the situation inside the reactor building and to take precautionary measures against the release of radioactive materials, the CCD/CMOS cameras loaded on the robot serve as the eyes of the emergency response robot. In the case of the Japanese Quince robot system, which was sent to investigate the situation on the unit 2 reactor building refueling floor, 7 CCD/CMOS cameras were used. Two CCD cameras of the Quince robot are used for forward and backward monitoring of the surroundings during navigation, and 2 CCD (or CMOS) cameras are used for monitoring the status of the front-end and back-end motion mechanics such as flippers and crawlers. A CCD camera with wide-field-of-view optics is used for monitoring the status of the communication (VDSL) cable reel, and another 2 CCD cameras are assigned to reading the indication values of the radiation dosimeter and the instruments. Under the preceding assumptions, a major problem which arises when dealing with CCD/CMOS cameras in severe accident situations at a nuclear power plant is the presence of high dose-rate gamma irradiation fields. Even in DBA (design basis accident) situations at a nuclear power plant, in order to use a CCD/CMOS camera as an ad-hoc monitoring unit in the vicinity of high-radioactivity structures and components of the nuclear reactor area, the robust survivability of the camera in such intense gamma-radiation fields should therefore be verified.
The CCD/CMOS cameras of various types were gamma irradiated at a

  14. Light-reflection random-target method for measurement of the modulation transfer function of a digital video-camera

    Science.gov (United States)

    Pospisil, J.; Jakubik, P.; Machala, L.

    2005-11-01

    This article reports the suggestion, realization and verification of a newly developed means of measuring the noiseless and locally shift-invariant modulation transfer function (MTF) of a digital video camera in the usual incoherent visible region of optical intensity, in particular of its combined imaging, detection, sampling and digitizing steps, which are influenced by the additive and spatially discrete photodetector, aliasing and quantization noises. The method relates to the still-camera automatic working regime and a static, two-dimensional, spatially continuous light-reflection random target with white-noise properties. The theoretical basis for this random-target method is presented, exploiting a proposed simulation model of the linear optical-intensity response and the possibility of expressing the resultant MTF as a normalized and smoothed ratio of the measurable output and input power spectral densities. The random-target and resultant image data were obtained and processed on a PC with computation programs developed on the basis of MATLAB 6.5. The presented examples and the other results of the performed measurements demonstrate the sufficient repeatability and acceptability of the described method for comparative evaluations of the performance of digital video cameras under various conditions.
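The core of the random-target method, expressing the MTF as a smoothed ratio of output to input power spectral densities, can be sketched in a 1-D toy form; the 5-tap blur below stands in for the camera chain and is an assumption for illustration, not the article's model:

```python
import numpy as np

rng = np.random.default_rng(0)
target = rng.standard_normal(4096)      # 1-D white-noise "target" (known input)

# Hypothetical camera: a 5-tap moving-average blur stands in for the combined
# imaging/detection/sampling chain; the real measurement uses captured images.
kernel = np.ones(5) / 5.0
image = np.convolve(target, kernel, mode="same")

def psd(signal, nseg=16):
    """Bartlett-averaged power spectral density estimate (segment averaging
    provides the smoothing of the PSD ratio)."""
    segments = np.split(signal, nseg)
    return np.mean([np.abs(np.fft.rfft(s)) ** 2 for s in segments], axis=0)

# MTF as the square root of the smoothed output/input PSD ratio.
mtf = np.sqrt(psd(image) / psd(target))
print(mtf[0], mtf[-1])  # near 1 at DC, strongly attenuated near Nyquist
```

Because the target is white noise, every spatial frequency is probed at once, which is what makes a single random-target capture sufficient for an MTF estimate.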

  15. Design and Expected Performance of GISMO-2, a Two Color Millimeter Camera for the IRAM 30 m Telescope

    Science.gov (United States)

    Staguhn, Johannes G.; Benford, Dominic J.; Dwek, Eli; Hilton, Gene; Fixsen, Dale J.; Irwin, Kent; Jhabvala, Christine; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; hide

    2014-01-01

    We present the main design features of the GISMO-2 bolometer camera, which we built for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 and 2 mm atmospheric windows. The 1 mm channel uses a 32 × 40 TES-based backshort-under-grid (BUG) bolometer array; the 2 mm channel operates with a 16 × 16 BUG array. The camera utilizes almost the entire field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  16. Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras

    Science.gov (United States)

    Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.

    2013-12-01

    The objective of this work is the study of faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by, on average, distances of 13 kilometers. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to see a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom high-speed camera, version 9.1, set to operate at a frame rate of 2,500 frames per second with a Nikkor lens (model AF-S DX 18-55 mm 1:3.5-5.6 G in the stationary sensors, and model AF-S ED 24 mm 1:1.4 in the mobile sensor). All videos were GPS (Global Positioning System) time stamped. For this work we used a data set collected on four RAMMER manual operation days in the 2012 and 2013 campaigns. On Feb. 18th the data set comprises 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On Feb. 19th a total of 5 flashes were registered by two cameras and 1 flash by three cameras. On Feb. 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing current. Problems in the temporal measurement of the continuing current can generate imprecision during the optical analysis; therefore this work aims to evaluate the effects of distance on this parameter with this preliminary data set. In the cases that include the color camera we analyzed the RGB

  17. Camera-based microswitch technology to monitor mouth, eyebrow, and eyelid responses of children with profound multiple disabilities

    NARCIS (Netherlands)

    Lancioni, G.E.; Bellini, D.; Oliva, D.; Singh, N.N.; O'Reilly, M.F.; Sigafoos, J.; Lang, R.B.; Didden, H.C.M.

    2011-01-01

    A camera-based microswitch technology was recently used to successfully monitor small eyelid and mouth responses of two adults with profound multiple disabilities (Lancioni et al., Res Dev Disab 31:1509-1514, 2010a). This technology, in contrast with the traditional optic microswitches used for

  18. Solid state video cameras

    CERN Document Server

    Cristol, Y

    2013-01-01

    Solid State Video Cameras reviews the state of the art in the field of solid-state television cameras as compiled from the patent literature. Organized into 10 chapters, the book begins with the basic array types of solid-state imagers and appropriate read-out circuits and methods. Documents relating to improvement of picture quality, such as spurious-signal suppression, uniformity correction, or resolution enhancement, are also cited. The last part considers solid-state color cameras.

  19. A Robust Method for Ego-Motion Estimation in Urban Environment Using Stereo Camera.

    Science.gov (United States)

    Ci, Wenyan; Huang, Yingping

    2016-10-17

    Visual odometry estimates the ego-motion of an agent (e.g., vehicle or robot) using image information and is a key component for autonomous vehicles and robotics. This paper proposes a robust and precise method for estimating the 6-DoF ego-motion using a stereo rig with optical flow analysis. An objective function fitted with a set of feature points is created by establishing the mathematical relationship between optical flow, depth and camera ego-motion parameters through the camera's 3-dimensional motion and planar imaging model. Accordingly, the six motion parameters are computed by minimizing the objective function using the iterative Levenberg-Marquardt method. One of the key points for visual odometry is that the feature points selected for the computation should contain as many inliers as possible. In this work, the feature points and their optical flows are initially detected using the Kanade-Lucas-Tomasi (KLT) algorithm. Circle matching is then applied to remove the outliers caused by mismatches of the KLT algorithm. A space-position constraint is imposed to filter out moving points from the point set detected by the KLT algorithm. The Random Sample Consensus (RANSAC) algorithm is employed to further refine the feature point set, i.e., to eliminate the effects of outliers. The remaining points are tracked to estimate the ego-motion parameters in the subsequent frames. The approach presented here is tested on real traffic videos and the results prove the robustness and precision of the method.
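The RANSAC refinement step can be illustrated with a toy translation-only flow model: sample minimal hypotheses, count inliers, and refit on the consensus set. The synthetic flow field and thresholds below are assumptions for illustration, not the paper's 6-DoF model:

```python
import numpy as np

def ransac_translation(flow, iters=200, tol=0.5, seed=0):
    """Fit a pure-translation flow model by RANSAC and return it with the
    inlier mask. A toy stand-in for the paper's outlier-rejection stage."""
    rng = np.random.default_rng(seed)
    best_model = None
    best_inliers = np.zeros(len(flow), dtype=bool)
    for _ in range(iters):
        sample = flow[rng.integers(len(flow))]           # minimal sample: 1 vector
        inliers = np.linalg.norm(flow - sample, axis=1) < tol
        if inliers.sum() > best_inliers.sum():
            best_model = flow[inliers].mean(axis=0)      # refit on consensus set
            best_inliers = inliers
    return best_model, best_inliers

# Synthetic flow field: 90 background vectors near (1.0, 0.2) plus 10 outliers
# from an independently moving object.
rng = np.random.default_rng(1)
flow = np.vstack([
    np.array([1.0, 0.2]) + 0.05 * rng.standard_normal((90, 2)),
    np.array([-3.0, 4.0]) + 0.05 * rng.standard_normal((10, 2)),
])
model, inliers = ransac_translation(flow)
print(model, int(inliers.sum()))
```

In the actual pipeline the hypotheses are ego-motion parameter sets rather than single vectors, but the sample/score/refit structure is the same.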

  20. Quantitative optical trapping and optical manipulation of micro-sized objects

    Directory of Open Access Journals (Sweden)

    Rania Sayed

    2017-10-01

    An optical tweezers technique is used for ultraprecise micromanipulation, measuring positions of micrometer-scale objects with a precision down to the nanometer scale. It consists of a high-performance research microscope with a motorized scanning stage and a sensitive position-detection system. Up to 10 traps can be used quasi-simultaneously. Photodamage-free optical trapping of Escherichia coli (E. coli) bacteria cells 2 µm in length, as an example of motile bacteria, is shown in this paper. Efficient optical trapping and rotation of polystyrene latex particles 3 µm in diameter have also been studied, as an optical handle for the pick-and-place of other tiny objects. A fast galvo scanner is used to produce multiple optical traps for manipulation of micro-sized objects, and the optical forces on the trapped objects are quantified in the ray-optics regime, since the diameter of the trapped particles is larger than the wavelength of the trapping laser light. The force constant (k) is determined in real time from the positional time series of the trapped object, which is monitored by a CCD camera through a personal computer.
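
    One common way to determine the force constant from a positional time series is the equipartition theorem, k = k_B·T / ⟨x²⟩. The sketch below assumes this method, with synthetic data and an illustrative stiffness value; the paper does not state its exact calibration procedure:

```python
import numpy as np

kB = 1.380649e-23  # Boltzmann constant, J/K
T = 298.0          # absolute temperature, K

# Synthetic positional time series of a trapped bead: in thermal
# equilibrium the positions are Gaussian with variance kB*T/k.
k_true = 1e-6  # trap stiffness, N/m (illustrative value)
rng = np.random.default_rng(1)
x = rng.normal(0.0, np.sqrt(kB * T / k_true), 200_000)  # metres

# Equipartition estimate: k = kB*T / <x^2>
k_est = kB * T / np.var(x)
```

    The estimate converges as 1/sqrt(N), so a few hundred thousand camera frames give sub-percent precision, provided drift is removed first.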

  1. Multiple Sensor Camera for Enhanced Video Capturing

    Science.gov (United States)

    Nagahara, Hajime; Kanki, Yoshinori; Iwai, Yoshio; Yachida, Masahiko

    The resolution of cameras has improved drastically in response to the demand for high-quality digital images. For example, a digital still camera has several megapixels. Although a video camera has a higher frame rate, its resolution is lower than that of a still camera. Thus, high resolution and high frame rate are incompatible in ordinary cameras on the market. It is difficult to solve this problem with a single sensor, since it stems from the physical limitation of the pixel transfer rate. In this paper, we propose a multi-sensor camera for capturing resolution- and frame-rate-enhanced video. A common multi-CCD camera, such as a 3CCD color camera, uses identical CCDs to capture different spectral information. Our approach is to use sensors of different spatio-temporal resolution in a single camera cabinet to capture higher-resolution and higher-frame-rate information separately. We built a prototype camera which can capture high-resolution (2588×1958 pixels, 3.75 fps) and high-frame-rate (500×500 pixels, 90 fps) videos. We also propose a calibration method for the camera. As one application of the camera, we demonstrate an enhanced video (2128×1952 pixels, 90 fps) generated from the captured videos, showing the utility of the camera.

  2. Using DSLR cameras in digital holography

    Science.gov (United States)

    Hincapié-Zuluaga, Diego; Herrera-Ramírez, Jorge; García-Sucerquia, Jorge

    2017-08-01

    In Digital Holography (DH), the size of the two-dimensional image sensor used to record the digital hologram plays a key role in the performance of this imaging technique; the larger the camera sensor, the better the quality of the final reconstructed image. Scientific cameras with large formats are offered on the market, but their cost and availability limit their use as a first option when implementing DH. Nowadays, DSLR cameras provide an easy-access alternative that is worthwhile to explore. DSLR cameras are a widely available commercial option that, in comparison with traditional scientific cameras, offers a much lower cost per effective pixel over a large sensing area. However, in DSLR cameras, with their RGB pixel distribution, the sampling of information differs from the sampling in the monochrome cameras usually employed in DH. This fact has implications for their performance. In this work, we discuss why DSLR cameras are not extensively used for DH, taking into account the problem of object replication reported by different authors. Simulations of DH using monochromatic and DSLR cameras are presented, and a theoretical derivation of the replication problem using Fourier theory is also shown. Experimental results of a DH implementation using a DSLR camera exhibit the replication problem.
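
    The replication effect can be reproduced in one dimension: zeroing every other sample of a fringe pattern, a crude stand-in for extracting a single colour plane from a Bayer mosaic, creates a spectral replica shifted by half the sampling frequency. This is a toy illustration, not the authors' simulation:

```python
import numpy as np

N = 256
n = np.arange(N)
f0 = 20
signal = np.cos(2 * np.pi * f0 * n / N)  # monochrome "hologram" fringe

# Keep only every other sample, zeroing the rest -- a 1D stand-in for
# extracting a single colour plane from a Bayer-mosaicked sensor.
sampled = signal.copy()
sampled[1::2] = 0.0

spec_full = np.abs(np.fft.fft(signal))
spec_samp = np.abs(np.fft.fft(sampled))
# The subsampled spectrum carries extra peaks at N/2 - f0 and N/2 + f0:
# these replicas become duplicated objects in the reconstructed hologram.
```

    In the reconstruction plane, those shifted spectral copies appear as the replicated objects reported in the literature.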

  3. A METHOD FOR SELF-CALIBRATION IN SATELLITE WITH HIGH PRECISION OF SPACE LINEAR ARRAY CAMERA

    Directory of Open Access Journals (Sweden)

    W. Liu

    2016-06-01

    At present, the on-orbit calibration of the geometric parameters of a space surveying camera is usually processed with data from a ground calibration field after capturing the images. The entire process is complicated and lengthy and cannot monitor and calibrate the geometric parameters in real time. On the basis of a large number of on-orbit calibrations, we found that owing to the influence of many factors, e.g., weather, it is often difficult to capture images of the ground calibration field; thus, regular calibration using field data cannot be ensured. This article proposes a real-time self-calibration method for a space linear array camera on a satellite based on the optical auto-collimation principle. A collimating light source and small matrix-array CCD devices are installed inside the load system of the satellite; these use the same light path as the linear array camera. We extract the location changes of the cross marks on the matrix-array CCD to determine the real-time variations in the focal length and angle parameters of the linear array camera. The on-orbit status of the camera is rapidly obtained using this method. On one hand, the camera's variation pattern can be tracked accurately and its attitude adjusted in a timely manner to ensure optimal photography; on the other hand, self-calibration of the camera aboard the satellite can be realized quickly, which improves the efficiency and reliability of photogrammetric processing.
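
    The core measurement, converting a shift of the cross-mark centroid on the matrix CCD into an angular change, can be sketched as follows; the pixel pitch and collimator focal length are assumed values for illustration only:

```python
import numpy as np

def centroid(img):
    """Intensity-weighted centroid of a spot image (row, col in pixels)."""
    total = img.sum()
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total

def gaussian_spot(shape, center, sigma=2.0):
    """Synthetic cross-mark spot, modelled here as a Gaussian blob."""
    rows, cols = np.indices(shape)
    return np.exp(-((rows - center[0])**2 + (cols - center[1])**2)
                  / (2 * sigma**2))

pixel_pitch = 7e-6   # m, assumed detector pixel size
focal_len   = 0.5    # m, assumed collimator focal length

before = gaussian_spot((64, 64), (32.0, 30.0))
after  = gaussian_spot((64, 64), (32.0, 33.5))

dx_pixels = centroid(after)[1] - centroid(before)[1]
# Small-angle conversion: pointing change = image shift / focal length
dtheta_rad = dx_pixels * pixel_pitch / focal_len
```

    Tracking this angle frame-by-frame is what lets the on-board system flag focal-length and pointing drift without waiting for a ground-field pass.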

  4. Neutral-beam performance analysis using a CCD camera

    International Nuclear Information System (INIS)

    Hill, D.N.; Allen, S.L.; Pincosy, P.A.

    1986-01-01

    We have developed an optical diagnostic system suitable for characterizing the performance of energetic neutral beams. An absolutely calibrated CCD video camera is used to view the neutral beam as it passes through a relatively high-pressure (10^-5 Torr) region outside the neutralizer: collisional excitation of the fast deuterium atoms produces Hα emission (λ = 6561 Å) that is proportional to the local atomic current density, independent of the species mix of accelerated ions over the energy range 5 to 20 keV. Digital processing of the video signal provides profile and aiming information for beam optimization. 6 refs., 3 figs

  5. Development of the Earth Observation Camera of MIRIS

    Directory of Open Access Journals (Sweden)

    Dae-Hee Lee

    2011-09-01

    We have designed and manufactured the Earth observation camera (EOC) of the multi-purpose infrared imaging system (MIRIS). MIRIS is a main payload of STSAT-3, which will be launched in late 2012. The main objective of the EOC is to test the operation of Korean IR technology in space, so we designed the optical and mechanical system of the EOC to fit the IR detector system. We have assembled the flight model (FM) of the EOC and performed environmental tests successfully. The EOC is now ready to be integrated into the satellite system, awaiting operation in space as planned.

  6. Selecting a digital camera for telemedicine.

    Science.gov (United States)

    Patricoski, Chris; Ferguson, A Stewart

    2009-06-01

    The digital camera is an essential component of store-and-forward telemedicine (electronic consultation). There are numerous makes and models of digital cameras on the market, and selecting a suitable consumer-grade camera can be complicated. Evaluation of digital cameras includes investigating the features and analyzing image quality. Important features include the camera settings, ease of use, macro capabilities, method of image transfer, and power recharging. Consideration needs to be given to image quality, especially as it relates to color (skin tones) and detail. It is important to know the level of the photographer and the intended application. The goal is to match the characteristics of the camera with the telemedicine program requirements. In the end, selecting a digital camera is a combination of qualitative (subjective) and quantitative (objective) analysis. For the telemedicine program in Alaska in 2008, the camera evaluation and decision process resulted in a specific selection based on the criteria developed for our environment.

  7. Automatic locking radioisotope camera lock

    International Nuclear Information System (INIS)

    Rosauer, P.J.

    1978-01-01

    The lock of the present invention secures the isotope source in a stored, shielded condition in the camera until a positive effort has been made to open the lock and take the source out of the camera, and it prevents disconnection of the source pigtail unless the source is locked in a shielded condition in the camera. It also gives a visual indication of the locked or possibly exposed condition of the isotope source and prevents the source pigtail from being completely pushed out of the camera, even when the lock is released. (author)

  8. Search for GRB related prompt optical emission and other fast varying objects with ``Pi of the Sky'' detector

    Science.gov (United States)

    Ćwiok, M.; Dominik, W.; Małek, K.; Mankiewicz, L.; Mrowca-Ciułacz, J.; Nawrocki, K.; Piotrowski, L. W.; Sitek, P.; Sokołowski, M.; Wrochna, G.; Żarnecki, A. F.

    2007-06-01

    The “Pi of the Sky” experiment is designed to search for prompt optical emission from GRB sources. 32 CCD cameras covering 2 steradians will monitor the sky continuously. The data will be analysed on-line in search of optical flashes. The prototype with 2 cameras, operating at Las Campanas (Chile) since 2004, has recognised several outbursts of flaring stars and has set limits for a few GRBs.

  9. A charged-particle manipulator utilizing a co-axial tube electrodynamic trap with an integrated camera

    International Nuclear Information System (INIS)

    Jiang, L; Pau, S; Whitten, W B

    2011-01-01

    A charged-particle manipulator was designed and fabricated with an integrated imaging camera allowing real-time in-situ monitoring of trapped-particle motion even when the trap device is in motion or rotating. The trap device was made of two co-axial electrically conductive tubes with diameters of 5.5 mm (inner tube) and 7 mm (outer tube); the imaging camera, with its optical fiber bundle, was integrated within the tubular trap device to realize a single instrument functioning as a manipulator. The motion of suspended microparticles of 3 μm to 50 μm in diameter can be monitored using the integrated camera regardless of the trap device's orientation. The manipulator provides the capability for controlled manipulation of trapped particles by tuning the operating conditions while monitoring real-time particle motion as feedback. Imaging of suspended particles was not interrupted while the manipulator was translated and/or rotated. This integrated manipulator can be used for charged-particle transport and repositioning.

  10. Acceptance/Operational Test Report for Tank 241-AN-104 camera and camera purge control system

    International Nuclear Information System (INIS)

    Castleberry, J.L.

    1995-11-01

    This Acceptance/Operational Test Procedure (ATP/OTP) will document the satisfactory operation of the camera purge panel, purge control panel, color camera system, and associated control components destined for installation. The final acceptance of the complete system will be performed in the field. The purge panel and purge control panel will be tested to verify the safety interlock, which shuts down the camera and pan-and-tilt unit inside the tank vapor space upon loss of purge pressure, and to verify that the correct purge volume exchanges are performed as required by NFPA 496. This procedure is separated into seven sections. This Acceptance/Operational Test Report documents the successful acceptance and operability testing of the 241-AN-104 camera system and camera purge control system.

  11. Establishment of Imaging Spectroscopy of Nuclear Gamma-Rays based on Geometrical Optics.

    Science.gov (United States)

    Tanimori, Toru; Mizumura, Yoshitaka; Takada, Atsushi; Miyamoto, Shohei; Takemura, Taito; Kishimoto, Tetsuro; Komura, Shotaro; Kubo, Hidetoshi; Kurosawa, Shunsuke; Matsuoka, Yoshihiro; Miuchi, Kentaro; Mizumoto, Tetsuya; Nakamasu, Yuma; Nakamura, Kiseki; Parker, Joseph D; Sawano, Tatsuya; Sonoda, Shinya; Tomono, Dai; Yoshikawa, Kei

    2017-02-03

    Since the discovery of nuclear gamma-rays, their imaging has been limited to pseudo-imaging techniques such as the Compton Camera (CC) and the coded mask. Pseudo imaging does not preserve physical information (intensity, or brightness in optics) along a ray, and is thus capable of no more than qualitative imaging of bright objects. To attain quantitative imaging, a camera that realizes geometrical optics is essential, which, for nuclear MeV gammas, is possible only via complete reconstruction of the Compton process. Recently we have revealed that the "Electron Tracking Compton Camera" (ETCC) provides a well-defined Point Spread Function (PSF). The information of an incoming gamma is kept along a ray with the PSF, which is equivalent to geometrical optics. Here we present an imaging-spectroscopic measurement with the ETCC. Our results highlight the intrinsic difficulty CCs have in performing accurate imaging, and show that the ETCC surmounts this problem. The imaging capability also helps the ETCC suppress the noise level dramatically, by about 3 orders of magnitude, without a shielding structure. Furthermore, full reconstruction of the Compton process with the ETCC provides spectra free of Compton edges. These results mark the first proper imaging of nuclear gammas based on genuine geometrical optics.

  12. Light-reflection random-target method for measurement of the modulation transfer function of a digital video-camera

    Czech Academy of Sciences Publication Activity Database

    Pospíšil, Jaroslav; Jakubík, P.; Machala, L.

    2005-01-01

    Roč. 116, - (2005), s. 573-585 ISSN 0030-4026 Institutional research plan: CEZ:AV0Z10100522 Keywords: random-target measuring method * light-reflection white-noise target * digital video camera * modulation transfer function * power spectral density Subject RIV: BH - Optics, Masers, Lasers Impact factor: 0.395, year: 2005

  13. Optical engineering at Los Alamos: a history

    International Nuclear Information System (INIS)

    Brixner, B.

    1983-01-01

    Optical engineering at Los Alamos, which began in 1943, has continued because scientific researchers usually want more resolving power than commercially available optical instruments provide. In addition, in-house engineering is often advantageous, when the technology for designing and making improved instrumentation is available locally, because of our remote location and the frequent need for accurate data. As a consequence, a number of improved research cameras and lens systems have been developed locally, especially for explosion and implosion photography, but also for oscilloscope photography. The development of high-speed cameras led to the ultimate in practical high-speed rotating mirrors and to the invention of a rapid, precise, and effective lens-design procedure that has produced more than a hundred lens systems giving improved imaging under special conditions of use. Representative examples of this work are described.

  14. Europe's space camera unmasks a cosmic gamma-ray machine

    Science.gov (United States)

    1996-11-01

    The new-found neutron star is the visible counterpart of a pulsating radio source, Pulsar 1055-52. It is a mere 20 kilometres wide. Although the neutron star is very hot, at about a million degrees C, very little of its radiant energy takes the form of visible light. It emits mainly gamma-rays, an extremely energetic form of radiation. By examining it at visible wavelengths, astronomers hope to figure out why Pulsar 1055-52 is the most efficient generator of gamma-rays known so far, anywhere in the Universe. The Faint Object Camera found Pulsar 1055-52 in near-ultraviolet light at 3400 angstroms, a little shorter in wavelength than the violet light at the extremity of the human visual range. Roberto Mignani, Patrizia Caraveo and Giovanni Bignami of the Istituto di Fisica Cosmica in Milan, Italy, report its optical identification in a forthcoming issue of Astrophysical Journal Letters (1 January 1997). The formal name of the object is PSR 1055-52. To find it, astronomers had to evade the glare of an adjacent star: the Italian team had tried since 1988 to spot Pulsar 1055-52 with two of the most powerful ground-based optical telescopes in the Southern Hemisphere, the 3.6-metre Telescope and the 3.5-metre New Technology Telescope of the European Southern Observatory at La Silla, Chile. Unfortunately an ordinary star 100,000 times brighter lay in almost the same direction in the sky, separated from the neutron star by only a thousandth of a degree. The Earth's atmosphere defocused the star's light sufficiently to mask the glimmer from Pulsar 1055-52. The astronomers therefore needed an instrument in space. The Faint Object Camera offered the best precision and sensitivity to continue the hunt. Devised by European astronomers to complement the American wide-field camera in the Hubble Space Telescope, the Faint Object Camera has a relatively narrow field of view. It intensifies the image of a faint object by repeatedly accelerating electrons from photo-electric films, so as to produce

  15. Development of high-speed video cameras

    Science.gov (United States)

    Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk

    2001-04-01

    Presented in this paper is an outline of the R&D activities on high-speed video cameras, which have been under way at Kinki University for more than ten years and are currently proceeding as an international cooperative project with the University of Applied Sciences Osnabruck and other organizations. Extensive market research has been done, (1) on users' requirements for high-speed multi-framing and video cameras, by questionnaires and hearings, and (2) on the current availability of cameras of this sort, by searching journals and websites. Both support the necessity of developing a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors was developed in 1996; the sensor is the same one developed for the previous camera. The frame rate is 50 million fps for triple-framing and 4,500 fps for triple-light-wave framing, including color image capturing. The idea of a video camera of 1 million fps with an ISIS, In-situ Storage Image Sensor, was first proposed in 1993 and has been continuously improved. A test sensor was developed in early 2000 and successfully captured images at 62,500 fps. Currently, the design of a prototype ISIS is under way and, hopefully, it will be fabricated in the near future. Epoch-making cameras in the history of high-speed video camera development by others are also briefly reviewed.

  16. QUANTITATIVE DETECTION OF ENVIRONMENTALLY IMPORTANT DYES USING DIODE LASER/FIBER-OPTIC RAMAN

    Science.gov (United States)

    A compact diode laser/fiber-optic Raman spectrometer is used for quantitative detection of environmentally important dyes. The system is based on diode laser excitation at 782 nm, fiber-optic probe technology, an imaging spectrometer, and a state-of-the-art scientific CCD camera. ...

  17. Geometric and Optic Characterization of a Hemispherical Dome Port for Underwater Photogrammetry

    Directory of Open Access Journals (Sweden)

    Fabio Menna

    2016-01-01

    The popularity of automatic photogrammetric techniques has promoted many experiments in underwater scenarios, leading to quite impressive visual results, even by non-experts. Despite these achievements, a deep understanding of camera and lens behavior as well as the optical phenomena involved in underwater operations is fundamental to better plan field campaigns and anticipate the achievable results. The paper presents a geometric investigation of a consumer-grade underwater camera housing, manufactured by NiMAR and equipped with a 7′′ dome port. After a review of flat and dome ports, the work analyzes, using simulations and real experiments, the main optical phenomena involved when operating a camera underwater. Specific aspects of photogrammetric acquisitions are considered, with tests in the laboratory and in a swimming pool. Results and considerations are shown and commented upon.

  18. Orion Optical Navigation Progress Toward Exploration: Mission 1

    Science.gov (United States)

    Holt, Greg N.; D'Souza, Christopher N.; Saley, David

    2018-01-01

    Optical navigation of human spacecraft was proposed on Gemini and implemented successfully on Apollo as a means of autonomously operating the vehicle in the event of lost communication with controllers on Earth. It shares a history with the "method of lunar distances" that was used in the 18th century and gained some notoriety after its use by Captain James Cook during his 1768 Pacific voyage of the HMS Endeavour. The Orion emergency return system utilizing optical navigation has matured in design over the last several years and is currently undergoing the final implementation and test phase in preparation for Exploration Mission 1 (EM-1) in 2019. The software development is being carried out as a Government Furnished Equipment (GFE) project delivered as an application within the Core Flight Software of the Orion camera controller module. The mathematical formulation behind the initial ellipse fit in the image processing is detailed by Christian. The non-linear least-squares refinement then follows the technique of Mortari, estimating the planetary limb using the sigmoid function. The Orion optical navigation system uses a body-fixed camera, a decision that was driven by mass and mechanism constraints. The general concept of operations involves a 2-hour pass once every 24 hours, with passes specifically placed before all maneuvers to supply accurate navigation information to guidance and targeting. The pass lengths are limited by thermal constraints on the vehicle, since the OpNav attitude generally deviates from the thermally stable tail-to-sun attitude maintained during the rest of the orbit coast phase. Calibration is scheduled prior to every pass due to the unknown nature of thermal effects on the lens distortion and the mounting-platform deformations between the camera and star trackers. The calibration technique is described in detail by Christian, et al. and simultaneously estimates the Brown-Conrady coefficients and the Star Tracker/Camera
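
    As a simplified stand-in for the initial limb fit (the flight software fits an ellipse and refines it with a sigmoid limb model, which is not reproduced here), a linear least-squares circle fit to noisy limb points illustrates the idea:

```python
import numpy as np

def fit_circle(x, y):
    """Kasa least-squares circle fit: linearize x^2 + y^2 =
    2ax + 2by + (r^2 - a^2 - b^2) and solve for center (a, b), radius r."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + a**2 + b**2)
    return a, b, r

# Noisy limb points sampled from a circular body of radius 100 px
# centered at (320, 240); only the lit half of the limb is visible.
rng = np.random.default_rng(2)
theta = np.linspace(0, np.pi, 50)
x = 320 + 100 * np.cos(theta) + rng.normal(0, 0.3, theta.size)
y = 240 + 100 * np.sin(theta) + rng.normal(0, 0.3, theta.size)

cx, cy, r = fit_circle(x, y)
```

    The recovered center and apparent radius give a line-of-sight direction and range to the body, which is the measurement the navigation filter consumes.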

  19. Winter precipitation particle size distribution measurement by Multi-Angle Snowflake Camera

    Science.gov (United States)

    Huang, Gwo-Jong; Kleinkort, Cameron; Bringi, V. N.; Notaroš, Branislav M.

    2017-12-01

    From the radar-meteorology viewpoint, the most important properties for quantitative precipitation estimation of winter events are the 3D shape, size, and mass of precipitation particles, as well as the particle size distribution (PSD). To measure these properties precisely, optical instruments may be the best choice. The Multi-Angle Snowflake Camera (MASC) is a relatively new instrument equipped with three high-resolution cameras that capture images of winter precipitation particles from three non-parallel angles, in addition to measuring the particle fall speed using two pairs of infrared motion sensors. However, the results from the MASC have so far usually been presented monthly or seasonally, with particle sizes given as histograms; no previous studies have used the MASC for a single-storm study, and no researchers have used the MASC to measure the PSD. We propose a methodology for obtaining the winter precipitation PSD measured by the MASC, and present and discuss the development, implementation, and application of the new technique for PSD computation based on MASC images. Overall, this is the first study of the MASC-based PSD. We present PSD experiments and results for segments of two snow events to demonstrate the performance of our PSD algorithm. The results show that the self-consistency of the MASC-measured single-camera PSDs is good. To cross-validate the PSD measurements, we compare the MASC mean PSD (averaged over three cameras) with a collocated 2D Video Disdrometer and observe good agreement between the two sets of results.
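
    A PSD computed from per-image particle sizes reduces to a histogram of maximum dimensions normalized by sampling volume and bin width. This sketch uses hypothetical diameters and an assumed effective sampling volume; the paper's exact normalization is not reproduced here:

```python
import numpy as np

def compute_psd(diameters_mm, bin_edges_mm, sample_volume_m3):
    """Particle size distribution N(D) in m^-3 mm^-1: counts per bin,
    normalized by the sampling volume and the bin width."""
    counts, _ = np.histogram(diameters_mm, bins=bin_edges_mm)
    widths = np.diff(bin_edges_mm)
    return counts / (sample_volume_m3 * widths)

# Hypothetical maximum dimensions (mm) from one interval of MASC images,
# with an assumed effective sampling volume.
diam = np.array([0.6, 0.8, 1.1, 1.3, 1.4, 2.0, 2.2, 3.1, 0.9, 1.7])
edges = np.arange(0.5, 4.0, 0.5)  # 0.5 mm bins
psd = compute_psd(diam, edges, sample_volume_m3=0.05)
```

    Integrating N(D) over D recovers the total number concentration, which is a quick self-consistency check on the normalization.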

  20. Video camera use at nuclear power plants

    International Nuclear Information System (INIS)

    Estabrook, M.L.; Langan, M.O.; Owen, D.E.

    1990-08-01

    A survey of US nuclear power plants was conducted to evaluate video camera use in plant operations and to determine the equipment used and the benefits realized. Basic closed-circuit television (CCTV) camera systems are described and video camera operating principles are reviewed. Plant approaches for implementing video camera use are discussed, as are equipment selection issues such as setting task objectives, radiation effects on cameras, and the use of disposable cameras. Specific plant applications are presented and the video equipment used is described. The benefits of video camera use, mainly reduced radiation exposure and increased productivity, are discussed and quantified. 15 refs., 6 figs

  1. Uncertainties in cloud phase and optical thickness retrievals from the Earth Polychromatic Imaging Camera (EPIC).

    Science.gov (United States)

    Meyer, Kerry; Yang, Yuekui; Platnick, Steven

    2016-01-01

    This paper presents an investigation of the expected uncertainties of a single channel cloud optical thickness (COT) retrieval technique, as well as a simple cloud temperature threshold based thermodynamic phase approach, in support of the Deep Space Climate Observatory (DSCOVR) mission. DSCOVR cloud products will be derived from Earth Polychromatic Imaging Camera (EPIC) observations in the ultraviolet and visible spectra. Since EPIC is not equipped with a spectral channel in the shortwave or mid-wave infrared that is sensitive to cloud effective radius (CER), COT will be inferred from a single visible channel with the assumption of appropriate CER values for liquid and ice phase clouds. One month of Aqua MODIS daytime granules from April 2005 is selected for investigating cloud phase sensitivity, and a subset of these granules that has similar EPIC sun-view geometry is selected for investigating COT uncertainties. EPIC COT retrievals are simulated with the same algorithm as the operational MODIS cloud products (MOD06), except using fixed phase-dependent CER values. Uncertainty estimates are derived by comparing the single channel COT retrievals with the baseline bi-spectral MODIS retrievals. Results show that a single channel COT retrieval is feasible for EPIC. For ice clouds, single channel retrieval errors are minimal (< 2%) due to the particle size insensitivity of the assumed ice crystal (i.e., severely roughened aggregate of hexagonal columns) scattering properties at visible wavelengths, while for liquid clouds the error is mostly limited to within 10%, although for thin clouds (COT < 2) the error can be higher. Potential uncertainties in EPIC cloud masking and cloud temperature retrievals are not considered in this study.
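
    A single-channel COT retrieval with a fixed CER amounts to inverting a monotone reflectance-versus-COT relation for the assumed phase. The lookup table below is a toy analytic stand-in for the output of a radiative-transfer code, not the MOD06 algorithm itself:

```python
import numpy as np

# Hypothetical lookup table: top-of-atmosphere visible reflectance versus
# cloud optical thickness for one fixed effective radius and sun-view
# geometry. (A real table would come from a radiative-transfer model.)
cot_grid = np.array([0.5, 1, 2, 4, 8, 16, 32, 64, 128])
refl_grid = 1.0 - np.exp(-0.15 * cot_grid**0.8)  # monotone toy model

def retrieve_cot(reflectance):
    """Invert the monotone reflectance(COT) relation by interpolation."""
    return np.interp(reflectance, refl_grid, cot_grid)

# Round-trip check: a cloud of COT 10 should be roughly recovered
# from its simulated reflectance (coarse grid -> interpolation error).
refl_obs = 1.0 - np.exp(-0.15 * 10**0.8)
cot_est = retrieve_cot(refl_obs)
```

    The sensitivity study in the abstract then amounts to perturbing the assumed CER and measuring how far `cot_est` drifts from the bi-spectral baseline.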

  2. Human tracking over camera networks: a review

    Science.gov (United States)

    Hou, Li; Wan, Wanggen; Hwang, Jenq-Neng; Muhammad, Rizwan; Yang, Mingyang; Han, Kang

    2017-12-01

    In recent years, automated human tracking over camera networks is getting essential for video surveillance. The tasks of tracking human over camera networks are not only inherently challenging due to changing human appearance, but also have enormous potentials for a wide range of practical applications, ranging from security surveillance to retail and health care. This review paper surveys the most widely used techniques and recent advances for human tracking over camera networks. Two important functional modules for the human tracking over camera networks are addressed, including human tracking within a camera and human tracking across non-overlapping cameras. The core techniques of human tracking within a camera are discussed based on two aspects, i.e., generative trackers and discriminative trackers. The core techniques of human tracking across non-overlapping cameras are then discussed based on the aspects of human re-identification, camera-link model-based tracking and graph model-based tracking. Our survey aims to address existing problems, challenges, and future research directions based on the analyses of the current progress made toward human tracking techniques over camera networks.

  3. Microprocessor-controlled wide-range streak camera

    Science.gov (United States)

    Lewis, Amy E.; Hollabaugh, Craig

    2006-08-01

    Bechtel Nevada/NSTec recently announced deployment of their fifth generation streak camera. This camera incorporates many advanced features beyond those currently available for streak cameras. The arc-resistant driver includes a trigger lockout mechanism, actively monitors input trigger levels, and incorporates a high-voltage fault interrupter for user safety and tube protection. The camera is completely modular and may deflect over a variable full-sweep time of 15 nanoseconds to 500 microseconds. The camera design is compatible with both large- and small-format commercial tubes from several vendors. The embedded microprocessor offers Ethernet connectivity, and XML [extensible markup language]-based configuration management with non-volatile parameter storage using flash-based storage media. The camera's user interface is platform-independent (Microsoft Windows, Unix, Linux, Macintosh OSX) and is accessible using an AJAX [asynchronous Javascript and XML]-equipped modern browser, such as Internet Explorer 6, Firefox, or Safari. User interface operation requires no installation of client software or browser plug-in technology. Automation software can also access the camera configuration and control using HTTP [hypertext transfer protocol]. The software architecture supports multiple-simultaneous clients, multiple cameras, and multiple module access with a standard browser. The entire user interface can be customized.

  4. Microprocessor-controlled, wide-range streak camera

    International Nuclear Information System (INIS)

    Amy E. Lewis; Craig Hollabaugh

    2006-01-01

    Bechtel Nevada/NSTec recently announced deployment of their fifth generation streak camera. This camera incorporates many advanced features beyond those currently available for streak cameras. The arc-resistant driver includes a trigger lockout mechanism, actively monitors input trigger levels, and incorporates a high-voltage fault interrupter for user safety and tube protection. The camera is completely modular and may deflect over a variable full-sweep time of 15 nanoseconds to 500 microseconds. The camera design is compatible with both large- and small-format commercial tubes from several vendors. The embedded microprocessor offers Ethernet connectivity, and XML [extensible markup language]-based configuration management with non-volatile parameter storage using flash-based storage media. The camera's user interface is platform-independent (Microsoft Windows, Unix, Linux, Macintosh OSX) and is accessible using an AJAX [asynchronous Javascript and XML]-equipped modern browser, such as Internet Explorer 6, Firefox, or Safari. User interface operation requires no installation of client software or browser plug-in technology. Automation software can also access the camera configuration and control using HTTP [hypertext transfer protocol]. The software architecture supports multiple-simultaneous clients, multiple cameras, and multiple module access with a standard browser. The entire user interface can be customized.

  5. [Cinematography of ocular fundus with a jointed optical system and tv or cine-camera (author's transl)].

    Science.gov (United States)

    Kampik, A; Rapp, J

    1979-02-01

    A method of cinematography of the ocular fundus is introduced which, by coupling a camera to an indirect ophthalmoscope, allows the monocular picture of the fundus produced by the ophthalmic lens to be recorded.

  6. A surgical navigation system for non-contact diffuse optical tomography and intraoperative cone-beam CT

    Science.gov (United States)

    Daly, Michael J.; Muhanna, Nidal; Chan, Harley; Wilson, Brian C.; Irish, Jonathan C.; Jaffray, David A.

    2014-02-01

    A freehand, non-contact diffuse optical tomography (DOT) system has been developed for multimodal imaging with intraoperative cone-beam CT (CBCT) during minimally-invasive cancer surgery. The DOT system is configured for near-infrared fluorescence imaging with indocyanine green (ICG) using a collimated 780 nm laser diode and a near-infrared CCD camera (PCO Pixelfly USB). Depending on the intended surgical application, the camera is coupled to either a rigid 10 mm diameter endoscope (Karl Storz) or a 25 mm focal length lens (Edmund Optics). A prototype flat-panel CBCT C-Arm (Siemens Healthcare) acquires low-dose 3D images with sub-mm spatial resolution. A 3D mesh is extracted from CBCT for finite-element DOT implementation in NIRFAST (Dartmouth College), with the capability for soft/hard imaging priors (e.g., segmented lymph nodes). A stereoscopic optical camera (NDI Polaris) provides real-time 6D localization of reflective spheres mounted to the laser and camera. Camera calibration combined with tracking data is used to estimate intrinsic (focal length, principal point, non-linear distortion) and extrinsic (translation, rotation) lens parameters. Source/detector boundary data is computed from the tracked laser/camera positions using radiometry models. Target registration errors (TRE) between real and projected boundary points are ~1-2 mm for typical acquisition geometries. Pre-clinical studies using tissue phantoms are presented to characterize 3D imaging performance. This translational research system is under investigation for clinical applications in head-and-neck surgery including oral cavity tumour resection, lymph node mapping, and free-flap perforator assessment.
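
    The target registration error quoted above (~1-2 mm) is, in the usual definition, a root-mean-square distance over matched point pairs. A small sketch of that computation (the point sets and units here are illustrative):

```python
import numpy as np

def target_registration_error(measured, projected):
    """Root-mean-square Euclidean distance between corresponding points.

    measured, projected: (N, 3) arrays of matched boundary points (e.g. mm).
    """
    d = np.linalg.norm(np.asarray(measured, float) - np.asarray(projected, float), axis=1)
    return float(np.sqrt(np.mean(d ** 2)))
```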

  7. SU-E-T-774: Use of a Scintillator-Mirror-Camera System for the Measurement of MLC Leakage Radiation with the CyberKnife M6 System

    Energy Technology Data Exchange (ETDEWEB)

    Goggin, L; Kilby, W; Noll, M; Maurer, C [Accuray Inc, Sunnyvale, CA (United States)

    2015-06-15

    Purpose: A technique using a scintillator-mirror-camera system to measure MLC leakage was developed to provide an efficient alternative to film dosimetry while maintaining high spatial resolution. This work describes the technique together with measurement uncertainties. Methods: Leakage measurements were made for the InCise™ MLC using the Logos XRV-2020A device. For each measurement, approximately 170 leakage and background images were acquired using optimized camera settings. The average background was subtracted from each leakage frame before filtering the integrated leakage image to replace anomalous pixels. Pixel-value-to-dose conversion was performed using a calibration image. Mean leakage was calculated within an ROI corresponding to the primary beam, and maximum leakage was determined by binning the image into overlapping 1 mm x 1 mm ROIs. 48 measurements were performed using 3 cameras and multiple MLC-linac combinations in varying beam orientations, with each compared to film dosimetry. Optical and environmental influences were also investigated. Results: Measurement time with the XRV-2020A was 8 minutes vs. 50 minutes using radiochromic film, and results were available immediately. Camera radiation exposure degraded measurement accuracy. With a relatively undamaged camera, mean leakage agreed with the film measurement to within ≤0.02% of the reference open-field dose in 92% of cases and to within ≤0.03% in 100% of cases (for maximum leakage the corresponding figures were 88% and 96%). The estimated camera lifetime over which this agreement is maintained is at least 150 measurements, and can be monitored using reference field exposures. A dependency on camera temperature was identified and a reduction in sensitivity with distance from image center due to optical distortion was characterized. Conclusion: With periodic monitoring of the degree of camera radiation damage, the XRV-2020A system can be used to measure MLC leakage. This represents a significant time saving when compared to the traditional
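
    The processing chain described in Methods (background subtraction, frame integration, anomalous-pixel filtering, dose calibration, and overlapping-ROI binning) can be sketched roughly as follows; the despiking rule, thresholds, and array shapes are assumptions for illustration, not the authors' implementation:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def leakage_map(frames, backgrounds, cal_factor, clip_sigma=5.0):
    """Background-subtract, integrate, despike, and convert counts to dose.

    frames, backgrounds: (T, H, W) image stacks.
    cal_factor: dose per integrated pixel count, from a calibration image.
    """
    bg = np.mean(backgrounds, axis=0)
    integrated = np.sum(frames - bg, axis=0)
    # Replace anomalous pixels (e.g. radiation strikes) with the global median,
    # using a median-absolute-deviation outlier test (an assumed criterion).
    med = np.median(integrated)
    mad = np.median(np.abs(integrated - med)) + 1e-12
    integrated[np.abs(integrated - med) > clip_sigma * 1.4826 * mad] = med
    return integrated * cal_factor

def max_roi_leakage(dose, k):
    """Maximum mean dose over all overlapping k x k pixel ROIs."""
    windows = sliding_window_view(dose, (k, k))
    return float(windows.mean(axis=(2, 3)).max())
```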

  8. Robot Towed Shortwave Infrared Camera for Specific Surface Area Retrieval of Surface Snow

    Science.gov (United States)

    Elliott, J.; Lines, A.; Ray, L.; Albert, M. R.

    2017-12-01

    Optical grain size and specific surface area (SSA) are key parameters for measuring the atmospheric interactions of snow, as well as for tracking metamorphism and allowing the ground-truthing of remote-sensing data. We describe a device using a shortwave infrared camera with changeable optical bandpass filters (centered at 1300 nm and 1550 nm) that can be used to quickly measure the average SSA over an area of 0.25 m^2. The device and method are compared with calculations made from measurements taken with a field spectral radiometer. The instrument is designed to be towed by a small autonomous ground vehicle, and therefore rides above the snow surface on ultra-high-molecular-weight polyethylene (UHMW) skis.

  9. OPALS: A COTS-based Tech Demo of Optical Communications

    Science.gov (United States)

    Oaida, Bogdan

    2012-01-01

    I. Objective: Deliver video from the ISS to an optical ground terminal via an optical communications link. a) JPL Phaeton/Early Career Hire (ECH) training project. b) Implemented as a Class-D payload. c) Downlink at approx. 30 Mb/s. II. Flight system: a) Optical head: beacon acquisition camera, downlink transmitter, 2-axis gimbal. b) Sealed container: laser, avionics, power distribution, digital I/O board. III. Implementation: a) Ground station: Optical Communications Telescope Laboratory at Table Mountain Facility. b) Flight system mounted to an ISS FRAM as the standard interface, attached externally on the Express Logistics Carrier.

  10. Optical wedge method for spatial reconstruction of particle trajectories

    International Nuclear Information System (INIS)

    Asatiani, T.L.; Alchudzhyan, S.V.; Gazaryan, K.A.; Zograbyan, D.Sh.; Kozliner, L.I.; Krishchyan, V.M.; Martirosyan, G.S.; Ter-Antonyan, S.V.

    1978-01-01

    A technique of optical wedges allowing the full spatial reconstruction of pictures of events is considered. The technique is used for the detection of particle tracks in optical wide-gap spark chambers by photographing in one projection. The optical wedges are refracting right-angle plastic prisms positioned between the camera and the spark chamber so that both ends of the track are photographed through them. A method for calibrating the measurements is given, and an estimate is made of the accuracy of the determination of the second projection with the help of the optical wedges.
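
    The wedge principle rests on simple prism geometry: a thin prism of apex angle A and refractive index n deviates rays by approximately (n - 1)A, so the depth of a track point can be inferred from the lateral shift of its image. A first-order sketch of that geometry (the paper's actual calibration is empirical, and the values below are illustrative):

```python
import math

def thin_prism_deviation_rad(n, apex_angle_deg):
    """First-order deviation of a ray by a thin prism: delta ~ (n - 1) * A."""
    return (n - 1.0) * math.radians(apex_angle_deg)

def depth_from_shift(shift_mm, n, apex_angle_deg):
    """Distance of a track point behind the wedge, inferred from the lateral
    displacement of its image: shift = depth * tan(delta)."""
    return shift_mm / math.tan(thin_prism_deviation_rad(n, apex_angle_deg))
```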

  11. Absolute calibration method for fast-streaked, fiber optic light collection, spectroscopy systems

    International Nuclear Information System (INIS)

    Johnston, Mark D.; Frogget, Brent; Oliver, Bryan Velten; Maron, Yitzhak; Droemer, Darryl W.; Crain, Marlon D.

    2010-01-01

    This report outlines a convenient method to calibrate fast (<1 ns resolution) streaked, fiber-optic light collection, spectroscopy systems. Such a system is used to collect spectral data on plasmas generated in the A-K gap of electron-beam diodes fielded on the RITS-6 accelerator (8-12 MV, 140-200 kA). On RITS, light is collected through a small-diameter (200 micron) optical fiber and recorded on a fast streak camera at the output of a 1 meter Czerny-Turner monochromator (F/7 optics). To calibrate such a system, it is necessary to efficiently couple light from a spectral lamp into a 200 micron diameter fiber, split it into its spectral components with 10 Angstroms or less resolution, and record it on a streak camera with 1 ns or less temporal resolution.

  12. An Optical and Terahertz Instrumentation System at the FAST LINAC at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Thurman-Keup, R. [Fermilab; Lumpkin, A. H. [Fermilab; Thangaraj, J. [Fermilab

    2017-08-01

    FAST is a facility at Fermilab that consists of a photoinjector, two superconducting capture cavities, one superconducting ILC-style cryomodule, and a small ring, called IOTA, for studying non-linear, integrable beam optics. This paper discusses the layout of the optical transport system that provides optical radiation to an externally located streak camera for bunch-length measurements, and THz radiation to a Martin-Puplett interferometer, also for bunch-length measurements. It accepts radiation from two synchrotron-radiation ports in a chicane bunch compressor and a diffraction/transition radiation screen downstream of the compressor. It also has the potential to access the signal from a transition-radiation screen or YAG screen after the spectrometer magnet for measurements of energy-time correlations. Initial results from both the streak camera and the Martin-Puplett interferometer will be presented.

  13. Towards Adaptive Virtual Camera Control In Computer Games

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2011-01-01

    Automatic camera control aims to define a framework to control virtual camera movements in dynamic and unpredictable virtual environments while ensuring a set of desired visual properties. We investigate the relationship between camera placement and playing behaviour in games and build a user model of the camera behaviour that can be used to control camera movements based on player preferences. For this purpose, we collect eye gaze, camera and game-play data from subjects playing a 3D platform game, we cluster gaze and camera information to identify camera behaviour profiles and we employ … camera control in games is discussed.
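
    The clustering step, grouping players into camera behaviour profiles from gaze and camera statistics, could use any standard algorithm; a minimal k-means sketch over hypothetical per-player feature vectors (the feature choice and two-cluster setup are assumptions, not the paper's method):

```python
import numpy as np

def kmeans(features, k, iters=50, seed=0):
    """Minimal k-means over per-player feature vectors (e.g. mean gaze
    offset and mean camera distance). Returns labels and centroids."""
    rng = np.random.default_rng(seed)
    X = np.asarray(features, dtype=float)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each sample to its nearest centroid
        labels = np.argmin(((X[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
        # move each centroid to the mean of its assigned samples
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids
```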

  14. Imaging spectroscopy using embedded diffractive optical arrays

    Science.gov (United States)

    Hinnrichs, Michele; Hinnrichs, Bradford

    2017-09-01

    Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera based on diffractive optic arrays. This approach to hyperspectral imaging has been demonstrated in all three infrared bands: SWIR, MWIR and LWIR. The hyperspectral optical system has been integrated into the cold-shield of the sensor, enabling the small size and weight of this infrared hyperspectral sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics that are made up of an area array of diffractive optical elements, where each element is tuned to image a different spectral region on a common focal plane array. The lenslet array is embedded in the cold-shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper will present our optical-mechanical design approach, which results in an infrared hyperspectral imaging system that is small enough for a payload on a small satellite, mini-UAV, commercial quadcopter or man-portable platform. We also present an application in which this spectral imaging technology is used to quantify the mass and volume flow rates of hydrocarbon gases. The diffractive optical elements used in the lenslet array are blazed gratings, where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold-shield to reduce the background signal normally associated with the optics. The detector array is divided into sub-images covered by each lenslet. We have developed various systems using a different number of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the number of different spectral images collected simultaneously in each frame of the camera.
A 2 x 2 lenslet array will image

  15. Low-cost far infrared bolometer camera for automotive use

    Science.gov (United States)

    Vieider, Christian; Wissmar, Stanley; Ericsson, Per; Halldin, Urban; Niklaus, Frank; Stemme, Göran; Källhammer, Jan-Erik; Pettersson, Håkan; Eriksson, Dick; Jakobsen, Henrik; Kvisterøy, Terje; Franks, John; VanNylen, Jan; Vercammen, Hans; VanHulsel, Annick

    2007-04-01

    A new low-cost long-wavelength infrared bolometer camera system is under development. It is designed for use with an automatic vision algorithm system as a sensor to detect vulnerable road users in traffic. Looking 15 m ahead of the vehicle, it can, in the case of an unavoidable impact, activate a brake-assist system or other deployable protection system. To achieve our cost target below €100 for the sensor system, we evaluate the required performance and can reduce the sensitivity to 150 mK and the pixel resolution to 80 x 30. We address all the main cost drivers, such as sensor size and production yield, along with vacuum packaging, optical components and large-volume manufacturing technologies. The detector array is based on a new type of high-performance thermistor material. Very thin Si/SiGe single-crystal multi-layers are grown epitaxially. Due to the resulting valence barriers, a high temperature coefficient of resistance is achieved (3.3%/K). Simultaneously, the high-quality crystalline material provides very low 1/f-noise characteristics and uniform material properties. The thermistor material is transferred from the original substrate wafer to the read-out circuit using adhesive wafer bonding and subsequent thinning. Bolometer arrays can then be fabricated using industry-standard MEMS processes and materials. The inherently good detector performance allows us to relax the vacuum requirement, and we can implement wafer-level vacuum packaging technology used in established automotive sensor fabrication. The optical design is reduced to a single-lens camera. We are developing a low-cost molding process using a novel chalcogenide glass (GASIR®3) and integrate anti-reflective and anti-erosion properties using a diamond-like carbon coating.

  16. Using the OOI Cabled Array HD Camera to Explore Geophysical and Oceanographic Problems at Axial Seamount

    Science.gov (United States)

    Crone, T. J.; Knuth, F.; Marburg, A.

    2016-12-01

    A broad array of Earth science problems can be investigated using high-definition video imagery from the seafloor, ranging from those that are geological and geophysical in nature, to those that are biological and water-column related. A high-definition video camera was installed as part of the Ocean Observatory Initiative's core instrument suite on the Cabled Array, a real-time fiber optic data and power system that stretches from the Oregon Coast to Axial Seamount on the Juan de Fuca Ridge. This camera runs a 14-minute pan-tilt-zoom routine 8 times per day, focusing on locations of scientific interest on and near the Mushroom vent in the ASHES hydrothermal field inside the Axial caldera. The system produces 13 GB of lossless HD video every 3 hours, and at the time of this writing it has generated 2100 recordings totaling 28.5 TB since it began streaming data into the OOI archive in August of 2015. Because of the large size of this dataset, downloading the entirety of the video for long timescale investigations is not practical. We are developing a set of user-side tools for downloading single frames and frame ranges from the OOI HD camera raw data archive to aid users interested in using these data for their research. We use these tools to download about one year's worth of partial frame sets to investigate several questions regarding the hydrothermal system at ASHES, including the variability of bacterial "floc" in the water-column, and changes in high temperature fluid fluxes using optical flow techniques. We show that while these user-side tools can facilitate rudimentary scientific investigations using the HD camera data, a server-side computing environment that allows users to explore this dataset without downloading any raw video will be required for more advanced investigations to flourish.

  17. Improving the off-axis spatial resolution and dynamic range of the NIF X-ray streak cameras (invited)

    Energy Technology Data Exchange (ETDEWEB)

    MacPhee, A. G., E-mail: macphee2@llnl.gov; Hatch, B. W.; Bell, P. M.; Bradley, D. K.; Datte, P. S.; Landen, O. L.; Palmer, N. E.; Piston, K. W.; Rekow, V. V. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, California 94551-0808 (United States); Dymoke-Bradshaw, A. K. L.; Hares, J. D. [Kentech Instruments Ltd., Isis Building, Howbery Park, Wallingford, Oxfordshire OX10 8BD (United Kingdom); Hassett, J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, California 94551-0808 (United States); Department of Electrical and Computer Engineering, University of Rochester, Rochester, New York 14627 (United States); Meadowcroft, A. L. [AWE Aldermaston, Reading, Berkshire RG7 4PR (United Kingdom); Hilsabeck, T. J.; Kilkenny, J. D. [General Atomics, P.O. Box 85608, San Diego, California 92186-5608 (United States)

    2016-11-15

    We report simulations and experiments that demonstrate an increase in spatial resolution of the NIF core diagnostic x-ray streak cameras by at least a factor of two, especially off axis. A design was achieved by using a corrector electron optic to flatten the field curvature at the detector plane and corroborated by measurement. In addition, particle in cell simulations were performed to identify the regions in the streak camera that contribute the most to space charge blurring. These simulations provide a tool for convolving synthetic pre-shot spectra with the instrument function so signal levels can be set to maximize dynamic range for the relevant part of the streak record.

  18. Improving the off-axis spatial resolution and dynamic range of the NIF X-ray streak cameras (invited).

    Science.gov (United States)

    MacPhee, A G; Dymoke-Bradshaw, A K L; Hares, J D; Hassett, J; Hatch, B W; Meadowcroft, A L; Bell, P M; Bradley, D K; Datte, P S; Landen, O L; Palmer, N E; Piston, K W; Rekow, V V; Hilsabeck, T J; Kilkenny, J D

    2016-11-01

    We report simulations and experiments that demonstrate an increase in spatial resolution of the NIF core diagnostic x-ray streak cameras by at least a factor of two, especially off axis. A design was achieved by using a corrector electron optic to flatten the field curvature at the detector plane and corroborated by measurement. In addition, particle in cell simulations were performed to identify the regions in the streak camera that contribute the most to space charge blurring. These simulations provide a tool for convolving synthetic pre-shot spectra with the instrument function so signal levels can be set to maximize dynamic range for the relevant part of the streak record.

  19. Automatic camera tracking for remote manipulators

    International Nuclear Information System (INIS)

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-07-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2-deg deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables
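
    The kinematic chain described above, expressing a target point in the camera frame via 4 x 4 homogeneous transforms and solving for the PAN and TILT angles, can be sketched as follows. The axis convention and the pure bang-bang command (the paper combines linear and on/off control) are simplifying assumptions:

```python
import numpy as np

def target_in_camera_frame(T_world_cam, p_world):
    """Express a world point in the camera frame using the 4 x 4 homogeneous
    pose matrix of the camera in world coordinates."""
    p = np.append(np.asarray(p_world, dtype=float), 1.0)
    return (np.linalg.inv(T_world_cam) @ p)[:3]

def pan_tilt_deg(p_cam):
    """Pan/tilt angles centering a camera-frame point
    (x right, y up, z along the optical axis -- an assumed convention)."""
    x, y, z = p_cam
    pan = np.degrees(np.arctan2(x, z))
    tilt = np.degrees(np.arctan2(y, np.hypot(x, z)))
    return pan, tilt

def deadband_command(error_deg, deadband=2.0):
    """On/off (bang-bang) drive with a +/-2-deg deadband: no motion inside
    the band, which avoids continuous camera hunting."""
    if abs(error_deg) <= deadband:
        return 0
    return 1 if error_deg > 0 else -1
```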

  20. Automatic camera tracking for remote manipulators

    International Nuclear Information System (INIS)

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-04-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables

  1. Streak electronic camera with slow-scanning storage tube used in the field of high-speed cineradiography

    International Nuclear Information System (INIS)

    Marilleau, J.; Bonnet, L.; Garcin, G.; Guix, R.; Loichot, R.

    The cineradiographic machine designed for measurements in the field of detonics consists of a linear accelerator associated with a braking target, a scintillator and a remote-controlled electronic camera. The quantum factor of X-ray detection and the energetic efficiency of the scintillator are given. The electronic camera is built around a deflection-converter tube (RCA C. 73 435 AJ) coupled by optical fibres to a photosensitive storage tube (TH-CSF Esicon) used in a slow-scanning process with electronic recording of the information. The different parts of the device are described. Some capabilities, such as data processing, numerical outputs, measurements and display, are outlined. A streak cineradiogram of a typical implosion experiment is given. [fr]

  2. Gamma camera system

    International Nuclear Information System (INIS)

    Miller, D.W.; Gerber, M.S.; Schlosser, P.A.; Steidley, J.W.

    1980-01-01

    A detailed description is given of a novel gamma camera which is designed to produce images superior to those of conventional cameras used in nuclear medicine. The detector consists of a solid-state detector (e.g. germanium) which is formed to have a plurality of discrete components to enable 2-dimensional position identification. Details of the electronic processing circuits are given, and the problems and limitations introduced by noise are discussed in full. (U.K.)

  3. The eye of the camera: effects of security cameras on pro-social behavior

    NARCIS (Netherlands)

    van Rompay, T.J.L.; Vonk, D.J.; Fransen, M.L.

    2009-01-01

    This study addresses the effects of security cameras on prosocial behavior. Results from previous studies indicate that the presence of others can trigger helping behavior, arising from the need for approval of others. Extending these findings, the authors propose that security cameras can likewise

  4. Improved optical flow velocity analysis in SO2 camera images of volcanic plumes - implications for emission-rate retrievals investigated at Mt Etna, Italy and Guallatiri, Chile

    Science.gov (United States)

    Gliß, Jonas; Stebel, Kerstin; Kylling, Arve; Sudbø, Aasmund

    2018-02-01

    Accurate gas velocity measurements in emission plumes are highly desirable for various atmospheric remote sensing applications. The imaging technique of UV SO2 cameras is commonly used to monitor SO2 emissions from volcanoes and anthropogenic sources (e.g. power plants, ships). The camera systems capture the emission plumes at high spatial and temporal resolution. This allows the gas velocities in the plume to be retrieved directly from the images. The latter can be measured at a pixel level using optical flow (OF) algorithms. This is particularly advantageous under turbulent plume conditions. However, OF algorithms intrinsically rely on contrast in the images and often fail to detect motion in low-contrast image areas. We present a new method to identify ill-constrained OF motion vectors and replace them using the local average velocity vector. The latter is derived based on histograms of the retrieved OF motion fields. The new method is applied to two example data sets recorded at Mt Etna (Italy) and Guallatiri (Chile). We show that in many cases, the uncorrected OF yields significantly underestimated SO2 emission rates. We further show that our proposed correction can account for this and that it significantly improves the reliability of optical-flow-based gas velocity retrievals. In the case of Mt Etna, the SO2 emissions of the north-eastern crater are investigated. The corrected SO2 emission rates range between 4.8 and 10.7 kg s⁻¹ (average of 7.1 ± 1.3 kg s⁻¹) and are in good agreement with previously reported values. For the Guallatiri data, the emissions of the central crater and a fumarolic field are investigated. The retrieved SO2 emission rates are between 0.5 and 2.9 kg s⁻¹ (average of 1.3 ± 0.5 kg s⁻¹) and provide the first report of SO2 emissions from this remotely located and inaccessible volcano.
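
    The correction idea, identifying ill-constrained optical-flow vectors and replacing them with a dominant velocity derived from histograms of the motion field, might be sketched like this; the low-magnitude criterion and histogram binning below are illustrative simplifications, not the published algorithm's exact criteria:

```python
import numpy as np

def correct_flow(flow, min_mag=0.1):
    """Replace ill-constrained optical-flow vectors with the dominant
    velocity, estimated from histograms of direction and magnitude.

    flow: (H, W, 2) array of per-pixel (dx, dy) motion vectors.
    """
    dx, dy = flow[..., 0], flow[..., 1]
    mag = np.hypot(dx, dy)
    good = mag >= min_mag              # low-contrast pixels give near-zero vectors
    if not np.any(good):
        return flow.copy()
    ang = np.arctan2(dy[good], dx[good])
    # dominant direction and magnitude from the histogram peaks
    h_ang, edges_a = np.histogram(ang, bins=36)
    h_mag, edges_m = np.histogram(mag[good], bins=36)
    a0 = 0.5 * (edges_a[h_ang.argmax()] + edges_a[h_ang.argmax() + 1])
    m0 = 0.5 * (edges_m[h_mag.argmax()] + edges_m[h_mag.argmax() + 1])
    out = flow.copy()
    out[~good] = [m0 * np.cos(a0), m0 * np.sin(a0)]
    return out
```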

  5. Passive auto-focus for digital still cameras and camera phones: Filter-switching and low-light techniques

    Science.gov (United States)

    Gamadia, Mark Noel

    In order to gain valuable market share in the growing consumer digital still camera and camera phone market, camera manufacturers have to continually add and improve existing features to their latest product offerings. Auto-focus (AF) is one such feature, whose aim is to enable consumers to quickly take sharply focused pictures with little or no manual intervention in adjusting the camera's focus lens. While AF has been a standard feature in digital still and cell-phone cameras, consumers often complain about their cameras' slow AF performance, which may lead to missed photographic opportunities, rendering valuable moments and events with undesired out-of-focus pictures. This dissertation addresses this critical issue to advance the state of the art in the digital band-pass-filter-based passive AF method. This method is widely used to realize AF in the camera industry, where a focus actuator is adjusted via a search algorithm to locate the in-focus position by maximizing a sharpness measure extracted from a particular frequency band of the incoming image of the scene. There are no known systematic methods for automatically deriving parameters such as the digital pass-bands or the search step-size increments used in existing passive AF schemes. Conventional methods require time-consuming experimentation and tuning in order to arrive at a set of parameters which balance AF performance in terms of speed and accuracy, ultimately causing a delay in product time-to-market. This dissertation presents a new framework for determining an optimal set of passive AF parameters, named Filter-Switching AF, providing an automatic approach to achieve superior AF performance, in both good and low lighting conditions, based on the following performance measures (metrics): speed (total number of iterations), accuracy (offset from truth), power consumption (total distance moved), and user experience (in-focus position overrun). 
Performance results using three different prototype cameras
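
    The band-pass passive AF loop described above, maximizing a sharpness measure with a coarse-then-fine search over focus positions, can be illustrated with a synthetic lens model. The Laplacian-variance sharpness measure and the step sizes here are generic stand-ins, not the dissertation's tuned parameters:

```python
import numpy as np

def sharpness(img):
    """Band-pass sharpness measure: variance of a discrete Laplacian
    (a stand-in for the digital band-pass filter outputs of passive AF)."""
    lap = (-4 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(np.var(lap))

def coarse_to_fine_search(capture, positions, coarse_step=8):
    """Scan the focus range in coarse steps, then refine around the best
    coarse position; capture(p) returns the image at focus position p."""
    coarse = positions[::coarse_step]
    best = max(coarse, key=lambda p: sharpness(capture(p)))
    lo, hi = max(positions[0], best - coarse_step), min(positions[-1], best + coarse_step)
    fine = [p for p in positions if lo <= p <= hi]
    return max(fine, key=lambda p: sharpness(capture(p)))

# Synthetic lens: checkerboard contrast falls off away from focus at p = 50.
board = (np.indices((16, 16)).sum(0) % 2).astype(float)
capture = lambda p: np.exp(-((p - 50) / 10.0) ** 2) * board
best = coarse_to_fine_search(capture, list(range(101)))
```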

  6. Image compensation for camera and lighting variability

    Science.gov (United States)

    Daley, Wayne D.; Britton, Douglas F.

    1996-12-01

    With the current trend of integrating machine vision systems in industrial manufacturing and inspection applications comes the issue of camera and illumination stabilization. Unless each application is built around a particular camera and a highly controlled lighting environment, the interchangeability of cameras or fluctuations in lighting becomes a problem, as each camera usually has a different response. An empirical approach is proposed where color-tile data is acquired using the camera of interest, and a mapping is developed to some predetermined reference image using neural networks. A similar analytical approach based on a rough analysis of the imaging systems is also considered for deriving a mapping between cameras. Once a mapping has been determined, all data from one camera is mapped to correspond to the images of the other prior to performing any processing on the data. Instead of writing separate image processing algorithms for the particular image data being received, the image data is adjusted based on each particular camera and lighting situation. All that is required when swapping cameras is the new mapping for the camera being inserted. The image processing algorithms can remain the same, as the input data has been adjusted appropriately. The results of utilizing this technique are presented for an inspection application.
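
    As a simpler stand-in for the neural-network mapping described above, a least-squares affine map fitted on colour-tile measurements illustrates the idea of adjusting one camera's data toward a reference before any processing (the tile values below are synthetic):

```python
import numpy as np

def fit_camera_mapping(measured, reference):
    """Least-squares affine map from one camera's RGB responses to a
    reference camera's, fitted on colour-tile measurements. (A linear
    stand-in for the neural-network mapping the paper describes.)"""
    measured = np.asarray(measured, dtype=float)
    X = np.hstack([measured, np.ones((len(measured), 1))])   # homogeneous coords
    M, *_ = np.linalg.lstsq(X, np.asarray(reference, dtype=float), rcond=None)
    return M                                                 # shape (4, 3)

def apply_mapping(M, pixels):
    """Map pixels from the measured camera into the reference space."""
    P = np.atleast_2d(np.asarray(pixels, dtype=float))
    return np.hstack([P, np.ones((len(P), 1))]) @ M
```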

  7. Applications of optical fibers and miniature photonic elements in medical diagnostics

    Science.gov (United States)

    Blaszczak, Urszula; Gilewski, Marian; Gryko, Lukasz; Zajac, Andrzej; Kukwa, Andrzej; Kukwa, Wojciech

    2014-05-01

    Constructions of endoscopes, known for decades, in particular small devices with diameters of a few millimetres, are based on the application of fibre-optic imaging bundles or bundles of fibres in the illumination systems (usually with a halogen source). Commercially emerging CCD and CMOS cameras with sensor sizes of less than 5 mm, together with high-power LED solutions, allow modern endoscopes with many innovative properties to be designed and constructed. These constructions offer higher resolution. They are also relatively cheaper, especially in the context of the integration of the majority of the functions on a single chip. These features of CMOS sensors shorten the cycle of introducing newly developed instruments to the market. The paper includes a description of the concept of an endoscope with a miniature camera built on the basis of a CMOS detector manufactured by Omni Vision. A set of LEDs located at the operator side works as the illuminating system. A fibre-optic system and the lens of the camera are used in shaping the beam illuminating the observed tissue. Furthermore, to broaden the range of applications of the endoscope, the illuminator allows control of the spectral characteristics of the emitted light. The paper presents an analysis of the basic parameters of the optical system of the endoscope. The possibility of adjusting the magnification of the lens, the field of view of the camera and its spatial resolution is discussed. Special attention is drawn to issues related to the selection of the light sources used for illumination in terms of energy efficiency and the possibility of adjusting the colour of the emitted light in order to improve the quality of the image obtained by the camera.

  8. Optimising camera traps for monitoring small mammals.

    Directory of Open Access Journals (Sweden)

    Alistair S Glen

Full Text Available Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were (1) trigger speed, (2) passive infrared vs. microwave sensor, (3) white vs. infrared flash, and (4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps.

  9. Science, conservation, and camera traps

    Science.gov (United States)

    Nichols, James D.; Karanth, K. Ullas; O'Connell, Allan F.

    2011-01-01

    Biologists commonly perceive camera traps as a new tool that enables them to enter the hitherto secret world of wild animals. Camera traps are being used in a wide range of studies dealing with animal ecology, behavior, and conservation. Our intention in this volume is not to simply present the various uses of camera traps, but to focus on their use in the conduct of science and conservation. In this chapter, we provide an overview of these two broad classes of endeavor and sketch the manner in which camera traps are likely to be able to contribute to them. Our main point here is that neither photographs of individual animals, nor detection history data, nor parameter estimates generated from detection histories are the ultimate objective of a camera trap study directed at either science or management. Instead, the ultimate objectives are best viewed as either gaining an understanding of how ecological systems work (science) or trying to make wise decisions that move systems from less desirable to more desirable states (conservation, management). Therefore, we briefly describe here basic approaches to science and management, emphasizing the role of field data and associated analyses in these processes. We provide examples of ways in which camera trap data can inform science and management.

  10. Development of low-cost high-performance multispectral camera system at Banpil

    Science.gov (United States)

    Oduor, Patrick; Mizuno, Genki; Olah, Robert; Dutta, Achyut K.

    2014-05-01

Banpil Photonics (Banpil) has developed a low-cost, high-performance multispectral camera system for visible to short-wave infrared (VIS-SWIR) imaging, aimed at the most demanding high-sensitivity and high-speed military, commercial and industrial applications. The 640x512-pixel uncooled InGaAs camera system is designed to provide a compact, small form factor within a cubic inch, high sensitivity of less than 100 electrons, high dynamic range exceeding 190 dB, high frame rates greater than 1000 frames per second (FPS) at full resolution, and power consumption below 1 W. These are practically all the features highly desirable in military imaging applications, enabling deployment to every warfighter, while also maintaining the low-cost structure demanded for scaling into commercial markets. This paper describes Banpil's development of the camera system, including the features of the image sensor with an innovation integrating advanced digital electronics functionality, which has made the confluence of high-performance capabilities on the same imaging platform practical at low cost. It discusses the strategies employed, including innovations in the key components (e.g. the focal plane array (FPA) and read-out integrated circuit (ROIC)) within our control while maintaining a fabless model, and strategic collaboration with partners to attain additional cost reductions on optics, electronics, and packaging. We highlight the challenges and potential opportunities for further cost reductions to achieve the goal of a sub-$1000 uncooled high-performance camera system. Finally, a brief overview of emerging military, commercial and industrial applications that will benefit from this high-performance imaging system, and their forecast cost structure, is presented.

  11. Robust optical sensors for safety critical automotive applications

    Science.gov (United States)

    De Locht, Cliff; De Knibber, Sven; Maddalena, Sam

    2008-02-01

Optical sensors for the automotive industry need to be robust, high-performing and low cost. This paper focuses on the impact of automotive requirements on optical sensor design and packaging. The main strategies for lowering entry barriers for optical sensors in the automotive market are: sensor calibration and tuning performed by the sensor manufacturer; on-chip sensor test modes to guarantee functional integrity during operation; and appropriate package technology, which is key. In conclusion, optical sensor applications are growing in automotive. Optical sensor robustness has matured to the level of safety-critical applications, such as Electrical Power Assisted Steering (EPAS) and Drive-by-Wire based on optical linear arrays, and Automated Cruise Control (ACC), Lane Change Assist and Driver Classification/Smart Airbag Deployment based on camera imagers.

  12. A directional fast neutron detector using scintillating fibers and an intensified CCD camera system

    International Nuclear Information System (INIS)

    Holslin, Daniel; Armstrong, A.W.; Hagan, William; Shreve, David; Smith, Scott

    1994-01-01

We have been developing and testing a scintillating fiber detector (SFD) for use as a fast neutron sensor which can discriminate against neutrons entering at angles non-parallel to the fiber axis (''directionality''). The detector/converter component is a fiber bundle constructed of plastic scintillating fibers, each 10 cm long and either 0.3 mm or 0.5 mm in diameter. Extensive Monte Carlo simulations were made to optimize the bundle response to a range of fast neutron energies and to intense fluxes of high-energy gamma-rays. The bundle is coupled to a set of gamma-ray-insensitive electro-optic intensifiers whose output is viewed by a CCD camera directly coupled to the intensifiers. Two types of CCD cameras were utilized: 1) a standard, interline RS-170 camera with electronic shuttering and 2) a high-speed (up to 850 frame/s) field-transfer camera. Measurements of the neutron detection efficiency and directionality were made using 14 MeV neutrons, and the response to gamma-rays was measured using intense fluxes from radioisotopic sources (up to 20 R/h). Recently, the detector was constructed and tested using a large 10 cm by 10 cm square fiber bundle coupled to a 10 cm diameter GEN I intensifier tube. We present a description of the various detector systems and report the results of experimental tests. ((orig.))

  13. New camera systems for fuel services

    International Nuclear Information System (INIS)

    Hummel, W.; Beck, H.J.

    2010-01-01

AREVA NP Fuel Services have many years of experience in visual examination of, and measurements on, fuel assemblies and associated core components, using state-of-the-art camera and measuring technologies. These techniques allow the surface and dimensional characterisation of materials and shapes by visual examination. Examples of new, enhanced and sophisticated technologies for fuel services are two shielded colour camera systems for underwater use and close inspection of a fuel assembly. Nowadays the market requirements for detecting and characterising small defects (smaller than a tenth of a millimetre) or cracks, and for analysing surface appearance on irradiated fuel rod cladding or fuel assembly structural parts, have increased. It is therefore common practice to use movie cameras with higher resolution. The radiation resistance of high-resolution CCD cameras is in general very low, and it is not possible to use them unshielded close to a fuel assembly. By extending the camera with a mirror system and shielding around the sensitive parts, the movie camera can be utilized for fuel assembly inspection. AREVA NP Fuel Services is now equipped with cameras of this kind. (orig.)

  14. Automatic multi-camera calibration for deployable positioning systems

    Science.gov (United States)

    Axelsson, Maria; Karlsson, Mikael; Rudner, Staffan

    2012-06-01

Surveillance with automated positioning and tracking of subjects and vehicles in 3D is desired in many defence and security applications. Camera systems with stereo or multiple cameras are often used for 3D positioning. In such systems, accurate camera calibration is needed to obtain a reliable 3D position estimate. There is also a need for automated camera calibration to facilitate fast deployment of semi-mobile multi-camera 3D positioning systems. In this paper we investigate a method for automatic calibration of the extrinsic camera parameters (relative camera pose and orientation) of a multi-camera positioning system. It is based on estimation of the essential matrix between each camera pair, using the 5-point method for intrinsically calibrated cameras. The method is compared to a manual calibration method using real HD video data from a field trial with a multi-camera positioning system. The method is also evaluated on simulated data from a stereo camera model. The results show that the reprojection error of the automated camera calibration method is close to or smaller than the error for the manual calibration method, and that the automated calibration method can replace the manual one.
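The geometric relation that the 5-point method estimates can be sketched numerically: for two intrinsically calibrated cameras with relative rotation R and translation t, the essential matrix E = [t]x R satisfies the epipolar constraint x2' E x1 = 0 for every correspondence in normalized image coordinates. The pose and points below are synthetic illustrative values, not data from the paper.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

# Hypothetical relative pose: camera 2 rotated 10 deg about Y, translated along X.
theta = np.radians(10)
R = np.array([[np.cos(theta), 0, np.sin(theta)],
              [0, 1, 0],
              [-np.sin(theta), 0, np.cos(theta)]])
t = np.array([1.0, 0.0, 0.0])

E = skew(t) @ R  # essential matrix for x2^T E x1 = 0

# Random 3D points in front of camera 1, projected into both views
# (normalized coordinates, i.e. intrinsics already removed).
rng = np.random.default_rng(0)
X = rng.uniform([-1, -1, 4], [1, 1, 8], size=(50, 3))
x1 = X / X[:, 2:]
X2 = (R @ X.T).T + t
x2 = X2 / X2[:, 2:]

# The 5-point method inverts this constraint from >= 5 such pairs.
residuals = np.abs(np.einsum('ij,jk,ik->i', x2, E, x1))
print(residuals.max())  # effectively zero (machine precision)
```

With noisy detections one would instead solve for E robustly (e.g. RANSAC over 5-point samples) and decompose it back into R and t, which is exactly the extrinsic calibration the abstract describes.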

  15. Multi-Angle Snowflake Camera Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Stuefer, Martin [Univ. of Alaska, Fairbanks, AK (United States); Bailey, J. [Univ. of Alaska, Fairbanks, AK (United States)

    2016-07-01

The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron-resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36°. Each camera field of view is aligned to a common single focus point about 10 cm distant from the cameras. Two near-infrared emitter pairs are aligned with the cameras' fields of view within a 10° angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, in the range of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.
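Deriving fall speed from successive triggers reduces to distance over time, and the short exposure bounds the motion blur per frame. The emitter-plane separation and trigger timestamps below are illustrative assumptions; only the 1/25,000 s exposure figure comes from the handbook text above.

```python
# Hypothetical numbers: emitter-plane separation and trigger timestamps
# are illustrative, not taken from the MASC handbook.
emitter_separation_m = 0.032          # vertical gap between the two IR trigger planes
t_upper, t_lower = 0.000, 0.020       # trigger times in seconds

fall_speed = emitter_separation_m / (t_lower - t_upper)   # m/s

# Motion blur accumulated during one exposure (1/25,000 s per the handbook):
exposure_s = 1 / 25_000
blur_m = fall_speed * exposure_s
print(f"fall speed {fall_speed:.2f} m/s, blur {blur_m * 1e6:.0f} um per frame")
# -> fall speed 1.60 m/s, blur 64 um per frame
```

At typical snowflake fall speeds the blur is tens of microns, i.e. comparable to the camera's 9- to 37-micron resolution, which is why such a short exposure is needed.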

  16. CALIBRATION PROCEDURES ON OBLIQUE CAMERA SETUPS

    Directory of Open Access Journals (Sweden)

    G. Kemper

    2016-06-01

Full Text Available Besides the creation of virtual animated 3D city models and analysis for homeland security and city planning, the accurate determination of geometric features from oblique imagery is an important task today. Due to the huge number of single images, the reduction of control points forces the use of direct georeferencing devices. This requires precise camera calibration and additional adjustment procedures. This paper aims to show the workflow of the various calibration steps and presents examples from the calibration flight together with the final 3D city model. In contrast to most other software, the oblique cameras are not used as co-registered sensors relative to the nadir one; all camera images enter the aerotriangulation (AT) process as single pre-oriented data. This enables a better post-calibration, in order to detect variations in the individual camera calibrations and other mechanical effects. The sensor shown (Oblique Imager) is based on 5 Phase One cameras, where the nadir one has 80 MPix and a 50 mm lens, while the oblique ones capture 50 MPix images using 80 mm lenses. The cameras are mounted robustly inside a housing to protect them against physical and thermal deformations. The sensor head also hosts an IMU, which is connected to a POS AV GNSS receiver. The sensor is stabilised by a gyro mount, which creates floating antenna-IMU lever arms; these had to be registered together with the raw GNSS-IMU data. The camera calibration procedure was performed on the basis of a special calibration flight with 351 shots from all 5 cameras and recorded GPS/IMU data. This specific mission was designed with two different altitudes and additional cross lines at each flying height. The five images from each exposure position have no overlap, but within the block there are many overlaps, resulting in up to 200 measurements per point. On each photo there were on average 110 well-distributed measured points, which is a satisfying number for camera calibration. In a first

  17. Acoustic and Optical Televiewer Borehole Logging

    International Nuclear Information System (INIS)

    Ahmad Hasnulhadi Che Kamaruddin; Nik Marzukee Nik Ibrahim; Zaidi Ibrahim; Nurul Wahida Ahmad Khairuddin; Azmi Ibrahim

    2016-01-01

This review paper focuses on the borehole televiewer (BHTV), which is used to obtain high-resolution images of the borehole wall. A probe with a high-resolution, downward-looking camera is used. The camera has specific optics (a conical mirror with a ring of bulbs), so that a single shot captures the entire borehole circumference as a 360° panoramic view. Settings similar to those of traditional cameras (exposure, quality, light, frame rate and resolution) make it effective in almost any type of borehole fluid. After each shot, a series of horizontal pixel strings is acquired, giving a rasterised RGB picture in real time, which is transmitted to the console and finally to a monitor. The orientation device embedded in the tool, consisting of 3 inclinometers and 3 magnetometers, allows the inclination and azimuth of the probe to be computed in real time, correctly orienting the borehole images. The acoustic televiewer is also introduced as a technologically more advanced variant. Televiewer logging has been successfully applied to geotechnical investigations and mineral exploration (Schepers et al., 2001) due to advances in beam focusing, increased dynamic range, digital recording techniques, and digital data processing (Schepers, 1991). This paper therefore goes through the basic principles of the BHTV as one type of data collection used today. (author)

  18. Relative and Absolute Calibration of a Multihead Camera System with Oblique and Nadir Looking Cameras for a Uas

    Science.gov (United States)

    Niemeyer, F.; Schima, R.; Grenzdörffer, G.

    2013-08-01

Numerous unmanned aerial systems (UAS) are currently flooding the market. UAVs are specially designed and used for the most diverse applications. Micro and mini UAS (maximum take-off weight up to 5 kg) are of particular interest, because legal restrictions are still manageable while the payload capacities are sufficient for many imaging sensors. A camera system with four oblique and one nadir-looking camera is currently under development at the Chair for Geodesy and Geoinformatics. The so-called "Four Vision" camera system was successfully built and tested in the air. An MD4-1000 UAS from microdrones is used as the carrier system. Lightweight industrial cameras are used, controlled by a central computer. For further photogrammetric image processing, each individual camera, as well as all cameras together, has to be calibrated. This paper focuses on the determination of the relative orientation between the cameras with the "Australis" software and gives an overview of the results and experiences of the test flights.

  19. Terahertz adaptive optics with a deformable mirror.

    Science.gov (United States)

    Brossard, Mathilde; Sauvage, Jean-François; Perrin, Mathias; Abraham, Emmanuel

    2018-04-01

We report on the wavefront correction of a terahertz (THz) beam using adaptive optics, which requires both a wavefront sensor able to sense the optical aberrations and a wavefront corrector. The wavefront sensor relies on a direct 2D electro-optic imaging system composed of a ZnTe crystal and a CMOS camera. By measuring the phase variation of the THz electric field in the crystal, we were able to minimize the geometrical aberrations of the beam, thanks to the action of a deformable mirror. This phase control will open the route to THz adaptive optics, in order to optimize the THz beam quality for both practical and fundamental applications.

  20. Multifocal fluorescence microscope for fast optical recordings of neuronal action potentials.

    Science.gov (United States)

    Shtrahman, Matthew; Aharoni, Daniel B; Hardy, Nicholas F; Buonomano, Dean V; Arisaka, Katsushi; Otis, Thomas S

    2015-02-03

    In recent years, optical sensors for tracking neural activity have been developed and offer great utility. However, developing microscopy techniques that have several kHz bandwidth necessary to reliably capture optically reported action potentials (APs) at multiple locations in parallel remains a significant challenge. To our knowledge, we describe a novel microscope optimized to measure spatially distributed optical signals with submillisecond and near diffraction-limit resolution. Our design uses a spatial light modulator to generate patterned illumination to simultaneously excite multiple user-defined targets. A galvanometer driven mirror in the emission path streaks the fluorescence emanating from each excitation point during the camera exposure, using unused camera pixels to capture time varying fluorescence at rates that are ∼1000 times faster than the camera's native frame rate. We demonstrate that this approach is capable of recording Ca(2+) transients resulting from APs in neurons labeled with the Ca(2+) sensor Oregon Green Bapta-1 (OGB-1), and can localize the timing of these events with millisecond resolution. Furthermore, optically reported APs can be detected with the voltage sensitive dye DiO-DPA in multiple locations within a neuron with a signal/noise ratio up to ∼40, resolving delays in arrival time along dendrites. Thus, the microscope provides a powerful tool for photometric measurements of dynamics requiring submillisecond sampling at multiple locations. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  1. Imaging optical scattering of butterfly wing scales with a microscope.

    Science.gov (United States)

    Fu, Jinxin; Yoon, Beom-Jin; Park, Jung Ok; Srinivasarao, Mohan

    2017-08-06

A new optical method is proposed to investigate the reflectance of structurally coloured objects, such as Morpho butterfly wing scales and cholesteric liquid crystals. Using a reflected-light microscope and a digital single-lens reflex (DSLR) camera, we have successfully measured the two-dimensional reflection pattern of individual wing scales of Morpho butterflies. We demonstrate that this method enables us to measure the bidirectional reflectance distribution function (BRDF). The scattering image observed in the back focal plane of the objective is projected onto the camera sensor by inserting a Bertrand lens in the optical path of the microscope. With monochromatic light illumination, we quantify the angle-dependent reflectance spectra from the wing scales of Morpho rhetenor by retrieving the raw signal from the digital camera sensor. We also demonstrate that the polarization-dependent reflection of individual wing scales is readily observed using this method, using the individual wing scales of Morpho cypris. In an effort to show the generality of the method, we used a chiral nematic fluid to illustrate the angle-dependent reflectance as seen by this method.

  2. Development of plenoptic infrared camera using low dimensional material based photodetectors

    Science.gov (United States)

    Chen, Liangliang

Infrared (IR) sensors have extended imaging from the submicron visible spectrum to wavelengths of tens of microns, and are widely used in military and civilian applications. Conventional IR cameras based on bulk semiconductor materials suffer from low frame rate, low resolution, temperature dependence and high cost, while low-dimensional-material nanotechnology based on the unusual carbon nanotube (CNT) has made much progress in research and industry. The unique properties of CNTs motivate the investigation of CNT-based IR photodetectors and imaging systems, addressing the sensitivity, speed and cooling difficulties of state-of-the-art IR imaging. Reliability and stability are critical to the transition from nanoscience to nanoengineering, especially for infrared sensing: not only for a fundamental understanding of the processes underlying the CNT photoresponse, but also for the development of a novel infrared-sensitive material with unique optical and electrical features. In the proposed research, a sandwich-structured sensor was fabricated between two polymer layers. The polyimide substrate isolated the sensor from background noise, and a top parylene packing blocked humid environmental factors. At the same time, the fabrication process was optimised by dielectrophoresis with real-time electrical monitoring and by multiple annealing steps, to improve fabrication yield and sensor performance. The nanoscale infrared photodetector was characterised with digital microscopy and a precise linear stage in order to understand it fully. In addition, a low-noise, high-gain readout system was designed together with the CNT photodetector to make the nano-sensor IR camera feasible. To explore more of the infrared light field, we employ compressive sensing algorithms in light-field sampling, 3D cameras and compressive video sensing. The redundancy of the whole light field, including angular images for the light field, binocular images for the 3D camera and temporal information of video streams, is extracted and

  3. Homography-based multiple-camera person-tracking

    Science.gov (United States)

    Turk, Matthew R.

    2009-01-01

    Multiple video cameras are cheaply installed overlooking an area of interest. While computerized single-camera tracking is well-developed, multiple-camera tracking is a relatively new problem. The main multi-camera problem is to give the same tracking label to all projections of a real-world target. This is called the consistent labelling problem. Khan and Shah (2003) introduced a method to use field of view lines to perform multiple-camera tracking. The method creates inter-camera meta-target associations when objects enter at the scene edges. They also said that a plane-induced homography could be used for tracking, but this method was not well described. Their homography-based system would not work if targets use only one side of a camera to enter the scene. This paper overcomes this limitation and fully describes a practical homography-based tracker. A new method to find the feet feature is introduced. The method works especially well if the camera is tilted, when using the bottom centre of the target's bounding-box would produce inaccurate results. The new method is more accurate than the bounding-box method even when the camera is not tilted. Next, a method is presented that uses a series of corresponding point pairs "dropped" by oblivious, live human targets to find a plane-induced homography. The point pairs are created by tracking the feet locations of moving targets that were associated using the field of view line method. Finally, a homography-based multiple-camera tracking algorithm is introduced. Rules governing when to create the homography are specified. The algorithm ensures that homography-based tracking only starts after a non-degenerate homography is found. The method works when not all four field of view lines are discoverable; only one line needs to be found to use the algorithm. To initialize the system, the operator must specify pairs of overlapping cameras. Aside from that, the algorithm is fully automatic and uses the natural movement of
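A plane-induced homography of the kind the tracker estimates from corresponding feet-point pairs can be computed with the standard Direct Linear Transform (DLT) from four or more pairs. The point coordinates and ground-truth homography below are synthetic illustrative values, not data from the paper.

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct Linear Transform: estimate H mapping src -> dst (>= 4 point pairs)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A: last right singular vector.
    _, _, Vt = np.linalg.svd(np.array(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Hypothetical ground-plane feet locations "dropped" by a target seen in
# two overlapping cameras, related by a known synthetic homography.
src = np.array([[100, 200], [320, 210], [300, 400], [120, 390], [210, 300]], float)
H_true = np.array([[0.9, 0.1, 5.0],
                   [-0.05, 1.1, -3.0],
                   [1e-4, 2e-4, 1.0]])
dst_h = (H_true @ np.c_[src, np.ones(len(src))].T).T
dst = dst_h[:, :2] / dst_h[:, 2:]

H = homography_dlt(src, dst)
print(np.allclose(H, H_true, atol=1e-6))  # True
```

In practice the pairs come from tracked feet locations with noise, so a robust estimator (RANSAC over DLT samples) is used, and the tracker's non-degeneracy check corresponds to rejecting near-collinear point configurations.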

  4. Computing camera heading: A study

    Science.gov (United States)

    Zhang, John Jiaxiang

    2000-08-01

    An accurate estimate of the motion of a camera is a crucial first step for the 3D reconstruction of sites, objects, and buildings from video. Solutions to the camera heading problem can be readily applied to many areas, such as robotic navigation, surgical operation, video special effects, multimedia, and lately even in internet commerce. From image sequences of a real world scene, the problem is to calculate the directions of the camera translations. The presence of rotations makes this problem very hard. This is because rotations and translations can have similar effects on the images, and are thus hard to tell apart. However, the visual angles between the projection rays of point pairs are unaffected by rotations, and their changes over time contain sufficient information to determine the direction of camera translation. We developed a new formulation of the visual angle disparity approach, first introduced by Tomasi, to the camera heading problem. Our new derivation makes theoretical analysis possible. Most notably, a theorem is obtained that locates all possible singularities of the residual function for the underlying optimization problem. This allows identifying all computation trouble spots beforehand, and to design reliable and accurate computational optimization methods. A bootstrap-jackknife resampling method simultaneously reduces complexity and tolerates outliers well. Experiments with image sequences show accurate results when compared with the true camera motion as measured with mechanical devices.
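The key observation above, that visual angles between projection rays are unaffected by rotation, can be checked numerically. The rays and the random rotation below are arbitrary illustrative values, not quantities from the study.

```python
import numpy as np

def visual_angle(u, v):
    """Angle between two projection rays from the camera centre."""
    return np.arccos(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Rays from the camera centre to two scene points (hypothetical values).
u = np.array([0.1, 0.2, 1.0])
v = np.array([-0.3, 0.05, 1.0])

# A random rotation: orthonormalize a random matrix, fix the sign so det = +1.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1

# Rotating the camera rotates every ray, but preserves the angle between
# rays; a translation, by contrast, changes it. The change of these angles
# over time is what carries the translation-direction information.
print(np.isclose(visual_angle(u, v), visual_angle(Q @ u, Q @ v)))  # True
```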

  5. Scintillation camera for high activity sources

    International Nuclear Information System (INIS)

    Arseneau, R.E.

    1978-01-01

    The invention described relates to a scintillation camera used for clinical medical diagnosis. Advanced recognition of many unacceptable pulses allows the scintillation camera to discard such pulses at an early stage in processing. This frees the camera to process a greater number of pulses of interest within a given period of time. Temporary buffer storage allows the camera to accommodate pulses received at a rate in excess of its maximum rated capability due to statistical fluctuations in the level of radioactivity of the radiation source measured. (U.K.)

  6. Decision about buying a gamma camera

    International Nuclear Information System (INIS)

    Ganatra, R.D.

    1992-01-01

    A large part of the referral to a nuclear medicine department is usually for imaging studies. Sooner or later, the nuclear medicine specialist will be called upon to make a decision about when and what type of gamma camera to buy. There is no longer an option of choosing between a rectilinear scanner and a gamma camera as the former is virtually out of the market. The decision that one has to make is when to invest in a gamma camera, and then on what basis to select the gamma camera

  7. Decision about buying a gamma camera

    Energy Technology Data Exchange (ETDEWEB)

    Ganatra, R D

    1993-12-31

A large part of the referral to a nuclear medicine department is usually for imaging studies. Sooner or later, the nuclear medicine specialist will be called upon to make a decision about when and what type of gamma camera to buy. There is no longer an option of choosing between a rectilinear scanner and a gamma camera as the former is virtually out of the market. The decision that one has to make is when to invest in a gamma camera, and then on what basis to select the gamma camera. 1 tab., 1 fig.

  8. Selective-imaging camera

    Science.gov (United States)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.
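The Planck spectra mentioned above can be evaluated directly from Planck's law, and Wien's displacement law locates the peak that distinguishes a hot obscurant from a daylight-like source. The 1500 K flame temperature below is an illustrative assumption, not a figure from the paper.

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wavelength_m, temperature_k):
    """Spectral radiance B(lambda, T) in W sr^-1 m^-3 (Planck's law)."""
    a = 2 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temperature_k)
    return a / math.expm1(b)

# Wien's displacement law: lambda_max = b_wien / T. A ~1500 K obscurant
# (hypothetical flame) peaks in the SWIR, far from a ~6000 K visible peak,
# which is why spectral selection can see "through" it.
b_wien = 2.897771955e-3  # m K
for T in (1500, 6000):
    lam_max = b_wien / T
    print(f"T = {T} K: peak near {lam_max * 1e6:.2f} um")
```

Up/down shifting the per-pixel Planck spectrum, as the paper describes, amounts to re-evaluating this curve at a display temperature within the RGB band of the human visual system.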

  9. Video Chat with Multiple Cameras

    OpenAIRE

    MacCormick, John

    2012-01-01

    The dominant paradigm for video chat employs a single camera at each end of the conversation, but some conversations can be greatly enhanced by using multiple cameras at one or both ends. This paper provides the first rigorous investigation of multi-camera video chat, concentrating especially on the ability of users to switch between views at either end of the conversation. A user study of 23 individuals analyzes the advantages and disadvantages of permitting a user to switch between views at...

  10. Microprocessor-controlled, wide-range streak camera

    Energy Technology Data Exchange (ETDEWEB)

    Amy E. Lewis, Craig Hollabaugh

    2006-09-01

Bechtel Nevada/NSTec recently announced deployment of their fifth generation streak camera. This camera incorporates many advanced features beyond those currently available for streak cameras. The arc-resistant driver includes a trigger lockout mechanism, actively monitors input trigger levels, and incorporates a high-voltage fault interrupter for user safety and tube protection. The camera is completely modular and may deflect over a variable full-sweep time of 15 nanoseconds to 500 microseconds. The camera design is compatible with both large- and small-format commercial tubes from several vendors. The embedded microprocessor offers Ethernet connectivity, and XML [extensible markup language]-based configuration management with non-volatile parameter storage using flash-based storage media. The camera's user interface is platform-independent (Microsoft Windows, Unix, Linux, Macintosh OSX) and is accessible using an AJAX [asynchronous Javascript and XML]-equipped modern browser, such as Internet Explorer 6, Firefox, or Safari. User interface operation requires no installation of client software or browser plug-in technology. Automation software can also access the camera configuration and control using HTTP [hypertext transfer protocol]. The software architecture supports multiple simultaneous clients, multiple cameras, and multiple-module access with a standard browser. The entire user interface can be customized.

  11. 1550 nm superluminescent diode and anti-Stokes effect CCD camera based optical coherence tomography for full-field optical metrology

    Science.gov (United States)

    Kredzinski, Lukasz; Connelly, Michael J.

    2011-06-01

Optical coherence tomography (OCT) is a promising non-invasive imaging technology capable of producing high-resolution 3D cross-sectional images of the internal microstructure of the examined material. However, almost all such systems are expensive, requiring complex optical setups, costly light sources, and complicated scanning of the sample under test. In addition, most of these systems have not taken advantage of the competitively priced optical components available at wavelengths within the main optical communications band located in the 1550 nm region. A comparatively simple and inexpensive full-field OCT system (FF-OCT), based on a superluminescent diode (SLD) light source and an anti-Stokes imaging device, was constructed to perform 3D cross-sectional imaging. Such an inexpensive setup with moderate resolution could be readily applied in low-level biomedical and industrial diagnostics. This paper covers calibration of the system and determines its suitability for imaging structures of biological tissues such as teeth, which have low absorption at 1550 nm.

  12. Orion Optical Navigation Progress Toward Exploration Mission 1

    Science.gov (United States)

    Holt, Greg N.; D'Souza, Christopher N.; Saley, David

    2018-01-01

Optical navigation of human spacecraft was proposed on Gemini and implemented successfully on Apollo as a means of autonomously operating the vehicle in the event of lost communication with controllers on Earth. The Orion emergency return system utilizing optical navigation has matured in design over the last several years, and is currently undergoing the final implementation and test phase in preparation for Exploration Mission 1 (EM-1) in 2019. The software development is past its Critical Design Review, and is progressing through test and certification for human rating. The filter architecture uses a square-root-free UDU covariance factorization. Linear Covariance Analysis (LinCov) was used to analyze the measurement models and the measurement error models on a representative EM-1 trajectory. The Orion EM-1 flight camera was calibrated at the Johnson Space Center (JSC) electro-optics lab. To permanently stake the focal length of the camera, a 500 mm focal-length refractive collimator was used. Two Engineering Design Unit (EDU) cameras and an EDU star tracker were used for a live-sky test in Denver. In-space imagery with high-fidelity truth metadata is rare, so these live-sky tests provide one of the closest real-world analogs to operational use. A hardware-in-the-loop test rig was developed in the Johnson Space Center Electro-Optics Lab to exercise the OpNav system prior to integrated testing on the Orion vehicle. The software is verified with synthetic images. Several hundred off-nominal images are also used to analyze robustness and fault detection in the software. These include effects such as stray light, excess radiation damage, and specular reflections, and are used to help verify the tuning parameters chosen for the algorithms, such as Earth atmosphere bias, minimum pixel intensity, and star detection thresholds.
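The abstract mentions a square-root-free UDU covariance factorization. As a hedged sketch of what that factorization computes (a Bierman-style decomposition for illustration, not the Orion flight code): a symmetric positive-definite covariance P is written as P = U·diag(d)·Uᵀ with U unit upper-triangular, which filters can then propagate without square roots.

```python
import numpy as np

def udu(P):
    """Factor a symmetric positive-definite matrix as P = U @ diag(d) @ U.T,
    with U unit upper-triangular: the UDU form used by square-root-free
    covariance filters. Illustrative implementation only."""
    P = P.astype(float).copy()
    n = P.shape[0]
    U = np.eye(n)
    d = np.zeros(n)
    for j in range(n - 1, -1, -1):
        d[j] = P[j, j]
        for i in range(j):
            U[i, j] = P[i, j] / d[j]
        # Subtract this column's contribution from the remaining submatrix
        for i in range(j):
            for k in range(i + 1):
                P[k, i] -= U[k, j] * d[j] * U[i, j]
    return U, d
```

Propagating U and d instead of P keeps the covariance symmetric and positive-definite in finite-precision arithmetic, which is the motivation for using this form in a human-rated filter.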

  13. Full-frame, high-speed 3D shape and deformation measurements using stereo-digital image correlation and a single color high-speed camera

    Science.gov (United States)

    Yu, Liping; Pan, Bing

    2017-08-01

A technique for full-frame, high-speed 3D shape and deformation measurement using stereo-digital image correlation (stereo-DIC) and a single high-speed color camera is proposed. With the aid of a carefully designed pseudo-stereo-imaging apparatus, color images of a test object surface, composed of blue- and red-channel images from two different optical paths, are recorded by a high-speed color CMOS camera. The recorded color images can be separated into red- and blue-channel sub-images using a simple but effective color crosstalk correction method. These separated blue- and red-channel sub-images are processed by the regular stereo-DIC method to retrieve full-field 3D shape and deformation of the test object surface. Compared with existing two-camera high-speed stereo-DIC or four-mirror-adapter-assisted single-camera high-speed stereo-DIC, the proposed single-camera high-speed stereo-DIC technique offers the prominent advantage of full-frame measurements with a single high-speed camera and without sacrificing spatial resolution. Two experiments, shape measurement of a curved surface and vibration measurement of a Chinese double-sided drum, demonstrated the effectiveness and accuracy of the proposed technique.
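The crosstalk-correction idea can be sketched as a 2x2 linear unmixing: each recorded channel contains a small leaked fraction of the other optical path's signal, and inverting the mixing matrix recovers the two sub-images. The coefficients below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def separate_channels(rgb, c_rb=0.05, c_br=0.05):
    """rgb: H x W x 3 float image. Returns the corrected (red, blue)
    sub-images under an assumed linear crosstalk model."""
    mix = np.array([[1.0, c_br],    # observed R = true R + c_br * true B
                    [c_rb, 1.0]])   # observed B = true B + c_rb * true R
    obs = np.stack([rgb[..., 0], rgb[..., 2]], axis=-1)
    true = obs @ np.linalg.inv(mix).T   # invert the mixing pixel-wise
    return true[..., 0], true[..., 1]
```

Each corrected sub-image then feeds one "camera" of the regular stereo-DIC pipeline.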

14. Analyzer for gamma camera diagnostics

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

This research work was carried out to develop an analyzer for gamma camera diagnostics. It is composed of an electronic system, including hardware and software, that operates on the four head-position signals acquired from a gamma camera detector. The result is the spectrum of the energy delivered by nuclear radiation coming from the camera detector head. The system includes analog processing of the position signals from the camera, digitization, subsequent processing of the energy signal in a multichannel analyzer, transmission of the data to a computer via a standard USB port, and processing of the data on a personal computer to obtain the final histogram. The circuits consist of an analog processing board and a universal kit with a microcontroller and a programmable gate array. (Author)
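The multichannel-analyzer stage amounts to binning digitized pulse heights into channels to form the energy spectrum sent to the PC. A minimal sketch, where the channel count and ADC range are assumptions for illustration:

```python
import numpy as np

def mca_histogram(samples, n_channels=1024, adc_max=4096):
    """Bin digitized pulse-height (energy) samples into MCA channels.
    n_channels and adc_max are illustrative, not the analyzer's specs."""
    channels = np.clip(samples * n_channels // adc_max, 0, n_channels - 1)
    return np.bincount(channels.astype(int), minlength=n_channels)
```

Accumulating such histograms over an acquisition run yields the spectrum displayed on the personal computer.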

  15. Meaning of visualizing retinal cone mosaic on adaptive optics images.

    Science.gov (United States)

    Jacob, Julie; Paques, Michel; Krivosic, Valérie; Dupas, Bénédicte; Couturier, Aude; Kulcsar, Caroline; Tadayoni, Ramin; Massin, Pascale; Gaudric, Alain

    2015-01-01

To explore the anatomic correlation of the retinal cone mosaic on adaptive optics images. Retrospective nonconsecutive observational case series. A retrospective review of the multimodal imaging charts of 6 patients with focal alteration of the cone mosaic on adaptive optics was performed. Retinal diseases included acute posterior multifocal placoid pigment epitheliopathy (n = 1), hydroxychloroquine retinopathy (n = 1), and macular telangiectasia type 2 (n = 4). High-resolution retinal images were obtained using a flood-illumination adaptive optics camera. Images were recorded using standard imaging modalities: color and red-free fundus camera photography, infrared reflectance scanning laser ophthalmoscopy, fluorescein angiography, indocyanine green angiography, and spectral-domain optical coherence tomography (OCT). On OCT, in the marginal zone of the lesions, a disappearance of the interdigitation zone was observed, while the ellipsoid zone was preserved. Image recording demonstrated that such attenuation of the interdigitation zone co-localized with the disappearance of the cone mosaic on adaptive optics images. In 1 case, the restoration of the interdigitation zone paralleled that of the cone mosaic after a 2-month follow-up. Our results suggest that the interdigitation zone could contribute substantially to the reflectance of the cone photoreceptor mosaic. The absence of cones on adaptive optics images does not necessarily mean photoreceptor cell death. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Single Camera Calibration in 3D Vision

    Directory of Open Access Journals (Sweden)

    Caius SULIMAN

    2009-12-01

Full Text Available Camera calibration is a necessary step in 3D vision in order to extract metric information from 2D images. A camera is considered to be calibrated when its parameters are known (i.e., principal distance, lens distortion, focal length, etc.). In this paper we deal with a single-camera calibration method, and with its help we find the intrinsic and extrinsic camera parameters. The method was implemented with success in the Matlab programming and simulation environment.
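The parameters such a calibration estimates belong to the standard pinhole model: intrinsics K (focal lengths fx, fy and principal point cx, cy) plus extrinsics [R | t]. A minimal sketch of the forward projection, with illustrative numbers rather than calibrated values:

```python
import numpy as np

# Intrinsic matrix: focal lengths and principal point (illustrative values)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)      # extrinsic rotation: camera aligned with the world frame
t = np.zeros(3)    # extrinsic translation

def project(X):
    """Project a 3D world point onto the image plane (pixel coordinates)."""
    uvw = K @ (R @ np.asarray(X, dtype=float) + t)
    return uvw[:2] / uvw[2]   # perspective division
```

Calibration is the inverse problem: given known 3D-2D correspondences, recover K, R, and t (and lens-distortion coefficients, omitted here for brevity).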

  17. RELATIVE AND ABSOLUTE CALIBRATION OF A MULTIHEAD CAMERA SYSTEM WITH OBLIQUE AND NADIR LOOKING CAMERAS FOR A UAS

    Directory of Open Access Journals (Sweden)

    F. Niemeyer

    2013-08-01

Full Text Available Numerous unmanned aerial systems (UAS) are currently flooding the market. UAS are specially designed and used for the most diverse applications. Micro and mini UAS (maximum take-off weight up to 5 kg) are of particular interest, because legal restrictions are still manageable and the payload capacities are sufficient for many imaging sensors. A camera system with four oblique and one nadir-looking camera is currently under development at the Chair for Geodesy and Geoinformatics. The so-called "Four Vision" camera system was successfully built and tested in the air. An MD4-1000 UAS from microdrones is used as the carrier system. Lightweight industrial cameras are used and controlled by a central computer. For further photogrammetric image processing, each individual camera, as well as all the cameras together, has to be calibrated. This paper focuses on determining the relative orientation between the cameras with the "Australis" software and gives an overview of the results and experiences from the test flights.

  18. A camera specification for tendering purposes

    International Nuclear Information System (INIS)

    Lunt, M.J.; Davies, M.D.; Kenyon, N.G.

    1985-01-01

    A standardized document is described which is suitable for sending to companies which are being invited to tender for the supply of a gamma camera. The document refers to various features of the camera, the performance specification of the camera, maintenance details, price quotations for various options and delivery, installation and warranty details. (U.K.)

  19. Fiber-Optic Surface Temperature Sensor Based on Modal Interference

    Directory of Open Access Journals (Sweden)

    Frédéric Musin

    2016-07-01

Full Text Available Spatially integrated surface temperature sensing is highly useful when it comes to controlling processes, detecting hazardous conditions, or monitoring the health and safety of equipment and people. Fiber-optic sensing based on modal interference has shown great sensitivity to temperature variation by means of cost-effective image processing of few-mode interference patterns. New developments in sensor configuration, as described in this paper, include an innovative cooling/heating phase discrimination capability and more precise measurements, based entirely on image processing of the interference patterns. The proposed technique was applied to measuring the integrated surface temperature of a hollow cylinder and compared with a conventional measurement system consisting of an infrared camera and a precision temperature probe. The results of the optical technique are in line with those of the reference system. Compared with conventional surface temperature probes, the optical technique has the following advantages: low heat-capacity temperature measurement errors, easier spatial deployment, and the replacement of multi-angle infrared camera imaging with continuous monitoring of surfaces that are not visually accessible.

  20. Improved optical flow velocity analysis in SO2 camera images of volcanic plumes – implications for emission-rate retrievals investigated at Mt Etna, Italy and Guallatiri, Chile

    Directory of Open Access Journals (Sweden)

    J. Gliß

    2018-02-01

Full Text Available Accurate gas velocity measurements in emission plumes are highly desirable for various atmospheric remote sensing applications. The imaging technique of UV SO2 cameras is commonly used to monitor SO2 emissions from volcanoes and anthropogenic sources (e.g., power plants, ships). The camera systems capture the emission plumes at high spatial and temporal resolution, which allows the gas velocities in the plume to be retrieved directly from the images. These velocities can be measured at the pixel level using optical flow (OF) algorithms, which is particularly advantageous under turbulent plume conditions. However, OF algorithms intrinsically rely on contrast in the images and often fail to detect motion in low-contrast image areas. We present a new method to identify ill-constrained OF motion vectors and replace them with the local average velocity vector, which is derived from histograms of the retrieved OF motion fields. The new method is applied to two example data sets recorded at Mt Etna (Italy) and Guallatiri (Chile). We show that in many cases the uncorrected OF yields significantly underestimated SO2 emission rates. We further show that our proposed correction accounts for this and significantly improves the reliability of optical-flow-based gas velocity retrievals. In the case of Mt Etna, the SO2 emissions of the north-eastern crater are investigated. The corrected SO2 emission rates range between 4.8 and 10.7 kg s−1 (average of 7.1 ± 1.3 kg s−1) and are in good agreement with previously reported values. For the Guallatiri data, the emissions of the central crater and a fumarolic field are investigated. The retrieved SO2 emission rates are between 0.5 and 2.9 kg s−1 (average of 1.3 ± 0.5 kg s−1) and provide the first report of SO2 emissions from this remotely located and inaccessible volcano.
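The correction idea can be sketched as follows: flow vectors in low-contrast pixels are flagged as ill-constrained and replaced by an average vector taken from the dominant direction bin of a histogram over the reliable vectors. The contrast threshold, bin count, and data below are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

def correct_flow(u, v, contrast, contrast_thresh=0.1, nbins=36):
    """u, v: optical-flow components; contrast: per-pixel image contrast.
    Replaces flow vectors in low-contrast pixels with the average vector
    of the dominant-direction histogram bin of the reliable vectors."""
    reliable = contrast >= contrast_thresh
    angles = np.arctan2(v[reliable], u[reliable])
    mags = np.hypot(u[reliable], v[reliable])
    # Histogram of flow directions among reliable vectors
    hist, edges = np.histogram(angles, bins=nbins, range=(-np.pi, np.pi))
    k = np.argmax(hist)
    sel = (angles >= edges[k]) & (angles < edges[k + 1])
    # Average vector within the dominant-direction bin
    u_avg = np.mean(mags[sel] * np.cos(angles[sel]))
    v_avg = np.mean(mags[sel] * np.sin(angles[sel]))
    u_out, v_out = u.copy(), v.copy()
    u_out[~reliable], v_out[~reliable] = u_avg, v_avg
    return u_out, v_out
```

Multiplying the corrected velocity field by the column-density image along a plume cross-section then yields the emission rate.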