WorldWideScience

Sample records for faint object camera

  1. Image processing for the ESA Faint Object Camera

    Science.gov (United States)

    Norris, P.

    1980-10-01

    The paper describes image processing for the ESA Faint Object Camera (FOC), an instrument complementing the NASA Space Telescope, at the time planned for a 1983 Shuttle launch. The data processing needed to remove instrument signature effects from the FOC images, along with subtle errors in the data, is discussed. Data processing will be accomplished by a minicomputer driving a high-quality color display with large backing disk storage; interactive techniques for selective enhancement of image features will be combined with standard scientific transformation, filtering, and analysis methods. Astronomical techniques, including star finding and spectral-type searches, will be obtained from astronomical data analysis institutes.
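
    A minimal sketch of the kind of instrument-signature removal described above, assuming simple dark and flat-field calibration frames are available (the array names and steps are illustrative, not the FOC pipeline itself):

        import numpy as np

        def remove_instrument_signature(raw, dark, flat):
            """Subtract a dark frame and divide by a normalized flat field.
            All inputs are 2-D arrays of identical shape."""
            flat_norm = flat / np.median(flat)          # normalize flat to ~1
            flat_norm = np.maximum(flat_norm, 1e-6)     # guard against division by zero
            return (raw - dark) / flat_norm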

  2. A Study of Planetary Nebulae using the Faint Object Infrared Camera for the SOFIA Telescope

    Science.gov (United States)

    Davis, Jessica

    2012-01-01

    A planetary nebula is formed following an intermediate-mass (1-8 solar masses) star's evolution off of the main sequence; it undergoes a phase of mass loss whereby the stellar envelope is ejected and the core is converted into a white dwarf. Planetary nebulae often display complex morphologies such as waists or tori, rings, collimated jet-like outflows, and bipolar symmetry, but exactly how these features form is unclear. To study how the distribution of dust in the interstellar medium affects their morphology, we utilize the Faint Object InfraRed CAmera for the SOFIA Telescope (FORCAST) to obtain well-resolved images of four planetary nebulae--NGC 7027, NGC 6543, M2-9, and the Frosty Leo Nebula--at wavelengths where they radiate most of their energy. We retrieve mid-infrared images at wavelengths ranging from 6.3 to 37.1 microns for each of our targets. IDL (Interactive Data Language) is used to perform basic analysis. We select M2-9 to investigate further; analyzing cross sections of the southern lobe reveals a slight limb-brightening effect. Modeling the dust distribution within the lobes reveals that the thickness of the lobe walls is higher than anticipated, or rather that the lobe walls surround not a vacuum but a low-density region of tenuous dust. Further analysis of this and other planetary nebulae is needed before drawing more specific conclusions.

  3. Faint Objects and How to Observe Them

    CERN Document Server

    Cudnik, Brian

    2013-01-01

    Astronomers' Observing Guides provide up-to-date information for amateur astronomers who want to know all about what it is they are observing. This is the basis of the first part of the book. The second part details observing techniques for practical astronomers, working with a range of different instruments. Faint Objects and How to Observe Them is for visual observers who want to "go deep" with their observing. It's a guide to some of the most distant, dim, and rarely observed objects in the sky, with background information on surveys and object lists -- some familiar and some not. Typically, amateur astronomers begin by looking at the brighter objects, and work their way "deeper" as their experience and skills improve. Faint Objects is about the faintest objects we can see with an amateur's telescope -- their physical nature, why they appear so dim, and how to track them down. By definition, these objects are hard to see! But moderate equipment (a decent telescope of at least 10-inch aperture) and the righ...

  4. Fainting

    Science.gov (United States)

    ... huffing") can cause fainting. Low blood sugar. The brain depends on a constant supply of sugar from the blood to work properly and keep a person awake. People who are taking insulin shots or other medications for diabetes can develop ...

  5. Population statistics of faint stellar and non-stellar objects

    Science.gov (United States)

    Vandenbergh, S.

    1979-01-01

    A disc and halo population model is constructed to fit star counts and color data down to V approximately 23 at |b| = 90 deg. This model is used to predict star counts and colors down to V approximately 30. Deviations from these extrapolated relationships provide constraints on the number of faint quasars and black dwarf stars. It is shown that extragalactic globular clusters start contributing significantly to star counts at V approximately 25 and are more numerous than stars for V ≳ 31. Morphological studies of galaxies with z approximately 0.5, which will become possible with the Space Telescope, should provide significant constraints on theoretical models that describe the evolution of clusters of galaxies.

  6. Occluded object imaging via optimal camera selection

    Science.gov (United States)

    Yang, Tao; Zhang, Yanning; Tong, Xiaomin; Ma, Wenguang; Yu, Rui

    2013-12-01

    High-performance occluded object imaging in cluttered scenes is a significant challenge for many computer vision applications. Recently, camera-array synthetic aperture imaging has proved to be an effective way of seeing objects through occlusion. However, the imaging quality of an occluded object is often significantly degraded by the shadows of the foreground occluder. Although some works label the foreground occluder via object segmentation or 3D reconstruction, these methods fail in the case of complicated occluders and severe occlusion. In this paper, we present a novel optimal camera selection algorithm to solve the above problem. The main characteristics of this algorithm include: (1) Instead of synthetic aperture imaging, we formulate the occluded object imaging problem as an optimal camera selection and mosaicking problem. To the best of our knowledge, our proposed method is the first one for occluded object mosaicking. (2) A greedy optimization framework is presented to propagate the visibility information among various depth focus planes. (3) A multiple-label energy minimization formulation is designed in each plane to select the optimal camera. The energy is estimated in the synthetic aperture image volume and integrates the multi-view intensity consistency, the previous visibility property, and camera view smoothness, and is minimized via graph cuts. We compare our method with state-of-the-art synthetic aperture imaging algorithms, and extensive experimental results with qualitative and quantitative analysis demonstrate the effectiveness and superiority of our approach.

  7. Faint Object Detection in Multi-Epoch Observations via Catalog Data Fusion

    Science.gov (United States)

    Budavári, Tamás; Szalay, Alexander S.; Loredo, Thomas J.

    2017-03-01

    Astronomy in the time-domain era faces several new challenges. One of them is the efficient use of observations obtained at multiple epochs. The work presented here addresses faint object detection and describes an incremental strategy for separating real objects from artifacts in ongoing surveys. The idea is to produce low-threshold single-epoch catalogs and to accumulate information across epochs. This is in contrast to more conventional strategies based on co-added or stacked images. We adopt a Bayesian approach, addressing object detection by calculating the marginal likelihoods for hypotheses asserting that there is no object or one object in a small image patch containing at most one cataloged source at each epoch. The object-present hypothesis interprets the sources in a patch at different epochs as arising from a genuine object; the no-object hypothesis interprets candidate sources as spurious, arising from noise peaks. We study the detection probability for constant-flux objects in a Gaussian noise setting, comparing results based on single and stacked exposures to results based on a series of single-epoch catalog summaries. Our procedure amounts to generalized cross-matching: it is the product of a factor accounting for the matching of the estimated fluxes of the candidate sources and a factor accounting for the matching of their estimated directions. We find that probabilistic fusion of multi-epoch catalogs can detect sources with similar sensitivity and selectivity compared to stacking. The probabilistic cross-matching framework underlying our approach plays an important role in maintaining detection sensitivity and points toward generalizations that could accommodate variability and complex object structure.
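
    As a rough illustration of the flux-matching factor described above, the sketch below computes a profile log-likelihood ratio for "one constant-flux object" versus "noise only" from per-epoch catalog fluxes with Gaussian errors; it omits the directional factor and the flux prior used in the paper's full marginal-likelihood treatment, so the numbers are only indicative:

        import numpy as np

        def flux_match_llr(fluxes, sigmas):
            """Log-likelihood ratio of a common constant flux versus pure noise,
            maximized over the unknown flux (inverse-variance weighted mean)."""
            f = np.asarray(fluxes, dtype=float)
            w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
            f_hat = np.sum(w * f) / np.sum(w)           # best-fit constant flux
            return 0.5 * f_hat ** 2 * np.sum(w)

        # Three epochs, each individually only ~2 sigma, combine to ~3.6 sigma
        print(flux_match_llr([2.1, 1.8, 2.4], [1.0, 1.0, 1.0]))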

  8. Improving the ability of image sensors to detect faint stars and moving objects using image deconvolution techniques.

    Science.gov (United States)

    Fors, Octavi; Núñez, Jorge; Otazu, Xavier; Prades, Albert; Cardinal, Robert D

    2010-01-01

    In this paper we show how image deconvolution techniques can increase the ability of image sensors, such as CCD imagers, to detect faint stars or faint orbital objects (small satellites and space debris). In the case of faint stars, we show that this benefit is equivalent to doubling the quantum efficiency of the image sensor used, or to increasing the effective telescope aperture by more than 30%, without decreasing the astrometric precision or introducing artificial bias. In the case of orbital objects, the deconvolution technique can double the signal-to-noise ratio of the image, which helps to discover and control dangerous objects such as space debris or lost satellites. The benefits obtained using CCD detectors can be extrapolated to any kind of image sensor.
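
    The abstract does not name the deconvolution algorithm; Richardson-Lucy is one commonly used choice, and a hedged sketch on a synthetic star field with a Gaussian PSF (all values illustrative) might look like this:

        import numpy as np
        from scipy.signal import convolve2d
        from skimage.restoration import richardson_lucy

        rng = np.random.default_rng(0)
        truth = np.zeros((64, 64))
        truth[20, 30], truth[45, 12] = 50.0, 30.0       # two faint point sources
        y, x = np.mgrid[-7:8, -7:8]
        psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
        psf /= psf.sum()
        blurred = convolve2d(truth, psf, mode="same") + rng.normal(0, 0.5, truth.shape)

        # Multiplicative RL iterations assume non-negative data, hence the clip
        restored = richardson_lucy(np.clip(blurred, 0, None), psf, 30, clip=False)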

  9. Object recognition through turbulence with a modified plenoptic camera

    Science.gov (United States)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher

    2015-03-01

    Atmospheric turbulence adds accumulated distortion to images obtained by cameras and surveillance systems. When the turbulence grows stronger or when the object is further away from the observer, increasing the recording device resolution helps little to improve the quality of the image. Many sophisticated methods to correct the distorted images have been invented, such as using a known feature on or near the target object to perform a deconvolution process, or use of adaptive optics. However, most of the methods depend heavily on the object's location, and optical ray propagation through the turbulence is not directly considered. Alternatively, selecting a lucky image over many frames provides a feasible solution, but at the cost of time. In our work, we propose an innovative approach to improving image quality through turbulence by making use of a modified plenoptic camera. This type of camera adds a micro-lens array to a traditional high-resolution camera to form a semi-camera array that records duplicate copies of the object as well as "superimposed" turbulence at slightly different angles. By performing several steps of image reconstruction, turbulence effects will be suppressed to reveal more details of the object independently (without finding references near the object). Meanwhile, the redundant information obtained by the plenoptic camera raises the possibility of performing lucky image algorithmic analysis with fewer frames, which is more efficient. In our work, the details of our modified plenoptic cameras and image processing algorithms will be introduced. The proposed method can be applied to coherently illuminated objects as well as incoherently illuminated objects. Our results show that the turbulence effect can be effectively suppressed by the plenoptic camera in the hardware layer and a reconstructed "lucky image" can help the viewer identify the object even when a "lucky image" by ordinary cameras is not achievable.
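
    The reconstruction pipeline itself is not spelled out in the abstract; as a simple stand-in for the "lucky image" selection step, one common heuristic scores frames by the variance of the Laplacian and keeps the sharpest (frames are assumed to be BGR arrays, and the function names are illustrative):

        import cv2
        import numpy as np

        def sharpness(gray):
            """Variance of the Laplacian: a simple focus/sharpness score."""
            return cv2.Laplacian(gray, cv2.CV_64F).var()

        def pick_lucky_frame(frames):
            """Return the frame least degraded by turbulence, judged by sharpness."""
            scores = [sharpness(cv2.cvtColor(f, cv2.COLOR_BGR2GRAY)) for f in frames]
            return frames[int(np.argmax(scores))]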

  10. An equalised global graphical model-based approach for multi-camera object tracking

    OpenAIRE

    Chen, Weihua; Cao, Lijun; Chen, Xiaotang; Huang, Kaiqi

    2015-01-01

    Non-overlapping multi-camera visual object tracking typically consists of two steps: single-camera object tracking and inter-camera object tracking. Most tracking methods focus on single-camera object tracking, which happens in the same scene, while for real surveillance scenes, inter-camera object tracking is needed and single-camera tracking methods cannot work effectively. In this paper, we try to improve the overall multi-camera object tracking performance by a global graph model with...

  11. Ball lightning observation: an objective video-camera analysis report

    OpenAIRE

    Sello, Stefano; Viviani, Paolo; Paganini, Enrico

    2011-01-01

    In this paper we describe a video-camera recording of a (probable) ball lightning event and both the related image and signal analyses for its photometric and dynamical characterization. The results strongly support the BL nature of the recorded luminous ball object and allow the researchers to have an objective and unique video document of a possible BL event for further analyses. Some general evaluations of the obtained results considering the proposed ball lightning models conclude the paper.

  12. Faint Object Spectrograph Spectra of the UV Emission Lines in NGC 5548: Detection of Strong Narrow Components

    Science.gov (United States)

    Crenshaw, D. Michael; Boggess, Albert; Wu, Chi-Chao

    1993-01-01

    Ultraviolet spectra of the Seyfert 1 galaxy NGC 5548 were obtained with the Faint Object Spectrograph (FOS) on the Hubble Space Telescope on 1992 July 5, when the UV continuum and broad emission lines were at their lowest ever observed level. The high resolution of the spectra, relative to previous UV observations, and the low state of NGC 5548 allow the detection and accurate measurement of strong narrow components of the emission lines of Ly alpha, C IV 1549, and C III 1909. Isolation of the UV narrow components enables a detailed comparison of narrow-line region (NLR) properties in Seyfert 1 and 2 galaxies, and removal of their contribution is important for studies of the broad-line region (BLR). Relative to the other narrow lines, C IV 1549 is much stronger in NGC 5548 than in Seyfert 2 galaxies, and Mg II 2798 is very weak or absent.

  13. Determination of feature generation methods for PTZ camera object tracking

    Science.gov (United States)

    Doyle, Daniel D.; Black, Jonathan T.

    2012-06-01

    Object detection and tracking using computer vision (CV) techniques have been widely applied to sensor fusion applications. Many papers continue to be written that speed up performance and increase learning of artificially intelligent systems through improved algorithms, workload distribution, and information fusion. Military application of real-time tracking systems is becoming more and more complex with an ever-increasing need for fusion and CV techniques to actively track and control dynamic systems. Examples include the use of metrology systems for tracking and measuring micro air vehicles (MAVs) and autonomous navigation systems for controlling MAVs. This paper seeks to contribute to the determination of select tracking algorithms that best track a moving object using a pan/tilt/zoom (PTZ) camera, applicable to both of the examples presented. The select feature generation algorithms compared in this paper are the trained Scale-Invariant Feature Transform (SIFT) and Speeded Up Robust Features (SURF), the Mixture of Gaussians (MoG) background subtraction method, the Lucas-Kanade optical flow method (2000) and the Farneback optical flow method (2003). The matching algorithm used in this paper for the trained feature generation algorithms is the Fast Library for Approximate Nearest Neighbors (FLANN). The BSD-licensed OpenCV library is used extensively to demonstrate the viability of each algorithm and its performance. Initial testing is performed on a sequence of images using a stationary camera. Further testing is performed on a sequence of images such that the PTZ camera is moving in order to capture the moving object. Comparisons are made based upon accuracy, speed and memory.
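
    Two of the compared feature-generation methods, MoG background subtraction and Farneback optical flow, are available directly in OpenCV; a minimal sketch (the file name and parameter values are illustrative) is:

        import cv2

        cap = cv2.VideoCapture("sequence.avi")
        mog = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16)
        prev_gray = None

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            fg_mask = mog.apply(frame)                  # MoG foreground mask
            if prev_gray is not None:
                flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                                    0.5, 3, 15, 3, 5, 1.2, 0)
            prev_gray = gray

        cap.release()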

  14. Motion object tracking algorithm using multi-cameras

    Science.gov (United States)

    Kong, Xiaofang; Chen, Qian; Gu, Guohua

    2015-09-01

    Motion object tracking is one of the most important research directions in computer vision. Challenges in designing a robust tracking method are usually caused by partial or complete occlusions of targets. However, a motion object tracking algorithm based on multiple cameras, using the homography relation between three views, can deal with this issue effectively, since the information combined from multiple cameras in different views makes the target more complete and accurate. In this paper, a robust visual tracking algorithm based on the homography relations of three cameras in different views is presented to cope with occlusion. First, as the main contribution of this paper, a motion object tracking algorithm based on low-rank matrix representation under the particle filter framework is applied to track the same target in the public region in each of the different views. The target model and the occlusion model are established, and an alternating optimization algorithm is used to solve the proposed optimization formulation while tracking. Then, we take the plane in which the target has the largest occlusion weight as the principal plane and calculate the homography to find the mapping relations between the different views. Finally, the images of the other two views are projected into the main plane. By making use of the homography relation between the different views, the information of the occluded target can be obtained completely. The proposed algorithm has been examined on several challenging image sequences, and experiments show that it avoids tracking failure, especially under occlusion. In addition, the proposed algorithm improves tracking accuracy compared with other state-of-the-art algorithms.
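
    As a sketch of the final projection step, assuming point correspondences between an auxiliary view and the principal view are already matched (all coordinates and sizes below are illustrative), OpenCV's homography estimation and perspective warp can be used:

        import cv2
        import numpy as np

        pts_aux = np.array([[10, 20], [200, 30], [220, 180], [15, 190]], dtype=np.float32)
        pts_main = np.array([[40, 60], [230, 55], [240, 210], [50, 220]], dtype=np.float32)

        H, _ = cv2.findHomography(pts_aux, pts_main, cv2.RANSAC, 3.0)

        aux_img = np.zeros((480, 640, 3), dtype=np.uint8)    # stand-in for an auxiliary frame
        warped = cv2.warpPerspective(aux_img, H, (640, 480)) # project into the principal plane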

  15. Object search using mobile platform and RGBD camera

    OpenAIRE

    Bizjak, Jani

    2012-01-01

    Intelligent robots are nowadays making their way from laboratories to domestic homes. In this thesis, a vacuum cleaner robot, the iRobot Roomba, is upgraded so it can search for objects in a room. The robot's own sensors alone are not good enough, which is why an RGBD camera (Kinect) is added on top of the robot. For the software part, the robot meta-operating system ROS is used. ROS connects the robot's hardware and software together and is easily upgradable with custom packages for different tasks. This thesis is di...

  16. Detecting Flying Objects Using a Single Moving Camera.

    Science.gov (United States)

    Rozantsev, Artem; Lepetit, Vincent; Fua, Pascal

    2017-05-01

    We propose an approach for detecting flying objects such as Unmanned Aerial Vehicles (UAVs) and aircraft when they occupy a small portion of the field of view, possibly moving against complex backgrounds, and are filmed by a camera that itself moves. We argue that solving such a difficult problem requires combining both appearance and motion cues. To this end we propose a regression-based approach for object-centric motion stabilization of image patches that allows us to achieve effective classification on spatio-temporal image cubes and outperform state-of-the-art techniques. As this problem has not yet been extensively studied, no test datasets are publicly available. We therefore built our own, both for UAVs and aircraft, and will make them publicly available so they can be used to benchmark future flying object detection and collision avoidance algorithms.

  17. Evaluation of mobile phone camera benchmarking using objective camera speed and image quality metrics

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2014-11-01

    When a mobile phone camera is tested and benchmarked, the significance of image quality metrics is widely acknowledged. There are also existing methods to evaluate camera speed. However, the speed or rapidity metrics of the mobile phone's camera system have not been used together with the quality metrics, even though camera speed has become an increasingly important camera performance feature. There are several tasks in this work. First, the most important image quality and speed-related metrics of a mobile phone's camera system are collected from standards and papers, and novel speed metrics are identified. Second, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are made against the application programming interfaces of different operating systems. Finally, the results are evaluated and conclusions are drawn. The paper defines a solution for combining different image quality and speed metrics into a single benchmarking score. A proposal for the combined benchmarking metric is evaluated using measurements of 25 mobile phone cameras on the market. The paper is a continuation of a previous benchmarking work, expanded with visual noise measurement and updates for the latest mobile phone versions.
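
    The paper's exact combination rule is not reproduced here; a generic way to fold normalised quality and speed metrics into one benchmarking score is a weighted average, sketched below with made-up metric names, values and weights:

        import numpy as np

        def benchmark_score(metrics, weights):
            """Weighted average of metrics already scaled to [0, 1], 1 being best
            (latency-type metrics must be inverted before scaling)."""
            keys = list(metrics)
            w = np.array([weights[k] for k in keys], dtype=float)
            v = np.array([metrics[k] for k in keys], dtype=float)
            return float(np.dot(w, v) / w.sum())

        score = benchmark_score(
            {"sharpness": 0.82, "visual_noise": 0.74, "shot_to_shot": 0.60, "autofocus": 0.71},
            {"sharpness": 2.0, "visual_noise": 2.0, "shot_to_shot": 1.0, "autofocus": 1.0},
        )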

  18. Rapid objective measurement of gamma camera resolution using statistical moments.

    Science.gov (United States)

    Hander, T A; Lancaster, J L; Kopp, D T; Lasher, J C; Blumhardt, R; Fox, P T

    1997-02-01

    An easy and rapid method for the measurement of the intrinsic spatial resolution of a gamma camera was developed. The measurement is based on the first and second statistical moments of regions of interest (ROIs) applied to bar phantom images. This leads to an estimate of the modulation transfer function (MTF) and the full-width-at-half-maximum (FWHM) of a line spread function (LSF). Bar phantom images were acquired using four large field-of-view (LFOV) gamma cameras (Scintronix, Picker, Searle, Siemens). The following factors important for routine measurements of gamma camera resolution with this method were tested: ROI placement and shape, phantom orientation, spatial sampling, and procedural consistency. A 0.2% coefficient of variation (CV) between repeat measurements of MTF was observed for a circular ROI. CVs of less than 2% were observed for measured MTF values for bar orientations ranging from -10 degrees to +10 degrees with respect to the x and y axes of the camera acquisition matrix. A 256 x 256 matrix (1.6 mm pixel spacing) was judged sufficient for routine measurements, giving an estimate of the FWHM to within 0.1 mm of manufacturer-specified values (3% difference). Under simulated clinical conditions, the variation in measurements attributable to procedural effects yielded a CV of less than 2% in newer generation cameras. The moments method for determining MTF correlated well with a peak-valley method, with an average difference of 0.03 across the range of spatial frequencies tested (0.11-0.17 line pairs/mm, corresponding to 4.5-3.0 mm bars). When compared with the NEMA method for measuring intrinsic spatial resolution, the moments method was found to be within 4% of the expected FWHM.
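
    A simplified version of the moments idea, assuming a 1-D line spread function has been extracted from a bar-phantom ROI and is approximately Gaussian (the paper works with ROI moments of the bar pattern directly, so this is only a sketch):

        import numpy as np

        def lsf_moments(profile, pixel_mm):
            """FWHM and a Gaussian-model MTF from the moments of a line spread function."""
            x = np.arange(len(profile)) * pixel_mm
            p = profile / profile.sum()
            mean = np.sum(x * p)                             # first moment
            sigma = np.sqrt(np.sum((x - mean) ** 2 * p))     # sqrt of second central moment
            fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma  # ~2.355 sigma for a Gaussian LSF
            mtf = lambda f: np.exp(-2.0 * (np.pi * f * sigma) ** 2)
            return fwhm, mtf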

  19. Searching for z~=6 Objects with the Hubble Space Telescope Advanced Camera for Surveys: Preliminary Analysis of a Deep Parallel Field

    Science.gov (United States)

    Yan, Haojing; Windhorst, Rogier A.; Cohen, Seth H.

    2003-03-01

    Recent results suggest that z~=6 marks the end of the reionization era. A large sample of objects at z~=6, therefore, will be of enormous importance, as it will enable us to observationally determine the exact epoch of the reionization and the sources that are responsible for it. With the Hubble Space Telescope Advanced Camera for Surveys (ACS) coming on-line, we now have a unique opportunity to discover a significant number of objects at z~=6. The pure parallel mode implemented for the Wide-Field Camera (WFC) has greatly enhanced this ability. We present our preliminary analysis of a deep ACS/WFC parallel field at |b|=74.4d. We find 30 plausible z~=6 candidates, all of which have signal-to-noise ratios greater than 7 in the F850LP band. The major source of contamination could be faint cool Galactic dwarfs, and we estimated that they would contribute at most four objects to our candidate list. We derived the cumulative number density of galaxies at 6.0contamination rate, it could possibly imply that the faint-end slope of the z~=6 luminosity function is steeper than α=-1.6. At the very least, our result suggests that galaxies with L

  20. Object Detection and Tracking-Based Camera Calibration for Normalized Human Height Estimation

    Directory of Open Access Journals (Sweden)

    Jaehoon Jung

    2016-01-01

    This paper presents a normalized human height estimation algorithm using an uncalibrated camera. To estimate the normalized human height, the proposed algorithm detects a moving object and performs tracking-based automatic camera calibration. The proposed method consists of three steps: (i) moving human detection and tracking, (ii) automatic camera calibration, and (iii) human height estimation and error correction. The proposed method automatically calibrates the camera by detecting moving humans and estimates the human height using error correction. The proposed method can be applied to object-based video surveillance systems and digital forensics.

  1. Multi Camera Multi Object Tracking using Block Search over Epipolar Geometry

    Directory of Open Access Journals (Sweden)

    Saman Sargolzaei

    2000-01-01

    We present a strategy for multi-object tracking in a multi-camera environment for surveillance and security applications, where tracking a multitude of subjects is of utmost importance in a crowded scene. Our technique assumes a partially overlapped multi-camera setup where cameras share a common view from different angles to assess the positions and activities of subjects under suspicion. To establish spatial correspondence between camera views we employ an epipolar geometry technique. We propose an overlapped block search method to find the pattern of interest (the target) in new frames. A color pattern update scheme has been considered to further optimize the efficiency of the object tracking when the object pattern changes due to object motion in the fields of view of the cameras. Evaluation of our approach is presented with results on the PETS2007 dataset.
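
    The epipolar constraint that restricts the block search can be expressed with OpenCV once a fundamental matrix between two views is known; in this hedged sketch F is a placeholder (it would normally come from cv2.findFundamentalMat) and the sampling step is arbitrary:

        import cv2
        import numpy as np

        F = np.eye(3)                                        # placeholder fundamental matrix
        pt_cam1 = np.array([[[320.0, 240.0]]], dtype=np.float32)  # target position in view 1

        # Epipolar line a*x + b*y + c = 0 in view 2; the block search runs along it
        a, b, c = cv2.computeCorrespondEpilines(pt_cam1, 1, F)[0, 0]
        ys = np.arange(0, 480, 8, dtype=float)
        xs = -(b * ys + c) / a                               # solve for x on sampled rows (a != 0)
        search_centres = list(zip(xs, ys))                   # candidate block centres in view 2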

  2. Hubble Space Telescope faint object spectrograph Quasar Absorption System Snapshot Survey (AbSnap). 1: Astrometric optical positions and finding charts of 269 bright QSOs

    Science.gov (United States)

    Bowen, David V.; Osmer, Samantha J.; Blades, J. Chris; Tytler, David; Cottrell, Lance; Fan, Xiao-Ming; Lanzetta, Kenneth M.

    1994-01-01

    We present finding charts and optical positions accurate to less than 1 arcsec for 269 bright (V ≤ 18.5) Quasi-Stellar Objects (QSOs). These objects were selected as candidates for the Hubble Space Telescope (HST) Quasar Absorption System Snapshot Survey (AbSnap), a program designed to use the Faint Object Spectrograph (FOS) to obtain short-exposure ultraviolet (UV) spectra of bright QSOs. Many quasars were included because of their proximity to bright, low-redshift galaxies, and the positions of these QSOs are measured accurately for the first time. Data were obtained using the digitized sky survey produced by the Space Telescope Science Institute's Guide Star Selection System Astrometric Support Program.

  3. Object Occlusion Detection Using Automatic Camera Calibration for a Wide-Area Video Surveillance System

    Directory of Open Access Journals (Sweden)

    Jaehoon Jung

    2016-06-01

    This paper presents an object occlusion detection algorithm using object depth information that is estimated by automatic camera calibration. The object occlusion problem is a major factor degrading the performance of object tracking and recognition. To detect an object occlusion, the proposed algorithm consists of three steps: (i) automatic camera calibration using both moving objects and a background structure; (ii) object depth estimation; and (iii) detection of occluded regions. The proposed algorithm estimates the depth of the object without extra sensors, using only a generic red, green and blue (RGB) camera. As a result, the proposed algorithm can be applied to improve the performance of object tracking and object recognition algorithms for video surveillance systems.

  4. A framework for multi-object tracking over distributed wireless camera networks

    Science.gov (United States)

    Gau, Victor; Hwang, Jenq-Neng

    2010-07-01

    In this paper, we propose a unified framework targeting two important issues in a distributed wireless camera network, i.e., object tracking and network communication, to achieve reliable multi-object tracking over distributed wireless camera networks. In the object tracking part, we propose a fully automated approach for tracking multiple objects across multiple cameras with overlapping and non-overlapping fields of view, without initial training. To effectively exchange the tracking information among the distributed cameras, we propose an idle-probability-based broadcasting method, iPro, which adaptively adjusts the broadcast probability to improve the broadcast effectiveness in a dense, saturated camera network. Experimental results for multi-object tracking demonstrate the promising performance of our approach on real video sequences for cameras with overlapping and non-overlapping views. The modeling and ns-2 simulation results show that iPro almost approaches the theoretical performance upper bound if cameras are within each other's transmission range. In more general scenarios, e.g., in case of hidden node problems, the simulation results show that iPro significantly outperforms standard IEEE 802.11, especially when the number of competing nodes increases.

  5. The Population of Optically Faint GEO Debris

    Science.gov (United States)

    Seitzer, Patrick; Barker, Ed; Buckalew, Brent; Burkhardt, Andrew; Cowardin, Heather; Frith, James; Gomez, Juan; Kaleida, Catherine; Lederer, Susan M.; Lee, Chris H.

    2016-01-01

    The 6.5-m Magellan telescope 'Walter Baade' at the Las Campanas Observatory in Chile has been used for spot surveys of the GEO orbital regime to study the population of optically faint GEO debris. The goal is to estimate the size of the population of GEO debris at sizes much smaller than can be studied with 1-meter class telescopes. Despite the small field of view of the Magellan instrument (diameter 0.5 degree), a significant population of objects fainter than R = 19th magnitude has been found with angular rates consistent with circular orbits at GEO. We compare the size of this population with the numbers of GEO objects found at brighter magnitudes by smaller telescopes. The observed detections have a wide range of characteristics, starting with those appearing as short uniform streaks, but there is also a substantial number of detections with variations in brightness ('flashers') during the 5-second exposure. The duration of each of these flashes can be extremely brief: sometimes less than half a second. This is characteristic of a rapidly tumbling object with a quite variable projected size times albedo. If the albedo is of the order of 0.2, then the largest projected size of these objects is around 10 cm. The data in this paper were collected over the last several years using Magellan's IMACS camera in f/2 mode. The analysis shows the brightness bins for the observed GEO population as well as the periodicity of the flashers. All detected objects are checked for correlation with the catalog; the focus of the paper is on the uncorrelated, optically faint objects. The goal of this project is to better characterize the faint debris population in GEO, which access to a 6.5-m optical telescope at a superb site can provide.

  6. The contribution of dissolving star clusters to the population of ultra faint objects in the outer halo of the Milky Way

    Science.gov (United States)

    Contenta, Filippo; Gieles, Mark; Balbinot, Eduardo; Collins, Michelle L. M.

    2017-04-01

    In the last decade, several ultra faint objects (UFOs, MV ≳ -3.5) have been discovered in the outer halo of the Milky Way. For some of these objects, it is not clear whether they are star clusters or (ultra faint) dwarf galaxies. In this work, we quantify the contribution of star clusters to the population of UFOs. We extrapolated the mass and Galactocentric radius distribution of the globular clusters using a population model, finding that the Milky Way contains about 3.3^{+7.3}_{-1.6} star clusters with MV ≳ -3.5 and Galactocentric radius ≥20 kpc. To understand whether dissolving clusters can appear as UFOs, we run a suite of direct N-body models, varying the orbit, the Galactic potential, the binary fraction and the black hole (BH) natal kick velocities. In the analyses, we consider observational biases such as luminosity limit, field stars and line-of-sight projection. We find that star clusters contribute to both the compact and the extended population of UFOs: clusters without BHs appear compact with radii ˜5 pc, while clusters that retain their BHs after formation have radii ≳ 20 pc. The properties of the extended clusters are remarkably similar to those of dwarf galaxies: high-inferred mass-to-light ratios due to binaries, binary properties mildly affected by dynamical evolution, no observable mass segregation and flattened stellar mass function. We conclude that the slope of the stellar mass function as a function of Galactocentric radius and the presence/absence of cold streams can discriminate between dark matter-free and dark matter-dominated UFOs.

  7. Modified 3D time-of-flight camera for object separation in organic farming

    Science.gov (United States)

    Knoll, Florian J.; Holtorf, Tim; Hussmann, Stephan

    2017-06-01

    An important step for the classification of plants in organic farming is the separation of the objects. In our approach a 3D camera will be used for this task. One problem is that most of the available 3D sensors are not suitable due to their low resolution, low speed and high cost. The Kinect II is an affordable alternative, but it is designed for a different workspace. In this paper it is shown why a 3D sensor is required for the separation of objects in organic farming and how the modification of a Kinect II 3D camera increases the range resolution to solve the given problem.

  8. Real-time multiple objects tracking on Raspberry-Pi-based smart embedded camera

    Science.gov (United States)

    Dziri, Aziz; Duranton, Marc; Chapuis, Roland

    2016-07-01

    Multiple-object tracking constitutes a major step in several computer vision applications, such as surveillance, advanced driver assistance systems, and automatic traffic monitoring. Because of the number of cameras used to cover a large area, these applications are constrained by the cost of each node, the power consumption, the robustness of the tracking, the processing time, and the ease of deployment of the system. To meet these challenges, the use of low-power and low-cost embedded vision platforms to achieve reliable tracking becomes essential in networks of cameras. We propose a tracking pipeline that is designed for fixed smart cameras and which can handle occlusions between objects. We show that the proposed pipeline reaches real-time processing on a low-cost embedded smart camera composed of a Raspberry-Pi board and a RaspiCam camera. The tracking quality and the processing speed obtained with the proposed pipeline are evaluated on publicly available datasets and compared to the state-of-the-art methods.

  9. Expanded opportunities of THz passive camera for the detection of concealed objects

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Kuchik, Igor E.

    2013-10-01

    Among security problems, the detection of an object implanted into either the human body or an animal body is an urgent problem. At the present time the main tool for the detection of such an object is X-ray imaging only. However, X-rays are ionizing radiation and therefore cannot be used often. Another way of solving the problem is the use of passive THz imaging. In our opinion, the use of a passive THz camera may help to detect an object implanted into the human body under certain conditions. The physical reason for this possibility is the temperature trace on the human skin resulting from the difference in temperature between the object and parts of the human body. Modern passive THz cameras do not have enough temperature resolution to see this difference. That is why we use computer processing to enhance the passive THz camera resolution for this application. After computer processing of images captured by the passive THz camera TS4, developed by ThruVision Systems Ltd., we can see a pronounced temperature trace on the human body skin from water drunk, or other food eaten, by a person. Nevertheless, there are many difficulties on the way to a full solution of this problem. We also illustrate an improvement in the quality of images captured by commercially available passive THz cameras using computer processing. In some cases, one can fully suppress noise in the image without loss of quality. Using computer processing of the THz image of objects concealed on the human body, one can improve it many times. Consequently, the instrumental resolution of such a device may be increased without any additional engineering effort.

  10. The Example of Using the Xiaomi Cameras in Inventory of Monumental Objects - First Results

    Science.gov (United States)

    Markiewicz, J. S.; Łapiński, S.; Bienkowski, R.; Kaliszewska, A.

    2017-11-01

    At present, digital documentation recorded in the form of raster or vector files is the obligatory way of inventorying historical objects. Today, photogrammetry is becoming more and more popular and is becoming the standard of documentation in many projects involving the recording of all possible spatial data on landscape, architecture, or even single objects. Low-cost sensors allow for the creation of reliable and accurate three-dimensional models of investigated objects. This paper presents the results of a comparison between the outcomes obtained when using three sources of images: low-cost Xiaomi cameras, a full-frame camera (Canon 5D Mark II) and a middle-frame camera (Hasselblad-Hd4). In order to check how the results obtained from these sensors differ, the following parameters were analysed: the accuracy of the orientation of the ground-level photos on the control and check points, the distribution of the estimated distortion in the self-calibration process, the flatness of the walls, and the discrepancies between point clouds from the low-cost cameras and reference data. The results presented below are the result of co-operation of researchers from three institutions: the Systems Research Institute PAS, the Department of Geodesy and Cartography at the Warsaw University of Technology, and the National Museum in Warsaw.

  11. THE EXAMPLE OF USING THE XIAOMI CAMERAS IN INVENTORY OF MONUMENTAL OBJECTS - FIRST RESULTS

    Directory of Open Access Journals (Sweden)

    J. S. Markiewicz

    2017-11-01

    At present, digital documentation recorded in the form of raster or vector files is the obligatory way of inventorying historical objects. Today, photogrammetry is becoming more and more popular and is becoming the standard of documentation in many projects involving the recording of all possible spatial data on landscape, architecture, or even single objects. Low-cost sensors allow for the creation of reliable and accurate three-dimensional models of investigated objects. This paper presents the results of a comparison between the outcomes obtained when using three sources of images: low-cost Xiaomi cameras, a full-frame camera (Canon 5D Mark II) and a middle-frame camera (Hasselblad-Hd4). In order to check how the results obtained from these sensors differ, the following parameters were analysed: the accuracy of the orientation of the ground-level photos on the control and check points, the distribution of the estimated distortion in the self-calibration process, the flatness of the walls, and the discrepancies between point clouds from the low-cost cameras and reference data. The results presented below are the result of co-operation of researchers from three institutions: the Systems Research Institute PAS, the Department of Geodesy and Cartography at the Warsaw University of Technology, and the National Museum in Warsaw.

  12. Detecting Target Objects by Natural Language Instructions Using an RGB-D Camera

    Directory of Open Access Journals (Sweden)

    Jiatong Bao

    2016-12-01

    Controlling robots by natural language (NL) is increasingly attracting attention for its versatility, convenience, and lack of need for extensive user training. Grounding is a crucial challenge of this problem: enabling robots to understand NL instructions from humans. This paper mainly explores the object grounding problem and concretely studies how to detect target objects from NL instructions using an RGB-D camera in robotic manipulation applications. In particular, a simple yet robust vision algorithm is applied to segment objects of interest. With the metric information of all segmented objects, the object attributes and relations between objects are further extracted. The NL instructions that incorporate multiple cues for object specifications are parsed into domain-specific annotations. The annotations from NL and the extracted information from the RGB-D camera are matched in a computational state estimation framework to search all possible object grounding states. The final grounding is accomplished by selecting the states which have the maximum probabilities. An RGB-D scene dataset associated with different groups of NL instructions, based on different cognition levels of the robot, is collected. Quantitative evaluations on the dataset illustrate the advantages of the proposed method. The experiments on NL-controlled object manipulation and NL-based task programming using a mobile manipulator show its effectiveness and practicability in robotic applications.

  13. Implementation of Smooth Continuous Camera Trajectories for Viewing PDB and VRML Objects

    Science.gov (United States)

    Pahor, Milan

    2005-05-01

    A parametrically defined camera trajectory enabling the accurate and systematic scanning of the entire surface of virtual (Protein Data Bank (PDB)) proteins is developed and implemented. The smooth continuous path guarantees that each local region of the protein is inspected from a variety of directions in a controlled and uniform manner. Manipulation of several parameters governs the density, character and duration of the scan. Applications to the analysis of other real and virtual objects are also considered.
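
    The exact parametrization used is not given in the abstract; one smooth closed-form path with tunable density is a pole-to-pole spherical spiral around the object, sketched below (the radius, number of turns and point count are illustrative parameters):

        import numpy as np

        def spherical_spiral(radius, turns, n_points):
            """Camera positions spiralling over a sphere centred on the object,
            so every surface region is viewed from a range of directions."""
            t = np.linspace(0.0, 1.0, n_points)
            theta = np.arccos(1.0 - 2.0 * t)             # polar angle, uniform in cos(theta)
            phi = 2.0 * np.pi * turns * t                # azimuth winds 'turns' times
            x = radius * np.sin(theta) * np.cos(phi)
            y = radius * np.sin(theta) * np.sin(phi)
            z = radius * np.cos(theta)
            return np.column_stack([x, y, z])            # one camera position per row

        path = spherical_spiral(radius=50.0, turns=12, n_points=600)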

  14. Building three-dimensional object models using a mobile platform and an RGBD camera

    OpenAIRE

    Tratnik, Damjan

    2013-01-01

    Nowadays we can observe 3D models in computer games, computer enriched movies, augmented reality and animations, as well as in the designing of new products, simulations, architecture, medicine etc. Due to the arrival of new low-cost depth cameras (e.g. Microsoft Kinect and Asus Xtion Pro), constructing 3D models of objects has become much more accessible and widespread. In this diploma thesis, we have prepared a short review of several distinct techniques of capturing shapes of objects and c...

  15. Evaluation of Moving Object Detection Based on Various Input Noise Using Fixed Camera

    Science.gov (United States)

    Kiaee, N.; Hashemizadeh, E.; Zarrinpanjeh, N.

    2017-09-01

    Detecting and tracking objects in video has been a research area of interest in the field of image processing and computer vision. This paper evaluates the performance of a novel object detection algorithm in video sequences; this evaluation helps us to understand the advantages of the method being used. The proposed framework compares the correct and incorrect detection percentages of the algorithm. The method was evaluated with data collected in the field of urban transport, including cars and pedestrians, in a fixed-camera situation. The results show that the accuracy of the algorithm decreases as image resolution is reduced.

  16. Traffic intensity monitoring using multiple object detection with traffic surveillance cameras

    Science.gov (United States)

    Hamdan, H. G. Muhammad; Khalifah, O. O.

    2017-11-01

    Object detection and tracking is a field of research that has many applications in the current generation, with an increasing number of cameras on the streets and lower costs for the Internet of Things (IoT). In this paper, a traffic intensity monitoring system based on the macroscopic urban traffic model is proposed, using computer vision as its source. The input of this program is extracted from a traffic surveillance camera, on which another program running a neural network classifier that can identify and differentiate vehicle types is implemented. The neural network toolbox is trained with positive and negative input to increase accuracy. The accuracy of the program is compared to other related work, and the trend of traffic intensity on a road is also calculated. Lastly, the limitations and future work are discussed.

  17. The Beagle 2 Stereo Camera System: Scientific Objectives and Design Characteristics

    Science.gov (United States)

    Griffiths, A.; Coates, A.; Josset, J.; Paar, G.; Sims, M.

    2003-04-01

    The Stereo Camera System (SCS) will provide wide-angle (48 degree) multi-spectral stereo imaging of the Beagle 2 landing site in Isidis Planitia with an angular resolution of 0.75 milliradians. Based on the SpaceX Modular Micro-Imager, the SCS is composed of twin cameras (with 1024 by 1024 pixel frame transfer CCD) and twin filter wheel units (with a combined total of 24 filters). The primary mission objective is to construct a digital elevation model of the area in reach of the lander’s robot arm. The SCS specifications and following baseline studies are described: Panoramic RGB colour imaging of the landing site and panoramic multi-spectral imaging at 12 distinct wavelengths to study the mineralogy of landing site. Solar observations to measure water vapour absorption and the atmospheric dust optical density. Also envisaged are multi-spectral observations of Phobos & Deimos (observations of the moons relative to background stars will be used to determine the lander’s location and orientation relative to the Martian surface), monitoring of the landing site to detect temporal changes, observation of the actions and effects of the other PAW experiments (including rock texture studies with a close-up-lens) and collaborative observations with the Mars Express orbiter instrument teams. Due to be launched in May of this year, the total system mass is 360 g, the required volume envelope is 747 cm^3 and the average power consumption is 1.8 W. A 10Mbit/s RS422 bus connects each camera to the lander common electronics.

  18. Children's exposure to alcohol marketing within supermarkets: An objective analysis using GPS technology and wearable cameras.

    Science.gov (United States)

    Chambers, T; Pearson, A L; Stanley, J; Smith, M; Barr, M; Ni Mhurchu, C; Signal, L

    2017-07-01

    Exposure to alcohol marketing within alcohol retailers has been associated with higher rates of childhood drinking, brand recognition, and marketing recall. This study aimed to objectively measure children's everyday exposure to alcohol marketing within supermarkets. Children aged 11-13 (n = 167) each wore a wearable camera and GPS device for four consecutive days. Micro-spatial analyses were used to examine exposures within supermarkets. In alcohol-retailing supermarkets (n = 30), children encountered alcohol marketing on 85% of their visits (n = 78). Alcohol marketing was frequently placed near everyday goods (bread and milk) or the entrance/exit. Alcohol sales in supermarkets should be banned in order to protect children from alcohol marketing.

  19. Temperature measurements on fast-rotating objects using a thermographic camera with an optomechanical image derotator

    Science.gov (United States)

    Altmann, Bettina; Pape, Christian; Reithmeier, Eduard

    2017-08-01

    Increasing requirements concerning the quality and lifetime of machine components in industrial and automotive applications require comprehensive investigations of the components in conditions close to the application. Irregularities in heating of mechanical parts reveal regions with increased loading of pressure, draft or friction. In the long run this leads to damage and total failure of the machine. Thermographic measurements of rotating objects, e.g., rolling bearings, brakes, and clutches provide an approach to investigate those defects. However, it is challenging to measure fast-rotating objects accurately. Currently one contact-free approach is performing stroboscopic measurements using an infrared sensor. The data acquisition is triggered so that the image is taken once per revolution. This leads to a huge loss of information on the majority of the movement and to motion blur. The objective of this research is to show the potential of using an optomechanical image derotator together with a thermographic camera. The derotator follows the rotation of the measurement object so that quasi-stationary thermal images during motion can be acquired by the infrared sensor. Unlike conventional derotators which use a glass prism to achieve this effect, the derotator within this work is equipped with a sophisticated reflector assembly. These reflectors are made of aluminum to transfer infrared radiation emitted by the rotating object. Because of the resulting stationary thermal image, the operation can be monitored continuously even for fast-rotating objects. The field of view can also be set to a small off-axis region of interest which then can be investigated with higher resolution or frame rate. To depict the potential of this approach, thermographic measurements on rolling bearings in different operating states are presented.

  20. The Near-Earth Object Camera: A Next-Generation Minor Planet Survey

    Science.gov (United States)

    Mainzer, Amy K.; Wright, Edward L.; Bauer, James; Grav, Tommy; Cutri, Roc M.; Masiero, Joseph; Nugent, Carolyn R.

    2015-11-01

    The Near-Earth Object Camera (NEOCam) is a next-generation asteroid and comet survey designed to discover, characterize, and track large numbers of minor planets using a 50 cm infrared telescope located at the Sun-Earth L1 Lagrange point. Proposed to NASA's Discovery program, NEOCam is designed to carry out a comprehensive inventory of the small bodies in the inner regions of our solar system. It addresses three themes: 1) quantify the potential hazard that near-Earth objects may pose to Earth; 2) study the origins and evolution of our solar system as revealed by its small body populations; and 3) identify the best destinations for future robotic and human exploration. With a dual channel infrared imager that observes at 4-5 and 6-10 micron bands simultaneously through the use of a beamsplitter, NEOCam enables measurements of asteroid diameters and thermal inertia. NEOCam complements existing and planned visible light surveys in terms of orbital element phase space and wavelengths, since albedos can be determined for objects with both visible and infrared flux measurements. NEOCam was awarded technology development funding in 2011 to mature the necessary megapixel infrared detectors.

  1. Minimal camera networks for 3D image based modeling of cultural heritage objects.

    Science.gov (United States)

    Alsadik, Bashar; Gerke, Markus; Vosselman, George; Daham, Afrah; Jasim, Luma

    2014-03-25

    3D modeling of cultural heritage objects like artifacts, statues and buildings is nowadays an important tool for virtual museums, preservation and restoration. In this paper, we introduce a method to automatically design a minimal imaging network for the 3D modeling of cultural heritage objects. This becomes important for reducing the image capture time and processing when documenting large and complex sites. Moreover, such a minimal camera network design is desirable for imaging non-digitally documented artifacts in museums and other archeological sites to avoid disturbing the visitors for a long time and/or moving delicate precious objects to complete the documentation task. The developed method is tested on the Iraqi famous statue "Lamassu". Lamassu is a human-headed winged bull of over 4.25 m in height from the era of Ashurnasirpal II (883-859 BC). Close-range photogrammetry is used for the 3D modeling task where a dense ordered imaging network of 45 high resolution images were captured around Lamassu with an object sample distance of 1 mm. These images constitute a dense network and the aim of our study was to apply our method to reduce the number of images for the 3D modeling and at the same time preserve pre-defined point accuracy. Temporary control points were fixed evenly on the body of Lamassu and measured by using a total station for the external validation and scaling purpose. Two network filtering methods are implemented and three different software packages are used to investigate the efficiency of the image orientation and modeling of the statue in the filtered (reduced) image networks. Internal and external validation results prove that minimal image networks can provide highly accurate records and efficiency in terms of visualization, completeness, processing time (>60% reduction) and the final accuracy of 1 mm.

  2. Minimal Camera Networks for 3D Image Based Modeling of Cultural Heritage Objects

    Science.gov (United States)

    Alsadik, Bashar; Gerke, Markus; Vosselman, George; Daham, Afrah; Jasim, Luma

    2014-01-01

    3D modeling of cultural heritage objects like artifacts, statues and buildings is nowadays an important tool for virtual museums, preservation and restoration. In this paper, we introduce a method to automatically design a minimal imaging network for the 3D modeling of cultural heritage objects. This becomes important for reducing the image capture time and processing when documenting large and complex sites. Moreover, such a minimal camera network design is desirable for imaging non-digitally documented artifacts in museums and other archeological sites to avoid disturbing the visitors for a long time and/or moving delicate precious objects to complete the documentation task. The developed method is tested on the Iraqi famous statue “Lamassu”. Lamassu is a human-headed winged bull of over 4.25 m in height from the era of Ashurnasirpal II (883–859 BC). Close-range photogrammetry is used for the 3D modeling task where a dense ordered imaging network of 45 high resolution images were captured around Lamassu with an object sample distance of 1 mm. These images constitute a dense network and the aim of our study was to apply our method to reduce the number of images for the 3D modeling and at the same time preserve pre-defined point accuracy. Temporary control points were fixed evenly on the body of Lamassu and measured by using a total station for the external validation and scaling purpose. Two network filtering methods are implemented and three different software packages are used to investigate the efficiency of the image orientation and modeling of the statue in the filtered (reduced) image networks. Internal and external validation results prove that minimal image networks can provide highly accurate records and efficiency in terms of visualization, completeness, processing time (>60% reduction) and the final accuracy of 1 mm. PMID:24670718

  3. Automated control of robotic camera tacheometers for measurements of industrial large scale objects

    Science.gov (United States)

    Heimonen, Teuvo; Leinonen, Jukka; Sipola, Jani

    2013-04-01

    The modern robotic tacheometers equipped with digital cameras (also called imaging total stations) and capable of reflectorless measurement offer new possibilities to gather 3D data. In this paper an automated approach for the tacheometer measurements needed in the dimensional control of industrial large-scale objects is proposed. There are two new contributions in the approach: the automated extraction of the vital points (i.e. the points to be measured) and the automated fine aiming of the tacheometer. The proposed approach proceeds through the following steps: First, the coordinates of the vital points are automatically extracted from the computer-aided design (CAD) data. The extracted design coordinates are then used to aim the tacheometer at the designed location of the points, one after another. However, due to the deviations between the designed and the actual location of the points, the aiming needs to be adjusted. An automated, dynamic, image-based look-and-move type servoing architecture is proposed for this task. After successful fine aiming, the actual coordinates of the point in question can be automatically measured by using the measuring functionalities of the tacheometer. The approach was validated experimentally and found to be feasible. On average 97% of the points actually measured in four different shipbuilding measurement cases were indeed proposed as vital points by the automated extraction algorithm. The accuracy of the results obtained with the automatic control method of the tacheometer was comparable to the results obtained with manual control, and the reliability of the image processing step of the method was found to be high in the laboratory experiments.

  4. Globular Clusters for Faint Galaxies

    Science.gov (United States)

    Kohler, Susanna

    2017-07-01

    The origin of ultra-diffuse galaxies (UDGs) has posed a long-standing mystery for astronomers. New observations of several of these faint giants with the Hubble Space Telescope are now lending support to one theory. Faint-Galaxy Mystery: [Figure: Hubble images of Dragonfly 44 (top) and DFX1 (bottom); the right panels show the data with greater contrast and extended objects masked. Van Dokkum et al. 2017.] UDGs — large, extremely faint spheroidal objects — were first discovered in the Virgo galaxy cluster roughly three decades ago. Modern telescope capabilities have resulted in many more discoveries of similar faint galaxies in recent years, suggesting that they are a much more common phenomenon than we originally thought. Despite the many observations, UDGs still pose a number of unanswered questions. Chief among them: what are UDGs? Why are these objects the size of normal galaxies, yet so dim? There are two primary models that explain UDGs: (1) UDGs were originally small galaxies, hence their low luminosity; tidal interactions then puffed them up to the large size we observe today. (2) UDGs are effectively failed galaxies: they formed the same way as normal galaxies of their large size, but something truncated their star formation early, preventing them from gaining the brightness that we would expect for galaxies of their size. Now a team of scientists led by Pieter van Dokkum (Yale University) has made some intriguing observations with Hubble that lend weight to one of these models. Globulars Galore: [Figure: Globulars observed in 16 Coma-cluster UDGs by Hubble; the top right panel shows the galaxy identifications, and the top left panel shows the derived number of globular clusters in each galaxy. Van Dokkum et al. 2017.] Van Dokkum and collaborators imaged two UDGs with Hubble: Dragonfly 44 and DFX1, both located in the Coma galaxy cluster. These faint galaxies are both smooth and elongated, with no obvious irregular features, spiral arms, star-forming regions, or other indications of tidal interactions

  5. Photometry of faint blue stars

    International Nuclear Information System (INIS)

    Kilkenny, D.; Hill, P.W.; Brown, A.

    1977-01-01

    Photometry on the uvby system is given for 61 faint blue stars. The stars are classified by means of the Stromgren indices, using criteria described in a previous paper (Kilkenny and Hill 1975). (author)

  6. Children's everyday exposure to food marketing: an objective analysis using wearable cameras.

    Science.gov (United States)

    Signal, L N; Stanley, J; Smith, M; Barr, M B; Chambers, T J; Zhou, J; Duane, A; Gurrin, C; Smeaton, A F; McKerchar, C; Pearson, A L; Hoek, J; Jenkin, G L S; Ni Mhurchu, C

    2017-10-08

    Over the past three decades the global prevalence of childhood overweight and obesity has increased by 47%. Marketing of energy-dense nutrient-poor foods and beverages contributes to this worldwide increase. Previous research on food marketing to children largely uses self-report, reporting by parents, or third-party observation of children's environments, with the focus mostly on single settings and/or media. This paper reports on innovative research, Kids'Cam, in which children wore cameras to examine the frequency and nature of everyday exposure to food marketing across multiple media and settings. Kids'Cam was a cross-sectional study of 168 children (mean age 12.6 years, SD = 0.5) in Wellington, New Zealand. Each child wore a wearable camera on four consecutive days, capturing images automatically every seven seconds. Images were manually coded as either recommended (core) or not recommended (non-core) to be marketed to children by setting, marketing medium, and product category. Images in convenience stores and supermarkets were excluded as marketing examples were considered too numerous to count. On average, children were exposed to non-core food marketing 27.3 times a day (95% CI 24.8, 30.1) across all settings. This was more than twice their average exposure to core food marketing (12.3 per day, 95% CI 8.7, 17.4). Most non-core exposures occurred at home (33%), in public spaces (30%) and at school (19%). Food packaging was the predominant marketing medium (74% and 64% for core and non-core foods) followed by signs (21% and 28% for core and non-core). Sugary drinks, fast food, confectionery and snack foods were the most commonly encountered non-core foods marketed. Rates were calculated using Poisson regression. Children in this study were frequently exposed, across multiple settings, to marketing of non-core foods not recommended to be marketed to children. The study provides further evidence of the need for urgent action to reduce children's exposure to
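
    The abstract notes that the exposure rates were calculated using Poisson regression. A minimal, hedged sketch of that kind of rate estimate is shown below: an intercept-only Poisson GLM with each child's observation time as the exposure, so the fitted intercept exponentiates to exposures per day. The data layout and variable names are assumptions, not the study's actual code.

```python
import numpy as np
import statsmodels.api as sm

def exposures_per_day(counts, days_observed):
    """Estimate a mean daily exposure rate and its 95% CI (illustrative sketch)."""
    counts = np.asarray(counts, float)
    exog = np.ones((counts.size, 1))                       # intercept-only model
    res = sm.GLM(counts, exog, family=sm.families.Poisson(),
                 exposure=np.asarray(days_observed, float)).fit()
    rate = float(np.exp(res.params[0]))                    # mean exposures per day
    lo, hi = np.exp(res.conf_int()[0])                     # 95% confidence interval
    return rate, (float(lo), float(hi))
```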

  7. Science objectives and first results from the SMART-1/AMIE multicolour micro-camera

    Science.gov (United States)

    Josset, J.-L.; Beauvivre, S.; Cerroni, P.; de Sanctis, M. C.; Pinet, P.; Chevrel, S.; Langevin, Y.; Barucci, M. A.; Plancke, P.; Koschny, D.; Almeida, M.; Sodnik, Z.; Mancuso, S.; Hofmann, B. A.; Muinonen, K.; Shevchenko, V.; Shkuratov, Yu.; Ehrenfreund, P.; Foing, B. H.

    The Advanced Moon micro-Imager Experiment (AMIE), on-board SMART-1, the first European mission to the Moon, is an imaging system with scientific, technical and public outreach objectives. The science objectives are to image the lunar South Pole, permanently shadowed areas (ice deposits), peaks of eternal light (crater rims), ancient lunar non-mare volcanism, local spectrophotometry and the physical state of the lunar surface, and to map high-latitude regions (south), mainly on the far side (South Pole-Aitken basin). The technical objectives are to perform a laser-link experiment (detection of a laser beam emitted by the ESA/Tenerife ground station), flight demonstration of new technologies and on-board autonomous navigation. The public outreach and educational objectives are to promote planetary and space exploration. We present here the first results obtained during the cruise phase.

  8. Contribution to the reconstruction of scenes made of cylindrical and polyhedral objects from sequences of images obtained by a moving camera

    International Nuclear Information System (INIS)

    Viala, Marc

    1992-01-01

    Environment perception is an important process which enables a robot to perform actions in an unknown scene. Although many sensors exist to 'give sight', the camera seems to play a leading part. This thesis deals with the reconstruction of scenes made of cylindrical and polyhedral objects from sequences of images provided by a moving camera. Two methods are presented. Both are based on the evolution of the apparent contours of objects in a sequence. The first approach was developed considering that the camera motion is known. Despite the good results obtained by this method, the specific conditions it requires limit its use. To avoid the need for an accurate evaluation of camera motion, we introduce another method that estimates the object parameters and the camera positions at the same time. In this approach, only a rough ('poor') knowledge of the camera displacements, supplied by the control system of the robotic platform in which the camera is embedded, is needed. The optimal integration of a priori information, as well as the dynamic nature of the state model to be estimated, led us to use the Kalman filter. Experiments conducted with synthetic and real images proved the reliability of these methods. A camera calibration set-up is also suggested to achieve the most accurate scene models resulting from the reconstruction processes. (author) [fr

  9. IMPLEMENTATION OF IMAGE PROCESSING ALGORITHMS AND GLVQ TO TRACK AN OBJECT USING AR.DRONE CAMERA

    Directory of Open Access Journals (Sweden)

    Muhammad Nanda Kurniawan

    2014-08-01

    Full Text Available In this research, a Parrot AR.Drone was used as an Unmanned Aerial Vehicle (UAV) to track an object from above. Development of this system utilized some functions from the OpenCV library and the Robot Operating System (ROS). The techniques implemented in the system are an image processing algorithm (Centroid-Contour Distance (CCD)), a feature extraction algorithm (Principal Component Analysis (PCA)) and an artificial neural network algorithm (Generalized Learning Vector Quantization (GLVQ)). The final result of this research is a program for the AR.Drone to track a moving object on the floor with a fast response time of under 1 second.
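
    To make the classification stage concrete, here is a hedged, simplified stand-in for the pipeline described above: contour-distance features (assumed to be precomputed) are projected with PCA and then classified with an LVQ1-style nearest-prototype rule. The real system used GLVQ and ran inside ROS; the learning rate, epoch count and component count below are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

def train_lvq(features, labels, n_components=10, lr=0.05, epochs=20):
    """Project features with PCA and train prototypes with an LVQ1-style rule."""
    pca = PCA(n_components=n_components).fit(features)
    z = pca.transform(features)
    classes = np.unique(labels)
    protos = np.array([z[labels == c].mean(axis=0) for c in classes])
    for _ in range(epochs):
        for x, y in zip(z, labels):
            k = int(np.argmin(np.linalg.norm(protos - x, axis=1)))
            step = lr if classes[k] == y else -lr          # attract or repel the winner
            protos[k] += step * (x - protos[k])
    return pca, classes, protos

def predict(pca, classes, protos, features):
    z = pca.transform(features)
    d = np.linalg.norm(protos[None, :, :] - z[:, None, :], axis=2)
    return classes[np.argmin(d, axis=1)]
```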

  10. FPGA-Based HD Camera System for the Micropositioning of Biomedical Micro-Objects Using a Contactless Micro-Conveyor

    Directory of Open Access Journals (Sweden)

    Elmar Yusifli

    2017-03-01

    Full Text Available With recent advancements, micro-object contactless conveyors are becoming an essential part of the biomedical sector. They help avoid any infection and damage that can occur due to external contact. In this context, a smart micro-conveyor is devised. It is a Field Programmable Gate Array (FPGA)-based system that employs a smart surface for conveyance along with an OmniVision complementary metal-oxide-semiconductor (CMOS) HD camera for micro-object position detection and tracking. A specific FPGA-based hardware design and VHSIC (Very High Speed Integrated Circuit) Hardware Description Language (VHDL) implementation are realized. It is done without employing any Nios processor or System on a Programmable Chip (SOPC) builder based Central Processing Unit (CPU) core. It keeps the system efficient in terms of resource utilization and power consumption. The micro-object positioning status is captured with an embedded FPGA-based camera driver and it is communicated to the Image Processing, Decision Making and Command (IPDC) module. The IPDC is programmed in C++ and can run on a Personal Computer (PC) or on any appropriate embedded system. The IPDC decisions are sent back to the FPGA, which pilots the smart surface accordingly. In this way, an automated closed-loop system is employed to convey the micro-object towards a desired location. The devised system architecture and implementation principle are described. Its functionality is also verified. Results have confirmed the proper functionality of the developed system, along with its superior performance compared to other solutions.

  11. Implementation of Image Processing Algorithms and Glvq to Track an Object Using Ar.drone Camera

    OpenAIRE

    Kurniawan, Muhammad Nanda; Widiyanto, Didit

    2014-01-01

    Abstract In this research, Parrot AR.Drone as an Unmanned Aerial Vehicle (UAV) was used to track an object from above. Development of this system utilized some functions from OpenCV library and Robot Operating System (ROS). Techniques that were implemented in the system are image processing al-gorithm (Centroid-Contour Distance (CCD)), feature extraction algorithm (Principal Component Analysis (PCA)) and an artificial neural network algorithm (Generalized Learning Vector Quantization (GLV...

  12. Design of the solid cryogen dewar for the Near-Infrared Camera and Multi-Object Spectrometer

    Science.gov (United States)

    Oonk, Rodney L.

    1991-01-01

    A multipurpose, second-generation HST instrument for imaging and spectroscopy in the 1-2.5 micron wavelength region is being developed. The Near-Infrared Camera and Multi-Object Spectrometer (NICMOS) is unique since it is the only HST instrument that operates in the NIR and is cryogenically cooled. The NICMOS detector arrays are cooled to 58 K by a solid-nitrogen (SN2) dewar with a predicted lifetime of nearly five years. To obtain this long lifetime, a hybrid cooling approach using thermoelectric coolers (TECs) is employed to reduce the parasitic heat load on the SN2. The design features used to promote long life, the predicted lifetime improvements provided by the TECs, and the performance degradation in the event of TEC failure(s) are discussed.

  13. On faintly pi g-continuous functions

    Directory of Open Access Journals (Sweden)

    N. Rajesh

    2012-01-01

    Full Text Available A new class of functions, called faintly pi g-continuous functions, has been defined and studied. The relationships among faintly pi g-continuous functions and pi g-connected spaces, strongly pi g-normal spaces and pi g-compact spaces are investigated. Furthermore, the relationships between faintly pi g-continuous functions and graphs are investigated.

  14. HUBBLE SPACE TELESCOPE/NEAR-INFRARED CAMERA AND MULTI-OBJECT SPECTROMETER OBSERVATIONS OF THE GLIMPSE9 STELLAR CLUSTER

    International Nuclear Information System (INIS)

    Messineo, Maria; Figer, Donald F.; Davies, Ben; Trombley, Christine; Kudritzki, R. P.; Rich, R. Michael; MacKenty, John

    2010-01-01

    We present Hubble Space Telescope/Near-Infrared Camera and Multi-Object Spectrometer photometry, and low-resolution K-band spectra of the GLIMPSE9 stellar cluster. The newly obtained color-magnitude diagram shows a cluster sequence with H - Ks ≈ 1 mag, indicating an interstellar extinction A_Ks = 1.6 ± 0.2 mag. The spectra of the three brightest stars show deep CO band heads, which indicate red supergiants with spectral type M1-M2. Two O9-B2 supergiants are also identified, which yield a spectrophotometric distance of 4.2 ± 0.4 kpc. Presuming that the population is coeval, we derive an age between 15 and 27 Myr, and a total cluster mass of 1600 ± 400 M_sun, integrated down to 1 M_sun. In the vicinity of GLIMPSE9 are several H II regions and supernova remnants, all of which (including GLIMPSE9) are probably associated with a giant molecular cloud (GMC) in the inner galaxy. GLIMPSE9 probably represents one episode of massive star formation in this GMC. We have identified several other candidate stellar clusters of the same complex.

  15. Fainting

    Science.gov (United States)


  16. THE HUBBLE WIDE FIELD CAMERA 3 TEST OF SURFACES IN THE OUTER SOLAR SYSTEM: SPECTRAL VARIATION ON KUIPER BELT OBJECTS

    International Nuclear Information System (INIS)

    Fraser, Wesley C.; Brown, Michael E.; Glass, Florian

    2015-01-01

    Here, we present additional photometry of targets observed as part of the Hubble Wide Field Camera 3 (WFC3) Test of Surfaces in the Outer Solar System. Twelve targets were re-observed with the WFC3 in the optical and NIR wavebands designed to complement those used during the first visit. Additionally, all of the observations originally presented by Fraser and Brown were reanalyzed through the same updated photometry pipeline. A re-analysis of the optical and NIR color distribution reveals a bifurcated optical color distribution and only two identifiable spectral classes, each of which occupies a broad range of colors and has correlated optical and NIR colors, in agreement with our previous findings. We report the detection of significant spectral variations on five targets which cannot be attributed to photometry errors, cosmic rays, point-spread function or sensitivity variations, or other image artifacts capable of explaining the magnitude of the variation. The spectrally variable objects are found to have a broad range of dynamical classes and absolute magnitudes, exhibit a broad range of apparent magnitude variations, and are found in both compositional classes. The spectrally variable objects with sufficiently accurate colors for spectral classification maintain their membership, belonging to the same class at both epochs. 2005 TV189 exhibits a difference in color between the two epochs broad enough to span the full range of colors of the neutral class. This strongly argues that the neutral class is a single class with a broad range of colors, rather than the combination of multiple overlapping classes.

  17. Photometric Variability in the Faint Sky Variability Survey

    NARCIS (Netherlands)

    Morales-Rueda, L.; Groot, P.J.; Augusteijn, T.; Nelemans, G.A.; Vreeswijk, P.M.; Besselaar, E.J.M. van den

    2005-01-01

    The Faint Sky Variability Survey (FSVS) is aimed at finding photometric and/or astrometric variable objects between 16th and 24th mag on time-scales between tens of minutes and years with photometric precisions ranging from 3 millimag to 0.2 mag. An area of ~23 deg2, located at mid and

  18. Optical spectroscopy of faint gigahertz peaked-spectrum sources

    NARCIS (Netherlands)

    Snellen, IAG; Schilizzi, RT; Miley, GK; de Bruyn, AG; Rottgering, HJA

    1999-01-01

    We present spectroscopic observations of a sample of faint gigahertz peaked-spectrum (GPS) radio sources drawn from the Westerbork Northern Sky Survey (WENSS). Redshifts have been determined for 19 (40 per cent) of the objects. The optical spectra of the GPS sources identified with low-redshift

  19. Infrared-faint radio sources in the SERVS deep fields. Pinpointing AGNs at high redshift

    NARCIS (Netherlands)

    Maini, A.; Prandoni, I.; Norris, R. P.; Spitler, L. R.; Mignano, A.; Lacy, M.; Morganti, R.

    2016-01-01

    Context. Infrared-faint radio sources (IFRS) represent an unexpected class of objects which are relatively bright at radio wavelengths, but unusually faint at infrared (IR) and optical wavelengths. A recent and extensive campaign on the radio-brightest IFRSs (S_1.4 GHz ≳ 10 mJy) has provided evidence

  20. Long-Term Continuous Double Station Observation of Faint Meteor Showers

    Czech Academy of Sciences Publication Activity Database

    Vítek, S.; Páta, P.; Koten, Pavel; Fliegel, K.

    2016-01-01

    Vol. 16, No. 9 (2016), 1493/1-1493/10, ISSN 1424-8220 R&D Projects: GA ČR GA14-25251S Institutional support: RVO:67985815 Keywords: faint meteor shower * meteoroid * CCD camera Subject RIV: JA - Electronics; Optoelectronics, Electrical Engineering Impact factor: 2.677, year: 2016

  1. A Real-Time Method to Detect and Track Moving Objects (DATMO) from Unmanned Aerial Vehicles (UAVs) Using a Single Camera

    Directory of Open Access Journals (Sweden)

    Bruce MacDonald

    2012-04-01

    Full Text Available We develop a real-time method to detect and track moving objects (DATMO) from unmanned aerial vehicles (UAVs) using a single camera. To address the challenging characteristics of these vehicles, such as continuous unrestricted pose variation and low-frequency vibrations, new approaches must be developed. The main concept proposed in this work is to create an artificial optical flow field by estimating the camera motion between two subsequent video frames. The core of the methodology consists of comparing this artificial flow with the real optical flow directly calculated from the video feed. The motion of the UAV between frames is estimated with available parallel tracking and mapping techniques that identify good static features in the images and follow them between frames. By comparing the two optical flows, a list of dynamic pixels is obtained and then grouped into dynamic objects. Tracking these dynamic objects through time and space provides a filtering procedure to eliminate spurious events and misdetections. The algorithms have been tested with a quadrotor platform using a commercial camera.
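
    The core idea of comparing a camera-motion-induced ("artificial") flow with the measured optical flow can be sketched with standard OpenCV calls. This is a hedged approximation: the paper estimates camera motion with parallel tracking and mapping, whereas the sketch below approximates it with a RANSAC homography fitted to tracked static features; the thresholds are illustrative assumptions.

```python
import cv2
import numpy as np

def dynamic_pixel_mask(prev_gray, curr_gray, flow_thresh=2.0):
    """Flag pixels whose real motion disagrees with the camera-induced motion."""
    # real dense optical flow between consecutive frames
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # approximate camera motion from sparse features (RANSAC rejects moving points)
    pts = cv2.goodFeaturesToTrack(prev_gray, 500, 0.01, 8)
    nxt, st, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    H, _ = cv2.findHomography(pts[st == 1], nxt[st == 1], cv2.RANSAC, 3.0)
    # artificial flow: where each pixel would move under camera motion alone
    h, w = prev_gray.shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    grid = np.stack([xs, ys], axis=-1).reshape(-1, 1, 2).astype(np.float32)
    warped = cv2.perspectiveTransform(grid, H).reshape(h, w, 2)
    artificial = warped - np.stack([xs, ys], axis=-1)
    # pixels whose residual motion is large are candidate dynamic pixels
    residual = np.linalg.norm(flow - artificial, axis=2)
    return residual > flow_thresh
```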

  2. Euclidean Position Estimation of Features on a Moving Object Using a Single Camera: A Lyapunov-Based Approach

    National Research Council Canada - National Science Library

    Chitrakaran, V. K; Dawson, D. M; Chen, J; Dixon, W. E

    2004-01-01

    .... No explicit model is used to describe the movement of the object. Homography-based techniques are used in the development of the object kinematics, while Lyapunov design methods are utilized in the synthesis of the adaptive estimator...

  3. Using Color, Texture and Object-Based Image Analysis of Multi-Temporal Camera Data to Monitor Soil Aggregate Breakdown

    Directory of Open Access Journals (Sweden)

    Irena Ymeti

    2017-05-01

    Full Text Available Remote sensing has shown its potential to assess soil properties and is a fast and non-destructive method for monitoring soil surface changes. In this paper, we monitor soil aggregate breakdown under natural conditions. From November 2014 to February 2015, images and weather data were collected on a daily basis from five soils susceptible to detachment (Silty Loam with various organic matter content, Loam and Sandy Loam). Three techniques that vary in image processing complexity and user interaction were tested for their ability to monitor aggregate breakdown. Considering that soil surface roughness casts shadows, the blue/red band ratio is utilized to observe the soil aggregate changes. For images with high spatial resolution, image texture entropy, which reflects the process of soil aggregate breakdown, is used. In addition, the Huang thresholding technique, which allows estimation of the image area occupied by soil aggregates, is performed. Our results show that all three techniques indicate soil aggregate breakdown over time. The shadow ratio shows a gradual change over time with no details related to weather conditions. Both the entropy and the Huang thresholding technique show variations of soil aggregate breakdown responding to weather conditions. Using data obtained with a regular camera, we found that freezing–thawing cycles are the cause of soil aggregate breakdown.

  4. Using Color, Texture and Object-Based Image Analysis of Multi-Temporal Camera Data to Monitor Soil Aggregate Breakdown.

    Science.gov (United States)

    Ymeti, Irena; van der Werff, Harald; Shrestha, Dhruba Pikha; Jetten, Victor G; Lievens, Caroline; van der Meer, Freek

    2017-05-30

    Remote sensing has shown its potential to assess soil properties and is a fast and non-destructive method for monitoring soil surface changes. In this paper, we monitor soil aggregate breakdown under natural conditions. From November 2014 to February 2015, images and weather data were collected on a daily basis from five soils susceptible to detachment (Silty Loam with various organic matter content, Loam and Sandy Loam). Three techniques that vary in image processing complexity and user interaction were tested for their ability to monitor aggregate breakdown. Considering that soil surface roughness casts shadows, the blue/red band ratio is utilized to observe the soil aggregate changes. For images with high spatial resolution, image texture entropy, which reflects the process of soil aggregate breakdown, is used. In addition, the Huang thresholding technique, which allows estimation of the image area occupied by soil aggregates, is performed. Our results show that all three techniques indicate soil aggregate breakdown over time. The shadow ratio shows a gradual change over time with no details related to weather conditions. Both the entropy and the Huang thresholding technique show variations of soil aggregate breakdown responding to weather conditions. Using data obtained with a regular camera, we found that freezing-thawing cycles are the cause of soil aggregate breakdown.
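
    The three indicators discussed in these two records (blue/red band ratio, texture entropy, and thresholded aggregate area) can be sketched with scikit-image. This is only an illustration under stated assumptions: an 8-bit RGB photograph, an illustrative entropy window size, and Otsu's threshold standing in for the Huang thresholding reported by the authors.

```python
import numpy as np
from skimage import io, img_as_ubyte
from skimage.filters import threshold_otsu
from skimage.filters.rank import entropy
from skimage.morphology import disk

def aggregate_indicators(path):
    """Return (mean blue/red ratio, mean texture entropy, aggregate area fraction)."""
    rgb = io.imread(path).astype(float)                    # assumes an 8-bit RGB image
    blue_red = rgb[..., 2] / (rgb[..., 0] + 1e-6)          # shadow-sensitive band ratio
    gray = img_as_ubyte(rgb.mean(axis=2) / 255.0)
    tex_entropy = entropy(gray, disk(9)).mean()            # local texture entropy
    mask = gray > threshold_otsu(gray)                     # stand-in for Huang thresholding
    return blue_red.mean(), tex_entropy, mask.mean()
```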

  5. Application of Terrestrial Laser Scanner with an Integrated Thermal Camera in Non-Destructive Evaluation of Concrete Surface of Hydrotechnical Objects

    Science.gov (United States)

    Kaczmarek, Łukasz Dominik; Dobak, Paweł Józef; Kiełbasiński, Kamil

    2017-12-01

    The authors present possible applications of thermal data as an additional source of information on an object's behaviour during the technical assessment of the condition of a concrete surface. For the study, one of the most recent propositions introduced by the Zoller + Fröhlich company was used, namely the integration of a thermal camera with a terrestrial laser scanner. This solution enables the acquisition of geometric and spectral data on the surveyed object and also provides information on the surface's temperature at selected points. A section of the dam's downstream concrete wall was selected as the subject of the study, for which a number of scans were carried out and a number of thermal images were taken at different times of the day. The obtained thermal data was confronted with the acquired spectral information for the specified points. This made it possible to carry out a broader analysis of the surface and an inspection of the revealed fissure. The thermal analysis of this fissure indicated that the temperature changes within it are slower, which may affect the way the concrete behaves and may require further elaboration by the appropriate experts. Through the integration of a thermal camera with a terrestrial laser scanner, one can analyse changes of temperature not only at discretely selected points but over the whole surface as well. Moreover, it is also possible to accurately determine the range and the area of the change affecting the surface. The authors note the limitations of the presented solution, such as, inter alia, the resolution of the thermal camera.

  6. Are the infrared-faint radio sources pulsars?

    Science.gov (United States)

    Cameron, A. D.; Keith, M.; Hobbs, G.; Norris, R. P.; Mao, M. Y.; Middelberg, E.

    2011-07-01

    Infrared-faint radio sources (IFRS) are objects which are strong at radio wavelengths but undetected in sensitive Spitzer observations at infrared wavelengths. Their nature is uncertain and most have not yet been associated with any known astrophysical object. One possibility is that they are radio pulsars. To test this hypothesis we undertook observations of 16 of these sources with the Parkes Radio Telescope. Our results limit the radio emission to a pulsed flux density of less than 0.21 mJy (assuming a 50 per cent duty cycle). This is well below the flux density of the IFRS. We therefore conclude that these IFRS are not radio pulsars.

  7. Interests and instrument: a micro-history of object Wh.3469 (X-ray powder diffraction camera, ca. 1940).

    Science.gov (United States)

    Scheffler, Robin Wolfe

    2009-12-01

    This paper presents a micro-history of an object in the collection of the Whipple Museum of the History of Science (accession no. Wh.3469), with an emphasis on how Wh.3469 reflects a hybrid of two different interwar British X-ray crystallographic communities, namely those based in W. L. Bragg's physics laboratory at the Victoria University of Manchester and the Crystallographic Laboratory at the University of Cambridge. It explores connections between Wh.3469's final design and construction and the different interests each community had in X-ray crystallography.

  8. Hydra II: A Faint and Compact Milky Way Dwarf Galaxy Found in the Survey of the Magellanic Stellar History

    NARCIS (Netherlands)

    Martin, Nicolas F.; Nidever, David L.; Besla, Gurtina; Olsen, Knut; Walker, Alistair R.; Vivas, A. Katherina; Gruendl, Robert A.; Kaleida, Catherine C.; Muñoz, Ricardo R.; Blum, Robert D.; Saha, Abhijit; Conn, Blair C.; Bell, Eric F.; Chu, You-Hua; Cioni, Maria-Rosa L.; de Boer, Thomas J. L.; Gallart, Carme; Jin, Shoko; Kunder, Andrea; Majewski, Steven R.; Martinez-Delgado, David; Monachesi, Antonela; Monelli, Matteo; Monteagudo, Lara; Noël, Noelia E. D.; Olszewski, Edward W.; Stringfellow, Guy S.; van der Marel, Roeland P.; Zaritsky, Dennis

    We present the discovery of a new dwarf galaxy, Hydra II, found serendipitously within the data from the ongoing Survey of the Magellanic Stellar History conducted with the Dark Energy Camera on the Blanco 4 m Telescope. The new satellite is compact (r_h = 68 ± 11 pc) and faint (M_V = -4.8 ± 0.3),

  9. Faint stars and OmegaCAM

    NARCIS (Netherlands)

    Kuijken, K; Cristiani, S; Renzini, A; Williams, RE

    2001-01-01

    OmegaCAM will be the wide-field imager on the VLT Survey Telescope. In this contribution I present applications of this instrument to the study of faint stellar populations. Two projects are highlighted: a proper motion study to uncover the galactic halo population, and a microlensing study towards

  10. Thermal Cameras and Applications

    DEFF Research Database (Denmark)

    Gade, Rikke; Moeslund, Thomas B.

    2014-01-01

    Thermal cameras are passive sensors that capture the infrared radiation emitted by all objects with a temperature above absolute zero. This type of camera was originally developed as a surveillance and night vision tool for the military, but recently the price has dropped, significantly opening up...

  11. Strategies for Characterizing the Sensory Environment: Objective and Subjective Evaluation Methods using the VisiSonic Real Space 64/5 Audio-Visual Panoramic Camera

    Science.gov (United States)

    2017-11-01


  12. Objectivity

    CERN Document Server

    Daston, Lorraine

    2010-01-01

    Objectivity has a history, and it is full of surprises. In Objectivity, Lorraine Daston and Peter Galison chart the emergence of objectivity in the mid-nineteenth-century sciences--and show how the concept differs from its alternatives, truth-to-nature and trained judgment. This is a story of lofty epistemic ideals fused with workaday practices in the making of scientific images. From the eighteenth through the early twenty-first centuries, the images that reveal the deepest commitments of the empirical sciences--from anatomy to crystallography--are those featured in scientific atlases, the compendia that teach practitioners what is worth looking at and how to look at it. Galison and Daston use atlas images to uncover a hidden history of scientific objectivity and its rivals. Whether an atlas maker idealizes an image to capture the essentials in the name of truth-to-nature or refuses to erase even the most incidental detail in the name of objectivity or highlights patterns in the name of trained judgment is a...

  13. Star formation rate and extinction in faint z ∼ 4 Lyman break galaxies

    Energy Technology Data Exchange (ETDEWEB)

    To, Chun-Hao; Wang, Wei-Hao [Institute of Astronomy and Astrophysics, Academia Sinica, No. 1, Sec. 4, Roosevelt Road, Taipei 10617, Taiwan (China); Owen, Frazer N. [National Radio Astronomy Observatory, P.O. Box 0, Socorro, NM 87801 (United States)

    2014-09-10

    We present a statistical detection of 1.5 GHz radio continuum emission from a sample of faint z ∼ 4 Lyman break galaxies (LBGs). To constrain their extinction and intrinsic star formation rate (SFR), we combine the latest ultradeep Very Large Array 1.5 GHz radio image and the Hubble Space Telescope Advanced Camera for Surveys (ACS) optical images in the GOODS-N. We select a large sample of 1771 z ∼ 4 LBGs from the ACS catalog using B_F435W-dropout color criteria. Our LBG samples have I_F775W ∼ 25-28 (AB), ∼0-3 mag fainter than M*_UV at z ∼ 4. In our stacked radio images, we find the LBGs to be point-like under our 2'' angular resolution. We measure their mean 1.5 GHz flux by stacking the measurements on the individual objects. We achieve a statistical detection of S_1.5 GHz = 0.210 ± 0.075 μJy at ∼3σ for the first time on such a faint LBG population at z ∼ 4. The measurement takes into account the effects of source size and blending of multiple objects. The detection is visually confirmed by stacking the radio images of the LBGs, and the uncertainty is quantified with Monte Carlo simulations on the radio image. The stacked radio flux corresponds to an obscured SFR of 16.0 ± 5.7 M_☉ yr^-1, and implies a rest-frame UV extinction correction factor of 3.8. This extinction correction is in excellent agreement with that derived from the observed UV continuum spectral slope, using the local calibration of Meurer et al. This result supports the use of the local calibration on high-redshift LBGs to derive the extinction correction and SFR, and also disfavors a steep reddening curve such as that of the Small Magellanic Cloud.
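
    The stacking measurement at the heart of this abstract can be illustrated with a few lines of Python. This is only a minimal sketch of mean stacking with a bootstrap uncertainty; the actual analysis additionally corrects for source size and for blending of multiple objects, and runs its Monte Carlo directly on the radio image.

```python
import numpy as np

def stacked_flux(fluxes_ujy, n_boot=10000, seed=0):
    """Mean-stack individual flux measurements and bootstrap the uncertainty."""
    f = np.asarray(fluxes_ujy, float)
    rng = np.random.default_rng(seed)
    boot = rng.choice(f, size=(n_boot, f.size), replace=True).mean(axis=1)
    return f.mean(), boot.std()          # stacked flux and its 1-sigma uncertainty
```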

  14. Morphology and astrometry of Infrared-Faint Radio Sources

    Science.gov (United States)

    Middelberg, Enno; Norris, Ray; Randall, Kate; Mao, Minnie; Hales, Christopher

    2008-10-01

    Infrared-Faint Radio Sources, or IFRS, are an unexpected class of object discovered in the Australia Telescope Large Area Survey, ATLAS. They are compact 1.4GHz radio sources with no visible counterparts in co-located (relatively shallow) Spitzer infrared and optical images. We have detected two of these objects with VLBI, indicating the presence of an AGN. These observations and our ATLAS data indicate that IFRS are extended on scales of arcseconds, and we wish to image their morphologies to obtain clues about their nature. These observations will also help us to select optical counterparts from very deep, and hence crowded, optical images which we have proposed. With these data in hand, we will be able to compare IFRS to known object types and to apply for spectroscopy to obtain their redshifts.

  15. Detection of faint X-ray spectral features using wavelength, energy, and spatial discrimination techniques

    International Nuclear Information System (INIS)

    Hudson, L.T.; Gillaspy, J.D.; Pomeroy, J.M.; Szabo, C.I.; Tan, J.N.; Radics, B.; Takacs, E.; Chantler, C.T.; Kimpton, J.A.; Kinnane, M.N.; Smale, L.F.

    2007-01-01

    We report here our methods and results of measurements of very low-signal X-ray spectra produced by highly charged ions in an electron beam ion trap (EBIT). A megapixel Si charge-coupled device (CCD) camera was used in a direct-detection, single-photon-counting mode to image spectra with a cylindrically bent Ge(2 2 0) crystal spectrometer. The resulting wavelength-dispersed spectra were then processed using several intrinsic features of CCD images and image-analysis techniques. We demonstrate the ability to clearly detect very faint spectral features that are on the order of the noise due to cosmic-ray background signatures in our images. These techniques remove extraneous signal due to muon tracks and other sources, and are coupled with the spectrometer wavelength dispersion and atomic-structure calculations of hydrogen-like Ti to identify the energy of a faint line that was not in evidence before applying the methods outlined here
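
    The single-photon-counting and cosmic-ray rejection steps described above can be illustrated with a generic sketch: threshold the frame above the background noise, label connected pixel groups, and keep only small, isolated events as X-ray photons while discarding extended tracks. The sigma cut and the size-based veto below are illustrative assumptions, not the authors' actual criteria.

```python
import numpy as np
from scipy import ndimage

def photon_events(frame, sigma_cut=5.0, max_pixels=4):
    """Return (x, y, summed charge) for compact above-threshold events."""
    bkg = np.median(frame)
    noise = 1.4826 * np.median(np.abs(frame - bkg))        # robust (MAD) noise estimate
    labels, n = ndimage.label(frame > bkg + sigma_cut * noise)
    events = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if ys.size <= max_pixels:                          # reject extended muon tracks
            events.append((xs.mean(), ys.mean(), frame[ys, xs].sum() - ys.size * bkg))
    return events
```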

  16. Sub-percent Photometry: Faint DA White Dwarf Spectrophotometric Standards for Astrophysical Observatories

    Science.gov (United States)

    Narayan, Gautham; Axelrod, Tim; Calamida, Annalisa; Saha, Abhijit; Matheson, Thomas; Olszewski, Edward; Holberg, Jay; Bohlin, Ralph; Stubbs, Christopher W.; Rest, Armin; Deustua, Susana; Sabbi, Elena; MacKenty, John W.; Points, Sean D.; Hubeny, Ivan

    2018-01-01

    We have established a network of faint (16.5 Camera 3 (WFC3) on the Hubble Space Telescope (HST). We have developed two independent analyses to forward model all the observed photometry and ground-based spectroscopy and infer a spectral energy distribution for each source using a non-local-thermodynamic-equilibrium (NLTE) DA white dwarf atmosphere extincted by interstellar dust. The models are in excellent agreement with each other, and agree with the observations to better than 0.01 mag in all passbands, and better than 0.005 mag in the optical. The high precision of these faint sources, tied directly to the most accurate flux standards presently available, make our network of standards ideally suited for any experiments that have very stringent requirements on absolute flux calibration, such as studies of dark energy using the Large Synoptic Survey Telescope (LSST) and the Wide-Field Infrared Survey Telescope (WFIRST).

  17. Chemical Abundance Measurements of Ultra-Faint Dwarf Galaxies Discovered by the Dark Energy Survey

    Science.gov (United States)

    Nagasawa, Daniel; Marshall, Jennifer L.; Simon, Joshua D.; Hansen, Terese; Li, Ting; Bernstein, Rebecca; Balbinot, Eduardo; Drlica-Wagner, Alex; Pace, Andrew; Strigari, Louis; Pellegrino, Craig; DePoy, Darren L.; Suntzeff, Nicholas; Bechtol, Keith; Dark Energy Suvey

    2018-01-01

    We present chemical abundance analysis results derived from high-resolution spectroscopy of ultra-faint dwarfs discovered by the Dark Energy Survey. Ultra-faint dwarf galaxies preserve a fossil record of the chemical abundance patterns imprinted by the first stars in the Universe. High-resolution spectroscopic observations of member stars in several recently discovered Milky Way satellites reveal a range of abundance patterns among ultra-faint dwarfs suggesting that star formation processes in the early Universe were quite diverse. The chemical content provides a glimpse not only of the varied nucleosynthetic processes and chemical history of the dwarfs themselves, but also the environment in which they were formed. We present the chemical abundance analysis of these objects and discuss possible explanations for the observed abundance patterns.

  18. Gamma camera

    International Nuclear Information System (INIS)

    Tschunt, E.; Platz, W.; Baer, Ul; Heinz, L.

    1978-01-01

    A gamma camera has a plurality of exchangeable collimators, one of which is replaceably mounted in the ray inlet opening of the camera, while the others are placed on separate supports. Supports are swingably mounted upon a column one above the other

  19. Gamma camera

    International Nuclear Information System (INIS)

    Schlosser, P.A.; Steidley, J.W.

    1980-01-01

    The design of a collimation system for a gamma camera for use in nuclear medicine is described. When used with a 2-dimensional position sensitive radiation detector, the novel system can produce superior images than conventional cameras. The optimal thickness and positions of the collimators are derived mathematically. (U.K.)

  20. A LARGE AND FAINT PHOTOMETRIC CATALOG ON THE ECLIPTIC

    International Nuclear Information System (INIS)

    Buie, Marc W.; Trilling, David E.; Wasserman, Lawrence H.; Crudo, Richard A.

    2011-01-01

    A photometric catalog, developed for the calibration of the Deep Ecliptic Survey, is presented. The catalog contains 213,272 unique sources that were measured in V and R filters and transformed to the Johnson-Cousins system using the Landolt standard catalog. All of the sources lie within 6° of the ecliptic and cover all longitudes except for the densest stellar regions nearest the galactic center. Seventeen percent of the sources in the catalog are derived from three or more nights of observation. The catalog contains sources as faint as R ∼19 but the largest fraction fall in the R ∼15-16 (V ∼16-17) mag range. All magnitude bins down to R = 19 have a significant fraction of objects with uncertainties ≤0.1 mag.
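
    The transformation onto the Johnson-Cousins system mentioned above typically amounts to fitting a zero point and a colour term against standard stars. The sketch below is a minimal least-squares version of that step; airmass and extinction terms are omitted for brevity, and the function names are illustrative.

```python
import numpy as np

def fit_transformation(inst_mag, std_mag, std_color):
    """Fit std_mag ≈ inst_mag + zp + c * std_color to Landolt-style standards."""
    A = np.column_stack([np.ones_like(std_color), std_color])
    zp, c = np.linalg.lstsq(A, np.asarray(std_mag) - np.asarray(inst_mag), rcond=None)[0]
    return zp, c

def to_standard(inst_mag, color, zp, c):
    """Apply the fitted zero point and colour term to programme stars."""
    return inst_mag + zp + c * color
```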

  1. Gamma camera

    International Nuclear Information System (INIS)

    Tschunt, E.; Platz, W.; Baer, U.; Heinz, L.

    1978-01-01

    A gamma camera has a plurality of exchangeable collimators, one of which is mounted in the ray inlet opening of the camera, while the others are placed on separate supports. The supports are swingably mounted upon a column one above the other and can be swung through about 90° to a collimator exchange position. Each of the separate supports is swingable to a vertically aligned position, with limiting of the swinging movement and positioning of the support at the desired exchange position. The collimators are carried on the supports by means of a series of vertically disposed coil springs. Projections on the camera are movable from above into grooves of the collimator at the exchange position, whereupon the collimator is turned so that it is securely prevented from falling out of the camera head

  2. The laser scanning camera

    International Nuclear Information System (INIS)

    Jagger, M.

    The prototype development of a novel lensless camera is reported, which utilises a laser beam scanned in a raster by means of orthogonal vibrating mirrors to illuminate the field of view. Laser light reflected from the scene is picked up by a conveniently sited photosensitive device and used to modulate the brightness of a T.V. display scanned in synchronism with the moving laser beam, hence producing a T.V. image of the scene. The camera, which needs no external lighting system, can act in a wide-angle mode or, by varying the size and position of the raster, can be made to zoom in to view in detail any object within a 40° overall viewing angle. The resolution and performance of the camera are described and a comparison of these aspects is made with conventional T.V. cameras. (author)

  3. An HST study of three very faint GRB host galaxies

    DEFF Research Database (Denmark)

    Jaunsen, A.O.; Andersen, M.I.; Hjorth, J.

    2003-01-01

    As part of the HST/STIS GRB host survey program we present the detection of three faint gamma-ray burst (GRB) host galaxies based on an accurate localisation using ground-based data of the optical afterglows (OAs). A common property of these three hosts is their extreme faintness. The location at...

  4. World's fastest and most sensitive astronomical camera

    Science.gov (United States)

    2009-06-01

    The next generation of instruments for ground-based telescopes took a leap forward with the development of a new ultra-fast camera that can take 1500 finely exposed images per second even when observing extremely faint objects. The first 240x240 pixel images with the world's fastest high precision faint light camera were obtained through a collaborative effort between ESO and three French laboratories from the French Centre National de la Recherche Scientifique/Institut National des Sciences de l'Univers (CNRS/INSU). Cameras such as this are key components of the next generation of adaptive optics instruments of Europe's ground-based astronomy flagship facility, the ESO Very Large Telescope (VLT). [ESO PR Photo 22a/09: the CCD220 detector. ESO PR Photo 22b/09: the OCam camera. ESO PR Video 22a/09: OCam images.] "The performance of this breakthrough camera is without an equivalent anywhere in the world. The camera will enable great leaps forward in many areas of the study of the Universe," says Norbert Hubin, head of the Adaptive Optics department at ESO. OCam will be part of the second-generation VLT instrument SPHERE. To be installed in 2011, SPHERE will take images of giant exoplanets orbiting nearby stars. A fast camera such as this is needed as an essential component for the modern adaptive optics instruments used on the largest ground-based telescopes. Telescopes on the ground suffer from the blurring effect induced by atmospheric turbulence. This turbulence causes the stars to twinkle in a way that delights poets, but frustrates astronomers, since it blurs the finest details of the images. Adaptive optics techniques overcome this major drawback, so that ground-based telescopes can produce images that are as sharp as if taken from space. Adaptive optics is based on real-time corrections computed from images obtained by a special camera working at very high speeds. Nowadays, this means many hundreds of times each second. The new generation instruments require these

  5. A Faint and Lonely Brown Dwarf in the Solar Vicinity

    Science.gov (United States)

    1997-04-01

    Discovery of KELU-1 Promises New Insights into Strange Objects. Brown Dwarfs are star-like objects which are too small to become real stars, yet too large to be real planets. Their mass is too small to ignite those nuclear processes which are responsible for the large energies and high temperatures of stars, but it is much larger than that of the planets we know in our solar system. Until now, very few Brown Dwarfs have been securely identified as such. Two are members of double-star systems, and a few more are located deep within the Pleiades star cluster. Now, however, Maria Teresa Ruiz of the Astronomy Department at Universidad de Chile (Santiago de Chile), using telescopes at the ESO La Silla observatory, has just discovered one that is all alone and apparently quite near to us. Contrary to the others which are influenced by other objects in their immediate surroundings, this new Brown Dwarf is unaffected and will thus be a perfect object for further investigations that may finally allow us to better understand these very interesting celestial bodies. It has been suggested that Brown Dwarfs may constitute a substantial part of the unseen dark matter in our Galaxy. This discovery may therefore also have important implications for this highly relevant research area. Searching for nearby faint stars The story of this discovery goes back to 1987 when Maria Teresa Ruiz decided to embark upon a long-term search (known as the Calan-ESO proper-motion survey) for another type of unusual object, the so-called White Dwarfs, i.e. highly evolved, small and rather faint stars. Although they have masses similar to that of the Sun, such stars are no larger than the Earth and are therefore extremely compact. They are particularly interesting, because they most probably represent the future end point of evolution of our Sun, some billions of years from now. For this project, the Chilean astronomer obtained large-field photographic exposures with the 1-m ESO Schmidt telescope at

  6. THE FAINT END OF THE LUMINOSITY FUNCTION AND LOW SURFACE BRIGHTNESS GALAXIES

    Energy Technology Data Exchange (ETDEWEB)

    Geller, Margaret J.; Kurtz, Michael J.; Fabricant, Daniel G. [Smithsonian Astrophysical Observatory, 60 Garden Street, Cambridge, MA 02138 (United States); Diaferio, Antonaldo [Dipartimento di Fisica Generale ' Amedeo Avogadro' , Universita degli Studi di Torino, via P. Giuria 1, 10125 Torino (Italy); Dell' Antonio, Ian P., E-mail: mgeller@cfa.harvard.edu, E-mail: mkurtz@cfa.harvard.edu, E-mail: dfabricant@cfa.harvard.edu, E-mail: adiaferio@cfa.harvard.edu, E-mail: ian@het.brown.edu [Department of Physics, Brown University, Box 1843, Providence, RI 02912 (United States)

    2012-04-15

    Smithsonian Hectospec Lensing Survey (SHELS) is a dense redshift survey covering a 4 deg² region to a limiting R = 20.6. In the construction of the galaxy catalog and in the acquisition of spectroscopic targets, we paid careful attention to the survey completeness for lower surface brightness dwarf galaxies. Thus, although the survey covers a small area, it is a robust basis for computation of the slope of the faint end of the galaxy luminosity function to a limiting M_R = -13.3 + 5 log h. We calculate the faint-end slope in the R band for the subset of SHELS galaxies with redshifts in the range 0.02 ≤ z < 0.1, SHELS_0.1. This sample contains 532 galaxies with R < 20.6 and with a median surface brightness within the half-light radius of SB_50,R = 21.82 mag arcsec^-2. We used this sample to make one of the few direct measurements of the dependence of the faint end of the galaxy luminosity function on surface brightness. For the sample as a whole the faint-end slope, α = -1.31 ± 0.04, is consistent with both the Blanton et al. analysis of the Sloan Digital Sky Survey and the Liu et al. analysis of the COSMOS field. This consistency is impressive given the very different approaches of these three surveys. A magnitude-limited sample of 135 galaxies with optical spectroscopic redshifts with mean half-light surface brightness, SB_50,R ≥ 22.5 mag arcsec^-2 is unique to SHELS_0.1. The faint-end slope is α_22.5 = -1.52 ± 0.16. SHELS_0.1 shows that lower surface brightness objects dominate the faint-end slope of the luminosity function in the field, underscoring the importance of surface brightness limits in evaluating measurements of the faint-end slope and its evolution.
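
    For readers unfamiliar with how a slope like α = -1.31 is read off, here is a hedged sketch (not the SHELS pipeline): well faintward of M*, the Schechter function reduces to a power law, so log10 φ(M) is approximately linear in M with slope -0.4(α + 1), and α follows from a straight-line fit to the binned luminosity function.

```python
import numpy as np

def faint_end_slope(abs_mag, phi):
    """Estimate alpha from binned luminosity-function values on the faint end.

    Assumes the bins lie well faintward of M*, where
    log10(phi) ≈ const - 0.4 * (alpha + 1) * M.
    """
    slope, _ = np.polyfit(np.asarray(abs_mag, float),
                          np.log10(np.asarray(phi, float)), 1)
    return slope / (-0.4) - 1.0
```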

  7. VLBI observations of Infrared-Faint Radio Sources

    Science.gov (United States)

    Middelberg, Enno; Phillips, Chris; Norris, Ray; Tingay, Steven

    2006-10-01

    We propose to observe a small sample of radio sources from the ATLAS project (ATLAS = Australia Telescope Large Area Survey) with the LBA, to determine their compactness and map their structures. The sample consists of three radio sources with no counterpart in the co-located SWIRE survey (3.6 um to 160 um), carried out with the Spitzer Space Telescope. This rare class of sources, dubbed Infrared-Faint Radio Sources, or IFRS, is inconsistent with current galaxy evolution models. VLBI observations are an essential way to obtain further clues on what these objects are and why they are hidden from infrared observations: we will map their structure to test whether they resemble core-jet or double-lobed morphologies, and we will measure the flux densities on long baselines, to determine their compactness. Previous snapshot-style LBA observations of two other IFRS yielded no detections, hence we propose to use disk-based recording with 512 Mbps where possible, for highest sensitivity. With the observations proposed here, we will increase the number of VLBI-observed IFRS from two to five, soon allowing us to draw general conclusions about this intriguing new class of objects.

  8. On upper and lower faintly i-continuous multifunctions

    Directory of Open Access Journals (Sweden)

    C. Arivzhagi

    2018-04-01

    Full Text Available The aim of this paper is to introduce and study upper and lower faintly I-continuous multifunctions as a generalization of upper and lower I-continuous multifunctions, respectively.

  9. MCDONALD OBSERVATORY FAINT COMET SPECTRO-PHOTOMETRIC SURVEY

    Data.gov (United States)

    National Aeronautics and Space Administration — The McDonald Observatory Faint Comet Survey data set presents spectral data from 152 observations of 17 comets taken using the Intensified Dissector Scanner...

  10. Gamma camera

    International Nuclear Information System (INIS)

    Reiss, K.H.; Kotschak, O.; Conrad, B.

    1976-01-01

    A gamma camera with a simplified setup as compared with the state of engineering is described, permitting, apart from good localization, also energy discrimination. Behind the usual vacuum image amplifier, a multiwire proportional chamber filled with bromotrifluoromethane is connected in series. Localization of the signals is achieved by a delay line, and energy determination by means of a pulse height discriminator. With the aid of drawings and circuit diagrams, the setup and mode of operation are explained. (ORU) [de

  11. Gamma camera

    International Nuclear Information System (INIS)

    Berninger, W.H.

    1975-01-01

    The light pulse output of a scintillator, on which incident collimated gamma rays impinge, is detected by an array of photoelectric tubes each having a convexly curved photocathode disposed in close proximity to the scintillator. Electronic circuitry connected to outputs of the phototubes develops the scintillation event position coordinate electrical signals with good linearity and with substantial independence of the spacing between the scintillator and photocathodes so that the phototubes can be positioned as close to the scintillator as is possible to obtain less distortion in the field of view and improved spatial resolution as compared to conventional planar photocathode gamma cameras

  12. The radio properties of infrared-faint radio sources

    Science.gov (United States)

    Middelberg, E.; Norris, R. P.; Hales, C. A.; Seymour, N.; Johnston-Hollitt, M.; Huynh, M. T.; Lenc, E.; Mao, M. Y.

    2011-02-01

    Context. Infrared-faint radio sources (IFRS) are objects that have flux densities of several mJy at 1.4 GHz, but that are invisible at 3.6 μm when using sensitive Spitzer observations with μJy sensitivities. Their nature is unclear and difficult to investigate since they are only visible in the radio. Aims: High-resolution radio images and comprehensive spectral coverage can yield constraints on the emission mechanisms of IFRS and can give hints to similarities with known objects. Methods: We imaged a sample of 17 IFRS at 4.8 GHz and 8.6 GHz with the Australia Telescope Compact Array to determine the structures on arcsecond scales. We added radio data from other observing projects and from the literature to obtain broad-band radio spectra. Results: We find that the sources in our sample are either resolved out at the higher frequencies or are compact at resolutions of a few arcsec, which implies that they are smaller than a typical galaxy. The spectra of IFRS are remarkably steep, with a median spectral index of -1.4 and a prominent lack of spectral indices larger than -0.7. We also find that, given the IR non-detections, the ratio of 1.4 GHz flux density to 3.6 μm flux density is very high, and this puts them into the same regime as high-redshift radio galaxies. Conclusions: The evidence that IFRS are predominantly high-redshift sources driven by active galactic nuclei (AGN) is strong, even though not all IFRS may be caused by the same phenomenon. Compared to the rare and painstakingly collected high-redshift radio galaxies, IFRS appear to be much more abundant, but less luminous, AGN-driven galaxies at similar cosmological distances.
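
    The spectral indices quoted above follow the usual two-point convention S ∝ ν^α between flux densities at two frequencies. A short worked example (with illustrative flux densities, not values from the paper) is given below.

```python
import numpy as np

def spectral_index(s1, nu1, s2, nu2):
    """Two-point spectral index alpha for S proportional to nu**alpha."""
    return np.log(s1 / s2) / np.log(nu1 / nu2)

# e.g. a source falling from 10 mJy at 1.4 GHz to 1.8 mJy at 4.8 GHz:
# spectral_index(10, 1.4, 1.8, 4.8) ≈ -1.39, i.e. about as steep as the median IFRS.
```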

  13. Faint nebulosities in the vicinity of the Magellanic H I Stream

    International Nuclear Information System (INIS)

    Johnson, P.G.; Meaburn, J.; Osman, A.M.I.

    1982-01-01

    Very deep Hα image tube photographs with a wide-field filter camera have been taken of the Magellanic H I Stream. A diffuse region of emission has been detected. Furthermore, a mosaic of high-contrast prints of IIIaJ survey plates taken with the SRC Schmidt has been compiled over the same area. A complex region of faint, blue, filamentary nebulosity has been revealed. This appears to be reflection nebulosity either in the galactic plane or, less probably, in the vicinity of the Large Magellanic Cloud. A deep Hα 1.2-m Schmidt photograph of these blue filaments reinforces the suggestion that they are reflection nebulae. The reflection and emission nebulosities in this vicinity have been compared to each other and the Magellanic H I Stream. The diffuse region of Hα emission is particularly well correlated with the Stream. (author)

  14. Long-Term Continuous Double Station Observation of Faint Meteor Showers.

    Science.gov (United States)

    Vítek, Stanislav; Páta, Petr; Koten, Pavel; Fliegel, Karel

    2016-09-14

    Meteor detection and analysis is an essential topic in the field of astronomy. In this paper, a high-sensitivity and high-time-resolution imaging device for the detection of faint meteoric events is presented. The instrument is based on a fast CCD camera and an image intensifier. Two such instruments form a double-station observation network. The MAIA (Meteor Automatic Imager and Analyzer) system has been in continuous operation since 2013 and has successfully captured hundreds of meteors belonging to different meteor showers, as well as sporadic meteors. A data processing pipeline for the efficient processing and evaluation of the massive amount of video sequences is also introduced in this paper.
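
    As a rough illustration of the kind of detection step such a video pipeline needs, the sketch below flags frames whose difference against a running background exceeds a robust noise threshold. This is a generic transient detector under stated assumptions (threshold, pixel count, update rate), not the MAIA processing pipeline itself.

```python
import numpy as np

def detect_transients(frames, k_sigma=6.0, alpha=0.05, min_pixels=10):
    """Return indices of frames containing a candidate transient (e.g. a meteor)."""
    frames = [np.asarray(f, float) for f in frames]
    bg = frames[0].copy()                                 # running background model
    hits = []
    for i, f in enumerate(frames[1:], start=1):
        diff = f - bg
        sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))  # robust noise
        if (diff > k_sigma * sigma).sum() > min_pixels:   # enough bright pixels: candidate
            hits.append(i)
        bg = (1 - alpha) * bg + alpha * f                 # slowly track the sky background
    return hits
```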

  15. Serendipitous discovery of a faint dwarf galaxy near a Local Volume dwarf

    Science.gov (United States)

    Makarova, L. N.; Makarov, D. I.; Antipova, A. V.; Karachentsev, I. D.; Tully, R. B.

    2018-03-01

    A faint dwarf irregular galaxy has been discovered in the HST/ACS field of LV J1157+5638. The galaxy is resolved into individual stars, including those at the brightest magnitudes of the red giant branch. The dwarf is very likely a physical satellite of LV J1157+5638. The distance modulus of LV J1157+5638, using the tip of the red giant branch (TRGB) distance indicator, is 29.82 ± 0.09 mag (D = 9.22 ± 0.38 Mpc). The TRGB distance modulus of LV J1157+5638 sat is 29.76 ± 0.11 mag (D = 8.95 ± 0.42 Mpc). The distances to the two galaxies are consistent within the uncertainties. The projected separation between them is only 3.9 kpc. LV J1157+5638 has a total absolute V magnitude of -13.26 ± 0.10 and a linear Holmberg diameter of 1.36 kpc, whereas its faint satellite LV J1157+5638 sat has M_V = -9.38 ± 0.13 mag and a Holmberg diameter of 0.37 kpc. This is the first time such a faint dwarf has been discovered beyond the nearest 4 Mpc from us. The presence of main-sequence stars in both galaxies unambiguously indicates the classification of the objects as dwarf irregulars, with recent or ongoing star formation events in both galaxies.
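
    The distances quoted above follow directly from the distance modulus via D[pc] = 10^((μ + 5)/5). A quick check of the numbers:

```python
def modulus_to_mpc(mu):
    """Convert a distance modulus to a distance in Mpc: D[pc] = 10**((mu + 5) / 5)."""
    return 10 ** ((mu + 5.0) / 5.0) / 1.0e6

# modulus_to_mpc(29.82) ≈ 9.2 Mpc and modulus_to_mpc(29.76) ≈ 8.95 Mpc,
# matching (to within rounding) the distances quoted for LV J1157+5638 and its satellite.
```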

  16. Infrared Faint Radio Sources in the Extended Chandra Deep Field South

    Science.gov (United States)

    Huynh, Minh T.

    2009-01-01

    Infrared-Faint Radio Sources (IFRSs) are a class of radio objects found in the Australia Telescope Large Area Survey (ATLAS) which have no observable counterpart in the Spitzer Wide-area Infrared Extragalactic Survey (SWIRE). The extended Chandra Deep Field South now has even deeper Spitzer imaging (3.6 to 70 micron) from a number of Legacy surveys. We report the detections of two IFRS sources in IRAC images. The non-detection of two other IFRSs allows us to constrain the source type. Detailed modeling of the SED of these objects shows that they are consistent with high redshift AGN (z > 2).

  17. Evidence for Infrared-faint Radio Sources as z > 1 Radio-loud Active Galactic Nuclei

    Science.gov (United States)

    Huynh, Minh T.; Norris, Ray P.; Siana, Brian; Middelberg, Enno

    2010-02-01

    Infrared-Faint Radio Sources (IFRSs) are a class of radio objects found in the Australia Telescope Large Area Survey which have no observable mid-infrared counterpart in the Spitzer Wide-area Infrared Extragalactic (SWIRE) survey. The extended Chandra Deep Field South now has even deeper Spitzer imaging (3.6-70 μm) from a number of Legacy surveys. We report the detections of two IFRS sources in IRAC images. The non-detection of two other IFRSs allows us to constrain the source type. Detailed modeling of the spectral energy distribution of these objects shows that they are consistent with high-redshift (z >~ 1) active galactic nuclei.

  18. Faint warm debris disks around nearby bright stars explored by AKARI and IRSF

    Science.gov (United States)

    Ishihara, Daisuke; Takeuchi, Nami; Kobayashi, Hiroshi; Nagayama, Takahiro; Kaneda, Hidehiro; Inutsuka, Shu-ichiro; Fujiwara, Hideaki; Onaka, Takashi

    2017-05-01

    Context. Debris disks are important observational clues for understanding the planetary-system formation process. In particular, faint warm debris disks may be related to late planet formation near 1 au. A systematic search for faint warm debris disks is necessary to reveal terrestrial planet formation. Aims: Faint warm debris disks show excess emission that peaks at mid-IR wavelengths. Thus we explore debris disks using the AKARI mid-IR all-sky point source catalog (PSC), a product of the second-generation unbiased IR all-sky survey. Methods: We investigate IR excess emission for 678 isolated main-sequence stars with 18 μm detections in the AKARI mid-IR all-sky catalog by comparing their fluxes with the fluxes predicted for their photospheres based on optical to near-IR fluxes and model spectra. The near-IR fluxes are first taken from the 2MASS PSC; however, 286 stars that are bright at Ks are saturated there, so we obtained new near-IR photometry of these stars with the IRSF telescope in South Africa, which improved the flux accuracy from 14% to 1.8% on average. Results: We identified 53 debris-disk candidates, including eight new detections, from our sample of 678 main-sequence stars. The detection rate of debris disks in this work is 8%, which is comparable with those in previous works by Spitzer and Herschel. Conclusions: The importance of this study is the detection of faint warm debris disks around nearby field stars. At least nine objects have a large amount of dust for their ages, which cannot be explained by the conventional steady-state collisional cascade model. The full version of Table 2 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/601/A72
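
    The candidates described above are selected by comparing observed 18 μm fluxes with predicted photospheric fluxes. A common way to quantify such an excess, sketched below with hypothetical numbers rather than the authors' exact criterion, is the significance of the flux difference:

```python
import math

def excess_significance(f_obs, sigma_obs, f_phot, sigma_phot):
    """Significance of an IR excess: (observed - photosphere) / combined uncertainty."""
    return (f_obs - f_phot) / math.hypot(sigma_obs, sigma_phot)

# Hypothetical 18 um fluxes in mJy: observed 120 +/- 8, predicted photosphere 90 +/- 3.
sig = excess_significance(120.0, 8.0, 90.0, 3.0)
print(f"excess significance = {sig:.1f} sigma")   # ~3.5 sigma, a debris-disk candidate
```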

  19. Faint skylines in the near-infrared: observational constraint for IFU instruments

    Science.gov (United States)

    Flores, H.; Rodrigues, M.; Puech, M.; Yang, Y.; Hammer, F.

    2016-08-01

    The amplitude and spatial scale of variations in the skylines can limit telescope performance, because studying extremely faint objects requires a careful correction of the skyline residuals. Using observations from the VLT/KMOS instrument, we have studied the spatial and temporal behaviour of two faint skylines (10 to 80 times fainter than the strongest skyline in the spectral window) and the effect of the skylines on the kinematic maps of distant galaxies, based on nine consecutive exposures of ten minutes each. We found that the flux of the brighter skyline changes rapidly, by 5 to 10% spatially and by up to 15% temporally. For the faint skyline, the fluctuations reach spatial and temporal amplitudes of up to 100%. The effect of the skyline residuals on the velocity field of distant galaxies becomes dramatic when the emission line is faint (equivalent width equal to 15 Å): all the kinematic information is lost, and the shape and centroid of the emission line change from spaxel to spaxel. This preliminary result needs to be extended with further simulations in order to determine the minimum emission-line flux that allows the kinematic information to be recovered at different resolutions, and hence the possible relation between spectral resolution and line flux. Our goal is to determine the best spectral resolution in the infrared for observing distant galaxies with integral field spectrographs, finding the best compromise between spectral resolution and the detection limit of the spectrograph.

  20. A Peculiar Faint Satellite in the Remote Outer Halo of M31

    Science.gov (United States)

    Mackey, A. D.; Huxor, A. P.; Martin, N. F.; Ferguson, A. M. N.; Dotter, A.; McConnachie, A. W.; Ibata, R. A.; Irwin, M. J.; Lewis, G. F.; Sakari, C. M.; Tanvir, N. R.; Venn, K. A.

    2013-06-01

    We present Hubble Space Telescope imaging of a newly discovered faint stellar system, PAndAS-48, in the outskirts of the M31 halo. Our photometry reveals this object to be comprised of an ancient and very metal-poor stellar population with age ≳ 10 Gyr and [Fe/H] ≲ -2.3. Our inferred distance modulus (m - M)0 = 24.57 ± 0.11 confirms that PAndAS-48 is most likely a remote M31 satellite with a three-dimensional galactocentric radius of 149 (+19/-8) kpc. We observe an apparent spread in color on the upper red giant branch that is larger than the photometric uncertainties should allow, and briefly explore the implications of this. Structurally, PAndAS-48 is diffuse, faint, and moderately flattened, with a half-light radius r_h = 26 (+4/-3) pc, integrated luminosity MV = -4.8 ± 0.5, and ellipticity ε = 0.30 (+0.08/-0.15). On the size-luminosity plane it falls between the extended globular clusters seen in several nearby galaxies and the recently discovered faint dwarf satellites of the Milky Way; however, its characteristics do not allow us to unambiguously classify it as either type of system. If PAndAS-48 is a globular cluster then it is among the most elliptical, isolated, and metal-poor of any seen in the Local Group, extended or otherwise. Conversely, while its properties are generally consistent with those observed for the faint Milky Way dwarfs, it would be a factor of ~2-3 smaller in spatial extent than any known counterpart of comparable luminosity. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute (STScI), which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555. These observations are associated with program GO 12515.

  1. A PECULIAR FAINT SATELLITE IN THE REMOTE OUTER HALO OF M31

    International Nuclear Information System (INIS)

    Mackey, A. D.; Dotter, A.; Huxor, A. P.; Martin, N. F.; Ibata, R. A.; Ferguson, A. M. N.; McConnachie, A. W.; Irwin, M. J.; Lewis, G. F.; Sakari, C. M.; Venn, K. A.; Tanvir, N. R.

    2013-01-01

    We present Hubble Space Telescope imaging of a newly discovered faint stellar system, PAndAS-48, in the outskirts of the M31 halo. Our photometry reveals this object to be comprised of an ancient and very metal-poor stellar population with age ≳ 10 Gyr and [Fe/H] ≲ -2.3. Our inferred distance modulus (m - M)0 = 24.57 ± 0.11 confirms that PAndAS-48 is most likely a remote M31 satellite with a three-dimensional galactocentric radius of 149 (+19/-8) kpc. We observe an apparent spread in color on the upper red giant branch that is larger than the photometric uncertainties should allow, and briefly explore the implications of this. Structurally, PAndAS-48 is diffuse, faint, and moderately flattened, with a half-light radius r_h = 26 (+4/-3) pc, integrated luminosity MV = -4.8 ± 0.5, and ellipticity ε = 0.30 (+0.08/-0.15). On the size-luminosity plane it falls between the extended globular clusters seen in several nearby galaxies and the recently discovered faint dwarf satellites of the Milky Way; however, its characteristics do not allow us to unambiguously classify it as either type of system. If PAndAS-48 is a globular cluster then it is among the most elliptical, isolated, and metal-poor of any seen in the Local Group, extended or otherwise. Conversely, while its properties are generally consistent with those observed for the faint Milky Way dwarfs, it would be a factor of ~2-3 smaller in spatial extent than any known counterpart of comparable luminosity.

  2. Hydra II: A Faint and Compact Milky Way Dwarf Galaxy Found in the Survey of the Magellanic Stellar History

    OpenAIRE

    Martin, NF; Nidever, DL; Besla, G; Olsen, K; Walker, AR; Vivas, AK; Gruendl, RA; Kaleida, CC; Muñoz, RR; Blum, RD; Saha, A; Conn, BC; Bell, EF; Chu, YH; Cioni, MRL

    2015-01-01

    © 2015. The American Astronomical Society. All rights reserved. We present the discovery of a new dwarf galaxy, Hydra II, found serendipitously within the data from the ongoing Survey of the Magellanic Stellar History conducted with the Dark Energy Camera on the Blanco 4 m Telescope. The new satellite is compact (rh = 68 ± 11 pc) and faint (MV = -4.8 ± 0.3), but well within the realm of dwarf galaxies. The stellar distribution of Hydra II in the color-magnitude diagram is well-described by a m...

  3. First results from Faint Infrared Grism Survey (FIGS)

    DEFF Research Database (Denmark)

    Tilvi, V.; Pirzkal, N.; Malhotra, S.

    2016-01-01

    in the Faint Infrared Grism Survey (FIGS). These spectra, taken with the G102 grism on the Hubble Space Telescope (HST), show a significant emission line detection (6σ) in multiple observational position angles (PA), with a total integrated Lyα line flux of (1.06 ± 0.12) × 10^-17 erg s^-1 cm^-2. The line flux...... is nearly a factor of four higher than the previous MOSFIRE spectroscopic observations of faint Lyα emission at λ = 1.0347 μm, yielding z = 7.5078 ± 0.0004. This is consistent with other recent observations implying that ground-based near-infrared spectroscopy underestimates total...

  4. Searching for z˜= 6 Objects with Deep ACS/WFC Parallel Observation

    Science.gov (United States)

    Yan, H.; Windhorst, R. A.; Cohen, S. H.

    2002-12-01

    Recent results suggest that z ≈ 6 marks the end of the reionization era. A large sample of objects at z ≈ 6, therefore, will be of enormous importance, as it will enable us to observationally determine the exact epoch of reionization and the sources that are responsible for it. With the HST Advanced Camera for Surveys (ACS) coming on line, we now have a unique opportunity to discover a significant number of objects at z ≈ 6. The pure parallel mode implemented for the Wide Field Camera (WFC) has greatly enhanced this ability. We present our preliminary analysis of a deep ACS/WFC parallel field at |b| = 74.4°. We find 30 plausible z ≈ 6 candidates, all of which have S/N > 7 in the F850LP band. The major sources of contamination could be faint Galactic cool dwarfs, and we estimated that they would contribute at most 4 objects to our candidate list. We derived the cumulative number density of galaxies at z ≈ 6; even with this contamination rate, it could possibly imply that the faint-end slope of the z ≈ 6 luminosity function is steeper than α = -1.6. Given the faintness of these candidates, spectroscopic identification is not feasible until the launch of the NGST, and imaging at longer wavelengths is the only way to further confirm their nature in the next eight years. At the very least, our result suggests that galaxies with L

  5. COSMIC: A Multiobject Spectrograph and Direct Imaging Camera for the 5 Meter Hale Telescope Prime Focus

    Science.gov (United States)

    Kells, W.; Dressler, A.; Sivaramakrishnan, A.; Carr, D.; Koch, E.; Epps, H.; Hilyard, D.; Pardeilhan, G.

    1998-12-01

    We describe the design, construction, and operation of the Carnegie Observatories Spectroscopic Multislit and Imaging Camera (COSMIC) for the prime focus of the Hale 5 m telescope at Palomar Observatory. COSMIC is a reimaging grism spectrograph with a 13.65 arcmin square field of view, which can also be used as a direct imaging camera with a 9.75 arcmin square field of view. The wavelength coverage extends from 350 nm to almost 1 μm; the detector is a thinned, back-illuminated SITe 2048x2048 CCD with high quantum efficiency and excellent cosmetics. Multislit aperture masks are produced photographically, with spectra of up to ~50 objects fitted on a single row of a slit mask. The instrument exhibits very little flexure and uses active thermal control to maintain focus over a wide range of ambient temperature. In direct mode COSMIC is typically used with Kron-Cousins, Gunn, and narrow bandpass filters. The instrument achieves throughputs of greater than 50% for direct imaging and, in spectroscopic mode, a peak efficiency at 5500 Å of slightly better than 24% of the light falling on the 5 m mirror. COSMIC is optimized for faint-object imaging, down to Gunn r = 26 mag, and multiobject spectroscopy, down to r = 23 mag, with typically 30 objects per spectroscopic exposure.

  6. Spectrum from Faint Galaxy IRAS F00183-7111

    Science.gov (United States)

    2003-01-01

    NASA's Spitzer Space Telescope has detected the building blocks of life in the distant universe, albeit in a violent milieu. Training its powerful infrared eye on a faint object located at a distance of 3.2 billion light-years, Spitzer has observed the presence of water and organic molecules in the galaxy IRAS F00183-7111. With an active galactic nucleus, this is one of the most luminous galaxies in the universe, rivaling the energy output of a quasar. Because it is heavily obscured by dust (see visible-light image in the inset), most of its luminosity is radiated at infrared wavelengths. The infrared spectrograph instrument onboard Spitzer breaks light into its constituent colors, much as a prism does for visible light. The image shows a low-resolution spectrum of the galaxy obtained by the spectrograph at wavelengths between 4 and 20 microns. Spectra are graphical representations of a celestial object's unique blend of light. Characteristic patterns, or fingerprints, within the spectra allow astronomers to identify the object's chemical composition and to determine such physical properties as temperature and density. The broad depression in the center of the spectrum denotes the presence of silicates (chemically similar to beach sand) in the galaxy. An emission peak within the bottom of the trough is the chemical signature for molecular hydrogen. The hydrocarbons (orange) are organic molecules comprised of carbon and hydrogen, two of the most common elements on Earth. Since it has taken more than three billion years for the light from the galaxy to reach Earth, it is intriguing to note the presence of organics in a distant galaxy at a time when life is thought to have started forming on our home planet. Additional features in the spectrum reveal the presence of water ice (blue), carbon dioxide ice (green) and carbon monoxide (purple) in both gas and solid forms. The magenta peak corresponds to singly ionized neon gas, a spectral line often used by astronomers as a

  7. The NEAT Camera Project

    Science.gov (United States)

    Newburn, Ray L., Jr.

    1995-01-01

    The NEAT (Near Earth Asteroid Tracking) camera system consists of a camera head with a 6.3 cm square 4096 x 4096 pixel CCD, fast electronics, and a Sun Sparc 20 data and control computer with dual CPUs, 256 Mbytes of memory, and 36 Gbytes of hard disk. The system was designed for optimum use with an Air Force GEODSS (Ground-based Electro-Optical Deep Space Surveillance) telescope. The GEODSS telescopes have 1 m f/2.15 objectives of the Ritchey-Chrétien type, designed originally for satellite tracking. Installation of NEAT began July 25 at the Air Force Facility on Haleakala, a 3000 m peak on Maui in Hawaii.

  8. Short timescale variability in the faint sky variability survey

    NARCIS (Netherlands)

    Morales-Rueda, L.; Groot, P.J.; Augusteijn, T.; Nelemans, G.A.; Vreeswijk, P.M.; Besselaar, E.J.M. van den

    2006-01-01

    We present the V-band variability analysis of the Faint Sky Variability Survey (FSVS). The FSVS combines colour and time variability information, from timescales of 24 minutes to tens of days, down to V = 24. We find that ~1% of all point sources are variable along the main sequence, reaching ~3.5%

  9. Science, conservation, and camera traps

    Science.gov (United States)

    Nichols, James D.; Karanth, K. Ullas; O'Connel, Allan F.; O'Connell, Allan F.; Nichols, James D.; Karanth, K. Ullas

    2011-01-01

    Biologists commonly perceive camera traps as a new tool that enables them to enter the hitherto secret world of wild animals. Camera traps are being used in a wide range of studies dealing with animal ecology, behavior, and conservation. Our intention in this volume is not to simply present the various uses of camera traps, but to focus on their use in the conduct of science and conservation. In this chapter, we provide an overview of these two broad classes of endeavor and sketch the manner in which camera traps are likely to be able to contribute to them. Our main point here is that neither photographs of individual animals, nor detection history data, nor parameter estimates generated from detection histories are the ultimate objective of a camera trap study directed at either science or management. Instead, the ultimate objectives are best viewed as either gaining an understanding of how ecological systems work (science) or trying to make wise decisions that move systems from less desirable to more desirable states (conservation, management). Therefore, we briefly describe here basic approaches to science and management, emphasizing the role of field data and associated analyses in these processes. We provide examples of ways in which camera trap data can inform science and management.

  10. Exploring the Faint End of the Luminosity-Metallicity Relation with Hα Dots

    Science.gov (United States)

    Hirschauer, Alec S.; Salzer, John J.

    2015-01-01

    The well-known correlation between a galaxy's luminosity and its gas-phase oxygen abundance (the luminosity-metallicity (L-Z) relation) offers clues toward our understanding of chemical enrichment histories and evolution. Bright galaxies are comparatively better studied than faint ones, leaving a relative dearth of observational data points to constrain the L-Z relation in the low-luminosity regime. We present high S/N nebular spectroscopy of low-luminosity star-forming galaxies observed with the KPNO 4m using the new KOSMOS spectrograph to derive direct-method metallicities. Our targets are strong point-like emission-line sources discovered serendipitously in continuum-subtracted narrowband images from the ALFALFA Hα survey. Follow-up spectroscopy of these "Hα dots" shows that these objects represent some of the lowest luminosity star-forming systems in the local Universe. Our KOSMOS spectra cover the full optical region and include detection of [O III] λ4363 in roughly a dozen objects. This paper presents some of the first scientific results obtained using this new spectrograph, and demonstrates its capabilities and effectiveness in deriving direct-method metallicities of faint objects.

  11. Infrared-faint radio sources are at high redshifts. Spectroscopic redshift determination of infrared-faint radio sources using the Very Large Telescope

    Science.gov (United States)

    Herzog, A.; Middelberg, E.; Norris, R. P.; Sharp, R.; Spitler, L. R.; Parker, Q. A.

    2014-07-01

    Context. Infrared-faint radio sources (IFRS) are characterised by relatively high radio flux densities and associated faint or even absent infrared and optical counterparts. The resulting extremely high radio-to-infrared flux density ratios up to several thousands were previously known only for high-redshift radio galaxies (HzRGs), suggesting a link between the two classes of object. However, the optical and infrared faintness of IFRS makes their study difficult. Prior to this work, no redshift was known for any IFRS in the Australia Telescope Large Area Survey (ATLAS) fields which would help to put IFRS in the context of other classes of object, especially of HzRGs. Aims: This work aims at measuring the first redshifts of IFRS in the ATLAS fields. Furthermore, we test the hypothesis that IFRS are similar to HzRGs, that they are higher-redshift or dust-obscured versions of these massive galaxies. Methods: A sample of IFRS was spectroscopically observed using the Focal Reducer and Low Dispersion Spectrograph 2 (FORS2) at the Very Large Telescope (VLT). The data were calibrated based on the Image Reduction and Analysis Facility (IRAF) and redshifts extracted from the final spectra, where possible. This information was then used to calculate rest-frame luminosities, and to perform the first spectral energy distribution modelling of IFRS based on redshifts. Results: We found redshifts of 1.84, 2.13, and 2.76, for three IFRS, confirming the suggested high-redshift character of this class of object. These redshifts and the resulting luminosities show IFRS to be similar to HzRGs, supporting our hypothesis. We found further evidence that fainter IFRS are at even higher redshifts. Conclusions: Considering the similarities between IFRS and HzRGs substantiated in this work, the detection of IFRS, which have a significantly higher sky density than HzRGs, increases the number of active galactic nuclei in the early universe and adds to the problems of explaining the formation of

  12. X-ray Counterparts of Infrared Faint Radio Sources

    Science.gov (United States)

    Schartel, Norbert

    2011-10-01

    Infrared Faint Radio Sources (IFRS) are radio sources with extremely faint or even absent infrared emission in deep Spitzer surveys. Models of their spectral energy distributions, the ratios of radio to infrared flux densities, and their steep radio spectra strongly suggest that IFRS are AGN at high redshifts (z ≳ 2). This hypothesis has not yet been tested in the X-ray band, but if it is confirmed, the increased AGN numbers at these redshifts will account for the unresolved part of the X-ray background. The identification of X-ray counterparts of IFRS is considered to be the smoking gun for this hypothesis. We propose to observe 8 IFRS using 30 ks pointed observations. X-ray detections of IFRS with different ratios of radio-to-infrared fluxes will constrain the class-specific SED.

  13. A New System of Faint Near-Infrared Standard Stars

    Science.gov (United States)

    Persson, S. E.; Murphy, D. C.; Krzeminski, W.; Roth, M.; Rieke, M. J.

    1998-11-01

    A new grid of 65 faint near-infrared standard stars is presented. They are spread around the sky, lie between 10th and 12th magnitude at K, and are measured in most cases to precisions better than 0.001 mag in the J, H, K, and K_s bands; the latter is a medium-band modified K. A secondary list of red stars suitable for determining color transformations between photometric systems is also presented.

  14. High-speed holographic camera

    International Nuclear Information System (INIS)

    Novaro, Marc

    The high-speed holographic camera is a diagnostic instrument that uses holography as the information storage medium. It allows us to take 10 holograms of an object, with exposure times of 1.5 ns, separated in time by 1 or 2 ns. In order to obtain these results easily, no moving parts are used in the set-up.

  15. Camera Movement in Narrative Cinema

    DEFF Research Database (Denmark)

    Nielsen, Jakob Isak

    2007-01-01

    Just like art historians have focused on e.g. composition or lighting, this dissertation takes a single stylistic parameter as its object of study: camera movement. Within film studies this localized avenue of middle-level research has become increasingly viable under the aegis of a perspective k...

  16. Star/galaxy separation at faint magnitudes: application to a simulated Dark Energy Survey

    Energy Technology Data Exchange (ETDEWEB)

    Soumagnac, M. T.; Abdalla, F. B.; Lahav, O.; Kirk, D.; Sevilla, I.; Bertin, E.; Rowe, B. T. P.; Annis, J.; Busha, M. T.; Da Costa, L. N.; Frieman, J. A.; Gaztanaga, E.; Jarvis, M.; Lin, H.; Percival, W. J.; Santiago, B. X.; Sabiu, C. G.; Wechsler, R. H.; Wolz, L.; Yanny, B.

    2015-04-14

    We address the problem of separating stars from galaxies in future large photometric surveys. We focus our analysis on simulations of the Dark Energy Survey (DES). In the first part of the paper, we derive the science requirements on star/galaxy separation for measurement of the cosmological parameters with the gravitational weak lensing and large-scale structure probes. These requirements are dictated by the need to control both the statistical and systematic errors on the cosmological parameters, and by point spread function calibration. We formulate the requirements in terms of the completeness and purity provided by a given star/galaxy classifier. In order to achieve these requirements at faint magnitudes, we propose a new method for star/galaxy separation in the second part of the paper. We first use principal component analysis to outline the correlations between the object parameters and to extract from them the most relevant information. We then use the reduced set of parameters as input to an Artificial Neural Network. This multiparameter approach improves upon purely morphometric classifiers (such as the classifier implemented in SExtractor), especially at faint magnitudes: it increases the purity by up to 20 per cent for stars and by up to 12 per cent for galaxies at i-band magnitudes fainter than 23.
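
    The two-stage approach described above (principal component analysis to compress the correlated object parameters, then an artificial neural network classifier evaluated through completeness and purity) can be sketched with standard tools. The data, feature count, and network size below are assumptions for illustration; this is not the DES pipeline itself:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical training set: rows are objects, columns are catalogue parameters
# (magnitudes, sizes, shape measures, ...); labels are 1 = star, 0 = galaxy.
n = 3000
X = np.vstack([rng.normal(0.0, 1.0, size=(n, 12)),     # galaxies
               rng.normal(0.8, 1.0, size=(n, 12))])    # stars, offset in parameter space
y = np.array([0] * n + [1] * n)

# Compress the correlated parameters with PCA, then classify with a small ANN.
clf = make_pipeline(
    StandardScaler(),
    PCA(n_components=6),                                # keep the most informative combinations
    MLPClassifier(hidden_layer_sizes=(20,), max_iter=500, random_state=0),
)
clf.fit(X, y)

# Completeness and purity for the star class, the quantities in which the
# science requirements are formulated.
pred = clf.predict(X)
completeness = (pred[y == 1] == 1).mean()
purity = (y[pred == 1] == 1).mean()
print(f"star completeness = {completeness:.2f}, star purity = {purity:.2f}")
```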

  17. Faint blue counts from formation of dwarf galaxies at z approximately equals 1

    Science.gov (United States)

    Babul, Arif; Rees, Martin J.

    1993-01-01

    The nature of faint blue objects (FBO's) has been a source of much speculation since their detection in deep CCD images of the sky. Their high surface density argues against them being progenitors of present-day bright galaxies, and since they are only weakly clustered on small scales, they cannot be entities that merged together to form present-day galaxies. Babul & Rees (1992) have suggested that the observed faint blue counts may be due to dwarf elliptical galaxies undergoing their initial starburst at z ≈ 1. In generic hierarchical clustering scenarios, however, dwarf galaxy halos (M ≈ 10^9 solar masses) are expected to form at an earlier epoch; for example, typical 10^9 solar mass halos will virialize at z ≈ 2.3 if the power spectrum of the density fluctuations is that of the standard b = 2 cold dark matter (CDM) model. Under 'ordinary conditions' the gas would rapidly cool, collect in the cores, and undergo star formation. Conditions at high redshifts are far from 'ordinary': the intense UV background will prevent the gas in the dwarf halos from cooling, the halos being released from their suspended state only when the UV flux has diminished sufficiently.

  18. Digital airborne camera introduction and technology

    CERN Document Server

    Sandau, Rainer

    2014-01-01

    The last decade has seen great innovations on the airborne camera. This book is the first ever written on the topic and describes all components of a digital airborne camera ranging from the object to be imaged to the mass memory device.

  19. Stereoscopic camera design

    Science.gov (United States)

    Montgomery, David J.; Jones, Christopher K.; Stewart, James N.; Smith, Alan

    2002-05-01

    It is clear from the literature that the majority of work in stereoscopic imaging is directed towards the development of modern stereoscopic displays. As costs come down, wider public interest in this technology is expected to increase. This new technology would require new methods of image formation. Advances in stereo computer graphics will of course lead to the creation of new stereo computer games, graphics in films etc. However, the consumer would also like to see real-world stereoscopic images, pictures of family, holiday snaps etc. Such scenery would have wide ranges of depth to accommodate and would need also to cope with moving objects, such as cars, and in particular other people. Thus, the consumer acceptance of auto/stereoscopic displays and 3D in general would be greatly enhanced by the existence of a quality stereoscopic camera. This paper will cover an analysis of existing stereoscopic camera designs and show that they can be categorized into four different types, with inherent advantages and disadvantages. A recommendation is then made with regard to 3D consumer still and video photography. The paper will go on to discuss this recommendation and describe its advantages and how it can be realized in practice.

  20. Using DSLR cameras in digital holography

    Science.gov (United States)

    Hincapié-Zuluaga, Diego; Herrera-Ramírez, Jorge; García-Sucerquia, Jorge

    2017-08-01

    In Digital Holography (DH), the size of the bidimensional image sensor to record the digital hologram, plays a key role on the performance of this imaging technique; the larger the size of the camera sensor, the better the quality of the final reconstructed image. Scientific cameras with large formats are offered in the market, but their cost and availability limit their use as a first option when implementing DH. Nowadays, DSLR cameras provide an easy-access alternative that is worthwhile to be explored. The DSLR cameras are a wide, commercial, and available option that in comparison with traditional scientific cameras, offer a much lower cost per effective pixel over a large sensing area. However, in the DSLR cameras, with their RGB pixel distribution, the sampling of information is different to the sampling in monochrome cameras usually employed in DH. This fact has implications in their performance. In this work, we discuss why DSLR cameras are not extensively used for DH, taking into account the problem reported by different authors of object replication. Simulations of DH using monochromatic and DSLR cameras are presented and a theoretical deduction for the replication problem using the Fourier theory is also shown. Experimental results of DH implementation using a DSLR camera show the replication problem.
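
    The replication problem mentioned above can be illustrated with a one-dimensional Fourier sketch: keeping only every second pixel, as a single colour channel of a Bayer sensor effectively does, multiplies the hologram by a sampling comb, so its spectrum is convolved with the comb's spectrum and shifted replicas appear. This toy example stands in for the Fourier-theory deduction; it is not the authors' simulation:

```python
import numpy as np

N = 1024
x = np.arange(N)
# A hologram-like carrier fringe pattern with 60 cycles across the sensor.
signal = 1.0 + 0.5 * np.cos(2 * np.pi * 60 * x / N)

# Monochrome sensor: every pixel carries data.
spec_mono = np.abs(np.fft.fft(signal))

# One colour channel of a Bayer sensor: only every second pixel carries data.
comb = np.zeros(N)
comb[::2] = 1.0
spec_bayer = np.abs(np.fft.fft(signal * comb))

# Strongest bins in the first half of each spectrum: the monochrome case shows
# only DC and the fringe frequency (0 and 60); the Bayer-sampled case shows an
# extra replica shifted by N/2 (here at 512 - 60 = 452), the replication effect.
top_mono = sorted(int(k) for k in np.argsort(spec_mono[: N // 2])[-2:])
top_bayer = sorted(int(k) for k in np.argsort(spec_bayer[: N // 2])[-3:])
print(top_mono, top_bayer)   # [0, 60] [0, 60, 452]
```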

  1. DISTRIBUTION OF FAINT ATOMIC GAS IN HICKSON COMPACT GROUPS

    Energy Technology Data Exchange (ETDEWEB)

    Borthakur, Sanchayeeta; Heckman, Timothy M.; Zhu, Guangtun [Department of Physics and Astronomy, Johns Hopkins University, Baltimore, MD 21218 (United States); Yun, Min Su [Astronomy Department, University of Massachusetts, Amherst, MA 01003 (United States); Verdes-Montenegro, Lourdes [Instituto de Astrofísica de Andalucía, CSIC, Apdo. Correos 3004, E-18080 Granada (Spain); Braatz, James A., E-mail: sanch@pha.jhu.edu [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903 (United States)

    2015-10-10

    We present 21 cm H I observations of four Hickson Compact Groups (HCGs) with evidence for a substantial intragroup medium, using the Robert C. Byrd Green Bank Telescope (GBT). By mapping H I emission in a region of 25′ × 25′ (140-650 kpc) surrounding each HCG, these observations provide better estimates of H I masses. In particular, we detected 65% more H I than was detected in the Karl G. Jansky Very Large Array (VLA) imaging of HCG 92. We also determine whether the diffuse gas has the same spatial distribution as the high surface brightness (HSB) H I features detected in the VLA maps of these groups by comparing the H I strengths between the observed and modeled masses based on the VLA maps. We found that the H I observed with the GBT has a similar spatial distribution to the HSB structures in HCG 31 and HCG 68. Conversely, the observed H I distributions in HCG 44 and HCG 92 were extended and showed significant offsets from the modeled masses. Most of the faint gas lies in the northeast-southwest region of HCG 44 and in the northwest region of HCG 92. The spatial and dynamical similarities between the total (faint + HSB) and the HSB H I indicate that the faint gas is of tidal origin. We found that the gas will survive ionization by the cosmic UV background and by the escaping ionizing photons from the star-forming regions, and will stay primarily neutral for at least 500 Myr.

  2. The GOODS UV Legacy Fields: A Full Census of Faint Star-Forming Galaxies at z~0.5-2

    Science.gov (United States)

    Oesch, Pascal

    2014-10-01

    Deep HST imaging has shown that the overall star formation density and UV light density at z>3 is dominated by faint, blue galaxies. Remarkably, very little is known about the equivalent galaxy population at lower redshifts. Understanding how these galaxies evolve across the epoch of peak cosmic star-formation is key to a complete picture of galaxy evolution. While we and others have been making every effort to use existing UV imaging data, a large fraction of the prior data were taken without post-flash and are not photometric. We now propose to obtain a robust legacy dataset for a complete census of faint star-forming galaxies at z~0.5-2, akin to what is achieved at z>3, using the unique capabilities of the WFC3/UVIS camera to obtain very deep UV imaging to 27.5-28.0 mag over the CANDELS Deep fields in GOODS North and South. We directly sample the FUV at z>~0.5 and we make these prime legacy fields for JWST with unique and essential UV/blue HST coverage. Together with the exquisite ancillary multi-wavelength data at high spatial resolution from ACS and WFC3/IR our program will result in accurate photometric redshifts for very faint sources and will enable a wealth of research by the community. This includes tracing the evolution of the FUV luminosity function over the peak of the star formation rate density from z~3 down to z~0.5, measuring the physical properties of sub-L* galaxies, and characterizing resolved stellar populations to decipher the build-up of the Hubble sequence from sub-galactic clumps. The lack of a future UV space telescope makes the acquisition of such legacy data imperative for the JWST era and beyond.

  3. Faint stars in the open cluster Trumpler 16

    International Nuclear Information System (INIS)

    Feinstein, A.

    1982-01-01

    UBVRI photoelectric data on faint stars in the young open cluster Tr 16 in the magnitude range V = 12-15 are presented. Some of these stars are overluminous for their class, and others fall slightly below the reddening line for A0-type stars in the two-color diagram. The problem presented by these stars is discussed; one possibility is that they are at the stage of gravitational contraction. Our new data in the (RI)_KC system are compatible with a deviation from the normal reddening law in the region of Tr 16. Large color excesses for the member stars are found in regions where the nebula appears darker.

  4. Identification and spectrophotometry of faint southern radio galaxies

    International Nuclear Information System (INIS)

    Spinrad, H.; Kron, R.G.; Hunstead, R.W.

    1980-01-01

    We have observed a mixed sample of southern radio sources, identified on the Palomar sky survey or on previous direct plates taken with medium-aperture reflectors. At CTIO we obtained a few deep 4 m photographs and SIT spectrophotometry for redshift and continuum-color measurement. Almost all our sources were faint galaxies; the largest redshift measured was for 3C 275, with z = 0.480. The ultraviolet continuum of PKS 0400-643, a "thermal" galaxy with z = 0.476, closely resembles that of 3C 295 and shows some color evolution in U-B compared to nearby giant ellipticals

  5. The faint radio sky: VLBA observations of the COSMOS field

    Science.gov (United States)

    Herrera Ruiz, N.; Middelberg, E.; Deller, A.; Norris, R. P.; Best, P. N.; Brisken, W.; Schinnerer, E.; Smolčić, V.; Delvecchio, I.; Momjian, E.; Bomans, D.; Scoville, N. Z.; Carilli, C.

    2017-11-01

    Context. Quantifying the fraction of active galactic nuclei (AGN) in the faint radio population and understanding their relation with star-forming activity are fundamental to studies of galaxy evolution. Very long baseline interferometry (VLBI) observations are able to identify AGN above relatively low redshifts (z> 0.1) since they provide milli-arcsecond resolution. Aims: We have created an AGN catalogue from 2865 known radio sources observed in the Cosmic Evolution Survey (COSMOS) field, which has exceptional multi-wavelength coverage. With this catalogue we intend to study the faint radio sky with statistically relevant numbers and to analyse the AGN - host galaxy co-evolution, making use of the large amount of ancillary data available in the field. Methods: Wide-field VLBI observations were made of all known radio sources in the COSMOS field at 1.4 GHz to measure the AGN fraction, in particular in the faint radio population. We describe in detail the observations, data calibration, source detection and flux density measurements, parts of which we have developed for this survey. The combination of number of sources, sensitivity, and area covered with this project are unprecedented. Results: We have detected 468 radio sources, expected to be AGN, with the Very Long Baseline Array (VLBA). This is, to date, the largest sample assembled of VLBI detected sources in the sub-mJy regime. The input sample was taken from previous observations with the Very Large Array (VLA). We present the catalogue with additional optical, infrared and X-ray information. Conclusions: We find a detection fraction of 20 ± 1%, considering only those sources from the input catalogue which were in principle detectable with the VLBA (2361). As a function of the VLA flux density, the detection fraction is higher for higher flux densities, since at high flux densities a source could be detected even if the VLBI core accounts for a small percentage of the total flux density. As a function of

  6. CCD time-resolved photometry of faint cataclysmic variables. III

    Science.gov (United States)

    Howell, Steve B.; Szkody, Paula; Kreidl, Tobias J.; Mason, Keith O.; Puchnarewicz, E. M.

    1990-01-01

    CCD time-resolved photometry in V, B, and near-IR for 17 faint cataclysmic variables (CVs) is presented and analyzed. The data are obtained at Kitt Peak National Observatory, the Perkins reflector, Lowell Observatory, and the Observatorio del Roque de los Muchachos from April-June 1989. The degree of variability and periodicities for the CVs are examined. It is observed that the variability of most of the stars is consistent with CV class behavior. Orbital periods for five CVs are determined, and three potential eclipsing systems are detected.

  7. Sub-Camera Calibration of a Penta-Camera

    Science.gov (United States)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras consisting of a nadir and four inclined cameras are becoming more and more popular, having the advantage of imaging also facades in built up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi platform photogrammetry the inner orientation of the used IGI Penta DigiCAM has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed by program system BLUH. With 4.1 million image points in 314 images respectively 3.9 million image points in 248 images a dense matching was provided by Pix4Dmapper. With up to 19 respectively 29 images per object point the images are well connected, nevertheless the high number of images per object point are concentrated to the block centres while the inclined images outside the block centre are satisfying but not very strongly connected. This leads to very high values for the Student test (T-test) of the finally used additional parameters or in other words, additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration of IGI, but there are still radial symmetric distortions also for the inclined cameras with a size exceeding 5μm even if mentioned as negligible based on the laboratory calibration. Radial and tangential effects of the image corners are limited but still available. Remarkable angular affine systematic image errors can be seen especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but it can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With exception of the angular affinity the systematic image errors for corresponding

  8. SUB-CAMERA CALIBRATION OF A PENTA-CAMERA

    Directory of Open Access Journals (Sweden)

    K. Jacobsen

    2016-03-01

    Full Text Available Penta cameras consisting of a nadir and four inclined cameras are becoming more and more popular, having the advantage of imaging also facades in built up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi platform photogrammetry the inner orientation of the used IGI Penta DigiCAM has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed by program system BLUH. With 4.1 million image points in 314 images respectively 3.9 million image points in 248 images a dense matching was provided by Pix4Dmapper. With up to 19 respectively 29 images per object point the images are well connected, nevertheless the high number of images per object point are concentrated to the block centres while the inclined images outside the block centre are satisfying but not very strongly connected. This leads to very high values for the Student test (T-test of the finally used additional parameters or in other words, additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration of IGI, but there are still radial symmetric distortions also for the inclined cameras with a size exceeding 5μm even if mentioned as negligible based on the laboratory calibration. Radial and tangential effects of the image corners are limited but still available. Remarkable angular affine systematic image errors can be seen especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but it can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With exception of the angular affinity the systematic image errors
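
    The "very high values for the Student test" of the additional (self-calibration) parameters reported in the two records above simply mean that the estimated parameters exceed their adjusted standard deviations by a large factor. A minimal sketch with assumed numbers (the values are illustrative, not taken from the benchmark):

```python
def t_values(estimates, sigmas):
    """Student t statistic of each additional parameter: estimate / standard deviation."""
    return [p / s for p, s in zip(estimates, sigmas)]

# Hypothetical distortion parameters (micrometres at the image corner) and their
# standard deviations from the bundle adjustment.
estimates = [6.2, -3.1, 1.8]
sigmas = [0.15, 0.20, 0.40]
for i, t in enumerate(t_values(estimates, sigmas), start=1):
    print(f"parameter {i}: t = {t:+.1f}  significant: {abs(t) > 3.0}")
```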

  9. Those Nifty Digital Cameras!

    Science.gov (United States)

    Ekhaml, Leticia

    1996-01-01

    Describes digital photography--an electronic imaging technology that merges computer capabilities with traditional photography--and its uses in education. Discusses how a filmless camera works, types of filmless cameras, advantages and disadvantages, and educational applications of the consumer digital cameras. (AEF)

  10. Adapting Virtual Camera Behaviour

    DEFF Research Database (Denmark)

    Burelli, Paolo

    2013-01-01

    In a three-dimensional virtual environment aspects such as narrative and interaction completely depend on the camera since the camera defines the player’s point of view. Most research works in automatic camera control aim to take the control of this aspect from the player to automatically gen...

  11. Managing Syncope in the Elderly: The Not So Simple Faint in Aging Patients.

    Science.gov (United States)

    Solbiati, Monica; Sheldon, Robert; Seifer, Colette

    2016-09-01

    Providing care to the elderly patient with syncope poses problems that are unusual in their complexity. The differential diagnosis is broad, and sorting through it is made more difficult by the relative lack of symptoms surrounding the faint. Indeed, distinguishing faints from falls is often problematic. Many elderly patients are frail and are at risk of trauma if they should have an unprotected faint or fall to the ground. However, not all elderly patients are frail, and definitions of frailty vary. Providing accurate, effective, and appropriate care for the frail elderly patient who faints may require a multidisciplinary approach. Copyright © 2016 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.

  12. Infrared-faint radio sources in the SERVS deep fields. Pinpointing AGNs at high redshift

    Science.gov (United States)

    Maini, A.; Prandoni, I.; Norris, R. P.; Spitler, L. R.; Mignano, A.; Lacy, M.; Morganti, R.

    2016-12-01

    Context. Infrared-faint radio sources (IFRS) represent an unexpected class of objects which are relatively bright at radio wavelengths, but unusually faint at infrared (IR) and optical wavelengths. A recent and extensive campaign on the radio-brightest IFRSs (S1.4 GHz ≳ 10 mJy) has provided evidence that most of them (if not all) contain an active galactic nucleus (AGN). Still uncertain is the nature of the radio-faintest IFRSs (S1.4 GHz ≲ 1 mJy). Aims: The scope of this paper is to assess the nature of the radio-faintest IFRSs, testing their classification and improving the knowledge of their IR properties by making use of the most sensitive IR survey available so far: the Spitzer Extragalactic Representative Volume Survey (SERVS). We also explore how the criteria for IFRSs can be fine-tuned to pinpoint radio-loud AGNs at very high redshift (z > 4). Methods: We analysed a number of IFRS samples identified in SERVS fields, including a new sample (21 sources) extracted from the Lockman Hole. 3.6 and 4.5 μm IR counterparts of the 64 sources located in the SERVS fields were searched for and, when detected, their IR properties were studied. Results: We compared the radio/IR properties of the IR-detected IFRSs with those expected for a number of known classes of objects. We found that IR-detected IFRSs are mostly consistent with a mixture of high-redshift (z ≳ 3) radio-loud AGNs. The faintest ones (S1.4 GHz ≲ 100 μJy), however, could also be associated with nearer (z ≲ 2) dust-enshrouded starburst galaxies. We also argue that, while IFRSs with radio-to-IR ratios >500 can very efficiently pinpoint radio-loud AGNs at redshift 2 < z < 4, lower radio-to-IR ratios (~100-200) are expected for higher redshift radio-loud AGNs.

  13. Radiation camera exposure control

    International Nuclear Information System (INIS)

    Martone, R.J.; Yarsawich, M.; Wolczek, W.

    1976-01-01

    A system and method for governing the exposure of an image generated by a radiation camera to an image sensing camera is disclosed. The exposure is terminated in response to the accumulation of a predetermined quantity of radiation, defining a radiation density, occurring in a predetermined area. An index is produced which represents the value of that quantity of radiation whose accumulation causes the exposure termination. The value of the predetermined radiation quantity represented by the index is sensed so that the radiation camera image intensity can be calibrated to compensate for changes in exposure amounts due to desired variations in radiation density of the exposure, to maintain the detectability of the image by the image sensing camera notwithstanding such variations. Provision is also made for calibrating the image intensity in accordance with the sensitivity of the image sensing camera, and for locating the index for maintaining its detectability and causing the proper centering of the radiation camera image

  14. Do the enigmatic ``Infrared-Faint Radio Sources'' include pulsars?

    Science.gov (United States)

    Hobbs, George; Middelberg, Enno; Norris, Ray; Keith, Michael; Mao, Minnie; Champion, David

    2009-04-01

    The Australia Telescope Large Area Survey (ATLAS) team have surveyed seven square degrees of sky at 1.4GHz. During processing some unexpected infrared-faint radio sources (IFRS sources) were discovered. The nature of these sources is not understood, but it is possible that some of these sources may be pulsars within our own galaxy. We propose to observe the IFRS sources with steep spectral indices using standard search techniques to determine whether or not they are pulsars. A pulsar detection would 1) remove a subset of the IFRS sources from the ATLAS sample so they would not need to be observed with large optical/IR telescopes to find their hosts and 2) be intrinsically interesting as the pulsar would be a millisecond pulsar and/or have an extreme spatial velocity.

  15. Autonomous Multicamera Tracking on Embedded Smart Cameras

    Directory of Open Access Journals (Sweden)

    Bischof Horst

    2007-01-01

    Full Text Available There is currently a strong trend towards the deployment of advanced computer vision methods on embedded systems. This deployment is very challenging since embedded platforms often provide limited resources such as computing performance, memory, and power. In this paper we present a multicamera tracking method on distributed, embedded smart cameras. Smart cameras combine video sensing, processing, and communication on a single embedded device which is equipped with a multiprocessor computation and communication infrastructure. Our multicamera tracking approach focuses on a fully decentralized handover procedure between adjacent cameras. The basic idea is to initiate a single tracking instance in the multicamera system for each object of interest. The tracker follows the supervised object over the camera network, migrating to the camera which observes the object. Thus, no central coordination is required resulting in an autonomous and scalable tracking approach. We have fully implemented this novel multicamera tracking approach on our embedded smart cameras. Tracking is achieved by the well-known CamShift algorithm; the handover procedure is realized using a mobile agent system available on the smart camera network. Our approach has been successfully evaluated on tracking persons at our campus.
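
    The fully decentralised handover described above, with one tracking instance per object that migrates to whichever camera currently observes it, can be sketched as follows. The class names and the one-dimensional fields of view are assumptions for illustration; the actual system runs CamShift trackers and a mobile-agent framework on the smart cameras, neither of which is reproduced here:

```python
from dataclasses import dataclass, field

@dataclass
class Camera:
    """One smart camera node; it only knows its neighbours, not the whole network."""
    name: str
    fov: tuple                                   # (low, high) edge of a 1D corridor it observes
    neighbours: list = field(default_factory=list)

    def sees(self, position) -> bool:
        # Placeholder visibility test; a real node would run its detector/tracker here.
        lo, hi = self.fov
        return lo <= position < hi

@dataclass
class Tracker:
    """A single tracking instance that follows one object across the camera network."""
    obj_id: str
    host: Camera

    def update(self, position):
        if self.host.sees(position):
            return                               # keep tracking locally (e.g. with CamShift)
        # Handover: query only the neighbouring cameras, no central coordinator involved.
        for cam in self.host.neighbours:
            if cam.sees(position):
                print(f"{self.obj_id}: handover {self.host.name} -> {cam.name}")
                self.host = cam
                return
        print(f"{self.obj_id}: lost (no neighbour sees the object)")

# Three cameras covering adjacent corridor segments 0-10, 10-20 and 20-30.
c1 = Camera("cam1", (0, 10))
c2 = Camera("cam2", (10, 20))
c3 = Camera("cam3", (20, 30))
c1.neighbours, c2.neighbours, c3.neighbours = [c2], [c1, c3], [c2]

tracker = Tracker("person-1", c1)
for pos in [3, 8, 12, 19, 24]:                   # the supervised object walks down the corridor
    tracker.update(pos)
```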

  16. The radio spectral energy distribution of infrared-faint radio sources

    Science.gov (United States)

    Herzog, A.; Norris, R. P.; Middelberg, E.; Seymour, N.; Spitler, L. R.; Emonts, B. H. C.; Franzen, T. M. O.; Hunstead, R.; Intema, H. T.; Marvil, J.; Parker, Q. A.; Sirothia, S. K.; Hurley-Walker, N.; Bell, M.; Bernardi, G.; Bowman, J. D.; Briggs, F.; Cappallo, R. J.; Callingham, J. R.; Deshpande, A. A.; Dwarakanath, K. S.; For, B.-Q.; Greenhill, L. J.; Hancock, P.; Hazelton, B. J.; Hindson, L.; Johnston-Hollitt, M.; Kapińska, A. D.; Kaplan, D. L.; Lenc, E.; Lonsdale, C. J.; McKinley, B.; McWhirter, S. R.; Mitchell, D. A.; Morales, M. F.; Morgan, E.; Morgan, J.; Oberoi, D.; Offringa, A.; Ord, S. M.; Prabu, T.; Procopio, P.; Udaya Shankar, N.; Srivani, K. S.; Staveley-Smith, L.; Subrahmanyan, R.; Tingay, S. J.; Wayth, R. B.; Webster, R. L.; Williams, A.; Williams, C. L.; Wu, C.; Zheng, Q.; Bannister, K. W.; Chippendale, A. P.; Harvey-Smith, L.; Heywood, I.; Indermuehle, B.; Popping, A.; Sault, R. J.; Whiting, M. T.

    2016-10-01

    Context. Infrared-faint radio sources (IFRS) are a class of radio-loud (RL) active galactic nuclei (AGN) at high redshifts (z ≥ 1.7) that are characterised by their relative infrared faintness, resulting in enormous radio-to-infrared flux density ratios of up to several thousand. Aims: Because of their optical and infrared faintness, it is very challenging to study IFRS at these wavelengths. However, IFRS are relatively bright in the radio regime with 1.4 GHz flux densities of a few to a few tens of mJy. Therefore, the radio regime is the most promising wavelength regime in which to constrain their nature. We aim to test the hypothesis that IFRS are young AGN, particularly GHz peaked-spectrum (GPS) and compact steep-spectrum (CSS) sources that have a low frequency turnover. Methods: We use the rich radio data set available for the Australia Telescope Large Area Survey fields, covering the frequency range between 150 MHz and 34 GHz with up to 19 wavebands from different telescopes, and build radio spectral energy distributions (SEDs) for 34 IFRS. We then study the radio properties of this class of object with respect to turnover, spectral index, and behaviour towards higher frequencies. We also present the highest-frequency radio observations of an IFRS, observed with the Plateau de Bure Interferometer at 105 GHz, and model the multi-wavelength and radio-far-infrared SED of this source. Results: We find IFRS usually follow single power laws down to observed frequencies of around 150 MHz. Mostly, the radio SEDs are steep, and IFRS show statistically significantly steeper radio SEDs than the broader RL AGN population. Our analysis reveals that the fractions of GPS and CSS sources in the population of IFRS are consistent with the fractions in the broader RL AGN population. We find that at least some IFRS contain young AGN, although the fraction might be significantly higher as suggested by the steep SEDs and the compact morphology of IFRS. The detailed multi
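
    A radio SED that "follows a single power law" across many bands can be characterised by a straight-line fit in log-log space, whose slope is the spectral index α in S_ν ∝ ν^α. The flux densities below are made up for illustration and are not measurements from the paper:

```python
import numpy as np

# Hypothetical multi-band flux densities of one IFRS: frequency in GHz, flux density in mJy.
freq = np.array([0.15, 0.325, 0.61, 1.4, 2.1, 4.8, 8.6])
flux = np.array([95.0, 42.0, 21.0, 7.8, 4.6, 1.6, 0.75])

# Least-squares straight line in log-log space: log S = alpha * log nu + const.
alpha, const = np.polyfit(np.log10(freq), np.log10(flux), 1)
scatter = np.std(np.log10(flux) - (alpha * np.log10(freq) + const))
print(f"alpha = {alpha:.2f}, rms scatter = {scatter:.3f} dex")
# A steep slope with small scatter and no flattening at 150 MHz is what the
# record describes as a single steep power law without a low-frequency turnover.
```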

  17. Solid state video cameras

    CERN Document Server

    Cristol, Y

    2013-01-01

    Solid State Video Cameras reviews the state of the art in the field of solid-state television cameras as compiled from patent literature. Organized into 10 chapters, the book begins with the basic array types of solid-state imagers and appropriate read-out circuits and methods. Documents relating to improvement of picture quality, such as spurious signal suppression, uniformity correction, or resolution enhancement, are also cited. The last part considers solid-state color cameras.

  18. LSST Camera Optics Design

    Energy Technology Data Exchange (ETDEWEB)

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated in with the optical design of telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  19. Wavefront analysis for plenoptic camera imaging

    International Nuclear Information System (INIS)

    Luan Yin-Sen; Xu Bing; Yang Ping; Tang Guo-Mao

    2017-01-01

    The plenoptic camera is a single lens stereo camera which can retrieve the direction of light rays while detecting their intensity distribution. In this paper, to reveal more truths of plenoptic camera imaging, we present the wavefront analysis for the plenoptic camera imaging from the angle of physical optics but not from the ray tracing model of geometric optics. Specifically, the wavefront imaging model of a plenoptic camera is analyzed and simulated by scalar diffraction theory and the depth estimation is redescribed based on physical optics. We simulate a set of raw plenoptic images of an object scene, thereby validating the analysis and derivations and the difference between the imaging analysis methods based on geometric optics and physical optics are also shown in simulations. (paper)
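
    Scalar-diffraction (wavefront) modelling of the kind referred to above is commonly done by numerical propagation of a complex field, for example with the angular spectrum method. The sketch below propagates a plane wave through a circular aperture; it is a generic scalar-diffraction example under assumed parameters, not the authors' plenoptic-camera simulation:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel, distance):
    """Propagate a complex field over `distance` using the angular spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pixel)                        # spatial frequencies (cycles/m)
    fxx, fyy = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * fxx) ** 2 - (wavelength * fyy) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    h = np.exp(1j * kz * distance) * (arg > 0)             # free-space transfer function
    return np.fft.ifft2(np.fft.fft2(field) * h)

# Plane wave through a 0.4 mm circular aperture, propagated by 5 mm.
n, pixel, wavelength = 512, 4e-6, 633e-9                   # 4 um pixels, HeNe wavelength
y, x = np.indices((n, n)) - n // 2
aperture = (np.hypot(x, y) * pixel < 0.2e-3).astype(complex)
out = angular_spectrum_propagate(aperture, wavelength, pixel, 5e-3)
print(f"peak / mean diffracted intensity: {np.abs(out).max()**2 / (np.abs(out)**2).mean():.1f}")
```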

  20. Lessons Learned from Crime Caught on Camera

    Science.gov (United States)

    Bernasco, Wim

    2018-01-01

    Objectives: The widespread use of camera surveillance in public places offers criminologists the opportunity to systematically and unobtrusively observe crime, their main subject matter. The purpose of this essay is to inform the reader of current developments in research on crimes caught on camera. Methods: We address the importance of direct observation of behavior and review criminological studies that used observational methods, with and without cameras, including the ones published in this issue. We also discuss the uses of camera recordings in other social sciences and in biology. Results: We formulate six key insights that emerge from the literature and make recommendations for future research. Conclusions: Camera recordings of real-life crime are likely to become part of the criminological tool kit that will help us better understand the situational and interactional elements of crime. Like any source, it has limitations that are best addressed by triangulation with other sources. PMID:29472728

  1. Lessons Learned from Crime Caught on Camera

    DEFF Research Database (Denmark)

    Lindegaard, Marie Rosenkrantz; Bernasco, Wim

    2018-01-01

    Objectives: The widespread use of camera surveillance in public places offers criminologists the opportunity to systematically and unobtrusively observe crime, their main subject matter. The purpose of this essay is to inform the reader of current developments in research on crimes caught on camera....... Methods: We address the importance of direct observation of behavior and review criminological studies that used observational methods, with and without cameras, including the ones published in this issue. We also discuss the uses of camera recordings in other social sciences and in biology. Results: We...... formulate six key insights that emerge from the literature and make recommendations for future research. Conclusions: Camera recordings of real-life crime are likely to become part of the criminological tool kit that will help us better understand the situational and interactional elements of crime. Like...

  2. The radio spectral energy distribution of infrared-faint radio sources

    NARCIS (Netherlands)

    Herzog, A.; Norris, R. P.; Middelberg, E.; Seymour, N.; Spitler, L. R.; Emonts, B. H. C.; Franzen, T. M. O.; Hunstead, R.; Intema, H. T.; Marvil, J.; Parker, Q. A.; Sirothia, S. K.; Hurley-Walker, N.; Bell, M.; Bernardi, G.; Bowman, J. D.; Briggs, F.; Cappallo, R. J.; Callingham, J. R.; Deshpande, A. A.; Dwarakanath, K. S.; For, B. -Q; Greenhill, L. J.; Hancock, P.; Hazelton, B. J.; Hindson, L.; Johnston-Hollitt, M.; Kapińska, A. D.; Kaplan, D. L.; Lenc, E.; Lonsdale, C. J.; McKinley, B.; McWhirter, S. R.; Mitchell, D. A.; Morales, M. F.; Morgan, E.; Morgan, J.; Oberoi, D.; Offringa, A.; Ord, S. M.; Prabu, T.; Procopio, P.; Udaya Shankar, N.; Srivani, K. S.; Staveley-Smith, L.; Subrahmanyan, R.; Tingay, S. J.; Wayth, R. B.; Webster, R. L.; Williams, A.; Williams, C. L.; Wu, C.; Zheng, Q.; Bannister, K. W.; Chippendale, A. P.; Harvey-Smith, L.; Heywood, I.; Indermuehle, B.; Popping, A.; Sault, R. J.; Whiting, M. T.

    2016-01-01

    Context. Infrared-faint radio sources (IFRS) are a class of radio-loud (RL) active galactic nuclei (AGN) at high redshifts (z ≥ 1.7) that are characterised by their relative infrared faintness, resulting in enormous radio-to-infrared flux density ratios of up to several thousand. Aims: Because of

  3. Optical and near-infrared imaging of faint Gigahertz Peaked Spectrum sources

    NARCIS (Netherlands)

    Snellen, IAG; Schilizzi, RT; de Bruyn, AG; Miley, GK; Rottgering, HJA; McMahon, RG; Fournon, IP

    1998-01-01

    A sample of 47 faint Gigahertz Peaked Spectrum (GPS) radio sources selected from the Westerbork Northern Sky Survey (WENSS) has been imaged in the optical and near-infrared, resulting in an identification fraction of 87 per cent. The R - I and R - K colours of the faint optical counterparts are as

  4. External pneumatic compression device prevents fainting in standing weight-bearing MRI

    DEFF Research Database (Denmark)

    Hansen, Bjarke Brandt; Bouert, Rasmus; Bliddal, Henning

    2013-01-01

    To investigate if a peristaltic external pneumatic compression device attached to the legs, while scanning, can reduce a substantial risk of fainting in standing weight-bearing magnetic resonance imaging (MRI).

  5. The Detection of Faint Space Objects Using Solid State Imaging Detectors.

    Science.gov (United States)

    1983-12-31

    ...are composed of baryonic matter. New arguments were presented against halos being composed of planets or asteroids. D. Hegyi was also invited to... being made up of baryonic matter. 5.0 THE CHARGE-COUPLED DEVICE IMAGING SYSTEM. Our major hardware improvement during the past year is a stainless steel... Hegyi, Department of Physics, University of Michigan, Ann Arbor, Michigan. ABSTRACT: The problems with massive halos being composed of baryonic matter are

  6. Track-Before-Detect Algorithm for Faint Moving Objects based on Random Sampling and Consensus

    Science.gov (United States)

    2014-09-01

    Histogram Probabilistic Multi-Hypothesis Tracker [Streit 2000]. Davey et al. showed that at 3 dB peak SNR, the algorithms began to show degraded performance... a highly elliptical orbit with the apogee at 34,694 km. It was observed at a range of 13,865.44 km, so it was moving in the staring FOV. The solar

  7. Optimization of Camera Parameters in Volume Intersection

    Science.gov (United States)

    Sakamoto, Sayaka; Shoji, Kenji; Toyama, Fubito; Miyamichi, Juichi

    Volume intersection is one of the simplest techniques for reconstructing a 3-D shape from 2-D silhouettes. 3-D shapes can be reconstructed from multiple-view images by back-projecting them from the corresponding viewpoints and intersecting the resulting solid cones. The camera position and orientation (extrinsic camera parameters) of each viewpoint with respect to the object are needed to accomplish the reconstruction. However, even a small variation in the camera parameters makes the reconstructed 3-D shape smaller than that obtained with the exact parameters. The camera-parameter optimization problem is that of determining the exact parameters from multiple silhouette images and approximate initial values. This paper examines attempts to optimize the camera parameters by reconstructing a 3-D shape via volume intersection and then maximizing the volume of that shape. We have tested the proposed method for optimizing the camera parameters using a VRML model. In the experiments we apply the downhill simplex method for the optimization. The results show that the maximized volume of the reconstructed 3-D shape is a usable criterion for optimizing camera parameters in camera arrangements like the one in this experiment.
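    The method sketched in the abstract can be written down compactly: carve a visual hull on a voxel grid, count the surviving voxels, and let the downhill simplex (Nelder-Mead) search adjust the extrinsic parameters to maximise that count. The Python sketch below is only an illustration under assumed conventions (per-camera six-parameter pose corrections, a simple pinhole projection); it is not the authors' code.

      import numpy as np
      from scipy.optimize import minimize

      def rotation(rx, ry, rz):
          """Rotation matrix from Euler angles in radians (assumed Z-Y-X order)."""
          cx, sx, cy, sy, cz, sz = np.cos(rx), np.sin(rx), np.cos(ry), np.sin(ry), np.cos(rz), np.sin(rz)
          Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
          Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
          Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
          return Rz @ Ry @ Rx

      def hull_volume(params, voxels, silhouettes, K, nominal_poses, voxel_vol):
          """Volume of the visual hull: voxels whose projections lie inside every silhouette."""
          keep = np.ones(len(voxels), dtype=bool)
          for i, (sil, (R0, t0)) in enumerate(zip(silhouettes, nominal_poses)):
              d = params[6 * i:6 * i + 6]                 # per-camera pose correction (assumed form)
              R = rotation(*d[:3]) @ R0
              t = t0 + d[3:]
              cam = (R @ voxels.T).T + t                  # world -> camera frame
              ok = cam[:, 2] > 0                          # only points in front of the camera
              u = np.zeros(len(voxels), dtype=int)
              v = np.zeros(len(voxels), dtype=int)
              uvw = (K @ cam[ok].T).T
              u[ok] = (uvw[:, 0] / uvw[:, 2]).astype(int)
              v[ok] = (uvw[:, 1] / uvw[:, 2]).astype(int)
              ok &= (u >= 0) & (u < sil.shape[1]) & (v >= 0) & (v < sil.shape[0])
              inside = np.zeros(len(voxels), dtype=bool)
              inside[ok] = sil[v[ok], u[ok]]
              keep &= inside
          return keep.sum() * voxel_vol

      # Downhill simplex maximises the hull volume by minimising its negative,
      # starting from zero pose corrections (voxels, silhouettes, K, poses, dv supplied by the caller):
      # res = minimize(lambda p: -hull_volume(p, voxels, silhouettes, K, poses, dv),
      #                x0=np.zeros(6 * len(silhouettes)), method="Nelder-Mead")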

  8. Scintillation camera with second order resolution

    International Nuclear Information System (INIS)

    Muehllehner, G.

    1976-01-01

    A scintillation camera for use in radioisotope imaging to determine the concentration of radionuclides in a two-dimensional area is described, in which means are provided for second-order positional resolution. The phototubes, which normally provide only a single order of resolution, are modified to provide second-order positional resolution of radiation within an object positioned for viewing by the scintillation camera. The phototubes are modified in that multiple anodes are provided to receive signals from the photocathode, such that each anode is particularly responsive to photoemissions from a limited portion of the photocathode. The resolution of radioactive events appearing at the output of this scintillation camera is thereby improved.

  9. The MVACS Robotic Arm Camera

    Science.gov (United States)

    Keller, H. U.; Hartwig, H.; Kramm, R.; Koschny, D.; Markiewicz, W. J.; Thomas, N.; Fernades, M.; Smith, P. H.; Reynolds, R.; Lemmon, M. T.; Weinberg, J.; Marcialis, R.; Tanner, R.; Boss, B. J.; Oquest, C.; Paige, D. A.

    2001-08-01

    The Robotic Arm Camera (RAC) is one of the key instruments newly developed for the Mars Volatiles and Climate Surveyor payload of the Mars Polar Lander. This lightweight instrument employs a front lens with variable focus range and takes images at distances from 11 mm (image scale 1:1) to infinity. Color images with a resolution of better than 50 μm can be obtained to characterize the Martian soil. Spectral information of nearby objects is retrieved through illumination with blue, green, and red lamp sets. The design and performance of the camera are described in relation to the science objectives and operation. The RAC uses the same CCD detector array as the Surface Stereo Imager and shares the readout electronics with this camera. The RAC is mounted at the wrist of the Robotic Arm and can characterize the contents of the scoop, the samples of soil fed to the Thermal Evolved Gas Analyzer, the Martian surface in the vicinity of the lander, and the interior of trenches dug out by the Robotic Arm. It can also be used to take panoramic images and to retrieve stereo information with an effective baseline surpassing that of the Surface Stereo Imager by about a factor of 3.

  10. Extended Schmidt law holds for faint dwarf irregular galaxies

    Science.gov (United States)

    Roychowdhury, Sambit; Chengalur, Jayaram N.; Shi, Yong

    2017-12-01

    Context. The extended Schmidt law (ESL) is a variant of the Schmidt law, which relates the surface densities of gas and star formation, with the surface density of stellar mass added as an extra parameter. Although the ESL has been shown to be valid for a wide range of galaxy properties, its validity in low-metallicity galaxies has not been comprehensively tested. This is important because metallicity affects the crucial atomic-to-molecular transition step in the process of conversion of gas to stars. Aims: We empirically investigate for the first time whether low-metallicity faint dwarf irregular galaxies (dIrrs) from the local universe follow the ESL. Here we consider the "global" law where surface densities are averaged over the galactic discs. dIrrs are unique not only because they are at the lowest end of mass and star formation scales for galaxies, but also because they are metal-poor compared to the general population of galaxies. Methods: Our sample is drawn from the Faint Irregular Galaxy GMRT Survey (FIGGS), which is the largest survey of atomic hydrogen in such galaxies. The gas surface densities are determined using their atomic hydrogen content. The star formation rates are calculated using GALEX far-ultraviolet fluxes after correcting for dust extinction, whereas the stellar surface densities are calculated using Spitzer 3.6 μm fluxes. The surface densities are calculated over the stellar discs defined by the 3.6 μm images. Results: We find dIrrs indeed follow the ESL. The mean deviation of the FIGGS galaxies from the relation is 0.01 dex, with a scatter around the relation of less than half that seen in the original relation. In comparison, we also show that the FIGGS galaxies are much more deviant when compared to the "canonical" Kennicutt-Schmidt relation. Conclusions: Our results help strengthen the universality of the ESL, especially for galaxies with low metallicities. We suggest that models of star formation in which feedback from previous generations
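    Schematically, the two relations compared in this study can be written as follows; the half-power dependence on the stellar term and the exponents N and n follow the form used in the cited literature, not numbers given in this abstract:

      \Sigma_{\mathrm{SFR}} \propto \left(\Sigma_{\star}^{0.5}\,\Sigma_{\mathrm{gas}}\right)^{N} \quad \text{(extended Schmidt law)}, \qquad
      \Sigma_{\mathrm{SFR}} \propto \Sigma_{\mathrm{gas}}^{\,n} \quad \text{(Kennicutt--Schmidt law)}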

  11. The first VLBI image of an infrared-faint radio source

    Science.gov (United States)

    Middelberg, E.; Norris, R. P.; Tingay, S.; Mao, M. Y.; Phillips, C. J.; Hotan, A. W.

    2008-11-01

    Context: We investigate the joint evolution of active galactic nuclei and star formation in the Universe. Aims: In the 1.4 GHz survey of the Chandra Deep Field South and the European Large Area ISO Survey - S1 with the Australia Telescope Compact Array, we have identified a class of objects which are strong in the radio but have no detectable infrared and optical counterparts. This class has been called Infrared-Faint Radio Sources, or IFRS. 53 sources out of 2002 have been classified as IFRS. It is not known what these objects are. Methods: To address the many possible explanations of the nature of these objects, we have observed four sources with the Australian Long Baseline Array. Results: We have detected and imaged one of the four sources observed. Assuming that the source is at a high redshift, we find its properties in agreement with the properties of Compact Steep Spectrum sources. However, due to the lack of optical and infrared data, the constraints are not particularly strong.

  12. The Faint Young Sun Paradox: A Simplified Thermodynamic Approach

    Directory of Open Access Journals (Sweden)

    F. Angulo-Brown

    2012-01-01

    Full Text Available Classical models of the Sun suggest that the energy output in the early stage of its evolution was 30 percent less than today. In this context, radiative balance alone between the Sun and the Earth was not sufficient to explain the early presence of liquid water on Earth's surface. This difficulty is called the faint young Sun paradox. Many proposals have been published to solve this paradox. In the present work, we propose an oversimplified finite-time thermodynamic approach that describes the air convective cells in the Earth's atmosphere. This model introduces two atmospheric modes of thermodynamic performance: a first mode consisting of the maximization of the power output of the convective cells (the maximum power regime) and a second mode that consists of maximizing a functional representing a good trade-off between power output and entropy production (the ecological regime). Within the assumptions of this oversimplified model, we present different scenarios of albedo and greenhouse effects that seem realistic to preserve liquid water on the Earth in the early stage of formation.
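    The radiative-balance statement above corresponds to the standard zero-dimensional estimate of the effective temperature; with the solar constant reduced by 30 percent and a present-day albedo of about 0.3, it falls well below the freezing point of water (these are textbook illustration numbers, not values from this paper):

      T_e = \left[\frac{S\,(1-A)}{4\sigma}\right]^{1/4}, \qquad
      T_e \approx 255\ \mathrm{K}\ \text{for}\ S = S_0, \qquad
      T_e \approx 0.7^{1/4} \times 255\ \mathrm{K} \approx 233\ \mathrm{K}\ \text{for}\ S = 0.7\,S_0 .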

  13. Helium shells and faint emission lines from slitless flash spectra.

    Science.gov (United States)

    Bazin, Cyril; Koutchmy, Serge

    2013-05-01

    During the last two total solar eclipses, on August 1st, 2008 in Siberia and July 11th, 2010 in French Polynesia, high-frame-rate CCD flash spectra were obtained. These eclipses occurred during and just after a quiet-Sun period. The slitless flash spectra show two helium shells, in the weak Paschen-α 4686 Å line of ionized helium (He II) and in the neutral helium (He I) line at 4713 Å. The extensions of these helium shells are typically 3 Mm; in prominences, the interface with the corona is much more extended. The observations and analysis of these lines can properly be done only in eclipse conditions, when the intensity threshold reaches the coronal level and the parasitic scattered light is virtually zero. Below 1 Mm above the limb, many faint low-FIP lines were also seen in emission. These emission lines are superposed on the continuum, which contains absorption lines. The solar limb can be defined using the weak continuum appearing between the emission lines at the time of the second and third contacts. The variations of the singly ionized iron line, the He I and He II lines, and the continuum intensity are analyzed. The intensity ratio of ionized to neutral helium is studied to evaluate the ionization rate in the low layers up to 2 Mm and also around a prominence.

  14. SUPERNOVA 2003ie WAS LIKELY A FAINT TYPE IIP EVENT

    Energy Technology Data Exchange (ETDEWEB)

    Arcavi, Iair; Gal-Yam, Avishay [Department of Particle Physics and Astrophysics, The Weizmann Institute of Science, Rehovot 76100 (Israel); Sergeev, Sergey G., E-mail: iair.arcavi@weizmann.ac.il [Crimean Astrophysical Observatory, P/O Nauchny, Crimea 98409 (Ukraine)

    2013-04-15

    We present new photometric observations of supernova (SN) 2003ie starting one month before discovery, obtained serendipitously while observing its host galaxy. With only a weak upper limit derived on the mass of its progenitor (<25 M☉) from previous pre-explosion studies, this event could be a potential exception to the "red supergiant (RSG) problem" (the lack of high-mass RSGs exploding as Type IIP SNe). However, this is true only if SN2003ie was a Type IIP event, something which has never been determined. Using recently derived core-collapse SN light-curve templates, as well as by comparison to other known SNe, we find that SN2003ie was indeed a likely Type IIP event. However, with a plateau magnitude of ≈ −15.5 mag, it is found to be a member of the faint Type IIP class. Previous members of this class have been shown to arise from relatively low-mass progenitors (<12 M☉). It therefore seems unlikely that this SN had a massive RSG progenitor. The use of core-collapse SN light-curve templates is shown to be helpful in classifying SNe with sparse coverage. These templates are likely to become more robust as large homogeneous samples of core-collapse events are collected.

  15. Carbon Dioxide Cycling, Climate, Impacts, and the Faint Young Sun

    Science.gov (United States)

    Zahnle, K. J.; Sleep, H. H.

    1999-01-01

    Evidence for relatively mild climates on ancient Earth and Mars has been a puzzle in light of the faint early sun. The geologic evidence, although far from conclusive, would appear to indicate that the surfaces of both planets were, if anything, warmer ca. 3-4 Ga than they are now. The astrophysical argument that the sun ought to have brightened approx. 30% since it reached the main sequence is hard to refute. There results a paradox between the icehouse we expect and the greenhouse we think we see. The usual fix has been to posit massive CO2 atmospheres, although reduced gases (e.g., NH3 or CH4 ) have had their partisans. Evidence against siderite in paleosols dated 2.2-2.75 Ga sets a rough upper limit of 30 PAL (present atmospheric levels) on pCO2 at that time. This is an order of magnitude short of what is needed to defeat the fainter sun. We present here an independent argument against high pCO2 on early Earth that applies not only to the Archean but yet more forcefully to the Hadean era. Additional information is contained in the original extended abstract.

  16. No climate paradox under the faint early Sun.

    Science.gov (United States)

    Rosing, Minik T; Bird, Dennis K; Sleep, Norman H; Bjerrum, Christian J

    2010-04-01

    Environmental niches in which life first emerged and later evolved on the Earth have undergone dramatic changes in response to evolving tectonic/geochemical cycles and to biologic interventions, as well as increases in the Sun's luminosity of about 25 to 30 per cent over the Earth's history. It has been inferred that the greenhouse effect of atmospheric CO2 and/or CH4 compensated for the lower solar luminosity and dictated an Archaean climate in which liquid water was stable in the hydrosphere. Here we demonstrate, however, that the mineralogy of Archaean sediments, particularly the ubiquitous presence of mixed-valence Fe(II-III) oxides (magnetite) in banded iron formations, is inconsistent with such high concentrations of greenhouse gases and the metabolic constraints of extant methanogens. Prompted by this, and the absence of geologic evidence for very high greenhouse-gas concentrations, we hypothesize that a lower albedo on the Earth, owing to considerably less continental area and to the lack of biologically induced cloud condensation nuclei, made an important contribution to moderating surface temperature in the Archaean eon. Our model calculations suggest that the lower albedo of the early Earth provided environmental conditions above the freezing point of water, thus alleviating the need for extreme greenhouse-gas concentrations to satisfy the faint early Sun paradox.

  17. Galaxy evolution and faint counts in the near-infrared.

    Science.gov (United States)

    Rocca-Volmerange, B.; Fioc, M.

    The contribution of distant galaxies to the diffuse near-infrared and submillimetre extragalactic backgrounds can be predicted with the help of multispectral modelling of the faint galaxy counts. In particular, star-forming galaxies have to be taken into account as well as old evolved galaxies, implying a coherent simulation of the stellar emission from the blue to the near-infrared with gas and dust contributions. For this reason, previous UV and visible models are extended with a detailed population synthesis model for strong, short-timescale starbursts in a dusty medium in the near-IR (Lançon and Rocca-Volmerange, 1995), using a spectral library of stars from 1.4 to 2.5 μm observed with the FTS instrument at the 3.60-m CFHT and fitted to starburst spectra observed with the same instrument. The authors' atlas of synthetic galaxies was then extended, with the near-IR (K band) emission carefully normalised to the blue (B, V or J+ bands) emission. Galaxy counts in the K band are modelled in various cosmologies. The comparison with observations raises questions to be discussed together with results from the visible counts.

  18. Educational Applications for Digital Cameras.

    Science.gov (United States)

    Cavanaugh, Terence; Cavanaugh, Catherine

    1997-01-01

    Discusses uses of digital cameras in education. Highlights include advantages and disadvantages, digital photography assignments and activities, camera features and operation, applications for digital images, accessory equipment, and comparisons between digital cameras and other digitizers. (AEF)

  19. Automatic camera tracking for remote manipulators

    International Nuclear Information System (INIS)

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-04-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables
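    The geometry described above reduces to two small computations: transform the tracked point into the camera base frame with a 4 x 4 homogeneous matrix and read off the pan and tilt angles, then drive the motors only when the pointing error leaves the deadband. The Python sketch below illustrates this under assumed axis conventions, gains, and rate limits; it is not the ORNL implementation.

      import numpy as np

      def required_pan_tilt(T_cam_world, p_world):
          """Pan and tilt (degrees) that point the camera at a world point.
          T_cam_world is the 4x4 homogeneous transform from world to camera-base frame;
          the base frame is assumed x-forward, y-left, z-up."""
          p = T_cam_world @ np.append(p_world, 1.0)     # target in camera-base coordinates
          x, y, z = p[:3]
          pan = np.arctan2(y, x)                        # rotation about the base's vertical axis
          tilt = np.arctan2(z, np.hypot(x, y))          # elevation above the base's horizontal plane
          return np.degrees(pan), np.degrees(tilt)

      def deadband_command(error_deg, deadband=2.0, gain=0.5, max_rate=10.0):
          """Linear-plus-bang-bang style drive command: no motion inside the deadband,
          a proportional response outside it, saturating at an assumed actuator rate limit."""
          if abs(error_deg) <= deadband:
              return 0.0
          return float(np.clip(gain * error_deg, -max_rate, max_rate))

      # Usage: compare the required pan/tilt with the current encoder readings and
      # send deadband_command(required - current) to each axis on every control cycle.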

  20. First results from the TOPSAT camera

    Science.gov (United States)

    Greenway, Paul; Tosh, Ian; Morris, Nigel; Burton, Gary; Cawley, Steve

    2017-11-01

    The TopSat camera is a low cost remote sensing imager capable of producing 2.5 metre resolution panchromatic imagery, funded by the British National Space Centre's Mosaic programme. The instrument was designed and assembled at the Space Science & Technology Department of the CCLRC's Rutherford Appleton Laboratory (RAL) in the UK, and was launched on the 27th October 2005 from Plesetsk Cosmodrome in Northern Russia on a Kosmos-3M. The camera utilises an off-axis three mirror system, which has the advantages of excellent image quality over a wide field of view, combined with a compactness that makes its overall dimensions smaller than its focal length. Keeping the costs to a minimum has been a major design driver in the development of this camera. The camera is part of the TopSat mission, which is a collaboration between four UK organisations; QinetiQ, Surrey Satellite Technology Ltd (SSTL), RAL and Infoterra. Its objective is to demonstrate provision of rapid response high resolution imagery to fixed and mobile ground stations using a low cost minisatellite. The paper "Development of the TopSat Camera" presented by RAL at the 5th ICSO in 2004 described the opto-mechanical design, assembly, alignment and environmental test methods implemented. Now that the spacecraft is in orbit and successfully acquiring images, this paper presents the first results from the camera and makes an initial assessment of the camera's in-orbit performance.

  1. Automatic camera tracking for remote manipulators

    Energy Technology Data Exchange (ETDEWEB)

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-07-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.

  2. Automatic camera tracking for remote manipulators

    Energy Technology Data Exchange (ETDEWEB)

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-04-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.

  3. Towards next generation 3D cameras

    Science.gov (United States)

    Gupta, Mohit

    2017-03-01

    We are in the midst of a 3D revolution. Robots enabled by 3D cameras are beginning to autonomously drive cars, perform surgeries, and manage factories. However, when deployed in the real world, these cameras face several challenges that prevent them from measuring 3D shape reliably. These challenges include large lighting variations (bright sunlight to dark night), the presence of scattering media (fog, body tissue), and optically complex materials (metal, plastic). Due to these factors, 3D imaging is often the bottleneck in the widespread adoption of several key robotics technologies. I will talk about our work on developing 3D cameras based on time-of-flight and active triangulation that address these long-standing problems. This includes designing `all-weather' cameras that can perform high-speed 3D scanning in harsh outdoor environments, as well as cameras that recover the shape of objects with challenging material properties. These cameras are, for the first time, capable of measuring detailed (<100 micron resolution) scans in extremely demanding scenarios with low-cost components. Several of these cameras are making a practical impact in industrial automation, being adopted in robotic inspection and assembly systems.
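    For reference, the two ranging principles named in the abstract rest on simple relations; the figures in the comments below are illustrative round numbers, not measurements from the talk.

      C = 299_792_458.0  # speed of light [m/s]

      def tof_depth(round_trip_time_s):
          """Direct time-of-flight: light travels to the object and back."""
          return C * round_trip_time_s / 2.0

      def triangulation_depth(baseline_m, focal_px, disparity_px):
          """Active triangulation: depth from the projector-camera baseline,
          the focal length in pixels, and the observed disparity in pixels."""
          return baseline_m * focal_px / disparity_px

      # e.g. a 6.67 ns round trip corresponds to roughly 1 m of depth, and a 10 cm
      # baseline with f = 1400 px and a 70 px disparity gives about 2 m.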

  4. Advanced CCD camera developments

    Energy Technology Data Exchange (ETDEWEB)

    Condor, A. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, briefly describing the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, with a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  5. Gamma camera system

    International Nuclear Information System (INIS)

    Miller, D.W.; Gerber, M.S.; Schlosser, P.A.; Steidley, J.W.

    1980-01-01

    A detailed description is given of a novel gamma camera which is designed to produce better images than the conventional cameras used in nuclear medicine. The detector consists of a solid state detector (e.g. germanium) which is formed to have a plurality of discrete components to enable 2-dimensional position identification. Details of the electronic processing circuits are given, and the problems and limitations introduced by noise are discussed in full. (U.K.)

  6. The Circular Camera Movement

    DEFF Research Database (Denmark)

    Hansen, Lennard Højbjerg

    2014-01-01

    It has been an accepted precept in film theory that specific stylistic features do not express specific content. Nevertheless, it is possible to find many examples in the history of film in which stylistic features do express specific content: for instance, the circular camera movement is used re...... such as the circular camera movement. Keywords: embodied perception, embodied style, explicit narration, interpretation, style pattern, television style...

  7. A search for AGN activity in Infrared-Faint Radio Sources (IFRS)

    Science.gov (United States)

    Lenc, Emil; Middelberg, Enno; Norris, Ray; Mao, Minnie

    2010-04-01

    We propose to observe a large sample of radio sources from the ATLAS (Australia Telescope Large Area Survey) source catalogue with the LBA, to determine their compactness. The sample consists of 36 sources with no counterpart in the co-located SWIRE survey (3.6 um to 160 um), carried out with the Spitzer Space Telescope. This rare class of sources, dubbed Infrared-Faint Radio Sources (IFRS), is inconsistent with current galaxy evolution models. VLBI observations are an essential way to obtain further clues on what these objects are and why they are hidden from infrared observations. We will measure the flux densities on long baselines to determine their compactness. Only five IFRS have been previously targeted with VLBI observations (resulting in two detections). We propose using single-baseline (Parkes-ATCA) eVLBI observations with the LBA at 1 Gbps to maximise sensitivity. With the observations proposed here we will increase the number of VLBI-observed IFRS from 5 to 36, allowing us to draw statistical conclusions about this intriguing new class of objects.

  8. Neutron cameras for ITER

    International Nuclear Information System (INIS)

    Johnson, L.C.; Barnes, C.W.; Batistoni, P.

    1998-01-01

    Neutron cameras with horizontal and vertical views have been designed for ITER, based on systems used on JET and TFTR. The cameras consist of fan-shaped arrays of collimated flight tubes, with suitably chosen detectors situated outside the biological shield. The sight lines view the ITER plasma through slots in the shield blanket and penetrate the vacuum vessel, cryostat, and biological shield through stainless steel windows. This paper analyzes the expected performance of several neutron camera arrangements for ITER. In addition to the reference designs, the authors examine proposed compact cameras, in which neutron fluxes are inferred from ¹⁶N decay gammas in dedicated flowing water loops, and conventional cameras with fewer sight lines and more limited fields of view than in the reference designs. It is shown that the spatial sampling provided by the reference designs is sufficient to satisfy target measurement requirements and that some reduction in field of view may be permissible. The accuracy of measurements with ¹⁶N-based compact cameras is not yet established, and they fail to satisfy requirements for parameter range and time resolution by large margins

  9. Deployable Wireless Camera Penetrators

    Science.gov (United States)

    Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

    2008-01-01

    A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking at a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm³. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator. A low-volume array of such penetrator cameras could be deployed from an

  10. Wide-Field Imaging of Omega Centauri with the Advanced Camera for Surveys

    Science.gov (United States)

    Haggard, D.; Dorfman, J. L.; Cool, A. M.; Anderson, J.; Bailyn, C. D.; Edmonds, P. D.; Grindlay, J. E.

    2003-12-01

    We present initial results of a wide-field imaging study of the globular cluster Omega Cen (NGC 5139) using the Advanced Camera for Surveys (ACS). We have obtained a mosaic of 3x3 pointings of the cluster using the HST/ACS Wide Field Camera covering approximately 10' x 10', roughly out to the cluster's half-mass radius. Using F435W (B435), F625W (R625) and F658N (H-alpha) filters, we are searching for optical counterparts of Chandra X-ray sources and studying the cluster's stellar populations. Here we report the discovery of an optical counterpart to the X-ray source identified by Rutledge et al. (2002) as a possible quiescent neutron star on the basis of its X-ray spectrum. The star's magnitude and color (R625 = 24.4, B435-R625 = 1.5) place it more than 1.5 magnitudes to the blue side of the main sequence. Through the H-alpha filter it is about 1.3 magnitudes brighter than cluster stars of comparable R625 magnitude. The blue color and H-alpha excess suggest the presence of an accretion disk, implying that the neutron star is a member of a quiescent low-mass X-ray binary. The object's faint absolute magnitude (M625 ~ 10.6, M435 ~ 11.8) implies that the system contains an unusually weak disk and that the companion, if it is a main-sequence star, is of very low mass. This work is supported by NASA grant GO-9442 from the Space Telescope Science Institute.

  11. A Flight Photon Counting Camera for the WFIRST Coronagraph

    Science.gov (United States)

    Morrissey, Patrick

    2018-01-01

    A photon counting camera based on the Teledyne-e2v CCD201-20 electron multiplying CCD (EMCCD) is being developed for the NASA WFIRST coronagraph, an exoplanet imaging technology development of the Jet Propulsion Laboratory (Pasadena, CA) that is scheduled to launch in 2026. The coronagraph is designed to directly image planets around nearby stars, and to characterize their spectra. The planets are exceedingly faint, providing signals similar to the detector dark current, and require the use of photon counting detectors. Red sensitivity (600-980nm) is preferred to capture spectral features of interest. Since radiation in space affects the ability of the EMCCD to transfer the required single electron signals, care has been taken to develop appropriate shielding that will protect the cameras during a five year mission. In this poster, consideration of the effects of space radiation on photon counting observations will be described with the mitigating features of the camera design. An overview of the current camera flight system electronics requirements and design will also be described.

  12. The Dark Energy Camera

    Science.gov (United States)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.2° diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263″ pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
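    The quoted focal-plane numbers are easy to cross-check using only values from the abstract; a short sanity check in Python:

      imaging = 62 * 2048 * 4096        # 2k x 4k science CCDs
      guiding = 12 * 2048 * 2048        # 2k x 2k guide/focus CCDs
      print(imaging + guiding)          # 570,425,344 pixels, i.e. the quoted ~570 megapixels
      print(4096 * 0.263 / 60)          # ~18 arcmin spanned on the sky by the long side of one science CCD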

  13. THE DARK ENERGY CAMERA

    Energy Technology Data Exchange (ETDEWEB)

    Flaugher, B.; Diehl, H. T.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Buckley-Geer, E. J. [Fermi National Accelerator Laboratory, P.O. Box 500, Batavia, IL 60510 (United States); Honscheid, K. [Center for Cosmology and Astro-Particle Physics, The Ohio State University, Columbus, OH 43210 (United States); Abbott, T. M. C.; Bonati, M. [Cerro Tololo Inter-American Observatory, National Optical Astronomy Observatory, Casilla 603, La Serena (Chile); Antonik, M.; Brooks, D. [Department of Physics and Astronomy, University College London, Gower Street, London, WC1E 6BT (United Kingdom); Ballester, O.; Cardiel-Sas, L. [Institut de Física d’Altes Energies, Universitat Autònoma de Barcelona, E-08193 Bellaterra, Barcelona (Spain); Beaufore, L. [Department of Physics, The Ohio State University, Columbus, OH 43210 (United States); Bernstein, G. M. [Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA 19104 (United States); Bernstein, R. A. [Carnegie Observatories, 813 Santa Barbara St., Pasadena, CA 91101 (United States); Bigelow, B.; Boprie, D. [Department of Physics, University of Michigan, Ann Arbor, MI 48109 (United States); Campa, J. [Centro de Investigaciones Energèticas, Medioambientales y Tecnológicas (CIEMAT), Madrid (Spain); Castander, F. J., E-mail: diehl@fnal.gov [Institut de Ciències de l’Espai, IEEC-CSIC, Campus UAB, Facultat de Ciències, Torre C5 par-2, E-08193 Bellaterra, Barcelona (Spain); Collaboration: DES Collaboration; and others

    2015-11-15

    The Dark Energy Camera is a new imager with a 2.2° diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263″ pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  14. Speckle Interferometry of Close Visual Binaries with the ZW Optical ASI 224MC CMOS Camera

    Science.gov (United States)

    Genet, Russell; Rowe, David; Ashcraft, Clif; Wen, Sam; Jones, Gregory; Schillings, Benoit; Harshaw, Richard; Ray, Jimmy; Hass, Jacob

    2016-03-01

    Rapid improvements have been made in the design and manufacture of high-speed CMOS cameras with low read noise. Scientific CMOS (sCMOS) cameras appeared on the market a few years ago with performance approaching that of EMCCD cameras. Although sCMOS cameras remained expensive (typically over $10,000), there was hope that the cost of the low-noise CMOS chips would be reduced, allowing them to be incorporated into low-cost, mass-market cameras. This happened recently. A team at Sony developed a low read-noise (less than 1 electron) chip that has been incorporated in a camera made by ZWO that costs only $359 US and weighs just 120 g (4.2 oz). Benoit Schillings obtained one of these cameras, placed it on his 0.5-meter telescope in San Jose, California, and, using a list of targets supplied by Richard Harshaw, quickly demonstrated that the new ZWO camera could observe remarkably faint and close double stars.

  15. Teaching the Thrill of Discovery: Student Exploration of Ultra-Faint Dwarf Galaxies with the NOAO Data Lab

    Science.gov (United States)

    Olsen, Knut; Walker, Constance E.; Smith, Blake; NOAO Data Lab Team

    2018-01-01

    We describe an activity aimed at teaching students how ultra-faint Milky Way dwarf galaxies are typically discovered: through filtering of optical photometric catalogs and cross-examination with deep images. The activity, which was developed as part of the Teen Astronomy Café program (https://teensciencecafe.org/cafes/az-teen-astronomy-cafe-tucson/), uses the NOAO Data Lab (http://datalab.noao.edu) and other professional-grade tools to lead high school students through exploration of the object catalog and images from the Survey of the Magellanic Stellar History (SMASH). The students are taught how to use images and color-magnitude diagrams to analyze and interpret stellar populations of increasing complexity, including those of star clusters and the Magellanic Clouds, and culminating with the discovery of the Hydra II ultra-faint dwarf galaxy. The tools and datasets presented allow the students to explore and discover other known stellar systems, as well as unknown candidate star clusters and dwarf galaxies. The ultimate goal of the activity is to give students insight into the methods of modern astronomical research and to allow them to participate in the thrill of discovery.
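    The workflow the students follow (filter a photometric catalogue spatially, then inspect a colour-magnitude diagram) looks roughly like the Python sketch below. The file name, column names, and coordinates are placeholders; the actual activity uses the NOAO Data Lab's own query services and the SMASH catalogue schema.

      import pandas as pd
      import matplotlib.pyplot as plt

      catalog = pd.read_csv("smash_subset.csv")          # placeholder extract with ra, dec, gmag, imag columns

      # Spatial cut around a candidate overdensity (placeholder coordinates; cos(dec) is ignored for simplicity).
      ra0, dec0, radius = 185.4, -32.0, 0.1              # degrees
      sel = ((catalog.ra - ra0) ** 2 + (catalog.dec - dec0) ** 2) < radius ** 2
      stars = catalog[sel]

      # Colour-magnitude diagram of the selected stars.
      plt.scatter(stars.gmag - stars.imag, stars.gmag, s=2)
      plt.gca().invert_yaxis()                           # brighter stars plotted at the top
      plt.xlabel("g - i")
      plt.ylabel("g")
      plt.show()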

  16. Camera Trajectory from Wide Baseline Images

    Science.gov (United States)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    Camera trajectory estimation, which is closely related to the structure-from-motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self-localization, and object recognition. There are several essential issues for reliable camera trajectory estimation, for instance the choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and the ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes image feature matching very difficult (or impossible), and the camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens converter, are used. The hardware we use in practice is a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on converter with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using a state-of-the-art technique and Mičušík's two-parameter model, which links the radius of the image point r to the

  17. BEASTS OF THE SOUTHERN WILD: DISCOVERY OF NINE ULTRA FAINT SATELLITES IN THE VICINITY OF THE MAGELLANIC CLOUDS

    Energy Technology Data Exchange (ETDEWEB)

    Koposov, Sergey E.; Belokurov, Vasily; Torrealba, Gabriel; Evans, N. Wyn, E-mail: koposov@ast.cam.ac.uk, E-mail: vasily@ast.cam.ac.uk [Institute of Astronomy, Madingley Road, Cambridge CB3 0HA (United Kingdom)

    2015-06-01

    We have used the publicly released Dark Energy Survey (DES) data to hunt for new satellites of the Milky Way (MW) in the southern hemisphere. Our search yielded a large number of promising candidates. In this paper, we announce the discovery of nine new unambiguous ultra-faint objects, whose authenticity can be established with the DES data alone. Based on the morphological properties, three of the new satellites are dwarf galaxies, one of which is located at the very outskirts of the MW, at a distance of 380 kpc. The remaining six objects have sizes and luminosities comparable to the Segue 1 satellite and cannot be classified straightforwardly without follow-up spectroscopic observations. The satellites we have discovered cluster around the LMC and the SMC. We show that such spatial distribution is unlikely under the assumption of isotropy, and, therefore, conclude that at least some of the new satellites must have been associated with the Magellanic Clouds in the past.

  18. Communities, Cameras, and Conservation

    Science.gov (United States)

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  19. Mars Observer camera

    Science.gov (United States)

    Malin, M. C.; Danielson, G. E.; Ingersoll, A. P.; Masursky, H.; Veverka, J.; Ravine, M. A.; Soulanille, T. A.

    1992-01-01

    The Mars Observer camera (MOC) is a three-component system (one narrow-angle and two wide-angle cameras) designed to take high spatial resolution pictures of the surface of Mars and to obtain lower spatial resolution, synoptic coverage of the planet's surface and atmosphere. The cameras are based on the 'push broom' technique; that is, they do not take 'frames' but rather build pictures, one line at a time, as the spacecraft moves around the planet in its orbit. MOC is primarily a telescope for taking extremely high resolution pictures of selected locations on Mars. Using the narrow-angle camera, areas ranging from 2.8 km x 2.8 km to 2.8 km x 25.2 km (depending on available internal digital buffer memory) can be photographed at about 1.4 m/pixel. Additionally, lower-resolution pictures (to a lowest resolution of about 11 m/pixel) can be acquired by pixel averaging; these images can be much longer, ranging up to 2.8 x 500 km at 11 m/pixel. High-resolution data will be used to study sediments and sedimentary processes, polar processes and deposits, volcanism, and other geologic/geomorphic processes.
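    A quick back-of-the-envelope check of the frame sizes quoted for the narrow-angle camera, using only numbers from the abstract:

      print(2.8e3 / 1.4)      # ~2000 pixels across the 2.8 km frame width at 1.4 m/pixel
      print(25.2e3 / 1.4)     # ~18,000 lines for the longest full-resolution frame
      print(500e3 / 11)       # ~45,000 lines for a 500 km strip after averaging to 11 m/pixel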

  20. The world's fastest camera

    CERN Multimedia

    Piquepaille, Roland

    2006-01-01

    This image processor is not your typical digital camera. It took 20 people six years and $6 million to build the "Regional Calorimeter Trigger" (RCT), which will be a component of the Compact Muon Solenoid (CMS) experiment, one of the detectors on the Large Hadron Collider (LHC) in Geneva, Switzerland (1 page)

  1. Camera as Cultural Critique

    DEFF Research Database (Denmark)

    Suhr, Christian

    2015-01-01

    researchers, cameras, and filmed subjects already inherently comprise analytical decisions. It is these ethnographic qualities inherent in audiovisual and photographic imagery that make it of particular value to a participatory anthropological enterprise that seeks to resist analytic closure and seeks instead...

  2. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual...

  3. Thermoplastic film camera for holographic recording

    International Nuclear Information System (INIS)

    Liegeois, C.; Meyrueis, P.

    1982-01-01

    The design of a thermoplastic-film recording camera and its performance for holography of extended objects are reported. A special corona geometry, accurate control of the development heat by constant-current heating, and high-resolution measurement of the development temperature make easy recording of reproducible, large-aperture holograms possible. The experimental results give the transfer characteristics, the diffraction efficiency characteristics and the spatial frequency response. (orig.)

  4. The PAU Camera

    Science.gov (United States)

    Casas, R.; Ballester, O.; Cardiel-Sas, L.; Carretero, J.; Castander, F. J.; Castilla, J.; Crocce, M.; de Vicente, J.; Delfino, M.; Fernández, E.; Fosalba, P.; García-Bellido, J.; Gaztañaga, E.; Grañena, F.; Jiménez, J.; Madrid, F.; Maiorino, M.; Martí, P.; Miquel, R.; Neissner, C.; Ponce, R.; Sánchez, E.; Serrano, S.; Sevilla, I.; Tonello, N.; Troyano, I.

    2011-11-01

    The PAU Camera (PAUCam) is a wide-field camera designed to be mounted at the William Herschel Telescope (WHT) prime focus, located at the Observatorio del Roque de los Muchachos on the island of La Palma (Canary Islands). Its primary function is to carry out a cosmological survey, the PAU Survey, covering an area of several hundred square degrees of sky. Its purpose is to determine positions and distances using photometric redshift techniques. To achieve accurate photo-z's, PAUCam will be equipped with 40 narrow-band filters covering the range from 450 to 850 nm, and six broad-band filters, those of the SDSS system plus the Y band. To fully cover the focal plane delivered by the telescope optics, 18 2k x 4k CCDs are needed. The pixels are square, 15 μm on a side. The optical characteristics of the prime focus corrector deliver a field of view where eight of these CCDs will have an illumination of more than 95%, covering a field of 40 arc minutes. The rest of the CCDs will occupy the vignetted region, extending the field diameter to one degree. Two of the CCDs will be devoted to auto-guiding. This camera has some innovative features. Firstly, both the broad-band and the narrow-band filters will be placed in mobile trays, hosting at most 16 such filters. These are located inside the cryostat, a few millimetres in front of the CCDs when observing. Secondly, a pressurized liquid nitrogen tank outside the camera will feed a boiler inside the cryostat with a controlled mass flow. The read-out electronics will use the Monsoon architecture, originally developed by NOAO, modified and manufactured by our team in the frame of the DECam project (the camera used in the DES Survey). PAUCam will also be available to the astronomical community of the WHT.

  5. MISR radiometric camera-by-camera Cloud Mask V004

    Data.gov (United States)

    National Aeronautics and Space Administration — This file contains the Radiometric camera-by-camera Cloud Mask dataset. It is used to determine whether a scene is classified as clear or cloudy. A new parameter has...

  6. A single prolific r-process event preserved in an ultra-faint dwarf galaxy

    Science.gov (United States)

    Ji, Alexander; Frebel, Anna; Chiti, Anirudh; Simon, Joshua

    2016-03-01

    The heaviest elements in the periodic table are synthesized through the r-process, but the astrophysical site for r-process nucleosynthesis is still unknown. Ultra-faint dwarf galaxies contain a simple fossil record of early chemical enrichment that may determine this site. Previous measurements found very low levels of neutron-capture elements in ultra-faint dwarfs, preferring supernovae as the r-process site. I present high-resolution chemical abundances of nine stars in the recently discovered ultra-faint dwarf Reticulum II, which display extremely enhanced r-process abundances 2-3 orders of magnitude higher than the other ultra-faint dwarfs. Stars with such extreme r-process enhancements are only rarely found in the Milky Way halo. The r-process abundances imply that the neutron-capture material in Reticulum II was synthesized in a single prolific event that is incompatible with r-process yields from ordinary core-collapse supernovae. Reticulum II provides an opportunity to discriminate whether the source of this pure r-process signature is a neutron star merger or magnetorotationally driven supernova. The single event is also a uniquely stringent constraint on the metal mixing and star formation history of this ultra-faint dwarf galaxy.

  7. Calibration Techniques for Accurate Measurements by Underwater Camera Systems.

    Science.gov (United States)

    Shortis, Mark

    2015-12-07

    Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation and stability of underwater camera system calibration are also discussed. Samples of results from published reports are provided to demonstrate the range of possible accuracies for the measurements produced by underwater camera systems.

  8. Calibration Techniques for Accurate Measurements by Underwater Camera Systems

    Directory of Open Access Journals (Sweden)

    Mark Shortis

    2015-12-01

    Full Text Available Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation and stability of underwater camera system calibration are also discussed. Samples of results from published reports are provided to demonstrate the range of possible accuracies for the measurements produced by underwater camera systems.

  9. Control of the movement of a ROV camera; Controle de posicionamento da camera de um ROV

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Alexandre S. de; Dutra, Max Suell [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE); Reis, Ney Robinson S. dos [PETROBRAS, Rio de Janeiro, RJ (Brazil). Centro de Pesquisas; Santos, Auderi V. dos [Pontificia Univ. Catolica do Rio de Janeiro, RJ (Brazil)

    2004-07-01

    ROVs (Remotely Operated Vehicles) are used for the installation and maintenance of underwater exploration systems in the oil industry. These systems are operated in remote areas, so the use of a camera for visualization of the work area is essential. The synchronization between operating the manipulator and moving the camera is a complex task for the operator. To achieve this synchronization, this work presents an analysis of the interconnection of the two systems. The systems are coupled by interconnecting the electric signals of the proportional valves of the manipulator actuators with the signals of the proportional valves of the camera actuators. With this interconnection, the camera approximately follows the movement of the manipulator, keeping the object of interest within the operator's field of vision. (author)

  10. Body worn camera

    Science.gov (United States)

    Aishwariya, A.; Pallavi Sudhir, Gulavani; Garg, Nemesa; Karthikeyan, B.

    2017-11-01

    A body worn camera is a small video camera worn on the body, typically used by police officers to record arrests and evidence from crime scenes. It helps prevent and resolve complaints brought by members of the public and strengthens police transparency, performance, and accountability. The main parameters of this type of system are video format, resolution, frame rate, and audio quality. This system records video in .mp4 format at 1080p resolution and 30 frames per second. Another important aspect when designing such a system is the amount of power it requires, as battery management becomes very critical. The main design challenges are the size of the video, audio for the video, combining both audio and video and saving them in .mp4 format, the battery size required for 8 hours of continuous recording, and security. For prototyping, this system is implemented using a Raspberry Pi Model B.
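    The capture chain on the Raspberry Pi prototype is not detailed in the record above; the following is a minimal sketch of how such a recording loop could look using the picamera library (an assumption, not stated in the source), with the 1080p/30 fps settings described. Audio capture and muxing into .mp4 would be handled separately (for example with a USB microphone and ffmpeg); file names and durations are placeholders.

```python
# Hedged sketch: video-only capture on a Raspberry Pi camera module using
# the picamera library; settings follow the record above (1080p, 30 fps).
from picamera import PiCamera

camera = PiCamera()
camera.resolution = (1920, 1080)   # 1080p
camera.framerate = 30              # 30 frames per second

camera.start_recording('clip.h264')   # H.264 elementary stream
camera.wait_recording(60)             # keep recording for 60 seconds
camera.stop_recording()
camera.close()

# The raw .h264 stream can then be wrapped into an .mp4 container, e.g.
#   ffmpeg -framerate 30 -i clip.h264 -c copy clip.mp4
# Audio would be recorded separately and muxed in at this stage.
```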

  11. Positron emission tomography camera

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    A positron emission tomography camera having a plurality of detector rings positioned side-by-side or offset by one-half of the detector cross section around a patient area to detect radiation therefrom. Each ring contains a plurality of scintillation detectors which are positioned around an inner circumference with a septum ring extending inwardly from the inner circumference along each outer edge of each ring. An additional septum ring is positioned in the middle of each ring of detectors and parallel to the other septa rings, whereby the inward extent of all the septa rings may be reduced by one-half and the number of detectors required in each ring is reduced. The additional septa reduce the cost of the positron camera and improve its performance

  12. Gamma camera display system

    International Nuclear Information System (INIS)

    Stout, K.J.

    1976-01-01

    A gamma camera having an array of photomultipliers coupled via pulse shaping circuitry and a resistor weighting circuit to a display for forming an image of a radioactive subject is described. A linearizing circuit is coupled to the weighting circuit, the linearizing circuit including a nonlinear feedback circuit with diode coupling to the weighting circuit for linearizing the correspondence between points of the display and points of the subject. 4 Claims, 5 Drawing Figures

  13. Scanning gamma camera

    International Nuclear Information System (INIS)

    Engdahl, L.W.; Batter, J.F. Jr.; Stout, K.J.

    1977-01-01

    A scanning system for a gamma camera providing for the overlapping of adjacent scan paths is described. A collimator mask having tapered edges provides for a graduated reduction in intensity of radiation received by a detector thereof, the reduction in intensity being graduated in a direction normal to the scanning path to provide a blending of images of adjacent scan paths. 31 claims, 15 figures

  14. Collimated trans-axial tomographic scintillation camera

    International Nuclear Information System (INIS)

    1980-01-01

    The objects of this invention are first to reduce the time required to obtain statistically significant data in trans-axial tomographic radioisotope scanning using a scintillation camera. Secondly, to provide a scintillation camera system to increase the rate of acceptance of radioactive events to contribute to the positional information obtainable from a known radiation source without sacrificing spatial resolution. Thirdly to reduce the scanning time without loss of image clarity. The system described comprises a scintillation camera detector, means for moving this in orbit about a cranial-caudal axis relative to a patient and a collimator having septa defining apertures such that gamma rays perpendicular to the axis are admitted with high spatial resolution, parallel to the axis with low resolution. The septa may be made of strips of lead. Detailed descriptions are given. (U.K.)

  15. PEOPLE REIDENTIFCATION IN A DISTRIBUTED CAMERA NETWORK

    Directory of Open Access Journals (Sweden)

    Icaro Oliveira de Oliveira

    2010-06-01

    Full Text Available This paper presents an approach to the object reidentification problem in a distributed camera network system. The reidentification or reacquisition problem consists essentially of matching images acquired from different cameras. This work is applied in an environment monitored by cameras. The application is important to modern security systems, in which identifying the presence of targets in the environment expands the capacity for action of security agents in real time and provides important parameters, such as localization, for each target. We used the target's interest points and the target's color as features for reidentification. Satisfactory results were obtained from real experiments on public video datasets and on synthetic images with noise.
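    The record above does not give implementation details; the sketch below only illustrates the general idea of combining a color cue with an interest-point cue when matching a target crop between two camera views. It assumes OpenCV; the descriptor choice (ORB), the weights, and the thresholds are illustrative and not taken from the paper.

```python
# Hedged sketch: match a target crop between two camera views using a colour
# histogram cue and an interest-point cue; parameter values are placeholders.
import cv2

def colour_similarity(img_a, img_b, bins=32):
    """Correlation between HSV hue-saturation histograms of two crops."""
    hists = []
    for img in (img_a, img_b):
        hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
        h = cv2.calcHist([hsv], [0, 1], None, [bins, bins], [0, 180, 0, 256])
        cv2.normalize(h, h)
        hists.append(h)
    return cv2.compareHist(hists[0], hists[1], cv2.HISTCMP_CORREL)

def keypoint_similarity(img_a, img_b, max_dist=40):
    """Fraction of ORB descriptors in img_a that find a close match in img_b."""
    orb = cv2.ORB_create()
    gray_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY)
    _, des_a = orb.detectAndCompute(gray_a, None)
    _, des_b = orb.detectAndCompute(gray_b, None)
    if des_a is None or des_b is None:
        return 0.0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    good = [m for m in matches if m.distance < max_dist]
    return len(good) / max(len(des_a), 1)

def same_target(img_a, img_b, w_colour=0.5, threshold=0.5):
    """Weighted combination of the two cues; weights are a free choice."""
    score = w_colour * colour_similarity(img_a, img_b) \
            + (1 - w_colour) * keypoint_similarity(img_a, img_b)
    return score > threshold, score
```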

  16. Wired and Wireless Camera Triggering with Arduino

    Science.gov (United States)

    Kauhanen, H.; Rönnholm, P.

    2017-10-01

    Synchronous triggering is an important task that allows simultaneous data capture from multiple cameras. Accurate synchronization enables 3D measurements of moving objects or from a moving platform. In this paper, we describe one wired and four wireless variations of Arduino-based low-cost remote trigger systems designed to provide a synchronous trigger signal for industrial cameras. Our wireless systems utilize 315 MHz or 434 MHz frequencies with noise filtering capacitors. In order to validate the synchronization accuracy, we developed a prototype of a rotating trigger detection system (named RoTriDeS). This system is suitable to detect the triggering accuracy of global shutter cameras. As a result, the wired system indicated an 8.91 μs mean triggering time difference between two cameras. Corresponding mean values for the four wireless triggering systems varied between 7.92 and 9.42 μs. Presented values include both camera-based and trigger-based desynchronization. Arduino-based triggering systems appeared to be feasible, and they have the potential to be extended to more complicated triggering systems.

  17. X-ray-bright optically faint active galactic nuclei in the Subaru Hyper Suprime-Cam wide survey

    Science.gov (United States)

    Terashima, Yuichi; Suganuma, Makoto; Akiyama, Masayuki; Greene, Jenny E.; Kawaguchi, Toshihiro; Iwasawa, Kazushi; Nagao, Tohru; Noda, Hirofumi; Toba, Yoshiki; Ueda, Yoshihiro; Yamashita, Takuji

    2018-01-01

    We construct a sample of X-ray-bright optically faint active galactic nuclei by combining Subaru Hyper Suprime-Cam, XMM-Newton, and infrared source catalogs. Fifty-three X-ray sources with i-band magnitudes fainter than 23.5 mag and more than 70 X-ray counts with the EPIC-PN detector are selected from 9.1 deg2, and their spectral energy distributions (SEDs) and X-ray spectra are analyzed. Forty-four objects with an X-ray to i-band flux ratio FX/Fi > 10 are classified as extreme X-ray-to-optical flux sources. The spectral energy distributions of 48 of the 53 objects are represented by templates of type 2 AGNs or star-forming galaxies and show the optical signature of stellar emission from host galaxies in the source rest frame. Infrared/optical SEDs indicate a significant contribution of emission from dust to the infrared fluxes, and that the central AGN is dust obscured. The photometric redshifts determined from the SEDs are in the range of 0.6-2.5. The X-ray spectra are fitted by an absorbed power-law model, and the intrinsic absorption column densities are modest (best-fit log NH = 20.5-23.5 cm^-2 in most cases). The absorption-corrected X-ray luminosities are in the range of 6 × 10^42 to 2 × 10^45 erg s^-1. Twenty objects are classified as type 2 quasars based on X-ray luminosity and NH. The optical faintness is explained by a combination of redshifts (mostly z > 1.0), strong dust extinction, and in part a large ratio of dust/gas.

  18. Small Orbital Stereo Tracking Camera Technology Development

    Science.gov (United States)

    Gagliano, L.; Bryan, T.; MacLeod, T.

    On-orbit small debris tracking and characterization is a technical gap in current national Space Situational Awareness, necessary to safeguard orbital assets and crew; the debris poses a major risk of MOD damage to the ISS and Exploration vehicles. In 2015 this technology was added to the roadmap of NASA's Office of the Chief Technologist. For missions flying in, assembled in, or staging from LEO, the physical threat to vehicle and crew must be quantified in order to design the proper level of MOD impact shielding and appropriate mission design restrictions, and the debris flux and size population need to be verified against ground RADAR tracking. Using the ISS for in-situ orbital debris tracking development provides attitude, power, data, and orbital access without a dedicated spacecraft or restricted operations on board a host vehicle as a secondary payload. The sensor is applicable to in-situ measurement of orbital debris flux and population in other orbits or on other vehicles, could enhance safety on and around the ISS, and some of its technologies are extensible to monitoring of extraterrestrial debris as well. To help accomplish this, new technologies must be developed quickly. The Small Orbital Stereo Tracking Camera is one such up-and-coming technology. It consists of flying a pair of intensified megapixel telephoto cameras to evaluate orbital debris (OD) monitoring in the proximity of the International Space Station. It will demonstrate on-orbit (in-situ) optical tracking of various sized objects versus ground RADAR tracking and small OD models. The cameras are based on flight-proven Advanced Video Guidance Sensor pixel-to-spot algorithms (Orbital Express) and on military targeting cameras, and using twin cameras provides stereo images for ranging and mission redundancy. When pointed into the orbital velocity vector (RAM), objects approaching or near the stereo camera set can be differentiated from the stars moving upward in the background.

  19. Radiation-resistant camera tube

    International Nuclear Information System (INIS)

    Kuwahata, Takao; Manabe, Sohei; Makishima, Yasuhiro

    1982-01-01

    Toshiba has long manufactured black-and-white radiation-resistant camera tubes employing non-browning face-plate glass for ITV cameras used in nuclear power plants. Now, in response to increasing demand in the nuclear power field, the company is developing radiation-resistant single color-camera tubes incorporating a color-stripe filter for color ITV cameras used in radiation environments. Presented herein are the results of experiments on the characteristics of materials for single color-camera tubes and the prospects for commercialization of the tubes. (author)

  20. Non-contact measurement of rotation angle with solo camera

    Science.gov (United States)

    Gan, Xiaochuan; Sun, Anbin; Ye, Xin; Ma, Liqun

    2015-02-01

    For the purpose of measuring a rotation angle around the axis of an object, a non-contact rotation angle measurement method based on a single camera is proposed. The intrinsic parameters of the camera were calibrated using a chessboard, following the principle of plane-based calibration. The translation matrix and rotation matrix between the object coordinate system and the camera coordinate system were calculated from the relationship between the corners' positions on the object and their coordinates in the image. The rotation angle between the measured object and the camera could then be resolved from the rotation matrix. A precise angle dividing table (PADT) was chosen as the reference to verify the angle measurement error of this method. Test results indicated that the rotation angle measurement error of this method did not exceed ±0.01 degree.
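    A minimal sketch of this measurement chain is given below, assuming OpenCV: a chessboard is used both for the plane-based intrinsic calibration and as the target whose pose is recovered with solvePnP, after which an angle is read off the rotation matrix. The board geometry, the image lists, and the choice of reading the angle about the camera Z axis are illustrative assumptions, not details from the paper.

```python
# Hedged sketch: chessboard-based intrinsic calibration, then pose estimation
# with solvePnP and extraction of a rotation angle from the rotation matrix.
import cv2
import numpy as np

pattern = (9, 6)      # inner chessboard corners (example geometry)
square = 20.0         # square size in mm (example value)

# 3D corner coordinates in the board frame (Z = 0 plane)
board_pts = np.zeros((pattern[0] * pattern[1], 3), np.float32)
board_pts[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

def find_corners(image):
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    return (corners if found else None), gray.shape[::-1]

def calibrate(images):
    obj_pts, img_pts, size = [], [], None
    for image in images:
        corners, size = find_corners(image)
        if corners is not None:
            obj_pts.append(board_pts)
            img_pts.append(corners)
    _, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return K, dist

def rotation_angle_deg(image, K, dist):
    corners, _ = find_corners(image)
    if corners is None:
        return None
    _, rvec, _ = cv2.solvePnP(board_pts, corners, K, dist)
    R, _ = cv2.Rodrigues(rvec)                 # object-to-camera rotation
    # rotation about the camera Z axis (illustrative choice of axis)
    return np.degrees(np.arctan2(R[1, 0], R[0, 0]))
```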

  1. SN 2009E: a faint clone of SN 1987A

    DEFF Research Database (Denmark)

    Pastorello, A.; Pumo, M. L.; Navasardyan, H.

    2012-01-01

    Context. 1987A-like events form a rare sub-group of hydrogen-rich core-collapse supernovae that are thought to originate from the explosion of blue supergiant stars. Although SN 1987A is the best known supernova, very few objects of this group have been discovered and, hence, studied. Aims. In thi...

  2. DeTeCt 3.0: A software tool to detect impacts of small objects in video observations of Jupiter obtained by amateur astronomers

    Science.gov (United States)

    Juaristi, J.; Delcroix, M.; Hueso, R.; Sánchez-Lavega, A.

    2017-09-01

    Impacts of small objects (10-20 m in diameter) with Jupiter's atmosphere result in luminous superbolides that can be observed from the Earth with small telescopes. Impacts of this kind have been observed four times by amateur astronomers since July 2010, but the probability of observing one of these events is very small. Amateur astronomers observe Jupiter using fast video cameras that record thousands of frames over a few minutes; these frames are combined into a single image that is generally of high resolution. Flashes are brief and faint, and are often lost by the image reconstruction software. We present major upgrades to DeTeCt, a software tool initially developed by amateur astronomer Marc Delcroix, and our current project to maximize the chances of detecting more of these impacts on Jupiter.
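    The record does not describe DeTeCt's internals; the sketch below merely illustrates one generic way a brief flash could be flagged in a planetary video before stacking, by comparing each frame against a slowly updated running mean. OpenCV and NumPy are assumed; the update rate, the sigma threshold, and the pixel count are placeholder values.

```python
# Illustrative sketch (not the DeTeCt algorithm itself): flag frames that
# contain a compact group of pixels far above a slowly updated running mean.
import cv2
import numpy as np

def detect_flashes(path, alpha=0.02, k_sigma=6.0, min_pixels=4):
    cap = cv2.VideoCapture(path)
    background = None
    candidates = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        if background is None:
            background = gray.copy()
        residual = gray - background
        sigma = residual.std()
        # count pixels brightened well above the noise level of this frame
        bright = (residual > k_sigma * sigma).sum()
        if bright >= min_pixels:
            candidates.append(frame_idx)
        # update the running mean slowly so a brief flash does not pollute it
        background = (1 - alpha) * background + alpha * gray
        frame_idx += 1
    cap.release()
    return candidates
```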

  3. The PLATO camera

    Science.gov (United States)

    Laubier, D.; Bodin, P.; Pasquier, H.; Fredon, S.; Levacher, P.; Vola, P.; Buey, T.; Bernardi, P.

    2017-11-01

    PLATO (PLAnetary Transits and Oscillation of stars) is a candidate for the M3 Medium-size mission of the ESA Cosmic Vision programme (2015-2025 period). It is aimed at the detection of Earth-size and Earth-mass planets in the habitable zone of bright stars and their characterisation using the transit method and the asteroseismology of their host star. That means observing more than 100 000 stars brighter than magnitude 11, and more than 1 000 000 brighter than magnitude 13, with a long continuous observing time for 20% of them (2 to 3 years). This yields a need for unusually long-term signal stability. For the brighter stars, the noise requirement is less than 34 ppm hr^-1/2, from a frequency of 40 mHz down to 20 μHz, including all sources of noise such as the motion of the star images on the detectors and frequency beatings. Those extremely tight requirements result in a payload consisting of 32 synchronised, high-aperture, wide-field-of-view cameras thermally regulated down to -80°C, whose data are combined to increase the signal-to-noise performance. They are split into 4 different subsets pointing in 4 directions to widen the total field of view; stars in the centre of that field of view are observed by all 32 cameras. Two extra cameras are used with colour filters and provide pointing measurements to the spacecraft Attitude and Orbit Control System (AOCS) loop. The satellite orbits the Sun at the L2 Lagrange point. This paper presents the optical, electronic and electrical, thermal and mechanical designs devised to achieve those requirements, and the results from breadboards developed for the optics, the focal plane, the power supply and the video electronics.

  4. Voice Controlled Stereographic Video Camera System

    Science.gov (United States)

    Goode, Georgianna D.; Philips, Michael L.

    1989-09-01

    For several years various companies have been developing voice recognition software. Yet, there are few applications of voice control in the robotics field and virtually no examples of voice controlled three dimensional (3-D) systems. In late 1987 ARD developed a highly specialized, voice controlled 3-D vision system for use in remotely controlled, non-tethered robotic applications. The system was designed as an operator's aid and incorporates features thought to be necessary or helpful in remotely maneuvering a vehicle. Foremost is the three dimensionality of the operator's console display. An image that provides normal depth perception cues over a range of depths greatly increases the ease with which an operator can drive a vehicle and investigate its environment. The availability of both vocal and manual control of all system functions allows the operator to guide the system according to his personal preferences. The camera platform can be panned +/-178 degrees and tilted +/-30 degrees for a full range of view of the vehicle's environment. The cameras can be zoomed and focused for close inspection of distant objects, while retaining substantial stereo effect by increasing the separation between the cameras. There is a ranging and measurement function, implemented through a graphical cursor, which allows the operator to mark objects in a scene to determine their relative positions. This feature will be helpful in plotting a driving path. The image seen on the screen is overlaid with icons and digital readouts which provide information about the position of the camera platform, the range to the graphical cursor and the measurement results. The cursor's "range" is actually the distance from the cameras to the object on which the cursor is resting. Other such features are included in the system and described in subsequent sections of this paper.
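    The ranging function described above relies on the stereo geometry of the two cameras. As a toy illustration of the principle (not the ARD implementation), for a rectified stereo pair the distance to a marked point follows from the focal length, the baseline between the cameras, and the horizontal disparity of the point between the two images; the numbers below are made up.

```python
# Toy illustration of range from stereo disparity: Z = f * B / d,
# with focal length in pixels, baseline in metres, disparity in pixels.
def stereo_range(focal_px, baseline_m, x_left_px, x_right_px):
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity

# e.g. a 1200-pixel focal length, 30 cm baseline and 18 px disparity
print(stereo_range(1200.0, 0.30, 640.0, 622.0))   # -> 20.0 metres
```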

  5. First Light with a 67-Million-Pixel WFI Camera

    Science.gov (United States)

    1999-01-01

    The newest astronomical instrument at the La Silla observatory is a super-camera with no less than sixty-seven million image elements. It represents the outcome of a joint project between the European Southern Observatory (ESO) , the Max-Planck-Institut für Astronomie (MPI-A) in Heidelberg (Germany) and the Osservatorio Astronomico di Capodimonte (OAC) near Naples (Italy), and was installed at the 2.2-m MPG/ESO telescope in December 1998. Following careful adjustment and testing, it has now produced the first spectacular test images. With a field size larger than the Full Moon, the new digital Wide Field Imager is able to obtain detailed views of extended celestial objects to very faint magnitudes. It is the first of a new generation of survey facilities at ESO with which a variety of large-scale searches will soon be made over extended regions of the southern sky. These programmes will lead to the discovery of particularly interesting and unusual (rare) celestial objects that may then be studied with large telescopes like the VLT at Paranal. This will in turn allow astronomers to penetrate deeper and deeper into the many secrets of the Universe. More light + larger fields = more information! The larger a telescope is, the more light - and hence information about the Universe and its constituents - it can collect. This simple truth represents the main reason for building ESO's Very Large Telescope (VLT) at the Paranal Observatory. However, the information-gathering power of astronomical equipment can also be increased by using a larger detector with more image elements (pixels) , thus permitting the simultaneous recording of images of larger sky fields (or more details in the same field). It is for similar reasons that many professional photographers prefer larger-format cameras and/or wide-angle lenses to the more conventional ones. The Wide Field Imager at the 2.2-m telescope Because of technological limitations, the sizes of detectors most commonly in use in

  6. On the Nature of Ultra-faint Dwarf Galaxy Candidates. I. DES1, Eridanus III, and Tucana V

    Science.gov (United States)

    Conn, Blair C.; Jerjen, Helmut; Kim, Dongwon; Schirmer, Mischa

    2018-01-01

    We use deep Gemini/GMOS-S g, r photometry to study the three ultra-faint dwarf galaxy candidates DES1, Eridanus III (Eri III), and Tucana V (Tuc V). Their total luminosities, M_V(DES1) = -1.42 ± 0.50 and M_V(Eri III) = -2.07 ± 0.50, and mean metallicities, [Fe/H] = -2.38 (+0.21, -0.19) and [Fe/H] = -2.40 (+0.19, -0.12), are consistent with them being ultra-faint dwarf galaxies, as they fall just outside the 1σ confidence band of the luminosity-metallicity relation for Milky Way satellite galaxies. However, their positions in the size-luminosity relation suggest that they are star clusters. Interestingly, DES1 and Eri III are at relatively large Galactocentric distances, with DES1 located at D_GC = 74 ± 4 kpc and Eri III at D_GC = 91 ± 4 kpc. In projection, both objects are in the tail of gaseous filaments trailing the Magellanic Clouds and have similar 3D separations from the Small Magellanic Cloud (SMC): ΔD_{SMC,DES1} = 31.7 kpc and ΔD_{SMC,Eri III} = 41.0 kpc, respectively. It is plausible that these stellar systems are metal-poor SMC satellites. Tuc V represents an interesting phenomenon in its own right. Our deep photometry at the nominal position of Tuc V reveals a low-level excess of stars at various locations across the GMOS field without a well-defined center. An SMC Northern Overdensity-like isochrone would be an adequate match to the Tuc V color-magnitude diagram, and the proximity to the SMC (12.1° on the sky, ΔD_{SMC,Tuc V} = 13 kpc) suggests that Tuc V is either a chance grouping of stars related to the SMC halo or a star cluster in an advanced stage of dissolution.

  7. ARE THE FAINT STRUCTURES AHEAD OF SOLAR CORONAL MASS EJECTIONS REAL SIGNATURES OF DRIVEN SHOCKS?

    International Nuclear Information System (INIS)

    Lee, Jae-Ok; Moon, Y.-J.; Lee, Kangjin; Lee, Jin-Yi; Lee, Kyoung-Sun; Kim, Sujin

    2014-01-01

    Recently, several studies have assumed that the faint structures ahead of coronal mass ejections (CMEs) are caused by CME-driven shocks. In this study, we have conducted a statistical investigation to determine whether or not the appearance of such faint structures depends on CME speeds. For this purpose, we use 127 Solar and Heliospheric Observatory/Large Angle Spectroscopic COronagraph (LASCO) front-side halo (partial and full) CMEs near the limb from 1997 to 2011. We classify these CMEs into two groups by visual inspection of CMEs in the LASCO-C2 field of view: Group 1 has the faint structure ahead of a CME and Group 2 does not have such a structure. We find the following results. (1) Eighty-seven CMEs belong to Group 1 and 40 CMEs belong to Group 2. (2) Group 1 events have much higher speeds (average = 1230 km s^-1 and median = 1199 km s^-1) than Group 2 events (average = 598 km s^-1 and median = 518 km s^-1). (3) The fraction of CMEs with faint structures strongly depends on CME speed (V): 0.93 (50/54) for fast CMEs with V ≥ 1000 km s^-1, 0.65 (34/52) for intermediate CMEs with 500 km s^-1 ≤ V < 1000 km s^-1, and 0.14 (3/21) for slow CMEs with V < 500 km s^-1. We also find that the fraction of CMEs with deca-hectometric type II radio bursts is consistent with the above tendency. Our results indicate that the observed faint structures ahead of fast CMEs are most likely an enhanced density manifestation of CME-driven shocks

  8. Holographic stereogram using camera array in dense arrangement

    Science.gov (United States)

    Yamamoto, Kenji; Oi, Ryutaro; Senoh, Takanori; Ichihashi, Yasuyuki; Kurita, Taiichiro

    2011-02-01

    Holographic stereograms can display 3D objects by using ray information. To display high-quality representations of real 3D objects with holographic stereograms, relatively dense ray information must be prepared as the 3D object information. One promising method of obtaining this information uses a combination of a camera array and view interpolation, which is a signal processing technique. However, it is still technically difficult to synthesize ray information without visible error by using view interpolation. Our approach uses a densely arranged camera array to reduce this difficulty: with such a dense arrangement, even a simple view interpolation technique should produce adequate synthesized ray information. We designed and manufactured a densely arranged camera array and used it to generate holographic stereograms.

  9. PERFORMANCE EVALUATION OF THERMOGRAPHIC CAMERAS FOR PHOTOGRAMMETRIC MEASUREMENTS

    Directory of Open Access Journals (Sweden)

    N. Yastikli

    2013-05-01

    Full Text Available The aim of this research is the performance evaluation of thermographic cameras for possible use in photogrammetric documentation and in deformation analyses caused by moisture and insulation problems of historical and cultural heritage. To perform geometric calibration of the thermographic camera, a 3D test object was designed with 77 control points distributed at different depths. For the performance evaluation, a Flir A320 thermographic camera with 320 × 240 pixels and a lens with 18 mm focal length was used. A Nikon D3X SLR digital camera with 6048 × 4032 pixels and a lens with 20 mm focal length was used as reference for comparison. The pixel size was 25 μm for the Flir A320 thermographic camera and 6 μm for the Nikon D3X SLR digital camera. Digital images of the 3D test object were recorded with both cameras, and the image coordinates of the control points were measured. The geometric calibration parameters, including the focal length, the position of the principal point, and the radial and tangential distortions, were determined with additional parameters introduced in bundle block adjustments. The measurement of image coordinates and the bundle block adjustments with additional parameters were performed using the PHIDIAS digital photogrammetric system. The bundle block adjustment was repeated with the determined calibration parameters for both the Flir A320 thermographic camera and the Nikon D3X SLR digital camera. The obtained standard deviation of measured image coordinates was 9.6 μm and 10.5 μm for the Flir A320 thermographic camera and 8.3 μm and 7.7 μm for the Nikon D3X SLR digital camera. The standard deviation of measured image points in the Flir A320 thermographic camera images thus reached almost the same accuracy level as the digital camera, despite a pixel size four times larger. The results of this research show that the interior geometry of the thermographic camera and the lens distortion were modelled efficiently

  10. Performance Evaluation of Thermographic Cameras for Photogrammetric Measurements

    Science.gov (United States)

    Yastikli, N.; Guler, E.

    2013-05-01

    The aim of this research is the performance evaluation of thermographic cameras for possible use in photogrammetric documentation and in deformation analyses caused by moisture and insulation problems of historical and cultural heritage. To perform geometric calibration of the thermographic camera, a 3D test object was designed with 77 control points distributed at different depths. For the performance evaluation, a Flir A320 thermographic camera with 320 × 240 pixels and a lens with 18 mm focal length was used. A Nikon D3X SLR digital camera with 6048 × 4032 pixels and a lens with 20 mm focal length was used as reference for comparison. The pixel size was 25 μm for the Flir A320 thermographic camera and 6 μm for the Nikon D3X SLR digital camera. Digital images of the 3D test object were recorded with both cameras, and the image coordinates of the control points were measured. The geometric calibration parameters, including the focal length, the position of the principal point, and the radial and tangential distortions, were determined with additional parameters introduced in bundle block adjustments. The measurement of image coordinates and the bundle block adjustments with additional parameters were performed using the PHIDIAS digital photogrammetric system. The bundle block adjustment was repeated with the determined calibration parameters for both the Flir A320 thermographic camera and the Nikon D3X SLR digital camera. The obtained standard deviation of measured image coordinates was 9.6 μm and 10.5 μm for the Flir A320 thermographic camera and 8.3 μm and 7.7 μm for the Nikon D3X SLR digital camera. The standard deviation of measured image points in the Flir A320 thermographic camera images thus reached almost the same accuracy level as the digital camera, despite a pixel size four times larger. The results of this research show that the interior geometry of the thermographic camera and the lens distortion were modelled efficiently
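    The accuracy figures quoted above are standard deviations of image-coordinate residuals after the adjustment. As a rough illustration of how such a figure can be computed once a calibration is available (not the PHIDIAS bundle adjustment itself), the sketch below reprojects the control points with the estimated parameters and converts the residual spread to micrometres; OpenCV and NumPy are assumed, and all inputs are whatever a prior calibration produced.

```python
# Hedged sketch: reprojection residual spread after calibration, expressed
# in micrometres using the physical pixel size of the sensor.
import cv2
import numpy as np

def residual_std_um(object_pts, image_pts, rvecs, tvecs, K, dist, pixel_um):
    residuals = []
    for obj, img, rvec, tvec in zip(object_pts, image_pts, rvecs, tvecs):
        proj, _ = cv2.projectPoints(obj, rvec, tvec, K, dist)
        residuals.append((proj.reshape(-1, 2) - img.reshape(-1, 2)).ravel())
    residuals = np.concatenate(residuals)
    # standard deviation of the image-coordinate residuals, in micrometres
    return residuals.std() * pixel_um

# e.g. pixel_um = 25.0 for the Flir A320 and 6.0 for the Nikon D3X
```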

  11. A Study towards Real Time Camera Calibration

    OpenAIRE

    Choudhury, Ragini

    2000-01-01

    Preliminary Report Prepared for the Project VISTEO; This report provides a detailed study of the problem of real time camera calibration. This analysis, based on the study of literature in the area, as well as the experiments carried out on real and synthetic data, is motivated by the requirements of the VISTEO project. VISTEO deals with a fusion of real images and synthetic environments, objects etc in TV video sequences. It thus deals with a challenging and fast growing area in virtual real...

  12. Positron emission tomography camera

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    A positron emission tomography camera having a plurality of detector planes positioned side-by-side around a patient area to detect radiation. Each plane includes a plurality of photomultiplier tubes, and at least two rows of scintillation crystals on each photomultiplier tube extend across to adjacent photomultiplier tubes for detecting radiation from the patient area. Each row of crystals on each photomultiplier tube is offset from the other rows of crystals, and the area of each crystal on each tube in each row is different than the area of the crystals on the tube in other rows for detecting which crystal is actuated and allowing the detector to detect more inter-plane slices. The crystals are offset by an amount equal to the length of the crystal divided by the number of rows. The rows of crystals on opposite sides of the patient may be rotated 90 degrees relative to each other

  13. Junocam: Juno's Outreach Camera

    Science.gov (United States)

    Hansen, C. J.; Caplinger, M. A.; Ingersoll, A.; Ravine, M. A.; Jensen, E.; Bolton, S.; Orton, G.

    2017-11-01

    Junocam is a wide-angle camera designed to capture the unique polar perspective of Jupiter offered by Juno's polar orbit. Junocam's four-color images include the best spatial resolution ever acquired of Jupiter's cloudtops. Junocam will look for convective clouds and lightning in thunderstorms and derive the heights of the clouds. Junocam will support Juno's radiometer experiment by identifying any unusual atmospheric conditions such as hotspots. Junocam is on the spacecraft explicitly to reach out to the public and share the excitement of space exploration. The public is an essential part of our virtual team: amateur astronomers will supply ground-based images for use in planning, the public will weigh in on which images to acquire, and the amateur image processing community will help process the data.

  14. Automatic locking radioisotope camera lock

    International Nuclear Information System (INIS)

    Rosauer, P.J.

    1978-01-01

    The lock of the present invention secures the isotope source in a stored shielded condition in the camera until a positive effort has been made to open the lock and take the source outside of the camera, and prevents disconnection of the source pigtail unless the source is locked in a shielded condition in the camera. It also gives a visual indication of the locked or possibly exposed condition of the isotope source and prevents the source pigtail from being completely pushed out of the camera, even when the lock is released. (author)

  15. The WEBERSAT camera - An inexpensive earth imaging system

    Science.gov (United States)

    Jackson, Stephen; Raetzke, Jeffrey

    WEBERSAT is a 27 pound LEO satellite launched in 1990 into a 500 mile polar orbit. One of its payloads is a low cost CCD color camera system developed by engineering students at Weber State University. The camera is a modified Canon CI-10 with a 25 mm lens, automatic iris, and 780 x 490 pixel resolution. The iris range control potentiometer was made programmable; a 10.7 MHz digitization clock, fixed focus support, and solid tantalum capacitors were added. Camera output signals, composite video, red, green, blue, and the digitization clock are fed to a flash digitizer, where they are processed for storage in RAM. Camera control commands are stored and executed via the onboard computer. The CCD camera has successfully imaged meteorological features of the earth, land masses, and a number of astronomical objects.

  16. Enhancing image quality produced by IR cameras

    Science.gov (United States)

    Dulski, R.; Powalisz, P.; Kastek, M.; Trzaskawka, P.

    2010-10-01

    Images produced by IR cameras are a specific source of information. The perception and interpretation of such an image greatly depend on the thermal properties of the observed object and the surrounding scenery. In practice, the optimal settings of the camera and automatic temperature range control do not guarantee that the displayed image is optimal from the observer's point of view. A solution could be digital image processing methods and algorithms implemented in the camera. Such a solution should provide intelligent, dynamic contrast control applied not only across the entire image but also selectively to specific areas, in order to maintain optimal visualization of the observed scenery. The paper discusses problems related to improving the visibility of low-contrast objects and presents a method of image enhancement. The algorithm is based on adaptive histogram equalization. The image enhancement algorithm was tested on real IR images. The algorithm significantly improves the image quality and the effectiveness of object detection for the majority of thermal images, and due to its adaptive nature it should be effective for any given thermal image. The application of such an algorithm is a promising alternative to more expensive opto-electronic components such as improved optics and detectors.
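    The paper's algorithm is not reproduced in the record above; as a simple stand-in for adaptive histogram equalization on a thermal frame, the sketch below uses OpenCV's CLAHE (contrast-limited adaptive histogram equalization) after a plain min-max stretch of the raw radiometric data. The library, the clip limit, and the tile size are assumptions for illustration.

```python
# Minimal sketch: adaptive histogram equalization of a thermal image with
# OpenCV's CLAHE; clip limit and tile size are tuning parameters.
import cv2

def enhance_ir(frame_16bit):
    # map the raw radiometric range to 8 bits first (simple min-max stretch)
    frame_8bit = cv2.normalize(frame_16bit, None, 0, 255,
                               cv2.NORM_MINMAX).astype('uint8')
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(frame_8bit)
```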

  17. A faint galaxy redshift survey behind massive clusters

    Energy Technology Data Exchange (ETDEWEB)

    Frye, Brenda Louise [Univ. of California, Berkeley, CA (United States)

    1999-05-01

    This thesis is concerned with the gravitational lensing effect of massive galaxy clusters. We have explored a new technique for measuring galaxy masses and for detecting high-z galaxies by their optical colors. A redshift survey has been obtained at the Keck for a magnitude-limited sample of objects (I<23) behind three clusters, A1689, A2390, and A2218, within a radius of 0.5 Mpc. For each cluster we see a clear trend of both increasing flux and increasing redshift towards the center. This behavior is the result of image magnification, such that at fixed redshift one sees further down the luminosity function. The gradient of this magnification is, unlike measurements of image distortion, sensitive to the mass profile, and is found to depart strongly from a pure isothermal halo. We have found that VRI color selection can be used effectively as a discriminant for finding high-z galaxies behind clusters, and we present five 4.1 < z < 5.1 spectra which are of very high quality due to their high mean magnification of ~20, showing strong, visibly saturated interstellar metal lines in some cases. We have also investigated the radio ring lens PKS 1830-211, locating the source and multiple images, and detected molecular absorption at mm wavelengths. Broad molecular absorption of width ~40 km/s is found toward the southwest component only, where surprisingly it does not reach the base of the continuum, which implies incomplete coverage of the SW component by molecular gas, despite the small projected size of the source, less than ~8h pc at the absorption redshift.

  18. CALIBRATION OF LOW COST RGB AND NIR UAV CAMERAS

    Directory of Open Access Journals (Sweden)

    A. Fryskowska

    2016-06-01

    Full Text Available Non-metric digital cameras are being widely used for photogrammetric studies. The increase in resolution and quality of images obtained by non-metric cameras allows them to be used in low-cost UAV and terrestrial photogrammetry. Imagery acquired with non-metric cameras can be used in 3D modeling of objects or landscapes, reconstruction of historical sites, generation of digital elevation models (DTM), orthophotos, or in the assessment of accidents. Non-metric digital cameras are characterized by instability and by unknown interior orientation parameters. Therefore, the use of these devices requires prior calibration. The calibration research was conducted using a non-metric camera, different calibration tests and various software. The first part of the paper contains a brief theoretical introduction, including basic definitions such as the construction of non-metric cameras and a description of different optical distortions. The second part of the paper describes the camera calibration process and the details of the calibration methods and models that have been used. The Sony Nex 5 camera calibration has been done using the following software: Image Master Calib, the Matlab Camera Calibrator application and Agisoft Lens. For the study, 2D test fields have been used. As part of the research, a comparative analysis of the results has been done.

  19. Calibration of Low Cost RGB and NIR Uav Cameras

    Science.gov (United States)

    Fryskowska, A.; Kedzierski, M.; Grochala, A.; Braula, A.

    2016-06-01

    Non-metric digital cameras are being widely used for photogrammetric studies. The increase in resolution and quality of images obtained by non-metric cameras allows them to be used in low-cost UAV and terrestrial photogrammetry. Imagery acquired with non-metric cameras can be used in 3D modeling of objects or landscapes, reconstruction of historical sites, generation of digital elevation models (DTM), orthophotos, or in the assessment of accidents. Non-metric digital cameras are characterized by instability and by unknown interior orientation parameters. Therefore, the use of these devices requires prior calibration. The calibration research was conducted using a non-metric camera, different calibration tests and various software. The first part of the paper contains a brief theoretical introduction, including basic definitions such as the construction of non-metric cameras and a description of different optical distortions. The second part of the paper describes the camera calibration process and the details of the calibration methods and models that have been used. The Sony Nex 5 camera calibration has been done using the following software: Image Master Calib, the Matlab Camera Calibrator application and Agisoft Lens. For the study, 2D test fields have been used. As part of the research, a comparative analysis of the results has been done.

  20. The Eye of the Camera

    NARCIS (Netherlands)

    van Rompay, Thomas Johannes Lucas; Vonk, Dorette J.; Fransen, M.L.

    2009-01-01

    This study addresses the effects of security cameras on prosocial behavior. Results from previous studies indicate that the presence of others can trigger helping behavior, arising from the need for approval of others. Extending these findings, the authors propose that security cameras can likewise

  1. Benchmarking the Optical Resolving Power of Uav Based Camera Systems

    Science.gov (United States)

    Meißner, H.; Cramer, M.; Piltz, B.

    2017-08-01

    UAV based imaging and 3D object point generation is an established technology. Some UAV users try to address (very) high-accuracy applications, i.e. inspection or monitoring scenarios. In order to guarantee such a level of detail and accuracy, high-resolving imaging systems are mandatory. Furthermore, image quality considerably impacts photogrammetric processing, as the tie point transfer, mandatory for forming the block geometry, fully relies on the radiometric quality of the images. Thus, empirical testing of radiometric camera performance is an important issue, in addition to the standard (geometric) calibration, which is normally covered first. Within this paper the resolving power of ten different camera/lens installations has been investigated. The selected systems represent different camera classes, like DSLRs, system cameras, larger format cameras and proprietary systems. As the systems have been tested in well-controlled laboratory conditions and objective quality measures have been derived, individual performance can be compared directly, thus representing a first benchmark on the radiometric performance of UAV cameras. The results have shown that not only the selection of an appropriate lens and camera body has an impact; in addition, the image pre-processing, i.e. the use of a specific debayering method, significantly influences the final resolving power.

  2. BENCHMARKING THE OPTICAL RESOLVING POWER OF UAV BASED CAMERA SYSTEMS

    Directory of Open Access Journals (Sweden)

    H. Meißner

    2017-08-01

    Full Text Available UAV based imaging and 3D object point generation is an established technology. Some UAV users try to address (very) high-accuracy applications, i.e. inspection or monitoring scenarios. In order to guarantee such a level of detail and accuracy, high-resolving imaging systems are mandatory. Furthermore, image quality considerably impacts photogrammetric processing, as the tie point transfer, mandatory for forming the block geometry, fully relies on the radiometric quality of the images. Thus, empirical testing of radiometric camera performance is an important issue, in addition to the standard (geometric) calibration, which is normally covered first. Within this paper the resolving power of ten different camera/lens installations has been investigated. The selected systems represent different camera classes, like DSLRs, system cameras, larger format cameras and proprietary systems. As the systems have been tested in well-controlled laboratory conditions and objective quality measures have been derived, individual performance can be compared directly, thus representing a first benchmark on the radiometric performance of UAV cameras. The results have shown that not only the selection of an appropriate lens and camera body has an impact; in addition, the image pre-processing, i.e. the use of a specific debayering method, significantly influences the final resolving power.

  3. Active learning in camera calibration through vision measurement application

    Science.gov (United States)

    Li, Xiaoqin; Guo, Jierong; Wang, Xianchun; Liu, Changqing; Cao, Binfang

    2017-08-01

    Since cameras are increasingly used in scientific applications as well as in applications requiring precise visual information, effective calibration of such cameras is becoming more important. There are many reasons why the measurements of objects are not accurate. The largest is that the lens has distortion. Another detrimental influence on the evaluation accuracy is caused by perspective distortions in the image, which occur whenever we cannot mount the camera perpendicularly to the objects we want to measure. Overall, it is very important for students to understand how to correct lens distortions, that is, camera calibration. If the camera is calibrated, the images can be rectified, and it is then possible to obtain undistorted measurements in world coordinates. This paper presents how students can develop a sense of active learning for the mathematical camera model, beyond the theoretical scientific basics. The authors present theoretical and practical lectures with the goal of deepening the students' understanding of the mathematical models of area scan cameras and of building a practical vision measurement process by themselves.
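    The rectification step mentioned above can be illustrated with a short sketch: once a camera matrix K and distortion coefficients are available from calibration, each image can be undistorted so that pixel measurements relate linearly to world coordinates. OpenCV is assumed; K and dist are placeholders for the outputs of a prior calibration.

```python
# Hedged sketch: undistort an image with previously estimated intrinsics
# and distortion coefficients, keeping only the valid region of the result.
import cv2

def undistort_image(image, K, dist):
    h, w = image.shape[:2]
    # refine the camera matrix so the full undistorted image is retained
    new_K, roi = cv2.getOptimalNewCameraMatrix(K, dist, (w, h), 1, (w, h))
    undistorted = cv2.undistort(image, K, dist, None, new_K)
    x, y, rw, rh = roi
    return undistorted[y:y + rh, x:x + rw]
```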

  4. Photogrammetric Applications of Immersive Video Cameras

    Science.gov (United States)

    Kwiatek, K.; Tokarczyk, R.

    2014-05-01

    The paper investigates immersive videography and its application in close-range photogrammetry. Immersive video involves the capture of a live-action scene that presents a 360° field of view. It is recorded simultaneously by multiple cameras or microlenses, where the principal point of each camera is offset from the rotating axis of the device. This causes problems when stitching together individual frames of video separated from particular cameras; however, there are ways to overcome it, and applying immersive cameras in photogrammetry offers new potential. The paper presents two applications of immersive video in photogrammetry. First, the creation of a low-cost mobile mapping system based on the Ladybug®3 and a GPS device is discussed. The number of panoramas is much too high for photogrammetric purposes, as the baseline between spherical panoramas is around 1 metre. More than 92 000 panoramas were recorded in the Polish region of Czarny Dunajec, and measurements from the panoramas enable the user to measure outdoor advertising structures and billboards. A new law is being created in order to limit the number of illegal advertising structures in the Polish landscape, and immersive video recorded in a short period of time is a candidate for economical and flexible off-site measurements. The second approach is the generation of 3D video-based reconstructions of heritage sites based on immersive video (structure from immersive video). A mobile camera mounted on a tripod dolly was used to record the interior scene, and the immersive video, separated into thousands of still panoramas, was converted into 3D objects using Agisoft Photoscan Professional. The findings from these experiments demonstrate that immersive photogrammetry is a flexible and prompt method of 3D modelling and provides promising features for mobile mapping systems.
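    As a generic illustration of the stitching step discussed above (not the Ladybug-specific pipeline), OpenCV's high-level Stitcher can compose a set of overlapping frames extracted from the video into a panorama; the function names are OpenCV's, and the frame list is a placeholder.

```python
# Hedged sketch: stitch overlapping frames into a panorama with OpenCV's
# generic Stitcher; real immersive-camera pipelines handle parallax from
# offset principal points with more specialised methods.
import cv2

def stitch_frames(frames):
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama
```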

  5. A direct-view customer-oriented digital holographic camera

    Science.gov (United States)

    Besaga, Vira R.; Gerhardt, Nils C.; Maksimyak, Peter P.; Hofmann, Martin R.

    2018-01-01

    In this paper, we propose a direct-view digital holographic camera system consisting mostly of customer-oriented components. The camera system is based on standard photographic units such as camera sensor and objective and is adapted to operate under off-axis external white-light illumination. The common-path geometry of the holographic module of the system ensures direct-view operation. The system can operate in both self-reference and self-interference modes. As a proof of system operability, we present reconstructed amplitude and phase information of a test sample.

  6. Gamma camera system

    International Nuclear Information System (INIS)

    Miller, D.W.; Gerber, M.S.

    1982-01-01

    The invention provides a composite solid state detector for use in deriving a display, by spatial coordinate information, of the distribution of radiation emanating from a source within a region of interest, comprising several solid state detector components, each having a given surface arranged for exposure to impinging radiation and exhibiting discrete interactions therewith at given spatially definable locations. The surface of each component and the surface disposed opposite and substantially parallel thereto are associated with impedance means configured to provide for each opposed surface outputs for signals relating the given location of the interactions with one spatial coordinate parameter of one select directional sense. The detector components are arranged to provide groupings of adjacently disposed surfaces mutually linearly oriented to exhibit a common directional sense of the spatial coordinate parameter. Means interconnect at least two of the outputs associated with each of the surfaces within a given grouping for collecting the signals deriving therefrom. The invention also provides a camera system for imaging the distribution of a source of gamma radiation situated within a region of interest

  7. Robust automatic camera pointing for airborne surveillance

    Science.gov (United States)

    Dwyer, David; Wren, Lee; Thornton, John; Bonsor, Nigel

    2002-08-01

    Airborne electro-optic surveillance from a moving platform currently requires regular interaction from a trained operator. Even simple tasks such as fixating on a static point on the ground can demand constant adjustment of the camera orientation to compensate for platform motion. In order to free up operator time for other tasks such as navigation and communication with ground assets, an automatic gaze control system is needed. This paper describes such a system, based purely on tracking points within the video image. A number of scene points are automatically selected and their inter-frame motion tracked. The scene motion is then estimated using a model of a planar projective transform. For reliable and accurate camera pointing, the modeling of the scene motion must be robust to common problems such as scene point obscuration, objects moving independently within the scene and image noise. This paper details a COTS based system for automatic camera fixation and describes ways of preventing objects moving in the scene or poor motion estimates from corrupting the scene motion model.
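    The planar projective model mentioned above can be sketched in a few lines: corner features are tracked between consecutive frames and a homography is fitted robustly, so that tracks lying on independently moving objects or corrupted by noise are rejected as outliers. The sketch assumes OpenCV and greyscale frames; parameter values are illustrative, and this is not the COTS implementation described in the paper.

```python
# Illustrative sketch: estimate inter-frame scene motion as a homography,
# with RANSAC rejecting tracks that do not follow the dominant (ground) plane.
import cv2

def estimate_scene_motion(prev_gray, curr_gray):
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                       qualityLevel=0.01, minDistance=8)
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   pts_prev, None)
    ok = status.ravel() == 1
    src = pts_prev[ok].reshape(-1, 2)
    dst = pts_curr[ok].reshape(-1, 2)
    # RANSAC discards tracks on moving objects or bad motion estimates
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H, inliers
```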

  8. Identification of faint central stars in extended, low-surface-brightness planetary nebulae

    International Nuclear Information System (INIS)

    Kwitter, K.B.; Lydon, T.J.; Jacoby, G.H.

    1988-01-01

    As part of a larger program to study the properties of planetary nebula central stars, a search for faint central stars in extended, low-surface-brightness planetary nebulae using CCD imaging is performed. Of 25 target nebulae, central star candidates have been identified in 17, with certainties ranging from extremely probable to possible. Observed V values in the central star candidates extend to fainter than 23 mag. The identifications are presented along with the resulting photometric measurements. 24 references

  9. Faint Radio Sources in the NOAO Bootes Field. VLBA Imaging And Optical Identifications

    Energy Technology Data Exchange (ETDEWEB)

    Wrobel, J.M.; /NRAO, Socorro; Taylor, Greg B.; /NRAO, Socorro /KIPAC, Menlo Park; Rector, T.A.; /NRAO, Socorro /Alaska U.; Myers, S.T.; /NRAO, Socorro; Fassnacht, C.D.; /UC,

    2005-06-13

    As a step toward investigating the parsec-scale properties of faint extragalactic radio sources, the Very Long Baseline Array (VLBA) was used at 5.0 GHz to obtain phase-referenced images of 76 sources in the NOAO Bootes field. These 76 sources were selected from the FIRST catalog to have peak flux densities above 10 mJy at 5'' resolution and deconvolved major diameters of less than 3'' at 1.4 GHz. Fifty-five of these faint radio sources were identified with accretion-powered radio galaxies and quasars brighter than 25.5 mag in the optical I band. On VLA scales at 1.4 GHz, a measure of the compactness of the faint sources (the ratio of the peak flux density from FIRST to the integrated flux density from the NVSS catalog) spans the full range of possibilities arising from source-resolution effects. Thirty of the faint radio sources, or 39(+9/-7)%, were detected with the VLBA at 5.0 GHz with peak flux densities above 6σ ~ 2 mJy at 2 mas resolution. The VLBA detections occur through the full range of compactness ratios. The stronger VLBA detections can themselves serve as phase-reference calibrators, boding well for opening up much of the radio sky to VLBA imaging. For the adopted cosmology, the VLBA resolution corresponds to 17 pc or finer. Most VLBA detections are unresolved or slightly resolved, but one is diffuse and five show either double or core-jet structures; the properties of these latter six are discussed in detail. Eight VLBA detections are unidentified and fainter than 25.5 mag in the optical I band; their properties are highlighted because they likely mark optically-obscured active nuclei at high redshift.

  10. DETECTION OF FAINT EXTENDED SOURCES IN HYPERSPECTRAL DATA AND APPLICATION TO HDF-S MUSE OBSERVATIONS

    OpenAIRE

    Courbot, Jean-Baptiste; Mazet, Vincent; MONFRINI, Emmanuel; Collet, Christophe

    2016-01-01

    International audience; The circum-galactic medium surrounding galaxies has been detected only in isolated cases, and its morphology remains largely unknown. The Multi-Unit Spectroscopic Explorer (MUSE) spectro-imager provides for the first time both the spectral and spatial resolution needed to map such features. The problem lies in the statistical detection of faint, spatially extended sources in massive hyperspectral images such as those provided by MUSE, which has not been previously handled. This paper presents a...

  11. A trajectory observer for camera-based underwater motion measurements

    DEFF Research Database (Denmark)

    Berg, Tor; Jouffroy, Jerome; Johansen, Vegar

    This work deals with the issue of estimating the trajectory of a vehicle or object moving underwater based on camera measurements. The proposed approach consists of a diffusion-based trajectory observer (Jouffroy and Opderbecke, 2004) processing whole segments of a trajectory at a time. Additionally, the observer contains a Tikhonov regularizer for smoothing the estimates. A method for including the camera measurements in an appropriate manner is then proposed.
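    The observer itself is not described in the record; the sketch below only illustrates Tikhonov-regularized smoothing of a noisy trajectory segment, i.e. the role such a regularizer plays, using a second-difference penalty solved as a linear system. NumPy is assumed, and the smoothing weight is arbitrary.

```python
# Minimal sketch of Tikhonov-regularized smoothing of a trajectory segment:
# minimise ||x - y||^2 + lam * ||D x||^2, with D the second-difference operator.
import numpy as np

def smooth_trajectory(measured, lam=10.0):
    """measured: (N, d) array of noisy positions; lam: smoothing weight."""
    n = measured.shape[0]
    # second-difference operator penalising curvature of the estimate
    D = np.diff(np.eye(n), n=2, axis=0)
    # solve (I + lam * D^T D) x = y for each coordinate
    A = np.eye(n) + lam * D.T @ D
    return np.linalg.solve(A, measured)

# usage: smooth = smooth_trajectory(np.column_stack([x_noisy, y_noisy, z_noisy]))
```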

  12. Comment on "Clouds and the Faint Young Sun Paradox" by Goldblatt and Zahnle (2011

    Directory of Open Access Journals (Sweden)

    R. Rondanelli

    2012-03-01

    Full Text Available Goldblatt and Zahnle (2011) raise a number of issues related to the possibility that cirrus clouds can provide a solution to the faint young sun paradox. Here, we argue that: (1) climates having a lower than present mean surface temperature cannot be discarded as solutions to the faint young sun paradox, (2) the detrainment from deep convective clouds in the tropics is a well-established physical mechanism for the formation of high clouds that have a positive radiative forcing (even if the possible role of these clouds as a negative climate feedback remains controversial), and (3) even if some cloud properties are not mutually consistent with observations in radiative transfer parameterizations, the most relevant consistency (for the purpose of hypothesis testing) is with observations of the cloud radiative forcing. Therefore, we maintain that cirrus clouds, as observed in the current climate and covering a large region of the tropics, can provide a solution to the faint young sun paradox, or at least ease the amount of CO2 or other greenhouse substances needed to provide temperatures above freezing during the Archean.

  13. The Faint End of the z = 5 Quasar Luminosity Function from the CFHTLS

    Science.gov (United States)

    McGreer, Ian D.; Fan, Xiaohui; Jiang, Linhua; Cai, Zheng

    2018-03-01

    We present results from a spectroscopic survey of z ∼ 5 quasars in the CFHT Legacy Survey. Using both optical color selection and a likelihood method, we select 97 candidates over an area of 105 deg2 down to a faint i_AB limit and confirm z > 4 quasars among them. This sample extends measurements of the quasar luminosity function ∼1.5 mag fainter than our previous work in Sloan Digital Sky Survey Stripe 82. The resulting luminosity function is in good agreement with our previous results, and suggests that the faint end slope is not steep. We perform a detailed examination of our survey completeness, particularly the impact of the Lyα emission assumed in our quasar spectral models, and find hints that the observed Lyα emission from faint z ∼ 5 quasars is weaker than for z ∼ 3 quasars at a similar luminosity. Our results strongly disfavor a significant contribution of faint quasars to the hydrogen-ionizing background at z = 5.

  14. Infrared-faint radio sources remain undetected at far-infrared wavelengths. Deep photometric observations using the Herschel Space Observatory

    Science.gov (United States)

    Herzog, A.; Norris, R. P.; Middelberg, E.; Spitler, L. R.; Leipski, C.; Parker, Q. A.

    2015-08-01

    Context. Showing 1.4 GHz flux densities in the range of a few to a few tens of mJy, infrared-faint radio sources (IFRS) are a type of galaxy characterised by faint or absent near-infrared counterparts and consequently extreme radio-to-infrared flux density ratios up to several thousand. Recent studies showed that IFRS are radio-loud active galactic nuclei (AGNs) at redshifts ≳2, potentially linked to high-redshift radio galaxies (HzRGs). Aims: This work explores the far-infrared emission of IFRS, providing crucial information on the star forming and AGN activity of IFRS. Furthermore, the data enable examining the putative relationship between IFRS and HzRGs and testing whether IFRS are more distant or fainter siblings of these massive galaxies. Methods: A sample of six IFRS was observed with the Herschel Space Observatory between 100 μm and 500 μm. Using these results, we constrained the nature of IFRS by modelling their broad-band spectral energy distribution (SED). Furthermore, we set an upper limit on their infrared SED and decomposed their emission into contributions from an AGN and from star forming activity. Results: All six observed IFRS were undetected in all five Herschel far-infrared channels (stacking limits: σ = 0.74 mJy at 100 μm, σ = 3.45 mJy at 500 μm). Based on our SED modelling, we ruled out the following objects to explain the photometric characteristics of IFRS: (a) known radio-loud quasars and compact steep-spectrum sources at any redshift; (b) starburst galaxies with and without an AGN and Seyfert galaxies at any redshift, even if the templates were modified; and (c) known HzRGs at z ≲ 10.5. We find that the IFRS analysed in this work can only be explained by objects that fulfil the selection criteria of HzRGs. More precisely, IFRS could be (a) known HzRGs at very high redshifts (z ≳ 10.5); (b) low-luminosity siblings of HzRGs with additional dust obscuration at lower redshifts; (c) scaled or unscaled versions of Cygnus A at any

  15. Habitat Mapping Camera (HABCAM)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset entails imagery collected using the HabCam towed underwater vehicle and annotated data on objects or habitats in the images and notes on image...

  16. Comparison of polarimetric cameras

    Science.gov (United States)

    2017-03-01

    ...and darker objects will have higher degrees of polarization. Quantitative data have been collected on the Moon that display the Umov effect for different phases and regions of the Moon (Zubko 2011). The effect relates the wavelength, color, and texture of an object to its polarization. ...and registering the photos after saving them. If the trigger mode was unsuccessful, the resulting images cause errors in registration. An attempt at

  17. True RGB line scan camera for color machine vision applications

    Science.gov (United States)

    Lemstrom, Guy F.

    1994-11-01

    In this paper a true RGB 3-chip color line scan camera is described. The camera was developed mainly for accurate color measurement in industrial applications. Due to the camera's modularity it is also possible to use it as a B/W camera. The color separation is made with an RGB beam splitter. The CCD linear arrays are fixed with high accuracy to the beam splitter outputs so that the pixels of the three different CCDs are matched to each other. This makes the color analysis simple compared to color line arrays, where line or pixel matching has to be done. The beam splitter can be custom made to separate spectral components other than standard RGB. The spectral range is from 200 to 1000 nm for most CCDs, and two or three spectral areas can be measured separately with the beam splitter. The camera is fully digital and has a 16-bit parallel computer interface to communicate with a signal processing board. Because of the open architecture of the camera, it is possible for the customer to design a board with special functions handling the preprocessing of the data (for example RGB to HSI conversion). The camera can also be equipped with a high-speed CPU board with enough local memory to do some image processing inside the camera before sending the data onward. The camera has been used in real industrial applications and has proven that its high resolution and high dynamic range can be used to measure small color differences in order to separate or grade objects such as minerals, food, or other materials that cannot be measured with a black-and-white camera.
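
    As a concrete illustration of the in-camera preprocessing mentioned above, the sketch below implements one common RGB to HSI conversion using the standard textbook formulas; it is not the vendor's implementation, and the input channels are assumed to be floats in [0, 1].

        import math

        def rgb_to_hsi(r, g, b, eps=1e-10):
            i = (r + g + b) / 3.0                           # intensity
            s = 0.0 if i < eps else 1.0 - min(r, g, b) / i  # saturation
            num = 0.5 * ((r - g) + (r - b))
            den = math.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
            h = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
            if b > g:                                       # hue lies in (180, 360) when B > G
                h = 360.0 - h
            return h, s, i

        print(rgb_to_hsi(0.9, 0.2, 0.1))   # a reddish pixel: hue near 0 degrees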

  18. CCD TV camera, TM1300

    International Nuclear Information System (INIS)

    Takano, Mitsuo; Endou, Yukio; Nakayama, Hideo

    1982-01-01

    A black-and-white TV camera, the TM 1300, has been developed using an interline-transfer CCD and outperforms the frame-transfer CCD cameras marketed since 1980: it has a greater number of horizontal picture elements and far smaller input power (less than 2 W at 9 V), uses hybrid ICs for the CCD driver unit to reduce the size of the camera, and exhibits no picture distortion or burn-in. In addition, its peripheral equipment, such as the camera housing and the pan-and-tilt head, has been miniaturized as well. Its applications are also expected to widen to industrial TV. (author)

  19. High Quality Camera Surveillance System

    OpenAIRE

    Helaakoski, Ari

    2015-01-01

    This master's thesis (Oulu University of Applied Sciences, Information Technology, Spring 2015, supervised by Kari Jyrkkä) was commissioned by iProtoXi Oy and carried out for one iProtoXi customer. The aim of the thesis was to build a camera surveillance system using a high-quality camera with pan and tilt capability. It should b...

  20. Control system for gamma camera

    International Nuclear Information System (INIS)

    Miller, D.W.

    1977-01-01

    An improved gamma camera arrangement is described which utilizes a solid-state detector formed of high-purity germanium. The central arrangement of the camera carries out a trapezoidal filtering operation over antisymmetrically summed spatial signals through gated integration procedures using idealized integrating intervals. By simultaneously carrying out peak energy evaluation of the input signals, desirable control over pulse pile-up phenomena is achieved. Additionally, using the time derivative of the incoming pulse or signal energy information to initially enable the control system provides a low-level information evaluation that enhances the signal-processing efficiency of the camera.

  1. Modular scintillation camera

    International Nuclear Information System (INIS)

    Barrett, H. H.

    1985-01-01

    Improved optical coupling modules to be used in coded-aperture-type radiographic imaging systems. In a first system, a rotating slit coded-aperture is employed between the radioactive object and the module. The module consists of one pair of side-by-side photomultipliers receiving light rays from a scintillation crystal exposed to the object via the coded-aperture. The light rays are guided to the photomultipliers by a mask having a central transverse transparent window, or by a cylindrical lens, the mask or lens being mounted in a light-conveying quartz block assembly providing internal reflections at opposite faces of the assembly. This generates output signals from the photomultipliers which can be utilized to compute one-dimensional coordinate values for restoring the image of the radioactive object on a display screen. In another form of optical coupling module, usable with other types of coded-apertures, four square photomultipliers form a substantially square block and receive light rays from scintillations from a scintillation crystal exposed to the radioactive object via the coded-aperture. The light rays are guided to the photomultipliers by a square mask or a centrally transparent square lens configuration mounted in a light-conveying assembly formed by internally reflecting quartz blocks, the optical rays being directed to the respective photomultipliers so as to generate resultant output signals which can be utilized to compute image coordinate values for two-dimensional representation of the radioactive object being examined

  2. Analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

    This research work was carried out to develop a diagnostic analyzer for gamma cameras. It consists of an electronic system, with both hardware and software components, that operates on the four head position signals acquired from a gamma camera detector. The result is the spectrum of the energy delivered by nuclear radiation coming from the camera detector head. The system performs analog processing of the position signals from the camera, digitization and subsequent processing of the energy signal in a multichannel analyzer, transmission of the data to a computer via a standard USB port, and processing of the data on a personal computer to obtain the final histogram. The circuits comprise an analog processing board and a universal kit with a microcontroller and a programmable gate array. (Author)
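
    The abstract does not spell out how the four head position signals are combined into an energy value; the sketch below shows the standard Anger-logic combination commonly used for this purpose. It is an illustration under that assumption, not the authors' firmware, and the variable names are invented.

        def anger_logic(xp, xm, yp, ym):
            """Combine the four head signals (X+, X-, Y+, Y-) of an Anger-type camera."""
            e = xp + xm + yp + ym      # summed signal, proportional to deposited energy
            x = (xp - xm) / e          # normalized position estimates
            y = (yp - ym) / e
            return e, x, y

        # Histogramming many such energy values in the multichannel analyzer
        # yields the spectrum described above.
        print(anger_logic(0.30, 0.20, 0.28, 0.22))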

  3. New generation of meteorology cameras

    Science.gov (United States)

    Janout, Petr; Blažek, Martin; Páta, Petr

    2017-12-01

    A new generation of the WILLIAM (WIde-field aLL-sky Image Analyzing Monitoring system) camera includes new features such as monitoring of rain and storm clouds during daytime observation. Development of the new generation of weather monitoring cameras responds to the demand for monitoring of sudden weather changes. Moreover, the new WILLIAM cameras can process acquired image data immediately, issue warnings of sudden torrential rain, and send them to the user's cell phone and email. Actual weather conditions are determined from the image data, and the results of image processing are complemented by data from temperature, humidity, and atmospheric pressure sensors. In this paper, we present the architecture and image data processing algorithms of this monitoring camera, together with a spatially-variant model of the imaging system aberrations based on Zernike polynomials.
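
    For readers unfamiliar with the aberration model mentioned above, the sketch below evaluates two low-order, Noll-normalized Zernike terms (defocus and 0-degree astigmatism) on unit-disk coordinates and sums them into a wavefront error. It is a generic illustration, not the WILLIAM pipeline, and the coefficients are assumed values.

        import numpy as np

        def zernike_defocus(rho, theta):
            return np.sqrt(3.0) * (2.0 * rho ** 2 - 1.0)           # Z4 in the Noll ordering

        def zernike_astigmatism(rho, theta):
            return np.sqrt(6.0) * rho ** 2 * np.cos(2.0 * theta)   # Z6 in the Noll ordering

        def wavefront_error(rho, theta, c4=0.10, c6=0.05):
            """Wavefront error (in waves) as a weighted sum of Zernike terms."""
            return c4 * zernike_defocus(rho, theta) + c6 * zernike_astigmatism(rho, theta)

        print(wavefront_error(0.5, np.pi / 4))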

  4. Astronomy and the camera obscura

    Science.gov (United States)

    Feist, M.

    2000-02-01

    The camera obscura (from Latin meaning darkened chamber) is a simple optical device with a long history. In the form considered here, it can be traced back to 1550. It had its heyday during the Victorian era when it was to be found at the seaside as a tourist attraction or sideshow. It was also used as an artist's drawing aid and, in 1620, the famous astronomer-mathematician, Johannes Kepler used a small tent camera obscura to trace the scenery.

  5. Object and Objective Lost?

    DEFF Research Database (Denmark)

    Lopdrup-Hjorth, Thomas

    2015-01-01

    This paper explores the erosion and problematization of ‘the organization’ as a demarcated entity. Utilizing Foucault's reflections on ‘state-phobia’ as a source of inspiration, I show how an organization-phobia has gained a hold within Organization Theory (OT). By attending to the history of this organization-phobia, the paper argues that OT has become increasingly incapable of speaking about its core object. I show how organizations went from being conceptualized as entities of major importance to becoming theoretically deconstructed and associated with all kinds of ills. Through this history, organizations as distinct entities have been rendered so problematic that they have gradually come to be removed from the center of OT. The costs of this have been rather significant. Besides undermining the grounds that gave OT intellectual credibility and legitimacy to begin with, the organization-phobia...

  6. COSMIC INFRARED BACKGROUND FLUCTUATIONS IN DEEP SPITZER INFRARED ARRAY CAMERA IMAGES: DATA PROCESSING AND ANALYSIS

    International Nuclear Information System (INIS)

    Arendt, Richard G.; Kashlinsky, A.; Moseley, S. H.; Mather, J.

    2010-01-01

    This paper provides a detailed description of the data reduction and analysis procedures that have been employed in our previous studies of spatial fluctuation of the cosmic infrared background (CIB) using deep Spitzer Infrared Array Camera observations. The self-calibration we apply removes a strong instrumental signal from the fluctuations that would otherwise corrupt the results. The procedures and results for masking bright sources and modeling faint sources down to levels set by the instrumental noise are presented. Various tests are performed to demonstrate that the resulting power spectra of these fields are not dominated by instrumental or procedural effects. These tests indicate that the large-scale (≳30') fluctuations that remain in the deepest fields are not directly related to the galaxies that are bright enough to be individually detected. We provide the parameterization of these power spectra in terms of separate instrument noise, shot noise, and power-law components. We discuss the relationship between fluctuations measured at different wavelengths and depths, and the relations between constraints on the mean intensity of the CIB and its fluctuation spectrum. Consistent with growing evidence that the ∼1-5 μm mean intensity of the CIB may not be as far above the integrated emission of resolved galaxies as has been reported in some analyses of DIRBE and IRTS observations, our measurements of spatial fluctuations of the CIB intensity indicate the mean emission from the objects producing the fluctuations is quite low (≳1 nW m^-2 sr^-1 at 3-5 μm), and thus consistent with current γ-ray absorption constraints. The source of the fluctuations may be high-z Population III objects, or a more local component of very low luminosity objects with clustering properties that differ from the resolved galaxies. Finally, we discuss the prospects of the upcoming space-based surveys to directly measure the epochs inhabited by the populations producing these

  7. Cosmic Infrared Background Fluctuations in Deep Spitzer Infrared Array Camera Images: Data Processing and Analysis

    Science.gov (United States)

    Arendt, Richard G.; Kashlinsky, A.; Moseley, S. H.; Mather, J.

    2010-01-01

    This paper provides a detailed description of the data reduction and analysis procedures that have been employed in our previous studies of spatial fluctuation of the cosmic infrared background (CIB) using deep Spitzer Infrared Array Camera observations. The self-calibration we apply removes a strong instrumental signal from the fluctuations that would otherwise corrupt the results. The procedures and results for masking bright sources and modeling faint sources down to levels set by the instrumental noise are presented. Various tests are performed to demonstrate that the resulting power spectra of these fields are not dominated by instrumental or procedural effects. These tests indicate that the large-scale (≳30') fluctuations that remain in the deepest fields are not directly related to the galaxies that are bright enough to be individually detected. We provide the parameterization of these power spectra in terms of separate instrument noise, shot noise, and power-law components. We discuss the relationship between fluctuations measured at different wavelengths and depths, and the relations between constraints on the mean intensity of the CIB and its fluctuation spectrum. Consistent with growing evidence that the ~1-5 μm mean intensity of the CIB may not be as far above the integrated emission of resolved galaxies as has been reported in some analyses of DIRBE and IRTS observations, our measurements of spatial fluctuations of the CIB intensity indicate the mean emission from the objects producing the fluctuations is quite low (≳1 nW m^-2 sr^-1 at 3-5 μm), and thus consistent with current γ-ray absorption constraints. The source of the fluctuations may be high-z Population III objects, or a more local component of very low luminosity objects with clustering properties that differ from the resolved galaxies. Finally, we discuss the prospects of the upcoming space-based surveys to directly measure the epochs inhabited by the populations producing these source

  8. A cooperative control algorithm for camera based observational systems.

    Energy Technology Data Exchange (ETDEWEB)

    Young, Joseph G.

    2012-01-01

    Over the last several years, there has been considerable growth in camera based observation systems for a variety of safety, scientific, and recreational applications. In order to improve the effectiveness of these systems, we frequently desire the ability to increase the number of observed objects, but solving this problem is not as simple as adding more cameras. Quite often, there are economic or physical restrictions that prevent us from adding additional cameras to the system. As a result, we require methods that coordinate the tracking of objects between multiple cameras in an optimal way. In order to accomplish this goal, we present a new cooperative control algorithm for a camera based observational system. Specifically, we present a receding horizon control where we model the underlying optimal control problem as a mixed integer linear program. The benefit of this design is that we can coordinate the actions between each camera while simultaneously respecting its kinematics. In addition, we further improve the quality of our solution by coupling our algorithm with a Kalman filter. Through this integration, we not only add a predictive component to our control, but we use the uncertainty estimates provided by the filter to encourage the system to periodically observe any outliers in the observed area. This combined approach allows us to intelligently observe the entire region of interest in an effective and thorough manner.
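
    The predictive component described above can be illustrated with a minimal constant-velocity Kalman filter: the predicted positions feed the camera-assignment optimizer, and large position uncertainties flag targets that should be revisited. This is a sketch under those assumptions, not the report's implementation; the matrices and noise levels are invented.

        import numpy as np

        dt = 1.0
        F = np.array([[1, 0, dt, 0],        # state transition for [x, y, vx, vy]
                      [0, 1, 0, dt],
                      [0, 0, 1,  0],
                      [0, 0, 0,  1]], dtype=float)
        H = np.array([[1, 0, 0, 0],         # only positions are measured
                      [0, 1, 0, 0]], dtype=float)
        Q = 0.01 * np.eye(4)                # process noise (assumed)
        R = 0.5 * np.eye(2)                 # measurement noise (assumed)

        def predict(x, P):
            return F @ x, F @ P @ F.T + Q

        def update(x, P, z):
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (z - H @ x)
            P = (np.eye(4) - K @ H) @ P
            return x, P

        x, P = np.zeros(4), np.eye(4)
        x, P = predict(x, P)
        x, P = update(x, P, np.array([1.0, 0.5]))   # fold in one camera observation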

  9. High quality neutron radiography imaging using cooled CCD camera

    International Nuclear Information System (INIS)

    Kobayashi, Hisao

    1993-01-01

    An electronic imaging technique using a cooled charge-coupled-device camera (C-CCD) was applied to neutron radiography. The camera was examined for linearity of its signal output and for its dynamic range as a function of the number of photons generated in a converter by an incident neutron beam. It is expected that the camera can be applied to high quality NR imaging, especially to tomographic imaging of static objects. When the C-CCD camera is used to obtain tomograms on the basis of these excellent characteristics, the image quality is discussed in terms of the dynamic range of the CT value defined in this paper, together with a guide to the dimensional limits within which tomograms can reasonably be reconstructed. (author)

  10. A practical block detector for a depth encoding PET camera

    International Nuclear Information System (INIS)

    Rogers, J.G.; Moisan, C.; Hoskinson, E.M.

    1995-10-01

    The depth-of-interaction effect in block detectors degrades the image resolution in commercial PET cameras and impedes the natural evolution of smaller, less expensive cameras. A method for correcting the measured position of each detected gamma ray by measuring its depth-of-interaction was tested and found to recover 38% of the lost resolution in a table-top 50 cm diameter camera. To obtain the desired depth sensitivity, standard commercial detectors were modified by a simple and practical process, which is suitable for mass production of the detectors. The impact of the detector modifications on central image resolution and on the ability of the camera to correct for object scatter was also measured. (authors)

  11. Adaptive control of camera position for stereo vision

    Science.gov (United States)

    Crisman, Jill D.; Cleary, Michael E.

    1994-03-01

    A major problem in using two-camera stereo machine vision to perform real-world tasks, such as visual object tracking, is deciding where to position the cameras. Humans accomplish the analogous task by positioning their heads and eyes for optimal stereo effects. This paper describes recent work toward developing automated control strategies for camera motion in stereo machine vision systems for mobile robot navigation. Our goal is to achieve fast, reliable pursuit of a target while avoiding obstacles. Our strategy results in smooth, stable camera motion despite robot and target motion. Our algorithm has been shown to be successful at navigating a mobile robot, mediating visual target tracking and ultrasonic obstacle detection. The architecture, hardware, and simulation results are discussed.

  12. Acceptance/operational test procedure 241-AN-107 Video Camera System

    International Nuclear Information System (INIS)

    Pedersen, L.T.

    1994-01-01

    This procedure will document the satisfactory operation of the 241-AN-107 Video Camera System. The camera assembly, including camera mast, pan-and-tilt unit, camera, and lights, will be installed in Tank 241-AN-107 to monitor activities during the Caustic Addition Project. The camera focus, zoom, and iris remote controls will be functionally tested. The resolution and color rendition of the camera will be verified using standard reference charts. The pan-and-tilt unit will be tested for required ranges of motion, and the camera lights will be functionally tested. The master control station equipment, including the monitor, VCRs, printer, character generator, and video micrometer will be set up and performance tested in accordance with original equipment manufacturer's specifications. The accuracy of the video micrometer to measure objects in the range of 0.25 inches to 67 inches will be verified. The gas drying distribution system will be tested to ensure that a drying gas can be flowed over the camera and lens in the event that condensation forms on these components. This test will be performed by attaching the gas input connector, located in the upper junction box, to a pressurized gas supply and verifying that the check valve, located in the camera housing, opens to exhaust the compressed gas. The 241-AN-107 camera system will also be tested to assure acceptable resolution of the camera imaging components utilizing the camera system lights

  13. Feasibility of Using Video Camera for Automated Enforcement on Red-Light Running and Managed Lanes.

    Science.gov (United States)

    2009-12-25

    The overall objective of this study is to evaluate the feasibility, effectiveness, legality, and public acceptance aspects of automated enforcement on red light running and HOV occupancy requirement using video cameras in Nevada. This objective was a...

  14. The Faint End of the Cluster-galaxy Luminosity Function at High Redshift

    Science.gov (United States)

    Mancone, Conor L.; Baker, Troy; Gonzalez, Anthony H.; Ashby, Matthew L. N.; Stanford, Spencer A.; Brodwin, Mark; Eisenhardt, Peter R. M.; Snyder, Greg; Stern, Daniel; Wright, Edward L.

    2012-12-01

    We measure the faint-end slope of the galaxy luminosity function (LF) for cluster galaxies at 1 < z < 1.5 using Spitzer IRAC data. We investigate whether this slope, α, differs from that of the field LF at these redshifts, and with the cluster LF at low redshifts. The latter is of particular interest as low-luminosity galaxies are expected to undergo significant evolution. We use seven high-redshift spectroscopically confirmed galaxy clusters drawn from the IRAC Shallow Cluster Survey to measure the cluster-galaxy LF down to depths of M* + 3 (3.6 μm) and M* + 2.5 (4.5 μm). The summed LF at our median cluster redshift (z = 1.35) is well fit by a Schechter distribution with α(3.6 μm) = -0.97 ± 0.14 and α(4.5 μm) = -0.91 ± 0.28, consistent with a flat faint-end slope and is in agreement with measurements of the field LF in similar bands at these redshifts. A comparison to α in low-redshift clusters finds no statistically significant evidence of evolution. Combined with past studies which show that M* is passively evolving out to z ~ 1.3, this means that the shape of the cluster LF is largely in place by z ~ 1.3. This suggests that the processes that govern the buildup of the mass of low-mass cluster galaxies have no net effect on the faint-end slope of the cluster LF at z ≲ 1.3.
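
    For reference, the Schechter form quoted above can be written in absolute magnitudes as (a standard expression, not reproduced from the paper):

        \Phi(M)\,dM = 0.4 \ln 10 \; \phi^{*} \, 10^{-0.4(M - M^{*})(\alpha + 1)} \exp\!\left[-10^{-0.4(M - M^{*})}\right] dM

    where M* marks the knee of the luminosity function, φ* sets its normalization, and α is the faint-end slope being measured here.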

  15. THE FAINT END OF THE CLUSTER-GALAXY LUMINOSITY FUNCTION AT HIGH REDSHIFT

    Energy Technology Data Exchange (ETDEWEB)

    Mancone, Conor L.; Baker, Troy; Gonzalez, Anthony H. [Department of Astronomy, University of Florida, Gainesville, FL 32611 (United States); Ashby, Matthew L. N.; Snyder, Greg [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Stanford, Spencer A. [Physics Department, University of California, Davis, CA 95616 (United States); Brodwin, Mark [Department of Physics and Astronomy, University of Missouri, 5110 Rockhill Road, Kansas City, MO 64110 (United States); Eisenhardt, Peter R. M.; Stern, Daniel [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Pasadena, CA 91109 (United States); Wright, Edward L., E-mail: cmancone@astro.ufl.edu [Department of Physics and Astronomy, University of California, Los Angeles, CA 90095-1547 (United States)

    2012-12-20

    We measure the faint-end slope of the galaxy luminosity function (LF) for cluster galaxies at 1 < z < 1.5 using Spitzer IRAC data. We investigate whether this slope, α, differs from that of the field LF at these redshifts, and with the cluster LF at low redshifts. The latter is of particular interest as low-luminosity galaxies are expected to undergo significant evolution. We use seven high-redshift spectroscopically confirmed galaxy clusters drawn from the IRAC Shallow Cluster Survey to measure the cluster-galaxy LF down to depths of M* + 3 (3.6 μm) and M* + 2.5 (4.5 μm). The summed LF at our median cluster redshift (z = 1.35) is well fit by a Schechter distribution with α(3.6 μm) = -0.97 ± 0.14 and α(4.5 μm) = -0.91 ± 0.28, consistent with a flat faint-end slope and is in agreement with measurements of the field LF in similar bands at these redshifts. A comparison to α in low-redshift clusters finds no statistically significant evidence of evolution. Combined with past studies which show that M* is passively evolving out to z ∼ 1.3, this means that the shape of the cluster LF is largely in place by z ∼ 1.3. This suggests that the processes that govern the buildup of the mass of low-mass cluster galaxies have no net effect on the faint-end slope of the cluster LF at z ≲ 1.3.

  16. Application of digital image processing techniques to faint solar flare phenomena

    Science.gov (United States)

    Glackin, D. L.; Martin, S. F.

    1980-01-01

    Digital image processing of eight solar flare events was performed using the Video Information Communication and Retrieval language in order to study moving emission fronts, flare halos, and Moreton waves. The techniques used include contrast enhancement, isointensity contouring, the differencing of images, spatial filtering, and geometrical registration. The spatial extent and temporal behavior of the faint phenomena is examined along with the relation of the three types of phenomena to one another. The image processing techniques make possible the detailed study of the history of the phenomena and provide clues to their physical nature.
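
    As an illustration of two of the operations listed above, the sketch below applies image differencing and a simple percentile contrast stretch to a pair of co-registered frames. It is a generic NumPy example, not the Video Information Communication and Retrieval code used by the authors, and the synthetic arrays merely stand in for registered filtergrams.

        import numpy as np

        def contrast_stretch(img, lo_pct=1.0, hi_pct=99.0):
            """Rescale intensities so the given percentiles map onto [0, 1]."""
            lo, hi = np.percentile(img, [lo_pct, hi_pct])
            return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

        def difference_image(frame_later, frame_earlier):
            """Highlight moving or brightening features between two registered frames."""
            return frame_later.astype(float) - frame_earlier.astype(float)

        rng = np.random.default_rng(0)
        f0 = rng.normal(100.0, 5.0, (256, 256))     # synthetic "before" frame
        f1 = f0.copy()
        f1[120:130, 120:130] += 40.0                # a faint brightening front
        diff = contrast_stretch(difference_image(f1, f0))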

  17. Application of digital image processing techniques to faint solar flare phenomena

    International Nuclear Information System (INIS)

    Glackin, D.L.; Martin, S.F.

    1980-01-01

    Digital image processing of eight solar flare events was performed using the Video Information Communication and Retrieval language in order to study moving emission fronts, flare halos, and Moreton waves. The techniques used include contrast enhancement, isointensity contouring, the differencing of images, spatial filtering, and geometrical registration. The spatial extent and temporal behavior of the faint phenomena is examined along with the relation of the three types of phenomena to one another. The image processing techniques make possible the detailed study of the history of the phenomena and provide clues to their physical nature

  18. Geological Sulfur Isotopes Indicate Elevated OCS in the Archean Atmosphere, Solving the Faint Young Sun Paradox

    DEFF Research Database (Denmark)

    Ueno, Yuichiro; Johnson, Matthew Stanley; Danielache, Sebastian Oscar

    2009-01-01

    Distributions of sulfur isotopes in geological samples would provide a record of atmospheric composition if the mechanism producing the isotope effects could be described quantitatively. We determined the UV absorption spectra of 32SO2, 33SO2, and 34SO2 and use them to interpret the geological re......-rich, reducing Archean atmosphere. The radiative forcing, due to this level of OCS, is able to resolve the faint young sun paradox. Further, the decline of atmospheric OCS may have caused the late Archean glaciation....

  19. A Faint Luminous Halo that May Trace the Dark Matter around Spiral Galaxy NGC~5907

    OpenAIRE

    Sackett, Penny D.; Morrison, Heather L.; Harding, Paul; Boroson, Todd A.

    1994-01-01

    The presence of unseen halos of "dark matter" has long been inferred from the high rotation speeds of gas and stars in the outer parts of spiral galaxies [1]. The volume density of this dark matter decreases less quickly from the galactic center than does that of the luminous mass (such as that in stars), meaning that the dark matter dominates the mass far from the center [1,2]. While searching for faint starlight away from the plane of the edge-on disk galaxy NGC 5907 [3], we have found ...

  20. Can We Trust the Use of Smartphone Cameras in Clinical Practice? Laypeople Assessment of Their Image Quality.

    Science.gov (United States)

    Boissin, Constance; Fleming, Julian; Wallis, Lee; Hasselberg, Marie; Laflamme, Lucie

    2015-11-01

    Smartphone cameras are rapidly being introduced in medical practice, among other devices for image-based teleconsultation. Little is known, however, about the actual quality of the images taken, which is the object of this study. A series of nonclinical objects (from three broad categories) was photographed by a professional photographer using three smartphones (iPhone(®) 4 [Apple, Cupertino, CA], Samsung [Suwon, Korea] Galaxy S2, and BlackBerry(®) 9800 [BlackBerry Ltd., Waterloo, ON, Canada]) and a digital camera (Canon [Tokyo, Japan] Mark II). In a Web survey, a convenience sample of 60 laypeople "blind" to the types of camera assessed the quality of the photographs, individually and best overall. We then measured how each camera scored by object category and as a whole and whether a camera ranked best, using a Mann-Whitney U test for 2×2 comparisons. There were wide variations between and within categories in the quality assessments for all four cameras. The iPhone had the highest proportion of images individually evaluated as good, and it also ranked best for more objects compared with other cameras, including the digital one. The ratings of the Samsung or the BlackBerry smartphone did not significantly differ from those of the digital camera. Whereas one smartphone camera ranked best more often, all three smartphones obtained results at least as good as those of the digital camera. Smartphone cameras can be a substitute for digital cameras for the purposes of medical teleconsultation.

  1. The fly's eye camera system

    Science.gov (United States)

    Mészáros, L.; Pál, A.; Csépány, G.; Jaskó, A.; Vida, K.; Oláh, K.; Mezö, G.

    2014-12-01

    We introduce the Fly's Eye Camera System, an all-sky monitoring device intended to perform time domain astronomy. This camera system design will provide complementary data sets for other synoptic sky surveys such as LSST or Pan-STARRS. The effective field of view is obtained by 19 cameras arranged in a spherical mosaic form. These individual cameras of the device stand on a hexapod mount that is fully capable of achieving sidereal tracking for the subsequent exposures. This platform has many advantages. First of all, it requires only one type of moving component and does not include unique parts. Hence this design not only eliminates problems implied by unique elements, but the redundancy of the hexapod allows smooth operations even if one or two of the legs are stuck. In addition, it can calibrate itself by observed stars independently of both the geographical location (including the northern and southern hemispheres) and the polar alignment of the full mount. All mechanical elements and electronics are designed within the confines of our institute, Konkoly Observatory. Currently, our instrument is in the testing phase with an operating hexapod and a reduced number of cameras.

  2. Interactive Augmentation of Live Images using a HDR Stereo Camera

    OpenAIRE

    Korn, Matthias; Stange, Maik; von Arb, Andreas; Blum, Lisa; Kreil, Michael; Kunze, Kathrin-Jennifer; Anhenn, Jens; Wallrath, Timo; Grosch, Thorsten

    2007-01-01

    Adding virtual objects to real environments plays an important role in today's computer graphics: typical examples are virtual furniture in a real room and virtual characters in real movies. For a believable appearance, consistent lighting of the virtual objects is required. We present an augmented reality system that displays virtual objects with consistent illumination and shadows in the image of a simple webcam. We use two high dynamic range video cameras with fisheye lenses permanently rec...

  3. Dark Energy Camera for Blanco

    Energy Technology Data Exchange (ETDEWEB)

    Binder, Gary A.; /Caltech /SLAC

    2010-08-25

    In order to make accurate measurements of dark energy, a system is needed to monitor the focus and alignment of the Dark Energy Camera (DECam) to be located on the Blanco 4m Telescope for the upcoming Dark Energy Survey. One new approach under development is to fit out-of-focus star images to a point spread function from which information about the focus and tilt of the camera can be obtained. As a first test of a new algorithm using this idea, simulated star images produced from a model of DECam in the optics software Zemax were fitted. Then, real images from the Mosaic II imager currently installed on the Blanco telescope were used to investigate the algorithm's capabilities. A number of problems with the algorithm were found, and more work is needed to understand its limitations and improve its capabilities so it can reliably predict camera alignment and focus.

  4. EDICAM (Event Detection Intelligent Camera)

    Energy Technology Data Exchange (ETDEWEB)

    Zoletnik, S. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Szabolics, T., E-mail: szabolics.tamas@wigner.mta.hu [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Kocsis, G.; Szepesi, T.; Dunai, D. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary)

    2013-10-15

    Highlights: ► We present EDICAM's hardware modules. ► We present EDICAM's main design concepts. ► This paper will describe EDICAM firmware architecture. ► Operation principles description. ► Further developments. -- Abstract: A new type of fast framing camera has been developed for fusion applications by the Wigner Research Centre for Physics during the last few years. A new concept was designed for intelligent event driven imaging which is capable of focusing image readout to Regions of Interests (ROIs) where and when predefined events occur. At present these events mean intensity changes and external triggers but in the future more sophisticated methods might also be defined. The camera provides 444 Hz frame rate at full resolution of 1280 × 1024 pixels, but monitoring of smaller ROIs can be done in the 1–116 kHz range even during exposure of the full image. Keeping space limitations and the harsh environment in mind the camera is divided into a small Sensor Module and a processing card interconnected by a fast 10 Gbit optical link. This camera hardware has been used for passive monitoring of the plasma in different devices for example at ASDEX Upgrade and COMPASS with the first version of its firmware. The new firmware and software package is now available and ready for testing the new event processing features. This paper will present the operation principle and features of the Event Detection Intelligent Camera (EDICAM). The device is intended to be the central element in the 10-camera monitoring system of the Wendelstein 7-X stellarator.

  5. CHEMICAL DIVERSITY IN THE ULTRA-FAINT DWARF GALAXY TUCANA II

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Alexander P.; Frebel, Anna; Ezzeddine, Rana [Department of Physics and Kavli Institute for Astrophysics and Space Research, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Casey, Andrew R., E-mail: alexji@mit.edu [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge, CB3 0HA (United Kingdom)

    2016-11-20

    We present the first detailed chemical abundance study of the ultra-faint dwarf galaxy Tucana II, based on high-resolution Magellan/MIKE spectra of four red giant stars. The metallicities of these stars range from [Fe/H] = −3.2 to −2.6, and all stars are low in neutron-capture abundances ([Sr/Fe] and [Ba/Fe] < −1). However, a number of anomalous chemical signatures are present. One star is relatively metal-rich ([Fe/H] = −2.6) and shows [Na, α , Sc/Fe] < 0, suggesting an extended star formation history with contributions from AGB stars and SNe Ia. Two stars with [Fe/H] < −3 are mildly carbon-enhanced ([C/Fe] ∼ 0.7) and may be consistent with enrichment by faint supernovae, if such supernovae can produce neutron-capture elements. A fourth star with [Fe/H] = −3 is carbon-normal, and exhibits distinct light element abundance ratios from the carbon-enhanced stars. This carbon-normal star implies that at least two distinct nucleosynthesis sources, both possibly associated with Population III stars, contributed to the early chemical enrichment of this galaxy. Despite its very low luminosity, Tucana II shows a diversity of chemical signatures that preclude it from being a simple “one-shot” first galaxy yet still provide a window into star and galaxy formation in the early universe.

  6. Revealing a comet-like shape of the faint periphery of the nearby galaxy M 32

    Science.gov (United States)

    Georgiev, Ts. B.

    2016-02-01

    We performed BVRI photometry of the galaxy M 32, building images and isophote maps in magnitudes and in color indexes. While searching for the faint thick disk of M 32, we apply median filtering with an aperture of 7.3 arcmin to detach the residual image of M 32 and its periphery from the surrounding magnitude or color background. The residual images in all photometric bands show that the periphery of M 32 possesses a comet-like shape with a tail oriented toward the SSE, in a direction opposite to that of M 110. The images calibrated in the color indexes (b - v) and (b - v)+(r - i) show that the tail is redder than the local median background. The residual images in color indexes show that the red tail broadens and curves toward the S and SW. At the same time, the brightest part of M 32 is bounded on its NW-NE-SE sides by a sickle-like formation with a significantly lower red color index. Overall, we do not find a faint thick disk in M 32. However, the comet-like shape of the periphery of M 32, especially as a formation with an increased red color index, inevitably gives the impression that the satellite M 32 is overtaking the Andromeda galaxy. The redshifts show that the relative velocity of M 32 and the Andromeda galaxy is about 100 km/s.
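
    The background-subtraction step described above, in which a wide median-filtered background is removed from the image so that the residual isolates the faint periphery, can be sketched as follows. This is a generic illustration rather than the authors' pipeline, and the plate scale used to convert the 7.3 arcmin aperture into pixels is an assumed value.

        import numpy as np
        from scipy.ndimage import median_filter

        def residual_image(img, aperture_arcmin=7.3, arcsec_per_pixel=2.0):
            """Subtract a wide median-filtered background to expose faint periphery."""
            size = int(round(aperture_arcmin * 60.0 / arcsec_per_pixel))  # filter width in pixels
            background = median_filter(img, size=size)
            return img - background

        # Usage: residual = residual_image(calibrated_frame)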

  7. THE ORIGIN OF THE HEAVIEST METALS IN MOST ULTRA-FAINT DWARF GALAXIES

    Energy Technology Data Exchange (ETDEWEB)

    Roederer, Ian U., E-mail: iur@umich.edu [Department of Astronomy, University of Michigan, 1085 S. University Ave., Ann Arbor, MI 48109 (United States)

    2017-01-20

    The heaviest metals found in stars in most ultra-faint dwarf (UFD) galaxies in the Milky Way halo are generally underabundant by an order of magnitude or more when compared with stars in the halo field. Among the heavy elements produced by n-capture reactions, only Sr and Ba can be detected in red giant stars in most UFD galaxies. This limited chemical information is unable to identify the nucleosynthesis process(es) responsible for producing the heavy elements in UFD galaxies. Similar [Sr/Ba] and [Ba/Fe] ratios are found in three bright halo field stars, BD−18°5550, CS 22185–007, and CS 22891–200. Previous studies of high-quality spectra of these stars report detections of additional n-capture elements, including Eu. The [Eu/Ba] ratios in these stars span +0.41 to +0.86. These ratios and others among elements in the rare Earth domain indicate an r-process origin. These stars have some of the lowest levels of r-process enhancement known, with [Eu/H] spanning −3.95 to −3.32, and they may be considered nearby proxies for faint stars in UFD galaxies. Direct confirmation, however, must await future observations of additional heavy elements in stars in the UFD galaxies themselves.

  8. Streak cameras and their applications

    International Nuclear Information System (INIS)

    Bernet, J.M.; Imhoff, C.

    1987-01-01

    Over the last several years, development of various measurement techniques in the nanosecond and pico-second range has led to increased reliance on streak cameras. This paper will present the main electronic and optoelectronic performances of the Thomson-CSF TSN 506 cameras and their associated devices used to build an automatic image acquisition and processing system (NORMA). A brief survey of the diversity and the spread of the use of high speed electronic cinematography will be illustrated by a few typical applications [fr

  9. Collimator trans-axial tomographic scintillation camera

    International Nuclear Information System (INIS)

    Jaszczak, Ronald J.

    1979-01-01

    An improved collimator is provided for a scintillation camera system that employs a detector head for transaxial tomographic scanning. One object of this invention is to significantly reduce the time required to obtain statistically significant data in radioisotope scanning using a scintillation camera. Another is to increase the rate of acceptance of radioactive events that contribute to the positional information obtainable from a radiation source of known strength without sacrificing spatial resolution. A further object is to reduce the necessary scanning time without degrading the images obtained. The collimator described has apertures defined by septa of different radiation transparency. The septa are aligned to provide greater radiation shielding from gamma radiation travelling within planes perpendicular to the cranial-caudal axis and less radiation shielding from gamma radiation travelling within other planes. Septa may also define apertures such that the collimator provides high spatial resolution of gamma rays travelling within planes perpendicular to the cranial-caudal axis and directed at the detector, and high radiation sensitivity to gamma radiation travelling within other planes and incident on the detector. (LL)

  10. ALGORITHM OF OBJECT RECOGNITION

    Directory of Open Access Journals (Sweden)

    Loktev Alexey Alexeevich

    2012-10-01

    Full Text Available The second important problem to be resolved by the algorithm and its software, which comprise the automatic design of a complex closed-circuit television system, is the recognition of objects whose images are transmitted by the video camera. Since the imaging of almost any object depends on many factors, including its orientation with respect to the camera, lighting conditions, parameters of the registering system, and static and dynamic parameters of the object itself, it is quite difficult to formalize the image and represent it in the form of a certain mathematical model. Therefore, methods of computer-aided visualization depend substantially on the problems to be solved, and they can rarely be generalized. The majority of these methods are non-linear; therefore, there is a need to increase the computing power and the complexity of the algorithms used to process the image. This paper covers the research of visual object recognition and the implementation of the algorithm in the form of a software application that operates in real time.

  11. The Camera Comes to Court.

    Science.gov (United States)

    Floren, Leola

    After the Lindbergh kidnapping trial in 1935, the American Bar Association sought to eliminate electronic equipment from courtroom proceedings. Eventually, all but two states adopted regulations applying that ban to some extent, and a 1965 Supreme Court decision encouraged the banning of television cameras at trials as well. Currently, some states…

  12. The LSST camera system overview

    Science.gov (United States)

    Gilmore, Kirk; Kahn, Steven; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe

    2006-06-01

    The LSST camera is a wide-field optical (0.35-1 μm) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 Gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope, where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast, f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100°C to achieve desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

  13. Toy Cameras and Color Photographs.

    Science.gov (United States)

    Speight, Jerry

    1979-01-01

    The technique of using toy cameras for both black-and-white and color photography in the art class is described. The author suggests that expensive equipment can limit the growth of a beginning photographer by emphasizing technique and equipment instead of in-depth experience with composition fundamentals and ideas. (KC)

  14. Gamma camera with reflectivity mask

    International Nuclear Information System (INIS)

    Stout, K.J.

    1980-01-01

    In accordance with the present invention there is provided a radiographic camera comprising: a scintillator; a plurality of photodetectors positioned to face said scintillator; a plurality of masked regions formed upon a face of said scintillator opposite said photodetectors and positioned coaxially with respective ones of said photodetectors for decreasing the amount of internal reflection of optical photons generated within said scintillator. (auth)

  15. Active galactic nuclei cores in infrared-faint radio sources. Very long baseline interferometry observations using the Very Long Baseline Array

    Science.gov (United States)

    Herzog, A.; Middelberg, E.; Norris, R. P.; Spitler, L. R.; Deller, A. T.; Collier, J. D.; Parker, Q. A.

    2015-06-01

    Context. Infrared-faint radio sources (IFRS) form a new class of galaxies characterised by radio flux densities between tenths and tens of mJy and faint or absent infrared counterparts. It has been suggested that these objects are radio-loud active galactic nuclei (AGNs) at significant redshifts (z ≳ 2). Aims: Whereas the high redshifts of IFRS have been recently confirmed based on spectroscopic data, the evidence for the presence of AGNs in IFRS is mainly indirect. So far, only two AGNs have been unquestionably confirmed in IFRS based on very long baseline interferometry (VLBI) observations. In this work, we test the hypothesis that IFRS contain AGNs in a large sample of sources using VLBI. Methods: We observed 57 IFRS with the Very Long Baseline Array (VLBA) down to a detection sensitivity in the sub-mJy regime and detected compact cores in 35 sources. Results: Our VLBA detections increase the number of VLBI-detected IFRS from 2 to 37 and provide strong evidence that most - if not all - IFRS contain AGNs. We find that IFRS have a marginally higher VLBI detection fraction than randomly selected sources with mJy flux densities at arcsec-scales. Moreover, our data provide a positive correlation between compactness - defined as the ratio of milliarcsec- to arcsec-scale flux density - and redshift for IFRS, but suggest a decreasing mean compactness with increasing arcsec-scale radio flux density. Based on these findings, we suggest that IFRS tend to contain young AGNs whose jets have not formed yet or have not expanded, equivalent to very compact objects. We found two IFRS that are resolved into two components. The two components are spatially separated by a few hundred milliarcseconds in both cases. They might be components of one AGN, a binary black hole, or the result of gravitational lensing.

  16. Visual Positioning Indoors: Human Eyes vs. Smartphone Cameras.

    Science.gov (United States)

    Wu, Dewen; Chen, Ruizhi; Chen, Liang

    2017-11-16

    Artificial Intelligence (AI) technologies and their related applications are now developing at a rapid pace. Indoor positioning will be one of the core technologies that enable AI applications because people spend 80% of their time indoors. Humans can locate themselves relative to a visually well-defined object, e.g., a door, based on their visual observations. Can a smartphone camera do a similar job when it points to an object? In this paper, a visual positioning solution was developed based on a single image captured from a smartphone camera pointing at a well-defined object. The smartphone camera simulates the process of human eyes for the purpose of locating itself relative to a well-defined object. Extensive experiments were conducted with five types of smartphones in three different indoor settings, including a meeting room, a library, and a reading room. Experimental results show that the average positioning accuracy of the solution based on five smartphone cameras is 30.6 cm, while that of the human-observed solution with 300 samples from 10 different people is 73.1 cm.

  17. OCAMS: The OSIRIS-REx Camera Suite

    Science.gov (United States)

    Rizk, B.; Drouet d'Aubigny, C.; Golish, D.; Fellows, C.; Merrill, C.; Smith, P.; Walker, M. S.; Hendershot, J. E.; Hancock, J.; Bailey, S. H.; DellaGiustina, D. N.; Lauretta, D. S.; Tanner, R.; Williams, M.; Harshman, K.; Fitzgibbon, M.; Verts, W.; Chen, J.; Connors, T.; Hamara, D.; Dowd, A.; Lowman, A.; Dubin, M.; Burt, R.; Whiteley, M.; Watson, M.; McMahon, T.; Ward, M.; Booher, D.; Read, M.; Williams, B.; Hunten, M.; Little, E.; Saltzman, T.; Alfred, D.; O'Dougherty, S.; Walthall, M.; Kenagy, K.; Peterson, S.; Crowther, B.; Perry, M. L.; See, C.; Selznick, S.; Sauve, C.; Beiser, M.; Black, W.; Pfisterer, R. N.; Lancaster, A.; Oliver, S.; Oquest, C.; Crowley, D.; Morgan, C.; Castle, C.; Dominguez, R.; Sullivan, M.

    2018-02-01

    The OSIRIS-REx Camera Suite (OCAMS) will acquire images essential to collecting a sample from the surface of Bennu. During proximity operations, these images will document the presence of satellites and plumes, record spin state, enable an accurate model of the asteroid's shape, and identify any surface hazards. They will confirm the presence of sampleable regolith on the surface, observe the sampling event itself, and image the sample head in order to verify its readiness to be stowed. They will document Bennu's history as an example of early solar system material, as a microgravity body with a planetesimal size-scale, and as a carbonaceous object. OCAMS is fitted with three cameras. The MapCam will record color images of Bennu as a point source on approach to the asteroid in order to connect Bennu's ground-based point-source observational record to later higher-resolution surface spectral imaging. The SamCam will document the sample site before, during, and after it is disturbed by the sample mechanism. The PolyCam, using its focus mechanism, will observe the sample site at sub-centimeter resolutions, revealing surface texture and morphology. While their imaging requirements divide naturally between the three cameras, they preserve a strong degree of functional overlap. OCAMS and the other spacecraft instruments will allow the OSIRIS-REx mission to collect a sample from a microgravity body on the same visit during which it was first optically acquired from long range, a useful capability as humanity reaches out to explore near-Earth, Main-Belt and Jupiter Trojan asteroids.

  18. Performance of Color Camera Machine Vision in Automated Furniture Rough Mill Systems

    Science.gov (United States)

    D. Earl Kline; Agus Widoyoko; Janice K. Wiedenbeck; Philip A. Araman

    1998-01-01

    The objective of this study was to evaluate the performance of color camera machine vision for lumber processing in a furniture rough mill. The study used 134 red oak boards to compare the performance of automated gang-rip-first rough mill yield based on a prototype color camera lumber inspection system developed at Virginia Tech with both estimated optimum rough mill...

  19. A Modified Adaptive Stochastic Resonance for Detecting Faint Signal in Sensors

    Directory of Open Access Journals (Sweden)

    Hengwei Li

    2007-02-01

    Full Text Available In this paper, an approach is presented to detect faint signals buried in strong noise in sensor outputs by stochastic resonance (SR). We adopt the power spectrum, obtained by the fast Fourier transform (FFT), as the evaluation tool for SR. Furthermore, we introduce an adaptive filtering scheme to carry out the signal processing automatically. The key to the scheme is how to adjust the barrier height to satisfy the optimal condition of SR in the presence of any input. For a given input signal, we present an operable procedure to execute the adjustment scheme. An example utilizing one audio sensor to detect fault information from the power supply is given. Simulation results show that th...
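
    A minimal sketch of the evaluation tool named above: the power spectrum of a noisy sensor record computed with the FFT, with a crude peak-to-background ratio at the known signal frequency. The sampling rate, signal frequency, and amplitudes are illustrative assumptions.

```python
import numpy as np

fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
signal = 0.05 * np.sin(2 * np.pi * 50.0 * t)  # faint 50 Hz component
noisy = signal + np.random.normal(0.0, 1.0, t.size)

# Power spectrum via FFT, used here as the SR evaluation tool.
spectrum = np.fft.rfft(noisy)
power = np.abs(spectrum) ** 2 / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

peak_bin = np.argmin(np.abs(freqs - 50.0))
ratio = power[peak_bin] / np.median(power)    # crude peak-to-background ratio
print(f"power at 50 Hz: {power[peak_bin]:.3f}, peak/background: {ratio:.1f}")
```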

  20. Faint emission features in the Mg II resonance-line wings. [in solar spectra

    Science.gov (United States)

    Allen, M. S.; Mcallister, H. C.

    1977-01-01

    Data obtained with a rocket-borne echelle spectrograph are presented which indicate the presence of three faint emission features deep in the cores of the Mg II h and k resonance-line wings in the solar Fraunhofer spectrum. Results of wavelength measurements are discussed, and the relative intensities of the emission features are examined. It is tentatively suggested that the first feature be identified with the Fe II line at 2797.037 A, the second feature is probably the V II line at 2803.469 A, and the third feature may originate in Fe II emission at 2804.021 A. Possible emission mechanisms are proposed, and it is concluded that the detected features may be of potential diagnostic value for the analysis of depth variations of temperature and velocity in the lower chromosphere as well as for solar and possibly stellar spectroscopy.

  1. TOWARD A NETWORK OF FAINT DA WHITE DWARFS AS HIGH-PRECISION SPECTROPHOTOMETRIC STANDARDS

    Energy Technology Data Exchange (ETDEWEB)

    Narayan, G.; Matheson, T.; Saha, A.; Claver, J. [National Optical Astronomy Observatory, 950 North Cherry Avenue, Tucson, AZ 85719 (United States); Axelrod, T.; Olszewski, E. [University of Arizona, Steward Observatory, 933 North Cherry Avenue, Tucson, AZ 85721 (United States); Holberg, J. B. [University of Arizona, Lunar and Planetary Laboratory, 1629 East University Boulevard, Tucson, AZ 85721 (United States); Stubbs, C. W. [Department of Physics, Harvard University, 17 Oxford Street, Cambridge, MA 02138 (United States); Bohlin, R. C.; Deustua, S.; Rest, A., E-mail: gnarayan@noao.edu [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States)

    2016-05-10

    We present the initial results from a program aimed at establishing a network of hot DA white dwarfs to serve as spectrophotometric standards for present and future wide-field surveys. These stars span the equatorial zone and are faint enough to be conveniently observed throughout the year with large-aperture telescopes. The spectra of these white dwarfs are analyzed in order to generate a non-local-thermodynamic-equilibrium model atmosphere normalized to Hubble Space Telescope colors, including adjustments for wavelength-dependent interstellar extinction. Once established, this standard star network will serve ground-based observatories in both hemispheres as well as space-based instrumentation from the UV to the near IR. We demonstrate the effectiveness of this concept and show how two different approaches to the problem using somewhat different assumptions produce equivalent results. We discuss the lessons learned and the resulting corrective actions applied to our program.

  2. Faint (and bright) variable stars in the satellites of the Milky Way

    Directory of Open Access Journals (Sweden)

    Vivas A. Katherina

    2017-01-01

    Full Text Available I describe two ongoing projects related to variable stars in the satellites of the Milky Way. In the first project, we are searching for dwarf Cepheid stars (a.k.a. δ Scuti and/or SX Phe) in some of the classical dwarf spheroidal galaxies. Our goal is to characterize the population of these variable stars under different environments (age, metallicity) in order to study their use as standard candles in systems for which the metallicity is not necessarily known. In the second project we search for RR Lyrae stars in the new ultra-faint satellite galaxies that have been discovered around the Milky Way in recent years.

  3. Photometric Calibration of Consumer Video Cameras

    Science.gov (United States)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used). To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to
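
    A hedged sketch of the general idea of end-to-end photometric calibration with a nonlinear camera: record reference sources of known brightness through the complete imaging chain, then fit a monotonic curve mapping measured counts back to brightness. This is a generic illustration of the concept, not the authors' specific procedure; all numbers are made up.

```python
import numpy as np

# Known reference magnitudes and the integrated counts measured for them
# end-to-end (optics, detector, recording chain); response saturates.
ref_mag    = np.array([ 2.0,  1.0,   0.0,  -1.0,  -2.0,  -3.0])
ref_counts = np.array([3e3,  8e3, 2.1e4, 5.0e4, 9.5e4, 1.3e5])

# Build an invertible mapping: magnitude as a function of log(counts).
order = np.argsort(ref_counts)
log_counts = np.log10(ref_counts[order])
mags = ref_mag[order]

def counts_to_mag(counts):
    """Estimate magnitude from measured counts via the calibration curve."""
    return np.interp(np.log10(counts), log_counts, mags)

print(counts_to_mag(4.0e4))   # magnitude estimate for an object giving 40k counts
```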

  4. Improvements of the 3D images captured with Time-of-Flight cameras

    OpenAIRE

    Falie, D.

    2009-01-01

    Images from 3D Time-of-Flight cameras are affected by errors due to diffuse (indirect) light and to flare light. The presented method improves the 3D image by reducing the distance errors for dark surface objects. This is achieved by placing one or two contrast tags in the scene at different distances from the ToF camera. The white and black parts of the tags are situated at the same distance from the camera, but the distances measured by the camera are different. This difference is used to co...

  5. Holographic interferometry using a digital photo-camera

    International Nuclear Information System (INIS)

    Sekanina, H.; Hledik, S.

    2001-01-01

    The possibilities of running digital holographic interferometry using commonly available compact digital zoom photo-cameras are studied. The recently developed holographic setup, suitable especially for digital photo-cameras equipped with a non-detachable objective lens, is used. The method described enables a simple and straightforward way of both recording and reconstructing digital holographic interferograms. The feasibility of the new method is verified by digital reconstruction of the acquired interferograms, using a numerical code based on the fast Fourier transform. Experimental results obtained are presented and discussed. (authors)
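
    A minimal sketch of an FFT-based numerical reconstruction step of the kind mentioned above, using the common single-FFT Fresnel transform of a digital hologram. The hologram array, wavelength, pixel pitch, and reconstruction distance are illustrative assumptions; the authors' actual code may differ.

```python
import numpy as np

def fresnel_reconstruct(hologram, wavelength, distance, pixel_pitch):
    """Reconstruct the complex wavefield at `distance` from the hologram plane."""
    ny, nx = hologram.shape
    y = (np.arange(ny) - ny / 2) * pixel_pitch
    x = (np.arange(nx) - nx / 2) * pixel_pitch
    xx, yy = np.meshgrid(x, y)
    # Quadratic phase (chirp) factor of the Fresnel transform.
    chirp = np.exp(1j * np.pi * (xx**2 + yy**2) / (wavelength * distance))
    return np.fft.fftshift(np.fft.fft2(hologram * chirp))

hologram = np.random.rand(512, 512)                      # placeholder for recorded data
field = fresnel_reconstruct(hologram, 633e-9, 0.25, 4.4e-6)
intensity = np.abs(field) ** 2                           # reconstructed intensity image
```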

  6. Super-Resolution on Small Moving Objects

    NARCIS (Netherlands)

    Eekeren, A.W.M. van; Schutte, K.; Vliet, L.J. van

    2008-01-01

    Moving objects are often the most interesting parts in image sequences. When images from a camera are undersampled and the moving object is depicted small on the image plane, processing afterwards may help to improve the visibility as well as automatic recognition of the object. This paper presents

  7. STELLAR ARCHEOLOGY IN THE GALACTIC HALO WITH ULTRA-FAINT DWARFS. VII. HERCULES

    Energy Technology Data Exchange (ETDEWEB)

    Musella, Ilaria; Ripepi, Vincenzo; Marconi, Marcella, E-mail: ilaria@na.astro.it, E-mail: ripepi@na.astro.it, E-mail: marcella@na.astro.it [INAF, Osservatorio Astronomico di Capodimonte, I-8013 Napoli (Italy); and others

    2012-09-10

    We present the first time-series study of the ultra-faint dwarf galaxy Hercules. Using a variety of telescope/instrument facilities we secured about 50 V and 80 B epochs. These data allowed us to detect and characterize 10 pulsating variable stars in Hercules. Our final sample includes six fundamental-mode (ab-type) and three first-overtone (c-type) RR Lyrae stars, and one Anomalous Cepheid. The average period of the ab-type RR Lyrae stars, ⟨P_ab⟩ = 0.68 days (σ = 0.03 days), places Hercules in the Oosterhoff II group, as found for almost the totality of the ultra-faint dwarf galaxies investigated so far for variability. The RR Lyrae stars were used to obtain independent estimates of the metallicity, reddening, and distance to Hercules, for which we find [Fe/H] = -2.30 ± 0.15 dex, E(B - V) = 0.09 ± 0.02 mag, and (m - M)_0 = 20.6 ± 0.1 mag, in good agreement with the literature values. We have obtained a V, B - V color-magnitude diagram (CMD) of Hercules that reaches V ≈ 25 mag and extends beyond the galaxy's half-light radius over a total area of 40' × 36'. The CMD and the RR Lyrae stars indicate the presence of a population as old and metal-poor as (at least) the Galactic globular cluster M68.
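
    As a quick consistency check of the quoted distance modulus, (m - M)_0 = 20.6 mag corresponds to a distance of roughly 132 kpc:

```python
# Distance from the distance modulus: d = 10**((mu + 5) / 5) parsecs.
mu = 20.6
d_pc = 10 ** ((mu + 5.0) / 5.0)
print(f"{d_pc / 1e3:.0f} kpc")   # ~132 kpc
```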

  8. The faint radio source population at 15.7 GHz - II. Multi-wavelength properties

    Science.gov (United States)

    Whittam, I. H.; Riley, J. M.; Green, D. A.; Jarvis, M. J.; Vaccari, M.

    2015-11-01

    A complete, flux density limited sample of 96 faint (>0.5 mJy) radio sources is selected from the 10C survey at 15.7 GHz in the Lockman Hole. We have matched this sample to a range of multi-wavelength catalogues, including Spitzer Extragalactic Representative Volume Survey, Spitzer Wide-area Infrared Extragalactic survey, United Kingdom Infrared Telescope Infrared Deep Sky Survey and optical data; multi-wavelength counterparts are found for 80 of the 96 sources and spectroscopic redshifts are available for 24 sources. Photometric redshifts are estimated for the sources with multi-wavelength data available; the median redshift of the sample is 0.91 with an interquartile range of 0.84. Radio-to-optical ratios show that at least 94 per cent of the sample are radio loud, indicating that the 10C sample is dominated by radio galaxies. This is in contrast to samples selected at lower frequencies, where radio-quiet AGN and star-forming galaxies are present in significant numbers at these flux density levels. All six radio-quiet sources have rising radio spectra, suggesting that they are dominated by AGN emission. These results confirm the conclusions of Paper I that the faint, flat-spectrum sources which are found to dominate the 10C sample below ˜1 mJy are the cores of radio galaxies. The properties of the 10C sample are compared to the Square Kilometre Array Design Studies Simulated Skies; a population of low-redshift star-forming galaxies predicted by the simulation is not found in the observed sample.

  9. Architectural Design Document for Camera Models

    DEFF Research Database (Denmark)

    Thuesen, Gøsta

    1998-01-01

    Architecture of camera simulator models and data interface for the Maneuvering of Inspection/Servicing Vehicle (MIV) study.

  10. Simulation-based camera navigation training in laparoscopy-a randomized trial

    DEFF Research Database (Denmark)

    Nilsson, Cecilia; Sørensen, Jette Led; Konge, Lars

    2017-01-01

    ... patient safety. The objectives of this trial were to examine how to train laparoscopic camera navigation and to explore the transfer of skills to the operating room. MATERIALS AND METHODS: A randomized, single-center superiority trial with three groups: the first group practiced simulation-based camera navigation tasks (camera group), the second group practiced performing a simulation-based cholecystectomy (procedure group), and the third group received no training (control group). Participants were surgical novices without prior laparoscopic experience. The primary outcome was assessment of camera navigation ... had a higher score. CONCLUSIONS: Simulation-based training improves the technical skills required for camera navigation, regardless of practicing camera navigation or the procedure itself. Transfer to the clinical setting could, however, not be demonstrated. The control group demonstrated higher...

  11. REAL-TIME CAMERA GUIDANCE FOR 3D SCENE RECONSTRUCTION

    Directory of Open Access Journals (Sweden)

    F. Schindler

    2012-07-01

    Full Text Available We propose a framework for operator guidance during the image acquisition process for reliable multi-view stereo reconstruction. The goal is to achieve full coverage of the object and sufficient overlap. Multi-view stereo is a commonly used method to reconstruct both the camera trajectory and the 3D object shape. After determining an initial solution, a globally optimal reconstruction is usually obtained by executing a bundle adjustment involving all images. Acquiring suitable images, however, still requires an experienced operator to ensure accuracy and completeness of the final solution. We propose an interactive framework for guiding inexperienced users or possibly an autonomous robot. Using approximate camera orientations and object points we estimate point uncertainties within a sliding bundle adjustment and suggest appropriate camera movements. A visual feedback system communicates the decisions to the user in an intuitive way. We demonstrate the suitability of our system with a virtual image acquisition simulation as well as in real-world scenarios. We show that when following the camera movements suggested by our system, the proposed framework is able to generate good approximate values for the bundle adjustment, leading to accurate results compared to ground truth after few iterations. Possible applications are non-professional 3D acquisition systems on low-cost platforms like mobile phones, autonomously navigating robots as well as online flight planning of unmanned aerial vehicles.

  12. Graphic design of pinhole cameras

    Science.gov (United States)

    Edwards, H. B.; Chu, W. P.

    1979-01-01

    The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
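
    A hedged sketch of the transfer-function idea described above: the geometric blur of a circular pinhole of diameter d is a uniform disc, whose modulation transfer function is |2 J1(π d f)/(π d f)|. The diffraction contribution, which a full pinhole analysis also accounts for, is omitted here, and the 0.3 mm diameter is only an example.

```python
import numpy as np
from scipy.special import j1

def pinhole_geometric_mtf(freq_cyc_per_mm, diameter_mm):
    """Geometric MTF of a circular pinhole blur (uniform disc of diameter d)."""
    x = np.pi * diameter_mm * np.asarray(freq_cyc_per_mm, dtype=float)
    out = np.ones_like(x)
    nz = x != 0
    out[nz] = np.abs(2.0 * j1(x[nz]) / x[nz])
    return out

freqs = np.linspace(0.0, 10.0, 200)         # spatial frequency, cycles per mm
mtf = pinhole_geometric_mtf(freqs, 0.3)      # illustrative 0.3 mm pinhole
```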

  13. The Use of Camera Traps in Wildlife

    OpenAIRE

    Yasin Uçarlı; Bülent Sağlam

    2013-01-01

    Camera traps are increasingly used in the abundance and density estimation of wildlife species. Camera traps are a very good alternative to direct observation, particularly in steep terrain, in densely vegetated areas, or for nocturnal species. The main reason for using camera traps is that they eliminate the economic, personnel, and time losses of maintaining continuous observation at several points at the same time. Camera traps, motion and heat sensitive, can take a photo or video according to the mod...

  14. Stereo Pinhole Camera: Assembly and experimental activities

    OpenAIRE

    Santos, Gilmário Barbosa; Departamento de Ciência da Computação, Universidade do Estado de Santa Catarina, Joinville; Cunha, Sidney Pinto; Centro de Tecnologia da Informação Renato Archer, Campinas

    2015-01-01

    This work describes the assembling of a stereo pinhole camera for capturing stereo-pairs of images and proposes experimental activities with it. A pinhole camera can be as sophisticated as you want, or so simple that it could be handcrafted with practically recyclable materials. This paper describes the practical use of the pinhole camera throughout history and currently. Aspects of optics and geometry involved in the building of the stereo pinhole camera are presented with illustrations. Fur...

  15. Automated Meteor Detection by All-Sky Digital Camera Systems

    Science.gov (United States)

    Suk, Tomáš; Šimberová, Stanislava

    2017-12-01

    We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.

  16. An Open Standard for Camera Trap Data

    NARCIS (Netherlands)

    Forrester, Tavis; O'Brien, Tim; Fegraus, Eric; Jansen, P.A.; Palmer, Jonathan; Kays, Roland; Ahumada, Jorge; Stern, Beth; McShea, William

    2016-01-01

    Camera traps that capture photos of animals are a valuable tool for monitoring biodiversity. The use of camera traps is rapidly increasing and there is an urgent need for standardization to facilitate data management, reporting and data sharing. Here we offer the Camera Trap Metadata Standard as an

  17. A camera specification for tendering purposes

    International Nuclear Information System (INIS)

    Lunt, M.J.; Davies, M.D.; Kenyon, N.G.

    1985-01-01

    A standardized document is described which is suitable for sending to companies which are being invited to tender for the supply of a gamma camera. The document refers to various features of the camera, the performance specification of the camera, maintenance details, price quotations for various options and delivery, installation and warranty details. (U.K.)

  18. High-Speed Smart Camera with High Resolution

    Directory of Open Access Journals (Sweden)

    J. Dubois

    2007-02-01

    Full Text Available High-speed video cameras are powerful tools for investigating, for instance, biomechanics or the movements of mechanical parts in manufacturing processes. In the past years, the use of CMOS sensors instead of CCDs has enabled the development of high-speed video cameras offering digital outputs, readout flexibility, and lower manufacturing costs. In this paper, we propose a high-speed smart camera based on a CMOS sensor with embedded processing. Two types of algorithms have been implemented. A compression algorithm, specific to high-speed imaging constraints, has been implemented. This implementation makes it possible to reduce the large data flow (6.55 Gbps) so that it can be transferred over a serial output link (USB 2.0). The second type of algorithm is dedicated to feature extraction such as edge detection, markers extraction, or image analysis, wavelet analysis, and object tracking. These image processing algorithms have been implemented into an FPGA embedded inside the camera. These implementations are low-cost in terms of hardware resources. This FPGA technology allows us to process 500 images per second at 1280×1024 resolution in real time. This camera system is a reconfigurable platform; other image processing algorithms can be implemented.
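
    A back-of-envelope check of the data rates quoted above (assuming 10-bit pixels, which is our assumption rather than a figure stated in the abstract):

```python
# Raw sensor output vs. the USB 2.0 budget, motivating on-camera compression.
width, height, bits_per_pixel, fps = 1280, 1024, 10, 500
raw_gbps = width * height * bits_per_pixel * fps / 1e9
print(f"raw stream: {raw_gbps:.2f} Gbps")                      # ~6.55 Gbps
print(f"compression needed vs USB 2.0 (0.48 Gbps): {raw_gbps / 0.48:.0f}x")
```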

  19. High-Speed Smart Camera with High Resolution

    Directory of Open Access Journals (Sweden)

    Mosqueron R

    2007-01-01

    Full Text Available High-speed video cameras are powerful tools for investigating, for instance, biomechanics or the movements of mechanical parts in manufacturing processes. In the past years, the use of CMOS sensors instead of CCDs has enabled the development of high-speed video cameras offering digital outputs, readout flexibility, and lower manufacturing costs. In this paper, we propose a high-speed smart camera based on a CMOS sensor with embedded processing. Two types of algorithms have been implemented. A compression algorithm, specific to high-speed imaging constraints, has been implemented. This implementation makes it possible to reduce the large data flow (6.55 Gbps) so that it can be transferred over a serial output link (USB 2.0). The second type of algorithm is dedicated to feature extraction such as edge detection, markers extraction, or image analysis, wavelet analysis, and object tracking. These image processing algorithms have been implemented into an FPGA embedded inside the camera. These implementations are low-cost in terms of hardware resources. This FPGA technology allows us to process 500 images per second at 1280×1024 resolution in real time. This camera system is a reconfigurable platform; other image processing algorithms can be implemented.

  20. Rats Can Acquire Conditional Fear of Faint Light Leaking through the Acrylic Resin Used to Mount Fiber Optic Cannulas

    Science.gov (United States)

    Eckmier, Adam; de Marcillac, Willy Daney; Maître, Agnès; Jay, Thérèse M.; Sanders, Matthew J.; Godsil, Bill P.

    2016-01-01

    Rodents are exquisitely sensitive to light and optogenetic behavioral experiments routinely introduce light-delivery materials into experimental situations, which raises the possibility that light could leak and influence behavioral performance. We examined whether rats respond to a faint diffusion of light, termed caplight, which emanated through…

  1. Collimated trans-axial tomographic scintillation camera

    International Nuclear Information System (INIS)

    1980-01-01

    The principal problem in trans-axial tomographic radioisotope scanning is the length of time required to obtain meaningful data. Patient movement and radioisotope migration during the scanning period can cause distortion of the image. The object of this invention is to reduce the scanning time without degrading the images obtained. A system is described in which a scintillation camera detector is moved to an orbit about the cranial-caudal axis relative to the patient. A collimator is used in which lead septa are arranged so as to admit gamma rays travelling perpendicular to this axis with high spatial resolution and those travelling in the direction of the axis with low spatial resolution, thus increasing the rate of acceptance of radioactive events to contribute to the positional information obtainable without sacrificing spatial resolution. (author)

  2. Relative camera localisation in non-overlapping camera networks using multiple trajectories

    NARCIS (Netherlands)

    John, V.; Englebienne, G.; Kröse, B.J.A.

    2012-01-01

    In this article we present an automatic camera calibration algorithm using multiple trajectories in a multiple camera network with non-overlapping field-of-views (FOV). Visible trajectories within a camera FOV are assumed to be measured with respect to the camera local co-ordinate system.

  3. Feature-based automatic color calibration for networked camera system

    Science.gov (United States)

    Yamamoto, Shoji; Taki, Keisuke; Tsumura, Norimichi; Nakaguchi, Toshiya; Miyake, Yoichi

    2011-01-01

    In this paper, we have developed a feature-based automatic color calibration using area-based detection and an adaptive nonlinear regression method. Chartless color matching is achieved simply by using the image areas where the cameras' views overlap. Accurate detection of common objects is achieved by area-based detection that combines MSER with SIFT. Adaptive color calibration using the colors of the detected objects is computed by a nonlinear regression method. This method can indicate the contribution of an object's color to the calibration, and automatic selection notification for the user is performed by this function. Experimental results show that the accuracy of the calibration improves gradually. It is clear that this method can stand up to practical use for multi-camera color calibration if enough samples are obtained.
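
    A hedged sketch of the final regression stage described above: given the mean colours of common objects as seen by a reference camera and by the camera being calibrated (the MSER + SIFT detection stage is assumed to have already produced these pairs), fit a simple per-channel polynomial mapping. This is a stand-in illustration, not the paper's exact adaptive regression; all values are made up.

```python
import numpy as np

# Matched mean colours of common objects: rows are samples, columns are R, G, B.
ref_colors = np.array([[200, 120,  80], [ 60, 180,  90], [ 30,  40, 200],
                       [150, 150, 150], [240, 230, 210]], dtype=float)
cam_colors = np.array([[185, 130,  95], [ 55, 170, 105], [ 35,  55, 185],
                       [140, 155, 160], [225, 235, 200]], dtype=float)

# Quadratic polynomial per channel, mapping this camera's values to the reference.
coeffs = [np.polyfit(cam_colors[:, c], ref_colors[:, c], deg=2) for c in range(3)]

def calibrate(pixel_rgb):
    """Map one RGB value from the calibrated camera to the reference camera."""
    return np.array([np.polyval(coeffs[c], pixel_rgb[c]) for c in range(3)])

print(calibrate([100.0, 100.0, 100.0]))
```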

  4. The significance of faint visualization of the superior sagittal sinus in brain scintigraphy for the diagnosis of brain death

    International Nuclear Information System (INIS)

    Bisset, R.; Sfakianakis, G.; Ihmedian, I.; Holzman, B.; Curless, R.; Serafini, A.

    1985-01-01

    Brain death is associated with cessation of blood flow to the brain. Tc-99m brain flow studies are used as a laboratory confirmatory test for the establishment of the diagnosis of brain death. Criteria for the diagnosis of cessation of blood flow to the brain are 1) visualization of carotid artery activity in the neck of the patient and 2) no visualization of activity in the distribution of the anterior and middle cerebral arteries. The authors noticed that in a significant number of patients, although there was no visualization of arterial blood flow to the brain, the static images demonstrated faint accumulation of activity in the region of the superior sagittal sinus (SSS). In a four-year period, 212 brain flow studies were performed in 154 patients for diagnosis of brain death; of these, 137 studies (65%) showed no evidence of arterial flow. In 103 of the 137 studies (75%) there was no visualization of the SSS; in the remaining 34 studies (31 patients), however, three patterns of faint activity attributed to partial and/or faint visualization of the SSS could be recognized at the midline of the immediate anterior static view: a) linear from the cranial vault floor up, b) disk shaped at the apex of the vault, and c) disk shaped at the apex tailing caudad. All of the 31 patients in this group satisfied brain death criteria within four days of the last study which showed faint visualization of the superior sagittal sinus. The authors conclude that even in the presence of a faint visualization of the superior sagittal sinus on static post-brain-flow scintigraphy, the diagnosis of cessation of blood flow to the brain can be made if there is no evidence of arterial blood flow.

  5. Modelling Virtual Camera Behaviour Through Player Gaze

    DEFF Research Database (Denmark)

    Picardi, Andrea; Burelli, Paolo; Yannakakis, Georgios N.

    2012-01-01

    In a three-dimensional virtual environment, aspects such as narrative and interaction largely depend on the placement and animation of the virtual camera. Therefore, virtual camera control plays a critical role in player experience and, thereby, in the overall quality of a computer game. Both game ... on the relationship between virtual camera, game-play and player behaviour. We run a game user experiment to shed some light on this relationship and identify relevant differences between camera behaviours through different game sessions, playing behaviours and player gaze patterns. Results show that users can be efficiently profiled in dissimilar clusters according to camera control as part of their game-play behaviour.

  6. Stereo Pinhole Camera: Assembly and experimental activities

    Directory of Open Access Journals (Sweden)

    Gilmário Barbosa Santos

    2015-05-01

    Full Text Available This work describes the assembling of a stereo pinhole camera for capturing stereo-pairs of images and proposes experimental activities with it. A pinhole camera can be as sophisticated as you want, or so simple that it could be handcrafted with practically recyclable materials. This paper describes the practical use of the pinhole camera throughout history and currently. Aspects of optics and geometry involved in the building of the stereo pinhole camera are presented with illustrations. Furthermore, experiments are proposed by using the images obtained by the camera for 3D visualization through a pair of anaglyph glasses, and the estimation of relative depth by triangulation is discussed.

  7. Adapting virtual camera behaviour through player modelling

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2015-01-01

    Research in virtual camera control has focused primarily on finding methods to allow designers to place cameras effectively and efficiently in dynamic and unpredictable environments, and to generate complex and dynamic plans for cinematography in virtual environments. In this article, we propose a novel approach to virtual camera control, which builds upon camera control and player modelling to provide the user with an adaptive point-of-view. To achieve this goal, we propose a methodology to model the player's preferences on virtual camera movements, and we employ the resulting models to tailor...

  8. Parallel object-oriented data mining system

    Science.gov (United States)

    Kamath, Chandrika; Cantu-Paz, Erick

    2004-01-06

    A data mining system uncovers patterns, associations, anomalies and other statistically significant structures in data. Data files are read and displayed. Objects in the data files are identified. Relevant features for the objects are extracted. Patterns among the objects are recognized based upon the features. Data from the Faint Images of the Radio Sky at Twenty Centimeters (FIRST) sky survey was used to search for bent doubles. This test was conducted on data from the Very Large Array in New Mexico, in a search for a special type of quasar (radio-emitting stellar object) called a bent double. The FIRST survey has generated more than 32,000 images of the sky to date. Each image is 7.1 megabytes, yielding more than 100 gigabytes of image data in the entire data set.
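
    A schematic sketch of the pipeline stages listed above (read data, extract per-object features, recognise patterns), using a decision tree as a generic pattern-recognition stage. The feature names and values are purely illustrative and do not reproduce the actual FIRST bent-double features.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Toy feature table: one row per radio-source candidate
# (e.g. component separation, bend angle, flux ratio) -- illustrative only.
X = np.array([[12.0, 155.0, 0.90],
              [ 3.0, 178.0, 1.10],
              [15.0, 120.0, 0.70],
              [ 2.5, 175.0, 0.95]])
y = np.array([1, 0, 1, 0])          # 1 = bent-double candidate, 0 = not

clf = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(clf.predict([[10.0, 130.0, 0.8]]))
```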

  9. Feasibility of Using Video Cameras for Automated Enforcement on Red-Light Running and Managed Lanes.

    Science.gov (United States)

    2009-12-01

    The overall objective of this study is to evaluate the feasibility, effectiveness, legality, and public acceptance aspects of automated enforcement on red light running and high occupancy vehicle (HOV) occupancy requirement using video cameras in Nev...

  10. A Comparison of Techniques for Camera Selection and Hand-Off in a Video Network

    Science.gov (United States)

    Li, Yiming; Bhanu, Bir

    Video networks are becoming increasingly important for solving many real-world problems. Multiple video sensors require collaboration when performing various tasks. One of the most basic tasks is the tracking of objects, which requires mechanisms to select a camera for a certain object and hand-off this object from one camera to another so as to accomplish seamless tracking. In this chapter, we provide a comprehensive comparison of current and emerging camera selection and hand-off techniques. We consider geometry-, statistics-, and game theory-based approaches and provide both theoretical and experimental comparison using centralized and distributed computational models. We provide simulation and experimental results using real data for various scenarios of a large number of cameras and objects for in-depth understanding of strengths and weaknesses of these techniques.

  11. Initial laboratory evaluation of color video cameras

    Energy Technology Data Exchange (ETDEWEB)

    Terry, P L

    1991-01-01

    Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than identify an intruder. Monochrome cameras are adequate for that application and were selected over color cameras because of their greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Color information is useful for identification purposes, and color camera technology is rapidly changing. Thus, Sandia National Laboratories established an ongoing program to evaluate color solid-state cameras. Phase one resulted in the publication of a report titled "Initial Laboratory Evaluation of Color Video Cameras" (SAND--91-2579). It gave a brief discussion of imager chips and color cameras and monitors, described the camera selection, detailed traditional test parameters and procedures, and gave the results of the evaluation of twelve cameras. In phase two, six additional cameras were tested by the traditional methods and all eighteen cameras were tested by newly developed methods. This report details both the traditional and newly developed test parameters and procedures, and gives the results of both evaluations.

  12. Finding Objects for Assisting Blind People

    OpenAIRE

    Yi, Chucai; Flores, Roberto W.; Chincha, Ricardo; Tian, YingLi

    2013-01-01

    Computer vision technology has been widely used for blind assistance, such as navigation and wayfinding. However, few camera-based systems are developed for helping blind or visually-impaired people to find daily necessities. In this paper, we propose a prototype system of blind-assistant object finding by camera-based network and matching-based recognition. We collect a dataset of daily necessities and apply Speeded-Up Robust Features (SURF) and Scale Invariant Feature Transform (SIFT) featu...

  13. MULTIPLE OBJECTS

    Directory of Open Access Journals (Sweden)

    A. A. Bosov

    2015-04-01

    Full Text Available Purpose. The development of complicated techniques for production and management processes, information systems, computer science, and applied objects of systems theory requires the improvement of mathematical methods and new approaches for the research of application systems. The variety and diversity of subject systems make it necessary to develop a model that generalizes classical sets and their development, sets of sets. Multiple objects, unlike sets, are constructed from multiple structures and are represented by structure and content. The aim of the work is the analysis of the multiple structures that generate multiple objects, and the further development of operations on these objects in application systems. Methodology. To achieve the objectives of the research, the structure of multiple objects is represented as a constructive trio consisting of media, signatures and axiomatics. A multiple object is determined by structure and content, and is represented by a hybrid superposition composed of sets, multi-sets, ordered sets (lists) and heterogeneous sets (sequences, corteges). Findings. In this paper we study the properties and characteristics of the components of hybrid multiple objects of complex systems, propose assessments of their complexity, and show the rules of internal and external operations on objects of implementation. We introduce relations of arbitrary order over multiple objects, and define the description of functions and displays on objects of multiple structures. Originality. In this paper we consider the development of multiple structures that generate multiple objects. Practical value. The transition from abstract to subject multiple structures requires the transformation of the system and multiple objects. Transformation involves three successive stages: specification (binding to the domain), interpretation (multiple sites) and particularization (goals). The proposed systems approach is based on hybrid sets...

  14. Performance Objectives

    Science.gov (United States)

    1978-12-01

    mathematics program. In this study he measured mathematics skills, mathematics application, and student attitudes. He used the Stanford Achievement... "most vague" (5). Means and variances were computed for each item on the questionnaire. Correlations were then computed between and among the... between subjects' ratings of objectives with direct objects and objectives containing "x" and "y." This is reflected in tests computed separately for...

  15. Detecting method of subjects' 3D positions and experimental advanced camera control system

    Science.gov (United States)

    Kato, Daiichiro; Abe, Kazuo; Ishikawa, Akio; Yamada, Mitsuho; Suzuki, Takahito; Kuwashima, Shigesumi

    1997-04-01

    Steady progress is being made in the development of an intelligent robot camera capable of automatically shooting pictures with a powerful sense of reality or tracking objects whose shooting requires advanced techniques. Currently, only experienced broadcasting cameramen can provide these pictures. To develop an intelligent robot camera with these abilities, we need to clearly understand how a broadcasting cameraman assesses his shooting situation and how his camera is moved during shooting. We use a real-time analyzer to study a cameraman's work and his gaze movements at studios and during sports broadcasts. This time, we have developed a method for detecting subjects' 3D positions and an experimental camera control system to help us further understand the movements required for an intelligent robot camera. The features are as follows: (1) Two sensor cameras shoot a moving subject and detect colors, producing its 3D coordinates. (2) The system is capable of driving a camera based on camera movement data obtained by a real-time analyzer. 'Moving shoot' is the name we have given to the object position detection technology on which this system is based. We used it in a soccer game, producing computer graphics showing how the players moved. These results will also be reported.
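
    A hedged sketch of how a subject's 3D position can be recovered from two calibrated sensor cameras, as in the system described above; the projection matrices and image coordinates are illustrative assumptions, and OpenCV's triangulatePoints is used for the geometric step.

```python
import numpy as np
import cv2

# 3x4 projection matrices of the two sensor cameras (normalised intrinsics,
# camera 2 offset by a 0.5 m baseline along x).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

# Matching image coordinates of the tracked colour marker in both views
# (normalised image coordinates, one point per camera).
pt1 = np.array([[0.12], [0.03]])
pt2 = np.array([[0.02], [0.03]])

X_h = cv2.triangulatePoints(P1, P2, pt1, pt2)   # homogeneous 4x1 result
X = (X_h[:3] / X_h[3]).ravel()                   # 3D position in metres
print(X)                                         # roughly [0.6, 0.15, 5.0]
```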

  16. Human tracking over camera networks: a review

    Science.gov (United States)

    Hou, Li; Wan, Wanggen; Hwang, Jenq-Neng; Muhammad, Rizwan; Yang, Mingyang; Han, Kang

    2017-12-01

    In recent years, automated human tracking over camera networks is getting essential for video surveillance. The tasks of tracking human over camera networks are not only inherently challenging due to changing human appearance, but also have enormous potentials for a wide range of practical applications, ranging from security surveillance to retail and health care. This review paper surveys the most widely used techniques and recent advances for human tracking over camera networks. Two important functional modules for the human tracking over camera networks are addressed, including human tracking within a camera and human tracking across non-overlapping cameras. The core techniques of human tracking within a camera are discussed based on two aspects, i.e., generative trackers and discriminative trackers. The core techniques of human tracking across non-overlapping cameras are then discussed based on the aspects of human re-identification, camera-link model-based tracking and graph model-based tracking. Our survey aims to address existing problems, challenges, and future research directions based on the analyses of the current progress made toward human tracking techniques over camera networks.

  17. Image compensation for camera and lighting variability

    Science.gov (United States)

    Daley, Wayne D.; Britton, Douglas F.

    1996-12-01

    With the current trend of integrating machine vision systems in industrial manufacturing and inspection applications comes the issue of camera and illumination stabilization. Unless each application is built around a particular camera and a highly controlled lighting environment, the interchangeability of cameras or fluctuations in lighting becomes a problem, as each camera usually has a different response. An empirical approach is proposed in which color tile data are acquired using the camera of interest, and a mapping to some predetermined reference image is developed using neural networks. A similar analytical approach, based on a rough analysis of the imaging systems, is also considered for deriving a mapping between cameras. Once a mapping has been determined, all data from one camera are mapped to correspond to the images of the other prior to performing any processing on the data. Instead of writing separate image processing algorithms for the particular image data being received, the image data are adjusted based on each particular camera and lighting situation. All that is required when swapping cameras is the new mapping for the camera being inserted. The image processing algorithms can remain the same, as the input data have been adjusted appropriately. The results of utilizing this technique are presented for an inspection application.
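
    A hedged sketch of the mapping idea: learn a transform that takes colour-tile measurements from one camera to the corresponding values of a reference camera, then apply it to all incoming data. A linear least-squares 3×3 transform is used here as a simpler stand-in for the neural-network mapping described in the abstract; the tile values are made up.

```python
import numpy as np

# Colour-tile measurements from the camera of interest and from the reference.
tiles_cam = np.array([[180,  60,  50], [ 70, 160,  80], [ 60,  70, 170],
                      [200, 200, 190], [120, 120, 120]], dtype=float)
tiles_ref = np.array([[190,  55,  45], [ 65, 170,  75], [ 55,  60, 180],
                      [210, 205, 200], [128, 128, 128]], dtype=float)

# Solve tiles_cam @ M ~= tiles_ref in the least-squares sense.
M, *_ = np.linalg.lstsq(tiles_cam, tiles_ref, rcond=None)

def map_to_reference(rgb):
    """Adjust one RGB value from this camera to the reference camera's response."""
    return np.asarray(rgb, dtype=float) @ M

print(map_to_reference([100, 100, 100]))
```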

  18. Optimising camera traps for monitoring small mammals.

    Directory of Open Access Journals (Sweden)

    Alistair S Glen

    Full Text Available Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps.

  19. Optimising camera traps for monitoring small mammals.

    Science.gov (United States)

    Glen, Alistair S; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

    2013-01-01

    Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps.

  20. Ultra faint dwarf galaxies: an arena for testing dark matter versus modified gravity

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Weikang; Ishak, Mustapha, E-mail: wxl123830@utdallas.edu, E-mail: mishak@utdallas.edu [Department of Physics, University of Texas at Dallas, Richardson, TX 75083 (United States)

    2016-10-01

    The scenario consistent with a wealth of observations for the missing mass problem is that of weakly interacting dark matter particles. However, arguments or proposals for a Newtonian or relativistic modified gravity scenario continue to be made. A distinguishing characteristic between the two scenarios is that dark matter particles can produce a gravitational effect, in principle, without the need of baryons while this is not the case for the modified gravity scenario where such an effect must be correlated with the amount of baryonic matter. We consider here ultra-faint dwarf (UFD) galaxies as a promising arena to test the two scenarios based on the above assertion. We compare the correlation of the luminosity with the velocity dispersion between samples of UFD and non-UFD galaxies, finding a significant loss of correlation for UFD galaxies. For example, we find for 28 non-UFD galaxies a strong correlation coefficient of −0.688 which drops to −0.077 for the 23 UFD galaxies. Incoming and future data will determine whether the observed stochasticity for UFD galaxies is physical or due to systematics in the data. Such a loss of correlation (if it is to persist) is possible and consistent with the dark matter scenario for UFD galaxies but would constitute a new challenge for the modified gravity scenario.
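
    A minimal sketch of the statistic being compared above: the Pearson correlation between luminosity and velocity dispersion for a sample of galaxies. The arrays are placeholders, not the paper's data.

```python
import numpy as np
from scipy.stats import pearsonr

# Placeholder sample: log luminosity and velocity dispersion (km/s) per galaxy.
log_luminosity      = np.array([3.1, 3.8, 4.5, 5.2, 5.9, 6.4])
velocity_dispersion = np.array([9.5, 8.1, 7.2, 6.0, 5.1, 4.4])

r, p_value = pearsonr(log_luminosity, velocity_dispersion)
print(f"correlation coefficient: {r:.3f}  (p = {p_value:.3f})")
```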

  1. Archean Earth Atmosphere Fractal Haze Aggregates: Light Scattering Calculations and the Faint Young Sun Paradox

    Science.gov (United States)

    Boness, D. A.; Terrell-Martinez, B.

    2010-12-01

    As part of an ongoing undergraduate research project on light scattering calculations involving fractal carbonaceous soot aggregates relevant to current anthropogenic and natural sources in Earth's atmosphere, we have read with interest a recent paper [E. T. Wolf and O. B. Toon, Science 328, 1266 (2010)] claiming that the Faint Young Sun paradox discussed four decades ago by Carl Sagan and others can be resolved without invoking heavy CO2 concentrations as a greenhouse gas warming the early Earth enough to sustain liquid water and hence allow the origin of life. Wolf and Toon report that a Titan-like Archean Earth haze, with a fractal haze aggregate nature due to nitrogen-methane photochemistry at high altitudes, should block enough UV light to protect the warming greenhouse gas NH3 while allowing enough visible light to reach the surface of the Earth. To test this hypothesis, we have employed a rigorous T-matrix arbitrary-particle light scattering technique, to avoid the simplifications inherent in Mie-sphere scattering, on haze fractal aggregates at UV and visible wavelengths of incident light. We generate these model aggregates using diffusion-limited cluster aggregation (DLCA) algorithms, which much more closely fit actual haze fractal aggregates than do diffusion-limited aggregation (DLA) algorithms.

  2. VizieR Online Data Catalog: Infrared-faint radio sources catalog (Collier+, 2014)

    Science.gov (United States)

    Collier, J. D.; Banfield, J. K.; Norris, R. P.; Schnitzeler, D. H. F. M.; Kimball, A. E.; Filipovic, M. D.; Jarrett, T. H.; Lonsdale, C. J.; Tothill, N. F. H.

    2014-11-01

    The 20cm radio data come from the Unified Radio Catalog (URC) compiled by Kimball & Ivezic (2008AJ....136..684K). This radio catalogue combines data from the National Radio Astronomy Observatory (NRAO) VLA Sky Survey (NVSS; Condon et al., 1998, Cat. VIII/65), Faint Images of the Radio Sky at Twenty Centimeters (FIRST; Becker, White & Helfand, 1995, cat. VIII/92), Green Bank 6cm survey (GB6; Gregory et al., 1996, Cat. VIII/40), the Westerbork Northern Sky Survey (WENSS; Rengelink et al. 1997; de Bruyn et al. 2000, Cat. VIII/62) and the Sloan Digital Sky Survey Data Release 6 (SDSS DR6; Adelman-McCarthy et al., 2008, Cat. II/282). We use updated NVSS and FIRST data from the URC version 2.0 (Kimball & Ivezic, in preparation), which includes a number of new sources as well as updated positions and flux densities. The IR data come from WISE (Wright et al. (WISE Team) 2009, Cat. II/311), which is an all-sky survey centred at 3.4, 4.6, 12 and 22um (referred to as bands W1, W2, W3 and W4), with respective angular resolutions of 6.1, 6.4, 6.5 and 12.0-arcsec (full width at half-maximum, FWHM), and typical 5σ sensitivity levels of 0.08, 0.11, 1 and 6mJy, with sensitivity increasing towards the ecliptic poles. (1 data file).

  3. A faint type of supernova from a white dwarf with a helium-rich companion.

    Science.gov (United States)

    Perets, H B; Gal-Yam, A; Mazzali, P A; Arnett, D; Kagan, D; Filippenko, A V; Li, W; Arcavi, I; Cenko, S B; Fox, D B; Leonard, D C; Moon, D-S; Sand, D J; Soderberg, A M; Anderson, J P; James, P A; Foley, R J; Ganeshalingam, M; Ofek, E O; Bildsten, L; Nelemans, G; Shen, K J; Weinberg, N N; Metzger, B D; Piro, A L; Quataert, E; Kiewe, M; Poznanski, D

    2010-05-20

    Supernovae are thought to arise from two different physical processes. The cores of massive, short-lived stars undergo gravitational core collapse and typically eject a few solar masses during their explosion. These are thought to appear as type Ib/c and type II supernovae, and are associated with young stellar populations. In contrast, the thermonuclear detonation of a carbon-oxygen white dwarf, whose mass approaches the Chandrasekhar limit, is thought to produce type Ia supernovae. Such supernovae are observed in both young and old stellar environments. Here we report a faint type Ib supernova, SN 2005E, in the halo of the nearby isolated galaxy, NGC 1032. The 'old' environment near the supernova location, and the very low derived ejected mass (approximately 0.3 solar masses), argue strongly against a core-collapse origin. Spectroscopic observations and analysis reveal high ejecta velocities, dominated by helium-burning products, probably excluding this as a subluminous or a regular type Ia supernova. We conclude that it arises from a low-mass, old progenitor, likely to have been a helium-accreting white dwarf in a binary. The ejecta contain more calcium than observed in other types of supernovae and probably large amounts of radioactive (44)Ti.

  4. Objective lens

    Science.gov (United States)

    Olczak, Eugene G. (Inventor)

    2011-01-01

    An objective lens and a method for using same. The objective lens has a first end, a second end, and a plurality of optical elements. The optical elements are positioned between the first end and the second end and are at least substantially symmetric about a plane centered between the first end and the second end.

  5. Agile Objects

    Science.gov (United States)

    German, Senta; Harris, Jim

    2017-01-01

    In this article, the authors argue that the art-historical canon, however it is construed, has little relevance to the selection of objects for museum-based teaching. Their contention is that all objects are fundamentally agile and capable of interrogation from any number of disciplinary standpoints, and that the canon of museum education,…

  6. Study of a sample of faint Be stars in the exofield of CoRoT. I. Spectroscopic characterization

    Science.gov (United States)

    Semaan, T.; Hubert, A. M.; Zorec, J.; Martayan, C.; Frémat, Y.; Gutiérrez-Soto, J.; Fabregat, J.

    2013-03-01

    Context. Be stars are probably the most rapid rotators among stars in the main sequence (MS) and, as such, are excellent candidates to study the incidence of the rotation on the characteristics of their non-radial pulsations, as well as on their internal structure. Pulsations are also thought to be possible mechanisms that help the mass ejection needed to build up the circumstellar disks of Be stars. Aims: The purpose of this paper is to identify a number of faint Be stars observed with the CoRoT satellite and to determine their fundamental parameters, which will enable us to study their pulsation properties as a function of the location in the HR diagram and to search for correlations with the light outbursts, which are possibly produced by discrete mass ejections. Methods: We identified those objects in the exofields of CoRoT presenting the Be phenomenon using Hα surveys, as well as automated methods based on pulsation properties that we finally confirmed with FLAMES/GIRAFFE and X-shooter spectroscopic observations at VLT/ESO, and with near-IR photometry. The spectra were 1) corrected for the veiling effect, 2) treated with the GIRFIT code to determine apparent fundamental parameters, and 3) corrected with the FASTROT code for effects induced by the rapid rotation. Results: A list of 41 Be star candidates were found from photometric and spectroscopic criteria. The spectral coverage useful for determining the fundamental parameters was obtained for only about half of them. We then spectroscopically identified 21 Be stars, two probable Be stars, and two B stars contaminated by the Sh 2-284 nebulosity. A short description of the spectral characteristics of each star is given. The fundamental parameters and, in particular, the rotation frequency νr (cycles per day) were all corrected for rotational effects at rotation rates ranging from Ω/Ωc = 0.8 to 1.0. We have determined the positions of Be stars in the HR diagram and find two of them located beyond the MS

  7. Optical Recognition And Tracking Of Objects

    Science.gov (United States)

    Chao, Tien-Hsin; Liu, Hua-Kuang

    1988-01-01

    Separate objects moving independently are tracked simultaneously. The system uses coherent optical techniques to obtain the correlation between each object and a reference image. Moving objects are monitored by a charge-coupled-device television camera, whose output is fed to a liquid-crystal television (LCTV) display. Acting as a spatial light modulator, the LCTV impresses the images of the moving objects on a collimated laser beam. The beam is spatially low-pass filtered to remove the high-spatial-frequency television grid pattern.
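
    A minimal sketch of the underlying operation: correlation of a scene with a reference image computed via the convolution theorem, which is what the coherent optical correlator performs in hardware. The arrays are illustrative; real optical correlators typically use matched or phase-only filters.

```python
import numpy as np

scene = np.random.rand(256, 256)          # placeholder for the camera image
reference = np.zeros((256, 256))
reference[100:120, 80:110] = 1.0          # template of the object to find

# Cross-correlation via the convolution theorem (FFT, multiply by conjugate, inverse FFT).
corr = np.fft.ifft2(np.fft.fft2(scene) * np.conj(np.fft.fft2(reference)))
peak = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
print("correlation peak at", peak)
```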

  8. A Quality Evaluation of Single and Multiple Camera Calibration Approaches for an Indoor Multi Camera Tracking System

    Directory of Open Access Journals (Sweden)

    M. Adduci

    2014-06-01

    Full Text Available Human detection and tracking has been a prominent research area for scientists around the globe. State-of-the-art algorithms have been implemented, refined and accelerated to significantly improve the detection rate and eliminate false positives. While 2D approaches are well investigated, 3D human detection and tracking is still a largely unexplored research field. In both the 2D and 3D cases, introducing a multi-camera system can vastly improve the accuracy and confidence of the tracking process. Within this work, a quality evaluation is performed on a multi RGB-D camera indoor tracking system, examining how camera calibration and pose can affect the quality of human tracks in the scene, independently of the detection and tracking approach used. After performing a calibration step on every Kinect sensor, state-of-the-art single-camera pose estimators were evaluated to check how well the poses are estimated using planar objects such as an ordinary chessboard. With this information, a bundle block adjustment and ICP were performed to verify the accuracy of the single pose estimators in a multi-camera configuration system. Results have shown that single-camera estimators provide high-accuracy results of less than half a pixel, forcing the bundle to converge after very few iterations. In relation to ICP, relative information between cloud pairs is more or less preserved, giving a low fitting score between concatenated pairs. Finally, sensor calibration proved to be an essential step for achieving maximum accuracy in the generated point clouds, and therefore in the accuracy of the produced 3D trajectories from each sensor.

  9. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System

    Directory of Open Access Journals (Sweden)

    Yu Lu

    2016-04-01

    Full Text Available A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over a 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensor during the radiometric response calibration, to eliminate the influence of the focusing effect on the uniform light from an integrating sphere. The linearity range of the radiometric response, the non-linearity response characteristics, the sensitivity, and the dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have also been tested. The actual luminance of the object is retrieved from the sensor calibration results and is used to blend images so that the panoramas reflect the object luminance more faithfully. This compensates for the limitation of stitching approaches that rely on smoothing alone to make the images look realistic. The dynamic range limitation can be resolved by using multiple cameras that cover a large field of view instead of a single image sensor with a wide-angle lens; the dynamic range is expanded by 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second.
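
    A generic sketch of how the per-sensor radiometric calibration described above is typically applied, using dark-frame subtraction and flat-field division to compensate offset, vignetting and pixel gain differences; the file names and the linear-response assumption are illustrative, not details of this camera.

        import cv2
        import numpy as np

        # Placeholder calibration frames for one of the seven sensor modules.
        raw  = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE).astype(np.float64)
        dark = cv2.imread("dark.png", cv2.IMREAD_GRAYSCALE).astype(np.float64)   # offset + dark current
        flat = cv2.imread("flat.png", cv2.IMREAD_GRAYSCALE).astype(np.float64)   # uniform illumination

        # Normalised flat field captures vignetting and pixel-to-pixel sensitivity.
        flat_corr = flat - dark
        flat_corr /= flat_corr.mean()

        # Corrected image is proportional to scene luminance (assuming a linear response).
        corrected = (raw - dark) / np.clip(flat_corr, 1e-6, None)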

  10. Optimal Camera Network Design for 3d Modeling of Cultural Heritage

    Science.gov (United States)

    Alsadik, B. S.; Gerke, M.; Vosselman, G.

    2012-07-01

    Digital cultural heritage documentation in 3D is the subject of research and practical applications nowadays. Image-based modeling is a technique to create 3D models, which starts with the basic task of designing the camera network. This task is, however, quite crucial in practical applications because it needs thorough planning and a certain level of expertise and experience. Bearing in mind today's computational (mobile) power, we think that the optimal camera network should be designed in the field, thereby making preprocessing and planning dispensable. The optimal camera network is designed when certain accuracy demands are fulfilled with reasonable effort, namely keeping the number of camera shots to a minimum. In this study, we report on the development of an automatic method to design the optimum camera network for a given object of interest, currently focusing on buildings and statues. Starting from a rough point cloud derived from a video stream of object images, the initial configuration of the camera network is designed, assuming a high-resolution state-of-the-art non-metric camera. To improve the image coverage and accuracy, we use a mathematical penalty method of optimization with constraints. From the experimental tests, we found that, after optimization, maximum coverage is attained alongside a significant improvement in positional accuracy. Currently, we are working on a guiding system to ensure that the operator actually takes the desired images. Further steps will include reliable and detailed modeling of the object applying sophisticated dense matching techniques.

  11. STRAY DOG DETECTION IN WIRED CAMERA NETWORK

    Directory of Open Access Journals (Sweden)

    C. Prashanth

    2013-08-01

    Full Text Available Existing surveillance systems impose a high level of security on humans but pay little attention to animals. Stray dogs could be used as an alternative to humans to carry explosive material. It is therefore imperative to ensure the detection of stray dogs for necessary corrective action. In this paper, a novel composite approach to detect the presence of stray dogs is proposed. The captured frame from the surveillance camera is initially pre-processed using a Gaussian filter to remove noise. The foreground object of interest is extracted using the ViBe algorithm. The Histogram of Oriented Gradients (HOG) algorithm is used as the shape descriptor, which derives the shape and size information of the extracted foreground object. Finally, stray dogs are distinguished from humans using a polynomial Support Vector Machine (SVM) of order 3. The proposed composite approach is simulated in MATLAB and OpenCV. Further, it is validated with real-time video feeds taken from an existing surveillance system. From the results obtained, it is found that a classification accuracy of about 96% is achieved. This encourages the utilization of the proposed composite algorithm in real-time surveillance systems.
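
    A sketch of the classification stage described above, HOG features fed to a polynomial SVM of order 3, written here in Python with OpenCV and scikit-learn rather than the MATLAB implementation used by the authors; the window size, HOG parameters and synthetic training crops are illustrative placeholders.

        import cv2
        import numpy as np
        from sklearn.svm import SVC

        # HOG descriptor over a fixed detection window (parameters are assumptions).
        hog = cv2.HOGDescriptor((64, 128), (16, 16), (8, 8), (8, 8), 9)

        def describe(crop):
            return hog.compute(cv2.resize(crop, (64, 128))).ravel()

        # Stand-in foreground crops; in practice these would come from the ViBe foreground mask.
        rng = np.random.default_rng(0)
        dog_crops   = [rng.integers(0, 255, (128, 64), dtype=np.uint8) for _ in range(10)]
        human_crops = [rng.integers(0, 255, (128, 64), dtype=np.uint8) for _ in range(10)]

        X = np.array([describe(c) for c in dog_crops + human_crops], dtype=np.float32)
        y = np.array([1] * 10 + [0] * 10)             # 1 = dog, 0 = human

        clf = SVC(kernel="poly", degree=3).fit(X, y)  # polynomial SVM of order 3
        print(clf.predict(X[:2]))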

  12. Photogrammetric Applications of Immersive Video Cameras

    OpenAIRE

    Kwiatek, K.; Tokarczyk, R.

    2014-01-01

    The paper investigates immersive videography and its application in close-range photogrammetry. Immersive video involves the capture of a live-action scene that presents a 360° field of view. It is recorded simultaneously by multiple cameras or microlenses, where the principal point of each camera is offset from the rotating axis of the device. This issue causes problems when stitching together individual frames of video separated from particular cameras, however there are ways to ov...

  13. Directional Unfolded Source Term (DUST) for Compton Cameras.

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean; Thoreson, Gregory G.

    2018-03-01

    A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.

  14. About possibility of temperature trace observing on the human skin using commercially available IR camera

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Shestakov, Ivan L.; Blednov, Roman G.

    2016-09-01

    One of the urgent security problems is the detection of objects placed inside the human body. Obviously, for safety reasons, X-rays cannot be used widely and often for such object detection. Three years ago, we demonstrated the principal possibility of seeing a temperature trace, induced by eating food or drinking water, on the skin of the human body using a passive THz camera. However, this camera is very expensive. In practice, it would therefore be very convenient if an IR camera could be used for this purpose. In contrast to the passive THz camera, an IR camera does not allow one to see an object under clothing if the image it produces is used directly. This is, of course, a big disadvantage for a security solution based on an IR camera. To overcome this disadvantage, we develop a novel approach to the computer processing of IR camera images. It allows us to increase the temperature resolution of the IR camera as well as the effective sensitivity of the human eye. As a consequence, it becomes possible to see changes of the human body temperature through clothing. We analyze IR images of a person who drinks water and eats chocolate, and we follow the temperature trace on the skin caused by temperature changes inside the body. Some experiments were also made with measurements of body temperature through a T-shirt. The results shown are very important for the detection of forbidden objects concealed inside the human body by non-destructive inspection without the use of X-rays.

  15. Approximations to camera sensor noise

    Science.gov (United States)

    Jin, Xiaodan; Hirakawa, Keigo

    2013-02-01

    Noise is present in all image sensor data. The Poisson distribution is said to model the stochastic nature of the photon arrival process, while it is common to approximate readout/thermal noise by additive white Gaussian noise (AWGN). Other sources of signal-dependent noise such as Fano and quantization noise also contribute to the overall noise profile. The question remains, however, of how best to model the combined sensor noise. Although additive Gaussian noise with signal-dependent noise variance (SD-AWGN) and Poisson corruption are two widely used models to approximate the actual sensor noise distribution, the justification given for these types of models is based on limited evidence. The goal of this paper is to provide a more comprehensive characterization of random noise. We conclude by presenting concrete evidence that the Poisson model is a better approximation to real camera noise than SD-AWGN. We suggest further modifications to the Poisson model that may improve the noise model.
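
    To make the two competing approximations concrete, the sketch below simulates a pixel response under a pure Poisson model and under additive Gaussian noise with signal-dependent variance (SD-AWGN); the gain and read-noise values are illustrative assumptions, not measured sensor parameters.

        import numpy as np

        rng = np.random.default_rng(0)
        photons = np.linspace(10, 5000, 500)       # mean photo-electron counts (assumed range)
        gain, read_sigma = 0.5, 3.0                # DN per electron and read noise (illustrative)

        # Poisson model: shot-noise variance equals the mean electron count.
        poisson_dn = gain * rng.poisson(photons)

        # SD-AWGN model: Gaussian noise with signal-dependent variance gain^2 * mu plus read noise.
        sd_awgn_dn = gain * photons + rng.normal(0.0, np.sqrt(gain**2 * photons + read_sigma**2))

        # In both cases the noise standard deviation grows like sqrt(signal), but the
        # Poisson model keeps the discrete, skewed character at low counts.
        print(poisson_dn[:5], sd_awgn_dn[:5])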

  16. Streak camera recording of interferometer fringes

    International Nuclear Information System (INIS)

    Parker, N.L.; Chau, H.H.

    1977-01-01

    The use of an electronic high-speed camera in the streaking mode to record interference fringe motion from a velocity interferometer is discussed. Advantages of this method over the photomultiplier tube-oscilloscope approach are delineated. Performance testing and data for the electronic streak camera are discussed. The velocity profile of a mylar flyer accelerated by an electrically exploded bridge, and the jump-off velocity of metal targets struck by these mylar flyers are measured in the camera tests. Advantages of the streak camera include portability, low cost, ease of operation and maintenance, simplified interferometer optics, and rapid data analysis

  17. Decision about buying a gamma camera

    International Nuclear Information System (INIS)

    Ganatra, R.D.

    1992-01-01

    A large part of the referral to a nuclear medicine department is usually for imaging studies. Sooner or later, the nuclear medicine specialist will be called upon to make a decision about when and what type of gamma camera to buy. There is no longer an option of choosing between a rectilinear scanner and a gamma camera as the former is virtually out of the market. The decision that one has to make is when to invest in a gamma camera, and then on what basis to select the gamma camera

  18. The suitability of lightfield camera depth maps for coordinate measurement applications

    Science.gov (United States)

    Rangappa, Shreedhar; Tailor, Mitul; Petzing, Jon; Kinnell, Peter; Jackson, Michael

    2015-12-01

    Plenoptic cameras can capture 3D information in one exposure without the need for structured illumination, allowing grey scale depth maps of the captured image to be created. The Lytro, a consumer grade plenoptic camera, provides a cost effective method of measuring the depth of multiple objects under controlled lighting conditions. In this research, camera control variables, environmental sensitivity, image distortion characteristics, and the effective working range of two Lytro first generation cameras were evaluated. In addition, a calibration process has been created for the Lytro cameras to deliver three dimensional output depth maps represented in SI units (metre). The novel results show depth accuracy and repeatability of +10.0 mm to -20.0 mm, and 0.5 mm respectively. For the lateral X and Y coordinates, the accuracy was +1.56 μm to -2.59 μm and the repeatability was 0.25 μm.

  19. Engineering task plan for flammable gas atmosphere mobile color video camera systems

    International Nuclear Information System (INIS)

    Kohlman, E.H.

    1995-01-01

    This Engineering Task Plan (ETP) describes the design, fabrication, assembly, and testing of the mobile video camera systems. The color video camera systems will be used to observe and record the activities within the vapor space of a tank on a limited exposure basis. The units will be fully mobile and designed for operation in the single-shell flammable gas producing tanks. The objective of this task is to provide two mobile camera systems for use in flammable gas producing single-shell tanks (SSTs) for the Flammable Gas Tank Safety Program. The camera systems will provide observation, video recording, and monitoring of the activities that occur in the vapor space of applied tanks. The camera systems will be designed to be totally mobile, capable of deployment up to 6.1 meters into a 4 inch (minimum) riser

  20. Trusted Objects

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Philip L.; Pierson, Lyndon G.; Witzke, Edward L.

    1999-10-27

    In the world of computers a trusted object is a collection of possibly-sensitive data and programs that can be allowed to reside and execute on a computer, even on an adversary's machine. Beyond the scope of one computer we believe that network-based agents in high-consequence and highly reliable applications will depend on this approach, and that the basis for such objects is what we call ''faithful execution.''

  1. Hardware Middleware for Person Tracking on Embedded Distributed Smart Cameras

    Directory of Open Access Journals (Sweden)

    Ali Akbar Zarezadeh

    2012-01-01

    Full Text Available Tracking individuals is a prominent application in domains such as surveillance or smart environments. This paper describes the development of a multiple-camera setup with a joint view that observes moving persons in a site. It focuses on a geometry-based approach to establish correspondence among the different views. The computationally expensive parts of the tracker are hardware accelerated via a novel system-on-chip (SoC) design. In conjunction with this vision application, a hardware object request broker (ORB) middleware is presented as the underlying communication system. The hardware ORB provides a hardware/software architecture to achieve real-time intercommunication among multiple smart cameras. Via a probing mechanism, a performance analysis is performed to measure network latencies, that is, the time spent traversing the TCP/IP stack, in both the software and hardware ORB approaches on the same smart camera platform. The empirical results show that using the proposed hardware ORB as client and server in separate smart camera nodes reduces the network latency by up to 100 times compared to the software ORB.

  2. The very soft X-ray emission of X-ray-faint early-type galaxies

    Science.gov (United States)

    Pellegrini, S.; Fabbiano, G.

    1994-01-01

    A recent reanalysis of Einstein data, and new ROSAT observations, have revealed the presence of at least two components in the X-ray spectra of X-ray faint early-type galaxies: a relatively hard component (kT greater than 1.5 keV), and a very soft component (kT approximately 0.2-0.3 keV). In this paper we address the problem of the nature of the very soft component and whether it can be due to a hot interstellar medium (ISM), or most likely originates from the collective emission of very soft stellar sources. For this purpose, hydrodynamical evolutionary sequences for the secular behavior of gas flows in ellipticals have been performed, varying the explosion rate of Type Ia supernovae, and the dark matter amount and distribution. The results are compared with the observational X-ray data: the average Einstein spectrum for six X-ray faint early-type galaxies (among which are NGC 4365 and NGC 4697), and the spectrum obtained by the ROSAT pointed observation of NGC 4365. The very soft component could be entirely explained with a hot ISM only in galaxies such as NGC 4697, i.e., when the depth of the potential well (on which the average ISM temperature strongly depends) is quite shallow; in NGC 4365 a diffuse hot ISM would have a temperature larger than that of the very soft component, because of the deeper potential well. So, in NGC 4365 the softest contribution to the X-ray emission certainly comes from stellar sources. As stellar soft X-ray emitters, we consider late-type stellar coronae, supersoft sources such as those discovered by ROSAT in the Magellanic Clouds and M31, and RS CVn systems. All these candidates can be substantial contributors to the very soft emission, though none of them, taken separately, plausibly accounts entirely for its properties. We finally present a model for the X-ray emission of NGC 4365, to reproduce in detail the results of the ROSAT pointed observation, including the Position Sensitive Proportional Counter (PSPC) spectrum and radial

  3. Spectroscopic confirmation of an ultra-faint galaxy at the epoch of reionization

    Science.gov (United States)

    Hoag, Austin; Bradač, Maruša; Trenti, Michele; Treu, Tommaso; Schmidt, Kasper B.; Huang, Kuang-Han; Lemaux, Brian C.; He, Julie; Bernard, Stephanie R.; Abramson, Louis E.; Mason, Charlotte A.; Morishita, Takahiro; Pentericci, Laura; Schrabback, Tim

    2017-04-01

    Within one billion years of the Big Bang, intergalactic hydrogen was ionized by sources emitting ultraviolet and higher energy photons. This was the final phenomenon to globally affect all the baryons (visible matter) in the Universe. It is referred to as cosmic reionization and is an integral component of cosmology. It is broadly expected that intrinsically faint galaxies were the primary ionizing sources due to their abundance in this epoch [1,2]. However, at the highest redshifts (z > 7.5, lookback time 13.1 Gyr), all galaxies with spectroscopic confirmations to date are intrinsically bright and, therefore, not necessarily representative of the general population [3]. Here, we report the unequivocal spectroscopic detection of a low luminosity galaxy at z > 7.5. We detected the Lyman-α emission line at ~10,504 Å in two separate observations with MOSFIRE [4] on the Keck I Telescope and independently with the Hubble Space Telescope's slitless grism spectrograph, implying a source redshift of z = 7.640 ± 0.001. The galaxy is gravitationally magnified by the massive galaxy cluster MACS J1423.8+2404 (z = 0.545), with an estimated intrinsic luminosity of M_AB = -19.6 ± 0.2 mag and a stellar mass of M* = 3.0 (+1.5/-0.8) × 10^8 solar masses. Both are an order of magnitude lower than the four other Lyman-α emitters currently known at z > 7.5, making it probably the most distant representative source of reionization found to date.
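
    As a quick consistency check on the quoted redshift, the observed Lyman-α wavelength can be compared with the rest-frame value of 1215.67 Å; the short calculation below uses only numbers taken from the abstract.

        LYA_REST = 1215.67            # Lyman-alpha rest wavelength in Angstrom
        lam_obs = 10504.0             # observed wavelength in Angstrom (from the abstract)

        z = lam_obs / LYA_REST - 1.0
        print(f"z = {z:.3f}")         # ~7.640, matching the reported z = 7.640 +/- 0.001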

  4. Pushing the limits: detecting H2 emission from faint bipolar planetary nebulae in the IPHAS sample

    Science.gov (United States)

    Ramos-Larios, G.; Guerrero, M. A.; Sabin, L.; Santamaría, E.

    2017-09-01

    We have obtained deep narrowband images in the near-infrared H2 λ2.122 μm emission line for a sample of 15 faint Isaac Newton Telescope Photometric H α Survey (IPHAS) bipolar planetary nebulae (PNe) to search for molecular material. H2 emission is found in most of them (14 out of 15), mostly associated with rings at their equatorial regions and with their bipolar lobes. These detections add to the high occurrence of H2 emission among bipolar PNe reported in previous works, resulting from the large reservoir of molecular material in these sources and the suitable excitation conditions for H2 emission. The correlation between detailed bipolar morphology and H2 luminosity is also confirmed: bipolar PNe with broad equatorial rings (R-BPNe) have almost no continuum emission, are H2 brighter and have larger H2/Br γ line ratio than bipolar PNe with pinched equatorial waists (W-BPNe). The origin of this dichotomy is unclear. The larger size and age of R-BPNe are consistent with shock excitation of H2, whereas ultraviolet pumping is most likely the excitation mechanism in the smaller and younger W-BPNe, which would explain their lower H2 luminosity. Although both types of bipolar PNe seem to proceed from the same progenitor population, this does not imply that R-BPNe descend from W-BPNe. Otherwise, we note that some of the H2-weak bipolar PNe harbor post-common envelope binary systems and symbiotic stars. Finally, we suggest that the long-living H2 emission from R-BPNe arises from a discrete distribution of compact knots embedded within the ionized gas at the equatorial region.

  5. VizieR Online Data Catalog: Faint cataclysmic variables from SDSS (Woudt+, 2012)

    Science.gov (United States)

    Woudt, P. A.; Warner, B.; de Bude, D.; Macfarlane, S.; Schurch, M. P. E.; Zietsman, E.

    2013-01-01

    We present high-speed photometric observations of 20 faint cataclysmic variables (CVs) selected from the Sloan Digital Sky Survey (SDSS) and Catalina catalogues. Measurements are given of 15 new directly measured orbital periods, including four eclipsing dwarf novae (SDSS 0904+03, CSS 0826-00, CSS 1404-10 and CSS 1626-12), two new polars (CSS 0810+00 and CSS 1503-22) and two dwarf novae with superhumps in quiescence (CSS 0322+02 and CSS 0826-00). Whilst most of the dwarf novae presented here have periods below 2h, SDSS 0805+07 and SSS 0617-36 have relatively long orbital periods of 5.489 and 3.440h, respectively. The double-humped orbital modulations observed in SSS 0221-26, CSS 0345-01, CSS 1300+11 and CSS 1443-17 are typical of low-mass transfer rate dwarf novae. The white dwarf primary of SDSS 0919+08 is confirmed to have non-radial oscillations, and quasi-periodic oscillations were observed in the short-period dwarf nova CSS 1028-08 during outburst. We further report the detection of a new nova-like variable (SDSS 1519+06). The frequency distribution of orbital periods of CVs in the Catalina Real-time Transient Survey (CRTS) has a high peak near ~80min orbital period, independently confirming that found by Gansicke et al. (2009MNRAS.397.2170G) from SDSS sources. We also observe a marked correlation between the median in the orbital period distribution and the outburst class, in the sense that dwarf novae with a single observed outburst (over the 5-year baseline of the CRTS coverage) occur predominantly at shortest orbital period. (2 data files).

  6. Comparison Between Four Detection Algorithms for GEO Objects

    Science.gov (United States)

    Yanagisawa, T.; Uetsuhara, M.; Banno, H.; Kurosaki, H.; Kinoshita, D.; Kitazawa, Y.; Hanada, T.

    2012-09-01

    Four detection algorithms for GEO objects are being developed under a collaboration between Kyushu University, IHI Corporation and JAXA. Each algorithm is designed to process CCD images to detect GEO objects. The first is a PC-based stacking method which has been developed at JAXA since 2000. Numerous CCD images are used to detect faint GEO objects below the limiting magnitude of a single CCD image. Sub-images are cropped from many CCD images so as to follow the movement of the objects. A median image of all the sub-images is then created. Although this method is able to detect faint objects, the analysis is time consuming. The second is the line-identifying technique, which also uses many CCD frames and finds any series of objects arrayed on a straight line from the first frame to the last frame. This analyzes data faster than the stacking method, but cannot detect objects as faint as the stacking method can. The third is the robust stacking method developed by IHI Corporation, which uses an average instead of a median to reduce analysis time. It has the same analysis speed as the line-identifying technique and better detection capability for faint objects. The fourth is an FPGA-based stacking method which uses binarized images and a new algorithm implemented on an FPGA board, reducing the analysis time by a factor of about one thousand. All four algorithms analyzed the same sets of data to evaluate their advantages and disadvantages. By comparing their analysis times and results, an optimal usage of these algorithms is considered.
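
    The core idea of the stacking method, cropping sub-images that follow an assumed object motion and taking a per-pixel median so that a faint moving object builds up while stars and noise are suppressed, can be sketched as follows; the synthetic frames, motion rate and crop size are illustrative assumptions.

        import numpy as np

        def stack_on_motion(frames, rate_xy, origin_xy, crop=64):
            """Median-stack sub-images cropped along an assumed object motion.
            frames: consecutive CCD exposures; rate_xy: assumed motion in px/frame."""
            subs = []
            for i, frame in enumerate(frames):
                x = int(origin_xy[0] + i * rate_xy[0])
                y = int(origin_xy[1] + i * rate_xy[1])
                subs.append(frame[y:y + crop, x:x + crop])
            # The median suppresses stars (which drift relative to the crops) and noise,
            # while the object, fixed within the shifted crops, survives.
            return np.median(np.stack(subs), axis=0)

        # Synthetic demonstration: an object at ~1.6 sigma per frame, moving (1.0, 0.5) px/frame.
        rng = np.random.default_rng(1)
        frames = []
        for i in range(30):
            f = rng.normal(100.0, 5.0, (512, 512))
            f[int(280 + 0.5 * i), 290 + i] += 8.0
            frames.append(f)

        stacked = stack_on_motion(frames, rate_xy=(1.0, 0.5), origin_xy=(258, 248))
        print("peak significance after stacking:",
              (stacked[32, 32] - np.median(stacked)) / stacked.std())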

  7. Adaptation Computing Parameters of Pan-Tilt-Zoom Cameras for Traffic Monitoring

    Directory of Open Access Journals (Sweden)

    Ya Lin WU

    2014-01-01

    Full Text Available Closed-circuit television (CCTV) cameras have been widely used in recent years for traffic monitoring and surveillance applications. CCTV cameras can be used to automatically extract real-time traffic parameters using image processing and tracking technologies. In particular, pan-tilt-zoom (PTZ) cameras can provide flexible view selection as well as a wider observation range, which allows the traffic parameters to be calculated accurately. Calibrating the parameters of PTZ cameras therefore plays an important role in vision-based traffic applications. However, in the specific situation of locating the license plate of an illegally parked car, the parameters of the PTZ camera have to be updated according to the position and distance of the illegally parked vehicle. In the proposed traffic monitoring system, we use an ordinary webcam and a PTZ camera. We obtain the vanishing point of the traffic lane lines in the pixel coordinate system from the fixed webcam. The parameters of the PTZ camera can be initialized from the monitoring distance, the specific objectives and the vanishing point. The pixel coordinates of the illegally parked car are then used to update the PTZ camera parameters, obtain the real-world coordinates of the car, and compute the distance to it. The results show that the error between the measured distance and the real distance is only 0.2064 meter.

  8. Robust Calibration of Cameras with Telephoto Lens Using Regularized Least Squares

    Directory of Open Access Journals (Sweden)

    Mingpei Liang

    2014-01-01

    Full Text Available Cameras with telephoto lenses are usually used to recover details of an object that is either small or located far away from the cameras. However, the calibration of this kind of camera is not as accurate as that of cameras with short focal lengths, which are commonly used in many vision applications. This paper has two contributions. First, we present a first-order error analysis that shows the relation between focal length and the estimation uncertainties of the camera parameters. To our knowledge, this error analysis with respect to focal length has not been studied in the area of camera calibration. Second, we propose a robust algorithm to calibrate a camera with a long focal length without using additional devices. By adding a regularization term, our algorithm makes the estimation of the image of the absolute conic well posed. As a consequence, the covariance of the camera parameters can be reduced greatly. We further used simulations and real data to verify the proposed algorithm and obtained very stable results.
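
    The stabilizing effect of a regularization term can be illustrated with a generic ridge (Tikhonov) least-squares solve; this is a textbook sketch of the idea, not the authors' formulation of the absolute-conic estimation.

        import numpy as np

        def ridge_solve(A, b, lam):
            """Solve min ||A x - b||^2 + lam * ||x||^2 (Tikhonov / ridge regularization)."""
            n = A.shape[1]
            return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

        # Ill-conditioned example: nearly dependent columns, analogous to the near-degenerate
        # constraints that arise when the focal length is very long.
        rng = np.random.default_rng(0)
        x_true = np.array([1.0, -2.0, 0.5])
        A = rng.normal(size=(100, 3))
        A[:, 2] = A[:, 1] + 1e-4 * rng.normal(size=100)
        b = A @ x_true + 0.01 * rng.normal(size=100)

        x_ls = np.linalg.lstsq(A, b, rcond=None)[0]      # plain least squares: large variance
        x_ridge = ridge_solve(A, b, lam=1e-2)            # regularized: far smaller covariance
        print(x_ls, x_ridge)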

  9. Modeling and calibration of a 4-line-camera system for precise coordinate measurement

    Science.gov (United States)

    Zhou, Kai; Wang, Xiangjun

    2017-10-01

    The research in this paper is intended for a precise position measurement system based on 4 line-cameras. Each camera is equipped with a cylindrical lens that stretches an image point into a line perpendicular to the linear CCD, thus providing one dimension of the projected coordinate of the corresponding 3D point. The system has a symmetrical structure, with the four line-cameras divided into two identical groups. In each group, the two line-cameras are fixed together with orientations normal to each other. Those two line-cameras form a 2D image sensor that can determine a line of sight on which the target point lies, just like an area-array CCD camera does. With two groups of line-cameras, the position of 3D points can be detected. In this paper, a model of the 4-line-camera system for computing the coordinates of 3D target points in the object frame is proposed, which is linear and computationally efficient. In addition, a calibration approach is presented in which the exterior orientations of the four line-cameras are obtained. Experimental results have shown that the system can achieve high accuracy in the coordinate measurement of light spots from 0.5 m to 3.5 m, which demonstrates the performance of the proposed modeling and calibration methods.

  10. Occlusion handling framework for tracking in smart camera networks by per-target assistance task assignment

    Science.gov (United States)

    Bo, Nyan Bo; Deboeverie, Francis; Veelaert, Peter; Philips, Wilfried

    2017-09-01

    Occlusion is one of the most difficult challenges in the area of visual tracking. We propose an occlusion handling framework to improve the performance of local tracking in a smart camera view in a multicamera network. We formulate an extensible energy function to quantify the quality of a camera's observation of a particular target by taking into account both person-person and object-person occlusion. Using this energy function, a smart camera assesses the quality of its observations over all targets being tracked. When it cannot adequately observe a target, a smart camera estimates the quality of observation of that target from the viewpoints of other assisting cameras. If a camera with a better observation of the target is found, the tracking task for the target is carried out with the assistance of that camera. In our framework, only the positions of the persons being tracked are exchanged between smart cameras. Thus, the communication bandwidth requirement is very low. Performance evaluation of our method on challenging video sequences with frequent and severe occlusions shows that the accuracy of a baseline tracker is considerably improved. We also report a performance comparison with state-of-the-art trackers, which our method outperforms.

  11. Fashion Objects

    DEFF Research Database (Denmark)

    Andersen, Bjørn Schiermer

    2009-01-01

    This article attempts to create a framework for understanding modern fashion phenomena on the basis of Durkheim's sociology of religion. It focuses on Durkheim's conception of the relation between the cult and the sacred object, on his notion of 'exteriorisation', and on his theory of the social...... symbol in an attempt to describe the peculiar attraction of the fashion object and its social constitution. However, Durkheim's notions of cult and ritual must undergo profound changes if they are to be used in an analysis of fashion. The article tries to expand the Durkheimian cult, radically enlarging...... it without totally dispersing it; depicting it as held together exclusively by the sheer 'force' of the sacred object. Firstly, the article introduces the themes and problems surrounding Durkheim's conception of the sacred. Next, it briefly sketches an outline of fashion phenomena in Durkheimian categories...

  12. Automatic inference of geometric camera parameters and intercamera topology in uncalibrated disjoint surveillance cameras

    NARCIS (Netherlands)

    Hollander, R.J.M. den; Bouma, H.; Baan, J.; Eendebak, P.T.; Rest, J.H.C. van

    2015-01-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many

  13. Fashion Objects

    DEFF Research Database (Denmark)

    Andersen, Bjørn Schiermer

    2009-01-01

    This article attempts to create a framework for understanding modern fashion phenomena on the basis of Durkheim's sociology of religion. It focuses on Durkheim's conception of the relation between the cult and the sacred object, on his notion of 'exteriorisation', and on his theory of the social...... symbol in an attempt to describe the peculiar attraction of the fashion object and its social constitution. However, Durkheim's notions of cult and ritual must undergo profound changes if they are to be used in an analysis of fashion. The article tries to expand the Durkheimian cult, radically enlarging...... of the enlargement of the cult into individual behaviour....

  14. Web Camera Based Eye Tracking to Assess Visual Memory on a Visual Paired Comparison Task

    Directory of Open Access Journals (Sweden)

    Nicholas T. Bott

    2017-06-01

    Full Text Available Background: Web cameras are increasingly part of the standard hardware of most smart devices. Eye movements can often provide a noninvasive “window on the brain,” and the recording of eye movements using web cameras is a burgeoning area of research. Objective: This study investigated a novel methodology for administering a visual paired comparison (VPC) decisional task using a web camera. To further assess this method, we examined the correlation between a standard eye-tracking camera automated scoring procedure [obtaining images at 60 frames per second (FPS)] and a manually scored procedure using a built-in laptop web camera (obtaining images at 3 FPS). Methods: This was an observational study of 54 clinically normal older adults. Subjects completed three in-clinic visits with simultaneous recording of eye movements on a VPC decision task by a standard eye tracker camera and a built-in laptop-based web camera. Inter-rater reliability was analyzed using Siegel and Castellan's kappa formula. Pearson correlations were used to investigate the correlation between VPC performance using a standard eye tracker camera and a built-in web camera. Results: Strong associations were observed on VPC mean novelty preference score between the 60 FPS eye tracker and 3 FPS built-in web camera at each of the three visits (r = 0.88–0.92). Inter-rater agreement of web camera scoring at each time point was high (κ = 0.81–0.88). There were strong relationships on VPC mean novelty preference score between 10, 5, and 3 FPS training sets (r = 0.88–0.94). Significantly fewer data quality issues were encountered using the built-in web camera. Conclusions: Human scoring of a VPC decisional task using a built-in laptop web camera correlated strongly with automated scoring of the same task using a standard high frame rate eye tracker camera. While this method is not suitable for eye tracking paradigms requiring the collection and analysis of fine-grained metrics, such as

  15. Finding Objects for Assisting Blind People.

    Science.gov (United States)

    Yi, Chucai; Flores, Roberto W; Chincha, Ricardo; Tian, Yingli

    2013-07-01

    Computer vision technology has been widely used for blind assistance, such as navigation and wayfinding. However, few camera-based systems have been developed to help blind or visually impaired people find daily necessities. In this paper, we propose a prototype system for blind-assistant object finding based on a camera network and matching-based recognition. We collect a dataset of daily necessities and apply Speeded-Up Robust Features (SURF) and Scale Invariant Feature Transform (SIFT) feature descriptors to perform object recognition. Experimental results demonstrate the effectiveness of our prototype system.
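
    A minimal version of the matching-based recognition step, using SIFT descriptors and a ratio test as commonly done with OpenCV; the file names, database entries and threshold are placeholders rather than the authors' configuration.

        import cv2

        sift = cv2.SIFT_create()
        matcher = cv2.BFMatcher(cv2.NORM_L2)

        def good_matches(query_img, object_img, ratio=0.75):
            """Count ratio-test matches between a camera frame and a stored object image."""
            _, d_query = sift.detectAndCompute(query_img, None)
            _, d_obj = sift.detectAndCompute(object_img, None)
            if d_query is None or d_obj is None:
                return 0
            pairs = matcher.knnMatch(d_obj, d_query, k=2)
            return sum(1 for p in pairs if len(p) == 2 and p[0].distance < ratio * p[1].distance)

        # Recognize the best-matching daily necessity from a small reference database.
        query = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)
        database = {"cereal_box": cv2.imread("cereal.png", cv2.IMREAD_GRAYSCALE),
                    "milk_carton": cv2.imread("milk.png", cv2.IMREAD_GRAYSCALE)}
        scores = {name: good_matches(query, img) for name, img in database.items()}
        print(max(scores, key=scores.get), scores)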

  16. Ultra fast x-ray streak camera

    International Nuclear Information System (INIS)

    Coleman, L.W.; McConaghy, C.F.

    1975-01-01

    A unique ultrafast X-ray sensitive streak camera, with a time resolution of 50 psec, has been built and operated. A 100 Å thick gold photocathode on a beryllium vacuum window is used in a modified commercial image converter tube. The X-ray streak camera has been used in experiments to observe time-resolved emission from laser-produced plasmas. (author)

  17. An Open Standard for Camera Trap Data

    Directory of Open Access Journals (Sweden)

    Tavis Forrester

    2016-12-01

    Full Text Available Camera traps that capture photos of animals are a valuable tool for monitoring biodiversity. The use of camera traps is rapidly increasing and there is an urgent need for standardization to facilitate data management, reporting and data sharing. Here we offer the Camera Trap Metadata Standard as an open data standard for storing and sharing camera trap data, developed by experts from a variety of organizations. The standard captures information necessary to share data between projects and offers a foundation for collecting the more detailed data needed for advanced analysis. The data standard captures information about study design, the type of camera used, and the location and species names for all detections in a standardized way. This information is critical for accurately assessing results from individual camera trapping projects and for combining data from multiple studies for meta-analysis. This data standard is an important step in aligning camera trapping surveys with best practices in data-intensive science. Ecology is moving rapidly into the realm of big data, and central data repositories are becoming a critical tool and are emerging for camera trap data. This data standard will help researchers standardize data terms, align past data to new repositories, and provide a framework for utilizing data across repositories and research projects to advance animal ecology and conservation.

  18. Active spectral imaging nondestructive evaluation (SINDE) camera

    Energy Technology Data Exchange (ETDEWEB)

    Simova, E.; Rochefort, P.A., E-mail: eli.simova@cnl.ca [Canadian Nuclear Laboratories, Chalk River, Ontario (Canada)

    2016-06-15

    A proof-of-concept video camera for active spectral imaging nondestructive evaluation has been demonstrated. An active multispectral imaging technique has been implemented in the visible and near infrared by using light emitting diodes with wavelengths spanning from 400 to 970 nm. This shows how the camera can be used in nondestructive evaluation to inspect surfaces and spectrally identify materials and corrosion. (author)

  19. CCD Color Camera Characterization for Image Measurements

    NARCIS (Netherlands)

    Withagen, P.J.; Groen, F.C.A.; Schutte, K.

    2007-01-01

    In this article, we analyze a range of different types of cameras for their use in measurements. We verify a general model of a charge-coupled device camera using experiments. This model includes gain and offset, additive and multiplicative noise, and gamma correction. It is shown that for
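
    A generic form of the camera model mentioned above (gain and offset, additive and multiplicative noise, gamma correction), written as a small simulation; all parameter values are illustrative, not those measured in the article.

        import numpy as np

        def ccd_response(irradiance, gain=2.0, offset=10.0, gamma=0.45,
                         sigma_mult=0.02, sigma_add=1.5, rng=np.random.default_rng(0)):
            """Simulate grey values: multiplicative noise, gain, offset, additive noise, gamma."""
            signal = irradiance * (1.0 + sigma_mult * rng.normal(size=irradiance.shape))
            signal = gain * signal + offset + sigma_add * rng.normal(size=irradiance.shape)
            signal = np.clip(signal, 0.0, 255.0)
            return 255.0 * (signal / 255.0) ** gamma        # gamma correction

        grey = ccd_response(np.linspace(0.0, 100.0, 11))
        print(np.round(grey, 1))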

  20. Driving with head-slaved camera system

    NARCIS (Netherlands)

    Oving, A.B.; Erp, J.B.F. van

    2001-01-01

    In a field experiment, we tested the effectiveness of a head-slaved camera system for driving an armoured vehicle under armour. This system consists of a helmet-mounted display (HMD), a headtracker, and a motion platform with two cameras. Subjects performed several driving tasks on paved and in

  1. High resolution RGB color line scan camera

    Science.gov (United States)

    Lynch, Theodore E.; Huettig, Fred

    1998-04-01

    A color line scan camera family, available with either 6000, 8000 or 10000 pixels per color channel, which utilizes off-the-shelf lenses, interfaces with currently available frame grabbers, includes on-board pixel-by-pixel offset correction, and is configurable and controllable via an RS232 serial port for computer-controlled or stand-alone operation, is described in this paper. This line scan camera is based on an available 8000-element monochrome line scan camera designed by AOA for OEM use. The new color version includes improvements such as better packaging and additional user features which make the camera easier to use. The heart of the camera is a tri-linear CCD sensor with on-chip color balancing for maximum accuracy and pinned photodiodes for low-lag response. Each color channel is digitized to 12 bits and all three channels are multiplexed together so that the resulting camera output video is either a 12 or 8 bit data stream at a rate of up to 24 Megapixels/sec. Conversion from 12 to 8 bit, or user-defined gamma, is accomplished by on-board user-defined video look-up tables. The camera has two user-selectable operating modes: a low-speed, high-sensitivity mode or a high-speed, reduced-sensitivity mode. The intended uses of the camera include industrial inspection, digital archiving, document scanning, and graphic arts applications.

  2. Laser scanning camera inspects hazardous area

    International Nuclear Information System (INIS)

    Fryatt, A.; Miprode, C.

    1985-01-01

    The main operational characteristics of a new laser scanning camera are presented. The camera is intended primarily for low-level, high-resolution viewing inside nuclear reactors. It uses a He-Ne laser beam raster; by detecting the reflected light by means of a photomultiplier, the subject under observation can be reconstructed in an electronic video store and reviewed on a conventional monitor screen

  3. Rosetta Star Tracker and Navigation Camera

    DEFF Research Database (Denmark)

    Thuesen, Gøsta

    1998-01-01

    Proposal in response to the Invitation to Tender (ITT) issued by Matra Marconi Space (MSS) for the procurement of the ROSETTA Star Tracker and Navigation Camera.

  4. Centering mount for a gamma camera

    International Nuclear Information System (INIS)

    Mirkhodzhaev, A.Kh.; Kuznetsov, N.K.; Ostryj, Yu.E.

    1988-01-01

    A device for centering a γ-camera detector in case of radionuclide diagnosis is described. It permits the use of available medical coaches instead of a table with a transparent top. The device can be used for centering a detector (when it is fixed at the low end of a γ-camera) on a required area of the patient's body

  5. Securing Embedded Smart Cameras with Trusted Computing

    Directory of Open Access Journals (Sweden)

    Winkler Thomas

    2011-01-01

    Full Text Available Camera systems are used in many applications including video surveillance for crime prevention and investigation, traffic monitoring on highways or building monitoring and automation. With the shift from analog towards digital systems, the capabilities of cameras are constantly increasing. Today's smart camera systems come with considerable computing power, large memory, and wired or wireless communication interfaces. With onboard image processing and analysis capabilities, cameras not only open new possibilities but also raise new challenges. Often overlooked are potential security issues of the camera system. The increasing amount of software running on the cameras turns them into attractive targets for attackers. Therefore, the protection of camera devices and delivered data is of critical importance. In this work we present an embedded camera prototype that uses Trusted Computing to provide security guarantees for streamed videos. With a hardware-based security solution, we ensure integrity, authenticity, and confidentiality of videos. Furthermore, we incorporate image timestamping, detection of platform reboots, and reporting of the system status. This work is not limited to theoretical considerations but also describes the implementation of a prototype system. Extensive evaluation results illustrate the practical feasibility of the approach.

  6. Architecture of PAU survey camera readout electronics

    Science.gov (United States)

    Castilla, Javier; Cardiel-Sas, Laia; De Vicente, Juan; Illa, Joseph; Jimenez, Jorge; Maiorino, Marino; Martinez, Gustavo

    2012-07-01

    PAUCam is a new camera for studying the physics of the accelerating universe. The camera will consist of eighteen 2Kx4K HPK CCDs: sixteen for science and two for guiding. The camera will be installed at the prime focus of the WHT (William Herschel Telescope). In this contribution, the architecture of the readout electronics system is presented. Back-End and Front-End electronics are described. The Back-End consists of clock, bias and video processing boards, mounted on Monsoon crates. The Front-End is based on patch panel boards. These boards are plugged in outside the camera feed-through panel for signal distribution. Inside the camera, individual preamplifier boards plus kapton cables complete the path to connect to each CCD. The overall signal distribution and grounding scheme is shown in this paper.

  7. Stereo Cameras for Clouds (STEREOCAM) Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Romps, David [Univ. of California, Berkeley, CA (United States); Oktem, Rusen [Univ. of California, Berkeley, CA (United States)

    2017-10-31

    The three pairs of stereo camera setups aim to provide synchronized and stereo calibrated time series of images that can be used for 3D cloud mask reconstruction. Each camera pair is positioned at approximately 120 degrees from the other pair, with a 17°-19° pitch angle from the ground, and at 5-6 km distance from the U.S. Department of Energy (DOE) Central Facility at the Atmospheric Radiation Measurement (ARM) Climate Research Facility Southern Great Plains (SGP) observatory to cover the region from northeast, northwest, and southern views. Images from both cameras of the same stereo setup can be paired together to obtain 3D reconstruction by triangulation. 3D reconstructions from the ring of three stereo pairs can be combined together to generate a 3D mask from surrounding views. This handbook delivers all stereo reconstruction parameters of the cameras necessary to make 3D reconstructions from the stereo camera images.
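
    The triangulation step that converts a matched pixel pair from the two calibrated cameras of one stereo setup into a 3D point can be sketched with OpenCV's linear triangulation; the projection matrices and pixel coordinates below are made-up placeholders, not the SGP camera parameters.

        import cv2
        import numpy as np

        # Hypothetical projection matrices P = K [R | t] for the two cameras of one pair.
        K = np.array([[1200.0, 0.0, 960.0],
                      [0.0, 1200.0, 540.0],
                      [0.0, 0.0, 1.0]])
        P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])               # first camera at the origin
        R2 = cv2.Rodrigues(np.array([0.0, 0.02, 0.0]).reshape(3, 1))[0] # second camera slightly rotated
        t2 = np.array([[-500.0], [0.0], [0.0]])                         # baseline in metres (illustrative)
        P2 = K @ np.hstack([R2, t2])

        # Pixel coordinates of the same cloud feature in the two synchronized images (placeholders).
        pt1 = np.array([[1012.3], [480.7]])
        pt2 = np.array([[903.1], [482.2]])

        X_h = cv2.triangulatePoints(P1, P2, pt1, pt2)                   # homogeneous 4x1 result
        X = (X_h[:3] / X_h[3]).ravel()
        print("3D point (same units as the baseline):", X)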

  8. Moving object detection using background subtraction

    CERN Document Server

    Shaikh, Soharab Hossain; Chaki, Nabendu

    2014-01-01

    This Springer Brief presents a comprehensive survey of existing background subtraction methods. It presents a framework for quantitative performance evaluation of different approaches and summarizes the public databases available for research purposes. This well-known methodology has applications in moving object detection from video captured with a stationary camera, separating foreground and background objects, and object classification and recognition. The authors identify common challenges faced by researchers, including gradual or sudden illumination change, dynamic bac
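
    A minimal example of background subtraction for a stationary camera using OpenCV's Gaussian-mixture subtractor, one member of the family of methods surveyed in the book; the video path and area threshold are placeholders.

        import cv2

        cap = cv2.VideoCapture("static_camera.mp4")       # placeholder video from a stationary camera
        subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                        detectShadows=True)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            fg_mask = subtractor.apply(frame)             # 255 = foreground, 127 = shadow, 0 = background
            fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN,
                                       cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5)))
            contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            moving_objects = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 200]
        cap.release()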

  9. Superconducting millimetre-wave cameras

    Science.gov (United States)

    Monfardini, Alessandro

    2017-05-01

    I present a review of the developments in kinetic inductance detectors (KID) for mm-wave and THz imaging-polarimetry in the framework of the Grenoble collaboration. The main application that we have targeted so far is large field-of-view astronomy. I focus in particular on our own experiment: NIKA2 (Néel IRAM KID Arrays). NIKA2 is today the largest millimetre camera available to the astronomical community for general purpose observations. It consists of a dual-band, dual-polarisation, multi-thousand-pixel system installed at the IRAM 30-m telescope at Pico Veleta (Spain). I start with a general introduction covering the underlying physics and the KID working principle. I then briefly describe the instrument and the detectors, and conclude with examples of pictures taken on the sky by NIKA2 and its predecessor, NIKA. Thanks to these results, together with the relative simplicity and low cost of the KID fabrication, industrial applications requiring passive millimetre-THz imaging have now become possible.

  10. First Light for World's Largest 'Thermometer Camera'

    Science.gov (United States)

    2007-08-01

    LABOCA in Service at APEX. The world's largest bolometer camera for submillimetre astronomy is now in service at the 12-m APEX telescope, located on the 5100m high Chajnantor plateau in the Chilean Andes. LABOCA was specifically designed for the study of extremely cold astronomical objects and, with its large field of view and very high sensitivity, will open new vistas in our knowledge of how stars form and how the first galaxies emerged from the Big Bang. ESO PR Photo 35a/07: LABOCA on APEX. "A large fraction of all the gas in the Universe has extremely cold temperatures of around minus 250 degrees Celsius, a mere 20 degrees above absolute zero," says Karl Menten, director at the Max Planck Institute for Radioastronomy (MPIfR) in Bonn, Germany, that built LABOCA. "Studying these cold clouds requires looking at the light they radiate in the submillimetre range, with very sophisticated detectors." Astronomers use bolometers for this task, which are, in essence, thermometers. They detect incoming radiation by registering the resulting rise in temperature. More specifically, a bolometer detector consists of an extremely thin foil that absorbs the incoming light. Any change of the radiation's intensity results in a slight change in temperature of the foil, which can then be registered by sensitive electronic thermometers. To be able to measure such minute temperature fluctuations requires the bolometers to be cooled down to less than 0.3 degrees above absolute zero, that is below minus 272.85 degrees Celsius. "Cooling to such low temperatures requires using liquid helium, which is no simple feat for an observatory located at 5100m altitude," says Carlos De Breuck, the APEX instrument scientist at ESO. Nor is it simple to measure the weak temperature radiation of astronomical objects. Millimetre and submillimetre radiation opens a window into the enigmatic cold Universe, but the signals from space are heavily absorbed by water vapour in the Earth

  11. Influence of Digital Camera Errors on the Photogrammetric Image Processing

    Science.gov (United States)

    Sužiedelytė-Visockienė, Jūratė; Bručas, Domantas

    2009-01-01

    The paper deals with the calibration of the digital camera Canon EOS 350D, often used for photogrammetric 3D digitalisation and measurements of industrial and construction site objects. During the calibration, data on the optical and electronic parameters influencing the distortion of images, such as the correction of the principal point, the focal length of the objective, and the radial symmetric and asymmetric distortions, were obtained. The calibration was performed by means of the Tcc software implementing Chebyshev polynomials and using a special test field with marks whose coordinates are precisely known. The main task of the research is to determine how the camera calibration parameters influence the processing of images, i.e. the creation of the geometric model, the results of triangulation calculations and stereo-digitalisation. Two photogrammetric projects were created for this task. In the first project non-corrected images were used, and in the second corrected images, taking into account the optical errors of the camera obtained during the calibration. The results of the image processing analysis are shown in the figures and tables. The conclusions are given.
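
    The radial and asymmetric (tangential) distortion corrections obtained from such a calibration are commonly applied with the Brown-Conrady model; the sketch below uses OpenCV's undistort with made-up coefficients standing in for the calibrated values of the Canon EOS 350D.

        import cv2
        import numpy as np

        # Placeholder intrinsics and distortion coefficients (k1, k2, p1, p2, k3);
        # a real calibration would supply these values.
        K = np.array([[3200.0, 0.0, 1728.0],
                      [0.0, 3200.0, 1152.0],
                      [0.0, 0.0, 1.0]])
        dist = np.array([-0.12, 0.05, 0.0007, -0.0004, 0.0])

        img = cv2.imread("frame.jpg")                            # placeholder image path
        h, w = img.shape[:2]
        new_K, roi = cv2.getOptimalNewCameraMatrix(K, dist, (w, h), 0)   # alpha = 0: crop invalid pixels
        undistorted = cv2.undistort(img, K, dist, None, new_K)   # corrected image for measurement use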

  12. DEPTH CAMERAS ON UAVs: A FIRST APPROACH

    Directory of Open Access Journals (Sweden)

    A. Deris

    2017-02-01

    Full Text Available Accurate depth information retrieval of a scene is a field under investigation in the research areas of photogrammetry, computer vision and robotics. Various technologies, active, as well as passive, are used to serve this purpose such as laser scanning, photogrammetry and depth sensors, with the latter being a promising innovative approach for fast and accurate 3D object reconstruction using a broad variety of measuring principles including stereo vision, infrared light or laser beams. In this study we investigate the use of the newly designed Stereolab's ZED depth camera based on passive stereo depth calculation, mounted on an Unmanned Aerial Vehicle with an ad-hoc setup, specially designed for outdoor scene applications. Towards this direction, the results of its depth calculations and scene reconstruction generated by Simultaneous Localization and Mapping (SLAM algorithms are compared and evaluated based on qualitative and quantitative criteria with respect to the ones derived by a typical Structure from Motion (SfM and Multiple View Stereo (MVS pipeline for a challenging cultural heritage application.

  13. Camera Networks The Acquisition and Analysis of Videos over Wide Areas

    CERN Document Server

    Roy-Chowdhury, Amit K

    2012-01-01

    As networks of video cameras are installed in many applications like security and surveillance, environmental monitoring, disaster response, and assisted living facilities, among others, image understanding in camera networks is becoming an important area of research and technology development. There are many challenges that need to be addressed in the process. Some of them are listed below: - Traditional computer vision challenges in tracking and recognition, robustness to pose, illumination, occlusion, clutter, recognition of objects, and activities; - Aggregating local information for wide

  14. Towards Adaptive Virtual Camera Control In Computer Games

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2011-01-01

    model of the camera behaviour that can be used to control camera movements based on player preferences. For this purpose, we collect eye gaze, camera and game-play data from subjects playing a 3D platform game, we cluster gaze and camera information to identify camera behaviour profiles and we employ...

  15. Unified framework for recognition, localization and mapping using wearable cameras.

    Science.gov (United States)

    Vázquez-Martín, Ricardo; Bandera, Antonio

    2012-08-01

    Monocular approaches to simultaneous localization and mapping (SLAM) have recently addressed with success the challenging problem of the fast computation of dense reconstructions from a single, moving camera. Thus, if these approaches initially relied on the detection of a reduced set of interest points to estimate the camera position and the map, they are currently able to reconstruct dense maps from a handheld camera while the camera coordinates are simultaneously computed. However, these maps of 3-dimensional points usually remain meaningless, that is, with no memorable items and without providing a way of encoding spatial relationships between objects and paths. In humans and mobile robotics, landmarks play a key role in the internalization of a spatial representation of an environment. They are memorable cues that can serve to define a region of the space or the location of other objects. In a topological representation of the space, landmarks can be identified and located according to its structural, perceptive or semantic significance and distinctiveness. But on the other hand, landmarks may be difficult to be located in a metric representation of the space. Restricted to the domain of visual landmarks, this work describes an approach where the map resulting from a point-based, monocular SLAM is annotated with the semantic information provided by a set of distinguished landmarks. Both features are obtained from the image. Hence, they can be linked by associating to each landmark all those point-based features that are superimposed to the landmark in a given image (key-frame). Visual landmarks will be obtained by means of an object-based, bottom-up attention mechanism, which will extract from the image a set of proto-objects. These proto-objects could not be always associated with natural objects, but they will typically constitute significant parts of these scene objects and can be appropriately annotated with semantic information. Moreover, they will be

  16. The NIKA2 large-field-of-view millimetre continuum camera for the 30 m IRAM telescope

    Science.gov (United States)

    Adam, R.; Adane, A.; Ade, P. A. R.; André, P.; Andrianasolo, A.; Aussel, H.; Beelen, A.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Bracco, A.; Calvo, M.; Catalano, A.; Coiffard, G.; Comis, B.; De Petris, M.; Désert, F.-X.; Doyle, S.; Driessen, E. F. C.; Evans, R.; Goupy, J.; Kramer, C.; Lagache, G.; Leclercq, S.; Leggeri, J.-P.; Lestrade, J.-F.; Macías-Pérez, J. F.; Mauskopf, P.; Mayet, F.; Maury, A.; Monfardini, A.; Navarro, S.; Pascale, E.; Perotto, L.; Pisano, G.; Ponthieu, N.; Revéret, V.; Rigby, A.; Ritacco, A.; Romero, C.; Roussel, H.; Ruppin, F.; Schuster, K.; Sievers, A.; Triqueneaux, S.; Tucker, C.; Zylka, R.

    2018-01-01

    Context. Millimetre-wave continuum astronomy is today an indispensable tool for both general astrophysics studies (e.g. star formation, nearby galaxies) and cosmology (e.g. cosmic microwave background and high-redshift galaxies). General purpose, large-field-of-view instruments are needed to map the sky at intermediate angular scales not accessible by the high-resolution interferometers (e.g. ALMA in Chile, NOEMA in the French Alps) and by the coarse angular resolution space-borne or ground-based surveys (e.g. Planck, ACT, SPT). These instruments have to be installed at the focal plane of the largest single-dish telescopes, which are placed at high altitude on selected dry observing sites. In this context, we have constructed and deployed a three-thousand-pixel dual-band (150 GHz and 260 GHz, respectively 2 mm and 1.15 mm wavelengths) camera to image an instantaneous circular field-of-view of 6.5 arcmin in diameter, and configurable to map the linear polarisation at 260 GHz. Aims: First, we are providing a detailed description of this instrument, named NIKA2 (New IRAM KID Arrays 2), in particular focussing on the cryogenics, optics, focal plane arrays based on Kinetic Inductance Detectors, and the readout electronics. The focal planes and part of the optics are cooled down to the nominal 150 mK operating temperature by means of an adhoc dilution refrigerator. Secondly, we are presenting the performance measured on the sky during the commissioning runs that took place between October 2015 and April 2017 at the 30-m IRAM telescope at Pico Veleta, near Granada (Spain). Methods: We have targeted a number of astronomical sources. Starting from beam-maps on primary and secondary calibrators we have then gone to extended sources and faint objects. Both internal (electronic) and on-the-sky calibrations are applied. The general methods are described in the present paper. Results: NIKA2 has been successfully deployed and commissioned, performing in-line with expectations. In

  17. Automatic Chessboard Detection for Intrinsic and Extrinsic Camera Parameter Calibration

    Directory of Open Access Journals (Sweden)

    Jose María Armingol

    2010-03-01

    Full Text Available There are increasing applications that require precise calibration of cameras to perform accurate measurements on objects located within images, and an automatic algorithm would reduce this time-consuming calibration procedure. The method proposed in this article uses a pattern similar to that of a chess board, which is found automatically in each image, even when no information regarding the number of rows or columns is supplied to aid its detection. This is carried out by means of a combined analysis of two Hough transforms, image corners and invariant properties of the perspective transformation. Comparative analysis with more commonly used algorithms demonstrates the viability of the algorithm proposed, as a valuable tool for camera calibration.
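
    To make the calibration step concrete, the sketch below shows how detected chessboard corners feed an intrinsic and extrinsic calibration using OpenCV's standard routines. Unlike the Hough-based detector proposed in the article, the standard detector does require the number of inner corners per row and column to be supplied; the pattern size and file names used here are illustrative assumptions.

        import glob
        import cv2
        import numpy as np

        pattern = (9, 6)                                  # inner corners per row/column (assumed)
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)  # board coordinates

        obj_points, img_points, image_size = [], [], None
        for fname in glob.glob("calib_*.png"):            # hypothetical image names
            gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
                corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
                obj_points.append(objp)
                img_points.append(corners)
                image_size = gray.shape[::-1]

        # Intrinsic matrix K, distortion coefficients, and one extrinsic pose per image.
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
        print("RMS reprojection error (pixels):", rms)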

  18. True RGB line-scan camera for color machine vision applications

    Science.gov (United States)

    Lemstrom, Guy F.

    1994-10-01

    The design and technical capabilities of a true RGB 3-CCD-chip color line scan camera are presented within this paper. The camera was developed for accurate color monitoring and analysis in industrial applications. A black & white line scan camera has been designed and built utilizing the same modular architecture as the color line scan camera. Color separation is made possible with a tri-chromatic RGB beam splitter. Three CCD linear arrays are precisely mounted to the output surfaces of the prism and the outputs of each CCD are exactly matched pixel by pixel. The beam splitter prism can be tailored to separate other spectral components than the standard RGB. A typical CCD can detect between 200 and 1000 nm. Either two or three spectral regions can be separated using a beam splitter prism. The camera is totally digital and has a 16-bit parallel computer interface to communicate with a signal processing board. Because of the open architecture of the camera it is possible for the customer to design a board with some special functions handling the preprocessing of the data (for example RGB to HSI conversion). The camera can also be equipped with a high speed CPU board with enough local memory to do some image processing inside the camera before sending the data forward. The camera has been used in real industrial applications and has proven that its high resolution and high dynamic range can be used to measure minute color differences, enabling the separation or grading of objects such as minerals, food or other materials that could not otherwise be measured with a black and white camera.
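
    The RGB-to-HSI conversion mentioned above as an example of on-board preprocessing is a fixed per-pixel transformation; a minimal software sketch of the standard arccos formulation of the HSI model is given below (this is the textbook conversion, not the camera vendor's firmware).

        import numpy as np

        def rgb_to_hsi(rgb):
            """Convert an (H, W, 3) float RGB image in [0, 1] to hue, saturation, intensity.

            Hue is returned in radians [0, 2*pi); saturation and intensity lie in [0, 1].
            """
            r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
            eps = 1e-10
            intensity = (r + g + b) / 3.0
            saturation = 1.0 - 3.0 * np.minimum(np.minimum(r, g), b) / (r + g + b + eps)
            num = 0.5 * ((r - g) + (r - b))
            den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
            theta = np.arccos(np.clip(num / den, -1.0, 1.0))
            hue = np.where(b > g, 2.0 * np.pi - theta, theta)   # flip when blue dominates green
            return hue, saturation, intensity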

  19. Multi-MGy Radiation Hardened Camera for Nuclear Facilities

    International Nuclear Information System (INIS)

    Girard, Sylvain; Boukenter, Aziz; Ouerdane, Youcef; Goiffon, Vincent; Corbiere, Franck; Rolando, Sebastien; Molina, Romain; Estribeau, Magali; Avon, Barbara; Magnan, Pierre; Paillet, Philippe; Duhamel, Olivier; Gaillardin, Marc; Raine, Melanie

    2015-01-01

    There is an increasing interest in developing cameras for surveillance systems to monitor nuclear facilities or nuclear waste storages. Particularly, for today's and the next generation of nuclear facilities, increasing safety requirements following the Fukushima Daiichi disaster have to be considered. For some applications, radiation tolerance needs to extend to doses in the MGy(SiO2) range, whereas the most tolerant commercial or prototype products based on solid state image sensors withstand doses up to a few kGy. The objective of this work is to present the radiation hardening strategy developed by our research groups to enhance the tolerance to ionizing radiation of the various subparts of these imaging systems by working simultaneously at the component and system design levels. Developing a radiation-hardened camera implies combining several radiation-hardening strategies. In our case, we decided not to use the simplest one, the shielding approach. This approach is efficient but limits the camera miniaturization and is not compatible with its future integration in remote-handling or robotic systems. The hardening-by-component strategy therefore appears mandatory to avoid the failure of one of the camera subparts at doses lower than the MGy. Concerning the image sensor itself, the technology used is a CMOS Image Sensor (CIS) designed by the ISAE team, with custom pixel designs used to mitigate the total ionizing dose (TID) effects that occur well below the MGy range in classical image sensors (e.g. Charge Coupled Devices (CCD), Charge Injection Devices (CID) and classical Active Pixel Sensors (APS)), such as the complete loss of functionality, the dark current increase and the gain drop. We will present at the conference a comparative study of the radiation response of these radiation-hardened pixels with respect to conventional ones, demonstrating the efficiency of the choices made. The targeted strategy to develop the complete radiation hard camera

  20. Multi-MGy Radiation Hardened Camera for Nuclear Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Girard, Sylvain; Boukenter, Aziz; Ouerdane, Youcef [Universite de Saint-Etienne, Lab. Hubert Curien, UMR-CNRS 5516, F-42000 Saint-Etienne (France); Goiffon, Vincent; Corbiere, Franck; Rolando, Sebastien; Molina, Romain; Estribeau, Magali; Avon, Barbara; Magnan, Pierre [ISAE, Universite de Toulouse, F-31055 Toulouse (France); Paillet, Philippe; Duhamel, Olivier; Gaillardin, Marc; Raine, Melanie [CEA, DAM, DIF, F-91297 Arpajon (France)

    2015-07-01

    There is an increasing interest in developing cameras for surveillance systems to monitor nuclear facilities or nuclear waste storages. Particularly, for today's and the next generation of nuclear facilities, increasing safety requirements following the Fukushima Daiichi disaster have to be considered. For some applications, radiation tolerance needs to extend to doses in the MGy(SiO2) range, whereas the most tolerant commercial or prototype products based on solid state image sensors withstand doses up to a few kGy. The objective of this work is to present the radiation hardening strategy developed by our research groups to enhance the tolerance to ionizing radiation of the various subparts of these imaging systems by working simultaneously at the component and system design levels. Developing a radiation-hardened camera implies combining several radiation-hardening strategies. In our case, we decided not to use the simplest one, the shielding approach. This approach is efficient but limits the camera miniaturization and is not compatible with its future integration in remote-handling or robotic systems. The hardening-by-component strategy therefore appears mandatory to avoid the failure of one of the camera subparts at doses lower than the MGy. Concerning the image sensor itself, the technology used is a CMOS Image Sensor (CIS) designed by the ISAE team, with custom pixel designs used to mitigate the total ionizing dose (TID) effects that occur well below the MGy range in classical image sensors (e.g. Charge Coupled Devices (CCD), Charge Injection Devices (CID) and classical Active Pixel Sensors (APS)), such as the complete loss of functionality, the dark current increase and the gain drop. We will present at the conference a comparative study of the radiation response of these radiation-hardened pixels with respect to conventional ones, demonstrating the efficiency of the choices made. The targeted strategy to develop the complete radiation hard camera

  1. SLR digital camera for forensic photography

    Science.gov (United States)

    Har, Donghwan; Son, Youngho; Lee, Sungwon

    2004-06-01

    Forensic photography, which was systematically established in the late 19th century by Alphonse Bertillon of France, has developed considerably over the past 100 years. This development will be further accelerated by high technologies, in particular digital technology. This paper reviews three studies to answer the question: Can the SLR digital camera replace traditional silver halide ultraviolet photography and infrared photography? 1. Comparison of the relative ultraviolet and infrared sensitivity of the SLR digital camera to silver halide photography. 2. How much is ultraviolet or infrared sensitivity improved when removing the UV/IR cutoff filter built into the SLR digital camera? 3. Comparison of the relative sensitivity of CCD and CMOS for ultraviolet and infrared. The test results showed that the SLR digital camera has a very low sensitivity for ultraviolet and infrared. The cause was found to be the UV/IR cutoff filter mounted in front of the image sensor. Removing the UV/IR cutoff filter significantly improved the sensitivity for ultraviolet and infrared. Particularly for infrared, the sensitivity of the SLR digital camera was better than that of silver halide film. This shows the possibility of replacing silver halide ultraviolet photography and infrared photography with the SLR digital camera. Thus, the SLR digital camera seems to be useful for forensic photography, which deals with a lot of ultraviolet and infrared photographs.

  2. UAV CAMERAS: OVERVIEW AND GEOMETRIC CALIBRATION BENCHMARK

    Directory of Open Access Journals (Sweden)

    M. Cramer

    2017-08-01

    Full Text Available Different UAV platforms and sensors are already used in mapping, many of them equipped with (sometimes) modified cameras as known from the consumer market. Even though these systems normally fulfil their requested mapping accuracy, the question arises: which system performs best? This calls for a benchmark to check selected UAV-based camera systems in well-defined, reproducible environments. Such a benchmark is attempted in this work. Nine different cameras used on UAV platforms, representing typical camera classes, are considered. The focus is laid on the geometry here, which is tightly linked to the process of geometric calibration of the system. In most applications the calibration is performed in-situ, i.e. calibration parameters are obtained as part of the project data itself. This is often motivated by the fact that consumer cameras do not keep a constant geometry and thus cannot be seen as metric cameras. Still, some of the commercial systems are quite stable over time, as has been proven from repeated (terrestrial) calibration runs. Already (pre-)calibrated systems may offer advantages, especially when the block geometry of the project does not allow for a stable and sufficient in-situ calibration. Especially in such scenarios, close-to-metric UAV cameras may have advantages. Empirical airborne test flights over a calibration field have shown how the block geometry influences the estimated calibration parameters and how consistently the parameters from lab calibration can be reproduced.

  3. Uav Cameras: Overview and Geometric Calibration Benchmark

    Science.gov (United States)

    Cramer, M.; Przybilla, H.-J.; Zurhorst, A.

    2017-08-01

    Different UAV platforms and sensors are already used in mapping, many of them equipped with (sometimes) modified cameras as known from the consumer market. Even though these systems normally fulfil their requested mapping accuracy, the question arises: which system performs best? This calls for a benchmark to check selected UAV-based camera systems in well-defined, reproducible environments. Such a benchmark is attempted in this work. Nine different cameras used on UAV platforms, representing typical camera classes, are considered. The focus is laid on the geometry here, which is tightly linked to the process of geometric calibration of the system. In most applications the calibration is performed in-situ, i.e. calibration parameters are obtained as part of the project data itself. This is often motivated by the fact that consumer cameras do not keep a constant geometry and thus cannot be seen as metric cameras. Still, some of the commercial systems are quite stable over time, as has been proven from repeated (terrestrial) calibration runs. Already (pre-)calibrated systems may offer advantages, especially when the block geometry of the project does not allow for a stable and sufficient in-situ calibration. Especially in such scenarios, close-to-metric UAV cameras may have advantages. Empirical airborne test flights over a calibration field have shown how the block geometry influences the estimated calibration parameters and how consistently the parameters from lab calibration can be reproduced.

  4. New camera systems for fuel services

    International Nuclear Information System (INIS)

    Hummel, W.; Beck, H.J.

    2010-01-01

    AREVA NP Fuel Services have many years of experience in visual examination and measurements on fuel assemblies and associated core components using state-of-the-art cameras and measuring technologies. The techniques used allow the surface and dimensional characterization of materials and shapes by visual examination. New, enhanced and sophisticated technologies for fuel services include, for example, two shielded color camera systems for underwater use and close inspection of a fuel assembly. Nowadays the market requirements for detecting and characterizing small defects (smaller than a tenth of a millimetre) or cracks and for analyzing surface appearance on irradiated fuel rod cladding or fuel assembly structural parts have increased. It is therefore common practice to use movie cameras with higher resolution. The radiation resistance of high resolution CCD cameras is in general very low and it is not possible to use them unshielded close to a fuel assembly. By extending the camera with a mirror system and shielding around the sensitive parts, the movie camera can be utilized for fuel assembly inspection. AREVA NP Fuel Services is now equipped with such movie cameras. (orig.)

  5. Temperature resolution enhancing of commercially available THz passive cameras due to computer processing of images

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Kuchik, Igor E.

    2014-06-01

    As is well known, the passive THz camera is a very promising tool for security applications. It allows a concealed object to be seen without contact and poses no danger to the person being screened. The efficiency of the passive THz camera depends on its temperature resolution. This characteristic determines what can be detected: the minimal size of a concealed object, the maximal detection distance, and the image detail. One possible way to enhance image quality is computer processing of the image. By computer processing of THz images of objects concealed on the human body, one may improve them many times over. Consequently, the instrumental resolution of such a device may be increased without any additional engineering effort. We demonstrate new possibilities for seeing details of clothing that the raw images produced by THz cameras do not allow one to see. We achieve good image quality by applying various spatial filters, with the aim of demonstrating that the processed images do not depend on the particular mathematical operations used. This result demonstrates the feasibility of seeing such objects. We consider images produced by passive THz cameras manufactured by Microsemi Corp., ThruVision Corp., and Capital Normal University (Beijing, China).
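
    The abstract stresses that several different spatial filters give comparable improvements. As a hedged illustration only (the authors' exact processing chain is not reproduced here), the sketch below applies a median filter, unsharp masking and a percentile contrast stretch to a raw THz frame with SciPy; all parameter values are assumptions.

        import numpy as np
        from scipy import ndimage

        def enhance_thz_image(img, median_size=3, blur_sigma=2.0, amount=1.5):
            """Simple spatial-filter enhancement of a 2-D passive THz image."""
            smooth = ndimage.median_filter(img, size=median_size)      # suppress impulsive noise
            blurred = ndimage.gaussian_filter(smooth, sigma=blur_sigma)
            sharpened = smooth + amount * (smooth - blurred)           # unsharp masking
            lo, hi = np.percentile(sharpened, (1, 99))                 # robust contrast stretch
            return np.clip((sharpened - lo) / (hi - lo + 1e-12), 0.0, 1.0)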

  6. Camera-based driver assistance systems

    Science.gov (United States)

    Grimm, Michael

    2013-04-01

    In recent years, camera-based driver assistance systems have taken an important step: from laboratory setup to series production. This tutorial gives a brief overview on the technology behind driver assistance systems, presents the most significant functionalities and focuses on the processes of developing camera-based systems for series production. We highlight the critical points which need to be addressed when camera-based driver assistance systems are sold in their thousands, worldwide - and the benefit in terms of safety which results from it.

  7. A Benchmark for Virtual Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2015-01-01

    Automatically animating and placing the virtual camera in a dynamic environment is a challenging task. The camera is expected to maximise and maintain a set of properties — i.e. visual composition — while smoothly moving through the environment and avoiding obstacles. A large number of different....... For this reason, in this paper, we propose a benchmark for the problem of virtual camera control and we analyse a number of different problems in different virtual environments. Each of these scenarios is described through a set of complexity measures and, as a result of this analysis, a subset of scenarios...

  8. Determining camera parameters for round glassware measurements

    International Nuclear Information System (INIS)

    Baldner, F O; Costa, P B; Leta, F R; Gomes, J F S; Filho, D M E S

    2015-01-01

    Nowadays there are many types of accessible cameras, including digital single lens reflex ones. Although these cameras are not usually employed in machine vision applications, they can be an interesting choice. However, these cameras have many available parameters to be chosen by the user and it may be difficult to select the best of these in order to acquire images with the needed metrological quality. This paper proposes a methodology to select a set of parameters that will supply a machine vision system with images of the needed quality, considering the measurements required of laboratory glassware.

  9. Structure-from-Motion for Calibration of a Vehicle Camera System with Non-Overlapping Fields-of-View in an Urban Environment

    Science.gov (United States)

    Hanel, A.; Stilla, U.

    2017-05-01

    Vehicle environment cameras observing traffic participants in the area around a car and interior cameras observing the car driver are important data sources for driver intention recognition algorithms. To combine information from both camera groups, a camera system calibration can be performed. Typically, there is no overlapping field-of-view between environment and interior cameras. Often no marked reference points are available in environments that are large enough to cover a car for the system calibration. In this contribution, a calibration method for a vehicle camera system with non-overlapping camera groups in an urban environment is described. A-priori images of an urban calibration environment taken with an external camera are processed with the structure-from-motion method to obtain an environment point cloud. Images of the vehicle interior, taken also with an external camera, are processed to obtain an interior point cloud. Both point clouds are tied to each other with images of both image sets showing the same real-world objects. The point clouds are transformed into a self-defined vehicle coordinate system describing the vehicle movement. On demand, videos can be recorded with the vehicle cameras in a calibration drive. Poses of vehicle environment cameras and interior cameras are estimated separately using ground control points from the respective point cloud. All poses of a vehicle camera estimated for different video frames are optimized in a bundle adjustment. In an experiment, a point cloud is created from images of an underground car park, as well as a point cloud of the interior of a Volkswagen test car. Videos of two environment cameras and one interior camera are recorded. Results show that the vehicle camera poses are estimated successfully, especially when the car is not moving. Position standard deviations in the centimeter range can be achieved for all vehicle cameras. Relative distances between the vehicle cameras deviate between
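
    The per-frame pose estimation from ground control points described above is essentially a perspective-n-point problem. A minimal sketch with OpenCV's PnP solver follows; the intrinsic matrix, the ground control points and the synthetic projection step are placeholders for illustration, not values from the paper.

        import cv2
        import numpy as np

        # Synthetic ground control points (GCPs) in the vehicle coordinate system
        # and an assumed intrinsic calibration of one vehicle camera.
        K = np.array([[1200.0, 0.0, 960.0],
                      [0.0, 1200.0, 600.0],
                      [0.0, 0.0, 1.0]])
        dist = np.zeros(5)
        gcp_xyz = np.array([[0.0, 0.0, 5.0], [1.0, 0.0, 6.0], [-1.0, 1.0, 7.0],
                            [0.5, -1.0, 5.5], [-0.5, 0.5, 6.5], [1.0, 1.0, 8.0]])

        # Simulate pixel measurements with a known ground-truth pose ...
        rvec_true = np.array([0.05, -0.02, 0.01])
        tvec_true = np.array([0.10, -0.20, 0.30])
        img_uv, _ = cv2.projectPoints(gcp_xyz, rvec_true, tvec_true, K, dist)

        # ... then recover the pose from the GCP/pixel correspondences.
        ok, rvec, tvec = cv2.solvePnP(gcp_xyz, img_uv, K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
        R, _ = cv2.Rodrigues(rvec)
        camera_centre = (-R.T @ tvec).ravel()   # camera position in the vehicle/GCP frame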

  10. STRUCTURE-FROM-MOTION FOR CALIBRATION OF A VEHICLE CAMERA SYSTEM WITH NON-OVERLAPPING FIELDS-OF-VIEW IN AN URBAN ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    A. Hanel

    2017-05-01

    Full Text Available Vehicle environment cameras observing traffic participants in the area around a car and interior cameras observing the car driver are important data sources for driver intention recognition algorithms. To combine information from both camera groups, a camera system calibration can be performed. Typically, there is no overlapping field-of-view between environment and interior cameras. Often no marked reference points are available in environments that are large enough to cover a car for the system calibration. In this contribution, a calibration method for a vehicle camera system with non-overlapping camera groups in an urban environment is described. A-priori images of an urban calibration environment taken with an external camera are processed with the structure-from-motion method to obtain an environment point cloud. Images of the vehicle interior, taken also with an external camera, are processed to obtain an interior point cloud. Both point clouds are tied to each other with images of both image sets showing the same real-world objects. The point clouds are transformed into a self-defined vehicle coordinate system describing the vehicle movement. On demand, videos can be recorded with the vehicle cameras in a calibration drive. Poses of vehicle environment cameras and interior cameras are estimated separately using ground control points from the respective point cloud. All poses of a vehicle camera estimated for different video frames are optimized in a bundle adjustment. In an experiment, a point cloud is created from images of an underground car park, as well as a point cloud of the interior of a Volkswagen test car. Videos of two environment cameras and one interior camera are recorded. Results show that the vehicle camera poses are estimated successfully, especially when the car is not moving. Position standard deviations in the centimeter range can be achieved for all vehicle cameras. Relative distances between the vehicle

  11. Reconstruction of sculpture from its profiles with unknown camera positions.

    Science.gov (United States)

    Wong, Kwan-Yee Kenneth; Cipolla, Roberto

    2004-03-01

    Profiles of a sculpture provide rich information about its geometry and can be used for shape recovery under known camera motion. By exploiting correspondences induced by epipolar tangents on the profiles, a successful solution to motion estimation from profiles has been developed for the special case of circular motion. The main drawbacks of using circular motion alone, namely the difficulty of adding new views and the fact that part of the object always remains invisible, can be overcome by incorporating arbitrary general views of the object and registering its new profiles with the set of profiles resulting from the circular motion. In this paper, we describe a complete and practical system for producing a three-dimensional (3-D) model from uncalibrated images of an arbitrary object using its profiles alone. Experimental results on various objects are presented, demonstrating the quality of the reconstructions using the estimated motion.

  12. A bag of tricks: Using proper motions of Galactic stars to identify the Hercules ultra-faint dwarf galaxy members

    Science.gov (United States)

    Fabrizio, M.; Raimondo, G.; Brocato, E.; Bellini, A.; Libralato, M.; Testa, V.; Cantiello, M.; Musella, I.; Clementini, G.; Carini, R.; Marconi, M.; Piotto, G.; Ripepi, V.; Buonanno, R.; Sani, E.; Speziali, R.

    2014-10-01

    Context. Discovered in the last decade as overdensities of resolved stars, the ultra-faint dwarfs (UFDs) are among the least luminous, most dark-matter dominated, and most metal-poor galaxies known today. They appear as sparse, loose objects with high mass-to-light ratios. Hercules is the prototype of the UFD galaxies. To date, there are still no firm constraints on its total luminosity due to the difficulty of disentangling Hercules bona-fide stars from the severe Galactic field contamination. Aims: To better constrain the properties of Hercules, we aim at removing foreground and background contaminants in the galaxy field using the proper motions of the Milky Way stars and the colour-colour diagram. Methods: We have obtained images of Hercules in the rSloan, BBessel and Uspec bands with the Large Binocular Telescope (LBT) and LBC-BIN mode capabilities. The new rSloan dataset combined with data from the LBT archive spans a time baseline of about 5 yr, allowing us to measure proper motions of stars in the Hercules direction for the first time. The Uspec data along with existing LBT photometry allowed us to use the colour-colour diagram to further remove the field contamination. Results: Thanks to a highly-accurate procedure to derive the rSloan-filter geometric distortion solution for the LBC-red, we were able to measure stellar relative proper motions to a precision of better than 5 mas yr-1 down to rSloan ≃ 22 mag and disentangle a significant fraction (>90%) of Milky Way contaminants. We ended up with a sample of 528 sources distributed over a large portion of the galaxy body (~0.12 deg2). Of these sources, 171 turned out to be background galaxies or additional foreground stars from the analysis of the Uspec - BBessel vs. BBessel - rSloan colour-colour diagram. This leaves us with a sample of 357 likely members of the Hercules UFD. We compared the cleaned colour-magnitude diagram (CMD) with evolutionary models and synthetic CMDs, confirming the presence in Hercules of

  13. Ultra-faint ultraviolet galaxies at z ∼ 2 behind the lensing cluster A1689: The luminosity function, dust extinction, and star formation rate density

    Energy Technology Data Exchange (ETDEWEB)

    Alavi, Anahita; Siana, Brian; Freeman, William R.; Dominguez, Alberto [Department of Physics and Astronomy, University of California, Riverside, CA 92521 (United States); Richard, Johan [Centre de Recherche Astrophysique de Lyon, Université Lyon 1, 9 Avenue Charles André, F-69561 Saint Genis Laval Cedex (France); Stark, Daniel P.; Robertson, Brant [Department of Astronomy, Steward Observatory, University of Arizona, 933 North Cherry Avenue, Rm N204, Tucson, AZ 85721 (United States); Scarlata, Claudia [Minnesota Institute for Astrophysics, University of Minnesota, Minneapolis, MN 55455 (United States); Teplitz, Harry I.; Rafelski, Marc [Infrared Processing and Analysis Center, Caltech, Pasadena, CA 91125 (United States); Kewley, Lisa, E-mail: anahita.alavi@email.ucr.edu [Research School of Astronomy and Astrophysics, The Australian National University, Cotter Road, Weston Creek, ACT 2611 (Australia)

    2014-01-10

    We have obtained deep ultraviolet imaging of the lensing cluster A1689 with the WFC3/UVIS camera onboard the Hubble Space Telescope in the F275W (30 orbits) and F336W (4 orbits) filters. These images are used to identify z ∼ 2 star-forming galaxies via their Lyman break, in the same manner that galaxies are typically selected at z ≥ 3. Because of the unprecedented depth of the images and the large magnification provided by the lensing cluster, we detect galaxies 100× fainter than previous surveys at this redshift. After removing all multiple images, we have 58 galaxies in our sample in the range –19.5 < M_1500 < –13 AB mag. Because the mass distribution of A1689 is well constrained, we are able to calculate the intrinsic sensitivity of the observations as a function of source plane position, allowing for accurate determinations of effective volume as a function of luminosity. We fit the faint-end slope of the luminosity function to be α = –1.74 ± 0.08, which is consistent with the values obtained for 2.5 < z < 6. Notably, there is no turnover in the luminosity function down to M_1500 = –13 AB mag. We fit the UV spectral slopes with photometry from existing Hubble optical imaging. The observed trend of increasingly redder slopes with luminosity at higher redshifts is observed in our sample, but with redder slopes at all luminosities and average reddening of ⟨E(B – V)⟩ = 0.15 mag. We assume the stars in these galaxies are metal poor (0.2 Z_☉) compared to their brighter counterparts (Z_☉), resulting in bluer assumed intrinsic UV slopes and larger derived values for dust extinction. The total UV luminosity density at z ∼ 2 is 4.31 (+0.68/−0.60) × 10^26 erg s^–1 Hz^–1 Mpc^–3, more than 70% of which is emitted by galaxies in the luminosity range of our sample. Finally, we determine the global star formation rate density from UV-selected galaxies at z ∼ 2 (assuming a constant dust
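
    The luminosity function quoted above rests on weighting every detected galaxy by the effective volume in which it could have been found. A hedged sketch of that binning step (a 1/V_eff estimator) is shown below; the input arrays and bin edges are placeholders, and the actual analysis additionally models the lensing magnification and completeness in the source plane.

        import numpy as np

        def binned_luminosity_function(m_uv, v_eff, bin_edges):
            """Binned UV luminosity function phi (number per Mpc^3 per mag).

            m_uv      : absolute magnitudes of the (de-lensed) galaxies
            v_eff     : effective volume (Mpc^3) in which each galaxy is detectable
            bin_edges : magnitude bin edges, increasing
            """
            m_uv = np.asarray(m_uv, dtype=float)
            v_eff = np.asarray(v_eff, dtype=float)
            bin_edges = np.asarray(bin_edges, dtype=float)
            centres = 0.5 * (bin_edges[:-1] + bin_edges[1:])
            widths = np.diff(bin_edges)
            phi = np.zeros_like(centres)
            for i in range(len(centres)):
                sel = (m_uv >= bin_edges[i]) & (m_uv < bin_edges[i + 1])
                phi[i] = np.sum(1.0 / v_eff[sel]) / widths[i]   # each galaxy weighted by 1/V_eff
            return centres, phi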

  14. Evidence of a Non-universal Stellar Initial Mass Function. Insights from HST Optical Imaging of Six Ultra-faint Dwarf Milky Way Satellites

    Science.gov (United States)

    Gennaro, Mario; Tchernyshyov, Kirill; Brown, Thomas M.; Geha, Marla; Avila, Roberto J.; Guhathakurta, Puragra; Kalirai, Jason S.; Kirby, Evan N.; Renzini, Alvio; Simon, Joshua D.; Tumlinson, Jason; Vargas, Luis C.

    2018-03-01

    Using deep observations obtained with the Advanced Camera for Surveys (ACS) on board the Hubble Space Telescope (HST), we demonstrate that the sub-solar stellar initial mass function (IMF) of six ultra-faint dwarf Milky Way satellites (UFDs) is more bottom light than the IMF of the Milky Way disk. Our data have a lower-mass limit of ∼0.45 M ⊙, while the upper limit is ∼0.8 M ⊙, set by the turnoff mass of these old, metal-poor systems. If formulated as a single power law, we obtain a shallower IMF slope than the Salpeter value of ‑2.3, ranging from ‑1.01 for Leo IV to ‑1.87 for Boötes I. The significance of these deviations depends on the galaxy and is typically 95% or more. When modeled as a log-normal, the IMF fit results in a higher peak mass than in the Milky Way disk, but a Milky Way disk value for the characteristic system mass (∼0.22 M ⊙) is excluded at only 68% significance, and only for some UFDs in the sample. We find that the IMF slope correlates well with the galaxy mean metallicity, and to a lesser degree, with the velocity dispersion and the total mass. The strength of the observed correlations is limited by shot noise in the number of observed stars, but future space-based missions like the James Webb Space Telescope (JWST) and the Wide-Field Infrared Survey Telescope ( WFIRST) will enhance both the number of dwarf Milky Way satellites that can be studied in such detail and the observation depth for individual galaxies. Based on observations made with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555. These observations are associated with program GO-12549.
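
    As a rough illustration of what fitting a single power-law slope over a bounded mass range involves, the sketch below performs a maximum-likelihood fit of dN/dm proportional to m**alpha between assumed limits of 0.45 and 0.8 solar masses. It ignores photometric errors, binarity and completeness corrections, all of which the actual analysis accounts for, so it is not the authors' procedure.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def fit_power_law_slope(masses, m_lo=0.45, m_hi=0.8):
            """Maximum-likelihood slope alpha of dN/dm ~ m**alpha on [m_lo, m_hi]."""
            m = np.asarray(masses, dtype=float)
            m = m[(m >= m_lo) & (m <= m_hi)]

            def neg_log_like(alpha):
                if abs(alpha + 1.0) < 1e-9:                     # special case alpha = -1
                    norm = np.log(m_hi / m_lo)
                else:
                    norm = (m_hi ** (alpha + 1) - m_lo ** (alpha + 1)) / (alpha + 1)
                return -(alpha * np.log(m) - np.log(norm)).sum()

            result = minimize_scalar(neg_log_like, bounds=(-4.0, 1.0), method="bounded")
            return result.x   # Salpeter would be about -2.35; shallower means more bottom light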

  15. High Performance Imaging through Occlusion via Energy Minimization-Based Optimal Camera Selection

    Directory of Open Access Journals (Sweden)

    Tao Yang

    2013-11-01

    Full Text Available Seeing an object in a cluttered scene with severe occlusion is a significantly challenging task for many computer vision applications. Although camera array synthetic aperture imaging has proven to be an effective way for occluded object imaging, its imaging quality is often significantly decreased by the shadows of the foreground occluder. To overcome this problem, some recent research has been presented to label the foreground occluder via object segmentation or 3D reconstruction. However, these methods usually fail in the case of complicated occluder or severe occlusion. In this paper, we present a novel optimal camera selection algorithm to handle the problem above. Firstly, in contrast to the traditional synthetic aperture photography methods, we formulate the occluded object imaging as a problem of visible light ray selection from the optimal camera view. To the best of our knowledge, this is the first time to “mosaic” a high quality occluded object image via selecting multi-view optimal visible light rays from a camera array or a single moving camera. Secondly, a greedy optimization framework is presented to propagate the visibility information among various depth focus planes. Thirdly, a multiple label energy minimization formulation is designed in each plane to select the optimal camera view. The energy is estimated in the 3D synthetic aperture image volume and integrates the multiple view intensity consistency, previous visibility property and camera view smoothness, which is minimized via graph cuts. Finally, we compare this approach with the traditional synthetic aperture imaging algorithms on UCSD light field datasets and our own datasets captured in indoor and outdoor environment, and extensive experimental results demonstrate the effectiveness and superiority of our approach.

  16. Photorealistic image synthesis and camera validation from 2D images

    Science.gov (United States)

    Santos Ferrer, Juan C.; González Chévere, David; Manian, Vidya

    2014-06-01

    This paper presents a new 3D scene reconstruction technique using the Unity 3D game engine. The method presented here allows us to reconstruct the shape of both simple and more complex objects from multiple 2D images, including infrared and digital images for indoor scenes and only digital images for outdoor scenes, and then to add the reconstructed object to the simulated scene created in Unity 3D; these scenes are then validated against real-world scenes. The method used different camera settings and explores different properties in the reconstructions of the scenes, including light, color, texture, shapes and different views. To achieve the highest possible resolution, it was necessary to extract partial textures from visible surfaces. To recover the 3D shapes and the depth of simple objects that can be represented by geometric bodies, their geometric characteristics were used. To estimate the depth of more complex objects the triangulation method was used; for this, the intrinsic and extrinsic parameters were calculated using geometric camera calibration. The methods mentioned above were implemented in Matlab. The technique presented here also lets us simulate short simple videos, by reconstructing a sequence of multiple scenes of the video separated by small margins of time. To measure the quality of the reconstructed images and video scenes, the Fast Low Band Model (FLBM) metric from the Video Quality Measurement (VQM) software was used. Low bandwidth perception-based features include edges and motion.
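
    The triangulation step referred to above (recovering depth from two views once the intrinsic and extrinsic parameters are known) can be sketched as follows. The paper's implementation is in Matlab; this Python/OpenCV version is only an illustrative equivalent and the matrix names are assumptions.

        import cv2
        import numpy as np

        def triangulate(K1, K2, R, t, pts1, pts2):
            """Triangulate matched pixel coordinates from two calibrated views.

            K1, K2     : (3, 3) intrinsic matrices
            R, t       : rotation and translation of camera 2 relative to camera 1
            pts1, pts2 : (N, 2) matched pixel coordinates in each image
            Returns (N, 3) points expressed in the frame of camera 1.
            """
            P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
            P2 = K2 @ np.hstack([R, t.reshape(3, 1)])
            pts4d = cv2.triangulatePoints(P1, P2, pts1.T.astype(float), pts2.T.astype(float))
            return (pts4d[:3] / pts4d[3]).T   # convert from homogeneous coordinates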

  17. Measuring the Angular Velocity of a Propeller with Video Camera Using Electronic Rolling Shutter

    Directory of Open Access Journals (Sweden)

    Yipeng Zhao

    2018-01-01

    Full Text Available Noncontact measurement of rotational motion has advantages over the traditional method, which measures rotational motion by installing devices such as a rotary encoder on the object. Cameras can be employed as remote monitoring or inspection sensors to measure the angular velocity of a propeller because of their widespread availability, simplicity, and potentially low cost. A drawback of measuring with cameras is the massive amount of data they generate, which must be processed. In order to reduce the data collected from the camera, a camera using an ERS (electronic rolling shutter) is applied to measure angular velocities that are higher than the frame rate of the camera. The rolling shutter can induce geometric distortion in the image when the propeller rotates while an image is being captured. In order to reveal the relationship between the angular velocity and the image distortion, a rotation model has been established. The proposed method was applied to measure the angular velocities of a two-blade propeller and a multiblade propeller. The experimental results showed that this method could detect angular velocities higher than the camera frame rate, and the accuracy was acceptable.
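
    The basic geometric relation behind such a rotation model is that a rolling shutter exposes each image row a fixed readout interval later than the previous one, so the blade angle measured in different rows samples the rotation at different times. The sketch below estimates the angular velocity from per-row blade angles under that simple assumption; the actual model in the paper may differ in detail.

        import numpy as np

        def angular_velocity_from_rows(theta_rows, row_indices, row_readout_time):
            """Fit omega from blade angles measured on individual image rows.

            theta_rows       : blade angle (radians) measured in each analysed row
            row_indices      : row number of each measurement
            row_readout_time : delay between the exposure of consecutive rows (s)
            """
            t = np.asarray(row_indices, dtype=float) * row_readout_time
            theta = np.unwrap(np.asarray(theta_rows, dtype=float))   # remove 2*pi jumps
            omega, _ = np.polyfit(t, theta, 1)                        # theta = omega*t + theta0
            return omega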

  18. An Intelligent Space for Mobile Robot Localization Using a Multi-Camera System

    Directory of Open Access Journals (Sweden)

    Mariana Rampinelli

    2014-08-01

    Full Text Available This paper describes an intelligent space, whose objective is to localize and control robots or robotic wheelchairs to help people. Such an intelligent space has 11 cameras distributed in two laboratories and a corridor. The cameras are fixed in the environment, and image capturing is done synchronously. The system was programmed as a client/server with TCP/IP connections, and a communication protocol was defined. The client coordinates the activities inside the intelligent space, and the servers provide the information needed for that. Once the cameras are used for localization, they have to be properly calibrated. Therefore, a calibration method for a multi-camera network is also proposed in this paper. A robot is used to move a calibration pattern throughout the field of view of the cameras. Then, the captured images and the robot odometry are used for calibration. As a result, the proposed algorithm provides a solution for multi-camera calibration and robot localization at the same time. The intelligent space and the calibration method were evaluated under different scenarios using computer simulations and real experiments. The results demonstrate the proper functioning of the intelligent space and validate the multi-camera calibration method, which also improves robot localization.

  19. An intelligent space for mobile robot localization using a multi-camera system.

    Science.gov (United States)

    Rampinelli, Mariana; Covre, Vitor Buback; de Queiroz, Felippe Mendonça; Vassallo, Raquel Frizera; Bastos-Filho, Teodiano Freire; Mazo, Manuel

    2014-08-15

    This paper describes an intelligent space, whose objective is to localize and control robots or robotic wheelchairs to help people. Such an intelligent space has 11 cameras distributed in two laboratories and a corridor. The cameras are fixed in the environment, and image capturing is done synchronously. The system was programmed as a client/server with TCP/IP connections, and a communication protocol was defined. The client coordinates the activities inside the intelligent space, and the servers provide the information needed for that. Once the cameras are used for localization, they have to be properly calibrated. Therefore, a calibration method for a multi-camera network is also proposed in this paper. A robot is used to move a calibration pattern throughout the field of view of the cameras. Then, the captured images and the robot odometry are used for calibration. As a result, the proposed algorithm provides a solution for multi-camera calibration and robot localization at the same time. The intelligent space and the calibration method were evaluated under different scenarios using computer simulations and real experiments. The results demonstrate the proper functioning of the intelligent space and validate the multi-camera calibration method, which also improves robot localization.

  20. Distributed Smart Cameras for Aging in Place

    National Research Council Canada - National Science Library

    Williams, Adam; Xie, Dan; Ou, Shichao; Grupen, Roderic; Hanson, Allen; Riseman, Edward

    2006-01-01

    .... The fall detector relies on features extracted from video by the camera nodes, which are sent to a central processing node where one of several machine learning techniques are applied to detect a fall...

  1. Highly Sensitive Flash LADAR Camera, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — A highly sensitive 640 x 480-element flash LADAR camera will be developed that is capable of 100-Hz rates with better than 5-cm range precision. The design is based...

  2. Projector-Camera Systems for Immersive Training

    National Research Council Canada - National Science Library

    Treskunov, Anton; Pair, Jarrell

    2006-01-01

    .... These projector-camera systems effectively paint the real world with digital light. Any surface can become an interactive projection screen allowing unprepared spaces to be transformed into an immersive environment...

  3. Ge Quantum Dot Infrared Imaging Camera Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Luna Innovations Incorporated proposes to develop a high performance Ge quantum dots-based infrared (IR) imaging camera on Si substrate. The high sensitivity, large...

  4. CALIBRATION PROCEDURES ON OBLIQUE CAMERA SETUPS

    Directory of Open Access Journals (Sweden)

    G. Kemper

    2016-06-01

    Full Text Available Besides the creation of virtual animated 3D city models and analysis for homeland security and city planning, the accurate determination of geometric features from oblique imagery is an important task today. Due to the huge number of single images, the reduction of control points forces the use of direct georeferencing devices. This requires a precise camera calibration and additional adjustment procedures. This paper aims to show the workflow of the various calibration steps and presents examples of the calibration flight together with the final 3D city model. In contrast to most other software, the oblique cameras are not used as co-registered sensors in relation to the nadir one; all camera images enter the aerial triangulation (AT) process as single pre-oriented data. This enables a better post-calibration in order to detect variations in the single camera calibrations and other mechanical effects. The sensor shown (Oblique Imager) is based on 5 Phase One cameras, where the nadir one has 80 MPix and a 50 mm lens while the oblique ones capture 50 MPix images using 80 mm lenses. The cameras are mounted robustly inside a housing to protect them against physical and thermal deformations. The sensor head also hosts an IMU which is connected to a POS AV GNSS receiver. The sensor is stabilized by a gyro-mount, which creates floating antenna-IMU lever arms. These had to be registered together with the raw GNSS-IMU data. The camera calibration procedure was performed based on a special calibration flight with 351 shots of all 5 cameras and the registered GPS/IMU data. This specific mission was flown at two different altitudes with additional cross lines at each flying height. The five images from each exposure position have no overlap, but in the block there are many overlaps, resulting in up to 200 measurements per point. On each photo there were on average 110 well distributed measured points, which is a satisfying number for the camera calibration. In a first

  5. Portable mini gamma camera for medical applications

    CERN Document Server

    Porras, E; Benlloch, J M; El-Djalil-Kadi-Hanifi, M; López, S; Pavon, N; Ruiz, J A; Sánchez, F; Sebastiá, A

    2002-01-01

    A small, portable and low-cost gamma camera for medical applications has been developed and clinically tested. This camera, based on a scintillator crystal and a Position Sensitive Photo-Multiplier Tube, has a useful field of view of 4.6 cm diameter and provides 2.2 mm of intrinsic spatial resolution. Its mobility and light weight allow it to reach the patient from any desired direction. This camera images small organs with high efficiency and so addresses the demand for devices for specific clinical applications. In this paper, we present the camera and briefly describe the procedures that have led us to choose its configuration and the image reconstruction method. The clinical tests and diagnostic capability are also presented and discussed.

  6. Minimum Delay Moving Object Detection

    KAUST Repository

    Lao, Dong

    2017-05-14

    This thesis presents a general framework and method for detection of an object in a video based on apparent motion. The object moves, at some unknown time, differently than the "background" motion, which can be induced by camera motion. The goal of the proposed method is to detect and segment the object as soon as it moves, in an online manner. Since motion estimation can be unreliable between frames, more than two frames are needed to reliably detect the object. Observing more frames before declaring a detection may lead to a more accurate detection and segmentation, since more motion may be observed, leading to a stronger motion cue. However, this leads to greater delay. The proposed method is designed to detect the object(s) with minimum delay, i.e., the number of frames after the object moves, while constraining the false alarms, defined as declarations of detection before the object moves or incorrect or inaccurate segmentation at the detection time. Experiments on a new extensive dataset for moving object detection show that our method achieves less delay for all false alarm constraints than the existing state of the art.
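
    The delay versus false-alarm trade-off described above is the classical quickest-detection setting. The sketch below illustrates that trade-off with a generic CUSUM-style accumulation of per-frame motion evidence; it is only an illustration of the principle, not the segmentation-based detector developed in the thesis, and the score definition is an assumption.

        def cusum_detect(motion_scores, threshold, drift=0.0):
            """Declare a detection once accumulated motion evidence exceeds a threshold.

            motion_scores : per-frame scores, e.g. mean residual between a frame and its
                            background-motion-compensated prediction (assumed input)
            threshold     : larger values give fewer false alarms but longer delay
            drift         : expected score level when no object moves independently
            Returns the index of the first frame with a detection, or None.
            """
            s = 0.0
            for k, score in enumerate(motion_scores):
                s = max(0.0, s + score - drift)   # CUSUM recursion
                if s > threshold:
                    return k
            return None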

  7. Compact Optical Technique for Streak Camera Calibration

    International Nuclear Information System (INIS)

    Curt Allen; Terence Davies; Frans Janson; Ronald Justin; Bruce Marshall; Oliver Sweningsen; Perry Bell; Roger Griffith; Karla Hagans; Richard Lerche

    2004-01-01

    The National Ignition Facility is under construction at the Lawrence Livermore National Laboratory for the U.S. Department of Energy Stockpile Stewardship Program. Optical streak cameras are an integral part of the experimental diagnostics instrumentation. To accurately reduce data from the streak cameras a temporal calibration is required. This article describes a technique for generating trains of precisely timed short-duration optical pulses that are suitable for temporal calibrations

  8. MCP gated x-ray framing camera

    Science.gov (United States)

    Cai, Houzhi; Liu, Jinyuan; Niu, Lihong; Liao, Hua; Zhou, Junlan

    2009-11-01

    A four-frame gated microchannel plate (MCP) camera is described in this article. Each frame photocathode, coated with gold on the MCP, is part of a transmission line with an open-circuit end driven by the gating electrical pulse. The gating pulse is 230 ps in width and 2.5 kV in amplitude. The camera is tested by illuminating its photocathode with ultraviolet laser pulses, 266 nm in wavelength, which shows an exposure time as short as 80 ps.

  9. Imaging camera with multiwire proportional chamber

    International Nuclear Information System (INIS)

    Votruba, J.

    1980-01-01

    The camera for imaging radioisotope distributions, for use in nuclear medicine or other applications, claimed in the patent, is provided with two multiwire lattices for the x-coordinate connected to a first coincidence circuit, and two multiwire lattices for the y-coordinate connected to a second coincidence circuit. This arrangement eliminates the need for a collimator and increases camera sensitivity while reducing production cost. (Ha)

  10. An imaging system for a gamma camera

    International Nuclear Information System (INIS)

    Miller, D.W.; Gerber, M.S.

    1980-01-01

    A detailed description is given of a novel gamma camera which is designed to produce better images than the conventional cameras used in nuclear medicine. The detector consists of a solid state detector (e.g. germanium) which is formed to have a plurality of discrete components to enable 2-dimensional position identification. Details of the electronic processing circuits are given and the problems and limitations introduced by noise are discussed in full. (U.K.)

  11. Counting neutrons with a commercial S-CMOS camera

    Directory of Open Access Journals (Sweden)

    Patrick Van Esch

    2018-01-01

    Full Text Available It is possible to detect individual flashes from thermal neutron impacts in a ZnS scintillator using a CMOS camera looking at the scintillator screen, and off line image processing. Some preliminary results indicated that the efficiency of recognition could be improved by optimizing the light collection and the image processing. We will report on this ongoing work which is a result from the collaboration between ESS Bilbao and the ILL. The main progress to be reported is situated on the level of the on-line treatment of the imaging data. If this technology is to work on a genuine scientific instrument, it is necessary that all the processing happens on line, to avoid the accumulation of large amounts of image data to be analyzed off line. An FPGA-based real-time full-deca mode VME-compatible CameraLink board has been developed at the SCI of the ILL, which is able to manage the data flow from the camera and convert it in a reasonable “neutron impact” data flow like from a usual neutron counting detector. The main challenge of the endeavor is the optical light collection from the scintillator. While the light yield of a ZnS scintillator is a priori rather important, the amount of light collected with a photographic objective is small. Different scintillators and different light collection techniques have been experimented with and results will be shown for different setups improving upon the light recuperation on the camera sensor. Improvements on the algorithm side will also be presented. The algorithms have to be at the same time efficient in their recognition of neutron signals, in their rejection of noise signals (internal and external to the camera) but also have to be simple enough to be easily implemented in the FPGA. The path from the idea of detecting individual neutron impacts with a CMOS camera to a practical working instrument detector is challenging, and in this paper we will give an overview of the part of the road that has already been walked.

  12. Counting neutrons with a commercial S-CMOS camera

    Science.gov (United States)

    Patrick, Van Esch; Paolo, Mutti; Emilio, Ruiz-Martinez; Estefania, Abad Garcia; Marita, Mosconi; Jon, Ortega

    2018-01-01

    It is possible to detect individual flashes from thermal neutron impacts in a ZnS scintillator using a CMOS camera looking at the scintillator screen, and off line image processing. Some preliminary results indicated that the efficiency of recognition could be improved by optimizing the light collection and the image processing. We will report on this ongoing work which is a result from the collaboration between ESS Bilbao and the ILL. The main progress to be reported is situated on the level of the on-line treatment of the imaging data. If this technology is to work on a genuine scientific instrument, it is necessary that all the processing happens on line, to avoid the accumulation of large amounts of image data to be analyzed off line. An FPGA-based real-time full-deca mode VME-compatible CameraLink board has been developed at the SCI of the ILL, which is able to manage the data flow from the camera and convert it in a reasonable "neutron impact" data flow like from a usual neutron counting detector. The main challenge of the endeavor is the optical light collection from the scintillator. While the light yield of a ZnS scintillator is a priori rather important, the amount of light collected with a photographic objective is small. Different scintillators and different light collection techniques have been experimented with and results will be shown for different setups improving upon the light recuperation on the camera sensor. Improvements on the algorithm side will also be presented. The algorithms have to be at the same time efficient in their recognition of neutron signals, in their rejection of noise signals (internal and external to the camera) but also have to be simple enough to be easily implemented in the FPGA. The path from the idea of detecting individual neutron impacts with a CMOS camera to a practical working instrument detector is challenging, and in this paper we will give an overview of the part of the road that has already been walked.
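
    As a hedged illustration of the off-line recognition step only (not the FPGA pipeline described above), the sketch below thresholds a background-subtracted frame and keeps connected pixel clusters above a minimum size as candidate neutron impacts; the threshold and size values are assumptions.

        import numpy as np
        from scipy import ndimage

        def find_neutron_impacts(frame, dark, k_sigma=5.0, min_pixels=3):
            """Return (row, col) centroids of candidate scintillation flashes in one frame."""
            signal = frame.astype(float) - dark.astype(float)   # remove the background level
            mask = signal > k_sigma * signal.std()              # bright pixels only
            labels, n = ndimage.label(mask)                     # group them into clusters
            impacts = []
            centroids = ndimage.center_of_mass(signal, labels, range(1, n + 1))
            sizes = ndimage.sum(mask, labels, range(1, n + 1))
            for centroid, size in zip(centroids, sizes):
                if size >= min_pixels:                          # reject single-pixel noise
                    impacts.append(centroid)
            return impacts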

  13. Thermal Wave Imaging: Flying SPOT Camera.

    Science.gov (United States)

    Wang, Yiqian

    1993-01-01

    A novel "Flying Spot" infrared camera for nondestructive evaluation (NDE) and nondestructive characterization is presented. The camera scans the focal point of an unmodulated heating laser beam across the sample in a raster. The detector of the camera tracks the heating spot in the same raster, but with a time delay. The detector is thus looking at the "thermal wake" of the heating spot. The time delay between heating and detection is determined by the speed of the laser spot and the distance between it and the detector image. Since this time delay can be made arbitrarily small, the camera is capable of making thermal wave images of phenomena which occur on a very short time scale. In addition, because the heat source is a very small spot, the heat flow is fully three-dimensional. This makes the camera system sensitive to features, like tightly closed vertical cracks, which are invisible to imaging systems which employ full-field heating. A detailed theory which relates the temperature profile around the heating spot to the sample thermal properties is also described. The camera represents a potentially useful tool for measuring thermal diffusivities of materials by means of fitting the recorded temperature profiles to the theoretical curves with the diffusivity as a fitting parameter.

  14. The Use of Camera Traps in Wildlife

    Directory of Open Access Journals (Sweden)

    Yasin Uçarlı

    2013-11-01

    Full Text Available Camera traps are increasingly used for abundance and density estimates of wildlife species. Camera traps are a very good alternative to direct observation, particularly in steep terrain, in areas covered by dense vegetation, or for nocturnal species. The main reason for using camera traps is that they eliminate the economic, personnel and time costs of continuous observation at several points at the same time. Camera traps, which are motion and heat sensitive, can take photos or video depending on the model. Crossover points and feeding or mating areas of the focal species are chosen as priority locations for setting camera traps. The population size can be found from the images combined with capture-recapture methods. The population density is then the population size divided by the effective sampling area. The mating and breeding season, habitat choice, group structure and survival rates of the focal species can also be obtained from the images. Camera traps are thus a very useful and economical way to obtain the necessary data about particularly elusive species for planning and conservation efforts.
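
    The population size and density calculation summarised above can be illustrated with the classic Lincoln-Petersen estimator (with the Chapman correction); this is only one of many capture-recapture models, and the numbers in the usage example are invented.

        def lincoln_petersen_density(n_first, n_second, n_recaptured, effective_area_km2):
            """Chapman-corrected Lincoln-Petersen population size and density estimate.

            n_first, n_second  : individuals identified in the first and second photo sessions
            n_recaptured       : individuals photographed in both sessions
            effective_area_km2 : effective sampling area around the camera traps
            """
            n_hat = (n_first + 1) * (n_second + 1) / (n_recaptured + 1) - 1
            return n_hat, n_hat / effective_area_km2

        # Example: 18 and 22 identified individuals, 9 seen in both sessions, 120 km^2.
        size, density = lincoln_petersen_density(18, 22, 9, 120.0)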

  15. Multi-Angle Snowflake Camera Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Stuefer, Martin [Univ. of Alaska, Fairbanks, AK (United States); Bailey, J. [Univ. of Alaska, Fairbanks, AK (United States)

    2016-07-01

    The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36º. Each camera field of view is aligned to have a common single focus point about 10 cm distant from the cameras. Two near-infrared emitter pairs are aligned with the camera’s field of view within a 10-angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, in the range of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.

  16. On the evolution of wafer level cameras

    Science.gov (United States)

    Welch, H.

    2011-02-01

    The introduction of small cost effective cameras based on CMOS image sensor technology has played an important role in the revolution in mobile devices of the last 10 years. Wafer-based optics manufacturing leverages the same fabrication equipment used to produce CMOS sensors. The natural integration of these two technologies allows the mass production of very low cost surface mount cameras that can fit into ever thinner mobile devices. Nano Imprint Lithography (NIL) equipment has been adapted to make precision aspheres that can be stacked using wafer bonding techniques to produce multi-element lens assemblies. This, coupled with advances in mastering technology, allows arrays of lenses with prescriptions not previously possible. A primary motivation for these methods is that it allows the consolidation of the supply chain. Image sensor manufacturers envision creating optics by simply adding layers to their existing sensor fabrication lines. Results thus far have been promising. The current alternative techniques for creating VGA cameras are discussed as well as the prime cost drivers for lens to sensor integration. Higher resolution cameras face particularly difficult challenges, but can greatly simplify the critical tilt and focus steps needed to assemble cameras that produce quality images. Finally, we discuss the future of wafer-level cameras and explore several of the novel concepts made possible by the manufacturing advantages of photolithography.

  17. Classroom multispectral imaging using inexpensive digital cameras.

    Science.gov (United States)

    Fortes, A. D.

    2007-12-01

    The proliferation of increasingly cheap digital cameras in recent years means that it has become easier to exploit the broad wavelength sensitivity of their CCDs (360 - 1100 nm) for classroom-based teaching. With the right tools, it is possible to open children's eyes to the invisible world of UVA and near-IR radiation either side of our narrow visual band. The camera-filter combinations I describe can be used to explore the world of animal vision, looking for invisible markings on flowers, or in bird plumage, for example. In combination with a basic spectroscope (such as the Project-STAR handheld plastic spectrometer, $25), it is possible to investigate the range of human vision and camera sensitivity, and to explore the atomic and molecular absorption lines from the solar and terrestrial atmospheres. My principal use of the cameras has been to teach multispectral imaging of the kind used to determine remotely the composition of planetary surfaces. A range of camera options, from $50 circuit-board mounted CCDs up to $900 semi-pro infrared camera kits (including mobile phones along the way), and various UV-vis-IR filter options will be presented. Examples of multispectral images taken with these systems are used to illustrate the range of classroom topics that can be covered. Particular attention is given to learning about spectral reflectance curves and comparing images from Earth and Mars taken using the same filter combination that is used on the Mars Rovers.

  18. High-speed CCD camera at NAOC

    Science.gov (United States)

    Zhao, Zhaowang; Wang, Wei; Liu, Yangbin

    2006-06-01

    A high speed CCD camera has been completed at the National Astronomical Observatories of China (NAOC). A Kodak CCD is used in the camera. Two output ports are used to read out the CCD data, achieving a total readout speed of 60M pixels per second. The Kodak KAI-4021 image sensor is a high-performance 2Kx2K-pixel interline transfer device. The 7.4μm square pixels with micro lenses provide high sensitivity, and the large full well capacity results in high dynamic range. The interline transfer structure provides high image quality and enables electronic shuttering for precise exposure control; the electronic shutter controls the exposure time precisely without any mechanical components. The camera is controlled by a NIOS II embedded processor, Altera's second-generation soft-core embedded processor for FPGAs. The powerful embedded processor gives the camera the flexibility to satisfy new observational requirements as they appear, and makes it easy to implement new special functions. Since the FPGA and all peripheral logic signals are triggered by a single master clock, the whole system is fully synchronized, which reduces the noise dramatically.

  19. Camera for Quasars in the Early Universe (CQUEAN)

    Science.gov (United States)

    Kim, Eunbin; Park, W.; Lim, J.; Jeong, H.; Kim, J.; Oh, H.; Pak, S.; Im, M.; Kuehne, J.

    2010-05-01

    The early universe at z ≳ 7 is where the first stars, galaxies, and quasars formed, starting the re-ionization of the universe. The discovery and study of quasars in the early universe allow us to witness the beginning of the history of astronomical objects. In order to perform a medium-deep, medium-wide imaging survey of quasars, we are developing an optical CCD camera, CQUEAN (Camera for QUasars in EArly uNiverse), which uses a 1024*1024 pixel deep-depletion CCD. It has a higher QE than a conventional CCD at wavelengths around 1μm, so it will be an efficient tool for observing quasars at z > 7. It will be attached to the 2.1m telescope at McDonald Observatory, USA. A focal reducer is designed to secure a larger field of view at the Cassegrain focus of the 2.1m telescope. For long stable exposures, an auto-guiding system will be implemented using another CCD camera viewing an off-axis field. All these instruments will be controlled by software written in Python on a Linux platform. CQUEAN is expected to see first light during the summer of 2010.

  20. Gamma camera intrinsic uniformity in an unstable power supply environment.

    Science.gov (United States)

    Ejeh, John E; Adedapo, Kayode S; Akinlade, Bidemi I; Osifo, Bola O A

    2011-01-01

    The main objective of this work was to show that a gamma camera in a developing country could perform efficiently despite electricity outages using intrinsic flood uniformity tests as an index of performance. A total of 143 intrinsic uniformity test results for a new gamma camera in use in an environment with unstable power supply are presented. The integral uniformity for the central field of view (CFOV) was found to be between 3.43% and 1.49% (3.29% for acceptance test) while the integral uniformity for the useful field of view (UFOV) was between 4.51% and 1.9% (5.21% for acceptance test). The differential uniformity for the CFOV was between 1.99% and 1.04% (2.25% for acceptance test) while that of the UFOV was between 2.84% and 1.23% (2.63% for acceptance test). In conclusion, these results show that the uniformity of the gamma camera under this condition is within an acceptable range for both planar and SPET imaging.
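
    For readers unfamiliar with how such figures are obtained, the sketch below implements the usual NEMA-style definitions: integral uniformity as the max/min contrast over the field of view, and differential uniformity as the worst contrast over any five contiguous pixels along a row or column. It assumes a pre-binned, masked flood image and is a generic illustration, not the processing used in this study.

```python
import numpy as np

def integral_uniformity(flood: np.ndarray) -> float:
    """NEMA integral uniformity (%): 100 * (max - min) / (max + min) over the field of view."""
    mx, mn = flood.max(), flood.min()
    return 100.0 * (mx - mn) / (mx + mn)

def differential_uniformity(flood: np.ndarray, window: int = 5) -> float:
    """NEMA differential uniformity (%): worst (max - min) / (max + min) over any
    `window` contiguous pixels along a row or a column."""
    worst = 0.0
    for axis_img in (flood, flood.T):          # rows first, then columns
        n, m = axis_img.shape
        for i in range(n):
            line = axis_img[i]
            for j in range(m - window + 1):
                seg = line[j:j + window]
                du = 100.0 * (seg.max() - seg.min()) / (seg.max() + seg.min())
                worst = max(worst, du)
    return worst

# Toy flood field (counts); a real analysis would first bin and mask the UFOV/CFOV.
rng = np.random.default_rng(0)
flood = rng.poisson(10000, size=(64, 64)).astype(float)
print(f"IU = {integral_uniformity(flood):.2f}%  DU = {differential_uniformity(flood):.2f}%")
```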

  1. Instrumental response model and detrending for the Dark Energy Camera

    Science.gov (United States)

    Bernstein, G. M.; Abbott, T. M. C.; Desai, S.; Gruen, D.; Gruendl, R. A.; Johnson, M. D.; Lin, H.; Menanteau, F.; Morganson, E.; Neilsen, E.; Paech, K.; Walker, A. R.; Wester, W.; Yanny, B.; DES Collaboration

    2017-11-01

    We describe the model for mapping from sky brightness to the digital output of the Dark Energy Camera (DECam) and the algorithms adopted by the Dark Energy Survey (DES) for inverting this model to obtain photometric measures of celestial objects from the raw camera output. This calibration aims for fluxes that are uniform across the camera field of view and across the full angular and temporal span of the DES observations, approaching the accuracy limits set by shot noise for the full dynamic range of DES observations. The DES pipeline incorporates several substantive advances over standard detrending techniques, including principal-components-based sky and fringe subtraction; correction of the “brighter-fatter” nonlinearity; use of internal consistency in on-sky observations to disentangle the influences of quantum efficiency, pixel-size variations, and scattered light in the dome flats; and pixel-by-pixel characterization of instrument spectral response, through combination of internal-consistency constraints with auxiliary calibration data. This article provides conceptual derivations of the detrending/calibration steps, and the procedures for obtaining the necessary calibration data. Other publications will describe the implementation of these concepts for the DES operational pipeline, the detailed methods, and the validation that the techniques can bring DECam photometry and astrometry within ≈ 2 mmag and ≈ 3 mas, respectively, of fundamental atmospheric and statistical limits. The DES techniques should be broadly applicable to wide-field imagers.

  2. Family Of Calibrated Stereometric Cameras For Direct Intraoral Use

    Science.gov (United States)

    Curry, Sean; Moffitt, Francis; Symes, Douglas; Baumrind, Sheldon

    1983-07-01

    In order to study empirically the relative efficiencies of different types of orthodontic appliances in repositioning teeth in vivo, we have designed and constructed a pair of fixed-focus, normal case, fully-calibrated stereometric cameras. One is used to obtain stereo photography of single teeth, at a scale of approximately 2:1, and the other is designed for stereo imaging of the entire dentition, study casts, facial structures, and other related objects at a scale of approximately 1:8. Twin lenses simultaneously expose adjacent frames on a single roll of 70 mm film. Physical flatness of the film is ensured by the use of a spring-loaded metal pressure plate. The film is forced against a 3/16" optical glass plate upon which is etched an array of 16 fiducial marks which divide the film format into 9 rectangular regions. Using this approach, it has been possible to produce photographs which are undistorted for qualitative viewing and from which quantitative data can be acquired by direct digitization of conventional photographic enlargements. We are in the process of designing additional members of this family of cameras. All calibration and data acquisition and analysis techniques previously developed will be directly applicable to these new cameras.

  3. On the type Ia supernovae 2007on and 2011iv: Evidence for Chandrasekhar-mass explosions at the faint end of the luminosity-width relationship.

    Science.gov (United States)

    Ashall, C.; Mazzali, P. A.; Stritzinger, M. D.; Hoeflich, P.; Burns, C. R.; Gall, C.; Hsiao, E. Y.; Phillips, M. M.; Morrell, N.; Foley, Ryan J.

    2018-03-01

    Radiative transfer models of two transitional type Ia supernovae (SNe Ia) have been produced using the abundance stratification technique. These two objects - designated SN 2007on and SN 2011iv - both exploded in the same galaxy, NGC 1404, which allows for a direct comparison. SN 2007on synthesised 0.25 M⊙ of ⁵⁶Ni and was less luminous than SN 2011iv, which produced 0.31 M⊙ of ⁵⁶Ni. SN 2007on had a lower central density (ρc) and a higher explosion energy (Ekin ≈ 1.3 ± 0.3 × 10⁵¹ erg) than SN 2011iv, and it produced fewer nuclear statistical equilibrium (NSE) elements (0.06 M⊙). SN 2011iv, in contrast, had a larger ρc, which increased the electron capture rate in the lowest velocity regions, and produced 0.35 M⊙ of stable NSE elements; its explosion energy was Ekin ≈ 0.9 ± 0.2 × 10⁵¹ erg. Both objects had an ejecta mass consistent with the Chandrasekhar mass (Ch-mass), and their observational properties are well described by predictions from delayed-detonation explosion models. Within this framework, comparison to the sub-luminous SN 1986G indicates that SN 2011iv and SN 1986G have different transition densities (ρtr) but similar ρc, whereas SN 1986G and SN 2007on had similar ρtr but different ρc. Finally, we examine the colour-stretch parameter sBV vs. Lmax relation and determine that the bulk of SNe Ia (including the sub-luminous ones) are consistent with Ch-mass delayed-detonation explosions, where the main parameter driving the diversity is ρtr. We also find ρc to be driving the second-order scatter observed at the faint end of the luminosity-width relationship.

  4. Two-dimensional displacement measurement using static close range photogrammetry and a single fixed camera

    Directory of Open Access Journals (Sweden)

    Abdallah M. Khalil

    2011-09-01

    Full Text Available This work describes a simple approach to measure the displacement of a moving object in two directions simultaneously. The proposed approach is based on static close range photogrammetry with a single camera and the well-known collinearity equations. The proposed approach requires neither multi-camera synchronization nor mutual camera calibration. It requires no prior knowledge of the kinematic and kinetic data of the moving object. The proposed approach was used to evaluate predefined two-dimensional displacements of a moving object. The root mean square values of the differences between the predefined and evaluated displacements in the two directions are 0.11 and 0.02 mm.
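
    To make the geometry concrete, here is a minimal sketch of the two ingredients the approach relies on: projecting an object point into the image with the collinearity equations, and inverting them for a point that moves in a known object plane so that both in-plane displacement components can be recovered from a single fixed camera. The orientation values are hypothetical, not those calibrated in the paper.

```python
import numpy as np

# Assumed interior/exterior orientation of the fixed camera (illustrative values).
f = 3500.0                         # principal distance in pixels
c = np.array([0.0, 0.0, 2.0])      # camera centre, 2 m above the object plane Z = 0
R = np.eye(3)                      # rotation object frame -> camera frame (axes aligned)

def project(P: np.ndarray) -> np.ndarray:
    """Collinearity equations: object point (metres) -> image coordinates (pixels)."""
    p = R @ (P - c)                            # object point in the camera frame
    return -f * p[:2] / p[2]

def back_project_to_plane(xy: np.ndarray, Z: float) -> np.ndarray:
    """Intersect the viewing ray of an image point with the known object plane Z = const."""
    d = R.T @ np.array([xy[0], xy[1], -f])     # ray direction in object space
    t = (Z - c[2]) / d[2]
    return c + t * d

# Object point moves 5 mm in X and 2 mm in Y within the plane Z = 0.
P0 = np.array([0.100, 0.050, 0.0])
P1 = P0 + np.array([0.005, 0.002, 0.0])
x0, x1 = project(P0), project(P1)

# Recover the displacement from the two image observations alone.
Q0 = back_project_to_plane(x0, Z=0.0)
Q1 = back_project_to_plane(x1, Z=0.0)
print("recovered displacement (mm):", (Q1 - Q0)[:2] * 1000.0)
```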

  5. Photogrammetry of a 5m Inflatable Space Antenna With Consumer Digital Cameras

    Science.gov (United States)

    Pappa, Richard S.; Giersch, Louis R.; Quagliaroli, Jessica M.

    2000-01-01

    This paper discusses photogrammetric measurements of a 5m-diameter inflatable space antenna using four Kodak DC290 (2.1 megapixel) digital cameras. The study had two objectives: 1) Determine the photogrammetric measurement precision obtained using multiple consumer-grade digital cameras and 2) Gain experience with new commercial photogrammetry software packages, specifically PhotoModeler Pro from Eos Systems, Inc. The paper covers the eight steps required using this hardware/software combination. The baseline data set contained four images of the structure taken from various viewing directions. Each image came from a separate camera. This approach simulated the situation of using multiple time-synchronized cameras, which will be required in future tests of vibrating or deploying ultra-lightweight space structures. With four images, the average measurement precision for more than 500 points on the antenna surface was less than 0.020 inches in-plane and approximately 0.050 inches out-of-plane.

  6. Absolute phase unwrapping for dual-camera system without embedding statistical features

    Science.gov (United States)

    Jiang, Chufan; Zhang, Song

    2017-09-01

    This paper proposes an absolute phase unwrapping method for three-dimensional measurement that uses two cameras and one projector. On the left camera image, each pixel has one wrapped phase value, which corresponds to multiple projector candidates with different absolute phase values. We use the geometric relationship of the system to map projector candidates into the right camera candidates. By applying a series of candidate rejection criteria, a unique correspondence pair between two camera images can be determined. Then, the absolute phase is obtained by tracing the correspondence point back to the projector space. Experimental results demonstrate that the proposed absolute phase unwrapping algorithm can successfully work on both complex geometry and multiple isolated objects measurement.

  7. Adaptive Probabilistic Tracking Embedded in Smart Cameras for Distributed Surveillance in a 3D Model

    Directory of Open Access Journals (Sweden)

    Sven Fleck

    2006-12-01

    Full Text Available Tracking applications based on distributed and embedded sensor networks are emerging today, both in the fields of surveillance and industrial vision. Traditional centralized approaches have several drawbacks, due to limited communication bandwidth, computational requirements, and thus limited spatial camera resolution and frame rate. In this article, we present network-enabled smart cameras for probabilistic tracking. They are capable of tracking objects adaptively in real time and offer a very bandwidth-conservative approach, as the whole computation is performed embedded in each smart camera and only the tracking results are transmitted, which are on a higher level of abstraction. Based on this, we present a distributed surveillance system. The smart cameras' tracking results are embedded in an integrated 3D environment as live textures and can be viewed from arbitrary perspectives. Also a georeferenced live visualization embedded in Google Earth is presented.

  8. A practical block detector for a depth-encoding PET camera

    International Nuclear Information System (INIS)

    Rogers, J.G.; Moisan, C.; Hoskinson, E.M.; Andreaco, M.S.; Williams, C.W.; Nutt, R.

    1996-01-01

    The depth-of-interaction effect in block detectors degrades the image resolution in commercial PET cameras and impedes the natural evolution of smaller, less expensive cameras. A method for correcting the measured position of each detected gamma ray by measuring its depth-of-interaction was tested and found to recover 38% of the lost resolution at 7.5 cm radius in a tabletop, 50-cm-diameter camera. To obtain the desired depth sensitivity, standard commercial detectors were modified by a simple and practical process that is suitable for mass production of the detectors. The impact of the detector modifications on central image resolution and on the ability of the camera to correct for object scatter was also measured

  9. The faint end of the red sequence galaxy luminosity function: unveiling surface brightness selection effects with the CLASH clusters

    Science.gov (United States)

    Martinet, Nicolas; Durret, Florence; Adami, Christophe; Rudnick, Gregory

    2017-08-01

    Characterizing the evolution of the faint end of the cluster red sequence (RS) galaxy luminosity function (GLF) with redshift is a milestone in understanding galaxy evolution. However, the community is still divided in that respect, hesitating between an enrichment of the RS due to efficient quenching of blue galaxies from z ∼ 1 to the present day or a scenario in which the RS is built at a higher redshift and does not evolve afterwards. Recently, it has been proposed that surface brightness (SB) selection effects could possibly solve the literature disagreement, accounting for the diminishing RS faint population in ground-based observations. We investigate this hypothesis by comparing the RS GLFs of 16 CLASH clusters computed independently from ground-based Subaru/Suprime-Cam V and Ip or Ic images and space-based HST/ACS F606W and F814W images in the redshift range 0.187 ≤ z ≤ 0.686. We stack individual cluster GLFs in two redshift bins (0.187 ≤ z ≤ 0.399 and 0.400 ≤ z ≤ 0.686) and two mass bins (6 × 10¹⁴ M⊙ ≤ M200

  10. A study on the performance evaluation of small gamma camera collimators using detective quantum efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Jeon, Ho Sang

    2008-02-15

    The Anger-type gamma camera and a novel marker compound using Tc-99m were first introduced in 1963. Gamma camera systems have since been improved and applied to various fields, for example medical, industrial, and environmental applications. A gamma camera is mainly composed of a collimator, a detector, and a signal processor, while the radioactive source itself is the imaging object. The collimator is an essential component of the gamma camera system because the imaging performance of the system depends mainly on the collimator. The performance of collimators can be assessed using evaluating factors. In this study, novel factors for gamma camera evaluation are suggested. The established evaluating factors defined by NEMA are FWHM, sensitivity, and uniformity. They have some limitations in spite of their usefulness. Firstly, performance evaluation by those factors gives insensitive and indirect results only. Secondly, the evaluation of the noise property is ambiguous. Thirdly, there is no synthetic evaluation of the system performance. Simulation with a Monte Carlo code and experiments with a small gamma camera were performed simultaneously to verify the novel evaluating factors. For the evaluation of spatial resolution, the MTF was applied instead of the FWHM; the MTF values present an excellent linear relationship with the FWHM values. The NNPS was applied instead of uniformity and sensitivity for the evaluation of noise fluctuation; the NNPS values also present a linear relationship with sensitivity and uniformity. Moreover, these novel factors are given as functions of spatial frequency. Finally, the DQE values were obtained by calculation from the MTF, the NNPS, and the input SNR. The DQE effectively provides a synthetic evaluation of gamma camera performance. It is concluded that the MTF, NNPS, and DQE can serve as novel evaluating factors for gamma camera systems, and a new factor for synthetic evaluation is derived.
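
    As an illustration of how the three quantities combine, the sketch below evaluates a frequency-dependent DQE from an MTF curve, a normalized noise power spectrum and the incident photon fluence, using the common definition DQE(f) = MTF²(f) / (q · NNPS(f)). The curves and the fluence are synthetic, and the exact normalization used in the thesis may differ.

```python
import numpy as np

def dqe(mtf: np.ndarray, nnps: np.ndarray, photon_fluence: float) -> np.ndarray:
    """Frequency-dependent detective quantum efficiency.

    Uses the common definition DQE(f) = MTF(f)^2 / (q * NNPS(f)), where q is the
    incident photon fluence (the ideal input SNR^2 per unit area) and NNPS is the
    noise power spectrum normalized by the squared mean signal.
    """
    return mtf ** 2 / (photon_fluence * nnps)

# Synthetic example curves over spatial frequency (cycles/mm).
freq = np.linspace(0.05, 2.0, 40)
mtf  = np.exp(-1.5 * freq)                 # smoothly falling MTF
nnps = 2.0e-5 * (1.0 + 0.2 * freq)         # nearly white normalized NPS (mm^2)
q    = 5.0e4                               # photons per mm^2 reaching the detector

curve = dqe(mtf, nnps, q)
print(f"DQE at {freq[0]:.2f} c/mm: {curve[0]:.3f}")
print(f"DQE at {freq[-1]:.2f} c/mm: {curve[-1]:.4f}")
```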

  12. CALIBRATION PROCEDURES IN MID FORMAT CAMERA SETUPS

    Directory of Open Access Journals (Sweden)

    F. Pivnicka

    2012-07-01

    Full Text Available A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result in the photogrammetric workflow, awareness of the sensitive parts is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform and the specific characteristics of the mid-format camera make a professional setup with various calibration and misalignment operations necessary. An important part is a proper camera calibration. Using aerial images over a well designed test field with 3D structures and/or different flight altitudes enables the determination of calibration values in the Bingo software. It will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted to the camera in a solid and reliable way. Besides the mechanical work, especially in mounting the IMU next to the camera, two lever arms have to be measured to mm accuracy: the lever arm from the GPS antenna to the IMU's calibrated centre, and the lever arm from the IMU centre to the camera projection centre. In fact, the measurement with a total station is not a difficult task, but the definition of the right centres and the need to use rotation matrices can cause serious accuracy problems. The benefit of small and medium format cameras is that smaller aircraft can also be used. In that case, a gyro-based stabilized platform is recommended, which means that the IMU must be mounted next to the camera on the stabilizer. The advantage is that the IMU can be used to control the platform; the problematic aspect is that the IMU-to-GPS-antenna lever arm is then floating. In fact, an additional data stream has to be handled: the values of the movement of the stabilizer, used to correct the floating lever arm distances. If the post-processing of the GPS-IMU data, taking the floating levers into account, delivers the expected result, the lever arms between IMU and
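
    The role of the two lever arms and the rotation matrices can be sketched as follows: given the GPS antenna position in the mapping frame, the IMU attitude (used to build the body-to-mapping rotation) and the two lever arms measured in the body frame, the camera projection centre follows by rotating the body-frame offsets into the mapping frame. The Euler-angle convention and all numbers below are illustrative assumptions, not the conventions of Bingo or any particular vendor.

```python
import numpy as np

def body_to_map_rotation(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Rotation matrix from body frame to mapping frame (Z-Y-X Euler angles, radians).
    Conventions differ between systems; this is one common choice, used only for illustration."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

# Lever arms measured in the body frame (metres), illustrative values.
imu_to_antenna = np.array([0.15, 0.02, -1.10])   # IMU centre -> GPS antenna
imu_to_camera  = np.array([0.00, 0.25,  0.05])   # IMU centre -> camera projection centre

def camera_centre(antenna_map: np.ndarray, roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Camera projection centre in the mapping frame from the GPS antenna position."""
    R = body_to_map_rotation(roll, pitch, yaw)
    # antenna = IMU + R @ imu_to_antenna  =>  IMU = antenna - R @ imu_to_antenna
    imu_map = antenna_map - R @ imu_to_antenna
    return imu_map + R @ imu_to_camera

print(camera_centre(np.array([500000.0, 4500000.0, 1200.0]),
                    roll=np.radians(1.0), pitch=np.radians(-0.5), yaw=np.radians(92.0)))
```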

  13. ALGORITHM OF PLACEMENT OF VIDEO SURVEILLANCE CAMERAS AND ITS SOFTWARE IMPLEMENTATION

    Directory of Open Access Journals (Sweden)

    Loktev Alexey Alexeevich

    2012-10-01

    Full Text Available Comprehensive distributed safety, control, and monitoring systems applied by companies and organizations of different ownership structures play a substantial role in present-day society. Video surveillance elements that ensure image processing and decision making in automated or automatic modes are the essential components of new systems. This paper covers the modeling of video surveillance systems installed in buildings, and the algorithm, or pattern, of video camera placement with due account for nearly all characteristics of buildings, detection and recognition facilities, and the cameras themselves. This algorithm will subsequently be implemented as a user application. The project contemplates a comprehensive approach to the automatic placement of cameras that takes account of their mutual positioning and the compatibility of their tasks. The project objective is to develop the principal elements of the algorithm for recognizing a moving object detected by several cameras. The images obtained by the different cameras will be processed, and the parameters of motion are to be identified to develop a table of possible route options. The implementation of the recognition algorithm represents an independent research project to be covered by a different article. This project includes an assessment of the complexity of the camera placement algorithm, designed to identify cases of inaccurate algorithm implementation, as well as the formulation of supplementary requirements and input data by means of the overlapping sectors covered by neighbouring cameras. The project also contemplates the identification of potential problems in the course of developing a physical security and monitoring system at the stages of project design, development and testing. The camera placement algorithm has been implemented as a software application that has already been pilot tested on buildings and inside premises that have irregular dimensions. The
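
    As a highly simplified illustration of the coverage reasoning behind such placement algorithms (not the authors' algorithm), the sketch below greedily selects camera poses from a candidate set, each new camera maximizing the number of still-uncovered control points that fall inside its range and field of view; the floorplan, poses and parameters are invented for the example.

```python
import numpy as np

def covered(cam_pos, cam_dir, target, fov_deg=60.0, max_range=15.0) -> bool:
    """A target is covered if it lies within the camera's range and angular field of view."""
    v = target - cam_pos
    dist = np.linalg.norm(v)
    if dist == 0 or dist > max_range:
        return False
    cos_angle = float(v @ cam_dir) / dist
    return cos_angle >= np.cos(np.radians(fov_deg / 2.0))

def greedy_placement(candidates, targets, n_cameras):
    """Pick camera poses one at a time, each maximizing the number of newly covered targets."""
    chosen, uncovered = [], set(range(len(targets)))
    for _ in range(n_cameras):
        best, best_gain = None, -1
        for idx, (pos, direction) in enumerate(candidates):
            gain = sum(1 for t in uncovered if covered(pos, direction, targets[t]))
            if gain > best_gain:
                best, best_gain = idx, gain
        pos, direction = candidates[best]
        chosen.append(best)
        uncovered -= {t for t in uncovered if covered(pos, direction, targets[t])}
    return chosen, uncovered

# Toy 2D floorplan: candidate poses on two opposite walls, targets scattered inside.
rng = np.random.default_rng(2)
targets = rng.uniform(0, 20, size=(40, 2))
candidates = [(np.array([x, 0.0]), np.array([0.0, 1.0])) for x in range(0, 21, 5)] + \
             [(np.array([x, 20.0]), np.array([0.0, -1.0])) for x in range(0, 21, 5)]
picked, missed = greedy_placement(candidates, targets, n_cameras=3)
print("chosen candidate indices:", picked, "uncovered targets:", len(missed))
```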

  14. Development of the Earth Observation Camera of MIRIS

    Directory of Open Access Journals (Sweden)

    Dae-Hee Lee

    2011-09-01

    Full Text Available We have designed and manufactured the Earth observation camera (EOC of multi-purpose infrared imaging system (MIRIS. MIRIS is a main payload of the STSAT-3, which will be launched in late 2012. The main objective of the EOC is to test the operation of Korean IR technology in space, so we have designed the optical and mechanical system of the EOC to fit the IR detector system. We have assembled the flight model (FM of EOC and performed environment tests successfully. The EOC is now ready to be integrated into the satellite system waiting for operation in space, as planned.

  15. Soft x-ray streak cameras

    International Nuclear Information System (INIS)

    Stradling, G.L.

    1988-01-01

    This paper is a discussion of the development and of the current state of the art in picosecond soft x-ray streak camera technology. Accomplishments from a number of institutions are discussed. X-ray streak cameras vary from standard visible streak camera designs in the use of an x-ray transmitting window and an x-ray sensitive photocathode. The spectral sensitivity range of these instruments includes portions of the near UV and extends from the subkilovolt x-ray region to several tens of kilovolts. Attendant challenges encountered in the design and use of x-ray streak cameras include the accommodation of high-voltage and vacuum requirements, as well as manipulation of a photocathode structure which is often fragile. The x-ray transmitting window is generally too fragile to withstand atmospheric pressure, necessitating active vacuum pumping and a vacuum line of sight to the x-ray signal source. Because of the difficulty of manipulating x-ray beams with conventional optics, as is done with visible light, the size of the photocathode sensing area, access to the front of the tube, the ability to insert the streak tube into a vacuum chamber and the capability to trigger the sweep with very short internal delay times are issues uniquely relevant to x-ray streak camera use. The physics of electron imaging may place more stringent limitations on the temporal and spatial resolution obtainable with x-ray photocathodes than with the visible counterpart. Other issues which are common to the entire streak camera community also concern the x-ray streak camera users and manufacturers

  16. Real-time object detection, tracking and occlusion reasoning

    Energy Technology Data Exchange (ETDEWEB)

    Divakaran, Ajay; Yu, Qian; Tamrakar, Amir; Sawhney, Harpreet Singh; Zhu, Jiejie; Javed, Omar; Liu, Jingen; Cheng, Hui; Eledath, Jayakrishnan

    2018-02-27

    A system for object detection and tracking includes technologies to, among other things, detect and track moving objects, such as pedestrians and/or vehicles, in a real-world environment, handle static and dynamic occlusions, and continue tracking moving objects across the fields of view of multiple different cameras.

  17. Etiology of faint in children with recurrent syncope and syncope‘ hemodynamic patterns

    OpenAIRE

    Kinčinienė, Odeta

    2010-01-01

    Object of the dissertation: a new application of the orthostatic test for predicting the mechanism of syncope in children. There are no published studies in Lithuania investigating the causes and mechanisms of syncope in children, and attempts to find published studies on pediatric syncope or on orthostatic-test-based prediction of its mechanism were unsuccessful, despite the existence of such trials in adults. The research was performed in two steps. The first step: detailed anamnesis, objective physical examination, basic laboratory ex...

  18. Combine TV-L1 model with guided image filtering for wide and faint ring artifacts correction of in-line x-ray phase contrast computed tomography.

    Science.gov (United States)

    Ji, Dongjiang; Qu, Gangrong; Hu, Chunhong; Zhao, Yuqing; Chen, Xiaodong

    2018-01-01

    In practice, mis-calibrated detector pixels give rise to wide and faint ring artifacts in the reconstructed image in in-line phase-contrast computed tomography (IL-PC-CT). Ring artifact correction is therefore essential in IL-PC-CT. In this study, a novel method for correcting wide and faint ring artifacts is presented, based on combining the TV-L1 model with guided image filtering (GIF) in the reconstructed image domain. The new correction method includes two main steps, namely the GIF step and the TV-L1 step. To validate the performance of this method, simulation data and real experimental synchrotron data are provided. The results demonstrate that the TV-L1 model combined with the GIF step can effectively correct the wide and faint ring artifacts in IL-PC-CT.
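
    For readers unfamiliar with the guided image filter (GIF) component, a minimal single-channel implementation following the standard box-filter formulation is sketched below; the radius and regularisation values are arbitrary, and the coupling with the TV-L1 step used in the paper is not reproduced here.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide: np.ndarray, src: np.ndarray, radius: int = 8, eps: float = 1e-3) -> np.ndarray:
    """Single-channel guided image filter, expressed with box (mean) filters."""
    size = 2 * radius + 1
    box = lambda x: uniform_filter(x, size=size, mode="reflect")

    mean_I, mean_p = box(guide), box(src)
    corr_Ip, corr_II = box(guide * src), box(guide * guide)

    cov_Ip = corr_Ip - mean_I * mean_p     # covariance of guide and source in each window
    var_I = corr_II - mean_I * mean_I      # variance of the guide in each window

    a = cov_Ip / (var_I + eps)             # local linear coefficients
    b = mean_p - a * mean_I
    return box(a) * guide + box(b)         # averaged coefficients applied to the guide

# Toy use: smooth a noisy step image while preserving the edge of the (identical) guide.
rng = np.random.default_rng(1)
img = np.zeros((128, 128)); img[:, 64:] = 1.0
noisy = img + 0.1 * rng.standard_normal(img.shape)
filtered = guided_filter(noisy, noisy, radius=8, eps=0.01)
print(float(filtered.min()), float(filtered.max()))
```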

  19. Characterization of SWIR cameras by MRC measurements

    Science.gov (United States)

    Gerken, M.; Schlemmer, H.; Haan, Hubertus A.; Siemens, Christofer; Münzberg, M.

    2014-05-01

    Cameras for the SWIR wavelength range are becoming more and more important because of the better observation range for day-light operation under adverse weather conditions (haze, fog, rain). In order to choose the most suitable SWIR camera, or to qualify a camera for a given application, characterization of the camera by means of the Minimum Resolvable Contrast (MRC) concept is favorable, as the MRC comprises all relevant properties of the instrument. With the MRC known for a given camera device, the achievable observation range can be calculated for every combination of target size, illumination level or weather conditions. MRC measurements in the SWIR wavelength band can be performed largely along the guidelines of the MRC measurements of a visual camera. Typically, measurements are performed with a set of resolution targets (e.g. USAF 1951 target) manufactured with different contrast values from 50% down to less than 1%. For a given illumination level, the achievable spatial resolution is then measured for each target. The resulting curve shows the minimum contrast that is necessary to resolve the structure of a target as a function of spatial frequency. To perform MRC measurements for SWIR cameras, first the illumination parameters have to be given in radiometric instead of photometric units, which are limited in their use to the visible range. In order to do so, SWIR illumination levels for typical daylight and twilight conditions have to be defined. Second, a radiation source is necessary with appropriate emission in the SWIR range (e.g. an incandescent lamp), and the irradiance has to be measured in W/m2 instead of Lux = Lumen/m2. Third, the contrast values of the targets have to be calibrated anew for the SWIR range because they typically differ from the values determined for the visual range. Measured MRC values of three cameras are compared to the specified performance data of the devices and to the results of a multi-band in-house designed Vis-SWIR camera

  20. How to Build Your Own Document Camera for around $100

    Science.gov (United States)

    Van Orden, Stephen

    2010-01-01

    Document cameras can have great utility in second language classrooms. However, entry-level consumer document cameras start at around $350. This article describes how the author built three document cameras and offers suggestions for how teachers can successfully build their own quality document camera using a webcam for around $100.

  1. Advanced system for Gamma Cameras modernization

    International Nuclear Information System (INIS)

    Osorio Deliz, J. F.; Diaz Garcia, A.; Arista Romeu, E. J.

    2015-01-01

    Analog and digital gamma cameras are still largely used in developing countries. Many of them rely on old hardware electronics, which in many cases limits their use in actual nuclear medicine diagnostic studies. Consequently, different companies worldwide produce medical equipment for partial or total gamma camera modernization. The present work has demonstrated the possibility of substituting almost the entire signal processing electronics placed inside a gamma camera detector head with a digitizer PCI card. This card includes four 12-bit analog-to-digital converters operating at 50 MHz. It has been installed in a PC and is controlled through software developed in LabVIEW. Besides, some changes were made to the hardware inside the detector head, including a redesign of the Orientation Display Block (ODA card). Also, a new electronic design was added to the Microprocessor Control Block (MPA card), comprising a PIC microcontroller acting as a tuning system for the individual photomultiplier tubes. The images obtained by measuring a 99mTc point radioactive source with the modernized camera head demonstrate its overall performance. The system was developed and tested in an old ORBITER II SIEMENS GAMMASONIC gamma camera at the National Institute of Oncology and Radiobiology (INOR) under the CAMELUD project supported by the National Program PNOULU and the IAEA. (Author)

  2. Design of Endoscopic Capsule With Multiple Cameras.

    Science.gov (United States)

    Gu, Yingke; Xie, Xiang; Li, Guolin; Sun, Tianjia; Wang, Dan; Yin, Zheng; Zhang, Pengfei; Wang, Zhihua

    2015-08-01

    In order to reduce the miss rate of wireless capsule endoscopy, in this paper we propose a new endoscopic capsule system with multiple cameras. A master-slave architecture, including an efficient bus architecture and a four-level clock management architecture, is applied for the Multiple Cameras Endoscopic Capsule (MCEC). To cover more of the gastrointestinal tract wall with low power, multiple cameras with a smart image capture strategy, including movement-sensitive control and camera selection, are used in the MCEC. To reduce the data transfer bandwidth and power consumption and to prolong the MCEC's working life, a low-complexity image compressor with PSNR 40.7 dB and compression rate 86% is implemented. A chipset is designed and implemented for the MCEC, and a six-camera endoscopic capsule prototype is implemented using the chipset. With the smart image capture strategy, the coverage rate of the MCEC prototype reaches 98% and its power consumption is only about 7.1 mW.

  3. Designing Camera Networks by Convex Quadratic Programming

    KAUST Repository

    Ghanem, Bernard

    2015-05-04

    ​In this paper, we study the problem of automatic camera placement for computer graphics and computer vision applications. We extend the problem formulations of previous work by proposing a novel way to incorporate visibility constraints and camera-to-camera relationships. For example, the placement solution can be encouraged to have cameras that image the same important locations from different viewing directions, which can enable reconstruction and surveillance tasks to perform better. We show that the general camera placement problem can be formulated mathematically as a convex binary quadratic program (BQP) under linear constraints. Moreover, we propose an optimization strategy with a favorable trade-off between speed and solution quality. Our solution is almost as fast as a greedy treatment of the problem, but the quality is significantly higher, so much so that it is comparable to exact solutions that take orders of magnitude more computation time. Because it is computationally attractive, our method also allows users to explore the space of solutions for variations in input parameters. To evaluate its effectiveness, we show a range of 3D results on real-world floorplans (garage, hotel, mall, and airport). ​

  4. View-based 3-D object retrieval

    CERN Document Server

    Gao, Yue

    2014-01-01

    Content-based 3-D object retrieval has attracted extensive attention recently and has applications in a variety of fields, such as, computer-aided design, tele-medicine,mobile multimedia, virtual reality, and entertainment. The development of efficient and effective content-based 3-D object retrieval techniques has enabled the use of fast 3-D reconstruction and model design. Recent technical progress, such as the development of camera technologies, has made it possible to capture the views of 3-D objects. As a result, view-based 3-D object retrieval has become an essential but challenging res

  5. Achieving thermography with a thermal security camera using uncooled amorphous silicon microbolometer image sensors

    Science.gov (United States)

    Wang, Yu-Wei; Tesdahl, Curtis; Owens, Jim; Dorn, David

    2012-06-01

    Advancements in uncooled microbolometer technology over the last several years have opened up many commercial applications which had previously been cost prohibitive. Thermal technology is no longer limited to the military and government market segments. One type of thermal sensor with low NETD which is available in the commercial market segment is the uncooled amorphous silicon (α-Si) microbolometer image sensor. Typical thermal security cameras focus on providing the best image quality by auto tonemapping (contrast enhancing) the image, which provides the best contrast depending on the temperature range of the scene. While this may provide enough information to detect objects and activities, there are further benefits to being able to estimate the actual object temperatures in a scene. This thermographic ability can provide functionality beyond typical security cameras by making it possible to monitor processes. Example applications of thermography[2] with a thermal camera include: monitoring electrical circuits, industrial machinery, building thermal leaks, oil/gas pipelines, power substations, etc...[3][5] This paper discusses the methodology of estimating object temperatures by characterizing/calibrating different components inside a thermal camera utilizing an uncooled amorphous silicon microbolometer image sensor. Plots of system performance across camera operating temperatures will be shown.
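
    As a much-simplified illustration of turning raw counts into temperatures (not the calibration procedure of the paper), the sketch below performs a two-point calibration against two blackbody references and assumes the detector response is locally linear in scene temperature between them; real radiometric calibration also has to model emissivity, optics and FPA temperature drift, and the atmosphere.

```python
def two_point_calibration(counts_ref1: float, temp_ref1_c: float,
                          counts_ref2: float, temp_ref2_c: float):
    """Return a function mapping raw sensor counts to scene temperature (deg C).

    Simplification: assumes the response is locally linear in scene temperature
    between the two blackbody references; a full radiometric model would also
    account for emissivity, optics/FPA temperature and atmospheric transmission.
    """
    gain = (temp_ref2_c - temp_ref1_c) / (counts_ref2 - counts_ref1)
    offset = temp_ref1_c - gain * counts_ref1
    return lambda counts: gain * counts + offset

# Two blackbody references observed by the camera (illustrative count values).
counts_to_temp = two_point_calibration(7200.0, 20.0, 9100.0, 60.0)
print(f"{counts_to_temp(8150.0):.1f} C")   # a pixel reading half-way between the references -> 40.0 C
```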

  6. OPTIMAL CAMERA NETWORK DESIGN FOR 3D MODELING OF CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    B. S. Alsadik

    2012-07-01

    Full Text Available Digital cultural heritage documentation in 3D is a subject of research and practical applications nowadays. Image-based modeling is a technique to create 3D models, which starts with the basic task of designing the camera network. This task is – however – quite crucial in practical applications because it needs thorough planning and a certain level of expertise and experience. Bearing in mind today's computational (mobile) power, we think that the optimal camera network should be designed in the field, therefore making the preprocessing and planning dispensable. The optimal camera network is designed when certain accuracy demands are fulfilled with a reasonable effort, namely keeping the number of camera shots at a minimum. In this study, we report on the development of an automatic method to design the optimum camera network for a given object of interest, focusing currently on buildings and statues. Starting from a rough point cloud derived from a video stream of object images, the initial configuration of the camera network, assuming a high-resolution state-of-the-art non-metric camera, is designed. To improve the image coverage and accuracy, we use a mathematical penalty method of optimization with constraints. From the experimental test, we found that, after optimization, the maximum coverage is attained alongside a significant improvement of positional accuracy. Currently, we are working on a guiding system to ensure that the operator actually takes the desired images. Further steps will include a reliable and detailed modeling of the object applying sophisticated dense matching techniques.

  7. The eye of the camera: effects of security cameras on pro-social behavior

    NARCIS (Netherlands)

    van Rompay, T.J.L.; Vonk, D.J.; Fransen, M.L.

    2009-01-01

    This study addresses the effects of security cameras on prosocial behavior. Results from previous studies indicate that the presence of others can trigger helping behavior, arising from the need for approval of others. Extending these findings, the authors propose that security cameras can likewise

  8. PROPERTY OF THE LARGE FORMAT DIGITAL AERIAL CAMERA DMC II

    Directory of Open Access Journals (Sweden)

    K. Jacobsen

    2012-07-01

    Full Text Available Z/I Imaging introduced, with the DMC II 140, 230 and 250, digital aerial cameras with a very large format CCD for the panchromatic channel. With 140 / 230 / 250 megapixels, these CCDs have a size not previously available in photogrammetry. CCDs in general have a very high relative accuracy, but the overall geometry has to be checked, as well as the influence of CCDs that are not flat. A CCD with a size of 96mm × 82mm must have a flatness, or knowledge of its flatness, in the range of 1μm if the camera accuracy in the range of 1.3μm is not to be affected. The DMC II cameras have been evaluated with three different flying heights leading to 5cm, 9cm and 15cm or 20cm GSD, crossing flight lines and 60% side lap. The optimal test conditions guaranteed the precise determination of the object coordinates as well as of the systematic image errors. All three camera types show only very small systematic image errors, with root mean square values between 0.12μm and 0.3μm and extreme values not exceeding 1.6μm. The remaining systematic image errors, determined by analysis of the image residuals and not covered by the additional parameters, are negligible. A standard deviation of the object point heights below the GSD, determined at independent check points, is standard even in blocks with just 20% side lap and 60% end lap. Corresponding to the excellent image geometry, the object point coordinates are only slightly influenced by the self-calibration. For all DMC II types, the handling of image models for data acquisition does not need to be supported by an improvement of the image coordinates using the determined systematic image errors; such an improvement is up to now not standard in photogrammetric software packages. The advantage of a single monolithic CCD is obvious. An edge analysis of pan-sharpened DMC II 250 images resulted in factors for the effective resolution below 1.0. A result below 1.0 is only possible with contrast enhancement, but this requires with low image noise
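
    As a small worked example of the flying-height / GSD relationship mentioned above, the sketch below computes the ground sampling distance from pixel pitch, focal length and flying height for a nadir-looking frame camera; the pixel pitch and focal length are placeholder values, not DMC II specifications.

```python
def ground_sampling_distance(pixel_pitch_um: float, focal_length_mm: float,
                             flying_height_m: float) -> float:
    """GSD (m) = pixel pitch * flying height / focal length, for a nadir-looking frame camera."""
    return (pixel_pitch_um * 1e-6) * flying_height_m / (focal_length_mm * 1e-3)

# Placeholder camera: 5.6 um pixels, 92 mm lens (illustrative values only).
for h in (820, 1480, 2460):
    print(f"flying height {h:4d} m -> GSD {100 * ground_sampling_distance(5.6, 92.0, h):.1f} cm")
```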

  9. Acceptance/Operational Test Report for Tank 241-AN-104 camera and camera purge control system

    International Nuclear Information System (INIS)

    Castleberry, J.L.

    1995-11-01

    This Acceptance/Operational Test Procedure (ATP/OTP) will document the satisfactory operation of the camera purge panel, purge control panel, color camera system and associated control components destined for installation. The final acceptance of the complete system will be performed in the field. The purge panel and purge control panel will be tested for its safety interlock which shuts down the camera and pan-and-tilt inside the tank vapor space during loss of purge pressure and that the correct purge volume exchanges are performed as required by NFPA 496. This procedure is separated into seven sections. This Acceptance/Operational Test Report documents the successful acceptance and operability testing of the 241-AN-104 camera system and camera purge control system

  10. Gate Simulation of a Gamma Camera

    International Nuclear Information System (INIS)

    Abidi, Sana; Mlaouhi, Zohra

    2008-01-01

    Medical imaging is a very important diagnostic tool because it allows exploration of the internal human body. Nuclear imaging is an imaging technique used in nuclear medicine: the distribution of a radiotracer in the body is determined by detecting the radiation it emits with a detection device. Two methods are commonly used: Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET). In this work we are interested in the modelling of a gamma camera. The simulation is based on Monte Carlo methods, in particular the Gate simulator (Geant4 Application for Tomographic Emission). We have simulated a clinical gamma camera called GAEDE (GKS-1) and then validated these simulations with experiments. The purpose of this work is to monitor the performance of this gamma camera, optimize the detector performance, and improve the image quality. (Author)

  11. Mechanical Design of the LSST Camera

    Energy Technology Data Exchange (ETDEWEB)

    Nordby, Martin; Bowden, Gordon; Foss, Mike; Guiffre, Gary; /SLAC; Ku, John; /Unlisted; Schindler, Rafe; /SLAC

    2008-06-13

    The LSST camera is a tightly packaged, hermetically-sealed system that is cantilevered into the main beam of the LSST telescope. It is comprised of three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2 giga-pixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce impact on the image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat will be detailed.

  12. Phase camera experiment for Advanced Virgo

    International Nuclear Information System (INIS)

    Agatsuma, Kazuhiro; Beuzekom, Martin van; Schaaf, Laura van der; Brand, Jo van den

    2016-01-01

    We report on a study of the phase camera, which is a frequency-selective wave-front sensor of a laser beam. This sensor is utilized for monitoring sidebands produced by phase modulations in a gravitational wave (GW) detector. In the operation of GW detectors, the laser modulation/demodulation method is used to measure mirror displacements and is used for the position controls. This plays a significant role because the quality of the controls affects the noise level of the GW detector. The phase camera is able to monitor each sideband separately, which is of great benefit for the manipulation of the delicate controls. Also, overcoming mirror aberrations will be an essential part of Advanced Virgo (AdV), which is a GW detector close to Pisa. Especially low-frequency sidebands can be affected greatly by aberrations in one of the interferometer cavities. The phase cameras allow tracking of such changes because the state of the sidebands gives information on mirror aberrations. A prototype of the phase camera has been developed and is currently being tested. The performance checks are almost completed and the installation of the optics at the AdV site has started. After the installation and commissioning, the phase camera will be combined with a thermal compensation system that consists of CO2 lasers and compensation plates. In this paper, we focus on the prototype and show some limitations arising from the scanner performance. - Highlights: • The phase camera is being developed for a gravitational wave detector. • The scanner performance limits the operation speed and layout design of the system. • An operation range was found by measuring the frequency response of the scanner.

  13. Dynamical scene analysis with a moving camera: mobile targets detection system

    International Nuclear Information System (INIS)

    Hennebert, Christine

    1996-01-01

    This thesis work deals with the detection of moving objects in monocular image sequences acquired with a mobile camera. We propose a method able to detect small moving objects in visible or infrared images of real outdoor scenes. In order to detect objects of very low apparent motion, we consider an analysis over a large temporal interval. We have chosen to compensate for the dominant motion due to the camera displacement over several consecutive images in order to form a sub-sequence of images for which the camera seems virtually static. We have also developed a new approach allowing us to extract the different layers of a real scene in order to deal with cases where the 2D motion due to the camera displacement cannot be globally compensated for. To this end, we use a hierarchical model with two levels: the local merging step and the global merging one. Then, an appropriate temporal filtering is applied to the registered image sub-sequence to enhance signals corresponding to moving objects. The detection issue is stated as a labeling problem within a statistical regularization based on Markov Random Fields. Our method has been validated on numerous real image sequences depicting complex outdoor scenes. Finally, the feasibility of an integrated circuit for mobile object detection has been proved; this circuit could lead to the creation of an ASIC. (author) [fr

  14. Results with the UKIRT infrared camera

    International Nuclear Information System (INIS)

    Mclean, I.S.

    1987-01-01

    Recent advances in focal plane array technology have made an immense impact on infrared astronomy. Results from the commissioning of the first infrared camera on UKIRT (the world's largest IR telescope) are presented. The camera, called IRCAM 1, employs the 62 x 58 InSb DRO array from SBRC in an otherwise general purpose system which is briefly described. Several imaging modes are possible including staring, chopping and a high-speed snapshot mode. Results to be presented include the first true high resolution images at IR wavelengths of the entire Orion nebula

  15. Camera-enabled techniques for organic synthesis

    Directory of Open Access Journals (Sweden)

    Steven V. Ley

    2013-05-01

    Full Text Available A great deal of time is spent within synthetic chemistry laboratories on non-value-adding activities such as sample preparation and work-up operations, and labour intensive activities such as extended periods of continued data collection. Using digital cameras connected to computer vision algorithms, camera-enabled apparatus can perform some of these processes in an automated fashion, allowing skilled chemists to spend their time more productively. In this review we describe recent advances in this field of chemical synthesis and discuss how they will lead to advanced synthesis laboratories of the future.

  16. Nonmedical applications of a positron camera

    International Nuclear Information System (INIS)

    Hawkesworth, M.R.; Parker, D.J.; Fowles, P.; Crilly, J.F.; Jefferies, N.L.; Jonkers, G.

    1991-01-01

    The positron camera in the School of Physics and Space Research, University of Birmingham, is based on position-sensitive multiwire γ-ray detectors developed at the Rutherford Appleton Laboratory. The current characteristics of the camera are discussed with particular reference to its suitability for flow mapping in industrial subjects. The techniques developed for studying the dynamics of processes with time scales ranging from milliseconds to days are described, and examples of recent results from a variety of industrial applications are presented. (orig.)

  17. Improving depth maps of plants by using a set of five cameras

    Science.gov (United States)

    Kaczmarek, Adam L.

    2015-03-01

    Obtaining high-quality depth maps and disparity maps with the use of a stereo camera is a challenging task for some kinds of objects. The quality of these maps can be improved by taking advantage of a larger number of cameras. The research on the usage of a set of five cameras to obtain disparity maps is presented. The set consists of a central camera and four side cameras. An algorithm for making disparity maps called multiple similar areas (MSA) is introduced. The algorithm was specially designed for the set of five cameras. Experiments were performed with the MSA algorithm and the stereo matching algorithm based on the sum of sum of squared differences (sum of SSD, SSSD) measure. Moreover, the following measures were included in the experiments: sum of absolute differences (SAD), zero-mean SAD (ZSAD), zero-mean SSD (ZSSD), locally scaled SAD (LSAD), locally scaled SSD (LSSD), normalized cross correlation (NCC), and zero-mean NCC (ZNCC). Algorithms presented were applied to images of plants. Making depth maps of plants is difficult because parts of leaves are similar to each other. The potential usability of the described algorithms is especially high in agricultural applications such as robotic fruit harvesting.
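
    The SSSD idea referred to above can be sketched as follows: compute a block-aggregated SSD cost volume between the central image and each side image, sum the volumes, and take the winner-take-all disparity. The sketch assumes rectified, horizontally displaced views, which is a simplification of the five-camera geometry handled by the MSA algorithm, and the toy data are invented for the example.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def ssd_cost_volume(ref: np.ndarray, other: np.ndarray, max_disp: int, block: int = 7) -> np.ndarray:
    """Block-aggregated SSD matching cost for disparities 0..max_disp-1 against one side image."""
    ref = ref.astype(float)
    other = other.astype(float)
    costs = np.empty((max_disp,) + ref.shape)
    for d in range(max_disp):
        shifted = np.roll(other, d, axis=1)            # simplistic shift; borders are not handled
        costs[d] = uniform_filter((ref - shifted) ** 2, size=block, mode="reflect")
    return costs

def sssd_disparity(ref: np.ndarray, side_images, max_disp: int, block: int = 7) -> np.ndarray:
    """Sum of SSD costs over several side cameras (SSSD), winner-take-all disparity."""
    total = sum(ssd_cost_volume(ref, img, max_disp, block) for img in side_images)
    return np.argmin(total, axis=0)

# Toy example: a bright square that appears 4 px to the left in the side views.
ref = np.zeros((60, 80)); ref[20:40, 30:50] = 1.0
side = np.roll(ref, -4, axis=1)
disp = sssd_disparity(ref, [side, side], max_disp=8)
print("disparity at the left edge of the object:", disp[30, 30])   # expected: 4
```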

  18. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test

    Directory of Open Access Journals (Sweden)

    Bruno Roux

    2008-11-01

    Full Text Available The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that (1) the use of unprocessed image data did not improve the results of image analyses; (2) vignetting had a significant effect, especially for the modified camera; and (3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.
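
    The two processing ideas mentioned (correcting vignetting with a flat-field frame, then computing a normalized vegetation index from the red and NIR bands) can be sketched as follows; the flat-field and band arrays are synthetic placeholders, and the index shown is the standard NDVI rather than the specific indices evaluated in the study.

```python
import numpy as np

def correct_vignetting(image: np.ndarray, flat_field: np.ndarray) -> np.ndarray:
    """Divide by a normalized flat-field frame (e.g. an image of a uniform target)."""
    gain = flat_field / flat_field.mean()
    return image / gain

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index = (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Synthetic data: a radial brightness fall-off applied to a flat scene, then corrected.
yy, xx = np.mgrid[0:200, 0:300]
r2 = ((xx - 150) / 150.0) ** 2 + ((yy - 100) / 100.0) ** 2
vignette = 1.0 - 0.3 * r2                    # brightness drops toward the corners
red_band = 0.20 * vignette
nir_band = 0.60 * vignette
red_c = correct_vignetting(red_band, vignette)
nir_c = correct_vignetting(nir_band, vignette)
print(f"NDVI of the corrected scene: {ndvi(nir_c, red_c).mean():.2f}")   # ~0.5 everywhere
```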

  19. User-assisted visual search and tracking across distributed multi-camera networks

    Science.gov (United States)

    Raja, Yogesh; Gong, Shaogang; Xiang, Tao

    2011-11-01

    Human CCTV operators face several challenges in their task which can lead to missed events, people or associations, including: (a) data overload in large distributed multi-camera environments; (b) short attention span; (c) limited knowledge of what to look for; and (d) lack of access to non-visual contextual intelligence to aid search. Developing a system to aid human operators and alleviate such burdens requires addressing the problem of automatic re-identification of people across disjoint camera views, a matching task made difficult by factors such as lighting, viewpoint and pose changes and for which absolute scoring approaches are not best suited. Accordingly, we describe a distributed multi-camera tracking (MCT) system to visually aid human operators in associating people and objects effectively over multiple disjoint camera views in a large public space. The system comprises three key novel components: (1) relative measures of ranking rather than absolute scoring to learn the best features for matching; (2) multi-camera behaviour profiling as higher-level knowledge to reduce the search space and increase the chance of finding correct matches; and (3) human-assisted data mining to interactively guide search and in the process recover missing detections and discover previously unknown associations. We provide an extensive evaluation of the greater effectiveness of the system as compared to existing approaches on industry-standard i-LIDS multi-camera data.

  20. New `Moons' of Saturn May Be Transient Objects

    Science.gov (United States)

    1996-01-01

    ring plane crossings (RPX) . At the corresponding times, the Sun illuminates the thin Saturnian rings exactly from the side. Due to its own orbital motion around the Sun, the Earth will cross the ring plane either once or three times, just before and/or after a solar RPX event. In 1995, this happened on May 22 and August 10, and there will be a third Earth RPX event on February 11, 1996. RPX Events Offer Improved Possibilities to Discover Faint Moons The apparent brightness of Saturn's rings decreases dramatically around the time of a solar RPX event. It is then much easier to detect faint moons which would otherwise be lost in the strong glare of Saturn's ring system. Also, the edge-on view improves the chances of detecting faint and dilute rings [3]. Moreover, numerous `mutual events' (eclipses and occultations) occur between the moons during this period; exact timing of these events allows highly improved determination of the motions and orbits around Saturn of these objects. The most recent Earth RPX event took place on August 10, 1995. At this time, Saturn was situated nearly opposite the Sun (in `opposition'), as seen from the Earth, and conditions were very favourable for astronomical observations from both hemispheres. However, because of the longer nights during the southern winter, observing possibilities were particularly good in the south and thus at the ESO La Silla Observatory. The ADONIS Observations Here, a team of astronomers (Jean-Luc Beuzit, Bruno Sicardy and Francois Poulet of the Paris Observatory; Pablo Prado from ESO) followed this rare event during 6 half-nights around August 10, 1995, with the advanced ADONIS adaptive optics camera at the ESO 3.6-m telescope. This instrument neutralizes the image-smearing effects of the atmospheric turbulence and records very sharp images on an infrared-sensitive 256 x 256 pixel detector with a scale of 0.05 arcsec/pixel. Most of the Saturn images were taken through the `short K' filter with a central

  1. Searching for Faint Traces of CO(2-1) and HCN(4-3) Gas In Debris Disks

    Science.gov (United States)

    Stafford Lambros, Zachary; Hughes, A. Meredith

    2018-01-01

    The surprising presence of molecular gas in the debris disks around main sequence stars provides an opportunity to study the dissipation of primordial gas and, potentially, the composition of gas in other solar systems. Molecular gas is not expected to survive beyond the pre-main sequence phase, and it is not yet clear whether the gas is a remnant of the primordial protoplanetary material or whether the gas, like the dust, is second-generation material produced by collisions or photodesorption from planetesimals, exocomets, or the icy mantles of dust grains. Here we present two related efforts to characterize the prevalence and properties of gas in debris disks. First, we place the lowest limits to date on the CO emission from an M star debris disk, using 0.3" resolution observations of CO(2-1) emission from the AU Mic system with the Atacama Large Millimeter/submillimeter Array (ALMA). We place a 3-sigma upper limit on the integrated flux of 0.39 Jy km/s, corresponding to a maximum CO mass of 5 × 10^-6 Earth masses if the gas is in LTE. We also present the results of an ALMA search for HCN(4-3) emission from the prototypical gas-rich debris disk around 49 Ceti at a spatial resolution of 0.3". Despite hosting one of the brightest CO-rich debris disks yet discovered, our observations of 49 Ceti also yield a low upper limit of 0.057 Jy km/s in the HCN line, leaving CO as the only molecule clearly detected in emission from a debris disk. We employ several methods of detecting faint line emission from debris disks, including a model based on Keplerian kinematics as well as a spectral shifting method previously used to detect faint CO emission from the Fomalhaut debris disk, and compare our results.

  2. Compact Optical Technique for Streak Camera Calibration

    International Nuclear Information System (INIS)

    Bell, P; Griffith, R; Hagans, K; Lerche, R; Allen, C; Davies, T; Janson, F; Justin, R; Marshall, B; Sweningsen, O

    2004-01-01

    The National Ignition Facility (NIF) is under construction at the Lawrence Livermore National Laboratory (LLNL) for the U.S. Department of Energy Stockpile Stewardship Program. Optical streak cameras are an integral part of the experimental diagnostics instrumentation. To accurately reduce data from the streak cameras a temporal calibration is required. This article describes a technique for generating trains of precisely timed short-duration optical pulses1 (optical comb generators) that are suitable for temporal calibrations. These optical comb generators (Figure 1) are used with the LLNL optical streak cameras. They are small, portable light sources that produce a series of temporally short, uniformly spaced, optical pulses. Comb generators have been produced with 0.1, 0.5, 1, 3, 6, and 10-GHz pulse trains of 780-nm wavelength light with individual pulse durations of ∼25-ps FWHM. Signal output is via a fiber-optic connector. Signal is transported from comb generator to streak camera through multi-mode, graded-index optical fibers. At the NIF, ultra-fast streak-cameras are used by the Laser Fusion Program experimentalists to record fast transient optical signals. Their temporal resolution is unmatched by any other transient recorder. Their ability to spatially discriminate an image along the input slit allows them to function as a one-dimensional image recorder, time-resolved spectrometer, or multichannel transient recorder. Depending on the choice of photocathode, they can be made sensitive to photon energies from 1.1 eV to 30 keV and beyond. Comb generators perform two important functions for LLNL streak-camera users. First, comb generators are used as a precision time-mark generator for calibrating streak camera sweep rates. Accuracy is achieved by averaging many streak camera images of comb generator signals. Time-base calibrations with portable comb generators are easily done in both the calibration laboratory and in situ. Second, comb signals are applied
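
    As a worked illustration of the sweep-rate calibration described above (the comb frequency and measured pixel positions below are invented for the example, not taken from the article), the known pulse spacing of the comb divided by the measured pixel separation of adjacent comb marks gives the time per pixel of the streak record.

        import numpy as np

        # Hypothetical calibration of a streak-camera sweep rate with a comb generator.
        comb_frequency_hz = 10e9                          # 10-GHz comb: one pulse every 100 ps
        pulse_spacing_s = 1.0 / comb_frequency_hz

        peak_positions_px = [102.3, 143.1, 183.8, 224.6]  # comb marks located on the streak image
        mean_separation_px = np.mean(np.diff(peak_positions_px))

        sweep_rate = pulse_spacing_s / mean_separation_px
        print(f"sweep rate ~ {sweep_rate * 1e12:.2f} ps/pixel")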

  3. Deep Rapid Optical Follow-Up of Gravitational Wave Sources with the Dark Energy Camera

    Science.gov (United States)

    Cowperthwaite, Philip

    2018-01-01

    The detection of an electromagnetic counterpart associated with a gravitational wave detection by the Advanced LIGO and VIRGO interferometers is one of the great observational challenges of our time. The large localization regions and potentially faint counterparts require the use of wide-field, large aperture telescopes. As a result, the Dark Energy Camera, a 3.3 sq deg CCD imager on the 4-m Blanco telescope at CTIO in Chile is the most powerful instrument for this task in the Southern Hemisphere. I will report on the results from our joint program between the community and members of the dark energy survey to conduct rapid and efficient follow-up of gravitational wave sources. This includes systematic searches for optical counterparts, as well as developing an understanding of contaminating sources on timescales not normally probed by traditional untargeted supernova surveys. I will additionally comment on the immense science gains to be made by a joint detection and discuss future prospects from the standpoint of both next generation wide-field telescopes and next generation gravitational wave detectors.

  4. Underwater video enhancement using multi-camera super-resolution

    Science.gov (United States)

    Quevedo, E.; Delory, E.; Callicó, G. M.; Tobajas, F.; Sarmiento, R.

    2017-12-01

    Image spatial resolution is critical in several fields, such as medicine, communications, satellite imaging, and underwater applications. While a large variety of techniques for image restoration and enhancement has been proposed in the literature, this paper focuses on a novel Super-Resolution fusion algorithm based on a Multi-Camera environment that makes it possible to enhance the quality of underwater video sequences without significantly increasing computation. In order to compare the quality enhancement, two objective quality metrics have been used: PSNR (Peak Signal-to-Noise Ratio) and the SSIM (Structural SIMilarity) index. Results have shown that, with respect to basic fusion Super-Resolution algorithms, the proposed method enhances the objective quality of several underwater sequences while avoiding the appearance of undesirable artifacts.
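
    For reference, the PSNR figure of merit mentioned above is a direct function of the mean squared error between a reference frame and the processed frame; a minimal sketch follows (the peak value of 255 assumes 8-bit data, and SSIM is usually taken from an existing implementation such as scikit-image rather than re-derived by hand).

        import numpy as np

        def psnr(reference, test, peak=255.0):
            # Peak Signal-to-Noise Ratio in dB between two images of identical shape
            ref = reference.astype(np.float64)
            tst = test.astype(np.float64)
            mse = np.mean((ref - tst) ** 2)
            return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)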

  5. QuadCam - A Quadruple Polarimetric Camera for Space Situational Awareness

    Science.gov (United States)

    Skuljan, J.

    A specialised quadruple polarimetric camera for space situational awareness, QuadCam, has been built at the Defence Technology Agency (DTA), New Zealand, as part of a collaboration with the Defence Science and Technology Laboratory (Dstl), United Kingdom. The design was based on a similar system originally developed at Dstl, with some significant modifications for improved performance. The system is made up of four identical CCD cameras looking in the same direction but each in a different plane of polarisation, at 0, 45, 90 and 135 degrees with respect to the reference plane. A standard set of Stokes parameters can be derived from the four images in order to describe the state of polarisation of an object captured in the field of view. The modified design of the DTA QuadCam makes use of four small Raspberry Pi computers, so that each camera is controlled by its own computer in order to speed up the readout process and ensure that the four individual frames are taken simultaneously (to within 100-200 microseconds). In addition, new firmware was requested from the camera manufacturer so that an output signal is generated to indicate the state of the camera shutter. A specialised GPS unit (also developed at DTA) is then used to monitor the shutter signals from the four cameras and record the actual time of exposure to an accuracy of about 100 microseconds. This makes the system well suited for the observation of fast-moving objects in low Earth orbit (LEO). The QuadCam is currently mounted on a Paramount MEII robotic telescope mount at the newly built DTA space situational awareness observatory located on Whangaparaoa Peninsula near Auckland, New Zealand. The system will be used for tracking satellites in low Earth orbit as well as in the geostationary belt. The performance of the camera has been evaluated and a series of test images have been collected in order to derive the polarimetric signatures for selected satellites.
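
    A minimal sketch of the standard reduction from the four polarisation channels to Stokes parameters is given below. It assumes the four frames are co-registered and flat-fielded; the function name and normalisation convention are illustrative, not taken from the QuadCam pipeline.

        import numpy as np

        def linear_stokes(i0, i45, i90, i135):
            # Linear Stokes parameters from four co-aligned images taken through
            # polarisers at 0, 45, 90 and 135 degrees.
            s0 = 0.5 * (i0 + i45 + i90 + i135)          # total intensity
            s1 = i0 - i90                               # 0 vs 90 degree preference
            s2 = i45 - i135                             # 45 vs 135 degree preference
            dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)  # degree of linear polarisation
            aolp = 0.5 * np.arctan2(s2, s1)             # angle of linear polarisation (radians)
            return s0, s1, s2, dolp, aolp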

  6. Design of an experimental four-camera setup for enhanced 3D surface reconstruction in microsurgery

    Directory of Open Access Journals (Sweden)

    Marzi Christian

    2017-09-01

    Full Text Available Future fully digital surgical visualization systems enable a wide range of new options. Owing to optomechanical limitations, a main disadvantage of today's surgical microscopes is their inability to provide arbitrary perspectives to more than two observers. In a fully digital microscopic system, multiple arbitrary views can be generated from a 3D reconstruction. Modern surgical microscopes allow the eyepieces to be replaced by cameras in order to record stereoscopic videos. A reconstruction from these videos can only contain the amount of detail the recording camera system gathers from the scene. Therefore, occluded surfaces can result in a faulty reconstruction for deviating stereoscopic perspectives. By adding cameras that record the object from different angles, additional information about the scene is acquired, allowing the reconstruction to be improved. Our approach is to use a fixed four-camera setup as a front-end system to capture enhanced 3D topography of a pseudo-surgical scene. This experimental setup provides images for the reconstruction algorithms and for the generation of multiple observing stereo perspectives. The concept of the designed setup is based on the common main objective (CMO) principle of current surgical microscopes. These systems are well established and optically mature. Furthermore, the CMO principle allows a more compact design and a lower calibration effort than cameras with separate optics. Behind the CMO, four pupils separate the four channels, which are recorded by one camera each. The designed system captures an area of approximately 28 mm × 28 mm with four cameras, allowing images from six different stereo perspectives to be processed. In order to verify the setup, it is modelled in silico. It can be used in further studies to test algorithms for 3D reconstruction from up to four perspectives and to provide information about the impact of additionally recorded perspectives on the enhancement of a reconstruction.

  7. Modulated electron-multiplied fluorescence lifetime imaging microscope: all-solid-state camera for fluorescence lifetime imaging.

    Science.gov (United States)

    Zhao, Qiaole; Schelen, Ben; Schouten, Raymond; van den Oever, Rein; Leenen, René; van Kuijk, Harry; Peters, Inge; Polderdijk, Frank; Bosiers, Jan; Raspe, Marcel; Jalink, Kees; Geert Sander de Jong, Jan; van Geest, Bert; Stoop, Karel; Young, Ian Ted

    2012-12-01

    We have built an all-solid-state camera that is directly modulated at the pixel level for frequency-domain fluorescence lifetime imaging microscopy (FLIM) measurements. This novel camera eliminates the need for an image intensifier through the use of an application-specific charge coupled device design in a frequency-domain FLIM system. The first stage of evaluation for the camera has been carried out. Camera characteristics such as noise distribution, dark current influence, camera gain, sampling density, sensitivity, linearity of photometric response, and optical transfer function have been studied through experiments. We are able to do lifetime measurement using our modulated, electron-multiplied fluorescence lifetime imaging microscope (MEM-FLIM) camera for various objects, e.g., fluorescein solution, fixed green fluorescent protein (GFP) cells, and GFP-actin stained live cells. A detailed comparison of a conventional microchannel plate (MCP)-based FLIM system and the MEM-FLIM system is presented. The MEM-FLIM camera shows higher resolution and a better image quality. The MEM-FLIM camera provides a new opportunity for performing frequency-domain FLIM.

  9. The LLL compact 10-ps streak camera

    International Nuclear Information System (INIS)

    Thomas, S.W.; Houghton, J.W.; Tripp, G.R.; Coleman, L.W.

    1975-01-01

    The 10-ps streak camera has been redesigned to simplify its operation, reduce manufacturing costs, and improve its appearance. The electronics have been simplified, a film indexer added, and a contacted slit has been evaluated. Data support a 10-ps resolution. (author)

  10. Terrain mapping camera for Chandrayaan-1

    Indian Academy of Sciences (India)

    The Terrain Mapping Camera (TMC) on India's first satellite for lunar exploration, Chandrayaan-1, is for generating high-resolution 3-dimensional maps of the Moon. With this instrument, a complete topographic map of the Moon with 5 m spatial resolution and 10-bit quantization will be available for scientific studies.

  11. A multidetector scintillation camera with 254 channels

    DEFF Research Database (Denmark)

    Sveinsdottir, E; Larsen, B; Rommer, P

    1977-01-01

    A computer-based scintillation camera has been designed for both dynamic and static radionuclide studies. The detecting head has 254 independent sodium iodide crystals, each with a photomultiplier and amplifier. In dynamic measurements simultaneous events can be recorded, and 1 million total counts...

  12. FPS camera sync and reset chassis

    International Nuclear Information System (INIS)

    Yates, G.J.

    1980-06-01

    The sync and reset chassis provides all the circuitry required to synchronize an event to be studied, a remote free-running focus projection and scanning (FPS) data-acquisition TV camera, and a video signal recording system. The functions, design, and operation of this chassis are described in detail

  13. The Legal Implications of Surveillance Cameras

    Science.gov (United States)

    Steketee, Amy M.

    2012-01-01

    The nature of school security has changed dramatically over the last decade. Schools employ various measures, from metal detectors to identification badges to drug testing, to promote the safety and security of staff and students. One of the increasingly prevalent measures is the use of security cameras. In fact, the U.S. Department of Education…

  14. A novel super-resolution camera model

    Science.gov (United States)

    Shao, Xiaopeng; Wang, Yi; Xu, Jie; Wang, Lin; Liu, Fei; Luo, Qiuhua; Chen, Xiaodong; Bi, Xiangli

    2015-05-01

    Aiming to realize super resolution (SR) for single-image and video reconstruction, a super-resolution camera model is proposed to address the comparatively low resolution of images obtained by traditional cameras. To achieve this, a driving device such as a piezoelectric ceramic actuator is placed in the camera. By controlling the driving device, a set of consecutive low-resolution (LR) images with random displacements can be acquired and stored in real time. Because the LR image sequence carries complementary redundant information and some prior information, it is possible to restore a super-resolution image effectively. A sampling analysis is used to derive the reconstruction principle of super resolution and to estimate the theoretically achievable improvement in resolution. A learning-based super-resolution algorithm is used to reconstruct single images, and a variational Bayesian algorithm is simulated to reconstruct the randomly displaced low-resolution images; this models the unknown high-resolution image, the motion parameters, and the unknown model parameters in one hierarchical Bayesian framework. Using a sub-pixel registration method, a super-resolution image of the scene can be reconstructed. Reconstruction results from 16 images show that this camera model can double the image resolution, obtaining higher-resolution images with currently available hardware.
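
    The hierarchical Bayesian reconstruction described above is beyond the scope of a short example, but the reason a set of randomly displaced low-resolution frames carries extra information can be seen in the simplest baseline, shift-and-add: each frame's pixels are placed on a finer grid according to their registered sub-pixel shift and averaged. The sketch below is only that baseline, with illustrative names, and is not the paper's algorithm.

        import numpy as np

        def shift_and_add(lr_frames, shifts, scale):
            # Naive shift-and-add super-resolution.
            # lr_frames: list of HxW arrays; shifts: list of (dy, dx) in LR pixels;
            # scale: integer upsampling factor of the high-resolution grid.
            h, w = lr_frames[0].shape
            acc = np.zeros((h * scale, w * scale))
            weight = np.zeros_like(acc)
            ys, xs = np.mgrid[0:h, 0:w]
            for frame, (dy, dx) in zip(lr_frames, shifts):
                # Map each LR pixel centre to the HR grid, including its registered shift
                hy = np.clip(np.round((ys + dy) * scale).astype(int), 0, h * scale - 1)
                hx = np.clip(np.round((xs + dx) * scale).astype(int), 0, w * scale - 1)
                np.add.at(acc, (hy, hx), frame)
                np.add.at(weight, (hy, hx), 1.0)
            return acc / np.maximum(weight, 1.0)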

  15. Face identification in videos from mobile cameras

    NARCIS (Netherlands)

    Mu, Meiru; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.

    2014-01-01

    It is still challenging to recognize faces reliably in videos from a mobile camera, although mature automatic face recognition technology for still images has been available for quite some time. Suppose we want to be alerted when suspects appear in the recording of a police Body-Cam; even a good face

  17. Digital Camera Project Fosters Communication Skills

    Science.gov (United States)

    Fisher, Ashley; Lazaros, Edward J.

    2009-01-01

    This article details the many benefits of educators' use of digital camera technology and provides an activity in which students practice taking portrait shots of classmates, manipulate the resulting images, and add language arts practice by interviewing their subjects to produce a photo-illustrated Word document. This activity gives…

  18. Phase camera experiment for Advanced Virgo

    NARCIS (Netherlands)

    Agatsuma, Kazuhiro; Van Beuzekom, Martin; Van Der Schaaf, Laura; Van Den Brand, Jo

    2016-01-01

    We report on a study of the phase camera, which is a frequency selective wave-front sensor of a laser beam. This sensor is utilized for monitoring sidebands produced by phase modulations in a gravitational wave (GW) detector. Regarding the operation of the GW detectors, the laser

  19. A multi-criteria approach to camera motion design for volume data animation.

    Science.gov (United States)

    Hsu, Wei-Hsien; Zhang, Yubo; Ma, Kwan-Liu

    2013-12-01

    We present an integrated camera motion design and path generation system for building volume data animations. Creating animations is an essential task in presenting complex scientific visualizations. Existing visualization systems use an established animation function based on keyframes selected by the user. This approach is limited in providing the optimal in-between views of the data. Alternatively, computer graphics and virtual reality camera motion planning is frequently focused on collision free movement in a virtual walkthrough. For semi-transparent, fuzzy, or blobby volume data the collision free objective becomes insufficient. Here, we provide a set of essential criteria focused on computing camera paths to establish effective animations of volume data. Our dynamic multi-criteria solver coupled with a force-directed routing algorithm enables rapid generation of camera paths. Once users review the resulting animation and evaluate the camera motion, they are able to determine how each criterion impacts path generation. In this paper, we demonstrate how incorporating this animation approach with an interactive volume visualization system reduces the effort in creating context-aware and coherent animations. This frees the user to focus on visualization tasks with the objective of gaining additional insight from the volume data.

  20. Integrating Gigabit ethernet cameras into EPICS at Diamond light source

    International Nuclear Information System (INIS)

    Cobb, T.

    2012-01-01

    At Diamond Light Source a range of cameras are used to provide images for diagnostic purposes in both the accelerator and the photon beamlines. The accelerator and existing beamlines use Point Grey Flea and Flea2 Firewire cameras. We have selected Gigabit Ethernet cameras supporting GigE Vision for our new photon beamlines. GigE Vision is an interface standard for high-speed Ethernet cameras which encourages interoperability between manufacturers. This paper describes the challenges encountered while integrating GigE Vision cameras from a range of vendors into EPICS. GigE Vision cameras appear to be more reliable than the Firewire cameras, and the simple cabling makes it much easier to move the cameras to different positions. Upcoming power-over-Ethernet versions of the cameras will reduce the number of cables still further.

  1. New nuclear medicine gamma camera systems

    International Nuclear Information System (INIS)

    Villacorta, Edmundo V.

    1997-01-01

    The acquisition of the Open E.CAM and DIACAM gamma cameras by Makati Medical Center is expected to enhance the capabilities of its nuclear medicine facilities. When used as an aid to diagnosis, nuclear medicine entails the introduction of a minute amount of radioactive material into the patient; thus, no reaction or side-effect is expected. When it reaches the particular target organ, depending on the radiopharmaceutical, a lesion will appear as a decreased (cold) area or an increased (hot) area in the radioactive distribution as recorded by the gamma cameras. Gamma camera images in slices, or SPECT (Single Photon Emission Computed Tomography), increase the sensitivity and accuracy in detecting smaller and deeply seated lesions, which otherwise may not be detected in the regular single planar images. Due to the 'open' design of the equipment, claustrophobic patients will no longer feel enclosed during the procedure. These new gamma cameras yield improved resolution and superb image quality, and the higher photon sensitivity shortens imaging acquisition time. The E.CAM, the latest-generation gamma camera, features a variable-angle dual-head system, the only one available in the Philippines, and is an excellent choice for Myocardial Perfusion Imaging (MPI). From the usual 45 minutes, the acquisition time for gated SPECT imaging of the heart has now been remarkably reduced to 12 minutes. 'Gated' refers to snap-shots of the heart in selected phases of its contraction and relaxation as triggered by ECG. The DIACAM is installed in a room with access outside the main entrance of the department, intended specially for bed-borne patients. Both systems are equipped with a network of high-performance Macintosh ICOND acquisition and processing computers. Added to the hardware is the ICON processing software, which allows total simultaneous acquisition and processing capabilities in the same operator's terminal. Video film and color printers are also provided. Together

  2. Camera Geo calibration Using an MCMC Approach (Author’s Manuscript)

    Science.gov (United States)

    2016-08-19

    environments and infer location using local image descriptors [4, 16]. Li et al. [17] exploit geo-registered 3D point clouds to estimate camera pose. Many...locations, downloaded the corresponding equirectangular panoramas and extracted a perspective image from each. For each query, we labeled objects

  3. The Topological Panorama Camera: A New Tool for Teaching Concepts Related to Space and Time.

    Science.gov (United States)

    Gelphman, Janet L.; And Others

    1992-01-01

    Included are the description, operating characteristics, uses, and future plans for the Topological Panorama Camera, which is an experimental, robotic photographic device capable of producing visual renderings of the mathematical characteristics of an equation in terms of position changes of an object or in terms of the shape of the space…

  4. Integrating motion-detection cameras and hair snags for wolverine identification

    Science.gov (United States)

    Audrey J. Magoun; Clinton D. Long; Michael K. Schwartz; Kristine L. Pilgrim; Richard E. Lowell; Patrick Valkenburg

    2011-01-01

    We developed an integrated system for photographing a wolverine's (Gulo gulo) ventral pattern while concurrently collecting hair for microsatellite DNA genotyping. Our objectives were to 1) test the system on a wild population of wolverines using an array of camera and hair-snag (C&H) stations in forested habitat where wolverines were known to occur, 2)...

  5. Fluorescence decay times of indigo-derivatives measured by means of a synchroscan streak camera

    Science.gov (United States)

    Lill, E.; Hefferle, P.; Schneider, S.; Dörr, F.

    1980-06-01

    Employing a synchronously pumped, mode-locked dye laser for excitation in connection with a commercial, continuously operated streak camera, the solvent-dependent fluorescence decay times of several indigo derivatives exhibiting a low fluorescence quantum efficiency were determined with a temporal resolution of about 5 ps, in order to further elucidate their energy relaxation mechanisms, which remain the subject of continuing controversy.

  6. X-ray imaging using digital cameras

    Science.gov (United States)

    Winch, Nicola M.; Edgar, Andrew

    2012-03-01

    The possibility of using the combination of a computed radiography (storage phosphor) cassette and a semiprofessional grade digital camera for medical or dental radiography is investigated. We compare the performance of (i) a Canon 5D Mk II single lens reflex camera with f1.4 lens and full-frame CMOS array sensor and (ii) a cooled CCD-based camera with a 1/3 frame sensor and the same lens system. Both systems are tested with 240 x 180 mm cassettes which are based on either powdered europium-doped barium fluoride bromide or needle structure europium-doped cesium bromide. The modulation transfer function for both systems has been determined and falls to a value of 0.2 at around 2 lp/mm, and is limited by light scattering of the emitted light from the storage phosphor rather than the optics or sensor pixelation. The modulation transfer function for the CsBr:Eu2+ plate is bimodal, with a high frequency wing which is attributed to the light-guiding behaviour of the needle structure. The detective quantum efficiency has been determined using a radioisotope source and is comparatively low at 0.017 for the CMOS camera and 0.006 for the CCD camera, attributed to the poor light harvesting by the lens. The primary advantages of the method are portability, robustness, digital imaging and low cost; the limitations are the low detective quantum efficiency and hence signal-to-noise ratio for medical doses, and restricted range of plate sizes. Representative images taken with medical doses are shown and illustrate the potential use for portable basic radiography.
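
    For context, the detective quantum efficiency quoted above is conventionally defined by comparing the squared signal-to-noise ratio delivered by the detector with that of the incident quanta (this is the standard textbook definition, not a formula specific to this paper):

        \mathrm{DQE} \;=\; \frac{\mathrm{SNR}_{\mathrm{out}}^{2}}{\mathrm{SNR}_{\mathrm{in}}^{2}},
        \qquad \mathrm{SNR}_{\mathrm{in}}^{2} = N_{\gamma},

    where N_γ is the number of quanta incident on the measured area, so a DQE of 0.017 means the system produces an image as noisy as one that detected only about 1.7% of the incident photons.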

  7. The SALSA Project - High-End Aerial 3d Camera

    Science.gov (United States)

    Rüther-Kindel, W.; Brauchle, J.

    2013-08-01

    The ATISS measurement drone, developed at the University of Applied Sciences Wildau, is an electrically powered motor glider with a maximum take-off weight of 25 kg, including a payload capacity of 10 kg. Two 2.5 kW engines enable ultra-short take-off procedures, and the motor glider design results in a 1 h endurance. The concept of ATISS is based on the idea of strictly separating aircraft and payload functions, which makes ATISS a very flexible research platform for miscellaneous payloads. ATISS is equipped with an autopilot for autonomous flight patterns but remains under permanent pilot control from the ground. On the basis of ATISS the project SALSA was undertaken, with the aim of integrating a system for digital terrain modelling. Instead of a laser scanner, a new design concept was chosen based on two synchronized high-resolution digital cameras, one in a fixed nadir orientation and the other in an oblique orientation. Thus, images of every object on the ground are taken from different view angles. This new measurement camera system, MACS-TumbleCam, was developed at the German Aerospace Center DLR Berlin-Adlershof especially for the ATISS payload concept. A special advantage in comparison to laser scanning is that, instead of a cloud of points, a surface including texture is generated, and a high-end inertial orientation system can be omitted. The first test flights show a ground resolution of 2 cm and a height resolution of 3 cm, which underline the extraordinary capabilities of ATISS and the MACS measurement camera system.

  8. Symbiotic Stars in X-rays. II. Faint Sources Detected with XMM-Newton and Chandra

    Science.gov (United States)

    Nunez, N. E.; Luna, G. J. M.; Pillitteri, I.; Mukai, K.

    2014-01-01

    We report the detection of X-ray emission from four symbiotic stars that were not known to be X-ray sources. These four objects show a β-type X-ray spectrum, that is, their spectra can be modeled with absorbed, optically thin thermal emission with temperatures of a few million degrees. Photometric series obtained with the Optical Monitor on board XMM-Newton for V2416 Sgr and NSV 25735 support the proposed scenario in which the X-ray emission is produced in a shock-heated region inside the symbiotic nebulae.

  9. The Lyα luminosity function at z = 5.7 - 6.6 and the steep drop of the faint end: implications for reionization

    Science.gov (United States)

    Santos, Sérgio; Sobral, David; Matthee, Jorryt

    2016-12-01

    We present new results from the widest narrow-band survey search for Lyα emitters at z = 5.7, just after reionization. We survey a total of 7 deg^2 spread over the COSMOS, UDS and SA22 fields. We find over 11 000 line emitters, out of which 514 are robust Lyα candidates at z = 5.7 within a volume of 6.3 × 10^6 Mpc^3. Our Lyα emitters span a wide range in Lyα luminosities, from faint to bright (L_Lyα ~ 10^42.5-44 erg s^-1) and rest-frame equivalent widths (EW_0 ~ 25-1000 Å) in a single, homogeneous data set. By combining all our fields, we find that the faint end slope of the z = 5.7 Lyα luminosity function is very steep, with α = -2.3^{+0.4}_{-0.3}. We also present an updated z = 6.6 Lyα luminosity function, based on comparable volumes and obtained with the same methods, which we directly compare with that at z = 5.7. We find a significant decline of the number density of faint Lyα emitters from z = 5.7 to 6.6 (by 0.5 ± 0.1 dex), but no evolution at the bright end/no evolution in L*. Faint Lyα emitters at z = 6.6 show much more extended haloes than those at z = 5.7, suggesting that neutral Hydrogen plays an important role, increasing the scattering and leading to observations missing faint Lyα emission within the epoch of reionization. Altogether, our results suggest that we are observing patchy reionization which happens first around the brightest Lyα emitters, allowing the number densities of those sources to remain unaffected by the increase of neutral Hydrogen fraction from z ~ 5 to 7.
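
    For reference, luminosity functions such as those above are conventionally parameterized with the Schechter form, in which the faint-end slope α quoted in the abstract controls the abundance of faint emitters (this is the standard parameterization, not a new result):

        \Phi(L)\,\mathrm{d}L \;=\; \Phi^{*}\left(\frac{L}{L^{*}}\right)^{\alpha}
        \exp\!\left(-\frac{L}{L^{*}}\right)\frac{\mathrm{d}L}{L^{*}},

    so a faint-end slope as steep as α ≈ -2.3 implies that the number density of emitters keeps rising rapidly toward the faintest luminosities probed.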

  10. VUV Testing of Science Cameras at MSFC: QE Measurement of the CLASP Flight Cameras

    Science.gov (United States)

    Champey, Patrick R.; Kobayashi, Ken; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, B.; Beabout, D.; Stewart, M.

    2015-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras were built and tested for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The CLASP camera design includes a frame-transfer e2v CCD57-10 512x512 detector, dual channel analog readout electronics and an internally mounted cold block. At the flight operating temperature of -20 C, the CLASP cameras achieved the low-noise performance requirements (less than or equal to 25 e- read noise and less than or equal to 10 e-/sec/pix dark current), in addition to maintaining a stable gain of approximately 2.0 e-/DN. The e2v CCD57-10 detectors were coated with Lumogen-E to improve quantum efficiency (QE) at the Lyman-alpha wavelength. A vacuum ultra-violet (VUV) monochromator and a NIST calibrated photodiode were employed to measure the QE of each camera. Four flight-like cameras were tested in a high-vacuum chamber, which was configured to operate several tests intended to verify the QE, gain, read noise, dark current and residual non-linearity of the CCD. We present and discuss the QE measurements performed on the CLASP cameras. We also discuss the high-vacuum system outfitted for testing of UV and EUV science cameras at MSFC.
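
    A hedged sketch of the arithmetic behind such a QE measurement follows: the calibrated photodiode current fixes the incident photon rate, which is then compared with the electron rate registered by the CCD. All names, arguments, and the geometric factor are illustrative assumptions, not the CLASP test procedure.

        ELECTRON_CHARGE = 1.602e-19  # coulombs

        def quantum_efficiency(ccd_signal_e_per_s, photodiode_current_a,
                               photodiode_qe, geometric_factor=1.0):
            # Photon rate implied by the reference photodiode (photodiode_qe = electrons per photon)
            photon_rate = photodiode_current_a / (ELECTRON_CHARGE * photodiode_qe)
            # Fraction of incident photons converted to detected electrons in the CCD
            return ccd_signal_e_per_s / (photon_rate * geometric_factor)

        # Example with made-up numbers: 1 nA on a 25%-efficient diode, 3.9e8 e-/s on the CCD
        print(quantum_efficiency(3.9e8, 1e-9, 0.25))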

  11. Featured Image: Revealing Hidden Objects with Color

    Science.gov (United States)

    Kohler, Susanna

    2018-02-01

    Stunning color astronomical images can often be the motivation for astronomers to continue slogging through countless data files, calculations, and simulations as we seek to understand the mysteries of the universe. But sometimes the stunning images can, themselves, be the source of scientific discovery. This is the case with the image of Lynds Dark Nebula 673, located in the Aquila constellation, that was captured with the Mayall 4-meter telescope at Kitt Peak National Observatory by a team of scientists led by Travis Rector (University of Alaska Anchorage). After creating the image with a novel color-composite imaging method that reveals faint Hα emission (visible in red), Rector and collaborators identified the presence of a dozen new Herbig-Haro objects, small cloud patches that are caused when material is energetically flung out from newly born stars. The adapted image shows three of the new objects, HH 118789, aligned with two previously known objects, HH 32 and 332, suggesting they are driven by the same source. For more beautiful images and insight into the authors' discoveries, check out the article: T. A. Rector et al 2018 ApJ 852 13. doi:10.3847/1538-4357/aa9ce1 [Image credit: T. A. Rector (University of Alaska Anchorage) and H. Schweiker (WIYN and NOAO/AURA/NSF)]

  12. Design of achromatic and apochromatic plastic micro-objectives.

    Science.gov (United States)

    Greisukh, Grigoriy I; Ezhov, Evgeniy G; Levin, Il'ya A; Stepanov, Sergei A

    2010-08-10

    The possibility and the efficiency of using a single diffractive lens to achromatize and apochromatize micro-objectives with plastic lenses are shown. In addition, recommendations are given on assembling the starting configurations of the objectives and calculating the design parameters required for subsequent optimization. It is also shown that the achievable optical performance of achromatic and apochromatic micro-objectives with plastic lenses satisfies the qualifying standards for cell-phone objectives and closed-circuit television (CCTV) cameras.

  13. A sampling ultra-high-speed streak camera based on the use of a unique photomultiplier

    International Nuclear Information System (INIS)

    Marode, Emmanuel

    An apparatus reproducing the ''streak'' mode of a high-speed camera is proposed for the case of a slit AB whose variations in luminosity are repetitive. A photomultiplier, analysing the object AB point by point, and a still camera, photographing a slit fixed on the oscilloscope screen parallel to the sweep direction, are placed on a mobile platform P. The movement of P assures a time-resolved analysis of AB. The resolution is of the order of 2 × 10^-9 s, and can be improved [fr

  14. Night sky quality monitoring in existing and planned dark sky parks by digital cameras

    Directory of Open Access Journals (Sweden)

    Zoltán Kolláth

    2017-06-01

    Full Text Available A crucial part of the qualification of international dark sky places (IDSPs) is the objective measurement of night time sky luminance or radiance. Modern digital cameras provide an alternative way to perform all sky imaging either by a fisheye lens or by a mosaic image taken by a wide angle lens. Here we present a method for processing raw camera images to obtain calibrated measurements of sky quality. The comparison of the night sky quality of different European locations is also presented to demonstrate the use of our technique.
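
    A minimal sketch of the kind of conversion involved is shown below: once a raw frame has been dark-subtracted, flat-fielded and photometrically zero-pointed, the background level per pixel maps to a surface brightness in magnitudes per square arcsecond. The function name, the zero-point convention and the per-filter calibration are illustrative assumptions, not the authors' pipeline.

        import numpy as np

        def sky_surface_brightness(counts_per_s, zero_point_mag, pixel_scale_arcsec):
            # Surface brightness in mag/arcsec^2 from calibrated background counts.
            # zero_point_mag: magnitude of a source giving 1 count/s through this lens and filter
            # pixel_scale_arcsec: angular size of one pixel on the sky
            pixel_area = pixel_scale_arcsec ** 2
            return zero_point_mag - 2.5 * np.log10(counts_per_s / pixel_area)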

  15. Evolutionary Fuzzy Block-Matching-Based Camera Raw Image Denoising.

    Science.gov (United States)

    Yang, Chin-Chang; Guo, Shu-Mei; Tsai, Jason Sheng-Hong

    2017-09-01

    An evolutionary fuzzy block-matching-based image denoising algorithm is proposed to remove noise from a camera raw image. Recently, variance stabilization transforms have been widely used to stabilize the noise variance, so that a Gaussian denoising algorithm can be used to remove the signal-dependent noise in camera sensors. However, in the stabilized domain, existing denoising algorithms may blur too much detail. To provide a better estimate of the noise-free signal, a new block-matching approach is proposed to find similar blocks by the use of a type-2 fuzzy logic system (FLS). These similar blocks are then averaged with weightings determined by the FLS. Finally, an efficient differential evolution is used to further improve the performance of the proposed denoising algorithm. The experimental results show that the proposed algorithm effectively improves image denoising performance. Furthermore, the average performance of the proposed method is better than that of two state-of-the-art image denoising algorithms in both subjective and objective measures.
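
    As a minimal sketch of the variance-stabilization step referred to above (here the classical Anscombe transform for Poisson-dominated counts; the paper's camera-raw noise model, type-2 fuzzy weighting and differential-evolution tuning are not reproduced), raw counts are transformed so the noise becomes approximately unit-variance Gaussian, blocks are matched and averaged in that domain, and the result is mapped back.

        import numpy as np

        def anscombe(x):
            # Variance-stabilizing transform for Poisson-distributed counts:
            # after the transform the noise is approximately unit-variance Gaussian.
            return 2.0 * np.sqrt(np.asarray(x, dtype=np.float64) + 3.0 / 8.0)

        def inverse_anscombe(y):
            # Simple algebraic inverse (an exact unbiased inverse is more involved).
            return (np.asarray(y) / 2.0) ** 2 - 3.0 / 8.0

        def block_distance(a, b):
            # Dissimilarity used to collect candidate blocks before weighted averaging.
            return np.mean((a - b) ** 2)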

  16. Simulation-based camera navigation training in laparoscopy-a randomized trial.

    Science.gov (United States)

    Nilsson, Cecilia; Sorensen, Jette Led; Konge, Lars; Westen, Mikkel; Stadeager, Morten; Ottesen, Bent; Bjerrum, Flemming

    2017-05-01

    Inexperienced operating assistants are often tasked with the important role of handling camera navigation during laparoscopic surgery. Incorrect handling can lead to poor visualization, increased operating time, and frustration for the operating surgeon, all of which can compromise patient safety. The objectives of this trial were to examine how to train laparoscopic camera navigation and to explore the transfer of skills to the operating room. A randomized, single-center superiority trial with three groups was conducted: the first group practiced simulation-based camera navigation tasks (camera group), the second group practiced performing a simulation-based cholecystectomy (procedure group), and the third group received no training (control group). Participants were surgical novices without prior laparoscopic experience. The primary outcome was assessment of camera navigation skills during a laparoscopic cholecystectomy. The secondary outcome was technical skills after training, using a previously developed model for testing camera navigation skills. The exploratory outcome measured participants' motivation toward the task as an operating assistant. Thirty-six participants were randomized. No significant difference was found in the primary outcome between the three groups (p = 0.279). The secondary outcome showed no significant difference between the intervention groups, with total times of 167 s (95% CI, 118-217) and 194 s (95% CI, 152-236) for the camera group and the procedure group, respectively (p = 0.369). Both intervention groups were significantly faster than the control group, 307 s (95% CI, 202-412), p = 0.018 and p = 0.045, respectively. On the exploratory outcome, the control group had a higher score on two dimensions: interest/enjoyment (p = 0.030) and perceived choice (p = 0.033). Simulation-based training improves the technical skills required for camera navigation, regardless of whether camera navigation or the procedure itself is practiced. Transfer to the

  17. Principle of some gamma cameras (efficiencies, limitations, development)

    International Nuclear Information System (INIS)

    Allemand, R.; Bourdel, J.; Gariod, R.; Laval, M.; Levy, G.; Thomas, G.

    1975-01-01

    The quality of scintigraphic images is shown to depend on the efficiency of both the input collimator and the detector. Methods are described by which the quality of these images may be improved by adaptations to either the collimator (Fresnel zone camera, Compton effect camera) or the detector (Anger camera, image amplification camera). The Anger camera and image amplification camera are at present the two main instruments whereby acceptable spatial and energy resolutions may be obtained. A theoretical comparative study of their efficiencies is carried out, independently of their technological differences, after which the instruments designed or under study at the LETI are presented: these include the image amplification camera, the electron amplifier tube camera using a semi-conductor target CdTe and HgI2 detector [fr

  18. GPM GROUND VALIDATION DC-8 CAMERA NADIR GCPEX V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The GPM Ground Validation DC-8 Camera Nadir GCPEx dataset contains geo-located visible-wavelength imagery of the ground obtained from the nadir camera aboard the...

  20. Declarative camera control for automatic cinematography

    Energy Technology Data Exchange (ETDEWEB)

    Christianson, D.B.; Anderson, S.E.; Li-wei He [Univ. of Washington, Seattle, WA (United States)] [and others]

    1996-12-31

    Animations generated by interactive 3D computer graphics applications are typically portrayed either from a particular character's point of view or from a small set of strategically-placed viewpoints. By ignoring camera placement, such applications fail to realize important storytelling capabilities that have been explored by cinematographers for many years. In this paper, we describe several of the principles of cinematography and show how they can be formalized into a declarative language, called the Declarative Camera Control Language (DCCL). We describe the application of DCCL within the context of a simple interactive video game and argue that DCCL represents cinematic knowledge at the same level of abstraction as expert directors by encoding 16 idioms from a film textbook. These idioms produce compelling animations, as demonstrated on the accompanying videotape.