WorldWideScience

Sample records for narrow angle camera

  1. Telescope and mirrors development for the monolithic silicon carbide instrument of the OSIRIS narrow angle camera

    Science.gov (United States)

    Calvel, Bertrand; Castel, Didier; Standarovski, Eric; Rousset, Gérard; Bougoin, Michel

    2017-11-01

    The international Rosetta mission, now planned by ESA for launch in January 2003, will provide a unique opportunity to directly study the nucleus of comet 46P/Wirtanen and its activity in 2013. We describe here the design, development, and performance of the silicon carbide telescope of the Narrow Angle Camera of the OSIRIS experiment, which will provide high-resolution images of the cometary nucleus in the visible spectrum. The development of the mirrors is detailed in particular. The SiC parts were manufactured by BOOSTEC, polished by STIGMA OPTIQUE, and ion-figured by IOM under the prime contractorship of ASTRIUM, which was also in charge of the alignment. The final optical quality of the aligned telescope is a wavefront error of 30 nm rms.

  2. Ocular Biometrics of Myopic Eyes With Narrow Angles.

    Science.gov (United States)

    Chong, Gabriel T; Wen, Joanne C; Su, Daniel Hsien-Wen; Stinnett, Sandra; Asrani, Sanjay

    2016-02-01

    The purpose of this study was to compare ocular biometrics between myopic patients with and without narrow angles. Patients with a stable myopic refraction (myopia worse than -1.00 D spherical equivalent) were prospectively recruited. Angle status was assessed using gonioscopy, and biometric measurements were performed using anterior segment optical coherence tomography and an IOLMaster. A total of 29 patients (58 eyes) were enrolled, with 13 patients (26 eyes) classified as having narrow angles and 16 patients (32 eyes) classified as having open angles. Baseline demographics of age, sex, and ethnicity did not differ significantly between the 2 groups. The patients with narrow angles were on average older than those with open angles, but the difference did not reach statistical significance (P=0.12). The central anterior chamber depth was significantly less in the eyes with narrow angles (P=0.05). However, the average lens thickness, although greater in the eyes with narrow angles, did not reach statistical significance (P=0.10). Refractive error, axial length, and iris thickness did not differ significantly between the 2 groups (P=0.32, 0.47, and 0.15, respectively). Narrow angles can occur in myopic eyes. Routine gonioscopy is therefore recommended for all patients regardless of refractive error.

  3. Multi-Angle Snowflake Camera Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Stuefer, Martin [Univ. of Alaska, Fairbanks, AK (United States)]; Bailey, J. [Univ. of Alaska, Fairbanks, AK (United States)]

    2016-07-01

    The Multi-Angle Snowflake Camera (MASC) takes 9- to 37-micron resolution stereographic photographs of free-falling hydrometeors from three angles, while simultaneously measuring their fall speed. Information about hydrometeor size, shape, orientation, and aspect ratio is derived from MASC photographs. The instrument consists of three commercial cameras separated by angles of 36°. Each camera's field of view is aligned to a common focal point about 10 cm from the cameras. Two near-infrared emitter-detector pairs are aligned with the cameras' fields of view within an angular ring and detect hydrometeor passage, with the lower emitters configured to trigger the MASC cameras. The sensitive IR motion sensors are designed to filter out slow variations in ambient light. Fall speed is derived from successive triggers along the fall path. The camera exposure times are extremely short, on the order of 1/25,000th of a second, enabling the MASC to capture snowflake sizes ranging from 30 micrometers to 3 cm.

  4. Associations between Narrow Angle and Adult Anthropometry: The Liwan Eye Study

    Science.gov (United States)

    Jiang, Yuzhen; He, Mingguang; Friedman, David S.; Khawaja, Anthony P.; Lee, Pak Sang; Nolan, Winifred P.; Yin, Qiuxia; Foster, Paul J.

    2015-01-01

    Purpose To assess the associations between narrow angle and adult anthropometry. Methods Chinese adults aged 50 years and older were recruited from a population-based survey in the Liwan District of Guangzhou, China. Narrow angle was defined as the posterior trabecular meshwork not being visible under static gonioscopy in at least three quadrants (i.e. a circumference of at least 270°). Logistic regression models were used to examine the associations between narrow angle and anthropometric measures (height, weight and body mass index, BMI). Results Among the 912 participants, lower weight, shorter height, and lower BMI were significantly associated with narrower angle width (tests for trend: mean angle width in degrees vs weight p<0.001; vs height p<0.001; vs BMI p=0.012). In univariate analyses, shorter height, lower weight and lower BMI were all significantly associated with greater odds of narrow angle. The crude association between height and narrow angle was largely attributable to stronger associations with age and sex. Lower BMI and weight remained significantly associated with narrow angle after adjustment for height, age, sex, axial ocular biometric measures and education. In analyses stratified by sex, the association between BMI and narrow angle was only observed in women. Conclusion Lower BMI and weight were associated with significantly greater odds of narrow angle after adjusting for age, education, axial ocular biometric measures and height. The odds of narrow angle increased by 7% per 1-unit decrease in BMI. This association was most evident in women. PMID:24707840
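
    As a rough illustration of the effect size reported above, the 7% increase in odds per 1-unit decrease in BMI corresponds to a logistic regression coefficient of about -ln(1.07) per unit increase in BMI; a minimal sketch (only the 7% figure comes from the abstract, everything else is illustrative):

```python
import numpy as np

# The abstract reports ~7% higher odds of narrow angle per 1-unit decrease in BMI.
# In a logistic regression with BMI as a predictor, that corresponds to a
# coefficient beta_bmi (per +1 BMI unit) with exp(-beta_bmi) ~= 1.07.
beta_bmi = -np.log(1.07)                  # hypothetical fitted coefficient
or_per_unit_decrease = np.exp(-beta_bmi)
print(f"OR per 1-unit BMI decrease: {or_per_unit_decrease:.2f}")  # -> 1.07
```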

  5. Non-contact measurement of rotation angle with solo camera

    Science.gov (United States)

    Gan, Xiaochuan; Sun, Anbin; Ye, Xin; Ma, Liqun

    2015-02-01

    To measure the rotation angle of an object about its axis, a non-contact rotation angle measurement method based on a single camera is proposed. The intrinsic parameters of the camera were calibrated with a chessboard according to planar calibration theory. The translation matrix and rotation matrix between the object coordinate system and the camera coordinate system were calculated from the relationship between the corners' positions on the object and their coordinates in the image. The rotation angle between the measured object and the camera could then be resolved from the rotation matrix. A precise angle dividing table (PADT) was chosen as the reference to verify the angle measurement error of this method. Test results indicated that the rotation angle measurement error of this method did not exceed ±0.01 degree.
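
    A minimal sketch of this measurement chain using the standard OpenCV API (the chessboard geometry and the choice of in-plane angle extraction are illustrative assumptions, not details from the paper):

```python
import cv2
import numpy as np

# Hypothetical chessboard geometry: inner-corner grid and square size in mm.
pattern = (9, 6)
square_mm = 10.0
obj_pts = np.zeros((pattern[0] * pattern[1], 3), np.float32)
obj_pts[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm

def rotation_angle_deg(image, camera_matrix, dist_coeffs):
    """Estimate the board's rotation about the camera's optical (z) axis, in degrees."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        raise RuntimeError("chessboard pattern not found")
    # Pose of the board in the camera frame from 2D-3D correspondences.
    _, rvec, tvec = cv2.solvePnP(obj_pts, corners, camera_matrix, dist_coeffs)
    R, _ = cv2.Rodrigues(rvec)                      # rotation matrix
    return np.degrees(np.arctan2(R[1, 0], R[0, 0]))  # in-plane rotation component
```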

  6. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    Science.gov (United States)

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future human lunar exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently shadowed and permanently sunlit polar regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  7. Association of lens vault with narrow angles among different ethnic groups.

    Science.gov (United States)

    Lee, Roland Y; Huang, Guofu; Cui, Qi N; He, Mingguang; Porco, Travis C; Lin, Shan C

    2012-06-01

    To compare lens vault between open-angle and narrow-angle eyes in African-, Caucasian-, Hispanic-, Chinese- and Filipino-Americans. In this prospective study, 436 patients with open angle and narrow angle based on the Shaffer gonioscopic grading classification underwent anterior-segment optical coherence tomography. The Zhongshan Angle Assessment Program was used to calculate lens vault. The narrow-angle group included 32 Chinese-Americans, 22 Filipino-Americans, 26 African-Americans, 24 Hispanic-Americans and 73 Caucasian-Americans. The open-angle group included 56 Chinese-Americans, 29 Filipino-Americans, 45 African-Americans, 27 Hispanic-Americans and 102 Caucasian-Americans. Linear mixed-effects regression models, accounting for the use of both eyes and adjusting for age, sex, pupil diameter and spherical equivalent, were used to test the ethnicity and angle coefficients. Tukey's multiple comparison test was used for pairwise comparisons among the open-angle racial groups. A significant difference in lens vault was found among the open-angle racial groups (P = 0.022). For the open-angle patients, mean lens vault measurements were 265 ± 288 µm for Chinese-Americans, 431 ± 248 µm for Caucasian-Americans, 302 ± 213 µm for Filipino-Americans, 304 ± 263 µm for Hispanic-Americans and 200 ± 237 µm for African-Americans. Using Tukey's multiple comparison test for pairwise comparisons among the open-angle racial groups, a significant difference was found between the African-American and Caucasian-American groups; P values for the remaining pairwise comparisons were not statistically significant. No significant difference was found among the narrow-angle racial groups (P = 0.14). Comparison between open-angle and narrow-angle eyes within each racial group revealed a significant difference for all racial groups (P < 0.05). Among all the ethnicities included in this study, narrow-angle eyes have greater lens vault than open-angle eyes.

  8. High prevalence of narrow angles among Filipino-American patients.

    Science.gov (United States)

    Seider, Michael I; Sáles, Christopher S; Lee, Roland Y; Agadzi, Anthony K; Porco, Travis C; Weinreb, Robert N; Lin, Shan C

    2011-03-01

    To determine the prevalence of gonioscopically narrow anterior chamber angles in a Filipino-American clinic population. The records of 122 consecutive, new, self-declared Filipino-American patients examined in a comprehensive ophthalmology clinic in Vallejo, California were reviewed retrospectively. After exclusion, 222 eyes from 112 patients remained for analysis. Data were collected for anterior chamber angle grade as determined by gonioscopy (Shaffer system), age, sex, manifest refraction (spherical equivalent), intraocular pressure, and cup-to-disk ratio. Data from both eyes of patients were included and modeled using standard linear mixed-effects regression. As a comparison, data were also collected from a group of 30 consecutive White patients from the same clinic. After exclusion, 50 eyes from 25 White patients remained for comparison. At least 1 eye of 24% of Filipino-American patients had a narrow anterior chamber angle (Shaffer grade ≤ 2). Filipino-American angle grade significantly decreased with increasingly hyperopic refraction (P=0.007) and larger cup-to-disk ratio (P=0.038). Filipino-American women had significantly decreased angle grades compared with men (P=0.028), but angle grade did not vary by intraocular pressure or age (all, P≥ 0.059). Narrow anterior chamber angles are highly prevalent in Filipino-American patients in our clinic population.

  9. High Prevalence of Narrow Angles among Chinese-American Glaucoma and Glaucoma Suspect Patients

    Science.gov (United States)

    Seider, Michael I; Pekmezci, Melike; Han, Ying; Sandhu, Simi; Kwok, Shiu Y; Lee, Roland Y; Lin, Shan C

    2009-01-01

    Purpose To evaluate the prevalence of gonioscopically narrow angles in a Chinese-American population with glaucoma or glaucoma suspicion. Patients and Methods Charts from all Chinese-American patients seen in a comprehensive ophthalmology clinic in the Chinatown district of San Francisco in 2002 were reviewed. One eye from each patient with glaucoma or glaucoma suspicion that met inclusion criteria was included (n=108). Data were collected for gender, age, race (self-declared), refraction (spherical equivalent), intraocular pressure (IOP), gonioscopy and vertical cup-to-disk ratio (CDR). Results Sixty percent (n=65) of Chinese-American eyes with glaucoma or glaucoma suspicion had gonioscopically narrow angles (Shaffer grade ≤2 in three or more quadrants). Those with narrow angles were significantly older (P=0.004) than their open angle counterparts, but the two groups did not differ in terms of gender, refraction, IOP or CDR (all, P≥0.071). In a multivariate model including age, gender and refraction as predictors of angle grade (open or narrow), only age was a significant predictor of angle grade (P=0.004). Conclusions A large proportion of Chinese-Americans in our study population with glaucoma or glaucoma suspicion had gonioscopically narrow angles. In multivariate analysis, patients with narrow angles were older than those with open angles but did not differ from them in terms of gender or refraction. Continued evaluation of angle closure glaucoma risk among Chinese-Americans is needed. PMID:19826385

  10. High prevalence of narrow angles among Chinese-American glaucoma and glaucoma suspect patients.

    Science.gov (United States)

    Seider, Michael I; Pekmezci, Melike; Han, Ying; Sandhu, Simi; Kwok, Shiu Y; Lee, Roland Y; Lin, Shan C

    2009-01-01

    To evaluate the prevalence of gonioscopically narrow angles in a Chinese-American population with glaucoma or glaucoma suspicion. Charts from all Chinese-American patients seen in a comprehensive ophthalmology clinic in the Chinatown district of San Francisco in 2002 were reviewed. One eye from each patient with glaucoma or glaucoma suspicion that met inclusion criteria was included (n=108). Data were collected for sex, age, race (self-declared), refraction (spherical equivalent), intraocular pressure, gonioscopy, and vertical cup-to-disk ratio. Sixty percent (n=65) of Chinese-American eyes with glaucoma or glaucoma suspicion had gonioscopically narrow angles (Shaffer grade ≤2 in three or more quadrants). Those with narrow angles were significantly older (P=0.004) than their open-angle counterparts, but the two groups did not differ in terms of sex, refraction, intraocular pressure, or cup-to-disk ratio (all, P≥0.071). In a multivariate model including age, sex, and refraction as predictors of angle grade (open or narrow), only age was a significant predictor of angle grade (P=0.004). A large proportion of Chinese-Americans in our study population with glaucoma or glaucoma suspicion had gonioscopically narrow angles. In multivariate analysis, patients with narrow angles were older than those with open angles but did not differ from them in terms of sex or refraction. Continued evaluation of angle closure glaucoma risk among Chinese-Americans is needed.

  11. Multi-Angle Snowflake Camera Value-Added Product

    Energy Technology Data Exchange (ETDEWEB)

    Shkurko, Konstantin [Univ. of Utah, Salt Lake City, UT (United States)]; Garrett, T. [Univ. of Utah, Salt Lake City, UT (United States)]; Gaustad, K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]

    2016-12-01

    The Multi-Angle Snowflake Camera (MASC) addresses a need for high-resolution multi-angle imaging of hydrometeors in freefall with simultaneous measurement of fallspeed. As illustrated in Figure 1, the MASC consists of three cameras, separated by 36°, each pointing at an identical focal point approximately 10 cm away. Located immediately above each camera, a light aims directly at the center of the depth of field of its corresponding camera. The focal point at which the cameras are aimed lies within a ring through which hydrometeors fall. The ring houses a system of near-infrared emitter-detector pairs, arranged in two arrays separated vertically by 32 mm. When hydrometeors pass through the lower array, they simultaneously trigger all cameras and lights. Fallspeed is calculated from the time it takes to traverse the distance between the upper and lower triggering arrays. The trigger electronics filter out ambient light fluctuations associated with varying sunlight and shadows. The microprocessor onboard the MASC controls the camera system and communicates with the personal computer (PC). The image data are sent via a FireWire 800 line, and fallspeed (and camera control) via a Universal Serial Bus (USB) line that relies on RS232-over-USB serial conversion. See Table 1 for specific details on the MASC located at the Oliktok Point Mobile Facility on the North Slope of Alaska. The value-added product (VAP) detailed in this documentation analyzes the raw data (Section 2.0) using Python: image processing relies on the OpenCV library, and derived aggregate statistics are computed by averaging. See Sections 4.1 and 4.2 for more details on which variables are computed.
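
    The fallspeed computation described above reduces to dividing the 32 mm vertical separation of the trigger arrays by the traversal time; a minimal sketch (the timestamps in the example are hypothetical):

```python
ARRAY_SEPARATION_M = 0.032  # vertical spacing between the upper and lower IR trigger arrays

def fall_speed(t_upper_s: float, t_lower_s: float) -> float:
    """Fallspeed (m/s) from the times a hydrometeor crosses the upper and lower arrays."""
    dt = t_lower_s - t_upper_s
    if dt <= 0:
        raise ValueError("lower-array trigger must follow the upper-array trigger")
    return ARRAY_SEPARATION_M / dt

# Example: 16 ms between triggers corresponds to a fallspeed of 2.0 m/s.
print(fall_speed(0.000, 0.016))
```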

  12. Assessment of narrow angles by gonioscopy, Van Herick method and anterior segment optical coherence tomography.

    Science.gov (United States)

    Park, Seong Bae; Sung, Kyung Rim; Kang, Sung Yung; Jo, Jung Woo; Lee, Kyoung Sub; Kook, Michael S

    2011-07-01

    To evaluate anterior chamber (AC) angles using gonioscopy, the Van Herick technique and anterior segment optical coherence tomography (AS-OCT). One hundred forty-eight consecutive subjects were enrolled. The agreement between any two of the three diagnostic methods, gonioscopy, AS-OCT and Van Herick, was calculated in narrow-angle patients. The area under the receiver-operating characteristic curve (AUC) for discriminating between narrow and open angles determined by gonioscopy was calculated in all participants for the AS-OCT parameters angle opening distance (AOD), angle recess area, trabecular-iris surface area and anterior chamber depth (ACD). As a subgroup analysis, the capability of AS-OCT parameters for detecting angle closure defined by AS-OCT was assessed in narrow-angle patients. The agreement between the Van Herick method and gonioscopy in detecting angle closure was excellent in narrow angles (κ = 0.80, temporal; κ = 0.82, nasal). However, agreement between gonioscopy and AS-OCT and between the Van Herick method and AS-OCT was poor (κ = 0.11-0.16). The capability of AS-OCT parameters to discriminate between open and narrow angles determined by gonioscopy was excellent for all AS-OCT parameters (AUC, temporal: AOD500 = 0.96; nasal: AOD500 = 0.99). The AUCs for detecting angle closure defined by AS-OCT images in narrow-angle subjects were good for all AS-OCT parameters (AUC, 0.80-0.94) except for ACD (temporal: ACD = 0.70; nasal: ACD = 0.63). Assessment of narrow angles by gonioscopy and the Van Herick technique showed good agreement, but both measurements revealed poor agreement with AS-OCT. The angle closure detection capability of AS-OCT parameters was excellent; however, it was slightly lower for ACD.

  13. Photogrammetric measurement of 3D freeform millimetre-sized objects with micro features: an experimental validation of the close-range camera calibration model for narrow angles of view

    Science.gov (United States)

    Percoco, Gianluca; Sánchez Salmerón, Antonio J.

    2015-09-01

    The measurement of millimetre and micro-scale features is performed by high-cost systems based on technologies with narrow working ranges to accurately control the position of the sensors. Photogrammetry would lower the costs of 3D inspection of micro-features and would also be applicable to the inspection of non-removable micro parts of large objects. Unfortunately, the behaviour of photogrammetry when applied to micro-features is not well known. In this paper, the authors address these issues with a view to applying digital close-range photogrammetry (DCRP) at the micro-scale, taking into account that the literature reports an angle of view (AOV) of around 10° as the lower limit for applying the traditional pinhole close-range calibration model (CRCM), which is the basis of DCRP. First, a general calibration procedure is introduced, with the aid of an open-source software library, to calibrate narrow-AOV cameras with the CRCM. Subsequently the procedure is validated using a reflex camera with a 60 mm macro lens, equipped with extension tubes (20 and 32 mm) achieving magnifications of up to approximately 2×, to verify the literature findings with experimental photogrammetric 3D measurements of millimetre-sized objects with micro-features. The limitations of laser printing, used to produce the two-dimensional calibration pattern on plain paper, were overcome using an accurate pattern manufactured with a photolithographic process. The results of the experimental activity prove that the CRCM is valid for AOVs down to 3.4° and that DCRP results are comparable with the results of existing and more expensive commercial techniques.
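
    The narrow angles of view quoted above follow from the pinhole relation between sensor size and the effective image distance of a lens focused at high magnification; a small sketch (the sensor width and the thin-lens approximation are illustrative assumptions, not values from the paper):

```python
import math

def angle_of_view_deg(sensor_width_mm: float, focal_length_mm: float, magnification: float) -> float:
    """Approximate horizontal angle of view for a lens focused at a given magnification,
    using the thin-lens image distance f * (1 + m)."""
    image_distance_mm = focal_length_mm * (1.0 + magnification)
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * image_distance_mm)))

# Assumed values: ~23.6 mm wide APS-C sensor, 60 mm macro lens near 2x magnification.
print(f"{angle_of_view_deg(23.6, 60.0, 2.0):.1f} deg")
```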

  14. Photogrammetric measurement of 3D freeform millimetre-sized objects with micro features: an experimental validation of the close-range camera calibration model for narrow angles of view

    International Nuclear Information System (INIS)

    Percoco, Gianluca; Sánchez Salmerón, Antonio J

    2015-01-01

    The measurement of millimetre and micro-scale features is performed by high-cost systems based on technologies with narrow working ranges to accurately control the position of the sensors. Photogrammetry would lower the costs of 3D inspection of micro-features and would also be applicable to the inspection of non-removable micro parts of large objects. Unfortunately, the behaviour of photogrammetry when applied to micro-features is not well known. In this paper, the authors address these issues with a view to applying digital close-range photogrammetry (DCRP) at the micro-scale, taking into account that the literature reports an angle of view (AOV) of around 10° as the lower limit for applying the traditional pinhole close-range calibration model (CRCM), which is the basis of DCRP. First, a general calibration procedure is introduced, with the aid of an open-source software library, to calibrate narrow-AOV cameras with the CRCM. Subsequently the procedure is validated using a reflex camera with a 60 mm macro lens, equipped with extension tubes (20 and 32 mm) achieving magnifications of up to approximately 2×, to verify the literature findings with experimental photogrammetric 3D measurements of millimetre-sized objects with micro-features. The limitations of laser printing, used to produce the two-dimensional calibration pattern on plain paper, were overcome using an accurate pattern manufactured with a photolithographic process. The results of the experimental activity prove that the CRCM is valid for AOVs down to 3.4° and that DCRP results are comparable with the results of existing and more expensive commercial techniques. (paper)

  15. THE TREATMENT OF OPEN- AND NARROW-ANGLE GLAUCOMA

    African Journals Online (AJOL)

    1971-04-10

    Apr 10, 1971 ... glaucoma will be considered: narrow-angle glaucoma (acute glaucoma) and ... emotional or a physical crisis. The pain is in the distribu- ... ness, not increased pressure, haunts people suffering from glaucoma.' The saga of ...

  16. Predictors of Intraocular Pressure After Phacoemulsification in Primary Open-Angle Glaucoma Eyes with Wide Versus Narrower Angles (An American Ophthalmological Society Thesis).

    Science.gov (United States)

    Lin, Shan C; Masis, Marisse; Porco, Travis C; Pasquale, Louis R

    2017-08-01

    To assess whether narrower-angle status and anterior segment optical coherence tomography (AS-OCT) parameters can predict the intraocular pressure (IOP) drop in primary open-angle glaucoma (POAG) patients after cataract surgery. This was a prospective case series of consecutive cataract surgery patients with POAG and no peripheral anterior synechiae (PAS) using a standardized postoperative management protocol. Preoperatively, patients underwent gonioscopy and AS-OCT. The same glaucoma medication regimen was resumed by 1 month. Potential predictors of IOP reduction included narrower-angle status by gonioscopy and angle-opening distance (AOD500), as well as other AS-OCT parameters. Mixed-effects regression adjusted for use of both eyes and other potential confounders. We enrolled 66 eyes of 40 glaucoma patients. The IOP reduction at 1 year was 4.2±3 mm Hg (26%) in the narrower-angle group as classified by gonioscopy. By AOD500 classification, the narrower-angle group had a 3.4±3 mm Hg (21%, P<.001) reduction vs 2.5±3 mm Hg (16%, P<.001) in the wide-angle group (P=.031 for the difference). When the entire cohort was assessed, iris thickness, iris area, and lens vault were correlated with increasing IOP reduction at 1 year (P<.05 for all). In POAG eyes, cataract surgery lowered IOP to a greater degree in the narrower-angle group than in the wide-angle group, and parameters relating to iris thickness and area, as well as lens vault, were correlated with IOP reduction. These findings can guide ophthalmologists in their selection of cataract surgery as a potential management option.

  17. The neutron small-angle camera D11 at the high-flux reactor, Grenoble

    International Nuclear Information System (INIS)

    Ibel, K.

    1976-01-01

    The neutron small-angle scattering system at the high-flux reactor in Grenoble consists of three major parts: the supply of cold neutrons via bent neutron guides; the small-angle camera D11; and the data handling facilities. The camera D11 has an overall length of 80 m. The effective length of the camera is variable. The full length of the collimator before the fixed sample position can be reduced by movable neutron guides; the second flight path, of 40 m full length, contains detector sites in various positions. Thus a large range of momentum transfers can be used with the same relative resolution. Scattering angles between 5 × 10⁻⁴ and 0.5 rad and neutron wavelengths from 0.2 to 2.0 nm are available. A large-area position-sensitive detector is used which allows simultaneous recording of intensities scattered at different angles; it is a multiwire proportional chamber. 3808 elements of 1 cm² are arranged in a two-dimensional matrix. (Auth.)

  18. A single camera photogrammetry system for multi-angle fast localization of EEG electrodes.

    Science.gov (United States)

    Qian, Shuo; Sheng, Yang

    2011-11-01

    Photogrammetry has become an effective method for determining electroencephalography (EEG) electrode positions in three dimensions (3D). Capturing multi-angle images of the electrodes on the head is a fundamental objective in the design of a photogrammetry system for EEG localization. Methods in previous studies are all based on the use of either a rotating camera or multiple cameras, which are time-consuming or not cost-effective. This study aims to present a novel photogrammetry system that can realize simultaneous acquisition of multi-angle head images from a single camera position. With two planar mirrors aligned at an angle of 51.4°, seven views of the head with 25 electrodes are captured simultaneously by a digital camera placed in front of them. A complete set of algorithms for electrode recognition, matching, and 3D reconstruction is developed. The elapsed time of the whole localization procedure is about 3 min, and the camera calibration computation takes about 1 min after the measurement of the calibration points. The positioning accuracy, with a maximum error of 1.19 mm, is acceptable. Experimental results demonstrate that the proposed system provides a fast and cost-effective method for EEG positioning.

  19. ORNL 10-m small-angle X-ray scattering camera

    International Nuclear Information System (INIS)

    Hendricks, R.W.

    1979-12-01

    A new small-angle x-ray scattering camera utilizing a rotating anode x-ray source, crystal monochromatization of the incident beam, pinhole collimation, and a two-dimensional position-sensitive proportional counter was developed. The sample and the resolution element of the detector are each approximately 1 × 1 mm²; the camera was designed so that the focal spot-to-sample and sample-to-detector distances may each be varied in 0.5-m increments up to 5 m to provide a system resolution in the range 0.5 to 4.0 mrad. A large, general-purpose specimen chamber has been provided into which a wide variety of special-purpose specimen holders can be mounted. The detector has an active area of 200 × 200 mm and has up to 200 × 200 resolution elements. The data are recorded in the memory of a minicomputer by a high-speed interface which uses a microprocessor to map the position of an incident photon into an absolute minicomputer memory address. The data recorded in the computer memory can be processed on-line by a variety of programs designed to enhance the user's interaction with the experiment. At the highest angular resolution (0.4 mrad), the flux incident on the specimen is 1.0 × 10⁶ photons/s with the x-ray source operating at 45 kV and 100 mA. SAX and its associated programs OVF and MOT are high-priority, pre-queued, nonresident foreground tasks which run under the ModComp II MAX III operating system to provide complete user control of the ORNL 10-m small-angle x-ray scattering camera
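
    As a rough geometric check on the quoted resolution range, the angular resolution scales as the ~1 mm resolution element divided by the camera length; a minimal order-of-magnitude sketch (the contributions of the focal spot and collimation are ignored here):

```python
def angular_resolution_mrad(element_size_mm: float, camera_length_m: float) -> float:
    """Order-of-magnitude angular resolution: detector element size over sample-to-detector distance.

    mm divided by m gives milliradians directly (small-angle approximation)."""
    return element_size_mm / camera_length_m

# Assumed 1 mm resolution element at the shortest and longest sample-to-detector distances.
for distance_m in (0.5, 5.0):
    print(f"{distance_m} m -> {angular_resolution_mrad(1.0, distance_m):.1f} mrad")
```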

  20. Automatic helmet-wearing detection for law enforcement using CCTV cameras

    Science.gov (United States)

    Wonghabut, P.; Kumphong, J.; Satiennam, T.; Ung-arunyawee, R.; Leelapatra, W.

    2018-04-01

    The objective of this research is to develop an application for enforcing helmet wearing using CCTV cameras. The developed application aims to help law enforcement by police, eventually changing risk behaviours and consequently reducing the number of accidents and their severity. Conceptually, the application software, implemented in C++ using the OpenCV library, uses two CCTV cameras with different angles of view. Video frames recorded by the wide-angle CCTV camera are used to detect motorcyclists. If a motorcyclist without a helmet is found, the zoomed (narrow-angle) CCTV camera is activated to capture an image of the violating motorcyclist and the motorcycle license plate in real time. Captured images are managed by a database implemented in MySQL for ticket issuing. The results show that the developed program is able to detect 81% of motorcyclists on various motorcycle types during daytime and night-time. The validation results reveal that the program achieves 74% accuracy in detecting motorcyclists without helmets.

  1. EVALUATION OF THE QUALITY OF ACTION CAMERAS WITH WIDE-ANGLE LENSES IN UAV PHOTOGRAMMETRY

    OpenAIRE

    Hastedt, H.; Ekkel, T.; Luhmann, T.

    2016-01-01

    The application of light-weight cameras in UAV photogrammetry is required due to restrictions in payload. In general, consumer cameras with normal lens type are applied to a UAV system. The availability of action cameras, like the GoPro Hero4 Black, including a wide-angle lens (fish-eye lens) offers new perspectives in UAV projects. With these investigations, different calibration procedures for fish-eye lenses are evaluated in order to quantify their accuracy potential in UAV photogrammetry....

  2. Winter precipitation particle size distribution measurement by Multi-Angle Snowflake Camera

    Science.gov (United States)

    Huang, Gwo-Jong; Kleinkort, Cameron; Bringi, V. N.; Notaroš, Branislav M.

    2017-12-01

    From the radar meteorology viewpoint, the most important properties for quantitative precipitation estimation of winter events are the 3D shape, size, and mass of precipitation particles, as well as the particle size distribution (PSD). In order to measure these properties precisely, optical instruments may be the best choice. The Multi-Angle Snowflake Camera (MASC) is a relatively new instrument equipped with three high-resolution cameras to capture winter precipitation particle images from three non-parallel angles, in addition to measuring the particle fall speed using two pairs of infrared motion sensors. However, MASC results have so far usually been presented as monthly or seasonal statistics, with particle sizes given as histograms; no previous study has used the MASC for a single-storm analysis, and none has used the MASC to measure the PSD. We propose a methodology for obtaining the winter precipitation PSD measured by the MASC, and present and discuss the development, implementation, and application of the new technique for PSD computation based on MASC images. Overall, this is the first study of a MASC-based PSD. We present PSD MASC experiments and results for segments of two snow events to demonstrate the performance of our PSD algorithm. The results show that the self-consistency of the MASC-measured single-camera PSDs is good. To cross-validate the PSD measurements, we compare the MASC mean PSD (averaged over three cameras) with a collocated 2D Video Disdrometer, and observe good agreement between the two sets of results.
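
    A common way to turn per-particle size measurements into a PSD is to bin the particle maximum dimensions and normalize by the sampled volume and the bin width; a hedged sketch of that generic normalization (the paper's exact sampling-volume treatment is not reproduced here):

```python
import numpy as np

def particle_size_distribution(max_dims_mm, bin_edges_mm, sampled_volume_m3):
    """Concentration N(D) in m^-3 mm^-1 from per-particle maximum dimensions.

    sampled_volume_m3 is the total air volume sampled over the analysis period;
    how that volume is estimated (e.g. from fall speed and sensing area) is an
    assumption left to the caller."""
    counts, edges = np.histogram(max_dims_mm, bins=bin_edges_mm)
    return counts / (sampled_volume_m3 * np.diff(edges))

# Hypothetical example: 500 particles with gamma-distributed sizes, 0.5 mm bins.
sizes_mm = np.random.default_rng(0).gamma(shape=2.0, scale=1.0, size=500)
edges_mm = np.arange(0.0, 10.5, 0.5)
n_d = particle_size_distribution(sizes_mm, edges_mm, sampled_volume_m3=0.1)
```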

  3. Comparison of Scheimpflug imaging and spectral domain anterior segment optical coherence tomography for detection of narrow anterior chamber angles.

    Science.gov (United States)

    Grewal, D S; Brar, G S; Jain, R; Grewal, S P S

    2011-05-01

    To compare the performance of anterior chamber volume (ACV) and anterior chamber depth (ACD) obtained using Scheimpflug imaging with angle opening distance (AOD500) and trabecular-iris space area (TISA500) obtained using spectral domain anterior segment optical coherence tomography (SD-ASOCT) in detecting narrow angles classified using gonioscopy. In this prospective, cross-sectional observational study, 265 eyes of 265 consecutive patients underwent sequential Scheimpflug imaging, SD-ASOCT imaging, and gonioscopy. Correlations between gonioscopy grading, ACV, ACD, AOD500, and TISA500 were evaluated. The area under the receiver operating characteristic curve (AUC), sensitivity, specificity, and likelihood ratios (LRs) were calculated to assess the performance of ACV, ACD, AOD500, and TISA500 in detecting narrow angles (defined as Shaffer grade ≤1 in all quadrants). SD-ASOCT images were obtained at the nasal and temporal quadrants only. Twenty-eight eyes (10.6%) were classified as narrow angles on gonioscopy. ACV correlated with gonioscopy grading (P<0.001) for the temporal (r=0.204), superior (r=0.251), nasal (r=0.213), and inferior (r=0.236) quadrants. ACV correlated with TISA500 for the nasal (r=0.135, P=0.029) and temporal (r=0.160, P=0.009) quadrants, and also with AOD500 for the nasal (r=0.498, P<0.001) and temporal (r=0.517, P<0.001) quadrants. For detection of narrow angles, ACV (AUC=0.935; 95% confidence interval (CI)=0.898-0.961) performed similarly to ACD (AUC=0.88, P=0.06) and significantly better than AOD500 nasal (AUC=0.761, P=0.001), AOD500 temporal (AUC=0.808, P<0.001), TISA500 nasal (AUC=0.756, P<0.001), and TISA500 temporal (AUC=0.738, P<0.001). Using a cutoff of 113 mm³, ACV had 90% sensitivity and 88% specificity for detecting narrow angles. Positive and negative LRs for ACV were 8.63 (95% CI=7.4-10.0) and 0.11 (95% CI=0.03-0.4), respectively. ACV measurements using Scheimpflug imaging outperformed AOD500 and TISA500 using SD-ASOCT for detecting narrow angles.
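
    As an illustration of the reported diagnostic analysis, the AUC and the sensitivity/specificity at a fixed ACV cutoff can be computed as sketched below (the data arrays are hypothetical; only the 113 mm³ cutoff comes from the abstract, and scikit-learn is assumed to be available):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical labels (1 = narrow angle on gonioscopy) and ACV values in mm^3.
narrow = np.array([1, 1, 1, 0, 0, 0, 0, 0])
acv = np.array([95.0, 110.0, 105.0, 140.0, 150.0, 125.0, 160.0, 118.0])

# Lower ACV indicates a narrower angle, so the score is the negated volume.
auc = roc_auc_score(narrow, -acv)

cutoff_mm3 = 113.0  # cutoff reported in the abstract
predicted_narrow = acv <= cutoff_mm3
sensitivity = (predicted_narrow & (narrow == 1)).sum() / (narrow == 1).sum()
specificity = (~predicted_narrow & (narrow == 0)).sum() / (narrow == 0).sum()
print(auc, sensitivity, specificity)
```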

  4. Image reconstruction from limited angle Compton camera data

    International Nuclear Information System (INIS)

    Tomitani, T.; Hirasawa, M.

    2002-01-01

    The Compton camera is used for imaging the distribution of γ-ray arrival directions in γ-ray telescopes for astrophysics, and for imaging radioisotope distributions in nuclear medicine, without the need for collimators. The camera measures the integral of the γ rays over a cone, so some sort of inversion method is needed. Parra found an analytical inversion algorithm based on a spherical harmonics expansion of the projection data. His algorithm is applicable to the full set of projection data. In this paper, six possible reconstruction algorithms that allow image reconstruction from projections with a finite range of scattering angles are investigated. Four algorithms have instability problems and the two others are practical. However, the variance of the reconstructed image diverges in these two cases, so window functions are introduced with which the variance becomes finite at the cost of spatial resolution. These two algorithms are compared in terms of variance. The algorithm based on the inversion of the summed back-projection is superior to the algorithm based on the inversion of the summed projection. (author)

  5. Improved iris localization by using wide and narrow field of view cameras for iris recognition

    Science.gov (United States)

    Kim, Yeong Gon; Shin, Kwang Yong; Park, Kang Ryoung

    2013-10-01

    Biometrics is a method of identifying individuals by their physiological or behavioral characteristics. Among other biometric identifiers, iris recognition has been widely used for various applications that require a high level of security. When a conventional iris recognition camera is used, the size and position of the iris region in a captured image vary according to the X, Y positions of a user's eye and the Z distance between a user and the camera. Therefore, the searching area of the iris detection algorithm is increased, which can inevitably decrease both the detection speed and accuracy. To solve these problems, we propose a new method of iris localization that uses wide field of view (WFOV) and narrow field of view (NFOV) cameras. Our study is new as compared to previous studies in the following four ways. First, the device used in our research acquires three images, one each of the face and both irises, using one WFOV and two NFOV cameras simultaneously. The relation between the WFOV and NFOV cameras is determined by simple geometric transformation without complex calibration. Second, the Z distance (between a user's eye and the iris camera) is estimated based on the iris size in the WFOV image and anthropometric data of the size of the human iris. Third, the accuracy of the geometric transformation between the WFOV and NFOV cameras is enhanced by using multiple matrices of the transformation according to the Z distance. Fourth, the searching region for iris localization in the NFOV image is significantly reduced based on the detected iris region in the WFOV image and the matrix of geometric transformation corresponding to the estimated Z distance. Experimental results showed that the performance of the proposed iris localization method is better than that of conventional methods in terms of accuracy and processing time.
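
    The Z-distance estimate mentioned in the second point follows from a pinhole-projection relation between the physical iris diameter and its size in the WFOV image; a minimal sketch (the focal length, the ~11.7 mm anthropometric iris diameter, and the example numbers are assumed values for illustration, not figures from the paper):

```python
def estimate_z_distance_mm(iris_diameter_px: float,
                           focal_length_px: float,
                           physical_iris_diameter_mm: float = 11.7) -> float:
    """Pinhole-model distance estimate: Z = f * D_real / d_image."""
    return focal_length_px * physical_iris_diameter_mm / iris_diameter_px

# Example: an iris spanning 30 px in a WFOV image taken with an 800 px focal length.
print(f"{estimate_z_distance_mm(30.0, 800.0):.0f} mm")  # ~312 mm
```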

  6. Measurement of defects on the wall by use of the inclination angle of laser slit beam and position tracking algorithm of camera

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Hwan; Yoon, Ji Sup; Jung, Jae Hoo; Hong, Dong Hee; Park, Gee Yong

    2001-01-01

    In this paper, a method of measuring the size of defects on a wall and reconstructing the defect image is proposed, based on an algorithm that estimates the camera orientation from the inclination angle of the laser slit beam. To reconstruct the image, an algorithm for estimating the horizontally inclined angle of the CCD camera is presented. This algorithm adopts a 3-dimensional coordinate transformation of the image plane in which both the laser beam and the original image of the defects exist. The estimation equation is obtained by using the information of the beam projected on the wall, and its parameters are obtained experimentally. With this algorithm, the original image of the defect can be reconstructed into the image that would be obtained by a camera normal to the wall. A series of experiments shows that the measurement accuracy is within a 0.5% error bound of the real defect size for horizontally inclined angles below 30 degrees. The accuracy deteriorates at a rate of about 1% for every 10-degree increase in the horizontally inclined angle. The estimation error increases in the range of 30-50 degrees owing to a dead zone in the defect depth, and the defect length cannot be measured above 70 degrees because the image data disappear. Under water, the measurement accuracy is also affected by the changed fields of view of both the camera and the laser slit beam caused by refraction in the water. The proposed algorithm provides a means of reconstructing an image taken at an arbitrary camera orientation into the image that would be obtained by a camera normal to the wall, and thus enables accurate measurement of defect lengths using only a single camera and a laser slit beam.

  7. Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data

    Science.gov (United States)

    Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.

    2012-01-01

    We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

  8. Fabrication of multi-focal microlens array on curved surface for wide-angle camera module

    Science.gov (United States)

    Pan, Jun-Gu; Su, Guo-Dung J.

    2017-08-01

    In this paper, we present a wide-angle and compact camera module that consists of a microlens array with different focal lengths on a curved surface. The design integrates the principles of an insect's compound eye and the human eye. It contains a curved hexagonal microlens array and a spherical lens. Normal mobile phone cameras usually need no fewer than four lenses, but our proposed system uses only one. Furthermore, the thickness of our proposed system is only 2.08 mm and the diagonal full field of view is about 100 degrees. To fabricate the critical microlens array, we used inkjet printing to control the surface shape of each microlens to achieve different focal lengths, and a replication method to form the curved hexagonal microlens array.

  9. Examination of the "Ultra-wide-angle Compton camera" in Fukushima

    International Nuclear Information System (INIS)

    Takeda, Shin'ichiro; Watanabe, Shin; Takahashi, Tadayuki

    2012-01-01

    The Japan Aerospace Exploration Agency (JAXA) has built the camera named in the title, which can visualize gamma-ray-emitting radioactive substances over a wide-angle view of almost 180 degrees (a hemisphere); this paper explains its technological details and a field examination in Iitate Village, Fukushima Prefecture. The camera has a detector module consisting of a five-layer stack of 2 layers of Si double-sided strip detectors (Si-DSDs) and 3 layers of CdTe-DSDs at 4 mm pitch; their device sizes and electrode pitches are made the same, which allows the detector trays and the analog application-specific integrated circuits (ASICs) to share common read-out circuits and reduces cost. Two modules are placed side by side to increase sensitivity and were loaded on a car, operating at -5 degrees, for the examination. The CdTe-DSD has a Pt cathode and an Al anode (Pt/CdTe/Al) to reduce leakage current and improve the energy resolution for the 137Cs gamma ray (662 keV). Data from the detector are digital pulse-height values, which are then converted into hit information giving the detected position and energy. Hit events in which a gamma ray is Compton-scattered in Si and photoelectrically absorbed in CdTe are selected and back-projected onto the celestial hemisphere; each event yields a ring whose position depends on the direction of the gamma ray, and the accumulation of rings specifies the position of the source. In an area of the village with an ambient dose rate of 2-3 μSv/h, locally accumulated radioactive substances (30 μSv/h) were successfully visualized. With the soft gamma-ray detector under development at JAXA for the ASTRO-H satellite, an improved camera could be made more sensitive and may be useful, for example, for monitoring decontamination results in real time. (T.T.)

  10. New comparative clinical and biometric findings between acute primary angle-closure and glaucomatous eyes with narrow angle

    Directory of Open Access Journals (Sweden)

    Rafael Vidal Mérula

    2010-12-01

    Full Text Available Purpose: To compare, clinically and biometrically, affected and fellow acute primary angle-closure (APAC) eyes and glaucomatous eyes with narrow angle (NA). Methods: Comparative case series; 30 patients with APAC and 27 glaucomatous patients with NA were evaluated. Keratometry (K), central corneal thickness (CCT), lens thickness (LT), axial length (AL) and anterior chamber depth (ACD) were measured. Parameters defined as lens position (LP) and relative lens position (RLP) were calculated. Results: A biometric difference between APAC-affected and fellow eyes was found only in LP (P=0.046). When fellow eyes were compared to glaucomatous eyes with NA, differences were found in ACD (P=0.009), AL (P=0.010), and LT/AL (P=0.005). The comparison between APAC-affected and glaucomatous eyes with NA showed significant differences in almost all biometric parameters, except for LT (P=0.148) and RLP (P=0.374). We found that a logistic regression model (LRM) built with three parameters (K, CCT and LT/AL), with output values higher than 0.334, could be a reasonable instrument to differentiate APAC eyes from glaucomatous eyes with NA. Conclusions: This study showed that APAC-affected and fellow eyes have similar biometric features, and that glaucomatous eyes with NA have a less crowded anterior segment. The LRM showed promising results in distinguishing APAC eyes from glaucomatous eyes with NA.

  11. Ceres Photometry and Albedo from Dawn Framing Camera Images

    Science.gov (United States)

    Schröder, S. E.; Mottola, S.; Keller, H. U.; Li, J.-Y.; Matz, K.-D.; Otto, K.; Roatsch, T.; Stephan, K.; Raymond, C. A.; Russell, C. T.

    2015-10-01

    The Dawn spacecraft is in orbit around dwarf planet Ceres. The onboard Framing Camera (FC) [1] is mapping the surface through a clear filter and 7 narrow-band filters at various observational geometries. Generally, Ceres' appearance in these images is affected by shadows and shading, effects which become stronger for larger solar phase angles, obscuring the intrinsic reflective properties of the surface. By means of photometric modeling we attempt to remove these effects and reconstruct the surface albedo over the full visible wavelength range. Knowledge of the albedo distribution will contribute to our understanding of the physical nature and composition of the surface.

  12. Application of a one-dimensional position-sensitive detector to a Kratky small-angle x-ray camera

    International Nuclear Information System (INIS)

    Russell, T.P.; Stein, R.S.; Kopp, M.K.; Zedler, R.E.; Hendricks, R.W.; Lin, J.S.

    1979-01-01

    A conventional Kratky small-angle collimation system has been modified to allow the use of a one-dimensional position-sensitive x-ray detector. The detector was designed specifically for use with a long-slit camera and has uniform sensitivity over the entire beam in the slit-length direction. Procedures for alignment of the collimation system are given, and a variety of tests of the performance of the system are presented. Among the latter are measurements of electronic noise and parasitic scattering as well as comparisons against samples which were also measured on other cameras. The good agreement of these comparisons demonstrates the success of the use of a position-sensitive detector with the Kratky collimation system

  13. Application of a one-dimensional position-sensitive detector to a Kratky small-angle x-ray camera

    Energy Technology Data Exchange (ETDEWEB)

    Russell, T.P.; Stein, R.S.; Kopp, M.K.; Zedler, R.E.; Hendricks, R.W.; Lin, J.S.

    1979-01-01

    A conventional Kratky small-angle collimation system has been modified to allow the use of a one-dimensional position-sensitive x-ray detector. The detector was designed specifically for use with a long-slit camera and has uniform sensitivity over the entire beam in the slit-length direction. Procedures for alignment of the collimation system are given, and a variety of tests of the performance of the system are presented. Among the latter are measurements of electronic noise and parasitic scattering as well as comparisons against samples which were also measured on other cameras. The good agreement of these comparisons demonstrates the success of the use of a position-sensitive detector with the Kratky collimation system.

  14. Multispectral calibration to enhance the metrology performance of C-mount camera systems

    Directory of Open Access Journals (Sweden)

    S. Robson

    2014-06-01

    Full Text Available Low cost monochrome camera systems based on CMOS sensors and C-mount lenses have been successfully applied to a wide variety of metrology tasks. For high accuracy work such cameras are typically equipped with ring lights to image retro-reflective targets as high contrast image features. Whilst algorithms for target image measurement and lens modelling are highly advanced, including separate RGB channel lens distortion correction, target image circularity compensation and a wide variety of detection and centroiding approaches, less effort has been directed towards optimising physical target image quality by considering optical performance in narrow wavelength bands. This paper describes an initial investigation to assess the effect of wavelength on camera calibration parameters for two different camera bodies and the same ‘C-mount’ wide angle lens. Results demonstrate the expected strong influence on principal distance, radial and tangential distortion, and also highlight possible trends in principal point, orthogonality and affinity parameters which are close to the parameter estimation noise level from the strong convergent self-calibrating image networks.

  15. O2 atmospheric band measurements with WINDII: Performance of a narrow band filter/wide angle Michelson combination in space

    International Nuclear Information System (INIS)

    Ward, W.E.; Hersom, C.H.; Tai, C.C.; Gault, W.A.; Shepherd, G.G.; Solheim, B.H.

    1994-01-01

    Among the emissions viewed by the Wind Imaging Interferometer (WINDII) on the Upper Atmosphere Research Satellite (UARS) are selected lines in the (0-0) transition of the O2 atmospheric band. These lines are viewed simultaneously using a narrow band filter/wide-angle Michelson interferometer combination. The narrow band filter is used to separate the lines on the CCD (spectral-spatial scanning), and the Michelson is used to modulate the emissions so that winds and rotational temperatures may be measured from the Doppler shifts and relative intensities of the lines. In this report the technique is outlined and the on-orbit behavior since launch is summarized.

  16. EVALUATION OF THE QUALITY OF ACTION CAMERAS WITH WIDE-ANGLE LENSES IN UAV PHOTOGRAMMETRY

    Directory of Open Access Journals (Sweden)

    H. Hastedt

    2016-06-01

    Full Text Available The application of light-weight cameras in UAV photogrammetry is required due to restrictions in payload. In general, consumer cameras with normal lens type are applied to a UAV system. The availability of action cameras, like the GoPro Hero4 Black, including a wide-angle lens (fish-eye lens), offers new perspectives in UAV projects. With these investigations, different calibration procedures for fish-eye lenses are evaluated in order to quantify their accuracy potential in UAV photogrammetry. Herewith the GoPro Hero4 is evaluated using different acquisition modes. It is investigated to which extent the standard calibration approaches in OpenCV or Agisoft PhotoScan/Lens can be applied to the evaluation processes in UAV photogrammetry. Therefore different calibration setups and processing procedures are assessed and discussed. Additionally a pre-correction of the initial distortion by GoPro Studio and its application to the photogrammetric purposes will be evaluated. An experimental setup with a set of control points and a prospective flight scenario is chosen to evaluate the processing results using Agisoft PhotoScan. Herewith it is analysed to which extent a pre-calibration and pre-correction of a GoPro Hero4 will reinforce the reliability and accuracy of a flight scenario.

  17. Evaluation of the Quality of Action Cameras with Wide-Angle Lenses in Uav Photogrammetry

    Science.gov (United States)

    Hastedt, H.; Ekkel, T.; Luhmann, T.

    2016-06-01

    The application of light-weight cameras in UAV photogrammetry is required due to restrictions in payload. In general, consumer cameras with normal lens type are applied to a UAV system. The availability of action cameras, like the GoPro Hero4 Black, including a wide-angle lens (fish-eye lens) offers new perspectives in UAV projects. With these investigations, different calibration procedures for fish-eye lenses are evaluated in order to quantify their accuracy potential in UAV photogrammetry. Herewith the GoPro Hero4 is evaluated using different acquisition modes. It is investigated to which extent the standard calibration approaches in OpenCV or Agisoft PhotoScan/Lens can be applied to the evaluation processes in UAV photogrammetry. Therefore different calibration setups and processing procedures are assessed and discussed. Additionally a pre-correction of the initial distortion by GoPro Studio and its application to the photogrammetric purposes will be evaluated. An experimental setup with a set of control points and a prospective flight scenario is chosen to evaluate the processing results using Agisoft PhotoScan. Herewith it is analysed to which extent a pre-calibration and pre-correction of a GoPro Hero4 will reinforce the reliability and accuracy of a flight scenario.

  18. Wide Angle Michelson Doppler Imaging Interferometer (WAMDII)

    Science.gov (United States)

    Roberts, B.

    1986-01-01

    The wide angle Michelson Doppler imaging interferometer (WAMDII) is a specialized type of optical Michelson interferometer working at sufficiently long path difference to measure Doppler shifts and to infer Doppler line widths of naturally occurring upper atmospheric Gaussian line emissions. The instrument is intended to measure vertical profiles of atmospheric winds and temperatures within the altitude range of 85 km to 300 km. The WAMDII consists of a Michelson interferometer followed by a camera lens and an 85 x 106 charge coupled device photodiode array. Narrow band filters in a filter wheel are used to isolate individual line emissions and the lens forms an image of the emitting region on the charge coupled device array.

  19. Comparison and evaluation of datasets for off-angle iris recognition

    Science.gov (United States)

    Kurtuncu, Osman M.; Cerme, Gamze N.; Karakaya, Mahmut

    2016-05-01

    In this paper, we investigated the publicly available iris recognition datasets and their data capture procedures in order to determine whether they are suitable for stand-off iris recognition research. The majority of iris recognition datasets include only frontal iris images. Even when a dataset includes off-angle iris images, the frontal and off-angle images are not captured at the same time. Comparison of frontal and off-angle iris images shows not only differences in gaze angle but also changes in pupil dilation and accommodation. In order to isolate the effect of the gaze angle from other challenging issues, including dilation and accommodation, the frontal and off-angle iris images should be captured at the same time by two different cameras. Therefore, in this work we developed an iris image acquisition platform using two cameras, where one camera captures the frontal iris image and the other captures the iris image from an off-angle view. Comparing the Hamming distances between frontal and off-angle iris images captured with the two-camera setup and with the one-camera setup, we observed that the Hamming distance in the two-camera setup is lower than in the one-camera setup, by between 0.001 and 0.05. These results show that, to obtain accurate results in off-angle iris recognition research, a two-camera setup is necessary to distinguish the challenging issues from each other.
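
    The fractional Hamming distances quoted above are typically computed over binary iris codes with occlusion masks; a minimal sketch of that standard comparison (the array sizes and the simulated bit flips are hypothetical):

```python
import numpy as np

def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fractional Hamming distance over bits that are valid in both iris codes."""
    valid = mask_a & mask_b
    disagreements = (code_a ^ code_b) & valid
    return disagreements.sum() / valid.sum()

rng = np.random.default_rng(0)
code_frontal = rng.integers(0, 2, 2048, dtype=np.uint8).astype(bool)
code_off_angle = code_frontal.copy()
code_off_angle[:100] ^= True            # flip 100 bits to mimic an off-angle capture
mask = np.ones(2048, dtype=bool)
print(hamming_distance(code_frontal, code_off_angle, mask, mask))  # ~0.049
```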

  20. Automatic inference of geometric camera parameters and inter-camera topology in uncalibrated disjoint surveillance cameras

    Science.gov (United States)

    den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.

    2015-10-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large-scale surveillance systems. We show an autocalibration method based entirely on pedestrian detections in surveillance video from multiple non-overlapping cameras. In this paper, we present the two main components of automatic calibration. The first is the intra-camera geometry estimation, which leads to an estimate of the tilt angle, focal length and camera height and is important for the conversion from pixels to meters and vice versa. The second is the inter-camera topology inference, which leads to an estimate of the distance between cameras and is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.
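
    To make the pixels-to-meters conversion concrete, the sketch below back-projects an image row onto a flat ground plane once tilt angle, focal length and camera height are known; the pinhole and flat-ground assumptions and all numeric values are illustrative, not results from the paper.

      # Ground distance of the point imaged at row v, for a downward-tilted pinhole camera.
      import numpy as np

      def ground_distance(v, f_px, cy, tilt_rad, cam_height_m):
          # Downward angle of the ray through row v: camera tilt plus the in-image angle.
          ray_down = tilt_rad + np.arctan((v - cy) / f_px)
          if ray_down <= 0:
              return np.inf                    # ray points at or above the horizon
          return cam_height_m / np.tan(ray_down)

      f_px, cy = 1200.0, 540.0                 # assumed focal length (px) and principal-point row
      tilt, height = np.radians(12.0), 4.0     # assumed tilt angle and camera height (m)
      for v in (600, 800, 1000):
          print(v, round(ground_distance(v, f_px, cy, tilt, height), 2), "m")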

  1. MESS MDIS MAP PROJ REGIONAL TARGETED MOSAIC RDR V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — Abstract ======== The Mercury Dual Imaging System (MDIS) consists of two cameras, a Wide Angle Camera (WAC) and a Narrow Angle Camera (NAC), mounted on a common...

  2. Hong's grading for evaluating anterior chamber angle width.

    Science.gov (United States)

    Kim, Seok Hwan; Kang, Ja Heon; Park, Ki Ho; Hong, Chul

    2012-11-01

    To compare Hong's grading method with anterior segment optical coherence tomography (AS-OCT), gonioscopy, and the dark-room prone-position test (DRPT) for evaluating anterior chamber angle width. The anterior chamber angle was graded using Hong's grading method, and Hong's angle width was calculated from the arctangent of Hong's grades. The correlation between Hong's angle width and AS-OCT parameters was analyzed. The area under the receiver operating characteristic curve (AUC) for Hong's grading method when discriminating between narrow and open angles, as determined by gonioscopy, was calculated. Correlation analysis was performed between Hong's angle width and intraocular pressure (IOP) changes determined by DRPT. A total of 60 subjects were enrolled; 53.5% of these subjects had a narrow angle. Hong's angle width correlated significantly with the AS-OCT parameters (r = 0.562-0.719, P < 0.01). A Bland-Altman plot showed relatively good agreement between Hong's angle width and the angle width obtained by AS-OCT. The ability of Hong's grading method to discriminate between open and narrow angles was good (AUC = 0.868, 95% CI 0.756-0.942). A significant linear correlation was found between Hong's angle width and IOP change determined by DRPT (r = -0.761, P < 0.01). Hong's grading method is useful for detecting narrow angles. Hong's grading correlated well with AS-OCT parameters and DRPT.

  3. Dust mass distribution around comet 67P/Churyumov-Gerasimenko determined via parallax measurements using Rosetta's OSIRIS cameras

    Science.gov (United States)

    Ott, T.; Drolshagen, E.; Koschny, D.; Güttler, C.; Tubiana, C.; Frattin, E.; Agarwal, J.; Sierks, H.; Bertini, I.; Barbieri, C.; Lamy, P. I.; Rodrigo, R.; Rickman, H.; A'Hearn, M. F.; Barucci, M. A.; Bertaux, J.-L.; Boudreault, S.; Cremonese, G.; Da Deppo, V.; Davidsson, B.; Debei, S.; De Cecco, M.; Deller, J.; Feller, C.; Fornasier, S.; Fulle, M.; Geiger, B.; Gicquel, A.; Groussin, O.; Gutiérrez, P. J.; Hofmann, M.; Hviid, S. F.; Ip, W.-H.; Jorda, L.; Keller, H. U.; Knollenberg, J.; Kovacs, G.; Kramm, J. R.; Kührt, E.; Küppers, M.; Lara, L. M.; Lazzarin, M.; Lin, Z.-Y.; López-Moreno, J. J.; Marzari, F.; Mottola, S.; Naletto, G.; Oklay, N.; Pajola, M.; Shi, X.; Thomas, N.; Vincent, J.-B.; Poppe, B.

    2017-07-01

    The OSIRIS (optical, spectroscopic and infrared remote imaging system) instrument on board the ESA Rosetta spacecraft collected data of 67P/Churyumov-Gerasimenko for over 2 yr. OSIRIS consists of two cameras, a Narrow Angle Camera and a Wide Angle Camera. For specific imaging sequences related to the observation of dust aggregates in 67P's coma, the two cameras were operating simultaneously. The two cameras are mounted 0.7 m apart from each other; this baseline yields a parallax shift of the apparent particle trails on the analysed images that is inversely proportional to their distance. From these shifts, the distance between observed dust aggregates and the spacecraft was determined. This method works for particles closer than 6000 m to the spacecraft and requires very few assumptions. We found over 250 particles in a suitable distance range, with sizes of some centimetres, masses in the range of 10⁻⁶–10² kg and a mean velocity of about 2.4 m s⁻¹ relative to the nucleus. Furthermore, the spectral slope was analysed, showing a decrease in the median spectral slope of the particles with time. The further a particle is from the spacecraft, the fainter is its signal; this effect was counterbalanced by a debiasing. Moreover, the dust mass-loss rate of the nucleus was computed, as well as the Afρ of the comet around perihelion. The summed-up dust mass-loss rate for the mass bins 10⁻⁴–10² kg is almost 8300 kg s⁻¹.
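
    The ranging principle can be sketched in a few lines: for a particle far from the 0.7 m baseline, the distance is approximately the baseline divided by the parallax angle. The focal length and pixel shift below are illustrative numbers, not OSIRIS calibration values.

      # Small-angle parallax ranging sketch (illustrative values only).
      baseline_m = 0.7                   # separation of the two OSIRIS cameras
      focal_px = 50000.0                 # assumed focal length expressed in pixels
      shift_px = 10.0                    # measured parallax shift of a particle trail

      parallax_rad = shift_px / focal_px # small-angle approximation
      distance_m = baseline_m / parallax_rad
      print(f"estimated particle distance: {distance_m:.0f} m")   # 3500 m with these numbers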

  4. Pool Boiling CHF in Inclined Narrow Annuli

    International Nuclear Information System (INIS)

    Kang, Myeong Gie

    2010-01-01

    Pool boiling heat transfer has been studied extensively since it is frequently encountered in various heat transfer equipment. Recently, it has been widely investigated in nuclear power plants for application to advanced light water reactor designs. From the review of the published results it can be concluded that knowledge of the combined effects of surface orientation and a confined space on pool boiling heat transfer is of great practical importance and also of great academic interest. Fujita et al. investigated pool boiling heat transfer, from boiling inception to the critical heat flux (CHF), in a confined narrow space between heated and unheated parallel rectangular plates. They identified that both the confined space and the surface orientation changed heat transfer considerably. Kim and Suh changed the surface orientation angle of a downward heating rectangular channel having a narrow gap from the downward-facing position (180°) to the vertical position (90°). They observed that the CHF generally decreased as the inclination angle (θ) increased. Yao and Chang studied pool boiling heat transfer in a confined space for vertical narrow annuli with closed bottoms. They observed that when the gap size (s) of the annulus was decreased, the effect of space confinement on boiling heat transfer increased. The CHF occurred at a much lower value for the confined space compared to unconfined pool boiling. Pool boiling heat transfer in narrow horizontal annular crevices was studied by Hung and Yao. They concluded that the CHF decreased with decreasing gap size of the annuli and described the importance of thin film evaporation in explaining the lower CHF of narrow crevices. The effect of the inclination angle on the CHF in countercurrent boiling in an inclined uniformly heated tube with closed bottom was also studied by Liu et al. They concluded that the CHF decreased as the inclination angle decreased. A study was carried out

  5. MUSIC - Multifunctional stereo imaging camera system for wide angle and high resolution stereo and color observations on the Mars-94 mission

    Science.gov (United States)

    Oertel, D.; Jahn, H.; Sandau, R.; Walter, I.; Driescher, H.

    1990-10-01

    Objectives of the multifunctional stereo imaging camera (MUSIC) system to be deployed on the Soviet Mars-94 mission are outlined. A high-resolution stereo camera (HRSC) and a wide-angle opto-electronic stereo scanner (WAOSS) are combined in terms of hardware, software, technology aspects, and solutions. Both HRSC and WAOSS are pushbroom instruments containing a single optical system and focal planes with several parallel CCD line sensors. Emphasis is placed on the MUSIC system's stereo capability, its design, mass memory, and data compression. A 1-Gbit memory is divided into two parts: 80 percent for HRSC and 20 percent for WAOSS, while the selected on-line compression strategy is based on macropixel coding and real-time transform coding.

  6. Reliability of sagittal plane hip, knee, and ankle joint angles from a single frame of video data using the GAITRite camera system.

    Science.gov (United States)

    Ross, Sandy A; Rice, Clinton; Von Behren, Kristyn; Meyer, April; Alexander, Rachel; Murfin, Scott

    2015-01-01

    The purpose of this study was to establish the intra-rater, intra-session, and inter-rater reliability of sagittal plane hip, knee, and ankle angles with and without reflective markers using the GAITRite walkway and a single video camera, between student physical therapists and an experienced physical therapist. This study included thirty-two healthy participants aged 20-59, stratified by age and gender. Participants performed three successful walks with and without markers applied to anatomical landmarks. GAITRite software was used to digitize sagittal hip, knee, and ankle angles at two phases of gait: (1) initial contact; and (2) mid-stance. Intra-rater reliability was more consistent for the experienced physical therapist, regardless of joint or phase of gait. Intra-session reliability was variable: the experienced physical therapist showed moderate to high reliability (intra-class correlation coefficient (ICC) = 0.50-0.89) and the student physical therapist showed very poor to high reliability (ICC = 0.07-0.85). Inter-rater reliability was highest during mid-stance at the knee with markers (ICC = 0.86) and lowest during mid-stance at the hip without markers (ICC = 0.25). Reliability of a single camera system, especially at the knee joint, shows promise. Depending on the specific type of reliability, error can be attributed to the testers (e.g. lack of digitization practice and marker placement), participants (e.g. loose-fitting clothing) and camera systems (e.g. frame rate and resolution). However, until the camera technology can be upgraded to a higher frame rate and resolution, and the software can be linked to the GAITRite walkway, the clinical utility for pre/post measures is limited.

  7. Search for narrow baryon resonances (of masses through 3.4 and 5 GeV) through a π-p large angle elastic scattering formation experiment

    International Nuclear Information System (INIS)

    Chauveau, J.

    1981-01-01

    This work describes a search for narrow baryon resonances (of masses between 3.4 and 5 GeV) through a π⁻p large-angle elastic scattering formation experiment. An optimization of the sensitivity of the experiment for detecting resonances is obtained by measuring the central part of the angular distribution (small |cos θ*|). The apparatus and data analysis are described in detail. No narrow resonance has been found, the sensitivity of the experiment being characterized by a width Γ ≈ 1 MeV and an elasticity x ≈ 0.01. Finally, the differential cross section measurement is compared to some parton models.

  8. Computing camera heading: A study

    Science.gov (United States)

    Zhang, John Jiaxiang

    2000-08-01

    An accurate estimate of the motion of a camera is a crucial first step for the 3D reconstruction of sites, objects, and buildings from video. Solutions to the camera heading problem can be readily applied to many areas, such as robotic navigation, surgical operation, video special effects, multimedia, and lately even in internet commerce. From image sequences of a real world scene, the problem is to calculate the directions of the camera translations. The presence of rotations makes this problem very hard. This is because rotations and translations can have similar effects on the images, and are thus hard to tell apart. However, the visual angles between the projection rays of point pairs are unaffected by rotations, and their changes over time contain sufficient information to determine the direction of camera translation. We developed a new formulation of the visual angle disparity approach, first introduced by Tomasi, to the camera heading problem. Our new derivation makes theoretical analysis possible. Most notably, a theorem is obtained that locates all possible singularities of the residual function for the underlying optimization problem. This allows identifying all computation trouble spots beforehand, and to design reliable and accurate computational optimization methods. A bootstrap-jackknife resampling method simultaneously reduces complexity and tolerates outliers well. Experiments with image sequences show accurate results when compared with the true camera motion as measured with mechanical devices.

  9. Finger Angle-Based Hand Gesture Recognition for Smart Infrastructure Using Wearable Wrist-Worn Camera

    Directory of Open Access Journals (Sweden)

    Feiyu Chen

    2018-03-01

    Full Text Available The rise of domestic robots in smart infrastructure has raised demands for intuitive and natural interaction between humans and robots. To address this problem, a wearable wrist-worn camera (WwwCam) is proposed in this paper. With the capability of recognizing human hand gestures in real time, it enables services such as controlling mopping robots, mobile manipulators, or appliances in smart-home scenarios. The recognition is based on finger segmentation and template matching. A distance transform algorithm is adopted and adapted to robustly segment fingers from the hand. Based on the fingers' angles relative to the wrist, a finger angle prediction algorithm and a template matching metric are proposed. All possible gesture types of the captured image are first predicted, and then evaluated and compared to the template image to achieve the classification. Unlike other template matching methods that rely heavily on a large training set, this scheme possesses high flexibility since it requires only one image as the template, and it can classify gestures formed by different combinations of fingers. In the experiment, it successfully recognized ten finger gestures from number zero to nine defined by American Sign Language with an accuracy of up to 99.38%. Its performance was further demonstrated by manipulating a robot arm using the implemented algorithms and WwwCam to transport and pile up wooden building blocks.
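
    The segmentation step can be sketched with OpenCV's distance transform: the palm is the region far from the background, and removing a dilated palm disc leaves the thin finger regions behind. The toy mask and the 0.5 threshold factor are assumptions for illustration, not the paper's parameters.

      # Peel fingers off a binary hand mask using a distance transform (toy example).
      import cv2
      import numpy as np

      def split_fingers(hand_mask):
          dist = cv2.distanceTransform(hand_mask, cv2.DIST_L2, 5)
          palm_radius = dist.max()                          # distance at the palm centre
          _, palm = cv2.threshold(dist, 0.5 * palm_radius, 255, cv2.THRESH_BINARY)
          palm = cv2.dilate(palm.astype(np.uint8), np.ones((15, 15), np.uint8))
          return cv2.bitwise_and(hand_mask, cv2.bitwise_not(palm))

      mask = np.zeros((200, 200), np.uint8)
      cv2.circle(mask, (100, 140), 45, 255, -1)             # toy "palm"
      cv2.rectangle(mask, (95, 20), (105, 100), 255, -1)    # toy "finger"
      print(cv2.countNonZero(split_fingers(mask)))          # pixels left in the finger region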

  10. THERMAL EFFECTS ON CAMERA FOCAL LENGTH IN MESSENGER STAR CALIBRATION AND ORBITAL IMAGING

    Directory of Open Access Journals (Sweden)

    S. Burmeister

    2018-04-01

    Full Text Available We analyse images taken by the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft for the camera's thermal response in the harsh thermal environment near Mercury. Specifically, we study thermally induced variations in focal length of the Mercury Dual Imaging System (MDIS). Within the several hundreds of images of star fields, the Wide Angle Camera (WAC) typically captures up to 250 stars in one frame of the panchromatic channel. We measure star positions and relate these to the known star coordinates taken from the Tycho-2 catalogue. We solve for camera pointing, the focal length parameter and two non-symmetrical distortion parameters for each image. Using data from the temperature sensors on the camera focal plane we model a linear focal length function of the form f(T) = A0 + A1·T. Next, we use images from MESSENGER's orbital mapping mission. We deal with large image blocks, typically used for the production of high-resolution digital terrain models (DTMs). We analyzed images from the combined quadrangles H03 and H07, a selected region covered by approx. 10,600 images, in which we identified about 83,900 tiepoints. Using bundle block adjustments, we solved for the unknown coordinates of the control points, the pointing of the camera, as well as the camera's focal length. We then fit the above linear function with respect to the focal plane temperature. As a result, we find a complex response of the camera to the thermal conditions of the spacecraft. To first order, we see a linear increase of approx. 0.0107 mm per degree of temperature for the Narrow-Angle Camera (NAC). This is in agreement with the observed thermal response seen in images of the panchromatic channel of the WAC. Unfortunately, further comparisons of results from the two methods, both of which use different portions of the available image data, are limited. If left uncorrected, these effects may pose significant difficulties in
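
    The focal-length model itself is a one-line fit once per-image focal-length estimates and focal-plane temperatures are available; the numbers below are invented for illustration and are not MESSENGER data.

      # Fit f(T) = A0 + A1*T to (temperature, focal length) pairs.
      import numpy as np

      temps = np.array([5.0, 10.0, 15.0, 20.0, 25.0])        # focal-plane temperature, deg C
      focal = np.array([78.07, 78.12, 78.18, 78.23, 78.28])  # estimated focal length, mm

      A1, A0 = np.polyfit(temps, focal, 1)                   # slope, intercept
      print(f"f(T) = {A0:.4f} mm + {A1:.5f} mm/degC * T")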

  11. The Influence of Face Angle and Club Path on the Resultant Launch Angle of a Golf Ball

    Directory of Open Access Journals (Sweden)

    Paul Wood

    2018-02-01

    Full Text Available A two-part experimental study was conducted in order to better understand how the delivered face angle and club path of a golf club influence the initial launch direction of a golf ball for various club types. A robust understanding of how these parameters influence the ball direction has implications for both coaches and club designers. The first study used a large sample of golfers hitting shots with different clubs. Initial ball direction was measured with a Foresight Sports camera system, while club delivery parameters were recorded with a Vicon motion capture system. The second study used a golf robot and a Vision Research camera to measure club and ball parameters. Results from these experiments show that the launch direction fell closer to the face angle than to the club path. The percentage toward the face angle ranged from 61% to 83%, where 100% designates a launch angle entirely toward the face angle.
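
    One convenient way to read the headline result is as a linear interpolation between club path and face angle, with the launch direction 61-83% of the way toward the face. The angles and the 75% weight below are made-up example values, not measurements from the study.

      # Launch direction as a weighted blend of club path and face angle (illustrative).
      def launch_direction(face_deg, path_deg, weight_toward_face=0.75):
          return path_deg + weight_toward_face * (face_deg - path_deg)

      print(launch_direction(face_deg=4.0, path_deg=-2.0))   # 2.5 deg, i.e. closer to the face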

  12. Narrow linewidth pulsed optical parametric oscillator

    Indian Academy of Sciences (India)

    Tunable narrow linewidth radiation by optical parametric oscillation has many applications, particularly in spectroscopic investigation. In this paper, different techniques such as injection seeding, use of spectral selecting element like grating, grating and etalon in combination, grazing angle of incidence, entangled cavity ...

  13. Wavefront analysis for plenoptic camera imaging

    International Nuclear Information System (INIS)

    Luan Yin-Sen; Xu Bing; Yang Ping; Tang Guo-Mao

    2017-01-01

    The plenoptic camera is a single-lens stereo camera which can retrieve the direction of light rays while detecting their intensity distribution. In this paper, to reveal more about plenoptic camera imaging, we present a wavefront analysis of plenoptic camera imaging from the viewpoint of physical optics rather than the ray-tracing model of geometric optics. Specifically, the wavefront imaging model of a plenoptic camera is analyzed and simulated by scalar diffraction theory, and the depth estimation is re-described based on physical optics. We simulate a set of raw plenoptic images of an object scene, thereby validating the analysis and derivations; the differences between the imaging analysis methods based on geometric optics and physical optics are also shown in the simulations.

  14. Application of narrow-band television to industrial and commercial communications

    Science.gov (United States)

    Embrey, B. C., Jr.; Southworth, G. R.

    1974-01-01

    The development of narrow-band systems for use in space systems is presented. Applications of the technology to future spacecraft requirements are discussed, along with narrow-band television's influence in stimulating development within the industry. The transfer of the technology into industrial and commercial communications is described. Major areas included are: (1) medicine; (2) education; (3) remote sensing for traffic control; and (4) weather observation. Applications in data processing, image enhancement, and information retrieval are provided by the combination of the TV camera and the computer.

  15. Autonomous pedestrian localization technique using CMOS camera sensors

    Science.gov (United States)

    Chun, Chanwoo

    2014-09-01

    We present a pedestrian localization technique that does not need infrastructure. The proposed angle-only measurement method requires specially manufactured shoes. Each shoe has two CMOS cameras and two markers, such as LEDs, attached on the inward side. The line-of-sight (LOS) angles towards the two markers on the forward shoe are measured using the two cameras on the rear shoe. Our simulation results show that a pedestrian walking through a shopping mall wearing this device can be accurately guided to the front of a destination store located 100 m away, if the floor plan of the mall is available.
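
    The underlying geometry is plain angle-only triangulation: two cameras at known positions on the rear shoe each measure a bearing to a marker on the forward shoe, and the marker position follows from intersecting the two rays. The 2-D geometry and numbers below are illustrative, not the shoe-mounted configuration of the paper.

      # Intersect two bearing rays to locate a marker (2-D toy example).
      import numpy as np

      def triangulate(p1, theta1, p2, theta2):
          d1 = np.array([np.cos(theta1), np.sin(theta1)])
          d2 = np.array([np.cos(theta2), np.sin(theta2)])
          # Solve p1 + t1*d1 = p2 + t2*d2 for t1.
          A = np.column_stack((d1, -d2))
          t1, _ = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
          return np.asarray(p1, float) + t1 * d1

      marker = triangulate(p1=(0.0, 0.0), theta1=np.radians(40.0),
                           p2=(0.1, 0.0), theta2=np.radians(60.0))
      print(marker)            # intersection point of the two lines of sight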

  16. New narrow baryon resonances in pp inelastic scattering

    International Nuclear Information System (INIS)

    Tatischeff, B.; Willis, N.; Comets, M.P.; Courtat, P.; Gacougnolle, R.; Le Bornec, Y.; Loireleux, E.; Reide, F.; Yonnet, J.; Boivin, M.

    1999-01-01

    The reaction pp → pπ⁺X has been studied at 3 energies (T_p = 1520, 1805 and 2100 MeV) and 6 angles from 0° up to 17° (lab.). Several narrow states have been observed in missing-mass spectra at 1004, 1044 and 1094 MeV. Their widths are typically one order of magnitude smaller than the widths of N* or Δ resonances. Possible biases are discussed. These masses are in agreement with those calculated within a simple phenomenological mass formula based on colour magnetic interaction between two coloured quark clusters. (authors)

  17. Contemporary Approach to the Diagnosis and Management of Primary Angle-Closure Disease.

    Science.gov (United States)

    Razeghinejad, M Reza; Myers, Jonathan S

    2018-05-16

    Primary angle closure disease spectrum varies from a narrow angle to advanced glaucoma. A variety of imaging technologies may assist the clinician in determining the pathophysiology and diagnosis of primary angle closure, but gonioscopy remains a mainstay of clinical evaluation. Laser iridotomy effectively eliminates the pupillary block component of angle closure; however, studies show that in many patients the iridocorneal angle remains narrow from underlying anatomic issues, and increasing lens size often leads to further narrowing over time. Recent studies have further characterized the role of the lens in angle closure disease, and cataract or clear lens extraction is increasingly used earlier in its management. As a first surgical step in angle closure glaucoma, lens extraction alone often effectively controls the pressure with less risk of complications than concurrent or stand alone glaucoma surgery, but may not be sufficient in more advanced or severe disease. We provide a comprehensive review on the primary angle-closure disease nomenclature, imaging, and current laser and surgical management. Copyright © 2018. Published by Elsevier Inc.

  18. Effect of bubble interface parameters on the prediction of bubble departure diameter in a narrow channel

    International Nuclear Information System (INIS)

    Xu Jianjun; Xie Tianzhou; Zhou Wenbin; Chen Bingde; Huang Yanping

    2014-01-01

    A model for predicting the bubble departure diameter in a narrow channel is built by analysing the forces acting on the bubble, and the effects of bubble interface parameters such as the bubble inclination angle, upstream contact angle, downstream contact angle and bubble contact diameter on the predicted bubble departure diameters in a narrow channel are analysed by comparison with the visual experimental data. Based on the above results, the bubble interface parameters used as input parameters to obtain the bubble departure diameter in a narrow channel are determined, and the bubble departure diameters in a narrow channel are predicted by solving the force balance equation. The predicted bubble departure diameters are verified against the 58 bubble departure diameters obtained from the vertical and inclined visual experiments, and the predicted results agree with the experimental results. The different forces acting on the bubble are obtained, and the effect of the thermal parameters in this experiment on the bubble departure diameters is analysed. (authors)

  19. Laser peripheral iridoplasty for angle-closure.

    Science.gov (United States)

    Ng, Wai Siene; Ang, Ghee Soon; Azuara-Blanco, Augusto

    2012-02-15

    Angle-closure glaucoma is a leading cause of irreversible blindness in the world. Treatment is aimed at opening the anterior chamber angle and lowering the IOP with medical and/or surgical treatment (e.g. trabeculectomy, lens extraction). Laser iridotomy works by eliminating pupillary block and widens the anterior chamber angle in the majority of patients. When laser iridotomy fails to open the anterior chamber angle, laser iridoplasty may be recommended as one of the options in current standard treatment for angle-closure. Laser peripheral iridoplasty works by shrinking and pulling the peripheral iris tissue away from the trabecular meshwork. Laser peripheral iridoplasty can be used for crisis of acute angle-closure and also in non-acute situations.   To assess the effectiveness of laser peripheral iridoplasty in the treatment of narrow angles (i.e. primary angle-closure suspect), primary angle-closure (PAC) or primary angle-closure glaucoma (PACG) in non-acute situations when compared with any other intervention. In this review, angle-closure will refer to patients with narrow angles (PACs), PAC and PACG. We searched CENTRAL (which contains the Cochrane Eyes and Vision Group Trials Register) (The Cochrane Library 2011, Issue 12), MEDLINE (January 1950 to January 2012), EMBASE (January 1980 to January 2012), Latin American and Caribbean Literature on Health Sciences (LILACS) (January 1982 to January 2012), the metaRegister of Controlled Trials (mRCT) (www.controlled-trials.com), ClinicalTrials.gov (www.clinicaltrials.gov) and the WHO International Clinical Trials Registry Platform (ICTRP) (www.who.int/ictrp/search/en). There were no date or language restrictions in the electronic searches for trials. The electronic databases were last searched on 5 January 2012. We included only randomised controlled trials (RCTs) in this review. Patients with narrow angles, PAC or PACG were eligible. We excluded studies that included only patients with acute presentations

  20. Automatic camera tracking for remote manipulators

    International Nuclear Information System (INIS)

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-07-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables
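
    The kinematic idea reduces to transforming the point of interest into the camera-mount frame with a homogeneous matrix and reading the PAN/TILT angles off the resulting vector. The frame convention, matrix and target point below are illustrative assumptions, not values from the report.

      # Compute camera PAN/TILT that aim at a world point, given a 4x4 mount transform.
      import numpy as np

      def pan_tilt(T_mount_from_world, p_world):
          p = T_mount_from_world @ np.append(p_world, 1.0)   # homogeneous transform
          x, y, z = p[:3]                                    # assumed mount frame: x fwd, y left, z up
          pan = np.degrees(np.arctan2(y, x))
          tilt = np.degrees(np.arctan2(z, np.hypot(x, y)))
          return pan, tilt

      T = np.eye(4)
      T[:3, 3] = [0.0, 0.0, -1.5]                            # camera mount 1.5 m above the world origin
      print(pan_tilt(T, np.array([2.0, 1.0, 0.0])))          # approx (26.6, -33.9) degrees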

  1. Automatic camera tracking for remote manipulators

    International Nuclear Information System (INIS)

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-04-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables

  2. Capturing method for integral three-dimensional imaging using multiviewpoint robotic cameras

    Science.gov (United States)

    Ikeya, Kensuke; Arai, Jun; Mishina, Tomoyuki; Yamaguchi, Masahiro

    2018-03-01

    Integral three-dimensional (3-D) technology for next-generation 3-D television must be able to capture dynamic moving subjects with pan, tilt, and zoom camerawork as good as in current TV program production. We propose a capturing method for integral 3-D imaging using multiviewpoint robotic cameras. The cameras are controlled through a cooperative synchronous system composed of a master camera controlled by a camera operator and other reference cameras that are utilized for 3-D reconstruction. When the operator captures a subject using the master camera, the region reproduced by the integral 3-D display is regulated in real space according to the subject's position and view angle of the master camera. Using the cooperative control function, the reference cameras can capture images at the narrowest view angle that does not lose any part of the object region, thereby maximizing the resolution of the image. 3-D models are reconstructed by estimating the depth from complementary multiviewpoint images captured by robotic cameras arranged in a two-dimensional array. The model is converted into elemental images to generate the integral 3-D images. In experiments, we reconstructed integral 3-D images of karate players and confirmed that the proposed method satisfied the above requirements.

  3. Ultra-fast framing camera tube

    Science.gov (United States)

    Kalibjian, Ralph

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  4. Limited-angle imaging in positron cameras: theory and practice

    Energy Technology Data Exchange (ETDEWEB)

    Tam, K.C.

    1979-10-01

    The principles of operation of planar positron camera systems made up of multiwire proportional chambers as detectors and electromagnetic delay lines for coordinate readout are discussed. Gamma converters are coupled to the wire chambers to increase detection efficiency and improve spatial resolution. The conversion efficiencies of these converters are calculated and the results compare favorably to the experimentally measured values.

  5. Limited-angle imaging in positron cameras: theory and practice

    International Nuclear Information System (INIS)

    Tam, K.C.

    1979-10-01

    The principles of operation of planar positron camera systems made up of multiwire proportional chambers as detectors and electromagnetic delay lines for coordinate readout are discussed. Gamma converters are coupled to the wire chambers to increase detection efficiency and improve spatial resolution. The conversion efficiencies of these converters are calculated and the results compare favorably to the experimentally measured values

  6. Acquisition and visualization techniques for narrow spectral color imaging.

    Science.gov (United States)

    Neumann, László; García, Rafael; Basa, János; Hegedüs, Ramón

    2013-06-01

    This paper introduces a new approach in narrow-band imaging (NBI). Existing NBI techniques generate images by selecting discrete bands over the full visible spectrum or an even wider spectral range. In contrast, here we perform the sampling with filters covering a tight spectral window. This image acquisition method, named narrow spectral imaging, can be particularly useful when optical information is only available within a narrow spectral window, such as in the case of deep-water transmittance, which constitutes the principal motivation of this work. In this study we demonstrate the potential of the proposed photographic technique on nonunderwater scenes recorded under controlled conditions. To this end three multilayer narrow bandpass filters were employed, which transmit at 440, 456, and 470 nm bluish wavelengths, respectively. Since the differences among the images captured in such a narrow spectral window can be extremely small, both image acquisition and visualization require a novel approach. First, high-bit-depth images were acquired with multilayer narrow-band filters either placed in front of the illumination or mounted on the camera lens. Second, a color-mapping method is proposed, using which the input data can be transformed onto the entire display color gamut with a continuous and perceptually nearly uniform mapping, while ensuring optimally high information content for human perception.

  7. REFLECTANCE CALIBRATION SCHEME FOR AIRBORNE FRAME CAMERA IMAGES

    Directory of Open Access Journals (Sweden)

    U. Beisl

    2012-07-01

    Full Text Available The image quality of photogrammetric images is influenced by various effects from outside the camera. One effect is the scattered light from the atmosphere, which lowers contrast in the images and creates a colour shift towards the blue. Another is the changing illumination during the day, which results in changing image brightness within an image block. In addition, there is the so-called bidirectional reflectance distribution function (BRDF) of the ground, which gives rise to a view- and sun-angle-dependent brightness gradient in the image itself. To correct for the first two effects, an atmospheric correction with reflectance calibration is chosen. These effects have been corrected successfully for ADS linescan sensor data by using a parametrization of the atmospheric quantities. Following Kaufman et al., the actual atmospheric condition is estimated from the brightness of a dark pixel taken from the image. The BRDF effects are corrected using a semi-empirical modelling of the brightness gradient. Both methods are now extended to frame cameras. Linescan sensors have a viewing geometry that depends only on the cross-track view zenith angle. The difference for frame cameras is the need to include the extra dimension of the view azimuth in the modelling. Since both the atmospheric correction and the BRDF correction require a model inversion with the help of image data, a different image sampling strategy is necessary which includes the azimuth angle dependence. For the atmospheric correction, a sixth variable is added to the existing five variables (visibility, view zenith angle, sun zenith angle, ground altitude, and flight altitude), thus multiplying the number of modelling input combinations for the offline inversion. The parametrization has to reflect the view azimuth angle dependence. The BRDF model already contains the view azimuth dependence and is combined with a new sampling strategy.

  8. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates.

    Science.gov (United States)

    Hobbs, Michael T; Brehme, Cheryl S

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  9. An improved camera trap for amphibians, reptiles, small mammals, and large invertebrates

    Science.gov (United States)

    Hobbs, Michael T.; Brehme, Cheryl S.

    2017-01-01

    Camera traps are valuable sampling tools commonly used to inventory and monitor wildlife communities but are challenged to reliably sample small animals. We introduce a novel active camera trap system enabling the reliable and efficient use of wildlife cameras for sampling small animals, particularly reptiles, amphibians, small mammals and large invertebrates. It surpasses the detection ability of commonly used passive infrared (PIR) cameras for this application and eliminates problems such as high rates of false triggers and high variability in detection rates among cameras and study locations. Our system, which employs a HALT trigger, is capable of coupling to digital PIR cameras and is designed for detecting small animals traversing small tunnels, narrow trails, small clearings and along walls or drift fencing.

  10. Distribution and Parameter's Calculations of Television Cameras Inside a Nuclear Facility

    International Nuclear Information System (INIS)

    El-kafas, A.A.

    2009-01-01

    In this work, the distribution of television cameras and the calculation of their parameters inside and outside a nuclear facility are presented. The exterior and interior camera systems are described and explained. The work shows the overall closed-circuit television system. Fixed and moving cameras with various lens formats and different angles of view are used. The calculations of the width of the imager's sensitive area and of the lens focal length for the cameras are introduced. The work shows the camera locations and distributions inside and outside the nuclear facility. The technical specifications and parameters for camera selection are tabulated.
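
    The lens-selection arithmetic behind such parameter tables follows directly from the pinhole model: the focal length is fixed by the sensor width and the scene width to be covered at a given distance. The numeric values below are generic examples, not the facility's figures.

      # Focal length and horizontal field of view from sensor and scene geometry.
      import math

      def focal_length_mm(sensor_width_mm, scene_width_m, distance_m):
          return sensor_width_mm * distance_m / scene_width_m     # pinhole approximation

      def horizontal_fov_deg(sensor_width_mm, focal_mm):
          return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

      f = focal_length_mm(sensor_width_mm=4.8, scene_width_m=10.0, distance_m=25.0)
      print(f"focal length ~ {f:.1f} mm, FOV ~ {horizontal_fov_deg(4.8, f):.1f} deg")  # ~12 mm, ~22.6 deg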

  11. Cosmic ray zenith angle distribution at low geomagnetic latitude

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, G [Instituto de Astronomia y Fisica del Espacio, Buenos Aires, Argentina; Gagliardini, A; Ghielmetti, H S

    1977-12-01

    The intensity of secondary charged cosmic rays at different zenith angles was measured by narrow-angle Geiger-Mueller telescopes up to an atmospheric depth of 2 g cm⁻². The angular distribution observed at high altitudes is nearly flat at small angles around the vertical and suggests that the particle intensity peaks at large zenith angles, close to the horizon.

  12. Estimation of signal intensity for online measurement X-ray pinhole camera

    International Nuclear Information System (INIS)

    Dong Jianjun; Liu Shenye; Yang Guohong; Yu Yanning

    2009-01-01

    The signal intensity was estimated for an online-measurement X-ray pinhole camera using a CCD as the recording device. The X-ray signal intensity counts after attenuation by Be filters of varying thickness and by flat mirrors of different materials were estimated using the energy spectrum of a certain laser prototype and the quantum efficiency curve of the PI-SX1300 CCD camera. The calculated results indicate that Be filters no thicker than 200 μm can only reduce the signal intensity by one order of magnitude, as can an Au flat mirror at a 3-degree incidence angle and Ni, C and Si flat mirrors at a 5-degree incidence angle, but for both attenuation methods the signal intensity counts remain beyond the saturation counts of the CCD camera. We also calculated the signal attenuation for Be filters of different thickness combined with flat mirrors, which indicates that the combination of a Be filter with a thickness between 20 and 40 μm and an Au flat mirror at a 3-degree incidence angle or a Ni flat mirror at a 5-degree incidence angle is a good choice for attenuating the signal intensity. (authors)
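
    The single-energy version of such a filter estimate is just the Beer-Lambert law; the abstract's numbers come from integrating over the source spectrum and detector response, which is not reproduced here. The attenuation coefficient below is a placeholder, not a value from the paper.

      # Beer-Lambert transmission through a Be filter at one assumed photon energy.
      import math

      def transmitted_fraction(mu_per_cm, thickness_um):
          return math.exp(-mu_per_cm * thickness_um * 1e-4)    # um -> cm

      mu_be = 115.0        # assumed linear attenuation coefficient of Be, 1/cm (placeholder)
      for t_um in (20, 40, 100, 200):
          print(f"{t_um:4d} um Be -> transmission {transmitted_fraction(mu_be, t_um):.3f}")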

  13. Comparison of myocardial perfusion imaging between the new high-speed gamma camera and the standard anger camera

    International Nuclear Information System (INIS)

    Tanaka, Hirokazu; Chikamori, Taishiro; Hida, Satoshi

    2013-01-01

    Cadmium-zinc-telluride (CZT) solid-state detectors have been recently introduced into the field of myocardial perfusion imaging. The aim of this study was to prospectively compare the diagnostic performance of the CZT high-speed gamma camera (Discovery NM 530c) with that of the standard 3-head gamma camera in the same group of patients. The study group consisted of 150 consecutive patients who underwent a 1-day stress-rest 99m Tc-sestamibi or tetrofosmin imaging protocol. Image acquisition was performed first on a standard gamma camera with a 15-min scan time each for stress and for rest. All scans were immediately repeated on a CZT camera with a 5-min scan time for stress and a 3-min scan time for rest, using list mode. The correlations between the CZT camera and the standard camera for perfusion and function analyses were strong within narrow Bland-Altman limits of agreement. Using list mode analysis, image quality for stress was rated as good or excellent in 97% of the 3-min scans, and in 100% of the ≥4-min scans. For CZT scans at rest, similarly, image quality was rated as good or excellent in 94% of the 1-min scans, and in 100% of the ≥2-min scans. The novel CZT camera provides excellent image quality, which is equivalent to standard myocardial single-photon emission computed tomography, despite a short scan time of less than half of the standard time. (author)

  14. X-Ray Powder Diffraction with Guinier - Haegg Focusing Cameras

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Allan

    1970-12-15

    The Guinier - Haegg focusing camera is discussed with reference to its use as an instrument for rapid phase analysis. An actual camera and the alignment procedure employed in its setting up are described. The results obtained with the instrument are compared with those obtained with Debye - Scherrer cameras and powder diffractometers. Exposure times of 15 - 30 minutes with compounds of simple structure are roughly one-sixth of those required for Debye - Scherrer patterns. Coupled with the lower background resulting from the use of a monochromatic X-ray beam, the shorter exposure time gives a ten-fold increase in sensitivity for the detection of minor phases as compared with the Debye - Scherrer camera. Attention is paid to the precautions taken to obtain reliable Bragg angles from Guinier - Haegg film measurements, with particular reference to calibration procedures. The evaluation of unit cell parameters from Guinier - Haegg data is discussed together with the application of tests for the presence of angle-dependent systematic errors. It is concluded that with proper calibration procedures and least squares treatment of the data, accuracies of the order of 0.005% are attainable. A compilation of diffraction data for a number of compounds examined in the Active Central Laboratory at Studsvik is presented to exemplify the scope of this type of powder camera.

  15. X-Ray Powder Diffraction with Guinier - Haegg Focusing Cameras

    International Nuclear Information System (INIS)

    Brown, Allan

    1970-12-01

    The Guinier - Haegg focusing camera is discussed with reference to its use as an instrument for rapid phase analysis. An actual camera and the alignment procedure employed in its setting up are described. The results obtained with the instrument are compared with those obtained with Debye - Scherrer cameras and powder diffractometers. Exposure times of 15 - 30 minutes with compounds of simple structure are roughly one-sixth of those required for Debye - Scherrer patterns. Coupled with the lower background resulting from the use of a monochromatic X-ray beam, the shorter exposure time gives a ten-fold increase in sensitivity for the detection of minor phases as compared with the Debye - Scherrer camera. Attention is paid to the precautions taken to obtain reliable Bragg angles from Guinier - Haegg film measurements, with particular reference to calibration procedures. The evaluation of unit cell parameters from Guinier - Haegg data is discussed together with the application of tests for the presence of angle-dependent systematic errors. It is concluded that with proper calibration procedures and least squares treatment of the data, accuracies of the order of 0.005% are attainable. A compilation of diffraction data for a number of compounds examined in the Active Central Laboratory at Studsvik is presented to exemplify the scope of this type of powder camera

  16. Angle Performance on Optima XE

    International Nuclear Information System (INIS)

    David, Jonathan; Satoh, Shu

    2011-01-01

    Angle control on high energy implanters is important due to shrinking device dimensions, and sensitivity to channeling at high beam energies. On Optima XE, beam-to-wafer angles are controlled in both the horizontal and vertical directions. In the horizontal direction, the beam angle is measured through a series of narrow slits, and any angle adjustment is made by steering the beam with the corrector magnet. In the vertical direction, the beam angle is measured through a high aspect ratio mask, and any angle adjustment is made by slightly tilting the wafer platen during implant.Using a sensitive channeling condition, we were able to quantify the angle repeatability of Optima XE. By quantifying the sheet resistance sensitivity to both horizontal and vertical angle variation, the total angle variation was calculated as 0.04 deg. (1σ). Implants were run over a five week period, with all of the wafers selected from a single boule, in order to control for any crystal cut variation.

  17. Direct cone beam SPECT reconstruction with camera tilt

    International Nuclear Information System (INIS)

    Jianying Li; Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.; Zongjian Cao; Tsui, B.M.W.

    1993-01-01

    A filtered backprojection (FBP) algorithm is derived to perform cone beam (CB) single-photon emission computed tomography (SPECT) reconstruction with camera tilt using circular orbits. This algorithm reconstructs the tilted angle CB projection data directly by incorporating the tilt angle into it. When the tilt angle becomes zero, this algorithm reduces to that of Feldkamp. Experimentally acquired phantom studies using both a two-point source and the three-dimensional Hoffman brain phantom have been performed. The transaxial tilted cone beam brain images and profiles obtained using the new algorithm are compared with those without camera tilt. For those slices which have approximately the same distance from the detector in both tilt and non-tilt set-ups, the two transaxial reconstructions have similar profiles. The two-point source images reconstructed from this new algorithm and the tilted cone beam brain images are also compared with those reconstructed from the existing tilted cone beam algorithm. (author)

  18. Stereo Cameras for Clouds (STEREOCAM) Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Romps, David [Univ. of California, Berkeley, CA (United States); Oktem, Rusen [Univ. of California, Berkeley, CA (United States)

    2017-10-31

    The three pairs of stereo camera setups aim to provide synchronized and stereo calibrated time series of images that can be used for 3D cloud mask reconstruction. Each camera pair is positioned at approximately 120 degrees from the other pair, with a 17°-19° pitch angle from the ground, and at 5-6 km distance from the U.S. Department of Energy (DOE) Central Facility at the Atmospheric Radiation Measurement (ARM) Climate Research Facility Southern Great Plains (SGP) observatory to cover the region from northeast, northwest, and southern views. Images from both cameras of the same stereo setup can be paired together to obtain 3D reconstruction by triangulation. 3D reconstructions from the ring of three stereo pairs can be combined together to generate a 3D mask from surrounding views. This handbook delivers all stereo reconstruction parameters of the cameras necessary to make 3D reconstructions from the stereo camera images.

  19. Influence of narrow fuel spray angle and split injection strategies on combustion efficiency and engine performance in a common rail direct injection diesel engine

    Directory of Open Access Journals (Sweden)

    Raouf Mobasheri

    2017-03-01

    Full Text Available Direct injection diesel engines have been widely used in transportation and stationary power systems because of their inherently high thermal efficiency. On the other hand, emission regulations such as those for NOx and particulates have become more stringent in recent years from the standpoint of preserving the environment. In this study, previous results on multiple injection strategies have been further investigated to analyze the effects of a narrow fuel spray angle on optimum multiple injection schemes in a heavy-duty common rail direct injection diesel engine. An advanced computational fluid dynamics simulation has been carried out on a Caterpillar 3401 diesel engine for a conventional part-load condition at 1600 r/min and two exhaust gas recirculation rates. Good agreement between calculated and measured in-cylinder pressure, heat release rate and pollutant formation trends was obtained under various operating points. Three different included spray angles have been studied in comparison with the traditional spray injection angle. The results show that spray targeting is very effective for controlling the in-cylinder mixture distributions, especially when accompanied by various injection strategies. It was found that the optimum engine performance for simultaneous reduction of soot and NOx emissions was achieved with a 105° included spray angle along with an optimized split injection strategy. In this case, the fuel spray impinges on the edge of the piston bowl and a counterclockwise flow motion is generated that pushes the mixture toward the center of the piston bowl.

  20. Interference-induced angle-independent acoustical transparency

    International Nuclear Information System (INIS)

    Qi, Lehua; Yu, Gaokun; Wang, Ning; Wang, Xinlong; Wang, Guibo

    2014-01-01

    It is revealed that the Fano-like interference leads to the extraordinary acoustic transmission through a slab metamaterial of thickness much smaller than the wavelength, with each unit cell consisting of a Helmholtz resonator and a narrow subwavelength slit. More importantly, both the theoretical analysis and experimental measurement show that the angle-independent acoustical transparency can be realized by grafting a Helmholtz resonator and a quarter-wave resonator to the wall of a narrow subwavelength slit in each unit cell of a slit array. The observed phenomenon results from the interferences between the waves propagating in the slit, those re-radiated by the Helmholtz resonator, and those re-radiated by the quarter-wave resonator. The proposed design may find its applications in designing angle-independent acoustical filters and controlling the phase of the transmitted waves

  1. Combined ab interno trabeculotomy and lens extraction: a novel management option for combined uveitic and chronic narrow angle raised intraocular pressure.

    Science.gov (United States)

    Lin, Siying; Gupta, Bhaskar; Rossiter, Jonathan

    2016-02-01

    Minimally invasive glaucoma surgery is a developing area that has the potential to replace traditional glaucoma surgery, with its known risk profile, but at present there are no randomised controlled data to validate its use. We report on a case where sequential bilateral combined ab interno trabeculotomy and lens extraction surgery was performed on a 45-year-old woman with combined uveitic and chronic narrow angle raised intraocular pressure. Maximal medical management alone could not control the intraocular pressure. At 12-month follow-up, the patient had achieved stable intraocular pressure in both eyes on a combination of topical ocular antiglaucomatous and steroid therapies. This case demonstrates the effectiveness of trabecular meshwork ablation via ab interno trabeculotomy in a case of complex mixed mechanism glaucoma. 2016 BMJ Publishing Group Ltd.

  2. Cancer of the splenic flexure of the colon. Presentation of a case

    International Nuclear Information System (INIS)

    Martinez Sanchez, Yariana; De la Rosa Perez, Nereida; Barcelo Casanova, Renato E

    2010-01-01

    Colon cancer is currently an important public health problem in developed countries. It is the fourth most common cancer in the world. We report the case of a 65-year-old black female patient who attended our clinic with dyspeptic disturbances as the only symptom and without known risk factors. A barium enema was indicated and a distal narrowing was observed at the splenic flexure of the colon, in the same zone as the physiological narrowing at that level. A colonoscopy was carried out, diagnosing a left colon tumour near the splenic flexure. The patient was operated on, with segmental resection of the splenic flexure, and a biopsy was performed. Pathology reported a well-differentiated colon adenocarcinoma.

  3. Evaluation of the anterior chamber angle in Asian Indian eyes by ultrasound biomicroscopy and gonioscopy.

    Science.gov (United States)

    Kaushik, Sushmita; Jain, Rajeev; Pandav, Surinder Singh; Gupta, Amod

    2006-09-01

    To compare the ultrasound biomicroscopic measurement of the anterior chamber angle in Asian Indian eyes with the angle width estimated by gonioscopy. Patients with open and closed angles attending a glaucoma clinic were recruited for the study. The temporal quadrants of the patients' angles were categorized by gonioscopy as Grade 0 to Grade 4, using Shaffer's classification. These angles were quantified by ultrasound biomicroscopy (UBM) using the following biometric characteristics: angle opening distance at 250 µm (AOD 250) and 500 µm (AOD 500) from the scleral spur, and trabecular meshwork-ciliary process distance (TCPD). The angles were further segregated as "narrow angles" (Shaffer Grade 2 or less) and "open angles" (Shaffer Grade 3 and 4). The UBM measurements were computed in each case and analyzed in relation to the gonioscopic angle evaluation. One hundred and sixty-three eyes of 163 patients were analyzed; 106 eyes had "narrow angles" and 57 eyes had "open angles" on gonioscopy. There was a significant difference among the mean UBM measurements of each angle grade estimated by gonioscopy, and the correlation with the gonioscopy grades was significant at the 0.01 level. The mean AOD 250, AOD 500 and TCPD were 58±49 µm, 102±84 µm and 653±124 µm respectively in narrow angles, and 176±47 µm, 291±62 µm and 883±94 µm in eyes with open angles. The angle width estimated by gonioscopy correlated significantly with the angle dimensions measured by UBM. Gonioscopy, though a subjective test, is a reliable method for estimation of the angle width.

  4. TRANSFORMATION ALGORITHM FOR IMAGES OBTAINED BY OMNIDIRECTIONAL CAMERAS

    Directory of Open Access Journals (Sweden)

    V. P. Lazarenko

    2015-01-01

    Full Text Available Omnidirectional optoelectronic systems find their application in areas where a wide viewing angle is critical. However, omnidirectional optoelectronic systems have a large distortion that makes their application more difficult. The paper compares the projection functions of traditional perspective lenses and omnidirectional wide angle fish-eye lenses with a viewing angle not less than 180°. This comparison proves that distortion models of omnidirectional cameras cannot be described as a deviation from the classic pinhole camera model. To solve this problem, an algorithm for transforming omnidirectional images has been developed. The paper provides a brief comparison of the four calibration methods available in open source toolkits for omnidirectional optoelectronic systems. The geometrical projection model used for calibration of the omnidirectional optical system is given. The algorithm consists of three basic steps. At the first step, we calculate the field of view of a virtual pinhole PTZ camera; this field of view is characterized by an array of 3D points in the object space. At the second step, the array of corresponding pixels for these three-dimensional points is calculated. Then we calculate the projection function that expresses the relation between a given 3D point in the object space and the corresponding pixel point. In this paper we use a calibration procedure providing the projection function for the calibrated instance of the camera. At the last step, the final image is formed pixel-by-pixel from the original omnidirectional image using the calculated array of 3D points and the projection function. The developed algorithm makes it possible to obtain an image for a part of the field of view of an omnidirectional optoelectronic system, with the distortion corrected, from the original omnidirectional image. The algorithm is designed for operation with omnidirectional optoelectronic systems with both catadioptric and fish-eye lenses.
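
    The three-step transformation above can be implemented as a lookup-table remap. The sketch below (Python with OpenCV and NumPy) is only illustrative: it assumes a calibrated projection function project(ray) that returns the omnidirectional-image pixel for a 3D direction, and the virtual pinhole parameters (output size, focal length f, orientation R) are hypothetical placeholders rather than values from the paper.

        import cv2
        import numpy as np

        def build_remap(project, out_size=(640, 480), f=300.0, R=np.eye(3)):
            """Steps 1-2: ray directions of a virtual pinhole (PTZ) view and the
            omnidirectional pixels they map to."""
            w, h = out_size
            u, v = np.meshgrid(np.arange(w), np.arange(h))
            # Step 1: 3D viewing directions of the virtual pinhole camera.
            rays = np.stack([u - w / 2.0, v - h / 2.0, np.full(u.shape, f)], axis=-1)
            rays = rays @ R.T                       # orient the virtual view (pan/tilt)
            # Step 2: source pixel for every ray, via the calibrated projection function.
            map_x = np.empty((h, w), np.float32)
            map_y = np.empty((h, w), np.float32)
            for i in range(h):
                for j in range(w):
                    map_x[i, j], map_y[i, j] = project(rays[i, j])
            return map_x, map_y

        def perspective_view(omni_image, project, **kwargs):
            """Step 3: build the corrected view pixel-by-pixel from the omnidirectional image."""
            map_x, map_y = build_remap(project, **kwargs)
            return cv2.remap(omni_image, map_x, map_y, cv2.INTER_LINEAR)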

  5. Angle imaging: Advances and challenges

    Science.gov (United States)

    Quek, Desmond T L; Nongpiur, Monisha E; Perera, Shamira A; Aung, Tin

    2011-01-01

    Primary angle closure glaucoma (PACG) is a major form of glaucoma in large populous countries in East and South Asia. The high visual morbidity from PACG is related to the destructive nature of the asymptomatic form of the disease. Early detection of anatomically narrow angles is important and the subsequent prevention of visual loss from PACG depends on an accurate assessment of the anterior chamber angle (ACA). This review paper discusses the advantages and limitations of newer ACA imaging technologies, namely ultrasound biomicroscopy, Scheimpflug photography, anterior segment optical coherence tomography and EyeCam, highlighting the current clinical evidence comparing these devices with each other and with clinical dynamic indentation gonioscopy, the current reference standard. PMID:21150037

  6. Region of Interest Selection Interface for Wide-Angle Arthroscope

    Directory of Open Access Journals (Sweden)

    Jung Kyunghwa

    2015-01-01

    Full Text Available We have proposed a new interface for a wide-angle endoscope for solo surgery. The wide-angle arthroscopic view and a magnified region of interest (ROI) within the wide view were shown simultaneously. With a camera affixed to the surgical instrument, the position of the ROI could be determined by manipulating the surgical instrument. Image features acquired by the A-KAZE approach were used to estimate the change of position of the surgical instrument by tracking the features every time the camera moved. We examined the accuracy of ROI selection using three different images consisting of different-sized square arrays, and we also performed phantom experiments. The success rate was best when the number of ROIs was twelve, and the rate diminished as the size of the ROIs decreased. The experimental results showed that the method of using a camera without additional sensors satisfied the accuracy required for ROI selection, and this interface was helpful in performing surgery with fewer assistants.
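
    A minimal sketch of the feature-tracking step, assuming OpenCV's A-KAZE implementation and consecutive grayscale frames from the instrument-mounted camera; the median-shift summary used to move the ROI cursor is an illustrative simplification, not the paper's exact estimator.

        import cv2
        import numpy as np

        akaze = cv2.AKAZE_create()
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

        def estimate_shift(prev_gray, curr_gray):
            """2D image shift between frames, used as a proxy for instrument motion."""
            kp1, des1 = akaze.detectAndCompute(prev_gray, None)
            kp2, des2 = akaze.detectAndCompute(curr_gray, None)
            if des1 is None or des2 is None:
                return np.zeros(2)
            matches = matcher.match(des1, des2)
            if not matches:
                return np.zeros(2)
            shifts = [np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt) for m in matches]
            return np.median(shifts, axis=0)        # robust against mismatched features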

  7. Software for fast cameras and image handling on MAST

    International Nuclear Information System (INIS)

    Shibaev, S.

    2008-01-01

    The rapid progress in fast imaging gives new opportunities for fusion research. The data obtained by fast cameras play an important and ever-increasing role in analysis and understanding of plasma phenomena. The fast cameras produce a huge amount of data which creates considerable problems for acquisition, analysis, and storage. We use a number of fast cameras on the Mega-Amp Spherical Tokamak (MAST). They cover several spectral ranges: broadband visible, infra-red and narrow band filtered for spectroscopic studies. These cameras are controlled by programs developed in-house. The programs provide full camera configuration and image acquisition in the MAST shot cycle. Despite the great variety of image sources, all images should be stored in a single format. This simplifies development of data handling tools and hence the data analysis. A universal file format has been developed for MAST images which supports storage in both raw and compressed forms, using either lossless or lossy compression. A number of access and conversion routines have been developed for all languages used on MAST. Two movie-style display tools have been developed: one Windows-native and one Qt-based for Linux. The camera control programs run as autonomous data acquisition units with full camera configuration set and stored locally. This allows easy porting of the code to other data acquisition systems. The software developed for MAST fast cameras has been adapted for several other tokamaks where it is in regular use.
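
    The abstract does not give the actual file layout, so the following Python fragment is only a sketch of the general idea of a single container format that holds either raw or losslessly compressed frames; the magic string, header fields and byte order are invented for illustration.

        import struct
        import zlib
        import numpy as np

        MAGIC = b"IMG1"                             # hypothetical format identifier

        def write_frame(fh, frame: np.ndarray, compress=True):
            """Append one 8-bit frame to an open binary file, raw or zlib-compressed."""
            payload = frame.tobytes()
            if compress:
                payload = zlib.compress(payload)    # lossless option
            header = struct.pack("<4sHHBI", MAGIC, frame.shape[1], frame.shape[0],
                                 1 if compress else 0, len(payload))
            fh.write(header)
            fh.write(payload)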

  8. Gamma cameras - a method of evaluation

    International Nuclear Information System (INIS)

    Oates, L.; Bibbo, G.

    2000-01-01

    Full text: With the sophistication and longevity of the modern gamma camera it is not often that the need arises to evaluate a gamma camera for purchase. We have recently been placed in the position of retiring our two single headed cameras of some vintage and replacing them with a state of the art dual head variable angle gamma camera. The process used for the evaluation consisted of five parts: (1) Evaluation of the technical specification as expressed in the tender document; (2) A questionnaire adapted from the British Society of Nuclear Medicine; (3) Site visits to assess gantry configuration, movement, patient access and occupational health, welfare and safety considerations; (4) Evaluation of the processing systems offered; (5) Whole of life costing based on equally configured systems. The results of each part of the evaluation were expressed using a weighted matrix analysis with each of the criteria assessed being weighted in accordance with their importance to the provision of an effective nuclear medicine service for our centre and the particular importance to paediatric nuclear medicine. This analysis provided an objective assessment of each gamma camera system from which a purchase recommendation was made. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  9. Delay line clipping in a scintillation camera system

    International Nuclear Information System (INIS)

    Hatch, K.F.

    1979-01-01

    The present invention provides a novel base line restoring circuit and a novel delay line clipping circuit in a scintillation camera system. Single and double delay line clipped signal waveforms are generated to increase the operational frequency and the fidelity of data detection of the camera system, which would otherwise be degraded by base line distortion such as undershooting, overshooting, and capacitive build-up. The camera system includes a set of photomultiplier tubes and associated amplifiers which generate sequences of pulses. These pulses are pulse-height analyzed for detecting a scintillation having an energy level which falls within a predetermined energy range. Data pulses are combined to provide the coordinates and energy of photopeak events. The amplifiers are biased out of saturation over all ranges of pulse energy level and count rate. Single delay line clipping circuitry is provided for narrowing the pulse width of the decaying electrical data pulses, which increases operating speed without the occurrence of data loss. (JTA)

  10. Evaluation of the anterior chamber angle in Asian Indian eyes by ultrasound biomicroscopy and gonioscopy

    Directory of Open Access Journals (Sweden)

    Kaushik Sushmita

    2006-01-01

    Full Text Available Purpose: To compare the ultrasound biomicroscopic measurement of the anterior chamber angle in Asian Indian eyes, with the angle width estimated by gonioscopy. Materials and Methods: Participants: Patients with open and closed angles attending a glaucoma clinic were recruited for the study. Observation Procedures: Temporal quadrants of the angles of patients were categorized by gonioscopy as Grade 0 to Grade 4, using Shaffer's classification. These angles were quantified by ultrasound biomicroscopy (UBM) using the following biometric characteristics: Angle opening distance at 250 µ (AOD 250) and 500 µ (AOD 500) from the scleral spur and trabecular meshwork-ciliary process distance (TCPD). The angles were further segregated as "narrow angles" (Shaffer's Grade 2 or less) and "open angles" (Shaffer's Grade 3 and 4). Main Outcome Measures: The UBM measurements were computed in each case and analyzed in relation to the gonioscopic angle evaluation. Results: One hundred and sixty three eyes of 163 patients were analyzed. One hundred and six eyes had "narrow angles" and 57 eyes had "open angles" on gonioscopy. There was a significant difference among the mean UBM measurements of each angle grade estimated by gonioscopy (P < 0.001). The Pearson correlation coefficient between all UBM parameters and gonioscopy grades was significant at the 0.01 level. The mean AOD 250, AOD 500 and TCPD in narrow angles were 58±49 µ, 102±84 µ and 653±124 µ respectively, while they were 176±47 µ, 291±62 µ and 883±94 µ in eyes with open angles (P < 0.001) respectively. Conclusions: The angle width estimated by gonioscopy correlated significantly with the angle dimensions measured by UBM. Gonioscopy, though a subjective test, is a reliable method for estimation of the angle width.

  11. Object tracking using multiple camera video streams

    Science.gov (United States)

    Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford

    2010-05-01

    Two synchronized cameras are utilized to obtain independent video streams to detect moving objects from two different viewing angles. The video frames are directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system is overcoming the effects of occlusions, which could leave an object in partial or full view in one camera while the same object is fully visible in another camera. Object registration is achieved by determining the location of common features in the moving object across simultaneous frames. Perspective differences are adjusted. Combining information from images from multiple cameras increases robustness of the tracking process. Motion tracking is achieved by detecting anomalies caused by the objects' movement across frames in time, both in each individual video stream and in the combined video information. The path of each object is determined heuristically. Accuracy of detection is dependent on the speed of the object as well as variations in direction of motion. Fast cameras increase accuracy but limit the speed and complexity of the algorithm. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that there is an overlap between the scenes from at least two cameras in proximity. An object can then be tracked long distances or across multiple cameras continuously, applicable, for example, in wireless sensor networks for surveillance or navigation.

  12. Computing Installation Parameters Of CCTV Cameras for Traffic Surveillance

    OpenAIRE

    Pratishtha Gupta; G. N. Purohit

    2013-01-01

    For properly installing CCTV cameras on any intersection point for traffic surveillance, some parameters need to be determined in order to get maximum benefit. The height and angle of placement of the CCTV camera are used to determine the view or the area that the camera will cover with proper resolution. The resolution should not be too high to cover less traffic and should not be too low to cover large but hardly distinguishable traffic. This paper concerns the computation of the required CCTV inst...
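
    As a simple illustration of how height and tilt angle constrain the covered area (the paper's own formulation is truncated above, so this is only an assumed geometric sketch), the near and far ground distances seen by the vertical field of view can be computed as follows.

        import math

        def ground_coverage(height_m, tilt_deg, vfov_deg):
            """Near/far ground distances covered by a camera tilted down by tilt_deg
            (measured from the horizontal to the optical axis)."""
            near_angle = math.radians(tilt_deg + vfov_deg / 2.0)   # steepest ray
            far_angle = math.radians(tilt_deg - vfov_deg / 2.0)    # shallowest ray
            near = height_m / math.tan(near_angle)
            far = height_m / math.tan(far_angle) if far_angle > 0 else float("inf")
            return near, far

        # Example: camera 8 m high, tilted 30 degrees down, 40-degree vertical FOV.
        print(ground_coverage(8.0, 30.0, 40.0))     # about (6.7 m, 45.4 m)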

  13. Omnidirectional narrow optical filters for circularly polarized light in a nanocomposite structurally chiral medium.

    Science.gov (United States)

    Avendaño, Carlos G; Palomares, Laura O

    2018-04-20

    We consider the propagation of electromagnetic waves throughout a nanocomposite structurally chiral medium consisting of metallic nanoballs randomly dispersed in a structurally chiral material whose dielectric properties can be represented by a resonant effective uniaxial tensor. It is found that an omnidirectional narrow pass band and two omnidirectional narrow band gaps are created in the blue optical spectrum for right and left circularly polarized light, as well as narrow reflection bands for right circularly polarized light that can be controlled by varying the light incidence angle and the filling fraction of metallic inclusions.

  14. A wide angle view imaging diagnostic with all reflective, in-vessel optics at JET

    Energy Technology Data Exchange (ETDEWEB)

    Clever, M. [Institute of Energy and Climate Research – Plasma Physics, Forschungszentrum Jülich GmbH, Association EURATOM-FZJ, 52425 Jülich (Germany); Arnoux, G.; Balshaw, N. [Euratom/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Garcia-Sanchez, P. [Laboratorio Nacional de Fusion, Asociacion EURATOM-CIEMAT, Madrid (Spain); Patel, K. [Euratom/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom); Sergienko, G. [Institute of Energy and Climate Research – Plasma Physics, Forschungszentrum Jülich GmbH, Association EURATOM-FZJ, 52425 Jülich (Germany); Soler, D. [Winlight System, 135 rue Benjamin Franklin, ZA Saint Martin, F-84120 Pertuis (France); Stamp, M.F.; Williams, J.; Zastrow, K.-D. [Euratom/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxon OX14 3DB (United Kingdom)

    2013-10-15

    Highlights: ► A new wide angle view camera system has been installed at JET. ► The system helps to protect the ITER-like wall plasma facing components from damage. ► The coverage of the vessel by camera observation systems was increased. ► The system comprises an in-vessel part with parabolic and flat mirrors. ► The required image quality for plasma monitoring and wall protection was delivered. -- Abstract: A new wide angle view camera system has been installed at JET in preparation for the ITER-like wall campaigns. It considerably increases the coverage of the vessel by camera observation systems and thereby helps to protect the plasma facing components, which are more fragile than carbon, from damage. The system comprises an in-vessel part with parabolic and flat mirrors and an ex-vessel part with beam splitters, lenses and cameras. The system delivered the image quality required for plasma monitoring and wall protection.

  15. Multi Camera Multi Object Tracking using Block Search over Epipolar Geometry

    Directory of Open Access Journals (Sweden)

    Saman Sargolzaei

    2000-01-01

    Full Text Available We present a strategy for multi-object tracking in a multi-camera environment for surveillance and security applications, where tracking a multitude of subjects is of utmost importance in a crowded scene. Our technique assumes a partially overlapped multi-camera setup where cameras share a common view from different angles to assess the positions and activities of subjects under suspicion. To establish spatial correspondence between camera views we employ an epipolar geometry technique. We propose an overlapped block search method to find the pattern of interest (the target) in new frames. A color pattern update scheme has been considered to further optimize the efficiency of the object tracking when the object pattern changes due to object motion in the fields of view of the cameras. Evaluation of our approach is presented with results on the PETS2007 dataset.
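
    A rough sketch of the epipolar constraint applied to the block search, assuming a fundamental matrix F between the two overlapping views has already been estimated (for example with cv2.findFundamentalMat); the block size, search stride and correlation score are illustrative choices, not the paper's.

        import cv2
        import numpy as np

        def search_along_epipolar(img2, template, pt1, F, stride=4):
            """Slide the target template along the epipolar line of pt1 (a point in
            view 1) and return the best-matching block centre in view 2."""
            h, w = img2.shape[:2]
            th, tw = template.shape[:2]
            a, b, c = (F @ np.array([pt1[0], pt1[1], 1.0])).ravel()  # line ax + by + c = 0
            best_score, best_xy = -1.0, None
            for x in range(tw // 2, w - tw // 2, stride):
                if abs(b) < 1e-9:
                    break                            # near-vertical line not handled here
                y = int(round(-(a * x + c) / b))
                y0, x0 = y - th // 2, x - tw // 2
                if y0 < 0 or y0 + th > h:
                    continue
                patch = img2[y0:y0 + th, x0:x0 + tw]
                score = cv2.matchTemplate(patch, template, cv2.TM_CCOEFF_NORMED)[0, 0]
                if score > best_score:
                    best_score, best_xy = score, (x, y)
            return best_xy, best_score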

  16. ASSESSMENT OF LENS THICKNESS IN ANGLE CLOSURE DISEASE

    Directory of Open Access Journals (Sweden)

    Nishat Sultana Khayoom

    2016-08-01

    Full Text Available BACKGROUND Anterior chamber depth and lens thickness have been considered as important biometric determinants in primary angle-closure glaucoma. Patients with a primary narrow angle may be classified as a primary angle closure suspect (PACS), or as having primary angle closure (PAC) or primary angle closure glaucoma (PACG). 23.9% of patients with primary angle closure disease are in India, which highlights the importance of understanding the disease, its natural history, and its underlying pathophysiology, so that we may try to establish effective methods of treatment and preventative measures to delay, or even arrest, disease progression, thereby reducing visual morbidity. AIM To determine the lens thickness using A-scan biometry and its significance in various stages of angle closure disease. MATERIALS AND METHODS Patients attending the outpatient department at Minto Ophthalmic Hospital between October 2013 and May 2015 were screened for angle closure disease and subsequently evaluated at the glaucoma department. In our study, lens thickness showed a direct correlation with shallowing of the anterior chamber by determining the LT/ACD ratio. A decrease in anterior chamber depth is proportional to the narrowing of the angle, which contributes to the progression of the angle closure disease from apposition to occlusion, enhancing the risk for optic nerve damage and visual field loss. Hence, if the lens thickness values are assessed earlier in the disease process, appropriate intervention can be planned. CONCLUSION Determination of lens changes along with anterior chamber depth and axial length morphometrically can aid in early detection of angle closure. The role of lens extraction for PACG is a subject of increased interest. Lens extraction promotes the benefits of anatomical opening of the angle, IOP reduction and improved vision. This potential intervention may be one among the armamentarium of approaches for PACG. Among the current treatment modalities

  17. Augmented reality glass-free three-dimensional display with the stereo camera

    Science.gov (United States)

    Pang, Bo; Sang, Xinzhu; Chen, Duo; Xing, Shujun; Yu, Xunbo; Yan, Binbin; Wang, Kuiru; Yu, Chongxiu

    2017-10-01

    An improved method for Augmented Reality (AR) glass-free three-dimensional (3D) display is proposed, based on a stereo camera that presents parallax content from different angles through a lenticular lens array. Compared with the previous implementation of AR techniques based on a two-dimensional (2D) panel display with only one viewpoint, the proposed method can realize glass-free 3D display of virtual objects and the real scene with 32 virtual viewpoints. Accordingly, viewers can get abundant 3D stereo information from different viewing angles based on binocular parallax. Experimental results show that this improved method based on the stereo camera can realize AR glass-free 3D display, and both the virtual objects and the real scene have realistic and obvious stereo performance.

  18. Camera Trajectory fromWide Baseline Images

    Science.gov (United States)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    Camera trajectory estimation, which is closely related to the structure from motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self localization, and object recognition. There are essential issues for reliable camera trajectory estimation, for instance, the choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes the image feature matching very difficult (or impossible), and the camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens convertor, are used. The hardware we are using in practice is a combination of a Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. The Nikon FC-E9 is a megapixel omnidirectional add-on convertor with a 180° view angle which provides images of photographic quality. The Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, the image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, that links the radius of the image point r to the

  19. Calibration of action cameras for photogrammetric purposes.

    Science.gov (United States)

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-09-18

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each one of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera, that can provide still images up to 12 Mp and video up to 8 Mp resolution.
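
    A minimal self-calibration sketch in the spirit described above, using OpenCV's chessboard routines in Python; the board size, image folder and the plain pinhole-plus-distortion model are assumptions (a fisheye model may fit the wide-angle lens better), not details taken from the paper.

        import glob
        import cv2
        import numpy as np

        pattern = (9, 6)                             # inner chessboard corners (assumed)
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

        obj_points, img_points, size = [], [], None
        for path in glob.glob("gopro_calib/*.jpg"):  # hypothetical image folder
            gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                obj_points.append(objp)
                img_points.append(corners)
                size = gray.shape[::-1]

        # Intrinsics and distortion, then an undistorted version of one image.
        rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, size, None, None)
        undistorted = cv2.undistort(cv2.imread("gopro_calib/sample.jpg"), K, dist)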

  20. Calibration of Action Cameras for Photogrammetric Purposes

    Directory of Open Access Journals (Sweden)

    Caterina Balletti

    2014-09-01

    Full Text Available The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each one of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera, that can provide still images up to 12 Mp and video up to 8 Mp resolution.

  1. CCD Camera Lens Interface for Real-Time Theodolite Alignment

    Science.gov (United States)

    Wake, Shane; Scott, V. Stanley, III

    2012-01-01

    Theodolites are a common instrument in the testing, alignment, and building of various systems ranging from a single optical component to an entire instrument. They provide a precise way to measure horizontal and vertical angles. They can be used to align multiple objects in a desired way at specific angles. They can also be used to reference a specific location or orientation of an object that has moved. Some systems may require a small margin of error in position of components. A theodolite can assist with accurately measuring and/or minimizing that error. The technology is an adapter for a CCD camera with lens to attach to a Leica Wild T3000 Theodolite eyepiece that enables viewing on a connected monitor, and thus can be utilized with multiple theodolites simultaneously. This technology removes a substantial part of human error by relying on the CCD camera and monitors. It also allows image recording of the alignment, and therefore provides a quantitative means to measure such error.

  2. Large-grazing-angle, multi-image Kirkpatrick-Baez microscope as the front end to a high-resolution streak camera for OMEGA

    International Nuclear Information System (INIS)

    Gotchev, O.V.; Hayes, L.J.; Jaanimagi, P.A.; Knauer, J.P.; Marshall, F.J.; Meyerhofer, D.D.

    2003-01-01

    A high-resolution x-ray microscope with a large grazing angle has been developed, characterized, and fielded at the Laboratory for Laser Energetics. It increases the sensitivity and spatial resolution in planar direct-drive hydrodynamic stability experiments, relevant to inertial confinement fusion research. It has been designed to work as the optical front end of the PJX - a high-current, high-dynamic-range x-ray streak camera. Optical design optimization, results from numerical ray tracing, mirror-coating choice, and characterization have been described previously [O. V. Gotchev, et al., Rev. Sci. Instrum. 74, 2178 (2003)]. This work highlights the optics' unique mechanical design and flexibility and considers certain applications that benefit from it. Characterization of the microscope's resolution in terms of its modulation transfer function over the field of view is shown. Recent results from hydrodynamic stability experiments, diagnosed with the optic and the PJX, are provided to confirm the microscope's advantages as a high-resolution, high-throughput x-ray optical front end for streaked imaging

  3. Optical Enhancement of Exoskeleton-Based Estimation of Glenohumeral Angles

    Science.gov (United States)

    Cortés, Camilo; Unzueta, Luis; de los Reyes-Guzmán, Ana; Ruiz, Oscar E.; Flórez, Julián

    2016-01-01

    In Robot-Assisted Rehabilitation (RAR) the accurate estimation of the patient limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect limb posture, as their kinematic models differ. To address the said limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP in the rehabilitation exoskeleton. Then, the GH joint angles are estimated by combining the estimated marker poses and exoskeleton Forward Kinematics. Such a hybrid system prevents problems related to marker occlusions, reduced camera detection volume, and imprecise joint angle estimation due to the kinematic mismatch of the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method accuracy to marker position estimation errors, due to system calibration errors and marker drifts, has been carried out. The results show that, even with significant errors in the marker position estimation, the method accuracy is adequate for RAR. PMID:27403044
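
    The pose-combination step can be pictured as a simple chain of homogeneous transforms; the sketch below is not the paper's formulation, and all frame names, the Euler sequence and the SciPy-based implementation are illustrative assumptions.

        import numpy as np
        from scipy.spatial.transform import Rotation as R

        def gh_angles(T_trunk_exo, T_exo_cam, T_cam_marker, T_marker_humerus):
            """Each argument is a 4x4 homogeneous transform along the chain
            trunk -> exoskeleton base -> camera -> marker -> humerus."""
            T_trunk_humerus = T_trunk_exo @ T_exo_cam @ T_cam_marker @ T_marker_humerus
            # Read the GH angles off as an intrinsic Y-X-Z Euler sequence (assumed convention).
            return R.from_matrix(T_trunk_humerus[:3, :3]).as_euler("YXZ", degrees=True)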

  4. Review of Calibration Methods for Scheimpflug Camera

    Directory of Open Access Journals (Sweden)

    Cong Sun

    2018-01-01

    Full Text Available The Scheimpflug camera offers a wide range of applications in the field of typical close-range photogrammetry, particle image velocimetry, and digital image correlation, due to the fact that the depth of field of a Scheimpflug camera can be greatly extended according to the Scheimpflug condition. Yet, the conventional calibration methods are not applicable in this case because the assumptions used by classical calibration methodologies are no longer valid for cameras satisfying the Scheimpflug condition. Therefore, various methods have been investigated to solve the problem over the last few years. However, no comprehensive review exists that provides an insight into recent calibration methods of Scheimpflug cameras. This paper presents a survey of recent calibration methods of Scheimpflug cameras with perspective lenses, including the general nonparametric imaging model, and analyzes in detail the advantages and drawbacks of the mainstream calibration models with respect to each other. Real data experiments including calibrations, reconstructions, and measurements are performed to assess the performance of the models. The results reveal that the accuracies of the RMM, PLVM, PCIM, and GNIM are basically equal, while the accuracy of GNIM is slightly lower compared with the other three parametric models. Moreover, the experimental results reveal that the parameters of the tangential distortion are likely coupled with the tilt angle of the sensor in Scheimpflug calibration models. The work of this paper lays the foundation of further research of Scheimpflug cameras.

  5. Generation of tunable narrow-band surface-emitted terahertz radiation in periodically poled lithium niobate.

    Science.gov (United States)

    Weiss, C; Torosyan, G; Avetisyan, Y; Beigang, R

    2001-04-15

    Generation of tunable narrow-band terahertz (THz) radiation perpendicular to the surface of periodically poled lithium niobate by optical rectification of femtosecond pulses is reported. The generated THz radiation can be tuned by use of different poling periods and different observation angles, limited only by the available bandwidth of the pump pulse. Typical bandwidths were 50-100 GHz, depending on the collection angle and the number of periods involved.

  6. A hands-free region-of-interest selection interface for solo surgery with a wide-angle endoscope: preclinical proof of concept.

    Science.gov (United States)

    Jung, Kyunghwa; Choi, Hyunseok; Hong, Hanpyo; Adikrishna, Arnold; Jeon, In-Ho; Hong, Jaesung

    2017-02-01

    A hands-free region-of-interest (ROI) selection interface is proposed for solo surgery using a wide-angle endoscope. A wide-angle endoscope provides images with a larger field of view than a conventional endoscope. With an appropriate selection interface for an ROI, surgeons can also obtain a detailed local view as if they moved a conventional endoscope in a specific position and direction. To manipulate the endoscope without releasing the surgical instrument in hand, a mini-camera is attached to the instrument, and the images taken by the attached camera are analyzed. When a surgeon moves the instrument, the instrument orientation is calculated by image processing. Surgeons can select the ROI with this instrument movement after switching from 'task mode' to 'selection mode.' The accelerated KAZE algorithm is used to track the features of the camera images once the instrument is moved. Both the wide-angle and detailed local views are displayed simultaneously, and a surgeon can move the local view area by moving the mini-camera attached to the surgical instrument. Local view selection for a solo surgery was performed without releasing the instrument. The accuracy of camera pose estimation was not significantly different between camera resolutions, but it was significantly different between background camera images with different numbers of features. The proposed interface thus supports solo surgeries without a camera assistant.

  7. Bubble departure diameter in narrow rectangular channel under rolling condition

    Energy Technology Data Exchange (ETDEWEB)

    Xie, T.; Chen, B.; Yan, X.; Xu, J.; Huang, Y.; Xiao, Z. [Nuclear Power Inst. of China, Chengdu, Sichuan (China)

    2014-07-01

    Forced convective subcooled boiling flow experiments were conducted in a vertical upward narrow rectangular channel under rolling motion. A high-speed digital video camera was used to capture the dynamics of the bubble nucleation process. Bubble departure diameters were obtained from the images. A bubble departure model based on force balance analysis was proposed to predict the bubble departure size under rolling condition by considering the additional centrifugal, tangential and Coriolis force. The proposed model agreed well with the experimental data within the averaged relative deviation of 5%. (author)

  8. A ToF-Camera as a 3D Vision Sensor for Autonomous Mobile Robotics

    Directory of Open Access Journals (Sweden)

    Sobers Lourdu Xavier Francis

    2015-11-01

    Full Text Available The aim of this paper is to deploy a time-of-flight (ToF)-based photonic mixer device (PMD) camera on an Autonomous Ground Vehicle (AGV) whose overall target is to traverse from one point to another in hazardous and hostile environments, employing obstacle avoidance without human intervention. Applying a ToF camera to an AGV is a suitable approach for autonomous robotics because the ToF camera can provide three-dimensional (3D) information at a low computational cost; after calibration and ground testing, the camera is mounted and integrated with the Pioneer mobile robot and used to extract information about obstacles. The workspace is a two-dimensional (2D) world map which has been divided into a grid of cells, where the collision-free path defined by the graph search algorithm is a sequence of cells the AGV can traverse to reach the target. PMD depth data is used to populate traversable areas and obstacles on a grid of cells of suitable size. These camera data are converted into Cartesian coordinates for entry into a workspace grid map. A more optimal camera mounting angle is needed and adopted by analysing the camera's performance discrepancy, such as pixel detection, the detection rate and the maximum perceived distances, and infrared (IR) scattering with respect to the ground surface. This mounting angle is recommended to be half the vertical field-of-view (FoV) of the PMD camera. A series of still and moving tests are conducted on the AGV to verify correct sensor operations, which show that the postulated application of the ToF camera in the AGV is not straightforward. Later, to stabilize the moving PMD camera and to detect obstacles, a tracking feature detection algorithm and the scene flow technique are implemented to perform a real-time experiment.
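
    A rough numerical sketch, with assumed intrinsics and grid resolution, of the conversion from PMD depth pixels to Cartesian points and their entry into a 2D workspace grid; the tilt compensation simply undoes the downward mounting angle, and the crude range gate stands in for the obstacle/floor segmentation that the paper performs.

        import numpy as np

        def depth_to_grid(depth_m, fx, fy, cx, cy, tilt_rad, cell=0.1, shape=(200, 200)):
            h, w = depth_m.shape
            u, v = np.meshgrid(np.arange(w), np.arange(h))
            # Back-project every depth pixel to camera coordinates (x right, y down, z forward).
            x = (u - cx) * depth_m / fx
            y = (v - cy) * depth_m / fy
            z = depth_m
            # Rotate about the camera x-axis to undo the downward mounting tilt.
            y_w = np.cos(tilt_rad) * y - np.sin(tilt_rad) * z
            z_w = np.sin(tilt_rad) * y + np.cos(tilt_rad) * z
            grid = np.zeros(shape, dtype=np.uint8)
            mask = z_w > 0.2                        # crude gate; real segmentation omitted
            gi = (z_w[mask] / cell).astype(int)
            gj = (x[mask] / cell).astype(int) + shape[1] // 2
            ok = (gi >= 0) & (gi < shape[0]) & (gj >= 0) & (gj < shape[1])
            grid[gi[ok], gj[ok]] = 1                # mark occupied cells for the path planner
            return grid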

  9. The first demonstration of the concept of “narrow-FOV Si/CdTe semiconductor Compton camera”

    Energy Technology Data Exchange (ETDEWEB)

    Ichinohe, Yuto, E-mail: ichinohe@astro.isas.jaxa.jp [Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Chuo, Sagamihara, Kanagawa 252-5210 (Japan); University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo 113-0033 (Japan); Uchida, Yuusuke; Watanabe, Shin [Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Chuo, Sagamihara, Kanagawa 252-5210 (Japan); University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo 113-0033 (Japan); Edahiro, Ikumi [Hiroshima University, 1-3-1 Kagamiyama, Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Hayashi, Katsuhiro [Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Chuo, Sagamihara, Kanagawa 252-5210 (Japan); Kawano, Takafumi; Ohno, Masanori [Hiroshima University, 1-3-1 Kagamiyama, Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Ohta, Masayuki [Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Chuo, Sagamihara, Kanagawa 252-5210 (Japan); Takeda, Shin' ichiro [Okinawa Institute of Science and Technology Graduate University, 1919-1 Tancha, Onna-son, Okinawa 904-0495 (Japan); Fukazawa, Yasushi [Hiroshima University, 1-3-1 Kagamiyama, Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Katsuragawa, Miho [Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Chuo, Sagamihara, Kanagawa 252-5210 (Japan); University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo 113-0033 (Japan); Nakazawa, Kazuhiro [University of Tokyo, 7-3-1 Hongo, Bunkyo, Tokyo 113-0033 (Japan); Odaka, Hirokazu [Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Chuo, Sagamihara, Kanagawa 252-5210 (Japan); Tajima, Hiroyasu [Solar-Terrestrial Environment Laboratory, Nagoya University, Furo-cho, Chikusa, Nagoya, Aichi 464-8601 (Japan); Takahashi, Hiromitsu [Hiroshima University, 1-3-1 Kagamiyama, Higashi-Hiroshima, Hiroshima 739-8526 (Japan); and others

    2016-01-11

    The Soft Gamma-ray Detector (SGD), to be deployed on board the ASTRO-H satellite, has been developed to provide the highest sensitivity observations of celestial sources in the energy band of 60–600 keV by employing a detector concept which uses a Compton camera whose field-of-view is restricted by a BGO shield to a few degree (narrow-FOV Compton camera). In this concept, the background from outside the FOV can be heavily suppressed by constraining the incident direction of the gamma ray reconstructed by the Compton camera to be consistent with the narrow FOV. We, for the first time, demonstrate the validity of the concept using background data taken during the thermal vacuum test and the low-temperature environment test of the flight model of SGD on ground. We show that the measured background level is suppressed to less than 10% by combining the event rejection using the anti-coincidence trigger of the active BGO shield and by using Compton event reconstruction techniques. More than 75% of the signals from the field-of-view are retained against the background rejection, which clearly demonstrates the improvement of signal-to-noise ratio. The estimated effective area of 22.8 cm{sup 2} meets the mission requirement even though not all of the operational parameters of the instrument have been fully optimized yet.

  10. Energy characteristics of the double slot in the narrow wall of a rectangular waveguide

    OpenAIRE

    Martynenko, S. A.

    2005-01-01

    Based on approximation of the half-wave field distribution in the slots, an expression is derived for internal mutual conductance of closely-spaced slots, which form a double inclined slot in the narrow wall of a rectangular waveguide. The narrow wall has cut-outs reaching the broad wall. With the use of the method of induced magnetomotive forces, a mathematical model is devised for calculating the energy characteristics of the double slot. The impact of angle of inclination of the slots, dim...

  11. Mars Orbiter Camera Views the 'Face on Mars' - Best View from Viking

    Science.gov (United States)

    1998-01-01

    Shortly after midnight Sunday morning (5 April 1998 12:39 AM PST), the Mars Orbiter Camera (MOC) on the Mars Global Surveyor (MGS) spacecraft successfully acquired a high resolution image of the 'Face on Mars' feature in the Cydonia region. The image was transmitted to Earth on Sunday, and retrieved from the mission computer data base Monday morning (6 April 1998). The image was processed at the Malin Space Science Systems (MSSS) facility at 9:15 AM and the raw image was immediately transferred to the Jet Propulsion Laboratory (JPL) for release to the Internet. The images shown here were subsequently processed at MSSS. The picture was acquired 375 seconds after the spacecraft's 220th close approach to Mars. At that time, the 'Face', located at approximately 40.8° N, 9.6° W, was 275 miles (444 km) from the spacecraft. The 'morning' sun was 25° above the horizon. The picture has a resolution of 14.1 feet (4.3 meters) per pixel, making it ten times higher resolution than the best previous image of the feature, which was taken by the Viking Mission in the mid-1970's. The full image covers an area 2.7 miles (4.4 km) wide and 25.7 miles (41.5 km) long. This Viking Orbiter image is one of the best Viking pictures of the Cydonia area where the 'Face' is located. Marked on the image are the 'footprint' of the high resolution (narrow angle) Mars Orbiter Camera image and the area seen in enlarged views (dashed box). See PIA01440-1442 for these images in raw and processed form. Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  12. The effect of electron collimator leaf shape on the build-up dose in narrow electron MLC fields

    International Nuclear Information System (INIS)

    Vatanen, T; Vaeaenaenen, A; Lahtinen, T; Traneus, E

    2009-01-01

    Previously, we have found that the build-up dose from abutting narrow electron beams formed with unfocussed electron multi-leaf collimator (eMLC) steel leaves was higher than with the respective open field. To investigate more closely the effect of leaf material and shape on dose in the build-up region, straight, round (radius 1.5 cm) and leaf ends with a different front face angle of α (leaf front face pointing towards the beam axis at an angle of 90 - α) made of steel, brass and tungsten were modelled using the BEAMnrc code. Based on a treatment head simulation of a Varian 2100 C/D linac, depth-dose curves and profiles in water were calculated for narrow 6, 12 and 20 MeV eMLC beams (width 1.0 cm, length 10 cm) at source-to-surface distances (SSD) of 102 and 105 cm. The effects of leaf material and front face angle were evaluated based on electron fluence, angle and energy spectra. With a leaf front face angle of 15 deg., the dose in the build-up region of the 6 MeV field varied between 91 and 100%, while for straight and round leaf shapes the dose varied between 89 and 100%. The variation was between 94 and 100% for 12 and 20 MeV. For abutting narrow 6 MeV fields with total field size 5 × 10 cm², the build-up doses at 5 mm depth for the face angle 15 deg. and straight and round leaf shapes were 96% and 86% (SSD 102 cm) and 89% and 85% (SSD 105 cm). With higher energies, the effect of eMLC leaf shape on dose at 5 mm was slight (3-4% units with 12 MeV) and marginal with 20 MeV. The fluence, energy and angle spectra for total and leaf scattered electrons were practically the same for different leaf materials with 6 MeV. With high energies, the spectra for tungsten were more peaked due to lower leaf transmission. Compared with straight leaf ends, the face angle of 15 deg. and round leaf ends led to a 1 mm (for 6 MeV) and between 1 and 5 mm (12 and 20 MeV at a SSD of 105 cm) decrease of therapeutic range and increase of the field size, respectively. However

  13. Dynamic-angle spinning and double rotation of quadrupolar nuclei

    International Nuclear Information System (INIS)

    Mueller, K.T.; California Univ., Berkeley, CA

    1991-07-01

    Nuclear magnetic resonance (NMR) spectroscopy of quadrupolar nuclei is complicated by the coupling of the electric quadrupole moment of the nucleus to local variations in the electric field. The quadrupolar interaction is a useful source of information about local molecular structure in solids, but it tends to broaden resonance lines causing crowding and overlap in NMR spectra. Magic-angle spinning, which is routinely used to produce high resolution spectra of spin-1/2 nuclei like carbon-13 and silicon-29, is incapable of fully narrowing resonances from quadrupolar nuclei when anisotropic second-order quadrupolar interactions are present. Two new sample-spinning techniques are introduced here that completely average the second-order quadrupolar coupling. Narrow resonance lines are obtained and individual resonances from distinct nuclear sites are identified. In dynamic-angle spinning (DAS) a rotor containing a powdered sample is reoriented between discrete angles with respect to high magnetic field. Evolution under anisotropic interactions at the different angles cancels, leaving only the isotropic evolution of the spin system. In the second technique, double rotation (DOR), a small rotor spins within a larger rotor so that the sample traces out a complicated trajectory in space. The relative orientation of the rotors and the orientation of the larger rotor within the magnetic field are selected to average both first- and second-order anisotropic broadening. The theory of quadrupolar interactions, coherent averaging theory, and motional narrowing by sample reorientation are reviewed with emphasis on the chemical shift anisotropy and second-order quadrupolar interactions experienced by half-odd integer spin quadrupolar nuclei. The DAS and DOR techniques are introduced and illustrated with application to common quadrupolar systems such as sodium-23 and oxygen-17 nuclei in solids.

  14. Dynamic-angle spinning and double rotation of quadrupolar nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, K.T. (Lawrence Berkeley Lab., CA (United States) California Univ., Berkeley, CA (United States). Dept. of Chemistry)

    1991-07-01

    Nuclear magnetic resonance (NMR) spectroscopy of quadrupolar nuclei is complicated by the coupling of the electric quadrupole moment of the nucleus to local variations in the electric field. The quadrupolar interaction is a useful source of information about local molecular structure in solids, but it tends to broaden resonance lines causing crowding and overlap in NMR spectra. Magic-angle spinning, which is routinely used to produce high resolution spectra of spin-1/2 nuclei like carbon-13 and silicon-29, is incapable of fully narrowing resonances from quadrupolar nuclei when anisotropic second-order quadrupolar interactions are present. Two new sample-spinning techniques are introduced here that completely average the second-order quadrupolar coupling. Narrow resonance lines are obtained and individual resonances from distinct nuclear sites are identified. In dynamic-angle spinning (DAS) a rotor containing a powdered sample is reoriented between discrete angles with respect to high magnetic field. Evolution under anisotropic interactions at the different angles cancels, leaving only the isotropic evolution of the spin system. In the second technique, double rotation (DOR), a small rotor spins within a larger rotor so that the sample traces out a complicated trajectory in space. The relative orientation of the rotors and the orientation of the larger rotor within the magnetic field are selected to average both first- and second-order anisotropic broadening. The theory of quadrupolar interactions, coherent averaging theory, and motional narrowing by sample reorientation are reviewed with emphasis on the chemical shift anisotropy and second-order quadrupolar interactions experienced by half-odd integer spin quadrupolar nuclei. The DAS and DOR techniques are introduced and illustrated with application to common quadrupolar systems such as sodium-23 and oxygen-17 nuclei in solids.

  15. Laser line scan underwater imaging by complementary metal-oxide-semiconductor camera

    Science.gov (United States)

    He, Zhiyi; Luo, Meixing; Song, Xiyu; Wang, Dundong; He, Ning

    2017-12-01

    This work employs the complementary metal-oxide-semiconductor (CMOS) camera to acquire images in a scanning manner for laser line scan (LLS) underwater imaging to alleviate the backscatter impact of seawater. Two operating features of the CMOS camera, namely the region of interest (ROI) and the rolling shutter, can be utilized to perform image scanning without the difficulty of translating the receiver above the target as the traditional LLS imaging systems have. Using the dynamically reconfigurable ROI of an industrial CMOS camera, we evenly divided the image into five subareas along the pixel rows and then scanned them by changing the ROI region automatically under synchronous illumination by the fan beams of the lasers. Another scanning method was explored by the rolling shutter operation of the CMOS camera. The fan-beam lasers were turned on/off to illuminate narrow zones on the target in good correspondence with the exposure lines during the rolling procedure of the camera's electronic shutter. The frame synchronization between the image scan and the laser beam sweep may be achieved by either the strobe lighting output pulse or the external triggering pulse of the industrial camera. Comparison between the scanning and nonscanning images shows that the contrast of the underwater image can be improved by our LLS imaging techniques, with higher stability and feasibility than the mechanically controlled scanning method.

  16. G-APDs in Cherenkov astronomy: The FACT camera

    International Nuclear Information System (INIS)

    Krähenbühl, T.; Anderhub, H.; Backes, M.; Biland, A.; Boller, A.; Braun, I.; Bretz, T.; Commichau, V.; Djambazov, L.; Dorner, D.; Farnier, C.; Gendotti, A.; Grimm, O.; Gunten, H. von; Hildebrand, D.; Horisberger, U.; Huber, B.; Kim, K.-S.; Köhne, J.-H.; Krumm, B.

    2012-01-01

    Geiger-mode avalanche photodiodes (G-APD, SiPM) are a much discussed alternative to photomultiplier tubes in Cherenkov astronomy. The First G-APD Cherenkov Telescope (FACT) collaboration builds a camera based on a hexagonal array of 1440 G-APDs and has now finalized its construction phase. A light-collecting solid PMMA cone is glued to each G-APD to eliminate dead space between the G-APDs by increasing the active area, and to restrict the light collection angle of the sensor to the reflector area in order to reduce the amount of background light. The processing of the signals is integrated in the camera and includes the digitization using the domino ring sampling chip DRS4.

  17. Azimuthal critical heat flux in narrow rectangular channels

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong Hoon; Noh, Sang Woo; Kim, Sung Joong; Suh, Kune Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-07-01

    Tests were conducted to examine the critical heat flux (CHF) on the one-dimensional downward heating rectangular channel having a narrow gap by changing the orientation of the copper test heater assembly in a pool of saturated water under the atmospheric pressure. The test parameters include both the gap sizes of 1, 2, 5 and 10mm, and the surface orientation angles from the downward-facing position (180{sup o}) to the vertical position (90{sup o}), respectively. Also, the CHF experiments were performed for pool boiling with varying heater surface orientations in the unconfined space at the atmospheric pressure using the rectangular test section. It was observed that the CHF generally decreases as the surface inclination angle increases and as the gap size decreases. In consistency with several studies reported in the literature, it was found that there exists a transition angle above which the CHF changes with a rapid slope. An engineering correlation is developed for the CHF during natural convective boiling in the inclined, confined rectangular channels with the aid of dimensional analysis.

  18. Prognostic Significance of Frontal QRS-T Angle in Patients with Idiopathic Dilated Cardiomyopathy

    Directory of Open Access Journals (Sweden)

    Sheng-Na Li

    2016-01-01

    Conclusions: The frontal QRS-T angle is a powerful predictor of all-cause mortality, cardiac mortality, and worsening heart failure in IDC patients, independent of well-established prognostic factors. Optimized therapy significantly narrows the QRS-T angle, which might be an indicator of medication compliance, but this requires further investigation.

  19. Characterization of a PET Camera Optimized for Prostate Imaging

    International Nuclear Information System (INIS)

    Huber, Jennifer S.; Choong, Woon-Seng; Moses, William W.; Qi, Jinyi; Hu, Jicun; Wang, G.C.; Wilson, David; Oh, Sang; Huesman, Ronald H.; Derenzo, Stephen E.

    2005-01-01

    We present the characterization of a positron emission tomograph for prostate imaging that centers a patient between a pair of external curved detector banks (ellipse: 45 cm minor, 70 cm major axis). The distance between detector banks adjusts to allow patient access and to position the detectors as closely as possible for maximum sensitivity with patients of various sizes. Each bank is composed of two axial rows of 20 HR+ block detectors for a total of 80 detectors in the camera. The individual detectors are angled in the transaxial plane to point towards the prostate to reduce resolution degradation in that region. The detectors are read out by modified HRRT data acquisition electronics. Compared to a standard whole-body PET camera, our dedicated-prostate camera has the same sensitivity and resolution, less background (less randoms and lower scatter fraction) and a lower cost. We have completed construction of the camera. Characterization data and reconstructed images of several phantoms are shown. Sensitivity of a point source in the center is 946 cps/µCi. Spatial resolution is 4 mm FWHM in the central region.

  20. Large-Grazing-Angle, Multi-Image Kirkpatrick-Baez Microscope as the Front End to a High-Resolution Streak Camera for OMEGA

    International Nuclear Information System (INIS)

    Gotchev, O.V.; Hayes, L.J.; Jaanimagi, P.A.; Knauer, J.P.; Marshall, F.J.; Meyerhofer, D. D.

    2003-01-01

    A new, high-resolution x-ray microscope with a large grazing angle has been developed, characterized, and fielded at the Laboratory for Laser Energetics. It increases the sensitivity and spatial resolution in planar direct-drive hydrodynamic stability experiments, relevant to inertial confinement fusion (ICF) research. It has been designed to work as the optical front end of the PJX, a high-current, high-dynamic-range x-ray streak camera. Optical design optimization, results from numerical ray tracing, mirror-coating choice, and characterization have been described previously [O. V. Gotchev, et al., Rev. Sci. Instrum. 74, 2178 (2003)]. This work highlights the optics' unique mechanical design and flexibility and considers certain applications that benefit from it. Characterization of the microscope's resolution in terms of its modulation transfer function (MTF) over the field of view is shown. Recent results from hydrodynamic stability experiments, diagnosed with the optic and the PJX, are provided to confirm the microscope's advantages as a high-resolution, high-throughput x-ray optical front end for streaked imaging.

  1. Environmental Effects on Measurement Uncertainties of Time-of-Flight Cameras

    DEFF Research Database (Denmark)

    Gudmundsson, Sigurjon Arni; Aanæs, Henrik; Larsen, Rasmus

    2007-01-01

    In this paper the effect the environment has on the SwissRanger SR3000 Time-Of-Flight camera is investigated. The accuracy of this camera is highly affected by the scene it is pointed at, such as the reflective properties, color and gloss. The complexity of the scene also has considerable effects on the accuracy, to mention a few: the angle of the objects to the emitted light and the scattering effects of near objects. In this paper a general overview of such known inaccuracy factors is described, followed by experiments illustrating the additional uncertainty factors. Specifically we give a better...

  2. Experimental study on downward two-phase flow in narrow rectangular channel

    Energy Technology Data Exchange (ETDEWEB)

    Kim, T.H.; Jeong, J.H. [Pusan National Univ., Busan (Korea, Republic of)

    2014-07-01

    Adiabatic vertical two-phase flow of air and water through narrow rectangular channels was investigated. The flow was observed using a high-speed camera, and flow regimes were determined by an image-processing program written in MATLAB. The flow regimes in the channel with downward flow are generally similar to those found in previous studies of upward flow; however, at low liquid velocity the downward-flow regimes differ from those reported for upward flow. The flow regimes can be classified into bubbly, cap-bubbly, slug and churn flow. (author)

  3. Effect of Dissolved gas on bubble behavior of subcooled boiling in narrow channel

    International Nuclear Information System (INIS)

    Li Shaodan; Tan Sichao; Xu Chao; Gao Puzhen; Xu Jianjun

    2013-01-01

    An experimental investigation was performed to study the effect of dissolved gas on bubble behavior in a narrow rectangular channel under subcooled boiling conditions. A high-speed digital video camera was used to capture the dynamics of bubbles with or without dissolved gas in the narrow rectangular channel. It is found that dissolved gas has a great influence on bubble behavior under subcooled boiling conditions. The dissolved gas slows down the rate of bubble growth and condensation and makes the variation of the bubble diameter exhibit oscillatory characteristics. This phenomenon is discussed in view of vapor evaporation and condensation. The presence of dissolved gas can facilitate the survival of a bubble and promote the aggregation of bubbles, and can enhance heat transfer in some ways. (authors)

  4. Functional range of movement of the hand: declination angles to reachable space.

    Science.gov (United States)

    Pham, Hai Trieu; Pathirana, Pubudu N; Caelli, Terry

    2014-01-01

    The measurement of the range of hand joint movement is an essential part of clinical practice and rehabilitation. Current methods use three finger joint declination angles of the metacarpophalangeal, proximal interphalangeal and distal interphalangeal joints. In this paper we propose an alternative form of measurement for finger movement. Using the notion of reachable space instead of declination angles has significant advantages. Firstly, it provides a visual and quantifiable method that therapists, insurance companies and patients can easily use to understand the functional capabilities of the hand. Secondly, it eliminates redundant declination angle constraints. Finally, reachable space, defined by a set of reachable fingertip positions, can be measured and constructed by using a modern camera such as the Creative Senz3D or built-in hand gesture sensors such as the Leap Motion Controller. Using cameras or optical-type sensors for this purpose has considerable benefits, such as eliminating or minimizing therapist error and allowing non-contact measurement, in addition to valuable time savings for the clinician. A comparison between using declination angles and reachable space was made based on Hume's experiment on functional range of movement to demonstrate the efficiency of this new approach.

  5. Foot Placement Modification for a Biped Humanoid Robot with Narrow Feet

    Directory of Open Access Journals (Sweden)

    Kenji Hashimoto

    2014-01-01

    This paper describes a walking stabilization control for a biped humanoid robot with narrow feet. Most humanoid robots have larger feet than human beings to maintain their stability during walking. If a robot's feet are as narrow as a human's, it is difficult to realize a stable walk using conventional stabilization controls. The proposed control modifies the foot placement according to the robot's attitude angle. If the robot tends to fall down, the foot angle is modified about the roll axis so that the swing foot contacts the ground horizontally, and the foot-landing point is also shifted laterally to inhibit the robot from falling to the outside. To reduce the foot-landing impact, a virtual compliance control is applied to the vertical axis and the roll and pitch axes of the foot. Verification of the proposed method is conducted through experiments with the biped humanoid robot WABIAN-2R. WABIAN-2R realized knee-bent walking with feet 30 mm in breadth. Moreover, WABIAN-2R equipped with a human-like foot mechanism mimicking the human foot arch structure realized stable walking with knee-stretched, heel-contact, and toe-off motion.

  6. Anterior segment parameters as predictors of intraocular pressure reduction after phacoemulsification in eyes with open-angle glaucoma.

    Science.gov (United States)

    Hsia, Yen C; Moghimi, Sasan; Coh, Paul; Chen, Rebecca; Masis, Marisse; Lin, Shan C

    2017-07-01

    To evaluate intraocular pressure (IOP) change after cataract surgery in eyes with open-angle glaucoma (OAG) and its relationship to angle and anterior segment parameters measured by anterior segment optical coherence tomography (AS-OCT). University of California, San Francisco, California, USA. Prospective case series. Eyes were placed into a narrow-angle group or an open-angle group based on gonioscopy grading. Biometric parameters were measured using AS-OCT (Visante) preoperatively, and IOP 4 months after surgery was obtained. The IOP change and its relationship to AS-OCT parameters were evaluated. Eighty-one eyes of 69 patients were enrolled. The mean age of the patients was 76.8 years. The preoperative IOP was 15.02 mm Hg on a mean of 1.89 glaucoma medications. The average mean deviation of the preoperative visual field was -4.58 dB. The mean IOP reduction was 2.1 mm Hg (12.8%) from the preoperative mean of 15.0 mm Hg. The IOP reduction was significantly greater in eyes with narrow angles than in eyes with open angles (20.4% versus 8.0%) (P = .002). In multivariate analysis, preoperative IOP (β = -0.53, P < .001, R² = 0.40), angle-opening distance at 500 µm (β = 5.83, P = .02, R² = 0.45), angle-opening distance at 750 µm (β = 5.82, P = .001, R² = 0.52), and lens vault (β = -0.002, P = .009, R² = 0.47) were associated with postoperative IOP reduction. In eyes with OAG, IOP reduction after cataract surgery was greater in eyes with narrower angles. Preoperative IOP, angle-opening distance, and lens vault were predictors of IOP reduction. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  7. Smartphone-Guided Needle Angle Selection During CT-Guided Procedures.

    Science.gov (United States)

    Xu, Sheng; Krishnasamy, Venkatesh; Levy, Elliot; Li, Ming; Tse, Zion Tsz Ho; Wood, Bradford John

    2018-01-01

    In CT-guided intervention, translation from a planned needle insertion angle to the actual insertion angle is estimated only with the physician's visuospatial abilities. An iPhone app was developed to reduce reliance on operator ability to estimate and reproduce angles. The iPhone app overlays the planned angle on the smartphone's camera display in real time based on the smartphone's orientation. The needle's angle is selected by visually comparing the actual needle with the guideline in the display. If the smartphone's screen is perpendicular to the planned path, the smartphone shows the Bull's-Eye View mode, in which the angle is selected after the needle's hub overlaps the tip in the camera view. In phantom studies, we evaluated the accuracies of the hardware, the Guideline mode, and the Bull's-Eye View mode and showed the app's clinical efficacy. A proof-of-concept clinical case was also performed. The hardware accuracy was 0.37° ± 0.27° (mean ± SD). The mean error and navigation time were 1.0° ± 0.9° and 8.7 ± 2.3 seconds for a senior radiologist with 25 years' experience and 1.5° ± 1.3° and 8.0 ± 1.6 seconds for a junior radiologist with 4 years' experience. The accuracy of the Bull's-Eye View mode was 2.9° ± 1.1°. Combined CT and smartphone guidance was significantly more accurate than CT-only guidance for the first needle pass (p = 0.046), which led to a smaller final targeting error (mean distance from needle tip to target, 2.5 vs 7.9 mm). Mobile devices can be useful for guiding needle-based interventions. The hardware is low cost and widely available. The method is accurate, effective, and easy to implement.
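
    The abstract does not disclose how the app converts the phone's orientation into the on-screen angle guidance. The following minimal sketch (Python; illustrative sensor values, not the authors' implementation) shows one plausible approach: derive the device tilt from an accelerometer gravity vector and compare it with the planned insertion angle.

        import math

        def device_tilt_deg(ax, ay, az):
            """Tilt of the device's long (y) axis from vertical, from accelerometer
            readings in any consistent unit. Assumes the phone is held roughly still
            so the accelerometer measures gravity only."""
            g = math.sqrt(ax * ax + ay * ay + az * az)
            return math.degrees(math.acos(max(-1.0, min(1.0, ay / g))))

        def angle_error_deg(planned_deg, ax, ay, az):
            """Signed difference between the planned insertion angle and the current
            device tilt; an app could overlay this value on the camera view."""
            return device_tilt_deg(ax, ay, az) - planned_deg

        # example: phone tilted so gravity projects mostly on y with some on z
        print(round(angle_error_deg(20.0, 0.0, 9.2, 3.4), 1))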

  8. A LEGO Mindstorms Brewster angle microscope

    Science.gov (United States)

    Fernsler, Jonathan; Nguyen, Vincent; Wallum, Alison; Benz, Nicholas; Hamlin, Matthew; Pilgram, Jessica; Vanderpoel, Hunter; Lau, Ryan

    2017-09-01

    A Brewster Angle Microscope (BAM) built from a LEGO Mindstorms kit, additional LEGO bricks, and several standard optics components is described. The BAM was built as part of an undergraduate senior project and was designed, calibrated, and used to image phospholipid, cholesterol, soap, and oil films on the surface of water. A BAM uses p-polarized laser light reflected off a surface at the Brewster angle, which ideally yields zero reflectivity. When a film of different refractive index is added to the surface, a small amount of light is reflected, which can be imaged with a microscope camera. Films only one molecule (approximately 1 nm) thick, i.e. monolayers, can be observed easily in the BAM. The BAM was used in a junior-level Physical Chemistry class to observe phase transitions of a monolayer and the collapse of a monolayer deposited on the water surface in a Langmuir trough. Using a photometric calculation, students observed a 7 Å change in monolayer thickness during a phase transition, which was accurate to within 1 Å of the value determined by more advanced methods. As supplementary material, we provide a detailed manual on how to build the BAM, software to control the BAM and camera, and image processing software.
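
    The Brewster condition the instrument exploits can be stated compactly: for light passing from a medium of index n1 into one of index n2, the p-polarized reflectivity vanishes at θ_B = arctan(n2/n1). A minimal sketch (Python; textbook indices n_air ≈ 1.00 and n_water ≈ 1.33 assumed) gives the angle at which the BAM laser must strike the water surface.

        import math

        def brewster_angle_deg(n1, n2):
            """Incidence angle (degrees) at which p-polarized reflection vanishes
            for light travelling from medium n1 into medium n2."""
            return math.degrees(math.atan2(n2, n1))

        # air -> water: the working angle of the BAM laser on a clean surface
        print(round(brewster_angle_deg(1.00, 1.33), 1))  # ~53.1 degrees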

  9. CALIBRATION PROCEDURES ON OBLIQUE CAMERA SETUPS

    Directory of Open Access Journals (Sweden)

    G. Kemper

    2016-06-01

    ... step, with the help of the nadir camera and the GPS/IMU data, an initial orientation correction and radial correction were calculated. With this approach, the whole project was calculated and calibrated in one step. During the iteration process the radial and tangential parameters were switched on individually for the camera heads, and after that the camera constants and principal point positions were checked and finally calibrated. Besides that, the boresight calibration can be performed either on the basis of the nadir camera and its offsets, or independently for each camera without correlation to the others. This must be performed over a complete mission anyway to obtain stability between the single camera heads. Determining the lever arms from the nodal points to the IMU centre needs more caution than for a single camera, especially due to the strong tilt angle. Having prepared all these previous steps, you get a highly accurate sensor that enables fully automated data extraction with a rapid update of your existing data. Frequent monitoring of urban dynamics is then possible in a fully 3D environment.

  10. Dual-mode switching of a liquid crystal panel for viewing angle control

    Science.gov (United States)

    Baek, Jong-In; Kwon, Yong-Hoan; Kim, Jae Chang; Yoon, Tae-Hoon

    2007-03-01

    The authors propose a method to control the viewing angle of a liquid crystal (LC) panel using dual-mode switching. To realize both wide viewing angle (WVA) characteristics and narrow viewing angle (NVA) characteristics with a single LC panel, the authors use two different dark states. The LC layer can be aligned homogeneously parallel to the transmission axis of the bottom polarizer for WVA dark state operation, while it can be aligned vertically for NVA dark state operation. The authors demonstrated that viewing angle control can be achieved with a single panel without any loss of contrast at the front.

  11. The anterior chamber angle width in adults in a tertiary eye hospital ...

    African Journals Online (AJOL)

    2011-03-25

    Mar 25, 2011 ... had visual acuity assessment, visual field analysis, ophthalmoscopy, intraocular pressure measurement, ... Peripheral anterior synechiae were observed in three eyes. ... The high incidence of narrow angles with the near ...

  12. An investigation of the effects of spray angle and injection strategy on dimethyl ether (DME) combustion and exhaust emission characteristics in a common-rail diesel engine

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Seung Hyun; Cha, June Pyo [Graduate School of Hanyang University, Hanyang University, 17 Haengdang-dong, Sungdong-gu, Seoul, 133-791 (Korea); Lee, Chang Sik [Department of Mechanical Engineering, Hanyang University, 17 Haengdang-dong, Sungdong-gu, Seoul 133-791 (Korea)

    2010-11-15

    An experimental investigation was performed on the effects of spray angle and injection strategies (single and multiple) on the combustion characteristics, concentrations of exhaust emissions, and the particle size distribution in a direct-injection (DI) compression ignition engine fueled with dimethyl ether (DME). In this study, two types of narrow-spray-angle injectors (θ_spray = 70° and 60°) were examined and their results were compared with the results of the conventional spray angle (θ_spray = 156°). In addition, to investigate the optimal operating conditions, early single-injection and multiple-injection strategies were employed to reduce cylinder wall-wetting of the injected fuel and to promote the ignition of the premixed charge. The engine test was performed at 1400 rpm, and the injection timings were varied from TDC to BTDC 40° of crank angle. The experimental results showed that the combustion pressure from single combustion for the narrow-angle injectors (θ_spray = 70° and 60°) is increased, as compared to the results of the wide-angle injector (θ_spray = 156°) with an advanced injection timing of BTDC 35°. In addition, two peaks of the rate of heat release (ROHR) are generated by the combustion of air-fuel premixed mixtures. DME combustion for all test injectors indicated low levels of soot emissions at all injection timings. The NO_x emissions for the narrow-angle injectors increased in proportion to the advance in injection timing up to BTDC 25°, whereas up to BTDC 20° for the wide-angle injector. For multiple injections, the first injection with the narrow-angle injectors combusts more actively in terms of combustion pressure and ROHR, and the ignition delay of the second injected fuel is shorter than with the wide-angle injector. However, the second combustion pressure and ROHR were lower than during the first injection, and combustion durations are prolonged, as compared to the wide-angle injector. With

  13. An investigation of the effects of spray angle and injection strategy on dimethyl ether (DME) combustion and exhaust emission characteristics in a common-rail diesel engine

    International Nuclear Information System (INIS)

    Yoon, Seung Hyun; Cha, June Pyo; Lee, Chang Sik

    2010-01-01

    An experimental investigation was performed on the effects of spray angle and injection strategies (single and multiple) on the combustion characteristics, concentrations of exhaust emissions, and the particle size distribution in a direct-injection (DI) compression ignition engine fueled with dimethyl ether (DME). In this study, two types of narrow-spray-angle injectors (θ_spray = 70° and 60°) were examined and their results were compared with the results of the conventional spray angle (θ_spray = 156°). In addition, to investigate the optimal operating conditions, early single-injection and multiple-injection strategies were employed to reduce cylinder wall-wetting of the injected fuel and to promote the ignition of the premixed charge. The engine test was performed at 1400 rpm, and the injection timings were varied from TDC to BTDC 40° of crank angle. The experimental results showed that the combustion pressure from single combustion for the narrow-angle injectors (θ_spray = 70° and 60°) is increased, as compared to the results of the wide-angle injector (θ_spray = 156°) with an advanced injection timing of BTDC 35°. In addition, two peaks of the rate of heat release (ROHR) are generated by the combustion of air-fuel premixed mixtures. DME combustion for all test injectors indicated low levels of soot emissions at all injection timings. The NO_x emissions for the narrow-angle injectors increased in proportion to the advance in injection timing up to BTDC 25°, whereas up to BTDC 20° for the wide-angle injector. For multiple injections, the first injection with the narrow-angle injectors combusts more actively in terms of combustion pressure and ROHR, and the ignition delay of the second injected fuel is shorter than with the wide-angle injector. However, the second combustion pressure and ROHR were lower than during the first injection, and combustion durations are prolonged, as compared to the wide-angle injector. With advanced timing of the first injection, narrow-angle

  14. Payload topography camera of Chang'e-3

    International Nuclear Information System (INIS)

    Yu, Guo-Bin; Liu, En-Hai; Zhao, Ru-Jin; Zhong, Jie; Zhou, Xiang-Dong; Zhou, Wu-Lin; Wang, Jin; Chen, Yuan-Pei; Hao, Yong-Jie

    2015-01-01

    Chang'e-3 was China's first soft-landing lunar probe that achieved a successful roving exploration on the Moon. A topography camera functioning as the lander's “eye” was one of the main scientific payloads installed on the lander. It was composed of a camera probe, an electronic component that performed image compression, and a cable assembly. Its exploration mission was to obtain optical images of the lunar topography in the landing zone for investigation and research. It also observed rover movement on the lunar surface and finished taking pictures of the lander and rover. After starting up successfully, the topography camera obtained static images and video of rover movement from different directions, 360° panoramic pictures of the lunar surface around the lander from multiple angles, and numerous pictures of the Earth. All images of the rover, lunar surface, and the Earth were clear, and those of the Chinese national flag were recorded in true color. This paper describes the exploration mission, system design, working principle, quality assessment of image compression, and color correction of the topography camera. Finally, test results from the lunar surface are provided to serve as a reference for scientific data processing and application. (paper)

  15. Experimental visualization coalesced interaction of sliding bubble near wall in vertical narrow rectangular channel

    International Nuclear Information System (INIS)

    Xu Jianjun; Chen Bingde; Wang Xiaojun

    2011-01-01

    The characteristics of coalesced sliding bubbles were visually observed from the wide side and the narrow side of a narrow rectangular channel using a high-speed digital camera. The results show that coalescence among sliding bubbles is quick, that the newly formed coalesced bubble does not lift off, and that it continues to slide along the heated surface at low heat flux in the isolated-bubble region. The influence region is about 2 times the projected area of the sliding bubble when the sliding bubbles begin to interact. The sliding bubble velocities increase due to the interaction among the bubbles, which contributes to enhanced heat transfer in this region. Finally, the effect of the coalesced interaction of growing bubbles at the nucleation sites on bubble lift-off is discussed and analysed. (authors)

  16. Fallspeed measurement and high-resolution multi-angle photography of hydrometeors in freefall

    OpenAIRE

    T. J. Garrett; C. Fallgatter; K. Shkurko; D. Howlett

    2012-01-01

    We describe here a new instrument for imaging hydrometeors in freefall. The Multi-Angle Snowflake Camera (MASC) captures high resolution photographs of hydrometeors from three angles while simultaneously measuring their fallspeed. Based on the stereoscopic photographs captured over the two months of continuous measurements obtained at a high altitude location within the Wasatch Front in Utah, we derive statistics for fallspeed, hydrometeor size, shape, orientation and aspect ratio. From a sel...

  17. Remote classification from an airborne camera using image super-resolution.

    Science.gov (United States)

    Woods, Matthew; Katsaggelos, Aggelos

    2017-02-01

    The image processing technique known as super-resolution (SR), which attempts to increase the effective pixel sampling density of a digital imager, has gained rapid popularity over the last decade. The majority of the literature focuses on its ability to provide results that are visually pleasing to a human observer. In this paper, we instead examine the ability of SR to improve the resolution-critical capability of an imaging system to perform a classification task from a remote location, specifically from an airborne camera. In order to focus the scope of the study, we address and quantify results for the narrow case of text classification. However, we expect the results to generalize to a large set of related remote classification tasks. We generate theoretical results through simulation, which are corroborated by experiments with a camera mounted on a DJI Phantom 3 quadcopter.

  18. Prospective case series on trabecular-iris angle status after an acute episode of phacomorphic angle closure

    Directory of Open Access Journals (Sweden)

    Jacky Lee

    2013-02-01

    AIM: To investigate the trabecular-iris angle with ultrasound biomicroscopy (UBM) post cataract extraction after an acute attack of phacomorphic angle closure. METHODS: This prospective study involved 10 cases of phacomorphic angle closure that underwent cataract extraction and intraocular lens insertion after intraocular pressure (IOP) lowering. Apart from visual acuity and IOP, the trabecular-iris angle was measured by gonioscopy and UBM at 3 months post attack. RESULTS: In 10 consecutive cases of acute phacomorphic angle closure from December 2009 to December 2010, gonioscopic findings showed peripheral anterior synechiae (PAS) ≤ 90° in 30% of phacomorphic patients and a mean Shaffer grading of 3.1±1.0. UBM showed a mean angle of 37.1°±4.5° in the phacomorphic eye, with the temporal quadrant being the most open, and 37.1°±8.0° in the contralateral uninvolved eye. The mean time from consultation to cataract extraction was 1.4±0.7 days and the mean total duration of phacomorphic angle closure was 3.6±2.8 days, but there was no correlation with the degree of angle closure on UBM (Spearman correlation P=0.7). The presenting mean IOP was 50.5±7.4 mmHg and the mean IOP at 3 months was 10.5±3.4 mmHg, but there were no correlations with the degree of angle closure (Spearman correlations P=0.9). CONCLUSION: An open trabecular-iris angle and normal IOP can be achieved after an acute attack of phacomorphic angle closure if cataract extraction is performed within 1-2 days after IOP control. Gonioscopic findings were in agreement with UBM, which provided a more specific and objective angle measurement. The superior angle is relatively more narrowed compared with the other quadrants. All contralateral eyes in this series had open angles.

  19. Can we Use Low-Cost 360 Degree Cameras to Create Accurate 3d Models?

    Science.gov (United States)

    Barazzetti, L.; Previtali, M.; Roncoroni, F.

    2018-05-01

    360 degree cameras capture the whole scene around a photographer in a single shot. Cheap 360 cameras are a new paradigm in photogrammetry. The camera can be pointed to any direction, and the large field of view reduces the number of photographs. This paper aims to show that accurate metric reconstructions can be achieved with affordable sensors (less than 300 euro). The camera used in this work is the Xiaomi Mijia Mi Sphere 360, which has a cost of about 300 USD (January 2018). Experiments demonstrate that millimeter-level accuracy can be obtained during the image orientation and surface reconstruction steps, in which the solution from 360° images was compared to check points measured with a total station and laser scanning point clouds. The paper will summarize some practical rules for image acquisition as well as the importance of ground control points to remove possible deformations of the network during bundle adjustment, especially for long sequences with unfavorable geometry. The generation of orthophotos from images having a 360° field of view (that captures the entire scene around the camera) is discussed. Finally, the paper illustrates some case studies where the use of a 360° camera could be a better choice than a project based on central perspective cameras. Basically, 360° cameras become very useful in the survey of long and narrow spaces, as well as interior areas like small rooms.

  20. Contact Angle Measurements Using a Simplified Experimental Setup

    Science.gov (United States)

    Lamour, Guillaume; Hamraoui, Ahmed; Buvailo, Andrii; Xing, Yangjun; Keuleyan, Sean; Prakash, Vivek; Eftekhari-Bafrooei, Ali; Borguet, Eric

    2010-01-01

    A basic and affordable experimental apparatus is described that measures the static contact angle of a liquid drop in contact with a solid. The image of the drop is made with a simple digital camera by taking a picture that is magnified by an optical lens. The profile of the drop is then processed with ImageJ free software. The ImageJ contact…
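
    The abstract does not specify how the contact angle is extracted from the drop profile in ImageJ. One simple, commonly used approximation treats a small sessile drop as a spherical cap, for which the contact angle follows from the drop height and the contact-line radius measured on the image; the sketch below (Python; the example pixel values are illustrative, and the spherical-cap assumption holds only for drops well below the capillary length) shows the calculation.

        import math

        def contact_angle_deg(height_px, base_radius_px):
            """Static contact angle from the drop height and contact-line radius
            measured on the image (any consistent unit), assuming the drop is a
            spherical cap: theta = 2 * arctan(h / r)."""
            return math.degrees(2.0 * math.atan2(height_px, base_radius_px))

        # example: a drop 42 px tall with a 100 px base radius
        print(round(contact_angle_deg(42, 100), 1))  # ~45.6 degrees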

  1. A multi-camera system for real-time pose estimation

    Science.gov (United States)

    Savakis, Andreas; Erhard, Matthew; Schimmel, James; Hnatow, Justin

    2007-04-01

    This paper presents a multi-camera system that performs face detection and pose estimation in real time and may be used for intelligent computing within a visual sensor network for surveillance or human-computer interaction. The system consists of a Scene View Camera (SVC), which operates at a fixed zoom level, and an Object View Camera (OVC), which continuously adjusts its zoom level to match objects of interest. The SVC is set to survey the whole field of view. Once a region has been identified by the SVC as a potential object of interest, e.g. a face, the OVC zooms in to locate specific features. In this system, face candidate regions are selected based on skin color and face detection is accomplished using a Support Vector Machine classifier. The locations of the eyes and mouth are detected inside the face region using neural network feature detectors. Pose estimation is performed based on a geometrical model, where the head is modeled as a spherical object that rotates about the vertical axis. The triangle formed by the mouth and eyes defines a vertical plane that intersects the head sphere. By projecting the eyes-mouth triangle onto a two-dimensional viewing plane, equations were obtained that describe the change in its angles as the yaw pose angle increases. These equations are then combined and used for efficient pose estimation. The system achieves real-time performance for live video input. Testing results assessing system performance are presented for both still images and video.
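
    The abstract summarizes, but does not reproduce, the projection equations relating the eyes-mouth triangle to yaw. As a rough illustration only (not the authors' derivation), the sketch below assumes orthographic projection of a spherical head rotating about the vertical axis: vertical feature distances are unchanged by yaw while the projected eye separation shrinks as cos(yaw), so their ratio compared with an assumed frontal-pose ratio yields a yaw estimate. The frontal_ratio value is a hypothetical calibration constant.

        import math

        def estimate_yaw_deg(left_eye, right_eye, mouth, frontal_ratio=1.1):
            """Rough yaw estimate from the eyes-mouth triangle under orthographic
            projection of a spherical head model (an illustrative simplification,
            not the paper's exact equations). Rotation about the vertical axis
            leaves vertical distances unchanged but shrinks the projected eye
            separation by cos(yaw)."""
            eye_sep = abs(right_eye[0] - left_eye[0])
            eye_line_y = 0.5 * (left_eye[1] + right_eye[1])
            eyes_to_mouth = abs(mouth[1] - eye_line_y)
            ratio = eye_sep / eyes_to_mouth
            cos_yaw = max(-1.0, min(1.0, ratio / frontal_ratio))
            return math.degrees(math.acos(cos_yaw))

        # pixel coordinates (x, y) of detected features in the OVC image
        print(round(estimate_yaw_deg((120, 90), (168, 92), (144, 140)), 1))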

  2. Versatility of the CFR algorithm for limited angle reconstruction

    International Nuclear Information System (INIS)

    Fujieda, I.; Heiskanen, K.; Perez-Mendez, V.

    1990-01-01

    The constrained Fourier reconstruction (CFR) algorithm and the iterative reconstruction-reprojection (IRR) algorithm are evaluated based on their accuracy for three types of limited-angle reconstruction problems. The CFR algorithm performs better for problems such as X-ray CT imaging of a nuclear reactor core with one large data gap due to structural blocking of the source and detector pair. For gated heart imaging by X-ray CT, or radioisotope distribution imaging by PET or SPECT using a polygonal array of gamma cameras with insensitive gaps between camera boundaries, the IRR algorithm has a slight advantage over the CFR algorithm, but the difference is not significant

  3. Comparison of Three Smart Camera Architectures for Real-Time Machine Vision System

    Directory of Open Access Journals (Sweden)

    Abdul Waheed Malik

    2013-12-01

    This paper presents a machine vision system for real-time computation of the distance and angle of a camera from a set of reference points located on a target board. Three different smart camera architectures were explored to compare performance parameters such as power consumption, frame speed and latency. Architecture 1 consists of hardware machine vision modules modeled at Register Transfer (RT) level and a soft-core processor on a single FPGA chip. Architecture 2 is a commercially available software-based smart camera, the Matrox Iris GT. Architecture 3 is a two-chip solution composed of hardware machine vision modules on an FPGA and an external microcontroller. Results from a performance comparison show that Architecture 2 has higher latency and consumes much more power than Architectures 1 and 3. However, Architecture 2 benefits from an easy programming model. The smart camera system with an FPGA and an external microcontroller has lower latency and consumes less power than the single-FPGA-chip system with hardware modules and a soft-core processor.

  4. Exploring the Moon at High-Resolution: First Results From the Lunar Reconnaissance Orbiter Camera (LROC)

    Science.gov (United States)

    Robinson, Mark; Hiesinger, Harald; McEwen, Alfred; Jolliff, Brad; Thomas, Peter C.; Turtle, Elizabeth; Eliason, Eric; Malin, Mike; Ravine, A.; Bowman-Cisneros, Ernest

    The Lunar Reconnaissance Orbiter (LRO) spacecraft was launched on an Atlas V 401 rocket from the Cape Canaveral Air Force Station Launch Complex 41 on June 18, 2009. After spending four days in Earth-Moon transit, the spacecraft entered a three month commissioning phase in an elliptical 30×200 km orbit. On September 15, 2009, LRO began its planned one-year nominal mapping mission in a quasi-circular 50 km orbit. A multi-year extended mission in a fixed 30×200 km orbit is optional. The Lunar Reconnaissance Orbiter Camera (LROC) consists of a Wide Angle Camera (WAC) and two Narrow Angle Cameras (NACs). The WAC is a 7-color push-frame camera, which images the Moon at 100 and 400 m/pixel in the visible and UV, respectively, while the two NACs are monochrome narrow-angle linescan imagers with 0.5 m/pixel spatial resolution. LROC was specifically designed to address two of the primary LRO mission requirements and six other key science objectives, including 1) assessment of meter-and smaller-scale features in order to select safe sites for potential lunar landings near polar resources and elsewhere on the Moon; 2) acquire multi-temporal synoptic 100 m/pixel images of the poles during every orbit to unambiguously identify regions of permanent shadow and permanent or near permanent illumination; 3) meter-scale mapping of regions with permanent or near-permanent illumination of polar massifs; 4) repeat observations of potential landing sites and other regions to derive high resolution topography; 5) global multispectral observations in seven wavelengths to characterize lunar resources, particularly ilmenite; 6) a global 100-m/pixel basemap with incidence angles (60° -80° ) favorable for morphological interpretations; 7) sub-meter imaging of a variety of geologic units to characterize their physical properties, the variability of the regolith, and other key science questions; 8) meter-scale coverage overlapping with Apollo-era panoramic images (1-2 m/pixel) to document

  5. Effectiveness of Variable-Gain Kalman Filter Based on Angle Error Calculated from Acceleration Signals in Lower Limb Angle Measurement with Inertial Sensors

    Science.gov (United States)

    Watanabe, Takashi

    2013-01-01

    The wearable sensor system developed by our group, which measures lower limb angles using a Kalman-filtering-based method, was suggested to be useful in the evaluation of gait function for rehabilitation support. However, the variation in its measurement errors needed to be reduced. In this paper, a variable-Kalman-gain method based on an angle error calculated from acceleration signals is proposed to improve measurement accuracy. The proposed method was tested against a fixed-gain Kalman filter and a variable-Kalman-gain method based on acceleration magnitude used in previous studies. First, in angle measurement during treadmill walking, the proposed method measured lower limb angles with the highest accuracy and significantly improved foot inclination angle measurement, while slightly improving shank and thigh inclination angle measurement. The variable-gain method based on acceleration magnitude was not effective for our Kalman filter system. Then, in angle measurement of a rigid body model, the proposed method showed measurement accuracy similar to or higher than results reported in other studies that used markers of a camera-based motion measurement system fixed on a rigid plate together with a sensor, or fixed on the sensor directly. The proposed method was found to be effective in angle measurement with inertial sensors. PMID:24282442
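
    The abstract does not give the filter equations or the rule used to vary the gain. The sketch below is a minimal one-dimensional stand-in (Python; all noise parameters are illustrative): a gyroscope rate drives the prediction, an accelerometer-derived inclination serves as the measurement, and the measurement noise is inflated when that inclination disagrees strongly with the current estimate, which lowers the Kalman gain in the spirit of the proposed method.

        class VariableGainAngleFilter:
            """Minimal 1-D Kalman-style filter for an inclination angle, fusing a
            gyroscope rate with an accelerometer-derived angle. The measurement
            noise R is inflated when the accelerometer angle disagrees strongly
            with the current estimate (e.g. during impacts), which lowers the
            Kalman gain -- an illustrative stand-in for the paper's scheme."""

            def __init__(self, q=0.01, r0=2.0, k_err=5.0):
                self.angle = 0.0    # state: inclination angle [deg]
                self.p = 1.0        # state variance
                self.q = q          # process noise (gyro drift)
                self.r0 = r0        # baseline measurement noise
                self.k_err = k_err  # how strongly the angle error inflates R

            def update(self, gyro_rate_dps, acc_angle_deg, dt):
                # predict with the gyroscope rate
                self.angle += gyro_rate_dps * dt
                self.p += self.q * dt
                # variable gain: large accelerometer disagreement -> large R
                err = abs(acc_angle_deg - self.angle)
                r = self.r0 * (1.0 + self.k_err * err)
                gain = self.p / (self.p + r)
                self.angle += gain * (acc_angle_deg - self.angle)
                self.p *= (1.0 - gain)
                return self.angle

        f = VariableGainAngleFilter()
        print(round(f.update(gyro_rate_dps=10.0, acc_angle_deg=1.2, dt=0.01), 3))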

  6. Vision and spectroscopic sensing for joint tracing in narrow gap laser butt welding

    Science.gov (United States)

    Nilsen, Morgan; Sikström, Fredrik; Christiansson, Anna-Karin; Ancona, Antonio

    2017-11-01

    The automated laser beam butt welding process is sensitive to positioning the laser beam with respect to the joint because a small offset may result in detrimental lack of sidewall fusion. This problem is even more pronounced in case of narrow gap butt welding, where most of the commercial automatic joint tracing systems fail to detect the exact position and size of the gap. In this work, a dual vision and spectroscopic sensing approach is proposed to trace narrow gap butt joints during laser welding. The system consists of a camera with suitable illumination and matched optical filters and a fast miniature spectrometer. An image processing algorithm of the camera recordings has been developed in order to estimate the laser spot position relative to the joint position. The spectral emissions from the laser induced plasma plume have been acquired by the spectrometer, and based on the measurements of the intensities of selected lines of the spectrum, the electron temperature signal has been calculated and correlated to variations of process conditions. The individual performances of these two systems have been experimentally investigated and evaluated offline by data from several welding experiments, where artificial abrupt as well as gradual deviations of the laser beam out of the joint were produced. Results indicate that a combination of the information provided by the vision and spectroscopic systems is beneficial for development of a hybrid sensing system for joint tracing.
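
    The abstract states that an electron temperature signal was derived from the intensities of selected plasma emission lines but does not name the method. A common choice for such monitoring is the Boltzmann two-line ratio for lines of the same species and ionization stage, sketched below (Python); the line constants in the example are placeholders, not values from the paper.

        import math

        K_B_EV = 8.617333262e-5  # Boltzmann constant [eV/K]

        def electron_temperature_K(i1, i2, line1, line2):
            """Boltzmann two-line estimate of the excitation temperature from the
            measured intensities i1, i2 of two emission lines of the same species
            and ionization stage. Each line is (wavelength_nm, A_coeff, g_upper,
            E_upper_eV): T = (E2 - E1) / (k * ln(i1*A2*g2*lam1 / (i2*A1*g1*lam2)))."""
            lam1, a1, g1, e1 = line1
            lam2, a2, g2, e2 = line2
            ratio = (i1 * a2 * g2 * lam1) / (i2 * a1 * g1 * lam2)
            return (e2 - e1) / (K_B_EV * math.log(ratio))

        # two hypothetical Fe I lines (all constants are placeholders)
        line_a = (538.3, 5.6e7, 11, 4.31)
        line_b = (522.7, 3.9e7, 9, 3.25)
        print(round(electron_temperature_K(1.0, 2.4, line_a, line_b), 0))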

  7. Earth elevation map production and high resolution sensing camera imaging analysis

    Science.gov (United States)

    Yang, Xiubin; Jin, Guang; Jiang, Li; Dai, Lu; Xu, Kai

    2010-11-01

    Digital elevation data for the Earth, which affect space camera imaging, have been prepared and the imaging has been analysed. Based on the image-motion velocity matching error required by the TDI CCD integration stages, a statistical experimental method, the Monte Carlo method, is used to calculate the distribution histogram of the Earth's elevation in an image-motion compensation model that includes satellite attitude changes, orbital angular rate changes, latitude, longitude and orbital inclination changes. Then, elevation information for the Earth's surface is read from SRTM data. The Earth elevation map produced for aerospace electronic cameras is compressed and spliced. Elevation data can be retrieved from flash memory according to the latitude and longitude of the shooting point. If the shooting point falls between two data points, linear interpolation is used; linear interpolation better accommodates the variation of rugged mountains and hills. Finally, the deviation framework and camera controller are used to test the behaviour of deviation angle errors; a TDI CCD camera simulation system based on an object-point-to-image-point model is used to analyze the imaging MTF and a cross-correlation similarity measure; and the simulation system accumulates the horizontal and vertical pixel offsets exceeded by TDI CCD imaging to simulate camera imaging when satellite attitude stability changes. This process is practical. It can effectively control the camera memory space and satisfy the image-motion velocity matching and imaging requirements of a high-precision TDI CCD camera.
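
    The abstract describes looking up stored elevation by the latitude and longitude of the shooting point and interpolating linearly between grid points. A minimal sketch of such a lookup on a regular latitude/longitude grid is given below (Python with NumPy); the grid layout, spacing and values are assumptions for illustration, not the mission's actual data format.

        import numpy as np

        def elevation_at(dem, lat0, lon0, step_deg, lat, lon):
            """Bilinear interpolation of a regular lat/lon elevation grid.
            dem[i, j] holds the elevation at (lat0 + i*step_deg, lon0 + j*step_deg);
            the layout and spacing are assumptions for illustration."""
            fi = (lat - lat0) / step_deg
            fj = (lon - lon0) / step_deg
            i, j = int(np.floor(fi)), int(np.floor(fj))
            di, dj = fi - i, fj - j
            return ((1 - di) * (1 - dj) * dem[i, j] + (1 - di) * dj * dem[i, j + 1]
                    + di * (1 - dj) * dem[i + 1, j] + di * dj * dem[i + 1, j + 1])

        dem = np.array([[100.0, 120.0], [140.0, 180.0]])  # toy 2x2 tile, metres
        print(elevation_at(dem, lat0=30.0, lon0=110.0, step_deg=0.1,
                           lat=30.05, lon=110.05))        # 135.0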

  8. Viewing angle switching of patterned vertical alignment liquid crystal display

    International Nuclear Information System (INIS)

    Lim, Young Jin; Jeong, Eun; Chin, Mi Hyung; Lee, Seung Hee; Ji, Seunghoon; Lee, Gi-Dong

    2008-01-01

    Viewing angle control of a patterned vertical alignment (PVA) liquid crystal display using only one panel is investigated. In conventional PVA modes, the vertically aligned liquid crystal (LC) director tilts down in four directions making 45° with respect to the crossed polarizers to exhibit a wide viewing angle. In the viewing angle control device, one pixel is divided into two sub-pixels such that the LC director in the main sub-pixel is controlled to tilt down in multiple directions making an angle with the polarizer, playing the role of the main display with the wide viewing angle, while the LC director in the other sub-pixel is controlled to tilt down along the polarizer axis, playing the role of viewing angle control for the narrow viewing angle. Using sub-pixel control, light leakage or any type of information such as characters and images can be generated in oblique viewing directions without distorting the image quality in the normal direction, which prevents others from peeping at the displayed image by overlapping the displayed image with the generated image

  9. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    Science.gov (United States)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and the development of the retina-like sensor. Image coordinate transformation and sub-pixel interpolation need to be realized for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software is written in VC++ on VS 2010. Experimental results show that the system realizes acquisition and display for both cameras.

  10. Collaborative real-time scheduling of multiple PTZ cameras for multiple object tracking in video surveillance

    Science.gov (United States)

    Liu, Yu-Che; Huang, Chung-Lin

    2013-03-01

    This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human objects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. The three main concerns of the algorithm are (1) imagery of the human object's face for biometric purposes, (2) optimal video quality of the human objects, and (3) minimum hand-off time. Here, we define an objective function based on the expected capture conditions such as the camera-subject distance, pan and tilt angles of capture, face visibility and others. Such an objective function serves to effectively balance the number of captures per subject and the quality of the captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.

  11. 100-ps framing-camera tube

    International Nuclear Information System (INIS)

    Kalibjian, R.

    1978-01-01

    The optoelectronic framing-camera tube described is capable of recording two-dimensional image frames with high spatial resolution in the <100-ps range. Framing is performed by streaking a two-dimensional electron image across narrow slits. The resulting dissected electron line images from the slits are restored into framed images by a restorer deflector operating synchronously with the dissector deflector. The number of framed images on the tube's viewing screen equals the number of dissecting slits in the tube. Performance has been demonstrated in a prototype tube by recording 135-ps-duration framed images of 2.5-mm patterns at the cathode. The limitation in the framing speed is in the external drivers for the deflectors and not in the tube design characteristics. Faster frame speeds in the <100-ps range can be obtained by use of faster deflection drivers

  12. Calibration Procedures in Mid Format Camera Setups

    Science.gov (United States)

    Pivnicka, F.; Kemper, G.; Geissler, S.

    2012-07-01

    A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result in the photogrammetric workflow, awareness of the sensitive parts is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform and the specific values of the mid-format camera make a professional setup with various calibration and misalignment operations necessary. An important part is to have a proper camera calibration. Using aerial images over a well-designed test field with 3D structures and/or different flight altitudes enables the determination of calibration values in the Bingo software. It will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted to the camera in a solid and reliable way. Besides the mechanical work, especially in mounting the camera beside the IMU, two lever arms have to be measured with mm accuracy. Important are the lever arm from the GPS antenna to the IMU's calibrated centre and also the lever arm from the IMU centre to the camera projection centre. In fact, the measurement with a total station is not a difficult task, but the definition of the right centres and the need for using rotation matrices can cause serious accuracy problems. The benefit of small and medium format cameras is that smaller aircraft can also be used. For that reason, a gyro-based stabilized platform is recommended. This means that the IMU must be mounted beside the camera on the stabilizer. The advantage is that the IMU can be used to control the platform; the problematic thing is that the IMU-to-GPS-antenna lever arm is floating. In fact, we have to deal with an additional data stream, the movement values of the stabilizer, to correct the floating lever-arm distances. If the post-processing of the GPS/IMU data, taking the floating lever arms into account, delivers the expected result, the lever arms between the IMU and the camera can be applied

  13. Ectomography - a tomographic method for gamma camera imaging

    International Nuclear Information System (INIS)

    Dale, S.; Edholm, P.E.; Hellstroem, L.G.; Larsson, S.

    1985-01-01

    In computerised gamma camera imaging the projections are readily obtained in digital form, and the number of picture elements may be relatively few. This condition makes emission techniques suitable for ectomography - a tomographic technique for directly visualising arbitrary sections of the human body. The camera rotates around the patient to acquire different projections in a way similar to SPECT. This method differs from SPECT, however, in that the camera is placed at an angle to the rotational axis, and receives two-dimensional, rather than one-dimensional, projections. Images of body sections are reconstructed by digital filtration and combination of the acquired projections. The main advantages of ectomography - a high and uniform resolution, a low and uniform attenuation and a high signal-to-noise ratio - are obtained when imaging sections close and parallel to a body surface. The filtration eliminates signals representing details outside the section and gives the section a certain thickness. Ectomographic transverse images of a line source and of a human brain have been reconstructed. Details within the sections are correctly visualised and details outside are effectively eliminated. For comparison, the same sections have been imaged with SPECT. (author)

  14. Global calibration of multi-cameras with non-overlapping fields of view based on photogrammetry and reconfigurable target

    Science.gov (United States)

    Xia, Renbo; Hu, Maobang; Zhao, Jibin; Chen, Songlin; Chen, Yueling

    2018-06-01

    Multi-camera vision systems are often needed to achieve large-scale and high-precision measurement because these systems have larger fields of view (FOV) than a single camera. Multiple cameras may have no or narrow overlapping FOVs in many applications, which pose a huge challenge to global calibration. This paper presents a global calibration method for multi-cameras without overlapping FOVs based on photogrammetry technology and a reconfigurable target. Firstly, two planar targets are fixed together and made into a long target according to the distance between the two cameras to be calibrated. The relative positions of the two planar targets can be obtained by photogrammetric methods and used as invariant constraints in global calibration. Then, the reprojection errors of target feature points in the two cameras’ coordinate systems are calculated at the same time and optimized by the Levenberg–Marquardt algorithm to find the optimal solution of the transformation matrix between the two cameras. Finally, all the camera coordinate systems are converted to the reference coordinate system in order to achieve global calibration. Experiments show that the proposed method has the advantages of high accuracy (the RMS error is 0.04 mm) and low cost and is especially suitable for on-site calibration.
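
    The abstract describes solving for the camera-to-camera transform by minimizing reprojection errors with the Levenberg-Marquardt algorithm. The sketch below (Python with NumPy/SciPy; the synthetic target points and intrinsics are placeholders) illustrates that optimization step only: a rotation vector and translation are refined so that target points expressed in camera 1 reproject onto their observed pixels in camera 2.

        import numpy as np
        from scipy.optimize import least_squares
        from scipy.spatial.transform import Rotation

        def project(points_cam, K):
            """Pinhole projection of 3-D points given in a camera frame (no distortion)."""
            p = points_cam @ K.T
            return p[:, :2] / p[:, 2:3]

        def residuals(x, pts_cam1, uv_cam2, K2):
            """Reprojection error in camera 2 of points expressed in camera 1, as a
            function of the camera-1 -> camera-2 transform (rotation vector + translation)."""
            R = Rotation.from_rotvec(x[:3]).as_matrix()
            t = x[3:]
            pts_cam2 = pts_cam1 @ R.T + t
            return (project(pts_cam2, K2) - uv_cam2).ravel()

        # placeholder data: target points (metric) in camera 1, their pixel
        # observations in camera 2, and camera 2 intrinsics
        K2 = np.array([[1200.0, 0, 640], [0, 1200.0, 480], [0, 0, 1]])
        pts_cam1 = np.array([[0.1, 0.0, 2.0], [-0.1, 0.05, 2.1], [0.0, -0.1, 1.9],
                             [0.2, 0.1, 2.2], [-0.2, -0.05, 2.0], [0.05, 0.15, 2.05]])
        true = np.r_[0.02, -0.01, 0.03, 0.5, 0.0, 0.1]      # "unknown" transform
        uv_cam2 = project(pts_cam1 @ Rotation.from_rotvec(true[:3]).as_matrix().T
                          + true[3:], K2)

        x0 = np.zeros(6)                                     # initial guess
        sol = least_squares(residuals, x0, args=(pts_cam1, uv_cam2, K2), method='lm')
        print(np.round(sol.x, 4))                            # recovers `true`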

  15. Influence of anatomic landmarks in the virtual environment on simulated angled laparoscope navigation

    OpenAIRE

    Buzink, S.N.; Christie, L.S.; Goossens, R.H.M.; De Ridder, H.; Jakimowicz, J.J.

    2010-01-01

    Background - The aim of this study is to investigate the influence of the presence of anatomic landmarks on the performance of angled laparoscope navigation on the SimSurgery SEP simulator. Methods - Twenty-eight experienced laparoscopic surgeons (familiar with 30º angled laparoscope, >100 basic laparoscopic procedures, >5 advanced laparoscopic procedures) and 23 novices (no laparoscopy experience) performed the Camera Navigation task in an abstract virtual environment (CN-box) and in a virtu...

  16. Be Foil ''Filter Knee Imaging'' NSTX Plasma with Fast Soft X-ray Camera

    International Nuclear Information System (INIS)

    B.C. Stratton; S. von Goeler; D. Stutman; K. Tritz; L.E. Zakharov

    2005-01-01

    A fast soft x-ray (SXR) pinhole camera has been implemented on the National Spherical Torus Experiment (NSTX). This paper presents observations and describes the Be foil Filter Knee Imaging (FKI) technique for reconstructions of an m/n=1/1 mode on NSTX. The SXR camera has a wide-angle (28°) field of view of the plasma. The camera images nearly the entire diameter of the plasma and a comparable region in the vertical direction. SXR photons pass through a beryllium foil and are imaged by a pinhole onto a P47 scintillator deposited on a fiber-optic faceplate. An electrostatic image intensifier demagnifies the visible image by 6:1 to match it to the size of the charge-coupled device (CCD) chip. A pair of lenses couples the image to the CCD chip

  17. Fall speed measurement and high-resolution multi-angle photography of hydrometeors in free fall

    OpenAIRE

    T. J. Garrett; C. Fallgatter; K. Shkurko; D. Howlett

    2012-01-01

    We describe here a new instrument for imaging hydrometeors in free fall. The Multi-Angle Snowflake Camera (MASC) captures high-resolution photographs of hydrometeors from three angles while simultaneously measuring their fall speed. Based on the stereoscopic photographs captured over the two months of continuous measurements obtained at a high altitude location within the Wasatch Front in Utah, we derive statistics for fall speed, hydrometeor size, shape, orientation and asp...

  18. Real-Time Algorithm for Relative Position Estimation Between Person and Robot Using a Monocular Camera

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Uk [Samsung Electronics, Suwon (Korea, Republic of); Sun, Ju Young; Won, Mooncheol [Chungnam Nat'l Univ., Daejeon (Korea, Republic of)

    2013-12-15

    In this paper, we propose a real-time algorithm for estimating the relative position of a person with respect to a robot (camera) using a monocular camera. The algorithm detects the head and shoulder regions of a person using HOG (Histogram of Oriented Gradient) feature vectors and an SVM (Support Vector Machine) classifier. The size and location of the detected area are used for calculating the relative distance and angle between the person and the camera on a robot. To increase the speed of the algorithm, we use a GPU and NVIDIA's CUDA library; the resulting algorithm speed is ∼ 15 Hz. The accuracy of the algorithm is compared with the output of a SICK laser scanner.

  19. Real-Time Algorithm for Relative Position Estimation Between Person and Robot Using a Monocular Camera

    International Nuclear Information System (INIS)

    Lee, Jung Uk; Sun, Ju Young; Won, Mooncheol

    2013-01-01

    In this paper, we propose a real-time algorithm for estimating the relative position of a person with respect to a robot (camera) using a monocular camera. The algorithm detects the head and shoulder regions of a person using HOG (Histogram of Oriented Gradient) feature vectors and an SVM (Support Vector Machine) classifier. The size and location of the detected area are used for calculating the relative distance and angle between the person and the camera on a robot. To increase the speed of the algorithm, we use a GPU and NVIDIA's CUDA library; the resulting algorithm speed is ∼ 15 Hz. The accuracy of the algorithm is compared with the output of a SICK laser scanner
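
    Both records state that the size and location of the detected head-and-shoulder region are used to compute the relative distance and angle, without giving the mapping. A minimal pinhole-model sketch is shown below (Python); the focal length in pixels and the assumed real shoulder width are illustrative calibration values, not figures from the paper.

        import math

        def person_range_bearing(box, image_width_px, focal_px=700.0,
                                 shoulder_width_m=0.45):
            """Approximate distance and bearing to a person from a head-shoulder
            detection box (x, y, w, h) in pixels, using a pinhole camera model.
            focal_px and shoulder_width_m are assumed/calibrated values."""
            x, y, w, h = box
            distance_m = focal_px * shoulder_width_m / w       # similar triangles
            cx = x + 0.5 * w                                    # box centre column
            bearing_deg = math.degrees(math.atan2(cx - 0.5 * image_width_px, focal_px))
            return distance_m, bearing_deg

        d, b = person_range_bearing(box=(500, 180, 90, 120), image_width_px=1280)
        print(round(d, 2), round(b, 1))   # ~3.5 m, a few degrees left of centre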

  20. Search for atmospheric holes with the Viking cameras

    International Nuclear Information System (INIS)

    Frank, L.A.; Sigwarth, J.B.; Craven, J.D.

    1989-01-01

    Images taken with the two ultraviolet cameras on board the Viking spacecraft were examined for evidence of transient decreases of Earth's ultraviolet dayglow. Comparison of near-limb observations of dayglow intensities with those at smaller angles to the nadir with the camera sensitive to OI 130.4 nm emissions supports the existence of transient decreases in the near-nadir dayglow. However, the amount of near-nadir imaging is severely limited and only several significant events are found. More decisive confirmation of the existence of such transient decreases must await a larger survey from another spacecraft. The diameters of these regions as detected with Viking are ∼50 to 100 km. Occurrence frequencies, intensity decreases, and dimensions for these clusters of darkened pixels are similar to those previously reported for such events, or atmospheric holes, as seen in images of the ultraviolet dayglow with Dynamics Explorer 1

  1. Taking it all in : special camera films in 3-D

    Energy Technology Data Exchange (ETDEWEB)

    Harrison, L.

    2006-07-15

    Details of a 360-degree digital camera designed by Immersive Media Telemmersion were presented. The camera has been employed extensively in the United States for homeland security and intelligence-gathering purposes. In Canada, the cameras are now being used by the oil and gas industry. The camera has 11 lenses pointing in all directions and generates high resolution movies that can be analyzed frame-by-frame from every angle. Global positioning satellite data can be gathered during filming so that operators can pinpoint any location. The 11 video streams use more than 100 million pixels per second. After filming, the system displays synchronized, high-resolution video streams, capturing a full motion spherical world complete with directional sound. It can be viewed on a computer monitor, video screen, or head-mounted display. Pembina Pipeline Corporation recently used the Telemmersion system to plot a proposed pipeline route between Alberta's Athabasca region and Edmonton. It was estimated that more than $50,000 was saved by using the camera. The resulting video has been viewed by Pembina's engineering, environmental and geotechnical groups who were able to accurately note the route's river crossings. The cameras were also used to estimate timber salvage. Footage was then given to the operations group, to help staff familiarize themselves with the terrain, the proposed route's right-of-way, and the number of water crossings and access points. Oil and gas operators have also used the equipment on a recently acquired block of land to select well sites. 4 figs.

  2. Optimized fan-shaped chiral metamaterial as an ultrathin narrow-band circular polarizer at visible frequencies

    Science.gov (United States)

    He, Yizhuo; Wang, Xinghai; Ingram, Whitney; Ai, Bin; Zhao, Yiping

    2018-04-01

    Chiral metamaterials have the great ability to manipulate the circular polarizations of light, which can be utilized to build ultrathin circular polarizers. Here we build a narrow-band circular polarizer at visible frequencies based on plasmonic fan-shaped chiral nanostructures. In order to achieve the best optical performance, we systematically investigate how different fabrication factors affect the chiral optical response of the fan-shaped chiral nanostructures, including incident angle of vapor depositions, nanostructure thickness, and post-deposition annealing. The optimized fan-shaped nanostructures show two narrow bands for different circular polarizations with the maximum extinction ratios 7.5 and 6.9 located at wavelength 687 nm and 774 nm, respectively.

  3. Wide-Field Optic for Autonomous Acquisition of Laser Link

    Science.gov (United States)

    Page, Norman A.; Charles, Jeffrey R.; Biswas, Abhijit

    2011-01-01

    An innovation reported in Two-Camera Acquisition and Tracking of a Flying Target, NASA Tech Briefs, Vol. 32, No. 8 (August 2008), p. 20, used a commercial fish-eye lens and an electronic imaging camera for initially locating objects with subsequent handover to an actuated narrow-field camera. But this operated against a dark-sky background. An improved solution involves an optical design based on custom optical components for the wide-field optical system that directly addresses the key limitations in acquiring a laser signal from a moving source such as an aircraft or a spacecraft. The first challenge was to increase the light collection entrance aperture diameter, which was approximately 1 mm in the first prototype. The new design presented here increases this entrance aperture diameter to 4.2 mm, which is equivalent to a more than 16 times larger collection area. One of the trades made in realizing this improvement was to restrict the field-of-view to +80 deg. elevation and 360 azimuth. This trade stems from practical considerations where laser beam propagation over the excessively high air mass, which is in the line of sight (LOS) at low elevation angles, results in vulnerability to severe atmospheric turbulence and attenuation. An additional benefit of the new design is that the large entrance aperture is maintained even at large off-axis angles when the optic is pointed at zenith. The second critical limitation for implementing spectral filtering in the design was tackled by collimating the light prior to focusing it onto the focal plane. This allows the placement of the narrow spectral filter in the collimated portion of the beam. For the narrow band spectral filter to function properly, it is necessary to adequately control the range of incident angles at which received light intercepts the filter. When this angle is restricted via collimation, narrower spectral filtering can be implemented. The collimated beam (and the filter) must be relatively large to

  4. Hydrogen and deuterium NMR of solids by magic-angle spinning

    International Nuclear Information System (INIS)

    Eckman, R.R.

    1982-10-01

    The nuclear magnetic resonance of solids has long been characterized by very large spectral broadening which arises from internuclear dipole-dipole coupling or the nuclear electric quadrupole interaction. These couplings can obscure the smaller chemical shift interaction and make that information unavailable. Two important and difficult cases are those of hydrogen and deuterium. The development of cross polarization, heteronuclear radiofrequency decoupling, and coherent averaging of nuclear spin interactions has provided measurement of chemical shift tensors in solids. Recently, double quantum NMR and double quantum decoupling have led to measurement of deuterium and proton chemical shift tensors, respectively. A general problem of these experiments is the overlapping of the tensor powder pattern spectra of magnetically distinct sites which cannot be resolved. In this work, high resolution NMR of hydrogen and deuterium in solids is demonstrated. For both nuclei, the resonances are narrowed to obtain liquid-like isotropic spectra by high frequency rotation of the sample about an axis inclined at the magic angle, β_m = arccos(3^(-1/2)), with respect to the direction of the external magnetic field. For deuterium, the powder spectra were narrowed by over three orders of magnitude by magic angle rotation with precise control of β. A second approach was the observation of deuterium double quantum transitions under magic angle rotation. For hydrogen, magic angle rotation alone could be applied to obtain the isotropic spectrum when H_D was small. This often occurs naturally when the nuclei are semi-dilute or involved in internal motion. In the general case of large H_D, isotropic spectra were obtained by dilution of ¹H with ²H combined with magic angle rotation. The resolution obtained represents the practical limit for proton NMR of solids.
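    For reference, the magic angle quoted above is a standard value and evaluates to (added here as a brief check, in LaTeX notation):

        \beta_m = \arccos\!\left(3^{-1/2}\right) \approx 54.74^{\circ}

    so spinning the sample about an axis tilted roughly 54.7° from the external field averages the (3cos²β − 1) dependence of the anisotropic interactions to zero.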

  5. An ordinary camera in an extraordinary location: Outreach with the Mars Webcam

    Science.gov (United States)

    Ormston, T.; Denis, M.; Scuka, D.; Griebel, H.

    2011-09-01

    The European Space Agency's Mars Express mission was launched in 2003 and was Europe's first mission to Mars. On-board was a small camera designed to provide ‘visual telemetry’ of the separation of the Beagle-2 lander. After achieving its goal it was shut down while the primary science mission of Mars Express got underway. In 2007 this camera was reactivated by the flight control team of Mars Express for the purpose of providing public education and outreach—turning it into the ‘Mars Webcam’. The camera is a small, 640×480 pixel colour CMOS camera with a wide-angle 30°×40° field of view. This makes it very similar in almost every way to the average home PC webcam. The major difference is that this webcam is not in an average location but is instead in orbit around Mars. On a strict basis of non-interference with the primary science activities, the camera is turned on to provide unique wide-angle views of the planet below. A highly automated process ensures that the observations are scheduled on the spacecraft and then uploaded to the internet as rapidly as possible. There is no intermediate stage, so that visitors to the Mars Webcam blog serve as ‘citizen scientists’. Full raw datasets and processing instructions are provided along with a mechanism to allow visitors to comment on the blog. Members of the public are encouraged to use this in either a personal or an educational context and work with the images. We then take their excellent work and showcase it back on the blog. We even apply techniques developed by them to improve the data and webcam experience for others. The accessibility and simplicity of the images also makes the data ideal for educational use, especially as educational projects can then be showcased on the site as inspiration for others. The oft-neglected target audience of space enthusiasts is also important as this allows them to participate as part of an interplanetary instrument team. This paper will cover the history of the

  6. Development of compact Compton camera for 3D image reconstruction of radioactive contamination

    Science.gov (United States)

    Sato, Y.; Terasaka, Y.; Ozawa, S.; Nakamura Miyamura, H.; Kaburagi, M.; Tanifuji, Y.; Kawabata, K.; Torii, T.

    2017-11-01

    The Fukushima Daiichi Nuclear Power Station (FDNPS), operated by Tokyo Electric Power Company Holdings, Inc., went into meltdown after the large tsunami caused by the Great East Japan Earthquake of March 11, 2011. Very large amounts of radionuclides were released from the damaged plant. Radiation distribution measurements inside the FDNPS buildings are indispensable for executing decommissioning tasks in the reactor buildings. We have developed a compact Compton camera to measure the distribution of radioactive contamination inside the FDNPS buildings three-dimensionally (3D). The total weight of the Compton camera is less than 1.0 kg. The gamma-ray sensor of the Compton camera employs Ce-doped GAGG (Gd3Al2Ga3O12) scintillators coupled with a multi-pixel photon counter. Angular correction of the detection efficiency of the Compton camera was conducted. Moreover, we developed a 3D back-projection method using the multi-angle data measured with the Compton camera. We successfully obtained 3D radiation images of two ¹³⁷Cs radioactive sources, and the image of the 9.2 MBq source appeared stronger than that of the 2.7 MBq source.
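    The 3D back-projection described above relies on standard Compton kinematics to define a cone of possible source directions for each event. The following minimal Python sketch (illustrative only, not the authors' code; energies, units and names are assumed) computes the cone opening angle from the energy deposited in the scatterer and absorber:

        import math

        M_E_C2_KEV = 511.0  # electron rest energy in keV

        def compton_cone_angle(e_scatter_kev, e_absorb_kev):
            """Opening half-angle (rad) of the Compton cone for an event that deposits
            e_scatter_kev in the scatterer and e_absorb_kev in the absorber
            (full absorption of the scattered photon assumed)."""
            e_incident = e_scatter_kev + e_absorb_kev
            cos_theta = 1.0 - M_E_C2_KEV * (1.0 / e_absorb_kev - 1.0 / e_incident)
            if not -1.0 <= cos_theta <= 1.0:
                raise ValueError("kinematically inconsistent event")
            return math.acos(cos_theta)

        # Example: a 662 keV (137Cs) photon depositing 200 keV in the scatterer
        print(math.degrees(compton_cone_angle(200.0, 462.0)))  # about 48 degrees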

  7. CALIBRATION PROCEDURES IN MID FORMAT CAMERA SETUPS

    Directory of Open Access Journals (Sweden)

    F. Pivnicka

    2012-07-01

    camera can be applied. However, there is a misalignment (boresight angle) that must be evaluated by a photogrammetric process using advanced tools, e.g. in Bingo. Once all these parameters have been determined, the system is ready for projects without, or with only a few, ground control points. But what effect does the photogrammetric process have when the achieved direct orientation values are applied directly, compared with an AT based on proper tiepoint matching? The paper aims to show the steps to be taken by potential users and gives an estimate of how strongly the various calibration and adjustment steps influence the final quality.

  8. Analyzing Gait Using a Time-of-Flight Camera

    DEFF Research Database (Denmark)

    Jensen, Rasmus Ramsbøl; Paulsen, Rasmus Reinhold; Larsen, Rasmus

    2009-01-01

    An algorithm is created which performs human gait analysis using spatial data and amplitude images from a Time-of-Flight camera. For each frame in a sequence the camera supplies Cartesian coordinates in space for every pixel. By using an articulated model the subject pose is estimated in the depth map in each frame. The pose estimation is based on likelihood, contrast in the amplitude image, smoothness and a shape prior used to solve a Markov random field. Based on the pose estimates, and the prior that movement is locally smooth, a sequential model is created, and a gait analysis is done on this model. The output data are: speed, cadence (steps per minute), step length, stride length (a stride being two consecutive steps, also known as a gait cycle), and range of motion (angles of joints). The created system produces good output data of the described output parameters and requires no user
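    As an illustration of how the listed output parameters follow from the pose estimates, here is a hedged Python sketch (the heel-strike detection and data layout are assumptions; the paper derives these quantities from its fitted articulated model):

        import numpy as np

        def gait_parameters(heel_strike_times_s, heel_strike_positions_m):
            """Cadence, step/stride length and speed from already-detected heel
            strikes of alternating feet (times in s, 2-D or 3-D positions in m)."""
            t = np.asarray(heel_strike_times_s, dtype=float)
            p = np.asarray(heel_strike_positions_m, dtype=float)
            n_steps = len(t) - 1
            cadence = 60.0 * n_steps / (t[-1] - t[0])              # steps per minute
            step_len = np.linalg.norm(np.diff(p, axis=0), axis=1)  # one value per step
            stride_len = step_len[:-1] + step_len[1:]              # two consecutive steps
            speed = step_len.sum() / (t[-1] - t[0])
            return cadence, step_len.mean(), stride_len.mean(), speed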

  9. New nuclear medicine gamma camera systems

    International Nuclear Information System (INIS)

    Villacorta, Edmundo V.

    1997-01-01

    The acquisition of the Open E.CAM and DIACAM gamma cameras by Makati Medical Center is expected to enhance the capabilities of its nuclear medicine facilities. When used as an aid to diagnosis, nuclear medicine entails the introduction of a minute amount of radioactive material into the patient; thus, no reaction or side-effect is expected. When it reaches the particular target organ, depending on the radiopharmaceutical, a lesion will appear as a decreased (cold) or increased (hot) area in the radioactive distribution as recorded by the gamma cameras. Gamma camera images in slices, or SPECT (Single Photon Emission Computed Tomography), increase the sensitivity and accuracy in detecting smaller and deeply seated lesions, which otherwise may not be detected in the regular single planar images. Due to the 'open' design of the equipment, claustrophobic patients will no longer feel enclosed during the procedure. These new gamma cameras yield improved resolution and superb image quality, and the higher photon sensitivity shortens image acquisition time. The E.CAM, which is the latest generation gamma camera, features a variable-angle dual-head system, the only one available in the Philippines, and is an excellent choice for Myocardial Perfusion Imaging (MPI). From the usual 45 minutes, the acquisition time for gated SPECT imaging of the heart has now been remarkably reduced to 12 minutes. 'Gated' refers to snap-shots of the heart in selected phases of its contraction and relaxation as triggered by the ECG. The DIACAM is installed in a room with access outside the main entrance of the department, intended specially for bed-borne patients. Both systems are equipped with a network of high performance Macintosh ICON acquisition and processing computers. Added to the hardware is the ICON processing software which allows total simultaneous acquisition and processing capabilities in the same operator's terminal. Video film and color printers are also provided. Together

  10. A controllable viewing angle LCD with an optically isotropic liquid crystal

    International Nuclear Information System (INIS)

    Kim, Min Su; Lim, Young Jin; Yoon, Sukin; Kang, Shin-Woong; Lee, Seung Hee; Kim, Miyoung; Wu, Shin-Tson

    2010-01-01

    An optically isotropic liquid crystal (LC), such as a blue phase LC or an optically isotropic nano-structured LC, exhibits a very wide viewing angle because the induced birefringence is along the in-plane electric field. Utilizing such a material, we propose a liquid crystal display (LCD) whose viewing angle can be switched from wide view to narrow view using only one panel. In the device, each pixel is divided into two parts: a main pixel and a sub-pixel. The main pixels display the images while the sub-pixels control the viewing angle. In the main pixels, birefringence is induced by horizontal electric fields through inter-digital electrodes, leading to a wide viewing angle, while in the sub-pixels, birefringence is induced by the vertical electric field so that phase retardation occurs only at oblique angles. As a result, the dark state (or contrast ratio) of the entire pixel can be controlled by the voltage of the sub-pixels. Such a switchable viewing angle LCD is attractive for protecting personal privacy.

  11. Image quality testing of assembled IR camera modules

    Science.gov (United States)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8–12 μm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are increasingly becoming a mass market product. At the same time, steady improvements in sensor resolution in the higher priced markets raise the requirements on the imaging performance of objectives and on the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to more traditional test methods like the minimum resolvable temperature difference (MRTD), which gives only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF), for broadband or with various bandpass filters, on- and off-axis, and optical parameters such as effective focal length (EFL) and distortion. If the camera module allows refocusing of the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality and chief ray angle can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, last but not least, its suitability for fully automated measurements in mass production.

  12. A small-angle camera for resonant scattering experiments at the storage ring DORIS

    International Nuclear Information System (INIS)

    Stuhrmann, H.B.; Gabriel, A.

    1983-01-01

    Resonant small-angle scattering is measured routinely in the wavelength range of 0.6 to 3.25 Å with the instrument X15 at the storage ring DORIS. The monochromatic beam with a vertical offset of 1.22 m is achieved by a double monochromator system with a constant exit slit. The small-angle instrument allows for sample-detector distances between 0.37 and 7.33 m. A multiwire proportional counter with a sensitive area of 200 × 200 mm detects the scattered intensity with a spatial resolution of 2 × 2 mm. Its sensitivity can be adapted to the requirements of the experiment by activating a drift chamber of 8 cm depth at the back end of the detector. The performance of the instrument as a function of the wavelength is described. The energy resolution is about 1 eV at the L₃ absorption edge of caesium, as shown by the resonant scattering of ferritin in 30% CsCl solution. (Auth.)
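    For orientation, the conversion from a detector pixel position to momentum transfer q in such a pinhole small-angle geometry follows the standard relation q = (4π/λ)·sin(θ). The short Python sketch below is an added illustration using camera lengths and wavelengths of the order quoted above (it is not taken from the paper itself):

        import math

        def momentum_transfer(radial_offset_mm, camera_length_m, wavelength_angstrom):
            """q (in 1/Angstrom) for a pixel at radial_offset_mm from the beam centre,
            a sample-detector distance camera_length_m and a photon wavelength in Angstrom."""
            two_theta = math.atan(radial_offset_mm * 1e-3 / camera_length_m)
            return 4.0 * math.pi / wavelength_angstrom * math.sin(two_theta / 2.0)

        # Example: pixel 50 mm off-centre, 7.33 m camera length, 1.5 Angstrom photons
        print(momentum_transfer(50.0, 7.33, 1.5))  # roughly 0.029 1/Angstrom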

  13. Hydrodynamics of slug flow in a vertical narrow rectangular channel under laminar flow condition

    International Nuclear Information System (INIS)

    Wang, Yang; Yan, Changqi; Cao, Xiaxin; Sun, Licheng; Yan, Chaoxing; Tian, Qiwei

    2014-01-01

    Highlights: • Slug flow hydrodynamics in a vertical narrow rectangular duct were investigated. • The velocity of a trailing Taylor bubble undisturbed by the leading one was measured. • A correlation of Taylor bubble velocity with the length of the liquid slug ahead of it was proposed. • The evolution of the length distributions of Taylor bubbles and liquid slugs was measured. • The model for predicting length distributions was applied to the rectangular channel. - Abstract: The hydrodynamics of gas–liquid two-phase slug flow in a vertical narrow rectangular channel with a cross section of 2.2 mm × 43 mm is investigated using a high speed video camera system. Simultaneous measurements of the velocity and duration of Taylor bubbles and liquid slugs made it possible to determine the length distributions of the liquid slugs and Taylor bubbles. The Taylor bubble velocity depends on the length of the liquid slug ahead of it, and an empirical correlation is proposed based on the experimental data. The length distributions of Taylor bubbles and liquid slugs are positively skewed (log-normal distribution) at all measuring positions for all flow conditions. A model originally developed for circular tubes is adapted to predict the length distributions in the present narrow rectangular channel. In general, the experimental data are well predicted by the modified model

  14. Development of underwater camera using high-definition camera

    International Nuclear Information System (INIS)

    Tsuji, Kenji; Watanabe, Masato; Takashima, Masanobu; Kawamura, Shingo; Tanaka, Hiroyuki

    2012-01-01

    In order to reduce the time for core verification or visual inspection of BWR fuels, an underwater camera using a high-definition camera has been developed. As a result of this development, the underwater camera has 2 lights, dimensions of 370 × 400 × 328 mm, and a weight of 20.5 kg. Using the camera, about six spent-fuel IDs can be identified at a time from a distance of 1 to 1.5 m, and a 0.3 mm φ pinhole can be recognized at a distance of 1.5 m with 20× zoom. Noise caused by radiation at dose rates below 15 Gy/h does not affect the images. (author)

  15. Kinematics of a vertical axis wind turbine with a variable pitch angle

    Science.gov (United States)

    Jakubowski, Mateusz; Starosta, Roman; Fritzkowski, Pawel

    2018-01-01

    A computational model for the kinematics of a vertical axis wind turbine (VAWT) is presented. An H-type rotor turbine with a controlled pitch angle is considered. The aim of this solution is to improve the VAWT productivity. The discussed method belongs to a narrow computational branch based on the Blade Element Momentum theory (BEM theory). The paper can be regarded as a theoretical basis and an introduction to further studies applying BEM. The obtained torque values show the main advantage of using a variable pitch angle.

  16. A procedure for generating quantitative 3-D camera views of tokamak divertors

    International Nuclear Information System (INIS)

    Edmonds, P.H.; Medley, S.S.

    1996-05-01

    A procedure is described for precision modeling of the views for imaging diagnostics monitoring tokamak internal components, particularly high heat flux divertor components. These models are required to enable predictions of resolution and viewing angle for the available viewing locations. Because of the oblique views expected for slot divertors, fully 3-D perspective imaging is required. A suite of matched 3-D CAD, graphics and animation applications is used to provide a fast and flexible technique for reproducing these views. An analytic calculation of the resolution and viewing incidence angle is developed to validate the results of the modeling procedures. The calculation is applicable to any viewed surface describable with a coordinate array. The Tokamak Physics Experiment (TPX) diagnostics for infrared viewing are used as an example to demonstrate the implementation of the tools. For the TPX experiment the available locations are severely constrained by access limitations, and the resulting images are marginal in both resolution and viewing incidence angle. Full coverage of the divertor is possible if an array of cameras is installed at 45 degree toroidal intervals. Two poloidal locations are required in order to view both the upper and lower divertors. The procedures described here provide a complete design tool for in-vessel viewing, both for camera location and for identification of viewed surfaces. Additionally, these same tools can be used for the interpretation of the actual images obtained by the diagnostic
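    The analytic check of resolution and viewing incidence angle described above reduces to simple vector geometry once the camera position, a surface point and its outward normal are known in machine coordinates. A minimal sketch under those assumptions (not the paper's code; the names and the small-angle pixel-footprint estimate are illustrative):

        import numpy as np

        def incidence_angle_deg(camera_pos, surface_point, surface_normal):
            """Angle between the line of sight and the surface normal
            (0 deg = normal incidence, 90 deg = grazing view)."""
            los = np.asarray(camera_pos, float) - np.asarray(surface_point, float)
            n = np.asarray(surface_normal, float)
            cos_a = abs(np.dot(los, n)) / (np.linalg.norm(los) * np.linalg.norm(n))
            return np.degrees(np.arccos(np.clip(cos_a, 0.0, 1.0)))

        def pixel_footprint_m(distance_m, ifov_rad, incidence_deg):
            """Approximate size of one pixel projected onto an obliquely viewed surface."""
            return distance_m * ifov_rad / np.cos(np.radians(incidence_deg))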

  17. Prediction of Weld Residual Stress of Narrow Gap Welds

    International Nuclear Information System (INIS)

    Yang, Jun Seog; Huh, Nam Su

    2010-01-01

    The conventional welding technique, such as shielded metal arc welding, has mostly been applied to the piping systems of nuclear power plants. It is well known that this welding technique causes overheating and welding defects due to the large groove angle of the weld. On the other hand, the narrow gap welding (NGW) technique has many merits, for instance the reduction of welding time, smaller weld shrinkage and smaller deformation of the weld, owing to the small groove angle and welding bead width compared with conventional welds. These characteristics of NGW affect the deformation behavior and the distribution of welding residual stress of NGW, thus it is believed that residual stress results obtained from the conventional welding procedure may not be applicable to the structural integrity evaluation of NGW. In this paper, the welding residual stress of NGW was predicted using nonlinear finite element analysis to simulate the thermal and mechanical effects of the NGW. The present results can be used as important information to perform flaw evaluation and to improve the weld procedure of NGW

  18. Sub-Camera Calibration of a Penta-Camera

    Science.gov (United States)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras, consisting of a nadir and four inclined cameras, are becoming more and more popular, having the advantage of also imaging facades in built-up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi-platform photogrammetry, the inner orientation of the IGI Penta DigiCAM used has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed by the program system BLUH. With 4.1 million image points in 314 images and 3.9 million image points in 248 images, respectively, dense matching was provided by Pix4Dmapper. With up to 19 and 29 images per object point, respectively, the images are well connected; nevertheless, the high numbers of images per object point are concentrated in the block centres, while the inclined images outside the block centre are satisfactorily but not very strongly connected. This leads to very high values of the Student test (T-test) for the finally used additional parameters; in other words, the additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration of IGI, but there are still radial symmetric distortions also for the inclined cameras, with a size exceeding 5 μm, even though they were described as negligible based on the laboratory calibration. Radial and tangential effects at the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen, especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but they can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With the exception of the angular affinity, the systematic image errors for corresponding

  19. Relative camera localisation in non-overlapping camera networks using multiple trajectories

    NARCIS (Netherlands)

    John, V.; Englebienne, G.; Kröse, B.J.A.

    2012-01-01

    In this article we present an automatic camera calibration algorithm using multiple trajectories in a multiple camera network with non-overlapping field-of-views (FOV). Visible trajectories within a camera FOV are assumed to be measured with respect to the camera local co-ordinate system.

  20. NEW METHOD FOR THE CALIBRATION OF MULTI-CAMERA MOBILE MAPPING SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. P. Kersting

    2012-07-01

    Full Text Available Mobile Mapping Systems (MMS) allow for fast and cost-effective collection of geo-spatial information. Such systems integrate a set of imaging sensors and a position and orientation system (POS), which entails GPS and INS units. System calibration is a crucial process to ensure the attainment of the expected accuracy of such systems. It involves the calibration of the individual sensors as well as the calibration of the mounting parameters relating the system components. The mounting parameters of multi-camera MMS include two sets of relative orientation parameters (ROP): the lever arm offsets and the boresight angles relating the cameras and the IMU body frame, and the ROP among the cameras (in the absence of GPS/INS data). In this paper, a novel single-step calibration method, which has the ability of estimating these two sets of ROP, is devised. Besides the ability to estimate the ROP among the cameras, the proposed method can use such parameters as prior information in the ISO procedure. The implemented procedure consists of an integrated sensor orientation (ISO) where the GPS/INS-derived position and orientation and the system mounting parameters are directly incorporated in the collinearity equations. The concept of modified collinearity equations has been used by few authors for single-camera systems. In this paper, a new modification to the collinearity equations for GPS/INS-assisted multicamera systems is introduced. Experimental results using a real dataset demonstrate the feasibility of the proposed method.

  1. New Method for the Calibration of Multi-Camera Mobile Mapping Systems

    Science.gov (United States)

    Kersting, A. P.; Habib, A.; Rau, J.

    2012-07-01

    Mobile Mapping Systems (MMS) allow for fast and cost-effective collection of geo-spatial information. Such systems integrate a set of imaging sensors and a position and orientation system (POS), which entails GPS and INS units. System calibration is a crucial process to ensure the attainment of the expected accuracy of such systems. It involves the calibration of the individual sensors as well as the calibration of the mounting parameters relating the system components. The mounting parameters of multi-camera MMS include two sets of relative orientation parameters (ROP): the lever arm offsets and the boresight angles relating the cameras and the IMU body frame and the ROP among the cameras (in the absence of GPS/INS data). In this paper, a novel single-step calibration method, which has the ability of estimating these two sets of ROP, is devised. Besides the ability to estimate the ROP among the cameras, the proposed method can use such parameters as prior information in the ISO procedure. The implemented procedure consists of an integrated sensor orientation (ISO) where the GPS/INS-derived position and orientation and the system mounting parameters are directly incorporated in the collinearity equations. The concept of modified collinearity equations has been used by few authors for single-camera systems. In this paper, a new modification to the collinearity equations for GPS/INS-assisted multicamera systems is introduced. Experimental results using a real dataset demonstrate the feasibility of the proposed method.
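    For orientation, a commonly used form of the GPS/INS-assisted point-positioning model from which such collinearity equations are derived is shown below (the notation and exact form are assumptions for illustration; the paper's multi-camera modification extends a model of this kind):

        r_I^m = r_b^m(t) + R_b^m(t)\left( r_c^b + \lambda_i\, R_c^b\, r_i^c \right)

    where r_b^m(t) and R_b^m(t) are the GPS/INS-derived position and orientation of the IMU body frame, r_c^b and R_c^b are the lever-arm offset and boresight rotation of camera c relative to the body frame, r_i^c is the image vector of point i in the camera frame, and λ_i is its unknown scale factor.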

  2. Addressing challenges of modulation transfer function measurement with fisheye lens cameras

    Science.gov (United States)

    Deegan, Brian M.; Denny, Patrick E.; Zlokolica, Vladimir; Dever, Barry; Russell, Laura

    2015-03-01

    Modulation transfer function (MTF) is a well defined and accepted method of measuring image sharpness. The slanted edge test, as defined in ISO 12233, is a standard method of calculating MTF and is widely used for lens alignment and auto-focus algorithm verification. However, there are a number of challenges which should be considered when measuring MTF in cameras with fisheye lenses. Due to trade-offs related to Petzval curvature, planarity of the image plane is difficult to achieve in fisheye lenses. It is therefore critical to have the ability to accurately measure sharpness throughout the entire image, particularly for lens alignment. One challenge for fisheye lenses is that, because of the radial distortion, the slanted edges will have different angles depending on the location within the image and on the distortion profile of the lens. Previous work in the literature indicates that MTF measurements are robust for edge angles between 2 and 10 degrees. Outside of this range, MTF measurements become unreliable. Also, the slanted edge itself will be curved by the lens distortion, causing further measurement problems. This study summarises the difficulties in the use of MTF for sharpness measurement in fisheye lens cameras, and proposes mitigations and alternative methods.
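    For context, the core of a slanted-edge style MTF computation is the chain edge-spread function → line-spread function → Fourier transform. The simplified Python sketch below (an added illustration, not the authors' method) omits the sub-pixel projection and binning along the estimated edge angle that ISO 12233 prescribes, which is precisely the step that distortion-induced edge curvature complicates:

        import numpy as np

        def mtf_from_esf(esf):
            """Simplified MTF estimate from a 1-D edge-spread function sampled once per
            pixel: differentiate to the line-spread function, window, FFT, normalise."""
            lsf = np.gradient(np.asarray(esf, dtype=float))
            lsf *= np.hanning(len(lsf))            # suppress truncation ripple
            mtf = np.abs(np.fft.rfft(lsf))
            return mtf / mtf[0]                    # 1.0 at zero spatial frequency

        # Matching spatial frequencies (cycles/pixel):
        # freqs = np.fft.rfftfreq(len(esf), d=1.0)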

  3. Control of Pan-tilt Mechanism Angle using Position Matrix Method

    Directory of Open Access Journals (Sweden)

    Hendri Maja Saputra

    2013-12-01

    Full Text Available Control of the Pan-Tilt Mechanism (PTM) angle for the bomb disposal robot Morolipi-V2 using an inertial measurement unit, the x-IMU, has been carried out. The PTM has to be actively controllable, both manually and automatically, in order to correct the orientation of the moving Morolipi-V2 platform. The x-IMU detects the platform orientation and sends the result in order to automatically control the PTM. The orientation is calculated using quaternions combined with the Madgwick and Mahony filter methods. The orientation data, which consist of the roll (α), pitch (β), and yaw (γ) angles from the x-IMU, are then sent to the camera for controlling the PTM motion (pan and tilt angles) after calculating the reverse angle using the position matrix method. Experimental results using the Madgwick and Mahony methods show that the x-IMU can be used to find the robot platform orientation. Acceleration data from the accelerometer and flux from the magnetometer produce noise with standard deviations of 0.015 g and 0.006 G, respectively. The maximum absolute errors caused by the Madgwick and Mahony methods with respect to the X-axis are 48.45° and 33.91°, respectively. The implementation of the x-IMU as an inertial sensor to control the pan-tilt mechanism shows a good result, in which the pan angle tends to follow the yaw angle and the tilt angle equals the pitch angle, except for a very small angle shift due to the influence of the roll angle.
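    The roll/pitch/yaw angles mentioned above follow from the estimated quaternion by a standard conversion; a minimal Python sketch is given below (the aerospace ZYX sequence with a unit quaternion (w, x, y, z) is assumed here and may differ from the exact convention used in the paper):

        import math

        def quaternion_to_euler(w, x, y, z):
            """Unit quaternion -> (roll, pitch, yaw) in radians, ZYX convention."""
            roll  = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
            pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
            yaw   = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
            return roll, pitch, yaw

        # Example: the identity quaternion gives all angles equal to zero
        print(quaternion_to_euler(1.0, 0.0, 0.0, 0.0))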

  4. Quantification of Finger-Tapping Angle Based on Wearable Sensors.

    Science.gov (United States)

    Djurić-Jovičić, Milica; Jovičić, Nenad S; Roby-Brami, Agnes; Popović, Mirjana B; Kostić, Vladimir S; Djordjević, Antonije R

    2017-01-25

    We propose a novel, simple method for quantitative and qualitative finger-tapping assessment based on miniature inertial sensors (3D gyroscopes) placed on the thumb and index finger. We propose a simplified description of the finger tapping by using a single angle, describing rotation around a dominant axis. The method was verified on twelve subjects, who performed various tapping tasks, mimicking impaired patterns. The obtained tapping angles were compared with results from a motion capture camera system, demonstrating excellent accuracy. The root-mean-square (RMS) error between the two sets of data is, on average, below 4°, and the intraclass correlation coefficient is, on average, greater than 0.972. Data obtained by the proposed method may be used together with scores from clinical tests to enable better diagnostics. Along with hardware simplicity, this makes the proposed method a promising candidate for use in clinical practice. Furthermore, our definition of the tapping angle can be applied to all tapping assessment systems.
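    Reducing the motion to a single tapping angle amounts to integrating the angular rate about the dominant rotation axis. The Python sketch below is an illustrative reconstruction under that reading (the axis-selection heuristic and sampling-rate handling are assumptions, not the authors' exact pipeline):

        import numpy as np

        def tapping_angle_deg(gyro_rad_s, sample_rate_hz):
            """Tapping angle over time (deg) from an N x 3 array of gyro rates (rad/s).
            The dominant axis is taken as the principal direction of the rate signal."""
            g = np.asarray(gyro_rad_s, dtype=float)
            _, _, vh = np.linalg.svd(g - g.mean(axis=0), full_matrices=False)
            rate_about_axis = g @ vh[0]                      # project onto dominant axis
            return np.degrees(np.cumsum(rate_about_axis) / sample_rate_hz)

        def rms_error(measured, reference):
            d = np.asarray(measured, float) - np.asarray(reference, float)
            return float(np.sqrt(np.mean(d ** 2)))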

  5. TIFR Near Infrared Imaging Camera-II on the 3.6 m Devasthal Optical Telescope

    Science.gov (United States)

    Baug, T.; Ojha, D. K.; Ghosh, S. K.; Sharma, S.; Pandey, A. K.; Kumar, Brijesh; Ghosh, Arpan; Ninan, J. P.; Naik, M. B.; D’Costa, S. L. A.; Poojary, S. S.; Sandimani, P. R.; Shah, H.; Krishna Reddy, B.; Pandey, S. B.; Chand, H.

    Tata Institute of Fundamental Research (TIFR) Near Infrared Imaging Camera-II (TIRCAM2) is a closed-cycle helium cryo-cooled imaging camera equipped with a Raytheon 512×512 pixel InSb Aladdin III Quadrant focal plane array (FPA) having sensitivity to photons in the 1–5 μm wavelength band. In this paper, we present the performance of the camera on the newly installed 3.6 m Devasthal Optical Telescope (DOT) based on the calibration observations carried out during 2017 May 11–14 and 2017 October 7–31. After the preliminary characterization, the camera has been released to the Indian and Belgian astronomical community for science observations since 2017 May. The camera offers a field-of-view (FoV) of ~86.5″ × 86.5″ on the DOT with a pixel scale of 0.169″. The seeing at the telescope site in the near-infrared (NIR) bands is typically sub-arcsecond, with the best seeing of ~0.45″ realized in the NIR K-band on 2017 October 16. The camera is found to be capable of deep observations in the J, H and K bands comparable to other 4 m class telescopes available world-wide. Another highlight of this camera is the observational capability for sources up to Wide-field Infrared Survey Explorer (WISE) W1-band (3.4 μm) magnitudes of 9.2 in the narrow L-band (nbL; λ_cen ≈ 3.59 μm). Hence, the camera could be a good complementary instrument to observe the bright nbL-band sources that are saturated in the Spitzer Infrared Array Camera (IRAC) ([3.6] ≲ 7.92 mag) and the WISE W1-band ([3.4] ≲ 8.1 mag). Sources with strong polycyclic aromatic hydrocarbon (PAH) emission at 3.3 μm are also detected. Details of the observations and estimated parameters are presented in this paper.

  6. Wide field and diffraction limited array camera for SIRTF

    International Nuclear Information System (INIS)

    Fazio, G.G.; Koch, D.G.; Melnick, G.J.

    1986-01-01

    The Infrared Array Camera for the Space Infrared Telescope Facility (SIRTF/IRAC) is capable of two-dimensional photometry in either a wide field or diffraction-limited mode over the wavelength interval from 2 to 30 microns. Three different two-dimensional direct readout (DRO) array detectors are being considered: Band 1-InSb or Si:In (2-5 microns) 128 x 128 pixels, Band 2-Si:Ga (5-18 microns) 64 x 64 pixels, and Band 3-Si:Sb (18-30 microns) 64 x 64 pixels. The hybrid DRO readout architecture has the advantages of low read noise, random pixel access with individual readout rates, and nondestructive readout. The scientific goals of IRAC are discussed, which are the basis for several important requirements and capabilities of the array camera: (1) diffraction-limited resolution from 2-30 microns, (2) use of the maximum unvignetted field of view of SIRTF, (3) simultaneous observations within the three infrared spectral bands, and (4) the capability for broad and narrow bandwidth spectral resolution. A strategy has been developed to minimize the total electronic and environmental noise sources to satisfy the scientific requirements. 7 references

  7. [New directions in the hypotensive therapy of open-angle glaucoma (experimental and clinical research)].

    Science.gov (United States)

    Bunin, A Ia; Ermakov, V N; Filina, A A

    1993-01-01

    Clinical use of eye drops of the hybrid beta-alpha-adrenoblocker OF-4680 to reduce intraocular pressure has shown a high efficacy of the drug, not inferior to timolol, for local hypotensive therapy of open-angle glaucoma. A combination of timolol with taurine helped reduce the inhibiting effect of the beta-blocker on aqueous humor secretion and simultaneously enhanced its outflow. The results show the desirability of correcting the glutathione deficiency detected in patients with narrow-angle glaucoma by means of lipoic acid.

  8. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System

    Directory of Open Access Journals (Sweden)

    Yu Lu

    2016-04-01

    Full Text Available A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over a 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensor during the radiometric response calibration, in order to eliminate the focusing effect on the uniform light from an integrating sphere. The linearity range of the radiometric response, the non-linearity response characteristics, the sensitivity, and the dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved using the sensor calibration results and is used to blend images so that the panoramas reflect the scene luminance more faithfully. This compensates for the limitation of stitching methods that make images look realistic only through smoothing. The dynamic range limitation can be resolved by using multiple cameras that cover a large field of view instead of a single image sensor with a wide-angle lens; the dynamic range is expanded 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second.
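    The calibration steps described (response linearisation, dark current and vignetting) end up as a per-pixel correction of roughly the following form; this is a generic Python sketch with assumed inputs, not the paper's procedure, and it presumes the raw counts have already been linearised with the measured response function:

        import numpy as np

        def flat_field_correct(raw, dark, flat):
            """Dark-current and vignetting correction of one linearised frame.
            raw, dark, flat: 2-D arrays of sensor counts (flat = image of a uniform field)."""
            signal = raw.astype(float) - dark
            gain = np.clip(flat.astype(float) - dark, 1e-6, None)
            return signal * gain.mean() / gain     # proportional to scene luminance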

  9. II-VI Narrow-Bandgap Semiconductors for Optoelectronics

    Science.gov (United States)

    Baker, Ian

    The field of narrow-gap II-VI materials is dominated by the compound semiconductor mercury cadmium telluride (Hg₁₋ₓCdₓTe, or MCT), which supports a large industry in infrared detectors, cameras and infrared systems. It is probably true to say that HgCdTe is the third most studied semiconductor after silicon and gallium arsenide. Hg₁₋ₓCdₓTe is the material most widely used in high-performance infrared detectors at present. By changing the composition x, the spectral response of the detector can be made to cover the range from 1 μm to beyond 17 μm. The advantages of this system arise from a number of features, notably: close lattice matching, high optical absorption coefficient, low carrier generation rate, high electron mobility and readily available doping techniques. These advantages mean that very sensitive infrared detectors can be produced at relatively high operating temperatures. Hg₁₋ₓCdₓTe multilayers can be readily grown in vapor-phase epitaxial processes. This provides the device engineer with complex doping and composition profiles that can be used to further enhance the electro-optic performance, leading to low-cost, large-area detectors in the future. The main purpose of this chapter is to describe the applications, device physics and technology of II-VI narrow-bandgap devices, focusing on HgCdTe but also including Hg₁₋ₓMnₓTe and Hg₁₋ₓZnₓTe. It concludes with a review of the research and development programs into third-generation infrared detector technology (so-called GEN III detectors) being performed in centers around the world.

  10. Development of a stiffness-angle law for simplifying the measurement of human hair stiffness.

    Science.gov (United States)

    Jung, I K; Park, S C; Lee, Y R; Bin, S A; Hong, Y D; Eun, D; Lee, J H; Roh, Y S; Kim, B M

    2018-04-01

    This research examines the effect of caffeine absorption on hair stiffness. To test hair stiffness, we have developed an evaluation method that is not only accurate, but also inexpensive. Our evaluation method for measuring hair stiffness culminated in a model, called the Stiffness-Angle Law, which describes the elastic properties of hair and can be widely applied to the development of hair care products. Small molecules (≤500 g mol⁻¹) such as caffeine can be absorbed into hair. A common shampoo containing 4% caffeine was formulated and applied to hair 10 times, after which the hair stiffness was measured. The caffeine absorption of the treated hair was observed using Fourier-transform infrared spectroscopy (FTIR) with a focal plane array (FPA) detector. Our evaluation method for measuring hair stiffness consists of a regular camera and a support for single strands of hair. After attaching the hair to the support, the bending angle of the hair was observed with the camera and measured. Then, the hair strand was weighed. The stiffness of the hair was calculated based on our proposed Stiffness-Angle Law using three variables: the angle, the weight of the hair and the distance the hair was pulled across the support. The caffeine absorption was confirmed by FTIR analysis. The concentration of amide bonds in the hair clearly increased due to caffeine absorption. After caffeine was absorbed into the hair, the bending angle and weight of the hair changed. Applying these measured changes to the Stiffness-Angle Law, it was confirmed that the hair stiffness increased by 13.2% due to caffeine absorption. The theoretical results using the Stiffness-Angle Law agree with the visual examinations of hair exposed to caffeine and also with the known results of hair stiffness from a previous report. Our evaluation method, combined with our proposed Stiffness-Angle Law, effectively provides an accurate and inexpensive technique for measuring the bending stiffness of human hair. © 2018

  11. Development of a compact scintillator-based high-resolution Compton camera for molecular imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kishimoto, A., E-mail: daphne3h-aya@ruri.waseda.jp [Research Institute for Science and Engineering, Waseda University, 3-4-1 Ohkubo, Shinjuku, Tokyo (Japan); Kataoka, J.; Koide, A.; Sueoka, K.; Iwamoto, Y.; Taya, T. [Research Institute for Science and Engineering, Waseda University, 3-4-1 Ohkubo, Shinjuku, Tokyo (Japan); Ohsuka, S. [Central Research Laboratory, Hamamatsu Photonics K.K., 5000 Hirakuchi, Hamakita-ku, Hamamatsu, Shizuoka (Japan)

    2017-02-11

    The Compton camera, which images the gamma-ray distribution by utilizing the kinematics of Compton scattering, is a promising detector capable of imaging across a wide range of energies. In this study, we aim to construct a small-animal molecular imaging system covering a wide energy range by using the Compton camera. We developed a compact medical Compton camera based on a Ce-doped Gd₃Al₂Ga₃O₁₂ (Ce:GAGG) scintillator and a multi-pixel photon counter (MPPC). Basic performance tests confirmed that, for 662 keV, the typical energy resolution was 7.4% (FWHM) and the angular resolution was 4.5° (FWHM). We then used the medical Compton camera to conduct imaging experiments based on a 3-D image reconstruction algorithm using the multi-angle data acquisition method. The results confirmed that, for a ¹³⁷Cs point source at a distance of 4 cm, the image had a spatial resolution of 3.1 mm (FWHM). Furthermore, we succeeded in producing a 3-D multi-color image of simultaneous sources of different energies (²²Na [511 keV], ¹³⁷Cs [662 keV], and ⁵⁴Mn [834 keV]).

  12. SU-F-J-206: Systematic Evaluation of the Minimum Detectable Shift Using a Range- Finding Camera

    Energy Technology Data Exchange (ETDEWEB)

    Platt, M; Platt, M [College of Medicine University of Cincinnati, Cincinnati, OH (United States); Lamba, M [University of Cincinnati, Cincinnati, OH (United States); Mascia, A [University of Cincinnati Medical Center, Cincinnati, OH (United States); Huang, K [UC Health Barret Cancer Center, Cincinnati, OH (United States)

    2016-06-15

    Purpose: The robotic table used for patient alignment in proton therapy is calibrated only at commissioning under well-defined conditions and table shifts may vary over time and with differing conditions. The purpose of this study is to systematically investigate minimum detectable shifts using a time-of-flight (TOF) range-finding camera for table position feedback. Methods: A TOF camera was used to acquire one hundred 424 × 512 range images from a flat surface before and after known shifts. Range was assigned by averaging central regions of the image across multiple images. Depth resolution was determined by evaluating the difference between the actual shift of the surface and the measured shift. Depth resolution was evaluated for number of images averaged, area of sensor over which depth was averaged, distance from camera to surface, central versus peripheral image regions, and angle of surface relative to camera. Results: For one to one thousand images with a shift of one millimeter the range in error was 0.852 ± 0.27 mm to 0.004 ± 0.01 mm (95% C.I.). For varying regions of the camera sensor the range in error was 0.02 ± 0.05 mm to 0.47 ± 0.04 mm. The following results are for 10 image averages. For areas ranging from one pixel to 9 × 9 pixels the range in error was 0.15 ± 0.09 to 0.29 ± 0.15 mm (1σ). For distances ranging from two to four meters the range in error was 0.15 ± 0.09 to 0.28 ± 0.15 mm. For an angle of incidence between thirty degrees and ninety degrees the average range in error was 0.11 ± 0.08 to 0.17 ± 0.09 mm. Conclusion: It is feasible to use a TOF camera for measuring shifts in flat surfaces under clinically relevant conditions with submillimeter precision.
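    The strong dependence of the error on the number of averaged images is what one would expect from uncorrelated per-frame range noise, whose contribution shrinks roughly as 1/sqrt(N). A small Python simulation sketch illustrating this behaviour (the per-frame noise level is a placeholder, not the measured camera specification):

        import numpy as np

        rng = np.random.default_rng(0)
        true_shift_mm = 1.0
        sigma_single_mm = 1.0          # assumed per-frame range noise
        for n_frames in (1, 10, 100, 1000):
            # 5000 trials: average n_frames noisy readings of the shifted surface
            trials = rng.normal(true_shift_mm, sigma_single_mm,
                                size=(5000, n_frames)).mean(axis=1)
            print(n_frames, round(float(trials.std()), 3))   # ~ sigma / sqrt(n_frames)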

  13. Narrow linewidth operation of the RILIS titanium: Sapphire laser at ISOLDE/CERN

    CERN Document Server

    Rothe, S; Wendt, K D A; Fedosseev, V N; Kron, T; Marsh, B A

    2013-01-01

    A narrow linewidth operating mode for the Ti:sapphire laser of the CERN ISOLDE Resonance Ionization Laser Ion Source (RILIS) has been developed. This satisfies the laser requirements for the programme of in-source resonance ionization spectroscopy measurements and improves the selectivity for isomer separation using RILIS. A linewidth reduction from typically 10 GHz down to 1 GHz was achieved by the intra-cavity insertion of a second (thick) Fabry-Perot etalon. Reliable operation during a laser scan was achieved through motorized control of the tilt angle of each etalon. A scanning, stabilization and mode cleaning procedure was developed and implemented in LabVIEW. The narrow linewidth operation was confirmed in a high resolution spectroscopy study of francium isotopes by the Collinear Resonance Ionization Spectroscopy experiment. The resulting laser scans demonstrate the suitability of the laser, in terms of linewidth, spectral purity and stability for high resolution in-source spectroscopy and isomer select...

  14. Mobile phone camera benchmarking: combination of camera speed and image quality

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2014-01-01

    When a mobile phone camera is tested and benchmarked, the significance of quality metrics is widely acknowledged. There are also existing methods to evaluate camera speed. For example, ISO 15781 defines several measurements to evaluate various camera system delays. However, the speed or rapidity metrics of a mobile phone's camera system have not been combined with the quality metrics, even though camera speed has become an increasingly important camera performance feature. There are several tasks in this work. Firstly, the most important image quality metrics are collected from the standards and papers. Secondly, the speed-related metrics of a mobile phone's camera system are collected from the standards and papers, and novel speed metrics are also identified. Thirdly, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are made through the application programming interfaces of the different operating systems. Finally, the results are evaluated and conclusions are made. The result of this work gives detailed benchmarking results of mobile phone camera systems on the market. The paper also proposes combined benchmarking metrics, which include both quality and speed parameters.

  15. Effect of EMIC Wave Normal Angle Distribution on Relativistic Electron Scattering in Outer RB

    Science.gov (United States)

    Khazanov, G. V.; Gamayunov, K. V.

    2007-01-01

    We present the equatorial and bounce average pitch angle diffusion coefficients for scattering of relativistic electrons by the H+ mode of EMIC waves. Both the model (prescribed) and self consistent distributions over the wave normal angle are considered. The main results of our calculation can be summarized as follows: First, in comparison with field aligned waves, the intermediate and highly oblique waves reduce the pitch angle range subject to diffusion, and strongly suppress the scattering rate for low energy electrons (E less than 2 MeV). Second, for electron energies greater than 5 MeV, the |n| = 1 resonances operate only in a narrow region at large pitch-angles, and despite their greatest contribution in case of field aligned waves, cannot cause electron diffusion into the loss cone. For those energies, oblique waves at |n| greater than 1 resonances are more effective, extending the range of pitch angle diffusion down to the loss cone boundary, and increasing diffusion at small pitch angles by orders of magnitude.

  16. Comparative evaluation of consumer grade cameras and mobile phone cameras for close range photogrammetry

    Science.gov (United States)

    Chikatsu, Hirofumi; Takahashi, Yoji

    2009-08-01

    The authors have been concentrating on developing convenient 3D measurement methods using consumer grade digital cameras, and it was concluded that consumer grade digital cameras are expected to become useful photogrammetric devices for various close range application fields. On the other hand, mobile phone cameras with 10 megapixels have appeared on the market in Japan. In these circumstances, we are faced with the epoch-making question of whether mobile phone cameras are able to take the place of consumer grade digital cameras in close range photogrammetric applications. In order to evaluate the potential of mobile phone cameras in close range photogrammetry, a comparative evaluation between mobile phone cameras and consumer grade digital cameras is carried out in this paper with respect to lens distortion, reliability, stability and robustness. Calibration tests for 16 mobile phone cameras and 50 consumer grade digital cameras were conducted indoors using a test target. Furthermore, the practicability of mobile phone cameras for close range photogrammetry was evaluated outdoors. This paper shows that mobile phone cameras have the ability to take the place of consumer grade digital cameras, and to expand the market in digital photogrammetric fields.

  17. A simulation of orientation dependent, global changes in camera sensitivity in ECT

    International Nuclear Information System (INIS)

    Bieszk, J.A.; Hawman, E.G.; Malmin, R.E.

    1984-01-01

    ECT promises the abilities to: 1) observe radioisotope distributions in a patient without the summation of overlying activity that reduces contrast, and 2) measure these distributions quantitatively to further and more accurately assess organ function. Ideally, camera-based ECT systems should have a performance that is independent of camera orientation or gantry angle. This study is concerned with ECT quantitation errors that can arise from angle-dependent variations of camera sensitivity. Using simulated phantoms representative of heart and liver sections, the effects of sensitivity changes on reconstructed images were assessed both visually and quantitatively based on ROI sums. The sinogram for each test image was simulated with 128 linear samples and 180 angular views. The global orientation-dependent sensitivity was modelled by applying an angular sensitivity dependence to the sinograms of the test images. Four sensitivity variations were studied: amplitudes of 0% (as a reference), 5%, 10%, and 25% with a cos θ dependence, as well as a cos 2θ dependence with a 5% amplitude. Simulations were done with and without Poisson noise to: 1) determine trends in the quantitative effects as a function of the magnitude of the variation, and 2) see how these effects are manifested in studies having statistics comparable to clinical cases. For the most realistic sensitivity variation (cos θ, 5% amplitude), the ROIs chosen in the present work indicated changes of <0.5% in the noiseless case and <5% for the case with Poisson noise. The effects of statistics appear to dominate any effects due to global, sinusoidal, orientation-dependent sensitivity changes in the cases studied

  18. Framing-camera tube developed for sub-100-ps range

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    A new framing-camera tube, developed by Electronics Engineering, is capable of recording two-dimensional image frames with high spatial resolution in the sub-100-ps range. Framing is performed by streaking a two-dimensional electron image across narrow slits; the resulting electron-line images from the slits are restored into a framed image by a restorer deflector operating synchronously with the dissector deflector. We have demonstrated its performance in a prototype tube by recording 125-ps-duration framed images of 2.5-mm patterns. The limitation in the framing speed is in the external electronic drivers for the deflectors and not in the tube design characteristics. Shorter frame durations (below 100 ps) can be obtained by use of faster deflection drivers

  19. Proposal of a Budget-Friendly Camera Holder for Endoscopic Ear Surgery.

    Science.gov (United States)

    Ozturan, Orhan; Yenigun, Alper; Aksoy, Fadlullah; Ertas, Burak

    2018-01-01

    Endoscopic ear surgery (EES) is increasingly the preferred technique in the otologic community. It offers excellent visualization of anatomical structures, both directly and around corners with variable-angle telescopes. It also provides reduced operative morbidity, because surgical interventions can be performed with less invasive approaches. Operative preparation and setup time and the cost of an endoscopy system are lower than for surgical microscopes. On the other hand, the main disadvantage of EES is that the surgery has to be performed with one single hand. This is certainly restrictive for an ear surgeon who has been operating with two hands under otologic microscopic views for years, and it requires a learning period and perseverance. Having the endoscope held by a second surgeon is not feasible because of insufficient surgical space. Endoscope/camera holders have been developed for those who need the comfort and convenience afforded by double-handed microscopic ear surgery. An ideal endoscope holder should be easy to set up, easily controlled, provide a variety of angled views, allow the surgeon to operate with two hands and be budget-friendly. In this article, a commercially available 11-inch magic arm camera holder is proposed by the authors for use in EES due to its versatile, convenient, and budget-friendly features. It allows two-handed EES with existing technology and is affordable for surgeons looking for a low-cost and practical solution.

  20. Gamma camera

    International Nuclear Information System (INIS)

    Tschunt, E.; Platz, W.; Baer, U.; Heinz, L.

    1978-01-01

    A gamma camera has a plurality of exchangeable collimators, one of which is mounted in the ray inlet opening of the camera, while the others are placed on separate supports. The supports are swingably mounted upon a column, one above the other, through about 90° to a collimator exchange position. Each of the separate supports is swingable to a vertically aligned position, with limiting of the swinging movement and positioning of the support at the desired exchange position. The collimators are carried on the supports by means of a series of vertically disposed coil springs. Projections on the camera are movable from above into grooves of the collimator at the exchange position, whereupon the collimator is turned so that it is securely prevented from falling out of the camera head

  1. Retrieval of Garstang's emission function from all-sky camera images

    Science.gov (United States)

    Kocifaj, Miroslav; Solano Lamphar, Héctor Antonio; Kundracik, František

    2015-10-01

    The emission function of ground-based light sources predetermines the skyglow features to a large extent, and most mathematical models used to predict the night sky brightness require information on this function. The radiant intensity distribution on a clear sky is experimentally determined as a function of zenith angle using the theoretical approach published only recently in MNRAS, 439, 3405-3413. We have made the experiments in two localities in Slovakia and Mexico by means of two digital single lens reflex professional cameras operating with different lenses that limit the system's field-of-view to either 180° or 167°. The purpose of using two cameras was to identify variances between the two different apertures. Images are taken at different distances from an artificial light source (a city) with the intention of determining the ratio of zenith radiance relative to horizontal irradiance. Subsequently, the information on the fraction of the light radiated directly into the upward hemisphere (F) is extracted. The results show that inexpensive devices can properly identify the upward emissions with adequate reliability as long as the clear sky radiance distribution is dominated by a single largest ground-based light source. Highly unstable turbidity conditions can also make the parameter F difficult or even impossible to retrieve. Measurements at low elevation angles should be avoided due to a potentially parasitic effect of direct light emissions from luminaires surrounding the measuring site.

  2. Measuring high-resolution sky luminance distributions with a CCD camera.

    Science.gov (United States)

    Tohsing, Korntip; Schrempf, Michael; Riechelmann, Stefan; Schilke, Holger; Seckmeyer, Gunther

    2013-03-10

    We describe how sky luminance can be derived from a newly developed hemispherical sky imager (HSI) system. The system contains a commercial compact charge coupled device (CCD) camera equipped with a fish-eye lens. The projection of the camera system has been found to be nearly equidistant. The luminance from the high dynamic range images has been calculated and then validated with luminance data measured by a CCD array spectroradiometer. The deviation between both datasets is less than 10% for cloudless and completely overcast skies, and differs by no more than 20% for all sky conditions. The global illuminance derived from the HSI pictures deviates by less than 5% and 20% under cloudless and cloudy skies, respectively, for solar zenith angles less than 80°. This system is therefore capable of measuring sky luminance with a high spatial resolution of more than a million pixels and a temporal resolution of 20 s.
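    The record above reports luminance derived from calibrated HDR images. As a minimal illustrative sketch (not the HSI pipeline itself), relative luminance can be computed from a linear HDR image with the Rec. 709 weights and scaled to cd/m² using one reference reading from a spectroradiometer; the patch location and reference value below are assumptions.

```python
import numpy as np

# Rec. 709 luminance weights; assumes the HDR image is already linear,
# i.e. the camera response has been removed during HDR assembly.
REC709 = np.array([0.2126, 0.7152, 0.0722])

def relative_luminance(hdr_rgb):
    """Per-pixel relative luminance of a linear HDR image (H x W x 3)."""
    return hdr_rgb @ REC709

def calibration_factor(hdr_rgb, reference_cd_m2, patch):
    """Scale factor mapping relative luminance to cd/m^2, using one patch
    whose absolute luminance was measured with a spectroradiometer."""
    return reference_cd_m2 / relative_luminance(hdr_rgb)[patch].mean()

# Usage with hypothetical data:
# k = calibration_factor(hdr, 3500.0, (slice(100, 120), slice(200, 220)))
# sky_luminance = k * relative_luminance(hdr)   # cd/m^2 per pixel
```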

  3. Enlarging the angle of view in Michelson-interferometer-based shearography by embedding a 4f system.

    Science.gov (United States)

    Wu, Sijin; He, Xiaoyuan; Yang, Lianxiang

    2011-07-20

    Digital shearography based on Michelson interferometers suffers from the disadvantage of a small angle of view due to its structure. We demonstrate a novel digital shearography system with a large angle of view. In the optical arrangement, the imaging lens is placed in front of the Michelson interferometer rather than behind it as in traditional digital shearography. Thus, the angle of view is no longer limited by the Michelson interferometer. Image transmission between the separated lens and the camera is accomplished by a 4f system in the new style of shearography. The influences of the 4f system on shearography are also discussed. © 2011 Optical Society of America

  4. Io Pele plume

    Science.gov (United States)

    2000-01-01

    Voyager 1 took this narrow-angle camera image on 5 March 1979 from a distance of 450,000 kilometers. At this geometry, the camera looks straight down through a volcanic plume at one of Io's most active volcanos, Pele. The large heart-shaped feature is the region where Pele's plume falls to the surface. At the center of the 'heart' is the small dark fissure that is the source of the eruption. The Voyager Project is managed by the Jet Propulsion Laboratory for NASA's Office of Space Science.

  5. Integration of multispectral face recognition and multi-PTZ camera automated surveillance for security applications

    Science.gov (United States)

    Chen, Chung-Hao; Yao, Yi; Chang, Hong; Koschan, Andreas; Abidi, Mongi

    2013-06-01

    Due to increasing security concerns, a complete security system should consist of two major components, a computer-based face-recognition system and a real-time automated video surveillance system. A computer-based face-recognition system can be used in gate access control for identity authentication. In recent studies, multispectral imaging and fusion of multispectral narrow-band images in the visible spectrum have been employed and proven to enhance the recognition performance over conventional broad-band images, especially when the illumination changes. Thus, we present an automated method that specifies the optimal spectral ranges under the given illumination. Experimental results verify the consistent performance of our algorithm via the observation that an identical set of spectral band images is selected under all tested conditions. Our discovery can be practically used for a new customized sensor design associated with given illuminations for an improved face recognition performance over conventional broad-band images. In addition, once a person is authorized to enter a restricted area, we still need to continuously monitor his/her activities for the sake of security. Because pan-tilt-zoom (PTZ) cameras are capable of covering a panoramic area and maintaining high resolution imagery for real-time behavior understanding, research on automated surveillance systems with multiple PTZ cameras has become increasingly important. Most existing algorithms require prior knowledge of the intrinsic parameters of the PTZ camera to infer the relative positioning and orientation among multiple PTZ cameras. To overcome this limitation, we propose a novel mapping algorithm that derives the relative positioning and orientation between two PTZ cameras based on a unified polynomial model. This reduces the dependence on knowledge of the intrinsic parameters of the PTZ cameras and their relative positions. Experimental results demonstrate that our proposed algorithm presents substantially

  6. A UAV-Based Low-Cost Stereo Camera System for Archaeological Surveys - Experiences from Doliche (Turkey)

    Science.gov (United States)

    Haubeck, K.; Prinz, T.

    2013-08-01

    The use of Unmanned Aerial Vehicles (UAVs) for surveying archaeological sites is becoming more and more common due to their advantages in rapidity of data acquisition, cost-efficiency and flexibility. One possible usage is the documentation and visualization of historic geo-structures and -objects using UAV-attached digital small frame cameras. These monoscopic cameras offer the possibility to obtain close-range aerial photographs, but - under the condition that an accurate nadir-waypoint flight is not possible due to choppy or windy weather conditions - at the same time implicate the problem that two single aerial images do not always meet the required overlap to use them for 3D photogrammetric purposes. In this paper, we present an attempt to replace the monoscopic camera with a calibrated low-cost stereo camera that takes two pictures from a slightly different angle at the same time. Our results show that such a geometrically predefined stereo image pair can be used for photogrammetric purposes, e.g. the creation of digital terrain models (DTMs) and orthophotos or the 3D extraction of single geo-objects. Because of the limited geometric photobase of the applied stereo camera and the resulting base-height ratio, however, the accuracy of the DTM directly depends on the UAV flight altitude.
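    The stated dependence of DTM accuracy on flight altitude follows from the standard stereo depth-error relation σZ ≈ Z²·σd / (B·f); a minimal sketch with assumed numbers (not the Doliche setup):

```python
def depth_error(z, baseline, focal_px, disparity_err_px=0.5):
    """Standard stereo depth uncertainty sigma_Z = Z^2 * sigma_d / (B * f),
    with the focal length expressed in pixels and z the flying height."""
    return z ** 2 * disparity_err_px / (baseline * focal_px)

# Example: a 10 cm stereo base, 1500 px focal length, 0.5 px matching accuracy.
for height_m in (10, 20, 40):
    print(height_m, "m ->", round(depth_error(height_m, 0.10, 1500.0), 3), "m")
```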

  7. Model for diffusion of a narrow beam of charged particles

    International Nuclear Information System (INIS)

    Eisenhauer, C.

    1980-01-01

    A simple analytic expression is presented to describe the three-dimensional spatial distribution of flux or energy deposition by a narrow beam of charged particles. In this expression distances are expressed in terms of a scaling parameter that is proportional to the mean square scattering angle in a single collision. Finite ranges are expressed in terms of the continuous-slowing-down range. Track-length distributions for one-velocity particles and energy deposition for electrons are discussed. Comparisons with rigorous Monte Carlo calculations show that departures from the analytic expression can be expressed as a slowly varying function of order unity. This function can be used as a basis for interpolation over a wide range of source energies and materials.
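    The author's analytic expression is not reproduced in the record. For orientation only, the widely used Fermi-Eyges small-angle result (a related but distinct pencil-beam model) gives the lateral spread of a narrow beam in a uniform medium as ⟨x²⟩ = T·z³/3, where T is the scattering power (mean square scattering angle per unit path length); a minimal sketch:

```python
import numpy as np

def fermi_eyges_moments(T, z):
    """Fermi-Eyges moments for a pencil beam in a uniform medium:
    angular variance, position-angle covariance, and lateral variance."""
    return T * z, T * z**2 / 2.0, T * z**3 / 3.0

def lateral_profile(T, z, x):
    """Gaussian lateral profile implied by the Fermi-Eyges lateral variance."""
    var_x = T * z**3 / 3.0
    return np.exp(-x**2 / (2.0 * var_x)) / np.sqrt(2.0 * np.pi * var_x)
```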

  8. Observations and computations of narrow Kelvin ship wakes

    Directory of Open Access Journals (Sweden)

    Francis Noblesse

    2016-01-01

    Full Text Available Computations of far-field ship waves, based on linear potential flow theory and the Hogner approximation, are reported for monohull ships and catamarans. Specifically, far-field ship waves are computed for six monohull ships at four Froude numbers F ≡ V/√(gL) = 0.58, 0.68, 0.86, 1.58 and for six catamarans with nondimensional hull spacing s ≡ S/L = 0.25 at two Froude numbers Fs ≡ V/√(gS) = 1 and 2.5. Here, g is the gravitational acceleration, V and L denote the ship speed and length, and S is the separation distance between the twin hulls of a catamaran. The computations show that, although the amplitudes of the waves created by a ship are strongly influenced by the shape of the ship hull, as is well known, the ray angles where the largest waves are found are only weakly influenced by the hull shape and indeed are mostly a kinematic feature of the flow around a ship hull. An important practical consequence of this flow feature is that the apparent wake angle of general monohull ships or catamarans (with arbitrarily-shaped hulls) can be estimated, without computations, by means of simple analytical relations; these relations, obtained elsewhere via parametric computations, are given here. Moreover, the influence of the two parameters Fs and s that largely determine the ray angles of the dominant waves created by a catamaran is illustrated via computations for three catamarans with hull spacings s = 0.2, 0.35, 0.5 at four Froude numbers Fs = 1, 1.5, 2, 2.5. These computations confirm that the largest waves created by wide and/or fast catamarans are found at ray angles that only depend on Fs (i.e. that do not depend on the hull spacing s), in agreement with an elementary analysis of lateral interference between the dominant waves created by the bows (or sterns) of the twin hulls of a catamaran. The dominant-waves ray angles predicted by the theory of wave-interference effects for monohull ships and catamarans are also compared with the observations of narrow Kelvin ship

  9. Photometric Calibration of Consumer Video Cameras

    Science.gov (United States)

    Suggs, Robert; Swift, Wesley, Jr.

    2007-01-01

    Equipment and techniques have been developed to implement a method of photometric calibration of consumer video cameras for imaging of objects that are sufficiently narrow or sufficiently distant to be optically equivalent to point or line sources. Heretofore, it has been difficult to calibrate consumer video cameras, especially in cases of image saturation, because they exhibit nonlinear responses with dynamic ranges much smaller than those of scientific-grade video cameras. The present method not only takes this difficulty in stride but also makes it possible to extend effective dynamic ranges to several powers of ten beyond saturation levels. The method will likely be primarily useful in astronomical photometry. There are also potential commercial applications in medical and industrial imaging of point or line sources in the presence of saturation. This development was prompted by the need to measure brightnesses of debris in amateur video images of the breakup of the Space Shuttle Columbia. The purpose of these measurements is to use the brightness values to estimate relative masses of debris objects. In most of the images, the brightness of the main body of Columbia was found to exceed the dynamic ranges of the cameras. A similar problem arose a few years ago in the analysis of video images of Leonid meteors. The present method is a refined version of the calibration method developed to solve the Leonid calibration problem. In this method, one performs an end-to-end calibration of the entire imaging system, including not only the imaging optics and imaging photodetector array but also analog tape recording and playback equipment (if used) and any frame grabber or other analog-to-digital converter (if used). To automatically incorporate the effects of nonlinearity and any other distortions into the calibration, the calibration images are processed in precisely the same manner as are the images of meteors, space-shuttle debris, or other objects that one seeks to

  10. Gamma camera

    International Nuclear Information System (INIS)

    Tschunt, E.; Platz, W.; Baer, U.; Heinz, L.

    1978-01-01

    A gamma camera has a plurality of exchangeable collimators, one of which is replaceably mounted in the ray inlet opening of the camera, while the others are placed on separate supports. Supports are swingably mounted upon a column one above the other

  11. MAG narrow gap welding - an economic way to minimize welding expenses

    International Nuclear Information System (INIS)

    Kast, W.; Scholz, E.; Weyland, F.

    1982-01-01

    The thicker structural components are, the more important it is to take measures to reduce the volume of the weld. The welding process requiring the smallest possible weld section is the so-called narrow gap process. In submerged arc narrow gap welding as well as in MAG narrow gap welding different variants are conceivable, some of them already in practical use. With regard to efficiency and weld quality an optimum variant of the MAG narrow gap welding process is described. It constitutes a two-wire system in which two wire electrodes of 1.2 mm diameter are arranged one behind the other. In order to avoid lack of fusion, the wire guides are slightly pointed towards each groove face. Thus, by inclining the two arcs burning one behind the other in the direction of weld progress, two separately solidifying weld pools and two beads per layer are formed simultaneously. Welding parameters are selected in such a way that a heat input of 16-20 kJ/cm and a deposition rate of 11-16 kg/h are obtained. In spite of this comparatively high deposition rate, good impact values are found both in the weld and the HAZ (largely reduced coarse-grain zone), which is due to an optimum weld build-up. With the available welding equipment the process can be applied to structural members having a thickness of 40-400 mm. The width of gap is 13 mm (root section) with a bevel angle of 1°. As filler metal, basic flux-cored wires are used which, depending on the base metal to be welded and the required tensile properties, can be of the Mn-, MnMo-, MnCrMo-, MnNi-, or MnNiMo-alloyed types. (orig.)

  12. Evaluation of anterior chamber angle under dark and light conditions in angle closure glaucoma: An anterior segment OCT study.

    Science.gov (United States)

    Masoodi, Habibeh; Jafarzadehpur, Ebrahim; Esmaeili, Alireza; Abolbashari, Fereshteh; Ahmadi Hosseini, Seyed Mahdi

    2014-08-01

    To evaluate changes of the nasal and temporal anterior chamber angle (ACA) in subjects with angle closure glaucoma using Spectralis AS-OCT (SAS-OCT) under dark and light conditions. Based on dark-room gonioscopy, 24 subjects with open angles and 86 with narrow angles participated in this study. The nasal and temporal angle opening distance at 500 μm anterior to the scleral spur (AOD500) and the nasal and temporal ACA were measured using SAS-OCT in light and dark conditions. In both groups, ACA and AOD500 in the nasal and temporal quadrants were significantly greater in light compared to dark (all with p=0.000). The AOD500 and ACA were significantly higher nasally than temporally in the measured conditions for both groups, except the ACA and AOD500 of the normal group measured in light. The difference between nasal and temporal in the dark (29.07 ± 65.71 μm for AOD500 and 5.7 ± 4.07° for ACA) was greater than in the light condition (24.86 ± 79.85 μm for AOD500 and 2.09 ± 7.21° for ACA), but the difference was only significant for ACA (p=0.000). The correlation analysis showed a negative correlation between AOD500 and pupil diameter in the temporal and nasal quadrants (both with p=0.000). While the temporal AOD500 difference correlated with spherical equivalent and temporal and nasal gonioscopy, the nasal AOD correlated with IOP and temporal and nasal gonioscopy. Clinically important changes in ACA structure could be detected with SAS-OCT in the nasal and temporal quadrants under different illumination intensities. The results could help in improving examination conditions for better and more accurate assessment of individuals with angle closure glaucoma. Copyright © 2014 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  13. Investigation of the microstructures of ion beams emitted from PF-1000 at different angles to the Z-axis

    Energy Technology Data Exchange (ETDEWEB)

    Skladnik-Sadowska, E.; Czaus, K.; Malinowski, K.; Kwiatkowski, R.; Zebrowski, J. [The Andrzej Soltan Institute for Nuclear Studies, IPJ, 05-400 Otwock-Swierk (Poland); Sadowski, M.J. [The Andrzej Soltan Institute for Nuclear Studies, IPJ, 05-400 Otwock-Swierk (Poland)] [Institute of Plasma Physics and Laser Microfusion, IPPLM, 01-497 Warsaw (Poland); Paduch, M.; Scholz, M. [Institute of Plasma Physics and Laser Microfusion, IPPLM, 01-497 Warsaw (Poland); Kubes, P. [Czech Technical University, CVUT, 166-27 Prague (Czech Republic); Garkusha, I.E. [Institute of Plasma Physics, NSC KIPT, 61-108 Kharkov (Ukraine); Talebitaher, A. [Plasma Radiation Sources Laboratory, NIE NTU, 637616 Singapore (Singapore)

    2011-07-01

    The paper describes diagnostics of fast ion beams emitted from the large PF-1000 facility operated at 21-27 kV, 290-480 kJ. Use was made of pinhole cameras equipped with PM-355 nuclear track detectors and placed at different angles to the discharge axis. The ion measurements performed at a 0-degree angle, as well as those at a 60-degree angle, showed a complex spatial structure of the fast ion beams. The ion measurements, which were for the first time performed in the upstream direction (at a 180-degree angle), proved that some fast deuteron beams are emitted also in the upstream direction. This document is composed of a paper and a poster. (authors)

  14. Search for narrow baryons in π⁻p elastic scattering at large angles

    CERN Document Server

    Baillon, Paul; Benayoun, M; Chauveau, J; Chew, D; Ferro-Luzzi, M; Kahane, J; Lellouch, D; Leruste, P; Liaud, P; Moreau, F; Perreau, J M; Séguinot, Jacques; Sené, R; Tocqueville, J; Urban, M

    1980-01-01

    Hoping to find resonant structures in the momentum dependence of π⁻p elastic scattering, the authors have measured the differential cross section for this reaction at c.m. angles near 90°. An intense pion beam (≈10⁷ π/s) has been used, together with a high incident momentum resolution (dP/P ≈ 2×10⁻⁴), to scan the region of laboratory momenta from 5.75 to 13.02 GeV/c (c.m. energy from 3.42 to 5.03 GeV). The sensitivity attained by the experiment is such that signals would have been seen corresponding to the formation of non-strange baryon resonances having a width larger than ≈0.1 MeV and an elasticity larger than a few per cent. Within these limits no resonances were sighted. (4 refs.)

  15. Forward rectification: spatial image normalization for a video from a forward facing vehicle camera

    Science.gov (United States)

    Prun, Viktor; Polevoy, Dmitri; Postnikov, Vassiliy

    2017-03-01

    The work in this paper is focused on visual ADAS (Advanced Driver Assistance Systems). We introduce forward rectification - a technique for making computer vision algorithms more robust to the camera mount point and mount angles. Using the technique can increase the quality of recognition as well as lower the dimensionality required for algorithm invariance, making it possible to apply simpler affine-invariant algorithms for applications that require projective invariance. To provide useful results, this rectification requires thorough calibration of the camera, which can be done automatically or semi-automatically. The technique is of a general nature and can be applied to different algorithms, such as pattern matching detectors and convolutional neural networks. The applicability of the technique is demonstrated by the detection rate of a HOG-based car detector.
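    The abstract does not spell out the rectification itself. One common way to realize such a forward rectification, assuming the calibrated intrinsic matrix K and the mount angles are known, is to warp each frame with the pure-rotation homography H = K·R·K⁻¹ that removes the camera pitch and yaw; the axis convention below is an assumption, not the authors' formulation.

```python
import cv2
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def forward_rectify(image, K, pitch_rad, yaw_rad):
    """Warp a frame as if it were taken with zero pitch/yaw mount angles,
    using the pure-rotation homography H = K R K^-1."""
    R = rot_y(-yaw_rad) @ rot_x(-pitch_rad)   # undo the mount orientation
    H = K @ R @ np.linalg.inv(K)
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))
```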

  16. Weak antilocalization effect in exfoliated black phosphorus revealed by temperature- and angle-dependent magnetoconductivity

    KAUST Repository

    Hou, Zhipeng; Gong, Chen; Wang, Yue; Zhang, Qiang; Yang, Bingchao; Zhang, Hongwei; Liu, Enke; Liu, Zhongyuan; Zeng, Zhongming; Wu, Guangheng; Wang, Wenhong; Zhang, Xixiang

    2018-01-01

    Recently, there has been increasing debate about whether a surface resonance state (SRS) exists in black phosphorus (BP), as suggested by recent angle-resolved photoemission spectroscopy (ARPES) results. To resolve this issue, we have performed temperature- and angle-dependent magnetoconductivity measurements on exfoliated, high-quality BP single crystals. A pronounced weak-antilocalization (WAL) effect was observed within a narrow temperature range of 8-16 K, with the electrical current flowing parallel to the cleaved ac-plane (along the a- or c-axis) and the magnetic field along the b-axis. The angle-dependent magnetoconductivity and the Hikami-Larkin-Nagaoka (HLN) model-fitted results reveal that the observed WAL effect shows surface-bulk coherent features, which supports the existence of an SRS in black phosphorus.

  17. Weak antilocalization effect in exfoliated black phosphorus revealed by temperature- and angle-dependent magnetoconductivity

    KAUST Repository

    Hou, Zhipeng

    2018-01-10

    Recently, there has been increasing debate about whether a surface resonance state (SRS) exists in black phosphorus (BP), as suggested by recent angle-resolved photoemission spectroscopy (ARPES) results. To resolve this issue, we have performed temperature- and angle-dependent magnetoconductivity measurements on exfoliated, high-quality BP single crystals. A pronounced weak-antilocalization (WAL) effect was observed within a narrow temperature range of 8-16 K, with the electrical current flowing parallel to the cleaved ac-plane (along the a- or c-axis) and the magnetic field along the b-axis. The angle-dependent magnetoconductivity and the Hikami-Larkin-Nagaoka (HLN) model-fitted results reveal that the observed WAL effect shows surface-bulk coherent features, which supports the existence of an SRS in black phosphorus.

  18. Radiation-resistant camera tube

    International Nuclear Information System (INIS)

    Kuwahata, Takao; Manabe, Sohei; Makishima, Yasuhiro

    1982-01-01

    Toshiba began manufacturing black-and-white radiation-resistant camera tubes employing non-browning faceplate glass for ITV cameras used in nuclear power plants a long time ago. Now, in response to the increasing demand in the nuclear power field, the company is developing radiation-resistant single color-camera tubes incorporating a color-stripe filter for color ITV cameras used in radiation environments. Presented here are the results of experiments on the characteristics of materials for single color-camera tubes and the prospects for commercialization of the tubes. (author)

  19. GRACE star camera noise

    Science.gov (United States)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.

  20. Small-angle neutron scattering and cyclic voltammetry study on electrochemically oxidized and reduced pyrolytic carbon

    International Nuclear Information System (INIS)

    Braun, A.; Kohlbrecher, J.; Baertsch, M.; Schnyder, B.; Koetz, R.; Haas, O.; Wokaun, A.

    2004-01-01

    The electrochemical double layer capacitance and internal surface area of a pyrolytic carbon material after electrochemical oxidation and subsequent reduction was studied with cyclic voltammetry and small-angle neutron scattering. Oxidation yields an enhanced internal surface area (activation), and subsequent reduction causes a decrease of this internal surface area. The change of the Porod constant, as obtained from small-angle neutron scattering, reveals that the decrease in internal surface area is not caused merely by a closing or narrowing of the pores, but by a partial collapse of the pore network

  1. A METHOD FOR SELF-CALIBRATION IN SATELLITE WITH HIGH PRECISION OF SPACE LINEAR ARRAY CAMERA

    Directory of Open Access Journals (Sweden)

    W. Liu

    2016-06-01

    Full Text Available At present, the on-orbit calibration of the geometric parameters of a space surveying camera is usually processed using data from a ground calibration field after capturing the images. The entire process is very complicated and lengthy and cannot monitor and calibrate the geometric parameters in real time. On the basis of a large number of on-orbit calibrations, we found that owing to the influence of many factors, e.g., weather, it is often difficult to capture images of the ground calibration field. Thus, regular calibration using field data cannot be ensured. This article proposes a real-time self-calibration method for a space linear array camera on a satellite using the optical autocollimation principle. A collimating light source and small matrix array CCD devices are installed inside the load system of the satellite; these use the same light path as the linear array camera. We can extract the location changes of the cross marks on the matrix array CCD to determine the real-time variations in the focal length and angle parameters of the linear array camera. The on-orbit status of the camera is rapidly obtained using this method. On the one hand, the variations of the camera can be tracked accurately and the camera's attitude adjusted in a timely manner to ensure optimal photography; on the other hand, self-calibration of the camera aboard the satellite can be realized quickly, which improves the efficiency and reliability of photogrammetric processing.
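    The conversion from cross-mark displacement to angle is not given in the abstract. In a basic autocollimation geometry, a tilt θ of the reflecting element displaces the returned mark on the detector by Δx = 2·f·θ, so θ = Δx / (2f); the sketch below illustrates that relation and a first-order focal-length estimate from mark-separation scaling, both offered as assumptions rather than the authors' exact formulation.

```python
def tilt_from_mark_shift(delta_x_mm, focal_mm):
    """Autocollimation: a tilt theta of the reflector shifts the returned
    cross mark by delta_x = 2 * f * theta, so theta = delta_x / (2 f) [rad]."""
    return delta_x_mm / (2.0 * focal_mm)

def focal_change_from_scale(sep_now_mm, sep_ref_mm, focal_ref_mm):
    """First-order focal-length change inferred from the change in the
    separation of two collimated marks (assumed linear scaling)."""
    return focal_ref_mm * (sep_now_mm / sep_ref_mm - 1.0)
```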

  2. Narrow, duplicated internal auditory canal

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, T. [Servico de Neurorradiologia, Hospital Garcia de Orta, Avenida Torrado da Silva, 2801-951, Almada (Portugal); Shayestehfar, B. [Department of Radiology, UCLA Oliveview School of Medicine, Los Angeles, California (United States); Lufkin, R. [Department of Radiology, UCLA School of Medicine, Los Angeles, California (United States)

    2003-05-01

    A narrow internal auditory canal (IAC) constitutes a relative contraindication to cochlear implantation because it is associated with aplasia or hypoplasia of the vestibulocochlear nerve or its cochlear branch. We report an unusual case of a narrow, duplicated IAC, divided by a bony septum into a superior relatively large portion and an inferior stenotic portion, in which we could identify only the facial nerve. This case adds support to the association between a narrow IAC and aplasia or hypoplasia of the vestibulocochlear nerve. The normal facial nerve argues against the hypothesis that the narrow IAC is the result of a primary bony defect which inhibits the growth of the vestibulocochlear nerve. (orig.)

  3. Flooding correlations in narrow channel

    International Nuclear Information System (INIS)

    Kim, S. H.; Baek, W. P.; Chang, S. H.

    1999-01-01

    Heat transfer in narrow gaps is considered an important phenomenon in severe accidents in nuclear power plants, as well as in heat removal from electronic chips. Critical heat flux (CHF) in a narrow gap limits the maximum heat transfer rate in a narrow channel. In the case of a closed-bottom channel, flooding-limited CHF occurrence is observed. Flooding correlations are helpful to predict the CHF in a closed-bottom channel. In the present study, flooding data for narrow channel geometry were collected and work was performed to identify the effects of the span, w, and the gap size, s. New flooding correlations were suggested for high-aspect-ratio geometry. The flooding correlation was also applied to flooding-limited CHF data

  4. S-matrix description of anomalous large-angle heavy-ion scattering

    Energy Technology Data Exchange (ETDEWEB)

    Frahn, W E; Hussein, M S [Sao Paulo Univ. (Brazil). Inst. de Fisica; Canto, L F; Donangelo, R [Rio de Janeiro Univ. (Brazil). Inst. de Fisica

    1981-10-12

    We present a quantitative description of the well-known anomalous features observed in the large-angle scattering of nα type heavy ions, in particular of the pronounced structures in the backangle excitation function for ¹⁶O + ²⁸Si. Our treatment is based on the close connection between these anomalies and particular structural deviations of the partial-wave S-matrix from normal strong-absorption behaviour. The properties of these deviations are found to be rather well specified by the data: they are localized within a narrow 'l-window' centered at a critical angular momentum significantly smaller than the grazing value, and have a parity-dependent as well as a parity-independent part. These properties provide important clues as to the physical processes causing the large-angle enhancement.

  5. S-matrix description of anomalous large-angle heavy-ion scattering

    International Nuclear Information System (INIS)

    Frahn, W.E.; Hussein, M.S.; Canto, L.F.; Donangelo, R.J.

    1981-01-01

    A quantitative description of the well-known anomalous features observed in the large-angle scattering of nα type heavy ions, in particular of the pronounced structures in the backangle excitation function for ¹⁶O + ²⁸Si, is presented. This treatment is based on the close connection between these anomalies and particular structural deviations of the partial-wave S-matrix from normal strong-absorption behaviour. The properties of these deviations are found to be rather well specified by the data: they are localized within a narrow 'l-window' centered at a critical angular momentum significantly smaller than the grazing value, and have a parity-dependent as well as a parity-independent part. These properties provide important clues as to the physical processes causing the large-angle enhancement. (Author) [pt

  6. [Chamber Angle Assessment in Clinical Practice - A Comparison between Optical Coherence Tomography and Gonioscopy].

    Science.gov (United States)

    Mösler, M P; Werner, J U; Lang, G K

    2015-07-01

    In glaucoma the structures of the anterior chamber are important for classification, therapy, progression and prognosis. In this context anterior segment optical coherence tomography (AS-OCT) gains more relevance. This study compares AS-OCT with gonioscopy in the diagnostic performance of chamber angle (CA) assessment. 104 consecutive subjects with glaucoma underwent AS-OCT imaging using the Visante OCT. Results were compared to gonioscopic grading from the patient history using the Shaffer system. In addition, anterior chamber depth (ACD) assessment using slitlamp examination was evaluated as a prognostic factor for chamber angle width (CAW) and verified by AS-OCT measurement. Average CAW was 29° (AS-OCT). 17 % of the CAs that were "wide" in gonioscopy (variance 5-55°) showed a "narrow" CA in AS-OCT. 35 % of the CAs that were "narrow" in gonioscopy (variance 0-39°) showed a "wide" CA in AS-OCT. ACD assessment using slitlamp examination is a good predictor of CAW. In this context the technique provides the same informative value as gonioscopy; in cases of "wide" ACDs it is even superior. The critical ACD for an increased risk of angle closure is 2.4 mm. Below the critical ACD, where gonioscopy is difficult or impossible, optical coherence tomography is an effective alternative to the gold standard and is to some extent even superior. Georg Thieme Verlag KG Stuttgart · New York.

  7. Statistical meandering wake model and its application to yaw-angle optimisation of wind farms

    International Nuclear Information System (INIS)

    Thøgersen, E; Tranberg, B; Greiner, M; Herp, J

    2017-01-01

    The wake produced by a wind turbine is dynamically meandering and of rather narrow nature. Only when looking at large time averages, the wake appears to be static and rather broad, and is then well described by simple engineering models like the Jensen wake model (JWM). We generalise the latter deterministic models to a statistical meandering wake model (SMWM), where a random directional deflection is assigned to a narrow wake in such a way that on average it resembles a broad Jensen wake. In a second step, the model is further generalised to wind-farm level, where the deflections of the multiple wakes are treated as independently and identically distributed random variables. When carefully calibrated to the Nysted wind farm, the ensemble average of the statistical model produces the same wind-direction dependence of the power efficiency as obtained from the standard Jensen model. Upon using the JWM to perform a yaw-angle optimisation of wind-farm power output, we find an optimisation gain of 6.7% for the Nysted wind farm when compared to zero yaw angles and averaged over all wind directions. When applying the obtained JWM-based optimised yaw angles to the SMWM, the ensemble-averaged gain is calculated to be 7.5%. This outcome indicates the possible operational robustness of an optimised yaw control for real-life wind farms. (paper)

  8. Statistical meandering wake model and its application to yaw-angle optimisation of wind farms

    Science.gov (United States)

    Thøgersen, E.; Tranberg, B.; Herp, J.; Greiner, M.

    2017-05-01

    The wake produced by a wind turbine is dynamically meandering and of rather narrow nature. Only when looking at large time averages, the wake appears to be static and rather broad, and is then well described by simple engineering models like the Jensen wake model (JWM). We generalise the latter deterministic models to a statistical meandering wake model (SMWM), where a random directional deflection is assigned to a narrow wake in such a way that on average it resembles a broad Jensen wake. In a second step, the model is further generalised to wind-farm level, where the deflections of the multiple wakes are treated as independently and identically distributed random variables. When carefully calibrated to the Nysted wind farm, the ensemble average of the statistical model produces the same wind-direction dependence of the power efficiency as obtained from the standard Jensen model. Upon using the JWM to perform a yaw-angle optimisation of wind-farm power output, we find an optimisation gain of 6.7% for the Nysted wind farm when compared to zero yaw angles and averaged over all wind directions. When applying the obtained JWM-based optimised yaw angles to the SMWM, the ensemble-averaged gain is calculated to be 7.5%. This outcome indicates the possible operational robustness of an optimised yaw control for real-life wind farms.
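    The two records above refer to the Jensen wake model without reproducing it. A minimal sketch of the standard top-hat Jensen (Park) velocity deficit, with an assumed thrust coefficient and wake-decay constant, plus a crude analogue of the statistical-meandering idea (averaging a narrow wake over random lateral deflections); this is an illustration, not the calibrated SMWM.

```python
import numpy as np

def jensen_deficit(x, y, rotor_radius, ct=0.8, k=0.05):
    """Fractional velocity deficit of a top-hat Jensen wake at a point x
    downstream and y laterally offset from the rotor centre."""
    if x <= 0:
        return 0.0
    if abs(y) > rotor_radius + k * x:          # outside the expanding wake
        return 0.0
    return (1.0 - np.sqrt(1.0 - ct)) / (1.0 + k * x / rotor_radius) ** 2

def meandering_deficit(x, y, rotor_radius, sigma_deflection, n=2000, **kw):
    """Average a narrow Jensen-type wake over random lateral deflections of
    its centreline -- a rough stand-in for the meandering-wake ensemble."""
    rng = np.random.default_rng(0)
    offsets = rng.normal(0.0, sigma_deflection, n)
    return float(np.mean([jensen_deficit(x, y - d, rotor_radius, **kw)
                          for d in offsets]))
```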

  9. Gamma camera

    International Nuclear Information System (INIS)

    Schlosser, P.A.; Steidley, J.W.

    1980-01-01

    The design of a collimation system for a gamma camera for use in nuclear medicine is described. When used with a 2-dimensional position-sensitive radiation detector, the novel system can produce better images than conventional cameras. The optimal thickness and positions of the collimators are derived mathematically. (U.K.)

  10. Picosecond camera

    International Nuclear Information System (INIS)

    Decroisette, Michel

    A Kerr cell activated by infrared pulses from a mode-locked Nd glass laser acts as an ultra-fast periodic shutter with an opening time of a few ps. Associated with an S.T.L. camera, it gives rise to a picosecond camera allowing very fast effects to be studied [fr

  11. Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery

    Science.gov (United States)

    Kwoh, L. K.; Huang, X.; Tan, W. J.

    2012-07-01

    XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera IRIS with three spectral bands - 0.52~0.60 µm for Green, 0.63~0.69 µm for Red and 0.76~0.89 µm for NIR - at 12 m resolution. In the design of the IRIS camera, the three bands were acquired by three lines of CCDs (NIR, Red and Green). These CCDs were physically separated in the focal plane and their first pixels were not absolutely aligned. The micro-satellite platform was also not stable enough to allow co-registration of the 3 bands with a simple linear transformation. In the camera model developed, this platform instability was compensated with 3rd- to 4th-order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, as well as the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separation agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs red and green vs red CCDs, respectively). The cross-track alignments were 0.05 pixel and 5.9 pixel for the NIR vs red and green vs red CCDs, respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower temperature at which XSAT is currently operating. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
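    As a rough illustration of the low-order attitude compensation mentioned above, per-scene roll, pitch and yaw samples can be smoothed with polynomial fits; the fitting routine and order below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_attitude_polynomials(t, roll, pitch, yaw, order=3):
    """Fit low-order polynomials to attitude-angle samples over a scene."""
    return {"roll": np.polyfit(t, roll, order),
            "pitch": np.polyfit(t, pitch, order),
            "yaw": np.polyfit(t, yaw, order)}

def smoothed_attitude(coeffs, t):
    """Evaluate the fitted polynomials at the image-line times t."""
    return {name: np.polyval(c, t) for name, c in coeffs.items()}
```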

  12. Reducing the Variance of Intrinsic Camera Calibration Results in the ROS Camera_Calibration Package

    Science.gov (United States)

    Chiou, Geoffrey Nelson

    The intrinsic calibration of a camera is the process in which the internal optical and geometric characteristics of the camera are determined. If accurate intrinsic parameters of a camera are known, the ray in 3D space that every point in the image lies on can be determined. Pairing with another camera allows the position of the points in the image to be calculated by intersection of the rays. Accurate intrinsics also allow the position and orientation of a camera relative to some world coordinate system to be calculated. These two reasons for having accurate intrinsic calibration for a camera are especially important in the field of industrial robotics, where 3D cameras are frequently mounted on the ends of manipulators. In the ROS (Robot Operating System) ecosystem, the camera_calibration package is the default standard for intrinsic camera calibration. Several researchers from the Industrial Robotics & Automation division at Southwest Research Institute have noted that this package results in large variances in the intrinsic parameters of the camera when calibrating across multiple attempts. There are also open issues on this matter in their public repository that have not been addressed by the developers. In this thesis, we confirm that the camera_calibration package does indeed return different results across multiple attempts, test several possible hypotheses as to why, identify the reason, and provide a simple solution to fix the cause of the issue.
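    For concreteness, a minimal OpenCV chessboard calibration in the spirit of what the camera_calibration package automates (this is generic OpenCV usage, not the package's code; the pattern size, square size and image folder are assumptions):

```python
import glob

import cv2
import numpy as np

pattern = (9, 6)      # inner-corner count of the chessboard (assumed)
square = 0.025        # square size in metres (assumed)

# 3D coordinates of the chessboard corners in the board frame.
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points, size = [], [], None
for fname in glob.glob("calib/*.png"):        # hypothetical image folder
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, size, None, None)
print("reprojection RMS:", rms)
print("camera matrix:\n", K)
```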

  13. Commercialization of radiation tolerant camera

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Bum; Choi, Young Soo; Kim, Sun Ku; Lee, Jong Min; Cha, Bung Hun; Lee, Nam Ho; Byun, Eiy Gyo; Yoo, Seun Wook; Choi, Bum Ki; Yoon, Sung Up; Kim, Hyun Gun; Sin, Jeong Hun; So, Suk Il

    1999-12-01

    In this project, a radiation-tolerant camera which tolerates a total dose of 10⁶-10⁸ rad was developed. In order to develop the radiation-tolerant camera, the radiation effects on camera components were examined and evaluated, and the camera configuration was studied. Based on the evaluation results, the components were selected and the design was carried out. A vidicon tube was selected as the image sensor, and non-browning optics and a camera driving circuit were applied. The controller needed for the CCTV camera system (lens, light, pan/tilt controller) was designed on the concept of remote control. Two types of radiation-tolerant camera were fabricated, intended for use in underwater or normal environments. (author)

  14. Commercialization of radiation tolerant camera

    International Nuclear Information System (INIS)

    Lee, Yong Bum; Choi, Young Soo; Kim, Sun Ku; Lee, Jong Min; Cha, Bung Hun; Lee, Nam Ho; Byun, Eiy Gyo; Yoo, Seun Wook; Choi, Bum Ki; Yoon, Sung Up; Kim, Hyun Gun; Sin, Jeong Hun; So, Suk Il

    1999-12-01

    In this project, a radiation-tolerant camera which tolerates a total dose of 10⁶-10⁸ rad was developed. In order to develop the radiation-tolerant camera, the radiation effects on camera components were examined and evaluated, and the camera configuration was studied. Based on the evaluation results, the components were selected and the design was carried out. A vidicon tube was selected as the image sensor, and non-browning optics and a camera driving circuit were applied. The controller needed for the CCTV camera system (lens, light, pan/tilt controller) was designed on the concept of remote control. Two types of radiation-tolerant camera were fabricated, intended for use in underwater or normal environments. (author)

  15. Cameras in mobile phones

    Science.gov (United States)

    Nummela, Ville; Viinikanoja, Jarkko; Alakarhu, Juha

    2006-04-01

    One of the fastest growing consumer markets today is camera phones. During the past few years the total volume has been growing fast, and today millions of mobile phones with cameras are sold. At the same time the resolution and functionality of the cameras have been growing from CIF towards DSC level. From the camera point of view the mobile world is an extremely challenging field. Cameras should have good image quality but in a small size. They also need to be reliable and their construction should be suitable for mass manufacturing. All components of the imaging chain should be well optimized in this environment. Image quality and usability are the most important parameters to the user. The current trend of adding more megapixels to cameras while at the same time using smaller pixels affects both. On the other hand, reliability and miniaturization are key drivers for product development, as is cost. In an optimized solution all parameters are in balance, but the process of finding the right trade-offs is not an easy task. In this paper trade-offs related to optics and their effects on image quality and usability of cameras are discussed. Key development areas from the mobile phone camera point of view are also listed.

  16. Development of a swim-type ROV for narrow space inspection

    International Nuclear Information System (INIS)

    Okada, Satoshi; Otani, Kenichi; Kobayashi, Ryosuke; Ohno, Kazunori

    2017-01-01

    A swim-type remotely operated vehicle (ROV) for inspection of narrow spaces in nuclear power plants has been developed. Many structures are crowded at regular intervals in the confined space at the bottom of a reactor, so the thickness of the ROV is an important design point to ensure that the ROV can move in the space. The developed ROV has a three-dimensional swimming mechanism using six thrusters, three cameras for observing its position while moving and for making inspections easily, and a localization system. The localization system combines two elements: a gyroscope to detect the direction of progression, and a slit laser that detects the distance of progression using the optical cutting method. The localization method is called the modified inertial navigation (MIN) method and it was evaluated in a mock-up examination. The ROV was able to move smoothly using the MIN method and its position could be detected without error in the route followed. (author)

  17. On the Calibration and Accuracy of the Guinier Camera for the Determination of Interplanar Spacings

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, Manfred

    1962-03-15

    Equations describing the relative mean error in the determination of interplanar spacings have been theoretically deduced for transmission cameras and X-ray investigations with and without internal standard. Together with standard film exposures of known substances these equations may be used to determine once and for all the accuracy of a Guinier camera under different experimental conditions. This is shown for an 80 mm camera positioned asymmetrically to the incident beam (α = 30°). Pieces of Scotch tape were used as powder sample holders, and comparator measurements on K(α₁ + α₂) spectra as well as rapid d-scale readings are discussed. It was found that a camera constant substantially independent of θ can be determined most accurately for any film with the aid of only a few calibration lines diffracted to high θ angles if, for an optimal line measuring accuracy of ±0.01 mm, the sample is never allowed to deviate by more than ±0.05 mm from the focussing cylinder defined by the film position. Using tape as sample holder, this condition can be fulfilled with a probability of at least 80%. With thin glass slides on the other hand, the systematic error may be reduced so far that the use of internal standards after the primary calibration of a camera will appear unnecessary.

  18. Effects of injection angles on combustion processes using multiple injection strategies in an HSDI diesel engine

    Energy Technology Data Exchange (ETDEWEB)

    Tiegang Fang; Robert E. Coverdill; Chia-fon F. Lee; Robert A. White [North Carolina State University, Raleigh, NC (United States). Department of Mechanical and Aerospace Engineering

    2008-11-15

    Effects of injection angles and injection pressure on the combustion processes employing multiple injection strategies in a high-speed direct-injection (HSDI) diesel engine are presented in this work. Whole-cycle combustion and liquid spray evolution processes were visualized using a high-speed video camera. NOx emissions were measured in the exhaust pipe. Different heat release patterns are seen for two different injectors with a 70-degree tip and a 150-degree tip. No evidence of fuel-wall impingement is found for the first injection of the 150-degree tip, but for the 70-degree tip, some fuel impinges on the bowl wall and a fuel film is formed. For the second injection, a large amount of fuel deposition is observed for the 70-degree tip. Weak flame is seen for the first injection of the 150-degree tip while two sorts of flames are seen for the first injection of the 70-degree tip including an early weak flame and a late luminous film combustion flame. Ignition occurs near the spray tip in the vicinity of the bowl wall for the second injection events of the 150-degree tip, however, it is near the injector tip in the central region of the bowl for the 70-degree tip. The flame is more homogeneous for the 150-degree tip with higher injection pressure with little soot formation similar to a premixed-charge-compression-ignition (PCCI) combustion. For other cases, liquid fuel is injected into flames showing diffusion flame combustion. More soot luminosity is seen for the 70-degree tip due to significant fuel film deposition on the piston wall with fuel film combustion for both injection events. Lower NOx emissions were obtained for the narrow-angle injector due to the rich air-fuel mixture near the bowl wall during the combustion process. 30 refs., 11 figs., 3 tabs.

  19. Astigmatism-free high-brightness 1060 nm edge-emitting lasers with narrow circular beam profile.

    Science.gov (United States)

    Miah, Md Jarez; Kalosha, Vladimir P; Bimberg, Dieter; Pohl, Johannes; Weyers, Markus

    2016-12-26

    1060 nm high-brightness vertical broad-area edge-emitting lasers providing anastigmatic high optical power in a narrow circular beam profile are demonstrated. Ridge-waveguide (RW) lasers yield a record 2.2 W single-transverse-mode power in the 1060-nm wavelength range under continuous-wave (cw) operation at room temperature with an excellent beam quality factor M2 ≤ 2. Independent of operating current, the astigmatism is only 2.5 µm. 3 mm long broad-area (BA) lasers produce a vertical far-field divergence θvert as narrow as 9° full width at half maximum, which agrees well with our simulation results and is insensitive to drive current. 5 mm long BA lasers deliver the highest cw multimode output power (12 W) reported so far among lasers showing θvert < 10° in the 1060-nm wavelength range. The emitted laser beams from both RW and BA lasers show a perfectly circular shape with ≤10° divergence angle at record 2.1 W and 4.2 W cw-mode output powers, respectively.

  20. A Precise Visual Method for Narrow Butt Detection in Specular Reflection Workpiece Welding

    Directory of Open Access Journals (Sweden)

    Jinle Zeng

    2016-09-01

    Full Text Available During the complex path workpiece welding, it is important to keep the welding torch aligned with the groove center using a visual seam detection method, so that the deviation between the torch and the groove can be corrected automatically. However, when detecting the narrow butt of a specular reflection workpiece, the existing methods may fail because of the extremely small groove width and the poor imaging quality. This paper proposes a novel detection method to solve these issues. We design a uniform surface light source to get high signal-to-noise ratio images against the specular reflection effect, and a double-line laser light source is used to obtain the workpiece surface equation relative to the torch. Two light sources are switched on alternately and the camera is synchronized to capture images when each light is on; then the position and pose between the torch and the groove can be obtained nearly at the same time. Experimental results show that our method can detect the groove effectively and efficiently during the welding process. The image resolution is 12.5 μm and the processing time is less than 10 ms per frame. This indicates our method can be applied to real-time narrow butt detection during high-speed welding process.

  1. Advantages of computer cameras over video cameras/frame grabbers for high-speed vision applications

    Science.gov (United States)

    Olson, Gaylord G.; Walker, Jo N.

    1997-09-01

    Cameras designed to work specifically with computers can have certain advantages in comparison to the use of cameras loosely defined as 'video' cameras. In recent years the camera type distinctions have become somewhat blurred, with a great presence of 'digital cameras' aimed more at the home markets. This latter category is not considered here. The term 'computer camera' herein is intended to mean one which has low level computer (and software) control of the CCD clocking. These can often be used to satisfy some of the more demanding machine vision tasks, and in some cases with a higher rate of measurements than video cameras. Several of these specific applications are described here, including some which use recently designed CCDs which offer good combinations of parameters such as noise, speed, and resolution. Among the considerations for the choice of camera type in any given application would be such effects as 'pixel jitter,' and 'anti-aliasing.' Some of these effects may only be relevant if there is a mismatch between the number of pixels per line in the camera CCD and the number of analog to digital (A/D) sampling points along a video scan line. For the computer camera case these numbers are guaranteed to match, which alleviates some measurement inaccuracies and leads to higher effective resolution.

  2. Geometric approach to the design of an imaging probe to evaluate the iridocorneal angle structures

    Science.gov (United States)

    Hong, Xun Jie Jeesmond; V. K., Shinoj; Murukeshan, V. M.; Baskaran, M.; Aung, Tin

    2017-06-01

    Photographic imaging methods allow the tracking of anatomical changes in the iridocorneal angle structures and the monitoring of treatment responses over time. In this work, we aim to design an imaging probe to evaluate the iridocorneal angle structures using geometrical optics. We first perform an analytical analysis of light propagation from the anterior chamber of the eye to the exterior medium using Snell's law. This is followed by adopting a strategy to achieve uniform near-field irradiance, by simplifying the complex non-rotationally-symmetric irradiance distribution of LEDs tilted at an angle. The optimization is based on the geometric design considerations of an angled circular ring array of 4 LEDs (or a 2 × 2 square LED array). The design equations give insight into variable parameters such as the illumination angle of the LEDs, the ring array radius, the viewing angle of the LEDs, and the working distance. A micro color CCD video camera that has sufficient resolution to resolve the iridocorneal angle structures at the required working distance is then chosen. The proposed design aspects fulfil the safety requirements recommended by the International Commission on Non-Ionizing Radiation Protection.
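    The Snell's-law step can be made concrete with a short sketch: using an assumed nominal corneal index of 1.376 against air, rays from the iridocorneal angle that strike the cornea beyond the critical angle are totally internally reflected, which is why the probe geometry matters.

```python
import numpy as np

def refracted_angle_deg(theta_inc_deg, n1=1.376, n2=1.0):
    """Snell's law n1*sin(t1) = n2*sin(t2); returns the exit angle in
    degrees, or None when the ray is totally internally reflected."""
    s = n1 * np.sin(np.radians(theta_inc_deg)) / n2
    if abs(s) > 1.0:
        return None
    return float(np.degrees(np.arcsin(s)))

critical = np.degrees(np.arcsin(1.0 / 1.376))
print(f"critical angle at the cornea-air interface ~ {critical:.1f} deg")
print(refracted_angle_deg(30.0))   # a ray that does escape the eye
```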

  3. Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras

    Science.gov (United States)

    Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.

    2013-12-01

    The objective of this work is the study of some faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during the previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by distances of, on average, 13 kilometers. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to see a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom high-speed camera version 9.1, which is set to operate at a frame rate of 2,500 frames per second with a Nikkor lens (model AF-S DX 18-55 mm 1:3.5-5.6 G) in the stationary sensors, and a lens (model AF-S ED 24 mm 1:1.4) in the mobile sensor. All videos were GPS (Global Positioning System) time stamped. For this work we used a data set collected on four RAMMER manual operation days in the campaigns of 2012 and 2013. On Feb. 18th the data set is composed of 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On Feb. 19th a total of 5 flashes was registered by two cameras and 1 flash registered by three cameras. On Feb. 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing current. Problems in the temporal measurement of the continuing current can generate some imprecision during the optical analysis; therefore this work aims to evaluate the effects of distance on this parameter with this preliminary data set. In the cases that include the color camera we analyzed the RGB

  4. Kernel integration scatter model for parallel beam gamma camera and SPECT point source response

    International Nuclear Information System (INIS)

    Marinkovic, P.M.

    2001-01-01

    Scatter correction is a prerequisite for quantitative single photon emission computed tomography (SPECT). In this paper a kernel integration scatter model for the parallel beam gamma camera and SPECT point source response, based on the Klein-Nishina formula, is proposed. This method models the primary photon distribution as well as first Compton scattering. It also includes a correction for multiple scattering by applying a point isotropic single medium buildup factor for the path segment between the point of scatter and the point of detection. Gamma-ray attenuation in the imaged object, based on a known μ-map distribution, is considered too. The intrinsic spatial resolution of the camera is approximated by a simple Gaussian function. The collimator is modeled simply using acceptance angles derived from the physical dimensions of the collimator. Any gamma rays satisfying this angle were passed through the collimator to the crystal. Septal penetration and scatter in the collimator were not included in the model. The method was validated by comparison with a Monte Carlo MCNP-4a numerical phantom simulation and excellent results were obtained. Physical phantom experiments to confirm this method are planned. (author)
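    The Klein-Nishina formula underlying the first-scatter kernel is standard; a minimal sketch of the differential cross section and the Compton-scattered photon energy (constants in cm and keV), shown here as generic physics rather than the paper's full kernel:

```python
import numpy as np

R_E = 2.8179403262e-13      # classical electron radius [cm]
ME_C2 = 511.0               # electron rest energy [keV]

def scattered_energy(theta, e_kev):
    """Energy [keV] of the Compton-scattered photon at scattering angle theta."""
    eps = e_kev / ME_C2
    return e_kev / (1.0 + eps * (1.0 - np.cos(theta)))

def klein_nishina(theta, e_kev):
    """Klein-Nishina differential cross section dsigma/dOmega [cm^2/sr]."""
    p = scattered_energy(theta, e_kev) / e_kev          # E'/E
    return 0.5 * R_E**2 * p**2 * (p + 1.0 / p - np.sin(theta)**2)

# First-scatter weight and scattered energy for a 140 keV photon at 60 degrees:
theta = np.radians(60.0)
print(klein_nishina(theta, 140.0), scattered_energy(theta, 140.0))
```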

  5. Divergence-ratio axi-vision camera (Divcam): A distance mapping camera

    International Nuclear Information System (INIS)

    Iizuka, Keigo

    2006-01-01

    A novel distance mapping camera, the divergence-ratio axi-vision camera (Divcam), is proposed. The decay of the illuminating light with distance, due to the divergence of the light, is used as a means of mapping the distance. Resolutions of 10 mm over a range of meters and 0.5 mm over a range of decimeters were achieved. The special features of this camera are its high-resolution real-time operation, simplicity, compactness, light weight, portability, and yet low fabrication cost. The feasibility of various potential applications is also discussed.
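
    A minimal sketch of the divergence-ratio idea, stated only to illustrate the principle: assuming two ideal point-like illuminators offset by a known axial distance and pure inverse-square falloff, the surface reflectance cancels in the ratio of the two images and the per-pixel distance can be solved in closed form. The function below is illustrative and is not the actual Divcam optics or calibration.

```python
import numpy as np

def distance_from_divergence_ratio(img_near, img_far, d_offset):
    """Estimate per-pixel distance from two images taken under point-like
    illuminators separated by a known axial distance d_offset.

    Idealised model: I_near ~ rho / r^2 and I_far ~ rho / (r + d_offset)^2,
    so the surface reflectance rho cancels in the ratio and r follows from
        sqrt(I_far / I_near) = r / (r + d_offset).
    """
    ratio = np.sqrt(np.clip(img_far, 1e-12, None) / np.clip(img_near, 1e-12, None))
    return d_offset * ratio / np.clip(1.0 - ratio, 1e-12, None)
```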

  6. Thermal Cameras and Applications

    DEFF Research Database (Denmark)

    Gade, Rikke; Moeslund, Thomas B.

    2014-01-01

    Thermal cameras are passive sensors that capture the infrared radiation emitted by all objects with a temperature above absolute zero. This type of camera was originally developed as a surveillance and night vision tool for the military, but recently the price has dropped significantly, opening up a broader field of applications. Deploying this type of sensor in vision systems eliminates the illumination problems of normal greyscale and RGB cameras. This survey provides an overview of the current applications of thermal cameras. Applications include animals, agriculture, buildings, gas detection, industrial, and military applications, as well as detection, tracking, and recognition of humans. Moreover, this survey describes the nature of thermal radiation and the technology of thermal cameras.

  7. Radiation camera exposure control

    International Nuclear Information System (INIS)

    Martone, R.J.; Yarsawich, M.; Wolczek, W.

    1976-01-01

    A system and method for governing the exposure of an image generated by a radiation camera to an image sensing camera is disclosed. The exposure is terminated in response to the accumulation of a predetermined quantity of radiation, defining a radiation density, occurring in a predetermined area. An index is produced which represents the value of that quantity of radiation whose accumulation causes the exposure termination. The value of the predetermined radiation quantity represented by the index is sensed so that the radiation camera image intensity can be calibrated to compensate for changes in exposure amounts due to desired variations in radiation density of the exposure, maintaining the detectability of the image by the image sensing camera notwithstanding such variations. Provision is also made for calibrating the image intensity in accordance with the sensitivity of the image sensing camera, and for locating the index so as to maintain its detectability and ensure proper centering of the radiation camera image.

  8. Performance of the gamma-ray camera based on GSO(Ce) scintillator array and PSPMT with the ASIC readout system

    International Nuclear Information System (INIS)

    Ueno, Kazuki; Hattori, Kaori; Ida, Chihiro; Iwaki, Satoru; Kabuki, Shigeto; Kubo, Hidetoshi; Kurosawa, Shunsuke; Miuchi, Kentaro; Nagayoshi, Tsutomu; Nishimura, Hironobu; Orito, Reiko; Takada, Atsushi; Tanimori, Toru

    2008-01-01

    We have studied the performance of a readout system with ASIC chips for a gamma-ray camera based on a 64-channel multi-anode PSPMT (Hamamatsu flat-panel H8500) coupled to a GSO(Ce) scintillator array. The GSO array consists of 8x8 pixels of 6x6x13 mm^3 with the same pixel pitch as the anode of the H8500. This camera is intended to serve as an absorber of an electron tracking Compton gamma-ray camera that measures gamma rays up to ∼1 MeV. Because we need a readout system with low power consumption for a balloon-borne experiment, we adopted a 32-channel ASIC chip, the IDEAS VA32HDR11, which has one of the widest dynamic ranges among commercial chips. However, when used with a GSO(Ce) crystal and the H8500, the dynamic range of the VA32HDR11 is narrow, and therefore the H8500 has to be operated with a low gain of about 10^5. If the H8500 is operated with a low gain, the camera has a narrow incident-energy dynamic range from 100 to 700 keV and a poor energy resolution of 13.0% (FWHM) at 662 keV. We have therefore developed an attenuator board in order to operate the H8500 with the typical gain of 10^6, which can measure gamma rays up to ∼1 MeV. The board makes the variation of the anode gain uniform and widens the dynamic range of the H8500. The system using the new attenuator board has a good uniformity of min:max ∼ 1:1.6, an incident-energy dynamic range from 30 to 900 keV, a position resolution of less than 6 mm, and a typical energy resolution of 10.6% (FWHM) at 662 keV, with a low power consumption of about 1.7 W for 64 channels.

  9. Use of cameras for monitoring visibility impairment

    Science.gov (United States)

    Malm, William; Cismoski, Scott; Prenni, Anthony; Peters, Melanie

    2018-02-01

    Webcams and automated, color photography cameras have been routinely operated in many U.S. national parks and other federal lands as far back as 1988, with a general goal of meeting interpretive needs within the public lands system and communicating effects of haze on scenic vistas to the general public, policy makers, and scientists. Additionally, it would be desirable to extract quantifiable information from these images to document how visibility conditions change over time and space and to further reflect the effects of haze on a scene, in the form of atmospheric extinction, independent of changing lighting conditions due to time of day, year, or cloud cover. Many studies have demonstrated a link between image indexes and visual range or extinction in urban settings where visibility is significantly degraded and where scenes tend to be gray and devoid of color. In relatively clean, clear atmospheric conditions, clouds and lighting conditions can sometimes affect the image radiance field as much or more than the effects of haze. In addition, over the course of many years, cameras have been replaced many times as technology improved or older systems wore out, and therefore camera image pixel density has changed dramatically. It is shown that gradient operators are very sensitive to image resolution while contrast indexes are not. Furthermore, temporal averaging and time of day restrictions allow for developing quantitative relationships between atmospheric extinction and contrast-type indexes even when image resolution has varied over time. Temporal averaging effectively removes the variability of visibility indexes associated with changing cloud cover and weather conditions, and changes in lighting conditions resulting from sun angle effects are best compensated for by restricting averaging to only certain times of the day.
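
    A hedged sketch of the kind of contrast index and temporal averaging described above; the study's exact index definition, region masks and averaging window are not given in the abstract, so the masks, the midday window and the (I_bg - I_target)/(I_bg + I_target) form below are illustrative choices only.

```python
import numpy as np

def contrast_index(image, target_mask, background_mask):
    """Simple contrast between a dark scene feature and an adjacent brighter
    background region of the same frame (illustrative definition)."""
    i_t = image[target_mask].mean()
    i_b = image[background_mask].mean()
    return (i_b - i_t) / (i_b + i_t)

def daily_averaged_index(images, times, target_mask, background_mask,
                         hour_min=10, hour_max=14):
    """Average the index over a restricted time-of-day window (times are
    datetime objects) to suppress sun-angle and cloud-cover variability."""
    vals = [contrast_index(img, target_mask, background_mask)
            for img, t in zip(images, times) if hour_min <= t.hour < hour_max]
    return np.mean(vals) if vals else np.nan
```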

  10. Adapting Virtual Camera Behaviour

    DEFF Research Database (Denmark)

    Burelli, Paolo

    2013-01-01

    In a three-dimensional virtual environment aspects such as narrative and interaction completely depend on the camera since the camera defines the player’s point of view. Most research works in automatic camera control aim to take the control of this aspect from the player to automatically generate...

  11. MicroCameras and Photometers (MCP) on board the TARANIS satellite

    Science.gov (United States)

    Farges, T.; Hébert, P.; Le Mer-Dachard, F.; Ravel, K.; Gaillac, S.

    2017-12-01

    TARANIS (Tool for the Analysis of Radiations from lightNing and Sprites) is a CNES micro satellite. Its main objective is to study impulsive transfers of energy between the Earth's atmosphere and the space environment. It will be sun-synchronous at an altitude of 700 km. It will be launched in 2019 for at least 2 years. Its payload is composed of several electromagnetic instruments covering different wavelengths (from gamma-rays to radio waves, including optical). TARANIS instruments are currently in the calibration and qualification phase. The purpose is to present the MicroCameras and Photometers (MCP) design, to show its performance after its recent characterization, and finally to discuss the scientific objectives and how we intend to address them with the MCP observations. The MicroCameras, developed by Sodern, are dedicated to the spatial description of TLEs and their parent lightning. They are able to differentiate sprites and lightning thanks to two narrow bands ([757-767 nm] and [772-782 nm]) that provide simultaneous pairs of images of an event. Simulation results of the differentiation method will be shown. After calibration and tests, the MicroCameras are now delivered to the CNES for integration on the payload. The Photometers, developed by Bertin Technologies, will provide temporal measurements and spectral characteristics of TLEs and lightning. They are key instruments because of their capability to detect TLEs on board and then switch all the instruments of the scientific payload into their high-resolution acquisition mode. The Photometers use four spectral bands, [170-260 nm], [332-342 nm], [757-767 nm] and [600-900 nm], and have the same field of view as the cameras. The remote-controlled parameters of the on-board TLE detection algorithm have been tuned before launch using the electronic board and simulated or real event waveforms. After calibration, the Photometers are now going through the environmental tests. They will be delivered to the CNES for integration on the

  12. Improving the Geolocation Algorithm for Sensors Onboard the ISS: Effect of Drift Angle

    Directory of Open Access Journals (Sweden)

    Changyong Dou

    2014-05-01

    The drift angle caused by the Earth’s self-rotation may introduce a rotational displacement artifact in the geolocation results of imagery acquired by an Earth observing sensor onboard the International Space Station (ISS). If uncorrected, it would cause a gradual degradation of positional accuracy from the center towards the edges of an image. A correction method to account for the drift angle effect was developed. The drift angle was calculated from the ISS state vectors and the positional information of the ground nadir point of the imagery. Tests with images acquired by the International Space Station Agriculture Camera (ISSAC), using Google Earth™ as a reference, indicated that applying the drift angle correction can reduce the residual geolocation error for the corner points of the ISSAC images from over 1000 to less than 500 m. The improved geolocation accuracy is well within the inherent geolocation uncertainty of up to 800 m, which is mainly due to imprecise knowledge of the ISS attitude and state parameters required to perform the geolocation algorithm.
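
    The paper computes the drift angle from the ISS state vectors; the following is only a back-of-the-envelope sketch of the underlying geometry, assuming the drift angle arises from the cross-track component of the Earth's eastward surface velocity at the nadir latitude. The parameter names and the simple flat-Earth approximation are assumptions for illustration, not the published algorithm.

```python
import numpy as np

OMEGA_E = 7.2921159e-5   # Earth rotation rate, rad/s
R_E = 6.371e6            # mean Earth radius, m

def drift_angle(lat_deg, ground_speed, heading_deg):
    """Approximate drift angle (rad) at the image nadir point.
    lat_deg      : nadir latitude
    ground_speed : along-track speed of the nadir point, m/s (~7100 for ISS)
    heading_deg  : ground-track heading, degrees clockwise from north
    Only the cross-track component of the eastward surface velocity skews the image.
    """
    v_surf = OMEGA_E * R_E * np.cos(np.radians(lat_deg))   # eastward surface speed
    v_cross = v_surf * np.cos(np.radians(heading_deg))     # component across the track
    return np.arctan2(v_cross, ground_speed)

def derotate(pix_xy, center_xy, angle_rad):
    """Rotate pixel coordinates about the image center to undo the drift."""
    c0 = np.asarray(center_xy, float)
    c, s = np.cos(-angle_rad), np.sin(-angle_rad)
    d = np.asarray(pix_xy, float) - c0
    return c0 + d @ np.array([[c, -s], [s, c]]).T
```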

  13. Making Ceramic Cameras

    Science.gov (United States)

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  14. Camera Movement in Narrative Cinema

    DEFF Research Database (Denmark)

    Nielsen, Jakob Isak

    2007-01-01

    section unearths what characterizes the literature on camera movement. The second section of the dissertation delineates the history of camera movement itself within narrative cinema. Several organizational principles subtending the on-screen effect of camera movement are revealed in section two … but they are not organized into a coherent framework. This is the task that section three meets in proposing a functional taxonomy for camera movement in narrative cinema. Two presumptions subtend the taxonomy: that camera movement actively contributes to the way in which we understand the sound and images on the screen … commentative or valuative manner. 4) Focalization: associating the movement of the camera with the viewpoints of characters or entities in the story world. 5) Reflexive: inviting spectators to engage with the artifice of camera movement. 6) Abstract: visualizing abstract ideas and concepts. In order...

  15. Design of an experimental four-camera setup for enhanced 3D surface reconstruction in microsurgery

    Directory of Open Access Journals (Sweden)

    Marzi Christian

    2017-09-01

    Future fully digital surgical visualization systems enable a wide range of new options. Owing to optomechanical limitations, a main disadvantage of today’s surgical microscopes is their inability to provide arbitrary perspectives to more than two observers. In a fully digital microscopic system, multiple arbitrary views can be generated from a 3D reconstruction. Modern surgical microscopes allow replacing the eyepieces by cameras in order to record stereoscopic videos. A reconstruction from these videos can only contain the amount of detail the recording camera system gathers from the scene. Therefore, covered surfaces can result in a faulty reconstruction for deviating stereoscopic perspectives. By adding cameras recording the object from different angles, additional information about the scene is acquired, allowing the reconstruction to be improved. Our approach is to use a fixed four-camera setup as a front-end system to capture enhanced 3D topography of a pseudo-surgical scene. This experimental setup would provide images for the reconstruction algorithms and for the generation of multiple observer stereo perspectives. The concept of the designed setup is based on the common main objective (CMO) principle of current surgical microscopes. These systems are well established and optically mature. Furthermore, the CMO principle allows a more compact design and a lower calibration effort than cameras with separate optics. Behind the CMO, four pupils separate the four channels, which are recorded by one camera each. The designed system captures an area of approximately 28 mm × 28 mm with four cameras, allowing images of six different stereo perspectives to be processed. In order to verify the setup, it is modelled in silico. It can be used in further studies to test algorithms for 3D reconstruction from up to four perspectives and to provide information about the impact of additionally recorded perspectives on the enhancement of a reconstruction.

  16. Feasibility Study of Utilization of Action Camera, GoPro Hero 4, Google Glass, and Panasonic HX-A100 in Spine Surgery.

    Science.gov (United States)

    Lee, Chang Kyu; Kim, Youngjun; Lee, Nam; Kim, Byeongwoo; Kim, Doyoung; Yi, Seong

    2017-02-15

    A study of the feasibility of commercially available action cameras in recording video of spine surgery. Recent innovations in wearable action cameras with high-definition video recording enable surgeons to use a camera during the operation with ease and without high costs. The purpose of this study is to compare the feasibility, safety, and efficacy of commercially available action cameras in recording video of spine surgery. There are early reports of medical professionals using Google Glass throughout the hospital, the Panasonic HX-A100 action camera, and GoPro. This study is the first report for spine surgery. Three commercially available cameras were tested: GoPro Hero 4 Silver, Google Glass, and the Panasonic HX-A100 action camera. A typical spine surgery was selected for video recording: posterior lumbar laminectomy and fusion. The three cameras were used by one surgeon and video was recorded throughout the operation. The comparison was made from the perspectives of human factors, specifications, and video quality. The most convenient and lightweight device to wear and hold throughout the long operation time was Google Glass. Regarding image quality, all devices except Google Glass supported HD format, and GoPro offers a unique 2.7K or 4K resolution; the quality of video resolution was best in GoPro. Regarding field of view, GoPro can adjust the point of interest and field of view according to the surgery; the narrow FOV option in GoPro was best for recording video clips to share. Google Glass has potential through the use of application programs. Connectivity such as Wi-Fi and Bluetooth enables video streaming to an audience, but only Google Glass has a two-way communication feature built into the device. Action cameras have the potential to improve patient safety, operator comfort, and procedure efficiency in the field of spinal surgery, and to broadcast a surgery, with further development of the devices and associated programs in the future.

  17. A directional fast neutron detector using scintillating fibers and an intensified CCD camera system

    International Nuclear Information System (INIS)

    Holslin, Daniel; Armstrong, A.W.; Hagan, William; Shreve, David; Smith, Scott

    1994-01-01

    We have been developing and testing a scintillating fiber detector (SFD) for use as a fast neutron sensor that can discriminate against neutrons entering at angles non-parallel to the fiber axis ("directionality"). The detector/converter component is a fiber bundle constructed of plastic scintillating fibers, each measuring 10 cm long and either 0.3 mm or 0.5 mm in diameter. Extensive Monte Carlo simulations were made to optimize the bundle response to a range of fast neutron energies and to intense fluxes of high-energy gamma rays. The bundle is coupled to a set of gamma-ray-insensitive electro-optic intensifiers whose output is viewed by a CCD camera directly coupled to the intensifiers. Two types of CCD cameras were utilized: 1) a standard, interline RS-170 camera with electronic shuttering and 2) a high-speed (up to 850 frames/s) field-transfer camera. Measurements of the neutron detection efficiency and directionality were made using 14 MeV neutrons, and the gamma-ray response was measured using intense fluxes from radioisotopic sources (up to 20 R/h). Recently, the detector was constructed and tested using a large 10 cm by 10 cm square fiber bundle coupled to a 10 cm diameter GEN I intensifier tube. We present a description of the various detector systems and report the results of experimental tests. (orig.)

  18. Game of thrown bombs in 3D: using high speed cameras and photogrammetry techniques to reconstruct bomb trajectories at Stromboli (Italy)

    Science.gov (United States)

    Gaudin, D.; Taddeucci, J.; Scarlato, P.; Del Bello, E.; Houghton, B. F.; Orr, T. R.; Andronico, D.; Kueppers, U.

    2015-12-01

    Large juvenile bombs and lithic clasts, produced and ejected during explosive volcanic eruptions, follow ballistic trajectories. Of particular interest are: 1) the determination of ejection velocity and launch angle, which give insights into shallow conduit conditions and geometry; 2) particle trajectories, with an eye on trajectory evolution caused by collisions between bombs, as well as the interaction between bombs and ash/gas plumes; and 3) the computation of the final emplacement of bomb-sized clasts, which is important for hazard assessment and risk management. Ground-based imagery from a single camera only allows the reconstruction of bomb trajectories in a plane perpendicular to the line of sight, which may lead to underestimation of bomb velocities and does not allow the directionality of the ejections to be studied. To overcome this limitation, we adapted photogrammetry techniques to reconstruct 3D bomb trajectories from two or three synchronized high-speed video cameras. In particular, we modified existing algorithms to consider the errors that may arise from the very high velocity of the particles and the impossibility of measuring tie points close to the scene. Our method was tested during two field campaigns at Stromboli. In 2014, two high-speed cameras with a 500 Hz frame rate and a ~2 cm resolution were set up ~350 m from the crater, 10° apart and synchronized. The experiment was repeated with similar parameters in 2015, but using three high-speed cameras in order to significantly reduce uncertainties and allow their estimation. Trajectory analyses for tens of bombs at various times allowed for the identification of shifts in the mean directivity and dispersal angle of the jets during the explosions. These time evolutions are also visible on the permanent video-camera monitoring system, demonstrating the applicability of our method to all kinds of explosive volcanoes.
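
    The core geometric step of such a multi-camera reconstruction is intersecting the viewing rays of the same clast seen by two synchronized cameras; a minimal sketch is given below, assuming each camera's pose and intrinsics have already been calibrated so that a pixel maps to a 3D ray. The published method additionally handles the error sources mentioned in the abstract (high particle velocity, lack of nearby tie points).

```python
import numpy as np

def triangulate_midpoint(c1, d1, c2, d2):
    """Locate a clast as the midpoint of the shortest segment between two
    viewing rays. c1, c2: camera centers (3-vectors); d1, d2: unit ray
    directions obtained from the calibrated pixel positions of the same clast.
    Returns the estimated 3D position and the ray miss-distance."""
    c1, d1, c2, d2 = (np.asarray(v, float) for v in (c1, d1, c2, d2))
    w0 = c1 - c2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b                 # approaches 0 for near-parallel rays
    s = (b * e - c * d) / denom           # parameter along ray 1
    t = (a * e - b * d) / denom           # parameter along ray 2
    p1, p2 = c1 + s * d1, c2 + t * d2
    return 0.5 * (p1 + p2), float(np.linalg.norm(p1 - p2))
```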

  19. Mixel camera--a new push-broom camera concept for high spatial resolution keystone-free hyperspectral imaging.

    Science.gov (United States)

    Høye, Gudrun; Fridman, Andrei

    2013-05-06

    Current high-resolution push-broom hyperspectral cameras introduce keystone errors to the captured data. Efforts to correct these errors in hardware severely limit the optical design, in particular with respect to light throughput and spatial resolution, while at the same time the residual keystone often remains large. The mixel camera solves this problem by combining a hardware component--an array of light mixing chambers--with a mathematical method that restores the hyperspectral data to its keystone-free form, based on the data that was recorded onto the sensor with large keystone. Virtual Camera software, developed specifically for this purpose, was used to compare the performance of the mixel camera to traditional cameras that correct keystone in hardware. The mixel camera can collect at least four times more light than most current high-resolution hyperspectral cameras, and simulations have shown that the mixel camera will be photon-noise limited--even in bright light--with a significantly improved signal-to-noise ratio compared to traditional cameras. A prototype has been built and is being tested.

  20. Optimization design of periscope type 3X zoom lens design for a five megapixel cellphone camera

    Science.gov (United States)

    Sun, Wen-Shing; Tien, Chuen-Lin; Pan, Jui-Wen; Chao, Yu-Hao; Chu, Pu-Yi

    2016-11-01

    This paper presents a periscope-type 3X zoom lens design for a five-megapixel cellphone camera. The optical system uses a right-angle prism in front of the zoom lenses to fold the optical path by 90°, resulting in a zoom lens length of 6 mm. The zoom lenses can thus be embedded in a mobile phone with a thickness of 6 mm. The zoom lenses have three groups with six elements. The half field of view varies from 30° to 10.89°, the effective focal length is adjusted from 3.142 mm to 9.426 mm, and the F-number changes from 2.8 to 5.13.

  1. VUV testing of science cameras at MSFC: QE measurement of the CLASP flight cameras

    Science.gov (United States)

    Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, B.; Beabout, D.; Stewart, M.

    2015-08-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras were built and tested for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint MSFC, National Astronomical Observatory of Japan (NAOJ), Instituto de Astrofisica de Canarias (IAC) and Institut D'Astrophysique Spatiale (IAS) sounding rocket mission. The CLASP camera design includes a frame-transfer e2v CCD57-10 512 × 512 detector, dual channel analog readout and an internally mounted cold block. At the flight CCD temperature of -20 C, the CLASP cameras exceeded the low-noise performance requirements (less than or equal to 25 e- read noise and greater than or equal to 10 e-/sec/pix dark current). We also discuss the vacuum facility outfitted for testing of UV, EUV and soft X-ray science cameras at MSFC.

  2. Neutron cameras for ITER

    International Nuclear Information System (INIS)

    Johnson, L.C.; Barnes, C.W.; Batistoni, P.

    1998-01-01

    Neutron cameras with horizontal and vertical views have been designed for ITER, based on systems used on JET and TFTR. The cameras consist of fan-shaped arrays of collimated flight tubes, with suitably chosen detectors situated outside the biological shield. The sight lines view the ITER plasma through slots in the shield blanket and penetrate the vacuum vessel, cryostat, and biological shield through stainless steel windows. This paper analyzes the expected performance of several neutron camera arrangements for ITER. In addition to the reference designs, the authors examine proposed compact cameras, in which neutron fluxes are inferred from ¹⁶N decay gammas in dedicated flowing water loops, and conventional cameras with fewer sight lines and more limited fields of view than in the reference designs. It is shown that the spatial sampling provided by the reference designs is sufficient to satisfy target measurement requirements and that some reduction in field of view may be permissible. The accuracy of measurements with ¹⁶N-based compact cameras is not yet established, and they fail to satisfy requirements for parameter range and time resolution by large margins.

  3. Automatic inference of geometric camera parameters and intercamera topology in uncalibrated disjoint surveillance cameras

    NARCIS (Netherlands)

    Hollander, R.J.M. den; Bouma, H.; Baan, J.; Eendebak, P.T.; Rest, J.H.C. van

    2015-01-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many

  4. Convolutional Neural Network-Based Human Detection in Nighttime Images Using Visible Light Camera Sensors.

    Science.gov (United States)

    Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung

    2017-05-08

    Because intelligent surveillance systems have recently undergone rapid growth, research on accurately detecting humans in videos captured at a long distance is growing in importance. The existing research using visible light cameras has mainly focused on methods of human detection for daytime hours when there is outside light, but human detection during nighttime hours when there is no outside light is difficult. Thus, methods that employ additional near-infrared (NIR) illuminators and NIR cameras or thermal cameras have been used. However, in the case of NIR illuminators, there are limitations in terms of the illumination angle and distance. There are also difficulties because the illuminator power must be adaptively adjusted depending on whether the object is close or far away. In the case of thermal cameras, their cost is still high, which makes it difficult to install and use them in a variety of places. Because of this, research has been conducted on nighttime human detection using visible light cameras, but this has focused on objects at a short distance in an indoor environment or the use of video-based methods to capture multiple images and process them, which causes problems related to the increase in the processing time. To resolve these problems, this paper presents a method that uses a single image captured at night on a visible light camera to detect humans in a variety of environments based on a convolutional neural network. Experimental results using a self-constructed Dongguk night-time human detection database (DNHD-DB1) and two open databases (Korea advanced institute of science and technology (KAIST) and computer vision center (CVC) databases), as well as high-accuracy human detection in a variety of environments, show that the method has excellent performance compared to existing methods.

  5. Convolutional Neural Network-Based Human Detection in Nighttime Images Using Visible Light Camera Sensors

    Directory of Open Access Journals (Sweden)

    Jong Hyun Kim

    2017-05-01

    Because intelligent surveillance systems have recently undergone rapid growth, research on accurately detecting humans in videos captured at a long distance is growing in importance. The existing research using visible light cameras has mainly focused on methods of human detection for daytime hours when there is outside light, but human detection during nighttime hours when there is no outside light is difficult. Thus, methods that employ additional near-infrared (NIR) illuminators and NIR cameras or thermal cameras have been used. However, in the case of NIR illuminators, there are limitations in terms of the illumination angle and distance. There are also difficulties because the illuminator power must be adaptively adjusted depending on whether the object is close or far away. In the case of thermal cameras, their cost is still high, which makes it difficult to install and use them in a variety of places. Because of this, research has been conducted on nighttime human detection using visible light cameras, but this has focused on objects at a short distance in an indoor environment or the use of video-based methods to capture multiple images and process them, which causes problems related to the increase in the processing time. To resolve these problems, this paper presents a method that uses a single image captured at night on a visible light camera to detect humans in a variety of environments based on a convolutional neural network. Experimental results using a self-constructed Dongguk night-time human detection database (DNHD-DB1) and two open databases (Korea advanced institute of science and technology (KAIST) and computer vision center (CVC) databases), as well as high-accuracy human detection in a variety of environments, show that the method has excellent performance compared to existing methods.

  6. Motion tracking in narrow spaces: a structured light approach

    DEFF Research Database (Denmark)

    Olesen, Oline Vinter; Paulsen, Rasmus; Højgaard, Liselotte

    2010-01-01

    We present a novel tracking system for patient head motion inside 3D medical scanners. Currently, the system is targeted at the Siemens High Resolution Research Tomograph (HRRT) PET scanner. Partial face surfaces are reconstructed using a miniaturized structured light system. The reconstructed 3D point clouds are matched to a reference surface using a robust iterative closest point algorithm. A main challenge is the narrow geometry, requiring a compact structured light system and an oblique angle of observation. The system is validated using a mannequin head mounted on a rotary stage. We compare the system to a standard optical motion tracker based on a rigid tracking tool. Our system achieves an angular RMSE of 0.11 degrees, demonstrating its relevance for motion compensated 3D scan image reconstructions as well as its competitiveness against the standard optical system with an RMSE of 0.08 degrees.
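
    A minimal sketch of the surface-matching step named in the abstract: plain point-to-point ICP with a closed-form (Kabsch/SVD) pose update. The robust variant used in the paper would additionally down-weight or trim outlier correspondences; the code below is only the textbook version.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping points P onto Q
    (Kabsch/SVD solution); P and Q are (N, 3) arrays of corresponding points."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, cq - R @ cp

def icp(source, reference, iterations=30):
    """Align a reconstructed partial face surface (source, (N, 3)) to the
    reference surface ((M, 3)) and return the accumulated pose (R, t)."""
    tree = cKDTree(reference)
    R, t = np.eye(3), np.zeros(3)
    src = source.copy()
    for _ in range(iterations):
        _, idx = tree.query(src)               # closest reference point per source point
        R_i, t_i = best_rigid_transform(src, reference[idx])
        src = src @ R_i.T + t_i
        R, t = R_i @ R, R_i @ t + t_i          # accumulate incremental pose
    return R, t
```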

  7. Evaluation of mobile phone camera benchmarking using objective camera speed and image quality metrics

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2014-11-01

    When a mobile phone camera is tested and benchmarked, the significance of image quality metrics is widely acknowledged. There are also existing methods to evaluate camera speed. However, the speed or rapidity metrics of the mobile phone's camera system have not been used together with the quality metrics, even though camera speed has become an increasingly important camera performance feature. There are several tasks in this work. First, the most important image quality and speed-related metrics of a mobile phone's camera system are collected from standards and papers, and novel speed metrics are identified. Second, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are made against the application programming interfaces of the different operating systems. Finally, the results are evaluated and conclusions are made. The paper defines a solution for combining different image quality and speed metrics into a single benchmarking score. A proposal for the combined benchmarking metric is evaluated using measurements of 25 mobile phone cameras on the market. The paper is a continuation of previous benchmarking work, expanded with visual noise measurement and updates for the latest mobile phone versions.
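
    The abstract does not spell out how the quality and speed metrics are merged into the single score, so the snippet below only illustrates one common recipe (min-max normalisation per metric followed by a weighted sum); the metric names, weights and example values are hypothetical.

```python
def combined_score(metrics, weights, higher_is_better):
    """Combine heterogeneous quality and speed metrics into one score per camera.
    metrics          : dict  name -> list of raw values, one per camera
    weights          : dict  name -> relative weight (summing to 1)
    higher_is_better : dict  name -> True if larger raw values are better
    """
    n_cams = len(next(iter(metrics.values())))
    scores = [0.0] * n_cams
    for name, values in metrics.items():
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0
        for i, v in enumerate(values):
            norm = (v - lo) / span
            if not higher_is_better[name]:
                norm = 1.0 - norm            # e.g. shot-to-shot time: smaller is better
            scores[i] += weights[name] * norm
    return scores

# Hypothetical example: two cameras, one quality metric and one speed metric.
print(combined_score({"snr_db": [38.0, 42.0], "shot_to_shot_s": [1.2, 0.8]},
                     {"snr_db": 0.6, "shot_to_shot_s": 0.4},
                     {"snr_db": True, "shot_to_shot_s": False}))
```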

  8. A single-layer wide-angle negative-index metamaterial at visible frequencies.

    Science.gov (United States)

    Burgos, Stanley P; de Waele, Rene; Polman, Albert; Atwater, Harry A

    2010-05-01

    Metamaterials are materials with artificial electromagnetic properties defined by their sub-wavelength structure rather than their chemical composition. Negative-index materials (NIMs) are a special class of metamaterials characterized by an effective negative index that gives rise to such unusual wave behaviour as backwards phase propagation and negative refraction. These extraordinary properties lead to many interesting functions such as sub-diffraction imaging and invisibility cloaking. So far, NIMs have been realized through layering of resonant structures, such as split-ring resonators, and have been demonstrated at microwave to infrared frequencies over a narrow range of angles-of-incidence and polarization. However, resonant-element NIM designs suffer from the limitations of not being scalable to operate at visible frequencies because of intrinsic fabrication limitations, require multiple functional layers to achieve strong scattering and have refractive indices that are highly dependent on angle of incidence and polarization. Here we report a metamaterial composed of a single layer of coupled plasmonic coaxial waveguides that exhibits an effective refractive index of -2 in the blue spectral region with a figure-of-merit larger than 8. The resulting NIM refractive index is insensitive to both polarization and angle-of-incidence over a +/-50 degree angular range, yielding a wide-angle NIM at visible frequencies.

  9. Improving Situational Awareness in camera surveillance by combining top-view maps with camera images

    NARCIS (Netherlands)

    Kooi, F.L.; Zeeders, R.

    2009-01-01

    The goal of the experiment described is to improve today's camera surveillance in public spaces. Three designs with the camera images combined on a top-view map were compared to each other and to the current situation in camera surveillance. The goal was to test which design makes spatial

  10. The effect of electrode vertex angle on automatic tungsten-inert-gas welds for stainless steel 304L plates

    International Nuclear Information System (INIS)

    Maarek, V.; Sharir, Y.; Stern, A.

    1980-03-01

    The effect of electrode vertex angle on penetration depth and weld bead width, in automatic tungsten-inert-gas (TIG) dcsp bead-on-plate welding with different currents, has been studied for stainless steel 304L plates 1.5 mm and 8 mm thick. It has been found that for thin plates, wider and deeper welds are obtained when using sharper electrodes while, for thick plates, narrower and deeper welds are produced when blunt electrodes (vertex angle 180 deg) are used. An explanation of the results, based on a literature survey, is included.

  11. Production of a table of diffusion of light at small angles

    International Nuclear Information System (INIS)

    Desert, Sylvain

    2001-01-01

    This thesis reports the development of an optical table for the analysis, in absolute units, of the light diffused by samples in air within an angle range from 1 to 25 degrees, using a 16-bit CCD camera. In this installation, a sample is located in a parallelepiped vessel where it is illuminated by a laser beam, and the power of this laser is controlled by means of a polarizer system. A lens is placed behind the sample, and the sensor (a CCD camera) behind its focal point. After some generalities about light diffusion (van de Hulst criterion, Rayleigh diffusion, Mie theory), the author presents the different components of the experimental set-up and reports its calibration and the measurement of its performance (linearity, dynamics and detectability, angular range and resolution). He describes how a diffusion measurement is performed: experimental protocol, data processing, experimental limitations. He reports the application to light diffusion by latexes. [fr]

  12. Angle-resolved reflection spectroscopy of high-quality PMMA opal crystal

    Science.gov (United States)

    Nemtsev, Ivan V.; Tambasov, Igor A.; Ivanenko, Alexander A.; Zyryanov, Victor Ya.

    2018-02-01

    PMMA opal crystal was prepared by a simple hybrid method, which includes sedimentation, meniscus formation and evaporation. We investigated three surfaces of this crystal by angle-resolved reflection spectroscopy and SEM. The angle-resolved reflection measurements were carried out in the 400-1100 nm range. We have identified a high-quality ordered surface region of the crystal. A narrow particle size distribution on the surface has been revealed. The average particle diameter obtained with SEM was nearly 361 nm. The most interesting result was that the reflectivity of the surface reached up to 98% at normal light incidence. Using a fit of the dependence of the maximum-reflectivity wavelength on angle based on the Bragg-Snell law, the wavelength of maximum reflectivity at 0°, the particle diameter and the fill factor have been determined. For the best surface, the maximum-reflectivity wavelength at a 0° angle was estimated to be 869 nm. The particle diameter and fill factor were calculated as 372 nm and 0.8715, respectively. The diameter obtained by fitting is in excellent agreement with the particle diameter obtained with SEM. The reflectivity maximum is expected to increase significantly when the fill factor is increased. We believe that our simple approach to manufacturing PMMA opal crystals will significantly advance the fabrication of high-quality photonic crystal templates and thin films.
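
    For reference, the Bragg-Snell fit mentioned above is commonly written for the (111) planes of a face-centred-cubic opal as follows (the authors' exact parametrisation is not quoted in the abstract):

```latex
\lambda_{\max}(\theta) = 2\, d_{111}\sqrt{n_{\mathrm{eff}}^{2} - \sin^{2}\theta},
\qquad
d_{111} = \sqrt{\tfrac{2}{3}}\, D,
\qquad
n_{\mathrm{eff}}^{2} = f\, n_{\mathrm{PMMA}}^{2} + (1-f)\, n_{\mathrm{air}}^{2},
```

    where θ is the angle of incidence, D the sphere diameter and f the fill factor; fitting λ_max(θ) over the measured angles then yields estimates of D and f of the kind quoted above (372 nm and 0.8715).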

  13. VUV Testing of Science Cameras at MSFC: QE Measurement of the CLASP Flight Cameras

    Science.gov (United States)

    Champey, Patrick R.; Kobayashi, Ken; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, B.; Beabout, D.; Stewart, M.

    2015-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras were built and tested for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The CLASP camera design includes a frame-transfer e2v CCD57-10 512x512 detector, dual channel analog readout electronics and an internally mounted cold block. At the flight operating temperature of -20 C, the CLASP cameras achieved the low-noise performance requirements (less than or equal to 25 e- read noise and greater than or equal to 10 e-/sec/pix dark current), in addition to maintaining a stable gain of approximately equal to 2.0 e-/DN. The e2v CCD57-10 detectors were coated with Lumogen-E to improve quantum efficiency (QE) at the Lyman-alpha wavelength. A vacuum ultra-violet (VUV) monochromator and a NIST calibrated photodiode were employed to measure the QE of each camera. Four flight-like cameras were tested in a high-vacuum chamber, which was configured to operate several tests intended to verify the QE, gain, read noise, dark current and residual non-linearity of the CCD. We present and discuss the QE measurements performed on the CLASP cameras. We also discuss the high-vacuum system outfitted for testing of UV and EUV science cameras at MSFC.

  14. Advanced CCD camera developments

    Energy Technology Data Exchange (ETDEWEB)

    Condor, A. [Lawrence Livermore National Lab., CA (United States)]

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, describing briefly the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, with a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  15. Performance of Very Small Robotic Fish Equipped with CMOS Camera

    Directory of Open Access Journals (Sweden)

    Yang Zhao

    2015-10-01

    Underwater robots are often used to investigate marine animals. Ideally, such robots should be in the shape of fish so that they can easily go unnoticed by aquatic animals. In addition, lacking a screw propeller, a robotic fish would be less likely to become entangled in algae and other plants. However, although such robots have been developed, their swimming speed is significantly lower than that of real fish. Since a robotic fish would be required to follow actual fish in order to survey them, it is necessary to improve the performance of the propulsion system. In the present study, a small robotic fish (SAPPA) was manufactured and its propulsive performance was evaluated. SAPPA was developed to swim in bodies of freshwater such as rivers, and was equipped with a small CMOS camera with a wide-angle lens in order to photograph live fish. The maximum swimming speed of the robot was determined to be 111 mm/s, and its turning radius was 125 mm. Its power consumption was as low as 1.82 W. During trials, SAPPA succeeded in recognizing a goldfish and capturing an image of it using its CMOS camera.

  16. A modified captive bubble method for determining advancing and receding contact angles

    International Nuclear Information System (INIS)

    Xue, Jian; Shi, Pan; Zhu, Lin; Ding, Jianfu; Chen, Qingmin; Wang, Qingjun

    2014-01-01

    Graphical abstract: - Highlights: • A modified captive bubble method for determining advancing and receding contact angles is proposed. • We have added a pressure chamber with a pressure control system to the original experimental set-up. • The modified method overcomes the bubble deviation of the traditional captive bubble method. • The modified captive bubble method allows a smaller error in the test. - Abstract: In this work, a modification of the captive bubble method is proposed for testing the advancing and receding contact angle. The modification is made by adding a pressure chamber with a pressure control system to the original experimental system, which consists of an optical angle meter equipped with a high-speed CCD camera, a temperature control system and a computer. A series of samples with highly hydrophilic, hydrophilic, hydrophobic and superhydrophobic surfaces were prepared. The advancing and receding contact angles of the samples with highly hydrophilic, hydrophilic, and hydrophobic surfaces measured by the new method were comparable to the results obtained by the traditional sessile drop method. It is shown that this method overcomes the limitation of the traditional captive bubble method and allows a smaller error in the test. However, due to the nature of the captive bubble technique, this method is only suitable for surfaces with advancing or receding contact angles below 130°.

  17. A modified captive bubble method for determining advancing and receding contact angles

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Jian; Shi, Pan; Zhu, Lin [Key Laboratory of High Performance Polymer Materials and Technology (Nanjing University), Ministry of Education, Nanjing 210093 (China); Ding, Jianfu [Security and Disruptive Technologies, National Research Council Canada, 1200 Montreal Road, Ottawa, K1A 0R6, Ontario (Canada); Chen, Qingmin [Key Laboratory of High Performance Polymer Materials and Technology (Nanjing University), Ministry of Education, Nanjing 210093 (China); Wang, Qingjun, E-mail: njuwqj@nju.edu.cn [Key Laboratory of High Performance Polymer Materials and Technology (Nanjing University), Ministry of Education, Nanjing 210093 (China)

    2014-03-01

    Graphical abstract: - Highlights: • A modified captive bubble method for determining advancing and receding contact angles is proposed. • We have added a pressure chamber with a pressure control system to the original experimental set-up. • The modified method overcomes the bubble deviation of the traditional captive bubble method. • The modified captive bubble method allows a smaller error in the test. - Abstract: In this work, a modification of the captive bubble method is proposed for testing the advancing and receding contact angle. The modification is made by adding a pressure chamber with a pressure control system to the original experimental system, which consists of an optical angle meter equipped with a high-speed CCD camera, a temperature control system and a computer. A series of samples with highly hydrophilic, hydrophilic, hydrophobic and superhydrophobic surfaces were prepared. The advancing and receding contact angles of the samples with highly hydrophilic, hydrophilic, and hydrophobic surfaces measured by the new method were comparable to the results obtained by the traditional sessile drop method. It is shown that this method overcomes the limitation of the traditional captive bubble method and allows a smaller error in the test. However, due to the nature of the captive bubble technique, this method is only suitable for surfaces with advancing or receding contact angles below 130°.

  18. Statistical meandering wake model and its application to yaw-angle optimisation of wind farms

    DEFF Research Database (Denmark)

    Thøgersen, Emil; Tranberg, Bo; Herp, Jürgen

    2017-01-01

    deterministic models to a statistical meandering wake model (SMWM), where a random directional deflection is assigned to a narrow wake in such a way that on average it resembles a broad Jensen wake. In a second step, the model is further generalised to wind-farm level, where the deflections of the multiple wakes are treated as independently and identically distributed random variables. When carefully calibrated to the Nysted wind farm, the ensemble average of the statistical model produces the same wind-direction dependence of the power efficiency as obtained from the standard Jensen model. Upon using the JWM to perform a yaw-angle optimisation of wind-farm power output, we find an optimisation gain of 6.7% for the Nysted wind farm when compared to zero yaw angles and averaged over all wind directions. When applying the obtained JWM-based optimised yaw angles to the SMWM, the ensemble-averaged gain
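
    For background, both models build on the Jensen top-hat wake; a short sketch of the deficit formula and of the meandering idea (ensemble-averaging a narrower wake over random directional deflections) is given below. The thrust coefficient, decay constants and deflection spread are illustrative values, not the calibration to Nysted used in the paper.

```python
import numpy as np

def jensen_deficit(x, r, ct=0.8, k=0.05, r0=40.0):
    """Fractional wind-speed deficit of a single top-hat Jensen wake at
    downstream distance x and lateral offset r from the wake centreline.
    ct: thrust coefficient, k: wake decay constant, r0: rotor radius (m)."""
    rw = r0 + k * x                                   # linearly expanding wake radius
    deficit = (1.0 - np.sqrt(1.0 - ct)) * (r0 / rw) ** 2
    return np.where(np.abs(r) <= rw, deficit, 0.0)

def meandering_deficit(x, r, n_samples=2000, sigma_deg=4.0, k_narrow=0.01, seed=0):
    """Ensemble-averaged deficit of a narrow wake whose direction is randomly
    deflected; each sample shifts the wake centre laterally by x*tan(delta)."""
    rng = np.random.default_rng(seed)
    deltas = np.radians(rng.normal(0.0, sigma_deg, n_samples))
    samples = [jensen_deficit(x, r - x * np.tan(d), k=k_narrow) for d in deltas]
    return np.mean(samples, axis=0)
```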

  19. Solid state video cameras

    CERN Document Server

    Cristol, Y

    2013-01-01

    Solid State Video Cameras reviews the state of the art in the field of solid-state television cameras as compiled from the patent literature. Organized into 10 chapters, the book begins with the basic array types of solid-state imagers and appropriate read-out circuits and methods. Documents relating to improvement of picture quality, such as spurious signal suppression, uniformity correction, or resolution enhancement, are also cited. The last part considers solid-state color cameras.

  20. A study on projection angles for an optimal image of PNS water's view on children

    International Nuclear Information System (INIS)

    Son, Sang Hyuk; Song, Young Geun; Kim, Sung Kyu; Hong, Sang Woo; Kim, Je Bong

    2007-01-01

    This study aims to calculate the proper angle for an optimal PNS Water's view image in children by comparing and analyzing the PNS Water's projection angles of children and adults at each age. The study randomly selected 50 patients who visited the medical center from January to May 2005, and examined the incidence path of the central ray by taking PNS Water's and skull trans-lateral views in the Water's filming position while attaching lead ball markers on the orbit, EAM, and acanthion of the patient's skull. We then calculated the incidence angle (angle A) of the line connecting the OML and the petrous ridge to the inferior margin of the maxilla on each patient's skull image, following the incidence path of the central ray. Finally, we analyzed the two graphs by age, obtaining the patients' ideal images in the PNS Water's filming position with a digital camera and calculating the angle (angle B) between the OML and the IP (image plate). The angle between the OML and IP is about 43° in 4-year-old children, which is higher than 37°; as age increases the angle decreases, reaching 37° at around 30 years of age. This is similar to the pattern of maxillary growth. Better-quality Water's view images can therefore be obtained for children if the projection angle is adjusted to account for maxillary growth at each age.

  1. Multiple Sensor Camera for Enhanced Video Capturing

    Science.gov (United States)

    Nagahara, Hajime; Kanki, Yoshinori; Iwai, Yoshio; Yachida, Masahiko

    Camera resolution has been drastically improved in response to the current demand for high-quality digital images. For example, a digital still camera has several megapixels. Although a video camera has a higher frame rate, its resolution is lower than that of a still camera. Thus, high resolution is incompatible with the high frame rate of ordinary cameras on the market. It is difficult to solve this problem with a single sensor, since it comes from the physical limitation of the pixel transfer rate. In this paper, we propose a multi-sensor camera for capturing resolution- and frame-rate-enhanced video. A common multi-CCD camera, such as a 3CCD color camera, uses identical CCDs to capture different spectral information. Our approach is to use sensors of different spatio-temporal resolution in a single camera cabinet to capture higher-resolution and higher-frame-rate information separately. We built a prototype camera which can capture high-resolution (2588×1958 pixels, 3.75 fps) and high-frame-rate (500×500, 90 fps) videos. We also propose a calibration method for the camera. As one application of the camera, we demonstrate an enhanced video (2128×1952 pixels, 90 fps) generated from the captured videos, showing the utility of the camera.

  2. Peripheral laser iridoplasty opens angle in plateau iris by thinning the cross-sectional tissues

    Directory of Open Access Journals (Sweden)

    Liu J

    2013-09-01

    Ji Liu,1,2 Tania Lamba,1 David A Belyea1 1Department of Ophthalmology, The George Washington University, Washington DC, USA; 2Yale Eye Center, Yale University, New Haven, CT, USA Abstract: Plateau iris syndrome has been described as persistent angle narrowing or occlusion with intraocular pressure elevation after peripheral iridotomy due to the abnormal plateau iris configuration. Argon laser peripheral iridoplasty (ALPI) is an effective adjunct procedure to treat plateau iris syndrome. Classic theory suggests that the laser causes contraction of the far peripheral iris stroma, "pulls" the iris away from the angle, and relieves the iris-angle apposition. We report a case of plateau iris syndrome that was successfully treated with ALPI. Spectral domain optical coherence tomography confirmed that the angle was open at areas with laser treatment but remained appositionally closed at untreated areas. Further analysis suggested significant cross-sectional thinning of the iris at laser-treated areas in comparison with untreated areas. The findings indicate that ALPI opens the angle not only by contracting the iris stroma, but also by thinning the iris tissue at the crowded angle. This is consistent with the ALPI technique of aiming at the iris as far peripherally as possible. This case also suggests that spectral domain optical coherence tomography is a useful adjunct imaging tool to gonioscopy in assessing the angle condition. Keywords: plateau iris, optical coherence tomography, argon laser peripheral iridoplasty, angle-closure glaucoma

  3. Using DSLR cameras in digital holography

    Science.gov (United States)

    Hincapié-Zuluaga, Diego; Herrera-Ramírez, Jorge; García-Sucerquia, Jorge

    2017-08-01

    In Digital Holography (DH), the size of the two-dimensional image sensor used to record the digital hologram plays a key role in the performance of this imaging technique; the larger the camera sensor, the better the quality of the final reconstructed image. Scientific cameras with large formats are offered on the market, but their cost and availability limit their use as a first option when implementing DH. Nowadays, DSLR cameras provide an easy-access alternative that is worthwhile to explore. DSLR cameras are a widely available commercial option that, in comparison with traditional scientific cameras, offer a much lower cost per effective pixel over a large sensing area. However, in DSLR cameras, with their RGB pixel distribution, the sampling of information differs from the sampling in the monochrome cameras usually employed in DH. This fact has implications for their performance. In this work, we discuss why DSLR cameras are not extensively used for DH, taking into account the problem of object replication reported by different authors. Simulations of DH using monochromatic and DSLR cameras are presented, and a theoretical derivation of the replication problem using Fourier theory is also shown. Experimental results of a DH implementation using a DSLR camera show the replication problem.

  4. Imaging performance of a multiwire proportional-chamber positron camera

    International Nuclear Information System (INIS)

    Perez-Mendez, V.; Del Guerra, A.; Nelson, W.R.; Tam, K.C.

    1982-08-01

    A new, fully three-dimensional Positron Camera design is presented, made of six MultiWire Proportional Chamber modules arranged to form the lateral surface of a hexagonal prism. A true coincidence rate of 56,000 c/s is expected, with an equal accidental rate, for a 400 μCi activity uniformly distributed in an approx. 3 l water phantom. A detailed Monte Carlo program has been used to investigate the dependence of the spatial resolution on the geometrical and physical parameters. A spatial resolution of 4.8 mm FWHM has been obtained for a ¹⁸F point-like source in a 10 cm radius water phantom. The main properties of the limited-angle reconstruction algorithms are described in relation to the proposed detector geometry.

  5. Selecting a digital camera for telemedicine.

    Science.gov (United States)

    Patricoski, Chris; Ferguson, A Stewart

    2009-06-01

    The digital camera is an essential component of store-and-forward telemedicine (electronic consultation). There are numerous makes and models of digital cameras on the market, and selecting a suitable consumer-grade camera can be complicated. Evaluation of digital cameras includes investigating the features and analyzing image quality. Important features include the camera settings, ease of use, macro capabilities, method of image transfer, and power recharging. Consideration needs to be given to image quality, especially as it relates to color (skin tones) and detail. It is important to know the level of the photographer and the intended application. The goal is to match the characteristics of the camera with the telemedicine program requirements. In the end, selecting a digital camera is a combination of qualitative (subjective) and quantitative (objective) analysis. For the telemedicine program in Alaska in 2008, the camera evaluation and decision process resulted in a specific selection based on the criteria developed for our environment.

  6. Jihadism, Narrow and Wide

    DEFF Research Database (Denmark)

    Sedgwick, Mark

    2015-01-01

    The term “jihadism” is popular, but difficult. It has narrow senses, which are generally valuable, and wide senses, which may be misleading. This article looks at the derivation and use of “jihadism” and of related terms, at definitions provided by a number of leading scholars, and at media usage. It distinguishes two main groups of scholarly definitions, some careful and narrow, and some appearing to match loose media usage. However, it shows that even these scholarly definitions actually make important distinctions between jihadism and associated political and theological ideology. The article closes...

  7. Temporal Imaging CeBr3 Compton Camera: A New Concept for Nuclear Decommissioning and Nuclear Waste Management

    Science.gov (United States)

    Iltis, A.; Snoussi, H.; Magalhaes, L. Rodrigues de; Hmissi, M. Z.; Zafiarifety, C. Tata; Tadonkeng, G. Zeufack; Morel, C.

    2018-01-01

    During nuclear decommissioning or waste management operations, a camera that could image the contamination field and identify and quantify the contaminants would represent great progress. Compton cameras have been proposed, but their limited efficiency for high-energy gamma rays and their cost have severely limited their application. Our objective is to promote a Compton camera for the energy range (200 keV - 2 MeV) that uses fast scintillating crystals and a new concept for locating scintillation events: Temporal Imaging. Temporal Imaging uses monolithic plates of fast scintillators and measures the photon time-of-arrival distribution in order to locate each gamma ray with high precision in space (X, Y, Z), time (T) and energy (E). This provides a native estimation of the depth of interaction (Z) of every detected gamma ray. It also allows a time correction for the propagation time of scintillation photons inside the crystal, resulting in excellent time resolution. The high temporal resolution of the system makes it possible to veto background quite efficiently by using a narrow time coincidence; the sensitivity of the system is better than 1 nSv/h in a 60 s acquisition with a 22Na source. The project TEMPORAL is funded by ANDRA/PAI under grant No. RTSCNADAA160019.
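
    Independently of the temporal-imaging readout described above, any Compton camera reconstructs each event as a cone whose opening angle follows from standard Compton kinematics; a minimal sketch (variable names are illustrative) is:

```python
import numpy as np

MEC2 = 511.0  # electron rest energy, keV

def compton_cone_angle(e_scatter, e_absorb):
    """Opening half-angle (rad) of the Compton cone for one event, given the
    energy deposited in the scatterer (e_scatter) and in the absorber
    (e_absorb), both in keV, assuming full absorption of the scattered photon."""
    e0 = e_scatter + e_absorb                       # incident photon energy
    cos_theta = 1.0 - MEC2 * (1.0 / e_absorb - 1.0 / e0)
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Example: a 662 keV photon depositing 200 keV in the scatterer.
print(np.degrees(compton_cone_angle(200.0, 462.0)))
```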

  8. Fall speed measurement and high-resolution multi-angle photography of hydrometeors in free fall

    Directory of Open Access Journals (Sweden)

    T. J. Garrett

    2012-11-01

    We describe here a new instrument for imaging hydrometeors in free fall. The Multi-Angle Snowflake Camera (MASC) captures high-resolution photographs of hydrometeors from three angles while simultaneously measuring their fall speed. Based on the stereoscopic photographs captured over two months of continuous measurements at a high-altitude location within the Wasatch Front in Utah, we derive statistics for fall speed, hydrometeor size, shape, orientation and aspect ratio. From a selection of the photographed hydrometeors, an illustration is provided of how the instrument might be used for making improved microwave scattering calculations. Complex, aggregated snowflake shapes appear to be more strongly forward scattering, at the expense of reduced back-scatter, than heavily rimed graupel particles of similar size.

  9. Location accuracy evaluation of lightning location systems using natural lightning flashes recorded by a network of high-speed cameras

    Science.gov (United States)

    Alves, J.; Saraiva, A. C. V.; Campos, L. Z. D. S.; Pinto, O., Jr.; Antunes, L.

    2014-12-01

    This work presents a method for evaluating the location accuracy of the Lightning Location Systems (LLS) in operation in southeastern Brazil, using natural cloud-to-ground (CG) lightning flashes. This is done through a network of multiple high-speed cameras (the RAMMER network) installed in the Paraiba Valley region - SP - Brazil. The RAMMER network (Automated Multi-camera Network for Monitoring and Study of Lightning) is composed of four high-speed cameras operating at 2,500 frames per second. Three stationary black-and-white (B&W) cameras were situated in the cities of São José dos Campos and Caçapava. A fourth color camera was mobile (installed in a car), but operated in a fixed location during the observation period, within the city of São José dos Campos. The average distance among cameras was 13 kilometers. Each RAMMER sensor position was determined so that the network could observe the same lightning flash from different angles, and all recorded videos were GPS (Global Positioning System) time stamped, allowing comparisons of events between the cameras and the LLS. Each RAMMER sensor is basically composed of a computer, a Phantom high-speed camera (version 9.1) and a GPS unit. The lightning cases analyzed in the present work were observed by at least two cameras; their positions were visually triangulated and the results compared with the BrasilDAT network, during the summer seasons of 2011/2012 and 2012/2013. The visual triangulation method is presented in detail. The calibration procedure showed an accuracy of 9 meters between the accurate GPS position of the triangulated object and the result of the visual triangulation method. Lightning return stroke positions, estimated with the visual triangulation method, were compared with LLS locations. Differences between solutions were not greater than 1.8 km.
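
    The visual triangulation method itself is only summarized above; as an illustration of the underlying geometry, the sketch below intersects two horizontal bearing rays from two camera sites under a flat-terrain approximation. The function name, coordinate frame and example numbers are hypothetical and are not taken from the RAMMER paper.

```python
import numpy as np

def triangulate_2d(p1, az1, p2, az2):
    """Intersect two horizontal bearing rays (a minimal, flat-terrain sketch).

    p1, p2  : (x, y) camera positions in a local metric frame, e.g. UTM [m]
    az1, az2: azimuths of the observed lightning channel from each camera [rad]
    Returns the (x, y) intersection point, or None if the rays are near-parallel.
    """
    d1 = np.array([np.sin(az1), np.cos(az1)])   # unit direction of ray 1
    d2 = np.array([np.sin(az2), np.cos(az2)])   # unit direction of ray 2
    # Solve p1 + t1*d1 = p2 + t2*d2 for (t1, t2).
    A = np.column_stack((d1, -d2))
    if abs(np.linalg.det(A)) < 1e-9:             # rays almost parallel: no fix
        return None
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return tuple(np.asarray(p1, float) + t[0] * d1)

# Example: two cameras ~13 km apart observing the same return stroke.
print(triangulate_2d((0.0, 0.0), np.radians(45.0), (13000.0, 0.0), np.radians(315.0)))
```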

  10. Portable retinal imaging for eye disease screening using a consumer-grade digital camera

    Science.gov (United States)

    Barriga, Simon; Larichev, Andrey; Zamora, Gilberto; Soliz, Peter

    2012-03-01

    The development of affordable means to image the retina is an important step toward the implementation of eye disease screening programs. In this paper we present the i-RxCam, a low-cost, hand-held, retinal camera for widespread applications such as tele-retinal screening for eye diseases like diabetic retinopathy (DR), glaucoma, and age-related ocular diseases. Existing portable retinal imagers do not meet the requirements of a low-cost camera with sufficient technical capabilities (field of view, image quality, portability, battery power, and ease-of-use) to be distributed widely to low volume clinics, such as the offices of single primary care physicians serving rural communities. The i-RxCam uses a Nikon D3100 digital camera body. The camera has a CMOS sensor with 14.8 million pixels. We use a 50mm focal lens that gives a retinal field of view of 45 degrees. The internal autofocus can compensate for about 2D (diopters) of focusing error. The light source is an LED produced by Philips with a linear emitting area that is transformed using a light pipe to the optimal shape at the eye pupil, an annulus. To eliminate corneal reflex we use a polarization technique in which the light passes through a nano-wire polarizer plate. This is a novel type of polarizer featuring high polarization separation (contrast ratio of more than 1000) and very large acceptance angle (>45 degrees). The i-RxCam approach will yield a significantly more economical retinal imaging device that would allow mass screening of the at-risk population.

  11. Polarization sensitive camera for the in vitro diagnostic and monitoring of dental erosion

    Science.gov (United States)

    Bossen, Anke; Rakhmatullina, Ekaterina; Lussi, Adrian; Meier, Christoph

    Due to the frequent consumption of acidic food and beverages, the prevalence of dental erosion is increasing worldwide. In the initial erosion stage, the hard dental tissue is softened by acidic demineralization. As erosion progresses, gradual tissue wear occurs, resulting in thinning of the enamel. Complete loss of the enamel tissue can be observed in severe clinical cases. It is therefore essential to provide a diagnostic tool for accurate detection and monitoring of dental erosion already at early stages. In this manuscript, we present the development of a polarization sensitive imaging camera for the visualization and quantification of dental erosion. The system consists of two CMOS cameras mounted on two sides of a polarizing beamsplitter. A horizontally linearly polarized light source is positioned orthogonal to the camera to ensure incidence illumination and detection angles of 45°. The specular reflected light from the enamel surface is collected with an objective lens mounted on the beamsplitter and divided into horizontal (H) and vertical (V) components on each associated camera. Images of non-eroded and eroded enamel surfaces at different erosion degrees were recorded and assessed with diagnostic software. The software was designed to generate and display two types of images: the distribution of the reflection intensity (V) and a polarization ratio (H-V)/(H+V) throughout the analyzed tissue area. The measurement and visualization of these two optical parameters, i.e. specular reflection intensity and polarization ratio, allowed detection and quantification of enamel erosion at early stages in vitro.
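
    As a simple illustration of the two image types described above, the following sketch computes the specular reflection intensity map and the polarization ratio (H-V)/(H+V) from a pair of co-registered H and V frames. The array names and the small epsilon guard are assumptions, not part of the published system.

```python
import numpy as np

def erosion_maps(h_img, v_img, eps=1e-6):
    """Compute the two diagnostic images described above from co-registered
    horizontal (H) and vertical (V) polarization frames (float arrays).

    Returns (intensity_map, ratio_map) where ratio = (H - V) / (H + V).
    """
    h = np.asarray(h_img, dtype=float)
    v = np.asarray(v_img, dtype=float)
    intensity_map = v                          # specular reflection intensity (V)
    ratio_map = (h - v) / (h + v + eps)        # polarization ratio; eps avoids /0
    return intensity_map, ratio_map

# Example with synthetic 4x4 frames:
h = np.full((4, 4), 200.0)
v = np.full((4, 4), 50.0)
print(erosion_maps(h, v)[1])                   # uniform ratio of ~0.6
```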

  12. Device Physics of Narrow Gap Semiconductors

    CERN Document Server

    Chu, Junhao

    2010-01-01

    Narrow gap semiconductors obey the general rules of semiconductor science, but often exhibit extreme features of these rules because of the same properties that produce their narrow gaps. Consequently these materials provide sensitive tests of theory, and the opportunity for the design of innovative devices. Narrow gap semiconductors are the most important materials for the preparation of advanced modern infrared systems. Device Physics of Narrow Gap Semiconductors offers descriptions of the materials science and device physics of these unique materials. Topics covered include impurities and defects, recombination mechanisms, surface and interface properties, and the properties of low dimensional systems for infrared applications. This book will help readers to understand not only the semiconductor physics and materials science, but also how they relate to advanced opto-electronic devices. The last chapter applies the understanding of device physics to photoconductive detectors, photovoltaic infrared detector...

  13. Submerged arc narrow gap welding of the steel DIN 20MnMoNi55

    International Nuclear Information System (INIS)

    Moraes, M.M.

    1987-01-01

    The methodology for submerged arc narrow gap welding of the high-thickness rolled steel DIN 20MnMoNi55 was developed, using DIN S3NiMo1 wires of 4 mm and 5 mm diameter and DIN 8B435 flux. For this purpose, submerged arc narrow gap welded joints of 50 mm and 120 mm thickness were made, aiming at optimization of the welding parameters and at studying the influence of welding voltage, wire diameter and wire-to-groove-face distance on the operational performance and on the welded joint quality, especially on the ISO-V impact toughness. These welded joints were checked by non-destructive, mechanical and metallographic tests. Results were compared with those obtained from a 120 mm thick submerged arc conventional gap welded joint, using the same base metal and consumables (5 mm wire). The analysis of the results shows that increasing the wire-to-groove-face distance and the welding voltage increases the hardness and the ISO-V impact toughness of the weld metal. It shows that the reduction of the gap angle is the main cause for obtaining a heat-affected zone free from coarse grains; the reduction of the welding voltage, the increase of the wire-to-groove-face distance, and the grounding optimization also contribute to that. It was also concluded that the quality and the execution complexity level of a narrow gap welded joint are identical to those of a conventional gap welded joint. (author) [pt

  14. Optically trapped atomic resonant devices for narrow linewidth spectral imaging

    Science.gov (United States)

    Qian, Lipeng

    This thesis focuses on the development of atomic resonant devices for spectroscopic applications. The primary emphasis is on the imaging properties of optically thick atomic resonant fluorescent filters and their applications. In addition, this thesis presents a new concept for producing very narrow linewidth light from an atomic vapor lamp pumped by a nanosecond pulse system. This research was motivated by applications in missile warning systems, and presents an innovative approach to a wide-angle, ultra-narrow-linewidth imaging filter using a potassium vapor cell. The approach is to image onto, and collect the fluorescent photons emitted from, the surface of an optically thick potassium vapor cell, generating a 2 GHz pass-band imaging filter. This linewidth is narrow enough to fall within a Fraunhofer dark zone in the solar spectrum, thus making the detection solar blind. Experiments were conducted to measure the absorption line shape of the potassium resonant filter, the quantum efficiency of the fluorescent behavior, and the resolution of the fluorescent image. Fluorescent images with different spatial frequency components are analyzed using a discrete Fourier transform, and the imaging capability of the fluorescent filter is described by its Modulation Transfer Function. For the detection of radiation that is spectrally broader than the linewidth of the potassium imaging filter, the fluorescent image is blurred by diffuse fluorescence from slightly off-resonant photons. To correct this, an ultra-thin potassium imaging filter was developed and characterized. The imaging property of the ultra-thin potassium imaging cell was tested with a potassium-seeded flame, yielding an image resolution of ~20 lines per mm. The physics behind the atomic resonant fluorescent filter is radiation trapping. The diffusion process of the resonant photons trapped in the atomic vapor is theoretically described in this thesis. A Monte Carlo method is used to simulate the

  15. Automatic locking radioisotope camera lock

    International Nuclear Information System (INIS)

    Rosauer, P.J.

    1978-01-01

    The lock of the present invention secures the isotope source in a stored shielded condition in the camera until a positive effort has been made to open the lock and take the source outside of the camera and prevents disconnection of the source pigtail unless the source is locked in a shielded condition in the camera. It also gives a visual indication of the locked or possible exposed condition of the isotope source and prevents the source pigtail from being completely pushed out of the camera, even when the lock is released. (author)

  16. The investigation of ship maneuvering with hydrodynamic effects between ships in curved narrow channel

    Directory of Open Access Journals (Sweden)

    Chun-Ki Lee

    2016-01-01

    The hydrodynamic interaction between two large vessels cannot be neglected when the vessels are close to each other in restricted waterways such as a harbor or a narrow channel. This paper is mainly concerned with ship maneuvering motion based on the hydrodynamic interaction effects between two large vessels moving near each other in a curved narrow channel. In this research, the characteristic features of the hydrodynamic interaction forces between two large vessels are described and illustrated, and the effects of the velocity ratio and the spacing between the two vessels are summarized and discussed. Also, the Inchon outer harbor area through the PALMI island channel in Korea was selected, and a ship maneuvering simulation was carried out to propose an appropriate safe speed and distance between two ships, required to avoid sea accidents in confined waters. This investigation indicates the following results. Under the condition of SP12≤0.5L, there may be a dangerous tendency toward grounding or collision due to the combined effect of the interaction between ships and external forces. Also, considering the interaction and wind effect as parameters, an overtaken and an overtaking vessel in a narrow channel can navigate while keeping their original courses under the following conditions: the lateral separation between the two ships is kept at about 0.6 times the ship length, with a maximum rudder angle range of 15 degrees. On the other hand, two ships overtaking in a curved narrow channel such as the Inchon outer harbor in Korea should navigate under the following conditions: SP12 is kept at about 1.0 times the ship length and the wind velocity should not be stronger than 10 m/s.

  17. Ladder beam and camera video recording system for evaluating forelimb and hindlimb deficits after sensorimotor cortex injury in rats.

    Science.gov (United States)

    Soblosky, J S; Colgin, L L; Chorney-Lane, D; Davidson, J F; Carey, M E

    1997-12-30

    Hindlimb and forelimb deficits in rats caused by sensorimotor cortex lesions are frequently tested using the narrow flat beam (hindlimb), the narrow pegged beam (hindlimb and forelimb) or the grid-walking (forelimb) tests. Although these are excellent tests, the narrow flat beam generates non-parametric data, so the use of more powerful parametric statistical analyses is precluded. All these tests can be difficult to score if the rat is moving rapidly. Foot misplacements, especially on the grid-walking test, are indicative of an ongoing deficit, but have not been reliably and accurately described and quantified previously. In this paper we present an easy-to-construct and easy-to-use horizontal ladder beam with a camera system on rails which can be used to evaluate both hindlimb and forelimb deficits in a single test. By slow-motion videotape playback we were able to quantify and demonstrate foot misplacements which persist beyond the recovery period usually seen using more conventional measures (i.e. footslips and footfaults). This convenient system provides a rapid and reliable method for recording and evaluating rat performance on any type of beam and may be useful for measuring sensorimotor recovery following brain injury.

  18. Acceptance/Operational Test Report for Tank 241-AN-104 camera and camera purge control system

    International Nuclear Information System (INIS)

    Castleberry, J.L.

    1995-11-01

    This Acceptance/Operational Test Procedure (ATP/OTP) will document the satisfactory operation of the camera purge panel, purge control panel, color camera system and associated control components destined for installation. The final acceptance of the complete system will be performed in the field. The purge panel and purge control panel will be tested for their safety interlock, which shuts down the camera and pan-and-tilt unit inside the tank vapor space during loss of purge pressure, and to verify that the correct purge volume exchanges are performed as required by NFPA 496. This procedure is separated into seven sections. This Acceptance/Operational Test Report documents the successful acceptance and operability testing of the 241-AN-104 camera system and camera purge control system.

  19. Fast centroid algorithm for determining the surface plasmon resonance angle using the fixed-boundary method

    International Nuclear Information System (INIS)

    Zhan, Shuyue; Wang, Xiaoping; Liu, Yuling

    2011-01-01

    To simplify the algorithm for determining the surface plasmon resonance (SPR) angle for special applications and development trends, a fast method for determining the SPR angle, called the fixed-boundary centroid algorithm, is proposed. Two experiments were conducted to compare three centroid algorithms in terms of operation time, sensitivity to shot noise, signal-to-noise ratio (SNR), resolution, and measurement range. Although the measurement range of this method is narrower, the other performance indices are all better than those of the other two centroid methods. This method offers outstanding performance: high speed, good conformity, low error and a high SNR and resolution. It thus has the potential to be widely adopted.
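
    The fixed-boundary variant itself is not detailed in this record; the sketch below shows a generic centroid estimate of the SPR reflectance dip computed over a fixed pixel window, which is the kind of calculation the abstract refers to. All names, the baseline choice and the synthetic data are illustrative assumptions.

```python
import numpy as np

def centroid_angle(intensity, angles, lo, hi, baseline=None):
    """Estimate the SPR resonance angle as the centroid of the reflectance dip
    computed over a fixed pixel window [lo, hi) (a simplified sketch).

    intensity: 1-D reflectance profile sampled by the detector array
    angles   : incidence angle assigned to each pixel (same length)
    baseline : constant subtracted before the centroid; defaults to the window max
    """
    y = np.asarray(intensity, float)[lo:hi]
    x = np.asarray(angles, float)[lo:hi]
    if baseline is None:
        baseline = y.max()
    w = np.clip(baseline - y, 0.0, None)       # dip depth acts as the weight
    return float(np.sum(w * x) / np.sum(w))

# Synthetic dip centred at 44.3 degrees:
ang = np.linspace(42.0, 47.0, 500)
refl = 1.0 - 0.8 * np.exp(-((ang - 44.3) / 0.3) ** 2)
print(centroid_angle(refl, ang, 100, 400))     # ~44.3
```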

  20. Development of a visible framing camera diagnostic for the study of current initiation in z-pinch plasmas

    International Nuclear Information System (INIS)

    Muron, D.J.; Hurst, M.J.; Derzon, M.S.

    1996-01-01

    The authors assembled and tested a visible framing camera system to take 5 ns FWHM images of the early-time emission from a z-pinch plasma. This diagnostic was used in conjunction with a visible streak camera, allowing early-time emission measurements to diagnose current initiation. Individual frames from gated image intensifiers were proximity coupled to charge injection device (CID) cameras and read out at video rate and 8-bit resolution. A mirror was used to view the pinch from a 90-degree angle. The authors observed the destruction of the mirror surface, due to the high surface heating, and the subsequent reduction in the signal reflected from the mirror. This initial test of the equipment highlighted problems with the measurement. Images were obtained that showed early-time ejecta and non-uniform emission from the target; the non-uniformity is believed to be due to either spatially varying current density or heating of the foam. The results and suggestions for improvement are discussed in the text.

  1. Development of high-speed video cameras

    Science.gov (United States)

    Etoh, Takeharu G.; Takehara, Kohsei; Okinaka, Tomoo; Takano, Yasuhide; Ruckelshausen, Arno; Poggemann, Dirk

    2001-04-01

    Presented in this paper is an outline of the R and D activities on high-speed video cameras, which have been carried out at Kinki University for more than ten years and are currently proceeding as an international cooperative project with the University of Applied Sciences Osnabruck and other organizations. Extensive market research has been done, (1) on users' requirements for high-speed multi-framing and video cameras, by questionnaires and hearings, and (2) on the current availability of cameras of this sort, by searches of journals and websites. Both support the necessity of developing a high-speed video camera of more than 1 million fps. A video camera of 4,500 fps with parallel readout was developed in 1991. A video camera with triple sensors was developed in 1996; the sensor is the same one as developed for the previous camera. The frame rate is 50 million fps for triple-framing and 4,500 fps for triple-light-wave framing, including color image capturing. The idea of a video camera of 1 million fps with an ISIS, an In-situ Storage Image Sensor, was first proposed in 1993 and has been continuously improved. A test sensor was developed in early 2000 and successfully captured images at 62,500 fps. Currently, the design of a prototype ISIS is ongoing and, hopefully, it will be fabricated in the near future. Epoch-making cameras in the history of the development of high-speed video cameras by other groups are also briefly reviewed.

  2. A novel single-step procedure for the calibration of the mounting parameters of a multi-camera terrestrial mobile mapping system

    Science.gov (United States)

    Habib, A.; Kersting, P.; Bang, K.; Rau, J.

    2011-12-01

    Mobile Mapping Systems (MMS) can be defined as moving platforms which integrate a set of imaging sensors and a position and orientation system (POS) for the collection of geo-spatial information. In order to fully exploit the potential accuracy of such systems and guarantee accurate multi-sensor integration, a careful system calibration must be carried out. System calibration involves individual sensor calibration as well as the estimation of the inter-sensor geometric relationship. This paper tackles a specific component of the system calibration process of a multi-camera MMS - the estimation of the relative orientation parameters among the cameras, i.e., the inter-camera geometric relationship (lever-arm offsets and boresight angles among the cameras). For that purpose, a novel single-step procedure, which is easy to implement and not computationally intensive, is introduced. The proposed method is implemented in such a way that it can also be used for the estimation of the mounting parameters between the cameras and the IMU body frame, in the case of directly georeferenced systems. The performance of the proposed method is evaluated through experimental results using simulated data. A comparative analysis between the proposed single-step procedure and the two-step procedure, which makes use of the traditional bundle adjustment, is presented.
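
    The single-step estimation procedure is not reproduced here; as a sketch of the quantities being calibrated, the snippet below derives the lever-arm offset and boresight rotation of one camera relative to another from their poses expressed in a common mapping frame. Function and variable names are hypothetical.

```python
import numpy as np

def relative_orientation(R_a, t_a, R_b, t_b):
    """Relative orientation of camera B w.r.t. camera A from their poses in a
    common mapping frame (a sketch of the quantity being calibrated).

    R_a, R_b: 3x3 rotation matrices (camera-to-mapping frame)
    t_a, t_b: camera positions in the mapping frame
    Returns (lever_arm, boresight) expressed in camera A's frame.
    """
    R_a, R_b = np.asarray(R_a, float), np.asarray(R_b, float)
    lever_arm = R_a.T @ (np.asarray(t_b, float) - np.asarray(t_a, float))
    boresight = R_a.T @ R_b            # rotation taking B's axes into A's frame
    return lever_arm, boresight

# Example: camera B sits 0.5 m to the side of A and is yawed by 10 degrees.
yaw = np.radians(10.0)
R_b = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                [np.sin(yaw),  np.cos(yaw), 0.0],
                [0.0,          0.0,         1.0]])
print(relative_orientation(np.eye(3), [0, 0, 0], R_b, [0.5, 0, 0]))
```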

  3. Power estimation of martial arts movement using 3D motion capture camera

    Science.gov (United States)

    Azraai, Nur Zaidi; Awang Soh, Ahmad Afiq Sabqi; Mat Jafri, Mohd Zubir

    2017-06-01

    Motion capture (MOCAP) cameras have been widely used in many areas such as biomechanics, physiology, animation and the arts. This project approaches the problem through classical mechanics and extends MOCAP applications to sports. Martial arts is one of the sports that uses more than one part of the human body; for this project, the martial art `Silat' was chosen because of its wide practice in Malaysia. Most researchers use a force plate, which can only measure the impact force; here the aim is also to observe the kinematics of the movement. Two performers were selected, one experienced in `Silat' practice and one with no experience at all, so that the energy and force generated by the performers can be compared. Each performer executed punches with the same posture, and two types of punching moves were selected. Before the measurements started, a calibration was performed, using a T-stick fitted with markers, so that the software knows the area covered by the cameras and measurement errors are reduced. A punching bag of mass 60 kg was hung on an iron bar as a target; it is used to determine the impact force of a performer's punch. The punching bag was also fitted with optical markers so that its movement after impact can be observed. Eight cameras were used, two on each wall at different angles, in a rectangular room of 270 ft2, with the cameras covering approximately 50 ft2. Only a small area was covered so that less noise is detected and the measurement is more accurate. Markers were attached along the whole hand and forearm to be observed and measured. The passive markers used in this project reflect the infrared light generated by the cameras; the reflected infrared reaches the camera sensors so that the marker positions can be detected and shown in the software. The use of many cameras is to increase the
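
    As a rough illustration of how kinematic quantities can be derived from the captured marker trajectories, the sketch below estimates peak fist speed and the corresponding kinetic energy by finite differencing. The capture rate and the effective striking mass are placeholder assumptions, not values from this study.

```python
import numpy as np

def punch_kinematics(positions, fps=250.0, effective_mass=3.0):
    """Estimate peak fist speed and kinetic energy from a marker trajectory.

    positions     : (N, 3) array of marker coordinates in metres, one row per frame
    fps           : capture rate of the MOCAP system (assumed value)
    effective_mass: effective striking mass in kg (assumed, not from the paper)
    """
    p = np.asarray(positions, dtype=float)
    v = np.diff(p, axis=0) * fps               # finite-difference velocity [m/s]
    speed = np.linalg.norm(v, axis=1)
    peak = speed.max()
    energy = 0.5 * effective_mass * peak ** 2  # kinetic energy at peak speed [J]
    return peak, energy

# Example: a 0.2 s straight punch accelerating to ~7 m/s.
t = np.linspace(0.0, 0.2, 51)
traj = np.column_stack((0.5 * 35.0 * t ** 2, np.zeros_like(t), np.zeros_like(t)))
print(punch_kinematics(traj, fps=250.0))
```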

  4. Video camera use at nuclear power plants

    International Nuclear Information System (INIS)

    Estabrook, M.L.; Langan, M.O.; Owen, D.E.

    1990-08-01

    A survey of US nuclear power plants was conducted to evaluate video camera use in plant operations, and determine equipment used and the benefits realized. Basic closed circuit television camera (CCTV) systems are described and video camera operation principles are reviewed. Plant approaches for implementing video camera use are discussed, as are equipment selection issues such as setting task objectives, radiation effects on cameras, and the use of disposable cameras. Specific plant applications are presented and the video equipment used is described. The benefits of video camera use --- mainly reduced radiation exposure and increased productivity --- are discussed and quantified. 15 refs., 6 figs

  5. The development of large-aperture test system of infrared camera and visible CCD camera

    Science.gov (United States)

    Li, Yingwen; Geng, Anbing; Wang, Bo; Wang, Haitao; Wu, Yanying

    2015-10-01

    Dual-band imaging systems combining an infrared camera and a visible CCD camera are widely used in many types of equipment and applications. If such a system is tested using a traditional infrared camera test system and a separate visible CCD test system, two rounds of installation and alignment are needed in the test procedure. The large-aperture test system for infrared cameras and visible CCD cameras uses a common large-aperture reflective collimator, target wheel, frame grabber and computer, which reduces the cost and the time of installation and alignment. A multiple-frame averaging algorithm is used to reduce the influence of random noise. An athermal optical design is adopted to reduce the shift of the collimator focal position when the environmental temperature changes, and the image quality of the wide-field collimator and the test accuracy are also improved. Its performance is the same as that of foreign counterparts and it is much cheaper. It will have a good market.
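
    The multiple-frame averaging step mentioned above can be illustrated with a minimal sketch: averaging N co-registered frames reduces the standard deviation of independent, zero-mean noise by roughly 1/sqrt(N). The array shapes and noise level below are arbitrary.

```python
import numpy as np

def average_frames(frames):
    """Average a stack of co-registered frames; for independent, zero-mean noise
    the residual noise drops roughly as 1/sqrt(N) (sketch of the averaging step)."""
    stack = np.asarray(frames, dtype=float)
    return stack.mean(axis=0)

# Example: 16 noisy frames of a constant target.
rng = np.random.default_rng(0)
truth = np.full((8, 8), 100.0)
frames = truth + rng.normal(0.0, 4.0, size=(16, 8, 8))
avg = average_frames(frames)
print(float(frames[0].std()), float(avg.std()))   # noise reduced by roughly 1/4
```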

  6. Narrow dibaryon resonances

    International Nuclear Information System (INIS)

    Kajdalov, A.B.

    1986-01-01

    Experimental data on np interactions indicating the existence of narrow resonances in the pp system are discussed. Possible theoretical interpretations of these resonances are given. Experimental characteristics of the dibaryon resonances with isospin I=2 are considered.

  7. Experimental and computational approaches to evaluate the environmental mitigation effect in narrow spaces by noble metal chemical addition (NMCA)

    International Nuclear Information System (INIS)

    Shimizu, Ryosuke; Ota, Nobuyuki; Nagase, Makoto; Aizawa, Motohiro; Ishida, Kazushige; Wada, Yoichi

    2014-01-01

    The environmental mitigation effect of NMCA in a narrow space was evaluated by experimental and computational approaches. In the experiment, at 8 MPa and 553 K, a T-tube whose branch line contained a narrow space was prepared, and Zr electrodes were set in the branch line at intervals of 1, 3, 5, 7, 9, 11, 15 and 29 cm from the opening section of the branch line. The electrochemical corrosion potential (ECP) at the tip of the branched narrow space varied in response to the water chemistry in the main line, which was at a right angle to the branch line. Computational fluid dynamics (CFD) analysis reproduced the experimental results. It was also confirmed by CFD analysis that the ingress of water from the main line into the narrow space was accelerated by cavity flow and thermal convection. In a CFD analysis of a thermal sleeve under actual plant conditions, which contains a narrow space, the concentration of dissolved oxygen at the tip of the thermal sleeve reached 250 ppb within 300 s, the same concentration as in the main line. Noble metal deposition on the surface of the thermal sleeve was evaluated with a mass transfer model. Noble metal deposition was largest near the opening section of the branch line and gradually decreased toward the tip section. In light of the consumption of dissolved oxygen in the branch line, the noble metal deposition in the thermal sleeve was sufficient to reduce the ECP. It is expected that NMCA can mitigate the corrosion environment in the thermal sleeve. (author)

  8. Developing a Low-Cost System for 3d Data Acquisition

    Science.gov (United States)

    Kossieris, S.; Kourounioti, O.; Agrafiotis, P.; Georgopoulos, A.

    2017-11-01

    In this paper, a low-cost system is described which aims to facilitate fast and reliable 3D documentation by acquiring the necessary data in outdoor environments for the 3D documentation of façades, especially in the case of very narrow streets. In particular, it provides a viable solution for buildings up to 8-10 m high and streets as narrow as 2 m or even less. In such cases, it is practically impossible or highly time-consuming to acquire images in a conventional way; that practice would lead to a huge number of images and long processing times. The developed system was tested in the narrow streets of a medieval village on the Greek island of Chios. There, in order to bypass the problem of short taking distances, high-definition action cameras were used together with a 360° camera; such cameras are usually provided with very wide-angle lenses, are capable of acquiring high-definition images, are rather cheap and, most importantly, extremely light. Results suggest that the system can perform fast 3D data acquisition adequate for deliverables of high quality.

  9. A new star tracker concept for satellite attitude determination based on a multi-purpose panoramic camera

    Science.gov (United States)

    Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele; Pernechele, Claudio; Dionisio, Cesare

    2017-11-01

    This paper presents an innovative algorithm developed for attitude determination of a space platform. The algorithm exploits images taken from a multi-purpose panoramic camera equipped with hyper-hemispheric lens and used as star tracker. The sensor architecture is also original since state-of-the-art star trackers accurately image as many stars as possible within a narrow- or medium-size field-of-view, while the considered sensor observes an extremely large portion of the celestial sphere but its observation capabilities are limited by the features of the optical system. The proposed original approach combines algorithmic concepts, like template matching and point cloud registration, inherited from the computer vision and robotic research fields, to carry out star identification. The final aim is to provide a robust and reliable initial attitude solution (lost-in-space mode), with a satisfactory accuracy level in view of the multi-purpose functionality of the sensor and considering its limitations in terms of resolution and sensitivity. Performance evaluation is carried out within a simulation environment in which the panoramic camera operation is realistically reproduced, including perturbations in the imaged star pattern. Results show that the presented algorithm is able to estimate attitude with accuracy better than 1° with a success rate around 98% evaluated by densely covering the entire space of the parameters representing the camera pointing in the inertial space.

  10. Total number albedo and average cosine of the polar angle of low-energy photons reflected from water

    Directory of Open Access Journals (Sweden)

    Marković Srpko

    2007-01-01

    The total number albedo and average cosine of the polar angle for water and initial photon energies in the range from 20 keV to 100 keV are presented in this paper. A water shield in the form of a thick, homogeneous plate and perpendicular incidence of the monoenergetic photon beam are assumed. The results were obtained through Monte Carlo simulations of photon reflection by means of the MCNP computer code. Calculated values for the total number albedo were compared with previously published data and good agreement was confirmed. The dependence of the average cosine of the polar angle on energy is studied in detail. It has been found that the total average cosine of the polar angle has values in the narrow interval of 0.66-0.67, approximately corresponding to a reflection angle of 48°, and that it does not depend on the initial photon energy.
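
    The transport calculation itself was done with MCNP and is not reproduced here; the toy sketch below only shows how the two reported quantities (the total number albedo and the average cosine of the polar angle) would be scored from a set of simulated reflected photons. The "reflected" direction cosines are synthetic stand-ins, not physics.

```python
import numpy as np

rng = np.random.default_rng(0)

def score_albedo(n_incident, reflected_cosines):
    """Score the total number albedo and the average cosine of the polar angle
    from the direction cosines of photons that re-emerged from the slab.
    (The transport itself would come from a code such as MCNP; here the list is
    just toy data.)
    """
    reflected_cosines = np.asarray(reflected_cosines, float)
    albedo = reflected_cosines.size / n_incident
    avg_cos = reflected_cosines.mean() if reflected_cosines.size else float("nan")
    return albedo, avg_cos

# Toy data: 10^5 incident photons, ~40% reflected, cosines drawn near 0.66.
n_in = 100_000
cosines = np.clip(rng.normal(0.66, 0.15, size=int(0.4 * n_in)), 0.0, 1.0)
print(score_albedo(n_in, cosines))
```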

  11. Intracranial cerebrospinal fluid spaces imaging using a pulse-triggered three-dimensional turbo spin echo MR sequence with variable flip-angle distribution

    International Nuclear Information System (INIS)

    Hodel, Jerome; Silvera, Jonathan; Bekaert, Olivier; Decq, Philippe; Rahmouni, Alain; Bastuji-Garin, Sylvie; Vignaud, Alexandre; Petit, Eric; Durning, Bruno

    2011-01-01

    To assess the three-dimensional turbo spin echo with variable flip-angle distribution magnetic resonance sequence (SPACE: Sampling Perfection with Application optimised Contrast using different flip-angle Evolution) for the imaging of intracranial cerebrospinal fluid (CSF) spaces. We prospectively investigated 18 healthy volunteers and 25 patients, 20 with communicating hydrocephalus (CH), five with non-communicating hydrocephalus (NCH), using the SPACE sequence at 1.5T. Volume rendering views of both intracranial and ventricular CSF were obtained for all patients and volunteers. The subarachnoid CSF distribution was qualitatively evaluated on volume rendering views using a four-point scale. The CSF volumes within total, ventricular and subarachnoid spaces were calculated as well as the ratio between ventricular and subarachnoid CSF volumes. Three different patterns of subarachnoid CSF distribution were observed. In healthy volunteers we found narrowed CSF spaces within the occipital area. A diffuse narrowing of the subarachnoid CSF spaces was observed in patients with NCH whereas patients with CH exhibited narrowed CSF spaces within the high midline convexity. The ratios between ventricular and subarachnoid CSF volumes were significantly different among the volunteers, patients with CH and patients with NCH. The assessment of CSF spaces volume and distribution may help to characterise hydrocephalus. (orig.)

  12. Intracranial cerebrospinal fluid spaces imaging using a pulse-triggered three-dimensional turbo spin echo MR sequence with variable flip-angle distribution

    Energy Technology Data Exchange (ETDEWEB)

    Hodel, Jerome [Unite Analyse et Restauration du Mouvement, UMR-CNRS, 8005 LBM ParisTech Ensam, Paris (France); University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Neuroradiology, Creteil (France); Hopital Henri Mondor, Creteil (France); Silvera, Jonathan [University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Neuroradiology, Creteil (France); Bekaert, Olivier; Decq, Philippe [Unite Analyse et Restauration du Mouvement, UMR-CNRS, 8005 LBM ParisTech Ensam, Paris (France); University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Neurosurgery, Creteil (France); Rahmouni, Alain [University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Radiology, Creteil (France); Bastuji-Garin, Sylvie [University Paris Est Creteil (UPEC), Creteil (France); Assistance Publique-Hopitaux de Paris, Paris (France); Hopital Henri Mondor, Department of Public Health, Creteil (France); Vignaud, Alexandre [Siemens Healthcare, Saint Denis (France); Petit, Eric; Durning, Bruno [Laboratoire Images Signaux et Systemes Intelligents, UPEC, Creteil (France)

    2011-02-15

    To assess the three-dimensional turbo spin echo with variable flip-angle distribution magnetic resonance sequence (SPACE: Sampling Perfection with Application optimised Contrast using different flip-angle Evolution) for the imaging of intracranial cerebrospinal fluid (CSF) spaces. We prospectively investigated 18 healthy volunteers and 25 patients, 20 with communicating hydrocephalus (CH), five with non-communicating hydrocephalus (NCH), using the SPACE sequence at 1.5T. Volume rendering views of both intracranial and ventricular CSF were obtained for all patients and volunteers. The subarachnoid CSF distribution was qualitatively evaluated on volume rendering views using a four-point scale. The CSF volumes within total, ventricular and subarachnoid spaces were calculated as well as the ratio between ventricular and subarachnoid CSF volumes. Three different patterns of subarachnoid CSF distribution were observed. In healthy volunteers we found narrowed CSF spaces within the occipital area. A diffuse narrowing of the subarachnoid CSF spaces was observed in patients with NCH whereas patients with CH exhibited narrowed CSF spaces within the high midline convexity. The ratios between ventricular and subarachnoid CSF volumes were significantly different among the volunteers, patients with CH and patients with NCH. The assessment of CSF spaces volume and distribution may help to characterise hydrocephalus. (orig.)

  13. Human tracking over camera networks: a review

    Science.gov (United States)

    Hou, Li; Wan, Wanggen; Hwang, Jenq-Neng; Muhammad, Rizwan; Yang, Mingyang; Han, Kang

    2017-12-01

    In recent years, automated human tracking over camera networks is getting essential for video surveillance. The tasks of tracking human over camera networks are not only inherently challenging due to changing human appearance, but also have enormous potentials for a wide range of practical applications, ranging from security surveillance to retail and health care. This review paper surveys the most widely used techniques and recent advances for human tracking over camera networks. Two important functional modules for the human tracking over camera networks are addressed, including human tracking within a camera and human tracking across non-overlapping cameras. The core techniques of human tracking within a camera are discussed based on two aspects, i.e., generative trackers and discriminative trackers. The core techniques of human tracking across non-overlapping cameras are then discussed based on the aspects of human re-identification, camera-link model-based tracking and graph model-based tracking. Our survey aims to address existing problems, challenges, and future research directions based on the analyses of the current progress made toward human tracking techniques over camera networks.

  14. Microprocessor-controlled wide-range streak camera

    Science.gov (United States)

    Lewis, Amy E.; Hollabaugh, Craig

    2006-08-01

    Bechtel Nevada/NSTec recently announced deployment of their fifth generation streak camera. This camera incorporates many advanced features beyond those currently available for streak cameras. The arc-resistant driver includes a trigger lockout mechanism, actively monitors input trigger levels, and incorporates a high-voltage fault interrupter for user safety and tube protection. The camera is completely modular and may deflect over a variable full-sweep time of 15 nanoseconds to 500 microseconds. The camera design is compatible with both large- and small-format commercial tubes from several vendors. The embedded microprocessor offers Ethernet connectivity, and XML [extensible markup language]-based configuration management with non-volatile parameter storage using flash-based storage media. The camera's user interface is platform-independent (Microsoft Windows, Unix, Linux, Macintosh OSX) and is accessible using an AJAX [asynchronous Javascript and XML]-equipped modern browser, such as Internet Explorer 6, Firefox, or Safari. User interface operation requires no installation of client software or browser plug-in technology. Automation software can also access the camera configuration and control using HTTP [hypertext transfer protocol]. The software architecture supports multiple-simultaneous clients, multiple cameras, and multiple module access with a standard browser. The entire user interface can be customized.

  15. Microprocessor-controlled, wide-range streak camera

    International Nuclear Information System (INIS)

    Amy E. Lewis; Craig Hollabaugh

    2006-01-01

    Bechtel Nevada/NSTec recently announced deployment of their fifth generation streak camera. This camera incorporates many advanced features beyond those currently available for streak cameras. The arc-resistant driver includes a trigger lockout mechanism, actively monitors input trigger levels, and incorporates a high-voltage fault interrupter for user safety and tube protection. The camera is completely modular and may deflect over a variable full-sweep time of 15 nanoseconds to 500 microseconds. The camera design is compatible with both large- and small-format commercial tubes from several vendors. The embedded microprocessor offers Ethernet connectivity, and XML [extensible markup language]-based configuration management with non-volatile parameter storage using flash-based storage media. The camera's user interface is platform-independent (Microsoft Windows, Unix, Linux, Macintosh OSX) and is accessible using an AJAX [asynchronous Javascript and XML]-equipped modern browser, such as Internet Explorer 6, Firefox, or Safari. User interface operation requires no installation of client software or browser plug-in technology. Automation software can also access the camera configuration and control using HTTP [hypertext transfer protocol]. The software architecture supports multiple-simultaneous clients, multiple cameras, and multiple module access with a standard browser. The entire user interface can be customized.

  16. Bio-inspired motion detection in an FPGA-based smart camera module

    International Nuclear Information System (INIS)

    Koehler, T; Roechter, F; Moeller, R; Lindemann, J P

    2009-01-01

    Flying insects, despite their relatively coarse vision and tiny nervous system, are capable of carrying out elegant and fast aerial manoeuvres. Studies of the fly visual system have shown that this is accomplished by the integration of signals from a large number of elementary motion detectors (EMDs) in just a few global flow detector cells. We developed an FPGA-based smart camera module with more than 10 000 single EMDs, which is closely modelled after insect motion-detection circuits with respect to overall architecture, resolution and inter-receptor spacing. Input to the EMD array is provided by a CMOS camera with a high frame rate. Designed as an adaptable solution for different engineering applications and as a testbed for biological models, the EMD detector type and parameters such as the EMD time constants, the motion-detection directions and the angle between correlated receptors are reconfigurable online. This allows a flexible and simultaneous detection of complex motion fields such as translation, rotation and looming, such that various tasks, e.g., obstacle avoidance, height/distance control or speed regulation can be performed by the same compact device
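
    As background for the EMD array described above, the sketch below implements a basic correlation-type (Reichardt) elementary motion detector: each input is delayed by a first-order low-pass filter and multiplied with the undelayed signal from the neighbouring receptor, and the two mirror-symmetric products are subtracted. The time constants and the stimulus are illustrative; they are not the parameters of the FPGA module.

```python
import numpy as np

def emd_response(left, right, dt=0.002, tau=0.02):
    """Correlation-type elementary motion detector (Reichardt model, sketch).

    left, right: photoreceptor signals from two neighbouring pixels over time
    dt         : sample interval [s];  tau: time constant of the delay filter [s]
    Returns the opponent EMD output; positive values indicate left-to-right motion.
    """
    alpha = dt / (tau + dt)                     # first-order low-pass as the delay
    def lowpass(x):
        y = np.zeros_like(x, dtype=float)
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y
    return lowpass(left) * right - lowpass(right) * left

# Example: a sinusoidal grating drifting from left to right.
t = np.arange(0.0, 0.5, 0.002)
f, phase = 4.0, np.pi / 4                       # temporal frequency, inter-pixel phase
left = np.sin(2 * np.pi * f * t)
right = np.sin(2 * np.pi * f * t - phase)       # right pixel sees the pattern later
print(emd_response(left, right).mean())         # > 0 for this motion direction
```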

  17. Deep narrow band imagery of the diffuse ISM in M33

    Science.gov (United States)

    Hester, J. Jeff; Kulkarni, Shrinivas R.

    1990-01-01

    Very deep narrow band images were obtained for several fields in the local group spiral galaxy M33 using a wide field reimaging Charge Coupled Device (CCD) camera on the 1.5 m telescope at Palomar Observatory. The reimaging system uses a 306 mm collimator and a 58 mm camera lens to put a 16 arcminute by 16 arcminute field onto a Texas Instruments 800 x 800 pixel CCD at a resolution of 1.2 arcseconds per pixel. The overall system is f/1.65. Images were obtained in the light of H alpha, [S II] λλ6717, 6731, [O III] λ5007, and line-free continuum bands 100 Å wide, centered at 6450 Å and 5100 Å. Assuming a distance of 600 kpc to M33 (Humphreys 1980, Ap. J., 241, 587), this corresponds to a linear scale of 3.5 pc per pixel, and a field size of 2.8 kpc x 2.8 kpc. Researchers discuss the H alpha imagery of a field centered approximately 8 arcminutes NE of the nucleus, including the supergiant HII region complex NGC 604. Two 2000 second H alpha images and two 300 second red continuum images were obtained of two slightly offset fields. The fields were offset to allow for discrimination between real emission and possible artifacts in the images. All images were resampled to align them with one of the H alpha frames. The continuum images were normalized to the line images using the results of aperture photometry on a grid of stars in the field, then the rescaled continuum data were directly subtracted from the line data.
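
    The continuum-subtraction step described at the end of the abstract can be sketched as follows: the continuum frame is scaled by a factor derived from aperture photometry of common field stars and then subtracted from the narrow-band frame. The function names and the synthetic numbers are placeholders.

```python
import numpy as np

def continuum_subtract(line_img, cont_img, line_star_fluxes, cont_star_fluxes):
    """Scale the continuum frame to the narrow-band frame using aperture
    photometry of field stars, then subtract (sketch of the step described above).

    line_img, cont_img                  : co-registered 2-D images
    line_star_fluxes, cont_star_fluxes  : aperture fluxes of the same stars in each frame
    """
    scale = np.median(np.asarray(line_star_fluxes, float) /
                      np.asarray(cont_star_fluxes, float))
    return np.asarray(line_img, float) - scale * np.asarray(cont_img, float)

# Example with three fiducial stars and synthetic 4x4 frames:
line = np.full((4, 4), 120.0)
cont = np.full((4, 4), 200.0)
print(continuum_subtract(line, cont, [30.0, 45.0, 60.0], [50.0, 75.0, 100.0]))
```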

  18. Statistical study of ion pitch-angle distributions

    International Nuclear Information System (INIS)

    Sibeck, D.G.; Mcentire, R.W.; Lui, A.T.Y.; Krimigis, S.M.

    1987-01-01

    Preliminary results of a statistical study of energetic (34-50 keV) ion pitch-angle distributions (PADs) within 9 Re of earth provide evidence for an orderly pattern consistent with both drift-shell splitting and magnetopause shadowing. Normal ion PADs dominate the dayside and inner magnetosphere. Butterfly PADs typically occur in a narrow belt stretching from dusk to dawn through midnight, where they approach within 6 Re of earth. While those ion butterfly PADs that typically occur on closed drift paths are mainly caused by drift-shell splitting, there is also evidence for magnetopause shadowing in observations of more frequent butterfly PAD occurrence in the outer magnetosphere near dawn than dusk. Isotropic and gradient boundary PADs terminate the tailward extent of the butterfly ion PAD belt. 9 references

  19. 24/7 security system: 60-FPS color EMCCD camera with integral human recognition

    Science.gov (United States)

    Vogelsong, T. L.; Boult, T. E.; Gardner, D. W.; Woodworth, R.; Johnson, R. C.; Heflin, B.

    2007-04-01

    An advanced surveillance/security system is being developed for unattended 24/7 image acquisition and automated detection, discrimination, and tracking of humans and vehicles. The low-light video camera incorporates an electron multiplying CCD sensor with a programmable on-chip gain of up to 1000:1, providing effective noise levels of less than 1 electron. The EMCCD camera operates in full color mode under sunlit and moonlit conditions, and monochrome under quarter-moonlight to overcast starlight illumination. Sixty-frame-per-second operation and progressive scanning minimize motion artifacts. The acquired image sequences are processed with FPGA-compatible real-time algorithms, to detect/localize/track targets and reject non-targets due to clutter under a broad range of illumination conditions and viewing angles. The object detectors that are used are trained from actual image data. Detectors have been developed and demonstrated for faces, upright humans, crawling humans, large animals, cars and trucks. Detection and tracking of targets too small for template-based detection is achieved. For face and vehicle targets the results of the detection are passed to secondary processing to extract recognition templates, which are then compared with a database for identification. When combined with pan-tilt-zoom (PTZ) optics, the resulting system provides a reliable wide-area 24/7 surveillance system that avoids the high life-cycle cost of infrared cameras and image intensifiers.

  20. Robust and adaptive band-to-band image transform of UAS miniature multi-lens multispectral camera

    Science.gov (United States)

    Jhan, Jyun-Ping; Rau, Jiann-Yeou; Haala, Norbert

    2018-03-01

    Utilizing miniature multispectral (MS) or hyperspectral (HS) cameras by mounting them on an Unmanned Aerial System (UAS) has the benefits of convenience and flexibility to collect remote sensing imagery for precision agriculture, vegetation monitoring, and environment investigation applications. Most miniature MS cameras adopt a multi-lens structure to record discrete MS bands of visible and invisible information. The differences in lens distortion, mounting positions, and viewing angles among lenses mean that the acquired original MS images have significant band misregistration errors. We have developed a Robust and Adaptive Band-to-Band Image Transform (RABBIT) method for dealing with the band co-registration of various types of miniature multi-lens multispectral cameras (Mini-MSCs) to obtain band co-registered MS imagery for remote sensing applications. The RABBIT utilizes modified projective transformation (MPT) to transfer the multiple image geometry of a multi-lens imaging system to one sensor geometry, and combines this with a robust and adaptive correction (RAC) procedure to correct several systematic errors and to obtain sub-pixel accuracy. This study applies three state-of-the-art Mini-MSCs to evaluate the RABBIT method's performance, specifically the Tetracam Miniature Multiple Camera Array (MiniMCA), Micasense RedEdge, and Parrot Sequoia. Six MS datasets acquired at different target distances, dates, and locations are also applied to prove its reliability and applicability. Results prove that RABBIT is feasible for different types of Mini-MSCs with accurate, robust, and rapid image processing efficiency.
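
    RABBIT's full MPT plus RAC pipeline is not reproduced in this record; the sketch below shows only the core resampling step, warping one spectral band into the reference-band geometry with a projective transform whose 3x3 matrix is assumed to have been estimated beforehand (e.g., from tie points). It relies on OpenCV and uses hypothetical tie-point coordinates.

```python
import numpy as np
import cv2  # OpenCV; assumed to be available

def register_band(band_img, H, ref_shape):
    """Warp one spectral band into the geometry of the reference band using a
    projective transform (the resampling step of a band-to-band co-registration;
    H must already be known).

    band_img : single-band image (2-D array)
    H        : 3x3 homography mapping band pixels to reference-band pixels
    ref_shape: (rows, cols) of the reference band
    """
    rows, cols = ref_shape
    return cv2.warpPerspective(band_img, H, (cols, rows), flags=cv2.INTER_LINEAR)

# Example: estimate H from four hypothetical tie points, then resample one band.
src_pts = np.float32([[10, 10], [500, 12], [495, 480], [8, 470]])
dst_pts = np.float32([[12, 14], [503, 10], [499, 484], [11, 475]])
H, _ = cv2.findHomography(src_pts, dst_pts)
band = np.random.randint(0, 4096, (512, 512), dtype=np.uint16).astype(np.float32)
print(register_band(band, H, (512, 512)).shape)
```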

  1. Towards Adaptive Virtual Camera Control In Computer Games

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2011-01-01

    Automatic camera control aims to define a framework to control virtual camera movements in dynamic and unpredictable virtual environments while ensuring a set of desired visual properties. We investigate the relationship between camera placement and playing behaviour in games and build a user model of the camera behaviour that can be used to control camera movements based on player preferences. For this purpose, we collect eye gaze, camera and game-play data from subjects playing a 3D platform game, we cluster gaze and camera information to identify camera behaviour profiles and we employ … camera control in games is discussed.

  2. Class II Division 1 Angle Malocclusion Treatment Using a Myofunctional Bionator

    Directory of Open Access Journals (Sweden)

    Ragil Irawan

    2014-06-01

    Angle Class II division 1 malocclusion is characterized by the mesiobuccal cusp of the upper first molar occluding in the interdental space between the lower second premolar and first molar, a large overjet, a narrow dental arch and a convex profile. The bionator was first introduced by Balter and is a myofunctional orthodontic appliance used to treat jaw discrepancies. The aim of this case report is to present the progress of an Angle Class II division 1 malocclusion case with jaw discrepancy treated using a myofunctional bionator appliance. A 13-year-old female complained of protruding upper front teeth. The diagnosis was Angle Class II division 1 malocclusion, a Class II skeletal relationship with maxillary protrusion and mandibular retrusion, protrusive upper incisors with palatal bite, posterior crossbite, an overjet of 11 mm, an overbite of 5.25 mm, SNA 84° and SNB 76°. The patient was treated with a myofunctional bionator appliance. After one year of treatment, the overjet was reduced to 6.25 mm and SNB increased to 78°. In conclusion, the myofunctional bionator appliance is effective for treating Angle Class II division 1 malocclusion accompanied by jaw discrepancy.

  3. Spheres of Earth: An Introduction to Making Observations of Earth Using an Earth System's Science Approach. Student Guide

    Science.gov (United States)

    Graff, Paige Valderrama; Baker, Marshalyn (Editor); Graff, Trevor (Editor); Lindgren, Charlie (Editor); Mailhot, Michele (Editor); McCollum, Tim (Editor); Runco, Susan (Editor); Stefanov, William (Editor); Willis, Kim (Editor)

    2010-01-01

    Scientists from the Image Science and Analysis Laboratory (ISAL) at NASA's Johnson Space Center (JSC) work with astronauts onboard the International Space Station (ISS) who take images of Earth. Astronaut photographs, sometimes referred to as Crew Earth Observations, are taken using hand-held digital cameras onboard the ISS. These digital images allow scientists to study our Earth from the unique perspective of space. Astronauts have taken images of Earth since the 1960s. There is a database of over 900,000 astronaut photographs available at http://eol.jsc.nasa.gov . Images are requested by ISAL scientists at JSC and astronauts in space personally frame and acquire them from the Destiny Laboratory or other windows in the ISS. By having astronauts take images, they can specifically frame them according to a given request and need. For example, they can choose to use different lenses to vary the amount of area (field of view) an image will cover. Images can be taken at different times of the day which allows different lighting conditions to bring out or highlight certain features. The viewing angle at which an image is acquired can also be varied to show the same area from different perspectives. Pointing the camera straight down gives you a nadir shot. Pointing the camera at an angle to get a view across an area would be considered an oblique shot. Being able to change these variables makes astronaut photographs a unique and useful data set. Astronaut photographs are taken from the ISS from altitudes of 300 - 400 km (185 to 250 miles). One of the current cameras being used, the Nikon D3X digital camera, can take images using a 50, 100, 250, 400 or 800mm lens. These different lenses allow for a wider or narrower field of view. The higher the focal length (800mm for example) the narrower the field of view (less area will be covered). Higher focal lengths also show greater detail of the area on the surface being imaged. Scientists from the Image Science and Analysis
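
    The relation between focal length and ground coverage described in the text can be made concrete with a small calculation. Assuming a full-frame 36 x 24 mm sensor and a nadir view from the quoted 400 km altitude (both simplifying assumptions), the footprint shrinks in proportion to focal length:

```python
import math

def ground_footprint(focal_mm, altitude_km, sensor_mm=(36.0, 24.0)):
    """Approximate ground coverage of a nadir photograph (flat-Earth sketch).

    focal_mm   : lens focal length [mm]
    altitude_km: spacecraft altitude (the text quotes 300-400 km for the ISS)
    sensor_mm  : sensor dimensions; 36 x 24 mm assumed for a full-frame body
    Returns (width_km, height_km, horizontal field of view in degrees).
    """
    width_km = sensor_mm[0] * altitude_km / focal_mm
    height_km = sensor_mm[1] * altitude_km / focal_mm
    fov_deg = 2.0 * math.degrees(math.atan(sensor_mm[0] / (2.0 * focal_mm)))
    return width_km, height_km, fov_deg

# Footprints for the lenses listed in the text, from a 400 km altitude.
for f in (50, 100, 250, 400, 800):
    print(f, ground_footprint(f, 400.0))
```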

  4. Effect of cutter tip angle on cutting characteristics of acrylic worksheet subjected to punch/die shearing

    Directory of Open Access Journals (Sweden)

    Masami Kojima

    2016-12-01

    This paper describes the effect of tool geometry on the cutting characteristics of a 1.0 mm thick acrylic worksheet subjected to punch/die shearing. A set of side-wedge punches and side-wedge dies with edge angles of 30°, 60° and/or 90° was prepared and used for cutting off the worksheet. A load cell and a CCD camera were installed in the cutting system to investigate the cutting load resistance and the side-view deformation of the worksheet. The experimental results revealed that the cracking pattern at the sheared zone was remarkably affected by the edge angle of the cutting tool. The cracking direction almost coincided with the edge angle for a punch/die edge angle of 30°, while no such matching was observed for punch/die edge angles of 60° and 90°. Using the 30° side-wedge tool, a flat, smooth sheared surface was generated. When combining a punch edge angle of 90° with a die edge angle of 60°, the cracking profile was characterized by both edge angles, for each part (die and punch). By carrying out an elasto-plastic finite element analysis of cutter indentation with several symmetric and asymmetric punch/die edge combinations, the stress distribution and deformation flow at the sheared zone are discussed along with the initiation of surface cracks.

  5. A new slit lamp-based technique for anterior chamber angle estimation.

    Science.gov (United States)

    Gispets, Joan; Cardona, Genís; Tomàs, Núria; Fusté, Cèlia; Binns, Alison; Fortes, Miguel A

    2014-06-01

To design and test a new noninvasive method for anterior chamber angle (ACA) estimation based on the slit lamp that is accessible to all eye-care professionals. A new technique (slit lamp anterior chamber estimation [SLACE]) that aims to overcome some of the limitations of the van Herick procedure was designed. The technique, which only requires a slit lamp, was applied to estimate the ACA of 50 participants (100 eyes) using two different slit lamp models, and results were compared with gonioscopy as the clinical standard. The Spearman nonparametric correlation between ACA values as determined by gonioscopy and SLACE was 0.81 (p …) … gonioscopy (Spaeth classification). The SLACE technique, when compared with gonioscopy, displayed good accuracy in the detection of narrow angles, and it may be useful for eye-care clinicians without access to expensive alternative equipment or those who cannot perform gonioscopy because of legal constraints regarding the use of diagnostic drugs.

  6. Gamma camera system

    International Nuclear Information System (INIS)

    Miller, D.W.; Gerber, M.S.; Schlosser, P.A.; Steidley, J.W.

    1980-01-01

A detailed description is given of a novel gamma camera which is designed to produce images superior to those of conventional cameras used in nuclear medicine. The detector consists of a solid state detector (e.g. germanium) which is formed to have a plurality of discrete components to enable 2-dimensional position identification. Details of the electronic processing circuits are given, and the problems and limitations introduced by noise are discussed in full. (U.K.)

  7. The eye of the camera: effects of security cameras on pro-social behavior

    NARCIS (Netherlands)

    van Rompay, T.J.L.; Vonk, D.J.; Fransen, M.L.

    2009-01-01

    This study addresses the effects of security cameras on prosocial behavior. Results from previous studies indicate that the presence of others can trigger helping behavior, arising from the need for approval of others. Extending these findings, the authors propose that security cameras can likewise

  8. Passive auto-focus for digital still cameras and camera phones: Filter-switching and low-light techniques

    Science.gov (United States)

    Gamadia, Mark Noel

In order to gain valuable market share in the growing consumer digital still camera and camera phone market, camera manufacturers have to continually add and improve features in their latest product offerings. Auto-focus (AF) is one such feature, whose aim is to enable consumers to quickly take sharply focused pictures with little or no manual intervention in adjusting the camera's focus lens. While AF has been a standard feature in digital still and cell-phone cameras, consumers often complain about slow AF performance, which may lead to missed photographic opportunities, rendering valuable moments and events as undesired out-of-focus pictures. This dissertation addresses this critical issue to advance the state of the art in the passive AF method based on digital band-pass filtering. This method is widely used to realize AF in the camera industry: a focus actuator is adjusted via a search algorithm to locate the in-focus position by maximizing a sharpness measure extracted from a particular frequency band of the incoming image of the scene. There are no known systematic methods for automatically deriving parameters such as the digital pass-bands or the search step-size increments used in existing passive AF schemes. Conventional methods require time-consuming experimentation and tuning in order to arrive at a set of parameters which balance AF performance in terms of speed and accuracy, ultimately causing a delay in product time-to-market. This dissertation presents a new framework for determining an optimal set of passive AF parameters, named Filter-Switching AF, providing an automatic approach to achieve superior AF performance in both good and low lighting conditions, based on the following performance measures (metrics): speed (total number of iterations), accuracy (offset from truth), power consumption (total distance moved), and user experience (in-focus position overrun). Performance results using three different prototype cameras
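The dissertation's Filter-Switching parameters are not reproduced in this record, but the underlying passive-AF idea, maximising a band-pass sharpness measure over lens positions with a coarse-then-fine search, can be sketched as follows. The `capture` callback, the Laplacian-variance focus measure and the step sizes are assumptions made for illustration, not the author's actual scheme.

```python
import numpy as np
from scipy import ndimage

def sharpness(image):
    # Simple high-frequency focus measure: variance of the Laplacian of the frame.
    return float(np.var(ndimage.laplace(image.astype(float))))

def two_stage_af(capture, positions, coarse_step=8):
    """capture(pos) is a hypothetical callback returning a frame at lens position pos."""
    coarse_best = max(positions[::coarse_step], key=lambda p: sharpness(capture(p)))
    window = [p for p in positions if abs(p - coarse_best) <= coarse_step]
    return max(window, key=lambda p: sharpness(capture(p)))  # fine search, step of 1

# Toy check: a synthetic "lens" whose frames blur more the further it is from position 37.
rng = np.random.default_rng(0)
scene = rng.random((64, 64))
frames = {p: ndimage.gaussian_filter(scene, sigma=abs(p - 37) / 10 + 0.1) for p in range(100)}
print(two_stage_af(frames.__getitem__, list(range(100))))  # expected to land near 37
```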

  9. Image compensation for camera and lighting variability

    Science.gov (United States)

    Daley, Wayne D.; Britton, Douglas F.

    1996-12-01

With the current trend of integrating machine vision systems in industrial manufacturing and inspection applications comes the issue of camera and illumination stabilization. Unless each application is built around a particular camera and a highly controlled lighting environment, the interchangeability of cameras and fluctuations in lighting become a problem, as each camera usually has a different response. An empirical approach is proposed in which color tile data are acquired using the camera of interest and a mapping to some predetermined reference image is developed using neural networks. A similar analytical approach based on a rough analysis of the imaging systems is also considered for deriving a mapping between cameras. Once a mapping has been determined, all data from one camera are mapped to correspond to the images of the other prior to performing any processing on the data. Instead of writing separate image processing algorithms for the particular image data being received, the image data are adjusted based on each particular camera and lighting situation. All that is required when swapping cameras is the new mapping for the camera being inserted. The image processing algorithms can remain the same, as the input data have been adjusted appropriately. The results of utilizing this technique are presented for an inspection application.
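The paper maps one camera's colour response onto a reference using neural networks; a much simpler stand-in that conveys the idea is a least-squares affine colour mapping fitted to matched colour-tile measurements. The sketch below is that simplified, assumed variant rather than the authors' network, and the tile values are invented.

```python
import numpy as np

def fit_color_mapping(src_rgb, ref_rgb):
    """Least-squares affine map (3x3 matrix plus offset) from one camera's colour-tile
    readings (N, 3) to the corresponding reference readings (N, 3)."""
    X = np.hstack([src_rgb, np.ones((len(src_rgb), 1))])   # append a bias column
    M, *_ = np.linalg.lstsq(X, ref_rgb, rcond=None)        # (4, 3) mapping matrix
    return M

def apply_color_mapping(M, rgb):
    return np.hstack([rgb, np.ones((len(rgb), 1))]) @ M

# Hypothetical tile measurements: camera B data are mapped to match camera A's response.
cam_b = np.array([[0.05, 0.09, 0.12], [0.20, 0.31, 0.40], [0.55, 0.48, 0.52], [0.81, 0.76, 0.70]])
cam_a = np.array([[0.04, 0.10, 0.15], [0.18, 0.33, 0.45], [0.52, 0.50, 0.58], [0.78, 0.79, 0.77]])
M = fit_color_mapping(cam_b, cam_a)
print(apply_color_mapping(M, cam_b))   # should approximately reproduce cam_a
```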

  10. Basic Boiling Experiments with An Inclined Narrow Gap Associated With In-Vessel Retention

    International Nuclear Information System (INIS)

    Terazu, Kuninobu; Watanabe, Fukashi; Iwaki, Chikako; Yokobori, Seiichi; Akinaga, Makoto; Hamazaki, Ryoichi; SATO, Ken-ichi

    2002-01-01

    In the case of a severe accident with relocation of the molten corium into the lower plenum of reactor pressure vessel (RPV), the successful in-vessel corium retention (IVR) can prevent the progress to ex-vessel events with uncertainties and avoid the containment failure. One of the key phenomena governing the possibility of IVR would be the gap formation and cooling between a corium crust and the RPV wall, and for the achievement of IVR, it would be necessary to supply cooling water to RPV as early as possible. The BWR features relative to IVR behavior are a deep and massive water pool in the lower plenum, and many of control rod drive guide tubes (CRDGT) installed in the lower head of RPV, in which water is injected continuously except in the case of station blackout scenario. The present paper describes the basic boiling experiment conducted in order to investigate the boiling characteristics in an inclined narrow gap simulating a part of the lower head curvature. The boiling experiments were composed of visualization tests and heat transfer tests. In the visualization tests, two types of inclined gap were constructed using the parallel plate and the V-shaped parallel plate with heating from the top plate, and the boiling flow pattern was observed with various gap width and heat flux. These observation results showed that water was easily supplied from the gap bottom of parallel plate even in a very narrow gap with smaller width than 1 mm, and water could flow continuously in the narrow gap by the geometric and thermal imbalance from the experiment results using the V-shaped parallel plate. In the heat transfer tests, the critical heat flux (CHF) data in an inclined narrow channel formed by the parallel plates were measured in terms of the parameters of gap width, heated length and inclined angle of a channel, and the effect of inclination was incorporated into the existing CHF correlation for a narrow gap. The CHF correlation modified for an inclined narrow gap

  11. Optimising camera traps for monitoring small mammals.

    Directory of Open Access Journals (Sweden)

    Alistair S Glen

Full Text Available Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were (1) trigger speed, (2) passive infrared vs. microwave sensor, (3) white vs. infrared flash, and (4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps.

  12. THE EFFECTS OF THREE DIFFERENT REAR KNEE ANGLES ON KINEMATICS IN THE SPRINT START

    Directory of Open Access Journals (Sweden)

    C. Milanese

    2014-08-01

Full Text Available The purpose of this study was to investigate the rear knee angle range in the set position that allows sprinters to reach greater propulsion on the rear block during the sprint start. Eleven university track-team sprinters performed the sprint start using three rear knee angle conditions: 90°, 115° and 135°. A motion capture system consisting of 8 digital cameras (250 Hz) was used to record kinematic parameters at the starting block phase and the acceleration phase. The following variables were considered: horizontal velocity of the centre of mass (COM), COM height, block time, pushing time on the rear block, percentage of pushing time on the rear block, force impulse, push-off angle and length of the first two strides. The main results show that, first, horizontal block velocity is significantly greater at 90° vs 115° and 135° rear knee angle (p<0.05 and p<0.001, respectively) at block clearance and the first two strides; second, during the pushing phase, the percentage of pushing time of the rear leg is significantly greater at 90° vs 135° rear knee angle (p<0.01). No significant difference was found for block time among the conditions. These results indicate that block velocity is the main kinematic parameter affected by rear knee angle during the starting block phase and acceleration phase. Furthermore, the 90° rear knee angle allows for a better push-off of the rear leg than larger angles at the set position. The findings of this study provide some direction and useful practical advice in defining an efficient rear leg biomechanical configuration at the set position.

  13. The Camera of the MASCOT Asteroid Lander on Board Hayabusa 2

    Science.gov (United States)

    Jaumann, R.; Schmitz, N.; Koncz, A.; Michaelis, H.; Schroeder, S. E.; Mottola, S.; Trauthan, F.; Hoffmann, H.; Roatsch, T.; Jobs, D.; Kachlicki, J.; Pforte, B.; Terzer, R.; Tschentscher, M.; Weisse, S.; Mueller, U.; Perez-Prieto, L.; Broll, B.; Kruselburger, A.; Ho, T.-M.; Biele, J.; Ulamec, S.; Krause, C.; Grott, M.; Bibring, J.-P.; Watanabe, S.; Sugita, S.; Okada, T.; Yoshikawa, M.; Yabuta, H.

    2017-07-01

The MASCOT Camera (MasCam) is part of the Mobile Asteroid Surface Scout (MASCOT) lander's science payload. MASCOT has been launched to asteroid (162173) Ryugu onboard JAXA's Hayabusa 2 asteroid sample return mission on Dec 3rd, 2014. It is scheduled to arrive at Ryugu in 2018, and return samples to Earth by 2020. MasCam was designed and built by DLR's Institute of Planetary Research, together with Airbus-DS Germany. The scientific goals of the MasCam investigation are to provide ground truth for the orbiter's remote sensing observations, provide context for measurements by the other lander instruments (radiometer, spectrometer and magnetometer), the orbiter sampling experiment, and characterize the geological context, compositional variations and physical properties of the surface (e.g. rock and regolith particle size distributions). During daytime, clear filter images will be acquired. During night, illumination of the dark surface is performed by an LED array, equipped with 4×36 monochromatic light-emitting diodes (LEDs) working in four spectral bands. Color imaging will allow the identification of spectrally distinct surface units. Continued imaging during the surface mission phase and the acquisition of image series at different sun angles over the course of an asteroid day will contribute to the physical characterization of the surface and also allow the investigation of time-dependent processes and to determine the photometric properties of the regolith. The MasCam observations, combined with the MASCOT hyperspectral microscope (MMEGA) and radiometer (MARA) thermal observations, will cover a wide range of observational scales and serve as a strong tie point between Hayabusa 2's remote-sensing scales (10^3-10^{-3} m) and sample scales (10^{-3}-10^{-6} m). The descent sequence and the close-up images will reveal the surface features over a broad range of scales, allowing an assessment of the surface's diversity and close the gap between the orbital observations

  14. Large-amplitude and narrow-band vibration phenomenon of a foursquare fix-supported flexible plate in a rigid narrow channel

    Energy Technology Data Exchange (ETDEWEB)

    Liu Lifang, E-mail: liu_lifang1106@yahoo.cn [School of Nuclear Science and Engineering, North China Electric Power University, Zhuxinzhuang, Dewai, Beijing 102206 (China); Lu Daogang, E-mail: ludaogang@ncepu.edu.cn [School of Nuclear Science and Engineering, North China Electric Power University, Zhuxinzhuang, Dewai, Beijing 102206 (China); Li Yang, E-mail: qinxiuyi@sina.com [School of Nuclear Science and Engineering, North China Electric Power University, Zhuxinzhuang, Dewai, Beijing 102206 (China); Zhang Pan, E-mail: zhangpan@ncepu.edu.cn [School of Nuclear Science and Engineering, North China Electric Power University, Zhuxinzhuang, Dewai, Beijing 102206 (China); Niu Fenglei, E-mail: niufenglei@ncepu.edu.cn [School of Nuclear Science and Engineering, North China Electric Power University, Zhuxinzhuang, Dewai, Beijing 102206 (China)

    2011-08-15

Highlights: > FIV of a foursquare fix-supported flexible plate exposed to axial flow was studied. > A specially designed test section and advanced measuring equipment were adopted. > The narrow-band vibration phenomenon with large amplitude was observed. > The relation between the plate's vibration amplitude and the flow rate was investigated. > The phenomenon and the measurement error were analyzed. - Abstract: An experiment was performed to analyze the flow-induced vibration behavior of a foursquare fix-supported flexible plate exposed to the axial flow within a rigid narrow channel. The large-amplitude and narrow-band vibration phenomenon was observed in the experiment when the flow velocity was varied over the range of 0-5 m/s. The occurring conditions and some characteristics of the large-amplitude and narrow-band vibrations were investigated.

  15. Science, conservation, and camera traps

    Science.gov (United States)

    Nichols, James D.; Karanth, K. Ullas; O'Connel, Allan F.; O'Connell, Allan F.; Nichols, James D.; Karanth, K. Ullas

    2011-01-01

    Biologists commonly perceive camera traps as a new tool that enables them to enter the hitherto secret world of wild animals. Camera traps are being used in a wide range of studies dealing with animal ecology, behavior, and conservation. Our intention in this volume is not to simply present the various uses of camera traps, but to focus on their use in the conduct of science and conservation. In this chapter, we provide an overview of these two broad classes of endeavor and sketch the manner in which camera traps are likely to be able to contribute to them. Our main point here is that neither photographs of individual animals, nor detection history data, nor parameter estimates generated from detection histories are the ultimate objective of a camera trap study directed at either science or management. Instead, the ultimate objectives are best viewed as either gaining an understanding of how ecological systems work (science) or trying to make wise decisions that move systems from less desirable to more desirable states (conservation, management). Therefore, we briefly describe here basic approaches to science and management, emphasizing the role of field data and associated analyses in these processes. We provide examples of ways in which camera trap data can inform science and management.

  16. Camera for coherent diffractive imaging and holography with a soft-x-ray free-electron laser

    International Nuclear Information System (INIS)

    Bajt, Sasa; Chapman, Henry N.; Spiller, Eberhard A.; Alameda, Jennifer B.; Woods, Bruce W.; Frank, Matthias; Bogan, Michael J.; Barty, Anton; Boutet, Sebastien; Marchesini, Stefano; Hau-Riege, Stefan P.; Hajdu, Janos; Shapiro, David

    2008-01-01

We describe a camera to record coherent scattering patterns with a soft-x-ray free-electron laser (FEL). The camera consists of a laterally graded multilayer mirror, which reflects the diffraction pattern onto a CCD detector. The mirror acts as a bandpass filter for both the wavelength and the angle, which isolates the desired scattering pattern from nonsample scattering or incoherent emission from the sample. The mirror also solves the particular problem of the extreme intensity of the FEL pulses, which are focused to greater than 10^14 W/cm^2. The strong undiffracted pulse passes through a hole in the mirror and propagates onto a beam dump at a distance behind the instrument rather than interacting with a beam stop placed near the CCD. The camera concept is extendable for the full range of the fundamental wavelength of the free electron laser in Hamburg (FLASH) FEL (i.e., between 6 and 60 nm) and into the water window. We have fabricated and tested various multilayer mirrors for wavelengths of 32, 16, 13.5, and 4.5 nm. At the shorter wavelengths mirror roughness must be minimized to reduce scattering from the mirror. We have recorded over 30,000 diffraction patterns at the FLASH FEL with no observable mirror damage or degradation of performance

  17. Soft x-ray streak cameras

    International Nuclear Information System (INIS)

    Stradling, G.L.

    1988-01-01

    This paper is a discussion of the development and of the current state of the art in picosecond soft x-ray streak camera technology. Accomplishments from a number of institutions are discussed. X-ray streak cameras vary from standard visible streak camera designs in the use of an x-ray transmitting window and an x-ray sensitive photocathode. The spectral sensitivity range of these instruments includes portions of the near UV and extends from the subkilovolt x- ray region to several tens of kilovolts. Attendant challenges encountered in the design and use of x-ray streak cameras include the accommodation of high-voltage and vacuum requirements, as well as manipulation of a photocathode structure which is often fragile. The x-ray transmitting window is generally too fragile to withstand atmospheric pressure, necessitating active vacuum pumping and a vacuum line of sight to the x-ray signal source. Because of the difficulty of manipulating x-ray beams with conventional optics, as is done with visible light, the size of the photocathode sensing area, access to the front of the tube, the ability to insert the streak tube into a vacuum chamber and the capability to trigger the sweep with very short internal delay times are issues uniquely relevant to x-ray streak camera use. The physics of electron imaging may place more stringent limitations on the temporal and spatial resolution obtainable with x-ray photocathodes than with the visible counterpart. Other issues which are common to the entire streak camera community also concern the x-ray streak camera users and manufacturers

  18. Evolution of deformation velocity in narrowing for Zircaloy 2

    Energy Technology Data Exchange (ETDEWEB)

    Cetlin, P R [Minas Gerais Univ., Belo Horizonte (Brazil). Dept. de Engenharia Metalurgica; Okuda, M Y [Goias Univ., Goiania (Brazil). Inst. de Matematica e Fisica

    1980-09-01

Some studies on deformation instability under strain show that differences in this instability may lead to either localized or elongated narrowing in Zircaloy-2. The variation of deformation velocity with the evolution of narrowing is expected to differ between these two cases. This variation is discussed here; a large difference in behavior was observed for the case of localized narrowing.

  19. New camera systems for fuel services

    International Nuclear Information System (INIS)

    Hummel, W.; Beck, H.J.

    2010-01-01

AREVA NP Fuel Services have many years of experience in visual examination and measurements on fuel assemblies and associated core components using state-of-the-art cameras and measuring technologies. The techniques used allow the surface and dimensional characterization of materials and shapes by visual examination. New, more sophisticated technologies for fuel services include, for example, two shielded color camera systems for underwater use and close inspection of a fuel assembly. Market requirements for detecting and characterizing small defects (smaller than a tenth of a millimetre) or cracks, and for analyzing the surface appearance of irradiated fuel rod cladding or fuel assembly structural parts, have increased. It is therefore common practice to use movie cameras with higher resolution. The radiation resistance of high-resolution CCD cameras is in general very low, and it is not possible to use them unshielded close to a fuel assembly. By extending the camera with a mirror system and shielding around the sensitive parts, such a movie camera can be utilized for fuel assembly inspection. AREVA NP Fuel Services is now equipped with movie cameras of this kind. (orig.)

  20. Automatic multi-camera calibration for deployable positioning systems

    Science.gov (United States)

    Axelsson, Maria; Karlsson, Mikael; Rudner, Staffan

    2012-06-01

    Surveillance with automated positioning and tracking of subjects and vehicles in 3D is desired in many defence and security applications. Camera systems with stereo or multiple cameras are often used for 3D positioning. In such systems, accurate camera calibration is needed to obtain a reliable 3D position estimate. There is also a need for automated camera calibration to facilitate fast deployment of semi-mobile multi-camera 3D positioning systems. In this paper we investigate a method for automatic calibration of the extrinsic camera parameters (relative camera pose and orientation) of a multi-camera positioning system. It is based on estimation of the essential matrix between each camera pair using the 5-point method for intrinsically calibrated cameras. The method is compared to a manual calibration method using real HD video data from a field trial with a multicamera positioning system. The method is also evaluated on simulated data from a stereo camera model. The results show that the reprojection error of the automated camera calibration method is close to or smaller than the error for the manual calibration method and that the automated calibration method can replace the manual calibration.
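As background, the core step named here, estimating the essential matrix between an intrinsically calibrated camera pair with a 5-point/RANSAC solver and recovering the relative pose, can be sketched with OpenCV on synthetic data. The intrinsics, poses and point cloud below are invented for illustration; this is not the paper's full calibration pipeline.

```python
import numpy as np
import cv2

# Synthetic check of the 5-point pipeline: project random 3D points into two camera poses,
# then recover the relative pose from the essential matrix (intrinsics K assumed known).
rng = np.random.default_rng(0)
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
pts3d = rng.uniform([-1, -1, 4], [1, 1, 8], size=(100, 3))

R_true, _ = cv2.Rodrigues(np.array([0.0, 0.2, 0.0]))   # small rotation about the y axis
t_true = np.array([[0.5], [0.0], [0.0]])                # baseline along x

def project(P, R, t):
    cam = (R @ P.T + t).T
    uv = (K @ cam.T).T
    return (uv[:, :2] / uv[:, 2:]).astype(np.float64)

pts1 = project(pts3d, np.eye(3), np.zeros((3, 1)))
pts2 = project(pts3d, R_true, t_true)

E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R_est, t_est, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
print("recovered rotation:\n", R_est)
print("recovered (unit) translation:", t_est.ravel())   # absolute scale is unobservable
```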

  1. Towards next generation 3D cameras

    Science.gov (United States)

    Gupta, Mohit

    2017-03-01

    We are in the midst of a 3D revolution. Robots enabled by 3D cameras are beginning to autonomously drive cars, perform surgeries, and manage factories. However, when deployed in the real-world, these cameras face several challenges that prevent them from measuring 3D shape reliably. These challenges include large lighting variations (bright sunlight to dark night), presence of scattering media (fog, body tissue), and optically complex materials (metal, plastic). Due to these factors, 3D imaging is often the bottleneck in widespread adoption of several key robotics technologies. I will talk about our work on developing 3D cameras based on time-of-flight and active triangulation that addresses these long-standing problems. This includes designing `all-weather' cameras that can perform high-speed 3D scanning in harsh outdoor environments, as well as cameras that recover shape of objects with challenging material properties. These cameras are, for the first time, capable of measuring detailed (robotic inspection and assembly systems.

  2. Relative and Absolute Calibration of a Multihead Camera System with Oblique and Nadir Looking Cameras for a Uas

    Science.gov (United States)

    Niemeyer, F.; Schima, R.; Grenzdörffer, G.

    2013-08-01

    Numerous unmanned aerial systems (UAS) are currently flooding the market. For the most diverse applications UAVs are special designed and used. Micro and mini UAS (maximum take-off weight up to 5 kg) are of particular interest, because legal restrictions are still manageable but also the payload capacities are sufficient for many imaging sensors. Currently a camera system with four oblique and one nadir looking cameras is under development at the Chair for Geodesy and Geoinformatics. The so-called "Four Vision" camera system was successfully built and tested in the air. A MD4-1000 UAS from microdrones is used as a carrier system. Light weight industrial cameras are used and controlled by a central computer. For further photogrammetric image processing, each individual camera, as well as all the cameras together have to be calibrated. This paper focuses on the determination of the relative orientation between the cameras with the „Australis" software and will give an overview of the results and experiences of test flights.

  3. Narrow Networks on the Individual Marketplace in 2017.

    Science.gov (United States)

    Polski, Daniel; Weiner, Janet; Zhang, Yuehan

    2017-09-01

    This Issue Brief describes the breadth of physician networks on the ACA marketplaces in 2017. We find that the overall rate of narrow networks is 21%, which is a decline since 2014 (31%) and 2016 (25%). Narrow networks are concentrated in plans sold on state-based marketplaces, at 42%, compared to 10% of plans on federally-facilitated marketplaces. Issuers that have traditionally offered Medicaid coverage have the highest prevalence of narrow network plans at 36%, with regional/local plans and provider-based plans close behind at 27% and 30%. We also find large differences in narrow networks by state and by plan type.

  4. Transmission electron microscope CCD camera

    Science.gov (United States)

    Downing, Kenneth H.

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  5. Comparison of the contrast in conventional and lattice resolved ADF STEM images of InGaAs/GaAs structures using different camera lengths

    Science.gov (United States)

    Qiu, Y.; Lari, L.; Ross, I. M.; Walther, T.

    2011-11-01

A procedure to quantify annular dark field (ADF) images in scanning transmission electron microscopy (STEM) has been applied to two 200 kV transmission electron microscopes (TEMs), a JEOL 2010F and a double aberration-corrected JEOL 2200FSC. A series of ADF images is acquired as a function of the camera length (i.e. inner detection angle). The intensity ratio of InGaAs to GaAs is then plotted vs. camera length and extrapolated to zero, at which point the contrast behaves exactly as predicted by Rutherford scattering. The linearity of the ADF intensity ratio vs. camera length improves significantly with the JEOL 2200FSC compared to the JEOL 2010F at medium resolution. A high-resolution ADF image at 2 MX nominal magnification acquired in the JEOL 2200FSC shows the same linearity of intensity ratio vs. camera length, independent of whether the ratios of the average background intensities or of the fringe amplitudes are used for the analysis. This is explained by both group III and group V atoms contributing to the {111} fringes observed, similar to the low-resolution data.
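The extrapolation step can be illustrated with a simple linear fit of ratio against camera length; the numbers below are invented placeholders rather than measured values.

```python
import numpy as np

# Hypothetical ADF intensity ratios I(InGaAs)/I(GaAs) measured at several camera lengths (cm).
camera_length_cm = np.array([8.0, 10.0, 12.0, 15.0, 20.0])
intensity_ratio = np.array([1.42, 1.38, 1.35, 1.31, 1.26])

slope, intercept = np.polyfit(camera_length_cm, intensity_ratio, 1)
print(f"ratio extrapolated to zero camera length: {intercept:.3f}")  # Rutherford-like limit
```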

  6. Homography-based multiple-camera person-tracking

    Science.gov (United States)

    Turk, Matthew R.

    2009-01-01

    Multiple video cameras are cheaply installed overlooking an area of interest. While computerized single-camera tracking is well-developed, multiple-camera tracking is a relatively new problem. The main multi-camera problem is to give the same tracking label to all projections of a real-world target. This is called the consistent labelling problem. Khan and Shah (2003) introduced a method to use field of view lines to perform multiple-camera tracking. The method creates inter-camera meta-target associations when objects enter at the scene edges. They also said that a plane-induced homography could be used for tracking, but this method was not well described. Their homography-based system would not work if targets use only one side of a camera to enter the scene. This paper overcomes this limitation and fully describes a practical homography-based tracker. A new method to find the feet feature is introduced. The method works especially well if the camera is tilted, when using the bottom centre of the target's bounding-box would produce inaccurate results. The new method is more accurate than the bounding-box method even when the camera is not tilted. Next, a method is presented that uses a series of corresponding point pairs "dropped" by oblivious, live human targets to find a plane-induced homography. The point pairs are created by tracking the feet locations of moving targets that were associated using the field of view line method. Finally, a homography-based multiple-camera tracking algorithm is introduced. Rules governing when to create the homography are specified. The algorithm ensures that homography-based tracking only starts after a non-degenerate homography is found. The method works when not all four field of view lines are discoverable; only one line needs to be found to use the algorithm. To initialize the system, the operator must specify pairs of overlapping cameras. Aside from that, the algorithm is fully automatic and uses the natural movement of
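A minimal sketch of the homography step, assuming the ground-plane point pairs "dropped" by tracked feet have already been collected; the coordinates below are made up, and the full consistent-labelling logic of the paper is not reproduced.

```python
import numpy as np
import cv2

# Hypothetical corresponding ground-plane points from two overlapping views (at least 4 pairs).
pts_cam_a = np.array([[100, 420], [260, 415], [410, 430], [530, 445], [330, 470]], dtype=np.float32)
pts_cam_b = np.array([[ 80, 300], [230, 310], [395, 330], [520, 350], [310, 365]], dtype=np.float32)

H, inliers = cv2.findHomography(pts_cam_a, pts_cam_b, method=cv2.RANSAC, ransacReprojThreshold=3.0)

foot_a = np.array([[[300.0, 450.0]]], dtype=np.float32)   # a foot location seen in camera A
foot_b = cv2.perspectiveTransform(foot_a, H)              # predicted location in camera B
print("consistent label position in camera B:", foot_b.ravel())
```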

  7. Biometric gonioscopy and the effects of age, race, and sex on the anterior chamber angle

    Science.gov (United States)

    Congdon, N G; Foster, P J; Wamsley, S; Gutmark, J; Nolan, W; Seah, S K; Johnson, G J; Broman, A T

    2002-01-01

    Aim: To utilise a novel method for making measurements in the anterior chamber in order to compare the anterior chamber angles of people of European, African, and east Asian descent aged 40 years and over. Methods: A cross sectional study on 15 people of each sex from each decade from the 40s to the 70s, from each of three racial groups—black, white, and Chinese Singaporeans. Biometric gonioscopy (BG) utilises a slit lamp mounted reticule to make measurements from the apparent iris insertion to Schwalbe's line through a Goldmann one mirror goniolens. The main outcome measures were BG measurements of the anterior chamber angle as detailed above. Results: There was no significant difference in angle measurement between black, white, and Chinese races in this study. However, at younger ages people of Chinese race appeared to have deeper angles than white or black people, whereas the angles of older Chinese were significantly narrower (p = 0.004 for the difference in slope of BG by age between Chinese and both black and white people). Conclusion: The failure to detect a difference in angle measurements between these groups was surprising, given the much higher prevalence of angle closure among Chinese. It appears that the overall apparent similarity of BG means between Chinese and Western populations may mask very different trends with age. The apparently more rapid decline in angle width measurements with age among Chinese may be due to the higher prevalence of cataract or “creeping angle closure.” However, longitudinal inferences from cross sectional data are problematic, and this may represent a cohort phenomenon caused by the increasing prevalence of myopia in the younger Singaporean population. PMID:11801496

  8. Reconstruction in PET cameras with irregular sampling and depth of interaction capability

    International Nuclear Information System (INIS)

    Virador, P.R.G.; Moses, W.W.; Huesman, R.H.

    1998-01-01

    The authors present 2D reconstruction algorithms for a rectangular PET camera capable of measuring depth of interaction (DOI). The camera geometry leads to irregular radial and angular sampling of the tomographic data. DOI information increases sampling density, allowing the use of evenly spaced quarter-crystal width radial bins with minimal interpolation of irregularly spaced data. In the regions where DOI does not increase sampling density (chords normal to crystal faces), fine radial sinogram binning leads to zero efficiency bins if uniform angular binning is used. These zero efficiency sinogram bins lead to streak artifacts if not corrected. To minimize these unnormalizable sinogram bins the authors use two angular binning schemes: Fixed Width and Natural Width. Fixed Width uses a fixed angular width except in the problem regions where appropriately chosen widths are applied. Natural Width uses angle widths which are derived from intrinsic detector sampling. Using a modified filtered-backprojection algorithm to accommodate these angular binning schemes, the authors reconstruct artifact free images with nearly isotropic and position independent spatial resolution. Results from Monte Carlo data indicate that they have nearly eliminated image degradation due to crystal penetration
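The authors' modified filtered backprojection for irregular, DOI-augmented sampling is not shown in this record; as generic background, a standard filtered-backprojection reconstruction of a uniformly binned sinogram can be sketched with scikit-image. The phantom, angle set and scaling are illustrative assumptions only.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

phantom = rescale(shepp_logan_phantom(), 0.25, anti_aliasing=True)   # small test image
angles = np.linspace(0.0, 180.0, 90, endpoint=False)                 # uniform angular bins
sinogram = radon(phantom, theta=angles)                              # forward projection
reconstruction = iradon(sinogram, theta=angles)                      # ramp-filtered backprojection
print("RMS reconstruction error:", np.sqrt(np.mean((reconstruction - phantom) ** 2)))
```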

  9. Radiometric Cross-Calibration of GAOFEN-1 Wfv Cameras with LANDSAT-8 Oli and Modis Sensors Based on Radiation and Geometry Matching

    Science.gov (United States)

    Li, J.; Wu, Z.; Wei, X.; Zhang, Y.; Feng, F.; Guo, F.

    2018-04-01

Cross-calibration has the advantages of high precision, low resource requirements and simple implementation, and it has been widely used in recent years. The four wide-field-of-view (WFV) cameras on board the Gaofen-1 satellite provide high spatial resolution and wide combined coverage (4 × 200 km) without onboard calibration. In this paper, the four-band radiometric cross-calibration coefficients of the WFV1 camera were obtained based on radiation and geometry matching, taking the Landsat 8 OLI (Operational Land Imager) sensor as reference. The Scale Invariant Feature Transform (SIFT) feature detection method and a distance and included-angle weighting method were introduced to correct misregistration of the WFV-OLI image pair. A radiative transfer model was used to eliminate differences between the OLI sensor and the WFV1 camera through a spectral match factor (SMF). Because the near-infrared band of the WFV1 camera encompasses water vapor absorption bands, a look-up table (LUT) of SMF versus water vapor amount was established to estimate water vapor effects. A surface synchronization experiment was designed to verify the reliability of the cross-calibration coefficients, which appear to perform better than the official coefficients published by the China Centre for Resources Satellite Data and Application (CCRSDA).
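The SIFT-based registration step can be sketched with OpenCV; the file names below are placeholders for a WFV1 band and a resampled OLI band, and the ratio-test threshold is a common default rather than the paper's setting.

```python
import cv2

# Hypothetical image pair: a GF-1 WFV1 scene and a near-simultaneous Landsat-8 OLI scene
# resampled to a comparable ground resolution (file names are placeholders).
img_wfv = cv2.imread("wfv1_band3.tif", cv2.IMREAD_GRAYSCALE)
img_oli = cv2.imread("oli_band4.tif", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img_wfv, None)
kp2, des2 = sift.detectAndCompute(img_oli, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]   # Lowe ratio test
print(f"{len(good)} tie points for registering the WFV/OLI image pair")
```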

  10. Scintillation camera for high activity sources

    International Nuclear Information System (INIS)

    Arseneau, R.E.

    1978-01-01

    The invention described relates to a scintillation camera used for clinical medical diagnosis. Advanced recognition of many unacceptable pulses allows the scintillation camera to discard such pulses at an early stage in processing. This frees the camera to process a greater number of pulses of interest within a given period of time. Temporary buffer storage allows the camera to accommodate pulses received at a rate in excess of its maximum rated capability due to statistical fluctuations in the level of radioactivity of the radiation source measured. (U.K.)

  11. Decision about buying a gamma camera

    International Nuclear Information System (INIS)

    Ganatra, R.D.

    1992-01-01

    A large part of the referral to a nuclear medicine department is usually for imaging studies. Sooner or later, the nuclear medicine specialist will be called upon to make a decision about when and what type of gamma camera to buy. There is no longer an option of choosing between a rectilinear scanner and a gamma camera as the former is virtually out of the market. The decision that one has to make is when to invest in a gamma camera, and then on what basis to select the gamma camera

  12. Decision about buying a gamma camera

    Energy Technology Data Exchange (ETDEWEB)

    Ganatra, R D

    1993-12-31

    A large part of the referral to a nuclear medicine department is usually for imaging studies. Sooner or later, the nuclear medicine specialist will be called upon to make a decision about when and what type of gamma camera to buy. There is no longer an option of choosing between a rectilinear scanner and a gamma camera as the former is virtually out of the market. The decision that one has to make is when to invest in a gamma camera, and then on what basis to select the gamma camera 1 tab., 1 fig

  13. Selective-imaging camera

    Science.gov (United States)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.

  14. Video Chat with Multiple Cameras

    OpenAIRE

    MacCormick, John

    2012-01-01

    The dominant paradigm for video chat employs a single camera at each end of the conversation, but some conversations can be greatly enhanced by using multiple cameras at one or both ends. This paper provides the first rigorous investigation of multi-camera video chat, concentrating especially on the ability of users to switch between views at either end of the conversation. A user study of 23 individuals analyzes the advantages and disadvantages of permitting a user to switch between views at...

  15. Microprocessor-controlled, wide-range streak camera

    Energy Technology Data Exchange (ETDEWEB)

    Amy E. Lewis, Craig Hollabaugh

    2006-09-01

Bechtel Nevada/NSTec recently announced deployment of their fifth generation streak camera. This camera incorporates many advanced features beyond those currently available for streak cameras. The arc-resistant driver includes a trigger lockout mechanism, actively monitors input trigger levels, and incorporates a high-voltage fault interrupter for user safety and tube protection. The camera is completely modular and may deflect over a variable full-sweep time of 15 nanoseconds to 500 microseconds. The camera design is compatible with both large- and small-format commercial tubes from several vendors. The embedded microprocessor offers Ethernet connectivity, and XML [extensible markup language]-based configuration management with non-volatile parameter storage using flash-based storage media. The camera's user interface is platform-independent (Microsoft Windows, Unix, Linux, Macintosh OSX) and is accessible using an AJAX [asynchronous Javascript and XML]-equipped modern browser, such as Internet Explorer 6, Firefox, or Safari. User interface operation requires no installation of client software or browser plug-in technology. Automation software can also access the camera configuration and control using HTTP [hypertext transfer protocol]. The software architecture supports multiple simultaneous clients, multiple cameras, and multiple module access with a standard browser. The entire user interface can be customized.

  16. Q-angle in patellofemoral pain: relationship with dynamic knee valgus, hip abductor torque, pain and function

    Directory of Open Access Journals (Sweden)

    Gabriel Peixoto Leão Almeida

    2016-04-01

Full Text Available OBJECTIVE: To investigate the relationship between the q-angle and anterior knee pain severity, functional capacity, dynamic knee valgus and hip abductor torque in women with patellofemoral pain syndrome (PFPS). METHODS: This study included 22 women with PFPS. The q-angle was assessed using goniometry: the participants were positioned in dorsal decubitus with the knee and hip extended, and the hip and foot in neutral rotation. Anterior knee pain severity was assessed using a visual analog scale, and functional capacity was assessed using the anterior knee pain scale. Dynamic valgus was evaluated using the frontal plane projection angle (FPPA) of the knee, which was recorded using a digital camera during step down, and hip abductor peak torque was recorded using a handheld dynamometer. RESULTS: The q-angle did not present any significant correlation with severity of knee pain (r = -0.29; p = 0.19), functional capacity (r = -0.08; p = 0.72), FPPA (r = -0.28; p = 0.19) or isometric peak torque of the abductor muscles (r = -0.21; p = 0.35). CONCLUSION: The q-angle did not present any relationship with pain intensity, functional capacity, FPPA, or hip abductor peak torque in the patients with PFPS.

  17. Bayesian Estimator for Angle Recovery: Event Classification and Reconstruction in Positron Emission Tomography

    International Nuclear Information System (INIS)

    Foudray, Angela M K; Levin, Craig S

    2007-01-01

    PET at the highest level is an inverse problem: reconstruct the location of the emission (which localize biological function) from detected photons. Ideally, one would like to directly measure an annihilation photon's incident direction on the detector. In the developed algorithm, Bayesian Estimation for Angle Recovery (BEAR), we utilized the increased information gathered from localizing photon interactions in the detector and developed a Bayesian estimator for a photon's incident direction. Probability distribution functions (PDFs) were filled using an interaction energy weighted mean or center of mass (COM) reference space, which had the following computational advantages: (1) a significant reduction in the size of the data in measurement space, making further manipulation and searches faster (2) the construction of COM space does not depend on measurement location, it takes advantage of measurement symmetries, and data can be added to the training set without knowledge and recalculation of prior training data, (3) calculation of posterior probability map is fully parallelizable, it can scale to any number of processors. These PDFs were used to estimate the point spread function (PSF) in incident angle space for (i) algorithm assessment and (ii) to provide probability selection criteria for classification. The algorithm calculates both the incident θ and φ angle, with ∼16 degrees RMS in both angles, limiting the incoming direction to a narrow cone. Feature size did not improve using the BEAR algorithm as an angle filter, but the contrast ratio improved 40% on average
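The centre-of-mass reference space described above is simply an energy-weighted mean of the interaction positions; a small sketch with made-up interaction data shows the computation.

```python
import numpy as np

def energy_weighted_com(positions, energies):
    """Energy-weighted mean (centre of mass) of photon interaction positions.
    positions: (N, 3) interaction coordinates in the detector; energies: (N,) deposited energies."""
    w = np.asarray(energies, dtype=float)
    return (np.asarray(positions, dtype=float) * w[:, None]).sum(axis=0) / w.sum()

# Example: three interactions of a single annihilation photon (hypothetical values, mm and keV).
pos = [[1.0, 2.0, 0.5], [1.4, 2.2, 1.1], [1.1, 2.1, 0.8]]
e = [340.0, 120.0, 51.0]
print("COM reference point:", energy_weighted_com(pos, e))
```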

  18. Knee Angle and Stride Length in Association with Ball Speed in Youth Baseball Pitchers

    Directory of Open Access Journals (Sweden)

    Bart van Trigt

    2018-05-01

Full Text Available The purpose of this study was to determine whether stride length and knee angle of the leading leg at foot contact, at the instant of maximal external rotation of the shoulder, and at ball release are associated with ball speed in elite youth baseball pitchers. In this study, fifty-two elite youth baseball pitchers (mean age 15.2, standard deviation (SD) 1.7 years) pitched ten fastballs. Data were collected with three high-speed video cameras at a frequency of 240 Hz. Stride length and knee angle of the leading leg were calculated at foot contact, maximal external rotation, and ball release. The associations between these kinematic variables and ball speed were separately determined using generalized estimating equations. Stride length as percentage of body height and knee angle at foot contact were not significantly associated with ball speed. However, knee angles at maximal external rotation and ball release were significantly associated with ball speed. Ball speed increased by 0.45 m/s (1 mph) with an increase in knee extension of 18 degrees at maximal external rotation and 19.5 degrees at ball release. In conclusion, more knee extension of the leading leg at maximal external rotation and ball release is associated with higher ball speeds in elite youth baseball pitchers.

  19. Development of a large-solid-angle and multi-device detection system for elemental analysis

    International Nuclear Information System (INIS)

    Satoh, T.; Ishii, K.; Kamiya, T.; Sakai, T.; Oikawa, M.; Arakawa, K.; Matsuyama, S.; Yamazaki, H.

    2003-01-01

    A new detection apparatus for both low energy X-rays like 1 keV and back scattered protons of MeV energy was developed. The detection apparatus consists of a large-solid-angle multi-device Si detector and a data acquisition system. The detector has 45 detection devices which are arranged in the shape of a pentagonal pyramid and fully cover a sample. A micro-beam irradiates the sample through the center of the pentagonal pyramid and X-rays emitted from the sample are detected in a solid angle of about 1.0 sr. This novel detection setup has about five times higher sensitivity than a conventional micro-PIXE camera. In addition, not only X-rays but back scattered protons can be detected, since the counting rate of back scattered protons per detection device is small despite lack of a passive absorber

  20. Analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, I.; Osorio Deliz, J. F.; Diaz Garcia, A.

    2013-01-01

This research work was carried out to develop an analyzer for gamma camera diagnostics. It consists of an electronic system with hardware and software capabilities, and it operates on the four position signals acquired from the head of a gamma camera detector. The result is the spectrum of the energy delivered by the nuclear radiation coming from the camera detector head. The system includes analog processing of the position signals from the camera, digitization, subsequent processing of the energy signal in a multichannel analyzer, transmission of the data to a computer via a standard USB port, and processing of the data in a personal computer to obtain the final histogram. The circuits consist of an analog processing board and a universal kit with a microcontroller and a programmable gate array. (Author)

  1. Single Camera Calibration in 3D Vision

    Directory of Open Access Journals (Sweden)

    Caius SULIMAN

    2009-12-01

Full Text Available Camera calibration is a necessary step in 3D vision in order to extract metric information from 2D images. A camera is considered to be calibrated when its parameters are known (i.e. principal distance, lens distortion, focal length, etc.). In this paper we deal with a single-camera calibration method, and with its help we try to find the intrinsic and extrinsic camera parameters. The method was implemented with success in the programming and simulation environment Matlab.
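The record notes that the method was implemented in Matlab; as a rough, assumed equivalent in another toolchain, OpenCV's standard calibration call recovers the same kinds of intrinsic parameters (focal lengths, principal point, lens distortion) plus per-view extrinsics from chessboard images. The board geometry and file pattern below are placeholders.

```python
import glob
import numpy as np
import cv2

# Object points of a 9x6 chessboard with 25 mm squares (hypothetical calibration target).
objp = np.zeros((9 * 6, 3), np.float32)
objp[:, :2] = np.mgrid[0:9, 0:6].T.reshape(-1, 2) * 25.0

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib_*.png"):                   # placeholder file pattern
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, (9, 6))
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)
        size = gray.shape[::-1]

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
print("intrinsic matrix (focal lengths, principal point):\n", K)
print("lens distortion coefficients:", dist.ravel())    # extrinsics per view are in rvecs/tvecs
```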

  2. RELATIVE AND ABSOLUTE CALIBRATION OF A MULTIHEAD CAMERA SYSTEM WITH OBLIQUE AND NADIR LOOKING CAMERAS FOR A UAS

    Directory of Open Access Journals (Sweden)

    F. Niemeyer

    2013-08-01

    Full Text Available Numerous unmanned aerial systems (UAS are currently flooding the market. For the most diverse applications UAVs are special designed and used. Micro and mini UAS (maximum take-off weight up to 5 kg are of particular interest, because legal restrictions are still manageable but also the payload capacities are sufficient for many imaging sensors. Currently a camera system with four oblique and one nadir looking cameras is under development at the Chair for Geodesy and Geoinformatics. The so-called "Four Vision" camera system was successfully built and tested in the air. A MD4-1000 UAS from microdrones is used as a carrier system. Light weight industrial cameras are used and controlled by a central computer. For further photogrammetric image processing, each individual camera, as well as all the cameras together have to be calibrated. This paper focuses on the determination of the relative orientation between the cameras with the „Australis“ software and will give an overview of the results and experiences of test flights.

  3. A camera specification for tendering purposes

    International Nuclear Information System (INIS)

    Lunt, M.J.; Davies, M.D.; Kenyon, N.G.

    1985-01-01

    A standardized document is described which is suitable for sending to companies which are being invited to tender for the supply of a gamma camera. The document refers to various features of the camera, the performance specification of the camera, maintenance details, price quotations for various options and delivery, installation and warranty details. (U.K.)

  4. Characterization of lipid films by an angle-interrogation surface plasmon resonance imaging device.

    Science.gov (United States)

    Liu, Linlin; Wang, Qiong; Yang, Zhong; Wang, Wangang; Hu, Ning; Luo, Hongyan; Liao, Yanjian; Zheng, Xiaolin; Yang, Jun

    2015-04-01

Surface topographies of lipid films are of significance in analyzing the preparation of giant unilamellar vesicles (GUVs). To achieve accurate, high-throughput and rapid analysis of the surface topography of lipid films, a home-made SPR imaging device was constructed based on the classical Kretschmann configuration and an angle-interrogation scheme. A mathematical model was developed to accurately describe the shift of the light path under different conditions and the change of the illumination point on the CCD camera; based on this calculation method, an SPR curve can be obtained for each sampling point. The experimental results show that the topographies of lipid films formed under different experimental conditions can be accurately characterized, and that the measurement resolution of lipid film thickness may reach 0.05 nm. Compared with existing SPRi devices, which detect by monitoring changes in reflected-light intensity, this new SPRi system can track the change of the resonance angle over the entire sensing surface. It therefore achieves detection accuracy as high as a traditional angle-interrogation SPR sensor, with a much wider detectable range of refractive index. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. State of art in radiation tolerant camera

    Energy Technology Data Exchange (ETDEWEB)

    Choi; Young Soo; Kim, Seong Ho; Cho, Jae Wan; Kim, Chang Hoi; Seo, Young Chil

    2002-02-01

Work in radiation environments such as nuclear power plants, RI facilities, nuclear fuel fabrication facilities and medical centers must take radiation exposure into account, and such tasks can be carried out by remote observation and operation. However, cameras built for general industry degrade under radiation, so radiation-tolerant cameras are needed for radiation environments. Applications of radiation-tolerant camera systems include the nuclear industry, radioactive-medicine facilities, aerospace, and so on. In the nuclear industry in particular there is continuous demand for the inspection of nuclear boilers, pellet exchange, and the inspection of nuclear waste. Countries with developed nuclear industries have made efforts to develop radiation-tolerant cameras, and they now have many kinds of radiation-tolerant cameras that can tolerate total doses of 10^6-10^8 rad. In this report, we examine the state of the art in radiation-tolerant cameras and analyze these technologies. With this report we want to raise interest in developing radiation-tolerant cameras and upgrade the level of domestic technology.

  6. Multi-target detection and positioning in crowds using multiple camera surveillance

    Science.gov (United States)

    Huang, Jiahu; Zhu, Qiuyu; Xing, Yufeng

    2018-04-01

    In this study, we propose a pixel correspondence algorithm for positioning in crowds based on constraints on the distance between lines of sight, grayscale differences, and height in a world coordinates system. First, a Gaussian mixture model is used to obtain the background and foreground from multi-camera videos. Second, the hair and skin regions are extracted as regions of interest. Finally, the correspondences between each pixel in the region of interest are found under multiple constraints and the targets are positioned by pixel clustering. The algorithm can provide appropriate redundancy information for each target, which decreases the risk of losing targets due to a large viewing angle and wide baseline. To address the correspondence problem for multiple pixels, we construct a pixel-based correspondence model based on a similar permutation matrix, which converts the correspondence problem into a linear programming problem where a similar permutation matrix is found by minimizing an objective function. The correct pixel correspondences can be obtained by determining the optimal solution of this linear programming problem and the three-dimensional position of the targets can also be obtained by pixel clustering. Finally, we verified the algorithm with multiple cameras in experiments, which showed that the algorithm has high accuracy and robustness.
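The paper's permutation-matrix linear program is not reproduced here; as a smaller stand-in for the same idea, one-to-one correspondences that minimise a combined mismatch cost can be computed with SciPy's Hungarian solver. The cost matrix below is random and purely illustrative; in the paper it would combine line-of-sight distance, grayscale difference and world-coordinate height constraints.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i, j]: mismatch between candidate pixel i in camera 1 and candidate pixel j in camera 2.
rng = np.random.default_rng(1)
cost = rng.uniform(0.0, 1.0, size=(6, 6))

rows, cols = linear_sum_assignment(cost)          # optimal one-to-one pixel correspondences
print(list(zip(rows.tolist(), cols.tolist())), "total cost:", float(cost[rows, cols].sum()))
```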

  7. 16 CFR 501.1 - Camera film.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Camera film. 501.1 Section 501.1 Commercial... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the net... should be expressed, provided: (a) The net quantity of contents on packages of movie film and bulk still...

  8. Securing Embedded Smart Cameras with Trusted Computing

    Directory of Open Access Journals (Sweden)

    Winkler Thomas

    2011-01-01

    Full Text Available Camera systems are used in many applications including video surveillance for crime prevention and investigation, traffic monitoring on highways or building monitoring and automation. With the shift from analog towards digital systems, the capabilities of cameras are constantly increasing. Today's smart camera systems come with considerable computing power, large memory, and wired or wireless communication interfaces. With onboard image processing and analysis capabilities, cameras not only open new possibilities but also raise new challenges. Often overlooked are potential security issues of the camera system. The increasing amount of software running on the cameras turns them into attractive targets for attackers. Therefore, the protection of camera devices and delivered data is of critical importance. In this work we present an embedded camera prototype that uses Trusted Computing to provide security guarantees for streamed videos. With a hardware-based security solution, we ensure integrity, authenticity, and confidentiality of videos. Furthermore, we incorporate image timestamping, detection of platform reboots, and reporting of the system status. This work is not limited to theoretical considerations but also describes the implementation of a prototype system. Extensive evaluation results illustrate the practical feasibility of the approach.

  9. Principle of some gamma cameras (efficiencies, limitations, development)

    International Nuclear Information System (INIS)

    Allemand, R.; Bourdel, J.; Gariod, R.; Laval, M.; Levy, G.; Thomas, G.

    1975-01-01

    The quality of scintigraphic images is shown to depend on the efficiency of both the input collimator and the detector. Methods are described by which the quality of these images may be improved by adaptations to either the collimator (Fresnel zone camera, Compton effect camera) or the detector (Anger camera, image amplification camera). The Anger camera and image amplification camera are at present the two main instruments whereby acceptable space and energy resolutions may be obtained. A theoretical comparative study of their efficiencies is carried out, independently of their technological differences, after which the instruments designed or under study at the LETI are presented: these include the image amplification camera, the electron amplifier tube camera using a semi-conductor target CdTe and HgI2 detector [fr

  10. Streak camera recording of interferometer fringes

    International Nuclear Information System (INIS)

    Parker, N.L.; Chau, H.H.

    1977-01-01

    The use of an electronic high-speed camera in the streaking mode to record interference fringe motion from a velocity interferometer is discussed. Advantages of this method over the photomultiplier tube-oscilloscope approach are delineated. Performance testing and data for the electronic streak camera are discussed. The velocity profile of a mylar flyer accelerated by an electrically exploded bridge, and the jump-off velocity of metal targets struck by these mylar flyers are measured in the camera tests. Advantages of the streak camera include portability, low cost, ease of operation and maintenance, simplified interferometer optics, and rapid data analysis

  11. Edge detection of magnetic anomalies using analytic signal of tilt angle (ASTA)

    Science.gov (United States)

    Alamdar, K.; Ansari, A. H.; Ghorbani, A.

    2009-04-01

    Magnetics is a commonly used geophysical technique for identifying and imaging potential subsurface targets. Interpretation of magnetic anomalies is a complex process due to the superposition of multiple magnetic sources, the presence of geologic and cultural noise, and acquisition and positioning errors. Both the vertical and horizontal derivatives of potential-field data are useful: the horizontal derivative enhances edges, whereas the vertical derivative narrows the width of the anomaly and thus locates source bodies more accurately. The vertical and horizontal derivatives of the magnetic field can be combined into the analytic signal, which is independent of the body magnetization direction and whose maximum value lies directly over the edges of the body. The tilt angle is a phase-based filter defined as the angle between the vertical derivative and the total horizontal derivative. The tilt angle ranges from +90 degrees to -90 degrees, and its zero value lies over the body edge. One disadvantage of this filter is that the detected edge becomes blurred for deep sources. To overcome this problem, several authors have introduced new filters, such as the total horizontal derivative of the tilt angle or the vertical derivative of the tilt angle, but because these filters use higher-order derivatives their results can be too noisy. Combining the analytic signal and the tilt angle produces a new filter, termed ASTA, whose maximum value lies directly over the body edge and which delineates body edges more easily than the tilt angle, without its complications. In this work the new filter is demonstrated on magnetic data from an area in the Sar-Cheshme region of Iran. This area is located at 55 degrees longitude and 32 degrees latitude and is a copper-potential region. The main formations in this area are andesite and trachyandesite. Magnetic surveying was employed to separate the boundaries of the andesite and trachyandesite from the adjacent area. In this regard a variety of filters, such as the analytic signal, tilt angle, and ASTA filters, have been applied, which
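
    The quantities named above follow directly from the grid derivatives of the anomaly field. The sketch below, assuming a regularly gridded total-field anomaly, computes the tilt angle and then reads "ASTA" literally as the analytic-signal amplitude of the tilt-angle grid; the FFT-based vertical derivative and that reading of the acronym are assumptions, since the abstract does not state the formula explicitly.

```python
# Hedged sketch: tilt angle and an ASTA-style edge map from a gridded magnetic anomaly.
import numpy as np

def derivatives(grid, dx=1.0, dy=1.0):
    """Horizontal derivatives by finite differences; vertical derivative via an FFT |k| filter."""
    d_dy, d_dx = np.gradient(grid, dy, dx)
    ny, nx = grid.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, dy)
    kxx, kyy = np.meshgrid(kx, ky)
    d_dz = np.real(np.fft.ifft2(np.fft.fft2(grid) * np.hypot(kxx, kyy)))
    return d_dx, d_dy, d_dz

def tilt_angle(grid, dx=1.0, dy=1.0):
    gx, gy, gz = derivatives(grid, dx, dy)
    return np.arctan2(gz, np.hypot(gx, gy))        # bounded to +/- 90 degrees

def asta(grid, dx=1.0, dy=1.0):
    tx, ty, tz = derivatives(tilt_angle(grid, dx, dy), dx, dy)
    return np.sqrt(tx**2 + ty**2 + tz**2)          # maxima expected over body edges
```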

  12. QSOs with narrow emission lines

    International Nuclear Information System (INIS)

    Baldwin, J.A.; Mcmahon, R.; Hazard, C.; Williams, R.E.

    1988-01-01

    Observations of two new high-redshift, narrow-lined QSOs (NLQSOs) are presented and discussed together with observations of similar objects reported in the literature. Gravitational lensing is ruled out as a possible means of amplifying the luminosity for one of these objects. It is found that the NLQSOs have broad bases on their emission lines as well as the prominent narrow cores which define this class. Thus, these are not pole-on QSOs. The FWHM of the emission lines fits onto the smoothly falling tail of the lower end of the line-width distribution for complete QSO samples. The equivalent widths of the combined broad and narrow components of the lines are normal for QSOs of the luminosity range under study. However, the NLQSOs do show ionization differences from broader-lined QSOs; most significant, the semiforbidden C III/C IV intensity ratio is unusually low. The N/C abundance ratio in these objects is found to be normal; the Al/C abundance ratio may be quite high. 38 references

  13. Gaze Estimation for Off-Angle Iris Recognition Based on the Biometric Eye Model

    Energy Technology Data Exchange (ETDEWEB)

    Karakaya, Mahmut [ORNL; Barstow, Del R [ORNL; Santos-Villalobos, Hector J [ORNL; Thompson, Joseph W [ORNL; Bolme, David S [ORNL; Boehnen, Chris Bensing [ORNL

    2013-01-01

    Iris recognition is among the highest accuracy biometrics. However, its accuracy relies on controlled high quality capture data and is negatively affected by several factors such as angle, occlusion, and dilation. Non-ideal iris recognition is a new research focus in biometrics. In this paper, we present a gaze estimation method designed for use in an off-angle iris recognition framework based on the ANONYMIZED biometric eye model. Gaze estimation is an important prerequisite step to correct off-angle iris images. To achieve an accurate frontal reconstruction of an off-angle iris image, we first need to estimate the eye gaze direction from elliptical features of the iris image. Typically, additional information such as well-controlled light sources, head-mounted equipment, and multiple cameras is not available. Our approach utilizes only the iris and pupil boundary segmentation, allowing it to be applicable to all iris capture hardware. We compare the boundaries with a look-up table generated using our biologically inspired biometric eye model and find the closest feature point in the look-up table to estimate the gaze. Based on the results from real images, the proposed method shows effectiveness in gaze estimation accuracy for our biometric eye model with an average error of approximately 3.5 degrees over a 50 degree range.
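
    The look-up-table step described above reduces, at its core, to a nearest-neighbour search over precomputed ellipse features. The sketch below, assuming the table has already been rendered from the eye model, illustrates that search only; the feature vector layout and array names are hypothetical.

```python
# Hedged sketch: estimate gaze by matching measured boundary-ellipse features
# to the closest entry of a model-generated look-up table.
import numpy as np

def estimate_gaze(measured, lut_features, lut_gazes):
    """measured: (D,) features from the segmented iris/pupil boundaries.
    lut_features: (N, D) features rendered from the biometric eye model.
    lut_gazes: (N, 2) gaze directions (azimuth, elevation) in degrees."""
    distances = np.linalg.norm(lut_features - measured, axis=1)
    return lut_gazes[np.argmin(distances)]     # gaze of the closest model entry
```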

  14. The fly's eye camera system

    Science.gov (United States)

    Mészáros, L.; Pál, A.; Csépány, G.; Jaskó, A.; Vida, K.; Oláh, K.; Mezö, G.

    2014-12-01

    We introduce the Fly's Eye Camera System, an all-sky monitoring device intended to perform time domain astronomy. This camera system design will provide complementary data sets for other synoptic sky surveys such as LSST or Pan-STARRS. The effective field of view is obtained by 19 cameras arranged in a spherical mosaic form. These individual cameras of the device stand on a hexapod mount that is fully capable of achieving sidereal tracking for the subsequent exposures. This platform has many advantages. First of all, it requires only one type of moving component and does not include unique parts. Hence this design not only eliminates problems implied by unique elements, but the redundancy of the hexapod allows smooth operations even if one or two of the legs are stuck. In addition, it can calibrate itself using observed stars, independently of both the geographical location (in either the northern or southern hemisphere) and the polar alignment of the full mount. All mechanical elements and electronics were designed in-house at our institute, Konkoly Observatory. Currently, our instrument is in the testing phase with an operating hexapod and a reduced number of cameras.

  15. Equilibrium contact angle or the most-stable contact angle?

    Science.gov (United States)

    Montes Ruiz-Cabello, F J; Rodríguez-Valverde, M A; Cabrerizo-Vílchez, M A

    2014-04-01

    It is well established that the equilibrium contact angle in a thermodynamic framework is an "unattainable" contact angle. Instead, the most-stable contact angle obtained from mechanical stimulation of the system is indeed experimentally accessible. Monitoring the susceptibility of a sessile drop to a mechanical stimulus makes it possible to identify the most stable drop configuration within the practical range of contact angle hysteresis. Two different stimuli may be used with sessile drops: mechanical vibration and tilting. The drop most stable against vibration should reveal the unchanging contact angle, whereas the drop most stable against gravity should show the greatest resistance to sliding. After the corresponding mechanical stimulus, once the excited drop configuration is examined, the focus is on the contact angle of the initial drop configuration. This methodology requires a substantial mapping of the static drop configurations with different stable contact angles. The most-stable contact angle, together with the advancing and receding contact angles, completes the description of the physically realizable configurations of a solid-liquid system. Since the most-stable contact angle is energetically significant, it may be used in the Wenzel, Cassie or Cassie-Baxter equations accordingly, or for surface energy evaluation. © 2013 Elsevier B.V. All rights reserved.

  16. Motorcycle detection and counting using stereo camera, IR camera, and microphone array

    Science.gov (United States)

    Ling, Bo; Gibson, David R. P.; Middleton, Dan

    2013-03-01

    Detection, classification, and characterization are the key to enhancing motorcycle safety, motorcycle operations and motorcycle travel estimation. Average motorcycle fatalities per Vehicle Mile Traveled (VMT) are currently estimated at 30 times those of auto fatalities. Although it has been an active research area for many years, motorcycle detection still remains a challenging task. Working with FHWA, we have developed a hybrid motorcycle detection and counting system using a suite of sensors including stereo camera, thermal IR camera and unidirectional microphone array. The IR thermal camera can capture the unique thermal signatures associated with the motorcycle's exhaust pipes that often show bright elongated blobs in IR images. The stereo camera in the system is used to detect the motorcyclist who can be easily windowed out in the stereo disparity map. If the motorcyclist is detected through his or her 3D body recognition, motorcycle is detected. Microphones are used to detect motorcycles that often produce low frequency acoustic signals. All three microphones in the microphone array are placed in strategic locations on the sensor platform to minimize the interferences of background noises from sources such as rain and wind. Field test results show that this hybrid motorcycle detection and counting system has an excellent performance.

  17. Hybrid Image Fusion for Sharpness Enhancement of Multi-Spectral Lunar Images

    Science.gov (United States)

    Awumah, Anna; Mahanti, Prasun; Robinson, Mark

    2016-10-01

    Image fusion enhances the sharpness of a multi-spectral (MS) image by incorporating spatial details from a higher-resolution panchromatic (Pan) image [1,2]. Known applications of image fusion for planetary images are rare, although image fusion is well-known for its applications to Earth-based remote sensing. In a recent work [3], six different image fusion algorithms were implemented and their performances were verified with images from the Lunar Reconnaissance Orbiter (LRO) Camera. The image fusion procedure obtained a high-resolution multi-spectral (HRMS) product from the LRO Narrow Angle Camera (used as Pan) and LRO Wide Angle Camera (used as MS) images. The results showed that the Intensity-Hue-Saturation (IHS) algorithm yields a product of high spatial quality, while the Wavelet-based image fusion algorithm best preserves spectral quality among all the algorithms. In this work we show the results of a hybrid IHS-Wavelet image fusion algorithm applied to LROC MS images. The hybrid method provides the best HRMS product, both in terms of spatial resolution and preservation of spectral details. Results from hybrid image fusion can enable new science and increase the science return from existing LROC images. [1] Pohl, C., and John L. Van Genderen. "Review article multisensor image fusion in remote sensing: concepts, methods and applications." International Journal of Remote Sensing 19.5 (1998): 823-854. [2] Zhang, Yun. "Understanding image fusion." Photogramm. Eng. Remote Sens 70.6 (2004): 657-661. [3] Mahanti, Prasun et al. "Enhancement of spatial resolution of the LROC Wide Angle Camera images." Archives, XXIII ISPRS Congress Archives (2016).
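
    As a concrete illustration of the IHS family of methods discussed above, the sketch below substitutes the Pan detail into the intensity of an upsampled three-band MS image. The mean-based intensity, the statistics matching of the Pan band, and the use of scikit-image resizing are assumptions for illustration, not the authors' exact pipeline.

```python
# Hedged sketch: simple IHS-style pan-sharpening of a 3-band MS image.
import numpy as np
from skimage.transform import resize

def ihs_fuse(ms, pan):
    """ms: (h, w, 3) float image in [0, 1]; pan: (H, W) float image in [0, 1]."""
    ms_up = resize(ms, pan.shape + (3,), order=1, anti_aliasing=False)
    intensity = ms_up.mean(axis=2)                      # crude intensity component
    # Match Pan statistics to the MS intensity, then inject the spatial detail.
    pan_matched = (pan - pan.mean()) / (pan.std() + 1e-9) * intensity.std() + intensity.mean()
    detail = pan_matched - intensity
    return np.clip(ms_up + detail[..., None], 0.0, 1.0)
```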

  18. Recent developments in X-ray and neutron small-angle scattering instrumentation and data analysis

    International Nuclear Information System (INIS)

    Schelten, J.

    1978-01-01

    The developments in instrumentation and data analysis that have occurred in the field of small-angle X-ray and neutron scattering since 1973 are reviewed. For X-rays, the cone camera collimation was invented, synchrotrons and storage rings were demonstrated to be intense sources of X-radiation, and one- and two-dimensional position-sensitive detectors were interfaced to cameras with both point and line collimation. For neutrons, the collimators and detectors on the Juelich and Grenoble machines were improved, new D11-type instruments were built or are under construction at several sites, double-crystal instruments were set up, and various new machines have been proposed. Significant progress in data analysis and evaluation has been made through application of mathematical techniques such as the use of spline functions, error minimization with constraints, and linear programming. Several special experiments, unusual in respect to the anisotropy of the scattering pattern, gravitational effects, moving scatterers, and dynamic fast time slicing, are discussed. (Auth.)

  19. Sky light polarization detection with linear polarizer triplet in light field camera inspired by insect vision.

    Science.gov (United States)

    Zhang, Wenjing; Cao, Yu; Zhang, Xuanzhe; Liu, Zejin

    2015-10-20

    The stable pattern of skylight polarization can be used for navigation, with advantages such as better anti-interference performance and the absence of a cumulative error effect. However, existing methods of skylight polarization measurement either have poor real-time performance or require a complex system. Inspired by the navigational capability of the Cataglyphis ant and its compound eyes, we introduce a new approach that acquires the all-sky image under different polarization directions with one camera and without a rotating polarizer, so as to detect the polarization pattern across the full sky in a single snapshot. Our system is based on a handheld light field camera with a wide-angle lens and a linear polarizer triplet placed over its aperture stop. Experimental results agree with the theoretical predictions. Both the real-time detection and the simple, low-cost architecture demonstrate the advantages of the approach proposed in this paper.
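
    With a polarizer triplet, the linear Stokes parameters of every sky pixel follow from the three intensity images in closed form. The sketch below assumes the three polarizers are oriented at 0, 60 and 120 degrees and that the sub-images have already been extracted and co-registered; those orientations are an assumption, since the abstract does not state them.

```python
# Hedged sketch: degree and angle of linear polarization from three polarizer images.
import numpy as np

def stokes_from_triplet(i0, i60, i120):
    """i0, i60, i120: co-registered intensity images behind 0/60/120 degree polarizers."""
    s0 = (2.0 / 3.0) * (i0 + i60 + i120)
    s1 = 2.0 * i0 - s0
    s2 = 2.0 * (i60 - i120) / np.sqrt(3.0)
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)   # degree of linear polarization
    aop = 0.5 * np.arctan2(s2, s1)                          # angle of polarization (radians)
    return dolp, aop
```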

  20. Intramyocardial arterial narrowing in dogs with subaortic stenosis.

    Science.gov (United States)

    Falk, T; Jönsson, L; Pedersen, H D

    2004-09-01

    Earlier studies have described intramyocardial arterial narrowing based on hyperplasia and hypertrophy of the vessel wall in dogs with subaortic stenosis (SAS). In theory, such changes might increase the risk of sudden death, as they seem to do in heart disease in other species. This retrospective pathological study describes and quantifies intramyocardial arterial narrowing in 44 dogs with naturally occurring SAS and in eight control dogs. The majority of the dogs with SAS died suddenly (n=27); nine had died or been euthanased with signs of heart failure and eight were euthanased without clinical signs. Dogs with SAS had significantly narrower intramyocardial arteries than the control dogs. Male dogs and those with more severe hypertrophy had more vessel narrowing (P=0.02 and P=0.02, respectively), whereas dogs with dilated hearts had slightly less pronounced arterial thickening (P=0.01). Arterial narrowing was not related to age, but fibrosis increased with age (P=0.047). Dogs that died suddenly did not have a greater number of arterial changes than other dogs with SAS. This study suggests that most dogs with SAS have intramyocardial arterial narrowing and that the risk of dying suddenly is not significantly related to the overall degree of vessel obliteration.

  1. Unified framework for recognition, localization and mapping using wearable cameras.

    Science.gov (United States)

    Vázquez-Martín, Ricardo; Bandera, Antonio

    2012-08-01

    Monocular approaches to simultaneous localization and mapping (SLAM) have recently addressed with success the challenging problem of the fast computation of dense reconstructions from a single, moving camera. Although these approaches initially relied on the detection of a reduced set of interest points to estimate the camera position and the map, they are now able to reconstruct dense maps from a handheld camera while the camera coordinates are simultaneously computed. However, these maps of 3-dimensional points usually remain meaningless, that is, with no memorable items and without providing a way of encoding spatial relationships between objects and paths. In humans and mobile robotics, landmarks play a key role in the internalization of a spatial representation of an environment. They are memorable cues that can serve to define a region of the space or the location of other objects. In a topological representation of the space, landmarks can be identified and located according to their structural, perceptual or semantic significance and distinctiveness. On the other hand, landmarks may be difficult to locate in a metric representation of the space. Restricted to the domain of visual landmarks, this work describes an approach where the map resulting from a point-based, monocular SLAM is annotated with the semantic information provided by a set of distinguished landmarks. Both features are obtained from the image. Hence, they can be linked by associating to each landmark all those point-based features that are superimposed on the landmark in a given image (key-frame). Visual landmarks will be obtained by means of an object-based, bottom-up attention mechanism, which will extract from the image a set of proto-objects. These proto-objects cannot always be associated with natural objects, but they will typically constitute significant parts of these scene objects and can be appropriately annotated with semantic information. Moreover, they will be

  2. Performance Evaluation of Thermographic Cameras for Photogrammetric Measurements

    Science.gov (United States)

    Yastikli, N.; Guler, E.

    2013-05-01

    The aim of this research is the performance evaluation of thermographic cameras for possible use in photogrammetric documentation and in analyses of deformation caused by moisture and insulation problems of historical and cultural heritage. To perform geometric calibration of the thermographic camera, a 3D test object was designed with 77 control points distributed at different depths. For the performance evaluation, a Flir A320 thermographic camera with 320 × 240 pixels and an 18 mm lens was used. A Nikon D3X SLR digital camera with 6048 × 4032 pixels and a 20 mm lens was used as the reference for comparison. The pixel size was 25 μm for the Flir A320 thermographic camera and 6 μm for the Nikon D3X SLR digital camera. Digital images of the 3D test object were recorded with both cameras and the image coordinates of the control points were measured. The geometric calibration parameters, including the focal length, position of the principal point, and radial and tangential distortions, were determined with additional parameters introduced in bundle block adjustments. The measurement of image coordinates and the bundle block adjustments with additional parameters were performed using the PHIDIAS digital photogrammetric system. The bundle block adjustment was then repeated with the determined calibration parameters for both the Flir A320 thermographic camera and the Nikon D3X SLR digital camera. The standard deviations of the measured image coordinates were 9.6 μm and 10.5 μm for the Flir A320 thermographic camera and 8.3 μm and 7.7 μm for the Nikon D3X SLR digital camera. The standard deviations obtained for the Flir A320 thermographic camera images thus reach almost the same accuracy level as the digital camera, despite a pixel size four times larger. The results of this research show that the interior geometry of the thermographic cameras and the lens distortion were modelled efficiently

  3. IMPLEMENTATION OF A REAL-TIME STACKING ALGORITHM IN A PHOTOGRAMMETRIC DIGITAL CAMERA FOR UAVS

    Directory of Open Access Journals (Sweden)

    A. Audi

    2017-08-01

    Full Text Available In the recent years, unmanned aerial vehicles (UAVs have become an interesting tool in aerial photography and photogrammetry activities. In this context, some applications (like cloudy sky surveys, narrow-spectral imagery and night-vision imagery need a longexposure time where one of the main problems is the motion blur caused by the erratic camera movements during image acquisition. This paper describes an automatic real-time stacking algorithm which produces a high photogrammetric quality final composite image with an equivalent long-exposure time using several images acquired with short-exposure times. Our method is inspired by feature-based image registration technique. The algorithm is implemented on the light-weight IGN camera, which has an IMU sensor and a SoC/FPGA. To obtain the correct parameters for the resampling of images, the presented method accurately estimates the geometrical relation between the first and the Nth image, taking into account the internal parameters and the distortion of the camera. Features are detected in the first image by the FAST detector, than homologous points on other images are obtained by template matching aided by the IMU sensors. The SoC/FPGA in the camera is used to speed up time-consuming parts of the algorithm such as features detection and images resampling in order to achieve a real-time performance as we want to write only the resulting final image to save bandwidth on the storage device. The paper includes a detailed description of the implemented algorithm, resource usage summary, resulting processing time, resulting images, as well as block diagrams of the described architecture. The resulting stacked image obtained on real surveys doesn’t seem visually impaired. Timing results demonstrate that our algorithm can be used in real-time since its processing time is less than the writing time of an image in the storage device. An interesting by-product of this algorithm is the 3D rotation

  4. Imaging capabilities of germanium gamma cameras

    International Nuclear Information System (INIS)

    Steidley, J.W.

    1977-01-01

    Quantitative methods of analysis based on the use of a computer simulation were developed and used to investigate the imaging capabilities of germanium gamma cameras. The main advantage of the computer simulation is that the inherent unknowns of clinical imaging procedures are removed from the investigation. The effects of patient scattered radiation were incorporated using a mathematical LSF model which was empirically developed and experimentally verified. Image modifying effects of patient motion, spatial distortions, and count rate capabilities were also included in the model. Spatial domain and frequency domain modeling techniques were developed and used in the simulation as required. The imaging capabilities of gamma cameras were assessed using low contrast lesion source distributions. The results showed that an improvement in energy resolution from 10% to 2% offers significant clinical advantages in terms of improved contrast, increased detectability, and reduced patient dose. The improvements are of greatest significance for small lesions at low contrast. The results of the computer simulation were also used to compare a design of a hypothetical germanium gamma camera with a state-of-the-art scintillation camera. The computer model performed a parametric analysis of the interrelated effects of inherent and technological limitations of gamma camera imaging. In particular, the trade-off between collimator resolution and collimator efficiency for detection of a given low contrast lesion was directly addressed. This trade-off is an inherent limitation of both gamma cameras. The image degrading effects of patient motion, camera spatial distortions, and low count rate were shown to modify the improvements due to better energy resolution. Thus, based on this research, the continued development of germanium cameras to the point of clinical demonstration is recommended

  5. Longitudinal changes of angle configuration in primary angle-closure suspects: the Zhongshan Angle-Closure Prevention Trial.

    Science.gov (United States)

    Jiang, Yuzhen; Chang, Dolly S; Zhu, Haogang; Khawaja, Anthony P; Aung, Tin; Huang, Shengsong; Chen, Qianyun; Munoz, Beatriz; Grossi, Carlota M; He, Mingguang; Friedman, David S; Foster, Paul J

    2014-09-01

    To determine longitudinal changes in angle configuration in the eyes of primary angle-closure suspects (PACS) treated by laser peripheral iridotomy (LPI) and in untreated fellow eyes. Longitudinal cohort study. Primary angle-closure suspects aged 50 to 70 years were enrolled in a randomized, controlled clinical trial. Each participant was treated by LPI in 1 randomly selected eye, with the fellow eye serving as a control. Angle width was assessed in a masked fashion using gonioscopy and anterior segment optical coherence tomography (AS-OCT) before and at 2 weeks, 6 months, and 18 months after LPI. Angle width in degrees was calculated from Shaffer grades assessed under static gonioscopy. Angle configuration was also evaluated using angle opening distance (AOD250, AOD500, AOD750), trabecular-iris space area (TISA500, TISA750), and angle recess area (ARA) measured in AS-OCT images. No significant difference was found in baseline measures of angle configuration between treated and untreated eyes. At 2 weeks after LPI, the drainage angle on gonioscopy widened from a mean of 13.5° at baseline to a mean of 25.7° in treated eyes, which was also confirmed by significant increases in all AS-OCT angle width measures; the subsequent changes in treated eyes did not reach significance for gonioscopy (P = 0.18), AOD250 (P = 0.167) and ARA (P = 0.83). In untreated eyes, angle width consistently decreased across all follow-up visits after LPI, with a more rapid longitudinal decrease compared with treated eyes (P values for all variables ≤0.003). The annual rate of change in angle width was equivalent to 1.2°/year (95% confidence interval [CI], 0.8-1.6) in treated eyes and 1.6°/year (95% CI, 1.3-2.0) in untreated eyes (P<0.001). Angle width of treated eyes increased markedly after LPI, remained stable for 6 months, and then decreased significantly by 18 months after LPI. Untreated eyes experienced a more consistent and rapid decrease in angle width over the same time period. Copyright © 2014 American Academy of Ophthalmology. Published by

  6. The influence of flip angle on the magic angle effect

    International Nuclear Information System (INIS)

    Zurlo, J.V.; Blacksin, M.F.; Karimi, S.

    2000-01-01

    Objective. To assess the impact of flip angle with gradient sequences on the ''magic angle effect''. We characterized the magic angle effect in various gradient echo sequences and compared the signal-to-noise ratios present on these sequences with the signal-to-noise ratios of spin echo sequences. Design. Ten normal healthy volunteers were positioned such that the flexor hallucis longus tendon remained at approximately 55° to the main magnetic field (the magic angle). The tendon was imaged by conventional spin echo T1- and T2-weighted techniques and by a series of gradient techniques. Gradient sequences were altered by both TE and flip angle. Signal-to-noise measurements were obtained at segments of the flexor hallucis longus tendon demonstrating the magic angle effect to quantify the artifact. Signal-to-noise measurements were compared and statistical analysis performed. Similar measurements were taken of the anterior tibialis tendon as an internal control. Results and conclusions. We demonstrated the magic angle effect on all the gradient sequences. The intensity of the artifact was affected by both the TE and flip angle. Low TE values and a high flip angle demonstrated the greatest magic angle effect. At TE values less than 30 ms, a high flip angle will markedly increase the magic angle effect. (orig.)

  7. Highly Tunable Narrow Bandpass MEMS Filter

    KAUST Repository

    Hafiz, Md Abdullah Al

    2017-07-07

    We demonstrate a proof-of-concept highly tunable narrow bandpass filter based on electrothermally and electrostatically actuated microelectromechanical-system (MEMS) resonators. The device consists of two mechanically uncoupled clamped-clamped arch resonators, designed such that their resonance frequencies are independently tuned to obtain the desired narrow passband. Through the electrothermal and electrostatic actuation, the stiffness of the structures is highly tunable. We experimentally demonstrate significant percentage tuning (~125%) of the filter center frequency by varying the applied electrothermal voltages to the resonating structures, while maintaining a narrow passband of 550 ± 50 Hz, a stopband rejection of >17 dB, and a passband ripple ≤ 2.5 dB. An analytical model based on the Euler-Bernoulli beam theory is used to confirm the behavior of the filter, and the origin of the high tunability using electrothermal actuation is discussed.

  8. Stereo Pinhole Camera: Assembly and experimental activities

    Directory of Open Access Journals (Sweden)

    Gilmário Barbosa Santos

    2015-05-01

    Full Text Available This work describes the assembly of a stereo pinhole camera for capturing stereo pairs of images and proposes experimental activities with it. A pinhole camera can be as sophisticated as you want, or so simple that it can be handcrafted practically from recyclable materials. This paper describes the practical use of the pinhole camera throughout history and at present. Aspects of the optics and geometry involved in building the stereo pinhole camera are presented with illustrations. Furthermore, experiments are proposed that use the images obtained with the camera for 3D visualization through a pair of anaglyph glasses, and the estimation of relative depth by triangulation is discussed.
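
    The triangulation mentioned at the end reduces, for two parallel identical pinholes, to a one-line relation between baseline, pinhole-to-film distance and disparity. The sketch below is a worked illustration with made-up numbers; the variable names and the parallel-camera assumption are ours, not the paper's.

```python
# Hedged sketch: relative depth from disparity for a parallel stereo pinhole pair.
def depth_from_disparity(baseline_m, focal_m, x_left_m, x_right_m):
    """baseline_m: separation of the two pinholes; focal_m: pinhole-to-film distance;
    x_left_m, x_right_m: horizontal image coordinates of the same point (metres)."""
    disparity = x_left_m - x_right_m          # shift of the point between the two images
    if disparity <= 0:
        raise ValueError("point must lie in front of the cameras")
    return baseline_m * focal_m / disparity

# Example: 6 cm baseline, 10 cm pinhole-to-film distance, 4 mm disparity -> 1.5 m away.
print(depth_from_disparity(0.06, 0.10, 0.002, -0.002))
```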

  9. Pellet ablation and cloud flow characteristics in the JIPP T-IIU plasma with the injection-angle controllable system

    International Nuclear Information System (INIS)

    Sakakita, H.; Sato, K.N.; Liang, R.; Hamada, Y.; Ando, A.; Kano, Y.; Sakamoto, M.

    1994-01-01

    Pellet ablation and the flow characteristics of the ablation cloud have been studied in the JIPP T-IIU plasma using an injection-angle controllable system. A new technique for an ice pellet injection system with controllable injection angle has been developed and installed on the JIPP T-IIU tokamak in order to vary the deposition profile of ice pellets within a plasma. The injection angle can be varied easily between two plasma shots in the course of an experiment, so that various basic experiments can be carried out by varying the pellet deposition profile. The injection angle has been varied poloidally from -6 to 6 degrees by changing the angle of the last-stage drift tube. This makes it possible to aim pellets at radii from about r = -2a/3 to r = 2a/3 of the plasma. From two-dimensional observations with CCD cameras, details of the pellet ablation structures at various injection angles have been studied, and a couple of interesting phenomena have been found. For injection angles (θ) larger than a certain value (θ ≥ 4°), a pellet penetrates straight through the plasma with a trace of straight ablation cloud, as expected from the usual theoretical considerations. On the other hand, a long helical tail of ablation light has been observed for angles smaller than this value (θ ≤ 4°). (author) 4 refs., 4 figs

  10. First results from the TOPSAT camera

    Science.gov (United States)

    Greenway, Paul; Tosh, Ian; Morris, Nigel; Burton, Gary; Cawley, Steve

    2017-11-01

    The TopSat camera is a low cost remote sensing imager capable of producing 2.5 metre resolution panchromatic imagery, funded by the British National Space Centre's Mosaic programme. The instrument was designed and assembled at the Space Science & Technology Department of the CCLRC's Rutherford Appleton Laboratory (RAL) in the UK, and was launched on the 27th October 2005 from Plesetsk Cosmodrome in Northern Russia on a Kosmos-3M. The camera utilises an off-axis three mirror system, which has the advantages of excellent image quality over a wide field of view, combined with a compactness that makes its overall dimensions smaller than its focal length. Keeping the costs to a minimum has been a major design driver in the development of this camera. The camera is part of the TopSat mission, which is a collaboration between four UK organisations; QinetiQ, Surrey Satellite Technology Ltd (SSTL), RAL and Infoterra. Its objective is to demonstrate provision of rapid response high resolution imagery to fixed and mobile ground stations using a low cost minisatellite. The paper "Development of the TopSat Camera" presented by RAL at the 5th ICSO in 2004 described the opto-mechanical design, assembly, alignment and environmental test methods implemented. Now that the spacecraft is in orbit and successfully acquiring images, this paper presents the first results from the camera and makes an initial assessment of the camera's in-orbit performance.

  11. Frontal Plane Modelling of Human Dynamics during Standing in Narrow-Stance

    Science.gov (United States)

    Sonobe, M.; Yamaguchi, H.; Hino, J.

    2016-09-01

    Standing-ride-type vehicles such as electric skateboards have been developed in recent years. Although these vehicles have the advantages of being compact and low cost due to their simple structure, their riding quality needs to be improved. Therefore, a system that helps riders keep their balance on a skateboard by feedback or feedforward control is required. To achieve this, a human balance model should be built that is as simple as possible. In this study, we focus on modelling human balance during standing when the support surface moves by a large amount. We restricted the model to the frontal plane and a narrow stance, because these restrictions allow us to assume a single-degree-of-freedom model. The balance control system is generally assumed to be a delayed feedback control system. The model was identified through impulse response and frequency response tests. As a result, we found that the acceleration of the skateboard and the posture angle become opposite in phase in the low frequency range.
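
    A single-degree-of-freedom, delayed-feedback model of the kind described above can be simulated directly. The sketch below uses an inverted pendulum in the frontal plane with a delayed PD ankle torque; all parameter values, the PD structure, and the explicit Euler integration are illustrative assumptions rather than the identified model from the paper.

```python
# Hedged sketch: frontal-plane balance as an inverted pendulum with delayed PD feedback.
import numpy as np

m, L, g = 70.0, 1.0, 9.81            # body mass (kg), CoM height (m), gravity (m/s^2)
I = m * L**2                          # point-mass inertia about the ankle axis
kp, kd, delay = 1200.0, 300.0, 0.1    # feedback gains and neural delay (s), all assumed
dt, T = 0.001, 5.0

n = int(T / dt)
n_delay = int(delay / dt)
theta = np.zeros(n)                   # posture angle (rad)
omega = np.zeros(n)
theta[0] = 0.02                       # small initial lean

for k in range(n - 1):
    th_d = theta[max(k - n_delay, 0)]            # delayed angle seen by the controller
    om_d = omega[max(k - n_delay, 0)]
    torque = -kp * th_d - kd * om_d              # delayed PD ankle torque
    alpha = (m * g * L * np.sin(theta[k]) + torque) / I
    omega[k + 1] = omega[k] + alpha * dt
    theta[k + 1] = theta[k] + omega[k + 1] * dt

print("final posture angle (rad):", theta[-1])   # should decay towards zero if stable
```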

  12. The Eye of the Camera

    NARCIS (Netherlands)

    van Rompay, Thomas Johannes Lucas; Vonk, Dorette J.; Fransen, M.L.

    2009-01-01

    This study addresses the effects of security cameras on prosocial behavior. Results from previous studies indicate that the presence of others can trigger helping behavior, arising from the need for approval of others. Extending these findings, the authors propose that security cameras can likewise

  13. Effects of pelvic rotation and needle angle on pubic arch interference during transperineal prostate implants

    International Nuclear Information System (INIS)

    Tincher, Sandra A.; Kim, Robert Y.; Ezekiel, Mark P.; Zinsli, Tom; Fiveash, John B.; Raben, David A.; Bueschen, Anton J.; Urban, Donald A.

    2000-01-01

    Purpose: Pubic arch interference due to an enlarged prostate gland or a narrow pubic arch is often a limiting factor in adequate prostate coverage during transperineal brachytherapy. The purpose of this study was to evaluate the effects of both pelvic rotation and needle angles on pubic arch interference using CT-based 3-D information. Methods and Materials: Seven patients had CT imaging in both supine and lithotomy positions and 3-D treatment planning was performed with three needle angles (20° downward, 0°, 20° upward). The pubic arch interference was then measured and comparisons were made for each needle trajectory and pelvic position. Results: Increasing pelvic rotation from supine to lithotomy position shows less pubic arch interference. Directing the needle tip upward shows less pubic arch interference in both supine and lithotomy positions when compared to needle tips directed downward. Conclusions: Both pelvic position and needle angles are important factors influencing pubic arch interference. Preplanning CT-based 3-D information may assist in individualized treatment planning in patients with significant bony interference, thus avoiding pubic arch interference during implantation

  14. Poster: A Software-Defined Multi-Camera Network

    OpenAIRE

    Chen, Po-Yen; Chen, Chien; Selvaraj, Parthiban; Claesen, Luc

    2016-01-01

    The widespread popularity of OpenFlow leads to a significant increase in the number of applications developed in SoftwareDefined Networking (SDN). In this work, we propose the architecture of a Software-Defined Multi-Camera Network consisting of small, flexible, economic, and programmable cameras which combine the functions of the processor, switch, and camera. A Software-Defined Multi-Camera Network can effectively reduce the overall network bandwidth and reduce a large amount of the Capex a...

  15. Gamma camera performance: technical assessment protocol

    Energy Technology Data Exchange (ETDEWEB)

    Bolster, A.A. [West Glasgow Hospitals NHS Trust, London (United Kingdom). Dept. of Clinical Physics; Waddington, W.A. [University College London Hospitals NHS Trust, London (United Kingdom). Inst. of Nuclear Medicine

    1996-12-31

    This protocol addresses the performance assessment of single and dual headed gamma cameras. No attempt is made to assess the performance of any associated computing systems. Evaluations are usually performed on a gamma camera commercially available within the United Kingdom and recently installed at a clinical site. In consultation with the manufacturer, GCAT selects the site and liaises with local staff to arrange a mutually convenient time for assessment. The manufacturer is encouraged to have a representative present during the evaluation. Three to four days are typically required for the evaluation team to perform the necessary measurements. When access time is limited, the team will modify the protocol to test the camera as thoroughly as possible. Data are acquired on the camera's computer system and are subsequently transferred to the independent GCAT computer system for analysis. This transfer from site computer to the independent system is effected via a hardware interface and Interfile data transfer. (author).

  16. Initial inflight calibration for Hayabusa2 optical navigation camera (ONC) for science observations of asteroid Ryugu

    Science.gov (United States)

    Suzuki, H.; Yamada, M.; Kouyama, T.; Tatsumi, E.; Kameda, S.; Honda, R.; Sawada, H.; Ogawa, N.; Morota, T.; Honda, C.; Sakatani, N.; Hayakawa, M.; Yokota, Y.; Yamamoto, Y.; Sugita, S.

    2018-01-01

    Hayabusa2, the first sample return mission to a C-type asteroid, was launched by the Japan Aerospace Exploration Agency (JAXA) on December 3, 2014 and will arrive at the asteroid in the middle of 2018 to collect samples from its surface, which may contain both hydrated minerals and organics. The optical navigation camera (ONC) system on board Hayabusa2 consists of three individual framing CCD cameras (ONC-T for a telescopic nadir view, ONC-W1 for a wide-angle nadir view, and ONC-W2 for a wide-angle slant view), which will be used to observe the surface of Ryugu. The cameras will be used to measure the global asteroid shape, local morphologies, and visible spectroscopic properties. Thus, image data obtained by ONC will provide essential information to select landing (sampling) sites on the asteroid. This study reports the results of initial inflight calibration based on observations of Earth, Mars, the Moon, and stars to verify and characterize the optical performance of the ONC, such as the flat-field sensitivity, spectral sensitivity, point-spread function (PSF), distortion, and stray light of ONC-T, and the distortion of ONC-W1 and W2. We found some potential problems that may influence our science observations. These include changes in the flat-field sensitivity of all bands relative to the pre-flight calibration measurements and the existence of stray light that arises under certain conditions of spacecraft attitude with respect to the sun. The countermeasures for these problems were evaluated using data obtained during initial in-flight calibration. The results of our inflight calibration indicate that the error of spectroscopic measurements around 0.7 μm using the 0.55, 0.70, and 0.86 μm bands of the ONC-T can be lower than 0.7% after these countermeasures and pixel binning. This result suggests that our ONC-T would be able to detect the typical strength (∼3%) of the serpentine absorption band often found on CM chondrites and low albedo asteroids with ≥ 4

  17. The Light Field Attachment: Turning a DSLR into a Light Field Camera Using a Low Budget Camera Ring

    KAUST Repository

    Wang, Yuwang

    2016-11-16

    We propose a concept for a lens attachment that turns a standard DSLR camera and lens into a light field camera. The attachment consists of 8 low-resolution, low-quality side cameras arranged around the central high-quality SLR lens. Unlike most existing light field camera architectures, this design provides a high-quality 2D image mode, while simultaneously enabling a new high-quality light field mode with a large camera baseline but little added weight, cost, or bulk compared with the base DSLR camera. From an algorithmic point of view, the high-quality light field mode is made possible by a new light field super-resolution method that first improves the spatial resolution and image quality of the side cameras and then interpolates additional views as needed. At the heart of this process is a super-resolution method that we call iterative Patch- And Depth-based Synthesis (iPADS), which combines patch-based and depth-based synthesis in a novel fashion. Experimental results obtained for both real captured data and synthetic data confirm that our method achieves substantial improvements in super-resolution for side-view images as well as the high-quality and view-coherent rendering of dense and high-resolution light fields.

  18. PERFORMANCE EVALUATION OF THERMOGRAPHIC CAMERAS FOR PHOTOGRAMMETRIC MEASUREMENTS

    Directory of Open Access Journals (Sweden)

    N. Yastikli

    2013-05-01

    Full Text Available The aim of this research is the performance evaluation of thermographic cameras for possible use in photogrammetric documentation and in analyses of deformation caused by moisture and insulation problems of historical and cultural heritage. To perform geometric calibration of the thermographic camera, a 3D test object was designed with 77 control points distributed at different depths. For the performance evaluation, a Flir A320 thermographic camera with 320 × 240 pixels and an 18 mm lens was used. A Nikon D3X SLR digital camera with 6048 × 4032 pixels and a 20 mm lens was used as the reference for comparison. The pixel size was 25 μm for the Flir A320 thermographic camera and 6 μm for the Nikon D3X SLR digital camera. Digital images of the 3D test object were recorded with both cameras and the image coordinates of the control points were measured. The geometric calibration parameters, including the focal length, position of the principal point, and radial and tangential distortions, were determined with additional parameters introduced in bundle block adjustments. The measurement of image coordinates and the bundle block adjustments with additional parameters were performed using the PHIDIAS digital photogrammetric system. The bundle block adjustment was then repeated with the determined calibration parameters for both the Flir A320 thermographic camera and the Nikon D3X SLR digital camera. The standard deviations of the measured image coordinates were 9.6 μm and 10.5 μm for the Flir A320 thermographic camera and 8.3 μm and 7.7 μm for the Nikon D3X SLR digital camera. The standard deviations obtained for the Flir A320 thermographic camera images thus reach almost the same accuracy level as the digital camera, despite a pixel size four times larger. The results of this research show that the interior geometry of the thermographic cameras and the lens distortion was

  19. High resolution RGB color line scan camera

    Science.gov (United States)

    Lynch, Theodore E.; Huettig, Fred

    1998-04-01

    This paper describes a color line scan camera family that is available with 6000, 8000 or 10000 pixels per color channel, utilizes off-the-shelf lenses, interfaces with currently available frame grabbers, includes on-board pixel-by-pixel offset correction, and is configurable and controllable via an RS232 serial port for computer-controlled or stand-alone operation. This line scan camera is based on an available 8000-element monochrome line scan camera designed by AOA for OEM use. The new color version includes improvements such as better packaging and additional user features which make the camera easier to use. The heart of the camera is a tri-linear CCD sensor with on-chip color balancing for maximum accuracy and pinned photodiodes for low-lag response. Each color channel is digitized to 12 bits and all three channels are multiplexed together, so that the resulting camera output video is either a 12- or 8-bit data stream at a rate of up to 24 Megapixels/sec. Conversion from 12 to 8 bits, or user-defined gamma, is accomplished by on-board user-defined video look-up tables. The camera has two user-selectable operating modes: a low-speed, high-sensitivity mode and a high-speed, reduced-sensitivity mode. The intended uses of the camera include industrial inspection, digital archiving, document scanning, and graphic arts applications.
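
    The 12-to-8-bit conversion through a user-defined look-up table mentioned above is simple to make concrete: one table of 4096 entries maps every possible 12-bit code to an 8-bit output. The sketch below builds such a table with a gamma curve; the gamma value and the numpy implementation are illustrative assumptions, not the camera's firmware.

```python
# Hedged sketch: a 4096-entry gamma look-up table mapping 12-bit codes to 8-bit output.
import numpy as np

def gamma_lut_12_to_8(gamma=2.2):
    codes = np.arange(4096) / 4095.0                        # normalized 12-bit input
    return np.round(255.0 * codes ** (1.0 / gamma)).astype(np.uint8)

lut = gamma_lut_12_to_8()
pixels_12bit = np.array([0, 256, 2048, 4095])               # example raw sensor values
print(lut[pixels_12bit])                                    # corresponding 8-bit output
```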

  20. An Open Standard for Camera Trap Data

    Directory of Open Access Journals (Sweden)

    Tavis Forrester

    2016-12-01

    Full Text Available Camera traps that capture photos of animals are a valuable tool for monitoring biodiversity. The use of camera traps is rapidly increasing and there is an urgent need for standardization to facilitate data management, reporting and data sharing. Here we offer the Camera Trap Metadata Standard as an open data standard for storing and sharing camera trap data, developed by experts from a variety of organizations. The standard captures information necessary to share data between projects and offers a foundation for collecting the more detailed data needed for advanced analysis. The data standard captures information about study design, the type of camera used, and the location and species names for all detections in a standardized way. This information is critical for accurately assessing results from individual camera trapping projects and for combining data from multiple studies for meta-analysis. This data standard is an important step in aligning camera trapping surveys with best practices in data-intensive science. Ecology is moving rapidly into the realm of big data, and central data repositories are becoming a critical tool and are emerging for camera trap data. This data standard will help researchers standardize data terms, align past data to new repositories, and provide a framework for utilizing data across repositories and research projects to advance animal ecology and conservation.

  1. Structure of polyacrylic acid and polymethacrylic acid solutions: a small angle neutron scattering study

    Energy Technology Data Exchange (ETDEWEB)

    Moussaid, A. (Lab. d' Ultrasons et de Dynamique des Fluides Complexes, Univ. Louis Pasteur, 67 - Strasbourg (France)); Schosseler, F. (Lab. d' Ultrasons et de Dynamique des Fluides Complexes, Univ. Louis Pasteur, 67 - Strasbourg (France)); Munch, J.P. (Lab. d' Ultrasons et de Dynamique des Fluides Complexes, Univ. Louis Pasteur, 67 - Strasbourg (France)); Candau, S.J. (Lab. d' Ultrasons et de Dynamique des Fluides Complexes, Univ. Louis Pasteur, 67 - Strasbourg (France))

    1993-04-01

    The intensity scattered from polyacrylic acid and polymethacrylic acid solutions has been measured by small angle neutron scattering experiments. The influence of polymer concentration, ionization degree, temperature and salt content has been investigated. Results are in qualitative agreement with a model which predicts the existence of microphases in the unstable region of the phase diagram. Quantitative comparison with the theory is performed by fitting the theoretical structure factor to the experimental data. For a narrow range of ionization degrees nearly quantitative agreement with the theory is found for the polyacrylic acid system. (orig.).

  2. Structure of polyacrylic acid and polymethacrylic acid solutions : a small angle neutron scattering study

    Science.gov (United States)

    Moussaid, A.; Schosseler, F.; Munch, J. P.; Candau, S. J.

    1993-04-01

    The intensity scattered from polyacrylic acid and polymethacrylic acid solutions has been measured by small angle neutron scattering experiments. The influence of polymer concentration, ionization degree, temperature and salt content has been investigated. Results are in qualitative agreement with a model which predicts the existence of microphases in the unstable region of the phase diagram. Quantitative comparison with the theory is performed by fitting the theoretical structure factor to the experimental data. For a narrow range of ionization degrees nearly quantitative agreement with the theory is found for the polyacrylic acid system.

  3. Dose evaluation of narrow-beam

    International Nuclear Information System (INIS)

    Goto, Shinichi

    1999-01-01

    The reliability of the dose from narrow photon beams becomes more important as single high-dose-rate radiosurgery becomes popular. Evaluation of the optimal dose is difficult due to the absence of lateral electronic equilibrium. The data necessary for a treatment regimen are the TMR (tissue maximum ratio), OCR (off center ratio) and S c,p (total scatter factor). The narrow beam was a 10 MV X-ray beam from a Varian Clinac 2100C equipped with a cylindrical Fischer collimator CBI system. Detection was performed with Kodak XV-2 film, a PTW natural diamond detector M60003, a Scanditronics silicon detector EDD-5 or a Fujitec micro-chamber FDC-9.4C. The phantoms were a water-equivalent phantom (PTW, RW3), a water phantom (PTW, MP3 system) and the Wellhofer WP600 system. The factors above were measured, revealing that in narrow-photon-beam dose evaluation TMR should be measured with the micro-chamber, OCR with film, and S c,p with both. The use of the diamond detector is recommended for more precise measurement and evaluation of the dose. The importance of the water phantom in the radiosurgery system was also shown. (K.H.)

  4. Camera network video summarization

    Science.gov (United States)

    Panda, Rameswar; Roy-Chowdhury, Amit K.

    2017-05-01

    Networks of vision sensors are deployed in many settings, ranging from security needs to disaster response to environmental monitoring. Many of these setups have hundreds of cameras and tens of thousands of hours of video. The difficulty of analyzing such a massive volume of video data is apparent whenever there is an incident that requires foraging through vast video archives to identify events of interest. As a result, video summarization, which automatically extracts a brief yet informative summary of these videos, has attracted intense attention in recent years. Much progress has been made in developing a variety of ways to summarize a single video in the form of a key sequence or video skim. However, generating a summary from a set of videos captured in a multi-camera network remains a novel and largely under-addressed problem. In this paper, with the aim of summarizing videos in a camera network, we introduce a novel representative selection approach via joint embedding and capped l21-norm minimization. The objective function is two-fold. The first is to capture the structural relationships of data points in a camera network via an embedding, which helps in characterizing the outliers and also in extracting a diverse set of representatives. The second is to use a capped l21-norm to model the sparsity and to suppress the influence of data outliers in representative selection. We propose to jointly optimize both of the objectives, such that the embedding can not only characterize the structure, but also indicate the requirements of sparse representative selection. Extensive experiments on standard multi-camera datasets well demonstrate the efficacy of our method over state-of-the-art methods.

  5. The use of a portable gamma camera for preoperative lymphatic mapping: a comparison with a conventional gamma camera

    Energy Technology Data Exchange (ETDEWEB)

    Vidal-Sicart, Sergi; Paredes, Pilar [Hospital Clinic Barcelona, Nuclear Medicine Department (CDIC), Barcelona (Spain); Institut d' Investigacio Biomedica Agusti Pi Sunyer (IDIBAPS), Barcelona (Spain); Vermeeren, Lenka; Valdes-Olmos, Renato A. [Netherlands Cancer Institute-Antoni van Leeuwenhoek Hospital (NKI-AVL), Nuclear Medicine Department, Amsterdam (Netherlands); Sola, Oriol [Hospital Clinic Barcelona, Nuclear Medicine Department (CDIC), Barcelona (Spain)

    2011-04-15

    Planar lymphoscintigraphy is routinely used for preoperative sentinel node visualization, but large gamma cameras are not always available. We evaluated the reproducibility of lymphatic mapping with a smaller and portable gamma camera. In two centres, 52 patients with breast cancer received preoperative lymphoscintigraphy with a conventional gamma camera with a field of view of 40 x 40 cm. Static anterior and lateral images were performed at 15 min, 2 h and 4 h after injection of the radiotracer ({sup 99m}Tc-nanocolloid). At 2 h after injection, anterior and oblique images were also performed with a portable gamma camera (Sentinella, Oncovision) positioned to obtain a field of view of 20 x 20 cm. Visualization of lymphatic drainage on conventional images and images with the portable device were compared for number of nodes depicted, their intensity and localization of sentinel nodes. The images performed with the conventional gamma camera depicted sentinel nodes in 94%, while the portable gamma camera showed drainage in 73%. There was however no significant difference in visualization between the two devices when a lead shield was used to mask the injection area in 43 patients (95 vs 88%, p = 0.25). Second-echelon nodes were visualized in 62% of the patients with the conventional gamma camera and in 29% of the cases with the portable gamma camera. Preoperative imaging with a portable gamma camera fitted with a pinhole collimator to obtain a field of view of 20 x 20 cm is able to depict sentinel nodes in 88% of the cases, if a lead shield is used to mask the injection site. This device may be useful in centres without the possibility to perform a preoperative image. (orig.)

  6. The use of a portable gamma camera for preoperative lymphatic mapping: a comparison with a conventional gamma camera

    International Nuclear Information System (INIS)

    Vidal-Sicart, Sergi; Paredes, Pilar; Vermeeren, Lenka; Valdes-Olmos, Renato A.; Sola, Oriol

    2011-01-01

    Planar lymphoscintigraphy is routinely used for preoperative sentinel node visualization, but large gamma cameras are not always available. We evaluated the reproducibility of lymphatic mapping with a smaller and portable gamma camera. In two centres, 52 patients with breast cancer received preoperative lymphoscintigraphy with a conventional gamma camera with a field of view of 40 x 40 cm. Static anterior and lateral images were performed at 15 min, 2 h and 4 h after injection of the radiotracer ( 99m Tc-nanocolloid). At 2 h after injection, anterior and oblique images were also performed with a portable gamma camera (Sentinella, Oncovision) positioned to obtain a field of view of 20 x 20 cm. Visualization of lymphatic drainage on conventional images and images with the portable device were compared for number of nodes depicted, their intensity and localization of sentinel nodes. The images performed with the conventional gamma camera depicted sentinel nodes in 94%, while the portable gamma camera showed drainage in 73%. There was however no significant difference in visualization between the two devices when a lead shield was used to mask the injection area in 43 patients (95 vs 88%, p = 0.25). Second-echelon nodes were visualized in 62% of the patients with the conventional gamma camera and in 29% of the cases with the portable gamma camera. Preoperative imaging with a portable gamma camera fitted with a pinhole collimator to obtain a field of view of 20 x 20 cm is able to depict sentinel nodes in 88% of the cases, if a lead shield is used to mask the injection site. This device may be useful in centres without the possibility to perform a preoperative image. (orig.)

  7. Autonomous Multicamera Tracking on Embedded Smart Cameras

    Directory of Open Access Journals (Sweden)

    Bischof Horst

    2007-01-01

    Full Text Available There is currently a strong trend towards the deployment of advanced computer vision methods on embedded systems. This deployment is very challenging since embedded platforms often provide limited resources such as computing performance, memory, and power. In this paper we present a multicamera tracking method on distributed, embedded smart cameras. Smart cameras combine video sensing, processing, and communication on a single embedded device, which is equipped with a multiprocessor computation and communication infrastructure. Our multicamera tracking approach focuses on a fully decentralized handover procedure between adjacent cameras. The basic idea is to initiate a single tracking instance in the multicamera system for each object of interest. The tracker follows the supervised object over the camera network, migrating to the camera which observes the object. Thus, no central coordination is required, resulting in an autonomous and scalable tracking approach. We have fully implemented this novel multicamera tracking approach on our embedded smart cameras. Tracking is achieved by the well-known CamShift algorithm; the handover procedure is realized using a mobile agent system available on the smart camera network. Our approach has been successfully evaluated by tracking persons on our campus.
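
    As an illustration of the single-tracker-per-object idea, the Python/OpenCV sketch below runs CamShift on one camera and uses a naive handover trigger: when the tracked window reaches the image border, the tracker state (here just the colour histogram) is returned so a neighbouring camera can take over. In the system described above this migration is performed by a mobile agent over the camera network; the function and variable names here are hypothetical.

        import cv2

        def track_on_camera(cap, roi_hist, window):
            """Track an object on one camera until it leaves the field of view.
            Returns the colour histogram so the next camera can resume tracking."""
            term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
            while True:
                ok, frame = cap.read()
                if not ok:
                    return None
                hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
                back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
                _, window = cv2.CamShift(back_proj, window, term)
                x, y, w, h = window
                # naive handover trigger: the object has reached the image border
                if x <= 0 or y <= 0 or x + w >= frame.shape[1] or y + h >= frame.shape[0]:
                    return roi_hist   # hand this state over to the adjacent camera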

  8. Nd:Yag laser iridotomy in Shaffer-Etienne grade 1 and 2: angle widening in our case studies

    Directory of Open Access Journals (Sweden)

    Sandra Cinzia Carlesimo

    2015-08-01

    Full Text Available AIM: To obtain widening of a potentially occludable angle, according to Kanski's indications, through preventive Nd:Yag laser iridotomy. The observational study used gonioscopy for the selection and follow-up of 1165 treated eyes, with the Shaffer-Etienne gonioscopic classification as a quality/quantity test of the angle recession. METHODS: Between September 2000 and July 2012, 586 patients were selected at the Outpatients' Ophthalmological Clinic of the Policlinico Umberto I of Rome to undergo Nd:Yag laser iridotomy. A Goldmann-type contact lens, Q-switched mode, 2-3 defocus, and 7-9 mJ intensity with 2-3 impulse discharges were used for surgery. RESULTS: From as early as the first week, a whole 360° angle widening was evident in the patients, showing the success of Nd:Yag laser iridotomy in solving relative pupil block. The angle remained narrow over 270° in only 14 eyes, despite repetition of laser iridotomy in a different part of the iris, twice in 10 eyes and three times in 4 eyes. CONCLUSION: Nd:Yag laser iridotomy proved to be a safe and effective treatment for widening critical Shaffer-Etienne grade 1 and 2 potentially occludable angles.

  9. Video Sharing System Based on Wi-Fi Camera

    OpenAIRE

    Qidi Lin; Hewei Yu; Jinbin Huang; Weile Liang

    2015-01-01

    This paper introduces a video sharing platform based on WiFi, which consists of a camera, a mobile phone and a PC server. The platform can receive the wireless signal from the camera and show on the mobile phone the live video captured by the camera. In addition, it is able to send commands to the camera and make the camera's holder rotate. The platform can be applied to interactive teaching, monitoring of dangerous areas and so on. Testing results show that the platform can share ...

  10. EDICAM (Event Detection Intelligent Camera)

    Energy Technology Data Exchange (ETDEWEB)

    Zoletnik, S. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Szabolics, T., E-mail: szabolics.tamas@wigner.mta.hu [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Kocsis, G.; Szepesi, T.; Dunai, D. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary)

    2013-10-15

    Highlights: ► We present EDICAM's hardware modules. ► We present EDICAM's main design concepts. ► This paper will describe EDICAM firmware architecture. ► Operation principles description. ► Further developments. -- Abstract: A new type of fast framing camera has been developed for fusion applications by the Wigner Research Centre for Physics during the last few years. A new concept was designed for intelligent event driven imaging which is capable of focusing image readout to Regions of Interest (ROIs) where and when predefined events occur. At present these events mean intensity changes and external triggers, but in the future more sophisticated methods might also be defined. The camera provides 444 Hz frame rate at full resolution of 1280 × 1024 pixels, but monitoring of smaller ROIs can be done in the 1–116 kHz range even during exposure of the full image. Keeping space limitations and the harsh environment in mind, the camera is divided into a small Sensor Module and a processing card interconnected by a fast 10 Gbit optical link. This camera hardware has been used for passive monitoring of the plasma in different devices, for example at ASDEX Upgrade and COMPASS, with the first version of its firmware. The new firmware and software package is now available and ready for testing the new event processing features. This paper will present the operation principle and features of the Event Detection Intelligent Camera (EDICAM). The device is intended to be the central element in the 10-camera monitoring system of the Wendelstein 7-X stellarator.
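
    As a quick plausibility check of the data rates quoted above, the short Python calculation below reproduces the half-hour volume at full resolution and shows how a small region of interest read out much faster stays manageable; the 64 x 64 ROI and 50 kHz readout are arbitrary illustrative values within the stated 1–116 kHz range.

        # 12-bit pixels at full resolution, 444 frames per second
        full_frame_bytes = 1280 * 1024 * 12 / 8
        rate_full = full_frame_bytes * 444                     # bytes per second
        print(rate_full * 1800 / 2**40, "TiB per half hour")   # ~1.43

        # a small ROI read out at 50 kHz
        roi_bytes = 64 * 64 * 12 / 8
        print(roi_bytes * 50_000 / 2**20, "MiB/s")             # ~293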

  11. An Open Standard for Camera Trap Data

    NARCIS (Netherlands)

    Forrester, Tavis; O'Brien, Tim; Fegraus, Eric; Jansen, P.A.; Palmer, Jonathan; Kays, Roland; Ahumada, Jorge; Stern, Beth; McShea, William

    2016-01-01

    Camera traps that capture photos of animals are a valuable tool for monitoring biodiversity. The use of camera traps is rapidly increasing and there is an urgent need for standardization to facilitate data management, reporting and data sharing. Here we offer the Camera Trap Metadata Standard as an

  12. Polarizing aperture stereoscopic cinema camera

    Science.gov (United States)

    Lipton, Lenny

    2012-07-01

    The art of stereoscopic cinematography has been held back because of the lack of a convenient way to reduce the stereo camera lenses' interaxial to less than the distance between the eyes. This article describes a unified stereoscopic camera and lens design that allows for varying the interaxial separation to small values using a unique electro-optical polarizing aperture design for imaging left and right perspective views onto a large single digital sensor, the size of the standard 35 mm frame, with the means to select left and right image information. Even with the added stereoscopic capability, the appearance of existing camera bodies will be unaltered.

  13. Control system for gamma camera

    International Nuclear Information System (INIS)

    Miller, D.W.

    1977-01-01

    An improved gamma camera arrangement is described which utilizes a solid-state detector formed of high-purity germanium. The central arrangement of the camera carries out a trapezoidal filtering operation over antisymmetrically summed spatial signals through gated integration procedures using idealized integrating intervals. By simultaneously carrying out peak energy evaluation of the input signals, desirable control over pulse pile-up phenomena is achieved. Additionally, through the use of the time derivative of incoming pulse or signal energy information to initially enable the control system, a low-level information evaluation is provided, serving to enhance the signal processing efficiency of the camera.

  14. Scintillating camera

    International Nuclear Information System (INIS)

    Vlasbloem, H.

    1976-01-01

    The invention relates to a scintillating camera and in particular to an apparatus for determining the position coordinates of a light pulse emitting point on the anode of an image intensifier tube which forms part of a scintillating camera, comprising at least three photomultipliers which are positioned to receive light emitted by the anode screen on their photocathodes, circuit means for processing the output voltages of the photomultipliers to derive voltages that are representative of the position coordinates; a pulse-height discriminator circuit adapted to be fed with the sum voltage of the output voltages of the photomultipliers for gating the output of the processing circuit when the amplitude of the sum voltage of the output voltages of the photomultipliers lies in a predetermined amplitude range, and means for compensating the distortion introduced in the image on the anode screen

  15. The "All Sky Camera Network"

    Science.gov (United States)

    Caldwell, Andy

    2005-01-01

    In 2001, the "All Sky Camera Network" came to life as an outreach program to connect the Denver Museum of Nature and Science (DMNS) exhibit "Space Odyssey" with Colorado schools. The network is comprised of cameras placed strategically at schools throughout Colorado to capture fireballs--rare events that produce meteorites.…

  16. Initial laboratory evaluation of color video cameras: Phase 2

    Energy Technology Data Exchange (ETDEWEB)

    Terry, P.L.

    1993-07-01

    Sandia National Laboratories has considerable experience with monochrome video cameras used in alarm assessment video systems. Most of these systems, used for perimeter protection, were designed to classify rather than to identify intruders. The monochrome cameras were selected over color cameras because they have greater sensitivity and resolution. There is a growing interest in the identification function of security video systems for both access control and insider protection. Because color camera technology is rapidly changing and because color information is useful for identification purposes, Sandia National Laboratories has established an on-going program to evaluate the newest color solid-state cameras. Phase One of the Sandia program resulted in the SAND91-2579/1 report titled: Initial Laboratory Evaluation of Color Video Cameras. The report briefly discusses imager chips, color cameras, and monitors, describes the camera selection, details traditional test parameters and procedures, and gives the results reached by evaluating 12 cameras. Here, in Phase Two of the report, we tested 6 additional cameras using traditional methods. In addition, all 18 cameras were tested by newly developed methods. This Phase 2 report details those newly developed test parameters and procedures, and evaluates the results.

  17. Movement-based Interaction in Camera Spaces

    DEFF Research Database (Denmark)

    Eriksson, Eva; Riisgaard Hansen, Thomas; Lykke-Olesen, Andreas

    2006-01-01

    In this paper we present three concepts that address movement-based interaction using camera tracking. Based on our work with several movement-based projects we present four selected applications, and use these applications to leverage our discussion and to describe our three main concepts: space, relations, and feedback. We see these as central for describing and analysing movement-based systems using camera tracking, and we show how these three concepts can be used to analyse other camera tracking applications.

  18. Photogrammetric Applications of Immersive Video Cameras

    OpenAIRE

    Kwiatek, K.; Tokarczyk, R.

    2014-01-01

    The paper investigates immersive videography and its application in close-range photogrammetry. Immersive video involves the capture of a live-action scene that presents a 360° field of view. It is recorded simultaneously by multiple cameras or microlenses, where the principal point of each camera is offset from the rotating axis of the device. This issue causes problems when stitching together individual frames of video separated from particular cameras, however there are ways to ov...

  19. 21 CFR 886.1120 - Ophthalmic camera.

    Science.gov (United States)

    2010-04-01

    ... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Ophthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding area...

  20. Full-parallax 3D display from stereo-hybrid 3D camera system

    Science.gov (United States)

    Hong, Seokmin; Ansari, Amir; Saavedra, Genaro; Martinez-Corral, Manuel

    2018-04-01

    In this paper, we propose an innovative approach for the production of microimages ready to display on an integral-imaging monitor. Our main contribution is the use of a stereo-hybrid 3D camera system for picking up a 3D data pair and composing a denser point cloud. However, there is an intrinsic difficulty in the fact that hybrid sensors have dissimilarities and therefore should be equalized. The processed data facilitate generating an integral image by computationally projecting the information through a virtual pinhole array. We illustrate this procedure with imaging experiments that provide microimages with enhanced quality. After projection of such microimages onto the integral-imaging monitor, 3D images are produced with large parallax and viewing angle.

  1. The effect of narrow provider networks on health care use.

    Science.gov (United States)

    Atwood, Alicia; Lo Sasso, Anthony T

    2016-12-01

    Network design is an often overlooked aspect of health insurance contracts. Recent policy factors have resulted in narrower provider networks. We provide plausibly causal evidence on the effect of narrow network plans offered by a large national health insurance carrier in a major metropolitan market. Our econometric design exploits the fact that some firms offer a narrow network plan to their employees and some do not. Our results show that narrow network health plans lead to reductions in health care utilization and spending. We find evidence that narrow networks save money by selecting lower cost providers into the network. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. A study on the angle between the abdominal aorta and the superior mesenteric artery by 3D image reconstruction

    International Nuclear Information System (INIS)

    Kim, Young Keun; Choi, Sung Kwan

    2003-01-01

    SMAS (superior mesenteric artery syndrome) is a disease caused by a chronic obstruction of the duodenum (transverse portion), which is hard to detect. However, it is known that when the superior mesenteric artery and the abdominal aorta form a narrow angle, the transverse portion of the duodenum is compressed between the superior mesenteric artery and the abdominal aorta, and that this can lead to obstruction of the duodenum. Measuring this angle with conventional angiography is complicated, and the results often turn out to be inaccurate. In addition, no attempt has been made to determine the value of this angle in Koreans. In this study, we conducted abdominal CT angiography using MIP (maximum intensity projection) on patients with no clinical evidence of SMAS in order to determine the angle at which the superior mesenteric artery branches from the abdominal aorta, using PC-based software (Rapidia ver. 1.2) for the image reconstruction. We found that the mean angle between the abdominal aorta and the superior mesenteric artery was 50.05 ± 15.87°, and that the angle in men (53.64 ± 16.57°) is larger than in women (46.46 ± 14.98°). We hope that the angles determined by our study will serve as an important indicator for detecting SMAS.
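
    The angle itself is simply the angle between two vessel direction vectors anchored at the origin of the superior mesenteric artery, which is easy to reproduce on coordinates picked from a 3D reconstruction. A minimal Python sketch with purely illustrative coordinates:

        import numpy as np

        def vessel_angle(origin, p_aorta, p_sma):
            """Aortomesenteric angle (degrees) from three points: the SMA origin,
            a point on the aorta distal to it, and a point along the proximal SMA."""
            u = np.asarray(p_aorta, float) - np.asarray(origin, float)
            v = np.asarray(p_sma, float) - np.asarray(origin, float)
            cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

        # made-up coordinates in mm: aorta running caudally, SMA angling forward
        print(vessel_angle((0, 0, 0), (0, 0, -50), (0, 35, -40)))   # ~41 degrees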

  3. Longitudinal Changes of Angle Configuration in Primary Angle-Closure Suspects

    Science.gov (United States)

    Jiang, Yuzhen; Chang, Dolly S.; Zhu, Haogang; Khawaja, Anthony P.; Aung, Tin; Huang, Shengsong; Chen, Qianyun; Munoz, Beatriz; Grossi, Carlota M.

    2015-01-01

    Objective To determine longitudinal changes in angle configuration in the eyes of primary angle-closure suspects (PACS) treated by laser peripheral iridotomy (LPI) and in untreated fellow eyes. Design Longitudinal cohort study. Participants Primary angle-closure suspects aged 50 to 70 years were enrolled in a randomized, controlled clinical trial. Methods Each participant was treated by LPI in 1 randomly selected eye, with the fellow eye serving as a control. Angle width was assessed in a masked fashion using gonioscopy and anterior segment optical coherence tomography (AS-OCT) before and at 2 weeks, 6 months, and 18 months after LPI. Main Outcome Measures Angle width in degrees was calculated from Shaffer grades assessed under static gonioscopy. Angle configuration was also evaluated using angle opening distance (AOD250, AOD500, AOD750), trabecular-iris space area (TISA500, TISA750), and angle recess area (ARA) measured in AS-OCT images. Results No significant difference was found in baseline measures of angle configuration between treated and untreated eyes. At 2 weeks after LPI, the drainage angle on gonioscopy widened from a mean of 13.5° at baseline to a mean of 25.7° in treated eyes, which was also confirmed by significant increases in all AS-OCT angle width measures (Pgonioscopy (P = 0.18), AOD250 (P = 0.167) and ARA (P = 0.83). In untreated eyes, angle width consistently decreased across all follow-up visits after LPI, with a more rapid longitudinal decrease compared with treated eyes (P values for all variables ≤0.003). The annual rate of change in angle width was equivalent to 1.2°/year (95% confidence interval [CI], 0.8–1.6) in treated eyes and 1.6°/year (95% CI, 1.3–2.0) in untreated eyes (P<0.001). Conclusions Angle width of treated eyes increased markedly after LPI, remained stable for 6 months, and then decreased significantly by 18 months after LPI. Untreated eyes experienced a more consistent and rapid decrease in angle width over the

  4. Solutions on a high-speed wide-angle zoom lens with aspheric surfaces

    Science.gov (United States)

    Yamanashi, Takanori

    2012-10-01

    Recent development in CMOS and digital camera technology has accelerated the business and market share of digital cinematography. In terms of optical design, this technology has increased the need to carefully consider pixel pitch and characteristics of the imager. When the field angle at the wide end, zoom ratio, and F-number are specified, choosing an appropriate zoom lens type is crucial. In addition, appropriate power distributions and lens configurations are required. At points near the wide end of a zoom lens, it is known that an aspheric surface is an effective means to correct off-axis aberrations. On the other hand, optical designers have to focus on manufacturability of aspheric surfaces and perform required analysis with respect to the surface shape. Centration errors aside, it is also important to know the sensitivity to aspheric shape errors and their effect on image quality. In this paper, wide angle cine zoom lens design examples are introduced and their main characteristics are described. Moreover, technical challenges are pointed out and solutions are proposed.

  5. Influence of anatomic landmarks in the virtual environment on simulated angled laparoscope navigation

    Science.gov (United States)

    Christie, Lorna S.; Goossens, Richard H. M.; de Ridder, Huib; Jakimowicz, Jack J.

    2010-01-01

    Background The aim of this study is to investigate the influence of the presence of anatomic landmarks on the performance of angled laparoscope navigation on the SimSurgery SEP simulator. Methods Twenty-eight experienced laparoscopic surgeons (familiar with the 30° angled laparoscope, >100 basic laparoscopic procedures, >5 advanced laparoscopic procedures) and 23 novices (no laparoscopy experience) performed the Camera Navigation task in an abstract virtual environment (CN-box) and in a virtual representation of the lower abdomen (CN-abdomen). They also rated the realism and added value of the virtual environments on seven-point scales. Results Within both groups, the CN-box task was accomplished in less time and with shorter tip trajectory than the CN-abdomen task (Wilcoxon test, p < 0.05). In both groups, the CN tasks were perceived as hard work and more challenging than anticipated. Conclusions Performance of the angled laparoscope navigation task is influenced by the virtual environment surrounding the exercise. The task was performed better in an abstract environment than in a virtual environment with anatomic landmarks. More insight is required into the influence and function of different types of intrinsic and extrinsic feedback on the effectiveness of preclinical simulator training. PMID:20419318

  6. Prism-based single-camera system for stereo display

    Science.gov (United States)

    Zhao, Yue; Cui, Xiaoyu; Wang, Zhiguo; Chen, Hongsheng; Fan, Heyu; Wu, Teresa

    2016-06-01

    This paper combines a prism with a single camera and puts forward a low-cost method of stereo imaging. First, according to the principles of geometrical optics, we deduce the relationship between the prism single-camera system and a dual-camera system, and according to the principles of binocular vision we deduce the relationship between binocular viewing and a dual-camera system. We can thus establish the relationship between the prism single-camera system and binocular viewing and obtain the positional relation of prism, camera, and object that gives the best stereo display. Finally, using NVIDIA active shutter stereo glasses, we realize the three-dimensional (3-D) display of the object. The experimental results show that the proposed approach can use the prism single-camera system to simulate the various ways the eyes observe a scene. The stereo imaging system designed by the proposed method can faithfully restore the 3-D shape of the photographed object.
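
    The reason a single prism-split frame behaves like a stereo pair can be illustrated with a few lines of Python/OpenCV: the left and right halves of one frame are treated as the two views and fed to a standard block matcher. The file name and matcher parameters are placeholders, and a real system would first apply the rectification implied by the prism geometry.

        import cv2

        frame = cv2.imread("prism_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input
        h, w = frame.shape
        half = w // 2
        left, right = frame[:, :half], frame[:, half:2 * half]       # virtual stereo pair

        stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        disparity = stereo.compute(left, right)                      # proxy for depth
        disp_vis = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
        cv2.imwrite("disparity.png", disp_vis)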

  7. The Use of Camera Traps in Wildlife

    Directory of Open Access Journals (Sweden)

    Yasin Uçarlı

    2013-11-01

    Full Text Available Camera traps are increasingly used for abundance and density estimates of wildlife species. Camera traps are a very good alternative to direct observation, particularly in steep terrain, in areas with dense vegetation cover, or for nocturnal species. The main reason for using camera traps is that they eliminate the economic, personnel and time costs of monitoring continuously at several points at the same time. Camera traps are motion and heat sensitive and can take photographs or video depending on the model. Crossing points and feeding or mating areas of the focal species are the priority locations for setting camera traps. The population size can be estimated from the images combined with capture-recapture methods, and the population density is then the population size divided by the effective sampling area. Mating and breeding seasons, habitat choice, group structure and survival rates of the focal species can also be derived from the images. Camera traps are thus very useful for economically obtaining the necessary data on particularly elusive species in planning and conservation efforts.
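
    To make the population-size step concrete, the Python sketch below uses the Chapman bias-corrected Lincoln-Petersen estimator, one common capture-recapture formula; the abstract does not specify which estimator is meant, and the counts and effective area below are made up.

        def lincoln_petersen(marked_first, caught_second, recaptured):
            """Chapman's bias-corrected Lincoln-Petersen estimate of population size."""
            return (marked_first + 1) * (caught_second + 1) / (recaptured + 1) - 1

        # e.g. 30 individuals identified in the first survey, 25 in the second, 10 in both
        n_hat = lincoln_petersen(30, 25, 10)
        effective_area_km2 = 12.5                  # assumed effective sampling area
        print(n_hat, n_hat / effective_area_km2)   # population size and density per km^2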

  8. Architectural Design Document for Camera Models

    DEFF Research Database (Denmark)

    Thuesen, Gøsta

    1998-01-01

    Architecture of camera simulator models and data interface for the Maneuvering of Inspection/Servicing Vehicle (MIV) study.

  9. Small-angle neutron scattering at pulsed spallation sources

    International Nuclear Information System (INIS)

    Seeger, P.A.; Hjelm, R.P. Jr.

    1991-01-01

    The importance of small-angle neutron scattering (SANS) in biological, chemical, physical and engineering research mandates that all intense neutron sources be equipped with SANS instruments. Four existing instruments at pulsed sources are described and the general differences between pulsed-source and reactor-based instrument designs are discussed. The basic geometries are identical, but dynamic range is generally achieved by using a broad band of wavelengths (with time-of-flight analysis) rather than by moving the detector. This allows optimization for maximum beam intensity at a given beam size over the full dynamic range with fixed collimation. Data-acquisition requirements at a pulsed source are more severe, requiring large fast histogramming memories. Data reduction is also more complex, as all wavelength-dependent and angle-dependent backgrounds and nonlinearities must be accounted for before data can be transformed to intensity vs momentum transfer (Q). A comparison is shown between the Los Alamos pulsed instrument and D11 (Institut Laue-Langevin) and examples from the four major topics of the conference are shown. The general conclusion is that reactor-based instruments remain superior at very low Q or if only a narrow range of Q is required, but that the current generation of pulsed-source instruments is competitive at moderate Q and may be faster when a wide range of Q is required. (orig.)
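
    For reference, the momentum transfer used in that data reduction follows the standard small-angle scattering relation (a textbook formula, not specific to these instruments), written here in LaTeX:

        Q = \frac{4\pi}{\lambda}\,\sin\theta, \qquad 2\theta = \text{scattering angle},

    so at a fixed detector position (fixed theta) the broad wavelength band recorded by time of flight sweeps a wide range of Q.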

  10. Quantifying seasonal variation of leaf area index using near-infrared digital camera in a rice paddy

    Science.gov (United States)

    Hwang, Y.; Ryu, Y.; Kim, J.

    2017-12-01

    Digital cameras have been widely used to quantify leaf area index (LAI). Numerous simple and automatic methods have been proposed to improve digital camera based LAI estimates. However, most studies in rice paddies have relied on arbitrary thresholds or complex radiative transfer models to make binary images. Moreover, only a few studies have reported continuous, automatic observation of LAI over the season in a rice paddy. The objective of this study is to quantify seasonal variations of LAI using raw near-infrared (NIR) images coupled with a histogram shape-based algorithm in a rice paddy. As vegetation strongly reflects NIR light, we installed a NIR digital camera 1.8 m above the ground surface and acquired unsaturated raw-format images at one-hour intervals at solar zenith angles between 15° and 80° over the entire growing season in 2016 (from May to September). We applied a sub-pixel classification combined with a light scattering correction method. Finally, to confirm the accuracy of the quantified LAI, we also conducted direct (destructive sampling) and indirect (LAI-2200) manual observations of LAI once per ten days on average. Preliminary results show that NIR-derived LAI agreed well with in-situ observations, but divergence tended to appear once the rice canopy was fully developed. The continuous monitoring of LAI in rice paddies will help to better understand carbon and water fluxes and to evaluate satellite-based LAI products.
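
    A stripped-down version of the image-to-LAI step might look like the Python sketch below: a histogram-based (Otsu) threshold separates bright vegetation pixels in the NIR frame, and LAI is inverted from the gap fraction with a Beer-Lambert type relation. The Otsu threshold, the extinction coefficient and the omission of the sub-pixel and scattering corrections are simplifying assumptions, not the study's actual algorithm.

        import numpy as np
        from skimage.filters import threshold_otsu   # histogram shape-based threshold

        def lai_from_nir(nir, k=0.5):
            """Rough LAI estimate from a downward-looking NIR image (illustrative)."""
            veg = nir > threshold_otsu(nir)               # vegetation mask
            gap_fraction = 1.0 - veg.mean()
            return -np.log(max(gap_fraction, 1e-6)) / k   # Beer-Lambert inversion

        nir = np.random.rand(480, 640) ** 2               # stand-in for a raw NIR frame
        print(lai_from_nir(nir))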

  11. Hidden cameras everything you need to know about covert recording, undercover cameras and secret filming

    CERN Document Server

    Plomin, Joe

    2016-01-01

    Providing authoritative information on the practicalities of using hidden cameras to expose abuse or wrongdoing, this book is vital reading for anyone who may use or encounter secret filming. It gives specific advice on using phones or covert cameras and unravels the complex legal and ethical issues that need to be considered.

  12. The paediatric Bohler's angle and crucial angle of Gissane: a case series

    Directory of Open Access Journals (Sweden)

    Crawford Haemish A

    2011-01-01

    Full Text Available Abstract Background Bohler's angle and the crucial angle of Gissane can be used to assess calcaneal fractures. While the normal adult values of these angles are widely known, the normal paediatric values have not yet been established. Our aim is to investigate Bohler's angle and the crucial angle of Gissane in a paediatric population and establish normal paediatric reference values. Method We measured Bohler's angle and the crucial angle of Gissane using normal plain ankle radiographs of 763 patients from birth to 14 years of age completed over a five year period from July 2003 to June 2008. Results In our paediatric study group, the mean Bohler's angle was 35.2 degrees and the mean crucial angle of Gissane was 111.3 degrees. In an adult comparison group, the mean Bohler's angle was 39.2 degrees and the mean crucial angle of Gissane was 113.8 degrees. The differences in Bohler's angle and the crucial angle of Gissane between these two groups were statistically significant. Conclusion We have presented the normal values of Bohler's angle and the crucial angle of Gissane in a paediatric population. These values may provide a useful comparison to assist with the management of the paediatric calcaneal fracture.

  13. Long wavelength infrared camera (LWIRC): a 10 micron camera for the Keck Telescope

    Energy Technology Data Exchange (ETDEWEB)

    Wishnow, E.H.; Danchi, W.C.; Tuthill, P.; Wurtz, R.; Jernigan, J.G.; Arens, J.F.

    1998-05-01

    The Long Wavelength Infrared Camera (LWIRC) is a facility instrument for the Keck Observatory designed to operate at the f/25 forward Cassegrain focus of the Keck I telescope. The camera operates over the wavelength band 7-13 µm using ZnSe transmissive optics. A set of filters, a circular variable filter (CVF), and a mid-infrared polarizer are available, as are three plate scales: 0.05″, 0.10″, 0.21″ per pixel. The camera focal plane array and optics are cooled using liquid helium. The system has been refurbished with a 128 x 128 pixel Si:As detector array. The electronics readout system used to clock the array is compatible with both the hardware and software of the other Keck infrared instruments NIRC and LWS. A new pre-amplifier/A-D converter has been designed and constructed which decreases greatly the system susceptibility to noise.

  14. Creation of the π angle standard for the flat angle measurements

    Energy Technology Data Exchange (ETDEWEB)

    Giniotis, V; Rybokas, M, E-mail: gi@ap.vtu.l, E-mail: MRybokas@gama.l [Department of Information Technologies, Vilnius Gediminas Technical University, Sauletekio al. 11, 10223 Vilnius-40 (Lithuania)

    2010-07-01

    Angle measurements are based mainly on multiangle prisms (polygons) with autocollimators, rotary encoders of high accuracy, and circular scales as standards of the flat angle. Traceability of angle measurements rests on the standard of the plane angle, a prism (polygon) calibrated to an appropriate accuracy. Some metrological institutions have established special test benches (comparators) equipped with high-accuracy circular scales or rotary encoders and with polygons and autocollimators for angle calibration purposes. Nevertheless, the plane-angle standard (etalon), the polygon, has many restrictions for transferring the angle unit, the radian (rad), and other units of angle. It depends on the number of angles formed by the flat sides of the polygon, which is restricted by technological and metrological difficulties in producing the polygon and determining its accuracy. A possibility is proposed to create a standard of the angle equal to π rad, i.e. half the circle, or of the full angle. It can be realized with a circular scale on a rotation axis of very high accuracy and two precision reading instruments, usually photoelectric microscopes (PM), placed on opposite sides of the circular scale using special alignment steps. A great variety of angle units and values can be measured and their traceability ensured by applying a third PM to the scale. Calibration of the circular scale itself, or of another scale or rotary encoder, is also possible using the proposed method with π rad implemented as the primary standard angle. The proposed method makes the traceability of angle measurements attainable in any laboratory having an appropriate environment and reading instruments of appropriate accuracy, together with a rotary table whose rotation axis is of high accuracy, the rotation trajectory (runout) being in the range of 0.05 µm. Brief information about the multipurpose angle measurement test bench developed is also presented.

  15. Using a laser scanning camera for reactor inspection

    International Nuclear Information System (INIS)

    Armour, I.A.; Adrain, R.S.; Klewe, R.C.

    1984-01-01

    Inspection of nuclear reactors is normally carried out using TV or film cameras. There are, however, several areas where these cameras show considerable shortcomings. To overcome these difficulties, laser scanning cameras have been developed. This type of camera can be used for general visual inspection as well as the provision of high resolution video images with high ratio on and off-axis zoom capability. In this paper, we outline the construction and operation of a laser scanning camera and give examples of how it has been used in various power stations, and indicate future potential developments. (author)

  16. Qualification Tests of Micro-camera Modules for Space Applications

    Science.gov (United States)

    Kimura, Shinichi; Miyasaka, Akira

    Visual capability is very important for space-based activities, for which small, low-cost space cameras are desired. Although cameras for terrestrial applications are continually being improved, little progress has been made on cameras used in space, which must be extremely robust to withstand harsh environments. This study focuses on commercial off-the-shelf (COTS) CMOS digital cameras because they are very small and are based on an established mass-market technology. Radiation and ultrahigh-vacuum tests were conducted on a small COTS camera that weighs less than 100 mg (including optics). This paper presents the results of the qualification tests for COTS cameras and for a small, low-cost COTS-based space camera.

  17. Decoupling Intensity Radiated by the Emitter in Distance Estimation from Camera to IR Emitter

    Directory of Open Access Journals (Sweden)

    Carlos Andrés Luna Vázquez

    2013-05-01

    Full Text Available Various models using a radiometric approach have been proposed to solve the problem of estimating the distance between a camera and an infrared emitter diode (IRED). They depend directly on the radiant intensity of the emitter, set by the IRED bias current. As is known, this current exhibits a drift with temperature, which is transferred to the distance estimation method. This paper proposes an alternative approach that removes the temperature drift from the distance estimation method by eliminating the dependence on radiant intensity. The main aim was to use the relative accumulated energy together with other defined models, such as the zeroth-frequency component of the FFT of the IRED image and the standard deviation of pixel gray level intensities in the region of interest containing the IRED image. By using the abovementioned models, an expression free of IRED radiant intensity was obtained. Furthermore, the final model permitted simultaneous estimation of the distance between the IRED and the camera and the IRED orientation angle. The alternative presented in this paper gave a 3% maximum relative error over a range of distances up to 3 m.

  18. Modelling Virtual Camera Behaviour Through Player Gaze

    DEFF Research Database (Denmark)

    Picardi, Andrea; Burelli, Paolo; Yannakakis, Georgios N.

    2012-01-01

    In a three-dimensional virtual environment, aspects such as narrative and interaction largely depend on the placement and animation of the virtual camera. Therefore, virtual camera control plays a critical role in player experience and, thereby, in the overall quality of a computer game. Both game industry and game AI research focus on the development of increasingly sophisticated systems to automate the control of the virtual camera, integrating artificial intelligence algorithms within physical simulations. However, in both industry and academia little research has been carried out on the relationship between virtual camera, game-play and player behaviour. We run a game user experiment to shed some light on this relationship and identify relevant differences between camera behaviours through different game sessions, playing behaviours and player gaze patterns. Results show that users can...

  19. Event detection intelligent camera development

    International Nuclear Information System (INIS)

    Szappanos, A.; Kocsis, G.; Molnar, A.; Sarkozi, J.; Zoletnik, S.

    2008-01-01

    A new camera system, the 'event detection intelligent camera' (EDICAM), is being developed for the video diagnostics of the W7-X stellarator, which consists of 10 distinct and standalone measurement channels, each holding a camera. Different operation modes will be implemented for continuous as well as triggered readout. Hardware-level trigger signals will be generated from real-time image processing algorithms optimized for digital signal processor (DSP) and field programmable gate array (FPGA) architectures. At full resolution a camera sends 12-bit sampled 1280 x 1024 pixel frames at 444 fps, which amounts to 1.43 terabytes over half an hour. Analysing such a huge amount of data is time-consuming and computationally complex. We plan to overcome this problem with EDICAM's preprocessing concepts. The EDICAM camera system integrates the advantages of CMOS sensor chip technology and fast network connections. EDICAM is built up from three different modules with two interfaces. A sensor module (SM) with reduced hardware and functional elements achieves a small, compact size and robust operation in a harmful environment. An image processing and control unit (IPCU) module handles all user-predefined events and runs image processing algorithms to generate trigger signals. Finally, a 10 Gigabit Ethernet compatible image readout card functions as the network interface for the PC. In this contribution all the concepts of EDICAM and the functions of the distinct modules are described.

  20. Positron emission tomography camera

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    A positron emission tomography camera having a plurality of detector rings positioned side-by-side or offset by one-half of the detector cross section around a patient area to detect radiation therefrom. Each detector ring or offset ring includes a plurality of photomultiplier tubes, and a plurality of scintillation crystals are positioned relative to the photomultiplier tubes whereby each tube is responsive to more than one crystal. Each alternate crystal in the ring is offset by one-half or less of the thickness of the crystal such that the staggered crystals are seen by more than one photomultiplier tube. This sharing of crystals and photomultiplier tubes allows identification of the staggered crystal and the use of smaller detectors shared by larger photomultiplier tubes, thereby requiring fewer photomultiplier tubes, creating more scanning slices, providing better data sampling, and reducing the cost of the camera. The offset detector ring geometry reduces the costs of the positron camera and improves its performance.

  1. Correlates of Narrow Bracketing

    DEFF Research Database (Denmark)

    Koch, Alexander; Nafziger, Julia

    We examine whether different phenomena of narrow bracketing can be traced back to some common characteristic and whether and how different phenomena are related. We find that making dominated lottery choices or ignoring the endowment when making risky choices are related phenomena and are both as...

  2. An enhanced narrow-band imaging method for the microvessel detection

    Science.gov (United States)

    Yu, Feng; Song, Enmin; Liu, Hong; Wan, Youming; Zhu, Jun; Hung, Chih-Cheng

    2018-02-01

    A medical endoscope system combined with narrow-band imaging (NBI) has been shown to be a superior diagnostic tool for early cancer detection. NBI can reveal the morphologic changes of microvessels in superficial cancer. In order to improve the conspicuousness of microvessel texture, we propose an enhanced NBI method for endoscopic images. To obtain more conspicuous narrow-band images, we use an edge operator to extract the edge information of the narrow-band blue and green images and give a weight to the extracted edges. Then, the weighted edges are fused with the narrow-band blue and green images. Finally, the displayed endoscopic images are reconstructed from the enhanced narrow-band images. In addition, we evaluate the performance of the enhanced narrow-band images with different edge operators. Experimental results indicate that the Sobel and Canny operators achieve the best performance. Compared with the traditional NBI method of the Olympus company, our proposed method yields more conspicuous microvessel texture.
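
    A minimal sketch of that enhancement idea in Python/OpenCV is given below: Sobel edges are extracted from a narrow-band channel, normalized, and fused back with a weight. The weight, kernel size and the final channel mapping are illustrative assumptions rather than the parameters of the proposed method.

        import cv2
        import numpy as np

        def enhance_narrow_band(channel, weight=0.4):
            """Fuse weighted Sobel edges back into a narrow-band (blue or green) image."""
            gx = cv2.Sobel(channel, cv2.CV_32F, 1, 0, ksize=3)
            gy = cv2.Sobel(channel, cv2.CV_32F, 0, 1, ksize=3)
            edges = cv2.normalize(cv2.magnitude(gx, gy), None, 0, 255, cv2.NORM_MINMAX)
            fused = cv2.addWeighted(channel.astype(np.float32), 1.0, edges, weight, 0)
            return np.clip(fused, 0, 255).astype(np.uint8)

        # blue, green = two narrow-band channels of the endoscopic frame (hypothetical)
        # display = cv2.merge([enhance_narrow_band(blue), enhance_narrow_band(green), green])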

  3. Glaucoma, Open-Angle

    Science.gov (United States)

    Open-angle Glaucoma Defined: In open-angle glaucoma, the fluid passes ... 2010 U.S. Age-Specific Prevalence Rates for Glaucoma by Age and Race/Ethnicity: The prevalence of ...

  4. Adapting to the 30-degree visual perspective by emulating the angled laparoscope: a simple and low-cost solution for basic surgical training.

    Science.gov (United States)

    Daniel, Lorias Espinoza; Tapia, Fernando Montes; Arturo, Minor Martínez; Ricardo, Ordorica Flores

    2014-12-01

    The ability to handle and adapt to the visual perspectives generated by angled laparoscopes is crucial for skilled laparoscopic surgery. However, the control of the visual work space depends on the ability of the operator of the camera, who is often not the most experienced member of the surgical team. Here, we present a simple, low-cost option for surgical training that challenges the learner with static and dynamic visual perspectives at 30 degrees using a system that emulates the angled laparoscope. A system was developed using a low-cost camera and readily available materials to emulate the angled laparoscope. Nine participants undertook 3 tasks to test spatial adaptation to the static and dynamic visual perspectives at 30 degrees. Completing each task to a predefined satisfactory level ensured precision of execution of the tasks. Associated metrics (time and error rate) were recorded, and the performance of participants was determined. A total of 450 repetitions were performed by 9 residents at various stages of training. All the tasks were performed with a visual perspective of 30 degrees using the system. Junior residents were more proficient than senior residents. This system is a viable and low-cost alternative for developing in junior residents the basic psychomotor skills necessary for handling and adapting to 30-degree visual perspectives, without depending on a laparoscopic tower. More advanced skills may then be acquired by other means, such as in the operating theater or through clinical experience.

  5. Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras

    Science.gov (United States)

    Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellutta, Paolo; Sherwin, Gary W.

    2011-01-01

    The ability to perform off-road autonomous navigation at any time of day or night is a requirement for some unmanned ground vehicle (UGV) programs. Because there are times when it is desirable for military UGVs to operate without emitting strong, detectable electromagnetic signals, a passive only terrain perception mode of operation is also often a requirement. Thermal infrared (TIR) cameras can be used to provide day and night passive terrain perception. TIR cameras have a detector sensitive to either mid-wave infrared (MWIR) radiation (3-5 µm) or long-wave infrared (LWIR) radiation (8-12 µm). With the recent emergence of high-quality uncooled LWIR cameras, TIR cameras have become viable passive perception options for some UGV programs. The Jet Propulsion Laboratory (JPL) has used a stereo pair of TIR cameras under several UGV programs to perform stereo ranging, terrain mapping, tree-trunk detection, pedestrian detection, negative obstacle detection, and water detection based on object reflections. In addition, we have evaluated stereo range data at a variety of UGV speeds, evaluated dual-band TIR classification of soil, vegetation, and rock terrain types, analyzed 24 hour water and 12 hour mud TIR imagery, and analyzed TIR imagery for hazard detection through smoke. Since TIR cameras do not currently provide the resolution available from megapixel color cameras, a UGV's daytime safe speed is often reduced when using TIR instead of color cameras. In this paper, we summarize the UGV terrain perception work JPL has performed with TIR cameras over the last decade and describe a calibration target developed by General Dynamics Robotic Systems (GDRS) for TIR cameras and other sensors.

  6. Aspects of Voyager photogrammetry

    Science.gov (United States)

    Wu, Sherman S. C.; Schafer, Francis J.; Jordan, Raymond; Howington, Annie-Elpis

    1987-01-01

    In January 1986, Voyager 2 took a series of pictures of Uranus and its satellites with the Imaging Science System (ISS) on board the spacecraft. Based on six stereo images from the ISS narrow-angle camera, a topographic map was compiled of the Southern Hemisphere of Miranda, one of Uranus' moons. Assuming a spherical figure, a 20-km surface relief is shown on the map. With three additional images from the ISS wide-angle camera, a control network of Miranda's Southern Hemisphere was established by analytical photogrammetry, producing 88 ground points for the control of multiple-model compilation on the AS-11AM analytical stereoplotter. Digital terrain data from the topographic map of Miranda have also been produced. By combining these data and the image data from the Voyager 2 mission, perspective views or even a movie of the mapped area can be made. The application of these newly developed techniques to Voyager 1 imagery, which includes a few overlapping pictures of Io and Ganymede, permits the compilation of contour maps or topographic profiles of these bodies on the analytical stereoplotters.

  7. A CHF Model in Narrow Gaps under Saturated Boiling

    International Nuclear Information System (INIS)

    Park, Suki; Kim, Hyeonil; Park, Cheol

    2014-01-01

    Many researchers have paid great attention to the CHF in narrow gaps because of its numerous industrial applications. In particular, a great number of studies on the CHF have been carried out in relation to nuclear safety issues such as in-vessel retention for nuclear power plants during a severe accident. Analytical studies to predict the CHF in narrow gaps have also been reported. Yu et al. (2012) developed an analytical model to predict the CHF on downward-facing and inclined heaters based on the model of Kandlikar et al. (2001) for an upward-facing heater. A new theoretical model is developed here to predict the CHF in narrow gaps under saturated pool boiling. This model is applicable when one or both sides of the coolant channel are heated, and it includes the effects of heater orientation. The present model is compared with the experimental CHF data obtained in narrow gaps. A new analytical CHF model is proposed to predict the CHF for narrow gaps under saturated pool boiling. This model can be applied to one-side or two-side heated surfaces and also considers the effects of heater orientation on CHF. The present model is compared with the experimental data obtained in narrow gaps with one heater. The comparisons indicate that the present model shows good agreement with the experimental CHF data in horizontal annular tubes. However, it generally under-predicts the experimental data in narrow rectangular gaps, except for the data obtained with a gap thickness of 10 mm and a horizontal downward-facing heater.

  8. Traveling wave deflector design for femtosecond streak camera

    International Nuclear Information System (INIS)

    Pei, Chengquan; Wu, Shengli; Luo, Duan; Wen, Wenlong; Xu, Junkai; Tian, Jinshou; Zhang, Minrui; Chen, Pin; Chen, Jianzhong; Liu, Rong

    2017-01-01

    In this paper, a traveling wave deflector (TWD) with a slow-wave property induced by a microstrip transmission line is proposed for femtosecond streak cameras. The pass width and dispersion properties were simulated. In addition, the dynamic temporal resolution of the femtosecond camera was simulated by CST software. The results showed that with the proposed TWD a femtosecond streak camera can achieve a dynamic temporal resolution of less than 600 fs. Experiments were done to test the femtosecond streak camera, and an 800 fs dynamic temporal resolution was obtained. Guidance is provided for optimizing a femtosecond streak camera to obtain higher temporal resolution.

  9. Traveling wave deflector design for femtosecond streak camera

    Energy Technology Data Exchange (ETDEWEB)

    Pei, Chengquan; Wu, Shengli [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi' an Jiaotong University, Xi’an 710049 (China); Luo, Duan [Xi’an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi' an 710119 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Wen, Wenlong [Xi’an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi' an 710119 (China); Xu, Junkai [Xi’an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi' an 710119 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Tian, Jinshou, E-mail: tianjs@opt.ac.cn [Xi’an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi' an 710119 (China); Collaborative Innovation Center of Extreme Optics, Shanxi University, Taiyuan, Shanxi 030006 (China); Zhang, Minrui; Chen, Pin [Xi’an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi' an 710119 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Chen, Jianzhong [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi' an Jiaotong University, Xi’an 710049 (China); Liu, Rong [Xi' an Technological University, Xi' an 710021 (China)

    2017-05-21

    In this paper, a traveling wave deflector (TWD) with a slow-wave property induced by a microstrip transmission line is proposed for femtosecond streak cameras. The pass width and dispersion properties were simulated. In addition, the dynamic temporal resolution of the femtosecond camera was simulated by CST software. The results showed that with the proposed TWD a femtosecond streak camera can achieve a dynamic temporal resolution of less than 600 fs. Experiments were done to test the femtosecond streak camera, and an 800 fs dynamic temporal resolution was obtained. Guidance is provided for optimizing a femtosecond streak camera to obtain higher temporal resolution.

  10. CameraHRV: robust measurement of heart rate variability using a camera

    Science.gov (United States)

    Pai, Amruta; Veeraraghavan, Ashok; Sabharwal, Ashutosh

    2018-02-01

    The inter-beat interval (the time period of the cardiac cycle) changes slightly with every heartbeat; this variation is measured as Heart Rate Variability (HRV). HRV is presumed to arise from interactions between the parasympathetic and sympathetic nervous systems. Therefore, it is sometimes used as an indicator of the stress level of an individual. HRV also reveals some clinical information about cardiac health. Currently, HRV is accurately measured using contact devices such as a pulse oximeter. However, recent research in the field of non-contact imaging photoplethysmography (iPPG) has made vital sign measurements possible using just the video recording of any exposed skin (such as a person's face). The current signal processing methods for extracting HRV using peak detection perform well for contact-based systems but have poor performance for iPPG signals. The main reason for this poor performance is that current methods are sensitive to large noise sources which are often present in iPPG data. Further, current methods are not robust to motion artifacts that are common in iPPG systems. We developed a new algorithm, CameraHRV, for robustly extracting HRV even at the low SNR common in iPPG recordings. CameraHRV combines spatial combination and frequency demodulation to obtain HRV from the instantaneous frequency of the iPPG signal. CameraHRV outperforms other current methods of HRV estimation. Ground truth data were obtained from an FDA-approved pulse oximeter for validation purposes. CameraHRV on iPPG data showed an error of 6 milliseconds for low-motion and varying skin tone scenarios. The improvement in error was 14%. In the case of high-motion scenarios like reading, watching and talking, the error was 10 milliseconds.
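
    The core of the frequency-demodulation idea can be sketched in a few lines of Python: band-pass the pulse signal, take the Hilbert analytic signal, and convert the instantaneous frequency into beat-to-beat intervals. This is only an illustration of the principle, not the authors' pipeline; the filter band, the clipping range and the SDNN-like summary are assumptions.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def hrv_from_ippg(ippg, fs=30.0):
            """Instantaneous-frequency HRV summary from an iPPG trace (illustrative)."""
            b, a = butter(3, [0.7 / (fs / 2), 3.0 / (fs / 2)], btype="band")
            pulse = filtfilt(b, a, ippg)
            phase = np.unwrap(np.angle(hilbert(pulse)))
            inst_freq = np.diff(phase) * fs / (2 * np.pi)      # Hz, sample by sample
            ibi_ms = 1000.0 / np.clip(inst_freq, 0.5, 4.0)     # inter-beat intervals in ms
            return np.std(ibi_ms)                              # SDNN-like spread

        # toy signal: ~70 bpm with slow rate modulation, 30 fps for 60 s
        t = np.arange(0, 60, 1 / 30.0)
        ippg = np.sin(2 * np.pi * (70 / 60.0) * t + 0.1 * np.sin(2 * np.pi * 0.1 * t))
        print(hrv_from_ippg(ippg))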

  11. Electron correlations in narrow band systems

    International Nuclear Information System (INIS)

    Kishore, R.

    1983-01-01

    The effect of electron correlations in narrow bands, such as the d(f) bands in the transition (rare earth) metals and their compounds and the impurity bands in doped semiconductors, is studied. The narrow-band system is described by the Hubbard Hamiltonian. By proposing a local self-energy for the interacting electron, it is found that the results are exact in both the atomic and band limits and reduce to the Hartree-Fock results for U/Δ → 0, where U is the intra-atomic Coulomb interaction and Δ is the bandwidth of the noninteracting electrons. For a Lorentzian form of the density of states of the noninteracting electrons, this approximation turns out to be equivalent to the third Hubbard approximation. A simple argument, based on the mean free path obtained from the imaginary part of the self-energy, shows how the electron correlations can give rise to a discontinuous metal-nonmetal transition as proposed by Mott. The band narrowing and the existence of the satellite below the Fermi energy in Ni, found in photoemission experiments, can also be understood. (Author) [pt

  12. Correlations associated with small angle protons produced in proton-proton collisions at 31 GeV total energy

    CERN Document Server

    Albrow, M G; Barber, D P; Bogaerts, A; Bosnjakovic, B; Brooks, J R; Clegg, A B; Erné, F C; Gee, C N P; Locke, D H; Loebinger, F K; Murphy, P G; Rudge, A; Sens, Johannes C

    1973-01-01

    High energy inelastic protons with x = 2p_L/√s > 0.99 observed in 15.3/15.3 GeV proton-proton collisions at the CERN ISR are accompanied by particles whose angular distribution is confined to a narrow cone in the opposite direction. In contrast, lower energy protons (0.72angles. The ratio of the associated charged multiplicities is approximately 0.4. (3 refs).

  13. An image-tube camera for cometary spectrography

    Science.gov (United States)

    Mamadov, O.

    The paper discusses the mounting of an image tube camera. The cathode is of antimony, sodium, potassium, and cesium. The parts used for mounting are of acrylic plastic and a fabric-based laminate. A mounting design that does not include cooling is presented. The aperture ratio of the camera is 1:27. Also discussed is the way that the camera is joined to the spectrograph.

  14. Characterization of SWIR cameras by MRC measurements

    Science.gov (United States)

    Gerken, M.; Schlemmer, H.; Haan, Hubertus A.; Siemens, Christofer; Münzberg, M.

    2014-05-01

    Cameras for the SWIR wavelength range are becoming more and more important because of the better observation range for daylight operation under adverse weather conditions (haze, fog, rain). In order to choose the most suitable SWIR camera, or to qualify a camera for a given application, characterization of the camera by means of the Minimum Resolvable Contrast (MRC) concept is favorable, as the MRC comprises all relevant properties of the instrument. With the MRC known for a given camera, the achievable observation range can be calculated for every combination of target size, illumination level and weather condition. MRC measurements in the SWIR wavelength band can largely follow the guidelines for MRC measurements of a visual camera. Typically, measurements are performed with a set of resolution targets (e.g. USAF 1951 targets) manufactured with different contrast values from 50% down to less than 1%. For a given illumination level, the achievable spatial resolution is measured for each target; the resulting curve shows the minimum contrast necessary to resolve the structure of a target as a function of spatial frequency. To perform MRC measurements for SWIR cameras, first, the irradiation parameters have to be given in radiometric instead of photometric units, which are limited in their use to the visible range; SWIR illumination levels for typical daylight and twilight conditions therefore have to be defined. Second, a radiation source with appropriate emission in the SWIR range (e.g. an incandescent lamp) is necessary, and the irradiance has to be measured in W/m2 instead of lux (lm/m2). Third, the contrast values of the targets have to be recalibrated for the SWIR range because they typically differ from the values determined for the visual range. Measured MRC values of three cameras are compared to the specified performance data of the devices and the results of a multi-band in-house designed Vis-SWIR camera

  15. Integrating Gigabit ethernet cameras into EPICS at Diamond light source

    International Nuclear Information System (INIS)

    Cobb, T.

    2012-01-01

    At Diamond Light Source a range of cameras are used to provide images for diagnostic purposes in both the accelerator and the photon beamlines. The accelerator and existing beamlines use Point Grey Flea and Flea2 FireWire cameras. We have selected Gigabit Ethernet cameras supporting GigE Vision for our new photon beamlines. GigE Vision is an interface standard for high-speed Ethernet cameras which encourages interoperability between manufacturers. This paper describes the challenges encountered while integrating GigE Vision cameras from a range of vendors into EPICS. GigE Vision cameras appear to be more reliable than the FireWire cameras, and the simple cabling makes it much easier to move the cameras to different positions. Upcoming Power over Ethernet versions of the cameras will reduce the number of cables still further

  16. Digital airborne camera introduction and technology

    CERN Document Server

    Sandau, Rainer

    2014-01-01

    The last decade has seen great innovations in airborne cameras. This book is the first ever written on the topic and describes all components of a digital airborne camera, ranging from the object to be imaged to the mass memory device.

  17. Fracture strength and probability of survival of narrow and extra-narrow dental implants after fatigue testing: In vitro and in silico analysis.

    Science.gov (United States)

    Bordin, Dimorvan; Bergamo, Edmara T P; Fardin, Vinicius P; Coelho, Paulo G; Bonfante, Estevam A

    2017-07-01

    To assess the probability of survival (reliability) and failure modes of narrow implants with different diameters. For fatigue testing, 42 implants with the same macrogeometry and internal conical connection were divided according to diameter as follows: narrow (Ø3.3 × 10 mm) and extra-narrow (Ø2.9 × 10 mm) (21 per group). Identical abutments were torqued to the implants, and standardized maxillary incisor crowns were cemented and subjected to step-stress accelerated life testing (SSALT) in water. The use-level probability Weibull curves and the reliability for a mission of 50,000 and 100,000 cycles at 50, 100, 150 and 180 N were calculated. For the finite element analysis (FEA), two virtual models simulating the samples tested in fatigue were constructed. Loads of 50 N and 100 N were applied 30° off-axis at the crown, and the von Mises stress was calculated for implant and abutment. The beta (β) values were 0.67 for narrow and 1.32 for extra-narrow implants, indicating that failure rates did not increase with fatigue in the former, but more likely were associated with damage accumulation and wear-out failures in the latter. Both groups showed high reliability (up to 97.5%) at 50 and 100 N. A decreased reliability was observed for both groups at 150 and 180 N (ranging from 0 to 82.3%), but no significant difference was observed between groups. Failure predominantly involved abutment fracture in both groups. In the FEA at a 50 N load, the Ø3.3 mm implant showed higher von Mises stress in the abutment (7.75%) and implant (2%) than the Ø2.9 mm implant. There was no significant difference between narrow and extra-narrow implants regarding probability of survival. The failure mode was similar for both groups, restricted to abutment fracture. Copyright © 2017 Elsevier Ltd. All rights reserved.
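
    For orientation, the reliability for a mission of n cycles under the two-parameter Weibull model typically used with SSALT data takes the form below (η is the characteristic life; the β values quoted above are read against β = 1):

```latex
R(n) = \exp\!\left[ -\left( \frac{n}{\eta} \right)^{\beta} \right],
\qquad
\beta < 1 \;\Rightarrow\; \text{failure rate decreasing with cycling},
\quad
\beta > 1 \;\Rightarrow\; \text{wear-out / damage accumulation}
```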

  18. Gabor zone-plate apertures for imaging with the mercuric iodide gamma-ray camera

    Energy Technology Data Exchange (ETDEWEB)

    Patt, B E [EG and G Energy Measurements, Inc., Goleta, CA (USA); Meyyappan, A; Cai, A; Wade, G [California Univ., Santa Barbara (USA). Dept. of Electrical and Computer Engineering

    1990-12-20

    Gabor zone-plate (GZP) apertures have been developed for use in EG and G EM's mercuric iodide (HgI{sub 2}) gamma-ray camera. The purpose of such an aperture is to increase efficiency, while maintaining good resolution. The GZP is similar to the Fresnel zone plate (FZP) but it has continuous transitions between opaque and transparent regions. Because there are no sharp transitions in the transmission, the inherent interference noise in GZP imaging is lower than that in FZP imaging. GZP parameters were chosen by considering the effects of constraints such as detector pixel size, number of pixels, minimum field of view required, maximum angle of incidence tolerated, and the Nyquist criterion for the minimum sampling rate. As a result an aperture was designed and fabricated with eight zones and a diameter of 3 cm. Lead was chosen as the aperture medium due to its high attenuation coefficient. Experimental data were obtained from the camera with the above GZP aperture. The point-spread function was determined and compared to the calculated response. Excellent agreement was obtained. The reconstruction process involves simulating, by computer, planar-wave illumination of a scaled transparency of the image and recording the intensity pattern at the focal plane. (orig.).
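
    A minimal sketch of one common Gabor zone-plate parameterization is given below; the grid size and the way the first-zone radius r1 is tied to the aperture radius are illustrative assumptions, not the dimensions of the 3 cm, eight-zone lead aperture described above.

```python
import numpy as np

def gabor_zone_plate(n_pix=512, radius=15.0, n_zones=8):
    """Continuous (Gabor) zone-plate transmission t(r) = 0.5*(1 + cos(pi*r^2/r1^2)),
    i.e. a sinusoidal profile instead of the binary Fresnel zone plate.
    With the outermost zone at the aperture edge, r1 = radius / sqrt(n_zones)."""
    r1 = radius / np.sqrt(n_zones)
    x = np.linspace(-radius, radius, n_pix)
    xx, yy = np.meshgrid(x, x)
    r2 = xx**2 + yy**2
    t = 0.5 * (1.0 + np.cos(np.pi * r2 / r1**2))
    t[r2 > radius**2] = 0.0          # opaque (lead) outside the aperture
    return t
```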

  19. Camera Traps Can Be Heard and Seen by Animals

    Science.gov (United States)

    Meek, Paul D.; Ballard, Guy-Anthony; Fleming, Peter J. S.; Schaefer, Michael; Williams, Warwick; Falzon, Greg

    2014-01-01

    Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used are considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable so it is important to understand why the animals are disturbed. We conducted laboratory based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals’ hearing and produce illumination that can be seen by many species. PMID:25354356

  20. Camera traps can be heard and seen by animals.

    Directory of Open Access Journals (Sweden)

    Paul D Meek

    Full Text Available Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used are considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable so it is important to understand why the animals are disturbed. We conducted laboratory based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals' hearing and produce illumination that can be seen by many species.

  1. A SPECT demonstrator—revival of a gamma camera

    Science.gov (United States)

    Valastyán, I.; Kerek, A.; Molnár, J.; Novák, D.; Végh, J.; Emri, M.; Trón, L.

    2006-07-01

    A gamma camera has been updated and converted to serve as a demonstrator for educational purposes. The gantry and the camera head were the only part of the system that remained untouched. The main reason for this modernization was to increase the transparency of the gamma camera by partitioning the different logical building blocks of the system and thus providing access for inspection and improvements throughout the chain. New data acquisition and reconstruction software has been installed. By taking these measures, the camera is now used in education and also serves as a platform for tests of new hardware and software solutions. The camera is also used to demonstrate 3D (SPECT) imaging by collecting 2D projections from a rotatable cylindrical phantom. Since the camera head is not attached mechanically to the phantom, the effect of misalignment between the head and the rotation axis of the phantom can be studied.

  2. A SPECT demonstrator-revival of a gamma camera

    International Nuclear Information System (INIS)

    Valastyan, I.; Kerek, A.; Molnar, J.; Novak, D.; Vegh, J.; Emri, M.; Tron, L.

    2006-01-01

    A gamma camera has been updated and converted to serve as a demonstrator for educational purposes. The gantry and the camera head were the only part of the system that remained untouched. The main reason for this modernization was to increase the transparency of the gamma camera by partitioning the different logical building blocks of the system and thus providing access for inspection and improvements throughout the chain. New data acquisition and reconstruction software has been installed. By taking these measures, the camera is now used in education and also serves as a platform for tests of new hardware and software solutions. The camera is also used to demonstrate 3D (SPECT) imaging by collecting 2D projections from a rotatable cylindrical phantom. Since the camera head is not attached mechanically to the phantom, the effect of misalignment between the head and the rotation axis of the phantom can be studied

  3. Analysis of Camera Parameters Value in Various Object Distances Calibration

    International Nuclear Information System (INIS)

    Yusoff, Ahmad Razali; Ariff, Mohd Farid Mohd; Idris, Khairulnizam M; Majid, Zulkepli; Setan, Halim; Chong, Albert K

    2014-01-01

    In photogrammetric applications, good camera parameters are needed for mapping purposes, for example when an Unmanned Aerial Vehicle (UAV) is equipped with a non-metric camera. Simple camera calibration is a common laboratory procedure for obtaining the camera parameter values. In aerial mapping, the interior camera parameter values from close-range camera calibration are used to correct image errors. However, the causes and effects of the calibration steps used to obtain accurate mapping need to be analyzed. Therefore, this research contributes an analysis of camera parameters obtained with a portable calibration frame of 1.5 × 1 meter size, with object distances of two, three, four, five, and six meters as the research focus. Results are analyzed to find the changes in the image and camera parameter values. Hence, the calibration parameters of a camera are found to differ depending on the type of calibration parameters and the object distance
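
    A minimal close-range calibration sketch in the spirit of the procedure described above is shown below; it is not the authors' workflow, and the chessboard size, square spacing and image paths are placeholders. One image set would be processed per object distance and the recovered interior parameters compared.

```python
import glob
import cv2
import numpy as np

# Assumed 9x6 inner-corner chessboard with 25 mm squares; images for one object distance.
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 25.0

obj_pts, img_pts = [], []
for fname in glob.glob("calib_2m/*.jpg"):           # repeat per object distance
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Interior orientation: camera matrix K and distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("focal lengths (px):", K[0, 0], K[1, 1], "principal point:", K[0, 2], K[1, 2])
```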

  4. NV-CMOS HD camera for day/night imaging

    Science.gov (United States)

    Vogelsong, T.; Tower, J.; Sudol, Thomas; Senko, T.; Chodelka, D.

    2014-06-01

    SRI International (SRI) has developed a new multi-purpose day/night video camera with low-light imaging performance comparable to an image intensifier, while offering the size, weight, ruggedness, and cost advantages enabled by the use of SRI's NV-CMOS HD digital image sensor chip. The digital video output is ideal for image enhancement, sharing with others through networking, video capture for data analysis, or fusion with thermal cameras. The camera provides Camera Link output with HD/WUXGA resolution of 1920 x 1200 pixels operating at 60 Hz. Windowing to smaller sizes enables operation at higher frame rates. High sensitivity is achieved through use of backside illumination, providing high Quantum Efficiency (QE) across the visible and near infrared (NIR) bands, in a camera which operates from a single 5 V supply. The NV-CMOS HD camera provides a substantial reduction in size, weight, and power (SWaP), ideal for SWaP-constrained day/night imaging platforms such as UAVs, ground vehicles, and fixed mount surveillance, and may be reconfigured for mobile soldier operations such as night vision goggles and weapon sights. In addition, the camera with the NV-CMOS HD imager is suitable for high performance digital cinematography/broadcast systems, biofluorescence/microscopy imaging, day/night security and surveillance, and other high-end applications which require HD video imaging with high sensitivity and wide dynamic range. The camera comes with an array of lens mounts including C-mount and F-mount. The latest test data from the NV-CMOS HD camera will be presented.

  5. A naturally narrow positive-parity Θ+

    International Nuclear Information System (INIS)

    Carlson, Carl E.; Carone, Christopher D.; Kwee, Herry J.; Nazaryan, Vahagn

    2004-01-01

    We present a consistent color-flavor-spin-orbital wave function for a positive-parity Θ+ that naturally explains the observed narrowness of the state. The wave function is totally symmetric in its flavor-spin part and totally antisymmetric in its color-orbital part. If flavor-spin interactions dominate, this wave function renders the positive-parity Θ+ lighter than its negative-parity counterpart. We consider decays of the Θ+ and compute the overlap of this state with the kinematically allowed final states. Our results are numerically small. We note that dynamical correlations between quarks are not necessary to obtain narrow pentaquark widths

  6. Apparent contact angle and contact angle hysteresis on liquid infused surfaces.

    Science.gov (United States)

    Semprebon, Ciro; McHale, Glen; Kusumaatmaja, Halim

    2016-12-21

    We theoretically investigate the apparent contact angle and contact angle hysteresis of a droplet placed on a liquid infused surface. We show that the apparent contact angle is not uniquely defined by material parameters, but also has a dependence on the relative size between the droplet and its surrounding wetting ridge formed by the infusing liquid. We derive a closed form expression for the contact angle in the limit of vanishing wetting ridge, and compute the correction for small but finite ridge, which corresponds to an effective line tension term. We also predict contact angle hysteresis on liquid infused surfaces generated by the pinning of the contact lines by the surface corrugations. Our analytical expressions for both the apparent contact angle and contact angle hysteresis can be interpreted as 'weighted sums' between the contact angles of the infusing liquid relative to the droplet and surrounding gas phases, where the weighting coefficients are given by ratios of the fluid surface tensions.

  7. Anterior segment changes after pharmacologic mydriasis using Pentacam and optical coherence tomography in angle closure suspects

    Directory of Open Access Journals (Sweden)

    Jing-Min Guo

    2015-10-01

    Full Text Available AIM: To compare the dynamic changes of anterior segment parameters, especially iris morphology, induced by pharmacologic mydriasis between angle closure suspects and normal controls. METHODS: The study group comprised 19 eyes of 19 angle closure suspects and 19 eyes of 19 age- and sex-matched normal open-angle controls. Pentacam and optical coherence tomography measurements before and 30 min after instillation of compound tropicamide eye drops were performed and compared. Biometric evaluations of iris tomography and anterior chamber angle were estimated by customized image-processing software. RESULTS: Baseline axial length, iris cross-sectional area and volume did not differ significantly between angle closure suspects and normal controls. Angle closure suspects had smaller pupil size, narrower anterior segment dimension and axial length, and a thinner iris with greater curve in comparison with normal controls. Pharmacologic mydriasis led to significant increments in iris thickness at 750 μm, anterior chamber depth and volume, and significant decrements in iris curve, cross-sectional area and volume in both groups. Angle opening distance at 500 μm was increased significantly in normal controls (from 0.465±0.115 mm to 0.539±0.167 mm, P=0.009), but not in angle closure suspects (from 0.125±0.100 mm to 0.145±0.131 mm, P=0.326). Iris volume change per millimeter of pupil dilation (△IV/△PD) decreased significantly less in angle closure suspects than in normal controls (-2.47±1.33 mm2 vs -3.63±1.58 mm2, P=0.019). Linear regression analysis showed that the change of angle opening distance at 500 μm was associated most with the change of central anterior chamber depth (β=0.841, P=0.002) and △IV/△PD (β=0.028, P=0.002), followed by gender (β=0.062, P=0.032). CONCLUSION: A smaller iris volume decrement per millimeter of pupil dilation is significantly related to the smaller anterior angle opening in angle closure suspects after pharmacologic mydriasis. Dynamic

  8. Design of Endoscopic Capsule With Multiple Cameras.

    Science.gov (United States)

    Gu, Yingke; Xie, Xiang; Li, Guolin; Sun, Tianjia; Wang, Dan; Yin, Zheng; Zhang, Pengfei; Wang, Zhihua

    2015-08-01

    In order to reduce the miss rate of the wireless capsule endoscopy, in this paper, we propose a new system of the endoscopic capsule with multiple cameras. A master-slave architecture, including an efficient bus architecture and a four level clock management architecture, is applied for the Multiple Cameras Endoscopic Capsule (MCEC). For covering more area of the gastrointestinal tract wall with low power, multiple cameras with a smart image capture strategy, including movement sensitive control and camera selection, are used in the MCEC. To reduce the data transfer bandwidth and power consumption to prolong the MCEC's working life, a low complexity image compressor with PSNR 40.7 dB and compression rate 86% is implemented. A chipset is designed and implemented for the MCEC and a six cameras endoscopic capsule prototype is implemented by using the chipset. With the smart image capture strategy, the coverage rate of the MCEC prototype can achieve 98% and its power consumption is only about 7.1 mW.

  9. Evaluation of Red Light Camera Enforcement at Signalized Intersections

    Directory of Open Access Journals (Sweden)

    Abdulrahman AlJanahi

    2007-12-01

    Full Text Available The study attempts to find the effectiveness of adopting red light cameras in reducing red light violators. An experimental approach was adopted to investigate the use of red light cameras at signalized intersections in the Kingdom of Bahrain. The study locations were divided into three groups. The first group was related to the approaches monitored with red light cameras. The second group was related to approaches without red light cameras, but located within an intersection that had one of its approaches monitored with red light cameras. The third group was related to intersection approaches located at intersections without red light cameras (controlled sites). A methodology was developed for data collection. The data were then tested statistically by Z-test using proportion methods to compare the proportion of red light violations occurring at different sites. The study found that the proportion of red light violators at approaches monitored with red light cameras was significantly less than those at the controlled sites for most of the time. Approaches without red light cameras located within intersections having red light cameras showed, in general, fewer violations than controlled sites, but the results were not significant for all times of the day. The study reveals that red light cameras have a positive effect on reducing red light violations. However, these conclusions need further evaluations to justify their safe and economic use.
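
    The proportion comparison described above can be reproduced with a standard two-proportion Z-test; the counts below are purely hypothetical.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: red light violations per observed vehicles at a
# camera-monitored approach versus a controlled (no-camera) site.
violations = [38, 95]          # camera site, controlled site
vehicles = [12000, 11500]

z, p = proportions_ztest(count=violations, nobs=vehicles, alternative="smaller")
print(f"z = {z:.2f}, p = {p:.4f}")   # small p -> significantly fewer violations at the camera site
```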

  10. Gamma ray camera

    International Nuclear Information System (INIS)

    Wang, S.-H.; Robbins, C.D.

    1979-01-01

    An Anger gamma ray camera is improved by the substitution of a gamma ray sensitive, proximity type image intensifier tube for the scintillator screen in the Anger camera. The image intensifier tube has a negatively charged flat scintillator screen, a flat photocathode layer, and a grounded, flat output phosphor display screen, all of which have the same dimension to maintain unit image magnification; all components are contained within a grounded metallic tube, with a metallic, inwardly curved input window between the scintillator screen and a collimator. The display screen can be viewed by an array of photomultipliers or solid state detectors. There are two photocathodes and two phosphor screens to give a two stage intensification, the two stages being optically coupled by a light guide. (author)

  11. 3D for the people: multi-camera motion capture in the field with consumer-grade cameras and open source software

    Directory of Open Access Journals (Sweden)

    Brandon E. Jackson

    2016-09-01

    Full Text Available Ecological, behavioral and biomechanical studies often need to quantify animal movement and behavior in three dimensions. In laboratory studies, a common tool to accomplish these measurements is the use of multiple, calibrated high-speed cameras. Until very recently, the complexity, weight and cost of such cameras have made their deployment in field situations risky; furthermore, such cameras are not affordable to many researchers. Here, we show how inexpensive, consumer-grade cameras can adequately accomplish these measurements both within the laboratory and in the field. Combined with our methods and open source software, the availability of inexpensive, portable and rugged cameras will open up new areas of biological study by providing precise 3D tracking and quantification of animal and human movement to researchers in a wide variety of field and laboratory contexts.
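
    The open source tools the article refers to are not named here, but the core multi-camera step such workflows rely on, triangulating a feature seen in two calibrated views, can be sketched with OpenCV as below; the projection matrices and pixel coordinates are placeholders, not values from the study.

```python
import numpy as np
import cv2

# Two 3x4 camera projection matrices from a prior calibration (placeholders).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                # reference camera
P2 = np.array([[0.98, 0.0, 0.17, -120.0],                    # second camera, ~10 deg yaw
               [0.0,  1.0, 0.0,     0.0],
               [-0.17, 0.0, 0.98,  15.0]])

# Matching pixel coordinates of the same feature in each view (2xN arrays).
pts1 = np.array([[640.0], [360.0]])
pts2 = np.array([[512.0], [354.0]])

X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)   # homogeneous 4xN result
X = (X_h[:3] / X_h[3]).ravel()                    # 3D point in calibration units
print("triangulated point:", X)
```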

  12. Extended spectrum SWIR camera with user-accessible Dewar

    Science.gov (United States)

    Benapfl, Brendan; Miller, John Lester; Vemuri, Hari; Grein, Christoph; Sivananthan, Siva

    2017-02-01

    Episensors has developed a series of extended short wavelength infrared (eSWIR) cameras based on high-Cd concentration Hg1-xCdxTe absorbers. The cameras have a bandpass extending to 3 microns cutoff wavelength, opening new applications relative to traditional InGaAs-based cameras. Applications and uses are discussed and examples given. A liquid nitrogen pour-filled version was initially developed. This was followed by a compact Stirling-cooled version with detectors operating at 200 K. Each camera has unique sensitivity and performance characteristics. The cameras' size, weight and power specifications are presented along with images captured with band pass filters and eSWIR sources to demonstrate spectral response beyond 1.7 microns. The soft seal Dewars of the cameras are designed for accessibility, and can be opened and modified in a standard laboratory environment. This modular approach allows user flexibility for swapping internal components such as cold filters and cold stops. The core electronics of the Stirling-cooled camera are based on a single commercial field programmable gate array (FPGA) that also performs on-board non-uniformity corrections, bad pixel replacement, and directly drives any standard HDMI display.

  13. A Benchmark for Virtual Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2015-01-01

    Automatically animating and placing the virtual camera in a dynamic environment is a challenging task. The camera is expected to maximise and maintain a set of properties — i.e. visual composition — while smoothly moving through the environment and avoiding obstacles. A large number of different....... For this reason, in this paper, we propose a benchmark for the problem of virtual camera control and we analyse a number of different problems in different virtual environments. Each of these scenarios is described through a set of complexity measures and, as a result of this analysis, a subset of scenarios...

  14. Design and implementation of a dual-wavelength intrinsic fluorescence camera system

    Science.gov (United States)

    Ortega-Martinez, Antonio; Musacchia, Joseph J.; Gutierrez-Herrera, Enoch; Wang, Ying; Franco, Walfre

    2017-03-01

    Intrinsic UV fluorescence imaging is a technique that permits the observation of spatial differences in emitted fluorescence. It relies on the fluorescence produced by the innate fluorophores in the sample, and thus can be used for marker-less in-vivo assessment of tissue. It has been studied as a tool for the study of the skin, specifically for the classification of lesions, the delimitation of lesion borders and the study of wound healing, among others. In its most basic setup, a sample is excited with a narrow-band UV light source and the resulting fluorescence is imaged with a UV sensitive camera filtered to the emission wavelength of interest. By carefully selecting the excitation/emission pair, we can observe changes in fluorescence associated with physiological processes. One of the main drawbacks of this simple setup is the inability to observe more than a single excitation/emission pair at the same time, as some phenomena are better studied when two or more different pairs are studied simultaneously. In this work, we describe the design and the hardware and software implementation of a dual wavelength portable UV fluorescence imaging system. Its main components are an UV camera, a dual wavelength UV LED illuminator (295 and 345 nm) and two different emission filters (345 and 390 nm) that can be swapped by a mechanical filter wheel. The system is operated using a laptop computer and custom software that performs basic pre-processing to improve the image. The system was designed to allow us to image fluorescent peaks of tryptophan and collagen cross links in order to study wound healing progression.

  15. The GISMO-2 Bolometer Camera

    Science.gov (United States)

    Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.

    2012-01-01

    We present the concept for the GISMO-2 bolometer camera, which we are building for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array; the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the entire field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  16. [Analog gamma camera digitalization computer system].

    Science.gov (United States)

    Rojas, G M; Quintana, J C; Jer, J; Astudillo, S; Arenas, L; Araya, H

    2004-01-01

    Digitalization of analogue gamma camera systems, using special acquisition boards in microcomputers and appropriate software for acquisition and processing of nuclear medicine images, is described in detail. Integrated microcomputer systems interconnected by means of a Local Area Network (LAN) and connected to several gamma cameras have been implemented using specialized acquisition boards. The PIP (Portable Image Processing) software was installed on each microcomputer to acquire and preprocess the nuclear medicine images. A specialized image processing software package has been designed and developed for these purposes. This software allows processing of each nuclear medicine exam in a semiautomatic procedure, and recording of the results on radiological films. A stable, flexible and inexpensive system which makes it possible to digitize, visualize, process, and print nuclear medicine images obtained from analogue gamma cameras was implemented in the Nuclear Medicine Division. Such a system yields higher quality images than those obtained with analogue cameras while keeping operating costs considerably lower (filming: 24.6%, fixing: 48.2%, developing: 26%). Analogue gamma camera systems can be digitalized economically. This system makes it possible to obtain nuclear medicine images of optimal clinical quality, to increase the acquisition and processing efficiency, and to reduce the steps involved in each exam.

  17. Advanced system for Gamma Cameras modernization

    International Nuclear Information System (INIS)

    Osorio Deliz, J. F.; Diaz Garcia, A.; Arista Romeu, E. J.

    2015-01-01

    Analog and digital gamma cameras are still largely used in developing countries. Many of them rely on old hardware electronics, which in many cases limits their use in actual nuclear medicine diagnostic studies. Consequently, different companies worldwide produce medical equipment for partial or total gamma camera modernization. The present work has demonstrated the possibility of substituting almost the entire signal processing electronics inside a gamma camera detector head with a digitizer PCI card. This card includes four 12-bit analog-to-digital converters with 50 MHz sampling speed. It has been installed in a PC and is controlled through software developed in LabVIEW. In addition, some changes were made to the hardware inside the detector head, including a redesign of the Orientation Display Block (ODA card). A new electronic design was also added to the Microprocessor Control Block (MPA card), comprising a PIC microcontroller acting as a tuning system for individual photomultiplier tubes. The images obtained by measurement of a 99mTc point radioactive source using the modernized camera head demonstrate its overall performance. The system was developed and tested on an old ORBITER II SIEMENS GAMMASONIC gamma camera at the National Institute of Oncology and Radiobiology (INOR) under the CAMELUD project supported by the National Program PNOULU and the IAEA. (Author)

  18. Estimates of m_d - m_u and ⟨d̄d⟩ - ⟨ūu⟩ from QCD sum rules for D and D* isospin mass differences

    International Nuclear Information System (INIS)

    Eletsky, V.L.; Ioffe, B.L.

    1993-01-01

    The recent experimental data on the D+ - D0 and D*+ - D*0 mass differences are used as inputs in the QCD sum rules to obtain new estimates of the mass difference of the light quarks and of the difference of their condensates: m_d - m_u = 3 ± 1 MeV, ⟨d̄d⟩ - ⟨ūu⟩ = -(2.5 ± 1) × 10^-3 ⟨ūu⟩ (at a standard normalization point, μ = 0.5 GeV)

  19. Subglottic cysts and asymmetrical subglottic narrowing on neck radiograph

    International Nuclear Information System (INIS)

    Holinger, L.D.; Torium, D.M.; Anandappa, E.C.

    1988-01-01

    The congenital subglottic hemangioma typically appears as an asymmetric subglottic narrowing or mass on frontal neck radiograph. Therefore, soft tissue neck radiography has been advocated as a definitive non-operative approach for diagnosing these lesions. However, we have noted similar asymmetric subglottic narrowing in patients with acquired subglottic cysts. These retention cysts occur following long-term intubation in the neonate. The mechanism probably involves subglottic fibrosis which obstructs glands with subsequent cyst formation. Acquired subglottic cysts typically appear as an asymmetric narrowing on frontal or lateral soft tissue neck radiographs. These lesions may produce airway compromise but are effectively treated by forceps or laser removal. Acquired subglottic cysts must be included in the differential diagnosis of asymmetric subglottic narrowing. The definitive diagnosis is made by direct laryngoscopy, not soft tissue neck radiograph. (orig.)

  20. Effect of different electrode tip angles with tilted torch in stationary gas tungsten arc welding: A 3D simulation

    International Nuclear Information System (INIS)

    Abid, M.; Parvez, S.; Nash, D.H.

    2013-01-01

    In this study, the effect of different tip angles (30°, 60°, 90° and 120°) on the arc and weld pool behavior is analyzed for 2 mm and 5 mm arc lengths with a tilted (70°) torch. Arc temperature, velocity, current density, heat flux and gas shear are investigated in the arc region, and pool convection and puddle shapes are studied in the weld pool region. The arc temperature at the tungsten electrode is found to be maximum with the sharp tip and decreases as the tip angle increases. The arc temperature on the anode (workpiece) surface becomes more concentrated as the tip angle increases. The arc velocity and gas shear stress are large with the sharp tip and decrease as the tip angle increases. The current density on the anode surface does not change with tip angle and is observed to be almost the same for all tip angles in both the 2 mm and 5 mm arc lengths. The heat flux due to conduction and convection is more sensitive to the tip angle and decreases as the tip angle increases. The electromagnetic force increases slightly and the buoyancy force decreases slightly with increasing tip angle. Analyzing each driving force in the weld pool individually shows that the gas drag and Marangoni forces are much stronger than the electromagnetic and buoyancy forces. The weld pool is wide and shallow for the sharp tip and narrow and deep for the large tip angle. Increasing the arc length does not change the weld pool width; however, the weld pool depth changes significantly with arc length and is deepest for the short arc length. The arc properties and weld pool shapes are observed to extend ahead of the electrode tip in the weld direction due to the 70° torch angle. Good agreement is observed between the numerical and experimental weld pool shapes

  1. Multi-pinhole collimator design for small-object imaging with SiliSPECT: a high-resolution SPECT

    International Nuclear Information System (INIS)

    Shokouhi, S; Peterson, T E; Metzler, S D; Wilson, D W

    2009-01-01

    We have designed a multi-pinhole collimator for a dual-headed, stationary SPECT system that incorporates high-resolution silicon double-sided strip detectors. The compact camera design of our system enables imaging at source-collimator distances between 20 and 30 mm. Our analytical calculations show that using knife-edge pinholes with small-opening angles or cylindrically shaped pinholes in a focused, multi-pinhole configuration in combination with this camera geometry can generate narrow sensitivity profiles across the field of view that can be useful for imaging small objects at high sensitivity and resolution. The current prototype system uses two collimators each containing 127 cylindrically shaped pinholes that are focused toward a target volume. Our goal is imaging objects such as a mouse brain, which could find potential applications in molecular imaging.
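
    For context, the geometric sensitivity of a single knife-edge pinhole is commonly approximated by the expression below (d_e is the effective pinhole diameter, h the perpendicular source-to-pinhole distance, θ the off-axis angle of the source); it illustrates why the short 20-30 mm working distances and the large number of focused pinholes are used to recover sensitivity:

```latex
g \;\approx\; \frac{d_e^{2}\,\cos^{3}\theta}{16\,h^{2}}
```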

  2. Gamma camera performance: technical assessment protocol

    International Nuclear Information System (INIS)

    Bolster, A.A.; Waddington, W.A.

    1996-01-01

    This protocol addresses the performance assessment of single and dual headed gamma cameras. No attempt is made to assess the performance of any associated computing systems. Evaluations are usually performed on a gamma camera commercially available within the United Kingdom and recently installed at a clinical site. In consultation with the manufacturer, GCAT selects the site and liaises with local staff to arrange a mutually convenient time for assessment. The manufacturer is encouraged to have a representative present during the evaluation. Three to four days are typically required for the evaluation team to perform the necessary measurements. When access time is limited, the team will modify the protocol to test the camera as thoroughly as possible. Data are acquired on the camera's computer system and are subsequently transferred to the independent GCAT computer system for analysis. This transfer from site computer to the independent system is effected via a hardware interface and Interfile data transfer. (author)

  3. a Spatio-Spectral Camera for High Resolution Hyperspectral Imaging

    Science.gov (United States)

    Livens, S.; Pauly, K.; Baeck, P.; Blommaert, J.; Nuyts, D.; Zender, J.; Delauré, B.

    2017-08-01

    Imaging with a conventional frame camera from a moving remotely piloted aircraft system (RPAS) is by design very inefficient. Less than 1 % of the flying time is used for collecting light. This unused potential can be utilized by an innovative imaging concept, the spatio-spectral camera. The core of the camera is a frame sensor with a large number of hyperspectral filters arranged on the sensor in stepwise lines. It combines the advantages of frame cameras with those of pushbroom cameras. By acquiring images in rapid succession, such a camera can collect detailed hyperspectral information, while retaining the high spatial resolution offered by the sensor. We have developed two versions of a spatio-spectral camera and used them in a variety of conditions. In this paper, we present a summary of three missions with the in-house developed COSI prototype camera (600-900 nm) in the domains of precision agriculture (fungus infection monitoring in experimental wheat plots), horticulture (crop status monitoring to evaluate irrigation management in strawberry fields) and geology (meteorite detection on a grassland field). Additionally, we describe the characteristics of the 2nd generation, commercially available ButterflEYE camera offering extended spectral range (475-925 nm), and we discuss future work.

  4. A SPATIO-SPECTRAL CAMERA FOR HIGH RESOLUTION HYPERSPECTRAL IMAGING

    Directory of Open Access Journals (Sweden)

    S. Livens

    2017-08-01

    Full Text Available Imaging with a conventional frame camera from a moving remotely piloted aircraft system (RPAS) is by design very inefficient. Less than 1 % of the flying time is used for collecting light. This unused potential can be utilized by an innovative imaging concept, the spatio-spectral camera. The core of the camera is a frame sensor with a large number of hyperspectral filters arranged on the sensor in stepwise lines. It combines the advantages of frame cameras with those of pushbroom cameras. By acquiring images in rapid succession, such a camera can collect detailed hyperspectral information, while retaining the high spatial resolution offered by the sensor. We have developed two versions of a spatio-spectral camera and used them in a variety of conditions. In this paper, we present a summary of three missions with the in-house developed COSI prototype camera (600–900 nm) in the domains of precision agriculture (fungus infection monitoring in experimental wheat plots), horticulture (crop status monitoring to evaluate irrigation management in strawberry fields) and geology (meteorite detection on a grassland field). Additionally, we describe the characteristics of the 2nd generation, commercially available ButterflEYE camera offering extended spectral range (475–925 nm), and we discuss future work.

  5. Performance analysis for gait in camera networks

    OpenAIRE

    Michela Goffredo; Imed Bouchrika; John Carter; Mark Nixon

    2008-01-01

    This paper deploys gait analysis for subject identification in multi-camera surveillance scenarios. We present a new method for viewpoint independent markerless gait analysis that does not require camera calibration and works with a wide range of directions of walking. These properties make the proposed method particularly suitable for gait identification in real surveillance scenarios where people and their behaviour need to be tracked across a set of cameras. Tests on 300 synthetic and real...

  6. Scoliosis angle

    International Nuclear Information System (INIS)

    Marklund, T.

    1978-01-01

    The most commonly used methods of assessing the scoliotic deviation measure angles that are not clearly defined in relation to the anatomy of the patient. In order to give an anatomic basis for such measurements it is proposed to define the scoliotic deviation as the deviation the vertebral column makes with the sagittal plane. Both the Cobb and the Ferguson angles may be based on this definition. The present methods of measurement are then attempts to measure these angles. If the plane of these angles is parallel to the film, the measurement will be correct. Errors in the measurements may be incurred by the projection. A hypothetical projection, called a 'rectified orthogonal projection', is presented, which correctly represents all scoliotic angles in accordance with these principles. It can be constructed in practice with the aid of a computer and by performing measurements on two projections of the vertebral column; a scoliotic curve can be represented independent of the kyphosis and lordosis. (Auth.)

  7. Automated analysis of angle closure from anterior chamber angle images.

    Science.gov (United States)

    Baskaran, Mani; Cheng, Jun; Perera, Shamira A; Tun, Tin A; Liu, Jiang; Aung, Tin

    2014-10-21

    To evaluate a novel software capable of automatically grading angle closure on EyeCam angle images in comparison with manual grading of images, with gonioscopy as the reference standard. In this hospital-based, prospective study, subjects underwent gonioscopy by a single observer, and EyeCam imaging by a different operator. The anterior chamber angle in a quadrant was classified as closed if the posterior trabecular meshwork could not be seen. An eye was classified as having angle closure if there were two or more quadrants of closure. Automated grading of the angle images was performed using customized software. Agreement between the methods was ascertained by κ statistic and comparison of area under receiver operating characteristic curves (AUC). One hundred forty subjects (140 eyes) were included, most of whom were Chinese (102/140, 72.9%) and women (72/140, 51.5%). Angle closure was detected in 61 eyes (43.6%) with gonioscopy in comparison with 59 eyes (42.1%, P = 0.73) using manual grading, and 67 eyes (47.9%, P = 0.24) with automated grading of EyeCam images. The agreement for angle closure diagnosis between gonioscopy and both manual (κ = 0.88; 95% confidence interval [CI], 0.81-0.96) and automated grading of EyeCam images was good (κ = 0.74; 95% CI, 0.63-0.85). The AUC for detecting eyes with gonioscopic angle closure was comparable for manual and automated grading (AUC 0.974 vs. 0.954, P = 0.31) of EyeCam images. Customized software for automated grading of EyeCam angle images was found to have good agreement with gonioscopy. Human observation of the EyeCam images may still be needed to avoid gross misclassification, especially in eyes with extensive angle closure. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.

  8. Development of gamma camera and application to decontamination

    International Nuclear Information System (INIS)

    Yoshida, Akira; Moro, Eiji; Takahashi, Isao

    2013-01-01

    A gamma camera has been developed to support recovering from the contamination caused by the accident of Fukushima Dai-ichi Nuclear Power Plant of Tokyo Electric Power Company. The gamma camera enables recognition of the contamination by visualizing radioactivity. The gamma camera has been utilized for risk communication (explanation to community resident) at local governments in Fukushima. From now on, the gamma camera will be applied to solve decontaminations issues; improving efficiency of decontamination, visualizing the effect of decontamination work and reducing radioactive waste. (author)

  9. The making of analog module for gamma camera interface

    International Nuclear Information System (INIS)

    Yulinarsari, Leli; Rl, Tjutju; Susila, Atang; Sukandar

    2003-01-01

    An analog module for a gamma camera interface has been made. For the computerization of a 37-PMT planar gamma camera, interface hardware and software between the planar gamma camera and a PC have been developed. With this interface, the gamma camera image information (originally an analog signal) is converted to a digital signal, so that data acquisition, image quality improvement and data analysis, as well as database processing, can be carried out with the help of computers. There are three main gamma camera signals, i.e. X, Y and Z. This analog module digitizes the analog X and Y signals, which convey position information from the gamma camera crystal. Analog-to-digital conversion is performed by two 12-bit ADCs with a conversion time of 800 ns each; the conversion for each X and Y coordinate is synchronized using a suitable Z strobe signal for event acceptance

  10. An evolution of image source camera attribution approaches.

    Science.gov (United States)

    Jahanirad, Mehdi; Wahab, Ainuddin Wahid Abdul; Anuar, Nor Badrul

    2016-05-01

    Camera attribution plays an important role in digital image forensics by providing the evidence and distinguishing characteristics of the origin of the digital image. It allows the forensic analyser to find the possible source camera which captured the image under investigation. However, in real-world applications, these approaches have faced many challenges due to the large set of multimedia data publicly available through photo sharing and social network sites, captured with uncontrolled conditions and undergone variety of hardware and software post-processing operations. Moreover, the legal system only accepts the forensic analysis of the digital image evidence if the applied camera attribution techniques are unbiased, reliable, nondestructive and widely accepted by the experts in the field. The aim of this paper is to investigate the evolutionary trend of image source camera attribution approaches from fundamental to practice, in particular, with the application of image processing and data mining techniques. Extracting implicit knowledge from images using intrinsic image artifacts for source camera attribution requires a structured image mining process. In this paper, we attempt to provide an introductory tutorial on the image processing pipeline, to determine the general classification of the features corresponding to different components for source camera attribution. The article also reviews techniques of the source camera attribution more comprehensively in the domain of the image forensics in conjunction with the presentation of classifying ongoing developments within the specified area. The classification of the existing source camera attribution approaches is presented based on the specific parameters, such as colour image processing pipeline, hardware- and software-related artifacts and the methods to extract such artifacts. The more recent source camera attribution approaches, which have not yet gained sufficient attention among image forensics
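
    One widely used hardware-artifact approach covered by such reviews is sensor pattern noise (PRNU) correlation. The sketch below illustrates the idea only; a Gaussian filter stands in for the wavelet-based denoiser normally used, and grayscale floating-point images are assumed.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(img, sigma=1.0):
    """Residual W = I - F(I); the residual retains the sensor's PRNU pattern."""
    img = img.astype(np.float64)
    return img - gaussian_filter(img, sigma)

def camera_fingerprint(images):
    """Average the residuals of many (ideally flat, well-lit) images from one camera."""
    return np.mean([noise_residual(im) for im in images], axis=0)

def attribution_score(query_img, fingerprint):
    """Normalized correlation between a query residual and a camera fingerprint."""
    w = noise_residual(query_img).ravel()
    k = fingerprint.ravel()
    w, k = w - w.mean(), k - k.mean()
    return float(w @ k / (np.linalg.norm(w) * np.linalg.norm(k) + 1e-12))
```

    A higher score for one candidate fingerprint than for the others is taken as evidence that the query image originated from that camera.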

  11. Driving with head-slaved camera system

    NARCIS (Netherlands)

    Oving, A.B.; Erp, J.B.F. van

    2001-01-01

    In a field experiment, we tested the effectiveness of a head-slaved camera system for driving an armoured vehicle under armour. This system consists of a helmet-mounted display (HMD), a headtracker, and a motion platform with two cameras. Subjects performed several driving tasks on paved and in

  12. Improving Photometric Calibration of Meteor Video Camera Systems

    Science.gov (United States)

    Ehlert, Steven; Kingery, Aaron; Suggs, Robert

    2017-01-01

    We present the results of new calibration tests performed by the NASA Meteoroid Environment Office (MEO) designed to help quantify and minimize systematic uncertainties in meteor photometry from video camera observations. These systematic uncertainties can be categorized by two main sources: an imperfect understanding of the linearity correction for the MEO's Watec 902H2 Ultimate video cameras and uncertainties in meteor magnitudes arising from transformations between the Watec camera's Sony EX-View HAD bandpass and the bandpasses used to determine reference star magnitudes. To address the first point, we have measured the linearity response of the MEO's standard meteor video cameras using two independent laboratory tests on eight cameras. Our empirically determined linearity correction is critical for performing accurate photometry at low camera intensity levels. With regards to the second point, we have calculated synthetic magnitudes in the EX bandpass for reference stars. These synthetic magnitudes enable direct calculations of the meteor's photometric flux within the camera bandpass without requiring any assumptions of its spectral energy distribution. Systematic uncertainties in the synthetic magnitudes of individual reference stars are estimated at approx. 0.20 mag, and are limited by the available spectral information in the reference catalogs. These two improvements allow for zero-points accurate to 0.05 - 0.10 mag in both filtered and unfiltered camera observations with no evidence for lingering systematics. These improvements are essential to accurately measuring photometric masses of individual meteors and source mass indexes.
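
    A synthetic magnitude of the kind described above is usually computed, for a photon-counting detector, in the generic form below; S_EX is the camera bandpass, F_λ the reference star's spectral energy distribution, and the zero point is set by a reference spectrum such as Vega. Conventions vary, so treat this as a generic form rather than the MEO's specific recipe:

```latex
m_{\mathrm{EX}} \;=\; -2.5 \log_{10}
\frac{\int \lambda\, F_{\lambda}(\lambda)\, S_{\mathrm{EX}}(\lambda)\, d\lambda}
     {\int \lambda\, F_{\lambda}^{\mathrm{ref}}(\lambda)\, S_{\mathrm{EX}}(\lambda)\, d\lambda}
\;+\; m_{\mathrm{ref}}
```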

  13. Phase camera experiment for Advanced Virgo

    International Nuclear Information System (INIS)

    Agatsuma, Kazuhiro; Beuzekom, Martin van; Schaaf, Laura van der; Brand, Jo van den

    2016-01-01

    We report on a study of the phase camera, which is a frequency selective wave-front sensor of a laser beam. This sensor is utilized for monitoring sidebands produced by phase modulations in a gravitational wave (GW) detector. Regarding the operation of the GW detectors, the laser modulation/demodulation method is used to measure mirror displacements and used for the position controls. This plays a significant role because the quality of controls affect the noise level of the GW detector. The phase camera is able to monitor each sideband separately, which has a great benefit for the manipulation of the delicate controls. Also, overcoming mirror aberrations will be an essential part of Advanced Virgo (AdV), which is a GW detector close to Pisa. Especially low-frequency sidebands can be affected greatly by aberrations in one of the interferometer cavities. The phase cameras allow tracking such changes because the state of the sidebands gives information on mirror aberrations. A prototype of the phase camera has been developed and is currently tested. The performance checks are almost completed and the installation of the optics at the AdV site has started. After the installation and commissioning, the phase camera will be combined to a thermal compensation system that consists of CO 2 lasers and compensation plates. In this paper, we focus on the prototype and show some limitations from the scanner performance. - Highlights: • The phase camera is being developed for a gravitational wave detector. • A scanner performance limits the operation speed and layout design of the system. • An operation range was found by measuring the frequency response of the scanner.

  14. Phase camera experiment for Advanced Virgo

    Energy Technology Data Exchange (ETDEWEB)

    Agatsuma, Kazuhiro, E-mail: agatsuma@nikhef.nl [National Institute for Subatomic Physics, Amsterdam (Netherlands); Beuzekom, Martin van; Schaaf, Laura van der [National Institute for Subatomic Physics, Amsterdam (Netherlands); Brand, Jo van den [National Institute for Subatomic Physics, Amsterdam (Netherlands); VU University, Amsterdam (Netherlands)

    2016-07-11

    We report on a study of the phase camera, which is a frequency selective wave-front sensor of a laser beam. This sensor is utilized for monitoring sidebands produced by phase modulations in a gravitational wave (GW) detector. Regarding the operation of the GW detectors, the laser modulation/demodulation method is used to measure mirror displacements and used for the position controls. This plays a significant role because the quality of controls affect the noise level of the GW detector. The phase camera is able to monitor each sideband separately, which has a great benefit for the manipulation of the delicate controls. Also, overcoming mirror aberrations will be an essential part of Advanced Virgo (AdV), which is a GW detector close to Pisa. Especially low-frequency sidebands can be affected greatly by aberrations in one of the interferometer cavities. The phase cameras allow tracking such changes because the state of the sidebands gives information on mirror aberrations. A prototype of the phase camera has been developed and is currently tested. The performance checks are almost completed and the installation of the optics at the AdV site has started. After the installation and commissioning, the phase camera will be combined to a thermal compensation system that consists of CO{sub 2} lasers and compensation plates. In this paper, we focus on the prototype and show some limitations from the scanner performance. - Highlights: • The phase camera is being developed for a gravitational wave detector. • A scanner performance limits the operation speed and layout design of the system. • An operation range was found by measuring the frequency response of the scanner.

  15. Design, Construction, Demonstration and Delivery of an Automated Narrow Gap Welding System.

    Science.gov (United States)

    1982-06-29

    Design, construction, demonstration and delivery of an automated Narrow Gap welding system, under Contract No. NOOGOO-81-C-E923 to the David Taylor Naval Research and Development Center (CRC Automatic Welding Co., Houston, TX). A distinctive feature of the automated Narrow Gap welding process is the narrow (3/8-inch), square-butt joint design. This narrow joint greatly reduces the volume of weld metal.

  16. Two-Phase Algorithm for Optimal Camera Placement

    Directory of Open Access Journals (Sweden)

    Jun-Woo Ahn

    2016-01-01

    As markets for visual sensor networks have become larger, interest in the optimal camera placement problem has continued to increase. The best-known approach to the optimal camera placement problem is based on binary integer programming (BIP). Because the problem is NP-hard, however, it is difficult to solve complex, real-world instances using BIP alone. Many approximation algorithms have been developed for this problem. In this paper, a two-phase algorithm is proposed as a BIP-based approximation algorithm that can solve the optimal camera placement problem for a placement space larger than those considered in previous studies. This study solves the problem in three-dimensional space for a real-world structure.
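
    For readers unfamiliar with the BIP formulation that such approximation algorithms build on, the sketch below states the basic set-cover version (minimize the number of cameras while covering every target) using the PuLP solver; the candidate positions and coverage sets are invented toy data, and this is not the paper's two-phase algorithm itself.

        # Minimal BIP (set-cover) formulation of camera placement, solved with PuLP.
        import pulp

        covers = {            # candidate camera -> targets it can see (invented toy data)
            "c1": {"t1", "t2"},
            "c2": {"t2", "t3"},
            "c3": {"t1", "t3", "t4"},
        }
        targets = {"t1", "t2", "t3", "t4"}

        prob = pulp.LpProblem("camera_placement", pulp.LpMinimize)
        x = pulp.LpVariable.dicts("use", covers.keys(), cat="Binary")

        prob += pulp.lpSum(x[c] for c in covers)                   # minimise number of cameras
        for t in targets:                                          # every target seen at least once
            prob += pulp.lpSum(x[c] for c in covers if t in covers[c]) >= 1

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        print([c for c in covers if x[c].value() == 1])            # e.g. ['c2', 'c3']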

  17. Portable mini gamma camera for medical applications

    CERN Document Server

    Porras, E; Benlloch, J M; El-Djalil-Kadi-Hanifi, M; López, S; Pavon, N; Ruiz, J A; Sánchez, F; Sebastiá, A

    2002-01-01

    A small, portable and low-cost gamma camera for medical applications has been developed and clinically tested. The camera, based on a scintillator crystal and a position-sensitive photomultiplier tube, has a useful field of view of 4.6 cm diameter and provides an intrinsic spatial resolution of 2.2 mm. Its mobility and light weight allow it to reach the patient from any desired direction. The camera images small organs with high efficiency and so addresses the demand for devices for specific clinical applications. In this paper, we present the camera and briefly describe the procedures that led us to choose its configuration and the image reconstruction method. The clinical tests and diagnostic capability are also presented and discussed.

  18. Another look at volume self-calibration: calibration and self-calibration within a pinhole model of Scheimpflug cameras

    International Nuclear Information System (INIS)

    Cornic, Philippe; Le Besnerais, Guy; Champagnat, Frédéric; Illoul, Cédric; Cheminet, Adam; Le Sant, Yves; Leclaire, Benjamin

    2016-01-01

    We address calibration and self-calibration of tomographic PIV experiments within a pinhole model of cameras. A complete and explicit pinhole model of a camera equipped with a two-tilt-angle Scheimpflug adapter is presented. It is then used in a calibration procedure based on a freely moving calibration plate. While the resulting calibrations are accurate enough for Tomo-PIV, we confirm through a simple experiment that they are not stable in time, and we illustrate how the pinhole framework can be used to provide a quantitative evaluation of geometrical drifts in the setup. We propose an original self-calibration method based on global optimization of the extrinsic parameters of the pinhole model. These methods are successfully applied to tomographic PIV of an air jet experiment. An unexpected by-product of our work is the demonstration that volume self-calibration induces a change in the world-frame coordinates. Provided the calibration drift is small, as generally observed in PIV, the bias on the estimated velocity field is negligible, but the absolute location cannot be accurately recovered using standard calibration data. (paper)
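
    As a point of reference for the camera model discussed above, the sketch below shows the bare pinhole projection x ≃ K[R|t]X in Python; the two-tilt-angle Scheimpflug extension and the calibration/self-calibration procedures themselves are not reproduced here, and all numbers are invented for illustration.

        import numpy as np

        K = np.array([[1200.0,    0.0, 640.0],      # fx, skew, cx (invented intrinsics)
                      [   0.0, 1200.0, 480.0],      # fy, cy
                      [   0.0,    0.0,   1.0]])
        R = np.eye(3)                               # extrinsic rotation (world -> camera)
        t = np.array([0.0, 0.0, 500.0])             # extrinsic translation, in mm

        def project(X_world):
            """Project a 3D world point to pixel coordinates with the pinhole model."""
            X_cam = R @ X_world + t                 # world -> camera frame
            x = K @ X_cam                           # homogeneous image coordinates
            return x[:2] / x[2]                     # perspective division

        print(project(np.array([10.0, -5.0, 0.0]))) # -> pixel (u, v)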

  19. Narrow Escape of Interacting Diffusing Particles

    Science.gov (United States)

    Agranov, Tal; Meerson, Baruch

    2018-03-01

    The narrow escape problem deals with the calculation of the mean escape time (MET) of a Brownian particle from a bounded domain through a small hole on the domain's boundary. Here we develop a formalism which allows us to evaluate the nonescape probability of a gas of diffusing particles that may interact with each other. In some cases the nonescape probability allows us to evaluate the MET of the first particle. The formalism is based on fluctuating hydrodynamics and the recently developed macroscopic fluctuation theory. We also uncover an unexpected connection between the narrow escape of interacting particles and thermal runaway in chemical reactors.
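
    To show concretely what quantity is being computed, the sketch below estimates the mean escape time of a single (non-interacting) Brownian particle from the unit disk through a small boundary arc by direct Monte Carlo; the paper's formalism for interacting particles is analytical and is not reproduced here. Step size, hole size, and sample count are illustration values.

        import numpy as np

        def escape_time(eps=0.1, D=1.0, dt=1e-3, rng=None):
            """Time for a Brownian particle started at the centre of the unit disk
            to leave through the boundary arc |theta| < eps (reflecting elsewhere)."""
            if rng is None:
                rng = np.random.default_rng()
            x, t = np.zeros(2), 0.0
            sigma = np.sqrt(2.0 * D * dt)
            while True:
                x = x + sigma * rng.standard_normal(2)
                t += dt
                r = np.linalg.norm(x)
                if r >= 1.0:                              # reached the boundary
                    if abs(np.arctan2(x[1], x[0])) < eps:
                        return t                          # escaped through the hole
                    x *= (2.0 - r) / r                    # crude reflection back inside

        met = np.mean([escape_time() for _ in range(200)])
        print(f"Monte Carlo MET estimate (eps = 0.1): {met:.2f}")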

  20. Color reproduction software for a digital still camera

    Science.gov (United States)

    Lee, Bong S.; Park, Du-Sik; Nam, Byung D.

    1998-04-01

    We have developed color reproduction software for a digital still camera. The image taken by the camera was colorimetrically reproduced on the monitor after characterizing the camera and the monitor and performing color matching between the two devices. The reproduction was performed in three steps: level processing, gamma correction, and color transformation. The image contrast was increased by the level processing, which adjusts the levels of the dark and bright portions of the image. The relationship between the level-processed digital values and the measured luminance values of test gray samples was calculated, and the gamma of the camera was obtained. A method for obtaining the unknown monitor gamma was also proposed. As a result, the level-processed values were adjusted by a look-up table created from the camera and monitor gamma corrections. For the camera's color transformation, a 3-by-3 or 3-by-4 matrix was used, calculated by regression between the gamma-corrected values and the measured tristimulus values of the test color samples. The various reproduced images, generated for four illuminations of the camera and three color temperatures of the monitor, were displayed in a dialogue box implemented in our software. A user can easily choose the best reproduced image by comparing them.
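
    To illustrate the gamma-correction step described above, the sketch below fits a camera gamma from gray-sample measurements and builds a combined camera/monitor look-up table; the patch values and the assumed monitor gamma are invented, and the color-matrix regression of the actual software is only indicated, not reproduced.

        import numpy as np

        def fit_gamma(digital, luminance):
            """Fit L ~ (d/255)^gamma to gray-sample data by log-log regression."""
            d = np.asarray(digital, float) / 255.0
            L = np.asarray(luminance, float) / np.max(luminance)
            gamma, _ = np.polyfit(np.log(d), np.log(L), 1)
            return gamma

        def make_lut(camera_gamma, monitor_gamma, size=256):
            """LUT that linearises camera output, then pre-compensates the monitor gamma."""
            d = np.arange(size) / (size - 1)
            return np.clip((d ** camera_gamma) ** (1.0 / monitor_gamma), 0, 1) * (size - 1)

        # Invented gray-sample measurements (digital value vs. measured luminance)
        digital   = [32, 64, 96, 128, 160, 192, 224]
        luminance = [1.1, 4.5, 10.8, 20.3, 33.5, 50.9, 73.0]
        cam_gamma = fit_gamma(digital, luminance)
        lut = make_lut(cam_gamma, monitor_gamma=2.2)
        print(f"estimated camera gamma: {cam_gamma:.2f}")

        # The 3x3 color matrix would be fitted analogously, e.g. with
        # np.linalg.lstsq(rgb_linear, xyz_measured, rcond=None) on the test color samples.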