WorldWideScience

Sample records for astrometry instrumentation algorithms

  1. WFIRST: Astrometry with the Wide-Field Imager

    Science.gov (United States)

    Bellini, Andrea; WFIRST Astrometry Working Group

    2018-01-01

    The wide field of view and stable, sharp images delivered by WFIRST's Wide-Field Imager make it an excellent instrument for astrometry, one of five major discovery areas identified in the 2010 Decadal Survey. Compared to the Hubble Space Telescope, WFIRST's wider field of view with similar image quality will provide hundreds more astrometric targets per image as well as background galaxies and stars with precise positions in the Gaia catalog. In addition, WFIRST will operate in the infrared, a wavelength regime where the most precise astrometry has so far been achieved with adaptive optics images from large ground-based telescopes. WFIRST will provide at least a factor of three improvement in astrometry over the current state of the art in this wavelength range, while spanning a field of view thousands of times larger. WFIRST is thus poised to make major contributions to multiple science topics in which astrometry plays an important role, without major alterations to the planned mission or instrument. We summarize a few of the most compelling science cases where WFIRST astrometry could prove transformational.

  2. Global astrometry with the space interferometry mission

    Science.gov (United States)

    Boden, A.; Unwin, S.; Shao, M.

    1997-01-01

    The prospects for global astrometric measurements with the Space Interferometry Mission (SIM) are discussed. The SIM mission will perform four-microarcsec astrometric measurements on objects as faint as 20 mag using the optical interferometry technique with a 10 m baseline. The SIM satellite will perform narrow-angle astrometry and global astrometry by means of an astrometric grid. The sensitivities of SIM's global astrometric performance and grid accuracy to instrumental parameters and sky coverage schemes are reported. The problems of finding suitable astrometric grid objects to support microarcsec astrometry, and the related ground-based observation programs, are discussed.

  3. Observing exoplanet populations with high-precision astrometry

    Science.gov (United States)

    Sahlmann, Johannes

    2012-06-01

    This thesis deals with the application of the astrometry technique, which consists of measuring the position of a star in the plane of the sky, to the discovery and characterisation of extra-solar planets. It is feasible only with very high measurement precision, which motivates the use of space observatories, the development of new ground-based astronomical instrumentation, and innovative data analysis methods: The study of Sun-like stars with substellar companions using CORALIE radial velocities and HIPPARCOS astrometry leads to the determination of the frequency of close brown dwarf companions and to the discovery of a dividing line between massive planets and brown dwarf companions; An observation campaign employing optical imaging with a very large telescope demonstrates sufficient astrometric precision to detect planets around ultra-cool dwarf stars, and the first results of the survey are presented; Finally, the design and initial astrometric performance of PRIMA, a new dual-feed near-infrared interferometric observing facility for relative astrometry, are presented.

  4. Astrometry VLBI in Space (AVS)

    Science.gov (United States)

    Cheng, Li-Jen; Reyes, George

    1995-01-01

    This paper describes a proposal for a new space radio astronomy mission for astrometry using Very Long Baseline Interferometry (VLBI) called Astrometry VLBI in Space (AVS). The ultimate goals of AVS are improving the accuracy of radio astrometry measurements to the microarcsecond level in one epoch of measurements and improving the accuracy of the transformation between the inertial radio and optical coordinate reference frames. This study will also assess the impact of this mission on astrophysics astrometry and geophysics.

  5. Astrometry VLBI in Space (AVS)

    Science.gov (United States)

    Altunin, V.; Alekseev, V.; Akim, E.; Eubanks, M.; Kingham, K.; Treuhaft, R.; Sukhanov, K.

    1995-01-01

    A proposed new space radio astronomy mission for astrometry is described. The Astrometry VLBI (very long baseline) in Space (AVS) nominal mission includes two identical spacecraft, each with a 4-m antenna sending data to a 70-m ground station. The goals of AVS are improving astrometry accuracy to the microarcsecond level and improving the accuracy of the transformation between the inertial radio and optical coordinate reference frames.

  6. High Accuracy Ground-based near-Earth-asteroid Astrometry using Synthetic Tracking

    Science.gov (United States)

    Zhai, Chengxing; Shao, Michael; Saini, Navtej; Sandhu, Jagmit; Werne, Thomas; Choi, Philip; Ely, Todd A.; Jacobs, Christopher S.; Lazio, Joseph; Martin-Mur, Tomas J.; Owen, William M.; Preston, Robert; Turyshev, Slava; Michell, Adam; Nazli, Kutay; Cui, Isaac; Monchama, Rachel

    2018-01-01

    Accurate astrometry is crucial for determining the orbits of near-Earth asteroids (NEAs). Further, the future of deep space high data rate communications is likely to be optical communications, such as the Deep Space Optical Communications package that is part of the baseline payload for the planned Psyche Discovery mission to the Psyche asteroid. We have recently upgraded our instrument on the Pomona College 1 m telescope, at JPL's Table Mountain Facility, for conducting synthetic tracking by taking many short exposure images. These images can then be combined in post-processing to track both the asteroid and the reference stars to yield accurate astrometry. Utilizing the precision of the current and future Gaia data releases, the JPL-Pomona College effort is now demonstrating precision astrometry on NEAs, which is likely to be of considerable value for cataloging NEAs. Further, treating NEAs as proxies of future spacecraft that carry optical communication lasers, our results serve as a measure of the astrometric accuracy that could be achieved for future plane-of-sky optical navigation.
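
    The core of synthetic tracking, combining many short exposures along a trial sky motion, can be illustrated in a few lines. The sketch below is a minimal illustration with assumed inputs (a stack of frames, a frame spacing, and a trial rate); it is not the JPL-Pomona pipeline itself.

    ```python
    import numpy as np

    def synthetic_track(frames, dt, rate_xy):
        """Shift-and-add a stack of short exposures along a trial sky rate.

        frames  : ndarray (n_frames, ny, nx) of short-exposure images
        dt      : time step between frames (s)
        rate_xy : trial motion (pixels/s) in (x, y); (0, 0) tracks the stars
        """
        n, ny, nx = frames.shape
        stack = np.zeros((ny, nx))
        for k in range(n):
            # integer-pixel shift for simplicity; a real pipeline would
            # interpolate to sub-pixel accuracy
            sx = int(round(rate_xy[0] * k * dt))
            sy = int(round(rate_xy[1] * k * dt))
            stack += np.roll(np.roll(frames[k], -sy, axis=0), -sx, axis=1)
        return stack / n

    # A grid of trial rates would be searched; the rate that maximizes the
    # point-source signal-to-noise ratio recovers the asteroid, while the
    # zero-rate stack keeps the reference stars sharp for astrometry.
    ```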

  7. The Future of Astrometry in Space

    Directory of Open Access Journals (Sweden)

    Antonella Vallenari

    2018-04-01

    This contribution focuses on the importance of astrometry and on its future developments. Over the centuries astrometry has greatly contributed to the advance of our knowledge of the Universe. Nowadays a major breakthrough is on the way thanks to astrometric sky surveys from space. The ESA space missions Hipparcos first and then Gaia point out the outstanding contribution that space astrometry can provide to our knowledge in many fields of astrophysics, from Milky Way formation and evolution to stellar astrophysics, extra-galactic astrophysics, and fundamental physics. We briefly outline the properties of the Gaia first and second data releases and the accuracies expected at the end of the mission. The next big advance in space astrometry would be either to improve the astrometric accuracy by one order of magnitude or to move to a different wavelength domain. While both options have the potential to bring us into a new era of discovery, they face enormous challenges. We summarize the future directions in space astrometry that are proposed or under investigation by the scientific community, their main challenges, and the expected outcome.

  8. REFERENCE-LESS DETECTION, ASTROMETRY, AND PHOTOMETRY OF FAINT COMPANIONS WITH ADAPTIVE OPTICS

    International Nuclear Information System (INIS)

    Gladysz, Szymon; Christou, Julian C.

    2009-01-01

    We propose a complete framework for the detection, astrometry, and photometry of faint companions from a sequence of adaptive optics (AO) corrected short exposures. The algorithms exploit the difference in statistics between the on-axis and off-axis intensity of the AO point-spread function (PSF) to differentiate real sources from speckles. We validate the new approach and illustrate its performance using moderate Strehl ratio data obtained with the natural guide star AO system on the Lick Observatory's 3 m Shane Telescope. We obtain almost a 2 mag gain in achievable contrast by using our detection method compared to 5σ detectability in long exposures. We also present a first guide to expected accuracy of differential photometry and astrometry with the new techniques. Our approach performs better than PSF-fitting in general and especially so for close companions, which are located within the uncompensated seeing (speckle) halo. All three proposed algorithms are self-calibrating, i.e., they do not require observation of a calibration star. One of the advantages of this approach is improved observing efficiency.

  9. PACMAN: PRIMA astrometric instrument software

    Science.gov (United States)

    Abuter, Roberto; Sahlmann, Johannes; Pozna, Eszter

    2010-07-01

    The dual-feed astrometric instrument software of PRIMA (PACMAN) that is currently being integrated at the VLTI will use two spatially modulated fringe sensor units and a laser metrology system to carry out differential astrometry. Its software and hardware comprise a distributed system involving many real-time computers and workstations operating in a synchronized manner. Its architecture has been designed to allow the construction of efficient and flexible calibration and observation procedures. In parallel, a novel scheme for integrating M-code (MATLAB/OCTAVE) with standard VLT (Very Large Telescope) control software applications had to be devised in order to support numerically intensive operations and to allow fast adaptation to varying strategies and algorithms. This paper presents the instrument software, including the current operational sequences for laboratory calibration and sky calibration. Finally, a detailed description of the algorithms and their implementation, in both M and C code, is given together with a comparative analysis of their performance and maintainability.

  10. Micro-Arcsec mission: implications of the monitoring, diagnostic and calibration of the instrument response in the data reduction chain

    Science.gov (United States)

    Busonero, D.; Gai, M.

    The goals of 21st century high angular precision experiments rely on the limiting performance associated with the selected instrumental configuration and observational strategy. Both global and narrow-angle micro-arcsec space astrometry require that the instrument contributions to the overall error budget be smaller than the desired micro-arcsec level precision. Appropriate modelling of the astrometric response is required for optimal definition of the data reduction and calibration algorithms, in order to ensure high sensitivity to the astrophysical source parameters and, in general, high accuracy. We refer to the framework of SIM-Lite and the Gaia mission, the most challenging space missions of the next decade in the narrow-angle and global astrometry fields, respectively. We focus our discussion on the Gaia data reduction issues and instrument calibration implications. We describe selected topics in the framework of the Astrometric Instrument Modelling for the Gaia mission, evidencing their role in the data reduction chain, and we give a brief overview of the Astrometric Instrument Model Data Analysis Software System, a Java-based pipeline under development by our team.

  11. Atmospheric scintillation at Dome C, Antarctica: implications for photometry and astrometry

    Science.gov (United States)

    Kenyon, S.; Lawrence, J.; Ashley, M. C. B.; Storey, J. W. V.; Tokovinin, A.; Fossat, E.

    2006-08-01

    Night-time turbulence profiles of the atmosphere above Dome C, Antarctica, were measured during 2004, using a MASS instrument. We compare these data with turbulence profiles above Cerro Tololo and Cerro Pachon, also measured with a MASS, and find, with the exception of the lowest layer, that Dome C has significantly less turbulence. In addition, the integrated turbulence at 16 km above Dome C is always less than the median values at the two Chilean sites. Using average wind speed profiles, we assess the photometric noise produced by scintillation, and the atmospheric contribution to the error budget in narrow-angle differential astrometry. In comparison with the two mid-latitude sites in Chile, Dome C offers a potential gain of about 3.6 in both photometric precision (for long integrations) and narrow-angle astrometry precision. Although the data from Dome C cover a fairly limited time frame, they lend strong support to expectations that Dome C will offer significant advantages for photometric and astrometric studies.

  12. Instrument-induced spatial crosstalk deconvolution algorithm

    Science.gov (United States)

    Wright, Valerie G.; Evans, Nathan L., Jr.

    1986-01-01

    An algorithm has been developed which reduces the effects of (deconvolves) instrument-induced spatial crosstalk in satellite image data by several orders of magnitude where highly precise radiometry is required. The algorithm is based upon radiance transfer ratios, which are defined as the fractional bilateral exchange of energy between pixels A and B.
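
    As a rough illustration of the general idea rather than the authors' algorithm: if the transfer ratios describe how much of each pixel's true radiance leaks into its neighbours, the measured image is a linear mixing of the true radiances and can be inverted. The sketch below assumes an image small enough that the full mixing matrix can be formed explicitly.

    ```python
    import numpy as np

    def deconvolve_crosstalk(measured, transfer):
        """Invert instrument-induced spatial crosstalk.

        measured : (n_pix,) flattened measured radiances
        transfer : (n_pix, n_pix) matrix; transfer[a, b] is the fraction of
                   pixel b's true radiance that appears in pixel a
                   (diagonal ~1, off-diagonal small).
        Returns the estimated true radiances.
        """
        return np.linalg.solve(transfer, measured)

    # Example: 3 pixels with 1% bilateral leakage between neighbours
    T = np.array([[0.99, 0.01, 0.00],
                  [0.01, 0.98, 0.01],
                  [0.00, 0.01, 0.99]])
    true = np.array([100.0, 10.0, 1.0])
    obs = T @ true
    print(deconvolve_crosstalk(obs, T))  # recovers ~[100, 10, 1]
    ```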

  13. High-precision astrometry towards ELTs

    NARCIS (Netherlands)

    Massari, Davide; Fiorentino, Giuliana; Tolstoy, Eline; McConnachie, Alan; Stuik, Remko; Schreiber, Laura; Andersen, David; Clénet, Yann; Davies, Richard; Gratadour, Damien; Kuijken, Konrad; Navarro, Ramon; Pott, Jörg-Uwe; Rodeghiero, Gabriele; Turri, Paolo; Verdoes Kleijn, Gijs

    2016-01-01

    With the aim of paving the road for future accurate astrometry with MICADO at the European-ELT, we performed an astrometric study using two different but complementary approaches to investigate two critical components that contribute to the total astrometric accuracy. First, we tested the predicted

  14. Microlensing and Its Degeneracy Breakers: Parallax, Finite Source, High-Resolution Imaging, and Astrometry

    Directory of Open Access Journals (Sweden)

    Chien-Hsiu Lee

    2017-07-01

    First proposed by Paczynski in 1986, microlensing has been instrumental in the search for compact dark matter as well as in the discovery and characterization of exoplanets. In this article, we provide a brief history of microlensing, especially the discoveries of compact objects and exoplanets. We then review the basics of microlensing and how astrometry can help break the degeneracy, providing a more robust determination of the nature of microlensing events. We also outline the prospects offered by on-going and forthcoming experiments and observatories.
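
    For orientation, the astrometric signature mentioned here has a simple closed form for a point lens: the light centroid of the two micro-images is displaced from the true source position by an amount set by the impact parameter u (in units of the angular Einstein radius θ_E). The standard relations, restated here generically rather than quoted from this article, are

    ```latex
    \delta\theta_{\rm c} \;=\; \frac{u}{u^{2}+2}\,\theta_{\rm E},
    \qquad
    \theta_{\rm E} \;=\; \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{\rm S}-D_{\rm L}}{D_{\rm L}\,D_{\rm S}}},
    ```

    so the centroid shift peaks at u = sqrt(2) at a value of θ_E / (2 sqrt(2)); measuring the shift astrometrically yields θ_E and thus helps break the lens mass-distance degeneracy left by the photometric light curve alone.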

  15. Ground-based Opportunities for Astrometry

    Science.gov (United States)

    2013-01-01

    science that is possible. The starting points for many of these projects are the surveys, such as 2MASS, DENIS, and the SDSS recently made with... Cutri, R. M. (2006). Extending the ICRF into the infrared: 2MASS-UCAC astrometry. In JD16: The International Celestial Reference System: Maintenance

  16. An Improved Technique for the Photometry and Astrometry of Faint Companions

    Science.gov (United States)

    Burke, Daniel; Gladysz, Szymon; Roberts, Lewis; Devaney, Nicholas; Dainty, Chris

    2009-07-01

    We propose a new approach to differential astrometry and photometry of faint companions in adaptive optics images. It is based on a prewhitening matched filter, also referred to in the literature as the Hotelling observer. We focus on cases where the signal of the companion is located within the bright halo of the parent star. Using real adaptive optics data from the 3 m Shane telescope at the Lick Observatory, we compare the performance of the Hotelling algorithm with other estimation algorithms currently used for the same problem. The real single-star data are used to generate artificial binary objects with a range of magnitude ratios. In most cases, the Hotelling observer gives significantly lower astrometric and photometric errors. In the case of high Strehl ratio (SR) data (SR ≈ 0.5), the differential photometry of a binary star with Δm = 4.5 and a separation of 0.6″ is better than 0.1 mag, a factor of 2 lower than the other algorithms considered.
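
    A minimal sketch of a prewhitening matched filter (Hotelling-type) test statistic, under assumed inputs (a companion PSF template and a background/speckle covariance estimated from companion-free data); this illustrates the general technique, not the authors' implementation.

    ```python
    import numpy as np

    def hotelling_statistic(image, template, cov):
        """Prewhitening matched filter (Hotelling observer) detection SNR.

        image    : (n_pix,) flattened data (mean background subtracted)
        template : (n_pix,) expected companion signal at the tested position
        cov      : (n_pix, n_pix) covariance of the background/speckle noise
        """
        w = np.linalg.solve(cov, template)   # prewhitened template K^{-1} s
        t = w @ image                        # linear test statistic w^T g
        sigma = np.sqrt(w @ template)        # its standard deviation under noise
        return t / sigma                     # detection SNR at this position

    # Scanning the template position over the field and thresholding the
    # resulting SNR map yields candidate companions; the same linear weights
    # can be reused for photometric and astrometric estimation.
    ```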

  17. Instrument design and optimization using genetic algorithms

    International Nuclear Information System (INIS)

    Hoelzel, Robert; Bentley, Phillip M.; Fouquet, Peter

    2006-01-01

    This article describes the design of highly complex physical instruments by using a canonical genetic algorithm (GA). The procedure can be applied to all instrument designs where performance goals can be quantified. It is particularly suited to the optimization of instrument design where local optima in the performance figure of merit are prevalent. Here, a GA is used to evolve the design of the neutron spin-echo spectrometer WASP which is presently being constructed at the Institut Laue-Langevin, Grenoble, France. A comparison is made between this artificial intelligence approach and the traditional manual design methods. We demonstrate that the search of parameter space is more efficient when applying the genetic algorithm, and the GA produces a significantly better instrument design. Furthermore, it is found that the GA increases flexibility, by facilitating the reoptimization of the design after changes in boundary conditions during the design phase. The GA also allows the exploration of 'nonstandard' magnet coil geometries. We conclude that this technique constitutes a powerful complementary tool for the design and optimization of complex scientific apparatus, without replacing the careful thought processes employed in traditional design methods.

  18. Instrument design and optimization using genetic algorithms

    Science.gov (United States)

    Hölzel, Robert; Bentley, Phillip M.; Fouquet, Peter

    2006-10-01

    This article describes the design of highly complex physical instruments by using a canonical genetic algorithm (GA). The procedure can be applied to all instrument designs where performance goals can be quantified. It is particularly suited to the optimization of instrument design where local optima in the performance figure of merit are prevalent. Here, a GA is used to evolve the design of the neutron spin-echo spectrometer WASP which is presently being constructed at the Institut Laue-Langevin, Grenoble, France. A comparison is made between this artificial intelligence approach and the traditional manual design methods. We demonstrate that the search of parameter space is more efficient when applying the genetic algorithm, and the GA produces a significantly better instrument design. Furthermore, it is found that the GA increases flexibility, by facilitating the reoptimization of the design after changes in boundary conditions during the design phase. The GA also allows the exploration of "nonstandard" magnet coil geometries. We conclude that this technique constitutes a powerful complementary tool for the design and optimization of complex scientific apparatus, without replacing the careful thought processes employed in traditional design methods.
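
    A canonical GA of the kind described (selection, crossover, and mutation driven by a quantified figure of merit) can be sketched generically; the figure-of-merit function below is a placeholder, not the WASP spectrometer model.

    ```python
    import random

    def genetic_optimize(fitness, n_params, pop_size=50, generations=200,
                         mutation_rate=0.1, bounds=(0.0, 1.0)):
        """Minimal canonical genetic algorithm for a real-valued design vector."""
        lo, hi = bounds
        pop = [[random.uniform(lo, hi) for _ in range(n_params)]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)        # rank by figure of merit
            survivors = pop[:pop_size // 2]            # truncation selection
            children = []
            while len(children) < pop_size - len(survivors):
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, n_params)    # one-point crossover
                child = a[:cut] + b[cut:]
                for i in range(n_params):              # uniform mutation
                    if random.random() < mutation_rate:
                        child[i] = random.uniform(lo, hi)
                children.append(child)
            pop = survivors + children
        return max(pop, key=fitness)

    # Placeholder figure of merit (to be replaced by an instrument model):
    best = genetic_optimize(lambda x: -sum((xi - 0.3) ** 2 for xi in x), n_params=4)
    ```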

  19. Precision Photometry and Astrometry from Pan-STARRS

    Science.gov (United States)

    Magnier, Eugene A.; Pan-STARRS Team

    2018-01-01

    The Pan-STARRS 3pi Survey has been calibrated with excellent precision for both astrometry and photometry. The Pan-STARRS Data Release 1, opened to the public on 2016 Dec 16, provides photometry in 5 well-calibrated, well-defined bandpasses (grizy) astrometrically registered to the Gaia frame. Comparisons with other surveys illustrate the high quality of the calibration and provide tests of remaining systematic errors in both Pan-STARRS and those external surveys. With photometry and astrometry of roughly 3 billion astronomical objects, the Pan-STARRS DR1 has substantial overlap with Gaia, SDSS, 2MASS and other surveys. I will discuss the astrometric tie between Pan-STARRS DR1 and Gaia and show comparisons between Pan-STARRS and other large-scale surveys.

  20. Astrometry with A-Track Using Gaia DR1 Catalogue

    Science.gov (United States)

    Kılıç, Yücel; Erece, Orhan; Kaplan, Murat

    2018-04-01

    In this work, we built all-sky index files from the Gaia DR1 catalogue for high-precision astrometric field solutions and precise WCS coordinates of moving objects. For this, we used the build-astrometry-index program, part of the astrometry.net code suite. Additionally, we added astrometry.net's WCS solution tool to our previously developed software, A-Track, a fast and robust pipeline for detecting moving objects such as asteroids and comets in sequential FITS images. Moreover, an MPC module was added to A-Track. This module is linked to an asteroid database to name the found objects and prepare the MPC file to report the results. After these innovations, we tested the new version of the A-Track code on photometric data taken by the SI-1100 CCD with the 1-meter telescope at the TÜBİTAK National Observatory, Antalya. The pipeline can be used to analyse large data archives or daily sequential data. The code is hosted on GitHub under the GNU GPL v3 license.
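
    Once a WCS solution has been written into a FITS header by the plate-solving step, converting detected pixel positions of a moving object into sky coordinates is straightforward. The sketch below uses astropy purely as an illustration (A-Track's own internals may differ), and the file name is hypothetical.

    ```python
    from astropy.io import fits
    from astropy.wcs import WCS

    header = fits.getheader("field_solved.fits")   # hypothetical solved frame
    wcs = WCS(header)

    # pixel coordinates of a detected moving object (x, y)
    sky = wcs.pixel_to_world(512.3, 498.7)
    print(sky.to_string("hmsdms"))                 # RA/Dec for the MPC report
    ```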

  1. Absolute Astrometry in the next 50 Years - II

    Science.gov (United States)

    Høg, E.

    2018-01-01

    With the Gaia astrometric satellite in orbit since December 2013, it is time to look at the future of fundamental astrometry, and a time frame of 50 years is appropriate in this matter. A space mission with Gaia-like astrometric performance is required, but not necessarily a Gaia-like satellite. A dozen science issues for a Gaia successor mission in twenty years, with launch about 2035, are presented; in this context, other possibilities for absolute astrometry with milliarcsecond (mas) or sub-mas accuracies are also discussed in my report at http://arxiv.org/abs/1408.2190. In brief, the two missions (2013 and 2035) would provide an astrometric foundation for all branches of astronomy, from the solar system and stellar systems, including exo-planet systems with long periods, to compact galaxies, quasars and Dark Matter substructures, with data which cannot be surpassed in the next 50 years.

  2. Artificial intelligence programming with LabVIEW: genetic algorithms for instrumentation control and optimization.

    Science.gov (United States)

    Moore, J H

    1995-06-01

    A genetic algorithm for instrumentation control and optimization was developed using the LabVIEW graphical programming environment. The usefulness of this methodology for the optimization of a closed loop control instrument is demonstrated with minimal complexity, and the programming is presented in detail to facilitate its adaptation to other LabVIEW applications. Closed loop control instruments have a variety of applications in the biomedical sciences, including the regulation of physiological processes such as blood pressure. The program presented here should provide a useful starting point for those wishing to incorporate genetic algorithm approaches into LabVIEW-mediated optimization of closed loop control instruments.

  3. Using Astrometrica to Teach an Introduction to Asteroid and Comet Astrometry

    Science.gov (United States)

    Durig, Douglas T.

    2007-05-01

    We have organized a Consortium for Astronomy Research and Teaching (CART) with several small colleges and universities from the Appalachian Colleges Association (ACA). In 2006 we received a small grant from the ACA Teaching and Learning Conference to develop laboratory exercises using the on-line telescopes of the Cordell-Lorenz Observatory of The University of the South in Sewanee, TN. We have completed and tested the first two, Asteroid Astrometry and Comet Astronomy and Angular Size. We are continuing to develop several more on the HR Diagram, Cataclysmic Variables and Short Period Variable Stars. We had unknown new asteroids in the field of view the first four times we performed the Asteroid Astrometry exercise but, unfortunately, none of the students recognized the new objects. However, they were more motivated to perform the exercise because of the opportunity to discover the new objects and they performed better on the review questions than students doing a comparable virtual exercise. Both of the astrometry exercises use the Astrometrica program developed by Herbert Raab and provide an introduction to the use and applications of this very functional shareware program. The program is available for free to educators.

  4. A practicable signal processing algorithm for industrial nuclear instrument

    International Nuclear Information System (INIS)

    Tang Yaogeng; Gao Song; Yang Wujiao

    2006-01-01

    In order to reduce the statistical error and to improve the dynamic performance of industrial nuclear instruments, a practicable method of nuclear measurement signal processing is developed according to the features of industrial nuclear measurement. The algorithm is implemented with a single-chip microcomputer. The results of its application in a radiation level gauge have proved the effectiveness of this method. (authors)

  5. Algorithm of choosing type of mechanical assembly production of instrument making enterprises of Industry 4.0

    Science.gov (United States)

    Zakoldaev, D. A.; Shukalov, A. V.; Zharinov, I. O.; Zharinov, O. O.

    2018-05-01

    The task of choosing the type of mechanical assembly production for instrument-making enterprises of Industry 4.0 is studied. Two project algorithms, one for Industry 3.0 and one for Industry 4.0, are compared. The algorithm for choosing the type of mechanical assembly production in instrument-making enterprises of Industry 4.0 is based on analysis of the technological route of the manufacturing process in a company equipped with cyber-physical systems. The algorithm may yield project solutions selected from either the primary or the auxiliary part of production. The decision rules of the algorithm are based on an optimality criterion.

  6. Results of the astrometry and direct imaging testbed for exoplanet detection

    Science.gov (United States)

    Bendek, Eduardo A.; Belikov, Ruslan; Pluzhnik, Eugene; Guyon, Olivier; Milster, Thomas; Johnson, Lee; Finan, Emily; Knight, Justin; Rodack, Alexander

    2017-09-01

    Measuring masses of long-period planets around F, G, and K stars is necessary to characterize exoplanets and assess their habitability. Imaging stellar astrometry offers a unique opportunity to solve the radial velocity system inclination ambiguity and determine exoplanet masses. The main limiting factor in sparse-field astrometry, besides photon noise, is the non-systematic dynamic distortions that arise from perturbations in the optical train. Even space optics suffer from dynamic distortions in the optical system at the sub-μas level. To overcome this limitation we propose a diffractive pupil that uses an array of dots on the primary mirror creating polychromatic diffraction spikes in the focal plane, which are used to calibrate the distortions in the optical system. By combining this technology with a high-performance coronagraph, measurements of planetary system orbits and masses can be obtained faster and more accurately than by applying traditional techniques separately. In this paper, we present the results of the combined astrometry and high-contrast imaging experiments performed at NASA Ames Research Center as part of a Technology Development for Exoplanet Missions program. We demonstrated 2.38x10^-5 λ/D astrometric accuracy per axis and 1.72x10^-7 raw contrast from 1.6 to 4.5 λ/D. In addition, using a simple average subtraction post-processing we demonstrated no contamination of the coronagraph field down to 4.79x10^-9 raw contrast.

  7. The OMPS Limb Profiler Instrument: Two-Dimensional Retrieval Algorithm

    Science.gov (United States)

    Rault, Didier F.

    2010-01-01

    The upcoming Ozone Mapper and Profiler Suite (OMPS), which will be launched on the NPOESS Preparatory Project (NPP) platform in early 2011, will continue monitoring the global distribution of the Earth's middle atmosphere ozone and aerosol. OMPS is composed of three instruments, namely the Total Column Mapper (heritage: TOMS, OMI), the Nadir Profiler (heritage: SBUV) and the Limb Profiler (heritage: SOLSE/LORE, OSIRIS, SCIAMACHY, SAGE III). The ultimate goal of the mission is to better understand and quantify the rate of stratospheric ozone recovery. The focus of the paper will be on the Limb Profiler (LP) instrument. The LP instrument will measure the Earth's limb radiance (which is due to the scattering of solar photons by air molecules, aerosol and Earth surface) in the ultra-violet (UV), visible and near infrared, from 285 to 1000 nm. The LP simultaneously images the whole vertical extent of the Earth's limb through three vertical slits, each covering a vertical tangent height range of 100 km and each horizontally spaced by 250 km in the cross-track direction. Measurements are made every 19 seconds along the orbit track, which corresponds to a distance of about 150km. Several data analysis tools are presently being constructed and tested to retrieve ozone and aerosol vertical distribution from limb radiance measurements. The primary NASA algorithm is based on earlier algorithms developed for the SOLSE/LORE and SAGE III limb scatter missions. All the existing retrieval algorithms rely on a spherical symmetry assumption for the atmosphere structure. While this assumption is reasonable in most of the stratosphere, it is no longer valid in regions of prime scientific interest, such as polar vortex and UTLS regions. The paper will describe a two-dimensional retrieval algorithm whereby the ozone distribution is simultaneously retrieved vertically and horizontally for a whole orbit. The retrieval code relies on (1) a forward 2D Radiative Transfer code (to model limb

  8. An intelligent identification algorithm for the monoclonal picking instrument

    Science.gov (United States)

    Yan, Hua; Zhang, Rongfu; Yuan, Xujun; Wang, Qun

    2017-11-01

    Traditional colony selection is mainly performed manually, which is inefficient and highly subjective. It is therefore important to develop an automatic monoclonal-picking instrument, and the critical stage of automatic monoclonal picking and intelligent optimal selection is the identification algorithm. An auto-screening algorithm based on a Support Vector Machine (SVM) is proposed in this paper; it uses supervised learning combined with colony morphological characteristics to classify colonies accurately. From the basic morphological features of each colony, the system derives a series of morphological parameters step by step. A maximal-margin classifier is then established and, together with an analysis of colony growth trends, used to select monoclonal colonies. The experimental results showed that the auto-screening algorithm can separate regular colonies from the others and meets the requirements on the various parameters.
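
    A minimal sketch of the supervised classification step, using scikit-learn's SVM on assumed morphological features (area, perimeter, circularity, mean intensity); the feature set and training data are placeholders, not the authors' parameters.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    # Placeholder training set: each row is one detected colony described by
    # morphological features; labels mark colonies judged suitable for
    # picking (1) or not (0).
    X_train = np.array([[120, 40, 0.92, 180],
                        [300, 75, 0.60, 150],
                        [110, 38, 0.95, 175],
                        [400, 90, 0.55, 140]], dtype=float)
    y_train = np.array([1, 0, 1, 0])

    clf = SVC(kernel="rbf", C=1.0, gamma="scale")   # maximal-margin classifier
    clf.fit(X_train, y_train)

    # Classify newly segmented colonies from the current plate image
    X_new = np.array([[115, 39, 0.93, 178]], dtype=float)
    print(clf.predict(X_new))   # 1 -> select this colony for picking
    ```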

  9. International VLBI Service for Geodesy and Astrometry

    Science.gov (United States)

    Vandenberg, Nancy R. (Editor); Baver, Karen D. (Editor)

    2001-01-01

    This volume of reports is the 2000 Annual Report of the International Very Long Baseline Interferometry (VLBI) Service for Geodesy and Astrometry (IVS). The individual reports were contributed by VLBI groups in the international geodetic and astrometric community who constitute the components of IVS. The 2000 Annual Report documents the work of these IVS components over the period March 1, 1999, through December 31, 2000. The reports document changes, activities, and progress of the IVS. The entire contents of this Annual Report also appear on the IVS web site at http://ivscc.gsfc.nasa.gov/publications/ar2000.

  10. Modeling Multi-wavelength Stellar Astrometry. III. Determination of the Absolute Masses of Exoplanets and Their Host Stars

    Science.gov (United States)

    Coughlin, J. L.; López-Morales, Mercedes

    2012-05-01

    Astrometric measurements of stellar systems are becoming significantly more precise and common, with many ground- and space-based instruments and missions approaching 1 μas precision. We examine the multi-wavelength astrometric orbits of exoplanetary systems via both analytical formulae and numerical modeling. Exoplanets have a combination of reflected and thermally emitted light that causes the photocenter of the system to shift increasingly farther away from the host star with increasing wavelength. We find that, if observed at long enough wavelengths, the planet can dominate the astrometric motion of the system, and thus it is possible to directly measure the orbits of both the planet and star, and thus directly determine the physical masses of the star and planet, using multi-wavelength astrometry. In general, this technique works best for, though is certainly not limited to, systems that have large, high-mass stars and large, low-mass planets, which is a unique parameter space not covered by other exoplanet characterization techniques. Exoplanets that happen to transit their host star present unique cases where the physical radii of the planet and star can be directly determined via astrometry alone. Planetary albedos and day-night contrast ratios may also be probed via this technique due to the unique signature they impart on the observed astrometric orbits. We develop a tool to examine the prospects for near-term detection of this effect, and give examples of some exoplanets that appear to be good targets for detection in the K to N infrared observing bands, if the required precision can be achieved.
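
    The wavelength dependence at the heart of this technique can be stated compactly. With star and planet fluxes F_*(λ) and F_p(λ) at positions x_* and x_p about the barycenter, the observed photocenter is the flux-weighted mean; this is the standard relation, restated here for orientation rather than quoted from the paper:

    ```latex
    x_{\rm pc}(\lambda) \;=\;
    \frac{F_{*}(\lambda)\,x_{*} + F_{p}(\lambda)\,x_{p}}
         {F_{*}(\lambda) + F_{p}(\lambda)}.
    ```

    At short wavelengths F_p is negligible and x_pc tracks the small stellar reflex orbit; at long enough wavelengths the growing planetary flux pulls the photocenter toward the planet's much larger orbit, which is why multi-wavelength astrometry can separate the two orbits and yield absolute masses.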

  11. MODELING MULTI-WAVELENGTH STELLAR ASTROMETRY. III. DETERMINATION OF THE ABSOLUTE MASSES OF EXOPLANETS AND THEIR HOST STARS

    International Nuclear Information System (INIS)

    Coughlin, J. L.; López-Morales, Mercedes

    2012-01-01

    Astrometric measurements of stellar systems are becoming significantly more precise and common, with many ground- and space-based instruments and missions approaching 1 μas precision. We examine the multi-wavelength astrometric orbits of exoplanetary systems via both analytical formulae and numerical modeling. Exoplanets have a combination of reflected and thermally emitted light that causes the photocenter of the system to shift increasingly farther away from the host star with increasing wavelength. We find that, if observed at long enough wavelengths, the planet can dominate the astrometric motion of the system, and thus it is possible to directly measure the orbits of both the planet and star, and thus directly determine the physical masses of the star and planet, using multi-wavelength astrometry. In general, this technique works best for, though is certainly not limited to, systems that have large, high-mass stars and large, low-mass planets, which is a unique parameter space not covered by other exoplanet characterization techniques. Exoplanets that happen to transit their host star present unique cases where the physical radii of the planet and star can be directly determined via astrometry alone. Planetary albedos and day-night contrast ratios may also be probed via this technique due to the unique signature they impart on the observed astrometric orbits. We develop a tool to examine the prospects for near-term detection of this effect, and give examples of some exoplanets that appear to be good targets for detection in the K to N infrared observing bands, if the required precision can be achieved.

  12. Developments in the Aerosol Layer Height Retrieval Algorithm for the Copernicus Sentinel-4/UVN Instrument

    Science.gov (United States)

    Nanda, Swadhin; Sanders, Abram; Veefkind, Pepijn

    2016-04-01

    The Sentinel-4 mission is a part of the European Commission's Copernicus programme, the goal of which is to provide geo-information to manage environmental assets, and to observe, understand and mitigate the effects of the changing climate. The Sentinel-4/UVN instrument design is motivated by the need to monitor trace gas concentrations and aerosols in the atmosphere from a geostationary orbit. The on-board instrument is a high resolution UV-VIS-NIR (UVN) spectrometer system that provides hourly radiance measurements over Europe and northern Africa with a spatial sampling of 8 km. The main application area of Sentinel-4/UVN is air quality. One of the data products being developed for Sentinel-4/UVN is the Aerosol Layer Height (ALH). The goal is to determine the height of aerosol plumes with a resolution of better than 0.5 - 1 km. The ALH product thus targets aerosol layers in the free troposphere, such as desert dust, volcanic ash, and biomass burning plumes. KNMI is responsible for the development of the ALH algorithm. Its heritage is the ALH algorithm developed by Sanders and De Haan (ATBD, 2016) for the TROPOMI instrument on board the Sentinel-5 Precursor mission that is to be launched in June or July 2016 (tentative date). The retrieval algorithm designed so far for the aerosol height product is based on the absorption characteristics of the oxygen-A band (759-770 nm). New aspects for Sentinel-4/UVN include the higher spectral resolution (0.116 nm compared to 0.4 nm for TROPOMI) and hourly observation from the geostationary orbit. The algorithm uses optimal estimation to obtain a spectral fit of the reflectance across the absorption band, while assuming a single uniform layer with fixed width to represent the aerosol vertical distribution. The state vector includes amongst other elements the height of this layer and its aerosol optical
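
    The optimal-estimation spectral fit referred to here is typically iterated with the standard Gauss-Newton update; the expression below is the generic Rodgers-type textbook form, quoted for context rather than taken from the Sentinel-4 ATBD:

    ```latex
    x_{i+1} \;=\; x_{a} + \left(K_i^{T} S_\epsilon^{-1} K_i + S_a^{-1}\right)^{-1}
    K_i^{T} S_\epsilon^{-1}\,\bigl[\,y - F(x_i) + K_i\,(x_i - x_a)\,\bigr],
    ```

    where y is the measured reflectance across the O2-A band, F the forward radiative-transfer model, K_i its Jacobian at the current state, x_a and S_a the prior state (including the aerosol layer height) and its covariance, and S_ε the measurement-error covariance.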

  13. Algorithms for a hand-held miniature x-ray fluorescence analytical instrument

    International Nuclear Information System (INIS)

    Elam, W.T.; Newman, D.; Ziemba, F.

    1998-01-01

    The purpose of this joint program was to provide technical assistance with the development of a Miniature X-ray Fluorescence (XRF) Analytical Instrument. This new XRF instrument is designed to overcome the weaknesses of spectrometers commercially available at the present time. Currently available XRF spectrometers (for a complete list see reference 1) convert spectral information to sample composition using the influence coefficients technique or the fundamental parameters method. They require either a standard sample with composition relatively close to the unknown or a detailed knowledge of the sample matrix. They also require a highly-trained operator and the results often depend on the capabilities of the operator. In addition, almost all existing field-portable, hand-held instruments use radioactive sources for excitation. Regulatory limits on such sources restrict them such that they can only provide relatively weak excitation. This limits all current hand-held XRF instruments to poor detection limits and/or long data collection times, in addition to the licensing requirements and disposal problems for radioactive sources. The new XRF instrument was developed jointly by Quantrad Sensor, Inc., the Naval Research Laboratory (NRL), and the Department of Energy (DOE). This report describes the analysis algorithms developed by NRL for the new instrument and the software which embodies them.
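
    For context on the two correction schemes mentioned: the influence-coefficients approach expresses each element's concentration in terms of its measured relative intensity and the concentrations of the other matrix elements. A common Lachance-Traill-type form, quoted here generically rather than from this report, is

    ```latex
    C_i \;=\; R_i \left(1 + \sum_{j \neq i} \alpha_{ij}\, C_j \right),
    ```

    where C_i is the mass fraction of element i, R_i its intensity relative to the pure element, and α_ij empirically or theoretically derived influence coefficients; fundamental-parameters methods instead compute the matrix effects from first-principles physics of X-ray excitation and absorption.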

  14. Astrometry of 2014MU69 for New Horizons encounter

    Science.gov (United States)

    Buie, Marc

    2017-08-01

    We propose 12 orbits of time to make high-precision astrometric measurements of the New Horizons extended-mission target, (486958) 2014 MU69. These observations are in direct support of the navigation of New Horizons leading up to its encounter in January 2019. These visits represent an optimized plan for improved orbit estimates that will complete as the target becomes directly observable by New Horizons. This astrometry is a key element leading up to a close investigation of a Cold Classical Kuiper Belt Object, one of the most primitive members of our solar system.

  15. Modernizing Pickles - A Tool for Planning and Scheduling HST Astrometry

    Science.gov (United States)

    Juarez, Aaron; McArthur, B.; Benedict, G. F.

    2007-12-01

    Pickles is a Macintosh program written in C that was developed as a tool for determining pointings and rolls of the Hubble Space Telescope (HST) to place targets and astrometric reference stars in the Fine Guidance Sensor (FGS) field of regard ("pickles"). The program was developed in the late 1980s and runs under the "Classic” System. Ongoing HST astrometry projects require that this code be ported to the Intel-Mac OSX, because the Classic System is now unsupported. Pickles is a vital part of HST astrometry research. It graphically aids the investigator to determine where, when, and how the HST/FGS combination can observe an object and associated astrometric reference stars. Presently, Pickles can extract and display star positions from Guide Star Catalogs, such as the ACRS, SAO, and AGK3 catalogs via CD-ROMs. Future improvements will provide access to these catalogs and others through the internet. As an example of the past utility of Pickles, we highlight the recent determination of parallaxes for ten galactic Cepheids to determine an improved solar-metallicity Period-Luminosity relation. Support for this work was provided by NASA through grants GO-10989, -11210, and -11211 from the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555.

  16. THE BENEFITS OF VLBI ASTROMETRY TO PULSAR TIMING ARRAY SEARCHES FOR GRAVITATIONAL RADIATION

    Energy Technology Data Exchange (ETDEWEB)

    Madison, D. R.; Chatterjee, S.; Cordes, J. M. [Department of Astronomy and Center for Radiophysics and Space Research, Cornell University, Ithaca, NY 14850 (United States)

    2013-11-10

    Precision astrometry is an integral component of successful pulsar timing campaigns. Astrometric parameters are commonly derived by fitting them as parameters of a timing model to a series of pulse times of arrival (TOAs). TOAs measured to microsecond precision over spans of several years can yield position measurements with sub-milliarcsecond precision. However, timing-based astrometry can become biased if a pulsar displays any red spin noise or a red signal produced by the stochastic gravitational wave background. We investigate how noise of different spectral types is absorbed by timing models, leading to significant estimation biases in the astrometric parameters. We find that commonly used techniques for fitting timing models in the presence of red noise (Cholesky whitening) prevent the absorption of noise into the timing model remarkably well if the time baseline of observations exceeds several years, but are inadequate for dealing with shorter pulsar data sets. Independent of timing, pulsar-optimized very long baseline interferometry (VLBI) is capable of providing position estimates precise to the sub-milliarcsecond levels needed for high-precision timing. In order to make VLBI astrometric parameters useful in pulsar timing models, the transformation between the International Celestial Reference Frame (ICRF) and the dynamical solar system ephemeris used for pulsar timing must be constrained to within a few microarcseconds. We compute a transformation between the ICRF and pulsar timing frames and quantitatively discuss how the transformation will improve in coming years. We find that incorporating VLBI astrometry into the timing models of pulsars for which only a couple of years of timing data exist will lead to more realistic assessments of red spin noise and could enhance the amplitude of gravitational wave signatures in post-fit timing residuals by factors of 20 or more.
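
    The "Cholesky whitening" referred to is, in essence, generalized least squares: the red-plus-white noise covariance is factored, and both the residuals and the timing-model design matrix are whitened before the fit. A compact sketch with placeholder inputs (TOA residuals, a design matrix, and an assumed noise covariance), not the authors' pipeline:

    ```python
    import numpy as np

    def whitened_fit(residuals, design, cov):
        """Fit timing-model parameters by Cholesky-whitened least squares.

        residuals : (n_toa,) pre-fit timing residuals
        design    : (n_toa, n_par) timing-model design matrix
        cov       : (n_toa, n_toa) red+white noise covariance of the TOAs
        """
        L = np.linalg.cholesky(cov)                 # cov = L L^T
        r_w = np.linalg.solve(L, residuals)         # whitened residuals
        M_w = np.linalg.solve(L, design)            # whitened design matrix
        params, *_ = np.linalg.lstsq(M_w, r_w, rcond=None)
        return params

    # With an accurate red-noise covariance, the whitened fit keeps long-period
    # noise (or a gravitational-wave signal) from being absorbed into the
    # position and proper-motion parameters of the timing model.
    ```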

  17. Gaia Data Release 1 Open cluster astrometry: performance, limitations, and future prospects

    Czech Academy of Sciences Publication Activity Database

    van Leeuwen, F.; Vallenari, A.; Jordi, C.; Lindegren, L.; Bastian, U.; Prusti, T.; de Bruijne, J.H.J.; Brown, A.G.A.; Babusiaux, C.; Bailer-Jones, C.A.L.; Fuchs, Jan; Koubský, Pavel; Votruba, Viktor

    2017-01-01

    Vol. 601, May (2017), A19/1-A19/65, E-ISSN 1432-0746. R&D Projects: GA MŠk (CZ) LG15010. Grant - others: ESA (XE) ESA-PECS project No. 98058. Institutional support: RVO:67985815. Keywords: astrometry * open clusters and associations * proper motion and parallax. Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics. OECD field: Astronomy (including astrophysics, space science). Impact factor: 5.014, year: 2016

  18. International VLBI Service for Geodesy and Astrometry 2007 Annual Report

    Science.gov (United States)

    Behrend, D. (Editor); Baver, K. D. (Editor)

    2008-01-01

    This volume of reports is the 2007 Annual Report of the International VLBI Service for Geodesy and Astrometry (IVS). The individual reports were contributed by VLBI groups in the international geodetic and astrometric community who constitute the components of IVS. The 2007 Annual Report documents the work of these IVS components over the period January 1, 2007 through December 31, 2007. The reports document changes, activities, and progress of the IVS. The entire contents of this Annual Report also appear on the IVS Web site at http://ivscc.gsfc.nasa.gov/publications/ar2007.

  19. International VLBI Service for Geodesy and Astrometry 2008 Annual Report

    Science.gov (United States)

    Behrend, Dirk; Baver, Karen D.

    2009-01-01

    This volume of reports is the 2008 Annual Report of the International VLBI Service for Geodesy and Astrometry (IVS). The individual reports were contributed by VLBI groups in the international geodetic and astrometric community who constitute the components of IVS. The 2008 Annual Report documents the work of these IVS components over the period January 1, 2008 through December 31, 2008. The reports document changes, activities, and progress of the IVS. The entire contents of this Annual Report also appear on the IVS Web site at http://ivscc.gsfc.nasa.gov/publications/ar2008.

  20. International VLBI Service for Geodesy and Astrometry 2011 Annual Report

    Science.gov (United States)

    Baver, Karen D. (Editor); Behrend, Dirk

    2012-01-01

    This volume of reports is the 2011 Annual Report of the International VLBI Service for Geodesy and Astrometry (IVS). The individual reports were contributed by VLBI groups in the international geodetic and astrometric community who constitute the components of IVS. The 2011 Annual Report documents the work of these IVS components over the period January 1, 2011 through December 31, 2011. The reports document changes, activities, and progress of the IVS. The entire contents of this Annual Report also appear on the IVS Web site at http://ivscc.gsfc.nasa.gov/publications/ar2011.

  1. International VLBI Service for Geodesy and Astrometry 2005 Annual Report

    Science.gov (United States)

    Behrend, Dirk (Editor); Baver, Karen D. (Editor)

    2006-01-01

    This volume of reports is the 2005 Annual Report of the International VLBI Service for Geodesy and Astrometry (IVS). The individual reports were contributed by VLBI groups in the international geodetic and astrometric community who constitute the components of IVS. The 2005 Annual Report documents the work of these IVS components over the period January 1, 2005 through December 31, 2005. The reports document changes, activities, and progress of the IVS. The entire contents of this Annual Report also appear on the IVS Web site at http://ivscc.gsfc.nasa.gov/publications/ar2005.

  2. New algorithms and pulse-processing units in radioisotope instruments

    International Nuclear Information System (INIS)

    Antonjak, V.; Gonsjorowski, L.; Jastschuk, E.; Kwasnewski, T.

    1981-01-01

    Three new algorithms and the corresponding electronic circuits are described, beginning with the automatic gain stabilisation circuit for scintillation counters. The signal obtained as the difference between two pulse trains from amplitude discriminators has been used for photomultiplier high-voltage control. Furthermore, a real-time digital filter for random pulse trains is presented, and it is shown that the variance of the pulse trains decreases after passing through the filter. The block diagram, principle of operation and basic features of the filter are given. Finally, a digital circuit for polynomial linearization of the scale function in radioisotope instruments is described. Again, the block diagram of the pulse-train processing, the mode of operation and the programming method are given. (author)

  3. Asteroid families, dynamics and astrometry

    International Nuclear Information System (INIS)

    Williams, J.G.; Gibson, J.

    1987-01-01

    The proper elements and family assignments for the 1227 high-quality Palomar-Leiden Survey asteroids were tabulated. In addition to the large table, there are also auxiliary tables of Mars crossers and commensurate objects, histograms of the proper element distributions, and a discussion. Probably the most important part of the discussion describes the Mars crossing boundary, how the closest distances of approach to Mars and Jupiter are calculated, and why the observed population of Mars crossers should bombard that planet episodically rather than uniformly. Analytical work was done to derive velocity distributions of family-forming events from proper element distributions, subject to assumptions which may be appropriate for cratering events. Software was developed for a microcomputer to permit plotting of the proper elements. Three orthogonal views are generated and stereo pairs can be printed when desired. This program was created for the study of asteroid families. The astrometry task is directed toward measuring and reducing positions of faint comets and minor planets with less common orbits. The observational material consists of CCD frames taken with the Palomar 1.5 m telescope. Positions of 10 comets and 16 different asteroids were published in the Minor Planet Circulars.

  4. On Gamma Ray Instrument On-Board Data Processing Real-Time Computational Algorithm for Cosmic Ray Rejection

    Science.gov (United States)

    Kizhner, Semion; Hunter, Stanley D.; Hanu, Andrei R.; Sheets, Teresa B.

    2016-01-01

    Richard O. Duda and Peter E. Hart of Stanford Research Institute described in [1] the recurring problem in computer image processing of detecting straight lines in digitized images. The problem is to detect the presence of groups of collinear or almost collinear figure points. It is clear that the problem can be solved to any desired degree of accuracy by testing the lines formed by all pairs of points. However, the computation required for an image of n = N x M points is approximately proportional to n^2, i.e., O(n^2), becoming prohibitive for large images or when the data processing cadence time is in milliseconds. Rosenfeld in [2] described an ingenious method due to Hough [3] for replacing the original problem of finding collinear points by a mathematically equivalent problem of finding concurrent lines. This method involves transforming each of the figure points into a straight line in a parameter space. Hough chose to use the familiar slope-intercept parameters, and thus his parameter space was the two-dimensional slope-intercept plane. A parallel Hough transform running on multi-core processors was elaborated in [4]. There are many other proposed methods for solving similar problems, such as the sampling-up-the-ramp algorithm (SUTR) [5] and algorithms involving artificial swarm intelligence techniques [6]. However, all state-of-the-art algorithms lack real-time performance: they are slow for large images that require a processing cadence of a few dozen milliseconds (50 ms). This problem arises in spaceflight applications such as near real-time analysis of gamma ray measurements contaminated by an overwhelming number of cosmic ray (CR) traces. Future spaceflight instruments such as the Advanced Energetic Pair Telescope (AdEPT) [7-9] for cosmic gamma-ray surveys employ large detector readout planes registering multitudes of cosmic-ray interference events and sparse projections of science gamma-ray event traces. The AdEPT science of interest is in the
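
    The Hough idea summarized above, mapping each figure point to a curve in parameter space and looking for concurrences, can be sketched in a few lines. The sketch below uses the (rho, theta) normal parameterization rather than Hough's slope-intercept plane to avoid unbounded slopes, and the array sizes are illustrative only.

    ```python
    import numpy as np

    def hough_lines(points, img_shape, n_theta=180, n_rho=400):
        """Accumulate a Hough transform for a set of (x, y) figure points."""
        ny, nx = img_shape
        diag = np.hypot(nx, ny)
        thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
        acc = np.zeros((n_rho, n_theta), dtype=int)
        for x, y in points:
            # each point votes for every line passing through it:
            # rho = x cos(theta) + y sin(theta)
            rho = x * np.cos(thetas) + y * np.sin(thetas)
            rho_idx = np.round((rho + diag) / (2 * diag) * (n_rho - 1)).astype(int)
            acc[rho_idx, np.arange(n_theta)] += 1
        return acc, thetas

    # Peaks in `acc` correspond to groups of collinear points, e.g. straight
    # cosmic-ray tracks to be rejected from the sparse gamma-ray event data;
    # the cost scales with the number of points rather than with all pairs.
    ```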

  5. Gaia Data Release 1. Open cluster astrometry: performance, limitations, and future prospects

    OpenAIRE

    Gaia Collaboration; van Leeuwen, F.; Vallenari, A.; Jordi, C.; Lindegren, L.; Bastian, U.; Prusti, T.; de Bruijne, J. H. J.; Brown, A. G. A.; Babusiaux, C.; Bailer-Jones, C. A. L.; Biermann, M.; Evans, D. W.; Eyer, L.; Jansen, F.

    2017-01-01

    Context. Parallaxes for 331 classical Cepheids, 31 Type II Cepheids, and 364 RR Lyrae stars in common between Gaia and the Hipparcos and Tycho-2 catalogues are published in Gaia Data Release 1 (DR1) as part of the Tycho-Gaia Astrometric Solution (TGAS). Aims. In order to test these first parallax measurements of the primary standard candles of the cosmological distance ladder, which involve astrometry collected by Gaia during the initial 14 months of science operation, we compared them wi...

  6. Microarcsecond relative astrometry from the ground with a diffractive pupil

    Energy Technology Data Exchange (ETDEWEB)

    Ammons, S M; Bendek, E; Guyon, O

    2011-09-08

    The practical use of astrometry to detect exoplanets via the reflex motion of the parent star depends critically on the elimination of systematic floors in imaging systems. In the diffractive pupil technique proposed for space-based detection of exo-earths, extended diffraction spikes generated by a dotted primary mirror are referenced against a wide-field grid of background stars to calibrate changing optical distortion and achieve microarcsecond astrometric precision on bright targets (Guyon et al. 2010). We describe applications of this concept to ground-based uncrowded astrometry using a diffractive, monopupil telescope and a wide-field camera to image as many as ~4000 background reference stars. Final relative astrometric precision is limited by differential tip/tilt jitter caused by high altitude layers of turbulence. A diffractive 3-meter telescope is capable of reaching ~35 μas relative astrometric error per coordinate perpendicular to the zenith vector in three hours on a bright target star (I < 10) in fields of moderate stellar density (~40 stars arcmin^-2 with I < 23). Smaller diffractive apertures (D < 1 m) can achieve 100-200 μas performance with the same stellar density and exposure time, and a large telescope (6.5-10 m) could achieve as low as 10 μas, nearly an order of magnitude better than current space-based facilities. The diffractive pupil enables the use of larger fields of view through calibration of changing optical distortion as well as brighter target stars (V < 6) by preventing star saturation. Permitting the sky to naturally roll to average signals over many thousands of pixels can mitigate the effects of detector imperfections.

  7. THE ARECIBO METHANOL MASER GALACTIC PLANE SURVEY. IV. ACCURATE ASTROMETRY AND SOURCE MORPHOLOGIES

    International Nuclear Information System (INIS)

    Pandian, J. D.; Momjian, E.; Xu, Y.; Menten, K. M.; Goldsmith, P. F.

    2011-01-01

    We present accurate absolute astrometry of 6.7 GHz methanol masers detected in the Arecibo Methanol Maser Galactic Plane Survey using MERLIN and the Expanded Very Large Array (EVLA). We estimate the absolute astrometry to be accurate to better than 15 and 80 mas for the MERLIN and EVLA observations, respectively. We also derive the morphologies of the maser emission distributions for sources stronger than ∼1 Jy. The median spatial extent along the major axis of the regions showing maser emission is ∼775 AU. We find a majority of methanol maser morphologies to be complex with some sources previously determined to have regular morphologies in fact being embedded within larger structures. This suggests that some maser spots do not have a compact core, which leads to them being resolved in high angular resolution observations. This also casts doubt on interpretations of the origin of methanol maser emission solely based on source morphologies. We also investigate the association of methanol masers with mid-infrared emission and find very close correspondence between methanol masers and 24 μm point sources. This adds further credence to theoretical models that predict methanol masers to be pumped by warm dust emission and firmly reinforces the finding that Class II methanol masers are unambiguous tracers of embedded high-mass protostars.

  8. International VLBI Service for Geodesy and Astrometry 2013 Annual Report

    Science.gov (United States)

    Baver, Karen D.; Behrend, Dirk; Armstrong, Kyla L.

    2014-01-01

    This volume of reports is the 2013 Annual Report of the International VLBI Service for Geodesy and Astrometry (IVS). The individual reports were contributed by VLBI groups in the international geodetic and astrometric community who constitute the permanent components of IVS. The IVS 2013 Annual Report documents the work of the IVS components for the calendar year 2013, our fifteenth year of existence. The reports describe changes, activities, and progress of the IVS. Many thanks to all IVS components who contributed to this Annual Report. With the exception of the first section and the last section, the contents of this Annual Report also appear on the IVS Web site at http://ivscc.gsfc.nasa.gov/publications/ar2013.

  9. CosmoQuest Transient Tracker: Opensource Photometry & Astrometry software

    Science.gov (United States)

    Myers, Joseph L.; Lehan, Cory; Gay, Pamela; Richardson, Matthew; CosmoQuest Team

    2018-01-01

    CosmoQuest is moving from online citizen science to observational astronomy with the creation of Transient Trackers. This open source software is designed to identify asteroids and other transient/variable objects in image sets. Transient Tracker's features in final form will include: astrometric and photometric solutions, identification of moving/transient objects, identification of variable objects, and lightcurve analysis. In this poster we present our initial, v0.1 release and seek community input. This software builds on the existing NIH-funded ImageJ libraries. Creation of this suite of open-source image manipulation routines is led by Wayne Rasband and is released primarily under the MIT license. In this release, we are building on these libraries to add source identification for point / point-like sources, and to do astrometry. Our materials are released under the Apache 2.0 license on github (http://github.com/CosmoQuestTeam) and documentation can be found at http://cosmoquest.org/TransientTracker.

  10. Some new results on the central overlap problem in astrometry

    Science.gov (United States)

    Rapaport, M.

    1998-07-01

    The central overlap problem in astrometry was revisited in recent years by Eichhorn (1988), who explicitly inverted the matrix of a constrained least squares problem. In this paper, the general explicit solution of the unconstrained central overlap problem is given. We also give the explicit solution for another set of constraints; this result confirms a conjecture expressed by Eichhorn (1988). We also consider the use of iterative methods to solve the central overlap problem. A surprising result is obtained when the classical Gauss-Seidel method is used: the iterations converge immediately to the general solution of the equations; we explain this property by writing the central overlap problem in a new set of variables.

  11. BAM/DASS: Data Analysis Software for Sub-Microarcsecond Astrometry Device

    Science.gov (United States)

    Gardiol, D.; Bonino, D.; Lattanzi, M. G.; Riva, A.; Russo, F.

    2010-12-01

    The INAF - Osservatorio Astronomico di Torino is part of the Data Processing and Analysis Consortium (DPAC) for Gaia, a cornerstone mission of the European Space Agency. Gaia will perform global astrometry by means of two telescopes looking at the sky along two different lines of sight oriented at a fixed angle, also called the basic angle. Knowledge of the basic angle fluctuations at the sub-microarcsecond level over periods of the order of a minute is crucial to reach the mission goals. A specific device, the Basic Angle Monitoring, will be dedicated to this purpose. We present here the software system we are developing to analyze the BAM data and recover the basic angle variations. This tool is integrated into the whole DPAC data analysis software.

  12. International VLBI Service for Geodesy and Astrometry 2012 Annual Report

    Science.gov (United States)

    Baver, Karen D.; Behrend, Dirk; Armstrong, Kyla L.

    2013-01-01

    This volume of reports is the 2012 Annual Report of the International VLBI Service for Geodesy and Astrometry (IVS). The individual reports were contributed by VLBI groups in the international geodetic and astrometric community who constitute the permanent components of IVS. The IVS 2012 Annual Report documents the work of the IVS components for the calendar year 2012, our fourteenth year of existence. The reports describe changes, activities, and progress of the IVS. Many thanks to all IVS components who contributed to this Annual Report. With the exception of the first section and parts of the last section (described below), the contents of this Annual Report also appear on the IVS Web site at http://ivscc.gsfc.nasa.gov/publications/ar2012.

  13. International VLBI Service for Geodesy and Astrometry: 2000 General Meeting Proceedings

    Science.gov (United States)

    Vandenberg, Nancy R. (Editor); Baver, Karen D. (Editor)

    2000-01-01

    This volume is the proceedings of the first General Meeting of the International Very Long Baseline Interferometry (VLBI) Service for Geodesy and Astrometry (IVS), held in Koetzting, Germany, February 21-24, 2000. The content of this volume also appears on the IVS web site at: http://ivscc.gsfc.nasa.gov/publications/gm2000. The goal of the program committee for the General Meeting was to provide an interesting and informative program for a wide cross section of IVS members, including station operators, program managers, and analysts. The program included reports, tutorials, invited and contributed papers, and poster presentations. The tutorial papers should be particularly useful references because each one provides an overview and introduction to a topic relevant to VLBI.

  14. WCSTools 3.0: More Tools for Image Astrometry and Catalog Searching

    Science.gov (United States)

    Mink, Douglas J.

    For five years, WCSTools has provided image astrometry for astronomers who need accurate positions for objects they wish to observe. Other functions have been added and improved since the package was first released. Support has been added for new catalogs, such as the GSC-ACT, 2MASS Point Source Catalog, and GSC II, as they have been published. A simple command line interface can search any supported catalog, returning information in several standard formats, whether the catalog is on a local disk or searchable over the World Wide Web. The catalog searching routine can be located on either end (or both ends!) of such a web connection, and the output from one catalog search can be used as the input to another search.

  15. The Dynamics of the Local Group in the Era of Precision Astrometry

    Science.gov (United States)

    Besla, Gurtina; Garavito-Camargo, Nicolas; Patel, Ekta

    2018-06-01

    Our understanding of the dynamics of our Local Group of galaxies has changed dramatically over the past few years owing to significant advancements in astrometry and our theoretical understanding of galaxy structure. New surveys now enable us to map the 3D structure of our Milky Way and the dynamics of tracers of its dark matter distribution, like globular clusters, satellite galaxies and streams, with unprecedented precision. Some results have met with controversy, challenging preconceived notions of the orbital dynamics of key components of the Local Group. I will provide an overview of this evolving picture of our Local Group and outline how we can test the cold dark matter paradigm in the era of Gaia, LSST and JWST.

  16. Polyphonic pitch detection and instrument separation

    Science.gov (United States)

    Bay, Mert; Beauchamp, James W.

    2005-09-01

    An algorithm for polyphonic pitch detection and musical instrument separation is presented. Each instrument is represented as a time-varying harmonic series. Spectral information is obtained from a monaural input signal using a spectral peak tracking method. Fundamental frequencies (F0s) for each time frame are estimated from the spectral data using an Expectation Maximization (EM) algorithm with a Gaussian mixture model representing the harmonic series. The method first estimates the most predominant F0 and suppresses its harmonic series in the input; the EM algorithm is then run iteratively to estimate each subsequent F0. Collisions between instrument harmonics, which frequently occur, are predicted from the estimated F0s, and the resulting corrupted harmonics are ignored. The amplitudes of these corrupted harmonics are replaced by harmonics taken from a library of spectral envelopes for different instruments, where the spectrum which most closely matches the important characteristics of each extracted spectrum is chosen. Finally, each voice is separately resynthesized by additive synthesis. This algorithm is demonstrated for a trio piece that consists of three different instruments.
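
    As a rough illustration of the estimate-and-suppress idea described above, the sketch below replaces the paper's EM/Gaussian-mixture formulation with a simple harmonic-sum salience; the function names, candidate F0 grid, and suppression tolerance are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    def harmonic_salience(freqs, mags, f0, n_harm=10, tol=0.03):
        """Sum spectral-peak magnitudes found near integer multiples of f0."""
        s = 0.0
        for h in range(1, n_harm + 1):
            target = h * f0
            near = np.abs(freqs - target) / target < tol
            if np.any(near):
                s += mags[near].max()
        return s

    def iterative_f0_estimation(freqs, mags, n_voices=3,
                                candidates=np.arange(80.0, 1000.0, 1.0)):
        """Estimate the most predominant F0, suppress its harmonics, repeat."""
        mags = mags.copy()
        f0s = []
        for _ in range(n_voices):
            sal = [harmonic_salience(freqs, mags, f0) for f0 in candidates]
            best = candidates[int(np.argmax(sal))]
            f0s.append(best)
            # Suppress peaks close to the harmonics of the estimated F0.
            for h in range(1, 11):
                near = np.abs(freqs - h * best) / (h * best) < 0.03
                mags[near] = 0.0
        return f0s
    ```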

  17. International VLBI Service for Geodesy and Astrometry: 1999 Annual Report

    Science.gov (United States)

    Vandenberg, Nancy R. (Editor)

    1999-01-01

    This volume of reports is the 1999 Annual Report of the International VLBI Service for Geodesy and Astrometry (IVS). The individual reports were contributed by VLBI groups in the international geodetic community who constitute the components of IVS. The 1999 Annual Report documents the work of the IVS components for the year ending March 1, 1999, the official inauguration date of IVS. As the newest of the space technique services, IVS decided to publish this Annual Report as a reference to our organization and its components. The entire contents of this Annual Report also appear on the IVS website at: http://ivscc.gsfc.nasa.gov/pub/ar1999. The IVS 1999 Annual Report will be a valuable reference for information about IVS and its components. This Annual Report will serve as a baseline from which we can measure the anticipated progress of IVS in coming years.

  18. Evolutionary programming for neutron instrument optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Bentley, Phillip M. [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany)]. E-mail: phillip.bentley@hmi.de; Pappas, Catherine [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Habicht, Klaus [Hahn-Meitner Institut, Glienicker Strasse 100, D-14109 Berlin (Germany); Lelievre-Berna, Eddy [Institut Laue-Langevin, 6 rue Jules Horowitz, BP 156, 38042 Grenoble Cedex 9 (France)

    2006-11-15

    Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development, and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN, at the HMI. We discuss the potential of the GA which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations.

  19. Evolutionary programming for neutron instrument optimisation

    International Nuclear Information System (INIS)

    Bentley, Phillip M.; Pappas, Catherine; Habicht, Klaus; Lelievre-Berna, Eddy

    2006-01-01

    Virtual instruments based on Monte-Carlo techniques are now an integral part of novel instrumentation development, and the existing codes (McSTAS and Vitess) are extensively used to define and optimise novel instrumental concepts. Neutron spectrometers, however, involve a large number of parameters and their optimisation is often a complex and tedious procedure. Artificial intelligence algorithms are proving increasingly useful in such situations. Here, we present an automatic, reliable and scalable numerical optimisation concept based on the canonical genetic algorithm (GA). The algorithm was used to optimise the 3D magnetic field profile of the NSE spectrometer SPAN, at the HMI. We discuss the potential of the GA which, combined with the existing Monte-Carlo codes (Vitess, McSTAS, etc.), leads to a very powerful tool for automated global optimisation of a general neutron scattering instrument, avoiding local optimum configurations.
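
    The two records above describe a canonical genetic algorithm driving a Monte-Carlo instrument simulation. A minimal, self-contained sketch of such a GA is given below; the figure of merit is a placeholder (assumption) standing in for a McSTAS/Vitess simulation run, and the population size, mutation settings, and function names are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def figure_of_merit(params):
        """Placeholder merit function; in practice this would evaluate a
        Monte-Carlo instrument simulation for the given parameter vector."""
        return -np.sum((params - 0.7) ** 2)

    def genetic_optimize(n_params=8, pop_size=40, n_gen=100,
                         mutation_sigma=0.05, elite_frac=0.25):
        """Canonical GA: keep the fittest, recombine them, mutate the children."""
        pop = rng.random((pop_size, n_params))
        for _ in range(n_gen):
            fitness = np.array([figure_of_merit(p) for p in pop])
            order = np.argsort(fitness)[::-1]
            elite = pop[order[: int(elite_frac * pop_size)]]
            children = []
            while len(children) < pop_size - len(elite):
                pa, pb = elite[rng.integers(len(elite), size=2)]
                mask = rng.random(n_params) < 0.5               # uniform crossover
                child = np.where(mask, pa, pb)
                child += rng.normal(0.0, mutation_sigma, n_params)  # mutation
                children.append(np.clip(child, 0.0, 1.0))
            pop = np.vstack([elite, children])
        fitness = np.array([figure_of_merit(p) for p in pop])
        return pop[int(np.argmax(fitness))]
    ```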

  20. Near Infrared High Resolution Spectroscopy and Spectro-astrometry of Gas in Disks around Herbig Ae/Be Stars

    OpenAIRE

    Brittain, Sean D.; Najita, Joan R.; Carr, John S.

    2015-01-01

    In this review, we describe how high resolution near infrared spectroscopy and spectro-astrometry have been used to study the disks around Herbig Ae/Be stars. We show how these tools can be used to identify signposts of planet formation and elucidate the mechanism by which Herbig Ae/Be stars accrete. We also highlight some of the artifacts that can complicate the interpretation of spectro-astrometric measurements and discuss best practices for mitigating these effects. We conclude with a brie...

  1. A Fast and Sensitive New Satellite SO2 Retrieval Algorithm based on Principal Component Analysis: Application to the Ozone Monitoring Instrument

    Science.gov (United States)

    Li, Can; Joiner, Joanna; Krotkov, A.; Bhartia, Pawan K.

    2013-01-01

    We describe a new algorithm to retrieve SO2 from satellite-measured hyperspectral radiances. We employ the principal component analysis technique in regions with no significant SO2 to capture radiance variability caused by both physical processes (e.g., Rayleigh and Raman scattering and ozone absorption) and measurement artifacts. We use the resulting principal components and SO2 Jacobians calculated with a radiative transfer model to directly estimate SO2 vertical column density in one step. Application to the Ozone Monitoring Instrument (OMI) radiance spectra in 310.5-340 nm demonstrates that this approach can greatly reduce biases in the operational OMI product and decrease the noise by a factor of 2, providing greater sensitivity to anthropogenic emissions. The new algorithm is fast, eliminates the need for instrument-specific radiance correction schemes, and can be easily adapted to other sensors. These attributes make it a promising technique for producing long-term, consistent SO2 records for air quality and climate research.
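
    A minimal sketch of the general PCA-plus-Jacobian fitting idea (not the operational OMI code) follows: principal components derived from presumably SO2-free spectra model the background variability, and the SO2 column is the coefficient of a radiative-transfer Jacobian in a linear least-squares fit. The array names and number of components are assumptions.

    ```python
    import numpy as np

    def pca_so2_retrieval(spectra_clean, y_meas, jacobian_so2, n_pc=20):
        """
        spectra_clean : (n_obs, n_wl) radiances from regions assumed SO2-free
        y_meas        : (n_wl,) measured radiance spectrum to be fitted
        jacobian_so2  : (n_wl,) sensitivity dI/dSO2 from a radiative transfer model
        Returns the estimated SO2 column (units set by the Jacobian).
        """
        mean = spectra_clean.mean(axis=0)
        # Principal components of the SO2-free radiance variability.
        _, _, vt = np.linalg.svd(spectra_clean - mean, full_matrices=False)
        pcs = vt[:n_pc]                        # (n_pc, n_wl)
        # Design matrix: PCs capture background variability, last column is SO2.
        A = np.vstack([pcs, jacobian_so2]).T   # (n_wl, n_pc + 1)
        coeffs, *_ = np.linalg.lstsq(A, y_meas - mean, rcond=None)
        return coeffs[-1]                      # coefficient of the SO2 Jacobian
    ```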

  2. Light propagation in the gravitational field of N arbitrarily moving bodies in the 1.5PN approximation for high-precision astrometry

    Science.gov (United States)

    Zschocke, Sven

    2016-05-01

    High-precision astrometry at the sub-micro-arcsecond level in angular resolution requires accurate determination of the trajectory of a light signal from the celestial light source through the gravitational field of the Solar System toward the observer. In this investigation the light trajectory in the gravitational field of N moving bodies is determined in the 1.5 post-Newtonian approximation. In the approach presented, two specific issues of particular importance are accounted for: (1) According to the recommendations of the International Astronomical Union, the metric of the Solar System is expressed in terms of intrinsic mass-multipoles and intrinsic spin-multipoles of the massive bodies, allowing for arbitrary shape, inner structure and rotational motion of the massive bodies of the Solar System. (2) The Solar System bodies move along arbitrary world lines which can later be specified by Solar System ephemeris. The presented analytical solution for the light trajectory is a primary requirement for extremely high-precision astrometry at the sub-micro-arcsecond level of accuracy and the associated massive computations in astrometric data reduction. An estimation of the numerical magnitude of the time delay and light deflection of the leading multipoles is given.

  3. Determining OBS Instrument Orientations: A Comparison of Algorithms

    Science.gov (United States)

    Doran, A. K.; Laske, G.

    2015-12-01

    The alignment of the horizontal seismometer components with the geographical coordinate system is critical for a wide variety of seismic analyses, but the traditional deployment method of ocean bottom seismometers (OBS) precludes knowledge of this parameter. Current techniques for determining the orientation predominantly rely on body and surface wave data recorded from teleseismic events with sufficiently large magnitudes. Both wave types experience lateral refraction between the source and receiver as a result of heterogeneity and anisotropy, and therefore the arrival angle of any one phase can significantly deviate from the great circle minor arc. We systematically compare the results and uncertainties obtained through current determination methods, and describe a new algorithm that uses body wave, surface wave, and differential pressure gauge data (where available) to invert for horizontal orientation. To start with, our method is based on the easily transportable computer code of Stachnik et al. (2012) that is publicly available through IRIS. A major addition is that we utilize updated global dispersion maps to account for lateral refraction, as was done by Laske (1995). We also make measurements in a wide range of frequencies, and analyze surface wave trains of repeat orbits. Our method has the advantage of requiring fewer total events to achieve high-precision estimates, which is beneficial for OBS deployments that can be as short as weeks. Although the program is designed for use with OBS instruments, it also works with standard land installations. We intend to provide the community with a program that is easy to use, requires minimal user input, and is optimized to work with data cataloged at the IRIS DMC.
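
    One widely used ingredient of such orientation methods (including the Stachnik et al. approach referenced above) is a Rayleigh-wave grid search over trial correction angles that maximizes the correlation between the radial component and the 90-degree-shifted vertical. The sketch below shows only that ingredient; the sign conventions, angle grid, and names are illustrative assumptions, and it is not the authors' full multi-phase inversion.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def estimate_orientation(vertical, north_raw, east_raw, back_azimuth_deg):
        """Grid-search the sensor misorientation angle (degrees) that maximizes
        the Rayleigh-wave correlation between the radial component and the
        90-degree phase-shifted vertical component."""
        shifted_z = np.imag(hilbert(vertical))    # vertical shifted by 90 degrees
        baz = np.radians(back_azimuth_deg)
        best_angle, best_corr = 0.0, -np.inf
        for angle in np.arange(0.0, 360.0, 1.0):
            # Rotate the (mis-oriented) horizontals by the trial correction angle.
            theta = np.radians(angle)
            north = np.cos(theta) * north_raw - np.sin(theta) * east_raw
            east = np.sin(theta) * north_raw + np.cos(theta) * east_raw
            # Project onto the radial direction for the known back azimuth.
            radial = -np.cos(baz) * north - np.sin(baz) * east
            corr = np.corrcoef(radial, shifted_z)[0, 1]
            if corr > best_corr:
                best_angle, best_corr = angle, corr
        return best_angle, best_corr
    ```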

  4. GIER: A Danish computer from 1961 with a role in the modern revolution of astronomy - II

    Science.gov (United States)

    Høg, Erik

    2018-04-01

    A Danish computer, GIER, from 1961 played a vital role in the development of a new method for astrometric measurement. This method, photon counting astrometry, ultimately led to two satellites with a significant role in the modern revolution of astronomy. A GIER was installed at the Hamburg Observatory in 1964 where it was used to implement the entirely new method for the measurement of stellar positions by means of a meridian circle, at that time the fundamental instrument of astrometry. An expedition to Perth in Western Australia with the instrument and the computer was a success. This method was also implemented in space in the first ever astrometric satellite Hipparcos launched by ESA in 1989. The Hipparcos results published in 1997 revolutionized astrometry with an impact in all branches of astronomy from the solar system and stellar structure to cosmic distances and the dynamics of the Milky Way. In turn, the results paved the way for a successor, the one million times more powerful Gaia astrometry satellite launched by ESA in 2013. Preparations for a Gaia successor in twenty years are making progress.

  5. New-Generation NASA Aura Ozone Monitoring Instrument (OMI) Volcanic SO2 Dataset: Algorithm Description, Initial Results, and Continuation with the Suomi-NPP Ozone Mapping and Profiler Suite (OMPS)

    Science.gov (United States)

    Li, Can; Krotkov, Nickolay A.; Carn, Simon; Zhang, Yan; Spurr, Robert J. D.; Joiner, Joanna

    2017-01-01

    Since the fall of 2004, the Ozone Monitoring Instrument (OMI) has been providing global monitoring of volcanic SO2 emissions, helping to understand their climate impacts and to mitigate aviation hazards. Here we introduce a new-generation OMI volcanic SO2 dataset based on a principal component analysis (PCA) retrieval technique. To reduce retrieval noise and artifacts as seen in the current operational linear fit (LF) algorithm, the new algorithm, OMSO2VOLCANO, uses characteristic features extracted directly from OMI radiances in the spectral fitting, thereby helping to minimize interferences from various geophysical processes (e.g., O3 absorption) and measurement details (e.g., wavelength shift). To solve the problem of low bias for large SO2 total columns in the LF product, the OMSO2VOLCANO algorithm employs a table lookup approach to estimate SO2 Jacobians (i.e., the instrument sensitivity to a perturbation in the SO2 column amount) and iteratively adjusts the spectral fitting window to exclude shorter wavelengths where the SO2 absorption signals are saturated. To first order, the effects of clouds and aerosols are accounted for using a simple Lambertian equivalent reflectivity approach. As with the LF algorithm, OMSO2VOLCANO provides total column retrievals based on a set of predefined SO2 profiles from the lower troposphere to the lower stratosphere, including a new profile peaked at 13 km for plumes in the upper troposphere. Examples given in this study indicate that the new dataset shows significant improvement over the LF product, with at least 50% reduction in retrieval noise over the remote Pacific. For large eruptions such as Kasatochi in 2008 (approximately 1700 kt total SO2) and Sierra Negra in 2005 (greater than 1100 DU maximum SO2), OMSO2VOLCANO generally agrees well with other algorithms that also utilize the full spectral content of satellite measurements, while the LF algorithm tends to underestimate SO2. We also demonstrate that, despite the

  6. Comparing Binaural Pre-processing Strategies I: Instrumental Evaluation.

    Science.gov (United States)

    Baumgärtel, Regina M; Krawczyk-Becker, Martin; Marquardt, Daniel; Völker, Christoph; Hu, Hongmei; Herzke, Tobias; Coleman, Graham; Adiloğlu, Kamil; Ernst, Stephan M A; Gerkmann, Timo; Doclo, Simon; Kollmeier, Birger; Hohmann, Volker; Dietz, Mathias

    2015-12-30

    In a collaborative research project, several monaural and binaural noise reduction algorithms have been comprehensively evaluated. In this article, eight selected noise reduction algorithms were assessed using instrumental measures, with a focus on the instrumental evaluation of speech intelligibility. Four distinct, reverberant scenarios were created to reflect everyday listening situations: a stationary speech-shaped noise, a multitalker babble noise, a single interfering talker, and a realistic cafeteria noise. Three instrumental measures were employed to assess predicted speech intelligibility and predicted sound quality: the intelligibility-weighted signal-to-noise ratio, the short-time objective intelligibility measure, and the perceptual evaluation of speech quality. The results show substantial improvements in predicted speech intelligibility as well as sound quality for the proposed algorithms. The evaluated coherence-based noise reduction algorithm was able to provide improvements in predicted audio signal quality. For the tested single-channel noise reduction algorithm, improvements in intelligibility-weighted signal-to-noise ratio were observed in all but the nonstationary cafeteria ambient noise scenario. Binaural minimum variance distortionless response beamforming algorithms performed particularly well in all noise scenarios. © The Author(s) 2015.

  7. Deriving the true mass of an unresolved Brown Dwarf companion to an M-Dwarf with AO aided astrometry*

    Directory of Open Access Journals (Sweden)

    Kürster M.

    2011-07-01

    From radial velocity (RV) detections alone one does not get all the orbital parameters needed to derive the true mass of a non-transiting, unresolved substellar companion to a star. Additional astrometric measurements are needed to calculate the inclination and the longitude of the ascending node. Until today, only a few true substellar companion masses have been determined by this method with the HST fine guidance sensor [1, 2]. We aim to derive the true mass of a brown dwarf candidate companion to an early-M dwarf (M2.5V) with ground-based high-resolution astrometry aided by adaptive optics. We found this unique brown dwarf desert object, whose distance to the host star is only 0.42 AU, in our UVES precision RV survey of M dwarfs, inferring a minimum companion mass of 27 Jupiter masses [3]. Combining the data with HIPPARCOS astrometry, we found a probability of only 2.9% that the companion is stellar. We therefore observed the host star together with a reference star within a monitoring program with VLT/NACO to derive the true mass of the companion and establish its nature (brown dwarf vs. star). Simultaneous observations of a reference field in a globular cluster are performed to determine the stability of the adaptive optics (AO) plus detector system and check its suitability for such high-precision astrometric measurements over several epochs, which are needed to find and analyse extrasolar planet systems.

  8. International VLBI Service for Geodesy and Astrometry: General Meeting Proceedings

    Science.gov (United States)

    Vandenberg, Nancy R. (Editor); Baver, Karen D. (Editor)

    2002-01-01

    This volume contains the proceedings of the second General Meeting of the International VLBI Service for Geodesy and Astrometry (IVS), held in Tsukuba, Japan, February 4-7, 2002. The contents of this volume also appear on the IVS Web site at http://ivscc.gsfc.nasa.gov/publications/gm2002. The keynote of the second GM was perspectives for the future, in keeping with the re-organization of the IAG around the motivation of geodesy as 'an old science with a dynamic future' and noting that providing reference frames for Earth system science that are consistent over decades at the highest accuracy level will provide a challenging role for IVS. The goal of the meeting was to provide an interesting and informative program for a wide cross section of IVS members, including station operators, program managers, and analysts. This volume contains 72 papers and five abstracts of papers presented at the GM. The volume also includes reports about three splinter meetings held in conjunction with the GM: a mini-TOW (Technical Operations Workshop), the third IVS Analysis Workshop, and a meeting of the analysis working group on geophysical modeling.

  9. Vienna VLBI and Satellite Software (VieVS) for Geodesy and Astrometry

    Science.gov (United States)

    Böhm, Johannes; Böhm, Sigrid; Boisits, Janina; Girdiuk, Anastasiia; Gruber, Jakob; Hellerschmied, Andreas; Krásná, Hana; Landskron, Daniel; Madzak, Matthias; Mayer, David; McCallum, Jamie; McCallum, Lucia; Schartner, Matthias; Teke, Kamil

    2018-04-01

    The Vienna VLBI and Satellite Software (VieVS) is a state-of-the-art Very Long Baseline Interferometry (VLBI) analysis software package for geodesy and astrometry. VieVS has been developed at Technische Universität Wien (TU Wien) since 2008, where it is used for research purposes and for teaching space geodetic techniques. In the past decade, it has been successfully applied to Very Long Baseline Interferometry (VLBI) observations for the determination of celestial and terrestrial reference frames as well as for the estimation of celestial pole offsets, Universal Time (UT1-UTC), and polar motion based on least-squares adjustment. Furthermore, VieVS is equipped with tools for scheduling and simulating VLBI observations to extragalactic radio sources as well as to satellites and spacecraft, features which have proved to be very useful for a variety of applications. VieVS is now available as version 3.0 and we provide the software to all interested persons and institutions. A wiki with more information about VieVS is available at http://vievswiki.geo.tuwien.ac.at/.

  10. Mixed field dose equivalent measuring instruments

    International Nuclear Information System (INIS)

    Brackenbush, L.W.; McDonald, J.C.; Endres, G.W.R.; Quam, W.

    1985-01-01

    In the past, separate instruments have been used to monitor dose equivalent from neutrons and gamma rays. It has been demonstrated that it is now possible to measure neutron and gamma dose simultaneously with a single instrument, the tissue equivalent proportional counter (TEPC). With appropriate algorithms, dose equivalent can also be determined from the TEPC. A simple ''pocket rem meter'' for measuring neutron dose equivalent has already been developed. Improved algorithms for determining dose equivalent for mixed fields are presented. (author)
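
    For the TEPC case, dose equivalent is commonly obtained by weighting the measured lineal-energy dose spectrum with a quality factor, using lineal energy as a proxy for LET. A minimal sketch using the ICRP 60 Q(L) relation is shown below; treating lineal energy as LET and the binning convention are simplifying assumptions, and the function names are illustrative.

    ```python
    import numpy as np

    def quality_factor(L):
        """ICRP 60 quality factor Q(L), with L in keV/µm (lineal energy is
        used here as a proxy for LET, a common TEPC approximation)."""
        L = np.asarray(L, dtype=float)
        return np.where(L < 10.0, 1.0,
               np.where(L <= 100.0, 0.32 * L - 2.2, 300.0 / np.sqrt(L)))

    def dose_equivalent(y_bins, dose_per_bin):
        """H = sum_i Q(y_i) * D_i over the measured lineal-energy spectrum,
        where D_i is the absorbed dose collected in bin i."""
        return float(np.sum(quality_factor(y_bins) * dose_per_bin))
    ```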

  11. Tropospheric ozone column retrieval at northern mid-latitudes from the Ozone Monitoring Instrument by means of a neural network algorithm

    Directory of Open Access Journals (Sweden)

    P. Sellitto

    2011-11-01

    Monitoring tropospheric ozone from space is of critical importance in order to gain more thorough knowledge on phenomena affecting air quality and the greenhouse effect. Deriving information on tropospheric ozone from UV/VIS nadir satellite spectrometers is difficult owing to the weak sensitivity of the measured radiance spectra to variations of ozone in the troposphere. Here we propose an alternative method of analysis to retrieve tropospheric ozone columns from Ozone Monitoring Instrument radiances by means of a neural network algorithm. An extended set of ozone sonde measurements at northern mid-latitudes for the years 2004–2008 has been considered as the training and test data set. The design of the algorithm is extensively discussed. Our retrievals are compared to both tropospheric ozone residuals and optimal estimation retrievals over a similar independent test data set. Results show that our algorithm has comparable accuracy with respect to both correlative methods and its performance is slightly better over a subset containing only European ozone sonde stations. Possible sources of errors are analyzed. Finally, the capabilities of our algorithm to derive information on boundary layer ozone are studied and the results critically discussed.
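
    A hedged sketch of the general idea, training a small feed-forward network on sonde-collocated radiances to predict the tropospheric column, is shown below using scikit-learn; the feature layout, network size, and names are assumptions and do not reproduce the authors' architecture.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    def train_tropospheric_o3_net(radiances, angles, o3_columns):
        """
        radiances  : (n_samples, n_wavelengths) OMI UV/VIS radiance features
        angles     : (n_samples, n_geom) viewing/solar geometry features
        o3_columns : (n_samples,) collocated ozone-sonde tropospheric columns
        Returns a fitted regression pipeline mapping features to columns.
        """
        X = np.hstack([radiances, angles])
        model = make_pipeline(
            StandardScaler(),
            MLPRegressor(hidden_layer_sizes=(40, 20), max_iter=2000,
                         early_stopping=True, random_state=0),
        )
        model.fit(X, o3_columns)
        return model
    ```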

  12. Optimization of virtual source parameters in neutron scattering instrumentation

    International Nuclear Information System (INIS)

    Habicht, K; Skoulatos, M

    2012-01-01

    We report on phase-space optimizations for neutron scattering instruments employing horizontal focussing crystal optics. Defining a figure of merit for a generic virtual source configuration, we identify a set of optimum instrumental parameters. In order to assess the quality of the instrumental configuration, we combine an evolutionary optimization algorithm with the analytical Popovici description using multidimensional Gaussian distributions. The optimum phase-space element which needs to be delivered to the virtual source by the preceding neutron optics may be obtained using the same algorithm, which is of general interest in instrument design.

  13. Proceedings of the Sixth General Meeting of the International VLBI Service for Geodesy and Astrometry

    Science.gov (United States)

    Behrend, Dirk (Editor); Baver, Karen D. (Editor)

    2010-01-01

    This volume is the proceedings of the sixth General Meeting of the International VLBI Service for Geodesy and Astrometry (IVS), held in Hobart, Tasmania, Australia, February 7-13, 2010. The contents of this volume also appear on the IVS Web site at http://ivscc.gsfc.nasa.gov/publications/gm2010. The keynote of the sixth GM was the new perspectives of the next generation VLBI system under the theme "VLBI2010: From Vision to Reality". The goal of the meeting was to provide an interesting and informative program for a wide cross-section of IVS members, including station operators, program managers, and analysts. This volume contains 88 papers. All papers were edited by the editors for usage of the English language, form, and minor content-related issues.

  14. A method of estimating GPS instrumental biases with a convolution algorithm

    Science.gov (United States)

    Li, Qi; Ma, Guanyi; Lu, Weijun; Wan, Qingtao; Fan, Jiangtao; Wang, Xiaolan; Li, Jinghua; Li, Changhua

    2018-03-01

    This paper presents a method of deriving the instrumental differential code biases (DCBs) of GPS satellites and dual frequency receivers. Considering that the total electron content (TEC) varies smoothly over a small area, one ionospheric pierce point (IPP) and four more nearby IPPs were selected to build an equation with a convolution algorithm. In addition, unknown DCB parameters were arranged into a set of equations with GPS observations in a day unit by assuming that DCBs do not vary within a day. Then, the DCBs of satellites and receivers were determined by solving the equation set with the least-squares fitting technique. The performance of this method is examined by applying it to 361 days in 2014 using the observation data from 1311 GPS Earth Observation Network (GEONET) receivers. The result was crosswise-compared with the DCB estimated by the mesh method and the IONEX products from the Center for Orbit Determination in Europe (CODE). The DCB values derived by this method agree with those of the mesh method and the CODE products, with biases of 0.091 ns and 0.321 ns, respectively. The convolution method's accuracy and stability were quite good and showed improvements over the mesh method.
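
    The final least-squares step can be sketched as below, assuming the convolution/smoothness construction has already reduced each observation to a residual of the form DCB_sat + DCB_rcv; the zero-mean constraint on satellite DCBs, the data layout, and all names are illustrative assumptions rather than the authors' exact formulation.

    ```python
    import numpy as np

    def estimate_dcbs(obs, n_sat, n_rcv):
        """
        obs : list of (sat_idx, rcv_idx, residual), where 'residual' is the
              geometry-free observable minus the modeled ionospheric delay,
              so that residual ≈ DCB_sat + DCB_rcv (in TEC units or ns).
        Solves for all DCBs in one least-squares adjustment, with the usual
        zero-mean constraint on satellite DCBs to remove the rank defect.
        """
        n_obs = len(obs)
        A = np.zeros((n_obs + 1, n_sat + n_rcv))
        b = np.zeros(n_obs + 1)
        for i, (s, r, res) in enumerate(obs):
            A[i, s] = 1.0
            A[i, n_sat + r] = 1.0
            b[i] = res
        A[n_obs, :n_sat] = 1.0        # constraint: sum of satellite DCBs = 0
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        return x[:n_sat], x[n_sat:]
    ```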

  15. THE PHASES DIFFERENTIAL ASTROMETRY DATA ARCHIVE. II. UPDATED BINARY STAR ORBITS AND A LONG PERIOD ECLIPSING BINARY

    International Nuclear Information System (INIS)

    Muterspaugh, Matthew W.; O'Connell, J.; Hartkopf, William I.; Lane, Benjamin F.; Williamson, M.; Kulkarni, S. R.; Konacki, Maciej; Burke, Bernard F.; Colavita, M. M.; Shao, M.; Wiktorowicz, Sloane J.

    2010-01-01

    Differential astrometry measurements from the Palomar High-precision Astrometric Search for Exoplanet Systems have been combined with lower precision single-aperture measurements covering a much longer timespan (from eyepiece measurements, speckle interferometry, and adaptive optics) to determine improved visual orbits for 20 binary stars. In some cases, radial velocity observations exist to constrain the full three-dimensional orbit and determine component masses. The visual orbit of one of these binaries, α Com (HD 114378), shows that the system is likely to have eclipses, despite its very long period of 26 years. The next eclipse is predicted to be within a week of 2015 January 24.

  16. In Pursuit of LSST Science Requirements: A Comparison of Photometry Algorithms

    Science.gov (United States)

    Becker, Andrew C.; Silvestri, Nicole M.; Owen, Russell E.; Ivezić, Željko; Lupton, Robert H.

    2007-12-01

    We have developed an end-to-end photometric data-processing pipeline to compare current photometric algorithms commonly used on ground-based imaging data. This test bed is exceedingly adaptable and enables us to perform many research and development tasks, including image subtraction and co-addition, object detection and measurements, the production of photometric catalogs, and the creation and stocking of database tables with time-series information. This testing has been undertaken to evaluate existing photometry algorithms for consideration by a next-generation image-processing pipeline for the Large Synoptic Survey Telescope (LSST). We outline the results of our tests for four packages: the Sloan Digital Sky Survey's Photo package, DAOPHOT and ALLFRAME, DOPHOT, and two versions of Source Extractor (SExtractor). The ability of these algorithms to perform point-source photometry, astrometry, shape measurements, and star-galaxy separation and to measure objects at low signal-to-noise ratio is quantified. We also perform a detailed crowded-field comparison of DAOPHOT and ALLFRAME, and profile the speed and memory requirements in detail for SExtractor. We find that both DAOPHOT and Photo are able to perform aperture photometry to high enough precision to meet LSST's science requirements, but perform less well at PSF-fitting photometry. Photo performs the best at simultaneous point- and extended-source shape and brightness measurements. SExtractor is the fastest algorithm, and recent upgrades in the software yield high-quality centroid and shape measurements with little bias toward faint magnitudes. ALLFRAME yields the best photometric results in crowded fields.

  17. Relative astrometry of compact flaring structures in Sgr A* with polarimetric very long baseline interferometry

    International Nuclear Information System (INIS)

    Johnson, Michael D.; Doeleman, Sheperd S.; Fish, Vincent L.; Broderick, Avery E.; Wardle, John F. C.; Marrone, Daniel P.

    2014-01-01

    We demonstrate that polarimetric interferometry can be used to extract precise spatial information about compact polarized flares of Sgr A*. We show that, for a faint dynamical component, a single interferometric baseline suffices to determine both its polarization and projected displacement from the quiescent intensity centroid. A second baseline enables two-dimensional reconstruction of the displacement, and additional baselines can self-calibrate using the flare, enhancing synthesis imaging of the quiescent emission. We apply this technique to simulated 1.3 mm wavelength observations of a 'hot spot' embedded in a radiatively inefficient accretion disk around Sgr A*. Our results indicate that, even with current sensitivities, polarimetric interferometry with the Event Horizon Telescope can achieve ∼5 μas relative astrometry of compact flaring structures near Sgr A* on timescales of minutes.

  18. Smart antennas for nuclear instruments

    International Nuclear Information System (INIS)

    Jain, Ranjan Bala; Singhi, B.M.

    2005-01-01

    Advances in computing and communications are leading to the development of smart embedded nuclear instruments. These instruments have highly sophisticated signal-processing algorithms based on FPGAs and ASICs, along with present-day connectivity and user interfaces. Developments in connectivity, standards, and bus technologies have made it possible to access these instruments over LAN and WAN with suitable reliability and security. To do away with wires, i.e., to access these instruments wirelessly from any location, wireless technology has evolved and become an integral part of day-to-day activities. Environment monitoring can be done remotely if smart antennas are incorporated into these instruments.

  19. Mosaic crystal algorithm for Monte Carlo simulations

    CERN Document Server

    Seeger, P A

    2002-01-01

    An algorithm is presented for calculating reflectivity, absorption, and scattering of mosaic crystals in Monte Carlo simulations of neutron instruments. The algorithm uses multi-step transport through the crystal with an exact solution of the Darwin equations at each step. It relies on the kinematical model for Bragg reflection (with parameters adjusted to reproduce experimental data). For computation of thermal effects (the Debye-Waller factor and coherent inelastic scattering), an expansion of the Debye integral as a rapidly converging series of exponential terms is also presented. Any crystal geometry and plane orientation may be treated. The algorithm has been incorporated into the neutron instrument simulation package NISP. (orig.)

  20. Algorithm integration using ADL (Algorithm Development Library) for improving CrIMSS EDR science product quality

    Science.gov (United States)

    Das, B.; Wilson, M.; Divakarla, M. G.; Chen, W.; Barnet, C.; Wolf, W.

    2013-05-01

    The Algorithm Development Library (ADL) is a framework that mimics the operational Interface Data Processing Segment (IDPS) system currently used to process data from instruments aboard the Suomi National Polar-orbiting Partnership (S-NPP) satellite. The satellite was launched successfully in October 2011. The Cross-track Infrared and Microwave Sounder Suite (CrIMSS) consists of the Advanced Technology Microwave Sounder (ATMS) and Cross-track Infrared Sounder (CrIS) instruments on board S-NPP. These instruments will also be on board JPSS (Joint Polar Satellite System), which will be launched in early 2017. The primary products of the CrIMSS Environmental Data Record (EDR) include global atmospheric vertical temperature, moisture, and pressure profiles (AVTP, AVMP and AVPP) and the Ozone IP (Intermediate Product from CrIS radiances). Several algorithm updates have recently been proposed by CrIMSS scientists that include fixes to the handling of forward modeling errors, a more conservative identification of clear scenes, indexing corrections for daytime products, and relaxed constraints between surface temperature and air temperature for daytime land scenes. We have integrated these improvements into the ADL framework. This work compares the results from the ADL emulation of the future IDPS system incorporating all the suggested algorithm updates with the current official processing results by qualitative and quantitative evaluations. The results show that these algorithm updates improve science product quality.

  1. Surveillance instrumentation for spent-fuel safeguards

    International Nuclear Information System (INIS)

    McKenzie, J.M.; Holmes, J.P.; Gillman, L.K.; Schmitz, J.A.; McDaniel, P.J.

    1978-01-01

    The movement, in a facility, of spent reactor fuel may be tracked using simple instrumentation together with a real-time unfolding algorithm. Experimental measurements, from multiple radiation monitors and crane weight and position monitors, were obtained during spent fuel movements at the G.E. Morris Spent-Fuel Storage Facility. These data and a preliminary version of an unfolding algorithm were used to estimate the position of the centroid and the magnitude of the spent fuel radiation source. The spatial location was estimated to within ±1.5 m and the source magnitude to within ±10% of their true values. Application of this surveillance instrumentation to spent-fuel safeguards is discussed.

  2. An iterative algorithm for calculating stylus radius unambiguously

    International Nuclear Information System (INIS)

    Vorburger, T V; Zheng, A; Renegar, T B; Song, J-F; Ma, L

    2011-01-01

    The stylus radius is an important specification for stylus instruments and is commonly provided by instrument manufacturers. However, it is difficult to measure the stylus radius unambiguously. Accurate profiles of the stylus tip may be obtained by profiling over an object sharper than itself, such as a razor blade. However, the stylus profile thus obtained is a partial arc, and unless the shape of the stylus tip is a perfect sphere or circle, the effective value of the radius depends on the length of the tip profile over which the radius is determined. We have developed an iterative least-squares algorithm aimed at determining the effective least-squares stylus radius unambiguously. So far, the algorithm converges to reasonable results for the least-squares stylus radius. We suggest that the algorithm be considered for adoption in documentary standards describing the properties of stylus instruments.
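
    A minimal sketch of one possible iterative least-squares approach, alternating an algebraic (Kåsa) circle fit with a re-selection of the arc over which the fit is performed, is given below; it only illustrates the idea of making the radius self-consistent with the profile length and is not the authors' algorithm. The arc-selection rule, convergence criterion, and names are assumptions.

    ```python
    import numpy as np

    def fit_circle(x, z):
        """Algebraic (Kåsa) least-squares circle fit: returns (xc, zc, R)."""
        A = np.column_stack([2 * x, 2 * z, np.ones_like(x)])
        b = x ** 2 + z ** 2
        (xc, zc, c), *_ = np.linalg.lstsq(A, b, rcond=None)
        return xc, zc, np.sqrt(c + xc ** 2 + zc ** 2)

    def stylus_radius(x, z, n_iter=20, tol=1e-6):
        """Iterate: fit a circle, restrict the profile to an arc whose lateral
        extent matches the current radius, refit, until R stops changing."""
        mask = np.ones_like(x, dtype=bool)
        R_old = np.inf
        R = np.nan
        for _ in range(n_iter):
            xc, zc, R = fit_circle(x[mask], z[mask])
            if abs(R - R_old) < tol:
                break
            R_old = R
            apex = x[np.argmin(z)]            # tip apex taken as the lowest point
            mask = np.abs(x - apex) <= R      # keep points within ±R of the apex
        return R
    ```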

  3. Musical Instrument Classification Based on Nonlinear Recurrence Analysis and Supervised Learning

    Directory of Open Access Journals (Sweden)

    R.Rui

    2013-04-01

    In this paper, the phase space reconstruction of time series produced by different instruments is discussed based on nonlinear dynamic theory. The dense ratio, a novel quantitative recurrence parameter, is proposed to describe the differences among wind, stringed, and keyboard instruments in the phase space by analyzing the recursive properties of each instrument. Furthermore, a novel supervised learning algorithm for automatic classification of individual musical instrument signals is addressed, derived from the idea of the supervised non-negative matrix factorization (NMF) algorithm. In our approach, the orthogonal basis matrix can be obtained without updating the matrix iteratively, which NMF is unable to do. The experimental results indicate that the accuracy of the proposed method is improved by 3% compared with the conventional features in individual instrument classification.

  4. VizieR Online Data Catalog: Accurate astrometry & RVs of 4 multiple systems (Tokovinin+, 2017)

    Science.gov (United States)

    Tokovinin, A.; Latham, D. W.

    2017-10-01

    The outer subsystems are classical visual binaries. Historic micrometric measurements and modern speckle interferometric data have been obtained from the WDS database on our request. Additionally, we secured new speckle astrometry and relative photometry of two systems at the 4.1m SOAR telescope. Published radial velocities (RVs) are used here together with the new data. The RVs were measured with the CfA Digital Speedometers, initially using the 1.5m Wyeth Reflector at the Oak Ridge Observatory in the town of Harvard, Massachusetts, and subsequently with the 1.5m Tillinghast Reflector at the Whipple Observatory on Mount Hopkins, Arizona. Starting in 2009, the new fiber-fed Tillinghast Reflector Echelle Spectrograph (TRES) was used. The spectral resolution was 44000 for all three spectrographs. Two objects, HIP 101955 and 103987, were observed in 2015 with the CHIRON echelle spectrograph at the 1.5m telescope at CTIO with a spectral resolution of 80000. (4 data files).

  5. Information content of ozone retrieval algorithms

    Science.gov (United States)

    Rodgers, C.; Bhartia, P. K.; Chu, W. P.; Curran, R.; Deluisi, J.; Gille, J. C.; Hudson, R.; Mateer, C.; Rusch, D.; Thomas, R. J.

    1989-01-01

    The algorithms used for production processing by the major suppliers of ozone data are characterized to show quantitatively: how the retrieved profile is related to the actual profile (this characterizes the altitude range and vertical resolution of the data); the nature of systematic errors in the retrieved profiles, including their vertical structure and relation to uncertain instrumental parameters; how trends in the real ozone are reflected in trends in the retrieved ozone profile; and how trends in other quantities (both instrumental and atmospheric) might appear as trends in the ozone profile. No serious deficiencies were found in the algorithms used in generating the major available ozone data sets. As the measurements are all indirect in some way, and the retrieved profiles have different characteristics, data from different instruments are not directly comparable.
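
    The relation between retrieved and true profiles mentioned above is conventionally summarized with the averaging-kernel formalism of Rodgers; the standard linearized form (a general statement, not the specific expression used by any one of the surveyed algorithms) is:

    ```latex
    \hat{x} = x_a + \mathbf{A}\,(x - x_a) + \mathbf{G}\,\epsilon_y + \mathbf{G}\,\mathbf{K}_b\,\Delta b ,
    \qquad \mathbf{A} = \mathbf{G}\,\mathbf{K}
    ```

    Here x is the true profile, x_a the a priori, A the averaging kernel matrix, G the gain (contribution function) matrix, K the weighting-function (Jacobian) matrix, ε_y the measurement noise, and K_b Δb the effect of uncertain forward-model or instrument parameters b. The rows of A characterize the altitude range and vertical resolution of the data, and trends in b propagate into the retrieved profile through G K_b Δb.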

  6. An engineered design of a diffractive mask for high precision astrometry

    Science.gov (United States)

    Dennison, Kaitlin; Ammons, S. Mark; Garrel, Vincent; Marin, Eduardo; Sivo, Gaetano; Bendek, Eduardo; Guyon, Oliver

    2016-07-01

    AutoCAD, Zemax Optic Studio 15, and Interactive Data Language (IDL) with the Proper Library are used to computationally model and test a diffractive mask (DiM) suitable for use in the Gemini Multi-Conjugate Adaptive Optics System (GeMS) on the Gemini South Telescope. Systematic errors in telescope imagery are produced when the light travels through the adaptive optics system of the telescope. The DiM is a transparent, flat optic with a pattern of minuscule dots lithographically applied to it. It is added ahead of the adaptive optics system in the telescope in order to produce diffraction spots that will encode systematic errors in the optics after it. Once these errors are encoded, they can be corrected for. The DiM will allow for more accurate measurements in astrometry and thus improve exoplanet detection. The mechanics and physical attributes of the DiM are modeled in AutoCAD. Zemax models the ray propagation of point sources of light through the telescope. IDL and Proper simulate the wavefront and image results of the telescope. Aberrations are added to the Zemax and IDL models to test how the diffraction spots from the DiM change in the final images. Based on the Zemax and IDL results, the diffraction spots are able to encode the systematic aberrations.

  7. Radar Astrometry of Asteroid 99942 (2004 MN4): Predicting the 2029 Earth Encounter and Beyond

    Science.gov (United States)

    Giorgini, J. D.; Benner, L. A. M.; Nolan, M. C.; Ostro, S. J.

    2005-08-01

    Asteroid 2004 MN4 is expected to pass 4.6 (±1.6) Earth radii above the surface of the Earth on 2029-Apr-13. Such close approaches by objects as large as 2004 MN4 (D ≳ 0.3 km) are thought to occur at ≳ 1000-year intervals on average. 2004 MN4 is expected to reach 3rd magnitude and thus be visible to the unaided eye. With a disk 2-4 arcseconds across, it may be resolved by ground-based telescopes. Arecibo (2380-MHz) delay-Doppler radar astrometry, obtained in late January 2005, significantly corrected 2004 MN4's orbit by revealing a 1.4 arcsecond bias in pre-discovery optical measurements. Doppler-shifted echoes were acquired 4.8σ (176.4 mm/s) away from the predicted frequency on Jan 27. Range on Jan 29 was found to be 747 km (2.8σ) closer to Earth than the pre-radar orbit predicted. Incorporation of these delay-Doppler measurements into a new weighted least-squares orbit solution moved the 2029-Apr-13 encounter prediction 5σ closer to the Earth, illustrating the problematic nature of prediction and statistical analysis with single-apparition optical data-sets. Without delay-Doppler data, the bias was not apparent, even when optical measurements spanned a full orbit period. The current combined data-set does not permit reliable trajectory propagation to encounters beyond 2029; Monte Carlo analysis shows that, by 2036, the 3σ confidence region wraps >300 degrees of heliocentric longitude around the Sun, with some sections of this statistical region experiencing low-probability encounters with the Earth in the 2030's, gravitationally scattering some possible trajectories inward to the orbit of Venus, or outward toward Mars. Future measurements from radar opportunities in August 2005 and May 2006 (SNR ≈5-10) have the potential to eliminate statistical encounters in the 2030's. Delay-Doppler astrometry from 2013 (SNR ≈30) should permit deterministic encounter prediction through 2070, shrinking the along-track uncertainty in 2036 by two orders of magnitude

  8. Recent Radar Astrometry of Asteroid 2004 MN4

    Science.gov (United States)

    Giorgini, J. D.; Benner, L. A. M.; Nolan, M. C.; Ostro, S. J.

    2005-05-01

    Arecibo (2380-MHz) delay-Doppler radar astrometry obtained in late January of 2005 significantly corrected 2004 MN4's orbit. Doppler-shifted echoes were acquired 4.8-sigma away from the predicted frequency on Jan 27, while range to the object on Jan 29 was found to be 747 km (2.8-sigma) closer to Earth than the pre-radar orbit solution predicted. Incorporation of these radar measurements into least-squares orbit solution #82 resulted in a new predicted Earth encounter on 2029-Apr-13 of 36000 ± 9900 km (3-sigma formal uncertainties), or 5.6 ± 1.6 Earth radii, from Earth's center. This is inside geosynchronous orbit and 27700 km (4.3 Earth radii) closer to Earth than predicted by the pre-radar ephemeris, a 5-sigma change compared to the pre-radar orbit solution, illustrating the problematic nature of prediction and statistical analysis when only single-apparition optical data-sets are available. The current data-set does not permit reliable trajectory propagation to encounters later than 2029; this may not be possible until data from 2012-2013 are available. The corrected nominal approach distance in 2029 is approximately twice the classical Roche limit and closer than any known past or future approach by a natural object larger than 10 m, other than those detected after already impacting the Earth or its atmosphere. Such close approaches by objects as large as 2004 MN4 (D ≳ 0.3 km) are currently thought to occur at ≳ 1000-year intervals on average. 2004 MN4 is expected to reach 3rd magnitude for observers in Europe, western Asia, and Africa, and thus be visible to the unaided eye. The asteroid's disk will be 2-4 arcseconds across and potentially resolvable with small ground-based telescopes.

  9. Phase-Retrieval Uncertainty Estimation and Algorithm Comparison for the JWST-ISIM Test Campaign

    Science.gov (United States)

    Aronstein, David L.; Smith, J. Scott

    2016-01-01

    Phase retrieval, the process of determining the exit-pupil wavefront of an optical instrument from image-plane intensity measurements, is the baseline methodology for characterizing the wavefront for the suite of science instruments (SIs) in the Integrated Science Instrument Module (ISIM) for the James Webb Space Telescope (JWST). JWST is a large, infrared space telescope with a 6.5-meter diameter primary mirror. JWST is currently NASA's flagship mission and will be the premier space observatory of the next decade. ISIM contains four optical benches with nine unique instruments, including redundancies. ISIM was characterized at the Goddard Space Flight Center (GSFC) in Greenbelt, MD in a series of cryogenic vacuum tests using a telescope simulator. During these tests, phase-retrieval algorithms were used to characterize the instruments. The objective of this paper is to describe the Monte-Carlo simulations that were used to establish uncertainties (i.e., error bars) for the wavefronts of the various instruments in ISIM. Multiple retrieval algorithms were used in the analysis of ISIM phase-retrieval focus-sweep data, including an iterative-transform algorithm and a nonlinear optimization algorithm. These algorithms emphasize the recovery of numerous optical parameters, including low-order wavefront composition described by Zernike polynomial terms and high-order wavefront described by a point-by-point map, location of instrument best focus, focal ratio, exit-pupil amplitude, the morphology of any extended object, and optical jitter. The secondary objective of this paper is to report on the relative accuracies of these algorithms for the ISIM instrument tests, and a comparison of their computational complexity and their performance on central and graphics processing unit clusters. From a phase-retrieval perspective, the ISIM test campaign includes a variety of source illumination bandwidths, various image-plane sampling criteria above and below the Nyquist-Shannon
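
    As background, the core of an iterative-transform phase-retrieval loop (Gerchberg-Saxton) is sketched below for a single in-focus image; the actual ISIM analysis uses focus-diverse data and parametric nonlinear optimization, so this is only the underlying idea, with FFT-centering details and all names left as simplifying assumptions.

    ```python
    import numpy as np

    def gerchberg_saxton(image_intensity, pupil_support, n_iter=200, seed=0):
        """Recover the pupil phase from a measured PSF intensity and a known
        pupil amplitude support via the classic iterative-transform loop."""
        rng = np.random.default_rng(seed)
        amp_image = np.sqrt(image_intensity)
        field = pupil_support * np.exp(
            1j * rng.uniform(-np.pi, np.pi, pupil_support.shape))
        for _ in range(n_iter):
            # Propagate pupil -> image, impose the measured image amplitude.
            img = np.fft.fft2(field)
            img = amp_image * np.exp(1j * np.angle(img))
            # Propagate back, impose the known pupil amplitude (support).
            field = np.fft.ifft2(img)
            field = pupil_support * np.exp(1j * np.angle(field))
        return np.angle(field) * pupil_support
    ```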

  10. Development of quality control and instrumentation performance metrics for diffuse optical spectroscopic imaging instruments in the multi-center clinical environment

    Science.gov (United States)

    Keene, Samuel T.; Cerussi, Albert E.; Warren, Robert V.; Hill, Brian; Roblyer, Darren; Leproux, Anaïs; Durkin, Amanda F.; O'Sullivan, Thomas D.; Haghany, Hosain; Mantulin, William W.; Tromberg, Bruce J.

    2013-03-01

    Instrument equivalence and quality control are critical elements of multi-center clinical trials. We currently have five identical Diffuse Optical Spectroscopic Imaging (DOSI) instruments enrolled in the American College of Radiology Imaging Network (ACRIN, #6691) trial located at five academic clinical research sites in the US. The goal of the study is to predict the response of breast tumors to neoadjuvant chemotherapy in 60 patients. In order to reliably compare DOSI measurements across different instruments, operators and sites, we must be confident that the data quality is comparable. We require objective and reliable methods for identifying, correcting, and rejecting low quality data. To achieve this goal, we developed and tested an automated quality control algorithm that rejects data points below the instrument noise floor, improves tissue optical property recovery, and outputs a detailed data quality report. Using a new protocol for obtaining dark-noise data, we applied the algorithm to ACRIN patient data and successfully improved the quality of recovered physiological data in some cases.
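
    The noise-floor rejection step can be as simple as the sketch below, which flags data points falling under a multiple of a separately measured dark-noise level; the threshold factor and names are illustrative assumptions, not the trial's actual acceptance criterion.

    ```python
    import numpy as np

    def reject_below_noise_floor(amplitude, dark_amplitude, k=3.0):
        """Return a boolean mask that keeps only points whose measured amplitude
        exceeds k times the dark-noise amplitude recorded with the source off."""
        amplitude = np.asarray(amplitude, dtype=float)
        dark_amplitude = np.asarray(dark_amplitude, dtype=float)
        return amplitude > k * dark_amplitude
    ```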

  11. Evaluation of beam tracking strategies for the THOR-CSW solar wind instrument

    Science.gov (United States)

    De Keyser, Johan; Lavraud, Benoit; Prech, Lubomir; Neefs, Eddy; Berkenbosch, Sophie; Beeckman, Bram; Maggiolo, Romain; Fedorov, Andrei; Baruah, Rituparna; Wong, King-Wah; Amoros, Carine; Mathon, Romain; Génot, Vincent

    2017-04-01

    We compare different beam tracking strategies for the Cold Solar Wind (CSW) plasma spectrometer on the ESA M4 THOR mission candidate. The goal is to intelligently select the energy and angular windows the instrument samples and to adapt these windows as the solar wind properties evolve, with the aim of maximizing the velocity distribution acquisition rate while maintaining excellent energy and angular resolution. Using synthetic data constructed from high-cadence measurements by the Faraday cup instrument on the Spektr-R mission (30 ms resolution), we test the performance of energy beam tracking with or without angular beam tracking. The algorithm can be fed either by data acquired by the plasma spectrometer during the previous measurement cycle, or by data from another instrument, in this case the Faraday Cup (FAR) instrument foreseen on THOR. We verify how these beam tracking algorithms behave for different sizes of the energy and angular windows, and for different data integration times, in order to assess the limitations of the algorithm and to avoid situations in which the algorithm loses track of the beam.

  12. A Long Journey of Mathematics and Astronomy in Romania

    Science.gov (United States)

    Stavinschi, Magda

    2010-10-01

    Bucharest Astronomical Observatory recently celebrated its centenary. Its founders were all mathematicians or, better said, astronomers specialized in celestial mechanics. Their first doctoral theses were defended at the Sorbonne, in the second half of the 19th century, under the guidance of the greatest specialists of the time. After they returned home, they continued what they had begun in Paris, namely celestial mechanics. The instruments they ordered and the first programmes of astronomical observations had an increasingly close relation to mathematics, as they referred to astrometry and especially to stellar catalogues. Naturally, there were also astrophysical concerns, timid ones in the beginning, and then ever larger, especially beginning with the International Geophysical Year. The evolution of world astronomy, as well as that of Romania, seems to be following but one direction: astrophysics. The truth is that astrometry and celestial mechanics continue to lie at the basis of all astrophysical research, actually in an entirely new and modern form. The astrometry schools recently organized, the new astrometry textbooks, as well as the IAU working groups dedicated to modern astrometry prove that the long journey of mathematics and astronomy is not over yet.

  13. XML in an Adaptive Framework for Instrument Control

    Science.gov (United States)

    Ames, Troy J.

    2004-01-01

    NASA Goddard Space Flight Center is developing an extensible framework for instrument command and control, known as Instrument Remote Control (IRC), that combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms.

  14. Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review.

    Science.gov (United States)

    Malkin, Zinovy

    2016-04-01

    The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the frequency standards deviations. For the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics in geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, an appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, three station coordinates time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series.
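
    A short sketch of the classical Allan variance and of one common weighted variant in which each first difference is weighted by the inverse of its combined formal variance. The weighted form shown here is an assumption based on common usage; Malkin's review should be consulted for the exact WAVAR/MAVAR/WMAVAR definitions used in astrometry and geodesy.

```python
import numpy as np

def allan_variance(y):
    """Classical (unit-lag) Allan variance of a 1-D time series."""
    d = np.diff(y)
    return 0.5 * np.mean(d ** 2)

def weighted_allan_variance(y, sigma):
    """Weighted Allan variance for unevenly weighted data: each first difference is
    weighted by the inverse of its combined variance (a common WAVAR form; see the
    review for the exact definition)."""
    d = np.diff(y)
    p = 1.0 / (sigma[1:] ** 2 + sigma[:-1] ** 2)
    return 0.5 * np.sum(p * d ** 2) / np.sum(p)

# usage on a noisy series with varying formal errors
rng = np.random.default_rng(2)
sigma = rng.uniform(0.5, 2.0, 500)
y = rng.normal(0.0, sigma)
print(allan_variance(y), weighted_allan_variance(y, sigma))
```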

  15. Aerosols and surface UV products from Ozone Monitoring Instrument observations: An overview

    NARCIS (Netherlands)

    Torres, O.; Tanskanen, A.; Veihelmann, B.; Ahn, C.; Braak, R.; Bhartia, P.K.; Veefkind, J.P.; Levelt, P.F.

    2007-01-01

    We present an overview of the theoretical and algorithmic aspects of the Ozone Monitoring Instrument (OMI) aerosol and surface UV algorithms. Aerosol properties are derived from two independent algorithms. The near-UV algorithm makes use of OMI observations in the 350-390 nm spectral region to

  16. A novel gridding algorithm to create regional trace gas maps from satellite observations

    Science.gov (United States)

    Kuhlmann, G.; Hartl, A.; Cheung, H. M.; Lam, Y. F.; Wenig, M. O.

    2014-02-01

    The recent increase in spatial resolution for satellite instruments has made it feasible to study distributions of trace gas column densities on a regional scale. For this application a new gridding algorithm was developed to map measurements from the instrument's frame of reference (level 2) onto a longitude-latitude grid (level 3). The algorithm is designed for the Ozone Monitoring Instrument (OMI) and can easily be employed for similar instruments - for example, the upcoming TROPOspheric Monitoring Instrument (TROPOMI). Trace gas distributions are reconstructed by a continuous parabolic spline surface. The algorithm explicitly considers the spatially varying sensitivity of the sensor resulting from the instrument function. At the swath edge, the inverse problem of computing the spline coefficients is very sensitive to measurement errors and is regularised by a second-order difference matrix. Since this regularisation corresponds to the penalty term for smoothing splines, it similarly attenuates the effect of measurement noise over the entire swath width. Monte Carlo simulations are conducted to study the performance of the algorithm for different distributions of trace gas column densities. The optimal weight of the penalty term is found to be proportional to the measurement uncertainty and the width of the instrument function. A comparison with an established gridding algorithm shows improved performance for small to moderate measurement errors due to better parametrisation of the distribution. The resulting maps are smoother and extreme values are more accurately reconstructed. The performance improvement is further illustrated with high-resolution distributions obtained from a regional chemistry model. The new algorithm is applied to tropospheric NO2 column densities measured by OMI. Examples of regional NO2 maps are shown for densely populated areas in China, Europe and the United States of America. This work demonstrates that the newly developed gridding
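
    The key idea, an averaging (instrument-function) forward model regularised by a second-order difference penalty, can be illustrated in one dimension with ordinary regularised least squares. This is only an analogue of the published 2-D parabolic-spline algorithm; the grid, footprints and penalty weight below are illustrative assumptions.

```python
import numpy as np

def grid_with_smoothness_penalty(obs, footprints, n_grid, lam):
    """Reconstruct a 1-D gridded field from footprint-averaged observations,
    regularised by a second-order difference penalty.

    obs        : observed column densities, shape (n_obs,)
    footprints : averaging matrix A, shape (n_obs, n_grid); row i is the
                 normalised instrument function of observation i on the grid
    lam        : penalty weight (larger values give a smoother field)
    """
    A = footprints
    D2 = np.diff(np.eye(n_grid), n=2, axis=0)          # second-order difference operator
    lhs = A.T @ A + lam * (D2.T @ D2)
    rhs = A.T @ obs
    return np.linalg.solve(lhs, rhs)

# usage: coarse overlapping footprints observing a smooth synthetic field
n_grid, n_obs = 100, 40
x = np.linspace(0, 1, n_grid)
truth = np.exp(-((x - 0.4) / 0.1) ** 2)
A = np.zeros((n_obs, n_grid))
for i, c in enumerate(np.linspace(0, 1, n_obs)):
    w = np.exp(-((x - c) / 0.05) ** 2)
    A[i] = w / w.sum()
obs = A @ truth + np.random.default_rng(3).normal(0, 0.01, n_obs)
field = grid_with_smoothness_penalty(obs, A, n_grid, lam=1e-3)
```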

  17. A novel gridding algorithm to create regional trace gas maps from satellite observations

    Directory of Open Access Journals (Sweden)

    G. Kuhlmann

    2014-02-01

    Full Text Available The recent increase in spatial resolution for satellite instruments has made it feasible to study distributions of trace gas column densities on a regional scale. For this application a new gridding algorithm was developed to map measurements from the instrument's frame of reference (level 2) onto a longitude–latitude grid (level 3). The algorithm is designed for the Ozone Monitoring Instrument (OMI) and can easily be employed for similar instruments – for example, the upcoming TROPOspheric Monitoring Instrument (TROPOMI). Trace gas distributions are reconstructed by a continuous parabolic spline surface. The algorithm explicitly considers the spatially varying sensitivity of the sensor resulting from the instrument function. At the swath edge, the inverse problem of computing the spline coefficients is very sensitive to measurement errors and is regularised by a second-order difference matrix. Since this regularisation corresponds to the penalty term for smoothing splines, it similarly attenuates the effect of measurement noise over the entire swath width. Monte Carlo simulations are conducted to study the performance of the algorithm for different distributions of trace gas column densities. The optimal weight of the penalty term is found to be proportional to the measurement uncertainty and the width of the instrument function. A comparison with an established gridding algorithm shows improved performance for small to moderate measurement errors due to better parametrisation of the distribution. The resulting maps are smoother and extreme values are more accurately reconstructed. The performance improvement is further illustrated with high-resolution distributions obtained from a regional chemistry model. The new algorithm is applied to tropospheric NO2 column densities measured by OMI. Examples of regional NO2 maps are shown for densely populated areas in China, Europe and the United States of America. This work demonstrates that the newly

  18. The operational methane retrieval algorithm for TROPOMI

    Directory of Open Access Journals (Sweden)

    H. Hu

    2016-11-01

    Full Text Available This work presents the operational methane retrieval algorithm for the Sentinel 5 Precursor (S5P) satellite and its performance tested on realistic ensembles of simulated measurements. The target product is the column-averaged dry air volume mixing ratio of methane (XCH4), which will be retrieved simultaneously with scattering properties of the atmosphere. The algorithm attempts to fit spectra observed by the shortwave and near-infrared channels of the TROPOspheric Monitoring Instrument (TROPOMI) spectrometer aboard S5P. The sensitivity of the retrieval performance to atmospheric scattering properties, atmospheric input data and instrument calibration errors is evaluated. In addition, we investigate the effect of inhomogeneous slit illumination on the instrument spectral response function. Finally, we discuss the cloud filters to be used operationally and as backup. We show that the required accuracy and precision of < 1 % for the XCH4 product are met for clear-sky measurements over land surfaces and after appropriate filtering of difficult scenes. The algorithm is very stable, having a convergence rate of 99 %. The forward model error is less than 1 % for about 95 % of the valid retrievals. Model errors in the input profile of water do not influence the retrieval outcome noticeably. The methane product is expected to meet the requirements if errors in input profiles of pressure and temperature remain below 0.3 % and 2 K, respectively. We further find that, of all instrument calibration errors investigated here, our retrievals are the most sensitive to an error in the instrument spectral response function of the shortwave infrared channel.

  19. Using wound care algorithms: a content validation study.

    Science.gov (United States)

    Beitz, J M; van Rijswijk, L

    1999-09-01

    Valid and reliable heuristic devices facilitating optimal wound care are lacking. The objectives of this study were to establish content validation data for a set of wound care algorithms, to identify their associated strengths and weaknesses, and to gain insight into the wound care decision-making process. Forty-four registered nurse wound care experts were surveyed and interviewed at national and regional educational meetings. Using a cross-sectional study design and an 83-item, 4-point Likert-type scale, this purposive sample was asked to quantify the degree of validity of the algorithms' decisions and components. Participants' comments were tape-recorded, transcribed, and themes were derived. On a scale of 1 to 4, the mean score of the entire instrument was 3.47 (SD +/- 0.87), the instrument's Content Validity Index was 0.86, and the individual Content Validity Index of 34 of 44 participants was > 0.8. Item scores were lower for those related to packing deep wounds and for items lacking valid and reliable definitions. The wound care algorithms studied proved valid. However, the lack of valid and reliable wound assessment and care definitions hinders optimal use of these instruments. Further research documenting their clinical use is warranted. Research-based practice recommendations should direct the development of future valid and reliable algorithms designed to help nurses provide optimal wound care.

  20. Cellular telephone-based radiation detection instrument

    Science.gov (United States)

    Craig, William W [Pittsburg, CA; Labov, Simon E [Berkeley, CA

    2011-06-14

    A network of radiation detection instruments, each having a small solid state radiation sensor module integrated into a cellular phone for providing radiation detection data and analysis directly to a user. The sensor module includes a solid-state crystal bonded to an ASIC readout providing a low cost, low power, light weight compact instrument to detect and measure radiation energies in the local ambient radiation field. In particular, the photon energy, time of event, and location of the detection instrument at the time of detection is recorded for real time transmission to a central data collection/analysis system. The collected data from the entire network of radiation detection instruments are combined by intelligent correlation/analysis algorithms which map the background radiation and detect, identify and track radiation anomalies in the region.

  1. Accurately Localize and Recognize Instruments with Substation Inspection Robot in Complex Environments

    Directory of Open Access Journals (Sweden)

    Hui Song

    2014-07-01

    Full Text Available This paper designs and develops an automatic detection system for the substation environment, where many complex objects must be inspected. The inspection robot is able to locate and identify the objects quickly using a visual servo control system. This paper focuses on a fast localization and recognition method for substation instruments based on an improved Adaboost algorithm. The robot adjusts its position to the best viewpoint and best resolution for the instrument in real time. The dial and pointer of the instruments are detected with an improved Hough algorithm, and the angle of the pointer is converted to the corresponding reading. The experimental results indicate that the inspection robot can locate and identify the substation instruments quickly, and has a wide range of practical applications.
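
    The final step, converting a detected pointer angle into a reading, is essentially a linear interpolation between the dial's zero and full-scale marks. The sketch below shows only that step under assumed dial calibration values; detecting the dial and pointer with the improved Hough transform is not reproduced here.

```python
import numpy as np

def pointer_reading(pointer_angle_deg, zero_angle_deg, full_angle_deg,
                    min_value, max_value):
    """Convert a detected pointer angle into an instrument reading by linear
    interpolation between the dial's zero and full-scale marks (illustrative
    angle-to-reading step; angles are measured in the dial's own convention)."""
    span = (full_angle_deg - zero_angle_deg) % 360.0
    offset = (pointer_angle_deg - zero_angle_deg) % 360.0
    frac = np.clip(offset / span, 0.0, 1.0)
    return min_value + frac * (max_value - min_value)

# usage: a gauge whose scale sweeps 270 degrees from 0 to 1.6 MPa
print(pointer_reading(135.0, zero_angle_deg=0.0, full_angle_deg=270.0,
                      min_value=0.0, max_value=1.6))   # -> 0.8
```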

  2. VLBI: A Fascinating Technique for Geodesy and Astrometry

    Science.gov (United States)

    Schuh, H.; Behrend, Dirk

    2012-01-01

    Since the 1970s Very Long Baseline Interferometry (VLBI) has proven to be a primary space-geodetic technique by determining precise coordinates on the Earth, by monitoring the variable Earth rotation and orientation with highest precision, and by deriving many other parameters of the Earth system. VLBI provides an important linkage to astronomy through, for instance, the determination of very precise coordinates of extragalactic radio sources. Additionally, it contributes to determining parameters of relativistic and cosmological models. After a short review of the history of geodetic VLBI and a summary of recent results, this paper describes future perspectives of this fascinating technique. The International VLBI Service for Geodesy and Astrometry (IVS), as a service of the International Association of Geodesy (IAG) and the International Astronomical Union (IAU), is well on its way to fully defining a next generation VLBI system, called VLBI2010. The goals of the new system are to achieve on scales up to the size of the Earth an accuracy of 1 mm in position and of 0.1 mm/year in velocity. Continuous observations shall be carried out 24 h per day 7 days per week in the future with initial results to be delivered within 24 h after taking the data. Special sessions, e.g. for monitoring the Earth rotation parameters, will provide the results in near real-time. These goals require a completely new technical and conceptual design of VLBI measurements. Based on extensive simulation studies, strategies have been developed by the IVS to significantly improve its product accuracy through the use of a network of small (approx 12 m) fast-slewing antennas. A new method for generating high precision delay measurements as well as improved methods for handling biases related to radio source structure, system electronics, and deformations of the antenna structures has been developed. Furthermore, as of January 2012, the construction of ten new VLBI2010 sites has been funded, with

  3. Instrument for track linear element recognition

    International Nuclear Information System (INIS)

    Krupnov, V.E.; Fedotov, O.P.

    1977-01-01

    Described is the construction of an instrument for recognizing linear elements of tracks. For designing this instrument, use has been made of an algorithm for converting point data into a set of linear elements. The flowsheet of the instrument shows its major units, such as the data converter, the data representation register unit, local computers, and the interface with the central computer. The data representation register unit comprises sixteen registers and is capable of presenting data from sixteen lines during raster scanning of a picture taken from a track chamber. The maximum capacity of the code of the coordinate of a point recorded on a picture is up to 16 digits. The time of the inner operating cycle of the instrument is 1.3 μs. The average time required for processing data containing sixteen scanning lines is 250 μs

  4. International VLBI Service for Geodesy and Astrometry 2000 Annual Report

    Science.gov (United States)

    Vandenberg, N. R. (Editor); Baver, K. D. (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    This volume of reports is the 2000 Annual Report of the International Very Long Baseline Interferometry (VLBI) Service for Geodesy and Astrometry (IVS). The individual reports were contributed by VLBI groups in the international geodetic and astrometric community who constitute the permanent components of IVS. The IVS 2000 Annual Report documents the work of the IVS components for the period March 1, 1999 (the official inauguration date of IVS) through December 31, 2000. The reports document changes, activities, and progress of the IVS. The entire contents of this Annual Report also appear on the IVS web site at http://ivscc.gsfc.nasa.gov/publications/ar2000. This book and the web site are organized as follows: (1) The first section contains general information about IVS, a map showing the location of the components, information about the Directing Board members, and the report of the IVS Chair; (2) The second section of Special Reports contains a status report of the IVS Working Group on GPS phase center mapping, a reproduction of the resolution making IVS a Service of the International Astronomical Union (IAU), and a reprint of the VLBI Standard Interface (VSI); (3) The next seven sections hold the component reports from the Coordinators, Network Stations, Operation Centers, Correlators, Data Centers, Analysis Centers, and Technology Development Centers; and (4) The last section includes reference information about IVS: the Terms of Reference, the lists of Member and Affiliated organizations, the IVS Associate Member list, a complete list of IVS components, the list of institutions contributing to this report, and a list of acronyms. The 2000 Annual Report demonstrates the vitality of the IVS and the outstanding progress we have made during our first 22 months.

  5. Introducing ADES: A New IAU Astrometry Data Exchange Standard

    Science.gov (United States)

    Chesley, Steven R.; Hockney, George M.; Holman, Matthew J.

    2017-10-01

    For several decades, small body astrometry has been exchanged, distributed and archived in the form of 80-column ASCII records. As a replacement for this obsolescent format, we have worked with a number of members of the community to develop the Astrometric Data Exchange Standard (ADES), which was formally adopted by IAU Commission 20 in August 2015 at the XXIX General Assembly in Honolulu, Hawaii. The purpose of ADES is to ensure that useful and available observational information is submitted, archived, and disseminated as needed. Availability of more complete information will allow orbit computers to process the data more correctly, leading to improved accuracy and reliability of orbital fits. In this way, it will be possible to fully exploit the improving accuracy and increasing number of both optical and radar observations. ADES overcomes several limitations of the previous format by allowing characterization of astrometric and photometric errors, adequate precision in time and angle fields, and flexibility and extensibility. To accommodate a diverse base of users, from automated surveys to hands-on follow-up observers, the ADES protocol allows for two file formats, eXtensible Markup Language (XML) and Pipe-Separated Values (PSV). Each format carries the same information and simple tools allow users to losslessly transform back and forth between XML and PSV. We have further developed and refined ADES since it was first announced in July 2015 [1]. The proposal at that time [2] has undergone several modest revisions to aid validation and avoid overloaded fields. We now have validation schema and file transformation utilities. Suitable example files, test suites, and input/output libraries in a number of modern programming languages are now available. Acknowledgements: Useful feedback during the development of ADES has been received from numerous colleagues in the community of observers and orbit specialists working on asteroids, comets, and planetary satellites.
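
    A toy sketch of the PSV-to-XML side of such a round trip, using a small subset of ADES-style field names. The field names and values shown below are illustrative assumptions; the authoritative field list, grouping and validation rules are defined by the official ADES schema and documentation.

```python
import xml.etree.ElementTree as ET

# Illustrative subset of ADES-style optical-astrometry fields; the official
# schema is more extensive and should be consulted for real submissions.
PSV_HEADER = "permID|mode|stn|obsTime|ra|dec|rmsRA|rmsDec|mag|band"
PSV_ROW = "433|CCD|F51|2017-08-09T12:34:56.78Z|215.101245|-12.032156|0.05|0.05|17.2|G"

def psv_record_to_xml(header_line, data_line):
    """Convert one pipe-separated observation record into an XML element."""
    fields = [f.strip() for f in header_line.split("|")]
    values = [v.strip() for v in data_line.split("|")]
    optical = ET.Element("optical")
    for name, value in zip(fields, values):
        ET.SubElement(optical, name).text = value
    return optical

obs = psv_record_to_xml(PSV_HEADER, PSV_ROW)
print(ET.tostring(obs, encoding="unicode"))
```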

  6. Implementation of the Global Parameters Determination in Gaia's Astrometric Solution (AGIS)

    Science.gov (United States)

    Raison, F.; Olias, A.; Hobbs, D.; Lindegren, L.

    2010-12-01

    Gaia is ESA’s space astrometry mission with a foreseen launch date in early 2012. Its main objective is to perform a stellar census of the 1000 Million brightest objects in our galaxy (completeness to V=20 mag) from which an astrometric catalog of micro-arcsec level accuracy will be constructed. A key element in this endeavor is the Astrometric Global Iterative Solution (AGIS). A core part of AGIS is to determine the accurate spacecraft attitude, geometric instrument calibration and astrometric model parameters for a well-behaved subset of all the objects (the ‘primary stars’). In addition, a small number of global parameters will be estimated, one of these being PPN γ. We present here the implementation of the algorithms dedicated to the determination of the global parameters.

  7. Image reconstruction design of industrial CT instrument for teaching

    International Nuclear Information System (INIS)

    Zou Yongning; Cai Yufang

    2009-01-01

    The industrial CT instrument for teaching is used in teaching and study in the fields of physics and radiology, and image reconstruction is an important part of the CT instrument's software. The paper expounds the physical principles of CT and the first-generation CT reconstruction algorithm, describes the scanning process of the teaching industrial CT instrument, analyzes the image artifacts caused by displacement of the rotation center, implements a method for correcting the center displacement, and presents the design and completion of the image reconstruction software; application shows that the reconstructed images are clear and of high quality. (authors)
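
    A minimal sketch of one way to correct the rotation-centre displacement artifact: resample each projection so the measured axis of rotation falls on the detector mid-column before reconstruction. The sign convention and parallel-beam (first-generation) geometry are assumptions for illustration and may differ from the instrument's actual correction.

```python
import numpy as np

def correct_rotation_center(sinogram, center_offset_px):
    """Shift every projection by the measured rotation-centre offset before
    reconstruction (illustrative correction for the centre-displacement artifact).

    sinogram         : array of shape (n_angles, n_detector_bins)
    center_offset_px : displacement of the rotation centre in detector pixels
    """
    n_bins = sinogram.shape[1]
    x = np.arange(n_bins)
    corrected = np.empty_like(sinogram, dtype=float)
    for i, proj in enumerate(sinogram):
        # resample the projection on a grid shifted by the centre offset
        corrected[i] = np.interp(x, x - center_offset_px, proj, left=0.0, right=0.0)
    return corrected

# usage: a synthetic sinogram with a 2.5-pixel centre error
rng = np.random.default_rng(4)
sino = rng.random((180, 256))
sino_fixed = correct_rotation_center(sino, center_offset_px=2.5)
```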

  8. [Algorithm for securing an unexpected difficult airway : User analysis on a simulator].

    Science.gov (United States)

    Ott, T; Truschinski, K; Kriege, M; Naß, M; Herrmann, S; Ott, V; Sellin, S

    2018-01-01

    Critical incidents in difficult airway management are still a main contributory factor for perioperative morbidity and mortality. Many national associations have developed algorithms for management of these time critical events. For implementation of these algorithms the provision of technical requirements and procedure-related training are essential. Severe airway incidents are rare events and clinical experience of the individual operators is limited; therefore, simulation is an adequate instrument for training and evaluating difficult airway algorithms. The aim of this observational study was to evaluate the application of the institutional difficult airway algorithm among anesthetists. After ethics committee approval, anesthetists were observed while treating a "cannot intubate" (CI) and a "cannot intubate, cannot ventilate" (CICV) situation in the institutional simulation center. As leader of a supportive team the participants had to deal with an unexpected difficult airway after induction of anesthesia in a patient simulator. The following data were recorded: sequence of the applied airway instruments, time to ventilation after establishing a secured airway using any instrument in the CI situation and time to ventilation via cricothyrotomy in the CICV situation. Conformity to the algorithm was defined by the sequence of the applied instruments. Analysis comprised conformity to the algorithm, non-parametric tests for time to ventilation and differences between junior and senior anesthetists. Out of 50 participants 45 were analyzed in the CI situation. In this situation 93% of the participants acted in conformity with the algorithm. In 62% the airway was secured by flexible intubation endoscopy, in 38% with another device. Data from 46 participants were analyzed in the CICV situation. In this situation 91% acted in conformity with the algorithm. The last device used prior to the decision for cricothyrotomy was flexible intubation endoscopy in 39%, a

  9. Diabetes and quality of life: Comparing results from utility instruments and Diabetes-39.

    Science.gov (United States)

    Chen, Gang; Iezzi, Angelo; McKie, John; Khan, Munir A; Richardson, Jeff

    2015-08-01

    To compare the Diabetes-39 (D-39) with six multi-attribute utility (MAU) instruments (15D, AQoL-8D, EQ-5D, HUI3, QWB, and SF-6D), and to develop mapping algorithms which could be used to transform the D-39 scores into the MAU scores. Self-reported diabetes sufferers (N=924) and members of the healthy public (N=1760), aged 18 years and over, were recruited from 6 countries (Australia 18%, USA 18%, UK 17%, Canada 16%, Norway 16%, and Germany 15%). Apart from the QWB, which was distributed normally, non-parametric rank tests were used to compare subgroup utilities and D-39 scores. Mapping algorithms were estimated using ordinary least squares (OLS) and generalised linear models (GLM). MAU instruments discriminated between diabetes patients and the healthy public; however, utilities varied between instruments. The 15D, SF-6D, AQoL-8D had the strongest correlations with the D-39. Except for the HUI3, there were significant differences by gender. Mapping algorithms based on the OLS estimator consistently gave better goodness-of-fit results. The mean absolute error (MAE) values ranged from 0.061 to 0.147, the root mean square error (RMSE) values from 0.083 to 0.198, and the R-square statistics from 0.428 to 0.610. Based on MAE and RMSE values the preferred mapping is D-39 into 15D. R-square statistics and the range of predicted utilities indicate the preferred mapping is D-39 into AQoL-8D. Utilities estimated from different MAU instruments differ significantly and the outcome of a study could depend upon the instrument used. The algorithms reported in this paper enable D-39 data to be mapped into utilities predicted from any of six instruments. This provides choice for those conducting cost-utility analyses. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
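
    A generic sketch of the OLS mapping step: regress utilities from one MAU instrument on D-39 dimension scores and report in-sample MAE and RMSE. All data and coefficients below are synthetic; the published algorithms report instrument-specific coefficients estimated from the six-country survey data.

```python
import numpy as np

def fit_mapping(d39_scores, utilities):
    """Ordinary-least-squares mapping from D-39 dimension scores to a utility index.

    d39_scores : array of shape (n_patients, n_dimensions)
    utilities  : utility values from one MAU instrument, shape (n_patients,)
    Returns the coefficient vector (intercept first) and in-sample MAE/RMSE.
    (Generic OLS sketch, not the published instrument-specific coefficients.)"""
    X = np.column_stack([np.ones(len(d39_scores)), d39_scores])
    beta, *_ = np.linalg.lstsq(X, utilities, rcond=None)
    pred = X @ beta
    mae = np.mean(np.abs(pred - utilities))
    rmse = np.sqrt(np.mean((pred - utilities) ** 2))
    return beta, mae, rmse

# usage with synthetic data (5 hypothetical D-39 dimensions)
rng = np.random.default_rng(5)
scores = rng.uniform(0, 100, size=(300, 5))
true_beta = np.array([0.95, -0.002, -0.001, -0.0015, -0.0005, -0.001])
util = np.column_stack([np.ones(300), scores]) @ true_beta + rng.normal(0, 0.05, 300)
beta, mae, rmse = fit_mapping(scores, util)
```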

  10. Digital instrument for reactivity measurements in a nuclear reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chwaszczewski, S [Institute of Nuclear Research, Warsaw (Poland)

    1979-07-01

    An instrument for digital determination of the reactivity in nuclear reactors is described. It is based on the CAMAC standard apparatus, suitable for the use of pulse or current type neutron detectors and operates with prompt response and an output signal proportional to the core neutron flux. The measured data of neutron flux and reactivity can be registered by a digital display unit, an indicator, or, by request of the operator, a paper tape punch. The algorithms used for reactivity calculation are considered and the results of numerical studies on those algorithms are discussed. The instrument has been used for determining the reactivity of the control elements in the fast-thermal assembly ANNA and in the research reactor MARIA. Some results of these measurements are given.

  11. Hubble Source Catalog

    Science.gov (United States)

    Lubow, S.; Budavári, T.

    2013-10-01

    We have created an initial catalog of objects observed by the WFPC2 and ACS instruments on the Hubble Space Telescope (HST). The catalog is based on observations taken on more than 6000 visits (telescope pointings) of ACS/WFC and more than 25000 visits of WFPC2. The catalog is obtained by cross matching by position in the sky all Hubble Legacy Archive (HLA) Source Extractor source lists for these instruments. The source lists describe properties of source detections within a visit. The calculations are performed on a SQL Server database system. First we collect overlapping images into groups, e.g., Eta Car, and determine nearby (approximately matching) pairs of sources from different images within each group. We then apply a novel algorithm for improving the cross matching of pairs of sources by adjusting the astrometry of the images. Next, we combine pairwise matches into maximal sets of possible multi-source matches. We apply a greedy Bayesian method to split the maximal matches into more reliable matches. We test the accuracy of the matches by comparing the fluxes of the matched sources. The result is a set of information that ties together multiple observations of the same object. A byproduct of the catalog is greatly improved relative astrometry for many of the HST images. We also provide information on nondetections that can be used to determine dropouts. With the catalog, for the first time, one can carry out time domain, multi-wavelength studies across a large set of HST data. The catalog is publicly available. Much more can be done to expand the catalog capabilities.
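
    A brute-force sketch of the pairwise positional matching step, pairing each source in one detection list with the nearest source in another within a small radius. The matching radius and the small-angle separation metric are illustrative; the HSC pipeline additionally re-fits the image astrometry and applies a Bayesian split of ambiguous multi-source matches.

```python
import numpy as np

def crossmatch(ra1, dec1, ra2, dec2, radius_arcsec=0.1):
    """Match sources from two detection lists by sky position, pairing each source
    in list 1 with the nearest list-2 source within `radius_arcsec`."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    radius = np.radians(radius_arcsec / 3600.0)
    matches = []
    for i in range(len(ra1)):
        # small-angle separation with a cos(dec) metric
        d_ra = (ra2 - ra1[i]) * np.cos(dec1[i])
        d_dec = dec2 - dec1[i]
        sep = np.hypot(d_ra, d_dec)
        j = int(np.argmin(sep))
        if sep[j] <= radius:
            matches.append((i, j, np.degrees(sep[j]) * 3600.0))
    return matches   # list of (index1, index2, separation in arcsec)

# usage with two small synthetic source lists offset by ~0.02 arcsec of noise
rng = np.random.default_rng(6)
ra_a, dec_a = rng.uniform(150.0, 150.1, 50), rng.uniform(2.0, 2.1, 50)
ra_b = ra_a + rng.normal(0, 5e-6, 50)
dec_b = dec_a + rng.normal(0, 5e-6, 50)
pairs = crossmatch(ra_a, dec_a, ra_b, dec_b)
```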

  12. ASTROMETRY AND RADIAL VELOCITIES OF THE PLANET HOST M DWARF GJ 317: NEW TRIGONOMETRIC DISTANCE, METALLICITY, AND UPPER LIMIT TO THE MASS OF GJ 317b

    International Nuclear Information System (INIS)

    Anglada-Escudé, Guillem; Boss, Alan P.; Weinberger, Alycia J.; Butler, R. Paul; Thompson, Ian B.; Vogt, Steven S.; Rivera, Eugenio J.

    2012-01-01

    We have obtained precision astrometry of the planet host M dwarf GJ 317 in the framework of the Carnegie Astrometric Planet Search project. The new astrometric measurements give a distance determination of 15.3 pc, 65% further than previous estimates. The resulting absolute magnitudes suggest that it is metal-rich and more massive than previously assumed. This result strengthens the correlation between high metallicity and the presence of gas giants around low-mass stars. At 15.3 pc, the minimal astrometric amplitude for planet candidate GJ 317b is 0.3 mas (edge-on orbit), just below our astrometric sensitivity. However, given the relatively large number of observations and good astrometric precision, a Bayesian Monte Carlo Markov Chain analysis indicates that the mass of planet b has to be smaller than twice the minimum mass with a 99% confidence level, with a most likely value of 2.5 M Jup . Additional radial velocity (RV) measurements obtained with Keck by the Lick-Carnegie Planet search program confirm the presence of an additional very long period planet candidate, with a period of 20 years or more. Even though such an object will imprint a large astrometric wobble on the star, its curvature is yet not evident in the astrometry. Given high metallicity, and the trend indicating that multiple systems are rich in low-mass companions, this system is likely to host additional low-mass planets in its habitable zone that can be readily detected with state-of-the-art optical and near-infrared RV measurements.

  13. ASTROMETRY AND RADIAL VELOCITIES OF THE PLANET HOST M DWARF GJ 317: NEW TRIGONOMETRIC DISTANCE, METALLICITY, AND UPPER LIMIT TO THE MASS OF GJ 317b

    Energy Technology Data Exchange (ETDEWEB)

    Anglada-Escude, Guillem; Boss, Alan P.; Weinberger, Alycia J.; Butler, R. Paul [Department of Terrestrial Magnetism, Carnegie Institution for Science, 5241 Broad Branch Road NW, Washington, DC 20015 (United States); Thompson, Ian B. [Carnegie Observatories, 813 Santa Barbara Street, Pasadena, CA 91101 (United States); Vogt, Steven S.; Rivera, Eugenio J., E-mail: anglada@dtm.ciw.edu [UCO/Lick Observatory, University of California, Santa Cruz, CA 95064 (United States)

    2012-02-10

    We have obtained precision astrometry of the planet host M dwarf GJ 317 in the framework of the Carnegie Astrometric Planet Search project. The new astrometric measurements give a distance determination of 15.3 pc, 65% further than previous estimates. The resulting absolute magnitudes suggest that it is metal-rich and more massive than previously assumed. This result strengthens the correlation between high metallicity and the presence of gas giants around low-mass stars. At 15.3 pc, the minimal astrometric amplitude for planet candidate GJ 317b is 0.3 mas (edge-on orbit), just below our astrometric sensitivity. However, given the relatively large number of observations and good astrometric precision, a Bayesian Monte Carlo Markov Chain analysis indicates that the mass of planet b has to be smaller than twice the minimum mass with a 99% confidence level, with a most likely value of 2.5 M{sub Jup}. Additional radial velocity (RV) measurements obtained with Keck by the Lick-Carnegie Planet search program confirm the presence of an additional very long period planet candidate, with a period of 20 years or more. Even though such an object will imprint a large astrometric wobble on the star, its curvature is yet not evident in the astrometry. Given high metallicity, and the trend indicating that multiple systems are rich in low-mass companions, this system is likely to host additional low-mass planets in its habitable zone that can be readily detected with state-of-the-art optical and near-infrared RV measurements.

  14. Development of intelligent system for a thermal analysis instrument

    International Nuclear Information System (INIS)

    Xu Xiaoli; Wu Guoxin; Shi Yongchao

    2005-01-01

    The key techniques for the intelligent thermal analysis instrument that was developed are proposed. Based on virtual instrumentation techniques, an intelligent PID control algorithm for controlling the temperature of the thermal analysis instrument is described. The dynamic behaviour and robustness of traditional PID control are improved by introducing a dynamic gain factor, a temperature rate-of-change factor, a forecast factor, and a temperature correction factor. Using the LabVIEW graphical development environment, a modular system design and a graphical display are implemented. By means of multiple mathematical modules, intelligent data processing is realized

  15. Endoscopic vision-based tracking of multiple surgical instruments during robot-assisted surgery.

    Science.gov (United States)

    Ryu, Jiwon; Choi, Jaesoon; Kim, Hee Chan

    2013-01-01

    Robot-assisted minimally invasive surgery is effective for operations in limited space. Enhancing safety based on automatic tracking of surgical instrument position to prevent inadvertent harmful events such as tissue perforation or instrument collisions could be a meaningful augmentation to current robotic surgical systems. A vision-based instrument tracking scheme as a core algorithm to implement such functions was developed in this study. An automatic tracking scheme is proposed as a chain of computer vision techniques, including classification of metallic properties using k-means clustering and instrument movement tracking using similarity measures, Euclidean distance calculations, and a Kalman filter algorithm. The implemented system showed satisfactory performance in tests using actual robot-assisted surgery videos. Trajectory comparisons of automatically detected data and ground truth data obtained by manually locating the center of mass of each instrument were used to quantitatively validate the system. Instruments and collisions could be well tracked through the proposed methods. The developed collision warning system could provide valuable information to clinicians for safer procedures. © 2012, Copyright the Authors. Artificial Organs © 2012, International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
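
    A generic constant-velocity Kalman filter for smoothing a detected instrument-tip position in image coordinates is sketched below; the state model, noise levels and frame rate are assumptions for illustration, not the published system's parameterisation.

```python
import numpy as np

class ConstantVelocityKalman:
    """2-D constant-velocity Kalman filter for smoothing a tracked pixel position."""

    def __init__(self, dt, process_var=1.0, meas_var=4.0):
        self.x = np.zeros(4)                                    # state [px, py, vx, vy]
        self.P = np.eye(4) * 500.0
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * process_var
        self.R = np.eye(2) * meas_var

    def step(self, z):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update with the measured pixel position z = [u, v]
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                                       # filtered position

# usage: smooth a noisy diagonal trajectory sampled at 30 frames per second
kf = ConstantVelocityKalman(dt=1 / 30)
rng = np.random.default_rng(7)
smoothed = [kf.step([100 + 2 * t + rng.normal(0, 2), 200 + 1.5 * t + rng.normal(0, 2)])
            for t in range(60)]
```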

  16. Multi-objective optimization of design and testing of safety instrumented systems with MooN voting architectures using a genetic algorithm

    International Nuclear Information System (INIS)

    Torres-Echeverría, A.C.; Martorell, S.; Thompson, H.A.

    2012-01-01

    This paper presents the optimization of design and test policies of safety instrumented systems using MooN voting redundancies by a multi-objective genetic algorithm. The objectives to optimize are the Average Probability of Dangerous Failure on Demand, which represents the system safety integrity, the Spurious Trip Rate and the Lifecycle Cost. In this way safety, reliability and cost are included. This is done by using novel models of time-dependent probability of failure on demand and spurious trip rate, recently published by the authors. These models are capable of delivering the level of modeling detail required by the standard IEC 61508. Modeling includes common cause failure and diagnostic coverage. The Probability of Failure on Demand model also permits to quantify results with changing testing strategies. The optimization is performed using the multi-objective Genetic Algorithm NSGA-II. This allows weighting of the trade-offs between the three objectives and, thus, implementation of safety systems that keep a good balance between safety, reliability and cost. The complete methodology is applied to two separate case studies, one for optimization of system design with redundancy allocation and component selection and another for optimization of testing policies. Both optimization cases are performed for both systems with MooN redundancies and systems with only parallel redundancies. Their results are compared, demonstrating how introducing MooN architectures presents a significant improvement for the optimization process.
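
    Only the Pareto-dominance idea underlying the multi-objective optimisation is sketched below: given candidate designs scored on the three objectives (all minimised), keep the non-dominated set. NSGA-II itself adds selection, crossover/mutation over design variables and crowding-distance sorting, none of which is reproduced here, and the sample objective values are random placeholders.

```python
import numpy as np

def pareto_front(objectives):
    """Return the indices of non-dominated designs for a minimisation problem.

    objectives : array of shape (n_designs, n_objectives), e.g. columns
                 [PFDavg, spurious trip rate, lifecycle cost]."""
    obj = np.asarray(objectives, dtype=float)
    keep = np.ones(len(obj), dtype=bool)
    for i in range(len(obj)):
        if not keep[i]:
            continue
        # design j dominates i if it is no worse in all objectives and better in one
        dominated_by = np.all(obj <= obj[i], axis=1) & np.any(obj < obj[i], axis=1)
        if dominated_by.any():
            keep[i] = False
    return np.flatnonzero(keep)

# usage: pick Pareto-optimal candidates among random (PFDavg, STR, cost) triples
rng = np.random.default_rng(8)
designs = rng.random((200, 3)) * [1e-3, 0.5, 1e6]
print(pareto_front(designs))
```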

  17. Digibaro pressure instrument onboard the Phoenix Lander

    Science.gov (United States)

    Harri, A.-M.; Polkko, J.; Kahanpää, H. H.; Schmidt, W.; Genzer, M. M.; Haukka, H.; Savijärvi, H.; Kauhanen, J.

    2009-04-01

    The Phoenix Lander landed successfully in the Martian northern polar region. The mission is part of the National Aeronautics and Space Administration's (NASA's) Scout program. Pressure observations onboard the Phoenix lander were performed by an FMI (Finnish Meteorological Institute) instrument, based on a silicon diaphragm sensor head manufactured by Vaisala Inc., combined with MDA data processing electronics. The pressure instrument performed successfully throughout the Phoenix mission. The pressure instrument had 3 pressure sensor heads. One of these was the primary sensor head and the other two were used for monitoring the condition of the primary sensor head during the mission. During the mission the primary sensor was read with a sampling interval of 2 s and the other two were read less frequently as a check of instrument health. The pressure sensor system had a real-time data-processing and calibration algorithm that allowed the removal of temperature-dependent calibration effects. In the same manner as the temperature sensor, a total of 256 data records (8.53 min) were buffered and they could either be stored at full resolution, or processed to provide mean, standard deviation, maximum and minimum values for storage on the Phoenix Lander's Meteorological (MET) unit. The time constant was approximately 3 s due to locational constraints and dust filtering requirements. Using algorithms compensating for the time-constant effect, the temporal resolution was good enough to detect pressure drops associated with the passage of nearby dust devils.

  18. Real-time Astrometry Using Phase Congruency

    Science.gov (United States)

    Lambert, A.; Polo, M.; Tang, Y.

    Phase congruency is a computer vision technique that proves to perform well for determining the tracks of optical objects (Flewelling, AMOS 2014). We report on a real-time implementation of this using an FPGA and CMOS Image Sensor, with on-sky data. The lightweight instrument can provide tracking update signals to the mount of the telescope, as well as determine abnormal objects in the scene.

  19. Nuclear instrumentation for radiation measurement

    International Nuclear Information System (INIS)

    Madan, V.K.

    2012-01-01

    Nuclear radiation cannot be detected by human senses. Nuclear detectors and associated electronics facilitate detection and measurement of different types of radiation like alpha particles, beta particles, gamma radiation, and detection of neutrons. Nuclear instrumentation has evolved greatly since the discovery of radioactivity. There has been tremendous advancement in detector technology, electronics, computer technology, and development of efficient algorithms and methods for spectral processing to extract precisely qualitative and quantitative information of the radiation. Various types of detectors and nuclear instruments are presently available and are used for different applications. This paper describes nuclear radiation, its detection and measurement and associated electronics, spectral information extraction, and advances in these fields. The paper also describes challenges in this field

  20. An Algorithm for Fault-Tree Construction

    DEFF Research Database (Denmark)

    Taylor, J. R.

    1982-01-01

    An algorithm for performing certain parts of the fault tree construction process is described. Its input is a flow sheet of the plant, a piping and instrumentation diagram, or a wiring diagram of the circuits, to be analysed, together with a standard library of component functional and failure...

  1. Improved retrieval of cloud base heights from ceilometer using a non-standard instrument method

    Science.gov (United States)

    Wang, Yang; Zhao, Chuanfeng; Dong, Zipeng; Li, Zhanqing; Hu, Shuzhen; Chen, Tianmeng; Tao, Fa; Wang, Yuzhao

    2018-04-01

    Cloud-base height (CBH) is a basic cloud parameter but has not been measured accurately, especially under polluted conditions due to the interference of aerosol. Taking advantage of a comprehensive field experiment in northern China in which a variety of advanced cloud probing instruments were operated, different methods of detecting CBH are assessed. The Micro-Pulse Lidar (MPL) and the Vaisala ceilometer (CL51) provided two types of backscattered profiles. The latter has been employed widely as a standard means of measuring CBH using the manufacturer's operational algorithm to generate standard CBH products (CL51 MAN) whose quality is rigorously assessed here, in comparison with a research algorithm that we developed named value distribution equalization (VDE) algorithm. It was applied to both the profiles of lidar backscattering data from the two instruments. The VDE algorithm is found to produce more accurate estimates of CBH for both instruments and can cope with heavy aerosol loading conditions well. By contrast, CL51 MAN overestimates CBH by 400 m and misses many low level clouds under such conditions. These findings are important given that CL51 has been adopted operationally by many meteorological stations in China.

  2. A digital instrument for reactivity measurements in a nuclear reactor

    International Nuclear Information System (INIS)

    Chwaszczewski, S.

    1979-01-01

    An instrument for digital determination of the reactivity in nuclear reactors is described. It is based on the CAMAC standard apparatus, suitable for the use of pulse or current type neutron detectors and operates with prompt response and an output signal proportional to the core neutron flux. The measured data of neutron flux and reactivity can be registered by a digital display unit, an indicator, or, by request of the operator, a paper tape punch. The algorithms used for reactivity calculation are considered and the results of numerical studies on those algorithms are discussed. The instrument has been used for determining the reactivity of the control elements in the fast-thermal assembly ANNA and in the research reactor MARIA. Some results of these measurements are given. (author)

  3. The firms Sandvik Coromant and Walter instruments choice algorithmization

    Directory of Open Access Journals (Sweden)

    Євген Іванович Іванов

    2015-11-01

    Full Text Available This article describes typical algorithms for choosing modern tools made by the foreign firms Sandvik Coromant and Walter. The use of modern tools is effective on both new and old equipment. The algorithms ensure orderly work by engineers when developing new processes or upgrading old ones, and may also be useful for students enrolled in the specialty "Mechanical Engineering". A correctly chosen tool makes it possible to quickly recoup the cost of new equipment and to significantly improve the performance of old equipment. Currently, all cutting tools can be divided into the following groups: (a) solid; (b) composite; (c) assembled; (d) modular (set-type). In composite cutting tools the cutting parts and the holders are attached permanently; for example, the attachment can be fixed by welding or soldering. In assembled and modular cutting tools the parts are detachable. Modular tools consist of separate assembly units (modules) with standardized mounting surfaces, so that one and the same cutter head may be attached to holder (mandrel) housings of different configuration and function. The choice of the cutting part of such tools includes determining the shape and size of the indexable insert (SMP), the geometry of its rake face, the corner radius, and the tool material. Selecting the holder (mandrel) body includes determining its type and size. It is also necessary to take into account the capabilities of the machine tool (the type and size of the mounting surfaces of the tool holder and the tool spindle). After selecting the tool it is necessary to determine the working regimes

  4. [System design of small intellectualized ultrasound hyperthermia instrument in the LabVIEW environment].

    Science.gov (United States)

    Jiang, Feng; Bai, Jingfeng; Chen, Yazhu

    2005-08-01

    Small-scale intellectualized medical instruments have attracted great attention in the field of biomedical engineering, and LabVIEW (Laboratory Virtual Instrument Engineering Workbench) provides a convenient environment for this application due to its inherent advantages. The principle and system structure of the hyperthermia instrument are presented. Type T thermocouples are employed as thermotransducers; their amplifier consists of two stages, provides built-in ice-point compensation, and thus improves stability over temperature. Control signals produced by a specially designed circuit drive the programmable counter/timer 8254 chip to generate a PWM (pulse width modulation) wave, which is used as the ultrasound radiation energy control signal. Subroutine design topics such as the in-tissue real-time feedback temperature control algorithm and water temperature control in the ultrasound applicator are also described. In the cancer tissue temperature control subroutine, the authors make new improvements to the PID (Proportional Integral Differential) algorithm according to the specific demands of the system and achieve strict temperature control of the target tissue region. The system design and PID algorithm improvement have experimentally proved to be reliable and excellent, meeting the requirements of the hyperthermia system.

  5. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    Science.gov (United States)

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…

  6. Proposed algorithm for determining the delta intercept of a thermocouple psychrometer curve

    International Nuclear Information System (INIS)

    Kurzmack, M.A.

    1993-01-01

    The USGS Hydrologic Investigations Program is currently developing instrumentation to study the unsaturated zone at Yucca Mountain in Nevada. Surface-based boreholes up to 2,500 feet in depth will be drilled, and then instrumented in order to define the water potential field within the unsaturated zone. Thermocouple psychrometers will be used to monitor the in-situ water potential. An algorithm is proposed for simply and efficiently reducing a six-wire thermocouple psychrometer voltage output curve to a single value, the delta intercept. The algorithm identifies a plateau region in the psychrometer curve and extrapolates a linear regression back to the initial start of relaxation. When properly conditioned for the measurements being made, the algorithm yields reasonable results even with incomplete or noisy psychrometer curves over a 1 to 60 bar range.
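
    A simplified sketch of the proposed reduction: fit a straight line to an assumed plateau portion of the relaxation curve and extrapolate it back to the start of relaxation. The fixed fractional plateau window used below is an assumption made for illustration; the proposed algorithm identifies the plateau itself and must be conditioned to the measurements being made.

```python
import numpy as np

def delta_intercept(time_s, microvolts, t_relax_start, window=(0.4, 0.8)):
    """Estimate the delta intercept of a thermocouple-psychrometer output curve:
    fit a line to an assumed plateau region of the relaxation and extrapolate it
    back to the start of relaxation (simplified sketch with a fixed window)."""
    t = np.asarray(time_s, dtype=float)
    v = np.asarray(microvolts, dtype=float)
    lo = t_relax_start + window[0] * (t[-1] - t_relax_start)
    hi = t_relax_start + window[1] * (t[-1] - t_relax_start)
    sel = (t >= lo) & (t <= hi)                     # assumed plateau region
    slope, intercept = np.polyfit(t[sel], v[sel], 1)
    return slope * t_relax_start + intercept        # value extrapolated to relaxation start

# usage: synthetic curve with an exponential transient decaying onto a sloped plateau
t = np.linspace(0.0, 60.0, 600)
v = 25.0 - 0.05 * (t - 5.0) + 8.0 * np.exp(-(t - 5.0) / 3.0)
v[t < 5.0] = 0.0                                    # before relaxation starts
print(delta_intercept(t, v, t_relax_start=5.0))     # approximately 25 microvolts
```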

  7. The neutron instrument simulation package, NISP

    International Nuclear Information System (INIS)

    Seeger, P.A.; Daemen, L.L.

    2004-01-01

    The Neutron Instrument Simulation Package (NISP) performs complete source-to-detector simulations of neutron instruments, including neutrons that do not follow the expected path. The original user interface (MC_Web) is a web-based application, http://strider.lansce.lanl.gov/NISP/Welcome.html. This report describes in detail the newer standalone Windows version, NISP_Win. Instruments are assembled from menu-selected elements, including neutron sources, collimation and transport elements, samples, analyzers, and detectors. Magnetic field regions may also be specified for the propagation of polarized neutrons including spin precession. Either interface writes a geometry file that is used as input to the Monte Carlo engine (MC_Run) in the user's computer. Both the interface and the engine rely on a subroutine library, MCLIB. The package is completely open source. New features include capillary optics, temperature dependence of Al and Be, revised source files for ISIS, and visualization of neutron trajectories at run time. Also, a single-crystal sample type has been successfully imported from McStas (with more generalized geometry), demonstrating the capability of including algorithms from other sources, and NISP_Win may render the instrument in a virtual reality file. Results are shown for two instruments under development.

  8. The neutron instrument Monte Carlo library MCLIB: Recent developments

    International Nuclear Information System (INIS)

    Seeger, P.A.; Daemen, L.L.; Hjelm, R.P. Jr.; Thelliez, T.G.

    1998-01-01

    A brief review is given of the developments since the ICANS-XIII meeting made in the neutron instrument design codes using the Monte Carlo library MCLIB. Much of the effort has been to assure that the library and the executing code MC RUN connect efficiently with the World Wide Web application MC-WEB as part of the Los Alamos Neutron Instrument Simulation Package (NISP). Since one of the most important features of MCLIB is its open structure and capability to incorporate any possible neutron transport or scattering algorithm, this document describes the current procedure that would be used by an outside user to add a feature to MCLIB. Details of the calling sequence of the core subroutine OPERATE are discussed, and questions of style are considered and additional guidelines given. Suggestions for standardization are solicited, as well as code for new algorithms

  9. VizieR Online Data Catalog: Hubble Source Catalog (V1 and V2) (Whitmore+, 2016)

    Science.gov (United States)

    Whitmore, B. C.; Allam, S. S.; Budavari, T.; Casertano, S.; Downes, R. A.; Donaldson, T.; Fall, S. M.; Lubow, S. H.; Quick, L.; Strolger, L.-G.; Wallace, G.; White, R. L.

    2016-10-01

    The HSC v1 contains members of the WFPC2, ACS/WFC, WFC3/UVIS and WFC3/IR Source Extractor source lists from HLA version DR8 (data release 8). The crossmatching process involves adjusting the relative astrometry of overlapping images so as to minimize positional offsets between closely aligned sources in different images. After correction, the astrometric residuals of crossmatched sources are significantly reduced, to typically less than 10mas. The relative astrometry is supported by using Pan-STARRS, SDSS, and 2MASS as the astrometric backbone for initial corrections. In addition, the catalog includes source nondetections. The crossmatching algorithms and the properties of the initial (Beta 0.1) catalog are described in Budavari & Lubow (2012ApJ...761..188B). The HSC v2 contains members of the WFPC2, ACS/WFC, WFC3/UVIS and WFC3/IR Source Extractor source lists from HLA version DR9.1 (data release 9.1). The crossmatching process involves adjusting the relative astrometry of overlapping images so as to minimize positional offsets between closely aligned sources in different images. After correction, the astrometric residuals of crossmatched sources are significantly reduced, to typically less than 10mas. The relative astrometry is supported by using Pan-STARRS, SDSS, and 2MASS as the astrometric backbone for initial corrections. In addition, the catalog includes source nondetections. The crossmatching algorithms and the properties of the initial (Beta 0.1) catalog are described in Budavari & Lubow (2012ApJ...761..188B). Hubble Source Catalog Acknowledgement: Based on observations made with the NASA/ESA Hubble Space Telescope, and obtained from the Hubble Legacy Archive, which is a collaboration between the Space Telescope Science Institute (STScI/NASA), the Space Telescope European Coordinating Facility (ST-ECF/ESAC/ESA) and the Canadian Astronomy Data Centre (CADC/NRC/CSA). (2 data files).

  10. Musical instruments in the 21st century identities, configurations, practices

    CERN Document Server

    Campo, Alberto; Egermann, Hauke; Hardjowirogo, Sarah-Indriyati; Weinzierl, Stefan

    2017-01-01

    By exploring the many different types and forms of contemporary musical instruments, this book contributes to a better understanding of the conditions of instrumentality in the 21st century. Providing insights from science, humanities and the arts, authors from a wide range of disciplines discuss the following questions: · What are the conditions under which an object is recognized as a musical instrument? · What are the actions and procedures typically associated with musical instruments? · What kind of (mental and physical) knowledge do we access in order to recognize or use something as a musical instrument? · How is this knowledge being shaped by cultural conventions and temporal conditions? · How do algorithmic processes 'change the game' of musical performance, and as a result, how do they affect notions of instrumentality? · How do we address the question of instrumental identity within an instrument's design process? · What properties can be used to differentiate successful and unsuccessful ins...

  11. Deconvolving instrumental and intrinsic broadening in core-shell x-ray spectroscopies

    International Nuclear Information System (INIS)

    Fister, T. T.; Seidler, G. T.; Rehr, J. J.; Kas, J. J.; Nagle, K. P.; Elam, W. T.; Cross, J. O.

    2007-01-01

    Intrinsic and experimental mechanisms frequently lead to broadening of spectral features in core-shell spectroscopies. For example, intrinsic broadening occurs in x-ray absorption spectroscopy (XAS) measurements of heavy elements where the core-hole lifetime is very short. On the other hand, nonresonant x-ray Raman scattering (XRS) and other energy loss measurements are more limited by instrumental resolution. Here, we demonstrate that the Richardson-Lucy (RL) iterative algorithm provides a robust method for deconvolving instrumental and intrinsic resolutions from typical XAS and XRS data. For the K-edge XAS of Ag, we find nearly complete removal of ∼9.3 eV full width at half maximum broadening from the combined effects of the short core-hole lifetime and instrumental resolution. We are also able to remove nearly all instrumental broadening in an XRS measurement of diamond, with the resulting improved spectrum comparing favorably with prior soft x-ray XAS measurements. We present a practical methodology for implementing the RL algorithm in these problems, emphasizing the importance of testing for stability of the deconvolution process against noise amplification, perturbations in the initial spectra, and uncertainties in the core-hole lifetime
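
    A minimal 1-D Richardson-Lucy sketch for deconvolving a known broadening kernel from a measured spectrum. The Gaussian kernel, energy grid and iteration count are illustrative assumptions; in practice the iteration must be stopped before noise amplification degrades the result, as the paper emphasises.

```python
import numpy as np

def richardson_lucy_1d(spectrum, kernel, n_iter=200):
    """Richardson-Lucy deconvolution of a 1-D spectrum by a known broadening kernel
    (instrumental resolution and/or core-hole lifetime); minimal sketch without the
    stability monitoring used in practice."""
    kernel = kernel / kernel.sum()
    kernel_mirror = kernel[::-1]
    estimate = np.full_like(spectrum, spectrum.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = np.convolve(estimate, kernel, mode="same")
        ratio = spectrum / np.maximum(blurred, 1e-12)
        estimate *= np.convolve(ratio, kernel_mirror, mode="same")
    return estimate

# usage: recover a sharp absorption edge broadened by a 9.3 eV FWHM Gaussian
e = np.arange(-50.0, 50.0, 0.5)                       # energy grid (eV)
true = 1.0 / (1.0 + np.exp(-e / 0.5))                 # idealised edge
sigma = 9.3 / 2.3548                                  # FWHM -> Gaussian sigma
kern = np.exp(-0.5 * (np.arange(-20, 20.5, 0.5) / sigma) ** 2)
measured = np.convolve(true, kern / kern.sum(), mode="same")
sharpened = richardson_lucy_1d(measured, kern)
```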

  12. Science Data Management for the E-ELT: usecase MICADO

    NARCIS (Netherlands)

    Verdoes Kleijn, Gijs

    2015-01-01

    The E-ELT First-light instrument MICADO will explore new parameter space in terms of precision astrometry, photometry and spectroscopy. This provides challenges for the data handling and reduction to ensure MICADO takes the observational capabilities of the AO-assisted E-ELT towards its limits. Our

  13. Modern trends of electronic trading by negotiable financial instruments

    Directory of Open Access Journals (Sweden)

    I.Kravchuk

    2018-03-01

    Full Text Available International negotiable financial instrument markets have a high level of electronic trading, reflected in the consolidated limit order book, a widening range of trading order types, smart order routing, and high-speed, low-latency market access. One result of electronic trading is the development of high-frequency trading, whose positive features are increased trading activity and market liquidity and reduced transaction costs. Its main drawback is the potential negative impact on market stability: software failures, the manipulative incentives of algorithmic strategies (in particular technical arbitrage), the use of homogeneous strategies, and cyber risks. The current system of high-frequency trading therefore requires particular actions from regulators to overcome information gaps, together with enhanced monitoring, in order to create a regulatory environment that supports both market development and market stability through appropriate macroprudential instruments. In particular, this includes stress testing of high-frequency algorithms against shocks of various origins, taking into account the aggregate market effect of the mutual influence of different algorithmic strategies.

  14. Comparison and application of wind retrieval algorithms for small unmanned aerial systems

    Science.gov (United States)

    Bonin, T. A.; Chilson, P. B.; Zielke, B. S.; Klein, P. M.; Leeman, J. R.

    2013-07-01

    Recently, there has been an increase in use of Unmanned Aerial Systems (UASs) as platforms for conducting fundamental and applied research in the lower atmosphere due to their relatively low cost and ability to collect samples with high spatial and temporal resolution. Concurrent with this development comes the need for accurate instrumentation and measurement methods suitable for small meteorological UASs. Moreover, the instrumentation to be integrated into such platforms must be small and lightweight. Whereas thermodynamic variables can be easily measured using well-aspirated sensors onboard, it is much more challenging to accurately measure the wind with a UAS. Several algorithms have been developed that incorporate GPS observations as a means of estimating the horizontal wind vector, with each algorithm exhibiting its own particular strengths and weaknesses. In the present study, the performance of three such GPS-based wind-retrieval algorithms has been investigated and compared with wind estimates from rawinsonde and sodar observations. Each of the algorithms considered agreed well with the wind measurements from sounding and sodar data. Through the integration of UAS-retrieved profiles of thermodynamic and kinematic parameters, one can investigate the static and dynamic stability of the atmosphere and relate them to the state of the boundary layer across a variety of times and locations, which might be difficult to access using conventional instrumentation.
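
    One common class of GPS-only wind estimators assumes constant airspeed and constant wind, so that ground-velocity samples collected while the aircraft turns through a full range of headings lie on a circle centred on the wind vector. The sketch below fits that circle by linear least squares; it is a generic illustration, not one of the three specific algorithms compared in the paper.

```python
import numpy as np

def wind_from_groundspeed(vgx, vgy):
    """Estimate horizontal wind (u, v) and airspeed from GPS ground-velocity
    samples.  With constant true airspeed and constant wind, the points
    (vgx, vgy) lie on a circle of radius Va centred on the wind vector, so
    the fit is linear in (u, v, c) with c = u**2 + v**2 - Va**2:
        vgx**2 + vgy**2 = 2*u*vgx + 2*v*vgy - c
    """
    A = np.column_stack([2 * vgx, 2 * vgy, -np.ones_like(vgx)])
    b = vgx ** 2 + vgy ** 2
    (u, v, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    airspeed = np.sqrt(u ** 2 + v ** 2 - c)
    return u, v, airspeed

# Synthetic orbit: 15 m/s airspeed in a 5 m/s wind (u = 5, v = 0)
psi = np.linspace(0, 2 * np.pi, 60, endpoint=False)
vgx = 15 * np.sin(psi) + 5.0
vgy = 15 * np.cos(psi) + 0.0
print(wind_from_groundspeed(vgx, vgy))   # ~ (5.0, 0.0, 15.0)
```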

  15. Development and comparisons of wind retrieval algorithms for small unmanned aerial systems

    Science.gov (United States)

    Bonin, T. A.; Chilson, P. B.; Zielke, B. S.; Klein, P. M.; Leeman, J. R.

    2012-12-01

    Recently, there has been an increase in use of Unmanned Aerial Systems (UASs) as platforms for conducting fundamental and applied research in the lower atmosphere due to their relatively low cost and ability to collect samples with high spatial and temporal resolution. Concurrent with this development comes the need for accurate instrumentation and measurement methods suitable for small meteorological UASs. Moreover, the instrumentation to be integrated into such platforms must be small and lightweight. Whereas thermodynamic variables can be easily measured using well aspirated sensors onboard, it is much more challenging to accurately measure the wind with a UAS. Several algorithms have been developed that incorporate GPS observations as a means of estimating the horizontal wind vector, with each algorithm exhibiting its own particular strengths and weaknesses. In the present study, the performance of three such GPS-based wind-retrieval algorithms has been investigated and compared with wind estimates from rawinsonde and sodar observations. Each of the algorithms considered agreed well with the wind measurements from sounding and sodar data. Through the integration of UAS-retrieved profiles of thermodynamic and kinematic parameters, one can investigate the static and dynamic stability of the atmosphere and relate them to the state of the boundary layer across a variety of times and locations, which might be difficult to access using conventional instrumentation.

  16. The TOMS V9 Algorithm for OMPS Nadir Mapper Total Ozone: An Enhanced Design That Ensures Data Continuity

    Science.gov (United States)

    Haffner, D. P.; McPeters, R. D.; Bhartia, P. K.; Labow, G. J.

    2015-12-01

    The TOMS V9 total ozone algorithm will be applied to the OMPS Nadir Mapper instrument to supersede the existing V8.6 data product in operational processing and re-processing for public release. Because the quality of the V8.6 data is already quite high, the enhancements in V9 mainly concern the additional information provided by the retrieval and simplifications to the algorithm. The design of the V9 algorithm has been influenced by improvements in our knowledge of atmospheric effects, such as those of clouds made possible by studies with OMI, and also by limitations of the V8 algorithms applied to both OMI and OMPS. But the namesake instruments of the TOMS algorithm are substantially more limited in their spectral and noise characteristics, and a requirement of our algorithm is that it also apply to these discrete-band spectrometers, which date back to 1978. To achieve continuity for all these instruments, the TOMS V9 algorithm continues to use radiances in discrete bands, but now uses Rodgers optimal estimation to retrieve a coarse profile and provide uncertainties for each retrieval. The algorithm remains capable of achieving high-accuracy results with a small number of discrete wavelengths, and in extreme cases, such as unusual profile shapes and high solar zenith angles, the quality of the retrievals is improved. Despite the intended design to use limited wavelengths, the algorithm can also utilize additional wavelengths from hyperspectral sensors like OMPS to augment the retrieval's error detection and information content, for example SO2 detection and correction of the Ring effect on atmospheric radiances. We discuss these and other aspects of the V9 algorithm as it will be applied to OMPS, and mention potential improvements which aim to take advantage of a synergy between the OMPS Limb Profiler and Nadir Mapper to further improve the quality of total ozone from the OMPS instrument.
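
    The optimal-estimation step mentioned above can be sketched in its standard linearized form (Rodgers): combine the a priori profile and the measured radiances through the Jacobian K and the error covariances, and read the degrees of freedom of signal from the averaging kernel. All matrices below are random placeholders, not actual OMPS/TOMS quantities.

```python
import numpy as np

def optimal_estimation_step(y, f_xa, K, xa, Sa, Se):
    """One linear optimal-estimation update:
        x_hat = xa + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - F(xa))
    Returns the state estimate, retrieval covariance, averaging kernel,
    and degrees of freedom of signal (trace of the averaging kernel)."""
    Se_inv = np.linalg.inv(Se)
    Sa_inv = np.linalg.inv(Sa)
    S_hat = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv)   # retrieval covariance
    x_hat = xa + S_hat @ K.T @ Se_inv @ (y - f_xa)
    A = S_hat @ K.T @ Se_inv @ K                       # averaging kernel
    return x_hat, S_hat, A, np.trace(A)

# Placeholder dimensions: 4 discrete wavelengths, 3 coarse ozone layers
rng = np.random.default_rng(1)
K = rng.normal(size=(4, 3))
xa = np.array([100.0, 120.0, 80.0])          # illustrative a priori layer ozone (DU)
Sa = np.diag([30.0, 40.0, 25.0]) ** 2
Se = np.eye(4) * 0.01
y = K @ (xa + np.array([5.0, -10.0, 3.0])) + rng.normal(scale=0.1, size=4)
x_hat, S_hat, A, dofs = optimal_estimation_step(y, K @ xa, K, xa, Sa, Se)
```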

  17. PCA-based algorithm for calibration of spectrophotometric analysers of food

    International Nuclear Information System (INIS)

    Morawski, Roman Z; Miekina, Andrzej

    2013-01-01

    Spectrophotometric analysers of food, being instruments for determination of the composition of food products and ingredients, are today of growing importance for the food industry, as well as for food distributors and consumers. Their metrological performance depends significantly on the numerical performance of the available means for spectrophotometric data processing, in particular the means for calibration of the analysers. In this paper, a new algorithm for this purpose is proposed, viz. an algorithm using principal components analysis (PCA). It is almost as efficient as PLS-based calibration algorithms, but much simpler
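
    A minimal sketch of PCA-based calibration under simple assumptions: centre the training spectra, project them onto the leading principal components, and regress the laboratory reference values on the resulting scores. Array shapes, the number of components, and the synthetic demo data are illustrative.

```python
import numpy as np

def fit_pca_calibration(spectra, reference, n_components=5):
    """spectra: (n_samples, n_wavelengths); reference: (n_samples,) lab values."""
    mean_spectrum = spectra.mean(axis=0)
    X = spectra - mean_spectrum
    _, _, Vt = np.linalg.svd(X, full_matrices=False)   # principal components
    components = Vt[:n_components]                     # (n_components, n_wavelengths)
    scores = X @ components.T
    design = np.column_stack([np.ones(len(scores)), scores])
    coef, *_ = np.linalg.lstsq(design, reference, rcond=None)
    return mean_spectrum, components, coef

def predict(spectra, mean_spectrum, components, coef):
    scores = (spectra - mean_spectrum) @ components.T
    return coef[0] + scores @ coef[1:]

# Synthetic demo: spectra proportional to concentration plus noise
rng = np.random.default_rng(4)
signature = rng.normal(size=200)
concentration = rng.uniform(0, 10, 60)
spectra = np.outer(concentration, signature) + rng.normal(scale=0.1, size=(60, 200))
mean_s, comps, coef = fit_pca_calibration(spectra, concentration, n_components=3)
print(np.abs(predict(spectra, mean_s, comps, coef) - concentration).max())
```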

  18. Online Personalization of Hearing Instruments

    Directory of Open Access Journals (Sweden)

    Bert de Vries

    2008-09-01

    Full Text Available Online personalization of hearing instruments refers to learning preferred tuning parameter values from user feedback through a control wheel (or remote control, during normal operation of the hearing aid. We perform hearing aid parameter steering by applying a linear map from acoustic features to tuning parameters. We formulate personalization of the steering parameters as the maximization of an expected utility function. A sparse Bayesian approach is then investigated for its suitability to find efficient feature representations. The feasibility of our approach is demonstrated in an application to online personalization of a noise reduction algorithm. A patient trial indicates that the acoustic features chosen for learning noise control are meaningful, that environmental steering of noise reduction makes sense, and that our personalization algorithm learns proper values for tuning parameters.

  19. Robotic-surgical instrument wrist pose estimation.

    Science.gov (United States)

    Fabel, Stephan; Baek, Kyungim; Berkelman, Peter

    2010-01-01

    The Compact Lightweight Surgery Robot from the University of Hawaii includes two teleoperated instruments and one endoscope manipulator which act in accord to perform assisted interventional medicine. The relative positions and orientations of the robotic instruments and endoscope must be known to the teleoperation system so that the directions of the instrument motions can be controlled to correspond closely to the directions of the motions of the master manipulators, as seen by the endoscope and displayed to the surgeon. If the manipulator bases are mounted in known locations and all manipulator joint variables are known, then the necessary coordinate transformations between the master and slave manipulators can be easily computed. The versatility and ease of use of the system can be increased, however, by allowing the endoscope or instrument manipulator bases to be moved to arbitrary positions and orientations without reinitializing each manipulator or remeasuring their relative positions. The aim of this work is to find the pose of the instrument end effectors using the video image from the endoscope camera. The P3P pose estimation algorithm is used with a Levenberg-Marquardt optimization to ensure convergence. The correct transformations between the master and slave coordinate frames can then be calculated and updated when the bases of the endoscope or instrument manipulators are moved to new, unknown, positions at any time before or during surgical procedures.
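
    The refinement step, minimizing reprojection error with Levenberg-Marquardt, can be sketched with a generic pinhole-camera model as below. This is not the robot's actual code; the camera intrinsics (fx, fy, cx, cy) and the initial pose (for example from a P3P solution) are assumed inputs.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reprojection_residuals(pose, object_pts, image_pts, fx, fy, cx, cy):
    """pose = [rx, ry, rz, tx, ty, tz]; rotation given as a Rodrigues vector.
    object_pts: (N, 3) model points; image_pts: (N, 2) detected pixels."""
    R = Rotation.from_rotvec(pose[:3]).as_matrix()
    cam = object_pts @ R.T + pose[3:]          # points in the camera frame
    u = fx * cam[:, 0] / cam[:, 2] + cx
    v = fy * cam[:, 1] / cam[:, 2] + cy
    return np.concatenate([u - image_pts[:, 0], v - image_pts[:, 1]])

def refine_pose(initial_pose, object_pts, image_pts, fx, fy, cx, cy):
    # Levenberg-Marquardt refinement of the initial (e.g. P3P) pose estimate
    result = least_squares(
        reprojection_residuals, initial_pose, method="lm",
        args=(object_pts, image_pts, fx, fy, cx, cy))
    return result.x
```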

  20. Algorithm for Compressing Time-Series Data

    Science.gov (United States)

    Hawkins, S. Edward, III; Darlington, Edward Hugo

    2012-01-01

    An algorithm based on Chebyshev polynomials effects lossy compression of time-series data or other one-dimensional data streams (e.g., spectral data) that are arranged in blocks for sequential transmission. The algorithm was developed for use in transmitting data from spacecraft scientific instruments to Earth stations. In spite of its lossy nature, the algorithm preserves the information needed for scientific analysis. The algorithm is computationally simple, yet compresses data streams by factors much greater than two. The algorithm is not restricted to spacecraft or scientific uses: it is applicable to time-series data in general. The algorithm can also be applied to general multidimensional data that have been converted to time-series data, a typical example being image data acquired by raster scanning. However, unlike most prior image-data-compression algorithms, this algorithm neither depends on nor exploits the two-dimensional spatial correlations that are generally present in images. In order to understand the essence of this compression algorithm, it is necessary to understand that the net effect of this algorithm and the associated decompression algorithm is to approximate the original stream of data as a sequence of finite series of Chebyshev polynomials. For the purpose of this algorithm, a block of data or interval of time for which a Chebyshev polynomial series is fitted to the original data is denoted a fitting interval. Chebyshev approximation has two properties that make it particularly effective for compressing serial data streams with minimal loss of scientific information: The errors associated with a Chebyshev approximation are nearly uniformly distributed over the fitting interval (this is known in the art as the "equal error property"); and the maximum deviations of the fitted Chebyshev polynomial from the original data have the smallest possible values (this is known in the art as the "min-max property").
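
    The core idea maps directly onto NumPy's Chebyshev utilities: fit a low-order Chebyshev series to each block and transmit only the coefficients. The block length (256 samples) and degree (8) below are illustrative; a real system would choose them per instrument to meet its error budget.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def compress_block(samples, degree=8):
    """Fit a Chebyshev series to one block; the coefficients are the
    compressed representation."""
    x = np.linspace(-1.0, 1.0, len(samples))   # map the block onto [-1, 1]
    return C.chebfit(x, samples, degree)

def decompress_block(coeffs, n_samples):
    x = np.linspace(-1.0, 1.0, n_samples)
    return C.chebval(x, coeffs)

# 256 samples compressed to 9 coefficients (~28x), lossy but near min-max optimal
t = np.linspace(0, 1, 256)
block = np.sin(2 * np.pi * 3 * t) + 0.1 * t
coeffs = compress_block(block, degree=8)
recovered = decompress_block(coeffs, len(block))
print(np.max(np.abs(recovered - block)))       # maximum reconstruction error
```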

  1. Data analysis algorithms for gravitational-wave experiments

    International Nuclear Information System (INIS)

    Bonifazi, P.; Ferrari, V.; Frasca, S.; Pallottino, G.V.; Pizzella, G.

    1978-01-01

    The analysis of the sensitivity of a gravitational-wave antenna system shows that the role of the algorithms used for the analysis of the experimental data is comparable to that of the experimental apparatus. After a discussion of the processing performed on the input signals by the antenna and the electronic instrumentation, we derive a mathematical model of the system. This model is then used as a basis for the discussion of a number of data analysis algorithms, which also include the Wiener-Kolmogoroff optimum filter; the performances of the algorithms are presented in terms of signal-to-noise ratio and sensitivity to short bursts of resonant gravitational waves. The theoretical results are in good agreement with the experimental results obtained with a small cryogenic antenna (24 kg)
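
    For reference, a minimal frequency-domain Wiener (Wiener-Kolmogoroff) filter of the kind used as a benchmark in such analyses weights each Fourier bin by the ratio of expected signal power to total power. The sampling rate, burst template and noise level below are placeholders, not the antenna's actual spectra.

```python
import numpy as np

def wiener_filter(data, signal_psd, noise_psd):
    """Apply a Wiener-Kolmogoroff filter to a uniformly sampled record.
    signal_psd, noise_psd: expected power spectra on the same frequency
    grid as np.fft.rfft(data)."""
    spectrum = np.fft.rfft(data)
    gain = signal_psd / (signal_psd + noise_psd)   # optimal per-bin weight
    return np.fft.irfft(gain * spectrum, n=len(data))

# Toy record: a short resonant burst buried in white noise
fs, n = 5000.0, 4096
t = np.arange(n) / fs
burst = np.exp(-((t - 0.4) / 0.01) ** 2) * np.sin(2 * np.pi * 900 * t)
rng = np.random.default_rng(2)
data = burst + rng.normal(scale=0.5, size=n)
freqs = np.fft.rfftfreq(n, d=1 / fs)
signal_psd = np.exp(-0.5 * ((freqs - 900) / 50) ** 2) + 1e-6   # assumed template
noise_psd = np.full_like(freqs, 0.25)
filtered = wiener_filter(data, signal_psd, noise_psd)
```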

  2. Automatic Recognition Method for Optical Measuring Instruments Based on Machine Vision

    Institute of Scientific and Technical Information of China (English)

    SONG Le; LIN Yuchi; HAO Liguo

    2008-01-01

    Based on a comprehensive study of various algorithms, the automatic recognition of traditional ocular optical measuring instruments is realized. Taking a universal tools microscope (UTM) lens view image as an example, a 2-layer automatic recognition model for data reading is established after adopting a series of pre-processing algorithms. This model is an optimal combination of the correlation-based template matching method and a concurrent back propagation (BP) neural network. Multiple complementary feature extraction is used in generating the eigenvectors of the concurrent network. In order to improve fault-tolerance capacity, rotation invariant features based on Zernike moments are extracted from digit characters and a 4-dimensional group of the outline features is also obtained. Moreover, the operating time and reading accuracy can be adjusted dynamically by setting the threshold value. The experimental result indicates that the newly developed algorithm has optimal recognition precision and working speed. The average reading ratio can achieve 97.23%. The recognition method can automatically obtain the results of optical measuring instruments rapidly and stably without modifying their original structure, which meets the application requirements.

  3. The Mass of the Candidate Exoplanet Companion to HD 33636 from Hubble Space Telescope Astrometry and High-Precision Radial Velocities

    Science.gov (United States)

    Bean, Jacob L.; McArthur, Barbara E.; Benedict, G. Fritz; Harrison, Thomas E.; Bizyaev, Dmitry; Nelan, Edmund; Smith, Verne V.

    2007-08-01

    We have determined a dynamical mass for the companion to HD 33636 that indicates it is a low-mass star instead of an exoplanet. Our result is based on an analysis of Hubble Space Telescope (HST) astrometry and ground-based radial velocity data. We have obtained high-cadence radial velocity measurements spanning 1.3 yr of HD 33636 with the Hobby-Eberly Telescope at McDonald Observatory. We combined these data with previously published velocities to create a data set that spans 9 yr. We used this data set to search for, and place mass limits on, the existence of additional companions in the HD 33636 system. Our high-precision astrometric observations of the system with the HST Fine Guidance Sensor 1r span 1.2 yr. We simultaneously modeled the radial velocity and astrometry data to determine the parallax, proper motion, and perturbation orbit parameters of HD 33636. Our derived parallax, πabs=35.6+/-0.2 mas, agrees within the uncertainties with the Hipparcos value. We find a perturbation period P=2117.3+/-0.8 days, semimajor axis aA=14.2+/-0.2 mas, and system inclination i=4.1deg+/-0.1deg. Assuming the mass of the primary star to be MA=1.02+/-0.03 Msolar, we obtain a companion mass MB=142+/-11 MJup=0.14+/-0.01 Msolar. The much larger true mass of the companion relative to its minimum mass estimated from the spectroscopic orbit parameters (Msini=9.3 MJup) is due to the nearly face-on orbit orientation. This result demonstrates the value of follow-up astrometric observations to determine the true masses of exoplanet candidates detected with the radial velocity method. Based on data obtained with the NASA/ESA Hubble Space Telescope (HST) and the Hobby-Eberly Telescope (HET). The HST observations were obtained at the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555. The HET is a joint project of the University of Texas at Austin, Pennsylvania State University, Stanford
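
    The jump from the spectroscopic minimum mass to the true mass is essentially the sin i correction; a quick back-of-the-envelope check (not the paper's full simultaneous astrometry plus radial-velocity fit, which also re-derives the orbit) reproduces the order of magnitude:

```python
import numpy as np

m_sin_i = 9.3          # minimum mass from the spectroscopic orbit (Jupiter masses)
inclination = 4.1      # orbital inclination from the HST astrometry (degrees)
m_true = m_sin_i / np.sin(np.radians(inclination))
print(f"{m_true:.0f} MJup")   # ~130 MJup, consistent with the 142 +/- 11 MJup full fit
```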

  4. Radio emission from Supernovae and High Precision Astrometry

    Science.gov (United States)

    Perez-Torres, M. A.

    1999-11-01

    The present thesis work makes contributions on two scientific fronts: differential astrometry over the largest angular scales ever attempted (approx. 15 arcdegrees) and numerical simulations of radio emission from very young supernovae. In the first part, we describe the results of the use of very-long-baseline interferometry (VLBI) in one experiment designed to measure with very high precision the angular distance between the radio sources 1150+812 (QSO) and 1803+784 (BL Lac). We observed the radio sources on 19 November 1993 using an intercontinental array of radio telescopes, which simultaneously recorded at 2.3 and 8.4 GHz. VLBI differential astrometry is capable, Nature allowing, of yielding source positions with precisions well below the milliarcsecond level. To achieve this precision, we first had to accurately model the rotation of the interferometric fringes via the most precise models of Earth Orientation Parameters (EOP; precession, polar motion and UT1, nutation). With this model, we successfully connected our phase delay data at both frequencies and, using difference astrometric techniques, determined the coordinates of 1803+784 relative to those of 1150+812--within the IERS reference frame--with a standard error of about 0.6 mas in each coordinate. We then corrected for several effects including the propagation medium (mainly the atmosphere and ionosphere), and opacity and source-structure effects within the radio sources. We stress that our dual-frequency measurements allowed us to accurately subtract the ionosphere contribution from our data. We also used GPS-based TEC measurements to independently find the ionosphere contribution, and showed that these contributions agree with our dual-frequency measurements within about 2 standard deviations in the less favorable cases (the longest baselines), but are usually well within one standard deviation. Our estimates of the relative positions, whether using dual-frequency-based or GPS-based ionosphere

  5. UAVSAR Program: Initial Results from New Instrument Capabilities

    Science.gov (United States)

    Lou, Yunling; Hensley, Scott; Moghaddam, Mahta; Moller, Delwyn; Chapin, Elaine; Chau, Alexandra; Clark, Duane; Hawkins, Brian; Jones, Cathleen; Marks, Phillip

    2013-01-01

    UAVSAR is an imaging radar instrument suite that serves as NASA's airborne facility instrument to acquire scientific data for Principal Investigators as well as a radar test-bed for new radar observation techniques and radar technology demonstration. Since commencing operational science observations in January 2009, the compact, reconfigurable, pod-based radar has been acquiring L-band fully polarimetric SAR (POLSAR) data with repeat-pass interferometric (RPI) observations underneath NASA Dryden's Gulfstream-III jet to provide measurements for science investigations in solid earth and cryospheric studies, vegetation mapping and land use classification, archaeological research, soil moisture mapping, geology and cold land processes. In the past year, we have made significant upgrades to add new instrument capabilities and new platform options to accommodate the increasing demand for UAVSAR to support scientific campaigns to measure subsurface soil moisture, acquire data in the polar regions, and for algorithm development, verification, and cross-calibration with other airborne/spaceborne instruments.

  6. Trace Gas Retrievals from the GeoTASO Aircraft Instrument

    Science.gov (United States)

    Nowlan, C. R.; Liu, X.; Leitch, J. W.; Liu, C.; Gonzalez Abad, G.; Chance, K.; Cole, J.; Delker, T.; Good, W. S.; Murcray, F.; Ruppert, L.; Soo, D.; Loughner, C.; Follette-Cook, M. B.; Janz, S. J.; Kowalewski, M. G.; Pickering, K. E.; Zoogman, P.; Al-Saadi, J. A.

    2015-12-01

    The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) instrument is a passive remote sensing instrument capable of making 2-D measurements of trace gases and aerosols from aircraft. The instrument measures backscattered UV and visible radiation, allowing the retrieval of trace gas amounts below the aircraft at horizontal resolutions on the order of 250 m x 250 m. GeoTASO was originally developed under NASA's Instrument Incubator Program as a test-bed instrument for the Geostationary Coastal and Air Pollution Events (GEO-CAPE) decadal survey mission, and is now also part of risk reduction for the upcoming Tropospheric Emissions: Monitoring of Pollution (TEMPO) and Geostationary Environment Monitoring Spectrometer (GEMS) geostationary satellite missions. We present spatially resolved observations of ozone, nitrogen dioxide, formaldehyde and sulfur dioxide over urban areas and power plants from flights during the DISCOVER-AQ field campaigns in Texas and Colorado, as well as comparisons with observations made by ground-based Pandora spectrometers, in situ monitoring instruments and other aircraft instruments deployed during these campaigns. These measurements at various times of day are providing a very useful data set for testing and improving TEMPO and GEMS retrieval algorithms, as well as demonstrating prototype validation strategies.

  7. Highlights of TOMS Version 9 Total Ozone Algorithm

    Science.gov (United States)

    Bhartia, Pawan; Haffner, David

    2012-01-01

    The fundamental basis of the TOMS total ozone algorithm was developed some 45 years ago by Dave and Mateer. It was designed to estimate total ozone from satellite measurements of the backscattered UV radiances at a few discrete wavelengths in the Huggins ozone absorption band (310-340 nm). Over the years, as the need for higher accuracy in measuring total ozone from space has increased, several improvements to the basic algorithms have been made. They include: better correction for the effects of aerosols and clouds, an improved method to account for the variation in shape of ozone profiles with season, latitude, and total ozone, and a multi-wavelength correction for remaining profile shape errors. These improvements have made it possible to retrieve total ozone with just 3 spectral channels of moderate spectral resolution (approx. 1 nm) with accuracy comparable to state-of-the-art spectral fitting algorithms like DOAS that require high spectral resolution measurements at a large number of wavelengths. One of the deficiencies of the TOMS algorithm has been that it doesn't provide an error estimate. This is a particular problem at high latitudes, where the profile shape errors become significant and vary with latitude, season, total ozone, and instrument viewing geometry. The primary objective of the TOMS V9 algorithm is to account for these effects in estimating the error bars. This is done by a straightforward implementation of the Rodgers optimum estimation method using a priori ozone profiles and their error covariance matrices constructed using Aura MLS and ozonesonde data. The algorithm produces a vertical ozone profile that contains 1-2.5 pieces of information (degrees of freedom of signal) depending upon solar zenith angle (SZA). The profile is integrated to obtain the total column. We provide information that shows the altitude range in which the profile is best determined by the measurements. One can use this information in data assimilation and analysis. A side

  8. Trace Gas Measurements from the GeoTASO and GCAS Airborne Instruments: An Instrument and Algorithm Test-Bed for Air Quality Observations from Geostationary Orbit

    Science.gov (United States)

    Nowlan, C. R.; Liu, X.; Janz, S. J.; Leitch, J. W.; Al-Saadi, J. A.; Chance, K.; Cole, J.; Delker, T.; Follette-Cook, M. B.; Gonzalez Abad, G.; Good, W. S.; Kowalewski, M. G.; Loughner, C.; Pickering, K. E.; Ruppert, L.; Soo, D.; Szykman, J.; Valin, L.; Zoogman, P.

    2016-12-01

    The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) and the GEO-CAPE Airborne Simulator (GCAS) instruments are pushbroom sensors capable of making remote sensing measurements of air quality and ocean color. Originally developed as test-bed instruments for the Geostationary Coastal and Air Pollution Events (GEO-CAPE) decadal survey, these instruments are now also part of risk reduction for the upcoming Tropospheric Emissions: Monitoring of Pollution (TEMPO) and Geostationary Environment Monitoring Spectrometer (GEMS) geostationary satellite missions, and will provide validation capabilities after the satellite instruments are in orbit. GeoTASO and GCAS flew on two different aircraft in their first intensive air quality field campaigns during the DISCOVER-AQ missions over Texas in 2013 and Colorado in 2014. GeoTASO was also deployed in 2016 during the KORUS-AQ field campaign to make measurements of trace gases and aerosols over Korea. GeoTASO and GCAS collect spectra of backscattered solar radiation in the UV and visible that can be used to derive 2-D maps of trace gas columns below the aircraft at spatial resolutions on the order of 250 x 500 m. We present spatially resolved maps of trace gas retrievals of ozone, nitrogen dioxide, formaldehyde and sulfur dioxide over urban areas and power plants from flights during the field campaigns, and comparisons with data from ground-based spectrometers, in situ monitoring instruments, and satellites.

  9. [Design and implementation of real-time continuous glucose monitoring instrument].

    Science.gov (United States)

    Huang, Yonghong; Liu, Hongying; Tian, Senfu; Jia, Ziru; Wang, Zi; Pi, Xitian

    2017-12-01

    Real-time continuous glucose monitoring can help diabetics to control blood sugar levels within the normal range. However, in practical monitoring, the output of a real-time continuous glucose monitoring system is susceptible to glucose sensor and environmental noise, which influences the measurement accuracy of the system. To address this problem, a dual-calibration approach combining a moving-window double-layer filtering algorithm with a real-time self-compensation calibration algorithm is proposed in this paper, which realizes signal drift compensation for the current data. A real-time continuous glucose monitoring instrument based on this study was designed. It consists of an adjustable excitation voltage module, a current-voltage converter module, a microprocessor and a wireless transceiver module. For portability, the size of the device is only 40 mm × 30 mm × 5 mm and its weight is only 30 g. In addition, a communication command code algorithm was designed to ensure the security and integrity of data transmission. Results of in vitro experiments showed that the current detection of the device worked effectively. A 5-hour monitoring of blood glucose level in vivo showed that the device could continuously monitor blood glucose in real time. The relative error of the monitoring results of the designed device ranged from 2.22% to 7.17% when compared to a portable blood glucose meter.

  10. CARMENES instrument control system and operational scheduler

    Science.gov (United States)

    Garcia-Piquer, Alvaro; Guàrdia, Josep; Colomé, Josep; Ribas, Ignasi; Gesa, Lluis; Morales, Juan Carlos; Pérez-Calpena, Ana; Seifert, Walter; Quirrenbach, Andreas; Amado, Pedro J.; Caballero, José A.; Reiners, Ansgar

    2014-07-01

    visibility, sky background, required time sampling coverage) and the dynamic change of the system conditions (i.e., weather, system conditions). Off-line and on-line strategies are integrated into a single tool for a suitable transfer of the target prioritization made by the science team to the real-time schedule that will be used by the instrument operators. A suitable solution will be expected to increase the efficiency of telescope operations, which will represent an important benefit in terms of scientific return and operational costs. We present the operational scheduling tool designed for CARMENES, which is based on two algorithms combining a global and a local search: Genetic Algorithms and Hill Climbing astronomy-based heuristics, respectively. The algorithm explores a large amount of potential solutions from the vast search space and is able to identify the most efficient ones. A planning solution is considered efficient when it optimizes the objectives defined, which, in our case, are related to the reduction of the time that the telescope is not in use and the maximization of the scientific return, measured in terms of the time coverage of each target in the survey. We present the results obtained using different test cases.
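
    As a toy illustration of the hybrid strategy described (global genetic-style population search plus hill-climbing local refinement), the sketch below orders a set of targets to minimize a stand-in slewing cost. It is deliberately simplified: mutation only, no crossover, and a one-dimensional cost instead of the real objectives (idle time, time-sampling coverage, weather).

```python
import random

random.seed(0)
coords = [random.uniform(0, 360) for _ in range(20)]   # placeholder target coordinates

def cost(order):
    """Stand-in objective: total coordinate change between consecutive targets."""
    return sum(abs(coords[a] - coords[b]) for a, b in zip(order, order[1:]))

def hill_climb(order, n_steps=200):
    """Local search: accept only swaps that reduce the cost."""
    current = list(order)
    for _ in range(n_steps):
        i, j = random.sample(range(len(current)), 2)
        candidate = list(current)
        candidate[i], candidate[j] = candidate[j], candidate[i]
        if cost(candidate) < cost(current):
            current = candidate
    return current

def genetic_schedule(pop_size=30, generations=50):
    """Population search: keep the best half, mutate it, locally refine children."""
    pop = [random.sample(range(len(coords)), len(coords)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            child = list(parent)
            i, j = random.sample(range(len(child)), 2)
            child[i], child[j] = child[j], child[i]          # mutation
            children.append(hill_climb(child, n_steps=20))   # local refinement
        pop = survivors + children
    return min(pop, key=cost)

print(cost(genetic_schedule()))
```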

  11. Multiobjective optimization of strategies for operation and testing of low-demand safety instrumented systems using a genetic algorithm and fault trees

    International Nuclear Information System (INIS)

    Longhi, Antonio Eduardo Bier; Pessoa, Artur Alves; Garcia, Pauli Adriano de Almada

    2015-01-01

    Since low-demand safety instrumented systems (SISs) do not operate continuously, their failures are often only detected when the system is demanded or tested. The conduction of tests, besides adding costs, can raise risks of failure on demand during their execution and also increase the frequency of spurious activation. Additionally, it is often necessary to interrupt production to carry out tests. In light of this scenario, this paper presents a model to optimize strategies for operation and testing of these systems, applying modeling by fault trees associated with optimization by a genetic algorithm. Its main differences are: (i) ability to represent four modes of operation and test them for each SIS subsystem; (ii) ability to represent a SIS that executes more than one safety instrumented function; (iii) ability to keep track of the down-time generated in the production system; and (iv) alteration of a genetic selection mechanism that permits identification of more efficient solutions with smaller influence on the optimization parameters. These aspects are presented by applying this model in three case studies. The results obtained show the applicability of the proposed approach and its potential to help make more informed decisions. - Highlights: • Models the integrity and cost related to operation and testing of low-demand SISs. • Keeps track of the production down-time generated by SIS tests and repairs. • Allows multiobjective optimization to identify operation and testing strategies. • Enables integrated assessment of an SIS that executes more than one SIF. • Allows altering the selection mechanism to identify the most efficient strategies

  12. Deconvolution of 2D coincident Doppler broadening spectroscopy using the Richardson-Lucy algorithm

    International Nuclear Information System (INIS)

    Zhang, J.D.; Zhou, T.J.; Cheung, C.K.; Beling, C.D.; Fung, S.; Ng, M.K.

    2006-01-01

    Coincident Doppler Broadening Spectroscopy (CDBS) measurements are popular in positron solid-state studies of materials. By utilizing the instrumental resolution function obtained from a gamma line close in energy to the 511 keV annihilation line, it is possible to significantly enhance the quality of the CDBS spectra using deconvolution algorithms. In this paper, we compare two algorithms, namely the Non-Negativity Least Squares (NNLS) regularized method and the Richardson-Lucy (RL) algorithm. The latter, which is based on the method of maximum likelihood, is found to give superior results to the regularized least-squares algorithm and with significantly less computer processing time

  13. Total ozone column derived from GOME and SCIAMACHY using KNMI retrieval algorithms: Validation against Brewer measurements at the Iberian Peninsula

    Science.gov (United States)

    Antón, M.; Kroon, M.; López, M.; Vilaplana, J. M.; Bañón, M.; van der A, R.; Veefkind, J. P.; Stammes, P.; Alados-Arboledas, L.

    2011-11-01

    This article focuses on the validation of the total ozone column (TOC) data set acquired by the Global Ozone Monitoring Experiment (GOME) and the Scanning Imaging Absorption Spectrometer for Atmospheric Chartography (SCIAMACHY) satellite remote sensing instruments using the Total Ozone Retrieval Scheme for the GOME Instrument Based on the Ozone Monitoring Instrument (TOGOMI) and Total Ozone Retrieval Scheme for the SCIAMACHY Instrument Based on the Ozone Monitoring Instrument (TOSOMI) retrieval algorithms developed by the Royal Netherlands Meteorological Institute. In this analysis, spatially colocated, daily averaged ground-based observations performed by five well-calibrated Brewer spectrophotometers at the Iberian Peninsula are used. The period of study runs from January 2004 to December 2009. The agreement between satellite and ground-based TOC data is excellent (R2 higher than 0.94). Nevertheless, the TOC data derived from both satellite instruments underestimate the ground-based data. On average, this underestimation is 1.1% for GOME and 1.3% for SCIAMACHY. The SCIAMACHY-Brewer TOC differences show a significant solar zenith angle (SZA) dependence which causes a systematic seasonal dependence. By contrast, GOME-Brewer TOC differences show no significant SZA dependence and hence no seasonality although processed with exactly the same algorithm. The satellite-Brewer TOC differences for the two satellite instruments show a clear and similar dependence on the viewing zenith angle under cloudy conditions. In addition, both the GOME-Brewer and SCIAMACHY-Brewer TOC differences reveal a very similar behavior with respect to the satellite cloud properties, being cloud fraction and cloud top pressure, which originate from the same cloud algorithm (Fast Retrieval Scheme for Clouds from the Oxygen A-Band (FRESCO+)) in both the TOSOMI and TOGOMI retrieval algorithms.

  14. Theoretical algorithms for satellite-derived sea surface temperatures

    Science.gov (United States)

    Barton, I. J.; Zavody, A. M.; O'Brien, D. M.; Cutten, D. R.; Saunders, R. W.; Llewellyn-Jones, D. T.

    1989-03-01

    Reliable climate forecasting using numerical models of the ocean-atmosphere system requires accurate data sets of sea surface temperature (SST) and surface wind stress. Global sets of these data will be supplied by the instruments to fly on the ERS 1 satellite in 1990. One of these instruments, the Along-Track Scanning Radiometer (ATSR), has been specifically designed to provide SST in cloud-free areas with an accuracy of 0.3 K. The expected capabilities of the ATSR can be assessed using transmission models of infrared radiative transfer through the atmosphere. The performances of several different models are compared by estimating the infrared brightness temperatures measured by the NOAA 9 AVHRR for three standard atmospheres. Of these, a computationally quick spectral band model is used to derive typical AVHRR and ATSR SST algorithms in the form of linear equations. These algorithms show that a low-noise 3.7-μm channel is required to give the best satellite-derived SST and that the design accuracy of the ATSR is likely to be achievable. The inclusion of extra water vapor information in the analysis did not improve the accuracy of multiwavelength SST algorithms, but some improvement was noted with the multiangle technique. Further modeling is required with atmospheric data that include both aerosol variations and abnormal vertical profiles of water vapor and temperature.
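
    Algorithms of the linear form described can be written down directly; the sketch below uses the classic split-window structure with purely illustrative coefficients (the paper derives its own coefficients from radiative-transfer simulations for the AVHRR and ATSR channel sets).

```python
def split_window_sst(t11, t12, a0=0.5, a1=1.0, a2=2.5):
    """Linear split-window SST estimate (kelvin) from 11 and 12 micron
    brightness temperatures.  Coefficients here are placeholders; operational
    values come from regression against radiative-transfer simulations."""
    return a0 + a1 * t11 + a2 * (t11 - t12)

# Example: the 12 um channel is attenuated more by water vapour than the 11 um channel
print(split_window_sst(t11=293.2, t12=291.8))   # ~297.2 K for these inputs
```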

  15. The operational cloud retrieval algorithms from TROPOMI on board Sentinel-5 Precursor

    Science.gov (United States)

    Loyola, Diego G.; Gimeno García, Sebastián; Lutz, Ronny; Argyrouli, Athina; Romahn, Fabian; Spurr, Robert J. D.; Pedergnana, Mattia; Doicu, Adrian; Molina García, Víctor; Schüssler, Olena

    2018-01-01

    This paper presents the operational cloud retrieval algorithms for the TROPOspheric Monitoring Instrument (TROPOMI) on board the European Space Agency Sentinel-5 Precursor (S5P) mission scheduled for launch in 2017. Two algorithms working in tandem are used for retrieving cloud properties: OCRA (Optical Cloud Recognition Algorithm) and ROCINN (Retrieval of Cloud Information using Neural Networks). OCRA retrieves the cloud fraction using TROPOMI measurements in the ultraviolet (UV) and visible (VIS) spectral regions, and ROCINN retrieves the cloud top height (pressure) and optical thickness (albedo) using TROPOMI measurements in and around the oxygen A-band in the near infrared (NIR). Cloud parameters from TROPOMI/S5P will be used not only for enhancing the accuracy of trace gas retrievals but also for extending the satellite data record of cloud information derived from oxygen A-band measurements, a record initiated with the Global Ozone Monitoring Experiment (GOME) on board the second European Remote-Sensing Satellite (ERS-2) over 20 years ago. The OCRA and ROCINN algorithms are integrated in the S5P operational processor UPAS (Universal Processor for UV/VIS/NIR Atmospheric Spectrometers), and we present here UPAS cloud results using the Ozone Monitoring Instrument (OMI) and GOME-2 measurements. In addition, we examine anticipated challenges for the TROPOMI/S5P cloud retrieval algorithms, and we discuss the future validation needs for OCRA and ROCINN.

  16. The Gaia mission

    Czech Academy of Sciences Publication Activity Database

    Prusti, T.; de Bruijne, J.H.J.; Brown, A.G.A.; Vallenari, A.; Babusiaux, C.; Bailer-Jones, C.A.L.; Bastian, U.; Biermann, M.; Evans, D.; Eyer, L.; Fuchs, Jan; Koubský, Pavel; Votruba, Viktor

    2016-01-01

    Vol. 595, November (2016), A1/1-A1/36 E-ISSN 1432-0746 R&D Projects: GA MŠk(CZ) LG15010 Institutional support: RVO:67985815 Keywords: space vehicles: instruments * galaxy * astrometry Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics OBOR OECD: Astronomy (including astrophysics, space science) Impact factor: 5.014, year: 2016

  17. The Algorithm Theoretical Basis Document for Level 1A Processing

    Science.gov (United States)

    Jester, Peggy L.; Hancock, David W., III

    2012-01-01

    The first process of the Geoscience Laser Altimeter System (GLAS) Science Algorithm Software converts the Level 0 data into the Level 1A Data Products. The Level 1A Data Products are the time ordered instrument data converted from counts to engineering units. This document defines the equations that convert the raw instrument data into engineering units. Required scale factors, bias values, and coefficients are defined in this document. Additionally, required quality assurance and browse products are defined in this document.
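
    Level 1A conversions of this kind are typically per-channel calibration polynomials in the raw counts; a generic sketch is shown below (the actual GLAS scale factors, bias values and coefficients are those defined in the document, not the hypothetical ones here).

```python
def counts_to_engineering_units(counts, coefficients):
    """Evaluate a calibration polynomial c0 + c1*counts + c2*counts**2 + ...
    The coefficient list is instrument- and channel-specific."""
    value = 0.0
    for power, coeff in enumerate(coefficients):
        value += coeff * counts ** power
    return value

# Hypothetical temperature channel: offset -50.0, gain 0.01 units per count
print(counts_to_engineering_units(12034, [-50.0, 0.01]))   # -> 70.34
```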

  18. First light for GRAVITY: Phase referencing optical interferometry for the Very Large Telescope Interferometer

    Science.gov (United States)

    Gravity Collaboration; Abuter, R.; Accardo, M.; Amorim, A.; Anugu, N.; Ávila, G.; Azouaoui, N.; Benisty, M.; Berger, J. P.; Blind, N.; Bonnet, H.; Bourget, P.; Brandner, W.; Brast, R.; Buron, A.; Burtscher, L.; Cassaing, F.; Chapron, F.; Choquet, É.; Clénet, Y.; Collin, C.; Coudé Du Foresto, V.; de Wit, W.; de Zeeuw, P. T.; Deen, C.; Delplancke-Ströbele, F.; Dembet, R.; Derie, F.; Dexter, J.; Duvert, G.; Ebert, M.; Eckart, A.; Eisenhauer, F.; Esselborn, M.; Fédou, P.; Finger, G.; Garcia, P.; Garcia Dabo, C. E.; Garcia Lopez, R.; Gendron, E.; Genzel, R.; Gillessen, S.; Gonte, F.; Gordo, P.; Grould, M.; Grözinger, U.; Guieu, S.; Haguenauer, P.; Hans, O.; Haubois, X.; Haug, M.; Haussmann, F.; Henning, Th.; Hippler, S.; Horrobin, M.; Huber, A.; Hubert, Z.; Hubin, N.; Hummel, C. A.; Jakob, G.; Janssen, A.; Jochum, L.; Jocou, L.; Kaufer, A.; Kellner, S.; Kendrew, S.; Kern, L.; Kervella, P.; Kiekebusch, M.; Klein, R.; Kok, Y.; Kolb, J.; Kulas, M.; Lacour, S.; Lapeyrère, V.; Lazareff, B.; Le Bouquin, J.-B.; Lèna, P.; Lenzen, R.; Lévêque, S.; Lippa, M.; Magnard, Y.; Mehrgan, L.; Mellein, M.; Mérand, A.; Moreno-Ventas, J.; Moulin, T.; Müller, E.; Müller, F.; Neumann, U.; Oberti, S.; Ott, T.; Pallanca, L.; Panduro, J.; Pasquini, L.; Paumard, T.; Percheron, I.; Perraut, K.; Perrin, G.; Pflüger, A.; Pfuhl, O.; Phan Duc, T.; Plewa, P. M.; Popovic, D.; Rabien, S.; Ramírez, A.; Ramos, J.; Rau, C.; Riquelme, M.; Rohloff, R.-R.; Rousset, G.; Sanchez-Bermudez, J.; Scheithauer, S.; Schöller, M.; Schuhler, N.; Spyromilio, J.; Straubmeier, C.; Sturm, E.; Suarez, M.; Tristram, K. R. W.; Ventura, N.; Vincent, F.; Waisberg, I.; Wank, I.; Weber, J.; Wieprecht, E.; Wiest, M.; Wiezorrek, E.; Wittkowski, M.; Woillez, J.; Wolff, B.; Yazici, S.; Ziegler, D.; Zins, G.

    2017-06-01

    GRAVITY is a new instrument to coherently combine the light of the European Southern Observatory Very Large Telescope Interferometer to form a telescope with an equivalent 130 m diameter angular resolution and a collecting area of 200 m2. The instrument comprises fiber fed integrated optics beam combination, high resolution spectroscopy, built-in beam analysis and control, near-infrared wavefront sensing, phase-tracking, dual-beam operation, and laser metrology. GRAVITY opens up to optical/infrared interferometry the techniques of phase referenced imaging and narrow angle astrometry, in many aspects following the concepts of radio interferometry. This article gives an overview of GRAVITY and reports on the performance and the first astronomical observations during commissioning in 2015/16. We demonstrate phase-tracking on stars as faint as mK ≈ 10 mag, phase-referenced interferometry of objects fainter than mK ≈ 15 mag with a limiting magnitude of mK ≈ 17 mag, minute long coherent integrations, a visibility accuracy of better than 0.25%, and spectro-differential phase and closure phase accuracy better than 0.5°, corresponding to a differential astrometric precision of better than ten microarcseconds (μas). The dual-beam astrometry, measuring the phase difference of two objects with laser metrology, is still under commissioning. First observations show residuals as low as 50 μas when following objects over several months. We illustrate the instrument performance with the observations of archetypical objects for the different instrument modes. Examples include the Galactic center supermassive black hole and its fast orbiting star S2 for phase referenced dual-beam observations and infrared wavefront sensing, the high mass X-ray binary BP Cru and the active galactic nucleus of PDS 456 for a few μas spectro-differential astrometry, the T Tauri star S CrA for a spectro-differential visibility analysis, ξ Tel and 24 Cap for high accuracy visibility observations

  19. A fast readout algorithm for Cluster Counting/Timing drift chambers on a FPGA board

    Energy Technology Data Exchange (ETDEWEB)

    Cappelli, L. [Università di Cassino e del Lazio Meridionale (Italy); Creti, P.; Grancagnolo, F. [Istituto Nazionale di Fisica Nucleare, Lecce (Italy); Pepino, A., E-mail: Aurora.Pepino@le.infn.it [Istituto Nazionale di Fisica Nucleare, Lecce (Italy); Tassielli, G. [Istituto Nazionale di Fisica Nucleare, Lecce (Italy); Fermilab, Batavia, IL (United States); Università Marconi, Roma (Italy)

    2013-08-01

    A fast readout algorithm for Cluster Counting and Timing purposes has been implemented and tested on a Virtex 6 core FPGA board. The algorithm analyses and stores data coming from a Helium based drift tube instrumented by 1 GSPS fADC and represents the outcome of balancing between cluster identification efficiency and high speed performance. The algorithm can be implemented in electronics boards serving multiple fADC channels as an online preprocessing stage for drift chamber signals.

  20. Transient response of level instruments in a research reactor

    International Nuclear Information System (INIS)

    Cheng, Lap Y.

    1989-01-01

    A numerical model has been developed to simulate the dynamics of water level instruments in a research nuclear reactor. A bubble device, with helium gas as the working fluid, is used to monitor liquid level by sensing the static head pressure due to the height of liquid in the reactor vessel. A finite-difference model is constructed to study the transient response of the water level instruments to pressure perturbations. The field equations which describe the hydraulics of the helium gas in the bubbler device are arranged in the form of a tridiagonal matrix and the field variables are solved at each time step by the Thomas algorithm. Simulation results indicate that the dynamic response of the helium gas depends mainly on the volume and the inertia of the gas in the level instrument tubings. The anomalies in the simulated level indication are attributed to the inherent lag in the level instrument due to the hydraulics of the system. 1 ref., 5 figs
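
    The Thomas algorithm referred to is the standard O(n) forward-elimination / back-substitution solver for tridiagonal systems; a generic sketch (not the reactor model itself) is:

```python
import numpy as np

def thomas_solve(lower, diag, upper, rhs):
    """Solve a tridiagonal system A x = rhs.
    lower: sub-diagonal (lower[0] unused); diag: main diagonal;
    upper: super-diagonal (upper[-1] unused); all of length n."""
    n = len(diag)
    b = np.array(diag, dtype=float)
    c = np.array(upper, dtype=float)
    d = np.array(rhs, dtype=float)
    # Forward elimination
    for i in range(1, n):
        m = lower[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    # Back substitution
    x = np.empty(n)
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x

# Small check against a dense solver
A = np.array([[2.0, 1, 0], [1, 3, 1], [0, 1, 2]])
rhs = np.array([1.0, 2, 3])
print(thomas_solve([0, 1, 1], [2, 3, 2], [1, 1, 0], rhs), np.linalg.solve(A, rhs))
```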

  1. International VLBI Service for Geodesy and Astrometry. Delivering high-quality products and embarking on observations of the next generation

    Science.gov (United States)

    Nothnagel, A.; Artz, T.; Behrend, D.; Malkin, Z.

    2017-07-01

    The International VLBI Service for Geodesy and Astrometry (IVS) regularly produces high-quality Earth orientation parameters from observing sessions employing extensive networks or individual baselines. The master schedule is designed according to the telescope days committed by the stations and by the need for dense sampling of the Earth orientation parameters (EOP). In the pre-2011 era, the network constellations with their number of telescopes participating were limited by the playback and baseline capabilities of the hardware (Mark4) correlators. This limitation was overcome by the advent of software correlators, which can now accommodate many more playback units in a flexible configuration. In this paper, we describe the current operations of the IVS with special emphasis on the quality of the polar motion results since these are the only EOP components which can be validated against independent benchmarks. The polar motion results provided by the IVS have improved continuously over the years, now providing an agreement with IGS results at the level of 20-25 μas in a WRMS sense. At the end of the paper, an outlook is given for the realization of the VLBI Global Observing System.

  2. Towards automatic musical instrument timbre recognition

    Science.gov (United States)

    Park, Tae Hong

    This dissertation comprises two parts: one focuses on issues concerning research and development of an artificial system for automatic musical instrument timbre recognition, the other on musical compositions. The technical part of the essay includes a detailed record of developed and implemented algorithms for feature extraction and pattern recognition. A review of existing literature introducing historical aspects surrounding timbre research, problems associated with a number of timbre definitions, and highlights of selected research activities that have had significant impact in this field is also included. The developed timbre recognition system follows a bottom-up, data-driven model that includes a pre-processing module, feature extraction module, and a RBF/EBF (Radial/Elliptical Basis Function) neural network-based pattern recognition module. 829 monophonic samples from 12 instruments have been chosen from the Peter Siedlaczek library (Best Service) and other samples from the Internet and personal collections. Significant emphasis has been put on feature extraction development and testing to achieve robust and consistent feature vectors that are eventually passed to the neural network module. In order to avoid a garbage-in-garbage-out (GIGO) trap and improve generality, extra care was taken in designing and testing the developed algorithms using various dynamics, different playing techniques, and a variety of pitches for each instrument, with inclusion of attack and steady-state portions of a signal. Most of the research and development was conducted in Matlab. The compositional part of the essay includes brief introductions to "A d'Ess Are," "Aboji," "48 13 N, 16 20 O," and "pH-SQ." A general outline pertaining to the ideas and concepts behind the architectural designs of the pieces, including formal structures, time structures, orchestration methods, and pitch structures, is also presented.

  3. Distributed Framework for Dynamic Telescope and Instrument Control

    Science.gov (United States)

    Ames, Troy J.; Case, Lynne

    2002-01-01

    Traditionally, instrument command and control systems have been developed specifically for a single instrument. Such solutions are frequently expensive and are inflexible to support the next instrument development effort. NASA Goddard Space Flight Center is developing an extensible framework, known as Instrument Remote Control (IRC) that applies to any kind of instrument that can be controlled by a computer. IRC combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms. The IRC framework provides the ability to communicate to components anywhere on a network using the JXTA protocol for dynamic discovery of distributed components. JXTA (see httD://www.jxta.org,) is a generalized protocol that allows any devices connected by a network to communicate in a peer-to-peer manner. IRC uses JXTA to advertise a device's IML and discover devices of interest on the network. Devices can join or leave the network and thus join or leave the instrument control environment of IRC. Currently, several astronomical instruments are working with the IRC development team to develop custom components for IRC to control their instruments. These instruments include: High resolution Airborne Wideband Camera (HAWC), a first light instrument for the Stratospheric Observatory for Infrared Astronomy (SOFIA); Submillimeter And Far Infrared Experiment (SAFIRE), a Principal Investigator instrument for SOFIA; and Fabry-Perot Interferometer Bolometer Research Experiment (FIBRE), a prototype of the SAFIRE instrument, used at the Caltech Submillimeter Observatory (CSO). Most recently, we have

  4. UV Reconstruction Algorithm And Diurnal Cycle Variability

    Science.gov (United States)

    Curylo, Aleksander; Litynska, Zenobia; Krzyscin, Janusz; Bogdanska, Barbara

    2009-03-01

    UV reconstruction is a method for estimating surface UV from available actinometric and aerological measurements. UV reconstruction is necessary for the study of long-term UV change. A typical series of UV measurements is not longer than 15 years, which is too short for trend estimation. The essential problem in the reconstruction algorithm is a good parameterization of clouds. In our previous algorithm we used an empirical relation between the Cloud Modification Factor (CMF) in global radiation and the CMF in UV. The CMF is defined as the ratio between measured and modelled irradiances. Clear-sky irradiance was calculated with a solar radiative transfer model. In the proposed algorithm, the time variability of global radiation during the diurnal cycle is used as an additional source of information. To elaborate the improved reconstruction algorithm, relevant data from Legionowo [52.4 N, 21.0 E, 96 m a.s.l.], Poland, were collected with the following instruments: NILU-UV multi-channel radiometer, Kipp&Zonen pyranometer, and radiosonde profiles of ozone, humidity and temperature. The proposed algorithm has been used for reconstruction of UV at four Polish sites, Mikolajki, Kolobrzeg, Warszawa-Bielany and Zakopane, since the early 1960s. Krzyscin's reconstruction of total ozone has been used in the calculations.

  5. The algorithm of individualization in physical education students

    Directory of Open Access Journals (Sweden)

    Barybina L.N.

    2012-11-01

    Full Text Available An algorithm for individualizing the physical education process in higher educational establishments is proposed. The algorithm is based on students' physical, functional and psychophysiological characteristics. 413 students (177 women and 236 men) took part in the study. The stages of the algorithm of the author's system for individualizing students' physical education are presented. It was found that young men specializing in basketball and volleyball have a similar structure of indexes of psychophysiological capabilities, physical preparedness and academic progress. The computer programs, which help to improve the system of physical education, proved to be highly significant; they also allow the psychophysiological characteristics of students to be determined quickly and effectively. It is recommended that students be assigned to sports specializations according to their individual characteristics.

  6. Next Generation Aura-OMI SO2 Retrieval Algorithm: Introduction and Implementation Status

    Science.gov (United States)

    Li, Can; Joiner, Joanna; Krotkov, Nickolay A.; Bhartia, Pawan K.

    2014-01-01

    We introduce our next generation algorithm to retrieve SO2 using radiance measurements from the Aura Ozone Monitoring Instrument (OMI). We employ a principal component analysis technique to analyze OMI radiance spectra in the 310.5-340 nm range acquired over regions with no significant SO2. The resulting principal components (PCs) capture radiance variability caused by both physical processes (e.g., Rayleigh and Raman scattering, and ozone absorption) and measurement artifacts, enabling us to account for these various interferences in SO2 retrievals. By fitting these PCs along with SO2 Jacobians calculated with a radiative transfer model to OMI-measured radiance spectra, we directly estimate the SO2 vertical column density in one step. As compared with the previous generation operational OMSO2 PBL (Planetary Boundary Layer) SO2 product, our new algorithm greatly reduces unphysical biases and decreases the noise by a factor of two, providing greater sensitivity to anthropogenic emissions. The new algorithm is fast, eliminates the need for instrument-specific radiance correction schemes, and can be easily adapted to other sensors. These attributes make it a promising technique for producing long-term, consistent SO2 records for air quality and climate research. We have operationally implemented this new algorithm on the OMI SIPS for producing the new generation standard OMI SO2 products.
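
    Schematically, the retrieval amounts to one linear least-squares fit per ground pixel: model the measured spectrum as a combination of the SO2-free principal components plus the SO2 Jacobian, and read the column from the coefficient on the Jacobian. All arrays below are synthetic placeholders.

```python
import numpy as np

def fit_so2_column(measured, principal_components, so2_jacobian):
    """measured: (n_wavelengths,); principal_components: (n_pc, n_wavelengths);
    so2_jacobian: (n_wavelengths,) change in radiance per unit SO2 column."""
    design = np.column_stack([principal_components.T, so2_jacobian])
    coeffs, *_ = np.linalg.lstsq(design, measured, rcond=None)
    return coeffs[-1]        # the last coefficient multiplies the SO2 Jacobian

# Synthetic demo: 8 principal components, 120 wavelengths, true column 0.7
rng = np.random.default_rng(3)
pcs = rng.normal(size=(8, 120))
jac = rng.normal(size=120)
spectrum = pcs.T @ rng.normal(size=8) + 0.7 * jac
print(fit_so2_column(spectrum, pcs, jac))   # ~0.7
```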

  7. Optimized coincidence Doppler broadening spectroscopy using deconvolution algorithms

    International Nuclear Information System (INIS)

    Ho, K.F.; Ching, H.M.; Cheng, K.W.; Beling, C.D.; Fung, S.; Ng, K.P.

    2004-01-01

    In the last few years a number of excellent deconvolution algorithms have been developed for use in 'de-blurring' 2D images. Here we report briefly on one such algorithm we have studied, which uses the non-negativity constraint to optimize the regularization and which is applied to the 2D image-like data produced in Coincidence Doppler Broadening Spectroscopy (CDBS). The system instrumental resolution functions are obtained using the 514 keV line from 85Sr. The technique, when applied to a series of well-annealed polycrystalline metals, gives two-photon momentum data of a quality comparable to that obtainable using 1D Angular Correlation of Annihilation Radiation (ACAR). (orig.)
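    The abstract does not name the specific algorithm used; as an illustration of a non-negativity-preserving iterative deconvolution, a minimal Richardson-Lucy sketch applied to synthetic 2D CDBS-like data is shown below, with a Gaussian standing in for the measured 85Sr resolution function.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, iterations=50):
    """Iterative deconvolution whose multiplicative update preserves
    non-negativity, acting as the regularising constraint."""
    estimate = np.full_like(blurred, blurred.mean())
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(reblurred, 1e-12)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Synthetic 2D "coincidence spectrum" blurred by a Gaussian resolution function
x = np.linspace(-1, 1, 64)
xx, yy = np.meshgrid(x, x)
truth = np.exp(-(xx**2 + yy**2) / 0.02)
psf = np.exp(-(xx**2 + yy**2) / 0.005)
psf /= psf.sum()
blurred = fftconvolve(truth, psf, mode="same")
restored = richardson_lucy(blurred, psf, iterations=30)
```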

  8. Iterative methods used in overlap astrometric reduction techniques do not always converge

    Science.gov (United States)

    Rapaport, M.; Ducourant, C.; Colin, J.; Le Campion, J. F.

    1993-04-01

    In this paper we prove that the classical Gauss-Seidel type iterative methods used for the solution of the reduced normal equations occurring in overlapping reduction methods of astrometry do not always converge. We exhibit examples of divergence. We then analyze an alternative algorithm proposed by Wang (1985). We prove the consistency of this algorithm and verify that it can be convergent while the Gauss-Seidel method is divergent. We conjecture the convergence of Wang method for the solution of astrometric problems using overlap techniques.
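    The divergence phenomenon is easy to reproduce on a toy system unrelated to the astrometric normal equations of the paper: the sketch below applies plain Gauss-Seidel to a small non-diagonally-dominant matrix whose iteration matrix has spectral radius greater than one, so the iterates grow without bound while a direct solver returns the exact solution.

```python
import numpy as np

def gauss_seidel(A, b, x0, iterations=20):
    """Plain Gauss-Seidel sweeps; returns all iterates so divergence is visible."""
    x = x0.astype(float).copy()
    history = [x.copy()]
    n = len(b)
    for _ in range(iterations):
        for i in range(n):
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - sigma) / A[i, i]
        history.append(x.copy())
    return np.array(history)

# A small non-diagonally-dominant system for which Gauss-Seidel diverges
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
b = np.array([3.0, 4.0])
iterates = gauss_seidel(A, b, np.zeros(2), iterations=10)
print(iterates[-1])            # grows without bound
print(np.linalg.solve(A, b))   # the exact solution a direct solver returns
```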

  9. Portable instrument for in-vivo infrared oxymetry using spread-spectrum modulation

    Energy Technology Data Exchange (ETDEWEB)

    Trevisan, S [Dipartimento di Elettronica, Universita di Pavia, Pavia, Via Ferrata, 1 - 27100 Pavia (Italy); Bavera, M [National Institute for the Physics of Matter, INFM, C.so Perrone, 24 - 16152 Genova (Italy); Giardini, M E [National Institute for the Physics of Matter, INFM, C.so Perrone, 24 - 16152 Genova (Italy)

    2007-04-15

    Near Infrared Spectroscopy (NIRS) can be employed to monitor noninvasively and continuously local changes in hemodynamics and oxygenation of human tissues. A portable NIRS research-grade acquisition system, dedicated to measurements during muscular exercise, is presented. The instrument is able to control up to eight LED sources and two detectors. A digital correlation technique, implemented on a single-chip RISC microcontroller, performs source-to-detector multiplexing. This algorithm is highly optimized for computational efficiency and ambient noise rejection. Software-configurable input stages allow for flexibility in instrument setup. As a result of the specific correlation technique employed, the instrument is compact, lightweight and efficient. Clinical tests on oxygen consumption show excellent performance.

  10. Portable instrument for in-vivo infrared oxymetry using spread-spectrum modulation

    International Nuclear Information System (INIS)

    Trevisan, S; Bavera, M; Giardini, M E

    2007-01-01

    Near Infrared Spectroscopy (NIRS) can be employed to monitor noninvasively and continuously local changes in hemodynamics and oxygenation of human tissues. A portable NIRS research-grade acquisition system, dedicated to measurements during muscular exercise, is presented. The instrument is able to control up to eight LED sources and two detectors. A digital correlation technique, implemented on a single-chip RISC microcontroller, performs source-to-detector multiplexing. This algorithm is highly optimized for computational efficiency and ambient noise rejection. Software-configurable input stages allow for flexibility in instrument setup. As a result of the specific correlation technique employed, the instrument is compact, lightweight and efficient. Clinical tests on oxygen consumption show excellent performance.

  11. Algorithms in practice: Comparing web journalism and criminal justice

    Directory of Open Access Journals (Sweden)

    Angèle Christin

    2017-07-01

    Full Text Available Big Data evangelists often argue that algorithms make decision-making more informed and objective—a promise hotly contested by critics of these technologies. Yet, to date, most of the debate has focused on the instruments themselves, rather than on how they are used. This article addresses this lack by examining the actual practices surrounding algorithmic technologies. Specifically, drawing on multi-sited ethnographic data, I compare how algorithms are used and interpreted in two institutional contexts with markedly different characteristics: web journalism and criminal justice. I find that there are surprising similarities in how web journalists and legal professionals use algorithms in their work. In both cases, I document a gap between the intended and actual effects of algorithms—a process I analyze as “decoupling.” Second, I identify a gamut of buffering strategies used by both web journalists and legal professionals to minimize the impact of algorithms in their daily work. Those include foot-dragging, gaming, and open critique. Of course, these similarities do not exhaust the differences between the two cases, which are explored in the discussion section. I conclude with a call for further ethnographic work on algorithms in practice as an important empirical check against the dominant rhetoric of algorithmic power.

  12. The operational cloud retrieval algorithms from TROPOMI on board Sentinel-5 Precursor

    Directory of Open Access Journals (Sweden)

    D. G. Loyola

    2018-01-01

    Full Text Available This paper presents the operational cloud retrieval algorithms for the TROPOspheric Monitoring Instrument (TROPOMI) on board the European Space Agency Sentinel-5 Precursor (S5P) mission scheduled for launch in 2017. Two algorithms working in tandem are used for retrieving cloud properties: OCRA (Optical Cloud Recognition Algorithm) and ROCINN (Retrieval of Cloud Information using Neural Networks). OCRA retrieves the cloud fraction using TROPOMI measurements in the ultraviolet (UV) and visible (VIS) spectral regions, and ROCINN retrieves the cloud top height (pressure) and optical thickness (albedo) using TROPOMI measurements in and around the oxygen A-band in the near infrared (NIR). Cloud parameters from TROPOMI/S5P will be used not only for enhancing the accuracy of trace gas retrievals but also for extending the satellite data record of cloud information derived from oxygen A-band measurements, a record initiated with the Global Ozone Monitoring Experiment (GOME) on board the second European Remote-Sensing Satellite (ERS-2) over 20 years ago. The OCRA and ROCINN algorithms are integrated in the S5P operational processor UPAS (Universal Processor for UV/VIS/NIR Atmospheric Spectrometers), and we present here UPAS cloud results using the Ozone Monitoring Instrument (OMI) and GOME-2 measurements. In addition, we examine anticipated challenges for the TROPOMI/S5P cloud retrieval algorithms, and we discuss the future validation needs for OCRA and ROCINN.

  13. The CCSDS Lossless Data Compression Algorithm for Space Applications

    Science.gov (United States)

    Yeh, Pen-Shu; Day, John H. (Technical Monitor)

    2001-01-01

    In the late 1980s, when the author started working at the Goddard Space Flight Center (GSFC) of the National Aeronautics and Space Administration (NASA), several scientists there were in the process of formulating the next generation of Earth-viewing science instruments, the Moderate Resolution Imaging Spectroradiometer (MODIS). The instrument would have over thirty spectral bands and would transmit an enormous volume of data through the communications channel. This was when the author was assigned the task of investigating lossless compression algorithms suitable for space implementation, to compress science data and thereby reduce the requirements on bandwidth and storage.
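    The CCSDS lossless data compression recommendation is based on a predictive preprocessor followed by a Rice entropy coder; the sketch below shows only a minimal Rice (Golomb power-of-two) encoder and decoder for non-negative mapped residuals, not the full standard with its adaptive option selection and packet formatting.

```python
def rice_encode(values, k):
    """Rice (Golomb power-of-two) coding of non-negative integers.

    Each value v is split into a quotient v >> k (sent in unary) and a
    k-bit remainder; small residuals therefore cost few bits, which is why
    a good predictor/preprocessor is applied first.
    """
    bits = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        bits.extend([1] * q)          # unary quotient
        bits.append(0)                # terminator
        bits.extend((r >> (k - 1 - i)) & 1 for i in range(k))  # k-bit remainder, MSB first
    return bits

def rice_decode(bits, k, count):
    out, pos = [], 0
    for _ in range(count):
        q = 0
        while bits[pos] == 1:
            q, pos = q + 1, pos + 1
        pos += 1                      # skip terminator
        r = 0
        for _ in range(k):
            r = (r << 1) | bits[pos]
            pos += 1
        out.append((q << k) | r)
    return out

samples = [3, 0, 5, 12, 1, 7]
encoded = rice_encode(samples, k=2)
assert rice_decode(encoded, k=2, count=len(samples)) == samples
```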

  14. Recognition of Instrumentation Gauge in the Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Wan; Jeong, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Nuclear emergency robots were developed in 2001 as a countermeasure following the criticality accident at the JCO uranium refinery facility in Tokaimura, Japan, in 1999. We consider the deployment of such nuclear emergency robots for the mitigation and management of a severe accident such as the one that occurred at the Fukushima Daiichi nuclear power plant. In that case, image understanding using a color CCD camera mounted on the nuclear emergency robot is important. We proposed an image processing technique to read the indicated values of the IC water level gauges using the structural characteristics of the instrumentation panels (water level gauges) located inside the reactor building. First, we recognized the scales on the instrumentation panel using the geometric shape of the panel. Then, we read the values of the instrumentation gauge by calculating the slope of the needle on the gauge. Using the proposed algorithm, we deciphered the instrumentation panels for the four water level gauges and indicators shown in the IC video released by TEPCO and the Japanese Nuclear Regulatory Commission. In this paper, recognition of the instrumentation gauges inside the reactor building of a nuclear power plant by image processing technology is described.

  15. Energy conserving schemes for the simulation of musical instrument contact dynamics

    Science.gov (United States)

    Chatziioannou, Vasileios; van Walstijn, Maarten

    2015-03-01

    Collisions are an innate part of the function of many musical instruments. Due to the nonlinear nature of contact forces, special care has to be taken in the construction of numerical schemes for simulation and sound synthesis. Finite difference schemes and other time-stepping algorithms used for musical instrument modelling purposes are normally arrived at by discretising a Newtonian description of the system. However because impact forces are non-analytic functions of the phase space variables, algorithm stability can rarely be established this way. This paper presents a systematic approach to deriving energy conserving schemes for frictionless impact modelling. The proposed numerical formulations follow from discretising Hamilton's equations of motion, generally leading to an implicit system of nonlinear equations that can be solved with Newton's method. The approach is first outlined for point mass collisions and then extended to distributed settings, such as vibrating strings and beams colliding with rigid obstacles. Stability and other relevant properties of the proposed approach are discussed and further demonstrated with simulation examples. The methodology is exemplified through a case study on tanpura string vibration, with the results confirming the main findings of previous studies on the role of the bridge in sound generation with this type of string instrument.
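    For the simplest case, a point mass colliding with a rigid barrier modelled by a one-sided power-law penalty potential, a discrete-gradient discretisation of Hamilton's equations can be sketched in a few lines; the barrier parameters below are arbitrary and the scheme is a generic energy-conserving construction in the spirit of the paper, not its exact formulation.

```python
import numpy as np

def barrier_potential(q, b=0.0, K=1e4, alpha=1.5):
    """One-sided contact potential V(q) = K/(alpha+1) * max(q - b, 0)^(alpha+1)."""
    return K / (alpha + 1.0) * np.maximum(q - b, 0.0) ** (alpha + 1.0)

def step(q, p, m, dt, tol=1e-12, max_iter=100):
    """One step of a discrete-gradient scheme for H = p^2/(2m) + V(q):
        q_{n+1} = q_n + dt*(p_n + p_{n+1})/(2m)
        p_{n+1} = p_n - dt*(V(q_{n+1}) - V(q_n))/(q_{n+1} - q_n)
    The update conserves p^2/(2m) + V(q); the implicit equation in q_{n+1}
    is solved here by fixed-point iteration (Newton's method also works)."""
    q_new = q + dt * p / m          # explicit initial guess
    for _ in range(max_iter):
        dq = q_new - q
        dV_dq = 0.0 if abs(dq) < 1e-14 else (barrier_potential(q_new) - barrier_potential(q)) / dq
        p_new = p - dt * dV_dq
        q_next = q + dt * (p + p_new) / (2.0 * m)
        if abs(q_next - q_new) < tol:
            q_new = q_next
            break
        q_new = q_next
    return q_new, p_new

# A mass approaching a barrier at q = 0 from the left
m, dt = 1.0, 1e-3
q, p = -0.5, 2.0
energy0 = p**2 / (2 * m) + barrier_potential(q)
for _ in range(2000):
    q, p = step(q, p, m, dt)
energy1 = p**2 / (2 * m) + barrier_potential(q)
print(energy0, energy1)   # energies agree to solver tolerance: the bounce adds no energy
```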

  16. Laboratory Spacecraft Data Processing and Instrument Autonomy: AOSAT as Testbed

    Science.gov (United States)

    Lightholder, Jack; Asphaug, Erik; Thangavelautham, Jekan

    2015-11-01

    Recent advances in small spacecraft allow for their use as orbiting microgravity laboratories (e.g. Asphaug and Thangavelautham LPSC 2014) that will produce substantial amounts of data. Power, bandwidth and processing constraints impose limitations on the number of operations which can be performed on this data as well as the data volume the spacecraft can downlink. We show that instrument autonomy and machine learning techniques can intelligently conduct data reduction and downlink queueing to meet data storage and downlink limitations. As small spacecraft laboratory capabilities increase, we must find techniques to increase instrument autonomy and spacecraft scientific decision making. The Asteroid Origins Satellite (AOSAT) CubeSat centrifuge will act as a testbed for further proving these techniques. Lightweight algorithms, such as connected components analysis, centroid tracking, K-means clustering, edge detection, convex hull analysis and intelligent cropping routines, can be coupled with the traditional packet compression routines to reduce data transfer per image as well as provide a first-order filtering of what data is most relevant to downlink. This intelligent queueing provides timelier downlink of scientifically relevant data while reducing the amount of irrelevant downlinked data. The resulting algorithms allow scientists to throttle the amount of data downlinked based on initial experimental results. The data downlink pipeline, prioritized for scientific relevance based on incorporated scientific objectives, can continue from the spacecraft until the data is no longer fruitful. Coupled with data compression and cropping strategies at the data packet level, bandwidth reductions exceeding 40% can be achieved while still downlinking data deemed to be most relevant in a double-blind study between scientist and algorithm. Applications of this technology allow for the incorporation of instrumentation which produces significant data volumes on small spacecraft
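    A minimal sketch of the kind of lightweight onboard reduction described above (connected-components labelling, centroid extraction and intelligent cropping) is given below; it is an illustration built on scipy, not the AOSAT flight software.

```python
import numpy as np
from scipy import ndimage

def reduce_frame(frame, threshold, margin=8):
    """Detect bright regions, report their centroids, and crop the frame to the
    bounding box of all detections (plus a margin) before downlink."""
    mask = frame > threshold
    labels, n = ndimage.label(mask)
    if n == 0:
        return None, []                       # nothing of interest: skip downlink
    centroids = ndimage.center_of_mass(frame, labels, range(1, n + 1))
    rows, cols = np.where(mask)
    r0, r1 = max(rows.min() - margin, 0), min(rows.max() + margin, frame.shape[0])
    c0, c1 = max(cols.min() - margin, 0), min(cols.max() + margin, frame.shape[1])
    return frame[r0:r1, c0:c1], centroids

# Synthetic frame with two bright "particles"
rng = np.random.default_rng(1)
frame = rng.normal(100.0, 5.0, (512, 512))
frame[100:104, 200:204] += 300.0
frame[300:303, 350:353] += 250.0
crop, centroids = reduce_frame(frame, threshold=200.0)
print(crop.shape, centroids)   # only the informative sub-image is queued for downlink
```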

  17. Teaching computer interfacing with virtual instruments in an object-oriented language.

    Science.gov (United States)

    Gulotta, M

    1995-01-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given. PMID:8580361

  18. Chlorophyll-a Algorithms for Oligotrophic Oceans: A Novel Approach Based on Three-Band Reflectance Difference

    Science.gov (United States)

    Hu, Chuanmin; Lee, Zhongping; Franz, Bryan

    2011-01-01

    A new empirical algorithm is proposed to estimate surface chlorophyll-a concentrations (Chl) in the global ocean for Chl less than or equal to 0.25 milligrams per cubic meter (approximately 77% of the global ocean area). The algorithm is based on a color index (CI), defined as the difference between the remote sensing reflectance (R_rs, in sr^-1) in the green and a reference formed linearly between R_rs in the blue and red. For low Chl waters, in situ data showed a tighter (and therefore better) relationship between CI and Chl than between traditional band-ratios and Chl, which was further validated using global data collected concurrently by ship-borne and SeaWiFS satellite instruments. Model simulations showed that for low Chl waters, compared with the band-ratio algorithm, the CI-based algorithm (CIA) was more tolerant to changes in the chlorophyll-specific backscattering coefficient, and performed similarly for different relative contributions of non-phytoplankton absorption. Simulations using existing atmospheric correction approaches further demonstrated that the CIA was much less sensitive than band-ratio algorithms to various errors induced by instrument noise and imperfect atmospheric correction (including sun glint and whitecap corrections). Image and time-series analyses of SeaWiFS and MODIS/Aqua data also showed improved performance in terms of reduced image noise, more coherent spatial and temporal patterns, and consistency between the two sensors. The reduction in noise and other errors is particularly useful to improve the detection of various ocean features such as eddies. Preliminary tests over MERIS and CZCS data indicate that the new approach should be generally applicable to all existing and future ocean color instruments.
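    A compact sketch of the color-index computation is given below. The band centres follow a SeaWiFS-like blue/green/red choice, while the coefficients of the CI-to-Chl relation are placeholders; the operational values come from regression against in situ data.

```python
import numpy as np

# SeaWiFS-like band centres (nm); other sensors use their closest equivalents
BLUE, GREEN, RED = 443.0, 555.0, 670.0

def color_index(rrs_blue, rrs_green, rrs_red):
    """Color index: departure of the green Rrs from a straight line drawn
    between the blue and red Rrs (all in sr^-1)."""
    baseline = rrs_blue + (GREEN - BLUE) / (RED - BLUE) * (rrs_red - rrs_blue)
    return rrs_green - baseline

def chl_from_ci(ci, a0=-0.49, a1=191.7):
    """Empirical Chl (mg m^-3) from CI for oligotrophic waters.

    a0 and a1 are placeholder coefficients of a fit of the form
    log10(Chl) = a0 + a1*CI; operational values are obtained by regression
    against in situ matchups.
    """
    return 10.0 ** (a0 + a1 * ci)

rrs_blue, rrs_green, rrs_red = 0.0100, 0.0028, 0.0003   # typical clear-water values
ci = color_index(rrs_blue, rrs_green, rrs_red)
print(ci, chl_from_ci(ci))
```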

  19. Towards the retrieval of tropospheric ozone with the ozone monitoring instrument (OMI)

    NARCIS (Netherlands)

    Mielonen, T.; De Haan, J.F.; Van Peet, J.C.A.; Eremenko, M.; Veefkind, J.P.

    2015-01-01

    We have assessed the sensitivity of the operational Ozone Monitoring Instrument (OMI) ozone profile retrieval algorithm to a number of a priori and radiative transfer assumptions. We studied the effect of stray light correction, surface albedo assumptions and a priori ozone profiles on the retrieved

  20. The solar neighborhood. XXXIV. A search for planets orbiting nearby M dwarfs using astrometry

    International Nuclear Information System (INIS)

    Lurie, John C.; Henry, Todd J.; Ianna, Philip A.; Jao, Wei-Chun; Quinn, Samuel N.; Winters, Jennifer G.; Koerner, David W.; Riedel, Adric R.; Subasavage, John P.

    2014-01-01

    Astrometric measurements are presented for seven nearby stars with previously detected planets: six M dwarfs (GJ 317, GJ 667C, GJ 581, GJ 849, GJ 876, and GJ 1214) and one K dwarf (BD-10 3166). Measurements are also presented for six additional nearby M dwarfs without known planets, but which are more favorable to astrometric detections of low mass companions, as well as three binary systems for which we provide astrometric orbit solutions. Observations have baselines of 3 to 13 years, and were made as part of the RECONS long-term astrometry and photometry program at the CTIO/SMARTS 0.9 m telescope. We provide trigonometric parallaxes and proper motions for all 16 systems, and perform an extensive analysis of the astrometric residuals to determine the minimum detectable companion mass for the 12 M dwarfs not having close stellar secondaries. For the six M dwarfs with known planets, we are not sensitive to planets, but can rule out the presence of all but the least massive brown dwarfs at periods of 2–12 years. For the six more astrometrically favorable M dwarfs, we conclude that none have brown dwarf companions, and are sensitive to companions with masses as low as 1 M_Jup for periods longer than two years. In particular, we conclude that Proxima Centauri has no Jovian companions at orbital periods of 2–12 years. These results complement previously published M dwarf planet occurrence rates by providing astrometrically determined upper mass limits on potential super-Jupiter companions at orbits of two years and longer. As part of a continuing survey, these results are consistent with the paucity of super-Jupiter and brown dwarf companions we find among the over 250 red dwarfs within 25 pc observed longer than five years in our astrometric program.

  1. The solar neighborhood. XXXIV. A search for planets orbiting nearby M dwarfs using astrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lurie, John C. [Department of Astronomy, University of Washington, Seattle, WA 98195 (United States); Henry, Todd J.; Ianna, Philip A. [RECONS Institute, Chambersburg, PA 17201 (United States); Jao, Wei-Chun; Quinn, Samuel N.; Winters, Jennifer G. [Department of Physics and Astronomy, Georgia State University, Atlanta, GA 30302 (United States); Koerner, David W. [Department of Physics and Astronomy, Northern Arizona University, Flagstaff, AZ 86011 (United States); Riedel, Adric R. [Department of Astrophysics, American Museum of Natural History, New York, NY 10034 (United States); Subasavage, John P., E-mail: lurie@uw.edu [United States Naval Observatory, Flagstaff, AZ 86001 (United States)

    2014-11-01

    Astrometric measurements are presented for seven nearby stars with previously detected planets: six M dwarfs (GJ 317, GJ 667C, GJ 581, GJ 849, GJ 876, and GJ 1214) and one K dwarf (BD-10 3166). Measurements are also presented for six additional nearby M dwarfs without known planets, but which are more favorable to astrometric detections of low mass companions, as well as three binary systems for which we provide astrometric orbit solutions. Observations have baselines of 3 to 13 years, and were made as part of the RECONS long-term astrometry and photometry program at the CTIO/SMARTS 0.9 m telescope. We provide trigonometric parallaxes and proper motions for all 16 systems, and perform an extensive analysis of the astrometric residuals to determine the minimum detectable companion mass for the 12 M dwarfs not having close stellar secondaries. For the six M dwarfs with known planets, we are not sensitive to planets, but can rule out the presence of all but the least massive brown dwarfs at periods of 2–12 years. For the six more astrometrically favorable M dwarfs, we conclude that none have brown dwarf companions, and are sensitive to companions with masses as low as 1 M_Jup for periods longer than two years. In particular, we conclude that Proxima Centauri has no Jovian companions at orbital periods of 2–12 years. These results complement previously published M dwarf planet occurrence rates by providing astrometrically determined upper mass limits on potential super-Jupiter companions at orbits of two years and longer. As part of a continuing survey, these results are consistent with the paucity of super-Jupiter and brown dwarf companions we find among the over 250 red dwarfs within 25 pc observed longer than five years in our astrometric program.

  2. Vectorised Spreading Activation algorithm for centrality measurement

    Directory of Open Access Journals (Sweden)

    Alexander Troussov

    2011-01-01

    Full Text Available Spreading Activation is a family of graph-based algorithms widely used in areas such as information retrieval, epidemic models, and recommender systems. In this paper we introduce a novel Spreading Activation (SA) method that we call Vectorised Spreading Activation (VSA). VSA algorithms, like “traditional” SA algorithms, iteratively propagate the activation from the initially activated set of nodes to the other nodes in a network through outward links. The level of a node's activation can be used as a centrality measurement in accordance with a dynamic model-based view of centrality that focuses on the outcomes for nodes in a network where something is flowing from node to node across the edges. Representing the activation by vectors allows the use of information about the various dimensionalities of the flow and the dynamics of the flow. In this capacity, VSA algorithms can model a multitude of complex multidimensional network flows. We present the results of numerical simulations on small synthetic social networks and multidimensional network models of folksonomies which show that the results of VSA propagation are more sensitive to the positions of the initial seed and to the community structure of the network than the results produced by traditional SA algorithms. We tentatively conclude that VSA methods could be instrumental in developing scalable and computationally efficient algorithms which could achieve synergy between the computation of centrality indexes and the detection of community structures in networks. Based on our preliminary results and on improvements made over previous studies, we foresee advances and applications in the current state of the art of this family of algorithms and their applications to centrality measurement.
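    A compact sketch of a vectorised propagation step is shown below; the decay factor, the normalisation of outgoing weights and the use of the vector norm as a scalar centrality score are illustrative choices, not a fixed parameterisation from the paper.

```python
import numpy as np

def spread_activation(adjacency, seed_vectors, decay=0.7, iterations=5):
    """Vectorised spreading activation on a directed graph.

    adjacency    : (n, n) matrix, adjacency[i, j] = weight of edge i -> j
    seed_vectors : (n, d) initial activation; each node carries a d-dimensional
                   vector, so several "dimensions" of the flow (topics,
                   communities, ...) propagate simultaneously.
    """
    activation = seed_vectors.astype(float).copy()
    # Normalise outgoing weights so each node spreads its activation proportionally
    out_sums = adjacency.sum(axis=1, keepdims=True)
    transfer = np.divide(adjacency, out_sums,
                         out=np.zeros_like(adjacency), where=out_sums > 0)
    for _ in range(iterations):
        activation = activation + decay * (transfer.T @ activation)
    # A scalar centrality score per node: magnitude of its accumulated vector
    return activation, np.linalg.norm(activation, axis=1)

# Tiny example: 4 nodes, 2 activation dimensions, seeds on nodes 0 and 3
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)
seeds = np.zeros((4, 2))
seeds[0, 0] = 1.0   # dimension 0 starts at node 0
seeds[3, 1] = 1.0   # dimension 1 starts at node 3
vectors, centrality = spread_activation(A, seeds)
print(centrality)
```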

  3. Using XML and Java Technologies for Astronomical Instrument Control

    Science.gov (United States)

    Ames, Troy; Case, Lynne; Powers, Edward I. (Technical Monitor)

    2001-01-01

    Traditionally, instrument command and control systems have been highly specialized, consisting mostly of custom code that is difficult to develop, maintain, and extend. Such solutions are initially very costly and are inflexible to subsequent engineering change requests, increasing software maintenance costs. Instrument description is too tightly coupled with details of implementation. NASA Goddard Space Flight Center, under the Instrument Remote Control (IRC) project, is developing a general and highly extensible framework that applies to any kind of instrument that can be controlled by a computer. The software architecture combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML), a human readable and machine understandable way to describe structured data. A key aspect of the object-oriented architecture is that the software is driven by an instrument description, written using the Instrument Markup Language (IML), a dialect of XML. IML is used to describe the command sets and command formats of the instrument, communication mechanisms, format of the data coming from the instrument, and characteristics of the graphical user interface to control and monitor the instrument. The IRC framework allows the users to define a data analysis pipeline which converts data coming out of the instrument. The data can be used in visualizations in order for the user to assess the data in real-time, if necessary. The data analysis pipeline algorithms can be supplied by the user in a variety of forms or programming languages. Although the current integration effort is targeted for the High-resolution Airborne Wideband Camera (HAWC) and the Submillimeter and Far Infrared Experiment (SAFIRE), first-light instruments of the Stratospheric Observatory for Infrared Astronomy (SOFIA), the framework is designed to be generic and extensible so that it can be applied to any instrument. Plans are underway to test the framework

  4. Diagnostic instrumentation for detection of the onset of steam tube leaks in PWRs

    International Nuclear Information System (INIS)

    Roach, W.H.

    1984-01-01

    Four tasks are addressed in this study of the detection of steam tube leaks: determination of which physical parameters indicate the onset of steam generator tube leaks; establishing performance goals for diagnostic instruments which could be used for early detection of steam generator tube leaks; defining the diagnostic instrumentation and their location which satisfy Items 1 and 2; and assessing the need for diagnostic data processing and display. Parameters are identified, performance goals established and sensor types and locations are specified in the report, with emphasis on the use of existing instrumentation with a minimum of retrofitting. A simple algorithm is developed which yields the leak rate as a function of known or measurable quantities. The conclusion is that leak rates of less than one-tenth gram per second should be detectable with existing instrumentation

  5. Automatic modal identification of cable-supported bridges instrumented with a long-term monitoring system

    Science.gov (United States)

    Ni, Y. Q.; Fan, K. Q.; Zheng, G.; Chan, T. H. T.; Ko, J. M.

    2003-08-01

    An automatic modal identification program is developed for continuous extraction of modal parameters of three cable-supported bridges in Hong Kong which are instrumented with a long-term monitoring system. The program employs the Complex Modal Indication Function (CMIF) algorithm to identify modal properties from continuous ambient vibration measurements in an on-line manner. By using the LabVIEW graphical programming language, the software realizes the algorithm in Virtual Instrument (VI) style. The applicability and implementation issues of the developed software are demonstrated by using one-year measurement data acquired from 67 channels of accelerometers deployed on the cable-stayed Ting Kau Bridge. With the continuously identified results, normal variability of modal vectors caused by varying environmental and operational conditions is observed. Such observation is very helpful for selection of appropriate measured modal vectors for structural health monitoring applications.
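    The core CMIF computation is a singular value decomposition of the cross-spectral matrix of the measured accelerations at every frequency line, with peaks of the leading singular value indicating natural frequencies. A minimal Python sketch of this step (not the LabVIEW implementation used on the bridges) follows.

```python
import numpy as np
from scipy.signal import csd

def cmif(signals, fs, nperseg=1024):
    """Complex Mode Indicator Function from multi-channel ambient vibration data.

    signals : (n_channels, n_samples) acceleration records
    Returns the frequency axis and the singular values of the cross-spectral
    matrix at each frequency; peaks of the first singular value locate modes.
    """
    n = signals.shape[0]
    freqs = None
    # Build the full cross-spectral density matrix G(f), shape (n_f, n, n)
    for i in range(n):
        for j in range(n):
            f, gij = csd(signals[i], signals[j], fs=fs, nperseg=nperseg)
            if freqs is None:
                freqs = f
                G = np.zeros((f.size, n, n), dtype=complex)
            G[:, i, j] = gij
    singular_values = np.linalg.svd(G, compute_uv=False)   # (n_f, n)
    return freqs, singular_values

# Synthetic two-channel record with modes near 1.5 Hz and 3.2 Hz
fs = 50.0
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(2)
s1 = np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.sin(2 * np.pi * 3.2 * t) + rng.normal(0, 0.5, t.size)
s2 = 0.5 * np.sin(2 * np.pi * 1.5 * t) - 0.8 * np.sin(2 * np.pi * 3.2 * t) + rng.normal(0, 0.5, t.size)
freqs, sv = cmif(np.vstack([s1, s2]), fs)
print(freqs[np.argmax(sv[:, 0])])   # close to one of the modal frequencies
```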

  6. High resolution tomographic instrument development

    International Nuclear Information System (INIS)

    1992-01-01

    Our recent work has concentrated on the development of high-resolution PET instrumentation reflecting in part the growing importance of PET in nuclear medicine imaging. We have developed a number of positron imaging instruments and have the distinction that every instrument has been placed in operation and has had an extensive history of application for basic research and clinical study. The present program is a logical continuation of these earlier successes. PCR-I, a single ring positron tomograph was the first demonstration of analog coding using BGO. It employed 4 mm detectors and is currently being used for a wide range of biological studies. These are of immense importance in guiding the direction for future instruments. In particular, PCR-II, a volume sensitive positron tomograph with 3 mm spatial resolution has benefited greatly from the studies using PCR-I. PCR-II is currently in the final stages of assembly and testing and will shortly be placed in operation for imaging phantoms, animals and ultimately humans. Perhaps the most important finding resulting from our previous study is that resolution and sensitivity must be carefully balanced to achieve a practical high resolution system. PCR-II has been designed to have the detection characteristics required to achieve 3 mm resolution in human brain under practical imaging situations. The development of algorithms by the group headed by Dr. Chesler is based on a long history of prior study including his joint work with Drs. Pelc and Reiderer and Stearns. This body of expertise will be applied to the processing of data from PCR-II when it becomes operational

  7. High resolution tomographic instrument development

    Energy Technology Data Exchange (ETDEWEB)

    1992-08-01

    Our recent work has concentrated on the development of high-resolution PET instrumentation reflecting in part the growing importance of PET in nuclear medicine imaging. We have developed a number of positron imaging instruments and have the distinction that every instrument has been placed in operation and has had an extensive history of application for basic research and clinical study. The present program is a logical continuation of these earlier successes. PCR-I, a single ring positron tomograph was the first demonstration of analog coding using BGO. It employed 4 mm detectors and is currently being used for a wide range of biological studies. These are of immense importance in guiding the direction for future instruments. In particular, PCR-II, a volume sensitive positron tomograph with 3 mm spatial resolution has benefited greatly from the studies using PCR-I. PCR-II is currently in the final stages of assembly and testing and will shortly be placed in operation for imaging phantoms, animals and ultimately humans. Perhaps the most important finding resulting from our previous study is that resolution and sensitivity must be carefully balanced to achieve a practical high resolution system. PCR-II has been designed to have the detection characteristics required to achieve 3 mm resolution in human brain under practical imaging situations. The development of algorithms by the group headed by Dr. Chesler is based on a long history of prior study including his joint work with Drs. Pelc and Reiderer and Stearns. This body of expertise will be applied to the processing of data from PCR-II when it becomes operational.

  8. High resolution tomographic instrument development

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    Our recent work has concentrated on the development of high-resolution PET instrumentation reflecting in part the growing importance of PET in nuclear medicine imaging. We have developed a number of positron imaging instruments and have the distinction that every instrument has been placed in operation and has had an extensive history of application for basic research and clinical study. The present program is a logical continuation of these earlier successes. PCR-I, a single ring positron tomograph was the first demonstration of analog coding using BGO. It employed 4 mm detectors and is currently being used for a wide range of biological studies. These are of immense importance in guiding the direction for future instruments. In particular, PCR-II, a volume sensitive positron tomograph with 3 mm spatial resolution has benefited greatly from the studies using PCR-I. PCR-II is currently in the final stages of assembly and testing and will shortly be placed in operation for imaging phantoms, animals and ultimately humans. Perhaps the most important finding resulting from our previous study is that resolution and sensitivity must be carefully balanced to achieve a practical high resolution system. PCR-II has been designed to have the detection characteristics required to achieve 3 mm resolution in human brain under practical imaging situations. The development of algorithms by the group headed by Dr. Chesler is based on a long history of prior study including his joint work with Drs. Pelc and Reiderer and Stearns. This body of expertise will be applied to the processing of data from PCR-II when it becomes operational.

  9. X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT

    Science.gov (United States)

    Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven; hide

    2015-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.

  10. System automation for a bacterial colony detection and identification instrument via forward scattering

    International Nuclear Information System (INIS)

    Bae, Euiwon; Hirleman, E Daniel; Aroonnual, Amornrat; Bhunia, Arun K; Robinson, J Paul

    2009-01-01

    A system design and automation of a microbiological instrument that locates bacterial colonies and captures the forward-scattering signatures are presented. The proposed instrument integrates three major components: a colony locator, a forward scatterometer and a motion controller. The colony locator utilizes an off-axis light source to illuminate a Petri dish and an IEEE1394 camera to capture the diffusively scattered light to provide the number of bacterial colonies and two-dimensional coordinate information of the bacterial colonies with the help of a segmentation algorithm with region-growing. Then the Petri dish is automatically aligned with the respective centroid coordinate with a trajectory optimization method, such as the Traveling Salesman Algorithm. The forward scatterometer automatically computes the scattered laser beam from a monochromatic image sensor via quadrant intensity balancing and quantitatively determines the centeredness of the forward-scattering pattern. The final scattering signatures are stored to be analyzed to provide rapid identification and classification of the bacterial samples
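    For the trajectory optimization step, a greedy nearest-neighbour ordering of the detected colony centroids is one simple Traveling-Salesman-style heuristic; the sketch below is illustrative and not necessarily the routine implemented in the instrument.

```python
import numpy as np

def visiting_order(centroids, start=0):
    """Greedy nearest-neighbour ordering of colony centroids.

    centroids : (n, 2) array of (x, y) stage coordinates of detected colonies.
    Returns the indices in the order the stage should visit them, keeping the
    total travel of the Petri-dish positioner short.
    """
    remaining = list(range(len(centroids)))
    order = [remaining.pop(start)]
    while remaining:
        last = centroids[order[-1]]
        dists = np.linalg.norm(centroids[remaining] - last, axis=1)
        order.append(remaining.pop(int(np.argmin(dists))))
    return order

colonies = np.array([[10.0, 5.0], [2.0, 3.0], [8.0, 9.0], [1.0, 1.0]])
print(visiting_order(colonies))   # [0, 2, 1, 3] for these coordinates
```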

  11. Programming Algorithms of load balancing with HA-Proxy in HTTP services

    Directory of Open Access Journals (Sweden)

    José Teodoro Mejía Viteri

    2018-02-01

    Full Text Available Access to public and private services through the web gains importance daily, and such services must sometimes support more requests than a single machine can process, so solutions exist that use algorithms to distribute the request load of a web application across several machines. The objective of this work is to analyse load-balancing scheduling algorithms through the HA-Proxy tool and to deliver an instrument that identifies the load-distribution algorithm to be used and the technological infrastructure, in order to cover most of the implementation. The information used in this work is based on a bibliographic analysis, a field study and the implementation of the different load-balancing algorithms on real equipment, where the distribution and its performance are analysed. Incorporating this technology into the management of web services improves availability and supports business continuity, and the different request-distribution algorithms that can be implemented in HA-Proxy give those responsible for information technology systems a view of their advantages and disadvantages.
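    As an illustration of two scheduling disciplines that HA-Proxy can be configured to use, round-robin and least-connections, the following sketch models only the server-selection decision in Python; it is not HA-Proxy code and omits health checks, weights and persistence.

```python
from itertools import count

class Backend:
    def __init__(self, name):
        self.name = name
        self.active_connections = 0

def round_robin(backends, _counter=count()):
    """Each new request goes to the next server in turn."""
    return backends[next(_counter) % len(backends)]

def least_connections(backends):
    """Each new request goes to the server currently handling the fewest connections."""
    return min(backends, key=lambda b: b.active_connections)

servers = [Backend("web1"), Backend("web2"), Backend("web3")]
servers[0].active_connections = 12
servers[1].active_connections = 3
servers[2].active_connections = 7

print(round_robin(servers).name)        # web1, then web2, web3, web1, ...
print(least_connections(servers).name)  # web2
```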

  12. Integrated conception of hardware/software mixed systems used in nuclear instrumentation

    International Nuclear Information System (INIS)

    Dias, Ailton F.; Sorel, Yves; Akil, Mohamed

    2002-01-01

    Hardware/software codesign carries out the design of systems composed of a hardware portion, with specific components, and a software portion, with a microprocessor-based architecture. This paper describes the Algorithm Architecture Adequation (AAA) design methodology, originally oriented to programmable multicomponent architectures, its extension to reconfigurable circuits, and its application to the design and development of nuclear instrumentation systems composed of programmable and configurable circuits. The AAA methodology uses a unified model, based on graph theory, to describe the algorithm, the architecture and the implementation. The great advantage of the AAA methodology is the use of the same model from the specification to the implementation of hardware/software systems, reducing the complexity and design time. (author)

  13. Elements of an algorithm for optimizing a parameter-structural neural network

    Science.gov (United States)

    Mrówczyńska, Maria

    2016-06-01

    The field of processing information provided by measurement results is one of the most important components of geodetic technologies. The dynamic development of this field improves classic numerical algorithms in areas where analytical solutions are difficult to achieve. Algorithms based on artificial intelligence in the form of artificial neural networks, including the topology of connections between neurons, have become an important instrument for the processing and modelling of processes. This concept results from the integration of neural networks and parameter optimization methods and makes it possible to avoid the necessity of arbitrarily defining the structure of a network. This kind of extension of the training process is exemplified by the algorithm called the Group Method of Data Handling (GMDH), which belongs to the class of evolutionary algorithms. The article presents a GMDH-type network used for modelling deformations of the geometrical axis of a steel chimney during its operation.
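    A compact sketch of a single GMDH layer is given below: every pair of inputs is combined through a quadratic partial description fitted by least squares on a training subset, and candidates are ranked by their error on a separate selection subset (the external criterion). The data and parameter values are illustrative.

```python
import numpy as np
from itertools import combinations

def quadratic_features(x1, x2):
    """Ivakhnenko polynomial terms for one pair of inputs."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

def gmdh_layer(X_train, y_train, X_sel, y_sel, keep=4):
    """One layer of the Group Method of Data Handling.

    Fits a quadratic partial model for every pair of input columns on the
    training subset and keeps the `keep` models with the smallest error on
    the selection subset (the external criterion that limits overfitting).
    """
    candidates = []
    for i, j in combinations(range(X_train.shape[1]), 2):
        A = quadratic_features(X_train[:, i], X_train[:, j])
        coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
        pred_sel = quadratic_features(X_sel[:, i], X_sel[:, j]) @ coef
        err = np.mean((pred_sel - y_sel) ** 2)
        candidates.append((err, (i, j), coef))
    candidates.sort(key=lambda c: c[0])
    return candidates[:keep]

# Synthetic deformation-like data: target depends nonlinearly on two of four inputs
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 4))
y = 0.5 * X[:, 0] ** 2 + X[:, 0] * X[:, 2] + rng.normal(0, 0.05, 200)
best = gmdh_layer(X[:100], y[:100], X[100:], y[100:])
print([(pair, round(err, 4)) for err, pair, _ in best])
```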

  14. ALGORITHMIC support for THE System Wide Information Management concept

    OpenAIRE

    2016-01-01

    The theoretical problems of computer support for the "System Wide Information Management" concept, which was proposed by experts of the International Civil Aviation Organization, are discussed. Within the framework of its provisions, certain new requirements are formulated for all initial stages of air traffic management that precede direct aircraft control. Algorithmic instruments for ensuring that a summary plan for the use of airspace remains conflict-free during the plan's implementation are ...

  15. Zombie algorithms: a timesaving remote sensing systems engineering tool

    Science.gov (United States)

    Ardanuy, Philip E.; Powell, Dylan C.; Marley, Stephen

    2008-08-01

    In modern horror fiction, zombies are generally undead corpses brought back from the dead by supernatural or scientific means, and are rarely under anyone's direct control. They typically have very limited intelligence, and hunger for the flesh of the living [1]. Typical spectroradiometric or hyperspectral instruments provide calibrated radiances for a number of remote sensing algorithms. The algorithms typically must meet specified latency and availability requirements while yielding products at the required quality. These systems, whether research, operational, or a hybrid, are typically cost constrained. Complexity of the algorithms can be high, and may evolve and mature over time as sensor characterization changes, product validation occurs, and areas of scientific basis improvement are identified and completed. This suggests the need for a systems engineering process for algorithm maintenance that is agile, cost efficient, repeatable, and predictable. Experience on remote sensing science data systems suggests the benefits of "plug-n-play" concepts of operation. The concept, while intuitively simple, can be challenging to implement in practice. The use of zombie algorithms (empty shells that outwardly resemble the form, fit, and function of a "complete" algorithm without the implemented theoretical basis) provides the ground systems advantages equivalent to those obtained by integrating sensor engineering models onto the spacecraft bus. Combined with a mature, repeatable process for incorporating the theoretical basis, or scientific core, into the "head" of the zombie algorithm, along with associated scripting and registration, this provides an easy "on ramp" for the rapid and low-risk integration of scientific applications into operational systems.

  16. Astrometry and early astrophysics at Kuffner Observatory in the late 19th century

    Science.gov (United States)

    Habison, Peter

    The astronomer and mathematician Norbert Herz encouraged Moriz von Kuffner, owner of the beer brewery in Ottakring, to finance a private scientific observatory in the western parts of Vienna. In the years 1884-87 the Kuffner Observatory was built at the Gallitzinberg in Wien-Ottakring. It was an example of enlighted patronage and noted at the time for its rapid acquisition of new instruments and by increasing international recognition. It contained the largest heliometer in the world and the largest meridian circle in the Austrian-Hungarian Empire. Of the many scientists who worked here we mention Leo de Ball, Gustav Eberhard, Johannes Hartmann and we should not forget Karl Schwarzschild. Here in Vienna he published papers on celestial mechanics, measuring techniques, optics and his fundamental papers concerning photographic photometry, in particular the quantitative determination of the departure of the reciprocity law. The telescope and the associated camera with which he carried out his measurements are still in existence at the observatory. The observatory houses important astronomical instruments from the 19th century. All telescopes were made by Repsold und Söhne in Hamburg, and Steinheil in Munich. These two German companies were best renowned for quality and precision in high standard astronomical instruments. The Great Refractor (270/3500 mm) is still the third largest refractor in Austria. It was installed at the observatory in 1886 and was used together with the Schwarzschild Refractor for early astrophysical work including photography. It is this double refractor, where Schwarzschild carried out his measurements on photographic photometry. The Meridian Circle (132/1500 mm) was the largest meridian passage instrument of the Austro-Hungarian Empire. Today it is the largest meridian circle in Austria and still one of the largest in Europe. The telescope is equipped with one of the first impersonal micrometers of that time. First observations were carried

  17. Reusing Joint Polar Satellite System (jpss) Ground System Components to Process AURA Ozone Monitoring Instrument (omi) Science Products

    Science.gov (United States)

    Moses, J. F.; Jain, P.; Johnson, J.; Doiron, J. A.

    2017-12-01

    New Earth observation instruments are planned to enable advancements in Earth science research over the next decade. Diversity of Earth observing instruments and their observing platforms will continue to increase as new instrument technologies emerge and are deployed as part of National programs such as Joint Polar Satellite System (JPSS), Geostationary Operational Environmental Satellite system (GOES), Landsat as well as the potential for many CubeSat and aircraft missions. The practical use and value of these observational data often extends well beyond their original purpose. The practicing community needs intuitive and standardized tools to enable quick unfettered development of tailored products for specific applications and decision support systems. However, the associated data processing system can take years to develop and requires inherent knowledge and the ability to integrate increasingly diverse data types from multiple sources. This paper describes the adaptation of a large-scale data processing system built for supporting JPSS algorithm calibration and validation (Cal/Val) node to a simplified science data system for rapid application. The new configurable data system reuses scalable JAVA technologies built for the JPSS Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) system to run within a laptop environment and support product generation and data processing of AURA Ozone Monitoring Instrument (OMI) science products. Of particular interest are the root requirements necessary for integrating experimental algorithms and Hierarchical Data Format (HDF) data access libraries into a science data production system. This study demonstrates the ability to reuse existing Ground System technologies to support future missions with minimal changes.

  18. Multi-Instrument Investigation of Ionospheric Flow Channels and Their Impact on the Ionosphere and Thermosphere during Geomagnetic Storms

    Science.gov (United States)

    2017-12-29

    AFRL-AFOSR-JP-TR-2018-0009: Multi-instrument investigation of ionospheric flow channels and their impact on the ionosphere and thermosphere during geomagnetic storms. ... Experiment) and GOCE (Gravity field and steady-state Ocean Circulation Explorer) satellite data. We also created a series of computer algorithms to

  19. A new algorithm to build bridges between two patient-reported health outcome instruments: the MOS SF-36® and the VR-12 Health Survey.

    Science.gov (United States)

    Selim, Alfredo; Rogers, William; Qian, Shirley; Rothendler, James A; Kent, Erin E; Kazis, Lewis E

    2018-04-19

    To develop bridging algorithms to score the Veterans Rand-12 (VR-12) scales for comparability to those of the SF-36® for facilitating multi-cohort studies using data from the National Cancer Institute Surveillance, Epidemiology, and End Results Program (SEER) linked to Medicare Health Outcomes Survey (MHOS), and to provide a model for minimizing non-statistical error in pooled analyses stemming from changes to survey instruments over time. Observational study of MHOS cohorts 1-12 (1998-2011). We modeled 2-year follow-up SF-36 scale scores from cohorts 1-6 based on baseline SF-36 scores, age, and gender, yielding 100 clusters using Classification and Regression Trees. Within each cluster, we averaged follow-up SF-36 scores. Using the same cluster specifications, expected follow-up SF-36 scores, based on cohorts 1-6, were computed for cohorts 7-8 (where the VR-12 was the follow-up survey). We created a new criterion validity measure, termed "extensibility," calculated from the square root of the mean square difference between expected SF-36 scale averages and observed VR-12 item score from cohorts 7-8, weighted by cluster size. VR-12 items were rescored to minimize this quantity. Extensibility of rescored VR-12 items and scales was considerably improved from the "simple" scoring method for comparability to the SF-36 scales. The algorithms are appropriate across a wide range of potential subsamples within the MHOS and provide robust application for future studies that span the SF-36 and VR-12 eras. It is possible that these surveys in a different setting outside the MHOS, especially in younger age groups, could produce somewhat different results.
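    The extensibility criterion described above reduces to a cluster-size-weighted root-mean-square difference; a minimal sketch of that computation is shown below with illustrative numbers, the cluster assignments themselves being produced by the CART model described in the abstract.

```python
import numpy as np

def extensibility(expected_sf36, observed_vr12, cluster_sizes):
    """Weighted RMS difference between expected SF-36 cluster averages and
    observed (rescored) VR-12 cluster averages; smaller values indicate
    better comparability.

    expected_sf36, observed_vr12 : per-cluster mean scores (same cluster order)
    cluster_sizes                : number of respondents in each cluster
    """
    expected = np.asarray(expected_sf36, dtype=float)
    observed = np.asarray(observed_vr12, dtype=float)
    w = np.asarray(cluster_sizes, dtype=float)
    return np.sqrt(np.sum(w * (expected - observed) ** 2) / np.sum(w))

# Illustrative values for five clusters
expected = [42.1, 48.7, 51.3, 55.0, 60.2]
observed = [41.5, 49.2, 50.8, 55.9, 59.7]
sizes = [120, 340, 510, 280, 150]
print(extensibility(expected, observed, sizes))
```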

  20. The solar neighborhood. XXXI. Discovery of an unusual red+white dwarf binary at ∼25 pc via astrometry and UV imaging

    Energy Technology Data Exchange (ETDEWEB)

    Jao, Wei-Chun; Henry, Todd J.; Winters, Jennifer G.; Gies, Douglas R. [Department of Physics and Astronomy, Georgia State University, Atlanta, GA 30302 (United States); Subasavage, John P. [US Naval Observatory, Flagstaff Station, 10391 West Naval Observatory Road, Flagstaff, AZ 86001 (United States); Riedel, Adric R. [Department of Physics and Astronomy, Hunter College, 695 Park Avenue, New York, NY 10065 (United States); Ianna, Philip A., E-mail: jao@chara.gsu.edu, E-mail: thenry@chara.gsu.edu, E-mail: winters@chara.gsu.edu, E-mail: gies@chara.gsu.edu, E-mail: jsubasavage@nofs.navy.mil, E-mail: ar494@hunter.cuny.edu, E-mail: philianna3@gmail.com [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States)

    2014-01-01

    We report the discovery of a nearby M5.0V dwarf at 24.6 pc, SCR 1848–6855, that is orbited by an unusual companion causing an astrometric perturbation of more than 200 mas. This is by far the largest perturbation found to date among more than 700 targets observed during our long-term astrometry/photometry program at the CTIO 0.9 m telescope. We present here a suite of astrometric, photometric, and spectroscopic observations of this high proper motion (∼1.″3 yr⁻¹) system in an effort to reveal the nature of this unusual binary. The measured near-UV and optical U band fluxes exceed those expected for comparable M5.0V stars, and excess flux is also detected in the spectral range 4000-7000 Å. The elusive companion has been detected in HST-STIS+MAMA images at 1820 Å and 2700 Å, and our analysis shows that it is probably a rare, cool, white dwarf with T = 4600-5500 K. Given the long-term astrometric coverage, the prospects for an accurate mass determination are excellent, although as yet we can only provide limits on the unusual companion's mass.

  1. Algebraic dynamics algorithm: Numerical comparison with Runge-Kutta algorithm and symplectic geometric algorithm

    Institute of Scientific and Technical Information of China (English)

    WANG ShunJin; ZHANG Hua

    2007-01-01

    Based on the exact analytical solution of ordinary differential equations, a truncation of the Taylor series of the exact solution to the Nth order leads to the Nth order algebraic dynamics algorithm. A detailed numerical comparison is presented with Runge-Kutta algorithm and symplectic geometric algorithm for 12 test models. The results show that the algebraic dynamics algorithm can better preserve both geometrical and dynamical fidelity of a dynamical system at a controllable precision, and it can solve the problem of algorithm-induced dissipation for the Runge-Kutta algorithm and the problem of algorithm-induced phase shift for the symplectic geometric algorithm.

  2. Algebraic dynamics algorithm:Numerical comparison with Runge-Kutta algorithm and symplectic geometric algorithm

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Based on the exact analytical solution of ordinary differential equations, a truncation of the Taylor series of the exact solution to the Nth order leads to the Nth order algebraic dynamics algorithm. A detailed numerical comparison is presented with Runge-Kutta algorithm and symplectic geometric algorithm for 12 test models. The results show that the algebraic dynamics algorithm can better preserve both geometrical and dynamical fidelity of a dynamical system at a controllable precision, and it can solve the problem of algorithm-induced dissipation for the Runge-Kutta algorithm and the problem of algorithm-induced phase shift for the symplectic geometric algorithm.
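    For a linear test model dy/dt = Ay, the Nth-order truncation of the Taylor series of the exact solution takes a particularly simple form, which makes the comparison with a Runge-Kutta scheme easy to reproduce; the sketch below covers only this linear special case and is not the general algorithm of the paper.

```python
import numpy as np

def taylor_step(A, y, h, order=6):
    """One step of an Nth-order Taylor scheme for y' = A y:
    y_{n+1} = sum_{k=0}^{N} h^k/k! * A^k y_n."""
    term, result = y.copy(), y.copy()
    for k in range(1, order + 1):
        term = (h / k) * (A @ term)
        result = result + term
    return result

def rk4_step(A, y, h):
    """Classical fourth-order Runge-Kutta step for y' = A y."""
    k1 = A @ y
    k2 = A @ (y + 0.5 * h * k1)
    k3 = A @ (y + 0.5 * h * k2)
    k4 = A @ (y + h * k3)
    return y + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Harmonic oscillator written as a linear system; the exact flow conserves energy
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
h, steps = 0.1, 1000
y_taylor = np.array([1.0, 0.0])
y_rk4 = np.array([1.0, 0.0])
for _ in range(steps):
    y_taylor = taylor_step(A, y_taylor, h, order=6)
    y_rk4 = rk4_step(A, y_rk4, h)
energy = lambda y: 0.5 * (y[0] ** 2 + y[1] ** 2)
print(energy(y_taylor) - 0.5, energy(y_rk4) - 0.5)   # drift of the conserved quantity
```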

  3. Design and implementation of embedded ion mobility spectrometry instrument based on SOPC

    Science.gov (United States)

    Zhang, Genwei; Zhao, Jiang; Yang, Liu; Liu, Bo; Jiang, Yanwei; Yang, Jie

    2015-02-01

    On a hardware platform with a single CYCLONE IV FPGA chip based on SOPC technology, the control functions of the IP cores of an Ion Mobility Spectrometry instrument were tested, including a 32-bit Nios II soft-core processor, high-voltage module, ion gate switch, gas flow, temperature and pressure sensors, signal acquisition and the communication protocol. The embedded operating system μCLinux was successfully ported to the hardware platform and used to schedule all the tasks, such as system initialization, parameter setting, signal processing, the recognition algorithm and results display. The system was validated using the ion mobility spectrum of an acetone reagent, and the instrument was shown to have good signal resolution.

  4. Development of the Landsat Data Continuity Mission Cloud Cover Assessment Algorithms

    Science.gov (United States)

    Scaramuzza, Pat; Bouchard, M.A.; Dwyer, John L.

    2012-01-01

    The upcoming launch of the Operational Land Imager (OLI) will start the next era of the Landsat program. However, the Automated Cloud-Cover Assessment (ACCA) algorithm used on Landsat 7 requires a thermal band and is thus not suited for OLI. There will be a thermal instrument on the Landsat Data Continuity Mission (LDCM), the Thermal Infrared Sensor, which may not be available during all OLI collections. This illustrates a need for cloud-cover assessment (CCA) for LDCM in the absence of thermal data. To research possibilities for full-resolution OLI cloud assessment, a global data set of 207 Landsat 7 scenes with manually generated cloud masks was created. It was used to evaluate the ACCA algorithm, showing that the algorithm correctly classified 79.9% of a standard test subset of 3.95 × 10⁹ pixels. The data set was also used to develop and validate two successor algorithms for use with OLI data: one derived from an off-the-shelf machine learning package and one based on ACCA but enhanced by a simple neural network. These comprehensive CCA algorithms were shown to correctly classify pixels as cloudy or clear 88.5% and 89.7% of the time, respectively.

  5. The Algorithm for Algorithms: An Evolutionary Algorithm Based on Automatic Designing of Genetic Operators

    Directory of Open Access Journals (Sweden)

    Dazhi Jiang

    2015-01-01

    Full Text Available At present there is a wide range of evolutionary algorithms available to researchers and practitioners. Despite the great diversity of these algorithms, virtually all of the algorithms share one feature: they have been manually designed. A fundamental question is “are there any algorithms that can design evolutionary algorithms automatically?” A more complete definition of the question is “can computer construct an algorithm which will generate algorithms according to the requirement of a problem?” In this paper, a novel evolutionary algorithm based on automatic designing of genetic operators is presented to address these questions. The resulting algorithm not only explores solutions in the problem space like most traditional evolutionary algorithms do, but also automatically generates genetic operators in the operator space. In order to verify the performance of the proposed algorithm, comprehensive experiments on 23 well-known benchmark optimization problems are conducted. The results show that the proposed algorithm can outperform standard differential evolution algorithm in terms of convergence speed and solution accuracy which shows that the algorithm designed automatically by computers can compete with the algorithms designed by human beings.

  6. Adaptive local backlight dimming algorithm based on local histogram and image characteristics

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Burini, Nino; Korhonen, Jari

    2013-01-01

    Liquid Crystal Displays (LCDs) with Light Emitting Diode (LED) backlight are a very popular display technology, used for instance in television sets, monitors and mobile phones. This paper presents a new backlight dimming algorithm that exploits the characteristics of the target image, such as the local histograms and the average pixel intensity of each backlight segment, to reduce the power consumption of the backlight and enhance image quality. The local histogram of the pixels within each backlight segment is calculated and, based on this average, an adaptive quantile value is extracted; the proposed approach offers a better trade-off between power consumption and image quality preservation than the other algorithms representing the state of the art among feature-based backlight algorithms.
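    A minimal sketch of the local-histogram idea is given below: each backlight segment is dimmed to an adaptive quantile of its own pixel intensities, with the quantile chosen from the segment's average brightness. The grid size and the mapping from mean intensity to quantile are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def backlight_levels(image, grid=(4, 6), q_dark=0.90, q_bright=0.999):
    """Per-segment LED backlight levels from local histograms.

    The grayscale image (values in [0, 1]) is divided into grid[0] x grid[1]
    backlight segments. Dark segments use a lower quantile (more dimming,
    less power), bright segments a higher quantile (fewer clipped pixels).
    """
    rows, cols = grid
    h, w = image.shape
    levels = np.zeros(grid)
    for r in range(rows):
        for c in range(cols):
            seg = image[r * h // rows:(r + 1) * h // rows,
                        c * w // cols:(c + 1) * w // cols]
            mean = seg.mean()                       # average pixel intensity of the segment
            q = q_dark + (q_bright - q_dark) * mean # adaptive quantile
            levels[r, c] = np.quantile(seg, q)
    return levels

# Synthetic frame: dark background with one bright region
img = np.full((240, 360), 0.05)
img[40:120, 200:300] = 0.9
print(backlight_levels(img).round(2))
```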

  7. Physical Behavior in Older Persons during Daily Life: Insights from Instrumented Shoes

    Directory of Open Access Journals (Sweden)

    Christopher Moufawad el Achkar

    2016-08-01

    Full Text Available Activity level and gait parameters during daily life are important indicators for clinicians because they can provide critical insights into modifications of mobility and function over time. Wearable activity monitoring has been gaining momentum in daily life health assessment. Consequently, this study seeks to validate an algorithm for the classification of daily life activities and to provide a detailed gait analysis in older adults. A system consisting of an inertial sensor combined with a pressure-sensing insole has been developed. Using an algorithm that we previously validated during a semi-structured protocol, activities in 10 healthy elderly participants were recorded and compared to a wearable reference system over a 4 h recording period at home. Detailed gait parameters were calculated from inertial sensors. Dynamics of physical behavior were characterized using barcodes that express the measure of behavioral complexity. Activity classification based on the algorithm led to a 93% accuracy in classifying basic activities of daily life, i.e., sitting, standing, and walking. Gait analysis emphasizes the importance of metrics such as foot clearance in daily life assessment. Results also underline that measures of physical behavior and gait performance are complementary, especially since gait parameters were not correlated to complexity. Participants gave positive feedback regarding the use of the instrumented shoes. These results extend previous observations in showing the concurrent validity of the instrumented shoes compared to a body-worn reference system for daily-life physical behavior monitoring in older adults.
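
    For illustration only, a toy windowed decision rule in the spirit of classifying sitting, standing and walking from an inertial sensor plus a pressure-sensing insole; the feature names and thresholds are assumptions, not the validated algorithm:

```python
def classify_window(accel_std_g, mean_insole_load):
    """Toy decision rule over a short time window.
    accel_std_g: standard deviation of foot acceleration (in g)
    mean_insole_load: mean normalized plantar load (0 = unloaded, 1 = full body weight)
    Thresholds are illustrative only."""
    if accel_std_g > 0.15:
        return "walking"          # large movement energy
    if mean_insole_load > 0.5:
        return "standing"         # quiet but loaded insole
    return "sitting"              # quiet and mostly unloaded

print(classify_window(0.02, 0.1))   # -> sitting
print(classify_window(0.30, 0.8))   # -> walking
```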

  8. Instruments to Identify Commercially Sexually Exploited Children: Feasibility of Use in an Emergency Department Setting.

    Science.gov (United States)

    Armstrong, Stephanie

    2017-12-01

    This review examines the screening instruments that are in existence today to identify commercially sexually exploited children. The instruments are compared and evaluated for their feasibility of use in an emergency department setting. Four electronic databases were searched to identify screening instruments that assessed solely for commercial sexual exploitation. Search terms included "commercially sexually exploited children," "CSEC," "domestic minor sex trafficking," "DMST," "juvenile sex trafficking," and "JST." Those terms were then searched in combination with each of the following: "tools," "instruments," "screening," "policies," "procedures," "data collection," "evidence," and "validity." Six screening instruments were found to meet the inclusion criteria. Variation among instruments included number of questions, ease of administration, information sources, scoring methods, and training information provided. Two instruments were determined to be highly feasible for use in the emergency department setting, those being the Asian Health Services and Banteay Srei's CSEC Screening Protocol and Greenbaum et al's CSEC/child sex trafficking 6-item screening tool. A current dearth of screening instruments was confirmed. It is recommended that additional screening instruments be created to include developmentally appropriate instruments for preadolescent children. Numerous positive features were identified within the instruments in this review and are suggested for use in future screening instruments, including succinctness, a simple format, easy administration, training materials, sample questions, multiple information sources, designation of questions requiring mandatory reporting, a straightforward scoring system, and an algorithm format.

  9. "Fibromyalgia and quality of life: mapping the revised fibromyalgia impact questionnaire to the preference-based instruments".

    Science.gov (United States)

    Collado-Mateo, Daniel; Chen, Gang; Garcia-Gordillo, Miguel A; Iezzi, Angelo; Adsuar, José C; Olivares, Pedro R; Gusi, Narcis

    2017-05-30

    The revised version of the Fibromyalgia Impact Questionnaire (FIQR) is one of the most widely used specific questionnaires in FM studies. However, this questionnaire does not allow calculation of QALYs as it is not a preference-based measure. The aim of this study was to develop mapping algorithms which enable FIQR scores to be transformed into utility scores that can be used in cost-utility analyses. A cross-sectional survey was conducted. One hundred and ninety-two Spanish women with Fibromyalgia were asked to complete four general quality of life questionnaires, i.e. EQ-5D-5L, 15D, AQoL-8D and SF-12, and one disease-specific instrument, the FIQR. A direct mapping approach was adopted to derive mapping algorithms between the FIQR and each of the four multi-attribute utility (MAU) instruments. Health state utility was treated as the dependent variable in the regression analysis, whilst the FIQR score and age were predictors. The mean utility scores ranged from 0.47 (AQoL-8D) to 0.69 (15D). All correlations between the FIQR total score and MAU instruments utility scores were highly significant (p fibromyalgia specific questionnaire.
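
    A minimal sketch of the direct mapping approach on synthetic data, regressing a utility score on the FIQR total score and age with ordinary least squares; the generated data and fitted coefficients are purely illustrative and are not the study's published algorithms:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 192
fiqr = rng.uniform(0, 100, n)            # FIQR total score (0-100)
age = rng.uniform(30, 70, n)             # years
# Synthetic "true" utilities: higher FIQR (worse impact) -> lower utility
utility = 0.9 - 0.004 * fiqr - 0.001 * (age - 50) + rng.normal(0, 0.05, n)

X = np.column_stack([fiqr, age])
model = LinearRegression().fit(X, utility)
print("intercept:", model.intercept_, "coefficients:", model.coef_)
# Transform a new FIQR score into a utility usable for QALY calculations
print("predicted utility:", model.predict([[60.0, 55.0]])[0])
```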

  10. An Algorithm For Climate-Quality Atmospheric Profiling Continuity From EOS Aqua To Suomi-NPP

    Science.gov (United States)

    Moncet, J. L.

    2015-12-01

    We will present results from an algorithm that is being developed to produce climate-quality atmospheric profiling earth system data records (ESDRs) for application to hyperspectral sounding instrument data from Suomi-NPP, EOS Aqua, and other spacecraft. The current focus is on data from the S-NPP Cross-track Infrared Sounder (CrIS) and Advanced Technology Microwave Sounder (ATMS) instruments as well as the Atmospheric InfraRed Sounder (AIRS) on EOS Aqua. The algorithm development at Atmospheric and Environmental Research (AER) has common heritage with the optimal estimation (OE) algorithm operationally processing S-NPP data in the Interface Data Processing Segment (IDPS), but the ESDR algorithm has a flexible, modular software structure to support experimentation and collaboration and has several features adapted to the climate orientation of ESDRs. Data record continuity benefits from the fact that the same algorithm can be applied to different sensors, simply by providing suitable configuration and data files. The radiative transfer component uses an enhanced version of optimal spectral sampling (OSS) with updated spectroscopy, treatment of emission that is not in local thermodynamic equilibrium (non-LTE), efficiency gains with "global" optimal sampling over all channels, and support for channel selection. The algorithm is designed for adaptive treatment of clouds, with capability to apply "cloud clearing" or simultaneous cloud parameter retrieval, depending on conditions. We will present retrieval results demonstrating the impact of a new capability to perform the retrievals on sigma or hybrid vertical grid (as opposed to a fixed pressure grid), which particularly affects profile accuracy over land with variable terrain height and with sharp vertical structure near the surface. In addition, we will show impacts of alternative treatments of regularization of the inversion. While OE algorithms typically implement regularization by using background estimates from

  11. Current Status of the Validation of the Atmospheric Chemistry Instruments on Envisat

    Science.gov (United States)

    Lecomte, P.; Koopman, R.; Zehner, C.; Laur, H.; Attema, E.; Wursteisen, P.; Snoeij, P.

    2003-04-01

    Envisat is ESA's advanced Earth observing satellite launched in March 2002 and is designed to provide measurements of the atmosphere, ocean, land and ice over a five-year period. After the launch and the switch-on period, a six-month commissioning phase has taken place for instrument calibration and geophysical validation, concluded with the Envisat Calibration Review held in September 2002. In addition to ESA and its industrial partners in the Envisat consortium, many other companies and research institutes have contributed to the calibration and validation programme under ESA contract as expert support laboratories (ESLs). A major contribution has also been made by the Principal Investigators of approved proposals submitted to ESA in response to a worldwide "Announcement of Opportunity for the Exploitation of the Envisat Data Products" in 1998. Working teams have been formed in which the different participants worked side by side to achieve the objectives of the calibration and validation programme. Validation is a comparison of Envisat level-2 data products and estimates of the different geophysical variables obtained by independent means, the validation instruments. Validation is closely linked to calibration because inconsistencies discovered in the comparison of Envisat Level 2 data products to well-known external instruments can have many different sources, including inaccuracies of the Envisat instrument calibration and the data calibration algorithms. Therefore, initial validation of the geophysical variables has provided feedback to calibration, de-bugging and algorithm improvement. The initial validation phase ended in December 2002 with the Envisat Validation Workshop at which, for a number of products, a final quality statement was given. Full validation of all data products available from the Atmospheric Chemistry Instruments on Envisat (MIPAS, GOMOS and SCIAMACHY) is quite a challenge and therefore it has been decided to adopt a step-wise approach

  12. Mathematical model of rhodium self-powered detectors and algorithms for correction of their time delay

    International Nuclear Information System (INIS)

    Bur'yan, V.I.; Kozlova, L.V.; Kuzhil', A.S.; Shikalov, V.F.

    2005-01-01

    The development of algorithms for correction of self-powered neutron detector (SPND) inertia is caused by the necessity to increase the fast response of the in-core instrumentation systems (ICIS). The increase of ICIS fast response will permit real-time monitoring of fast transient processes in the core and, in perspective, use of the signals of rhodium SPNDs for emergency protection functions based on local parameters. In this paper it is proposed to use a mathematical model of neutron flux measurement by means of SPNDs in integral form for the creation of correction algorithms. This approach is, in this case, the most convenient for the creation of recurrent algorithms for flux estimation. The results of a comparison of neutron flux and reactivity estimates obtained from ionization chamber readings and from SPND signals corrected by the proposed algorithms are presented.
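
    A much-simplified sketch of a dynamic (inertia) correction, assuming a single first-order delay between flux and detector current; real rhodium SPNDs have prompt and two delayed components, so the time constant and sensitivity below are illustrative only:

```python
import numpy as np

def correct_spnd(signal, dt, tau=61.0, sensitivity=1.0):
    """Dynamic correction of a self-powered neutron detector signal,
    assuming a single first-order delay  dI/dt = (S*phi - I)/tau,
    so that  phi ~ (I + tau*dI/dt)/S.  Rhodium SPNDs actually have two
    decay branches (Rh-104 and Rh-104m); tau = 61 s (the Rh-104 mean
    life) and the unit sensitivity are illustrative assumptions."""
    dIdt = np.gradient(signal, dt)
    return (signal + tau * dIdt) / sensitivity

# Synthetic step change in flux, seen by the detector through the delay
dt, t = 1.0, np.arange(0, 600, 1.0)
phi_true = np.where(t < 100, 1.0, 1.5)
sig = np.zeros_like(t)
for k in range(1, len(t)):
    sig[k] = sig[k-1] + dt * (phi_true[k-1] - sig[k-1]) / 61.0
print(correct_spnd(sig, dt)[-1])   # ~1.5, the corrected flux after the step
```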

  13. Toward automated assessment of health Web page quality using the DISCERN instrument.

    Science.gov (United States)

    Allam, Ahmed; Schulz, Peter J; Krauthammer, Michael

    2017-05-01

    As the Internet becomes the number one destination for obtaining health-related information, there is an increasing need to identify health Web pages that convey an accurate and current view of medical knowledge. In response, the research community has created multicriteria instruments for reliably assessing online medical information quality. One such instrument is DISCERN, which measures health Web page quality by assessing an array of features. In order to scale up use of the instrument, there is interest in automating the quality evaluation process by building machine learning (ML)-based DISCERN Web page classifiers. The paper addresses 2 key issues that are essential before constructing automated DISCERN classifiers: (1) generation of a robust DISCERN training corpus useful for training classification algorithms, and (2) assessment of the usefulness of the current DISCERN scoring schema as a metric for evaluating the performance of these algorithms. Using DISCERN, 272 Web pages discussing treatment options in breast cancer, arthritis, and depression were evaluated and rated by trained coders. First, different consensus models were compared to obtain a robust aggregated rating among the coders, suitable for a DISCERN ML training corpus. Second, a new DISCERN scoring criterion was proposed (features-based score) as an ML performance metric that is more reflective of the score distribution across different DISCERN quality criteria. First, we found that a probabilistic consensus model applied to the DISCERN instrument was robust against noise (random ratings) and superior to other approaches for building a training corpus. Second, we found that the established DISCERN scoring schema (overall score) is ill-suited to measure ML performance for automated classifiers. Use of a probabilistic consensus model is advantageous for building a training corpus for the DISCERN instrument, and use of a features-based score is an appropriate ML metric for automated DISCERN

  14. Inspector-instrument interface in portable NDA instrumentation

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.

    1981-01-01

    Recent electronics technology advances make it possible to design sophisticated instruments in small packages for convenient field implementation. An inspector-instrument interface design that allows communication of procedures, responses, and results between the instrument and user is presented. This capability has been incorporated into new spent-fuel instrumentation and a battery-powered multichannel analyzer

  15. Inspector-instrument interface in portable NDA instrumentation

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.

    1981-01-01

    Recent electronics technology advances make it possible to design sophisticated instruments in small packages for convenient field implementation. This report describes an inspector-instrument interface design which allows communication of procedures, responses, and results between the instrument and user. The interface has been incorporated into new spent-fuel instrumentation and a battery-powered multichannel analyzer

  16. Genetic Algorithm and its Application in Optimal Sensor Layout

    Directory of Open Access Journals (Sweden)

    Xiang-Yang Chen

    2015-05-01

    Full Text Available This paper addresses the problem of multi-sensor station distribution, taking multi-sensor systems of different types as the research object. The various sensor types, with their different application backgrounds, performance requirements and constraints, are analysed, and station distribution is studied for each kind of multi-sensor system, with genetic algorithms applied as the tool for optimizing the objective functions of the models, yielding optimal station distribution plans for the various multi-sensor types and improving system performance. For specific problems in the application fields of radar, tracking instruments, satellites and various types of passive positioning equipment, mathematical models relating the performance indicators of interest to the geometry of the station arrangement are set up, and a genetic algorithm is used to obtain optimized station distributions, providing useful help in solving a variety of practical problems and also reflecting the applicability and effectiveness of the improved genetic algorithm for the optimization of multi-sensor station distribution in electronic weapon systems. Finally, the genetic algorithm for integrated optimization of multi-sensor station distribution is applied to realistic training exercise tasks, achieving a good military effect.

  17. Capabilities and prospects of the East Asia Very Long Baseline Interferometry Network

    Science.gov (United States)

    An, T.; Sohn, B. W.; Imai, H.

    2018-02-01

    The very long baseline interferometry (VLBI) technique offers angular resolutions superior to any other instruments at other wavelengths, enabling unique science applications of high-resolution imaging of radio sources and high-precision astrometry. The East Asia VLBI Network (EAVN) is a collaborative effort in the East Asian region. The EAVN currently consists of 21 telescopes with diverse equipment configurations and frequency setups, allowing flexible subarrays for specific science projects. The EAVN provides the highest resolution of 0.5 mas at 22 GHz, allowing the fine imaging of jets in active galactic nuclei, high-accuracy astrometry of masers and pulsars, and precise spacecraft positioning. The soon-to-be-operational Five-hundred-meter Aperture Spherical radio Telescope (FAST) will open a new era for the EAVN. This state-of-the-art VLBI array also provides easy access to and crucial training for the burgeoning Asian astronomical community. This Perspective summarizes the status, capabilities and prospects of the EAVN.

  18. Basic Optics for the Astronomical Sciences

    CERN Document Server

    Breckinridge, James

    2012-01-01

    This text was written to provide students of astronomy and engineers an understanding of optical science - the study of the generation, propagation, control, and measurement of optical radiation - as it applies to telescopes and instruments for astronomical research in the areas of astrophysics, astrometry, exoplanet characterization, and planetary science. The book provides an overview of the elements of optical design and physical optics within the framework of the needs of the astronomical community.

  19. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    Science.gov (United States)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases, and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard Digital Relief Maps (DRMs) will be used to determine the vertical width of the telemetry band about the signal. University of Texas-Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including
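
    The histogramming step can be illustrated with a short sketch: bin the received photon heights, estimate the background level, and flag bins whose counts are improbably high under Poisson statistics. The bin width and threshold below are assumptions, not the ATLAS onboard settings:

```python
import numpy as np

def find_signal_bins(event_heights, bin_width=10.0, nsigma=5.0):
    """Histogram photon event heights (m); surface returns pile up in a
    few bins while solar background is spread roughly uniformly.  Flag
    bins whose counts exceed the background by nsigma Poisson standard
    deviations.  Parameters are illustrative, not flight values."""
    lo, hi = event_heights.min(), event_heights.max()
    bins = np.arange(lo, hi + bin_width, bin_width)
    counts, edges = np.histogram(event_heights, bins=bins)
    background = np.median(counts)                 # robust background estimate
    threshold = background + nsigma * np.sqrt(max(background, 1.0))
    signal = np.flatnonzero(counts > threshold)
    return edges, counts, signal

rng = np.random.default_rng(1)
noise = rng.uniform(0.0, 1000.0, 2000)             # uniform background
surface = rng.normal(350.0, 2.0, 500)              # concentrated surface echo
edges, counts, sig_bins = find_signal_bins(np.concatenate([noise, surface]))
print("signal near", edges[sig_bins])               # bins around ~350 m
```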

  20. Combined Dust Detection Algorithm by Using MODIS Infrared Channels over East Asia

    Science.gov (United States)

    Park, Sang Seo; Kim, Jhoon; Lee, Jaehwa; Lee, Sukjo; Kim, Jeong Soo; Chang, Lim Seok; Ou, Steve

    2014-01-01

    A new dust detection algorithm is developed by combining the results of multiple dust detection methods using IR channels onboard the MODerate resolution Imaging Spectroradiometer (MODIS). The Brightness Temperature Difference (BTD) between two wavelength channels has been used widely in previous dust detection methods. However, BTD methods have limitations in identifying the offset values of the BTD to discriminate clear-sky areas. The current algorithm overcomes the disadvantages of previous dust detection methods by considering the Brightness Temperature Ratio (BTR) values of the dual wavelength channels with a 30-day composite, the optical properties of the dust particles, the variability of surface properties, and the cloud contamination. Therefore, the current algorithm shows improvements in detecting dust-loaded regions over land during daytime. Finally, the confidence index of the current dust algorithm is shown in 10 × 10 pixels of the MODIS observations. From January to June 2006, the results of the current algorithm are within 64 to 81% of those found using the fine mode fraction (FMF) and aerosol index (AI) from the MODIS and Ozone Monitoring Instrument (OMI). The agreement between the results of the current algorithm and the OMI AI over the non-polluted land also ranges from 60 to 67% to avoid errors due to anthropogenic aerosol. In addition, the developed algorithm shows statistically significant results at four AErosol RObotic NETwork (AERONET) sites in East Asia.
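
    As a simplified illustration of the BTD ingredient of such algorithms, a split-window test flags pixels whose 11 um brightness temperature is noticeably lower than the 12 um one; the threshold is an assumption, and the combined algorithm described above adds several further tests on top of this:

```python
import numpy as np

def dust_flag(bt11, bt12, btd_threshold=-0.5):
    """Classic split-window test: mineral dust tends to make the 11 um
    brightness temperature lower than the 12 um one (negative BTD),
    whereas most clouds give a positive BTD.  The threshold is
    illustrative; the combined MODIS algorithm adds BTR values,
    30-day composites and surface/cloud screening."""
    btd = np.asarray(bt11) - np.asarray(bt12)
    return btd < btd_threshold

bt11 = np.array([285.0, 260.0, 290.0])
bt12 = np.array([287.0, 258.0, 289.5])
print(dust_flag(bt11, bt12))   # [ True False False ] -> first pixel dusty
```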

  1. Control architecture for an adaptive electronically steerable flash lidar and associated instruments

    Science.gov (United States)

    Ruppert, Lyle; Craner, Jeremy; Harris, Timothy

    2014-09-01

    An Electronically Steerable Flash Lidar (ESFL), developed by Ball Aerospace & Technologies Corporation, allows real-time adaptive control of configuration and data-collection strategy based on recent or concurrent observations and changing situations. This paper reviews, at a high level, some of the algorithms and control architecture built into ESFL. Using ESFL as an example, it also discusses the merits and utility of such adaptable instruments in Earth-system studies.

  2. Derivation of tensile flow characteristics for austenitic materials from instrumented indentation technique

    International Nuclear Information System (INIS)

    Lee, K-W; Kim, K-H; Kim, J-Y; Kwon, D

    2008-01-01

    In this study, a method for deriving the tensile flow characteristics of austenitic materials from an instrumented indentation technique is presented along with its experimental verification. We proposed a modified algorithm for austenitic materials that takes their hardening behaviour into account. First, a true-strain definition based on the sine function, instead of the tangent function, was adopted. It was proved that the sine function shows a constant degree of hardening, which is a main characteristic of the hardening of austenitic materials. Second, a simple linear constitutive equation was newly suggested to optimize indentation flow curves. The modified approach was experimentally verified by comparing the tensile properties of five austenitic materials obtained from uniaxial tensile tests and instrumented indentation tests.

  3. Toward Accurate On-Ground Attitude Determination for the Gaia Spacecraft

    Science.gov (United States)

    Samaan, Malak A.

    2010-03-01

    The work presented in this paper concerns the accurate On-Ground Attitude (OGA) reconstruction for the astrometry spacecraft Gaia in the presence of disturbance and control torques acting on the spacecraft. The reconstruction of the expected environmental torques which influence the spacecraft dynamics will also be investigated. The telemetry data from the spacecraft will include the on-board real-time attitude, which is accurate to the order of several arcsec. This raw attitude is the starting point for the further attitude reconstruction. The OGA will use as inputs the field coordinates of known stars (attitude stars) and also the field coordinate differences of objects on the Sky Mapper (SM) and Astrometric Field (AF) payload instruments to improve this raw attitude. The on-board attitude determination uses a Kalman Filter (KF) to minimize the attitude errors and produce a more accurate attitude estimate than the pure star tracker measurement. Therefore the first approach for the OGA will be an adapted version of the KF. Furthermore, we will design a batch least squares algorithm to investigate how to obtain a more accurate OGA estimate. Finally, a comparison between these different attitude determination techniques in terms of accuracy, robustness, speed and memory required will be carried out in order to choose the best attitude algorithm for the OGA. The expected resulting accuracy for the OGA determination will be of the order of milli-arcsec.
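
    As a toy analogue of the Kalman filtering step, the scalar sketch below smooths noisy angle measurements with a two-state (angle, rate) filter; the noise variances and units are illustrative assumptions and are unrelated to Gaia's actual OGA processing:

```python
import numpy as np

def scalar_kalman(measurements, dt, q=1e-6, r=4.0):
    """Toy 1-D Kalman filter: state = [angle, rate]; measurements are
    noisy angles (arcsec).  q and r are illustrative process and
    measurement noise variances, not Gaia values."""
    x = np.array([measurements[0], 0.0])
    P = np.eye(2) * 10.0
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    H = np.array([[1.0, 0.0]])
    out = []
    for z in measurements:
        x = F @ x                                   # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r                         # innovation variance
        K = (P @ H.T) / S                           # Kalman gain
        x = x + (K * (z - H @ x)).ravel()           # update
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

t = np.arange(0, 100, 1.0)
truth = 0.05 * t                                    # slowly drifting angle
noisy = truth + np.random.default_rng(2).normal(0, 2.0, t.size)
print(np.std(noisy - truth), np.std(scalar_kalman(noisy, 1.0) - truth))
```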

  4. Expert-guided evolutionary algorithm for layout design of complex space stations

    Science.gov (United States)

    Qian, Zhiqin; Bi, Zhuming; Cao, Qun; Ju, Weiguo; Teng, Hongfei; Zheng, Yang; Zheng, Siyu

    2017-08-01

    The layout of a space station should be designed in such a way that the different equipment and instruments are placed for the station as a whole to achieve the best overall performance. The station layout design is a typical nondeterministic polynomial problem. In particular, how to manage the design complexity to achieve an acceptable solution within a reasonable timeframe poses a great challenge. In this article, a new evolutionary algorithm is proposed to meet this challenge. It is called the expert-guided evolutionary algorithm with tree-like structure decomposition (EGEA-TSD). The two innovations in EGEA-TSD are: (i) to deal with the design complexity, the entire design space is divided into subspaces with a tree-like structure, which reduces the computation and facilitates experts' involvement in the solving process; (ii) a human-intervention interface is developed to allow experts' involvement in avoiding local optimums and accelerating convergence. To validate the proposed algorithm, the layout design of one space station is formulated as a multi-disciplinary design problem, the developed algorithm is programmed and executed, and the result is compared with those from two other algorithms; the comparison illustrates the superior performance of the proposed EGEA-TSD.

  5. Solar Backscatter UV (SBUV) total ozone and profile algorithm

    Directory of Open Access Journals (Sweden)

    P. K. Bhartia

    2013-10-01

    Full Text Available We describe the algorithm that has been applied to develop a 42 yr record of total ozone and ozone profiles from eight Solar Backscatter UV (SBUV) instruments launched on NASA and NOAA satellites since April 1970. The Version 8 (V8) algorithm was released more than a decade ago and has been in use since then at NOAA to produce their operational ozone products. The current algorithm (V8.6) is basically the same as V8, except for updates to instrument calibration, incorporation of new ozone absorption cross-sections, and new ozone and cloud height climatologies. Since the V8 algorithm has been optimized for deriving monthly zonal mean (MZM) anomalies for ozone assessment and model comparisons, our emphasis in this paper is primarily on characterizing the sources of errors that are relevant for such studies. When data are analyzed this way the effect of some errors, such as vertical smoothing of short-term variability, and noise due to clouds and aerosols diminish in importance, while the importance of others, such as errors due to vertical smoothing of the quasi-biennial oscillation (QBO) and other periodic and aperiodic variations, become more important. With V8.6 zonal mean data we now provide smoothing kernels that can be used to compare anomalies in SBUV profile and partial ozone columns with models. In this paper we show how to use these kernels to compare SBUV data with Microwave Limb Sounder (MLS) ozone profiles. These kernels are particularly useful for comparisons in the lower stratosphere where SBUV profiles have poor vertical resolution but partial column ozone values have high accuracy. We also provide our best estimate of the smoothing errors associated with SBUV MZM profiles. Since smoothing errors are the largest source of uncertainty in these profiles, they can be treated as error bars in deriving interannual variability and trends using SBUV data and for comparing with other measurements. In the V8 and V8.6 algorithms we derive total

  6. Vibration condition measure instrument of motor using MEMS accelerometer

    Science.gov (United States)

    Chen, Jun

    2018-04-01

    In this work, a novel vibration condition measurement instrument for motors using a digital micro accelerometer is proposed. In order to reduce the random noise found in the data, a sensor model is established and a Kalman filter (KMF) is developed. From these filtered data, the maximum vibration displacement is calculated by an integration algorithm with the DC bias removed. A high-performance microcontroller unit (MCU) is used in the implementation of the controller. The data are transmitted from the sensor to the controller through the IIC digital interface port. The hardware circuits of the sensor and microcontroller are designed and tested. With the computational formula for maximum displacement and the FFT, high-precision results for displacement and frequency are obtained. Finally, the paper presents various experimental results to prove that this instrument is suitable for application in electrical motor vibration measurement.
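
    A rough sketch of the displacement and frequency estimation described above, with the Kalman filter replaced by simple DC-bias removal for brevity; the sampling rate, test signal and expected outputs are illustrative assumptions:

```python
import numpy as np

def vibration_metrics(accel, fs):
    """Estimate peak displacement and dominant frequency from an
    accelerometer record.  The instrument above uses a Kalman filter;
    here the DC bias is simply removed before a crude cumulative
    double integration (a rough stand-in)."""
    a = accel - np.mean(accel)                     # remove DC bias
    dt = 1.0 / fs
    vel = np.cumsum(a) * dt
    vel -= np.mean(vel)                            # crude drift control
    disp = np.cumsum(vel) * dt
    disp -= np.mean(disp)
    freqs = np.fft.rfftfreq(a.size, dt)
    spectrum = np.abs(np.fft.rfft(a))
    dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    return np.max(np.abs(disp)), dominant

fs, f0 = 1000.0, 50.0                              # Hz
t = np.arange(0, 2.0, 1.0 / fs)
accel = 5.0 * np.sin(2 * np.pi * f0 * t) + 0.2     # m/s^2 with DC offset
print(vibration_metrics(accel, fs))                 # ~ (5e-05 m, 50.0 Hz)
```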

  7. TH-A-17A-01: Innovation in PET Instrumentation and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Casey, M [Siemens Healthcare, Knoxville, Tennessee (United States); Miyaoka, R [University of Washington, Seattle, WA (United States); Shao, Y [University of Texas MD Anderson Cancer Center, Houston, TX (United States)

    2014-06-15

    Innovation in PET instrumentation has led in the new millennium to revolutionary imaging applications for diagnosis, therapeutic guidance, and development of new molecular imaging probes, etc. However, after several decades of innovation, will the advances of PET technology and applications continue with the same trend and pace? What will be the next big thing beyond the PET/CT, PET/MRI, and Time-of-flight PET? How will the PET instrumentation and imaging performance be further improved by novel detector research and advanced imaging system development? Or will the development of new algorithms and methodologies extend the limit of current instrumentation and leapfrog the imaging quality and quantification for practical applications? The objective of this session is to present an overview of the current status and advances in PET instrumentation and applications with speakers from leading academic institutes and a major medical imaging company. Presenting both academic research projects and commercial technology developments, this session will provide a glimpse of some latest advances and challenges in the field, such as using semiconductor photon-sensor based PET detectors to improve performance and enable new applications, as well as the technology trend that may lead to the next breakthrough in PET imaging for clinical and preclinical applications. Both imaging and image-guided therapy subjects will be discussed. Learning Objectives: Describe the latest innovations in PET instrumentation and applications Understand the driving force behind PET instrumentation innovation and development Learn the trend of PET technology development for applications.

  8. TH-A-17A-01: Innovation in PET Instrumentation and Applications

    International Nuclear Information System (INIS)

    Casey, M; Miyaoka, R; Shao, Y

    2014-01-01

    Innovation in PET instrumentation has led in the new millennium to revolutionary imaging applications for diagnosis, therapeutic guidance, and development of new molecular imaging probes, etc. However, after several decades of innovation, will the advances of PET technology and applications continue with the same trend and pace? What will be the next big thing beyond the PET/CT, PET/MRI, and Time-of-flight PET? How will the PET instrumentation and imaging performance be further improved by novel detector research and advanced imaging system development? Or will the development of new algorithms and methodologies extend the limit of current instrumentation and leapfrog the imaging quality and quantification for practical applications? The objective of this session is to present an overview of the current status and advances in PET instrumentation and applications with speakers from leading academic institutes and a major medical imaging company. Presenting both academic research projects and commercial technology developments, this session will provide a glimpse of some latest advances and challenges in the field, such as using semiconductor photon-sensor based PET detectors to improve performance and enable new applications, as well as the technology trend that may lead to the next breakthrough in PET imaging for clinical and preclinical applications. Both imaging and image-guided therapy subjects will be discussed. Learning Objectives: Describe the latest innovations in PET instrumentation and applications Understand the driving force behind PET instrumentation innovation and development Learn the trend of PET technology development for applications.

  9. ORNL instrumentation performance for Slab Core Test Facility (SCTF)-Core I Reflood Test Facility

    International Nuclear Information System (INIS)

    Hardy, J.E.; Hess, R.A.; Hylton, J.O.

    1983-11-01

    Instrumentation was developed for making measurements in experimental refill-reflood test facilities. These unique instrumentation systems were designed to survive the severe environmental conditions that exist during a simulated pressurized water reactor loss-of-coolant accident (LOCA). Measurements of in-vessel fluid phenomena such as two-phase flow velocity and void fraction, and film thickness and film velocity, are required for a better understanding of reactor behavior during LOCAs. The Advanced Instrumentation for Reflood Studies (AIRS) Program fabricated and delivered instrumentation systems and data reduction software algorithms that allowed the above measurements to be made. Data produced by AIRS sensors during three experimental runs in the Japanese Slab Core Test Facility are presented. Although many of the sensors failed before any useful data could be obtained, the remaining probes gave encouraging and useful results. These results are the first of their kind produced during the simulated refill-reflood stage of a LOCA near actual thermohydrodynamic conditions.

  10. On-board event processing algorithms for a CCD-based space borne X-ray spectrometer

    International Nuclear Information System (INIS)

    Chun, H.J.; Bowles, J.A.; Branduardi-Raymont, G.; Gowen, R.A.

    1996-01-01

    This paper describes two alternative algorithms which are applied to reduce the telemetry requirements for a Charge Coupled Device (CCD) based, space-borne, X-ray spectrometer by on-board reconstruction of the X-ray events split over two or more adjacent pixels. The algorithms have been developed for the Reflection Grating Spectrometer (RGS) on the X-ray multi-mirror (XMM) mission, the second cornerstone project in the European Space Agency's Horizon 2000 programme. The overall instrument and some criteria which provide the background of the development of the algorithms, implemented in Tartan ADA on an MA31750 microprocessor, are described. The on-board processing constraints and requirements are discussed, and the performances of the algorithms are compared. Test results are presented which show that the recursive implementation is faster and has a smaller executable file although it uses more memory because of its stack requirements. (orig.)
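
    A toy version of split-event reconstruction can convey the idea: a pixel that is a local maximum above an event threshold seeds an event, and neighbouring charge above a split threshold is summed back into it. The thresholds and frame below are illustrative assumptions, not the RGS on-board algorithm:

```python
import numpy as np

def reconstruct_events(frame, event_thr=50.0, split_thr=10.0):
    """Toy split-event reconstruction for a CCD X-ray frame: a pixel that
    is a local maximum above event_thr seeds an event; charge in the
    surrounding 3x3 pixels above split_thr is summed back into it.
    Thresholds are illustrative, not flight values."""
    events = []
    rows, cols = frame.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            patch = frame[r-1:r+2, c-1:c+2]
            if frame[r, c] >= event_thr and frame[r, c] == patch.max():
                charge = patch[patch >= split_thr].sum()
                events.append((r, c, float(charge)))
    return events

frame = np.zeros((6, 6))
frame[2, 2], frame[2, 3] = 80.0, 30.0    # one X-ray split over two pixels
print(reconstruct_events(frame))          # [(2, 2, 110.0)]
```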

  11. Instrumentation

    International Nuclear Information System (INIS)

    Prieur, G.; Nadi, M.; Hedjiedj, A.; Weber, S.

    1995-01-01

    This second chapter on instrumentation gives a few general considerations on the history and classification of instrumentation, and two specific states of the art. The first one concerns NMR (block diagram of the instrumentation chain with details on the magnets, gradients, probes, reception unit). The second one concerns precision instrumentation (optical fiber gyrometer and scanning electron microscope) and its data processing tools (programmability, the VXI standard and its history). The chapter ends with future trends on smart sensors and Field Emission Displays. (D.L.). Refs., figs

  12. Data Retrieval Algorithm and Uncertainty Analysis for a Miniaturized, Laser Heterodyne Radiometer

    Science.gov (United States)

    Miller, J. H.; Melroy, H.; Wilson, E. L.; Clarke, G. B.

    2013-12-01

    In a collaboration between NASA Goddard Space Flight Center and George Washington University, a low-cost, surface instrument is being developed that can continuously monitor key carbon cycle gases in the atmospheric column: carbon dioxide (CO2) and methane (CH4). The instrument is based on a miniaturized, laser heterodyne radiometer (LHR) using near infrared (NIR) telecom lasers. Despite relatively weak absorption line strengths in this spectral region, spectrally-resolved atmospheric column absorptions for these two molecules fall in the range of 60-80% and thus sensitive and precise measurements of column concentrations are possible. Further, because the LHR technique has the potential for sub-Doppler spectral resolution, the possibility exists for interrogating line shapes to extract altitude profiles of the greenhouse gases. From late 2012 through 2013 the instrument was deployed for a variety of field measurements including at Park Falls, Wisconsin; Castle Airport near Atwater, California; and at the NOAA Mauna Loa Observatory in Hawaii. For each subsequent campaign, improvement in the figures of merit for the instrument (notably spectral sweep time and absorbance noise) has been observed. For the latter, the absorbance noise is approaching 0.002 optical density (OD) noise on a 1.8 OD signal. This presentation presents an overview of the measurement campaigns in the context of the data retrieval algorithm under development at GW for the calculation of column concentrations from them. For light transmission through the atmosphere, it is necessary to account for variation of pressure, temperature, composition, and refractive index through the atmosphere that are all functions of latitude, longitude, time of day, altitude, etc. In our initial work we began with coding developed under the LOWTRAN and MODTRAN programs by the AFOSR (and others). We also assumed temperature and pressure profiles from the 1976 US Standard Atmosphere and used the US Naval Observatory

  13. Recent applications of microprocessor-based instruments in nuclear power stations

    International Nuclear Information System (INIS)

    Cash, N.R.; Dennis, U.E.

    1988-01-01

    The incorporation of microprocessors in the design of nuclear power plant instrumentation has led to levels of measurement and control not available previously. In addition to the expected expansion of functional (system) capability, numerous desirable features now are possible. The added ability to both self-calibrate and perform compensation algorithms has led to dramatic improvements in accuracies, response times, and noise rejection. Automated performance checking and self-testing simplify troubleshooting and required periodic surveillance. Alphanumeric displays allow both menu-driven operation and user-prompting, which, in turn, contribute to mistake avoidance. New features of these microprocessor-based instruments are of specific benefit in nuclear power reactors, where safety is of prime concern. Greater reliability and accuracy can be provided. Shortened calibration, surveillance, and repair times reduce the exposure to unnecessary challenges of the plant's protection systems that can arise from spurious noise signals.

  14. THE SYNERGY OF DIRECT IMAGING AND ASTROMETRY FOR ORBIT DETERMINATION OF EXO-EARTHS

    International Nuclear Information System (INIS)

    Shao, Michael; Catanzarite, Joseph; Pan Xiaopei

    2010-01-01

    The holy grail of exoplanet searches is an exo-Earth, an Earth mass planet in the habitable zone (HZ) around a nearby star. Mass is one of the most important characteristics of a planet and can only be measured by observing the motion of the star around the planet-star center of gravity. The planet's orbit can be measured either by imaging the planet at multiple epochs or by measuring the position of the star at multiple epochs by space-based astrometry. The measurement of an exoplanet's orbit by direct imaging is complicated by a number of factors. One is the inner working angle (IWA). A space coronagraph or interferometer imaging an exo-Earth can separate the light from the planet from the light from the star only when the star-planet separation is larger than the IWA. Second, the apparent brightness of a planet depends on the orbital phase. A single image of a planet cannot tell us whether the planet is in the HZ or distinguish whether it is an exo-Earth or a Neptune-mass planet. Third is the confusion that may arise from the presence of multiple planets. With two images of a multiple planet system, it is not possible to assign a dot to a planet based only on the photometry and color of the planet. Finally, the planet-star contrast must exceed a certain minimum value in order for the planet to be detected. The planet may be unobservable even when it is outside the IWA, such as when the bright side of the planet is facing away from us in a 'crescent' phase. In this paper we address the question: 'Can a prior astrometric mission that can identify which stars have Earth-like planets significantly improve the science yield of a mission to image exo-Earths?' In the case of the Occulting Ozone Observatory, a small external occulter mission that cannot measure spectra, we find that the occulter mission could confirm the orbits of ∼4 to ∼5 times as many exo-Earths if an astrometric mission preceded it to identify which stars had such planets. In the case of an

  15. An integrated approach to endoscopic instrument tracking for augmented reality applications in surgical simulation training.

    Science.gov (United States)

    Loukas, Constantinos; Lahanas, Vasileios; Georgiou, Evangelos

    2013-12-01

    Despite the popular use of virtual and physical reality simulators in laparoscopic training, the educational potential of augmented reality (AR) has not received much attention. A major challenge is the robust tracking and three-dimensional (3D) pose estimation of the endoscopic instrument, which are essential for achieving interaction with the virtual world and for realistic rendering when the virtual scene is occluded by the instrument. In this paper we propose a method that addresses these issues, based solely on visual information obtained from the endoscopic camera. Two different tracking algorithms are combined for estimating the 3D pose of the surgical instrument with respect to the camera. The first tracker creates an adaptive model of a colour strip attached to the distal part of the tool (close to the tip). The second algorithm tracks the endoscopic shaft, using a combined Hough-Kalman approach. The 3D pose is estimated with perspective geometry, using appropriate measurements extracted by the two trackers. The method has been validated on several complex image sequences for its tracking efficiency, pose estimation accuracy and applicability in AR-based training. Using a standard endoscopic camera, the absolute average error of the tip position was 2.5 mm for working distances commonly found in laparoscopic training. The average error of the instrument's angle with respect to the camera plane was approximately 2°. The results are also supplemented by video segments of laparoscopic training tasks performed in a physical and an AR environment. The experiments yielded promising results regarding the potential of applying AR technologies for laparoscopic skills training, based on a computer vision framework. The issue of occlusion handling was adequately addressed. The estimated trajectory of the instruments may also be used for surgical gesture interpretation and assessment. Copyright © 2013 John Wiley & Sons, Ltd.

  16. Instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Decreton, M

    2001-04-01

    SCK-CEN's research and development programme on instrumentation involves the assessment and the development of sensitive measurement systems used within a radiation environment. Particular emphasis is on the assessment of optical fibre components and their adaptability to radiation environments. The evaluation of ageing processes of instrumentation in fission plants, the development of specific data evaluation strategies to compensate for ageing induced degradation of sensors and cable performance form part of these activities. In 2000, particular emphasis was on in-core reactor instrumentation applied to fusion, accelerator driven and water-cooled fission reactors. This involved the development of high performance instrumentation for irradiation experiments in the BR2 reactor in support of new instrumentation needs for MYRRHA, and for diagnostic systems for the ITER reactor.

  17. Instrumentation

    International Nuclear Information System (INIS)

    Decreton, M.

    2001-01-01

    SCK-CEN's research and development programme on instrumentation involves the assessment and the development of sensitive measurement systems used within a radiation environment. Particular emphasis is on the assessment of optical fibre components and their adaptability to radiation environments. The evaluation of ageing processes of instrumentation in fission plants, the development of specific data evaluation strategies to compensate for ageing induced degradation of sensors and cable performance form part of these activities. In 2000, particular emphasis was on in-core reactor instrumentation applied to fusion, accelerator driven and water-cooled fission reactors. This involved the development of high performance instrumentation for irradiation experiments in the BR2 reactor in support of new instrumentation needs for MYRRHA, and for diagnostic systems for the ITER reactor

  18. Parallel algorithm for determining motion vectors in ice floe images by matching edge features

    Science.gov (United States)

    Manohar, M.; Ramapriyan, H. K.; Strong, J. P.

    1988-01-01

    A parallel algorithm is described to determine motion vectors of ice floes using time sequences of images of the Arctic ocean obtained from the Synthetic Aperture Radar (SAR) instrument flown on-board the SEASAT spacecraft. Researchers describe a parallel algorithm which is implemented on the MPP for locating corresponding objects based on their translationally and rotationally invariant features. The algorithm first approximates the edges in the images by polygons or sets of connected straight-line segments. Each such edge structure is then reduced to a seed point. Associated with each seed point are the descriptions (lengths, orientations and sequence numbers) of the lines constituting the corresponding edge structure. A parallel matching algorithm is used to match packed arrays of such descriptions to identify corresponding seed points in the two images. The matching algorithm is designed such that fragmentation and merging of ice floes are taken into account by accepting partial matches. The technique has been demonstrated to work on synthetic test patterns and real image pairs from SEASAT in times ranging from 0.5 to 0.7 seconds for 128 x 128 images.

  19. Identifying Cassini's Magnetospheric Location Using Magnetospheric Imaging Instrument (MIMI) Data and Machine Learning

    Science.gov (United States)

    Vandegriff, J. D.; Smith, G. L.; Edenbaum, H.; Peachey, J. M.; Mitchell, D. G.

    2017-12-01

    We analyzed data from Cassini's Magnetospheric Imaging Instrument (MIMI) and Magnetometer (MAG) and attempted to identify the region of Saturn's magnetosphere that Cassini was in at a given time using machine learning. MIMI data are from the Charge-Energy-Mass Spectrometer (CHEMS) instrument and the Low-Energy Magnetospheric Measurement System (LEMMS). We trained on data where the region is known based on a previous analysis of Cassini Plasma Spectrometer (CAPS) plasma data. Three magnetospheric regions are considered: Magnetosphere, Magnetosheath, and Solar Wind. MIMI particle intensities, magnetic field values, and spacecraft position are used as input attributes, and the output is the CAPS-based region, which is available from 2004 to 2012. We then use the trained classifier to identify Cassini's magnetospheric regions for times after 2012, when CAPS data is no longer available. Training accuracy is evaluated by testing the classifier performance on a time range of known regions that the classifier has never seen. Preliminary results indicate a 68% accuracy on such test data. Other techniques are being tested that may increase this performance. We present the data and algorithms used, and will describe the latest results, including the magnetospheric regions post-2012 identified by the algorithm.
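
    A minimal sketch of the supervised-learning setup, using synthetic stand-ins for the MIMI/MAG features and CAPS-derived labels and a scikit-learn random forest with a chronological train/test split; everything in it is illustrative rather than the authors' actual pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)
n = 3000
# Synthetic stand-ins for MIMI intensities, |B| and radial distance;
# real features and CAPS-based region labels would replace these.
X = rng.normal(size=(n, 3))
y = rng.integers(0, 3, n)      # 0=magnetosphere, 1=magnetosheath, 2=solar wind

# Chronological split: train on the early interval, test on the later one,
# mimicking "train on 2004-2012, classify later orbits".
split = int(0.8 * n)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X[:split], y[:split])
print("held-out accuracy:", accuracy_score(y[split:], clf.predict(X[split:])))
```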

  20. Algorithms

    Indian Academy of Sciences (India)

    polynomial) division have been found in Vedic Mathematics which are dated much before Euclid's algorithm. A programming language is used to describe an algorithm for execution on a computer. An algorithm expressed using a programming.

  1. An algorithm for 4D CT image sorting using spatial continuity.

    Science.gov (United States)

    Li, Chen; Liu, Jie

    2013-01-01

    4D CT, which can locate the position of a moving tumor over the entire respiratory cycle and effectively reduce image artifacts, has been widely used in radiation therapy of tumors. Current 4D CT methods require external surrogates of respiratory motion obtained from extra instruments. However, respiratory signals recorded by these external markers may not always accurately represent the internal tumor and organ movements, especially when irregular breathing patterns occur. In this paper we propose a novel automatic 4D CT sorting algorithm that performs without these external surrogates. The sorting algorithm requires collecting the image data with a cine scan protocol. Beginning with the first couch position, images from the adjacent couch position are selected according to spatial continuity. The process is continued until images from all couch positions are sorted and the entire 3D volume is produced. The algorithm is verified with respiratory phantom image data and clinical image data. The primary test results show that the 4D CT images created by our algorithm effectively eliminate the motion artifacts and clearly demonstrate the movement of the tumor and organs over the breathing cycle.
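
    A toy greedy version of sorting by spatial continuity, which at each couch position picks the cine image most similar to the previously chosen one; normalized cross-correlation is used here as a stand-in similarity measure and is an assumption, not necessarily the paper's criterion:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally shaped images."""
    a, b = a - a.mean(), b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def sort_by_continuity(cine_stacks):
    """cine_stacks[p] is a list of candidate 2-D images acquired at couch
    position p during the cine scan.  Starting from the first position,
    greedily pick at each next position the image most similar to the one
    already chosen, as a crude spatial-continuity criterion."""
    volume = [cine_stacks[0][0]]
    for candidates in cine_stacks[1:]:
        scores = [ncc(volume[-1], img) for img in candidates]
        volume.append(candidates[int(np.argmax(scores))])
    return volume

rng = np.random.default_rng(4)
stacks = [[rng.random((32, 32)) for _ in range(10)] for _ in range(5)]
print(len(sort_by_continuity(stacks)))   # 5 slices, one per couch position
```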

  2. Evaluation of downwelling diffuse attenuation coefficient algorithms in the Red Sea

    KAUST Repository

    Tiwari, Surya Prakash

    2016-05-07

    Despite the importance of optical properties such as the downwelling diffuse attenuation coefficient for characterizing the upper water column, until recently no in situ optical measurements were published for the Red Sea. Kirby et al. used observations from the Coastal Zone Color Scanner to characterize the spatial and temporal variability of the diffuse attenuation coefficient (Kd(490)) in the Red Sea. To better understand optical variability and its utility in the Red Sea, it is imperative to comprehend the diffuse attenuation coefficient and its relationship with in situ properties. Two apparent optical properties, the spectral remote sensing reflectance (Rrs) and the downwelling diffuse attenuation coefficient (Kd), are calculated from vertical profile measurements of downwelling irradiance (Ed) and upwelling radiance (Lu). Kd characterizes light penetration into the water column, which is important for understanding both the physical and biogeochemical environment, including water quality and the health of the ocean environment. Our study tests the performance of the existing Kd(490) algorithms in the Red Sea and compares them against direct in situ measurements within various subdivisions of the Red Sea. Most standard algorithms either overestimated or underestimated the measured in situ values of Kd. Consequently, these algorithms provided poor retrieval of Kd(490) for the Red Sea. Random errors were high for all algorithms and the correlation coefficients (r2) with in situ measurements were quite low. Hence, these algorithms may not be suitable for the Red Sea. Overall, statistical analyses of the various algorithms indicated that the existing algorithms are inadequate for the Red Sea. The present study suggests that reparameterizing existing algorithms or developing new regional algorithms is required to improve retrieval of Kd(490) for the Red Sea. © (2016) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
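
    For reference, Kd can be estimated from an in situ Ed(z) profile as minus the slope of ln Ed versus depth, which is the quantity the satellite algorithms above try to reproduce; the profile below is synthetic and the numbers are illustrative:

```python
import numpy as np

def kd_from_profile(depth_m, ed):
    """Diffuse attenuation coefficient from a profile Ed(z), assuming
    Ed(z) = Ed(0-) * exp(-Kd * z): Kd is minus the slope of ln(Ed) vs z."""
    slope, _ = np.polyfit(depth_m, np.log(ed), 1)
    return -slope

z = np.linspace(1.0, 40.0, 30)                      # depths in metres
ed = 1.2 * np.exp(-0.06 * z) * (1 + 0.01 * np.random.default_rng(5).normal(size=z.size))
print(kd_from_profile(z, ed))                        # ~0.06 m^-1
```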

  3. Unsupervised Learning Through Randomized Algorithms for High-Volume High-Velocity Data (ULTRA-HV).

    Energy Technology Data Exchange (ETDEWEB)

    Pinar, Ali [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kolda, Tamara G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carlberg, Kevin Thomas [Wake Forest Univ., Winston-Salem, MA (United States); Ballard, Grey [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mahoney, Michael [Univ. of California, Berkeley, CA (United States)

    2018-01-01

    Through long-term investments in computing, algorithms, facilities, and instrumentation, DOE is an established leader in massive-scale, high-fidelity simulations, as well as science-leading experimentation. In both cases, DOE is generating more data than it can analyze and the problem is intensifying quickly. The need for advanced algorithms that can automatically convert the abundance of data into a wealth of useful information by discovering hidden structures is well recognized. Such efforts, however, are hindered by the massive volume of the data and its high velocity. Here, the challenge is developing unsupervised learning methods to discover hidden structure in high-volume, high-velocity data.

  4. PHOEBE 2.0 – Where no model has gone before

    OpenAIRE

    Pavlovski, K.; Degroote, P.; Conroy, K.; Hambleton, Kelly; Bloemen, S.; Pablo, H.; Giammarco, J.; Prša, A.; Tkachenko, A.; Torres, G.

    2013-01-01

    phoebe 2.0 is an open source framework bridging the gap between stellar observations and models. It allows one to create and fit models simultaneously and consistently to a wide range of observational data such as photometry, spectroscopy, spectropolarimetry, interferometry and astrometry. To reach the level of precision required by the newest generation of instruments such as Kepler, GAIA and the arrays of large telescopes, the code is set up to handle a wide range of phenomena such as multiplic...

  5. Application of Fiber Optic Instrumentation

    Science.gov (United States)

    Richards, William Lance; Parker, Allen R., Jr.; Ko, William L.; Piazza, Anthony; Chan, Patrick

    2012-01-01

    Fiber optic sensing technology has emerged in recent years offering tremendous advantages over conventional aircraft instrumentation systems. The advantages of fiber optic sensors over their conventional counterparts are well established; they are lighter, smaller, and can provide enormous numbers of measurements at a fraction of the total sensor weight. After a brief overview of conventional and fiber-optic sensing technology, this paper presents an overview of the research that has been conducted at NASA Dryden Flight Research Center in recent years to advance this promising new technology. Research and development areas include system and algorithm development, sensor characterization and attachment, and real-time experimentally-derived parameter monitoring for ground- and flight-based applications. The vision of fiber optic smart structure technology is presented and its potential benefits to aerospace vehicles throughout the lifecycle, from preliminary design to final retirement, are presented.

  6. STAR Algorithm Integration Team - Facilitating operational algorithm development

    Science.gov (United States)

    Mikles, V. J.

    2015-12-01

    The NOAA/NESDIS Center for Satellite Research and Applications (STAR) provides technical support of the Joint Polar Satellite System (JPSS) algorithm development and integration tasks. Utilizing data from the S-NPP satellite, JPSS generates over thirty Environmental Data Records (EDRs) and Intermediate Products (IPs) spanning atmospheric, ocean, cryosphere, and land weather disciplines. The Algorithm Integration Team (AIT) brings technical expertise and support to product algorithms, specifically in testing and validating science algorithms in a pre-operational environment. The AIT verifies that new and updated algorithms function in the development environment, enforces established software development standards, and ensures that delivered packages are functional and complete. AIT facilitates the development of new JPSS-1 algorithms by implementing a review approach based on the Enterprise Product Lifecycle (EPL) process. Building on relationships established during the S-NPP algorithm development process and coordinating directly with science algorithm developers, the AIT has implemented structured reviews with self-contained document suites. The process has supported algorithm improvements for products such as ozone, active fire, vegetation index, and temperature and moisture profiles.

  7. Heterodyne quasi-elastic light-scattering instrument for biomedical diagnostics.

    Science.gov (United States)

    Lebedev, A D; Ivanova, M A; Lomakin, A V; Noskin, V A

    1997-10-20

    The heterodyne technique has a number of advantages over the homodyne technique when an accurate characterization of particle-size distribution (PSD) of heterogeneous systems is required. However, there are problems related to acoustic vibrations that make it difficult to take advantage of the heterodyne technique. An instrument developed for quasi-elastic light scattering (QELS) that uses the optical heterodyning principle is described. Vibration-related problems are considerably reduced because of the incorporation of all optical elements into one solid optical block. A real-time correlation analysis of the photocurrent fluctuations is performed by a PC-embedded analog-to-digital converter card with a digital signal processor. Investigation of the PSD in biological fluids for medical diagnostics is presented as a typical application. A diagnostic analysis of the PSD requires a simultaneous processing of a huge number of QELS data. An original statistical algorithm to accomplish this analysis has been developed. Technical specifications of instrumentation for heterodyne QELS measurement are discussed.

  8. An improved harmony search algorithm for synchronization of discrete-time chaotic systems

    International Nuclear Information System (INIS)

    Santos Coelho, Leandro dos; Andrade Bernert, Diego Luis de

    2009-01-01

    The harmony search (HS) algorithm is a recently developed meta-heuristic algorithm that has been very successful in a wide variety of optimization problems. HS was conceptualized using an analogy with the music improvisation process, where music players improvise the pitches of their instruments to obtain better harmony. The HS algorithm does not require initial values and uses a random search instead of a gradient search, so derivative information is unnecessary. Furthermore, the HS algorithm is simple in concept, has few parameters, is easy to implement, imposes fewer mathematical requirements, and does not require initial value settings of the decision variables. In recent years, the investigation of synchronization and control problems for discrete chaotic systems has attracted much attention and has many possible applications. The tuning of a proportional-integral-derivative (PID) controller based on an improved HS (IHS) algorithm for synchronization of two identical discrete chaotic systems subject to different initial conditions is investigated in this paper. Simulation results of the IHS used to determine the PID parameters for synchronization of two Henon chaotic systems are compared with other HS approaches including classical HS and global-best HS. Numerical results reveal that the proposed IHS method is a powerful search and controller design optimization tool for synchronization of chaotic systems.
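
    To make the memory-consideration, pitch-adjustment and random-selection steps concrete, here is a minimal sketch of classical harmony search applied to a toy parameter-tuning problem. It is not the paper's improved IHS variant, and the quadratic surrogate cost standing in for the chaotic-synchronization error is purely hypothetical.

```python
import random

def harmony_search(objective, bounds, hms=10, hmcr=0.9, par=0.3,
                   bandwidth=0.05, iterations=2000, seed=0):
    """Minimal classical harmony search (minimization).

    objective: function mapping a list of decision variables to a cost.
    bounds:    list of (low, high) tuples, one per decision variable.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    # Harmony memory: a set of candidate solutions plus their costs.
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    costs = [objective(h) for h in memory]

    for _ in range(iterations):
        new = []
        for j, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                      # memory consideration
                value = memory[rng.randrange(hms)][j]
                if rng.random() < par:                   # pitch adjustment
                    value += rng.uniform(-1, 1) * bandwidth * (hi - lo)
            else:                                        # random selection
                value = rng.uniform(lo, hi)
            new.append(min(max(value, lo), hi))
        cost = objective(new)
        worst = max(range(hms), key=lambda i: costs[i])
        if cost < costs[worst]:                          # replace the worst harmony
            memory[worst], costs[worst] = new, cost

    best = min(range(hms), key=lambda i: costs[i])
    return memory[best], costs[best]

# Example: tune hypothetical PID gains (Kp, Ki, Kd) against a made-up surrogate cost.
surrogate = lambda g: (g[0] - 1.2) ** 2 + (g[1] - 0.4) ** 2 + (g[2] - 0.05) ** 2
gains, cost = harmony_search(surrogate, [(0, 5), (0, 2), (0, 1)])
print(gains, cost)
```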

  9. Construct validation of an interactive digital algorithm for ostomy care.

    Science.gov (United States)

    Beitz, Janice M; Gerlach, Mary A; Schafer, Vickie

    2014-01-01

    The purpose of this study was to evaluate construct validity for a previously face and content validated Ostomy Algorithm using digital real-life clinical scenarios. A cross-sectional, mixed-methods Web-based survey design study was conducted. Two hundred ninety-seven English-speaking RNs completed the study; participants practiced in both acute care and postacute settings, with 1 expert ostomy nurse (WOC nurse) and 2 nonexpert nurses. Following written consent, respondents answered demographic questions and completed a brief algorithm tutorial. Participants were then presented with 7 ostomy-related digital scenarios consisting of real-life photos and pertinent clinical information. Respondents used the 11 assessment components of the digital algorithm to choose management options. Participant written comments about the scenarios and the research process were collected. The mean overall percentage of correct responses was 84.23%. Mean percentage of correct responses for respondents with a self-reported basic ostomy knowledge was 87.7%; for those with a self-reported intermediate ostomy knowledge was 85.88% and those who were self-reported experts in ostomy care achieved 82.77% correct response rate. Five respondents reported having no prior ostomy care knowledge at screening and achieved an overall 45.71% correct response rate. No negative comments regarding the algorithm were recorded by participants. The new standardized Ostomy Algorithm remains the only face, content, and construct validated digital clinical decision instrument currently available. Further research on application at the bedside while tracking patient outcomes is warranted.

  10. Hard decoding algorithm for optimizing thresholds under general Markovian noise

    Science.gov (United States)

    Chamberland, Christopher; Wallman, Joel; Beale, Stefanie; Laflamme, Raymond

    2017-04-01

    Quantum error correction is instrumental in protecting quantum systems from noise in quantum computing and communication settings. Pauli channels can be efficiently simulated and threshold values for Pauli error rates under a variety of error-correcting codes have been obtained. However, realistic quantum systems can undergo noise processes that differ significantly from Pauli noise. In this paper, we present an efficient hard decoding algorithm for optimizing thresholds and lowering failure rates of an error-correcting code under general completely positive and trace-preserving (i.e., Markovian) noise. We use our hard decoding algorithm to study the performance of several error-correcting codes under various non-Pauli noise models by computing threshold values and failure rates for these codes. We compare the performance of our hard decoding algorithm to decoders optimized for depolarizing noise and show improvements in thresholds and reductions in failure rates by several orders of magnitude. Our hard decoding algorithm can also be adapted to take advantage of a code's non-Pauli transversal gates to further suppress noise. For example, we show that using the transversal gates of the 5-qubit code allows arbitrary rotations around certain axes to be perfectly corrected. Furthermore, we show that Pauli twirling can increase or decrease the threshold depending upon the code properties. Lastly, we show that even if the physical noise model differs slightly from the hypothesized noise model used to determine an optimized decoder, failure rates can still be reduced by applying our hard decoding algorithm.

  11. Content validation of a standardized algorithm for ostomy care.

    Science.gov (United States)

    Beitz, Janice; Gerlach, Mary; Ginsburg, Pat; Ho, Marianne; McCann, Eileen; Schafer, Vickie; Scott, Vera; Stallings, Bobbie; Turnbull, Gwen

    2010-10-01

    The number of ostomy care clinician experts is limited and the majority of ostomy care is provided by non-specialized clinicians or unskilled caregivers and family. The purpose of this study was to obtain content validation data for a new standardized algorithm for ostomy care developed by expert wound ostomy continence nurse (WOCN) clinicians. After face validity was established using overall review and suggestions from WOCN experts, 166 WOCNs self-identified as having expertise in ostomy care were surveyed online for 6 weeks in 2009. Using a cross-sectional, mixed methods study design and a 30-item instrument with a 4-point Likert-type scale, the participants were asked to quantify the degree of validity of the Ostomy Algorithm's decisions and components. Participants' open-ended comments also were thematically analyzed. Using a scale of 1 to 4, the mean score of the entire algorithm was 3.8 (4 = relevant/very relevant). The algorithm's content validity index (CVI) was 0.95 (out of 1.0). Individual component mean scores ranged from 3.59 to 3.91. Individual CVIs ranged from 0.90 to 0.98. Qualitative data analysis revealed themes of difficulty associated with algorithm formatting, especially orientation and use of the Studio Alterazioni Cutanee Stomali (Study on Peristomal Skin Lesions [SACS™ Instrument]) and the inability of algorithms to capture all individual patient attributes affecting ostomy care. Positive themes included content thoroughness and the helpful clinical photos. Suggestions were offered for algorithm improvement. Study results support the strong content validity of the algorithm and research to ascertain its construct validity and effect on care outcomes is warranted.

  12. Managerial instrument for didactic staff structure optimization for Distance Learning

    Directory of Open Access Journals (Sweden)

    Gavrus Cristina

    2017-01-01

    Distance learning is a modern system for providing educational services and is relatively new in Romania, if related to the date of its emergence in Europe. More and more active working people are interested in this form of education, paying, of course, special attention to its quality. It is quite difficult to appraise the quality of educational programs, but several instruments and criteria have been developed over time. The present paper proposes an original mathematical instrument aimed at human resources, this type of resource being considered extremely important when providing an educational service. The number of teachers is crucial for a distance learning program of study, because the didactic staff must cover a number of didactic classes that take place on weekends. Concretely, this paper is focused on finding an algorithm that allows the didactic staff structure to be optimized. For accomplishing this objective, two managerial instruments were used. One of them is the mathematical linear programming technique, which develops a mathematical model of the didactic staff structure, and the other is the WinQSB software package, which tests the mathematical model.
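
    As a hedged illustration of how such a staffing model can be expressed as a linear program, the sketch below minimizes a salary-weighted head count subject to covering a required number of weekend teaching hours. The grades, costs, hours and coverage requirement are invented for the example and are not taken from the paper, which uses WinQSB rather than Python.

```python
from scipy.optimize import linprog

# Hypothetical, simplified staffing model: choose how many lecturers of each grade
# to allocate so that the weekend teaching load of a distance-learning programme
# is covered at minimum salary cost. All numbers below are assumptions.
grades = ["professor", "associate", "lecturer"]
cost = [10, 7, 5]                 # relative salary cost per person
hours_per_person = [4, 6, 8]      # weekend teaching hours each can cover
required_hours = 120              # total weekend hours to cover

# Minimize cost @ x  subject to  hours_per_person @ x >= required_hours, x >= 0.
# linprog expects "<=" constraints, so the coverage constraint is negated.
result = linprog(
    c=cost,
    A_ub=[[-h for h in hours_per_person]],
    b_ub=[-required_hours],
    bounds=[(0, None)] * len(grades),
    method="highs",
)
print(dict(zip(grades, result.x)), "cost:", result.fun)
```

    A plain LP may return fractional staff numbers; in practice the model would be solved as an integer program (or the LP solution rounded up) so that whole teachers are assigned.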

  13. Selfish Gene Algorithm Vs Genetic Algorithm: A Review

    Science.gov (United States)

    Ariff, Norharyati Md; Khalid, Noor Elaiza Abdul; Hashim, Rathiah; Noor, Noorhayati Mohamed

    2016-11-01

    The evolutionary algorithm is one of the algorithms inspired by nature. Within little more than a decade, hundreds of papers have reported successful applications of EAs. This paper considers the Selfish Gene Algorithm (SFGA), one of the latest evolutionary algorithms (EAs), inspired by the Selfish Gene Theory, an interpretation of Darwinian ideas by the biologist Richard Dawkins in 1989. Following a brief introduction to the Selfish Gene Algorithm (SFGA), the chronology of its evolution is presented. The purpose of this paper is to present an overview of the concepts of the Selfish Gene Algorithm (SFGA) as well as its opportunities and challenges. Accordingly, the history and the steps involved in the algorithm are discussed, and its different applications, together with an analysis of these applications, are evaluated.

  14. Algorithms

    Indian Academy of Sciences (India)

    to as 'divide-and-conquer'. Although there has been a large effort in realizing efficient algorithms, there are not many universally accepted algorithm design paradigms. In this article, we illustrate algorithm design techniques such as balancing, greedy strategy, dynamic programming strategy, and backtracking or traversal of ...
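
    As a concrete instance of the divide-and-conquer paradigm named in this snippet, the short sketch below implements merge sort: split the input, solve the halves recursively, and combine the results. The example is added here purely for illustration and is not drawn from the article itself.

```python
def merge_sort(items):
    """Divide-and-conquer sorting: split, solve halves recursively, merge."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):     # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))
```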

  15. Multiple Harmonics Fitting Algorithms Applied to Periodic Signals Based on Hilbert-Huang Transform

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2013-01-01

    A new generation of multipurpose measurement equipment is transforming the role of computers in instrumentation. The new features involve mixed devices, such as various kinds of sensors, analog-to-digital and digital-to-analog converters, and digital signal processing techniques, that are able to substitute for typical discrete instruments like multimeters and analyzers. Signal-processing applications frequently use least-squares (LS) sine-fitting algorithms. Periodic signals may be interpreted as a sum of sine waves with multiple frequencies: the Fourier series. This paper describes a new sine fitting algorithm that is able to fit a multiharmonic acquired periodic signal. By means of a "sinusoidal wave" whose amplitude and phase are both transient, the "triangular wave" can be reconstructed on the basis of the Hilbert-Huang transform (HHT). This method can be used to test the effective number of bits (ENOB) of an analog-to-digital converter (ADC), avoiding the trouble of selecting initial parameter values and solving nonlinear equations. The simulation results show that the algorithm is precise and efficient. Given enough sampling points, even for a low-resolution signal with harmonic distortion present, the root mean square (RMS) error between the samples of the original "triangular wave" and the corresponding points of the fitted "sinusoidal wave" is remarkably small. This suggests that, for any periodic signal, the ENOB of a high-resolution ADC can be tested accurately.
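
    For context, a multiharmonic fit becomes a linear least-squares problem once the fundamental frequency is known, which is the step the HHT-based method above addresses. The sketch below is that standard linear fit (not the paper's algorithm): it projects a sampled triangular wave onto a truncated Fourier series and recovers the amplitude and phase of each harmonic.

```python
import numpy as np
from scipy.signal import sawtooth

def fit_harmonics(t, y, f0, n_harmonics=5):
    """Linear least-squares fit of y(t) to a truncated Fourier series.

    Returns (offset, amplitudes, phases) for harmonics 1..n_harmonics of f0,
    with each harmonic written as A_k * cos(2*pi*k*f0*t + phi_k). Estimating
    f0 itself is the nonlinear part that methods such as the HHT-based fit address.
    """
    t = np.asarray(t, dtype=float)
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(2 * np.pi * k * f0 * t))
        cols.append(np.sin(2 * np.pi * k * f0 * t))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    offset, a, b = coef[0], coef[1::2], coef[2::2]
    return offset, np.hypot(a, b), np.arctan2(-b, a)

# Example: a triangular wave contains only odd harmonics; fit and inspect them.
fs, f0 = 10_000.0, 50.0
t = np.arange(0, 0.2, 1 / fs)
y = sawtooth(2 * np.pi * f0 * t, width=0.5)          # triangular wave
offset, amp, phase = fit_harmonics(t, y, f0, n_harmonics=7)
print(np.round(amp, 4))   # odd harmonics dominate, falling off roughly as 1/k^2
```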

  16. A comprehensive review of sensors and instrumentation methods in devices for musical expression.

    Science.gov (United States)

    Medeiros, Carolina Brum; Wanderley, Marcelo M

    2014-07-25

    Digital Musical Instruments (DMIs) are musical instruments typically composed of a control surface where user interaction is measured by sensors whose values are mapped to sound synthesis algorithms. These instruments have gained interest among skilled musicians and performers in the last decades, leading to artistic practices including musical performance, interactive installations and dance. The creation of DMIs typically involves several areas, among them: arts, design and engineering. The balance between these areas is an essential task in DMI design so that the resulting instruments are aesthetically appealing, robust, and allow responsive, accurate and repeatable sensing. In this paper, we review the use of sensors in the DMI community as manifested in the proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2009-2013), focusing on the sensor technologies and signal conditioning techniques used by the NIME community. Although it has been claimed that specifications for artistic tools are harder than those for military applications, this study raises a paradox showing that in most of the cases, DMIs are based on a few basic sensor types and unsophisticated engineering solutions, not taking advantage of more advanced sensing, instrumentation and signal processing techniques that could dramatically improve their response. We aim to raise awareness of the limitations of any engineering solution and to assert the benefits of advanced electronics instrumentation design in DMIs. For this, we propose the use of specialized sensors such as strain gages, advanced conditioning circuits and signal processing tools such as sensor fusion. We believe that careful electronic instrumentation design may lead to more responsive instruments.

  17. Radioisotope instruments

    CERN Document Server

    Cameron, J F; Silverleaf, D J

    1971-01-01

    International Series of Monographs in Nuclear Energy, Volume 107: Radioisotope Instruments, Part 1 focuses on the design and applications of instruments based on the radiation released by radioactive substances. The book first offers information on the physical basis of radioisotope instruments; technical and economic advantages of radioisotope instruments; and radiation hazard. The manuscript then discusses commercial radioisotope instruments, including radiation sources and detectors, computing and control units, and measuring heads. The text describes the applications of radioisotop

  18. Algorithmic mathematics

    CERN Document Server

    Hougardy, Stefan

    2016-01-01

    Algorithms play an increasingly important role in nearly all fields of mathematics. This book allows readers to develop basic mathematical abilities, in particular those concerning the design and analysis of algorithms as well as their implementation. It presents not only fundamental algorithms like the sieve of Eratosthenes, the Euclidean algorithm, sorting algorithms, algorithms on graphs, and Gaussian elimination, but also discusses elementary data structures, basic graph theory, and numerical questions. In addition, it provides an introduction to programming and demonstrates in detail how to implement algorithms in C++. This textbook is suitable for students who are new to the subject and covers a basic mathematical lecture course, complementing traditional courses on analysis and linear algebra. Both authors have given this "Algorithmic Mathematics" course at the University of Bonn several times in recent years.

  19. Ontology Based Vocabulary Matching for Oceanographic Instruments

    Science.gov (United States)

    Chen, Yu; Shepherd, Adam; Chandler, Cyndy; Arko, Robert; Leadbetter, Adam

    2014-05-01

    Data integration acts as the preliminary entry point as we enter the era of big data in many scientific domains. However, the reuse of various datasets has met hurdles because different parties have different initial interests, and therefore different vocabularies for describing similar or semantically related concepts. In this scenario it is vital to devise an automatic or semi-supervised algorithm to facilitate the convergence of different vocabularies. The Ocean Data Interoperability Platform (ODIP) seeks to increase data sharing across scientific domains and international boundaries by providing a forum to harmonize diverse regional data systems. ODIP participants from the US include the Rolling Deck to Repository (R2R) program, whose mission is to capture, catalog, and describe the underway/environmental sensor data from US oceanographic research vessels and submit the data to public long-term archives. In an attempt to harmonize these regional data systems, especially vocabularies, R2R recognizes the value of the SeaDataNet vocabularies served by the NERC Vocabulary Server (NVS) hosted at the British Oceanographic Data Centre as a trusted, authoritative source for describing many oceanographic research concepts such as instrumentation. In this work, we make use of the semantic relations in the vocabularies served by NVS to build a Bayesian network and take advantage of the idea of entropy in evaluating the correlation between different concepts and keywords. The performance of the model is evaluated by matching instruments from R2R against the SeaDataNet instrument vocabularies based on calculated confidence scores for the instrument pairings. These pairings with their scores can then be analyzed for assertion, growing the interoperability of the R2R vocabulary through its links to the SeaDataNet entities.

  20. Inversion of Land Surface Temperature (LST) Using Terra ASTER Data: A Comparison of Three Algorithms

    Directory of Open Access Journals (Sweden)

    Milton Isaya Ndossi

    2016-12-01

    Land Surface Temperature (LST) is an important measurement in studies related to the Earth's surface processes. The Advanced Space-borne Thermal Emission and Reflection Radiometer (ASTER) instrument onboard the Terra spacecraft is the currently available Thermal Infrared (TIR) imaging sensor with the highest spatial resolution. This study involves the comparison of LSTs inverted from the sensor using the Split Window Algorithm (SWA), the Single Channel Algorithm (SCA) and the Planck function. This study has used the National Oceanic and Atmospheric Administration's (NOAA) data to model and compare the results from the three algorithms. The data from the sensor have been processed with the Python programming language in a free and open source software package (QGIS) to enable users to make use of the algorithms. The study revealed that the three algorithms are suitable for LST inversion, whereby the Planck function showed the highest level of accuracy, the SWA had a moderate level of accuracy and the SCA had the least accuracy. The algorithms produced results with Root Mean Square Errors (RMSE) of 2.29 K, 3.77 K and 2.88 K for the Planck function, the SCA and the SWA respectively.
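
    To show what the simplest of the three approaches involves, the sketch below inverts the Planck function to turn an at-sensor thermal radiance into a brightness temperature and then applies a standard single-band emissivity correction to approximate LST. The radiance, the ~10.66 µm band centre and the emissivity are assumed example values, and the snippet omits the atmospheric correction that a real retrieval requires.

```python
import numpy as np

C1 = 1.19104e8   # W * um^4 * m^-2 * sr^-1 (2 h c^2, for radiance in W m^-2 sr^-1 um^-1)
C2 = 1.43877e4   # um * K (h c / k_B)

def brightness_temperature(radiance, wavelength_um):
    """Invert the Planck function: at-sensor radiance -> brightness temperature (K)."""
    return C2 / (wavelength_um * np.log(C1 / (wavelength_um ** 5 * radiance) + 1.0))

def lst_from_bt(bt_kelvin, wavelength_um, emissivity):
    """Single-band emissivity correction of brightness temperature (Artis-Carnahan form)."""
    rho = 1.438e4  # um * K, same physical constant as C2
    return bt_kelvin / (1.0 + (wavelength_um * bt_kelvin / rho) * np.log(emissivity))

# Example with assumed values: a TIR band near 10.66 um over a vegetated surface.
radiance = 9.5   # W m^-2 sr^-1 um^-1, hypothetical atmospherically corrected value
bt = brightness_temperature(radiance, 10.66)
print(round(bt, 2), round(lst_from_bt(bt, 10.66, emissivity=0.97), 2))
```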

  1. THE MASS OF HD 38529c FROM HUBBLE SPACE TELESCOPE ASTROMETRY AND HIGH-PRECISION RADIAL VELOCITIES

    International Nuclear Information System (INIS)

    Benedict, G. Fritz; McArthur, Barbara E.; Bean, Jacob L.; Barnes, Rory; Harrison, Thomas E.; Hatzes, Artie; Martioli, Eder; Nelan, Edmund P.

    2010-01-01

    Hubble Space Telescope Fine Guidance Sensor astrometric observations of the G4 IV star HD 38529 are combined with the results of the analysis of extensive ground-based radial velocity (RV) data to determine the mass of the outermost of two previously known companions. Our new RVs obtained with the Hobby-Eberly Telescope and velocities from the Carnegie-California group now span over 11 yr. With these data we obtain improved RV orbital elements for both the inner companion, HD 38529b, and the outer companion, HD 38529c. We identify a rotational period of HD 38529 (P_rot = 31.65 ± 0.17 d) with Fine Guidance Sensor photometry. The inferred star spot fraction is consistent with the remaining scatter in velocities being caused by spot-related stellar activity. We then model the combined astrometric and RV measurements to obtain the parallax, proper motion, perturbation period, perturbation inclination, and perturbation size due to HD 38529c. For HD 38529c we find P = 2136.1 ± 0.3 d, perturbation semimajor axis α = 1.05 ± 0.06 mas, and inclination i = 48.3° ± 3.7°. Assuming a primary mass M_* = 1.48 M_sun, we obtain a companion mass M_c = 17.6 (+1.5/−1.2) M_Jup, 3σ above a 13 M_Jup deuterium burning, brown dwarf lower limit. Dynamical simulations incorporating this accurate mass for HD 38529c indicate that a near-Saturn mass planet could exist between the two known companions. We find weak evidence of an additional low amplitude signal that can be modeled as a planetary-mass (∼0.17 M_Jup) companion at P ∼ 194 days. Including this component in our modeling lowers the error of the mass determined for HD 38529c. Additional observations (RVs and/or Gaia astrometry) are required to validate an interpretation of HD 38529d as a planetary-mass companion. If confirmed, the resulting HD 38529 planetary system may be an example of a 'Packed Planetary System'.
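
    As a rough cross-check of how a companion mass follows from an astrometric perturbation, the sketch below combines Kepler's third law with the size of the primary's barycentric orbit, using the period, perturbation semimajor axis and primary mass quoted above. The parallax (~25 mas) is an assumed value, not taken from this abstract, so the result is only an order-of-magnitude consistency check against the quoted 17.6 M_Jup.

```python
def companion_mass_from_astrometry(period_yr, alpha_mas, parallax_mas, m_star_msun,
                                   iterations=20):
    """Estimate a companion's mass from the primary's astrometric wobble.

    Uses Kepler's third law plus the fact that the primary orbits the barycentre
    with semimajor axis a_rel * M_c / (M_* + M_c). Returns Jupiter masses.
    """
    alpha_au = alpha_mas / parallax_mas          # perturbation size converted to AU
    m_c = 0.0                                    # initial guess: massless companion
    for _ in range(iterations):                  # fixed-point iteration on total mass
        m_tot = m_star_msun + m_c
        a_rel = (m_tot * period_yr ** 2) ** (1.0 / 3.0)   # relative orbit, AU
        m_c = m_tot * alpha_au / a_rel
    return m_c * 1047.6                          # Msun -> Mjup

# HD 38529c, using the abstract's values and an *assumed* parallax of ~25 mas.
print(round(companion_mass_from_astrometry(2136.1 / 365.25, 1.05, 25.0, 1.48), 1))
```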

  2. Instrument Remote Control via the Astronomical Instrument Markup Language

    Science.gov (United States)

    Sall, Ken; Ames, Troy; Warsaw, Craig; Koons, Lisa; Shafer, Richard

    1998-01-01

    The Instrument Remote Control (IRC) project ongoing at NASA's Goddard Space Flight Center's (GSFC) Information Systems Center (ISC) supports NASA's mission by defining an adaptive intranet-based framework that provides robust interactive and distributed control and monitoring of remote instruments. An astronomical IRC architecture that combines the platform-independent processing capabilities of Java with the power of Extensible Markup Language (XML) to express hierarchical data in an equally platform-independent, as well as human readable manner, has been developed. This architecture is implemented using a variety of XML support tools and Application Programming Interfaces (API) written in Java. IRC will enable trusted astronomers from around the world to easily access infrared instruments (e.g., telescopes, cameras, and spectrometers) located in remote, inhospitable environments, such as the South Pole, a high Chilean mountaintop, or an airborne observatory aboard a Boeing 747. Using IRC's frameworks, an astronomer or other scientist can easily define the type of onboard instrument, control the instrument remotely, and return monitoring data all through the intranet. The Astronomical Instrument Markup Language (AIML) is the first implementation of the more general Instrument Markup Language (IML). The key aspects of our approach to instrument description and control applies to many domains, from medical instruments to machine assembly lines. The concepts behind AIML apply equally well to the description and control of instruments in general. IRC enables us to apply our techniques to several instruments, preferably from different observatories.

  3. Essential algorithms a practical approach to computer algorithms

    CERN Document Server

    Stephens, Rod

    2013-01-01

    A friendly and accessible introduction to the most useful algorithms Computer algorithms are the basic recipes for programming. Professional programmers need to know how to use algorithms to solve difficult programming problems. Written in simple, intuitive English, this book describes how and when to use the most practical classic algorithms, and even how to create new algorithms to meet future needs. The book also includes a collection of questions that can help readers prepare for a programming job interview. Reveals methods for manipulating common data structures s

  4. Design and evaluation of an architecture for a digital signal processor for instrumentation applications

    Science.gov (United States)

    Fellman, Ronald D.; Kaneshiro, Ronald T.; Konstantinides, Konstantinos

    1990-03-01

    The authors present the design and evaluation of an architecture for a monolithic, programmable, floating-point digital signal processor (DSP) for instrumentation applications. An investigation of the most commonly used algorithms in instrumentation led to a design that satisfies the requirements for high computational and I/O (input/output) throughput. In the arithmetic unit, a 16- x 16-bit multiplier and a 32-bit accumulator provide the capability for single-cycle multiply/accumulate operations, and three format adjusters automatically adjust the data format for increased accuracy and dynamic range. An on-chip I/O unit is capable of handling data block transfers through a direct memory access port and real-time data streams through a pair of parallel I/O ports. I/O operations and program execution are performed in parallel. In addition, the processor includes two data memories with independent addressing units, a microsequencer with instruction RAM, and multiplexers for internal data redirection. The authors also present the structure and implementation of a design environment suitable for the algorithmic, behavioral, and timing simulation of a complete DSP system. Various benchmarking results are reported.

  5. Algorithmic cryptanalysis

    CERN Document Server

    Joux, Antoine

    2009-01-01

    Illustrating the power of algorithms, Algorithmic Cryptanalysis describes algorithmic methods with cryptographically relevant examples. Focusing on both private- and public-key cryptographic algorithms, it presents each algorithm either as a textual description, in pseudo-code, or in a C code program.Divided into three parts, the book begins with a short introduction to cryptography and a background chapter on elementary number theory and algebra. It then moves on to algorithms, with each chapter in this section dedicated to a single topic and often illustrated with simple cryptographic applic

  6. Denni Algorithm An Enhanced Of SMS (Scan, Move and Sort) Algorithm

    Science.gov (United States)

    Aprilsyah Lubis, Denni; Salim Sitompul, Opim; Marwan; Tulus; Andri Budiman, M.

    2017-12-01

    Sorting has been a profound area for algorithmic researchers, and many resources are invested in proposing better sorting algorithms. For this purpose many existing sorting algorithms were observed in terms of the efficiency of their algorithmic complexity. Efficient sorting is important to optimize the use of other algorithms that require sorted lists to work correctly. Sorting has been considered a fundamental problem in the study of algorithms for many reasons: the need to sort information is inherent in many applications; algorithms often use sorting as a key subroutine; many essential algorithm design techniques are represented in the body of sorting algorithms; and many engineering issues come to the fore when implementing sorting algorithms. Many algorithms are very well known for sorting unordered lists, and one of the well-known algorithms that makes the process of sorting more economical and efficient is the SMS (Scan, Move and Sort) algorithm, an enhancement of Quicksort invented by Rami Mansi in 2010. This paper presents a new sorting algorithm called the Denni algorithm. The Denni algorithm is considered an enhancement of the SMS algorithm in the average and worst cases. The Denni algorithm is compared with the SMS algorithm and the results were promising.

  7. Instrumentation

    International Nuclear Information System (INIS)

    Umminger, K.

    2008-01-01

    A proper measurement of the relevant single and two-phase flow parameters is the basis for the understanding of many complex thermal-hydraulic processes. Reliable instrumentation is therefore necessary for the interaction between analysis and experiment especially in the field of nuclear safety research where postulated accident scenarios have to be simulated in experimental facilities and predicted by complex computer code systems. The so-called conventional instrumentation for the measurement of e. g. pressures, temperatures, pressure differences and single phase flow velocities is still a solid basis for the investigation and interpretation of many phenomena and especially for the understanding of the overall system behavior. Measurement data from such instrumentation still serves in many cases as a database for thermal-hydraulic system codes. However some special instrumentation such as online concentration measurement for boric acid in the water phase or for non-condensibles in steam atmosphere as well as flow visualization techniques were further developed and successfully applied during the recent years. Concerning the modeling needs for advanced thermal-hydraulic codes, significant advances have been accomplished in the last few years in the local instrumentation technology for two-phase flow by the application of new sensor techniques, optical or beam methods and electronic technology. This paper will give insight into the current state of instrumentation technology for safety-related thermohydraulic experiments. Advantages and limitations of some measurement processes and systems will be indicated as well as trends and possibilities for further development. Aspects of instrumentation in operating reactors will also be mentioned.

  8. Python algorithms mastering basic algorithms in the Python language

    CERN Document Server

    Hetland, Magnus Lie

    2014-01-01

    Python Algorithms, Second Edition explains the Python approach to algorithm analysis and design. Written by Magnus Lie Hetland, author of Beginning Python, this book is sharply focused on classical algorithms, but it also gives a solid understanding of fundamental algorithmic problem-solving techniques. The book deals with some of the most important and challenging areas of programming and computer science in a highly readable manner. It covers both algorithmic theory and programming practice, demonstrating how theory is reflected in real Python programs. Well-known algorithms and data struc

  9. A novel digital pulse processing architecture for nuclear instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Moline, Yoann; Thevenin, Mathieu; Corre, Gwenole [CEA, LIST - Laboratoire Capteurs et Architectures electroniques, F-91191 Gif-sur-Yvette, (France); Paindavoine, Michel [CNRS, Universite de Bourgogne - Laboratoire d' Etude de l' Apprentissage et du Developpement, 21000 DIJON, (France)

    2015-07-01

    The field of nuclear instrumentation covers a wide range of applications, including counting, spectrometry, pulse shape discrimination and multi-channel coincidence. These applications are the topic of much research; new algorithms and implementations are constantly proposed thanks to advances in digital signal processing. However, these improvements are not yet implemented in instrumentation devices. This is especially true for neutron-gamma discrimination applications, which traditionally use the charge comparison method while the literature proposes other algorithms based on the frequency domain or wavelet theory which show better performance. Another example is pileups, which are generally rejected even though pileup correction algorithms exist. These processes are traditionally performed offline due to two issues. The first is the Poissonian characteristic of the signal, composed of random arrival pulses, which requires current architectures to work in data flow. The second is the real-time requirement, which implies losing pulses when the pulse rate is too high. Despite the possibility of treating the pulses independently from each other, current architectures paralyze the acquisition of the signal during the processing of a pulse. This loss is called dead-time. These two issues have led current architectures to use dedicated solutions based on re-configurable components like Field Programmable Gate Arrays (FPGAs) to achieve the performance necessary to deal with dead-time. However, dedicated hardware algorithm implementations on re-configurable technologies are complex and time-consuming. For all these reasons, a programmable Digital Pulse Processing (DPP) architecture in a high level language such as C or C++ which can reduce dead-time would be worthwhile for nuclear instrumentation. This would reduce prototyping and test duration by reducing the level of hardware expertise needed to implement new algorithms. However, today's programmable solutions do not meet

  10. A novel digital pulse processing architecture for nuclear instrumentation

    International Nuclear Information System (INIS)

    Moline, Yoann; Thevenin, Mathieu; Corre, Gwenole; Paindavoine, Michel

    2015-01-01

    The field of nuclear instrumentation covers a wide range of applications, including counting, spectrometry, pulse shape discrimination and multi-channel coincidence. These applications are the topic of much research; new algorithms and implementations are constantly proposed thanks to advances in digital signal processing. However, these improvements are not yet implemented in instrumentation devices. This is especially true for neutron-gamma discrimination applications, which traditionally use the charge comparison method while the literature proposes other algorithms based on the frequency domain or wavelet theory which show better performance. Another example is pileups, which are generally rejected even though pileup correction algorithms exist. These processes are traditionally performed offline due to two issues. The first is the Poissonian characteristic of the signal, composed of random arrival pulses, which requires current architectures to work in data flow. The second is the real-time requirement, which implies losing pulses when the pulse rate is too high. Despite the possibility of treating the pulses independently from each other, current architectures paralyze the acquisition of the signal during the processing of a pulse. This loss is called dead-time. These two issues have led current architectures to use dedicated solutions based on re-configurable components like Field Programmable Gate Arrays (FPGAs) to achieve the performance necessary to deal with dead-time. However, dedicated hardware algorithm implementations on re-configurable technologies are complex and time-consuming. For all these reasons, a programmable Digital Pulse Processing (DPP) architecture in a high level language such as C or C++ which can reduce dead-time would be worthwhile for nuclear instrumentation. This would reduce prototyping and test duration by reducing the level of hardware expertise needed to implement new algorithms. However, today's programmable solutions do not meet

  11. Note on: "Inevitability of Plate Tectonics on Super-Earths" by Valencia, O Connell and Sasselov, arXiv preprint 0710.0699

    OpenAIRE

    Omerbashich, Mensur

    2008-01-01

    Valencia et al. recently claimed that the mass of a Super-Earth (SE) is a sole factor in determining whether a SE is tectonically active or not. However, mass resolving astrometry is unable to discern between a SE and its moons if any. The fact that no exomoons have been discovered yet is rather a matter of instrumentation imperfection at the present, not of physical absence of exomoons. This, with recently discovered relationships between geometric and physical properties in astronomical bod...

  12. Safety critical FPGA-based NPP instrumentation and control systems: assessment, development and implementation

    International Nuclear Information System (INIS)

    Bakhmach, E. S.; Siora, A. A.; Tokarev, V. I.; Kharchenko, V. S.; Sklyar, V. V.; Andrashov, A. A.

    2010-10-01

    The stages of development, production, verification, licensing and implementation methods and technologies of safety critical instrumentation and control systems for nuclear power plants (NPP) based on FPGA (Field Programmable Gate Array) technologies are described. A life cycle model and multi-version technologies for dependability and safety assurance of FPGA-based instrumentation and control systems are discussed. An analysis of NPP instrumentation and control systems construction principles developed by Research and Production Corporation Radiy using FPGA technologies, and results of these systems' implementation and operation at Ukrainian and Bulgarian NPPs, are presented. The RADIY TM platform has been designed and developed by Research and Production Corporation Radiy, Ukraine. The main peculiarity of the RADIY TM platform is the use of FPGAs as programmable components for logic control operation. The FPGA-based RADIY TM platform used for NPP instrumentation and control systems development ensures scalability of system function types, volume and peculiarities (by changing the quantity and quality of sensors, actuators, input/output signals and control algorithms); scalability of dependability (safety integrity) (by changing the number of redundant channels, tiers, diagnostic and reconfiguration procedures); and scalability of diversity (by changing the types, depth and method of diversity selection). (Author)

  13. Sparse feature learning for instrument identification: Effects of sampling and pooling methods.

    Science.gov (United States)

    Han, Yoonchang; Lee, Subin; Nam, Juhan; Lee, Kyogu

    2016-05-01

    Feature learning for music applications has recently received considerable attention from many researchers. This paper reports on a sparse feature learning algorithm for musical instrument identification, and in particular focuses on the effects of the frame sampling techniques for dictionary learning and the pooling methods for feature aggregation. To this end, two frame sampling techniques are examined: fixed and proportional random sampling. Furthermore, the effect of using onset frames was analyzed for both proposed sampling methods. Regarding summarization of the feature activations, a standard deviation pooling method is used and compared with the commonly used max- and average-pooling techniques. Using more than 47 000 recordings of 24 instruments from various performers, playing styles, and dynamics, a number of tuning parameters are explored, including the analysis frame size, the dictionary size, and the type of frequency scaling, as well as the different sampling and pooling methods. The results show that the combination of proportional sampling and standard deviation pooling achieves the best overall performance of 95.62% while the optimal parameter set varies among the instrument classes.
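
    To make the pooling step concrete, the sketch below summarizes a frame-by-feature activation matrix into a single clip-level vector using max-, average- or standard-deviation pooling, the three strategies compared in the abstract. The toy activations and dimensions are invented for illustration and are unrelated to the paper's learned dictionary.

```python
import numpy as np

def pool_activations(activations, method="std"):
    """Summarize a (n_frames, n_features) activation matrix into one vector.

    'max', 'mean' and 'std' correspond to max-, average- and
    standard-deviation pooling over the time (frame) axis.
    """
    activations = np.asarray(activations, dtype=float)
    if method == "max":
        return activations.max(axis=0)
    if method == "mean":
        return activations.mean(axis=0)
    if method == "std":
        return activations.std(axis=0)
    raise ValueError(f"unknown pooling method: {method}")

# Toy example: 100 frames of 8-dimensional sparse-code activations.
rng = np.random.default_rng(0)
acts = np.abs(rng.normal(size=(100, 8))) * (rng.random((100, 8)) > 0.7)
clip_features = np.concatenate([pool_activations(acts, m) for m in ("mean", "std")])
print(clip_features.shape)   # one fixed-length feature vector per audio clip
```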

  14. There Are (super)Giants in the Sky: Searching for Misidentified Massive Stars in Algorithmically-Selected Quasar Catalogs

    Science.gov (United States)

    Dorn-Wallenstein, Trevor Z.; Levesque, Emily

    2017-11-01

    Thanks to incredible advances in instrumentation, surveys like the Sloan Digital Sky Survey have been able to find and catalog billions of objects, ranging from local M dwarfs to distant quasars. Machine learning algorithms have greatly aided in the effort to classify these objects; however, there are regimes where these algorithms fail, where interesting oddities may be found. We present here an X-ray bright quasar misidentified as a red supergiant/X-ray binary, and a subsequent search of the SDSS quasar catalog for X-ray bright stars misidentified as quasars.

  15. Hybrid Cryptosystem Using Tiny Encryption Algorithm and LUC Algorithm

    Science.gov (United States)

    Rachmawati, Dian; Sharif, Amer; Jaysilen; Andri Budiman, Mohammad

    2018-01-01

    Security is a very important issue in data transmission, and there are many methods to make files more secure. One of those methods is cryptography. Cryptography is a method of securing a file by writing hidden code to cover the original file. Therefore, people who are not involved in the cryptography cannot decrypt the hidden code to read the original file. Many methods are used in cryptography; one of them is the hybrid cryptosystem. A hybrid cryptosystem is a method that uses a symmetric algorithm to secure the file and an asymmetric algorithm to secure the symmetric algorithm key. In this research, the TEA algorithm is used as the symmetric algorithm and the LUC algorithm is used as the asymmetric algorithm. The system is tested by encrypting and decrypting the file with the TEA algorithm and using the LUC algorithm to encrypt and decrypt the TEA key. The result of this research is that, when the TEA algorithm is used to encrypt the file, the ciphertext consists of characters from the ASCII (American Standard Code for Information Interchange) table in the form of hexadecimal numbers, and the ciphertext size increases by sixteen bytes as the plaintext length increases by eight characters.
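
    For reference, the symmetric half of such a hybrid scheme is small enough to sketch in full: the Tiny Encryption Algorithm operates on 64-bit blocks with a 128-bit key over 32 rounds of simple shift, add and XOR operations. The key and plaintext block below are arbitrary example values; in the hybrid cryptosystem described above, the 128-bit TEA key itself would then be encrypted with the asymmetric LUC algorithm, which is not shown here.

```python
def tea_encrypt_block(v, key, rounds=32):
    """Encrypt one 64-bit block (two 32-bit words) with the Tiny Encryption Algorithm."""
    v0, v1 = v
    k0, k1, k2, k3 = key
    delta, total, mask = 0x9E3779B9, 0, 0xFFFFFFFF
    for _ in range(rounds):
        total = (total + delta) & mask
        v0 = (v0 + (((v1 << 4) + k0) ^ (v1 + total) ^ ((v1 >> 5) + k1))) & mask
        v1 = (v1 + (((v0 << 4) + k2) ^ (v0 + total) ^ ((v0 >> 5) + k3))) & mask
    return v0, v1

def tea_decrypt_block(v, key, rounds=32):
    """Inverse of tea_encrypt_block: undo the rounds in reverse order."""
    v0, v1 = v
    k0, k1, k2, k3 = key
    delta, mask = 0x9E3779B9, 0xFFFFFFFF
    total = (delta * rounds) & mask
    for _ in range(rounds):
        v1 = (v1 - (((v0 << 4) + k2) ^ (v0 + total) ^ ((v0 >> 5) + k3))) & mask
        v0 = (v0 - (((v1 << 4) + k0) ^ (v1 + total) ^ ((v1 >> 5) + k1))) & mask
        total = (total - delta) & mask
    return v0, v1

key = (0x01234567, 0x89ABCDEF, 0xFEDCBA98, 0x76543210)   # example 128-bit TEA key
cipher = tea_encrypt_block((0x12345678, 0x9ABCDEF0), key)
assert tea_decrypt_block(cipher, key) == (0x12345678, 0x9ABCDEF0)
print([hex(w) for w in cipher])
```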

  16. A Comprehensive Review of Sensors and Instrumentation Methods in Devices for Musical Expression

    Directory of Open Access Journals (Sweden)

    Carolina Brum Medeiros

    2014-07-01

    Digital Musical Instruments (DMIs) are musical instruments typically composed of a control surface where user interaction is measured by sensors whose values are mapped to sound synthesis algorithms. These instruments have gained interest among skilled musicians and performers in the last decades, leading to artistic practices including musical performance, interactive installations and dance. The creation of DMIs typically involves several areas, among them: arts, design and engineering. The balance between these areas is an essential task in DMI design so that the resulting instruments are aesthetically appealing, robust, and allow responsive, accurate and repeatable sensing. In this paper, we review the use of sensors in the DMI community as manifested in the proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2009–2013), focusing on the sensor technologies and signal conditioning techniques used by the NIME community. Although it has been claimed that specifications for artistic tools are harder than those for military applications, this study raises a paradox showing that in most of the cases, DMIs are based on a few basic sensor types and unsophisticated engineering solutions, not taking advantage of more advanced sensing, instrumentation and signal processing techniques that could dramatically improve their response. We aim to raise awareness of the limitations of any engineering solution and to assert the benefits of advanced electronics instrumentation design in DMIs. For this, we propose the use of specialized sensors such as strain gages, advanced conditioning circuits and signal processing tools such as sensor fusion. We believe that careful electronic instrumentation design may lead to more responsive instruments.

  17. Instrumental interaction

    OpenAIRE

    Luciani , Annie

    2007-01-01

    International audience; The expression instrumental interaction has been introduced by Claude Cadoz to identify a human-object interaction during which a human manipulates a physical object - an instrument - in order to perform a manual task. Classical examples of instrumental interaction are all the professional manual tasks: playing violin, cutting fabrics by hand, moulding a paste, etc.... Instrumental interaction differs from other types of interaction (called symbolic or iconic interactio...

  18. The MCLIB library: Monte Carlo simulation of neutron scattering instruments

    Energy Technology Data Exchange (ETDEWEB)

    Seeger, P.A.

    1995-09-01

    Monte Carlo is a method to integrate over a large number of variables. Random numbers are used to select a value for each variable, and the integrand is evaluated. The process is repeated a large number of times and the resulting values are averaged. For a neutron transport problem, first select a neutron from the source distribution, and project it through the instrument using either deterministic or probabilistic algorithms to describe its interaction whenever it hits something, and then (if it hits the detector) tally it in a histogram representing where and when it was detected. This is intended to simulate the process of running an actual experiment (but it is much slower). This report describes the philosophy and structure of MCLIB, a Fortran library of Monte Carlo subroutines which has been developed for design of neutron scattering instruments. A pair of programs (LQDGEOM and MC{_}RUN) which use the library are shown as an example.

  19. The MCLIB library: Monte Carlo simulation of neutron scattering instruments

    International Nuclear Information System (INIS)

    Seeger, P.A.

    1995-01-01

    Monte Carlo is a method to integrate over a large number of variables. Random numbers are used to select a value for each variable, and the integrand is evaluated. The process is repeated a large number of times and the resulting values are averaged. For a neutron transport problem, first select a neutron from the source distribution, and project it through the instrument using either deterministic or probabilistic algorithms to describe its interaction whenever it hits something, and then (if it hits the detector) tally it in a histogram representing where and when it was detected. This is intended to simulate the process of running an actual experiment (but it is much slower). This report describes the philosophy and structure of MCLIB, a Fortran library of Monte Carlo subroutines which has been developed for design of neutron scattering instruments. A pair of programs (LQDGEOM and MC RUN) which use the library are shown as an example
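
    The transport loop described above (select a neutron from the source, project it through the instrument, tally it if it reaches the detector) can be illustrated with a toy Monte Carlo sketch; the geometry, source widths and slit size below are invented for the example and have nothing to do with MCLIB or the LQDGEOM/MC_RUN programs themselves.

```python
import numpy as np

def simulate(n_neutrons=100_000, seed=1):
    """Toy Monte Carlo transport in the spirit described above (not MCLIB itself).

    Sample neutrons from a Gaussian source, pass them through a slit, drift to a
    detector plane, and tally arrival positions in a histogram.
    """
    rng = np.random.default_rng(seed)
    # 1. Select each neutron from the source distribution (position and divergence).
    x = rng.normal(0.0, 5.0, n_neutrons)            # mm, source width
    theta = rng.normal(0.0, 2e-3, n_neutrons)       # rad, beam divergence

    # 2. Project to a slit 1 m downstream; absorb whatever misses the opening.
    x_slit = x + 1000.0 * theta
    alive = np.abs(x_slit) < 2.0                    # 4 mm slit

    # 3. Drift survivors a further 2 m to the detector and tally a histogram.
    x_det = x_slit[alive] + 2000.0 * theta[alive]
    counts, edges = np.histogram(x_det, bins=50, range=(-20, 20))
    return counts, edges

counts, edges = simulate()
print("transmitted fraction:", counts.sum() / 100_000)
```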

  20. Sound algorithms

    OpenAIRE

    De Götzen , Amalia; Mion , Luca; Tache , Olivier

    2007-01-01

    International audience; We call sound algorithms the categories of algorithms that deal with digital sound signals. Sound algorithms appeared in the very infancy of computing. Sound algorithms present strong specificities that are the consequence of two dual considerations: the properties of the digital sound signal itself and its uses, and the properties of auditory perception.

  1. Instrumentation development

    International Nuclear Information System (INIS)

    Ubbes, W.F.; Yow, J.L. Jr.

    1988-01-01

    Instrumentation is developed for the Civilian Radioactive Waste Management Program to meet several different (and sometimes conflicting) objectives. This paper addresses instrumentation development for data needs that are related either directly or indirectly to a repository site, but does not touch on instrumentation for work with waste forms or other materials. Consequently, this implies a relatively large scale for the measurements, and an in situ setting for instrument performance. In this context, instruments are needed for site characterization to define phenomena, develop models, and obtain parameter values, and for later design and performance confirmation testing in the constructed repository. The former set of applications is more immediate, and is driven by the needs of program design and performance assessment activities. A host of general technical and nontechnical issues have arisen to challenge instrumentation development. Instruments can be classed into geomechanical, geohydrologic, or other specialty categories, but these issues cut across artificial classifications. These issues are outlined. Despite this imposing list of issues, several case histories are cited to evaluate progress in the area

  2. Instrumentation

    International Nuclear Information System (INIS)

    Decreton, M.

    2002-01-01

    SCK-CEN's R and D programme on instrumentation involves the development of advanced instrumentation systems for nuclear applications as well as the assessment of the performance of these instruments in a radiation environment. Particular emphasis is on the use of optical fibres as umbilical links of a remote handling unit for use during maintenance of a fusion reactor, studies on the radiation hardening of plasma diagnostic systems; investigations on new instrumentation for the future MYRRHA accelerator driven system; space applications related to radiation-hardened lenses; the development of new approaches for dose, temperature and strain measurements; the assessment of radiation-hardened sensors and motors for remote handling tasks and studies of dose measurement systems including the use of optical fibres. Progress and achievements in these areas for 2001 are described.

  3. Instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Decreton, M

    2002-04-01

    SCK-CEN's R and D programme on instrumentation involves the development of advanced instrumentation systems for nuclear applications as well as the assessment of the performance of these instruments in a radiation environment. Particular emphasis is on the use of optical fibres as umbilical links of a remote handling unit for use during maintenance of a fusion reactor, studies on the radiation hardening of plasma diagnostic systems; investigations on new instrumentation for the future MYRRHA accelerator driven system; space applications related to radiation-hardened lenses; the development of new approaches for dose, temperature and strain measurements; the assessment of radiation-hardened sensors and motors for remote handling tasks and studies of dose measurement systems including the use of optical fibres. Progress and achievements in these areas for 2001 are described.

  4. KNOW THE STAR, KNOW THE PLANET. V. CHARACTERIZATION OF THE STELLAR COMPANION TO THE EXOPLANET HOST STAR HD 177830

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Lewis C. Jr.; Beichman, Charles; Burruss, Rick; Cady, Eric; Lockhart, Thomas G. [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Pasadena CA 91109 (United States); Oppenheimer, Rebecca; Brenner, Douglas; Luszcz-Cook, Statia; Nilsson, Ricky [American Museum of Natural History, Central Park West at 79th Street, New York, NY 10024 (United States); Crepp, Justin R. [Department of Physics, University of Notre Dame, 225 Nieuwland Science Hall, Notre Dame, IN 46556 (United States); Baranec, Christoph [Institute for Astronomy, University of Hawai‘i at Mānoa, Hilo, HI 96720-2700 (United States); Dekany, Richard; Hillenbrand, Lynne [Division of Physics, Mathematics, and Astronomy, California Institute of Technology, Pasadena, CA 91125 (United States); Hinkley, Sasha [School of Physics, University of Exeter, Stocker Road, Exeter, EX4 4QL (United Kingdom); King, David; Parry, Ian R. [Institute of Astronomy, University of Cambridge, Madingley Road., Cambridge, CB3 OHA (United Kingdom); Pueyo, Laurent [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Sivaramakrishnan, Anand; Soummer, Rémi [Department of Astronomy, Stockholm University, AlbaNova University Center, Roslagstullsbacken 21, SE-10691 Stockholm (Sweden); Rice, Emily L., E-mail: lewis.c.roberts@jpl.nasa.gov [Department of Engineering Science and Physics, College of Staten Island, City University of New York, Staten Island, NY 10314 (United States); and others

    2015-10-15

    HD 177830 is an evolved K0IV star with two known exoplanets. In addition to the planetary companions it has a late-type stellar companion discovered with adaptive optics imagery. We observed the binary star system with the PHARO near-IR camera and the Project 1640 coronagraph. Using the Project 1640 coronagraph and integral field spectrograph we extracted a spectrum of the stellar companion. This allowed us to determine that the spectral type of the stellar companion is a M4 ± 1 V. We used both instruments to measure the astrometry of the binary system. Combining these data with published data, we determined that the binary star has a likely period of approximately 800 years with a semimajor axis of 100–200 AU. This implies that the stellar companion has had little or no impact on the dynamics of the exoplanets. The astrometry of the system should continue to be monitored, but due to the slow nature of the system, observations can be made once every 5–10 years.

  5. Analysis of key technologies for virtual instruments metrology

    Science.gov (United States)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to the software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of software imposes difficulties on metrological testing of VIs. Key approaches and technologies for metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics with the support of the powerful computing capability of a PC. Another concern is evaluation of software features like correctness, reliability, stability, security and real-time performance of VIs. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. In order to facilitate metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for the automatic tool is proposed in this paper. Investigation of the implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed automatic framework.

  6. Genetic algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithms concepts are introduced, genetic algorithm applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
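
    A minimal genetic-algorithm sketch in Python (tournament selection, single-point crossover, bit-flip mutation on a toy objective); it illustrates the basic concepts named above and is not the software tool referenced in the record:

        # Minimal genetic algorithm sketch (illustrative only). Maximizes the
        # number of 1-bits in a bitstring ("OneMax" toy objective).
        import random

        def fitness(bits):
            return sum(bits)

        def tournament(pop, k=3):
            return max(random.sample(pop, k), key=fitness)

        def crossover(a, b):
            cut = random.randrange(1, len(a))     # single-point crossover
            return a[:cut] + b[cut:]

        def mutate(bits, rate=0.01):
            return [1 - g if random.random() < rate else g for g in bits]

        def evolve(n_bits=32, pop_size=50, generations=100):
            pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
            for _ in range(generations):
                pop = [mutate(crossover(tournament(pop), tournament(pop)))
                       for _ in range(pop_size)]
            return max(pop, key=fitness)

        print(fitness(evolve()))                  # typically close to 32 after 100 generations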

  7. Vision through afocal instruments: generalized magnification and eye-instrument interaction

    Science.gov (United States)

    Harris, William F.; Evans, Tanya

    2018-04-01

    In Gaussian optics all observers experience the same magnification, the instrument's angular magnification, when viewing distant objects through a telescope or other afocal instruments. However, analysis in linear optics shows that this is not necessarily so in the presence of astigmatism. Because astigmatism may distort and rotate images it is appropriate to work with a generalized angular magnification represented by a 2 × 2 matrix. An expression is derived for the generalized magnification for an arbitrary eye looking through an arbitrary afocal instrument. With afocal instruments containing astigmatic refracting elements not all eyes experience the same generalized magnification; there is interaction between eye and instrument. Eye-instrument interaction may change as the instrument is rotated about its longitudinal axis, there being no interaction in particular orientations. A simple numerical example is given. For the sake of completeness, expressions for generalized magnification are also presented in the case of instruments that are not afocal and objects that are not distant.

  8. Comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry peak sorting algorithm.

    Science.gov (United States)

    Oh, Cheolhwan; Huang, Xiaodong; Regnier, Fred E; Buck, Charles; Zhang, Xiang

    2008-02-01

    We report a novel peak sorting method for the two-dimensional gas chromatography/time-of-flight mass spectrometry (GC x GC/TOF-MS) system. The objective of peak sorting is to recognize peaks from the same metabolite occurring in different samples from thousands of peaks detected in the analytical procedure. The developed algorithm is based on the fact that the chromatographic peaks for a given analyte have similar retention times in all of the chromatograms. Raw instrument data are first processed by ChromaTOF (Leco) software to provide the peak tables. Our algorithm achieves peak sorting by utilizing the first- and second-dimension retention times in the peak tables and the mass spectra generated during the process of electron impact ionization. The algorithm searches the peak tables for the peaks generated by the same type of metabolite using several search criteria. Our software also includes options to eliminate non-target peaks from the sorting results, e.g., peaks of contaminants. The developed software package has been tested using a mixture of standard metabolites and another mixture of standard metabolites spiked into human serum. Manual validation demonstrates high accuracy of peak sorting with this algorithm.
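
    A hedged sketch of the matching idea described above: peaks from two samples are paired when their first- and second-dimension retention times agree within tolerances and their mass spectra are similar. Tolerances, the data layout and the reference-sample strategy are illustrative assumptions, not the published implementation:

        # Retention-time + spectral-similarity peak matching (illustrative sketch).
        import numpy as np

        def cosine(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        def match_peaks(reference, sample, rt1_tol=5.0, rt2_tol=0.1, min_sim=0.8):
            """reference/sample: lists of dicts with 'rt1' (s), 'rt2' (s), 'spectrum' (np.array)."""
            pairs = []
            for i, r in enumerate(reference):
                best_j, best_sim = None, min_sim
                for j, s in enumerate(sample):
                    if abs(r["rt1"] - s["rt1"]) <= rt1_tol and abs(r["rt2"] - s["rt2"]) <= rt2_tol:
                        sim = cosine(r["spectrum"], s["spectrum"])
                        if sim > best_sim:
                            best_j, best_sim = j, sim
                if best_j is not None:
                    pairs.append((i, best_j, best_sim))   # same metabolite in both samples
            return pairs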

  9. Algorithm aversion: people erroneously avoid algorithms after seeing them err.

    Science.gov (United States)

    Dietvorst, Berkeley J; Simmons, Joseph P; Massey, Cade

    2015-02-01

    Research shows that evidence-based algorithms more accurately predict the future than do human forecasters. Yet when forecasters are deciding whether to use a human forecaster or a statistical algorithm, they often choose the human forecaster. This phenomenon, which we call algorithm aversion, is costly, and it is important to understand its causes. We show that people are especially averse to algorithmic forecasters after seeing them perform, even when they see them outperform a human forecaster. This is because people more quickly lose confidence in algorithmic than human forecasters after seeing them make the same mistake. In 5 studies, participants either saw an algorithm make forecasts, a human make forecasts, both, or neither. They then decided whether to tie their incentives to the future predictions of the algorithm or the human. Participants who saw the algorithm perform were less confident in it, and less likely to choose it over an inferior human forecaster. This was true even among those who saw the algorithm outperform the human.

  10. The Texas Medication Algorithm Project (TMAP) schizophrenia algorithms.

    Science.gov (United States)

    Miller, A L; Chiles, J A; Chiles, J K; Crismon, M L; Rush, A J; Shon, S P

    1999-10-01

    In the Texas Medication Algorithm Project (TMAP), detailed guidelines for medication management of schizophrenia and related disorders, bipolar disorders, and major depressive disorders have been developed and implemented. This article describes the algorithms developed for medication treatment of schizophrenia and related disorders. The guidelines recommend a sequence of medications and discuss dosing, duration, and switch-over tactics. They also specify response criteria at each stage of the algorithm for both positive and negative symptoms. The rationale and evidence for each aspect of the algorithms are presented.

  11. Multi-band algorithms for the estimation of chlorophyll concentration in the Chesapeake Bay

    KAUST Repository

    Gilerson, Alexander

    2015-10-14

    Standard blue-green ratio algorithms do not usually work well in turbid productive waters because of the contamination of the blue and green bands by CDOM absorption and scattering by non-algal particles. One of the alternative approaches is based on two- or three-band ratio algorithms in the red/NIR part of the spectrum, which require 665, 708 and 753 nm bands (or similar) and which work well in various waters all over the world. The critical 708 nm band for these algorithms is not available on the MODIS and VIIRS sensors, which limits applications of this approach. We report on another approach where a combination of the 745 nm band with blue-green-red bands is the basis for the new algorithms. A multi-band algorithm which includes the ratios Rrs(488)/Rrs(551) and Rrs(671)/Rrs(745), and a two-band algorithm based on the Rrs(671)/Rrs(745) ratio, were developed with the main focus on the Chesapeake Bay (USA) waters. These algorithms were tested on specially developed synthetic datasets, well representing the main relationships between water parameters in the Bay taken from the NASA NOMAD database and available literature; on field data collected by our group during a 2013 campaign in the Bay, as well as NASA SeaBASS data from other groups; and on matchups between satellite imagery and water parameters measured by the Chesapeake Bay program. Our results demonstrate that the coefficient of determination can be as high as R2 > 0.90 for the new algorithms in comparison with R2 = 0.6 for the standard OC3V algorithm on the same field dataset. Substantial improvement was also achieved by applying a similar approach (inclusion of the Rrs(667)/Rrs(753) ratio) for MODIS matchups. Results for VIIRS are not yet conclusive. © (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
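
    For illustration only, a toy band-ratio regression of the general form described above; the coefficients are placeholders, not the published Chesapeake Bay fit:

        # Toy multi-band ratio model: chl = a + b*(Rrs488/Rrs551) + c*(Rrs671/Rrs745).
        # The coefficients are placeholders, NOT the published regression.
        import numpy as np

        def chl_from_ratios(rrs488, rrs551, rrs671, rrs745, coeffs=(10.0, -15.0, 40.0)):
            """Returns a chlorophyll-a estimate (mg m^-3) from two reflectance ratios."""
            a, b, c = coeffs
            return a + b * (rrs488 / rrs551) + c * (rrs671 / rrs745)

        print(chl_from_ratios(0.004, 0.006, 0.003, 0.001))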

  12. Towards a practical implementation of the MLE algorithm for positron emission tomography

    International Nuclear Information System (INIS)

    Llacer, J.; Andreae, S.; Veklerov, E.; Hoffman, E.J.

    1986-01-01

    Recognizing that the quality of images obtained by application of the Maximum Likelihood Estimator (MLE) to Positron Emission Tomography (PET) and Single Photon Emission Tomography (SPECT) appears to be substantially better than those obtained by conventional methods, the authors have started to develop methods that will facilitate the necessary research for a good evaluation of the algorithm and may lead to its practical application for research and routine tomography. They have found that the non-linear MLE algorithm can be used with pixel sizes which are smaller than the sampling distance, without interpolation, obtaining excellent resolution and no noticeable increase in noise. They have studied the role of symmetry in reducing the amount of matrix element storage requirements for full size applications of the algorithm and have used that concept to carry out two reconstructions of the Derenzo phantom with data from the ECAT-III instrument. The results show excellent signal-to-noise (S/N) ratio, particularly for data with low total counts, excellent sharpness, but low contrast at high frequencies when using the Shepp-Vardi model for probability matrices
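
    A generic ML-EM update, the iteration family referred to above, sketched on a small dense toy system matrix (not ECAT-III data or the storage-symmetry scheme discussed in the record):

        # Generic MLE (ML-EM) update for emission tomography on a toy problem.
        import numpy as np

        def mlem(A, y, n_iter=50):
            """A: (n_bins, n_pixels) system matrix, y: measured counts per bin."""
            x = np.ones(A.shape[1])               # positive initial image
            sens = A.sum(axis=0)                  # sensitivity (backprojection of ones)
            for _ in range(n_iter):
                proj = A @ x                      # forward projection
                ratio = y / np.maximum(proj, 1e-12)
                x *= (A.T @ ratio) / np.maximum(sens, 1e-12)   # multiplicative update
            return x

        A = np.random.rand(64, 16)
        x_true = np.random.rand(16)
        y = np.random.poisson(A @ x_true * 100)
        print(mlem(A, y / 100.0)[:4])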

  13. A 1DVAR-based snowfall rate retrieval algorithm for passive microwave radiometers

    Science.gov (United States)

    Meng, Huan; Dong, Jun; Ferraro, Ralph; Yan, Banghua; Zhao, Limin; Kongoli, Cezar; Wang, Nai-Yu; Zavodsky, Bradley

    2017-06-01

    Snowfall rate retrieval from spaceborne passive microwave (PMW) radiometers has gained momentum in recent years. PMW can be so utilized because of its ability to sense in-cloud precipitation. A physically based, overland snowfall rate (SFR) algorithm has been developed using measurements from the Advanced Microwave Sounding Unit-A/Microwave Humidity Sounder sensor pair and the Advanced Technology Microwave Sounder. Currently, these instruments are aboard five polar-orbiting satellites, namely, NOAA-18, NOAA-19, Metop-A, Metop-B, and Suomi-NPP. The SFR algorithm relies on a separate snowfall detection algorithm that is composed of a satellite-based statistical model and a set of numerical weather prediction model-based filters. There are four components in the SFR algorithm itself: cloud properties retrieval, computation of ice particle terminal velocity, ice water content adjustment, and the determination of snowfall rate. The retrieval of cloud properties is the foundation of the algorithm and is accomplished using a one-dimensional variational (1DVAR) model. An existing model is adopted to derive ice particle terminal velocity. Since no measurement of cloud ice distribution is available when SFR is retrieved in near real time, such distribution is implicitly assumed by deriving an empirical function that adjusts retrieved SFR toward radar snowfall estimates. Finally, SFR is determined numerically from a complex integral. The algorithm has been validated against both radar and ground observations of snowfall events from the contiguous United States with satisfactory results. Currently, the SFR product is operationally generated at the National Oceanic and Atmospheric Administration and can be obtained from that organization.
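
    A generic one-dimensional variational (1DVAR) retrieval step, sketched for a toy linear forward model; the cost-function form is standard, but the state vector, forward model and covariances here are illustrative, not the operational SFR code:

        # Generic 1DVAR step: minimize
        # J(x) = (x-xb)^T B^-1 (x-xb) + (y-H(x))^T R^-1 (y-H(x)) by Gauss-Newton.
        import numpy as np

        def onedvar(y, xb, B, R, H, K, n_iter=10):
            """y: observations, xb: background state, B/R: error covariances,
            H(x): forward model, K(x): its Jacobian."""
            x = xb.copy()
            Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)
            for _ in range(n_iter):
                Kx = K(x)
                lhs = Binv + Kx.T @ Rinv @ Kx
                rhs = Kx.T @ Rinv @ (y - H(x)) - Binv @ (x - xb)
                x = x + np.linalg.solve(lhs, rhs)  # Gauss-Newton update
            return x

        # Toy linear forward model: y = G x
        G = np.array([[1.0, 0.5], [0.2, 1.0], [0.3, 0.3]])
        x_true = np.array([2.0, 1.0])
        y = G @ x_true
        xb = np.array([1.5, 1.5])
        print(onedvar(y, xb, np.eye(2), 0.01 * np.eye(3), lambda x: G @ x, lambda x: G))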

  14. Gaia: unraveling the chemical and dynamical history of our Galaxy

    OpenAIRE

    Pancino, E.

    2010-01-01

    The Gaia astrometric mission - the Hipparcos successor - is described in some detail, with its three instruments: the two (spectro)photometers (BP and RP) covering the range 330-1050 nm, the white light (G-band) imager dedicated to astrometry, and the radial velocity spectrometer (RVS) covering the range 847-874 nm at a resolution R ≈ 11500. The whole sky will be scanned repeatedly providing data for ~10^9 point-like objects, down to a magnitude of V ≈ 20, aiming at the full 6D reco...

  15. An Automated Algorithm for Identifying and Tracking Transverse Waves in Solar Images

    Science.gov (United States)

    Weberg, Micah J.; Morton, Richard J.; McLaughlin, James A.

    2018-01-01

    Recent instrumentation has demonstrated that the solar atmosphere supports omnipresent transverse waves, which could play a key role in energizing the solar corona. Large-scale studies are required in order to build up an understanding of the general properties of these transverse waves. To help facilitate this, we present an automated algorithm for identifying and tracking features in solar images and extracting the wave properties of any observed transverse oscillations. We test and calibrate our algorithm using a set of synthetic data, which includes noise and rotational effects. The results indicate an accuracy of 1%–2% for displacement amplitudes and 4%–10% for wave periods and velocity amplitudes. We also apply the algorithm to data from the Atmospheric Imaging Assembly on board the Solar Dynamics Observatory and find good agreement with previous studies. Of note, we find that 35%–41% of the observed plumes exhibit multiple wave signatures, which indicates either the superposition of waves or multiple independent wave packets observed at different times within a single structure. The automated methods described in this paper represent a significant improvement on the speed and quality of direct measurements of transverse waves within the solar atmosphere. This algorithm unlocks a wide range of statistical studies that were previously impractical.
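
    A sketch of only the final wave-property step: fitting a sinusoid to a tracked feature's transverse displacement series to recover amplitude, period and velocity amplitude. The feature identification and tracking stages are not reproduced, and the cadence and noise level are assumptions:

        # Fit a sinusoid to a transverse displacement time series (illustrative).
        import numpy as np
        from scipy.optimize import curve_fit

        def wave(t, amp, period, phase, offset):
            return amp * np.sin(2 * np.pi * t / period + phase) + offset

        t = np.arange(0, 600, 12.0)                       # s, assuming a 12 s cadence
        disp = wave(t, 150.0, 180.0, 0.3, 0.0) + np.random.normal(0, 15, t.size)  # km

        p0 = [np.std(disp) * np.sqrt(2), 200.0, 0.0, np.mean(disp)]
        (amp, period, phase, offset), _ = curve_fit(wave, t, disp, p0=p0)
        v_amp = 2 * np.pi * amp / period                  # velocity amplitude, km/s
        print(f"amplitude {amp:.0f} km, period {period:.0f} s, v_amp {v_amp:.1f} km/s")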

  16. Real time processing of neutron monitor data using the edge editor algorithm

    Directory of Open Access Journals (Sweden)

    Mavromichalaki Helen

    2012-09-01

    Full Text Available The nucleonic component of the secondary cosmic rays is measured by the worldwide network of neutron monitors (NMs). In most cases, a NM station publishes the measured data on a real-time basis in order to be available for instant use by the scientific community. The space weather centers and the online applications such as the ground level enhancement (GLE) alert make use of the online data and are highly dependent on their quality. However, the primary data in some cases are distorted due to unpredictable instrument variations. For this reason, real-time processing of the primary data measured by a station is necessary. The general operational principle of the correction algorithms is the comparison between the different channels of a NM, taking advantage of the fact that a station hosts a number of identical detectors. Median editor, Median editor plus and Super editor are some of the correction algorithms that are being used with satisfactory results. In this work an alternative algorithm is proposed and analyzed. The new algorithm uses a statistical approach to define the distribution of the measurements and introduces an error index which is used for the correction of the measurements that deviate from this distribution.
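
    For comparison, a sketch of a Median-editor-style correction (compare each channel with the median over channels and replace outliers); the thresholds are illustrative, and the edge editor proposed in the record uses a different statistical index:

        # Median-editor-style channel correction for a multi-tube neutron monitor.
        import numpy as np

        def median_edit(counts, k=3.5):
            """counts: (n_channels, n_intervals) raw count rates."""
            corrected = counts.astype(float).copy()
            med = np.median(counts, axis=0)                     # per-interval median
            mad = np.median(np.abs(counts - med), axis=0) + 1e-9
            bad = np.abs(counts - med) > k * 1.4826 * mad       # robust z-score test
            # replace flagged values with the channel-scaled median of the other tubes
            scale = counts.mean(axis=1, keepdims=True) / counts.mean()
            corrected[bad] = (scale * med)[bad]
            return corrected, bad

        raw = np.random.poisson(100, size=(6, 60)).astype(float)
        raw[2, 30] = 400                                        # simulated channel glitch
        fixed, flags = median_edit(raw)
        print(flags[2, 30], fixed[2, 30])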

  17. On DESTINY Science Instrument Electrical and Electronics Subsystem Framework

    Science.gov (United States)

    Kizhner, Semion; Benford, Dominic J.; Lauer, Tod R.

    2009-01-01

    Future space missions are going to require large focal planes with many sensing arrays and hundreds of millions of pixels, all read out at high data rates. This will place unique demands on the electrical and electronics (EE) subsystem design and it will be critically important to have high technology readiness level (TRL) EE concepts ready to support such missions. One such mission is the Joint Dark Energy Mission (JDEM), charged with making precise measurements of the expansion rate of the universe to reveal vital clues about the nature of dark energy - a hypothetical form of energy that permeates all of space and tends to increase the rate of the expansion. One of the three JDEM concept studies - the Dark Energy Space Telescope (DESTINY) - was conducted in 2008 at NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Maryland. This paper presents the EE subsystem framework, which evolved from the DESTINY science instrument study. It describes the main challenges and implementation concepts related to the design of an EE subsystem featuring multiple focal planes populated with dozens of large arrays and millions of pixels. The focal planes are passively cooled to cryogenic temperatures (below 140 K). The sensor mosaic is controlled by a large number of Readout Integrated Circuits and Application Specific Integrated Circuits - the ROICs/ASICs - in near proximity to their sensor focal planes. The ASICs, in turn, are serviced by a set of "warm" EE subsystem boxes performing Field Programmable Gate Array (FPGA) based digital signal processing (DSP) computations of complex algorithms, such as the sampling-up-the-ramp algorithm (SUTR), over large volumes of fast data streams. The SUTR boxes are supported by the Instrument Control/Command and Data Handling box (ICDH Primary and Backup boxes) for lossless data compression, command and low volume telemetry handling, power conversion and for communications with the spacecraft. The paper outlines how the JDEM DESTINY concept
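
    A sketch of the arithmetic at the core of a sample-up-the-ramp computation, a least-squares slope fit to the non-destructive reads of each pixel; the flight implementation is FPGA fixed-point DSP, so this floating-point version only illustrates the idea:

        # Least-squares slope per pixel from up-the-ramp reads (illustrative).
        import numpy as np

        def sutr_slope(ramp, dt=1.0):
            """ramp: (n_reads, n_pixels) accumulated counts; returns counts/s per pixel."""
            n = ramp.shape[0]
            t = np.arange(n) * dt
            t_mean = t.mean()
            denom = np.sum((t - t_mean) ** 2)
            return ((t - t_mean) @ (ramp - ramp.mean(axis=0))) / denom   # LSQ slope

        ramp = np.cumsum(np.random.poisson(50, size=(16, 4)), axis=0)    # 16 reads, 4 pixels
        print(sutr_slope(ramp))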

  18. Pancreatitis Quality of Life Instrument: Development of a new instrument

    Directory of Open Access Journals (Sweden)

    Wahid Wassef

    2014-02-01

    Full Text Available Objectives: The goal of this project was to develop the first disease-specific instrument for the evaluation of quality of life in chronic pancreatitis. Methods: Focus groups and interview sessions were conducted, with chronic pancreatitis patients, to identify items felt to impact quality of life which were subsequently formatted into a paper-and-pencil instrument. This instrument was used to conduct an online survey by an expert panel of pancreatologists to evaluate its content validity. Finally, the modified instrument was presented to patients during precognitive testing interviews to evaluate its clarity and appropriateness. Results: In total, 10 patients were enrolled in the focus groups and interview sessions where they identified 50 items. Once redundant items were removed, the 40 remaining items were made into a paper-and-pencil instrument referred to as the Pancreatitis Quality of Life Instrument. Through the processes of content validation and precognitive testing, the number of items in the instrument was reduced to 24. Conclusions: This marks the development of the first disease-specific instrument to evaluate quality of life in chronic pancreatitis. It includes unique features not found in generic instruments (economic factors, stigma, and spiritual factors. Although this marks a giant step forward, psychometric evaluation is still needed prior to its clinical use.

  19. Algorithming the Algorithm

    DEFF Research Database (Denmark)

    Mahnke, Martina; Uprichard, Emma

    2014-01-01

    Imagine sailing across the ocean. The sun is shining, vastness all around you. And suddenly [BOOM] you’ve hit an invisible wall. Welcome to the Truman Show! Ever since Eli Pariser published his thoughts on a potential filter bubble, this movie scenario seems to have become reality, just with slight...... changes: it’s not the ocean, it’s the internet we’re talking about, and it’s not a TV show producer, but algorithms that constitute a sort of invisible wall. Building on this assumption, most research is trying to ‘tame the algorithmic tiger’. While this is a valuable and often inspiring approach, we...

  20. Monte Carlo simulation of neutron scattering instruments

    International Nuclear Information System (INIS)

    Seeger, P.A.; Daemen, L.L.; Hjelm, R.P. Jr.

    1998-01-01

    A code package consisting of the Monte Carlo Library MCLIB, the executing code MC RUN, the web application MC Web, and various ancillary codes is proposed as an open standard for simulation of neutron scattering instruments. The architecture of the package includes structures to define surfaces, regions, and optical elements contained in regions. A particle is defined by its vector position and velocity, its time of flight, its mass and charge, and a polarization vector. The MC RUN code handles neutron transport and bookkeeping, while the action on the neutron within any region is computed using algorithms that may be deterministic, probabilistic, or a combination. Complete versatility is possible because the existing library may be supplemented by any procedures a user is able to code. Some examples are shown
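
    A data-structure sketch mirroring the particle description above (position, velocity, time of flight, mass, charge, polarization, plus a Monte Carlo weight); the field names are illustrative, not the MCLIB definitions:

        # Illustrative particle state for a neutron-transport Monte Carlo.
        from dataclasses import dataclass, field
        import numpy as np

        @dataclass
        class Particle:
            position: np.ndarray                      # m, laboratory frame
            velocity: np.ndarray                      # m/s
            tof: float = 0.0                          # time of flight, s
            mass: float = 1.674927e-27                # kg (neutron)
            charge: float = 0.0                       # C
            polarization: np.ndarray = field(default_factory=lambda: np.zeros(3))
            weight: float = 1.0                       # Monte Carlo statistical weight

            def drift(self, dt):
                """Free flight through a field-free region."""
                self.position = self.position + self.velocity * dt
                self.tof += dt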

  1. Pseudo-deterministic Algorithms

    OpenAIRE

    Goldwasser , Shafi

    2012-01-01

    International audience; In this talk we describe a new type of probabilistic algorithm which we call Bellagio Algorithms: a randomized algorithm which is guaranteed to run in expected polynomial time, and to produce a correct and unique solution with high probability. These algorithms are pseudo-deterministic: they can not be distinguished from deterministic algorithms in polynomial time by a probabilistic polynomial time observer with black box access to the algorithm. We show a necessary an...

  2. The Algorithmic Imaginary

    DEFF Research Database (Denmark)

    Bucher, Taina

    2017-01-01

    ... of algorithms affect people's use of these platforms, if at all? To help answer these questions, this article examines people's personal stories about the Facebook algorithm through tweets and interviews with 25 ordinary users. To understand the spaces where people and algorithms meet, this article develops the notion of the algorithmic imaginary. It is argued that the algorithmic imaginary – ways of thinking about what algorithms are, what they should be and how they function – is not just productive of different moods and sensations but plays a generative role in moulding the Facebook algorithm itself...

  3. Safety critical FPGA-based NPP instrumentation and control systems: assessment, development and implementation

    Energy Technology Data Exchange (ETDEWEB)

    Bakhmach, E. S.; Siora, A. A.; Tokarev, V. I. [Research and Production Corporation Radiy, 29 Geroev Stalingrada Str., Kirovograd 25006 (Ukraine); Kharchenko, V. S.; Sklyar, V. V.; Andrashov, A. A., E-mail: marketing@radiy.co [Center for Safety Infrastructure-Oriented Research and Analysis, 37 Astronomicheskaya Str., Kharkiv 61085 (Ukraine)

    2010-10-15

    The stages of development, production, verification, licensing and implementation methods and technologies of safety critical instrumentation and control systems for nuclear power plants (NPP) based on FPGA (Field Programmable Gate Arrays) technologies are described. A life cycle model and multi-version technologies of dependability and safety assurance of FPGA-based instrumentation and control systems are discussed. An analysis of NPP instrumentation and control systems construction principles developed by Research and Production Corporation Radiy using FPGA technologies and results of these systems implementation and operation at Ukrainian and Bulgarian NPP are presented. The RADIY™ platform has been designed and developed by Research and Production Corporation Radiy, Ukraine. The main peculiarity of the RADIY™ platform is the use of FPGA as programmable components for logic control operation. The FPGA-based RADIY™ platform used for NPP instrumentation and control systems development ensures scalability of system function types, volume and peculiarities (by changing quantity and quality of sensors, actuators, input/output signals and control algorithms); scalability of dependability (safety integrity) (by changing the number of redundant channels, tiers, diagnostic and reconfiguration procedures); and scalability of diversity (by changing types, depth and method of diversity selection). (Author)

  4. Novel mathematical algorithm for pupillometric data analysis.

    Science.gov (United States)

    Canver, Matthew C; Canver, Adam C; Revere, Karen E; Amado, Defne; Bennett, Jean; Chung, Daniel C

    2014-01-01

    Pupillometry is used clinically to evaluate retinal and optic nerve function by measuring the pupillary response to light stimuli. We have developed a mathematical algorithm to automate and expedite the analysis of non-filtered, non-calculated pupillometric data obtained from mouse pupillary light reflex recordings, i.e., dynamic pupillary diameter recordings following exposure to varying light intensities. The non-filtered, non-calculated pupillometric data are filtered through a low-pass finite impulse response (FIR) filter. Thresholding is used to remove data caused by eye blinking, loss of pupil tracking, and/or head movement. Twelve physiologically relevant parameters were extracted from the collected data: (1) baseline diameter, (2) minimum diameter, (3) response amplitude, (4) re-dilation amplitude, (5) percent of baseline diameter, (6) response time, (7) re-dilation time, (8) average constriction velocity, (9) average re-dilation velocity, (10) maximum constriction velocity, (11) maximum re-dilation velocity, and (12) onset latency. No significant differences were noted between parameters derived from algorithm-calculated values and manually derived results (p ≥ 0.05). This mathematical algorithm will expedite endpoint data derivation and eliminate human error in the manual calculation of pupillometric parameters from non-filtered, non-calculated pupillometric values. Subsequently, these values can be used as reference metrics for characterizing the natural history of retinal disease. Furthermore, it will be instrumental in the assessment of functional visual recovery in humans and pre-clinical models of retinal degeneration and optic nerve disease following pharmacological or gene-based therapies. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
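
    A hedged sketch of the processing chain described above: interpolation over blink artifacts, low-pass FIR filtering, and extraction of a few of the twelve parameters. The cut-off frequency, blink criterion and stimulus handling are illustrative assumptions, not the published settings:

        # FIR filtering plus extraction of a subset of the pupillometric parameters.
        import numpy as np
        from scipy.signal import firwin, filtfilt

        def analyze_pupil(diam, fs=30.0, stim_idx=30):
            """diam: pupil diameter trace (mm); stim_idx: sample index of light onset."""
            diam = diam.astype(float)
            bad = (diam <= 0) | (np.abs(np.gradient(diam)) > 1.0)   # blinks / lost tracking
            diam[bad] = np.interp(np.flatnonzero(bad), np.flatnonzero(~bad), diam[~bad])
            taps = firwin(31, 4.0, fs=fs)                            # 4 Hz low-pass FIR
            smooth = filtfilt(taps, [1.0], diam)
            baseline = smooth[:stim_idx].mean()
            i_min = int(np.argmin(smooth[stim_idx:]) + stim_idx)
            return {
                "baseline_diameter": baseline,
                "minimum_diameter": smooth[i_min],
                "response_amplitude": baseline - smooth[i_min],
                "percent_of_baseline": 100.0 * smooth[i_min] / baseline,
                "response_time_s": (i_min - stim_idx) / fs,
                "max_constriction_velocity": float(np.max(-np.gradient(smooth, 1.0 / fs))),
            }

        t = np.arange(0, 10, 1 / 30.0)                               # 10 s at 30 Hz
        trace = 4.0 - 1.5 * (t > 1) * (1 - np.exp(-(t - 1) / 0.4))   # simulated constriction
        print(analyze_pupil(trace + np.random.normal(0, 0.02, t.size), fs=30.0, stim_idx=30))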

  5. HIGH-PRECISION RADIO AND INFRARED ASTROMETRY OF LSPM J1314+1320AB. I. PARALLAX, PROPER MOTIONS, AND LIMITS ON PLANETS

    Energy Technology Data Exchange (ETDEWEB)

    Forbrich, Jan [University of Vienna, Department of Astrophysics, Türkenschanzstr. 17, A-1180 Vienna (Austria); Dupuy, Trent J.; Rizzuto, Aaron; Mann, Andrew W.; Kraus, Adam L. [The University of Texas at Austin, Department of Astronomy, 2515 Speedway C1400, Austin, TX 78712 (United States); Reid, Mark J.; Berger, Edo [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Liu, Michael C.; Aller, Kimberly [Institute for Astronomy, University of Hawai’i, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States)

    2016-08-10

    We present multi-epoch astrometric radio observations with the Very Long Baseline Array (VLBA) of the young ultracool-dwarf binary LSPM J1314+1320AB. The radio emission comes from the secondary star. Combining the VLBA data with Keck near-infrared adaptive-optics observations of both components, a full astrometric fit of parallax (π_abs = 57.975 ± 0.045 mas, corresponding to a distance of d = 17.249 ± 0.013 pc), proper motion (μ_α cos δ = −247.99 ± 0.10 mas yr⁻¹, μ_δ = −183.58 ± 0.22 mas yr⁻¹), and orbital motion is obtained. Despite the fact that the two components have nearly identical masses to within ±2%, the secondary's radio emission exceeds that of the primary by a factor of ≳30, suggesting a difference in stellar rotation history, which could result in different magnetic field configurations. Alternatively, the emission could be anisotropic and beamed toward us for the secondary but not for the primary. Using only reflex motion, we exclude planets of mass 0.7–10 M_Jup with orbital periods of 600–10 days, respectively. Additionally, we use the full orbital solution of the binary to derive an upper limit for the semimajor axis of 0.23 au for stable planetary orbits within this system. These limits cover a parameter space that is inaccessible with, and complementary to, near-infrared radial velocity surveys of ultracool dwarfs. Our absolute astrometry will constitute an important test for the astrometric calibration of Gaia.

  6. The WHO 2016 verbal autopsy instrument: An international standard suitable for automated analysis by InterVA, InSilicoVA, and Tariff 2.0.

    Directory of Open Access Journals (Sweden)

    Erin K Nichols

    2018-01-01

    Full Text Available Verbal autopsy (VA is a practical method for determining probable causes of death at the population level in places where systems for medical certification of cause of death are weak. VA methods suitable for use in routine settings, such as civil registration and vital statistics (CRVS systems, have developed rapidly in the last decade. These developments have been part of a growing global momentum to strengthen CRVS systems in low-income countries. With this momentum have come pressure for continued research and development of VA methods and the need for a single standard VA instrument on which multiple automated diagnostic methods can be developed.In 2016, partners harmonized a WHO VA standard instrument that fully incorporates the indicators necessary to run currently available automated diagnostic algorithms. The WHO 2016 VA instrument, together with validated approaches to analyzing VA data, offers countries solutions to improving information about patterns of cause-specific mortality. This VA instrument offers the opportunity to harmonize the automated diagnostic algorithms in the future.Despite all improvements in design and technology, VA is only recommended where medical certification of cause of death is not possible. The method can nevertheless provide sufficient information to guide public health priorities in communities in which physician certification of deaths is largely unavailable. The WHO 2016 VA instrument, together with validated approaches to analyzing VA data, offers countries solutions to improving information about patterns of cause-specific mortality.

  7. Validation of the Welch Allyn SureBP (inflation) and StepBP (deflation) algorithms by AAMI standard testing and BHS data analysis.

    Science.gov (United States)

    Alpert, Bruce S

    2011-04-01

    We evaluated two new Welch Allyn automated blood pressure (BP) algorithms. The first, SureBP, estimates BP during cuff inflation; the second, StepBP, does so during deflation. We followed the American National Standards Institute/Association for the Advancement of Medical Instrumentation SP10:2006 standard for testing and data analysis. The data were also analyzed using the British Hypertension Society analysis strategy. We tested children, adolescents, and adults. The requirements of the American National Standards Institute/Association for the Advancement of Medical Instrumentation SP10:2006 standard were fulfilled with respect to BP levels, arm sizes, and ages. Association for the Advancement of Medical Instrumentation SP10 Method 1 data analysis was used. The mean±standard deviation for the device readings compared with auscultation by paired, trained, blinded observers in the SureBP mode were -2.14±7.44 mmHg for systolic BP (SBP) and -0.55±5.98 mmHg for diastolic BP (DBP). In the StepBP mode, the differences were -3.61±6.30 mmHg for SBP and -2.03±5.30 mmHg for DBP. Both algorithms achieved an A grade for both SBP and DBP by British Hypertension Society analysis. The SureBP inflation-based algorithm will be available in many new-generation Welch Allyn monitors. Its use will reduce the time it takes to estimate BP in critical patient care circumstances. The device will not need to inflate to excessive suprasystolic BPs to obtain the SBP values. Deflation is rapid once SBP has been determined, thus reducing the total time of cuff inflation and reducing patient discomfort. If the SureBP fails to obtain a BP value, the StepBP algorithm is activated to estimate BP by traditional deflation methodology.

  8. Proposal for a Universal Test Mirror for Characterization of Slope Measuring Instruments

    International Nuclear Information System (INIS)

    Yashchuk, Valeriy V.; McKinney, Wayne R.; Warwick, Tony; Noll, Tino; Siewert, Frank; Zeschke, Thomas; Geckeler, Ralf D.

    2007-01-01

    The development of third generation light sources like the Advanced Light Source (ALS) or BESSY II brought to a focus the need for high performance synchrotron optics with unprecedented tolerances for slope error and micro roughness. Proposed beam lines at Free Electron Lasers (FEL) require optical elements up to a length of one meter, characterized by a residual slope error in the range of 0.1 µrad (rms), and rms values of 0.1 nm for micro roughness. These optical elements must be inspected by highly accurate measuring instruments, providing a measurement uncertainty lower than the specified accuracy of the surface under test. It is essential that metrology devices in use at synchrotron laboratories be precisely characterized and calibrated to achieve this target. In this paper we discuss a proposal for a Universal Test Mirror (UTM) as a realization of a high performance calibration instrument. The instrument would provide an ideal calibration surface capable of replicating the figure of a surface under test. The application of a sophisticated calibration instrument will allow the elimination of the majority of the systematic error from the error budget of an individual measurement of a particular optical element. We present the limitations of existing methods, initial UTM design considerations, possible calibration algorithms, and an estimation of the expected accuracy

  9. An Adaptive Filtering Algorithm Based on Genetic Algorithm-Backpropagation Network

    Directory of Open Access Journals (Sweden)

    Kai Hu

    2013-01-01

    Full Text Available A new image filtering algorithm is proposed. The GA-BPN algorithm uses a genetic algorithm (GA) to determine the weights of a backpropagation neural network (BPN). It has better global optimization characteristics than traditional optimization algorithms. In this paper, we apply GA-BPN to image noise filtering. First, training samples are used to train the GA-BPN as a noise detector. Then, the well-trained GA-BPN is used to recognize noise pixels in the target image. Finally, an adaptive weighted average algorithm is used to restore the noise pixels recognized by the GA-BPN. Experimental data show that this algorithm performs better than other filters.

  10. Medical image registration algorithms assesment Bronze Standard application enactment on grids using the MOTEUR workflow engine

    CERN Document Server

    Glatard, T; Pennec, X

    2006-01-01

    Medical image registration is pre-processing needed for many medical image analysis procedures. A very large number of registration algorithms are available today, but their performance is often not known and very difficult to assess due to the lack of gold standard. The Bronze Standard algorithm is a very data and compute intensive statistical approach for quantifying registration algorithms accuracy. In this paper, we describe the Bronze Standard application and we discuss the need for grids to tackle such computations on medical image databases. We demonstrate MOTEUR, a service-based workflow engine optimized for dealing with data intensive applications. MOTEUR eases the enactment of the Bronze Standard and similar applications on the EGEE production grid infrastructure. It is a generic workflow engine, based on current standards and freely available, that can be used to instrument legacy application code at low cost.

  11. An algorithm for enhanced formation flying of satellites in low earth orbit

    Science.gov (United States)

    Folta, David C.; Quinn, David A.

    1998-01-01

    With scientific objectives for Earth observation programs becoming more ambitious and spacecraft becoming more autonomous, the need for innovative technical approaches on the feasibility of achieving and maintaining formations of spacecraft has come to the forefront. The trend to develop small low-cost spacecraft has led many scientists to recognize the advantage of flying several spacecraft in formation to achieve the correlated instrument measurements formerly possible only by flying many instruments on a single large platform. Yet, formation flying imposes additional complications on orbit maintenance, especially when each spacecraft has its own orbit requirements. However, advances in automation and technology proposed by the Goddard Space Flight Center (GSFC) allow more of the burden in maneuver planning and execution to be placed onboard the spacecraft, mitigating some of the associated operational concerns. The purpose of this paper is to present GSFC's Guidance, Navigation, and Control Center's (GNCC) algorithm for Formation Flying of the low earth orbiting spacecraft that is part of the New Millennium Program (NMP). This system will be implemented as a close-loop flight code onboard the NMP Earth Orbiter-1 (EO-1) spacecraft. Results of this development can be used to determine the appropriateness of formation flying for a particular case as well as operational impacts. Simulation results using this algorithm integrated in an autonomous `fuzzy logic' control system called AutoCon™ are presented.

  12. Nature-inspired optimization algorithms

    CERN Document Server

    Yang, Xin-She

    2014-01-01

    Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning

  13. Implementation of sepsis algorithm by nurses in the intensive care unit

    Directory of Open Access Journals (Sweden)

    Paula Pedroso Peninck

    2012-04-01

    Full Text Available Sepsis is defined as a clinical syndrome consisting of a systemic inflammatory response associated with an infection, which may lead to dysfunction or failure of multiple organs. This research aims to assess the implementation of the sepsis algorithm by nurses in the Intensive Care Unit and to create an operational nursing assistance guide. This is an exploratory, descriptive study with a quantitative approach. A data collection instrument based on relevant literature was elaborated, assessed, corrected and validated. The sample consisted of 20 intensive care unit nurses. We obtained satisfactory evaluations of nurses' performance, but some issues did not reach 50% accuracy. We emphasize the importance of a greater number of nurses becoming acquainted with and correctly applying the sepsis algorithm. Based on the above, an operational septic-patient nursing assistance guide was created, grounded in the difficulties identified for the variables studied and in the relevant literature.

  14. Convex hull ranking algorithm for multi-objective evolutionary algorithms

    NARCIS (Netherlands)

    Davoodi Monfrared, M.; Mohades, A.; Rezaei, J.

    2012-01-01

    Due to many applications of multi-objective evolutionary algorithms in real world optimization problems, several studies have been done to improve these algorithms in recent years. Since most multi-objective evolutionary algorithms are based on the non-dominated principle, and their complexity

  15. Total algorithms

    NARCIS (Netherlands)

    Tel, G.

    We define the notion of total algorithms for networks of processes. A total algorithm enforces that a "decision" is taken by a subset of the processes, and that participation of all processes is required to reach this decision. Total algorithms are an important building block in the design of

  16. Magnetospheric Multiscale Instrument Suite Operations and Data System

    Science.gov (United States)

    Baker, D. N.; Riesberg, L.; Pankratz, C. K.; Panneton, R. S.; Giles, B. L.; Wilder, F. D.; Ergun, R. E.

    2016-03-01

    The four Magnetospheric Multiscale (MMS) spacecraft will collect a combined volume of ˜100 gigabits per day of particle and field data. On average, only 4 gigabits of that volume can be transmitted to the ground. To maximize the scientific value of each transmitted data segment, MMS has developed the Science Operations Center (SOC) to manage science operations, instrument operations, and selection, downlink, distribution, and archiving of MMS science data sets. The SOC is managed by the Laboratory for Atmospheric and Space Physics (LASP) in Boulder, Colorado and serves as the primary point of contact for community participation in the mission. MMS instrument teams conduct their operations through the SOC, and utilize the SOC's Science Data Center (SDC) for data management and distribution. The SOC provides a single mission data archive for the housekeeping and science data, calibration data, ephemerides, attitude and other ancillary data needed to support the scientific use and interpretation. All levels of data products will reside at and be publicly disseminated from the SDC. Documentation and metadata describing data products, algorithms, instrument calibrations, validation, and data quality will be provided. Arguably, the most important innovation developed by the SOC is the MMS burst data management and selection system. With nested automation and "Scientist-in-the-Loop" (SITL) processes, these systems are designed to maximize the value of the burst data by prioritizing the data segments selected for transmission to the ground. This paper describes the MMS science operations approach, processes and data systems, including the burst system and the SITL concept.

  17. The effect of Scratch environment on student’s achievement in teaching algorithm

    Directory of Open Access Journals (Sweden)

    Mehmet Tekerek

    2014-08-01

    Full Text Available In this study, the effect of Scratch environment in teaching algorithm in elementary school 6th grade Information and Communication Technologies course was examined. The research method was experimental method. Control group, pretest-posttest design of experimental research method and a convenience sample consisting of 60 6th grade students were used. The research instrument was achievement test to determine the effect of Scratch on learning algorithm. During the implementation process experiment group studied using Scratch and control group studied with traditional methods. The data was analyzed using independent-samples t-test, paired-samples t-test and ANCOVA statistics. According to findings there is no statically significant difference between posttest achievement scores of experiment and control groups. Similarly, In terms of gender there isn’t a statically significant difference between posttest scores of experiment and control groups.

  18. Progressive geometric algorithms

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Bagautdinov, T.M.; de Berg, M.T.; Bouts, Q.W.; ten Brink, Alex P.; Buchin, K.A.; Westenberg, M.A.

    2015-01-01

    Progressive algorithms are algorithms that, on the way to computing a complete solution to the problem at hand, output intermediate solutions that approximate the complete solution increasingly well. We present a framework for analyzing such algorithms, and develop efficient progressive algorithms

  19. Progressive geometric algorithms

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Bagautdinov, T.M.; Berg, de M.T.; Bouts, Q.W.; Brink, ten A.P.; Buchin, K.; Westenberg, M.A.

    2014-01-01

    Progressive algorithms are algorithms that, on the way to computing a complete solution to the problem at hand, output intermediate solutions that approximate the complete solution increasingly well. We present a framework for analyzing such algorithms, and develop efficient progressive algorithms

  20. New improved algorithm for sky calibration of L-band radiometers JLBARA and ELBARA II

    KAUST Repository

    Dimitrov, Marin

    2012-03-01

    We propose a new algorithm for sky calibration of the L-band radiometers JLBARA and ELBARA II, introducing the effective transmissivities of the instruments. The suggested approach was tested using experimental data obtained at the Selhausen test site, Germany. It was shown that for JLBARA the effective transmissivities depend strongly on the air temperature and decrease with increasing air temperature, while for ELBARA II such strong dependence was not observed. It was also shown that the effective transmissivities account for the antenna and feed cable loss effects, and for the variations of the radiometer gain due to air temperature changes. The new calibration algorithm reduces significantly the bias of brightness temperature estimates for both radiometers, especially for JLBARA. © 2012 IEEE.

  1. New improved algorithm for sky calibration of L-band radiometers JLBARA and ELBARA II

    KAUST Repository

    Dimitrov, Marin; Kostov, K. G.; Jonard, François; Jadoon, Khan; Schwank, Mike; Weihermüller, Lutz; Hermes, Normen; Vanderborght, Jan P.; Vereecken, Harry

    2012-01-01

    We propose a new algorithm for sky calibration of the L-band radiometers JLBARA and ELBARA II, introducing the effective transmissivities of the instruments. The suggested approach was tested using experimental data obtained at the Selhausen test site, Germany. It was shown that for JLBARA the effective transmissivities depend strongly on the air temperature and decrease with increasing air temperature, while for ELBARA II such strong dependence was not observed. It was also shown that the effective transmissivities account for the antenna and feed cable loss effects, and for the variations of the radiometer gain due to air temperature changes. The new calibration algorithm reduces significantly the bias of brightness temperature estimates for both radiometers, especially for JLBARA. © 2012 IEEE.

  2. Emissivity compensated spectral pyrometry—algorithm and sensitivity analysis

    International Nuclear Information System (INIS)

    Hagqvist, Petter; Sikström, Fredrik; Christiansson, Anna-Karin; Lennartson, Bengt

    2014-01-01

    In order to solve the problem of non-contact temperature measurements on an object with varying emissivity, a new method is herein described and evaluated. The method uses spectral radiance measurements and converts them to temperature readings. It proves to be resilient towards changes in spectral emissivity and tolerates noisy spectral measurements. It is based on an assumption of smooth changes in emissivity and uses historical values of spectral emissivity and temperature for estimating current spectral emissivity. The algorithm, its constituent steps and accompanying parameters are described and discussed. A thorough sensitivity analysis of the method is carried out through simulations. No rigorous instrument calibration is needed for the presented method and it is therefore industrially tractable. (paper)
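
    A sketch of the central step, assuming the current emissivity estimate is already available (e.g., carried over from the previous time step under the smoothness assumption): invert Planck's law per wavelength with the emissivity-compensated radiance and average the per-band temperatures. The published algorithm's emissivity-update and filtering steps are not reproduced here:

        # Emissivity-compensated Planck inversion (illustrative). SI units throughout.
        import numpy as np

        H, C, KB = 6.626e-34, 2.998e8, 1.381e-23      # Planck, speed of light, Boltzmann

        def planck(lam, T):
            """Blackbody spectral radiance, W·sr^-1·m^-3, at wavelength lam (m), temperature T (K)."""
            return 2 * H * C**2 / lam**5 / (np.exp(H * C / (lam * KB * T)) - 1.0)

        def invert_planck(lam, L):
            """Temperature from blackbody-equivalent radiance at one wavelength."""
            return H * C / (lam * KB * np.log(2 * H * C**2 / (lam**5 * L) + 1.0))

        def temperature(lam, L_meas, eps_est):
            """lam: wavelengths (m); L_meas: measured radiance; eps_est: emissivity estimate."""
            T_per_band = invert_planck(lam, L_meas / eps_est)   # emissivity-compensated
            return float(np.mean(T_per_band))

        lam = np.linspace(1.0e-6, 1.7e-6, 32)
        eps_true = 0.4 + 0.1 * np.linspace(0, 1, 32)
        L = eps_true * planck(lam, 1800.0)
        print(temperature(lam, L, eps_true))          # -> 1800 K when the emissivity estimate is exact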

  3. Archimedean copula estimation of distribution algorithm based on artificial bee colony algorithm

    Institute of Scientific and Technical Information of China (English)

    Haidong Xu; Mingyan Jiang; Kun Xu

    2015-01-01

    The artificial bee colony (ABC) algorithm is a competitive stochastic population-based optimization algorithm. However, the ABC algorithm does not use the social information and lacks the knowledge of the problem structure, which leads to insufficiency in both convergent speed and searching precision. Archimedean copula estimation of distribution algorithm (ACEDA) is a relatively simple, time-economic and multivariate correlated EDA. This paper proposes a novel hybrid algorithm based on the ABC algorithm and ACEDA called Archimedean copula estimation of distribution based on the artificial bee colony (ACABC) algorithm. The hybrid algorithm utilizes ACEDA to estimate the distribution model and then uses the information to help artificial bees to search more efficiently in the search space. Six benchmark functions are introduced to assess the performance of the ACABC algorithm on numerical function optimization. Experimental results show that the ACABC algorithm converges much faster with greater precision compared with the ABC algorithm, ACEDA and the global best (gbest)-guided ABC (GABC) algorithm in most of the experiments.

  4. Instrument performance evaluation

    International Nuclear Information System (INIS)

    Swinth, K.L.

    1993-03-01

    Deficiencies exist in both the performance and the quality of health physics instruments. Recognizing the implications of such deficiencies for the protection of workers and the public, in the early 1980s the DOE and the NRC encouraged the development of a performance standard and established a program to test a series of instruments against criteria in the standard. The purpose of the testing was to establish the practicality of the criteria in the standard, to determine the performance of a cross section of available instruments, and to establish a testing capability. Over 100 instruments were tested, resulting in a practical standard and an understanding of the deficiencies in available instruments. In parallel with the instrument testing, a value-impact study clearly established the benefits of implementing a formal testing program. An ad hoc committee also met several times to establish recommendations for the voluntary implementation of a testing program based on the studies and the performance standard. For several reasons, a formal program did not materialize. Ongoing tests and studies have supported the development of specific instruments and have helped specific clients understand the performance of their instruments. The purpose of this presentation is to trace the history of instrument testing to date and suggest the benefits of a centralized formal program

  5. ISINA: INTEGRAL Source Identification Network Algorithm

    Science.gov (United States)

    Scaringi, S.; Bird, A. J.; Clark, D. J.; Dean, A. J.; Hill, A. B.; McBride, V. A.; Shaw, S. E.

    2008-11-01

    We give an overview of ISINA: INTEGRAL Source Identification Network Algorithm. This machine learning algorithm, using random forests, is applied to the IBIS/ISGRI data set in order to ease the production of unbiased future soft gamma-ray source catalogues. First, we introduce the data set and the problems encountered when dealing with images obtained using the coded mask technique. The initial step of source candidate searching is introduced and an initial candidate list is created. A description of the feature extraction on the initial candidate list is then performed together with feature merging for these candidates. Three training and testing sets are created in order to deal with the diverse time-scales encountered when dealing with the gamma-ray sky. Three independent random forests are built: one dealing with faint persistent source recognition, one dealing with strong persistent sources and a final one dealing with transients. For the latter, a new transient detection technique is introduced and described: the transient matrix. Finally the performance of the network is assessed and discussed using the testing set and some illustrative source examples. Based on observations with INTEGRAL, an ESA project with instruments and science data centre funded by ESA member states (especially the PI countries: Denmark, France, Germany, Italy, Spain), Czech Republic and Poland, and the participation of Russia and the USA. E-mail: simo@astro.soton.ac.uk
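
    A minimal sketch of the classification stage only: training a random forest on features extracted from candidate excesses and predicting a real-source probability; the feature set and toy labels are illustrative, not the IBIS/ISGRI pipeline:

        # Random-forest classification of candidate sources (illustrative toy data).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n = 500
        X = np.column_stack([
            rng.normal(6, 2, n),      # detection significance
            rng.uniform(0, 1, n),     # distance to nearest catalogued source (deg)
            rng.uniform(0, 1, n),     # PSF-fit quality
        ])
        y = (X[:, 0] > 6) & (X[:, 2] > 0.3)          # toy "real source" label

        clf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
        clf.fit(X, y)
        print(clf.oob_score_, clf.predict_proba(X[:3])[:, 1])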

  6. Detection of gamma-ray bursts with the ECLAIRs instrument onboard the space mission SVOM

    International Nuclear Information System (INIS)

    Antier-Farfar, Sarah

    2016-01-01

    Discovered in the early 1970s, gamma-ray bursts (GRBs) are remarkable cosmic phenomena appearing randomly on the sky and releasing large amounts of energy, mainly through gamma-ray emission. Although their origin is still under debate, they are believed to be produced by some of the most violent explosions in the Universe, leading to the formation of stellar black holes. GRBs are detected by their prompt emission, an intense short burst of gamma-rays (from a few milliseconds to a few minutes), and are followed by an afterglow emission observed on longer timescales from the X-ray to the radio domain. My thesis contributes to the development of the SVOM mission, a Chinese-French mission to be launched in 2021 that is devoted to the study of GRBs and involves space and ground instruments. My work is focussed on the main instrument, ECLAIRs, a hard X-ray coded-mask imaging camera in charge of the near-real-time detection and localization of the prompt emission of GRBs. During my thesis, I studied the scientific performance of ECLAIRs and in particular the number of GRBs expected to be detected by ECLAIRs and their characteristics. For this purpose, I performed simulations using the prototypes of the embedded trigger algorithms combined with a model of the ECLAIRs instrument. The input data of the simulations include a background model and a synthetic population of gamma-ray bursts generated from existing catalogs (CGRO, HETE-2, Fermi and Swift). As a result, I estimated precisely the detection efficiency of the ECLAIRs algorithms and predicted the number of GRBs to be detected by ECLAIRs: 40 to 70 GRBs per year. Moreover, the study highlighted that ECLAIRs will be particularly sensitive to the X-ray-rich GRB population. My thesis also provides additional studies of the localization performance, the false-alarm rate and the characteristics of the algorithm triggers. Finally, I proposed two new methods for the detection of GRBs. The preliminary

  7. A filtered backprojection algorithm with characteristics of the iterative landweber algorithm

    OpenAIRE

    L. Zeng, Gengsheng

    2012-01-01

    Purpose: In order to eventually develop an analytical algorithm with noise characteristics of an iterative algorithm, this technical note develops a window function for the filtered backprojection (FBP) algorithm in tomography that behaves as an iterative Landweber algorithm.
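
    A brief sketch of the underlying relationship in generic linear-algebra form (not the specific window function derived in the note): starting from x^{(0)} = 0, the Landweber iteration and its closed-form k-th iterate are

        x^{(k+1)} = x^{(k)} + \alpha A^{\mathsf{T}}\bigl(b - A x^{(k)}\bigr),
        \qquad
        x^{(k)} = \Bigl[I - \bigl(I - \alpha A^{\mathsf{T}}A\bigr)^{k}\Bigr]\bigl(A^{\mathsf{T}}A\bigr)^{-1}A^{\mathsf{T}}b,

    so when A^{\mathsf{T}}A is (approximately) diagonalized by the Fourier transform with spectrum \lambda(\omega), stopping after k iterations acts like an FBP reconstruction whose ideal filter is multiplied by the window 1 - \bigl(1 - \alpha\lambda(\omega)\bigr)^{k}.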

  8. AUTOMATION OF CALCULATION ALGORITHMS FOR EFFICIENCY ESTIMATION OF TRANSPORT INFRASTRUCTURE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Sergey Kharitonov

    2015-06-01

    Full Text Available Optimal use of transport infrastructure is an important aspect of the development of the national economy of the Russian Federation. Development of instruments for assessing the efficiency of infrastructure is impossible without constant monitoring of a number of significant indicators. This work is devoted to the selection of such indicators and the method of their calculation for a transport subsystem, namely airport infrastructure. The work also evaluates how the calculation algorithms can be automated to improve the tools available for public administration of transport subsystems.

  9. The semianalytical cloud retrieval algorithm for SCIAMACHY I. The validation

    Directory of Open Access Journals (Sweden)

    A. A. Kokhanovsky

    2006-01-01

    Full Text Available A recently developed cloud retrieval algorithm for the SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY (SCIAMACHY) is briefly presented and validated using independent and well tested cloud retrieval techniques based on the look-up-table approach for MODerate Resolution Imaging Spectroradiometer (MODIS) data. The results of the cloud top height retrievals using measurements in the oxygen A-band by an airborne crossed Czerny-Turner spectrograph and the Global Ozone Monitoring Experiment (GOME) instrument are compared with those obtained from airborne dual photography and retrievals using data from the Along Track Scanning Radiometer (ATSR-2), respectively.

  10. The Usefulness of the DBC-ASA as a Screening Instrument for Autism in Children with Intellectual Disabilities: A Pilot Study

    Science.gov (United States)

    Deb, Shoumitro; Dhaliwal, Akal-Joat; Roy, Meera

    2009-01-01

    Aims: To explore the validity of Developmental Behaviour Checklist-Autism Screening Algorithm (DBC-ASA) as a screening instrument for autism among children with intellectual disabilities. Method: Data were collected from the case notes of 109 children with intellectual disabilities attending a specialist clinic in the UK. Results: The mean score…

  11. Super-Encryption Implementation Using Monoalphabetic Algorithm and XOR Algorithm for Data Security

    Science.gov (United States)

    Rachmawati, Dian; Andri Budiman, Mohammad; Aulia, Indra

    2018-03-01

    The exchange of data that occurs offline and online is very vulnerable to the threat of data theft. In general, cryptography is the science and art of maintaining data secrecy. Encryption is a cryptographic operation in which data is transformed into ciphertext, something unreadable and meaningless that cannot be read or understood by other parties. In super-encryption, two or more encryption algorithms are combined to make the result more secure. In this work, the Monoalphabetic algorithm and the XOR algorithm are combined to form a super-encryption. The Monoalphabetic algorithm works by changing a particular letter into a new letter based on an existing keyword, while the XOR algorithm works by using the logical XOR operation. Since the Monoalphabetic algorithm is a classical cryptographic algorithm and the XOR algorithm is a modern cryptographic algorithm, this scheme is expected to be both easy to implement and more secure. The combination of the two algorithms is capable of securing the data and restoring it back to its original form (plaintext), so data integrity is still ensured.
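
    A sketch of the two-layer scheme described above, a monoalphabetic substitution over A-Z followed by a byte-wise XOR with a repeating key; the keys and the alphabet handling are illustrative choices, not the published design:

        # Monoalphabetic substitution followed by repeating-key XOR (illustrative).
        import string

        SUB_KEY = "QWERTYUIOPASDFGHJKLZXCVBNM"          # a permutation of A-Z
        ENC_TABLE = str.maketrans(string.ascii_uppercase, SUB_KEY)
        DEC_TABLE = str.maketrans(SUB_KEY, string.ascii_uppercase)

        def xor_bytes(data: bytes, key: bytes) -> bytes:
            return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

        def super_encrypt(plaintext: str, xor_key: bytes) -> bytes:
            stage1 = plaintext.upper().translate(ENC_TABLE)     # monoalphabetic layer
            return xor_bytes(stage1.encode("ascii"), xor_key)   # XOR layer

        def super_decrypt(ciphertext: bytes, xor_key: bytes) -> str:
            stage1 = xor_bytes(ciphertext, xor_key).decode("ascii")
            return stage1.translate(DEC_TABLE)

        c = super_encrypt("ATTACK AT DAWN", b"k3y")
        print(c, super_decrypt(c, b"k3y"))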

  12. Soil Moisture Active Passive (SMAP) Project Algorithm Theoretical Basis Document SMAP L1B Radiometer Data Product: L1B_TB

    Science.gov (United States)

    Piepmeier, Jeffrey; Mohammed, Priscilla; De Amici, Giovanni; Kim, Edward; Peng, Jinzheng; Ruf, Christopher; Hanna, Maher; Yueh, Simon; Entekhabi, Dara

    2016-01-01

    The purpose of the Soil Moisture Active Passive (SMAP) radiometer calibration algorithm is to convert Level 0 (L0) radiometer digital counts data into calibrated estimates of brightness temperatures referenced to the Earth's surface within the main beam. The algorithm theory in most respects is similar to what has been developed and implemented for decades for other satellite radiometers; however, SMAP includes two key features heretofore absent from most satellite borne radiometers: radio frequency interference (RFI) detection and mitigation, and measurement of the third and fourth Stokes parameters using digital correlation. The purpose of this document is to describe the SMAP radiometer and forward model, explain the SMAP calibration algorithm, including approximations, errors, and biases, provide all necessary equations for implementing the calibration algorithm and detail the RFI detection and mitigation process. Section 2 provides a summary of algorithm objectives and driving requirements. Section 3 is a description of the instrument and Section 4 covers the forward models, upon which the algorithm is based. Section 5 gives the retrieval algorithm and theory. Section 6 describes the orbit simulator, which implements the forward model and is the key for deriving antenna pattern correction coefficients and testing the overall algorithm.
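
    A hedged, generic illustration of the core counts-to-brightness-temperature step: a two-point (cold/hot reference) linear calibration. This is not the SMAP L1B algorithm itself, which adds RFI detection, Stokes-parameter correlation, and antenna-pattern corrections; all reference values below are hypothetical.

      # Generic two-point radiometer calibration sketch (illustrative only).
      import numpy as np

      def two_point_calibration(counts, counts_cold, counts_hot, t_cold, t_hot):
          """Map raw counts to brightness temperature with a linear gain/offset model."""
          gain = (t_hot - t_cold) / (counts_hot - counts_cold)   # K per count
          offset = t_cold - gain * counts_cold
          return gain * np.asarray(counts, dtype=float) + offset

      if __name__ == "__main__":
          # Hypothetical reference-look values.
          tb = two_point_calibration(counts=[5210, 5400], counts_cold=5000,
                                     counts_hot=6000, t_cold=2.7, t_hot=290.0)
          print(tb)   # approximately [63.0, 117.6] K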

  13. Instrumentation

    International Nuclear Information System (INIS)

    Decreton, M.

    2000-01-01

    SCK-CEN's research and development programme on instrumentation aims at evaluating the potentials of new instrumentation technologies under the severe constraints of a nuclear application. It focuses on the tolerance of sensors to high radiation doses, including optical fibre sensors, and on the related intelligent data processing needed to cope with the nuclear constraints. Main achievements in these domains in 1999 are summarised

  14. OMERACT Endorsement of Patient-reported Outcome Instruments in Antineutrophil Cytoplasmic Antibody–associated Vasculitis

    Science.gov (United States)

    Robson, Joanna C.; Tomasson, Gunnar; Milman, Nataliya; Ashdown, Sue; Boonen, Annelies; Casey, George C.; Cronholm, Peter F.; Cuthbertson, David; Dawson, Jill; Direskeneli, Haner; Easley, Ebony; Kermani, Tanaz A.; Farrar, John T.; Gebhart, Don; Lanier, Georgia; Luqmani, Raashid A.; Mahr, Alfred; McAlear, Carol A.; Peck, Jacqueline; Shea, Beverley; Shea, Judy A.; Sreih, Antoine G.; Tugwell, Peter S.; Merkel, Peter A.

    2018-01-01

    Objective The antineutrophil cytoplasmic antibody–associated vasculitides (AAV) are multiorgan diseases. Patients with AAV report impairment in their health-related quality of life (HRQOL) and have different priorities regarding disease assessment compared with physicians. The Outcome Measures in Rheumatology (OMERACT) Vasculitis Working Group previously received endorsement for a core set of domains in AAV. Two approaches to measure patient-reported outcomes (PRO) were presented at OMERACT 2016. Methods A novel 5-step tool was used to facilitate assessment of the instruments by delegates: the OMERACT Filter 2.0 Instrument Selection Algorithm, with a red-amber-green checklist of questions, including (1) good match with domain (face and content validity), (2) feasibility, (3) do numeric scores make sense (construct validity)?, (4) overall ratings of discrimination, and (5) can individual thresholds of meaning be defined? Delegates gave an overall endorsement. Three generic Patient-Reported Outcomes Measurement Information System (PROMIS) instruments (fatigue, physical functioning, and pain interference) and a disease-specific PRO, the AAV-PRO (6 domains related to symptoms and HRQOL), were presented. Results OMERACT delegates endorsed the use of the PROMIS instruments for fatigue, physical functioning, and pain interference (87.6% overall endorsement) and the disease-specific AAV-PRO instrument (89.4% overall endorsement). Conclusion The OMERACT Vasculitis Working Group gained endorsement by OMERACT for use of the PROMIS and the AAV-PRO in clinical trials of vasculitis. These instruments are complementary to each other. The PROMIS and the AAV-PRO need further work to assess their utility in longitudinal settings, including their ability to discriminate between treatments of varying efficacy in the setting of a randomized controlled trial. PMID:28864650

  15. SeaWiFS technical report series. Volume 4: An analysis of GAC sampling algorithms. A case study

    Science.gov (United States)

    Yeh, Eueng-Nan (Editor); Hooker, Stanford B. (Editor); Hooker, Stanford B. (Editor); Mccain, Charles R. (Editor); Fu, Gary (Editor)

    1992-01-01

    The Sea-viewing Wide Field-of-view Sensor (SeaWiFS) instrument will sample at approximately 1 km resolution at nadir; these data will be broadcast for reception by real-time ground stations. However, the global data set will be composed of coarser four-kilometer data, which will be recorded and broadcast to the SeaWiFS Project for processing. Several algorithms for degrading the one-kilometer data to four-kilometer data are examined using imagery from the Coastal Zone Color Scanner (CZCS) in an effort to determine which algorithm would best preserve the statistical characteristics of the derived products generated from the one-kilometer data. Of the algorithms tested, subsampling based on a fixed pixel within a 4 x 4 pixel array is judged to yield the most consistent results when compared to the one-kilometer data products.
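
    The winning scheme is simple to state: keep one fixed pixel position out of every 4 x 4 block. A minimal sketch, assuming numpy; the choice of offsets within the block is illustrative:

      # Fixed-pixel subsampling: degrade a 1 km image to a 4 km product.
      import numpy as np

      def subsample_fixed_pixel(img, block=4, row_off=0, col_off=0):
          """Pick one fixed pixel per block; offsets are illustrative choices."""
          h, w = img.shape
          return img[row_off:h - h % block:block, col_off:w - w % block:block]

      if __name__ == "__main__":
          lac = np.arange(64, dtype=float).reshape(8, 8)   # stand-in 1 km scene
          gac = subsample_fixed_pixel(lac)                 # 2x2 "4 km" product
          print(gac)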

  16. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms.

    Science.gov (United States)

    Jaffe, Jacob D; Feeney, Caitlin M; Patel, Jinal; Lu, Xiaodong; Mani, D R

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently-trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
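
    A hedged sketch of the general idea: a genetic algorithm selects a subset of candidate transitions whose summed comprehensive-mode signal best agrees with the targeted quantification. The fitness function (R-squared of the correlation) and the GA operators below are illustrative choices, not the authors' exact implementation; numpy is assumed.

      # GA that picks transitions maximizing agreement with a targeted assay.
      import numpy as np

      rng = np.random.default_rng(0)

      def fitness(mask, dia_signals, targeted):
          if mask.sum() == 0:
              return -np.inf
          quant = dia_signals[:, mask.astype(bool)].sum(axis=1)
          return np.corrcoef(quant, targeted)[0, 1] ** 2        # R^2 with targeted result

      def evolve(dia_signals, targeted, pop_size=30, n_gen=50, p_mut=0.1):
          n_trans = dia_signals.shape[1]
          pop = rng.integers(0, 2, size=(pop_size, n_trans))     # bit mask per individual
          for _ in range(n_gen):
              scores = np.array([fitness(ind, dia_signals, targeted) for ind in pop])
              parents = pop[np.argsort(scores)[::-1][:pop_size // 2]]   # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = parents[rng.integers(len(parents), size=2)]
                  cut = rng.integers(1, n_trans)
                  child = np.concatenate([a[:cut], b[cut:]])     # one-point crossover
                  flip = rng.random(n_trans) < p_mut             # bit-flip mutation
                  children.append(np.where(flip, 1 - child, child))
              pop = np.vstack([parents, children])
          scores = np.array([fitness(ind, dia_signals, targeted) for ind in pop])
          return pop[np.argmax(scores)]

      if __name__ == "__main__":
          # Synthetic example: 40 samples, 3 informative transitions plus 5 noisy ones.
          targeted = rng.uniform(1, 100, 40)
          good = np.outer(targeted, [1.0, 0.5, 0.2])
          noise = rng.uniform(0, 50, (40, 5))
          best = evolve(np.hstack([good, noise]), targeted)
          print("selected transitions:", np.flatnonzero(best))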

  17. Performing the Super Instrument

    DEFF Research Database (Denmark)

    Kallionpaa, Maria

    2016-01-01

    The genre of contemporary classical music has seen significant innovation and research related to new super, hyper, and hybrid instruments, which opens up a vast palette of expressive potential. An increasing number of composers, performers, instrument designers, engineers, and computer programmers have become interested in different ways of "supersizing" acoustic instruments in order to open up previously-unheard instrumental sounds. Super instruments vary a great deal but each has a transformative effect on the identity and performance practice of the performing musician. Furthermore, composers can empower performers by producing super instrument works that allow the concert instrument to become an ensemble controlled by a single player. The existing instrumental skills of the performer can be multiplied and the qualities of regular acoustic instruments extended or modified. Such a situation…

  18. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    Science.gov (United States)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDR's) generated by sensors on the S-NPP into calibrated geo-located Sensor Data Records (SDR's) and generates Mission Unique Products (MUPS). It also facilitates algorithm investigation, integration, checkouts and tuning, instrument and product calibration and data quality support, monitoring and data/products distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCT's) and Look-Up-Tables (LUT's) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real time data flows. It also contains a set of automated and ad-hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which runs the latest installation of the IDPS algorithms on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGE's) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  19. Calibration and assessment of electrochemical air quality sensors by co-location with regulatory-grade instruments

    Science.gov (United States)

    Hagan, David H.; Isaacman-VanWertz, Gabriel; Franklin, Jonathan P.; Wallace, Lisa M. M.; Kocar, Benjamin D.; Heald, Colette L.; Kroll, Jesse H.

    2018-01-01

    The use of low-cost air quality sensors for air pollution research has outpaced our understanding of their capabilities and limitations under real-world conditions, and there is thus a critical need for understanding and optimizing the performance of such sensors in the field. Here we describe the deployment, calibration, and evaluation of electrochemical sensors on the island of Hawai`i, which is an ideal test bed for characterizing such sensors due to its large and variable sulfur dioxide (SO2) levels and lack of other co-pollutants. Nine custom-built SO2 sensors were co-located with two Hawaii Department of Health Air Quality stations over the course of 5 months, enabling comparison of sensor output with regulatory-grade instruments under a range of realistic environmental conditions. Calibration using a nonparametric algorithm (k nearest neighbors) was found to have excellent performance (RMSE < 7 ppb, MAE < 4 ppb, r2 > 0.997) across a wide dynamic range in SO2 (< 1 ppb to > 2 ppm). However, since nonparametric algorithms generally cannot extrapolate to conditions outside the training set, we introduce a new hybrid linear-nonparametric algorithm, enabling accurate measurements even when pollutant levels are higher than encountered during calibration. We find no significant change in instrument sensitivity toward SO2 after 18 weeks and demonstrate that calibration accuracy remains high when a sensor is calibrated at one location and then moved to another. The performance of electrochemical SO2 sensors is also strong at lower SO2 mixing ratios (< 25 ppb), for which they exhibit an error of less than 2.5 ppb. While some specific results of this study (calibration accuracy, performance of the various algorithms, etc.) may differ for measurements of other pollutant species in other areas (e.g., polluted urban regions), the calibration and validation approaches described here should be widely applicable to a range of pollutants, sensors, and environments.

  20. Linear feature detection algorithm for astronomical surveys - I. Algorithm description

    Science.gov (United States)

    Bektešević, Dino; Vinković, Dejan

    2017-11-01

    Computer vision algorithms are powerful tools in astronomical image analyses, especially when automation of object detection and extraction is required. Modern object detection algorithms in astronomy are oriented towards detection of stars and galaxies, ignoring completely the detection of existing linear features. With the emergence of wide-field sky surveys, linear features attract scientific interest as possible trails of fast flybys of near-Earth asteroids and meteors. In this work, we describe a new linear feature detection algorithm designed specifically for implementation in big data astronomy. The algorithm combines a series of algorithmic steps that first remove other objects (stars and galaxies) from the image and then enhance the line to enable more efficient line detection with the Hough algorithm. The rate of false positives is greatly reduced thanks to a step that replaces possible line segments with rectangles and then compares lines fitted to the rectangles with the lines obtained directly from the image. The speed of the algorithm and its applicability in astronomical surveys are also discussed.
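
    A minimal illustration of the final detection stage: once stars and galaxies have been masked out, the remaining bright pixels vote in (rho, theta) space and strong accumulator peaks mark candidate linear trails. This is a plain Hough-accumulator sketch in numpy, not the paper's full pipeline; image sizes and thresholds are illustrative.

      # Plain Hough transform: bright pixels vote for (rho, theta) line parameters.
      import numpy as np

      def hough_lines(binary_img, n_theta=180):
          h, w = binary_img.shape
          diag = int(np.ceil(np.hypot(h, w)))
          thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
          accum = np.zeros((2 * diag, n_theta), dtype=np.int32)
          ys, xs = np.nonzero(binary_img)
          for theta_idx, theta in enumerate(thetas):
              rhos = (xs * np.cos(theta) + ys * np.sin(theta)).round().astype(int) + diag
              np.add.at(accum[:, theta_idx], rhos, 1)   # accumulate votes per rho bin
          return accum, thetas, diag

      if __name__ == "__main__":
          img = np.zeros((100, 100), dtype=bool)
          rr = np.arange(100)
          img[rr, rr] = True                            # synthetic 45-degree trail
          accum, thetas, diag = hough_lines(img)
          rho_idx, theta_idx = np.unravel_index(accum.argmax(), accum.shape)
          print("rho =", rho_idx - diag, "theta =", np.degrees(thetas[theta_idx]))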

  1. Calibration and assessment of electrochemical air quality sensors by co-location with regulatory-grade instruments

    Directory of Open Access Journals (Sweden)

    D. H. Hagan

    2018-01-01

    Full Text Available The use of low-cost air quality sensors for air pollution research has outpaced our understanding of their capabilities and limitations under real-world conditions, and there is thus a critical need for understanding and optimizing the performance of such sensors in the field. Here we describe the deployment, calibration, and evaluation of electrochemical sensors on the island of Hawai`i, which is an ideal test bed for characterizing such sensors due to its large and variable sulfur dioxide (SO2) levels and lack of other co-pollutants. Nine custom-built SO2 sensors were co-located with two Hawaii Department of Health Air Quality stations over the course of 5 months, enabling comparison of sensor output with regulatory-grade instruments under a range of realistic environmental conditions. Calibration using a nonparametric algorithm (k nearest neighbors) was found to have excellent performance (RMSE < 7 ppb, MAE < 4 ppb, r2 > 0.997) across a wide dynamic range in SO2 (< 1 ppb to > 2 ppm). However, since nonparametric algorithms generally cannot extrapolate to conditions outside the training set, we introduce a new hybrid linear–nonparametric algorithm, enabling accurate measurements even when pollutant levels are higher than encountered during calibration. We find no significant change in instrument sensitivity toward SO2 after 18 weeks and demonstrate that calibration accuracy remains high when a sensor is calibrated at one location and then moved to another. The performance of electrochemical SO2 sensors is also strong at lower SO2 mixing ratios (< 25 ppb), for which they exhibit an error of less than 2.5 ppb. While some specific results of this study (calibration accuracy, performance of the various algorithms, etc.) may differ for measurements of other pollutant species in other areas (e.g., polluted urban regions), the calibration and validation approaches described here should be widely applicable to a range of pollutants, sensors, and environments.
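
    A hedged sketch of the hybrid calibration idea: a k-nearest-neighbors regressor handles signals inside the range spanned by the co-location training data, and a linear model takes over outside it so the calibration can extrapolate. Feature choices, units, and the in-range test below are illustrative, not the authors' configuration; numpy and scikit-learn are assumed.

      # Hybrid kNN / linear calibration: nonparametric inside the training range,
      # linear extrapolation outside it.
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.neighbors import KNeighborsRegressor

      class HybridCalibrator:
          def __init__(self, k=5):
              self.knn = KNeighborsRegressor(n_neighbors=k)
              self.lin = LinearRegression()

          def fit(self, X, y):
              self.knn.fit(X, y)
              self.lin.fit(X, y)
              self.lo, self.hi = X.min(axis=0), X.max(axis=0)   # training envelope
              return self

          def predict(self, X):
              inside = np.all((X >= self.lo) & (X <= self.hi), axis=1)
              out = np.empty(len(X))
              if inside.any():
                  out[inside] = self.knn.predict(X[inside])
              if (~inside).any():
                  out[~inside] = self.lin.predict(X[~inside])   # extrapolate linearly
              return out

      if __name__ == "__main__":
          rng = np.random.default_rng(2)
          X_train = rng.uniform(0, 1000, size=(500, 2))   # e.g. raw signal and temperature
          y_train = 0.02 * X_train[:, 0] + 0.005 * X_train[:, 1] + rng.normal(0, 0.5, 500)
          cal = HybridCalibrator().fit(X_train, y_train)
          X_new = np.array([[400.0, 300.0], [5000.0, 300.0]])  # second point is out of range
          print(cal.predict(X_new))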

  2. Golden Sine Algorithm: A Novel Math-Inspired Algorithm

    Directory of Open Access Journals (Sweden)

    TANYILDIZI, E.

    2017-05-01

    Full Text Available In this study, the Golden Sine Algorithm (Gold-SA) is presented as a new metaheuristic method for solving optimization problems. Gold-SA has been developed as a new population-based search algorithm. This math-based algorithm is inspired by the sine trigonometric function. In the algorithm, as many random individuals as there are search agents are created, with a uniform distribution over each dimension. The Gold-SA operator searches for a better solution in each iteration by trying to bring the current position closer to the target value. The solution space is narrowed by the golden section, so that only regions expected to yield good results are scanned instead of the whole solution space. In the tests performed, Gold-SA achieved better results than other population-based methods. In addition, Gold-SA has fewer algorithm-dependent parameters and operators than other metaheuristic methods and converges faster, which increases its practical importance.
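
    A loose, hedged reimplementation of the idea as described above: a population of agents is moved toward the current best solution by sine-modulated steps, with golden-section coefficients narrowing the search. The published Gold-SA update rule may differ in detail; the parameters, the test function, and the update form below are illustrative, and numpy is assumed.

      # Sine-driven population search in the spirit of Gold-SA (illustrative only).
      import numpy as np

      def golden_sine_search(obj, dim, bounds, n_agents=20, n_iter=200, seed=1):
          rng = np.random.default_rng(seed)
          lo, hi = bounds
          tau = (np.sqrt(5.0) - 1.0) / 2.0                 # golden ratio section
          a, b = -np.pi, np.pi
          x1 = a + (1.0 - tau) * (b - a)                   # golden-section coefficients
          x2 = a + tau * (b - a)
          pos = rng.uniform(lo, hi, size=(n_agents, dim))
          fit = np.apply_along_axis(obj, 1, pos)
          best = pos[fit.argmin()].copy()
          for _ in range(n_iter):
              r1 = rng.uniform(0.0, 2.0 * np.pi, size=(n_agents, 1))
              r2 = rng.uniform(0.0, np.pi, size=(n_agents, 1))
              # Sine-modulated move toward the best solution found so far.
              pos = pos * np.abs(np.sin(r1)) - r2 * np.sin(r1) * np.abs(x1 * best - x2 * pos)
              pos = np.clip(pos, lo, hi)
              fit = np.apply_along_axis(obj, 1, pos)
              if fit.min() < obj(best):
                  best = pos[fit.argmin()].copy()
          return best, obj(best)

      if __name__ == "__main__":
          sphere = lambda x: float(np.sum(x ** 2))          # simple benchmark function
          best, val = golden_sine_search(sphere, dim=5, bounds=(-10.0, 10.0))
          print(best, val)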

  3. EVA: laparoscopic instrument tracking based on Endoscopic Video Analysis for psychomotor skills assessment.

    Science.gov (United States)

    Oropesa, Ignacio; Sánchez-González, Patricia; Chmarra, Magdalena K; Lamata, Pablo; Fernández, Alvaro; Sánchez-Margallo, Juan A; Jansen, Frank Willem; Dankelman, Jenny; Sánchez-Margallo, Francisco M; Gómez, Enrique J

    2013-03-01

    The EVA (Endoscopic Video Analysis) tracking system is a new system for extracting motions of laparoscopic instruments based on nonobtrusive video tracking. The feasibility of using EVA in laparoscopic settings has been tested in a box trainer setup. EVA makes use of an algorithm that employs information of the laparoscopic instrument's shaft edges in the image, the instrument's insertion point, and the camera's optical center to track the three-dimensional position of the instrument tip. A validation study of EVA comprised a comparison of the measurements achieved with EVA and the TrEndo tracking system. To this end, 42 participants (16 novices, 22 residents, and 4 experts) were asked to perform a peg transfer task in a box trainer. Ten motion-based metrics were used to assess their performance. Construct validation of the EVA has been obtained for seven motion-based metrics. Concurrent validation revealed that there is a strong correlation between the results obtained by EVA and the TrEndo for metrics, such as path length (ρ = 0.97), average speed (ρ = 0.94), or economy of volume (ρ = 0.85), proving the viability of EVA. EVA has been successfully validated in a box trainer setup, showing the potential of endoscopic video analysis to assess laparoscopic psychomotor skills. The results encourage further implementation of video tracking in training setups and image-guided surgery.

  4. Instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Decreton, M

    2000-07-01

    SCK-CEN's research and development programme on instrumentation aims at evaluating the potentials of new instrumentation technologies under the severe constraints of a nuclear application. It focuses on the tolerance of sensors to high radiation doses, including optical fibre sensors, and on the related intelligent data processing needed to cope with the nuclear constraints. Main achievements in these domains in 1999 are summarised.

  5. VIRUS instrument enclosures

    Science.gov (United States)

    Prochaska, T.; Allen, R.; Mondrik, N.; Rheault, J. P.; Sauseda, M.; Boster, E.; James, M.; Rodriguez-Patino, M.; Torres, G.; Ham, J.; Cook, E.; Baker, D.; DePoy, Darren L.; Marshall, Jennifer L.; Hill, G. J.; Perry, D.; Savage, R. D.; Good, J. M.; Vattiat, Brian L.

    2014-08-01

    The Visible Integral-Field Replicable Unit Spectrograph (VIRUS) instrument will be installed at the Hobby-Eberly Telescope in the near future. The instrument will be housed in two enclosures that are mounted adjacent to the telescope, via the VIRUS Support Structure (VSS). We have designed the enclosures to support and protect the instrument, to enable servicing of the instrument, and to cool the instrument appropriately while not adversely affecting the dome environment. The system uses simple HVAC air handling techniques in conjunction with thermoelectric and standard glycol heat exchangers to provide efficient heat removal. The enclosures also provide power and data transfer to and from each VIRUS unit, liquid nitrogen cooling to the detectors, and environmental monitoring of the instrument and dome environments. In this paper, we describe the design and fabrication of the VIRUS enclosures and their subsystems.

  6. The Orthogonally Partitioned EM Algorithm: Extending the EM Algorithm for Algorithmic Stability and Bias Correction Due to Imperfect Data.

    Science.gov (United States)

    Regier, Michael D; Moodie, Erica E M

    2016-05-01

    We propose an extension of the EM algorithm that exploits the common assumption of unique parameterization, corrects for biases due to missing data and measurement error, converges for the specified model when the standard implementation of the EM algorithm has a low probability of convergence, and reduces a potentially complex algorithm into a sequence of smaller, simpler, self-contained EM algorithms. We use the theory surrounding the EM algorithm to derive the theoretical results of our proposal, showing that an optimal solution over the parameter space is obtained. A simulation study is used to explore the finite sample properties of the proposed extension when there is missing data and measurement error. We observe that partitioning the EM algorithm into simpler steps may provide better bias reduction in the estimation of model parameters. The ability to break down a complicated problem into a series of simpler, more accessible problems will permit a broader implementation of the EM algorithm, permit the use of software packages that now implement and/or automate the EM algorithm, and make the EM algorithm more accessible to a wider and more general audience.

  7. Health physics instrument manual

    International Nuclear Information System (INIS)

    Gupton, E.D.

    1978-08-01

    The purpose of this manual is to provide apprentice health physics surveyors, and other operating groups not directly concerned with radiation detection instruments, with a working knowledge of the radiation detection and measuring instruments in use at the Laboratory. The characteristics and applications of the instruments are given. Portable instruments, stationary instruments, personnel monitoring instruments, sample counters, and miscellaneous instruments are described. Also, information sheets on calibration sources, procedures, and devices are included. Gamma sources, beta sources, alpha sources, neutron sources, special sources, a gamma calibration device for badge dosimeters, and a calibration device for ionization chambers are described.

  8. Test of TEDA, Tsunami Early Detection Algorithm

    Science.gov (United States)

    Bressan, Lidia; Tinti, Stefano

    2010-05-01

    Tsunami detection in real-time, both offshore and at the coastline, plays a key role in Tsunami Warning Systems since it provides so far the only reliable and timely proof of tsunami generation, and is used to confirm or cancel tsunami warnings previously issued on the basis of seismic data alone. Moreover, in case of submarine or coastal landslide generated tsunamis, which are not announced by clear seismic signals and are typically local, real-time detection at the coastline might be the fastest way to release a warning, even if the useful time for emergency operations might be limited. TEDA is an algorithm for real-time detection of tsunami signals on sea-level records, developed by the Tsunami Research Team of the University of Bologna. The development and testing of the algorithm have been accomplished within the framework of the Italian national project DPC-INGV S3 and the European project TRANSFER. The algorithm is to be implemented at station level, and it is therefore based only on sea-level data of a single station, either a coastal tide-gauge or an offshore buoy. TEDA's principle is to discriminate the first tsunami wave from the previous background signal, which relies on the assumption that the tsunami waves introduce a change relative to the preceding sea-level signal. Therefore, in TEDA the instantaneous (most recent) and the previous background sea-level elevation gradients are characterized and compared by proper functions (IS and BS) that are updated at every new data acquisition. Detection is triggered when the instantaneous signal function passes a set threshold and is, at the same time, significantly larger than the previous background signal. The functions IS and BS depend on temporal parameters that allow the algorithm to be adapted to different situations: in general, coastal tide-gauges have a typical background spectrum depending on the location where the instrument is installed, due to local topography and bathymetry, while offshore buoys are
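
    A minimal sketch of the detection principle: an "instantaneous" short-window measure of the sea-level gradient (standing in for IS) is compared against a longer "background" window (standing in for BS), and an alarm is raised when the recent signal exceeds both a fixed threshold and a multiple of the background. TEDA's actual IS/BS functions and tuned parameters are not reproduced here; all thresholds and the synthetic record are illustrative.

      # Simple instantaneous-vs-background threshold detector on a sea-level record.
      import numpy as np

      def detect_tsunami(sea_level, short_win=10, long_win=120, abs_thresh=0.05, ratio=3.0):
          """Return sample indices where a tsunami-like onset is flagged."""
          x = np.asarray(sea_level, dtype=float)
          grad = np.abs(np.gradient(x))
          alarms = []
          for t in range(long_win, len(x)):
              inst = grad[t - short_win:t].mean()               # IS: recent gradient level
              back = grad[t - long_win:t - short_win].mean()    # BS: preceding background
              if inst > abs_thresh and inst > ratio * back:
                  alarms.append(t)
          return alarms

      if __name__ == "__main__":
          t = np.arange(0, 3600, 15.0)                          # 15 s sampling, one hour
          tide = 0.3 * np.sin(2 * np.pi * t / 3600.0)
          wave = np.where(t > 2700, 0.8 * np.sin(2 * np.pi * (t - 2700) / 600.0), 0.0)
          print(detect_tsunami(tide + wave))                    # alarms after the onset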

  9. Parallel sorting algorithms

    CERN Document Server

    Akl, Selim G

    1985-01-01

    Parallel Sorting Algorithms explains how to use parallel algorithms to sort a sequence of items on a variety of parallel computers. The book reviews the sorting problem, the parallel models of computation, parallel algorithms, and the lower bounds on the parallel sorting problems. The text also presents twenty different algorithms for architectures such as linear arrays, mesh-connected computers, and cube-connected computers. Another example where the algorithms can be applied is the shared-memory SIMD (single instruction stream multiple data stream) computers in which the whole sequence to be sorted can fit in the

  10. A Gaussian Process Based Online Change Detection Algorithm for Monitoring Periodic Time Series

    Energy Technology Data Exchange (ETDEWEB)

    Chandola, Varun [ORNL; Vatsavai, Raju [ORNL

    2011-01-01

    Online time series change detection is a critical component of many monitoring systems, such as space- and air-borne remote sensing instruments, cardiac monitors, and network traffic profilers, which continuously analyze observations recorded by sensors. Data collected by such sensors typically has a periodic (seasonal) component. Most existing time series change detection methods are not directly applicable to handle such data, either because they are not designed to handle periodic time series or because they cannot operate in an online mode. We propose an online change detection algorithm which can handle periodic time series. The algorithm uses a Gaussian process based non-parametric time series prediction model and monitors the difference between the predictions and actual observations within a statistically principled control chart framework to identify changes. A key challenge in using a Gaussian process in an online mode is the need to solve a large system of equations involving the associated covariance matrix, which grows with every time step. The proposed algorithm exploits the special structure of the covariance matrix and can analyze a time series of length T in O(T^2) time while maintaining an O(T) memory footprint, compared to the O(T^4) time and O(T^2) memory requirements of standard matrix manipulation methods. We experimentally demonstrate the superiority of the proposed algorithm over several existing time series change detection algorithms on a set of synthetic and real time series. Finally, we illustrate the effectiveness of the proposed algorithm for identifying land use land cover changes using Normalized Difference Vegetation Index (NDVI) data collected for an agricultural region in Iowa state, USA. Our algorithm is able to detect different types of changes in an NDVI validation data set (with ~80% accuracy) which occur due to crop type changes as well as disruptive changes (e.g., natural disasters).
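
    A hedged sketch of the control-chart idea using an off-the-shelf periodic Gaussian process: fit on a sliding window, predict the next observation, and flag a change when the standardized residual leaves the control limits. The paper's efficient incremental covariance updates and exact chart rules are not reproduced here; window sizes, kernel choices, and limits are illustrative, and scikit-learn is assumed.

      # GP prediction + control chart on residuals for a periodic series.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import ExpSineSquared, WhiteKernel

      def online_change_detection(series, period=12, window=48, z_limit=3.0):
          alarms = []
          kernel = ExpSineSquared(length_scale=1.0, periodicity=period) + WhiteKernel(0.1)
          for t in range(window, len(series)):
              X = np.arange(t - window, t, dtype=float).reshape(-1, 1)
              gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
              gp.fit(X, series[t - window:t])
              mean, std = gp.predict(np.array([[float(t)]]), return_std=True)
              z = (series[t] - mean[0]) / max(std[0], 1e-6)    # standardized residual
              if abs(z) > z_limit:
                  alarms.append(t)
          return alarms

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          t = np.arange(120)
          ndvi = 0.5 + 0.2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.02, t.size)
          ndvi[90:] -= 0.3                                      # simulated disruptive change
          print(online_change_detection(ndvi))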

  11. Nuclear reactor instrumentation

    International Nuclear Information System (INIS)

    Duncombe, E.; McGonigal, G.

    1975-01-01

    A liquid metal cooled nuclear reactor is described which has an equal number of fuel sub-assemblies and sensing instruments. Each instrument senses temperature and rate of coolant flow of a coolant derived from a group of three sub-assemblies, so that an abnormal value for one sub-assembly will be indicated on three instruments, thereby providing for redundancy of up to two of the three instruments. The abnormal value may be a precursor to unstable boiling of coolant

  12. Exploring Effects of High School Students' Mathematical Processing Skills and Conceptual Understanding of Chemical Concepts on Algorithmic Problem Solving

    Science.gov (United States)

    Gultepe, Nejla; Yalcin Celik, Ayse; Kilic, Ziya

    2013-01-01

    The purpose of the study was to examine the effects of students' conceptual understanding of chemical concepts and mathematical processing skills on algorithmic problem-solving skills. The sample (N = 554) included grades 9, 10, and 11 students in Turkey. Data were collected using the instrument "MPC Test" and with interviews. The MPC…

  13. Tundish Cover Flux Thickness Measurement Method and Instrumentation Based on Computer Vision in Continuous Casting Tundish

    Directory of Open Access Journals (Sweden)

    Meng Lu

    2013-01-01

    Full Text Available The thickness of tundish cover flux (TCF) plays an important role in the continuous casting (CC) steelmaking process. Traditional measurement of TCF thickness relies on single/double wire methods, which have several problems, such as personal safety, sensitivity to the operator, and poor repeatability. To solve these problems, in this paper we designed and built a dedicated instrumentation system and present a novel method to measure the TCF thickness. The instrumentation is composed of a measurement bar, a mechanical device, a high-definition industrial camera, a Siemens S7-200 programmable logic controller (PLC), and a computer. Our measurement method is based on computer vision algorithms, including image denoising, monocular range measurement, the scale invariant feature transform (SIFT), and image gray gradient detection. Using the present instrumentation and method, images in the CC tundish can be collected by the camera and transferred to the computer for image processing. Experiments showed that our instrumentation and method work well on site at steel plants, can accurately measure the thickness of TCF, and overcome the disadvantages of traditional measurement methods, or even replace them.

  14. Validation of ozone monitoring instrument ultraviolet index against ground-based UV index in Kampala, Uganda.

    Science.gov (United States)

    Muyimbwa, Dennis; Dahlback, Arne; Ssenyonga, Taddeo; Chen, Yi-Chun; Stamnes, Jakob J; Frette, Øyvind; Hamre, Børge

    2015-10-01

    The Ozone Monitoring Instrument (OMI) overpass solar ultraviolet (UV) indices have been validated against the ground-based UV indices derived from Norwegian Institute for Air Research UV measurements in Kampala (0.31° N, 32.58° E, 1200 m), Uganda for the period between 2005 and 2014. An excessive use of old cars, which would imply a high loading of absorbing aerosols, could cause the OMI retrieval algorithm to overestimate the surface UV irradiances. The UV index values were found to follow a seasonal pattern with maximum values in March and October. Under all-sky conditions, the OMI retrieval algorithm was found to overestimate the UV index values with a mean bias of about 28%. When only days with radiation modification factor greater than or equal to 65%, 70%, 75%, and 80% were considered, the mean bias between ground-based and OMI overpass UV index values was reduced to 8%, 5%, 3%, and 1%, respectively. The overestimation of the UV index by the OMI retrieval algorithm was found to be mainly due to clouds and aerosols.

  15. Instrument validation project

    International Nuclear Information System (INIS)

    Reynolds, B.A.; Daymo, E.A.; Geeting, J.G.H.; Zhang, J.

    1996-06-01

    Westinghouse Hanford Company Project W-211 is responsible for providing the system capabilities to remove radioactive waste from ten double-shell tanks used to store radioactive wastes on the Hanford Site in Richland, Washington. The project is also responsible for measuring tank waste slurry properties prior to injection into pipeline systems, including the Replacement of Cross-Site Transfer System. This report summarizes studies of the appropriateness of the instrumentation specified for use in Project W-211. The instruments were evaluated in a test loop with simulated slurries that covered the range of properties specified in the functional design criteria. The results of the study indicate that the compact nature of the baseline Project W-211 loop does not reduce instrumental accuracy through poor flow profile development. Of the baseline instrumentation, the Micromotion densimeter, the Moore Industries thermocouple, the Fischer and Porter magnetic flow meter, and the Red Valve pressure transducer meet the desired instrumental accuracy. An alternate magnetic flow meter (Yokagawa) gave nearly identical results to the baseline Fischer and Porter. The Micromotion flow meter did not meet the desired instrument accuracy but could potentially be calibrated so that it would meet the criteria. The Nametre on-line viscometer did not meet the desired instrumental accuracy and is not recommended as a quantitative instrument although it does provide qualitative information. The recommended minimum set of instrumentation necessary to ensure the slurry meets the Project W-058 acceptance criteria is the Micromotion mass flow meter and delta pressure cells

  16. Astronomical Instruments in India

    Science.gov (United States)

    Sarma, Sreeramula Rajeswara

    The earliest astronomical instruments used in India were the gnomon and the water clock. In the early seventh century, Brahmagupta described ten types of instruments, which were adopted by all subsequent writers with minor modifications. Contact with Islamic astronomy in the second millennium AD led to a radical change. Sanskrit texts began to lay emphasis on the importance of observational instruments. Exclusive texts on instruments were composed. Islamic instruments like the astrolabe were adopted and some new types of instruments were developed. Production and use of these traditional instruments continued, along with the cultivation of traditional astronomy, up to the end of the nineteenth century.

  17. An efficient feedback calibration algorithm for direct imaging radio telescopes

    Science.gov (United States)

    Beardsley, Adam P.; Thyagarajan, Nithyanandan; Bowman, Judd D.; Morales, Miguel F.

    2017-10-01

    We present the E-field Parallel Imaging Calibration (EPICal) algorithm, which addresses the need for a fast calibration method for direct imaging radio astronomy correlators. Direct imaging involves a spatial fast Fourier transform of antenna signals, alleviating an O(N_a^2) computational bottleneck typical in radio correlators, and yielding a more gentle O(N_g log2 N_g) scaling, where N_a is the number of antennas in the array and N_g is the number of grid points in the imaging analysis. This can save orders of magnitude in computation cost for next generation arrays consisting of hundreds or thousands of antennas. However, because antenna signals are mixed in the imaging correlator without creating visibilities, gain correction must be applied prior to imaging, rather than on visibilities post-correlation. We develop the EPICal algorithm to form gain solutions quickly and without ever forming visibilities. This method scales as the number of antennas, and produces results comparable to those from visibilities. We use simulations to demonstrate the EPICal technique and study the noise properties of our gain solutions, showing they are similar to visibility-based solutions in realistic situations. By applying EPICal to 2 s of Long Wavelength Array data, we achieve a 65 per cent dynamic range improvement compared to uncalibrated images, showing this algorithm is a promising solution for next generation instruments.

  18. Initial Results from Radiometer and Polarized Radar-Based Icing Algorithms Compared to In-Situ Data

    Science.gov (United States)

    Serke, David; Reehorst, Andrew L.; King, Michael

    2015-01-01

    In early 2015, a field campaign was conducted at the NASA Glenn Research Center in Cleveland, Ohio, USA. The purpose of the campaign is to test several prototype algorithms meant to detect the location and severity of in-flight icing (or icing aloft, as opposed to ground icing) within the terminal airspace. Terminal airspace for this project is currently defined as within 25 kilometers horizontal distance of the terminal, which in this instance is Hopkins International Airport in Cleveland. Two new and improved algorithms that utilize ground-based remote sensing instrumentation have been developed and were operated during the field campaign. The first is the 'NASA Icing Remote Sensing System', or NIRSS. The second algorithm is the 'Radar Icing Algorithm', or RadIA. In addition to these algorithms, which were derived from ground-based remote sensors, in-situ icing measurements of the profiles of super-cooled liquid water (SLW) collected with vibrating wire sondes attached to weather balloons produced a comprehensive database for comparison. Key fields from the SLW-sondes include air temperature, humidity and liquid water content, cataloged by time and 3-D location. This work gives an overview of the NIRSS and RadIA products and results are compared to in-situ SLW-sonde data from one icing case study. The location and quantity of super-cooled liquid as measured by the in-situ probes provide a measure of the utility of these prototype hazard-sensing algorithms.

  19. Leakage Detection and Estimation Algorithm for Loss Reduction in Water Piping Networks

    Directory of Open Access Journals (Sweden)

    Kazeem B. Adedeji

    2017-10-01

    Full Text Available Water loss through leaking pipes constitutes a major challenge to the operational service of water utilities. In recent years, increasing concern about the financial loss and environmental pollution caused by leaking pipes has been driving the development of efficient algorithms for detecting leakage in water piping networks. Water distribution networks (WDNs) are dispersed in nature, with numerous nodes and branches. Consequently, identifying the segment(s) of the network, and the exact leaking pipelines connected to those segment(s), where higher background leakage outflow occurs is a challenging task. Background leakage concerns the outflow from small cracks or deteriorated joints. Because these are diffuse flows, they are not characterised by a quick pressure drop and are not detectable by measuring instruments. Consequently, they go unreported for long periods of time, contributing substantially to the volume of water lost. Most of the existing research focuses on the detection and localisation of burst-type leakages, which are characterised by a sudden pressure drop. In this work, an algorithm for detecting and estimating background leakage in water distribution networks is presented. The algorithm integrates a leakage model into a classical WDN hydraulic model for solving the network leakage flows. The applicability of the developed algorithm is demonstrated on two different water networks. The results of the tested networks are discussed, and the solutions obtained show the benefits of the proposed algorithm. Notably, the algorithm permits the detection of critical segments or pipes of the network experiencing higher leakage outflow and indicates the probable pipes of the network where pressure control can be performed. The possible position of pressure control elements along such critical pipes will be addressed in future work.
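
    A hedged sketch of one building block: ranking pipes by a pressure-dependent background-leakage term of the form beta * length * pressure^alpha, a commonly used formulation for diffuse leakage. The beta and alpha values below are illustrative, and the paper's algorithm couples such a term into the full hydraulic model rather than evaluating it in isolation.

      # Rank pipes by a modelled background-leakage outflow (illustrative only).
      import numpy as np

      def background_leakage(lengths_m, mean_pressures_m, beta=1e-7, alpha=1.18):
          """Per-pipe background leakage outflow [m^3/s] under a pressure-dependent model."""
          L = np.asarray(lengths_m, dtype=float)
          p = np.clip(np.asarray(mean_pressures_m, dtype=float), 0.0, None)
          return beta * L * p ** alpha

      if __name__ == "__main__":
          pipes = ["P1", "P2", "P3", "P4"]                       # hypothetical network
          q = background_leakage(lengths_m=[300, 800, 150, 500],
                                 mean_pressures_m=[25, 55, 40, 60])
          for name in sorted(pipes, key=lambda n: -q[pipes.index(n)]):
              print(name, f"{q[pipes.index(name)]:.2e} m^3/s")   # critical pipes first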

  20. An astrometric standard field in omega Cen

    Science.gov (United States)

    Wyse, Rosemary

    2003-07-01

    We propose to obtain a high-precision astrometric standard in a two-step procedure. First, we will create a ground-based astrometric standard field around omega Cen down to V=22 with a 3 mas accuracy in positions and better than 0.5 mas/yr in proper motions. This standard will be used to obtain precise absolute plate solutions for selected WFPC2 CCD frames and refine the self-calibrated mean distortion solution for the WFPC2 CCD chips. This will eliminate systematic errors inherent in the self-calibration techniques down to the rms = 0.3 mas level, thus opening new opportunities to perform precision astrometry with WFPC2 alone or in combination with the other HST imaging instruments. We will also address the issue of the distortion's variation, which has paramount significance for space astrometry such as that spearheaded by the HST or under development (SIM, GAIA). Second, all reduced WFPC2 CCD frames will be combined into two field catalogs (astrometric flat fields) of positions in omega Cen of unprecedented precision (s.e. = 0.1 mas) down to V=22 and will be available to the GO community and readily applicable to calibrating the ACS.

  1. Automatic tracking of laparoscopic instruments for autonomous control of a cameraman robot.

    Science.gov (United States)

    Khoiy, Keyvan Amini; Mirbagheri, Alireza; Farahmand, Farzam

    2016-01-01

    An automated instrument tracking procedure was designed and developed for autonomous control of a cameraman robot during laparoscopic surgery. The procedure was based on an innovative marker-free segmentation algorithm for detecting the tip of the surgical instruments in laparoscopic images. A compound measure of the Saturation and Value components of HSV color space was incorporated and enhanced further using the Hue component and some essential characteristics of the instrument segment, e.g., crossing the image boundaries. The procedure was then integrated into the controlling system of the RoboLens cameraman robot, within a triple-thread parallel processing scheme, such that the tip is always kept at the center of the image. Assessment of the performance of the system on prerecorded real surgery movies revealed an accuracy rate of 97% for high quality images and about 80% for those suffering from poor lighting and/or blood, water, and smoke noise. A reasonably satisfying performance was also observed when employing the system for autonomous control of the robot in a laparoscopic surgery phantom, with a mean time delay of 200 ms. It was concluded that, with further development, the proposed procedure can provide a practical solution for autonomous control of cameraman robots during laparoscopic surgery operations.
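
    A hedged sketch of the colour-based segmentation step: instrument shafts tend to be low-saturation (metallic/gray) and relatively bright compared with reddish tissue, so a compound Saturation/Value test isolates candidate instrument pixels and the largest connected blob is kept. The thresholds and the synthetic test frame are illustrative, not the authors' tuned values; OpenCV and numpy are assumed.

      # HSV-based, marker-free instrument segmentation (illustrative thresholds).
      import cv2
      import numpy as np

      def segment_instrument(bgr_frame, s_max=60, v_min=80):
          hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
          h, s, v = cv2.split(hsv)
          mask = ((s < s_max) & (v > v_min)).astype(np.uint8) * 255
          # Keep the largest connected blob as the candidate instrument segment.
          contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
          if not contours:
              return mask, None
          largest = max(contours, key=cv2.contourArea)
          clean = np.zeros_like(mask)
          cv2.drawContours(clean, [largest], -1, 255, thickness=-1)
          return clean, largest

      if __name__ == "__main__":
          frame = np.full((240, 320, 3), (40, 40, 200), dtype=np.uint8)      # reddish "tissue"
          cv2.rectangle(frame, (100, 100), (220, 130), (180, 180, 180), -1)  # gray "shaft"
          mask, contour = segment_instrument(frame)
          print("instrument pixels:", int(mask.sum() // 255))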

  2. Validation of ultraviolet radiation budgets using satellite observations from the OMI instrument

    International Nuclear Information System (INIS)

    Den Outer, P.N.; Van Dijk, A.; Slaper, H.

    2008-11-01

    Satellite retrieval of ozone, clouds, aerosols and ground albedo allows the modelling of ultraviolet (UV)-doses received at the ground. UV-doses derived from satellite observations are highly useful in analyzing regional differences in the effects of ozone depletion and climate change on the biologically effective UV-radiation levels. RIVM has developed and used UV-mapping and UV-risk mapping techniques in environmental assessments in evaluating the effects of ozone depletion and climate change. This project provides a validation study on the OMUVB product by means of a comparison with ground-based measurements. This validation should demonstrate if the OMUVB product can be used from the perspective of long-term environmental trend assessments. Comparing ground-based UV-measurements with the OMUVB product, we show that the product consistently overestimates the UV-doses received at the ground in Europe. The systematic comparison with data from 8 European sites shows on average a 15% overestimate in the yearly integrated UV with a site-to-site variability of around 8%. For four of the more northern sites the overestimation in yearly doses is between 5-10%, and for the four sites that are more southern the deviation is 20-27%. Using the ozone and reflectivity data from the OMI-instrument (Ozone Monitoring Instrument) in combination with the AMOUR-algorithm (Assessment Model for Ultraviolet radiation and Risks) shows smaller overestimates of on average 5-6% with a similar variability between the sites. The variability between sites is largely caused by aerosol and albedo effects and is reduced to 3% if local data on aerosol and albedo are used. The overestimates in the OMUVB product are primarily due to too low (tropospheric) aerosol loads used for the European sites. In addition, our comparison shows that under heavy clouded conditions the cloud modification factors are too high. This contributes to the overall too high UV-doses of the OMUVB product. Environmental

  3. Validation of ultraviolet radiation budgets using satellite observations from the OMI instrument

    Energy Technology Data Exchange (ETDEWEB)

    Den Outer, P.N.; Van Dijk, A.; Slaper, H.

    2008-11-15

    Satellite retrieval of ozone, clouds, aerosols and ground albedo allows the modelling of ultraviolet (UV)-doses received at the ground. UV-doses derived from satellite observations are highly useful in analyzing regional differences in the effects of ozone depletion and climate change on the biologically effective UV-radiation levels. RIVM has developed and used UV-mapping and UV-risk mapping techniques in environmental assessments in evaluating the effects of ozone depletion and climate change. This project provides a validation study on the OMUVB product by means of a comparison with ground-based measurements. This validation should demonstrate if the OMUVB product can be used from the perspective of long-term environmental trend assessments. Comparing ground-based UV-measurements with the OMUVB product, we show that the product consistently overestimates the UV-doses received at the ground in Europe. The systematic comparison with data from 8 European sites shows on average a 15% overestimate in the yearly integrated UV with a site-to-site variability of around 8%. For four of the more northern sites the overestimation in yearly doses is between 5-10%, and for the four sites that are more southern the deviation is 20-27%. Using the ozone and reflectivity data from the OMI-instrument (Ozone Monitoring Instrument) in combination with the AMOUR-algorithm (Assessment Model for Ultraviolet radiation and Risks) shows smaller overestimates of on average 5-6% with a similar variability between the sites. The variability between sites is largely caused by aerosol and albedo effects and is reduced to 3% if local data on aerosol and albedo are used. The overestimates in the OMUVB product are primarily due to too low (tropospheric) aerosol loads used for the European sites. In addition, our comparison shows that under heavy clouded conditions the cloud modification factors are too high. This contributes to the overall too high UV-doses of the OMUVB product. Environmental

  4. Instrumentation a reader

    CERN Document Server

    Pope, P

    1990-01-01

    This book contains a selection of papers and articles in instrumentation previously published in technical periodicals and journals of learned societies. Our selection has been made to illustrate aspects of current practice and applications of instrumentation. The book does not attempt to be encyclopaedic in its coverage of the subject, but to provide some examples of general transduction techniques, of the sensing of particular measurands, of components of instrumentation systems and of instrumentation practice in two very different environments, the food industry and the nuclear power industry. We have made the selection particularly to provide papers appropriate to the study of the Open University course T292 Instrumentation. The papers have been chosen so that the book covers a wide spectrum of instrumentation techniques. Because of this, the book should be of value not only to students of instrumentation, but also to practising engineers and scientists wishing to glean ideas from areas of instrumen...

  5. Classification and authentication of unknown water samples using machine learning algorithms.

    Science.gov (United States)

    Kundu, Palash K; Panchariya, P C; Kundu, Madhusree

    2011-07-01

    This paper proposes the development of real-life water sample classification and authentication based on machine learning algorithms. The proposed techniques use experimental measurements from a pulse voltammetry method built on an electronic tongue (E-tongue) instrumentation system with silver and platinum electrodes. E-tongues include arrays of solid-state ion sensors, transducers of different types, data collectors, and data analysis tools, all oriented to the classification of liquid samples and the authentication of unknown liquid samples. The time series signal and the corresponding raw data represent the measurement from a multi-sensor system. The E-tongue system, implemented in a laboratory environment for six different ISI (Bureau of Indian Standards) certified water samples (Aquafina, Bisleri, Kingfisher, Oasis, Dolphin, and McDowell), was the data source for developing two types of machine learning algorithms, classification and regression. A water data set consisting of six sample classes and 4402 features was considered. A PCA (principal component analysis) based classification and authentication tool was developed in this study as the machine learning component of the E-tongue system. A partial least squares (PLS) based classifier, dedicated to authenticating a specific category of water sample, was also developed as an integral part of the E-tongue instrumentation system. The developed PCA- and PLS-based E-tongue system delivered encouraging overall authentication accuracy and performed well for the aforesaid categories of water samples. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
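
    A hedged sketch of the two-stage scheme on synthetic data: PCA projects the high-dimensional voltammetric features for class separation, and a PLS model trained on one target class scores whether an unknown sample matches that brand. Data shapes, component counts, and the membership-score interpretation below are illustrative, not the authors' configuration; scikit-learn and numpy are assumed.

      # PCA for classification/inspection plus a PLS-based brand authenticator.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(3)

      # Synthetic stand-in for 6 classes x 20 replicates x 300 voltammetric features.
      n_classes, n_rep, n_feat = 6, 20, 300
      centers = rng.normal(0, 5, size=(n_classes, n_feat))
      X = np.vstack([c + rng.normal(0, 1, size=(n_rep, n_feat)) for c in centers])
      labels = np.repeat(np.arange(n_classes), n_rep)

      # Stage 1: PCA scores used for classification and visual inspection.
      scores = PCA(n_components=3).fit_transform(X)
      print("class means in PC space:\n",
            np.array([scores[labels == k].mean(axis=0) for k in range(n_classes)]))

      # Stage 2: PLS model trained to flag membership of class 0 (the "target brand").
      y = (labels == 0).astype(float)
      pls = PLSRegression(n_components=5).fit(X, y)
      unknown = centers[0] + rng.normal(0, 1, n_feat)
      print("authentication score (close to 1 means match):",
            pls.predict(unknown.reshape(1, -1)).item())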

  6. Fermion cluster algorithms

    International Nuclear Information System (INIS)

    Chandrasekharan, Shailesh

    2000-01-01

    Cluster algorithms have been recently used to eliminate sign problems that plague Monte-Carlo methods in a variety of systems. In particular such algorithms can also be used to solve sign problems associated with the permutation of fermion world lines. This solution leads to the possibility of designing fermion cluster algorithms in certain cases. Using the example of free non-relativistic fermions we discuss the ideas underlying the algorithm

  7. Real-Time On-Board Airborne Demonstration of High-Speed On-Board Data Processing for Science Instruments (HOPS)

    Science.gov (United States)

    Beyon, Jeffrey Y.; Ng, Tak-Kwong; Davis, Mitchell J.; Adams, James K.; Bowen, Stephen C.; Fay, James J.; Hutchinson, Mark A.

    2015-01-01

    The project called High-Speed On-Board Data Processing for Science Instruments (HOPS) has been funded by the NASA Earth Science Technology Office (ESTO) Advanced Information Systems Technology (AIST) program since April, 2012. The HOPS team recently completed two flight campaigns during the summer of 2014 on two different aircraft with two different science instruments. The first flight campaign was in July, 2014, based at NASA Langley Research Center (LaRC) in Hampton, VA, on NASA's HU-25 aircraft. The science instrument that flew with HOPS was the Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) CarbonHawk Experiment Simulator (ACES), funded by NASA's Instrument Incubator Program (IIP). The second campaign was in August, 2014, based at NASA Armstrong Flight Research Center (AFRC) in Palmdale, CA, on NASA's DC-8 aircraft. HOPS flew with the Multifunctional Fiber Laser Lidar (MFLL) instrument developed by Excelis Inc. The goal of the campaigns was to perform an end-to-end demonstration of the capabilities of the HOPS prototype system (HOPS COTS) while running the most computationally intensive part of the ASCENDS algorithm in real time on board. The comparison of the two flight campaigns and the results of the functionality tests of the HOPS COTS are presented in this paper.

  8. Classic (Nonquantic) Algorithm for Observations and Measurements Based on Statistical Strategies of Particles Fields

    OpenAIRE

    Savastru, D.; Dontu, Simona; Savastru, Roxana; Sterian, Andreea Rodica

    2013-01-01

    Our knowledge of our surroundings is acquired through observations and measurements, but both are affected by errors (noise). One of the first tasks is therefore to try to eliminate the noise by constructing instruments with high accuracy. However, any real observed and measured system is characterized by natural limits due to the deterministic nature of the measured information. The present work is dedicated to identifying these limits. We have analyzed some algorithms for selection and ...

  9. Data evaluation for operator-inspector differences for a specific NDA instrument

    International Nuclear Information System (INIS)

    Franklin, M.

    1984-01-01

    The Joint Research Centre (JRC) of the European Commission is developing a number of NDA instruments for safeguards use. In particular the JRC has developed a photo neutron active interrogation device (Phonid) for the assay of U-235 in bulk quantities. This report describes new statistical results for the D statistic in the context of data evaluation algorithms for the Phonid instrument. The Phonid instrument is useful for this purpose because its error propagation structure is well characterised and yet not trivially simple. The data evaluation for Phonid data is derived from its error propagation modelling plus new results for the sampling distribution of the D statistic. The problem of assigning an uncertainty to the D statistic value without any diversion strategy assumptions has long been an unresolved problem. The results described in this report provide the solution to this problem by considering the sampling distribution of the D statistic given the population of discrepancies. Discrepancy is defined as the difference between operator declared values and the true values measured by the inspector. This approach provides estimable expressions for the sampling moments of the D statistic without making any assumption about the cause (diversion, clerical error, measurement error) of the discrepancy. The report also provides a general discussion of the distinction between planning a verification and performing the data analysis after the verification has been carried out

  10. A fast marching algorithm for the factored eikonal equation

    Energy Technology Data Exchange (ETDEWEB)

    Treister, Eran, E-mail: erantreister@gmail.com [Department of Earth and Ocean Sciences, The University of British Columbia, Vancouver, BC (Canada); Haber, Eldad, E-mail: haber@math.ubc.ca [Department of Earth and Ocean Sciences, The University of British Columbia, Vancouver, BC (Canada); Department of Mathematics, The University of British Columbia, Vancouver, BC (Canada)

    2016-11-01

    The eikonal equation is instrumental in many applications in several fields ranging from computer vision to geoscience. This equation can be efficiently solved using the iterative Fast Sweeping (FS) methods and the direct Fast Marching (FM) methods. However, when used for a point source, the original eikonal equation is known to yield inaccurate numerical solutions, because of a singularity at the source. In this case, the factored eikonal equation is often preferred, and is known to yield a more accurate numerical solution. One application that requires the solution of the eikonal equation for point sources is travel time tomography. This inverse problem may be formulated using the eikonal equation as a forward problem. While this problem has been solved using FS in the past, the more recent choice for applying it involves FM methods because of the efficiency in which sensitivities can be obtained using them. However, while several FS methods are available for solving the factored equation, the FM method is available only for the original eikonal equation. In this paper we develop a Fast Marching algorithm for the factored eikonal equation, using both first and second order finite-difference schemes. Our algorithm follows the same lines as the original FM algorithm and requires the same computational effort. In addition, we show how to obtain sensitivities using this FM method and apply travel time tomography, formulated as an inverse factored eikonal equation. Numerical results in two and three dimensions show that our algorithm solves the factored eikonal equation efficiently, and demonstrate the achieved accuracy for computing the travel time. We also demonstrate a recovery of a 2D and 3D heterogeneous medium by travel time tomography using the eikonal equation for forward modeling and inversion by Gauss–Newton.
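
    A minimal first-order Fast Marching sketch for the ordinary (unfactored) eikonal equation |grad T| = 1/v on a 2-D grid, to show the heap-driven upwind update the paper's method builds on; the factored variant additionally evolves a correction to an analytic point-source term, which is omitted here. Grid size and the constant-speed test are illustrative.

      # First-order Fast Marching for the eikonal equation on a regular 2-D grid.
      import heapq
      import numpy as np

      def fast_marching(speed, src, h=1.0):
          ny, nx = speed.shape
          T = np.full((ny, nx), np.inf)
          frozen = np.zeros((ny, nx), dtype=bool)
          T[src] = 0.0
          heap = [(0.0, src)]
          while heap:
              t, (i, j) = heapq.heappop(heap)
              if frozen[i, j]:
                  continue
              frozen[i, j] = True
              for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  a, b = i + di, j + dj
                  if 0 <= a < ny and 0 <= b < nx and not frozen[a, b]:
                      # Upwind neighbor values in x and y.
                      tx = min(T[a, b - 1] if b > 0 else np.inf,
                               T[a, b + 1] if b < nx - 1 else np.inf)
                      ty = min(T[a - 1, b] if a > 0 else np.inf,
                               T[a + 1, b] if a < ny - 1 else np.inf)
                      f = h / speed[a, b]
                      if abs(tx - ty) < f:          # quadratic (two-sided) update
                          t_new = 0.5 * (tx + ty + np.sqrt(2 * f * f - (tx - ty) ** 2))
                      else:                          # one-sided update
                          t_new = min(tx, ty) + f
                      if t_new < T[a, b]:
                          T[a, b] = t_new
                          heapq.heappush(heap, (t_new, (a, b)))
          return T

      if __name__ == "__main__":
          v = np.ones((101, 101))                    # constant-speed medium
          T = fast_marching(v, src=(50, 50))
          print(T[50, 90], "vs exact", 40.0)         # first-order error from the source singularity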

  11. Validation of MERIS Ocean Color Algorithms in the Mediterranean Sea

    Science.gov (United States)

    Marullo, S.; D'Ortenzio, F.; Ribera D'Alcalà, M.; Ragni, M.; Santoleri, R.; Vellucci, V.; Luttazzi, C.

    2004-05-01

    Satellite ocean color measurements can contribute, better than any other source of data, to quantifying the spatial and temporal variability of ocean productivity and, thanks to the success of several satellite missions from CZCS up to SeaWiFS, MODIS and MERIS, it is now possible to begin investigating interannual variations and to compare levels of production across different decades ([1],[2]). The interannual variability of ocean productivity at global and regional scales can be correctly measured provided that chlorophyll estimates are based on well-calibrated algorithms, in order to avoid regional biases and instrumental time shifts. The calibration and validation of ocean color data is therefore one of the most important tasks of several research projects worldwide ([3], [4]). Algorithms developed to retrieve chlorophyll concentration require a specific effort to define the error ranges associated with the estimates. In particular, empirical algorithms, built by regression against in situ data, require independent records to verify the associated uncertainties. In addition, several lines of evidence have shown that regional algorithms can improve the accuracy of satellite chlorophyll estimates [5]. In 2002, Santoleri et al. (SIMBIOS) first showed a significant overestimation of the SeaWiFS-derived chlorophyll concentration in the Mediterranean Sea when the standard global NASA algorithms (OC4v2 and OC4v4) are used. The same authors [6] proposed two preliminary new algorithms for the Mediterranean Sea (L-DORMA and NL-DORMA) on the basis of a bio-optical data set collected in the basin from 1998 to 2000. In 2002, Bricaud et al. [7], analyzing other bio-optical data collected in the Mediterranean, confirmed the overestimation of the chlorophyll concentration in oligotrophic conditions and proposed a new regional algorithm for use at low concentrations. Recently, the number of in situ observations in the basin has increased, permitting a first
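
    As a minimal illustration of the kind of empirical band-ratio algorithm being validated and regionally re-tuned here (an OC4-style maximum band ratio), the Python sketch below evaluates a fourth-order polynomial in the log of the blue-to-green reflectance ratio. The coefficients are placeholders for illustration only; they are not the official OC4v2/OC4v4 values nor the Mediterranean (DORMA) coefficients.

        import numpy as np

        def chl_band_ratio(Rrs443, Rrs490, Rrs510, Rrs555,
                           coeffs=(0.3, -3.0, 3.0, -1.4, -0.6)):
            """Chlorophyll-a (mg m^-3) from a 4th-order polynomial in log10 of the
            maximum blue-to-green remote-sensing reflectance ratio (placeholder coefficients)."""
            ratio = np.maximum.reduce([Rrs443, Rrs490, Rrs510]) / Rrs555
            x = np.log10(ratio)
            a0, a1, a2, a3, a4 = coeffs
            return 10.0 ** (a0 + a1 * x + a2 * x**2 + a3 * x**3 + a4 * x**4)

    A regional re-tuning of the kind discussed above would amount to refitting the coefficient vector against in situ chlorophyll match-ups from the basin of interest.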

  12. Algorithm improvement program nuclide identification algorithm scoring criteria and scoring application.

    Energy Technology Data Exchange (ETDEWEB)

    Enghauser, Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-02-01

    The goal of the Domestic Nuclear Detection Office (DNDO) Algorithm Improvement Program (AIP) is to facilitate gamma-radiation detector nuclide identification algorithm development, improvement, and validation. Accordingly, scoring criteria have been developed to objectively assess the performance of nuclide identification algorithms. In addition, a Microsoft Excel spreadsheet application for automated nuclide identification scoring has been developed. This report provides an overview of the equations, nuclide weighting factors, nuclide equivalencies, and configuration weighting factors used by the application for scoring nuclide identification algorithm performance. Furthermore, this report presents a general overview of the nuclide identification algorithm scoring application including illustrative examples.

  13. IOT Overview: IR Instruments

    Science.gov (United States)

    Mason, E.

    In this instrument review chapter the calibration plans of ESO IR instruments are presented and briefly reviewed, focusing in particular on the case of ISAAC, which was the first IR instrument at the VLT and whose calibration plan served as a prototype for subsequent instruments.

  14. Algorithm of Particle Data Association for SLAM Based on Improved Ant Algorithm

    Directory of Open Access Journals (Sweden)

    KeKe Gen

    2015-01-01

    Full Text Available The article considers the data association problem in simultaneous localization and mapping (SLAM) for route determination of unmanned aerial vehicles (UAVs). Such vehicles are already widely used, but are mostly controlled by a remote operator. An urgent task is to develop a control system that allows for autonomous flight. The SLAM (simultaneous localization and mapping) algorithm, which predicts the location, speed, flight parameters and the coordinates of landmarks and obstacles in an unknown environment, is one of the key technologies for achieving truly autonomous UAV flight. The aim of this work is to study the possibility of solving this problem by using an improved ant algorithm. Data association for SLAM establishes a matching between the set of observed landmarks and the landmarks in the state vector. The ant algorithm is a widely used optimization algorithm with positive feedback and the ability to search in parallel, which makes it suitable for the SLAM data association problem. However, the traditional ant algorithm easily falls into local optima while searching for routes. Random perturbations are therefore added when the global pheromone is updated in order to avoid local optima, and limiting the pheromone on a route enlarges the search space while keeping the amount of computation needed to find the optimal route reasonable. The paper proposes a local data association algorithm for SLAM based on an improved ant algorithm. To increase the speed of calculation, local data association is used instead of global data association. The first stage of the algorithm identifies, in the matching space, the targets and observed landmarks that can possibly be associated according to the criterion of individual compatibility (IC). The second stage determines the matched landmarks and their coordinates using the improved ant algorithm. Simulation results confirm the efficiency and
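
    The first, gating stage described above can be illustrated with a short Python sketch: each observation is tested for individual compatibility (IC) against every map landmark with a Mahalanobis (chi-square) test on the innovation. The second, ant-colony matching stage is not reproduced; the array shapes, significance level and function names are illustrative assumptions.

        import numpy as np
        from scipy.stats import chi2

        def ic_candidates(z, R, z_preds, S_preds, alpha=0.95):
            """Indices of map landmarks individually compatible with observation z.

            z        : observed landmark measurement (length-m vector)
            R        : measurement noise covariance (m x m)
            z_preds  : list of predicted measurements, one per map landmark
            S_preds  : list of prediction covariances, one per map landmark
            """
            gate = chi2.ppf(alpha, df=z.shape[0])          # chi-square gate
            passed = []
            for k, (zhat, P) in enumerate(zip(z_preds, S_preds)):
                nu = z - zhat                              # innovation
                S = R + P                                  # innovation covariance
                d2 = nu @ np.linalg.solve(S, nu)           # squared Mahalanobis distance
                if d2 <= gate:
                    passed.append(k)
            return passed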

  15. Troubleshooting in nuclear instruments

    International Nuclear Information System (INIS)

    1987-06-01

    This report on troubleshooting of nuclear instruments is the product of several scientists and engineers, who are closely associated with nuclear instrumentation and with the IAEA activities in the field. The text covers the following topics: preamplifiers, amplifiers, scalers, timers, ratemeters, multichannel analyzers, dedicated instruments, tools, instruments, accessories, components, skills, interfaces, power supplies, preventive maintenance, troubleshooting in systems, radiation detectors. The troubleshooting and repair of instruments is illustrated by some real examples.

  16. The BR eigenvalue algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Geist, G.A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.; Howell, G.W. [Florida Inst. of Tech., Melbourne, FL (United States). Dept. of Applied Mathematics; Watkins, D.S. [Washington State Univ., Pullman, WA (United States). Dept. of Pure and Applied Mathematics

    1997-11-01

    The BR algorithm, a new method for calculating the eigenvalues of an upper Hessenberg matrix, is introduced. It is a bulge-chasing algorithm like the QR algorithm, but, unlike the QR algorithm, it is well adapted to computing the eigenvalues of the narrowband, nearly tridiagonal matrices generated by the look-ahead Lanczos process. This paper describes the BR algorithm and gives numerical evidence that it works well in conjunction with the Lanczos process. On the biggest problems run so far, the BR algorithm beats the QR algorithm by a factor of 30--60 in computing time and a factor of over 100 in matrix storage space.

  17. Dendroclimatic transfer functions revisited: Little Ice Age and Medieval Warm Period summer temperatures reconstructed using artificial neural networks and linear algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Helama, S.; Holopainen, J.; Eronen, M. [Department of Geology, University of Helsinki, (Finland); Makarenko, N.G. [Russian Academy of Sciences, St. Petersburg (Russian Federation). Pulkovo Astronomical Observatory; Karimova, L.M.; Kruglun, O.A. [Institute of Mathematics, Almaty (Kazakhstan); Timonen, M. [Finnish Forest Research Institute, Rovaniemi Research Unit (Finland); Merilaeinen, J. [SAIMA Unit of the Savonlinna Department of Teacher Education, University of Joensuu (Finland)

    2009-07-01

    Tree-rings tell of past climates. To do so, tree-ring chronologies comprising numerous climate-sensitive living-tree and subfossil time-series need to be 'transferred' into palaeoclimate estimates using transfer functions. The purpose of this study is to compare different types of transfer functions, especially linear and nonlinear algorithms. Accordingly, multiple linear regression (MLR), linear scaling (LSC) and artificial neural networks (ANN, a nonlinear algorithm) were compared. Transfer functions were built using a regional tree-ring chronology and instrumental temperature observations from Lapland (northern Finland and Sweden). In addition, conventional MLR was compared with a hybrid model whereby climate was reconstructed separately for short- and long-period timescales prior to combining the bands of timescales into a single hybrid model. The fidelity of the different reconstructions was validated against instrumental climate data. The reconstructions by MLR and ANN showed reliable reconstruction capabilities over the instrumental period (AD 1802-1998). LSC failed to reach reasonable verification statistics and did not qualify as a reliable reconstruction: this was due mainly to exaggeration of the low-frequency climatic variance. Over this instrumental period, the reconstructed low-frequency amplitudes of climate variability were rather similar by MLR and ANN. Notably greater differences between the models were found over the actual reconstruction period (AD 802-1801). A marked temperature decline, as reconstructed by MLR, from the Medieval Warm Period (AD 931-1180) to the Little Ice Age (AD 1601-1850), was evident in all the models. This decline was approximately 0.5 °C as reconstructed by MLR. Different ANN-based palaeotemperatures showed simultaneous cooling of 0.2 to 0.5 °C, depending on the algorithm. The hybrid MLR did not seem to provide further benefit above conventional MLR in our sample. The robustness of the conventional MLR over the calibration
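
    The linear (MLR-type) calibration and verification step can be sketched in a few lines of Python: regress instrumental summer temperature on the tree-ring predictors over part of the overlap period and check the reconstruction on the remainder. The split, predictor layout and the reduction-of-error statistic below are illustrative choices, not the authors' exact setup.

        import numpy as np

        def calibrate_and_verify(rings, temps, n_cal):
            """rings: (n_years, n_predictors) chronology matrix; temps: (n_years,) instrumental series."""
            Xc = np.column_stack([np.ones(n_cal), rings[:n_cal]])
            beta, *_ = np.linalg.lstsq(Xc, temps[:n_cal], rcond=None)   # calibration fit
            Xv = np.column_stack([np.ones(len(rings) - n_cal), rings[n_cal:]])
            recon = Xv @ beta                                           # verification-period reconstruction
            obs = temps[n_cal:]
            # Reduction-of-error skill relative to the calibration-period mean as baseline.
            re = 1.0 - np.sum((obs - recon) ** 2) / np.sum((obs - temps[:n_cal].mean()) ** 2)
            return beta, recon, re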

  18. Vertigo in childhood: proposal for a diagnostic algorithm based upon clinical experience.

    Science.gov (United States)

    Casani, A P; Dallan, I; Navari, E; Sellari Franceschini, S; Cerchiai, N

    2015-06-01

    The aim of this paper is to analyse, after clinical experience with a series of patients with established diagnoses and a review of the literature, all relevant anamnestic features in order to build a simple diagnostic algorithm for vertigo in childhood. This study is a retrospective chart review. A series of 37 children underwent complete clinical and instrumental vestibular examination. Only neurological disorders or genetic diseases represented exclusion criteria. All diagnoses were reviewed after applying the most recent diagnostic guidelines. In our experience, the most common aetiology for dizziness is vestibular migraine (38%), followed by acute labyrinthitis/neuritis (16%) and somatoform vertigo (16%). Benign paroxysmal vertigo was diagnosed in 4 patients (11%) and paroxysmal torticollis was diagnosed in a 1-year-old child. In 8% (3 patients) of cases, the dizziness had a post-traumatic origin: 1 case of canalolithiasis of the posterior semicircular canal and 2 labyrinthine concussions. Menière's disease was diagnosed in 2 cases. A bilateral vestibular failure of unknown origin caused chronic dizziness in 1 patient. In conclusion, this algorithm could represent a good tool for guiding clinical suspicion toward the correct diagnostic assessment in dizzy children in whom no neurological findings are detectable. The algorithm has just a few simple steps, based mainly on two aspects to be investigated early: temporal features of vertigo and presence of hearing impairment. A different algorithm has been proposed for cases in which a traumatic origin is suspected.

  19. Geometric approximation algorithms

    CERN Document Server

    Har-Peled, Sariel

    2011-01-01

    Exact algorithms for dealing with geometric objects are complicated, hard to implement in practice, and slow. Over the last 20 years a theory of geometric approximation algorithms has emerged. These algorithms tend to be simple, fast, and more robust than their exact counterparts. This book is the first to cover geometric approximation algorithms in detail. In addition, more traditional computational geometry techniques that are widely used in developing such algorithms, like sampling, linear programming, etc., are also surveyed. Other topics covered include approximate nearest-neighbor search, shape approximation, coresets, dimension reduction, and embeddings. The topics covered are relatively independent and are supplemented by exercises. Close to 200 color figures are included in the text to illustrate proofs and ideas.

  20. Broad bandwidth frequency domain instrument for quantitative tissue optical spectroscopy

    International Nuclear Information System (INIS)

    Pham, Tuan H.; Coquoz, Olivier; Fishkin, Joshua B.; Anderson, Eric; Tromberg, Bruce J.

    2000-01-01

    Near-infrared (NIR) optical properties of turbid media, e.g., tissue, can be accurately quantified noninvasively using methods based on diffuse reflectance or transmittance, such as frequency domain photon migration (FDPM). Factors which govern the accuracy and sensitivity of FDPM-measured optical properties include instrument performance, the light propagation model, and fitting algorithms used to calculate optical properties from measured data. In this article, we characterize instrument, model, and fitting uncertainties of an FDPM system designed for clinical use and investigate how each of these factors affects the quantification of NIR absorption (μa) and reduced scattering (μs′) parameters in tissue phantoms. The instrument is based on a 500 MHz, multiwavelength platform that sweeps through 201 discrete frequencies in as little as 675 ms. Phase and amplitude of intensity modulated light launched into tissue, i.e., diffuse photon density waves (PDW), are measured with an accuracy of ±0.30° and ±3.5%, while phase and amplitude precision are ±0.025° and ±0.20%, respectively. At this level of instrument uncertainty, simultaneous fitting of frequency-dependent phase and amplitude nonlinear model functions derived from a photon diffusion approximation provides an accurate and robust strategy for determining optical properties from FDPM data, especially for media with high absorption. In an optical property range that is characteristic of most human tissues in the NIR (μa ≥ 5×10⁻³ mm⁻¹, μs′ ≥ 0.5 mm⁻¹), we theoretically and experimentally demonstrate that the multifrequency, simultaneous-fit approach allows μa and μs′ to be quantified with an accuracy of ±5% and ±3%, respectively. Exceptionally high levels of precision in μa and μs′ can also be obtained using this approach. (c) 2000 American Institute of Physics

  1. Algorithms in Singular

    Directory of Open Access Journals (Sweden)

    Hans Schonemann

    1996-12-01

    Full Text Available Some algorithms for singularity theory and algebraic geometry. The use of Grobner basis computations for treating systems of polynomial equations has become an important tool in many areas. This paper introduces the concept of standard bases (a generalization of Grobner bases) and their application to some problems from algebraic geometry. The examples are presented as SINGULAR commands. A general introduction to Grobner bases can be found in the textbook [CLO], an introduction to syzygies in [E] and [St1]. SINGULAR is a computer algebra system for computing information about singularities, for use in algebraic geometry. The basic algorithms in SINGULAR are several variants of a general standard basis algorithm for general monomial orderings (see [GG]). This includes well-orderings (Buchberger algorithm [B1], [B2]) and tangent cone orderings (Mora algorithm [M1], [MPT]) as special cases: it is able to work with non-homogeneous and homogeneous input and also to compute in the localization of the polynomial ring at 0. Recent versions include algorithms to factorize polynomials and a factorizing Grobner basis algorithm. For a complete description of SINGULAR see [Si].
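
    The paper's examples are written as SINGULAR commands; purely as an illustration of the same kind of Grobner/standard-basis computation, here is a tiny equivalent in Python using SymPy (not SINGULAR).

        from sympy import groebner, symbols

        x, y, z = symbols('x y z')
        # Grobner basis of the ideal (x^2 + y^2 + z^2 - 1, x + y + z) w.r.t. lexicographic order.
        G = groebner([x**2 + y**2 + z**2 - 1, x + y + z], x, y, z, order='lex')
        print(G.exprs)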

  2. Multimodal optimization by using hybrid of artificial bee colony algorithm and BFGS algorithm

    Science.gov (United States)

    Anam, S.

    2017-10-01

    Optimization has become one of the important fields in mathematics. Many problems in engineering and science can be formulated as optimization problems, and they may have many local optima. An optimization problem with many local optima, known as a multimodal optimization problem, poses the challenge of finding the global solution. Several metaheuristic methods have been proposed to solve multimodal optimization problems, such as Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), the Artificial Bee Colony (ABC) algorithm, etc. The performance of the ABC algorithm is better than or similar to that of other population-based algorithms, with the advantage of employing fewer control parameters. The ABC algorithm also has the advantages of strong robustness, fast convergence and high flexibility. However, it suffers from premature convergence in the later search period, and the accuracy of the optimal value sometimes cannot meet requirements. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm is a good iterative method for finding a local optimum and compares favourably with other local optimization methods. Based on the advantages of the ABC algorithm and the BFGS algorithm, this paper proposes a hybrid of the artificial bee colony algorithm and the BFGS algorithm to solve the multimodal optimization problem. In the first step, the ABC algorithm is run to find a point; in the second step, that point is used as the initial point of the BFGS algorithm. The results show that the hybrid method overcomes the problems of the basic ABC algorithm for almost all test functions. However, if the function is flat, the proposed method does not work well.
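
    The two-step idea (a population-based global search whose best point seeds a local BFGS refinement) can be sketched as follows in Python. For brevity the global stage here is a crude random-sampling stand-in rather than a full ABC implementation; the local stage uses SciPy's BFGS, and the function and parameter names are illustrative.

        import numpy as np
        from scipy.optimize import minimize

        def global_then_bfgs(f, bounds, n_samples=2000, seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = np.asarray(bounds, dtype=float).T
            pop = rng.uniform(lo, hi, size=(n_samples, len(lo)))
            best = pop[np.argmin([f(p) for p in pop])]     # stage 1: rough global search
            return minimize(f, best, method='BFGS')        # stage 2: local BFGS refinement

        # Example on a multimodal test function (2-D Rastrigin).
        rastrigin = lambda x: 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
        result = global_then_bfgs(rastrigin, bounds=[(-5.12, 5.12)] * 2)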

  3. Modified Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Surafel Luleseged Tilahun

    2012-01-01

    Full Text Available The firefly algorithm is one of the new metaheuristic algorithms for optimization problems. The algorithm is inspired by the flashing behavior of fireflies. In the algorithm, randomly generated solutions are considered as fireflies, and brightness is assigned depending on their performance on the objective function. One of the rules used to construct the algorithm is that a firefly is attracted to any brighter firefly and, if there is no brighter firefly, it moves randomly. In this paper we modify this random movement of the brightest firefly by generating random directions in order to determine the best direction in which the brightness increases; if such a direction is not generated, it remains in its current position. Furthermore, the assignment of attractiveness is modified in such a way that the effect of the objective function is magnified. Simulation results show that the modified firefly algorithm performs better than the standard one in finding the best solution with smaller CPU time.
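
    For reference, a compact Python sketch of the standard firefly algorithm that the paper modifies is given below; the proposed change to the brightest firefly's random move is not reproduced. The parameters alpha, beta0 and gamma are illustrative, and smaller objective values are treated as "brighter".

        import numpy as np

        def firefly(f, bounds, n=20, iters=100, alpha=0.2, beta0=1.0, gamma=1.0, seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = np.asarray(bounds, dtype=float).T
            X = rng.uniform(lo, hi, size=(n, len(lo)))
            cost = np.array([f(x) for x in X])             # lower cost = brighter firefly
            for _ in range(iters):
                for i in range(n):
                    for j in range(n):
                        if cost[j] < cost[i]:              # move firefly i toward brighter firefly j
                            r2 = np.sum((X[i] - X[j]) ** 2)
                            X[i] += (beta0 * np.exp(-gamma * r2) * (X[j] - X[i])
                                     + alpha * (rng.random(len(lo)) - 0.5))
                            X[i] = np.clip(X[i], lo, hi)
                            cost[i] = f(X[i])
            best = int(np.argmin(cost))
            return X[best], cost[best]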

  4. Instrumentation

    International Nuclear Information System (INIS)

    Muehllehner, G.; Colsher, J.G.

    1982-01-01

    This chapter reviews the parameters which are important to positron-imaging instruments. It summarizes the options which various groups have explored in designing tomographs and the methods which have been developed to overcome some of the limitations inherent in the technique as well as in present instruments. The chapter is not presented as a defense of positron imaging versus single-photon or other imaging modality, neither does it contain a description of various existing instruments, but rather stresses their common properties and problems. Design parameters which are considered are resolution, sampling requirements, sensitivity, methods of eliminating scattered radiation, random coincidences and attenuation. The implementation of these parameters is considered, with special reference to sampling, choice of detector material, detector ring diameter and shielding and variations in point spread function. Quantitation problems discussed are normalization, and attenuation and random corrections. Present developments mentioned are noise reduction through time-of-flight-assisted tomography and signal to noise improvements through high intrinsic resolution. Extensive bibliography. (U.K.)

  5. Triaxial Accelerometer Error Coefficients Identification with a Novel Artificial Fish Swarm Algorithm

    Directory of Open Access Journals (Sweden)

    Yanbin Gao

    2015-01-01

    Full Text Available The artificial fish swarm algorithm (AFSA) is one of the state-of-the-art swarm intelligence techniques and is widely utilized for optimization purposes. Triaxial accelerometer error coefficients are relatively unstable under environmental disturbances and aging of the instrument. Therefore, identifying triaxial accelerometer error coefficients accurately and at low cost is of great importance for improving the overall performance of a triaxial accelerometer-based strapdown inertial navigation system (SINS). In this study, a novel artificial fish swarm algorithm (NAFSA) that eliminates the demerits of AFSA (failure to use the artificial fishes' previous experience, lack of balance between exploration and exploitation, and high computational cost) is introduced first. In NAFSA, the functional behaviors and overall procedure of AFSA have been improved through some parameter variations. Second, a hybrid accelerometer error coefficient identification algorithm is proposed based on NAFSA and Monte Carlo simulation (MCS) approaches. This combination leads to maximum utilization of the involved approaches for triaxial accelerometer error coefficient identification. Furthermore, the NAFSA-identified coefficients are tested with a 24-position verification experiment and a triaxial accelerometer-based SINS navigation experiment. The merits of MCS-NAFSA are compared with those of the conventional calibration method and optimal AFSA. Finally, both experiments demonstrate the high efficiency of MCS-NAFSA for triaxial accelerometer error coefficient identification.

  6. Network-Oblivious Algorithms

    DEFF Research Database (Denmark)

    Bilardi, Gianfranco; Pietracaprina, Andrea; Pucci, Geppino

    2016-01-01

    A framework is proposed for the design and analysis of network-oblivious algorithms, namely algorithms that can run unchanged, yet efficiently, on a variety of machines characterized by different degrees of parallelism and communication capabilities. The framework prescribes that a network-oblivious algorithm be specified on a parallel model of computation where the only parameter is the problem's input size, and then evaluated on a model with two parameters, capturing parallelism granularity and communication latency. It is shown that for a wide class of network-oblivious algorithms, optimality ... The framework extends the notion of obliviousness, studied in the context of cache hierarchies, to the realm of parallel computation. Its effectiveness is illustrated by providing optimal network-oblivious algorithms for a number of key problems. Some limitations of the oblivious approach are also discussed.

  7. A novel hybrid algorithm of GSA with Kepler algorithm for numerical optimization

    Directory of Open Access Journals (Sweden)

    Soroor Sarafrazi

    2015-07-01

    Full Text Available It is now well recognized that pure algorithms can be promisingly improved by hybridization with other techniques. One of the relatively new metaheuristic algorithms is the Gravitational Search Algorithm (GSA), which is based on Newton's laws. In this paper, to enhance the performance of GSA, a novel algorithm called "Kepler", inspired by astrophysics, is introduced. The Kepler algorithm is based on the principle of Kepler's first law. The hybridization of GSA and the Kepler algorithm is an efficient approach to providing much stronger specialization in intensification and/or diversification. The performance of GSA–Kepler is evaluated by applying it to 14 benchmark functions with 20–1000 dimensions and to the optimal approximation of a linear system as a practical optimization problem. The results obtained reveal that the proposed hybrid algorithm is robust enough to optimize the benchmark functions and practical optimization problems.

  8. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  9. Problems with radiological surveillance instrumentation

    International Nuclear Information System (INIS)

    Swinth, K.L.; Tanner, J.E.; Fleming, D.M.

    1984-09-01

    Many radiological surveillance instruments are in use at DOE facilities throughout the country. These instruments are an essential part of all health physics programs, and poor instrument performance can increase program costs or compromise program effectiveness. Generic data from simple tests on newly purchased instruments shows that many instruments will not meet requirements due to manufacturing defects. In other cases, lack of consideration of instrument use has resulted in poor acceptance of instruments and poor reliability. The performance of instruments is highly variable for electronic and mechanical performance, radiation response, susceptibility to interferences and response to environmental factors. Poor instrument performance in these areas can lead to errors or poor accuracy in measurements

  10. Problems with radiological surveillance instrumentation

    International Nuclear Information System (INIS)

    Swinth, K.L.; Tanner, J.E.; Fleming, D.M.

    1985-01-01

    Many radiological surveillance instruments are in use at DOE facilities throughout the country. These instruments are an essential part of all health physics programs, and poor instrument performance can increase program costs or compromise program effectiveness. Generic data from simple tests on newly purchased instruments shows that many instruments will not meet requirements due to manufacturing defects. In other cases, lack of consideration of instrument use has resulted in poor acceptance of instruments and poor reliability. The performance of instruments is highly variable for electronic and mechanical performance, radiation response, susceptibility to interferences and response to environmental factors. Poor instrument performance in these areas can lead to errors or poor accuracy in measurements

  11. Internet Astrometry

    Science.gov (United States)

    Caballero, Rafael; Argyle, R. W.

    Amateur astronomers can carry out scientific research in many different ways. Some activities require expensive telescopes, cameras, and often access to dark skies. But those who live in highly polluted areas, or do not have access to very specialized equipment, still have many possibilities; amongst which is using the online resources available from the internet. In this chapter we explore Aladin, Simbad, and VizieR, three resources created and maintained by the Centre de Données astronomiques de Strasbourg (CDS). Although these applications are intended for professional astronomers, they are also freely available for amateurs. They allow us to find and measure old neglected difficult pairs, discover new double stars, and in general have a better understanding of those tiny pairs of points of light that we love to observe, photograph and measure.

  12. Enhancement and evaluation of an algorithm for atmospheric profiling continuity from Aqua to Suomi-NPP

    Science.gov (United States)

    Lipton, A.; Moncet, J. L.; Payne, V.; Lynch, R.; Polonsky, I. N.

    2017-12-01

    We will present recent results from an algorithm for producing climate-quality atmospheric profiling earth system data records (ESDRs) for application to data from hyperspectral sounding instruments, including the Atmospheric InfraRed Sounder (AIRS) on EOS Aqua and the Cross-track Infrared Sounder (CrIS) on Suomi-NPP, along with their companion microwave sounders, AMSU and ATMS, respectively. The ESDR algorithm uses an optimal estimation approach and the implementation has a flexible, modular software structure to support experimentation and collaboration. Data record continuity benefits from the fact that the same algorithm can be applied to different sensors, simply by providing suitable configuration and data files. Developments to be presented include the impact of a radiance-based pre-classification method for the atmospheric background. In addition to improving retrieval performance, pre-classification has the potential to reduce the sensitivity of the retrievals to the climatological data from which the background estimate and its error covariance are derived. We will also discuss evaluation of a method for mitigating the effect of clouds on the radiances, and enhancements of the radiative transfer forward model.
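
    Such optimal-estimation retrievals are built around a Gauss-Newton update of the state toward the maximum a posteriori solution. The Python sketch below shows one generic Rodgers-style iteration step; it is a schematic of the approach, not the ESDR code itself, and all variable names are illustrative.

        import numpy as np

        def oe_step(x_i, x_a, y, F, K, Sa, Se):
            """One optimal-estimation (Gauss-Newton) update.

            x_i : current state iterate          x_a : a priori (background) state
            y   : measurement vector             F   : forward model, F(x) -> simulated y
            K   : Jacobian dF/dx at x_i          Sa, Se : a priori and measurement error covariances
            """
            Se_inv = np.linalg.inv(Se)
            Sa_inv = np.linalg.inv(Sa)
            posterior_precision = K.T @ Se_inv @ K + Sa_inv
            rhs = K.T @ Se_inv @ (y - F(x_i) + K @ (x_i - x_a))
            return x_a + np.linalg.solve(posterior_precision, rhs)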

  13. DEVELOPMENT OF A NEW ALGORITHM FOR KEY AND S-BOX GENERATION IN BLOWFISH ALGORITHM

    Directory of Open Access Journals (Sweden)

    TAYSEER S. ATIA

    2014-08-01

    Full Text Available The Blowfish algorithm is a strong, simple block cipher that encrypts data in 64-bit blocks. The key and S-box generation process in this algorithm requires considerable time and memory, which makes the algorithm inconvenient for smart cards or for applications that require changing the secret key frequently. In this paper, a new key and S-box generation process is developed based on the Self-Synchronizing Stream Cipher (SSS) algorithm, whose key generation process was modified for use with the Blowfish algorithm. Test results show that the generation process requires relatively little time and a reasonably low amount of memory, which enhances the algorithm and makes it suitable for a wider range of uses.

  14. Monte Carlo algorithms with absorbing Markov chains: Fast local algorithms for slow dynamics

    International Nuclear Information System (INIS)

    Novotny, M.A.

    1995-01-01

    A class of Monte Carlo algorithms which incorporate absorbing Markov chains is presented. In a particular limit, the lowest order of these algorithms reduces to the n-fold way algorithm. These algorithms are applied to study the escape from the metastable state in the two-dimensional square-lattice nearest-neighbor Ising ferromagnet in an unfavorable applied field, and the agreement with theoretical predictions is very good. It is demonstrated that the higher-order algorithms can be many orders of magnitude faster than either the traditional Monte Carlo or n-fold way algorithms

  15. Dynamic route guidance algorithm based on artificial immune system

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    To improve the performance of K-shortest-path search in intelligent traffic guidance systems, this paper proposes an optimal search algorithm based on intelligent optimization search theory and the memory mechanism of vertebrate immune systems. This algorithm, applied to an urban traffic network model established by the node-expanding method, can efficiently perform K-shortest-path searches in urban traffic guidance systems. Because of the immune memory and global parallel search ability inherited from artificial immune systems, the K shortest paths can be found without repetition, which clearly demonstrates the superiority of the algorithm over conventional ones. The algorithm not only offers better parallelism but also avoids the premature convergence that often occurs in genetic algorithms. Thus, it is especially suitable for the real-time requirements of traffic guidance systems and other engineering optimization applications. A case study verifies the efficiency and practicability of the algorithm.
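
    The immune-memory search itself is not reproduced here; purely to illustrate the underlying K-shortest-paths task on a road network, the Python sketch below uses NetworkX's loopless simple-path generator, which yields routes in order of increasing length. The toy graph and the choice K = 3 are arbitrary.

        import itertools
        import networkx as nx

        G = nx.DiGraph()
        G.add_weighted_edges_from([('A', 'B', 2), ('B', 'D', 2), ('A', 'C', 1),
                                   ('C', 'D', 4), ('B', 'C', 1)])
        # Take the first K loopless paths from A to D, shortest first.
        k_shortest = list(itertools.islice(
            nx.shortest_simple_paths(G, 'A', 'D', weight='weight'), 3))
        print(k_shortest)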

  16. Regression Analysis of Long-Term Profile Ozone Data Set from BUV Instruments

    Science.gov (United States)

    Stolarski, Richard S.

    2005-01-01

    We have produced a profile merged ozone data set (MOD) based on the SBUV/SBUV2 series of nadir-viewing satellite backscatter instruments, covering the period from November 1978 to December 2003. In 2004, data from the Nimbus 7 SBUV and NOAA 9, 11, and 16 SBUV/2 instruments were reprocessed using the Version 8 (V8) algorithm and the most recent calibrations. More recently, data from the Nimbus 4 BUV instrument, which was operational from 1970 to 1977, were also reprocessed using the V8 algorithm. As part of the V8 profile calibration, the Nimbus 7 and NOAA 9 (1993-1997 only) instrument calibrations have been adjusted to match the NOAA 11 calibration, which was established based on comparisons with SSBUV shuttle flight data. Differences between NOAA 11, Nimbus 7 and NOAA 9 profile zonal means are within plus or minus 5% at all levels when averaged over the respective periods of data overlap. NOAA 16 SBUV/2 data have insufficient overlap with NOAA 11, so its calibration is based on pre-flight information. Mean differences over 4 months of overlap are within plus or minus 7%. Given the level of agreement between the data sets, we simply average the ozone values during periods of instrument overlap to produce the MOD profile data set. Initial comparisons of coincident matches of N4 BUV and Arosa Umkehr data show mean differences of 0.5 (0.5)% at 30 km; 7.5 (0.5)% at 35 km; and 11 (0.7)% at 40 km, where the number in parentheses is the standard error of the mean. In this study, we use the MOD profile data set (1978-2003) to estimate the change in profile ozone due to changing stratospheric chlorine levels. We use a standard linear regression model with proxies for the seasonal cycle, solar cycle, QBO, and ozone trend. To account for the non-linearity of stratospheric chlorine levels since the late 1990s, we use a time series of Effective Chlorine, defined as the global average of Chlorine + 50 * Bromine at 1 hPa, as the trend proxy. The Effective Chlorine data are taken from
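
    The regression step described above can be illustrated with a short Python sketch: monthly ozone anomalies are regressed on seasonal harmonics plus solar, QBO and effective-chlorine proxies, and the coefficient on the effective-chlorine term carries the trend attribution. The proxy series are stand-ins that the user must supply; this is a schematic, not the authors' exact model.

        import numpy as np

        def fit_ozone_regression(t_months, ozone, solar, qbo, eff_cl):
            """Least-squares fit of ozone on seasonal, solar, QBO and effective-chlorine terms."""
            harmonics = [f(2.0 * np.pi * k * t_months / 12.0)
                         for k in (1, 2) for f in (np.sin, np.cos)]
            X = np.column_stack([np.ones_like(t_months), *harmonics, solar, qbo, eff_cl])
            beta, *_ = np.linalg.lstsq(X, ozone, rcond=None)
            return beta          # the last coefficient scales the effective-chlorine proxy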

  17. Evaluation of a photovoltaic energy mechatronics system with a built-in quadratic maximum power point tracking algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Chao, R.M.; Ko, S.H.; Lin, I.H. [Department of Systems and Naval Mechatronics Engineering, National Cheng Kung University, Tainan, Taiwan 701 (China); Pai, F.S. [Department of Electronic Engineering, National University of Tainan (China); Chang, C.C. [Department of Environment and Energy, National University of Tainan (China)

    2009-12-15

    The historically high price of crude oil is stimulating research into solar (green) energy as an alternative energy source. In general, applications with large solar energy output require a maximum power point tracking (MPPT) algorithm to optimize the power generated by the photovoltaic effect. This work aims to provide a stand-alone solution for solar energy applications by integrating a DC/DC buck converter with a newly developed quadratic MPPT algorithm along with its appropriate software and hardware. The quadratic MPPT method utilizes three previously used duty cycles with their corresponding power outputs. It approaches the maximum value by using a second-order polynomial formula, which converges faster than existing MPPT algorithms. The hardware implementation takes advantage of the real-time controller system from National Instruments, USA. Experimental results have shown that the proposed solar mechatronics system can correctly and effectively track the maximum power point without any difficulties. (author)
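
    The quadratic step described above (fit a parabola through the three most recent duty-cycle/power pairs and jump to its vertex) can be sketched in a few lines of Python. Guard logic, step limiting and the converter interface of the real controller are omitted; this is an illustrative sketch only.

        import numpy as np

        def quadratic_mppt_step(duties, powers):
            """duties, powers: the three most recent duty cycles and measured output powers."""
            a, b, c = np.polyfit(duties, powers, 2)            # P(d) ~ a*d^2 + b*d + c
            if a >= 0.0:                                       # no interior maximum: fall back to best point
                return float(duties[int(np.argmax(powers))])
            return float(np.clip(-b / (2.0 * a), 0.0, 1.0))    # vertex of the fitted parabola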

  18. Hamiltonian Algorithm Sound Synthesis

    OpenAIRE

    大矢, 健一

    2013-01-01

    Hamiltonian Algorithm (HA) is an algorithm for searching for solutions to optimization problems. This paper introduces a sound synthesis technique using the Hamiltonian Algorithm and shows a simple example. "Hamiltonian Algorithm Sound Synthesis" uses the phase transition effect in HA. Because of this transition effect, totally new waveforms are produced.

  19. OBSERVATIONS OF BINARY STARS WITH THE DIFFERENTIAL SPECKLE SURVEY INSTRUMENT. III. MEASURES BELOW THE DIFFRACTION LIMIT OF THE WIYN TELESCOPE

    International Nuclear Information System (INIS)

    Horch, Elliott P.; Van Altena, William F.; Howell, Steve B.; Sherry, William H.; Ciardi, David R.

    2011-01-01

    In this paper, we study the ability of CCD- and electron-multiplying-CCD-based speckle imaging to obtain reliable astrometry and photometry of binary stars below the diffraction limit of the WIYN 3.5 m Telescope. We present a total of 120 measures of binary stars, 75 of which are below the diffraction limit. The measures are divided into two groups that have different measurement accuracy and precision. The first group is composed of standard speckle observations, that is, a sequence of speckle images taken in a single filter, while the second group consists of paired observations where the two observations are taken on the same observing run and in different filters. The more recent paired observations were taken simultaneously with the Differential Speckle Survey Instrument, which is a two-channel speckle imaging system. In comparing our results to the ephemeris positions of binaries with known orbits, we find that paired observations provide the opportunity to identify cases of systematic error in separation below the diffraction limit and after removing these from consideration, we obtain a linear measurement uncertainty of 3-4 mas. However, if observations are unpaired or if two observations taken in the same filter are paired, it becomes harder to identify cases of systematic error, presumably because the largest source of this error is residual atmospheric dispersion, which is color dependent. When observations are unpaired, we find that it is unwise to report separations below approximately 20 mas, as these are most susceptible to this effect. Using the final results obtained, we are able to update two older orbits in the literature and present preliminary orbits for three systems that were discovered by Hipparcos.

  20. Modified Clipped LMS Algorithm

    Directory of Open Access Journals (Sweden)

    Lotfizad Mojtaba

    2005-01-01

    Full Text Available A new algorithm is proposed for updating the weights of an adaptive filter. The proposed algorithm is a modification of an existing method, namely, the clipped LMS, and uses a three-level quantization scheme that involves threshold clipping of the input signals in the filter weight update formula. Mathematical analysis shows the convergence of the filter weights to the optimum Wiener filter weights. Also, it can be proved that the proposed modified clipped LMS (MCLMS) algorithm has better tracking than the LMS algorithm. In addition, this algorithm has reduced computational complexity relative to the unmodified one. By using a suitable threshold, it is possible to increase the tracking capability of the MCLMS algorithm compared to the LMS algorithm, but this causes slower convergence. Computer simulations confirm the mathematical analysis presented.
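
    The update idea can be sketched in Python as follows: the input vector entering the LMS weight update is replaced by a three-level (-1, 0, +1) threshold-clipped version. The step size and threshold are illustrative, and this is a schematic of the idea rather than the authors' exact formulation.

        import numpy as np

        def mclms_update(w, x, d, mu=0.01, threshold=0.1):
            """One adaptive-filter step: w = weights, x = input vector, d = desired sample."""
            e = d - w @ x                                            # a priori error
            q = np.where(np.abs(x) > threshold, np.sign(x), 0.0)     # three-level quantized input
            return w + mu * e * q, e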

  1. Algorithms as fetish: Faith and possibility in algorithmic work

    Directory of Open Access Journals (Sweden)

    Suzanne L Thomas

    2018-01-01

    Full Text Available Algorithms are powerful because we invest in them the power to do things. With such promise, they can transform the ordinary, say snapshots along a robotic vacuum cleaner’s route, into something much more, such as a clean home. Echoing David Graeber’s revision of fetishism, we argue that this easy slip from technical capabilities to broader claims betrays not the “magic” of algorithms but rather the dynamics of their exchange. Fetishes are not indicators of false thinking, but social contracts in material form. They mediate emerging distributions of power often too nascent, too slippery or too disconcerting to directly acknowledge. Drawing primarily on 2016 ethnographic research with computer vision professionals, we show how faith in what algorithms can do shapes the social encounters and exchanges of their production. By analyzing algorithms through the lens of fetishism, we can see the social and economic investment in some people’s labor over others. We also see everyday opportunities for social creativity and change. We conclude that what is problematic about algorithms is not their fetishization but instead their stabilization into full-fledged gods and demons – the more deserving objects of critique.

  2. Quick fuzzy backpropagation algorithm.

    Science.gov (United States)

    Nikov, A; Stoeva, S

    2001-03-01

    A modification of the fuzzy backpropagation (FBP) algorithm called the QuickFBP algorithm is proposed, in which the computation of the net function is significantly quicker. It is proved that the FBP algorithm is of exponential time complexity, while the QuickFBP algorithm is of polynomial time complexity. Convergence conditions of the QuickFBP and the FBP algorithm are defined and proved for: (1) single-output neural networks in the case of training patterns with different targets; and (2) multiple-output neural networks in the case of training patterns with an equivalued target vector. They support the automation of the weight training process (quasi-unsupervised learning), establishing the target value(s) depending on the network's input values. In these cases the simulation results confirm the convergence of both algorithms. An example with a large-sized neural network illustrates the significantly greater training speed of the QuickFBP compared with the FBP algorithm. The adaptation of an interactive web system to users on the basis of the QuickFBP algorithm is presented. Since the QuickFBP algorithm ensures quasi-unsupervised learning, this implies its broad applicability in areas such as adaptive and adaptable interactive systems, data mining, and other applications.

  3. A New Modified Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Medha Gupta

    2016-07-01

    Full Text Available Nature-inspired meta-heuristic algorithms study the emergent collective intelligence of groups of simple agents. The firefly algorithm is one such new swarm-based metaheuristic algorithm, inspired by the flashing behavior of fireflies. The algorithm was first proposed in 2008 and has since been successfully used for solving various optimization problems. In this work, we propose a new modified version of the firefly algorithm (MoFA) and compare its performance with the standard firefly algorithm and various other meta-heuristic algorithms. Numerical studies and results demonstrate that the proposed algorithm is superior to existing algorithms.

  4. Quantum Computation and Algorithms

    International Nuclear Information System (INIS)

    Biham, O.; Biron, D.; Biham, E.; Grassi, M.; Lidar, D.A.

    1999-01-01

    It is now firmly established that quantum algorithms provide a substantial speedup over classical algorithms for a variety of problems, including the factorization of large numbers and the search for a marked element in an unsorted database. In this talk I will review the principles of quantum algorithms, the basic quantum gates and their operation. The combination of superposition and interference, that makes these algorithms efficient, will be discussed. In particular, Grover's search algorithm will be presented as an example. I will show that the time evolution of the amplitudes in Grover's algorithm can be found exactly using recursion equations, for any initial amplitude distribution
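
    For the simplest case of a single marked item among N, the amplitude recursion mentioned above closes on just two numbers: the marked amplitude k and the common unmarked amplitude l. The Python sketch below iterates that standard pair of recursions from a uniform initial distribution; it is an illustration of the idea, not the talk's general derivation for arbitrary initial amplitude distributions.

        import numpy as np

        def grover_amplitudes(N, iterations):
            """Marked (k) and unmarked (l) amplitudes after each Grover iteration."""
            k = l = 1.0 / np.sqrt(N)                 # uniform initial superposition
            history = [(k, l)]
            for _ in range(iterations):
                k, l = (((N - 2) / N) * k + (2 * (N - 1) / N) * l,
                        ((N - 2) / N) * l - (2 / N) * k)
                history.append((k, l))
            return history

        # For N = 1024 the success probability k**2 peaks near (pi/4)*sqrt(N), i.e. ~25 iterations.
        success_prob = [k ** 2 for k, _ in grover_amplitudes(1024, 25)]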

  5. Scientific planning for the VLT and VLTI

    Science.gov (United States)

    Leibundgut, B.; Berger, J.-P.

    2016-07-01

    An observatory system like the VLT/I requires careful scientific planning for operations and future instruments. Currently the ESO optical/near-infrared facilities include four 8m telescopes, four (movable) 1.8m telescopes used exclusively for interferometry, two 4m telescopes and two survey telescopes. This system offers a large range of scientific capabilities, and setting the corresponding priorities depends on good interaction with the community. Coordinating the existing and planned instrumentation is an important aspect of ensuring a strong scientific return. The current scientific priorities for the VLT and VLTI are pushing for the development of the highest-angular-resolution imaging and astrometry, integral field spectroscopy and multi-object spectroscopy. The ESO 4m telescopes on La Silla will be dedicated to time-domain spectroscopy and exoplanet searches with highly specialized instruments. The next decade will also see a significant rise in the scientific importance of massive ground- and space-based surveys. We discuss how future developments in astronomical research could shape the VLT/I evolution.

  6. Next-generation marine instruments to join plume debate

    Science.gov (United States)

    Simons, F. J.; Nolet, G.; Babcock, J.

    2003-12-01

    instrument is able to maintain a constant water column depth below the sound channel and will surface only periodically for position determination and satellite data communication. Using these low-cost, non-recovered floating sensors, the aperture of arrays mounted on oceanic islands can be increased manifold. Furthermore, adding such instruments to poorly instrumented areas will improve the resolution of deep Earth structure more dramatically than the addition of stations in already densely sampled continental areas. Our progress has been made in the design of intelligent algorithms for the automatic identification and discrimination of seismic phases that are expected to be recorded. We currently recognize teleseismic arrivals in the presence of local P, S, and T phases, ship and whale noise, and other contaminating factors such as airgunning. Our approach combines continuous time-domain processing, spectrogram analysis, and custom-made wavelet methods new to global seismology. The lifespan and cost of the instrument are critically dependent on its ability to limit its power consumption by using a minimum amount of processing steps. Hence, we pay particular attention to the numerical implementation and efficiency of our algorithms, which are shown to be accurate while approaching a theoretical limit of efficiency. We show examples on data from ridge-tethered hydrophones and expect preliminary results from a first test deployment in October.

  7. Bridging Ground Validation and Algorithms: Using Scattering and Integral Tables to Incorporate Observed DSD Correlations into Satellite Algorithms

    Science.gov (United States)

    Williams, C. R.

    2012-12-01

    -polarization at the instrument view angles of nadir to 17 degrees (for DPR) and 48 & 53 degrees off nadir (for GMI). The GPM DSD Working Group is generating integral tables with GV observed DSD correlations and is performing sensitivity and verification tests. One advantage of keeping scattering tables separate from integral tables is that research can progress on the electromagnetic scattering of particles independent of cloud microphysics research. Another advantage of keeping the tables separate is that multiple scattering tables will be needed for frozen precipitation. Scattering tables are being developed for individual frozen particles based on habit, density and operating frequency. And a third advantage of keeping scattering and integral tables separate is that this framework provides an opportunity to communicate GV findings about DSD correlations into integral tables, and thus, into satellite algorithms.

  8. Semioptimal practicable algorithmic cooling

    International Nuclear Information System (INIS)

    Elias, Yuval; Mor, Tal; Weinstein, Yossi

    2011-01-01

    Algorithmic cooling (AC) of spins applies entropy manipulation algorithms in open spin systems in order to cool spins far beyond Shannon's entropy bound. Algorithmic cooling of nuclear spins was demonstrated experimentally and may contribute to nuclear magnetic resonance spectroscopy. Several cooling algorithms were suggested in recent years, including practicable algorithmic cooling (PAC) and exhaustive AC. Practicable algorithms have simple implementations, yet their level of cooling is far from optimal; exhaustive algorithms, on the other hand, cool much better, and some even reach (asymptotically) an optimal level of cooling, but they are not practicable. We introduce here semioptimal practicable AC (SOPAC), wherein a few cycles (typically two to six) are performed at each recursive level. Two classes of SOPAC algorithms are proposed and analyzed. Both attain cooling levels significantly better than PAC and are much more efficient than the exhaustive algorithms. These algorithms are shown to bridge the gap between PAC and exhaustive AC. In addition, we calculated the number of spins required by SOPAC in order to purify qubits for quantum computation. As few as 12 and 7 spins are required (in an ideal scenario) to yield a mildly pure spin (60% polarized) from initial polarizations of 1% and 10%, respectively. In the latter case, about five more spins are sufficient to produce a highly pure spin (99.99% polarized), which could be relevant for fault-tolerant quantum computing.

  9. The Ocean Colour Climate Change Initiative: III. A Round-Robin Comparison on In-Water Bio-Optical Algorithms

    Science.gov (United States)

    Brewin, Robert J.W.; Sathyendranath, Shubha; Muller, Dagmar; Brockmann, Carsten; Deschamps, Pierre-Yves; Devred, Emmanuel; Doerffer, Roland; Fomferra, Norman; Franz, Bryan; Grant, Mike; hide

    2013-01-01

    Satellite-derived remote-sensing reflectance (Rrs) can be used for mapping biogeochemically relevant variables, such as the chlorophyll concentration and the Inherent Optical Properties (IOPs) of the water, at global scale for use in climate-change studies. Prior to generating such products, suitable algorithms have to be selected that are appropriate for the purpose. Algorithm selection needs to account for both qualitative and quantitative requirements. In this paper we develop an objective methodology designed to rank the quantitative performance of a suite of bio-optical models. The objective classification is applied using the NASA bio-Optical Marine Algorithm Dataset (NOMAD). Using in situ Rrs as input to the models, the performance of eleven semianalytical models, as well as five empirical chlorophyll algorithms and an empirical diffuse attenuation coefficient algorithm, is ranked for spectrally-resolved IOPs, chlorophyll concentration and the diffuse attenuation coefficient at 489 nm. The sensitivity of the objective classification and the uncertainty in the ranking are tested using a Monte-Carlo approach (bootstrapping). Results indicate that the performance of the semi-analytical models varies depending on the product and wavelength of interest. For chlorophyll retrieval, empirical algorithms perform better than semi-analytical models, in general. The performance of these empirical models reflects either their immunity to scale errors or instrument noise in Rrs data, or simply that the data used for model parameterisation were not independent of NOMAD. Nonetheless, uncertainty in the classification suggests that the performance of some semi-analytical algorithms at retrieving chlorophyll is comparable with the empirical algorithms. For phytoplankton absorption at 443 nm, some semi-analytical models also perform with similar accuracy to an empirical model. We discuss the potential biases, limitations and uncertainty in the approach, as well as additional

  10. A high-precision instrument for analyzing nonlinear dynamic behavior of bearing cage

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Z., E-mail: zhaohui@nwpu.edu.cn; Yu, T. [School of Aeronautics, Northwestern Polytechnical University, Xi’an 710072 (China); Chen, H. [Xi’an Aerospace Propulsion Institute, Xi’an 710100 (China); Li, B. [State Key Laboratory for Manufacturing and Systems Engineering, Xi’an Jiaotong University, Xi’an 710054 (China)

    2016-08-15

    The high-precision ball bearing is fundamental to the performance of complex mechanical systems. As the speed increases, the cage behavior becomes a key factor in influencing the bearing performance, especially life and reliability. This paper develops a high-precision instrument for analyzing nonlinear dynamic behavior of the bearing cage. The trajectory of the rotational center and non-repetitive run-out (NRRO) of the cage are used to evaluate the instability of cage motion. This instrument applied an aerostatic spindle to support and spin test the bearing to decrease the influence of system error. Then, a high-speed camera is used to capture images when the bearing works at high speeds. A 3D trajectory tracking software TEMA Motion is used to track the spot which marked the cage surface. Finally, by developing the MATLAB program, a Lissajous’ figure was used to evaluate the nonlinear dynamic behavior of the cage with different speeds. The trajectory of rotational center and NRRO of the cage with various speeds are analyzed. The results can be used to predict the initial failure and optimize cage structural parameters. In addition, the repeatability precision of instrument is also validated. In the future, the motorized spindle will be applied to increase testing speed and image processing algorithms will be developed to analyze the trajectory of the cage.

  11. A high-precision instrument for analyzing nonlinear dynamic behavior of bearing cage

    International Nuclear Information System (INIS)

    Yang, Z.; Yu, T.; Chen, H.; Li, B.

    2016-01-01

    The high-precision ball bearing is fundamental to the performance of complex mechanical systems. As the speed increases, the cage behavior becomes a key factor in influencing the bearing performance, especially life and reliability. This paper develops a high-precision instrument for analyzing nonlinear dynamic behavior of the bearing cage. The trajectory of the rotational center and non-repetitive run-out (NRRO) of the cage are used to evaluate the instability of cage motion. This instrument applied an aerostatic spindle to support and spin test the bearing to decrease the influence of system error. Then, a high-speed camera is used to capture images when the bearing works at high speeds. A 3D trajectory tracking software TEMA Motion is used to track the spot which marked the cage surface. Finally, by developing the MATLAB program, a Lissajous’ figure was used to evaluate the nonlinear dynamic behavior of the cage with different speeds. The trajectory of rotational center and NRRO of the cage with various speeds are analyzed. The results can be used to predict the initial failure and optimize cage structural parameters. In addition, the repeatability precision of instrument is also validated. In the future, the motorized spindle will be applied to increase testing speed and image processing algorithms will be developed to analyze the trajectory of the cage.

  12. Status of safeguards instrumentation

    International Nuclear Information System (INIS)

    Higinbotham, W.A.

    The International Atomic Energy Agency is performing safeguards at some nuclear power reactors, 50 bulk processing facilities, and 170 research facilities. Its verification activities require the use of instruments to measure nuclear materials and of surveillance instruments to maintain continuity of knowledge of the locations of nuclear materials. Instruments that are in use and under development to measure weight, volume, concentration, and isotopic composition of nuclear materials, and the major surveillance instruments, are described in connection with their uses at representative nuclear facilities. The current status of safeguards instrumentation and the needs for future development are discussed

  13. Early modern mathematical instruments.

    Science.gov (United States)

    Bennett, Jim

    2011-12-01

    In considering the appropriate use of the terms "science" and "scientific instrument," tracing the history of "mathematical instruments" in the early modern period is offered as an illuminating alternative to the historian's natural instinct to follow the guiding lights of originality and innovation, even if the trail transgresses contemporary boundaries. The mathematical instrument was a well-defined category, shared across the academic, artisanal, and commercial aspects of instrumentation, and its narrative from the sixteenth to the eighteenth century was largely independent from other classes of device, in a period when a "scientific" instrument was unheard of.

  14. Instrumental Capital

    Directory of Open Access Journals (Sweden)

    Gabriel Valerio

    2007-07-01

    Throughout human history, since our first ancestors, tools have been a means to reach objectives that might otherwise have seemed impossible. In the so-called New Economy, where tangible assets appear to be losing their role as the core element of value creation relative to knowledge, tools have remained at man's side in his daily work. In this article, the author's objective is to describe, in a simple manner, the importance of managing the organization's set of tools or instruments (Instrumental Capital). The characteristic conditions of this New Economy, the way Knowledge Management deals with these new conditions, and the sub-processes that support the management of Instrumental Capital are described.

  15. An Ordering Linear Unification Algorithm

    Institute of Scientific and Technical Information of China (English)

    胡运发

    1989-01-01

    In this paper, we present an ordering linear unification algorithm (OLU). A new idea on substitution of the binding terms is introduced to the algorithm, which overcomes some drawbacks of other algorithms, e.g., the MM algorithm [1] and the RG1 and RG2 algorithms [2]. In particular, if directed cyclic graphs are used, the algorithm need not check the binding order, so the OLU algorithm can also be applied to infinite tree data structures, and higher efficiency can be expected. The paper focuses on the discussion of the OLU algorithm and a partial order structure with respect to the unification algorithm. The algorithm has been implemented in the GKD-PROLOG/VAX 780 interpreting system. Experimental results have shown that the algorithm is very simple and efficient.
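
    For readers unfamiliar with term unification, the sketch below shows a textbook Robinson-style unifier with an occurs check. It is background for the abstract above, not the OLU algorithm itself, which differs precisely in how bindings are ordered and substituted.

    def is_var(t):
        """Variables are strings starting with '?'; compound terms are tuples
        (functor, arg1, ..., argn); anything else is a constant."""
        return isinstance(t, str) and t.startswith("?")

    def walk(t, subst):
        """Follow variable bindings until a non-bound term is reached."""
        while is_var(t) and t in subst:
            t = subst[t]
        return t

    def occurs(v, t, subst):
        """Occurs check: does variable v appear inside term t?"""
        t = walk(t, subst)
        if t == v:
            return True
        if isinstance(t, tuple):
            return any(occurs(v, a, subst) for a in t[1:])
        return False

    def bind(v, t, subst):
        if occurs(v, t, subst):
            return None                      # cyclic binding: fail
        new = dict(subst)
        new[v] = t
        return new

    def unify(t1, t2, subst=None):
        """Return a most general unifier (as a dict) or None on failure."""
        subst = {} if subst is None else subst
        t1, t2 = walk(t1, subst), walk(t2, subst)
        if t1 == t2:
            return subst
        if is_var(t1):
            return bind(t1, t2, subst)
        if is_var(t2):
            return bind(t2, t1, subst)
        if (isinstance(t1, tuple) and isinstance(t2, tuple)
                and len(t1) == len(t2) and t1[0] == t2[0]):
            for a, b in zip(t1[1:], t2[1:]):
                subst = unify(a, b, subst)
                if subst is None:
                    return None
            return subst
        return None

    print(unify(("f", "?x", "b"), ("f", "a", "?y")))   # -> {'?x': 'a', '?y': 'b'}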

  16. Effects of thermal deformation on optical instruments for space application

    Science.gov (United States)

    Segato, E.; Da Deppo, V.; Debei, S.; Cremonese, G.

    2017-11-01

    Optical instruments for space missions work in a hostile environment; it is thus necessary to study accurately the effects of variations in ambient parameters on the equipment. Optical instruments are particularly sensitive to ambient conditions, especially temperature, which can cause dilatations and misalignments of the optical elements and can also give rise to dangerous stresses in the optics. These displacements and deformations degrade the quality of the sampled images. In this work a method for studying the effects of temperature variations on the performance of an imaging instrument is presented. The optics and their mountings are modeled and processed by a thermo-mechanical Finite Element Model (FEM) analysis; the output data, which describe the deformations of the optical element surfaces, are then elaborated using an ad hoc MATLAB routine: a non-linear least-squares optimization algorithm determines the surface equations (plane, spherical, nth-order polynomial) that best fit the data. The resulting mathematical surface representations are imported directly into ZEMAX for sequential ray-tracing analysis. The results are the variations of the spot diagrams, the MTF curves and the diffraction ensquared energy due to the simulated thermal loads. The method has been successfully applied to the Stereo Camera for the BepiColombo mission, reproducing the expected operative conditions. The results help to design and compare different optical housing systems and show that it is preferable to use kinematic constraints on prisms and lenses to minimize the variation of the optical performance of the Stereo Camera.
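
    The surface-fitting step described above can be illustrated with a small sketch: fitting a best-fit sphere to deformed surface nodes with a non-linear least-squares solver. This is a generic illustration, not the authors' MATLAB routine; the function name and data layout are assumptions.

    import numpy as np
    from scipy.optimize import least_squares

    def fit_sphere(nodes):
        """Fit a best-fit sphere (centre + radius) to deformed FEM surface nodes.

        nodes : (N, 3) array of node coordinates exported from the FEM solver.
        Returns (centre, radius) of the least-squares sphere.
        """
        nodes = np.asarray(nodes, dtype=float)

        def residuals(p):
            cx, cy, cz, r = p
            return np.linalg.norm(nodes - [cx, cy, cz], axis=1) - r

        # Crude initial guess: the node centroid and the mean distance to it.
        c0 = nodes.mean(axis=0)
        r0 = np.linalg.norm(nodes - c0, axis=1).mean()
        sol = least_squares(residuals, x0=[c0[0], c0[1], c0[2], r0])
        return sol.x[:3], sol.x[3]

    # The fitted centre/radius (and, analogously, plane or polynomial
    # coefficients) can then be written into the ray-tracing model to
    # re-evaluate spot diagrams and MTF under the thermal load case.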

  17. Digital Signal Processing in Beam Instrumentation Latest Trends and Typical Applications

    CERN Document Server

    Angoletta, Maria Elena

    2003-01-01

    The last decade has seen major improvements in digital hardware, algorithms and software, which have trickled down to the Beam Instrumentation (BI) area. An advantageous transition is taking place towards systems with an ever-stronger digital presence. Digital systems are assembled from a rather small number of basic building blocks, with improved speed, precision, signal-to-noise ratio, dynamic range and flexibility, and are accompanied by a range of powerful and user-friendly development tools. The paper reviews current digital BI trends, including the use of Digital Signal Processors, Field Programmable Gate Arrays, Digital Receivers and General Purpose Processors, as well as some useful processing algorithms. Selected digital applications are illustrated for control/feedback and beam diagnostics.

  18. Evaluating musical instruments

    International Nuclear Information System (INIS)

    Campbell, D. Murray

    2014-01-01

    Scientific measurements of sound generation and radiation by musical instruments are surprisingly hard to correlate with the subtle and complex judgments of instrumental quality made by expert musicians

  19. Performance of a rain retrieval algorithm using TRMM data in the Eastern Mediterranean

    Directory of Open Access Journals (Sweden)

    D. Katsanos

    2006-01-01

    This study aims to make a regional characterization of the performance of the rain retrieval algorithm BRAIN, which estimates the rain rate from brightness temperatures measured by the TRMM Microwave Imager (TMI) onboard the TRMM satellite. In this stage of the study, a comparison between the rain estimated from the Precipitation Radar (PR) onboard TRMM (2A25 version 5) and the rain retrieved by the BRAIN algorithm is presented for about 30 satellite overpasses over the Central and Eastern Mediterranean during the period October 2003–March 2004, in order to assess the behavior of the algorithm in the Eastern Mediterranean region. BRAIN was built and tested using PR rain estimates distributed randomly over the whole TRMM sampling region. Characterization of the differences between PR and BRAIN over a specific region is thus interesting because it might show a local trend for one or the other of the instruments. The comparison of BRAIN results against the PR rain estimate appears to be consistent with former results, i.e. a somewhat marked discrepancy for the highest rain rates. This difference arises from a known problem that affects rain retrieval based on passive microwave radiometer measurements, but some of the higher radar rain rates could also be questioned. As an independent test, a good correlation between the rain retrieved by BRAIN and lightning data (obtained by the UK Met Office long-range detection system) is also emphasized in the paper.

  20. VISUALIZATION OF PAGERANK ALGORITHM

    OpenAIRE

    Perhaj, Ervin

    2013-01-01

    The goal of the thesis is to develop a web application that helps users understand the functioning of the PageRank algorithm. The thesis consists of two parts. First we develop an algorithm to calculate the PageRank values of web pages. The input of the algorithm is a list of web pages and the links between them; the user enters the list through the web interface. From these data the algorithm calculates a PageRank value for each page. The algorithm repeats the process, until the difference of PageRank va...
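
    For reference, a compact iterative PageRank calculation of the kind described above might look as follows. This is a generic sketch, not the thesis code; the damping factor and convergence tolerance are illustrative.

    def pagerank(links, d=0.85, tol=1e-9, max_iter=200):
        """Iterative PageRank on a dict {page: [pages it links to]}.

        Dangling pages (no out-links) distribute their rank uniformly.
        Iteration stops when the total change between sweeps drops below tol.
        """
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(max_iter):
            dangling = sum(rank[p] for p in pages if not links[p])
            new = {p: (1.0 - d) / n + d * dangling / n for p in pages}
            for p in pages:
                out = links[p]
                if out:
                    share = d * rank[p] / len(out)
                    for q in out:
                        new[q] += share
            if sum(abs(new[p] - rank[p]) for p in pages) < tol:
                return new
            rank = new
        return rank

    # Example: three pages, A and B link to each other, C links to A.
    print(pagerank({"A": ["B"], "B": ["A"], "C": ["A"]}))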

  1. RFID Location Algorithm

    Directory of Open Access Journals (Sweden)

    Wang Zi Min

    2016-01-01

    With the development of social services and the further rise in living standards, there is an urgent need for positioning technology that can adapt to complex situations. In recent years, RFID technology has found a wide range of applications in daily life and production, such as logistics tracking, car alarms and security. Using RFID technology for localization is a new direction for research institutions and scholars. The stability, small error and low cost of RFID positioning systems make their location algorithms the focus of this study. This article analyzes RFID positioning methods and algorithms layer by layer. First, several common basic RFID methods are introduced; secondly, location methods offering higher accuracy are discussed; finally, the LANDMARC algorithm is described. From this it can be seen that advanced and efficient algorithms play an important role in increasing RFID positioning accuracy. Finally, RFID location algorithms are summarized, their deficiencies are pointed out, and requirements for follow-up study and a vision for better future RFID positioning technology are put forward.
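
    As background to the LANDMARC method mentioned above, the following is a minimal sketch of its k-nearest-reference-tag weighting scheme. Array shapes, the weighting exponent and the function name are illustrative assumptions, not the paper's implementation.

    import numpy as np

    def landmarc_locate(tag_rssi, ref_rssi, ref_pos, k=4):
        """LANDMARC-style position estimate for one tracked RFID tag.

        tag_rssi : (n_readers,) RSSI of the tracked tag at each reader.
        ref_rssi : (n_refs, n_readers) RSSI of the reference tags.
        ref_pos  : (n_refs, 2) known coordinates of the reference tags.

        The tag is placed at a weighted average of the k reference tags whose
        signal-strength vectors are closest to the tracked tag's.
        """
        tag_rssi, ref_rssi, ref_pos = map(np.asarray, (tag_rssi, ref_rssi, ref_pos))
        e = np.linalg.norm(ref_rssi - tag_rssi, axis=1)   # signal-space distances
        nearest = np.argsort(e)[:k]
        w = 1.0 / (e[nearest] ** 2 + 1e-12)               # closer references weigh more
        w /= w.sum()
        return w @ ref_pos[nearest]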

  2. Improved multivariate polynomial factoring algorithm

    International Nuclear Information System (INIS)

    Wang, P.S.

    1978-01-01

    A new algorithm for factoring multivariate polynomials over the integers based on an algorithm by Wang and Rothschild is described. The new algorithm has improved strategies for dealing with the known problems of the original algorithm, namely, the leading coefficient problem, the bad-zero problem and the occurrence of extraneous factors. It has an algorithm for correctly predetermining leading coefficients of the factors. A new and efficient p-adic algorithm named EEZ is described. Basically it is a linearly convergent variable-by-variable parallel construction. The improved algorithm is generally faster and requires less storage than the original algorithm. Machine examples with comparative timing are included.

  3. The 10/66 Dementia Research Group's fully operationalised DSM-IV dementia computerized diagnostic algorithm, compared with the 10/66 dementia algorithm and a clinician diagnosis: a population validation study

    Directory of Open Access Journals (Sweden)

    Krishnamoorthy ES

    2008-06-01

    Background: The criterion for dementia implicit in DSM-IV is widely used in research but not fully operationalised. The 10/66 Dementia Research Group sought to do this using assessments from their one-phase dementia diagnostic research interview, and to validate the resulting algorithm in a population-based study in Cuba. Methods: The criterion was operationalised as a computerised algorithm, applying clinical principles, based upon the 10/66 cognitive tests, clinical interview and informant reports: the Community Screening Instrument for Dementia, the CERAD 10-word list learning and animal naming tests, the Geriatric Mental State, and the History and Aetiology Schedule – Dementia Diagnosis and Subtype. This was validated in Cuba against a local clinician DSM-IV diagnosis and the 10/66 dementia diagnosis (originally calibrated probabilistically against clinician DSM-IV diagnoses in the 10/66 pilot study). Results: The DSM-IV sub-criteria were plausibly distributed among clinically diagnosed dementia cases and controls. The clinician diagnoses agreed better with the 10/66 dementia diagnosis than with the more conservative computerized DSM-IV algorithm. The DSM-IV algorithm was particularly likely to miss less severe dementia cases. Those with a 10/66 dementia diagnosis who did not meet the DSM-IV criterion were less cognitively and functionally impaired than the DSM-IV-confirmed cases, but still grossly impaired compared with those free of dementia. Conclusion: The DSM-IV criterion, strictly applied, defines a narrow category of unambiguous dementia characterized by marked impairment. It may be specific but incompletely sensitive to clinically relevant cases. The 10/66 dementia diagnosis defines a broader category that may be more sensitive, identifying genuine cases beyond those defined by our DSM-IV algorithm, with relevance to the estimation of the population burden of this disorder.

  4. Governance by algorithms

    Directory of Open Access Journals (Sweden)

    Francesca Musiani

    2013-08-01

    Algorithms are increasingly often cited as one of the fundamental shaping devices of our daily, immersed-in-information existence. Their importance is acknowledged, their performance scrutinised in numerous contexts. Yet, a lot of what constitutes 'algorithms' beyond their broad definition as “encoded procedures for transforming input data into a desired output, based on specified calculations” (Gillespie, 2013) is often taken for granted. This article seeks to contribute to the discussion about 'what algorithms do' and in which ways they are artefacts of governance, providing two examples drawing from the internet and ICT realm: search engine queries and e-commerce websites’ recommendations to customers. The question of the relationship between algorithms and rules is likely to occupy an increasingly central role in the study and the practice of internet governance, in terms of both institutions’ regulation of algorithms, and algorithms’ regulation of our society.

  5. Nitrogen dioxide observations from the Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) airborne instrument: Retrieval algorithm and measurements during DISCOVER-AQ Texas 2013

    Science.gov (United States)

    The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) airborne instrument is a test bed for upcoming air quality satellite instruments that will measure backscattered ultraviolet, visible and near-infrared light from geostationary orbit. GeoTASO flew on the NASA F...

  6. Instrument Modeling and Synthesis

    Science.gov (United States)

    Horner, Andrew B.; Beauchamp, James W.

    During the 1970s and 1980s, before synthesizers based on direct sampling of musical sounds became popular, replicating musical instruments using frequency modulation (FM) or wavetable synthesis was one of the “holy grails” of music synthesis. Synthesizers such as the Yamaha DX7 allowed users great flexibility in mixing and matching sounds, but were notoriously difficult to coerce into producing sounds like those of a given instrument. Instrument design wizards practiced the mysteries of FM instrument design.

  7. Instrumentation for Nuclear Applications

    International Nuclear Information System (INIS)

    1998-01-01

    The objective of this project was to develop and coordinate nuclear instrumentation standards with resulting economies for the nuclear and radiation fields. There was particular emphasis on coordination and management of the Nuclear Instrument Module (NIM) System, U.S. activity involving the CAMAC international standard dataway system, the FASTBUS modular high-speed data acquisition and control system and processing and management of national nuclear instrumentation and detector standards, as well as a modest amount of assistance and consultation services to the Pollutant Characterization and Safety Research Division of the Office of Health and Environmental Research. The principal accomplishments were the development and maintenance of the NIM instrumentation system that is the predominant instrumentation system in the nuclear and radiation fields worldwide, the CAMAC digital interface system in coordination with the ESONE Committee of European Laboratories, the FASTBUS high-speed system and numerous national and international nuclear instrumentation standards

  8. Algorithmic and user study of an autocompletion algorithm on a large medical vocabulary.

    Science.gov (United States)

    Sevenster, Merlijn; van Ommering, Rob; Qian, Yuechen

    2012-02-01

    Autocompletion supports human-computer interaction in software applications that let users enter textual data. We are inspired by the use case in which medical professionals enter ontology concepts, catering to the ongoing demand for structured and standardized data in medicine. The goal is to give an algorithmic analysis of one particular autocompletion algorithm, the multi-prefix matching algorithm, which suggests terms whose words' prefixes contain all words in the string typed by the user; in this sense, opt ner me matches optic nerve meningioma. Second, we aim to investigate how well it supports users entering concepts from a large and comprehensive medical vocabulary (snomed ct). We give a concise description of the multi-prefix algorithm, and sketch how it can be optimized to meet the required response time. Performance is compared to a baseline algorithm, which gives suggestions that extend the string typed by the user to the right, e.g. optic nerve m gives optic nerve meningioma, but opt ner me does not. We conduct a user experiment in which 12 participants are invited to complete 40 snomed ct terms with the baseline algorithm and another set of 40 snomed ct terms with the multi-prefix algorithm. Our results show that users need significantly fewer keystrokes when supported by the multi-prefix algorithm than when supported by the baseline algorithm. The proposed algorithm is a competitive candidate for searching and retrieving terms from a large medical ontology.
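
    A simplified reading of the multi-prefix matching rule described above (each typed token must be a prefix of a distinct word of the term, taken in order) can be sketched as follows; this illustrates the idea, not the optimized implementation analysed in the paper.

    def multi_prefix_match(query, term):
        """True if every whitespace-separated token in `query` is a prefix of a
        distinct word in `term`, taken in order.  E.g. "opt ner me" matches
        "optic nerve meningioma", whereas a plain right-extension matcher
        would not."""
        tokens = query.lower().split()
        words = term.lower().split()
        i = 0
        for tok in tokens:
            # advance through the term's words until one starts with this token
            while i < len(words) and not words[i].startswith(tok):
                i += 1
            if i == len(words):
                return False
            i += 1
        return True

    def suggest(query, vocabulary, limit=10):
        """Return up to `limit` vocabulary terms matching the typed string."""
        return [t for t in vocabulary if multi_prefix_match(query, t)][:limit]

    print(suggest("opt ner me", ["optic nerve meningioma", "optic nerve glioma"]))
    # -> ['optic nerve meningioma']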

  9. Optimal Fungal Space Searching Algorithms.

    Science.gov (United States)

    Asenova, Elitsa; Lin, Hsin-Yu; Fu, Eileen; Nicolau, Dan V; Nicolau, Dan V

    2016-10-01

    Previous experiments have shown that fungi use an efficient natural algorithm for searching the space available for their growth in micro-confined networks, e.g., mazes. This natural "master" algorithm, which comprises two "slave" sub-algorithms, i.e., collision-induced branching and directional memory, has been shown to be more efficient than alternatives with one, the other, or both sub-algorithms turned off. In contrast, the present contribution compares the performance of the fungal natural algorithm against several standard artificial homologues. It was found that the space-searching fungal algorithm consistently outperforms uninformed algorithms, such as Depth-First-Search (DFS). Furthermore, while the natural algorithm is inferior to informed ones, such as A*, this under-performance does not increase appreciably with the size of the maze. These findings suggest that a systematic effort of harvesting the natural space-searching algorithms used by microorganisms is warranted and possibly overdue. These natural algorithms, if efficient, can be reverse-engineered for graph and tree search strategies.

  10. On Representative Spaceflight Instrument and Associated Instrument Sensor Web Framework

    Science.gov (United States)

    Kizhner, Semion; Patel, Umeshkumar; Vootukuru, Meg

    2007-01-01

    Sensor Web-based adaptation and sharing of space flight mission resources, including those of the Space-Ground and Control-User communication segment, could greatly benefit from the utilization of heritage Internet Protocols and devices applied to spaceflight (SpaceIP). This has been successfully demonstrated by a few recent spaceflight experiments. However, while terrestrial applications of Internet protocols are well developed and understood (mostly due to billions of dollars in investments by the military and industry), the spaceflight application of Internet protocols is still in its infancy. Progress in the development of SpaceIP-enabled instrument components will largely determine the SpaceIP utilization of those investments and its acceptance in years to come. Like SpaceIP, commercial real-time and instrument-colocated computational resources, data compression and storage can be enabled on board a spacecraft and, in turn, support a powerful application of Sensor Web concepts to the design of a spaceflight instrument. Sensor Web-enabled reconfiguration and adaptation of structures for hardware resources and information systems will commence the application of Field Programmable Gate Arrays (FPGAs) and other aerospace programmable logic devices for what this technology was intended. These are a few obvious potential benefits of Sensor Web technologies for spaceflight applications. However, they are still waiting to be explored, because a new approach to spaceflight instrumentation is needed to make these mature sensor web technologies applicable for spaceflight. In this paper we present an approach to developing related and enabling spaceflight instrument-level technologies based on the new concept of a representative spaceflight Instrument Sensor Web (ISW).

  11. VizieR Online Data Catalog: 2007.5 to 2010.4 HST astrometry of HD 202206 (Benedict+, 2017)

    Science.gov (United States)

    Benedict, G. F.; Harrison, T. E.

    2017-08-01

    For this study astrometric measurements came from Fine Guidance Sensor 1r (FGS 1r), an upgraded FGS installed in 1997 during the second Hubble Space Telescope (HST) servicing mission. It provided superior fringes from which to obtain target and reference star positions (McArthur et al. 2003hstc.conf..373M). We utilized only the fringe tracking mode (POS mode) in this investigation. POS mode observations of a star have a typical duration of 60s, during which over 2000 individual position measures are collected. The astrometric centroid is estimated by choosing the median measure, after filtering large outliers (caused by cosmic-ray hits and particles trapped by the Earth's magnetic field). The standard deviation of the measures provides a measurement error. We refer to the aggregate of astrometric centroids of each star secured during one visibility period as an "orbit". Because one of the pillars of the scientific method involves reproducibility, we present a complete ensemble of time-tagged HD202206 and reference star astrometric measurements, Optical Field Angle Distortion (OFAD; McArthur et al. 2006hstc.conf..396M) and intra-orbit-drift-corrected, in Table2, along with calculated parallax factors in R.A. and decl. These data, collected from 2007.5 to 2010.4, in addition to providing material for confirmation of our results, might ultimately be combined with Gaia measures, significantly extending the time baseline of astrometry, thereby improving proper motion and perturbation characterization. Our band passes for reference star photometry include: BVRI photometry of the reference stars from the NMSU 1m telescope located at Apache Point Observatory and JHK (from 2MASS; see Cutri et al. 2003, Cat. II/246). Table4 lists the visible and infrared photometry for the HD202206 reference stars. To establish spectral type and luminosity class, the reference frame stars were observed on 2009 December 9 using the RCSPEC on the Blanco 4m telescope at Cerro Tololo Inter

  12. Identifying and Analyzing Novel Epilepsy-Related Genes Using Random Walk with Restart Algorithm

    Directory of Open Access Journals (Sweden)

    Wei Guo

    2017-01-01

    As a pathological condition, epilepsy is caused by abnormal neuronal discharge in the brain, which temporarily disrupts cerebral functions. Epilepsy is a chronic disease that occurs at all ages and can seriously affect patients' lives, so there is a strong need for effective medicines or instruments to treat it. Identifying epilepsy-related genes is essential to understanding and treating the disease, because the proteins encoded by these genes are candidate drug targets. In this study, a computational workflow was proposed to predict novel epilepsy-related genes using the random walk with restart (RWR) algorithm. As reported in the literature, the RWR algorithm often produces a number of false-positive genes; in this study a permutation test and functional association tests were implemented to filter the genes identified by the RWR algorithm, which greatly reduces the number of suspected genes and results in only thirty-three novel epilepsy genes. Finally, these novel genes were analyzed against recently published literature. Our findings indicate that all of the novel genes are closely related to epilepsy. It is believed that the proposed workflow can also be applied to identify genes related to other diseases and to deepen our understanding of their mechanisms.
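
    The random walk with restart step itself is compact; a generic sketch (not the authors' workflow, which adds permutation and functional-association filtering) is given below, assuming a weighted gene-interaction network in which every node has at least one edge.

    import numpy as np

    def random_walk_with_restart(adj, seed_idx, restart=0.3, tol=1e-10):
        """Random walk with restart (RWR) on a gene/protein interaction network.

        adj      : (n, n) symmetric adjacency (or weighted) matrix.
        seed_idx : indices of the known disease genes (the restart set).

        Returns the steady-state visiting probability of every node; high-scoring
        non-seed nodes are the candidate disease-related genes.
        """
        adj = np.asarray(adj, dtype=float)
        w = adj / adj.sum(axis=0, keepdims=True)   # column-normalised transition matrix
        p0 = np.zeros(adj.shape[0])
        p0[list(seed_idx)] = 1.0 / len(seed_idx)
        p = p0.copy()
        while True:
            p_next = (1.0 - restart) * w @ p + restart * p0
            if np.abs(p_next - p).sum() < tol:
                return p_next
            p = p_next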

  13. Autonomous Star Tracker Algorithms

    DEFF Research Database (Denmark)

    Betto, Maurizio; Jørgensen, John Leif; Kilsgaard, Søren

    1998-01-01

    Proposal, in response to an ESA R.f.P., to design algorithms for autonomous star tracker operations. The proposal also included the development of a star tracker breadboard to test the algorithms' performance.

  14. Virtual instrumentation: a new approach for control and instrumentation - application in containment studies facility

    International Nuclear Information System (INIS)

    Gole, N.V.; Shanware, V.M.; Sebastian, A.; Subramaniam, K.

    2001-01-01

    PC-based data acquisition has emerged as a rapidly developing area, particularly with respect to process instrumentation. Computer-based data acquisition in process instrumentation, combined with Supervisory Control and Data Acquisition (SCADA) software, has introduced extensive possibilities with respect to formats for presentation of information. The concept of presenting data in any instrument format, with the help of software tools that simulate the instrument on screen, needs to be understood in order to make use of its vast potential. The purpose of this paper is to present the significant features of the Virtual Instrumentation concept and discuss its application in the instrumentation and control system of the containment studies facility (CSF). Factors involved in the development of the virtual instrumentation based I and C system for CSF are detailed and a functional overview of the system configuration is given. (author)

  15. Fast geometric algorithms

    International Nuclear Information System (INIS)

    Noga, M.T.

    1984-01-01

    This thesis addresses a number of important problems that fall within the framework of the new discipline of Computational Geometry. The list of topics covered includes sorting and selection, convex hull algorithms, the L1 hull, determination of the minimum encasing rectangle of a set of points, the Euclidean and L1 diameter of a set of points, the metric traveling salesman problem, and finding the superrange of star-shaped and monotype polygons. The main theme of all the work was to develop a set of very fast state-of-the-art algorithms that supersede any rivals in terms of speed and ease of implementation. In some cases existing algorithms were refined; for others new techniques were developed that add to the present database of fast adaptive geometric algorithms. What emerges is a collection of techniques that is successful at merging modern tools developed in analysis of algorithms with those of classical geometry.
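
    As an example of the kind of problem listed above, a standard O(n log n) convex hull routine (Andrew's monotone chain) is sketched below; it illustrates the problem class, not the specific algorithms developed in the thesis.

    def convex_hull(points):
        """Andrew's monotone chain: convex hull of 2-D points in O(n log n).
        Returns the hull vertices in counter-clockwise order."""
        pts = sorted(set(map(tuple, points)))
        if len(pts) <= 2:
            return pts

        def cross(o, a, b):
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

        lower, upper = [], []
        for p in pts:                      # build lower hull
            while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
                lower.pop()
            lower.append(p)
        for p in reversed(pts):            # build upper hull
            while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
                upper.pop()
            upper.append(p)
        return lower[:-1] + upper[:-1]

    print(convex_hull([(0, 0), (2, 0), (1, 1), (2, 2), (0, 2), (1, 0.5)]))
    # -> [(0, 0), (2, 0), (2, 2), (0, 2)]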

  16. LOFT instrumentation

    International Nuclear Information System (INIS)

    Bixby, W.W.

    1979-01-01

    A description of instrumentation used in the Loss-of-Fluid Test (LOFT) large break Loss-of-Coolant Experiments is presented. Emphasis is placed on hydraulic and thermal measurements in the primary system piping and components, reactor vessel, and pressure suppression system. In addition, instrumentation which is being considered for measurement of phenomena during future small break testing is discussed. (orig.)

  17. Instrumentation for environmental monitoring: biomedical

    International Nuclear Information System (INIS)

    1979-05-01

    An update is presented to Volume four of the six-volume series devoted to a survey of instruments useful for measurements in biomedicine related to environmental research and monitoring. Results of the survey are given as descriptions of the physical and operating characteristics of available instruments, critical comparisons among instrumentation methods, and recommendations of promising methodology and development of new instrumentation. Methods of detection and analysis of gaseous organic pollutants and metals, including Ni and As are presented. Instrument techniques and notes are included on atomic spectrometry and uv and visible absorption instrumentation

  18. Separation of musical instruments based on amplitude and frequency comodulation

    Science.gov (United States)

    Jacobson, Barry D.; Cauwenberghs, Gert; Quatieri, Thomas F.

    2002-05-01

    In previous work, amplitude comodulation was investigated as a basis for monaural source separation. Amplitude comodulation refers to similarities in amplitude envelopes of individual spectral components emitted by particular types of sources. In many types of musical instruments, amplitudes of all resonant modes rise/fall, and start/stop together during the course of normal playing. We found that under certain well-defined conditions, a mixture of constant frequency, amplitude comodulated sources can unambiguously be decomposed into its constituents on the basis of these similarities. In this work, system performance was improved by relaxing the constant frequency requirement. String instruments, for example, which are normally played with vibrato, are both amplitude and frequency comodulated sources, and could not be properly tracked under the constant frequency assumption upon which our original algorithm was based. Frequency comodulation refers to similarities in frequency variations of individual harmonics emitted by these types of sources. The analytical difficulty is in defining a representation of the source which properly tracks frequency varying components. A simple, fixed filter bank can only track an individual spectral component for the duration in which it is within the passband of one of the filters. Alternatives are therefore explored which are amenable to real-time implementation.

  19. Spins, shapes, and orbits for potentially hazardous near-earth objects by NEON

    DEFF Research Database (Denmark)

    Muinonen, K.; Jørgensen, U.G.

    2006-01-01

    radiative transfer, scattering, celestial mechanics, methods: analytical, methods: data analysis, methods: numerical, methods: statistical, techniques: photometric, astrometry, comets: general, minor planets, asteroids

  20. Genetic Algorithm Applied to the Eigenvalue Equalization Filtered-x LMS Algorithm (EE-FXLMS

    Directory of Open Access Journals (Sweden)

    Stephan P. Lovstedt

    2008-01-01

    The FXLMS algorithm, used extensively in active noise control (ANC), exhibits frequency-dependent convergence behavior. This leads to degraded performance for time-varying tonal noise and noise with multiple stationary tones. Previous work by the authors proposed the eigenvalue equalization filtered-x least mean squares (EE-FXLMS) algorithm. For that algorithm, magnitude coefficients of the secondary path transfer function are modified to decrease variation in the eigenvalues of the filtered-x autocorrelation matrix, while preserving the phase, giving faster convergence and increasing overall attenuation. This paper revisits the EE-FXLMS algorithm, using a genetic algorithm to find magnitude coefficients that give the least variation in eigenvalues. This method overcomes some of the problems with implementing the EE-FXLMS algorithm arising from the finite resolution of sampled systems. Experimental control results using the original secondary path model and a modified secondary path model are compared for both the previous implementation of EE-FXLMS and the genetic algorithm implementation.

  1. Aeroacoustics of Musical Instruments

    NARCIS (Netherlands)

    Fabre, B.; Gilbert, J.; Hirschberg, Abraham; Pelorson, X.

    2012-01-01

    We are interested in the quality of sound produced by musical instruments and their playability. In wind instruments, a hydrodynamic source of sound is coupled to an acoustic resonator. Linear acoustics can predict the pitch of an instrument. This can significantly reduce the trial-and-error process

  2. Isotope-equipped measuring instruments

    International Nuclear Information System (INIS)

    Miyagawa, Kazuo; Amano, Hiroshi

    1980-01-01

    In the steel industry, though the investment in isotope-equipped measuring instruments is small as compared with that in machinery, they play important role in the moisture measurement in sintering and blast furnaces, the thickness measurement in rolling process and others in automatic control systems. The economic aspect of the isotope-equipped measuring instruments is described on the basis of the practices in Kimitsu Works of Nippon Steel Corporation: distribution of such instruments, evaluation of economic effects, usefulness evaluation in view of raising the accuracy, and usefulness evaluation viewed from the failure of the isotope instruments. The evaluation of economic effects was made under the premise that the isotope-equipped measuring instruments are not employed. Then, the effects of raising the accuracy are evaluated for a γ-ray plate thickness gauge and a neutron moisture gauge for coke in a blast furnace. Finally, the usefulness was evaluated, assuming possible failure of the isotope-equipped measuring instruments. (J.P.N.)

  3. On factoring RSA modulus using random-restart hill-climbing algorithm and Pollard’s rho algorithm

    Science.gov (United States)

    Budiman, M. A.; Rachmawati, D.

    2017-12-01

    The security of the widely used RSA public key cryptography algorithm depends on the difficulty of factoring a big integer into two large prime numbers. For many years, the integer factorization problem has been intensively and extensively studied in the field of number theory. As a result, many deterministic algorithms such as Euler's algorithm, Kraitchik's, and variants of Pollard's algorithms have been researched comprehensively. Our study takes a rather uncommon approach: rather than making use of intensive number theory, we attempt to factorize the RSA modulus n using a random-restart hill-climbing algorithm, which belongs to the class of metaheuristic algorithms. The factorization time of RSA moduli with different lengths is recorded and compared with that of Pollard's rho algorithm, which is a deterministic algorithm. Our experimental results indicate that while random-restart hill climbing is an acceptable candidate for factorizing smaller RSA moduli, its factorization speed is much slower than that of Pollard's rho algorithm.
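
    For comparison, the baseline used in the study, Pollard's rho method, can be sketched in a few lines (a textbook version with Floyd cycle detection; the small example modulus is illustrative only).

    from math import gcd
    import random

    def pollard_rho(n):
        """Pollard's rho factorisation: returns a non-trivial factor of a
        composite n, retrying with new random parameters if a run fails."""
        if n % 2 == 0:
            return 2
        while True:
            x = random.randrange(2, n)
            y, c, d = x, random.randrange(1, n), 1
            while d == 1:
                x = (x * x + c) % n            # "tortoise"
                y = (y * y + c) % n            # "hare" moves twice as fast
                y = (y * y + c) % n
                d = gcd(abs(x - y), n)
            if d != n:
                return d

    # Example: factor a small RSA-style modulus n = p * q.
    n = 10403                                  # 101 * 103
    p = pollard_rho(n)
    print(p, n // p)                           # -> 101 103 (in either order)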

  4. Instrumentation reference book

    CERN Document Server

    Boyes, Walt

    2002-01-01

    Instrumentation is not a clearly defined subject, having a 'fuzzy' boundary with a number of other disciplines. Often categorized as either 'techniques' or 'applications' this book addresses the various applications that may be needed with reference to the practical techniques that are available for the instrumentation or measurement of a specific physical quantity or quality. This makes it of direct interest to anyone working in the process, control and instrumentation fields where these measurements are essential.* Comprehensive and authoritative collection of technical information* Writte

  5. Jones' instrument technology

    CERN Document Server

    Jones, Ernest Beachcroft; Kingham, Edward G; Radnai, Rudolf

    1985-01-01

    Jones' Instrument Technology, Volume 5: Automatic Instruments and Measuring Systems deals with general trends in automatic instruments and measuring systems. Specific examples are provided to illustrate the principles of such devices. A brief review of a considerable number of standards is undertaken, with emphasis on the IEC625 Interface System. Other relevant standards are reviewed, including the interface and backplane bus standards. This volume is comprised of seven chapters and begins with a short introduction to the principles of automatic measurements, classification of measuring system

  6. DIPSI: the diffraction image phase sensing instrument for APE

    Science.gov (United States)

    Montoya-Martínez, Luzma; Reyes, Marcos; Schumacher, Achim; Hernández, Elvio

    2006-06-01

    Large segmented mirrors require efficient co-phasing techniques in order to avoid the image degradation due to segments misalignment. For this purpose in the last few years new co-phasing techniques have been developed in collaboration with several European institutes. The Active Phasing Experiment (APE) will be a technical instrument aimed at testing different phasing techniques for an Extremely Large Telescope (ELT). A mirror composed of 61 hexagonal segments will be conjugated to the primary mirror of the VLT (Very Large Telescope). Each segment can be moved in piston, tip and tilt. Three new types of co-phasing sensors dedicated to the measurement of segmentation errors will be tested, evaluated and compared: ZEUS (Zernike Unit for Segment phasing) developed by LAM and IAC, PYPS (PYramid Phase Sensor) developed by INAF/ARCETRI, and DIPSI (Diffraction Image Phase Sensing Instrument) developed by IAC, GRANTECAN and LAM. This experiment will first run in the laboratory with point-like polychromatic sources and a turbulence generator. In a second step, it will be mounted at the Nasmyth platform focus of a VLT unit telescope. This paper describes the scientific concept of DIPSI, its optomechanical design, the signal analysis to retrieve segment piston and tip-tilt, the multiwavelength algorithm to increase the capture range, and the multiple segmentation case, including both simulation and laboratory tests results.

  7. Opposition-Based Adaptive Fireworks Algorithm

    Directory of Open Access Journals (Sweden)

    Chibing Gong

    2016-07-01

    A fireworks algorithm (FWA) is a recent swarm intelligence algorithm that is inspired by observing fireworks explosions. An adaptive fireworks algorithm (AFWA) proposes additional adaptive amplitudes to improve the performance of the enhanced fireworks algorithm (EFWA). The purpose of this paper is to add opposition-based learning (OBL) to AFWA with the goal of further boosting performance and achieving global optimization. Twelve benchmark functions are tested using the opposition-based adaptive fireworks algorithm (OAFWA). The final results conclude that OAFWA significantly outperformed EFWA and AFWA in terms of solution accuracy. Additionally, OAFWA was compared with a bat algorithm (BA), differential evolution (DE), self-adapting control parameters in differential evolution (jDE), a firefly algorithm (FA), and a standard particle swarm optimization 2011 (SPSO2011) algorithm. The research results indicate that OAFWA ranks the highest of the six algorithms for both solution accuracy and runtime cost.
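
    The opposition-based learning idea referred to above is simple to state: for a candidate x in [a, b], also consider its opposite a + b − x and keep the better of the two. A minimal sketch for a generic minimisation problem (not the OAFWA implementation) follows.

    import numpy as np

    def opposition_seed(population, lower, upper, fitness):
        """Opposition-based learning (OBL) step: for each candidate x in
        [lower, upper], also evaluate its opposite x* = lower + upper - x and
        keep whichever scores better (minimisation)."""
        population = np.asarray(population, dtype=float)
        opposite = lower + upper - population
        keep = []
        for x, xo in zip(population, opposite):
            keep.append(x if fitness(x) <= fitness(xo) else xo)
        return np.array(keep)

    # Toy use: sphere function, candidates drawn in [-5, 5]^2.
    rng = np.random.default_rng(0)
    pop = rng.uniform(-5, 5, size=(10, 2))
    better = opposition_seed(pop, lower=-5.0, upper=5.0,
                             fitness=lambda x: float(np.sum(x ** 2)))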

  8. Opposite Degree Algorithm and Its Applications

    Directory of Open Access Journals (Sweden)

    Xiao-Guang Yue

    2015-12-01

    The opposite degree (OD) algorithm is an intelligent algorithm proposed by Yue Xiaoguang et al. It is mainly based on the concept of opposite degree, combined with design ideas from neural networks, genetic algorithms and clustering analysis. The OD algorithm is divided into two sub-algorithms, namely the opposite degree numerical computation (OD-NC) algorithm and the opposite degree classification computation (OD-CC) algorithm.

  9. Automated Instrumentation, Monitoring and Visualization of PVM Programs Using AIMS

    Science.gov (United States)

    Mehra, Pankaj; VanVoorst, Brian; Yan, Jerry; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    We present views and analysis of the execution of several PVM (Parallel Virtual Machine) codes for Computational Fluid Dynamics on networks of Sparcstations, including: (1) NAS Parallel Benchmarks CG and MG; (2) a multi-partitioning algorithm for NAS Parallel Benchmark SP; and (3) an overset grid flowsolver. These views and analysis were obtained using our Automated Instrumentation and Monitoring System (AIMS) version 3.0, a toolkit for debugging the performance of PVM programs. We will describe the architecture, operation and application of AIMS. The AIMS toolkit contains: (1) Xinstrument, which can automatically instrument various computational and communication constructs in message-passing parallel programs; (2) Monitor, a library of runtime trace-collection routines; (3) VK (Visual Kernel), an execution-animation tool with source-code clickback; and (4) Tally, a tool for statistical analysis of execution profiles. Currently, Xinstrument can handle C and Fortran 77 programs using PVM 3.2.x; Monitor has been implemented and tested on Sun 4 systems running SunOS 4.1.2; and VK uses X11R5 and Motif 1.2. Data and views obtained using AIMS clearly illustrate several characteristic features of executing parallel programs on networked workstations: (1) the impact of long message latencies; (2) the impact of multiprogramming overheads and associated load imbalance; (3) cache and virtual-memory effects; and (4) significant skews between workstation clocks. Interestingly, AIMS can compensate for constant skew (zero drift) by calibrating the skew between a parent and its spawned children. In addition, AIMS' skew-compensation algorithm can adjust timestamps in a way that eliminates physically impossible communications (e.g., messages going backwards in time). Our current efforts are directed toward creating new views to explain the observed performance of PVM programs. Some of the features planned for the near future include: (1) ConfigView, showing the physical topology

  10. Astronomical Instrumentation System Markup Language

    Science.gov (United States)

    Goldbaum, Jesse M.

    2016-05-01

    The Astronomical Instrumentation System Markup Language (AISML) is an Extensible Markup Language (XML) based file format for maintaining and exchanging information about astronomical instrumentation. The factors behind the need for an AISML are first discussed, followed by the reasons why XML was chosen as the format. Next it is shown how XML also provides the framework for a more precise definition of an astronomical instrument and how these instruments can be combined to form an Astronomical Instrumentation System (AIS). AISML files for several instruments, as well as one for a sample AIS, are provided. The files demonstrate how AISML can be utilized for various tasks, from web page generation and programming interfaces to instrument maintenance and quality management. The advantages of widespread adoption of AISML are discussed.
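
    To make the idea concrete, the snippet below builds a small instrument description and wraps it in a system element using Python's standard XML tooling. The element and attribute names are hypothetical placeholders; the actual AISML schema is defined by the format itself and is not reproduced here.

    import xml.etree.ElementTree as ET

    # Hypothetical element and attribute names, for illustration only.
    instrument = ET.Element("instrument", type="camera")
    ET.SubElement(instrument, "name").text = "Example CCD Imager"
    ET.SubElement(instrument, "detector", pixels="4096x4096", pixelSizeMicrons="9")
    ET.SubElement(instrument, "filterWheel").extend(
        [ET.Element("filter", band=b) for b in ("B", "V", "R", "I")]
    )

    ais = ET.Element("astronomicalInstrumentationSystem")
    ais.append(instrument)                      # instruments combine into an AIS
    ET.SubElement(ais, "telescope", apertureMeters="0.4", mount="equatorial")

    print(ET.tostring(ais, encoding="unicode"))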

  11. The Material Culture of Nineteenth-Century Astrometry, its Circulation and Heritage at the Astronomical Observatory of Lisbon

    Science.gov (United States)

    Raposo, Pedro

    The Astronomical Observatory of Lisbon was founded in 1857 in the sequence of a controversy on stellar parallax measurements involving astronomers from the Observatory of Paris and the Observatory of Pulkovo. The development of this discussion led the contenders to recognize Lisbon as a suitable place to carry out this kind of measurements and to foster the field of stellar astronomy. Some local actors strived to keep up with this wave of international interest and establish a first-rank astronomical institution in the Portuguese capital. In order to fulfil this goal, correspondence was intensively exchanged with leading foreign astronomers and instrument makers. Besides, a Portuguese Navy officer bound to become the first director of the new institution was commissioned to visit several observatories and instrument workshops abroad, and to spend a few years in Pulkovo as a trainee astronomer. Although founded with generous financial support from the Portuguese crown and lavishly equipped and constructed, the Observatory of Lisbon was later affected by limiting budgets and a shortage of qualified personnel. Nevertheless, local efforts to improve instruments as well as observation and calculation techniques enabled its astronomers to yield important contributions to positional astronomy, especially towards the end of the nineteenth century and the beginnings of the twentieth century. The original instruments and spaces of the Observatory of Lisbon, strongly modelled on those of Pulkovo, are very well preserved, constituting an outstanding extant example of a mid-nineteenth century advanced observatory. The history they embody testifies the connectedness of the astronomical heritage worldwide.

  12. A physics-based algorithm for retrieving land-surface emissivity and temperature from EOS/MODIS data

    International Nuclear Information System (INIS)

    Wan, Z.; Li, Z.L.

    1997-01-01

    The authors have developed a physics-based land-surface temperature (LST) algorithm for simultaneously retrieving surface band-averaged emissivities and temperatures from day/night pairs of MODIS (Moderate Resolution Imaging Spectroradiometer) data in seven thermal infrared bands. The set of 14 nonlinear equations in the algorithm is solved with the statistical regression method and the least-squares fit method. This new LST algorithm was tested with simulated MODIS data for 80 sets of band-averaged emissivities calculated from published spectral data of terrestrial materials in wide ranges of atmospheric and surface temperature conditions. Comprehensive sensitivity and error analysis has been made to evaluate the performance of the new LST algorithm and its dependence on variations in surface emissivity and temperature, upon atmospheric conditions, as well as the noise-equivalent temperature difference (NEΔT) and calibration accuracy specifications of the MODIS instrument. In cases with a systematic calibration error of 0.5%, the standard deviations of errors in retrieved surface daytime and nighttime temperatures fall between 0.4–0.5 K over a wide range of surface temperatures for mid-latitude summer conditions. The standard deviations of errors in retrieved emissivities in bands 31 and 32 (in the 10–12.5 µm IR spectral window region) are 0.009, and the maximum error in retrieved LST values falls between 2–3 K.

  13. Evaluation of sensor placement algorithms for on-orbit identification of space platforms

    Science.gov (United States)

    Glassburn, Robin S.; Smith, Suzanne Weaver

    1994-01-01

    Anticipating the construction of the international space station, on-orbit modal identification of space platforms through optimally placed accelerometers is an area of recent activity. Unwanted vibrations in the platform could affect the results of experiments which are planned. Therefore, it is important that sensors (accelerometers) be strategically placed to identify the amount and extent of these unwanted vibrations, and to validate the mathematical models used to predict the loads and dynamic response. Due to cost, installation, and data management issues, only a limited number of sensors will be available for placement. This work evaluates and compares four representative sensor placement algorithms for modal identification. Most of the sensor placement work to date has employed only numerical simulations for comparison. This work uses experimental data from a fully-instrumented truss structure which was one of a series of structures designed for research in dynamic scale model ground testing of large space structures at NASA Langley Research Center. Results from this comparison show that for this cantilevered structure, the algorithm based on Guyan reduction is rated slightly better than that based on Effective Independence.

  14. Evaluating the accuracy of a MODIS direct broadcast algorithm for mapping burned areas over Russia

    Science.gov (United States)

    Petkov, A.; Hao, W. M.; Nordgren, B.; Corley, R.; Urbanski, S. P.; Ponomarev, E. I.

    2012-12-01

    Emission inventories for open area biomass burning rely on burned area estimates as a key component. We have developed an automated algorithm based on MODerate resolution Imaging Spectroradiometer (MODIS) satellite instrument data for estimating burned area from biomass fires. The algorithm is based on active fire detections, burn scars from MODIS calibrated radiances (MOD02HKM), and MODIS land cover classification (MOD12Q1). Our burned area product combines active fires and burn scar detections using spatio-temporal criteria, and has a resolution of 500 x 500 meters. The algorithm has been used for smoke emission estimates over the western United States. We will present the assessed accuracy of our algorithm in different regions of Russia with intense wildfire activity by comparing our results with the burned area product from the Sukachev Institute of Forest (SIF) of the Russian Academy of Sciences in Krasnoyarsk, Russia, as well as burn scars extracted from Landsat imagery. Landsat burned area extraction was based on threshold classification, applying the Jenks Natural Breaks algorithm to the histogram of each single-scene Normalized Burn Ratio (NBR) image. The final evaluation consisted of a grid-based approach, where the burned area in each 3 km x 3 km grid cell was calculated and compared with the other two sources. A comparison between our burned area estimates and those from SIF showed strong correlation (R2=0.978), although our estimate is approximately 40% lower than the SIF burned areas. The linear fit between the burned area from Landsat scenes and our MODIS algorithm over 18,754 grid cells resulted in a slope of 0.998 and R2=0.7, indicating that our algorithm is suitable for mapping burned areas for fires in boreal forests and other ecosystems. The results of our burned area algorithm will be used for estimating emissions of trace gases and aerosol particles (including black carbon) from biomass burning in Northern Eurasia for the period 2002-2011.
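
    The Landsat thresholding step described above combines two simple ingredients: the Normalized Burn Ratio and a natural-breaks split of its histogram. A minimal sketch (two-class break only, found by brute force; not the operational code) is given below.

    import numpy as np

    def nbr(nir, swir):
        """Normalised Burn Ratio from near-infrared and shortwave-infrared bands."""
        nir, swir = np.asarray(nir, float), np.asarray(swir, float)
        return (nir - swir) / (nir + swir + 1e-12)

    def jenks_break_2class(values):
        """Brute-force two-class Jenks natural break: the split that minimises
        the summed within-class squared deviation of the sorted values."""
        v = np.sort(np.asarray(values, float).ravel())
        best_split, best_cost = None, np.inf
        for i in range(1, len(v)):
            cost = v[:i].var() * i + v[i:].var() * (len(v) - i)
            if cost < best_cost:
                best_cost, best_split = cost, v[i]
        return best_split

    # Pixels with NBR below the natural break are flagged as burned
    # (burn scars depress NBR relative to unburned vegetation).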

  15. SBUV version 8.6 Retrieval Algorithm: Error Analysis and Validation Technique

    Science.gov (United States)

    Kramarova, N. A.; Bhartia, P. K.; Frith, P. K.; McPeters, S. M.; Labow, R. D.; Taylor, G.; Fisher, S.; DeLand, M.

    2012-01-01

    The SBUV version 8.6 algorithm was used to reprocess data from the Back Scattered Ultra Violet (BUV), the Solar Back Scattered Ultra Violet (SBUV) and a number of SBUV/2 instruments, which span a 41-year period from 1970 to 2011 (except for a 5-year gap in the 1970s) [see Bhartia et al., 2012]. In the new version the Daumont et al. [1992] ozone cross sections were used, and new ozone [McPeters et al., 2007] and cloud climatologies [Joiner and Bhartia, 1995] were implemented. The algorithm uses the Optimum Estimation technique [Rodgers, 2000] to retrieve ozone profiles as layer amounts (partial columns, in DU) on 21 pressure layers; the corresponding total ozone values are calculated by summing the ozone columns of the individual layers. The algorithm is optimized to accurately retrieve monthly zonal mean (mzm) profiles rather than individual profiles, since it uses a monthly zonal mean ozone climatology as the a priori. Thus, the SBUV version 8.6 ozone dataset is better suited for long-term trend analysis and monitoring of ozone changes than for studying short-term ozone variability. Here we discuss some characteristics of the SBUV algorithm and sources of error in the SBUV profile and total ozone retrievals. For the first time the averaging kernels, smoothing errors and weighting functions (or Jacobians) are included in the SBUV metadata. The averaging kernels (AK) represent the sensitivity of the retrieved profile to the true state and contain valuable information about the retrieval algorithm, such as the vertical resolution, the degrees of freedom for signal (DFS) and the retrieval efficiency [Rodgers, 2000]. Analysis of the AK for mzm ozone profiles shows that the total DFS for ozone profiles varies from 4.4 to 5.5 for the 6-9 wavelengths used in the retrieval; the number of wavelengths in turn depends on the solar zenith angle. Between 25 and 0.5 hPa, where the SBUV vertical resolution is highest, the DFS for individual layers is about 0.5.
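
    The averaging-kernel diagnostics mentioned above follow standard optimal-estimation formulas [Rodgers, 2000]; a generic sketch (not the SBUV processing code) is shown below, where the total DFS is the trace of the averaging-kernel matrix and the smoothing-error covariance is (A − I) Sa (A − I)ᵀ.

    import numpy as np

    def retrieval_diagnostics(A, S_a):
        """Standard optimal-estimation diagnostics (Rodgers, 2000).

        A   : (n, n) averaging-kernel matrix of the retrieved profile.
        S_a : (n, n) a priori covariance matrix.

        Returns the total degrees of freedom for signal (trace of A), the
        per-layer DFS (diagonal of A) and the smoothing-error covariance.
        """
        A, S_a = np.asarray(A, float), np.asarray(S_a, float)
        dfs_total = np.trace(A)
        dfs_per_layer = np.diag(A)
        I = np.eye(A.shape[0])
        smoothing_error = (A - I) @ S_a @ (A - I).T
        return dfs_total, dfs_per_layer, smoothing_error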

  16. Soil monitoring instrumentation

    International Nuclear Information System (INIS)

    Umbarger, C.J.

    1981-01-01

    The Los Alamos Scientific Laboratory (LASL) has an extensive program for the development of nondestructive assay instrumentation for the quantitative analysis of transuranic (TRU) materials found in bulk solid wastes generated by Department of Energy facilities and by the commercial nuclear power industry. Included are wastes generated in decontamination and decommissioning of outdated nuclear facilities, as well as from old waste-burial-ground exhumation programs. The assay instrumentation is designed to have detection limits below 10 nCi/g wherever practicable. The assay instrumentation that is applied specifically to soil monitoring is discussed

  17. Instruments to assess integrated care

    DEFF Research Database (Denmark)

    Lyngsø, Anne Marie; Godtfredsen, Nina Skavlan; Høst, Dorte

    2014-01-01

    INTRODUCTION: Although several measurement instruments have been developed to measure the level of integrated health care delivery, no standardised, validated instrument exists covering all aspects of integrated care. The purpose of this review is to identify the instruments concerning how to measure the level of integration across health-care sectors and to assess and evaluate the organisational elements within the instruments identified. METHODS: An extensive, systematic literature review in PubMed, CINAHL, PsycINFO, Cochrane Library and Web of Science for the years 1980-2011. Selected... It is uncertain whether development of a single 'all-inclusive' model for assessing integrated care is desirable. We emphasise the continuing need for validated instruments embedded in theoretical contexts.

  18. Analysis on detection accuracy of binocular photoelectric instrument optical axis parallelism digital calibration instrument

    Science.gov (United States)

    Ying, Jia-ju; Yin, Jian-ling; Wu, Dong-sheng; Liu, Jie; Chen, Yu-dan

    2017-11-01

    Low-light-level night vision devices and thermal infrared imaging binocular photoelectric instruments are widely used. Misalignment of the parallelism of a binocular instrument's optical axes causes symptoms such as dizziness and nausea when the observer uses it for a long time. A digital calibration instrument for binocular photoelectric equipment has been developed to detect optical axis parallelism, so that the optical axis deviation can be measured quantitatively. As a testing instrument, its precision must be much higher than that of the instrument under test. This paper analyzes the factors that influence detection accuracy. Such factors exist in each step of the testing process and affect the precision of the detecting instrument. They can be divided into two categories: factors that directly affect the position of the reticle image, and factors that affect the calculation of the reticle image center. The combined error is then calculated, and the error budget is distributed reasonably to ensure the accuracy of the calibration instrument.

  19. Algorithmic phase diagrams

    Science.gov (United States)

    Hockney, Roger

    1987-01-01

    Algorithmic phase diagrams are a neat and compact representation of the results of comparing the execution time of several algorithms for the solution of the same problem. As an example, the recent results of Gannon and Van Rosendale on the solution of multiple tridiagonal systems of equations are shown in the form of such diagrams. The act of preparing these diagrams has revealed an unexpectedly complex relationship between the best algorithm and the number and size of the tridiagonal systems, which was not evident from the algebraic formulae in the original paper. Even so, for a particular computer, one diagram suffices to predict the best algorithm for all problems that are likely to be encountered, the prediction being read directly from the diagram without complex calculation.

  20. A 3000 TNOs Survey Project at ESO La Silla

    Science.gov (United States)

    Boehnhardt, H.; Hainaut, O.

    We propose a wide-shallow TNO search to be done with the Wide Field Imager (WFI) instrument at the 2.2m MPG/ESO telescope in La Silla/Chile. The WFI is a half-deg camera equipped with an 8kx8k CCD (0.24 arcsec/pixel). The telescope can support excellent seeing quality down to 0.5arcsec FWHM. A TNO search pilot project was run with the 2.2m+WFI in 1999: images with just 1.6sdeg sky coverage and typically 24mag limiting brightness revealed 6 new TNOs when processed with our new automatic detection program MOVIE. The project is now continued on a somewhat larger scale in order to find more TNOs and to fine-tune the operational environment for a full automatic on-line detection, astrometry and photometry of the objects at the telescope. The future goal is to perform - with the 2.2m+WFI and in an international collaboration - an even larger TNO survey over a major part of the sky (typically 2000sdeg in and out of the Ecliptic) down to 24mag. Follow-up astrometry and photometry of the expected more than 3000 discovered objects will secure their orbital and physical characterisation for synoptic dynamical and taxonomic studies of the Transneptunian population.

  1. Recursive forgetting algorithms

    DEFF Research Database (Denmark)

    Parkum, Jens; Poulsen, Niels Kjølstad; Holst, Jan

    1992-01-01

    In the first part of the paper, a general forgetting algorithm is formulated and analysed. It contains most existing forgetting schemes as special cases. Conditions are given ensuring that the basic convergence properties will hold. In the second part of the paper, the results are applied to a specific algorithm with selective forgetting. Here, the forgetting is non-uniform in time and space. The theoretical analysis is supported by a simulation example demonstrating the practical performance of this algorithm.
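
    The general algorithm itself is not reproduced in this record; as an illustration, the sketch below implements classical recursive least squares with an exponential forgetting factor, one of the standard schemes that such a general formulation contains as a special case.

```python
import numpy as np

def rls_exponential_forgetting(phi, y, lam=0.98, delta=100.0):
    """Recursive least squares with exponential forgetting factor `lam`.

    phi : (N, p) array of regressor vectors
    y   : (N,) array of observations
    Returns the trajectory of parameter estimates, shape (N, p).
    """
    N, p = phi.shape
    theta = np.zeros(p)            # parameter estimate
    P = delta * np.eye(p)          # covariance of the estimate
    history = np.empty((N, p))
    for t in range(N):
        x = phi[t]
        err = y[t] - x @ theta                     # prediction error
        k = P @ x / (lam + x @ P @ x)              # gain vector
        theta = theta + k * err                    # parameter update
        P = (P - np.outer(k, x @ P)) / lam         # covariance update with forgetting
        history[t] = theta
    return history

# Example: track a slowly drifting first-order system y_t = a_t * u_t + noise.
rng = np.random.default_rng(0)
u = rng.normal(size=500)
a_true = np.linspace(1.0, 2.0, 500)                # time-varying parameter
y = a_true * u + 0.05 * rng.normal(size=500)
est = rls_exponential_forgetting(u[:, None], y, lam=0.95)
print("final estimate:", est[-1])                  # should track the drift towards 2.0
```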

  2. Drift-corrected Odin-OSIRIS ozone product: algorithm and updated stratospheric ozone trends

    Directory of Open Access Journals (Sweden)

    A. E. Bourassa

    2018-01-01

    Full Text Available A small long-term drift in the Optical Spectrograph and Infrared Imager System (OSIRIS) stratospheric ozone product, manifested mostly since 2012, is quantified and attributed to a changing bias in the limb pointing knowledge of the instrument. A correction to this pointing drift using a predictable shape in the measured limb radiance profile is implemented and applied within the OSIRIS retrieval algorithm. This new data product, version 5.10, displays substantially better long- and short-term agreement with Microwave Limb Sounder (MLS) ozone throughout the stratosphere due to the pointing correction. Previously reported stratospheric ozone trends over the time period 1984–2013, which were derived by merging the altitude–number density ozone profile measurements from the Stratospheric Aerosol and Gas Experiment (SAGE II) satellite instrument (1984–2005) and from OSIRIS (2002–2013), are recalculated using the new OSIRIS version 5.10 product and extended to 2017. These results still show statistically significant positive trends throughout the upper stratosphere since 1997, but at weaker levels that are more closely in line with estimates from other data records.

  3. Impact of Noise Reduction Algorithm in Cochlear Implant Processing on Music Enjoyment.

    Science.gov (United States)

    Kohlberg, Gavriel D; Mancuso, Dean M; Griffin, Brianna M; Spitzer, Jaclyn B; Lalwani, Anil K

    2016-06-01

    Noise reduction algorithms (NRA) in speech processing strategies have a positive impact on speech perception among cochlear implant (CI) listeners. We sought to evaluate the effect of NRA on music enjoyment. Prospective analysis of music enjoyment. Academic medical center. Normal-hearing (NH) adults (N = 16) and CI listeners (N = 9). Subjective rating of music excerpts. NH and CI listeners evaluated a country music piece on three enjoyment modalities: pleasantness, musicality, and naturalness. Participants listened to the original version and 20 modified, less complex versions created by including subsets of musical instruments from the original song. NH participants listened to the segments through CI simulation and CI listeners listened to the segments with their usual speech processing strategy, with and without NRA. Decreasing the number of instruments was significantly associated with an increase in pleasantness and naturalness in both NH and CI subjects (p < 0.05). Ratings with and without NRA did not differ significantly: this was true for the original and the modified music segments with one to three instruments (p > 0.05). NRA does not affect music enjoyment in CI listeners or NH individuals with CI simulation. This suggests that strategies to enhance speech processing will not necessarily have a positive impact on music enjoyment. However, reducing the complexity of music shows promise in enhancing music enjoyment and should be further explored.

  4. Agency and Algorithms

    Directory of Open Access Journals (Sweden)

    Hanns Holger Rutz

    2016-11-01

    Full Text Available Although the concept of algorithms was established a long time ago, their current topicality indicates a shift in the discourse. Classical definitions based on logic seem to be inadequate to describe their aesthetic capabilities. New approaches stress their involvement in material practices as well as their incompleteness. Algorithmic aesthetics can no longer be tied to the static analysis of programs, but must take into account the dynamic and experimental nature of coding practices. It is suggested that the aesthetic objects thus produced articulate something that could be called algorithmicity or the space of algorithmic agency. This is the space or the medium – following Luhmann’s form/medium distinction – where human and machine undergo mutual incursions. In the resulting coupled “extimate” writing process, human initiative and algorithmic speculation can no longer be clearly divided out. An attempt is made to observe defining aspects of such a medium by drawing a trajectory across a number of sound pieces. The operation of exchange between form and medium I call reconfiguration, and it is indicated by this trajectory.

  5. Suborbital Reusable Launch Vehicles as an Opportunity to Consolidate and Calibrate Ground Based and Satellite Instruments

    Science.gov (United States)

    Papadopoulos, K.

    2014-12-01

    XCOR Aerospace, a commercial space company, is planning to provide frequent, low cost access to near-Earth space on the Lynx suborbital Reusable Launch Vehicle (sRLV). Measurements in the external vacuum environment can be made and can launch from most runways on a limited lead time. Lynx can operate as a platform to perform suborbital in situ measurements and remote sensing to supplement models and simulations with new data points. These measurements can serve as a quantitative link to existing instruments and be used as a basis to calibrate detectors on spacecraft. Easier access to suborbital data can improve the longevity and cohesiveness of spacecraft and ground-based resources. A study of how these measurements can be made on Lynx sRLV will be presented. At the boundary between terrestrial and space weather, measurements from instruments on Lynx can help develop algorithms to optimize the consolidation of ground and satellite based data as well as assimilate global models with new data points. For example, current tides and the equatorial electrojet, essential to understanding the Thermosphere-Ionosphere system, can be measured in situ frequently and on short notice. Furthermore, a negative-ion spectrometer and a Faraday cup, can take measurements of the D-region ion composition. A differential GPS receiver can infer the spatial gradient of ionospheric electron density. Instruments and optics on spacecraft degrade over time, leading to calibration drift. Lynx can be a cost effective platform for deploying a reference instrument to calibrate satellites with a frequent and fast turnaround and a successful return of the instrument. A calibrated reference instrument on Lynx can make collocated observations as another instrument and corrections are made for the latter, thus ensuring data consistency and mission longevity. Aboard a sRLV, atmospheric conditions that distort remotely sensed data (ground and spacecraft based) can be measured in situ. Moreover, an

  6. The potential of soft computing methods in NPP instrumentation and control

    International Nuclear Information System (INIS)

    Hampel, R.; Chaker, N.; Kaestner, W.; Traichel, A.; Wagenknecht, M.; Gocht, U.

    2002-01-01

    The methods of signal processing by soft computing include the application of fuzzy logic, synthetic neural networks, and evolutionary algorithms. The article contains an outline of the objectives and results of the application of fuzzy logic and methods of synthetic neural networks in nuclear measurement and control. The special requirements to be met by the software in safety-related areas with respect to reliability, evaluation, and validation are described. Possible uses may be in off-line applications in modeling, simulation, and reliability analysis as well as in on-line applications (real-time systems) for instrumentation and control. Safety-related aspects of signal processing are described and analyzed for the fuzzy logic and synthetic neural network concepts. Applications are covered in selected examples. (orig.)

  7. Towards an Analogue Neuromorphic VLSI Instrument for the Sensing of Complex Odours

    Science.gov (United States)

    Ab Aziz, Muhammad Fazli; Harun, Fauzan Khairi Che; Covington, James A.; Gardner, Julian W.

    2011-09-01

    Almost all electronic nose instruments reported today employ pattern recognition algorithms written in software and run on digital processors, e.g. micro-processors, microcontrollers or FPGAs. Conversely, in this paper we describe the analogue VLSI implementation of an electronic nose through the design of a neuromorphic olfactory chip. The modelling, design and fabrication of the chip have already been reported. Here a smart interface has been designed and characterised for this neuromorphic chip. Thus we can demonstrate the functionality of the analogue VLSI neuromorphic chip, producing differing principal-neuron firing patterns in response to real sensor data. Further work is directed towards integrating 9 separate neuromorphic chips to create a large neuronal network to solve more complex olfactory problems.

  8. A multi-instrument non-parametric reconstruction of the electron pressure profile in the galaxy cluster CLJ1226.9+3332

    Science.gov (United States)

    Romero, C.; McWilliam, M.; Macías-Pérez, J.-F.; Adam, R.; Ade, P.; André, P.; Aussel, H.; Beelen, A.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Calvo, M.; Catalano, A.; Coiffard, G.; Comis, B.; de Petris, M.; Désert, F.-X.; Doyle, S.; Goupy, J.; Kramer, C.; Lagache, G.; Leclercq, S.; Lestrade, J.-F.; Mauskopf, P.; Mayet, F.; Monfardini, A.; Pascale, E.; Perotto, L.; Pisano, G.; Ponthieu, N.; Revéret, V.; Ritacco, A.; Roussel, H.; Ruppin, F.; Schuster, K.; Sievers, A.; Triqueneaux, S.; Tucker, C.; Zylka, R.

    2018-04-01

    Context. In the past decade, sensitive, resolved Sunyaev-Zel'dovich (SZ) studies of galaxy clusters have become common. Whereas many previous SZ studies have parameterized the pressure profiles of galaxy clusters, non-parametric reconstructions will provide insights into the thermodynamic state of the intracluster medium. Aim. We seek to recover the non-parametric pressure profiles of the high redshift (z = 0.89) galaxy cluster CLJ 1226.9+3332 as inferred from SZ data from the MUSTANG, NIKA, Bolocam, and Planck instruments, which all probe different angular scales. Methods: Our non-parametric algorithm makes use of logarithmic interpolation, which under the assumption of ellipsoidal symmetry is analytically integrable. For MUSTANG, NIKA, and Bolocam we derive a non-parametric pressure profile independently and find good agreement among the instruments. In particular, we find that the non-parametric profiles are consistent with a fitted generalized Navarro-Frenk-White (gNFW) profile. Given the ability of Planck to constrain the total signal, we include a prior on the integrated Compton Y parameter as determined by Planck. Results: For a given instrument, constraints on the pressure profile diminish rapidly beyond the field of view. The overlap in spatial scales probed by these four datasets is therefore critical in checking for consistency between instruments. By using multiple instruments, our analysis of CLJ 1226.9+3332 covers a large radial range, from the central regions to the cluster outskirts, starting at 0.05 R500. This wide radial coverage will also be valuable for the next generation of SZ instruments such as NIKA2 and MUSTANG2.

  9. Real-Coded Quantum-Inspired Genetic Algorithm-Based BP Neural Network Algorithm

    Directory of Open Access Journals (Sweden)

    Jianyong Liu

    2015-01-01

    Full Text Available A method in which the real-coded quantum-inspired genetic algorithm (RQGA) is used to optimize the weights and thresholds of a BP neural network is proposed to overcome the defect that the gradient descent method easily falls into local optima during learning. The quantum genetic algorithm (QGA) has good directional global optimization ability, but the conventional QGA is based on binary coding, and the speed of calculation is reduced by the coding and decoding processes. So, RQGA is introduced to explore the search space, and an improved varied learning rate is adopted to train the BP neural network. Simulation tests show that the proposed algorithm converges rapidly to a solution that satisfies the constraint conditions.
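
    The quantum-inspired rotation operators of RQGA are not detailed in this record; the sketch below only illustrates the underlying idea of replacing gradient descent with a population-based, real-coded evolutionary search over the weights of a small feed-forward network (here a 2-2-1 network on XOR). The population size, operators and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny 2-2-1 network on XOR; the genome is the flattened weight/bias vector.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])

def unpack(w):
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8]; b2 = w[8]
    return W1, b1, W2, b2

def forward(w, x):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output

def fitness(w):
    return -np.mean((forward(w, X) - T) ** 2)      # negative MSE (to maximise)

# Plain real-coded evolutionary search: tournament selection, blend crossover,
# Gaussian mutation, with elitism.
pop = rng.normal(scale=1.0, size=(60, 9))
for gen in range(300):
    scores = np.array([fitness(w) for w in pop])
    new_pop = [pop[scores.argmax()].copy()]        # keep the current best
    while len(new_pop) < len(pop):
        i, j = rng.integers(0, len(pop), 2)
        a = pop[i] if scores[i] > scores[j] else pop[j]
        i, j = rng.integers(0, len(pop), 2)
        b = pop[i] if scores[i] > scores[j] else pop[j]
        alpha = rng.uniform(size=9)
        child = alpha * a + (1 - alpha) * b        # blend crossover
        child += rng.normal(scale=0.1, size=9)     # mutation
        new_pop.append(child)
    pop = np.array(new_pop)

best = pop[np.array([fitness(w) for w in pop]).argmax()]
print(np.round(forward(best, X), 2))               # typically approaches [0, 1, 1, 0]
```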

  10. A cluster algorithm for graphs

    NARCIS (Netherlands)

    S. van Dongen

    2000-01-01

    A cluster algorithm for graphs, called the Markov Cluster algorithm (MCL algorithm), is introduced. The algorithm basically provides an interface to an algebraic process defined on stochastic matrices, called the MCL process. The graphs may be both weighted (with nonnegative weights)
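
    A compact sketch of the MCL process on a small adjacency matrix, assuming the usual expansion (matrix powering) and inflation (entry-wise powering with re-normalisation) steps; the cluster interpretation at the end is a simplification of the full attractor-based interpretation.

```python
import numpy as np

def mcl(adjacency, expansion=2, inflation=2.0, iterations=50, self_loops=1.0):
    """Minimal Markov Cluster (MCL) process on a (weighted) adjacency matrix."""
    A = np.asarray(adjacency, dtype=float) + self_loops * np.eye(len(adjacency))
    M = A / A.sum(axis=0)                          # column-stochastic matrix
    for _ in range(iterations):
        M = np.linalg.matrix_power(M, expansion)   # expansion: flow spreads out
        M = M ** inflation                         # inflation: strong flow is boosted
        M = M / M.sum(axis=0)                      # re-normalise columns
    # Interpret the limit: nodes attracted to the same attractor form a cluster.
    clusters = {}
    for col in range(M.shape[1]):
        attractor = int(M[:, col].argmax())
        clusters.setdefault(attractor, []).append(col)
    return list(clusters.values())

# Two triangles joined by a single edge should split into two clusters.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]])
print(mcl(A))   # typically: [[0, 1, 2], [3, 4, 5]]
```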

  11. [Controlling instruments in radiology].

    Science.gov (United States)

    Maurer, M

    2013-10-01

    Due to the rising costs and competitive pressures radiological clinics and practices are now facing, controlling instruments are gaining importance in the optimization of structures and processes of the various diagnostic examinations and interventional procedures. It will be shown how the use of selected controlling instruments can secure and improve the performance of radiological facilities. A definition of the concept of controlling will be provided. It will be shown which controlling instruments can be applied in radiological departments and practices. As an example, two of the controlling instruments, material cost analysis and benchmarking, will be illustrated.

  12. Analysis of instrumentation technology for SMART

    International Nuclear Information System (INIS)

    Hur, Seop; Koo, I. S.; Park, H. Y.; Lee, C. K.; Kim, D. H.; Suh, Y. S.; Seong, S. H.; Jang, G. S.

    1998-03-01

    To develop the SMART instrumentation system, the development requirements, the techniques to be developed, and the development tasks and approach must be established; it is also important to establish development strategies as input to this work. To meet these needs, general industrial instrumentation techniques and nuclear instrumentation techniques were analyzed and reviewed on the basis of a classification of instrumentation. The analysis results, which describe the inherent merits and demerits of each technique, can be used as input for selecting the instruments for SMART. For instrumentation techniques in nuclear environments, the major techniques were reviewed and the instrumentation system was established. The following development approaches were established based on the development requirements and on the analysis of research and development trends in industrial and nuclear instrumentation techniques. (author). 90 refs., 38 tabs., 33 figs

  13. Unsupervised learning algorithms

    CERN Document Server

    Aydin, Kemal

    2016-01-01

    This book summarizes the state-of-the-art in unsupervised learning. The contributors discuss how with the proliferation of massive amounts of unlabeled data, unsupervised learning algorithms, which can automatically discover interesting and useful patterns in such data, have gained popularity among researchers and practitioners. The authors outline how these algorithms have found numerous applications including pattern recognition, market basket analysis, web mining, social network analysis, information retrieval, recommender systems, market research, intrusion detection, and fraud detection. They present how the difficulty of developing theoretically sound approaches that are amenable to objective evaluation have resulted in the proposal of numerous unsupervised learning algorithms over the past half-century. The intended audience includes researchers and practitioners who are increasingly using unsupervised learning algorithms to analyze their data. Topics of interest include anomaly detection, clustering,...

  14. Generalized-ensemble molecular dynamics and Monte Carlo algorithms beyond the limit of the multicanonical algorithm

    International Nuclear Information System (INIS)

    Okumura, Hisashi

    2010-01-01

    I review two new generalized-ensemble algorithms for molecular dynamics and Monte Carlo simulations of biomolecules, that is, the multibaric–multithermal algorithm and the partial multicanonical algorithm. In the multibaric–multithermal algorithm, two-dimensional random walks not only in the potential-energy space but also in the volume space are realized. One can discuss the temperature dependence and pressure dependence of biomolecules with this algorithm. The partial multicanonical simulation samples a wide range of only an important part of potential energy, so that one can concentrate the effort to determine a multicanonical weight factor only on the important energy terms. This algorithm has higher sampling efficiency than the multicanonical and canonical algorithms. (review)

  15. An Attitude Scale on Individual Instrument and Individual Instrument Course: Validity-Reliability Research

    Science.gov (United States)

    Kuçukosmanoglu, Hayrettin Onur

    2015-01-01

    The main purpose of this study is to develop a scale to determine students' attitude levels on individual instruments and individual instrument courses in instrument training, which is an important dimension of music education, and to conduct a validity-reliability research of the scale that has been developed. The scale consists of 16 items. The…

  16. Theoretic derivation of directed acyclic subgraph algorithm and comparisons with message passing algorithm

    Science.gov (United States)

    Ha, Jeongmok; Jeong, Hong

    2016-07-01

    This study investigates the directed acyclic subgraph (DAS) algorithm, which is used to solve discrete labeling problems much more rapidly than other Markov-random-field-based inference methods but at a competitive accuracy. However, the mechanism by which the DAS algorithm simultaneously achieves competitive accuracy and fast execution speed has not been elucidated by a theoretical derivation. We analyze the DAS algorithm by comparing it with a message passing algorithm. Graphical models, inference methods, and energy-minimization frameworks are compared between DAS and message passing algorithms. Moreover, the performances of DAS and other message passing methods [sum-product belief propagation (BP), max-product BP, and tree-reweighted message passing] are experimentally compared.

  17. Recent Advances and Achievements at The Catalina Sky Survey

    Science.gov (United States)

    Leonard, Gregory J.; Christensen, Eric J.; Fuls, Carson; Gibbs, Alex; Grauer, Al; Johnson, Jess A.; Kowalski, Richard; Larson, Stephen M.; Matheny, Rose; Seaman, Rob; Shelly, Frank

    2017-10-01

    The Catalina Sky Survey (CSS) is a NASA-funded project fully dedicated to discover and track near-Earth objects (NEOs). Since its founding nearly 20 years ago CSS remains at the forefront of NEO surveys, and recent improvements in both instrumentation and software have increased both survey productivity and data quality. In 2016 new large-format (10K x 10K) cameras were installed on both CSS survey telescopes, the 1.5-m reflector and the 0.7-m Schmidt, increasing the field of view, and hence nightly sky coverage by 4x and 2.4x respectively. The new cameras, coupled with improvements in the reduction and detection pipelines, and revised sky-coverage strategies have yielded a dramatic upward trend of NEO discovery rates. CSS has also developed a custom adaptive queue manager for scheduling NEO follow-up astrometry using a remotely operated and recently renovated 1-m Cassegrain reflector telescope, improvements that have increased the production of follow-up astrometry for newly discovered NEOs and arc extensions for previously discovered objects by CSS and other surveys. Additionally, reprocessing of archival CSS data (which includes some 46 million individual astrometric measurements) through the new reduction and detection pipeline will allow for improved orbit determinations and increased arc extensions for hundreds of thousands of asteroids. Reprocessed data will soon feed into a new public archive of CSS images and catalog data products made available through NASA’s Planetary Data System (PDS). For the future, CSS is working towards improved NEO follow-up capabilities through a combination of access to larger telescopes, instrument upgrades and follow-up scheduling tools.

  18. Shadow algorithms data miner

    CERN Document Server

    Woo, Andrew

    2012-01-01

    Digital shadow generation continues to be an important aspect of visualization and visual effects in film, games, simulations, and scientific applications. This resource offers a thorough picture of the motivations, complexities, and categorized algorithms available to generate digital shadows. From general fundamentals to specific applications, it addresses shadow algorithms and how to manage huge data sets from a shadow perspective. The book also examines the use of shadow algorithms in industrial applications, in terms of what algorithms are used and what software is applicable.

  19. Diversity-Guided Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Ursem, Rasmus Kjær

    2002-01-01

    Population diversity is undoubtedly a key issue in the performance of evolutionary algorithms. A common hypothesis is that high diversity is important to avoid premature convergence and to escape local optima. Various diversity measures have been used to analyze algorithms, but so far few algorithms have used a measure to guide the search. The diversity-guided evolutionary algorithm (DGEA) uses the well-known distance-to-average-point measure to alternate between phases of exploration (mutation) and phases of exploitation (recombination and selection). The DGEA showed remarkable results.
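
    A sketch of the distance-to-average-point measure and the phase-switching rule it drives; the thresholds d_low and d_high below are illustrative values, not necessarily those used in the DGEA.

```python
import numpy as np

def diversity(population, diag_length):
    """Distance-to-average-point measure used to guide the search."""
    centroid = population.mean(axis=0)
    dists = np.linalg.norm(population - centroid, axis=1)
    return dists.sum() / (diag_length * len(population))

def choose_phase(pop, bounds, d_low=5e-6, d_high=0.25, current="exploit"):
    """Switch between exploitation and exploration based on the diversity."""
    lo, hi = bounds
    diag = np.linalg.norm(hi - lo)               # diagonal of the search space
    d = diversity(pop, diag)
    if d < d_low:
        return "explore"                         # mutation only: spread the population
    if d > d_high:
        return "exploit"                         # recombination + selection
    return current                               # keep the current phase in between

# Example: a tight cluster triggers exploration, a spread-out one exploitation.
bounds = (np.zeros(2), np.ones(2) * 10)
tight = 5 + 1e-6 * np.random.default_rng(0).normal(size=(20, 2))
spread = np.array([[0., 0.], [10., 0.], [0., 10.], [10., 10.]] * 5)
print(choose_phase(tight, bounds), choose_phase(spread, bounds))
```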

  20. Contact-impact algorithms on parallel computers

    International Nuclear Information System (INIS)

    Zhong Zhihua; Nilsson, Larsgunnar

    1994-01-01

    Contact-impact algorithms on parallel computers are discussed within the context of explicit finite element analysis. The algorithms concerned include a contact searching algorithm and an algorithm for contact force calculations. The contact searching algorithm is based on the territory concept of the general HITA algorithm. However, no distinction is made between different contact bodies, or between different contact surfaces. All contact segments from contact boundaries are taken as a single set. Hierarchy territories and contact territories are expanded. A three-dimensional bucket sort algorithm is used to sort contact nodes. The defence node algorithm is used in the calculation of contact forces. Both the contact searching algorithm and the defence node algorithm are implemented on the connection machine CM-200. The performance of the algorithms is examined under different circumstances, and numerical results are presented. ((orig.))
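
    The HITA territory and defence-node algorithms themselves are not reproduced here; the sketch below only illustrates the three-dimensional bucket-sort idea used for contact searching: nodes are binned into spatial cells so that candidate contact partners are looked up in the 27 neighbouring cells instead of by an all-pairs search.

```python
from collections import defaultdict
import numpy as np

def build_buckets(points, cell_size):
    """Assign each point to a 3-D bucket (cell) keyed by integer grid indices."""
    buckets = defaultdict(list)
    for idx, p in enumerate(points):
        key = tuple(np.floor(p / cell_size).astype(int))
        buckets[key].append(idx)
    return buckets

def candidate_neighbours(point, buckets, cell_size):
    """Return indices of points in the 27 buckets surrounding `point`."""
    base = np.floor(point / cell_size).astype(int)
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                out.extend(buckets.get((base[0] + dx, base[1] + dy, base[2] + dz), []))
    return out

rng = np.random.default_rng(0)
nodes = rng.uniform(0, 1, size=(1000, 3))
buckets = build_buckets(nodes, cell_size=0.1)
print(len(candidate_neighbours(nodes[0], buckets, 0.1)))  # a handful of candidates, not 1000
```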

  1. A review on quantum search algorithms

    Science.gov (United States)

    Giri, Pulak Ranjan; Korepin, Vladimir E.

    2017-12-01

    The use of superposition of states in quantum computation, known as quantum parallelism, has a significant advantage in terms of speed over classical computation. This is evident from early quantum algorithms such as Deutsch's algorithm, the Deutsch-Jozsa algorithm and its variation the Bernstein-Vazirani algorithm, Simon's algorithm, Shor's algorithms, etc. Quantum parallelism also significantly speeds up the database search algorithm, which is important in computer science because it comes as a subroutine in many important algorithms. Grover's quantum database search achieves the task of finding the target element in an unsorted database in a time quadratically faster than the classical computer. We review Grover's quantum search algorithms for single and multiple target elements in a database. The partial search algorithm of Grover and Radhakrishnan and its optimization by Korepin, called the GRK algorithm, are also discussed.
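
    A classical statevector simulation of Grover's iteration (oracle phase flip followed by inversion about the mean) makes the quadratic speed-up concrete: roughly (pi/4)*sqrt(N/k) iterations concentrate the amplitude on the k marked items.

```python
import numpy as np

def grover_search(n_qubits, targets, iterations=None):
    """Classical statevector simulation of Grover's search over 2**n items."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))            # uniform superposition
    k = len(targets)
    if iterations is None:
        iterations = int(np.floor(np.pi / 4 * np.sqrt(N / k)))
    for _ in range(iterations):
        state[targets] *= -1                       # oracle: flip target amplitudes
        mean = state.mean()
        state = 2 * mean - state                   # diffusion: inversion about the mean
    return state

state = grover_search(8, targets=[37])             # one marked item among 256
probs = state ** 2
print(probs[37], int(probs.argmax()))              # probability near 1 at index 37
```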

  2. Design of 4D x-ray tomography experiments for reconstruction using regularized iterative algorithms

    Science.gov (United States)

    Mohan, K. Aditya

    2017-10-01

    4D X-ray computed tomography (4D-XCT) is widely used to perform non-destructive characterization of time-varying physical processes in various materials. The conventional approach to improving temporal resolution in 4D-XCT involves the development of expensive and complex instrumentation that acquires data faster with reduced noise. It is customary to acquire data with many tomographic views at a high signal-to-noise ratio. Instead, temporal resolution can be improved using regularized iterative algorithms that are less sensitive to noise and limited views. These algorithms benefit from optimization of other parameters such as the view sampling strategy while improving temporal resolution by reducing the total number of views or the detector exposure time. This paper presents the design principles of 4D-XCT experiments when using regularized iterative algorithms derived using the framework of model-based reconstruction. A strategy for performing 4D-XCT experiments is presented that allows for improving the temporal resolution by progressively reducing the number of views or the detector exposure time. Theoretical analysis of the effect of the data acquisition parameters on the detector signal-to-noise ratio, spatial reconstruction resolution, and temporal reconstruction resolution is also presented in this paper.
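
    As a stand-in for the model-based, regularized iterative reconstruction discussed above, the sketch below solves a heavily under-sampled linear inverse problem by gradient descent on a Tikhonov-regularized least-squares objective. Real model-based reconstruction uses physically accurate forward models and edge-preserving priors, so this is only a schematic illustration with an assumed random system matrix.

```python
import numpy as np

def regularized_reconstruction(A, y, lam=0.1, step=None, iters=2000):
    """Gradient descent on min_x ||A x - y||^2 + lam ||x||^2 (Tikhonov)."""
    m, n = A.shape
    if step is None:
        step = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam)   # safe step from the spectral norm
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - y) + lam * x
        x = x - step * grad
    return x

# Under-determined toy problem: 40 noisy "views" of a 100-pixel object.
rng = np.random.default_rng(0)
x_true = np.zeros(100); x_true[40:60] = 1.0
A = rng.normal(size=(40, 100)) / 10.0
y = A @ x_true + 0.01 * rng.normal(size=40)
x_hat = regularized_reconstruction(A, y, lam=0.05)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))  # relative error
```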

  3. Algorithmic modeling of the irrelevant sound effect (ISE) by the hearing sensation fluctuation strength.

    Science.gov (United States)

    Schlittmeier, Sabine J; Weissgerber, Tobias; Kerber, Stefan; Fastl, Hugo; Hellbrück, Jürgen

    2012-01-01

    Background sounds, such as narration, music with prominent staccato passages, and office noise, impair verbal short-term memory even when these sounds are irrelevant. This irrelevant sound effect (ISE) is evoked by so-called changing-state sounds that are characterized by a distinct temporal structure with varying successive auditory-perceptive tokens. However, because of the absence of an appropriate psychoacoustically based instrumental measure, the disturbing impact of a given speech or nonspeech sound could not be predicted until now, but necessitated behavioral testing. Our database for parametric modeling of the ISE included approximately 40 background sounds (e.g., speech, music, tone sequences, office noise, traffic noise) and corresponding performance data that was collected from 70 behavioral measurements of verbal short-term memory. The hearing sensation fluctuation strength was chosen to model the ISE and describes the percept of fluctuations when listening to slowly modulated sounds (fmod < 20 Hz). For the background sounds, the algorithm estimated behavioral performance data in 63 of 70 cases within the interquartile ranges. In particular, all real-world sounds were modeled adequately, whereas the algorithm overestimated the (non-)disturbance impact of synthetic steady-state sounds that were constituted by a repeated vowel or tone. Implications of the algorithm's strengths and prediction errors are discussed.

  4. Magnet sorting algorithms

    International Nuclear Information System (INIS)

    Dinev, D.

    1996-01-01

    Several new algorithms for sorting of dipole and/or quadrupole magnets in synchrotrons and storage rings are described. The algorithms make use of a combinatorial approach to the problem and belong to the class of random search algorithms. They use an appropriate metrization of the state space. The phase-space distortion (smear) is used as a goal function. Computational experiments for the case of the JINR-Dubna superconducting heavy ion synchrotron NUCLOTRON have shown a significant reduction of the phase-space distortion after the magnet sorting. (orig.)
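
    A sketch of the random-search idea: repeatedly propose pairwise swaps of the magnet ordering and keep a swap only if it does not worsen the goal function. The goal function below is a toy stand-in for the phase-space distortion (smear), which in practice would come from tracking or analytic estimates.

```python
import numpy as np

def random_swap_sort(errors, goal, n_trials=20000, rng=None):
    """Random pairwise-swap search over magnet orderings.

    errors : (n,) array of measured field errors, one per magnet
    goal   : callable mapping an ordering of `errors` to a scalar figure of merit
             (standing in for the phase-space distortion, or 'smear')
    """
    rng = rng or np.random.default_rng()
    order = np.arange(len(errors))
    best = goal(errors[order])
    for _ in range(n_trials):
        i, j = rng.integers(0, len(errors), 2)
        order[i], order[j] = order[j], order[i]      # propose a swap
        trial = goal(errors[order])
        if trial <= best:
            best = trial                             # keep improving swaps
        else:
            order[i], order[j] = order[j], order[i]  # revert worsening swaps
    return order, best

# Toy goal function: penalise adjacent errors of the same sign, so the search
# tends to alternate positive and negative errors around the ring.
def toy_goal(e):
    return float(np.sum(e[:-1] * e[1:]))

errors = np.random.default_rng(0).normal(size=30)
order, best = random_swap_sort(errors, toy_goal)
print(best)
```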

  5. Indoor footstep localization from structural dynamics instrumentation

    Science.gov (United States)

    Poston, Jeffrey D.; Buehrer, R. Michael; Tarazaga, Pablo A.

    2017-05-01

    Measurements from accelerometers originally deployed to measure a building's structural dynamics can serve a new role: locating individuals moving within a building. Specifically, this paper proposes measurements of footstep-generated vibrations as a novel source of information for localization. The complexity of wave propagation in a building (e.g., dispersion and reflection) limits the utility of existing algorithms designed to locate, for example, the source of sound in a room or radio waves in free space. This paper develops enhancements for arrival time determination and time difference of arrival localization in order to address the complexities posed by wave propagation within a building's structure. Experiments with actual measurements from an instrumented public building demonstrate the potential of locating footsteps to sub-meter accuracy. Furthermore, this paper explains how to forecast performance in other buildings with different sensor configurations. This localization capability holds the potential to assist public safety agencies in building evacuation and incidence response, to facilitate occupancy-based optimization of heating or cooling and to inform facility security.
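
    A sketch of time-difference-of-arrival localization by Gauss-Newton least squares, assuming a constant, known wave speed; the paper's enhancements for dispersion and reflection in real structures are not reproduced here.

```python
import numpy as np

def tdoa_localize(sensors, tdoas, wave_speed, x0=None, iters=50):
    """Gauss-Newton solution of a 2-D time-difference-of-arrival problem.

    sensors    : (M, 2) sensor positions; sensor 0 is the reference
    tdoas      : (M-1,) arrival-time differences t_i - t_0 for sensors 1..M-1
    wave_speed : assumed (constant) propagation speed in the floor
    """
    sensors = np.asarray(sensors, float)
    x = np.array(x0 if x0 is not None else sensors.mean(axis=0), float)
    d = wave_speed * np.asarray(tdoas, float)       # range differences
    for _ in range(iters):
        r = np.linalg.norm(sensors - x, axis=1)     # distances to all sensors
        f = (r[1:] - r[0]) - d                      # residuals
        J = ((x - sensors[1:]) / r[1:, None]) - ((x - sensors[0]) / r[0])
        dx, *_ = np.linalg.lstsq(J, -f, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < 1e-9:
            break
    return x

# Synthetic check: a footstep at (3, 2) observed by four accelerometers.
sensors = np.array([[0., 0.], [10., 0.], [10., 8.], [0., 8.]])
source = np.array([3., 2.])
c = 500.0                                           # assumed wave speed, m/s
t = np.linalg.norm(sensors - source, axis=1) / c
print(tdoa_localize(sensors, t[1:] - t[0], c))      # approximately [3, 2]
```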

  6. Power station instrumentation

    International Nuclear Information System (INIS)

    Jervis, M.W.

    1993-01-01

    Power stations are characterized by a wide variety of mechanical and electrical plant operating with structures, liquids and gases working at high pressures and temperatures and with large mass flows. The voltages and currents are also the highest that occur in most industries. In order to achieve maximum economy, the plant is operated with relatively small margins from conditions that can cause rapid plant damage, safety implications, and very high financial penalties. In common with other process industries, power stations depend heavily on control and instrumentation. These systems have become particularly significant, in the cost-conscious privatized environment, for providing the means to implement the automation implicit in maintaining safety standards, improving generation efficiency and reducing operating manpower costs. This book is for professional instrumentation engineers who need to know about their use in power stations, and for power station engineers requiring information about the principles and choice of instrumentation available. There are 8 chapters; chapter 4 on instrumentation for nuclear steam supply systems is indexed separately. (Author)

  7. Composite Differential Search Algorithm

    Directory of Open Access Journals (Sweden)

    Bo Liu

    2014-01-01

    Full Text Available Differential search algorithm (DS) is a relatively new evolutionary algorithm inspired by the Brownian-like random-walk movement which is used by an organism to migrate. It has been verified to be more effective than ABC, JDE, JADE, SADE, EPSDE, GSA, PSO2011, and CMA-ES. In this paper, we propose four improved solution search algorithms, namely “DS/rand/1,” “DS/rand/2,” “DS/current to rand/1,” and “DS/current to rand/2” to search the new space and enhance the convergence rate for the global optimization problem. In order to verify the performance of different solution search methods, 23 benchmark functions are employed. Experimental results indicate that the proposed algorithm performs better than, or at least comparably to, the original algorithm when considering the quality of the solution obtained. However, these schemes still cannot achieve the best solution for all functions. In order to further enhance the convergence rate and the diversity of the algorithm, a composite differential search algorithm (CDS) is proposed in this paper. This new algorithm combines three new proposed search schemes including “DS/rand/1,” “DS/rand/2,” and “DS/current to rand/1” with three control parameters using a random method to generate the offspring. Experimental results show that CDS has a faster convergence rate and better search ability based on the 23 benchmark functions.
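
    The record names the trial-vector generation schemes in the x/y/z convention familiar from differential-evolution variants; the sketch below writes out a plausible reading of three of them. The exact operators and control-parameter handling of the proposed CDS are assumptions here, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def pick(pop, i, k):
    """Pick k mutually distinct population members different from index i."""
    idx = [j for j in rng.permutation(len(pop)) if j != i][:k]
    return [pop[j] for j in idx]

# Plausible readings of the trial-vector generation schemes named above.
def ds_rand_1(pop, i, F):
    r1, r2, r3 = pick(pop, i, 3)
    return r1 + F * (r2 - r3)

def ds_rand_2(pop, i, F):
    r1, r2, r3, r4, r5 = pick(pop, i, 5)
    return r1 + F * (r2 - r3) + F * (r4 - r5)

def ds_current_to_rand_1(pop, i, F):
    r1, r2, r3 = pick(pop, i, 3)
    return pop[i] + F * (r1 - pop[i]) + F * (r2 - r3)

pop = rng.uniform(-5, 5, size=(20, 3))              # 20 candidate solutions in 3-D
print(ds_rand_1(pop, 0, F=0.8))
```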

  8. New algorithms for motion error detection of numerical control machine tool by laser tracking measurement on the basis of GPS principle

    Science.gov (United States)

    Wang, Jindong; Chen, Peng; Deng, Yufen; Guo, Junjie

    2018-01-01

    As a three-dimensional measuring instrument, the laser tracker is widely used in industrial measurement. To avoid the influence of angle measurement error on the overall measurement accuracy, the multi-station and time-sharing measurement with a laser tracker is introduced on the basis of the global positioning system (GPS) principle in this paper. For the proposed method, how to accurately determine the coordinates of each measuring point by using a large amount of measured data is a critical issue. Taking the detection of motion errors of a numerical control machine tool as an example, the corresponding measurement algorithms are investigated thoroughly. By establishing the mathematical model of detecting motion error of a machine tool with this method, the analytical algorithm concerning base station calibration and measuring point determination is deduced without selecting the initial iterative value in calculation. However, when the motion area of the machine tool is in a 2D plane, the coefficient matrix of base station calibration is singular, which generates a distorted result. In order to overcome the limitation of the original algorithm, an improved analytical algorithm is also derived. Meanwhile, the calibration accuracy of the base station with the improved algorithm is compared with that with the original analytical algorithm and some iterative algorithms, such as the Gauss-Newton algorithm and Levenberg-Marquardt algorithm. The experiment further verifies the feasibility and effectiveness of the improved algorithm. In addition, the different motion areas of the machine tool have a certain influence on the calibration accuracy of the base station, and the corresponding influence of measurement error on the calibration result of the base station, depending on the condition number of the coefficient matrix, is analyzed.
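
    Assuming the base stations have already been calibrated, the measuring-point determination reduces to a GPS-style trilateration from the measured ranges; the sketch below solves it by Gauss-Newton least squares. The base-station calibration itself, which is the delicate part analysed in the paper, is not reproduced here, and the station layout is illustrative.

```python
import numpy as np

def locate_point(stations, ranges, x0=None, iters=50):
    """Estimate a 3-D point from distances to several known base stations
    (the GPS-style trilateration step), using Gauss-Newton least squares."""
    stations = np.asarray(stations, float)
    x = np.array(x0 if x0 is not None else stations.mean(axis=0) + 1.0, float)
    for _ in range(iters):
        diff = x - stations
        r = np.linalg.norm(diff, axis=1)
        f = r - ranges                       # residuals: modelled minus measured ranges
        J = diff / r[:, None]                # Jacobian of |x - s_i| with respect to x
        dx, *_ = np.linalg.lstsq(J, -f, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < 1e-10:
            break
    return x

# Synthetic check with four base stations and a point on a machine-tool table.
stations = np.array([[0., 0., 0.], [2., 0., 0.1], [0., 2., 0.2], [2., 2., 1.5]])
p_true = np.array([0.7, 1.1, 0.4])
ranges = np.linalg.norm(stations - p_true, axis=1)
print(locate_point(stations, ranges))        # approximately p_true
```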

  9. Some emergency instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Burgess, P H

    1986-10-01

    The widespread release of activity and the resultant spread of contamination after the Chernobyl accident resulted in requests to NRPB to provide instruments for, and expertise in, the measurement of radiation. The most common request was for advice on the usefulness of existing instruments, but Board staff were also involved in their adaptation or in the development of new instruments specially to meet the circumstances of the accident. The accident occurred on 26 April. On 1 May, NRPB was involved at Heathrow Airport in the monitoring of the British students who had returned from Kiev and Minsk. The main purpose was to reassure the students by checking that their persons and belongings did not have significant surface contamination. Additional measurements were also made of iodine activity in thyroid using hand-held detectors or a mobile body monitor. This operation was arranged with the Foreign and Commonwealth Office, which had also received numerous requests for instruments from embassies and consulates in countries close to the scene of the accident. There was concern for the well-being of staff and other United Kingdom nationals who resided in or intended to visit the most affected countries. The board supplied suitable instruments, and the FCO distributed them to embassies. The frequency of environmental monitoring was increased from 29 April in anticipation of contamination and appropriate Board instrumentation was deployed. After the Chernobyl cloud arrived in the UK on 2 May, there were numerous requests from local government, public authorities, private companies and members of the public for information and advice on monitoring equipment and procedures. Some of these requirements could be met with existing equipment but members of the public were usually advised not to proceed. At a later stage, the contamination of foodstuffs and livestock required the development of an instrument capable of detecting low levels of ¹³⁷Cs and ¹³⁴Cs in food

  10. Artifact removal algorithms for stroke detection using a multistatic MIST beamforming algorithm.

    Science.gov (United States)

    Ricci, E; Di Domenico, S; Cianca, E; Rossi, T

    2015-01-01

    Microwave imaging (MWI) has recently been shown to be a promising imaging modality for low-complexity, low-cost and fast brain imaging tools, which could play a fundamental role in efficiently managing emergencies related to stroke and hemorrhages. This paper focuses on the UWB radar imaging approach and in particular on the processing algorithms of the backscattered signals. Assuming the use of the multistatic version of the MIST (Microwave Imaging Space-Time) beamforming algorithm, developed by Hagness et al. for the early detection of breast cancer, the paper proposes and compares two artifact removal algorithms. Artifact removal is an essential step of any UWB radar imaging system, and currently considered artifact removal algorithms have been shown not to be effective in the specific scenario of brain imaging. First of all, the paper proposes modifications of a known artifact removal algorithm. These modifications are shown to be effective in achieving good localization accuracy and fewer false positives. However, the main contribution is the proposal of an artifact removal algorithm based on statistical methods, which achieves even better performance with much lower computational complexity.
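
    The two artifact removal algorithms proposed in the paper are not reproduced here; the sketch below shows the simple average-subtraction baseline often used ahead of MIST-style beamforming, where the early-time artifact common to all channels is estimated as the across-channel mean and subtracted from each trace.

```python
import numpy as np

def average_subtraction(signals):
    """Simple artifact removal for multistatic radar data.

    signals : (n_channels, n_samples) array of received time traces.
    The early-time artifact (antenna coupling, outer-surface reflection) is
    nearly identical on every channel, so subtracting the across-channel mean
    suppresses it while partly preserving the channel-dependent target response.
    """
    artifact_estimate = signals.mean(axis=0, keepdims=True)
    return signals - artifact_estimate

# Synthetic example: a strong common artifact plus a weak, channel-dependent echo.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
artifact = 5.0 * np.exp(-((t - 0.1) / 0.02) ** 2)                  # same on all channels
signals = np.vstack([
    artifact + 0.2 * np.exp(-((t - 0.3 - 0.05 * k) / 0.01) ** 2)    # shifted echoes
    for k in range(8)
]) + 0.01 * rng.normal(size=(8, 500))
cleaned = average_subtraction(signals)
print(np.abs(signals).max(), np.abs(cleaned).max())                 # artifact greatly reduced
```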

  11. Law and Order in Algorithmics

    NARCIS (Netherlands)

    Fokkinga, M.M.

    1992-01-01

    An algorithm is the input-output effect of a computer program; mathematically, the notion of algorithm comes close to the notion of function. Just as arithmetic is the theory and practice of calculating with numbers, so is ALGORITHMICS the theory and practice of calculating with algorithms. Just as

  12. The Science of String Instruments

    CERN Document Server

    Rossing, Thomas D

    2010-01-01

    Many performing musicians, as well as instrument builders, are coming to realize the importance of understanding the science of musical instruments. This book explains how string instruments produce sound. It presents basic ideas in simple language, and it also translates some more sophisticated ideas in non-technical language. It should be of interest to performers, researchers, and instrument makers alike.

  13. Remote Sensing of Cloud Top Height from SEVIRI: Analysis of Eleven Current Retrieval Algorithms

    Science.gov (United States)

    Hamann, U.; Walther, A.; Baum, B.; Bennartz, R.; Bugliaro, L.; Derrien, M.; Francis, P. N.; Heidinger, A.; Joro, S.; Kniffka, A.

    2014-01-01

    The role of clouds remains the largest uncertainty in climate projections. They influence solar and thermal radiative transfer and the earth's water cycle. Therefore, there is an urgent need for accurate cloud observations to validate climate models and to monitor climate change. Passive satellite imagers measuring radiation at visible to thermal infrared (IR) wavelengths provide a wealth of information on cloud properties. Among others, the cloud top height (CTH) - a crucial parameter to estimate the thermal cloud radiative forcing - can be retrieved. In this paper we investigate the skill of ten current retrieval algorithms to estimate the CTH using observations from the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) onboard Meteosat Second Generation (MSG). In the first part we compare ten SEVIRI cloud top pressure (CTP) data sets with each other. The SEVIRI algorithms catch the latitudinal variation of the CTP in a similar way. The agreement is better in the extratropics than in the tropics. In the tropics multi-layer clouds and thin cirrus layers complicate the CTP retrieval, whereas a good agreement among the algorithms is found for trade wind cumulus, marine stratocumulus and the optically thick cores of the deep convective system. In the second part of the paper the SEVIRI retrievals are compared to CTH observations from the Cloud-Aerosol LIdar with Orthogonal Polarization (CALIOP) and Cloud Profiling Radar (CPR) instruments. It is important to note that the different measurement techniques cause differences in the retrieved CTH data. SEVIRI measures a radiatively effective CTH, while the CTH of the active instruments is derived from the return time of the emitted radar or lidar signal. Therefore, some systematic differences are expected. On average the CTHs detected by the SEVIRI algorithms are 1.0 to 2.5 kilometers lower than CALIOP observations, and the correlation coefficients between the SEVIRI and the CALIOP data sets range between 0.77 and 0

  14. Algorithms in Algebraic Geometry

    CERN Document Server

    Dickenstein, Alicia; Sommese, Andrew J

    2008-01-01

    In the last decade, there has been a burgeoning of activity in the design and implementation of algorithms for algebraic geometric computation. Some of these algorithms were originally designed for abstract algebraic geometry, but now are of interest for use in applications and some of these algorithms were originally designed for applications, but now are of interest for use in abstract algebraic geometry. The workshop on Algorithms in Algebraic Geometry that was held in the framework of the IMA Annual Program Year in Applications of Algebraic Geometry by the Institute for Mathematics and Its

  15. Explaining algorithms using metaphors

    CERN Document Server

    Forišek, Michal

    2013-01-01

    There is a significant difference between designing a new algorithm, proving its correctness, and teaching it to an audience. When teaching algorithms, the teacher's main goal should be to convey the underlying ideas and to help the students form correct mental models related to the algorithm. This process can often be facilitated by using suitable metaphors. This work provides a set of novel metaphors identified and developed as suitable tools for teaching many of the 'classic textbook' algorithms taught in undergraduate courses worldwide. Each chapter provides exercises and didactic notes fo

  16. Portfolios of quantum algorithms.

    Science.gov (United States)

    Maurer, S M; Hogg, T; Huberman, B A

    2001-12-17

    Quantum computation holds promise for the solution of many intractable problems. However, since many quantum algorithms are stochastic in nature, they can find the solution of hard problems only probabilistically. Thus the efficiency of the algorithms has to be characterized by both the expected time to completion and the associated variance. In order to minimize both the running time and its uncertainty, we show that portfolios of quantum algorithms analogous to those of finance can outperform single algorithms when applied to NP-complete problems such as 3-satisfiability.
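
    A small Monte-Carlo illustration of the portfolio idea, assuming each algorithm runs on its own processor and the portfolio stops as soon as the first member succeeds; the runtime distributions below are hypothetical stand-ins for stochastic algorithm behaviour.

```python
import numpy as np

def portfolio_stats(runtime_samplers, n_runs=20000, rng=None):
    """Monte-Carlo estimate of the completion-time mean and standard deviation
    for a portfolio that runs several stochastic algorithms in parallel and
    stops when the first one succeeds."""
    rng = rng or np.random.default_rng(0)
    draws = np.column_stack([sampler(n_runs, rng) for sampler in runtime_samplers])
    times = draws.min(axis=1)          # the portfolio finishes with its fastest member
    return times.mean(), times.std()

# Two hypothetical algorithms with heavy-tailed (lognormal) running times.
algo_a = lambda n, rng: rng.lognormal(mean=3.0, sigma=1.0, size=n)
algo_b = lambda n, rng: rng.lognormal(mean=3.2, sigma=1.5, size=n)

for name, samplers in [("A alone", [algo_a]), ("B alone", [algo_b]),
                       ("A+B portfolio", [algo_a, algo_b])]:
    m, s = portfolio_stats(samplers)
    print(f"{name}: mean={m:.1f}, std={s:.1f}")
```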

  17. DNABIT Compress - Genome compression algorithm.

    Science.gov (United States)

    Rajarajeswari, Pothuraju; Apparao, Allam

    2011-01-22

    Data compression is concerned with how information is organized in data. Efficient storage means removal of redundancy from the data being stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, "DNABIT Compress", for DNA sequences, based on a novel scheme of assigning binary bits to smaller segments of DNA bases to compress both repetitive and non-repetitive DNA sequences. Our proposed algorithm achieves the best compression ratio for DNA sequences for larger genomes. Significantly better compression results show that the "DNABIT Compress" algorithm is the best among existing compression algorithms. While achieving the best compression ratios for DNA sequences (genomes), our new DNABIT Compress algorithm significantly improves on the running time of all previous DNA compression programs. Assigning binary bits (a unique BIT CODE) to fragments of the DNA sequence (exact repeats, reverse repeats) is also a unique concept introduced in this algorithm for the first time in DNA compression. This proposed new algorithm achieves a compression ratio as low as 1.58 bits/base, where the existing best methods could not achieve a ratio less than 1.72 bits/base.
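
    The full DNABIT Compress scheme (with its unique bit codes for exact and reverse repeats) is not reproduced here; the sketch below shows only the 2-bits-per-base packing baseline that substitution-style DNA compressors start from.

```python
def pack_dna(seq):
    """Pack a DNA string into bytes at 2 bits per base (A=00, C=01, G=10, T=11)."""
    code = {"A": 0, "C": 1, "G": 2, "T": 3}
    bits = 0
    for base in seq:
        bits = (bits << 2) | code[base]
    n = len(seq)
    return n, bits.to_bytes((2 * n + 7) // 8, "big")

def unpack_dna(n, data):
    """Recover the original DNA string from its 2-bit packed form."""
    bits = int.from_bytes(data, "big")
    bases = "ACGT"
    out = []
    for shift in range(2 * (n - 1), -1, -2):
        out.append(bases[(bits >> shift) & 0b11])
    return "".join(out)

n, packed = pack_dna("ACGTACGTTTGA")
print(len(packed), unpack_dna(n, packed))   # 3 bytes, original sequence restored
```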

  18. A verified LLL algorithm

    NARCIS (Netherlands)

    Divasón, Jose; Joosten, Sebastiaan; Thiemann, René; Yamada, Akihisa

    2018-01-01

    The Lenstra-Lenstra-Lovász basis reduction algorithm, also known as LLL algorithm, is an algorithm to find a basis with short, nearly orthogonal vectors of an integer lattice. Thereby, it can also be seen as an approximation to solve the shortest vector problem (SVP), which is an NP-hard problem,
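
    A compact, exact-arithmetic sketch of the textbook LLL reduction (size reduction plus the Lovász condition with swaps); it recomputes Gram-Schmidt at each pass for clarity rather than efficiency, assumes linearly independent integer basis vectors, and is of course not the formally verified implementation described above.

```python
from fractions import Fraction

def lll_reduce(basis, delta=Fraction(3, 4)):
    """Textbook LLL lattice basis reduction; rows of `basis` are the basis vectors."""
    b = [[Fraction(x) for x in v] for v in basis]
    n = len(b)

    def dot(u, v):
        return sum(x * y for x, y in zip(u, v))

    def gram_schmidt():
        bstar, mu = [], [[Fraction(0)] * n for _ in range(n)]
        for i in range(n):
            v = b[i][:]
            for j in range(i):
                mu[i][j] = dot(b[i], bstar[j]) / dot(bstar[j], bstar[j])
                v = [x - mu[i][j] * y for x, y in zip(v, bstar[j])]
            bstar.append(v)
        return bstar, mu

    k = 1
    while k < n:
        bstar, mu = gram_schmidt()
        # Size reduction of b_k against b_{k-1}, ..., b_0 (mu updated on the fly).
        for j in range(k - 1, -1, -1):
            q = round(mu[k][j])
            if q != 0:
                b[k] = [x - q * y for x, y in zip(b[k], b[j])]
                for i in range(j):
                    mu[k][i] -= q * mu[j][i]
                mu[k][j] -= q
        # Lovász condition: either accept b_k or swap it with b_{k-1}.
        if dot(bstar[k], bstar[k]) >= (delta - mu[k][k - 1] ** 2) * dot(bstar[k - 1], bstar[k - 1]):
            k += 1
        else:
            b[k - 1], b[k] = b[k], b[k - 1]
            k = max(k - 1, 1)
    return [[int(x) for x in v] for v in b]

print(lll_reduce([[1, 1, 1], [-1, 0, 2], [3, 5, 6]]))   # a shorter, nearly orthogonal basis
```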

  19. SPECIAL LIBRARIES OF FRAGMENTS OF ALGORITHMIC NETWORKS TO AUTOMATE THE DEVELOPMENT OF ALGORITHMIC MODELS

    Directory of Open Access Journals (Sweden)

    V. E. Marley

    2015-01-01

    Full Text Available Summary. The concept of algorithmic models arose from the algorithmic approach, in which the simulated object or phenomenon is represented as a process governed by the strict rules of an algorithm that encodes how the facility operates. An algorithmic model is understood as a formalized description of a subject specialist's scenario for the simulated process, whose structure corresponds to the structure of the causal and temporal relationships between events of the process being modeled, together with all information necessary for its software implementation. Algorithmic networks are used to represent the structure of algorithmic models. They are normally defined as loaded finite directed graphs whose vertices are mapped to operators and whose arcs are the variables bound by those operators. The language of algorithmic networks is highly expressive; the algorithms it can represent effectively cover the class of all random algorithms. Existing modeling automation systems based on algorithmic networks mainly use operators working with real numbers. Although this reduces their expressive power, it is sufficient for modeling a wide class of problems related to the economy, the environment, transport and technical processes. The task of modeling the execution of schedules and network diagrams is relevant and useful. There are many systems for computing network graphs; however, their monitoring is based on analysis of gaps and durations in the graphs, with no analysis or prediction of schedule execution. The library described here is designed to build such predictive models. Source data are specified to obtain a set of projections, from which one is chosen and adopted as the new plan.

  20. Instrumental analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Jae; Seo, Seong Gyu

    1995-03-15

    This textbook deals with instrumental analysis and consists of nine chapters. It covers an introduction to analytical chemistry (the process of analysis and the types and forms of analysis), electrochemistry (basic theory, potentiometry and conductometry), electromagnetic radiation and optical components (introduction and applications), ultraviolet and visible spectrophotometry, atomic absorption spectrophotometry (introduction, flame emission spectrometry and plasma emission spectrometry), and other instrumental methods such as infrared spectrophotometry, X-ray spectrophotometry, mass spectrometry, chromatography and radiochemistry.