WorldWideScience

Sample records for satellite algorithm testbed

  1. Developing the science product algorithm testbed for Chinese next-generation geostationary meteorological satellites: Fengyun-4 series

    Science.gov (United States)

    Min, Min; Wu, Chunqiang; Li, Chuan; Liu, Hui; Xu, Na; Wu, Xiao; Chen, Lin; Wang, Fu; Sun, Fenglin; Qin, Danyu; Wang, Xi; Li, Bo; Zheng, Zhaojun; Cao, Guangzhen; Dong, Lixin

    2017-08-01

    Fengyun-4A (FY-4A), the first of the Chinese next-generation geostationary meteorological satellites, launched in 2016, offers several advances over the FY-2 series: more spectral bands, faster imaging, and infrared hyperspectral measurements. To support the major objective of developing prototypes of the FY-4 science algorithms, scientists in the FY-4 Algorithm Working Group (AWG) have developed two science product algorithm testbeds, one for imagers and one for sounders. Both testbeds, written in FORTRAN and C for Linux and UNIX systems, have been tested successfully using Intel/g compilers. Some important FY-4 science products, including cloud mask, cloud properties, and temperature profiles, have been retrieved successfully using proxy data: imagery from the Himawari-8 Advanced Himawari Imager (AHI) for the imager testbed and measurements from the Atmospheric InfraRed Sounder for the sounder testbed, demonstrating the testbeds' robustness. In addition, in early 2016 the FY-4 AWG developed, based on the imager testbed, a near-real-time processing system for Himawari-8/AHI data for use by Chinese weather forecasters. These robust and flexible science product algorithm testbeds have thus provided essential and productive tools for popularizing FY-4 data and for developing substantial improvements in FY-4 products.

  2. A Battery Certification Testbed for Small Satellite Missions

    Science.gov (United States)

    Cameron, Zachary; Kulkarni, Chetan S.; Luna, Ali Guarneros; Goebel, Kai; Poll, Scott

    2015-01-01

    A battery pack consisting of standard cylindrical 18650 lithium-ion cells has been chosen for small satellite missions based on previous flight heritage and compliance with NASA battery safety requirements. However, for batteries that transit through the International Space Station (ISS), additional certification tests are required for individual cells as well as the battery packs. In this manuscript, we discuss the development of generalized testbeds for testing and certifying different types of batteries critical to small satellite missions. Test procedures developed and executed for this certification effort include: a detailed physical inspection before and after experiments; electrical cycling characterization at the cell and pack levels; battery-pack overcharge, over-discharge, and external-short testing; and battery-pack vacuum-leak and vibration testing. The overall goals of these certification procedures are to conform to requirements set forth by the agency and to identify unique safety hazards. The testbeds, procedures, and experimental results are discussed for batteries chosen for small satellite missions to be launched from the ISS.

  3. Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed

    Science.gov (United States)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie

    2009-01-01

    Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
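    The kind of metrics such a benchmarking framework evaluates can be sketched in a few lines. The record fields and metric definitions below are illustrative simplifications, not the actual ADAPT data schema or the framework's full metric set.

```python
# Sketch of common diagnostic-algorithm benchmarking metrics: given per-run
# records of whether a fault was injected/detected and the relevant times,
# compute detection rate, false-positive rate, and mean detection latency.
# The record format is a simplification, not the actual ADAPT data schema.

def benchmark(runs):
    detected = [r for r in runs if r["injected"] and r["detected"]]
    missed = [r for r in runs if r["injected"] and not r["detected"]]
    false_pos = [r for r in runs if not r["injected"] and r["detected"]]
    nominal = [r for r in runs if not r["injected"]]
    return {
        "detection_rate": len(detected) / max(1, len(detected) + len(missed)),
        "false_positive_rate": len(false_pos) / max(1, len(nominal)),
        "mean_latency": (
            sum(r["t_detect"] - r["t_inject"] for r in detected) / len(detected)
            if detected else float("nan")
        ),
    }

runs = [
    {"injected": True, "detected": True, "t_inject": 10.0, "t_detect": 12.5},
    {"injected": True, "detected": False},
    {"injected": False, "detected": False},
    {"injected": False, "detected": True},
]
m = benchmark(runs)
```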

  4. Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition

    Science.gov (United States)

    2017-01-01

    Technical report NSWC PCD TR-2017-004, Naval Surface Warfare Center Panama City Division, Panama City, FL 32407-7001, dated 31-01-2017 (only fragments of the report cover and abstract survive extraction). The recoverable text indicates that NSWC PCD created the Modular Algorithm Testbed Suite (MATS) as a flexible platform to facilitate the development and testing of automatic target recognition (ATR) algorithms.

  5. X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT

    Science.gov (United States)

    Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven; hide

    2015-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.

  6. Design and Prototyping of a Satellite Antenna Slew Testbed

    Science.gov (United States)

    2013-12-01

    [Only fragments of the thesis front matter and bill of materials survive extraction, including acknowledgments and component listings such as Maxon encoder and EPOS power cables.] The recoverable abstract text states that the position and velocity results of a computed trajectory were implemented on the testbed motors to compare actual versus commanded values.

  7. A numerical testbed for remote sensing of aerosols, and its demonstration for evaluating retrieval synergy from a geostationary satellite constellation of GEO-CAPE and GOES-R

    International Nuclear Information System (INIS)

    Wang, Jun; Xu, Xiaoguang; Ding, Shouguo; Zeng, Jing; Spurr, Robert; Liu, Xiong; Chance, Kelly; Mishchenko, Michael

    2014-01-01

    We present a numerical testbed for remote sensing of aerosols, together with a demonstration for evaluating retrieval synergy from a geostationary satellite constellation. The testbed combines inverse (optimal-estimation) software with a forward model containing linearized code for computing particle scattering (for both spherical and non-spherical particles), a kernel-based (land and ocean) surface bi-directional reflectance facility, and a linearized radiative transfer model for polarized radiance. Calculation of gas absorption spectra uses the HITRAN (HIgh-resolution TRANsmission molecular absorption) database of spectroscopic line parameters and other trace species cross-sections. The outputs of the testbed include not only the Stokes 4-vector elements and their sensitivities (Jacobians) with respect to the aerosol single scattering and physical parameters (such as size and shape parameters, refractive index, and plume height), but also DFS (Degree of Freedom for Signal) values for retrieval of these parameters. This testbed can be used as a tool to provide an objective assessment of aerosol information content that can be retrieved for any constellation of (planned or real) satellite sensors and for any combination of algorithm design factors (in terms of wavelengths, viewing angles, radiance and/or polarization to be measured or used). We summarize the components of the testbed, including the derivation and validation of analytical formulae for Jacobian calculations. Benchmark calculations from the forward model are documented. In the context of NASA's Decadal Survey Mission GEO-CAPE (GEOstationary Coastal and Air Pollution Events), we demonstrate the use of the testbed to conduct a feasibility study of using polarization measurements in and around the O2 A band for the retrieval of aerosol height information from space, as well as to assess potential improvement in the retrieval of fine- and coarse-mode aerosol optical depth (AOD) through the
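    The DFS values the testbed reports follow standard optimal-estimation theory: DFS is the trace of the averaging kernel built from the Jacobian K, the measurement-noise covariance S_e, and the prior covariance S_a. The sketch below uses toy matrices; it illustrates the standard formula, not the testbed's implementation.

```python
import numpy as np

# Degrees of Freedom for Signal (DFS) per optimal-estimation theory:
# averaging kernel A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K, DFS = trace(A).
# K, S_e, S_a below are toy values, not the testbed's actual matrices.

def dfs(K, S_e, S_a):
    Se_inv = np.linalg.inv(S_e)
    Sa_inv = np.linalg.inv(S_a)
    M = K.T @ Se_inv @ K                       # measurement information
    A = np.linalg.solve(M + Sa_inv, M)         # averaging kernel
    return np.trace(A)

K = np.array([[1.0, 0.2], [0.1, 0.8], [0.3, 0.3]])  # 3 measurements, 2 params
S_e = 0.01 * np.eye(3)   # precise measurements
S_a = 4.0 * np.eye(2)    # loose prior -> DFS should approach 2
d = dfs(K, S_e, S_a)
```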

  8. Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed

    Science.gov (United States)

    Tian, Ye; Song, Qi; Cattafesta, Louis

    2005-01-01

    This report summarizes the activities on "Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed." The work summarized consists primarily of two parts. The first part summarizes our previous work and the extensions to adaptive ID and control algorithms. The second part concentrates on the validation of adaptive algorithms by applying them to a vibration beam test bed. Extensions to flow control problems are discussed.

  9. A Matlab-Based Testbed for Integration, Evaluation and Comparison of Heterogeneous Stereo Vision Matching Algorithms

    Directory of Open Access Journals (Sweden)

    Raul Correal

    2016-11-01

    Stereo matching is a heavily researched area with a prolific published literature and a broad spectrum of heterogeneous algorithms available in diverse programming languages. This paper presents a Matlab-based testbed that aims to centralize and standardize this variety of both current and prospective stereo matching approaches. The proposed testbed aims to facilitate the application of stereo-based methods to real situations. It allows for configuring and executing algorithms, as well as comparing results, in a fast, easy and friendly setting. Algorithms can be combined so that a series of processes can be chained and executed consecutively, using the output of one process as input for the next; some additional filtering and image processing techniques have been included within the testbed for this purpose. A use case is included to illustrate how these processes are sequenced and their effect on the results for real applications. The testbed has been conceived as a collaborative and incremental open-source project, where its code is accessible and modifiable, with the objective of receiving contributions and releasing future versions to include new algorithms and features. It is currently available online for the research community.
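    The chained-execution idea, where one algorithm's output feeds the next, can be sketched as a simple composition of callables. The stage names and toy data below are illustrative; they are not the testbed's actual module interfaces.

```python
# Minimal sketch of chained execution: each stage is a callable that takes
# the previous stage's output, so matchers and filters can be composed.
# The "matcher" and "filter" here are toys, not real stereo algorithms.

def run_pipeline(stages, data):
    for stage in stages:
        data = stage(data)
    return data

def fake_matcher(pair):
    # toy "disparity": element-wise difference of two small image rasters
    left, right = pair
    return [[l - r for l, r in zip(lr, rr)] for lr, rr in zip(left, right)]

def clamp_filter(disp):
    # post-processing stage: clamp negative disparities to zero
    return [[max(0, d) for d in row] for row in disp]

left = [[5, 7], [6, 9]]
right = [[3, 8], [2, 4]]
result = run_pipeline([fake_matcher, clamp_filter], (left, right))
```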

  10. An SDR-Based Real-Time Testbed for GNSS Adaptive Array Anti-Jamming Algorithms Accelerated by GPU

    Directory of Open Access Journals (Sweden)

    Hailong Xu

    2016-03-01

    Nowadays, software-defined radio (SDR) has become a common approach to evaluating new algorithms. However, in the field of Global Navigation Satellite System (GNSS) adaptive array anti-jamming, previous work has been limited by the high computational power demanded by adaptive algorithms and often lacks flexibility and configurability. In this paper, the design and implementation of an SDR-based real-time testbed for GNSS adaptive array anti-jamming, accelerated by a Graphics Processing Unit (GPU), are documented. The testbed distinguishes itself as a feature-rich and extendible platform with great flexibility and configurability, as well as high computational performance. Both Space-Time Adaptive Processing (STAP) and Space-Frequency Adaptive Processing (SFAP) are implemented with a wide range of parameters. Raw data from as many as eight antenna elements can be processed in real time in either an adaptive nulling or a beamforming mode. To take full advantage of the parallelism provided by the GPU, a batched programming method is proposed. Tests and experiments are conducted to evaluate both the computational and the anti-jamming performance. The platform can be used for research and prototyping, as well as a real product in certain applications.
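    The core adaptive-array computation behind both the STAP and SFAP variants can be illustrated in its simplest spatial-only (MVDR-style) form: estimate the covariance from a batch of snapshots, then solve for weights that keep unit gain toward the desired direction while suppressing the jammer. The array geometry and signal parameters below are toy values, not the testbed's configuration.

```python
import numpy as np

# Spatial-only adaptive nulling sketch (MVDR form): w = R^-1 s, normalized
# for unit gain toward the steering vector s. The jammer direction, powers,
# and 4-element array are illustrative toys.

rng = np.random.default_rng(0)
n_elem, n_snap = 4, 2000

steer = np.ones(n_elem, dtype=complex)                    # signal at broadside
jam_dir = np.exp(1j * np.pi * np.arange(n_elem) * np.sin(0.6))  # jammer steering
x = (10.0 * rng.standard_normal(n_snap) * jam_dir[:, None]      # strong jammer
     + 0.1 * (rng.standard_normal((n_elem, n_snap))
              + 1j * rng.standard_normal((n_elem, n_snap))))    # receiver noise

R = x @ x.conj().T / n_snap                 # sample covariance from the batch
w = np.linalg.solve(R, steer)
w /= steer.conj() @ w                       # unit gain toward the signal

jammer_gain = abs(w.conj() @ jam_dir)       # should be strongly suppressed
```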

  11. A Numerical Testbed for Remote Sensing of Aerosols, and its Demonstration for Evaluating Retrieval Synergy from a Geostationary Satellite Constellation of GEO-CAPE and GOES-R

    Science.gov (United States)

    Wang, Jun; Xu, Xiaoguang; Ding, Shouguo; Zeng, Jing; Spurr, Robert; Liu, Xiong; Chance, Kelly; Mishchenko, Michael I.

    2014-01-01

    We present a numerical testbed for remote sensing of aerosols, together with a demonstration for evaluating retrieval synergy from a geostationary satellite constellation. The testbed combines inverse (optimal-estimation) software with a forward model containing linearized code for computing particle scattering (for both spherical and non-spherical particles), a kernel-based (land and ocean) surface bi-directional reflectance facility, and a linearized radiative transfer model for polarized radiance. Calculation of gas absorption spectra uses the HITRAN (HIgh-resolution TRANsmission molecular absorption) database of spectroscopic line parameters and other trace species cross-sections. The outputs of the testbed include not only the Stokes 4-vector elements and their sensitivities (Jacobians) with respect to the aerosol single scattering and physical parameters (such as size and shape parameters, refractive index, and plume height), but also DFS (Degree of Freedom for Signal) values for retrieval of these parameters. This testbed can be used as a tool to provide an objective assessment of aerosol information content that can be retrieved for any constellation of (planned or real) satellite sensors and for any combination of algorithm design factors (in terms of wavelengths, viewing angles, radiance and/or polarization to be measured or used). We summarize the components of the testbed, including the derivation and validation of analytical formulae for Jacobian calculations. Benchmark calculations from the forward model are documented. In the context of NASA's Decadal Survey Mission GEO-CAPE (GEOstationary Coastal and Air Pollution Events), we demonstrate the use of the testbed to conduct a feasibility study of using polarization measurements in and around the O2 A band for the retrieval of aerosol height information from space, as well as to assess potential improvement in the retrieval of fine- and coarse-mode aerosol optical depth (AOD) through the

  12. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    Science.gov (United States)

    Taylor, Jaime; Rakoczy, John; Steincamp, James

    2003-01-01

    Phase retrieval requires calculation of the real-valued phase of the pupil function from the image intensity distribution and characteristics of an optical system. Genetic algorithms (GAs) were used to solve two one-dimensional phase retrieval problems. A GA successfully estimated the coefficients of a polynomial expansion of the phase when the number of coefficients was correctly specified. A GA also successfully estimated the multiple phases of a segmented optical system analogous to the seven-mirror Systematic Image-Based Optical Alignment (SIBOA) testbed located at NASA's Marshall Space Flight Center. The SIBOA testbed was developed to investigate phase retrieval techniques. Tip/tilt and piston motions of the mirrors accomplish phase corrections. A constant phase over each mirror can be achieved by an independent tip/tilt correction: the phase correction term can then be factored out of the Discrete Fourier Transform (DFT), greatly reducing computations.

  13. The Soil Moisture Active Passive Mission (SMAP) Science Data Products: Results of Testing with Field Experiment and Algorithm Testbed Simulation Environment Data

    Science.gov (United States)

    Entekhabi, Dara; Njoku, Eni E.; O'Neill, Peggy E.; Kellogg, Kent H.; Entin, Jared K.

    2010-01-01

    Talk outline 1. Derivation of SMAP basic and applied science requirements from the NRC Earth Science Decadal Survey applications 2. Data products and latencies 3. Algorithm highlights 4. SMAP Algorithm Testbed 5. SMAP Working Groups and community engagement

  14. Phase Retrieval Using a Genetic Algorithm on the Systematic Image-Based Optical Alignment Testbed

    Science.gov (United States)

    Taylor, Jaime R.

    2003-01-01

    NASA's Marshall Space Flight Center's Systematic Image-Based Optical Alignment (SIBOA) Testbed was developed to test phase retrieval algorithms and hardware techniques. Individuals working with the facility developed the idea of implementing phase retrieval by separating the determination of the tip/tilt of each mirror from the piston motion (or translation) of each mirror. Presented in this report is an algorithm that determines the optimal phase correction associated only with the piston motion of the mirrors. A description of the phase retrieval problem is first presented. The Systematic Image-Based Optical Alignment (SIBOA) Testbed is then described. A Discrete Fourier Transform (DFT) is necessary to transfer the incoming wavefront (or estimate of phase error) into the spatial frequency domain to compare it with the image. A method for reducing the DFT to seven scalar/matrix multiplications is presented. A genetic algorithm is then used to search for the phase error. The results of this new algorithm on a test problem are presented.
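    The genetic-algorithm search itself can be sketched generically: a population of candidate piston vectors is evolved under an error metric. The quadratic fitness below is only a stand-in for the image-domain error the report computes via the reduced DFT, and all parameters are illustrative.

```python
import random

# Generic real-coded GA skeleton: truncation selection with elitism,
# averaging crossover, and Gaussian mutation with a shrinking scale.
# The quadratic fitness is a stand-in for the report's image-domain error.

random.seed(1)
TRUE_PISTONS = [0.3, -0.1, 0.7]           # "unknown" piston values to recover

def fitness(p):
    return sum((a - b) ** 2 for a, b in zip(p, TRUE_PISTONS))

def evolve(pop_size=40, n_gen=80):
    pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for gen in range(n_gen):
        sigma = 0.1 * 0.95 ** gen          # shrinking mutation scale
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]     # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            children.append([(x + y) / 2 + random.gauss(0, sigma)
                             for x, y in zip(a, b)])
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
```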

  15. Sensing across large-scale cognitive radio networks: Data processing, algorithms, and testbed for wireless tomography and moving target tracking

    Science.gov (United States)

    Bonior, Jason David

    As the use of wireless devices has become more widespread so has the potential for utilizing wireless networks for remote sensing applications. Regular wireless communication devices are not typically designed for remote sensing. Remote sensing techniques must be carefully tailored to the capabilities of these networks before they can be applied. Experimental verification of these techniques and algorithms requires robust yet flexible testbeds. In this dissertation, two experimental testbeds for the advancement of research into sensing across large-scale cognitive radio networks are presented. System architectures, implementations, capabilities, experimental verification, and performance are discussed. One testbed is designed for the collection of scattering data to be used in RF and wireless tomography research. This system is used to collect full complex scattering data using a vector network analyzer (VNA) and amplitude-only data using non-synchronous software-defined radios (SDRs). Collected data is used to experimentally validate a technique for phase reconstruction using semidefinite relaxation and demonstrate the feasibility of wireless tomography. The second testbed is a SDR network for the collection of experimental data. The development of tools for network maintenance and data collection is presented and discussed. A novel recursive weighted centroid algorithm for device-free target localization using the variance of received signal strength for wireless links is proposed. The signal variance resulting from a moving target is modeled as having contours related to Cassini ovals. This model is used to formulate recursive weights which reduce the influence of wireless links that are farther from the target location estimate. The algorithm and its implementation on this testbed are presented and experimental results discussed.
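    The recursive weighted-centroid idea can be sketched as follows: start from a variance-weighted centroid of the link midpoints, then iteratively down-weight links far from the current estimate. The exponential distance decay below is a simple stand-in for the Cassini-oval weighting model in the dissertation, and the data are toy values.

```python
import math

# Recursive weighted-centroid sketch: links with higher RSS variance pull
# the estimate harder; each iteration reduces the influence of links far
# from the current estimate (stand-in for the Cassini-oval weights).

def recursive_centroid(midpoints, variances, n_iter=5, scale=2.0):
    est = (
        sum(v * x for v, (x, _) in zip(variances, midpoints)) / sum(variances),
        sum(v * y for v, (_, y) in zip(variances, midpoints)) / sum(variances),
    )
    for _ in range(n_iter):
        w = [
            v * math.exp(-math.hypot(x - est[0], y - est[1]) / scale)
            for v, (x, y) in zip(variances, midpoints)
        ]
        est = (
            sum(wi * x for wi, (x, _) in zip(w, midpoints)) / sum(w),
            sum(wi * y for wi, (_, y) in zip(w, midpoints)) / sum(w),
        )
    return est

mids = [(0.0, 0.0), (1.0, 1.0), (1.2, 0.8), (8.0, 8.0)]   # one distant link
var = [1.0, 3.0, 3.0, 0.5]                                # target near (1, 1)
x, y = recursive_centroid(mids, var)
```

The recursion pulls the estimate toward the high-variance cluster near (1, 1) and away from the distant link that would bias a plain centroid.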

  16. Experimental Validation of Advanced Dispersed Fringe Sensing (ADFS) Algorithm Using Advanced Wavefront Sensing and Correction Testbed (AWCT)

    Science.gov (United States)

    Wang, Xu; Shi, Fang; Sigrist, Norbert; Seo, Byoung-Joon; Tang, Hong; Bikkannavar, Siddarayappa; Basinger, Scott; Lay, Oliver

    2012-01-01

    Large aperture telescopes commonly feature segmented mirrors, and a coarse phasing step is needed to bring these individual segments into the fine phasing capture range. Dispersed Fringe Sensing (DFS) is a powerful coarse phasing technique, and a variant of it is currently being used for JWST. An Advanced Dispersed Fringe Sensing (ADFS) algorithm was recently developed to improve the performance and robustness of previous DFS algorithms, offering better accuracy and a unique solution. The first part of the paper introduces the basic ideas and essential features of the ADFS algorithm and presents some algorithm sensitivity study results. The second part of the paper describes the full details of the algorithm validation process using the Advanced Wavefront Sensing and Correction Testbed (AWCT): first, the optimization of the DFS hardware of AWCT to ensure data accuracy and reliability is illustrated. Then, a few carefully designed algorithm validation experiments are implemented, and the corresponding data analysis results are shown. Finally, fiducial calibration using the Range-Gate-Metrology technique is carried out, and an algorithm accuracy of <10 nm, or <1%, is demonstrated.

  17. Real-Time Signal Processing for Multiantenna Systems: Algorithms, Optimization, and Implementation on an Experimental Test-Bed

    Directory of Open Access Journals (Sweden)

    Haustein Thomas

    2006-01-01

    A recently realized concept of a reconfigurable hardware test-bed suitable for real-time mobile communication with multiple antennas is presented in this paper. We discuss the reasons and prerequisites for real-time capable MIMO transmission systems, which may allow channel-adaptive transmission to increase link stability and data throughput. We describe a concept for an efficient implementation of MIMO signal processing using FPGAs and DSPs. We focus on some basic linear and nonlinear MIMO detection and precoding algorithms and their optimization for a DSP target, and a few principal steps for computational performance enhancement are outlined. An experimental verification of several real-time MIMO transmission schemes at high data rates in a typical office scenario is presented, and results on the achieved BER and throughput performance are given. The different transmission schemes used channel state information either at both sides of the link or at one side only (transmitter or receiver). Spectral efficiencies of more than 20 bits/s/Hz and a throughput of more than 150 Mbps were shown with a single-carrier transmission. The experimental results clearly show the feasibility of real-time high-data-rate MIMO techniques with state-of-the-art hardware, and that more sophisticated baseband signal processing will be an essential part of future communication systems. A discussion of implementation challenges towards future wireless communication systems supporting higher data rates (1 Gbps and beyond) or high mobility concludes the paper.
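    The basic linear MIMO detectors mentioned can be sketched directly: zero-forcing inverts the channel, while MMSE regularizes the inversion by the noise variance. The channel matrix and symbols below are toy values for a 2x2 link, not the test-bed's actual configuration.

```python
import numpy as np

# Linear MIMO detection sketch: zero-forcing (ZF) vs. MMSE.
# ZF: s_hat = H^-1 y;  MMSE: s_hat = (H^H H + n0 I)^-1 H^H y.

def zf_detect(H, y):
    return np.linalg.solve(H, y)              # square channel assumed

def mmse_detect(H, y, noise_var):
    G = np.linalg.solve(H.conj().T @ H + noise_var * np.eye(H.shape[1]),
                        H.conj().T)
    return G @ y

H = np.array([[1.0, 0.4], [0.3, 0.9]])        # toy 2x2 channel
s = np.array([1.0, -1.0])                     # transmitted symbols
y = H @ s                                     # noiseless receive for the demo
s_zf = zf_detect(H, y)
s_mmse = mmse_detect(H, y, noise_var=1e-6)
```

With noise present, MMSE's regularization avoids the noise amplification ZF suffers on ill-conditioned channels, which is why both forms are worth optimizing on a DSP target.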

  18. Heuristic Scheduling Algorithm Oriented Dynamic Tasks for Imaging Satellites

    Directory of Open Access Journals (Sweden)

    Maocai Wang

    2014-01-01

    Imaging satellite scheduling is an NP-hard problem with many complex constraints. This paper studies the scheduling problem for dynamic tasks oriented to emergency cases. After the dynamic properties of satellite scheduling are analyzed, an optimization model is proposed. Based on the model, two heuristic algorithms are proposed to solve the problem. The first heuristic algorithm, named the IDI algorithm, arranges new tasks by inserting or deleting them, then inserting them repeatedly according to priority from low to high. The second, called ISDR, adopts four steps: insert directly, insert by shifting, insert by deleting, and reinsert the deleted tasks. Moreover, two heuristic factors, the congestion degree of a time window and the overlapping degree of a task, are employed to improve the algorithms' performance. Finally, a case is given to test the algorithms. The results show that the IDI algorithm is better than ISDR from the running time point of view, while the ISDR algorithm with heuristic factors is more effective with regard to algorithm performance. Moreover, the results also show that our method performs well for larger numbers of dynamic tasks in comparison with the other two methods.
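    The insert/delete/reinsert mechanics can be illustrated on a deliberately simplified single-resource model with interval tasks; the real problem has far richer constraints (time windows, slewing, energy), so this is only a sketch of the IDI idea with hypothetical data.

```python
# Toy sketch of the insert/delete/insert (IDI) idea: try to place each new
# task; if it conflicts only with lower-priority tasks, evict those, place
# the new task, then try to re-place the evicted ones.

def overlaps(a, b):
    return a[0] < b[1] and b[0] < a[1]

def idi_schedule(schedule, new_tasks):
    # tasks are (start, end, priority); process new tasks low -> high priority
    for task in sorted(new_tasks, key=lambda t: t[2]):
        conflicts = [t for t in schedule if overlaps(t, task)]
        evicted = [t for t in conflicts if t[2] < task[2]]
        if len(evicted) == len(conflicts):          # can win every conflict
            schedule = [t for t in schedule if t not in evicted] + [task]
            for t in sorted(evicted, key=lambda t: -t[2]):   # reinsert
                if not any(overlaps(t, s) for s in schedule):
                    schedule.append(t)
    return schedule

base = [(0, 2, 1), (4, 6, 5)]
new = [(1, 3, 4), (4, 6, 2)]        # the second conflicts with higher priority
result = idi_schedule(base, new)
```

Here the priority-4 task evicts the priority-1 task, while the priority-2 task is rejected because it conflicts with an existing priority-5 task.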

  19. Handoff algorithm for mobile satellite systems with ancillary terrestrial component

    KAUST Repository

    Sadek, Mirette

    2012-06-01

    This paper presents a locally optimal handoff algorithm for integrated satellite/ground communication systems. We derive the handoff decision function and present the results in the form of tradeoff curves between the number of handoffs and the number of link degradation events in a given distance covered by the mobile user. This is a practical receiver-controlled handoff algorithm that optimizes the handoff process from a user perspective based on the received signal strength rather than from a network perspective. © 2012 IEEE.
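    A receiver-controlled handoff rule of the general kind analyzed can be sketched with a hysteresis margin: switch links only when the alternative's received signal strength exceeds the current link's by the margin, trading handoff count against time spent on the weaker link. This threshold form is illustrative only, not the paper's derived locally optimal decision function.

```python
# Hysteresis-based handoff sketch between the satellite link ("sat") and the
# ancillary terrestrial component ("atc"), driven by RSS samples in dB.
# The margin suppresses ping-ponging when the two links alternate in quality.

def count_handoffs(rss_sat, rss_atc, margin_db=3.0):
    current, count = "sat", 0
    for s, a in zip(rss_sat, rss_atc):
        rss = {"sat": s, "atc": a}
        alt = "atc" if current == "sat" else "sat"
        if rss[alt] > rss[current] + margin_db:
            current, count = alt, count + 1
    return count

# Alternating coverage: without hysteresis the receiver ping-pongs.
sat = [-90, -94, -90, -94, -90]
atc = [-93, -91, -93, -91, -93]
ping_pong = count_handoffs(sat, atc, margin_db=0.0)
stable = count_handoffs(sat, atc, margin_db=3.0)
```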

  20. Use of Tabu Search in a Solver to Map Complex Networks onto Emulab Testbeds

    National Research Council Canada - National Science Library

    MacDonald, Jason E

    2007-01-01

    The University of Utah's solver for the testbed mapping problem uses a simulated annealing metaheuristic algorithm to map a researcher's experimental network topology onto available testbed resources...
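    The simulated-annealing approach can be sketched on a toy mapping problem: search over virtual-to-physical node placements, accepting worse placements with a temperature-dependent probability. The cost function below just counts virtual links split across switches, a crude stand-in for Emulab's real resource constraints.

```python
import math
import random

# Simulated-annealing sketch for a tiny testbed-mapping instance: map a
# 4-node virtual path onto 4 physical nodes spread over two switches,
# minimizing the number of virtual links that cross switches.

random.seed(7)
V_LINKS = [(0, 1), (1, 2), (2, 3)]             # virtual topology: a path
SWITCH = {0: "A", 1: "A", 2: "B", 3: "B"}      # physical node -> switch

def cost(placement):
    return sum(SWITCH[placement[u]] != SWITCH[placement[v]]
               for u, v in V_LINKS)

def anneal(start, n_steps=2000, temp0=2.0):
    place, best = dict(start), dict(start)
    for step in range(n_steps):
        t = temp0 * (1 - step / n_steps) + 1e-9    # linear cooling
        a, b = random.sample(range(4), 2)
        cand = dict(place)
        cand[a], cand[b] = cand[b], cand[a]        # swap two placements
        d = cost(cand) - cost(place)
        if d <= 0 or random.random() < math.exp(-d / t):
            place = cand
        if cost(place) < cost(best):
            best = dict(place)
    return best

best = anneal({0: 0, 1: 2, 2: 1, 3: 3})            # bad start: cost 3
```

A path over a 2-2 switch split must cross at least once, so the optimum here is cost 1, which the annealer reliably finds from the cost-3 start.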

  1. Trace explosives sensor testbed (TESTbed)

    Science.gov (United States)

    Collins, Greg E.; Malito, Michael P.; Tamanaha, Cy R.; Hammond, Mark H.; Giordano, Braden C.; Lubrano, Adam L.; Field, Christopher R.; Rogers, Duane A.; Jeffries, Russell A.; Colton, Richard J.; Rose-Pehrsson, Susan L.

    2017-03-01

    A novel vapor delivery testbed, referred to as the Trace Explosives Sensor Testbed, or TESTbed, is demonstrated that is amenable to both high- and low-volatility explosives vapors including nitromethane, nitroglycerine, ethylene glycol dinitrate, triacetone triperoxide, 2,4,6-trinitrotoluene, pentaerythritol tetranitrate, and hexahydro-1,3,5-trinitro-1,3,5-triazine. The TESTbed incorporates a six-port dual-line manifold system allowing for rapid actuation between a dedicated clean air source and a trace explosives vapor source. Explosives and explosives-related vapors can be sourced through a number of means including gas cylinders, permeation tube ovens, dynamic headspace chambers, and a Pneumatically Modulated Liquid Delivery System coupled to a perfluoroalkoxy total-consumption microflow nebulizer. Key features of the TESTbed include continuous and pulseless control of trace vapor concentrations with wide dynamic range of concentration generation, six sampling ports with reproducible vapor profile outputs, limited low-volatility explosives adsorption to the manifold surface, temperature and humidity control of the vapor stream, and a graphical user interface for system operation and testing protocol implementation.

  2. Theoretical algorithms for satellite-derived sea surface temperatures

    Science.gov (United States)

    Barton, I. J.; Zavody, A. M.; O'Brien, D. M.; Cutten, D. R.; Saunders, R. W.; Llewellyn-Jones, D. T.

    1989-03-01

    Reliable climate forecasting using numerical models of the ocean-atmosphere system requires accurate data sets of sea surface temperature (SST) and surface wind stress. Global sets of these data will be supplied by the instruments to fly on the ERS 1 satellite in 1990. One of these instruments, the Along-Track Scanning Radiometer (ATSR), has been specifically designed to provide SST in cloud-free areas with an accuracy of 0.3 K. The expected capabilities of the ATSR can be assessed using transmission models of infrared radiative transfer through the atmosphere. The performances of several different models are compared by estimating the infrared brightness temperatures measured by the NOAA 9 AVHRR for three standard atmospheres. Of these, a computationally quick spectral band model is used to derive typical AVHRR and ATSR SST algorithms in the form of linear equations. These algorithms show that a low-noise 3.7-μm channel is required to give the best satellite-derived SST and that the design accuracy of the ATSR is likely to be achievable. The inclusion of extra water vapor information in the analysis did not improve the accuracy of multiwavelength SST algorithms, but some improvement was noted with the multiangle technique. Further modeling is required with atmospheric data that include both aerosol variations and abnormal vertical profiles of water vapor and temperature.
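    The linear-equation form these SST algorithms take can be written down directly; the classic split-window example combines the 11 and 12 micron brightness temperatures to compensate for water-vapour absorption. The coefficients below are illustrative placeholders, not the ATSR or AVHRR coefficients derived in the paper.

```python
# Split-window SST sketch: SST = a0 + a1*T11 + a2*T12, where T11 and T12 are
# brightness temperatures (K) in the 11 and 12 um channels. A moister
# atmosphere widens T11 - T12, and the linear combination corrects for it.
# Coefficients are placeholders, not the paper's derived values.

def sst_split_window(t11, t12, a0=1.0, a1=3.0, a2=-2.0):
    return a0 + a1 * t11 + a2 * t12

sst = sst_split_window(290.0, 289.0)   # toy brightness temperatures in K
```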

  3. Satellite constellation design and radio resource management using genetic algorithm.

    OpenAIRE

    Asvial, Muhamad.

    2003-01-01

    A novel strategy for automatic satellite constellation design with satellite diversity is proposed. Automatic satellite constellation design means that several parameters of the constellation can be determined simultaneously: the total number of satellites, the altitude of the satellites, the angle between planes, the angle shift between satellites, and the inclination angle. Satellite constellation design is modelled using a mult...

  4. ALGORITHM OF SAR SATELLITE ATTITUDE MEASUREMENT USING GPS AIDED BY KINEMATIC VECTOR

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    In this paper, to improve the accuracy of Synthetic Aperture Radar (SAR) satellite attitude determination using the GPS wide-band carrier phase, the SAR satellite attitude kinematic vector and a Kalman filter are introduced. The state-variable function of the GPS attitude determination algorithm for the SAR satellite is formulated by means of the kinematic vector, the observation function is described by the GPS wide-band carrier phase, and the Kalman filter algorithm is then used to obtain the attitude variables of the SAR satellite. Comparing the simulation results of the Kalman filter algorithm with those of the least-squares algorithm and the explicit solution indicates that the Kalman filter algorithm performs best.
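    The filter structure can be illustrated with a generic linear Kalman filter tracking an angle and its rate from noisy angle measurements. This shows only the predict/update mechanics; the paper's state uses the full kinematic vector and GPS wide-band carrier-phase observations.

```python
import numpy as np

# Generic linear Kalman filter sketch: constant-rate angle model, noisy
# angle-only measurements. All model matrices are toy values.

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition: [angle, rate]
H = np.array([[1.0, 0.0]])                   # we observe the angle only
Q = 1e-5 * np.eye(2)                         # process noise covariance
R = np.array([[0.01]])                       # measurement noise covariance

def kf_step(x, P, z):
    x, P = F @ x, F @ P @ F.T + Q            # predict
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)                  # update with the innovation
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(3)
x, P = np.zeros(2), np.eye(2)
true_angle, true_rate = 0.0, 0.5
for _ in range(200):
    true_angle += true_rate * dt
    z = np.array([true_angle + 0.1 * rng.standard_normal()])
    x, P = kf_step(x, P, z)
```

After 200 noisy measurements the filter's angle and rate estimates settle close to the true values even though the rate is never observed directly.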

  5. Trace Gas Measurements from the GeoTASO and GCAS Airborne Instruments: An Instrument and Algorithm Test-Bed for Air Quality Observations from Geostationary Orbit

    Science.gov (United States)

    Nowlan, C. R.; Liu, X.; Janz, S. J.; Leitch, J. W.; Al-Saadi, J. A.; Chance, K.; Cole, J.; Delker, T.; Follette-Cook, M. B.; Gonzalez Abad, G.; Good, W. S.; Kowalewski, M. G.; Loughner, C.; Pickering, K. E.; Ruppert, L.; Soo, D.; Szykman, J.; Valin, L.; Zoogman, P.

    2016-12-01

    The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) and the GEO-CAPE Airborne Simulator (GCAS) instruments are pushbroom sensors capable of making remote sensing measurements of air quality and ocean color. Originally developed as test-bed instruments for the Geostationary Coastal and Air Pollution Events (GEO-CAPE) decadal survey, these instruments are now also part of risk reduction for the upcoming Tropospheric Emissions: Monitoring of Pollution (TEMPO) and Geostationary Environment Monitoring Spectrometer (GEMS) geostationary satellite missions, and will provide validation capabilities after the satellite instruments are in orbit. GeoTASO and GCAS flew on two different aircraft in their first intensive air quality field campaigns during the DISCOVER-AQ missions over Texas in 2013 and Colorado in 2014. GeoTASO was also deployed in 2016 during the KORUS-AQ field campaign to make measurements of trace gases and aerosols over Korea. GeoTASO and GCAS collect spectra of backscattered solar radiation in the UV and visible that can be used to derive 2-D maps of trace gas columns below the aircraft at spatial resolutions on the order of 250 x 500 m. We present spatially resolved maps of trace gas retrievals of ozone, nitrogen dioxide, formaldehyde and sulfur dioxide over urban areas and power plants from flights during the field campaigns, and comparisons with data from ground-based spectrometers, in situ monitoring instruments, and satellites.

  6. The SUMO Ship Detector Algorithm for Satellite Radar Images

    Directory of Open Access Journals (Sweden)

    Harm Greidanus

    2017-03-01

    Full Text Available Search for Unidentified Maritime Objects (SUMO) is an algorithm for ship detection in satellite Synthetic Aperture Radar (SAR) images. It has been developed over the course of more than 15 years, using a large number of SAR images from almost all available SAR satellites operating in L-, C- and X-band. As validated by benchmark tests, it performs very well on a wide range of SAR image modes (from Spotlight to ScanSAR) and resolutions (from 1 to 100 m), and for all types and sizes of ships, within the physical limits imposed by the radar imaging. This paper describes, in detail, the algorithmic approach in all of the steps of the ship detection: land masking, clutter estimation, detection thresholding, target clustering, ship attribute estimation and false alarm suppression. SUMO is a pixel-based CFAR (Constant False Alarm Rate) detector for multi-look radar images. It assumes a K distribution for the sea clutter, corrected however for deviations of the actual sea clutter from this distribution, implementing a fast and robust method for the clutter background estimation. The clustering of detected pixels into targets (ships) uses several thresholds to deal with the typically irregular distribution of the radar backscatter over a ship. In a multi-polarization image, the different channels are fused. Azimuth ambiguities, a common source of false alarms in ship detection, are removed. A reliability indicator is computed for each target. In post-processing, using the results of a series of images, additional false alarms from recurrent (fixed) targets, including range ambiguities, are also removed. SUMO can run in semi-automatic mode, where an operator can verify each detected target. It can also run in fully automatic mode, where batches of over 10,000 images have successfully been processed in less than two hours. The number of satellite SAR systems keeps increasing, as does their application to maritime surveillance. The open data policy of the EU
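
The detection-thresholding step can be illustrated with a minimal pixel-based CFAR sketch. For simplicity it uses a Gaussian-style mean-plus-k-sigma threshold rather than SUMO's corrected K-distribution clutter model, and the synthetic image and threshold factor are invented for illustration.

```python
import numpy as np

def cfar_detect(image, bg_mean, bg_std, k=5.0):
    """Flag pixels exceeding the clutter background by k standard
    deviations (simplified Gaussian-style CFAR; SUMO itself uses a
    K-distribution model of the sea clutter)."""
    return image > bg_mean + k * bg_std

# Synthetic multi-look amplitude image: gamma-distributed sea clutter
# plus one bright ship-like pixel.
rng = np.random.default_rng(0)
sea = rng.gamma(shape=4.0, scale=0.25, size=(64, 64))
sea[32, 32] = 20.0
mask = cfar_detect(sea, bg_mean=sea.mean(), bg_std=sea.std())
```

The ship pixel is detected while false alarms stay rare; a production detector like SUMO estimates the background locally and then clusters detected pixels into targets.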

  7. Bias correction of daily satellite precipitation data using genetic algorithm

    Science.gov (United States)

    Pratama, A. W.; Buono, A.; Hidayat, R.; Harsa, H.

    2018-05-01

    Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) is produced by blending satellite-only Climate Hazards Group InfraRed Precipitation (CHIRP) with station observation data. The blending process aims to reduce the bias of CHIRP. However, the biases of CHIRPS in statistical moments and quantile values remain high during the wet season over Java Island. This paper presents a bias correction scheme that adjusts the statistical moments of CHIRP using observed precipitation data. The scheme combines a Genetic Algorithm with a Nonlinear Power Transformation, and the results were evaluated across different seasons and elevation levels. The experiments revealed that the scheme robustly reduces the bias in variance (around 100% reduction) and leads to reductions in the first- and second-quantile biases. However, the bias in the third quantile is reduced only during dry months. Across elevation levels, the performance of the bias correction process differs significantly only in the skewness indicator.
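
A minimal sketch of the scheme's core idea combines a real-coded genetic algorithm with a nonlinear power transformation y = a·x^b. The parameterization, GA settings, and synthetic data below are assumptions for illustration, not the paper's exact configuration.

```python
import numpy as np

def power_transform(x, a, b):
    """Nonlinear power transformation y = a * x**b (hypothetical form)."""
    return a * np.power(x, b)

def fitness(params, sat, obs):
    """Mismatch in mean and standard deviation between corrected
    satellite data and station observations (lower is better)."""
    y = power_transform(sat, *params)
    return (y.mean() - obs.mean()) ** 2 + (y.std() - obs.std()) ** 2

def ga_fit(sat, obs, pop_size=40, gens=60, seed=1):
    """Elitist real-coded GA over (a, b): keep the better half of the
    population, refill with mutated copies."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform([0.1, 0.5], [3.0, 2.0], size=(pop_size, 2))
    for _ in range(gens):
        scores = np.array([fitness(p, sat, obs) for p in pop])
        elite = pop[np.argsort(scores)[: pop_size // 2]]
        pop = np.vstack([elite, elite + rng.normal(0.0, 0.05, elite.shape)])
    scores = np.array([fitness(p, sat, obs) for p in pop])
    return pop[np.argmin(scores)]

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, 2000)                      # "station" daily rainfall
sat = np.clip(0.5 * obs ** 0.8 + rng.normal(0.0, 0.3, 2000), 0.01, None)
a, b = ga_fit(sat, obs)
corrected = power_transform(sat, a, b)
```

The corrected series matches the observed moments far better than the raw biased proxy does, mirroring the paper's reduction of variance bias.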

  8. Improved interpretation of satellite altimeter data using genetic algorithms

    Science.gov (United States)

    Messa, Kenneth; Lybanon, Matthew

    1992-01-01

    Genetic algorithms (GAs) are optimization techniques based on the mechanics of evolution and natural selection. They take advantage of the power of cumulative selection, in which successive incremental improvements in a solution structure become the basis for continued development. A GA is an iterative procedure that maintains a 'population' of 'organisms' (candidate solutions). Through successive 'generations' (iterations), the population as a whole improves, in simulation of Darwin's 'survival of the fittest'. GAs have been shown to be successful where noise significantly reduces the ability of other search techniques to work effectively. Satellite altimetry provides useful information about oceanographic phenomena. It provides rapid global coverage of the oceans and is not as severely hampered by cloud cover as infrared imagery. Despite these and other benefits, several factors lead to significant difficulty in interpretation. The GA approach to the improved interpretation of satellite data involves representing the ocean surface model as a string of parameters or coefficients from the model. The GA searches, in parallel, a population of such representations (organisms) to obtain the individual best suited to 'survive', that is, the fittest as measured with respect to some 'fitness' function. The fittest organism is the one that best represents the ocean surface model with respect to the altimeter data.
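
The iterative population/selection/mutation loop described above can be condensed to a few lines. The mutation scale, population size, and toy fitness function below are illustrative choices, not the authors' altimetry setup.

```python
import random

def genetic_search(fitness, n_params, pop_size=30, gens=50, seed=42):
    """Minimal real-coded GA: the fitter half survives each generation
    and produces mutated offspring (cumulative incremental improvement)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1.0, 1.0) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)          # fittest first
        survivors = pop[: pop_size // 2]             # selection
        offspring = [[g + rng.gauss(0.0, 0.1) for g in parent]
                     for parent in survivors]        # mutation
        pop = survivors + offspring
    return max(pop, key=fitness)

# Toy "model fit": recover two coefficients whose optimum is (0.3, -0.7).
best = genetic_search(lambda p: -((p[0] - 0.3) ** 2 + (p[1] + 0.7) ** 2), 2)
```

In the paper's setting, each organism would instead encode the ocean-surface model coefficients and the fitness would measure agreement with the altimeter data.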

  9. SU-G-JeP1-07: Development of a Programmable Motion Testbed for the Validation of Ultrasound Tracking Algorithms

    International Nuclear Information System (INIS)

    Shepard, A; Matrosic, C; Zagzebski, J; Bednarz, B

    2016-01-01

    Purpose: To develop an advanced testbed that combines a 3D motion stage and ultrasound phantom to optimize and validate 2D and 3D tracking algorithms for real-time motion management during radiation therapy. Methods: A Siemens S2000 Ultrasound scanner utilizing a 9L4 transducer was coupled with the Washington University 4D Phantom to simulate patient motion. The transducer was securely fastened to the 3D stage and positioned to image three cylinders of varying contrast in a Gammex 404GS LE phantom. The transducer was placed within a water bath above the phantom in order to maintain sufficient coupling for the entire range of simulated motion. A programmed motion sequence was used to move the transducer during image acquisition and a cine video was acquired for one minute to allow for long sequence tracking. Images were analyzed using a normalized cross-correlation block matching tracking algorithm and compared to the known motion of the transducer relative to the phantom. Results: The setup produced stable ultrasound motion traces consistent with those programmed into the 3D motion stage. The acquired ultrasound images showed minimal artifacts and an image quality that was more than suitable for tracking algorithm verification. Comparisons of a block matching tracking algorithm with the known motion trace for the three features resulted in an average tracking error of 0.59 mm. Conclusion: The high accuracy and programmability of the 4D phantom allows for the acquisition of ultrasound motion sequences that are highly customizable; allowing for focused analysis of some common pitfalls of tracking algorithms such as partial feature occlusion or feature disappearance, among others. The design can easily be modified to adapt to any probe such that the process can be extended to 3D acquisition. Further development of an anatomy specific phantom better resembling true anatomical landmarks could lead to an even more robust validation. 
This work is partially funded by NIH
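
The normalized cross-correlation block matching used for the comparison can be sketched as an exhaustive template search over a small window. The synthetic frames and search radius below are illustrative, not the study's clinical imaging parameters.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def track_feature(frame, template, prev_xy, search=5):
    """Find the template's new position by exhaustive NCC search in a
    small window around its previous position."""
    h, w = template.shape
    best, best_xy = -2.0, prev_xy
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = prev_xy[0] + dy, prev_xy[1] + dx
            if 0 <= y and 0 <= x and y + h <= frame.shape[0] and x + w <= frame.shape[1]:
                score = ncc(frame[y:y + h, x:x + w], template)
                if score > best:
                    best, best_xy = score, (y, x)
    return best_xy

rng = np.random.default_rng(3)
frame0 = rng.normal(0.0, 0.1, (80, 80))
frame0[30:40, 30:40] += 1.0                 # bright feature (a "cylinder")
template = frame0[30:40, 30:40].copy()
frame1 = np.roll(frame0, shift=(2, -3), axis=(0, 1))   # feature moves by (2, -3)
new_xy = track_feature(frame1, template, (30, 30))     # → (32, 27)
```

Comparing `new_xy` against the programmed stage motion, frame by frame, is how a testbed like this quantifies tracking error.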

  11. Scheduling algorithm for data relay satellite optical communication based on artificial intelligent optimization

    Science.gov (United States)

    Zhao, Wei-hu; Zhao, Jing; Zhao, Shang-hong; Li, Yong-jun; Wang, Xiang; Dong, Yi; Dong, Chen

    2013-08-01

    Optical satellite communication, with the advantages of broad bandwidth, large capacity, and low power consumption, breaks the bottleneck of traditional microwave satellite communication. Building a space-based information system on high-performance optical inter-satellite communication, with global seamless coverage and mobile terminal access, is a necessary trend in the development of optical satellite communication. Considering the resources, missions, and constraints of a data relay satellite optical communication system, a model of optical communication resource scheduling is established and a scheduling algorithm based on artificial intelligence optimization is put forward. For multiple relay satellites, user satellites, optical antennas, and missions with several priority weights, resources are scheduled reasonably through two operations: "ascertain current mission scheduling time" and "refresh later mission time window". The priority weight is used as a parameter of the fitness function, and the scheduling plan is optimized by a Genetic Algorithm. In a simulation scenario comprising 3 relay satellites with 6 optical antennas, 12 user satellites, and 30 missions, the algorithm obtains satisfactory results in both efficiency and performance, indicating that the resource scheduling model and the optimization algorithm are suitable for the multi-relay-satellite, multi-user-satellite, multi-optical-antenna resource scheduling problem.

  12. Development of a Tethered Formation Flight Testbed for ISS, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The development of a testbed for demonstrating and maturing the technologies needed by tethered formation-flying satellites is proposed. Such a testbed would...

  13. A Study on Fuel Estimation Algorithms for a Geostationary Communication & Broadcasting Satellite

    Directory of Open Access Journals (Sweden)

    Jong Won Eun

    2000-12-01

    Full Text Available A method has been developed to calculate the fuel budget for a geostationary communication and broadcasting satellite. It is essential that the pre-launch fuel budget estimation account for the deterministic transfer and drift orbit maneuver requirements. After the satellite is on station, the calculation of its lifetime should be based on the estimation of remaining fuel and an assessment of actual performance. These estimations stem from the proper algorithms for predicting satellite lifetime. This paper concentrates on the fuel estimation method used to calculate the propellant budget with the given algorithms. Applications of this method are discussed for a communication and broadcasting satellite.
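
The paper's algorithms are not given in the abstract, but the flavor of a pre-launch propellant budget can be illustrated with the classical rocket equation. The spacecraft mass, specific impulse, and station-keeping delta-v below are hypothetical round numbers, not values from the paper.

```python
import math

def propellant_for_dv(final_mass_kg, dv_ms, isp_s, g0=9.80665):
    """Propellant mass needed to deliver delta-v `dv_ms` to a spacecraft
    whose mass after the burn is `final_mass_kg` (rocket equation)."""
    return final_mass_kg * (math.exp(dv_ms / (isp_s * g0)) - 1.0)

# Hypothetical GEO comsat: 1500 kg dry mass, 15 years of station keeping
# at ~50 m/s per year, Isp 300 s (all illustrative assumptions).
prop = propellant_for_dv(1500.0, 15 * 50.0, isp_s=300.0)
```

Running the same calculation with the remaining measured propellant, instead of the pre-launch budget, is what turns fuel estimation into a lifetime prediction.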

  14. Algorithms and programs for processing of satellite data on ozone layer and UV radiation levels

    International Nuclear Information System (INIS)

    Borkovskij, N.B.; Ivanyukovich, V.A.

    2012-01-01

    Some algorithms and programs for automatically retrieving and processing ozone layer satellite data are discussed. These techniques are used for reliable short-term forecasting of UV radiation levels. (authors)

  16. A simple and efficient algorithm to estimate daily global solar radiation from geostationary satellite data

    International Nuclear Information System (INIS)

    Lu, Ning; Qin, Jun; Yang, Kun; Sun, Jiulin

    2011-01-01

    Surface global solar radiation (GSR) is the primary renewable energy in nature. Geostationary satellite data are used to map GSR in many inversion algorithms in which ground GSR measurements merely serve to validate the satellite retrievals. In this study, a simple algorithm with artificial neural network (ANN) modeling is proposed to explore the non-linear physical relationship between ground daily GSR measurements and Multi-functional Transport Satellite (MTSAT) all-channel observations in an effort to fully exploit information contained in both data sets. Singular value decomposition is implemented to extract the principal signals from satellite data and a novel method is applied to enhance ANN performance at high altitude. A three-layer feed-forward ANN model is trained with one year of daily GSR measurements at ten ground sites. This trained ANN is then used to map continuous daily GSR for two years, and its performance is validated at all 83 ground sites in China. The evaluation result demonstrates that this algorithm can quickly and efficiently build the ANN model that estimates daily GSR from geostationary satellite data with good accuracy in both space and time. -- Highlights: → A simple and efficient algorithm to estimate GSR from geostationary satellite data. → ANN model fully exploits both the information from satellite and ground measurements. → Good performance of the ANN model is comparable to that of the classical models. → Surface elevation and infrared information enhance GSR inversion.
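
A miniature version of such a three-layer feed-forward network, trained by gradient descent on synthetic "channel" inputs, illustrates the modeling step. The layer sizes, learning rate, and synthetic data are stand-ins, not the paper's MTSAT/ANN configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in: 5 "satellite channel" inputs -> daily GSR target.
X = rng.uniform(0.0, 1.0, (200, 5))
y = (X @ np.array([0.4, 0.3, 0.1, 0.15, 0.05]))[:, None]

# Three-layer feed-forward net (5 -> 8 -> 1) with tanh hidden units,
# trained by batch gradient descent on mean squared error.
W1, b1 = rng.normal(0.0, 0.5, (5, 8)), np.zeros(8)
W2, b2 = rng.normal(0.0, 0.5, (8, 1)), np.zeros(1)
lr, losses = 0.1, []
for _ in range(500):
    h = np.tanh(X @ W1 + b1)          # hidden layer
    err = h @ W2 + b2 - y             # output error
    losses.append(float((err ** 2).mean()))
    gh = err @ W2.T * (1.0 - h ** 2)  # backpropagate through tanh
    W2 -= lr * h.T @ err / len(X); b2 -= lr * err.mean(0)
    W1 -= lr * X.T @ gh / len(X);  b1 -= lr * gh.mean(0)
final_loss = float((((np.tanh(X @ W1 + b1) @ W2 + b2) - y) ** 2).mean())
```

In the paper, the inputs would be singular-value-decomposed MTSAT channel signals and the targets ground GSR measurements; the training loop is the same in spirit.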

  17. Holodeck Testbed Project

    Science.gov (United States)

    Arias, Adriel (Inventor)

    2016-01-01

    The main objective of the Holodeck Testbed is to create a cost-effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed uses the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies for merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts needed to integrate it with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and to work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise virtual tool models (accurate down to the thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for them so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Toward the end of my internship, the lab bought a professional-grade 3D scanner, with which I was able to replicate more intricate tools at a much more time-effective rate. The second task was to investigate the use of BCI to control

  18. Mapping Surface Broadband Albedo from Satellite Observations: A Review of Literatures on Algorithms and Products

    Directory of Open Access Journals (Sweden)

    Ying Qu

    2015-01-01

    Full Text Available Surface albedo is one of the key controlling geophysical parameters in surface energy budget studies, and its temporal and spatial variation is closely related to global climate change and regional weather systems due to the albedo feedback mechanism. As an efficient tool for monitoring the surfaces of the Earth, remote sensing has been widely used in recent decades for deriving long-term surface broadband albedo from various geostationary and polar-orbit satellite platforms. Moreover, the algorithms for estimating surface broadband albedo from satellite observations, including narrow-to-broadband conversions, bidirectional reflectance distribution function (BRDF) angular modeling, direct-estimation algorithms, and algorithms for estimating albedo from geostationary satellite data, have been developed and improved. In this paper, we present a comprehensive literature review of algorithms and products for mapping surface broadband albedo with satellite observations, and we discuss the different algorithms and products in a historical perspective based on citation analysis of the published literature. This review shows that observation technologies and the accuracy requirements of applications are important, and that long-term, globally complete (covering land, ocean, and sea-ice surfaces), gap-free surface broadband albedo products with higher spatial and temporal resolution are required for climate change, surface energy budget, and hydrological studies.

  19. Evaluation of Multiple Kernel Learning Algorithms for Crop Mapping Using Satellite Image Time-Series Data

    Science.gov (United States)

    Niazmardi, S.; Safari, A.; Homayouni, S.

    2017-09-01

    Crop mapping through classification of Satellite Image Time-Series (SITS) data can provide very valuable information for several agricultural applications, such as crop monitoring, yield estimation, and crop inventory. However, SITS data classification is not straightforward, because different images of a SITS data set carry different levels of information for the classification problem. Moreover, SITS data are four-dimensional and cannot be classified using conventional classification algorithms. To address these issues, in this paper we present a classification strategy based on Multiple Kernel Learning (MKL) algorithms for SITS data classification. In this strategy, different kernels are first constructed from the different images of the SITS data and then combined into a composite kernel using the MKL algorithms. The composite kernel, once constructed, can be used for classification of the data with kernel-based classification algorithms. We compared the computational time and the classification performance of the proposed strategy using different MKL algorithms for the purpose of crop mapping. The considered MKL algorithms are the MKL-Sum, SimpleMKL, LPMKL and Group-Lasso MKL algorithms. The experimental tests of the proposed strategy on two SITS data sets, acquired by SPOT satellite sensors, showed that this strategy was able to provide better performance than the standard classification algorithm. The results also showed that the optimization method of the MKL algorithms affects both the computational time and the classification accuracy of this strategy.
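
The kernel-combination step can be sketched as a convex sum of per-date base kernels. Here the weights are fixed by hand, whereas the MKL algorithms named above (MKL-Sum, SimpleMKL, LPMKL, Group-Lasso MKL) learn them from the data; kernel type and sizes are illustrative.

```python
import numpy as np

def rbf_kernel(X, gamma):
    """RBF kernel matrix for one image of the time series."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def composite_kernel(kernels, weights):
    """MKL-style combination: a convex sum of base kernels, one per
    acquisition date (fixed weights here; MKL algorithms learn them)."""
    w = np.asarray(weights, float)
    w = w / w.sum()                    # keep the combination convex
    return sum(wi * K for wi, K in zip(w, kernels))

rng = np.random.default_rng(1)
dates = [rng.normal(0.0, 1.0, (20, 4)) for _ in range(3)]  # 3 images, 4 bands
Ks = [rbf_kernel(X, gamma=0.5) for X in dates]
K = composite_kernel(Ks, [0.2, 0.5, 0.3])
```

The composite matrix `K` can then be passed to any kernel-based classifier (e.g. an SVM with a precomputed kernel), which is how the strategy avoids flattening the four-dimensional SITS data.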

  20. An orbit determination algorithm for small satellites based on the magnitude of the earth magnetic field

    Science.gov (United States)

    Zagorski, P.; Gallina, A.; Rachucki, J.; Moczala, B.; Zietek, S.; Uhl, T.

    2018-06-01

    Autonomous attitude determination systems based on simple measurements of vector quantities, such as the magnetic field and the Sun direction, are commonly used in very small satellites. However, those systems always require knowledge of the satellite position. This information can be either propagated from orbital elements periodically uplinked from the ground station or measured onboard by a dedicated global positioning system (GPS) receiver. The former solution sacrifices satellite autonomy, while the latter requires additional sensors that may represent a significant part of the mass, volume, and power budget of pico- or nanosatellites. Hence, a system for onboard satellite position determination without resorting to GPS receivers would be useful. In this paper, a novel algorithm for determining the satellite orbit semimajor axis is presented. The method exploits only the magnitude of the Earth's magnetic field, recorded onboard by magnetometers. This represents the first step toward an extended algorithm that can determine all orbital elements of the satellite. The method is validated by numerical analysis and real magnetic field measurements.
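
As a rough illustration of why the field magnitude constrains orbit size, a centered-dipole model can be inverted for geocentric radius. This single-sample inversion is a simplified stand-in for the paper's method, which works with the |B| profile recorded along the orbit.

```python
import math

B0 = 3.12e-5   # T, mean equatorial surface field in the dipole model
RE = 6371.0    # km, Earth radius

def radius_from_b(b_meas_t, mag_lat_rad):
    """Invert the centered-dipole field magnitude for geocentric radius:
    |B| = B0 * (RE/r)**3 * sqrt(1 + 3*sin(lat)**2).  A simplified
    stand-in for the paper's semimajor-axis estimation."""
    factor = math.sqrt(1.0 + 3.0 * math.sin(mag_lat_rad) ** 2)
    return RE * (B0 * factor / b_meas_t) ** (1.0 / 3.0)

# Round trip at 500 km altitude and 30 deg magnetic latitude:
r_true = RE + 500.0
lat = math.radians(30.0)
b = B0 * (RE / r_true) ** 3 * math.sqrt(1.0 + 3.0 * math.sin(lat) ** 2)
r_est = radius_from_b(b, lat)
```

In practice magnetometer noise and non-dipole field terms make a single sample unreliable, which is why fitting the magnitude over a whole orbit, as the paper does, is needed.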

  1. Bridging Ground Validation and Algorithms: Using Scattering and Integral Tables to Incorporate Observed DSD Correlations into Satellite Algorithms

    Science.gov (United States)

    Williams, C. R.

    2012-12-01

    The NASA Global Precipitation Mission (GPM) raindrop size distribution (DSD) Working Group is composed of NASA PMM Science Team Members and is charged to "investigate the correlations between DSD parameters using Ground Validation (GV) data sets that support, or guide, the assumptions used in satellite retrieval algorithms." Correlations between DSD parameters can be used to constrain the unknowns and reduce the degrees of freedom in under-constrained satellite algorithms. Over the past two years, the GPM DSD Working Group has analyzed GV data and has found correlations between the mass-weighted mean raindrop diameter (Dm) and the mass distribution standard deviation (Sm) that follow a power-law relationship. This Dm-Sm power-law relationship appears to be robust and has been observed in surface disdrometer and vertically pointing radar observations. One benefit of a Dm-Sm power-law relationship is that a three-parameter DSD can be modeled with just two parameters: Dm and Nw, where Nw determines the DSD amplitude. In order to incorporate observed DSD correlations into satellite algorithms, the GPM DSD Working Group is developing scattering and integral tables that can be used by satellite algorithms. Scattering tables describe the interaction of electromagnetic waves with individual particles to generate cross sections of backscattering, extinction, and scattering. Scattering tables are independent of the distribution of particles. Integral tables combine scattering table outputs with DSD parameters and DSD correlations to generate integrated normalized reflectivity, attenuation, scattering, emission, and asymmetry coefficients. Integral tables contain both frequency-dependent scattering properties and cloud microphysics. The GPM DSD Working Group has developed scattering tables for raindrops at both Dual Precipitation Radar (DPR) frequencies and at all GMI radiometer frequencies less than 100 GHz. Scattering tables include Mie and T-matrix scattering with H- and V
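
The two-parameter idea can be sketched with a normalized gamma DSD in which the shape parameter mu is eliminated through the relation Sm = Dm/sqrt(4+mu). The power-law coefficients below are illustrative placeholders, not the Working Group's fitted values.

```python
import math
import numpy as np

def mu_from_dm(dm_mm, a=0.27, b=1.12):
    """Shape parameter implied by a Dm-Sm power law Sm = a * Dm**b
    (illustrative coefficients) via Sm = Dm / sqrt(4 + mu)."""
    sm = a * dm_mm ** b
    return (dm_mm / sm) ** 2 - 4.0

def gamma_dsd(d_mm, nw, dm_mm):
    """Normalized gamma DSD N(D) with mu constrained by the Dm-Sm
    relation, so only Nw (amplitude) and Dm remain free."""
    mu = mu_from_dm(dm_mm)
    lam = (4.0 + mu) / dm_mm
    f = (6.0 / 4.0 ** 4) * (4.0 + mu) ** (mu + 4.0) / math.gamma(mu + 4.0)
    return nw * f * (d_mm / dm_mm) ** mu * np.exp(-lam * d_mm)

d = np.linspace(0.1, 6.0, 200)          # drop diameters (mm)
n = gamma_dsd(d, nw=8000.0, dm_mm=1.5)  # concentration per diameter bin
```

Because mu is fixed by Dm, a retrieval only needs to search over (Nw, Dm), which is exactly the degrees-of-freedom reduction the Working Group is after.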

  2. First Attempt of Orbit Determination of SLR Satellites and Space Debris Using Genetic Algorithms

    Science.gov (United States)

    Deleflie, F.; Coulot, D.; Descosta, R.; Fernier, A.; Richard, P.

    2013-08-01

    We present an orbit determination method based on genetic algorithms. Contrary to usual estimation methods, which are mainly based on least-squares techniques, these algorithms do not require any a priori knowledge of the initial state vector to be estimated. These algorithms can be applied when a new satellite is launched or for uncatalogued objects that appear in images obtained from robotic telescopes such as the TAROT ones. We show in this paper preliminary results obtained for an SLR satellite, for which tracking data acquired by the ILRS network enable accurate orbital arcs to be built at the few-centimeter level and used as a reference orbit; in this case, the basic observations are time series of ranges obtained from various tracking stations. We also show the results obtained from observations acquired by the two TAROT telescopes of the Telecom-2D satellite operated by CNES; in that case, the observations are time series of azimuths and elevations, seen from the two TAROT telescopes. The method is carried out in several steps: (i) an analytical propagation of the equations of motion, and (ii) an estimation kernel based on genetic algorithms, which follows the usual steps of such approaches: initialization and evolution of a selected population, so as to determine the best parameters. Each parameter to be estimated, namely each initial Keplerian element, has to be searched within a preliminarily chosen interval. The algorithm is expected to converge towards an optimum over a reasonable computational time.

  3. Proportional fair scheduling algorithm based on traffic in satellite communication system

    Science.gov (United States)

    Pan, Cheng-Sheng; Sui, Shi-Long; Liu, Chun-ling; Shi, Yu-Xin

    2018-02-01

    In the satellite communication network system, to address the problems of low system capacity and poor user fairness when multiple users access the satellite network on the downlink, and taking the characteristics of user data traffic into account, a scheduling algorithm balancing throughput and user fairness is proposed: the traffic-based proportional fairness algorithm (B-PF). The algorithm improves on the proportional fairness algorithm used in wireless communication systems by taking into account both the user's channel condition and buffered traffic information. The user's outgoing traffic is treated as an adjustment factor of the scheduling priority, and the concept of traffic satisfaction is introduced. First, the algorithm calculates each user's priority according to the scheduling rule and dispatches the user with the highest priority. Second, when a scheduled user's traffic demand is already satisfied, the system dispatches the user with the next-highest priority. The simulation results show that, compared with the PF algorithm, B-PF improves system throughput, traffic satisfaction, and fairness.
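
The priority rule can be sketched as the classic proportional-fair ratio scaled by queued traffic, with satisfied users skipped. The exact B-PF priority expression and the field names and values below are assumptions for illustration.

```python
def bpf_schedule(users):
    """Pick the user with the highest B-PF-style priority: the classic
    proportional-fair ratio (instantaneous rate over average throughput)
    scaled by queued traffic, skipping users whose demand is satisfied.
    Field names and the exact priority form are illustrative."""
    candidates = [u for u in users if u["queued_bits"] > 0]  # unsatisfied only
    def priority(u):
        return (u["rate"] / u["avg_throughput"]) * u["queued_bits"]
    return max(candidates, key=priority)["name"]

users = [
    {"name": "sat-term-A", "rate": 8.0, "avg_throughput": 4.0, "queued_bits": 100},
    {"name": "sat-term-B", "rate": 6.0, "avg_throughput": 1.0, "queued_bits": 500},
    {"name": "sat-term-C", "rate": 9.0, "avg_throughput": 2.0, "queued_bits": 0},
]
winner = bpf_schedule(users)   # → "sat-term-B"
```

Terminal C has the best channel but no backlog, so it is skipped; B wins because it has been served little so far and has the largest queued traffic, which is the fairness-plus-traffic trade-off the abstract describes.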

  4. Technical Report Series on Global Modeling and Data Assimilation. Volume 12; Comparison of Satellite Global Rainfall Algorithms

    Science.gov (United States)

    Suarez, Max J. (Editor); Chang, Alfred T. C.; Chiu, Long S.

    1997-01-01

    Seventeen months of rainfall data (August 1987-December 1988) from nine satellite rainfall algorithms (Adler, Chang, Kummerow, Prabhakara, Huffman, Spencer, Susskind, and Wu) were analyzed to examine the uncertainty of satellite-derived rainfall estimates. The variability among algorithms, measured as the standard deviation computed from the ensemble of algorithms, shows that regions of high algorithm variability tend to coincide with regions of high rain rates. Histograms of pattern correlation (PC) between algorithms suggest a bimodal distribution, with separation at a PC value of about 0.85. Applying this threshold as a similarity criterion, our analyses show that algorithms using the same sensor or satellite input tend to be similar, suggesting the dominance of sampling errors in these satellite estimates.
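
The pattern-correlation similarity test can be reproduced in a few lines. The synthetic rain maps below are invented; only the 0.85 similarity threshold comes from the analysis above.

```python
import numpy as np

def pattern_correlation(field_a, field_b):
    """Centered pattern correlation between two rain-rate maps."""
    a = field_a - field_a.mean()
    b = field_b - field_b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

rng = np.random.default_rng(2)
alg1 = rng.gamma(2.0, 1.0, (36, 72))          # one algorithm's monthly map
alg2 = alg1 + rng.normal(0.0, 0.5, (36, 72))  # a similar algorithm's map
pc = pattern_correlation(alg1, alg2)
similar = pc > 0.85                           # the paper's similarity cut
```

Two algorithms driven by the same sensor input would behave like `alg1` and `alg2` here, landing above the 0.85 cut, while algorithms with independent sampling fall into the lower mode of the histogram.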

  5. An Online Tilt Estimation and Compensation Algorithm for a Small Satellite Camera

    Science.gov (United States)

    Lee, Da-Hyun; Hwang, Jai-hyuk

    2018-04-01

    In the case of a satellite camera designed to execute an Earth observation mission, even after a pre-launch precision alignment process has been carried out, misalignment will occur due to external factors during launch and in the operating environment. In particular, for high-resolution satellite cameras, which require submicron alignment accuracy between optical components, misalignment is a major cause of image quality degradation. To compensate for this, most high-resolution satellite cameras undergo a precise realignment process, called refocusing, before and during operation. However, conventional Earth observation satellites only execute refocusing for de-space errors. Thus, in this paper, an online tilt estimation and compensation algorithm is presented that can be utilized after de-space correction has been executed. Although the sensitivity of optical performance degradation due to misalignment is highest for de-space, the MTF can be further increased by correcting tilt after refocusing. The algorithm proposed in this research can be used to estimate the amount of tilt that occurs by taking star images, and it can also carry out automatic tilt corrections by employing a compensation mechanism that gives angular motion to the secondary mirror. Crucially, this algorithm is developed as an online processing system so that it can operate without communication with the ground.

  6. The HSBQ Algorithm with Triple-play Services for Broadband Hybrid Satellite Constellation Communication System

    Directory of Open Access Journals (Sweden)

    Anupon Boriboon

    2016-07-01

    Full Text Available The HSBQ algorithm is an active queue management algorithm that aims to avoid high packet loss rates and keep the stream queue stable. The challenge is the calculation of a drop probability that achieves both queue length stability and bandwidth fairness. This paper proposes HSBQ, which drops packets before the queues overflow at the gateways, so that the end nodes can respond to congestion before queue overflow occurs. The algorithm uses the change in the average queue length to adjust the amount by which the mark (or drop) probability is changed. Moreover, it adjusts the queue weight, which is used to estimate the average queue length, based on the arrival rate. The results show that the HSBQ algorithm maintains a stable stream queue better than congestion-metric algorithms without flow information as the rate of the hybrid satellite network changes dramatically, and the presented empirical evidence demonstrates that HSBQ offers better quality of service than the queue control mechanisms traditionally used in hybrid satellite networks.
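
The queue-averaging and early-drop mechanism can be illustrated with a RED-style sketch: an exponentially weighted average queue length drives a drop probability that rises between two thresholds. The thresholds, weight, and queue samples below are invented; HSBQ additionally adapts the weight to the arrival rate, which is not modeled here.

```python
def ewma_queue(avg, current, weight):
    """Exponentially weighted average queue length (HSBQ adapts
    `weight` to the arrival rate; here it is fixed)."""
    return (1.0 - weight) * avg + weight * current

def drop_probability(avg, min_th, max_th, max_p):
    """RED-style drop curve: the probability grows with the average
    queue length between the two thresholds, signaling congestion
    before the gateway queue actually overflows."""
    if avg < min_th:
        return 0.0
    if avg >= max_th:
        return 1.0
    return max_p * (avg - min_th) / (max_th - min_th)

avg = 0.0
for q in [10, 40, 80, 120, 160]:   # instantaneous queue-length samples
    avg = ewma_queue(avg, q, weight=0.5)
p = drop_probability(avg, min_th=50, max_th=150, max_p=0.1)
```

As the queue builds, the averaged length crosses the lower threshold and packets start being dropped probabilistically, well before the hard overflow point, which is the early-congestion signal the abstract describes.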

  7. Development of a space-systems network testbed

    Science.gov (United States)

    Lala, Jaynarayan; Alger, Linda; Adams, Stuart; Burkhardt, Laura; Nagle, Gail; Murray, Nicholas

    1988-01-01

    This paper describes a communications network testbed designed to allow the development of architectures and algorithms that meet the functional requirements of future NASA communication systems. The central hardware components of the network testbed are programmable circuit-switching communication nodes, which can be adapted by software or firmware changes to customize the testbed to particular architectures and algorithms. Fault detection, isolation, and reconfiguration have been implemented in the network with a hybrid approach that utilizes features of both centralized and distributed techniques to provide efficient handling of faults within the network.

  8. Research on Scheduling Algorithm for Multi-satellite and Point Target Task on Swinging Mode

    Science.gov (United States)

    Wang, M.; Dai, G.; Peng, L.; Song, Z.; Chen, G.

    2012-12-01

    and negative swinging angle and the computation of time windows are analyzed and discussed, and several strategies to improve the efficiency of the model are put forward. To solve the model, we introduce the concept of an activity sequence map, with which the choice of activity and the start time of the activity can be separated. We also introduce three neighborhood operators to search the solution space. The front movement remaining time and the back movement remaining time are used to analyze the feasibility of generating solutions from the neighborhood operators. An algorithm to solve the model is then put forward based on a genetic algorithm; population initialization, the crossover operator, the mutation operator, individual evaluation, the collision decrease operator, the selection operator and the collision elimination operator are designed in the paper. Finally, the scheduling result and a simulation of a practical example with 5 satellites and 100 point targets in swinging mode are given, and the scheduling performance is analyzed for swinging angles of 0, 5, 10, 15 and 25 degrees. The results show that the model and the algorithm are more effective than their counterparts without swinging mode.
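
    The genetic-algorithm skeleton such a scheduler rests on (elitist selection, order crossover, swap mutation over a task sequence) can be sketched as below. The fitness function and all parameters are placeholders, and the paper's specialised collision-handling operators are omitted:

    ```python
    import random

    # Minimal genetic-algorithm loop over orderings of observation tasks.
    # Illustrative only: the paper adds collision decrease/elimination
    # operators and a problem-specific evaluation.
    def evolve(num_tasks, fitness, pop_size=30, generations=100, p_mut=0.2, seed=1):
        rng = random.Random(seed)
        pop = [rng.sample(range(num_tasks), num_tasks) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]          # elitist selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, num_tasks)
                # Order crossover: prefix of a, remaining tasks in b's order.
                child = a[:cut] + [g for g in b if g not in a[:cut]]
                if rng.random() < p_mut:            # swap mutation
                    i, j = rng.sample(range(num_tasks), 2)
                    child[i], child[j] = child[j], child[i]
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)
    ```

    With a fixed seed the search is deterministic, which helps when comparing operator variants such as the three neighborhood operators described above.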

  9. Virtual Factory Testbed

    Data.gov (United States)

    Federal Laboratory Consortium — The Virtual Factory Testbed (VFT) is comprised of three physical facilities linked by a standalone network (VFNet). The three facilities are the Smart and Wireless...

  10. Comparison of four machine learning algorithms for their applicability in satellite-based optical rainfall retrievals

    Science.gov (United States)

    Meyer, Hanna; Kühnlein, Meike; Appelhans, Tim; Nauss, Thomas

    2016-03-01

    Machine learning (ML) algorithms have successfully been demonstrated to be valuable tools in satellite-based rainfall retrievals which show the practicability of using ML algorithms when faced with high dimensional and complex data. Moreover, recent developments in parallel computing with ML present new possibilities for training and prediction speed and therefore make their usage in real-time systems feasible. This study compares four ML algorithms - random forests (RF), neural networks (NNET), averaged neural networks (AVNNET) and support vector machines (SVM) - for rainfall area detection and rainfall rate assignment using MSG SEVIRI data over Germany. Satellite-based proxies for cloud top height, cloud top temperature, cloud phase and cloud water path serve as predictor variables. The results indicate an overestimation of rainfall area delineation regardless of the ML algorithm (averaged bias = 1.8) but a high probability of detection ranging from 81% (SVM) to 85% (NNET). On a 24-hour basis, the performance of the rainfall rate assignment yielded R2 values between 0.39 (SVM) and 0.44 (AVNNET). Though the differences in the algorithms' performance were rather small, NNET and AVNNET were identified as the most suitable algorithms. On average, they demonstrated the best performance in rainfall area delineation as well as in rainfall rate assignment. NNET's computational speed is an additional advantage in work with large datasets such as in remote sensing based rainfall retrievals. However, since no single algorithm performed considerably better than the others we conclude that further research in providing suitable predictors for rainfall is of greater necessity than an optimization through the choice of the ML algorithm.
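
    The two categorical scores quoted above (bias for rainfall area and probability of detection) come from a rain/no-rain contingency table; a minimal sketch with invented counts:

    ```python
    # Verification scores for rain-area delineation from a contingency
    # table. The counts below are invented for illustration; they are
    # not the study's data.
    def bias_and_pod(hits, misses, false_alarms):
        bias = (hits + false_alarms) / (hits + misses)  # >1: rain area overestimated
        pod = hits / (hits + misses)                    # probability of detection
        return bias, pod

    bias, pod = bias_and_pod(hits=85, misses=15, false_alarms=95)
    # bias = 1.8 (overestimation), pod = 0.85 (85% of rain events detected)
    ```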

  11. AN EVOLUTIONARY ALGORITHM FOR FAST INTENSITY BASED IMAGE MATCHING BETWEEN OPTICAL AND SAR SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    P. Fischer

    2018-04-01

    Full Text Available This paper presents a hybrid evolutionary algorithm for fast intensity-based matching between satellite imagery from SAR and very high-resolution (VHR) optical sensor systems. The precise and accurate co-registration of image time series and images of different sensors is a key task in multi-sensor image processing scenarios. The necessary preprocessing step of image matching and tie-point detection is divided into a search problem and a similarity measurement. Within this paper we evaluate the use of an evolutionary search strategy for establishing the spatial correspondence between satellite imagery of optical and radar sensors. The aim of the proposed algorithm is to decrease the computational costs during the search process by formulating the search as an optimization problem. Based upon the canonical evolutionary algorithm, the proposed algorithm is adapted for SAR/optical imagery intensity-based matching. Extensions are drawn using techniques like hybridization (e.g. local search) and others to lower the number of objective function calls and refine the result. The algorithm significantly decreases the computational costs whilst finding the optimal solution in a reliable way.
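
    The matching-as-optimisation idea can be illustrated with a toy evolutionary search over 2-D image shifts. Everything here is an assumption for illustration: real SAR/optical matching needs a robust similarity measure (e.g. mutual information) rather than the synthetic objective used below, plus the paper's local-search hybridization:

    ```python
    import random

    # Toy evolutionary search for the 2-D shift that maximises an image
    # similarity measure. Elitist selection plus +/-1 pixel mutation.
    def match_offset(similarity, search=10, pop_size=12, generations=30, seed=0):
        rng = random.Random(seed)
        pop = [(rng.randint(-search, search), rng.randint(-search, search))
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=similarity, reverse=True)
            survivors = pop[: pop_size // 2]        # elitist selection
            pop = survivors + [
                (x + rng.choice([-1, 0, 1]), y + rng.choice([-1, 0, 1]))
                for x, y in survivors               # local mutation of survivors
            ]
        return max(pop, key=similarity)
    ```

    Each generation costs only pop_size similarity evaluations, which is the point of formulating the search this way instead of exhaustively scoring every candidate shift.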

  12. Optical Algorithms at Satellite Wavelengths for Total Suspended Matter in Tropical Coastal Waters

    Science.gov (United States)

    Ouillon, Sylvain; Douillet, Pascal; Petrenko, Anne; Neveux, Jacques; Dupouy, Cécile; Froidefond, Jean-Marie; Andréfouët, Serge; Muñoz-Caravaca, Alain

    2008-01-01

    Is it possible to derive accurately Total Suspended Matter concentration or its proxy, turbidity, from remote sensing data in tropical coastal lagoon waters? To investigate this question, hyperspectral remote sensing reflectance, turbidity and chlorophyll pigment concentration were measured in three coral reef lagoons. The three sites enabled us to get data over very diverse environments: oligotrophic and sediment-poor waters in the southwest lagoon of New Caledonia, eutrophic waters in the Cienfuegos Bay (Cuba), and sediment-rich waters in the Laucala Bay (Fiji). In this paper, optical algorithms for turbidity are presented per site based on 113 stations in New Caledonia, 24 stations in Cuba and 56 stations in Fiji. Empirical algorithms are tested at satellite wavebands useful to coastal applications. Global algorithms are also derived for the merged data set (193 stations). The performances of global and local regression algorithms are compared. The best one-band algorithms on all the measurements are obtained at 681 nm using either a polynomial or a power model. The best two-band algorithms are obtained with R412/R620, R443/R670 and R510/R681. Two three-band algorithms based on Rrs620.Rrs681/Rrs412 and Rrs620.Rrs681/Rrs510 also give fair regression statistics. Finally, we propose a global algorithm based on one or three bands: turbidity is first calculated from Rrs681 and then, if < 1 FTU, it is recalculated using an algorithm based on Rrs620.Rrs681/Rrs412. On our data set, this algorithm is suitable for the 0.2-25 FTU turbidity range and for the three sites sampled (mean bias: 3.6%, rms: 35%, mean quadratic error: 1.4 FTU). This shows that defining global empirical turbidity algorithms in tropical coastal waters is within reach. PMID:27879929
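
    The switching logic of that global algorithm can be sketched as follows; the coefficients a, b, c and d are placeholders standing in for the fitted regression constants, which the sketch does not reproduce:

    ```python
    # Sketch of the paper's switching scheme: turbidity from the single
    # 681 nm band (power model), recomputed from a three-band ratio when
    # the first estimate falls below 1 FTU. Coefficients are placeholders.
    def turbidity(rrs412, rrs620, rrs681, a=2.0e4, b=1.0, c=50.0, d=1.0):
        t = a * rrs681 ** b                   # one-band power model at 681 nm
        if t < 1.0:                           # low-turbidity branch
            t = c * (rrs620 * rrs681 / rrs412) ** d
        return t
    ```

    The ratio-based branch compensates for the weak single-band signal in clear water, which is why the two-stage scheme covers the whole 0.2-25 FTU range.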

  13. Optical Algorithms at Satellite Wavelengths for Total Suspended Matter in Tropical Coastal Waters

    Directory of Open Access Journals (Sweden)

    Alain Muñoz-Caravaca

    2008-07-01

    Full Text Available Is it possible to derive accurately Total Suspended Matter concentration or its proxy, turbidity, from remote sensing data in tropical coastal lagoon waters? To investigate this question, hyperspectral remote sensing reflectance, turbidity and chlorophyll pigment concentration were measured in three coral reef lagoons. The three sites enabled us to get data over very diverse environments: oligotrophic and sediment-poor waters in the southwest lagoon of New Caledonia, eutrophic waters in the Cienfuegos Bay (Cuba), and sediment-rich waters in the Laucala Bay (Fiji). In this paper, optical algorithms for turbidity are presented per site based on 113 stations in New Caledonia, 24 stations in Cuba and 56 stations in Fiji. Empirical algorithms are tested at satellite wavebands useful to coastal applications. Global algorithms are also derived for the merged data set (193 stations). The performances of global and local regression algorithms are compared. The best one-band algorithms on all the measurements are obtained at 681 nm using either a polynomial or a power model. The best two-band algorithms are obtained with R412/R620, R443/R670 and R510/R681. Two three-band algorithms based on Rrs620.Rrs681/Rrs412 and Rrs620.Rrs681/Rrs510 also give fair regression statistics. Finally, we propose a global algorithm based on one or three bands: turbidity is first calculated from Rrs681 and then, if < 1 FTU, it is recalculated using an algorithm based on Rrs620.Rrs681/Rrs412. On our data set, this algorithm is suitable for the 0.2-25 FTU turbidity range and for the three sites sampled (mean bias: 3.6%, rms: 35%, mean quadratic error: 1.4 FTU). This shows that defining global empirical turbidity algorithms in tropical coastal waters is within reach.

  14. A novel gridding algorithm to create regional trace gas maps from satellite observations

    Science.gov (United States)

    Kuhlmann, G.; Hartl, A.; Cheung, H. M.; Lam, Y. F.; Wenig, M. O.

    2014-02-01

    The recent increase in spatial resolution for satellite instruments has made it feasible to study distributions of trace gas column densities on a regional scale. For this application a new gridding algorithm was developed to map measurements from the instrument's frame of reference (level 2) onto a longitude-latitude grid (level 3). The algorithm is designed for the Ozone Monitoring Instrument (OMI) and can easily be employed for similar instruments - for example, the upcoming TROPOspheric Monitoring Instrument (TROPOMI). Trace gas distributions are reconstructed by a continuous parabolic spline surface. The algorithm explicitly considers the spatially varying sensitivity of the sensor resulting from the instrument function. At the swath edge, the inverse problem of computing the spline coefficients is very sensitive to measurement errors and is regularised by a second-order difference matrix. Since this regularisation corresponds to the penalty term for smoothing splines, it similarly attenuates the effect of measurement noise over the entire swath width. Monte Carlo simulations are conducted to study the performance of the algorithm for different distributions of trace gas column densities. The optimal weight of the penalty term is found to be proportional to the measurement uncertainty and the width of the instrument function. A comparison with an established gridding algorithm shows improved performance for small to moderate measurement errors due to better parametrisation of the distribution. The resulting maps are smoother and extreme values are more accurately reconstructed. The performance improvement is further illustrated with high-resolution distributions obtained from a regional chemistry model. The new algorithm is applied to tropospheric NO2 column densities measured by OMI. Examples of regional NO2 maps are shown for densely populated areas in China, Europe and the United States of America. This work demonstrates that the newly developed gridding
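
    The role of the second-order difference penalty is easiest to see in a 1-D analogue (a Whittaker-style penalised least squares). This sketch is illustrative only: the paper applies the penalty to a 2-D parabolic spline surface with sensor-dependent weights, not to a plain vector:

    ```python
    import numpy as np

    # 1-D penalised least squares with a second-order difference penalty:
    # solve (I + lam * D^T D) x = y, where D is the second-difference
    # matrix. Larger lam attenuates measurement noise more strongly.
    def smooth(y, lam):
        n = len(y)
        D = np.diff(np.eye(n), n=2, axis=0)   # (n-2) x n second-difference matrix
        return np.linalg.solve(np.eye(n) + lam * D.T @ D,
                               np.asarray(y, dtype=float))
    ```

    Because the penalty annihilates linear trends (D applied to a linear sequence is zero), smooth data passes through unchanged while noise is damped, which mirrors the abstract's point that the regularisation acts like the penalty term of smoothing splines.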

  15. A novel gridding algorithm to create regional trace gas maps from satellite observations

    Directory of Open Access Journals (Sweden)

    G. Kuhlmann

    2014-02-01

    Full Text Available The recent increase in spatial resolution for satellite instruments has made it feasible to study distributions of trace gas column densities on a regional scale. For this application a new gridding algorithm was developed to map measurements from the instrument's frame of reference (level 2) onto a longitude–latitude grid (level 3). The algorithm is designed for the Ozone Monitoring Instrument (OMI) and can easily be employed for similar instruments – for example, the upcoming TROPOspheric Monitoring Instrument (TROPOMI). Trace gas distributions are reconstructed by a continuous parabolic spline surface. The algorithm explicitly considers the spatially varying sensitivity of the sensor resulting from the instrument function. At the swath edge, the inverse problem of computing the spline coefficients is very sensitive to measurement errors and is regularised by a second-order difference matrix. Since this regularisation corresponds to the penalty term for smoothing splines, it similarly attenuates the effect of measurement noise over the entire swath width. Monte Carlo simulations are conducted to study the performance of the algorithm for different distributions of trace gas column densities. The optimal weight of the penalty term is found to be proportional to the measurement uncertainty and the width of the instrument function. A comparison with an established gridding algorithm shows improved performance for small to moderate measurement errors due to better parametrisation of the distribution. The resulting maps are smoother and extreme values are more accurately reconstructed. The performance improvement is further illustrated with high-resolution distributions obtained from a regional chemistry model. The new algorithm is applied to tropospheric NO2 column densities measured by OMI. Examples of regional NO2 maps are shown for densely populated areas in China, Europe and the United States of America. This work demonstrates that the newly

  16. Satellites

    International Nuclear Information System (INIS)

    Burns, J.A.; Matthews, M.S.

    1986-01-01

    The present work is based on a conference: Natural Satellites, Colloquium 77 of the IAU, held at Cornell University from July 5 to 9, 1983. Attention is given to the background and origins of satellites, protosatellite swarms, the tectonics of icy satellites, the physical characteristics of satellite surfaces, and the interactions of planetary magnetospheres with icy satellite surfaces. Other topics include the surface composition of natural satellites, the cratering of planetary satellites, the moon, Io, and Europa. Consideration is also given to Ganymede and Callisto, the satellites of Saturn, small satellites, satellites of Uranus and Neptune, and the Pluto-Charon system

  17. Schedule Optimization of Imaging Missions for Multiple Satellites and Ground Stations Using Genetic Algorithm

    Science.gov (United States)

    Lee, Junghyun; Kim, Heewon; Chung, Hyun; Kim, Haedong; Choi, Sujin; Jung, Okchul; Chung, Daewon; Ko, Kwanghee

    2018-04-01

    In this paper, we propose a method that uses a genetic algorithm for the dynamic schedule optimization of imaging missions for multiple satellites and ground systems. In particular, the visibility conflicts of communication and mission operation using satellite resources (electric power and onboard memory) are integrated in sequence. Resource consumption and restoration are considered in the optimization process. Image acquisition is an essential part of satellite missions and is performed via a series of subtasks such as command uplink, image capturing, image storing, and image downlink. An objective function for optimization is designed to maximize the usability by considering the following components: user-assigned priority, resource consumption, and image-acquisition time. For the simulation, a series of hypothetical imaging missions are allocated to a multi-satellite control system comprising five satellites and three ground stations having S- and X-band antennas. To demonstrate the performance of the proposed method, simulations are performed via three operation modes: general, commercial, and tactical.
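
    An objective of the kind described, combining user-assigned priority, resource consumption and image-acquisition time, might be scored as below; the weights and normalisations are assumptions for illustration, not values from the paper:

    ```python
    # Weighted usability score for one imaging task. Priority is assumed
    # already scaled to [0, 1]; resource use and acquisition time are
    # inverted so that lower consumption and earlier imaging score higher.
    def usability(priority, resource_used, acquisition_time,
                  w_p=0.5, w_r=0.3, w_t=0.2,
                  max_resource=100.0, max_time=600.0):
        return (w_p * priority
                + w_r * (1.0 - resource_used / max_resource)
                + w_t * (1.0 - acquisition_time / max_time))
    ```

    A genetic algorithm of the kind the paper applies would then maximise the sum of such scores over all scheduled tasks, subject to the visibility and resource constraints described above.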

  18. Design of an Image Motion Compensation (IMC) Algorithm for Image Registration of the Communication, Ocean, Meteorological Satellite (COMS-1)

    Directory of Open Access Journals (Sweden)

    Taek Seo Jung

    2006-03-01

    Full Text Available This paper presents an Image Motion Compensation (IMC) algorithm for Korea's Communication, Ocean, and Meteorological Satellite (COMS-1). An IMC algorithm is a priority component of image registration in the Image Navigation and Registration (INR) system, used to locate and register radiometric image data. Due to various perturbations, a satellite has orbit and attitude errors with respect to a reference motion. These errors cause depointing of the imager aiming direction and, in consequence, image distortions. To correct the depointing of the imager aiming direction, a compensation algorithm is designed by adapting equations different from those used for the GOES satellites. The capability of the algorithm is compared with that of the existing algorithm applied to the GOES INR system. The algorithm developed in this paper improves pointing accuracy by 40% and efficiently compensates for depointings of the imager aiming direction.

  19. Shadow Detection from Very High Resolution Satellite Image Using Grabcut Segmentation and Ratio-Band Algorithms

    Science.gov (United States)

    Kadhim, N. M. S. M.; Mourshed, M.; Bray, M. T.

    2015-03-01

    Very-High-Resolution (VHR) satellite imagery is a powerful source of data for detecting and extracting information about urban constructions. Shadow in VHR satellite imageries provides vital information on urban construction forms, illumination direction, and the spatial distribution of the objects, which can help to further understanding of the built environment. However, to extract shadows, the automated detection of shadows from images must be accurate. This paper reviews current automatic approaches that have been used for shadow detection from VHR satellite images and comprises two main parts. In the first part, shadow concepts are presented in terms of shadow appearance in VHR satellite imageries, current shadow detection methods, and the usefulness of shadow detection in urban environments. In the second part, we adopted two approaches which are considered current state-of-the-art shadow detection and segmentation algorithms, using WorldView-3 and Quickbird images. In the first approach, the ratios between the NIR and visible bands were computed on a pixel-by-pixel basis, which allows for disambiguation between shadows and dark objects. To obtain an accurate shadow candidate map, we further refine the shadow map after applying the ratio algorithm on the Quickbird image. The second selected approach is the GrabCut segmentation approach, whose performance in detecting the shadow regions of urban objects is examined using the true colour image from WorldView-3. Further refinement was applied to attain a segmented shadow map. Although the detection of shadow regions is a very difficult task when they are derived from a VHR satellite image that comprises a visible spectrum range (RGB true colour), the results demonstrate that the detection of shadow regions in the WorldView-3 image is a reasonable separation from other objects by applying the GrabCut algorithm. In addition, the derived shadow map from the Quickbird image indicates significant performance of
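
    A pixel-wise ratio test of the general kind described for separating shadow from dark objects can be sketched as follows. The particular formulation (a visible/NIR ratio with an additive offset and a fixed threshold) and the threshold value are assumptions for illustration; the paper's ratio-band algorithm and its refinement step are not reproduced here:

    ```python
    # Shadows depress the NIR band more strongly than most dark
    # materials do, so a high visible-to-NIR ratio flags shadow.
    # The +1 offsets avoid division by zero on very dark pixels.
    def shadow_mask(nir_band, green_band, threshold=1.5):
        mask = []
        for nir_row, green_row in zip(nir_band, green_band):
            mask.append([(green + 1.0) / (nir + 1.0) > threshold
                         for nir, green in zip(nir_row, green_row)])
        return mask
    ```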

  20. SHADOW DETECTION FROM VERY HIGH RESOLUTION SATELLITE IMAGE USING GRABCUT SEGMENTATION AND RATIO-BAND ALGORITHMS

    Directory of Open Access Journals (Sweden)

    N. M. S. M. Kadhim

    2015-03-01

    Full Text Available Very-High-Resolution (VHR) satellite imagery is a powerful source of data for detecting and extracting information about urban constructions. Shadow in VHR satellite imageries provides vital information on urban construction forms, illumination direction, and the spatial distribution of the objects, which can help to further understanding of the built environment. However, to extract shadows, the automated detection of shadows from images must be accurate. This paper reviews current automatic approaches that have been used for shadow detection from VHR satellite images and comprises two main parts. In the first part, shadow concepts are presented in terms of shadow appearance in VHR satellite imageries, current shadow detection methods, and the usefulness of shadow detection in urban environments. In the second part, we adopted two approaches which are considered current state-of-the-art shadow detection and segmentation algorithms, using WorldView-3 and Quickbird images. In the first approach, the ratios between the NIR and visible bands were computed on a pixel-by-pixel basis, which allows for disambiguation between shadows and dark objects. To obtain an accurate shadow candidate map, we further refine the shadow map after applying the ratio algorithm on the Quickbird image. The second selected approach is the GrabCut segmentation approach, whose performance in detecting the shadow regions of urban objects is examined using the true colour image from WorldView-3. Further refinement was applied to attain a segmented shadow map. Although the detection of shadow regions is a very difficult task when they are derived from a VHR satellite image that comprises a visible spectrum range (RGB true colour), the results demonstrate that the detection of shadow regions in the WorldView-3 image is a reasonable separation from other objects by applying the GrabCut algorithm. In addition, the derived shadow map from the Quickbird image indicates

  1. Optical Algorithms at Satellite Wavelengths for Total Suspended Matter in Tropical Coastal Waters.

    Science.gov (United States)

    Ouillon, Sylvain; Douillet, Pascal; Petrenko, Anne; Neveux, Jacques; Dupouy, Cécile; Froidefond, Jean-Marie; Andréfouët, Serge; Muñoz-Caravaca, Alain

    2008-07-10

    Is it possible to derive accurately Total Suspended Matter concentration or its proxy, turbidity, from remote sensing data in tropical coastal lagoon waters? To investigate this question, hyperspectral remote sensing reflectance, turbidity and chlorophyll pigment concentration were measured in three coral reef lagoons. The three sites enabled us to get data over very diverse environments: oligotrophic and sediment-poor waters in the southwest lagoon of New Caledonia, eutrophic waters in the Cienfuegos Bay (Cuba), and sediment-rich waters in the Laucala Bay (Fiji). In this paper, optical algorithms for turbidity are presented per site based on 113 stations in New Caledonia, 24 stations in Cuba and 56 stations in Fiji. Empirical algorithms are tested at satellite wavebands useful to coastal applications. Global algorithms are also derived for the merged data set (193 stations). The performances of global and local regression algorithms are compared. The best one-band algorithms on all the measurements are obtained at 681 nm using either a polynomial or a power model. The best two-band algorithms are obtained with R412/R620, R443/R670 and R510/R681. Two three-band algorithms based on Rrs620.Rrs681/Rrs412 and Rrs620.Rrs681/Rrs510 also give fair regression statistics. Finally, we propose a global algorithm based on one or three bands: turbidity is first calculated from Rrs681 and then, if < 1 FTU, it is recalculated using an algorithm based on Rrs620.Rrs681/Rrs412. On our data set, this algorithm is suitable for the 0.2-25 FTU turbidity range and for the three sites sampled (mean bias: 3.6%, rms: 35%, mean quadratic error: 1.4 FTU). This shows that defining global empirical turbidity algorithms in tropical coastal waters is within reach.

  2. A commercial space technology testbed on ISS

    Science.gov (United States)

    Boyle, David R.

    2000-01-01

    There is a significant and growing commercial market for new, more capable communications and remote sensing satellites. Competition in this market strongly motivates satellite manufacturers and spacecraft component developers to test and demonstrate new space hardware in a realistic environment. External attach points on the International Space Station allow it to function uniquely as a space technology testbed to satisfy this market need. However, space industry officials have identified three critical barriers to their commercial use of the ISS: unpredictable access, cost risk, and schedule uncertainty. Appropriate NASA policy initiatives and business/technical assistance for industry from the Commercial Space Center for Engineering can overcome these barriers.

  3. CDRD and PNPR satellite passive microwave precipitation retrieval algorithms: EuroTRMM/EURAINSAT origins and H-SAF operations

    Science.gov (United States)

    Mugnai, A.; Smith, E. A.; Tripoli, G. J.; Bizzarri, B.; Casella, D.; Dietrich, S.; Di Paola, F.; Panegrossi, G.; Sanò, P.

    2013-04-01

    Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF) is a EUMETSAT (European Organisation for the Exploitation of Meteorological Satellites) program, designed to deliver satellite products of hydrological interest (precipitation, soil moisture and snow parameters) over the European and Mediterranean region to research and operations users worldwide. Six satellite precipitation algorithms and concomitant precipitation products are the responsibility of various agencies in Italy. Two of these algorithms have been designed for maximum accuracy by restricting their inputs to measurements from conical and cross-track scanning passive microwave (PMW) radiometers mounted on various low Earth orbiting satellites. They have been developed at the Italian National Research Council/Institute of Atmospheric Sciences and Climate in Rome (CNR/ISAC-Rome), and are providing operational retrievals of surface rain rate and its phase properties. Each of these algorithms is physically based, however, the first of these, referred to as the Cloud Dynamics and Radiation Database (CDRD) algorithm, uses a Bayesian-based solution solver, while the second, referred to as the PMW Neural-net Precipitation Retrieval (PNPR) algorithm, uses a neural network-based solution solver. Herein we first provide an overview of the two initial EU research and applications programs that motivated their initial development, EuroTRMM and EURAINSAT (European Satellite Rainfall Analysis and Monitoring at the Geostationary Scale), and the current H-SAF program that provides the framework for their operational use and continued development. We stress the relevance of the CDRD and PNPR algorithms and their precipitation products in helping secure the goals of H-SAF's scientific and operations agenda, the former helpful as a secondary calibration reference to other algorithms in H-SAF's complete mix of algorithms. Descriptions of the algorithms' designs are provided
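
    The Bayesian solution solver the CDRD algorithm uses can be illustrated in miniature: weight each database candidate by how well its simulated observation matches the measured one, then form the weighted average. The Gaussian error model, the fixed sigma, and the single brightness-temperature channel are simplifying assumptions; the real database holds full multi-channel profiles with dynamical constraints:

    ```python
    import math

    # Toy Bayesian database retrieval: database is a list of
    # (simulated_brightness_temperature, rain_rate) pairs; the estimate
    # is the rain-rate average weighted by the observation likelihood.
    def bayes_retrieve(observed_tb, database, sigma=2.0):
        weights = [math.exp(-0.5 * ((tb - observed_tb) / sigma) ** 2)
                   for tb, _ in database]
        total = sum(weights)
        return sum(w * rr for w, (_, rr) in zip(weights, database)) / total
    ```

    The PNPR algorithm replaces this averaging step with a neural network trained on the same kind of database, which is the key design difference the abstract draws between the two.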

  4. CDRD and PNPR satellite passive microwave precipitation retrieval algorithms: EuroTRMM/EURAINSAT origins and H-SAF operations

    Directory of Open Access Journals (Sweden)

    A. Mugnai

    2013-04-01

    Full Text Available Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF) is a EUMETSAT (European Organisation for the Exploitation of Meteorological Satellites) program, designed to deliver satellite products of hydrological interest (precipitation, soil moisture and snow parameters) over the European and Mediterranean region to research and operations users worldwide. Six satellite precipitation algorithms and concomitant precipitation products are the responsibility of various agencies in Italy. Two of these algorithms have been designed for maximum accuracy by restricting their inputs to measurements from conical and cross-track scanning passive microwave (PMW) radiometers mounted on various low Earth orbiting satellites. They have been developed at the Italian National Research Council/Institute of Atmospheric Sciences and Climate in Rome (CNR/ISAC-Rome), and are providing operational retrievals of surface rain rate and its phase properties. Each of these algorithms is physically based, however, the first of these, referred to as the Cloud Dynamics and Radiation Database (CDRD) algorithm, uses a Bayesian-based solution solver, while the second, referred to as the PMW Neural-net Precipitation Retrieval (PNPR) algorithm, uses a neural network-based solution solver. Herein we first provide an overview of the two initial EU research and applications programs that motivated their initial development, EuroTRMM and EURAINSAT (European Satellite Rainfall Analysis and Monitoring at the Geostationary Scale), and the current H-SAF program that provides the framework for their operational use and continued development. We stress the relevance of the CDRD and PNPR algorithms and their precipitation products in helping secure the goals of H-SAF's scientific and operations agenda, the former helpful as a secondary calibration reference to other algorithms in H-SAF's complete mix of algorithms. Descriptions of the algorithms' designs are

  5. Code Tracking Algorithms for Mitigating Multipath Effects in Fading Channels for Satellite-Based Positioning

    Directory of Open Access Journals (Sweden)

    Markku Renfors

    2007-12-01

    Full Text Available The ever-increasing public interest in location and positioning services has originated a demand for higher-performance global navigation satellite systems (GNSSs). In order to achieve this incremental performance, the estimation of the line-of-sight (LOS) delay with high accuracy is a prerequisite for all GNSSs. The delay lock loops (DLLs) and their enhanced variants (i.e., feedback code tracking loops) are the structures of choice for commercial GNSS receivers, but their performance in severe multipath scenarios is still rather limited. In addition, the new satellite positioning system proposals specify the use of a new modulation, the binary offset carrier (BOC) modulation, which triggers a new challenge in the code tracking stage. Therefore, in order to meet this emerging challenge and to improve the accuracy of the delay estimation in severe multipath scenarios, this paper analyzes feedback as well as feedforward code tracking algorithms and proposes the peak tracking (PT) methods, which are combinations of both feedback and feedforward structures and utilize the inherent advantages of both. We propose and analyze two variants of the PT algorithm: PT with second-order differentiation (Diff2) and PT with the Teager-Kaiser (TK) operator, denoted herein as PT(Diff2) and PT(TK), respectively. In addition to the PT methods, the authors also propose an improved early-late-slope (IELS) multipath elimination technique which is shown to provide very good mean-time-to-lose-lock (MTLL) performance. An implementation of a noncoherent multipath estimating delay locked loop (MEDLL) structure is also presented. We also incorporate an extensive review of the existing feedback and feedforward delay estimation algorithms for direct sequence code division multiple access (DS-CDMA) signals in satellite fading channels, taking into account the impact of binary phase shift keying (BPSK) as well as the newly proposed BOC modulation
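
    The feedback (DLL) side of this family reduces, in the ideal noise-free case, to an early-late discriminator iterated on the code autocorrelation. The triangular correlation shape, correlator spacing and loop gain below are illustrative; a real receiver forms the correlations from the received signal and must handle the multipath and BOC ambiguities the paper addresses:

    ```python
    # Early-late delay discriminator of the DLL family, iterated on an
    # idealised triangular BPSK code autocorrelation.
    def triangle_corr(tau):
        # Peak of 1 at zero delay error, zero beyond one chip.
        return max(0.0, 1.0 - abs(tau))

    def dll_track(initial_error, spacing=0.5, gain=0.5, steps=30):
        err = initial_error                    # delay-estimate error in chips
        for _ in range(steps):
            early = triangle_corr(err - spacing / 2)
            late = triangle_corr(err + spacing / 2)
            err -= gain * (early - late)       # drive the error toward zero
        return err
    ```

    Multipath distorts the correlation peak so that the early-late balance point no longer sits at the true LOS delay, which is exactly the limitation motivating the feedforward and peak-tracking methods above.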

  6. Impact of Missing Passive Microwave Sensors on Multi-Satellite Precipitation Retrieval Algorithm

    Directory of Open Access Journals (Sweden)

    Bin Yong

    2015-01-01

    Full Text Available The impact of one or two missing passive microwave (PMW) input sensors on the end product of multi-satellite precipitation products is an interesting but obscure issue for both algorithm developers and data users. On 28 January 2013, the Version-7 TRMM Multi-satellite Precipitation Analysis (TMPA) products were reproduced and re-released by the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center because the Advanced Microwave Sounding Unit-B (AMSU-B) and the Special Sensor Microwave Imager-Sounder-F16 (SSMIS-F16) input data were unintentionally disregarded in the prior retrieval. Thus, this study investigates the sensitivity of TMPA algorithm results to missing PMW sensors by intercomparing the “early” and “late” Version-7 TMPA real-time (TMPA-RT) precipitation estimates (i.e., without and with the AMSU-B and SSMIS-F16 sensors) with an independent high-density gauge network of 200 tipping-bucket rain gauges over the Chinese Jinghe river basin (45,421 km2). The retrieval counts and retrieval frequency of various PMW and infrared (IR) sensors incorporated into the TMPA system were also analyzed to identify and diagnose the impacts of sensor availability on the TMPA-RT retrieval accuracy. Results show that the incorporation of AMSU-B and SSMIS-F16 has substantially reduced systematic errors. The improvement exhibits rather strong seasonal and topographic dependencies. Our analyses suggest that one or two single PMW sensors might play a key role in affecting the end product of current combined microwave-infrared precipitation estimates. This finding supports algorithm developers’ current endeavor in spatiotemporally incorporating as many PMW sensors as possible in the multi-satellite precipitation retrieval system called the Integrated Multi-satellitE Retrievals for Global Precipitation Measurement mission (IMERG). This study also recommends users of satellite precipitation products to switch to the newest Version-7 TMPA datasets and

  7. Algorithmic Foundation of Spectral Rarefaction for Measuring Satellite Imagery Heterogeneity at Multiple Spatial Scales

    Science.gov (United States)

    Rocchini, Duccio

    2009-01-01

    Measuring the heterogeneity of satellite imagery is an important task. Most measures of spectral diversity have been based on Shannon information theory. However, this approach does not inherently address different scales, ranging from local (hereafter referred to as alpha diversity) to global scales (gamma diversity). The aim of this paper is to propose a method for measuring spectral heterogeneity at multiple scales based on rarefaction curves. An algorithmic solution of rarefaction applied to image pixel values (Digital Numbers, DNs) is provided and discussed. PMID:22389600
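A rarefaction curve over pixel Digital Numbers can be sketched as below; this is a minimal illustration of the idea (the function and parameter names are ours, not the paper's): subsample the image at increasing sizes and record the mean number of distinct DNs observed.

```python
import numpy as np

def spectral_rarefaction(dns, n_points=5, n_rep=50, seed=None):
    """Mean number of distinct Digital Numbers (DNs) found in random
    subsamples of increasing size, i.e. a spectral rarefaction curve."""
    rng = np.random.default_rng(seed)
    dns = np.asarray(dns).ravel()
    sizes = np.linspace(1, dns.size, n_points, dtype=int)
    curve = []
    for n in sizes:
        richness = [np.unique(rng.choice(dns, size=n, replace=False)).size
                    for _ in range(n_rep)]
        curve.append(float(np.mean(richness)))
    return sizes, np.array(curve)
```

The curve's plateau height plays the role of gamma diversity, while its initial rise reflects local (alpha) diversity.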

  8. Orbit computation of the TELECOM-2D satellite with a Genetic Algorithm

    Science.gov (United States)

    Deleflie, Florent; Coulot, David; Vienne, Alain; Decosta, Romain; Richard, Pascal; Lasri, Mohammed Amjad

    2014-07-01

    In order to test a preliminary orbit determination method, we fit an orbit of the geostationary satellite TELECOM-2D as if we did not know any a priori information on its trajectory. The method is based on a genetic algorithm coupled with an analytical propagator of the trajectory, applied over a couple of days to a set of altazimuthal data acquired by the tracking network made up of the two TAROT telescopes. The adjusted orbit is then compared to a numerical reference. The method is described, and the results are analyzed, as a step towards an operational method of preliminary orbit determination for uncatalogued objects.
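The genetic-algorithm fitting loop can be sketched generically as below (a real-coded GA minimising a residual function; all names and GA settings are illustrative assumptions, and the actual method couples this loop to an analytical propagator scoring altazimuthal residuals):

```python
import numpy as np

def genetic_fit(residual_fn, bounds, pop=60, gens=100, seed=None):
    """Minimal real-coded genetic algorithm: candidates are scored by
    residual_fn (e.g. RMS of predicted-minus-observed altazimuthal
    angles), the best half breeds by blend crossover, and Gaussian
    mutation keeps the population exploring."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    P = rng.uniform(lo, hi, size=(pop, lo.size))
    for _ in range(gens):
        scores = np.array([residual_fn(p) for p in P])
        elite = P[np.argsort(scores)[: pop // 2]]
        parents = elite[rng.integers(0, len(elite), size=(pop, 2))]
        mix = rng.random((pop, lo.size))
        P = parents[:, 0] * mix + parents[:, 1] * (1.0 - mix)
        P = np.clip(P + rng.normal(0.0, 0.02 * (hi - lo), P.shape), lo, hi)
    scores = np.array([residual_fn(p) for p in P])
    return P[np.argmin(scores)]
```

For an orbit fit, `bounds` would span the orbital elements and `residual_fn` would propagate each candidate and compare against the TAROT observations.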

  9. Incorporating Satellite Precipitation Estimates into a Radar-Gauge Multi-Sensor Precipitation Estimation Algorithm

    Directory of Open Access Journals (Sweden)

    Yuxiang He

    2018-01-01

    Full Text Available This paper presents a new and enhanced fusion module for the Multi-Sensor Precipitation Estimator (MPE) that objectively blends real-time satellite quantitative precipitation estimates (SQPE) with radar and gauge estimates. This module consists of a preprocessor that mitigates systematic bias in SQPE, and a two-way blending routine that statistically fuses adjusted SQPE with radar estimates. The preprocessor not only corrects systematic bias in SQPE, but also improves the spatial distribution of precipitation based on SQPE so that it closely resembles that of radar-based observations. The module uses a more sophisticated radar-satellite merging technique to blend preprocessed datasets, and provides a better overall QPE product. The performance of the new satellite-radar-gauge blending module is assessed using independent rain gauge data over a five-year period between 2003 and 2007, and the assessment evaluates the accuracy of the newly developed satellite-radar-gauge (SRG) blended products versus that of radar-gauge products (which represent the MPE algorithm currently used in National Weather Service (NWS) operations) over two regions: (I) inside radar effective coverage and (II) immediately outside radar coverage. The outcomes of the evaluation indicate that (a) ingest of SQPE over areas within effective radar coverage improves the quality of QPE by mitigating the errors in radar estimates in region I; and (b) blending of radar, gauge, and satellite estimates over region II leads to a reduction of errors relative to bias-corrected SQPE. In addition, the new module alleviates the discontinuities along the boundaries of radar effective coverage otherwise seen when SQPE is used directly to fill the areas outside of effective radar coverage.

  10. Development of a computationally efficient algorithm for attitude estimation of a remote sensing satellite

    Science.gov (United States)

    Labibian, Amir; Bahrami, Amir Hossein; Haghshenas, Javad

    2017-09-01

    This paper presents a computationally efficient algorithm for attitude estimation of a remote sensing satellite. In this study, a gyro, magnetometer, sun sensor, and star tracker are used in an Extended Kalman Filter (EKF) structure for the purpose of Attitude Determination (AD). However, utilizing all of the measurement data simultaneously in the EKF structure increases the computational burden. Specifically, assuming n observation vectors, an inverse of a 3n×3n matrix is required for gain calculation. In order to solve this problem, an efficient version of the EKF, namely Murrell's version, is employed. This method utilizes the measurements separately at each sampling time for gain computation. Therefore, the inverse of a 3n×3n matrix is replaced by an inverse of a 3×3 matrix for each measurement vector. Moreover, gyro drift over time can reduce the pointing accuracy. Therefore, a calibration algorithm is utilized for estimation of the main gyro parameters.
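The gain simplification can be illustrated for the linear case: instead of stacking all observations and inverting one large innovation matrix, each sensor's measurement is folded in sequentially with its own small inverse. This sketch is our illustration, not the paper's code:

```python
import numpy as np

def sequential_update(x, P, measurements):
    """Murrell-style sequential measurement update (linear case): each
    sensor's (z, H, R) triple is processed on its own, so the gain needs
    only one small inverse per sensor instead of a single 3n x 3n inverse."""
    I = np.eye(P.shape[0])
    for z, H, R in measurements:
        S = H @ P @ H.T + R             # small innovation covariance
        K = P @ H.T @ np.linalg.inv(S)  # Kalman gain from a small inverse
        x = x + K @ (z - H @ x)
        P = (I - K @ H) @ P
    return x, P
```

Processing the measurements one at a time yields the same posterior as the stacked update when the measurement noises are uncorrelated across sensors.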

  11. Target Matching Recognition for Satellite Images Based on the Improved FREAK Algorithm

    Directory of Open Access Journals (Sweden)

    Yantong Chen

    2016-01-01

    Full Text Available Satellite remote sensing image target matching recognition exhibits poor robustness and accuracy because of unsuitable feature extractors and large data quantities. To address this problem, we propose a new feature extraction algorithm for fast target matching recognition that comprises an improved features from accelerated segment test (FAST) feature detector and a binary fast retina keypoint (FREAK) feature descriptor. To improve robustness, we extend the FAST feature detector by applying scale space theory and then transform the feature vector acquired by the FREAK descriptor from decimal into binary. Working in the binary space reduces the quantity of data to be handled and improves matching accuracy. Simulation test results show that our algorithm outperforms other relevant methods in terms of robustness and accuracy.

  12. Algorithm to retrieve the melt pond fraction and the spectral albedo of Arctic summer ice from satellite optical data

    OpenAIRE

    Zege, E.; Malinka, A.; Katsev, I.; Prikhach, A.; Heygster, Georg; Istomina, L.; Birnbaum, Gerit; Schwarz, Pascal

    2015-01-01

    A new algorithm to retrieve characteristics (albedo and melt pond fraction) of summer ice in the Arctic from optical satellite data is described. In contrast to other algorithms, this algorithm does not use a priori values of the spectral albedo of the sea-ice constituents (such as melt ponds, white ice, etc.). Instead, it is based on an analytical solution for the reflection from the sea ice surface. The algorithm includes the correction of the sought-for ice and ponds characteristics with...

  13. Estimation of the soil temperature from the AVHRR-NOAA satellite data applying split window algorithms

    International Nuclear Information System (INIS)

    Parra, J.C.; Acevedo, P.S.; Sobrino, J.A.; Morales, L.J.

    2006-01-01

    Four algorithms based on the split-window technique are applied to estimate land surface temperature from data provided by the Advanced Very High Resolution Radiometer (AVHRR) sensor, on board the series of satellites of the National Oceanic and Atmospheric Administration (NOAA). These algorithms include corrections for atmospheric characteristics and for the emissivity of the different land surfaces. Fourteen AVHRR-NOAA images corresponding to October 2003 and January 2004 were used. Simultaneously, soil temperature measurements were collected at the Carillanca hydro-meteorological station in the Region of La Araucanía, Chile (38 deg 41 min S; 72 deg 25 min W). Of all the algorithms used, the best results correspond to the model proposed by Sobrino and Raissouni (2000), with the mean and standard deviation of the difference between the soil temperature measured in situ and the algorithm estimate being -0.06 and 2.11 K, respectively. (Author)
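A generic split-window form looks like the following; the coefficients below are illustrative placeholders (assumptions on our part), not the published values of any of the four compared models:

```python
def split_window_lst(t4, t5, eps, a0=0.27, a1=1.06, a2=0.28, b=54.0):
    """Generic split-window form: the 11-um brightness temperature t4 is
    corrected with the 11-12 um brightness temperature difference (t4 - t5),
    which carries the atmospheric water-vapour signal, and with the surface
    emissivity eps. Coefficients here are illustrative placeholders."""
    dt = t4 - t5
    return t4 + a1 * dt + a2 * dt ** 2 + a0 + b * (1.0 - eps)
```

Published algorithms of this family differ mainly in how the coefficients are regressed and whether water-vapour and emissivity-difference terms are added.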

  14. Clustering of tethered satellite system simulation data by an adaptive neuro-fuzzy algorithm

    Science.gov (United States)

    Mitra, Sunanda; Pemmaraju, Surya

    1992-01-01

    Recent developments in neuro-fuzzy systems indicate that the concepts of adaptive pattern recognition, when used to identify appropriate control actions corresponding to clusters of patterns representing system states in dynamic nonlinear control systems, may result in innovative designs. A modular, unsupervised neural network architecture, in which fuzzy learning rules have been embedded, is used for on-line identification of similar states. The architecture and control rules involved in Adaptive Fuzzy Leader Clustering (AFLC) allow this system to be incorporated in control systems for identification of system states corresponding to specific control actions. We have used this algorithm to cluster the simulation data of the Tethered Satellite System (TSS) to estimate the range of delta voltages necessary to maintain the desired length rate of the tether. The AFLC algorithm is capable of on-line estimation of the appropriate control voltages from the corresponding length error and length rate error without a priori knowledge of their membership functions or familiarity with the behavior of the Tethered Satellite System.
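The leader-clustering core of such an algorithm (without the fuzzy membership machinery) can be sketched as follows; the names, threshold, and learning rate are our own illustration, not AFLC itself:

```python
import numpy as np

def leader_cluster(samples, tau, alpha=0.1):
    """On-line leader clustering in the spirit of AFLC: each sample joins
    the nearest existing prototype if closer than tau (and the prototype
    is nudged toward it); otherwise it seeds a new cluster."""
    prototypes, labels = [], []
    for x in samples:
        x = np.asarray(x, float)
        if prototypes:
            d = [np.linalg.norm(x - p) for p in prototypes]
            k = int(np.argmin(d))
            if d[k] < tau:
                prototypes[k] = prototypes[k] + alpha * (x - prototypes[k])
                labels.append(k)
                continue
        prototypes.append(x)
        labels.append(len(prototypes) - 1)
    return np.array(prototypes), np.array(labels)
```

In the TSS application, each prototype would correspond to a system state mapped to a range of control voltages.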

  15. TRMM Satellite Algorithm Estimates to Represent the Spatial Distribution of Rainstorms

    Directory of Open Access Journals (Sweden)

    Patrick Marina

    2017-01-01

    Full Text Available On-site measurements from rain gauges provide important information for the design, construction, and operation of water resources engineering projects, groundwater potentials, and water supply and irrigation systems. A dense gauging network is needed to accurately characterize the variation of rainfall over a region, which is infeasible where networks are limited, as in Sarawak, Malaysia. Hence, satellite-based algorithm estimates are introduced as an innovative solution to these challenges. With dataset retrievals accessible from public-domain websites, satellite estimation has become a useful means of measuring rainfall over a wider coverage area at finer temporal resolution. This paper aims to investigate the rainfall estimates prepared by the Tropical Rainfall Measuring Mission (TRMM) to determine whether they are suitable to represent the distribution of extreme rainfall in the Sungai Sarawak Basin. Based on the findings, more uniform correlations for the investigated storms can be observed for low to medium altitudes (>40 MASL). For the investigated events of Jan 05-11, 2009, the normalized root mean square error was NRMSE = 36.7 % with good correlation (CC = 0.9). These findings suggest that satellite algorithm estimates from TRMM are suitable to represent the spatial distribution of extreme rainfall.
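The two scores reported above are straightforward to reproduce for a gauge/satellite series pair; a minimal sketch (normalising RMSE by the gauge mean, which is one common convention and an assumption on our part):

```python
import numpy as np

def nrmse_and_cc(gauge, satellite):
    """NRMSE (as percent of the gauge mean) and Pearson correlation
    between gauge rainfall and a satellite estimate."""
    gauge = np.asarray(gauge, float)
    satellite = np.asarray(satellite, float)
    rmse = np.sqrt(np.mean((satellite - gauge) ** 2))
    nrmse = 100.0 * rmse / gauge.mean()
    cc = np.corrcoef(gauge, satellite)[0, 1]
    return nrmse, cc
```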

  16. Mapping Global Ocean Surface Albedo from Satellite Observations: Models, Algorithms, and Datasets

    Science.gov (United States)

    Li, X.; Fan, X.; Yan, H.; Li, A.; Wang, M.; Qu, Y.

    2018-04-01

    Ocean surface albedo (OSA) is one of the important parameters in the surface radiation budget (SRB). It is usually considered a controlling factor of the heat exchange between the atmosphere and ocean. The temporal and spatial dynamics of OSA determine the energy absorption of upper-level ocean water, and influence oceanic currents, atmospheric circulations, and the transportation of material and energy in the hydrosphere. Therefore, various parameterizations and models have been developed for describing the dynamics of OSA. However, it has been demonstrated that the currently available OSA datasets cannot fulfill the requirements of global climate change studies. In this study, we present a literature review on mapping global OSA from satellite observations. The models (parameterizations, the coupled ocean-atmosphere radiative transfer (COART), and the three-component ocean water albedo (TCOWA)), algorithms (the estimation method based on reanalysis data, and the direct-estimation algorithm), and datasets (the cloud, albedo and radiation (CLARA) surface albedo product, the dataset derived by the TCOWA model, and the global land surface satellite (GLASS) phase-2 surface broadband albedo product) of OSA are discussed separately.

  17. Performance Evaluation of Machine Learning Algorithms for Urban Pattern Recognition from Multi-spectral Satellite Images

    Directory of Open Access Journals (Sweden)

    Marc Wieland

    2014-03-01

    Full Text Available In this study, a classification and performance evaluation framework for the recognition of urban patterns in medium resolution (Landsat ETM, TM, and MSS) and very high resolution (WorldView-2, Quickbird, Ikonos) multi-spectral satellite images is presented. The study aims at exploring the potential of machine learning algorithms in the context of an object-based image analysis and at thoroughly testing the algorithms' performance under varying conditions to optimize their usage for urban pattern recognition tasks. Four classification algorithms, Normal Bayes, K Nearest Neighbors, Random Trees, and Support Vector Machines, which represent different concepts in machine learning (probabilistic, nearest neighbor, tree-based, and function-based), have been selected and implemented on a free and open-source basis. Particular focus is given to assessing the generalization ability of machine learning algorithms and the transferability of trained learning machines between different image types and image scenes. Moreover, the influence of the number and choice of training data, the influence of the size and composition of the feature vector, and the effect of image segmentation on the classification accuracy are evaluated.
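A comparable four-classifier comparison can be sketched with scikit-learn stand-ins on synthetic object features (the study's own open-source implementation differs; the dataset and settings below are assumptions for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Stand-in for per-object feature vectors extracted from multi-spectral imagery.
X, y = make_classification(n_samples=400, n_features=8, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

classifiers = {
    "Normal Bayes": GaussianNB(),
    "K Nearest Neighbors": KNeighborsClassifier(),
    "Random Trees": RandomForestClassifier(random_state=0),
    "Support Vector Machines": SVC(),
}
for name, clf in classifiers.items():
    print(name, round(clf.fit(X_tr, y_tr).score(X_te, y_te), 3))
```

Transferability tests of the kind described would train on one scene's objects and score on another's, rather than on a random split.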

  18. Geostationary Communications Satellites as Sensors for the Space Weather Environment: Telemetry Event Identification Algorithms

    Science.gov (United States)

    Carlton, A.; Cahoy, K.

    2015-12-01

    Reliability of geostationary communication satellites (GEO ComSats) is critical to many industries worldwide. The space radiation environment poses a significant threat, and manufacturers and operators expend considerable effort to maintain reliability for users. Knowledge of the space radiation environment at the orbital location of a satellite is of critical importance for diagnosing and resolving issues resulting from space weather, for optimizing cost and reliability, and for space situational awareness. For decades, operators and manufacturers have collected large amounts of telemetry from geostationary (GEO) communications satellites to monitor system health and performance, yet these data are rarely mined for scientific purposes. The goal of this work is to acquire and analyze archived data from commercial operators using new algorithms that can detect when a space weather (or non-space weather) event of interest has occurred or is in progress. We have developed algorithms, collectively called SEER (System Event Evaluation Routine), to statistically analyze power amplifier current and temperature telemetry by identifying deviations from nominal operations or other events and trends of interest. This paper focuses on our work in progress, which currently includes methods for detection of jumps ("spikes", outliers) and step changes (changes in the local mean) in the telemetry. We then examine available space weather data from the NOAA GOES satellites and the NOAA-computed Kp index and sunspot numbers to see what role, if any, space weather might have played. By combining the results of the algorithm for many components, the spacecraft can be used as a "sensor" for the space radiation environment. Similar events occurring at one time across many component telemetry streams may be indicative of a space radiation event or system-wide health and safety concern. Using SEER on representative datasets of telemetry from Inmarsat and Intelsat, we find events that occur across all or many of
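The two detectors described (jumps, i.e. point outliers, and step changes in the local mean) can be sketched as follows; this is our illustration of the idea, not SEER itself, and the thresholds and window sizes are assumptions:

```python
import numpy as np

def detect_events(x, jump_sigma=4.0, step_win=50, step_sigma=4.0):
    """Flag jumps (point outliers) via a robust z-score against the
    median/MAD, and step changes via the difference of means between
    adjacent sliding windows."""
    x = np.asarray(x, float)
    med = np.median(x)
    mad = np.median(np.abs(x - med)) or 1e-9
    jumps = np.where(np.abs(x - med) / (1.4826 * mad) > jump_sigma)[0]

    steps = []
    for i in range(step_win, x.size - step_win):
        left, right = x[i - step_win:i], x[i:i + step_win]
        pooled = np.sqrt((left.var() + right.var()) / 2.0) or 1e-9
        if abs(right.mean() - left.mean()) / pooled > step_sigma:
            steps.append(i)
    return jumps, np.array(steps, dtype=int)
```

Cross-referencing flagged indices across many telemetry channels is what lets the spacecraft act as a "sensor": simultaneous flags suggest an environmental rather than component-level cause.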

  19. Advancements of in-flight mass moment of inertia and structural deflection algorithms for satellite attitude simulators

    Science.gov (United States)

    Wright, Jonathan W.

    Experimental satellite attitude simulators have long been used to test and analyze control algorithms in order to drive down risk before implementation on an operational satellite. Ideally, the dynamic response of a terrestrial-based experimental satellite attitude simulator would be similar to that of an on-orbit satellite. Unfortunately, gravitational disturbance torques and poorly characterized moments of inertia introduce uncertainty into the system dynamics, leading to questionable attitude control algorithm experimental results. This research consists of three distinct but related contributions to the field of developing robust satellite attitude simulators. In the first part of this research, existing approaches to estimate mass moments and products of inertia are evaluated, followed by the proposition and evaluation of a new approach that increases both the accuracy and precision of these estimates using typical on-board satellite sensors. Next, in order to better simulate the micro-torque environment of space, a new approach to mass balancing a satellite attitude simulator is presented, experimentally evaluated, and verified. Finally, in the third area of research, we capitalize on the platform improvements to analyze a control moment gyroscope (CMG) singularity avoidance steering law. Several successful experiments were conducted with the CMG array at near-singular configurations. An evaluation process was implemented to verify that the platform remained near the desired test momentum, showing that the first two components of this research were effective in allowing us to conduct singularity avoidance experiments in a representative space-like test environment.

  20. Improving the Regional Applicability of Satellite Precipitation Products by Ensemble Algorithm

    Directory of Open Access Journals (Sweden)

    Waseem Muhammad

    2018-04-01

    Full Text Available Satellite-based precipitation products (e.g., Integrated Multi-Satellite Retrievals for Global Precipitation Measurement (IMERG) and its predecessor, the Tropical Rainfall Measuring Mission (TRMM)) are a critical source of precipitation estimates, particularly for regions with sparse or no hydrometric networks. However, inconsistency in the performance of these products has been observed across climatically and topographically diverse regions, timescales, and precipitation intensities, and there is still room for improvement. Hence, using a projected ensemble algorithm, the regional precipitation estimate (RP) is introduced here. The RP concept is mainly based on regional performance weights derived from the Mean Square Error (MSE) and the precipitation estimates from the TRMM product, that is, TRMM 3B42 (TR), and the real-time (late, IT) and research (post-real-time, IR) products of IMERG. The overall results of the selected contingency table metrics (e.g., probability of detection (POD)) and statistical indices (e.g., correlation coefficient (CC)) indicated that the proposed RP product shows an overall better potential to capture the gauge observations compared with TR, IR, and IT in five different climatic regions of Pakistan from January 2015 to December 2016, at a diurnal time scale. The current study could be the first research providing preliminary feedback from Pakistan for global precipitation measurement researchers by highlighting the need for refinement in IMERG.
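The MSE-based weighting idea can be sketched as an inverse-MSE ensemble; the paper's exact formulation may differ, and the names below are illustrative:

```python
import numpy as np

def regional_ensemble(estimates, gauge):
    """Combine satellite products with weights inversely proportional to
    their regional Mean Square Error (MSE) against gauge observations."""
    gauge = np.asarray(gauge, float)
    inv_mse = {name: 1.0 / np.mean((np.asarray(est, float) - gauge) ** 2)
               for name, est in estimates.items()}
    total = sum(inv_mse.values())
    weights = {name: w / total for name, w in inv_mse.items()}
    rp = sum(weights[name] * np.asarray(estimates[name], float)
             for name in estimates)
    return rp, weights
```

In practice the weights would be derived per region (and possibly per season) from a calibration period, then applied to independent data.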

  1. Satellite-Derived Bathymetry: Accuracy Assessment on Depths Derivation Algorithm for Shallow Water Area

    Science.gov (United States)

    Said, N. M.; Mahmud, M. R.; Hasan, R. C.

    2017-10-01

    Over the years, the acquisition of bathymetric data has evolved from shipborne platforms to airborne and, presently, space-borne acquisition. The extensive development of remote sensing technology has brought a new revolution to hydrographic surveying. Satellite-Derived Bathymetry (SDB), a space-borne acquisition technique which derives bathymetric data from high-resolution multispectral satellite imagery for various purposes, has recently been considered a promising new technology in the hydrographic surveying industry. Inspired by these latest developments, a comprehensive study was initiated by the National Hydrographic Centre (NHC) and Universiti Teknologi Malaysia (UTM) to analyse SDB as a means of shallow water area acquisition. By adopting an additional adjustment in the calibration stage, a marginal improvement was discovered in the outcomes of both the Stumpf and Lyzenga algorithms, where the RMSE values for the derived (predicted) depths were 1.432 meters and 1.728 meters, respectively. This paper deliberates in detail on the findings of the study, especially the accuracy level and practicality of SDB in the tropical environmental setting of Malaysia.
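Of the two algorithms compared, the Stumpf log-ratio method is compact enough to sketch: depth is a linear function of the log-ratio of two bands, with m1 and m0 calibrated against known soundings and n a fixed constant keeping both logarithms positive. The plain linear fit shown for calibration is an assumption on our part:

```python
import numpy as np

def stumpf_ratio(r_blue, r_green, n=1000.0):
    """Log-ratio of blue and green water-leaving reflectances."""
    return np.log(n * np.asarray(r_blue)) / np.log(n * np.asarray(r_green))

def calibrate(r_blue, r_green, depths, n=1000.0):
    """Fit depth = m1 * ratio - m0 against known soundings."""
    m1, intercept = np.polyfit(stumpf_ratio(r_blue, r_green, n), depths, 1)
    return m1, -intercept

def derive_depth(r_blue, r_green, m1, m0, n=1000.0):
    return m1 * stumpf_ratio(r_blue, r_green, n) - m0
```

The ratio form makes the method relatively insensitive to bottom reflectance, which is one reason it is popular for shallow, clear tropical waters.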

  2. A Comprehensive Training Data Set for the Development of Satellite-Based Volcanic Ash Detection Algorithms

    Science.gov (United States)

    Schmidl, Marius

    2017-04-01

    We present a comprehensive training data set covering a large range of atmospheric conditions, including disperse volcanic ash and desert dust layers. These data sets contain all information required for the development of volcanic ash detection algorithms based on artificial neural networks, which are urgently needed since volcanic ash in the airspace is a major concern of aviation safety authorities. Selected parts of the data are used to train the volcanic ash detection algorithm VADUGS. They contain atmospheric and surface-related quantities as well as the corresponding simulated satellite data for the channels in the infrared spectral range of the SEVIRI instrument on board MSG-2. To get realistic results, ECMWF, IASI-based, and GEOS-Chem data are used to calculate all parameters describing the environment, whereas the software package libRadtran is used to perform radiative transfer simulations returning the brightness temperatures for each atmospheric state. As optical properties are a prerequisite for radiative simulations accounting for aerosol layers, the development also included the computation of optical properties for a set of different aerosol types from different sources. A description of the developed software and the methods used is given, along with an overview of the resulting data sets.

  3. Wireless Testbed Bonsai

    Science.gov (United States)

    2006-02-01

    wireless sensor device network, and a higher-tier multi-hop peer-to-peer 802.11b wireless network of about 200 Stargate nodes. Leading up to the full ExScal...deployment, we conducted spatial scaling tests on our higher-tier protocols on a 7 × 7 grid of Stargate nodes with 45m and 90m separations respectively...on W and its scaled version W̃. III. EXPERIMENTAL SETUP Description of Kansei testbed. A Stargate is a single-board Linux-based computer [7]. It uses a

  4. DEM GENERATION FROM HIGH RESOLUTION SATELLITE IMAGES THROUGH A NEW 3D LEAST SQUARES MATCHING ALGORITHM

    Directory of Open Access Journals (Sweden)

    T. Kim

    2012-09-01

    Full Text Available Automated generation of digital elevation models (DEMs) from high resolution satellite images (HRSIs) has been an active research topic for many years. However, stereo matching of HRSIs, in particular based on image-space search, is still difficult due to occlusions and building facades within them. Object-space matching schemes, proposed to overcome these problems, are often very time consuming and sensitive to the voxel dimensions. In this paper, we tried a new least squares matching (LSM) algorithm that works in a 3D object space. The algorithm starts with an initial height value at one location of the object space. From this 3D point, the left and right image points are projected. The true height is calculated by iterative least squares estimation based on the grey level differences between the left and right patches centred on the projected left and right points. We tested the 3D LSM on the WorldView images over 'Terrassa Sud' provided by the ISPRS WG I/4. We also compared the performance of the 3D LSM with correlation matching based on 2D image space and correlation matching based on 3D object space. The accuracy of the DEM from each method was analysed against the ground truth. Test results showed that 3D LSM offers more accurate DEMs than the conventional matching algorithms. Results also showed that 3D LSM is sensitive to the accuracy of the initial height value used to start the estimation. We therefore combined 3D correlation matching (COM) and 3D LSM for accurate and robust DEM generation from HRSIs. The major contribution of this paper is that we proposed and validated that LSM can be applied in object space and that the combination of 3D correlation matching and 3D LSM can be a good solution for automated DEM generation from HRSIs.

  5. Aerosol Retrievals from Proposed Satellite Bistatic Lidar Observations: Algorithm and Information Content

    Science.gov (United States)

    Alexandrov, M. D.; Mishchenko, M. I.

    2017-12-01

    Accurate aerosol retrievals from space remain quite challenging and typically involve solving a severely ill-posed inverse scattering problem. We suggest addressing this ill-posedness by flying a bistatic lidar system. Such a system would consist of a formation-flying constellation of a primary satellite equipped with a conventional monostatic (backscattering) lidar and an additional platform hosting a receiver of the scattered laser light. If successfully implemented, this concept would combine the measurement capabilities of a passive multi-angle multi-spectral polarimeter with the vertical profiling capability of a lidar. Thus, bistatic lidar observations will be free of the deficiencies affecting both monostatic lidar measurements (caused by their highly limited information content) and passive photopolarimetric measurements (caused by vertical integration and surface reflection). We present a preliminary aerosol retrieval algorithm for a bistatic lidar system consisting of a high spectral resolution lidar (HSRL) and an additional receiver flown in formation with it at a scattering angle of 165 degrees. This algorithm was applied to synthetic data generated using Mie-theory computations. The model/retrieval parameters in our tests were the effective radius and variance of the aerosol size distribution, the complex refractive index of the particles, and their number concentration. Both mono- and bimodal aerosol mixtures were considered. Our algorithm allowed for definitive evaluation of error propagation from measurements to retrievals using a Monte Carlo technique, which involves random distortion of the observations and statistical characterization of the resulting retrieval errors. Our tests demonstrated that supplementing a conventional monostatic HSRL with an additional receiver dramatically increases the information content of the measurements and allows for a sufficiently accurate characterization of tropospheric aerosols.
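The Monte Carlo error-propagation step described at the end can be sketched generically, with a toy linear forward model standing in for the Mie-theory computations (all names and the toy model are illustrative assumptions):

```python
import numpy as np

def monte_carlo_errors(forward, retrieve, true_params, noise_sigma,
                       n_trials=200, seed=None):
    """Randomly distort a synthetic observation with measurement noise,
    re-run the retrieval each time, and characterise the spread of the
    recovered parameters."""
    rng = np.random.default_rng(seed)
    y = forward(np.asarray(true_params, float))
    fits = np.array([retrieve(y + rng.normal(0.0, noise_sigma, y.shape))
                     for _ in range(n_trials)])
    return fits.mean(axis=0), fits.std(axis=0)

# Toy linear "instrument" y = A @ p, retrieved by least squares.
A = np.arange(20, dtype=float).reshape(10, 2) + 1.0
mean, std = monte_carlo_errors(
    forward=lambda p: A @ p,
    retrieve=lambda y: np.linalg.lstsq(A, y, rcond=None)[0],
    true_params=[1.0, 2.0], noise_sigma=0.01, seed=0)
```

The reported standard deviations are the per-parameter retrieval uncertainties implied by the measurement noise.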

  6. Advanced Oil Spill Detection Algorithms For Satellite Based Maritime Environment Monitoring

    Science.gov (United States)

    Radius, Andrea; Azevedo, Rui; Sapage, Tania; Carmo, Paulo

    2013-12-01

    During recent years, the increasing occurrence of pollution and the alarming deterioration of the environmental health of the sea have led to the need for global monitoring capabilities, namely for marine environment management in terms of oil spill detection and indication of the suspected polluter. The sensitivity of Synthetic Aperture Radar (SAR) to different phenomena on the sea, especially for oil spill and vessel detection, makes it a key instrument for global pollution monitoring. The SAR performance in maritime pollution monitoring is being operationally explored by a set of service providers on behalf of the European Maritime Safety Agency (EMSA), which launched the CleanSeaNet (CSN) project, a pan-European satellite-based oil monitoring service, in 2007. EDISOFT, a service provider for CSN from the beginning, is continuously investing in R&D activities that will ultimately lead to better algorithms and better performance in oil spill detection from SAR imagery. This strategy is being pursued through EDISOFT's participation in the FP7 EC Sea-U project and in the Automatic Oil Spill Detection (AOSD) ESA project. The Sea-U project aims to improve the current state of oil spill detection algorithms by maximizing informative content through data fusion, by exploiting different types of data/sensors, and by developing advanced image processing, segmentation, and classification techniques. The AOSD project is closely related to the operational segment, because it is focused on the automation of the oil spill detection processing chain, integrating auxiliary data, like wind information, together with image and geometry analysis techniques. The synergy between these different objectives (R&D versus operational) has allowed EDISOFT to develop oil spill detection software that combines the operational automatic aspect, obtained through dedicated integration of the processing chain in the existing open source NEST

  7. Algorithm Development and Validation for Satellite-Derived Distributions of DOC and CDOM in the US Middle Atlantic Bight

    Science.gov (United States)

    Mannino, Antonio; Russ, Mary E.; Hooker, Stanford B.

    2007-01-01

    In coastal ocean waters, distributions of dissolved organic carbon (DOC) and chromophoric dissolved organic matter (CDOM) vary seasonally and interannually due to multiple source inputs and removal processes. We conducted several oceanographic cruises within the continental margin of the U.S. Middle Atlantic Bight (MAB) to collect field measurements in order to develop algorithms to retrieve CDOM and DOC from NASA's MODIS-Aqua and SeaWiFS satellite sensors. In order to develop empirical algorithms for CDOM and DOC, we correlated the CDOM absorption coefficient (a(sub cdom)) with in situ radiometry (remote sensing reflectance, Rrs, band ratios) and then correlated DOC to Rrs band ratios through the CDOM to DOC relationships. Our validation analyses demonstrate successful retrieval of DOC and CDOM from coastal ocean waters using the MODIS-Aqua and SeaWiFS satellite sensors, with mean absolute percent differences from field measurements for a(sub cdom)(355) and a(sub cdom)(443), and of 12% for the CDOM spectral slope. To our knowledge, the algorithms presented here represent the first validated algorithms for satellite retrieval of a(sub cdom), DOC, and CDOM spectral slope in the coastal ocean. The satellite-derived DOC and a(sub cdom) products demonstrate the seasonal net ecosystem production of DOC and photooxidation of CDOM from spring to fall. With accurate satellite retrievals of CDOM and DOC, we will be able to apply satellite observations to investigate interannual and decadal-scale variability in surface CDOM and DOC within continental margins and monitor impacts of climate change and anthropogenic activities on coastal ecosystems.
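Empirical band-ratio algorithms of this kind often take a power-law form; a sketch of fitting and applying one follows (the power-law form is our assumption about the general approach, and the published MAB coefficients are not reproduced):

```python
import numpy as np

def fit_band_ratio(rrs_ratio, acdom):
    """Fit the power law a_cdom = A * (Rrs band ratio)**B by linear
    regression in log-log space."""
    B, lnA = np.polyfit(np.log(rrs_ratio), np.log(acdom), 1)
    return np.exp(lnA), B

def acdom_from_ratio(rrs_ratio, A, B):
    return A * np.asarray(rrs_ratio) ** B
```

DOC would then be obtained from the retrieved a(sub cdom) through the seasonal CDOM-to-DOC relationships described above.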

  8. Low-cost Citizen Science Balloon Platform for Measuring Air Pollutants to Improve Satellite Retrieval Algorithms

    Science.gov (United States)

    Potosnak, M. J.; Beck-Winchatz, B.; Ritter, P.

    2016-12-01

    High-altitude balloons (HABs) are an engaging platform for citizen science and formal and informal STEM education. However, the logistics of launching, chasing and recovering a payload on a 1200 g or 1500 g balloon can be daunting for many novice school groups and citizen scientists, and the cost can be prohibitive. In addition, there are many interesting scientific applications that do not require reaching the stratosphere, including measuring atmospheric pollutants in the planetary boundary layer. With a large number of citizen scientist flights, these data can be used to constrain satellite retrieval algorithms. In this poster presentation, we discuss a novel approach based on small (30 g) balloons that are cheap and easy to handle, and low-cost tracking devices (SPOT trackers for hikers) that do not require a radio license. Our scientific goal is to measure air quality in the lower troposphere. For example, particulate matter (PM) is an air pollutant that varies on small spatial scales and has sources in rural areas like biomass burning and farming practices such as tilling. Our HAB platform test flight incorporates an optical PM sensor, an integrated single board computer that records the PM sensor signal in addition to flight parameters (pressure, location and altitude), and a low-cost tracking system. Our goal is for the entire platform to cost less than $500. While the datasets generated by these flights are typically small, integrating a network of flight data from citizen scientists into a form usable for comparison to satellite data will require big data techniques.

  9. Seasonal nitrate algorithms for nitrate retrieval using OCEANSAT-2 and MODIS-AQUA satellite data.

    Science.gov (United States)

    Durairaj, Poornima; Sarangi, Ranjit Kumar; Ramalingam, Shanthi; Thirunavukarassu, Thangaradjou; Chauhan, Prakash

    2015-04-01

    In situ datasets of nitrate, sea surface temperature (SST), and chlorophyll a (chl a) collected during the monthly coastal samplings and organized cruises along the Tamilnadu and Andhra Pradesh coast between 2009 and 2013 were used to develop seasonal nitrate algorithms. The nitrate algorithms have been built up based on the three-dimensional regressions between SST, chl a, and nitrate in situ data using linear, Gaussian, Lorentzian, and paraboloid function fittings. Among these four functions, the paraboloid was found to perform best, with the highest coefficient of determination (postmonsoon: R^2=0.711, n=357; summer: R^2=0.635, n=302; premonsoon: R^2=0.829, n=249; and monsoon: R^2=0.692, n=272) for all seasons. Based on these fittings, seasonal nitrate images were generated using the concurrent satellite data of SST from the Moderate Resolution Imaging Spectroradiometer (MODIS) and chlorophyll (chl) from the Ocean Color Monitor (OCM-2) and MODIS. The best retrieval of modeled nitrate (R^2=0.527, root mean square error (RMSE)=3.72, and mean normalized bias (MNB)=0.821) was observed for the postmonsoon season due to the better retrieval of both SST MODIS (28 February 2012, R^2=0.651, RMSE=2.037, and MNB=0.068) and chl OCM-2 (R^2=0.534, RMSE=0.317, and MNB=0.27). Present results confirm that chl OCM-2 and SST MODIS retrieve nitrate better than the MODIS-derived chl and SST, largely due to the better retrieval of chl by OCM-2 than MODIS.
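The paraboloid fitting underlying these seasonal algorithms can be sketched with ordinary least squares; the synthetic SST/chl/nitrate values and coefficients below are illustrative stand-ins, not the cruise datasets:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic in situ samples (illustrative): SST in deg C, chl a in mg m^-3
sst = rng.uniform(24.0, 31.0, 300)
chl = rng.uniform(0.1, 3.0, 300)
true_n = 12.0 - 0.35 * sst + 1.8 * chl + 0.004 * sst**2 - 0.2 * chl**2
nitrate = true_n + rng.normal(0.0, 0.1, sst.size)

# Paraboloid surface: N = c0 + c1*SST + c2*chl + c3*SST^2 + c4*chl^2
A = np.column_stack([np.ones_like(sst), sst, chl, sst**2, chl**2])
coef, *_ = np.linalg.lstsq(A, nitrate, rcond=None)

pred = A @ coef
r2 = 1.0 - np.sum((nitrate - pred) ** 2) / np.sum((nitrate - nitrate.mean()) ** 2)
print(f"paraboloid fit R^2 = {r2:.3f}")
```

Applying the fitted coefficients pixel-by-pixel to satellite SST and chl fields is what produces the seasonal nitrate images.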

  10. An Algorithm to Generate Deep-Layer Temperatures from Microwave Satellite Observations for the Purpose of Monitoring Climate Change. Revised

    Science.gov (United States)

    Goldberg, Mitchell D.; Fleming, Henry E.

    1994-01-01

    An algorithm for generating deep-layer mean temperatures from satellite-observed microwave observations is presented. Unlike traditional temperature retrieval methods, this algorithm does not require a first-guess temperature of the ambient atmosphere. By eliminating the first guess, a potential source of systematic error has been removed. The algorithm is expected to yield long-term records that are suitable for detecting small changes in climate. The atmospheric contribution to the deep-layer mean temperature is given by the averaging kernel. The algorithm computes the coefficients that best approximate a desired averaging kernel from a linear combination of the satellite radiometer's weighting functions. The coefficients are then applied to the measurements to yield the deep-layer mean temperature. Three constraints were used in deriving the algorithm: (1) the sum of the coefficients must be one, (2) the noise of the product is minimized, and (3) the shape of the approximated averaging kernel is well-behaved. Note that a trade-off between constraints 2 and 3 is unavoidable. The algorithm can also be used to combine measurements from a future sensor (i.e., the 20-channel Advanced Microwave Sounding Unit (AMSU)) to yield the same averaging kernel as that based on an earlier sensor (i.e., the 4-channel Microwave Sounding Unit (MSU)). This will allow a time series of deep-layer mean temperatures based on MSU measurements to be continued with AMSU measurements. The AMSU is expected to replace the MSU in 1996.
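The three constraints combine into a small quadratic program: fit a target averaging kernel with a linear combination of channel weighting functions, penalize coefficient noise, and enforce the sum-to-one constraint through a KKT system. The Gaussian weighting functions below are illustrative stand-ins, not actual MSU/AMSU channels:

```python
import numpy as np

# Altitude grid (km) and three hypothetical channel weighting functions
p = np.linspace(0.0, 30.0, 61)

def wf(center, width):
    w = np.exp(-0.5 * ((p - center) / width) ** 2)
    return w / w.sum()

W = np.column_stack([wf(5, 3), wf(12, 4), wf(20, 5)])   # weighting functions
target = wf(10, 6)                                      # desired averaging kernel

lam = 1e-3          # noise penalty weight (trades kernel shape vs. noise)
n = W.shape[1]
# KKT system: minimize ||W c - target||^2 + lam ||c||^2  s.t.  sum(c) = 1
A = np.block([[2.0 * (W.T @ W + lam * np.eye(n)), np.ones((n, 1))],
              [np.ones((1, n)), np.zeros((1, 1))]])
b = np.concatenate([2.0 * W.T @ target, [1.0]])
c = np.linalg.solve(A, b)[:n]

approx_kernel = W @ c        # the averaging kernel actually achieved
print("coefficients:", np.round(c, 3), "sum =", round(float(c.sum()), 6))
```

Raising `lam` reduces coefficient magnitudes (product noise) at the cost of a worse kernel approximation, which is exactly the trade-off between constraints 2 and 3 noted in the abstract.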

  11. Experimental Demonstration of an Algorithm to Detect the Presence of a Parasitic Satellite

    National Research Council Canada - National Science Library

    Dabrowski, Vincent

    2003-01-01

    Published reports of microsatellite weapons testing have led to a concern that some of these "parasitic" satellites could be deployed against US satellites to rendezvous, dock, and then disrupt, degrade...

  12. Climatology 2011: An MLS and Sonde Derived Ozone Climatology for Satellite Retrieval Algorithms

    Science.gov (United States)

    McPeters, Richard D.; Labow, Gordon J.

    2012-01-01

    The ozone climatology used as the a priori for the version 8 Solar Backscatter Ultraviolet (SBUV) retrieval algorithms has been updated. The Microwave Limb Sounder (MLS) instrument on Aura has excellent latitude coverage and measures ozone daily from the upper troposphere to the lower mesosphere. The new climatology consists of monthly average ozone profiles for ten degree latitude zones covering pressure altitudes from 0 to 65 km. The climatology was formed by combining data from Aura MLS (2004-2010) with data from balloon sondes (1988-2010). Ozone below 8 km (below 12 km at high latitudes) is based on balloons sondes, while ozone above 16 km (21 km at high latitudes) is based on MLS measurements. Sonde and MLS data are blended in the transition region. Ozone accuracy in the upper troposphere is greatly improved because of the near uniform coverage by Aura MLS, while the addition of a large number of balloon sonde measurements improves the accuracy in the lower troposphere, in the tropics and southern hemisphere in particular. The addition of MLS data also improves the accuracy of climatology in the upper stratosphere and lower mesosphere. The revised climatology has been used for the latest reprocessing of SBUV and TOMS satellite ozone data.
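The sonde/MLS blending in the transition region can be sketched as a linear weight ramp. The constant profiles below are illustrative placeholders, not the climatology's values:

```python
import numpy as np

def blend_profiles(alt_km, sonde, mls, low=8.0, high=16.0):
    """Sonde ozone below `low` km, MLS ozone above `high` km, linear blend
    in between. (The climatology shifts these bounds to 12/21 km at high
    latitudes; constant illustrative profiles are used here.)"""
    w = np.clip((alt_km - low) / (high - low), 0.0, 1.0)  # MLS weight
    return (1.0 - w) * sonde + w * mls

alt = np.array([0.0, 8.0, 12.0, 16.0, 30.0])
sonde = np.full_like(alt, 40.0)   # illustrative ozone values
mls = np.full_like(alt, 60.0)
blended = blend_profiles(alt, sonde, mls)
print(blended)                    # sonde value below, MLS above, midpoint at 12 km
```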

  13. Development of Ray Tracing Algorithms for Scanning Plane and Transverse Plane Analysis for Satellite Multibeam Application

    Directory of Open Access Journals (Sweden)

    N. H. Abd Rahman

    2014-01-01

    Full Text Available Reflector antennas have been widely used in many areas. In the implementation of parabolic reflector antennas for broadcasting satellite applications, it is essential for the spacecraft antenna to provide a precise contoured beam to effectively serve the required region. For this purpose, combinations of more than one beam are required. Therefore, a tool utilizing the ray tracing method is developed to calculate precise off-axis beams for a multibeam antenna system. In the multibeam system, each beam is fed from a different feed position to allow the main beam to be radiated in the exact direction on the coverage area. Thus, a detailed study of the caustics of a parabolic reflector antenna is performed and presented in this paper in order to investigate the behaviour of the rays and its relation to various antenna parameters. To produce accurate data for the analysis, the caustic behaviours are investigated in two distinctive modes: the scanning plane and the transverse plane. This paper presents detailed discussions on the derivation of the ray tracing algorithms, the establishment of the equations of caustic loci, and the verification of the method through calculation of the radiation pattern.
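The focusing property underlying the caustic analysis can be checked numerically: for an on-axis beam, every ray reflected by the parabola y = x^2/(4f) crosses the axis at the focal length. This is a 2-D sketch of the reflection step only, not the paper's scanning-plane/transverse-plane machinery:

```python
import numpy as np

def axis_crossing(x0, f=1.0):
    """Trace a vertical incoming ray hitting the parabola y = x^2/(4f) at
    abscissa x0 (x0 != 0) and return where the reflected ray crosses the
    axis; for an on-axis beam this is always the focal length f."""
    y0 = x0 ** 2 / (4.0 * f)
    n = np.array([-x0 / (2.0 * f), 1.0])     # surface normal (unnormalized)
    n = n / np.linalg.norm(n)
    d = np.array([0.0, -1.0])                # incoming ray direction
    r = d - 2.0 * np.dot(d, n) * n           # law of reflection
    t = -x0 / r[0]                           # ray parameter where x = 0
    return y0 + t * r[1]

for x0 in (0.3, 0.7, 1.5):
    print(round(axis_crossing(x0), 9))       # each ray crosses at f = 1.0
```

Off-axis (scanned) feeds break this coincidence, and the envelope of the no-longer-concurrent reflected rays is precisely the caustic the paper studies.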

  14. Multi-Stage Hybrid Rocket Conceptual Design for Micro-Satellites Launch using Genetic Algorithm

    Science.gov (United States)

    Kitagawa, Yosuke; Kitagawa, Koki; Nakamiya, Masaki; Kanazaki, Masahiro; Shimada, Toru

    The multi-objective genetic algorithm (MOGA) is applied to the multi-disciplinary conceptual design problem for a three-stage launch vehicle (LV) with a hybrid rocket engine (HRE). MOGA is an optimization tool used for multi-objective problems. The parallel coordinate plot (PCP), which is a data mining method, is employed in the post-process in MOGA for design knowledge discovery. A rocket that can deliver observing micro-satellites to the sun-synchronous orbit (SSO) is designed. It consists of an oxidizer tank containing liquid oxidizer, a combustion chamber containing solid fuel, a pressurizing tank and a nozzle. The objective functions considered in this study are to minimize the total mass of the rocket and to maximize the ratio of the payload mass to the total mass. To calculate the thrust and the engine size, the regression rate is estimated based on an empirical model for a paraffin (FT-0070) propellant. Several non-dominated solutions are obtained using MOGA, and design knowledge is discovered for the present hybrid rocket design problem using a PCP analysis. As a result, substantial knowledge on the design of an LV with an HRE is obtained for use in space transportation.

  15. Optical Network Testbeds Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to widespread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan are creating optical networking technologies that allow for new, powerful communication services. 
Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  16. An analytic algorithm for global coverage of the revisiting orbit and its application to the CFOSAT satellite

    Science.gov (United States)

    Xu, Ming; Huang, Li

    2014-08-01

    This paper addresses a new analytic algorithm for global coverage of the revisiting orbit and its application to missions that revisit the Earth over long periods of time, such as the Chinese-French Oceanic Satellite (CFOSAT). First, it is noted that the traditional design methodology for revisiting orbits considers imaging on only a single (ascending or descending) pass, and that repeating orbits are employed to achieve global coverage within short periods of time. However, selecting a repeating orbit essentially yields a suboptimal solution drawn from the sparse set of rational numbers of passes per day, which discards many available revisiting orbits. Thus, an innovative design scheme is proposed that checks both rational and irrational numbers of passes per day to obtain the relationship between coverage percentage and altitude. To improve on traditional single-pass imaging, the proposed algorithm maps every pass onto its ascending and descending nodes on a specified latitude circle and then accumulates the widths projected onto that circle by the satellite's field of view. The resulting coverage-percentage geometry informs the final scheme, such as the optimal choice with the largest coverage percentage or the balanced choice with the smallest gradient in its vicinity, and guides heuristic design of station-keeping control strategies. The application to CFOSAT validates the feasibility of the algorithm.
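The node-projection idea can be sketched as follows: map each pass to a node longitude on the latitude circle, accumulate the projected swath widths, and compare a repeating (rational) pass rate with a nearby non-repeating one. The pass rates and swath width are illustrative and the model is much simplified (one node per pass, fixed projected width):

```python
import math

def coverage_percent(passes_per_day, days, swath_deg, samples=3600):
    """Accumulate the swath width projected by each pass's node crossing
    onto a latitude circle and return the covered percentage. Handles
    rational and irrational passes-per-day alike."""
    shift = 360.0 / passes_per_day          # node longitude drift per pass
    covered = [False] * samples
    for i in range(int(passes_per_day * days)):
        start = (i * shift - swath_deg / 2.0) % 360.0
        first = int(start / 360.0 * samples)
        span = math.ceil(swath_deg / 360.0 * samples)
        for k in range(span):
            covered[(first + k) % samples] = True
    return 100.0 * sum(covered) / samples

c_rep = coverage_percent(14.0, 10, 3.0)   # repeating orbit: nodes recur, coverage stalls
c_irr = coverage_percent(14.2, 10, 3.0)   # non-repeating rate keeps filling new gaps
print(round(c_rep, 1), round(c_irr, 1))
```

Sweeping the pass rate (equivalently, the altitude) through such a function is what produces the coverage-percentage-versus-altitude relationship the paper exploits.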

  17. Laboratory Spacecraft Data Processing and Instrument Autonomy: AOSAT as Testbed

    Science.gov (United States)

    Lightholder, Jack; Asphaug, Erik; Thangavelautham, Jekan

    2015-11-01

    Recent advances in small spacecraft allow for their use as orbiting microgravity laboratories (e.g. Asphaug and Thangavelautham LPSC 2014) that will produce substantial amounts of data. Power, bandwidth and processing constraints impose limitations on the number of operations which can be performed on this data as well as the data volume the spacecraft can downlink. We show that instrument autonomy and machine learning techniques can intelligently conduct data reduction and downlink queueing to meet data storage and downlink limitations. As small spacecraft laboratory capabilities increase, we must find techniques to increase instrument autonomy and spacecraft scientific decision making. The Asteroid Origins Satellite (AOSAT) CubeSat centrifuge will act as a testbed for further proving these techniques. Lightweight algorithms, such as connected components analysis, centroid tracking, K-means clustering, edge detection, convex hull analysis and intelligent cropping routines, can be coupled with traditional packet compression routines to reduce data transfer per image as well as provide first-order filtering of which data are most relevant to downlink. This intelligent queueing provides timelier downlink of scientifically relevant data while reducing the amount of irrelevant downlinked data. The resulting algorithms allow scientists to throttle the amount of data downlinked based on initial experimental results. The data downlink pipeline, prioritized for scientific relevance based on incorporated scientific objectives, can continue from the spacecraft until the data is no longer fruitful. Coupled with data compression and cropping strategies at the data packet level, bandwidth reductions exceeding 40% can be achieved while still downlinking the data deemed most relevant in a double-blind study between scientist and algorithm. 
Applications of this technology allow for the incorporation of instrumentation which produces significant data volumes on small spacecraft
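The relevance-prioritized downlink queueing can be sketched as a greedy priority queue against a per-pass bandwidth budget. The product names, sizes, and relevance scores below are hypothetical, and the real pipeline also crops and compresses at the packet level:

```python
import heapq

def queue_downlink(products, budget_bytes):
    """Greedy downlink queueing: send the highest-relevance products first
    until the per-pass bandwidth budget is exhausted."""
    heap = [(-score, size, name) for name, size, score in products]
    heapq.heapify(heap)
    sent, used = [], 0
    while heap:
        _neg_score, size, name = heapq.heappop(heap)
        if used + size <= budget_bytes:
            sent.append(name)
            used += size
    return sent, used

# Hypothetical products: (name, size in bytes, relevance score from an
# onboard classifier)
products = [("img_042", 300_000, 0.91), ("img_007", 500_000, 0.88),
            ("img_013", 200_000, 0.35), ("hk_log", 50_000, 0.99)]
sent, used = queue_downlink(products, 600_000)
print(sent, used)
```

Anything left unsent simply waits for the next pass, so the queue "continues from the spacecraft until the data is no longer fruitful."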

  18. The remote sensing of ocean primary productivity - Use of a new data compilation to test satellite algorithms

    Science.gov (United States)

    Balch, William; Evans, Robert; Brown, Jim; Feldman, Gene; Mcclain, Charles; Esaias, Wayne

    1992-01-01

    Global pigment and primary productivity algorithms based on a new data compilation of over 12,000 stations occupied mostly in the Northern Hemisphere, from the late 1950s to 1988, were tested. The results showed high variability of the fraction of total pigment contributed by chlorophyll, which is required for subsequent predictions of primary productivity. Two models, which predict pigment concentration normalized to an attenuation length of euphotic depth, were checked against 2,800 vertical profiles of pigments. Phaeopigments consistently showed maxima at about one optical depth below the chlorophyll maxima. CZCS data coincident with the sea truth data were also checked. A regression of satellite-derived pigment vs ship-derived pigment had a coefficient of determination. The satellite underestimated the true pigment concentration in mesotrophic and oligotrophic waters and overestimated the pigment concentration in eutrophic waters. The error in the satellite estimate showed no trends with time between 1978 and 1986.

  19. An Improved Image Encryption Algorithm Based on Cyclic Rotations and Multiple Chaotic Sequences: Application to Satellite Images

    Directory of Open Access Journals (Sweden)

    MADANI Mohammed

    2017-10-01

    Full Text Available In this paper, a new satellite image encryption algorithm based on the combination of multiple chaotic systems and a random cyclic rotation technique is proposed. Our contribution consists in implementing three different chaotic maps (logistic, sine, and standard) combined to improve the security of satellite images. Besides enhancing the encryption, the proposed algorithm also focuses on the efficiency of the ciphered images. Compared with classical encryption schemes based on multiple chaotic maps and the Rubik's cube rotation, our approach not only has the merits of chaos systems, like high sensitivity to initial values, unpredictability, and pseudo-randomness, but also other advantages, like a higher number of permutations and better performance in Peak Signal to Noise Ratio (PSNR) and Maximum Deviation (MD).
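A single-map sketch of the chaotic permute-and-XOR idea follows; the paper combines three maps with Rubik's-cube-style cyclic rotations, whereas only the logistic map is used here, so this is an illustration of the principle rather than the proposed scheme:

```python
import numpy as np

def logistic_sequence(x0, r, n, burn=100):
    """Chaotic logistic-map sequence x_{k+1} = r * x_k * (1 - x_k)."""
    x = x0
    for _ in range(burn):                       # discard the transient
        x = r * x * (1.0 - x)
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

def encrypt(img, key=(0.3456, 3.99)):
    """Permute pixels with a chaotic ordering, then XOR with a chaotic
    keystream (illustrative single-map version of the multi-map scheme)."""
    flat = img.ravel()
    seq = logistic_sequence(key[0], key[1], flat.size)
    perm = np.argsort(seq)                      # chaotic permutation
    stream = (seq * 256).astype(np.uint8)       # chaotic keystream
    return (flat[perm] ^ stream).reshape(img.shape), perm, stream

def decrypt(cipher, perm, stream):
    flat = cipher.ravel() ^ stream              # undo the keystream
    out = np.empty_like(flat)
    out[perm] = flat                            # undo the permutation
    return out.reshape(cipher.shape)

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
cipher, perm, stream = encrypt(img)
restored = decrypt(cipher, perm, stream)
print(np.array_equal(restored, img))            # round trip recovers the image
```

The sensitivity-to-initial-values property means a key differing in the last decimal place produces an entirely different permutation and keystream.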

  20. A Novel Strategy Using Factor Graphs and the Sum-Product Algorithm for Satellite Broadcast Scheduling Problems

    Science.gov (United States)

    Chen, Jung-Chieh

    This paper presents a low-complexity algorithmic framework for finding a broadcasting schedule in a low-altitude satellite system, i.e., the satellite broadcast scheduling (SBS) problem, based on the recent modeling and computational methodology of factor graphs. Inspired by the huge success of low-density parity-check (LDPC) codes in the field of error control coding, in this paper we transform the SBS problem into an LDPC-like problem through a factor graph instead of using the conventional neural network approaches to solve the SBS problem. Based on the factor graph framework, soft information, describing the probability that each satellite will broadcast information to a terminal at a specific time slot, is exchanged among the local processing units in the proposed framework via the sum-product algorithm to iteratively optimize the satellite broadcasting schedule. Numerical results show that the proposed approach not only obtains optimal solutions but also enjoys a low complexity suitable for integrated-circuit implementation.

  1. Statistically Optimized Inversion Algorithm for Enhanced Retrieval of Aerosol Properties from Spectral Multi-Angle Polarimetric Satellite Observations

    Science.gov (United States)

    Dubovik, O; Herman, M.; Holdak, A.; Lapyonok, T.; Taure, D.; Deuze, J. L.; Ducos, F.; Sinyuk, A.

    2011-01-01

    The proposed development is an attempt to enhance aerosol retrieval by emphasizing statistical optimization in the inversion of advanced satellite observations. This optimization concept improves retrieval accuracy by relying on knowledge of the measurement error distribution. Efficient application of such optimization requires pronounced data redundancy (an excess of the number of measurements over the number of unknowns), which is not common in satellite observations. The POLDER imager on board the PARASOL microsatellite registers spectral polarimetric characteristics of the reflected atmospheric radiation at up to 16 viewing directions over each observed pixel. The completeness of such observations is notably higher than for most currently operating passive satellite aerosol sensors. This provides an opportunity for profound utilization of statistical optimization principles in satellite data inversion. The proposed retrieval scheme is designed as statistically optimized multi-variable fitting of all available angular observations obtained by the POLDER sensor in the window spectral channels where absorption by gas is minimal. The total number of such observations by PARASOL always exceeds a hundred over each pixel, and the statistical optimization concept promises to be efficient even if the algorithm retrieves several tens of aerosol parameters. Based on this idea, the proposed algorithm uses a large number of unknowns and is aimed at retrieval of an extended set of parameters affecting the measured radiation.

  2. Visible nulling coronagraph testbed results

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Petrone, Peter; Madison, Timothy; Rizzo, Maxime; Melnick, Gary; Tolls, Volker

    2009-08-01

    We report on our recent laboratory results with the NASA/Goddard Space Flight Center (GSFC) Visible Nulling Coronagraph (VNC) testbed. We have experimentally achieved focal plane contrasts of 1 x 10^8 and approaching 10^9 at inner working angles of 2 * wavelength/D and 4 * wavelength/D respectively, where D is the aperture diameter. The result was obtained using a broadband source with a narrowband spectral filter of width 10 nm centered on 630 nm. To date this is the deepest nulling result yet obtained with a visible nulling coronagraph. Also developed is a Null Control Breadboard (NCB) to assess and quantify MEMS-based segmented deformable mirror technology and to develop and assess closed-loop null sensing and control algorithm performance from both the pupil and focal planes. We have demonstrated closed-loop control at 27 Hz in the laboratory environment. Efforts are underway to bring the contrast first to > 10^9, necessary for the direct detection and characterization of jovian (Jupiter-like) exosolar planets, and then to > 10^10, necessary for terrestrial (Earth-like) ones. Short-term advancements are expected to broaden the spectral passband from 10 nm to 100 nm, to increase the long-term stability to > 2 hours, and to extend the null out to ~10 * wavelength/D via the use of MEMS-based segmented deformable mirror technology, a coherent fiber bundle, and achromatic phase shifters, all in a vacuum chamber at the GSFC VNC facility. Additionally, an extreme-stability textbook-sized compact VNC is under development.

  3. Robust Fault-Tolerant Control for Satellite Attitude Stabilization Based on Active Disturbance Rejection Approach with Artificial Bee Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Fei Song

    2014-01-01

    Full Text Available This paper proposes a robust fault-tolerant control algorithm for satellite stabilization based on an active disturbance rejection approach with the artificial bee colony algorithm. The actuating mechanism of the attitude control system consists of three working reaction flywheels and one spare reaction flywheel. The speed measurements of the reaction flywheels are used for fault detection. If any reaction flywheel fault is detected, the faulty flywheel is isolated and the spare reaction flywheel is activated to counteract the fault effect and ensure that the satellite keeps working safely and reliably. The active disturbance rejection approach is employed to design the controller, which handles input information with a tracking differentiator, estimates system uncertainties with an extended state observer, and generates control variables by state feedback and compensation. The designed active disturbance rejection controller is robust to both internal dynamics and external disturbances. The bandwidth parameter of the extended state observer is optimized by the artificial bee colony algorithm so as to improve the performance of the attitude control system. A series of simulation experiments demonstrates the performance superiority of the proposed robust fault-tolerant control algorithm.
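The extended state observer at the heart of such a controller can be illustrated on a double-integrator plant with a constant unknown disturbance: the observer's third state converges to that disturbance. The gains follow the common bandwidth parameterization, with the bandwidth `wo` playing the role of the parameter a bee colony search would tune; the plant and numbers are illustrative, not the satellite model:

```python
def run_eso(wo=20.0, dt=0.001, steps=5000, disturbance=1.0):
    """Linear extended state observer for the plant x'' = b*u + d with an
    unknown constant disturbance d. Gains use the bandwidth
    parameterization (3*wo, 3*wo**2, wo**3)."""
    b = 1.0
    b1, b2, b3 = 3.0 * wo, 3.0 * wo ** 2, wo ** 3
    x1 = x2 = 0.0                 # true plant states
    z1 = z2 = z3 = 0.0            # observer: position, velocity, disturbance
    u = 0.0                       # open loop for the demo
    for _ in range(steps):
        x1 += x2 * dt             # simulate the plant (explicit Euler)
        x2 += (b * u + disturbance) * dt
        e = x1 - z1               # observer innovation
        z1 += (z2 + b1 * e) * dt
        z2 += (z3 + b2 * e + b * u) * dt
        z3 += b3 * e * dt
    return z3

print(run_eso())   # estimate converges to the unknown disturbance
```

In the full controller, the estimated disturbance `z3` is fed back and cancelled in the control law, which is what makes the loop robust to both internal dynamics and external disturbances.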

  4. Link Adaptation for Mitigating Earth-To-Space Propagation Effects on the NASA SCaN Testbed

    Science.gov (United States)

    Kilcoyne, Deirdre K.; Headley, William C.; Leffke, Zach J.; Rowe, Sonya A.; Mortensen, Dale J.; Reinhart, Richard C.; McGwier, Robert W.

    2016-01-01

    In Earth-to-Space communications, well-known propagation effects such as path loss and atmospheric loss can lead to fluctuations in the strength of the communications link between a satellite and its ground station. Additionally, the typically unconsidered effect of shadowing due to the geometry of the satellite and its solar panels can also lead to link degradation. As a result of these anticipated channel impairments, NASA's communication links have been traditionally designed to handle the worst-case impact of these effects through high link margins and static, lower rate, modulation formats. The work presented in this paper aims to relax these constraints by providing an improved trade-off between data rate and link margin through utilizing link adaptation. More specifically, this work provides a simulation study on the propagation effects impacting NASA's SCaN Testbed flight software-defined radio (SDR) as well as proposes a link adaptation algorithm that varies the modulation format of a communications link as its signal-to-noise ratio fluctuates. Ultimately, the models developed in this work will be utilized to conduct real-time flight experiments on-board the NASA SCaN Testbed.
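A minimal sketch of SNR-driven link adaptation with hysteresis; the thresholds and mode set below are illustrative, not the SCaN Testbed's actual switching points:

```python
# Mode table: (minimum Es/N0 in dB, name, bits per symbol) -- illustrative
MODES = [
    (12.0, "8PSK", 3),
    (7.0, "QPSK", 2),
    (2.0, "BPSK", 1),
]

def select_mode(snr_db, current=None, hysteresis_db=1.0):
    """Pick the densest modulation the measured SNR supports. The hysteresis
    margin keeps the radio from thrashing when SNR hovers near a threshold."""
    for thresh, name, bps in MODES:
        margin = hysteresis_db if current is not None and name != current else 0.0
        if snr_db >= thresh + margin:
            return name, bps
    return "BPSK", 1  # most robust fallback when the link is very weak

print(select_mode(9.5))                    # ('QPSK', 2)
print(select_mode(12.4, current="QPSK"))   # within hysteresis -> stays on QPSK
```

Running such a rule against the predicted path-loss, atmospheric, and solar-panel-shadowing profiles is how the trade-off between data rate and link margin can be evaluated before flight experiments.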

  5. Network design consideration of a satellite-based mobile communications system

    Science.gov (United States)

    Yan, T.-Y.

    1986-01-01

    Technical considerations for the Mobile Satellite Experiment (MSAT-X), the ground segment testbed for the low-cost, spectrally efficient satellite-based mobile communications technologies being developed for the 1990s, are discussed. The Network Management Center contains a flexible resource sharing algorithm, the Demand Assigned Multiple Access scheme, which partitions the satellite transponder bandwidth among voice, data, and request channels. Satellite use of multiple UHF beams permits frequency reuse. The backhaul communications and the Telemetry, Tracking and Control traffic are provided through a single full-coverage SHF beam. Mobile Terminals communicate with the satellite using UHF. All communications, including SHF-SHF between Base Stations and/or Gateways, are routed through the satellite. Because MSAT-X is an experimental network, higher level network protocols (which are service-specific) will be developed only to test the operation of the lowest three levels: the physical, data link, and network layers.

  6. Validation and Application of the Modified Satellite-Based Priestley-Taylor Algorithm for Mapping Terrestrial Evapotranspiration

    Directory of Open Access Journals (Sweden)

    Yunjun Yao

    2014-01-01

    Full Text Available Satellite-based vegetation indices (VIs) and Apparent Thermal Inertia (ATI) derived from temperature change provide valuable information for estimating evapotranspiration (LE) and detecting the onset and severity of drought. The modified satellite-based Priestley-Taylor (MS-PT) algorithm that we developed earlier, coupling both VI and ATI, is validated based on observed data from 40 flux towers distributed across the world on all continents. The validation results illustrate that the daily LE can be estimated with the Root Mean Square Error (RMSE) varying from 10.7 W/m^2 to 87.6 W/m^2, and with the square of the correlation coefficient (R^2) from 0.41 to 0.89 (p < 0.01). Compared with the Priestley-Taylor-based LE (PT-JPL) algorithm, the MS-PT algorithm improves the LE estimates at most flux tower sites. Importantly, the MS-PT algorithm is also satisfactory in reproducing the inter-annual variability at flux tower sites with at least five years of data. The R^2 between measured and predicted annual LE anomalies is 0.42 (p = 0.02). The MS-PT algorithm is then applied to detect the variations of long-term terrestrial LE over the Three-North Shelter Forest Region of China and to monitor global land surface drought. The MS-PT algorithm described here demonstrates the ability to map regional terrestrial LE and identify global soil moisture stress, without requiring precipitation information.
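The unmodified Priestley-Taylor core that MS-PT builds on can be written directly; MS-PT replaces the fixed alpha with constraints derived from VIs and ATI, which are not reproduced in this sketch, and the input values below are illustrative:

```python
import math

def priestley_taylor_le(rn, g, t_air_c, alpha=1.26):
    """Classic Priestley-Taylor latent heat flux,
    LE = alpha * D / (D + gamma) * (Rn - G), all fluxes in W/m^2."""
    # Slope of the saturation vapour pressure curve (kPa per deg C), FAO-56 form
    es = 0.6108 * math.exp(17.27 * t_air_c / (t_air_c + 237.3))
    delta = 4098.0 * es / (t_air_c + 237.3) ** 2
    gamma = 0.066  # psychrometric constant near sea level, kPa per deg C
    return alpha * delta / (delta + gamma) * (rn - g)

le = priestley_taylor_le(rn=400.0, g=50.0, t_air_c=25.0)
print(round(le, 1))
```

The appeal of this formulation for satellite use is visible in the signature: it needs only radiation and temperature inputs, with no wind speed, humidity, or precipitation data.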

  7. NASA Robotic Neurosurgery Testbed

    Science.gov (United States)

    Mah, Robert

    1997-01-01

    The detection of tissue interfaces (e.g., normal tissue, cancer, tumor) has been limited clinically to tactile feedback, temperature monitoring, and the use of a miniature ultrasound probe for tissue differentiation during surgical operations. In neurosurgery, the needle used in the standard stereotactic CT- or MRI-guided brain biopsy provides no information about the tissue being sampled. The tissue sampled depends entirely upon the accuracy with which the localization provided by the preoperative CT or MRI scan is translated to the intracranial biopsy site. In addition, no information about the tissue being traversed by the needle (e.g., a blood vessel) is provided. Hemorrhage due to the biopsy needle tearing a blood vessel within the brain is the most devastating complication of stereotactic CT/MRI-guided brain biopsy. A robotic neurosurgery testbed has been developed at NASA Ames Research Center as a spin-off of technologies from space, aeronautics and medical programs. The invention entitled "Robotic Neurosurgery Leading to Multimodality Devices for Tissue Identification" is nearing a state ready for commercialization. The devices will: 1) improve the diagnostic accuracy and precision of general surgery, with near-term emphasis on stereotactic brain biopsy, 2) automate tissue identification, with near-term emphasis on stereotactic brain biopsy, to permit remote control of the procedure, and 3) reduce morbidity for stereotactic brain biopsy. The commercial impact from this work is the potential development of a whole new generation of smart surgical tools to increase the safety, accuracy and efficiency of surgical procedures. Other potential markets include smart surgical tools for tumor ablation in neurosurgery, general exploratory surgery, prostate cancer surgery, and breast cancer surgery.

  8. Generating Land Surface Reflectance for the New Generation of Geostationary Satellite Sensors with the MAIAC Algorithm

    Science.gov (United States)

    Wang, W.; Wang, Y.; Hashimoto, H.; Li, S.; Takenaka, H.; Higuchi, A.; Lyapustin, A.; Nemani, R. R.

    2017-12-01

    The latest generation of geostationary satellite sensors, including GOES-16/ABI and Himawari-8/AHI, provide exciting capabilities to monitor the land surface at very high temporal resolution (5-15 minute intervals) and with spatial and spectral characteristics that mimic the Earth Observing System flagship MODIS. However, geostationary data feature changing sun angles at constant view geometry, which is almost reciprocal to sun-synchronous observations. This challenge needs to be carefully addressed before one can exploit the full potential of the new sources of data. Here we take on this challenge with the Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm, recently developed for accurate and globally robust applications like the MODIS Collection 6 re-processing. MAIAC first grids the top-of-atmosphere measurements to a fixed grid so that the spectral and physical signatures of each grid cell are stacked ("remembered") over time and used to dramatically improve cloud/shadow/snow detection, which is by far the dominant error source in remote sensing. It also exploits the changing sun-view geometry of the geostationary sensor to characterize the surface BRDF with augmented angular resolution for accurate aerosol retrievals and atmospheric correction. The high temporal resolution of the geostationary data indeed makes the BRDF retrieval much simpler and more robust compared with sun-synchronous sensors such as MODIS. As a prototype test for the geostationary-data processing pipeline on NASA Earth Exchange (GEONEX), we apply MAIAC to process 18 months of data from Himawari-8/AHI over Australia. We generate a suite of test results, including the input TOA reflectance and the output cloud mask, aerosol optical depth (AOD), and the atmospherically-corrected surface reflectance for a variety of geographic locations, terrain, and land cover types. Comparison with MODIS data indicates a general agreement between the retrieved surface reflectance

  9. An Image Matching Algorithm Integrating Global SRTM and Image Segmentation for Multi-Source Satellite Imagery

    Directory of Open Access Journals (Sweden)

    Xiao Ling

    2016-08-01

    Full Text Available This paper presents a novel image matching method for multi-source satellite images, which integrates global Shuttle Radar Topography Mission (SRTM data and image segmentation to achieve robust and numerous correspondences. This method first generates the epipolar lines as a geometric constraint assisted by global SRTM data, after which the seed points are selected and matched. To produce more reliable matching results, a region segmentation-based matching propagation is proposed in this paper, whereby the region segmentations are extracted by image segmentation and are considered to be a spatial constraint. Moreover, a similarity measure integrating Distance, Angle and Normalized Cross-Correlation (DANCC, which considers geometric similarity and radiometric similarity, is introduced to find the optimal correspondences. Experiments using typical satellite images acquired from Resources Satellite-3 (ZY-3, Mapping Satellite-1, SPOT-5 and Google Earth demonstrated that the proposed method is able to produce reliable and accurate matching results.
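The radiometric term of the DANCC similarity measure is ordinary zero-mean normalized cross-correlation, which can be sketched directly (the distance and angle terms of DANCC are omitted here):

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Zero-mean normalized cross-correlation between two equal-size image
    patches: the radiometric similarity term of a DANCC-style measure."""
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

patch = np.array([[10, 20], [30, 40]])
print(ncc(patch, patch * 2 + 5))   # invariant to linear radiometric change -> ~1.0
```

This invariance to gain and offset is what makes NCC suitable for multi-source imagery, where ZY-3, SPOT-5, and Google Earth scenes of the same ground feature differ radiometrically.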

  10. Diagnostic Algorithm Benchmarking

    Science.gov (United States)

    Poll, Scott

    2011-01-01

    A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.

  11. Environment Emulation For Wsn Testbed

    Directory of Open Access Journals (Sweden)

    Radosław Kapłoniak

    2012-01-01

    Full Text Available The development of applications for wireless sensor networks is a challenging task. For this reason, several testbed platforms have been created. They simplify the management of nodes by offering easy ways of programming and debugging sensor nodes. These platforms, sometimes composed of dozens of sensors, provide a convenient way to carry out research on medium access control and data exchange between nodes. In this article, we propose an extension of the WSN testbed that can be used for evaluating and testing the functionality of sensor network applications by emulating a real-world environment.

  12. Advanced Artificial Intelligence Technology Testbed

    Science.gov (United States)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  13. A satellite digital controller or 'play that PID tune again, Sam'. [Position, Integral, Derivative feedback control algorithm for design strategy

    Science.gov (United States)

    Seltzer, S. M.

    1976-01-01

    The problem discussed is to design a digital controller for a typical satellite. The controlled plant is considered to be a rigid body acting in a plane. The controller is assumed to be a digital computer which, when combined with the proposed control algorithm, can be represented as a sampled-data system. The objective is to present a design strategy and technique for selecting numerical values for the control gains (assuming position, integral, and derivative feedback) and the sample rate. The technique is based on the parameter plane method and requires that the system be amenable to z-transform analysis.
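The PID control law and sampled-data plant described above can be illustrated with a minimal discrete simulation. The plant below is a unit-inertia double integrator standing in for the planar rigid body, and the gains, sample rate, and step count are hypothetical placeholders, not values obtained from the parameter plane method:

```python
def simulate_pid(kp, ki, kd, dt=0.1, steps=300, target=1.0):
    """Discrete PID attitude control of a rigid body modeled as a
    double integrator with unit inertia (theta'' = u)."""
    theta, omega = 0.0, 0.0            # attitude angle and rate
    integral, prev_err = 0.0, target   # prev_err init avoids a derivative kick
    history = []
    for _ in range(steps):
        err = target - theta
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv   # control torque
        prev_err = err
        omega += u * dt                             # forward-Euler integration
        theta += omega * dt
        history.append(theta)
    return history

hist = simulate_pid(kp=2.0, ki=0.1, kd=2.5)
```

With these placeholder gains the loop settles near the unit step target; a parameter-plane design would instead choose the gains and sample rate against explicit stability boundaries.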

  14. Enhancement of Satellite Image Compression Using a Hybrid (DWT-DCT) Algorithm

    Science.gov (United States)

    Shihab, Halah Saadoon; Shafie, Suhaidi; Ramli, Abdul Rahman; Ahmad, Fauzan

    2017-12-01

    Discrete Cosine Transform (DCT) and Discrete Wavelet Transform (DWT) image compression techniques have been utilized in most of the earth observation satellites launched during the last few decades. However, these techniques have some issues that should be addressed. The DWT method has proven to be more efficient than DCT for several reasons. Nevertheless, the DCT can be exploited to improve high-resolution satellite image compression when combined with the DWT technique. Hence, a proposed hybrid (DWT-DCT) method was developed and implemented in the current work, simulating an image compression system on board a small remote sensing satellite, with the aim of achieving a higher compression ratio to decrease the on-board data storage and the downlink bandwidth, while avoiding further complex levels of DWT. This method also succeeded in maintaining the reconstructed satellite image quality by replacing the standard forward DWT thresholding and quantization processes with an alternative process that employs the zero-padding technique, which also helped to reduce the processing time of DWT compression. The DCT, DWT, and the proposed hybrid methods were implemented individually, for comparison, on three LANDSAT 8 images, using the MATLAB software package. A comparison was also made between the proposed method and three other previously published hybrid methods. The evaluation of all the objective and subjective results indicated the feasibility of using the proposed hybrid (DWT-DCT) method to enhance the image compression process on board satellites.
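A minimal sketch of the hybrid idea, assuming a one-level Haar DWT followed by a DCT of the approximation band; the paper's zero-padding, quantization, and multi-level stages are not reproduced:

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar DWT; returns the approximation band LL and
    the three detail bands (LH, HL, HH)."""
    a = (img[:, 0::2] + img[:, 1::2]) / 2.0   # row averages
    d = (img[:, 0::2] - img[:, 1::2]) / 2.0   # row details
    LL = (a[0::2, :] + a[1::2, :]) / 2.0
    LH = (a[0::2, :] - a[1::2, :]) / 2.0
    HL = (d[0::2, :] + d[1::2, :]) / 2.0
    HH = (d[0::2, :] - d[1::2, :]) / 2.0
    return LL, (LH, HL, HH)

def dct2(block):
    """Separable 2-D DCT-II built from an orthonormal DCT matrix."""
    n = block.shape[0]
    k, m = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    C[0, :] /= np.sqrt(2.0)
    return C @ block @ C.T

rng = np.random.default_rng(0)
img = rng.random((8, 8))                # stand-in for an image tile
LL, details = haar_dwt2(img)
coeffs = dct2(LL)                       # DCT applied to the DWT approximation
```

Because the DCT matrix is orthonormal, the transform is energy-preserving; in a real coder the compaction of energy into few low-frequency coefficients is what enables the higher compression ratio.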

  15. Algorithms

    Indian Academy of Sciences (India)

    polynomial) division have been found in Vedic Mathematics which are dated much before Euclid's algorithm. A programming language Is used to describe an algorithm for execution on a computer. An algorithm expressed using a programming.

  16. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations.

    Science.gov (United States)

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J Harry

    2016-01-01

    Accurate estimation of latent heat flux (LE) based on remote sensing data is critical for characterizing terrestrial ecosystems and modeling land surface processes. Many LE products have been released during the past few decades, but their quality might not meet requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE algorithm of the Jet Propulsion Laboratory (PT-JPL) estimate. Twenty-two eddy covariance (EC) sites with LE observations were chosen to evaluate our algorithm, showing that the proposed EOF fusion method is capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts are needed to evaluate and improve the proposed algorithm at larger spatial scales and over longer time periods, and over different land cover types.
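The EOF-based merging step can be sketched as follows. This is a simplified stand-in, assuming a plain truncated-SVD reconstruction of the stacked anomalies rather than the paper's modified EOF weighting:

```python
import numpy as np

def eof_merge(le_a, le_b, n_modes=1):
    """Merge two latent-heat-flux datasets (sites x time) by keeping the
    leading EOF modes of their stacked anomalies, then averaging the two
    reconstructions. A simplified sketch; the exact modified-EOF weighting
    of the paper is not reproduced here."""
    stacked = np.vstack([le_a, le_b])              # (2*sites, time)
    mean = stacked.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(stacked - mean, full_matrices=False)
    recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes] + mean
    n = le_a.shape[0]
    return 0.5 * (recon[:n] + recon[n:])           # average the two halves
```

Truncating to the leading modes filters mode-incoherent noise from each product before the average, which is the intuition behind EOF-based fusion.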

  17. Chlorophyll-a Estimation Around the Antarctica Peninsula Using Satellite Algorithms: Hints from Field Water Leaving Reflectance

    Directory of Open Access Journals (Sweden)

    Chen Zeng

    2016-12-01

    Full Text Available Ocean color remote sensing significantly contributes to our understanding of phytoplankton distribution and abundance and primary productivity in the Southern Ocean (SO). However, the current SO in situ optical database is still insufficient and unevenly distributed. This limits the ability to produce robust and accurate measurements of satellite-based chlorophyll. Based on data collected on cruises around the Antarctica Peninsula (AP) in January 2014 and 2016, this research intends to enhance our knowledge of SO water and atmospheric optical characteristics and address satellite algorithm deficiencies in ocean color products. We collected high-resolution in situ water leaving reflectance (±1 nm band resolution), simultaneous in situ chlorophyll-a concentrations, and satellite (MODIS and VIIRS) water leaving reflectance. Field samples show that clouds have a great impact on the visible green bands and are difficult to detect because NASA protocols apply the NIR band as a cloud contamination threshold. When compared to global case I water, water around the AP has lower water leaving reflectance and a narrower blue-green band ratio, which explains chlorophyll-a underestimation in high chlorophyll-a regions and overestimation in low chlorophyll-a regions. VIIRS shows higher spatial coverage and detection accuracy than MODIS. After coefficient improvement, VIIRS is able to predict chlorophyll-a with 53% accuracy.

  18. Chlorophyll-a Estimation Around the Antarctica Peninsula Using Satellite Algorithms: Hints from Field Water Leaving Reflectance.

    Science.gov (United States)

    Zeng, Chen; Xu, Huiping; Fischer, Andrew M

    2016-12-07

    Ocean color remote sensing significantly contributes to our understanding of phytoplankton distribution and abundance and primary productivity in the Southern Ocean (SO). However, the current SO in situ optical database is still insufficient and unevenly distributed. This limits the ability to produce robust and accurate measurements of satellite-based chlorophyll. Based on data collected on cruises around the Antarctica Peninsula (AP) in January 2014 and 2016, this research intends to enhance our knowledge of SO water and atmospheric optical characteristics and address satellite algorithm deficiencies in ocean color products. We collected high-resolution in situ water leaving reflectance (±1 nm band resolution), simultaneous in situ chlorophyll-a concentrations, and satellite (MODIS and VIIRS) water leaving reflectance. Field samples show that clouds have a great impact on the visible green bands and are difficult to detect because NASA protocols apply the NIR band as a cloud contamination threshold. When compared to global case I water, water around the AP has lower water leaving reflectance and a narrower blue-green band ratio, which explains chlorophyll-a underestimation in high chlorophyll-a regions and overestimation in low chlorophyll-a regions. VIIRS shows higher spatial coverage and detection accuracy than MODIS. After coefficient improvement, VIIRS is able to predict chlorophyll-a with 53% accuracy.
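The blue-green band-ratio behaviour noted above is commonly encoded as a power law in the ratio. The sketch below uses the generic two-band form; the coefficients a and b are placeholders, not the regionally adjusted values from this study:

```python
import numpy as np

def chl_from_ratio(rrs_blue, rrs_green, a=0.3, b=-2.0):
    """Generic power-law chlorophyll-a estimate from a blue/green
    remote-sensing reflectance ratio; a and b are illustrative only,
    not a regional fit."""
    ratio = np.asarray(rrs_blue, dtype=float) / np.asarray(rrs_green, dtype=float)
    return a * ratio ** b

# A narrower (smaller) blue/green ratio maps to a higher Chl-a estimate
low_chl = chl_from_ratio(0.008, 0.004)   # high ratio -> low Chl-a
high_chl = chl_from_ratio(0.004, 0.008)  # low ratio  -> high Chl-a
```

Regional "coefficient improvement" of the kind described in the abstract amounts to refitting a and b against local in situ chlorophyll-a.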

  19. An Enhanced Satellite-Based Algorithm for Detecting and Tracking Dust Outbreaks by Means of SEVIRI Data

    Directory of Open Access Journals (Sweden)

    Francesco Marchese

    2017-05-01

    Full Text Available Dust outbreaks are meteorological phenomena of great interest to scientists and authorities (because of their impact on the climate, environment, and human activities), which may be detected, monitored, and characterized from space using different methods and procedures. Among recent dust detection algorithms, the RSTDUST multi-temporal technique has provided good results in different geographic areas (e.g., the Mediterranean basin and the Arabian Peninsula), exhibiting a better performance than traditional split window methods, in spite of some limitations. In this study, we present an optimized configuration of this technique, which better exploits data provided by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) aboard the Meteosat Second Generation (MSG) satellites to address those issues (e.g., sensitivity reduction over arid and semi-arid regions; dependence on some meteorological clouds). Three massive dust events affecting Europe and the Mediterranean basin in May 2008/2010 are analysed in this work, using information provided by some independent and well-established aerosol products to assess the achieved results. The study shows that the proposed algorithm, christened eRSTDUST (i.e., enhanced RSTDUST), which provides qualitative information about dust outbreaks, is capable of increasing the trade-off between reliability and sensitivity. The results encourage further experimentation with this method in other periods of the year, also exploiting data provided by different satellite sensors, to better evaluate the advantages arising from the use of this dust detection technique in operational scenarios.
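The multi-temporal core of the RST approach, comparing the current signal against a pixel-by-pixel historical mean and standard deviation, can be sketched as below; the actual eRSTDUST spectral tests and thresholds are not reproduced:

```python
import numpy as np

def dust_index(scene, reference_stack):
    """RST-style multi-temporal index: how far the current signal departs,
    pixel by pixel, from its historical mean, in standard deviations.
    The signal would typically be a brightness-temperature difference."""
    mu = reference_stack.mean(axis=0)
    sigma = reference_stack.std(axis=0)
    return (scene - mu) / np.where(sigma > 0, sigma, np.nan)

rng = np.random.default_rng(1)
ref = rng.normal(0.0, 1.0, size=(50, 4, 4))    # 50 historical scenes
scene = ref.mean(axis=0).copy()
scene[0, 0] += 10 * ref.std(axis=0)[0, 0]      # inject one anomalous "dust" pixel
flag = np.abs(dust_index(scene, ref)) > 3.0    # simple anomaly threshold
```

Because the reference statistics are per-pixel, surface emissivity variations cancel out, which is what gives the multi-temporal method its edge over fixed split-window thresholds.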

  20. An Approach for Smart Antenna Testbed

    Science.gov (United States)

    Kawitkar, R. S.; Wakde, D. G.

    2003-07-01

    The use of wireless, mobile, personal communications services is expanding rapidly. Adaptive or "smart" antenna arrays can increase channel capacity through spatial division. Adaptive antennas can also track mobile users, improving both signal range and quality. For these reasons, smart antenna systems have attracted widespread interest in the telecommunications industry for applications to third-generation wireless systems. This paper aims to design and develop an advanced antenna testbed to serve as a common reference for testing adaptive antenna arrays and signal combining algorithms, as well as complete systems. A flexible suite of offline processing software should be written using MATLAB to perform system calibration, testbed initialization, data acquisition control, data storage/transfer, offline signal processing and analysis, and graph plotting. The goal of this paper is to develop low-complexity smart antenna structures for 3G systems. The emphasis will be laid on ease of implementation in a multichannel/multi-user environment. A smart antenna testbed will be developed, and various state-of-the-art DSP structures and algorithms will be investigated. Facing the soaring demand for mobile communications, the use of smart antenna arrays in mobile communications systems to exploit spatial diversity to further improve spectral efficiency has recently received considerable attention. Basically, a smart antenna array comprises a number of antenna elements combined via a beamforming network (amplitude and phase control network). Some of the benefits that can be achieved by using an SAS (Smart Antenna System) include lower mobile terminal power consumption, range extension, ISI reduction, higher data rate support, and ease of integration into the existing base station system.
In terms of economic benefits, adaptive antenna systems employed at the base station, though they increase the per-base-station cost, can increase the coverage area of each cell site, thereby reducing
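The beamforming network mentioned above (amplitude and phase control across elements) can be sketched with a conventional delay-and-sum beamformer for a uniform linear array; the geometry and signals below are hypothetical:

```python
import numpy as np

def steering_vector(n_elements, spacing_wl, theta_rad):
    """Narrowband steering vector of a uniform linear array
    (element spacing given in wavelengths)."""
    k = np.arange(n_elements)
    return np.exp(-2j * np.pi * spacing_wl * k * np.sin(theta_rad))

def delay_and_sum(snapshots, theta_rad, spacing_wl=0.5):
    """Conventional (delay-and-sum) beamformer: phase-align the element
    signals toward theta_rad and average them."""
    w = steering_vector(snapshots.shape[0], spacing_wl, theta_rad)
    return (np.conj(w) @ snapshots) / snapshots.shape[0]

# Unit-amplitude plane wave arriving from 20 degrees on a 4-element array
theta0 = np.deg2rad(20.0)
a = steering_vector(4, 0.5, theta0)
carrier = np.exp(1j * 2 * np.pi * 0.01 * np.arange(100))
snapshots = a[:, None] * carrier[None, :]       # (elements, time)
on_target = np.abs(delay_and_sum(snapshots, theta0)).mean()
off_target = np.abs(delay_and_sum(snapshots, np.deg2rad(-40.0))).mean()
```

Steering toward the true arrival angle sums the elements coherently (unit gain here), while off-boresight steering attenuates the source, which is the spatial-division effect the abstract describes.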

  1. Smart Antenna UKM Testbed for Digital Beamforming System

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available A new design of smart antenna testbed developed at UKM for digital beamforming purposes is proposed. The smart antenna UKM testbed is developed based on a modular design employing two novel designs: an L-probe fed inverted hybrid E-H (LIEH) array antenna and a software-reconfigurable digital beamforming system (DBS). The antenna is developed using the novel LIEH microstrip patch element design arranged into a 4×1 uniform linear array antenna. The modular concept of the system provides the capability to test the antenna hardware, beamforming unit, and beamforming algorithm in an independent manner, thus allowing the smart antenna system to be developed and tested in parallel, hence reducing the design time. The DBS was developed using a high-performance TMS320C6711TM floating-point DSP board and a 4-channel RF front-end receiver developed in-house. An interface board is designed to interface the ADC board with the RF front-end receiver. A four-element receiving array testbed at 1.88–2.22 GHz is constructed, and digital beamforming on this testbed is successfully demonstrated.

  2. Improved Chlorophyll-a Algorithm for the Satellite Ocean Color Data in the Northern Bering Sea and Southern Chukchi Sea

    Science.gov (United States)

    Lee, Sang Heon; Ryu, Jongseong; Park, Jung-woo; Lee, Dabin; Kwon, Jae-Il; Zhao, Jingping; Son, SeungHyun

    2018-03-01

    The Bering and Chukchi seas are an important conduit to the Arctic Ocean and are reported to be among the most productive regions of the world's oceans, with high primary productivity that sustains large numbers of fishes, marine mammals, and sea birds as well as benthic animals. Climate-induced changes in primary production and in production at higher trophic levels have also been observed in the northern Bering and Chukchi seas. Satellite ocean color observations could enable the monitoring of relatively long-term patterns in chlorophyll-a (Chl-a) concentrations that serve as an indicator of phytoplankton biomass. The performance of existing global and regional Chl-a algorithms for satellite ocean color data was investigated in the northeastern Bering Sea and southern Chukchi Sea using in situ optical measurements from the Healy 2007 cruise. The model-derived Chl-a data using the previous Chl-a algorithms present striking uncertainties, for example, overestimation at lower Chl-a concentrations or systematic overestimation in the northeastern Bering Sea and southern Chukchi Sea. Accordingly, a simple two-band ratio (Rrs(443)/Rrs(555)) algorithm of Chl-a for satellite ocean color data was devised for the northeastern Bering Sea and southern Chukchi Sea. The MODIS-derived Chl-a data from July 2002 to December 2014 were produced using the new Chl-a algorithm to investigate the seasonal and interannual variations of Chl-a in the northern Bering Sea and the southern Chukchi Sea. The seasonal distribution of Chl-a shows that the highest (spring bloom) Chl-a concentrations occur in May and the lowest in July over the whole area. Chl-a concentrations relatively decreased in June, particularly in the open ocean waters of the Bering Sea. Chl-a concentrations start to increase again in August and become quite high in September. In October, Chl-a concentrations decreased in the western part of the study area and the Alaskan

  3. Optical testbed for the LISA phasemeter

    International Nuclear Information System (INIS)

    Schwarze, T S; Fernández Barranco, G; Penkert, D; Gerberding, O; Heinzel, G; Danzmann, K

    2016-01-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system aka the phasemeter was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon Aps. It in particular provides the electronic readout of the main interferometer phases besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides in LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup. (paper)

  4. Optical testbed for the LISA phasemeter

    Science.gov (United States)

    Schwarze, T. S.; Fernández Barranco, G.; Penkert, D.; Gerberding, O.; Heinzel, G.; Danzmann, K.

    2016-05-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system aka the phasemeter was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon Aps. It in particular provides the electronic readout of the main interferometer phases besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides in LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup.

  5. Validation of Cloud Parameters Derived from Geostationary Satellites, AVHRR, MODIS, and VIIRS Using SatCORPS Algorithms

    Science.gov (United States)

    Minnis, P.; Sun-Mack, S.; Bedka, K. M.; Yost, C. R.; Trepte, Q. Z.; Smith, W. L., Jr.; Painemal, D.; Chen, Y.; Palikonda, R.; Dong, X.; hide

    2016-01-01

    Validation is a key component of remote sensing that can take many different forms. The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) is applied to many different imager datasets, including those from the geostationary satellites Meteosat, Himawari-8, INSAT-3D, GOES, and MTSAT, as well as from the low-Earth-orbiting satellite imagers MODIS, AVHRR, and VIIRS. While each of these imagers has a similar set of channels with wavelengths near 0.65, 3.7, 11, and 12 micrometers, many differences among them can lead to discrepancies in the retrievals. These differences include spatial resolution, spectral response functions, viewing conditions, and calibrations, among others. Even when analyzed with nearly identical algorithms, it is necessary, because of those discrepancies, to validate the results from each imager separately in order to assess the uncertainties in the individual parameters. This paper presents comparisons of various SatCORPS-retrieved cloud parameters with independent measurements and retrievals from a variety of instruments. These include surface- and space-based lidar and radar data from CALIPSO and CloudSat, respectively, to assess cloud fraction, height, base, optical depth, and ice water path; satellite and surface microwave radiometers to evaluate cloud liquid water path; surface-based radiometers to evaluate optical depth and effective particle size; and airborne in situ data to evaluate ice water content, effective particle size, and other parameters. The results of the comparisons are contrasted and the factors influencing the differences are discussed.

  6. A Study on Retrieval Algorithm of Black Water Aggregation in Taihu Lake Based on HJ-1 Satellite Images

    International Nuclear Information System (INIS)

    Lei, Zou; Bing, Zhang; Junsheng, Li; Qian, Shen; Fangfang, Zhang; Ganlin, Wang

    2014-01-01

    The phenomenon of black water aggregation (BWA) occurs in inland water when massive algal bodies aggregate, die, and react with the toxic sludge in certain climate conditions to deprive the water of oxygen. This process results in the deterioration of water quality and damage to the ecosystem. Because charge-coupled device (CCD) camera data from the Chinese HJ environmental satellite show high potential for monitoring BWA, we acquired four HJ-CCD images of Taihu Lake captured during 2009 to 2011 to study this phenomenon. The first study site was selected near the shore of Taihu Lake. We pre-processed the HJ-CCD images and analyzed the digital number (DN) gray values in the research area and in typical BWA areas. The results show that the DN values of the visible bands in BWA areas are obviously lower than those in the research areas. Moreover, we developed an empirical retrieval algorithm for BWA based on the DN mean values and variances of the research areas. Finally, we tested the accuracy of this empirical algorithm. The retrieval accuracies were 89.9%, 58.1%, 73.4%, and 85.5%, respectively, which demonstrates the efficiency of the empirical algorithm in retrieving the approximate distributions of BWA.
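The DN-threshold idea, flagging pixels whose visible-band DN falls well below the scene statistics, can be sketched as follows; the constant k is a placeholder, not one of the paper's fitted thresholds:

```python
import numpy as np

def detect_bwa(dn_visible, k=1.5):
    """Flag candidate black-water-aggregation pixels: visible-band DN
    values well below the scene mean. k is an illustrative constant,
    not the empirical threshold fitted in the study."""
    mu = dn_visible.mean()
    sigma = dn_visible.std()
    return dn_visible < (mu - k * sigma)

# Toy scene: uniform bright water with one dark (BWA-like) pixel
dn = np.full((4, 4), 100.0)
dn[0, 0] = 10.0
mask = detect_bwa(dn)
```

A real implementation would apply the test per visible band and calibrate k against the DN means and variances of known BWA areas, as the abstract describes.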

  7. Application of an optimization algorithm to satellite ocean color imagery: A case study in Southwest Florida coastal waters

    Science.gov (United States)

    Hu, Chuanmin; Lee, Zhongping; Muller-Karger, Frank E.; Carder, Kendall L.

    2003-05-01

    A spectra-matching optimization algorithm, designed for hyperspectral sensors, has been implemented to process SeaWiFS-derived multi-spectral water-leaving radiance data. The algorithm has been tested over Southwest Florida coastal waters. The total spectral absorption and backscattering coefficients can be well partitioned with the inversion algorithm, resulting in RMS errors generally less than 5% in the modeled spectra. For extremely turbid waters that come from either river runoff or sediment resuspension, the RMS error is in the range of 5-15%. The bio-optical parameters derived in this optically complex environment agree well with those obtained in situ. Further, the ability to separate backscattering (a proxy for turbidity) from the satellite signal makes it possible to trace water movement patterns, as indicated by the total absorption imagery. The derived patterns agree with those from concurrent surface drifters. For waters where CDOM overwhelmingly dominates the optical signal, however, the procedure tends to regard CDOM as the sole source of absorption, implying the need for better atmospheric correction and for adjustment of some model coefficients for this particular region.
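The spectra-matching step can be illustrated with a linear least-squares stand-in. The real inversion is nonlinear, partitioning absorption and backscattering inside a reflectance model, so the basis spectra and fit below are only an assumption-laden sketch of the matching idea and its RMS-error diagnostic:

```python
import numpy as np

def match_spectrum(rrs, basis):
    """Fit a measured Rrs spectrum as a linear mix of component spectra
    and report the relative RMS of the spectral match. Stands in for the
    nonlinear optical inversion; the components here are hypothetical."""
    coeffs, *_ = np.linalg.lstsq(basis.T, rrs, rcond=None)
    model = basis.T @ coeffs
    rms = np.sqrt(np.mean((model - rrs) ** 2)) / np.mean(rrs)
    return coeffs, rms

# Hypothetical component spectra: CDOM-like decay, flat offset, linear trend
wl = np.linspace(400, 700, 31)
basis = np.vstack([np.exp(-wl / 400.0), np.ones_like(wl), wl / 700.0])
rrs = 0.5 * basis[0] + 0.2 * basis[2]     # synthetic "measured" spectrum
coeffs, rms = match_spectrum(rrs, basis)
```

The relative RMS plays the same diagnostic role as the 5-15% modeled-spectrum errors quoted in the abstract: a small residual indicates the component partition explains the measured spectrum.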

  8. SeaWiFS Technical Report Series. Volume 42; Satellite Primary Productivity Data and Algorithm Development: A Science Plan for Mission to Planet Earth

    Science.gov (United States)

    Falkowski, Paul G.; Behrenfeld, Michael J.; Esaias, Wayne E.; Balch, William; Campbell, Janet W.; Iverson, Richard L.; Kiefer, Dale A.; Morel, Andre; Yoder, James A.; Hooker, Stanford B. (Editor); hide

    1998-01-01

    Two issues regarding primary productivity, as it pertains to the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Program and the National Aeronautics and Space Administration (NASA) Mission to Planet Earth (MTPE) are presented in this volume. Chapter 1 describes the development of a science plan for deriving primary production for the world ocean using satellite measurements, by the Ocean Primary Productivity Working Group (OPPWG). Chapter 2 presents discussions by the same group, of algorithm classification, algorithm parameterization and data availability, algorithm testing and validation, and the benefits of a consensus primary productivity algorithm.

  9. Validation of near infrared satellite based algorithms to relative atmospheric water vapour content over land

    International Nuclear Information System (INIS)

    Serpolla, A.; Bonafoni, S.; Basili, P.; Biondi, R.; Arino, O.

    2009-01-01

    This paper presents the validation results of ENVISAT MERIS and TERRA MODIS retrieval algorithms for atmospheric Water Vapour Content (WVC) estimation in clear-sky conditions over land. The MERIS algorithm exploits the radiance ratio of the absorbing channel at 900 nm to the almost absorption-free reference at 890 nm, while the MODIS one is based on the ratio of measurements centred near 0.905, 0.936, and 0.94 μm to atmospheric window reflectances at 0.865 and 1.24 μm. The first test was performed in the Mediterranean area using WVC provided by both ECMWF and AERONET. As a second step, the performance of the algorithms was tested using WVC computed from radiosonde observations (RAOBs) in northeastern Australia. The comparisons with respect to reference WVC values showed an overestimation of WVC by MODIS (root mean square error percentage greater than 20%) and an acceptable performance of the MERIS algorithms (root mean square error percentage around 10%).
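Both algorithms rest on treating an absorbing-to-window band ratio as a water-vapour transmittance. A sketch of the inversion under the common exponential form T = exp(alpha - beta*sqrt(W)), with placeholder coefficients rather than the operational MERIS/MODIS values:

```python
import numpy as np

def wvc_from_transmittance(t_ratio, alpha=0.02, beta=0.65):
    """Invert T = exp(alpha - beta*sqrt(W)) for water vapour content W,
    where T is the absorbing/window channel ratio. alpha and beta are
    illustrative placeholders, not the operational coefficients."""
    return ((alpha - np.log(t_ratio)) / beta) ** 2

# A deeper absorption (smaller ratio) implies more water vapour
w = wvc_from_transmittance(0.7)
```

Validation of the kind described above then reduces to comparing such W estimates against ECMWF, AERONET, or radiosonde values.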

  10. A feed-forward Hopfield neural network algorithm (FHNNA) with a colour satellite image for water quality mapping

    Science.gov (United States)

    Asal Kzar, Ahmed; Mat Jafri, M. Z.; Hwee San, Lim; Al-Zuky, Ali A.; Mutter, Kussay N.; Hassan Al-Saleh, Anwar

    2016-06-01

    Many techniques have been applied to the water quality problem, but remote sensing techniques have proven successful, especially when artificial neural networks are used as mathematical models with them. The Hopfield neural network is one type of artificial neural network that is common, fast, simple, and efficient, but it struggles with images that have more than two colours, such as remote sensing images. This work attempts to solve this problem by modifying the network to deal with colour remote sensing images for water quality mapping. A Feed-forward Hopfield Neural Network Algorithm (FHNNA) was modified and used with a colour satellite image from the Thailand Earth Observation System (THEOS) for TSS mapping in the Penang Strait, Malaysia, through the classification of TSS concentrations. The new algorithm is based essentially on three modifications: using the HNN as a feed-forward network, considering the weights of bitplanes, and a non-self architecture (zero diagonal of the weight matrix); in addition, it depends on validation data. The achieved map was colour-coded for visual interpretation. The efficiency of the new algorithm is shown by the high correlation coefficient (R=0.979) and the low root mean square error (RMSE=4.301) between the validation data, which were divided into two groups: one used by the algorithm and the other used for validating the results. The comparison was with the minimum distance classifier. Therefore, TSS mapping of polluted water in the Penang Strait, Malaysia, can be performed using FHNNA with the remote sensing technique (THEOS). This is a new and useful application of the HNN, providing a new model with remote sensing techniques for water quality mapping, an important environmental problem.

  11. Algorithms

    Indian Academy of Sciences (India)

    to as 'divide-and-conquer'. Although there has been a large effort in realizing efficient algorithms, there are not many universally accepted algorithm design paradigms. In this article, we illustrate algorithm design techniques such as balancing, greedy strategy, dynamic programming strategy, and backtracking or traversal of ...

  12. Satellite Ocean Aerosol Retrieval (SOAR) Algorithm Extension to S-NPP VIIRS as Part of the "Deep Blue" Aerosol Project

    Science.gov (United States)

    Sayer, A. M.; Hsu, N. C.; Lee, J.; Bettenhausen, C.; Kim, W. V.; Smirnov, A.

    2018-01-01

    The Suomi National Polar-Orbiting Partnership (S-NPP) satellite, launched in late 2011, carries the Visible Infrared Imaging Radiometer Suite (VIIRS) and several other instruments. VIIRS has similar characteristics to prior satellite sensors used for aerosol optical depth (AOD) retrieval, allowing the continuation of space-based aerosol data records. The Deep Blue algorithm has previously been applied to retrieve AOD from Sea-viewing Wide Field-of-view Sensor (SeaWiFS) and Moderate Resolution Imaging Spectroradiometer (MODIS) measurements over land. The SeaWiFS Deep Blue data set also included a SeaWiFS Ocean Aerosol Retrieval (SOAR) algorithm to cover water surfaces. As part of NASA's VIIRS data processing, Deep Blue is being applied to VIIRS data over land, and SOAR has been adapted from SeaWiFS to VIIRS for use over water surfaces. This study describes SOAR as applied in version 1 of NASA's S-NPP VIIRS Deep Blue data product suite. Several advances have been made since the SeaWiFS application, as well as changes to make use of the broader spectral range of VIIRS. A preliminary validation against Maritime Aerosol Network (MAN) measurements suggests a typical uncertainty on retrieved 550 nm AOD of order ±(0.03+10%), comparable to existing SeaWiFS/MODIS aerosol data products. Retrieved Ångström exponent and fine-mode AOD fraction are also well correlated with MAN data, with small biases and uncertainty similar to or better than SeaWiFS/MODIS products.
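The quoted ±(0.03 + 10%) envelope suggests a simple validation metric: the fraction of matchups falling inside it. A sketch with hypothetical matchup values:

```python
import numpy as np

def fraction_in_envelope(retrieved, reference, abs_err=0.03, rel_err=0.10):
    """Fraction of AOD matchups falling within the +/-(0.03 + 10%)
    uncertainty envelope quoted for the retrieval."""
    retrieved = np.asarray(retrieved, dtype=float)
    reference = np.asarray(reference, dtype=float)
    bound = abs_err + rel_err * reference
    return float(np.mean(np.abs(retrieved - reference) <= bound))

# Hypothetical matchups against ship-borne sun-photometer (MAN-style) AOD
frac = fraction_in_envelope([0.06, 0.13, 0.21, 0.48], [0.05, 0.10, 0.20, 0.50])
```

Reporting this fraction alongside bias and correlation is a common way to summarize how well a retrieval meets its stated uncertainty.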

  13. Automatic Mexico Gulf Oil Spill Detection from Radarsat-2 SAR Satellite Data Using Genetic Algorithm

    Science.gov (United States)

    Marghany, Maged

    2016-10-01

    In this work, a genetic algorithm is exploited for the automatic detection of oil spills of small and large size. The approach is applied to arrays of RADARSAT-2 SAR ScanSAR Narrow single-beam data obtained in the Gulf of Mexico. The study shows that the genetic algorithm automatically segmented the dark spot patches related to small and large oil spill pixels. This conclusion is confirmed by the receiver operating characteristic (ROC) curve and documented ground data. The ROC curve indicates that the existence of oil slick footprints can be identified with an area under the curve, between the ROC curve and the no-discrimination line, of 90%, which is greater than that of other surrounding environmental features. Small oil spills represented 30% of the discriminated oil spill pixels in the ROC curve. In conclusion, the genetic algorithm can be used as a tool for the automatic detection of oil spills of either small or large size, and the ScanSAR Narrow single-beam mode serves as an excellent sensor for oil spill pattern detection and surveying in the Gulf of Mexico.
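The ROC analysis above reduces to computing an area under the curve from detector scores. The rank-sum formulation below is a standard way to do that; the scores are hypothetical:

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve computed as the probability that a
    positive (oil-spill) score exceeds a negative (look-alike) score,
    with ties counted as half."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    return float(np.mean((pos > neg) + 0.5 * (pos == neg)))

# Hypothetical segmentation scores for spill vs. look-alike pixels
auc = roc_auc([0.9, 0.8, 0.7, 0.4], [0.5, 0.3, 0.2, 0.1])
```

An AUC near 0.9, as quoted in the abstract, means the segmentation separates spill pixels from look-alikes in roughly nine of ten random pairings.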

  14. A remote integrated testbed for cooperating objects

    CERN Document Server

    Dios, Jose Ramiro Martinez-de; Bernabe, Alberto de San; Ollero, Anibal

    2013-01-01

    Testbeds are gaining increasing relevance in research domains and in industrial applications, yet very few books devoted to testbeds have been published. This book is particularly interesting for the growing community of testbed developers, and also for researchers in robot-WSN cooperation. It provides a detailed description of a system that can be considered the first testbed allowing full peer-to-peer interoperability between heterogeneous robots and ubiquitous systems.

  15. High efficient optical remote sensing images acquisition for nano-satellite: reconstruction algorithms

    Science.gov (United States)

    Liu, Yang; Li, Feng; Xin, Lei; Fu, Jie; Huang, Puming

    2017-10-01

    A large data volume is one of the most obvious features of satellite-based remote sensing systems, and also a burden for data processing and transmission. The theory of compressive sensing (CS) has been established for almost a decade, and extensive experiments show that CS performs favorably in data compression and recovery, so we apply CS theory to remote sensing image acquisition. In CS, the construction of a classical sensing matrix valid for all sparse signals has to satisfy the Restricted Isometry Property (RIP) strictly, which limits the practical application of CS to image compression. For remote sensing images, however, we know some inherent characteristics such as non-negativity and smoothness. Therefore, the goal of this paper is to present a novel measurement matrix that dispenses with the RIP. The new sensing matrix consists of two parts: a standard Nyquist sampling matrix for thumbnails and a conventional CS sampling matrix. Since most sun-synchronous satellites orbit the Earth every 90 minutes and the revisit cycle is also short, many previously captured remote sensing images of the same place are available in advance. This motivates us to reconstruct remote sensing images through a deep learning approach from the measurements produced by the new framework. We therefore propose a novel deep convolutional neural network (CNN) architecture that takes undersampled measurements as input and outputs an intermediate reconstructed image. Although training such a network takes a long time, the training step need be done only once, which makes the approach attractive for a host of sparse recovery problems.
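    The two-part measurement matrix described above can be sketched as follows. This is a hedged illustration of the idea rather than the authors' implementation: the operator stacks block-averaging rows (the Nyquist thumbnail) on top of Gaussian random rows (the conventional CS part):

```python
import numpy as np

rng = np.random.default_rng(1)

def build_measurement_operator(n_side=16, thumb_stride=4, n_cs=64):
    """Stack a Nyquist thumbnail operator (block averaging) on top of a
    conventional Gaussian CS sampling matrix, as in the two-part scheme."""
    n = n_side * n_side
    # Part 1: thumbnail rows -- each averages one thumb_stride x thumb_stride block.
    rows = []
    for bi in range(0, n_side, thumb_stride):
        for bj in range(0, n_side, thumb_stride):
            r = np.zeros((n_side, n_side))
            r[bi:bi + thumb_stride, bj:bj + thumb_stride] = 1.0 / thumb_stride**2
            rows.append(r.ravel())
    thumb = np.array(rows)
    # Part 2: dense Gaussian random projections (standard CS sensing rows).
    cs = rng.normal(0, 1 / np.sqrt(n), (n_cs, n))
    return np.vstack([thumb, cs])

A = build_measurement_operator()
x = rng.random(16 * 16)          # stand-in for a vectorized image patch
y = A @ x                        # measurements: thumbnail + CS projections
print(A.shape, y.shape)          # (80, 256) (80,) -- 80 measurements for 256 pixels
```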

  16. Validation of ozone profile retrievals derived from the OMPS LP version 2.5 algorithm against correlative satellite measurements

    Science.gov (United States)

    Kramarova, Natalya A.; Bhartia, Pawan K.; Jaross, Glen; Moy, Leslie; Xu, Philippe; Chen, Zhong; DeLand, Matthew; Froidevaux, Lucien; Livesey, Nathaniel; Degenstein, Douglas; Bourassa, Adam; Walker, Kaley A.; Sheese, Patrick

    2018-05-01

    The Limb Profiler (LP) is part of the Ozone Mapping and Profiler Suite launched on board the Suomi NPP satellite in October 2011. The LP measures solar radiation scattered from the atmospheric limb in the ultraviolet and visible spectral ranges between the surface and 80 km. These measurements of scattered solar radiances allow for the retrieval of ozone profiles from cloud tops up to 55 km. The LP started operational observations in April 2012. In this study we evaluate more than 5.5 years of ozone profile measurements from the OMPS LP processed with the new NASA GSFC version 2.5 retrieval algorithm. We provide a brief description of the key changes implemented in this new algorithm, including a pointing correction, new cloud-height detection, an explicit aerosol correction and a reduction of the number of wavelengths used in the retrievals. The OMPS LP ozone retrievals have been compared with independent satellite profile measurements from the Aura Microwave Limb Sounder (MLS), the Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS) and the Odin Optical Spectrograph and InfraRed Imaging System (OSIRIS). We document observed biases and seasonal differences and evaluate the stability of the version 2.5 ozone record over 5.5 years. Our analysis indicates that the mean differences between LP and correlative measurements are well within the required ±10% between 18 and 42 km. In the upper stratosphere and lower mesosphere (> 43 km) LP tends to have a negative bias. We find larger biases in the lower stratosphere and upper troposphere, but LP ozone retrievals have improved significantly in version 2.5 compared to version 2 owing to the implemented aerosol correction. In the northern high latitudes we observe larger biases between 20 and 32 km due to the remaining thermal sensitivity issue. Our analysis shows that LP ozone retrievals agree well with the correlative satellite observations in characterizing the vertical, spatial and temporal variability of stratospheric ozone.

  17. QIKAIM, a fast seminumerical algorithm for the generation of minute-of-arc accuracy satellite predictions

    Science.gov (United States)

    Vermeer, M.

    1981-07-01

    A program was designed to replace AIMLASER for the generation of aiming predictions, to achieve a major saving in computing time, and to keep the program small enough for use even on small systems. The adopted approach numerically integrates the orbit through a pass, limiting the computation of osculating elements to one point per pass. The numerical integration method, whose cumulative error after a given time lapse is fourth order in Δt, is presented. The algorithms are explained, and a flowchart and listing of the program are provided.

  18. A Dynamic Enhancement With Background Reduction Algorithm: Overview and Application to Satellite-Based Dust Storm Detection

    Science.gov (United States)

    Miller, Steven D.; Bankert, Richard L.; Solbrig, Jeremy E.; Forsythe, John M.; Noh, Yoo-Jeong; Grasso, Lewis D.

    2017-12-01

    This paper describes a Dynamic Enhancement Background Reduction Algorithm (DEBRA) applicable to multispectral satellite imaging radiometers. DEBRA uses ancillary information about the clear-sky background to reduce false detections of atmospheric parameters in complex scenes. Applied here to the detection of lofted dust, DEBRA enlists a surface emissivity database coupled with a climatological database of surface temperature to approximate the clear-sky equivalent signal for selected infrared-based multispectral dust detection tests. This background allows for suppression of false alarms caused by land surface features while retaining some ability to detect dust above those problematic surfaces. The algorithm is applicable to both day and nighttime observations and enables weighted combinations of dust detection tests. The results are provided quantitatively, as a detection confidence factor [0, 1], but are also readily visualized as enhanced imagery. Utilizing the DEBRA confidence factor as a scaling factor in false color red/green/blue imagery enables depiction of the targeted parameter in the context of the local meteorology and topography. In this way, the method holds utility for automated clients and human analysts alike. Examples of DEBRA performance from notable dust storms and comparisons against other detection methods and independent observations are presented.
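    The confidence-as-scaling-factor visualization can be illustrated in a few lines of code. This sketch (the function name and highlight color are illustrative, not from the paper) blends a background RGB scene toward a highlight color in proportion to the per-pixel confidence:

```python
import numpy as np

def debra_style_enhancement(rgb, confidence, highlight=(1.0, 0.9, 0.2)):
    """Blend a background RGB scene toward a highlight color in proportion
    to a per-pixel detection confidence in [0, 1] (a DEBRA-like visualization).

    rgb        : (H, W, 3) float array in [0, 1], the context imagery
    confidence : (H, W) float array in [0, 1], e.g. a dust-confidence factor
    """
    c = np.clip(confidence, 0.0, 1.0)[..., None]
    target = np.asarray(highlight)
    return (1.0 - c) * rgb + c * target  # confidence-weighted blend

# Tiny example: uniform gray scene, dust-plume confidence at two pixels.
scene = np.full((4, 4, 3), 0.5)
conf = np.zeros((4, 4))
conf[0, 0] = 1.0      # certain detection -> pure highlight color
conf[1, 1] = 0.5      # partial confidence -> halfway blend
out = debra_style_enhancement(scene, conf)
```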

  19. Satellite lithium-ion battery remaining useful life estimation with an iterative updated RVM fused with the KF algorithm

    Institute of Scientific and Technical Information of China (English)

    Yuchen SONG; Datong LIU; Yandong HOU; Jinxiang YU; Yu PENG

    2018-01-01

    Lithium-ion batteries have become the third-generation space batteries and are widely utilized in a series of spacecraft. Remaining Useful Life (RUL) estimation is essential to a spacecraft as the battery is a critical part and determines the lifetime and reliability. The Relevance Vector Machine (RVM) is a data-driven algorithm used to estimate a battery's RUL due to its sparse feature and uncertainty management capability. In particular, some of the regressive cases indicate that the RVM obtains better short-term than long-term prediction performance. As a nonlinear kernel learning algorithm, the coefficient matrix and relevance vectors are fixed once the RVM training is conducted. Moreover, the RVM can easily be influenced by noise in the training data. Thus, this work proposes an iteratively updated approach to improve the long-term prediction performance for a battery's RUL prediction. Firstly, when a new estimate is output by the RVM, the Kalman filter is applied to optimize this estimate with a physical degradation model. Then, this optimized estimate is added into the training set as an on-line sample, the RVM model is re-trained, and the coefficient matrix and relevance vectors can be dynamically adjusted to make the next iterative prediction. Experimental results with a commercial battery test data set and a satellite battery data set both indicate that the proposed method achieves better performance for RUL estimation.
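    The iterative update loop described above can be sketched as follows. Note the hedges: a simple RBF kernel ridge regressor stands in for the RVM, and a scalar Kalman filter with a linear capacity-fade model stands in for the paper's physical degradation model; all parameter values are illustrative:

```python
import numpy as np

def kernel_ridge_fit(t, y, gamma=0.5, lam=1e-3):
    """Stand-in for RVM training: RBF kernel ridge regression."""
    K = np.exp(-gamma * (t[:, None] - t[None, :]) ** 2)
    alpha = np.linalg.solve(K + lam * np.eye(len(t)), y)
    return t, alpha, gamma

def kernel_ridge_predict(model, t_new):
    t, alpha, gamma = model
    k = np.exp(-gamma * (t_new - t) ** 2)
    return k @ alpha

def iterative_rul_tracking(t_train, y_train, t_future, slope=-0.01,
                           q=1e-5, r=1e-2):
    """Iterated prediction: each regressor output is corrected by a scalar
    Kalman filter driven by a linear degradation model (capacity drops by
    `slope` per cycle), then fed back into the training set before the
    regressor is retrained -- mirroring the update loop of the abstract."""
    t, y = t_train.copy(), y_train.copy()
    x, p = y[-1], 1e-4                        # KF state: capacity estimate
    out = []
    for tf in t_future:
        model = kernel_ridge_fit(t, y)
        z = kernel_ridge_predict(model, tf)   # regressor's estimate
        x, p = x + slope, p + q               # KF predict via degradation model
        k_gain = p / (p + r)                  # KF update, z as the measurement
        x, p = x + k_gain * (z - x), (1 - k_gain) * p
        t, y = np.append(t, tf), np.append(y, x)  # add fused point, retrain next
        out.append(x)
    return np.array(out)

# Noisy linear capacity fade as a toy battery record.
rng = np.random.default_rng(2)
cycles = np.arange(0, 50, dtype=float)
capacity = 1.0 - 0.01 * cycles + rng.normal(0, 0.005, 50)
pred = iterative_rul_tracking(cycles, capacity, np.arange(50.0, 60.0))
```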

  20. Satellite lithium-ion battery remaining useful life estimation with an iterative updated RVM fused with the KF algorithm

    Directory of Open Access Journals (Sweden)

    Yuchen SONG

    2018-01-01

    Full Text Available Lithium-ion batteries have become the third-generation space batteries and are widely utilized in a series of spacecraft. Remaining Useful Life (RUL) estimation is essential to a spacecraft as the battery is a critical part and determines the lifetime and reliability. The Relevance Vector Machine (RVM) is a data-driven algorithm used to estimate a battery's RUL due to its sparse feature and uncertainty management capability. In particular, some of the regressive cases indicate that the RVM obtains better short-term than long-term prediction performance. As a nonlinear kernel learning algorithm, the coefficient matrix and relevance vectors are fixed once the RVM training is conducted. Moreover, the RVM can easily be influenced by noise in the training data. Thus, this work proposes an iteratively updated approach to improve the long-term prediction performance for a battery's RUL prediction. Firstly, when a new estimate is output by the RVM, the Kalman filter is applied to optimize this estimate with a physical degradation model. Then, this optimized estimate is added into the training set as an on-line sample, the RVM model is re-trained, and the coefficient matrix and relevance vectors can be dynamically adjusted to make the next iterative prediction. Experimental results with a commercial battery test data set and a satellite battery data set both indicate that the proposed method achieves better performance for RUL estimation.

  1. An algorithm for enhanced formation flying of satellites in low earth orbit

    Science.gov (United States)

    Folta, David C.; Quinn, David A.

    1998-01-01

    With scientific objectives for Earth observation programs becoming more ambitious and spacecraft becoming more autonomous, the need for innovative technical approaches to the feasibility of achieving and maintaining formations of spacecraft has come to the forefront. The trend to develop small low-cost spacecraft has led many scientists to recognize the advantage of flying several spacecraft in formation to achieve the correlated instrument measurements formerly possible only by flying many instruments on a single large platform. Yet formation flying imposes additional complications on orbit maintenance, especially when each spacecraft has its own orbit requirements. However, advances in automation and technology proposed by the Goddard Space Flight Center (GSFC) allow more of the burden in maneuver planning and execution to be placed onboard the spacecraft, mitigating some of the associated operational concerns. The purpose of this paper is to present the GSFC Guidance, Navigation, and Control Center's (GNCC) algorithm for formation flying of the low-Earth-orbiting spacecraft that are part of the New Millennium Program (NMP). This system will be implemented as closed-loop flight code onboard the NMP Earth Observing-1 (EO-1) spacecraft. Results of this development can be used to determine the appropriateness of formation flying for a particular case as well as its operational impacts. Simulation results using this algorithm integrated into an autonomous "fuzzy logic" control system called AutoCon™ are presented.

  2. Exploring Subpixel Learning Algorithms for Estimating Global Land Cover Fractions from Satellite Data Using High Performance Computing

    Directory of Open Access Journals (Sweden)

    Uttam Kumar

    2017-10-01

    Full Text Available Land cover (LC) refers to the physical and biological cover present over the Earth's surface in terms of the natural environment, such as vegetation, water, bare soil, etc. Most LC features occur at finer spatial scales than the resolution of primary remote sensing satellites. Therefore, observed data are a mixture of spectral signatures of two or more LC features, resulting in mixed pixels. One solution to the mixed-pixel problem is the use of subpixel learning algorithms to disintegrate the pixel spectrum into its constituent spectra. Despite the popularity of and existing research on the topic, the most appropriate approach is still under debate. As an attempt to address this question, we compared the performance of several subpixel learning algorithms based on least squares, sparse regression, signal-subspace and geometrical methods. Analysis of the results obtained through computer-simulated and Landsat data indicated that fully constrained least squares (FCLS) outperformed the other techniques. Further, FCLS was used to unmix global Web-Enabled Landsat Data to obtain abundances of substrate (S), vegetation (V) and dark object (D) classes. Due to the sheer volume of data and the computational needs, we leveraged the NASA Earth Exchange (NEX) high-performance computing architecture to optimize and scale our algorithm for large-scale processing. Subsequently, the S-V-D abundance maps were characterized into four classes, namely forest, farmland, water and urban areas (in conjunction with nighttime lights data), over California, USA, using a random forest classifier. Validation of these LC maps with the National Land Cover Database 2011 products and the North American Forest Dynamics static forest map shows a 6% improvement in unmixing-based classification relative to per-pixel classification. As such, abundance maps continue to offer a useful alternative to high-spatial-resolution classified maps for forest inventory analysis.
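    FCLS itself is compact enough to sketch. A standard formulation (following the common Heinz and Chang trick of appending a heavily weighted sum-to-one row to a non-negative least squares problem; the endmember spectra below are invented) looks like:

```python
import numpy as np
from scipy.optimize import nnls

def fcls_unmix(endmembers, pixel, delta=1e3):
    """Fully constrained least squares (FCLS) abundance estimation.

    endmembers : (bands, n_endmembers) spectral library (columns = S, V, D, ...)
    pixel      : (bands,) observed mixed spectrum
    Enforces non-negativity via NNLS and approximates the sum-to-one
    constraint by appending a heavily weighted row of ones.
    """
    E = np.vstack([endmembers, delta * np.ones(endmembers.shape[1])])
    x = np.append(pixel, delta)
    abundances, _ = nnls(E, x)
    return abundances

# Toy 4-band library: substrate, vegetation, dark object.
E = np.array([[0.30, 0.05, 0.02],
              [0.35, 0.08, 0.02],
              [0.40, 0.45, 0.03],
              [0.45, 0.30, 0.04]])
true_a = np.array([0.2, 0.7, 0.1])
mixed = E @ true_a
a = fcls_unmix(E, mixed)
print(a.round(3))   # close to [0.2, 0.7, 0.1], sums to ~1
```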

  3. Comparison of satellite reflectance algorithms for estimating chlorophyll-a in a temperate reservoir using coincident hyperspectral aircraft imagery and dense coincident surface observations

    Science.gov (United States)

    We analyzed 10 established and 4 new satellite reflectance algorithms for estimating chlorophyll-a (Chl-a) in a temperate reservoir in southwest Ohio using coincident hyperspectral aircraft imagery and dense water truth collected within one hour of image acquisition to develop si...

  4. A practical algorithm for the retrieval of floe size distribution of Arctic sea ice from high-resolution satellite Synthetic Aperture Radar imagery

    Directory of Open Access Journals (Sweden)

    Byongjun Hwang

    2017-07-01

    Full Text Available In this study, we present an algorithm for summer sea ice conditions that semi-automatically produces the floe size distribution of Arctic sea ice from high-resolution satellite Synthetic Aperture Radar data. Currently, floe size distribution data from satellite images are very rare in the literature, mainly due to the lack of a reliable algorithm to produce such data. Here, we developed the algorithm by combining various image analysis methods, including Kernel Graph Cuts, distance transformation and watershed transformation, and a rule-based boundary revalidation. The developed algorithm has been validated against ground truth that was extracted manually with the aid of 1-m resolution visible satellite data. Comprehensive validation analysis has shown both perspectives and limitations. The algorithm tends to fail to detect small floes (mostly less than 100 m in mean caliper diameter) compared to ground truth, which is mainly due to limitations in water-ice segmentation. Some variability in the power law exponent of the floe size distribution is observed due to the effects of control parameters in the process of de-noising, Kernel Graph Cuts segmentation, thresholds for boundary revalidation and image resolution. Nonetheless, the algorithm, for floes larger than 100 m, has shown a reasonable agreement with ground truth under various selections of these control parameters. Considering that the coverage and spatial resolution of satellite Synthetic Aperture Radar data have increased significantly in recent years, the developed algorithm opens a new possibility to produce large volumes of floe size distribution data, which is essential for improving our understanding and prediction of the Arctic sea ice cover.
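    Once floes are segmented, deriving the floe size distribution reduces to labeling connected components and converting areas to diameters. The sketch below covers only this last step, with an area-equivalent diameter as a simple proxy for the mean caliper diameter; the graph-cut and watershed segmentation stages of the paper are not reproduced:

```python
import numpy as np
from scipy import ndimage

def floe_size_distribution(ice_mask, pixel_size_m=10.0):
    """Label connected ice floes in a binary mask and return an effective
    diameter (m) per floe -- the statistic underlying a floe size distribution.
    The area-equivalent diameter is a simple proxy for mean caliper diameter.
    """
    labels, n_floes = ndimage.label(ice_mask)
    areas = ndimage.sum(ice_mask, labels, index=np.arange(1, n_floes + 1))
    diameters = 2.0 * np.sqrt(areas * pixel_size_m**2 / np.pi)
    return np.sort(diameters)[::-1]   # largest floe first

# Synthetic scene: two square floes, 8 px and 3 px on a side (10 m pixels).
mask = np.zeros((32, 32), dtype=bool)
mask[2:10, 2:10] = True      # 64-pixel floe
mask[20:23, 20:23] = True    # 9-pixel floe
d = floe_size_distribution(mask)
```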

  5. Exploring Simple Algorithms for Estimating Gross Primary Production in Forested Areas from Satellite Data

    Directory of Open Access Journals (Sweden)

    Ramakrishna R. Nemani

    2012-01-01

    Full Text Available Algorithms that use remotely-sensed vegetation indices to estimate gross primary production (GPP), a key component of the global carbon cycle, have gained a lot of popularity in the past decade. Yet despite the amount of research on the topic, the most appropriate approach is still under debate. As an attempt to address this question, we compared the performance of different vegetation indices from the Moderate Resolution Imaging Spectroradiometer (MODIS) in capturing the seasonal and the annual variability of GPP estimates from an optimal network of 21 FLUXNET forest tower sites. The tested indices include the Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), Leaf Area Index (LAI), and Fraction of Photosynthetically Active Radiation absorbed by plant canopies (FPAR). Our results indicated that single vegetation indices captured 50–80% of the variability of tower-estimated GPP, but no one index performed universally well in all situations. In particular, EVI outperformed the other MODIS products in tracking seasonal variations in tower-estimated GPP, but annual mean MODIS LAI was the best estimator of the spatial distribution of annual flux-tower GPP (GPP = 615 × LAI − 376, where GPP is in g C/m²/year). This simple algorithm rehabilitated earlier approaches linking ground measurements of LAI to flux-tower estimates of GPP and produced annual GPP estimates comparable to the MODIS 17 GPP product. As such, remote sensing-based estimates of GPP continue to offer a useful alternative to estimates from biophysical models, and the choice of the most appropriate approach depends on whether the estimates are required at annual or sub-annual temporal resolution.

  6. Fast Physics Testbed for the FASTER Project

    Energy Technology Data Exchange (ETDEWEB)

    Lin, W.; Liu, Y.; Hogan, R.; Neggers, R.; Jensen, M.; Fridlind, A.; Lin, Y.; Wolf, A.

    2010-03-15

    This poster describes the Fast Physics Testbed for the new FAst-physics System Testbed and Research (FASTER) project. The overall objective is to provide a convenient and comprehensive platform for fast turn-around model evaluation against ARM observations and to facilitate development of parameterizations for cloud-related fast processes represented in global climate models. The testbed features three major components: a single column model (SCM) testbed, an NWP-Testbed, and high-resolution modeling (HRM). The web-based SCM-Testbed features multiple SCMs from major climate modeling centers and aims to maximize the potential of SCM approach to enhance and accelerate the evaluation and improvement of fast physics parameterizations through continuous evaluation of existing and evolving models against historical as well as new/improved ARM and other complementary measurements. The NWP-Testbed aims to capitalize on the large pool of operational numerical weather prediction products. Continuous evaluations of NWP forecasts against observations at ARM sites are carried out to systematically identify the biases and skills of physical parameterizations under all weather conditions. The high-resolution modeling (HRM) activities aim to simulate the fast processes at high resolution to aid in the understanding of the fast processes and their parameterizations. A four-tier HRM framework is established to augment the SCM- and NWP-Testbeds towards eventual improvement of the parameterizations.

  7. A New Temperature-Vegetation Triangle Algorithm with Variable Edges (TAVE) for Satellite-Based Actual Evapotranspiration Estimation

    Directory of Open Access Journals (Sweden)

    Hua Zhang

    2016-09-01

    Full Text Available The estimation of spatially-variable actual evapotranspiration (AET) is a critical challenge to regional water resources management. We propose a new remote sensing method, the Triangle Algorithm with Variable Edges (TAVE), to generate daily AET estimates based on satellite-derived land surface temperature and the vegetation index NDVI. TAVE captures heterogeneity in AET across elevation zones and permits variability in determining local values of the wet and dry end-member classes (known as edges). Compared to traditional triangle methods, TAVE introduces three unique features: (i) the discretization of the domain as overlapping elevation zones; (ii) a variable wet edge that is a function of elevation zone; and (iii) variable values of a combined-effect parameter (accounting for aerodynamic and surface resistance, vapor pressure gradient, and soil moisture availability) along both wet and dry edges. With these features, TAVE effectively addresses the combined influence of terrain and water stress on AET estimates in semi-arid environments. We demonstrate the effectiveness of this method in one of the driest countries in the world, Jordan, and compare it to a traditional triangle method (TA) and a global AET product (MOD16) over different land use types. In irrigated agricultural lands, TAVE matched the results of the single crop coefficient model (−3%), in contrast to substantial overestimation by TA (+234%) and underestimation by MOD16 (−50%). In forested (non-irrigated), water-consuming regions, TA and MOD16 produced AET average deviations 15.5 times and −3.5 times those based on TAVE. As TAVE has a simple structure and low data requirements, it provides an efficient means to satisfy the increasing need for evapotranspiration estimation in data-scarce semi-arid regions. This study constitutes a much-needed step towards the satellite-based quantification of agricultural water consumption in Jordan.
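    The core triangle idea underlying TAVE can be shown compactly. The sketch below implements only the classic fixed-edge triangle interpolation (TAVE itself additionally varies both edges and the combined-effect parameter per elevation zone); all edge values are invented:

```python
import numpy as np

def triangle_et_fraction(lst, ndvi, t_wet, dry_intercept, dry_slope):
    """Classic temperature-vegetation triangle: a pixel's evaporative fraction
    is its position between the dry edge T_dry(NDVI) = dry_intercept +
    dry_slope * NDVI (dry_slope is negative) and a constant wet edge t_wet.

    lst  : land surface temperature (K)
    ndvi : vegetation index in [0, 1]
    """
    t_dry = dry_intercept + dry_slope * ndvi
    ef = (t_dry - lst) / (t_dry - t_wet)
    return np.clip(ef, 0.0, 1.0)   # bound between dry (0) and wet (1) limits

# A pixel halfway between the edges evaporates at half the wet-limit rate.
ef = triangle_et_fraction(lst=np.array([310.0]), ndvi=np.array([0.5]),
                          t_wet=300.0, dry_intercept=330.0, dry_slope=-20.0)
```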

  8. INFN Tier-1 Testbed Facility

    International Nuclear Information System (INIS)

    Gregori, Daniele; Cavalli, Alessandro; Dell'Agnello, Luca; Dal Pra, Stefano; Prosperini, Andrea; Ricci, Pierpaolo; Ronchieri, Elisabetta; Sapunenko, Vladimir

    2012-01-01

    INFN-CNAF, located in Bologna, is the Information Technology Center of the National Institute of Nuclear Physics (INFN). In the framework of the Worldwide LHC Computing Grid, INFN-CNAF is one of the eleven worldwide Tier-1 centers that store and reprocess Large Hadron Collider (LHC) data. The Italian Tier-1 provides the storage resources (i.e., disk space for short-term needs and tapes for long-term needs) and computing power that are needed for data processing and analysis by the LHC scientific community. Furthermore, the INFN Tier-1 houses computing resources for other particle physics experiments, like CDF at Fermilab and SuperB at Frascati, as well as for astroparticle and space physics experiments. The computing center is a very complex infrastructure: the hardware layer includes the network, storage and farming areas, while the software layer includes open source and proprietary software. Software updates and the addition of new hardware can unexpectedly degrade the production activity of the center; therefore, a testbed facility has been set up in order to reproduce and certify the various layers of the Tier-1. In this article we describe the testbed and the checks performed.

  9. Comparison of two matrix data structures for advanced CSM testbed applications

    Science.gov (United States)

    Regelbrugge, M. E.; Brogan, F. A.; Nour-Omid, B.; Rankin, C. C.; Wright, M. A.

    1989-01-01

    The first section describes data storage schemes presently used by the Computational Structural Mechanics (CSM) testbed sparse matrix facilities and similar skyline (profile) matrix facilities. The second section contains a discussion of certain features required for the implementation of particular advanced CSM algorithms, and how these features might be incorporated into the data storage schemes described previously. The third section presents recommendations, based on the discussions of the prior sections, for directing future CSM testbed development to provide necessary matrix facilities for advanced algorithm implementation and use. The objective is to lend insight into the matrix structures discussed and to help explain the process of evaluating alternative matrix data structures and utilities for subsequent use in the CSM testbed.

  10. Solar Resource Assessment with Sky Imagery and a Virtual Testbed for Sky Imager Solar Forecasting

    Science.gov (United States)

    Kurtz, Benjamin Bernard

    In recent years, ground-based sky imagers have emerged as a promising tool for forecasting solar energy on short time scales (0 to 30 minutes ahead). Following the development of sky imager hardware and algorithms at UC San Diego, we present three new or improved algorithms for sky imager forecasting and forecast evaluation. First, we present an algorithm for measuring irradiance with a sky imager. Sky imager forecasts are often used in conjunction with other instruments for measuring irradiance, so this has the potential to decrease instrumentation costs and logistical complexity. In particular, the forecast algorithm itself often relies on knowledge of the current irradiance which can now be provided directly from the sky images. Irradiance measurements are accurate to within about 10%. Second, we demonstrate a virtual sky imager testbed that can be used for validating and enhancing the forecast algorithm. The testbed uses high-quality (but slow) simulations to produce virtual clouds and sky images. Because virtual cloud locations are known, much more advanced validation procedures are possible with the virtual testbed than with measured data. In this way, we are able to determine that camera geometry and non-uniform evolution of the cloud field are the two largest sources of forecast error. Finally, with the assistance of the virtual sky imager testbed, we develop improvements to the cloud advection model used for forecasting. The new advection schemes are 10-20% better at short time horizons.
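    The baseline cloud advection step that such forecasts refine can be sketched in a few lines: shift the current cloud map by the wind displacement ("frozen cloud" advection). The integer-pixel, wrap-around version below is a simplification for illustration, not the UCSD forecast code:

```python
import numpy as np

def advect_cloud_map(cloud_map, u, v, dt):
    """Frozen-advection forecast: shift a binary cloud map by the wind
    displacement (u*dt, v*dt) in pixels -- the baseline scheme a sky-imager
    forecast refines. Integer shift with wrap-around for simplicity."""
    dx, dy = int(round(u * dt)), int(round(v * dt))
    return np.roll(cloud_map, shift=(dy, dx), axis=(0, 1))

# One cloud pixel advected 3 pixels east by a 1 px/min wind over 3 minutes.
clouds = np.zeros((8, 8), dtype=bool)
clouds[2, 2] = True
forecast = advect_cloud_map(clouds, u=1.0, v=0.0, dt=3.0)
```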

  11. Algorithms

    Indian Academy of Sciences (India)

    ticians but also forms the foundation of computer science. Two ... with methods of developing algorithms for solving a variety of problems but ... applications of computers in science and engineer- ... numerical calculus are as important. We will ...

  12. AN ACTIVE-PASSIVE COMBINED ALGORITHM FOR HIGH SPATIAL RESOLUTION RETRIEVAL OF SOIL MOISTURE FROM SATELLITE SENSORS (Invited)

    Science.gov (United States)

    Lakshmi, V.; Mladenova, I. E.; Narayan, U.

    2009-12-01

    Soil moisture is known to be an essential factor in controlling the partitioning of rainfall into surface runoff and infiltration, and of solar energy into latent and sensible heat fluxes. Remote sensing has long proven its capability to obtain soil moisture in near real-time. However, at the present time the Advanced Microwave Scanning Radiometer (AMSR-E) on board NASA's Aqua platform is the only satellite sensor that supplies a soil moisture product. AMSR-E's coarse spatial resolution (~50 km at 6.9 GHz) strongly limits its applicability to small-scale studies. A very promising technique for spatial disaggregation by combining radar and radiometer observations has been demonstrated by the authors, using a methodology based on the assumption that any change in measured brightness temperature and backscatter from one time step to the next is due primarily to change in soil wetness. The approach uses radiometric estimates of soil moisture at a lower resolution to compute the sensitivity of radar to soil moisture at that resolution. This estimate of sensitivity is then disaggregated using vegetation water content, vegetation type and soil texture information, which are the variables that determine the radar sensitivity to soil moisture and are generally available at the scale of the radar observation. This change detection algorithm is applied to several locations. We have used aircraft-observed active and passive data over the Walnut Creek watershed in central Iowa in 2002, the Little Washita watershed in Oklahoma in 2003 and the Murrumbidgee catchment in southeastern Australia in 2006. All of these locations have different soil and land cover conditions, which leads to a rigorous test of the disaggregation algorithm. Furthermore, we compare the derived high spatial resolution soil moisture to in-situ sampling and ground observation networks.
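    The change-detection idea can be sketched as follows, with heavy caveats: the sensitivity here is a single uniform value calibrated from the coarse radiometer product, whereas the published method disaggregates it using vegetation and soil-texture maps; all numbers are synthetic:

```python
import numpy as np

def disaggregate_soil_moisture(sm_coarse_t0, sm_coarse_t1,
                               sigma_fine_t0, sigma_fine_t1):
    """Change-detection downscaling sketch: assume the change in fine-scale
    radar backscatter (dB) between two overpasses is driven by soil-moisture
    change, with a sensitivity S calibrated from the coarse radiometer product.

    sm_coarse_*  : coarse-cell soil moisture (m3/m3) at two times
    sigma_fine_* : (H, W) radar backscatter (dB) within that coarse cell
    Returns fine-scale soil-moisture change (m3/m3).
    """
    d_sigma_coarse = (sigma_fine_t1 - sigma_fine_t0).mean()
    sensitivity = d_sigma_coarse / (sm_coarse_t1 - sm_coarse_t0)  # dB per m3/m3
    return (sigma_fine_t1 - sigma_fine_t0) / sensitivity

# Toy cell: coarse product says moisture rose 0.05 m3/m3; radar shows a
# spatially varying backscatter increase (assumed ~20 dB per m3/m3).
rng = np.random.default_rng(3)
s0 = rng.normal(-12.0, 0.5, (8, 8))
ds_true = 0.05 + rng.normal(0, 0.01, (8, 8))
s1 = s0 + ds_true * 20.0
dsm = disaggregate_soil_moisture(0.20, 0.25, s0, s1)
```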

  13. COLUMBUS as Engineering Testbed for Communications and Multimedia Equipment

    Science.gov (United States)

    Bank, C.; Anspach von Broecker, G. O.; Kolloge, H.-G.; Richters, M.; Rauer, D.; Urban, G.; Canovai, G.; Oesterle, E.

    2002-01-01

    The paper presents ongoing activities to prepare COLUMBUS for communications and multimedia technology experiments. For this purpose, Astrium SI, Bremen, has studied several options for how to best combine the given system architecture with flexible and state-of-the-art interface avionics and software. These activities have been conducted in coordination with, and partially under contract of, DLR and ESA/ESTEC. Moreover, Astrium SI has realized three testbeds for multimedia software and hardware testing with its own funding. The experimental core avionics unit, about half a double rack, establishes the core of a new multi-user experiment facility for this type of investigation onboard COLUMBUS, which shall be available to all users of COLUMBUS. It allows for the connection of second-generation payload, that is, payload requiring broadband data transfer and near-real-time access by the Principal Investigator on the ground, to test highly interactive and near-real-time payload operation. The facility is also foreseen to test new equipment to provide the astronauts onboard the ISS/COLUMBUS with bi-directional hi-fi voice and video connectivity to the ground, private voice communications and e-mail, and a multimedia workstation for operations training and recreation. Connection to an appropriate Wide Area Network (WAN) on Earth is possible. The facility will include a broadband data transmission front-end terminal, which is mounted externally on the COLUMBUS module. This equipment provides high flexibility due to its completely transparent transmit and receive chains, its steerable multi-frequency antenna system and its own thermal and power control and distribution. The equipment is monitored and controlled via the COLUMBUS internal facility. It combines several hardware items newly developed for the next generation of broadband communication satellites and operates in Ka-band with the experimental ESA data relay satellite ARTEMIS. The equipment is also TDRSS compatible.

  14. LTE-Advanced/WLAN testbed

    OpenAIRE

    Plaisner, Denis

    2017-01-01

    This thesis examines and evaluates communication under the LTE-Advanced and WiFi (IEEE 802.11n/ac) standards. For each standard, the error vector magnitude (EVM) parameter is investigated. A universal workstation (testbed) is designed for working with the individual standards. This universal testbed is used to configure the transmitting and receiving equipment and to process and evaluate the transmitted signals. Matlab was chosen as the environment for this work, through which the instruments used are controlled, such as...

  15. Algorithms

    Indian Academy of Sciences (India)

    algorithm design technique called 'divide-and-conquer'. One of ... Turtle graphics, September. 1996. 5. ... whole list named 'PO' is a pointer to the first element of the list; ..... Program for computing matrices X and Y and placing the result in C *).

  16. Algorithms

    Indian Academy of Sciences (India)

    algorithm that it is implicitly understood that we know how to generate the next natural ..... Explicit comparisons are made in line (1) where maximum and minimum is ... It can be shown that the function T(n) = 3/2n -2 is the solution to the above ...

  17. Satellite remote sensing of harmful algal blooms: A new multi-algorithm method for detecting the Florida Red Tide (Karenia brevis)

    Science.gov (United States)

    Carvalho, Gustavo A.; Minnett, Peter J.; Fleming, Lora E.; Banzon, Viva F.; Baringer, Warner

    2010-01-01

    In a continuing effort to develop suitable methods for the surveillance of Harmful Algal Blooms (HABs) of Karenia brevis using satellite radiometers, a new multi-algorithm method was developed to explore whether improvements in the remote sensing detection of the Florida Red Tide were possible. A Hybrid Scheme was introduced that sequentially applies the optimized versions of two pre-existing satellite-based algorithms: an Empirical Approach (using water-leaving radiance as a function of chlorophyll concentration) and a Bio-optical Technique (using particulate backscatter along with chlorophyll concentration). The long-term evaluation of the new multi-algorithm method was performed using a multi-year MODIS dataset (2002 to 2006; during the boreal Summer-Fall periods – July to December) along the Central West Florida Shelf between 25.75°N and 28.25°N. Algorithm validation was done with in situ measurements of the abundances of K. brevis; cell counts ≥1.5×10⁴ cells l⁻¹ defined a detectable HAB. Encouraging statistical results were derived when either or both algorithms correctly flagged known samples. The majority of the valid match-ups were correctly identified (~80% of both HABs and non-blooming conditions) and few false negatives or false positives were produced (~20% of each). Additionally, most of the HAB-positive identifications in the satellite data were indeed HAB samples (positive predictive value: ~70%) and those classified as HAB-negative were almost all non-bloom cases (negative predictive value: ~86%). These results demonstrate an excellent detection capability, on average ~10% more accurate than the individual algorithms used separately. Thus, the new Hybrid Scheme could become a powerful tool for environmental monitoring of K. brevis blooms, with valuable consequences including leading to the more rapid and efficient use of ships to make in situ measurements of HABs. PMID:21037979
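
    The sequential either-or logic of the Hybrid Scheme described above can be sketched in a few lines. The thresholds, variable names and flagging rules below are illustrative assumptions, not the paper's calibrated MODIS criteria:

```python
# Hedged sketch of a two-detector hybrid HAB flag: a sample is labeled a
# bloom when either of two independent detectors fires. All thresholds and
# inputs here are hypothetical placeholders.

def empirical_flag(chl_anomaly, threshold=1.0):
    """Empirical Approach: flag on an anomalously high chlorophyll value."""
    return chl_anomaly >= threshold

def bio_optical_flag(backscatter, chl, max_ratio=0.5):
    """Bio-optical Technique: K. brevis blooms backscatter weakly, so flag
    when particulate backscatter is low relative to chlorophyll."""
    return chl > 0 and (backscatter / chl) <= max_ratio

def hybrid_hab_flag(chl_anomaly, backscatter, chl):
    """Hybrid Scheme: apply both detectors sequentially; HAB if either fires."""
    return empirical_flag(chl_anomaly) or bio_optical_flag(backscatter, chl)
```

    Validation would then proceed as in the abstract: each flagged match-up is scored against the ≥1.5×10⁴ cells l⁻¹ in situ criterion.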

  18. Wireless Sensor Networks TestBed: ASNTbed

    CSIR Research Space (South Africa)

    Dludla, AG

    2013-05-01

    Full Text Available Wireless sensor networks (WSNs) have been used in different types of applications and deployed within various environments. Simulation tools are essential for studying WSNs, especially for exploring large-scale networks. However, WSN testbeds...

  19. AMS San Diego Testbed - Calibration Data

    Data.gov (United States)

    Department of Transportation — The data in this repository were collected from the San Diego, California testbed, namely, I-15 from the interchange with SR-78 in the north to the interchange with...

  20. University of Florida Advanced Technologies Campus Testbed

    Science.gov (United States)

    2017-09-21

    The University of Florida (UF) and its Transportation Institute (UFTI), the Florida Department of Transportation (FDOT) and the City of Gainesville (CoG) are cooperating to develop a smart transportation testbed on the University of Florida (UF) main...

  1. Versatile Electric Propulsion Aircraft Testbed, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — An all-electric aircraft testbed is proposed to provide a dedicated development environment for the rigorous study and advancement of electrically powered aircraft....

  2. Algorithms

    Indian Academy of Sciences (India)

    will become clear in the next article when we discuss a simple Logo-like programming language. ... Rod B may be used as an auxiliary store. The problem is to find an algorithm which performs this task. ... N0 disks are moved from A to B using C as auxiliary rod. • move_disk(A, C); (N0 + 1)th disk is moved from A to C directly ...

  3. A Reconfigurable Testbed Environment for Spacecraft Autonomy

    Science.gov (United States)

    Biesiadecki, Jeffrey; Jain, Abhinandan

    1996-01-01

    A key goal of NASA's New Millennium Program is the development of technology for increased spacecraft on-board autonomy. Achievement of this objective requires the development of a new class of ground-based automony testbeds that can enable the low-cost and rapid design, test, and integration of the spacecraft autonomy software. This paper describes the development of an Autonomy Testbed Environment (ATBE) for the NMP Deep Space I comet/asteroid rendezvous mission.

  4. Implementation of standard testbeds for numerical relativity

    Energy Technology Data Exchange (ETDEWEB)

    Babiuc, M C [Department of Physics and Physical Science, Marshall University, Huntington, WV 25755 (United States); Husa, S [Friedrich Schiller University Jena, Max-Wien-Platz 1, 07743 Jena (Germany); Alic, D [Department of Physics, University of the Balearic Islands, Cra Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Hinder, I [Center for Gravitational Wave Physics, Pennsylvania State University, University Park, PA 16802 (United States); Lechner, C [Weierstrass Institute for Applied Analysis and Stochastics (WIAS), Mohrenstrasse 39, 10117 Berlin (Germany); Schnetter, E [Center for Computation and Technology, 216 Johnston Hall, Louisiana State University, Baton Rouge, LA 70803 (United States); Szilagyi, B; Dorband, N; Pollney, D; Winicour, J [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut), Am Muehlenberg 1, 14076 Golm (Germany); Zlochower, Y [Center for Computational Relativity and Gravitation, School of Mathematical Sciences, Rochester Institute of Technology, 78 Lomb Memorial Drive, Rochester, New York 14623 (United States)

    2008-06-21

    We discuss results that have been obtained from the implementation of the initial round of testbeds for numerical relativity which was proposed in the first paper of the Apples with Apples Alliance. We present benchmark results for various codes which provide templates for analyzing the testbeds and for drawing conclusions about various features of the codes. This allows us to sharpen the initial test specifications, design a new test and add theoretical insight.

  5. An adaptable, low cost test-bed for unmanned vehicle systems research

    Science.gov (United States)

    Goppert, James M.

    2011-12-01

    An unmanned vehicle systems test-bed has been developed. The test-bed has been designed to accommodate hardware changes and various vehicle types and algorithms. The creation of this test-bed allows research teams to focus on algorithm development and employ a common, well-tested experimental framework. The ArduPilotOne autopilot was developed to provide the necessary level of abstraction for multiple vehicle types. The autopilot was also designed to be highly integrated with the Mavlink protocol for Micro Air Vehicle (MAV) communication. Mavlink is the native protocol for QGroundControl, a MAV ground control program. Features were added to QGroundControl to accommodate outdoor usage. Next, the Mavsim toolbox was developed for Scicoslab to allow hardware-in-the-loop testing, control design and analysis, and estimation algorithm testing and verification. In order to obtain linear models of aircraft dynamics, the JSBSim flight dynamics engine was extended to use a probabilistic Nelder-Mead simplex method. The JSBSim aircraft dynamics were compared with collected wind-tunnel data. Finally, a structured methodology for successive loop closure control design is proposed. This methodology is demonstrated along with the rest of the test-bed tools on a quadrotor, a fixed-wing RC plane, and a ground vehicle. Test results for the ground vehicle are presented.

  6. Long-term analysis of aerosol optical depth over Northeast Asia using a satellite-based measurement: MI Yonsei Aerosol Retrieval Algorithm (YAER)

    Science.gov (United States)

    Kim, Mijin; Kim, Jhoon; Yoon, Jongmin; Chung, Chu-Yong; Chung, Sung-Rae

    2017-04-01

    In 2010, the Korean geostationary earth orbit (GEO) satellite, the Communication, Ocean, and Meteorological Satellite (COMS), was launched, carrying the Meteorological Imager (MI). The MI measures atmospheric conditions over Northeast Asia (NEA) using a single visible channel centered at 0.675 μm and four IR channels at 3.75, 6.75, 10.8, and 12.0 μm. The visible measurement can also be utilized for the retrieval of aerosol optical properties (AOPs). Since the GEO satellite measurement has an advantage for continuous monitoring of AOPs, we can analyze the spatiotemporal variation of the aerosol using the MI observations over NEA. We therefore developed an algorithm to retrieve aerosol optical depth (AOD) from the MI visible observations, named the MI Yonsei Aerosol Retrieval Algorithm (YAER). In this study, we investigated the accuracy of the MI YAER AOD by comparing the values with the long-term products of AERONET sun-photometers. The results showed that the MI AODs were significantly overestimated relative to the AERONET values over bright surfaces in low-AOD cases. Because the MI visible channel is centered in the red spectral range, the contribution of the aerosol signal to the measured reflectance is relatively low compared with the surface contribution. Therefore, the AOD error in low-AOD cases over bright surfaces can be a fundamental limitation of the algorithm. Meanwhile, the assumption of a background aerosol optical depth (BAOD) could also contribute to the retrieval uncertainty. To estimate the surface reflectance considering the polluted air condition over the NEA, we estimated the BAOD from the MODIS dark target (DT) aerosol products pixel by pixel. Satellite-based AOD retrieval, however, depends largely on the accuracy of the surface reflectance estimation, especially in low-AOD cases, and thus the BAOD could inherit the uncertainty in the surface reflectance estimation of the satellite-based retrieval. Therefore, we re-estimated the BAOD using the ground-based sun-photometer measurement, and

  7. Graphical interface between the CIRSSE testbed and CimStation software with MCS/CTOS

    Science.gov (United States)

    Hron, Anna B.

    1992-01-01

    This research is concerned with developing a graphical simulation of the testbed at the Center for Intelligent Robotic Systems for Space Exploration (CIRSSE) and the interface which allows for communication between the two. Such an interface is useful in telerobotic operations and as a functional interaction tool for testbed users. Creating a simulated model of a real-world system generates inevitable calibration discrepancies between the two. This thesis gives a brief overview of the work done to date in the area of workcell representation and communication, describes the development of the CIRSSE interface, and gives a direction for future work in the area of system calibration. The CimStation software used for development of this interface is a highly versatile robotic workcell simulation package which has been programmed for this application with a scale graphical model of the testbed and supporting interface menu code. A need for this tool has been identified for path previewing, as a window on teleoperation, and for calibration of simulated vs. real-world models. The interface allows information (i.e., joint angles) generated by CimStation to be sent as motion goal positions to the testbed robots. An option of the interface has been established such that joint angle information generated by supporting testbed algorithms (i.e., TG, collision avoidance) can be piped through CimStation as a visual preview of the path.

  8. The CMS integration grid testbed

    Energy Technology Data Exchange (ETDEWEB)

    Graham, Gregory E.

    2004-08-26

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent based Mona Lisa. Domain specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two month span in Fall of 2002, over 1 million official CMS GEANT based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids.

  9. The CMS Integration Grid Testbed

    CERN Document Server

    Graham, G E; Aziz, Shafqat; Bauerdick, L.A.T.; Ernst, Michael; Kaiser, Joseph; Ratnikova, Natalia; Wenzel, Hans; Wu, Yu-jun; Aslakson, Erik; Bunn, Julian; Iqbal, Saima; Legrand, Iosif; Newman, Harvey; Singh, Suresh; Steenberg, Conrad; Branson, James; Fisk, Ian; Letts, James; Arbree, Adam; Avery, Paul; Bourilkov, Dimitri; Cavanaugh, Richard; Rodriguez, Jorge Luis; Kategari, Suchindra; Couvares, Peter; DeSmet, Alan; Livny, Miron; Roy, Alain; Tannenbaum, Todd; Graham, Gregory E.; Aziz, Shafqat; Ernst, Michael; Kaiser, Joseph; Ratnikova, Natalia; Wenzel, Hans; Wu, Yujun; Aslakson, Erik; Bunn, Julian; Iqbal, Saima; Legrand, Iosif; Newman, Harvey; Singh, Suresh; Steenberg, Conrad; Branson, James; Fisk, Ian; Letts, James; Arbree, Adam; Avery, Paul; Bourilkov, Dimitri; Cavanaugh, Richard; Rodriguez, Jorge; Kategari, Suchindra; Couvares, Peter; Smet, Alan De; Livny, Miron; Roy, Alain; Tannenbaum, Todd

    2003-01-01

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent based Mona Lisa. Domain specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two month span in Fall of 2002, over 1 million official CMS GEANT based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. ...

  10. The AMSR2 Satellite-based Microwave Snow Algorithm (SMSA) to estimate regional to global snow depth and snow water equivalent

    Science.gov (United States)

    Kelly, R. E. J.; Saberi, N.; Li, Q.

    2017-12-01

    With moderate to high spatial resolution observation approaches yet to be fully scoped and developed, the long-term satellite passive microwave record remains an important tool for cryosphere-climate diagnostics. A new satellite microwave remote sensing approach is described for estimating snow depth (SD) and snow water equivalent (SWE). The algorithm, called the Satellite-based Microwave Snow Algorithm (SMSA), uses Advanced Microwave Scanning Radiometer - 2 (AMSR2) observations aboard the Global Change Observation Mission - Water mission launched by the Japan Aerospace Exploration Agency in 2012. The approach is unique since it leverages observed brightness temperatures (Tb) with static ancillary data to parameterize a physically-based retrieval without requiring parameter constraints from in situ snow depth observations or historical snow depth climatology. After screening snow from non-snow surface targets (water bodies [including freeze/thaw state], rainfall, high altitude plateau regions [e.g. the Tibetan plateau]), moderate and shallow snow depths are estimated by minimizing the difference between Dense Media Radiative Transfer model estimates (Tsang et al., 2000; Picard et al., 2011) and AMSR2 Tb observations to retrieve SWE and SD. Parameterization of the model combines a parsimonious snow grain size and density approach originally developed by Kelly et al. (2003). Evaluation of the SMSA performance is achieved using in situ snow depth data from a variety of standard and experimental data sources. Results presented from winter seasons 2012-13 to 2016-17 illustrate the improved performance of the new approach in comparison with the baseline AMSR2 algorithm estimates and approach the performance of the model assimilation-based approach of GlobSnow. Given the variation in estimation power of SWE by different land surface/climate models and selected satellite-derived passive microwave approaches, SMSA provides SWE estimates that are independent of real or near real
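
    The core retrieval step described above, minimizing the misfit between forward-modeled and observed brightness temperatures, can be illustrated with a toy grid search. The linear two-channel "forward model" below merely stands in for the Dense Media Radiative Transfer model; its coefficients and channel labels are invented for illustration:

```python
import numpy as np

def toy_tb_model(snow_depth_m):
    """Placeholder forward model: Tb at two channels vs. snow depth."""
    return np.array([260.0 - 30.0 * snow_depth_m,   # lower-frequency channel
                     255.0 - 60.0 * snow_depth_m])  # higher-frequency channel

def retrieve_snow_depth(tb_observed, depths=np.linspace(0.0, 2.0, 201)):
    """Grid-search minimization of the model-observation misfit."""
    costs = [np.sum((toy_tb_model(d) - tb_observed) ** 2) for d in depths]
    return float(depths[int(np.argmin(costs))])
```

    The real algorithm additionally tunes snow grain size and density within the radiative transfer model; a one-parameter grid search only conveys the shape of the inversion.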

  11. The University of Canberra quantum key distribution testbed

    International Nuclear Information System (INIS)

    Ganeshkumar, G.; Edwards, P.J.; Cheung, W.N.; Barbopoulos, L.O.; Pham, H.; Hazel, J.C.

    1999-01-01

    Full text: We describe the design, operation and preliminary results obtained from a quantum key distribution (QKD) testbed constructed at the University of Canberra. Quantum cryptographic systems use shared secret keys exchanged in the form of sequences of polarisation-coded or phase-encoded single photons transmitted over an optical communications channel. Secrecy of this quantum key rests upon fundamental laws of quantum physics: measurements of linear or circular photon polarisation states introduce noise into the conjugate variable and so reveal eavesdropping. In its initial realisation reported here, pulsed light from a 650 nm laser diode is attenuated by a factor of 10⁶, plane-polarised and then transmitted through a birefringent liquid crystal modulator (LCM) to a polarisation-sensitive single photon receiver. The transmitted key sequence consists of a 1 kHz train of weak coherent 100 ns wide light pulses, polarisation-coded according to the BB84 protocol. Each pulse is randomly assigned one of four polarisation states (two orthogonal linear and two orthogonal circular) by computer PCA, operated by the sender ('Alice'). This quaternary polarisation-shift-keyed photon stream is detected by the receiver ('Bob'), whose computer (PCB) randomly chooses either a linear or a circular polarisation basis. Computer PCB is also used for final key selection, authentication, privacy amplification and eavesdropping detection. We briefly discuss the realisation of a mesoscopic single photon QKD source and the use of the testbed to simulate a global quantum key distribution system using earth satellites. Copyright (1999) Australian Optical Society
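
    The BB84 exchange this testbed implements can be illustrated by a minimal basis-sifting simulation. Function and variable names are ours, and no channel noise or eavesdropper is modeled; a sifted bit survives whenever sender and receiver happened to pick the same basis:

```python
import secrets

# Minimal BB84 sifting sketch: the sender's computer (PCA) assigns each
# pulse a random bit in a random basis (Linear or Circular); the receiver's
# computer (PCB) measures in a random basis; only matching-basis positions
# survive sifting. Idealized: noiseless channel, no eavesdropper.

def bb84_sift(n_pulses):
    alice_bits  = [secrets.randbelow(2) for _ in range(n_pulses)]
    alice_bases = [secrets.choice("LC") for _ in range(n_pulses)]
    bob_bases   = [secrets.choice("LC") for _ in range(n_pulses)]
    # In the ideal case a matching basis yields Alice's bit exactly.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

key = bb84_sift(1000)  # on average ~500 bits survive sifting
```

    In the real system, eavesdropping is revealed because measurements in the wrong conjugate basis disturb the photon states, raising the error rate in the sifted key.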

  12. SABA: A Testbed for a Real-Time MIMO System

    Directory of Open Access Journals (Sweden)

    Brühl Lars

    2006-01-01

    Full Text Available The growing demand for high data rates in wireless communication systems leads to the development of new technologies to increase the channel capacity and thus the data rate. MIMO (multiple-input multiple-output) systems are best qualified for these applications. In this paper, we present a MIMO test environment for high data rate transmissions in frequency-selective environments. An overview of the testbed is given, including the analyzed algorithms, the digital signal processing with a new highly parallel processor to perform the algorithms in real time, as well as the analog front-ends. A brief overview of the influence of polarization on the channel capacity is given as well.

  13. A Dual-Channel Acquisition Method Based on Extended Replica Folding Algorithm for Long Pseudo-Noise Code in Inter-Satellite Links.

    Science.gov (United States)

    Zhao, Hongbo; Chen, Yuying; Feng, Wenquan; Zhuang, Chen

    2018-05-25

    Inter-satellite links are an important component of the new generation of satellite navigation systems, characterized by low signal-to-noise ratio (SNR), complex electromagnetic interference and the short time slot of each satellite, which brings difficulties to the acquisition stage. The inter-satellite links in both the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS) adopt the long code spread spectrum system. However, long code acquisition is a difficult and time-consuming task due to the long code period. Traditional folding methods such as the extended replica folding acquisition search technique (XFAST) and direct average are largely restricted because of code Doppler and additional SNR loss caused by replica folding. The dual folding method (DF-XFAST) and dual-channel method have been proposed to achieve long code acquisition in low SNR and high dynamic situations, respectively, but the former is easily affected by code Doppler and the latter is not fast enough. Considering the environment of inter-satellite links and the problems of existing algorithms, this paper proposes a new long code acquisition algorithm named dual-channel acquisition method based on the extended replica folding algorithm (DC-XFAST). This method employs dual channels for verification. Each channel contains an incoming signal block. Local code samples are folded and zero-padded to the length of the incoming signal block. After a circular FFT operation, the correlation results contain two peaks of the same magnitude and specified relative position. The detection process is eased through finding the two largest values. The verification takes all the full and partial peaks into account. Numerical results reveal that the DC-XFAST method can improve acquisition performance while acquisition speed is guaranteed. The method has a significantly higher acquisition probability than folding methods XFAST and DF-XFAST. Moreover, with the advantage of higher detection
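
    The folding-plus-circular-FFT correlation that XFAST-style methods (including DC-XFAST) build on can be sketched as follows. The code length, block size and fold factor are toy values, not the actual GPS/BDS long-code parameters; folding trades correlation SNR (the loss the abstract mentions) for searching the whole code with one short FFT:

```python
import numpy as np

def folded_circular_correlation(incoming, local_code, fold_factor):
    # Fold: sum the long local code over fold_factor equal-length segments.
    segments = local_code.reshape(fold_factor, -1)
    folded = segments.sum(axis=0)
    # Zero-pad the folded replica to the incoming block length (a no-op in
    # this toy, where the folded length already equals the block length).
    replica = np.zeros(len(incoming))
    replica[:len(folded)] = folded
    # Circular correlation via FFT: one spectrum product tests every
    # code phase of the folded replica at once.
    corr = np.fft.ifft(np.fft.fft(incoming) * np.conj(np.fft.fft(replica)))
    return np.abs(corr)

rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=4096)   # toy PN code (real codes are far longer)
phase = 1000
incoming = np.roll(code, -phase)[:1024]     # received block at an unknown phase
corr = folded_circular_correlation(incoming, code, fold_factor=4)
peak = int(np.argmax(corr))                 # peak index gives the code phase mod block
```

    DC-XFAST runs this search on two signal blocks in parallel and cross-checks the resulting full and partial peaks, which is what eases detection relative to a single folded correlation.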

  14. A Dual-Channel Acquisition Method Based on Extended Replica Folding Algorithm for Long Pseudo-Noise Code in Inter-Satellite Links

    Directory of Open Access Journals (Sweden)

    Hongbo Zhao

    2018-05-01

    Full Text Available Inter-satellite links are an important component of the new generation of satellite navigation systems, characterized by low signal-to-noise ratio (SNR), complex electromagnetic interference and the short time slot of each satellite, which brings difficulties to the acquisition stage. The inter-satellite links in both the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS) adopt the long code spread spectrum system. However, long code acquisition is a difficult and time-consuming task due to the long code period. Traditional folding methods such as the extended replica folding acquisition search technique (XFAST) and direct average are largely restricted because of code Doppler and additional SNR loss caused by replica folding. The dual folding method (DF-XFAST) and dual-channel method have been proposed to achieve long code acquisition in low SNR and high dynamic situations, respectively, but the former is easily affected by code Doppler and the latter is not fast enough. Considering the environment of inter-satellite links and the problems of existing algorithms, this paper proposes a new long code acquisition algorithm named dual-channel acquisition method based on the extended replica folding algorithm (DC-XFAST). This method employs dual channels for verification. Each channel contains an incoming signal block. Local code samples are folded and zero-padded to the length of the incoming signal block. After a circular FFT operation, the correlation results contain two peaks of the same magnitude and specified relative position. The detection process is eased through finding the two largest values. The verification takes all the full and partial peaks into account. Numerical results reveal that the DC-XFAST method can improve acquisition performance while acquisition speed is guaranteed. The method has a significantly higher acquisition probability than folding methods XFAST and DF-XFAST. Moreover, with the advantage of higher

  15. The feasibility of retrieving vertical temperature profiles from satellite nadir UV observations: A sensitivity analysis and an inversion experiment with neural network algorithms

    International Nuclear Information System (INIS)

    Sellitto, P.; Del Frate, F.

    2014-01-01

    Atmospheric temperature profiles are inferred from passive satellite instruments using thermal infrared or microwave observations. Here we investigate the feasibility of retrieving height-resolved temperature information in the ultraviolet spectral region. The temperature dependence of the absorption cross sections of ozone in the Huggins band, in particular in the interval 320–325 nm, is exploited. We carried out a sensitivity analysis and demonstrated that non-negligible information on the temperature profile can be extracted from this small band. Starting from these results, we developed a neural network inversion algorithm, trained and tested with simulated nadir EnviSat-SCIAMACHY ultraviolet observations. The algorithm is able to retrieve the temperature profile with root mean square errors and biases comparable to existing retrieval schemes that use thermal infrared or microwave observations. This demonstrates, for the first time, the feasibility of temperature profile retrieval from space-borne instruments operating in the ultraviolet. - Highlights: • A sensitivity analysis and an inversion scheme to retrieve temperature profiles from satellite UV observations (320–325 nm). • The exploitation of the temperature dependence of the absorption cross section of ozone in the Huggins band is proposed. • First demonstration of the feasibility of temperature profile retrieval from satellite UV observations. • RMSEs and biases comparable with more established techniques involving TIR and MW observations

  16. Multisensor satellite data for water quality analysis and water pollution risk assessment: decision making under deep uncertainty with fuzzy algorithm in framework of multimodel approach

    Science.gov (United States)

    Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim

    2017-10-01

    A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in a multi-model approach for socio-ecological risk assessment is formally defined. A method for utilizing observation, measurement and modeling data in the framework of the multi-model approach is described. The methodology and models of risk assessment in the framework of a decision support approach are defined and described. A method of water quality assessment using satellite observation data is described. The method is based on analysis of the spectral reflectance of aquifers. Spectral signatures of freshwater bodies and offshores are analyzed. Correlations between spectral reflectance, pollution and selected water quality parameters are analyzed and quantified. Data from the MODIS, MISR, AIRS and Landsat sensors received in 2002-2014 have been utilized, verified by in-field spectrometry and lab measurements. A fuzzy-logic-based approach for decision support in the field of water quality degradation risk is discussed. The decision on the water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters. Data from satellite observations, field measurements and modeling are utilized in the framework of the proposed approach. It is shown that this algorithm allows estimating the water quality degradation rate and pollution risks. The problems of constructing spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
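
    A minimal example of the kind of fuzzy decision rule described, where uncertain, normalized indicators are mapped through membership functions and aggregated into a category. The triangular memberships, breakpoints and indicator names below are invented assumptions, not the paper's rule base:

```python
# Illustrative fuzzy decision: two normalized indicators (0..1) pass
# through triangular membership functions; min acts as conjunction and
# max as aggregation; the stronger membership decides the category.

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def degradation_risk(turbidity, chlorophyll):
    """Map two normalized indicators to a coarse risk category."""
    low  = min(tri(turbidity, -0.5, 0.0, 0.5),
               tri(chlorophyll, -0.5, 0.0, 0.5))
    high = max(tri(turbidity, 0.5, 1.0, 1.5),
               tri(chlorophyll, 0.5, 1.0, 1.5))
    return "high" if high > low else "low"
```

    A production rule base would have more indicators, more categories and tuned breakpoints, but the min/max machinery is the same.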

  17. The DataTAG transatlantic testbed

    CERN Document Server

    Martin, O; Martin-Flatin, J P; Moroni, P; Nae, D; Newman, H; Ravot, S

    2005-01-01

    Wide area network testbeds allow researchers and engineers to test out new equipment, protocols and services in real-life situations, without jeopardizing the stability and reliability of production networks. The Data TransAtlantic Grid (DataTAG) testbed, deployed in 2002 between CERN, Geneva, Switzerland and StarLight, Chicago, IL, USA, is probably the largest testbed built to date. Jointly managed by CERN and Caltech, it is funded by the European Commission, the U.S. Department of Energy and the U.S. National Science Foundation. The main objectives of this testbed are to improve the Grid community's understanding of the networking issues posed by data- intensive Grid applications over transoceanic gigabit networks, design and develop new Grid middleware services, and improve the interoperability of European and U.S. Grid applications in High- Energy and Nuclear Physics. In this paper, we give an overview of this testbed, describe its various topologies over time, and summarize the main lessons learned after...

  18. Algorithm and Application of Gcp-Independent Block Adjustment for Super Large-Scale Domestic High Resolution Optical Satellite Imagery

    Science.gov (United States)

    Sun, Y. S.; Zhang, L.; Xu, B.; Zhang, Y.

    2018-04-01

    The accurate positioning of optical satellite imagery without ground control is a precondition for remote sensing applications and small/medium-scale mapping in large areas abroad or with large volumes of images. In this paper, considering the geometric features of optical satellite imagery, and based on a widely used optimization method for constrained problems called the Alternating Direction Method of Multipliers (ADMM) and RFM least-squares block adjustment, we propose a GCP-independent block adjustment method for large-scale domestic high resolution optical satellite imagery - GISIBA (GCP-Independent Satellite Imagery Block Adjustment), which is easy to parallelize and highly efficient. In this method, virtual "average" control points are built to solve the rank-defect problem and to support qualitative and quantitative analysis in block adjustment without ground control. The test results show that the horizontal and vertical accuracies of multi-covered and multi-temporal satellite images are better than 10 m and 6 m, respectively. Meanwhile, the mosaic problem in adjacent areas of large-area DOM production can be solved if public geographic information data are introduced as horizontal and vertical constraints in the block adjustment process. Finally, through experiments using GF-1 and ZY-3 satellite images over several typical test areas, the reliability, accuracy and performance of the developed procedure are presented and studied.

  19. Exploration Systems Health Management Facilities and Testbed Workshop

    Science.gov (United States)

    Wilson, Scott; Waterman, Robert; McCleskey, Carey

    2004-01-01

    Presentation Agenda: (1) Technology Maturation Pipeline (The Plan) (2) Cryogenic testbed (and other KSC Labs) (2a) Component / Subsystem technologies (3) Advanced Technology Development Center (ATDC) (3a) System / Vehicle technologies (4) ELV Flight Experiments (Flight Testbeds).

  20. Design of a nickel-hydrogen battery simulator for the NASA EOS testbed

    Science.gov (United States)

    Gur, Zvi; Mang, Xuesi; Patil, Ashok R.; Sable, Dan M.; Cho, Bo H.; Lee, Fred C.

    1992-01-01

    The hardware and software design of a nickel-hydrogen (Ni-H2) battery simulator (BS) with application to the NASA Earth Observation System (EOS) satellite is presented. The battery simulator is developed as a part of a complete testbed for the EOS satellite power system. The battery simulator involves both hardware and software components. The hardware component includes the capability of sourcing and sinking current at a constant programmable voltage. The software component includes the capability of monitoring the battery's ampere-hours (Ah) and programming the battery voltage according to an empirical model of the nickel-hydrogen battery stored in a computer.
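The software component described above (ampere-hour monitoring plus an empirical voltage model) can be sketched in a few lines. This is only an illustration: the capacity, time step, and voltage curve below are invented placeholders, not the EOS simulator's actual Ni-H2 model.

```python
# Minimal sketch (hypothetical model) of the simulator's software loop:
# integrate current into ampere-hours and look up terminal voltage from an
# empirical state-of-charge curve, as the abstract describes.
def simulate_battery(currents_a, dt_s, capacity_ah=50.0, soc0=1.0):
    """Return (soc, voltage) pairs for a sampled current profile.
    Positive current = discharge. The voltage curve is illustrative only."""
    soc, out = soc0, []
    for i_a in currents_a:
        soc -= i_a * dt_s / 3600.0 / capacity_ah   # coulomb counting in Ah
        soc = min(max(soc, 0.0), 1.0)
        v = 1.25 + 0.15 * soc - 0.02 * i_a          # toy Ni-H2 cell model
        out.append((soc, v))
    return out

traj = simulate_battery([10.0] * 360, dt_s=10.0)    # 1 h at 10 A discharge
final_soc, final_v = traj[-1]
```

In a real simulator the voltage lookup would be replaced by the empirical battery model stored in the computer, and the loop would drive the power-stage hardware rather than return values.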

  1. The design and implementation of the LLNL gigabit testbed

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, D. [Lawrence Livermore National Labs., CA (United States)

    1994-12-01

    This paper will look at the design and implementation of the LLNL Gigabit Testbed (LGTB), where various high-speed networking products can be tested in one environment. The paper will discuss the philosophy behind the design of, and the need for, the testbed; the tests that are performed in the testbed; and the tools used to implement those tests.

  2. About Non-Line-Of-Sight Satellite Detection and Exclusion in a 3D Map-Aided Localization Algorithm

    Directory of Open Access Journals (Sweden)

    François Peyret

    2013-01-01

    Full Text Available Reliable GPS positioning in city environments is a key issue: signals are prone to multipath, and satellite geometry is poor in many streets. Using a 3D urban model to forecast satellite visibility in urban contexts in order to improve GPS localization is the main topic of the present article. A virtual image processing step that detects and eliminates possibly faulty measurements is the core of this method. This image is generated using the position estimated a priori by the navigation process itself, under road constraints. This position is then updated using measurements to line-of-sight satellites only. This closed-loop real-time processing has shown promising first full-scale test results.

  3. About Non-Line-Of-Sight Satellite Detection and Exclusion in a 3D Map-Aided Localization Algorithm

    Science.gov (United States)

    Peyraud, Sébastien; Bétaille, David; Renault, Stéphane; Ortiz, Miguel; Mougel, Florian; Meizel, Dominique; Peyret, François

    2013-01-01

    Reliable GPS positioning in city environments is a key issue: signals are prone to multipath, and satellite geometry is poor in many streets. Using a 3D urban model to forecast satellite visibility in urban contexts in order to improve GPS localization is the main topic of the present article. A virtual image processing step that detects and eliminates possibly faulty measurements is the core of this method. This image is generated using the position estimated a priori by the navigation process itself, under road constraints. This position is then updated using measurements to line-of-sight satellites only. This closed-loop real-time processing has shown promising first full-scale test results. PMID:23344379
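The detection-and-exclusion idea can be sketched as follows. Here a hypothetical azimuth-to-elevation building mask stands in for the 3D urban model and virtual image processing; the skyline values and satellite geometries are invented for illustration.

```python
# Hedged sketch of the paper's core idea: reject satellites whose line of
# sight is blocked by buildings around the a-priori position. The "model"
# is a toy azimuth -> masking-elevation table (degrees), not a real 3D map.
elevation_mask = {0: 10, 90: 55, 180: 15, 270: 60}   # toy urban skyline

def is_line_of_sight(sat_az_deg, sat_el_deg):
    """A satellite is usable only if its elevation clears the building
    mask in its azimuth sector (nearest sector, for simplicity)."""
    sector = min(elevation_mask,
                 key=lambda a: min(abs(a - sat_az_deg),
                                   360 - abs(a - sat_az_deg)))
    return sat_el_deg > elevation_mask[sector]

# (azimuth, elevation, PRN) triples -- hypothetical satellite geometry
sats = [(85, 40, "G01"), (92, 70, "G07"), (310, 25, "G12")]
usable = [prn for az, el, prn in sats if is_line_of_sight(az, el)]
```

Only the satellites in `usable` would then feed the position update, closing the loop the abstract describes.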

  4. SSERVI Analog Regolith Simulant Testbed Facility

    Science.gov (United States)

    Minafra, J.; Schmidt, G. K.

    2016-12-01

    SSERVI's goals include supporting planetary researchers within NASA and other government agencies; private-sector and hardware developers; competitors in focused prize design competitions; and academic-sector researchers. The SSERVI Analog Regolith Simulant Testbed provides opportunities for research scientists and engineers to study the effects of regolith analog testbed research in the planetary exploration field. This capability is essential to understanding the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin consisting of a 4 meter by 4 meter area. SSERVI provides a bridge between several groups, joining together researchers from: 1) the scientific and exploration communities, 2) multiple disciplines across a wide range of planetary sciences, and 3) domestic and international communities and partnerships. This testbed provides a means of consolidating the tasks of acquisition, storage and safety mitigation in handling large quantities of regolith simulant. Facility hardware and environment testing scenarios include, but are not limited to, the following: lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profile, lofted dust, mechanical properties of lunar regolith, and surface features (i.e. grades and rocks). Benefits range from easy access to a controlled analog regolith simulant testbed and planetary exploration activities at NASA Research Park, to academic and expanded commercial opportunities in California's Silicon Valley, as well as public outreach and education opportunities.

  5. Cognitive Medical Wireless Testbed System (COMWITS)

    Science.gov (United States)

    2016-11-01

    This testbed merges two ARO grants. Hardware includes an Intel Xeon Processor E5-1650v3 (6C, 3.5 GHz, Turbo, HT, 15M, 140W), an Intel Core i7-3770 (3.4 GHz quad core, 77W), and dual Intel Xeon

  6. A two-step nearest neighbors algorithm using satellite imagery for predicting forest structure within species composition classes

    Science.gov (United States)

    Ronald E. McRoberts

    2009-01-01

    Nearest neighbors techniques have been shown to be useful for predicting multiple forest attributes from forest inventory and Landsat satellite image data. However, in regions lacking good digital land cover information, nearest neighbors selected to predict continuous variables such as tree volume must be selected without regard to relevant categorical variables such...
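The two-step idea - first restrict candidate neighbors to pixels in the same species-composition class, then predict a continuous attribute from the k nearest spectral neighbors within that class - can be sketched on synthetic data. All values below are hypothetical.

```python
import numpy as np

# Sketch of a two-step nearest-neighbors prediction: a class filter
# followed by k-NN averaging of a continuous attribute (e.g. volume).
def two_step_knn(x, spectra, classes, volumes, target_class, k=3):
    idx = np.where(classes == target_class)[0]       # step 1: class filter
    d = np.linalg.norm(spectra[idx] - x, axis=1)     # step 2: k-NN in spectra
    nearest = idx[np.argsort(d)[:k]]
    return volumes[nearest].mean()

# Toy "inventory": 2-band spectra, species class, and plot volume
spectra = np.array([[0.10, 0.20], [0.11, 0.19], [0.50, 0.60], [0.52, 0.61]])
classes = np.array([0, 0, 1, 1])
volumes = np.array([100.0, 110.0, 300.0, 320.0])
pred = two_step_knn(np.array([0.10, 0.20]), spectra, classes, volumes,
                    target_class=0, k=2)
```

Restricting the neighbor pool by class keeps the continuous prediction from mixing ecologically incompatible reference plots, which is the point the abstract raises.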

  7. Assessment of Machine Learning Algorithms for Automatic Benthic Cover Monitoring and Mapping Using Towed Underwater Video Camera and High-Resolution Satellite Images

    Directory of Open Access Journals (Sweden)

    Hassan Mohamed

    2018-05-01

    Full Text Available Benthic habitat monitoring is essential for many applications involving biodiversity, marine resource management, and the estimation of variations over temporal and spatial scales. Nevertheless, both automatic and semi-automatic analytical methods for deriving ecologically significant information from towed camera images are still limited. This study proposes a methodology that enables a high-resolution towed camera with a Global Navigation Satellite System (GNSS) receiver to adaptively monitor and map benthic habitats. First, the towed camera completes a pre-programmed initial survey to collect benthic habitat videos, which can then be converted to geo-located benthic habitat images. Second, an expert manually labels a number of benthic habitat images by habitat class. Third, attributes for categorizing these images are extracted automatically using the Bag of Features (BOF) algorithm. Fourth, benthic cover categories are detected automatically using Weighted Majority Voting (WMV) ensembles of Support Vector Machine (SVM), K-Nearest Neighbor (K-NN), and Bagging (BAG) classifiers. Fifth, the trained WMV ensembles can be used to categorize more benthic cover images automatically. Finally, correctly categorized geo-located images can provide ground-truth samples for benthic cover mapping using high-resolution satellite imagery. The proposed methodology was tested over Shiraho, Ishigaki Island, Japan, a heterogeneous coastal area. The WMV ensemble exhibited 89% overall accuracy for categorizing corals, sediments, seagrass, and algae species. Furthermore, the same WMV ensemble produced a benthic cover map from a QuickBird satellite image with 92.7% overall accuracy.
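The WMV combination rule itself is simple to sketch. The classifier names follow the study, but the weights and labels below are invented for illustration, not taken from the paper.

```python
from collections import defaultdict

# Minimal sketch of Weighted Majority Voting (WMV): each base classifier
# (e.g. SVM, k-NN, Bagging, as in the study) votes for a label with a
# weight, here imagined as proportional to its validation accuracy.
def wmv_predict(predictions, weights):
    """predictions: {classifier_name: label}; weights: {classifier_name: w}.
    Returns the label with the largest total vote weight."""
    score = defaultdict(float)
    for name, label in predictions.items():
        score[label] += weights[name]
    return max(score, key=score.get)

weights = {"svm": 0.90, "knn": 0.85, "bag": 0.80}     # illustrative
label = wmv_predict({"svm": "coral", "knn": "sediment", "bag": "sediment"},
                    weights)
```

Here two weaker classifiers agreeing on "sediment" outvote the single strongest classifier, which is the behavior that makes weighted ensembles more robust than any one member.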

  8. A Business-to-Business Interoperability Testbed: An Overview

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL; Ivezic, Nenad [ORNL; Monica, Martin [Sun Microsystems, Inc.; Jones, Albert [National Institute of Standards and Technology (NIST)

    2003-10-01

    In this paper, we describe a business-to-business (B2B) testbed co-sponsored by the Open Applications Group, Inc. (OAGI) and the National Institute of Standards and Technology (NIST) to advance enterprise e-commerce standards. We describe the business and technical objectives and initial activities within the B2B Testbed. We summarize our initial lessons learned, which form the requirements that drive the next-generation testbed development. We also give an overview of a promising testing framework architecture with which to drive the testbed developments. We outline future plans for the testbed development.

  9. Investigation of Lake Water Salinity by Using Four-Band Salinity Algorithm on WorldView-2 Satellite Image for a Saline Industrial Lake

    Science.gov (United States)

    Budakoǧlu, Murat; Karaman, Muhittin; Damla Uça Avcı, Z.; Kumral, Mustafa; Geredeli (Yılmaz), Serpil

    2014-05-01

    Salinity of a lake is an important characteristic: such lakes are potentially industrial, and the degree of salinity can be used for the assessment of mineral resources and for production management. In the literature, there are many studies using satellite data for salinity-related lake research, such as determining the salinity distribution and detecting potential freshwater sources in less salt-concentrated regions. Lake Acigol, located in Denizli (Turkey), was selected as the study area. With its saline environment, it is the major sodium sulphate production resource of Turkey. In this study, remote sensing data and data from a field study were used and correlated. Remote sensing is an efficient tool to monitor and analyze lake properties when used complementarily to field data. WorldView-2 satellite data, which consist of 8 bands, were used. Simultaneously with the satellite data acquisition, a field study was conducted to collect salinity values at 17 points of the lake using a YSI 556 multiparameter probe. The values were measured as grams of salt per kilogram of solution and expressed in ppt. The values vary from 34 ppt to 40.1 ppt, with an average of 38.056 ppt. In the Thalassic series, the lake was in a mixoeuhaline state at the time. As a first step, ATCOR atmospheric correction was performed on the satellite image. Since there were some clouds over the lake, the study was continued using the 12 sampling points that were clear on the image. Then, for each sampling point, a spectral value was obtained by averaging over an 11 × 11 neighborhood. The relation between the spectral reflectance values and salinity was investigated. The 4-band algorithm used by Wei (2012) for determining the chlorophyll-a distribution in highly turbid coastal environments was applied: Salinity α (Λi-1 / Λj-1) * (Λk-1 / Λm-1) (i

  10. Development of a Remotely Operated Vehicle Test-bed

    Directory of Open Access Journals (Sweden)

    Biao WANG

    2013-06-01

    Full Text Available This paper presents the development of a remotely operated vehicle (ROV) designed to serve as a convenient, cost-effective platform for research and experimental validation of hardware, sensors and control algorithms. Both the mechanical design and the control system design are introduced. The vehicle, 0.65 m long and 0.45 m wide, has a frame structure that allows modification of the mounted devices and thruster allocation. For the control system, STM32-based MCU boards specially designed for this project are used as the core processing boards, and open-source, modular, flexible software has been developed. Experiment results demonstrate the effectiveness of the test-bed.

  11. Retrieval of land surface temperature (LST) from landsat TM6 and TIRS data by single channel radiative transfer algorithm using satellite and ground-based inputs

    Science.gov (United States)

    Chatterjee, R. S.; Singh, Narendra; Thapa, Shailaja; Sharma, Dravneeta; Kumar, Dheeraj

    2017-06-01

    The present study proposes land surface temperature (LST) retrieval from satellite-based thermal IR data by a single-channel radiative transfer algorithm, using atmospheric correction parameters derived from satellite-based and in-situ data and land surface emissivity (LSE) derived by a hybrid LSE model. For example, atmospheric transmittance (τ) was derived from Terra MODIS spectral radiance in atmospheric window and absorption bands, whereas the atmospheric path radiance and sky radiance were estimated using satellite- and ground-based in-situ solar radiation, geographic location and observation conditions. The hybrid LSE model, which is coupled with ground-based emissivity measurements, is more versatile than previous LSE models and yields improved emissivity values through a knowledge-based approach. It uses NDVI-based and NDVI Threshold Method (NDVITHM) based algorithms and field-measured emissivity values. The model is applicable to dense vegetation cover, mixed vegetation cover, and bare earth, including coal-mining-related land surface classes. The study was conducted in a coalfield of India badly affected by coal fire for decades. In a coal-fire-affected coalfield, LST would provide the precise temperature difference between thermally anomalous coal fire pixels and background pixels to facilitate coal fire detection and monitoring. The derived LST products of the present study were compared with radiant temperature images across some of the prominent coal fire locations in the study area, by graphical means and by standard mathematical dispersion coefficients such as the coefficient of variation, coefficient of quartile deviation, coefficient of quartile deviation for the 3rd quartile vs. maximum temperature, and coefficient of mean deviation (about the median), indicating a significant increase in the temperature difference among the pixels. The average temperature slope between adjacent pixels, which increases the potential of distinguishing coal fire pixels from background pixels, is
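A minimal sketch of single-channel LST retrieval follows, assuming the standard thermal radiative transfer equation and the published Landsat-5 TM band-6 calibration constants. The radiance and atmospheric values fed in below are made up for illustration; the paper's actual τ, path radiance, sky radiance, and emissivity come from the MODIS- and ground-based procedures it describes.

```python
import math

# Hedged sketch of single-channel LST retrieval: invert the thermal
# radiative transfer equation for the surface-leaving blackbody radiance,
# then invert the band-effective Planck function.
K1, K2 = 607.76, 1260.56   # Landsat-5 TM band-6 constants (W m-2 sr-1 um-1, K)

def lst(l_sensor, tau, l_up, l_down, emissivity):
    # RTE: L_sensor = tau*(eps*B(T) + (1-eps)*L_down) + L_up
    b_t = (l_sensor - l_up - tau * (1.0 - emissivity) * l_down) \
          / (tau * emissivity)
    return K2 / math.log(K1 / b_t + 1.0)   # inverse Planck, in kelvin

# Illustrative inputs only (not from the study):
t_k = lst(l_sensor=9.5, tau=0.85, l_up=1.2, l_down=2.0, emissivity=0.96)
```

With these toy inputs the retrieval lands near 306 K; in a coal-fire application the interesting quantity is the difference of such temperatures between anomalous and background pixels.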

  12. Adaptation of an aerosol retrieval algorithm using multi-wavelength and multi-pixel information of satellites (MWPM) to GOSAT/TANSO-CAI

    Science.gov (United States)

    Hashimoto, M.; Takenaka, H.; Higurashi, A.; Nakajima, T.

    2017-12-01

    Aerosol in the atmosphere is an important constituent in determining the earth's radiation budget, so accurate aerosol retrievals from satellites are useful. We have developed a satellite remote sensing algorithm to retrieve aerosol optical properties using multi-wavelength and multi-pixel information of satellite imagers (MWPM). The method simultaneously derives aerosol optical properties, such as aerosol optical thickness (AOT), single scattering albedo (SSA) and aerosol size information, by using spatial differences of wavelengths (multi-wavelength) and surface reflectances (multi-pixel). The method is useful for aerosol retrieval over spatially heterogeneous surfaces like an urban region. In this algorithm, the inversion method is a combination of an optimal estimation method and a smoothing constraint on the state vector. Furthermore, this method has been combined with a direct radiative transfer model (RTM) calculation, numerically solved at each iteration step of the non-linear inverse problem, without using a look-up table (LUT), under several constraints. However, this takes too much computation time. To accelerate the calculation, we replaced the RTM with an accelerated RTM solver learned by a neural network-based method, EXAM (Takenaka et al., 2011), using the Rstar code. The calculation time was thereby shortened to about one thousandth. We applied MWPM combined with EXAM to GOSAT/TANSO-CAI (Cloud and Aerosol Imager). CAI is a supplementary sensor of TANSO-FTS, dedicated to measuring cloud and aerosol properties. CAI has four bands, 380, 674, 870 and 1600 nm, and observes at 500 m resolution for band 1, band 2 and band 3, and at 1.5 km for band 4. Retrieved parameters are aerosol optical properties, such as aerosol optical thickness (AOT) of fine and coarse mode particles at a wavelength of 500 nm, the volume soot fraction in fine mode particles, and the ground surface albedo at each observed wavelength, obtained by combining a minimum reflectance method and Fukuda et al. (2013).
    We will show

  13. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Bench-scale Testbed Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Melin, Alexander M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kisner, Roger A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Drira, Anis [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Reed, Frederick K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging to build due to restrictions on sensors and materials. As a part of the Advanced Sensors and Instrumentation topic of the Department of Energy's Nuclear Energy Enabling Technologies cross-cutting technology development program, this report details the design of a bench-scale embedded instrumentation and control testbed. The design goal of the bench-scale testbed is to build a re-configurable system that can rapidly deploy and test advanced control algorithms in a hardware-in-the-loop setup. The bench-scale testbed is designed as a fluid pump analog that uses active magnetic bearings to support the shaft. The testbed represents an application that would improve the efficiency and performance of high-temperature (700 °C) pumps for liquid salt reactors that operate in an extreme environment, and it provides many engineering challenges that can be overcome with embedded instrumentation and control. This report gives details of the mechanical design, electromagnetic design, geometry optimization, power electronics design, and initial control system design.
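As a loose illustration of "rapidly deploy and test a control algorithm" against such a plant, here is a toy discrete PD loop stabilizing a linearized, open-loop-unstable magnetic-bearing axis. The plant constants and gains are invented, not the testbed's.

```python
# Toy hardware-in-the-loop-style control sketch: a linearized magnetic
# bearing axis x'' = a*x + b*u is unstable without feedback; a discrete
# PD law u = -kp*x - kd*v recenters the shaft. All constants are made up.
a, b, dt = 2000.0, 1.0, 0.001            # unstable poles at +-sqrt(a)
kp, kd = 8000.0, 120.0                   # hand-tuned PD gains

x, v = 1e-3, 0.0                         # 1 mm initial shaft offset
for _ in range(5000):                    # 5 s of simulated time
    u = -kp * x - kd * v                 # PD control force
    acc = a * x + b * u                  # closed-loop acceleration
    x += v * dt                          # simple Euler plant update
    v += acc * dt
final_offset = abs(x)
```

With these gains the closed loop is a damped oscillator (effective stiffness `b*kp - a > 0`), so the offset decays to essentially zero well within the simulated 5 s.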

  14. Interconnection of Broadband Islands via Satellite-Experiments on the Race II Catalyst Project

    National Research Council Canada - National Science Library

    Sun, Z

    1996-01-01

    .... The purpose of the project was to develop an ATM satellite link for the future B-ISDN services, particularly for the interconnections of the ATM testbeds which are in the form of broadband islands...

  15. Mini-mast CSI testbed user's guide

    Science.gov (United States)

    Tanner, Sharon E.; Pappa, Richard S.; Sulla, Jeffrey L.; Elliott, Kenny B.; Miserentino, Robert; Bailey, James P.; Cooper, Paul A.; Williams, Boyd L., Jr.; Bruner, Anne M.

    1992-01-01

    The Mini-Mast testbed is a 20 m generic truss highly representative of future deployable trusses for space applications. It is fully instrumented for system identification and active vibrations control experiments and is used as a ground testbed at NASA-Langley. The facility has actuators and feedback sensors linked via fiber optic cables to the Advanced Real Time Simulation (ARTS) system, where user defined control laws are incorporated into generic controls software. The object of the facility is to conduct comprehensive active vibration control experiments on a dynamically realistic large space structure. A primary goal is to understand the practical effects of simplifying theoretical assumptions. This User's Guide describes the hardware and its primary components, the dynamic characteristics of the test article, the control law implementation process, and the necessary safeguards employed to protect the test article. Suggestions for a strawman controls experiment are also included.

  16. A Rapid Orbit Integration Algorithm for Multi-GNSS Satellites

    Institute of Scientific and Technical Information of China (English)

    Fan Lei; Li Min; Song Weiwei; Shi Chuang; Wang Cheng

    2016-01-01

    A rapid, efficient, and highly accurate orbit numerical integration algorithm is needed in multi-GNSS rapid precise orbit determination. In order to improve computational efficiency, an adaptive step-changing Adams integration method and a synchronous integration algorithm for multi-GNSS satellites are developed in this paper. To validate the precision and efficiency of the proposed method, the multi-GNSS precise orbit products calculated by Wuhan University (WHU) and the Center for Orbit Determination in Europe (CODE) are used for orbit fitting. Results show that the average 3D RMS values of the GPS, GLONASS, BDS and Galileo satellites are all below 20 mm. Compared with the traditional fixed-step orbit integration method applied to each satellite separately, the computational efficiency of the proposed method is improved significantly: without loss of accuracy, integration and fitting take only 0.09 s for a single satellite, 14 times faster than the traditional method. Moreover, the improvement grows as the number of satellites increases.
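A fixed-step Adams-Bashforth integrator for the plain two-body problem sketches the numerical machinery involved; the paper's algorithm additionally uses adaptive step control, full force models, and synchronous integration of all satellites at once. The orbit radius and step size below are illustrative.

```python
import numpy as np

# Illustrative 4th-order Adams-Bashforth orbit integration (RK4 start-up),
# two-body dynamics only -- a simplified stand-in for a GNSS force model.
MU = 3.986004418e14                      # Earth's GM, m^3/s^2

def accel(y):                            # y = [rx, ry, rz, vx, vy, vz]
    r = y[:3]
    return np.concatenate([y[3:], -MU * r / np.linalg.norm(r) ** 3])

def rk4_step(y, h):
    k1 = accel(y); k2 = accel(y + h/2*k1)
    k3 = accel(y + h/2*k2); k4 = accel(y + h*k3)
    return y + h/6*(k1 + 2*k2 + 2*k3 + k4)

def integrate_ab4(y0, h, n):
    ys = [y0]
    for _ in range(3):                   # multistep methods need start-up
        ys.append(rk4_step(ys[-1], h))
    fs = [accel(y) for y in ys]
    for _ in range(n - 3):               # AB4 predictor
        y = ys[-1] + h/24*(55*fs[-1] - 59*fs[-2] + 37*fs[-3] - 9*fs[-4])
        ys.append(y); fs.append(accel(y))
    return ys

r0 = 26_560e3                            # roughly a GPS orbit radius, m
v0 = np.sqrt(MU / r0)                    # circular orbital speed
ys = integrate_ab4(np.array([r0, 0, 0, 0, v0, 0]), h=30.0, n=200)
radius_drift = abs(np.linalg.norm(ys[-1][:3]) - r0)
```

Multistep methods such as Adams reuse previously evaluated accelerations, which is exactly why evaluating the (expensive) force model once per epoch for all satellites together pays off.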

  17. SSERVI Analog Regolith Simulant Testbed Facility

    Science.gov (United States)

    Minafra, Joseph; Schmidt, Gregory; Bailey, Brad; Gibbs, Kristina

    2016-10-01

    The Solar System Exploration Research Virtual Institute (SSERVI) at NASA's Ames Research Center in California's Silicon Valley was founded in 2013 to act as a virtual institute that provides interdisciplinary research centered on the goals of its supporting directorates: the NASA Science Mission Directorate (SMD) and the Human Exploration & Operations Mission Directorate (HEOMD). Primary research goals of the Institute revolve around the integration of science and exploration to gain the knowledge required for the future of human space exploration beyond low Earth orbit. SSERVI intends to leverage existing JSC-1A regolith simulant resources into the creation of a regolith simulant testbed facility. The purpose of this testbed concept is to provide the planetary exploration community with a readily available capability to test hardware and conduct research in a large simulant environment. SSERVI's goals include supporting planetary researchers within NASA and other government agencies; private-sector and hardware developers; competitors in focused prize design competitions; and academic-sector researchers. SSERVI provides opportunities for research scientists and engineers to study the effects of regolith analog testbed research in the planetary exploration field. This capability is essential to understanding the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin consisting of a 4 meter by 4 meter area, including dust mitigation and safety oversight. Facility hardware and environment testing scenarios could include lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profile, lofted dust, mechanical properties of lunar regolith, and surface features (i.e. grades and rocks). Numerous benefits vary from easy access to a controlled analog regolith simulant testbed, and

  18. Current Developments in DETER Cybersecurity Testbed Technology

    Science.gov (United States)

    2015-12-08

    Experimental cybersecurity research is often inherently risky. An experiment may involve releasing live malware code, operating a real botnet... imagine a worm that can only propagate by first contacting a "propagation service" (T1 constraint), composed with a testbed firewall (T2... experiment. Finally, T1 constraints might be enforced by (1) explicit modification of malware to constrain its behavior, (2) implicit constraints

  19. The Airborne Optical Systems Testbed (AOSTB)

    Science.gov (United States)

    2017-05-31

    are the Atlantic Ocean and coastal waterways, which reflect back very little light at our SWIR operating wavelength of 1064 nm. ...demonstrate our typical FOPEN capabilities, figure 5 shows two images taken over a forested area near Burlington, VT. Figure 5(a) is a 3D point... Fig. 5. Ladar target scan of a forested area in northern Vermont

  20. Towards standard testbeds for numerical relativity

    International Nuclear Information System (INIS)

    Alcubierre, Miguel; Allen, Gabrielle; Bona, Carles; Fiske, David; Goodale, Tom; Guzman, F Siddhartha; Hawke, Ian; Hawley, Scott H; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David; Salgado, Marcelo; Schnetter, Erik; Seidel, Edward; Shinkai, Hisa-aki; Shoemaker, Deirdre; Szilagyi, Bela; Takahashi, Ryoji; Winicour, Jeff

    2004-01-01

    In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations that are designed to both probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve the numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources and can be used with many different approaches used in the relativity community.

  1. Towards standard testbeds for numerical relativity

    Energy Technology Data Exchange (ETDEWEB)

    Alcubierre, Miguel [Inst. de Ciencias Nucleares, Univ. Nacional Autonoma de Mexico, Apartado Postal 70-543, Mexico Distrito Federal 04510 (Mexico); Allen, Gabrielle; Goodale, Tom; Guzman, F Siddhartha; Hawke, Ian; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Institut, 14476 Golm (Germany); Bona, Carles [Departament de Fisica, Universitat de les Illes Balears, Ctra de Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Fiske, David [Dept. of Physics, Univ. of Maryland, College Park, MD 20742-4111 (United States); Hawley, Scott H [Center for Relativity, Univ. of Texas at Austin, Austin, Texas 78712 (United States); Salgado, Marcelo [Inst. de Ciencias Nucleares, Univ. Nacional Autonoma de Mexico, Apartado Postal 70-543, Mexico Distrito Federal 04510 (Mexico); Schnetter, Erik [Inst. fuer Astronomie und Astrophysik, Universitaet Tuebingen, 72076 Tuebingen (Germany); Seidel, Edward [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Inst., 14476 Golm (Germany); Shinkai, Hisa-aki [Computational Science Div., Inst. of Physical and Chemical Research (RIKEN), Hirosawa 2-1, Wako, Saitama 351-0198 (Japan); Shoemaker, Deirdre [Center for Radiophysics and Space Research, Cornell Univ., Ithaca, NY 14853 (United States); Szilagyi, Bela [Dept. of Physics and Astronomy, Univ. of Pittsburgh, Pittsburgh, PA 15260 (United States); Takahashi, Ryoji [Theoretical Astrophysics Center, Juliane Maries Vej 30, 2100 Copenhagen, (Denmark); Winicour, Jeff [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Institut, 14476 Golm (Germany)

    2004-01-21

    In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations that are designed to both probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve the numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources and can be used with many different approaches used in the relativity community.

  2. Development of Fast Error Compensation Algorithm for Integrated Inertial-Satellite Navigation System of Small-size Unmanned Aerial Vehicles in Complex Environment

    Directory of Open Access Journals (Sweden)

    A. V. Fomichev

    2015-01-01

    Full Text Available In accordance with the structural features of small-size unmanned aerial vehicles (UAVs), and considering the feasibility of this project, the article studies an integrated inertial-satellite navigation system (INS). The INS algorithm development is based on the method of indirect filtration and the principle of loosely coupled combination of output data on UAV position and velocity. Data on position and velocity are provided by the strapdown inertial navigation system (SINS) and the satellite navigation system (GPS). The difference between the position and velocity outputs of the SINS and GPS is used to estimate the SINS errors by means of the basic Kalman filtering algorithm; the SINS outputs are then corrected. The INS possesses the following advantages: a simpler mathematical model of Kalman filtering, high reliability, two independently operating navigation systems, and high redundancy of available navigation information. However, in the loosely coupled scheme, the INS can meet the demands of high-precision, reliable navigation only while the SINS and GPS operating conditions remain normal. The proposed INS is intended for UAVs moving in complex environments with obstacles, severe natural and climatic conditions, etc., in which the UAV frequently cannot receive GPS signals. To solve this problem, an algorithm for rapid compensation of INS errors was developed, which can effectively prevent failure of the navigation system when GPS signals are unavailable. Since it is almost impossible to obtain real trajectory data in practice, during simulation a flight path generator produces the flight path in accordance with the kinematic model of the UAV and the complex terrain environment. The errors of position and velocity are considered as an indicator of the INS effectiveness.
    The results

  3. Performance of operational satellite bio-optical algorithms in different water types in the southeastern Arabian Sea

    Directory of Open Access Journals (Sweden)

    P. Minu

    2016-10-01

    Full Text Available The in situ remote sensing reflectance (Rrs) and optically active substances (OAS) measured using a hyperspectral radiometer were used for optical classification of coastal waters in the southeastern Arabian Sea. The spectral Rrs showed three distinct water types, associated with variability in OAS such as chlorophyll-a (chl-a), chromophoric dissolved organic matter (CDOM) and the volume scattering function at 650 nm (β650). The water types were classified as Type-I, Type-II and Type-III, respectively, for the three Rrs spectra. The Type-I waters showed the peak Rrs in the blue band (470 nm), whereas in the case of Type-II and III waters the peak Rrs was at 560 and 570 nm, respectively. The shift of the peak Rrs to longer wavelengths was due to an increase in the concentration of OAS. Further, we evaluated six bio-optical algorithms (OC3C, OC4O, OC4, OC4E, OC3M and OC4O2) used operationally to retrieve chl-a from the Coastal Zone Colour Scanner (CZCS), Ocean Colour Temperature Scanner (OCTS), Sea-viewing Wide Field-of-view Sensor (SeaWiFS), MEdium Resolution Imaging Spectrometer (MERIS), Moderate Resolution Imaging Spectroradiometer (MODIS) and Ocean Colour Monitor (OCM2). For chl-a concentrations greater than 1.0 mg m−3, algorithms based on the reference band ratios of 488/510/520 nm to 547/550/555/560/565 nm have to be considered. The assessment of the algorithms showed better performance of OC3M and OC4. All the algorithms exhibited better performance in Type-I waters. However, the performance was poor in Type-II and Type-III waters, which could be attributed to the significant co-variance of chl-a with CDOM.
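
The OC4-family algorithms evaluated here share one functional form: a fourth-order polynomial, in log10 space, of the maximum blue-to-green band ratio. A minimal sketch, using the widely published SeaWiFS OC4v4 coefficient set purely for illustration (treat the numbers as indicative, not operational):

```python
import math

# Maximum-band-ratio chlorophyll retrieval of the OC4 family. The
# coefficients are the published SeaWiFS OC4v4 set, quoted here only to
# illustrate the form; operational products use sensor-specific tunings.

OC4V4 = (0.366, -3.067, 1.930, 0.649, -1.532)

def chl_oc4(rrs443, rrs490, rrs510, rrs555, coeffs=OC4V4):
    """Return chlorophyll-a (mg m^-3) from remote sensing reflectances."""
    r = math.log10(max(rrs443, rrs490, rrs510) / rrs555)
    a0, a1, a2, a3, a4 = coeffs
    log_chl = a0 + a1 * r + a2 * r**2 + a3 * r**3 + a4 * r**4
    return 10.0 ** log_chl

# A blue-peaked (Type-I-like) spectrum gives low chl-a; a green-peaked
# (Type-II/III-like) spectrum gives a much higher value.
print(round(chl_oc4(0.008, 0.006, 0.004, 0.002), 3))
print(round(chl_oc4(0.002, 0.003, 0.004, 0.006), 3))
```

The band-ratio behaviour also makes plain why such algorithms degrade in CDOM-rich Type-II/III waters: CDOM absorbs in the blue and depresses the ratio independently of chl-a.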

  4. Utilization of a genetic algorithm for the automatic detection of oil spill from RADARSAT-2 SAR satellite data

    International Nuclear Information System (INIS)

    Marghany, Maged

    2014-01-01

    Highlights: • An oil platform located 70 km from the coast of Louisiana sank on Thursday. • Oil spill has backscatter values of −25 dB in RADARSAT-2 SAR. • Oil spill is portrayed in SCNB mode by shallower incidence angle. • Ideal detection of oil spills in SAR images requires moderate wind speeds. • Genetic algorithm is an excellent tool for automatic detection of oil spill in RADARSAT-2 SAR data. - Abstract: In this work, a genetic algorithm is applied for the automatic detection of oil spills. The procedure is implemented using sequences from RADARSAT-2 SAR ScanSAR Narrow single-beam data acquired in the Gulf of Mexico. The study demonstrates that the implementation of crossover allows for the generation of an accurate oil spill pattern. This conclusion is confirmed by the receiver operating characteristic (ROC) curve, which indicates that oil slick footprints can be identified from the area between the ROC curve and the no-discrimination line, 90%, greater than that of the other surrounding environmental features. In conclusion, the genetic algorithm can be used as a tool for the automatic detection of oil spills, and the ScanSAR Narrow single-beam mode serves as an excellent mode for oil spill detection and survey.
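
As an illustration of how a genetic algorithm can be applied to dark-spot detection (a toy sketch under invented data, not the author's method), the loop below evolves a backscatter threshold separating low-backscatter oil pixels from brighter sea clutter, using the simplest possible selection, crossover and mutation:

```python
import random

# Toy GA for SAR dark-spot thresholding on synthetic backscatter (dB).
# Oil pixels cluster near -25 dB (as in the highlights above); the sea
# background is brighter. All distribution parameters are invented.

random.seed(1)
oil = [random.gauss(-25.0, 1.5) for _ in range(300)]   # dark spill pixels
sea = [random.gauss(-12.0, 2.0) for _ in range(700)]   # clutter background

def fitness(t):
    """Classification accuracy of threshold t: below t => oil."""
    hits = sum(v < t for v in oil) + sum(v >= t for v in sea)
    return hits / (len(oil) + len(sea))

pop = [random.uniform(-30.0, -5.0) for _ in range(20)]  # initial thresholds
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                     # truncation selection
    children = []
    while len(children) < 10:
        a, b = random.sample(parents, 2)
        child = 0.5 * (a + b)              # arithmetic crossover
        child += random.gauss(0.0, 0.5)    # mutation
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print(round(best, 1), round(fitness(best), 3))
```

A real implementation would of course encode richer features than a single threshold (texture, shape, incidence angle), but the selection/crossover/mutation skeleton is the same.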

  5. Vacuum Nuller Testbed Performance, Characterization and Null Control

    Science.gov (United States)

    Lyon, R. G.; Clampin, M.; Petrone, P.; Mallik, U.; Madison, T.; Bolcar, M.; Noecker, C.; Kendrick, S.; Helmbrecht, M. A.

    2011-01-01

    The Visible Nulling Coronagraph (VNC) can detect and characterize exoplanets with filled, segmented and sparse aperture telescopes, thereby spanning the choice of future internal coronagraph exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has developed a Vacuum Nuller Testbed (VNT) to advance this approach, and to assess and advance the technologies needed to realize the VNC as a flight instrument. The VNT is an ultra-stable testbed operating at 15 Hz in vacuum. It consists of a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror (DM), a coherent fiber bundle, and achromatic phase shifters. The two output channels are imaged with a vacuum photon-counting camera and a conventional camera. Error sensing and feedback to the DM and delay line with control algorithms are implemented in a real-time architecture. The inherent advantage of the VNC is that it is its own interferometer and directly controls its errors by exploiting images from the bright and dark channels simultaneously. Conservation of energy requires that the sum total of the photon counts be conserved independent of the VNC state. Thus the sensing and control bandwidth is limited by the target star's throughput, with the net effect that the higher bandwidth offloads stressing stability tolerances within the telescope. We report our recent progress with the VNT towards achieving an incremental sequence of contrast milestones of 10^8, 10^9 and 10^10, respectively, at inner working angles approaching 2λ/D. Discussed will be the optics, lab results, technologies, and null control. Shown will be evidence that the milestones have been achieved.

  6. A Multi-Vehicles, Wireless Testbed for Networked Control, Communications and Computing

    Science.gov (United States)

    Murray, Richard; Doyle, John; Effros, Michelle; Hickey, Jason; Low, Steven

    2002-03-01

    We have constructed a testbed consisting of 4 mobile vehicles (with 4 additional vehicles being completed), each with embedded computing and communications capability, for use in testing new approaches for command and control across dynamic networks. The system is being used, or is planned to be used, for testing a variety of communications-related technologies, including distributed command and control algorithms, dynamically reconfigurable network topologies, source coding for real-time transmission of data in lossy environments, and multi-network communications. A unique feature of the testbed is the use of vehicles that have second-order dynamics, requiring real-time feedback algorithms to stabilize the system while performing cooperative tasks. The testbed was constructed in the Caltech Vehicles Laboratory and consists of individual vehicles with PC-based computation and controls, and multiple communications devices (802.11 wireless Ethernet, Bluetooth, and infrared). The vehicles are freely moving, wheeled platforms propelled by high-performance ducted fans. The room contains access points for an 802.11 network, overhead visual sensing (to allow emulation of GPS signal processing), a centralized computer for emulating certain distributed computations, and network gateways to control and manipulate communications traffic.

  7. Validation of ozone profile retrievals derived from the OMPS LP version 2.5 algorithm against correlative satellite measurements

    Directory of Open Access Journals (Sweden)

    N. A. Kramarova

    2018-05-01

    Full Text Available The Limb Profiler (LP) is a part of the Ozone Mapping and Profiler Suite launched on board the Suomi NPP satellite in October 2011. The LP measures solar radiation scattered from the atmospheric limb in the ultraviolet and visible spectral ranges between the surface and 80 km. These measurements of scattered solar radiances allow for the retrieval of ozone profiles from cloud tops up to 55 km. The LP started operational observations in April 2012. In this study we evaluate more than 5.5 years of ozone profile measurements from the OMPS LP processed with the new NASA GSFC version 2.5 retrieval algorithm. We provide a brief description of the key changes that have been implemented in this new algorithm, including a pointing correction, new cloud height detection, an explicit aerosol correction and a reduction of the number of wavelengths used in the retrievals. The OMPS LP ozone retrievals have been compared with independent satellite profile measurements obtained from the Aura Microwave Limb Sounder (MLS), the Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS) and the Odin Optical Spectrograph and InfraRed Imaging System (OSIRIS). We document observed biases and seasonal differences and evaluate the stability of the version 2.5 ozone record over 5.5 years. Our analysis indicates that the mean differences between LP and correlative measurements are well within the required ±10 % between 18 and 42 km. In the upper stratosphere and lower mesosphere (> 43 km) LP tends to have a negative bias. We find larger biases in the lower stratosphere and upper troposphere, but LP ozone retrievals have significantly improved in version 2.5 compared to version 2 due to the implemented aerosol correction. In the northern high latitudes we observe larger biases between 20 and 32 km due to the remaining thermal sensitivity issue. Our analysis shows that LP ozone retrievals agree well with the correlative satellite observations in characterizing

  8. Definition of technology development missions for early space station satellite servicing, volume 1

    Science.gov (United States)

    1983-01-01

    The testbed role of an early manned space station is defined in the context of a satellite servicing evolutionary development and flight demonstration technology plan that results in an operational satellite servicing capability. A satellite servicing technology development mission (a set of missions) to be performed on an early manned space station is conceptually defined.

  9. On the role of visible radiation in ozone profile retrieval from nadir UV/VIS satellite measurements: An experiment with neural network algorithms inverting SCIAMACHY data

    International Nuclear Information System (INIS)

    Sellitto, P.; Di Noia, A.; Del Frate, F.; Burini, A.; Casadio, S.; Solimini, D.

    2012-01-01

    Theoretical evidence has been given of the role of visible (VIS) radiation in enhancing the accuracy of ozone retrievals from satellite data, especially in the troposphere. However, at present, VIS is not being systematically used together with ultraviolet (UV) measurements, even when possible with a single instrument, e.g., the SCanning Imaging Absorption spectroMeter for Atmospheric CartograpHY (SCIAMACHY). The reasons mainly reside in the defective performance of optimal estimation and regularization algorithms caused by inaccurate modeling of VIS interaction with aerosols or clouds, as well as in inconsistent intercalibration between UV and VIS measurements. Here we discuss the role of VIS radiation when it feeds a retrieval algorithm based on Neural Networks (NNs) that does not need a forward radiative transfer model and is robust with respect to calibration errors. The NN we designed was trained on a set of ozonesonde (OS) data and tested on an independent set of OS measurements. We compared the ozone concentration profiles retrieved from UV-only with those retrieved from UV plus VIS nadir data taken by SCIAMACHY. We found that VIS radiation was able to yield a more than 10% increase in accuracy and to substantially reduce biases of the retrieved profiles at tropospheric levels.

  10. A Fast and Sensitive New Satellite SO2 Retrieval Algorithm based on Principal Component Analysis: Application to the Ozone Monitoring Instrument

    Science.gov (United States)

    Li, Can; Joiner, Joanna; Krotkov, A.; Bhartia, Pawan K.

    2013-01-01

    We describe a new algorithm to retrieve SO2 from satellite-measured hyperspectral radiances. We employ the principal component analysis technique in regions with no significant SO2 to capture radiance variability caused by both physical processes (e.g., Rayleigh and Raman scattering and ozone absorption) and measurement artifacts. We use the resulting principal components and SO2 Jacobians calculated with a radiative transfer model to directly estimate SO2 vertical column density in one step. Application to the Ozone Monitoring Instrument (OMI) radiance spectra in the 310.5-340 nm range demonstrates that this approach can greatly reduce biases in the operational OMI product and decrease the noise by a factor of 2, providing greater sensitivity to anthropogenic emissions. The new algorithm is fast, eliminates the need for instrument-specific radiance correction schemes, and can be easily adapted to other sensors. These attributes make it a promising technique for producing long-term, consistent SO2 records for air quality and climate research.
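
A synthetic-data sketch of the one-step idea (not the operational OMI code): principal components fitted to SO2-free spectra absorb the background variability, and a joint least-squares fit of those components plus an assumed SO2 Jacobian returns the vertical column directly:

```python
import numpy as np

# PCA-based trace-gas retrieval on synthetic spectra. The background modes,
# noise levels and the Gaussian-shaped "SO2 Jacobian" are all invented; the
# point is the one-step fit of PCs + Jacobian.

rng = np.random.default_rng(42)
n_wl, n_ref = 80, 500
wl = np.linspace(310.5, 340.0, n_wl)

# Synthetic SO2-free reference ensemble: a few smooth modes plus noise.
modes = np.vstack([np.sin(wl / 5.0), np.cos(wl / 9.0), wl / wl.max()])
background = rng.normal(size=(n_ref, 3)) @ modes \
    + 0.01 * rng.normal(size=(n_ref, n_wl))

# PCA of the SO2-free ensemble (mean-centred SVD).
mean_spec = background.mean(axis=0)
_, _, vt = np.linalg.svd(background - mean_spec, full_matrices=False)
pcs = vt[:4]                                   # leading components

# Hypothetical SO2 Jacobian dI/dVCD: absorption strongest near 313 nm.
jacobian = -np.exp(-0.5 * ((wl - 313.0) / 3.0) ** 2)

# Simulate a measurement containing 5 column units of SO2.
true_vcd = 5.0
meas = rng.normal(size=3) @ modes + true_vcd * jacobian \
    + 0.01 * rng.normal(size=n_wl)

# One-step fit: measurement - mean ~= PCs @ a + Jacobian * vcd
design = np.vstack([pcs, jacobian]).T
coef, *_ = np.linalg.lstsq(design, meas - mean_spec, rcond=None)
vcd = coef[-1]
print(round(vcd, 2))
```

Because the Jacobian has a component orthogonal to the background PCs, the fit attributes the absorption signature to SO2 and recovers the injected column in a single linear solve.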

  11. Sensitivity of Satellite-Based Skin Temperature to Different Surface Emissivity and NWP Reanalysis Sources Demonstrated Using a Single-Channel, Viewing-Angle-Corrected Retrieval Algorithm

    Science.gov (United States)

    Scarino, B. R.; Minnis, P.; Yost, C. R.; Chee, T.; Palikonda, R.

    2015-12-01

    Single-channel algorithms for satellite thermal-infrared- (TIR-) derived land and sea surface skin temperature (LST and SST) are advantageous in that they can be easily applied to a variety of satellite sensors. They can also accommodate decade-spanning instrument series, particularly for periods when split-window capabilities are not available. However, the benefit of one unified retrieval methodology for all sensors comes at the cost of critical sensitivity to surface emissivity (ɛs) and atmospheric transmittance estimation. It has been demonstrated that as little as 0.01 variance in ɛs can amount to more than a 0.5-K adjustment in retrieved LST values. Atmospheric transmittance requires calculations that employ vertical profiles of temperature and humidity from numerical weather prediction (NWP) models. Selection of a given NWP model can significantly affect LST and SST agreement relative to their respective validation sources. Thus, it is necessary to understand the accuracies of the retrievals for various NWP models to ensure the best LST/SST retrievals. The sensitivities of the single-channel retrievals to surface emittance and NWP profiles are investigated using NASA Langley historic land and ocean clear-sky skin temperature (Ts) values derived from high-resolution 11-μm TIR brightness temperature measured from geostationary satellites (GEOSat) and Advanced Very High Resolution Radiometers (AVHRR). It is shown that mean GEOSat-derived, anisotropy-corrected LST can vary by up to ±0.8 K depending on whether CERES or MODIS ɛs sources are used. Furthermore, the use of either NOAA Global Forecast System (GFS) or NASA Goddard Modern-Era Retrospective Analysis for Research and Applications (MERRA) for the radiative transfer model initial atmospheric state can account for more than 0.5-K variation in mean Ts. The results are compared to measurements from the Surface Radiation Budget Network (SURFRAD), an Atmospheric Radiation Measurement (ARM) Program ground
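
The single-channel retrieval logic, and its sensitivity to emissivity, can be illustrated with a toy inversion of the 11-μm radiative transfer equation; all atmospheric terms below are invented numbers, not NWP output:

```python
import math

# Illustrative single-channel skin temperature inversion (not the NASA
# Langley code): the TOA radiance is corrected for path radiance, reflected
# downwelling radiance and transmittance, then inverted through the Planck
# function. tau, L_up and L_down would come from an NWP-driven radiative
# transfer model; here they are made-up values.

C1 = 1.19104e8   # W um^4 m^-2 sr^-1
C2 = 1.43877e4   # um K

def planck(T, lam=11.0):
    return C1 / (lam**5 * (math.exp(C2 / (lam * T)) - 1.0))

def inv_planck(L, lam=11.0):
    return C2 / (lam * math.log(C1 / (lam**5 * L) + 1.0))

def retrieve_lst(L_toa, emis, tau, L_up, L_down, lam=11.0):
    """Invert L_toa = emis*tau*B(Ts) + L_up + (1-emis)*tau*L_down."""
    L_surf = (L_toa - L_up - (1.0 - emis) * tau * L_down) / (emis * tau)
    return inv_planck(L_surf, lam)

# Forward-simulate a 300 K scene, then retrieve it.
emis, tau, L_up, L_down = 0.97, 0.85, 1.2, 1.8
L_toa = emis * tau * planck(300.0) + L_up + (1 - emis) * tau * L_down
lst = retrieve_lst(L_toa, emis, tau, L_up, L_down)

# A 0.01 emissivity error shifts the retrieval by roughly half a kelvin,
# consistent with the sensitivity quoted above.
lst_biased = retrieve_lst(L_toa, emis - 0.01, tau, L_up, L_down)
print(round(lst, 2), round(lst_biased - lst, 2))
```

The second number printed is the kind of ~0.5-K emissivity-driven adjustment the abstract refers to.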

  12. Validation of new satellite aerosol optical depth retrieval algorithm using Raman lidar observations at radiative transfer laboratory in Warsaw

    Science.gov (United States)

    Zawadzka, Olga; Stachlewska, Iwona S.; Markowicz, Krzysztof M.; Nemuc, Anca; Stebel, Kerstin

    2018-04-01

    During the exceptionally warm September of 2016, unique, stable weather conditions over Poland allowed for extensive testing of the new algorithm developed to improve the Meteosat Second Generation (MSG) Spinning Enhanced Visible and Infrared Imager (SEVIRI) aerosol optical depth (AOD) retrieval. The development was conducted in the frame of the ESA-ESRIN SAMIRA project. The new AOD algorithm aims at providing aerosol optical depth maps over the territory of Poland with a high temporal resolution of 15 minutes. It was tested on a data set obtained between 11 and 16 September 2016, during which a day of relatively clean atmospheric background, related to an Arctic air mass inflow, was surrounded by a few days with markedly increased aerosol load of different origin. On the clean reference day, the AOD forecast available online via the Copernicus Atmosphere Monitoring Service (CAMS) was used to estimate surface reflectance. The obtained AOD maps were validated against AODs available within the Poland-AOD and AERONET networks, and against AOD values obtained from the PollyXT-UW lidar of the University of Warsaw (UW).

  13. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently, assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.

  14. Meteosat SEVIRI Fire Radiative Power (FRP) products from the Land Surface Analysis Satellite Applications Facility (LSA SAF) - Part 1: Algorithms, product contents and analysis

    Science.gov (United States)

    Wooster, M. J.; Roberts, G.; Freeborn, P. H.; Xu, W.; Govaerts, Y.; Beeby, R.; He, J.; Lattanzio, A.; Mullen, R.

    2015-06-01

    Characterising changes in landscape-scale fire activity at very high temporal resolution is best achieved using thermal observations of actively burning fires made from geostationary Earth observation (EO) satellites. Over the last decade or more, a series of research and/or operational "active fire" products have been developed from these types of geostationary observations, often with the aim of supporting the generation of data related to biomass burning fuel consumption and trace gas and aerosol emission fields. The Fire Radiative Power (FRP) products generated by the Land Surface Analysis Satellite Applications Facility (LSA SAF) from data collected by the Meteosat Second Generation (MSG) Spinning Enhanced Visible and Infrared Imager (SEVIRI) are one such set of products, and are freely available in both near-real-time and archived form. Every 15 min, the algorithms used to generate these products identify and map the location of new SEVIRI observations containing actively burning fires, and characterise their individual rates of radiative energy release (fire radiative power; FRP), which is believed to be proportional to rates of biomass consumption and smoke emission. The FRP-PIXEL product contains the highest spatial resolution FRP dataset, delivered for all of Europe, northern and southern Africa, and part of South America at a spatial resolution of 3 km (decreasing away from the west African sub-satellite point) at the full 15 min temporal resolution. The FRP-GRID product is an hourly summary of the FRP-PIXEL data, produced at a 5° grid cell size and including simple bias adjustments for meteorological cloud cover and for the regional underestimation of FRP caused, primarily, by the non-detection of low-FRP fire pixels at SEVIRI's relatively coarse pixel size. Here we describe the enhanced geostationary Fire Thermal Anomaly (FTA) algorithm used to detect the SEVIRI active fire pixels, and detail the methods used to deliver atmospherically corrected FRP information
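
FRP retrieval in products of this kind rests on the middle-infrared (MIR) radiance method, in which FRP is proportional to the fire pixel's above-background radiance near 3.9 μm. A sketch with an illustrative sensor constant and pixel area, not the actual SEVIRI values:

```python
# MIR radiance method for fire radiative power (after Wooster et al.).
# The power-law sensor constant "a" and the pixel area below are
# illustrative placeholders, not the SEVIRI calibration.

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
A_PIX = 3000.0 * 3000.0  # nominal pixel area at sub-satellite point, m^2
A_SENSOR = 3.0e-9        # MIR power-law constant, W m^-2 sr^-1 um^-1 K^-4

def frp_mw(l_fire, l_background):
    """Fire radiative power (MW) from MIR spectral radiances
    (W m^-2 sr^-1 um^-1) of the fire pixel and its background."""
    frp_w = A_PIX * SIGMA / A_SENSOR * (l_fire - l_background)
    return frp_w / 1e6

print(round(frp_mw(0.75, 0.30), 1))
```

Subtracting the background radiance is what makes the estimate insensitive to the ambient surface, and it is also why low-FRP fires fall below the detection limit at coarse pixel sizes, the underestimation the FRP-GRID adjustment targets.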

  15. The end-to-end testbed of the optical metrology system on-board LISA Pathfinder

    Energy Technology Data Exchange (ETDEWEB)

    Steier, F; Cervantes, F Guzman; Marin, A F GarcIa; Heinzel, G; Danzmann, K [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut) and Universitaet Hannover (Germany); Gerardi, D, E-mail: frank.steier@aei.mpg.d [EADS Astrium Satellites GmbH, Friedrichshafen (Germany)

    2009-05-07

    LISA Pathfinder is a technology demonstration mission for the Laser Interferometer Space Antenna (LISA). The main experiment on board LISA Pathfinder is the so-called LISA Technology Package (LTP), which has the aim to measure the differential acceleration between two free-falling test masses with an accuracy of 3 x 10^-14 m s^-2 Hz^-1/2 between 1 mHz and 30 mHz. This measurement is performed interferometrically by the optical metrology system (OMS) on board LISA Pathfinder. In this paper, we present the development of an experimental end-to-end testbed of the entire OMS. It includes the interferometer and its sub-units, the interferometer backend (a phasemeter), and the processing of the phasemeter output data. Furthermore, three-axis piezo-actuated mirrors are used instead of the free-falling test masses for the characterization of the dynamic behaviour of the system and of some parts of the drag-free and attitude control system (DFACS), which controls the test masses and the satellite. The end-to-end testbed includes all parts of the LTP that can reasonably be tested on Earth without free-falling test masses. In its present state it consists mainly of breadboard components, some of which have already been replaced by engineering models of the LTP experiment. In the next steps, further engineering and flight models will also be inserted in this testbed and tested against well-characterized breadboard components. The presented testbed is an important reference for the unit tests and can also be used for validation of the on-board experiment during the mission.

  16. Design, Development, and Testing of a UAV Hardware-in-the-Loop Testbed for Aviation and Airspace Prognostics Research

    Science.gov (United States)

    Kulkarni, Chetan; Teubert, Chris; Gorospe, George; Burgett, Drew; Quach, Cuong C.; Hogge, Edward

    2016-01-01

    The airspace is becoming more and more complicated, and will continue to do so in the future with the integration of Unmanned Aerial Vehicles (UAVs), autonomy, spacecraft, and other forms of aviation technology into the airspace. The new technology and complexity increase the importance and difficulty of safety assurance. Additionally, testing new technologies on complex aviation systems and systems of systems can be very difficult, expensive, and sometimes unsafe in real-life scenarios. Prognostic methodology provides an estimate of the health and risks of a component, vehicle, or airspace, and knowledge of how that will change over time. That measure is especially useful in safety determination, mission planning, and maintenance scheduling. The developed testbed will be used to validate prediction algorithms for the real-time safety monitoring of the National Airspace System (NAS) and the prediction of unsafe events. The framework injects flight-related anomalies related to ground systems, routing, airport congestion, etc., to test and verify algorithms for NAS safety. In our research work, we develop a live, distributed, hardware-in-the-loop testbed for aviation and airspace prognostics, along with exploring further research possibilities to verify and validate future algorithms for NAS safety. The testbed integrates virtual aircraft using the X-Plane simulator and X-PlaneConnect toolbox, UAVs using onboard sensors and cellular communications, and hardware-in-the-loop components. In addition, the testbed includes an additional research framework to support and simplify future research activities. It enables safe, accurate, and inexpensive experimentation and research into airspace and vehicle prognosis that would not have been possible otherwise. This paper describes the design, development, and testing of this system. Software reliability, safety and latency are some of the critical design considerations in development of the testbed. Integration of HITL elements in

  17. Nuclear Instrumentation and Control Cyber Testbed Considerations – Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Gray; Robert Anderson; Julio G. Rodriguez; Cheol-Kwon Lee

    2014-08-01

    Abstract: Identifying and understanding digital instrumentation and control (I&C) cyber vulnerabilities within nuclear power plants and other nuclear facilities is critical if nation states desire to operate nuclear facilities safely, reliably, and securely. In order to demonstrate objective evidence that cyber vulnerabilities have been adequately identified and mitigated, a testbed representing a facility’s critical nuclear equipment must be replicated. Idaho National Laboratory (INL) has built and operated similar testbeds for common critical infrastructure I&C for over ten years. This experience developing, operating, and maintaining an I&C testbed in support of research identifying cyber vulnerabilities has led the Korea Atomic Energy Research Institute (KAERI) of the Republic of Korea to solicit the experience of INL to help mitigate problems early in the design, development, operation, and maintenance of a similar testbed. The following discusses I&C testbed lessons learned and the impact of these experiences on KAERI.

  18. Development and application of an actively controlled hybrid proton exchange membrane fuel cell - Lithium-ion battery laboratory test-bed based on off-the-shelf components

    Energy Technology Data Exchange (ETDEWEB)

    Yufit, V.; Brandon, N.P. [Dept. Earth Science and Engineering, Imperial College, London SW7 2AZ (United Kingdom)

    2011-01-15

    The use of commercially available components enables rapid prototyping and assembling of laboratory scale hybrid test-bed systems, which can be used to evaluate new hybrid configurations. The development of such a test-bed using an off-the-shelf PEM fuel cell, lithium-ion battery and DC/DC converter is presented here, and its application to a hybrid configuration appropriate for an unmanned underwater vehicle is explored. A control algorithm was implemented to regulate the power share between the fuel cell and the battery with a graphical interface to control, record and analyze the electrochemical and thermal parameters of the system. The results demonstrate the applicability of the test-bed and control algorithm for this application, and provide data on the dynamic electrical and thermal behaviour of the hybrid system. (author)
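
The abstract does not give the control algorithm itself; the sketch below shows one plausible power-share rule of the kind described, with the fuel cell tracking a rate-limited set point and the battery absorbing transients (all limits, gains and the SOC bookkeeping are invented):

```python
# Toy fuel-cell/battery power-share controller: the fuel cell follows a
# slew-limited set point biased to hold battery state of charge, and the
# battery supplies or absorbs the remainder. Not the paper's controller;
# every number here is an assumption.

FC_MAX = 300.0      # fuel cell power limit, W (assumed)
FC_SLEW = 20.0      # max fuel cell ramp per step, W (assumed)
SOC_TARGET = 0.6    # desired battery state of charge (assumed)

def step(demand, fc_power, soc):
    """One control step: returns (new fuel cell power, battery power)."""
    setpoint = demand + 50.0 * (SOC_TARGET - soc)   # recharge bias
    setpoint = min(max(setpoint, 0.0), FC_MAX)
    delta = max(-FC_SLEW, min(FC_SLEW, setpoint - fc_power))
    fc_power += delta
    battery = demand - fc_power      # positive = discharge, negative = charge
    return fc_power, battery

fc, soc = 0.0, 0.5
profile = [100.0] * 10 + [250.0] * 10 + [80.0] * 10   # demand steps, W
log = []
for p in profile:
    fc, batt = step(p, fc, soc)
    soc -= batt * 1e-4               # crude SOC bookkeeping for the demo
    log.append((fc, batt))

print(round(log[10][1], 1))          # battery output at the load step
```

At the 100 W to 250 W load step the slew limit keeps the fuel cell ramping gently while the battery covers the transient, which is the basic division of labour such hybrids aim for.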

  19. James Webb Space Telescope Optical Simulation Testbed: Segmented Mirror Phase Retrieval Testing

    Science.gov (United States)

    Laginja, Iva; Egron, Sylvain; Brady, Greg; Soummer, Remi; Lajoie, Charles-Philippe; Bonnefois, Aurélie; Long, Joseph; Michau, Vincent; Choquet, Elodie; Ferrari, Marc; Leboulleux, Lucie; Mazoyer, Johan; N’Diaye, Mamadou; Perrin, Marshall; Petrone, Peter; Pueyo, Laurent; Sivaramakrishnan, Anand

    2018-01-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a hardware simulator designed to produce JWST-like images. A model of the JWST three-mirror anastigmat is realized with three lenses in the form of a Cooke triplet, which provides JWST-like optical quality over a field equivalent to a NIRCam module, and an Iris AO segmented mirror with hexagonal elements stands in for the JWST segmented primary. This setup successfully produces images extremely similar to NIRCam images from cryotesting in terms of PSF morphology and sampling relative to the diffraction limit. The testbed is used for staff training of the wavefront sensing and control (WFS&C) team and for independent analysis of WFS&C scenarios of the JWST. Algorithms like geometric phase retrieval (GPR) that may be used in flight, and potential upgrades to JWST WFS&C, will be explored. We report on the current status of the testbed after alignment, implementation of the segmented mirror, and testing of phase retrieval techniques. This optical bench complements other work at the Makidon laboratory at the Space Telescope Science Institute, including the investigation of coronagraphy for segmented-aperture telescopes. Beyond JWST, we intend to use JOST for WFS&C studies for future large segmented space telescopes such as LUVOIR.
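
The GPR algorithm itself is not described in this abstract; as a stand-in, the classic Gerchberg-Saxton iteration below illustrates what image-based phase retrieval does in general: recover a pupil phase from the known pupil amplitude and a measured focal-plane intensity. Aperture, aberration and iteration count are all invented:

```python
import numpy as np

# Gerchberg-Saxton phase retrieval on a synthetic circular aperture.
# This is a generic illustration, not the JWST GPR algorithm.

rng = np.random.default_rng(7)
n = 64
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
pupil_amp = (x**2 + y**2 < (n // 4) ** 2).astype(float)   # circular aperture
true_phase = 0.3 * np.sin(2 * np.pi * x / n) * pupil_amp  # smooth aberration
field = pupil_amp * np.exp(1j * true_phase)
focal_amp = np.abs(np.fft.fft2(field))          # "measured" PSF amplitude

# Start from a small random phase guess and alternate between planes,
# imposing the known amplitude in each.
est = pupil_amp * np.exp(1j * rng.uniform(-0.1, 0.1, (n, n)))
for _ in range(200):
    F = np.fft.fft2(est)
    F = focal_amp * np.exp(1j * np.angle(F))      # impose focal amplitude
    est = np.fft.ifft2(F)
    est = pupil_amp * np.exp(1j * np.angle(est))  # impose pupil amplitude

# Residual mismatch between the estimate's PSF amplitude and the "data".
resid = np.abs(np.fft.fft2(est)) - focal_amp
rel = float(np.linalg.norm(resid) / np.linalg.norm(focal_amp))
print(round(rel, 4))
```

For consistent data the alternating projections drive the focal-plane residual steadily down, leaving a pupil-phase estimate that matches the injected aberration up to the usual global ambiguities (piston, conjugate flip).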

  20. TESTING THE APODIZED PUPIL LYOT CORONAGRAPH ON THE LABORATORY FOR ADAPTIVE OPTICS EXTREME ADAPTIVE OPTICS TESTBED

    International Nuclear Information System (INIS)

    Thomas, Sandrine J.; Dillon, Daren; Gavel, Donald; Soummer, Remi; Macintosh, Bruce; Sivaramakrishnan, Anand

    2011-01-01

    We present testbed results of the Apodized Pupil Lyot Coronagraph (APLC) at the Laboratory for Adaptive Optics (LAO). These results are part of the validation and tests of the coronagraph and of the Extreme Adaptive Optics (ExAO) for the Gemini Planet Imager (GPI). The apodizer component is manufactured with a halftone technique using black chrome microdots on glass. Testing this APLC (like any other coronagraph) requires extremely good wavefront correction, which is obtained to the 1 nm rms level using microelectromechanical systems (MEMS) technology on the ExAO visible testbed of the LAO at the University of California, Santa Cruz. We used an APLC coronagraph without central obstruction, both with a reference super-polished flat mirror and with the MEMS, to obtain one of the first images of a dark zone in a coronagraphic image with classical adaptive optics using a MEMS deformable mirror (without involving dark-hole algorithms). This was done as a complementary test to the GPI coronagraph testbed at the American Museum of Natural History, which studied the coronagraph itself without wavefront correction. Because we needed a full aperture, the coronagraph design is very different from the GPI design. We also tested a coronagraph with a central obstruction similar to that of GPI. We investigated the performance of the APLC coronagraph and, more particularly, the effect of the apodizer profile accuracy on the contrast. Finally, we compared the resulting contrast to predictions made with a wavefront propagation model of the testbed to understand the effects of phase and amplitude errors on the final contrast.

  1. Generalized Split-Window Algorithm for Estimate of Land Surface Temperature from Chinese Geostationary FengYun Meteorological Satellite (FY-2C) Data

    Directory of Open Access Journals (Sweden)

    Jun Xia

    2008-02-01

    Full Text Available On the basis of radiative transfer theory, this paper addresses the estimation of Land Surface Temperature (LST) from data of the first Chinese operational geostationary meteorological satellite, FengYun-2C (FY-2C), in two thermal infrared channels (IR1, 10.3-11.3 μm and IR2, 11.5-12.5 μm), using the Generalized Split-Window (GSW) algorithm proposed by Wan and Dozier (1996). The coefficients in the GSW algorithm corresponding to a series of overlapping ranges of the mean emissivity, the atmospheric Water Vapor Content (WVC), and the LST were derived using a statistical regression method from numerical values simulated with the accurate atmospheric radiative transfer model MODTRAN 4 over a wide range of atmospheric and surface conditions. The simulation analysis showed that the LST can be estimated by the GSW algorithm with a Root Mean Square Error (RMSE) of less than 1 K for the sub-ranges with a Viewing Zenith Angle (VZA) less than 30°, or for the sub-ranges with VZA less than 60° and atmospheric WVC less than 3.5 g/cm2, provided that the Land Surface Emissivities (LSEs) are known. In order to determine the range for the optimum coefficients of the GSW algorithm, the LSEs can be derived from the data in MODIS channels 31 and 32 provided by the MODIS/Terra LST product MOD11B1, or be estimated either according to the land surface classification or using the method proposed by Jiang et al. (2006); the WVC can be obtained from the MODIS total precipitable water product MOD05, or be retrieved using the method of Li et al. (2003). Sensitivity and error analyses in terms of the uncertainty of the LSE and WVC, as well as the instrumental noise, were performed. In addition, in order to compare different formulations of split-window algorithms, several recently proposed split-window algorithms were used to estimate the LST with the same simulated FY-2C data. The result of the intercomparison showed that most of the algorithms give
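
The GSW functional form of Wan and Dozier referenced above is simple to state; the coefficients below are placeholders (the real ones are regressed per sub-range of emissivity, WVC and viewing angle from MODTRAN simulations, as the abstract describes):

```python
# Generalized split-window LST (Wan and Dozier form) with invented
# placeholder coefficients; only the structure of the formula is real.

COEF = dict(c=1.0, a1=1.0, a2=0.2, a3=-0.3, b1=4.0, b2=1.5, b3=-2.5)

def gsw_lst(t11, t12, e11, e12, k=COEF):
    """LST (K) from brightness temperatures (K) in the two IR channels
    and the corresponding channel emissivities."""
    e = 0.5 * (e11 + e12)          # mean emissivity
    de = e11 - e12                 # emissivity difference
    mean = 0.5 * (t11 + t12)       # mean brightness temperature
    diff = 0.5 * (t11 - t12)       # split-window difference term
    a = k["a1"] + k["a2"] * (1.0 - e) / e + k["a3"] * de / e**2
    b = k["b1"] + k["b2"] * (1.0 - e) / e + k["b3"] * de / e**2
    return k["c"] + a * mean + b * diff

print(round(gsw_lst(295.0, 293.5, 0.97, 0.96), 2))
```

The mean term carries the surface signal while the difference term corrects for atmospheric water vapour absorption, which is why separate coefficient sets per WVC/VZA sub-range improve the fit.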

  2. Event metadata records as a testbed for scalable data mining

    International Nuclear Information System (INIS)

    Gemmeren, P van; Malon, D

    2010-01-01

    At a data rate of 200 hertz, event metadata records ('TAGs,' in ATLAS parlance) provide fertile grounds for development and evaluation of tools for scalable data mining. It is easy, of course, to apply HEP-specific selection or classification rules to event records and to label such an exercise 'data mining,' but our interest is different. Advanced statistical methods and tools such as classification, association rule mining, and cluster analysis are common outside the high energy physics community. These tools can prove useful, not for discovery physics, but for learning about our data, our detector, and our software. A fixed and relatively simple schema makes TAG export to other storage technologies such as HDF5 straightforward. This simplifies the task of exploiting very-large-scale parallel platforms such as Argonne National Laboratory's BlueGene/P, currently the largest supercomputer in the world for open science, in the development of scalable tools for data mining. Using a domain-neutral scientific data format may also enable us to take advantage of existing data mining components from other communities. There is, further, a substantial literature on the topic of one-pass algorithms and stream mining techniques, and such tools may be inserted naturally at various points in the event data processing and distribution chain. This paper describes early experience with event metadata records from ATLAS simulation and commissioning as a testbed for scalable data mining tool development and evaluation.
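
    As an illustration of the one-pass stream-mining primitives mentioned above, classic reservoir sampling draws a uniform fixed-size sample from a stream of unknown length in a single pass (a generic textbook algorithm, not ATLAS code):

    ```python
    import random

    def reservoir_sample(stream, k, rng=None):
        """One-pass uniform sampling of k items from a stream of unknown length."""
        rng = rng or random.Random(42)
        sample = []
        for i, item in enumerate(stream):
            if i < k:
                sample.append(item)       # fill the reservoir first
            else:
                j = rng.randint(0, i)     # replace with probability k/(i+1)
                if j < k:
                    sample[j] = item
        return sample

    sample = reservoir_sample(range(100000), 10)
    ```

    Primitives of this kind can sit anywhere in an event-processing chain because they need only a single forward pass over the records.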

  3. 77 FR 18793 - Spectrum Sharing Innovation Test-Bed Pilot Program

    Science.gov (United States)

    2012-03-28

    .... 120322212-2212-01] Spectrum Sharing Innovation Test-Bed Pilot Program AGENCY: National Telecommunications... Innovation Test-Bed pilot program to assess whether devices employing Dynamic Spectrum Access techniques can... Spectrum Sharing Innovation Test-Bed (Test-Bed) pilot program to examine the feasibility of increased...

  4. A low-cost test-bed for real-time landmark tracking

    Science.gov (United States)

    Csaszar, Ambrus; Hanan, Jay C.; Moreels, Pierre; Assad, Christopher

    2007-04-01

    A low-cost vehicle test-bed system was developed to iteratively test, refine, and demonstrate navigation algorithms before attempting to transfer the algorithms to more advanced rover prototypes. The platform used here was a modified radio-controlled (RC) car. A microcontroller board and an onboard laptop computer allow for either autonomous or remote operation via a computer workstation. The sensors onboard the vehicle represent the types currently used on NASA-JPL rover prototypes. For dead-reckoning navigation, optical wheel encoders, a single-axis gyroscope, and a 2-axis accelerometer were used. An ultrasound ranger is available to calculate distance as a substitute for the stereo vision systems presently used on rovers. The prototype also carries a small laptop computer with a USB camera and wireless transmitter to send real-time video to an off-board computer. A real-time user interface was implemented that combines an automatic image feature selector, tracking parameter controls, a streaming video viewer, and user-generated or autonomous driving commands. Using the test-bed, real-time landmark tracking was demonstrated by autonomously driving the vehicle through the JPL Mars yard, with the algorithms tracking rocks as waypoints to generate coordinates for calculating relative motion and for visually servoing to science targets. A limitation of the current system is serial computing (each additional landmark is tracked in order), but since each landmark is tracked independently, adding targets would not significantly diminish system speed if the system were transferred to appropriate parallel hardware.
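
    The dead-reckoning update from wheel encoders and a yaw gyroscope can be sketched as below; the tick-to-meter scale and step values are illustrative, not the JPL test-bed calibration:

    ```python
    import math

    def dead_reckon(pose, encoder_ticks, gyro_rate, dt, ticks_per_meter=1000.0):
        """One dead-reckoning update: wheel encoders give traveled distance,
        the gyroscope gives heading rate. pose = (x, y, heading_rad)."""
        x, y, heading = pose
        heading += gyro_rate * dt                 # integrate yaw rate
        dist = encoder_ticks / ticks_per_meter    # ticks -> meters
        return (x + dist * math.cos(heading),
                y + dist * math.sin(heading),
                heading)

    pose = (0.0, 0.0, 0.0)
    for _ in range(4):                            # drive straight: 4 x 0.25 m
        pose = dead_reckon(pose, 250, 0.0, 0.1)
    # pose is now (1.0, 0.0, 0.0)
    ```

    Encoder and gyro errors accumulate over time, which is why such platforms also carry exteroceptive sensors (here, the ultrasound ranger and camera) for correction.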

  5. Development of an autonomous power system testbed

    International Nuclear Information System (INIS)

    Barton, J.R.; Adams, T.; Liffring, M.E.

    1985-01-01

    A power system testbed has been assembled to advance the development of large autonomous electrical power systems required for the space station, spacecraft, and aircraft. The power system for this effort was designed to simulate single- or dual-bus autonomous power systems, or autonomous systems that reconfigure from a single bus to a dual bus following a severe fault. The approach taken was to provide a flexible power system design with two computer systems for control and management. One computer operates as the control system and performs basic control functions, data and command processing, and charge control, and provides status to the second computer. The second computer contains expert system software for mission planning, load management, and fault identification and recovery, and sends load and configuration commands to the control system.

  6. Aerodynamic design of the National Rotor Testbed.

    Energy Technology Data Exchange (ETDEWEB)

    Kelley, Christopher Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    A new wind turbine blade has been designed for the National Rotor Testbed (NRT) project and for future experiments at the Scaled Wind Farm Technology (SWiFT) facility, with a specific focus on scaled wakes. This report shows the aerodynamic design of new blades that can produce a wake that has similitude to utility-scale blades despite the difference in size and location in the atmospheric boundary layer. The dimensionless quantities of circulation, induction, thrust coefficient, and tip-speed ratio were kept equal between rotor scales in region 2 of operation. The new NRT design matched the aerodynamic quantities of the most common wind turbine in the United States, the GE 1.5sle turbine with 37c model blades. The NRT blade design is presented along with its performance subject to the winds at SWiFT. The design requirements determined by the SWiFT experimental test campaign are shown to be met.
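
    Two of the matched dimensionless quantities have simple closed forms; a minimal sketch (the input values in any example are illustrative, not SWiFT or GE 1.5sle numbers):

    ```python
    import math

    def tip_speed_ratio(omega_rpm, radius_m, wind_mps):
        """Tip-speed ratio: lambda = Omega * R / U."""
        return omega_rpm * 2.0 * math.pi / 60.0 * radius_m / wind_mps

    def thrust_coefficient(thrust_n, radius_m, wind_mps, rho=1.225):
        """C_T = T / (0.5 * rho * A * U^2), with A the rotor disc area."""
        area = math.pi * radius_m ** 2
        return thrust_n / (0.5 * rho * area * wind_mps ** 2)
    ```

    Keeping these quantities equal across rotor scales is what allows the subscale wake to remain similar to the utility-scale one.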

  7. Building a ROS-Based Testbed for Realistic Multi-Robot Simulation: Taking the Exploration as an Example

    Directory of Open Access Journals (Sweden)

    Zhi Yan

    2017-09-01

    Full Text Available While the robotics community agrees that benchmarking is of high importance for objectively comparing different solutions, there are only a few, limited tools to support it. To address this issue in the context of multi-robot systems, we have defined a benchmarking process based on experimental designs, aimed at improving the reproducibility of experiments by making explicit all elements of a benchmark such as parameters, measurements, and metrics. We have also developed a ROS (Robot Operating System)-based testbed with the goal of making it easy for users to validate, benchmark, and compare different algorithms, including coordination strategies. Our testbed uses the MORSE (Modular OpenRobots Simulation Engine) simulator for realistic simulation and a computer cluster for decentralized computation. In this paper, we present our testbed in detail, covering the architecture and infrastructure, the issues encountered in implementing the infrastructure, and the automation of the deployment. We also report a series of experiments on multi-robot exploration, in order to demonstrate the capabilities of our testbed.

  8. Testbed model and data assimilation for ARM

    International Nuclear Information System (INIS)

    Louis, J.F.

    1992-01-01

    The objectives of this contract are to further develop and test the ALFA (AER Local Forecast and Assimilation) model, originally designed at AER for local weather prediction, and apply it to three distinct but related purposes in connection with the Atmospheric Radiation Measurement (ARM) program: (a) to provide a testbed that simulates a global climate model in order to facilitate the development and testing of new cloud parametrizations and radiation models; (b) to assimilate the ARM data continuously at the scale of a climate model, using the adjoint method, thus providing the initial conditions and verification data for testing parametrizations; (c) to study the sensitivity of a radiation scheme to cloud parameters, again using the adjoint method, thus demonstrating the usefulness of the testbed model. The data assimilation will use a variational technique that minimizes the difference between the model results and the observations during the analysis period. The adjoint model is used to compute the gradient of a measure of the model errors with respect to nudging terms that are added to the equations to force the model output closer to the data. The radiation scheme that will be included in the basic ALFA model makes use of a general two-stream approximation and is designed for vertically inhomogeneous, multiple-scattering atmospheres. The sensitivity of this model to the definition of cloud parameters will be studied. The adjoint technique will also be used to compute the sensitivities. This project is designed to provide the Science Team members with the appropriate tools and modeling environment for proper testing and tuning of new radiation models and cloud parametrization schemes.
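
    The adjoint-based gradient with respect to nudging terms can be illustrated on a toy scalar model (a generic sketch of the variational technique, not the ALFA model): for x_{t+1} = (1 - dt) x_t + dt u_t and cost J = 0.5 * sum_t (x_{t+1} - y_t)^2, one backward sweep of the adjoint variable yields every dJ/du_t:

    ```python
    def forward(x0, u, dt=0.1):
        """Toy scalar model x_{t+1} = (1 - dt) * x_t + dt * u_t."""
        xs = [x0]
        for ut in u:
            xs.append((1 - dt) * xs[-1] + dt * ut)
        return xs

    def cost(xs, obs):
        """J = 0.5 * sum of squared misfits between model states and observations."""
        return 0.5 * sum((x - y) ** 2 for x, y in zip(xs[1:], obs))

    def adjoint_grad(xs, obs, dt=0.1):
        """Gradient of J w.r.t. each nudging term u_t from one backward sweep."""
        lam, grad = 0.0, [0.0] * len(obs)
        for t in reversed(range(len(obs))):
            lam = (1 - dt) * lam + (xs[t + 1] - obs[t])  # adjoint recursion
            grad[t] = dt * lam
        return grad

    # a few steepest-descent steps on the nudging terms
    obs = [1.0] * 20
    u = [0.0] * 20
    c_start = cost(forward(0.0, u), obs)
    for _ in range(100):
        xs = forward(0.0, u)
        u = [ui - 0.5 * gi for ui, gi in zip(u, adjoint_grad(xs, obs))]
    c_end = cost(forward(0.0, u), obs)   # lower than c_start
    ```

    The point of the adjoint is that all components of the gradient come from a single backward integration, regardless of how many nudging terms there are.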

  9. Wavefront control performance modeling with WFIRST shaped pupil coronagraph testbed

    Science.gov (United States)

    Zhou, Hanying; Nemati, Bijian; Krist, John; Cady, Eric; Kern, Brian; Poberezhskiy, Ilya

    2017-09-01

    NASA's WFIRST mission includes a coronagraph instrument (CGI) for direct imaging of exoplanets. Significant improvement in CGI model fidelity has been made recently, alongside a testbed high contrast demonstration in a simulated dynamic environment at JPL. We present our modeling method and results of comparisons to testbed's high order wavefront correction performance for the shaped pupil coronagraph. Agreement between model prediction and testbed result at better than a factor of 2 has been consistently achieved in raw contrast (contrast floor, chromaticity, and convergence), and with that comes good agreement in contrast sensitivity to wavefront perturbations and mask lateral shear.

  10. COMBINATION OF GENETIC ALGORITHM AND DEMPSTER-SHAFER THEORY OF EVIDENCE FOR LAND COVER CLASSIFICATION USING INTEGRATION OF SAR AND OPTICAL SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    H. T. Chu

    2012-07-01

    Full Text Available The integration of different kinds of remotely sensed data, in particular Synthetic Aperture Radar (SAR) and optical satellite imagery, is considered a promising approach for land cover classification because of the complementary properties of each data source. However, the challenges are: how to fully exploit the capabilities of these multiple data sources, which combined datasets should be used, and which data processing and classification techniques are most appropriate in order to achieve the best results. In this paper an approach based on the synergistic use of feature selection (FS) methods with a Genetic Algorithm (GA) and the combination of multiple classifiers based on the Dempster-Shafer Theory of Evidence is proposed and evaluated for classifying land cover features in New South Wales, Australia. Multi-date SAR data, including ALOS/PALSAR and ENVISAT/ASAR, and optical (Landsat 5 TM) images were used for this study. Textural information was also derived and integrated with the original images. Various combined datasets were generated for classification. Three classifiers, namely Artificial Neural Network (ANN), Support Vector Machines (SVMs), and Self-Organizing Map (SOM), were employed. Firstly, feature selection using the GA was applied for each classifier and dataset to determine the optimal input features and parameters. Then the results of the three classifiers on particular datasets were combined using the Dempster-Shafer Theory of Evidence. Results of this study demonstrate the advantages of the proposed method for land cover mapping using complex datasets. It is revealed that the use of the GA in conjunction with the Dempster-Shafer Theory of Evidence can significantly improve the classification accuracy. Furthermore, the integration of SAR and optical data often outperforms single-type datasets.
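
    Dempster's rule of combination, the core of the classifier-fusion step described above, can be sketched directly; the class labels in the example are hypothetical, and real mass functions would come from the ANN/SVM/SOM outputs:

    ```python
    def dempster_combine(m1, m2):
        """Dempster's rule for two mass functions whose focal elements are
        frozensets of class labels; values are masses summing to 1."""
        combined, conflict = {}, 0.0
        for a, ma in m1.items():
            for b, mb in m2.items():
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + ma * mb
                else:
                    conflict += ma * mb          # mass assigned to empty set
        if conflict >= 1.0:
            raise ValueError("total conflict; evidence cannot be combined")
        k = 1.0 - conflict                       # normalization factor
        return {s: v / k for s, v in combined.items()}

    # two classifiers' beliefs over {water, forest} (illustrative numbers)
    m1 = {frozenset({"water"}): 0.6, frozenset({"water", "forest"}): 0.4}
    m2 = {frozenset({"water"}): 0.5, frozenset({"forest"}): 0.3,
          frozenset({"water", "forest"}): 0.2}
    fused = dempster_combine(m1, m2)
    ```

    Combining a third classifier is just another application of the same rule, since the operation is associative.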

  11. Web Based Rapid Mapping of Disaster Areas using Satellite Images, Web Processing Service, Web Mapping Service, Frequency Based Change Detection Algorithm and J-iView

    Science.gov (United States)

    Bandibas, J. C.; Takarada, S.

    2013-12-01

    Timely identification of areas affected by natural disasters is very important for a successful rescue and effective emergency relief efforts. This research focuses on the development of a cost effective and efficient system of identifying areas affected by natural disasters, and the efficient distribution of the information. The developed system is composed of 3 modules which are the Web Processing Service (WPS), Web Map Service (WMS) and the user interface provided by J-iView (fig. 1). WPS is an online system that provides computation, storage and data access services. In this study, the WPS module provides online access of the software implementing the developed frequency based change detection algorithm for the identification of areas affected by natural disasters. It also sends requests to WMS servers to get the remotely sensed data to be used in the computation. WMS is a standard protocol that provides a simple HTTP interface for requesting geo-registered map images from one or more geospatial databases. In this research, the WMS component provides remote access of the satellite images which are used as inputs for land cover change detection. The user interface in this system is provided by J-iView, which is an online mapping system developed at the Geological Survey of Japan (GSJ). The 3 modules are seamlessly integrated into a single package using J-iView, which could rapidly generate a map of disaster areas that is instantaneously viewable online. The developed system was tested using ASTER images covering the areas damaged by the March 11, 2011 tsunami in northeastern Japan. The developed system efficiently generated a map showing areas devastated by the tsunami. Based on the initial results of the study, the developed system proved to be a useful tool for emergency workers to quickly identify areas affected by natural disasters.
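
    The WMS request step described above follows the standard OGC GetMap interface; a minimal sketch of building such a request (the endpoint and layer name are hypothetical, but the parameter set is the one the WMS 1.1.1 specification defines):

    ```python
    from urllib.parse import urlencode

    def wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                       srs="EPSG:4326", fmt="image/png"):
        """Build a WMS 1.1.1 GetMap request URL for a geo-registered map image."""
        params = {
            "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
            "LAYERS": layer, "STYLES": "", "SRS": srs,
            "BBOX": ",".join(str(v) for v in bbox),   # minx,miny,maxx,maxy
            "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
        }
        return base_url + "?" + urlencode(params)

    url = wms_getmap_url("https://example.org/wms", "aster_post_event",
                         (140.7, 37.8, 141.2, 38.3))
    ```

    Because the interface is plain HTTP, the WPS module can fetch co-registered pre- and post-event images from any compliant server before running the change detection.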

  12. Prognostics-Enabled Power Supply for ADAPT Testbed, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Ridgetop's role is to develop electronic prognostics for sensing power systems in support of NASA/Ames ADAPT testbed. The prognostic enabled power systems from...

  13. The Living With a Star Space Environment Testbed Payload

    Science.gov (United States)

    Xapsos, Mike

    2015-01-01

    This presentation outlines a brief description of the Living With a Star (LWS) Program missions and detailed information about the Space Environment Testbed (SET) payload consisting of a space weather monitor and carrier containing 4 board experiments.

  14. Integrating Simulated Physics and Device Virtualization in Control System Testbeds

    OpenAIRE

    Redwood , Owen; Reynolds , Jason; Burmester , Mike

    2016-01-01

    Part 3: INFRASTRUCTURE MODELING AND SIMULATION; International audience; Malware and forensic analyses of embedded cyber-physical systems are tedious, manual processes that testbeds are commonly not designed to support. Additionally, attesting the physics impact of embedded cyber-physical system malware has no formal methodologies and is currently an art. This chapter describes a novel testbed design methodology that integrates virtualized embedded industrial control systems and physics simula...

  15. A Novel UAV Electric Propulsion Testbed for Diagnostics and Prognostics

    Science.gov (United States)

    Gorospe, George E., Jr.; Kulkarni, Chetan S.

    2017-01-01

    This paper presents a novel hardware-in-the-loop (HIL) testbed for system-level diagnostics and prognostics of an electric propulsion system used in UAVs (unmanned aerial vehicles). Referencing the all-electric Edge 540T aircraft used in science and research by NASA Langley Research Center, the HIL testbed includes an identical propulsion system, consisting of motors, speed controllers, and batteries. Isolated in a controlled laboratory environment, the propulsion system has been instrumented for advanced diagnostics and prognostics. To produce flight-like loading on the system, a slave motor is coupled to the motor under test (MUT), providing variable mechanical resistance and the capability of introducing nondestructive, mechanical wear-like frictional loads on the system. This testbed enables the verification of mathematical models of each component of the propulsion system, the repeatable generation of flight-like loads on the system for fault analysis, test-to-failure scenarios, and the development of advanced system-level diagnostics and prognostics methods. The capabilities of the testbed are extended through the integration of a LabVIEW-based client for the Live Virtual Constructive Distributed Environment (LVCDC) Gateway, which enables both the publishing of generated data for remotely located observers and prognosers and the synchronization of the testbed propulsion system with vehicles in the air. The developed HIL testbed gives researchers easy access to a scientifically relevant portion of the aircraft without the overhead and dangers encountered during actual flight.

  16. Recent Successes and Future Plans for NASA's Space Communications and Navigation Testbed on the International Space Station

    Science.gov (United States)

    Reinhart, Richard C.; Sankovic, John M.; Johnson, Sandra K.; Lux, James P.; Chelmins, David T.

    2014-01-01

    Flexible and extensible space communications architectures and technology are essential to enable future space exploration and science activities. NASA has championed the development of the Space Telecommunications Radio System (STRS) software defined radio (SDR) standard and the application of SDR technology to reduce the costs and risks of using SDRs for space missions, and has developed an on-orbit testbed to validate these capabilities. The Space Communications and Navigation (SCaN) Testbed (previously known as the Communications, Navigation, and Networking reConfigurable Testbed (CoNNeCT)) is advancing SDR, on-board networking, and navigation technologies by conducting space experiments aboard the International Space Station. During its first year(s) on orbit, the SCaN Testbed has achieved considerable accomplishments to better understand SDRs and their applications. The SDR platforms and software waveforms on each SDR have over 1500 hours of operation and are performing as designed. The Ka-band SDR on the SCaN Testbed is NASA's first space Ka-band transceiver and NASA's first Ka-band mission using the Space Network. This has provided exciting opportunities to operate at Ka-band and assist with on-orbit tests of NASA's newest Tracking and Data Relay Satellites (TDRS). During its first year, the SCaN Testbed completed its first on-orbit SDR reconfigurations, which occur when implementing new waveforms on an SDR. SDR reconfigurations allow a radio to change minor parameters, such as data rate, or its complete functionality. New waveforms, which provide new capability and are reusable across different missions, provide long-term value for reconfigurable platforms such as SDRs. The STRS Standard provides guidelines for new waveform development by third parties. Waveform development by organizations other than the platform provider offers NASA the ability to develop waveforms itself and to reduce its dependence and costs on the platform developer. Each of these

  17. Next-Generation Satellite Precipitation Products for Understanding Global and Regional Water Variability

    Science.gov (United States)

    Hou, Arthur Y.

    2011-01-01

    A major challenge in understanding the space-time variability of continental water fluxes is the lack of accurate precipitation estimates over complex terrains. While satellite precipitation observations can be used to complement ground-based data to obtain improved estimates, space-based and ground-based estimates come with their own sets of uncertainties, which must be understood and characterized. Quantitative estimation of uncertainties in these products also provides a necessary foundation for merging satellite and ground-based precipitation measurements within a rigorous statistical framework. Global Precipitation Measurement (GPM) is an international satellite mission that will provide next-generation global precipitation data products for research and applications. It consists of a constellation of microwave sensors provided by NASA, JAXA, CNES, ISRO, EUMETSAT, DOD, NOAA, NPP, and JPSS. At the heart of the mission is the GPM Core Observatory provided by NASA and JAXA to be launched in 2013. The GPM Core, which will carry the first space-borne dual-frequency radar and a state-of-the-art multi-frequency radiometer, is designed to set new reference standards for precipitation measurements from space, which can then be used to unify and refine precipitation retrievals from all constellation sensors. The next-generation constellation-based satellite precipitation estimates will be characterized by intercalibrated radiometric measurements and physically based retrievals using a common observation-derived hydrometeor database. 
For pre-launch algorithm development and post-launch product evaluation, NASA supports an extensive ground validation (GV) program in cooperation with domestic and international partners to improve (1) physics of remote-sensing algorithms through a series of focused field campaigns, (2) characterization of uncertainties in satellite and ground-based precipitation products over selected GV testbeds, and (3) modeling of atmospheric processes and

  18. Termite: Emulation Testbed for Encounter Networks

    Directory of Open Access Journals (Sweden)

    Rodrigo Bruno

    2015-08-01

    Full Text Available Cutting-edge mobile devices like smartphones and tablets are equipped with various infrastructureless wireless interfaces, such as WiFi Direct and Bluetooth. Such technologies allow for novel mobile applications that take advantage of casual encounters between co-located users. However, the need to mimic the behavior of real-world encounter networks makes testing and debugging of such applications hard tasks. We present Termite, an emulation testbed for encounter networks. Our system allows developers to run their applications on a virtual encounter network emulated by software. Developers can model arbitrary encounter networks and specify user interactions on the emulated virtual devices. To facilitate testing and debugging, developers can place breakpoints, inspect the runtime state of virtual nodes, and run experiments in a stepwise fashion. Termite defines its own Petri Net variant to model the dynamically changing topology and synthesize user interactions with virtual devices. The system is designed to efficiently multiplex an underlying emulation hosting infrastructure across multiple developers, and to support heterogeneous mobile platforms. Our current system implementation supports virtual Android devices communicating over WiFi Direct networks and runs on top of a local cloud infrastructure. We evaluated our system using emulator network traces, and found that Termite is expressive and performs well.

  19. Ames life science telescience testbed evaluation

    Science.gov (United States)

    Haines, Richard F.; Johnson, Vicki; Vogelsong, Kristofer H.; Froloff, Walt

    1989-01-01

    Eight surrogate spaceflight mission specialists participated in a real-time evaluation of remote coaching using the Ames Life Science Telescience Testbed facility. This facility consisted of three remotely located nodes: (1) a prototype Space Station glovebox; (2) a ground control station; and (3) a principal investigator's (PI) work area. The major objective of this project was to evaluate the effectiveness of telescience techniques and hardware to support three realistic remote coaching science procedures: plant seed germinator charging, plant sample acquisition and preservation, and remote plant observation with ground coaching. Each scenario was performed by a subject acting as flight mission specialist, interacting with a payload operations manager and a principal investigator expert. All three groups were physically isolated from each other yet linked by duplex audio and color video communication channels and networked computer workstations. Workload ratings were made by the flight and ground crewpersons immediately after completing their assigned tasks. Time to complete each scientific procedural step was recorded automatically. Two expert observers also made performance ratings and various error assessments. The results are presented and discussed.

  20. Multi-agent robotic systems and applications for satellite missions

    Science.gov (United States)

    Nunes, Miguel A.

    A revolution in the space sector is happening. It is expected that in the next decade there will be more satellites launched than in the previous sixty years of space exploration. Major challenges are associated with this growth of space assets, such as the autonomy and management of large groups of satellites, particularly small satellites. There are two main objectives for this work. First, a flexible and distributed software architecture is presented to expand the possibilities of spacecraft autonomy, in particular autonomous motion in attitude and position. The approach taken is based on the concept of distributed software agents, also referred to as a multi-agent robotic system. Agents are defined as software programs that are social, reactive, and proactive, and that autonomously maximize the chances of achieving the set goals. Part of the work is to demonstrate that a multi-agent robotic system is a feasible approach for different problems of autonomy, such as satellite attitude determination and control and autonomous rendezvous and docking. The second main objective is to develop a method to optimize multi-satellite configurations in space, also known as satellite constellations. This automated method generates new optimal mega-constellation designs for Earth observation and fast revisit times on large ground areas. The optimal satellite constellation can be used by researchers as the baseline for new missions. The first contribution of this work is the development of a new multi-agent robotic system for distributing the attitude determination and control subsystem for HiakaSat. The multi-agent robotic system is implemented and tested on the satellite hardware-in-the-loop testbed that simulates a representative space environment. The results show that the newly proposed system for this particular case achieves an equivalent control performance when compared to the monolithic implementation. In terms of computational efficiency it is found that the multi

  1. The Fourier-Kelvin Stellar Interferometer (FKSI) Nulling Testbed II: Closed-loop Path Length Metrology And Control Subsystem

    Science.gov (United States)

    Frey, B. J.; Barry, R. K.; Danchi, W. C.; Hyde, T. T.; Lee, K. Y.; Martino, A. J.; Zuray, M. S.

    2006-01-01

    The Fourier-Kelvin Stellar Interferometer (FKSI) is a mission concept for an imaging and nulling interferometer in the near- to mid-infrared spectral region (3-8 microns), and will be a scientific and technological pathfinder for upcoming missions including TPF-I/DARWIN, SPECS, and SPIRIT. At NASA's Goddard Space Flight Center, we have constructed a symmetric Mach-Zehnder nulling testbed to demonstrate techniques and algorithms that can be used to establish and maintain the 10^-4 null depth that will be required for such a mission. Among the challenges inherent in such a system is the ability to acquire and track the null fringe to the desired depth for timescales on the order of hours in a laboratory environment. In addition, it is desirable to achieve this stability without using conventional dithering techniques. We describe recent testbed metrology and control system developments necessary to achieve these goals and present our preliminary results.

  2. A Monocular Vision Measurement System of Three-Degree-of-Freedom Air-Bearing Test-Bed Based on FCCSP

    Science.gov (United States)

    Gao, Zhanyu; Gu, Yingying; Lv, Yaoyu; Xu, Zhenbang; Wu, Qingwen

    2018-06-01

    A monocular vision-based pose measurement system is presented for real-time measurement of a three-degree-of-freedom (3-DOF) air-bearing test-bed. Firstly, a circular planar cooperative target is designed. An image of the target, which is fixed on the test-bed, is then acquired. Blob-analysis-based image processing is used to detect the object circles on the target. A fast algorithm (FCCSP) based on pixel statistics is proposed to extract the centers of the object circles. Finally, pose measurements are obtained by combining the extracted centers with the coordinate transformation relation. Experiments show that the proposed method is fast, accurate, and robust enough to satisfy the requirements of the pose measurement.
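
    A pixel-statistics center estimate of the kind used for the circle extraction can be sketched as a plain centroid over a blob's foreground pixels; this is a simplified stand-in for illustration, not the published FCCSP algorithm:

    ```python
    def circle_center(mask):
        """Estimate a circle's center as the mean of foreground pixel
        coordinates (row, col) in a binary mask."""
        pts = [(r, c) for r, row in enumerate(mask)
                      for c, v in enumerate(row) if v]
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

    # a small symmetric blob standing in for one detected circle
    mask = [
        [0, 1, 1, 0],
        [1, 1, 1, 1],
        [1, 1, 1, 1],
        [0, 1, 1, 0],
    ]
    center = circle_center(mask)   # -> (1.5, 1.5)
    ```

    With several such centers in the image and the known target geometry, the coordinate transformation then yields the test-bed's planar pose.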

  3. Real-world experimentation of distributed DSA network algorithms

    DEFF Research Database (Denmark)

    Tonelli, Oscar; Berardinelli, Gilberto; Tavares, Fernando Menezes Leitão

    2013-01-01

    The problem of spectrum scarcity in uncoordinated and/or heterogeneous wireless networks is the key aspect driving the research in the field of flexible management of frequency resources. In particular, distributed dynamic spectrum access (DSA) algorithms enable an efficient sharing of frequency resources under real-world conditions such as a dynamic propagation environment, human presence impact, and terminal mobility. This chapter focuses on the practical aspects related to real-world experimentation with distributed DSA network algorithms over a testbed network. Challenges and solutions are extensively discussed, from the testbed design to the setup of experiments. A practical example of the experimentation process with a DSA algorithm is also provided.

  4. Development of a hardware-in-the-loop testbed to demonstrate multiple spacecraft operations in proximity

    Science.gov (United States)

    Eun, Youngho; Park, Sang-Young; Kim, Geuk-Nam

    2018-06-01

    This paper presents a new state-of-the-art ground-based hardware-in-the-loop test facility, which was developed to verify and demonstrate autonomous guidance, navigation, and control algorithms for space proximity operations and formation flying maneuvers. The test facility consists of two complete spaceflight simulators, an aluminum-based operational arena, and a set of infrared motion tracking cameras; thus, the testbed is capable of representing space activities under circumstances prevailing on the ground. The spaceflight simulators have a maximum of five degrees of freedom in a quasi-momentum-free environment, which is produced by a set of linear/hemispherical air-bearings and a horizontally leveled operational arena. The tracking system measures the real-time three-dimensional position and attitude to provide state variables to the agents. The design of the testbed is illustrated in detail for every element throughout the paper. The practical hardware characteristics of the active/passive measurement units and internal actuators are identified in detail from various perspectives. These experimental results support the successful development of the entire facility and enable us to implement and verify the spacecraft proximity operation strategy in the near future.

  5. Design and Development of a 200-kW Turbo-Electric Distributed Propulsion Testbed

    Science.gov (United States)

    Papathakis, Kurt V.; Kloesel, Kurt J.; Lin, Yohan; Clarke, Sean; Ediger, Jacob J.; Ginn, Starr

    2016-01-01

    The National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center (AFRC) (Edwards, California) is developing the Hybrid-Electric Integrated Systems Testbed (HEIST) as part of the HEIST Project, to study power management and transition complexities, modular architectures, and flight control laws for turbo-electric distributed propulsion technologies using representative hardware and piloted simulations. Capabilities are being developed to assess the flight readiness of hybrid-electric and distributed electric vehicle architectures. Additionally, NASA will leverage experience gained and assets developed from HEIST to assist in flight-test proposal development, flight-test vehicle design, and evaluation of hybrid-electric and distributed electric concept vehicles for flight safety. The HEIST test equipment will include three trailers supporting a distributed electric propulsion wing, a battery system and turbogenerator, dynamometers, and supporting power and communication infrastructure, all connected to the AFRC Core simulation. Plans call for 18 high-performance electric motors that will be powered by batteries and the turbogenerator, and commanded by a piloted simulation. Flight control algorithms will be developed on the turbo-electric distributed propulsion system.

  6. The Living With a Star Space Environment Testbed Program

    Science.gov (United States)

    Barth, Janet; LaBel, Kenneth; Day, John H. (Technical Monitor)

    2001-01-01

    NASA has initiated the Living With a Star (LWS) Program to develop the scientific understanding needed to address the aspects of the connected Sun-Earth system that affect life and society. The program architecture includes science missions, theory and modeling, and Space Environment Testbeds (SET). This paper discusses the Space Environment Testbeds. The goal of the SET program is to improve the engineering approach to accommodate and/or mitigate the effects of solar variability on spacecraft design and operations. The SET program will infuse new technologies into space programs through the collection of data in space and the subsequent design and validation of technologies. Examples of these technologies are cited and discussed.

  7. Dr. Tulga Ersal at NSF Workshop Accessible Remote Testbeds ART'15

    Science.gov (United States)

    Event Archives Dr. Tulga Ersal at NSF Workshop Accessible Remote Testbeds ART'15. On November 12th, Dr. Tulga Ersal attended the NSF Workshop on Accessible Remote Testbeds (ART'15) at Georgia Tech. From the event website: the rationale behind the ART'15 workshop is that remote-access testbeds could, if done right, significantly change how

  8. Real-Time Emulation of Heterogeneous Wireless Networks with End-to-Edge Quality of Service Guarantees: The AROMA Testbed

    Directory of Open Access Journals (Sweden)

    Anna Umbert

    2010-01-01

    Full Text Available This work presents and describes the real-time testbed for all-IP Beyond 3G (B3G) heterogeneous wireless networks that has been developed in the framework of the European IST AROMA project. The main objective of the AROMA testbed is to provide a highly accurate and realistic framework where the performance of algorithms, policies, protocols, services, and applications for a complete heterogeneous wireless network can be fully assessed and evaluated before bringing them to a real system. The complexity of the interaction between all-IP B3G systems and user applications, while dealing with the Quality of Service (QoS) concept, motivates the development of this kind of emulation platform, where different solutions can be tested under realistic conditions that could not be achieved by means of simple offline simulations. This work provides an in-depth description of the AROMA testbed, emphasizing many interesting implementation details and lessons learned during the development of the tool that may prove helpful to other researchers and system engineers in the development of similar emulation platforms. Several case studies are also presented in order to illustrate the full potential and capabilities of the presented emulation platform.

  9. Towards Autonomous Operations of the Robonaut 2 Humanoid Robotic Testbed

    Science.gov (United States)

    Badger, Julia; Nguyen, Vienny; Mehling, Joshua; Hambuchen, Kimberly; Diftler, Myron; Luna, Ryan; Baker, William; Joyce, Charles

    2016-01-01

    The Robonaut project has been conducting research in robotics technology on board the International Space Station (ISS) since 2012. Recently, the original upper body humanoid robot was upgraded by the addition of two climbing manipulators ("legs"), more capable processors, and new sensors, as shown in Figure 1. While Robonaut 2 (R2) has been working through checkout exercises on orbit following the upgrade, technology development on the ground has continued to advance. Through the Active Reduced Gravity Offload System (ARGOS), the Robonaut team has been able to develop technologies that will enable full operation of the robotic testbed on orbit using similar robots located at the Johnson Space Center. Once these technologies have been vetted in this way, they will be implemented and tested on the R2 unit on board the ISS. The goal of this work is to create a fully-featured robotics research platform on board the ISS to increase the technology readiness level of technologies that will aid in future exploration missions. Technology development has thus far followed two main paths, autonomous climbing and efficient tool manipulation. Central to both technologies has been the incorporation of a human robotic interaction paradigm that involves the visualization of sensory and pre-planned command data with models of the robot and its environment. Figure 2 shows screenshots of these interactive tools, built in rviz, that are used to develop and implement these technologies on R2. Robonaut 2 is designed to move along the handrails and seat track around the US lab inside the ISS. This is difficult for many reasons, namely the environment is cluttered and constrained, the robot has many degrees of freedom (DOF) it can utilize for climbing, and remote commanding for precision tasks such as grasping handrails is time-consuming and difficult. Because of this, it is important to develop the technologies needed to allow the robot to reach operator-specified positions as

  10. ASE-BAN, a Wireless Body Area Network Testbed

    DEFF Research Database (Denmark)

    Madsen, Jens Kargaard; Karstoft, Henrik; Toftegaard, Thomas Skjødeberg

    2010-01-01

    /actuators attached to the body and a host server application. The gateway uses the BlackFin BF533 processor from Analog Devices, and uses Bluetooth for wireless communication. Two types of sensors are attached to the network: an electro-cardio-gram sensor and an oximeter sensor. The testbed has been successfully...

  11. Towards a Perpetual Sensor Network Testbed without Backchannel

    DEFF Research Database (Denmark)

    Johansen, Aslak; Bonnet, Philippe; Sørensen, Thomas

    2012-01-01

    The sensor network testbeds available today rely on a communication channel different from the mote radio - a backchannel - to facilitate mote reprogramming, health monitoring and performance analysis. Such backchannels are either supported as wired communication channels (USB or Ethernet), or vi...

  12. Torpedo and countermeasures modelling in the Torpedo Defence System Testbed

    NARCIS (Netherlands)

    Benders, F.P.A.; Witberg, R.R.; Grootendorst, H.J.

    2002-01-01

    Several years ago, TNO-FEL started the development of the Torpedo Defence System Testbed (TDSTB) based on the TORpedo SIMulation (TORSIM) model and the Maritime Operations Simulation and Evaluation System (MOSES). MOSES provides the simulation and modelling environment for the evaluation and

  13. Operation Duties on the F-15B Research Testbed

    Science.gov (United States)

    Truong, Samson S.

    2010-01-01

    This presentation entails what I have done this past summer for my Co-op tour in the Operations Engineering Branch. Activities included supporting the F-15B Research Testbed, supporting the incoming F-15D models, design work, and other operations engineering duties.

  14. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 3: A stochastic rain fade control algorithm for satellite link power via nonlinear Markov filtering theory

    Science.gov (United States)

    Manning, Robert M.

    1991-01-01

    The dynamic and composite nature of propagation impairments that are incurred on Earth-space communications links at frequencies in and above 30/20 GHz Ka band, i.e., rain attenuation, cloud and/or clear air scintillation, etc., combined with the need to counter such degradations after the small link margins have been exceeded, necessitate the use of dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) Project by the implementation of optimal processing schemes derived through the use of the Rain Attenuation Prediction Model and nonlinear Markov filtering theory.
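
    The ACTS control algorithm itself is not reproduced in this record; as a simplified stand-in for the statistical identification and prediction described above, the sketch below tracks a fade signal modeled as a first-order Gauss-Markov process with a scalar Kalman filter (all parameters are illustrative, not the paper's):

```python
import numpy as np

def track_rain_fade(z, dt=1.0, beta=0.05, sigma_x=2.0, sigma_v=0.5):
    """Scalar Kalman filter for a fade x_k (dB) modeled as a first-order
    Gauss-Markov process x_k = a x_{k-1} + w_k with a = exp(-beta*dt),
    observed as z_k = x_k + v_k (measurement noise std sigma_v)."""
    a = np.exp(-beta * dt)
    q = sigma_x ** 2 * (1.0 - a ** 2)  # keeps steady-state variance sigma_x^2
    r = sigma_v ** 2
    x, p = 0.0, sigma_x ** 2
    est = []
    for zk in z:
        x, p = a * x, a * a * p + q              # predict
        k = p / (p + r)                          # Kalman gain
        x, p = x + k * (zk - x), (1.0 - k) * p   # update
        est.append(x)
    return np.array(est)

# Demo on a synthetic Gauss-Markov fade trace (parameters illustrative).
rng = np.random.default_rng(1)
a = np.exp(-0.05)
q = 2.0 ** 2 * (1.0 - a ** 2)
true = np.zeros(3000)
for i in range(1, 3000):
    true[i] = a * true[i - 1] + np.sqrt(q) * rng.standard_normal()
noisy = true + 0.5 * rng.standard_normal(3000)
est = track_rain_fade(noisy)
```

    The same predict/update loop, with a richer state model, is the shape a link-power fade controller takes: the predicted fade level drives the uplink power adjustment before the margin is exceeded.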

  15. Development of a Testbed for Wireless Underground Sensor Networks

    Directory of Open Access Journals (Sweden)

    Mehmet C. Vuran

    2010-01-01

    Full Text Available Wireless Underground Sensor Networks (WUSNs) constitute one of the promising application areas of recently developed wireless sensor networking techniques. A WUSN is a specialized kind of Wireless Sensor Network (WSN) that mainly focuses on the use of sensors that communicate through soil. Recent models for the wireless underground communication channel have been proposed, but few field experiments have been realized to verify their accuracy. The realization of field WUSN experiments has proved to be extremely complex and time-consuming in comparison with the traditional wireless environment. To the best of our knowledge, this is the first work that proposes guidelines for the development of an outdoor WUSN testbed, with the goals of improving accuracy and reducing the time required for WUSN experiments. Although the work mainly targets WUSNs, many of the presented practices can also be applied to generic WSN testbeds.

  16. A MIMO-OFDM Testbed for Wireless Local Area Networks

    Directory of Open Access Journals (Sweden)

    Conrat Jean-Marc

    2006-01-01

    Full Text Available We describe the design steps and final implementation of a MIMO OFDM prototype platform developed to enhance the performance of wireless LAN standards such as HiperLAN/2 and 802.11, using multiple transmit and multiple receive antennas. We first describe the channel measurement campaign used to characterize the indoor operational propagation environment, and analyze the influence of the channel on code design through a ray-tracing channel simulator. We also comment on some antenna and RF issues which are of importance for the final realization of the testbed. Multiple coding, decoding, and channel estimation strategies are discussed and their respective performance-complexity trade-offs are evaluated over the realistic channel obtained from the propagation studies. Finally, we present the design methodology, including cross-validation of the Matlab, C++, and VHDL components, and the final demonstrator architecture. We highlight the increased measured performance of the MIMO testbed over the single-antenna system.

  17. A technical description of the FlexHouse Project Testbed

    DEFF Research Database (Denmark)

    Sørensen, Jens Otto

    2000-01-01

    This paper describes the FlexHouse project testbed; a server dedicated to experiments within the FlexHouse project. The FlexHouse project is a project originating from The Business Computing Research Group at The Aarhus School of Business. The purpose of the project is to identify and develop...... methods that satisfy the following three requirements. Flexibility with respect to evolving data sources. Flexibility with respect to change of information needs. Efficiency with respect to view management....

  18. Testbed for a LiFi system integrated in streetlights

    OpenAIRE

    Monzón Baeza, Victor; Sánchez Fernández, Matilde Pilar; García-Armada, Ana; Royo, A.

    2015-01-01

    Proceeding at: 2015 European Conference on Networks and Communications (EuCNC) took place June 29 - July 2 in Paris, France. In this paper, a functional LiFi real-time testbed implemented on FPGAs is presented. The setup evaluates the performance of our design in a downlink scenario where the transmitter is embedded on the streetlights and a mobile phone’s camera is used as receiver, therefore achieving the goal of lighting and communicating simultaneously. To validate the ...

  19. Development and experimentation of an eye/brain/task testbed

    Science.gov (United States)

    Harrington, Nora; Villarreal, James

    1987-01-01

    The principal objective is to develop a laboratory testbed that will provide a unique capability to elicit, control, record, and analyze the relationship of operator task loading, operator eye movement, and operator brain wave data in a computer system environment. The ramifications of an integrated eye/brain monitor to the man machine interface are staggering. The success of such a system would benefit users of space and defense, paraplegics, and the monitoring of boring screens (nuclear power plants, air defense, etc.)

  20. Seeking an optimal algorithm for a new satellite-based Sea Ice Drift Climate Data Record : Motivations, plans and initial results from the ESA CCI Sea Ice project

    DEFF Research Database (Denmark)

    Lavergne, T.; Dybkjær, Gorm; Girard-Ardhuin, Fanny

    The Sea Ice Essential Climate Variable (ECV) as defined by GCOS comprises sea ice concentration, thickness, and drift. Now in its second phase, the ESA CCI Sea Ice project is conducting the necessary research efforts to address sea ice drift. Accurate estimates of sea ice drift direction an... in the final product. This contribution reviews the motivation for the work, the plans for sea ice drift algorithm intercomparison and selection, and early results from our activity....

  1. Visible nulling coronagraphy testbed development for exoplanet detection

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Chen, Andrew; Petrone, Peter; Booth, Andrew; Madison, Timothy; Bolcar, Matthew; Noecker, M. Charley; Kendrick, Stephen; Melnick, Gary; Tolls, Volker

    2010-07-01

    Three of the recently completed NASA Astrophysics Strategic Mission Concept (ASMC) studies addressed the feasibility of using a Visible Nulling Coronagraph (VNC) as the prime instrument for exoplanet science. The VNC approach is one of the few that works with filled, segmented, and sparse or diluted-aperture telescope systems and thus spans the space of potential ASMC exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies and has developed an incremental sequence of VNC testbeds to advance this approach and its associated technologies. Herein we report on the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable, vibration-isolated testbed that operates under high-bandwidth closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones at sequentially higher contrasts of 10^8, 10^9 and 10^10 at an inner working angle of 2λ/D, ultimately culminating in spectrally broadband (>20%) high-contrast imaging. Each of the milestones, one per year, is traceable to one or more of the ASMC studies. The VNT uses a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle, and achromatic phase shifters. Discussed are the optical configuration, laboratory results, critical technologies, and the null sensing and control approach.
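
    A useful back-of-the-envelope relation behind such contrast milestones (a standard small-error approximation for a two-beam nuller, not taken from this record) is that the leakage, or null depth, scales as N ≈ (Δφ/2)², where Δφ is the RMS phase error between the two arms:

```python
import math

def null_depth_phase(sigma_phi):
    """Leakage (null depth) from a small RMS phase error sigma_phi
    (radians) in a two-beam nuller: N ~= (sigma_phi / 2)**2."""
    return (sigma_phi / 2.0) ** 2

def required_rms_opd(contrast, wavelength_nm=550.0):
    """RMS optical path difference (nm) tolerable for a target null
    depth, inverting N = ((2*pi/wavelength) * opd / 2)**2."""
    sigma_phi = 2.0 * math.sqrt(contrast)
    return sigma_phi * wavelength_nm / (2.0 * math.pi)
```

    For a 10^-10 null depth (a 10^10 contrast) at 550 nm this gives an RMS path stability of roughly 2 picometers, which is why the VNT emphasizes vibration isolation and high-bandwidth closed-loop control.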

  2. Easy as Pi: A Network Coding Raspberry Pi Testbed

    Directory of Open Access Journals (Sweden)

    Chres W. Sørensen

    2016-10-01

    Full Text Available In the near future, upcoming communications and storage networks are expected to tolerate major difficulties produced by the huge amounts of data generated by the Internet of Things (IoT). For these types of networks, strategies and mechanisms based on network coding have appeared as an alternative to overcome these difficulties in a holistic manner, e.g., without sacrificing the benefit of a given network metric when improving another. There have been recurrent issues in: (i) making large-scale deployments akin to the Internet of Things; (ii) assessing and (iii) replicating the results obtained in preliminary studies. Therefore, testbeds that can deal with large-scale deployments without losing historic data are greatly needed and desirable from a research perspective for evaluating these mechanisms. However, this can be hard to manage, not only due to the inherent costs of the hardware, but also due to maintenance challenges. In this paper, we present the key steps required to design, set up, and maintain an inexpensive testbed using Raspberry Pi devices for communications and storage networks with network coding capabilities. This testbed can be utilized for any application requiring replicable results.
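
    The record does not give the coding scheme used on the testbed; as a minimal, self-contained illustration of network coding, the sketch below implements random linear network coding over GF(2): coded packets are XORs of random subsets of the source packets, and a receiver decodes by Gauss-Jordan elimination once it has collected enough linearly independent combinations.

```python
import numpy as np

def rlnc_encode(packets, n_coded, rng):
    """Encode: each coded packet is a random GF(2) linear combination
    (i.e., the XOR of a random subset) of the k source packets."""
    k = len(packets)
    coeffs = rng.integers(0, 2, size=(n_coded, k))
    coded = (coeffs @ np.asarray(packets)) % 2
    return coeffs, coded

def rlnc_decode(coeffs, coded):
    """Decode by Gauss-Jordan elimination over GF(2); returns the k
    source packets, or None if the received rank is insufficient."""
    A = np.concatenate([coeffs, coded], axis=1).astype(np.int64) % 2
    k = coeffs.shape[1]
    row = 0
    for col in range(k):
        piv = next((r for r in range(row, len(A)) if A[r, col]), None)
        if piv is None:
            return None                      # rank-deficient: keep listening
        A[[row, piv]] = A[[piv, row]]        # swap pivot row into place
        for r in range(len(A)):
            if r != row and A[r, col]:
                A[r] ^= A[row]               # XOR-eliminate the column
        row += 1
    return A[:k, k:]

# Demo: 4 source packets of 8 bits each, batches of 10 coded transmissions.
rng = np.random.default_rng(3)
src = rng.integers(0, 2, size=(4, 8))
decoded = None
while decoded is None:                       # retransmit until decodable
    coeffs, coded = rlnc_encode(src, 10, rng)
    decoded = rlnc_decode(coeffs, coded)
```

    On constrained hardware like a Raspberry Pi, the appeal of the GF(2) variant is that both encoding and decoding reduce to XORs and row swaps.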

  3. Sensor System Performance Evaluation and Benefits from the NPOESS Airborne Sounder Testbed-Interferometer (NAST-I)

    Science.gov (United States)

    Larar, A.; Zhou, D.; Smith, W.

    2009-01-01

    Advanced satellite sensors are tasked with improving global-scale measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring, and environmental change detection. Validation of the entire measurement system is crucial to achieving this goal and thus maximizing research and operational utility of resultant data. Field campaigns employing satellite under-flights with well-calibrated FTS sensors aboard high-altitude aircraft are an essential part of this validation task. The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Airborne Sounder Testbed-Interferometer (NAST-I) has been a fundamental contributor in this area by providing coincident high spectral/spatial resolution observations of infrared spectral radiances along with independently-retrieved geophysical products for comparison with like products from satellite sensors being validated. This paper focuses on some of the challenges associated with validating advanced atmospheric sounders and the benefits obtained from employing airborne interferometers such as the NAST-I. Select results from underflights of the Aqua Atmospheric InfraRed Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI) obtained during recent field campaigns will be presented.

  4. An Intelligent Archive Testbed Incorporating Data Mining

    Science.gov (United States)

    Ramapriyan, H.; Isaac, D.; Yang, W.; Bonnlander, B.; Danks, D.

    2009-01-01

    Many significant advances have occurred during the last two decades in remote sensing instrumentation, computation, storage, and communication technology. A series of Earth observing satellites have been launched by U.S. and international agencies and have been operating and collecting global data on a regular basis. These advances have created a data-rich environment for scientific research and applications. NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) has been operational since August 1994 with support for pre-EOS data. Currently, EOSDIS supports all the EOS missions including Terra (1999), Aqua (2002), ICESat (2002) and Aura (2004). EOSDIS has been effectively capturing, processing and archiving several terabytes of standard data products each day. It has also been distributing these data products at a rate of several terabytes per day to a diverse and globally distributed user community (Ramapriyan et al. 2009). There are other NASA-sponsored data system activities including measurement-based systems such as the Ocean Data Processing System and the Precipitation Processing System, and several projects under the Research, Education and Applications Solutions Network (REASoN), Making Earth Science Data Records for Use in Research Environments (MEaSUREs), and the Advancing Collaborative Connections for Earth-Sun System Science (ACCESS) programs. Together, these activities provide a rich set of resources constituting a value chain for users to obtain data at various levels ranging from raw radiances to interdisciplinary model outputs. The result has been a significant leap in our understanding of the Earth systems that all humans depend on for their enjoyment, livelihood, and survival. The trend in the community today is towards many distributed sets of providers of data and services. Despite this, visions for the future include users being able to locate, fuse and utilize data with location transparency and a high degree of

  5. Autonomous Satellite Command and Control Through the World Wide Web. Phase 3

    Science.gov (United States)

    Cantwell, Brian; Twiggs, Robert

    1998-01-01

    The Automated Space System Experimental Testbed (ASSET) system is a simple yet comprehensive real-world operations network being developed. Phase 3 of the ASSET Project ran January-December 1997 and is the subject of this report. This phase permitted SSDL and its project partners to expand the ASSET system in a variety of ways. These added capabilities included the advancement of ground station capabilities, the adaptation of spacecraft on-board software, and the expansion of capabilities of the ASSET management algorithms. Specific goals of Phase 3 were: (1) Extend Web-based goal-level commanding for both the payload PI and the spacecraft engineer. (2) Support prioritized handling of multiple Principal Investigators (PIs) as well as associated payload experimenters. (3) Expand the number and types of experiments supported by the ASSET system and its associated spacecraft. (4) Implement more advanced resource management, modeling and fault management capabilities that integrate the space and ground segments of the space system hardware. (5) Implement a beacon monitoring test. (6) Implement an experimental blackboard controller for space system management. (7) Further define typical ground station developments required for Internet-based remote control and for full system automation of the PI-to-spacecraft link. Each of these goals is examined. Significant sections of this report were also published as a conference paper. Several publications produced in support of this grant are included as attachments. Titles include: 1) Experimental Initiatives in Space System Operations; 2) The ASSET Client Interface: Balancing High Level Specification with Low Level Control; 3) Specifying Spacecraft Operations At The Product/Service Level; 4) The Design of a Highly Configurable, Reusable Operating System for Testbed Satellites; 5) Automated Health Operations For The Sapphire Spacecraft; 6) Engineering Data Summaries for Space Missions; and 7) Experiments In Automated Health

  6. Forecasting Caspian Sea level changes using satellite altimetry data (June 1992-December 2013) based on evolutionary support vector regression algorithms and gene expression programming

    Science.gov (United States)

    Imani, Moslem; You, Rey-Jer; Kuo, Chung-Yen

    2014-10-01

    Sea level forecasting at various time intervals is of great importance in water supply management. Evolutionary artificial intelligence (AI) approaches have been accepted as appropriate tools for modeling complex nonlinear phenomena in water bodies. In this study, we investigated the ability of two AI techniques to forecast Caspian Sea level anomalies using satellite altimetry observations from June 1992 to December 2013: the support vector machine (SVM), which is mathematically well founded and provides new insights into function approximation, and gene expression programming (GEP). SVM demonstrates the best performance in predicting Caspian Sea level anomalies, given the minimum root mean square error (RMSE = 0.035) and maximum coefficient of determination (R² = 0.96) during the prediction periods. A comparison between the proposed AI approaches and a cascade correlation neural network (CCNN) model also shows the superiority of the GEP and SVM models over the CCNN.
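
    The study's SVM/GEP models and altimetry data are not reproduced here; as a rough illustration of the forecasting setup (lagged monthly anomalies predicting the next value), the sketch below uses kernel ridge regression, a close relative of SVR, on a synthetic seasonal series (all data and parameters are invented):

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(360)  # "monthly" samples; length illustrative only
series = (0.3 * np.sin(2 * np.pi * t / 12) + 0.001 * t
          + 0.02 * rng.standard_normal(360))

def make_lagged(y, n_lags=12):
    """Matrix of the previous n_lags values and the next-step targets."""
    X = np.column_stack([y[i:len(y) - n_lags + i] for i in range(n_lags)])
    return X, y[n_lags:]

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

X, y = make_lagged(series)
split = 300                                   # train/test split
K = rbf_kernel(X[:split], X[:split])
alpha = np.linalg.solve(K + 1e-3 * np.eye(split), y[:split])  # ridge fit
pred = rbf_kernel(X[split:], X[:split]) @ alpha
rmse = float(np.sqrt(np.mean((pred - y[split:]) ** 2)))
```

    The skill measure mirrors the paper's: out-of-sample RMSE of the lag-based predictor against the held-out anomalies.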

  7. Development of Liquid Propulsion Systems Testbed at MSFC

    Science.gov (United States)

    Alexander, Reginald; Nelson, Graham

    2016-01-01

    As NASA, the Department of Defense and the aerospace industry in general strive to develop capabilities to explore near-Earth, Cis-lunar and deep space, the need to create more cost effective techniques of propulsion system design, manufacturing and test is imperative in the current budget constrained environment. The physics of space exploration have not changed, but the manner in which systems are developed and certified needs to change if there is going to be any hope of designing and building the high performance liquid propulsion systems necessary to deliver crew and cargo to the further reaches of space. To further the objective of developing these systems, the Marshall Space Flight Center is currently in the process of formulating a Liquid Propulsion Systems testbed, which will enable rapid integration of components to be tested and assessed for performance in integrated systems. The manifestation of this testbed is a breadboard engine configuration (BBE) with facility support for consumables and/or other components as needed. The goal of the facility is to test NASA developed elements, but can be used to test articles developed by other government agencies, industry or academia. Joint government/private partnership is likely the approach that will be required to enable efficient propulsion system development. MSFC has recently tested its own additively manufactured liquid hydrogen pump, injector, and valves in a BBE hot firing. It is rapidly building toward testing the pump and a new CH4 injector in the BBE configuration to demonstrate a 22,000 lbf, pump-fed LO2/LCH4 engine for the Mars lander or in-space transportation. The value of having this BBE testbed is that as components are developed they may be easily integrated in the testbed and tested. MSFC is striving to enhance its liquid propulsion system development capability. Rapid design, analysis, build and test will be critical to fielding the next high thrust rocket engine. With the maturity of the

  8. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    OpenAIRE

    Jared A. Frank; Anthony Brill; Vikram Kapila

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their em...

  9. Fuzzy Information Retrieval Using Genetic Algorithms and Relevance Feedback.

    Science.gov (United States)

    Petry, Frederick E.; And Others

    1993-01-01

    Describes an approach that combines concepts from information retrieval, fuzzy set theory, and genetic programming to improve weighted Boolean query formulation via relevance feedback. Highlights include background on information retrieval systems; genetic algorithms; subproblem formulation; and preliminary results based on a testbed. (Contains 12…
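
    The paper's exact operators are not given in this record; as a hedged illustration of the general idea, the sketch below uses a genetic algorithm to evolve query term weights against relevance judgments (the toy corpus, GA operators, and parameters are invented, not those of the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus: 50 documents, 5 query terms; entries are term weights.
docs = rng.random((50, 5))
# "True" relevance: the top 10 documents under a hidden target weighting.
relevant = set(np.argsort(docs @ np.array([1.0, 0.8, 0.0, 0.0, 0.2]))[-10:])

def fitness(w, k=10):
    """Precision@k of the ranking induced by query weights w."""
    top = np.argsort(docs @ w)[-k:]
    return len(set(top) & relevant) / k

def evolve(pop_size=30, gens=40, mut=0.1):
    """Elitist GA: keep the best half, refill with one-point crossover
    plus Gaussian mutation, clipped to valid weights in [0, 1]."""
    pop = rng.random((pop_size, 5))
    for _ in range(gens):
        fit = np.array([fitness(w) for w in pop])
        parents = pop[np.argsort(fit)[::-1][:pop_size // 2]]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = int(rng.integers(1, 5))
            child = np.concatenate([a[:cut], b[cut:]])
            child += mut * rng.standard_normal(5)
            children.append(np.clip(child, 0.0, 1.0))
        pop = np.vstack([parents, children])
    fit = np.array([fitness(w) for w in pop])
    return pop[int(np.argmax(fit))], float(fit.max())

best_w, best_fit = evolve()
```

    In a relevance-feedback loop, the judged documents supply the `relevant` set and the evolved weights reformulate the next query.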

  10. Saturn satellites

    International Nuclear Information System (INIS)

    Ruskol, E.L.

    1981-01-01

    The characteristics of the Saturn satellites are discussed. The satellites close to Saturn - Janus, Mimas, Enceladus, Tethys, Dione and Rhea - move along circular orbits. High reflectivity is attributed to them, and the density of the satellites is 1 g/cm³. Titan is one of the biggest Saturn satellites. Titan has an atmosphere many times more powerful than that of Mars. The Titan atmosphere is a peculiar medium with a unique methane and hydrogen distribution in the whole Solar system. The outer satellites - Hyperion, Iapetus and Phoebe - are poorly investigated. Neither the density nor the composition of these satellites is known. The experimental data on the Saturn rings obtained by the ''Pioneer-11'' and ''Voyager-1'' spacecraft are presented [ru

  11. Accounting for the Effects of Surface BRDF on Satellite Cloud and Trace-Gas Retrievals: A New Approach Based on Geometry-Dependent Lambertian-Equivalent Reflectivity Applied to OMI Algorithms

    Science.gov (United States)

    Vasilkov, Alexander; Qin, Wenhan; Krotkov, Nickolay; Lamsal, Lok; Spurr, Robert; Haffner, David; Joiner, Joanna; Yang, Eun-Su; Marchenko, Sergey

    2017-01-01

    Most satellite nadir ultraviolet and visible cloud, aerosol, and trace-gas algorithms make use of climatological surface reflectivity databases. For example, cloud and NO2 retrievals for the Ozone Monitoring Instrument (OMI) use monthly gridded surface reflectivity climatologies that do not depend upon the observation geometry. In reality, reflection of incoming direct and diffuse solar light from land or ocean surfaces is sensitive to the sun-sensor geometry. This dependence is described by the bidirectional reflectance distribution function (BRDF). To account for the BRDF, we propose to use a new concept of geometry-dependent Lambertian equivalent reflectivity (LER). Implementation within the existing OMI cloud and NO2 retrieval infrastructure requires changes only to the input surface reflectivity database. The geometry-dependent LER is calculated using a vector radiative transfer model with high spatial resolution BRDF information from the Moderate Resolution Imaging Spectroradiometer (MODIS) over land and the Cox-Munk slope distribution over ocean with a contribution from water-leaving radiance. We compare the geometry-dependent and climatological LERs for two wavelengths, 354 and 466 nm, that are used in OMI cloud algorithms to derive cloud fractions. A detailed comparison of the cloud fractions and pressures derived with climatological and geometry-dependent LERs is carried out. Geometry-dependent LER and corresponding retrieved cloud products are then used as inputs to our OMI NO2 algorithm. We find that replacing the climatological OMI-based LERs with geometry-dependent LERs can increase NO2 vertical columns by up to 50% in highly polluted areas; the differences include both BRDF effects and biases between the MODIS and OMI-based surface reflectance data sets. Only minor changes to NO2 columns (within 5 %) are found over unpolluted and overcast areas.
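
    The effective cloud fraction referred to above is commonly derived with the mixed Lambertian-equivalent reflectivity (MLER) model, in which the measured scene reflectivity is treated as a linear mix of the clear-sky LER and a fixed cloud LER (0.8 in OMI-type algorithms); a minimal sketch:

```python
def effective_cloud_fraction(r_measured, r_clear, r_cloud=0.8):
    """Mixed-LER model: the measured scene reflectivity is a linear mix
    of the clear-sky (surface) LER and a fixed cloud LER, so the
    effective cloud fraction is the mixing weight, clipped to [0, 1]."""
    f = (r_measured - r_clear) / (r_cloud - r_clear)
    return min(max(f, 0.0), 1.0)
```

    This makes the sensitivity discussed in the record concrete: replacing a climatological `r_clear` with a geometry-dependent LER shifts the denominator's baseline and hence the retrieved cloud fraction, which then propagates into the NO2 air-mass factors.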

  12. Accounting for the effects of surface BRDF on satellite cloud and trace-gas retrievals: a new approach based on geometry-dependent Lambertian equivalent reflectivity applied to OMI algorithms

    Science.gov (United States)

    Vasilkov, Alexander; Qin, Wenhan; Krotkov, Nickolay; Lamsal, Lok; Spurr, Robert; Haffner, David; Joiner, Joanna; Yang, Eun-Su; Marchenko, Sergey

    2017-01-01

    Most satellite nadir ultraviolet and visible cloud, aerosol, and trace-gas algorithms make use of climatological surface reflectivity databases. For example, cloud and NO2 retrievals for the Ozone Monitoring Instrument (OMI) use monthly gridded surface reflectivity climatologies that do not depend upon the observation geometry. In reality, reflection of incoming direct and diffuse solar light from land or ocean surfaces is sensitive to the sun-sensor geometry. This dependence is described by the bidirectional reflectance distribution function (BRDF). To account for the BRDF, we propose to use a new concept of geometry-dependent Lambertian equivalent reflectivity (LER). Implementation within the existing OMI cloud and NO2 retrieval infrastructure requires changes only to the input surface reflectivity database. The geometry-dependent LER is calculated using a vector radiative transfer model with high spatial resolution BRDF information from the Moderate Resolution Imaging Spectroradiometer (MODIS) over land and the Cox-Munk slope distribution over ocean with a contribution from water-leaving radiance. We compare the geometry-dependent and climatological LERs for two wavelengths, 354 and 466 nm, that are used in OMI cloud algorithms to derive cloud fractions. A detailed comparison of the cloud fractions and pressures derived with climatological and geometry-dependent LERs is carried out. Geometry-dependent LER and corresponding retrieved cloud products are then used as inputs to our OMI NO2 algorithm. We find that replacing the climatological OMI-based LERs with geometry-dependent LERs can increase NO2 vertical columns by up to 50 % in highly polluted areas; the differences include both BRDF effects and biases between the MODIS and OMI-based surface reflectance data sets. Only minor changes to NO2 columns (within 5 %) are found over unpolluted and overcast areas.
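    The retrieval concept above can be illustrated with the standard Lambertian equivalent reflectivity inversion. Assuming the usual formulation in which TOA reflectance over a Lambertian surface of albedo A is R0 + T*A/(1 - Sb*A), the geometry-dependent LER is the albedo that reproduces a TOA reflectance simulated with a full BRDF surface for a given sun-sensor geometry. A minimal numerical sketch follows, with made-up coefficient values (a real retrieval computes them with a vector radiative transfer model):

```python
# Illustrative coefficients (assumed values, not from the paper):
# R0 = atmospheric path reflectance, T = two-way transmittance,
# Sb = spherical albedo of the atmosphere.
R0, T, Sb = 0.05, 0.80, 0.25

def toa_reflectance(albedo):
    # TOA reflectance over a Lambertian surface of the given albedo
    return R0 + T * albedo / (1.0 - Sb * albedo)

def invert_ler(r_toa):
    # Closed-form inversion of the equation above for the surface albedo
    d = r_toa - R0
    return d / (T + Sb * d)

# A "BRDF-simulated" TOA reflectance for one geometry maps back to the
# Lambertian albedo (the geometry-dependent LER) that reproduces it.
r_sim = toa_reflectance(0.12)
ler = invert_ler(r_sim)
print(round(ler, 6))  # -> 0.12
```

    Because the inversion is exact, the round trip recovers the input albedo; in the actual algorithm only the forward simulation uses the BRDF, so the recovered LER varies with the observation geometry.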

  13. Comment on 'The remote sensing of ocean primary productivity - Use of a new data compilation to test satellite algorithms' by William Balch et al

    Science.gov (United States)

    Platt, Trevor; Sathyendranath, Shubha

    1993-01-01

    Various conclusions by Balch et al. (1992) about the current state of modeling primary production in the sea (the lack of improvement in primary production models since 1957, the utility of analytical models, and the merits or weaknesses of complex models) are commented on. It is argued that, because they rest on a false premise, these conclusions are not robust, and that the approach used by Balch et al. (the model of Platt and Sathyendranath, 1988) was inadequate for the question they set out to address. The present criticism centers mainly on whether the implementation was correct with respect to parameter selection. It is concluded that the findings of Balch et al. with respect to the model of Platt and Sathyendranath are unreliable. Balch replies that satellite-derived estimates of primary production should be compared directly with those measured in situ in as many regions as possible. This will provide a first-order estimate of the magnitude of the error involved in estimating primary production from space.

  14. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - calibration Report for Phoenix Testbed : Final Report. [supporting datasets - Phoenix Testbed

    Science.gov (United States)

    2017-07-26

    The datasets in this zip file are in support of FHWA-JPO-16-379, Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Program...

  15. Improvement of remote monitoring on water quality in a subtropical reservoir by incorporating grammatical evolution with parallel genetic algorithms into satellite imagery.

    Science.gov (United States)

    Chen, Li; Tan, Chih-Hung; Kao, Shuh-Ji; Wang, Tai-Sheng

    2008-01-01

    Parallel GEGA was constructed by incorporating grammatical evolution (GE) into a parallel genetic algorithm (GA) to improve reservoir water quality monitoring based on remote sensing images. A cruise was conducted to ground-truth chlorophyll-a (Chl-a) concentrations longitudinally along the Feitsui Reservoir, the primary water supply for Taipei City in Taiwan. Empirical functions with multiple spectral parameters from the Landsat 7 Enhanced Thematic Mapper (ETM+) data were constructed. GE, an evolutionary automatic-programming system, automatically discovers complex nonlinear mathematical relationships between observed Chl-a concentrations and remotely sensed imagery. A GA was then used with GE to optimize the appropriate function type, with several parallel subpopulations processed to enhance search efficiency during the optimization. The parallel GEGA outperformed a traditional linear multiple regression (LMR) model, yielding lower estimation errors.
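    The GA optimization step described above can be sketched in miniature. The toy example below (an illustration under assumed forms, not the authors' code) evolves the coefficients of a simple band-ratio Chl-a model against synthetic ground truth, using truncation selection, averaging crossover, and Gaussian mutation:

```python
import random

random.seed(42)

def model(params, ratio):
    # candidate empirical function: Chl-a = a * ratio^b + c
    a, b, c = params
    return a * ratio ** b + c

# Synthetic "ground truth": Chl-a generated from known coefficients (2, 1.5, 0.5)
data = [(r / 10.0, 2.0 * (r / 10.0) ** 1.5 + 0.5) for r in range(5, 25)]

def fitness(params):
    # mean squared estimation error (lower is better)
    return sum((model(params, r) - chl) ** 2 for r, chl in data) / len(data)

def evolve(pop_size=60, generations=80):
    pop = [[random.uniform(0.0, 3.0) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            p1, p2 = random.sample(survivors, 2)
            # averaging crossover plus small Gaussian mutation
            children.append([(x + y) / 2 + random.gauss(0.0, 0.05)
                             for x, y in zip(p1, p2)])
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

    In the paper, several such subpopulations run in parallel and GE additionally evolves the functional form itself; here the form is fixed and only the coefficients evolve.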

  16. Description of Simulated Small Satellite Operation Data Sets

    Science.gov (United States)

    Kulkarni, Chetan S.; Guarneros Luna, Ali

    2018-01-01

    A set of two BP930 batteries (identified as PK31 and PK35) was operated continuously through a single cycle of a simulated satellite operation profile. The battery packs were charged to an initial voltage of around 8.35 V for 100% SOC before the experiment was started. This document explains the structure of the battery data sets. Please cite this paper when using this dataset: Z. Cameron, C. Kulkarni, A. Guarneros, K. Goebel, S. Poll, "A Battery Certification Testbed for Small Satellite Missions", IEEE AUTOTESTCON 2015, Nov 2-5, 2015, National Harbor, MD

  17. Automatic Integration Testbeds validation on Open Science Grid

    Science.gov (United States)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and in coverage of the services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports on performance and reliability.
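    The synthetic-job validation loop can be sketched as follows; the job types, failure probabilities, and function names here are invented for illustration, not the real PanDA/OSG toolset:

```python
import random
from collections import Counter

random.seed(7)

# Assumed job mix: job type -> illustrative failure probability.
JOB_TYPES = {
    "cpu-burn": 0.02,    # pure compute at the CE
    "stage-in": 0.10,    # transfer from the SE to the worker node
    "stage-out": 0.08,   # transfer of outputs back to the SE
}

def run_job(job_type):
    # Stand-in for a real pilot-based submission; fails with the
    # per-type probability above.
    return "ok" if random.random() > JOB_TYPES[job_type] else "failed"

def validate_site(n_jobs=300):
    # Inject synthetic jobs and archive per-type success/failure counts.
    results = Counter()
    for _ in range(n_jobs):
        job_type = random.choice(list(JOB_TYPES))
        results[(job_type, run_job(job_type))] += 1
    return results

report = validate_site()
success = sum(count for (jt, status), count in report.items() if status == "ok")
print(f"success rate: {success / 300:.2%}")
for key, count in sorted(report.items()):
    print(key, count)
```

    A real system would replace `run_job` with submission through a CE and feed the aggregated counts into the regular reliability reports described above.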

  18. Automatic Integration Testbeds validation on Open Science Grid

    International Nuclear Information System (INIS)

    Caballero, J; Potekhin, M; Thapa, S; Gardner, R

    2011-01-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and in coverage of the services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit 'VO-like' jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports on performance and reliability.

  19. Geostationary satellites collocation

    CERN Document Server

    Li, Hengnian

    2014-01-01

    Geostationary Satellites Collocation aims to find solutions for deploying safe and reliable collocation control. Focusing on orbital perturbation analysis, the mathematical foundations for the orbit and control of the geostationary satellite are summarized. The mathematical and physical principles of orbital maneuvers and collocation strategies for multiple geostationary satellites sharing the same dead band are also stressed. Moreover, the book presents some applications using the above algorithms and mathematical models to help readers master the corrective method for planning station-keeping maneuvers. Engineers and scientists in the fields of aerospace technology and space science can benefit from this book. Hengnian Li is the Deputy Director of the State Key Laboratory of Astronautic Dynamics, China.

  20. GEO light imaging national testbed (GLINT) heliostat design and testing status

    Science.gov (United States)

    Thornton, Marcia A.; Oldenettel, Jerry R.; Hult, Dane W.; Koski, Katrina; Depue, Tracy; Cuellar, Edward L.; Balfour, Jim; Roof, Morey; Yarger, Fred W.; Newlin, Greg; Ramzel, Lee; Buchanan, Peter; Mariam, Fesseha G.; Scotese, Lee

    2002-01-01

    The GEO Light Imaging National Testbed (GLINT) will use three laser beams producing simultaneous interference fringes to illuminate satellites in geosynchronous earth orbit (GEO). The reflected returns will be recorded using a large 4,000 m2 'light bucket' receiver. This imaging methodology is termed Fourier telescopy. A major component of the 'light bucket' will be an array of 40 - 80 heliostats. Each heliostat will have a mirrored surface area of 100 m2 mounted on a rigid truss structure supported by an A-frame. The truss structure attaches to the torque-tube elevation drive, and the A-frame structure rests on an azimuth ring that could provide nearly full coverage of the sky. The heliostat is designed to operate in 15 mph winds with jitter of less than 500 microradians peak-to-peak. One objective of the design was to minimize receiver cost to the maximum extent possible while maintaining GLINT system performance specifications. The mechanical structure weighs approximately seven tons and is a simple fabricated steel framework. A prototype heliostat has been assembled at Stallion Range Center, White Sands Missile Range, New Mexico and is being tested under a variety of weather and operational conditions. The preliminary results of that testing will be presented, as well as some finite element model analyses that were performed to predict the performance of the structure.

  1. Improving Flight Software Module Validation Efforts : a Modular, Extendable Testbed Software Framework

    Science.gov (United States)

    Lange, R. Connor

    2012-01-01

    Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities, and destinations, they all have something in common: all of their components had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time-consuming process. Thankfully, since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively, validate SMAP FSW modules more quickly, and test them more rigorously. As a result of the various benefits of automating much of the testing process, management is considering the use of these automated tools in future FSW validation efforts.
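    The log-distilling step described above can be sketched as a simple filter; the log format and tags below are hypothetical, not JPL's actual tooling:

```python
# Hypothetical raw testbed log; the format and tags are illustrative.
RAW_LOG = """\
2012-06-01 12:00:01 DEBUG heartbeat ok
2012-06-01 12:00:02 INFO  test TC-042 started
2012-06-01 12:00:03 ERROR telemetry gap detected
2012-06-01 12:00:04 INFO  test TC-042 PASSED
"""

def distill(log, keep=("ERROR", "PASSED", "FAILED")):
    # Keep only the lines a validator needs to check against requirements.
    return [line for line in log.splitlines()
            if any(tag in line for tag in keep)]

summary = distill(RAW_LOG)
print("\n".join(summary))
```

    A real pipeline would feed such a summary into the generated validation documentation rather than printing it.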

  2. Centriolar satellites

    DEFF Research Database (Denmark)

    Tollenaere, Maxim A X; Mailand, Niels; Bekker-Jensen, Simon

    2015-01-01

    Centriolar satellites are small, microscopically visible granules that cluster around centrosomes. These structures, which contain numerous proteins directly involved in centrosome maintenance, ciliogenesis, and neurogenesis, have traditionally been viewed as vehicles for protein trafficking...... highlight newly discovered regulatory mechanisms targeting centriolar satellites and their functional status, and we discuss how defects in centriolar satellite components are intimately linked to a wide spectrum of human diseases....

  3. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    Energy Technology Data Exchange (ETDEWEB)

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  4. The Living With a Star Space Environment Testbed Experiments

    Science.gov (United States)

    Xapsos, Michael A.

    2014-01-01

    The focus of the Living With a Star (LWS) Space Environment Testbed (SET) program is to improve the performance of hardware in the space radiation environment. The program has developed a payload for the Air Force Research Laboratory (AFRL) Demonstration and Science Experiments (DSX) spacecraft that is scheduled for launch in August 2015 on the SpaceX Falcon Heavy rocket. The primary structure of DSX is an Evolved Expendable Launch Vehicle (EELV) Secondary Payload Adapter (ESPA) ring. DSX will be in a Medium Earth Orbit (MEO). This oral presentation will describe the SET payload.

  5. SCaN Testbed Software Development and Lessons Learned

    Science.gov (United States)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    The National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR) / Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS architecture. SDRs are a new technology for NASA, and the support infrastructure they require differs from that of legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not previously an option. They are not stand-alone devices, but require a new approach to control them effectively and flow data, and extensive software had to be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration, and testing as related to the avionics processor system and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A further challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also includes ground software, which is a key element both for commanding the payload and for displaying data created by the payload.
The verification of

  6. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    Science.gov (United States)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault-tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug and play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently

  7. The Living With a Star Program Space Environment Testbed

    Science.gov (United States)

    Barth, Janet; Day, John H. (Technical Monitor)

    2001-01-01

    This viewgraph presentation describes the objective, approach, and scope of the Living With a Star (LWS) program at the Marshall Space Flight Center. Scientists involved in the project seek to refine the understanding of space weather and the role of solar variability in terrestrial climate change. Research and the development of improved analytic methods have led to increased predictive capabilities and the improvement of environment specification models. Specifically, the Space Environment Testbed (SET) project of LWS is responsible for the implementation of improved engineering approaches to observing solar effects on climate change. This responsibility includes technology development, ground test protocol development, and the development of a technology application model/engineering tool.

  8. Smart Grid: Network simulator for smart grid test-bed

    International Nuclear Information System (INIS)

    Lai, L C; Ong, H S; Che, Y X; Do, N Q; Ong, X J

    2013-01-01

    As smart grids become more popular, a smaller-scale smart grid test-bed has been set up at UNITEN to investigate performance and to identify future enhancements of the smart grid in Malaysia. The fundamental requirement in this project is to design a network with low delay, no packet drops, and a high data rate. Each type of traffic has its own characteristics and is suitable for a different type of network and requirement; however, the nature of traffic in a smart grid is not well understood. This paper presents a comparison between different types of traffic to find the most suitable traffic for optimal network performance.
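    The kind of traffic comparison the paper describes can be sketched with a minimal deterministic queue model; the traffic patterns and parameters below are illustrative assumptions, not the UNITEN test-bed configuration:

```python
# Deterministic single-queue sketch: per-tick arrivals, fixed service
# rate, bounded buffer; reports the two metrics named in the abstract,
# mean delay and packet drops.

def simulate(arrivals_per_tick, service_per_tick=1, buffer_size=50, ticks=10_000):
    queue, delays, drops = [], [], 0
    for t in range(ticks):
        for _ in range(arrivals_per_tick(t)):
            if len(queue) < buffer_size:
                queue.append(t)      # record arrival time
            else:
                drops += 1           # buffer full -> packet drop
        for _ in range(service_per_tick):
            if queue:
                delays.append(t - queue.pop(0))
    mean_delay = sum(delays) / len(delays) if delays else 0.0
    return mean_delay, drops

# Smooth metering traffic vs. bursty event traffic at the same mean load.
smooth = simulate(lambda t: 1)
bursty = simulate(lambda t: 10 if t % 10 == 0 else 0, buffer_size=8)
print("smooth (delay, drops):", smooth)   # -> (0.0, 0)
print("bursty (delay, drops):", bursty)   # -> (3.5, 2000)
```

    Even at identical average load, the bursty source sees higher mean delay and loses packets once each burst exceeds the buffer, the kind of traffic-dependent behaviour such a test-bed is meant to expose.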

  9. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs : Dallas testbed analysis plan.

    Science.gov (United States)

    2016-06-16

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (mo...

  10. Growth plan for an inspirational test-bed of smart textile services

    NARCIS (Netherlands)

    Wensveen, S.A.G.; Tomico, O.; Bhomer, ten M.; Kuusk, K.

    2015-01-01

    In this pictorial we visualize the growth plan for an inspirational test-bed of smart textile product service systems. The goal of the test-bed is to inspire and inform the Dutch creative industries of textile, interaction and service design to combine their strengths and share opportunities. The

  11. Development of a smart-antenna test-bed, demonstrating software defined digital beamforming

    NARCIS (Netherlands)

    Kluwer, T.; Slump, Cornelis H.; Schiphorst, Roelof; Hoeksema, F.W.

    2001-01-01

    This paper describes a smart-antenna test-bed consisting of 'commercial off-the-shelf' (COTS) hardware and software defined radio components. The use of software radio components enables a flexible platform to implement and test mobile communication systems as a real-world system. The test-bed is

  12. Context-aware local Intrusion Detection in SCADA systems : a testbed and two showcases

    NARCIS (Netherlands)

    Chromik, Justyna Joanna; Haverkort, Boudewijn R.H.M.; Remke, Anne Katharina Ingrid; Pilch, Carina; Brackmann, Pascal; Duhme, Christof; Everinghoff, Franziska; Giberlein, Artur; Teodorowicz, Thomas; Wieland, Julian

    2017-01-01

    This paper illustrates the use of a testbed that we have developed for context-aware local intrusion detection. This testbed is based on the co-simulation framework Mosaik and allows for the validation of local intrusion detection mechanisms at field stations in power distribution networks. For two

  13. Design of aircraft cabin testbed for stress free air travel experiment

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    The paper presents an aircraft cabin testbed that is designed and built for the stress-free air travel experiment. The project is funded by the European Union with the aim of improving air travel comfort during long-haul flights. The testbed is used to test and validate the adaptive system that is capable

  14. Satellite Servicing's Autonomous Rendezvous and Docking Testbed on the International Space Station

    Science.gov (United States)

    Naasz, Bo J.; Strube, Matthew; Van Eepoel, John; Barbee, Brent W.; Getzandanner, Kenneth M.

    2011-01-01

    The Space Servicing Capabilities Project (SSCP) at NASA's Goddard Space Flight Center (GSFC) has been tasked with developing systems for servicing space assets. Starting in 2009, the SSCP completed a study documenting potential customers and the business case for servicing, as well as defining several notional missions and required technologies. In 2010, SSCP moved to the implementation stage by completing several ground demonstrations and commencing development of two International Space Station (ISS) payloads, the Robotic Refueling Mission (RRM) and the Dextre Pointing Package (DPP), to mitigate new technology risks for a robotic mission to service existing assets in geosynchronous orbit. This paper introduces the DPP, scheduled to fly in July of 2012 on the third operational SpaceX Dragon mission, and its Autonomous Rendezvous and Docking (AR&D) instruments. The combination of sensors and advanced avionics provides valuable on-orbit demonstrations of essential technologies for servicing existing vehicles, both cooperative and non-cooperative.

  15. Distributed computing testbed for a remote experimental environment

    International Nuclear Information System (INIS)

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.

    1995-01-01

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a 'Collaboratory'. The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high-speed wide-area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls, and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large-scale experimental facility.

  16. An Overview of NASA's Subsonic Research Aircraft Testbed (SCRAT)

    Science.gov (United States)

    Baumann, Ethan; Hernandez, Joe; Ruhf, John C.

    2013-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research, more specifically to raise the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and to gather high-quality research data suitable for verifying the technologies and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport-class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments, along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown to gather flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.

  17. Development of optical packet and circuit integrated ring network testbed.

    Science.gov (United States)

    Furukawa, Hideaki; Harai, Hiroaki; Miyazawa, Takaya; Shinada, Satoshi; Kawasaki, Wataru; Wada, Naoya

    2011-12-12

    We developed novel integrated optical packet and circuit switch-node equipment. Compared with our previous equipment, a polarization-independent 4 × 4 semiconductor optical amplifier switch subsystem, gain-controlled optical amplifiers, and one 100 Gbps optical packet transponder and seven 10 Gbps optical path transponders with 10 Gigabit Ethernet (10GbE) client interfaces were newly installed in the present system. The switch and amplifiers can provide more stable operation, without equipment adjustments, under the frequent polarization rotations and dynamic packet-rate changes of optical packets. We constructed an optical packet and circuit integrated ring network testbed consisting of two switch nodes for accelerating network development, and we demonstrated 66 km fiber transmission and switching operation of multiplexed 14-wavelength 10 Gbps optical paths and 100 Gbps optical packets encapsulating 10GbE frames. Error-free operation (in terms of frame error rate) was achieved for optical packets of various packet lengths and packet rates, and stable operation of the network testbed was confirmed. In addition, 4K uncompressed video streaming over OPS links was successfully demonstrated. © 2011 Optical Society of America

  18. Satellite Communications

    Indian Academy of Sciences (India)

    Satellite Communications. Arthur C Clarke wrote a seminal paper in 1945 in Wireless World, proposing the use of three satellites in geo-synchronous orbit to enable intercontinental communications. He estimated that the system could be realised in '50 to 100 years'.

  19. Encryption protection for communication satellites

    Science.gov (United States)

    Sood, D. R.; Hoernig, O. W., Jr.

    In connection with the growing importance of commercial communication satellite systems and the introduction of new technological developments, users and operators of these systems have become increasingly concerned with aspects of security. The user community is concerned with maintaining the confidentiality and integrity of the information being transmitted over the satellite links, while the satellite operators are concerned about the safety of their assets in space. In response to these concerns, commercial satellite operators are now taking steps to protect both the communication information and the satellites: communication information is protected by end-to-end encryption of the customer communication traffic. Attention is given to the selection of the NBS DES algorithm, the command protection systems, and the communication protection systems.
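    For illustration of the class of cipher involved: DES is a 16-round Feistel cipher, and the round-trip property that makes such end-to-end encryption practical can be shown with a toy Feistel network. This sketch is not DES and provides no real security; the round function and keys are invented:

```python
# Toy Feistel network on 32-bit blocks (two 16-bit halves). Decryption is
# simply encryption with the round keys reversed, as in DES.

def f(half, key):
    # stand-in round function mixing a 16-bit half-block with a round key
    return ((half * 31 + key) ^ (half >> 3)) & 0xFFFF

def feistel(block, keys):
    left, right = block >> 16, block & 0xFFFF
    for k in keys:
        left, right = right, left ^ f(right, k)
    return (right << 16) | left      # final swap, as in DES

def encrypt(block, keys):
    return feistel(block, keys)

def decrypt(block, keys):
    return feistel(block, list(reversed(keys)))

keys = [0x1A2B, 0x3C4D, 0x5E6F, 0x7081]
ct = encrypt(0xDEADBEEF, keys)
print(hex(ct), hex(decrypt(ct, keys)))   # round trip recovers 0xdeadbeef
```

    The Feistel structure guarantees invertibility regardless of the round function, which is why the same hardware can encrypt and decrypt the traffic stream.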

  20. Satellite Communications

    CERN Document Server

    Pelton, Joseph N

    2012-01-01

    The field of satellite communications represents the world's largest space industry. Those who are interested in space need to understand the fundamentals of satellite communications, its technology, operation, business, economic, and regulatory aspects. This book explains all this along with key insights into the field's future growth trends and current strategic challenges. Fundamentals of Satellite Communications is a concise book that gives all of the key facts and figures as well as a strategic view of where this dynamic industry is going. Author Joseph N. Pelton, PhD, former Dean of the International Space University and former Director of Strategic Policy at Intelsat, presents a r

  1. Cargo container inspection test program at ARPA's Nonintrusive Inspection Technology Testbed

    Science.gov (United States)

    Volberding, Roy W.; Khan, Siraj M.

    1994-10-01

    An x-ray-based cargo inspection system test program is being conducted at the Advanced Research Projects Agency (ARPA)-sponsored Nonintrusive Inspection Technology Testbed (NITT) located in the Port of Tacoma, Washington. The test program seeks to determine the performance that can be expected from a dual, high-energy x-ray cargo inspection system when inspecting ISO cargo containers. This paper describes an intensive, three-month system test involving two independent test groups, one representing the criminal smuggling element and the other representing the law enforcement community. The first group, the 'Red Team', prepares ISO containers for inspection at an off-site facility. An algorithm randomly selects and indicates the positions and preparation of cargoes within a container. The prepared container is dispatched to the NITT for inspection by the 'Blue Team'. After in-gate processing, it is queued for examination. The Blue Team inspects the container and decides whether or not to pass it. The shipment undergoes out-gate processing and returns to the Red Team. The results of the inspection are recorded for subsequent analysis. The test process, including its governing protocol, the cargoes, container preparation, the examination, and the results available at the time of submission are presented.

  2. Space Station technology testbed: 2010 deep space transport

    Science.gov (United States)

    Holt, Alan C.

    1993-01-01

    A space station in a crew-tended or permanently crewed configuration will provide major R&D opportunities for innovative technology and materials development and advanced space systems testing. A space station should be designed with the basic infrastructure elements required to grow into a major systems technology testbed. This space-based technology testbed can and should be used to support the development of technologies required to expand our utilization of near-Earth space, the Moon, and the Earth-to-Jupiter region of the Solar System. Space station support of advanced technology and materials development will result in new techniques for high-priority scientific research and the knowledge and R&D base needed for the development of major new commercial product thrusts. To illustrate the technology testbed potential of a space station and to point the way to a bold, innovative approach to advanced space systems development, a hypothetical deep space transport development and test plan is described. Key deep space transport R&D activities are described that would lead to the readiness certification of an advanced, reusable interplanetary transport capable of supporting eight crewmembers or more. With the support of a focused and highly motivated, multi-agency ground R&D program, a deep space transport of this type could be assembled and tested by 2010. Key R&D activities on a space station would include: (1) experimental research investigating the microgravity-assisted restructuring of micro-engineered materials (to develop and verify the in-space and in-situ 'tuning' of materials for use in debris and radiation shielding and other protective systems), (2) exposure of micro-engineered materials to the space environment for passive and operational performance tests (to develop in-situ maintenance and repair techniques and to support the development, enhancement, and implementation of protective systems, data and bio-processing systems, and virtual reality and

  3. Satellite myths

    Science.gov (United States)

    Easton, Roger L.; Hall, David

    2008-01-01

    Richard Corfield's article “Sputnik's legacy” (October 2007 pp23-27) states that the satellite on board the US Vanguard rocket, which exploded during launch on 6 December 1957 two months after Sputnik's successful take-off, was “a hastily put together contraption of wires and circuitry designed only to send a radio signal back to Earth”. In fact, the Vanguard satellite was developed over a period of several years and put together carefully using the best techniques and equipment available at the time - such as transistors from Bell Laboratories/Western Electric. The satellite contained not one but two transmitters, in which the crystal-controlled oscillators had been designed to measure both the temperature of the satellite shell and of the internal package.

  4. Satellite Geomagnetism

    DEFF Research Database (Denmark)

    Olsen, Nils; Stolle, Claudia

    2012-01-01

    Observations of Earth’s magnetic field from space began more than 50 years ago. A continuous monitoring of the field using low Earth orbit (LEO) satellites, however, started only in 1999, and three satellites have taken high-precision measurements of the geomagnetic field during the past decade. The unprecedented time-space coverage of their data opened revolutionary new possibilities for monitoring, understanding, and exploring Earth’s magnetic field. In the near future, the three-satellite constellation Swarm will ensure continuity of such measurement and provide enhanced possibilities to improve our ability to characterize and understand the many sources that contribute to Earth’s magnetic field. In this review, we summarize investigations of Earth’s interior and environment that have been possible through the analysis of high-precision magnetic field observations taken by LEO satellites.

  5. Assimilation of Remotely Sensed Leaf Area Index into the Community Land Model with Explicit Carbon and Nitrogen Components using Data Assimilation Research Testbed

    Science.gov (United States)

    Ling, X.; Fu, C.; Yang, Z. L.; Guo, W.

    2017-12-01

    Information on the spatial and temporal patterns of leaf area index (LAI) is crucial to understanding the exchanges of momentum, carbon, energy, and water between the terrestrial ecosystem and the atmosphere, yet both in-situ observation and model simulation usually show distinct deficiencies in LAI coverage and value. Land data assimilation, which combines observation and simulation, is a promising way to provide variable estimation. The Data Assimilation Research Testbed (DART), developed and maintained by the National Center for Atmospheric Research (NCAR), provides a powerful tool to facilitate the combination of assimilation algorithms, models, and real (as well as synthetic) observations toward a better understanding of all three. Here we systematically investigated the effects of data assimilation on improving LAI simulation, based on the NCAR Community Land Model with the prognostic carbon-nitrogen option (CLM4CN) linked to DART using the deterministic Ensemble Adjustment Kalman Filter (EAKF). Random 40-member atmospheric forcing was used to drive the CLM4CN with or without LAI assimilation. The Global LAnd Surface Satellite (GLASS) LAI product is assimilated into the CLM4CN at a frequency of 8 days, and LAI (as well as leaf carbon and nitrogen) is adjusted at each time step. The results show that assimilating remotely sensed LAI into the CLM4CN is an effective method for improving model performance. In detail, the CLM4CN systematically overestimates global LAI, especially at low latitudes, with a largest bias of 5 m2/m2. When LAI, leaf carbon, and leaf nitrogen are updated simultaneously during assimilation, the analyzed LAI is corrected, especially in low-latitude regions, with the bias kept within about ±1 m2/m2. The analyzed LAI also represents the seasonal variation well, except for the Southern Temperate zone (23°S-90°S). The most clearly improved regions are located in the center of Africa, the Amazon, the south of Eurasia, the northeast of
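    The scalar EAKF update at the core of such an assimilation can be sketched as follows. This is a minimal illustration, not DART's implementation; the function names, the toy ensemble, and the regression step onto leaf carbon are our own constructions.

```python
import numpy as np

def eakf_update(prior, obs, obs_var):
    """Deterministic EAKF update for a scalar observed variable (e.g. LAI):
    shift and contract the prior ensemble so its mean and variance match
    the Gaussian posterior, without adding random perturbations."""
    pm, pv = prior.mean(), prior.var(ddof=1)
    post_var = 1.0 / (1.0 / pv + 1.0 / obs_var)
    post_mean = post_var * (pm / pv + obs / obs_var)
    return post_mean + np.sqrt(post_var / pv) * (prior - pm)

def regress_increment(obs_prior, obs_post, aux_prior):
    """Carry the observed-variable increments over to an unobserved state
    variable (e.g. leaf carbon) by linear regression on the prior ensemble."""
    slope = np.cov(obs_prior, aux_prior, ddof=1)[0, 1] / obs_prior.var(ddof=1)
    return aux_prior + slope * (obs_post - obs_prior)
```

    Because increments are regressed through ensemble covariances, assimilating LAI alone can adjust leaf carbon and nitrogen consistently, which is the mechanism the study exploits.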

  6. Testbed for High-Acuity Imaging and Stable Photometry

    Science.gov (United States)

    Gregory, James

    This proposal from MIT Lincoln Laboratory (LL) accompanies the NASA/APRA proposal entitled THAI-SPICE: Testbed for High-Acuity Imaging - Stable Photometry and Image-Motion Compensation Experiment (submitted by Eliot Young, Southwest Research Institute). The goal of the THAI-SPICE project is to demonstrate three technologies that will help low-cost balloon-borne telescopes achieve diffraction-limited imaging: stable pointing, passive thermal stabilization and in-flight monitoring of the wave front error. This MIT LL proposal supplies a key element of the pointing stabilization component of THAI-SPICE: an electronic camera based on an orthogonal-transfer charge-coupled device (OTCCD). OTCCD cameras have been demonstrated with charge-transfer efficiencies >0.99999, noise of 90%. In addition to supplying a camera with an OTCCD detector, MIT LL will help with integration and testing of the OTCCD with the THAI-SPICE payload’s guide camera.

  7. Designing, Implementing and Documenting the Atlas Networking Test-bed.

    CERN Document Server

    Martinsen, Hans Åge

    The A Toroidal LHC ApparatuS (Atlas) experiment at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN), Geneva, is a production environment. To develop new architectures, test new equipment and evaluate new technologies, a well supported test bench is needed. A new one is now being commissioned, and I will take a leading role in its development, commissioning and operation. This thesis will cover the requirements, the implementation, the documentation and the approach to the different challenges in implementing the testbed. I will be joining the project in the early stages and start by following the work that my colleagues are doing; then, as I get a better understanding, more responsibility will be given to me. To be able to suggest and implement solutions I will have to understand what the requirements are and how to achieve them with the given resources.

  8. Segmented Aperture Interferometric Nulling Testbed (SAINT) II: component systems update

    Science.gov (United States)

    Hicks, Brian A.; Bolcar, Matthew R.; Helmbrecht, Michael A.; Petrone, Peter; Burke, Elliot; Corsetti, James; Dillon, Thomas; Lea, Andrew; Pellicori, Samuel; Sheets, Teresa; Shiri, Ron; Agolli, Jack; DeVries, John; Eberhardt, Andrew; McCabe, Tyler

    2017-09-01

    This work presents updates to the coronagraph and telescope components of the Segmented Aperture Interferometric Nulling Testbed (SAINT). The project pairs an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC) towards demonstrating capabilities for the future space observatories needed to directly detect and characterize a significant sample of Earth-sized worlds around nearby stars in the quest for identifying those which may be habitable and possibly harbor life. Efforts to improve the VNC wavefront control optics and mechanisms towards repeating narrowband results are described. A narrative is provided for the design of new optical components aimed at enabling broadband performance. Initial work with the hardware and software interface for controlling the segmented telescope mirror is also presented.

  9. Telescience testbed: Operational support functions for biomedical experiments

    Science.gov (United States)

    Yamashita, Masamichi; Watanabe, Satoru; Shoji, Takatoshi; Clarke, Andrew H.; Suzuki, Hiroyuki; Yanagihara, Dai

    A telescience testbed exercise was conducted to study the methodology of space biomedicine with simulated constraints imposed on space experiments. The experimental subject selected for this testbedding was an elaborate surgery on animals with electrophysiological measurements conducted by an operator onboard. The standing potential in the ampulla of the pigeon's semicircular canal was measured during gravitational and caloric stimulation. A principal investigator, isolated from the operation site, participated in the experiment interactively via telecommunication links. Reliability analysis was applied to all layers of experimentation, including the design of experimental objectives and operational procedures. Engineering and technological aspects of telescience are discussed in terms of reliability to assure the quality of science. The feasibility of robotics was examined for supportive functions to reduce the workload of the onboard operator.

  10. Simulation to Flight Test for a UAV Controls Testbed

    Science.gov (United States)

    Motter, Mark A.; Logan, Michael J.; French, Michael L.; Guerreiro, Nelson M.

    2006-01-01

    The NASA Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis, Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights, including a fully autonomous demonstration at the Association of Unmanned Vehicle Systems International (AUVSI) UAV Demo 2005. Simulations based on wind tunnel data are being used to further develop advanced controllers for implementation and flight test.

  11. Tower-Based Greenhouse Gas Measurement Network Design---The National Institute of Standards and Technology North East Corridor Testbed.

    Science.gov (United States)

    Lopez-Coto, Israel; Ghosh, Subhomoy; Prasad, Kuldeep; Whetstone, James

    2017-09-01

    The North-East Corridor (NEC) Testbed project is the third of three NIST (National Institute of Standards and Technology) greenhouse gas emissions testbeds designed to advance greenhouse gas measurement capabilities. A design approach for a dense observing network combined with atmospheric inversion methodologies is described. The Advanced Research Weather Research and Forecasting Model with the Stochastic Time-Inverted Lagrangian Transport model were used to derive the sensitivity of hypothetical observations to surface greenhouse gas emissions (footprints). Unlike other network design algorithms, an iterative selection algorithm, based on a k-means clustering method, was applied to minimize the similarities between the temporal responses of the sites and maximize sensitivity to the urban emissions contribution. Once a network was selected, a synthetic-inversion Bayesian Kalman filter was used to evaluate observing system performance. We present the performance of various measurement network configurations consisting of differing numbers of towers and tower locations. Results show that an overly compact network has decreased spatial coverage, as the spatial information added per site is then suboptimal for covering the largest possible area, whilst networks dispersed too broadly lose the capability to constrain flux uncertainties. In addition, we explore the possibility of using a very high density network of lower-cost, lower-performance sensors characterized by larger uncertainties and temporal drift. Analysis convergence is faster with a large number of observing locations, reducing the response time of the filter. Larger uncertainties in the observations imply lower values of uncertainty reduction. On the other hand, the drift is a bias in nature, which is added to the observations and therefore biases the retrieved fluxes.
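    The clustering idea behind the site selection can be sketched as follows: cluster the candidate towers' footprint time series with k-means and keep the real site nearest each centroid, so that selected sites have maximally dissimilar temporal responses. This is our own simplified construction under stated assumptions (a site × time footprint matrix, cosine-style normalization), not NIST's exact iterative algorithm.

```python
import numpy as np

def select_sites(footprints, n_sites, n_iter=100, seed=0):
    """Pick n_sites rows of `footprints` (site x time) whose time series are
    maximally dissimilar, via plain Lloyd k-means on normalized series."""
    rng = np.random.default_rng(seed)
    X = footprints / (np.linalg.norm(footprints, axis=1, keepdims=True) + 1e-12)
    centers = X[rng.choice(len(X), n_sites, replace=False)].copy()
    for _ in range(n_iter):
        # assign each candidate site to its nearest centroid
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        for k in range(n_sites):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    # per cluster, choose the real site closest to the centroid
    chosen = []
    for k in range(n_sites):
        members = np.where(labels == k)[0]
        if members.size:
            best = members[((X[members] - centers[k]) ** 2).sum(-1).argmin()]
            chosen.append(int(best))
    return sorted(chosen)
```

    The selected network would then be scored, as in the study, by running a synthetic inversion and measuring the posterior flux-uncertainty reduction.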

  12. Boomerang Satellites

    Science.gov (United States)

    Hesselbrock, Andrew; Minton, David A.

    2017-10-01

    We recently reported that the orbital architecture of the Martian environment allows for material in orbit around the planet to "cycle" between orbiting the planet as a ring, or as coherent satellites. Here we generalize our previous analysis to examine several factors that determine whether satellites accreting at the edge of planetary rings will cycle. In order for the orbiting material to cycle, tidal evolution must decrease the semi-major axis of any accreting satellites. In some systems, the density of the ring/satellite material, the surface mass density of the ring, the tidal parameters of the system, and the rotation rate of the primary body contribute to a competition between resonant ring torques and tidal dissipation that prevent this from occurring, either permanently or temporarily. Analyzing these criteria, we examine various bodies in our solar system (such as Saturn, Uranus, and Eris) to identify systems where cycling may occur. We find that a ring-satellite cycle may give rise to the current Uranian ring-satellite system, and suggest that Miranda may have formed from an early, more massive Uranian ring.
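    The direction of the tidal migration referenced above follows from the sign of semi-major-axis evolution; in standard constant-Q tidal theory (a textbook expression quoted for context, not taken from this abstract):

```latex
\frac{da}{dt} \;=\; \operatorname{sgn}\!\left(\Omega_p - n\right)\,
\frac{3 k_2}{Q}\,\frac{m}{M}\left(\frac{R}{a}\right)^{5} n\,a ,
```

    where n is the satellite's mean motion, Ω_p the primary's spin rate, k2/Q the primary's tidal Love number and quality factor, m the satellite mass, and M and R the primary's mass and radius. A satellite accreting at the ring edge but inside the synchronous orbit has n > Ω_p, so da/dt < 0 and it migrates back toward the ring, which is the cycling condition the abstract describes.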

  13. High Precision Testbed to Evaluate Ethernet Performance for In-Car Networks

    DEFF Research Database (Denmark)

    Revsbech, Kasper; Madsen, Tatiana Kozlova; Schiøler, Henrik

    2012-01-01

    Validating safety-critical real-time systems such as in-car networks often involves a model-based performance analysis of the network. An important issue in performing such analysis is to provide precise model parameters, matching the actual equipment. One way to obtain such parameters is to derive them by measurements of the equipment. In this work we describe the design of a testbed enabling active measurements on up to 1 Gb/s copper-based Ethernet switches. Using the testbed itself, we conduct a series of tests in which the precision of the testbed is estimated. We find a maximum error...

  14. Security Concepts for Satellite Links

    Science.gov (United States)

    Tobehn, C.; Penné, B.; Rathje, R.; Weigl, A.; Gorecki, Ch.; Michalik, H.

    2008-08-01

    The high costs to develop, launch and maintain a satellite network make protecting the assets imperative. Attacks may be passive, such as eavesdropping on the payload data. A more serious threat is active attacks that try to gain control of the satellite, which may lead to the total loss of the satellite asset. To counter these threats, new satellite and ground systems are using cryptographic technologies to provide a range of services: confidentiality, entity and message authentication, and data integrity. Additionally, key management cryptographic services are required to support these services. This paper describes the key points of current satellite control and operations, which are authentication of access to the satellite TM/TC link and encryption of security-relevant TM/TC data. For payload data management the key points are multi-user ground station access and high data rates, both requiring frequent updates and uploads of keys with the corresponding key management methods. For secure satellite management, authentication and key negotiation algorithms such as HMAC-RIPEMD160, EC-DSA and EC-DH are used. Encryption of data uses algorithms such as IDEA, AES, Triple-DES, or others. A channel coding and encryption unit for payload data provides download data rates up to Nx250 Mbps. The presented concepts are based on our experience and heritage of the security systems for all German MOD satellite projects (SATCOMBw2, the SAR-Lupe multi-satellite system and German-French SAR-Lupe-Helios-II system interoperability) as well as further international (KOMPSAT-II payload data link system) and ESA activities (TMTC security and GMES).
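    The telecommand-authentication service described above can be sketched with a keyed MAC. This is illustrative only: HMAC-SHA256 stands in for the HMAC-RIPEMD160 scheme named in the paper (RIPEMD-160 is often absent from modern crypto builds), and the framing, key handling, and function names are simplified assumptions of ours.

```python
import hashlib
import hmac

TAG_LEN = 32  # bytes, SHA-256 digest length

def authenticate_tc(frame: bytes, key: bytes) -> bytes:
    """Append a message authentication code to a telecommand frame."""
    return frame + hmac.new(key, frame, hashlib.sha256).digest()

def verify_tc(tagged: bytes, key: bytes) -> bytes:
    """Return the frame if its MAC verifies; reject it otherwise.
    compare_digest gives a constant-time comparison."""
    frame, tag = tagged[:-TAG_LEN], tagged[-TAG_LEN:]
    expected = hmac.new(key, frame, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("TC authentication failed")
    return frame
```

    A real system would additionally carry sequence counters inside the authenticated frame to defeat replay, and encrypt the payload separately.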

  15. Modeling soil organic matter (SOM) from satellite data using VISNIR-SWIR spectroscopy and PLS regression with step-down variable selection algorithm: case study of Campos Amazonicos National Park savanna enclave, Brazil

    Science.gov (United States)

    Rosero-Vlasova, O.; Borini Alves, D.; Vlassova, L.; Perez-Cabello, F.; Montorio Lloveria, R.

    2017-10-01

    Deforestation in the Amazon basin, due among other factors to frequent wildfires, demands continuous post-fire monitoring of soil and vegetation. The study thus posed two objectives: (1) evaluate the capacity of Visible - Near-InfraRed - ShortWave InfraRed (VIS-NIR-SWIR) spectroscopy to estimate soil organic matter (SOM) in fire-affected soils, and (2) assess the feasibility of SOM mapping from satellite images. For this purpose, 30 soil samples (surface layer) were collected in 2016 in areas of grass and riparian vegetation of Campos Amazonicos National Park, Brazil, repeatedly affected by wildfires. Standard laboratory procedures were applied to determine SOM. Reflectance spectra of the soils were obtained under controlled laboratory conditions using a FieldSpec4 spectroradiometer (spectral range 350-2500 nm). Measured spectra were resampled to simulate reflectances for Landsat-8, Sentinel-2, and EnMAP spectral bands, used as predictors in SOM models developed using Partial Least Squares regression with a step-down variable selection algorithm (PLSR-SD). The best fit was achieved with models based on reflectances simulated for EnMAP bands (R2=0.93; R2cv=0.82 and NMSE=0.07; NMSEcv=0.19). The model uses only 8 of 244 predictors (bands) chosen by the step-down variable selection algorithm. The least reliable estimates (R2=0.55 and R2cv=0.40; NMSE=0.43 and NMSEcv=0.60) resulted from the Landsat model, while the Sentinel-2 model showed R2=0.68 and R2cv=0.63; NMSE=0.31 and NMSEcv=0.38. The results confirm the high potential of VIS-NIR-SWIR spectroscopy for SOM estimation. Application of step-down selection produces sparser and better-fitting models. Finally, SOM can be estimated with acceptable accuracy (NMSE 0.35) from EnMAP and Sentinel-2 data, enabling mapping and analysis of the impacts of repeated wildfires on soils in the study area.
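    The PLSR-with-step-down idea can be sketched as backward elimination around a compact PLS1 fit: refit, drop the band with the smallest absolute coefficient, repeat. This is our own minimal sketch (NIPALS PLS1, a fixed number of retained bands); the study's PLSR-SD presumably uses cross-validated stopping criteria and proper preprocessing.

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """PLS1 regression via NIPALS (compact sketch, not library-grade).
    Returns coefficients plus the centering terms."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)
        t = Xc @ w
        tt = t @ t
        p = Xc.T @ t / tt
        qa = (yc @ t) / tt
        Xc = Xc - np.outer(t, p)   # deflate X
        yc = yc - qa * t           # deflate y
        W.append(w); P.append(p); q.append(qa)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)
    return B, X.mean(0), y.mean()

def pls1_predict(X, B, x_mean, y_mean):
    return (X - x_mean) @ B + y_mean

def step_down(X, y, n_comp=3, n_keep=8):
    """Backward ('step-down') selection: repeatedly refit and drop the
    band with the smallest absolute PLS coefficient."""
    keep = list(range(X.shape[1]))
    while len(keep) > n_keep:
        B, _, _ = pls1_fit(X[:, keep], y, min(n_comp, len(keep)))
        keep.pop(int(np.argmin(np.abs(B))))
    return keep
```

    On synthetic data where only a few bands carry signal, the elimination reliably retains the informative bands, mirroring the sparse 8-of-244 EnMAP model reported above.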

  16. Aerosol optical properties derived from the DRAGON-NE Asia campaign, and implications for a single-channel algorithm to retrieve aerosol optical depth in spring from Meteorological Imager (MI on-board the Communication, Ocean, and Meteorological Satellite (COMS

    Directory of Open Access Journals (Sweden)

    M. Kim

    2016-02-01

    An aerosol model optimized for northeast Asia is updated with the inversion data from the Distributed Regional Aerosol Gridded Observation Networks (DRAGON)-northeast (NE) Asia campaign, which was conducted during spring from March to May 2012. This updated aerosol model was then applied to a single visible channel algorithm to retrieve aerosol optical depth (AOD) from a Meteorological Imager (MI) on-board the geostationary meteorological satellite, Communication, Ocean, and Meteorological Satellite (COMS). This model plays an important role in retrieving accurate AOD from a single visible channel measurement. For the single-channel retrieval, sensitivity tests showed that perturbations by 4 % (0.926 ± 0.04) in the assumed single scattering albedo (SSA) can result in a retrieval error in AOD of over 20 %. Since the measured reflectance at the top of the atmosphere depends on both AOD and SSA, an overestimation of assumed SSA in the aerosol model leads to an underestimation of AOD. Based on the AErosol RObotic NETwork (AERONET) inversion data sets obtained over East Asia before 2011, seasonally analyzed aerosol optical properties (AOPs) were categorized by SSAs at 675 nm of 0.92 ± 0.035 for spring (March, April, and May). After the DRAGON-NE Asia campaign in 2012, the SSA during spring showed a slight increase to 0.93 ± 0.035. In terms of the volume size distribution, the mode radius of coarse particles was increased from 2.08 ± 0.40 to 2.14 ± 0.40. While the original aerosol model consists of volume size distribution and refractive indices obtained before 2011, the new model is constructed by using a total data set after the DRAGON-NE Asia campaign. The large volume of data in high spatial resolution from this intensive campaign can be used to improve the representative aerosol model for East Asia. Accordingly, the new AOD data sets retrieved from a single-channel algorithm, which uses a precalculated look-up table (LUT) with the new aerosol model

  17. Aerosol Optical Properties Derived from the DRAGON-NE Asia Campaign, and Implications for a Single-Channel Algorithm to Retrieve Aerosol Optical Depth in Spring from Meteorological Imager (MI) On-Board the Communication, Ocean, and Meteorological Satellite (COMS)

    Science.gov (United States)

    Kim, M.; Kim, J.; Jeong, U.; Kim, W.; Hong, H.; Holben, B.; Eck, T. F.; Lim, J.; Song, C.; Lee, S.; hide

    2016-01-01

    An aerosol model optimized for northeast Asia is updated with the inversion data from the Distributed Regional Aerosol Gridded Observation Networks (DRAGON)-northeast (NE) Asia campaign which was conducted during spring from March to May 2012. This updated aerosol model was then applied to a single visible channel algorithm to retrieve aerosol optical depth (AOD) from a Meteorological Imager (MI) on-board the geostationary meteorological satellite, Communication, Ocean, and Meteorological Satellite (COMS). This model plays an important role in retrieving accurate AOD from a single visible channel measurement. For the single-channel retrieval, sensitivity tests showed that perturbations by 4 % (0.926 +/- 0.04) in the assumed single scattering albedo (SSA) can result in the retrieval error in AOD by over 20 %. Since the measured reflectance at the top of the atmosphere depends on both AOD and SSA, the overestimation of assumed SSA in the aerosol model leads to an underestimation of AOD. Based on the AErosol RObotic NETwork (AERONET) inversion data sets obtained over East Asia before 2011, seasonally analyzed aerosol optical properties (AOPs) were categorized by SSAs at 675 nm of 0.92 +/- 0.035 for spring (March, April, and May). After the DRAGON-NE Asia campaign in 2012, the SSA during spring showed a slight increase to 0.93 +/- 0.035. In terms of the volume size distribution, the mode radius of coarse particles was increased from 2.08 +/- 0.40 to 2.14 +/- 0.40. While the original aerosol model consists of volume size distribution and refractive indices obtained before 2011, the new model is constructed by using a total data set after the DRAGON-NE Asia campaign. The large volume of data in high spatial resolution from this intensive campaign can be used to improve the representative aerosol model for East Asia. 
Accordingly, the new AOD data sets retrieved from a single-channel algorithm, which uses a precalculated look-up table (LUT) with the new aerosol model, show
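    The single-channel LUT inversion described above reduces, per pixel, to interpolating the observed top-of-atmosphere reflectance against a precomputed reflectance-vs-AOD curve for the assumed aerosol model. The sketch below uses a toy LUT with illustrative numbers (not the COMS/MI tables); it also shows the SSA sensitivity the abstract reports: a brighter assumed model (overestimated SSA) yields a lower retrieved AOD.

```python
import numpy as np

def retrieve_aod(rho_toa, lut_aod, lut_rho):
    """Invert a single-channel TOA reflectance to AOD using a precomputed
    look-up table; reflectance is assumed monotonic in AOD over the grid,
    so np.interp performs the inversion directly."""
    return float(np.interp(rho_toa, lut_rho, lut_aod))

# Toy LUT: path reflectance brightens and saturates with AOD
# (illustrative functional form and coefficients, not a radiative-transfer run).
lut_aod = np.linspace(0.0, 2.0, 21)
lut_rho = 0.05 + 0.12 * (1.0 - np.exp(-lut_aod))
```

    In the operational algorithm the LUT would additionally be indexed by geometry and surface reflectance; swapping in the post-DRAGON aerosol model simply replaces the lut_rho curve.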

  18. STAR Algorithm Integration Team - Facilitating operational algorithm development

    Science.gov (United States)

    Mikles, V. J.

    2015-12-01

    The NOAA/NESDIS Center for Satellite Research and Applications (STAR) provides technical support of the Joint Polar Satellite System (JPSS) algorithm development and integration tasks. Utilizing data from the S-NPP satellite, JPSS generates over thirty Environmental Data Records (EDRs) and Intermediate Products (IPs) spanning atmospheric, ocean, cryosphere, and land weather disciplines. The Algorithm Integration Team (AIT) brings technical expertise and support to product algorithms, specifically in testing and validating science algorithms in a pre-operational environment. The AIT verifies that new and updated algorithms function in the development environment, enforces established software development standards, and ensures that delivered packages are functional and complete. AIT facilitates the development of new JPSS-1 algorithms by implementing a review approach based on the Enterprise Product Lifecycle (EPL) process. Building on relationships established during the S-NPP algorithm development process and coordinating directly with science algorithm developers, the AIT has implemented structured reviews with self-contained document suites. The process has supported algorithm improvements for products such as ozone, active fire, vegetation index, and temperature and moisture profiles.

  19. A Testbed For Validating the LHC Controls System Core Before Deployment

    CERN Document Server

    Nguyen Xuan, J

    2011-01-01

    Since the start-up of the LHC, it is crucial to carefully test core controls components before deploying them operationally. The Testbed of the CERN accelerator controls group was developed for this purpose. It contains different hardware (PPC, i386) running various operating systems (Linux and LynxOS) and core software components running on front-ends, communication middleware and client libraries. The Testbed first executes integration tests to verify that the components delivered by individual teams interoperate, and then system tests, which verify high-level, end-user functionality. It also verifies that different versions of components are compatible, which is vital, because not all parts of the operational LHC control system can be upgraded simultaneously. In addition, the Testbed can be used for performance and stress tests. Internally, the Testbed is driven by Atlassian Bamboo, a Continuous Integration server, which builds and deploys automatically new software versions into the Test...

  20. Construction of test-bed system of voltage management system to ...

    African Journals Online (AJOL)

    Construction of test-bed system of voltage management system to apply physical power system. ... Journal of Fundamental and Applied Sciences ... system of voltage management system (VMS) in order to apply physical power system.

  1. Cooperative Search with Autonomous Vehicles in a 3D Aquatic Testbed

    Science.gov (United States)

    2012-01-01

    Matthew Keeter, Daniel Moore, Ryan Muller, Eric Nieters, Jennifer... Many applications for autonomous vehicles involve three-dimensional domains, notably aerial and aquatic environments. Such applications include mon...

  2. Closing the contrast gap between testbed and model prediction with WFIRST-CGI shaped pupil coronagraph

    Science.gov (United States)

    Zhou, Hanying; Nemati, Bijan; Krist, John; Cady, Eric; Prada, Camilo M.; Kern, Brian; Poberezhskiy, Ilya

    2016-07-01

    JPL has recently passed an important milestone in its technology development for a proposed NASA WFIRST mission coronagraph: demonstration of better than 1×10⁻⁸ contrast over broad bandwidth (10%) on both shaped pupil coronagraph (SPC) and hybrid Lyot coronagraph (HLC) testbeds with the WFIRST obscuration pattern. Challenges remain, however, in the technology readiness for the proposed mission. One is the discrepancy between the achieved contrasts on the testbeds and their corresponding model predictions. A series of testbed diagnoses and modeling activities were planned and carried out on the SPC testbed in order to close the gap. A very useful tool we developed was a derived "measured" testbed wavefront control Jacobian matrix that could be compared with the model-predicted "control" version used to generate the high-contrast dark hole region in the image plane. The difference between these two is an estimate of the error in the control Jacobian. When the control matrix, which includes both amplitude and phase, was modified to reproduce the error, the simulated performance closely matched the SPC testbed behavior in both contrast floor and contrast convergence speed. This is a step closer toward model validation for high-contrast coronagraphs. Further Jacobian analysis and modeling provided clues to the possible sources of the mismatch: deformable mirror (DM) misregistration, testbed optical wavefront error (WFE), and the DM setting for correcting this WFE. These analyses suggested that a high-contrast coronagraph has a tight tolerance on the accuracy of its control Jacobian. Modifications to both the testbed control model and the prediction model are being implemented, and future work is discussed.
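    A "measured" Jacobian of the kind described above can be sketched by central differences: poke each deformable-mirror actuator up and down and difference the resulting focal-plane fields. This is a generic finite-difference sketch of ours, not JPL's procedure; `propagate` stands in for a testbed measurement (via field estimation) or an optical model.

```python
import numpy as np

def measured_jacobian(propagate, dm0, poke=1e-9):
    """Estimate the wavefront-control Jacobian column by column:
    column j = d(field)/d(actuator j), from plus/minus pokes about dm0."""
    cols = []
    for j in range(dm0.size):
        up, dn = dm0.copy(), dm0.copy()
        up[j] += poke
        dn[j] -= poke
        cols.append((propagate(up) - propagate(dn)) / (2.0 * poke))
    return np.column_stack(cols)
```

    Comparing this estimate with the model-predicted control Jacobian column by column localizes errors to particular actuators, which is how effects like DM misregistration show up.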

  3. PEER Testbed Study on a Laboratory Building: Exercising Seismic Performance Assessment

    OpenAIRE

    Comerio, Mary C.; Stallmeyer, John C.; Smith, Ryan; Makris, Nicos; Konstantinidis, Dimitrios; Mosalam, Khalid; Lee, Tae-Hyung; Beck, James L.; Porter, Keith A.; Shaikhutdinov, Rustem; Hutchinson, Tara; Chaudhuri, Samit Ray; Chang, Stephanie E.; Falit-Baiamonte, Anthony; Holmes, William T.

    2005-01-01

    From 2002 to 2004 (years five and six of a ten-year funding cycle), the PEER Center organized the majority of its research around six testbeds. Two buildings and two bridges, a campus, and a transportation network were selected as case studies to “exercise” the PEER performance-based earthquake engineering methodology. All projects involved interdisciplinary teams of researchers, each producing data to be used by other colleagues in their research. The testbeds demonstrat...

  4. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    Science.gov (United States)

    Frank, Jared A.; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464

  5. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    Directory of Open Access Journals (Sweden)

    Jared A. Frank

    2016-08-01

    Full Text Available Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.

  6. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds.

    Science.gov (United States)

    Frank, Jared A; Brill, Anthony; Kapila, Vikram

    2016-08-20

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.
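The feedback loops such smartphone platforms must close can be sketched with a minimal simulation; the first-order motor model and PI gains below are illustrative choices, not values from the paper.

```python
# Discrete PI speed control of a first-order DC-motor model, the kind of
# loop a smartphone-based platform would close from its sensor readings.
dt = 0.01          # control period [s]
tau, K = 0.5, 2.0  # motor time constant [s] and gain [rad/s per volt]

kp, ki = 2.0, 4.0  # PI gains (illustrative)
setpoint = 10.0    # desired speed [rad/s]

omega, integ = 0.0, 0.0
for _ in range(2000):              # 20 s of simulated time
    err = setpoint - omega
    integ += err * dt
    u = kp * err + ki * integ      # control voltage
    # first-order plant: tau * domega/dt = -omega + K*u  (Euler step)
    omega += dt * (-omega + K * u) / tau

print(f"final speed: {omega:.2f} rad/s")
```

With these gains the continuous closed loop has poles at -2 and -8 rad/s, so the speed settles on the setpoint well within the simulated 20 s; the integral term removes the steady-state error.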

  7. First results of the Test-Bed Telescopes (TBT) project: Cebreros telescope commissioning

    Science.gov (United States)

    Ocaña, Francisco; Ibarra, Aitor; Racero, Elena; Montero, Ángel; Doubek, Jirí; Ruiz, Vicente

    2016-07-01

The TBT project is being developed under ESA's General Studies and Technology Programme (GSTP), and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario within the Space Situational Awareness (SSA) programme of the European Space Agency (ESA). The goal of the project is to provide two fully robotic telescopes, which will serve as prototypes for the development of a future network. The system consists of two telescopes, one in Spain and the second in the Southern Hemisphere. The telescope is a fast astrograph with a large Field of View (FoV) of 2.5 x 2.5 square degrees and a plate scale of 2.2 arcsec/pixel. The tube is mounted on a fast direct-drive mount moving at speeds of up to 20 degrees per second. The focal plane hosts a 2-port 4K x 4K back-illuminated CCD with readout speeds of up to 1 MHz per port. All these characteristics ensure good survey performance for transients and fast-moving objects. Detection software and hardware are optimised for the detection of NEOs and objects in high Earth orbits (objects moving at 0.1-40 arcsec/second). Nominal exposures are in the range of 2 to 30 seconds, depending on the observational strategy. Part of the validation scenario involves the scheduling concept integrated into the robotic operations for both sensors. Every night it takes all the input needed and prepares a schedule, following predefined rules, that allocates tasks for the telescopes. The telescopes are managed by the RTS2 control software, which performs real-time scheduling of the observations and manages all the devices at the observatory. At the end of the night the observing systems report astrometric positions and photometry of the objects detected. The first telescope was installed at the Cebreros Satellite Tracking Station in mid-2015. It is currently in the commissioning phase and we present here the first results of the telescope. We evaluate the site characteristics and the performance of the TBT Cebreros

  8. Microgrid testbeds around the world: State of art

    International Nuclear Information System (INIS)

    Hossain, Eklas; Kabalci, Ersan; Bayindir, Ramazan; Perez, Ronald

    2014-01-01

Highlights: • A detailed discussion of microgrid projects around the world, including North America, Europe, and Japan. • Key benefits of microgrids, issues with on-site generation, and features. • Why distributed generation systems are needed, with a brief introduction. • Distributed generation technologies with cost analysis. • An overview of existing distribution networks. - Abstract: This paper deals with the recent evolution of microgrids being used around the world in real-life applications as well as in laboratory applications for research. This study introduces the subject by reviewing the component level, structure, and types of microgrid applications installed as plants or modeled as simulation environments. The paper also presents a survey of published papers on why the microgrid is required, and on the components and control systems that constitute actual microgrid studies. It leads researchers to see the microgrid within the bigger picture of today and creates a new outlook on potential developments. Additionally, comparison of microgrids in various regions based on several parameters allows researchers to define the required criteria and features of a particular microgrid chosen for a given scenario. The authors of this paper also tabulated all the necessary information about microgrids, and proposed a standard microgrid for better power quality and optimized energy generation. Finally, the paper focuses on inadequate knowledge and technology gaps in the power system field with regard to the future, and illustrates these for the reader. The existing microgrid testbeds around the world have been studied and analyzed, and several of them are explained as examples in this study. The investigated distribution systems are then classified by region (North America, Europe and Asia) and, as presented in the literature, a significant amount of deviation has been found

  9. Satellite Control Laboratory

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Bak, Thomas

    2001-01-01

The Satellite Laboratory at the Department of Control Engineering of Aalborg University (SatLab) is a dynamic motion facility designed for analysis and test of micro spacecraft. A unique feature of the laboratory is that it provides a completely gravity-free environment. A test spacecraft...... of the laboratory is to conduct dynamic tests of the control and attitude determination algorithms during nominal operation and in abnormal conditions. Further, it is intended to use SatLab for validation of various algorithms for fault detection, accommodation and supervisory control. Different mission objectives...... can be implemented in the laboratory, e.g. three-axis attitude control, slew manoeuvres, spin stabilization using magnetic actuation and/or reaction wheels. The spacecraft attitude can be determined by applying magnetometer measurements...
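Magnetic actuation of the kind mentioned above is commonly implemented with the classic B-dot detumbling law, m = -k dB/dt; the sketch below is a generic illustration with invented gain and field values, not SatLab's actual controller.

```python
# Toy B-dot detumbling law: command a magnetic dipole opposing the rate of
# change of the body-frame magnetic field, m = -k * dB/dt. A tumbling
# spacecraft sees a rapidly rotating field vector, and the resulting torque
# m x B extracts rotational energy. Gain and field values are illustrative.
k = 5e4          # control gain (hypothetical)
dt = 0.1         # magnetometer sample period [s]

def bdot_command(b_prev, b_curr):
    """Dipole moment command [A*m^2] from two consecutive field samples [T]."""
    return [-k * (c - p) / dt for p, c in zip(b_prev, b_curr)]

# Two consecutive body-frame field samples of a tumbling satellite.
b1 = [2.0e-5, 1.0e-5, -3.0e-5]
b2 = [1.8e-5, 1.3e-5, -2.9e-5]
m = bdot_command(b1, b2)
print("dipole command:", m)
```

Each component of the command opposes the corresponding component of dB/dt, which is what slowly damps the tumble rate.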

  10. Satellite Radio

    Indian Academy of Sciences (India)

Satellites have been a highly effective platform for multi-form broadcasts. This has led to a ... diversity of formats, languages, genres, and a universal reach that cannot be met by .... programs can be delivered to whom they are intended. In the case of.

  11. Algorithming the Algorithm

    DEFF Research Database (Denmark)

    Mahnke, Martina; Uprichard, Emma

    2014-01-01

    Imagine sailing across the ocean. The sun is shining, vastness all around you. And suddenly [BOOM] you’ve hit an invisible wall. Welcome to the Truman Show! Ever since Eli Pariser published his thoughts on a potential filter bubble, this movie scenario seems to have become reality, just with slight...... changes: it’s not the ocean, it’s the internet we’re talking about, and it’s not a TV show producer, but algorithms that constitute a sort of invisible wall. Building on this assumption, most research is trying to ‘tame the algorithmic tiger’. While this is a valuable and often inspiring approach, we...

  12. Digital Preservation Theory and Application: Transcontinental Persistent Archives Testbed Activity

    Directory of Open Access Journals (Sweden)

    Paul Watry

    2007-12-01

Full Text Available The National Archives and Records Administration (NARA) and EU SHAMAN projects are working with multiple research institutions on tools and technologies that will supply a comprehensive, systematic, and dynamic means for preserving virtually any type of electronic record, free from dependence on any specific hardware or software. This paper describes the joint development work between the University of Liverpool and the San Diego Supercomputer Center (SDSC) at the University of California, San Diego on the NARA and SHAMAN prototypes. The aim is to provide technologies in support of the required generic data management infrastructure. We describe a Theory of Preservation that quantifies how communication can be accomplished when future technologies are different from those available at present. This includes not only different hardware and software, but also different standards for encoding information. We describe the concept of a "digital ontology" to characterize preservation processes; this is an advance on the current OAIS Reference Model approach of providing representation information about records. To realize a comprehensive Theory of Preservation, we describe the ongoing integration of distributed shared-collection management technologies, digital library browsing, and presentation technologies for the NARA and SHAMAN Persistent Archive Testbeds.

  13. NN-SITE: A remote monitoring testbed facility

    International Nuclear Information System (INIS)

    Kadner, S.; White, R.; Roman, W.; Sheely, K.; Puckett, J.; Ystesund, K.

    1997-01-01

    DOE, Aquila Technologies, LANL and SNL recently launched collaborative efforts to create a Non-Proliferation Network Systems Integration and Test (NN-Site, pronounced N-Site) facility. NN-Site will focus on wide area, local area, and local operating level network connectivity including Internet access. This facility will provide thorough and cost-effective integration, testing and development of information connectivity among diverse operating systems and network topologies prior to full-scale deployment. In concentrating on instrument interconnectivity, tamper indication, and data collection and review, NN-Site will facilitate efforts of equipment providers and system integrators in deploying systems that will meet nuclear non-proliferation and safeguards objectives. The following will discuss the objectives of ongoing remote monitoring efforts, as well as the prevalent policy concerns. An in-depth discussion of the Non-Proliferation Network Systems Integration and Test facility (NN-Site) will illuminate the role that this testbed facility can perform in meeting the objectives of remote monitoring efforts, and its potential contribution in promoting eventual acceptance of remote monitoring systems in facilities worldwide

  14. Digital pathology: DICOM-conform draft, testbed, and first results.

    Science.gov (United States)

    Zwönitzer, Ralf; Kalinski, Thomas; Hofmann, Harald; Roessner, Albert; Bernarding, Johannes

    2007-09-01

Hospital information systems are state of the art nowadays. Therefore, Digital Pathology, also labelled Virtual Microscopy, has gained increased attention. Triggered by radiology, standardized information models and workflows have been defined worldwide based on DICOM. However, DICOM-conform integration of Digital Pathology into existing clinical information systems imposes new problems requiring specific solutions, concerning both the huge amount of data and the special structure of the data to be managed, transferred, and stored. We implemented a testbed to realize and evaluate the workflow of digitized slides from acquisition to archiving. The experience led to the draft of a DICOM-conform information model that accounts for the extensions, definitions, and technical requirements necessary to integrate Digital Pathology into a hospital-wide DICOM environment. Slides were digitized, compressed, and could be viewed remotely. Real-time transfer of the huge amount of data was optimized using streaming techniques. Following a recent discussion in the DICOM Working Group for Digital Pathology (WG26), our experience led to a preference for JPEG2000/JPIP-based streaming of the whole-slide image. The results showed that Digital Pathology is feasible, but strong efforts by users and vendors are still necessary to integrate it into existing information systems.
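One reason tiled, multi-resolution streaming (as in JPEG2000/JPIP) makes whole-slide images manageable is that only the tiles covering the current viewport at the requested zoom level are transferred. A sketch with illustrative slide dimensions (not from the paper):

```python
import math

# A hypothetical whole-slide image and tile size.
W, H, TILE = 80_000, 60_000, 256

# Pyramid depth: halve both dimensions per level until one tile remains.
levels = int(math.ceil(math.log2(max(W, H) / TILE))) + 1
print("pyramid levels:", levels)

def tiles_for_viewport(x, y, w, h, level):
    """Number of tiles covering a viewport at a given pyramid level
    (level 0 = full resolution; each level halves both dimensions)."""
    s = 2 ** level
    tx0, ty0 = (x // s) // TILE, (y // s) // TILE
    tx1 = ((x + w - 1) // s) // TILE
    ty1 = ((y + h - 1) // s) // TILE
    return (tx1 - tx0 + 1) * (ty1 - ty0 + 1)

# A 1024x1024 viewport at full resolution needs only a handful of tiles,
# not the tens of thousands of tiles making up the whole level.
n = tiles_for_viewport(40_000, 30_000, 1024, 1024, 0)
total = math.ceil(W / TILE) * math.ceil(H / TILE)
print(f"tiles sent: {n} of {total}")
```

The server therefore never has to ship the full multi-gigapixel image; the viewer requests tiles on demand as the pathologist pans and zooms.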

  15. Extrasolar Planetary Imaging Coronagraph (EPIC): visible nulling cornagraph testbed results

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Melnick, Gary; Tolls, Volker; Woodruff, Robert; Vasudevan, Gopal

    2008-07-01

The Extrasolar Planetary Imaging Coronagraph (EPIC) is a NASA Astrophysics Strategic Mission Concept under study for the upcoming Exoplanet Probe. EPIC's mission would be to image and characterize extrasolar giant planets, and potential super-Earths, in orbits with semi-major axes between 2 and 10 AU. EPIC will provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys and potentially some transits, determine orbital inclinations and masses, characterize the atmospheres of gas giants around A and F stars, and observe the inner spatial structure and colors of Spitzer-selected debris disks. EPIC would be launched into a heliocentric Earth-trailing drift-away orbit, with a 3-year mission lifetime (5-year goal), and will revisit planets at least three times. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables high-order starlight suppression in broadband light. To demonstrate the VNC approach and advance its technology readiness, the NASA Goddard Space Flight Center and Lockheed Martin have developed a laboratory VNC and have demonstrated white-light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  16. Extrasolar Planetary Imaging Coronagraph: Visible Nulling Coronagraph Testbed Results

    Science.gov (United States)

    Lyon, Richard G.

    2008-01-01

The Extrasolar Planetary Imaging Coronagraph (EPIC) is a proposed NASA Discovery mission to image and characterize extrasolar giant planets in orbits with semi-major axes between 2 and 10 AU. EPIC will provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses, characterize the atmospheres of gas giants around A and F stars, and observe the inner spatial structure and colors of Spitzer-selected debris disks. EPIC would be launched into a heliocentric Earth-trailing drift-away orbit, with a 3-year mission lifetime (5-year goal), and will revisit planets at least three times at intervals of 9 months. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables high-order starlight suppression in broadband light. To demonstrate the VNC approach and advance its technology readiness, the NASA Goddard Space Flight Center and Lockheed Martin have developed a laboratory VNC and have demonstrated white-light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.
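The nulling requirement can be put in perspective with the textbook small-error leakage estimate for a two-beam nuller (an assumption here, not a formula from the abstract): null depth N ≈ (Δφ² + δa²)/4 for phase mismatch Δφ (radians) and fractional amplitude mismatch δa, ignoring polarization terms.

```python
import math

# Textbook two-beam nulling leakage estimate (assumed standard small-error
# form; the EPIC papers may use a more complete expression).
def null_depth(dphi, da):
    """Fractional starlight leakage from phase and amplitude mismatch."""
    return (dphi ** 2 + da ** 2) / 4.0

# Phase error alone that would limit the null to 1e-9:
dphi_max = math.sqrt(4 * 1e-9)                  # radians
wavelength_nm = 633.0                           # illustrative wavelength
opd_pm = dphi_max / (2 * math.pi) * wavelength_nm * 1e3  # path error in pm
print(f"max phase error: {dphi_max:.2e} rad (~{opd_pm:.0f} pm at 633 nm)")
```

Even under this simplified estimate, deep broadband nulls demand picometer-class optical path control, which is why laboratory demonstration of white-light nulling is a key technology milestone.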

  17. Atmospheric Fluctuation Measurements with the Palomar Testbed Interferometer

    Science.gov (United States)

    Linfield, R. P.; Lane, B. F.; Colavita, M. M.; PTI Collaboration

Observations of bright stars with the Palomar Testbed Interferometer, at a wavelength of 2.2 microns, have been used to measure atmospheric delay fluctuations. The delay structure function Dτ(Δt) was calculated for 66 scans (each >= 120 s in length) on seven nights in 1997 and one in 1998. For all except one scan, Dτ exhibited a clean power-law shape over the time interval 50-500 msec. Over shorter time intervals, the effect of the delay-line servo loop corrupts Dτ. Over longer time intervals (usually starting at > 1 s), the slope of Dτ decreases, presumably due to some combination of saturation (e.g., finite turbulent layer thickness) and the effect of the finite wind-speed crossing time on our 110 m baseline. The mean power-law slopes for the eight nights ranged from 1.16 to 1.36, substantially flatter than the value of 1.67 for three-dimensional Kolmogorov turbulence. Such sub-Kolmogorov slopes will result in atmospheric seeing (θ) that improves rapidly with increasing wavelength: θ ∝ λ^(1 - 2/β), where β is the observed power-law slope of Dτ. The atmospheric errors in astrometric measurements with an interferometer will average down more quickly than in the Kolmogorov case.
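The per-scan analysis described above can be sketched numerically: estimate the power-law slope of a delay structure function Dτ(Δt) = ⟨[τ(t+Δt) - τ(t)]²⟩ by a log-log fit. The synthetic "delay" below is a plain random walk, whose structure function grows linearly in the lag (slope 1); Kolmogorov turbulence would give 5/3, and the paper's measured slopes fall in between.

```python
import math
import random

# Synthetic delay time series: a random walk (structure-function slope 1).
random.seed(7)
tau = [0.0]
for _ in range(200_000):
    tau.append(tau[-1] + random.gauss(0.0, 1.0))

def structure_function(series, lag):
    """Mean squared delay difference at a given lag."""
    diffs = [(series[i + lag] - series[i]) ** 2
             for i in range(len(series) - lag)]
    return sum(diffs) / len(diffs)

lags = [1, 2, 5, 10, 20, 50, 100]
logx = [math.log(l) for l in lags]
logy = [math.log(structure_function(tau, l)) for l in lags]

# Least-squares fit of the slope in log-log space.
n = len(lags)
mx, my = sum(logx) / n, sum(logy) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(logx, logy))
         / sum((x - mx) ** 2 for x in logx))
print(f"fitted slope: {slope:.2f}")   # ~1 for a random walk
```

Applied to real fringe-delay data over a restricted lag range (50-500 ms in the paper, to avoid servo and saturation effects), the same fit yields the β that enters the seeing scaling θ ∝ λ^(1 - 2/β).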

  18. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    Science.gov (United States)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  19. User's guide to the Reliability Estimation System Testbed (REST)

    Science.gov (United States)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
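The modularization idea above, computing module reliabilities separately and then combining them, reduces to simple products for independent components; a hypothetical example (component values invented, and not RML syntax):

```python
# Combining independent module reliabilities: a series chain requires all
# modules to work; a parallel (redundant) group needs only one survivor.
def series(rels):
    r = 1.0
    for x in rels:
        r *= x
    return r

def parallel(rels):
    q = 1.0
    for x in rels:
        q *= (1.0 - x)          # probability that every redundant unit fails
    return 1.0 - q

# Hypothetical system: two redundant sensors feeding a processor and a bus.
sensors = parallel([0.95, 0.95])          # 1 - 0.05^2 = 0.9975
system = series([sensors, 0.999, 0.998])
print(f"system reliability: {system:.4f}")
```

Real REST models go well beyond this (failure-mode effects simulation, message passing between modules), but the composition of independently analyzed modules is the underlying principle.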

  20. Performance measurement, modeling, and evaluation of integrated concurrency control and recovery algorithms in distributed data base systems

    Energy Technology Data Exchange (ETDEWEB)

    Jenq, B.C.

    1986-01-01

The performance evaluation of integrated concurrency-control and recovery mechanisms for distributed database systems is studied using a distributed testbed system. In addition, a queueing network model was developed to analyze the two-phase locking scheme in the distributed testbed system. The combination of testbed measurement and analytical modeling provides an effective tool for understanding the performance of integrated concurrency-control and recovery algorithms in distributed database systems. The design and implementation of the distributed testbed system, CARAT, are presented. The concurrency-control and recovery algorithms implemented in CARAT include: a two-phase locking scheme with distributed deadlock detection, a distributed version of the optimistic approach, before-image and after-image journaling mechanisms for transaction recovery, and a two-phase commit protocol. Many performance measurements were conducted using a variety of workloads. A queueing network model was developed to analyze the performance of the CARAT system using the two-phase locking scheme with before-image journaling. The combination of testbed measurements and analytical modeling provides significant improvements in understanding the performance impacts of the concurrency-control and recovery algorithms in distributed database systems.
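The two-phase commit protocol mentioned above can be sketched in a few lines (a deliberate simplification that omits timeouts, crash recovery, and the journaling CARAT relies on):

```python
# Minimal two-phase commit: phase 1 collects prepare votes from all
# participants; phase 2 broadcasts the coordinator's decision, which is
# commit only if every participant voted yes.
class Participant:
    def __init__(self, name, can_commit=True):
        self.name, self.can_commit = name, can_commit
        self.state = "active"

    def prepare(self):                 # phase 1: vote yes/no
        self.state = "prepared" if self.can_commit else "aborted"
        return self.can_commit

    def finish(self, commit):          # phase 2: apply the decision
        self.state = "committed" if commit else "aborted"

def two_phase_commit(participants):
    votes = [p.prepare() for p in participants]   # phase 1
    decision = all(votes)
    for p in participants:                        # phase 2
        p.finish(decision)
    return decision

ok = two_phase_commit([Participant("A"), Participant("B")])
bad = two_phase_commit([Participant("A"), Participant("B", can_commit=False)])
print(ok, bad)   # True False
```

A single "no" vote aborts the transaction everywhere, which is the atomicity guarantee the protocol exists to provide; the journaling mechanisms make the same guarantee survive crashes.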

  1. Satellite teleradiology test bed for digital mammography

    Science.gov (United States)

    Barnett, Bruce G.; Dudding, Kathryn E.; Abdel-Malek, Aiman A.; Mitchell, Robert J.

    1996-05-01

Teleradiology offers significant improvement in efficiency and patient compliance over current practices in traditional film/screen-based diagnosis. The increasing number of women who need to be screened for breast cancer, including those in remote rural regions, make the advantages of teleradiology especially attractive for digital mammography. At the same time, the size and resolution of digital mammograms are among the most challenging to support in a cost-effective teleradiology system. This paper will describe a teleradiology architecture developed for use with digital mammography by GE Corporate Research and Development in collaboration with Massachusetts General Hospital under National Cancer Institute (NCI/NIH) grant number R01 CA60246-01. The testbed architecture is based on the Digital Imaging and Communications in Medicine (DICOM) standard, created by the American College of Radiology and National Electrical Manufacturers Association. The testbed uses several Sun workstations running SunOS, which emulate a rural examination facility connected to a central diagnostic facility, and uses a TCP-based DICOM application to transfer images over a satellite link. Network performance depends on the product of the bandwidth and the round-trip time. A satellite link has a round trip of 513 milliseconds, making the bandwidth-delay product a significant problem. This type of high-bandwidth, high-delay network is called a Long Fat Network, or LFN. The goal of this project was to quantify the performance of the satellite link and evaluate the effectiveness of TCP over an LFN. Four workstations have Sun's HSI/S (High Speed Interface) option. Two are connected by a cable, and two are connected through a satellite link. Both interfaces have the same T1 bandwidth (1.544 megabits per second). The only difference was the round-trip time. Even with large window buffers, the time to transfer a file over the satellite link was significantly longer, due to the bandwidth-delay product. To
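The bandwidth-delay arithmetic behind the observed slowdown is straightforward; the classic 64 KB TCP window figure below is a general protocol fact (pre window-scaling), not a measurement from this testbed.

```python
# TCP cannot keep more unacknowledged data in flight than its window allows,
# so on a long-delay link the window, not the link, caps throughput.
bandwidth_bps = 1.544e6      # T1 link
rtt_s = 0.513                # satellite round trip

bdp_bytes = bandwidth_bps * rtt_s / 8
print(f"bandwidth-delay product: {bdp_bytes:.0f} bytes")   # ~99 kB

# Classic TCP window (no window-scale option) is at most 65535 bytes,
# capping throughput at window/RTT regardless of link bandwidth.
window = 65535
max_throughput_bps = window * 8 / rtt_s
print(f"window-limited throughput: {max_throughput_bps/1e6:.2f} Mbit/s")
```

Because the bandwidth-delay product (~99 kB) exceeds the classic maximum window, the satellite link cannot be kept full without large (scaled) window buffers, which is exactly the LFN behavior the project set out to quantify.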

  2. Utilizing the ISS Mission as a Testbed to Develop Cognitive Communications Systems

    Science.gov (United States)

    Jackson, Dan

    2016-01-01

The ISS provides an excellent opportunity for pioneering artificial intelligence software to meet the challenges of real-time communications (comm) link management. This opportunity empowers the ISS Program to forge a testbed for developing cognitive communications systems for the benefit of the ISS mission, manned Low Earth Orbit (LEO) science programs, and future planetary exploration programs. In November 1998, the Flight Operations Directorate (FOD) started the ISS Antenna Manager (IAM) project to develop a single processor supporting multiple comm satellite tracking for two different antenna systems. Further, the processor was developed to be highly adaptable as it supported the ISS mission through all assembly stages. The ISS mission mandated communications specialists with complete knowledge of when the ISS was about to lose or gain comm link service. The specialty now also demands cognizance of large sun-tracking solar arrays and thermal-management panels in addition to the highly dynamic satellite service schedules and rise/set tables. This mission requirement makes the ISS the ideal communications-management analogue for future LEO space station and long-duration planetary exploration missions. Future missions, with their precision-pointed, dynamic, laser-based comm links, require complete autonomy for managing high-data-rate communications systems. Development of cognitive communications management systems that permit any crew member or payload science specialist, regardless of experience level, to control communications is one of the greater benefits the ISS can offer new space exploration programs. The IAM project met a new mission requirement never previously levied against US space-borne communications systems management: process and display the orientation of large solar arrays and thermal control panels based on real-time joint-angle telemetry.
However, IAM leaves the actual communications availability assessment to human judgement, which introduces

  3. Scientific Satellites

    Science.gov (United States)

    1967-01-01

[OCR fragments from the scanned report: "...noise signal level exceeds 10 times the normal background"; receiver block-diagram labels for a satellite astronomy experiment (antenna monopole, preamplifier, bandpass filter and detector, 18 telemetry channels, calibration noise source and clock, commanded from the programmer); and the study title "Animal Temperature Sensing for Studying the Effect of Prolonged Orbital Flight on the Circadian Rhythms of Pocket Mice," Unmanned Spacecraft Meeting.]

  4. Satellite Attitude Control System Simulator

    Directory of Open Access Journals (Sweden)

    G.T. Conti

    2008-01-01

Full Text Available Future space missions will involve satellites with great autonomy and stringent pointing precision, requiring Attitude Control Systems (ACS) with better performance than before, which is a function of the control algorithms implemented in the on-board computers. The difficulty in developing experimental ACS tests is obtaining the zero-gravity and torque-free conditions under which an ACS operates in space. However, prototypes for experimental verification of control algorithms are fundamental for space mission success. This paper presents the estimation of parameters such as the inertia matrix and the position of the centre of mass of a Satellite Attitude Control System Simulator (SACSS), using algorithms based on least-squares regression and recursive least-squares methods. Simulations have shown that both methods estimate the system parameters with small error. However, the recursive least-squares method is more adequate for the SACSS objectives. The SACSS platform model will be used for experimental verification of fundamental aspects of satellite attitude dynamics and for the design of different attitude control algorithms.
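As a sketch of the recursive least-squares method the paper compares, here is a generic two-parameter RLS identification loop; the linear model and all numbers are illustrative stand-ins, not the SACSS inertia dynamics.

```python
import random

# Recursive least squares (RLS) for y = theta1*x1 + theta2*x2 + noise.
# Each new sample updates the estimate without re-solving the whole batch.
random.seed(3)

theta_true = [2.5, -1.2]                 # unknown parameters (hypothetical)
theta = [0.0, 0.0]                       # running estimate
P = [[1e6, 0.0], [0.0, 1e6]]             # covariance (large = no prior)

for _ in range(200):
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]   # regressor
    y = sum(t * xi for t, xi in zip(theta_true, x)) + random.gauss(0, 0.01)

    # gain K = P x / (1 + x' P x)
    Px = [P[0][0] * x[0] + P[0][1] * x[1],
          P[1][0] * x[0] + P[1][1] * x[1]]
    denom = 1.0 + x[0] * Px[0] + x[1] * Px[1]
    K = [Px[0] / denom, Px[1] / denom]

    # innovation update of the estimate
    err = y - (theta[0] * x[0] + theta[1] * x[1])
    theta = [theta[0] + K[0] * err, theta[1] + K[1] * err]

    # covariance update: P <- P - K (x' P)
    xP = [x[0] * P[0][0] + x[1] * P[1][0],
          x[0] * P[0][1] + x[1] * P[1][1]]
    P = [[P[0][0] - K[0] * xP[0], P[0][1] - K[0] * xP[1]],
         [P[1][0] - K[1] * xP[0], P[1][1] - K[1] * xP[1]]]

print(f"estimates: {theta[0]:.2f}, {theta[1]:.2f}")   # ~2.50, -1.20
```

The per-sample update is why the recursive form suits an operating simulator: estimates refine continuously as new measurements arrive, with no growing batch to re-process.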

  5. Solar satellites

    Energy Technology Data Exchange (ETDEWEB)

    Poher, C.

    1982-01-01

    A reference system design, projected costs, and the functional concepts of a satellite solar power system (SSPS) for converting sunlight falling on solar panels of a satellite in GEO to a multi-GW beam which could be received by a rectenna on earth are outlined. Electricity transmission by microwaves has been demonstrated, and a reference design system for supplying 5 GW dc to earth was devised. The system will use either monocrystalline Si or concentrator GaAs solar cells for energy collection in GEO. Development is still needed to improve the lifespan of the cells. Currently, cell efficiency degrades by 50 percent after 7-8 yr in space. Each SSPS satellite would weigh either 34,000 tons (Si) or 51,000 tons (GaAs), thereby requiring the fabrication of a heavy lift launch vehicle or a single-stage-to-orbit transport in order to minimize launch costs. Costs for the solar panels have been estimated at $500/kW using the GaAs technology, with transport costs for materials to GEO being $40/kg.

  6. Solar satellites

    Science.gov (United States)

    Poher, C.

    A reference system design, projected costs, and the functional concepts of a satellite solar power system (SSPS) for converting sunlight falling on solar panels of a satellite in GEO to a multi-GW beam which could be received by a rectenna on earth are outlined. Electricity transmission by microwaves has been demonstrated, and a reference design system for supplying 5 GW dc to earth was devised. The system will use either monocrystalline Si or concentrator GaAs solar cells for energy collection in GEO. Development is still needed to improve the lifespan of the cells. Currently, cell efficiency degrades by 50 percent after 7-8 yr in space. Each SSPS satellite would weigh either 34,000 tons (Si) or 51,000 tons (GaAs), thereby requiring the fabrication of a heavy lift launch vehicle or a single-stage-to-orbit transport in order to minimize launch costs. Costs for the solar panels have been estimated at $500/kW using the GaAs technology, with transport costs for materials to GEO being $40/kg.

  7. A demand assignment control in international business satellite communications network

    Science.gov (United States)

    Nohara, Mitsuo; Takeuchi, Yoshio; Takahata, Fumio; Hirata, Yasuo

    An experimental system is being developed for use in an international business satellite (IBS) communications network based on demand-assignment (DA) and TDMA techniques. This paper discusses its system design, in particular from the viewpoints of a network configuration, a DA control, and a satellite channel-assignment algorithm. A satellite channel configuration is also presented along with a tradeoff study on transmission rate, HPA output power, satellite resource efficiency, service quality, and so on.
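
    At its core, a demand-assignment scheme grants a channel from a shared satellite pool when a terminal requests one and returns it afterwards. The class below is a toy illustration of that idea only; the names and blocking behavior are invented and do not reproduce the experimental IBS system's control logic:

```python
class DemandAssignmentController:
    """Toy demand-assignment (DA) controller: satellite channels are
    granted from a shared pool on request and returned on release."""

    def __init__(self, n_channels):
        self.free = list(range(n_channels))
        self.assigned = {}                  # terminal id -> channel

    def request(self, terminal):
        """Grant a free channel, or None if the call is blocked."""
        if terminal in self.assigned or not self.free:
            return None
        channel = self.free.pop(0)
        self.assigned[terminal] = channel
        return channel

    def release(self, terminal):
        """Return the terminal's channel to the pool."""
        self.free.append(self.assigned.pop(terminal))

dac = DemandAssignmentController(2)
print(dac.request("A"), dac.request("B"), dac.request("C"))  # 0 1 None
dac.release("A")
print(dac.request("C"))                                      # 0
```

    The tradeoffs the abstract mentions (transmission rate, HPA power, satellite resource efficiency, service quality) all hinge on how large the channel pool is relative to offered traffic, which a simulation like this makes easy to explore.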

  8. A test-bed modeling study for wave resource assessment

    Science.gov (United States)

    Yang, Z.; Neary, V. S.; Wang, T.; Gunawan, B.; Dallman, A.

    2016-02-01

    Hindcasts from phase-averaged wave models are commonly used to estimate the standard statistics used in wave energy resource assessments. However, the research community and the wave energy converter (WEC) industry are lacking a well-documented and consistent modeling approach for conducting these resource assessments at different phases of WEC project development and at different spatial scales, e.g., from a small-scale pilot study to a large-scale commercial deployment. Therefore, it is necessary to evaluate current wave model codes, as well as their limitations and knowledge gaps for predicting sea states, in order to establish best wave modeling practices and to identify future research needs that would improve wave prediction for resource assessment. This paper presents the first phase of an ongoing modeling study that addresses these concerns. The modeling study is being conducted at a test-bed site off the Central Oregon Coast using two of the most widely used third-generation wave models, WaveWatchIII and SWAN. A nested-grid modeling approach, with domain dimensions ranging from global to regional scales, was used to provide wave spectral boundary conditions to a local-scale model domain, which has a spatial dimension of around 60 km by 60 km and a grid resolution of 250 m to 300 m. Model results simulated by WaveWatchIII and SWAN in a structured-grid framework are compared to NOAA wave buoy data for six wave parameters: omnidirectional wave power, significant wave height, energy period, spectral width, direction of maximum directionally resolved wave power, and directionality coefficient. Model performance and computational efficiency are evaluated, and best practices for wave resource assessments are discussed, based on a set of standard error statistics and model run times.
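
    Error statistics of the kind used to score model output against NOAA buoy data can be computed directly. The set below (bias, RMSE, scatter index, linear correlation) is a generic choice for illustration and not necessarily the paper's exact metric list:

```python
import numpy as np

def error_stats(model, obs):
    """Generic skill statistics for scoring wave-model hindcasts
    against buoy observations."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    diff = model - obs
    rmse = np.sqrt((diff ** 2).mean())
    return {"bias": diff.mean(),
            "rmse": rmse,
            "scatter_index": rmse / obs.mean(),      # RMSE normalized by mean
            "r": np.corrcoef(model, obs)[0, 1]}      # linear correlation

# e.g. modeled vs. observed significant wave height (m) at one buoy;
# the numbers are made up for illustration.
stats = error_stats([2.1, 1.8, 3.0, 2.6], [2.0, 1.9, 2.8, 2.7])
print(round(stats["bias"], 3), round(stats["rmse"], 3))  # 0.025 0.132
```

    The same function applies unchanged to any of the six wave parameters, since each reduces to a paired model/observation time series at a buoy location.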

  9. Earthbound Unmanned Autonomous Vehicles (UAVS) As Planetary Science Testbeds

    Science.gov (United States)

    Pieri, D. C.; Bland, G.; Diaz, J. A.; Fladeland, M. M.

    2014-12-01

    Recent advances in the technology of unmanned vehicles have greatly expanded the range of contemplated terrestrial operational environments for their use, including aerial, surface, and submarine. The advances have been most pronounced in the areas of autonomy, miniaturization, durability, standardization, and ease of operation, most notably (especially in the popular press) for airborne vehicles. Of course, for a wide range of planetary venues, autonomy, at high cost in both money and risk, has always been a requirement. Most recently, missions to Mars have also featured an unprecedented degree of mobility. Combining the traditional planetary surface deployment operational and science imperatives with emerging, very accessible, and relatively economical small UAV platforms on Earth can provide flexible, rugged, self-directed, test-bed platforms for landed instruments and strategies that will ultimately be directed elsewhere and, in the process, provide valuable earth science data. While the most direct transfer of technology from terrestrial to planetary venues is perhaps for bodies with atmospheres (and oceans), with appropriate technology and strategy accommodations, single and networked UAVs can be designed to operate on even airless bodies, under a variety of gravities. In this presentation, we present and use results and lessons learned from our recent earthbound UAV volcano deployments, as well as our future plans for such, to conceptualize a range of planetary and small-body missions. We gratefully acknowledge the assistance of students and colleagues at our home institutions, and the government of Costa Rica, without which our UAV deployments would not have been possible. This work was carried out, in part, at the Jet Propulsion Laboratory of the California Institute of Technology under contract to NASA.

  10. High-contrast imager for Complex Aperture Telescopes (HiCAT): testbed design and coronagraph developments

    Science.gov (United States)

    N'Diaye, Mamadou; Choquet, E.; Pueyo, L.; Elliot, E.; Perrin, M. D.; Wallace, J.; Anderson, R. E.; Carlotti, A.; Groff, T. D.; Hartig, G. F.; Kasdin, J.; Lajoie, C.; Levecq, O.; Long, C.; Macintosh, B.; Mawet, D.; Norman, C. A.; Shaklan, S.; Sheckells, M.; Sivaramakrishnan, A.; Soummer, R.

    2014-01-01

    We present a new high-contrast imaging testbed designed to provide complete solutions for wavefront sensing and control and starlight suppression with complex aperture telescopes (NASA APRA; Soummer PI). This includes geometries with a central obstruction, support structures, and/or primary mirror segmentation. Complex aperture telescopes are often associated with large telescope designs, which are considered for future space missions. However, these designs make high-contrast imaging challenging because of additional diffraction features in the point spread function. We present a novel optimization approach for the testbed optical and opto-mechanical design that minimizes the impact of both phase and amplitude errors arising from wave propagation of the testbed optics' surface errors. This design approach allows us to define the specifications for the bench optics, which we then compare to the manufactured parts. We discuss the testbed alignment and first results. We also present our coronagraph designs for different testbed pupil shapes (AFTA or ATLAST), which involve a new method for the optimization of Apodized Pupil Lyot Coronagraphs (APLC).

  11. Sound algorithms

    OpenAIRE

    De Götzen , Amalia; Mion , Luca; Tache , Olivier

    2007-01-01

    International audience; We call sound algorithms the categories of algorithms that deal with digital sound signals. Sound algorithms appeared in the very infancy of computing. Sound algorithms present strong specificities that are the consequence of two dual considerations: the properties of the digital sound signal itself and its uses, and the properties of auditory perception.

  12. Genetic algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem-solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts are introduced, applications are surveyed, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
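
    The basic concepts the abstract introduces (a population, fitness-based selection, crossover, and mutation) fit in a short sketch. This is a generic minimal GA, shown here on the classic OneMax toy problem; it is not the software tool the project developed:

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, gens=60,
                      p_mut=0.05, seed=1):
    """Minimal generational GA: size-2 tournament selection, one-point
    crossover, and per-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]

    def select():                                   # size-2 tournament
        a, b = rng.sample(pop, 2)
        return max(a, b, key=fitness)

    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ (rng.random() < p_mut)   # bit-flip mutation
                     for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# OneMax: fitness is simply the number of 1 bits, so the GA should
# drive the best individual close to the all-ones string.
best = genetic_algorithm(sum)
print(sum(best))
```

    Highly parallel behavior comes for free: every fitness evaluation within a generation is independent, which is what makes the method attractive on parallel hardware.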

  13. Conceptual Design and Cost Estimate of a Subsonic NASA Testbed Vehicle (NTV) for Aeronautics Research

    Science.gov (United States)

    Nickol, Craig L.; Frederic, Peter

    2013-01-01

    A conceptual design and cost estimate for a subsonic flight research vehicle designed to support NASA's Environmentally Responsible Aviation (ERA) project goals is presented. To investigate the technical and economic feasibility of modifying an existing aircraft, a highly modified Boeing 717 was developed for maturation of technologies supporting the three ERA project goals of reduced fuel burn, noise, and emissions. This modified 717 utilizes midfuselage mounted modern high bypass ratio engines in conjunction with engine exhaust shielding structures to provide a low noise testbed. The testbed also integrates a natural laminar flow wing section and active flow control for the vertical tail. An eight year program plan was created to incrementally modify and test the vehicle, enabling the suite of technology benefits to be isolated and quantified. Based on the conceptual design and programmatic plan for this testbed vehicle, a full cost estimate of $526M was developed, representing then-year dollars at a 50% confidence level.

  14. Definition study for variable cycle engine testbed engine and associated test program

    Science.gov (United States)

    Vdoviak, J. W.

    1978-01-01

    The product/study double bypass variable cycle engine (VCE) was updated to incorporate recent improvements. The effect of these improvements on mission range and noise levels was determined. This engine design was then compared with current existing high-technology core engines in order to define a subscale testbed configuration that simulated many of the critical technology features of the product/study VCE. Detailed preliminary program plans were then developed for the design, fabrication, and static test of the selected testbed engine configuration. These plans included estimated costs and schedules for the detail design, fabrication and test of the testbed engine and the definition of a test program, test plan, schedule, instrumentation, and test stand requirements.

  15. A Future Accelerated Cognitive Distributed Hybrid Testbed for Big Data Science Analytics

    Science.gov (United States)

    Halem, M.; Prathapan, S.; Golpayegani, N.; Huang, Y.; Blattner, T.; Dorband, J. E.

    2016-12-01

    As increased sensor spectral data volumes from current and future Earth observing satellites are assimilated into high-resolution climate models, intensive cognitive machine learning technologies are needed to data mine, extract, and intercompare model outputs. It is clear today that the next generation of computers and storage, beyond petascale cluster architectures, will be data-centric. They will manage data movement and process data in place. Future cluster nodes have been announced that integrate multiple CPUs with high-speed links to GPUs and MICs on their backplanes, with massive non-volatile RAM and access to active flash RAM disk storage. Active Ethernet-connected key-value-store disk drives with 10GbE or higher are now available through the Kinetic Open Storage Alliance. At the UMBC Center for Hybrid Multicore Productivity Research, a future state-of-the-art Accelerated Cognitive Computer System (ACCS) for Big Data science is being integrated into the current IBM iDataPlex computational system 'bluewave'. Based on the next-generation IBM 200 PF Sierra processor, an interim two-node IBM Power S822 testbed is being integrated with dual Power8 processors with 10 cores, 1 TB of RAM, a PCIe-attached K80 GPU, and an FPGA Coherent Accelerator Processor Interface (CAPI) card to 20 TB of flash RAM. This system is to be updated to the Power8+, with NVLink 1.0 and the Pascal GPU, late in 2016. Moreover, the Seagate 96 TB Kinetic disk system with 24 Ethernet-connected active disks is integrated into the ACCS storage system. A Lightweight Virtual File System developed at NASA GSFC is installed on bluewave. Since remote access to publicly available quantum annealing computers is available at several government labs, the ACCS will offer an in-line Restricted Boltzmann Machine optimization capability on the D-Wave 2X quantum annealing processor over the campus high-speed 100 Gb network to Internet2 for large files. As an evaluation test of the cognitive functionality of the architecture, the

  16. Multi-level infrastructure of interconnected testbeds of large-scale wireless sensor networks (MI2T-WSN)

    CSIR Research Space (South Africa)

    Abu-Mahfouz, Adnan M

    2012-06-01

    Full Text Available are still required for further testing before the real implementation. In this paper we propose a multi-level infrastructure of interconnected testbeds of large-scale WSNs. This testbed consists of 1000 sensor motes that will be distributed into four...

  17. NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware

    Science.gov (United States)

    Johnson, V. L.; Teuben, P. J.; Penprase, B. E.

    An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods which can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antenna Galaxies. The testbed framework is available as open-source to assist other researchers and educators. Recommendations are made for testbed enhancements.
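
    The particle-particle force summation that cards like the MD-GRAPE2 accelerate, combined with a standard leapfrog integrator, can be sketched as follows. This is a generic O(N^2) demonstration, not NBodyLab's actual code:

```python
import numpy as np

def leapfrog(pos, vel, mass, dt, steps, G=1.0, eps=1e-3):
    """Kick-drift-kick leapfrog with direct particle-particle force
    summation (the O(N^2) work GRAPE-class hardware accelerates).
    Plummer softening eps keeps close encounters finite."""
    def accel(p):
        a = np.zeros_like(p)
        for i in range(len(p)):
            d = p - p[i]                            # vectors to other bodies
            r2 = (d ** 2).sum(axis=1) + eps ** 2
            r2[i] = np.inf                          # exclude self-force
            a[i] = G * (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
        return a

    a = accel(pos)
    for _ in range(steps):
        vel = vel + 0.5 * dt * a                    # half kick
        pos = pos + dt * vel                        # drift
        a = accel(pos)
        vel = vel + 0.5 * dt * a                    # half kick
    return pos, vel

# Equal-mass binary on a circular orbit (G = total mass = 1): the
# separation should stay very close to 1 over many orbital periods.
m = np.array([0.5, 0.5])
p = np.array([[-0.5, 0.0], [0.5, 0.0]])
v = np.array([[0.0, -0.5], [0.0, 0.5]])
p2, v2 = leapfrog(p, v, m, dt=0.01, steps=1000)
print(np.linalg.norm(p2[0] - p2[1]))   # ~1.0
```

    Leapfrog is the usual classroom choice because it is symplectic: energy errors oscillate rather than drift, so a galaxy-collision run like the Tadpole or Antennae experiments stays physically plausible over long integrations.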

  18. An overview of the U.S. Army Research Laboratory's Sensor Information Testbed for Collaborative Research Environment (SITCORE) and Automated Online Data Repository (AODR) capabilities

    Science.gov (United States)

    Ward, Dennis W.; Bennett, Kelly W.

    2017-05-01

    The Sensor Information Testbed COllaborative Research Environment (SITCORE) and the Automated Online Data Repository (AODR) are significant enablers of the U.S. Army Research Laboratory (ARL)'s Open Campus initiative and together create a highly collaborative research laboratory and testbed environment focused on sensor data and information fusion. SITCORE creates a virtual research development environment allowing collaboration from other locations, including DoD, industry, academia, and coalition facilities. SITCORE combined with AODR provides end-to-end algorithm development, experimentation, demonstration, and validation. The AODR enterprise allows ARL, as well as other government organizations, industry, and academia, to store and disseminate multiple-intelligence (Multi-INT) datasets collected at field exercises and demonstrations, and to facilitate research and development (R&D) and the advancement of analytical tools and algorithms supporting the Intelligence, Surveillance, and Reconnaissance (ISR) community. The AODR provides a potential central repository for standards-compliant datasets to serve as the "go-to" location for lessons learned and reference products. Many of the AODR datasets have associated ground truth and other metadata, which provides a rich and robust data suite for researchers to develop, test, and refine their algorithms. Researchers download the test data to their own environments using a sophisticated web interface. The AODR allows researchers to request copies of stored datasets and the government to process the requests and approvals in an automated fashion. Access to the AODR requires two-factor authentication in the form of a Common Access Card (CAC) or External Certificate Authority (ECA)

  19. Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report

    Science.gov (United States)

    Ossenfort, John

    2008-01-01

    As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system: essentially, an evaluation of how observable the system's behavior is using the available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system be available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and to perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). The TEAMS toolset is intended to be a solution spanning all phases of the system, from design and

  20. Aeronautics Autonomy Testbed Capability (AATC) Team Developed Concepts

    Science.gov (United States)

    Smith, Phillip J.

    2018-01-01

    In 2015, the National Aeronautics and Space Administration (NASA) formed a multi-center, interdisciplinary team of engineers from three different aeronautics research centers who were tasked with improving NASA autonomy research capabilities. This group was subsequently named the Aeronautics Autonomy Testbed Capability (AATC) team. To aid in confronting the autonomy research directive, NASA contracted IDEO, a design firm, to provide consultants and guides to educate NASA engineers in the practice of design thinking, which is an unconventional method for aerospace design processes. The team then began learning about autonomy research challenges by conducting interviews with a diverse group of researchers and pilots, military personnel and civilians, experts and amateurs. Part of this design thinking process involved developing ideas for products or programs, known as concepts, that could enable real-world fulfillment of the most important latent needs identified through analysis of the interviews. The concepts are intended to be sacrificial, intermediate steps in the design thinking process and are presented in this report to record the efforts of the AATC group. Descriptions are provided in the present tense to allow for further ideation and imagining the concept as reality, as was attempted during the team's discussions and interviews. This does not indicate that the concepts are actually in practice within NASA, though there may be similar existing programs independent of AATC. These concepts were primarily created at two distinct stages during the design thinking process. After the initial interviews, there was a workshop for concept development, and the resulting ideas are shown in this work as the First Round. As part of succeeding interviews, the team members presented the First Round concepts to refine the understanding of existing research needs. This knowledge was then used to generate an additional set of concepts denoted as the Second Round. Some

  1. LOS Throughput Measurements in Real-Time with a 128-Antenna Massive MIMO Testbed

    OpenAIRE

    Harris, Paul; Zhang, Siming; Beach, Mark; Mellios, Evangelos; Nix, Andrew; Armour, Simon; Doufexi, Angela; Nieman, Karl; Kundargi, Nikhil

    2017-01-01

    This paper presents initial results for a novel 128-antenna massive Multiple-Input, Multiple-Output (MIMO) testbed developed through Bristol Is Open in collaboration with National Instruments and Lund University. We believe that the results presented here validate the adoption of massive MIMO as a key enabling technology for 5G and pave the way for further pragmatic research by the massive MIMO community. The testbed operates in real-time with a Long-Term Evolution (LTE)-like PHY in Time Div...

  2. Implementation of a Wireless Time Distribution Testbed Protected with Quantum Key Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Bonior, Jason D [ORNL; Evans, Philip G [ORNL; Sheets, Gregory S [ORNL; Jones, John P [ORNL; Flynn, Toby H [ORNL; O' Neil, Lori Ross [Pacific Northwest National Laboratory (PNNL); Hutton, William [Pacific Northwest National Laboratory (PNNL); Pratt, Richard [Pacific Northwest National Laboratory (PNNL); Carroll, Thomas E. [Pacific Northwest National Laboratory (PNNL)

    2017-01-01

    Secure time transfer is critical for many time-sensitive applications. The Global Positioning System (GPS), which is often used for this purpose, has been shown to be susceptible to spoofing attacks. Quantum Key Distribution (QKD) offers a way to securely generate encryption keys at two locations. Through careful use of this information it is possible to create a system that is more resistant to spoofing attacks. In this paper we describe our work to create a testbed that utilizes QKD and traditional RF links. This testbed will be used for the development of more secure and spoofing-resistant time distribution protocols.

  3. Accelerating Innovation that Enhances Resource Recovery in the Wastewater Sector: Advancing a National Testbed Network.

    Science.gov (United States)

    Mihelcic, James R; Ren, Zhiyong Jason; Cornejo, Pablo K; Fisher, Aaron; Simon, A J; Snyder, Seth W; Zhang, Qiong; Rosso, Diego; Huggins, Tyler M; Cooper, William; Moeller, Jeff; Rose, Bob; Schottel, Brandi L; Turgeon, Jason

    2017-07-18

    This Feature examines significant challenges and opportunities to spur innovation and accelerate adoption of reliable technologies that enhance integrated resource recovery in the wastewater sector through the creation of a national testbed network. The network is a virtual entity that connects appropriate physical testing facilities, and other components needed for a testbed network, with researchers, investors, technology providers, utilities, regulators, and other stakeholders to accelerate the adoption of innovative technologies and processes that are needed for the water resource recovery facility of the future. Here we summarize and extract key issues and developments, to provide a strategy for the wastewater sector to accelerate a path forward that leads to new sustainable water infrastructures.

  4. Data dissemination in the wild: A testbed for high-mobility MANETs

    DEFF Research Database (Denmark)

    Vingelmann, Peter; Pedersen, Morten Videbæk; Heide, Janus

    2012-01-01

    This paper investigates the problem of efficient data dissemination in Mobile Ad hoc NETworks (MANETs) with high mobility. A testbed is presented which provides a high degree of mobility in experiments. The testbed consists of 10 autonomous robots with mobile phones mounted on them. The mobile...... information, and the goal is to convey that information to all devices. A strategy is proposed that uses UDP broadcast transmissions and random linear network coding to facilitate the efficient exchange of information in the network. An application is introduced that implements this strategy on Nokia phones...
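
    Random linear network coding over GF(2) reduces to XORing random subsets of source packets and decoding by Gaussian elimination once enough independent combinations arrive. The sketch below illustrates that principle with packets modeled as plain integers; it is a toy, not the Nokia-phone implementation:

```python
import random

def rlnc_gf2(packets, n_coded, seed=7):
    """Encode: each coded packet is the XOR of a random subset of the
    source packets, tagged with its GF(2) coefficient vector."""
    rng = random.Random(seed)
    k = len(packets)
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randint(0, 1) for _ in range(k)]
        if not any(coeffs):                  # skip the useless zero combo
            coeffs[rng.randrange(k)] = 1
        payload = 0
        for c, pkt in zip(coeffs, packets):
            if c:
                payload ^= pkt
        coded.append((coeffs, payload))
    return coded

def decode_gf2(coded, k):
    """Decode by Gaussian elimination over GF(2); returns the sources
    once k linearly independent combinations are present, else None."""
    pivots = [None] * k
    for coeffs, payload in coded:
        coeffs = list(coeffs)
        for j in range(k):
            if coeffs[j]:
                if pivots[j] is None:
                    pivots[j] = (coeffs, payload)
                    break
                pc, pp = pivots[j]           # eliminate against pivot row
                coeffs = [a ^ b for a, b in zip(coeffs, pc)]
                payload ^= pp
    out = [None] * k
    for j in reversed(range(k)):             # back-substitution
        if pivots[j] is None:
            return None                      # rank deficient so far
        coeffs, payload = pivots[j]
        for m in range(j + 1, k):
            if coeffs[m]:
                payload ^= out[m]
        out[j] = payload
    return out

pkts = [0xDE, 0xAD, 0xBE, 0xEF]
print(decode_gf2(rlnc_gf2(pkts, 12), len(pkts)))
```

    With 12 random combinations of 4 packets the decoder almost always sees full rank; a real dissemination protocol simply keeps collecting coded packets until decoding succeeds, which is why broadcast plus coding works well under high mobility.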

  5. The Segmented Aperture Interferometric Nulling Testbed (SAINT) I: overview and air-side system description

    Science.gov (United States)

    Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter; Ballard, Marlin; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; Shiri, Ron

    2016-07-01

    This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify end-to-end high-contrast starlight suppression performance. This pathfinder testbed will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.

  6. Implementation of a virtual link between power system testbeds at Marshall Spaceflight Center and Lewis Research Center

    Science.gov (United States)

    Doreswamy, Rajiv

    1990-01-01

    The Marshall Space Flight Center (MSFC) owns and operates a space station module power management and distribution (SSM-PMAD) testbed. This system, managed by expert systems, is used to analyze and develop power system automation techniques for Space Station Freedom. The Lewis Research Center (LeRC), Cleveland, Ohio, has developed and implemented a space station electrical power system (EPS) testbed. This system and its power management controller are representative of the overall Space Station Freedom power system. A virtual link is being implemented between the testbeds at MSFC and LeRC. This link would enable configuration of SSM-PMAD as a load center for the EPS testbed at LeRC. This connection will add to the versatility of both systems, and provide an environment of enhanced realism for operation of both testbeds.

  7. A Mobile Satellite Experiment (MSAT-X) network definition

    Science.gov (United States)

    Wang, Charles C.; Yan, Tsun-Yee

    1990-01-01

    The network architecture development of the Mobile Satellite Experiment (MSAT-X) project for the past few years is described. The results and findings of the network research activities carried out under the MSAT-X project are summarized. A framework is presented upon which the Mobile Satellite Systems (MSSs) operator can design a commercial network. A sample network configuration and its capability are also included under the projected scenario. The Communication Interconnection aspect of the MSAT-X network is discussed. In the MSAT-X network structure two basic protocols are presented: the channel access protocol, and the link connection protocol. The error-control techniques used in the MSAT-X project and the packet structure are also discussed. A description of two testbeds developed for experimentally simulating the channel access protocol and link control protocol, respectively, is presented. A sample network configuration and some future network activities of the MSAT-X project are also presented.

  8. Algorithmic cryptanalysis

    CERN Document Server

    Joux, Antoine

    2009-01-01

    Illustrating the power of algorithms, Algorithmic Cryptanalysis describes algorithmic methods with cryptographically relevant examples. Focusing on both private- and public-key cryptographic algorithms, it presents each algorithm either as a textual description, in pseudo-code, or in a C code program.Divided into three parts, the book begins with a short introduction to cryptography and a background chapter on elementary number theory and algebra. It then moves on to algorithms, with each chapter in this section dedicated to a single topic and often illustrated with simple cryptographic applic

  9. Retrievals of Karenia brevis Harmful Algal Blooms in the West Florida Shelf from observations by the JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Satellite processed using Neural Network Algorithms, and Evaluation of the Impact of Temporal Variabilities on Attainable Accuracies against in-situ Measurements

    Science.gov (United States)

    El-Habashi, A.; Ahmed, S.; Lovko, V. J.

    2017-12-01

    Retrievals of Karenia brevis harmful algal blooms (KB HABs) in the West Florida Shelf (WFS), obtained from remote sensing reflectance (Rrs) measurements by the JPSS VIIRS satellite and processed using recently developed neural network (NN) algorithms, are examined and compared with other techniques. The NN approach is used because it does not require observations at the 678 nm chlorophyll fluorescence channel. This channel, previously used on MODIS-A (the predecessor satellite) to satisfactorily detect KB HABs using the normalized fluorescence height approach, is unavailable on VIIRS. The NN is therefore trained on a synthetic data set of 20,000 IOPs based on a wide range of parameters from NOMAD, and requires as inputs only the Rrs measurements at the 486, 551, and 671 nm channels (available from VIIRS) or the 488, 555, and 667 nm channels (available from MODIS-A). These channels are less vulnerable to the atmospheric correction inadequacies affecting observations at the shorter blue wavelengths used by other algorithms. The NN retrieves phytoplankton absorption at 443 nm, which, when combined with backscatter information at 551 nm, is sufficient for effective KB HABs retrievals. NN retrievals of KB HABs in the WFS are found to compare favorably with retrievals from other algorithms, including OCI/OC3, GIOP, and QAA version 5. Accuracies of VIIRS retrievals were then compared against all the in-situ measurements available over the 2012-2016 period for which concurrent or near-concurrent matchups with VIIRS observations could be obtained. Retrieval statistics showed that the NN technique achieved the best accuracies. They also highlight the impact of temporal variabilities on retrieval accuracies, showing the importance of a short overlap time window between in-situ measurement and satellite retrieval: retrievals within a 15-minute window showed very significantly improved accuracies over those attained with a 100-minute window.
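
    The NN retrieval idea, learning a mapping from a few Rrs channels to a phytoplankton-absorption quantity using synthetic training data, can be illustrated with a tiny one-hidden-layer network. Everything below (the synthetic mapping, network size, learning rate) is invented for illustration and is not the authors' trained model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical synthetic training set standing in for the NOMAD-derived
# IOP data: three Rrs "channels" in, one absorption-like quantity out.
# The target mapping itself is made up purely for illustration.
X = rng.uniform(0.001, 0.02, size=(500, 3))
y = (0.8 * X[:, 0] - 0.3 * X[:, 1] + 0.5 * X[:, 2] ** 2)[:, None]

# One-hidden-layer MLP trained by plain batch gradient descent.
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

loss0 = np.mean((forward(X)[1] - y) ** 2)      # loss before training
lr = 0.05
for _ in range(2000):
    h, pred = forward(X)
    g = 2.0 * (pred - y) / len(X)              # dLoss/dpred
    gW2, gb2 = h.T @ g, g.sum(axis=0)
    gh = (g @ W2.T) * (1.0 - h ** 2)           # backprop through tanh
    gW1, gb1 = X.T @ gh, gh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

loss = np.mean((forward(X)[1] - y) ** 2)
print(loss < loss0)   # True
```

    Training on synthetic IOP data is what frees the method from needing the missing 678 nm channel: the network only ever sees the channels the sensor actually provides.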

  10. EVALUATING THE ACCURACY OF DEM GENERATION ALGORITHMS FROM UAV IMAGERY

    Directory of Open Access Journals (Sweden)

    J. J. Ruiz

    2013-08-01

    Full Text Available In this work we evaluated how the use of different positioning systems affects the accuracy of Digital Elevation Models (DEMs) generated from aerial imagery obtained with Unmanned Aerial Vehicles (UAVs). In this domain, state-of-the-art DEM generation algorithms suffer from the typical errors of GPS/INS devices in the position measurements associated with each picture; the deviations of these measurements from real-world positions are on the order of meters. The experiments were carried out using a small quadrotor in the indoor testbed at the Center for Advanced Aerospace Technologies (CATEC). This testbed houses a system that can track small markers mounted on the UAV and around the scenario with millimeter precision. This provides very precise position measurements, to which we can add random noise to simulate the errors of different GPS receivers. The results showed that final DEM accuracy clearly depends on the quality of the positioning information.
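
    The noise-injection methodology can be sketched in a few lines: start from millimeter-accurate tracked positions and add Gaussian noise whose standard deviation emulates a given GPS receiver class. The sigma values below are assumed for illustration, not CATEC measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Millimeter-accurate "motion capture" camera positions for 100 images (meters).
true_pos = rng.uniform(0, 50, size=(100, 3))

# Emulate a consumer-grade receiver with meter-level noise and an RTK-class
# receiver with centimeter-level noise (assumed sigmas, per axis).
gps_consumer = true_pos + rng.normal(0, 2.0, true_pos.shape)
gps_rtk = true_pos + rng.normal(0, 0.02, true_pos.shape)

def rmse(a, b):
    """3-D root-mean-square error between two position sets."""
    return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

print(rmse(true_pos, gps_consumer))  # meter-level error
print(rmse(true_pos, gps_rtk))       # centimeter-level error
```

Feeding each noisy position set into the same DEM pipeline then isolates the effect of the positioning system from all other error sources, which is the core of the paper's experimental design.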

  11. Design and implementation of a low cost experimental testbed for ...

    African Journals Online (AJOL)

    As wireless sensor networks (WSNs) become an essential part of modern-day infrastructure, researchers have presented numerous algorithms and models aimed at optimizing several aspects of the technology. These models are often developed and analyzed in simulated environments. The obvious need to experiment and ...

  12. Geostationary Satellite (GOES) Images

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Visible and Infrared satellite imagery taken from radiometer instruments on SMS (ATS) and GOES satellites in geostationary orbit. These satellites produced...

  13. Shadow imaging of geosynchronous satellites

    Science.gov (United States)

    Douglas, Dennis Michael

    Geosynchronous (GEO) satellites are essential for modern communication networks. If communication to a GEO satellite is lost and a malfunction occurs upon orbit insertion, such as a solar panel failing to deploy, there is no direct way to observe the satellite from Earth. Due to the GEO orbital distance of ~36,000 km from Earth's surface, the Rayleigh criterion dictates that a 14 m telescope is required to conventionally image a satellite with spatial resolution down to 1 m using visible light. Furthermore, a telescope larger than 30 m is required under ideal conditions to obtain spatial resolution down to 0.4 m. This dissertation evaluates a method for obtaining high-spatial-resolution images of GEO satellites from an Earth-based system by measuring the irradiance distribution on the ground resulting from the occultation of a star by the satellite. The representative size of a GEO satellite combined with the orbital distance results in a ground shadow consistent with a Fresnel diffraction pattern when observed at visible wavelengths. A measurement of the ground-shadow irradiance is used as an amplitude constraint in a Gerchberg-Saxton phase retrieval algorithm that reconstructs the satellite's 2D transmission function, which is analogous to a reverse-contrast image of the satellite. The advantage of shadow imaging is that a terrestrial, redundant set of linearly distributed, inexpensive small telescopes, each coupled to a high-speed detector, is a more effective resolved imaging system for GEO satellites than a very large telescope under ideal conditions. Modeling and simulation efforts indicate sub-meter spatial resolution can be readily achieved using collection apertures of less than 1 meter in diameter. A mathematical basis is established for the treatment of the physical phenomena involved in the shadow imaging process. This includes the source star brightness and angular extent, and the diffraction of starlight from the satellite
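
    The aperture requirement quoted above follows from the Rayleigh criterion, and the shadow regime from the Fresnel number. The helpers below assume λ = 550 nm, so their exact outputs differ somewhat from the abstract's figures, which depend on the wavelength and slant range assumed there.

```python
# Rayleigh-criterion aperture and Fresnel number for GEO imaging geometry.
LAMBDA = 550e-9        # assumed visible wavelength (m)
RANGE = 36_000e3       # GEO distance from the text (m)

def rayleigh_aperture(resolution_m, wavelength=LAMBDA, distance=RANGE):
    """Aperture D needed to resolve `resolution_m` at `distance`: D = 1.22*lambda*L/x."""
    return 1.22 * wavelength * distance / resolution_m

def fresnel_number(feature_m, wavelength=LAMBDA, distance=RANGE):
    """N = a^2 / (lambda * z); N of order 1 or larger marks the Fresnel (near-field) regime."""
    return feature_m ** 2 / (wavelength * distance)

print(rayleigh_aperture(1.0))   # tens of meters of aperture for 1 m resolution
print(fresnel_number(10.0))     # a ~10 m satellite casts a Fresnel-regime shadow
```

The second function is why the ground shadow carries recoverable structure: a satellite-sized occulter at GEO sits squarely in the Fresnel diffraction regime at visible wavelengths.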

  14. Algorithmic mathematics

    CERN Document Server

    Hougardy, Stefan

    2016-01-01

    Algorithms play an increasingly important role in nearly all fields of mathematics. This book allows readers to develop basic mathematical abilities, in particular those concerning the design and analysis of algorithms as well as their implementation. It presents not only fundamental algorithms like the sieve of Eratosthenes, the Euclidean algorithm, sorting algorithms, algorithms on graphs, and Gaussian elimination, but also discusses elementary data structures, basic graph theory, and numerical questions. In addition, it provides an introduction to programming and demonstrates in detail how to implement algorithms in C++. This textbook is suitable for students who are new to the subject and covers a basic mathematical lecture course, complementing traditional courses on analysis and linear algebra. Both authors have given this "Algorithmic Mathematics" course at the University of Bonn several times in recent years.
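
    Two of the algorithms the blurb names first lend themselves to compact sketches; the book develops them in C++, while the Python versions below are this summary's own illustration.

```python
def gcd(a: int, b: int) -> int:
    """Euclidean algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

def sieve(n: int) -> list:
    """Sieve of Eratosthenes: cross off multiples of each prime up to sqrt(n)."""
    is_prime = [False, False] + [True] * (n - 1)
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
    return [i for i, ok in enumerate(is_prime) if ok]

print(gcd(252, 105))   # → 21
print(sieve(30))       # → [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```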

  15. Total algorithms

    NARCIS (Netherlands)

    Tel, G.

    We define the notion of total algorithms for networks of processes. A total algorithm enforces that a "decision" is taken by a subset of the processes, and that participation of all processes is required to reach this decision. Total algorithms are an important building block in the design of

  16. Diffraction-based analysis of tunnel size for a scaled external occulter testbed

    Science.gov (United States)

    Sirbu, Dan; Kasdin, N. Jeremy; Vanderbei, Robert J.

    2016-07-01

    For performance verification of an external occulter mask (also called a starshade), scaled testbeds have been developed to measure the suppression of the occulter shadow in the pupil plane and contrast in the image plane. For occulter experiments the scaling is typically performed by maintaining an equivalent Fresnel number. The original Princeton occulter testbed was oversized with respect to both input beam and shadow propagation to limit any diffraction effects due to finite testbed enclosure edges; however, to operate at realistic space-mission equivalent Fresnel numbers an extended testbed is currently under construction. With the longer propagation distances involved, diffraction effects due to the edge of the tunnel must now be considered in the experiment design. Here, we present a diffraction-based model of two separate tunnel effects. First, we consider the effect of tunnel-edge induced diffraction ringing upstream from the occulter mask. Second, we consider the diffraction effect due to clipping of the output shadow by the tunnel downstream from the occulter mask. These calculations are performed for a representative point design relevant to the new Princeton occulter experiment, but we also present an analytical relation that can be used for other propagation distances.
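
    The equivalent-Fresnel-number scaling mentioned above can be made concrete: holding N = r²/(λz) fixed sets the lab occulter radius for a given tunnel length. The specific numbers below are illustrative, not the Princeton design values.

```python
import math

def scaled_radius(r_space, z_space, z_lab, wavelength):
    """Occulter radius for the lab that preserves the space-mission Fresnel number.

    N = r^2 / (lambda * z) is held fixed between configurations.
    """
    n = r_space ** 2 / (wavelength * z_space)   # mission Fresnel number
    return math.sqrt(n * wavelength * z_lab)    # invert N for the lab distance

# Illustrative mission: a 13 m starshade at 37,000 km separation, scaled to a
# 40 m tunnel at a HeNe wavelength (assumed values, not the actual design).
r_lab = scaled_radius(r_space=13.0, z_space=37_000e3, z_lab=40.0, wavelength=633e-9)
print(r_lab)   # centimeter-scale lab mask
```

Note that the scaled radius depends only on the ratio z_lab/z_space, which is why longer tunnels permit larger, easier-to-fabricate masks at the same Fresnel number.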

  17. Vacuum Nuller Testbed (VNT) Performance, Characterization and Null Control: Progress Report

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-01-01

    Herein we report on the development, sensing and control and our first results with the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9 and ideally 10^10 at an inner working angle of 2*λ/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies and the null sensing and control approach.

  18. Design of a low-power testbed for Wireless Sensor Networks and verification

    NARCIS (Netherlands)

    van Hoesel, L.F.W.; Dulman, S.O.; Havinga, Paul J.M.; Kip, Harry J.

    In this document the design considerations and component choices of a testbed prototype device for wireless sensor networks will be discussed. These devices must be able to monitor their physical environment, process data and assist other nodes in forwarding sensor readings. For these tasks, five

  19. Evaluation of Unmanned Aircraft Systems (UAS) for Weather and Climate using the Multi-testbed approach

    Science.gov (United States)

    Baker, B.; Lee, T.; Buban, M.; Dumas, E. J.

    2017-12-01

    The development of small Unmanned Aerial System (sUAS) testbeds that can be used to validate, integrate, calibrate and evaluate new technology and sensors for routine boundary layer research, validation of operational weather models, improvement of model parameterizations, and recording observations within high-impact storms is important for understanding the impact of using sUASs routinely as a new observing platform. The goal of the multi-testbed approach is to build a robust set of protocols to assess the cost and operational feasibility of unmanned observations for routine applications using various combinations of sUAS aircraft and sensors in different locations and field experiments. All of these observational testbeds serve different community needs, but they also use a diverse suite of methodologies for calibration and evaluation of different sensors and platforms for severe weather and boundary layer research. The primary focus will be to evaluate meteorological sensor payloads to measure thermodynamic parameters and define surface characteristics with visible, IR, and multi-spectral cameras. This evaluation will lead to recommendations for sensor payloads for VTOL and fixed-wing sUAS.

  20. Creative thinking of design and redesign on SEAT aircraft cabin testbed: a case study

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    In this paper, the intuition approach in the design and redesign of the environmentally friendly, innovative aircraft cabin simulator is presented. The aircraft cabin simulator is a testbed used for the European project SEAT (Smart tEchnologies for Stress free Air Travel). The SEAT project aims to

  1. Vacuum nuller testbed (VNT) performance, characterization and null control: progress report

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-10-01

    Herein we report on the development, sensing and control and our first results with the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9, and ideally 10^10 at an inner working angle of 2*λ/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies and the null sensing and control approach.

  2. Data Distribution Service-Based Interoperability Framework for Smart Grid Testbed Infrastructure

    Directory of Open Access Journals (Sweden)

    Tarek A. Youssef

    2016-03-01

    Full Text Available This paper presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurement and control network. The advantages of the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and automatic discovery of dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure were developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing of experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
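
    The data-centric pattern described above can be illustrated with a toy in-process bus. This is a sketch of the idea only (a topic-keyed data space with late-joiner delivery loosely analogous to DDS durability QoS), not the DDS API itself; topic names and values are invented.

```python
from collections import defaultdict
from typing import Any, Callable

class DataBus:
    """Toy data-centric bus: publishers and subscribers share only topic
    names, never direct references to each other, so nodes can join and
    leave dynamically. Not a real DDS implementation."""

    def __init__(self):
        self._subs = defaultdict(list)
        self._last = {}                       # last published sample per topic

    def subscribe(self, topic: str, cb: Callable[[Any], None]) -> None:
        self._subs[topic].append(cb)
        if topic in self._last:               # a late joiner immediately gets
            cb(self._last[topic])             # the current state of the topic

    def publish(self, topic: str, sample: Any) -> None:
        self._last[topic] = sample
        for cb in self._subs[topic]:
            cb(sample)

bus = DataBus()
readings = []
bus.publish("grid/bus1/voltage", 0.98)                 # published before anyone listens
bus.subscribe("grid/bus1/voltage", readings.append)    # late joiner still sees 0.98
bus.publish("grid/bus1/voltage", 1.02)
print(readings)  # → [0.98, 1.02]
```

The decoupling shown here is what removes the single point of failure at the application level: no controller needs to know which node produced a measurement, only its topic.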

  3. Human Exploration Spacecraft Testbed for Integration and Advancement (HESTIA)

    Science.gov (United States)

    Banker, Brian F.; Robinson, Travis

    2016-01-01

    The proposed paper will cover an ongoing effort named HESTIA (Human Exploration Spacecraft Testbed for Integration and Advancement), led at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC), to promote a cross-subsystem approach to developing Mars-enabling technologies with the ultimate goal of integrated system optimization. HESTIA also aims to develop the infrastructure required to rapidly test these highly integrated systems at a low cost. The initial focus is on the common fluids architecture required to enable human exploration of Mars, specifically between the life support and in-situ resource utilization (ISRU) subsystems. An overview of the advancements in integrated technologies, infrastructure, simulation, and modeling capabilities will be presented, as well as the results and findings of integrated testing. Due to the enormous mass gear-ratio required for human exploration beyond low-Earth orbit (for every 1 kg of payload landed on Mars, 226 kg will be required on Earth), minimization of surface hardware and commodities is paramount. Hardware requirements can be minimized by reducing equipment that performs similar functions for different subsystems: if hardware could be developed that meets the requirements of both life support and ISRU, it could reduce primary hardware and/or spares. Commodities delivered to the surface of Mars can be minimized by creating higher-efficiency systems that produce little to no undesired waste, such as a closed-loop life support subsystem. Where complete efficiency is impossible or impractical, makeup commodities could be manufactured via ISRU. Although utilization of ISRU products (oxygen and water) for crew consumption holds great promise for reducing demands on life support hardware, there exist concerns as to the purity and transportation of commodities. To date, ISRU has been focused on production rates and purities for

  4. Development and Validation of Improved Techniques for Cloud Property Retrieval from Environmental Satellites

    National Research Council Canada - National Science Library

    Gustafson, Gary

    2000-01-01

    ...) develop extensible cloud property retrieval algorithms suitable for expanding existing cloud analysis capabilities to utilize data from new and future environmental satellite sensing systems; (2...

  5. An information technology enabled sustainability test-bed (ITEST) for occupancy detection through an environmental sensing network

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Bing; Lam, Khee Poh; Zhang, Rui; Chiou, Yun-Shang [Center for Building Performance and Diagnostics, Carnegie Mellon University, Pittsburgh, PA 15213 (United States); Andrews, Burton; Hoeynck, Michael; Benitez, Diego [Research and Technology Center, Robert BOSCH LLC, Pittsburgh, PA 15212 (United States)

    2010-07-15

    This paper describes a large-scale wireless and wired environmental sensor network test-bed and its application to occupancy detection in an open-plan office building. Detection of occupant presence has been used extensively in built environments for applications such as demand-controlled ventilation and security; however, the ability to discern the actual number of people in a room is beyond the scope of current sensing techniques. To address this problem, a complex sensor network is deployed in the Robert L. Preger Intelligent Workplace comprising a wireless ambient-sensing system, a wired carbon dioxide sensing system, and a wired indoor air quality sensing system. A wired camera network is implemented as well to establish true occupancy levels, used as ground truth for deriving algorithmic relationships with the environmental conditions. To our knowledge, this extensive and diverse ambient-sensing infrastructure of the ITEST setup, as well as its continuous data-collection capability, is unprecedented. Final results indicate that there are significant correlations between measured environmental conditions and occupancy status. An average accuracy of 73% in detecting the number of occupants was achieved by Hidden Markov Models during the testing periods. This paper serves as an exploration of ITEST research for occupancy detection in offices. In addition, its utility extends to a wide variety of other building technology research areas such as human-centered environmental control, security, and energy-efficient, sustainable green buildings. (author)
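
    The Hidden Markov Model approach mentioned above can be sketched with a toy model; the two states, the discretized CO2 observable, and all probabilities below are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

# Toy two-state occupancy HMM (0 = empty, 1 = occupied) driven by a
# discretized CO2 observation (0 = low, 1 = high).
start = np.array([0.7, 0.3])
trans = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
emit  = np.array([[0.8, 0.2],    # empty room: CO2 usually low
                  [0.3, 0.7]])   # occupied room: CO2 usually high

def viterbi(obs):
    """Most likely hidden state sequence for `obs` (log-space Viterbi)."""
    lp = np.log(start) + np.log(emit[:, obs[0]])
    back = []
    for o in obs[1:]:
        scores = lp[:, None] + np.log(trans)     # scores[i, j]: state i -> j
        back.append(scores.argmax(axis=0))
        lp = scores.max(axis=0) + np.log(emit[:, o])
    path = [int(lp.argmax())]
    for ptr in reversed(back):
        path.append(int(ptr[path[-1]]))
    return path[::-1]

print(viterbi([0, 0, 1, 1, 1, 0]))  # → [0, 0, 1, 1, 1, 1]
```

Note how the sticky "occupied" state absorbs the single trailing low-CO2 reading: the temporal smoothing of the HMM is exactly what raw thresholding of sensor values lacks.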

  6. Instrument-induced spatial crosstalk deconvolution algorithm

    Science.gov (United States)

    Wright, Valerie G.; Evans, Nathan L., Jr.

    1986-01-01

    An algorithm has been developed which reduces the effects of (deconvolves) instrument-induced spatial crosstalk in satellite image data by several orders of magnitude where highly precise radiometry is required. The algorithm is based upon radiance transfer ratios, which are defined as the fractional bilateral exchange of energy between pixels A and B.
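
    A minimal sketch of the idea, under the assumption that crosstalk acts as a known linear mixing between neighboring pixels: if the radiance transfer ratios are collected in a matrix, deconvolution reduces to solving a linear system. The detector size and ratio values are invented for illustration.

```python
import numpy as np

# Toy 1-D detector with 5 pixels. C[i, j] is the fraction of pixel j's
# radiance that appears in pixel i (the "radiance transfer ratios").
n = 5
C = np.eye(n)
for i in range(n - 1):
    C[i, i + 1] = C[i + 1, i] = 0.02      # 2% bilateral exchange with neighbors
C /= C.sum(axis=0)                         # conserve total energy per column

true_radiance = np.array([10.0, 12.0, 50.0, 12.0, 10.0])
measured = C @ true_radiance               # crosstalk-blurred measurement
recovered = np.linalg.solve(C, measured)   # deconvolution: invert the mixing
print(np.max(np.abs(recovered - true_radiance)))  # residual near machine precision
```

In practice the transfer ratios come from instrument characterization rather than being assumed, but the recovery step has the same structure.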

  7. A Method to Analyze Threats and Vulnerabilities by Using a Cyber Security Test-bed of an Operating NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong Sik; Son, Choul Woong; Lee, Soo Ill [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    In order to implement cyber security controls for an operating NPP, a security assessment should be conducted in advance, and it is essential to analyze threats and vulnerabilities during the cyber security risk assessment phase. It can be impossible to perform a penetration test or vulnerability scan because the test may adversely affect the inherent functions of the systems under test. This is why we develop and construct a cyber security test-bed instead of using the real I and C systems in the operating NPP. In this paper, we propose a method to analyze threats and vulnerabilities of a specific target system by using a cyber security test-bed. The test-bed is being developed considering essential functions of the selected safety and non-safety systems. In order to develop the cyber security test-bed with both safety and non-safety functions, a test-bed functions analysis and a preliminary identification of threats and vulnerabilities have been conducted. We will determine the attack scenarios and conduct the test-bed based vulnerability analysis.

  8. A Method to Analyze Threats and Vulnerabilities by Using a Cyber Security Test-bed of an Operating NPP

    International Nuclear Information System (INIS)

    Kim, Yong Sik; Son, Choul Woong; Lee, Soo Ill

    2016-01-01

    In order to implement cyber security controls for an operating NPP, a security assessment should be conducted in advance, and it is essential to analyze threats and vulnerabilities during the cyber security risk assessment phase. It can be impossible to perform a penetration test or vulnerability scan because the test may adversely affect the inherent functions of the systems under test. This is why we develop and construct a cyber security test-bed instead of using the real I and C systems in the operating NPP. In this paper, we propose a method to analyze threats and vulnerabilities of a specific target system by using a cyber security test-bed. The test-bed is being developed considering essential functions of the selected safety and non-safety systems. In order to develop the cyber security test-bed with both safety and non-safety functions, a test-bed functions analysis and a preliminary identification of threats and vulnerabilities have been conducted. We will determine the attack scenarios and conduct the test-bed based vulnerability analysis.

  9. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Loop-scale Testbed Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Melin, Alexander M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kisner, Roger A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging to design and operate. Extreme environments limit the options for sensors and actuators and degrade their performance. Because sensors and actuators are necessary for feedback control, these limitations mean that designing embedded instrumentation and control systems for the challenging environments of nuclear reactors requires advanced technical solutions that are not available commercially. This report details the development of a testbed that will be used for cross-cutting embedded instrumentation and control research for nuclear power applications. This research is funded by the Department of Energy's Nuclear Energy Enabling Technology program's Advanced Sensors and Instrumentation topic. The design goal of the loop-scale testbed is to build a low-temperature pump that utilizes magnetic bearings and will be incorporated into a water loop to test control system performance and self-sensing techniques. Specifically, this testbed will be used to analyze control system performance in response to nonlinear and cross-coupling fluid effects between the shaft axes of motion, rotordynamic and gyroscopic effects, and impeller disturbances. This testbed will also be used to characterize the performance losses incurred when using self-sensing position measurement techniques. Active magnetic bearings are a technology that can reduce failures and maintenance costs in nuclear power plants. They are particularly relevant to liquid salt reactors that operate at high temperatures (700 °C). Pumps used in the extreme environment of liquid salt reactors present many engineering challenges that can be overcome with magnetic bearings and their associated embedded instrumentation and control. This report gives details of the mechanical and electromagnetic design of the loop-scale embedded instrumentation and control testbed.
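
    Why magnetic bearings need embedded feedback control can be shown with a one-axis sketch: the bearing behaves like an unstable (negative-stiffness) spring, and a PD law stabilizes it. All parameters below are illustrative assumptions, not values from the testbed.

```python
# One-axis magnetic bearing sketch: m*x'' = ks*x + u, with ks > 0 destabilizing,
# stabilized by the PD law u = -Kp*x - Kd*v.
m, ks = 1.0, 100.0          # kg, N/m (negative-stiffness magnitude, assumed)
Kp, Kd = 500.0, 40.0        # gains chosen so the closed loop is well damped
x, v, dt = 1e-3, 0.0, 1e-4  # 1 mm initial rotor offset; explicit Euler step

for _ in range(20_000):     # simulate 2 s
    u = -Kp * x - Kd * v    # feedback force from the position sensor
    a = (ks * x + u) / m    # net acceleration of the rotor
    x, v = x + dt * v, v + dt * a

print(abs(x))               # rotor pulled back to (numerically) zero offset
```

With the feedback removed (Kp = Kd = 0) the same simulation diverges exponentially, which is the essential reason position sensing, including self-sensing variants, sits on the critical path for these pumps.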

  10. SCDU Testbed Automated In-Situ Alignment, Data Acquisition and Analysis

    Science.gov (United States)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.; hide

    2010-01-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily-parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.
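
    The "easily-parseable state log" idea lends itself to a short sketch. The key=value line format and the field names below are assumed for illustration, not the actual SCDU log schema.

```python
# Each log line records key=value pairs, so query scripts can filter runs
# by condition without any special parser.
log = """\
ts=100 experiment=null_scan dm_state=flat contrast=1.2e-8
ts=160 experiment=null_scan dm_state=optimized contrast=3.5e-10
ts=220 experiment=dark_hole dm_state=optimized contrast=9.9e-11
"""

def parse(line):
    """Turn 'k1=v1 k2=v2 ...' into a dict."""
    return dict(kv.split("=", 1) for kv in line.split())

records = [parse(line) for line in log.splitlines()]
optimized = [float(r["contrast"]) for r in records
             if r["dm_state"] == "optimized"]
print(min(optimized))   # best contrast among optimized-DM runs
```

Because every record is self-describing, comparing results collected under different conditions reduces to one-line filters like the list comprehension above.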

  11. JPSS CGS Tools For Rapid Algorithm Updates

    Science.gov (United States)

    Smith, D. C.; Grant, K. D.

    2011-12-01

    The National Oceanic and Atmospheric Administration (NOAA) and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation civilian weather and environmental satellite system: the Joint Polar Satellite System (JPSS). JPSS will contribute the afternoon orbit component and ground processing system of the restructured National Polar-orbiting Operational Environmental Satellite System (NPOESS). As such, JPSS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the ground processing component of both POES and the Defense Meteorological Satellite Program (DMSP) replacement known as the Defense Weather Satellite System (DWSS), managed by the Department of Defense (DoD). The JPSS satellites will carry a suite of sensors designed to collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground processing system for JPSS is known as the JPSS Common Ground System (JPSS CGS), and consists of a Command, Control, and Communications Segment (C3S) and the Interface Data Processing Segment (IDPS). Both are developed by Raytheon Intelligence and Information Systems (IIS). The Interface Data Processing Segment will process NPOESS Preparatory Project, Joint Polar Satellite System and Defense Weather Satellite System satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. Under NPOESS, Northrop Grumman Aerospace Systems Algorithms and Data Products (A&DP) organization was responsible for the algorithms that produce the EDRs, including their quality aspects. For JPSS, that responsibility has transferred to NOAA's Center for Satellite Applications & Research (STAR). As the Calibration and Validation (Cal/Val) activities move forward following both the NPP launch and subsequent JPSS and DWSS launches, rapid algorithm updates may be required. Raytheon and

  12. Iodine Satellite

    Science.gov (United States)

    Kamhawi, Hani; Dankanich, John; Martinez, Andres; Petro, Andrew

    2015-01-01

    The Iodine Satellite (iSat) spacecraft will be the first CubeSat to demonstrate high change in velocity from a primary propulsion system by using Hall thruster technology and iodine as a propellant. The mission will demonstrate CubeSat maneuverability, including plane change, altitude change and change in its closest approach to Earth to ensure atmospheric reentry in less than 90 days. The mission is planned for launch in fall 2017. Hall thruster technology is a type of electric propulsion. Electric propulsion uses electricity, typically from solar panels, to accelerate the propellant. Electric propulsion can accelerate propellant to 10 times higher velocities than traditional chemical propulsion systems, which significantly increases fuel efficiency. To enable the success of the propulsion subsystem, iSat will also demonstrate power management and thermal control capabilities well beyond the current state-of-the-art for spacecraft of its size. This technology is a viable primary propulsion system that can be used on small satellites ranging from about 22 pounds (10 kilograms) to more than 1,000 pounds (450 kilograms). iSat's fuel efficiency is ten times greater and its propulsion per volume is 100 times greater than current cold-gas systems and three times better than the same system operating on xenon. iSat's iodine propulsion system consists of a 200 watt (W) Hall thruster, a cathode, a tank to store solid iodine, a power processing unit (PPU) and the feed system to supply the iodine. This propulsion system is based on a 200 W Hall thruster developed by Busek Co. Inc., which was previously flown using xenon as the propellant. Several improvements have been made to the original system to include a compact PPU, targeting greater than 80 percent reduction in mass and volume of conventional PPU designs. The cathode technology is planned to enable heaterless cathode conditioning, significantly increasing total system efficiency. The feed system has been designed to

  13. The use of a MODIS band-ratio algorithm versus a new hybrid approach for estimating colored dissolved organic matter (CDOM)

    Science.gov (United States)

    Satellite remote sensing offers synoptic and frequent monitoring of optical water quality parameters, such as chlorophyll-a, turbidity, and colored dissolved organic matter (CDOM). While traditional satellite algorithms were developed for the open ocean, these algorithms often do...

  14. Decision tree approach for classification of remotely sensed satellite

    Indian Academy of Sciences (India)

    DTC) algorithm for classification of remotely sensed satellite data (Landsat TM) using open source support. The decision tree is constructed by recursively partitioning the spectral distribution of the training dataset using WEKA, open source ...

  15. Design and development of cell queuing, processing, and scheduling modules for the iPOINT input-buffered ATM testbed

    Science.gov (United States)

    Duan, Haoran

    1997-12-01

    This dissertation presents the concepts, principles, performance, and implementation of input queuing and cell-scheduling modules for the Illinois Pulsar-based Optical INTerconnect (iPOINT) input-buffered Asynchronous Transfer Mode (ATM) testbed. Input queuing (IQ) ATM switches are well suited to meet the requirements of current and future ultra-broadband ATM networks. The IQ structure imposes minimum memory bandwidth requirements for cell buffering, tolerates bursty traffic, and utilizes memory efficiently for multicast traffic. The lack of efficient cell queuing and scheduling solutions has been a major barrier to building high-performance, scalable IQ-based ATM switches. This dissertation proposes a new Three-Dimensional Queue (3DQ) and a novel Matrix Unit Cell Scheduler (MUCS) to remove this barrier. 3DQ uses a linked-list architecture based on Synchronous Random Access Memory (SRAM) to combine the individual advantages of per-virtual-circuit (per-VC) queuing, priority queuing, and N-destination queuing. It avoids Head of Line (HOL) blocking and provides per-VC Quality of Service (QoS) enforcement mechanisms. Computer simulation results verify the QoS capabilities of 3DQ. For multicast traffic, 3DQ provides efficient usage of cell buffering memory by storing multicast cells only once. Further, the multicast mechanism of 3DQ prevents a congested destination port from blocking other, less-loaded ports. The 3DQ principle has been prototyped in the Illinois Input Queue (iiQueue) module. Built from Field Programmable Gate Array (FPGA) devices and SRAM modules integrated on a Printed Circuit Board (PCB), iiQueue can process incoming traffic at 800 Mb/s. Using faster circuit technology, the same design is expected to operate at the OC-48 rate (2.5 Gb/s). MUCS resolves output contention by evaluating the weight index of each candidate and selecting the heaviest. It achieves near-optimal scheduling and has a very short response time. The algorithm originates from a
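As a rough software illustration of the heaviest-first idea attributed to MUCS (the actual hardware algorithm differs in detail, and the weight matrix below is invented): repeatedly grant the heaviest remaining (input, output) pair, then retire that input and output for the rest of the cell slot.

```python
def heaviest_first(weights):
    """weights[i][j]: weight index of input i requesting output j (0 = no cell).
    Returns the list of (input, output) grants for one cell slot."""
    n = len(weights)
    free_in, free_out, grants = set(range(n)), set(range(n)), []
    while True:
        candidates = [(weights[i][j], i, j)
                      for i in free_in for j in free_out if weights[i][j] > 0]
        if not candidates:
            return grants
        w, i, j = max(candidates)  # heaviest wins; ties broken arbitrarily
        grants.append((i, j))
        free_in.discard(i)
        free_out.discard(j)

# Hypothetical 3x3 weight matrix (e.g., per-VC queue occupancies).
W = [[5, 1, 0],
     [4, 0, 2],
     [0, 3, 3]]
print(heaviest_first(W))
```

Each loop iteration removes one input and one output, so at most N grants are issued per slot and no output is granted twice, which is the contention-resolution property the scheduler must guarantee.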

  16. Asteroid Satellites

    Science.gov (United States)

    Merline, W. J.

    2001-11-01

    Discovery and study of small satellites of asteroids or double asteroids can yield valuable information about the intrinsic properties of asteroids themselves and about their history and evolution. Determination of the orbits of these moons can provide precise masses of the primaries, and hence reliable estimates of the fundamental property of bulk density. This reveals much about the composition and structure of the primary and will allow us to make comparisons between, for example, asteroid taxonomic type and our inventory of meteorites. The nature and prevalence of these systems will also give clues as to the collisional environment in which they formed, and have further implications for the role of collisions in shaping our solar system. A decade ago, binary asteroids were more of a theoretical curiosity. In 1993, the Galileo spacecraft allowed the first undeniable detection of an asteroid moon, with the discovery of Dactyl, a small moon of Ida. Since that time, and particularly in the last year, the number of known binaries has risen dramatically. Previously odd-shaped and lobate near-Earth asteroids, observed by radar, have given way to signatures indicating, almost certainly, that at least four NEAs are binary systems. The tell-tale lightcurves of several other NEAs reveal a high likelihood of being double. Indications are that among the NEAs, there may be a binary frequency of several tens of percent. Among the main-belt asteroids, we now know of 6 confirmed binary systems, although their overall frequency is likely to be low, perhaps a few percent. The detections have largely come about because of significant advances in adaptive optics systems on large telescopes, which can now reduce the blurring of the Earth's atmosphere to compete with the spatial resolution of space-based imaging (which itself, via HST, is now contributing valuable observations). Most of these binary systems have similarities, but there are important exceptions. Searches among other

  17. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - evaluation summary for the San Diego testbed

    Science.gov (United States)

    2017-08-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  18. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs : Evaluation Report for the San Diego Testbed : Draft Report.

    Science.gov (United States)

    2017-07-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  20. Trends in communications satellites

    CERN Document Server

    Curtin, Denis J

    1979-01-01

    Trends in Communications Satellites offers a comprehensive look at trends and advances in satellite communications, including experimental ones such as NASA satellites and those jointly developed by France and Germany. The economic aspects of communications satellites are also examined. This book consists of 16 chapters and begins with a discussion on the fundamentals of electrical communications and their application to space communications, including spacecraft, earth stations, and orbit and wavelength utilization. The next section demonstrates how successful commercial satellite communicati

  1. Resource-Aware Data Fusion Algorithms for Wireless Sensor Networks

    CERN Document Server

    Abdelgawad, Ahmed

    2012-01-01

    This book introduces resource-aware data fusion algorithms to gather and combine data from multiple sources (e.g., sensors) in order to draw inferences. These techniques can be used in centralized and distributed systems to overcome sensor failure, technological limitations, and spatial and temporal coverage problems. The algorithms described in this book are evaluated with simulation and experimental results to show that they maintain data integrity and make data useful and informative. Describes techniques to overcome real problems posed by wireless sensor networks deployed in circumstances that might interfere with the measurements provided, such as strong variations of pressure, temperature, radiation, and electromagnetic noise; uses simulation and experimental results to evaluate the algorithms presented and includes a real test-bed; includes a case study implementing data fusion algorithms on a remote monitoring framework for sand production in oil pipelines.
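One classic fusion rule texts in this area cover is inverse-variance weighting of redundant sensor readings, which tolerates a noisy (high-variance) sensor gracefully. The readings and variances in this sketch are invented for illustration and are not drawn from the book:

```python
def fuse(readings):
    """readings: list of (value, variance) pairs from redundant sensors.
    Returns the inverse-variance-weighted estimate and its variance."""
    weights = [1.0 / var for _, var in readings]
    fused = sum(w * v for (v, _), w in zip(readings, weights)) / sum(weights)
    return fused, 1.0 / sum(weights)

# Three temperature sensors; the third is much noisier and is almost ignored.
value, var = fuse([(20.1, 0.04), (19.9, 0.04), (23.0, 4.0)])
print(round(value, 3), round(var, 4))
```

Note that the fused variance is smaller than any individual sensor's variance, which is why fusing redundant sensors improves integrity rather than merely averaging away outliers.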

  2. NPOESS Tools for Rapid Algorithm Updates

    Science.gov (United States)

    Route, G.; Grant, K. D.; Hughes, B.; Reed, B.

    2009-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system; the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes both NPP and NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. Northrop Grumman Aerospace Systems Algorithms and Data Products (A&DP) organization is responsible for the algorithms that produce the EDRs, including their quality aspects. As the Calibration and Validation activities move forward following both the NPP launch and subsequent NPOESS launches, rapid algorithm updates may be required. Raytheon and Northrop Grumman have developed tools and processes to enable changes to be evaluated, tested, and moved into the operational baseline in a rapid and efficient manner. This presentation will provide an overview of the tools available to the Cal/Val teams to ensure rapid and accurate assessment of algorithm changes, along with the processes in place to ensure baseline integrity.

  3. On-board attitude determination for the Explorer Platform satellite

    Science.gov (United States)

    Jayaraman, C.; Class, B.

    1992-01-01

    This paper describes the attitude determination algorithm for the Explorer Platform satellite. The algorithm, which is baselined on the Landsat code, is a six-element linear quadratic state estimation processor, in the form of a Kalman filter augmented by an adaptive filter process. Improvements to the original Landsat algorithm were required to meet mission pointing requirements. These consisted of a more efficient sensor processing algorithm and the addition of an adaptive filter which acts as a check on the Kalman filter during satellite slew maneuvers. A 1750A processor will be flown on board the satellite for the first time as a coprocessor (COP) in addition to the NASA Standard Spacecraft Computer. The attitude determination algorithm, which will be resident in the COP's memory, will make full use of its improved processing capabilities to meet mission requirements. Additional benefits were gained by writing the attitude determination code in Ada.
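The flight filter itself is mission code, but the underlying estimator can be suggested with a minimal single-axis sketch: a 2-state Kalman filter (attitude angle, gyro bias) propagated with gyro measurements and corrected with an attitude sensor. The 2-state reduction and all noise values below are illustrative assumptions, not the Explorer Platform's six-element design:

```python
import numpy as np

dt, q_angle, q_bias, r_meas = 0.1, 1e-5, 1e-7, 1e-2
F = np.array([[1.0, -dt], [0.0, 1.0]])   # state transition (angle, bias)
B = np.array([dt, 0.0])                  # gyro rate input mapping
H = np.array([[1.0, 0.0]])               # attitude sensor observes the angle only
Q = np.diag([q_angle, q_bias])
x = np.zeros(2)                          # state estimate [angle, gyro bias]
P = np.eye(2)                            # state covariance

def step(x, P, gyro_rate, angle_meas):
    # Propagate with the gyro, then correct with the attitude sensor.
    x = F @ x + B * gyro_rate
    P = F @ P @ F.T + Q
    y = angle_meas - H @ x               # innovation
    S = H @ P @ H.T + r_meas             # innovation covariance
    K = (P @ H.T) / S                    # Kalman gain (2x1)
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

true_bias = 0.05  # rad/s, unknown to the filter
for _ in range(500):
    # Vehicle held still: true rate is zero, so the gyro reads only its bias.
    x, P = step(x, P, gyro_rate=true_bias, angle_meas=0.0)
print(f"estimated gyro bias: {x[1]:.4f} rad/s")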

  4. Coral-based Proxy Records of Ocean Acidification: A Pilot Study at the Puerto Rico Test-bed Site

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coral cores collected nearby the Atlantic Ocean Acidification Test-bed (AOAT) at La Parguera, Puerto Rico were used to characterize the relationship between...

  5. Phased Array Antenna Testbed Development at the NASA Glenn Research Center

    Science.gov (United States)

    Lambert, Kevin M.; Kubat, Gregory; Johnson, Sandra K.; Anzic, Godfrey

    2003-01-01

    Ideal phased array antennas offer advantages for communication systems, such as wide-angle scanning and multibeam operation, which can be utilized in certain NASA applications. However, physically realizable, electronically steered, phased array antennas introduce additional system performance parameters, which must be included in the evaluation of the system. The NASA Glenn Research Center (GRC) is currently conducting research to identify these parameters and to develop the tools necessary to measure them. One of these tools is a testbed where phased array antennas may be operated in an environment that simulates their use. This paper describes the development of the testbed and its use in characterizing a particular K-Band, phased array antenna.

  6. A demonstration of remote survey and characterization of a buried waste site using the SRIP [Soldier Robot Interface Project] testbed

    International Nuclear Information System (INIS)

    Burks, B.L.; Richardson, B.S.; Armstrong, G.A.; Hamel, W.R.; Jansen, J.F.; Killough, S.M.; Thompson, D.H.; Emery, M.S.

    1990-01-01

    During FY 1990, the Oak Ridge National Laboratory (ORNL) supported the Department of Energy (DOE) Environmental Restoration and Waste Management (ER&WM) Office of Technology Development through several projects including the development of a semiautonomous survey of a buried waste site using a remotely operated all-terrain robotic testbed borrowed from the US Army. The testbed was developed for the US Army's Human Engineering Laboratory (HEL) for the US Army's Soldier Robot Interface Project (SRIP). Initial development of the SRIP testbed was performed by a team including ORNL, HEL, Tooele Army Depot, and Odetics, Inc., as an experimental testbed for a variety of human factors issues related to military applications of robotics. The SRIP testbed was made available to the DOE and ORNL for the further development required for a remote landfill survey. The robot was modified extensively, equipped with environmental sensors, and used to demonstrate an automated remote survey of Solid Waste Storage Area No. 3 (SWSA 3) at ORNL on Tuesday, September 18, 1990. Burial trenches in this area containing contaminated materials were covered with soil nearly twenty years ago. This paper describes the SRIP testbed and work performed in FY 1990 to demonstrate a semiautonomous landfill survey at ORNL. 5 refs

  7. Satellite image collection optimization

    Science.gov (United States)

    Martin, William

    2002-09-01

    Imaging satellite systems represent a high capital cost. Optimizing the collection of images is critical for both satisfying customer orders and building a sustainable satellite operations business. We describe the functions of an operational, multivariable, time-dynamic optimization system that maximizes the daily collection of satellite images. A graphical user interface allows the operator to quickly see the results of "what if" adjustments to an image collection plan. Used for both long-range planning and daily collection scheduling of Space Imaging's IKONOS satellite, the satellite control and tasking (SCT) software allows collection commands to be altered up to 10 min before upload to the satellite.
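The core scheduling trade-off, maximizing the value of collected images subject to non-overlapping collection windows, can be illustrated with the textbook weighted-interval-scheduling recurrence. This is a drastic simplification of the operational SCT optimizer, and the orders below are invented:

```python
from bisect import bisect_right

def best_plan(requests):
    """Weighted interval scheduling: maximize total value of non-overlapping
    collection windows. requests: list of (start, end, value)."""
    reqs = sorted(requests, key=lambda r: r[1])      # order by window end time
    ends = [r[1] for r in reqs]
    best = [0.0] * (len(reqs) + 1)                   # best[k]: optimum over first k
    for k, (s, e, v) in enumerate(reqs, start=1):
        p = bisect_right(ends, s, 0, k - 1)          # last request ending <= s
        best[k] = max(best[k - 1], best[p] + v)      # skip it, or take it
    return best[-1]

# Invented image requests: (window start, window end, customer value).
orders = [(0, 3, 5.0), (2, 5, 6.0), (4, 7, 5.0), (6, 9, 4.0)]
print(best_plan(orders))  # total value 10.0
```

The recurrence is the classic "take it or leave it" dynamic program: each request either joins the best plan that ends before its window opens, or is skipped.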

  8. Handbook of satellite applications

    CERN Document Server

    Madry, Scott; Camacho-Lara, Sergio

    2017-01-01

    The first edition of this groundbreaking reference work was the most comprehensive reference source available about the key aspects of the satellite applications field. This updated second edition covers the technology, the markets, applications and regulations related to satellite telecommunications, broadcasting and networking, including civilian and military systems; precise satellite navigation and timing networks (i.e. GPS and others); remote sensing and meteorological satellite systems. Created under the auspices of the International Space University based in France, this brand new edition is now expanded to cover new innovative small satellite constellations, new commercial launching systems, innovation in military application satellites and their acquisition, updated appendices, a useful glossary and more.

  9. EPIC: A Testbed for Scientifically Rigorous Cyber-Physical Security Experimentation

    OpenAIRE

    SIATERLIS CHRISTOS; GENGE BELA; HOHENADEL MARC

    2013-01-01

    Recent malware, like Stuxnet and Flame, constitute a major threat to Networked Critical Infrastructures (NCIs), e.g., power plants. They revealed several vulnerabilities in today's NCIs, but most importantly they highlighted the lack of an efficient scientific approach to conduct experiments that measure the impact of cyber threats on both the physical and the cyber parts of NCIs. In this paper we present EPIC, a novel cyber-physical testbed and a modern scientific instrument that can pr...

  10. Cooperating expert systems for Space Station - Power/thermal subsystem testbeds

    Science.gov (United States)

    Wong, Carla M.; Weeks, David J.; Sundberg, Gale R.; Healey, Kathleen L.; Dominick, Jeffrey S.

    1988-01-01

    The Systems Autonomy Demonstration Project (SADP) is a NASA-sponsored series of increasingly complex demonstrations to show the benefits of integrating knowledge-based systems with conventional process control in real-time, real-world problem domains that can facilitate the operations and availability of major Space Station distributed systems. This paper describes the system design, objectives, approaches, and status of each of the testbed knowledge-based systems. Simplified schematics of the systems are shown.

  11. Development of an Experimental Testbed for Research in Lithium-Ion Battery Management Systems

    Directory of Open Access Journals (Sweden)

    Mehdi Ferdowsi

    2013-10-01

    Advanced electrochemical batteries are becoming an integral part of a wide range of applications from household and commercial to smart grid, transportation, and aerospace applications. Among different battery technologies, lithium-ion (Li-ion) batteries are growing more and more popular due to their high energy density, high galvanic potential, low self-discharge, low weight, and the fact that they have almost no memory effect. However, one of the main obstacles facing the widespread commercialization of Li-ion batteries is the design of reliable battery management systems (BMSs). An efficient BMS ensures electrical safety during operation, while increasing battery lifetime, capacity and thermal stability. Despite the need for extensive research in this field, the majority of research conducted on Li-ion battery packs and BMS are proprietary works conducted by manufacturers. The available literature, however, provides either general descriptions or detailed analysis of individual components of the battery system, and ignores addressing details of the overall system development. This paper addresses the development of an experimental research testbed for studying Li-ion batteries and their BMS design. The testbed can be configured in a variety of cell and pack architectures, allowing for a wide range of BMS monitoring, diagnostics, and control technologies to be tested and analyzed. General considerations that should be taken into account while designing Li-ion battery systems are reviewed and different technologies and challenges commonly encountered in Li-ion battery systems are investigated. This testbed facilitates future development of more practical and improved BMS technologies with the aim of increasing the safety, reliability, and efficiency of existing Li-ion battery systems. Experimental results of initial tests performed on the system are used to demonstrate some of the capabilities of the developed research testbed. To the authors
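As a flavor of the monitoring layer such a testbed exercises, here is a minimal per-cell limit checker. The voltage, temperature, and imbalance thresholds and the pack snapshot are hypothetical, not values from the paper:

```python
# Illustrative Li-ion safety limits (hypothetical; real limits are cell-specific).
LIMITS = {"v_min": 3.0, "v_max": 4.2, "t_max_c": 60.0, "imbalance_v": 0.05}

def check_pack(cell_voltages, cell_temps_c):
    """Return a list of fault strings for one telemetry snapshot of the pack."""
    faults = []
    for k, (v, t) in enumerate(zip(cell_voltages, cell_temps_c)):
        if v < LIMITS["v_min"]:
            faults.append(f"cell {k}: under-voltage ({v:.2f} V)")
        if v > LIMITS["v_max"]:
            faults.append(f"cell {k}: over-voltage ({v:.2f} V)")
        if t > LIMITS["t_max_c"]:
            faults.append(f"cell {k}: over-temperature ({t:.1f} C)")
    if max(cell_voltages) - min(cell_voltages) > LIMITS["imbalance_v"]:
        faults.append("pack: cell imbalance exceeds balancing threshold")
    return faults

# Hypothetical 4-cell snapshot: cell 3 is hot, overcharged, and out of balance.
print(check_pack([3.95, 3.97, 3.88, 4.25], [25.0, 26.1, 25.4, 61.2]))
```

A real BMS layers control actions (contactor opening, balancing, derating) on top of detection like this; the sketch covers only the detection step.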

  12. Static and dynamic optimization of CAPE problems using a Model Testbed

    DEFF Research Database (Denmark)

    This paper presents a new computer aided tool for setting up and solving CAPE related static and dynamic optimisation problems. The Model Testbed (MOT) offers an integrated environment for setting up and solving a very large range of CAPE problems, including complex optimisation problems...... and dynamic optimisation, and how interfacing of solvers and seamless information flow can lead to more efficient solution of process design problems....

  13. A Testbed Environment for Buildings-to-Grid Cyber Resilience Research and Development

    Energy Technology Data Exchange (ETDEWEB)

    Sridhar, Siddharth; Ashok, Aditya; Mylrea, Michael E.; Pal, Seemita; Rice, Mark J.; Gourisetti, Sri Nikhil Gup

    2017-09-19

    The Smart Grid is characterized by the proliferation of advanced digital controllers at all levels of its operational hierarchy from generation to end consumption. Such controllers within modern residential and commercial buildings enable grid operators to exercise fine-grained control over energy consumption through several emerging Buildings-to-Grid (B2G) applications. Though this capability promises significant benefits in terms of operational economics and improved reliability, cybersecurity weaknesses in the supporting infrastructure could be exploited to cause a detrimental effect and this necessitates focused research efforts on two fronts. First, the understanding of how cyber attacks in the B2G space could impact grid reliability and to what extent. Second, the development and validation of cyber-physical application-specific countermeasures that are complementary to traditional infrastructure cybersecurity mechanisms for enhanced cyber attack detection and mitigation. The PNNL B2G testbed is currently being developed to address these core research needs. Specifically, the B2G testbed combines high-fidelity buildings+grid simulators, industry-grade building automation and Supervisory Control and Data Acquisition (SCADA) systems in an integrated, realistic, and reconfigurable environment capable of supporting attack-impact-detection-mitigation experimentation. In this paper, we articulate the need for research testbeds to model various B2G applications broadly by looking at the end-to-end operational hierarchy of the Smart Grid. Finally, the paper not only describes the architecture of the B2G testbed in detail, but also addresses the broad spectrum of B2G resilience research it is capable of supporting based on the smart grid operational hierarchy identified earlier.

  14. Implementation of a RPS Cyber Security Test-bed with Two PLCs

    International Nuclear Information System (INIS)

    Shin, Jinsoo; Heo, Gyunyoung; Son, Hanseong; An, Yongkyu; Rizwan, Uddin

    2015-01-01

    Our research team proposed a methodology to evaluate cyber security with a Bayesian network (BN) as a cyber security evaluation model and to help an operator, licensee, licensor, or regulator assign evaluation priorities. The methodology allows for an overall evaluation of cyber security by considering the architectural aspect of a facility and the management aspect of cyber security at the same time. To ground this model in real data, it is necessary to conduct a penetration test that simulates an actual cyber-attack. Through collaboration with the University of Illinois at Urbana-Champaign, which possesses the Tricon, a safety programmable logic controller (PLC) used at nuclear power plants, and develops test-beds for nuclear power plants, a test-bed for a reactor protection system (RPS) is being developed with these PLCs. Two PLCs are used to construct a simple test-bed for the RPS: a bi-stable processor (BP) and a coincidence processor (CP). With two PLCs, it is possible to examine cyber-attacks against devices such as the PLCs themselves, cyber-attacks against communication between devices, and the effects of one PLC on the other. Two PLCs were used to construct a test-bed for penetration testing in this study. The advantages of using two or more PLCs instead of a single PLC are as follows: 1) results of cyber-attacks reflecting the interactions among PLCs can be obtained; 2) cyber-attacks can be attempted by targeting the communication between PLCs. The true data obtained can be applied to the existing cyber security evaluation model to strengthen its realism

  15. Implementation of a RPS Cyber Security Test-bed with Two PLCs

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jinsoo; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Son, Hanseong [Joongbu Univ., Geumsan (Korea, Republic of); An, Yongkyu; Rizwan, Uddin [University of Illinois at Urbana-Champaign, Urbana (United States)

    2015-10-15

    Our research team proposed a methodology to evaluate cyber security with a Bayesian network (BN) as a cyber security evaluation model and to help an operator, licensee, licensor, or regulator assign evaluation priorities. The methodology allows for an overall evaluation of cyber security by considering the architectural aspect of a facility and the management aspect of cyber security at the same time. To ground this model in real data, it is necessary to conduct a penetration test that simulates an actual cyber-attack. Through collaboration with the University of Illinois at Urbana-Champaign, which possesses the Tricon, a safety programmable logic controller (PLC) used at nuclear power plants, and develops test-beds for nuclear power plants, a test-bed for a reactor protection system (RPS) is being developed with these PLCs. Two PLCs are used to construct a simple test-bed for the RPS: a bi-stable processor (BP) and a coincidence processor (CP). With two PLCs, it is possible to examine cyber-attacks against devices such as the PLCs themselves, cyber-attacks against communication between devices, and the effects of one PLC on the other. Two PLCs were used to construct a test-bed for penetration testing in this study. The advantages of using two or more PLCs instead of a single PLC are as follows: 1) results of cyber-attacks reflecting the interactions among PLCs can be obtained; 2) cyber-attacks can be attempted by targeting the communication between PLCs. The true data obtained can be applied to the existing cyber security evaluation model to strengthen its realism.

  16. Testbed diversity as a fundamental principle for effective ICS security research

    OpenAIRE

    Green, Benjamin; Frey, Sylvain Andre Francis; Rashid, Awais; Hutchison, David

    2016-01-01

    The implementation of diversity in testbeds is essential to understanding and improving the security and resilience of Industrial Control Systems (ICS). Employing a wide spectrum of equipment, diverse networks, and business processes, as deployed in real-life infrastructures, is particularly difficult in experimental conditions. However, this level of diversity is key from a security perspective, as attackers can exploit system particularities and process intricacies to their advantage....

  17. Satellite-Based Precipitation Datasets

    Science.gov (United States)

    Munchak, S. J.; Huffman, G. J.

    2017-12-01

    Of the possible sources of precipitation data, those based on satellites provide the greatest spatial coverage. There is a wide selection of datasets, algorithms, and versions from which to choose, which can be confusing to non-specialists wishing to use the data. The International Precipitation Working Group (IPWG) maintains tables of the major publicly available, long-term, quasi-global precipitation data sets (http://www.isac.cnr.it/ipwg/data/datasets.html), and this talk briefly reviews the various categories. As examples, NASA provides two sets of quasi-global precipitation data sets: the older Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) and current Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (GPM) mission (IMERG). Both provide near-real-time and post-real-time products that are uniformly gridded in space and time. The TMPA products are 3-hourly 0.25°x0.25° on the latitude band 50°N-S for about 16 years, while the IMERG products are half-hourly 0.1°x0.1° on 60°N-S for over 3 years (with plans to go to 16+ years in Spring 2018). In addition to the precipitation estimates, each data set provides fields of other variables, such as the satellite sensor providing estimates and estimated random error. The discussion concludes with advice about determining suitability for use, the necessity of being clear about product names and versions, and the need for continued support for satellite- and surface-based observation.
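The grid figures quoted above imply a large difference in daily data volume between the two products, which a quick back-of-envelope check makes concrete:

```python
# TMPA: 3-hourly at 0.25 deg over 50N-50S; IMERG: half-hourly at 0.1 deg over 60N-60S.

def cells_per_day(res_deg, lat_extent_deg, hours_per_step):
    """Number of grid cells produced per day for a global-longitude product."""
    lon = round(360 / res_deg)              # longitude cells
    lat = round(lat_extent_deg / res_deg)   # latitude cells over the covered band
    steps = round(24 / hours_per_step)      # time steps per day
    return lon * lat * steps

tmpa = cells_per_day(0.25, 100, 3)     # 1440 x 400 x 8
imerg = cells_per_day(0.1, 120, 0.5)   # 3600 x 1200 x 48
print(tmpa, imerg, imerg // tmpa)      # IMERG produces 45x more cells per day
```

The 45-fold increase in cells per day (before counting the extra per-cell variables) is one practical reason product and version names matter when users compare datasets.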

  18. In-Space Internet-Based Communications for Space Science Platforms Using Commercial Satellite Networks

    Science.gov (United States)

    Kerczewski, Robert J.; Bhasin, Kul B.; Fabian, Theodore P.; Griner, James H.; Kachmar, Brian A.; Richard, Alan M.

    1999-01-01

    The continuing technological advances in satellite communications and global networking have resulted in commercial systems that now can potentially provide capabilities for communications with space-based science platforms. This reduces the need for expensive government owned communications infrastructures to support space science missions while simultaneously making available better service to the end users. An interactive, high data rate Internet type connection through commercial space communications networks would enable authorized researchers anywhere to control space-based experiments in near real time and obtain experimental results immediately. A space based communications network architecture consisting of satellite constellations connecting orbiting space science platforms to ground users can be developed to provide this service. The unresolved technical issues presented by this scenario are the subject of research at NASA's Glenn Research Center in Cleveland, Ohio. Assessment of network architectures, identification of required new or improved technologies, and investigation of data communications protocols are being performed through testbed and satellite experiments and laboratory simulations.

  19. High Contrast Vacuum Nuller Testbed (VNT) Contrast, Performance and Null Control

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-01-01

    Herein we report on our Visible Nulling Coronagraph high-contrast result of 10⁹ contrast averaged over a focal plane region extending from 1-4 λ/D with the Vacuum Nuller Testbed (VNT) in a vibration-isolated vacuum chamber. The VNC is a hybrid interferometric/coronagraphic approach for exoplanet science. It operates with high Lyot stop efficiency for filled, segmented, and sparse or diluted-aperture telescopes, thereby spanning the range of potential future NASA flight telescopes. NASA Goddard Space Flight Center (GSFC) has a well-established effort to develop the VNC and its technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and its enabling technologies. These testbeds have enabled advancement of high-contrast, visible-light nulling interferometry to unprecedented levels. The VNC is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle, and achromatic phase shifters. We give an overview of the VNT and discuss the high-contrast laboratory results, the optical configuration, critical technologies, and null sensing and control.

  20. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    Science.gov (United States)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  1. Variable Coding and Modulation Experiment Using NASA's Space Communication and Navigation Testbed

    Science.gov (United States)

    Downey, Joseph A.; Mortensen, Dale J.; Evans, Michael A.; Tollis, Nicholas S.

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation (SCaN) Testbed on the International Space Station provides a unique opportunity to evaluate advanced communication techniques in an operational system. The experimental nature of the Testbed allows for rapid demonstrations while using flight hardware in a deployed system within NASA's networks. One example is variable coding and modulation, a method to increase data throughput in a communication link. This paper describes recent flight testing with variable coding and modulation over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Performance of the variable coding and modulation system is evaluated and compared to the capacity of the link, as well as to standard NASA waveforms.
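
As an illustration of the variable coding and modulation idea, the sketch below picks the most efficient DVB-S2 MODCOD whose decoding threshold a measured link SNR still clears. The threshold figures are approximate ideal-channel values for a few DVB-S2 MODCODs, and the link margin is a hypothetical choice; none of this is SCaN Testbed experiment data.

```python
# Illustrative VCM selection: choose the highest-throughput DVB-S2 MODCOD
# whose (approximate, ideal-channel) Es/N0 threshold is met with margin.

# (modcod name, spectral efficiency [bit/s/Hz], required Es/N0 [dB])
MODCODS = [
    ("QPSK 1/4",   0.49, -2.35),
    ("QPSK 1/2",   0.99,  1.00),
    ("QPSK 3/4",   1.49,  4.03),
    ("8PSK 3/4",   2.23,  7.91),
    ("16APSK 3/4", 2.97, 10.21),
    ("32APSK 4/5", 3.95, 13.64),
]

def select_modcod(esn0_db, margin_db=1.0):
    """Return the most efficient MODCOD that closes the link with margin."""
    feasible = [m for m in MODCODS if m[2] + margin_db <= esn0_db]
    if not feasible:
        return None  # link cannot close; fall back to a safe/beacon mode
    return max(feasible, key=lambda m: m[1])

print(select_modcod(5.5))   # mid-pass SNR
print(select_modcod(-3.0))  # deep fade, e.g. shadowed by ISS structure
```

In a real VCM pass the selection would track the predicted SNR profile over the orbit; here the margin simply hedges against estimation error.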

  2. Autonomous power expert fault diagnostic system for Space Station Freedom electrical power system testbed

    Science.gov (United States)

    Truong, Long V.; Walters, Jerry L.; Roth, Mary Ellen; Quinn, Todd M.; Krawczonek, Walter M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control to the Space Station Freedom Electrical Power System (SSF/EPS) testbed being developed and demonstrated at NASA Lewis Research Center. The objectives of the program are to establish artificial intelligence technology paths, to craft knowledge-based tools with advanced human-operator interfaces for power systems, and to interface and integrate knowledge-based systems with conventional controllers. The Autonomous Power EXpert (APEX) portion of the APS program will integrate a knowledge-based fault diagnostic system and a power resource planner-scheduler. Then APEX will interface on-line with the SSF/EPS testbed and its Power Management Controller (PMC). The key tasks include establishing knowledge bases for system diagnostics, fault detection and isolation analysis, on-line information accessing through PMC, enhanced data management, and multiple-level, object-oriented operator displays. The first prototype of the diagnostic expert system for fault detection and isolation has been developed. The knowledge bases and the rule-based model that were developed for the Power Distribution Control Unit subsystem of the SSF/EPS testbed are described. A corresponding troubleshooting technique is also described.
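
The fault detection and isolation idea behind a rule-based diagnostic expert system like APEX can be sketched as telemetry checked against symbolic rules. All sensor names, limits, and rules below are hypothetical illustrations, not the actual SSF/EPS testbed knowledge base.

```python
# Toy rule-based fault isolation: each rule pairs a condition over a
# telemetry snapshot with a diagnosis. Names and limits are made up.

RULES = [
    (lambda t: t["bus_voltage"] < 110.0 and t["load_current"] > 45.0,
     "overload on distribution bus"),
    (lambda t: t["bus_voltage"] < 110.0 and t["load_current"] <= 45.0,
     "source degradation or upstream switch fault"),
    (lambda t: t["switch_temp"] > 85.0,
     "thermal fault in power distribution control unit"),
]

def diagnose(telemetry):
    """Return every diagnosis whose rule condition is satisfied."""
    return [d for cond, d in RULES if cond(telemetry)]

print(diagnose({"bus_voltage": 104.2, "load_current": 51.3, "switch_temp": 60.0}))
```

A production system like APEX layers model-based reasoning and operator displays on top of this basic fire-the-matching-rules loop.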

  3. Real-Time Simulation and Hardware-in-the-Loop Testbed for Distribution Synchrophasor Applications

    Directory of Open Access Journals (Sweden)

    Matthias Stifter

    2018-04-01

    Full Text Available With the advent of Distribution Phasor Measurement Units (D-PMUs) and Micro-Synchrophasors (Micro-PMUs), situational awareness in power distribution systems is going to the next level using time-synchronization. However, designing, analyzing, and testing such accurate measurement devices remain challenging. Because available knowledge of, and operational history with, synchrophasor applications at the power distribution level are still limited, realistic simulation and validation environments are essential for D-PMU development and deployment. This paper presents a vendor-agnostic PMU real-time simulation and hardware-in-the-loop (PMU-RTS-HIL) testbed, which supports validation and studies of multiple PMUs. The network of real and virtual PMUs was built in a fully time-synchronized environment for validating PMU applications. The proposed testbed also includes an emulated communication network (CNS) layer to replicate the bandwidth, packet-loss, and collision conditions inherent to PMU data streams. Experimental results demonstrate the flexibility and scalability of the developed PMU-RTS-HIL testbed by producing large amounts of measurements under typical normal and abnormal distribution grid operating conditions.
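
The measurement at the core of each real or virtual PMU in such a testbed can be sketched as a single-bin DFT over one cycle of time-synchronized samples, yielding a synchrophasor (rms magnitude and phase). The sampling parameters below are illustrative, not taken from the paper.

```python
# Minimal one-cycle DFT synchrophasor estimator (illustrative parameters).
import cmath
import math

F_NOM = 50.0          # nominal grid frequency, Hz
FS = 1600.0           # sampling rate, Hz (32 samples per nominal cycle)
N = int(FS / F_NOM)   # samples per cycle

def phasor(samples):
    """One-cycle DFT phasor estimate: (rms magnitude, phase in radians)."""
    s = sum(samples[n] * cmath.exp(-2j * math.pi * n / N) for n in range(N))
    p = (2.0 / N) * s / math.sqrt(2.0)   # peak-to-rms scaling
    return abs(p), cmath.phase(p)

# One cycle of a 230 V rms waveform with a 0.3 rad phase offset.
wave = [230.0 * math.sqrt(2.0) * math.cos(2.0 * math.pi * F_NOM * n / FS + 0.3)
        for n in range(N)]
mag, ph = phasor(wave)
print(f"{mag:.1f} V rms at {ph:.3f} rad")
```

Real D-PMUs add off-nominal frequency tracking and timestamp the phase against GPS/PTP time; the single-bin DFT is only the starting point.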

  4. High contrast vacuum nuller testbed (VNT) contrast, performance, and null control

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-09-01

    Herein we report on our Visible Nulling Coronagraph high-contrast result of 10⁻⁹ contrast averaged over a focal plane region extending from 1 - 4 λ/D with the Vacuum Nuller Testbed (VNT) in a vibration isolated vacuum chamber. The VNC is a hybrid interferometric/coronagraphic approach for exoplanet science. It operates with high Lyot stop efficiency for filled, segmented and sparse or diluted-aperture telescopes, thereby spanning the range of potential future NASA flight telescopes. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop the VNC and its technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and its enabling technologies. These testbeds have enabled advancement of high-contrast, visible light, nulling interferometry to unprecedented levels. The VNC is based on a modified Mach-Zehnder nulling interferometer, with a “W” configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We give an overview of the VNT and discuss the high-contrast laboratory results, the optical configuration, critical technologies and null sensing and control.
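
A contrast level of 10⁻⁹ implies extraordinarily tight null control. The back-of-the-envelope below uses the standard small-error approximation for a two-beam nuller, N ≈ (δφ² + δa²)/4, with δφ the inter-arm phase error in radians and δa the fractional amplitude mismatch; this is a generic sketch, not the VNT's actual error budget.

```python
# Leading-order null-depth budget for a two-beam nulling interferometer.
import math

def null_depth(dphi, da):
    """Null depth from phase error (rad) and fractional amplitude mismatch."""
    return (dphi ** 2 + da ** 2) / 4.0

def rms_opd_for_null(n_target, wavelength_nm):
    """RMS optical path difference (nm) allowed with perfect amplitude match."""
    dphi = 2.0 * math.sqrt(n_target)
    return dphi * wavelength_nm / (2.0 * math.pi)

print(f"{rms_opd_for_null(1e-9, 633.0):.4f} nm rms OPD at 633 nm")
```

Even with perfect amplitude matching, the phase term alone confines the allowed path error to picometres, which is why such testbeds need vacuum operation, vibration isolation, and closed-loop null sensing and control.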

  5. Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.

    Energy Technology Data Exchange (ETDEWEB)

    Hebner, Gregory A.

    2017-04-01

    Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks, and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE / Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE’s science and energy mission and to identify the potential impact of these technologies.

  6. Satellite observation of particulate organic carbon dynamics in ...

    Science.gov (United States)

    Particulate organic carbon (POC) plays an important role in coastal carbon cycling and the formation of hypoxia. Yet, coastal POC dynamics are often poorly understood due to a lack of long-term POC observations and the complexity of the coastal hydrodynamic and biogeochemical processes that influence POC sources and sinks. Using field observations and satellite ocean color products, we developed a new multiple regression algorithm to estimate POC on the Louisiana Continental Shelf (LCS) from satellite observations. The algorithm had reliable performance, with a mean relative error (MRE) of ~40% and a root mean square error (RMSE) of ~50% for MODIS and SeaWiFS images for POC ranging between ~80 and ~1200 mg m⁻³, and showed similar performance for a large estuary (Mobile Bay). Substantial spatiotemporal variability in the satellite-derived POC was observed on the LCS, with high POC found on the inner shelf (satellite data with carefully developed algorithms can greatly increase
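
A sketch of how a satellite POC regression is scored with the two statistics quoted above, mean relative error (MRE) and root mean square error (RMSE). The power-law band-ratio regression and all numbers are hypothetical placeholders, not the published LCS algorithm.

```python
# Hypothetical POC regression plus the MRE/RMSE skill metrics.
import math

def poc_estimate(band_ratio, a=2.5, b=-1.3):
    """Hypothetical power-law regression: POC (mg m^-3) from a band ratio."""
    return 10.0 ** (a + b * math.log10(band_ratio))

def mre(pred, obs):
    """Mean relative error, as a fraction."""
    return sum(abs(p - o) / o for p, o in zip(pred, obs)) / len(obs)

def rmse(pred, obs):
    """Root mean square error, in the units of the observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

obs = [120.0, 400.0, 900.0]                          # in situ POC, mg m^-3
pred = [poc_estimate(r) for r in (2.1, 0.84, 0.45)]  # matched band ratios
print(f"MRE = {mre(pred, obs):.0%}, RMSE = {rmse(pred, obs):.1f} mg m^-3")
```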

  7. GPS Satellite Simulation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The GPS satellite simulation facility consists of a GPS satellite simulator controlled by either a Silicon Graphics Origin 2000 or PC depending upon unit under test...

  8. Full Scale Advanced Systems Testbed (FAST): Capabilities and Recent Flight Research

    Science.gov (United States)

    Miller, Christopher

    2014-01-01

    At the NASA Armstrong Flight Research Center, research is being conducted into flight control technologies that will enable the next generation of air and space vehicles. The Full Scale Advanced Systems Testbed (FAST) aircraft provides a laboratory for flight exploration of these technologies. In recent years, novel but simple adaptive architectures for aircraft and rockets have been researched, along with control technologies for improving aircraft fuel efficiency and control-structure interaction. This presentation outlines the FAST capabilities and provides a snapshot of the research accomplishments to date. Flight experimentation allows researchers to substantiate or invalidate their assumptions and intuition about a new technology or innovative approach. Data early in a development cycle is invaluable for determining which technology barriers are real and which ones are imagined. Data for a technology at a low TRL can be used to steer and focus the exploration and fuel rapid advances based on real-world lessons learned. It is important to identify technologies that are mature enough to benefit from flight research data, and not be tempted to wait until all the potential issues are solved before getting some data; sometimes a stagnated technology just needs a little real-world data to get it going. One trick to getting data for low-TRL technologies is finding an environment where it is okay to take risks, where occasional failure is an expected outcome; learning how things fail is often as valuable as showing that they work. FAST has been architected to facilitate this type of testing for control system technologies, specifically novel algorithms and sensors, with rapid prototyping and a quick turnaround in a fly-fix-fly paradigm; sometimes it is easier and cheaper to just go fly it than to analyze the problem to death. The goal is to find and test control technologies that would benefit from flight data and find solutions to the real barriers to innovation.

  9. Sea surface temperature estimation from NOAA-AVHRR satellite data: validation of algorithms applied to the northern coast of Chile

    Directory of Open Access Journals (Sweden)

    Juan C Parra

    2011-01-01

    Full Text Available This article applies and compares three split-window (SW) algorithms for estimating sea surface temperature from data acquired by the Advanced Very High Resolution Radiometer (AVHRR) on board the National Oceanic and Atmospheric Administration (NOAA) series of satellites. The algorithms were validated by comparison with in situ measurements of sea temperature obtained from a hydrographic buoy located off the coast of northern Chile (21°21'S, 70°6'W; Tarapacá Region), approximately 3 km from the coast. The best results were obtained by applying the algorithm proposed by Sobrino & Raissouni (2000). The mean and standard deviation of the differences between the temperatures measured in situ and those estimated by SW were 0.3° and 0.8°K, respectively.
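
The algorithms compared here share the generic split-window form: SST is the ~11 μm brightness temperature T4 corrected by terms in the 11-12 μm difference (T4 - T5), which carries the atmospheric water-vapour signal. The coefficients below are illustrative placeholders, not those of Sobrino & Raissouni (2000).

```python
# Generic split-window SST sketch with placeholder coefficients.

def split_window_sst(t4, t5, a0=1.0, a1=2.3, a2=0.5):
    """SST (K) ~ T4 + a1*(T4 - T5) + a2*(T4 - T5)**2 + a0 (illustrative)."""
    dt = t4 - t5
    return t4 + a1 * dt + a2 * dt ** 2 + a0

# Hypothetical AVHRR brightness temperatures (K) over coastal water:
print(round(split_window_sst(288.6, 287.9), 2))
```

Validation then reduces to differencing such estimates against the buoy record, exactly the mean/standard-deviation comparison reported in the abstract.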

  10. New Channel Coding Methods for Satellite Communication

    Directory of Open Access Journals (Sweden)

    J. Sebesta

    2010-04-01

    Full Text Available This paper deals with new progressive channel coding methods for short message transmission via a satellite transponder using a predetermined frame length. The key benefits of this contribution are the modification and implementation of a new turbo code and the utilization of its unique features, with applications of methods for bit error rate estimation and an algorithm for output message reconstruction. These methods allow error-free communication at a very low Eb/N0 ratio; they have been adopted for satellite communication, but they can also be applied to other systems working with very low Eb/N0 ratios.

  11. Spectrum and power allocation in cognitive multi-beam satellite communications with flexible satellite payloads

    Science.gov (United States)

    Liu, Zhihui; Wang, Haitao; Dong, Tao; Yin, Jie; Zhang, Tingting; Guo, Hui; Li, Dequan

    2018-02-01

    In this paper, a cognitive multi-beam satellite system, in which two satellite networks coexist through underlay spectrum sharing, is studied, and a power and spectrum allocation method is employed for interference control and throughput maximization. Specifically, the multi-beam satellite with a flexible payload reuses the authorized spectrum of the primary satellite, adjusting its transmission band as well as the power of each beam to limit its interference on the primary satellite below a prescribed threshold while maximizing its own achievable rate. This power and spectrum allocation problem is formulated as a mixed nonconvex program. To solve it effectively, we first introduce the concept of signal-to-leakage-plus-noise ratio (SLNR) to decouple the multiple transmit power variables in both the objective and the constraints, and then propose a heuristic algorithm to assign spectrum sub-bands. After that, a stepwise plus slice-wise algorithm is proposed to implement the discrete power allocation. Finally, simulation results show that adopting cognitive technology can improve the spectrum efficiency of satellite communication.
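
The SLNR idea can be sketched in a few lines: each beam is scored by its own channel gain over noise plus the gain it leaks toward the primary system, which decouples the per-beam power variables. The greedy sub-band rule below is a simple stand-in for the paper's heuristic, and all gains are made-up numbers.

```python
# SLNR score plus a toy greedy sub-band assignment (illustrative only).

def slnr(p, g_own, g_leak, noise):
    """Signal-to-leakage-plus-noise ratio for one beam at transmit power p."""
    return (p * g_own) / (noise + p * g_leak)

def greedy_subbands(beam_gains, bands):
    """Give each beam (in name order) the free sub-band with highest gain."""
    free = set(bands)
    assignment = {}
    for beam, gains in sorted(beam_gains.items()):
        band = max(free, key=lambda b: gains[b])
        assignment[beam] = band
        free.remove(band)
    return assignment

print(slnr(2.0, 0.8, 0.1, 0.05))
print(greedy_subbands({"beam1": {"A": 0.9, "B": 0.4},
                       "beam2": {"A": 0.7, "B": 0.6}}, ["A", "B"]))
```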

  12. Meteorological satellite systems

    CERN Document Server

    Tan, Su-Yin

    2014-01-01

    “Meteorological Satellite Systems” is a primer on weather satellites and their Earth applications. This book reviews historic developments and recent technological advancements in GEO and polar orbiting meteorological satellites. It explores the evolution of these remote sensing technologies and their capabilities to monitor short- and long-term changes in weather patterns in response to climate change. Satellites developed by various countries, such as U.S. meteorological satellites, EUMETSAT, and Russian, Chinese, Japanese and Indian satellite platforms are reviewed. This book also discusses international efforts to coordinate meteorological remote sensing data collection and sharing. This title provides a ready and quick reference for information about meteorological satellites. It serves as a useful tool for a broad audience that includes students, academics, private consultants, engineers, scientists, and teachers.

  13. Theory of geostationary satellites

    CERN Document Server

    Zee, Chong-Hung

    1989-01-01

    Geostationary or equatorial synchronous satellites are a daily reminder of our space efforts during the past two decades. The nightly television satellite weather picture, the intercontinental telecommunications of television transmissions and telephone conversations, and the establishment of educational programs in remote regions on Earth are constant reminders of the presence of these satellites. As used here, the term 'geostationary' must be taken loosely because, in the long run, the satellites will not remain 'stationary' with respect to an Earth-fixed reference frame. This results from the fact that these satellites, as is true for all satellites, are incessantly subject to perturbations other than the central-body attraction of the Earth. Among the more predominant perturbations are: the ellipticity of the Earth's equator, the Sun and Moon, and solar radiation pressure. Higher harmonics of the Earth's potential and tidal effects also influence satellite motion, but they are of second-order whe...

  14. Congestion control and routing over satellite networks

    Science.gov (United States)

    Cao, Jinhua

    Satellite networks and transmissions find their application in fields of computer communications, telephone communications, television broadcasting, transportation, space situational awareness systems and so on. This thesis mainly focuses on two issues affecting satellite networking: network congestion control and network routing optimization. Congestion, which leads to long queueing delays, packet losses or both, is a networking problem that has drawn the attention of many researchers. The goal of congestion control mechanisms is to ensure high bandwidth utilization while avoiding network congestion by regulating the rate at which traffic sources inject packets into a network. In this thesis, we propose a stable congestion controller using data-driven, safe switching control theory to improve the dynamic performance of satellite Transmission Control Protocol/Active Queue Management (TCP/AQM) networks. First, the stable region of the Proportional-Integral (PI) parameters for a nominal model is explored. Then, a PI controller, whose parameters are adaptively tuned by switching among members of a given candidate set, using observed plant data, is presented and compared with some classical AQM policy examples, such as Random Early Detection (RED) and fixed PI control. A new cost detectable switching law with an interval cost function switching algorithm, which improves the performance and also saves the computational cost, is developed and compared with a law commonly used in the switching control literature. Finite-gain stability of the system is proved. A fuzzy logic PI controller is incorporated as a special candidate to achieve good performance at all nominal points with the available set of candidate controllers. Simulations are presented to validate the theory. An efficient routing algorithm plays a key role in optimizing network resources. In this thesis, we briefly analyze Low Earth Orbit (LEO) satellite networks, review the Cross Entropy (CE
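
A toy closed-loop sketch of the PI-based AQM idea: the drop/mark probability is driven by a PI law on the queue-length error, steering the queue toward a reference. The gains and the one-line "plant" are illustrative, not the thesis's TCP/AQM fluid model or its switching controller.

```python
# Minimal PI Active Queue Management loop on a toy queue model.

def simulate_pi_aqm(k_p=0.002, k_i=0.001, q_ref=100.0, steps=200):
    """Return the final queue length after `steps` control intervals."""
    q, integ = 300.0, 0.0                   # initial queue (pkts), integrator
    for _ in range(steps):
        err = q - q_ref
        integ += err
        p_drop = min(1.0, max(0.0, k_p * err + k_i * integ))
        arrivals = 120.0 * (1.0 - p_drop)   # sources back off as p_drop rises
        q = max(0.0, q + arrivals - 110.0)  # fixed service rate
    return q

print(round(simulate_pi_aqm(), 1))
```

With these gains the loop settles near the 100-packet reference; with the controller disabled the same toy queue grows without bound, which is the congestion the AQM is there to prevent.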

  15. An optimization tool for satellite equipment layout

    Science.gov (United States)

    Qin, Zheng; Liang, Yan-gang; Zhou, Jian-ping

    2018-01-01

    Selection of a satellite equipment layout under performance constraints is a complex task that can be viewed as a constrained multi-objective optimization and a multiple criteria decision making problem. The layout design of a satellite cabin involves locating the required equipment in a limited space while satisfying various behavioral constraints of the interior and exterior environments. The layout optimization in this paper considers the C.G. offset, the moments of inertia, and the space debris impact risk of the system; the impact risk index is developed to quantify the risk of a satellite cabin coming into contact with space debris. An optimization tool integrating CAD software with the optimization algorithms is presented, developed to automatically find solutions for a three-dimensional layout of equipment in a satellite. The effectiveness of the tool is demonstrated by applying it to the layout optimization of a satellite platform.
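
Two of the layout objectives named above, the C.G. offset and the moments of inertia, have simple closed forms when equipment items are approximated as point masses. The sketch below uses that approximation with made-up item data; a real tool would pull geometry and mass properties from the CAD model.

```python
# C.G. and axis moments of inertia for point-mass equipment items.

def cg_offset(items):
    """C.G. position of point masses; items = [(mass, (x, y, z)), ...]."""
    m_tot = sum(m for m, _ in items)
    return tuple(sum(m * p[i] for m, p in items) / m_tot for i in range(3))

def principal_moments(items):
    """Ixx, Iyy, Izz of the point masses about the origin."""
    ixx = sum(m * (p[1] ** 2 + p[2] ** 2) for m, p in items)
    iyy = sum(m * (p[0] ** 2 + p[2] ** 2) for m, p in items)
    izz = sum(m * (p[0] ** 2 + p[1] ** 2) for m, p in items)
    return ixx, iyy, izz

# Hypothetical items: (mass in kg, position in m within the cabin frame)
items = [(10.0, (0.5, 0.0, 0.2)), (10.0, (-0.5, 0.0, 0.2)), (5.0, (0.0, 0.4, -0.1))]
print(cg_offset(items))
print(principal_moments(items))
```

An optimizer would then minimize, e.g., the norm of the C.G. offset subject to inertia and clearance constraints.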

  16. Algorithmic alternatives

    International Nuclear Information System (INIS)

    Creutz, M.

    1987-11-01

    A large variety of Monte Carlo algorithms are being used for lattice gauge simulations. For purely bosonic theories, present approaches are generally adequate; nevertheless, overrelaxation techniques promise savings by a factor of about three in computer time. For fermionic fields the situation is more difficult and less clear. Algorithms which involve an extrapolation to a vanishing step size are all quite closely related. Methods which do not require such an approximation tend to require computer time which grows as the square of the volume of the system. Recent developments combining global accept/reject stages with Langevin or microcanonical updates promise to reduce this growth to V^(4/3).
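
The baseline against which these algorithmic alternatives are measured is the local Metropolis accept/reject update. The toy below applies it to a one-dimensional ring of angle variables with a nearest-neighbour cosine action; this is a hypothetical toy model for illustration, not a lattice gauge simulation.

```python
# Local Metropolis sweeps on a toy 1-D ring of angles (illustrative).
import math
import random

def metropolis_sweep(field, beta, rng, step=0.5):
    """One sweep of local Metropolis updates; returns the acceptance rate."""
    n = len(field)
    accepted = 0
    for i in range(n):
        proposal = field[i] + rng.uniform(-step, step)
        left, right = field[i - 1], field[(i + 1) % n]  # periodic boundary
        old_s = -beta * (math.cos(field[i] - left) + math.cos(right - field[i]))
        new_s = -beta * (math.cos(proposal - left) + math.cos(right - proposal))
        d_s = new_s - old_s
        if d_s <= 0.0 or rng.random() < math.exp(-d_s):
            field[i] = proposal
            accepted += 1
    return accepted / n

rng = random.Random(7)
field = [0.0] * 32
rates = [metropolis_sweep(field, beta=2.0, rng=rng) for _ in range(50)]
print(f"mean acceptance: {sum(rates) / len(rates):.2f}")
```

Overrelaxation and hybrid accept/reject schemes change how proposals are generated, but the detailed-balance accept step above is the common reference point.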

  17. Combinatorial algorithms

    CERN Document Server

    Hu, T C

    2002-01-01

    Newly enlarged, updated second edition of a valuable, widely used text presents algorithms for shortest paths, maximum flows, dynamic programming and backtracking. Also discusses binary trees, heuristic and near optimums, matrix multiplication, and NP-complete problems. 153 black-and-white illus. 23 tables. New to this edition: Chapter 9

  18. Social media analytics and research testbed (SMART): Exploring spatiotemporal patterns of human dynamics with geo-targeted social media messages

    Directory of Open Access Journals (Sweden)

    Jiue-An Yang

    2016-06-01

    Full Text Available The multilevel model of meme diffusion conceptualizes how mediated messages diffuse over time and space. As a pilot application of the meme diffusion model, we developed the Social Media Analytics and Research Testbed (SMART) to monitor Twitter messages and track the diffusion of information in and across different cities and geographic regions. SMART is an online geo-targeted search and analytics tool, including an automatic data processing procedure at the backend and an interactive frontend user interface. It is initially designed to facilitate (1) searching and geo-locating tweet topics and terms in different cities and geographic regions; (2) filtering noise from raw data (such as removing redundant retweets and using machine learning methods to improve precision); (3) analyzing social media data from a spatiotemporal perspective; and (4) visualizing social media data in diagnostic ways (such as weekly and monthly trends, trend maps, top media, top retweets, top mentions, or top hashtags). SMART provides researchers and domain experts with a tool that can efficiently facilitate the refinement, formalization, and testing of research hypotheses or questions. Three case studies (flu outbreaks, the Ebola epidemic, and marijuana legalization) are introduced to illustrate how predictions of meme diffusion can be examined and to demonstrate the potential and key functions of SMART.

  19. Fine-tuning satellite-based rainfall estimates

    Science.gov (United States)

    Harsa, Hastuadi; Buono, Agus; Hidayat, Rahmat; Achyar, Jaumil; Noviati, Sri; Kurniawan, Roni; Praja, Alfan S.

    2018-05-01

    Rainfall datasets are available from various sources, including satellite estimates and ground observations. Ground observation sites are sparsely scattered, so satellite estimates are advantageous because they provide data for places where ground observations are absent. In general, however, satellite estimates contain bias, since they are the product of algorithms that transform sensor responses into rainfall values; another cause may be the limited number of ground observations used by the algorithms as the reference in determining those values. This paper describes the application of a bias correction method that modifies the satellite-based dataset by adding a number of ground observation locations that had not been used before by the algorithm. The bias correction was performed with a Quantile Mapping procedure between ground observation data and satellite estimates. Since Quantile Mapping requires the mean and standard deviation of both the reference and the data being corrected, an Inverse Distance Weighting scheme was first applied to the mean and standard deviation of the observation data to provide a spatial composition of these originally scattered quantities, so that a reference data point was available at the same location as each satellite estimate. The results show that the new dataset represents the rainfall values recorded by the ground observations statistically better than the previous dataset.
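
The Quantile Mapping step described above can be sketched in its simplest parametric form: if both distributions are assumed Gaussian, mapping satellite values onto the gauge distribution reduces to a mean/standard-deviation rescaling. The data below are synthetic, and the Gaussian assumption is an illustrative simplification of the paper's procedure.

```python
# Parametric (Gaussian) quantile mapping: rescale satellite values so their
# mean and standard deviation match the gauge statistics.
import statistics

def quantile_map_gaussian(x, sat_mean, sat_sd, obs_mean, obs_sd):
    """Map one satellite value onto the observed (Gaussian) distribution."""
    return obs_mean + (x - sat_mean) * obs_sd / sat_sd

sat = [2.0, 5.0, 9.0, 14.0, 20.0]   # satellite rainfall estimates, mm
obs_mean, obs_sd = 12.0, 8.0        # e.g. from IDW-interpolated gauge stats
sat_mean, sat_sd = statistics.mean(sat), statistics.pstdev(sat)
corrected = [quantile_map_gaussian(x, sat_mean, sat_sd, obs_mean, obs_sd)
             for x in sat]
print([round(c, 1) for c in corrected])
```

After the mapping, the corrected series carries the gauge mean and spread exactly, which is the point of the bias correction; empirical (non-parametric) quantile mapping generalizes this to arbitrary distributions.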

  20. Comparison of wavefront control algorithms and first results on the high-contrast imager for complex aperture telescopes (hicat) testbed

    Science.gov (United States)

    Leboulleux, L.; N'Diaye, M.; Mazoyer, J.; Pueyo, L.; Perrin, M.; Egron, S.; Choquet, E.; Sauvage, J.-F.; Fusco, T.; Soummer, R.

    2017-09-01

    The next generation of space telescopes for direct imaging and spectroscopy of exoplanets includes telescopes with a monolithic mirror, such as the Wide Field Infrared Survey Telescope (WFIRST) [1], and Large Ultra-Violet Optical Infrared (LUVOIR) telescopes with a segmented primary mirror, like ATLAST [2, 3] or HDST [4].

  1. The OGC Innovation Program Testbeds - Advancing Architectures for Earth and Space Systems

    Science.gov (United States)

    Bermudez, L. E.; Percivall, G.; Simonis, I.; Serich, S.

    2017-12-01

    The OGC Innovation Program provides a collaborative agile process for solving challenging science problems and advancing new technologies. Since 1999, 100 initiatives have taken place, from multi-million dollar testbeds to small interoperability experiments. During these initiatives, sponsors and technology implementers (including academia and the private sector) come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. This presentation will cover the latest system architectures for Earth and space systems resulting from OGC Testbed 13, including the following components: an elastic cloud autoscaler for Earth Observations (EO) using a WPS in an ESGF hybrid climate data research platform; accessibility of climate data for scientist and non-scientist users via on-demand models wrapped in a WPS; standard descriptions for containerized applications to discover processes on the cloud, including using linked data, a WPS extension for hybrid clouds, and links to hybrid big data stores; OpenID and OAuth to secure OGC services with built-in Attribute Based Access Control (ABAC) infrastructures leveraging GeoDRM patterns; publishing and access of vector tiles, including compression and attribute options reusing patterns from WMS, WMTS, and WFS; servers providing 3D Tiles and streaming of data, including Indexed 3d Scene Layer (I3S), CityGML, and Common DataBase (CDB); and asynchronous services with advanced push-notification strategies, using a filter language instead of simple topic subscriptions, that can be used across OGC services. Testbed 14 will continue advancing topics like Big Data, security, and streaming, as well as making OGC services easier to use (e.g., via RESTful APIs). The Call for Participation will be issued in December, with responses due in mid-January 2018.

  2. Autodriver algorithm

    Directory of Open Access Journals (Sweden)

    Anna Bourmistrova

    2011-02-01

    Full Text Available The autodriver algorithm is an intelligent method to eliminate the need for steering by a driver on a well-defined road. The proposed method performs best on a four-wheel-steering (4WS) vehicle, though it is also applicable to two-wheel-steering (TWS) vehicles. The algorithm is based on making the actual vehicle center of rotation coincide with the road center of curvature by adjusting the kinematic center of rotation. The road center of curvature is assumed to be known a priori for a given road, while the dynamic center of rotation is the output of the dynamic equations of motion of the vehicle, using steering angle and velocity measurements as inputs. We use the kinematic condition of steering to set the steering angles in such a way that the kinematic center of rotation of the vehicle sits at a desired point. At low speeds the ideal and actual paths of the vehicle are very close. With increasing forward speed, the road and tire characteristics, along with the motion dynamics of the vehicle, cause the vehicle to turn about time-varying points. By adjusting the steering angles, our algorithm controls the dynamic turning center of the vehicle so that it coincides with the road curvature center, hence keeping the vehicle on a given road autonomously. The position and orientation errors are used as feedback signals in a closed-loop control to adjust the steering angles. Application of the presented autodriver algorithm demonstrates reliable performance under different driving conditions.
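
The kinematic condition the algorithm exploits can be sketched with a bicycle-model approximation of a 4WS vehicle: front and rear steering angles are chosen so that the perpendiculars through the axles intersect at the desired centre of rotation. The geometry and dimensions below are illustrative, not taken from the article.

```python
# Steering angles placing the kinematic centre of rotation at a chosen
# lateral offset (bicycle-model approximation of a 4WS vehicle).
import math

def steering_angles(wheelbase_front, wheelbase_rear, r_lateral):
    """Front/rear steer angles (rad) for a rotation centre at lateral
    distance r_lateral (m) from the C.G."""
    delta_f = math.atan2(wheelbase_front, r_lateral)
    delta_r = -math.atan2(wheelbase_rear, r_lateral)
    return delta_f, delta_r

# Hypothetical geometry: 1.2 m C.G.-to-front axle, 1.4 m to rear axle,
# road curvature centre 25 m to the side.
df, dr = steering_angles(1.2, 1.4, 25.0)
print(round(math.degrees(df), 2), round(math.degrees(dr), 2))
```

In the full algorithm these angles are then corrected in closed loop, using position and orientation errors as feedback, so that the dynamic (not just kinematic) centre of rotation tracks the road curvature centre.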

  3. An ODMG-compatible testbed architecture for scalable management and analysis of physics data

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.

    1997-01-01

    This paper describes a testbed architecture for the investigation and development of scalable approaches to the management and analysis of massive amounts of high energy physics data. The architecture has two components: an interface layer that is compliant with a substantial subset of the ODMG-93 Version 1.2 specification, and a lightweight object persistence manager that provides flexible storage and retrieval services on a variety of single- and multi-level storage architectures, and on a range of parallel and distributed computing platforms

  4. Deployment of a Testbed in a Brazilian Research Network using IPv6 and Optical Access Technologies

    Science.gov (United States)

    Martins, Luciano; Ferramola Pozzuto, João; Olimpio Tognolli, João; Chaves, Niudomar Siqueira De A.; Reggiani, Atilio Eduardo; Hortêncio, Claudio Antonio

    2012-04-01

    This article presents the implementation of a testbed and the experimental results obtained with it on the Brazilian Experimental Network of the government-sponsored "GIGA Project." The use of IPv6 integrated with current and emerging optical architectures and technologies, such as dense wavelength division multiplexing and 10-gigabit Ethernet in the core, and gigabit-capable passive optical network and optical distribution network in the access, was tested. These protocols, architectures, and optical technologies are promising and form part of a brand new worldwide technological scenario that is being widely adopted in the networks of enterprises and providers around the world.

  5. Test-bed Assessment of Communication Technologies for a Power-Balancing Controller

    DEFF Research Database (Denmark)

    Findrik, Mislav; Pedersen, Rasmus; Hasenleithner, Eduard

    2016-01-01

    Due to the growing need for sustainable energy, an increasing number of renewable energy resources are being connected to distribution grids. In order to efficiently manage decentralized power generation units, the smart grid will rely on communication networks for information exchange and control. In this paper, we present a Smart Grid test-bed that integrates various communication technologies and deploys a power-balancing controller for LV grids. Control performance of the introduced power-balancing controller is subsequently investigated, along with its robustness to communication network cross...

  6. Communication satellite applications

    Science.gov (United States)

    Pelton, Joseph N.

    The status and future of the technologies, numbers and services provided by communications satellites worldwide are explored. The evolution of Intelsat satellites and the associated earth terminals toward high-rate all-digital telephony, data, facsimile, videophone, videoconferencing and DBS capabilities are described. The capabilities, services and usage of the Intersputnik, Eutelsat, Arabsat and Palapa systems are also outlined. Domestic satellite communications by means of the Molniya, ANIK, Olympus, Intelsat and Palapa spacecraft are outlined, noting the fast growth of the market and the growing number of different satellite manufacturers. The technical, economic and service definition issues surrounding DBS systems are discussed, along with presently operating and planned maritime and aeronautical communications and positioning systems. Features of search and rescue and tracking, data, and relay satellite systems are summarized, and services offered or which will be offered by every existing or planned communication satellite worldwide are tabulated.

  7. Satellite Ocean Biology: Past, Present, Future

    Science.gov (United States)

    McClain, Charles R.

    2012-01-01

    Since 1978, when the first satellite ocean color proof-of-concept sensor, the Nimbus-7 Coastal Zone Color Scanner, was launched, much progress has been made in refining the basic measurement concept and expanding the research applications of global satellite time series of biological and optical properties such as chlorophyll-a concentrations. The seminar will review the fundamentals of satellite ocean color measurements (sensor design considerations, on-orbit calibration, atmospheric corrections, and bio-optical algorithms), scientific results from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) and Moderate Resolution Imaging Spectroradiometer (MODIS) missions, and the goals of future NASA missions such as PACE, Aerosol, Cloud, Ecology (ACE), and Geostationary Coastal and Air Pollution Events (GeoCAPE).

  8. Parallel algorithms on the ASTRA SIMD machine

    International Nuclear Information System (INIS)

    Odor, G.; Rohrbach, F.; Vesztergombi, G.; Varga, G.; Tatrai, F.

    1996-01-01

    In view of the tremendous jump in computing power of modern RISC processors, interest in parallel computing seems to be thinning out. Why use a complicated system of parallel processors if the problem can be solved by a single powerful micro-chip? It is a general law, however, that exponential growth always ends in some kind of saturation, and then parallelism will again become a hot topic. We try to prepare ourselves for this eventuality. The MPPC project started in 1990 in the heyday of parallelism and produced four ASTRA machines (presented at CHEP '92) with 4k processors (expandable to 16k) based on yesterday's chip technology (chip presented at CHEP '91). These machines now provide excellent test-beds for algorithmic developments in a complete, real environment. We are developing, for example, fast pattern-recognition algorithms which could be used in high-energy physics experiments at the LHC (planned to be operational after 2004 at CERN) for triggering and data reduction. The basic feature of our ASP (Associative String Processor) approach is to use extremely simple (thus very cheap) processor elements, but in huge quantities (up to millions of processors) connected together by a very simple string-like communication chain. In this paper we present powerful algorithms based on this architecture, indicating the performance perspectives if the hardware quality reaches present or even future technology levels. (author)
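The core ASP idea, broadcasting a query so that every simple processor element compares its own datum in parallel, can be sketched in vectorized form. This is an illustration of the concept only, not the ASTRA hardware or its instruction set:

```python
import numpy as np

# Each element of `cells` stands for one processor element in the string;
# an associative match broadcasts the query to all elements at once and
# every PE compares locally -- the essence of the ASP approach.
def associative_match(cells, pattern):
    """Return the start indices where `pattern` occurs, with all candidate
    windows compared simultaneously rather than in a sequential scan."""
    m = len(pattern)
    # Build a (n - m + 1, m) view of all windows and compare them in one shot.
    windows = np.lib.stride_tricks.sliding_window_view(cells, m)
    hits = np.all(windows == pattern, axis=1)
    return np.flatnonzero(hits)

cells = np.array([3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5])
print(associative_match(cells, np.array([1, 5])))  # -> [3]
```

On real ASP hardware the comparison happens in every processor element at once, so the match time is independent of the string length: the property that makes the approach attractive for trigger-level pattern recognition.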

  9. Experimental validation of a distributed algorithm for dynamic spectrum access in local area networks

    DEFF Research Database (Denmark)

    Tonelli, Oscar; Berardinelli, Gilberto; Tavares, Fernando Menezes Leitão

    2013-01-01

    Next generation wireless networks aim at a significant improvement in spectral efficiency in order to meet the dramatic increase in data service demand. In local area scenarios, user-deployed base stations are expected to take place, thus making the centralized planning of frequency resources … activities with the Autonomous Component Carrier Selection (ACCS) algorithm, a distributed solution for interference management among small neighboring cells. A preliminary evaluation of the algorithm's performance is provided, considering its live execution on a software defined radio network testbed…

  10. Satellite services system overview

    Science.gov (United States)

    Rysavy, G.

    1982-01-01

    The benefits of a satellite services system and the basic needs of the Space Transportation System to have improved satellite service capability are identified. Specific required servicing equipment are discussed in terms of their technology development status and their operative functions. Concepts include maneuverable television systems, extravehicular maneuvering unit, orbiter exterior lighting, satellite holding and positioning aid, fluid transfer equipment, end effectors for the remote manipulator system, teleoperator maneuvering system, and hand and power tools.

  11. Traffic sharing algorithms for hybrid mobile networks

    Science.gov (United States)

    Arcand, S.; Murthy, K. M. S.; Hafez, R.

    1995-01-01

    In a hybrid (terrestrial + satellite) mobile personal communications network environment, a large satellite footprint (supercell) overlays a large number of smaller, contiguous terrestrial cells. We assume that users have either a terrestrial-only single mode terminal (SMT) or a terrestrial/satellite dual mode terminal (DMT), and that the ratio of DMTs to total terminals is defined as gamma. It is assumed that call assignments to, and handovers between, terrestrial cells and satellite supercells take place dynamically when necessary. The objectives of this paper are twofold: (1) to propose and define a class of traffic sharing algorithms that manage terrestrial and satellite network resources efficiently by handling call handovers dynamically, and (2) to analyze and evaluate the algorithms by maximizing the traffic load handling capability (defined in erl/cell) over a wide range of terminal ratios (gamma), given an acceptable range of blocking probabilities. Two of the algorithms (G and S) in the proposed class perform extremely well over a wide range of gamma.
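The abstract does not give the G and S algorithms themselves; as an illustration of the kind of analysis involved, the sketch below uses the Erlang B formula to estimate blocking when dual-mode-terminal traffic blocked in a terrestrial cell overflows to the satellite supercell. All capacities and loads are made-up numbers, and treating overflow traffic as Poisson is a deliberately crude approximation:

```python
def erlang_b(servers, offered_load):
    """Erlang B blocking probability via the standard stable recursion."""
    b = 1.0
    for s in range(1, servers + 1):
        b = offered_load * b / (s + offered_load * b)
    return b

# Hypothetical numbers: 20 channels per terrestrial cell, 100 satellite
# channels shared by 50 cells, 15 erl offered per cell, gamma = 0.4 DMT share.
gamma = 0.4
a_cell = 15.0
p_terr = erlang_b(20, a_cell)             # blocking in a terrestrial cell
overflow = 50 * gamma * a_cell * p_terr   # DMT traffic retried on the satellite
p_sat = erlang_b(100, overflow)           # crude: treats overflow as Poisson
p_smt = p_terr                            # single-mode: terrestrial only
p_dmt = p_terr * p_sat                    # dual-mode: blocked on both layers
print(f"SMT blocking {p_smt:.3f}, DMT blocking {p_dmt:.6f}")
```

Even this toy model shows the qualitative effect the paper exploits: dual-mode terminals see far lower blocking because the supercell pools the overflow from many cells.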

  12. On-wire lithography-generated molecule-based transport junctions: a new testbed for molecular electronics.

    Science.gov (United States)

    Chen, Xiaodong; Jeon, You-Moon; Jang, Jae-Won; Qin, Lidong; Huo, Fengwei; Wei, Wei; Mirkin, Chad A

    2008-07-02

    On-wire lithography (OWL) fabricated nanogaps are used as a new testbed to construct molecular transport junctions (MTJs) through the assembly of thiolated molecular wires across a nanogap formed between two Au electrodes. In addition, we show that one can use OWL to rapidly characterize an MTJ and optimize gap size for two molecular wires of different dimensions. Finally, we have used this new testbed to identify unusual temperature-dependent transport mechanisms for alpha,omega-dithiol terminated oligo(phenylene ethynylene).

  13. Design and construction of a 76m long-travel laser enclosure for a space occulter testbed

    Science.gov (United States)

    Galvin, Michael; Kim, Yunjong; Kasdin, N. Jeremy; Sirbu, Dan; Vanderbei, Robert; Echeverri, Dan; Sagolla, Giuseppe; Rousing, Andreas; Balasubramanian, Kunjithapatham; Ryan, Daniel; Shaklan, Stuart; Lisman, Doug

    2016-07-01

    Princeton University is upgrading its space occulter testbed. In particular, we are lengthening it to 76 m to achieve flight-like Fresnel numbers. This much longer testbed required an all-new enclosure design, in which we prioritized modularity and the use of commercial off-the-shelf (COTS) and semi-COTS components. The technical challenges encountered included an unexpected slow beam drift and black paint selection. Herein we describe the design and construction of this long-travel laser enclosure.

  14. Algorithmic Self

    DEFF Research Database (Denmark)

    Markham, Annette

    This paper takes an actor network theory approach to explore some of the ways that algorithms co-construct identity and relational meaning in contemporary use of social media. Based on intensive interviews with participants as well as activity logging and data tracking, the author presents a richly layered set of accounts to help build our understanding of how individuals relate to their devices, search systems, and social network sites. This work extends critical analyses of the power of algorithms in implicating the social self by offering narrative accounts from multiple perspectives. It also contributes an innovative method for blending actor network theory with symbolic interaction to grapple with the complexity of everyday sensemaking practices within networked global information flows.

  15. Development of Research Reactor Simulator and Its Application to Dynamic Test-bed

    International Nuclear Information System (INIS)

    Kwon, Kee Choon; Park, Jae Chang; Lee, Seung Wook; Bang, Dane; Bae, Sung Won

    2014-01-01

    We developed real-time simulators for HANARO and the Jordan Research and Training Reactor (JRTR) for operating staff training. The main purpose of these simulators is operator training, but we modified them into a dynamic test-bed to test the reactor regulating system in HANARO or JRTR before installation. The simulator configuration is divided into hardware and software. The simulator hardware consists of a host computer, 6 operator stations, a network switch, and a large display panel. The simulator software is divided into three major parts: a mathematical modeling module, which executes the plant dynamic modeling program in real-time; an instructor station module that manages user instructions; and a human machine interface (HMI) module. The developed research reactor simulators are installed in the Korea Atomic Energy Research Institute nuclear training center for reactor operator training. To use the simulator as a dynamic test-bed, the reactor regulating system modeling software of the simulator was replaced by a hardware controller, and the simulator and target controller were interfaced with a hard-wired and network-based interface.

  16. Development of research reactor simulator and its application to dynamic test-bed

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Baang, Dane; Park, Jae-Chang; Lee, Seung-Wook; Bae, Sung Won

    2014-01-01

    We developed a real-time simulator for the High-flux Advanced Neutron Application Reactor (HANARO) and the Jordan Research and Training Reactor (JRTR). The main purpose of this simulator is operator training, but we modified it into a dynamic test-bed (DTB) to test the functions and dynamic control performance of the reactor regulating system (RRS) in HANARO or JRTR before installation. The simulator hardware consists of a host computer, 6 operator stations, a network switch, and a large display panel. The software includes a mathematical model that implements plant dynamics in real-time, an instructor station module that manages user instructions, and a human machine interface module. The developed research reactor simulators are installed in the Korea Atomic Energy Research Institute nuclear training center for reactor operator training. To use the simulator as a dynamic test-bed, the reactor regulating system modeling software of the simulator was replaced by an actual RRS cabinet, interfaced through a hard-wired and network-based interface. The RRS cabinet generates control signals for reactor power control based on the various feedback signals from the DTB, and the DTB runs the plant dynamics based on the RRS control signals. Thus, hardware-in-the-loop simulation between the RRS and the emulated plant (DTB) has been implemented and tested in this configuration. The test results show that the developed DTB and the actual RRS cabinet work together in real time, with quite good dynamic control performance. (author)

  17. Test-bed for the remote health monitoring system for bridge structures using FBG sensors

    Science.gov (United States)

    Lee, Chin-Hyung; Park, Ki-Tae; Joo, Bong-Chul; Hwang, Yoon-Koog

    2009-05-01

    This paper reports on a test-bed for the long-term health monitoring of bridge structures employing fiber Bragg grating (FBG) sensors, remotely accessible via the web, which provides real-time quantitative information on a bridge's response to live loading and environmental changes, together with fast prediction of the structure's integrity. The sensors are attached at several locations on the structure and connected to a data acquisition system permanently installed onsite. The system can be accessed through remote communication over an optical cable network, allowing the bridge's behavior under live loading to be evaluated far from the field. Live structural data are transmitted continuously to the server computer at the central office. The server computer is securely connected to the internet, where data can be retrieved, processed, and stored for remote web-based health monitoring. The test-bed showed that remote health monitoring technology will enable practical, cost-effective, and reliable condition assessment and maintenance of bridge structures.

  18. Adaptive Coding and Modulation Experiment With NASA's Space Communication and Navigation Testbed

    Science.gov (United States)

    Downey, Joseph; Mortensen, Dale; Evans, Michael; Briones, Janette; Tollis, Nicholas

    2016-01-01

    The National Aeronautics and Space Administration (NASA) Space Communication and Navigation (SCaN) Testbed is an advanced integrated communication payload on the International Space Station. This paper presents results from an adaptive coding and modulation (ACM) experiment over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options, and uses the Space Data Link Protocol (a Consultative Committee for Space Data Systems (CCSDS) standard) for the uplink and downlink data framing. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Several approaches for improving the ACM system are presented, including predictive and learning techniques to accommodate signal fades. Performance of the system is evaluated as a function of end-to-end system latency (round-trip delay) and compared to the capacity of the link. Finally, improvements over standard NASA waveforms are presented.
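The heart of an ACM scheme like the one described is a MODCOD selection rule driven by the reported link quality. A minimal sketch follows; the table values are placeholders rather than the DVB-S2 specification thresholds, and the fixed fade margin stands in for the predictive techniques mentioned in the abstract:

```python
# Illustrative MODCOD table: (name, spectral efficiency in bit/s/Hz,
# assumed Es/N0 threshold in dB). Thresholds are placeholders only.
MODCODS = [
    ("QPSK 1/2", 1.0, 1.0),
    ("QPSK 3/4", 1.5, 4.0),
    ("8PSK 3/4", 2.2, 7.9),
    ("16APSK 3/4", 3.0, 10.2),
]

def select_modcod(esn0_db, margin_db=1.0):
    """Pick the highest-efficiency MODCOD whose threshold, plus a fade
    margin, is still below the reported Es/N0; fall back to the most
    robust entry when the link is poor."""
    best = MODCODS[0]
    for mc in MODCODS:  # table is ordered by increasing efficiency
        if esn0_db >= mc[2] + margin_db:
            best = mc
    return best

print(select_modcod(8.5))  # enough SNR for QPSK 3/4, but not 8PSK 3/4 + margin
```

In a real system the round-trip delay matters because the Es/N0 report is stale by the time the transmitter acts on it, which is exactly why the paper evaluates performance as a function of end-to-end latency.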

  19. FloorNet: Deployment and Evaluation of a Multihop Wireless 802.11 Testbed

    Directory of Open Access Journals (Sweden)

    Zink Michael

    2010-01-01

    Full Text Available A lot of attention has been given to multihop wireless networks lately, but further research—in particular, through experimentation—is needed. This attention has motivated an increase in the number of 802.11-based deployments, both indoor and outdoor. These testbeds, which require a significant amount of resources during both deployment and maintenance, are used to run measurements in order to analyze and understand the limitations and differences between analytical or simulation-based figures and the results from real-life experimentation. This paper makes two major contributions: (i) first, we describe a novel wireless multihop testbed, which we name FloorNet, that is deployed and operated under the false floor of a lab in our Computer Science building. This false floor provides strong physical protection that prevents disconnections or misplacements, as well as radio shielding (to some extent) thanks to the false floor panels—this latter feature is assessed through experimentation; (ii) second, by running exhaustive and controlled experiments we are able to analyze the performance limits of commercial off-the-shelf hardware, as well as to derive practical design criteria for the deployment and configuration of mesh networks. These results both provide valuable insights into wireless multihop performance and prove that FloorNet constitutes a valuable asset for research on wireless mesh networks.

  20. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    Science.gov (United States)

    Knote, Christoph; Barré, Jérôme; Eckl, Max

    2018-02-01

    The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.
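The OSSE workflow BEATBOX supports — run a truth model, simulate an observation with error, update an ensemble of perturbed runs — can be illustrated with a toy one-species box model and a scalar Kalman update. This is a sketch of the general idea, not BEATBOX's actual interface:

```python
import numpy as np

rng = np.random.default_rng(0)

def box_model(c0, k=0.1, nsteps=10, dt=1.0):
    """One-species box model with first-order loss: dc/dt = -k * c."""
    c = c0
    for _ in range(nsteps):
        c = c - k * c * dt
    return c

# OSSE setup: a "truth" run generates a synthetic observation...
truth = box_model(10.0)
obs_err = 0.2
obs = truth + rng.normal(0.0, obs_err)

# ...and an ensemble of perturbed initial conditions is updated against it.
ens_c0 = 10.0 + rng.normal(0.0, 2.0, size=50)
forecast = np.array([box_model(c0) for c0 in ens_c0])
var_f = forecast.var(ddof=1)
gain = var_f / (var_f + obs_err**2)        # scalar Kalman gain
analysis = forecast + gain * (obs - forecast)
print(f"forecast mean {forecast.mean():.2f}, "
      f"analysis mean {analysis.mean():.2f}, truth {truth:.2f}")
```

Because the observation was generated from a known truth, the experimenter can diagnose exactly how much of the remaining error comes from the model, the observation operator, or the assimilation settings — the point of the OSSE framing.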

  1. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    Directory of Open Access Journals (Sweden)

    C. Knote

    2018-02-01

    Full Text Available The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.

  2. Open-Source Based Testbed for Multioperator 4G/5G Infrastructure Sharing in Virtual Environments

    Directory of Open Access Journals (Sweden)

    Ricardo Marco Alaez

    2017-01-01

    Full Text Available Fourth-Generation (4G) mobile networks are based on Long-Term Evolution (LTE) technologies and are being deployed worldwide, while research on further evolution towards the Fifth Generation (5G) has recently been initiated. 5G will feature advanced network infrastructure sharing capabilities among different operators. Therefore, an open-source implementation of 4G/5G networks with this capability is crucial to enable early research in this area. The main contribution of this paper is the design and implementation of such a 4G/5G open-source testbed to investigate multioperator infrastructure sharing capabilities executed in virtual architectures. The proposed design and implementation enable the virtualization and sharing of some of the components of the LTE architecture. A testbed has been implemented and validated, with intensive empirical experiments conducted to confirm the suitability of virtualizing LTE components in virtual infrastructures (i.e., infrastructures with multitenancy sharing capabilities). The impact of the proposed technologies can lead to significant savings in both capital and operational costs for mobile telecommunication operators.

  3. A Functional Neuroimaging Analysis of the Trail Making Test-B: Implications for Clinical Application

    Directory of Open Access Journals (Sweden)

    Mark D. Allen

    2011-01-01

    Full Text Available Recent progress has been made using fMRI as a clinical assessment tool, often employing analogues of traditional “paper and pencil” tests. The Trail Making Test (TMT), popular for years as a neuropsychological exam, has been largely ignored in the realm of neuroimaging, most likely because its physical format and administration do not lend themselves to straightforward adaptation as an fMRI paradigm. Likewise, there is relatively more ambiguity about the neural systems associated with this test than with many other tests of comparable clinical use. In this study, we describe an fMRI version of the Trail Making Test-B (TMTB) that maintains the core functionality of the TMT while optimizing its use for both research and clinical settings. Subjects (N = 32) were administered the Functional Trail Making Test-B (f-TMTB). Brain region activations elicited by the f-TMTB were consistent with expectations given by prior TMT neurophysiological studies, including significant activations in the ventral and dorsal visual pathways and the medial pre-supplementary motor area. The f-TMTB was further evaluated for concurrent validity with the traditional TMTB using an additional sample of control subjects (N = 100). Together, these results support the f-TMTB as a viable neuroimaging adaptation of the TMT, optimized to evoke maximally robust fMRI activation with minimal time and equipment requirements.

  4. PlanetLab Europe as Geographically-Distributed Testbed for Software Development and Evaluation

    Directory of Open Access Journals (Sweden)

    Dan Komosny

    2015-01-01

    Full Text Available In this paper, we analyse the use of PlanetLab Europe for the development and evaluation of geographically-oriented Internet services. PlanetLab is a global research network whose main purpose is to support the development of new Internet services and protocols. PlanetLab is divided into several branches; one of them is PlanetLab Europe, which consists of about 350 nodes at 150 geographically distinct sites. The nodes are accessible by remote login, and users can run their software on them. In the paper, we study the properties of PlanetLab that are significant for its use as a geographically distributed testbed, including node position accuracy, service availability, and stability. We find a considerable number of location inaccuracies and a number of services that cannot be considered reliable. Based on the results, we propose a simple approach to node selection in testbeds for the development and evaluation of geographically-oriented Internet services.
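The abstract does not spell out the proposed selection approach; a hypothetical sketch of the general idea — filter candidate nodes on availability and verified location before using them in a geographically-oriented experiment — might look like:

```python
# Hypothetical node records: (hostname, observed uptime fraction,
# whether the advertised location was independently verified).
nodes = [
    ("planetlab1.example.fr", 0.99, True),
    ("planetlab2.example.de", 0.60, True),
    ("planetlab1.example.cz", 0.97, False),
    ("planetlab3.example.it", 0.95, True),
]

def select_nodes(nodes, min_uptime=0.9):
    """Keep only stable nodes whose advertised location checks out;
    both criteria matter for geolocation-sensitive experiments."""
    return [name for name, uptime, loc_ok in nodes
            if uptime >= min_uptime and loc_ok]

print(select_nodes(nodes))  # -> ['planetlab1.example.fr', 'planetlab3.example.it']
```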

  5. Establishment of a sensor testbed at NIST for plant productivity monitoring

    Science.gov (United States)

    Allen, D. W.; Hutyra, L.; Reinmann, A.; Trlica, A.; Marrs, J.; Jones, T.; Whetstone, J. R.; Logan, B.; Reblin, J.

    2017-12-01

    Accurate assessment of biogenic carbon fluxes is challenging. Correlating optical signatures with plant activity allows large regions to be monitored. New methods, including solar-induced fluorescence (SIF), promise to provide more timely and accurate estimates of plant activity, but we are still developing a full understanding of the mechanistic linkage between plant assimilation of carbon and SIF. We have initiated a testbed at the NIST headquarters to facilitate the evaluation of sensors and methods for remote monitoring of plant activity. The testbed utilizes a forested area of mature trees in a mixed urban environment. A 1 hectare plot within the 26 hectare forest has been instrumented for ecophysiological measurements, with an edge (100 m long) that is persistently monitored with multimodal optical sensors (SIF spectrometers, hyperspectral imagers, thermal infrared imaging, and lidar). This biological testbed has the advantage of direct access to the national measurement scales maintained by NIST for both the physical and optical measurements of interest. We offer a description of the test site, the sensors, and preliminary results from the first season of observations for ecological, physiological, and remote-sensing-based estimates of ecosystem productivity.

  6. The Objectives of NASA's Living with a Star Space Environment Testbed

    Science.gov (United States)

    Barth, Janet L.; LaBel, Kenneth A.; Brewer, Dana; Kauffman, Billy; Howard, Regan; Griffin, Geoff; Day, John H. (Technical Monitor)

    2001-01-01

    NASA is planning to fly a series of Space Environment Testbeds (SET) as part of the Living With A Star (LWS) Program. The goal of the testbeds is to improve and develop capabilities to mitigate and/or accommodate the effects of solar variability on spacecraft and avionics design and operation. This will be accomplished by performing technology validation in space to enable routine operations, characterize technology performance in space, and improve and develop models, guidelines, and databases. The anticipated result of the LWS/SET program is improved spacecraft performance, design, and operation for survival of the radiation, spacecraft charging, meteoroid, orbital debris, and thermosphere/ionosphere environments. The program calls for a series of NASA Research Announcements (NRAs) to be issued to solicit flight validation experiments, improvements in environment effects models and guidelines, and collateral environment measurements. The selected flight experiments may fly on the SET experiment carriers and on flights of opportunity on other commercial and technology missions. This paper presents the status of the project so far, including a description of the types of experiments that are intended to fly on SET-1 and a description of the SET-1 carrier parameters.

  7. Carrier Plus: A sensor payload for Living With a Star Space Environment Testbed (LWS/SET)

    Science.gov (United States)

    Marshall, Cheryl J.; Moss, Steven; Howard, Regan; LaBel, Kenneth A.; Grycewicz, Tom; Barth, Janet L.; Brewer, Dana

    2003-01-01

    The Defense Threat Reduction Agency (DTRA) and the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center are collaborating to develop the Carrier Plus sensor experiment platform as a capability of the Space Environment Testbed (SET), which provides flight opportunities for technology experiments as part of NASA's Living With a Star (LWS) program. The Carrier Plus will provide a new capability to characterize sensor technologies, such as state-of-the-art visible focal plane arrays (FPAs), in a natural space radiation environment. The technical objectives include on-orbit validation of recently developed FPA technologies and performance prediction methodologies, as well as characterization of the FPA radiation response to total ionizing dose damage, displacement damage, and transients. It is expected that the sensor experiment will carry 4-6 FPAs and associated radiation correlative environment monitors (CEMs) for a 2006-2007 launch. Sensor technology candidates may include n- and p-channel charge-coupled devices (CCDs), active pixel sensors (APS), and hybrid CMOS arrays. The presentation will describe the Carrier Plus goals and objectives, as well as provide details about the architecture and design. More information on the LWS program can be found at http://lws.gsfc.nasa.gov/. Business announcements for LWS/SET and program briefings are posted at http://lws-set.gsfc.nasa.gov

  8. Development of Research Reactor Simulator and Its Application to Dynamic Test-bed

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Park, Jae Chang; Lee, Seung Wook; Bang, Dane; Bae, Sung Won [KAERI, Daejeon (Korea, Republic of)

    2014-08-15

    We developed real-time simulators for HANARO and the Jordan Research and Training Reactor (JRTR) for operating staff training. The main purpose of these simulators is operator training, but we modified them into a dynamic test-bed to test the reactor regulating system in HANARO or JRTR before installation. The simulator configuration is divided into hardware and software. The simulator hardware consists of a host computer, 6 operator stations, a network switch, and a large display panel. The simulator software is divided into three major parts: a mathematical modeling module, which executes the plant dynamic modeling program in real-time; an instructor station module that manages user instructions; and a human machine interface (HMI) module. The developed research reactor simulators are installed in the Korea Atomic Energy Research Institute nuclear training center for reactor operator training. To use the simulator as a dynamic test-bed, the reactor regulating system modeling software of the simulator was replaced by a hardware controller, and the simulator and target controller were interfaced with a hard-wired and network-based interface.

  9. The Orlando TDWR testbed and airborne wind shear data comparison results

    Science.gov (United States)

    Campbell, Steven; Berke, Anthony; Matthews, Michael

    1992-01-01

    The focus of this talk is on comparing Terminal Doppler Weather Radar (TDWR) and airborne wind shear data in computing a microburst hazard index called the F factor. The TDWR is a ground-based system for detecting wind shear hazards to aviation in the terminal area. The Federal Aviation Administration will begin deploying TDWR units near 45 airports in late 1992. As part of this development effort, M.I.T. Lincoln Laboratory operates, under F.A.A. support, a TDWR testbed radar in Orlando, FL. During the past two years, a series of flight tests has been conducted with instrumented aircraft penetrating microburst events while under testbed radar surveillance. These tests were carried out with a Cessna Citation 2 aircraft operated by the University of North Dakota (UND) Center for Aerospace Sciences in 1990, and with a Boeing 737 operated by NASA Langley Research Center in 1991. A database of approximately 60 instrumented microburst penetrations has been obtained from these flights.
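The F factor referred to above is commonly defined, following the NASA/FAA wind shear program, from the rate of change of the horizontal wind along the flight path and the vertical wind. A small sketch with made-up example numbers:

```python
G = 9.81  # gravitational acceleration, m/s^2

def f_factor(dwx_dt, w_vert, airspeed):
    """Microburst hazard index F = (dWx/dt)/g - w/V, where dWx/dt is the
    rate of tailwind increase along the flight path (m/s^2), w the vertical
    wind (m/s, positive up), and V the true airspeed (m/s). Positive F
    means the wind field is draining energy from the aircraft; values
    above roughly 0.1 are commonly treated as hazardous."""
    return dwx_dt / G - w_vert / airspeed

# A 0.5 m/s^2 tailwind increase combined with a 4 m/s downdraft at 75 m/s:
print(round(f_factor(0.5, -4.0, 75.0), 3))  # -> 0.104, above the 0.1 threshold
```

Both the radar and the airborne sensors estimate the same quantity from different vantage points, which is what makes the index a natural basis for the ground/air comparison described above.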

  10. Algorithmic Relative Complexity

    Directory of Open Access Journals (Sweden)

    Daniele Cerra

    2011-04-01

    Full Text Available Information content and compression are tightly related concepts that can be addressed through both classical and algorithmic information theories, on the basis of Shannon entropy and Kolmogorov complexity, respectively. The definition of several entities in Kolmogorov’s framework relies upon ideas from classical information theory, and these two approaches share many common traits. In this work, we expand the relations between these two frameworks by introducing algorithmic cross-complexity and relative complexity, counterparts of the cross-entropy and relative entropy (or Kullback-Leibler divergence) found in Shannon’s framework. We define the cross-complexity of an object x with respect to another object y as the amount of computational resources needed to specify x in terms of y, and the complexity of x related to y as the compression power which is lost when adopting such a description for x, compared to the shortest representation of x. Properties of analogous quantities in classical information theory hold for these new concepts. As these notions are incomputable, a suitable approximation based upon data compression is derived to enable the application to real data, yielding a divergence measure applicable to any pair of strings. Example applications are outlined, involving authorship attribution and satellite image classification, as well as a comparison to similar established techniques.
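The compression-based approximation mentioned at the end of the abstract can be sketched with a general-purpose compressor; the sketch below follows the generic normalized-compression-distance recipe rather than the paper's exact formula:

```python
import zlib

def c(data: bytes) -> int:
    """Approximate the (incomputable) Kolmogorov complexity of a string
    by its compressed length."""
    return len(zlib.compress(data, 9))

def divergence(x: bytes, y: bytes) -> float:
    """Compression-based stand-in for algorithmic relative complexity:
    the extra compressed bytes needed to describe one string given the
    other, symmetrised and normalised to be comparable across lengths."""
    return max(c(x + y) - c(x), c(y + x) - c(y)) / max(c(x), c(y))

a = b"the quick brown fox jumps over the lazy dog " * 20
b = b"the quick brown fox jumps over the lazy cat " * 20
z = bytes(range(256)) * 4
print(divergence(a, b) < divergence(a, z))  # similar texts diverge less
```

Because the compressor only approximates the shortest description, the resulting measure inherits the compressor's blind spots, but it is applicable to any pair of strings, which is what enables applications such as authorship attribution.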

  11. The satellite-based remote sensing of particulate matter (PM) in support to urban air quality: PM variability and hot spots within the Cordoba city (Argentina) as revealed by the high-resolution MAIAC-algorithm retrievals applied to a ten-years dataset (2

    Science.gov (United States)

    Della Ceca, Lara Sofia; Carreras, Hebe A.; Lyapustin, Alexei I.; Barnaba, Francesca

    2016-04-01

    Particulate matter (PM) is one of the pollutants most harmful to public health and the environment [1]. In developed countries, specific air-quality legislation establishes limit values for PM metrics (e.g., PM10, PM2.5) to protect the citizens' health (e.g., European Commission Directive 2008/50, US Clean Air Act). Extensive PM measuring networks therefore exist in these countries to comply with the legislation. In less developed countries, air quality monitoring networks are still lacking, and satellite-based datasets could represent a valid alternative to fill observational gaps. The main PM (or aerosol) parameter retrieved from satellite is the 'aerosol optical depth' (AOD), an optical parameter quantifying the aerosol load in the whole atmospheric column. Datasets from the MODIS sensors on board the NASA spacecraft Terra and Aqua are among the longest records of AOD from space. However, although extremely useful in regional and global studies, the standard 10 km-resolution MODIS AOD product is not suitable for use at the urban scale. Recently, a new algorithm called Multi-Angle Implementation of Atmospheric Correction (MAIAC) was developed for MODIS, providing AOD at 1 km resolution [2]. In this work, the MAIAC AOD retrievals over the decade 2003-2013 were employed to investigate the spatiotemporal variation of atmospheric aerosols over the Argentinean city of Cordoba and its surroundings, an area where only a very scarce dataset of in situ PM data is available. The MAIAC retrievals over the city were first validated using a 'ground truth' AOD dataset from the Cordoba sunphotometer operating within the global AERONET network [3]. This validation showed the good performance of the MAIAC algorithm in the area. The satellite MAIAC AOD dataset was therefore employed to investigate the 10-year trend as well as seasonal and monthly patterns of particulate matter in the city of Cordoba. 
The first showed a marked increase of AOD over time, particularly evident in

  12. Effect of Ionosphere on Geostationary Communication Satellite Signals

    Science.gov (United States)

    Erdem, Esra; Arikan, Feza; Gulgonul, Senol

    2016-07-01

    Geostationary orbit (GEO) communications satellites allow radio, television, and telephone transmissions to be sent live anywhere in the world. They are extremely important in daily life and also for military applications. Since satellite communication is an expensive technology serving large numbers of people, it is critical to improve the performance of this technology. GEO satellites are at 35,786 kilometres above Earth's surface, situated directly over the equator. A satellite in geostationary orbit appears to stand still in the sky, in a fixed position with respect to an observer on the Earth, because the satellite's orbital period is the same as the rotation rate of the Earth. The advantage of this orbit is that ground antennas can be fixed to point toward the satellite without having to track the satellite's motion. Radio frequency ranges used in satellite communications are C, X, Ku, Ka and even EHF and V-band. Satellite signals are disturbed by atmospheric effects on the path between the satellite and the receiver antenna. These effects are mostly rain, cloud and gaseous attenuation. The ionosphere is expected to have only a minor effect on satellite signals when it is quiet. But there are anomalies and perturbations in the structure of the ionosphere, related to the geomagnetic field and solar activity, and these conditions may cause further effects on the satellite signals. In this study the IONOLAB-RAY algorithm is adopted to examine the effect of the ionosphere on satellite signals. IONOLAB-RAY was developed to calculate the propagation path and characteristics of high frequency signals. The algorithm does not have any frequency limitation and models the plasmasphere up to 20,200 km altitude, so that propagation between a GEO satellite and an antenna on Earth can be simulated. The algorithm models the inhomogeneous, anisotropic and time dependent structure of the ionosphere with a 3-D spherical grid geometry and calculates physical parameters of the

  13. Satellite Communications Industry

    Science.gov (United States)

    1993-04-01

    ... Ariane $100m
    SAJAC 1        Hughes   Satellite Japan      06/94          $150m
    SAJAC 2        Hughes   Satellite Japan      -- (spare)     $150m
    Satcom H1      GE       GE Americom          /95            $50m
    SOLIDARIDAD 1  Hughes   SCT (Mexico)         11/93  Ariane  $100m
    SOLIDARIDAD 2  Hughes   SCT (Mexico)         /94            $100m
    Superbird A1   Loral    Space Com Gp (Jap)   11/92  Ariane  $175m

  14. Partnership via Satellite.

    Science.gov (United States)

    Powell, Marie Clare

    1980-01-01

    Segments of the 1980 National Catholic Educational Association (NCEA) conference were to be telecast nationally by satellite. The author briefly explains the satellite transmission process and advises Catholic educators on how to pick up the broadcast through their local cable television system. (SJL)

  15. DETERMINATION OF THE LIGHT CURVE OF THE ARTIFICIAL SATELLITE BY ITS ROTATION PATH AS PREPARATION TO THE INVERSE PROBLEM SOLUTION

    OpenAIRE

    Pavlenko, Daniil

    2012-01-01

    In developing an algorithm for estimating the rotational parameters of an artificial satellite from its light curve, it is necessary to compute test light curves for various initially given types of rotation and for the specific lighting conditions of the satellite. In the present study, the algorithm for creating such light curves by the simulation method is described, together with the results obtained.

  16. High accuracy satellite drag model (HASDM)

    Science.gov (United States)

    Storz, Mark F.; Bowman, Bruce R.; Branson, Major James I.; Casali, Stephen J.; Tobiska, W. Kent

    The dominant error source in force models used to predict low-perigee satellite trajectories is atmospheric drag. Errors in operational thermospheric density models cause significant errors in predicted satellite positions, since these models do not account for dynamic changes in atmospheric drag for orbit predictions. The Air Force Space Battlelab's High Accuracy Satellite Drag Model (HASDM) estimates and predicts (out to three days) a dynamically varying global density field. HASDM includes the Dynamic Calibration Atmosphere (DCA) algorithm that solves for the phases and amplitudes of the diurnal and semidiurnal variations of thermospheric density in near real-time from the observed drag effects on a set of Low Earth Orbit (LEO) calibration satellites. The density correction is expressed as a function of latitude, local solar time and altitude. In HASDM, a time series prediction filter relates the extreme ultraviolet (EUV) energy index E10.7 and the geomagnetic storm index ap to the DCA density correction parameters. The E10.7 index is generated by the SOLAR2000 model, the first full spectrum model of solar irradiance. The estimated and predicted density fields will be used operationally to significantly improve the accuracy of predicted trajectories for all low-perigee satellites.

  17. Moving object detection in video satellite image based on deep learning

    Science.gov (United States)

    Zhang, Xueyang; Xiang, Junhua

    2017-11-01

    Moving object detection in video satellite imagery is studied, and a detection algorithm based on deep learning is proposed. The small-scale characteristics of remote sensing video objects are analyzed. Firstly, a background subtraction algorithm using an adaptive Gaussian mixture model generates region proposals. Then the objects in the region proposals are classified via a deep convolutional neural network. Moving objects of interest are thus detected in combination with prior information about the sub-satellite point. The network is a 21-layer residual convolutional neural network whose parameters are trained by transfer learning. Experimental results on video from the Tiantuo-2 satellite demonstrate the effectiveness of the algorithm.
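
The region-proposal stage can be illustrated on synthetic data with a deliberately simplified background model; the sketch below substitutes a single-Gaussian running average for the paper's adaptive Gaussian mixture:

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Running-average background model, a single-Gaussian
    simplification of the adaptive Gaussian mixture model named above."""
    return (1 - alpha) * bg + alpha * frame

def region_proposals(bg, frame, thresh=25):
    """Foreground mask: pixels that deviate from the background model."""
    return np.abs(frame - bg) > thresh

# Synthetic 8x8 scene: static background plus a small bright mover.
bg = np.zeros((8, 8))
frame = bg.copy()
frame[2:4, 5:7] = 200.0            # the moving object
mask = region_proposals(bg, frame)
bg = update_background(bg, frame)  # model slowly absorbs the scene
print(mask.sum())                  # 4 foreground pixels proposed
```

In the paper's pipeline the connected foreground regions would then be cropped and passed to the classification network.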

  18. The satellite situation center

    International Nuclear Information System (INIS)

    Teague, M.J.; Sawyer, D.M.; Vette, J.I.

    1982-01-01

    Considerations related to the early planning for the International Magnetospheric Study (IMS) took into account the desirability of an establishment of specific entities for generating and disseminating coordination information for both retrospective and predictive periods. The organizations established include the IMS/Satellite Situation Center (IMS/SSC) operated by NASA. The activities of the SSC are related to the preparation of reports on predicted and actually achieved satellite positions, the response to inquiries, the compilation of information on satellite experiments, and the issue of periodic status summaries. Attention is given to high-altitude satellite services, other correlative satellite services, non-IMS activities of the SSC, a summary of the SSC request activity, and post-IMS and future activities

  19. Algorithms and Applications in Grass Growth Monitoring

    Directory of Open Access Journals (Sweden)

    Jun Liu

    2013-01-01

    Full Text Available Monitoring vegetation phenology using satellite data has been an area of growing research interest in recent decades. Validation is an essential issue in land surface phenology studies at large scale. In this paper, a double logistic function-fitting algorithm was used to retrieve phenophases for grassland in North China from a consistently processed Moderate Resolution Imaging Spectroradiometer (MODIS) dataset. The accuracy of the satellite-based estimates was then assessed using field phenology observations. Results show that the method identifies vegetation phenology successfully. The phenophases derived from satellite and observed on the ground are generally similar. Greenup onset dates identified by the Normalized Difference Vegetation Index (NDVI) and in situ observed dates showed general agreement. There is an excellent agreement between the dates of maturity onset determined by MODIS and the field observations. The satellite-derived length of the vegetation growing season is generally consistent with the surface observation.
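
A sketch of the double logistic form and a simple green-up estimate on synthetic NDVI data; the parameterization and the derivative-based onset criterion below are common choices in the phenology literature, not necessarily the paper's exact definitions:

```python
import numpy as np

def double_logistic(t, vmin, vamp, m1, s1, m2, s2):
    """Double logistic seasonal curve: spring green-up centered at s1,
    autumn senescence centered at s2 (a common parameterization)."""
    spring = 1.0 / (1.0 + np.exp(-m1 * (t - s1)))
    autumn = 1.0 / (1.0 + np.exp(m2 * (t - s2)))
    return vmin + vamp * (spring + autumn - 1.0)

doy = np.arange(1, 366)                       # day of year
ndvi = double_logistic(doy, 0.15, 0.55, 0.12, 130.0, 0.10, 280.0)

# One simple greenup criterion: the steepest spring increase.
greenup = doy[np.argmax(np.diff(ndvi))]
print(greenup)  # near day-of-year 130, the spring inflection s1
```

In practice the function would be fitted to a noisy MODIS NDVI time series first, and phenophases read off the fitted curve rather than the raw data.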

  20. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation report for ATDM program. [supporting datasets - Pasadena Testbed

    Science.gov (United States)

    2017-07-26

    This zip file contains POSTDATA.ATT (.ATT); Print to File (.PRN); Portable Document Format (.PDF); and document (.DOCX) files of data to support FHWA-JPO-16-385, Analysis, modeling, and simulation (AMS) testbed development and evaluation to support d...

  1. Parallel algorithms

    CERN Document Server

    Casanova, Henri; Robert, Yves

    2008-01-01

    ""…The authors of the present book, who have extensive credentials in both research and instruction in the area of parallelism, present a sound, principled treatment of parallel algorithms. … This book is very well written and extremely well designed from an instructional point of view. … The authors have created an instructive and fascinating text. The book will serve researchers as well as instructors who need a solid, readable text for a course on parallelism in computing. Indeed, for anyone who wants an understandable text from which to acquire a current, rigorous, and broad vi

  2. Algorithm 865

    DEFF Research Database (Denmark)

    Gustavson, Fred G.; Reid, John K.; Wasniewski, Jerzy

    2007-01-01

    We present subroutines for the Cholesky factorization of a positive-definite symmetric matrix and for solving corresponding sets of linear equations. They exploit cache memory by using the block hybrid format proposed by the authors in a companion article. The matrix is packed into n(n + 1)/2 real variables, and the speed is usually better than that of the LAPACK algorithm that uses full storage (n^2 variables). Included are subroutines for rearranging a matrix whose upper or lower triangular part is packed by columns to this format and for the inverse rearrangement. Also included is a kernel
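
The packed-storage idea can be illustrated in plain Python. The sketch below factorizes a column-packed lower triangle in place; Algorithm 865 itself works on the blocked hybrid format in Fortran, so this shows only the n(n + 1)/2 storage concept, not the cache-blocked kernels:

```python
import math

def idx(i, j, n):
    """Index of A[i, j] (i >= j) in lower-triangular column-packed storage."""
    return j * n - j * (j - 1) // 2 + (i - j)

def packed_cholesky(ap, n):
    """In-place Cholesky A = L L^T on n(n + 1)/2 packed reals
    (left-looking variant, illustrative only)."""
    for j in range(n):
        for k in range(j):
            ljk = ap[idx(j, k, n)]
            for i in range(j, n):
                ap[idx(i, j, n)] -= ap[idx(i, k, n)] * ljk
        d = math.sqrt(ap[idx(j, j, n)])
        for i in range(j, n):
            ap[idx(i, j, n)] /= d
    return ap

# 2x2 SPD example: [[4, 2], [2, 3]] has L = [[2, 0], [1, sqrt(2)]].
ap = [4.0, 2.0, 3.0]      # packed lower triangle, column by column
packed_cholesky(ap, 2)
print(ap)                 # [2.0, 1.0, 1.414...]
```

The halved storage is what the LAPACK packed routines also offer; the cited algorithm's contribution is recovering full-storage speed on top of it.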

  3. Assessment of satellite derived diffuse attenuation coefficients ...

    Science.gov (United States)

    Optical data collected in coastal waters off South Florida and in the Caribbean Sea between January 2009 and December 2010 were used to evaluate products derived with three bio-optical inversion algorithms applied to MODIS/Aqua, MODIS/Terra, and SeaWiFS satellite observations. The products included the diffuse attenuation coefficient at 490 nm (Kd_490) and for the visible range (Kd_PAR), and euphotic depth (Zeu, corresponding to 1% of the surface incident photosynthetically available radiation or PAR). Above-water hyperspectral reflectance data collected over optically shallow waters of the Florida Keys between June 1997 and August 2011 were used to help understand algorithm performance over optically shallow waters. The in situ data covered a variety of water types in South Florida and the Caribbean Sea, ranging from deep clear waters to turbid coastal waters and optically shallow waters (Kd_490 range of ~0.03-1.29 m^-1). An algorithm based on Inherent Optical Properties (IOPs) showed the best performance (RMSD turbidity or shallow bottom contamination. Similar results were obtained when only in situ data were used to evaluate algorithm performance. The excellent agreement between satellite-derived remote sensing reflectance (Rrs) and in situ Rrs suggested that

  4. A Real-Time GPP Software-Defined Radio Testbed for the Physical Layer of Wireless Standards

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.

    2005-01-01

    We present our contribution to the general-purpose-processor-(GPP)-based radio. We describe a baseband software-defined radio testbed for the physical layer of wireless LAN standards. All physical layer functions have been successfully mapped on a Pentium 4 processor that performs these functions in

  5. Interactive aircraft cabin testbed for stress-free air travel system experiment: an innovative concurrent design approach

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    In this paper, a study of the concurrent engineering design of an environmentally friendly, low-cost aircraft cabin simulator is presented. The study describes the use of the concurrent design technique in the design activity. The simulator is a testbed that was designed and built for research on

  6. Photovoltaic Engineering Testbed: A Facility for Space Calibration and Measurement of Solar Cells on the International Space Station

    Science.gov (United States)

    Landis, Geoffrey A.; Bailey, Sheila G.; Jenkins, Phillip; Sexton, J. Andrew; Scheiman, David; Christie, Robert; Charpie, James; Gerber, Scott S.; Johnson, D. Bruce

    2001-01-01

    The Photovoltaic Engineering Testbed ("PET") is a facility to be flown on the International Space Station to perform calibration, measurement, and qualification of solar cells in the space environment and then return the cells to Earth for laboratory use. PET will allow rapid turnaround testing of new photovoltaic technology under AM0 conditions.

  7. Prediction of GNSS satellite clocks

    International Nuclear Information System (INIS)

    Broederbauer, V.

    2010-01-01

    This thesis deals with the characterisation and prediction of GNSS satellite clocks. A prerequisite for developing powerful algorithms for the prediction of clock corrections is a thorough study of the behaviour of the different clock types of the satellites. In this context the predicted part of the IGU clock corrections provided by the Analysis Centers (ACs) of the IGS was compared to the IGS Rapid clock solutions to determine reasonable estimates of the quality of already existing, well performing predictions. For the shortest investigated interval (three hours) all ACs obtain almost the same accuracy of 0.1 to 0.4 ns. For longer intervals the individual prediction results start to diverge. Thus, for a 12-hour interval the differences range from nearly 10 ns (GFZ, CODE) up to some tens of ns. Based on the estimated clock corrections provided via the IGS Rapid products, a simple quadratic polynomial turns out to be sufficient to describe the time series of Rubidium clocks. On the other hand, Cesium clocks show a periodic behaviour (revolution period) with an amplitude of up to 6 ns. A clear correlation between these amplitudes and the Sun elevation angle above the orbital planes can be demonstrated. The variability of the amplitudes is supposed to be caused by temperature variations affecting the oscillator. To account for this periodic behaviour, a quadratic polynomial with an additional sine term was finally chosen as the prediction model both for the Cesium and the Rubidium clocks. The three polynomial parameters as well as the amplitude and phase shift of the periodic term are estimated within a least-squares adjustment by means of the program GNSS-VC/static. Input data are time series of the observed part of the IGU clock corrections. With the estimated parameters, clock corrections are predicted for various durations. The mean error of the prediction of Rubidium clock corrections for an interval of six hours reaches up to 1.5 ns. For the 12-hours
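
The chosen prediction model (quadratic polynomial plus a sinusoid at the revolution period) is linear in its parameters once the amplitude/phase term is rewritten as a sine/cosine pair, so it can be fitted by ordinary least squares. A sketch on synthetic clock data; the period and sample values below are assumed for illustration:

```python
import numpy as np

def fit_clock_model(t, clk, period):
    """Least-squares fit of the model described above: quadratic
    polynomial plus one sinusoid at the known revolution period.
    A*sin(wt + phi) is linearized as a*sin(wt) + b*cos(wt)."""
    w = 2.0 * np.pi / period
    M = np.column_stack([np.ones_like(t), t, t**2,
                         np.sin(w * t), np.cos(w * t)])
    coef, *_ = np.linalg.lstsq(M, clk, rcond=None)
    predict = lambda tp: np.column_stack(
        [np.ones_like(tp), tp, tp**2,
         np.sin(w * tp), np.cos(w * tp)]) @ coef
    return coef, predict

# Synthetic Cesium-like series: quadratic drift plus a 6 ns periodic term.
period = 12.0                              # hours, assumed revolution period
t = np.linspace(0.0, 24.0, 97)             # 15-minute samples over one day
truth = 3.0 + 0.5 * t + 0.01 * t**2 + 6.0 * np.sin(2 * np.pi * t / period)
coef, predict = fit_clock_model(t, truth, period)
print(np.allclose(predict(t), truth))      # exact model, exact recovery
```

With real IGU clock corrections the fit would leave a residual, and the fitted parameters would then be extrapolated over the prediction interval.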

  8. First light of an external occulter testbed at flight Fresnel numbers

    Science.gov (United States)

    Kim, Yunjong; Sirbu, Dan; Hu, Mia; Kasdin, Jeremy; Vanderbei, Robert J.; Harness, Anthony; Shaklan, Stuart

    2017-01-01

    Many approaches have been suggested over the last couple of decades for imaging Earth-like planets. One of the main candidates for creating the high contrast needed for future detection of Earth-like planets is an external occulter: a spacecraft flown along the line of sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. The occulter is typically tens of meters in diameter and the separation from the telescope is of the order of tens of thousands of kilometers. Optical testing of a full-scale external occulter on the ground is impossible because of the long separations. Therefore, laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we have designed and built a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. The goal of this experiment is to demonstrate a pupil-plane suppression of better than 1e-9 with a corresponding image-plane contrast of better than 1e-11. The occulter testbed uses a 77.2 m optical propagation distance to realize the flight Fresnel number of 14.5. The scaled mask is placed 27.2 m from the artificial source and the camera is located 50.0 m from the scaled mask. We use an etched silicon mask, manufactured by the Microdevices Lab (MDL) of the Jet Propulsion Laboratory (JPL), as the occulter. Based on conversations with MDL, we expect that a 0.5 μm feature size is an achievable resolution in the mask manufacturing process and is therefore likely the indicator of the best possible performance. The occulter is illuminated by a diverging laser beam to reduce the aberrations from the optics before the occulter. Here, we present the first-light results of a sample design operating at a flight Fresnel number, together with the experimental setup of the testbed. We compare the experimental results with simulations.
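
The scaling behind "flight Fresnel numbers" can be checked directly: with a diverging beam, the source-to-mask and mask-to-camera distances combine into an effective plane-wave distance, and the mask radius follows from the target Fresnel number. The wavelength below is an assumed HeNe-like value, not quoted in the abstract:

```python
import math

def effective_distance(z_source, z_camera):
    """Equivalent plane-wave propagation distance for a diverging-beam
    occulter test: z_eff = z1 * z2 / (z1 + z2)."""
    return z_source * z_camera / (z_source + z_camera)

def fresnel_number(radius, wavelength, z_eff):
    """Fresnel number N = R^2 / (lambda * z_eff)."""
    return radius**2 / (wavelength * z_eff)

z_eff = effective_distance(27.2, 50.0)         # m, geometry quoted above
wavelength = 632.8e-9                          # m, assumed HeNe-like laser
# Mask radius that realizes the flight Fresnel number of 14.5:
radius = math.sqrt(14.5 * wavelength * z_eff)
print(round(z_eff, 2), round(radius * 1e3, 1))  # ~17.62 m, ~12.7 mm
```

The mask radius comes out at the centimeter scale, which is why millimeter-scale etched-silicon masks can stand in for a tens-of-meters flight occulter.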

  9. Evaluation of Future Internet Technologies for Processing and Distribution of Satellite Imagery

    Science.gov (United States)

    Becedas, J.; Perez, R.; Gonzalez, G.; Alvarez, J.; Garcia, F.; Maldonado, F.; Sucari, A.; Garcia, J.

    2015-04-01

    Satellite imagery data centres are designed to operate a defined number of satellites. Difficulties therefore appear, for instance, when new satellites have to be incorporated into the system, because traditional infrastructures are neither flexible nor scalable. With the appearance of Future Internet technologies, new solutions can be provided to manage large and variable amounts of data on demand. These technologies optimize resources and facilitate the appearance of new applications and services in the traditional Earth Observation (EO) market. The use of Future Internet technologies for the EO sector was validated with the GEO-Cloud experiment, part of the Fed4FIRE FP7 European project. This work presents the final results of the project, in which a constellation of satellites records the whole Earth surface on a daily basis. The satellite imagery is downloaded into a distributed network of ground stations and ingested into a cloud infrastructure, where the data is processed, stored, archived and distributed to the end users. The processing and transfer times inside the cloud, the workload of the processors, automatic cataloguing and accessibility through the Internet are evaluated to validate whether Future Internet technologies present advantages over traditional methods. The applicability of these technologies to providing high added-value services is also evaluated. Finally, the advantages of using federated testbeds to carry out large-scale, industry-driven experiments are analysed, evaluating the feasibility of an experiment developed in the European infrastructure Fed4FIRE and its migration to a commercial cloud: SoftLayer, an IBM Company.

  10. Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis

    Science.gov (United States)

    Chiou, Jin-Chern

    1990-01-01

    Computational procedures for kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAE's) viewpoint. Constraint violations during the time integration process are minimized, and penalty constraint stabilization techniques and partitioning schemes are developed. The governing equations of motion are treated with a two-stage staggered explicit-implicit numerical algorithm that takes advantage of a partitioned solution procedure. A robust and parallelizable integration algorithm is developed. This algorithm uses a two-stage staggered central difference algorithm to integrate the translational coordinates and the angular velocities. The angular orientations of bodies in MBD systems are then obtained by using an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computational procedures, parallel implementation of the present constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm was efficiently carried out. The DAE's and the constraint treatment techniques were transformed into arrowhead matrices from which a Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient numerical algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers. This testbed was used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speedup of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.
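
The Schur-complement solve mentioned above relies on a preconditioned conjugate gradient iteration. A serial, Jacobi-preconditioned sketch on a small symmetric positive-definite system follows; the thesis's parallel implementation and preconditioner details are not reproduced here:

```python
import numpy as np

def pcg(A, b, tol=1e-10, max_iter=200):
    """Jacobi-preconditioned conjugate gradient for SPD systems,
    the kind of iteration applied to the Schur-complement equations."""
    Minv = 1.0 / np.diag(A)          # Jacobi preconditioner M^-1
    x = np.zeros_like(b)
    r = b - A @ x                    # residual
    z = Minv * r                     # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = Minv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = pcg(A, b)
print(np.allclose(A @ x, b))  # True
```

Only matrix-vector products and vector updates appear in the loop, which is what makes the method attractive for the parallel, sparse arrowhead systems described above.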

  11. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  12. Autonomous, agile micro-satellites and supporting technologies

    International Nuclear Information System (INIS)

    Breitfeller, E; Dittman, M D; Gaughan, R J; Jones, M S; Kordas, J F; Ledebuhr, A G; Ng, L C; Whitehead, J C; Wilson, B

    1999-01-01

    This paper updates the on-going effort at Lawrence Livermore National Laboratory to develop autonomous, agile micro-satellites (MicroSats). The objective of this development effort is to develop MicroSats weighing only a few tens of kilograms that are able to autonomously perform precision maneuvers and can be used telerobotically in a variety of mission modes. The required capabilities include satellite rendezvous, inspection, proximity-operations, docking, and servicing. The MicroSat carries an integrated proximity-operations sensor-suite incorporating advanced avionics. A new self-pressurizing propulsion system utilizing a miniaturized pump and non-toxic mono-propellant hydrogen peroxide was successfully tested. This system can provide a nominal 25 kg MicroSat with 200-300 m/s delta-v including a warm-gas attitude control system. The avionics is based on the latest PowerPC processor using a CompactPCI bus architecture, which is modular, high-performance and processor-independent. This leverages commercial-off-the-shelf (COTS) technologies and minimizes the effects of future changes in processors. The MicroSat software development environment uses the VxWorks real-time operating system (RTOS) that provides a rapid development environment for integration of new software modules, allowing early integration and test. We will summarize results of recent integrated ground flight testing of our latest non-toxic pumped propulsion MicroSat testbed vehicle operated on our unique dynamic air-rail

  13. Adaptive suppression of passive intermodulation in digital satellite transceivers

    Directory of Open Access Journals (Sweden)

    Lu TIAN

    2017-06-01

    Full Text Available To address the performance degradation of satellite transceivers suffering from passive intermodulation interference, a novel and effective digital suppression algorithm is presented in this paper. In contrast to analog approaches, digital passive intermodulation (PIM) suppression approaches can be easily reconfigured and are therefore highly attractive for future satellite communication systems. A simplified model of nonlinear distortion from passive microwave devices is established, taking the memory effect into consideration. The multiple high-order PIM products falling into the receiving band can be described by a bilinear predictor function. A suppression algorithm based on a bilinear polynomial decorrelated adaptive filter is proposed for baseband digital signal processing. Given the time-varying characteristics of passive intermodulation, this algorithm achieves rapid online interference estimation at low complexity and with less consumption of resources. Numerical simulation results show that the algorithm can effectively compensate the passive intermodulation interference and achieve a high signal-to-interference ratio gain.
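
The adaptive-cancellation idea can be sketched with a scalar normalized-LMS canceller acting on a single third-order term; this is a stand-in for the paper's bilinear memory-effect model and decorrelated filter, chosen only to show the estimate-and-subtract structure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Transmit reference and received signal: a weak desired signal plus a
# third-order PIM-like product leaking into the receive band.
x = rng.standard_normal(4096)
rx = 0.01 * rng.standard_normal(4096) + 0.05 * x**3

# Normalized-LMS adaptive canceller regressing rx on the basis x**3.
w, mu, eps = 0.0, 0.5, 1.0
for n in range(rx.size):
    basis = x[n] ** 3
    err = rx[n] - w * basis                 # interference-cancelled sample
    w += mu * err * basis / (eps + basis**2)  # normalized weight update

print(w)  # estimated PIM coefficient; the true value here is 0.05
```

In the paper's setting the single nonlinear basis would be replaced by the bilinear predictor terms, with weights tracked online as the PIM characteristics drift.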

  14. The development of the human exploration demonstration project (HEDP), a planetary systems testbed

    Science.gov (United States)

    Chevers, Edward S.; Korsmeyer, David J.

    1993-01-01

    The Human Exploration Demonstration Project (HEDP) is an ongoing task at the National Aeronautics and Space Administration's Ames Research Center to address the advanced technology requirements necessary to implement an integrated working and living environment for a planetary surface habitat. The integrated environment will consist of life support systems, physiological monitoring of project crew, a virtual environment workstation, and centralized data acquisition and habitat systems health monitoring. There will be several robotic systems on a simulated planetary landscape external to the habitat environment to provide representative work loads for the crew. This paper describes the status of the HEDP after one year, the major facilities composing the HEDP, the project's role as an Ames Research Center testbed, and the types of demonstration scenarios that will be run to showcase the technologies.

  15. High-Resolution Adaptive Optics Test-Bed for Vision Science

    International Nuclear Information System (INIS)

    Wilks, S.C.; Thompson, C.A.; Olivier, S.S.; Bauman, B.J.; Barnes, T.; Werner, J.S.

    2001-01-01

    We discuss the design and implementation of a low-cost, high-resolution adaptive optics test-bed for vision research. It is well known that high-order aberrations in the human eye reduce optical resolution and limit visual acuity. However, the effects of aberration-free eyesight on vision are only now beginning to be studied using adaptive optics to sense and correct the aberrations in the eye. We are developing a high-resolution adaptive optics system for this purpose using a Hamamatsu Parallel Aligned Nematic Liquid Crystal Spatial Light Modulator. Phase-wrapping is used to extend the effective stroke of the device, and the wavefront sensing and wavefront correction are done at different wavelengths. Issues associated with these techniques will be discussed

  16. Thermal and Fluid Modeling of the CRYogenic Orbital TEstbed (CRYOTE) Ground Test Article (GTA)

    Science.gov (United States)

    Piryk, David; Schallhorn, Paul; Walls, Laurie; Stopnitzky, Benny; Rhys, Noah; Wollen, Mark

    2012-01-01

    The purpose of this study was to anchor thermal and fluid system models to data acquired from a ground test article (GTA) for the CRYogenic Orbital TEstbed - CRYOTE. To accomplish this, the analysis was broken into four primary tasks: model development, pre-test predictions, testing support at Marshall Space Flight Center (MSFC) and post-test correlations. Information from MSFC facilitated the task of refining and correlating the initial models. The primary goal of the modeling/testing/correlating efforts was to characterize heat loads throughout the ground test article. Significant factors impacting the heat loads included radiative environments, multi-layer insulation (MLI) performance, tank fill levels, tank pressures, and even contact conductance coefficients. This paper demonstrates how analytical thermal/fluid networks were established, and it includes supporting rationale for specific thermal responses seen during testing.

  17. Living with a Star (LWS) Space Environment Testbeds (SET), Mission Carrier Overview and Capabilities

    Science.gov (United States)

    Patschke, Robert; Barth, Janet; Label, Ken; Mariano, Carolyn; Pham, Karen; Brewer, Dana; Cuviello, Michael; Kobe, David; Wu, Carl; Jarosz, Donald

    2004-01-01

    NASA has initiated the Living With a Star (LWS) Program to develop the scientific understanding to address the aspects of the Connected Sun-Earth system that affect life and society. A goal of the program is to bridge the gap between the science, engineering, and user application communities. This will enable future science, operational, and commercial objectives in space and atmospheric environments by improving engineering approaches to the accommodation and/or mitigation of the effects of solar variability on technological systems. The three program elements of the LWS Program are Science Missions; Targeted Research and Technology; and Space Environment Testbeds (SET). SET is an ideal platform for small experiments performing research on space environment effects on technologies and on the mitigation of space weather effects. A short description of the LWS Program will be given, and the SET will be described in detail, giving the mission objectives, available carrier services, and upcoming flight opportunities.

  18. OPNET/Simulink Based Testbed for Disturbance Detection in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Sadi, Mohammad A. H. [University of Memphis; Dasgupta, Dipankar [ORNL; Ali, Mohammad Hassan [University of Memphis; Abercrombie, Robert K [ORNL

    2015-01-01

    An important backbone of the smart grid is its cyber/information infrastructure, which is primarily used to communicate with the different grid components. A smart grid is a complex cyber-physical system containing numerous and varied sources, devices, controllers, and loads, and it is therefore vulnerable to grid-related disturbances. For such a dynamic system, disturbance and intrusion detection is a paramount issue. This paper presents a Simulink- and OPNET-based co-simulation platform for carrying out cyber-intrusions against the communication network of modern power systems and the smart grid. The IEEE 30-bus power system model is used to demonstrate the effectiveness of the simulated testbed. The experiments were performed by disturbing the circuit breakers' reclosing times through a cyber-attack. Several disturbance scenarios in the test system are examined, and the results indicate the effectiveness of the proposed co-simulation scheme.
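    The class of disturbance used in these experiments, an attacker altering breaker reclosing times, can be illustrated with a minimal threshold check on observed versus nominal timings. The breaker names, times, and tolerance below are hypothetical, not values from the paper:

    ```python
    def detect_reclosing_disturbance(nominal_s, observed_s, tolerance_s=0.05):
        """Flag breakers whose observed reclosing time deviates from nominal
        by more than tolerance_s seconds (a simple illustrative detector)."""
        flagged = []
        for breaker, t_nom in nominal_s.items():
            t_obs = observed_s.get(breaker, t_nom)
            if abs(t_obs - t_nom) > tolerance_s:
                flagged.append(breaker)
        return flagged

    nominal = {"CB-1": 0.30, "CB-2": 0.30, "CB-3": 0.30}
    # A cyber-attack delays the reclosing command sent to CB-2.
    observed = {"CB-1": 0.31, "CB-2": 0.85, "CB-3": 0.29}
    attacked = detect_reclosing_disturbance(nominal, observed)
    ```

    In a co-simulated testbed, the observed timings would come from the communication-network model (OPNET) while the electrical consequences of the delayed reclosing play out in the power-system model (Simulink).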

  19. MODELING CIRCUMSTELLAR DISKS OF B-TYPE STARS WITH OBSERVATIONS FROM THE PALOMAR TESTBED INTERFEROMETER

    International Nuclear Information System (INIS)

    Grzenia, B. J.; Tycner, C.; Jones, C. E.; Sigut, T. A. A.; Rinehart, S. A.; Van Belle, G. T.

    2013-01-01

    Geometrical (uniform disk) and numerical models were calculated for a set of B-emission (Be) stars observed with the Palomar Testbed Interferometer (PTI). Physical extents have been estimated for the disks of a total of 15 stars via uniform disk models. Our numerical non-LTE models used parameters for the B0, B2, B5, and B8 spectral classes and, following the framework laid out in previous studies, we have compared them to infrared K-band interferometric observations taken at PTI. This is the first time such an extensive set of Be stars observed with long-baseline interferometry has been analyzed with self-consistent non-LTE numerical disk models.
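    The uniform disk model referred to here reduces to the standard interferometric visibility |V| = |2 J1(x)/x| with x = pi * theta * B / lambda, where theta is the angular diameter, B the projected baseline, and lambda the wavelength. The sketch below evaluates it with a stdlib-only Bessel integral; the 110 m baseline and 2.2 micron K-band wavelength are merely representative of PTI-like observations, not values taken from this paper:

    ```python
    import math

    def bessel_j1(x, n=2000):
        """J1(x) via its integral form (1/pi) * int_0^pi cos(t - x sin t) dt,
        evaluated with the trapezoid rule."""
        h = math.pi / n
        s = 0.5 * (math.cos(0.0) + math.cos(math.pi))
        for k in range(1, n):
            t = k * h
            s += math.cos(t - x * math.sin(t))
        return s * h / math.pi

    def uniform_disk_visibility(theta_mas, baseline_m, wavelength_m):
        """|V| for a uniform disk of angular diameter theta_mas (milliarcsec)."""
        theta_rad = theta_mas * math.pi / (180.0 * 3600.0 * 1000.0)
        x = math.pi * theta_rad * baseline_m / wavelength_m
        if x < 1e-8:
            return 1.0  # limit of 2*J1(x)/x as x -> 0
        return abs(2.0 * bessel_j1(x) / x)

    # Representative numbers: 1 mas disk, 110 m baseline, K band (2.2 um)
    v = uniform_disk_visibility(1.0, 110.0, 2.2e-6)
    ```

    Fitting this curve to measured visibilities as a function of projected baseline is how the angular extents of the 15 disks would be estimated before moving to the non-LTE numerical models.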

  20. Automated tools and techniques for distributed Grid Software Development of the testbed infrastructure

    CERN Document Server

    Aguado Sanchez, C

    2007-01-01

    Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the various Grid initiatives. As a result of international collaborations on its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of services that will enable interoperability across the different implementations. This master's thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with automated tools that enable the implementation of software engineering techniques aimed at achieving an acceptable level of quality in the release process. Specifically, this thesis develops the testbed concept a...