WorldWideScience

Sample records for satellite algorithm testbed

  1. Developing the science product algorithm testbed for Chinese next-generation geostationary meteorological satellites: Fengyun-4 series

    Science.gov (United States)

    Min, Min; Wu, Chunqiang; Li, Chuan; Liu, Hui; Xu, Na; Wu, Xiao; Chen, Lin; Wang, Fu; Sun, Fenglin; Qin, Danyu; Wang, Xi; Li, Bo; Zheng, Zhaojun; Cao, Guangzhen; Dong, Lixin

    2017-08-01

    Fengyun-4A (FY-4A), the first of the Chinese next-generation geostationary meteorological satellites, launched in 2016, offers several advances over the FY-2 series: more spectral bands, faster imaging, and infrared hyperspectral measurements. To support the major objective of developing prototypes of the FY-4 science algorithms, two science product algorithm testbeds, one for imagers and one for sounders, have been developed by scientists in the FY-4 Algorithm Working Group (AWG). Both testbeds, written in the FORTRAN and C programming languages for Linux or UNIX systems, have been tested successfully using Intel and GNU compilers. Some important FY-4 science products, including cloud mask, cloud properties, and temperature profiles, have been retrieved successfully using proxy data from the Himawari-8 Advanced Himawari Imager (AHI) for the imager and from the Atmospheric InfraRed Sounder for the sounder, thus demonstrating their robustness. In addition, in early 2016 the FY-4 AWG built, on the basis of the imager testbed, a near real-time processing system for Himawari-8/AHI data for use by Chinese weather forecasters. Consequently, these robust and flexible science product algorithm testbeds have provided essential and productive tools for popularizing FY-4 data and developing substantial improvements in FY-4 products.

  2. A Battery Certification Testbed for Small Satellite Missions

    Science.gov (United States)

    Cameron, Zachary; Kulkarni, Chetan S.; Luna, Ali Guarneros; Goebel, Kai; Poll, Scott

    2015-01-01

    A battery pack consisting of standard cylindrical 18650 lithium-ion cells has been chosen for small satellite missions based on previous flight heritage and compliance with NASA battery safety requirements. However, for batteries that transit through the International Space Station (ISS), additional certification tests are required for individual cells as well as the battery packs. In this manuscript, we discuss the development of generalized testbeds for testing and certifying different types of batteries critical to small satellite missions. Test procedures developed and executed for this certification effort include: a detailed physical inspection before and after experiments; electrical cycling characterization at the cell and pack levels; battery-pack overcharge, over-discharge, external short testing; battery-pack vacuum leak and vibration testing. The overall goals of these certification procedures are to conform to requirements set forth by the agency and identify unique safety hazards. The testbeds, procedures, and experimental results are discussed for batteries chosen for small satellite missions to be launched from the ISS.

  3. Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition

    Science.gov (United States)

    2017-01-01

    Technical report NSWC PCD TR-2017-004, Naval Surface Warfare Center Panama City Division, Panama City, FL 32407-7001, dated 31-01-2017. The report describes a flexible platform to facilitate the development and testing of automatic target recognition (ATR) algorithms; to that end, NSWC PCD has created the Modular Algorithm Testbed Suite (MATS).

  4. A numerical testbed for remote sensing of aerosols, and its demonstration for evaluating retrieval synergy from a geostationary satellite constellation of GEO-CAPE and GOES-R

    International Nuclear Information System (INIS)

    Wang, Jun; Xu, Xiaoguang; Ding, Shouguo; Zeng, Jing; Spurr, Robert; Liu, Xiong; Chance, Kelly; Mishchenko, Michael

    2014-01-01

    We present a numerical testbed for remote sensing of aerosols, together with a demonstration for evaluating retrieval synergy from a geostationary satellite constellation. The testbed combines inverse (optimal-estimation) software with a forward model containing linearized code for computing particle scattering (for both spherical and non-spherical particles), a kernel-based (land and ocean) surface bi-directional reflectance facility, and a linearized radiative transfer model for polarized radiance. Calculation of gas absorption spectra uses the HITRAN (HIgh-resolution TRANsmission molecular absorption) database of spectroscopic line parameters and other trace species cross-sections. The outputs of the testbed include not only the Stokes 4-vector elements and their sensitivities (Jacobians) with respect to the aerosol single scattering and physical parameters (such as size and shape parameters, refractive index, and plume height), but also DFS (Degree of Freedom for Signal) values for retrieval of these parameters. This testbed can be used as a tool to provide an objective assessment of aerosol information content that can be retrieved for any constellation of (planned or real) satellite sensors and for any combination of algorithm design factors (in terms of wavelengths, viewing angles, radiance and/or polarization to be measured or used). We summarize the components of the testbed, including the derivation and validation of analytical formulae for Jacobian calculations. Benchmark calculations from the forward model are documented. In the context of NASA's Decadal Survey Mission GEO-CAPE (GEOstationary Coastal and Air Pollution Events), we demonstrate the use of the testbed to conduct a feasibility study of using polarization measurements in and around the O2 A band for the retrieval of aerosol height information from space, as well as to assess potential improvement in the retrieval of fine- and coarse-mode aerosol optical depth (AOD) through the
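
    Since the record describes DFS values computed from the Jacobians, a minimal sketch of how DFS is typically obtained in an optimal-estimation framework may help: DFS is the trace of the averaging kernel built from the Jacobian and the prior and measurement-error covariances. The matrix sizes, prior variances, and noise level below are illustrative assumptions, not values from this testbed.

```python
import numpy as np

def averaging_kernel(K, S_a, S_e):
    """Averaging kernel A for a linearized retrieval with Jacobian K,
    prior covariance S_a, and measurement-error covariance S_e."""
    S_a_inv = np.linalg.inv(S_a)
    S_e_inv = np.linalg.inv(S_e)
    S_hat = np.linalg.inv(K.T @ S_e_inv @ K + S_a_inv)  # posterior covariance
    return S_hat @ K.T @ S_e_inv @ K

# Illustrative dimensions: 40 measured quantities, 5 aerosol parameters
rng = np.random.default_rng(0)
K = rng.normal(size=(40, 5))                  # Jacobian d(signal)/d(parameter)
S_a = np.diag([0.5, 0.5, 0.1, 0.2, 1.0])      # prior variances (hypothetical)
S_e = 0.01 * np.eye(40)                       # measurement-noise covariance

A = averaging_kernel(K, S_a, S_e)
print(np.trace(A))   # total Degrees of Freedom for Signal
print(np.diag(A))    # per-parameter information content
```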

  5. Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed

    Science.gov (United States)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie

    2009-01-01

    Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.

  6. An SDR-Based Real-Time Testbed for GNSS Adaptive Array Anti-Jamming Algorithms Accelerated by GPU

    Directory of Open Access Journals (Sweden)

    Hailong Xu

    2016-03-01

    Full Text Available Nowadays, software-defined radio (SDR) has become a common approach to evaluating new algorithms. However, in the field of Global Navigation Satellite System (GNSS) adaptive array anti-jamming, previous work has been limited by the high computational power demanded by adaptive algorithms, and existing testbeds often lack flexibility and configurability. In this paper, the design and implementation of an SDR-based real-time testbed for GNSS adaptive array anti-jamming accelerated by a Graphics Processing Unit (GPU) are documented. This testbed distinguishes itself as a feature-rich and extensible platform with great flexibility and configurability, as well as high computational performance. Both Space-Time Adaptive Processing (STAP) and Space-Frequency Adaptive Processing (SFAP) are implemented with a wide range of parameters. Raw data from as many as eight antenna elements can be processed in real time in either an adaptive nulling or a beamforming mode. To take full advantage of the parallelism provided by the GPU, a batched programming method is proposed. Tests and experiments are conducted to evaluate both the computational and the anti-jamming performance. This platform can be used for research and prototyping, as well as serving as a real product in certain applications.
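
    As an illustration of the adaptive nulling such a testbed implements, here is a minimal, spatial-only sketch of minimum-variance (Capon/MVDR) weight computation; real STAP/SFAP processing adds temporal taps or frequency bins per element, and the array geometry, jammer scenario, and diagonal-loading factor below are assumptions for illustration only.

```python
import numpy as np

def mvdr_weights(snapshots, steering):
    """Minimum-variance distortionless-response weights.
    snapshots: (num_elements, num_samples) complex array samples.
    steering:  (num_elements,) steering vector toward the desired satellite."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]          # sample covariance
    R += 1e-3 * np.trace(R).real / R.shape[0] * np.eye(R.shape[0])   # diagonal loading
    R_inv_s = np.linalg.solve(R, steering)
    return R_inv_s / (steering.conj() @ R_inv_s)

# Illustrative 8-element uniform linear array with half-wavelength spacing
n_el, n_snap = 8, 4096
rng = np.random.default_rng(1)
jammer = np.exp(1j * np.pi * np.arange(n_el) * np.sin(np.deg2rad(40)))
noise = (rng.normal(size=(n_el, n_snap)) + 1j * rng.normal(size=(n_el, n_snap))) / np.sqrt(2)
x = 10.0 * jammer[:, None] * rng.normal(size=(1, n_snap)) + noise     # strong jammer plus noise
s = np.exp(1j * np.pi * np.arange(n_el) * np.sin(np.deg2rad(-10)))    # desired signal direction
w = mvdr_weights(x, s)
y = w.conj() @ x   # beamformed output with a null steered toward the jammer
```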

  7. A Matlab-Based Testbed for Integration, Evaluation and Comparison of Heterogeneous Stereo Vision Matching Algorithms

    Directory of Open Access Journals (Sweden)

    Raul Correal

    2016-11-01

    Full Text Available Stereo matching is a heavily researched area with a prolific published literature and a broad spectrum of heterogeneous algorithms available in diverse programming languages. This paper presents a Matlab-based testbed that aims to centralize and standardize this variety of both current and prospective stereo matching approaches. The proposed testbed aims to facilitate the application of stereo-based methods to real situations. It allows for configuring and executing algorithms, as well as comparing results, in a fast, easy and friendly setting. Algorithms can be combined so that a series of processes can be chained and executed consecutively, using the output of a process as input for the next; some additional filtering and image processing techniques have been included within the testbed for this purpose. A use case is included to illustrate how these processes are sequenced and their effect on the results for real applications. The testbed has been conceived as a collaborative and incremental open-source project, where its code is accessible and modifiable, with the objective of receiving contributions and releasing future versions to include new algorithms and features. It is currently available online for the research community.

  8. A Numerical Testbed for Remote Sensing of Aerosols, and its Demonstration for Evaluating Retrieval Synergy from a Geostationary Satellite Constellation of GEO-CAPE and GOES-R

    Science.gov (United States)

    Wang, Jun; Xu, Xiaoguang; Ding, Shouguo; Zeng, Jing; Spurr, Robert; Liu, Xiong; Chance, Kelly; Mishchenko, Michael I.

    2014-01-01

    We present a numerical testbed for remote sensing of aerosols, together with a demonstration for evaluating retrieval synergy from a geostationary satellite constellation. The testbed combines inverse (optimal-estimation) software with a forward model containing linearized code for computing particle scattering (for both spherical and non-spherical particles), a kernel-based (land and ocean) surface bi-directional reflectance facility, and a linearized radiative transfer model for polarized radiance. Calculation of gas absorption spectra uses the HITRAN (HIgh-resolution TRANsmission molecular absorption) database of spectroscopic line parameters and other trace species cross-sections. The outputs of the testbed include not only the Stokes 4-vector elements and their sensitivities (Jacobians) with respect to the aerosol single scattering and physical parameters (such as size and shape parameters, refractive index, and plume height), but also DFS (Degree of Freedom for Signal) values for retrieval of these parameters. This testbed can be used as a tool to provide an objective assessment of aerosol information content that can be retrieved for any constellation of (planned or real) satellite sensors and for any combination of algorithm design factors (in terms of wavelengths, viewing angles, radiance and/or polarization to be measured or used). We summarize the components of the testbed, including the derivation and validation of analytical formulae for Jacobian calculations. Benchmark calculations from the forward model are documented. In the context of NASA's Decadal Survey Mission GEO-CAPE (GEOstationary Coastal and Air Pollution Events), we demonstrate the use of the testbed to conduct a feasibility study of using polarization measurements in and around the O2 A band for the retrieval of aerosol height information from space, as well as to assess potential improvement in the retrieval of fine- and coarse-mode aerosol optical depth (AOD) through the

  9. Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed

    Science.gov (United States)

    Tian, Ye; Song, Qi; Cattafesta, Louis

    2005-01-01

    This report summarizes the activities on "Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed." The work summarized consists primarily of two parts. The first part summarizes our previous work and the extensions to adaptive identification (ID) and control algorithms. The second part concentrates on the validation of the adaptive algorithms by applying them to a vibrating beam testbed. Extensions to flow control problems are discussed.

  10. X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT

    Science.gov (United States)

    Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven

    2015-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.

  11. Sensing across large-scale cognitive radio networks: Data processing, algorithms, and testbed for wireless tomography and moving target tracking

    Science.gov (United States)

    Bonior, Jason David

    As the use of wireless devices has become more widespread so has the potential for utilizing wireless networks for remote sensing applications. Regular wireless communication devices are not typically designed for remote sensing. Remote sensing techniques must be carefully tailored to the capabilities of these networks before they can be applied. Experimental verification of these techniques and algorithms requires robust yet flexible testbeds. In this dissertation, two experimental testbeds for the advancement of research into sensing across large-scale cognitive radio networks are presented. System architectures, implementations, capabilities, experimental verification, and performance are discussed. One testbed is designed for the collection of scattering data to be used in RF and wireless tomography research. This system is used to collect full complex scattering data using a vector network analyzer (VNA) and amplitude-only data using non-synchronous software-defined radios (SDRs). Collected data is used to experimentally validate a technique for phase reconstruction using semidefinite relaxation and demonstrate the feasibility of wireless tomography. The second testbed is a SDR network for the collection of experimental data. The development of tools for network maintenance and data collection is presented and discussed. A novel recursive weighted centroid algorithm for device-free target localization using the variance of received signal strength for wireless links is proposed. The signal variance resulting from a moving target is modeled as having contours related to Cassini ovals. This model is used to formulate recursive weights which reduce the influence of wireless links that are farther from the target location estimate. The algorithm and its implementation on this testbed are presented and experimental results discussed.
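
    A minimal sketch of the recursive weighted centroid idea described above, with a Gaussian distance weighting standing in for the Cassini-oval-based weights and synthetic link midpoints and RSS variances; it is meant only to illustrate the recursion, not the dissertation's exact formulation.

```python
import numpy as np

def recursive_weighted_centroid(link_mid, rss_var, n_iter=5, sigma=3.0):
    """Estimate a device-free target position from wireless-link RSS variance.
    link_mid: (n_links, 2) midpoints of transmitter-receiver links.
    rss_var:  (n_links,) received-signal-strength variance per link."""
    est = np.average(link_mid, axis=0, weights=rss_var)   # initial weighted centroid
    for _ in range(n_iter):
        d = np.linalg.norm(link_mid - est, axis=1)
        w = rss_var * np.exp(-(d / sigma) ** 2)   # de-emphasize links far from the estimate
        est = np.average(link_mid, axis=0, weights=w)
    return est

# Synthetic example: target near (4, 6); links closer to it show higher RSS variance
rng = np.random.default_rng(2)
mids = rng.uniform(0, 10, size=(30, 2))
var = np.exp(-np.linalg.norm(mids - np.array([4.0, 6.0]), axis=1)) + 0.05 * rng.random(30)
print(recursive_weighted_centroid(mids, var))
```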

  12. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    Science.gov (United States)

    Taylor, Jaime; Rakoczy, John; Steincamp, James

    2003-01-01

    Phase retrieval requires calculation of the real-valued phase of the pupil function from the image intensity distribution and characteristics of an optical system. Genetic algorithms (GAs) were used to solve two one-dimensional phase retrieval problems. A GA successfully estimated the coefficients of a polynomial expansion of the phase when the number of coefficients was correctly specified. A GA also successfully estimated the multiple phases of a segmented optical system analogous to the seven-mirror Systematic Image-Based Optical Alignment (SIBOA) testbed located at NASA's Marshall Space Flight Center. The SIBOA testbed was developed to investigate phase retrieval techniques. Tip/tilt and piston motions of the mirrors accomplish phase corrections. A constant phase over each mirror can be achieved by an independent tip/tilt correction: the phase correction term can then be factored out of the Discrete Fourier Transform (DFT), greatly reducing computations.
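
    To make the GA approach concrete, below is a minimal one-dimensional sketch in which a small real-coded GA estimates the coefficients of a polynomial phase from image-plane intensity; the pupil, population settings, and mutation scale are illustrative assumptions, and the usual phase-retrieval sign and offset ambiguities are ignored.

```python
import numpy as np

N, N_COEF = 64, 3
x = np.linspace(-1, 1, N)
pupil = (np.abs(x) <= 0.5).astype(float)      # known pupil amplitude
true_coef = np.array([0.8, -0.4, 0.3])        # "unknown" phase polynomial coefficients

def intensity(coef):
    phase = np.polyval(coef[::-1], x)          # c0 + c1*x + c2*x^2
    field = pupil * np.exp(1j * phase)
    return np.abs(np.fft.fftshift(np.fft.fft(field))) ** 2

measured = intensity(true_coef)

def fitness(coef):
    return -np.sum((intensity(coef) - measured) ** 2)   # higher is better

rng = np.random.default_rng(3)
pop = rng.uniform(-1, 1, size=(60, N_COEF))
for _ in range(200):
    order = np.argsort([fitness(c) for c in pop])[::-1]
    parents = pop[order[:20]]                              # truncation selection
    children = []
    while len(children) < 40:
        a, b = parents[rng.integers(20, size=2)]
        child = np.where(rng.random(N_COEF) < 0.5, a, b)   # uniform crossover
        child = child + rng.normal(0, 0.05, N_COEF)        # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
print(best)   # estimate of the phase polynomial coefficients
```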

  13. Development of a space-systems network testbed

    Science.gov (United States)

    Lala, Jaynarayan; Alger, Linda; Adams, Stuart; Burkhardt, Laura; Nagle, Gail; Murray, Nicholas

    1988-01-01

    This paper describes a communications network testbed which has been designed to allow the development of architectures and algorithms that meet the functional requirements of future NASA communication systems. The central hardware components of the Network Testbed are programmable circuit switching communication nodes which can be adapted by software or firmware changes to customize the testbed to particular architectures and algorithms. Fault detection, isolation, and reconfiguration has been implemented in the Network with a hybrid approach which utilizes features of both centralized and distributed techniques to provide efficient handling of faults within the Network.

  14. Development of a Tethered Formation Flight Testbed for ISS, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The development of a testbed for the development and demonstration of technologies needed by tethered formation flying satellites is proposed. Such a testbed would...

  15. The Soil Moisture Active Passive Mission (SMAP) Science Data Products: Results of Testing with Field Experiment and Algorithm Testbed Simulation Environment Data

    Science.gov (United States)

    Entekhabi, Dara; Njoku, Eni E.; O'Neill, Peggy E.; Kellogg, Kent H.; Entin, Jared K.

    2010-01-01

    Talk outline:
    1. Derivation of SMAP basic and applied science requirements from the NRC Earth Science Decadal Survey applications
    2. Data products and latencies
    3. Algorithm highlights
    4. SMAP Algorithm Testbed
    5. SMAP Working Groups and community engagement

  16. Phase Retrieval Using a Genetic Algorithm on the Systematic Image-Based Optical Alignment Testbed

    Science.gov (United States)

    Taylor, Jaime R.

    2003-01-01

    NASA's Marshall Space Flight Center's Systematic Image-Based Optical Alignment (SIBOA) Testbed was developed to test phase retrieval algorithms and hardware techniques. Individuals working with the facility developed the idea of implementing phase retrieval by separating the determination of the tip/tilt of each mirror from the piston motion (or translation) of each mirror. Presented in this report is an algorithm that determines the optimal phase correction associated only with the piston motion of the mirrors. A description of the phase retrieval problem is first presented. The Systematic Image-Based Optical Alignment (SIBOA) Testbed is then described. A Discrete Fourier Transform (DFT) is necessary to transfer the incoming wavefront (or estimate of phase error) into the spatial frequency domain to compare it with the image. A method for reducing the DFT to seven scalar/matrix multiplications is presented. A genetic algorithm is then used to search for the phase error. The results of this new algorithm on a test problem are presented.

  17. Link Adaptation for Mitigating Earth-To-Space Propagation Effects on the NASA SCaN Testbed

    Science.gov (United States)

    Kilcoyne, Deirdre K.; Headley, William C.; Leffke, Zach J.; Rowe, Sonya A.; Mortensen, Dale J.; Reinhart, Richard C.; McGwier, Robert W.

    2016-01-01

    In Earth-to-Space communications, well-known propagation effects such as path loss and atmospheric loss can lead to fluctuations in the strength of the communications link between a satellite and its ground station. Additionally, the typically unconsidered effect of shadowing due to the geometry of the satellite and its solar panels can also lead to link degradation. As a result of these anticipated channel impairments, NASA's communication links have been traditionally designed to handle the worst-case impact of these effects through high link margins and static, lower rate, modulation formats. The work presented in this paper aims to relax these constraints by providing an improved trade-off between data rate and link margin through utilizing link adaptation. More specifically, this work provides a simulation study on the propagation effects impacting NASA's SCaN Testbed flight software-defined radio (SDR) as well as proposes a link adaptation algorithm that varies the modulation format of a communications link as its signal-to-noise ratio fluctuates. Ultimately, the models developed in this work will be utilized to conduct real-time flight experiments on-board the NASA SCaN Testbed.
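
    A minimal sketch of threshold-based link adaptation with hysteresis, which is one common way to vary the modulation format as the SNR fluctuates; the modulation table, thresholds, and margin below are illustrative assumptions, not SCaN Testbed parameters.

```python
from dataclasses import dataclass

# Hypothetical modulation table: (name, bits per symbol, minimum SNR in dB)
MODES = [("BPSK", 1, 4.0), ("QPSK", 2, 7.0), ("8PSK", 3, 10.5), ("16APSK", 4, 14.0)]

@dataclass
class LinkAdapter:
    hysteresis_db: float = 1.0     # extra margin required before stepping up
    current: int = 0               # index into MODES

    def update(self, snr_db: float) -> str:
        # Step down immediately if the SNR no longer supports the current mode
        while self.current > 0 and snr_db < MODES[self.current][2]:
            self.current -= 1
        # Step up only when the SNR clears the next threshold plus hysteresis
        while (self.current + 1 < len(MODES)
               and snr_db >= MODES[self.current + 1][2] + self.hysteresis_db):
            self.current += 1
        return MODES[self.current][0]

adapter = LinkAdapter()
for snr in [5.2, 8.3, 12.0, 15.5, 9.1, 3.0]:   # simulated SNR trace over a pass
    print(snr, adapter.update(snr))
```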

  18. Experimental Validation of Advanced Dispersed Fringe Sensing (ADFS) Algorithm Using Advanced Wavefront Sensing and Correction Testbed (AWCT)

    Science.gov (United States)

    Wang, Xu; Shi, Fang; Sigrist, Norbert; Seo, Byoung-Joon; Tang, Hong; Bikkannavar, Siddarayappa; Basinger, Scott; Lay, Oliver

    2012-01-01

    Large-aperture telescopes commonly feature segmented mirrors, and a coarse phasing step is needed to bring these individual segments into the fine-phasing capture range. Dispersed Fringe Sensing (DFS) is a powerful coarse phasing technique, and a variation of it is currently being used for JWST. An Advanced Dispersed Fringe Sensing (ADFS) algorithm was recently developed to improve the performance and robustness of previous DFS algorithms, with better accuracy and a unique solution. The first part of the paper introduces the basic ideas and the essential features of the ADFS algorithm and presents some algorithm sensitivity study results. The second part of the paper describes the full details of the algorithm validation process on the Advanced Wavefront Sensing and Correction Testbed (AWCT): first, the optimization of the DFS hardware of AWCT to ensure data accuracy and reliability is illustrated. Then, a few carefully designed algorithm validation experiments are implemented, and the corresponding data analysis results are shown. Finally, fiducial calibration using the Range-Gate-Metrology technique is carried out, and an algorithm accuracy of <10 nm, or <1%, is demonstrated.

  19. Comparison of two matrix data structures for advanced CSM testbed applications

    Science.gov (United States)

    Regelbrugge, M. E.; Brogan, F. A.; Nour-Omid, B.; Rankin, C. C.; Wright, M. A.

    1989-01-01

    The first section describes data storage schemes presently used by the Computational Structural Mechanics (CSM) testbed sparse matrix facilities and similar skyline (profile) matrix facilities. The second section contains a discussion of certain features required for the implementation of particular advanced CSM algorithms, and how these features might be incorporated into the data storage schemes described previously. The third section presents recommendations, based on the discussions of the prior sections, for directing future CSM testbed development to provide necessary matrix facilities for advanced algorithm implementation and use. The objective is to lend insight into the matrix structures discussed and to help explain the process of evaluating alternative matrix data structures and utilities for subsequent use in the CSM testbed.
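
    The skyline (profile) storage that this section compares with sparse storage can be illustrated briefly: each column of a symmetric matrix is stored from its first nonzero entry down to the diagonal, together with a column-pointer array. The indexing convention below is one common choice, not necessarily the CSM testbed's; a minimal Python sketch:

```python
import numpy as np

def to_skyline(A):
    """Store the upper triangle of a symmetric matrix column-wise,
    from the first nonzero row of each column down to the diagonal."""
    n = A.shape[0]
    values, colptr = [], [0]
    for j in range(n):
        first = next(i for i in range(j + 1) if A[i, j] != 0.0 or i == j)
        values.extend(A[first:j + 1, j])
        colptr.append(len(values))
    return np.array(values), np.array(colptr)

def skyline_get(values, colptr, i, j):
    """Retrieve A[i, j] (symmetric) from skyline storage."""
    if i > j:
        i, j = j, i
    height = colptr[j + 1] - colptr[j]          # number of stored entries in column j
    first = j + 1 - height                      # first stored row index of column j
    return 0.0 if i < first else values[colptr[j] + (i - first)]

A = np.array([[4.0, 1.0, 0.0, 0.0],
              [1.0, 5.0, 2.0, 0.0],
              [0.0, 2.0, 6.0, 3.0],
              [0.0, 0.0, 3.0, 7.0]])
vals, ptr = to_skyline(A)
assert skyline_get(vals, ptr, 3, 2) == 3.0 and skyline_get(vals, ptr, 0, 3) == 0.0
```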

  20. ALGORITHM OF SAR SATELLITE ATTITUDE MEASUREMENT USING GPS AIDED BY KINEMATIC VECTOR

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    In this paper, in order to improve the accuracy of Synthetic Aperture Radar (SAR) satellite attitude determination using the GPS wide-band carrier phase, the SAR satellite attitude kinematic vector and the Kalman filter are introduced. Introducing the state-variable function of the GPS attitude determination algorithm for the SAR satellite by means of the kinematic vector, and describing the observation function by the GPS wide-band carrier phase, the paper uses the Kalman filter algorithm to obtain the attitude variables of the SAR satellite. Comparing the simulation results of the Kalman filter algorithm with those of the least-squares algorithm and the explicit solution indicates that the Kalman filter algorithm performs best.
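
    For readers unfamiliar with the filter structure referred to here, below is a generic linear Kalman filter predict/update sketch under simplified assumptions (a small linear state and known noise covariances); the one-axis toy model is a placeholder, not the paper's SAR attitude formulation.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict with the (kinematic) state-transition model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the observation z (e.g., carrier-phase-derived angle)
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy 1-axis example: state = [attitude angle, angular rate], observation = angle
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-5 * np.eye(2)
R = np.array([[1e-3]])
x, P = np.zeros(2), np.eye(2)
for z in [0.01, 0.012, 0.015, 0.02]:         # simulated angle measurements (rad)
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
print(x)
```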

  1. Use of Tabu Search in a Solver to Map Complex Networks onto Emulab Testbeds

    National Research Council Canada - National Science Library

    MacDonald, Jason E

    2007-01-01

    The University of Utah's solver for the testbed mapping problem uses a simulated annealing metaheuristic algorithm to map a researcher's experimental network topology onto available testbed resources...

  2. Trace explosives sensor testbed (TESTbed)

    Science.gov (United States)

    Collins, Greg E.; Malito, Michael P.; Tamanaha, Cy R.; Hammond, Mark H.; Giordano, Braden C.; Lubrano, Adam L.; Field, Christopher R.; Rogers, Duane A.; Jeffries, Russell A.; Colton, Richard J.; Rose-Pehrsson, Susan L.

    2017-03-01

    A novel vapor delivery testbed, referred to as the Trace Explosives Sensor Testbed, or TESTbed, is demonstrated that is amenable to both high- and low-volatility explosives vapors including nitromethane, nitroglycerine, ethylene glycol dinitrate, triacetone triperoxide, 2,4,6-trinitrotoluene, pentaerythritol tetranitrate, and hexahydro-1,3,5-trinitro-1,3,5-triazine. The TESTbed incorporates a six-port dual-line manifold system allowing for rapid actuation between a dedicated clean air source and a trace explosives vapor source. Explosives and explosives-related vapors can be sourced through a number of means including gas cylinders, permeation tube ovens, dynamic headspace chambers, and a Pneumatically Modulated Liquid Delivery System coupled to a perfluoroalkoxy total-consumption microflow nebulizer. Key features of the TESTbed include continuous and pulseless control of trace vapor concentrations with wide dynamic range of concentration generation, six sampling ports with reproducible vapor profile outputs, limited low-volatility explosives adsorption to the manifold surface, temperature and humidity control of the vapor stream, and a graphical user interface for system operation and testing protocol implementation.

  3. An adaptable, low cost test-bed for unmanned vehicle systems research

    Science.gov (United States)

    Goppert, James M.

    2011-12-01

    An unmanned vehicle systems test-bed has been developed. The test-bed has been designed to accommodate hardware changes and various vehicle types and algorithms. The creation of this test-bed allows research teams to focus on algorithm development and employ a common well-tested experimental framework. The ArduPilotOne autopilot was developed to provide the necessary level of abstraction for multiple vehicle types. The autopilot was also designed to be highly integrated with the Mavlink protocol for Micro Air Vehicle (MAV) communication. Mavlink is the native protocol for QGroundControl, a MAV ground control program. Features were added to QGroundControl to accommodate outdoor usage. Next, the Mavsim toolbox was developed for Scicoslab to allow hardware-in-the-loop testing, control design and analysis, and estimation algorithm testing and verification. In order to obtain linear models of aircraft dynamics, the JSBSim flight dynamics engine was extended to use a probabilistic Nelder-Mead simplex method. The JSBSim aircraft dynamics were compared with wind-tunnel data collected. Finally, a structured methodology for successive loop closure control design is proposed. This methodology is demonstrated along with the rest of the test-bed tools on a quadrotor, a fixed wing RC plane, and a ground vehicle. Test results for the ground vehicle are presented.

  4. Network design consideration of a satellite-based mobile communications system

    Science.gov (United States)

    Yan, T.-Y.

    1986-01-01

    Technical considerations for the Mobile Satellite Experiment (MSAT-X), the ground segment testbed for the low-cost spectral efficient satellite-based mobile communications technologies being developed for the 1990's, are discussed. The Network Management Center contains a flexible resource sharing algorithm, the Demand Assigned Multiple Access scheme, which partitions the satellite transponder bandwidth among voice, data, and request channels. Satellite use of multiple UHF beams permits frequency reuse. The backhaul communications and the Telemetry, Tracking and Control traffic are provided through a single full-coverage SHF beam. Mobile Terminals communicate with the satellite using UHF. All communications including SHF-SHF between Base Stations and/or Gateways, are routed through the satellite. Because MSAT-X is an experimental network, higher level network protocols (which are service-specific) will be developed only to test the operation of the lowest three levels, the physical, data link, and network layers.

  5. Scheduling algorithm for data relay satellite optical communication based on artificial intelligent optimization

    Science.gov (United States)

    Zhao, Wei-hu; Zhao, Jing; Zhao, Shang-hong; Li, Yong-jun; Wang, Xiang; Dong, Yi; Dong, Chen

    2013-08-01

    Optical satellite communication, with the advantages of broadband, large capacity, and low power consumption, breaks the bottleneck of traditional microwave satellite communication. The formation of a space-based information system built on high-performance optical inter-satellite communication, and the realization of global seamless coverage and mobile terminal access, are the necessary trends in the development of optical satellite communication. Considering the resources, missions, and constraints of a data relay satellite optical communication system, a model of optical communication resource scheduling is established and a scheduling algorithm based on artificial-intelligence optimization is put forward. For the multi-relay-satellite, multi-user-satellite, multi-optical-antenna, multi-mission case with several priority weights, the resources are scheduled reasonably through two operations: "Ascertain Current Mission Scheduling Time" and "Refresh Latter Mission Time-Window". The priority weight is used as a parameter of the fitness function, and the scheduling plan is optimized by a genetic algorithm. In a simulation scenario including 3 relay satellites with 6 optical antennas, 12 user satellites, and 30 missions, the results reveal that the algorithm obtains satisfactory results in both efficiency and performance, and that the resource scheduling model and the optimization algorithm are suitable for the multi-relay-satellite, multi-user-satellite, multi-optical-antenna resource scheduling problem.

  6. Design and Prototyping of a Satellite Antenna Slew Testbed

    Science.gov (United States)

    2013-12-01

    … closely match the computed trajectory. The position and velocity results were then implemented on the testbed motors for comparison of actual versus commanded values.

  7. A commercial space technology testbed on ISS

    Science.gov (United States)

    Boyle, David R.

    2000-01-01

    There is a significant and growing commercial market for new, more capable communications and remote sensing satellites. Competition in this market strongly motivates satellite manufacturers and spacecraft component developers to test and demonstrate new space hardware in a realistic environment. External attach points on the International Space Station allow it to function uniquely as a space technology testbed to satisfy this market need. However, space industry officials have identified three critical barriers to their commercial use of the ISS: unpredictable access, cost risk, and schedule uncertainty. Appropriate NASA policy initiatives and business/technical assistance for industry from the Commercial Space Center for Engineering can overcome these barriers.

  8. Solar Resource Assessment with Sky Imagery and a Virtual Testbed for Sky Imager Solar Forecasting

    Science.gov (United States)

    Kurtz, Benjamin Bernard

    In recent years, ground-based sky imagers have emerged as a promising tool for forecasting solar energy on short time scales (0 to 30 minutes ahead). Following the development of sky imager hardware and algorithms at UC San Diego, we present three new or improved algorithms for sky imager forecasting and forecast evaluation. First, we present an algorithm for measuring irradiance with a sky imager. Sky imager forecasts are often used in conjunction with other instruments for measuring irradiance, so this has the potential to decrease instrumentation costs and logistical complexity. In particular, the forecast algorithm itself often relies on knowledge of the current irradiance which can now be provided directly from the sky images. Irradiance measurements are accurate to within about 10%. Second, we demonstrate a virtual sky imager testbed that can be used for validating and enhancing the forecast algorithm. The testbed uses high-quality (but slow) simulations to produce virtual clouds and sky images. Because virtual cloud locations are known, much more advanced validation procedures are possible with the virtual testbed than with measured data. In this way, we are able to determine that camera geometry and non-uniform evolution of the cloud field are the two largest sources of forecast error. Finally, with the assistance of the virtual sky imager testbed, we develop improvements to the cloud advection model used for forecasting. The new advection schemes are 10-20% better at short time horizons.
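
    For context on the advection model mentioned here, a minimal frozen-cloud sketch: estimate a motion vector between two consecutive cloud maps by brute-force cross-correlation and shift the current map forward in time. The synthetic cloud field, search window, and integer-pixel motion are illustrative simplifications, not the dissertation's improved advection schemes.

```python
import numpy as np

def estimate_motion(prev, curr, max_shift=10):
    """Estimate integer pixel motion between two cloud maps by brute-force
    cross-correlation over a small search window."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = np.sum(np.roll(prev, (dy, dx), axis=(0, 1)) * curr)
            if score > best_score:
                best, best_score = (dy, dx), score
    return best

def advect(cloud_map, motion, steps=1):
    """Frozen-cloud forecast: shift the current map by the motion vector."""
    dy, dx = motion
    return np.roll(cloud_map, (steps * dy, steps * dx), axis=(0, 1))

# Synthetic example: a cloud blob moving 2 px down and 3 px right per frame
grid = np.zeros((100, 100))
grid[20:35, 30:50] = 1.0
prev, curr = grid, np.roll(grid, (2, 3), axis=(0, 1))
motion = estimate_motion(prev, curr)
forecast = advect(curr, motion, steps=5)      # cloud positions 5 frames ahead
print(motion)
```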

  9. Heuristic Scheduling Algorithm Oriented Dynamic Tasks for Imaging Satellites

    Directory of Open Access Journals (Sweden)

    Maocai Wang

    2014-01-01

    Full Text Available Imaging satellite scheduling is an NP-hard problem with many complex constraints. This paper studies the scheduling problem for dynamic tasks arising in emergency cases. After the dynamic properties of satellite scheduling are analyzed, an optimization model is proposed. Based on the model, two heuristic algorithms are proposed to solve the problem. The first heuristic algorithm arranges new tasks by inserting or deleting them and then inserting them repeatedly according to priority, from low to high; it is named the IDI algorithm. The second one, called ISDR, adopts four steps: insert directly, insert by shifting, insert by deleting, and reinsert the deleted tasks. Moreover, two heuristic factors, the congestion degree of a time window and the overlapping degree of a task, are employed to improve the algorithms' performance. Finally, a case is given to test the algorithms. The results show that the IDI algorithm is better than ISDR in terms of running time, while the ISDR algorithm with heuristic factors is more effective with regard to scheduling performance. Moreover, the results also show that the method has good performance for larger sets of dynamic tasks in comparison with the other two methods.
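
    As a rough illustration of the insertion step underlying such heuristics, below is a much-simplified, single-resource sketch that inserts dynamic tasks in priority order into free time windows; it omits the shifting, deleting, and re-insertion stages and the heuristic factors, and the task fields and processing order are illustrative assumptions rather than the paper's formulation.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Task:
    name: str
    priority: int                 # higher value = more important
    duration: float
    window: Tuple[float, float]   # earliest start, latest finish

def earliest_slot(task: Task, busy: List[Tuple[float, float]]) -> Optional[float]:
    """Find the earliest start time inside the task's window that does not
    overlap already scheduled intervals (busy must be sorted by start)."""
    start = task.window[0]
    for b_start, b_end in busy:
        if start + task.duration <= b_start:
            break
        start = max(start, b_end)
    return start if start + task.duration <= task.window[1] else None

def insert_by_priority(new_tasks: List[Task], busy: List[Tuple[float, float]]):
    """Try to insert dynamic tasks in descending priority order."""
    scheduled = []
    for task in sorted(new_tasks, key=lambda t: -t.priority):
        slot = earliest_slot(task, busy)
        if slot is not None:
            busy.append((slot, slot + task.duration))
            busy.sort()
            scheduled.append((task.name, slot))
    return scheduled

busy = [(2.0, 4.0), (6.0, 7.0)]   # intervals already occupied by the baseline plan
new = [Task("urgent", 9, 1.5, (0.0, 10.0)), Task("routine", 3, 2.5, (0.0, 10.0))]
print(insert_by_priority(new, busy))
```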

  10. COLUMBUS as Engineering Testbed for Communications and Multimedia Equipment

    Science.gov (United States)

    Bank, C.; Anspach von Broecker, G. O.; Kolloge, H.-G.; Richters, M.; Rauer, D.; Urban, G.; Canovai, G.; Oesterle, E.

    2002-01-01

    The paper presents ongoing activities to prepare COLUMBUS for communications and multimedia technology experiments. For this purpose, Astrium SI, Bremen, has studied several options how to best combine the given system architecture with flexible and state-of-the-art interface avionics and software. These activities have been conducted in coordination with, and partially under contract of, DLR and ESA/ESTEC. Moreover, Astrium SI has realized three testbeds for multimedia software and hardware testing under own funding. The experimental core avionics unit - about a half double rack - establishes the core of a new multi-user experiment facility for this type of investigation onboard COLUMBUS, which shall be available to all users of COLUMBUS. It allows for the connection of 2nd generation payload, that is payload requiring broadband data transfer and near-real-time access by the Principal Investigator on ground, to test highly interactive and near-real-time payload operation. The facility is also foreseen to test new equipment to provide the astronauts onboard the ISS/COLUMBUS with bi-directional hi-fi voice and video connectivity to ground, private voice coms and e-mail, and a multimedia workstation for ops training and recreation. Connection to an appropriate Wide Area Network (WAN) on Earth is possible. The facility will include a broadband data transmission front-end terminal, which is mounted externally on the COLUMBUS module. This Equipment provides high flexibility due to the complete transparent transmit and receive chains, the steerable multi-frequency antenna system and its own thermal and power control and distribution. The Equipment is monitored and controlled via the COLUMBUS internal facility. It combines several new hardware items, which are newly developed for the next generation of broadband communication satellites and operates in Ka-band with the experimental ESA data relay satellite ARTEMIS. The equipment is also TDRSS compatible; the open loop

  11. A Study on Fuel Estimation Algorithms for a Geostationary Communication & Broadcasting Satellite

    Directory of Open Access Journals (Sweden)

    Jong Won Eun

    2000-12-01

    Full Text Available A fuel estimation method has been developed to calculate the fuel budget for a geostationary communication and broadcasting satellite. It is quite essential that the pre-launch fuel budget estimation account for the deterministic transfer and drift orbit maneuver requirements. After the satellite is on-station, the calculation of satellite lifetime should be based on the estimation of remaining fuel and an assessment of actual performance. These estimations stem from the proper algorithms to produce the prediction of satellite lifetime. This paper concentrates on the fuel estimation method that was studied for calculation of the propellant budget by using the given algorithms. Applications of this method are discussed for a communication and broadcasting satellite.
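
    As background for this kind of fuel budgeting, a minimal sketch of the standard Tsiolkovsky rocket-equation bookkeeping that pre-launch budgets are often built on; the maneuver list, delta-v values, masses, and specific impulse below are placeholders, not the algorithms or numbers used in the paper.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def propellant_for_dv(mass_initial_kg, dv_ms, isp_s):
    """Propellant consumed by a maneuver of delta-v dv_ms (Tsiolkovsky rocket equation)."""
    return mass_initial_kg * (1.0 - math.exp(-dv_ms / (isp_s * G0)))

# Hypothetical budget: transfer-orbit injection corrections, drift-orbit maneuvers,
# then roughly 50 m/s per year of station keeping for a GEO communications satellite.
mass = 1800.0                     # separation mass, kg (placeholder)
isp = 300.0                       # thruster specific impulse, s (placeholder)
budget = [("apogee/transfer", 1500.0), ("drift & acquisition", 30.0)]
budget += [(f"station keeping yr {i+1}", 50.0) for i in range(12)]

for name, dv in budget:
    used = propellant_for_dv(mass, dv, isp)
    mass -= used
    print(f"{name:22s} dv={dv:7.1f} m/s  propellant={used:7.1f} kg  remaining={mass:8.1f} kg")
```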

  12. Smart Antenna UKM Testbed for Digital Beamforming System

    Directory of Open Access Journals (Sweden)

    2009-03-01

    Full Text Available A new design of a smart antenna testbed developed at UKM for digital beamforming purposes is proposed. The smart antenna UKM testbed is developed based on a modular design employing two novel designs: an L-probe fed inverted hybrid E-H (LIEH) array antenna and a software-reconfigurable digital beamforming system (DBS). The antenna is developed using the novel LIEH microstrip patch element design arranged into a 4×1 uniform linear array. The modular concept of the system provides the capability to test the antenna hardware, beamforming unit, and beamforming algorithm independently, thus allowing the smart antenna system to be developed and tested in parallel and hence reducing the design time. The DBS was developed using a high-performance TMS320C6711TM floating-point DSP board and a 4-channel RF front-end receiver developed in-house. An interface board is designed to connect the ADC board with the RF front-end receiver. A four-element receiving array testbed at 1.88–2.22 GHz is constructed, and digital beamforming on this testbed is successfully demonstrated.

  13. Proportional fair scheduling algorithm based on traffic in satellite communication system

    Science.gov (United States)

    Pan, Cheng-Sheng; Sui, Shi-Long; Liu, Chun-ling; Shi, Yu-Xin

    2018-02-01

    In the satellite communication network system, in order to solve the problem of low system capacity and poor user fairness when multiple users access the satellite network on the downlink, and taking into account the characteristics of user data traffic, a scheduling algorithm addressing throughput capacity and user fairness is proposed: the Proportional Fair Algorithm Based on Traffic (B-PF). The algorithm improves on the proportional fair algorithm used in wireless communication systems by taking into account both the user's channel condition and the buffered traffic information. The user's outgoing traffic is treated as an adjustment factor for the scheduling priority, and the concept of traffic satisfaction is introduced. Firstly, the algorithm calculates the priority of each user according to the scheduling rule and dispatches the user with the highest priority. Secondly, when a scheduled user's traffic is already satisfied, the system dispatches the next-priority user. The simulation results show that, compared with the PF algorithm, B-PF can improve the system throughput, traffic satisfaction, and fairness.
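
    A minimal sketch of the idea of weighting the proportional-fair metric by buffered traffic, under stated assumptions: the classic PF ratio is simply scaled by each user's queued bits, and users with empty queues are skipped. This does not reproduce the paper's exact priority function or satisfaction criterion.

```python
import numpy as np

def bpf_schedule(inst_rate, avg_rate, queued_bits):
    """Pick one user per slot: proportional-fair metric scaled by queued traffic,
    ignoring users whose queues are already empty (traffic satisfied)."""
    metric = (inst_rate / np.maximum(avg_rate, 1e-9)) * queued_bits
    metric[queued_bits <= 0] = -np.inf
    return int(np.argmax(metric))

rng = np.random.default_rng(4)
n_users, n_slots, slot_bits = 4, 200, 1000.0
avg = np.full(n_users, 1e-9)                     # EWMA of served rate per user
queue = rng.uniform(2e4, 8e4, n_users)           # buffered traffic per user (bits)
served = np.zeros(n_users)

for _ in range(n_slots):
    rate = rng.uniform(0.2, 1.0, n_users) * slot_bits   # channel-dependent slot capacity
    u = bpf_schedule(rate, avg, queue)
    tx = min(rate[u], queue[u])
    queue[u] -= tx
    served[u] += tx
    avg = 0.99 * avg + 0.01 * np.where(np.arange(n_users) == u, rate, 0.0)

print(served, queue)
```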

  14. Technical Report Series on Global Modeling and Data Assimilation. Volume 12; Comparison of Satellite Global Rainfall Algorithms

    Science.gov (United States)

    Suarez, Max J. (Editor); Chang, Alfred T. C.; Chiu, Long S.

    1997-01-01

    Seventeen months of rainfall data (August 1987-December 1988) from nine satellite rainfall algorithms (Adler, Chang, Kummerow, Prabhakara, Huffman, Spencer, Susskind, and Wu) were analyzed to examine the uncertainty of satellite-derived rainfall estimates. The variability among algorithms, measured as the standard deviation computed from the ensemble of algorithms, shows that regions of high algorithm variability tend to coincide with regions of high rain rates. Histograms of pattern correlation (PC) between algorithms suggest a bimodal distribution, with separation at a PC value of about 0.85. Applying this threshold as a criterion for similarity, our analyses show that algorithms using the same sensor or satellite input tend to be similar, suggesting the dominance of sampling errors in these satellite estimates.

  15. A Multi-Vehicles, Wireless Testbed for Networked Control, Communications and Computing

    Science.gov (United States)

    Murray, Richard; Doyle, John; Effros, Michelle; Hickey, Jason; Low, Steven

    2002-03-01

    We have constructed a testbed consisting of 4 mobile vehicles (with 4 additional vehicles being completed), each with embedded computing and communications capability, for use in testing new approaches for command and control across dynamic networks. The system is being used or is planned to be used for testing of a variety of communications-related technologies, including distributed command and control algorithms, dynamically reconfigurable network topologies, source coding for real-time transmission of data in lossy environments, and multi-network communications. A unique feature of the testbed is the use of vehicles that have second-order dynamics, requiring real-time feedback algorithms to stabilize the system while performing cooperative tasks. The testbed was constructed in the Caltech Vehicles Laboratory and consists of individual vehicles with PC-based computation and controls and multiple communications devices (802.11 wireless Ethernet, Bluetooth, and infrared). The vehicles are freely moving, wheeled platforms propelled by high-performance ducted fans. The room contains access points for an 802.11 network, overhead visual sensing (to allow emulation of GPS signal processing), a centralized computer for emulating certain distributed computations, and network gateways to control and manipulate communications traffic.

  16. An Online Tilt Estimation and Compensation Algorithm for a Small Satellite Camera

    Science.gov (United States)

    Lee, Da-Hyun; Hwang, Jai-hyuk

    2018-04-01

    In the case of a satellite camera designed to execute an Earth observation mission, even after a pre-launch precision alignment process has been carried out, misalignment will occur due to external factors during the launch and in the operating environment. In particular, for high-resolution satellite cameras, which require submicron accuracy for alignment between optical components, misalignment is a major cause of image quality degradation. To compensate for this, most high-resolution satellite cameras undergo a precise realignment process called refocusing before and during the operation process. However, conventional Earth observation satellites only execute refocusing upon de-space. Thus, in this paper, an online tilt estimation and compensation algorithm that can be utilized after de-space correction is proposed. Although the sensitivity of the optical performance degradation due to misalignment is highest for de-space, the MTF can be additionally increased by correcting tilt after refocusing. The algorithm proposed in this research can be used to estimate the amount of tilt that occurs by taking star images, and it can also be used to carry out automatic tilt corrections by employing a compensation mechanism that gives angular motion to the secondary mirror. Crucially, this algorithm is developed using an online processing system so that it can operate without communication with the ground.

  17. Theoretical algorithms for satellite-derived sea surface temperatures

    Science.gov (United States)

    Barton, I. J.; Zavody, A. M.; O'Brien, D. M.; Cutten, D. R.; Saunders, R. W.; Llewellyn-Jones, D. T.

    1989-03-01

    Reliable climate forecasting using numerical models of the ocean-atmosphere system requires accurate data sets of sea surface temperature (SST) and surface wind stress. Global sets of these data will be supplied by the instruments to fly on the ERS 1 satellite in 1990. One of these instruments, the Along-Track Scanning Radiometer (ATSR), has been specifically designed to provide SST in cloud-free areas with an accuracy of 0.3 K. The expected capabilities of the ATSR can be assessed using transmission models of infrared radiative transfer through the atmosphere. The performances of several different models are compared by estimating the infrared brightness temperatures measured by the NOAA 9 AVHRR for three standard atmospheres. Of these, a computationally quick spectral band model is used to derive typical AVHRR and ATSR SST algorithms in the form of linear equations. These algorithms show that a low-noise 3.7-μm channel is required to give the best satellite-derived SST and that the design accuracy of the ATSR is likely to be achievable. The inclusion of extra water vapor information in the analysis did not improve the accuracy of multiwavelength SST algorithms, but some improvement was noted with the multiangle technique. Further modeling is required with atmospheric data that include both aerosol variations and abnormal vertical profiles of water vapor and temperature.
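
    As an illustration of SST algorithms "in the form of linear equations," the sketch below fits and applies a simple split-window style regression on synthetic brightness temperatures; the channel set, toy atmosphere, and resulting coefficients are assumptions for illustration, not the coefficients derived in the paper.

```python
import numpy as np

# Hypothetical training set: true SST plus simulated brightness temperatures (K)
# for 11 and 12 micron channels, as a radiative-transfer model might provide.
rng = np.random.default_rng(5)
sst_true = rng.uniform(271.0, 303.0, 500)
wv = rng.uniform(0.5, 5.0, 500)                       # column water vapour (cm) drives the split
t11 = sst_true - 0.8 * wv + rng.normal(0, 0.1, 500)   # toy atmospheric attenuation
t12 = sst_true - 1.4 * wv + rng.normal(0, 0.1, 500)

# Fit SST = a0 + a1*T11 + a2*(T11 - T12) by least squares
X = np.column_stack([np.ones_like(t11), t11, t11 - t12])
coef, *_ = np.linalg.lstsq(X, sst_true, rcond=None)
a0, a1, a2 = coef

sst_retrieved = a0 + a1 * t11 + a2 * (t11 - t12)
rms = np.sqrt(np.mean((sst_retrieved - sst_true) ** 2))
print(coef, rms)
```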

  18. A simple and efficient algorithm to estimate daily global solar radiation from geostationary satellite data

    International Nuclear Information System (INIS)

    Lu, Ning; Qin, Jun; Yang, Kun; Sun, Jiulin

    2011-01-01

    Surface global solar radiation (GSR) is the primary renewable energy in nature. Geostationary satellite data are used to map GSR in many inversion algorithms in which ground GSR measurements merely serve to validate the satellite retrievals. In this study, a simple algorithm with artificial neural network (ANN) modeling is proposed to explore the non-linear physical relationship between ground daily GSR measurements and Multi-functional Transport Satellite (MTSAT) all-channel observations in an effort to fully exploit information contained in both data sets. Singular value decomposition is implemented to extract the principal signals from satellite data and a novel method is applied to enhance ANN performance at high altitude. A three-layer feed-forward ANN model is trained with one year of daily GSR measurements at ten ground sites. This trained ANN is then used to map continuous daily GSR for two years, and its performance is validated at all 83 ground sites in China. The evaluation result demonstrates that this algorithm can quickly and efficiently build the ANN model that estimates daily GSR from geostationary satellite data with good accuracy in both space and time.
    Highlights:
    - A simple and efficient algorithm to estimate GSR from geostationary satellite data.
    - ANN model fully exploits both the information from satellite and ground measurements.
    - Good performance of the ANN model is comparable to that of the classical models.
    - Surface elevation and infrared information enhance GSR inversion.
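
    A minimal stand-in for the three-layer feed-forward ANN described here, using scikit-learn's MLPRegressor on synthetic data; the feature count, hidden-layer size, and data are assumptions for illustration only and do not reproduce the MTSAT inputs or the high-altitude enhancement.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the real inputs: principal components of satellite channel
# observations plus ancillary terms, with daily GSR (MJ m^-2) as the target.
rng = np.random.default_rng(6)
n_days, n_features = 2000, 6
X = rng.normal(size=(n_days, n_features))
y = 15.0 + 4.0 * X[:, 0] - 2.0 * X[:, 1] + 1.5 * np.tanh(X[:, 2]) + rng.normal(0, 0.5, n_days)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# A single hidden layer gives a "three-layer" (input-hidden-output) feed-forward network
ann = MLPRegressor(hidden_layer_sizes=(20,), activation="tanh",
                   max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)
rmse = np.sqrt(np.mean((ann.predict(X_te) - y_te) ** 2))
print(f"validation RMSE: {rmse:.2f} MJ m^-2")
```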

  19. CDRD and PNPR satellite passive microwave precipitation retrieval algorithms: EuroTRMM/EURAINSAT origins and H-SAF operations

    Science.gov (United States)

    Mugnai, A.; Smith, E. A.; Tripoli, G. J.; Bizzarri, B.; Casella, D.; Dietrich, S.; Di Paola, F.; Panegrossi, G.; Sanò, P.

    2013-04-01

    Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF) is a EUMETSAT (European Organisation for the Exploitation of Meteorological Satellites) program, designed to deliver satellite products of hydrological interest (precipitation, soil moisture and snow parameters) over the European and Mediterranean region to research and operations users worldwide. Six satellite precipitation algorithms and concomitant precipitation products are the responsibility of various agencies in Italy. Two of these algorithms have been designed for maximum accuracy by restricting their inputs to measurements from conical and cross-track scanning passive microwave (PMW) radiometers mounted on various low Earth orbiting satellites. They have been developed at the Italian National Research Council/Institute of Atmospheric Sciences and Climate in Rome (CNR/ISAC-Rome), and are providing operational retrievals of surface rain rate and its phase properties. Each of these algorithms is physically based, however, the first of these, referred to as the Cloud Dynamics and Radiation Database (CDRD) algorithm, uses a Bayesian-based solution solver, while the second, referred to as the PMW Neural-net Precipitation Retrieval (PNPR) algorithm, uses a neural network-based solution solver. Herein we first provide an overview of the two initial EU research and applications programs that motivated their initial development, EuroTRMM and EURAINSAT (European Satellite Rainfall Analysis and Monitoring at the Geostationary Scale), and the current H-SAF program that provides the framework for their operational use and continued development. We stress the relevance of the CDRD and PNPR algorithms and their precipitation products in helping secure the goals of H-SAF's scientific and operations agenda, the former helpful as a secondary calibration reference to other algorithms in H-SAF's complete mix of algorithms. Descriptions of the algorithms' designs are provided

  20. CDRD and PNPR satellite passive microwave precipitation retrieval algorithms: EuroTRMM/EURAINSAT origins and H-SAF operations

    Directory of Open Access Journals (Sweden)

    A. Mugnai

    2013-04-01

    Full Text Available Satellite Application Facility on Support to Operational Hydrology and Water Management (H-SAF) is a EUMETSAT (European Organisation for the Exploitation of Meteorological Satellites) program, designed to deliver satellite products of hydrological interest (precipitation, soil moisture and snow parameters) over the European and Mediterranean region to research and operations users worldwide. Six satellite precipitation algorithms and concomitant precipitation products are the responsibility of various agencies in Italy. Two of these algorithms have been designed for maximum accuracy by restricting their inputs to measurements from conical and cross-track scanning passive microwave (PMW) radiometers mounted on various low Earth orbiting satellites. They have been developed at the Italian National Research Council/Institute of Atmospheric Sciences and Climate in Rome (CNR/ISAC-Rome), and are providing operational retrievals of surface rain rate and its phase properties. Each of these algorithms is physically based, however, the first of these, referred to as the Cloud Dynamics and Radiation Database (CDRD) algorithm, uses a Bayesian-based solution solver, while the second, referred to as the PMW Neural-net Precipitation Retrieval (PNPR) algorithm, uses a neural network-based solution solver. Herein we first provide an overview of the two initial EU research and applications programs that motivated their initial development, EuroTRMM and EURAINSAT (European Satellite Rainfall Analysis and Monitoring at the Geostationary Scale), and the current H-SAF program that provides the framework for their operational use and continued development. We stress the relevance of the CDRD and PNPR algorithms and their precipitation products in helping secure the goals of H-SAF's scientific and operations agenda, the former helpful as a secondary calibration reference to other algorithms in H-SAF's complete mix of algorithms. Descriptions of the algorithms' designs are

  1. A Study on Fuel Estimation Algorithms for a Geostationary Communication & Broadcasting Satellite

    OpenAIRE

    Jong Won Eun

    2000-01-01

    A fuel estimation method has been developed to calculate the fuel budget for a geostationary communication and broadcasting satellite. It is quite essential that the pre-launch fuel budget estimation account for the deterministic transfer and drift orbit maneuver requirements. After the satellite is on-station, the calculation of satellite lifetime should be based on the estimation of remaining fuel and an assessment of actual performance. These estimations stem from the proper algorithms to produce the prediction of satellite lifet...

  2. Mapping Surface Broadband Albedo from Satellite Observations: A Review of Literatures on Algorithms and Products

    Directory of Open Access Journals (Sweden)

    Ying Qu

    2015-01-01

    Full Text Available Surface albedo is one of the key controlling geophysical parameters in surface energy budget studies, and its temporal and spatial variation is closely related to global climate change and regional weather systems due to the albedo feedback mechanism. As an efficient tool for monitoring the surfaces of the Earth, remote sensing has been widely used for deriving long-term surface broadband albedo with various geostationary and polar-orbit satellite platforms in recent decades. Moreover, the algorithms for estimating surface broadband albedo from satellite observations, including narrow-to-broadband conversions, bidirectional reflectance distribution function (BRDF) angular modeling, direct-estimation algorithms, and the algorithms for estimating albedo from geostationary satellite data, have been developed and improved. In this paper, we present a comprehensive literature review on algorithms and products for mapping surface broadband albedo with satellite observations and provide a discussion of different algorithms and products in a historical perspective based on citation analysis of the published literature. This paper shows that the observation technologies and accuracy requirements of applications are important, and that long-term, globally fully-covered (including land, ocean, and sea-ice surfaces), gap-free, surface broadband albedo products with higher spatial and temporal resolution are required for climate change, surface energy budget, and hydrological studies.

  3. Design of a nickel-hydrogen battery simulator for the NASA EOS testbed

    Science.gov (United States)

    Gur, Zvi; Mang, Xuesi; Patil, Ashok R.; Sable, Dan M.; Cho, Bo H.; Lee, Fred C.

    1992-01-01

    The hardware and software design of a nickel-hydrogen (Ni-H2) battery simulator (BS) with application to the NASA Earth Observation System (EOS) satellite is presented. The battery simulator is developed as a part of a complete testbed for the EOS satellite power system. The battery simulator involves both hardware and software components. The hardware component includes the capability of sourcing and sinking current at a constant programmable voltage. The software component includes the capability of monitoring the battery's ampere-hours (Ah) and programming the battery voltage according to an empirical model of the nickel-hydrogen battery stored in a computer.
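
    A minimal sketch of the bookkeeping such a battery simulator performs: integrate current to track ampere-hours and program the terminal voltage from an empirical state-of-charge curve. The capacity, voltage table, and internal resistance below are invented placeholders, not the EOS testbed's Ni-H2 model.

```python
import numpy as np

class BatterySimulator:
    """Toy battery simulator: tracks ampere-hours and returns a programmed
    terminal voltage from an empirical SOC-voltage table.
    All numbers are illustrative placeholders."""

    CAPACITY_AH = 50.0
    # Hypothetical empirical curve: state of charge -> open-circuit voltage
    SOC_POINTS = np.array([0.0, 0.2, 0.5, 0.8, 1.0])
    OCV_POINTS = np.array([1.15, 1.22, 1.27, 1.32, 1.40])  # volts per cell
    INTERNAL_R = 0.002  # ohms, fictitious

    def __init__(self, soc=1.0):
        self.ah = soc * self.CAPACITY_AH  # remaining ampere-hours

    def step(self, current_a, dt_s):
        """Advance the simulator by dt_s seconds.
        current_a > 0 means discharge (sinking), < 0 means charge (sourcing)."""
        self.ah -= current_a * dt_s / 3600.0
        self.ah = min(max(self.ah, 0.0), self.CAPACITY_AH)
        soc = self.ah / self.CAPACITY_AH
        ocv = np.interp(soc, self.SOC_POINTS, self.OCV_POINTS)
        return ocv - current_a * self.INTERNAL_R  # programmed terminal voltage

sim = BatterySimulator(soc=0.9)
for _ in range(10):
    print(round(sim.step(current_a=15.0, dt_s=60.0), 4))
```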

  4. Holodeck Testbed Project

    Science.gov (United States)

    Arias, Adriel (Inventor)

    2016-01-01

    The main objective of the Holodeck Testbed is to create a cost-effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed uses the products being developed in the Hybrid Reality Lab (HRL). The HRL combines technologies related to merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts that are needed to allow it to be integrated with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise computer virtual tool models (accurate down to the thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for these tools so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Toward the end of my internship, the lab bought a professional-grade 3D scanner. With this, I was able to replicate more intricate tools at a much more time-effective rate. The second task was to investigate the use of BCI to control

  5. Handoff algorithm for mobile satellite systems with ancillary terrestrial component

    KAUST Repository

    Sadek, Mirette

    2012-06-01

    This paper presents a locally optimal handoff algorithm for integrated satellite/ground communication systems. We derive the handoff decision function and present the results in the form of tradeoff curves between the number of handoffs and the number of link degradation events in a given distance covered by the mobile user. This is a practical receiver-controlled handoff algorithm that optimizes the handoff process from a user perspective based on the received signal strength rather than from a network perspective. © 2012 IEEE.
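
    The record does not reproduce the derived handoff decision function; the sketch below shows a generic receiver-controlled rule of the kind described, combining a signal-strength threshold with a hysteresis margin that trades the number of handoffs against link degradation events. All thresholds are arbitrary example values.

```python
def handoff_decision(rss_serving_dbm, rss_candidate_dbm, on_satellite,
                     degrade_threshold=-100.0, hysteresis_db=3.0):
    """Generic receiver-controlled handoff rule (illustrative, not the
    paper's locally optimal decision function).

    Switch links only if the serving link is near degradation AND the
    candidate link is better by at least a hysteresis margin; the margin
    trades the number of handoffs against link degradation events."""
    serving_weak = rss_serving_dbm < degrade_threshold
    candidate_better = rss_candidate_dbm > rss_serving_dbm + hysteresis_db
    if serving_weak and candidate_better:
        return not on_satellite  # hand over to the other segment
    return on_satellite          # stay on the current link

# Mobile moving out of terrestrial coverage: ground RSS fades, satellite steady
on_sat = False
satellite_rss = -95.0
for ground_rss in [-92, -96, -100, -104, -108]:
    serving = satellite_rss if on_sat else ground_rss
    candidate = ground_rss if on_sat else satellite_rss
    on_sat = handoff_decision(serving, candidate, on_sat)
    print(ground_rss, "->", "satellite" if on_sat else "terrestrial")
```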

  6. Design, Development, and Testing of a UAV Hardware-in-the-Loop Testbed for Aviation and Airspace Prognostics Research

    Science.gov (United States)

    Kulkarni, Chetan; Teubert, Chris; Gorospe, George; Burgett, Drew; Quach, Cuong C.; Hogge, Edward

    2016-01-01

    The airspace is becoming more and more complicated, and will continue to do so in the future with the integration of Unmanned Aerial Vehicles (UAVs), autonomy, spacecraft, other forms of aviation technology into the airspace. The new technology and complexity increases the importance and difficulty of safety assurance. Additionally, testing new technologies on complex aviation systems & systems of systems can be very difficult, expensive, and sometimes unsafe in real life scenarios. Prognostic methodology provides an estimate of the health and risks of a component, vehicle, or airspace and knowledge of how that will change over time. That measure is especially useful in safety determination, mission planning, and maintenance scheduling. The developed testbed will be used to validate prediction algorithms for the real-time safety monitoring of the National Airspace System (NAS) and the prediction of unsafe events. The framework injects flight related anomalies related to ground systems, routing, airport congestion, etc. to test and verify algorithms for NAS safety. In our research work, we develop a live, distributed, hardware-in-the-loop testbed for aviation and airspace prognostics along with exploring further research possibilities to verify and validate future algorithms for NAS safety. The testbed integrates virtual aircraft using the X-Plane simulator and X-PlaneConnect toolbox, UAVs using onboard sensors and cellular communications, and hardware in the loop components. In addition, the testbed includes an additional research framework to support and simplify future research activities. It enables safe, accurate, and inexpensive experimentation and research into airspace and vehicle prognosis that would not have been possible otherwise. This paper describes the design, development, and testing of this system. Software reliability, safety and latency are some of the critical design considerations in development of the testbed. Integration of HITL elements in

  7. Graphical interface between the CIRSSE testbed and CimStation software with MCS/CTOS

    Science.gov (United States)

    Hron, Anna B.

    1992-01-01

    This research is concerned with developing a graphical simulation of the testbed at the Center for Intelligent Robotic Systems for Space Exploration (CIRSSE) and the interface which allows for communication between the two. Such an interface is useful in telerobotic operations, and as a functional interaction tool for testbed users. Creating a simulated model of a real-world system generates inevitable calibration discrepancies between the two. This thesis gives a brief overview of the work done to date in the area of workcell representation and communication, describes the development of the CIRSSE interface, and gives a direction for future work in the area of system calibration. The CimStation software used for development of this interface is a highly versatile robotic workcell simulation package which has been programmed for this application with a scale graphical model of the testbed, and supporting interface menu code. A need for this tool has been identified for path previewing, as a window on teleoperation, and for calibration of simulated vs. real-world models. The interface allows information (i.e., joint angles) generated by CimStation to be sent as motion goal positions to the testbed robots. An option of the interface has been established such that joint angle information generated by supporting testbed algorithms (i.e., TG, collision avoidance) can be piped through CimStation as a visual preview of the path.

  8. The DataTAG transatlantic testbed

    CERN Document Server

    Martin, O; Martin-Flatin, J P; Moroni, P; Nae, D; Newman, H; Ravot, S

    2005-01-01

    Wide area network testbeds allow researchers and engineers to test out new equipment, protocols and services in real-life situations, without jeopardizing the stability and reliability of production networks. The Data TransAtlantic Grid (DataTAG) testbed, deployed in 2002 between CERN, Geneva, Switzerland and StarLight, Chicago, IL, USA, is probably the largest testbed built to date. Jointly managed by CERN and Caltech, it is funded by the European Commission, the U.S. Department of Energy and the U.S. National Science Foundation. The main objectives of this testbed are to improve the Grid community's understanding of the networking issues posed by data-intensive Grid applications over transoceanic gigabit networks, design and develop new Grid middleware services, and improve the interoperability of European and U.S. Grid applications in High-Energy and Nuclear Physics. In this paper, we give an overview of this testbed, describe its various topologies over time, and summarize the main lessons learned after...

  9. Real-world experimentation of distributed DSA network algorithms

    DEFF Research Database (Denmark)

    Tonelli, Oscar; Berardinelli, Gilberto; Tavares, Fernando Menezes Leitão

    2013-01-01

    The problem of spectrum scarcity in uncoordinated and/or heterogeneous wireless networks is the key aspect driving the research in the field of flexible management of frequency resources. In particular, distributed dynamic spectrum access (DSA) algorithms enable an efficient sharing of the available spectrum under real-world conditions such as a dynamic propagation environment, human presence impact and terminal mobility. This chapter focuses on the practical aspects related to real-world experimentation with distributed DSA network algorithms over a testbed network. Challenges and solutions are extensively discussed, from the testbed design to the setup of experiments. A practical example of the experimentation process with a DSA algorithm is also provided.

  10. The HSBQ Algorithm with Triple-play Services for Broadband Hybrid Satellite Constellation Communication System

    Directory of Open Access Journals (Sweden)

    Anupon Boriboon

    2016-07-01

    Full Text Available The HSBQ algorithm is one of the active queue management algorithms, which aim to avoid high packet loss rates and keep the stream queue stable. The underlying problem is the calculation of a drop probability that provides both queue length stability and bandwidth fairness. This paper proposes HSBQ, which drops packets before the queues overflow at the gateways, so that the end nodes can respond to the congestion before queue overflow occurs. The algorithm uses the change of the average queue length to adjust the amount by which the mark (or drop) probability is changed. Moreover, it adjusts the queue weight, which is used to estimate the average queue length, based on the rate. The results show that the HSBQ algorithm maintains a stable stream queue better than the group of congestion-metric algorithms without flow information as the rate of the hybrid satellite network changes dramatically, and the presented empirical evidence demonstrates that the HSBQ algorithm offers a better quality of service than the queue control mechanisms traditionally used in hybrid satellite networks.
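
    For context on the drop-probability mechanism described above, here is a minimal RED-style active queue management sketch: an exponentially weighted average of the queue length drives the mark/drop probability, and the averaging weight is itself adapted to the arrival rate. It is a generic illustration, not the HSBQ algorithm itself.

```python
import random

class SimpleAQM:
    """RED-like AQM sketch: the average (EWMA) queue length drives the
    mark/drop probability, and the EWMA weight is adapted to the arrival
    rate, loosely following the ideas described in the record above."""

    def __init__(self, min_th=20, max_th=60, max_p=0.1):
        self.min_th, self.max_th, self.max_p = min_th, max_th, max_p
        self.avg = 0.0
        self.weight = 0.002  # EWMA weight used to estimate the average queue length

    def adapt_weight(self, arrival_rate_pps):
        # Faster arrivals -> heavier weight so the average tracks the queue sooner.
        self.weight = min(0.05, 0.002 + arrival_rate_pps / 1e6)

    def on_packet(self, queue_len):
        self.avg += self.weight * (queue_len - self.avg)
        if self.avg < self.min_th:
            p_drop = 0.0
        elif self.avg > self.max_th:
            p_drop = 1.0
        else:  # drop probability grows with the average queue length
            p_drop = self.max_p * (self.avg - self.min_th) / (self.max_th - self.min_th)
        return random.random() < p_drop  # True -> drop (or mark) the packet

aqm = SimpleAQM()
aqm.adapt_weight(arrival_rate_pps=5000)
queue = 0
for tick in range(300):
    for _ in range(2):                   # two arrivals per tick
        if not aqm.on_packet(queue):
            queue += 1                   # enqueue if not dropped/marked
    queue = max(0, queue - 1)            # one departure per tick
print("average queue estimate:", round(aqm.avg, 2))
```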

  11. Bridging Ground Validation and Algorithms: Using Scattering and Integral Tables to Incorporate Observed DSD Correlations into Satellite Algorithms

    Science.gov (United States)

    Williams, C. R.

    2012-12-01

    The NASA Global Precipitation Mission (GPM) raindrop size distribution (DSD) Working Group is composed of NASA PMM Science Team Members and is charged to "investigate the correlations between DSD parameters using Ground Validation (GV) data sets that support, or guide, the assumptions used in satellite retrieval algorithms." Correlations between DSD parameters can be used to constrain the unknowns and reduce the degrees-of-freedom in under-constrained satellite algorithms. Over the past two years, the GPM DSD Working Group has analyzed GV data and has found correlations between the mass-weighted mean raindrop diameter (Dm) and the mass distribution standard deviation (Sm) that follows a power-law relationship. This Dm-Sm power-law relationship appears to be robust and has been observed in surface disdrometer and vertically pointing radar observations. One benefit of a Dm-Sm power-law relationship is that a three parameter DSD can be modeled with just two parameters: Dm and Nw that determines the DSD amplitude. In order to incorporate observed DSD correlations into satellite algorithms, the GPM DSD Working Group is developing scattering and integral tables that can be used by satellite algorithms. Scattering tables describe the interaction of electromagnetic waves on individual particles to generate cross sections of backscattering, extinction, and scattering. Scattering tables are independent of the distribution of particles. Integral tables combine scattering table outputs with DSD parameters and DSD correlations to generate integrated normalized reflectivity, attenuation, scattering, emission, and asymmetry coefficients. Integral tables contain both frequency dependent scattering properties and cloud microphysics. The GPM DSD Working Group has developed scattering tables for raindrops at both Dual Precipitation Radar (DPR) frequencies and at all GMI radiometer frequencies less than 100 GHz. Scattering tables include Mie and T-matrix scattering with H- and V
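
    To make the two-parameter idea concrete, the sketch below builds a normalized gamma drop size distribution in which the mass-spectrum standard deviation is tied to Dm through a power law, leaving only Nw and Dm free. The power-law coefficients A and B are placeholders, not the Working Group's fitted values.

```python
import numpy as np
from math import gamma

# Hypothetical power law linking mass-spectrum width to mass-weighted mean
# diameter: sigma_m = A * Dm**B  (A, B are illustrative placeholders).
A, B = 0.30, 1.1

def gamma_dsd_two_param(Nw, Dm, D):
    """Normalized gamma DSD N(D) [m^-3 mm^-1] constrained by the sigma_m-Dm
    power law, leaving only Nw and Dm free.  Uses sigma_m = Dm / sqrt(4 + mu)
    (valid for a gamma spectrum) to recover the shape parameter mu."""
    sigma_m = A * Dm**B
    mu = (Dm / sigma_m) ** 2 - 4.0        # shape parameter implied by the power law
    f_mu = (6.0 / 4.0**4) * (4.0 + mu) ** (mu + 4.0) / gamma(mu + 4.0)
    return Nw * f_mu * (D / Dm) ** mu * np.exp(-(4.0 + mu) * D / Dm)

D = np.linspace(0.1, 6.0, 60)             # drop diameters [mm]
print(gamma_dsd_two_param(Nw=8000.0, Dm=1.5, D=D)[:5])
```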

  12. Diagnostic Algorithm Benchmarking

    Science.gov (United States)

    Poll, Scott

    2011-01-01

    A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.

  13. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Bench-scale Testbed Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Melin, Alexander M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kisner, Roger A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Drira, Anis [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Reed, Frederick K. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging due to restrictions on sensors and materials. As a part of the Department of Energy's Nuclear Energy Enabling Technology cross-cutting technology development programs Advanced Sensors and Instrumentation topic, this report details the design of a bench-scale embedded instrumentation and control testbed. The design goal of the bench-scale testbed is to build a re-configurable system that can rapidly deploy and test advanced control algorithms in a hardware in the loop setup. The bench-scale testbed will be designed as a fluid pump analog that uses active magnetic bearings to support the shaft. The testbed represents an application that would improve the efficiency and performance of high temperature (700 C) pumps for liquid salt reactors that operate in an extreme environment and provide many engineering challenges that can be overcome with embedded instrumentation and control. This report will give details of the mechanical design, electromagnetic design, geometry optimization, power electronics design, and initial control system design.

  14. Design of an Image Motion Compensation (IMC) Algorithm for Image Registration of the Communication, Ocean, Meteorological Satellite (COMS-1)

    Directory of Open Access Journals (Sweden)

    Taek Seo Jung

    2006-03-01

    Full Text Available This paper presents an Image Motion Compensation (IMC) algorithm for Korea's Communication, Ocean, and Meteorological Satellite (COMS-1). An IMC algorithm is a priority component of image registration in the Image Navigation and Registration (INR) system used to locate and register radiometric image data. Due to various perturbations, a satellite has orbit and attitude errors with respect to a reference motion. These errors cause depointing of the imager aiming direction and, in consequence, image distortions. To correct the depointing of the imager aiming direction, a compensation algorithm is designed by adapting different equations from those used for the GOES satellites. The capability of the algorithm is compared with that of the existing algorithm applied to the GOES INR system. The algorithm developed in this paper improves pointing accuracy by 40% and efficiently compensates for the depointing of the imager aiming direction.

  15. Laboratory Spacecraft Data Processing and Instrument Autonomy: AOSAT as Testbed

    Science.gov (United States)

    Lightholder, Jack; Asphaug, Erik; Thangavelautham, Jekan

    2015-11-01

    Recent advances in small spacecraft allow for their use as orbiting microgravity laboratories (e.g. Asphaug and Thangavelautham LPSC 2014) that will produce substantial amounts of data. Power, bandwidth and processing constraints impose limitations on the number of operations which can be performed on this data as well as the data volume the spacecraft can downlink. We show that instrument autonomy and machine learning techniques can intelligently conduct data reduction and downlink queueing to meet data storage and downlink limitations. As small spacecraft laboratory capabilities increase, we must find techniques to increase instrument autonomy and spacecraft scientific decision making. The Asteroid Origins Satellite (AOSAT) CubeSat centrifuge will act as a testbed for further proving these techniques. Lightweight algorithms, such as connected components analysis, centroid tracking, K-means clustering, edge detection, convex hull analysis and intelligent cropping routines, can be coupled with traditional packet compression routines to reduce data transfer per image as well as provide a first-order filtering of what data is most relevant to downlink. This intelligent queueing provides timelier downlink of scientifically relevant data while reducing the amount of irrelevant downlinked data. The resulting algorithms allow scientists to throttle the amount of data downlinked based on initial experimental results. The data downlink pipeline, prioritized for scientific relevance based on incorporated scientific objectives, can continue from the spacecraft until the data is no longer fruitful. Coupled with data compression and cropping strategies at the data packet level, bandwidth reductions exceeding 40% can be achieved while still downlinking data deemed to be most relevant in a double-blind study between scientist and algorithm. Applications of this technology allow for the incorporation of instrumentation which produces significant data volumes on small spacecraft
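
    As a concrete illustration of the lightweight onboard reduction steps listed above (connected components, centroid tracking, intelligent cropping), the sketch below labels bright regions in a frame, finds the centroid of the largest region, and crops a window around it before downlink. It assumes scipy is available and is a generic example, not AOSAT flight code.

```python
import numpy as np
from scipy import ndimage

def reduce_frame(frame, threshold=0.5, crop_half=16):
    """Onboard-style data reduction sketch: threshold, label connected
    components, locate the centroid of the largest region, and crop a small
    window around it so only the scientifically relevant patch is queued
    for downlink."""
    mask = frame > threshold
    labels, n = ndimage.label(mask)                 # connected components
    if n == 0:
        return None                                 # nothing worth downlinking
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    biggest = int(np.argmax(sizes)) + 1
    cy, cx = ndimage.center_of_mass(mask, labels, biggest)  # centroid tracking
    r0 = max(0, int(cy) - crop_half)
    c0 = max(0, int(cx) - crop_half)
    return frame[r0:r0 + 2 * crop_half, c0:c0 + 2 * crop_half]

frame = np.zeros((256, 256))
frame[100:110, 60:75] = 1.0                         # synthetic bright feature
patch = reduce_frame(frame)
print(None if patch is None else patch.shape)       # e.g. (32, 32)
```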

  16. A low-cost test-bed for real-time landmark tracking

    Science.gov (United States)

    Csaszar, Ambrus; Hanan, Jay C.; Moreels, Pierre; Assad, Christopher

    2007-04-01

    A low-cost vehicle test-bed system was developed to iteratively test, refine and demonstrate navigation algorithms before attempting to transfer the algorithms to more advanced rover prototypes. The platform used here was a modified radio controlled (RC) car. A microcontroller board and onboard laptop computer allow for either autonomous or remote operation via a computer workstation. The sensors onboard the vehicle represent the types currently used on NASA-JPL rover prototypes. For dead-reckoning navigation, optical wheel encoders, a single-axis gyroscope, and a 2-axis accelerometer were used. An ultrasound ranger is available to calculate distance as a substitute for the stereo vision systems presently used on rovers. The prototype also carries a small laptop computer with a USB camera and wireless transmitter to send real-time video to an off-board computer. A real-time user interface was implemented that combines an automatic image feature selector, tracking parameter controls, streaming video viewer, and user-generated or autonomous driving commands. Using the test-bed, real-time landmark tracking was demonstrated by autonomously driving the vehicle through the JPL Mars yard. The algorithms tracked rocks as waypoints. This generated coordinates for calculating relative motion and visually servoing to science targets. A limitation of the current system is serial computing: each additional landmark is tracked in sequence. However, since each landmark is tracked independently, if transferred to appropriate parallel hardware, adding targets would not significantly diminish system speed.

  17. Impact of Missing Passive Microwave Sensors on Multi-Satellite Precipitation Retrieval Algorithm

    Directory of Open Access Journals (Sweden)

    Bin Yong

    2015-01-01

    Full Text Available The impact of one or two missing passive microwave (PMW) input sensors on the end product of multi-satellite precipitation products is an interesting but obscure issue for both algorithm developers and data users. On 28 January 2013, the Version-7 TRMM Multi-satellite Precipitation Analysis (TMPA) products were reproduced and re-released by National Aeronautics and Space Administration (NASA) Goddard Space Flight Center because the Advanced Microwave Sounding Unit-B (AMSU-B) and the Special Sensor Microwave Imager-Sounder-F16 (SSMIS-F16) input data were unintentionally disregarded in the prior retrieval. Thus, this study investigates the sensitivity of TMPA algorithm results to missing PMW sensors by intercomparing the “early” and “late” Version-7 TMPA real-time (TMPA-RT) precipitation estimates (i.e., without and with AMSU-B, SSMIS-F16 sensors) with an independent high-density gauge network of 200 tipping-bucket rain gauges over the Chinese Jinghe river basin (45,421 km2). The retrieval counts and retrieval frequency of various PMW and Infrared (IR) sensors incorporated into the TMPA system were also analyzed to identify and diagnose the impacts of sensor availability on the TMPA-RT retrieval accuracy. Results show that the incorporation of AMSU-B and SSMIS-F16 has substantially reduced systematic errors. The improvement exhibits rather strong seasonal and topographic dependencies. Our analyses suggest that one or two single PMW sensors might play a key role in affecting the end product of current combined microwave-infrared precipitation estimates. This finding supports algorithm developers’ current endeavor in spatiotemporally incorporating as many PMW sensors as possible in the multi-satellite precipitation retrieval system called Integrated Multi-satellitE Retrievals for Global Precipitation Measurement mission (IMERG). This study also recommends users of satellite precipitation products to switch to the newest Version-7 TMPA datasets and

  18. Definition of technology development missions for early space station satellite servicing, volume 1

    Science.gov (United States)

    1983-01-01

    The testbed role of an early manned space station in the context of a satellite servicing evolutionary development and flight demonstration technology plan which results in a satellite servicing operational capability is defined. A satellite servicing technology development mission (a set of missions) to be performed on an early manned space station is conceptually defined.

  19. SABA: A Testbed for a Real-Time MIMO System

    Directory of Open Access Journals (Sweden)

    Brühl Lars

    2006-01-01

    Full Text Available The growing demand for high data rates in wireless communication systems leads to the development of new technologies to increase the channel capacity and thus the data rate. MIMO (multiple-input multiple-output) systems are best qualified for these applications. In this paper, we present a MIMO test environment for high data rate transmissions in frequency-selective environments. An overview of the testbed is given, including the analyzed algorithms, the digital signal processing with a new highly parallel processor to perform the algorithms in real time, as well as the analog front-ends. A brief overview of the influence of polarization on the channel capacity is given as well.

  20. An orbit determination algorithm for small satellites based on the magnitude of the earth magnetic field

    Science.gov (United States)

    Zagorski, P.; Gallina, A.; Rachucki, J.; Moczala, B.; Zietek, S.; Uhl, T.

    2018-06-01

    Autonomous attitude determination systems based on simple measurements of vector quantities such as the magnetic field and the Sun direction are commonly used in very small satellites. However, those systems always require knowledge of the satellite position. This information can be either propagated from orbital elements periodically uplinked from the ground station or measured onboard by a dedicated global positioning system (GPS) receiver. The former solution sacrifices satellite autonomy, while the latter requires additional sensors which may represent a significant part of the mass, volume, and power budget in the case of pico- or nanosatellites. Hence, a system for onboard satellite position determination without resorting to GPS receivers would be useful. In this paper, a novel algorithm for determining the satellite orbit semimajor axis is presented. The method exploits only the magnitude of the Earth's magnetic field recorded onboard by magnetometers. This represents the first step toward an extended algorithm that can determine all orbital elements of the satellite. The method is validated by numerical analysis and real magnetic field measurements.
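
    The record does not spell out the estimator, but the underlying idea can be illustrated with a centered-dipole field model: the measured field magnitude constrains the geocentric radius, and averaging the inverted radii over an orbit gives a crude semimajor-axis guess. The sketch assumes the magnetic latitude of each sample is known and the orbit is near-circular, which are simplifications for illustration and not necessarily assumptions of the actual algorithm.

```python
import numpy as np

RE_KM = 6371.0        # mean Earth radius
B0_NT = 31000.0       # approximate equatorial surface field of a centered dipole [nT]

def radius_from_field(b_mag_nt, mag_lat_rad):
    """Invert a centered-dipole field model for the geocentric radius:
    |B| = B0 * (Re/r)^3 * sqrt(1 + 3 sin^2(lat_m))."""
    factor = B0_NT * np.sqrt(1.0 + 3.0 * np.sin(mag_lat_rad) ** 2) / b_mag_nt
    return RE_KM * factor ** (1.0 / 3.0)

def estimate_semimajor_axis(b_mags_nt, mag_lats_rad):
    """Crude semimajor-axis estimate: average the dipole-inverted radii over
    one (assumed near-circular) orbit of magnetometer samples."""
    radii = radius_from_field(np.asarray(b_mags_nt), np.asarray(mag_lats_rad))
    return radii.mean()

# Synthetic test: samples generated from the same dipole model at r = 6871 km
lats = np.linspace(-np.pi / 3, np.pi / 3, 50)
b = B0_NT * (RE_KM / 6871.0) ** 3 * np.sqrt(1.0 + 3.0 * np.sin(lats) ** 2)
print(round(estimate_semimajor_axis(b, lats), 1))   # ~6871.0
```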

  1. Trace Gas Measurements from the GeoTASO and GCAS Airborne Instruments: An Instrument and Algorithm Test-Bed for Air Quality Observations from Geostationary Orbit

    Science.gov (United States)

    Nowlan, C. R.; Liu, X.; Janz, S. J.; Leitch, J. W.; Al-Saadi, J. A.; Chance, K.; Cole, J.; Delker, T.; Follette-Cook, M. B.; Gonzalez Abad, G.; Good, W. S.; Kowalewski, M. G.; Loughner, C.; Pickering, K. E.; Ruppert, L.; Soo, D.; Szykman, J.; Valin, L.; Zoogman, P.

    2016-12-01

    The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) and the GEO-CAPE Airborne Simulator (GCAS) instruments are pushbroom sensors capable of making remote sensing measurements of air quality and ocean color. Originally developed as test-bed instruments for the Geostationary Coastal and Air Pollution Events (GEO-CAPE) decadal survey, these instruments are now also part of risk reduction for the upcoming Tropospheric Emissions: Monitoring of Pollution (TEMPO) and Geostationary Environment Monitoring Spectrometer (GEMS) geostationary satellite missions, and will provide validation capabilities after the satellite instruments are in orbit. GeoTASO and GCAS flew on two different aircraft in their first intensive air quality field campaigns during the DISCOVER-AQ missions over Texas in 2013 and Colorado in 2014. GeoTASO was also deployed in 2016 during the KORUS-AQ field campaign to make measurements of trace gases and aerosols over Korea. GeoTASO and GCAS collect spectra of backscattered solar radiation in the UV and visible that can be used to derive 2-D maps of trace gas columns below the aircraft at spatial resolutions on the order of 250 x 500 m. We present spatially resolved maps of trace gas retrievals of ozone, nitrogen dioxide, formaldehyde and sulfur dioxide over urban areas and power plants from flights during the field campaigns, and comparisons with data from ground-based spectrometers, in situ monitoring instruments, and satellites.

  2. Fast Physics Testbed for the FASTER Project

    Energy Technology Data Exchange (ETDEWEB)

    Lin, W.; Liu, Y.; Hogan, R.; Neggers, R.; Jensen, M.; Fridlind, A.; Lin, Y.; Wolf, A.

    2010-03-15

    This poster describes the Fast Physics Testbed for the new FAst-physics System Testbed and Research (FASTER) project. The overall objective is to provide a convenient and comprehensive platform for fast turn-around model evaluation against ARM observations and to facilitate development of parameterizations for cloud-related fast processes represented in global climate models. The testbed features three major components: a single column model (SCM) testbed, an NWP-Testbed, and high-resolution modeling (HRM). The web-based SCM-Testbed features multiple SCMs from major climate modeling centers and aims to maximize the potential of the SCM approach to enhance and accelerate the evaluation and improvement of fast physics parameterizations through continuous evaluation of existing and evolving models against historical as well as new/improved ARM and other complementary measurements. The NWP-Testbed aims to capitalize on the large pool of operational numerical weather prediction products. Continuous evaluations of NWP forecasts against observations at ARM sites are carried out to systematically identify the biases and skills of physical parameterizations under all weather conditions. The high-resolution modeling (HRM) activities aim to simulate the fast processes at high resolution to aid in the understanding of the fast processes and their parameterizations. A four-tier HRM framework is established to augment the SCM- and NWP-Testbeds towards eventual improvement of the parameterizations.

  3. Building a ROS-Based Testbed for Realistic Multi-Robot Simulation: Taking the Exploration as an Example

    Directory of Open Access Journals (Sweden)

    Zhi Yan

    2017-09-01

    Full Text Available While the robotics community agrees that benchmarking is of high importance to objectively compare different solutions, there are only a few, limited tools to support it. To address this issue in the context of multi-robot systems, we have defined a benchmarking process based on experimental designs, which is aimed at improving the reproducibility of experiments by making explicit all elements of a benchmark such as parameters, measurements and metrics. We have also developed a ROS (Robot Operating System)-based testbed with the goal of making it easy for users to validate, benchmark, and compare different algorithms including coordination strategies. Our testbed uses the MORSE (Modular OpenRobots Simulation Engine) simulator for realistic simulation and a computer cluster for decentralized computation. In this paper, we present our testbed in detail, including the architecture and infrastructure, the issues encountered in implementing the infrastructure, and the automation of the deployment. We also report a series of experiments on multi-robot exploration, in order to demonstrate the capabilities of our testbed.

  4. First Attempt of Orbit Determination of SLR Satellites and Space Debris Using Genetic Algorithms

    Science.gov (United States)

    Deleflie, F.; Coulot, D.; Descosta, R.; Fernier, A.; Richard, P.

    2013-08-01

    We present an orbit determination method based on genetic algorithms. Contrary to usual estimation methods, which are mainly based on least squares, these algorithms do not require any a priori knowledge of the initial state vector to be estimated. These algorithms can be applied when a new satellite is launched or for uncatalogued objects that appear in images obtained from robotic telescopes such as the TAROT ones. We show in this paper preliminary results obtained from an SLR satellite, for which tracking data acquired by the ILRS network make it possible to build accurate orbital arcs at the few-centimeter level, which can be used as a reference orbit; in this case, the basic observations are made up of time series of ranges, obtained from various tracking stations. We show as well the results obtained from the observations acquired by the two TAROT telescopes on the Telecom-2D satellite operated by CNES; in that case, the observations are made up of time series of azimuths and elevations, seen from the two TAROT telescopes. The method is carried out in several steps: (i) an analytical propagation of the equations of motion, (ii) an estimation kernel based on genetic algorithms, which follows the usual steps of such approaches: initialization and evolution of a selected population, so as to determine the best parameters. Each parameter to be estimated, namely each initial Keplerian element, has to be searched within an interval that is chosen beforehand. The algorithm is expected to converge towards an optimum within a reasonable computational time.
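
    The estimation kernel itself is not reproduced in the record; below is a minimal genetic-algorithm skeleton of the kind described, evolving a candidate parameter vector (reduced here to semimajor axis and eccentricity) against a stand-in cost function. The bounds, operators, and cost are illustrative placeholders for the analytical propagator and tracking residuals mentioned above.

```python
import random

# Search intervals for each estimated element (illustrative values only).
BOUNDS = {"a_km": (6800.0, 7200.0), "ecc": (0.0, 0.05)}

def cost(candidate, observations):
    """Stand-in for the range-residual cost: here we simply compare against a
    'true' parameter set hidden in the observations."""
    a_true, e_true = observations
    return abs(candidate["a_km"] - a_true) + 1e4 * abs(candidate["ecc"] - e_true)

def random_candidate():
    return {k: random.uniform(*v) for k, v in BOUNDS.items()}

def crossover(p1, p2):
    return {k: random.choice((p1[k], p2[k])) for k in BOUNDS}

def mutate(c, rate=0.2):
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            c[k] = min(hi, max(lo, c[k] + random.gauss(0.0, 0.05 * (hi - lo))))
    return c

def genetic_orbit_fit(observations, pop_size=60, generations=80):
    pop = [random_candidate() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: cost(c, observations))
        parents = pop[: pop_size // 4]                  # selection: keep the best quarter
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=lambda c: cost(c, observations))

print(genetic_orbit_fit(observations=(7000.0, 0.01)))   # converges near a=7000, e=0.01
```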

  5. Development and application of an actively controlled hybrid proton exchange membrane fuel cell - Lithium-ion battery laboratory test-bed based on off-the-shelf components

    Energy Technology Data Exchange (ETDEWEB)

    Yufit, V.; Brandon, N.P. [Dept. Earth Science and Engineering, Imperial College, London SW7 2AZ (United Kingdom)

    2011-01-15

    The use of commercially available components enables rapid prototyping and assembling of laboratory scale hybrid test-bed systems, which can be used to evaluate new hybrid configurations. The development of such a test-bed using an off-the-shelf PEM fuel cell, lithium-ion battery and DC/DC converter is presented here, and its application to a hybrid configuration appropriate for an unmanned underwater vehicle is explored. A control algorithm was implemented to regulate the power share between the fuel cell and the battery with a graphical interface to control, record and analyze the electrochemical and thermal parameters of the system. The results demonstrate the applicability of the test-bed and control algorithm for this application, and provide data on the dynamic electrical and thermal behaviour of the hybrid system. (author)
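
    The abstract does not detail the implemented power-share law; the sketch below shows one common rule such a test-bed could exercise, in which the fuel cell follows a low-pass-filtered, rate-limited load demand while the battery absorbs the transients. Time constants and limits are placeholders, not parameters of the cited system.

```python
def hybrid_power_split(load_w, fc_prev_w, dt_s,
                       tau_s=20.0, fc_max_w=300.0, fc_ramp_w_per_s=10.0):
    """One plausible power-share rule for a fuel cell / battery hybrid:
    the fuel cell tracks a low-pass-filtered, rate-limited load demand and
    the battery supplies (or absorbs) the remainder.  Parameters are
    illustrative, not those of the cited test-bed."""
    # First-order low-pass filter of the load demand
    fc_target = fc_prev_w + (dt_s / tau_s) * (load_w - fc_prev_w)
    # Rate limit and saturation protect the fuel cell stack
    fc_target = min(fc_target, fc_prev_w + fc_ramp_w_per_s * dt_s)
    fc_target = max(fc_target, fc_prev_w - fc_ramp_w_per_s * dt_s)
    fc_w = min(max(fc_target, 0.0), fc_max_w)
    battery_w = load_w - fc_w        # positive -> battery discharges
    return fc_w, battery_w

fc = 0.0
for t, load in enumerate([100, 100, 250, 250, 250, 80, 80]):
    fc, batt = hybrid_power_split(load, fc, dt_s=1.0)
    print(f"t={t}s load={load}W fuel_cell={fc:.1f}W battery={batt:+.1f}W")
```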

  6. A remote integrated testbed for cooperating objects

    CERN Document Server

    Dios, Jose Ramiro Martinez-de; Bernabe, Alberto de San; Ollero, Anibal

    2013-01-01

    Testbeds are gaining increasing relevance in research domains and also in industrial applications. However, very few books devoted to testbeds have been published. To the best of my knowledge no book on this topic has been published. This book is particularly interesting for the growing community of testbed developers. I believe the book is also very interesting for researchers in robot-WSN cooperation.This book provides detailed description of a system that can be considered the first testbed that allows full peer-to-peer interoperability between heterogeneous robots and ubiquitous systems su

  7. A Business-to-Business Interoperability Testbed: An Overview

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL; Ivezic, Nenad [ORNL; Monica, Martin [Sun Microsystems, Inc.; Jones, Albert [National Institute of Standards and Technology (NIST)

    2003-10-01

    In this paper, we describe a business-to-business (B2B) testbed co-sponsored by the Open Applications Group, Inc. (OAGI) and the National Institute of Standard and Technology (NIST) to advance enterprise e-commerce standards. We describe the business and technical objectives and initial activities within the B2B Testbed. We summarize our initial lessons learned to form the requirements that drive the next generation testbed development. We also give an overview of a promising testing framework architecture in which to drive the testbed developments. We outline the future plans for the testbed development.

  8. AN EVOLUTIONARY ALGORITHM FOR FAST INTENSITY BASED IMAGE MATCHING BETWEEN OPTICAL AND SAR SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    P. Fischer

    2018-04-01

    Full Text Available This paper presents a hybrid evolutionary algorithm for fast intensity-based matching between satellite imagery from SAR and very high-resolution (VHR) optical sensor systems. The precise and accurate co-registration of image time series and images of different sensors is a key task in multi-sensor image processing scenarios. The necessary preprocessing step of image matching and tie-point detection is divided into a search problem and a similarity measurement. Within this paper we evaluate the use of an evolutionary search strategy for establishing the spatial correspondence between satellite imagery of optical and radar sensors. The aim of the proposed algorithm is to decrease the computational costs during the search process by formulating the search as an optimization problem. Based upon the canonical evolutionary algorithm, the proposed algorithm is adapted for SAR/optical imagery intensity-based matching. Extensions are drawn using techniques like hybridization (e.g., local search) and others to lower the number of objective function calls and refine the result. The algorithm significantly decreases the computational costs whilst finding the optimal solution in a reliable way.

  9. Algorithm Development and Validation for Satellite-Derived Distributions of DOC and CDOM in the US Middle Atlantic Bight

    Science.gov (United States)

    Mannino, Antonio; Russ, Mary E.; Hooker, Stanford B.

    2007-01-01

    In coastal ocean waters, distributions of dissolved organic carbon (DOC) and chromophoric dissolved organic matter (CDOM) vary seasonally and interannually due to multiple source inputs and removal processes. We conducted several oceanographic cruises within the continental margin of the U.S. Middle Atlantic Bight (MAB) to collect field measurements in order to develop algorithms to retrieve CDOM and DOC from NASA's MODIS-Aqua and SeaWiFS satellite sensors. In order to develop empirical algorithms for CDOM and DOC, we correlated the CDOM absorption coefficient (a(sub cdom)) with in situ radiometry (remote sensing reflectance, Rrs, band ratios) and then correlated DOC to Rrs band ratios through the CDOM to DOC relationships. Our validation analyses demonstrate successful retrieval of DOC and CDOM from coastal ocean waters using the MODIS-Aqua and SeaWiFS satellite sensors with mean absolute percent differences from field measurements of ... for a(sub cdom)(355), ... for a(sub cdom)(443), and 12% for the CDOM spectral slope. To our knowledge, the algorithms presented here represent the first validated algorithms for satellite retrieval of a(sub cdom), DOC, and CDOM spectral slope in the coastal ocean. The satellite-derived DOC and a(sub cdom) products demonstrate the seasonal net ecosystem production of DOC and photooxidation of CDOM from spring to fall. With accurate satellite retrievals of CDOM and DOC, we will be able to apply satellite observations to investigate interannual and decadal-scale variability in surface CDOM and DOC within continental margins and monitor impacts of climate change and anthropogenic activities on coastal ecosystems.

  10. Optical testbed for the LISA phasemeter

    Science.gov (United States)

    Schwarze, T. S.; Fernández Barranco, G.; Penkert, D.; Gerberding, O.; Heinzel, G.; Danzmann, K.

    2016-05-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system aka the phasemeter was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon Aps. It in particular provides the electronic readout of the main interferometer phases besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides in LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup.

  11. Optical testbed for the LISA phasemeter

    International Nuclear Information System (INIS)

    Schwarze, T S; Fernández Barranco, G; Penkert, D; Gerberding, O; Heinzel, G; Danzmann, K

    2016-01-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system aka the phasemeter was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon Aps. It in particular provides the electronic readout of the main interferometer phases besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides in LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup. (paper)

  12. The SUMO Ship Detector Algorithm for Satellite Radar Images

    Directory of Open Access Journals (Sweden)

    Harm Greidanus

    2017-03-01

    Full Text Available Search for Unidentified Maritime Objects (SUMO) is an algorithm for ship detection in satellite Synthetic Aperture Radar (SAR) images. It has been developed over the course of more than 15 years, using a large amount of SAR images from almost all available SAR satellites operating in L-, C- and X-band. As validated by benchmark tests, it performs very well on a wide range of SAR image modes (from Spotlight to ScanSAR) and resolutions (from 1–100 m) and for all types and sizes of ships, within the physical limits imposed by the radar imaging. This paper describes, in detail, the algorithmic approach in all of the steps of the ship detection: land masking, clutter estimation, detection thresholding, target clustering, ship attribute estimation and false alarm suppression. SUMO is a pixel-based CFAR (Constant False Alarm Rate) detector for multi-look radar images. It assumes a K distribution for the sea clutter, corrected however for deviations of the actual sea clutter from this distribution, implementing a fast and robust method for the clutter background estimation. The clustering of detected pixels into targets (ships) uses several thresholds to deal with the typically irregular distribution of the radar backscatter over a ship. In a multi-polarization image, the different channels are fused. Azimuth ambiguities, a common source of false alarms in ship detection, are removed. A reliability indicator is computed for each target. In post-processing, using the results of a series of images, additional false alarms from recurrent (fixed) targets including range ambiguities are also removed. SUMO can run in semi-automatic mode, where an operator can verify each detected target. It can also run in fully automatic mode, where batches of over 10,000 images have successfully been processed in less than two hours. The number of satellite SAR systems keeps increasing, as does their application to maritime surveillance. The open data policy of the EU
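
    To illustrate the pixel-based CFAR step at the core of detectors like the one described, here is a simple cell-averaging CFAR sketch on a 2D intensity image: local clutter statistics are estimated in a training ring around each pixel and the detection threshold scales with them. SUMO's actual K-distribution clutter model and its corrections are not reproduced here.

```python
import numpy as np

def ca_cfar_2d(img, guard=2, train=6, threshold_factor=12.0):
    """Cell-averaging CFAR sketch: for each pixel, estimate the clutter mean
    from a training ring (excluding a guard area around the cell under test)
    and declare a detection when the pixel exceeds factor * local mean.
    This is a generic illustration, not SUMO's K-distribution detector."""
    half = guard + train
    padded = np.pad(img, half, mode="reflect")
    detections = np.zeros(img.shape, dtype=bool)
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            window = padded[r:r + 2 * half + 1, c:c + 2 * half + 1].copy()
            # Exclude the guard region (and the cell under test) from training
            window[train:train + 2 * guard + 1, train:train + 2 * guard + 1] = np.nan
            clutter_mean = np.nanmean(window)
            detections[r, c] = img[r, c] > threshold_factor * clutter_mean
    return detections

sea = np.random.exponential(1.0, size=(64, 64))   # synthetic speckle-like clutter
sea[30, 40] = 60.0                                # bright point target ("ship")
print(np.argwhere(ca_cfar_2d(sea)))
```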

  13. A Novel Strategy Using Factor Graphs and the Sum-Product Algorithm for Satellite Broadcast Scheduling Problems

    Science.gov (United States)

    Chen, Jung-Chieh

    This paper presents a low-complexity algorithmic framework for finding a broadcasting schedule in a low-altitude satellite system, i.e., the satellite broadcast scheduling (SBS) problem, based on the recent modeling and computational methodology of factor graphs. Inspired by the huge success of low-density parity-check (LDPC) codes in the field of error control coding, in this paper we transform the SBS problem into an LDPC-like problem through a factor graph instead of using the conventional neural network approaches to solve the SBS problem. Based on the factor graph framework, the soft information, describing the probability that each satellite will broadcast information to a terminal at a specific time slot, is exchanged among the local processing nodes in the proposed framework via the sum-product algorithm to iteratively optimize the satellite broadcasting schedule. Numerical results show that the proposed approach not only obtains the optimal solution but also enjoys low complexity suitable for integrated-circuit implementation.

  14. Performance measurement, modeling, and evaluation of integrated concurrency control and recovery algorithms in distributed data base systems

    Energy Technology Data Exchange (ETDEWEB)

    Jenq, B.C.

    1986-01-01

    The performance evaluation of integrated concurrency-control and recovery mechanisms for distributed data base systems is studied using a distributed testbed system. In addition, a queueing network model was developed to analyze the two phase locking scheme in the distributed testbed system. The combination of testbed measurement and analytical modeling provides an effective tool for understanding the performance of integrated concurrency control and recovery algorithms in distributed database systems. The design and implementation of the distributed testbed system, CARAT, are presented. The concurrency control and recovery algorithms implemented in CARAT include: a two phase locking scheme with distributed deadlock detection, a distributed version of optimistic approach, before-image and after-image journaling mechanisms for transaction recovery, and a two-phase commit protocol. Many performance measurements were conducted using a variety of workloads. A queueing network model is developed to analyze the performance of the CARAT system using the two-phase locking scheme with before-image journaling. The combination of testbed measurements and analytical modeling provides significant improvements in understanding the performance impacts of the concurrency control and recovery algorithms in distributed database systems.

  15. SHADOW DETECTION FROM VERY HIGH RESOLUTION SATELLITE IMAGE USING GRABCUT SEGMENTATION AND RATIO-BAND ALGORITHMS

    Directory of Open Access Journals (Sweden)

    N. M. S. M. Kadhim

    2015-03-01

    Full Text Available Very-High-Resolution (VHR) satellite imagery is a powerful source of data for detecting and extracting information about urban constructions. Shadow in the VHR satellite imageries provides vital information on urban construction forms, illumination direction, and the spatial distribution of the objects that can help to further understanding of the built environment. However, to extract shadows, the automated detection of shadows from images must be accurate. This paper reviews current automatic approaches that have been used for shadow detection from VHR satellite images and comprises two main parts. In the first part, shadow concepts are presented in terms of shadow appearance in the VHR satellite imageries, current shadow detection methods, and the usefulness of shadow detection in urban environments. In the second part, we adopted two approaches which are considered current state-of-the-art shadow detection, and segmentation algorithms using WorldView-3 and Quickbird images. In the first approach, the ratios between the NIR and visible bands were computed on a pixel-by-pixel basis, which allows for disambiguation between shadows and dark objects. To obtain an accurate shadow candidate map, we further refine the shadow map after applying the ratio algorithm on the Quickbird image. The second selected approach is the GrabCut segmentation approach for examining its performance in detecting the shadow regions of urban objects using the true colour image from WorldView-3. Further refinement was applied to attain a segmented shadow map. Although the detection of shadow regions is a very difficult task when they are derived from a VHR satellite image that comprises a visible spectrum range (RGB true colour), the results demonstrate that the detection of shadow regions in the WorldView-3 image is a reasonable separation from other objects by applying the GrabCut algorithm. In addition, the derived shadow map from the Quickbird image indicates

  16. Shadow Detection from Very High Resolution Satellite Image Using Grabcut Segmentation and Ratio-Band Algorithms

    Science.gov (United States)

    Kadhim, N. M. S. M.; Mourshed, M.; Bray, M. T.

    2015-03-01

    Very-High-Resolution (VHR) satellite imagery is a powerful source of data for detecting and extracting information about urban constructions. Shadow in the VHR satellite imageries provides vital information on urban construction forms, illumination direction, and the spatial distribution of the objects that can help to further understanding of the built environment. However, to extract shadows, the automated detection of shadows from images must be accurate. This paper reviews current automatic approaches that have been used for shadow detection from VHR satellite images and comprises two main parts. In the first part, shadow concepts are presented in terms of shadow appearance in the VHR satellite imageries, current shadow detection methods, and the usefulness of shadow detection in urban environments. In the second part, we adopted two approaches which are considered current state-of-the-art shadow detection, and segmentation algorithms using WorldView-3 and Quickbird images. In the first approach, the ratios between the NIR and visible bands were computed on a pixel-by-pixel basis, which allows for disambiguation between shadows and dark objects. To obtain an accurate shadow candidate map, we further refine the shadow map after applying the ratio algorithm on the Quickbird image. The second selected approach is the GrabCut segmentation approach for examining its performance in detecting the shadow regions of urban objects using the true colour image from WorldView-3. Further refinement was applied to attain a segmented shadow map. Although the detection of shadow regions is a very difficult task when they are derived from a VHR satellite image that comprises a visible spectrum range (RGB true colour), the results demonstrate that the detection of shadow regions in the WorldView-3 image is a reasonable separation from other objects by applying the GrabCut algorithm. In addition, the derived shadow map from the Quickbird image indicates significant performance of
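
    A minimal sketch of the ratio-band idea described in these two records: shadow candidates are taken as pixels that are dark in the visible bands and whose NIR-to-visible ratio is low, since dark vegetation tends to remain relatively bright in the NIR. The thresholds and band combination are illustrative choices, not the paper's tuned values.

```python
import numpy as np

def shadow_candidates(nir, red, green, blue,
                      dark_threshold=0.15, ratio_threshold=1.2):
    """Ratio-band shadow-candidate sketch: flag pixels that are dark in the
    visible bands AND have a low NIR-to-visible ratio.  Dark vegetation tends
    to stay relatively bright in NIR, so the ratio helps separate it from
    true shadows.  Thresholds are illustrative only."""
    visible = (red + green + blue) / 3.0
    ratio = (nir + 1e-6) / (visible + 1e-6)         # avoid division by zero
    return (visible < dark_threshold) & (ratio < ratio_threshold)

# Synthetic 2x2 example: [shadow, vegetation], [bright roof, shadowed road]
nir   = np.array([[0.05, 0.60], [0.55, 0.03]])
red   = np.array([[0.04, 0.08], [0.50, 0.04]])
green = np.array([[0.04, 0.12], [0.50, 0.04]])
blue  = np.array([[0.05, 0.06], [0.50, 0.05]])
print(shadow_candidates(nir, red, green, blue))
```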

  17. SU-G-JeP1-07: Development of a Programmable Motion Testbed for the Validation of Ultrasound Tracking Algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Shepard, A; Matrosic, C; Zagzebski, J; Bednarz, B [University of Wisconsin, Madison, WI (United States)

    2016-06-15

    Purpose: To develop an advanced testbed that combines a 3D motion stage and ultrasound phantom to optimize and validate 2D and 3D tracking algorithms for real-time motion management during radiation therapy. Methods: A Siemens S2000 Ultrasound scanner utilizing a 9L4 transducer was coupled with the Washington University 4D Phantom to simulate patient motion. The transducer was securely fastened to the 3D stage and positioned to image three cylinders of varying contrast in a Gammex 404GS LE phantom. The transducer was placed within a water bath above the phantom in order to maintain sufficient coupling for the entire range of simulated motion. A programmed motion sequence was used to move the transducer during image acquisition and a cine video was acquired for one minute to allow for long sequence tracking. Images were analyzed using a normalized cross-correlation block matching tracking algorithm and compared to the known motion of the transducer relative to the phantom. Results: The setup produced stable ultrasound motion traces consistent with those programmed into the 3D motion stage. The acquired ultrasound images showed minimal artifacts and an image quality that was more than suitable for tracking algorithm verification. Comparisons of a block matching tracking algorithm with the known motion trace for the three features resulted in an average tracking error of 0.59 mm. Conclusion: The high accuracy and programmability of the 4D phantom allows for the acquisition of ultrasound motion sequences that are highly customizable; allowing for focused analysis of some common pitfalls of tracking algorithms such as partial feature occlusion or feature disappearance, among others. The design can easily be modified to adapt to any probe such that the process can be extended to 3D acquisition. Further development of an anatomy specific phantom better resembling true anatomical landmarks could lead to an even more robust validation. This work is partially funded by NIH
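
    For reference, the normalized cross-correlation block-matching step named in the analysis above can be sketched as follows: a template around the feature in the previous frame is searched within a window of the new frame, and the offset with the highest correlation gives the displacement. This is a generic implementation, not the validated tracking code used in the study.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def track_block(prev_frame, next_frame, center, block=15, search=10):
    """Block-matching tracker sketch: exhaustively search a window around the
    previous feature position for the block with maximum NCC."""
    half = block // 2
    r, c = center
    template = prev_frame[r - half:r + half + 1, c - half:c + half + 1]
    best, best_pos = -2.0, center
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            patch = next_frame[rr - half:rr + half + 1, cc - half:cc + half + 1]
            if patch.shape != template.shape:
                continue                      # skip positions falling off the image
            score = ncc(template, patch)
            if score > best:
                best, best_pos = score, (rr, cc)
    return best_pos, best

# Synthetic test: a bright blob shifted by (+3, -2) pixels between frames
prev = np.zeros((80, 80)); prev[40:45, 40:45] = 1.0
nxt = np.zeros((80, 80));  nxt[43:48, 38:43] = 1.0
print(track_block(prev, nxt, center=(42, 42)))
```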

  18. SU-G-JeP1-07: Development of a Programmable Motion Testbed for the Validation of Ultrasound Tracking Algorithms

    International Nuclear Information System (INIS)

    Shepard, A; Matrosic, C; Zagzebski, J; Bednarz, B

    2016-01-01

    Purpose: To develop an advanced testbed that combines a 3D motion stage and ultrasound phantom to optimize and validate 2D and 3D tracking algorithms for real-time motion management during radiation therapy. Methods: A Siemens S2000 Ultrasound scanner utilizing a 9L4 transducer was coupled with the Washington University 4D Phantom to simulate patient motion. The transducer was securely fastened to the 3D stage and positioned to image three cylinders of varying contrast in a Gammex 404GS LE phantom. The transducer was placed within a water bath above the phantom in order to maintain sufficient coupling for the entire range of simulated motion. A programmed motion sequence was used to move the transducer during image acquisition and a cine video was acquired for one minute to allow for long sequence tracking. Images were analyzed using a normalized cross-correlation block matching tracking algorithm and compared to the known motion of the transducer relative to the phantom. Results: The setup produced stable ultrasound motion traces consistent with those programmed into the 3D motion stage. The acquired ultrasound images showed minimal artifacts and an image quality that was more than suitable for tracking algorithm verification. Comparisons of a block matching tracking algorithm with the known motion trace for the three features resulted in an average tracking error of 0.59 mm. Conclusion: The high accuracy and programmability of the 4D phantom allows for the acquisition of ultrasound motion sequences that are highly customizable; allowing for focused analysis of some common pitfalls of tracking algorithms such as partial feature occlusion or feature disappearance, among others. The design can easily be modified to adapt to any probe such that the process can be extended to 3D acquisition. Further development of an anatomy specific phantom better resembling true anatomical landmarks could lead to an even more robust validation. This work is partially funded by NIH

  19. Evaluation of Multiple Kernel Learning Algorithms for Crop Mapping Using Satellite Image Time-Series Data

    Science.gov (United States)

    Niazmardi, S.; Safari, A.; Homayouni, S.

    2017-09-01

    Crop mapping through classification of Satellite Image Time-Series (SITS) data can provide very valuable information for several agricultural applications, such as crop monitoring, yield estimation, and crop inventory. However, the SITS data classification is not straightforward. Because different images of a SITS data have different levels of information regarding the classification problems. Moreover, the SITS data is a four-dimensional data that cannot be classified using the conventional classification algorithms. To address these issues in this paper, we presented a classification strategy based on Multiple Kernel Learning (MKL) algorithms for SITS data classification. In this strategy, initially different kernels are constructed from different images of the SITS data and then they are combined into a composite kernel using the MKL algorithms. The composite kernel, once constructed, can be used for the classification of the data using the kernel-based classification algorithms. We compared the computational time and the classification performances of the proposed classification strategy using different MKL algorithms for the purpose of crop mapping. The considered MKL algorithms are: MKL-Sum, SimpleMKL, LPMKL and Group-Lasso MKL algorithms. The experimental tests of the proposed strategy on two SITS data sets, acquired by SPOT satellite sensors, showed that this strategy was able to provide better performances when compared to the standard classification algorithm. The results also showed that the optimization method of the used MKL algorithms affects both the computational time and classification accuracy of this strategy.

  20. Research on Scheduling Algorithm for Multi-satellite and Point Target Task on Swinging Mode

    Science.gov (United States)

    Wang, M.; Dai, G.; Peng, L.; Song, Z.; Chen, G.

    2012-12-01

    and negative swinging angle and the computation of the time window are analyzed and discussed, and several strategies to improve the efficiency of this model are put forward. In order to solve the model, we introduce the concept of an activity sequence map. By using the activity sequence map, the choice of activity and the start time of the activity can be separated. We also introduce three neighborhood operators to search the solution space. The front movement remaining time and the back movement remaining time are used to analyze the feasibility of generating solutions from the neighborhood operators. Lastly, an algorithm to solve the problem and model is put forward based on a genetic algorithm; population initialization, crossover operator, mutation operator, individual evaluation, collision decrease operator, selection operator, and collision elimination operator are designed in the paper. Finally, the scheduling result and the simulation for a practical example with 5 satellites and 100 point targets in swinging mode are given, and the scheduling performance is analyzed for swinging angles of 0, 5, 10, 15, and 25. The results show that the model and the algorithm are more effective than those without swinging mode.

  1. Next-Generation Satellite Precipitation Products for Understanding Global and Regional Water Variability

    Science.gov (United States)

    Hou, Arthur Y.

    2011-01-01

    A major challenge in understanding the space-time variability of continental water fluxes is the lack of accurate precipitation estimates over complex terrains. While satellite precipitation observations can be used to complement ground-based data to obtain improved estimates, space-based and ground-based estimates come with their own sets of uncertainties, which must be understood and characterized. Quantitative estimation of uncertainties in these products also provides a necessary foundation for merging satellite and ground-based precipitation measurements within a rigorous statistical framework. Global Precipitation Measurement (GPM) is an international satellite mission that will provide next-generation global precipitation data products for research and applications. It consists of a constellation of microwave sensors provided by NASA, JAXA, CNES, ISRO, EUMETSAT, DOD, NOAA, NPP, and JPSS. At the heart of the mission is the GPM Core Observatory provided by NASA and JAXA to be launched in 2013. The GPM Core, which will carry the first space-borne dual-frequency radar and a state-of-the-art multi-frequency radiometer, is designed to set new reference standards for precipitation measurements from space, which can then be used to unify and refine precipitation retrievals from all constellation sensors. The next-generation constellation-based satellite precipitation estimates will be characterized by intercalibrated radiometric measurements and physical-based retrievals using a common observation-derived hydrometeor database. For pre-launch algorithm development and post-launch product evaluation, NASA supports an extensive ground validation (GV) program in cooperation with domestic and international partners to improve (1) physics of remote-sensing algorithms through a series of focused field campaigns, (2) characterization of uncertainties in satellite and ground-based precipitation products over selected GV testbeds, and (3) modeling of atmospheric processes and

  2. 77 FR 18793 - Spectrum Sharing Innovation Test-Bed Pilot Program

    Science.gov (United States)

    2012-03-28

    .... 120322212-2212-01] Spectrum Sharing Innovation Test-Bed Pilot Program AGENCY: National Telecommunications... Innovation Test-Bed pilot program to assess whether devices employing Dynamic Spectrum Access techniques can... Spectrum Sharing Innovation Test-Bed (Test-Bed) pilot program to examine the feasibility of increased...

  3. James Webb Space Telescope Optical Simulation Testbed: Segmented Mirror Phase Retrieval Testing

    Science.gov (United States)

    Laginja, Iva; Egron, Sylvain; Brady, Greg; Soummer, Remi; Lajoie, Charles-Philippe; Bonnefois, Aurélie; Long, Joseph; Michau, Vincent; Choquet, Elodie; Ferrari, Marc; Leboulleux, Lucie; Mazoyer, Johan; N’Diaye, Mamadou; Perrin, Marshall; Petrone, Peter; Pueyo, Laurent; Sivaramakrishnan, Anand

    2018-01-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a hardware simulator designed to produce JWST-like images. A model of the JWST three-mirror anastigmat is realized with three lenses in the form of a Cooke triplet, which provides JWST-like optical quality over a field equivalent to a NIRCam module, and an Iris AO segmented mirror with hexagonal elements stands in for the JWST segmented primary. This setup successfully produces images extremely similar to NIRCam images from cryotesting in terms of PSF morphology and sampling relative to the diffraction limit. The testbed is used for staff training of the wavefront sensing and control (WFS&C) team and for independent analysis of WFS&C scenarios for the JWST. Algorithms such as geometric phase retrieval (GPR) that may be used in flight, as well as potential upgrades to JWST WFS&C, will be explored. We report on the current status of the testbed after alignment, implementation of the segmented mirror, and testing of phase retrieval techniques. This optical bench complements other work at the Makidon laboratory at the Space Telescope Science Institute, including the investigation of coronagraphy for segmented aperture telescopes. Beyond JWST, we intend to use JOST for WFS&C studies for future large segmented space telescopes such as LUVOIR.

  4. TRMM Satellite Algorithm Estimates to Represent the Spatial Distribution of Rainstorms

    Directory of Open Access Journals (Sweden)

    Patrick Marina

    2017-01-01

    On-site measurements from rain gauges provide important information for the design, construction, and operation of water resources engineering projects, groundwater potential assessments, and water supply and irrigation systems. A dense gauging network is needed to accurately characterize the variation of rainfall over a region, which is impractical where networks are limited, such as in Sarawak, Malaysia. Hence, satellite-based algorithm estimates are introduced as an innovative solution to these challenges. With datasets retrievable from public-domain websites, they have become a useful source for measuring rainfall over a wider coverage area at finer temporal resolution. This paper investigates the rainfall estimates produced by the Tropical Rainfall Measuring Mission (TRMM) to determine whether they are suitable to represent the distribution of extreme rainfall in the Sungai Sarawak Basin. Based on the findings, more uniform correlations for the investigated storms are observed at low to medium altitude (>40 MASL). For the investigated events of Jan 05-11, 2009, the normalized root mean square error was NRMSE = 36.7 % with good correlation (CC = 0.9). These findings suggest that satellite algorithm estimates from TRMM are suitable to represent the spatial distribution of extreme rainfall.
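
    The two scores quoted above can be reproduced with a few lines of Python; in the sketch below (NumPy only; the rainfall values are made-up placeholders, and normalizing the RMSE by the gauge range is one common convention assumed here) a satellite estimate is compared against a gauge series.

      import numpy as np

      def nrmse(satellite, gauge):
          # Root mean square error, normalized by the range of the gauge observations.
          rmse = np.sqrt(np.mean((satellite - gauge) ** 2))
          return 100.0 * rmse / (gauge.max() - gauge.min())

      def correlation(satellite, gauge):
          return np.corrcoef(satellite, gauge)[0, 1]

      # Hypothetical daily rainfall totals (mm) for one storm event.
      gauge = np.array([12.0, 30.5, 55.2, 41.0, 18.3, 7.6, 2.1])
      trmm = np.array([10.4, 34.1, 49.8, 45.6, 15.9, 9.2, 3.0])
      print(f"NRMSE = {nrmse(trmm, gauge):.1f} %, CC = {correlation(trmm, gauge):.2f}")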

  5. Advancements of in-flight mass moment of inertia and structural deflection algorithms for satellite attitude simulators

    Science.gov (United States)

    Wright, Jonathan W.

    Experimental satellite attitude simulators have long been used to test and analyze control algorithms in order to drive down risk before implementation on an operational satellite. Ideally, the dynamic response of a terrestrial-based experimental satellite attitude simulator would be similar to that of an on-orbit satellite. Unfortunately, gravitational disturbance torques and poorly characterized moments of inertia introduce uncertainty into the system dynamics, leading to questionable attitude control algorithm experimental results. This research consists of three distinct but related contributions to the field of developing robust satellite attitude simulators. In the first part of this research, existing approaches to estimate mass moments and products of inertia are evaluated, followed by the proposal and evaluation of a new approach that increases both the accuracy and precision of these estimates using typical on-board satellite sensors. Next, in order to better simulate the micro-torque environment of space, a new approach to mass balancing a satellite attitude simulator is presented, experimentally evaluated, and verified. Finally, in the third area of research, we capitalize on the platform improvements to analyze a control moment gyroscope (CMG) singularity avoidance steering law. Several successful experiments were conducted with the CMG array at near-singular configurations. An evaluation process was implemented to verify that the platform remained near the desired test momentum, showing that the first two components of this research were effective in allowing us to conduct singularity avoidance experiments in a representative space-like test environment.

  6. Algorithms and programs for processing of satellite data on ozone layer and UV radiation levels

    International Nuclear Information System (INIS)

    Borkovskij, N.B.; Ivanyukovich, V.A.

    2012-01-01

    Some algorithms and programs for automatic retrieval and processing of ozone layer satellite data are discussed. These techniques are used for reliable short-term forecasting of UV radiation levels. (authors)

  7. An Approach for Smart Antenna Testbed

    Science.gov (United States)

    Kawitkar, R. S.; Wakde, D. G.

    2003-07-01

    The use of wireless, mobile, personal communications services is expanding rapidly. Adaptive or "smart" antenna arrays can increase channel capacity through spatial division. Adaptive antennas can also track mobile users, improving both signal range and quality. For these reasons, smart antenna systems have attracted widespread interest in the telecommunications industry for applications to third-generation wireless systems. This paper aims to design and develop an advanced antenna testbed to serve as a common reference for testing adaptive antenna arrays and signal combining algorithms, as well as complete systems. A flexible suite of off-line processing software is to be written in MATLAB to perform system calibration, testbed initialization, data acquisition control, data storage/transfer, off-line signal processing and analysis, and graph plotting. The goal of this paper is to develop low-complexity smart antenna structures for 3G systems. The emphasis is placed on ease of implementation in a multichannel/multi-user environment. A smart antenna testbed will be developed, and various state-of-the-art DSP structures and algorithms will be investigated. Facing the soaring demand for mobile communications, the use of smart antenna arrays in mobile communications systems to exploit spatial diversity and further improve spectral efficiency has recently received considerable attention. Basically, a smart antenna array comprises a number of antenna elements combined via a beamforming network (an amplitude and phase control network). Some of the benefits that can be achieved by using a Smart Antenna System (SAS) include lower mobile terminal power consumption, range extension, ISI reduction, higher data rate support, and ease of integration into the existing base station system. In terms of economic benefits, adaptive antenna systems employed at the base station, though they increase the per-base-station cost, can increase the coverage area of each cell site, thereby reducing

  8. Development of a Remotely Operated Vehicle Test-bed

    Directory of Open Access Journals (Sweden)

    Biao WANG

    2013-06-01

    This paper presents the development of a remotely operated vehicle (ROV) designed to serve as a convenient, cost-effective platform for research and experimental validation of hardware, sensors, and control algorithms. Both the mechanical and the control system designs are introduced. The vehicle, 0.65 m long and 0.45 m wide, has been designed with a frame structure that allows modification of mounted devices and thruster allocation. For the control system, STM32-based MCU boards specially designed for this project are used as core processing boards, and open-source, modular, flexible software has been developed. Experimental results demonstrate the effectiveness of the test-bed.

  9. Mapping Global Ocean Surface Albedo from Satellite Observations: Models, Algorithms, and Datasets

    Science.gov (United States)

    Li, X.; Fan, X.; Yan, H.; Li, A.; Wang, M.; Qu, Y.

    2018-04-01

    Ocean surface albedo (OSA) is one of the important parameters in the surface radiation budget (SRB). It is usually considered a controlling factor of the heat exchange between the atmosphere and the ocean. The temporal and spatial dynamics of OSA determine the energy absorption of upper-level ocean water and influence oceanic currents, atmospheric circulation, and the transport of material and energy in the hydrosphere. Therefore, various parameterizations and models have been developed for describing the dynamics of OSA. However, it has been demonstrated that the currently available OSA datasets cannot fulfill the requirements of global climate change studies. In this study, we present a literature review on mapping global OSA from satellite observations. The models (parameterizations, the coupled ocean-atmosphere radiative transfer (COART) model, and the three-component ocean water albedo (TCOWA) model), algorithms (the estimation method based on reanalysis data, and the direct-estimation algorithm), and datasets (the cloud, albedo and radiation (CLARA) surface albedo product, the dataset derived with the TCOWA model, and the global land surface satellite (GLASS) phase-2 surface broadband albedo product) of OSA are discussed separately.

  10. Virtual Factory Testbed

    Data.gov (United States)

    Federal Laboratory Consortium — The Virtual Factory Testbed (VFT) is comprised of three physical facilities linked by a standalone network (VFNet). The three facilities are the Smart and Wireless...

  11. Comparison of four machine learning algorithms for their applicability in satellite-based optical rainfall retrievals

    Science.gov (United States)

    Meyer, Hanna; Kühnlein, Meike; Appelhans, Tim; Nauss, Thomas

    2016-03-01

    Machine learning (ML) algorithms have successfully been demonstrated to be valuable tools in satellite-based rainfall retrievals which show the practicability of using ML algorithms when faced with high dimensional and complex data. Moreover, recent developments in parallel computing with ML present new possibilities for training and prediction speed and therefore make their usage in real-time systems feasible. This study compares four ML algorithms - random forests (RF), neural networks (NNET), averaged neural networks (AVNNET) and support vector machines (SVM) - for rainfall area detection and rainfall rate assignment using MSG SEVIRI data over Germany. Satellite-based proxies for cloud top height, cloud top temperature, cloud phase and cloud water path serve as predictor variables. The results indicate an overestimation of rainfall area delineation regardless of the ML algorithm (averaged bias = 1.8) but a high probability of detection ranging from 81% (SVM) to 85% (NNET). On a 24-hour basis, the performance of the rainfall rate assignment yielded R2 values between 0.39 (SVM) and 0.44 (AVNNET). Though the differences in the algorithms' performance were rather small, NNET and AVNNET were identified as the most suitable algorithms. On average, they demonstrated the best performance in rainfall area delineation as well as in rainfall rate assignment. NNET's computational speed is an additional advantage in work with large datasets such as in remote sensing based rainfall retrievals. However, since no single algorithm performed considerably better than the others we conclude that further research in providing suitable predictors for rainfall is of greater necessity than an optimization through the choice of the ML algorithm.
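
    As a rough illustration of the workflow compared in this study, the sketch below (Python with scikit-learn; the synthetic predictors merely stand in for the SEVIRI-derived cloud properties, and the rain/no-rain labels are fabricated for the example) trains one of the four algorithms, a random forest, for rainfall area detection and reports the probability of detection.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import recall_score

      # Synthetic stand-ins for the predictors: cloud top height, cloud top
      # temperature, cloud phase and cloud water path.
      rng = np.random.default_rng(1)
      X = rng.normal(size=(2000, 4))
      y = (X[:, 0] - X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=2000)) > 0.8

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
      rf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
      pod = recall_score(y_te, rf.predict(X_te))  # probability of detection for the "rain" class
      print(f"POD = {pod:.2f}")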

  12. Clustering of tethered satellite system simulation data by an adaptive neuro-fuzzy algorithm

    Science.gov (United States)

    Mitra, Sunanda; Pemmaraju, Surya

    1992-01-01

    Recent developments in neuro-fuzzy systems indicate that the concepts of adaptive pattern recognition, when used to identify appropriate control actions corresponding to clusters of patterns representing system states in dynamic nonlinear control systems, may result in innovative designs. A modular, unsupervised neural network architecture, in which fuzzy learning rules have been embedded, is used for on-line identification of similar states. The architecture and control rules involved in Adaptive Fuzzy Leader Clustering (AFLC) allow this system to be incorporated in control systems for identification of system states corresponding to specific control actions. We have used this algorithm to cluster the simulation data of the Tethered Satellite System (TSS) to estimate the range of delta voltages necessary to maintain the desired length rate of the tether. The AFLC algorithm is capable of on-line estimation of the appropriate control voltages from the corresponding length error and length rate error without a priori knowledge of their membership functions or familiarity with the behavior of the Tethered Satellite System.

  13. The end-to-end testbed of the optical metrology system on-board LISA Pathfinder

    Energy Technology Data Exchange (ETDEWEB)

    Steier, F; Cervantes, F Guzman; Marin, A F GarcIa; Heinzel, G; Danzmann, K [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut) and Universitaet Hannover (Germany); Gerardi, D, E-mail: frank.steier@aei.mpg.d [EADS Astrium Satellites GmbH, Friedrichshafen (Germany)

    2009-05-07

    LISA Pathfinder is a technology demonstration mission for the Laser Interferometer Space Antenna (LISA). The main experiment on-board LISA Pathfinder is the so-called LISA Technology Package (LTP) which has the aim to measure the differential acceleration between two free-falling test masses with an accuracy of 3 × 10^-14 m s^-2 Hz^-1/2 between 1 mHz and 30 mHz. This measurement is performed interferometrically by the optical metrology system (OMS) on-board LISA Pathfinder. In this paper, we present the development of an experimental end-to-end testbed of the entire OMS. It includes the interferometer and its sub-units, the interferometer backend which is a phasemeter and the processing of the phasemeter output data. Furthermore, three-axes piezo-actuated mirrors are used instead of the free-falling test masses for the characterization of the dynamic behaviour of the system and some parts of the drag-free and attitude control system (DFACS) which controls the test masses and the satellite. The end-to-end testbed includes all parts of the LTP that can reasonably be tested on earth without free-falling test masses. At its present status it consists mainly of breadboard components. Some of those have already been replaced by engineering models of the LTP experiment. In the next steps, further engineering and flight models will also be inserted in this testbed and tested against well-characterized breadboard components. The presented testbed is an important reference for the unit tests and can also be used for validation of the on-board experiment during the mission.

  14. Schedule Optimization of Imaging Missions for Multiple Satellites and Ground Stations Using Genetic Algorithm

    Science.gov (United States)

    Lee, Junghyun; Kim, Heewon; Chung, Hyun; Kim, Haedong; Choi, Sujin; Jung, Okchul; Chung, Daewon; Ko, Kwanghee

    2018-04-01

    In this paper, we propose a method that uses a genetic algorithm for the dynamic schedule optimization of imaging missions for multiple satellites and ground systems. In particular, the visibility conflicts of communication and mission operation using satellite resources (electric power and onboard memory) are integrated in sequence. Resource consumption and restoration are considered in the optimization process. Image acquisition is an essential part of satellite missions and is performed via a series of subtasks such as command uplink, image capturing, image storing, and image downlink. An objective function for optimization is designed to maximize the usability by considering the following components: user-assigned priority, resource consumption, and image-acquisition time. For the simulation, a series of hypothetical imaging missions are allocated to a multi-satellite control system comprising five satellites and three ground stations having S- and X-band antennas. To demonstrate the performance of the proposed method, simulations are performed via three operation modes: general, commercial, and tactical.

  15. Interconnection of Broadband Islands via Satellite-Experiments on the Race II Catalyst Project

    National Research Council Canada - National Science Library

    Sun, Z

    1996-01-01

    .... The purpose of the project was to develop an ATM satellite link for the future B-ISDN services, particularly for the interconnections of the ATM testbeds which are in the form of broadband islands...

  16. Code Tracking Algorithms for Mitigating Multipath Effects in Fading Channels for Satellite-Based Positioning

    Directory of Open Access Journals (Sweden)

    Markku Renfors

    2007-12-01

    The ever-increasing public interest in location and positioning services has originated a demand for higher performance global navigation satellite systems (GNSSs). In order to achieve this incremental performance, the estimation of line-of-sight (LOS) delay with high accuracy is a prerequisite for all GNSSs. The delay lock loops (DLLs) and their enhanced variants (i.e., feedback code tracking loops) are the structures of choice for the commercial GNSS receivers, but their performance in severe multipath scenarios is still rather limited. In addition, the new satellite positioning system proposals specify the use of a new modulation, the binary offset carrier (BOC) modulation, which triggers a new challenge in the code tracking stage. Therefore, in order to meet this emerging challenge and to improve the accuracy of the delay estimation in severe multipath scenarios, this paper analyzes feedback as well as feedforward code tracking algorithms and proposes the peak tracking (PT) methods, which are combinations of both feedback and feedforward structures and utilize the inherent advantages of both structures. We propose and analyze here two variants of PT algorithm: PT with second-order differentiation (Diff2), and PT with Teager Kaiser (TK) operator, which will be denoted herein as PT(Diff2) and PT(TK), respectively. In addition to the proposal of the PT methods, the authors propose also an improved early-late-slope (IELS) multipath elimination technique which is shown to provide very good mean-time-to-lose-lock (MTLL) performance. An implementation of a noncoherent multipath estimating delay locked loop (MEDLL) structure is also presented. We also incorporate here an extensive review of the existing feedback and feedforward delay estimation algorithms for direct sequence code division multiple access (DS-CDMA) signals in satellite fading channels, by taking into account the impact of binary phase shift keying (BPSK) as well as the newly proposed BOC modulation
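
    For readers unfamiliar with the feedback structures discussed here, the following Python sketch (NumPy only; an idealized noise- and multipath-free triangular code autocorrelation is assumed) shows a basic normalized early-late DLL discriminator of the kind the proposed peak tracking methods build upon; it is a textbook baseline, not the PT(Diff2) or PT(TK) algorithms themselves.

      import numpy as np

      def code_autocorr(tau, chip=1.0):
          # Ideal triangular autocorrelation of a BPSK spreading code (delay in chips).
          return np.maximum(0.0, 1.0 - np.abs(tau) / chip)

      def dll_step(tau_hat, tau_true, spacing=0.5, gain=0.5):
          # One update of a normalized early-late delay-lock-loop discriminator.
          early = code_autocorr(tau_hat - spacing / 2 - tau_true)
          late = code_autocorr(tau_hat + spacing / 2 - tau_true)
          disc = (early - late) / (early + late) if (early + late) > 0 else 0.0
          return tau_hat - gain * disc

      tau_true, tau_hat = 0.0, 0.3   # chips
      for _ in range(20):
          tau_hat = dll_step(tau_hat, tau_true)
      print(f"residual delay error = {tau_hat - tau_true:.4f} chips")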

  17. A Monocular Vision Measurement System of Three-Degree-of-Freedom Air-Bearing Test-Bed Based on FCCSP

    Science.gov (United States)

    Gao, Zhanyu; Gu, Yingying; Lv, Yaoyu; Xu, Zhenbang; Wu, Qingwen

    2018-06-01

    A monocular vision-based pose measurement system is provided for real-time measurement of a three-degree-of-freedom (3-DOF) air-bearing test-bed. Firstly, a circular plane cooperative target is designed. An image of a target fixed on the test-bed is then acquired. Blob analysis-based image processing is used to detect the object circles on the target. A fast algorithm (FCCSP) based on pixel statistics is proposed to extract the centers of object circles. Finally, pose measurements can be obtained when combined with the centers and the coordinate transformation relation. Experiments show that the proposed method is fast, accurate, and robust enough to satisfy the requirement of the pose measurement.

  18. A novel gridding algorithm to create regional trace gas maps from satellite observations

    Science.gov (United States)

    Kuhlmann, G.; Hartl, A.; Cheung, H. M.; Lam, Y. F.; Wenig, M. O.

    2014-02-01

    The recent increase in spatial resolution for satellite instruments has made it feasible to study distributions of trace gas column densities on a regional scale. For this application a new gridding algorithm was developed to map measurements from the instrument's frame of reference (level 2) onto a longitude-latitude grid (level 3). The algorithm is designed for the Ozone Monitoring Instrument (OMI) and can easily be employed for similar instruments - for example, the upcoming TROPOspheric Monitoring Instrument (TROPOMI). Trace gas distributions are reconstructed by a continuous parabolic spline surface. The algorithm explicitly considers the spatially varying sensitivity of the sensor resulting from the instrument function. At the swath edge, the inverse problem of computing the spline coefficients is very sensitive to measurement errors and is regularised by a second-order difference matrix. Since this regularisation corresponds to the penalty term for smoothing splines, it similarly attenuates the effect of measurement noise over the entire swath width. Monte Carlo simulations are conducted to study the performance of the algorithm for different distributions of trace gas column densities. The optimal weight of the penalty term is found to be proportional to the measurement uncertainty and the width of the instrument function. A comparison with an established gridding algorithm shows improved performance for small to moderate measurement errors due to better parametrisation of the distribution. The resulting maps are smoother and extreme values are more accurately reconstructed. The performance improvement is further illustrated with high-resolution distributions obtained from a regional chemistry model. The new algorithm is applied to tropospheric NO2 column densities measured by OMI. Examples of regional NO2 maps are shown for densely populated areas in China, Europe and the United States of America. This work demonstrates that the newly developed gridding
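
    The regularisation principle described above can be illustrated in one dimension: the Python sketch below (NumPy only; the boxcar instrument function, grid size, and penalty weight are illustrative assumptions, and the actual algorithm fits a 2-D parabolic spline surface) reconstructs a fine grid from coarse footprint averages by least squares with a second-order difference penalty.

      import numpy as np

      # Reconstruct a fine grid x from coarse footprint averages y = A x + noise,
      # penalising second differences of x.
      rng = np.random.default_rng(2)
      n_grid, n_obs, width = 100, 40, 7

      truth = np.sin(np.linspace(0, 3 * np.pi, n_grid)) + 0.3 * np.linspace(0, 1, n_grid)
      A = np.zeros((n_obs, n_grid))
      for i, c in enumerate(np.linspace(width, n_grid - width, n_obs).astype(int)):
          A[i, c - width // 2: c + width // 2 + 1] = 1.0 / width   # boxcar "instrument function"
      y = A @ truth + rng.normal(scale=0.05, size=n_obs)

      # Second-order difference matrix used as the smoothing penalty.
      D = np.diff(np.eye(n_grid), n=2, axis=0)
      lam = 0.05   # weight of the penalty term (scales with measurement uncertainty)
      x_hat = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ y)
      print(f"reconstruction RMSE = {np.sqrt(np.mean((x_hat - truth) ** 2)):.3f}")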

  19. A novel gridding algorithm to create regional trace gas maps from satellite observations

    Directory of Open Access Journals (Sweden)

    G. Kuhlmann

    2014-02-01

    The recent increase in spatial resolution for satellite instruments has made it feasible to study distributions of trace gas column densities on a regional scale. For this application a new gridding algorithm was developed to map measurements from the instrument's frame of reference (level 2) onto a longitude–latitude grid (level 3). The algorithm is designed for the Ozone Monitoring Instrument (OMI) and can easily be employed for similar instruments – for example, the upcoming TROPOspheric Monitoring Instrument (TROPOMI). Trace gas distributions are reconstructed by a continuous parabolic spline surface. The algorithm explicitly considers the spatially varying sensitivity of the sensor resulting from the instrument function. At the swath edge, the inverse problem of computing the spline coefficients is very sensitive to measurement errors and is regularised by a second-order difference matrix. Since this regularisation corresponds to the penalty term for smoothing splines, it similarly attenuates the effect of measurement noise over the entire swath width. Monte Carlo simulations are conducted to study the performance of the algorithm for different distributions of trace gas column densities. The optimal weight of the penalty term is found to be proportional to the measurement uncertainty and the width of the instrument function. A comparison with an established gridding algorithm shows improved performance for small to moderate measurement errors due to better parametrisation of the distribution. The resulting maps are smoother and extreme values are more accurately reconstructed. The performance improvement is further illustrated with high-resolution distributions obtained from a regional chemistry model. The new algorithm is applied to tropospheric NO2 column densities measured by OMI. Examples of regional NO2 maps are shown for densely populated areas in China, Europe and the United States of America. This work demonstrates that the newly

  20. Sensor System Performance Evaluation and Benefits from the NPOESS Airborne Sounder Testbed-Interferometer (NAST-I)

    Science.gov (United States)

    Larar, A.; Zhou, D.; Smith, W.

    2009-01-01

    Advanced satellite sensors are tasked with improving global-scale measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring, and environmental change detection. Validation of the entire measurement system is crucial to achieving this goal and thus maximizing research and operational utility of resultant data. Field campaigns employing satellite under-flights with well-calibrated FTS sensors aboard high-altitude aircraft are an essential part of this validation task. The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Airborne Sounder Testbed-Interferometer (NAST-I) has been a fundamental contributor in this area by providing coincident high spectral/spatial resolution observations of infrared spectral radiances along with independently-retrieved geophysical products for comparison with like products from satellite sensors being validated. This paper focuses on some of the challenges associated with validating advanced atmospheric sounders and the benefits obtained from employing airborne interferometers such as the NAST-I. Select results from underflights of the Aqua Atmospheric InfraRed Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI) obtained during recent field campaigns will be presented.

  1. Description of Simulated Small Satellite Operation Data Sets

    Science.gov (United States)

    Kulkarni, Chetan S.; Guarneros Luna, Ali

    2018-01-01

    A set of two BP930 batteries (identified as PK31 and PK35) was operated continuously to complete a simulated satellite operation profile for a single cycle. The battery packs were charged to an initial voltage of around 8.35 V for 100% SOC before the experiment was started. This document explains the structure of the battery data sets. Please cite this paper when using this dataset: Z. Cameron, C. Kulkarni, A. Guarneros, K. Goebel, S. Poll, "A Battery Certification Testbed for Small Satellite Missions", IEEE AUTOTESTCON 2015, Nov 2-5, 2015, National Harbor, MD

  2. The design and implementation of the LLNL gigabit testbed

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, D. [Lawrence Livermore National Labs., CA (United States)

    1994-12-01

    This paper will look at the design and implementation of the LLNL Gigabit testbed (LGTB), where various high-speed networking products can be tested in one environment. The paper will discuss the philosophy behind the design of and the need for the testbed, the tests that are performed in the testbed, and the tools used to implement those tests.

  3. Advanced Artificial Intelligence Technology Testbed

    Science.gov (United States)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  4. The University of Canberra quantum key distribution testbed

    International Nuclear Information System (INIS)

    Ganeshkumar, G.; Edwards, P.J.; Cheung, W.N.; Barbopoulos, L.O.; Pham, H.; Hazel, J.C.

    1999-01-01

    Full text: We describe the design, operation and preliminary results obtained from a quantum key distribution (QKD) testbed constructed at the University of Canberra. Quantum cryptographic systems use shared secret keys exchanged in the form of sequences of polarisation coded or phase encoded single photons transmitted over an optical communications channel. Secrecy of this quantum key rests upon fundamental laws of quantum physics: measurements of linear or circular photon polarisation states introduce noise into the conjugate variable and so reveal eavesdropping. In its initial realisation reported here, pulsed light from a 650 nm laser diode is attenuated by a factor of 10^6, plane-polarised and then transmitted through a birefringent liquid crystal modulator (LCM) to a polarisation sensitive single photon receiver. This transmitted key sequence consists of a 1 kHz train of weak coherent 100 ns wide light pulses, polarisation coded according to the BB84 protocol. Each pulse is randomly assigned one of four polarisation states (two orthogonal linear and two orthogonal circular) by computer PCA operated by the sender ('Alice'). This quaternary polarisation shift keyed photon stream is detected by the receiver ('Bob') whose computer (PCB) randomly chooses either a linear or a circular polarisation basis. Computer PCB is also used for final key selection, authentication, privacy amplification and eavesdropping detection. We briefly discuss the realisation of a mesoscopic single photon QKD source and the use of the testbed to simulate a global quantum key distribution system using earth satellites. Copyright (1999) Australian Optical Society

  5. An analytic algorithm for global coverage of the revisiting orbit and its application to the CFOSAT satellite

    Science.gov (United States)

    Xu, Ming; Huang, Li

    2014-08-01

    This paper addresses a new analytic algorithm for global coverage of the revisiting orbit and its application to missions revisiting the Earth over long periods of time, such as the Chinese-French Oceanic Satellite (CFOSAT). First, the traditional design methodology is reviewed, in which the revisiting orbit of an imaging satellite is designed for only a single (ascending or descending) pass and a repeating orbit is employed to achieve global coverage within a short period of time. However, selecting a repeating orbit essentially yields a suboptimal solution drawn from the sparse set of rational numbers of passes per day, which discards many available revisiting orbits. Thus, an innovative design scheme is proposed that checks both rational and irrational numbers of passes per day to obtain the relationship between coverage percentage and altitude. To improve on traditional single-pass imaging, the proposed algorithm maps every pass onto its ascending and descending nodes on a specified latitude circle and then accumulates the width projected onto the circle by the satellite's field of view. The resulting map of coverage percentage over all candidate altitudes informs the final scheme, whether the optimal one with the largest percentage or the balanced one with the smallest gradient in its vicinity, and guides heuristic design of the station-keeping control strategies. The application to CFOSAT validates the feasibility of the algorithm.

  6. Real-Time Signal Processing for Multiantenna Systems: Algorithms, Optimization, and Implementation on an Experimental Test-Bed

    Directory of Open Access Journals (Sweden)

    Haustein Thomas

    2006-01-01

    A recently realized concept of a reconfigurable hardware test-bed suitable for real-time mobile communication with multiple antennas is presented in this paper. We discuss the reasons and prerequisites for real-time capable MIMO transmission systems which may allow channel adaptive transmission to increase link stability and data throughput. We describe a concept of an efficient implementation of MIMO signal processing using FPGAs and DSPs. We focus on some basic linear and nonlinear MIMO detection and precoding algorithms and their optimization for a DSP target, and a few principal steps for computational performance enhancement are outlined. An experimental verification of several real-time MIMO transmission schemes at high data rates in a typical office scenario is presented and results on the achieved BER and throughput performance are given. The different transmission schemes used either channel state information at both sides of the link or at one side only (transmitter or receiver). Spectral efficiencies of more than 20 bits/s/Hz and a throughput of more than 150 Mbps were shown with a single-carrier transmission. The experimental results clearly show the feasibility of real-time high data rate MIMO techniques with state-of-the-art hardware and that more sophisticated baseband signal processing will be an essential part of future communication systems. A discussion on implementation challenges towards future wireless communication systems supporting higher data rates (1 Gbps and beyond) or high mobility concludes the paper.
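
    As a small illustration of the linear MIMO detection algorithms mentioned above, the Python sketch below (NumPy; the channel, symbols, and noise level are made-up) applies a linear MMSE equalizer to a 2-transmit, 4-receive channel; the real-time test-bed implements such detectors on FPGAs and DSPs rather than in floating-point Python.

      import numpy as np

      def mmse_detect(H, y, noise_var):
          # Linear MMSE equalizer: x_hat = (H^H H + sigma^2 I)^-1 H^H y
          nt = H.shape[1]
          W = np.linalg.inv(H.conj().T @ H + noise_var * np.eye(nt)) @ H.conj().T
          return W @ y

      rng = np.random.default_rng(3)
      nt, nr = 2, 4
      H = (rng.normal(size=(nr, nt)) + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)
      x = np.array([1 + 1j, -1 - 1j]) / np.sqrt(2)        # QPSK symbols
      y = H @ x + 0.05 * (rng.normal(size=nr) + 1j * rng.normal(size=nr))
      x_hat = mmse_detect(H, y, noise_var=0.005)
      print(np.round(x_hat, 2))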

  7. Validation and Application of the Modified Satellite-Based Priestley-Taylor Algorithm for Mapping Terrestrial Evapotranspiration

    Directory of Open Access Journals (Sweden)

    Yunjun Yao

    2014-01-01

    Satellite-based vegetation indices (VIs) and Apparent Thermal Inertia (ATI) derived from temperature change provide valuable information for estimating evapotranspiration (LE) and detecting the onset and severity of drought. The modified satellite-based Priestley-Taylor (MS-PT) algorithm that we developed earlier, coupling both VI and ATI, is validated based on observed data from 40 flux towers distributed across the world on all continents. The validation results illustrate that the daily LE can be estimated with the Root Mean Square Error (RMSE) varying from 10.7 W/m2 to 87.6 W/m2, and with the square of correlation coefficient (R2) from 0.41 to 0.89 (p < 0.01). Compared with the Priestley-Taylor-based LE (PT-JPL) algorithm, the MS-PT algorithm improves the LE estimates at most flux tower sites. Importantly, the MS-PT algorithm is also satisfactory in reproducing the inter-annual variability at flux tower sites with at least five years of data. The R2 between measured and predicted annual LE anomalies is 0.42 (p = 0.02). The MS-PT algorithm is then applied to detect the variations of long-term terrestrial LE over the Three-North Shelter Forest Region of China and to monitor global land surface drought. The MS-PT algorithm described here demonstrates the ability to map regional terrestrial LE and identify global soil moisture stress, without requiring precipitation information.
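
    For orientation, the classic Priestley-Taylor formulation that the MS-PT algorithm modifies can be written in a few lines: LE = alpha * Delta / (Delta + gamma) * (Rn - G). The Python sketch below is only that classic form (the FAO-56 expression for the saturation vapour pressure slope, a fixed psychrometric constant of about 0.066 kPa per deg C, and the sample fluxes are assumptions for illustration; the MS-PT constraints based on VI and ATI are not included).

      import numpy as np

      def slope_svp(t_air_c):
          # Slope of the saturation vapour pressure curve (kPa / deg C), FAO-56 form.
          es = 0.6108 * np.exp(17.27 * t_air_c / (t_air_c + 237.3))
          return 4098.0 * es / (t_air_c + 237.3) ** 2

      def priestley_taylor_le(rn, g, t_air_c, alpha=1.26, gamma=0.066):
          # Classic Priestley-Taylor latent heat flux (W/m2); rn and g in W/m2.
          delta = slope_svp(t_air_c)
          return alpha * delta / (delta + gamma) * (rn - g)

      print(priestley_taylor_le(rn=450.0, g=50.0, t_air_c=25.0))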

  8. Exploration Systems Health Management Facilities and Testbed Workshop

    Science.gov (United States)

    Wilson, Scott; Waterman, Robert; McCleskey, Carey

    2004-01-01

    Presentation Agenda: (1) Technology Maturation Pipeline (The Plan) (2) Cryogenic testbed (and other KSC Labs) (2a) Component / Subsystem technologies (3) Advanced Technology Development Center (ATDC) (3a) System / Vehicle technologies (4) ELV Flight Experiments (Flight Testbeds).

  9. STAR Algorithm Integration Team - Facilitating operational algorithm development

    Science.gov (United States)

    Mikles, V. J.

    2015-12-01

    The NOAA/NESDIS Center for Satellite Research and Applications (STAR) provides technical support of the Joint Polar Satellite System (JPSS) algorithm development and integration tasks. Utilizing data from the S-NPP satellite, JPSS generates over thirty Environmental Data Records (EDRs) and Intermediate Products (IPs) spanning atmospheric, ocean, cryosphere, and land weather disciplines. The Algorithm Integration Team (AIT) brings technical expertise and support to product algorithms, specifically in testing and validating science algorithms in a pre-operational environment. The AIT verifies that new and updated algorithms function in the development environment, enforces established software development standards, and ensures that delivered packages are functional and complete. AIT facilitates the development of new JPSS-1 algorithms by implementing a review approach based on the Enterprise Product Lifecycle (EPL) process. Building on relationships established during the S-NPP algorithm development process and coordinating directly with science algorithm developers, the AIT has implemented structured reviews with self-contained document suites. The process has supported algorithm improvements for products such as ozone, active fire, vegetation index, and temperature and moisture profiles.

  10. Recent Successes and Future Plans for NASA's Space Communications and Navigation Testbed on the International Space Station

    Science.gov (United States)

    Reinhart, Richard C.; Sankovic, John M.; Johnson, Sandra K.; Lux, James P.; Chelmins, David T.

    2014-01-01

    Flexible and extensible space communications architectures and technology are essential to enable future space exploration and science activities. NASA has championed the development of the Space Telecommunications Radio System (STRS) software defined radio (SDR) standard and the application of SDR technology to reduce the costs and risks of using SDRs for space missions, and has developed an on-orbit testbed to validate these capabilities. The Space Communications and Navigation (SCaN) Testbed (previously known as the Communications, Navigation, and Networking reConfigurable Testbed (CoNNeCT)) is advancing SDR, on-board networking, and navigation technologies by conducting space experiments aboard the International Space Station. During its first year(s) on-orbit, the SCaN Testbed has achieved considerable accomplishments to better understand SDRs and their applications. The SDR platforms and software waveforms on each SDR have over 1500 hours of operation and are performing as designed. The Ka-band SDR on the SCaN Testbed is NASA's first space Ka-band transceiver and is NASA's first Ka-band mission using the Space Network. This has provided exciting opportunities to operate at Ka-band and assist with on-orbit tests of NASA's newest Tracking and Data Relay Satellites (TDRS). During its first year, SCaN Testbed completed its first on-orbit SDR reconfigurations. SDR reconfigurations occur when implementing new waveforms on an SDR. SDR reconfigurations allow a radio to change minor parameters, such as data rate, or its complete functionality. New waveforms which provide new capability and are reusable across different missions provide long-term value for reconfigurable platforms such as SDRs. The STRS Standard provides guidelines for new waveform development by third parties. Waveform development by organizations other than the platform provider offers NASA the ability to develop waveforms itself and reduce its dependence on, and costs from, the platform developer. Each of these

  11. Environment Emulation For Wsn Testbed

    Directory of Open Access Journals (Sweden)

    Radosław Kapłoniak

    2012-01-01

    The development of applications for wireless sensor networks is a challenging task. For this reason, several testbed platforms have been created. They simplify the manageability of nodes by offering easy ways of programming and debugging sensor nodes. These platforms, sometimes composed of dozens of sensors, provide a convenient way to carry out research on medium access control and data exchange between nodes. In this article, we propose an extension of the WSN testbed that can be used for evaluating and testing the functionality of sensor network applications by emulating a real-world environment.

  12. Target Matching Recognition for Satellite Images Based on the Improved FREAK Algorithm

    Directory of Open Access Journals (Sweden)

    Yantong Chen

    2016-01-01

    Satellite remote sensing image target matching recognition exhibits poor robustness and accuracy because of unsuitable feature extractors and the large quantity of data. To address this problem, we propose a new feature extraction algorithm for fast target matching recognition that comprises an improved features from accelerated segment test (FAST) feature detector and a binary fast retina keypoint (FREAK) feature descriptor. To improve robustness, we extend the FAST feature detector by applying scale-space theory and then transform the feature vector acquired by the FREAK descriptor from decimal into binary. Working in the binary space reduces the quantity of data held in the computer and improves matching accuracy. Simulation test results show that our algorithm outperforms other relevant methods in terms of robustness and accuracy.
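
    A baseline FAST-plus-FREAK matching pipeline, which this paper extends with a scale-space FAST detector, can be sketched with OpenCV as below (Python; FREAK is provided by the opencv-contrib package, the image file names are hypothetical, and the parameters are illustrative).

      import cv2

      def match_freak(img_ref, img_query, n_best=50):
          # Detect FAST keypoints, describe them with FREAK (binary descriptors),
          # and match with Hamming-distance brute force.
          fast = cv2.FastFeatureDetector_create(threshold=25)
          freak = cv2.xfeatures2d.FREAK_create()
          kp1 = fast.detect(img_ref, None)
          kp2 = fast.detect(img_query, None)
          kp1, des1 = freak.compute(img_ref, kp1)
          kp2, des2 = freak.compute(img_query, kp2)
          matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
          matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
          return kp1, kp2, matches[:n_best]

      ref = cv2.imread("reference_tile.png", cv2.IMREAD_GRAYSCALE)      # hypothetical files
      query = cv2.imread("satellite_scene.png", cv2.IMREAD_GRAYSCALE)
      kp1, kp2, good = match_freak(ref, query)
      print(f"{len(good)} candidate correspondences")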

  13. An Improved Image Encryption Algorithm Based on Cyclic Rotations and Multiple Chaotic Sequences: Application to Satellite Images

    Directory of Open Access Journals (Sweden)

    MADANI Mohammed

    2017-10-01

    In this paper, a new satellite image encryption algorithm based on the combination of multiple chaotic systems and a random cyclic rotation technique is proposed. Our contribution consists in implementing three different chaotic maps (logistic, sine, and standard) combined to improve the security of satellite images. Besides enhancing the encryption, the proposed algorithm also focuses on advanced efficiency of the ciphered images. Compared with classical encryption schemes based on multiple chaotic maps and the Rubik's cube rotation, our approach has not only the same merits of chaos systems like high sensitivity to initial values, unpredictability, and pseudo-randomness, but also other advantages like a higher number of permutations, and better performance in Peak Signal to Noise Ratio (PSNR) and Maximum Deviation (MD).
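
    To give a flavour of chaos-based image ciphering, the Python sketch below (NumPy; the logistic-map parameters act as the key and are purely illustrative) implements only a single-map XOR diffusion stage; the proposed algorithm additionally combines sine and standard maps with random cyclic rotations for permutation.

      import numpy as np

      def logistic_keystream(length, x0=0.567, r=3.99):
          # Keystream bytes from the logistic map x <- r*x*(1-x); x0 and r act as the key.
          x, out = x0, np.empty(length, dtype=np.uint8)
          for i in range(length):
              x = r * x * (1.0 - x)
              out[i] = int(x * 256) % 256
          return out

      def xor_cipher(image, x0=0.567, r=3.99):
          # Diffusion stage only: XOR pixels with a chaotic keystream (same call decrypts).
          flat = image.reshape(-1)
          return (flat ^ logistic_keystream(flat.size, x0, r)).reshape(image.shape)

      img = np.random.default_rng(4).integers(0, 256, size=(64, 64), dtype=np.uint8)
      cipher = xor_cipher(img)
      assert np.array_equal(xor_cipher(cipher), img)   # decryption recovers the image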

  14. A practical algorithm for the retrieval of floe size distribution of Arctic sea ice from high-resolution satellite Synthetic Aperture Radar imagery

    Directory of Open Access Journals (Sweden)

    Byongjun Hwang

    2017-07-01

    In this study, we present an algorithm for summer sea ice conditions that semi-automatically produces the floe size distribution of Arctic sea ice from high-resolution satellite Synthetic Aperture Radar data. Currently, floe size distribution data from satellite images are very rare in the literature, mainly due to the lack of a reliable algorithm to produce such data. Here, we developed the algorithm by combining various image analysis methods, including Kernel Graph Cuts, distance transformation and watershed transformation, and a rule-based boundary revalidation. The developed algorithm has been validated against the ground truth that was extracted manually with the aid of 1-m resolution visible satellite data. Comprehensive validation analysis has shown both perspectives and limitations. The algorithm tends to fail to detect small floes (mostly less than 100 m in mean caliper diameter) compared to ground truth, which is mainly due to limitations in water-ice segmentation. Some variability in the power law exponent of floe size distribution is observed due to the effects of control parameters in the process of de-noising, Kernel Graph Cuts segmentation, thresholds for boundary revalidation and image resolution. Nonetheless, the algorithm, for floes larger than 100 m, has shown a reasonable agreement with ground truth under various selections of these control parameters. Considering that the coverage and spatial resolution of satellite Synthetic Aperture Radar data have increased significantly in recent years, the developed algorithm opens a new possibility to produce large volumes of floe size distribution data, which is essential for improving our understanding and prediction of the Arctic sea ice cover.
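
    The distance-transform-plus-watershed step of such a pipeline can be illustrated with scikit-image as below (Python; the toy binary mask of two touching discs stands in for a segmented ice/water image, and the Kernel Graph Cuts segmentation and rule-based boundary revalidation stages described in the paper are not included).

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.segmentation import watershed
      from skimage.feature import peak_local_max

      def split_floes(ice_mask):
          # Separate touching floes: distance transform, local maxima as markers,
          # then watershed on the inverted distance map (one label per floe).
          distance = ndi.distance_transform_edt(ice_mask)
          peaks = peak_local_max(distance, min_distance=10, threshold_abs=1)
          markers = np.zeros_like(ice_mask, dtype=int)
          markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
          return watershed(-distance, markers, mask=ice_mask)

      # Toy binary "ice" mask with two touching discs standing in for floes.
      yy, xx = np.mgrid[0:80, 0:80]
      mask = ((yy - 40) ** 2 + (xx - 28) ** 2 < 15 ** 2) | ((yy - 40) ** 2 + (xx - 52) ** 2 < 15 ** 2)
      labels = split_floes(mask)
      print(f"{labels.max()} floes, sizes: {np.bincount(labels.ravel())[1:]}")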

  15. Statistically Optimized Inversion Algorithm for Enhanced Retrieval of Aerosol Properties from Spectral Multi-Angle Polarimetric Satellite Observations

    Science.gov (United States)

    Dubovik, O; Herman, M.; Holdak, A.; Lapyonok, T.; Taure, D.; Deuze, J. L.; Ducos, F.; Sinyuk, A.

    2011-01-01

    The proposed development is an attempt to enhance aerosol retrieval by emphasizing statistical optimization in inversion of advanced satellite observations. This optimization concept improves retrieval accuracy relying on the knowledge of measurement error distribution. Efficient application of such optimization requires pronounced data redundancy (excess of the measurements number over number of unknowns) that is not common in satellite observations. The POLDER imager on board the PARASOL microsatellite registers spectral polarimetric characteristics of the reflected atmospheric radiation at up to 16 viewing directions over each observed pixel. The completeness of such observations is notably higher than for most currently operating passive satellite aerosol sensors. This provides an opportunity for profound utilization of statistical optimization principles in satellite data inversion. The proposed retrieval scheme is designed as statistically optimized multi-variable fitting of all available angular observations obtained by the POLDER sensor in the window spectral channels where absorption by gas is minimal. The total number of such observations by PARASOL always exceeds a hundred over each pixel and the statistical optimization concept promises to be efficient even if the algorithm retrieves several tens of aerosol parameters. Based on this idea, the proposed algorithm uses a large number of unknowns and is aimed at retrieval of extended set of parameters affecting measured radiation.

  16. Improved interpretation of satellite altimeter data using genetic algorithms

    Science.gov (United States)

    Messa, Kenneth; Lybanon, Matthew

    1992-01-01

    Genetic algorithms (GA) are optimization techniques that are based on the mechanics of evolution and natural selection. They take advantage of the power of cumulative selection, in which successive incremental improvements in a solution structure become the basis for continued development. A GA is an iterative procedure that maintains a 'population' of 'organisms' (candidate solutions). Through successive 'generations' (iterations) the population as a whole improves in simulation of Darwin's 'survival of the fittest'. GA's have been shown to be successful where noise significantly reduces the ability of other search techniques to work effectively. Satellite altimetry provides useful information about oceanographic phenomena. It provides rapid global coverage of the oceans and is not as severely hampered by cloud cover as infrared imagery. Despite these and other benefits, several factors lead to significant difficulty in interpretation. The GA approach to the improved interpretation of satellite data involves the representation of the ocean surface model as a string of parameters or coefficients from the model. The GA searches in parallel, a population of such representations (organisms) to obtain the individual that is best suited to 'survive', that is, the fittest as measured with respect to some 'fitness' function. The fittest organism is the one that best represents the ocean surface model with respect to the altimeter data.
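
    A generic real-coded genetic algorithm of the kind described, in which each organism is a string of model coefficients and fitness is the negative misfit to the data, can be sketched in Python as below (NumPy; the quadratic "surface" model, population size, and operators are illustrative assumptions, not the authors' setup).

      import numpy as np

      rng = np.random.default_rng(5)

      # Noisy synthetic profile: quadratic model with unknown coefficients.
      true_coeff = np.array([0.5, -1.2, 2.0])
      x = np.linspace(-1, 1, 80)
      data = np.polyval(true_coeff, x) + rng.normal(scale=0.3, size=x.size)

      def fitness(coeff):
          # Negative mean squared misfit: higher is fitter.
          return -np.mean((np.polyval(coeff, x) - data) ** 2)

      pop = rng.uniform(-3, 3, size=(60, 3))                 # population of coefficient strings
      for _ in range(200):                                   # generations
          scores = np.array([fitness(c) for c in pop])
          parents = pop[np.argsort(scores)[-30:]]            # truncation selection
          mates = parents[rng.integers(0, 30, size=(60, 2))]
          alpha = rng.random((60, 1))
          pop = alpha * mates[:, 0] + (1 - alpha) * mates[:, 1]   # blend crossover
          pop += rng.normal(scale=0.05, size=pop.shape)           # mutation
      best = pop[np.argmax([fitness(c) for c in pop])]
      print("estimated coefficients:", np.round(best, 2))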

  17. Nuclear Instrumentation and Control Cyber Testbed Considerations – Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Gray; Robert Anderson; Julio G. Rodriguez; Cheol-Kwon Lee

    2014-08-01

    Abstract: Identifying and understanding digital instrumentation and control (I&C) cyber vulnerabilities within nuclear power plants and other nuclear facilities, is critical if nation states desire to operate nuclear facilities safely, reliably, and securely. In order to demonstrate objective evidence that cyber vulnerabilities have been adequately identified and mitigated, a testbed representing a facility’s critical nuclear equipment must be replicated. Idaho National Laboratory (INL) has built and operated similar testbeds for common critical infrastructure I&C for over ten years. This experience developing, operating, and maintaining an I&C testbed in support of research identifying cyber vulnerabilities has led the Korean Atomic Energy Research Institute of the Republic of Korea to solicit the experiences of INL to help mitigate problems early in the design, development, operation, and maintenance of a similar testbed. The following information will discuss I&C testbed lessons learned and the impact of these experiences to KAERI.

  18. Dr. Tulga Ersal at NSF Workshop Accessible Remote Testbeds ART'15

    Science.gov (United States)

    On November 12th, Dr. Tulga Ersal took part in the NSF Workshop on Accessible Remote Testbeds (ART'15) at Georgia Tech. From the event website: The rationale behind the ART'15 workshop is that remote-access testbeds could, if done right, significantly change how

  19. SSERVI Analog Regolith Simulant Testbed Facility

    Science.gov (United States)

    Minafra, J.; Schmidt, G. K.

    2016-12-01

    SSERVI's goals include supporting planetary researchers within NASA and other government agencies; private sector and hardware developers; competitors in focused prize design competitions; and academic sector researchers. The SSERVI Analog Regolith Simulant Testbed provides opportunities for research scientists and engineers to carry out regolith analog testbed research in the planetary exploration field. This capability is essential to help understand the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin covering a 4 meter by 4 meter area. SSERVI provides a bridge between several groups, joining together researchers from: 1) scientific and exploration communities, 2) multiple disciplines across a wide range of planetary sciences, and 3) domestic and international communities and partnerships. The testbed consolidates the tasks of acquisition, storage, and safety mitigation in handling large quantities of regolith simulant. Facility hardware and environment testing scenarios include, but are not limited to: lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profile, lofted dust, mechanical properties of lunar regolith, and surface features (i.e., grades and rocks). Benefits range from easy access to a controlled analog regolith simulant testbed and planetary exploration activities at NASA Research Park, to academic and expanded commercial opportunities in California's Silicon Valley, as well as public outreach and education opportunities.

  20. TESTING THE APODIZED PUPIL LYOT CORONAGRAPH ON THE LABORATORY FOR ADAPTIVE OPTICS EXTREME ADAPTIVE OPTICS TESTBED

    International Nuclear Information System (INIS)

    Thomas, Sandrine J.; Dillon, Daren; Gavel, Donald; Soummer, Remi; Macintosh, Bruce; Sivaramakrishnan, Anand

    2011-01-01

    We present testbed results of the Apodized Pupil Lyot Coronagraph (APLC) at the Laboratory for Adaptive Optics (LAO). These results are part of the validation and tests of the coronagraph and of the Extreme Adaptive Optics (ExAO) for the Gemini Planet Imager (GPI). The apodizer component is manufactured with a halftone technique using black chrome microdots on glass. Testing this APLC (like any other coronagraph) requires extremely good wavefront correction, which is obtained to the 1 nm rms level using microelectromechanical systems (MEMS) technology, on the ExAO visible testbed of the LAO at the University of California, Santa Cruz. We used an APLC coronagraph without central obstruction, both with a reference super-polished flat mirror and with the MEMS to obtain one of the first images of a dark zone in a coronagraphic image with classical adaptive optics using a MEMS deformable mirror (without involving dark hole algorithms). This was done as a complementary test to the GPI coronagraph testbed at the American Museum of Natural History, which studied the coronagraph itself without wavefront correction. Because we needed a full aperture, the coronagraph design is very different from the GPI design. We also tested a coronagraph with central obstruction similar to that of GPI. We investigated the performance of the APLC coronagraph and more particularly the effect of the apodizer profile accuracy on the contrast. Finally, we compared the resulting contrast to predictions made with a wavefront propagation model of the testbed to understand the effects of phase and amplitude errors on the final contrast.

  1. Design and Development of a 200-kW Turbo-Electric Distributed Propulsion Testbed

    Science.gov (United States)

    Papathakis, Kurt V.; Kloesel, Kurt J.; Lin, Yohan; Clarke, Sean; Ediger, Jacob J.; Ginn, Starr

    2016-01-01

    The National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center (AFRC) (Edwards, California) is developing the Hybrid-Electric Integrated Systems Testbed (HEIST) as part of the HEIST Project, to study power management and transition complexities, modular architectures, and flight control laws for turbo-electric distributed propulsion technologies using representative hardware and piloted simulations. Capabilities are being developed to assess the flight readiness of hybrid electric and distributed electric vehicle architectures. Additionally, NASA will leverage experience gained and assets developed from HEIST to assist in flight-test proposal development, flight-test vehicle design, and evaluation of hybrid electric and distributed electric concept vehicles for flight safety. The HEIST test equipment will include three trailers supporting a distributed electric propulsion wing, a battery system and turbogenerator, dynamometers, and supporting power and communication infrastructure, all connected to the AFRC Core simulation. Plans call for 18 high-performance electric motors that will be powered by batteries and the turbogenerator, and commanded by a piloted simulation. Flight control algorithms will be developed on the turbo-electric distributed propulsion system.

  2. A Reconfigurable Testbed Environment for Spacecraft Autonomy

    Science.gov (United States)

    Biesiadecki, Jeffrey; Jain, Abhinandan

    1996-01-01

    A key goal of NASA's New Millennium Program is the development of technology for increased spacecraft on-board autonomy. Achievement of this objective requires the development of a new class of ground-based autonomy testbeds that can enable the low-cost and rapid design, test, and integration of the spacecraft autonomy software. This paper describes the development of an Autonomy Testbed Environment (ATBE) for the NMP Deep Space 1 comet/asteroid rendezvous mission.

  3. Implementation of standard testbeds for numerical relativity

    Energy Technology Data Exchange (ETDEWEB)

    Babiuc, M C [Department of Physics and Physical Science, Marshall University, Huntington, WV 25755 (United States); Husa, S [Friedrich Schiller University Jena, Max-Wien-Platz 1, 07743 Jena (Germany); Alic, D [Department of Physics, University of the Balearic Islands, Cra Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Hinder, I [Center for Gravitational Wave Physics, Pennsylvania State University, University Park, PA 16802 (United States); Lechner, C [Weierstrass Institute for Applied Analysis and Stochastics (WIAS), Mohrenstrasse 39, 10117 Berlin (Germany); Schnetter, E [Center for Computation and Technology, 216 Johnston Hall, Louisiana State University, Baton Rouge, LA 70803 (United States); Szilagyi, B; Dorband, N; Pollney, D; Winicour, J [Max-Planck-Institut fuer Gravitationsphysik (Albert-Einstein-Institut), Am Muehlenberg 1, 14076 Golm (Germany); Zlochower, Y [Center for Computational Relativity and Gravitation, School of Mathematical Sciences, Rochester Institute of Technology, 78 Lomb Memorial Drive, Rochester, New York 14623 (United States)

    2008-06-21

    We discuss results that have been obtained from the implementation of the initial round of testbeds for numerical relativity which was proposed in the first paper of the Apples with Apples Alliance. We present benchmark results for various codes which provide templates for analyzing the testbeds and for drawing conclusions about various features of the codes. This allows us to sharpen the initial test specifications, design a new test, and add theoretical insight.

  4. Robust Fault-Tolerant Control for Satellite Attitude Stabilization Based on Active Disturbance Rejection Approach with Artificial Bee Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Fei Song

    2014-01-01

    Full Text Available This paper proposes a robust fault-tolerant control algorithm for satellite stabilization based on the active disturbance rejection approach with an artificial bee colony algorithm. The actuating mechanism of the attitude control system consists of three working reaction flywheels and one spare reaction flywheel. The speed measurements of the reaction flywheels are used for fault detection. If a reaction flywheel fault is detected, the faulty flywheel is isolated and the spare reaction flywheel is activated to counteract the fault effect and ensure that the satellite keeps working safely and reliably. The active disturbance rejection approach is employed to design the controller, which handles input information with a tracking differentiator, estimates system uncertainties with an extended state observer, and generates control variables by state feedback and compensation. The designed active disturbance rejection controller is robust to both internal dynamics and external disturbances. The bandwidth parameter of the extended state observer is optimized by the artificial bee colony algorithm so as to improve the performance of the attitude control system. A series of simulation results demonstrates the superior performance of the proposed robust fault-tolerant control algorithm.
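    The abstract outlines the standard active disturbance rejection structure (tracking differentiator, extended state observer, state feedback with disturbance compensation). Below is a minimal sketch of a linear ADRC loop on a hypothetical double-integrator plant, intended only to illustrate the observer/controller roles; the plant model, gains, and bandwidth values are illustrative and are not taken from the paper, which additionally tunes the observer bandwidth with an artificial bee colony search.

```python
import numpy as np

def simulate_adrc(t_end=10.0, dt=0.001, wo=50.0, wc=10.0, b0=1.0):
    """Minimal linear ADRC loop on a hypothetical double-integrator plant.

    wo: observer bandwidth (the quantity the paper optimizes with ABC),
    wc: controller bandwidth, b0: nominal input gain. All values illustrative.
    """
    b1, b2, b3 = 3 * wo, 3 * wo**2, wo**3     # extended state observer gains
    kp, kd = wc**2, 2 * wc                    # PD-like state feedback gains

    x = np.zeros(2)   # true plant state [angle, rate]
    z = np.zeros(3)   # ESO state [angle est., rate est., total disturbance est.]
    r = 1.0           # attitude set-point (step)
    u = 0.0
    log = []
    for k in range(int(t_end / dt)):
        t = k * dt
        # Hypothetical plant: double integrator with an unknown sinusoidal disturbance.
        d = 0.2 * np.sin(2.0 * t)
        x = x + dt * np.array([x[1], b0 * u + d])
        y = x[0]

        # Extended state observer: estimates the state and the total disturbance z[2].
        e = y - z[0]
        z = z + dt * np.array([z[1] + b1 * e,
                               z[2] + b2 * e + b0 * u,
                               b3 * e])

        # State feedback plus disturbance compensation.
        u = (kp * (r - z[0]) - kd * z[1] - z[2]) / b0
        log.append((t, y, z[2]))
    return np.array(log)

if __name__ == "__main__":
    hist = simulate_adrc()
    print("final attitude:", hist[-1, 1])
```

    In this parameterization the observer bandwidth trades estimation speed against noise sensitivity, which is why it is a natural target for the colony search described in the record.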

  5. Real-Time Emulation of Heterogeneous Wireless Networks with End-to-Edge Quality of Service Guarantees: The AROMA Testbed

    Directory of Open Access Journals (Sweden)

    Anna Umbert

    2010-01-01

    Full Text Available This work presents and describes the real-time testbed for all-IP Beyond 3G (B3G) heterogeneous wireless networks that has been developed in the framework of the European IST AROMA project. The main objective of the AROMA testbed is to provide a highly accurate and realistic framework where the performance of algorithms, policies, protocols, services, and applications for a complete heterogeneous wireless network can be fully assessed and evaluated before bringing them to a real system. The complexity of the interaction between all-IP B3G systems and user applications, while dealing with the Quality of Service (QoS) concept, motivates the development of this kind of emulation platform, where different solutions can be tested in realistic conditions that could not be achieved by means of simple offline simulations. This work provides an in-depth description of the AROMA testbed, emphasizing many interesting implementation details and lessons learned during the development of the tool that may prove helpful to other researchers and system engineers in the development of similar emulation platforms. Several case studies are also presented in order to illustrate the full potential and capabilities of the presented emulation platform.

  6. Coordinate-Based Clustering Method for Indoor Fingerprinting Localization in Dense Cluttered Environments

    Directory of Open Access Journals (Sweden)

    Wen Liu

    2016-12-01

    Full Text Available Indoor positioning technologies have boomed recently because of the growing commercial interest in indoor location-based service (ILBS). Due to the absence of satellite signal in Global Navigation Satellite System (GNSS), various technologies have been proposed for indoor applications. Among them, Wi-Fi fingerprinting has been attracting much interest from researchers because of its pervasive deployment, flexibility and robustness to dense cluttered indoor environments. One challenge, however, is the deployment of Access Points (AP), which would bring a significant influence on the system positioning accuracy. This paper concentrates on WLAN based fingerprinting indoor location by analyzing the AP deployment influence, and studying the advantages of coordinate-based clustering compared to traditional RSS-based clustering. A coordinate-based clustering method for indoor fingerprinting location, named Smallest-Enclosing-Circle-based (SEC), is then proposed aiming at reducing the positioning error lying in the AP deployment and improving robustness to dense cluttered environments. All measurements are conducted in indoor public areas, such as the National Center For the Performing Arts (as Test-bed 1) and the XiDan Joy City (Floors 1 and 2, as Test-bed 2), and results show that the SEC clustering algorithm can improve system positioning accuracy by about 32.7% for Test-bed 1, 71.7% for Test-bed 2 Floor 1 and 73.7% for Test-bed 2 Floor 2 compared with traditional RSS-based clustering algorithms such as K-means.

  7. Coordinate-Based Clustering Method for Indoor Fingerprinting Localization in Dense Cluttered Environments.

    Science.gov (United States)

    Liu, Wen; Fu, Xiao; Deng, Zhongliang

    2016-12-02

    Indoor positioning technologies have boomed recently because of the growing commercial interest in indoor location-based service (ILBS). Due to the absence of satellite signal in Global Navigation Satellite System (GNSS), various technologies have been proposed for indoor applications. Among them, Wi-Fi fingerprinting has been attracting much interest from researchers because of its pervasive deployment, flexibility and robustness to dense cluttered indoor environments. One challenge, however, is the deployment of Access Points (AP), which would bring a significant influence on the system positioning accuracy. This paper concentrates on WLAN based fingerprinting indoor location by analyzing the AP deployment influence, and studying the advantages of coordinate-based clustering compared to traditional RSS-based clustering. A coordinate-based clustering method for indoor fingerprinting location, named Smallest-Enclosing-Circle-based (SEC), is then proposed aiming at reducing the positioning error lying in the AP deployment and improving robustness to dense cluttered environments. All measurements are conducted in indoor public areas, such as the National Center For the Performing Arts (as Test-bed 1) and the XiDan Joy City (Floors 1 and 2, as Test-bed 2), and results show that the SEC clustering algorithm can improve system positioning accuracy by about 32.7% for Test-bed 1, 71.7% for Test-bed 2 Floor 1 and 73.7% for Test-bed 2 Floor 2 compared with traditional RSS-based clustering algorithms such as K-means.
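    Both versions of this record contrast clustering fingerprint reference points by their physical coordinates with conventional RSS-based clustering. The sketch below illustrates the general coordinate-based idea only: reference points are grouped by coordinates offline, and an online RSS sample is matched first to a cluster and then to its nearest fingerprint. It is a generic stand-in, not the SEC method; the smallest-enclosing-circle construction, all parameters, and the radio map are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(points, k, n_iter=50):
    """Tiny k-means used here to group reference points by (x, y) coordinates."""
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(np.linalg.norm(points[:, None] - centers[None], axis=2), axis=1)
        centers = np.array([points[labels == c].mean(axis=0) if np.any(labels == c)
                            else centers[c] for c in range(k)])
    return labels

# Hypothetical radio map: N reference points with coordinates and RSS from M access points.
N, M, K = 200, 6, 8
coords = rng.uniform(0, 50, size=(N, 2))
aps = rng.uniform(0, 50, size=(M, 2))
rss = -40 - 20 * np.log10(np.linalg.norm(coords[:, None] - aps[None], axis=2) + 1.0)
rss += rng.normal(0, 2, size=(N, M))

# Offline phase: cluster the reference points by physical coordinates (not by RSS).
labels = kmeans(coords, K)
cluster_rss = np.array([rss[labels == c].mean(axis=0) for c in range(K)])

def locate(sample_rss):
    """Coarse-to-fine lookup: pick the cluster whose mean RSS pattern is closest,
    then nearest-neighbour matching only inside that cluster."""
    c = int(np.argmin(np.linalg.norm(cluster_rss - sample_rss, axis=1)))
    idx = np.where(labels == c)[0]
    j = idx[np.argmin(np.linalg.norm(rss[idx] - sample_rss, axis=1))]
    return coords[j]

print(locate(rss[17]), "true position:", coords[17])
```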

  8. Vacuum Nuller Testbed Performance, Characterization and Null Control

    Science.gov (United States)

    Lyon, R. G.; Clampin, M.; Petrone, P.; Mallik, U.; Madison, T.; Bolcar, M.; Noecker, C.; Kendrick, S.; Helmbrecht, M. A.

    2011-01-01

    The Visible Nulling Coronagraph (VNC) can detect and characterize exoplanets with filled, segmented and sparse aperture telescopes, thereby spanning the choice of future internal coronagraph exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has developed a Vacuum Nuller Testbed (VNT) to advance this approach, and to assess and advance the technologies needed to realize a VNC as a flight instrument. The VNT is an ultra-stable testbed operating at 15 Hz in vacuum. It consists of a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror (DM), a coherent fiber bundle and achromatic phase shifters. The two output channels are imaged with a vacuum photon-counting camera and a conventional camera. Error sensing and feedback to the DM and delay line with control algorithms are implemented in a real-time architecture. The inherent advantage of the VNC is that it is its own interferometer and directly controls its errors by exploiting images from the bright and dark channels simultaneously. Conservation of energy requires the sum total of the photon counts be conserved independent of the VNC state. Thus the sensing and control bandwidth is limited by the target star's throughput, with the net effect that the higher bandwidth offloads stressing stability tolerances within the telescope. We report our recent progress with the VNT towards achieving an incremental sequence of contrast milestones of 10^8, 10^9 and 10^10, respectively, at inner working angles approaching 2λ/D. We discuss the optics, laboratory results, technologies, and null control, and show evidence that the milestones have been achieved.

  9. Algorithm to retrieve the melt pond fraction and the spectral albedo of Arctic summer ice from satellite optical data

    OpenAIRE

    Zege, E.; Malinka, A.; Katsev, I.; Prikhach, A.; Heygster, Georg; Istomina, L.; Birnbaum, Gerit; Schwarz, Pascal

    2015-01-01

    A new algorithm to retrieve characteristics (albedo and melt pond fraction) of summer ice in the Arctic from optical satellite data is described. In contrast to other algorithms, this algorithm does not use a priori values of the spectral albedo of the sea-ice constituents (such as melt ponds, white ice etc.). Instead, it is based on an analytical solution for the reflection from the sea ice surface. The algorithm includes the correction of the sought-for ice and ponds characteristics with...

  10. Optical Algorithms at Satellite Wavelengths for Total Suspended Matter in Tropical Coastal Waters

    Directory of Open Access Journals (Sweden)

    Alain Muñoz-Caravaca

    2008-07-01

    Full Text Available Is it possible to derive accurately Total Suspended Matter concentration or its proxy, turbidity, from remote sensing data in tropical coastal lagoon waters? To investigate this question, hyperspectral remote sensing reflectance, turbidity and chlorophyll pigment concentration were measured in three coral reef lagoons. The three sites enabled us to get data over very diverse environments: oligotrophic and sediment-poor waters in the southwest lagoon of New Caledonia, eutrophic waters in the Cienfuegos Bay (Cuba), and sediment-rich waters in the Laucala Bay (Fiji). In this paper, optical algorithms for turbidity are presented per site based on 113 stations in New Caledonia, 24 stations in Cuba and 56 stations in Fiji. Empirical algorithms are tested at satellite wavebands useful to coastal applications. Global algorithms are also derived for the merged data set (193 stations). The performances of global and local regression algorithms are compared. The best one-band algorithms on all the measurements are obtained at 681 nm using either a polynomial or a power model. The best two-band algorithms are obtained with R412/R620, R443/R670 and R510/R681. Two three-band algorithms based on Rrs620.Rrs681/Rrs412 and Rrs620.Rrs681/Rrs510 also give fair regression statistics. Finally, we propose a global algorithm based on one or three bands: turbidity is first calculated from Rrs681 and then, if < 1 FTU, it is recalculated using an algorithm based on Rrs620.Rrs681/Rrs412. On our data set, this algorithm is suitable for the 0.2-25 FTU turbidity range and for the three sites sampled (mean bias: 3.6 %, rms: 35%, mean quadratic error: 1.4 FTU). This shows that defining global empirical turbidity algorithms in tropical coastal waters is at reach.

  11. Optical Algorithms at Satellite Wavelengths for Total Suspended Matter in Tropical Coastal Waters

    Science.gov (United States)

    Ouillon, Sylvain; Douillet, Pascal; Petrenko, Anne; Neveux, Jacques; Dupouy, Cécile; Froidefond, Jean-Marie; Andréfouët, Serge; Muñoz-Caravaca, Alain

    2008-01-01

    Is it possible to derive accurately Total Suspended Matter concentration or its proxy, turbidity, from remote sensing data in tropical coastal lagoon waters? To investigate this question, hyperspectral remote sensing reflectance, turbidity and chlorophyll pigment concentration were measured in three coral reef lagoons. The three sites enabled us to get data over very diverse environments: oligotrophic and sediment-poor waters in the southwest lagoon of New Caledonia, eutrophic waters in the Cienfuegos Bay (Cuba), and sediment-rich waters in the Laucala Bay (Fiji). In this paper, optical algorithms for turbidity are presented per site based on 113 stations in New Caledonia, 24 stations in Cuba and 56 stations in Fiji. Empirical algorithms are tested at satellite wavebands useful to coastal applications. Global algorithms are also derived for the merged data set (193 stations). The performances of global and local regression algorithms are compared. The best one-band algorithms on all the measurements are obtained at 681 nm using either a polynomial or a power model. The best two-band algorithms are obtained with R412/R620, R443/R670 and R510/R681. Two three-band algorithms based on Rrs620.Rrs681/Rrs412 and Rrs620.Rrs681/Rrs510 also give fair regression statistics. Finally, we propose a global algorithm based on one or three bands: turbidity is first calculated from Rrs681 and then, if < 1 FTU, it is recalculated using an algorithm based on Rrs620.Rrs681/Rrs412. On our data set, this algorithm is suitable for the 0.2-25 FTU turbidity range and for the three sites sampled (mean bias: 3.6 %, rms: 35%, mean quadratic error: 1.4 FTU). This shows that defining global empirical turbidity algorithms in tropical coastal waters is at reach. PMID:27879929

  12. Optical Algorithms at Satellite Wavelengths for Total Suspended Matter in Tropical Coastal Waters.

    Science.gov (United States)

    Ouillon, Sylvain; Douillet, Pascal; Petrenko, Anne; Neveux, Jacques; Dupouy, Cécile; Froidefond, Jean-Marie; Andréfouët, Serge; Muñoz-Caravaca, Alain

    2008-07-10

    Is it possible to derive accurately Total Suspended Matter concentration or its proxy, turbidity, from remote sensing data in tropical coastal lagoon waters? To investigate this question, hyperspectral remote sensing reflectance, turbidity and chlorophyll pigment concentration were measured in three coral reef lagoons. The three sites enabled us to get data over very diverse environments: oligotrophic and sediment-poor waters in the southwest lagoon of New Caledonia, eutrophic waters in the Cienfuegos Bay (Cuba), and sediment-rich waters in the Laucala Bay (Fiji). In this paper, optical algorithms for turbidity are presented per site based on 113 stations in New Caledonia, 24 stations in Cuba and 56 stations in Fiji. Empirical algorithms are tested at satellite wavebands useful to coastal applications. Global algorithms are also derived for the merged data set (193 stations). The performances of global and local regression algorithms are compared. The best one-band algorithms on all the measurements are obtained at 681 nm using either a polynomial or a power model. The best two-band algorithms are obtained with R412/R620, R443/R670 and R510/R681. Two three-band algorithms based on Rrs620.Rrs681/Rrs412 and Rrs620.Rrs681/Rrs510 also give fair regression statistics. Finally, we propose a global algorithm based on one or three bands: turbidity is first calculated from Rrs681 and then, if < 1 FTU, it is recalculated using an algorithm based on Rrs620.Rrs681/Rrs412. On our data set, this algorithm is suitable for the 0.2-25 FTU turbidity range and for the three sites sampled (mean bias: 3.6 %, rms: 35%, mean quadratic error: 1.4 FTU). This shows that defining global empirical turbidity algorithms in tropical coastal waters is at reach.
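    All three versions of this record describe the same two-step global algorithm: turbidity is first estimated from Rrs681, and results below 1 FTU are re-estimated from the band combination Rrs620·Rrs681/Rrs412. A minimal sketch of that decision logic follows; the functional forms are only loosely based on the "polynomial or power model" wording, and the coefficients are placeholders rather than the fitted values reported in the paper.

```python
def turbidity_global(rrs412, rrs620, rrs681,
                     a1=1.0, b1=1.0,    # placeholder power-law fit at 681 nm
                     a2=1.0, b2=1.0):   # placeholder fit for the three-band ratio
    """Two-step global turbidity estimate (structure only, coefficients hypothetical).

    Step 1: power-law estimate from Rrs(681).
    Step 2: if the result is below 1 FTU, recompute from Rrs(620)*Rrs(681)/Rrs(412).
    """
    turbidity = a1 * rrs681 ** b1
    if turbidity < 1.0:                 # low-turbidity branch described in the record
        ratio = rrs620 * rrs681 / rrs412
        turbidity = a2 * ratio ** b2
    return turbidity

# Illustrative call with hypothetical reflectance values (sr^-1).
print(turbidity_global(rrs412=0.004, rrs620=0.003, rrs681=0.002))
```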

  13. A Novel UAV Electric Propulsion Testbed for Diagnostics and Prognostics

    Science.gov (United States)

    Gorospe, George E., Jr.; Kulkarni, Chetan S.

    2017-01-01

    This paper presents a novel hardware-in-the-loop (HIL) testbed for systems-level diagnostics and prognostics of an electric propulsion system used in UAVs (unmanned aerial vehicles). Referencing the all-electric Edge 540T aircraft used in science and research by NASA Langley Flight Research Center, the HIL testbed includes an identical propulsion system, consisting of motors, speed controllers and batteries. Isolated under a controlled laboratory environment, the propulsion system has been instrumented for advanced diagnostics and prognostics. To produce flight-like loading on the system, a slave motor is coupled to the motor under test (MUT) and provides variable mechanical resistance and the capability of introducing nondestructive, mechanical wear-like frictional loads on the system. This testbed enables the verification of mathematical models of each component of the propulsion system, the repeatable generation of flight-like loads on the system for fault analysis, test-to-failure scenarios, and the development of advanced system-level diagnostics and prognostics methods. The capabilities of the testbed are extended through the integration of a LabVIEW-based client for the Live Virtual Constructive Distributed Environment (LVCDC) Gateway, which enables both the publishing of generated data for remotely located observers and prognosers and the synchronization of the testbed propulsion system with vehicles in the air. The developed HIL testbed gives researchers easy access to a scientifically relevant portion of the aircraft without the overhead and dangers encountered during actual flight.

  14. Satellite remote sensing of harmful algal blooms: A new multi-algorithm method for detecting the Florida Red Tide (Karenia brevis)

    Science.gov (United States)

    Carvalho, Gustavo A.; Minnett, Peter J.; Fleming, Lora E.; Banzon, Viva F.; Baringer, Warner

    2010-01-01

    In a continuing effort to develop suitable methods for the surveillance of Harmful Algal Blooms (HABs) of Karenia brevis using satellite radiometers, a new multi-algorithm method was developed to explore whether improvements in the remote sensing detection of the Florida Red Tide was possible. A Hybrid Scheme was introduced that sequentially applies the optimized versions of two pre-existing satellite-based algorithms: an Empirical Approach (using water-leaving radiance as a function of chlorophyll concentration) and a Bio-optical Technique (using particulate backscatter along with chlorophyll concentration). The long-term evaluation of the new multi-algorithm method was performed using a multi-year MODIS dataset (2002 to 2006; during the boreal Summer-Fall periods – July to December) along the Central West Florida Shelf between 25.75°N and 28.25°N. Algorithm validation was done with in situ measurements of the abundances of K. brevis; cell counts ≥1.5×10^4 cells l^-1 defined a detectable HAB. Encouraging statistical results were derived when either or both algorithms correctly flagged known samples. The majority of the valid match-ups were correctly identified (~80% of both HABs and non-blooming conditions) and few false negatives or false positives were produced (~20% of each). Additionally, most of the HAB-positive identifications in the satellite data were indeed HAB samples (positive predictive value: ~70%) and those classified as HAB-negative were almost all non-bloom cases (negative predictive value: ~86%). These results demonstrate an excellent detection capability, on average ~10% more accurate than the individual algorithms used separately. Thus, the new Hybrid Scheme could become a powerful tool for environmental monitoring of K. brevis blooms, with valuable consequences including leading to the more rapid and efficient use of ships to make in situ measurements of HABs. PMID:21037979

  15. Estimation of the soil temperature from the AVHRR-NOAA satellite data applying split window algorithms

    International Nuclear Information System (INIS)

    Parra, J.C.; Acevedo, P.S.; Sobrino, J.A.; Morales, L.J.

    2006-01-01

    Four algorithms based on the split-window technique, which estimate the land surface temperature from data provided by the Advanced Very High Resolution Radiometer (AVHRR) on board the series of satellites of the National Oceanic and Atmospheric Administration (NOAA), are applied. These algorithms include corrections for atmospheric characteristics and for the emissivity of the different land surfaces. Fourteen AVHRR-NOAA images corresponding to the months of October 2003 and January 2004 were used. Simultaneously, measurements of soil temperature were collected at the Carillanca hydro-meteorological station in the Region of La Araucanía, Chile (38 deg 41 min S; 72 deg 25 min W). Of the algorithms used, the best results correspond to the model proposed by Sobrino and Raissouni (2000), with a mean and standard deviation of the difference between the soil temperature measured in situ and that estimated by the algorithm of -0.06 K and 2.11 K, respectively. (Author)
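    Split-window methods of this family estimate the surface temperature from the two AVHRR thermal channels plus emissivity and water-vapour corrections. The sketch below shows a generic Sobrino-type functional form only; the coefficients are placeholders, and none of the four specific formulations compared in the record are reproduced here.

```python
def split_window_lst(t4, t5, emissivity, water_vapour,
                     c0=0.0, c1=1.0, c2=0.3,     # placeholder coefficients
                     c3=50.0, c4=-8.0,
                     c5=-150.0, c6=16.0,
                     d_emissivity=0.0):
    """Generic split-window surface temperature estimate (coefficients hypothetical).

    t4, t5: brightness temperatures of AVHRR channels 4 and 5 [K]
    emissivity: mean surface emissivity of the two channels
    water_vapour: total column water vapour [g/cm^2]
    d_emissivity: channel emissivity difference
    """
    dt = t4 - t5
    return (t4 + c1 * dt + c2 * dt**2 + c0
            + (c3 + c4 * water_vapour) * (1.0 - emissivity)
            + (c5 + c6 * water_vapour) * d_emissivity)

# Illustrative call with hypothetical brightness temperatures.
print(split_window_lst(t4=295.0, t5=293.5, emissivity=0.97, water_vapour=2.0))
```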

  16. Incorporating Satellite Precipitation Estimates into a Radar-Gauge Multi-Sensor Precipitation Estimation Algorithm

    Directory of Open Access Journals (Sweden)

    Yuxiang He

    2018-01-01

    Full Text Available This paper presents a new and enhanced fusion module for the Multi-Sensor Precipitation Estimator (MPE) that would objectively blend real-time satellite quantitative precipitation estimates (SQPE) with radar and gauge estimates. This module consists of a preprocessor that mitigates systematic bias in SQPE, and a two-way blending routine that statistically fuses adjusted SQPE with radar estimates. The preprocessor not only corrects systematic bias in SQPE, but also improves the spatial distribution of precipitation based on SQPE and makes it closely resemble that of radar-based observations. The module uses a more sophisticated radar-satellite merging technique to blend the preprocessed datasets, and provides a better overall QPE product. The performance of the new satellite-radar-gauge blending module is assessed using independent rain gauge data over a five-year period between 2003 and 2007, and the assessment evaluates the accuracy of the newly developed satellite-radar-gauge (SRG) blended products versus that of radar-gauge products (which represent the MPE algorithm currently used in NWS (National Weather Service) operations) over two regions: (I) inside radar effective coverage and (II) immediately outside radar coverage. The outcomes of the evaluation indicate that (a) ingest of SQPE over areas within effective radar coverage improves the quality of QPE by mitigating the errors in radar estimates in region I; and (b) blending of radar, gauge, and satellite estimates over region II leads to a reduction of errors relative to bias-corrected SQPE. In addition, the new module alleviates the discontinuities along the boundaries of radar effective coverage otherwise seen when SQPE is used directly to fill the areas outside of effective radar coverage.
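    The module pairs a satellite bias correction with a statistical radar-satellite blend. Purely as an illustration of that two-step structure, the sketch below applies a multiplicative bias correction over the radar-covered domain and then an inverse-error-variance weighted combination, falling back to the adjusted satellite field outside radar coverage; the error variances are hypothetical, and the gauge handling and spatial-distribution adjustment of the operational MPE are not represented.

```python
import numpy as np

def blend_sqpe_radar(sqpe, radar, sqpe_err_var=4.0, radar_err_var=1.0, eps=1e-6):
    """Illustrative two-step fusion: bias-correct satellite QPE to the radar field,
    then combine the two with inverse-error-variance weights (values hypothetical)."""
    # Step 1: multiplicative bias correction over the domain where radar is valid.
    valid = ~np.isnan(radar)
    bias = (radar[valid].sum() + eps) / (sqpe[valid].sum() + eps)
    sqpe_adj = bias * sqpe

    # Step 2: inverse-error-variance weighted blend where both fields exist;
    # keep the adjusted satellite field outside radar coverage.
    w_r = 1.0 / radar_err_var
    w_s = 1.0 / sqpe_err_var
    blended = np.where(valid, (w_r * radar + w_s * sqpe_adj) / (w_r + w_s), sqpe_adj)
    return blended

radar = np.array([[1.0, 2.0], [np.nan, 0.5]])   # mm/h, NaN = outside radar coverage
sqpe = np.array([[1.5, 2.8], [3.0, 0.9]])       # mm/h satellite estimates
print(blend_sqpe_radar(sqpe, radar))
```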

  17. A Dual-Channel Acquisition Method Based on Extended Replica Folding Algorithm for Long Pseudo-Noise Code in Inter-Satellite Links.

    Science.gov (United States)

    Zhao, Hongbo; Chen, Yuying; Feng, Wenquan; Zhuang, Chen

    2018-05-25

    Inter-satellite links are an important component of the new generation of satellite navigation systems, characterized by low signal-to-noise ratio (SNR), complex electromagnetic interference and the short time slot of each satellite, which brings difficulties to the acquisition stage. The inter-satellite links in both the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS) adopt the long-code spread-spectrum system. However, long code acquisition is a difficult and time-consuming task due to the long code period. Traditional folding methods such as the extended replica folding acquisition search technique (XFAST) and direct average are largely restricted because of code Doppler and the additional SNR loss caused by replica folding. The dual folding method (DF-XFAST) and the dual-channel method have been proposed to achieve long code acquisition in low SNR and high dynamic situations, respectively, but the former is easily affected by code Doppler and the latter is not fast enough. Considering the environment of inter-satellite links and the problems of existing algorithms, this paper proposes a new long code acquisition algorithm named the dual-channel acquisition method based on the extended replica folding algorithm (DC-XFAST). This method employs dual channels for verification. Each channel contains an incoming signal block. Local code samples are folded and zero-padded to the length of the incoming signal block. After a circular FFT operation, the correlation results contain two peaks of the same magnitude and specified relative position. The detection process is eased through finding the two largest values. The verification takes all the full and partial peaks into account. Numerical results reveal that the DC-XFAST method can improve acquisition performance while acquisition speed is guaranteed. The method has a significantly higher acquisition probability than the folding methods XFAST and DF-XFAST. Moreover, with the advantage of higher detection
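    The core operation named here (folding the local code, zero-padding it to the block length, and correlating by circular FFT) can be illustrated in a few lines. The sketch below is a single-channel toy example on a short hypothetical code; the dual-channel verification logic, code Doppler handling and all actual DC-XFAST parameters are omitted. With these sizes the correlation does show two comparable peaks separated by the folded segment length, in line with the behaviour the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(1)

def fold_and_correlate(incoming, local_code, n_fold):
    """Fold the local code into n_fold summed segments, zero-pad to the length of
    the incoming block, and circularly correlate via FFT (illustrative sketch only)."""
    block_len = len(incoming)
    seg_len = len(local_code) // n_fold
    folded = local_code[:n_fold * seg_len].reshape(n_fold, seg_len).sum(axis=0)
    padded = np.zeros(block_len)
    padded[:seg_len] = folded        # zero-padding to the incoming block length
    corr = np.fft.ifft(np.fft.fft(incoming) * np.conj(np.fft.fft(padded)))
    return np.abs(corr)

# Hypothetical short example: a +/-1 "long" code and an incoming block cut from inside it.
long_code = rng.choice([-1.0, 1.0], size=16384)
incoming = long_code[3000:3000 + 4096] + rng.normal(0.0, 0.5, 4096)
corr = fold_and_correlate(incoming, long_code, n_fold=8)
top_two = sorted(int(i) for i in np.argsort(corr)[-2:])
print("two correlation peaks at lags:", top_two)   # separated by the segment length
```

    Because the folded replica collapses the code phase modulo the segment length, the peak position alone does not resolve the full code phase; the record's dual-channel verification step is what removes that ambiguity.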

  18. A Dual-Channel Acquisition Method Based on Extended Replica Folding Algorithm for Long Pseudo-Noise Code in Inter-Satellite Links

    Directory of Open Access Journals (Sweden)

    Hongbo Zhao

    2018-05-01

    Full Text Available Inter-satellite links are an important component of the new generation of satellite navigation systems, characterized by low signal-to-noise ratio (SNR), complex electromagnetic interference and the short time slot of each satellite, which brings difficulties to the acquisition stage. The inter-satellite links in both the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS) adopt the long-code spread-spectrum system. However, long code acquisition is a difficult and time-consuming task due to the long code period. Traditional folding methods such as the extended replica folding acquisition search technique (XFAST) and direct average are largely restricted because of code Doppler and the additional SNR loss caused by replica folding. The dual folding method (DF-XFAST) and the dual-channel method have been proposed to achieve long code acquisition in low SNR and high dynamic situations, respectively, but the former is easily affected by code Doppler and the latter is not fast enough. Considering the environment of inter-satellite links and the problems of existing algorithms, this paper proposes a new long code acquisition algorithm named the dual-channel acquisition method based on the extended replica folding algorithm (DC-XFAST). This method employs dual channels for verification. Each channel contains an incoming signal block. Local code samples are folded and zero-padded to the length of the incoming signal block. After a circular FFT operation, the correlation results contain two peaks of the same magnitude and specified relative position. The detection process is eased through finding the two largest values. The verification takes all the full and partial peaks into account. Numerical results reveal that the DC-XFAST method can improve acquisition performance while acquisition speed is guaranteed. The method has a significantly higher acquisition probability than the folding methods XFAST and DF-XFAST. Moreover, with the advantage of higher

  19. Fuzzy Information Retrieval Using Genetic Algorithms and Relevance Feedback.

    Science.gov (United States)

    Petry, Frederick E.; And Others

    1993-01-01

    Describes an approach that combines concepts from information retrieval, fuzzy set theory, and genetic programming to improve weighted Boolean query formulation via relevance feedback. Highlights include background on information retrieval systems; genetic algorithms; subproblem formulation; and preliminary results based on a testbed. (Contains 12…

  20. SSERVI Analog Regolith Simulant Testbed Facility

    Science.gov (United States)

    Minafra, Joseph; Schmidt, Gregory; Bailey, Brad; Gibbs, Kristina

    2016-10-01

    The Solar System Exploration Research Virtual Institute (SSERVI) at NASA's Ames Research Center in California's Silicon Valley was founded in 2013 to act as a virtual institute that provides interdisciplinary research centered on the goals of its supporting directorates: NASA Science Mission Directorate (SMD) and the Human Exploration & Operations Mission Directorate (HEOMD). Primary research goals of the Institute revolve around the integration of science and exploration to gain knowledge required for the future of human space exploration beyond low Earth orbit. SSERVI intends to leverage existing JSC-1A regolith simulant resources into the creation of a regolith simulant testbed facility. The purpose of this testbed concept is to provide the planetary exploration community with a readily available capability to test hardware and conduct research in a large simulant environment. SSERVI's goals include supporting planetary researchers within NASA, other government agencies, the private sector and hardware developers, competitors in focused prize design competitions, and academic sector researchers. SSERVI provides opportunities for research scientists and engineers to study the effects of regolith analog testbed research in the planetary exploration field. This capability is essential to help to understand the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin consisting of a 4 meter by 4 meter area, including dust mitigation and safety oversight. Facility hardware and environment testing scenarios could include: Lunar surface mobility, Dust exposure and mitigation, Regolith handling and excavation, Solar-like illumination, Lunar surface compaction profile, Lofted dust, Mechanical properties of lunar regolith, Surface features (i.e. grades and rocks). Numerous benefits vary from easy access to a controlled analog regolith simulant testbed, and

  1. An Algorithm to Generate Deep-Layer Temperatures from Microwave Satellite Observations for the Purpose of Monitoring Climate Change. Revised

    Science.gov (United States)

    Goldberg, Mitchell D.; Fleming, Henry E.

    1994-01-01

    An algorithm for generating deep-layer mean temperatures from satellite microwave observations is presented. Unlike traditional temperature retrieval methods, this algorithm does not require a first-guess temperature of the ambient atmosphere. By eliminating the first guess, a potential source of systematic error has been removed. The algorithm is expected to yield long-term records that are suitable for detecting small changes in climate. The atmospheric contribution to the deep-layer mean temperature is given by the averaging kernel. The algorithm computes the coefficients that will best approximate a desired averaging kernel from a linear combination of the satellite radiometer's weighting functions. The coefficients are then applied to the measurements to yield the deep-layer mean temperature. Three constraints were used in deriving the algorithm: (1) the sum of the coefficients must be one, (2) the noise of the product is minimized, and (3) the shape of the approximated averaging kernel is well-behaved. Note that a trade-off between constraints 2 and 3 is unavoidable. The algorithm can also be used to combine measurements from a future sensor (i.e., the 20-channel Advanced Microwave Sounding Unit (AMSU)) to yield the same averaging kernel as that based on an earlier sensor (i.e., the 4-channel Microwave Sounding Unit (MSU)). This will allow a time series of deep-layer mean temperatures based on MSU measurements to be continued with AMSU measurements. The AMSU is expected to replace the MSU in 1996.
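    The combination described here (coefficients that approximate a target averaging kernel from the channel weighting functions, with the coefficients summing to one and a penalty limiting noise amplification) can be posed as an equality-constrained least-squares problem. Below is a small illustrative solver on synthetic Gaussian weighting functions; the real MSU/AMSU weighting functions, instrument noise levels and the shape constraint used in the paper are not reproduced, and the noise penalty here is a simple ridge term.

```python
import numpy as np

def deep_layer_coefficients(W, target, noise_weight=0.1):
    """Find coefficients c minimizing ||W c - target||^2 + noise_weight*||c||^2
    subject to sum(c) = 1, via one Lagrange multiplier (KKT system).

    W: (n_levels, n_channels) matrix of channel weighting functions
    target: (n_levels,) desired averaging kernel
    """
    n_ch = W.shape[1]
    A = W.T @ W + noise_weight * np.eye(n_ch)    # regularized normal matrix
    b = W.T @ target
    ones = np.ones(n_ch)
    kkt = np.block([[A, ones[:, None]], [ones[None, :], np.zeros((1, 1))]])
    rhs = np.concatenate([b, [1.0]])
    sol = np.linalg.solve(kkt, rhs)
    return sol[:n_ch]

# Synthetic example: four Gaussian-shaped weighting functions on an arbitrary height grid.
levels = np.linspace(0.0, 30.0, 61)
centers = [2.0, 7.0, 12.0, 18.0]
W = np.stack([np.exp(-0.5 * ((levels - c) / 3.0) ** 2) for c in centers], axis=1)
W /= W.sum(axis=0, keepdims=True)
target = np.where((levels > 4) & (levels < 14), 1.0, 0.0)
target /= target.sum()

c = deep_layer_coefficients(W, target)
print("coefficients:", np.round(c, 3), "sum =", round(float(c.sum()), 6))
```

    Raising the noise weight suppresses large opposing coefficients (less noise amplification) at the cost of a worse fit to the target kernel, which is exactly the trade-off between constraints 2 and 3 noted in the abstract.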

  2. Wavefront control performance modeling with WFIRST shaped pupil coronagraph testbed

    Science.gov (United States)

    Zhou, Hanying; Nemati, Bijian; Krist, John; Cady, Eric; Kern, Brian; Poberezhskiy, Ilya

    2017-09-01

    NASA's WFIRST mission includes a coronagraph instrument (CGI) for direct imaging of exoplanets. Significant improvement in CGI model fidelity has been made recently, alongside a testbed high-contrast demonstration in a simulated dynamic environment at JPL. We present our modeling method and the results of comparisons to the testbed's high-order wavefront correction performance for the shaped pupil coronagraph. Agreement between model prediction and testbed result at better than a factor of 2 has been consistently achieved in raw contrast (contrast floor, chromaticity, and convergence), and with that comes good agreement in contrast sensitivity to wavefront perturbations and mask lateral shear.

  3. The Fourier-Kelvin Stellar Interferometer (FKSI) Nulling Testbed II: Closed-loop Path Length Metrology And Control Subsystem

    Science.gov (United States)

    Frey, B. J.; Barry, R. K.; Danchi, W. C.; Hyde, T. T.; Lee, K. Y.; Martino, A. J.; Zuray, M. S.

    2006-01-01

    The Fourier-Kelvin Stellar Interferometer (FKSI) is a mission concept for an imaging and nulling interferometer in the near- to mid-infrared spectral region (3-8 microns), and will be a scientific and technological pathfinder for upcoming missions including TPF-I/DARWIN, SPECS, and SPIRIT. At NASA's Goddard Space Flight Center, we have constructed a symmetric Mach-Zehnder nulling testbed to demonstrate techniques and algorithms that can be used to establish and maintain the 10^4 null depth that will be required for such a mission. Among the challenges inherent in such a system is the ability to acquire and track the null fringe to the desired depth for timescales on the order of hours in a laboratory environment. In addition, it is desirable to achieve this stability without using conventional dithering techniques. We describe recent testbed metrology and control system developments necessary to achieve these goals and present our preliminary results.

  4. Geostationary Communications Satellites as Sensors for the Space Weather Environment: Telemetry Event Identification Algorithms

    Science.gov (United States)

    Carlton, A.; Cahoy, K.

    2015-12-01

    Reliability of geostationary communication satellites (GEO ComSats) is critical to many industries worldwide. The space radiation environment poses a significant threat, and manufacturers and operators expend considerable effort to maintain reliability for users. Knowledge of the space radiation environment at the orbital location of a satellite is of critical importance for diagnosing and resolving issues resulting from space weather, for optimizing cost and reliability, and for space situational awareness. For decades, operators and manufacturers have collected large amounts of telemetry from geostationary (GEO) communications satellites to monitor system health and performance, yet these data are rarely mined for scientific purposes. The goal of this work is to acquire and analyze archived data from commercial operators using new algorithms that can detect when a space weather (or non-space weather) event of interest has occurred or is in progress. We have developed algorithms, collectively called SEER (System Event Evaluation Routine), to statistically analyze power amplifier current and temperature telemetry by identifying deviations from nominal operations or other events and trends of interest. This paper focuses on our work in progress, which currently includes methods for detection of jumps ("spikes", outliers) and step changes (changes in the local mean) in the telemetry. We then examine available space weather data from the NOAA GOES satellites and the NOAA-computed Kp index and sunspot numbers to see what role, if any, space weather might have played. By combining the results of the algorithm for many components, the spacecraft can be used as a "sensor" for the space radiation environment. Similar events occurring at one time across many component telemetry streams may be indicative of a space radiation event or system-wide health and safety concern. Using SEER on representative datasets of telemetry from Inmarsat and Intelsat, we find events that occur across all or many of
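    SEER itself is not described in implementation detail here, but the two detection tasks it names, short-lived spikes and step changes in the local mean of a telemetry stream, can be illustrated with generic techniques. The sketch below flags spikes against a rolling median with a robust scale estimate and flags step changes by comparing the means of adjacent windows; the data are synthetic and all window lengths and thresholds are hypothetical, not SEER's.

```python
import numpy as np

def detect_spikes(x, window=25, n_sigma=5.0):
    """Flag samples deviating strongly from a rolling median (simple outlier test)."""
    med = np.array([np.median(x[max(0, i - window):i + window + 1]) for i in range(len(x))])
    resid = x - med
    sigma = 1.4826 * np.median(np.abs(resid)) + 1e-12   # robust scale estimate (MAD)
    return np.where(np.abs(resid) > n_sigma * sigma)[0]

def detect_steps(x, window=50, threshold=1.0):
    """Flag points where the trailing-window mean differs from the leading-window mean."""
    steps = []
    for i in range(window, len(x) - window):
        if abs(x[i:i + window].mean() - x[i - window:i].mean()) > threshold:
            steps.append(i)
    return np.array(steps)

# Synthetic "power amplifier current" trace with one injected spike and one level shift.
rng = np.random.default_rng(2)
telemetry = np.concatenate([rng.normal(10.0, 0.2, 500), rng.normal(12.0, 0.2, 500)])
telemetry[200] += 3.0
print("spike indices:", detect_spikes(telemetry))
print("first step-change candidate near index:", detect_steps(telemetry)[0])
```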

  5. Satellite teleradiology test bed for digital mammography

    Science.gov (United States)

    Barnett, Bruce G.; Dudding, Kathryn E.; Abdel-Malek, Aiman A.; Mitchell, Robert J.

    1996-05-01

    Teleradiology offers significant improvement in efficiency and patient compliance over current practices in traditional film/screen-based diagnosis. The increasing number of women who need to be screened for breast cancer, including those in remote rural regions, makes the advantages of teleradiology especially attractive for digital mammography. At the same time, the size and resolution of digital mammograms are among the most challenging to support in a cost-effective teleradiology system. This paper describes a teleradiology architecture developed for use with digital mammography by GE Corporate Research and Development in collaboration with Massachusetts General Hospital under National Cancer Institute (NCI/NIH) grant number R01 CA60246-01. The testbed architecture is based on the Digital Imaging and Communications in Medicine (DICOM) standard, created by the American College of Radiology and the National Electrical Manufacturers Association. The testbed uses several Sun workstations running SunOS, which emulate a rural examination facility connected to a central diagnostic facility, and uses a TCP-based DICOM application to transfer images over a satellite link. Network performance depends on the product of the bandwidth and the round-trip time. A satellite link has a round trip of 513 milliseconds, making the bandwidth-delay product a significant problem. This type of high-bandwidth, high-delay network is called a Long Fat Network, or LFN. The goal of this project was to quantify the performance of the satellite link and evaluate the effectiveness of TCP over an LFN. Four workstations have Sun's HSI/S (High Speed Interface) option. Two are connected by a cable, and two are connected through a satellite link. Both interfaces have the same T1 bandwidth (1.544 Megabits per second). The only difference was the round-trip time. Even with large window buffers, the time to transfer a file over the satellite link was significantly longer, due to the bandwidth-delay product. To
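    The problem the record attributes to the long fat network is easy to quantify from the numbers it quotes: at the T1 rate and a 513 ms round trip, the link holds roughly 97 KB of data in flight, so any TCP window smaller than that (including the classic 64 KB limit without window scaling) caps throughput below the line rate. The short sketch below just reproduces that arithmetic; the window sizes compared are illustrative.

```python
# Bandwidth-delay product for the satellite link described in the record.
bandwidth_bps = 1.544e6    # T1 rate, bits per second
rtt_s = 0.513              # satellite round-trip time, seconds

bdp_bytes = bandwidth_bps * rtt_s / 8.0
print(f"bandwidth-delay product: {bdp_bytes / 1024:.1f} KB")   # roughly 97 KB in flight

# Throughput ceiling imposed by a TCP window smaller than the BDP (illustrative sizes).
for window_kb in (64, 128, 256):
    window_bytes = window_kb * 1024
    ceiling_bps = min(bandwidth_bps, 8.0 * window_bytes / rtt_s)
    print(f"{window_kb:4d} KB window -> at most {ceiling_bps / 1e6:.2f} Mb/s")
```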

  6. Satellite constellation design and radio resource management using genetic algorithm.

    OpenAIRE

    Asvial, Muhamad.

    2003-01-01

    A novel strategy for automatic satellite constellation design with satellite diversity is proposed. Automatic satellite constellation design means that several parameters of the constellation design can be determined simultaneously. The total number of satellites, the satellite altitude, the angle between planes, the angle shift between satellites and the inclination angle are considered for automatic satellite constellation design. Satellite constellation design is modelled using a mult...

  7. Long-term analysis of aerosol optical depth over Northeast Asia using a satellite-based measurement: MI Yonsei Aerosol Retrieval Algorithm (YAER)

    Science.gov (United States)

    Kim, Mijin; Kim, Jhoon; Yoon, Jongmin; Chung, Chu-Yong; Chung, Sung-Rae

    2017-04-01

    In 2010, the Korean geostationary earth orbit (GEO) satellite, the Communication, Ocean, and Meteorological Satellite (COMS), was launched carrying the Meteorological Imager (MI). The MI measures atmospheric conditions over Northeast Asia (NEA) using a single visible channel centered at 0.675 μm and four IR channels at 3.75, 6.75, 10.8, and 12.0 μm. The visible measurement can also be utilized for the retrieval of aerosol optical properties (AOPs). Since GEO satellite measurements have the advantage of continuous monitoring of AOPs, we can analyze the spatiotemporal variation of aerosol over NEA using the MI observations. We therefore developed an algorithm to retrieve aerosol optical depth (AOD) from the MI visible observations, named the MI Yonsei Aerosol Retrieval Algorithm (YAER). In this study, we investigated the accuracy of the MI YAER AOD by comparing the values with the long-term products of AERONET sun photometers. The results showed that the MI AODs were significantly overestimated relative to the AERONET values over bright surfaces in low-AOD cases. Because the MI visible channel is centered in the red spectral range, the contribution of the aerosol signal to the measured reflectance is relatively low compared with the surface contribution. Therefore, the AOD error in low-AOD cases over bright surfaces can be a fundamental limitation of the algorithm. Meanwhile, the assumed background aerosol optical depth (BAOD) can also contribute to the retrieval uncertainty. To estimate the surface reflectance while taking the polluted air conditions over NEA into account, we estimated the BAOD pixel by pixel from the MODIS dark target (DT) aerosol products. The satellite-based AOD retrieval, however, largely depends on the accuracy of the surface reflectance estimation, especially in low-AOD cases, and thus the BAOD could include the uncertainty in the surface reflectance estimation of the satellite-based retrieval. Therefore, we re-estimated the BAOD using the ground-based sun-photometer measurement, and

  8. Towards standard testbeds for numerical relativity

    International Nuclear Information System (INIS)

    Alcubierre, Miguel; Allen, Gabrielle; Bona, Carles; Fiske, David; Goodale, Tom; Guzman, F Siddhartha; Hawke, Ian; Hawley, Scott H; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David; Salgado, Marcelo; Schnetter, Erik; Seidel, Edward; Shinkai, Hisa-aki; Shoemaker, Deirdre; Szilagyi, Bela; Takahashi, Ryoji; Winicour, Jeff

    2004-01-01

    In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations that are designed to both probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve the numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources and can be used with many different approaches used in the relativity community

  9. Towards standard testbeds for numerical relativity

    Energy Technology Data Exchange (ETDEWEB)

    Alcubierre, Miguel [Inst. de Ciencias Nucleares, Univ. Nacional Autonoma de Mexico, Apartado Postal 70-543, Mexico Distrito Federal 04510 (Mexico); Allen, Gabrielle; Goodale, Tom; Guzman, F Siddhartha; Hawke, Ian; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Institut, 14476 Golm (Germany); Bona, Carles [Departament de Fisica, Universitat de les Illes Balears, Ctra de Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Fiske, David [Dept. of Physics, Univ. of Maryland, College Park, MD 20742-4111 (United States); Hawley, Scott H [Center for Relativity, Univ. of Texas at Austin, Austin, Texas 78712 (United States); Salgado, Marcelo [Inst. de Ciencias Nucleares, Univ. Nacional Autonoma de Mexico, Apartado Postal 70-543, Mexico Distrito Federal 04510 (Mexico); Schnetter, Erik [Inst. fuer Astronomie und Astrophysik, Universitaet Tuebingen, 72076 Tuebingen (Germany); Seidel, Edward [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Inst., 14476 Golm (Germany); Shinkai, Hisa-aki [Computational Science Div., Inst. of Physical and Chemical Research (RIKEN), Hirosawa 2-1, Wako, Saitama 351-0198 (Japan); Shoemaker, Deirdre [Center for Radiophysics and Space Research, Cornell Univ., Ithaca, NY 14853 (United States); Szilagyi, Bela [Dept. of Physics and Astronomy, Univ. of Pittsburgh, Pittsburgh, PA 15260 (United States); Takahashi, Ryoji [Theoretical Astrophysics Center, Juliane Maries Vej 30, 2100 Copenhagen, (Denmark); Winicour, Jeff [Max-Planck-Inst. fuer Gravitationsphysik, Albert-Einstein-Institut, 14476 Golm (Germany)

    2004-01-21

    In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations that are designed to both probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve the numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources and can be used with many different approaches used in the relativity community.

  10. A Testbed For Validating the LHC Controls System Core Before Deployment

    CERN Document Server

    Nguyen Xuan, J

    2011-01-01

    Since the start-up of the LHC, it is crucial to carefully test core controls components before deploying them operationally. The Testbed of the CERN accelerator controls group was developed for this purpose. It contains different hardware (PPC, i386) running various operating systems (Linux and LynxOS) and core software components running on front-ends, communication middleware and client libraries. The Testbed first executes integration tests to verify that the components delivered by individual teams interoperate, and then system tests, which verify high-level, end-user functionality. It also verifies that different versions of components are compatible, which is vital, because not all parts of the operational LHC control system can be upgraded simultaneously. In addition, the Testbed can be used for performance and stress tests. Internally, the Testbed is driven by Atlassian Bamboo, a Continuous Integration server, which automatically builds and deploys new software versions into the Test...

  11. AMS San Diego Testbed - Calibration Data

    Data.gov (United States)

    Department of Transportation — The data in this repository were collected from the San Diego, California testbed, namely, I-15 from the interchange with SR-78 in the north to the interchange with...

  12. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    Directory of Open Access Journals (Sweden)

    Jared A. Frank

    2016-08-01

    Full Text Available Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.

  13. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds.

    Science.gov (United States)

    Frank, Jared A; Brill, Anthony; Kapila, Vikram

    2016-08-20

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.

  14. High Precision Testbed to Evaluate Ethernet Performance for In-Car Networks

    DEFF Research Database (Denmark)

    Revsbech, Kasper; Madsen, Tatiana Kozlova; Schiøler, Henrik

    2012-01-01

    Validating safety-critical real-time systems such as in-car networks often involves a model-based performance analysis of the network. An important issue in performing such an analysis is to provide precise model parameters that match the actual equipment. One way to obtain such parameters is to derive them from measurements of the equipment. In this work we describe the design of a testbed enabling active measurements on up to 1 Gb/s copper-based Ethernet switches. Using the testbed itself, we conduct a series of tests in which the precision of the testbed is estimated. We find a maximum error...

  15. Multi-agent robotic systems and applications for satellite missions

    Science.gov (United States)

    Nunes, Miguel A.

    A revolution in the space sector is happening. It is expected that in the next decade there will be more satellites launched than in the previous sixty years of space exploration. Major challenges are associated with this growth of space assets, such as the autonomy and management of large groups of satellites, in particular of small satellites. There are two main objectives for this work. First, a flexible and distributed software architecture is presented to expand the possibilities of spacecraft autonomy, and in particular autonomous motion in attitude and position. The approach taken is based on the concept of distributed software agents, also referred to as a multi-agent robotic system. Agents are defined as software programs that are social, reactive and proactive, acting autonomously to maximize the chances of achieving the set goals. Part of the work is to demonstrate that a multi-agent robotic system is a feasible approach for different problems of autonomy, such as satellite attitude determination and control and autonomous rendezvous and docking. The second main objective is to develop a method to optimize multi-satellite configurations in space, also known as satellite constellations. This automated method generates new optimal mega-constellation designs for Earth observations and fast revisit times on large ground areas. The optimal satellite constellation can be used by researchers as the baseline for new missions. The first contribution of this work is the development of a new multi-agent robotic system for distributing the attitude determination and control subsystem for HiakaSat. The multi-agent robotic system is implemented and tested on the satellite hardware-in-the-loop testbed that simulates a representative space environment. The results show that the newly proposed system for this particular case achieves an equivalent control performance when compared to the monolithic implementation. In terms of computational efficiency it is found that the multi

  16. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.

  17. Development of Liquid Propulsion Systems Testbed at MSFC

    Science.gov (United States)

    Alexander, Reginald; Nelson, Graham

    2016-01-01

    As NASA, the Department of Defense and the aerospace industry in general strive to develop capabilities to explore near-Earth, Cis-lunar and deep space, the need to create more cost-effective techniques of propulsion system design, manufacturing and test is imperative in the current budget-constrained environment. The physics of space exploration has not changed, but the manner in which systems are developed and certified needs to change if there is going to be any hope of designing and building the high performance liquid propulsion systems necessary to deliver crew and cargo to the further reaches of space. To further the objective of developing these systems, the Marshall Space Flight Center is currently in the process of formulating a Liquid Propulsion Systems testbed, which will enable rapid integration of components to be tested and assessed for performance in integrated systems. The manifestation of this testbed is a breadboard engine configuration (BBE) with facility support for consumables and/or other components as needed. The goal of the facility is to test NASA-developed elements, but it can also be used to test articles developed by other government agencies, industry or academia. Joint government/private partnership is likely the approach that will be required to enable efficient propulsion system development. MSFC has recently tested its own additively manufactured liquid hydrogen pump, injector, and valves in a BBE hot firing. It is rapidly building toward testing the pump and a new CH4 injector in the BBE configuration to demonstrate a 22,000 lbf, pump-fed LO2/LCH4 engine for the Mars lander or in-space transportation. The value of having this BBE testbed is that as components are developed they may be easily integrated in the testbed and tested. MSFC is striving to enhance its liquid propulsion system development capability. Rapid design, analysis, build and test will be critical to fielding the next high thrust rocket engine. With the maturity of the...

  18. Algorithmic Foundation of Spectral Rarefaction for Measuring Satellite Imagery Heterogeneity at Multiple Spatial Scales

    Science.gov (United States)

    Rocchini, Duccio

    2009-01-01

    Measuring heterogeneity in satellite imagery is an important task. Most measures of spectral diversity have been based on Shannon Information theory. However, this approach does not inherently address different scales, ranging from local (hereafter referred to as alpha diversity) to global scales (gamma diversity). The aim of this paper is to propose a method for measuring spectral heterogeneity at multiple scales based on rarefaction curves. An algorithmic solution of rarefaction applied to image pixel values (Digital Numbers, DNs) is provided and discussed. PMID:22389600
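    As an illustration of the approach described above, the following minimal Python sketch computes a rarefaction curve by repeatedly drawing random subsamples of image pixel values and counting the distinct Digital Numbers found in each draw; the function name, subsample sizes and synthetic scenes are illustrative assumptions, not taken from the paper.

      import numpy as np

      def spectral_rarefaction(dns, subsample_sizes, n_draws=100, seed=None):
          # Mean number of distinct Digital Numbers (DNs) found in random
          # subsamples of increasing size, i.e. a rarefaction curve.
          rng = np.random.default_rng(seed)
          dns = np.asarray(dns).ravel()
          curve = []
          for n in subsample_sizes:
              richness = [np.unique(rng.choice(dns, size=n, replace=False)).size
                          for _ in range(n_draws)]
              curve.append(np.mean(richness))
          return np.array(curve)

      # Illustrative use: alpha diversity of one scene vs. gamma diversity of a mosaic
      rng = np.random.default_rng(0)
      scene = rng.integers(0, 64, size=(100, 100))     # hypothetical 6-bit DNs
      mosaic = rng.integers(0, 256, size=(200, 200))   # hypothetical pooled scenes
      sizes = [10, 50, 100, 500, 1000]
      print(spectral_rarefaction(scene, sizes, seed=1))
      print(spectral_rarefaction(mosaic, sizes, seed=1))

    A curve that saturates quickly indicates lower spectral diversity at that scale; comparing a single scene (alpha) with a pooled mosaic (gamma) parallels the multi-scale reading proposed in the paper.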

  19. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    OpenAIRE

    Jared A. Frank; Anthony Brill; Vikram Kapila

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their em...

  20. Wireless Sensor Networks TestBed: ASNTbed

    CSIR Research Space (South Africa)

    Dludla, AG

    2013-05-01

    Full Text Available Wireless sensor networks (WSNs) have been used in different types of applications and deployed within various environments. Simulation tools are essential for studying WSNs, especially for exploring large-scale networks. However, WSN testbeds...

  1. Growth plan for an inspirational test-bed of smart textile services

    NARCIS (Netherlands)

    Wensveen, S.A.G.; Tomico, O.; Bhomer, ten M.; Kuusk, K.

    2015-01-01

    In this pictorial we visualize the growth plan for an inspirational test-bed of smart textile product service systems. The goal of the test-bed is to inspire and inform the Dutch creative industries of textile, interaction and service design to combine their strengths and share opportunities. The

  2. Development of a hardware-in-the-loop testbed to demonstrate multiple spacecraft operations in proximity

    Science.gov (United States)

    Eun, Youngho; Park, Sang-Young; Kim, Geuk-Nam

    2018-06-01

    This paper presents a new state-of-the-art ground-based hardware-in-the-loop test facility, which was developed to verify and demonstrate autonomous guidance, navigation, and control algorithms for space proximity operations and formation flying maneuvers. The test facility consists of two complete spaceflight simulators, an aluminum-based operational arena, and a set of infrared motion tracking cameras; thus, the testbed is capable of representing space activities under circumstances prevailing on the ground. The spaceflight simulators have a maximum of five degrees of freedom in a quasi-momentum-free environment, which is produced by a set of linear/hemispherical air-bearings and a horizontally leveled operational arena. The tracking system measures the real-time three-dimensional position and attitude to provide state variables to the agents. The design of the testbed is illustrated in detail for every element throughout the paper. The practical hardware characteristics of the active/passive measurement units and internal actuators are identified in detail from various perspectives. These experimental results support the successful development of the entire facility and enable us to implement and verify the spacecraft proximity operation strategy in the near future.

  3. The AMSR2 Satellite-based Microwave Snow Algorithm (SMSA) to estimate regional to global snow depth and snow water equivalent

    Science.gov (United States)

    Kelly, R. E. J.; Saberi, N.; Li, Q.

    2017-12-01

    With moderate to high spatial resolution observation approaches yet to be fully scoped and developed, the long-term satellite passive microwave record remains an important tool for cryosphere-climate diagnostics. A new satellite microwave remote sensing approach is described for estimating snow depth (SD) and snow water equivalent (SWE). The algorithm, called the Satellite-based Microwave Snow Algorithm (SMSA), uses Advanced Microwave Scanning Radiometer - 2 (AMSR2) observations aboard the Global Change Observation Mission - Water mission launched by the Japan Aerospace Exploration Agency in 2012. The approach is unique since it leverages observed brightness temperatures (Tb) with static ancillary data to parameterize a physically-based retrieval without requiring parameter constraints from in situ snow depth observations or historical snow depth climatology. After screening snow from non-snow surface targets (water bodies [including freeze/thaw state], rainfall, high altitude plateau regions [e.g. Tibetan plateau]), moderate and shallow snow depths are estimated by minimizing the difference between Dense Media Radiative Transfer model estimates (Tsang et al., 2000; Picard et al., 2011) and AMSR2 Tb observations to retrieve SWE and SD. Parameterization of the model combines a parsimonious snow grain size and density approach originally developed by Kelly et al. (2003). Evaluation of the SMSA performance is achieved using in situ snow depth data from a variety of standard and experimental data sources. Results presented from winter seasons 2012-13 to 2016-17 illustrate the improved performance of the new approach in comparison with the baseline AMSR2 algorithm estimates and approach the performance of the model assimilation-based approach of GlobSnow. Given the variation in estimation power of SWE by different land surface/climate models and selected satellite-derived passive microwave approaches, SMSA provides SWE estimates that are independent of real or near real...
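    The core retrieval step described above, minimizing the misfit between modelled and observed brightness temperatures, can be sketched as follows in Python; the forward model here is a crude placeholder standing in for the Dense Media Radiative Transfer model, and all numbers are illustrative assumptions rather than values from the paper.

      import numpy as np
      from scipy.optimize import minimize_scalar

      def tb_forward(snow_depth_m, grain_size_mm=1.0):
          # Placeholder forward model (NOT DMRT): deeper snow depresses the
          # 37 GHz brightness temperature more strongly than the 19 GHz one.
          tb19 = 255.0 - 5.0 * snow_depth_m * grain_size_mm
          tb37 = 250.0 - 40.0 * (1.0 - np.exp(-snow_depth_m * grain_size_mm))
          return np.array([tb19, tb37])

      def retrieve_snow_depth(tb_obs, grain_size_mm=1.0):
          # Cost-function retrieval: minimize the squared misfit between the
          # modelled and observed brightness temperatures.
          cost = lambda d: np.sum((tb_forward(d, grain_size_mm) - tb_obs) ** 2)
          res = minimize_scalar(cost, bounds=(0.0, 3.0), method="bounded")
          return res.x

      tb_obs = np.array([252.0, 230.0])   # hypothetical AMSR2-like observations [K]
      print(f"retrieved snow depth ~ {retrieve_snow_depth(tb_obs):.2f} m")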

  4. Satellite-Derived Bathymetry: Accuracy Assessment on Depths Derivation Algorithm for Shallow Water Area

    Science.gov (United States)

    Said, N. M.; Mahmud, M. R.; Hasan, R. C.

    2017-10-01

    Over the years, the acquisition technique for bathymetric data has evolved from shipborne platforms to airborne platforms and, presently, to space-borne acquisition. The extensive development of remote sensing technology has brought a new revolution to hydrographic surveying. Satellite-Derived Bathymetry (SDB), a space-borne acquisition technique which derives bathymetric data from high-resolution multispectral satellite imagery for various purposes, has recently been considered a promising new technology in the hydrographic surveying industry. Inspired by these latest developments, a comprehensive study was initiated by the National Hydrographic Centre (NHC) and Universiti Teknologi Malaysia (UTM) to analyse SDB as a means for shallow water area acquisition. By adopting an additional adjustment in the calibration stage, a marginal improvement was observed in the outcomes of both the Stumpf and Lyzenga algorithms, where the RMSE values for the derived (predicted) depths were 1.432 meters and 1.728 meters, respectively. This paper discusses in detail the findings from the study, especially the accuracy level and practicality of SDB over the tropical environmental setting in Malaysia.
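    For reference, the Stumpf algorithm mentioned above is a band-ratio method; the Python sketch below shows one common formulation, calibrated by a linear fit against reference depths. The reflectances are synthetic placeholders, and the constant n = 1000 and the linear calibration are assumptions of this illustration, not details taken from the study.

      import numpy as np

      def stumpf_ratio(blue, green, n=1000.0):
          # Log-ratio of two water-penetrating bands.
          return np.log(n * blue) / np.log(n * green)

      def calibrate_stumpf(blue, green, depths_ref):
          # Fit depth = m1 * ratio + m0 against reference (e.g. sonar) depths.
          x = stumpf_ratio(blue, green)
          m1, m0 = np.polyfit(x, depths_ref, 1)
          return m1, m0

      def predict_depth(blue, green, m1, m0):
          return m1 * stumpf_ratio(blue, green) + m0

      # Illustrative calibration/validation on synthetic reflectances
      rng = np.random.default_rng(0)
      depth_true = rng.uniform(1.0, 15.0, 200)
      green = 0.05 * np.exp(-0.08 * depth_true) + 0.01   # green attenuates faster
      blue = 0.05 * np.exp(-0.04 * depth_true) + 0.01    # blue penetrates deeper
      m1, m0 = calibrate_stumpf(blue, green, depth_true)
      rmse = np.sqrt(np.mean((predict_depth(blue, green, m1, m0) - depth_true) ** 2))
      print(f"RMSE on synthetic data: {rmse:.2f} m")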

  5. Bias correction of daily satellite precipitation data using genetic algorithm

    Science.gov (United States)

    Pratama, A. W.; Buono, A.; Hidayat, R.; Harsa, H.

    2018-05-01

    Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) was produced by blending the satellite-only Climate Hazards Group InfraRed Precipitation (CHIRP) with station observation data. The blending process was aimed at reducing the bias of CHIRP. However, biases of CHIRPS in statistical moments and quantile values were high during the wet season over Java Island. This paper presents a bias correction scheme to adjust the statistical moments of CHIRP using observed precipitation data. The scheme combines a Genetic Algorithm and a Nonlinear Power Transformation, and the results were evaluated for different seasons and different elevation levels. The experimental results revealed that the scheme robustly reduced the bias in variance, with around 100% reduction, and led to a reduction of the first and second quantile biases. However, the bias on the third quantile was only reduced during dry months. Across different elevation levels, the performance of the bias correction process differs significantly only in the skewness indicators.
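    A minimal sketch of the combination named above, a genetic algorithm searching the parameters of a nonlinear power transformation so that the transformed satellite series matches the observed mean and variance, is given below in Python; the series, population settings, and the choice of a simple mean/variance fitness are assumptions of this illustration, not the authors' exact implementation.

      import numpy as np

      rng = np.random.default_rng(1)

      def transform(x, a, b):
          # Nonlinear power transformation applied to the satellite rainfall series.
          return a * np.power(np.maximum(x, 0.0), b)

      def fitness(params, sat, obs):
          # Penalize mismatch of the first two statistical moments after transformation.
          a, b = params
          y = transform(sat, a, b)
          return (y.mean() - obs.mean()) ** 2 + (y.var() - obs.var()) ** 2

      def genetic_search(sat, obs, pop_size=50, generations=100):
          pop = np.column_stack([rng.uniform(0.1, 3.0, pop_size),
                                 rng.uniform(0.1, 3.0, pop_size)])
          for _ in range(generations):
              scores = np.array([fitness(p, sat, obs) for p in pop])
              parents = pop[np.argsort(scores)[: pop_size // 2]]           # selection
              children = parents + rng.normal(0.0, 0.05, parents.shape)    # mutation
              pop = np.vstack([parents, np.clip(children, 0.05, 5.0)])
          return pop[np.argmin([fitness(p, sat, obs) for p in pop])]

      sat = rng.gamma(2.0, 5.0, 730)   # hypothetical daily CHIRP-like series
      obs = rng.gamma(2.0, 8.0, 730)   # hypothetical gauge series
      a, b = genetic_search(sat, obs)
      print(f"fitted power transformation: a = {a:.2f}, b = {b:.2f}")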

  6. Orbit computation of the TELECOM-2D satellite with a Genetic Algorithm

    Science.gov (United States)

    Deleflie, Florent; Coulot, David; Vienne, Alain; Decosta, Romain; Richard, Pascal; Lasri, Mohammed Amjad

    2014-07-01

    In order to test a preliminary orbit determination method, we fit an orbit of the geostationary satellite TELECOM-2D, as if we did not know any a priori information on its trajectory. The method is based on a genetic algorithm coupled to an analytical propagator of the trajectory, which is used over a couple of days and which uses a whole set of altazimuthal data acquired by the tracking network made up of the two TAROT telescopes. The adjusted orbit is then compared to a numerical reference. The method is described, and the results are analyzed, as a step towards an operational method of preliminary orbit determination for uncatalogued objects.

  7. Wireless Testbed Bonsai

    Science.gov (United States)

    2006-02-01

    ...a wireless sensor device network, and a higher-tier multi-hop peer-to-peer 802.11b wireless network of about 200 Stargate nodes. Leading up to the full ExScal deployment, we conducted spatial scaling tests of our higher-tier protocols on a 7 × 7 grid of Stargate nodes with 45 m and 90 m separations, respectively, on W and its scaled version W̃. Description of the Kansei testbed: a Stargate is a single-board Linux-based computer [7]. It uses a...

  8. SCDU Testbed Automated In-Situ Alignment, Data Acquisition and Analysis

    Science.gov (United States)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.; hide

    2010-01-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily-parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.

  9. Integrating Simulated Physics and Device Virtualization in Control System Testbeds

    OpenAIRE

    Redwood , Owen; Reynolds , Jason; Burmester , Mike

    2016-01-01

    Malware and forensic analyses of embedded cyber-physical systems are tedious, manual processes that testbeds are commonly not designed to support. Additionally, attesting the physics impact of embedded cyber-physical system malware has no formal methodologies and is currently an art. This chapter describes a novel testbed design methodology that integrates virtualized embedded industrial control systems and physics simula...

  10. Spectrum and power allocation in cognitive multi-beam satellite communications with flexible satellite payloads

    Science.gov (United States)

    Liu, Zhihui; Wang, Haitao; Dong, Tao; Yin, Jie; Zhang, Tingting; Guo, Hui; Li, Dequan

    2018-02-01

    In this paper, the cognitive multi-beam satellite system, i.e., two satellite networks coexisting through underlay spectrum sharing, is studied, and a power and spectrum allocation method is employed for interference control and throughput maximization. Specifically, the multi-beam satellite with flexible payload reuses the authorized spectrum of the primary satellite, adjusting its transmission band as well as power for each beam to limit its interference on the primary satellite below the prescribed threshold and maximize its own achievable rate. This power and spectrum allocation problem is formulated as a mixed nonconvex programming problem. To solve it effectively, we first introduce the concept of signal to leakage plus noise ratio (SLNR) to decouple the multiple transmit power variables in both the objective and the constraint, and then propose a heuristic algorithm to assign spectrum sub-bands. After that, a stepwise plus slice-wise algorithm is proposed to implement the discrete power allocation. Finally, simulation results show that adopting cognitive technology can improve the spectrum efficiency of satellite communication.
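    The SLNR idea referred to above can be written down compactly; the Python sketch below computes a per-beam SLNR and performs a simple stepwise power increase subject to an aggregate interference constraint at the primary satellite. The channel gains, threshold and step size are illustrative assumptions, and the slice-wise part of the authors' algorithm is not reproduced here.

      import numpy as np

      def slnr(p, beam, gain, noise):
          # Signal-to-leakage-plus-noise ratio of one beam: useful power over the
          # power that beam leaks towards all other users, plus noise.
          signal = p[beam] * gain[beam, beam]
          leakage = p[beam] * (gain[beam, :].sum() - gain[beam, beam])
          return signal / (leakage + noise)

      def stepwise_power(gain, g_primary, i_threshold, p_max, step=0.01):
          # Raise each beam's power in small steps while the aggregate interference
          # seen by the primary satellite stays below the prescribed threshold.
          n = gain.shape[0]
          p = np.zeros(n)
          for beam in range(n):
              while p[beam] + step <= p_max:
                  p[beam] += step
                  if np.dot(p, g_primary) > i_threshold:   # interference constraint
                      p[beam] -= step
                      break
          return p

      rng = np.random.default_rng(2)
      gain = rng.uniform(0.1, 1.0, (4, 4)) + np.eye(4)   # hypothetical beam gains
      g_primary = rng.uniform(0.01, 0.1, 4)              # leakage towards the primary user
      p = stepwise_power(gain, g_primary, i_threshold=0.05, p_max=1.0)
      print("powers:", p, " SLNR of beam 0:", slnr(p, 0, gain, noise=0.01))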

  11. Design of aircraft cabin testbed for stress free air travel experiment

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    The paper presents an aircraft cabin testbed that is designed and built for the stress-free air travel experiment. The project is funded by the European Union with the aim of improving air travel comfort during long-haul flights. The testbed is used to test and validate the adaptive system that is capable...

  12. Improving the Regional Applicability of Satellite Precipitation Products by Ensemble Algorithm

    Directory of Open Access Journals (Sweden)

    Waseem Muhammad

    2018-04-01

    Full Text Available Satellite-based precipitation products (e.g., Integrated Multi-Satellite Retrievals for Global Precipitation Measurement (IMERG) and its predecessor, the Tropical Rainfall Measuring Mission (TRMM)) are a critical source of precipitation estimation, particularly for regions with sparse, or no, hydrometric networks. However, inconsistency in the performance of these products has been observed across climatically and topographically diverse regions, timescales, and precipitation intensities, and there is still room for improvement. Hence, using a projected ensemble algorithm, the regional precipitation estimate (RP) is introduced here. The RP concept is mainly based on regional performance weights derived from the Mean Square Error (MSE) and the precipitation estimates from the TRMM product, that is, TRMM 3B42 (TR), and the real-time (late) (IT) and research (post-real-time) (IR) products of IMERG. The overall results of the selected contingency table metrics (e.g., Probability of Detection (POD)) and statistical indices (e.g., Correlation Coefficient (CC)) indicated that the proposed RP product has an overall better potential to capture the gauge observations compared with the TR, IR, and IT in five different climatic regions of Pakistan from January 2015 to December 2016, at a diurnal time scale. The current study could be the first research providing preliminary feedback from Pakistan for global precipitation measurement researchers by highlighting the need for refinement in IMERG.
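    One plausible reading of the "regional performance weights derived from the MSE" is an inverse-MSE weighted ensemble, sketched below in Python; the weighting rule and the synthetic daily series are assumptions of this illustration rather than the exact formulation of the paper.

      import numpy as np

      def inverse_mse_weights(products, gauge):
          # Weights proportional to 1/MSE of each product against the gauge data.
          mse = np.array([np.mean((p - gauge) ** 2) for p in products])
          w = 1.0 / mse
          return w / w.sum()

      def regional_estimate(products, weights):
          # Weighted ensemble of the satellite precipitation products.
          return np.tensordot(weights, np.stack(products), axes=1)

      rng = np.random.default_rng(3)
      gauge = rng.gamma(2.0, 4.0, 365)            # hypothetical daily gauge series
      tr = gauge + rng.normal(0.0, 3.0, 365)      # TRMM 3B42-like product (TR)
      it = gauge + rng.normal(0.0, 5.0, 365)      # IMERG real-time-like product (IT)
      ir = gauge + rng.normal(0.0, 2.0, 365)      # IMERG research-like product (IR)
      w = inverse_mse_weights([tr, it, ir], gauge)
      rp = regional_estimate([tr, it, ir], w)
      print("weights:", w, " RP MSE:", np.mean((rp - gauge) ** 2))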

  13. The remote sensing of ocean primary productivity - Use of a new data compilation to test satellite algorithms

    Science.gov (United States)

    Balch, William; Evans, Robert; Brown, Jim; Feldman, Gene; Mcclain, Charles; Esaias, Wayne

    1992-01-01

    Global pigment and primary productivity algorithms based on a new data compilation of over 12,000 stations occupied mostly in the Northern Hemisphere, from the late 1950s to 1988, were tested. The results showed high variability of the fraction of total pigment contributed by chlorophyll, which is required for subsequent predictions of primary productivity. Two models, which predict pigment concentration normalized to an attenuation length of euphotic depth, were checked against 2,800 vertical profiles of pigments. Phaeopigments consistently showed maxima at about one optical depth below the chlorophyll maxima. CZCS data coincident with the sea truth data were also checked. A regression of satellite-derived pigment vs ship-derived pigment had a coefficient of determination. The satellite underestimated the true pigment concentration in mesotrophic and oligotrophic waters and overestimated the pigment concentration in eutrophic waters. The error in the satellite estimate showed no trends with time between 1978 and 1986.

  14. Versatile Electric Propulsion Aircraft Testbed, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — An all-electric aircraft testbed is proposed to provide a dedicated development environment for the rigorous study and advancement of electrically powered aircraft....

  15. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    Science.gov (United States)

    Frank, Jared A.; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464

  16. Satellite Attitude Control System Simulator

    Directory of Open Access Journals (Sweden)

    G.T. Conti

    2008-01-01

    Full Text Available Future space missions will involve satellites with great autonomy and stringent pointing precision, requiring Attitude Control Systems (ACS) with better performance than before, which is a function of the control algorithms implemented on the on-board computers. The difficulty in developing experimental ACS tests is obtaining the zero-gravity and torque-free conditions similar to those in which the ACS operates in space. However, prototypes for experimental verification of control algorithms are fundamental for space mission success. This paper presents the estimation of parameters such as the inertia matrix and the position of the centre of mass of a Satellite Attitude Control System Simulator (SACSS), using algorithms based on least squares regression and recursive least squares methods. Simulations have shown that both methods estimate the system parameters with small error. However, the recursive least squares method has performance more adequate for the SACSS objectives. The SACSS platform model will be used for experimental verification of fundamental aspects of satellite attitude dynamics and for the design of different attitude control algorithms.
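    The recursive least squares method mentioned above follows a standard update; the Python sketch below shows that update on a generic linear-in-parameters regression, which stands in for the torque/acceleration relations used to estimate the inertia matrix and centre of mass, neither of which is modelled here.

      import numpy as np

      class RecursiveLeastSquares:
          # Standard RLS update: theta <- theta + k * (y - phi @ theta).
          def __init__(self, n_params, p0=1e3):
              self.theta = np.zeros(n_params)
              self.P = np.eye(n_params) * p0

          def update(self, phi, y):
              phi = np.asarray(phi, dtype=float)
              k = self.P @ phi / (1.0 + phi @ self.P @ phi)    # gain vector
              self.theta = self.theta + k * (y - phi @ self.theta)
              self.P = self.P - np.outer(k, phi @ self.P)
              return self.theta

      # Illustrative: recover two parameters of y = theta1*u1 + theta2*u2 + noise,
      # a stand-in for regressions of the form torque = inertia * angular acceleration.
      rng = np.random.default_rng(4)
      true_theta = np.array([2.5, -1.2])
      rls = RecursiveLeastSquares(2)
      for _ in range(500):
          phi = rng.normal(size=2)
          y = phi @ true_theta + rng.normal(scale=0.05)
          est = rls.update(phi, y)
      print("estimated parameters:", est)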

  17. Data Distribution Service-Based Interoperability Framework for Smart Grid Testbed Infrastructure

    Directory of Open Access Journals (Sweden)

    Tarek A. Youssef

    2016-03-01

    Full Text Available This paper presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurement and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves the communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamic participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).

  18. Performance Evaluation of Machine Learning Algorithms for Urban Pattern Recognition from Multi-spectral Satellite Images

    Directory of Open Access Journals (Sweden)

    Marc Wieland

    2014-03-01

    Full Text Available In this study, a classification and performance evaluation framework for the recognition of urban patterns in medium (Landsat ETM, TM and MSS) and very high resolution (WorldView-2, Quickbird, Ikonos) multi-spectral satellite images is presented. The study aims at exploring the potential of machine learning algorithms in the context of an object-based image analysis and at thoroughly testing the algorithms' performance under varying conditions to optimize their usage for urban pattern recognition tasks. Four classification algorithms, Normal Bayes, K Nearest Neighbors, Random Trees and Support Vector Machines, which represent different concepts in machine learning (probabilistic, nearest neighbor, tree-based, function-based), have been selected and implemented on a free and open-source basis. Particular focus is given to assessing the generalization ability of machine learning algorithms and the transferability of trained learning machines between different image types and image scenes. Moreover, the influence of the number and choice of training data, the influence of the size and composition of the feature vector and the effect of image segmentation on the classification accuracy is evaluated.
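    A compact way to reproduce such a comparison is sketched below, with scikit-learn estimators standing in for the free and open-source implementations used in the study; the synthetic per-object features and the cross-validation setup are assumptions of this illustration.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.naive_bayes import GaussianNB
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.svm import SVC

      # Synthetic stand-in for per-object spectral/texture features and urban classes
      X, y = make_classification(n_samples=1000, n_features=12, n_informative=8,
                                 n_classes=4, n_clusters_per_class=1, random_state=0)

      classifiers = {
          "Normal Bayes (probabilistic)": GaussianNB(),
          "K Nearest Neighbors": KNeighborsClassifier(n_neighbors=5),
          "Random Trees (tree-based)": RandomForestClassifier(n_estimators=100),
          "Support Vector Machine": SVC(kernel="rbf", C=1.0),
      }

      for name, clf in classifiers.items():
          scores = cross_val_score(clf, X, y, cv=5)
          print(f"{name:32s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")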

  19. Mini-mast CSI testbed user's guide

    Science.gov (United States)

    Tanner, Sharon E.; Pappa, Richard S.; Sulla, Jeffrey L.; Elliott, Kenny B.; Miserentino, Robert; Bailey, James P.; Cooper, Paul A.; Williams, Boyd L., Jr.; Bruner, Anne M.

    1992-01-01

    The Mini-Mast testbed is a 20 m generic truss highly representative of future deployable trusses for space applications. It is fully instrumented for system identification and active vibration control experiments and is used as a ground testbed at NASA-Langley. The facility has actuators and feedback sensors linked via fiber optic cables to the Advanced Real Time Simulation (ARTS) system, where user defined control laws are incorporated into generic controls software. The object of the facility is to conduct comprehensive active vibration control experiments on a dynamically realistic large space structure. A primary goal is to understand the practical effects of simplifying theoretical assumptions. This User's Guide describes the hardware and its primary components, the dynamic characteristics of the test article, the control law implementation process, and the necessary safeguards employed to protect the test article. Suggestions for a strawman controls experiment are also included.

  20. High-contrast imager for Complex Aperture Telescopes (HiCAT): testbed design and coronagraph developments

    Science.gov (United States)

    N'Diaye, Mamadou; Choquet, E.; Pueyo, L.; Elliot, E.; Perrin, M. D.; Wallace, J.; Anderson, R. E.; Carlotti, A.; Groff, T. D.; Hartig, G. F.; Kasdin, J.; Lajoie, C.; Levecq, O.; Long, C.; Macintosh, B.; Mawet, D.; Norman, C. A.; Shaklan, S.; Sheckells, M.; Sivaramakrishnan, A.; Soummer, R.

    2014-01-01

    We present a new high-contrast imaging testbed designed to provide complete solutions for wavefront sensing and control and starlight suppression with complex aperture telescopes (NASA APRA; Soummer PI). This includes geometries with central obstruction, support structures, and/or primary mirror segmentation. Complex aperture telescopes are often associated with large telescope designs, which are considered for future space missions. However, these designs make high-contrast imaging challenging because of additional diffraction features in the point spread function. We present a novel optimization approach for the testbed optical and opto-mechanical design that minimizes the impact of both phase and amplitude errors from the wave propagation of testbed optics surface errors. This design approach allows us to define the specification for the bench optics, which we then compare to the manufactured parts. We discuss the testbed alignment and first results. We also present our coronagraph design for different testbed pupil shapes (AFTA or ATLAST), which involves a new method for the optimization of Apodized Pupil Lyot Coronagraphs (APLC).

  1. Advanced Oil Spill Detection Algorithms For Satellite Based Maritime Environment Monitoring

    Science.gov (United States)

    Radius, Andrea; Azevedo, Rui; Sapage, Tania; Carmo, Paulo

    2013-12-01

    In recent years, the increasing occurrence of pollution and the alarming deterioration of the environmental health of the sea have led to the need for global monitoring capabilities, namely for marine environment management in terms of oil spill detection and indication of the suspected polluter. The sensitivity of Synthetic Aperture Radar (SAR) to the different phenomena on the sea, especially for oil spill and vessel detection, makes it a key instrument for global pollution monitoring. The SAR performance in maritime pollution monitoring is being operationally explored by a set of service providers on behalf of the European Maritime Safety Agency (EMSA), which launched the CleanSeaNet (CSN) project in 2007 - a pan-European satellite-based oil monitoring service. EDISOFT, which has been a service provider for CSN from the beginning, is continuously investing in R&D activities that will ultimately lead to better algorithms and better performance in oil spill detection from SAR imagery. This strategy is being pursued through EDISOFT participation in the FP7 EC Sea-U project and in the Automatic Oil Spill Detection (AOSD) ESA project. The Sea-U project aims to improve the current state of oil spill detection algorithms, through the maximization of informative content obtained with data fusion, the exploitation of different types of data/sensors and the development of advanced image processing, segmentation and classification techniques. The AOSD project is closely related to the operational segment, because it is focused on the automation of the oil spill detection processing chain, integrating auxiliary data, like wind information, together with image and geometry analysis techniques. The synergy between these different objectives (R&D versus operational) allowed EDISOFT to develop oil spill detection software that combines the operational automatic aspect, obtained through dedicated integration of the processing chain into the existing open-source NEST...

  2. Development of a smart-antenna test-bed, demonstrating software defined digital beamforming

    NARCIS (Netherlands)

    Kluwer, T.; Slump, Cornelis H.; Schiphorst, Roelof; Hoeksema, F.W.

    2001-01-01

    This paper describes a smart-antenna test-bed consisting of ‘commercial off-the-shelf’ (COTS) hardware and software defined radio components. The use of software radio components enables a flexible platform to implement and test mobile communication systems as a real-world system. The test-bed is...

  3. Development of a computationally efficient algorithm for attitude estimation of a remote sensing satellite

    Science.gov (United States)

    Labibian, Amir; Bahrami, Amir Hossein; Haghshenas, Javad

    2017-09-01

    This paper presents a computationally efficient algorithm for attitude estimation of a remote sensing satellite. In this study, a gyro, magnetometer, sun sensor and star tracker are used in an Extended Kalman Filter (EKF) structure for the purpose of Attitude Determination (AD). However, utilizing all of the measurement data simultaneously in the EKF structure increases the computational burden. Specifically, assuming n observation vectors, an inverse of a 3n×3n matrix is required for gain calculation. In order to solve this problem, an efficient version of the EKF, namely Murrell's version, is employed. This method utilizes measurements separately at each sampling time for gain computation. Therefore, an inverse of a 3n×3n matrix is replaced by an inverse of a 3×3 matrix for each measurement vector. Moreover, gyro drifts over time can reduce the pointing accuracy. Therefore, a calibration algorithm is utilized for estimation of the main gyro parameters.
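    The sequential (Murrell-style) measurement update described above can be sketched as follows in Python: each 3-vector observation is processed on its own, so only 3×3 matrices are inverted. The 6-state error model, measurement matrices and noise covariances are placeholders for illustration, not the paper's filter design.

      import numpy as np

      def sequential_update(x, P, observations):
          # Process each 3-vector measurement separately so the gain computation
          # needs a 3x3 inverse per sensor instead of one 3n x 3n inverse.
          for H, z, R in observations:           # H: 3xN, z: length-3, R: 3x3
              S = H @ P @ H.T + R                # 3x3 innovation covariance
              K = P @ H.T @ np.linalg.inv(S)     # Nx3 gain
              x = x + K @ (z - H @ x)
              P = (np.eye(len(x)) - K @ H) @ P
          return x, P

      # Illustrative 6-state filter (attitude error + gyro bias), two 3-vector sensors
      x = np.zeros(6)
      P = np.eye(6) * 0.1
      H_mag = np.hstack([np.eye(3), np.zeros((3, 3))])   # magnetometer placeholder
      H_sun = np.hstack([np.eye(3), np.zeros((3, 3))])   # sun sensor placeholder
      obs = [(H_mag, np.array([0.01, -0.02, 0.00]), np.eye(3) * 1e-3),
             (H_sun, np.array([0.00, 0.01, 0.02]), np.eye(3) * 1e-4)]
      x, P = sequential_update(x, P, obs)
      print("updated state:", x)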

  4. Closing the contrast gap between testbed and model prediction with WFIRST-CGI shaped pupil coronagraph

    Science.gov (United States)

    Zhou, Hanying; Nemati, Bijan; Krist, John; Cady, Eric; Prada, Camilo M.; Kern, Brian; Poberezhskiy, Ilya

    2016-07-01

    JPL has recently passed an important milestone in its technology development for a proposed NASA WFIRST mission coronagraph: demonstration of better than 1×10^-8 contrast over a broad bandwidth (10%) on both the shaped pupil coronagraph (SPC) and hybrid Lyot coronagraph (HLC) testbeds with the WFIRST obscuration pattern. Challenges remain, however, in the technology readiness for the proposed mission. One is the discrepancy between the achieved contrasts on the testbeds and their corresponding model predictions. A series of testbed diagnoses and modeling activities were planned and carried out on the SPC testbed in order to close the gap. A very useful tool we developed was a derived "measured" testbed wavefront control Jacobian matrix that could be compared with the model-predicted "control" version that was used to generate the high contrast dark hole region in the image plane. The difference between these two is an estimate of the error in the control Jacobian. When the control matrix, which includes both amplitude and phase, was modified to reproduce the error, the simulated performance closely matched the SPC testbed behavior in both contrast floor and contrast convergence speed. This is a step closer toward model validation for high contrast coronagraphs. Further Jacobian analysis and modeling provided clues to the possible sources for the mismatch: DM misregistration and testbed optical wavefront error (WFE) and the deformable mirror (DM) setting for correcting this WFE. These analyses suggested that a high contrast coronagraph has a tight tolerance in the accuracy of its control Jacobian. Modifications to both the testbed control model as well as the prediction model are being implemented, and future work is discussed.

  5. Fine-tuning satellite-based rainfall estimates

    Science.gov (United States)

    Harsa, Hastuadi; Buono, Agus; Hidayat, Rahmat; Achyar, Jaumil; Noviati, Sri; Kurniawan, Roni; Praja, Alfan S.

    2018-05-01

    Rainfall datasets are available from various sources, including satellite estimates and ground observations. The locations of ground observations are sparsely scattered. Therefore, the use of satellite estimates is advantageous, because satellite estimates can provide data in places where ground observations are not present. However, in general, satellite estimate data contain bias, since they are products of algorithms that transform the sensor response into rainfall values. Another cause may come from the number of ground observations used by the algorithms as the reference in determining the rainfall values. This paper describes the application of a bias correction method to modify a satellite-based dataset by adding a number of ground observation locations that have not been used before by the algorithm. The bias correction was performed by utilizing a Quantile Mapping procedure between ground observation data and satellite estimate data. Since Quantile Mapping requires the mean and standard deviation of both the reference data and the data being corrected, the Inverse Distance Weighting scheme was applied beforehand to the mean and standard deviation of the observation data in order to provide a spatial composition of them, which were originally scattered. Therefore, it was possible to provide a reference data point at the same location as that of the satellite estimates. The results show that the new dataset has a statistically better representation of the rainfall values recorded by the ground observations than the previous dataset.
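    A minimal Python sketch of the two ingredients described above, Inverse Distance Weighting of station statistics to a satellite grid point followed by a mean/standard-deviation (Gaussian) quantile mapping, is shown below; the station locations, statistics, and satellite series are synthetic assumptions, not data from the study.

      import numpy as np

      def idw(values, coords, target, power=2.0):
          # Inverse Distance Weighting of station statistics to a grid point.
          d = np.linalg.norm(coords - target, axis=1)
          w = 1.0 / np.maximum(d, 1e-6) ** power
          return np.sum(w * values) / np.sum(w)

      def quantile_map_normal(sat, mu_ref, sigma_ref):
          # With Gaussian marginals, quantile mapping reduces to rescaling the
          # satellite series to the reference mean and standard deviation.
          mu_sat, sigma_sat = sat.mean(), sat.std()
          return mu_ref + (sat - mu_sat) * (sigma_ref / sigma_sat)

      rng = np.random.default_rng(5)
      coords = rng.uniform(0.0, 1.0, (10, 2))        # hypothetical station locations
      station_mu = rng.uniform(4.0, 8.0, 10)         # station mean rainfall
      station_sigma = rng.uniform(2.0, 4.0, 10)      # station rainfall std. dev.
      grid_point = np.array([0.5, 0.5])

      mu_ref = idw(station_mu, coords, grid_point)
      sigma_ref = idw(station_sigma, coords, grid_point)
      sat_series = rng.gamma(2.0, 2.0, 365)          # hypothetical satellite series
      corrected = quantile_map_normal(sat_series, mu_ref, sigma_ref)
      print(corrected.mean(), corrected.std())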

  6. SeaWiFS Technical Report Series. Volume 42; Satellite Primary Productivity Data and Algorithm Development: A Science Plan for Mission to Planet Earth

    Science.gov (United States)

    Falkowski, Paul G.; Behrenfeld, Michael J.; Esaias, Wayne E.; Balch, William; Campbell, Janet W.; Iverson, Richard L.; Kiefer, Dale A.; Morel, Andre; Yoder, James A.; Hooker, Stanford B. (Editor); hide

    1998-01-01

    Two issues regarding primary productivity, as it pertains to the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Program and the National Aeronautics and Space Administration (NASA) Mission to Planet Earth (MTPE) are presented in this volume. Chapter 1 describes the development of a science plan for deriving primary production for the world ocean using satellite measurements, by the Ocean Primary Productivity Working Group (OPPWG). Chapter 2 presents discussions by the same group, of algorithm classification, algorithm parameterization and data availability, algorithm testing and validation, and the benefits of a consensus primary productivity algorithm.

  7. INFN Tier-1 Testbed Facility

    International Nuclear Information System (INIS)

    Gregori, Daniele; Cavalli, Alessandro; Dell'Agnello, Luca; Dal Pra, Stefano; Prosperini, Andrea; Ricci, Pierpaolo; Ronchieri, Elisabetta; Sapunenko, Vladimir

    2012-01-01

    INFN-CNAF, located in Bologna, is the Information Technology Center of the National Institute of Nuclear Physics (INFN). In the framework of the Worldwide LHC Computing Grid, INFN-CNAF is one of the eleven worldwide Tier-1 centers that store and reprocess Large Hadron Collider (LHC) data. The Italian Tier-1 provides the storage resources (i.e., disk space for short term needs and tapes for long term needs) and computing power that are needed for data processing and analysis by the LHC scientific community. Furthermore, INFN Tier-1 houses computing resources for other particle physics experiments, like CDF at Fermilab and SuperB at Frascati, as well as for astroparticle and space physics experiments. The computing center is a very complex infrastructure: the hardware layer includes the network, storage and farming areas, while the software layer includes open-source and proprietary software. Software updates and the addition of new hardware can unexpectedly degrade the production activity of the center; therefore, a testbed facility has been set up in order to reproduce and certify the various layers of the Tier-1. In this article we describe the testbed and the checks performed.

  8. Definition study for variable cycle engine testbed engine and associated test program

    Science.gov (United States)

    Vdoviak, J. W.

    1978-01-01

    The product/study double bypass variable cycle engine (VCE) was updated to incorporate recent improvements. The effect of these improvements on mission range and noise levels was determined. This engine design was then compared with current existing high-technology core engines in order to define a subscale testbed configuration that simulated many of the critical technology features of the product/study VCE. Detailed preliminary program plans were then developed for the design, fabrication, and static test of the selected testbed engine configuration. These plans included estimated costs and schedules for the detail design, fabrication and test of the testbed engine and the definition of a test program, test plan, schedule, instrumentation, and test stand requirements.

  9. University of Florida Advanced Technologies Campus Testbed

    Science.gov (United States)

    2017-09-21

    The University of Florida (UF) and its Transportation Institute (UFTI), the Florida Department of Transportation (FDOT) and the City of Gainesville (CoG) are cooperating to develop a smart transportation testbed on the University of Florida (UF) main...

  10. Autonomous Satellite Command and Control Through the World Wide Web. Phase 3

    Science.gov (United States)

    Cantwell, Brian; Twiggs, Robert

    1998-01-01

    The Automated Space System Experimental Testbed (ASSET) system is a simple yet comprehensive real-world operations network being developed. Phase 3 of the ASSET Project was January-December 1997 and is the subject of this report. This phase permitted SSDL and its project partners to expand the ASSET system in a variety of ways. These added capabilities included the advancement of ground station capabilities, the adaptation of spacecraft on-board software, and the expansion of capabilities of the ASSET management algorithms. Specific goals of Phase 3 were: (1) Extend Web-based goal-level commanding for both the payload PI and the spacecraft engineer. (2) Support prioritized handling of multiple Principal Investigators (PIs) as well as associated payload experimenters. (3) Expand the number and types of experiments supported by the ASSET system and its associated spacecraft. (4) Implement more advanced resource management, modeling and fault management capabilities that integrate the space and ground segments of the space system hardware. (5) Implement a beacon monitoring test. (6) Implement an experimental blackboard controller for space system management. (7) Further define typical ground station developments required for Internet-based remote control and for full system automation of the PI-to-spacecraft link. Each of those goals is examined. Significant sections of this report were also published as a conference paper. Several publications produced in support of this grant are included as attachments. Titles include: 1) Experimental Initiatives in Space System Operations; 2) The ASSET Client Interface: Balancing High Level Specification with Low Level Control; 3) Specifying Spacecraft Operations At The Product/Service Level; 4) The Design of a Highly Configurable, Reusable Operating System for Testbed Satellites; 5) Automated Health Operations For The Sapphire Spacecraft; 6) Engineering Data Summaries for Space Missions; and 7) Experiments In Automated Health...

  11. Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis

    Science.gov (United States)

    Chiou, Jin-Chern

    1990-01-01

    Computational procedures for kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAE's) viewpoint. Constraint violations during the time integration process are minimized, and penalty constraint stabilization techniques and partitioning schemes are developed. The governing equations of motion are treated with a two-stage staggered explicit-implicit numerical algorithm, which takes advantage of a partitioned solution procedure. A robust and parallelizable integration algorithm is developed. This algorithm uses a two-stage staggered central difference algorithm to integrate the translational coordinates and the angular velocities. The angular orientations of bodies in MBD systems are then obtained by using an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computational procedures, parallel implementation of the present constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm was efficiently carried out. The DAE's and the constraint treatment techniques were transformed into arrowhead matrices from which the Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient numerical algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers. This testbed was used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speed-up of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.
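    The preconditioned conjugate gradient solver named above is standard; a Jacobi-preconditioned version is sketched below in Python on a generic symmetric positive definite system standing in for the Schur-complement matrix, whose actual assembly from the arrowhead form is not reproduced here.

      import numpy as np

      def preconditioned_cg(A, b, M_inv_diag, tol=1e-10, max_iter=500):
          # Jacobi-preconditioned conjugate gradient for SPD systems A x = b.
          x = np.zeros_like(b)
          r = b - A @ x
          z = M_inv_diag * r
          p = z.copy()
          rz = r @ z
          for _ in range(max_iter):
              Ap = A @ p
              alpha = rz / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              if np.linalg.norm(r) < tol:
                  break
              z = M_inv_diag * r
              rz_new = r @ z
              p = z + (rz_new / rz) * p
              rz = rz_new
          return x

      # Illustrative SPD system standing in for a Schur-complement matrix
      rng = np.random.default_rng(6)
      B = rng.normal(size=(50, 50))
      A = B @ B.T + 50.0 * np.eye(50)
      b = rng.normal(size=50)
      x = preconditioned_cg(A, b, 1.0 / np.diag(A))
      print("residual norm:", np.linalg.norm(A @ x - b))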

  12. NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware

    Science.gov (United States)

    Johnson, V. L.; Teuben, P. J.; Penprase, B. E.

    An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods which can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antenna Galaxies. The testbed framework is available as open-source to assist other researchers and educators. Recommendations are made for testbed enhancements.

  13. Optical Network Testbeds Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to wide-spread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan, are creating optical networking technologies that allow for new, powerful communication services. Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking...

  14. Context-aware local Intrusion Detection in SCADA systems : a testbed and two showcases

    NARCIS (Netherlands)

    Chromik, Justyna Joanna; Haverkort, Boudewijn R.H.M.; Remke, Anne Katharina Ingrid; Pilch, Carina; Brackmann, Pascal; Duhme, Christof; Everinghoff, Franziska; Giberlein, Artur; Teodorowicz, Thomas; Wieland, Julian

    2017-01-01

    This paper illustrates the use of a testbed that we have developed for context-aware local intrusion detection. This testbed is based on the co-simulation framework Mosaik and allows for the validation of local intrusion detection mechanisms at field stations in power distribution networks. For two

  15. JPSS CGS Tools For Rapid Algorithm Updates

    Science.gov (United States)

    Smith, D. C.; Grant, K. D.

    2011-12-01

    The National Oceanic and Atmospheric Administration (NOAA) and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation civilian weather and environmental satellite system: the Joint Polar Satellite System (JPSS). JPSS will contribute the afternoon orbit component and ground processing system of the restructured National Polar-orbiting Operational Environmental Satellite System (NPOESS). As such, JPSS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the ground processing component of both POES and the Defense Meteorological Satellite Program (DMSP) replacement known as the Defense Weather Satellite System (DWSS), managed by the Department of Defense (DoD). The JPSS satellites will carry a suite of sensors designed to collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground processing system for JPSS is known as the JPSS Common Ground System (JPSS CGS), and consists of a Command, Control, and Communications Segment (C3S) and the Interface Data Processing Segment (IDPS). Both are developed by Raytheon Intelligence and Information Systems (IIS). The Interface Data Processing Segment will process NPOESS Preparatory Project, Joint Polar Satellite System and Defense Weather Satellite System satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. Under NPOESS, Northrop Grumman Aerospace Systems Algorithms and Data Products (A&DP) organization was responsible for the algorithms that produce the EDRs, including their quality aspects. For JPSS, that responsibility has transferred to NOAA's Center for Satellite Applications & Research (STAR). As the Calibration and Validation (Cal/Val) activities move forward following both the NPP launch and subsequent JPSS and DWSS launches, rapid algorithm updates may be required. Raytheon and

  16. Validation of Cloud Parameters Derived from Geostationary Satellites, AVHRR, MODIS, and VIIRS Using SatCORPS Algorithms

    Science.gov (United States)

    Minnis, P.; Sun-Mack, S.; Bedka, K. M.; Yost, C. R.; Trepte, Q. Z.; Smith, W. L., Jr.; Painemal, D.; Chen, Y.; Palikonda, R.; Dong, X.; hide

    2016-01-01

    Validation is a key component of remote sensing that can take many different forms. The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) is applied to many different imager datasets including those from the geostationary satellites, Meteosat, Himawari-8, INSAT-3D, GOES, and MTSAT, as well as from the low-Earth orbiting satellite imagers, MODIS, AVHRR, and VIIRS. While each of these imagers has similar sets of channels with wavelengths near 0.65, 3.7, 11, and 12 micrometers, many differences among them can lead to discrepancies in the retrievals. These differences include spatial resolution, spectral response functions, viewing conditions, and calibrations, among others. Even when analyzed with nearly identical algorithms, it is necessary, because of those discrepancies, to validate the results from each imager separately in order to assess the uncertainties in the individual parameters. This paper presents comparisons of various SatCORPS-retrieved cloud parameters with independent measurements and retrievals from a variety of instruments. These include surface and space-based lidar and radar data from CALIPSO and CloudSat, respectively, to assess the cloud fraction, height, base, optical depth, and ice water path; satellite and surface microwave radiometers to evaluate cloud liquid water path; surface-based radiometers to evaluate optical depth and effective particle size; and airborne in-situ data to evaluate ice water content, effective particle size, and other parameters. The results of these comparisons are contrasted and the factors influencing the differences are discussed.

  17. Cooperative Search with Autonomous Vehicles in a 3D Aquatic Testbed

    Science.gov (United States)

    2012-01-01

    Cooperative Search with Autonomous Vehicles in a 3D Aquatic Testbed. Authors: Matthew Keeter, Daniel Moore, Ryan Muller, Eric Nieters, Jennifer... Many applications for autonomous vehicles involve three-dimensional domains, notably aerial and aquatic environments. Such applications include mon...

  18. PEER Testbed Study on a Laboratory Building: Exercising Seismic Performance Assessment

    OpenAIRE

    Comerio, Mary C.; Stallmeyer, John C.; Smith, Ryan; Makris, Nicos; Konstantinidis, Dimitrios; Mosalam, Khalid; Lee, Tae-Hyung; Beck, James L.; Porter, Keith A.; Shaikhutdinov, Rustem; Hutchinson, Tara; Chaudhuri, Samit Ray; Chang, Stephanie E.; Falit-Baiamonte, Anthony; Holmes, William T.

    2005-01-01

    From 2002 to 2004 (years five and six of a ten-year funding cycle), the PEER Center organized the majority of its research around six testbeds. Two buildings and two bridges, a campus, and a transportation network were selected as case studies to “exercise” the PEER performance-based earthquake engineering methodology. All projects involved interdisciplinary teams of researchers, each producing data to be used by other colleagues in their research. The testbeds demonstrat...

  19. Easy as Pi: A Network Coding Raspberry Pi Testbed

    Directory of Open Access Journals (Sweden)

    Chres W. Sørensen

    2016-10-01

    Full Text Available In the near future, upcoming communications and storage networks are expected to cope with major difficulties produced by the huge amounts of data being generated by the Internet of Things (IoT). For these types of networks, strategies and mechanisms based on network coding have appeared as an alternative to overcome these difficulties in a holistic manner, e.g., without sacrificing the benefit of a given network metric when improving another. There have been recurring issues with: (i) making large-scale deployments akin to the Internet of Things; (ii) assessing and (iii) replicating the obtained results in preliminary studies. Therefore, finding testbeds that can deal with large-scale deployments and not lose historic data in order to evaluate these mechanisms is greatly needed and desirable from a research perspective. However, this can be hard to manage, not only due to the inherent costs of the hardware, but also due to maintenance challenges. In this paper, we present the required key steps to design, set up and maintain an inexpensive testbed using Raspberry Pi devices for communications and storage networks with network coding capabilities. This testbed can be utilized for any application requiring replicability of results.

  20. The Living With a Star Space Environment Testbed Program

    Science.gov (United States)

    Barth, Janet; LaBel, Kenneth; Day, John H. (Technical Monitor)

    2001-01-01

    NASA has initiated the Living with a Star (LWS) Program to develop the scientific understanding needed to address the aspects of the connected Sun-Earth system that affect life and society. The Program Architecture includes science missions, theory and modeling, and Space Environment Testbeds (SET). This paper discusses the Space Environment Testbeds. The goal of the SET program is to improve the engineering approach to accommodate and/or mitigate the effects of solar variability on spacecraft design and operations. The SET Program will infuse new technologies into the space programs through collection of data in space and subsequent design and validation of technologies. Examples of these technologies are cited and discussed.

  1. Diffraction-based analysis of tunnel size for a scaled external occulter testbed

    Science.gov (United States)

    Sirbu, Dan; Kasdin, N. Jeremy; Vanderbei, Robert J.

    2016-07-01

    For performance verification of an external occulter mask (also called a starshade), scaled testbeds have been developed to measure the suppression of the occulter shadow in the pupil plane and contrast in the image plane. For occulter experiments the scaling is typically performed by maintaining an equivalent Fresnel number. The original Princeton occulter testbed was oversized with respect to both input beam and shadow propagation to limit any diffraction effects due to finite testbed enclosure edges; however, to operate at realistic space-mission equivalent Fresnel numbers an extended testbed is currently under construction. With the longer propagation distances involved, diffraction effects due to the edge of the tunnel must now be considered in the experiment design. Here, we present a diffraction-based model of two separate tunnel effects. First, we consider the effect of tunnel-edge induced diffraction ringing upstream from the occulter mask. Second, we consider the diffraction effect due to clipping of the output shadow by the tunnel downstream from the occulter mask. These calculations are performed for a representative point design relevant to the new Princeton occulter experiment, but we also present an analytical relation that can be used for other propagation distances.
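
    As a rough illustration of the Fresnel-number scaling mentioned above, the sketch below computes the Fresnel number N = r^2 / (lambda * z) for a space configuration and solves for the laboratory propagation distance that preserves it. The numerical values (starshade radius, wavelength, distances, lab mask size) are illustrative placeholders, not the parameters of the Princeton experiment.

    ```python
    # Minimal sketch of Fresnel-number-equivalent scaling for an occulter testbed.
    # All numbers are assumed values for illustration only.

    def fresnel_number(radius_m, wavelength_m, distance_m):
        """Fresnel number N = r^2 / (lambda * z) for an occulter of radius r
        observed at distance z with wavelength lambda."""
        return radius_m ** 2 / (wavelength_m * distance_m)

    # Hypothetical space configuration: 17 m radius starshade at 40,000 km.
    N_space = fresnel_number(17.0, 600e-9, 4.0e7)

    # Choose a lab occulter radius and solve for the propagation distance that
    # preserves the same Fresnel number at the same wavelength.
    r_lab = 12.5e-3                      # 12.5 mm scaled mask (assumed)
    z_lab = r_lab ** 2 / (600e-9 * N_space)

    print(f"space Fresnel number: {N_space:.1f}")
    print(f"required lab distance for equal N: {z_lab:.1f} m")
    ```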

  2. An Enhanced Satellite-Based Algorithm for Detecting and Tracking Dust Outbreaks by Means of SEVIRI Data

    Directory of Open Access Journals (Sweden)

    Francesco Marchese

    2017-05-01

    Dust outbreaks are meteorological phenomena of great interest for scientists and authorities (because of their impact on the climate, environment, and human activities) which may be detected, monitored, and characterized from space using different methods and procedures. Among recent dust detection algorithms, the RSTDUST multi-temporal technique has provided good results in different geographic areas (e.g., the Mediterranean basin and the Arabian Peninsula), exhibiting a better performance than traditional split window methods, in spite of some limitations. In this study, we present an optimized configuration of this technique, which better exploits data provided by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) aboard Meteosat Second Generation (MSG) satellites to address those issues (e.g., sensitivity reduction over arid and semi-arid regions; dependence on some meteorological clouds). Three massive dust events affecting Europe and the Mediterranean basin in May 2008/2010 are analysed in this work, using information provided by some independent and well-established aerosol products to assess the achieved results. The study shows that the proposed algorithm, christened eRSTDUST (i.e., enhanced RSTDUST), which provides qualitative information about dust outbreaks, is capable of improving the trade-off between reliability and sensitivity. The results encourage further experimentation with this method in other periods of the year, also exploiting data provided by different satellite sensors, to better evaluate the advantages arising from the use of this dust detection technique in operational scenarios.

  3. Implementation of a virtual link between power system testbeds at Marshall Spaceflight Center and Lewis Research Center

    Science.gov (United States)

    Doreswamy, Rajiv

    1990-01-01

    The Marshall Space Flight Center (MSFC) owns and operates a space station module power management and distribution (SSM-PMAD) testbed. This system, managed by expert systems, is used to analyze and develop power system automation techniques for Space Station Freedom. The Lewis Research Center (LeRC), Cleveland, Ohio, has developed and implemented a space station electrical power system (EPS) testbed. This system and its power management controller are representative of the overall Space Station Freedom power system. A virtual link is being implemented between the testbeds at MSFC and LeRC. This link would enable configuration of SSM-PMAD as a load center for the EPS testbed at LeRC. This connection will add to the versatility of both systems, and provide an environment of enhanced realism for operation of both testbeds.

  4. Aerosol Retrievals from Proposed Satellite Bistatic Lidar Observations: Algorithm and Information Content

    Science.gov (United States)

    Alexandrov, M. D.; Mishchenko, M. I.

    2017-12-01

    Accurate aerosol retrievals from space remain quite challenging and typically involve solving a severely ill-posed inverse scattering problem. We have suggested addressing this ill-posedness by flying a bistatic lidar system. Such a system would consist of a formation-flying constellation of a primary satellite equipped with a conventional monostatic (backscattering) lidar and an additional platform hosting a receiver of the scattered laser light. If successfully implemented, this concept would combine the measurement capabilities of a passive multi-angle multi-spectral polarimeter with the vertical profiling capability of a lidar. Thus, bistatic lidar observations would be free of the deficiencies affecting both monostatic lidar measurements (caused by their highly limited information content) and passive photopolarimetric measurements (caused by vertical integration and surface reflection). We present a preliminary aerosol retrieval algorithm for a bistatic lidar system consisting of a high spectral resolution lidar (HSRL) and an additional receiver flown in formation with it at a scattering angle of 165 degrees. This algorithm was applied to synthetic data generated using Mie-theory computations. The model/retrieval parameters in our tests were the effective radius and variance of the aerosol size distribution, the complex refractive index of the particles, and their number concentration. Both mono- and bimodal aerosol mixtures were considered. Our algorithm allowed for definitive evaluation of error propagation from measurements to retrievals using a Monte Carlo technique, which involves random distortion of the observations and statistical characterization of the resulting retrieval errors. Our tests demonstrated that supplementing a conventional monostatic HSRL with an additional receiver dramatically increases the information content of the measurements and allows for a sufficiently accurate characterization of tropospheric aerosols.
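
    The Monte Carlo error-propagation idea described above (randomly distort the synthetic observations, repeat the retrieval, and characterize the spread of the results) can be sketched as follows. The forward model, noise level, and parameter values here are stand-ins for illustration, not the bistatic-lidar model used by the authors.

    ```python
    # Minimal sketch of Monte Carlo error propagation for a retrieval algorithm.
    # Forward model, noise level, and "true" parameters are assumed placeholders.
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)

    def forward_model(params, angles):
        """Stand-in forward model mapping aerosol parameters to an observed signal."""
        r_eff, n_conc = params
        return n_conc * np.exp(-angles / r_eff)

    def retrieve(obs, angles, first_guess):
        """Stand-in retrieval: nonlinear least squares fit of the forward model."""
        res = least_squares(lambda p: forward_model(p, angles) - obs, first_guess)
        return res.x

    angles = np.linspace(0.1, 2.0, 50)
    truth = np.array([0.8, 120.0])                # "true" effective radius, concentration
    clean = forward_model(truth, angles)

    retrievals = []
    for _ in range(200):                          # Monte Carlo trials
        noisy = clean * (1.0 + 0.02 * rng.standard_normal(clean.shape))  # 2% noise (assumed)
        retrievals.append(retrieve(noisy, angles, first_guess=np.array([1.0, 100.0])))

    retrievals = np.array(retrievals)
    print("mean retrieved:", retrievals.mean(axis=0))
    print("std (propagated error):", retrievals.std(axis=0))
    ```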

  5. Chlorophyll-a Estimation Around the Antarctica Peninsula Using Satellite Algorithms: Hints from Field Water Leaving Reflectance.

    Science.gov (United States)

    Zeng, Chen; Xu, Huiping; Fischer, Andrew M

    2016-12-07

    Ocean color remote sensing significantly contributes to our understanding of phytoplankton distribution, abundance, and primary productivity in the Southern Ocean (SO). However, the current SO in situ optical database is still insufficient and unevenly distributed. This limits the ability to produce robust and accurate measurements of satellite-based chlorophyll. Based on data collected on cruises around the Antarctica Peninsula (AP) in January 2014 and 2016, this research intends to enhance our knowledge of SO water and atmospheric optical characteristics and to address satellite algorithm deficiencies in ocean color products. We collected high-resolution in situ water leaving reflectance (±1 nm band resolution), simultaneous in situ chlorophyll-a concentrations, and satellite (MODIS and VIIRS) water leaving reflectance. Field samples show that clouds have a great impact on the visible green bands and are difficult to detect because NASA protocols apply the NIR band as a cloud contamination threshold. When compared to global case I water, water around the AP has lower water leaving reflectance and a narrower blue-green band ratio, which explains chlorophyll-a underestimation in high chlorophyll-a regions and overestimation in low chlorophyll-a regions. VIIRS shows higher spatial coverage and detection accuracy than MODIS. After coefficient improvement, VIIRS is able to predict chlorophyll-a with 53% accuracy.

  6. Chlorophyll-a Estimation Around the Antarctica Peninsula Using Satellite Algorithms: Hints from Field Water Leaving Reflectance

    Directory of Open Access Journals (Sweden)

    Chen Zeng

    2016-12-01

    Ocean color remote sensing significantly contributes to our understanding of phytoplankton distribution, abundance, and primary productivity in the Southern Ocean (SO). However, the current SO in situ optical database is still insufficient and unevenly distributed. This limits the ability to produce robust and accurate measurements of satellite-based chlorophyll. Based on data collected on cruises around the Antarctica Peninsula (AP) in January 2014 and 2016, this research intends to enhance our knowledge of SO water and atmospheric optical characteristics and to address satellite algorithm deficiencies in ocean color products. We collected high-resolution in situ water leaving reflectance (±1 nm band resolution), simultaneous in situ chlorophyll-a concentrations, and satellite (MODIS and VIIRS) water leaving reflectance. Field samples show that clouds have a great impact on the visible green bands and are difficult to detect because NASA protocols apply the NIR band as a cloud contamination threshold. When compared to global case I water, water around the AP has lower water leaving reflectance and a narrower blue-green band ratio, which explains chlorophyll-a underestimation in high chlorophyll-a regions and overestimation in low chlorophyll-a regions. VIIRS shows higher spatial coverage and detection accuracy than MODIS. After coefficient improvement, VIIRS is able to predict chlorophyll-a with 53% accuracy.

  7. Cognitive Medical Wireless Testbed System (COMWITS)

    Science.gov (United States)

    2016-11-01

    This testbed merges two ARO grants... Hardware includes a 64-bit Intel Xeon E5-1650v3 CPU (6C, 3.5 GHz, Turbo, HT, 15M, 140W), an Intel Core i7-3770 (3.4 GHz quad core, 77W), and dual Intel Xeon processors.

  8. A Method to Analyze Threats and Vulnerabilities by Using a Cyber Security Test-bed of an Operating NPP

    International Nuclear Information System (INIS)

    Kim, Yong Sik; Son, Choul Woong; Lee, Soo Ill

    2016-01-01

    In order to implement cyber security controls for an operating NPP, a security assessment should be conducted in advance, and it is essential to analyze threats and vulnerabilities for the cyber security risk assessment phase. It might be impossible to perform a penetration test or vulnerability scanning because the test may cause adverse effects on the inherent functions of those systems. This is the reason why we develop and construct a cyber security test-bed instead of using real I&C systems in the operating NPP. In this paper, we propose a method to analyze threats and vulnerabilities of a specific target system by using a cyber security test-bed. The test-bed is being developed considering essential functions of the selected safety and non-safety systems. In order to develop the cyber security test-bed with both safety and non-safety functions, test-bed functions analysis and preliminary threats and vulnerabilities identification have been conducted. We will determine the attack scenarios and conduct the test-bed based vulnerability analysis

  9. A Method to Analyze Threats and Vulnerabilities by Using a Cyber Security Test-bed of an Operating NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong Sik; Son, Choul Woong; Lee, Soo Ill [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    In order to implement cyber security controls for an operating NPP, a security assessment should be conducted in advance, and it is essential to analyze threats and vulnerabilities for the cyber security risk assessment phase. It might be impossible to perform a penetration test or vulnerability scanning because the test may cause adverse effects on the inherent functions of those systems. This is the reason why we develop and construct a cyber security test-bed instead of using real I&C systems in the operating NPP. In this paper, we propose a method to analyze threats and vulnerabilities of a specific target system by using a cyber security test-bed. The test-bed is being developed considering essential functions of the selected safety and non-safety systems. In order to develop the cyber security test-bed with both safety and non-safety functions, test-bed functions analysis and preliminary threats and vulnerabilities identification have been conducted. We will determine the attack scenarios and conduct the test-bed based vulnerability analysis.

  10. Vacuum Nuller Testbed (VNT) Performance, Characterization and Null Control: Progress Report

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-01-01

    Herein we report on the development, sensing and control, and our first results with the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented, and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable, vibration-isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9, and ideally 10^10 at an inner working angle of 2*lambda/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle, and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies, and the null sensing and control approach.

  11. Vacuum nuller testbed (VNT) performance, characterization and null control: progress report

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-10-01

    Herein we report on the development, sensing and control and our first results with the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9, and ideally 10^10 at an inner working angle of 2*λ/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies and the null sensing and control approach.

  12. A Mobile Satellite Experiment (MSAT-X) network definition

    Science.gov (United States)

    Wang, Charles C.; Yan, Tsun-Yee

    1990-01-01

    The network architecture development of the Mobile Satellite Experiment (MSAT-X) project for the past few years is described. The results and findings of the network research activities carried out under the MSAT-X project are summarized. A framework is presented upon which the Mobile Satellite Systems (MSSs) operator can design a commercial network. A sample network configuration and its capability are also included under the projected scenario. The Communication Interconnection aspect of the MSAT-X network is discussed. In the MSAT-X network structure two basic protocols are presented: the channel access protocol, and the link connection protocol. The error-control techniques used in the MSAT-X project and the packet structure are also discussed. A description of two testbeds developed for experimentally simulating the channel access protocol and link control protocol, respectively, is presented. A sample network configuration and some future network activities of the MSAT-X project are also presented.

  13. NPOESS Tools for Rapid Algorithm Updates

    Science.gov (United States)

    Route, G.; Grant, K. D.; Hughes, B.; Reed, B.

    2009-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system; the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes both NPP and NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. Northrop Grumman Aerospace Systems Algorithms and Data Products (A&DP) organization is responsible for the algorithms that produce the EDRs, including their quality aspects. As the Calibration and Validation activities move forward following both the NPP launch and subsequent NPOESS launches, rapid algorithm updates may be required. Raytheon and Northrop Grumman have developed tools and processes to enable changes to be evaluated, tested, and moved into the operational baseline in a rapid and efficient manner. This presentation will provide an overview of the tools available to the Cal/Val teams to ensure rapid and accurate assessment of algorithm changes, along with the processes in place to ensure baseline integrity.

  14. Visible nulling coronagraph testbed results

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Petrone, Peter; Madison, Timothy; Rizzo, Maxime; Melnick, Gary; Tolls, Volker

    2009-08-01

    We report on our recent laboratory results with the NASA/Goddard Space Flight Center (GSFC) Visible Nulling Coronagraph (VNC) testbed. We have experimentally achieved focal plane contrasts of 1 x 10^8 and approaching 10^9 at inner working angles of 2 * wavelength/D and 4 * wavelength/D respectively, where D is the aperture diameter. The result was obtained using a broadband source with a narrowband spectral filter of width 10 nm centered on 630 nm. To date this is the deepest nulling result with a visible nulling coronagraph yet obtained. Developed also is a Null Control Breadboard (NCB) to assess and quantify MEMS based segmented deformable mirror technology and develop and assess closed-loop null sensing and control algorithm performance from both the pupil and focal planes. We have demonstrated closed-loop control at 27 Hz in the laboratory environment. Efforts are underway to first bring the contrast to > 10^9 necessary for the direct detection and characterization of jovian (Jupiter-like) and then to > 10^10 necessary for terrestrial (Earth-like) exosolar planets. Short term advancements are expected to both broaden the spectral passband from 10 nm to 100 nm and to increase both the long-term stability to > 2 hours and the extent of the null out to ~ 10 * wavelength/D via the use of MEMS based segmented deformable mirror technology, a coherent fiber bundle, and achromatic phase shifters, all in a vacuum chamber at the GSFC VNC facility. Additionally, an extreme-stability, textbook-sized compact VNC is under development.

  15. Resource-Aware Data Fusion Algorithms for Wireless Sensor Networks

    CERN Document Server

    Abdelgawad, Ahmed

    2012-01-01

    This book introduces resource-aware data fusion algorithms to gather and combine data from multiple sources (e.g., sensors) in order to achieve inferences. These techniques can be used in centralized and distributed systems to overcome sensor failure, technological limitations, and spatial and temporal coverage problems. The algorithms described in this book are evaluated with simulation and experimental results to show they will maintain data integrity and make data useful and informative. Describes techniques to overcome real problems posed by wireless sensor networks deployed in circumstances that might interfere with the measurements provided, such as strong variations of pressure, temperature, radiation, and electromagnetic noise; uses simulation and experimental results to evaluate the algorithms presented and includes a real test-bed; includes a case study implementing data fusion algorithms on a remote monitoring framework for sand production in oil pipelines.

  16. Adaptation of an aerosol retrieval algorithm using multi-wavelength and multi-pixel information of satellites (MWPM) to GOSAT/TANSO-CAI

    Science.gov (United States)

    Hashimoto, M.; Takenaka, H.; Higurashi, A.; Nakajima, T.

    2017-12-01

    Aerosol in the atmosphere is an important constituent for determining the earth's radiation budget, so accurate aerosol retrievals from satellites are useful. We have developed a satellite remote sensing algorithm to retrieve aerosol optical properties using multi-wavelength and multi-pixel information from satellite imagers (MWPM). The method simultaneously derives aerosol optical properties, such as aerosol optical thickness (AOT), single scattering albedo (SSA), and aerosol size information, by using spectral (multi-wavelength) and spatial (multi-pixel) differences of surface reflectance. The method is useful for aerosol retrieval over spatially heterogeneous surfaces such as urban regions. In this algorithm, the inversion method is a combination of an optimal estimation method and a smoothing constraint on the state vector. Furthermore, the method has been combined with direct radiative transfer calculation (RTM), numerically solved at each iteration step of the non-linear inverse problem, without using a look-up table (LUT), under several constraints. However, this takes too much computation time. To accelerate the calculation, we replaced the RTM with an accelerated RTM solver learned by a neural network-based method, EXAM (Takenaka et al., 2011), using the Rstar code. The calculation time was then shortened to about one thousandth. We applied MWPM combined with EXAM to GOSAT/TANSO-CAI (Cloud and Aerosol Imager). CAI is a supplementary sensor of TANSO-FTS, dedicated to measuring cloud and aerosol properties. CAI has four bands, 380, 674, 870 and 1600 nm, and observes at 500 m resolution for bands 1, 2 and 3, and 1.5 km for band 4. Retrieved parameters are aerosol optical properties, such as the aerosol optical thickness (AOT) of fine and coarse mode particles at a wavelength of 500 nm, the volume soot fraction in fine mode particles, and the ground surface albedo at each observed wavelength, obtained by combining a minimum reflectance method and Fukuda et al. (2013). We will show
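
    The combination of a least-squares fit with a smoothness constraint across neighbouring pixels can be sketched as below. This is only a schematic of the multi-pixel idea: the forward model here is a trivial linear stand-in (not Rstar, EXAM, or any trained surrogate), and the dimensions and weights are assumed for illustration.

    ```python
    # Minimal sketch of a multi-pixel inversion with a smoothness constraint.
    # Forward model, dimensions, and regularisation weight are assumptions.
    import numpy as np

    n_pix, n_obs_per_pix = 8, 4
    rng = np.random.default_rng(1)

    # Stand-in linear forward model per pixel: y = A x (a real RTM would be nonlinear).
    A = rng.normal(size=(n_obs_per_pix, 1))
    x_true = np.sin(np.linspace(0, np.pi, n_pix))          # smoothly varying AOT field
    y = np.concatenate([A @ x_true[i:i + 1] for i in range(n_pix)])
    y += 0.01 * rng.standard_normal(y.shape)

    # Block-diagonal Jacobian and a first-difference operator for the smoothing term.
    J = np.zeros((n_pix * n_obs_per_pix, n_pix))
    for i in range(n_pix):
        J[i * n_obs_per_pix:(i + 1) * n_obs_per_pix, i] = A[:, 0]
    L = (np.eye(n_pix) - np.eye(n_pix, k=1))[:-1]           # penalises pixel-to-pixel jumps

    gamma = 0.5                                             # smoothing weight (assumed)
    # Regularised least squares: minimise |y - Jx|^2 + gamma |Lx|^2
    x_hat = np.linalg.solve(J.T @ J + gamma * L.T @ L, J.T @ y)
    print("retrieved AOT field:", np.round(x_hat, 3))
    ```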

  17. On-board attitude determination for the Explorer Platform satellite

    Science.gov (United States)

    Jayaraman, C.; Class, B.

    1992-01-01

    This paper describes the attitude determination algorithm for the Explorer Platform satellite. The algorithm, which is baselined on the Landsat code, is a six-element linear quadratic state estimation processor, in the form of a Kalman filter augmented by an adaptive filter process. Improvements to the original Landsat algorithm were required to meet mission pointing requirements. These consisted of a more efficient sensor processing algorithm and the addition of an adaptive filter which acts as a check on the Kalman filter during satellite slew maneuvers. A 1750A processor will be flown on board the satellite for the first time as a coprocessor (COP) in addition to the NASA Standard Spacecraft Computer. The attitude determination algorithm, which will be resident in the COP's memory, will make full use of its improved processing capabilities to meet mission requirements. Additional benefits were gained by writing the attitude determination code in Ada.
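
    A minimal sketch of the kind of Kalman filter update such an attitude determination processor performs is shown below. The 2-state single-axis model (attitude error plus gyro bias) and all noise values are assumptions for illustration, not the six-element Explorer Platform flight filter or its adaptive augmentation.

    ```python
    # Minimal sketch of a linear Kalman filter for single-axis attitude estimation.
    # State, dynamics, and noise covariances are illustrative placeholders.
    import numpy as np

    dt = 0.5
    F = np.array([[1.0, -dt],     # attitude error propagated with gyro bias
                  [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])    # attitude sensor measures the attitude error only
    Q = np.diag([1e-6, 1e-9])     # process noise (assumed)
    R = np.array([[1e-4]])        # measurement noise (assumed)

    x = np.zeros(2)               # [attitude error (rad), gyro bias (rad/s)]
    P = np.eye(2) * 1e-2

    def kalman_step(x, P, z):
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with sensor measurement z
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        return x, P

    for z in [np.array([0.012]), np.array([0.010]), np.array([0.011])]:
        x, P = kalman_step(x, P, z)
    print("estimated attitude error and gyro bias:", x)
    ```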

  18. Visible nulling coronagraphy testbed development for exoplanet detection

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Chen, Andrew; Petrone, Peter; Booth, Andrew; Madison, Timothy; Bolcar, Matthew; Noecker, M. Charley; Kendrick, Stephen; Melnick, Gary; Tolls, Volker

    2010-07-01

    Three of the recently completed NASA Astrophysics Strategic Mission Concept (ASMC) studies addressed the feasibility of using a Visible Nulling Coronagraph (VNC) as the prime instrument for exoplanet science. The VNC approach is one of the few approaches that works with filled, segmented and sparse or diluted aperture telescope systems and thus spans the space of potential ASMC exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies and has developed an incremental sequence of VNC testbeds to advance this approach and the technologies associated with it. Herein we report on the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable, vibration-isolated testbed that operates under high bandwidth closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible light nulling milestones of sequentially higher contrasts of 10^8, 10^9 and 10^10 at an inner working angle of 2*λ/D, ultimately culminating in spectrally broadband (>20%) high contrast imaging. Each of the milestones, one per year, is traceable to one or more of the ASMC studies. The VNT uses a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. Discussed will be the optical configuration, laboratory results, critical technologies and the null sensing and control approach.

  19. Comparison of satellite reflectance algorithms for estimating chlorophyll-a in a temperate reservoir using coincident hyperspectral aircraft imagery and dense coincident surface observations

    Science.gov (United States)

    We analyzed 10 established and 4 new satellite reflectance algorithms for estimating chlorophyll-a (Chl-a) in a temperate reservoir in southwest Ohio using coincident hyperspectral aircraft imagery and dense water truth collected within one hour of image acquisition to develop si...

  20. An overview of the U.S. Army Research Laboratory's Sensor Information Testbed for Collaborative Research Environment (SITCORE) and Automated Online Data Repository (AODR) capabilities

    Science.gov (United States)

    Ward, Dennis W.; Bennett, Kelly W.

    2017-05-01

    The Sensor Information Testbed COllaborative Research Environment (SITCORE) and the Automated Online Data Repository (AODR) are significant enablers of the U.S. Army Research Laboratory (ARL)'s Open Campus Initiative and together create a highly collaborative research laboratory and testbed environment focused on sensor data and information fusion. SITCORE creates a virtual research development environment allowing collaboration from other locations, including DoD, industry, academia, and coalition facilities. SITCORE combined with AODR provides end-to-end algorithm development, experimentation, demonstration, and validation. The AODR enterprise allows ARL, as well as other government organizations, industry, and academia, to store and disseminate multiple intelligence (Multi-INT) datasets collected at field exercises and demonstrations, and to facilitate research and development (R&D) and the advancement of analytical tools and algorithms supporting the Intelligence, Surveillance, and Reconnaissance (ISR) community. The AODR provides a potential central repository for standards-compliant datasets to serve as the "go-to" location for lessons learned and reference products. Many of the AODR datasets have associated ground truth and other metadata, which provides a rich and robust data suite for researchers to develop, test, and refine their algorithms. Researchers download the test data to their own environments using a sophisticated web interface. The AODR allows researchers to request copies of stored datasets and the government to process the requests and approvals in an automated fashion. Access to the AODR requires two-factor authentication in the form of a Common Access Card (CAC) or External Certificate Authority (ECA)

  1. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - calibration Report for Phoenix Testbed : Final Report. [supporting datasets - Phoenix Testbed

    Science.gov (United States)

    2017-07-26

    The datasets in this zip file are in support of FHWA-JPO-16-379, Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Program...

  2. Security Concepts for Satellite Links

    Science.gov (United States)

    Tobehn, C.; Penné, B.; Rathje, R.; Weigl, A.; Gorecki, Ch.; Michalik, H.

    2008-08-01

    The high costs to develop, launch and maintain a satellite network make protecting the assets imperative. Attacks may be passive, such as eavesdropping on the payload data. A more serious threat is active attacks that try to gain control of the satellite, which may lead to the total loss of the satellite asset. To counter these threats, new satellite and ground systems are using cryptographic technologies to provide a range of services: confidentiality, entity and message authentication, and data integrity. Additionally, key management cryptographic services are required to support these services. This paper describes the key points of current satellite control and operations, namely authentication of access to the satellite TM/TC link and encryption of security-relevant TM/TC data. For payload data management the key points are multi-user ground station access and high data rates, both requiring frequent updates and uploads of keys with the corresponding key management methods. For secure satellite management, authentication and key negotiation algorithms such as HMAC-RIPEMD160, EC-DSA and EC-DH are used. Encryption of data uses algorithms such as IDEA, AES, or Triple-DES. A channel coding and encryption unit for payload data provides download data rates up to Nx250 Mbps. The presented concepts are based on our experience and heritage of the security systems for all German MOD satellite projects (SATCOMBw2, the SAR-Lupe multi-satellite system and German-French SAR-Lupe-Helios-II system inter-operability) as well as for further international (KOMPSAT-II payload data link system) and ESA activities (TM/TC security and GMES).
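
    The message-authentication side of such a scheme can be sketched with a keyed HMAC over a telecommand frame, as below. SHA-256 is used here only as a stand-in digest (the record above names HMAC-RIPEMD160, whose availability varies across Python/OpenSSL builds), and the key and frame contents are placeholders.

    ```python
    # Minimal sketch of HMAC-based telecommand authentication (stand-in digest).
    import hmac
    import hashlib

    shared_key = b"ground-and-spacecraft-shared-secret"   # established via key management

    def authenticate_tc(frame: bytes) -> bytes:
        """Append an HMAC tag to a telecommand frame."""
        tag = hmac.new(shared_key, frame, hashlib.sha256).digest()
        return frame + tag

    def verify_tc(message: bytes) -> bool:
        """Check the tag on a received frame in constant time."""
        frame, tag = message[:-32], message[-32:]
        expected = hmac.new(shared_key, frame, hashlib.sha256).digest()
        return hmac.compare_digest(tag, expected)

    tc = authenticate_tc(b"\x1a\xcf SET_MODE SAFE")
    print("authentic:", verify_tc(tc))
    print("tampered :", verify_tc(tc[:-1] + b"\x00"))
    ```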

  3. Improved Chlorophyll-a Algorithm for the Satellite Ocean Color Data in the Northern Bering Sea and Southern Chukchi Sea

    Science.gov (United States)

    Lee, Sang Heon; Ryu, Jongseong; Park, Jung-woo; Lee, Dabin; Kwon, Jae-Il; Zhao, Jingping; Son, SeungHyun

    2018-03-01

    The Bering and Chukchi seas are an important conduit to the Arctic Ocean and are reported to be among the most productive regions in the world's oceans in terms of high primary productivity that sustains large numbers of fishes, marine mammals, and sea birds as well as benthic animals. Climate-induced changes in primary production and production at higher trophic levels have also been observed in the northern Bering and Chukchi seas. Satellite ocean color observations could enable the monitoring of relatively long-term patterns in chlorophyll-a (Chl-a) concentrations that would serve as an indicator of phytoplankton biomass. The performance of existing global and regional Chl-a algorithms for satellite ocean color data was investigated in the northeastern Bering Sea and southern Chukchi Sea using in situ optical measurements from the Healy 2007 cruise. The model-derived Chl-a data using the previous Chl-a algorithms show striking uncertainties in Chl-a concentrations, for example, overestimation at lower Chl-a concentrations or systematic overestimation in the northeastern Bering Sea and southern Chukchi Sea. Accordingly, a simple two-band ratio (Rrs(443)/Rrs(555)) algorithm of Chl-a for the satellite ocean color data was devised for the northeastern Bering Sea and southern Chukchi Sea. The MODIS-derived Chl-a data from July 2002 to December 2014 were produced using the new Chl-a algorithm to investigate the seasonal and interannual variations of Chl-a in the northern Bering Sea and the southern Chukchi Sea. The seasonal distribution of Chl-a shows that the highest (spring bloom) Chl-a concentrations are in May and the lowest are in July over the whole area. Chl-a concentrations relatively decreased in June, particularly in the open ocean waters of the Bering Sea. The Chl-a concentrations start to increase again in August and become quite high in September. In October, Chl-a concentrations decreased in the western part of the study area and the Alaskan
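
    A two-band ratio algorithm of the kind described above can be sketched as a simple power law in the blue-to-green reflectance ratio. The coefficients below are placeholders for illustration, not the regionally tuned values derived in the paper.

    ```python
    # Minimal sketch of a two-band ratio chlorophyll-a algorithm:
    # Chl-a = a * (Rrs(443)/Rrs(555))**b, with assumed coefficients a and b.
    import numpy as np

    A_COEF, B_COEF = 1.5, -1.8          # assumed power-law coefficients

    def chl_from_band_ratio(rrs_443, rrs_555):
        """Estimate Chl-a (mg m^-3) from remote-sensing reflectances at 443 and 555 nm."""
        ratio = np.asarray(rrs_443) / np.asarray(rrs_555)
        return A_COEF * ratio ** B_COEF

    # Example: a pixel with blue reflectance lower than green (more productive water)
    print(chl_from_band_ratio(rrs_443=0.004, rrs_555=0.006))   # higher Chl-a
    print(chl_from_band_ratio(rrs_443=0.010, rrs_555=0.005))   # lower Chl-a
    ```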

  4. An Empirical Orthogonal Function-Based Algorithm for Estimating Terrestrial Latent Heat Flux from Eddy Covariance, Meteorological and Satellite Observations.

    Science.gov (United States)

    Feng, Fei; Li, Xianglan; Yao, Yunjun; Liang, Shunlin; Chen, Jiquan; Zhao, Xiang; Jia, Kun; Pintér, Krisztina; McCaughey, J Harry

    2016-01-01

    Accurate estimation of latent heat flux (LE) based on remote sensing data is critical in characterizing terrestrial ecosystems and modeling land surface processes. Many LE products were released during the past few decades, but their quality might not meet the requirements in terms of data consistency and estimation accuracy. Merging multiple algorithms could be an effective way to improve the quality of existing LE products. In this paper, we present a data integration method based on modified empirical orthogonal function (EOF) analysis to integrate the Moderate Resolution Imaging Spectroradiometer (MODIS) LE product (MOD16) and the Priestley-Taylor LE estimate of the Jet Propulsion Laboratory (PT-JPL) algorithm. Twenty-two eddy covariance (EC) sites with LE observations were chosen to evaluate our algorithm, showing that the proposed EOF fusion method was capable of integrating the two satellite data sets with improved consistency and reduced uncertainties. Further efforts are needed to evaluate and improve the proposed algorithm at larger spatial scales and over longer time periods, and over different land cover types.

  5. Tower-Based Greenhouse Gas Measurement Network Design---The National Institute of Standards and Technology North East Corridor Testbed.

    Science.gov (United States)

    Lopez-Coto, Israel; Ghosh, Subhomoy; Prasad, Kuldeep; Whetstone, James

    2017-09-01

    The North-East Corridor (NEC) Testbed project is the third of three NIST (National Institute of Standards and Technology) greenhouse gas emissions testbeds designed to advance greenhouse gas measurement capabilities. A design approach for a dense observing network combined with atmospheric inversion methodologies is described. The Advanced Research Weather Research and Forecasting Model with the Stochastic Time-Inverted Lagrangian Transport model was used to derive the sensitivity of hypothetical observations to surface greenhouse gas emissions (footprints). Unlike other network design algorithms, an iterative selection algorithm, based on a k-means clustering method, was applied to minimize the similarities between the temporal responses of the sites and maximize sensitivity to the urban emissions contribution. Once a network was selected, a synthetic inversion Bayesian Kalman filter was used to evaluate observing system performance. We present the performance of various measurement network configurations consisting of differing numbers of towers and tower locations. Results show that an overly spatially compact network has decreased spatial coverage, as the spatial information added per site is then suboptimal for covering the largest possible area, whilst networks dispersed too broadly lose the ability to constrain flux uncertainties. In addition, we explore the possibility of using a very high density network of lower-cost, lower-performance sensors characterized by larger uncertainties and temporal drift. Analysis convergence is faster with a large number of observing locations, reducing the response time of the filter. Larger uncertainties in the observations imply lower values of uncertainty reduction. The drift, on the other hand, is a bias in nature that is added to the observations, thereby biasing the retrieved fluxes.
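
    The k-means-based selection step can be sketched as follows: cluster candidate towers by the similarity of their footprint time series, then keep one representative per cluster so that the selected sites are as dissimilar as possible. The footprints below are random placeholders rather than WRF-STILT output, and scikit-learn is assumed to be available.

    ```python
    # Minimal sketch of k-means-based observing-site selection from footprint time series.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)
    n_candidates, n_hours = 40, 24 * 30
    footprints = rng.gamma(2.0, 1.0, size=(n_candidates, n_hours))   # fake sensitivities

    n_towers = 8                                  # network size to design (assumed)
    km = KMeans(n_clusters=n_towers, n_init=10, random_state=0).fit(footprints)

    selected = []
    for k in range(n_towers):
        members = np.flatnonzero(km.labels_ == k)
        # keep the member whose time series is closest to the cluster centre
        d = np.linalg.norm(footprints[members] - km.cluster_centers_[k], axis=1)
        selected.append(members[np.argmin(d)])

    print("selected candidate tower indices:", sorted(selected))
    ```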

  6. Prognostics-Enabled Power Supply for ADAPT Testbed, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Ridgetop's role is to develop electronic prognostics for sensing power systems in support of NASA/Ames ADAPT testbed. The prognostic enabled power systems from...

  7. Satellite observation of particulate organic carbon dynamics in ...

    Science.gov (United States)

    Particulate organic carbon (POC) plays an important role in coastal carbon cycling and the formation of hypoxia. Yet, coastal POC dynamics are often poorly understood due to a lack of long-term POC observations and the complexity of coastal hydrodynamic and biogeochemical processes that influence POC sources and sinks. Using field observations and satellite ocean color products, we developed a new multiple regression algorithm to estimate POC on the Louisiana Continental Shelf (LCS) from satellite observations. The algorithm had reliable performance, with mean relative error (MRE) of ~40% and root mean square error (RMSE) of ~50% for MODIS and SeaWiFS images for POC ranging between ~80 and ~1200 mg m-3, and showed similar performance for a large estuary (Mobile Bay). Substantial spatiotemporal variability in the satellite-derived POC was observed on the LCS, with high POC found on the inner shelf (satellite data with carefully developed algorithms can greatly increase
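
    A multiple linear regression of the general kind described above can be sketched as follows. The predictor bands, the log transform, and the synthetic matchup data are assumptions for illustration, not the published LCS regression.

    ```python
    # Minimal sketch of a multiple linear regression for POC from satellite reflectances.
    # Synthetic matchups and predictor choice are assumed placeholders.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 120
    rrs_green = rng.uniform(0.002, 0.012, n)
    rrs_red = rng.uniform(0.001, 0.010, n)
    poc = np.exp(4.5 + 0.8 * np.log(rrs_red / rrs_green)) * rng.lognormal(0, 0.1, n)

    # Design matrix: intercept + log band ratio + log green reflectance
    X = np.column_stack([np.ones(n), np.log(rrs_red / rrs_green), np.log(rrs_green)])
    coef, *_ = np.linalg.lstsq(X, np.log(poc), rcond=None)

    poc_hat = np.exp(X @ coef)
    mre = np.mean(np.abs(poc_hat - poc) / poc) * 100
    print("fitted coefficients:", np.round(coef, 3))
    print(f"mean relative error on training matchups: {mre:.1f}%")
    ```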

  8. Towards a Perpetual Sensor Network Testbed without Backchannel

    DEFF Research Database (Denmark)

    Johansen, Aslak; Bonnet, Philippe; Sørensen, Thomas

    2012-01-01

    The sensor network testbeds available today rely on a communication channel different from the mote radio - a backchannel - to facilitate mote reprogramming, health monitoring and performance analysis. Such backchannels are either supported as wired communication channels (USB or Ethernet), or vi...

  9. Data dissemination in the wild: A testbed for high-mobility MANETs

    DEFF Research Database (Denmark)

    Vingelmann, Peter; Pedersen, Morten Videbæk; Heide, Janus

    2012-01-01

    This paper investigates the problem of efficient data dissemination in Mobile Ad hoc NETworks (MANETs) with high mobility. A testbed is presented which provides a high degree of mobility in experiments. The testbed consists of 10 autonomous robots with mobile phones mounted on them. The mobile... information, and the goal is to convey that information to all devices. A strategy is proposed that uses UDP broadcast transmissions and random linear network coding to facilitate the efficient exchange of information in the network. An application is introduced that implements this strategy on Nokia phones...
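
    The random linear network coding strategy named above can be sketched as below: each transmitted packet is a random linear combination of the source packets in a generation, tagged with its coefficient vector. For brevity the sketch works over GF(2) (XOR combinations); practical systems often use a larger field, and the generation size and packet length here are assumptions.

    ```python
    # Minimal sketch of random linear network coding over GF(2). Decoding (not shown)
    # solves the linear system once enough independent combinations have arrived.
    import os
    import random

    GENERATION_SIZE = 4          # number of source packets per generation (assumed)
    PACKET_LEN = 16              # bytes per packet (assumed)

    source = [os.urandom(PACKET_LEN) for _ in range(GENERATION_SIZE)]

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def encode():
        """Produce one coded packet: (coefficient vector, payload)."""
        coeffs = [random.randint(0, 1) for _ in range(GENERATION_SIZE)]
        if not any(coeffs):
            coeffs[random.randrange(GENERATION_SIZE)] = 1   # avoid the all-zero combination
        payload = bytes(PACKET_LEN)
        for c, pkt in zip(coeffs, source):
            if c:
                payload = xor_bytes(payload, pkt)
        return coeffs, payload

    for i in range(6):           # send a few redundant combinations (e.g., over UDP broadcast)
        coeffs, payload = encode()
        print(f"coded packet {i}: coeffs={coeffs} payload={payload.hex()[:16]}...")
    ```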

  10. Hypercube algorithms suitable for image understanding in uncertain environments

    International Nuclear Information System (INIS)

    Huntsberger, T.L.; Sengupta, A.

    1988-01-01

    Computer vision in a dynamic environment needs to be fast and able to tolerate incomplete or uncertain intermediate results. An appropriately chosen representation coupled with a parallel architecture addresses both concerns. The wide range of numerical and symbolic processing needed for robust computer vision can only be achieved through a blend of SIMD and MIMD processing techniques. The 1024-element hypercube architecture has these capabilities, and was chosen as the test-bed hardware for development of highly parallel computer vision algorithms. This paper presents and analyzes parallel algorithms for color image segmentation and edge detection. These algorithms are part of a recently developed computer vision system which uses multiple-valued logic to represent uncertainty in the imaging process and in intermediate results. Algorithms for the extraction of three-dimensional properties of objects using dynamic scene analysis techniques within the same framework are examined. Results from experimental studies using a 1024-element hypercube implementation of the algorithms as applied to a series of natural scenes are reported

  11. In-Space Internet-Based Communications for Space Science Platforms Using Commercial Satellite Networks

    Science.gov (United States)

    Kerczewski, Robert J.; Bhasin, Kul B.; Fabian, Theodore P.; Griner, James H.; Kachmar, Brian A.; Richard, Alan M.

    1999-01-01

    The continuing technological advances in satellite communications and global networking have resulted in commercial systems that now can potentially provide capabilities for communications with space-based science platforms. This reduces the need for expensive government owned communications infrastructures to support space science missions while simultaneously making available better service to the end users. An interactive, high data rate Internet type connection through commercial space communications networks would enable authorized researchers anywhere to control space-based experiments in near real time and obtain experimental results immediately. A space based communications network architecture consisting of satellite constellations connecting orbiting space science platforms to ground users can be developed to provide this service. The unresolved technical issues presented by this scenario are the subject of research at NASA's Glenn Research Center in Cleveland, Ohio. Assessment of network architectures, identification of required new or improved technologies, and investigation of data communications protocols are being performed through testbed and satellite experiments and laboratory simulations.

  12. Traffic sharing algorithms for hybrid mobile networks

    Science.gov (United States)

    Arcand, S.; Murthy, K. M. S.; Hafez, R.

    1995-01-01

    In a hybrid (terrestrial + satellite) mobile personal communications network environment, a large-size satellite footprint (supercell) overlays a large number of smaller, contiguous terrestrial cells. We assume that users have either a terrestrial-only single mode terminal (SMT) or a terrestrial/satellite dual mode terminal (DMT), and the ratio of DMTs to the total number of terminals is defined as gamma. It is assumed that the call assignments to, and handovers between, terrestrial cells and satellite supercells take place in a dynamic fashion when necessary. The objectives of this paper are twofold: (1) to propose and define a class of traffic sharing algorithms to manage terrestrial and satellite network resources efficiently by handling call handovers dynamically, and (2) to analyze and evaluate the algorithms by maximizing the traffic load handling capability (defined in erl/cell) over a wide range of terminal ratios (gamma) given an acceptable range of blocking probabilities. Two of the algorithms (G and S) in the proposed class perform extremely well for a wide range of gamma.
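
    How blocking probabilities constrain the traffic load per cell in such a hybrid network can be illustrated with Erlang-B calculations, as in the sketch below. This is not the paper's G or S algorithm; the channel counts, loads, and the simple "blocked dual-mode calls overflow to the satellite" rule are assumptions for illustration.

    ```python
    # Minimal sketch of evaluating terrestrial/satellite traffic sharing with Erlang-B.
    # All numbers and the overflow rule are assumed placeholders.

    def erlang_b(traffic_erl: float, channels: int) -> float:
        """Erlang-B blocking probability via the standard recursion."""
        b = 1.0
        for n in range(1, channels + 1):
            b = traffic_erl * b / (n + traffic_erl * b)
        return b

    cell_load = 6.0          # erl per terrestrial cell (assumed)
    gamma = 0.4              # fraction of dual-mode terminals (assumed)
    terr_channels, sat_channels = 8, 40
    cells_per_supercell = 20

    # Dual-mode traffic blocked in a terrestrial cell overflows to the satellite supercell.
    p_terr = erlang_b(cell_load, terr_channels)
    overflow = cells_per_supercell * cell_load * gamma * p_terr
    p_sat = erlang_b(overflow, sat_channels)

    print(f"terrestrial blocking: {p_terr:.3f}")
    print(f"satellite overflow load: {overflow:.2f} erl, blocking: {p_sat:.4f}")
    # Single-mode terminals see p_terr; dual-mode terminals see roughly p_terr * p_sat.
    ```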

  13. High Contrast Vacuum Nuller Testbed (VNT) Contrast, Performance and Null Control

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-01-01

    Herein we report on our Visible Nulling Coronagraph high-contrast result of 10^9 contrast averaged over a focal plane region extending from 1 - 4 λ/D with the Vacuum Nuller Testbed (VNT) in a vibration isolated vacuum chamber. The VNC is a hybrid interferometric/coronagraphic approach for exoplanet science. It operates with high Lyot stop efficiency for filled, segmented and sparse or diluted-aperture telescopes, thereby spanning the range of potential future NASA flight telescopes. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop the VNC and its technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and its enabling technologies. These testbeds have enabled advancement of high-contrast, visible light, nulling interferometry to unprecedented levels. The VNC is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We give an overview of the VNT and discuss the high-contrast laboratory results, the optical configuration, critical technologies and null sensing and control.

  14. High contrast vacuum nuller testbed (VNT) contrast, performance, and null control

    Science.gov (United States)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-09-01

    Herein we report on our Visible Nulling Coronagraph high-contrast result of 10^9 contrast averaged over a focal plane region extending from 1 - 4 λ/D with the Vacuum Nuller Testbed (VNT) in a vibration isolated vacuum chamber. The VNC is a hybrid interferometric/coronagraphic approach for exoplanet science. It operates with high Lyot stop efficiency for filled, segmented and sparse or diluted-aperture telescopes, thereby spanning the range of potential future NASA flight telescopes. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop the VNC and its technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and its enabling technologies. These testbeds have enabled advancement of high-contrast, visible light, nulling interferometry to unprecedented levels. The VNC is based on a modified Mach-Zehnder nulling interferometer, with a “W” configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We give an overview of the VNT and discuss the high-contrast laboratory results, the optical configuration, critical technologies and null sensing and control.

  15. Automatic Integration Testbeds validation on Open Science Grid

    International Nuclear Information System (INIS)

    Caballero, J; Potekhin, M; Thapa, S; Gardner, R

    2011-01-01

    A recurring challenge in deploying high quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests which resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit 'VO-like' jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.

  16. Automatic Integration Testbeds validation on Open Science Grid

    Science.gov (United States)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests which resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.

  17. Implementation of a RPS Cyber Security Test-bed with Two PLCs

    International Nuclear Information System (INIS)

    Shin, Jinsoo; Heo, Gyunyoung; Son, Hanseong; An, Yongkyu; Rizwan, Uddin

    2015-01-01

    Our research team has proposed a methodology to evaluate cyber security with a Bayesian network (BN) as a cyber security evaluation model and to help operators, licensees, licensors or regulators in assigning evaluation priorities. The methodology allows for an overall evaluation of cyber security by considering the architectural aspect of the facility and the management aspect of cyber security at the same time. In order to emphasize the reality of this model by inserting true data, it is necessary to conduct a penetration test that simulates an actual cyber-attack. Through collaboration with the University of Illinois at Urbana-Champaign, which possesses the Tricon, a safety programmable logic controller (PLC) used at nuclear power plants, and develops a test-bed for nuclear power plants, a test-bed for the reactor protection system (RPS) is being developed with these PLCs. Two PLCs are used to construct a simple test-bed for the RPS: the bi-stable processor (BP) and the coincidence processor (CP). By using two PLCs, it is possible to examine cyber-attacks against devices such as a PLC, cyber-attacks against communication between devices, and the effects of one PLC on the other. Advantages of using two or more PLCs instead of a single PLC are as follows. 1) Results of cyber-attacks reflecting characteristics among PLCs can be obtained. 2) A cyber-attack can be attempted using a method of attacking the communication between PLCs. The true data obtained can be applied to the existing cyber security evaluation model to emphasize the reality of the model

  18. Implementation of a RPS Cyber Security Test-bed with Two PLCs

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jinsoo; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Son, Hanseong [Joongbu Univ., Geumsan (Korea, Republic of); An, Yongkyu; Rizwan, Uddin [University of Illinois at Urbana-Champaign, Urbana (United States)

    2015-10-15

    Our research team has proposed a methodology to evaluate cyber security with a Bayesian network (BN) as a cyber security evaluation model and to help operators, licensees, licensors or regulators in assigning evaluation priorities. The methodology allows for an overall evaluation of cyber security by considering the architectural aspect of the facility and the management aspect of cyber security at the same time. In order to emphasize the reality of this model by inserting true data, it is necessary to conduct a penetration test that simulates an actual cyber-attack. Through collaboration with the University of Illinois at Urbana-Champaign, which possesses the Tricon, a safety programmable logic controller (PLC) used at nuclear power plants, and develops a test-bed for nuclear power plants, a test-bed for the reactor protection system (RPS) is being developed with these PLCs. Two PLCs are used to construct a simple test-bed for the RPS: the bi-stable processor (BP) and the coincidence processor (CP). By using two PLCs, it is possible to examine cyber-attacks against devices such as a PLC, cyber-attacks against communication between devices, and the effects of one PLC on the other. Advantages of using two or more PLCs instead of a single PLC are as follows. 1) Results of cyber-attacks reflecting characteristics among PLCs can be obtained. 2) A cyber-attack can be attempted using a method of attacking the communication between PLCs. The true data obtained can be applied to the existing cyber security evaluation model to emphasize the reality of the model.

  19. Variable Coding and Modulation Experiment Using NASA's Space Communication and Navigation Testbed

    Science.gov (United States)

    Downey, Joseph A.; Mortensen, Dale J.; Evans, Michael A.; Tollis, Nicholas S.

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed on the International Space Station provides a unique opportunity to evaluate advanced communication techniques in an operational system. The experimental nature of the Testbed allows for rapid demonstrations while using flight hardware in a deployed system within NASA's networks. One example is variable coding and modulation, which is a method to increase data-throughput in a communication link. This paper describes recent flight testing with variable coding and modulation over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Performance of the variable coding and modulation system is evaluated and compared to the capacity of the link, as well as standard NASA waveforms.
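
    As an illustration of the general variable coding and modulation idea described above, the following sketch selects a DVB-S2-style MODCOD from an estimated link Es/N0 using a threshold table. The table values, margin, and selection rule are illustrative assumptions and are not taken from the SCaN Testbed experiment.

```python
# Hypothetical sketch of variable coding and modulation (VCM) selection.
# The MODCOD table below is illustrative only; actual DVB-S2 thresholds and
# the SCaN Testbed configuration are not taken from the paper.

# (modcod name, spectral efficiency [bit/s/Hz], approx. required Es/N0 [dB])
MODCOD_TABLE = [
    ("QPSK 1/2",   1.0,  1.0),
    ("QPSK 3/4",   1.5,  4.0),
    ("8PSK 2/3",   2.0,  6.6),
    ("8PSK 5/6",   2.5,  9.4),
    ("16APSK 3/4", 3.0, 10.2),
]

def select_modcod(esn0_db, margin_db=1.0):
    """Pick the highest-throughput MODCOD whose threshold fits the link,
    keeping a fixed margin against fading and multipath."""
    usable = [m for m in MODCOD_TABLE if m[2] + margin_db <= esn0_db]
    if not usable:
        return MODCOD_TABLE[0]          # fall back to the most robust mode
    return max(usable, key=lambda m: m[1])

if __name__ == "__main__":
    for snr in (2.0, 7.5, 12.0):
        name, eff, thr = select_modcod(snr)
        print(f"Es/N0={snr:4.1f} dB -> {name} ({eff} bit/s/Hz)")
```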

  20. Application of an optimization algorithm to satellite ocean color imagery: A case study in Southwest Florida coastal waters

    Science.gov (United States)

    Hu, Chuanmin; Lee, Zhongping; Muller-Karger, Frank E.; Carder, Kendall L.

    2003-05-01

    A spectra-matching optimization algorithm, designed for hyperspectral sensors, has been implemented to process SeaWiFS-derived multi-spectral water-leaving radiance data. The algorithm has been tested over Southwest Florida coastal waters. The total spectral absorption and backscattering coefficients can be well partitioned with the inversion algorithm, resulting in RMS errors generally less than 5% in the modeled spectra. For extremely turbid waters that come from either river runoff or sediment resuspension, the RMS error is in the range of 5-15%. The bio-optical parameters derived in this optically complex environment agree well with those obtained in situ. Further, the ability to separate backscattering (a proxy for turbidity) from the satellite signal makes it possible to trace water movement patterns, as indicated by the total absorption imagery. The derived patterns agree with those from concurrent surface drifters. For waters where CDOM overwhelmingly dominates the optical signal, however, the procedure tends to regard CDOM as the sole source of absorption, implying the need for better atmospheric correction and for adjustment of some model coefficients for this particular region.
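
    The following is a minimal sketch of the spectra-matching idea described above: free parameters controlling absorption and backscattering are adjusted until a modeled reflectance spectrum matches the measured one in a least-squares sense. The simplified forward model, spectral shapes, and constants are placeholders, not the coefficients of the cited algorithm.

```python
# Minimal sketch of a spectra-matching inversion, assuming a simplified
# forward model Rrs ~ g * bb / (a + bb).  The constants and the way
# a(lambda) and bb(lambda) are parameterized are placeholders.
import numpy as np
from scipy.optimize import least_squares

WL = np.array([412., 443., 490., 510., 555.])             # SeaWiFS-like bands (nm)
AW = np.array([0.0046, 0.0071, 0.015, 0.0325, 0.0596])    # approx. pure-water absorption
BBW = np.array([0.0033, 0.0024, 0.0016, 0.0014, 0.0010])  # approx. pure-water backscattering

def forward(params, wl):
    """Model Rrs from three free parameters: phytoplankton absorption at 443 nm,
    CDOM/detrital absorption at 443 nm, particle backscattering at 555 nm."""
    aph443, ag443, bbp555 = params
    aph = aph443 * (wl / 443.0) ** -0.5            # placeholder spectral shapes
    ag = ag443 * np.exp(-0.015 * (wl - 443.0))
    bbp = bbp555 * (555.0 / wl)
    a, bb = AW + aph + ag, BBW + bbp
    return 0.0949 * bb / (a + bb)

def invert(rrs_measured):
    res = least_squares(lambda p: forward(p, WL) - rrs_measured,
                        x0=[0.05, 0.05, 0.005], bounds=(0.0, 5.0))
    rms = np.sqrt(np.mean(res.fun ** 2)) / np.mean(rrs_measured)
    return res.x, rms

if __name__ == "__main__":
    truth = np.array([0.08, 0.03, 0.004])
    est, rms = invert(forward(truth, WL))
    print("retrieved (aph443, ag443, bbp555):", est, "relative RMS:", rms)
```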

  1. EVALUATING THE ACCURACY OF DEM GENERATION ALGORITHMS FROM UAV IMAGERY

    Directory of Open Access Journals (Sweden)

    J. J. Ruiz

    2013-08-01

    Full Text Available In this work we evaluated how the use of different positioning systems affects the accuracy of Digital Elevation Models (DEMs) generated from aerial imagery obtained with Unmanned Aerial Vehicles (UAVs). In this domain, state-of-the-art DEM generation algorithms suffer from the typical errors of GPS/INS devices in the position measurements associated with each picture obtained; the deviations between these measurements and real-world positions are on the order of meters. The experiments have been carried out using a small quadrotor in the indoor testbed at the Center for Advanced Aerospace Technologies (CATEC). This testbed houses a system that is able to track small markers mounted on the UAV and along the scenario with millimeter precision. This provides very precise position measurements, to which we can add random noise to simulate errors in different GPS receivers. The results showed that final DEM accuracy clearly depends on the positioning information.
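
    A minimal sketch of the simulation approach described above, assuming illustrative noise magnitudes: precise tracker-derived camera positions are perturbed with Gaussian noise to emulate the meter-level errors of different GPS receivers.

```python
# Sketch of the kind of experiment described: take precise camera positions
# from a motion-capture testbed and add random noise to emulate different
# GNSS receivers.  The noise levels below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def add_gps_noise(positions_m, sigma_xy=1.5, sigma_z=3.0):
    """positions_m: (N, 3) array of precise positions in meters.
    Returns positions perturbed by zero-mean Gaussian noise, with the
    vertical channel noisier than the horizontal one, as is typical of GPS."""
    n = len(positions_m)
    noise = np.column_stack([
        rng.normal(0.0, sigma_xy, n),
        rng.normal(0.0, sigma_xy, n),
        rng.normal(0.0, sigma_z,  n),
    ])
    return positions_m + noise

if __name__ == "__main__":
    precise = rng.uniform(0, 10, size=(5, 3))     # stand-in tracker positions
    noisy = add_gps_noise(precise)
    print(np.abs(noisy - precise).mean(axis=0))   # mean per-axis perturbation
```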

  2. Experimental validation of a distributed algorithm for dynamic spectrum access in local area networks

    DEFF Research Database (Denmark)

    Tonelli, Oscar; Berardinelli, Gilberto; Tavares, Fernando Menezes Leitão

    2013-01-01

    Next generation wireless networks aim at a significant improvement of the spectral efficiency in order to meet the dramatic increase in data service demand. In local area scenarios user-deployed base stations are expected to take place, thus making the centralized planning of frequency resources...... activities with the Autonomous Component Carrier Selection (ACCS) algorithm, a distributed solution for interference management among small neighboring cells. A preliminary evaluation of the algorithm performance is provided considering its live execution on a software defined radio network testbed...

  3. Multi-level infrastructure of interconnected testbeds of large-scale wireless sensor networks (MI2T-WSN)

    CSIR Research Space (South Africa)

    Abu-Mahfouz, Adnan M

    2012-06-01

    Full Text Available are still required for further testing before the real implementation. In this paper we propose a multi-level infrastructure of interconnected testbeds of large- scale WSNs. This testbed consists of 1000 sensor motes that will be distributed into four...

  4. The Airborne Optical Systems Testbed (AOSTB)

    Science.gov (United States)

    2017-05-31

    are the Atlantic Ocean and coastal waterways, which reflect back very little light at our SWIR operating wavelength of 1064 nm. The Airborne Optical... demonstrate our typical FOPEN capabilities, figure 5 shows two images taken over a forested area near Burlington, VT. Figure 5(a) is a 3D point... [Figure 5 caption: Ladar target scan of a forested area in northern Vermont]

  5. The Living With a Star Space Environment Testbed Payload

    Science.gov (United States)

    Xapsos, Mike

    2015-01-01

    This presentation outlines a brief description of the Living With a Star (LWS) Program missions and detailed information about the Space Environment Testbed (SET) payload, consisting of a space weather monitor and a carrier containing four board experiments.

  6. Accurate beacon positioning method for satellite-to-ground optical communication.

    Science.gov (United States)

    Wang, Qiang; Tong, Ling; Yu, Siyuan; Tan, Liying; Ma, Jing

    2017-12-11

    In satellite laser communication systems, accurate positioning of the beacon is essential for establishing a steady laser communication link. For satellite-to-ground optical communication, the main factors influencing the acquisition of the beacon are background noise and atmospheric turbulence. In this paper, we consider the influence of background noise and atmospheric turbulence on the beacon in satellite-to-ground optical communication, and propose a new locating algorithm for the beacon, which takes the correlation coefficients obtained by curve fitting of the image data as weights. By performing a long-distance laser communication experiment (11.16 km), we verified the feasibility of this method. Both simulation and experiment showed that the new algorithm can accurately obtain the position of the centroid of the beacon. Furthermore, for the distortion of the light spot through atmospheric turbulence, the locating accuracy of the new algorithm was 50% higher than that of the conventional gray centroid algorithm. This new approach will be beneficial for the design of satellite-to-ground optical communication systems.
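
    The following is a hedged sketch of a weighted centroid locator in the spirit of the proposed algorithm. The paper weights image data by correlation coefficients obtained from curve fitting; here the weighting input is left generic, so the exact weighting scheme is an assumption.

```python
# Sketch of a weighted centroid beacon locator.  The cited algorithm weights
# image data by correlation coefficients obtained from curve fitting; the
# weighting here is a placeholder that only illustrates the structure.
import numpy as np

def weighted_centroid(image, weights=None):
    """image: 2-D array of pixel intensities; weights: optional 2-D array of
    per-pixel weights (e.g. fit-quality coefficients).  Returns (row, col)."""
    img = np.asarray(image, dtype=float)
    w = img if weights is None else img * np.asarray(weights, dtype=float)
    total = w.sum()
    rows, cols = np.indices(img.shape)
    return (rows * w).sum() / total, (cols * w).sum() / total

if __name__ == "__main__":
    y, x = np.mgrid[0:64, 0:64]
    spot = np.exp(-((x - 40.3) ** 2 + (y - 21.7) ** 2) / 18.0)   # synthetic beacon
    noisy = spot + 0.05 * np.random.default_rng(1).random(spot.shape)
    print(weighted_centroid(noisy))   # close to (21.7, 40.3) despite the noise
```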

  7. NPP/NPOESS Tools for Rapid Algorithm Updates

    Science.gov (United States)

    Route, G.; Grant, K. D.; Hughes, R.

    2010-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system; the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS Preparatory Project (NPP) and NPOESS satellites will carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes both NPP and NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. The Northrop Grumman Aerospace Systems (NGAS) Algorithms and Data Products (A&DP) organization is responsible for the algorithms that produce the Environmental Data Records (EDRs), including their quality aspects. As the Calibration and Validation (Cal/Val) activities move forward following both the NPP launch and subsequent NPOESS launches, rapid algorithm updates may be required. Raytheon and Northrop Grumman have developed tools and processes to enable changes to be evaluated, tested, and moved into the operational baseline in a rapid and efficient manner. This presentation will provide an overview of the tools available to the Cal/Val teams to ensure rapid and accurate assessment of algorithm changes, along with the processes in place to ensure baseline integrity.

  8. Geostationary satellites collocation

    CERN Document Server

    Li, Hengnian

    2014-01-01

    Geostationary Satellites Collocation aims to find solutions for deploying a safe and reliable collocation control. Focusing on the orbital perturbation analysis, the mathematical foundations for orbit and control of the geostationary satellite are summarized. The mathematical and physical principle of orbital maneuver and collocation strategies for multi geostationary satellites sharing with the same dead band is also stressed. Moreover, the book presents some applications using the above algorithms and mathematical models to help readers master the corrective method for planning station keeping maneuvers. Engineers and scientists in the fields of aerospace technology and space science can benefit from this book. Hengnian Li is the Deputy Director of State Key Laboratory of Astronautic Dynamics, China.

  9. The feasibility of retrieving vertical temperature profiles from satellite nadir UV observations: A sensitivity analysis and an inversion experiment with neural network algorithms

    International Nuclear Information System (INIS)

    Sellitto, P.; Del Frate, F.

    2014-01-01

    Atmospheric temperature profiles are inferred from passive satellite instruments, using thermal infrared or microwave observations. Here we investigate the feasibility of retrieving height-resolved temperature information in the ultraviolet spectral region. The temperature dependence of the absorption cross sections of ozone in the Huggins band, in particular in the interval 320–325 nm, is exploited. We carried out a sensitivity analysis and demonstrated that non-negligible information on the temperature profile can be extracted from this small band. Starting from these results, we developed a neural network inversion algorithm, trained and tested with simulated nadir EnviSat-SCIAMACHY ultraviolet observations. The algorithm is able to retrieve the temperature profile with root mean square errors and biases comparable to existing retrieval schemes that use thermal infrared or microwave observations. This demonstrates, for the first time, the feasibility of temperature profile retrieval from space-borne instruments operating in the ultraviolet. - Highlights: • A sensitivity analysis and an inversion scheme to retrieve temperature profiles from satellite UV observations (320–325 nm). • The exploitation of the temperature dependence of the absorption cross section of ozone in the Huggins band is proposed. • First demonstration of the feasibility of temperature profile retrieval from satellite UV observations. • RMSEs and biases comparable with more established techniques involving TIR and MW observations
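
    As a rough illustration of a neural-network inversion of this kind, the sketch below maps a vector of simulated UV radiances to a coarse temperature profile with a small multilayer perceptron. The synthetic data, network size, and preprocessing are assumptions, not the scheme used in the paper.

```python
# Illustrative sketch of a neural-network inversion: map a vector of simulated
# UV radiances (320-325 nm channels) to a coarse temperature profile.  The
# stand-in data below replace what would be radiative-transfer simulations of
# SCIAMACHY nadir spectra paired with known temperature profiles.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples, n_channels, n_levels = 2000, 25, 20

T_profiles = 220 + 60 * rng.random((n_samples, n_levels))        # synthetic truth (K)
A = rng.normal(size=(n_levels, n_channels))                      # toy linear forward map
radiances = T_profiles @ A + rng.normal(scale=5.0, size=(n_samples, n_channels))

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(radiances[:1500], T_profiles[:1500])                   # train on 1500 samples

pred = model.predict(radiances[1500:])
rmse = np.sqrt(np.mean((pred - T_profiles[1500:]) ** 2))
print(f"per-level RMSE on held-out samples: {rmse:.1f} K")
```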

  10. A Comprehensive Training Data Set for the Development of Satellite-Based Volcanic Ash Detection Algorithms

    Science.gov (United States)

    Schmidl, Marius

    2017-04-01

    We present a comprehensive training data set covering a large range of atmospheric conditions, including disperse volcanic ash and desert dust layers. These data sets contain all information required for the development of volcanic ash detection algorithms based on artificial neural networks, urgently needed since volcanic ash in the airspace is a major concern of aviation safety authorities. Selected parts of the data are used to train the volcanic ash detection algorithm VADUGS. They contain atmospheric and surface-related quantities as well as the corresponding simulated satellite data for the channels in the infrared spectral range of the SEVIRI instrument on board MSG-2. To get realistic results, ECMWF, IASI-based, and GEOS-Chem data are used to calculate all parameters describing the environment, whereas the software package libRadtran is used to perform radiative transfer simulations returning the brightness temperatures for each atmospheric state. As optical properties are a prerequisite for radiative simulations accounting for aerosol layers, the development also included the computation of optical properties for a set of different aerosol types from different sources. A description of the developed software and the used methods is given, besides an overview of the resulting data sets.

  11. Moving object detection in video satellite image based on deep learning

    Science.gov (United States)

    Zhang, Xueyang; Xiang, Junhua

    2017-11-01

    Moving object detection in video satellite imagery is studied. A detection algorithm based on deep learning is proposed. The small-scale characteristics of remote sensing video objects are analyzed. Firstly, a background subtraction algorithm based on an adaptive Gaussian mixture model is used to generate region proposals. Then the objects in the region proposals are classified via a deep convolutional neural network. Moving objects of interest are thus detected, combined with prior information on the sub-satellite point. The deep convolutional neural network is a 21-layer residual network whose parameters are trained by transfer learning. Experimental results on video from the Tiantuo-2 satellite demonstrate the effectiveness of the algorithm.
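
    A minimal sketch of the two-stage pipeline described above: adaptive Gaussian-mixture background subtraction generates region proposals, which are then passed to a classifier. The classifier here is a placeholder callable standing in for the 21-layer residual network, and the thresholds and parameters are illustrative.

```python
# Sketch of the two-stage pipeline: adaptive Gaussian-mixture background
# subtraction for region proposals, then a classifier on each proposal.
# The classifier is a placeholder; Tiantuo-2 data are not reproduced.
import cv2
import numpy as np

def detect_moving_objects(frames, classify, min_area=9):
    """frames: iterable of grayscale images; classify: callable mapping an
    image patch to a label (placeholder for the trained CNN)."""
    mog = cv2.createBackgroundSubtractorMOG2(history=100, varThreshold=16)
    detections = []
    for i, frame in enumerate(frames):
        mask = mog.apply(frame)                      # adaptive GMM background model
        mask = cv2.medianBlur(mask, 3)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            if w * h < min_area:                     # small-object scale of satellite video
                continue
            label = classify(frame[y:y + h, x:x + w])
            detections.append((i, x, y, w, h, label))
    return detections

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = [rng.integers(0, 30, (128, 128), dtype=np.uint8) for _ in range(10)]
    print(len(detect_moving_objects(frames, classify=lambda patch: "vehicle")))
```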

  12. SCaN Testbed Software Development and Lessons Learned

    Science.gov (United States)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR) Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require is different from legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not previously an option. They are not stand-alone devices, but require a new approach to effectively control them and flow data. This requires extensive software to be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration, and testing as related to the avionics processor system, and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet the NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also includes ground software, which is a key element for both the command of the payload and the display of data created by the payload. The verification of

  13. ASE-BAN, a Wireless Body Area Network Testbed

    DEFF Research Database (Denmark)

    Madsen, Jens Kargaard; Karstoft, Henrik; Toftegaard, Thomas Skjødeberg

    2010-01-01

    /actuators attached to the body and a host server application. The gateway uses the BlackFin BF533 processor from Analog Devices, and uses Bluetooth for wireless communication. Two types of sensors are attached to the network: an electrocardiogram sensor and an oximeter sensor. The testbed has been successfully...

  14. Encryption protection for communication satellites

    Science.gov (United States)

    Sood, D. R.; Hoernig, O. W., Jr.

    In connection with the growing importance of the commercial communication satellite systems and the introduction of new technological developments, users and operators of these systems become increasingly concerned with aspects of security. The user community is concerned with maintaining confidentiality and integrity of the information being transmitted over the satellite links, while the satellite operators are concerned about the safety of their assets in space. In response to these concerns, the commercial satellite operators are now taking steps to protect the communication information and the satellites. Thus, communication information is being protected by end-to-end encryption of the customer communication traffic. Attention is given to the selection of the NBS DES algorithm, the command protection systems, and the communication protection systems.

  15. Coarse Initial Orbit Determination for a Geostationary Satellite Using Single-Epoch GPS Measurements

    Directory of Open Access Journals (Sweden)

    Ghangho Kim

    2015-04-01

    Full Text Available A practical algorithm is proposed for determining the orbit of a geostationary orbit (GEO) satellite using single-epoch measurements from a Global Positioning System (GPS) receiver under the sparse visibility of the GPS satellites. The algorithm uses three components of a state vector to determine the satellite’s state, even when it is impossible to apply the classical single-point solutions (SPS). Through consideration of the characteristics of the GEO orbital elements and GPS measurements, the components of the state vector are reduced to three. However, the algorithm remains sufficiently accurate for a GEO satellite. The developed algorithm was tested on simulated measurements from two or three GPS satellites, and the calculated maximum position error was found to be less than approximately 40 km or even several kilometers within the geometric range, even when the classical SPS solution was unattainable. In addition, extended Kalman filter (EKF) tests of a GEO satellite with the estimated initial state were performed to validate the algorithm. In the EKF, a reliable dynamic model was adapted to reduce the probability of divergence that can be caused by large errors in the initial state.

  16. Coarse Initial Orbit Determination for a Geostationary Satellite Using Single-Epoch GPS Measurements

    Science.gov (United States)

    Kim, Ghangho; Kim, Chongwon; Kee, Changdon

    2015-01-01

    A practical algorithm is proposed for determining the orbit of a geostationary orbit (GEO) satellite using single-epoch measurements from a Global Positioning System (GPS) receiver under the sparse visibility of the GPS satellites. The algorithm uses three components of a state vector to determine the satellite’s state, even when it is impossible to apply the classical single-point solutions (SPS). Through consideration of the characteristics of the GEO orbital elements and GPS measurements, the components of the state vector are reduced to three. However, the algorithm remains sufficiently accurate for a GEO satellite. The developed algorithm was tested on simulated measurements from two or three GPS satellites, and the calculated maximum position error was found to be less than approximately 40 km or even several kilometers within the geometric range, even when the classical SPS solution was unattainable. In addition, extended Kalman filter (EKF) tests of a GEO satellite with the estimated initial state were performed to validate the algorithm. In the EKF, a reliable dynamic model was adapted to reduce the probability of divergence that can be caused by large errors in the initial state. PMID:25835299

  17. Optimization of Power Allocation for Multiusers in Multi-Spot-Beam Satellite Communication Systems

    Directory of Open Access Journals (Sweden)

    Heng Wang

    2014-01-01

    Full Text Available In recent years, multi-spot-beam satellite communication systems have played a key role in global seamless communication. However, satellite power resources are scarce and expensive due to the limitations of the satellite platform. Therefore, this paper proposes optimizing the power allocation to each user in order to improve the power utilization efficiency. Initially, the capacity allocated to each user is calculated according to the satellite link budget equations, which can be applied in practical satellite communication systems. The problem of power allocation is then formulated as a convex optimization, taking account of a trade-off between the maximization of the total system capacity and the fairness of power allocation amongst the users. Finally, an iterative algorithm based on duality theory is proposed to obtain the optimal solution to the optimization. Compared with the traditional uniform resource allocation or proportional resource allocation algorithms, the proposed optimal power allocation algorithm improves the fairness of power allocation amongst the users. Moreover, the computational complexity of the proposed algorithm is linear in both the number of spot beams and the number of users. As a result, the proposed power allocation algorithm is easy to implement in practice.
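
    The sketch below illustrates a duality-based iterative power allocation in its simplest form (water-filling under a total power constraint). The paper's capacity/fairness trade-off and link-budget terms are not reproduced; the objective and the bisection on the dual variable are assumptions used only to show the structure of such an algorithm.

```python
# Sketch of an iterative, duality-based power allocation.  The objective here
# is a plain sum of log-capacities under a total power constraint (classic
# water-filling); the cited paper's fairness-aware objective is not reproduced.
import numpy as np

def waterfill(gains, p_total, iters=100):
    """gains: per-user channel gain-to-noise ratios; returns per-user powers
    maximizing sum(log(1 + g*p)) subject to sum(p) <= p_total, via bisection
    on the dual variable (the 'water level')."""
    lo, hi = 1e-9, 1e9
    for _ in range(iters):
        mu = 0.5 * (lo + hi)                       # dual variable
        p = np.maximum(1.0 / mu - 1.0 / gains, 0)  # KKT solution for a given mu
        if p.sum() > p_total:
            lo = mu                                # too much power: raise mu
        else:
            hi = mu
    return p

if __name__ == "__main__":
    g = np.array([5.0, 2.0, 1.0, 0.5])             # illustrative user gains
    p = waterfill(g, p_total=10.0)
    print(p, p.sum())
```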

  18. LOS Throughput Measurements in Real-Time with a 128-Antenna Massive MIMO Testbed

    OpenAIRE

    Harris, Paul; Zhang, Siming; Beach, Mark; Mellios, Evangelos; Nix, Andrew; Armour, Simon; Doufexi, Angela; Nieman, Karl; Kundargi, Nikhil

    2017-01-01

    This paper presents initial results for a novel 128-antenna massive Multiple-Input, Multiple- Output (MIMO) testbed developed through Bristol Is Open in collaboration with National Instruments and Lund University. We believe that the results presented here validate the adoption of massive MIMO as a key enabling technology for 5G and pave the way for further pragmatic research by the massive MIMO community. The testbed operates in real-time with a Long-Term Evolution (LTE)-like PHY in Time Div...

  19. Current Developments in DETER Cybersecurity Testbed Technology

    Science.gov (United States)

    2015-12-08

    Management Experimental cybersecurity research is often inherently risky. An experiment may involve releasing live malware code, operating a real botnet...imagine a worm that can only propagate by first contacting a “propagation service” (T1 constraint), composed with a testbed firewall (T2...experiment. Finally, T1 constraints might be enforced by (1) explicit modification of malware to constrain its behavior, (2) implicit constraints

  20. Operation Duties on the F-15B Research Testbed

    Science.gov (United States)

    Truong, Samson S.

    2010-01-01

    This presentation entails what I have done this past summer for my Co-op tour in the Operations Engineering Branch. Activities included supporting the F-15B Research Testbed, supporting the incoming F-15D models, design work, and other operations engineering duties.

  1. Construction of test-bed system of voltage management system to ...

    African Journals Online (AJOL)

    Construction of test-bed system of voltage management system to apply physical power system. ... Journal of Fundamental and Applied Sciences ... system of voltage management system (VMS) in order to apply physical power system.

  2. Development of a Testbed for Wireless Underground Sensor Networks

    Directory of Open Access Journals (Sweden)

    Mehmet C. Vuran

    2010-01-01

    Full Text Available Wireless Underground Sensor Networks (WUSNs) constitute one of the promising application areas of the recently developed wireless sensor networking techniques. A WUSN is a specialized kind of Wireless Sensor Network (WSN) that mainly focuses on the use of sensors that communicate through soil. Recent models for the wireless underground communication channel have been proposed, but few field experiments have been realized to verify the accuracy of the models. The realization of field WUSN experiments proved to be extremely complex and time-consuming in comparison with the traditional wireless environment. To the best of our knowledge, this is the first work that proposes guidelines for the development of an outdoor WUSN testbed with the goals of improving the accuracy of and reducing the time required for WUSN experiments. Although the work mainly targets WUSNs, many of the presented practices can also be applied to generic WSN testbeds.

  3. A Study on Retrieval Algorithm of Black Water Aggregation in Taihu Lake Based on HJ-1 Satellite Images

    International Nuclear Information System (INIS)

    Lei, Zou; Bing, Zhang; Junsheng, Li; Qian, Shen; Fangfang, Zhang; Ganlin, Wang

    2014-01-01

    The phenomenon of black water aggregation (BWA) occurs in inland water when massive algal bodies aggregate, die, and react with the toxic sludge in certain climate conditions to deprive the water of oxygen. This process results in the deterioration of water quality and damage to the ecosystem. Because charge coupled device (CCD) camera data from the Chinese HJ environmental satellite shows high potential in monitoring BWA, we acquired four HJ-CCD images of Taihu Lake captured from 2009 to 2011 to study this phenomenon. The first study site was selected near the shore of Taihu Lake. We pre-processed the HJ-CCD images and analyzed the digital number (DN) gray values in the research area and in typical BWA areas. The results show that the DN values of the visible bands in BWA areas are obviously lower than those in the research areas. Moreover, we developed an empirical retrieval algorithm for BWA based on the DN mean values and variances of the research areas. Finally, we tested the accuracy of this empirical algorithm. The retrieval accuracies were 89.9%, 58.1%, 73.4%, and 85.5%, respectively, which demonstrates the efficiency of the empirical algorithm in retrieving the approximate distributions of BWA
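
    As a hypothetical illustration of an empirical, DN-statistics-based retrieval, the sketch below flags pixels whose visible-band digital numbers fall well below the study-area mean. The threshold form (mean minus k times the standard deviation) and the value of k are assumptions, not the fitted coefficients of the cited algorithm.

```python
# Illustrative sketch of an empirical BWA flag: mark pixels whose visible-band
# digital numbers fall well below the study-area statistics.  The threshold
# form (mean - k*std) and k are assumptions, not the paper's coefficients.
import numpy as np

def flag_bwa(dn_visible, k=1.5):
    """dn_visible: (bands, rows, cols) array of visible-band DN values for the
    study area.  A pixel is flagged when it is anomalously dark in every band."""
    mean = dn_visible.mean(axis=(1, 2), keepdims=True)
    std = dn_visible.std(axis=(1, 2), keepdims=True)
    dark = dn_visible < (mean - k * std)
    return np.all(dark, axis=0)          # boolean BWA mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene = rng.normal(120, 10, size=(3, 100, 100))
    scene[:, 40:50, 40:50] -= 40         # synthetic dark "black water" patch
    print(flag_bwa(scene).sum(), "pixels flagged")
```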

  4. Pre- and post-flight radiation performance evaluation of the space GPS receiver (SGR)

    International Nuclear Information System (INIS)

    Oldfield, M.K.; Underwood, C.I.; Unwin, M.J.; Asenek, V.; Harboe-Sorensen, R.

    1999-01-01

    SSTL (Surrey Satellite Technology Ltd), in collaboration with ESA/ESTEC, recently developed a state-of-the-art, low-cost GPS (Global Positioning System) receiver payload for use on small satellites. The Space GPS Receiver (SGR) will be flown on the TiungSAT-1 micro-satellite, the UoSAT-12 mini-satellite and ESA's PROBA satellite. The SGR payload is currently flying on the TMSAT micro-satellite in low Earth orbit (LEO) and has carried out autonomous on-board positioning whilst also providing an experimental test-bed for evaluating spacecraft attitude determination algorithms. In order to reduce development time and costs, the SGR consists solely of industry-standard COTS (commercial off-the-shelf) devices. This paper describes the ground-based radiation testing of several payload-critical COTS devices used in the SGR payload and describes its on-orbit performance. (authors)

  5. Implementation of a Wireless Time Distribution Testbed Protected with Quantum Key Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Bonior, Jason D [ORNL; Evans, Philip G [ORNL; Sheets, Gregory S [ORNL; Jones, John P [ORNL; Flynn, Toby H [ORNL; O' Neil, Lori Ross [Pacific Northwest National Laboratory (PNNL); Hutton, William [Pacific Northwest National Laboratory (PNNL); Pratt, Richard [Pacific Northwest National Laboratory (PNNL); Carroll, Thomas E. [Pacific Northwest National Laboratory (PNNL)

    2017-01-01

    Secure time transfer is critical for many time-sensitive applications. The Global Positioning System (GPS), which is often used for this purpose, has been shown to be susceptible to spoofing attacks. Quantum Key Distribution offers a way to securely generate encryption keys at two locations. Through careful use of this information it is possible to create a system that is more resistant to spoofing attacks. In this paper we describe our work to create a testbed which utilizes QKD and traditional RF links. This testbed will be used for the development of more secure and spoofing-resistant time distribution protocols.

  6. Phased Array Antenna Testbed Development at the NASA Glenn Research Center

    Science.gov (United States)

    Lambert, Kevin M.; Kubat, Gregory; Johnson, Sandra K.; Anzic, Godfrey

    2003-01-01

    Ideal phased array antennas offer advantages for communication systems, such as wide-angle scanning and multibeam operation, which can be utilized in certain NASA applications. However, physically realizable, electronically steered, phased array antennas introduce additional system performance parameters, which must be included in the evaluation of the system. The NASA Glenn Research Center (GRC) is currently conducting research to identify these parameters and to develop the tools necessary to measure them. One of these tools is a testbed where phased array antennas may be operated in an environment that simulates their use. This paper describes the development of the testbed and its use in characterizing a particular K-Band, phased array antenna.

  7. LTE-Advanced/WLAN testbed

    OpenAIRE

    Plaisner, Denis

    2017-01-01

    This thesis deals with the investigation and evaluation of communication using the LTE-Advanced and WiFi (IEEE 802.11n/ac) standards. For each standard, the error parameter EVM is examined. A universal workstation (testbed) is designed for working with the individual standards. This universal workstation is used to configure the transmitting and receiving equipment and to process and evaluate the transmitted signals. The Matlab environment was chosen for this work, through which the instruments used are controlled, such as...

  8. Satellite Ocean Aerosol Retrieval (SOAR) Algorithm Extension to S-NPP VIIRS as Part of the "Deep Blue" Aerosol Project

    Science.gov (United States)

    Sayer, A. M.; Hsu, N. C.; Lee, J.; Bettenhausen, C.; Kim, W. V.; Smirnov, A.

    2018-01-01

    The Suomi National Polar-Orbiting Partnership (S-NPP) satellite, launched in late 2011, carries the Visible Infrared Imaging Radiometer Suite (VIIRS) and several other instruments. VIIRS has similar characteristics to prior satellite sensors used for aerosol optical depth (AOD) retrieval, allowing the continuation of space-based aerosol data records. The Deep Blue algorithm has previously been applied to retrieve AOD from Sea-viewing Wide Field-of-view Sensor (SeaWiFS) and Moderate Resolution Imaging Spectroradiometer (MODIS) measurements over land. The SeaWiFS Deep Blue data set also included a SeaWiFS Ocean Aerosol Retrieval (SOAR) algorithm to cover water surfaces. As part of NASA's VIIRS data processing, Deep Blue is being applied to VIIRS data over land, and SOAR has been adapted from SeaWiFS to VIIRS for use over water surfaces. This study describes SOAR as applied in version 1 of NASA's S-NPP VIIRS Deep Blue data product suite. Several advances have been made since the SeaWiFS application, as well as changes to make use of the broader spectral range of VIIRS. A preliminary validation against Maritime Aerosol Network (MAN) measurements suggests a typical uncertainty on retrieved 550 nm AOD of order ±(0.03+10%), comparable to existing SeaWiFS/MODIS aerosol data products. Retrieved Ångström exponent and fine-mode AOD fraction are also well correlated with MAN data, with small biases and uncertainty similar to or better than SeaWiFS/MODIS products.

  9. A MIMO-OFDM Testbed for Wireless Local Area Networks

    Directory of Open Access Journals (Sweden)

    Conrat Jean-Marc

    2006-01-01

    Full Text Available We describe the design steps and final implementation of a MIMO OFDM prototype platform developed to enhance the performance of wireless LAN standards such as HiperLAN/2 and 802.11, using multiple transmit and multiple receive antennas. We first describe the channel measurement campaign used to characterize the indoor operational propagation environment, and analyze the influence of the channel on code design through a ray-tracing channel simulator. We also comment on some antenna and RF issues which are of importance for the final realization of the testbed. Multiple coding, decoding, and channel estimation strategies are discussed and their respective performance-complexity trade-offs are evaluated over the realistic channel obtained from the propagation studies. Finally, we present the design methodology, including cross-validation of the Matlab, C++, and VHDL components, and the final demonstrator architecture. We highlight the increased measured performance of the MIMO testbed over the single-antenna system.

  10. A Testbed Environment for Buildings-to-Grid Cyber Resilience Research and Development

    Energy Technology Data Exchange (ETDEWEB)

    Sridhar, Siddharth; Ashok, Aditya; Mylrea, Michael E.; Pal, Seemita; Rice, Mark J.; Gourisetti, Sri Nikhil Gup

    2017-09-19

    The Smart Grid is characterized by the proliferation of advanced digital controllers at all levels of its operational hierarchy from generation to end consumption. Such controllers within modern residential and commercial buildings enable grid operators to exercise fine-grained control over energy consumption through several emerging Buildings-to-Grid (B2G) applications. Though this capability promises significant benefits in terms of operational economics and improved reliability, cybersecurity weaknesses in the supporting infrastructure could be exploited to cause a detrimental effect and this necessitates focused research efforts on two fronts. First, the understanding of how cyber attacks in the B2G space could impact grid reliability and to what extent. Second, the development and validation of cyber-physical application-specific countermeasures that are complementary to traditional infrastructure cybersecurity mechanisms for enhanced cyber attack detection and mitigation. The PNNL B2G testbed is currently being developed to address these core research needs. Specifically, the B2G testbed combines high-fidelity buildings+grid simulators, industry-grade building automation and Supervisory Control and Data Acquisition (SCADA) systems in an integrated, realistic, and reconfigurable environment capable of supporting attack-impact-detection-mitigation experimentation. In this paper, we articulate the need for research testbeds to model various B2G applications broadly by looking at the end-to-end operational hierarchy of the Smart Grid. Finally, the paper not only describes the architecture of the B2G testbed in detail, but also addresses the broad spectrum of B2G resilience research it is capable of supporting based on the smart grid operational hierarchy identified earlier.

  11. Congestion control and routing over satellite networks

    Science.gov (United States)

    Cao, Jinhua

    Satellite networks and transmissions find their application in fields of computer communications, telephone communications, television broadcasting, transportation, space situational awareness systems and so on. This thesis mainly focuses on two networking issues affecting satellite networking: network congestion control and network routing optimization. Congestion, which leads to long queueing delays, packet losses or both, is a networking problem that has drawn the attention of many researchers. The goal of congestion control mechanisms is to ensure high bandwidth utilization while avoiding network congestion by regulating the rate at which traffic sources inject packets into a network. In this thesis, we propose a stable congestion controller using data-driven, safe switching control theory to improve the dynamic performance of satellite Transmission Control Protocol/Active Queue Management (TCP/AQM) networks. First, the stable region of the Proportional-Integral (PI) parameters for a nominal model is explored. Then, a PI controller, whose parameters are adaptively tuned by switching among members of a given candidate set, using observed plant data, is presented and compared with some classical AQM policy examples, such as Random Early Detection (RED) and fixed PI control. A new cost detectable switching law with an interval cost function switching algorithm, which improves the performance and also saves the computational cost, is developed and compared with a law commonly used in the switching control literature. Finite-gain stability of the system is proved. A fuzzy logic PI controller is incorporated as a special candidate to achieve good performance at all nominal points with the available set of candidate controllers. Simulations are presented to validate the theory. An efficient routing algorithm plays a key role in optimizing network resources. In this thesis, we briefly analyze Low Earth Orbit (LEO) satellite networks, review the Cross Entropy (CE
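
    A minimal sketch of a PI-type AQM controller of the kind discussed above: the drop/mark probability is updated from the deviation of the queue length from a reference. The gains and the toy queue dynamics are illustrative; the data-driven switching among candidate controllers and the fuzzy-logic candidate are not reproduced.

```python
# Minimal sketch of a PI-type AQM controller: the drop/mark probability is
# updated from the queue-length error.  Gains and queue dynamics are toy
# values; the thesis's switching and fuzzy candidates are not reproduced.
class PIAqm:
    def __init__(self, q_ref, kp=1e-4, ki=2e-5):
        self.q_ref, self.kp, self.ki = q_ref, kp, ki
        self.integral = 0.0
        self.p = 0.0                       # current drop probability

    def update(self, q_len, dt=1.0):
        err = q_len - self.q_ref           # deviation from the reference queue
        self.integral += err * dt
        self.p = min(max(self.kp * err + self.ki * self.integral, 0.0), 1.0)
        return self.p

if __name__ == "__main__":
    aqm, q = PIAqm(q_ref=200), 500.0
    for step in range(10):
        p = aqm.update(q)
        q = max(q - 4000.0 * p + 20.0, 0.0)   # toy queue dynamics
        print(f"step {step}: queue={q:6.1f} pkts, drop prob={p:.4f}")
```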

  12. A Dynamic Enhancement With Background Reduction Algorithm: Overview and Application to Satellite-Based Dust Storm Detection

    Science.gov (United States)

    Miller, Steven D.; Bankert, Richard L.; Solbrig, Jeremy E.; Forsythe, John M.; Noh, Yoo-Jeong; Grasso, Lewis D.

    2017-12-01

    This paper describes a Dynamic Enhancement Background Reduction Algorithm (DEBRA) applicable to multispectral satellite imaging radiometers. DEBRA uses ancillary information about the clear-sky background to reduce false detections of atmospheric parameters in complex scenes. Applied here to the detection of lofted dust, DEBRA enlists a surface emissivity database coupled with a climatological database of surface temperature to approximate the clear-sky equivalent signal for selected infrared-based multispectral dust detection tests. This background allows for suppression of false alarms caused by land surface features while retaining some ability to detect dust above those problematic surfaces. The algorithm is applicable to both day and nighttime observations and enables weighted combinations of dust detection tests. The results are provided quantitatively, as a detection confidence factor [0, 1], but are also readily visualized as enhanced imagery. Utilizing the DEBRA confidence factor as a scaling factor in false color red/green/blue imagery enables depiction of the targeted parameter in the context of the local meteorology and topography. In this way, the method holds utility to both automated clients and human analysts alike. Examples of DEBRA performance from notable dust storms and comparisons against other detection methods and independent observations are presented.
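
    The following sketch illustrates the visualization idea of scaling false-color imagery by a [0, 1] detection-confidence field. The color assignment and blending rule are illustrative choices, not the DEBRA enhancement itself.

```python
# Sketch of using a [0, 1] detection-confidence field to blend a dust
# enhancement into background imagery, in the spirit of the DEBRA
# visualization.  Color choice and compositing rule are illustrative.
import numpy as np

def confidence_composite(background_rgb, confidence, dust_color=(1.0, 1.0, 0.0)):
    """background_rgb: (rows, cols, 3) float image in [0, 1];
    confidence: (rows, cols) detection confidence in [0, 1].
    High-confidence pixels are pushed toward the dust color while the
    underlying scene remains visible elsewhere."""
    c = np.clip(confidence, 0.0, 1.0)[..., None]
    dust = np.broadcast_to(np.asarray(dust_color, dtype=float),
                           background_rgb.shape)
    return (1.0 - c) * background_rgb + c * dust

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bg = rng.random((4, 4, 3))
    conf = np.zeros((4, 4)); conf[1:3, 1:3] = 0.8
    print(confidence_composite(bg, conf)[1, 1])   # strongly dust-tinted pixel
```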

  13. A demonstration of remote survey and characterization of a buried waste site using the SRIP [Soldier Robot Interface Project] testbed

    International Nuclear Information System (INIS)

    Burks, B.L.; Richardson, B.S.; Armstrong, G.A.; Hamel, W.R.; Jansen, J.F.; Killough, S.M.; Thompson, D.H.; Emery, M.S.

    1990-01-01

    During FY 1990, the Oak Ridge National Laboratory (ORNL) supported the Department of Energy (DOE) Environmental Restoration and Waste Management (ER&WM) Office of Technology Development through several projects including the development of a semiautonomous survey of a buried waste site using a remotely operated all-terrain robotic testbed borrowed from the US Army. The testbed was developed for the US Army's Human Engineering Laboratory (HEL) for the US Army's Soldier Robot Interface Project (SRIP). Initial development of the SRIP testbed was performed by a team including ORNL, HEL, Tooele Army Depot, and Odetics, Inc., as an experimental testbed for a variety of human factors issues related to military applications of robotics. The SRIP testbed was made available to the DOE and ORNL for the further development required for a remote landfill survey. The robot was modified extensively, equipped with environmental sensors, and used to demonstrate an automated remote survey of Solid Waste Storage Area No. 3 (SWSA 3) at ORNL on Tuesday, September 18, 1990. Burial trenches in this area containing contaminated materials were covered with soil nearly twenty years ago. This paper describes the SRIP testbed and work performed in FY 1990 to demonstrate a semiautonomous landfill survey at ORNL. 5 refs

  14. The Segmented Aperture Interferometric Nulling Testbed (SAINT) I: overview and air-side system description

    Science.gov (United States)

    Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter; Ballard, Marlin; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; Shiri, Ron

    2016-07-01

    This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify end-to-end high-contrast starlight suppression performance. This pathfinder testbed will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.

  15. On-wire lithography-generated molecule-based transport junctions: a new testbed for molecular electronics.

    Science.gov (United States)

    Chen, Xiaodong; Jeon, You-Moon; Jang, Jae-Won; Qin, Lidong; Huo, Fengwei; Wei, Wei; Mirkin, Chad A

    2008-07-02

    On-wire lithography (OWL) fabricated nanogaps are used as a new testbed to construct molecular transport junctions (MTJs) through the assembly of thiolated molecular wires across a nanogap formed between two Au electrodes. In addition, we show that one can use OWL to rapidly characterize a MTJ and optimize gap size for two molecular wires of different dimensions. Finally, we have used this new testbed to identify unusual temperature-dependent transport mechanisms for α,ω-dithiol terminated oligo(phenylene ethynylene).

  16. Real-Time Simulation and Hardware-in-the-Loop Testbed for Distribution Synchrophasor Applications

    Directory of Open Access Journals (Sweden)

    Matthias Stifter

    2018-04-01

    Full Text Available With the advent of Distribution Phasor Measurement Units (D-PMUs) and Micro-Synchrophasors (Micro-PMUs), situational awareness in power distribution systems is going to the next level using time-synchronization. However, designing, analyzing, and testing such accurate measurement devices is still challenging. Due to the lack of available knowledge and sufficient history for synchrophasor applications at the power distribution level, realistic simulation and validation environments are essential for D-PMU development and deployment. This paper presents a vendor-agnostic PMU real-time simulation and hardware-in-the-loop (PMU-RTS-HIL) testbed, which supports the validation and study of multiple PMUs. The network of real and virtual PMUs was built in a fully time-synchronized environment for the validation of PMU applications. The proposed testbed also includes an emulated communication network (CNS) layer to replicate the bandwidth, packet-loss, and collision conditions inherent to PMU data streams. Experimental results demonstrate the flexibility and scalability of the developed PMU-RTS-HIL testbed by producing large amounts of measurements under typical normal and abnormal distribution grid operation conditions.

  17. Algorithm integration using ADL (Algorithm Development Library) for improving CrIMSS EDR science product quality

    Science.gov (United States)

    Das, B.; Wilson, M.; Divakarla, M. G.; Chen, W.; Barnet, C.; Wolf, W.

    2013-05-01

    Algorithm Development Library (ADL) is a framework that mimics the operational system IDPS (Interface Data Processing Segment) that is currently being used to process data from instruments aboard the Suomi National Polar-orbiting Partnership (S-NPP) satellite. The satellite was launched successfully in October 2011. The Cross-track Infrared and Microwave Sounder Suite (CrIMSS) consists of the Advanced Technology Microwave Sounder (ATMS) and Cross-track Infrared Sounder (CrIS) instruments that are on board S-NPP. These instruments will also be on board JPSS (Joint Polar Satellite System), which will be launched in early 2017. The primary products of the CrIMSS Environmental Data Record (EDR) include global atmospheric vertical temperature, moisture, and pressure profiles (AVTP, AVMP and AVPP) and the Ozone IP (Intermediate Product from CrIS radiances). Several algorithm updates have recently been proposed by CrIMSS scientists that include fixes to the handling of forward modeling errors, a more conservative identification of clear scenes, indexing corrections for daytime products, and relaxed constraints between surface temperature and air temperature for daytime land scenes. We have integrated these improvements into the ADL framework. This work compares the results from the ADL emulation of the future IDPS system, incorporating all the suggested algorithm updates, with the current official processing results through qualitative and quantitative evaluations. The results show that these algorithm updates improve science product quality.

  18. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Loop-scale Testbed Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Melin, Alexander M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kisner, Roger A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging to design and operate. Extreme environments limit the options for sensors and actuators and degrade their performance. Because sensors and actuators are necessary for feedback control, these limitations mean that designing embedded instrumentation and control systems for the challenging environments of nuclear reactors requires advanced technical solutions that are not available commercially. This report details the development of a testbed that will be used for cross-cutting embedded instrumentation and control research for nuclear power applications. This research is funded by the Department of Energy's Nuclear Energy Enabling Technology program's Advanced Sensors and Instrumentation topic. The design goal of the loop-scale testbed is to build a low-temperature pump that utilizes magnetic bearings and will be incorporated into a water loop to test control system performance and self-sensing techniques. Specifically, this testbed will be used to analyze control system performance in response to nonlinear and cross-coupling fluid effects between the shaft axes of motion, rotordynamics and gyroscopic effects, and impeller disturbances. This testbed will also be used to characterize the performance losses when using self-sensing position measurement techniques. Active magnetic bearings are a technology that can reduce failures and maintenance costs in nuclear power plants. They are particularly relevant to liquid salt reactors that operate at high temperatures (700 °C). Pumps used in the extreme environment of liquid salt reactors present many engineering challenges that can be overcome with magnetic bearings and their associated embedded instrumentation and control. This report will give details of the mechanical design and electromagnetic design of the loop-scale embedded instrumentation and control testbed.

  19. An Adaptive Tradeoff Algorithm for Multi-issue SLA Negotiation

    Science.gov (United States)

    Son, Seokho; Sim, Kwang Mong

    Since participants in a Cloud may be independent bodies, mechanisms are necessary for resolving different preferences in leasing Cloud services. Whereas there are currently mechanisms that support service-level agreement negotiation, there is little or no support for concurrent price and timeslot negotiation for Cloud service reservations. For concurrent price and timeslot negotiation, a tradeoff algorithm to generate and evaluate proposals consisting of both a price proposal and a timeslot proposal is necessary. The contribution of this work is thus to design an adaptive tradeoff algorithm for a multi-issue negotiation mechanism. The tradeoff algorithm, referred to as "adaptive burst mode", is especially designed to increase negotiation speed and total utility and to reduce computational load by adaptively generating a concurrent set of proposals. The empirical results obtained from simulations carried out using a testbed suggest that, due to the concurrent price and timeslot negotiation mechanism with the adaptive tradeoff algorithm: 1) both agents achieve the best performance in terms of negotiation speed and utility; and 2) the number of evaluations of each proposal is comparatively lower than in the previous scheme (burst-N).
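
    A hedged sketch of generating a burst of concurrent proposals lying on roughly the same aggregate utility level, in the spirit of the adaptive burst mode. The additive utility model, the weights, and the sweep over the price/timeslot trade-off are assumptions for illustration only.

```python
# Sketch of a "burst" of concurrent (price, timeslot) proposals lying on the
# same aggregate utility level.  The additive utility weights and the sweep
# over the feasible trade-off range are illustrative assumptions.
def burst_proposals(target_utility, n, w_price=0.6, w_time=0.4):
    """Return n (price_utility, time_utility) pairs whose weighted sum equals
    target_utility, spreading the tradeoff across the feasible range."""
    lo = max(0.0, (target_utility - w_time) / w_price)   # smallest feasible price utility
    hi = min(1.0, target_utility / w_price)              # largest feasible price utility
    proposals = []
    for i in range(n):
        u_price = lo + (hi - lo) * i / max(n - 1, 1)
        u_time = (target_utility - w_price * u_price) / w_time
        proposals.append((round(u_price, 3), round(u_time, 3)))
    return proposals

if __name__ == "__main__":
    # e.g. an agent conceding to an aggregate utility level of 0.7
    for p in burst_proposals(0.7, n=5):
        print(p)
```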

  20. Proceedings from the 2nd International Symposium on Formation Flying Missions and Technologies

    Science.gov (United States)

    2004-01-01

    Topics discussed include: The Stellar Imager (SI) "Vision Mission"; First Formation Flying Demonstration Mission Including on Flight Nulling; Formation Flying X-ray Telescope in L2 Orbit; SPECS: The Kilometer-baseline Far-IR Interferometer in NASA's Space Science Roadmap Presentation; A Tight Formation for Along-track SAR Interferometry; Realization of the Solar Power Satellite using the Formation Flying Solar Reflector; SIMBOL-X : Formation Flying for High-Energy Astrophysics; High Precision Optical Metrology for DARWIN; Close Formation Flight of Micro-Satellites for SAR Interferometry; Station-Keeping Requirements for Astronomical Imaging with Constellations of Free-Flying Collectors; Closed-Loop Control of Formation Flying Satellites; Formation Control for the MAXIM Mission; Precision Formation Keeping at L2 Using the Autonomous Formation Flying Sensor; Robust Control of Multiple Spacecraft Formation Flying; Virtual Rigid Body (VRB) Satellite Formation Control: Stable Mode-Switching and Cross-Coupling; Electromagnetic Formation Flight (EMFF) System Design, Mission Capabilities, and Testbed Development; Navigation Algorithms for Formation Flying Missions; Use of Formation Flying Small Satellites Incorporating OISL's in a Tandem Cluster Mission; Semimajor Axis Estimation Strategies; Relative Attitude Determination of Earth Orbiting Formations Using GPS Receivers; Analysis of Formation Flying in Eccentric Orbits Using Linearized Equations of Relative Motion; Conservative Analytical Collision Probabilities for Orbital Formation Flying; Equations of Motion and Stability of Two Spacecraft in Formation at the Earth/Moon Triangular Libration Points; Formations Near the Libration Points: Design Strategies Using Natural and Non-Natural Ares; An Overview of the Formation and Attitude Control System for the Terrestrial Planet Finder Formation Flying Interferometer; GVE-Based Dynamics and Control for Formation Flying Spacecraft; GNC System Design for a New Concept of X

  1. Improving Flight Software Module Validation Efforts : a Modular, Extendable Testbed Software Framework

    Science.gov (United States)

    Lange, R. Connor

    2012-01-01

    Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities, and destinations, they all have something in common. All of the components of these spacecraft had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time-consuming process. Thankfully, since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively and can validate SMAP FSW modules quicker and test them more rigorously. As a result of the various benefits of automating much of the testing process, management is considering the use of these automated tools in future FSW validation efforts.

  2. Torpedo and countermeasures modelling in the Torpedo Defence System Testbed

    NARCIS (Netherlands)

    Benders, F.P.A.; Witberg, R.R.; Grootendorst, H.J.

    2002-01-01

    Several years ago, TNO-FEL started the development of the Torpedo Defence System Testbed (TDSTB) based on the TORpedo SIMulation (TORSIM) model and the Maritime Operations Simulation and Evaluation System (MOSES). MOSES provides the simulation and modelling environment for the evaluation and

  3. Development of Fast Error Compensation Algorithm for Integrated Inertial-Satellite Navigation System of Small-size Unmanned Aerial Vehicles in Complex Environment

    Directory of Open Access Journals (Sweden)

    A. V. Fomichev

    2015-01-01

    Full Text Available In accordance with the structural features of a small-size unmanned aerial vehicle (UAV), and considering the feasibility of this project, the article studies an integrated inertial-satellite navigation system (INS). The INS algorithm development is based on the method of indirect filtration and the principle of a loosely coupled combination of output data on UAV position and velocity. Data on position and velocity are provided by the strapdown inertial navigation system (SINS) and the satellite navigation system (GPS). The difference between the output flows of measured position and velocity provided by the SINS and GPS is used to evaluate SINS errors by means of the basic algorithm of Kalman filtering; the outputs of the SINS are then corrected. The INS possesses the following advantages: a simpler mathematical model of Kalman filtering, high reliability, two independently operating navigation systems, and high redundancy of available navigation information. But in the case of a loosely coupled scheme, the INS can meet the requirements of high precision and reliability of navigation only when the SINS and GPS operating conditions are normal all the time. The proposed INS is intended for a UAV moving in a complex environment with obstacles, severe natural and climatic conditions, etc., in which the UAV cannot be expected to receive GPS signals reliably. In order to solve this problem, an algorithm for rapid compensation of INS errors was developed, which can effectively solve the problem of navigation system failure when GPS signals are unavailable. Since it is almost impossible to obtain data on the real trajectory in practice, in the course of simulation, in accordance with the kinematic model of the UAV and the complex terrain environment, a flight path generator is used to produce the flight path. The errors in positions and velocities are considered as an indicator of the INS effectiveness. The results
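
    The loosely coupled correction described above can be illustrated with a one-dimensional (position/velocity) error-state Kalman filter: the SINS-minus-GPS difference is the measurement, the filter estimates the SINS error, and the SINS output is corrected with that estimate. The error dynamics, noise levels, and the rapid-compensation logic for GPS outages are simplified assumptions.

```python
# Minimal 1-D sketch of the loosely coupled idea: a Kalman filter estimates the
# SINS position/velocity error from the SINS-minus-GPS difference, and the SINS
# output is corrected with that estimate.  Dynamics and noise are toy values.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])           # error-state transition (pos, vel)
H = np.eye(2)                                   # SINS-GPS difference observes both
Q = np.diag([0.01, 0.05])                       # process noise (error drift growth)
R = np.diag([25.0, 0.25])                       # GPS noise: 5 m, 0.5 m/s (1-sigma)

x = np.zeros(2)                                 # estimated SINS error
P = np.eye(2)

def kf_step(x, P, z):
    """z: [pos_sins - pos_gps, vel_sins - vel_gps]; returns updated (x, P)."""
    x, P = F @ x, F @ P @ F.T + Q               # predict error growth
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (z - H @ x)                     # update with the measured difference
    P = (np.eye(2) - K @ H) @ P
    return x, P

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_err = np.array([10.0, 0.5])            # illustrative initial SINS error
    for _ in range(20):
        true_err = F @ true_err                 # SINS error drifts over time
        z = true_err + rng.normal(0, [5.0, 0.5])
        x, P = kf_step(x, P, z)
    print("true error:", true_err, " estimated:", x)   # corrected output = SINS - x
```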

  4. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs : Dallas testbed analysis plan.

    Science.gov (United States)

    2016-06-16

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (mo...

  5. GEO light imaging national testbed (GLINT) heliostat design and testing status

    Science.gov (United States)

    Thornton, Marcia A.; Oldenettel, Jerry R.; Hult, Dane W.; Koski, Katrina; Depue, Tracy; Cuellar, Edward L.; Balfour, Jim; Roof, Morey; Yarger, Fred W.; Newlin, Greg; Ramzel, Lee; Buchanan, Peter; Mariam, Fesseha G.; Scotese, Lee

    2002-01-01

    The GEO Light Imaging National Testbed (GLINT) will use three laser beams producing simultaneous interference fringes to illuminate satellites in geosynchronous earth orbit (GEO). The reflected returns will be recorded using a large 4,000 m2 'light bucket' receiver. This imaging methodology is termed Fourier Telescopy. A major component of the 'light bucket' will be an array of 40 - 80 heliostats. Each heliostat will have a mirrored surface area of 100 m2 mounted on a rigid truss structure which is supported by an A-frame. The truss structure attaches to the torque tube elevation drive and the A-frame structure rests on an azimuth ring that could provide nearly full coverage of the sky. The heliostat is designed to operate in 15 mph winds with jitter of less than 500 microradians peak-to-peak. One objective of the design was to minimize receiver cost to the maximum extent possible while maintaining GLINT system performance specifications. The mechanical structure weighs approximately seven tons and is a simple fabricated steel framework. A prototype heliostat has been assembled at Stallion Range Center, White Sands Missile Range, New Mexico and is being tested under a variety of weather and operational conditions. The preliminary results of that testing will be presented as well as some finite element model analyses that were performed to predict the performance of the structure.

  6. The Day-1 GPM Combined Precipitation Algorithm: IMERG

    Science.gov (United States)

    Huffman, G. J.; Bolvin, D. T.; Braithwaite, D.; Hsu, K.; Joyce, R.; Kidd, C.; Sorooshian, S.; Xie, P.

    2012-12-01

    The Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (GPM) mission (IMERG) algorithm will provide the at-launch combined-sensor precipitation dataset being produced by the U.S. GPM Science Team. IMERG is being developed as a unified U.S. algorithm that takes advantage of strengths in three current U.S. algorithms: - the TRMM Multi-satellite Precipitation Analysis (TMPA), which addresses inter-satellite calibration of precipitation estimates and monthly scale combination of satellite and gauge analyses; - the CPC Morphing algorithm with Kalman Filtering (KF-CMORPH), which provides quality-weighted time interpolation of precipitation patterns following storm motion; and - the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks using a Cloud Classification System (PERSIANN-CCS), which provides a neural-network-based scheme for generating microwave-calibrated precipitation estimates from geosynchronous infrared brightness temperatures, and filters out some non-raining cold clouds. The goal is to provide a long-term, fine-scale record of global precipitation from the entire constellation of precipitation-relevant satellite sensors, with input from surface precipitation gauges. The record will begin January 1998 at the start of the Tropical Rainfall Measuring Mission (TRMM) and extend as GPM records additional data. Although homogeneity is considered desirable, the use of diverse and evolving data sources works against the strict long-term homogeneity that characterizes a Climate Data Record (CDR). This talk will briefly review the design requirements for IMERG, including multiple runs at different latencies (most likely around 4 hours, 12 hours, and 2 months after observation time), various intermediate data fields as part of the IMERG data file, and the plans to bring up IMERG with calibration by TRMM initially, transitioning to GPM when its individual-sensor precipitation algorithms are fully functional

  7. Visual attitude propagation for small satellites

    Science.gov (United States)

    Rawashdeh, Samir A.

    As electronics become smaller and more capable, it has become possible to conduct meaningful and sophisticated satellite missions in a small form factor. However, the capability of small satellites and the range of possible applications are limited by the capabilities of several technologies, including attitude determination and control systems. This dissertation evaluates the use of image-based visual attitude propagation as a complement or alternative to other attitude determination technologies that are suitable for miniature satellites. The concept lies in using miniature cameras to track image features across frames and extracting the underlying rotation. The problem of visual attitude propagation as a small satellite attitude determination system is addressed from several aspects: related work, algorithm design, hardware and performance evaluation, possible applications, and on-orbit experimentation. These areas of consideration reflect the organization of this dissertation. A "stellar gyroscope" is developed, which is a visual star-based attitude propagator that uses the relative motion of stars in an imager's field of view to infer the attitude changes. The device generates spacecraft relative attitude estimates in three degrees of freedom. Algorithms to perform the star detection, correspondence, and attitude propagation are presented. The Random Sample Consensus (RANSAC) approach is applied to the correspondence problem to successfully pair stars across frames while mitigating false-positive and false-negative star detections. This approach provides tolerance to the noise levels expected in using miniature optics and no baffling, and the noise caused by radiation dose on orbit. The hardware design and algorithms are validated using test images of the night sky. The application of the stellar gyroscope as part of a CubeSat attitude determination and control system is described. The stellar gyroscope is used to augment a MEMS gyroscope attitude propagation
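
    The RANSAC correspondence step can be illustrated with a small sketch that estimates the frame-to-frame rotation from candidate star-vector pairs while rejecting false matches; the synthetic star field, inlier threshold, and iteration count are arbitrary choices, not values from the dissertation.

      # RANSAC estimation of the rotation between two star fields (toy sketch).
      # Candidate pairs are unit vectors in frame A and frame B; some are mismatches.
      import numpy as np

      rng = np.random.default_rng(0)

      def kabsch(a, b):
          # Best-fit rotation R with R @ a_i ~ b_i (SVD / Kabsch method).
          u, _, vt = np.linalg.svd(b.T @ a)
          d = np.sign(np.linalg.det(u @ vt))
          return u @ np.diag([1.0, 1.0, d]) @ vt

      def rot_z(t):
          return np.array([[np.cos(t), -np.sin(t), 0.0],
                           [np.sin(t),  np.cos(t), 0.0],
                           [0.0, 0.0, 1.0]])

      # Synthetic truth: random star vectors rotated by a known small rotation.
      stars_a = rng.normal(size=(30, 3))
      stars_a /= np.linalg.norm(stars_a, axis=1, keepdims=True)
      R_true = rot_z(0.05)
      stars_b = stars_a @ R_true.T
      stars_b[:5] = rng.normal(size=(5, 3))            # 5 false matches
      stars_b /= np.linalg.norm(stars_b, axis=1, keepdims=True)

      best_R, best_inliers = None, 0
      for _ in range(200):
          idx = rng.choice(len(stars_a), 3, replace=False)
          R = kabsch(stars_a[idx], stars_b[idx])
          err = np.arccos(np.clip(np.sum((stars_a @ R.T) * stars_b, axis=1), -1, 1))
          inliers = np.sum(err < 1e-3)                 # angular gate (arbitrary)
          if inliers > best_inliers:
              best_R, best_inliers = R, inliers

      print("inliers:", best_inliers, "rotation error:",
            np.linalg.norm(best_R - R_true))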

  8. Evaluation of Unmanned Aircraft Systems (UAS) for Weather and Climate using the Multi-testbed approach

    Science.gov (United States)

    Baker, B.; Lee, T.; Buban, M.; Dumas, E. J.

    2017-12-01

    Evaluation of Unmanned Aircraft Systems (UAS) for Weather and Climate using the Multi-testbed approach. C. Bruce Baker (1), Ed Dumas (1,2), Temple Lee (1,2), Michael Buban (1,2); (1) NOAA ARL, Atmospheric Turbulence and Diffusion Division, Oak Ridge, TN; (2) Oak Ridge Associated Universities, Oak Ridge, TN. The development of small Unmanned Aerial System (sUAS) testbeds that can be used to validate, integrate, calibrate and evaluate new technology and sensors for routine boundary layer research, validation of operational weather models, improvement of model parameterizations, and recording observations within high-impact storms is important for understanding the impact of using sUASs routinely as a new observing platform. The goal of the multi-testbed approach is to build a robust set of protocols to assess the cost and operational feasibility of unmanned observations for routine applications using various combinations of sUAS aircraft and sensors in different locations and field experiments. All of these observational testbeds serve different community needs, but they also use a diverse suite of methodologies for calibration and evaluation of different sensors and platforms for severe weather and boundary layer research. The primary focus will be to evaluate meteorological sensor payloads to measure thermodynamic parameters and define surface characteristics with visible, IR, and multi-spectral cameras. This evaluation will lead to recommendations for sensor payloads for VTOL and fixed-wing sUAS.

  9. Enhancement of Satellite Image Compression Using a Hybrid (DWT-DCT) Algorithm

    Science.gov (United States)

    Shihab, Halah Saadoon; Shafie, Suhaidi; Ramli, Abdul Rahman; Ahmad, Fauzan

    2017-12-01

    Discrete Cosine Transform (DCT) and Discrete Wavelet Transform (DWT) image compression techniques have been utilized in most of the earth observation satellites launched during the last few decades. However, these techniques have some issues that should be addressed. The DWT method has proven to be more efficient than DCT for several reasons. Nevertheless, the DCT can be exploited to improve high-resolution satellite image compression when combined with the DWT technique. Hence, a proposed hybrid (DWT-DCT) method was developed and implemented in the current work, simulating an image compression system on board a small remote sensing satellite, with the aim of achieving a higher compression ratio to decrease the onboard data storage and the downlink bandwidth, while avoiding further complex levels of DWT. This method also succeeded in maintaining the reconstructed satellite image quality through replacing the standard forward DWT thresholding and quantization processes with an alternative process that employed the zero-padding technique, which also helped to reduce the processing time of DWT compression. The DCT, DWT and the proposed hybrid methods were implemented individually, for comparison, on three LANDSAT 8 images, using the MATLAB software package. A comparison was also made between the proposed method and three other previously published hybrid methods. The evaluation of all the objective and subjective results indicated the feasibility of using the proposed hybrid (DWT-DCT) method to enhance the image compression process on board satellites.
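
    A generic sketch of the hybrid idea is given below: one DWT level followed by a DCT on the approximation subband, with simple hard thresholding standing in for the paper's zero-padding scheme; the wavelet, threshold fractions, and test image are assumptions.

      # Generic hybrid DWT-DCT compression sketch (not the exact published pipeline).
      # One wavelet level splits the image; the low-frequency subband is further
      # decorrelated with a DCT, and small coefficients are discarded.
      import numpy as np
      import pywt
      from scipy.fft import dctn, idctn

      image = np.random.rand(256, 256)              # stand-in for a satellite scene

      # Forward: one DWT level, then DCT on the approximation subband.
      cA, (cH, cV, cD) = pywt.dwt2(image, "haar")
      cA_dct = dctn(cA, norm="ortho")

      def hard_threshold(c, keep=0.1):
          # Keep only the largest |coefficients| (fraction 'keep'), zero the rest.
          t = np.quantile(np.abs(c), 1.0 - keep)
          return np.where(np.abs(c) >= t, c, 0.0)

      cA_dct = hard_threshold(cA_dct, keep=0.2)
      cH, cV, cD = (hard_threshold(c, keep=0.05) for c in (cH, cV, cD))

      # Inverse: undo the DCT, then the DWT.
      recon = pywt.idwt2((idctn(cA_dct, norm="ortho"), (cH, cV, cD)), "haar")

      mse = np.mean((image - recon) ** 2)
      print("reconstruction MSE:", mse)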

  10. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    Science.gov (United States)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  11. Investigation of Lake Water Salinity by Using Four-Band Salinity Algorithm on WorldView-2 Satellite Image for a Saline Industrial Lake

    Science.gov (United States)

    Budakoǧlu, Murat; Karaman, Muhittin; Damla Uça Avcı, Z.; Kumral, Mustafa; Geredeli (Yılmaz), Serpil

    2014-05-01

    The salinity of a lake is an important characteristic, since such lakes are potentially industrial lakes and the degree of salinity can be used to assess mineral resources and to manage production. In the literature, there are many studies using satellite data for salinity-related lake investigations, such as determining the salinity distribution and detecting potential freshwater sources in regions of lower salt concentration. Lake Acigol, located in Denizli (Turkey), was selected as the study area. With its saline environment, it is the major sodium sulphate production resource of Turkey. In this study, remote sensing data and data from a field study were used and correlated; remote sensing is an efficient tool to monitor and analyze lake properties when used as a complement to field data. WorldView-2 satellite data, which consist of 8 bands, were used. At the same time as the satellite data acquisition, a field study was conducted to collect salinity values at 17 points on the lake using a YSI 556 multiparameter probe. The values were measured as the amount of salinity in grams per kilogram of solution and reported in ppt. The values varied from 34 ppt to 40.1 ppt with an average of 38.056 ppt; on the thalassic series, the lake was in a mixoeuhaline state at the time of acquisition. As a first step, ATCOR correction was performed on the satellite image for atmospheric correction. There were some clouds over the lake, hence it was decided to continue the study using the 12 sampling points that were clear on the image. Then, for each sampling point, a spectral value was obtained by calculating the average over an 11x11 pixel neighborhood. The relation between the spectral reflectance values and the salinity was investigated. The 4-band algorithm, which was used by Wei (2012) for determination of the chlorophyll-a distribution in highly turbid coastal environments, was applied: Salinity ∝ (λi^-1 / λj^-1) * (λk^-1 / λm^-1) (i
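
    A minimal sketch of fitting such a band-ratio model is shown below; the band indices, reflectances, and salinity values are entirely synthetic placeholders, since the actual bands and coefficients come from the regression against the field data.

      # Four-band ratio regression sketch: salinity ~ a * X + b, where
      # X = (R_j / R_i) * (R_m / R_k) is built from mean reflectances around each
      # sampling point. Band choices and all numbers here are purely illustrative.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 12                                   # number of usable sampling points
      R = {band: rng.uniform(0.01, 0.2, n) for band in ("b2", "b3", "b5", "b7")}

      # Band-ratio predictor (the reciprocal-reflectance form collapses to ratios).
      X = (R["b3"] / R["b2"]) * (R["b7"] / R["b5"])

      # Synthetic "in situ" salinity loosely tied to the predictor, for illustration.
      salinity_insitu = 36.0 + 1.5 * (X - X.mean()) + rng.normal(0, 0.3, n)

      # Ordinary least-squares fit of salinity against the predictor.
      a, b = np.polyfit(X, salinity_insitu, 1)
      pred = a * X + b
      r2 = 1 - np.sum((salinity_insitu - pred) ** 2) / \
               np.sum((salinity_insitu - salinity_insitu.mean()) ** 2)
      print(f"salinity = {a:.2f} * X + {b:.2f}, r^2 = {r2:.2f}")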

  12. A demand assignment control in international business satellite communications network

    Science.gov (United States)

    Nohara, Mitsuo; Takeuchi, Yoshio; Takahata, Fumio; Hirata, Yasuo

    An experimental system is being developed for use in an international business satellite (IBS) communications network based on demand-assignment (DA) and TDMA techniques. This paper discusses its system design, in particular from the viewpoints of a network configuration, a DA control, and a satellite channel-assignment algorithm. A satellite channel configuration is also presented along with a tradeoff study on transmission rate, HPA output power, satellite resource efficiency, service quality, and so on.

  13. Conceptual Design and Cost Estimate of a Subsonic NASA Testbed Vehicle (NTV) for Aeronautics Research

    Science.gov (United States)

    Nickol, Craig L.; Frederic, Peter

    2013-01-01

    A conceptual design and cost estimate for a subsonic flight research vehicle designed to support NASA's Environmentally Responsible Aviation (ERA) project goals is presented. To investigate the technical and economic feasibility of modifying an existing aircraft, a highly modified Boeing 717 was developed for maturation of technologies supporting the three ERA project goals of reduced fuel burn, noise, and emissions. This modified 717 utilizes midfuselage mounted modern high bypass ratio engines in conjunction with engine exhaust shielding structures to provide a low noise testbed. The testbed also integrates a natural laminar flow wing section and active flow control for the vertical tail. An eight year program plan was created to incrementally modify and test the vehicle, enabling the suite of technology benefits to be isolated and quantified. Based on the conceptual design and programmatic plan for this testbed vehicle, a full cost estimate of $526M was developed, representing then-year dollars at a 50% confidence level.

  14. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    Science.gov (United States)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug and play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently

  15. Multisensor satellite data for water quality analysis and water pollution risk assessment: decision making under deep uncertainty with fuzzy algorithm in framework of multimodel approach

    Science.gov (United States)

    Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim

    2017-10-01

    A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in a multi-model approach to socio-ecological risk assessment is formally defined. A method for utilizing observation, measurement and modeling data within the multi-model framework is described. The methodology and models for risk assessment within a decision support approach are defined and described. A method of water quality assessment using satellite observation data, based on analysis of the spectral reflectance of aquifers, is described. Spectral signatures of freshwater bodies and offshore waters are analyzed. Correlations between spectral reflectance, pollution and selected water quality parameters are analyzed and quantified. Data from the MODIS, MISR, AIRS and Landsat sensors acquired in 2002-2014 were utilized and verified by in-field spectrometry and laboratory measurements. A fuzzy logic based approach for decision support on water quality degradation risk is discussed: the decision on the water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters. Data from satellite observations, field measurements and modeling are utilized within the proposed approach. It is shown that this algorithm allows estimating the water quality degradation rate and pollution risks. Problems of constructing the spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
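
    A toy sketch of the fuzzy decision step, using made-up triangular membership functions over a single normalized pollution indicator (the approach described above combines several uncertain parameters):

      # Fuzzy classification of water quality from a normalized pollution index
      # in [0, 1]. Membership functions and category names are illustrative only.
      import numpy as np

      def tri(x, a, b, c):
          # Triangular membership function peaking at b.
          return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                       (c - x) / (c - b + 1e-12)), 0.0)

      def classify(index):
          memberships = {
              "good":     tri(index, -0.2, 0.0, 0.4),
              "degraded": tri(index,  0.2, 0.5, 0.8),
              "polluted": tri(index,  0.6, 1.0, 1.2),
          }
          # Decision rule: take the category with the largest membership.
          category = max(memberships, key=memberships.get)
          return category, memberships

      category, m = classify(0.55)
      print(category, {k: round(float(v), 2) for k, v in m.items()})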

  16. Simulation of tsunami effects on sea surface salinity using MODIS satellite data

    International Nuclear Information System (INIS)

    Ramlan, N E F; Genderen, J van; Hashim, M; Marghany, M

    2014-01-01

    Remote sensing technology has been recognized as a powerful tool for environmental disaster studies. Ocean surface salinity is considered a major element of the marine environment. In this study, we simulate the 2004 tsunami's impact on a physical ocean parameter using the least squares algorithm to retrieve sea surface salinity (SSS) from MODIS satellite data. The accuracy of this work has been examined using the root mean square (RMS) bias of the sea surface salinity retrieved from MODIS satellite data. The study shows a close relationship between the in situ measurements and the least squares algorithm, with a high r2 of 0.95 and an RMS bias of ±0.9 psu. In conclusion, the least squares algorithm can be used to retrieve SSS from MODIS satellite data during a tsunami event
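
    A generic illustration of a least-squares retrieval of this kind, regressing matched in situ salinity against a few satellite band values; the band selection, coefficients, and data are placeholders rather than the published model:

      # Least-squares SSS model: SSS = c0 + c1*B1 + c2*B2 + c3*B3, fitted to matched
      # in situ samples; the band choice and synthetic data are illustrative only.
      import numpy as np

      rng = np.random.default_rng(2)
      n = 50
      bands = rng.uniform(0.0, 0.3, size=(n, 3))            # MODIS-like reflectances
      sss_insitu = 30 + bands @ np.array([12.0, -5.0, 8.0]) + rng.normal(0, 0.5, n)

      A = np.column_stack([np.ones(n), bands])              # design matrix with bias
      coeffs, *_ = np.linalg.lstsq(A, sss_insitu, rcond=None)
      sss_model = A @ coeffs

      bias = sss_model - sss_insitu
      rms_bias = np.sqrt(np.mean(bias ** 2))
      r2 = np.corrcoef(sss_model, sss_insitu)[0, 1] ** 2
      print(f"RMS bias = {rms_bias:.2f} psu, r^2 = {r2:.2f}")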

  17. Distributed computing testbed for a remote experimental environment

    International Nuclear Information System (INIS)

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.

    1995-01-01

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a ''Collaboratory.'' The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility

  18. Instrument-induced spatial crosstalk deconvolution algorithm

    Science.gov (United States)

    Wright, Valerie G.; Evans, Nathan L., Jr.

    1986-01-01

    An algorithm has been developed which reduces the effects of (deconvolves) instrument-induced spatial crosstalk in satellite image data by several orders of magnitude where highly precise radiometry is required. The algorithm is based upon radiance transfer ratios, which are defined as the fractional bilateral exchange of energy between pixels A and B.
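
    One way to picture the idea: if the radiance transfer ratios between pixels are known, the observed radiances are a linear mixing of the true ones and can be unmixed by solving that linear system. The one-dimensional toy below assumes a single nearest-neighbour exchange fraction.

      # Spatial crosstalk deconvolution sketch: observed = T @ true, where T holds
      # the fractional bilateral exchange of energy between neighbouring pixels.
      import numpy as np

      n, eps = 8, 0.02                       # pixels in a line, assumed exchange ratio
      T = np.eye(n)
      for i in range(n - 1):                 # each pixel exchanges a fraction with its neighbour
          T[i, i + 1] += eps
          T[i + 1, i] += eps
          T[i, i] -= eps
          T[i + 1, i + 1] -= eps

      true = np.array([10, 10, 10, 80, 80, 10, 10, 10], dtype=float)
      observed = T @ true                    # instrument-induced crosstalk applied
      recovered = np.linalg.solve(T, observed)   # deconvolution

      print(np.round(observed, 2))
      print(np.round(recovered, 2))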

  19. Advances in multi-sensor data fusion: algorithms and applications.

    Science.gov (United States)

    Dong, Jiang; Zhuang, Dafang; Huang, Yaohuan; Fu, Jingying

    2009-01-01

    With the development of satellite and remote sensing techniques, more and more image data from airborne/satellite sensors have become available. Multi-sensor image fusion seeks to combine information from different images to obtain more inferences than can be derived from a single sensor. In image-based application fields, image fusion has emerged as a promising research area since the end of the last century. The paper presents an overview of recent advances in multi-sensor satellite image fusion. Firstly, the most popular existing fusion algorithms are introduced, with emphasis on their recent improvements. Advances in main applications fields in remote sensing, including object identification, classification, change detection and maneuvering targets tracking, are described. Both advantages and limitations of those applications are then discussed. Recommendations are addressed, including: (1) Improvements of fusion algorithms; (2) Development of "algorithm fusion" methods; (3) Establishment of an automatic quality assessment scheme.

  20. Seasonal nitrate algorithms for nitrate retrieval using OCEANSAT-2 and MODIS-AQUA satellite data.

    Science.gov (United States)

    Durairaj, Poornima; Sarangi, Ranjit Kumar; Ramalingam, Shanthi; Thirunavukarassu, Thangaradjou; Chauhan, Prakash

    2015-04-01

    In situ datasets of nitrate, sea surface temperature (SST), and chlorophyll a (chl a) collected during the monthly coastal samplings and organized cruises along the Tamilnadu and Andhra Pradesh coast between 2009 and 2013 were used to develop seasonal nitrate algorithms. The nitrate algorithms were built on three-dimensional regressions between SST, chl a, and nitrate in situ data using linear, Gaussian, Lorentzian, and paraboloid function fittings. Among these four functions, the paraboloid was found to perform best, with the highest coefficient of determination (postmonsoon: R2=0.711, n=357; summer: R2=0.635, n=302; premonsoon: R2=0.829, n=249; and monsoon: R2=0.692, n=272) for all seasons. Based on these fittings, seasonal nitrate images were generated using the concurrent satellite data of SST from the Moderate Resolution Imaging Spectroradiometer (MODIS) and chlorophyll (chl) from the Ocean Color Monitor (OCM-2) and MODIS. The best retrieval of modeled nitrate (R2=0.527, root mean square error (RMSE)=3.72, and mean normalized bias (MNB)=0.821) was observed for the postmonsoon season due to the better retrieval of both SST MODIS (28 February 2012, R2=0.651, RMSE=2.037, and MNB=0.068) and chl OCM-2 (R2=0.534, RMSE=0.317, and MNB=0.27). Present results confirm that the chl OCM-2 and SST MODIS retrieve nitrate better than the MODIS-derived chl and SST, largely due to the better retrieval of chl by OCM-2 than MODIS.
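
    The paraboloid fit can be sketched with a standard nonlinear least-squares routine, treating nitrate as a second-order surface in SST and chl a; the data and fitted coefficients below are synthetic, not the published ones.

      # Paraboloid regression of nitrate on SST and chlorophyll-a (illustrative only).
      import numpy as np
      from scipy.optimize import curve_fit

      def paraboloid(X, n0, a, b, x0, y0):
          sst, chl = X
          return n0 + a * (sst - x0) ** 2 + b * (chl - y0) ** 2

      rng = np.random.default_rng(3)
      sst = rng.uniform(24, 32, 300)
      chl = rng.uniform(0.1, 5.0, 300)
      nitrate = 2.0 + 0.05 * (sst - 29) ** 2 + 0.4 * (chl - 2) ** 2 \
                + rng.normal(0, 0.2, 300)

      popt, _ = curve_fit(paraboloid, (sst, chl), nitrate, p0=[1, 0.1, 0.1, 28, 2])
      pred = paraboloid((sst, chl), *popt)
      r2 = 1 - np.sum((nitrate - pred) ** 2) / np.sum((nitrate - nitrate.mean()) ** 2)
      rmse = np.sqrt(np.mean((nitrate - pred) ** 2))
      print(f"R2 = {r2:.3f}, RMSE = {rmse:.2f}")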

  1. Effect of Ionosphere on Geostationary Communication Satellite Signals

    Science.gov (United States)

    Erdem, Esra; Arikan, Feza; Gulgonul, Senol

    2016-07-01

    Geostationary orbit (GEO) communications satellites allow radio, television, and telephone transmissions to be sent live anywhere in the world. They are extremely important in daily life and also for military applications. Since satellite communication is an expensive technology serving large numbers of people, it is critical to improve the performance of this technology. GEO satellites are located 35,786 kilometres above the Earth's surface, directly over the equator. A satellite in geostationary orbit appears to stand still in the sky, in a fixed position with respect to an observer on the Earth, because the satellite's orbital period matches the rotation rate of the Earth. The advantage of this orbit is that ground antennas can be fixed to point towards the satellite without having to track its motion. Radio frequency ranges used in satellite communications are C, X, Ku, Ka and even EHF and V-band. Satellite signals are disturbed by atmospheric effects on the path between the satellite and the receiver antenna; these effects are mostly rain, cloud and gaseous attenuation. The ionosphere is expected to have only a minor effect on satellite signals when it is quiet. However, there are anomalies and perturbations in the structure of the ionosphere, related to the geomagnetic field and solar activity, and these conditions may cause further effects on the satellite signals. In this study the IONOLAB-RAY algorithm is adopted to examine the effect of the ionosphere on satellite signals. IONOLAB-RAY is developed to calculate the propagation path and characteristics of high frequency signals. The algorithm does not have any frequency limitation and models the plasmasphere up to 20,200 km altitude, so that propagation between a GEO satellite and an antenna on Earth can be simulated. The algorithm models the inhomogeneous, anisotropic and time-dependent structure of the ionosphere with a 3-D spherical grid geometry and calculates physical parameters of the

  2. A feed-forward Hopfield neural network algorithm (FHNNA) with a colour satellite image for water quality mapping

    Science.gov (United States)

    Asal Kzar, Ahmed; Mat Jafri, M. Z.; Hwee San, Lim; Al-Zuky, Ali A.; Mutter, Kussay N.; Hassan Al-Saleh, Anwar

    2016-06-01

    Many techniques have been proposed for the water quality problem, but remote sensing techniques have proven their success, especially when artificial neural networks are used as mathematical models with these techniques. The Hopfield neural network is one type of artificial neural network that is common, fast, simple, and efficient, but it has difficulties when dealing with images that have more than two colours, such as remote sensing images. This work attempts to solve this problem by modifying the network to deal with colour remote sensing images for water quality mapping. A Feed-forward Hopfield Neural Network Algorithm (FHNNA) was modified and used with a colour satellite image from the Thailand Earth Observation System (THEOS) for TSS mapping in the Penang Strait, Malaysia, through the classification of TSS concentrations. The new algorithm is based essentially on three modifications: using the HNN as a feed-forward network, considering the weights of bitplanes, and using a non-self architecture (zero diagonal of the weight matrix); in addition, it depends on validation data. The resulting map was colour-coded for visual interpretation. The efficiency of the new algorithm is shown by the high correlation coefficient (R=0.979) and the low root mean square error (RMSE=4.301) between the two groups into which the validation data were divided: one used for the algorithm and the other used for validating the results. The comparison was with the minimum distance classifier. Therefore, TSS mapping of polluted water in the Penang Strait, Malaysia, can be performed using FHNNA with the remote sensing technique (THEOS). It is a new and useful application of the HNN, and thus a new model combining neural networks with remote sensing techniques for water quality mapping, which is an important environmental problem.

  3. Evaluation of Future Internet Technologies for Processing and Distribution of Satellite Imagery

    Science.gov (United States)

    Becedas, J.; Perez, R.; Gonzalez, G.; Alvarez, J.; Garcia, F.; Maldonado, F.; Sucari, A.; Garcia, J.

    2015-04-01

    Satellite imagery data centres are designed to operate a defined number of satellites. In particular, difficulties appear when new satellites have to be incorporated into the system. This occurs because traditional infrastructures are neither flexible nor scalable. With the appearance of Future Internet technologies, new solutions can be provided to manage large and variable amounts of data on demand. These technologies optimize resources and facilitate the appearance of new applications and services in the traditional Earth Observation (EO) market. The use of Future Internet technologies for the EO sector was validated with the GEO-Cloud experiment, part of the Fed4FIRE FP7 European project. This work presents the final results of the project, in which a constellation of satellites records the whole Earth surface on a daily basis. The satellite imagery is downloaded to a distributed network of ground stations and ingested into a cloud infrastructure, where the data are processed, stored, archived and distributed to the end users. The processing and transfer times inside the cloud, the workload of the processors, automatic cataloguing and accessibility through the Internet are evaluated to determine whether Future Internet technologies present advantages over traditional methods. The applicability of these technologies to providing high added-value services is also evaluated. Finally, the advantages of using federated testbeds to carry out large-scale, industry-driven experiments are analysed, evaluating the feasibility of an experiment developed in the European infrastructure Fed4FIRE and its migration to a commercial cloud: SoftLayer, an IBM Company.

  4. M1 mirror print-through investigation and performance on the thermo-opto-mechanical testbed for the Space Interferometry Mission

    Science.gov (United States)

    Feria, V. Alfonso; Lam, Jonathan; Van Buren, Dave

    2006-06-01

    SIM PlanetQuest (SIM) is a large (9-meter baseline) space-borne optical interferometer that will determine the position and distance of stars to high accuracy. With microarcsecond measurements SIM will probe nearby stars for Earth-sized planets. To achieve this precision, SIM requires very tight manufacturing tolerances and high stability of optical components. To reduce technical risks, the SIM project developed an integrated thermal, mechanical and optical testbed (TOM3) to allow predictions of the system performance at the required high precision. The TOM3 testbed used full-scale brassboard optical components and picometer-class metrology to reach the SIM target performance levels. During the testbed integration and after one of the testbed mirrors, M1, was bonded into its mount, some surface distortion dimples that exceeded the optical specification were discovered. A detailed finite element model was used to analyze different load cases to try to determine the source of the M1 surface deformations. The same model was also used to compare with actual deformations due to varied thermal conditions on the TOM3 testbed. This paper presents the studies carried out to determine the source of the surface distortions on the M1 mirror as well as comparison and model validation during testing. This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

  5. Social media analytics and research testbed (SMART): Exploring spatiotemporal patterns of human dynamics with geo-targeted social media messages

    Directory of Open Access Journals (Sweden)

    Jiue-An Yang

    2016-06-01

    Full Text Available The multilevel model of meme diffusion conceptualizes how mediated messages diffuse over time and space. As a pilot application of implementing the meme diffusion, we developed the social media analytics and research testbed to monitor Twitter messages and track the diffusion of information in and across different cities and geographic regions. Social media analytics and research testbed is an online geo-targeted search and analytics tool, including an automatic data processing procedure at the backend and an interactive frontend user interface. Social media analytics and research testbed is initially designed to facilitate (1) searching and geo-locating tweet topics and terms in different cities and geographic regions; (2) filtering noise from raw data (such as removing redundant retweets and using machine learning methods to improve precision); (3) analyzing social media data from a spatiotemporal perspective; and (4) visualizing social media data in diagnostic ways (such as weekly and monthly trends, trend maps, top media, top retweets, top mentions, or top hashtags). Social media analytics and research testbed provides researchers and domain experts with a tool that can efficiently facilitate the refinement, formalization, and testing of research hypotheses or questions. Three case studies (flu outbreaks, Ebola epidemic, and marijuana legalization) are introduced to illustrate how the predictions of meme diffusion can be examined and to demonstrate the potentials and key functions of social media analytics and research testbed.

  6. Multi-Stage Hybrid Rocket Conceptual Design for Micro-Satellites Launch using Genetic Algorithm

    Science.gov (United States)

    Kitagawa, Yosuke; Kitagawa, Koki; Nakamiya, Masaki; Kanazaki, Masahiro; Shimada, Toru

    The multi-objective genetic algorithm (MOGA) is applied to the multi-disciplinary conceptual design problem for a three-stage launch vehicle (LV) with a hybrid rocket engine (HRE). MOGA is an optimization tool used for multi-objective problems. The parallel coordinate plot (PCP), which is a data mining method, is employed in the post-process in MOGA for design knowledge discovery. A rocket that can deliver observing micro-satellites to the sun-synchronous orbit (SSO) is designed. It consists of an oxidizer tank containing liquid oxidizer, a combustion chamber containing solid fuel, a pressurizing tank and a nozzle. The objective functions considered in this study are to minimize the total mass of the rocket and to maximize the ratio of the payload mass to the total mass. To calculate the thrust and the engine size, the regression rate is estimated based on an empirical model for a paraffin (FT-0070) propellant. Several non-dominated solutions are obtained using MOGA, and design knowledge is discovered for the present hybrid rocket design problem using a PCP analysis. As a result, substantial knowledge on the design of an LV with an HRE is obtained for use in space transportation.
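
    At the core of any such multi-objective search is the non-dominated (Pareto) filter. The sketch below applies it to the two objectives named above over randomly sampled candidate designs; the mass and payload-ratio models are placeholders, not the rocket sizing model of the paper.

      # Pareto filtering for two objectives: minimize total mass, maximize the
      # payload-to-total-mass ratio. The candidate "designs" here are random stand-ins.
      import numpy as np

      rng = np.random.default_rng(4)
      total_mass = rng.uniform(5_000, 20_000, 200)          # kg, placeholder model
      payload_ratio = rng.uniform(0.005, 0.03, 200) * (20_000 / total_mass) ** 0.3

      def pareto_front(f1_min, f2_max):
          # A design is kept if no other design is strictly better in both objectives.
          keep = []
          for i in range(len(f1_min)):
              dominated = np.any((f1_min < f1_min[i]) & (f2_max > f2_max[i]))
              if not dominated:
                  keep.append(i)
          return np.array(keep)

      front = pareto_front(total_mass, payload_ratio)
      print(f"{len(front)} non-dominated designs out of {len(total_mass)}")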

  7. Relativistic Time Transfer for Inter-satellite Links

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Yi, E-mail: yixie@nju.edu.cn [Department of Astronomy, School of Astronomy and Space Sciences, Nanjing University, Nanjing (China); Shanghai Key Laboratory of Space Navigation and Position Techniques, Shanghai (China); Key Laboratory of Modern Astronomy and Astrophysics, Nanjing University, Ministry of Education, Nanjing (China)

    2016-04-26

    Inter-satellite links (ISLs) will be an important technique for a global navigation satellite system (GNSS) in the future. Based on the principles of general relativity, the time transfer in an ISL is modeled and the algorithm for onboard computation is described. It is found that, in general, satellites with circular orbits and identical semi-major axes can simplify inter-satellite time transfer, since terms associated with the transformations between the proper times and the Geocentric Coordinate Time cancel out. For a GPS-like GNSS, the Shapiro delay is as large as 0.1 ns when the ISL passes at the limb of the Earth. However, in more realistic cases, this value will decrease to about 50 ps.
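
    The Shapiro term quoted above follows from the standard logarithmic delay formula; the sketch below evaluates it for a GPS-like geometry with the link grazing the Earth's limb (the orbit radius and link geometry are assumptions).

      # Shapiro (gravitational) delay on an inter-satellite link:
      #   dt = (2*G*M/c^3) * ln((r1 + r2 + r12) / (r1 + r2 - r12))
      import numpy as np

      GM = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2
      c = 299792458.0            # speed of light, m/s
      Re = 6.371e6               # Earth radius, m

      r1 = r2 = 2.656e7          # GPS-like orbit radius, m (assumed)
      # Link that just grazes the Earth's limb: chord length between the satellites.
      r12 = 2.0 * np.sqrt(r1**2 - Re**2)

      shapiro = (2 * GM / c**3) * np.log((r1 + r2 + r12) / (r1 + r2 - r12))
      print(f"Shapiro delay ~ {shapiro * 1e9:.2f} ns")   # on the order of 0.1 ns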

  8. Small satellite attitude determination based on GPS/IMU data fusion

    Energy Technology Data Exchange (ETDEWEB)

    Golovan, Andrey [Navigation and Control Laboratory, M.V. Lomonosov Moscow State University, GSP-1, Leninskie Gory, Moscow (Russian Federation); Cepe, Ali [Department of Applied Mechanics and Control, M.V. Lomonosov Moscow State University, Moscow (Russian Federation)

    2014-12-10

    In this paper, we present the mathematical models and algorithms that describe the problem of attitude determination for a small satellite using measurements from three angular rate sensors (ARS) and aiding measurements from multiple GPS receivers/antennas rigidly attached to the platform of the satellite.

  9. Low-cost Citizen Science Balloon Platform for Measuring Air Pollutants to Improve Satellite Retrieval Algorithms

    Science.gov (United States)

    Potosnak, M. J.; Beck-Winchatz, B.; Ritter, P.

    2016-12-01

    High-altitude balloons (HABs) are an engaging platform for citizen science and formal and informal STEM education. However, the logistics of launching, chasing and recovering a payload on a 1200 g or 1500 g balloon can be daunting for many novice school groups and citizen scientists, and the cost can be prohibitive. In addition, there are many interesting scientific applications that do not require reaching the stratosphere, including measuring atmospheric pollutants in the planetary boundary layer. With a large number of citizen scientist flights, these data can be used to constrain satellite retrieval algorithms. In this poster presentation, we discuss a novel approach based on small (30 g) balloons that are cheap and easy to handle, and low-cost tracking devices (SPOT trackers for hikers) that do not require a radio license. Our scientific goal is to measure air quality in the lower troposphere. For example, particulate matter (PM) is an air pollutant that varies on small spatial scales and has sources in rural areas like biomass burning and farming practices such as tilling. Our HAB platform test flight incorporates an optical PM sensor, an integrated single board computer that records the PM sensor signal in addition to flight parameters (pressure, location and altitude), and a low-cost tracking system. Our goal is for the entire platform to cost less than $500. While the datasets generated by these flights are typically small, integrating a network of flight data from citizen scientists into a form usable for comparison to satellite data will require big data techniques.

  10. A satellite digital controller or 'play that PID tune again, Sam'. [Position, Integral, Derivative feedback control algorithm for design strategy

    Science.gov (United States)

    Seltzer, S. M.

    1976-01-01

    The problem discussed is to design a digital controller for a typical satellite. The controlled plant is considered to be a rigid body acting in a plane. The controller is assumed to be a digital computer which, when combined with the proposed control algorithm, can be represented as a sampled-data system. The objective is to present a design strategy and technique for selecting numerical values for the control gains (assuming position, integral, and derivative feedback) and the sample rate. The technique is based on the parameter plane method and requires that the system be amenable to z-transform analysis.
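
    A small sketch of the kind of sampled-data PID loop being described, applied to a rigid body modeled as a double integrator; the inertia, gains, and sample period are arbitrary illustrative values, not the gains selected by the parameter-plane method.

      # Discrete PID attitude control of a planar rigid body (double integrator).
      # Gains and sample period are illustrative, not design values from the paper.
      import numpy as np

      J = 100.0                  # moment of inertia, kg*m^2 (assumed)
      Ts = 0.25                  # sample period, s
      Kp, Ki, Kd = 40.0, 2.0, 120.0

      theta, omega = 0.1, 0.0    # initial attitude error (rad) and rate (rad/s)
      integral, prev_err = 0.0, 0.1

      for k in range(200):
          err = 0.0 - theta                       # regulate attitude to zero
          integral += err * Ts
          derivative = (err - prev_err) / Ts
          torque = Kp * err + Ki * integral + Kd * derivative
          prev_err = err
          # Propagate the rigid-body plant over one sample (zero-order hold on torque).
          alpha = torque / J
          theta += omega * Ts + 0.5 * alpha * Ts**2
          omega += alpha * Ts

      print(f"attitude error after {200 * Ts:.0f} s: {theta:.2e} rad")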

  11. Cargo container inspection test program at ARPA's Nonintrusive Inspection Technology Testbed

    Science.gov (United States)

    Volberding, Roy W.; Khan, Siraj M.

    1994-10-01

    An x-ray-based cargo inspection system test program is being conducted at the Advanced Research Project Agency (ARPA)-sponsored Nonintrusive Inspection Technology Testbed (NITT) located in the Port of Tacoma, Washington. The test program seeks to determine the performance that can be expected from a dual, high-energy x-ray cargo inspection system when inspecting ISO cargo containers. This paper describes an intensive, three-month, system test involving two independent test groups, one representing the criminal smuggling element and the other representing the law enforcement community. The first group, the `Red Team', prepares ISO containers for inspection at an off-site facility. An algorithm randomly selects and indicates the positions and preparation of cargoes within a container. The prepared container is dispatched to the NITT for inspection by the `Blue Team'. After in-gate processing, it is queued for examination. The Blue Team inspects the container and decides whether or not to pass the container. The shipment undergoes out-gate processing and returns to the Red Team. The results of the inspection are recorded for subsequent analysis. The test process, including its governing protocol, the cargoes, container preparation, the examination and results available at the time of submission are presented.

  12. Space Station technology testbed: 2010 deep space transport

    Science.gov (United States)

    Holt, Alan C.

    1993-01-01

    A space station in a crew-tended or permanently crewed configuration will provide major R&D opportunities for innovative, technology and materials development and advanced space systems testing. A space station should be designed with the basic infrastructure elements required to grow into a major systems technology testbed. This space-based technology testbed can and should be used to support the development of technologies required to expand our utilization of near-Earth space, the Moon and the Earth-to-Jupiter region of the Solar System. Space station support of advanced technology and materials development will result in new techniques for high priority scientific research and the knowledge and R&D base needed for the development of major, new commercial product thrusts. To illustrate the technology testbed potential of a space station and to point the way to a bold, innovative approach to advanced space systems' development, a hypothetical deep space transport development and test plan is described. Key deep space transport R&D activities are described would lead to the readiness certification of an advanced, reusable interplanetary transport capable of supporting eight crewmembers or more. With the support of a focused and highly motivated, multi-agency ground R&D program, a deep space transport of this type could be assembled and tested by 2010. Key R&D activities on a space station would include: (1) experimental research investigating the microgravity assisted, restructuring of micro-engineered, materials (to develop and verify the in-space and in-situ 'tuning' of materials for use in debris and radiation shielding and other protective systems), (2) exposure of microengineered materials to the space environment for passive and operational performance tests (to develop in-situ maintenance and repair techniques and to support the development, enhancement, and implementation of protective systems, data and bio-processing systems, and virtual reality and

  13. Validation of ozone profile retrievals derived from the OMPS LP version 2.5 algorithm against correlative satellite measurements

    Science.gov (United States)

    Kramarova, Natalya A.; Bhartia, Pawan K.; Jaross, Glen; Moy, Leslie; Xu, Philippe; Chen, Zhong; DeLand, Matthew; Froidevaux, Lucien; Livesey, Nathaniel; Degenstein, Douglas; Bourassa, Adam; Walker, Kaley A.; Sheese, Patrick

    2018-05-01

    The Limb Profiler (LP) is a part of the Ozone Mapping and Profiler Suite launched on board of the Suomi NPP satellite in October 2011. The LP measures solar radiation scattered from the atmospheric limb in ultraviolet and visible spectral ranges between the surface and 80 km. These measurements of scattered solar radiances allow for the retrieval of ozone profiles from cloud tops up to 55 km. The LP started operational observations in April 2012. In this study we evaluate more than 5.5 years of ozone profile measurements from the OMPS LP processed with the new NASA GSFC version 2.5 retrieval algorithm. We provide a brief description of the key changes that had been implemented in this new algorithm, including a pointing correction, new cloud height detection, explicit aerosol correction and a reduction of the number of wavelengths used in the retrievals. The OMPS LP ozone retrievals have been compared with independent satellite profile measurements obtained from the Aura Microwave Limb Sounder (MLS), Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS) and Odin Optical Spectrograph and InfraRed Imaging System (OSIRIS). We document observed biases and seasonal differences and evaluate the stability of the version 2.5 ozone record over 5.5 years. Our analysis indicates that the mean differences between LP and correlative measurements are well within required ±10 % between 18 and 42 km. In the upper stratosphere and lower mesosphere (> 43 km) LP tends to have a negative bias. We find larger biases in the lower stratosphere and upper troposphere, but LP ozone retrievals have significantly improved in version 2.5 compared to version 2 due to the implemented aerosol correction. In the northern high latitudes we observe larger biases between 20 and 32 km due to the remaining thermal sensitivity issue. Our analysis shows that LP ozone retrievals agree well with the correlative satellite observations in characterizing vertical, spatial and temporal

  14. Design and construction of a 76m long-travel laser enclosure for a space occulter testbed

    Science.gov (United States)

    Galvin, Michael; Kim, Yunjong; Kasdin, N. Jeremy; Sirbu, Dan; Vanderbei, Robert; Echeverri, Dan; Sagolla, Giuseppe; Rousing, Andreas; Balasubramanian, Kunjithapatham; Ryan, Daniel; Shaklan, Stuart; Lisman, Doug

    2016-07-01

    Princeton University is upgrading our space occulter testbed. In particular, we are lengthening it to 76m to achieve flightlike Fresnel numbers. This much longer testbed required an all-new enclosure design. In this design, we prioritized modularity and the use of commercial off-the-shelf (COTS) and semi-COTS components. Several of the technical challenges encountered included an unexpected slow beam drift and black paint selection. Herein we describe the design and construction of this long-travel laser enclosure.

  15. Autonomous power expert fault diagnostic system for Space Station Freedom electrical power system testbed

    Science.gov (United States)

    Truong, Long V.; Walters, Jerry L.; Roth, Mary Ellen; Quinn, Todd M.; Krawczonek, Walter M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control to the Space Station Freedom Electrical Power System (SSF/EPS) testbed being developed and demonstrated at NASA Lewis Research Center. The objectives of the program are to establish artificial intelligence technology paths, to craft knowledge-based tools with advanced human-operator interfaces for power systems, and to interface and integrate knowledge-based systems with conventional controllers. The Autonomous Power EXpert (APEX) portion of the APS program will integrate a knowledge-based fault diagnostic system and a power resource planner-scheduler. Then APEX will interface on-line with the SSF/EPS testbed and its Power Management Controller (PMC). The key tasks include establishing knowledge bases for system diagnostics, fault detection and isolation analysis, on-line information accessing through PMC, enhanced data management, and multiple-level, object-oriented operator displays. The first prototype of the diagnostic expert system for fault detection and isolation has been developed. The knowledge bases and the rule-based model that were developed for the Power Distribution Control Unit subsystem of the SSF/EPS testbed are described. A corresponding troubleshooting technique is also described.

  16. Adaptive suppression of passive intermodulation in digital satellite transceivers

    Directory of Open Access Journals (Sweden)

    Lu TIAN

    2017-06-01

    Full Text Available To address the performance degradation of satellite transceivers suffering from passive intermodulation interference, a novel and effective digital suppression algorithm is presented in this paper. In contrast to analog approaches, digital passive intermodulation (PIM) suppression approaches can be easily reconfigured and are therefore highly attractive for future satellite communication systems. A simplified model of the nonlinear distortion from passive microwave devices is established, taking the memory effect into consideration. The multiple high-order PIM products falling into the receiving band can be described by a bilinear predictor function. A suppression algorithm based on a bilinear polynomial decorrelated adaptive filter is proposed for baseband digital signal processing. Given the time-varying characteristics of passive intermodulation, this algorithm achieves rapid online interference estimation with low complexity and low resource consumption. Numerical simulation results show that the algorithm can effectively compensate for the passive intermodulation interference and achieve a high signal-to-interference ratio gain.
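
    A rough sketch of the adaptive cancellation idea: build nonlinear (odd-order) basis signals from the known transmit waveform and adapt their weights with a normalized LMS rule so that the reconstructed PIM can be subtracted from the received signal. The basis, step size, and toy signal model are assumptions standing in for the paper's bilinear decorrelated filter.

      # Adaptive PIM cancellation sketch using an LMS filter on nonlinear basis
      # signals derived from the transmit waveform (complex baseband, toy model).
      import numpy as np

      rng = np.random.default_rng(5)
      N = 5000
      tx = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)   # transmit
      desired = 0.1 * (rng.normal(size=N) + 1j * rng.normal(size=N))     # wanted signal

      # "True" PIM: a 3rd-order memoryless distortion leaking into the receive band.
      pim = 0.05 * np.abs(tx) ** 2 * tx
      rx = desired + pim

      # Normalized LMS adaptation of weights on known nonlinear basis functions of tx.
      basis = np.stack([tx, np.abs(tx) ** 2 * tx, np.abs(tx) ** 4 * tx])
      w = np.zeros(basis.shape[0], dtype=complex)
      mu = 0.05
      out = np.empty(N, dtype=complex)
      for n in range(N):
          u = basis[:, n]
          est = np.vdot(w, u)                # current PIM estimate
          e = rx[n] - est                    # residual = desired signal + leftover PIM
          out[n] = e
          w += mu * np.conj(e) * u / (np.vdot(u, u).real + 1e-9)

      before = 10 * np.log10(np.mean(np.abs(pim) ** 2) / np.mean(np.abs(desired) ** 2))
      after = 10 * np.log10(np.mean(np.abs(out[N // 2:] - desired[N // 2:]) ** 2)
                            / np.mean(np.abs(desired[N // 2:]) ** 2))
      print(f"interference-to-signal: {before:.1f} dB before, {after:.1f} dB after")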

  17. A Kalman filter-based short baseline RTK algorithm for single-frequency combination of GPS and BDS.

    Science.gov (United States)

    Zhao, Sihao; Cui, Xiaowei; Guan, Feng; Lu, Mingquan

    2014-08-20

    The emerging Global Navigation Satellite Systems (GNSS), including the BeiDou Navigation Satellite System (BDS), offer more visible satellites for positioning users. To employ those new satellites in a real-time kinematic (RTK) algorithm to enhance positioning precision and availability, a data processing model for the dual constellation of GPS and BDS is proposed and analyzed. A Kalman filter-based algorithm is developed to estimate the float ambiguities for short baseline scenarios. The entire workflow of the high-precision algorithm based on the proposed model is investigated in detail. The model is validated with real GPS and BDS data recorded from one zero-baseline and two short-baseline experiments. Results show that the proposed algorithm can generate fixed baseline output with the same precision level as that of either a single GPS or BDS RTK algorithm. The significantly improved fixed rate and time to first fix of the proposed method demonstrate better availability and effectiveness in processing multiple GNSSs.
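
    A toy sketch of float-ambiguity estimation with a Kalman filter, using one double-differenced code/carrier-phase pair and a single baseline component; the wavelength, noise levels, and measurement model are simplified placeholders for the full GPS+BDS formulation.

      # Float ambiguity estimation sketch: state = [baseline component b (m), ambiguity N].
      # Measurements: double-differenced code P = b + noise, carrier phase
      # Phi = b + lam*N + noise (both expressed in metres).
      import numpy as np

      lam = 0.19                       # carrier wavelength, m (GPS L1-like, assumed)
      b_true, N_true = 3.42, 7.0       # "true" baseline component and integer ambiguity

      rng = np.random.default_rng(6)
      x = np.array([0.0, 0.0])         # initial state
      P = np.diag([100.0, 1e4])        # large initial uncertainty
      Q = np.diag([1e-4, 0.0])         # baseline may move slightly; ambiguity constant
      R = np.diag([0.3**2, 0.003**2])  # code vs. carrier noise
      H = np.array([[1.0, 0.0],
                    [1.0, lam]])

      for k in range(100):
          z = np.array([b_true + rng.normal(0, 0.3),
                        b_true + lam * N_true + rng.normal(0, 0.003)])
          # Predict (identity dynamics) and update.
          P = P + Q
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
          x = x + K @ (z - H @ x)
          P = (np.eye(2) - K @ H) @ P

      print(f"float baseline = {x[0]:.3f} m, float ambiguity = {x[1]:.2f} cycles")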

  18. A Kalman Filter-Based Short Baseline RTK Algorithm for Single-Frequency Combination of GPS and BDS

    Directory of Open Access Journals (Sweden)

    Sihao Zhao

    2014-08-01

    Full Text Available The emerging Global Navigation Satellite Systems (GNSS), including the BeiDou Navigation Satellite System (BDS), offer more visible satellites for positioning users. To employ those new satellites in a real-time kinematic (RTK) algorithm to enhance positioning precision and availability, a data processing model for the dual constellation of GPS and BDS is proposed and analyzed. A Kalman filter-based algorithm is developed to estimate the float ambiguities for short baseline scenarios. The entire workflow of the high-precision algorithm based on the proposed model is investigated in detail. The model is validated with real GPS and BDS data recorded from one zero-baseline and two short-baseline experiments. Results show that the proposed algorithm can generate fixed baseline output with the same precision level as that of either a single GPS or BDS RTK algorithm. The significantly improved fixed rate and time to first fix of the proposed method demonstrate better availability and effectiveness in processing multiple GNSSs.

  19. Development of optical packet and circuit integrated ring network testbed.

    Science.gov (United States)

    Furukawa, Hideaki; Harai, Hiroaki; Miyazawa, Takaya; Shinada, Satoshi; Kawasaki, Wataru; Wada, Naoya

    2011-12-12

    We developed novel integrated optical packet and circuit switch-node equipment. Compared with our previous equipment, a polarization-independent 4 × 4 semiconductor optical amplifier switch subsystem, gain-controlled optical amplifiers, and one 100 Gbps optical packet transponder and seven 10 Gbps optical path transponders with 10 Gigabit Ethernet (10GbE) client-interfaces were newly installed in the present system. The switch and amplifiers can provide more stable operation without equipment adjustments for the frequent polarization-rotations and dynamic packet-rate changes of optical packets. We constructed an optical packet and circuit integrated ring network testbed consisting of two switch nodes for accelerating network development, and we demonstrated 66 km fiber transmission and switching operation of multiplexed 14-wavelength 10 Gbps optical paths and 100 Gbps optical packets encapsulating 10GbE frames. Error-free (frame error rate optical packets of various packet lengths and packet rates, and stable operation of the network testbed was confirmed. In addition, 4K uncompressed video streaming over OPS links was successfully demonstrated. © 2011 Optical Society of America

  20. Assessment of satellite derived diffuse attenuation coefficients ...

    Science.gov (United States)

    Optical data collected in coastal waters off South Florida and in the Caribbean Sea between January 2009 and December 2010 were used to evaluate products derived with three bio-optical inversion algorithms applied to MODIS/Aqua, MODIS/Terra, and SeaWiFS satellite observations. The products included the diffuse attenuation coefficient at 490 nm (Kd_490) and for the visible range (Kd_PAR), and euphotic depth (Zeu, corresponding to 1% of the surface incident photosynthetically available radiation or PAR). Above-water hyperspectral reflectance data collected over optically shallow waters of the Florida Keys between June 1997 and August 2011 were used to help understand algorithm performance over optically shallow waters. The in situ data covered a variety of water types in South Florida and the Caribbean Sea, ranging from clear deep waters to turbid coastal waters and optically shallow waters (Kd_490 range of ~0.03 – 1.29 m-1). An algorithm based on Inherent Optical Properties (IOPs) showed the best performance (RMSD turbidity or shallow bottom contamination. Similar results were obtained when only in situ data were used to evaluate algorithm performance. The excellent agreement between satellite-derived remote sensing reflectance (Rrs) and in situ Rrs suggested that

  1. Experimental Demonstration of an Algorithm to Detect the Presence of a Parasitic Satellite

    National Research Council Canada - National Science Library

    Dabrowski, Vincent

    2003-01-01

    Published reports of microsatellite weapons testing have led to a concern that some of these "parasitic" satellites could be deployed against US satellites to rendezvous, dock, and then disrupt, degrade...

  2. Autonomous, agile micro-satellites and supporting technologies

    International Nuclear Information System (INIS)

    Breitfeller, E; Dittman, M D; Gaughan, R J; Jones, M S; Kordas, J F; Ledebuhr, A G; Ng, L C; Whitehead, J C; Wilson, B

    1999-01-01

    This paper updates the on-going effort at Lawrence Livermore National Laboratory to develop autonomous, agile micro-satellites (MicroSats). The objective of this development effort is to develop MicroSats weighing only a few tens of kilograms, that are able to autonomously perform precision maneuvers and can be used telerobotically in a variety of mission modes. The required capabilities include satellite rendezvous, inspection, proximity-operations, docking, and servicing. The MicroSat carries an integrated proximity-operations sensor-suite incorporating advanced avionics. A new self-pressurizing propulsion system utilizing a miniaturized pump and non-toxic mono-propellant hydrogen peroxide was successfully tested. This system can provide a nominal 25 kg MicroSat with 200-300 m/s delta-v including a warm-gas attitude control system. The avionics is based on the latest PowerPC processor using a CompactPCI bus architecture, which is modular, high-performance and processor-independent. This leverages commercial-off-the-shelf (COTS) technologies and minimizes the effects of future changes in processors. The MicroSat software development environment uses the Vx-Works real-time operating system (RTOS) that provides a rapid development environment for integration of new software modules, allowing early integration and test. We will summarize results of recent integrated ground flight testing of our latest non-toxic pumped propulsion MicroSat testbed vehicle operated on our unique dynamic air-rail

  3. Passive Thermal Design Approach for the Space Communications and Navigation (SCaN) Testbed Experiment on the International Space Station (ISS)

    Science.gov (United States)

    Siamidis, John; Yuko, Jim

    2014-01-01

    The Space Communications and Navigation (SCaN) Program Office at NASA Headquarters oversees all of NASA's space communications activities. SCaN manages and directs the ground-based facilities and services provided by the Deep Space Network (DSN), Near Earth Network (NEN), and the Space Network (SN). Through the SCaN Program Office, NASA GRC developed a Software Defined Radio (SDR) testbed experiment (SCaN testbed experiment) for use on the International Space Station (ISS). It comprises three different SDRs: the Jet Propulsion Laboratory (JPL) radio, the Harris Corporation radio, and the General Dynamics Corporation radio. The SCaN testbed experiment provides an on-orbit, adaptable, SDR Space Telecommunications Radio System (STRS) - based facility to conduct a suite of experiments to advance the Software Defined Radio, Space Telecommunications Radio Systems (STRS) standards, reduce risk (Technology Readiness Level (TRL) advancement) for candidate Constellation future space flight hardware and software, and demonstrate space communication links critical to future NASA exploration missions. The SCaN testbed project provides NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable, software defined radio platforms and the STRS Architecture. The SCaN testbed is resident on the P3 Express Logistics Carrier (ELC) on the exterior truss of the International Space Station (ISS). The SCaN testbed payload launched on the Japanese Aerospace Exploration Agency (JAXA) H-II Transfer Vehicle (HTV) and was installed on the ISS P3 ELC located on the inboard RAM P3 site. The daily operations and testing are managed out of NASA GRC in the Telescience Support Center (TSC).

  4. The Orlando TDWR testbed and airborne wind shear data comparison results

    Science.gov (United States)

    Campbell, Steven; Berke, Anthony; Matthews, Michael

    1992-01-01

    The focus of this talk is on comparing terminal Doppler Weather Radar (TDWR) and airborne wind shear data in computing a microburst hazard index called the F factor. The TDWR is a ground-based system for detecting wind shear hazards to aviation in the terminal area. The Federal Aviation Administration will begin deploying TDWR units near 45 airports in late 1992. As part of this development effort, M.I.T. Lincoln Laboratory operates under F.A.A. support a TDWR testbed radar in Orlando, FL. During the past two years, a series of flight tests has been conducted with instrumented aircraft penetrating microburst events while under testbed radar surveillance. These tests were carried out with a Cessna Citation 2 aircraft operated by the University of North Dakota (UND) Center for Aerospace Sciences in 1990, and a Boeing 737 operated by NASA Langley Research Center in 1991. A large data base of approximately 60 instrumented microburst penetrations has been obtained from these flights.
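
    The abstract does not restate the index; in the wind-shear literature the F factor is commonly written as F = (dWx/dt)/g - Wh/V, combining the along-track wind change and the vertical wind. The snippet below evaluates that published form; the example values and any operational hazard threshold are assumptions, not numbers from the Orlando testbed.

```python
# Hedged sketch of the widely published F-factor definition:
# F = (dWx/dt)/g - Wh/V, where Wx is the horizontal (along-track) wind,
# Wh the vertical wind (positive up) and V the aircraft airspeed.
# The example values and hazard interpretation below are assumptions.
G = 9.81  # m/s^2

def f_factor(dWx_dt, wh, airspeed):
    """Instantaneous wind-shear hazard index; positive values indicate an
    energy loss for the aircraft (increasing tailwind and/or a downdraft)."""
    return dWx_dt / G - wh / airspeed

# Example: a 0.05 m/s^2 tailwind increase plus a 5 m/s downdraft at 75 m/s
print(f_factor(dWx_dt=0.05, wh=-5.0, airspeed=75.0))  # ~0.07
```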

  5. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - evaluation summary for the San Diego testbed

    Science.gov (United States)

    2017-08-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  6. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - Evaluation Report for the San Diego Testbed

    Science.gov (United States)

    2017-07-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  7. An optimization tool for satellite equipment layout

    Science.gov (United States)

    Qin, Zheng; Liang, Yan-gang; Zhou, Jian-ping

    2018-01-01

    Selection of the satellite equipment layout with performance constraints is a complex task that can be viewed as a constrained multi-objective optimization and a multiple criteria decision making problem. The layout design of a satellite cabin involves the process of locating the required equipment in a limited space, thereby satisfying various behavioral constraints of the interior and exterior environments. The layout optimization of the satellite cabin in this paper considers the C.G. offset, the moments of inertia and the space debris impact risk of the system, where the impact risk index is developed to quantify the risk to a satellite cabin of coming into contact with space debris. In this paper, an optimization tool integrating CAD software and the optimization algorithms is presented, developed to automatically find solutions for a three-dimensional layout of equipment in a satellite. The effectiveness of the tool is also demonstrated by applying it to the layout optimization of a satellite platform.
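
    As a rough sketch of how two of the layout objectives named above (C.G. offset and moments of inertia) can be evaluated for a candidate placement, the following treats each piece of equipment as a point mass in the cabin frame; the masses and positions are toy values, and the debris impact risk index is not sketched because its form is not given in this record.

```python
# Rough evaluation of two of the layout objectives named in the abstract,
# treating each piece of equipment as a point mass in the cabin frame
# (a simplifying assumption; real tools use full CAD geometry).
import numpy as np

def cg_offset(masses, positions, target=np.zeros(3)):
    """Distance between the layout's center of gravity and a target point."""
    m = np.asarray(masses, dtype=float)
    r = np.asarray(positions, dtype=float)
    cg = (m[:, None] * r).sum(axis=0) / m.sum()
    return np.linalg.norm(cg - target)

def inertia_tensor(masses, positions):
    """Point-mass inertia tensor about the cabin origin."""
    I = np.zeros((3, 3))
    for mi, ri in zip(masses, positions):
        ri = np.asarray(ri, dtype=float)
        I += mi * (np.dot(ri, ri) * np.eye(3) - np.outer(ri, ri))
    return I

# Toy layout: three equipment boxes (units: kg, m)
masses = [12.0, 8.0, 5.0]
positions = [[0.2, 0.1, 0.0], [-0.15, 0.05, 0.1], [0.0, -0.2, -0.1]]
print(cg_offset(masses, positions), np.diag(inertia_tensor(masses, positions)))
```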

  8. An Automatic Cloud Detection Method for ZY-3 Satellite

    Directory of Open Access Journals (Sweden)

    CHEN Zhenwei

    2015-03-01

    Full Text Available Automatic cloud detection for optical satellite remote sensing images is a significant step in the production system of satellite products. For the browse images cataloged by the ZY-3 satellite, a tree-structured discriminant scheme is adopted to carry out cloud detection. Each image is divided into sub-images, and their features are extracted to classify clouds versus ground surfaces. However, due to the high complexity of clouds and surfaces and the low resolution of browse images, traditional classification algorithms based on image features have significant limitations. In view of this problem, an enhancement step applied to the original sub-images before classification is proposed in this paper to widen the texture difference between clouds and surfaces. Afterwards, using the second moment and first difference of the images, the feature vectors are extended in multi-scale space, and the cloud proportion in the image is then estimated through comprehensive analysis. The presented cloud detection algorithm has already been applied to the ZY-3 application system project, and practical experiment results indicate that it improves the accuracy of cloud detection significantly.

  9. Optical neural network system for pose determination of spinning satellites

    Science.gov (United States)

    Lee, Andrew; Casasent, David

    1990-01-01

    An optical neural network architecture and algorithm based on a Hopfield optimization network are presented for multitarget tracking. This tracker utilizes a neuron for every possible target track, and a quadratic energy function of neural activities which is minimized using gradient descent neural evolution. The neural net tracker is demonstrated as part of a system for determining position and orientation (pose) of spinning satellites with respect to a robotic spacecraft. The input to the system is time sequence video from a single camera. Novelty detection and filtering are utilized to locate and segment novel regions from the input images. The neural net multitarget tracker determines the correspondences (or tracks) of the novel regions as a function of time, and hence the paths of object (satellite) parts. The path traced out by a given part or region is approximately elliptical in image space, and the position, shape and orientation of the ellipse are functions of the satellite geometry and its pose. Having a geometric model of the satellite, and the elliptical path of a part in image space, the three-dimensional pose of the satellite is determined. Digital simulation results using this algorithm are presented for various satellite poses and lighting conditions.

  10. Breadth-First Search-Based Single-Phase Algorithms for Bridge Detection in Wireless Sensor Networks

    Science.gov (United States)

    Akram, Vahid Khalilpour; Dagdeviren, Orhan

    2013-01-01

    Wireless sensor networks (WSNs) are promising technologies for exploring harsh environments, such as oceans, wild forests, volcanic regions and outer space. Since sensor nodes may have limited transmission range, application packets may be transmitted by multi-hop communication. Thus, connectivity is a very important issue. A bridge is a critical edge whose removal breaks the connectivity of the network. Hence, it is crucial to detect bridges and take preventive measures. Since sensor nodes are battery-powered, services running on nodes should consume low energy. In this paper, we propose energy-efficient and distributed bridge detection algorithms for WSNs. Our algorithms run in a single phase and are integrated with the Breadth-First Search (BFS) algorithm, which is a popular routing algorithm. Our first algorithm is an extended version of Milic's algorithm, which is designed to reduce the message length. Our second algorithm is novel and uses ancestral knowledge to detect bridges. We explain the operation of the algorithms, prove their correctness, and analyze their message, time, space and computational complexities. To evaluate practical importance, we provide testbed experiments and extensive simulations. We show that our proposed algorithms consume fewer resources, and the energy savings of our algorithms are up to 5.5 times. PMID:23845930
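
    For orientation only, the sketch below shows the classical centralized DFS low-link method for finding bridges. It is not the authors' single-phase distributed, BFS-integrated algorithm; it simply computes the same bridge set that those distributed algorithms are designed to detect in a message- and energy-efficient way.

```python
# Centralized bridge finding via DFS low-link values (Tarjan-style baseline).
# This is NOT the distributed, single-phase method of the paper.
from collections import defaultdict

def find_bridges(edges, n):
    graph = defaultdict(list)
    for u, v in edges:
        graph[u].append(v)
        graph[v].append(u)
    disc = [-1] * n          # discovery times (-1 = unvisited)
    low = [0] * n            # low-link values
    bridges, timer = [], [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]; timer[0] += 1
        for v in graph[u]:
            if v == parent:
                continue
            if disc[v] == -1:
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if low[v] > disc[u]:      # no back edge bypasses (u, v) -> bridge
                    bridges.append((u, v))
            else:
                low[u] = min(low[u], disc[v])

    for s in range(n):
        if disc[s] == -1:
            dfs(s, -1)
    return bridges

print(find_bridges([(0, 1), (1, 2), (2, 0), (2, 3)], 4))  # [(2, 3)]
```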

  11. DEM GENERATION FROM HIGH RESOLUTION SATELLITE IMAGES THROUGH A NEW 3D LEAST SQUARES MATCHING ALGORITHM

    Directory of Open Access Journals (Sweden)

    T. Kim

    2012-09-01

    Full Text Available Automated generation of digital elevation models (DEMs) from high resolution satellite images (HRSIs) has been an active research topic for many years. However, stereo matching of HRSIs, in particular based on image-space search, is still difficult due to occlusions and building facades within them. Object-space matching schemes, proposed to overcome these problems, are often very time consuming and sensitive to the voxel dimensions. In this paper, we tried a new least squares matching (LSM) algorithm that works in a 3D object space. The algorithm starts with an initial height value at one location of the object space. From this 3D point, the left and right image points are projected. The true height is calculated by iterative least squares estimation based on the grey level differences between the left and right patches centred on the projected left and right points. We tested the 3D LSM on the Worldview images over 'Terrassa Sud' provided by the ISPRS WG I/4. We also compared the performance of the 3D LSM with correlation matching based on 2D image space and correlation matching based on 3D object space. The accuracy of the DEM from each method was analysed against the ground truth. Test results showed that 3D LSM offers more accurate DEMs than the conventional matching algorithms. Results also showed that 3D LSM is sensitive to the accuracy of the initial height value used to start the estimation. We combined the 3D correlation matching (COM) and 3D LSM for accurate and robust DEM generation from HRSIs. The major contribution of this paper is that we proposed and validated that LSM can be applied to object space and that the combination of 3D correlation and 3D LSM can be a good solution for automated DEM generation from HRSIs.

  12. Development and Application of a Portable Health Algorithms Test System

    Science.gov (United States)

    Melcher, Kevin J.; Fulton, Christopher E.; Maul, William A.; Sowers, T. Shane

    2007-01-01

    This paper describes the development and initial demonstration of a Portable Health Algorithms Test (PHALT) System that is being developed by researchers at the NASA Glenn Research Center (GRC). The PHALT System was conceived as a means of evolving the maturity and credibility of algorithms developed to assess the health of aerospace systems. Comprising an integrated hardware-software environment, the PHALT System allows systems health management algorithms to be developed in a graphical programming environment; to be tested and refined using system simulation or test data playback; and finally, to be evaluated in a real-time hardware-in-the-loop mode with a live test article. In this paper, PHALT System development is described through the presentation of a functional architecture, followed by the selection and integration of hardware and software. Also described is an initial real-time hardware-in-the-loop demonstration that used sensor data qualification algorithms to diagnose and isolate simulated sensor failures in a prototype Power Distribution Unit test-bed. Success of the initial demonstration is highlighted by the correct detection of all sensor failures and the absence of any real-time constraint violations.

  13. Variational and symplectic integrators for satellite relative orbit propagation including drag

    Science.gov (United States)

    Palacios, Leonel; Gurfil, Pini

    2018-04-01

    Orbit propagation algorithms for satellite relative motion relying on Runge-Kutta integrators are non-symplectic—a situation that leads to incorrect global behavior and degraded accuracy. Thus, attempts have been made to apply symplectic methods to integrate satellite relative motion. However, so far all these symplectic propagation schemes have not taken into account the effect of atmospheric drag. In this paper, drag-generalized symplectic and variational algorithms for satellite relative orbit propagation are developed in different reference frames, and numerical simulations with and without the effect of atmospheric drag are presented. It is also shown that high-order versions of the newly-developed variational and symplectic propagators are more accurate and are significantly faster than Runge-Kutta-based integrators, even in the presence of atmospheric drag.
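
    The drag-generalized variational and symplectic schemes of the paper are not reproduced here; purely as a minimal illustration of the integrator class being discussed, a first-order symplectic (semi-implicit) Euler step for Keplerian two-body motion is sketched below, with the orbit and step size chosen arbitrarily.

```python
# Minimal first-order symplectic (semi-implicit) Euler step for two-body
# motion, shown only to illustrate the class of integrators discussed; the
# paper's drag-generalized variational/symplectic schemes are more involved.
import numpy as np

MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def accel(r):
    return -MU * r / np.linalg.norm(r) ** 3

def symplectic_euler_step(r, v, dt):
    v_new = v + dt * accel(r)     # kick using the *old* position
    r_new = r + dt * v_new        # drift using the *new* velocity
    return r_new, v_new

# Propagate a circular LEO for roughly one orbital period (illustrative values)
r = np.array([6.778e6, 0.0, 0.0])
v = np.array([0.0, np.sqrt(MU / 6.778e6), 0.0])
dt, steps = 1.0, 5554
for _ in range(steps):
    r, v = symplectic_euler_step(r, v, dt)
```

    Unlike an explicit Runge-Kutta step, this kick-drift update preserves a nearby Hamiltonian, so the orbital energy error stays bounded over long propagations instead of drifting secularly.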

  14. A Fast and Sensitive New Satellite SO2 Retrieval Algorithm based on Principal Component Analysis: Application to the Ozone Monitoring Instrument

    Science.gov (United States)

    Li, Can; Joiner, Joanna; Krotkov, A.; Bhartia, Pawan K.

    2013-01-01

    We describe a new algorithm to retrieve SO2 from satellite-measured hyperspectral radiances. We employ the principal component analysis technique in regions with no significant SO2 to capture radiance variability caused by both physical processes (e.g., Rayleigh and Raman scattering and ozone absorption) and measurement artifacts. We use the resulting principal components and SO2 Jacobians calculated with a radiative transfer model to directly estimate SO2 vertical column density in one step. Application to the Ozone Monitoring Instrument (OMI) radiance spectra in 310.5-340 nm demonstrates that this approach can greatly reduce biases in the operational OMI product and decrease the noise by a factor of 2, providing greater sensitivity to anthropogenic emissions. The new algorithm is fast, eliminates the need for instrument-specific radiance correction schemes, and can be easily adapted to other sensors. These attributes make it a promising technique for producing long-term, consistent SO2 records for air quality and climate research.
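
    As an illustration of the core idea only (fit each measured spectrum as a linear combination of principal components derived from SO2-free radiances plus an SO2 Jacobian, and read the column off the Jacobian coefficient), a minimal numpy sketch follows. The number of components, fitting window and units are placeholders, not the operational OMI settings.

```python
# Minimal sketch of a PCA-based trace-gas fit: principal components derived
# from SO2-free spectra absorb geophysical and instrumental variability, and
# the coefficient of the SO2 Jacobian gives the vertical column. The number
# of components and the units are placeholders, not the paper's settings.
import numpy as np

def pca_so2_fit(radiance_free, measured, so2_jacobian, n_pc=8):
    """radiance_free: (n_spectra, n_wavelengths) SO2-free training radiances;
    measured: (n_wavelengths,) spectrum; so2_jacobian: (n_wavelengths,) dI/dSO2."""
    mean = radiance_free.mean(axis=0)
    _, _, vt = np.linalg.svd(radiance_free - mean, full_matrices=False)
    pcs = vt[:n_pc]                                   # leading components
    design = np.vstack([pcs, so2_jacobian]).T         # (n_wl, n_pc + 1)
    coeffs, *_ = np.linalg.lstsq(design, measured - mean, rcond=None)
    return coeffs[-1]                                  # SO2 column estimate
```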

  15. Comparison of primary productivity estimates in the Baltic Sea based on the DESAMBEM algorithm with estimates based on other similar algorithms

    Directory of Open Access Journals (Sweden)

    Małgorzata Stramska

    2013-02-01

    Full Text Available The quasi-synoptic view available from satellites has been broadly used in recent years to observe in near-real time the large-scale dynamics of marine ecosystems and to estimate primary productivity in the world ocean. However, the standard global NASA ocean colour algorithms generally do not produce good results in the Baltic Sea. In this paper, we compare the ability of seven algorithms to estimate depth-integrated daily primary production (PP, mg C m-2) in the Baltic Sea. All the algorithms use surface chlorophyll concentration, sea surface temperature, photosynthetically available radiation, latitude, longitude and day of the year as input data. Algorithm-derived PP is then compared with PP estimates obtained from 14C uptake measurements. The results indicate that the best agreement between the modelled and measured PP in the Baltic Sea is obtained with the DESAMBEM algorithm. This result supports the notion that a regional approach should be used in the interpretation of ocean colour satellite data in the Baltic Sea.

  16. Development of an Experimental Testbed for Research in Lithium-Ion Battery Management Systems

    Directory of Open Access Journals (Sweden)

    Mehdi Ferdowsi

    2013-10-01

    Full Text Available Advanced electrochemical batteries are becoming an integral part of a wide range of applications from household and commercial to smart grid, transportation, and aerospace applications. Among different battery technologies, lithium-ion (Li-ion batteries are growing more and more popular due to their high energy density, high galvanic potential, low self-discharge, low weight, and the fact that they have almost no memory effect. However, one of the main obstacles facing the widespread commercialization of Li-ion batteries is the design of reliable battery management systems (BMSs. An efficient BMS ensures electrical safety during operation, while increasing battery lifetime, capacity and thermal stability. Despite the need for extensive research in this field, the majority of research conducted on Li-ion battery packs and BMS are proprietary works conducted by manufacturers. The available literature, however, provides either general descriptions or detailed analysis of individual components of the battery system, and ignores addressing details of the overall system development. This paper addresses the development of an experimental research testbed for studying Li-ion batteries and their BMS design. The testbed can be configured in a variety of cell and pack architectures, allowing for a wide range of BMS monitoring, diagnostics, and control technologies to be tested and analyzed. General considerations that should be taken into account while designing Li-ion battery systems are reviewed and different technologies and challenges commonly encountered in Li-ion battery systems are investigated. This testbed facilitates future development of more practical and improved BMS technologies with the aim of increasing the safety, reliability, and efficiency of existing Li-ion battery systems. Experimental results of initial tests performed on the system are used to demonstrate some of the capabilities of the developed research testbed. To the authors

  17. An Overview of NASA's Subsonic Research Aircraft Testbed (SCRAT)

    Science.gov (United States)

    Baumann, Ethan; Hernandez, Joe; Ruhf, John C.

    2013-01-01

    National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research; more specifically raising the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and gathering high-quality research data suitable for verifying the technologies, and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.

  18. Generative Street Addresses from Satellite Imagery

    Directory of Open Access Journals (Sweden)

    İlke Demir

    2018-03-01

    Full Text Available We describe our automatic generative algorithm to create street addresses from satellite images by learning and labeling roads, regions, and address cells. Currently, 75% of the world’s roads lack adequate street addressing systems. Recent geocoding initiatives tend to convert pure latitude and longitude information into a memorable form for unknown areas. However, settlements are identified by streets, and such addressing schemes are not coherent with the road topology. Instead, we propose a generative address design that maps the globe in accordance with streets. Our algorithm starts with extracting roads from satellite imagery by utilizing deep learning. Then, it uniquely labels the regions, roads, and structures using some graph- and proximity-based algorithms. We also extend our addressing scheme to (i) cover inaccessible areas following similar design principles; (ii) be inclusive and flexible for changes on the ground; and (iii) lead as a pioneer for a unified street-based global geodatabase. We present our results on an example of a developed city and multiple undeveloped cities. We also compare productivity on the basis of current ad hoc and new complete addresses. We conclude by contrasting our generative addresses to current industrial and open solutions.

  19. Climatology 2011: An MLS and Sonde Derived Ozone Climatology for Satellite Retrieval Algorithms

    Science.gov (United States)

    McPeters, Richard D.; Labow, Gordon J.

    2012-01-01

    The ozone climatology used as the a priori for the version 8 Solar Backscatter Ultraviolet (SBUV) retrieval algorithms has been updated. The Microwave Limb Sounder (MLS) instrument on Aura has excellent latitude coverage and measures ozone daily from the upper troposphere to the lower mesosphere. The new climatology consists of monthly average ozone profiles for ten degree latitude zones covering pressure altitudes from 0 to 65 km. The climatology was formed by combining data from Aura MLS (2004-2010) with data from balloon sondes (1988-2010). Ozone below 8 km (below 12 km at high latitudes) is based on balloons sondes, while ozone above 16 km (21 km at high latitudes) is based on MLS measurements. Sonde and MLS data are blended in the transition region. Ozone accuracy in the upper troposphere is greatly improved because of the near uniform coverage by Aura MLS, while the addition of a large number of balloon sonde measurements improves the accuracy in the lower troposphere, in the tropics and southern hemisphere in particular. The addition of MLS data also improves the accuracy of climatology in the upper stratosphere and lower mesosphere. The revised climatology has been used for the latest reprocessing of SBUV and TOMS satellite ozone data.

  20. System engineering approach to GPM retrieval algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Rose, C. R. (Chris R.); Chandrasekar, V.

    2004-01-01

    System engineering principles and methods are very useful in large-scale complex systems for developing the engineering requirements from end-user needs. Integrating research into system engineering is a challenging task. The proposed Global Precipitation Mission (GPM) satellite will use a dual-wavelength precipitation radar to measure and map global precipitation with unprecedented accuracy, resolution and areal coverage. The satellite vehicle, precipitation radars, retrieval algorithms, and ground validation (GV) functions are all critical subsystems of the overall GPM system and each contributes to the success of the mission. Errors in the radar measurements and models can adversely affect the retrieved output values. Ground validation (GV) systems are intended to provide timely feedback to the satellite and retrieval algorithms based on measured data. These GV sites will consist of radars and DSD measurement systems and also have intrinsic constraints. One of the retrieval algorithms being studied for use with GPM is the dual-wavelength DSD algorithm that does not use the surface reference technique (SRT). The underlying microphysics of precipitation structures and drop-size distributions (DSDs) dictate the types of models and retrieval algorithms that can be used to estimate precipitation. Many types of dual-wavelength algorithms have been studied. Meneghini (2002) analyzed the performance of single-pass dual-wavelength surface-reference-technique (SRT) based algorithms. Mardiana (2003) demonstrated that a dual-wavelength retrieval algorithm could be successfully used without the use of the SRT. It uses an iterative approach based on measured reflectivities at both wavelengths and complex microphysical models to estimate both N0 and D0 at each range bin. More recently, Liao (2004) proposed a solution to the D0 ambiguity problem in rain within the dual-wavelength algorithm and showed a possible melting layer model based on stratified spheres. With the N0 and D0

  1. Validation of ozone profile retrievals derived from the OMPS LP version 2.5 algorithm against correlative satellite measurements

    Directory of Open Access Journals (Sweden)

    N. A. Kramarova

    2018-05-01

    Full Text Available The Limb Profiler (LP) is a part of the Ozone Mapping and Profiler Suite launched on board the Suomi NPP satellite in October 2011. The LP measures solar radiation scattered from the atmospheric limb in ultraviolet and visible spectral ranges between the surface and 80 km. These measurements of scattered solar radiances allow for the retrieval of ozone profiles from cloud tops up to 55 km. The LP started operational observations in April 2012. In this study we evaluate more than 5.5 years of ozone profile measurements from the OMPS LP processed with the new NASA GSFC version 2.5 retrieval algorithm. We provide a brief description of the key changes that have been implemented in this new algorithm, including a pointing correction, new cloud height detection, explicit aerosol correction and a reduction of the number of wavelengths used in the retrievals. The OMPS LP ozone retrievals have been compared with independent satellite profile measurements obtained from the Aura Microwave Limb Sounder (MLS), Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS) and Odin Optical Spectrograph and InfraRed Imaging System (OSIRIS). We document observed biases and seasonal differences and evaluate the stability of the version 2.5 ozone record over 5.5 years. Our analysis indicates that the mean differences between LP and correlative measurements are well within the required ±10 % between 18 and 42 km. In the upper stratosphere and lower mesosphere (> 43 km) LP tends to have a negative bias. We find larger biases in the lower stratosphere and upper troposphere, but LP ozone retrievals have significantly improved in version 2.5 compared to version 2 due to the implemented aerosol correction. In the northern high latitudes we observe larger biases between 20 and 32 km due to the remaining thermal sensitivity issue. Our analysis shows that LP ozone retrievals agree well with the correlative satellite observations in characterizing

  2. Satellite Ocean Biology: Past, Present, Future

    Science.gov (United States)

    McClain, Charles R.

    2012-01-01

    Since 1978 when the first satellite ocean color proof-of-concept sensor, the Nimbus-7 Coastal Zone Color Scanner, was launched, much progress has been made in refining the basic measurement concept and expanding the research applications of global satellite time series of biological and optical properties such as chlorophyll-a concentrations. The seminar will review the fundamentals of satellite ocean color measurements (sensor design considerations, on-orbit calibration, atmospheric corrections, and bio-optical algorithms), scientific results from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) and Moderate resolution Imaging Spectroradiometer (MODIS) missions, and the goals of future NASA missions such as PACE, the Aerosol, Cloud, Ecology (ACE), and Geostationary Coastal and Air Pollution Events (GeoCAPE) missions.

  3. Design of a low-power testbed for Wireless Sensor Networks and verification

    NARCIS (Netherlands)

    van Hoesel, L.F.W.; Dulman, S.O.; Havinga, Paul J.M.; Kip, Harry J.

    In this document the design considerations and component choices of a testbed prototype device for wireless sensor networks will be discussed. These devices must be able to monitor their physical environment, process data and assist other nodes in forwarding sensor readings. For these tasks, five

  4. Spin glasses and algorithm benchmarks: A one-dimensional view

    International Nuclear Information System (INIS)

    Katzgraber, H G

    2008-01-01

    Spin glasses are paradigmatic models that deliver concepts relevant for a variety of systems. However, rigorous analytical results are difficult to obtain for spin-glass models, in particular for realistic short-range models. Therefore large-scale numerical simulations are the tool of choice. Concepts and algorithms derived from the study of spin glasses have been applied to diverse fields in computer science and physics. In this work a one-dimensional long-range spin-glass model with power-law interactions is discussed. The model has the advantage over conventional systems in that by tuning the power-law exponent of the interactions the effective space dimension can be changed thus effectively allowing the study of large high-dimensional spin-glass systems to address questions as diverse as the existence of an Almeida-Thouless line, ultrametricity and chaos in short range spin glasses. Furthermore, because the range of interactions can be changed, the model is a formidable test-bed for optimization algorithms
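
    A minimal sketch of the Hamiltonian of such a one-dimensional power-law spin glass is given below; the Gaussian couplings and the simple open-chain distance used here are illustrative assumptions, not necessarily the normalization conventions of the work described.

```python
# Energy of a 1D long-range Ising spin glass with power-law interactions,
# J_ij ~ eps_ij / r_ij**sigma with Gaussian eps_ij. The normalization and the
# open-chain distances are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def couplings(n, sigma):
    i, j = np.triu_indices(n, k=1)
    r = np.abs(i - j).astype(float)
    return i, j, rng.standard_normal(i.size) / r ** sigma

def energy(spins, i, j, J):
    return -np.sum(J * spins[i] * spins[j])

n, sigma = 64, 1.5                      # sigma tunes the effective dimension
i, j, J = couplings(n, sigma)
spins = rng.choice([-1, 1], size=n)
print(energy(spins, i, j, J))
```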

  5. Testbed model and data assimilation for ARM

    International Nuclear Information System (INIS)

    Louis, J.F.

    1992-01-01

    The objectives of this contract are to further develop and test the ALFA (AER Local Forecast and Assimilation) model originally designed at AER for local weather prediction and apply it to three distinct but related purposes in connection with the Atmospheric Radiation Measurement (ARM) program: (a) to provide a testbed that simulates a global climate model in order to facilitate the development and testing of new cloud parametrizations and radiation models; (b) to assimilate the ARM data continuously at the scale of a climate model, using the adjoint method, thus providing the initial conditions and verification data for testing parametrizations; (c) to study the sensitivity of a radiation scheme to cloud parameters, again using the adjoint method, thus demonstrating the usefulness of the testbed model. The data assimilation will use a variational technique that minimizes the difference between the model results and the observations during the analysis period. The adjoint model is used to compute the gradient of a measure of the model errors with respect to nudging terms that are added to the equations to force the model output closer to the data. The radiation scheme that will be included in the basic ALFA model makes use of a general two-stream approximation, and is designed for vertically inhomogeneous, multiple-scattering atmospheres. The sensitivity of this model to the definition of cloud parameters will be studied. The adjoint technique will also be used to compute the sensitivities. This project is designed to provide the Science Team members with the appropriate tools and modeling environment for proper testing and tuning of new radiation models and cloud parametrization schemes.

  6. Attitude Determination with Magnetometers and Accelerometers to Use in Satellite Simulator

    Directory of Open Access Journals (Sweden)

    Helio Koiti Kuga

    2013-01-01

    Full Text Available Attitude control of artificial satellites depends on information provided by the attitude determination process. This paper presents the implementation and tests of a fully self-contained algorithm for attitude determination using magnetometers and accelerometers, for application on a satellite simulator based on frictionless air bearing tables. However, it is known that magnetometers and accelerometers need to be calibrated so that their measurements can be used to their ultimate accuracy. A calibration method is implemented which proves to be essential for improving attitude determination accuracy. For the stepwise real-time attitude determination, the well-known QUEST algorithm was used, which yields a quick response with reduced computer resources. The algorithms are tested and qualified with actual data collected on the streets under controlled situations. For these street runs, the experiment employs a solid-state magnetoresistive magnetometer and an IMU navigation block consisting of triads of accelerometers and gyros, with MEMS technology. A GPS receiver is used to record positional information. The collected measurements are processed through the developed algorithms, and comparisons are made for attitude determination using calibrated and noncalibrated data. The results show that the attitude accuracy reaches the requirements for real-time operation for satellite simulator platforms.
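
    QUEST itself involves an eigenvalue problem and is not reproduced here; as a compact illustration of deterministic attitude determination from the same two vector measurements (magnetic field and gravity direction), the classical TRIAD construction is sketched below. It is a simpler alternative to, not an implementation of, the QUEST algorithm used in the paper.

```python
# TRIAD attitude determination from two vector observations (e.g., magnetic
# field and gravity) measured in the body frame, with known reference-frame
# directions. Shown only to illustrate the measurement geometry; it is not
# the QUEST implementation of the paper.
import numpy as np

def triad(b1, b2, r1, r2):
    """Rotation matrix taking reference-frame vectors into the body frame,
    built from body observations (b1, b2) and reference vectors (r1, r2)."""
    def frame(v1, v2):
        t1 = v1 / np.linalg.norm(v1)
        t2 = np.cross(v1, v2); t2 /= np.linalg.norm(t2)
        t3 = np.cross(t1, t2)
        return np.column_stack([t1, t2, t3])
    return frame(b1, b2) @ frame(r1, r2).T

# Toy check: a 90-degree yaw rotation is recovered from two noiseless vectors
A_true = np.array([[0., 1., 0.], [-1., 0., 0.], [0., 0., 1.]])
r1, r2 = np.array([1., 0., 0.]), np.array([0., 0., 1.])
print(np.allclose(triad(A_true @ r1, A_true @ r2, r1, r2), A_true))  # True
```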

  7. A Cloud Top Pressure Algorithm for DSCOVR-EPIC

    Science.gov (United States)

    Min, Q.; Morgan, E. C.; Yang, Y.; Marshak, A.; Davis, A. B.

    2017-12-01

    The Earth Polychromatic Imaging Camera (EPIC) sensor on the Deep Space Climate Observatory (DSCOVR) satellite presents unique opportunities to derive cloud properties of the entire daytime Earth. In particular, the Oxygen A- and B-band and corresponding reference channels provide cloud top pressure information. In order to address the in-cloud penetration depth issue—and ensuing retrieval bias—a comprehensive sensitivity study has been conducted to simulate satellite-observed radiances for a wide variety of cloud structures and optical properties. Based on this sensitivity study, a cloud top pressure algorithm for DSCOVR-EPIC has been developed. Further, the algorithm has been applied to EPIC measurements.

  8. Moon Search Algorithms for NASA's Dawn Mission to Asteroid Vesta

    Science.gov (United States)

    Memarsadeghi, Nargess; Mcfadden, Lucy A.; Skillman, David R.; McLean, Brian; Mutchler, Max; Carsenty, Uri; Palmer, Eric E.

    2012-01-01

    A moon or natural satellite is a celestial body that orbits a planetary body such as a planet, dwarf planet, or an asteroid. Scientists seek understanding the origin and evolution of our solar system by studying moons of these bodies. Additionally, searches for satellites of planetary bodies can be important to protect the safety of a spacecraft as it approaches or orbits a planetary body. If a satellite of a celestial body is found, the mass of that body can also be calculated once its orbit is determined. Ensuring the Dawn spacecraft's safety on its mission to the asteroid Vesta primarily motivated the work of Dawn's Satellite Working Group (SWG) in summer of 2011. Dawn mission scientists and engineers utilized various computational tools and techniques for Vesta's satellite search. The objectives of this paper are to 1) introduce the natural satellite search problem, 2) present the computational challenges, approaches, and tools used when addressing this problem, and 3) describe applications of various image processing and computational algorithms for performing satellite searches to the electronic imaging and computer science community. Furthermore, we hope that this communication would enable Dawn mission scientists to improve their satellite search algorithms and tools and be better prepared for performing the same investigation in 2015, when the spacecraft is scheduled to approach and orbit the dwarf planet Ceres.

  9. The OGC Innovation Program Testbeds - Advancing Architectures for Earth and Systems

    Science.gov (United States)

    Bermudez, L. E.; Percivall, G.; Simonis, I.; Serich, S.

    2017-12-01

    The OGC Innovation Program provides a collaborative agile process for solving challenging science problems and advancing new technologies. Since 1999, 100 initiatives have taken place, from multi-million dollar testbeds to small interoperability experiments. During these initiatives, sponsors and technology implementers (including academia and the private sector) come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. This presentation will provide the latest system architectures that can be used for Earth and space systems as a result of the OGC Testbed 13, including the following components: Elastic cloud autoscaler for Earth Observations (EO) using a WPS in an ESGF hybrid climate data research platform. Accessibility of climate data for scientist and non-scientist users via on-demand models wrapped in WPS. Standards descriptions for containerized applications to discover processes on the cloud, including using linked data, a WPS extension for hybrid clouds and linking to hybrid big data stores. OpenID and OAuth to secure OGC Services with built-in Attribute Based Access Control (ABAC) infrastructures leveraging GeoDRM patterns. Publishing and access of vector tiles, including use of compression and attribute options reusing patterns from WMS, WMTS and WFS. Servers providing 3D Tiles and streaming of data, including Indexed 3d Scene Layer (I3S), CityGML and Common DataBase (CDB). Asynchronous Services with advanced push notification strategies, with a filter language instead of simple topic subscriptions, that can be used across OGC services. Testbed 14 will continue advancing topics like Big Data, security, and streaming, as well as making OGC services easier to use (e.g., via RESTful APIs). The Call for Participation will be issued in December and responses are due in mid-January 2018.

  10. Event metadata records as a testbed for scalable data mining

    International Nuclear Information System (INIS)

    Gemmeren, P van; Malon, D

    2010-01-01

    At a data rate of 200 hertz, event metadata records ('TAGs,' in ATLAS parlance) provide fertile grounds for development and evaluation of tools for scalable data mining. It is easy, of course, to apply HEP-specific selection or classification rules to event records and to label such an exercise 'data mining,' but our interest is different. Advanced statistical methods and tools such as classification, association rule mining, and cluster analysis are common outside the high energy physics community. These tools can prove useful, not for discovery physics, but for learning about our data, our detector, and our software. A fixed and relatively simple schema makes TAG export to other storage technologies such as HDF5 straightforward. This simplifies the task of exploiting very-large-scale parallel platforms such as Argonne National Laboratory's BlueGene/P, currently the largest supercomputer in the world for open science, in the development of scalable tools for data mining. Using a domain-neutral scientific data format may also enable us to take advantage of existing data mining components from other communities. There is, further, a substantial literature on the topic of one-pass algorithms and stream mining techniques, and such tools may be inserted naturally at various points in the event data processing and distribution chain. This paper describes early experience with event metadata records from ATLAS simulation and commissioning as a testbed for scalable data mining tool development and evaluation.

  11. Satellite Control Laboratory

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Bak, Thomas

    2001-01-01

    The Satellite Laboratory at the Department of Control Engineering of Aalborg University (SatLab) is a dynamic motion facility designed for analysis and test of micro spacecraft. A unique feature of the laboratory is that it provides a completely gravity-free environment. A test spacecraft... of the laboratory is to conduct dynamic tests of the control and attitude determination algorithms during nominal operation and in abnormal conditions. Further, it is intended to use SatLab for validation of various algorithms for fault detection, accommodation and supervisory control. Different mission objectives... can be implemented in the laboratory, e.g. three-axis attitude control, slew manoeuvres, spin stabilization using magnetic actuation and/or reaction wheels. The spacecraft attitude can be determined by applying magnetometer measurements...

  12. Evaluation of Land Surface Temperature Operationally Retrieved from Korean Geostationary Satellite (COMS Data

    Directory of Open Access Journals (Sweden)

    A-Ra Cho

    2013-08-01

    Full Text Available We evaluated the precision of land surface temperature (LST) operationally retrieved from the Korean multipurpose geostationary satellite, Communication, Ocean and Meteorological Satellite (COMS). The split-window (SW)-type retrieval algorithm was developed through radiative transfer model simulations under various atmospheric profiles, satellite zenith angles, surface emissivity values and surface lapse rate conditions using Moderate Resolution Atmospheric Transmission version 4 (MODTRAN4). The estimation capabilities of the COMS SW (CSW) LST algorithm were evaluated for various impacting factors, and the retrieval accuracy of COMS LST data was evaluated with collocated Moderate Resolution Imaging Spectroradiometer (MODIS) LST data. The surface emissivity values for the two SW channels were generated using a vegetation cover method. The CSW algorithm estimated the LST distribution reasonably well (average bias = 0.00 K, root mean square error (RMSE) = 1.41 K, correlation coefficient = 0.99); however, the estimation capabilities of the CSW algorithm were significantly impacted by large brightness temperature differences and surface lapse rates. The CSW algorithm reproduced spatiotemporal variations of LST comparing well with MODIS LST data, irrespective of the month or time of day at which the data were collected. The one-year evaluation results with MODIS LST data showed that the annual mean bias, RMSE and correlation coefficient for the CSW algorithm were −1.009 K, 2.613 K and 0.988, respectively.
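
    The CSW regression coefficients come from the MODTRAN4 simulations described above and are not reproduced in this record; the generic split-window form that such algorithms take is roughly the following, with clearly marked placeholder coefficients.

```python
# Generic split-window (SW) form of the kind described in the abstract. The
# coefficients below are PLACEHOLDERS: in the CSW algorithm they are
# regression values from MODTRAN4 simulations over atmospheric profiles,
# view angles, emissivities and lapse rates, and are not given in this record.
def split_window_lst(t11, t12, emis_mean, emis_diff,
                     c0=0.0, c1=1.0, c2=2.0, c3=0.4, c4=55.0, c5=-80.0):
    """t11, t12: brightness temperatures (K) of the two split-window channels;
    emis_mean, emis_diff: mean and difference of the channel emissivities."""
    dt = t11 - t12
    return (c0 + c1 * t11 + c2 * dt + c3 * dt ** 2
            + c4 * (1.0 - emis_mean) + c5 * emis_diff)
```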

  13. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    Science.gov (United States)

    Knote, Christoph; Barré, Jérôme; Eckl, Max

    2018-02-01

    The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.

  14. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    Directory of Open Access Journals (Sweden)

    C. Knote

    2018-02-01

    Full Text Available The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.

  15. A technical description of the FlexHouse Project Testbed

    DEFF Research Database (Denmark)

    Sørensen, Jens Otto

    2000-01-01

    This paper describes the FlexHouse project testbed: a server dedicated to experiments within the FlexHouse project. The FlexHouse project originates from The Business Computing Research Group at The Aarhus School of Business. The purpose of the project is to identify and develop... methods that satisfy the following three requirements: flexibility with respect to evolving data sources; flexibility with respect to changing information needs; and efficiency with respect to view management.

  16. Testbed for a LiFi system integrated in streetlights

    OpenAIRE

    Monzón Baeza, Victor; Sánchez Fernández, Matilde Pilar; García-Armada, Ana; Royo, A.

    2015-01-01

    Proceedings of the 2015 European Conference on Networks and Communications (EuCNC), June 29 - July 2, Paris, France. In this paper, a functional LiFi real-time testbed implemented on FPGAs is presented. The setup evaluates the performance of our design in a downlink scenario where the transmitter is embedded in the streetlights and a mobile phone’s camera is used as the receiver, thereby achieving the goal of lighting and communicating simultaneously. To validate the ...

  17. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs : Evaluation Report for the San Diego Testbed : Draft Report.

    Science.gov (United States)

    2017-07-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  18. PlanetLab Europe as Geographically-Distributed Testbed for Software Development and Evaluation

    Directory of Open Access Journals (Sweden)

    Dan Komosny

    2015-01-01

    Full Text Available In this paper, we analyse the use of PlanetLab Europe for the development and evaluation of geographically-oriented Internet services. PlanetLab is a global research network whose main purpose is to support the development of new Internet services and protocols. PlanetLab is divided into several branches; one of them is PlanetLab Europe. PlanetLab Europe consists of about 350 nodes at 150 geographically different sites. The nodes are accessible by remote login, and the users can run their software on the nodes. In the paper, we study PlanetLab's properties that are significant for its use as a geographically distributed testbed. These include node position accuracy, service availability and stability. We find a considerable number of location inaccuracies and a number of services that cannot be considered reliable. Based on the results we propose a simple approach to node selection in testbeds for the development and evaluation of geographically-oriented Internet services.

  19. Energy-driven scheduling algorithm for nanosatellite energy harvesting maximization

    Science.gov (United States)

    Slongo, L. K.; Martínez, S. V.; Eiterer, B. V. B.; Pereira, T. G.; Bezerra, E. A.; Paiva, K. V.

    2018-06-01

    The number of tasks that a satellite may execute in orbit is strongly related to the amount of energy its Electrical Power System (EPS) is able to harvest and to store. The manner in which the stored energy is distributed within the satellite also has a great impact on the CubeSat's overall efficiency. Most CubeSat EPS designs do not prioritize energy constraints in their formulation. In contrast, this work proposes an innovative energy-driven scheduling algorithm based on an energy harvesting maximization policy. The energy harvesting circuit is mathematically modeled and the solar panel I-V curves are presented for different temperature and irradiance levels. Considering the models and simulations, the scheduling algorithm is designed to keep the solar panels working close to their maximum power point by triggering tasks appropriately. Task execution affects battery voltage, which is coupled to the solar panels through a protection circuit. A software-based Perturb and Observe strategy allows defining the tasks to be triggered. The scheduling algorithm is tested in FloripaSat, which is a 1U CubeSat. A test apparatus is proposed to emulate solar irradiance variation, considering the satellite movement around the Earth. Tests have been conducted to show that the scheduling algorithm improves the CubeSat energy harvesting capability by 4.48% in a three-orbit experiment and up to 8.46% in a single-orbit cycle in comparison with the CubeSat operating without the scheduling algorithm.
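
    The coupling between the scheduler and FloripaSat's EPS hardware is specific to that mission; purely as an illustration of a software-based Perturb and Observe decision loop of the kind mentioned, the sketch below perturbs the operating point by enabling or deferring a task and observes the harvested power. The interface functions (read_panel_power, enable_task, defer_task) are hypothetical, not FloripaSat's actual API.

```python
# Bare-bones software Perturb & Observe loop in the spirit of the abstract:
# the scheduler perturbs the operating point by enabling a pending task and
# observes whether the harvested power stays near the maximum power point.
# read_panel_power(), enable_task() and defer_task() are hypothetical hooks.
def perturb_and_observe(read_panel_power, enable_task, defer_task, pending_tasks):
    last_power = read_panel_power()
    for task in pending_tasks:
        enable_task(task)                 # perturb: add the task's load
        power = read_panel_power()        # observe the new harvested power
        if power < last_power:            # moved away from the MPP
            defer_task(task)              # undo the perturbation
        else:
            last_power = power            # keep the task; track the new point
```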

  20. Use of Multiangle Satellite Observations to Retrieve Aerosol Properties and Ocean Color

    Science.gov (United States)

    Martonchik, John V.; Diner, David; Khan, Ralph

    2005-01-01

    A new technique is described for retrieving aerosols over ocean water and the associated ocean color using multiangle satellite observations. Unlike current satellite aerosol retrieval algorithms, which only utilize observations at red wavelengths and longer, with the assumption that these wavelengths have a negligible ocean signal (water-leaving radiance), this new algorithm uses all available spectral bands and simultaneously retrieves both aerosol properties and the spectral ocean color. We show some results of case studies using MISR data, performed over different water conditions (coastal water, blooms, and open water).

  1. Optimization of Joint Power and Bandwidth Allocation in Multi-Spot-Beam Satellite Communication Systems

    Directory of Open Access Journals (Sweden)

    Heng Wang

    2014-01-01

    Full Text Available The multi-spot-beam technique has been widely applied in modern satellite communication systems. However, the satellite power and bandwidth resources in a multi-spot-beam satellite communication system are scarce and expensive, so they must be utilized efficiently. To this end, dynamic allocation of power and bandwidth is an effective approach. This paper first formulates the joint resource allocation problem as a convex optimization problem, taking into account a compromise between the maximum total system capacity and the fairness among the spot beams. A joint bandwidth and power allocation iterative algorithm based on duality theory is then proposed to obtain the optimal solution of this optimization problem. Compared with the existing separate bandwidth or power optimal allocation algorithms, it is shown that the joint allocation algorithm improves both the total system capacity and the fairness among spot beams. Moreover, it is easy to implement in practice, as the computational complexity of the proposed algorithm is linear in the number of spot beams.
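
    The duality-based iteration of the paper is not reproduced here; as a small numerical check of the underlying convex problem (total capacity maximization under power and bandwidth budgets, with the fairness term omitted), a generic solver can be used as sketched below. All parameter values are illustrative and normalized; they are not taken from the paper.

```python
# Small numerical version of the joint power/bandwidth allocation problem:
# maximize sum_i W_i*log2(1 + g_i*P_i/(N0*W_i)) s.t. sum P_i <= P_tot and
# sum W_i <= W_tot. The paper solves this (plus a fairness term) with a
# duality-based iteration; a generic SLSQP solve is used here instead.
import numpy as np
from scipy.optimize import minimize

g = np.array([2.0, 1.0, 0.5])      # illustrative, normalized channel gains
N0, P_tot, W_tot = 1.0, 10.0, 10.0  # assumed noise density and budgets
n = g.size

def neg_capacity(x):
    p, w = x[:n], x[n:]
    return -np.sum(w * np.log2(1.0 + g * p / (N0 * w)))

x0 = np.concatenate([np.full(n, P_tot / n), np.full(n, W_tot / n)])
res = minimize(
    neg_capacity, x0, method="SLSQP",
    bounds=[(1e-3, None)] * (2 * n),
    constraints=[{"type": "ineq", "fun": lambda x: P_tot - x[:n].sum()},
                 {"type": "ineq", "fun": lambda x: W_tot - x[n:].sum()}])
print(res.x[:n], res.x[n:])         # per-beam power and bandwidth allocation
```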

  2. Accelerating Innovation that Enhances Resource Recovery in the Wastewater Sector: Advancing a National Testbed Network.

    Science.gov (United States)

    Mihelcic, James R; Ren, Zhiyong Jason; Cornejo, Pablo K; Fisher, Aaron; Simon, A J; Snyder, Seth W; Zhang, Qiong; Rosso, Diego; Huggins, Tyler M; Cooper, William; Moeller, Jeff; Rose, Bob; Schottel, Brandi L; Turgeon, Jason

    2017-07-18

    This Feature examines significant challenges and opportunities to spur innovation and accelerate adoption of reliable technologies that enhance integrated resource recovery in the wastewater sector through the creation of a national testbed network. The network is a virtual entity that connects appropriate physical testing facilities, and other components needed for a testbed network, with researchers, investors, technology providers, utilities, regulators, and other stakeholders to accelerate the adoption of innovative technologies and processes that are needed for the water resource recovery facility of the future. Here we summarize and extract key issues and developments, to provide a strategy for the wastewater sector to accelerate a path forward that leads to new sustainable water infrastructures.

  3. Exploring Subpixel Learning Algorithms for Estimating Global Land Cover Fractions from Satellite Data Using High Performance Computing

    Directory of Open Access Journals (Sweden)

    Uttam Kumar

    2017-10-01

    Full Text Available Land cover (LC) refers to the physical and biological cover present over the Earth’s surface in terms of the natural environment such as vegetation, water, bare soil, etc. Most LC features occur at finer spatial scales compared to the resolution of primary remote sensing satellites. Therefore, observed data are a mixture of spectral signatures of two or more LC features resulting in mixed pixels. One solution to the mixed pixel problem is the use of subpixel learning algorithms to disintegrate the pixel spectrum into its constituent spectra. Despite the popularity and existing research conducted on the topic, the most appropriate approach is still under debate. As an attempt to address this question, we compared the performance of several subpixel learning algorithms based on least squares, sparse regression, signal–subspace and geometrical methods. Analysis of the results obtained through computer-simulated and Landsat data indicated that fully constrained least squares (FCLS) outperformed the other techniques. Further, FCLS was used to unmix global Web-Enabled Landsat Data to obtain abundances of substrate (S), vegetation (V) and dark object (D) classes. Due to the sheer nature of data and computational needs, we leveraged the NASA Earth Exchange (NEX) high-performance computing architecture to optimize and scale our algorithm for large-scale processing. Subsequently, the S-V-D abundance maps were characterized into four classes, namely forest, farmland, water and urban areas (in conjunction with nighttime lights data) over California, USA, using a random forest classifier. Validation of these LC maps with the National Land Cover Database 2011 products and North American Forest Dynamics static forest map shows a 6% improvement in unmixing-based classification relative to per-pixel classification. As such, abundance maps continue to offer a useful alternative to high-spatial-resolution classified maps for forest inventory analysis, multi
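
    One common way to implement fully constrained least squares (nonnegativity plus sum-to-one), shown below purely as an illustration and not necessarily the implementation used in the study, enforces the sum-to-one constraint by appending a heavily weighted row to a nonnegative least-squares solve.

```python
# Fully constrained least squares (FCLS) unmixing sketch: abundances are
# nonnegative and sum to one. The sum-to-one constraint is enforced softly by
# appending a heavily weighted row to an NNLS solve (a common implementation
# trick); this is not necessarily the exact solver used in the study.
import numpy as np
from scipy.optimize import nnls

def fcls(endmembers, pixel, delta=1e3):
    """endmembers: (n_bands, n_endmembers) spectral library (columns);
    pixel: (n_bands,) observed mixed spectrum. Returns abundance fractions."""
    n_bands, n_em = endmembers.shape
    A = np.vstack([endmembers, delta * np.ones((1, n_em))])
    b = np.concatenate([pixel, [delta]])
    abundances, _ = nnls(A, b)
    return abundances

# Toy example: a pixel that is 70% endmember 0 and 30% endmember 1
E = np.array([[0.1, 0.6], [0.3, 0.4], [0.5, 0.2], [0.7, 0.1]])
x = 0.7 * E[:, 0] + 0.3 * E[:, 1]
print(fcls(E, x))   # ~[0.7, 0.3]
```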

  4. A preliminary study of level 1A data processing of a low–low satellite to satellite tracking mission

    Directory of Open Access Journals (Sweden)

    Peng Xu

    2015-09-01

    Full Text Available With the Gravity Recovery and Climate Experiment (GRACE) mission as the prime example, an overview is given on the management and processing of Level 1A data of a low–low satellite to satellite tracking mission. To illustrate the underlying principle and algorithm, a detailed study is made on the K-band ranging (KBR) assembly, which includes the measurement principles, modeling of noises, the generation of Level 1A data from that of Level 0 as well as Level 1A to Level 1B data processing.

  5. Retrieval of land surface temperature (LST) from Landsat TM6 and TIRS data by single channel radiative transfer algorithm using satellite and ground-based inputs

    Science.gov (United States)

    Chatterjee, R. S.; Singh, Narendra; Thapa, Shailaja; Sharma, Dravneeta; Kumar, Dheeraj

    2017-06-01

    The present study proposes land surface temperature (LST) retrieval from satellite-based thermal IR data by single channel radiative transfer algorithm using atmospheric correction parameters derived from satellite-based and in-situ data and land surface emissivity (LSE) derived by a hybrid LSE model. For example, atmospheric transmittance (τ) was derived from Terra MODIS spectral radiance in atmospheric window and absorption bands, whereas the atmospheric path radiance and sky radiance were estimated using satellite- and ground-based in-situ solar radiation, geographic location and observation conditions. The hybrid LSE model which is coupled with ground-based emissivity measurements is more versatile than the previous LSE models and yields improved emissivity values by knowledge-based approach. It uses NDVI-based and NDVI Threshold method (NDVITHM) based algorithms and field-measured emissivity values. The model is applicable for dense vegetation cover, mixed vegetation cover, bare earth including coal mining related land surface classes. The study was conducted in a coalfield of India badly affected by coal fire for decades. In a coal fire affected coalfield, LST would provide precise temperature difference between thermally anomalous coal fire pixels and background pixels to facilitate coal fire detection and monitoring. The derived LST products of the present study were compared with radiant temperature images across some of the prominent coal fire locations in the study area by graphical means and by some standard mathematical dispersion coefficients such as coefficient of variation, coefficient of quartile deviation, coefficient of quartile deviation for 3rd quartile vs. maximum temperature, coefficient of mean deviation (about median) indicating significant increase in the temperature difference among the pixels. The average temperature slope between adjacent pixels, which increases the potential of coal fire pixel detection from background pixels, is
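
    A hedged sketch of the single-channel radiative transfer inversion that this kind of LST retrieval relies on: the at-sensor radiance is corrected for atmospheric transmittance, path (up-welling) radiance and sky (down-welling) radiance, and the result is inverted with the Planck function. The band wavelength, atmospheric terms, and emissivity below are illustrative placeholders, not values from the study.

```python
# Hedged sketch of single-channel LST retrieval; all inputs are example values.
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23      # Planck, speed of light, Boltzmann

def inverse_planck(L, lam):
    """Brightness temperature (K) from spectral radiance L (W m-2 sr-1 um-1) at lam (um)."""
    lam_m = lam * 1e-6
    return (H * C / (lam_m * KB)) / np.log(2 * H * C**2 / (lam_m**5 * L * 1e6) + 1)

def lst_single_channel(L_sensor, tau, L_up, L_down, emissivity, lam=11.45):
    """Invert L_sensor = tau*[eps*B(Ts) + (1-eps)*L_down] + L_up for Ts."""
    B_surface = (L_sensor - L_up - tau * (1 - emissivity) * L_down) / (tau * emissivity)
    return inverse_planck(B_surface, lam)

# Plausible TIR-band numbers (assumptions, not measurements from the study)
print(round(lst_single_channel(L_sensor=9.5, tau=0.8, L_up=1.2,
                               L_down=2.0, emissivity=0.97), 1), "K")
```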

  6. A Testbed to Evaluate the FIWARE-Based IoT Platform in the Domain of Precision Agriculture

    Science.gov (United States)

    Martínez, Ramón; Pastor, Juan Ángel; Álvarez, Bárbara; Iborra, Andrés

    2016-01-01

    Wireless sensor networks (WSNs) represent one of the most promising technologies for precision farming. Over the next few years, a significant increase in the use of such systems on commercial farms is expected. WSNs present a number of problems regarding scalability, interoperability, communications, connectivity with databases, and data processing. Different Internet of Things middleware platforms are appearing to overcome these challenges. This paper checks whether one of these platforms, FIWARE, is suitable for the development of agricultural applications. To the authors’ knowledge, there are no works that show how to use FIWARE in precision agriculture and study its appropriateness, its scalability and its efficiency for this kind of application. To do this, a testbed has been designed and implemented to simulate different deployments and load conditions. The testbed is a typical FIWARE application, complete, yet simple and comprehensible enough to show the main features and components of FIWARE, as well as the complexity of using this technology. Although the testbed has been deployed in a laboratory environment, its design is based on the analysis of an Internet of Things use case scenario in the domain of precision agriculture. PMID:27886091

  7. Development and experimentation of an eye/brain/task testbed

    Science.gov (United States)

    Harrington, Nora; Villarreal, James

    1987-01-01

    The principal objective is to develop a laboratory testbed that will provide a unique capability to elicit, control, record, and analyze the relationship of operator task loading, operator eye movement, and operator brain wave data in a computer system environment. The ramifications of an integrated eye/brain monitor to the man machine interface are staggering. The success of such a system would benefit users of space and defense, paraplegics, and the monitoring of boring screens (nuclear power plants, air defense, etc.)

  8. A Modified Spatiotemporal Fusion Algorithm Using Phenological Information for Predicting Reflectance of Paddy Rice in Southern China

    Directory of Open Access Journals (Sweden)

    Mengxue Liu

    2018-05-01

    Full Text Available Satellite data for studying surface dynamics in heterogeneous landscapes are missing due to frequent cloud contamination, low temporal resolution, and technological difficulties in developing satellites. A modified spatiotemporal fusion algorithm for predicting the reflectance of paddy rice is presented in this paper. The algorithm uses phenological information extracted from a moderate-resolution imaging spectroradiometer enhanced vegetation index time series to improve the enhanced spatial and temporal adaptive reflectance fusion model (ESTARFM). The algorithm is tested with satellite data on Yueyang City, China. The main contribution of the modified algorithm is the selection of similar neighborhood pixels by using phenological information to improve accuracy. Results show that the modified algorithm performs better than ESTARFM in visual inspection and quantitative metrics, especially for paddy rice. This modified algorithm provides not only new ideas for the improvement of spatiotemporal data fusion method, but also technical support for the generation of remote sensing data with high spatial and temporal resolution.
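
    A minimal sketch of the idea emphasised above, assuming a small moving window of fine-resolution reflectance and a coarse EVI time series: neighbour pixels are accepted as "similar" only if their EVI trajectory correlates with that of the centre pixel, and the prediction is a distance-weighted mean of those neighbours. The threshold, array shapes, and weighting rule are assumptions, not the published modification of ESTARFM.

```python
# Hedged sketch: phenology-guided selection of similar neighbour pixels.
import numpy as np

rng = np.random.default_rng(1)
win = 7                                     # moving-window size (pixels), assumed
evi_series = rng.random((win, win, 12))     # 12-date EVI time series per pixel (placeholder)
fine_t0 = rng.random((win, win))            # fine-resolution reflectance at base date
centre = win // 2

def phenology_similar(evi_series, centre, min_corr=0.8):
    """Mask of pixels whose EVI trajectory correlates with the centre pixel's."""
    ref = evi_series[centre, centre]
    corr = np.array([[np.corrcoef(evi_series[i, j], ref)[0, 1]
                      for j in range(win)] for i in range(win)])
    return corr >= min_corr

mask = phenology_similar(evi_series, centre)
ii, jj = np.nonzero(mask)
dist = np.hypot(ii - centre, jj - centre) + 1.0        # avoid zero distance
weights = (1.0 / dist) / (1.0 / dist).sum()
prediction = (weights * fine_t0[ii, jj]).sum()
print("predicted reflectance:", round(float(prediction), 3))
```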

  9. Optimizing the Attitude Control of Small Satellite Constellations for Rapid Response Imaging

    Science.gov (United States)

    Nag, S.; Li, A.

    2016-12-01

    Distributed Space Missions (DSMs), such as formation flight and constellations, are being recognized as important solutions to increase measurement samples over space and time. Given the increasingly accurate attitude control systems emerging in the commercial market, small spacecraft now have the ability to slew and point within a few minutes of notice. In spite of hardware development in CubeSats at the payload (e.g. NASA InVEST) and subsystems (e.g. Blue Canyon Technologies), software development for tradespace analysis in constellation design (e.g. Goddard's TAT-C), planning and scheduling development in single spacecraft (e.g. GEO-CAPE) and aerial flight path optimizations for UAVs (e.g. NASA Sensor Web), there is a gap in open-source, open-access software tools for planning and scheduling distributed satellite operations in terms of pointing and observing targets. This paper will demonstrate results from a tool being developed for scheduling pointing operations of narrow field-of-view (FOV) sensors over mission lifetime to maximize metrics such as global coverage and revisit statistics. Past research has shown the need for at least fourteen satellites to cover the Earth globally every day using a LandSat-like sensor. Increasing the FOV three times reduces the need to four satellites, but adds image distortion and BRDF complexities to the observed reflectance. If narrow FOV sensors on a small satellite constellation were commanded using robust algorithms to slew their sensor dynamically, they would be able to cover the global landmass in a coordinated way much faster without compromising spatial resolution or introducing BRDF effects. Our algorithm to optimize constellation satellite pointing is based on a dynamic programming approach under the constraints of orbital mechanics and existing attitude control systems for small satellites. As a case study for our algorithm, we minimize the time required to cover the 17000 Landsat images with maximum signal to noise ratio fall

  10. Monthly-Diurnal Water Budget Variability Over Gulf of Mexico-Caribbean Sea Basin from Satellite Observations

    Science.gov (United States)

    Smith, E. A.; Santos, P.

    2006-01-01

    This study presents results from a multi-satellite/multi-sensor retrieval system designed to obtain the atmospheric water budget over the open ocean. A combination of hourly-sampled monthly datasets derived from the GOES-8 5-channel Imager, the TRMM TMI radiometer, and the DMSP 7-channel passive microwave radiometers (SSM/I) have been acquired for the combined Gulf of Mexico-Caribbean Sea basin. Whereas the methodology has been tested over this basin, the retrieval system is designed for portability to any open-ocean region. Algorithm modules using the different datasets to retrieve individual geophysical parameters needed in the water budget equation are designed in a manner that takes advantage of the high temporal resolution of the GOES-8 measurements, as well as the physical relationships inherent to the TRMM and SSM/I passive microwave measurements in conjunction with water vapor, cloud liquid water, and rainfall. The methodology consists of retrieving the precipitation, surface evaporation, and vapor-cloud water storage terms in the atmospheric water balance equation from satellite techniques, with the water vapor advection term being obtained as the residue needed for balance. Thus, the intent is to develop a purely satellite-based method for obtaining the full set of terms in the atmospheric water budget equation without requiring in situ sounding information on the wind profile. The algorithm is validated by cross-checking all the algorithm components through multiple-algorithm retrieval intercomparisons. A further check on the validation is obtained by directly comparing water vapor transports into the targeted basin diagnosed from the satellite algorithms to those obtained observationally from a network of land-based upper air stations that nearly uniformly surround the basin, although it is fair to say that these checks are more effective in identifying problems in estimating vapor transports from a "leaky" operational radiosonde network than in
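
    A hedged sketch of the budget closure described above: with precipitation, evaporation, and the vapor/cloud-water storage change retrieved from satellite data, the advection (moisture-flux convergence) term is taken as the residual that closes the column water balance dW/dt = E - P + ADV. The sign convention and monthly numbers are assumptions for illustration.

```python
# Hedged sketch: vapour advection as the residual of the atmospheric water balance.
monthly_mm = {"precip": 120.0,            # satellite-retrieved precipitation (mm/month)
              "evap": 95.0,               # satellite-retrieved surface evaporation
              "storage_change": -5.0}     # vapour + cloud-water storage change dW/dt

# From dW/dt = E - P + ADV  =>  ADV = dW/dt + P - E  (positive = net moisture import)
advection = (monthly_mm["storage_change"]
             + monthly_mm["precip"]
             - monthly_mm["evap"])
print(f"residual vapour advection: {advection:+.1f} mm/month")
```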

  11. The CMS Integration Grid Testbed

    CERN Document Server

    Graham, G E; Aziz, Shafqat; Bauerdick, L.A.T.; Ernst, Michael; Kaiser, Joseph; Ratnikova, Natalia; Wenzel, Hans; Wu, Yu-jun; Aslakson, Erik; Bunn, Julian; Iqbal, Saima; Legrand, Iosif; Newman, Harvey; Singh, Suresh; Steenberg, Conrad; Branson, James; Fisk, Ian; Letts, James; Arbree, Adam; Avery, Paul; Bourilkov, Dimitri; Cavanaugh, Richard; Rodriguez, Jorge Luis; Kategari, Suchindra; Couvares, Peter; DeSmet, Alan; Livny, Miron; Roy, Alain; Tannenbaum, Todd; Graham, Gregory E.; Aziz, Shafqat; Ernst, Michael; Kaiser, Joseph; Ratnikova, Natalia; Wenzel, Hans; Wu, Yujun; Aslakson, Erik; Bunn, Julian; Iqbal, Saima; Legrand, Iosif; Newman, Harvey; Singh, Suresh; Steenberg, Conrad; Branson, James; Fisk, Ian; Letts, James; Arbree, Adam; Avery, Paul; Bourilkov, Dimitri; Cavanaugh, Richard; Rodriguez, Jorge; Kategari, Suchindra; Couvares, Peter; Smet, Alan De; Livny, Miron; Roy, Alain; Tannenbaum, Todd

    2003-01-01

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent based Mona Lisa. Domain specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two month span in Fall of 2002, over 1 million official CMS GEANT based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. ...

  12. Satellite Imagery Assisted Road-Based Visual Navigation System

    Science.gov (United States)

    Volkova, A.; Gibbens, P. W.

    2016-06-01

    There is a growing demand for unmanned aerial systems as autonomous surveillance, exploration and remote sensing solutions. Among the key concerns for robust operation of these systems is the need to reliably navigate the environment without reliance on a global navigation satellite system (GNSS). This is of particular concern in Defence circles, but is also a major safety issue for commercial operations. In these circumstances, the aircraft needs to navigate relying only on information from on-board passive sensors such as digital cameras. An autonomous feature-based visual system presented in this work offers a novel integral approach to the modelling and registration of visual features that responds to the specific needs of the navigation system. It detects visual features from Google Earth* to build a feature database. The same algorithm then detects features in the on-board camera's video stream. On one level this serves to localise the vehicle relative to the environment using Simultaneous Localisation and Mapping (SLAM). On a second level it correlates them with the database to localise the vehicle with respect to the inertial frame. The performance of the presented visual navigation system was compared using satellite imagery from different years. Based on comparison results, an analysis of the effects of seasonal, structural and qualitative changes of the imagery source on the performance of the navigation algorithm is presented. * The algorithm is independent of the source of satellite imagery and another provider can be used.

  13. Cooperating expert systems for Space Station - Power/thermal subsystem testbeds

    Science.gov (United States)

    Wong, Carla M.; Weeks, David J.; Sundberg, Gale R.; Healey, Kathleen L.; Dominick, Jeffrey S.

    1988-01-01

    The Systems Autonomy Demonstration Project (SADP) is a NASA-sponsored series of increasingly complex demonstrations to show the benefits of integrating knowledge-based systems with conventional process control in real-time, real-world problem domains that can facilitate the operations and availability of major Space Station distributed systems. This paper describes the system design, objectives, approaches, and status of each of the testbed knowledge-based systems. Simplified schematics of the systems are shown.

  14. High accuracy satellite drag model (HASDM)

    Science.gov (United States)

    Storz, Mark F.; Bowman, Bruce R.; Branson, Major James I.; Casali, Stephen J.; Tobiska, W. Kent

    The dominant error source in force models used to predict low-perigee satellite trajectories is atmospheric drag. Errors in operational thermospheric density models cause significant errors in predicted satellite positions, since these models do not account for dynamic changes in atmospheric drag for orbit predictions. The Air Force Space Battlelab's High Accuracy Satellite Drag Model (HASDM) estimates and predicts (out to three days) a dynamically varying global density field. HASDM includes the Dynamic Calibration Atmosphere (DCA) algorithm that solves for the phases and amplitudes of the diurnal and semidiurnal variations of thermospheric density in near real-time from the observed drag effects on a set of Low Earth Orbit (LEO) calibration satellites. The density correction is expressed as a function of latitude, local solar time and altitude. In HASDM, a time series prediction filter relates the extreme ultraviolet (EUV) energy index E10.7 and the geomagnetic storm index ap to the DCA density correction parameters. The E10.7 index is generated by the SOLAR2000 model, the first full spectrum model of solar irradiance. The estimated and predicted density fields will be used operationally to significantly improve the accuracy of predicted trajectories for all low-perigee satellites.

  15. GPS Modeling and Analysis. Summary of Research: GPS Satellite Axial Ratio Predictions

    Science.gov (United States)

    Axelrad, Penina; Reeh, Lisa

    2002-01-01

    This report outlines the algorithms developed at the Colorado Center for Astrodynamics Research to model yaw and predict the axial ratio as measured from a ground station. The algorithms are implemented in a collection of Matlab functions and scripts that read certain user input, such as ground station coordinates, the UTC time, and the desired GPS (Global Positioning System) satellites, and compute the above-mentioned parameters. The position information for the GPS satellites is obtained from Yuma almanac files corresponding to the prescribed date. The results are displayed graphically through time histories and azimuth-elevation plots.

  16. The use of a MODIS band-ratio algorithm versus a new hybrid approach for estimating colored dissolved organic matter (CDOM)

    Science.gov (United States)

    Satellite remote sensing offers synoptic and frequent monitoring of optical water quality parameters, such as chlorophyll-a, turbidity, and colored dissolved organic matter (CDOM). While traditional satellite algorithms were developed for the open ocean, these algorithms often do...

  17. Establishment of a sensor testbed at NIST for plant productivity monitoring

    Science.gov (United States)

    Allen, D. W.; Hutyra, L.; Reinmann, A.; Trlica, A.; Marrs, J.; Jones, T.; Whetstone, J. R.; Logan, B.; Reblin, J.

    2017-12-01

    Accurate assessment of biogenic carbon fluxes is challenging. Correlating optical signatures to plant activity allows large regions to be monitored. New methods, including solar-induced fluorescence (SIF), promise to provide more timely and accurate estimates of plant activity, but we are still developing a full understanding of the mechanistic linkage between plant assimilation of carbon and SIF. We have initiated a testbed at the NIST headquarters to facilitate the evaluation of sensors and methods for remote monitoring of plant activity. The testbed utilizes a forested area of mature trees in a mixed urban environment. A 1 hectare plot within the 26 hectare forest has been instrumented for ecophysiological measurements, with an edge (100 m long) that is persistently monitored with multimodal optical sensors (SIF spectrometers, hyperspectral imagers, thermal infrared imaging, and lidar). This biological testbed has the advantage of direct access to the national measurement scales maintained by NIST for both the physical and optical quantities of interest. We offer a description of the test site, the sensors, and preliminary results from the first season of observations for ecological, physiological, and remote-sensing-based estimates of ecosystem productivity.

  18. The Objectives of NASA's Living with a Star Space Environment Testbed

    Science.gov (United States)

    Barth, Janet L.; LaBel, Kenneth A.; Brewer, Dana; Kauffman, Billy; Howard, Regan; Griffin, Geoff; Day, John H. (Technical Monitor)

    2001-01-01

    NASA is planning to fly a series of Space Environment Testbeds (SET) as part of the Living With A Star (LWS) Program. The goal of the testbeds is to improve and develop capabilities to mitigate and/or accommodate the effects of solar variability in spacecraft and avionics design and operation. This will be accomplished by performing technology validation in space to enable routine operations, characterize technology performance in space, and improve and develop models, guidelines and databases. The anticipated result of the LWS/SET program is improved spacecraft performance, design, and operation for survival of the radiation, spacecraft charging, meteoroid, orbital debris and thermosphere/ionosphere environments. The program calls for a series of NASA Research Announcements (NRAs) to be issued to solicit flight validation experiments, improvement in environment effects models and guidelines, and collateral environment measurements. The selected flight experiments may fly on the SET experiment carriers and flights of opportunity on other commercial and technology missions. This paper presents the status of the project so far, including a description of the types of experiments that are intended to fly on SET-1 and a description of the SET-1 carrier parameters.

  19. Geostationary Sensor Based Forest Fire Detection and Monitoring: An Improved Version of the SFIDE Algorithm

    Directory of Open Access Journals (Sweden)

    Valeria Di Biase

    2018-05-01

    Full Text Available The paper aims to present the results obtained in the development of a system allowing for the detection and monitoring of forest fires and the continuous comparison of their intensity when several events occur simultaneously, a common occurrence in European Mediterranean countries during the summer season. The system, called SFIDE (Satellite FIre DEtection), exploits a geostationary satellite sensor (SEVIRI, Spinning Enhanced Visible and InfraRed Imager), on board the MSG (Meteosat Second Generation) satellite series. The algorithm was developed several years ago in the framework of a project (SIGRI) funded by the Italian Space Agency (ASI). This algorithm has been completely reviewed in order to enhance its efficiency by reducing the false alarm rate while preserving a high sensitivity. Due to the very low spatial resolution of SEVIRI images (4 × 4 km2 at Mediterranean latitudes), the sensitivity of the algorithm must be very high to detect even small fires. The improvement of the algorithm has been obtained by introducing the sun elevation angle in the computation of the preliminary thresholds used to identify potential thermal anomalies (hot spots), and by introducing a contextual analysis in the detection of clouds and of night-time fires. The results of the algorithm have been validated in the Sardinia region by using ground truth data provided by the regional Corpo Forestale e di Vigilanza Ambientale (CFVA). A significant reduction of the commission error (less than 10%) has been obtained with respect to the previous version of the algorithm and also with respect to fire-detection algorithms based on low Earth orbit satellites.
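
    A hedged sketch of a contextual hot-spot test of the kind the improved algorithm relies on: a preliminary brightness-temperature threshold in the mid-infrared channel is adjusted with the sun elevation angle, and a candidate pixel is confirmed only if it stands out from its local background window. The coefficients and the MIR-TIR difference test are illustrative choices, not the SFIDE values.

```python
# Hedged sketch of a sun-elevation-aware, contextual hot-spot test.
import numpy as np

def preliminary_threshold(sun_elev_deg, t_night=310.0, gain=0.3):
    """MIR (about 3.9 um) BT threshold in K, raised as the sun gets higher (assumed form)."""
    return t_night + gain * max(0.0, sun_elev_deg)

def is_fire_pixel(bt_mir, bt_tir, window_mir, sun_elev_deg,
                  k_sigma=3.0, min_diff=8.0):
    if bt_mir < preliminary_threshold(sun_elev_deg):
        return False                                     # fails the absolute test
    background = window_mir[np.isfinite(window_mir)]
    contextual = bt_mir > background.mean() + k_sigma * background.std()
    return contextual and (bt_mir - bt_tir) > min_diff   # MIR-TIR difference test

window = 300.0 + np.random.default_rng(2).normal(0, 1.5, (5, 5))
print(is_fire_pixel(bt_mir=335.0, bt_tir=305.0,
                    window_mir=window, sun_elev_deg=40.0))
```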

  20. Intercomparison of Ocean Color Algorithms for Picophytoplankton Carbon in the Ocean

    Directory of Open Access Journals (Sweden)

    Víctor Martínez-Vicente

    2017-12-01

    Full Text Available The differences among phytoplankton carbon (Cphy) predictions from six ocean color algorithms are investigated by comparison with in situ estimates of phytoplankton carbon. The common satellite data used as input for the algorithms is the Ocean Color Climate Change Initiative merged product. The matching in situ data are derived from flow cytometric cell counts and per-cell carbon estimates for different types of pico-phytoplankton. This combination of satellite and in situ data provides a relatively large matching dataset (N > 500), which is independent from most of the algorithms tested and spans almost two orders of magnitude in Cphy. Results show that no single algorithm outperforms the others when using all matching data. Concentrating on the oligotrophic regions (Chlorophyll-a concentration, B, less than 0.15 mg Chl m−3), where flow cytometric analysis captures most of the phytoplankton biomass, reveals significant differences in algorithm performance. The bias ranges from −35 to +150% and the unbiased root mean squared difference from 5 to 10 mg C m−3 among algorithms, with chlorophyll-based algorithms performing better than the rest. The backscattering-based algorithms produce different results in the clearest waters, and these differences are discussed in terms of the different algorithms used for optical particle backscattering coefficient (bbp) retrieval.
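
    A minimal sketch of the two comparison statistics quoted above, the percent bias and the unbiased root-mean-square difference between algorithm estimates and in situ phytoplankton carbon. These are the common definitions; the study may compute them in log space or per algorithm, which is not reproduced here.

```python
# Hedged sketch of bias (%) and unbiased RMSD between model and observations.
import numpy as np

def bias_percent(model, obs):
    return 100.0 * (model - obs).mean() / obs.mean()

def unbiased_rmsd(model, obs):
    m, o = model - model.mean(), obs - obs.mean()
    return np.sqrt(((m - o) ** 2).mean())

rng = np.random.default_rng(3)
obs = rng.uniform(5, 30, 200)                  # in situ C_phy (mg C m-3), placeholder
model = 1.1 * obs + rng.normal(0, 4, 200)      # one algorithm's estimates, placeholder
print(f"bias = {bias_percent(model, obs):+.1f} %, "
      f"uRMSD = {unbiased_rmsd(model, obs):.1f} mg C m-3")
```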

  1. A Prefiltered Cuckoo Search Algorithm with Geometric Operators for Solving Sudoku Problems

    Directory of Open Access Journals (Sweden)

    Ricardo Soto

    2014-01-01

    Full Text Available Sudoku is a famous logic-placement game, originally popularized in Japan and today widely employed as a pastime and as a testbed for search algorithms. The classic Sudoku consists of filling a 9×9 grid, divided into nine 3×3 regions, so that each column, row, and region contains the digits from 1 to 9 exactly once. This game is known to be NP-complete, and various complete and incomplete search algorithms exist that can solve different instances of it. In this paper, we present a new cuckoo search algorithm for solving Sudoku puzzles that combines prefiltering phases and geometric operations. The geometric operators allow one to move correctly toward promising regions of the combinatorial space, while the prefiltering phases delete from the domains, in advance, the values that do not lead to any feasible solution. This integration leads to a more efficient domain filtering and, as a consequence, to a faster solving process. We illustrate encouraging experimental results where our approach noticeably competes with the best approximate methods reported in the literature.

  2. Connecting Satellite-Based Precipitation Estimates to Users

    Science.gov (United States)

    Huffman, George J.; Bolvin, David T.; Nelkin, Eric

    2018-01-01

    Beginning in 1997, the Merged Precipitation Group at NASA Goddard has distributed gridded global precipitation products built by combining satellite and surface gauge data. This started with the Global Precipitation Climatology Project (GPCP), then the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA), and recently the Integrated Multi-satellitE Retrievals for the Global Precipitation Measurement (GPM) mission (IMERG). This 20+-year (and on-going) activity has yielded an important set of insights and lessons learned for making state-of-the-art precipitation data accessible to the diverse communities of users. Merged-data products critically depend on the input sensors and the retrieval algorithms providing accurate, reliable estimates, but it is also important to provide ancillary information that helps users determine suitability for their application. We typically provide fields of estimated random error, and recently reintroduced the quality index concept at user request. Also at user request we have added a (diagnostic) field of estimated precipitation phase. Over time, increasingly more ancillary fields have been introduced for intermediate products that give expert users insight into the detailed performance of the combination algorithm, such as individual merged microwave and microwave-calibrated infrared estimates, the contributing microwave sensor types, and the relative influence of the infrared estimate.

  3. Assessment of Machine Learning Algorithms for Automatic Benthic Cover Monitoring and Mapping Using Towed Underwater Video Camera and High-Resolution Satellite Images

    Directory of Open Access Journals (Sweden)

    Hassan Mohamed

    2018-05-01

    Full Text Available Benthic habitat monitoring is essential for many applications involving biodiversity, marine resource management, and the estimation of variations over temporal and spatial scales. Nevertheless, both automatic and semi-automatic analytical methods for deriving ecologically significant information from towed camera images are still limited. This study proposes a methodology that enables a high-resolution towed camera with a Global Navigation Satellite System (GNSS) to adaptively monitor and map benthic habitats. First, the towed camera finishes a pre-programmed initial survey to collect benthic habitat videos, which can then be converted to geo-located benthic habitat images. Second, an expert labels a number of benthic habitat images to classify habitats manually. Third, attributes for categorizing these images are extracted automatically using the Bag of Features (BOF) algorithm. Fourth, benthic cover categories are detected automatically using Weighted Majority Voting (WMV) ensembles of Support Vector Machine (SVM), K-Nearest Neighbor (K-NN), and Bagging (BAG) classifiers. Fifth, WMV-trained ensembles can be used for categorizing more benthic cover images automatically. Finally, correctly categorized geo-located images can provide ground truth samples for benthic cover mapping using high-resolution satellite imagery. The proposed methodology was tested over Shiraho, Ishigaki Island, Japan, a heterogeneous coastal area. The WMV ensemble exhibited 89% overall accuracy for categorizing corals, sediments, seagrass, and algae species. Furthermore, the same WMV ensemble produced a benthic cover map using a Quickbird satellite image with 92.7% overall accuracy.
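
    A hedged sketch of the weighted-majority-voting ensemble step, using scikit-learn stand-ins for the SVM, K-NN, and bagging members. In the study the features come from the Bag of Features extraction; random arrays stand in for those descriptors here, and the voting weights are placeholders rather than tuned values.

```python
# Hedged sketch of a weighted majority-voting ensemble for benthic cover classes.
import numpy as np
from sklearn.ensemble import BaggingClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
X = rng.random((400, 64))                  # placeholder BOF descriptors
y = rng.integers(0, 4, 400)                # coral / sediment / seagrass / algae labels
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[("svm", SVC(probability=True)),
                ("knn", KNeighborsClassifier(n_neighbors=5)),
                ("bag", BaggingClassifier(n_estimators=25, random_state=0))],
    voting="soft",
    weights=[2, 1, 1],                     # placeholder weights; tuned per member in practice
)
ensemble.fit(X_tr, y_tr)
print("held-out accuracy:", round(ensemble.score(X_te, y_te), 3))
```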

  4. Development of Research Reactor Simulator and Its Application to Dynamic Test-bed

    International Nuclear Information System (INIS)

    Kwon, Kee Choon; Park, Jae Chang; Lee, Seung Wook; Bang, Dane; Bae, Sung Won

    2014-01-01

    We developed HANARO and the Jordan Research and Training Reactor (JRTR) real-time simulators for operating staff training. The main purpose of this simulator is operator training, but we modified this simulator as a dynamic test-bed to test the reactor regulating system in HANARO or JRTR before installation. The simulator configuration is divided into hardware and software. The simulator hardware consists of a host computer, 6 operator stations, a network switch, and a large display panel. The simulator software is divided into three major parts: a mathematical modeling module, which executes the plant dynamic modeling program in real-time, an instructor station module that manages user instructions, and a human machine interface (HMI) module. The developed research reactor simulators are installed in the Korea Atomic Energy Research Institute nuclear training center for reactor operator training. To use the simulator as a dynamic test-bed, the reactor regulating system modeling software of the simulator was replaced by a hardware controller and the simulator and target controller were interfaced with a hard-wired and network-based interface

  5. Adaptive Coding and Modulation Experiment With NASA's Space Communication and Navigation Testbed

    Science.gov (United States)

    Downey, Joseph; Mortensen, Dale; Evans, Michael; Briones, Janette; Tollis, Nicholas

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation Testbed is an advanced integrated communication payload on the International Space Station. This paper presents results from an adaptive coding and modulation (ACM) experiment over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options, and uses the Space Data Link Protocol (Consultative Committee for Space Data Systems (CCSDS) standard) for the uplink and downlink data framing. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Several approaches for improving the ACM system are presented, including predictive and learning techniques to accommodate signal fades. Performance of the system is evaluated as a function of end-to-end system latency (round-trip delay), and compared to the capacity of the link. Finally, improvements over standard NASA waveforms are presented.

  6. Development of Research Reactor Simulator and Its Application to Dynamic Test-bed

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Park, Jae Chang; Lee, Seung Wook; Bang, Dane; Bae, Sung Won [KAERI, Daejeon (Korea, Republic of)

    2014-08-15

    We developed HANARO and the Jordan Research and Training Reactor (JRTR) real-time simulators for operating staff training. The main purpose of this simulator is operator training, but we modified this simulator as a dynamic test-bed to test the reactor regulating system in HANARO or JRTR before installation. The simulator configuration is divided into hardware and software. The simulator hardware consists of a host computer, 6 operator stations, a network switch, and a large display panel. The simulator software is divided into three major parts: a mathematical modeling module, which executes the plant dynamic modeling program in real-time, an instructor station module that manages user instructions, and a human machine interface (HMI) module. The developed research reactor simulators are installed in the Korea Atomic Energy Research Institute nuclear training center for reactor operator training. To use the simulator as a dynamic test-bed, the reactor regulating system modeling software of the simulator was replaced by a hardware controller and the simulator and target controller were interfaced with a hard-wired and network-based interface.

  7. Static and dynamic optimization of CAPE problems using a Model Testbed

    DEFF Research Database (Denmark)

    This paper presents a new computer aided tool for setting up and solving CAPE related static and dynamic optimisation problems. The Model Testbed (MOT) offers an integrated environment for setting up and solving a very large range of CAPE problems, including complex optimisation problems...... and dynamic optimisation, and how interfacing of solvers and seamless information flow can lead to more efficient solution of process design problems....

  8. Online Access to Weather Satellite Imagery Through the World Wide Web

    Science.gov (United States)

    Emery, W.; Baldwin, D.

    1998-01-01

    Both global area coverage (GAC) and high-resolution picture transmission (HRPT) data from the Advanced Very High Resolution Radiometer (AVHRR) are made available to Internet users through an online data access system. Older GOES-7 data are also available. Created as a "testbed" data system for NASA's future Earth Observing System Data and Information System (EOSDIS), this testbed provides an opportunity to test both the technical requirements of an online data system and the different ways in which the general user community would employ such a system. Initiated in December 1991, the basic data system experienced five major evolutionary changes in response to user requests and requirements. Features added with these changes were online browse, user subsetting, dynamic image processing/navigation, a stand-alone data storage system, and a move from an X-Windows graphical user interface (GUI) to a World Wide Web (WWW) interface. Over its lifetime, the system has had as many as 2500 registered users. The system on the WWW has had over 2500 hits since October 1995. Many of these hits are by casual users who only take the GIF images directly from the interface screens and do not specifically order digital data. Still, there is a consistent stream of users ordering the navigated image data and related products (maps and so forth). We have recently added a real-time, seven-day, northwestern United States normalized difference vegetation index (NDVI) composite that has generated considerable interest. Index Terms: data system, earth science, online access, satellite data.

  9. Programming a real-time operating system for satellite control applications

    International Nuclear Information System (INIS)

    Omer, M.; Anjum, O.; Suddle, M.R.

    2004-01-01

    With the realization of ideas like formation flights and multi-body space vehicles, the demands on an attitude control system have become increasingly complex. Even in its most simplified form, the control system for a typical geostationary satellite has to run various supervisory functions along with determination and control algorithms side by side. Within each algorithm it has to employ multiple actuation and sensing mechanisms and service real-time interrupts, for example, in the case of actuator saturation and sensor data fusion. This entails the idea of thread scheduling and program synchronization, tasks specifically meant for a real-time OS. This paper explores the embedding of the attitude determination and control loop within the framework of a real-time operating system provided for TI's DSP C6xxx series. The paper details the functionality provided by the scalable real-time kernel and the analysis and configuration tools available. It goes on to describe a layered implementation stack associated with a typical control system for geostationary satellites. An application for control is then presented in which state-of-the-art analysis tools are employed to view program threads, synchronization semaphores, hardware interrupts and data exchange pipes operating in real time. (author)

  10. Optimization of communication network topology for navigation sharing among distributed satellites

    Science.gov (United States)

    Dang, Zhaohui; Zhang, Yulin

    2013-01-01

    Navigation sharing among distributed satellites is quite important for coordinated motion and collision avoidance. This paper proposes optimization methods for the communication network topology to achieve navigation sharing. The whole communication network constructed by inter-satellite links is considered as a topology graph. The aim of this paper is to find the communication network topology with the minimum communication connections' number (MCCN) under different conditions. It is found that the communication capacity and the number of channels are two key parameters affecting the results. The model of the MCCN topology for navigation sharing is established and a corresponding method is designed. Two main scenarios, viz., the homogeneous case and the heterogeneous case, are considered. For the homogeneous case, where each member has the same communication capacity, a construction method (Algorithm 1) is designed to find the MCCN topology. For the heterogeneous case, a modified genetic algorithm (Algorithm 2) is introduced to find the MCCN topology. When the number of channels is limited, Algorithm 2 is further modified by adding a penalty term to the fitness function. The effectiveness of these algorithms is proved theoretically. Three examples are further tested to illustrate the methods developed in this paper.
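
    A hedged sketch of the homogeneous case: with n satellites and a fixed per-satellite channel limit, the fewest inter-satellite links that keep the network connected form a degree-constrained spanning tree with n - 1 links. The greedy construction below illustrates that idea only; it is not the paper's Algorithm 1 or its genetic algorithm, and it assumes the channel limit is at least 2.

```python
# Hedged sketch: greedy degree-constrained spanning tree over n satellites.
def degree_constrained_tree(n, max_channels):
    links, degree = [], [0] * n
    in_tree = [0]                      # satellites already connected (start from sat 0)
    for sat in range(1, n):
        # attach to any in-tree satellite that still has a free channel
        parent = next(p for p in in_tree if degree[p] < max_channels)
        links.append((parent, sat))
        degree[parent] += 1
        degree[sat] += 1
        in_tree.append(sat)
    return links

# 7 satellites, at most 3 inter-satellite channels each -> 6 links
print(degree_constrained_tree(n=7, max_channels=3))
```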

  11. First light of an external occulter testbed at flight Fresnel numbers

    Science.gov (United States)

    Kim, Yunjong; Sirbu, Dan; Hu, Mia; Kasdin, Jeremy; Vanderbei, Robert J.; Harness, Anthony; Shaklan, Stuart

    2017-01-01

    Many approaches have been suggested over the last couple of decades for imaging Earth-like planets. One of the main candidates for creating the high contrast needed for future Earth-like planet detection is an external occulter. The external occulter is a spacecraft flown along the line-of-sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. The occulter is typically tens of meters in diameter and the separation from the telescope is of the order of tens of thousands of kilometers. Optical testing of a full-scale external occulter on the ground is impossible because of the long separations. Therefore, laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we have designed and built a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. The goal of this experiment is to demonstrate a pupil plane suppression of better than 1e-9 with a corresponding image plane contrast of better than 1e-11. The occulter testbed uses a 77.2 m optical propagation distance to realize the flight Fresnel number of 14.5. The scaled mask is placed at 27.2 m from the artificial source and the camera is located 50.0 m from the scaled mask. We will use an etched silicon mask, manufactured by the Microdevices Lab (MDL) of the Jet Propulsion Laboratory (JPL), as the occulter. Based on conversations with MDL, we expect that 0.5 μm feature size is an achievable resolution in the mask manufacturing process and is therefore likely the indicator of the best possible performance. The occulter is illuminated by a diverging laser beam to reduce the aberrations from the optics before the occulter. Here, we present first light results of a sample design operating at a flight Fresnel number and describe the experimental setup of the testbed. We compare the experimental results with simulations.

  12. Prototype Implementation of Two Efficient Low-Complexity Digital Predistortion Algorithms

    Directory of Open Access Journals (Sweden)

    Timo I. Laakso

    2008-01-01

    Full Text Available Predistortion (PD) lineariser for microwave power amplifiers (PAs) is an important topic of research. With larger and larger bandwidth as it appears today in modern WiMax standards as well as in multichannel base stations for 3GPP standards, the relatively simple nonlinear effect of a PA becomes a complex memory-including function, severely distorting the output signal. In this contribution, two digital PD algorithms are investigated for the linearisation of microwave PAs in mobile communications. The first one is an efficient and low-complexity algorithm based on a memoryless model, called the simplicial canonical piecewise linear (SCPWL) function that describes the static nonlinear characteristic of the PA. The second algorithm is more general, approximating the pre-inverse filter of a nonlinear PA iteratively using a Volterra model. The first simpler algorithm is suitable for compensation of amplitude compression and amplitude-to-phase conversion, for example, in mobile units with relatively small bandwidths. The second algorithm can be used to linearise PAs operating with larger bandwidths, thus exhibiting memory effects, for example, in multichannel base stations. A measurement testbed which includes a transmitter-receiver chain with a microwave PA is built for testing and prototyping of the proposed PD algorithms. In the testing phase, the PD algorithms are implemented using MATLAB (floating-point representation) and tested in record-and-playback mode. The iterative PD algorithm is then implemented on a Field Programmable Gate Array (FPGA) using fixed-point representation. The FPGA implementation allows the pre-inverse filter to be tested in a real-time mode. Measurement results show excellent linearisation capabilities of both the proposed algorithms in terms of adjacent channel power suppression. It is also shown that the fixed-point FPGA implementation of the iterative algorithm performs as well as the floating-point implementation.

  13. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — evaluation report for ATDM program. [supporting datasets - Pasadena Testbed]

    Science.gov (United States)

    2017-07-26

    This zip file contains POSTDATA.ATT (.ATT); Print to File (.PRN); Portable Document Format (.PDF); and document (.DOCX) files of data to support FHWA-JPO-16-385, Analysis, modeling, and simulation (AMS) testbed development and evaluation to support d...

  14. The Living With a Star Space Environment Testbed Experiments

    Science.gov (United States)

    Xapsos, Michael A.

    2014-01-01

    The focus of the Living With a Star (LWS) Space Environment Testbed (SET) program is to improve the performance of hardware in the space radiation environment. The program has developed a payload for the Air Force Research Laboratory (AFRL) Demonstration and Science Experiments (DSX) spacecraft that is scheduled for launch in August 2015 on the SpaceX Falcon Heavy rocket. The primary structure of DSX is an Evolved Expendable Launch Vehicle (EELV) Secondary Payload Adapter (ESPA) ring. DSX will be in a Medium Earth Orbit (MEO). This oral presentation will describe the SET payload.

  15. Using Information From Prior Satellite Scans to Improve Cloud Detection Near the Day-Night Terminator

    Science.gov (United States)

    Yost, Christopher R.; Minnis, Patrick; Trepte, Qing Z.; Palikonda, Rabindra; Ayers, Jeffrey K.; Spangenberg, Doulas A.

    2012-01-01

    With geostationary satellite data it is possible to have a continuous record of diurnal cycles of cloud properties for a large portion of the globe. Daytime cloud property retrieval algorithms are typically superior to nighttime algorithms because daytime methods utilize measurements of reflected solar radiation. However, reflected solar radiation is difficult to accurately model for high solar zenith angles where the amount of incident radiation is small. Clear and cloudy scenes can exhibit very small differences in reflected radiation and threshold-based cloud detection methods have more difficulty setting the proper thresholds for accurate cloud detection. Because top-of-atmosphere radiances are typically more accurately modeled outside the terminator region, information from previous scans can help guide cloud detection near the terminator. This paper presents an algorithm that uses cloud fraction and clear and cloudy infrared brightness temperatures from previous satellite scan times to improve the performance of a threshold-based cloud mask near the terminator. Comparisons of daytime, nighttime, and terminator cloud fraction derived from Geostationary Operational Environmental Satellite (GOES) radiance measurements show that the algorithm greatly reduces the number of false cloud detections and smoothes the transition from the daytime to the nighttime cloud detection algorithm. Comparisons with the Geoscience Laser Altimeter System (GLAS) data show that using this algorithm decreases the number of false detections by approximately 20 percentage points.
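
    A speculative, hedged sketch of how prior-scan information can steer a terminator-region threshold test, using exactly the three inputs the abstract mentions (prior clear and cloudy brightness temperatures and prior cloud fraction). The functional form and numbers are assumptions for illustration, not the GOES algorithm's coefficients.

```python
# Hedged sketch: prior-scan-guided brightness-temperature threshold near the terminator.
def terminator_cloud_test(bt_now, prior_clear_bt, prior_cloudy_bt,
                          prior_cloud_fraction, base_margin=4.0):
    # Split the difference between the remembered clear and cloudy states, and
    # lean the threshold towards the state the pixel was in on recent scans.
    midpoint = 0.5 * (prior_clear_bt + prior_cloudy_bt)
    threshold = midpoint + base_margin * (0.5 - prior_cloud_fraction)
    return bt_now < threshold          # colder than the threshold => flagged cloudy

print(terminator_cloud_test(bt_now=262.0, prior_clear_bt=285.0,
                            prior_cloudy_bt=250.0, prior_cloud_fraction=0.7))
```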

  16. EPIC: A Testbed for Scientifically Rigorous Cyber-Physical Security Experimentation

    OpenAIRE

    SIATERLIS CHRISTOS; GENGE BELA; HOHENADEL MARC

    2013-01-01

    Recent malware, like Stuxnet and Flame, constitute a major threat to Networked Critical Infrastructures (NCIs), e.g., power plants. They revealed several vulnerabilities in today's NCIs, but most importantly they highlighted the lack of an efficient scientific approach to conduct experiments that measure the impact of cyber threats on both the physical and the cyber parts of NCIs. In this paper we present EPIC, a novel cyber-physical testbed and a modern scientific instrument that can pr...

  17. A New Temperature-Vegetation Triangle Algorithm with Variable Edges (TAVE) for Satellite-Based Actual Evapotranspiration Estimation

    Directory of Open Access Journals (Sweden)

    Hua Zhang

    2016-09-01

    Full Text Available The estimation of spatially-variable actual evapotranspiration (AET) is a critical challenge to regional water resources management. We propose a new remote sensing method, the Triangle Algorithm with Variable Edges (TAVE), to generate daily AET estimates based on satellite-derived land surface temperature and the vegetation index NDVI. TAVE captures heterogeneity in AET across elevation zones and permits variability in determining local values of the wet and dry end-member classes (known as edges). Compared to traditional triangle methods, TAVE introduces three unique features: (i) the discretization of the domain as overlapping elevation zones; (ii) a variable wet edge that is a function of elevation zone; and (iii) variable values of a combined-effect parameter (accounting for aerodynamic and surface resistance, vapor pressure gradient, and soil moisture availability) along both wet and dry edges. With these features, TAVE effectively addresses the combined influence of terrain and water stress on AET estimates in semi-arid environments. We demonstrate the effectiveness of this method in one of the driest countries in the world, Jordan, and compare it to a traditional triangle method (TA) and a global AET product (MOD16) over different land use types. In irrigated agricultural lands, TAVE matched the results of the single crop coefficient model (−3%), in contrast to substantial overestimation by TA (+234%) and underestimation by MOD16 (−50%). In forested (non-irrigated), water-consuming regions, TA and MOD16 produced average AET deviations 15.5 times and −3.5 times those based on TAVE. As TAVE has a simple structure and low data requirements, it provides an efficient means to satisfy the increasing need for evapotranspiration estimation in data-scarce semi-arid regions. This study constitutes a much needed step towards the satellite-based quantification of agricultural water consumption in Jordan.
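
    A hedged sketch of the core triangle interpolation that TAVE builds on: within an NDVI bin, the pixel's land surface temperature is positioned between a dry edge (evaporative fraction 0) and a wet edge (evaporative fraction 1), and AET is the evaporative fraction times the available energy. TAVE's elevation-zone discretization and variable combined-effect parameter are not reproduced here; the edge temperatures and energy are example values.

```python
# Hedged sketch of the classic triangle-method interpolation behind TAVE.
def evaporative_fraction(lst, lst_dry_edge, lst_wet_edge):
    """Linear position of the pixel LST between the dry (hot) and wet (cold) edges."""
    ef = (lst_dry_edge - lst) / (lst_dry_edge - lst_wet_edge)
    return min(1.0, max(0.0, ef))

def daily_aet(lst, lst_dry_edge, lst_wet_edge, available_energy_mm):
    """AET (mm/day) = evaporative fraction times available energy in water-equivalent mm."""
    return evaporative_fraction(lst, lst_dry_edge, lst_wet_edge) * available_energy_mm

# Example: a 310 K pixel in an NDVI bin whose edges are 318 K (dry) and 296 K (wet)
print(round(daily_aet(310.0, 318.0, 296.0, available_energy_mm=6.0), 2), "mm/day")
```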

  18. Decision tree approach for classification of remotely sensed satellite

    Indian Academy of Sciences (India)

    DTC) algorithm for classification of remotely sensed satellite data (Landsat TM) using open source support. The decision tree is constructed by recursively partitioning the spectral distribution of the training dataset using WEKA, open source ...

  19. SALIENCY BASED SEGMENTATION OF SATELLITE IMAGES

    Directory of Open Access Journals (Sweden)

    A. Sharma

    2015-03-01

    Full Text Available Saliency captures the way humans see an image, and saliency-based segmentation can eventually be helpful in psychovisual image interpretation. Keeping this in view, a few saliency models are used along with a segmentation algorithm, and only the salient segments of the image are extracted. The work is carried out for terrestrial images as well as for satellite images. The methodology used in this work extracts those segments from the segmented image whose saliency value is greater than or equal to a threshold value. Salient and non-salient regions of the image become foreground and background, respectively, and the image is thus separated. For carrying out this work, a dataset of terrestrial images and Worldview 2 satellite images (sample data) are used. Results show that the saliency models which work better for terrestrial images are not good enough for satellite images in terms of foreground and background separation. Foreground and background separation in terrestrial images is based on salient objects visible in the images, whereas in satellite images this separation is based on salient areas rather than salient objects.

  20. Short-Term Prediction Research and Transition (SPoRT) Center: Transitioning Satellite Data to Operations

    Science.gov (United States)

    Zavodsky, Bradley

    2012-01-01

    The Short-term Prediction Research and Transition (SPoRT) Center located at NASA Marshall Space Flight Center has been conducting testbed activities aimed at transitioning satellite products to National Weather Service operational end users for the last 10 years. SPoRT is a NASA/NOAA funded project that has set the bar for transition of products to operational end users through a paradigm of understanding forecast challenges and forecaster needs, displaying products in end users decision support systems, actively assessing the operational impact of these products, and improving products based on forecaster feedback. Aiming for quality partnerships rather than a large quantity of data users, SPoRT has become a community leader in training operational forecasters on the use of up-and-coming satellite data through the use of legacy instruments and proxy data. Traditionally, SPoRT has supplied satellite imagery and products from NASA instruments such as the Moderate-resolution Imaging Spectroradiometer (MODIS) and the Atmospheric Infrared Sounder (AIRS). However, recently, SPoRT has been funded by the GOES-R and Joint Polar Satellite System (JPSS) Proving Grounds to accelerate the transition of selected imagery and products to help improve forecaster awareness of upcoming operational data from the Visible Infrared Imager Radiometer Suite (VIIRS), Cross-track Infrared Sounder (CrIS), Advanced Baseline Imager (ABI), and Geostationary Lightning Mapper (GLM). This presentation provides background on the SPoRT Center, the SPoRT paradigm, and some example products that SPoRT is excited to work with forecasters to evaluate.

  1. Distributed Extended Kalman Filter for Position, Velocity, Time, Estimation in Satellite Navigation Receivers

    Directory of Open Access Journals (Sweden)

    O. Jakubov

    2013-09-01

    Full Text Available Common techniques for position-velocity-time estimation in satellite navigation, iterative least squares and the extended Kalman filter, involve matrix operations. The matrix inversion and the inclusion of a matrix library pose requirements on the computational power and operating platform of the navigation processor. In this paper, we introduce a novel distributed algorithm suitable for implementation in simple parallel processing units, one for each tracked satellite. Such a unit performs only scalar addition, subtraction, multiplication, and division. The algorithm can be efficiently implemented in hardware logic. Given the fast position-velocity-time estimator, frequent estimates can improve the dynamic performance of a vector tracking receiver. The algorithm has been designed from a factor graph representing the extended Kalman filter by splitting vector nodes into scalar ones, resulting in a cyclic graph that requires only a few iterations. Monte Carlo simulations have been conducted to investigate convergence and accuracy. Simulation case studies for a vector tracking architecture and experimental measurements with a real-time software receiver developed at CTU in Prague were conducted. The algorithm offers compromises in stability, accuracy, and complexity depending on the number of iterations. In scenarios with a large number of tracked satellites, it can outperform the traditional methods at low complexity.

  2. Satellite network robust QoS-aware routing

    CERN Document Server

    Long, Fei

    2014-01-01

    Satellite Network Robust QoS-aware Routing presents a novel routing strategy for satellite networks. This strategy is useful for the design of multi-layered satellite networks as it can greatly reduce the number of time slots in one system cycle. The traffic prediction and engineering approaches make the system robust so that the traffic spikes can be handled effectively. The multi-QoS optimization routing algorithm can satisfy various potential user requirements. Clear and sufficient illustrations are also presented in the book. As the chapters cover the above topics independently, readers from different research backgrounds in constellation design, multi-QoS routing, and traffic engineering can benefit from the book.   Fei Long is a senior engineer at Beijing R&D Center of 54th Research Institute of China Electronics Technology Group Corporation.

  3. Low-Thrust Out-of-Plane Orbital Station-Keeping Maneuvers for Satellites

    Directory of Open Access Journals (Sweden)

    Vivian M. Gomes

    2012-01-01

    Full Text Available This paper considers the problem of out-of-plane orbital maneuvers for station keeping of satellites. The main idea is to consider that a satellite is in an orbit around the Earth and that its orbit is disturbed by one or more forces. Then, it is necessary to perform a small-amplitude orbital correction to return the satellite to its original orbit, to keep it performing its mission. Low-thrust propulsion is used to complete this task. It is important to search for solutions that minimize the fuel consumption, to increase the lifetime of the satellite. To solve this problem a hybrid optimal control approach is used. The accuracy of the satisfaction of the constraints is considered, in order to try to decrease the fuel expenditure by taking advantage of this freedom. This type of problem presents numerical difficulties and it is necessary to adjust parameters, as well as details of the algorithm, to get convergence. In this sense, versions of the algorithm that work well for planar maneuvers are usually not adequate for out-of-plane orbital corrections. In order to illustrate the method, some numerical results are presented.

  4. New Channel Coding Methods for Satellite Communication

    Directory of Open Access Journals (Sweden)

    J. Sebesta

    2010-04-01

    Full Text Available This paper deals with new progressive channel coding methods for short message transmission via a satellite transponder using a predetermined frame length. The key benefits of this contribution are the modification and implementation of a new turbo code and the utilization of its unique features, together with methods for bit error rate estimation and an algorithm for output message reconstruction. These methods allow error-free communication at a very low Eb/N0 ratio and have been adopted for satellite communication; however, they can be applied to other systems working at very low Eb/N0 ratios.

  5. Testbed diversity as a fundamental principle for effective ICS security research

    OpenAIRE

    Green, Benjamin; Frey, Sylvain Andre Francis; Rashid, Awais; Hutchison, David

    2016-01-01

    The implementation of diversity in testbeds is essential to understanding and improving the security and resilience of Industrial Control Systems (ICS). Employing a wide spectrum of equipment, diverse networks, and business processes, as deployed in real-life infrastructures, is particularly difficult in experimental conditions. However, this level of diversity is key from a security perspective, as attackers can exploit system particularities and process intricacies to their advantage...

  6. THERMAL AND VISIBLE SATELLITE IMAGE FUSION USING WAVELET IN REMOTE SENSING AND SATELLITE IMAGE PROCESSING

    Directory of Open Access Journals (Sweden)

    A. H. Ahrari

    2017-09-01

    Full Text Available The multimodal remote sensing approach is based on merging different data from different portions of the electromagnetic spectrum, which improves the accuracy of satellite image processing and interpretation. Visible and thermal infrared bands independently contain valuable spatial and spectral information. Visible bands provide rich spatial information, while thermal bands provide different radiometric and spectral information. However, low spatial resolution is the most important limitation of thermal infrared bands. Using satellite image fusion, it is possible to merge them into a single thermal image that contains high spectral and spatial information at the same time. The aim of this study is a quantitative and qualitative performance assessment of thermal and visible image fusion with the wavelet transform and different filters. In this research, the wavelet algorithm (Haar) and different decomposition filters (mean, linear, max, min, and random) were applied to the thermal and panchromatic bands of the Landsat 8 satellite as a shortwave and longwave fusion method. Finally, quality assessment was done with quantitative and qualitative approaches. Quantitative parameters such as Entropy, Standard Deviation, Cross Correlation, Q Factor, and Mutual Information were used. For thermal and visible image fusion accuracy assessment, all parameters (quantitative and qualitative) must be analysed with respect to each other. Among all relevant statistical factors, correlation gives the most meaningful result and the closest similarity to the qualitative assessment. Results showed that the mean and linear filters produce better fused images than the other filters in the Haar algorithm. The linear and mean filters have the same performance, and there is no difference between their qualitative and quantitative results.
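
    A minimal sketch of wavelet-based thermal/panchromatic fusion along the lines described above, assuming the thermal band has already been resampled to the panchromatic grid and using the PyWavelets package; the coefficient-selection rule shown (approximation from thermal, largest-magnitude details) is just one simple rule and not necessarily any of the filters compared in the study.

    ```python
    import numpy as np
    import pywt

    def fuse_thermal_pan(thermal, pan, wavelet="haar", level=2):
        """Fuse a (resampled) thermal band with a panchromatic band.

        Keeps the approximation (radiometric content) from the thermal band
        and, at each level, takes the detail coefficients with the larger
        magnitude, so spatial detail comes mostly from the pan band.
        Illustrative only; both inputs must share the same shape.
        """
        ct = pywt.wavedec2(thermal, wavelet, level=level)
        cp = pywt.wavedec2(pan, wavelet, level=level)
        fused = [ct[0]]  # approximation coefficients from the thermal band
        for (tH, tV, tD), (pH, pV, pD) in zip(ct[1:], cp[1:]):
            fused.append(tuple(np.where(np.abs(t) >= np.abs(p), t, p)
                               for t, p in ((tH, pH), (tV, pV), (tD, pD))))
        return pywt.waverec2(fused, wavelet)
    ```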

  7. Ship detection in satellite imagery using rank-order greyscale hit-or-miss transforms

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, Neal R [Los Alamos National Laboratory; Porter, Reid B [Los Alamos National Laboratory; Theiler, James [Los Alamos National Laboratory

    2010-01-01

    Ship detection from satellite imagery has great utility in various communities. Knowing where ships are and their types provides useful intelligence information. However, detecting and recognizing ships is a difficult problem. Existing techniques suffer from too many false alarms. We describe approaches we have taken in trying to build ship detection algorithms that have reduced false alarms. Our approach uses a version of the grayscale morphological Hit-or-Miss transform. While this transform is well known and used in its standard form, we use a version in which a rank-order selection replaces the standard maximum and minimum operators in the dilation and erosion parts of the transform. This provides some slack in the fitting that the algorithm employs and a method for tuning the algorithm's performance for particular detection problems. We describe our algorithms, show the effect of the rank-order parameter on the algorithm's performance and illustrate the use of this approach for real ship detection problems with panchromatic satellite imagery.
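
    A minimal sketch of a rank-order greyscale hit-or-miss response, assuming SciPy's percentile filter as the rank-order operator; the percentiles, footprints, and threshold are tuning knobs chosen for illustration, not the values used by the authors.

    ```python
    import numpy as np
    from scipy.ndimage import percentile_filter

    def rank_order_hit_or_miss(img, fg_footprint, bg_footprint,
                               fg_pct=10, bg_pct=90, thresh=0.0):
        """Rank-order greyscale hit-or-miss transform (illustrative sketch).

        The standard greyscale HMT uses erosion (min) under the foreground
        structuring element and dilation (max) under the background one;
        replacing min/max with low/high percentiles adds slack to the fit,
        which is the idea described in the abstract. Footprints are boolean
        arrays defining the two structuring elements.
        """
        fit_fg = percentile_filter(img, fg_pct, footprint=fg_footprint)  # rank-order "erosion"
        fit_bg = percentile_filter(img, bg_pct, footprint=bg_footprint)  # rank-order "dilation"
        response = fit_fg - fit_bg
        return response, response > thresh
    ```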

  8. The CMS integration grid testbed

    Energy Technology Data Exchange (ETDEWEB)

    Graham, Gregory E.

    2004-08-26

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent based Mona Lisa. Domain specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two month span in Fall of 2002, over 1 million official CMS GEANT based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids.

  9. Assessment of a Bidirectional Reflectance Distribution Correction of Above-Water and Satellite Water-Leaving Radiance in Coastal Waters

    Science.gov (United States)

    Hlaing, Soe; Gilerson, Alexander; Harmal, Tristan; Tonizzo, Alberto; Weidemann, Alan; Arnone, Robert; Ahmed, Samir

    2012-01-01

    Water-leaving radiances, retrieved from in situ or satellite measurements, need to be corrected for the bidirectional properties of the measured light in order to standardize the data and make them comparable with each other. The current operational algorithm for the correction of bidirectional effects from the satellite ocean color data is optimized for typical oceanic waters. However, versions of bidirectional reflectance correction algorithms specifically tuned for typical coastal waters and other case 2 conditions are particularly needed to improve the overall quality of those data. In order to analyze the bidirectional reflectance distribution function (BRDF) of case 2 waters, a dataset of typical remote sensing reflectances was generated through radiative transfer simulations for a large range of viewing and illumination geometries. Based on this simulated dataset, a case 2 water focused remote sensing reflectance model is proposed to correct above-water and satellite water-leaving radiance data for bidirectional effects. The proposed model is first validated with a one year time series of in situ above-water measurements acquired by collocated multispectral and hyperspectral radiometers, which have different viewing geometries installed at the Long Island Sound Coastal Observatory (LISCO). Match-ups and intercomparisons performed on these concurrent measurements show that the proposed algorithm outperforms the algorithm currently in use at all wavelengths, with average improvement of 2.4% over the spectral range. LISCO's time series data have also been used to evaluate improvements in match-up comparisons of Moderate Resolution Imaging Spectroradiometer satellite data when the proposed BRDF correction is used in lieu of the current algorithm. It is shown that the discrepancies between coincident in-situ sea-based and satellite data decreased by 3.15% with the use of the proposed algorithm.

  10. Satellite-Based Precipitation Datasets

    Science.gov (United States)

    Munchak, S. J.; Huffman, G. J.

    2017-12-01

    Of the possible sources of precipitation data, those based on satellites provide the greatest spatial coverage. There is a wide selection of datasets, algorithms, and versions from which to choose, which can be confusing to non-specialists wishing to use the data. The International Precipitation Working Group (IPWG) maintains tables of the major publicly available, long-term, quasi-global precipitation data sets (http://www.isac.cnr.it/ipwg/data/datasets.html), and this talk briefly reviews the various categories. As examples, NASA provides two sets of quasi-global precipitation data sets: the older Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) and current Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (GPM) mission (IMERG). Both provide near-real-time and post-real-time products that are uniformly gridded in space and time. The TMPA products are 3-hourly 0.25°x0.25° on the latitude band 50°N-S for about 16 years, while the IMERG products are half-hourly 0.1°x0.1° on 60°N-S for over 3 years (with plans to go to 16+ years in Spring 2018). In addition to the precipitation estimates, each data set provides fields of other variables, such as the satellite sensor providing estimates and estimated random error. The discussion concludes with advice about determining suitability for use, the necessity of being clear about product names and versions, and the need for continued support for satellite- and surface-based observation.

  11. Cloud detection algorithm comparison and validation for operational Landsat data products

    Science.gov (United States)

    Foga, Steven Curtis; Scaramuzza, Pat; Guo, Song; Zhu, Zhe; Dilley, Ronald; Beckmann, Tim; Schmidt, Gail L.; Dwyer, John L.; Hughes, MJ; Laue, Brady

    2017-01-01

    Clouds are a pervasive and unavoidable issue in satellite-borne optical imagery. Accurate, well-documented, and automated cloud detection algorithms are necessary to effectively leverage large collections of remotely sensed data. The Landsat project is uniquely suited for comparative validation of cloud assessment algorithms because the modular architecture of the Landsat ground system allows for quick evaluation of new code, and because Landsat has the most comprehensive manual truth masks of any current satellite data archive. Currently, the Landsat Level-1 Product Generation System (LPGS) uses separate algorithms for determining clouds, cirrus clouds, and snow and/or ice probability on a per-pixel basis. With more bands onboard the Landsat 8 Operational Land Imager (OLI)/Thermal Infrared Sensor (TIRS) satellite, and a greater number of cloud masking algorithms, the U.S. Geological Survey (USGS) is replacing the current cloud masking workflow with a more robust algorithm that is capable of working across multiple Landsat sensors with minimal modification. Because of the inherent error from stray light and intermittent data availability of TIRS, these algorithms need to operate both with and without thermal data. In this study, we created a workflow to evaluate cloud and cloud shadow masking algorithms using cloud validation masks manually derived from both Landsat 7 Enhanced Thematic Mapper Plus (ETM +) and Landsat 8 OLI/TIRS data. We created a new validation dataset consisting of 96 Landsat 8 scenes, representing different biomes and proportions of cloud cover. We evaluated algorithm performance by overall accuracy, omission error, and commission error for both cloud and cloud shadow. We found that CFMask, C code based on the Function of Mask (Fmask) algorithm, and its confidence bands have the best overall accuracy among the many algorithms tested using our validation data. The Artificial Thermal-Automated Cloud Cover Algorithm (AT-ACCA) is the most accurate
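
    The headline metrics named above reduce to counts over a per-pixel contingency table; a minimal sketch for a binary cloud (or cloud shadow) mask, with the function and argument names being illustrative.

    ```python
    import numpy as np

    def mask_scores(truth, pred):
        """Overall accuracy, omission error and commission error for a binary
        cloud (or cloud-shadow) mask versus a manual truth mask.
        truth and pred are boolean arrays of the same shape."""
        truth, pred = truth.astype(bool), pred.astype(bool)
        tp = np.sum(truth & pred)
        fn = np.sum(truth & ~pred)    # cloud missed   -> omission
        fp = np.sum(~truth & pred)    # clear flagged  -> commission
        tn = np.sum(~truth & ~pred)
        overall = (tp + tn) / truth.size
        omission = fn / (tp + fn) if (tp + fn) else 0.0
        commission = fp / (tp + fp) if (tp + fp) else 0.0
        return overall, omission, commission
    ```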

  12. Satellite lithium-ion battery remaining useful life estimation with an iterative updated RVM fused with the KF algorithm

    Institute of Scientific and Technical Information of China (English)

    Yuchen SONG; Datong LIU; Yandong HOU; Jinxiang YU; Yu PENG

    2018-01-01

    Lithium-ion batteries have become the third-generation space batteries and are widely utilized in a series of spacecraft. Remaining Useful Life (RUL) estimation is essential to a spacecraft, as the battery is a critical part and determines the lifetime and reliability. The Relevance Vector Machine (RVM) is a data-driven algorithm used to estimate a battery's RUL due to its sparse feature and uncertainty management capability. In particular, some of the regression cases indicate that the RVM can obtain better short-term prediction performance than long-term prediction. As a nonlinear kernel learning algorithm, the coefficient matrix and relevance vectors are fixed once the RVM training is conducted. Moreover, the RVM can easily be influenced by noise in the training data. Thus, this work proposes an iteratively updated approach to improve the long-term prediction performance for a battery's RUL prediction. Firstly, when a new estimate is output by the RVM, the Kalman filter is applied to optimize this estimate with a physical degradation model. Then, this optimized estimate is added into the training set as an on-line sample, the RVM model is re-trained, and the coefficient matrix and relevance vectors can be dynamically adjusted to make the next iterative prediction. Experimental results with a commercial battery test data set and a satellite battery data set both indicate that the proposed method can achieve a better performance for RUL estimation.
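
    A minimal sketch of the iterate-filter-refit loop described above, with scikit-learn's KernelRidge standing in for the RVM and a crude linear trend standing in for the physical degradation model; all names, kernels, and noise parameters are illustrative assumptions, not the paper's configuration.

    ```python
    import numpy as np
    from sklearn.kernel_ridge import KernelRidge   # placeholder for an RVM

    def iterative_rul_forecast(cycles, capacity, horizon, q=1e-4, r=1e-2):
        """Regress capacity on cycle number, filter each new prediction
        against a simple degradation trend with a scalar Kalman filter,
        feed the filtered value back into the training set, and refit
        before the next step (illustrative sketch only)."""
        x, y = list(cycles), list(capacity)
        slope, _ = np.polyfit(x, y, 1)   # crude linear "degradation model"
        est, p = y[-1], 1.0              # filter state and variance
        out = []
        for _ in range(horizon):
            model = KernelRidge(kernel="rbf", gamma=1e-4).fit(
                np.array(x).reshape(-1, 1), y)
            z = model.predict([[x[-1] + 1]])[0]       # data-driven estimate
            est, p = est + slope, p + q               # predict with model trend
            k = p / (p + r)
            est, p = est + k * (z - est), (1 - k) * p  # Kalman update
            x.append(x[-1] + 1)
            y.append(est)                              # on-line sample fed back
            out.append(est)
        return out
    ```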

  13. Satellite lithium-ion battery remaining useful life estimation with an iterative updated RVM fused with the KF algorithm

    Directory of Open Access Journals (Sweden)

    Yuchen SONG

    2018-01-01

    Full Text Available Lithium-ion batteries have become the third-generation space batteries and are widely utilized in a series of spacecraft. Remaining Useful Life (RUL) estimation is essential to a spacecraft, as the battery is a critical part and determines the lifetime and reliability. The Relevance Vector Machine (RVM) is a data-driven algorithm used to estimate a battery's RUL due to its sparse feature and uncertainty management capability. In particular, some of the regression cases indicate that the RVM can obtain better short-term prediction performance than long-term prediction. As a nonlinear kernel learning algorithm, the coefficient matrix and relevance vectors are fixed once the RVM training is conducted. Moreover, the RVM can easily be influenced by noise in the training data. Thus, this work proposes an iteratively updated approach to improve the long-term prediction performance for a battery's RUL prediction. Firstly, when a new estimate is output by the RVM, the Kalman filter is applied to optimize this estimate with a physical degradation model. Then, this optimized estimate is added into the training set as an on-line sample, the RVM model is re-trained, and the coefficient matrix and relevance vectors can be dynamically adjusted to make the next iterative prediction. Experimental results with a commercial battery test data set and a satellite battery data set both indicate that the proposed method can achieve a better performance for RUL estimation.

  14. Satellite cluster flight using on-off cyclic control

    Science.gov (United States)

    Zhang, Hao; Gurfil, Pini

    2015-01-01

    Nano-satellite clusters and disaggregated satellites are new concepts in the realm of distributed satellite systems, which require complex cluster management - mainly regulating the maximal and minimal inter-satellite distances on time scales of years - while utilizing simple on-off propulsion systems. The simple actuators and long time scales require judicious astrodynamical modeling coupled with specialized orbit control. This paper offers a satellite cluster orbit control law which works for long time scales in a perturbed environment while utilizing fixed-magnitude thrusters. The main idea is to design a distributed controller which balances the fuel consumption among the satellites, thus mitigating the effect of differential drag perturbations. The underlying methodology utilizes a cyclic control algorithm based on a mean orbital elements feedback. Stability properties of the closed-loop cyclic control system do not adhere to the classical Lyapunov stability theory, so an effort is made to define and implement a suitable stability theory of noncompact equilibria sets. A state selection scheme is proposed for efficiently establishing a low Earth orbit cluster. Several simulations, including a real mission study, and several comparative investigations, are performed to show the strengths of the proposed control law.

  15. Analysing Regional Land Surface Temperature Changes by Satellite Data, a Case Study of Zonguldak, Turkey

    Science.gov (United States)

    Sekertekin, A.; Kutoglu, S.; Kaya, S.; Marangoz, A. M.

    2014-12-01

    In recent years, climate change has become one of the most important problems that the ecological system of the world is encountering. Global warming and climate change have been studied frequently by all disciplines all over the world, and Geomatics Engineering also contributes to such studies by means of remote sensing, the global positioning system, etc. Monitoring Land Surface Temperature (LST) via remote sensing satellites is one of the most important contributions to climatology. LST is an important parameter governing the energy balance on the Earth, and there are many algorithms to obtain LST by remote sensing techniques. The most commonly used algorithms are the split-window algorithm, the temperature/emissivity separation method, the mono-window algorithm and the single channel method. Generally, three algorithms are used to obtain LST from Landsat 5 TM data: the radiative transfer equation method, the single channel method and the mono-window algorithm. The radiative transfer equation method is not applicable here because atmospheric parameters must be measured in situ during the satellite pass. In this research, the mono-window algorithm was applied to a Landsat 5 TM image. In addition, meteorological data such as humidity and temperature are used in the algorithm. The acquisition date of the image is 28.08.2011, and the study area is Zonguldak, Turkey. High resolution images are used to investigate the relationships between LST and land cover type. As a result of these analyses, areas with vegetation cover have an approximately 5 ºC lower temperature than the city center and arid land. Because different surface types such as reinforced concrete construction, green zones and sandbank coexist in the city center, LST differs by about 10 ºC there. The temperature around some places in the thermal power plant region of Çatalağzı is about 5 ºC higher than in the city center. Sandbank and agricultural areas have the highest temperatures because of the land cover structure. Thanks to this
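
    A minimal sketch of a mono-window style LST computation for Landsat 5 TM band 6, assuming the Qin et al. form of the equation and the band-6 calibration constants from the Landsat handbook; the linearization coefficients a and b are the values commonly quoted for TM band 6 and should be re-checked against the study before use.

    ```python
    import numpy as np

    # Landsat-5 TM band 6 calibration constants (Landsat handbook values)
    K1, K2 = 607.76, 1260.56

    def brightness_temperature(radiance):
        """Band-6 at-sensor brightness temperature in kelvin."""
        return K2 / np.log(K1 / radiance + 1.0)

    def mono_window_lst(t6, emissivity, tau, t_air, a=-67.355351, b=0.458606):
        """Mono-window LST (kelvin) in the Qin et al. form.

        t6    : band-6 brightness temperature [K]
        tau   : atmospheric transmittance
        t_air : effective mean atmospheric temperature [K]
        a, b  : commonly quoted linearization coefficients (illustrative).
        """
        c = emissivity * tau
        d = (1.0 - tau) * (1.0 + (1.0 - emissivity) * tau)
        return (a * (1 - c - d) + (b * (1 - c - d) + c + d) * t6 - d * t_air) / c
    ```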

  16. Dissolved Organic Carbon along the Louisiana coast from MODIS and MERIS satellite data

    Science.gov (United States)

    Chaichi Tehrani, N.; D'Sa, E. J.

    2012-12-01

    Dissolved organic carbon (DOC) plays a critical role in the coastal and ocean carbon cycle. Hence, it is important to monitor and investigate its distribution and fate in coastal waters. Since DOC cannot be measured directly by satellite remote sensors, chromophoric dissolved organic matter (CDOM), as an optically active fraction of DOC, can be used as an alternative proxy to trace DOC concentrations. Here, satellite ocean color data from MODIS and MERIS, together with field measurements of CDOM and DOC, were used to develop and assess CDOM and DOC ocean color algorithms for coastal waters. To develop a CDOM retrieval algorithm, empirical relationships between the CDOM absorption coefficient at 412 nm (aCDOM(412)) and the reflectance ratios Rrs(488)/Rrs(555) for MODIS and Rrs(510)/Rrs(560) for MERIS were established. The performance of the two CDOM empirical algorithms was evaluated for retrieval of aCDOM(412) from MODIS and MERIS in the northern Gulf of Mexico. Further, empirical algorithms were developed to estimate DOC concentration using the relationship between in situ aCDOM(412) and DOC, as well as the newly developed CDOM empirical algorithms. Accordingly, our results revealed that DOC concentration was strongly correlated with aCDOM(412) for the summer and spring-winter periods (r2 = 0.9 for both periods). Then, using the aCDOM(412)-Rrs and the aCDOM(412)-DOC relationships derived from field measurements, a DOC-Rrs relationship was established for MODIS and MERIS data. The DOC empirical algorithms performed well, as indicated by match-up comparisons between satellite estimates and field data (R2 = 0.52 and 0.58 for MODIS and MERIS, respectively, for the summer period). These algorithms were then used to examine the DOC distribution along the Louisiana coast.
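
    A minimal sketch of the two-step empirical retrieval described above (band ratio to aCDOM(412), then a linear aCDOM(412)-to-DOC conversion); the power-law and linear coefficients here are hypothetical placeholders, since the study fits its own values from field data.

    ```python
    def acdom412_from_ratio(rrs_blue, rrs_green, A=0.3, B=-2.0):
        """Empirical power-law retrieval of aCDOM(412) from a blue/green
        reflectance band ratio (Rrs(488)/Rrs(555) for MODIS or
        Rrs(510)/Rrs(560) for MERIS). A and B are placeholder coefficients."""
        return A * (rrs_blue / rrs_green) ** B

    def doc_from_acdom(acdom412, c1=100.0, c0=50.0):
        """Seasonal linear aCDOM(412)-to-DOC conversion; c1 and c0 stand in
        for the per-season regression coefficients fitted in the study."""
        return c1 * acdom412 + c0
    ```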

  17. Automated tracking for advanced satellite laser ranging systems

    Science.gov (United States)

    McGarry, Jan F.; Degnan, John J.; Titterton, Paul J., Sr.; Sweeney, Harold E.; Conklin, Brion P.; Dunn, Peter J.

    1996-06-01

    NASA's Satellite Laser Ranging Network was originally developed during the 1970s to track satellites carrying corner cube reflectors. Today eight NASA systems, achieving millimeter ranging precision, are part of a global network of more than 40 stations that track 17 international satellites. To meet the tracking demands of a steadily growing satellite constellation within existing resources, NASA is embarking on a major automation program. While manpower on the current systems will be reduced to a single operator, the fully automated SLR2000 system is being designed to operate for months without human intervention. Because SLR2000 must be eyesafe and operate in daylight, tracking is often performed in a low probability of detection and high noise environment. The goal is to automatically select the satellite, set up the tracking and ranging hardware, verify acquisition, and close the tracking loop to optimize data yield. To accomplish the autotracking tasks, we are investigating (1) improved satellite force models, (2) more frequent updates of orbital ephemerides, (3) lunar laser ranging data processing techniques to distinguish satellite returns from noise, and (4) angular detection and search techniques to acquire the satellite. A Monte Carlo simulator has been developed to allow optimization of the autotracking algorithms by modeling the relevant system errors and then checking performance against system truth. A combination of simulator and preliminary field results will be presented.

  18. Creative thinking of design and redesign on SEAT aircraft cabin testbed: a case study

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    In this paper, the intuition approach in the design and redesign of the environmentally friendly innovative aircraft cabin simulator is presented. The aircraft cabin simulator is a testbed used for the European project SEAT (Smart tEchnologies for Stress free Air Travel). The SEAT project aims to

  19. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    Energy Technology Data Exchange (ETDEWEB)

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  20. Development of an autonomous power system testbed

    International Nuclear Information System (INIS)

    Barton, J.R.; Adams, T.; Liffring, M.E.

    1985-01-01

    A power system testbed has been assembled to advance the development of large autonomous electrical power systems required for the space station, spacecraft, and aircraft. The power system for this effort was designed to simulate single- or dual-bus autonomous power systems, or autonomous systems that reconfigure from a single bus to a dual bus following a severe fault. The approach taken was to provide a flexible power system design with two computer systems for control and management. One computer operates as the control system and performs basic control functions, data and command processing, charge control, and provides status to the second computer. The second computer contains expert system software for mission planning, load management, fault identification and recovery, and sends load and configuration commands to the control system

  1. Test-bed for the remote health monitoring system for bridge structures using FBG sensors

    Science.gov (United States)

    Lee, Chin-Hyung; Park, Ki-Tae; Joo, Bong-Chul; Hwang, Yoon-Koog

    2009-05-01

    This paper reports on a test-bed for a long-term health monitoring system for bridge structures employing fiber Bragg grating (FBG) sensors, which is remotely accessible via the web, to provide real-time quantitative information on a bridge's response to live loading and environmental changes, and fast prediction of the structure's integrity. The sensors are attached at several locations of the structure and connected to a data acquisition system permanently installed on site. The system can be accessed through remote communication using an optical cable network, through which the evaluation of the bridge behavior under live loading is possible at places far away from the field. Live structural data are transmitted continuously to the server computer at the central office. The server computer is connected securely to the internet, where data can be retrieved, processed and stored for remote web-based health monitoring. The test-bed revealed that the remote health monitoring technology will enable practical, cost-effective, and reliable condition assessment and maintenance of bridge structures.

  2. Development of Ray Tracing Algorithms for Scanning Plane and Transverse Plane Analysis for Satellite Multibeam Application

    Directory of Open Access Journals (Sweden)

    N. H. Abd Rahman

    2014-01-01

    Full Text Available Reflector antennas have been widely used in many areas. In the implementation of a parabolic reflector antenna for broadcasting satellite applications, it is essential for the spacecraft antenna to provide a precise contoured beam to effectively serve the required region. For this purpose, combinations of more than one beam are required. Therefore, a tool utilizing the ray tracing method is developed to calculate precise off-axis beams for a multibeam antenna system. In the multibeam system, each beam will be fed from a different feed position to allow the main beam to be radiated in the exact direction on the coverage area. Thus, a detailed study of the caustics of a parabolic reflector antenna is performed and presented in this paper, investigating the behaviour of the rays and their relation to various antenna parameters. In order to produce accurate data for the analysis, the caustic behaviours are investigated in two distinctive modes: the scanning plane and the transverse plane. This paper presents detailed discussions of the derivation of the ray tracing algorithms, the establishment of the equations of the caustic loci, and the verification of the method through calculation of the radiation pattern.

  3. Research of the key technology in satellite communication networks

    Science.gov (United States)

    Zeng, Yuan

    2018-02-01

    According to predictions, wireless data traffic will increase by 500-1000 times in the next 10 years. Not only will wireless data traffic increase exponentially, but the demand for diversified traffic will also grow. Higher requirements for future mobile wireless communication systems have opened up a huge market space for satellite communication systems. At the same time, space information networks have developed greatly with the deepening of human space exploration, the development of space applications, and the expansion of military and civilian applications. The core of space information networks is satellite communication. The dissertation presents the communication system architecture, the communication protocol, the routing strategy, the switch scheduling algorithm and the handoff strategy based on the satellite communication system. We built a simulation platform for LEO satellite networks and simulated the key technologies using OPNET.

  4. DETERMINATION OF THE LIGHT CURVE OF THE ARTIFICIAL SATELLITE BY ITS ROTATION PATH AS PREPARATION TO THE INVERSE PROBLEM SOLUTION

    OpenAIRE

    Pavlenko, Daniil

    2012-01-01

    In developing an algorithm for estimating the rotational parameters of an artificial satellite from its light curve, it is necessary to compute test light curves for various initially given types of rotation and specific lighting conditions of the satellite. In the present study, the algorithm for creating such light curves by the simulation method is described, together with the results obtained.

  5. Inter-Comparison of High-Resolution Satellite Precipitation Products over Central Asia

    Directory of Open Access Journals (Sweden)

    Hao Guo

    2015-06-01

    Full Text Available This paper examines the spatial error structures of eight precipitation estimates derived from four different satellite retrieval algorithms including TRMM Multi-satellite Precipitation Analysis (TMPA), Climate Prediction Center morphing technique (CMORPH), Global Satellite Mapping of Precipitation (GSMaP) and Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN). All the original satellite and bias-corrected products of each algorithm (3B42RTV7 and 3B42V7, CMORPH_RAW and CMORPH_CRT, GSMaP_MVK and GSMaP_Gauge, PERSIANN_RAW and PERSIANN_CDR) are evaluated against the ground-based Asian Precipitation-Highly Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE) over Central Asia for the period of 2004 to 2006. The analyses show that all products except PERSIANN exhibit overestimation over the Aral Sea and its surrounding areas. The bias correction improves the quality of the original satellite TMPA and GSMaP products significantly, but only slightly for CMORPH and PERSIANN over Central Asia. 3B42RTV7 overestimates precipitation significantly with a large Relative Bias (RB) of 128.17%, while GSMaP_Gauge shows a consistently high correlation coefficient (CC) (>0.8) but its RB fluctuates between −57.95% and 112.63%. The PERSIANN_CDR outperforms other products in winter with the highest CC (0.67). Both the satellite-only and gauge-adjusted products have particularly poor performance in detecting rainfall events in terms of lower POD (less than 65%), CSI (less than 45%) and relatively high FAR (more than 35%).
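
    A minimal sketch of the evaluation statistics used in comparisons like this one (relative bias, correlation, and the categorical POD/FAR/CSI scores); the rain/no-rain threshold is an assumed value, not the one used by the authors.

    ```python
    import numpy as np

    def precip_skill(sat, gauge, rain_thresh=1.0):
        """Relative bias (RB, %), correlation coefficient (CC) and the
        categorical POD/FAR/CSI scores for rain events above rain_thresh
        (mm/day, an assumed threshold)."""
        sat, gauge = np.asarray(sat, float), np.asarray(gauge, float)
        rb = 100.0 * (sat.sum() - gauge.sum()) / gauge.sum()
        cc = np.corrcoef(sat, gauge)[0, 1]
        hit = np.sum((sat >= rain_thresh) & (gauge >= rain_thresh))
        miss = np.sum((sat < rain_thresh) & (gauge >= rain_thresh))
        false = np.sum((sat >= rain_thresh) & (gauge < rain_thresh))
        pod = hit / (hit + miss) if (hit + miss) else np.nan
        far = false / (hit + false) if (hit + false) else np.nan
        csi = hit / (hit + miss + false) if (hit + miss + false) else np.nan
        return dict(RB=rb, CC=cc, POD=pod, FAR=far, CSI=csi)
    ```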

  6. Parallel algorithms on the ASTRA SIMD machine

    International Nuclear Information System (INIS)

    Odor, G.; Rohrbach, F.; Vesztergombi, G.; Varga, G.; Tatrai, F.

    1996-01-01

    In view of the tremendous jump in computing power of modern RISC processors, the interest in parallel computing seems to be thinning out. Why use a complicated system of parallel processors if the problem can be solved by a single powerful micro-chip? It is a general law, however, that exponential growth will always end in some kind of saturation, and then parallelism will again become a hot topic. We try to prepare ourselves for this eventuality. The MPPC project started in 1990 in the heydays of parallelism and produced four ASTRA machines (presented at CHEP'92) with 4k processors (expandable to 16k) based on yesterday's chip technology (chip presented at CHEP'91). These machines now provide excellent test-beds for algorithmic developments in a complete, real environment. We are developing, for example, fast pattern recognition algorithms which could be used in high-energy physics experiments at the LHC (planned to be operational after 2004 at CERN) for triggering and data reduction. The basic feature of our ASP (Associative String Processor) approach is to use extremely simple (thus very cheap) processor elements, but in huge quantities (up to millions of processors), connected together by a very simple string-like communication chain. In this paper we present powerful algorithms based on this architecture, indicating the performance perspectives if the hardware quality reaches present or even future technology levels. (author)

  7. An Image Matching Algorithm Integrating Global SRTM and Image Segmentation for Multi-Source Satellite Imagery

    Directory of Open Access Journals (Sweden)

    Xiao Ling

    2016-08-01

    Full Text Available This paper presents a novel image matching method for multi-source satellite images, which integrates global Shuttle Radar Topography Mission (SRTM) data and image segmentation to achieve robust and numerous correspondences. This method first generates the epipolar lines as a geometric constraint assisted by global SRTM data, after which the seed points are selected and matched. To produce more reliable matching results, a region segmentation-based matching propagation is proposed in this paper, whereby the region segmentations are extracted by image segmentation and are considered to be a spatial constraint. Moreover, a similarity measure integrating Distance, Angle and Normalized Cross-Correlation (DANCC), which considers geometric similarity and radiometric similarity, is introduced to find the optimal correspondences. Experiments using typical satellite images acquired from Resources Satellite-3 (ZY-3), Mapping Satellite-1, SPOT-5 and Google Earth demonstrated that the proposed method is able to produce reliable and accurate matching results.

  8. Satellite-based technique for nowcasting of thunderstorms over ...

    Indian Academy of Sciences (India)

    Suman Goyal

    2017-08-31

    Aug 31, 2017 ... Due to the inadequate radar network, satellites play the dominant role in nowcasting these thunderstorms. In this study, a nowcast-based algorithm ForTracc developed by Vila ... of actual development of cumulonimbus clouds, ... MCS over the Indian region using the Infrared Channel ... (2016) based on case study of.

  9. Studying NASA's Transition to Ka-Band Communications for Low Earth Orbit

    Science.gov (United States)

    Chelmins, David T.; Reinhart, Richard C.; Mortensen, Dale; Welch, Bryan; Downey, Joseph; Evans, Michael

    2014-01-01

    As the S-band spectrum becomes crowded, future space missions will need to consider moving command and telemetry services to Ka-band. NASA's Space Communications and Navigation (SCaN) Testbed provides a software-defined radio (SDR) platform that is capable of supporting investigation of this service transition. The testbed contains two S-band SDRs and one Ka-band SDR. Over the past year, SCaN Testbed has demonstrated Ka-band communications capabilities with NASA's Tracking and Data Relay Satellite System (TDRSS) using both open- and closed-loop antenna tracking profiles. A number of technical areas need to be addressed for successful transition to Ka-band. The smaller antenna beamwidth at Ka-band increases the criticality of antenna pointing, necessitating closed loop tracking algorithms and new techniques for received power estimation. Additionally, the antenna pointing routines require enhanced knowledge of spacecraft position and attitude for initial acquisition, versus an S-band antenna. Ka-band provides a number of technical advantages for bulk data transfer. Unlike at S-band, a larger bandwidth may be available for space missions, allowing increased data rates. The potential for high rate data transfer can also be extended for direct-to-ground links through use of variable or adaptive coding and modulation. Specific examples of Ka-band research from SCaN Testbed's first year of operation will be cited, such as communications link performance with TDRSS, and the effects of truss flexure on antenna pointing.

  10. On-line Flagging of Anomalies and Adaptive Sequential Hypothesis Testing for Fine-feature Characterization of Geosynchronous Satellites

    Science.gov (United States)

    Chaudhary, A.; Payne, T.; Kinateder, K.; Dao, P.; Beecher, E.; Boone, D.; Elliott, B.

    The objective of on-line flagging in this paper is to perform interactive assessment of geosynchronous satellite anomalies such as cross-tagging of satellites in a cluster, solar panel offset changes, etc. This assessment will utilize a Bayesian belief propagation procedure and will include automated updates of the baseline signature data for the satellite, while accounting for seasonal changes. Its purpose is to enable an ongoing, automated assessment of satellite behavior through its life cycle using the photometry data collected during the synoptic search performed by a ground- or space-based sensor as part of its metrics mission. Changes in the satellite features will be reported along with the probabilities of Type I and Type II errors. The objective of adaptive sequential hypothesis testing in this paper is to define future sensor tasking for the purpose of characterization of fine features of the satellite. The tasking will be designed in order to maximize new information with the least number of photometry data points to be collected during the synoptic search by a ground- or space-based sensor. Its calculation is based on the utilization of information entropy techniques. The tasking is defined by considering a sequence of hypotheses in regard to the fine features of the satellite. The optimal observation conditions are then ordered in order to maximize new information about a chosen fine feature. The combined objective of on-line flagging and adaptive sequential hypothesis testing is to progressively discover new information about the features of a geosynchronous satellite by leveraging the regular but sparse cadence of data collection during the synoptic search performed by a ground- or space-based sensor. Automated Algorithm to Detect Changes in Geostationary Satellite's Configuration and Cross-Tagging Phan Dao, Air Force Research Laboratory/RVB By characterizing geostationary satellites based on photometry and color photometry, analysts can

  11. Algorithms and Applications in Grass Growth Monitoring

    Directory of Open Access Journals (Sweden)

    Jun Liu

    2013-01-01

    Full Text Available Monitoring vegetation phenology using satellite data has been an area of growing research interest in recent decades. Validation is an essential issue in land surface phenology studies at large scales. In this paper, a double logistic function-fitting algorithm was used to retrieve phenophases for grassland in North China from a consistently processed Moderate Resolution Imaging Spectroradiometer (MODIS) dataset. Then, the accuracy of the satellite-based estimates was assessed using field phenology observations. Results show that the method is valid for identifying vegetation phenology with good success. The phenophases derived from satellite and observed on the ground are generally similar. Greenup onset dates identified by the Normalized Difference Vegetation Index (NDVI) and in situ observed dates showed general agreement. There is an excellent agreement between the dates of maturity onset determined by MODIS and the field observations. The satellite-derived length of the vegetation growing season is generally consistent with the surface observation.
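
    A minimal sketch of a double logistic fit to an annual NDVI time series using SciPy; the parameterization and initial guesses are generic assumptions rather than the exact formulation used in the paper.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def double_logistic(t, vmin, vamp, s1, m1, s2, m2):
        """NDVI(t) modelled as the rise and fall of two logistic terms;
        s1/s2 are green-up and senescence inflection dates, m1/m2 the slopes."""
        return vmin + vamp * (1.0 / (1.0 + np.exp(-m1 * (t - s1)))
                              - 1.0 / (1.0 + np.exp(-m2 * (t - s2))))

    def fit_phenology(doy, ndvi):
        """Least-squares fit for one pixel's annual NDVI series (numpy arrays);
        the initial guesses below are generic, not the study's values."""
        p0 = [ndvi.min(), ndvi.max() - ndvi.min(), 120.0, 0.1, 280.0, 0.1]
        popt, _ = curve_fit(double_logistic, doy, ndvi, p0=p0, maxfev=10000)
        return popt   # green-up onset ~ popt[2], senescence onset ~ popt[4]
    ```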

  12. Performance Evaluation of IPTV Services on a WiMAX Testbed Network Based on the IEEE 802.16-2004 Standard

    Directory of Open Access Journals (Sweden)

    Prasetiyono Hari Mukti

    2015-09-01

    Full Text Available In this paper, a performance evaluation of IPTV services over a WiMAX testbed based on IEEE Standard 802.16-2004 is described. The performance of the proposed system is evaluated in terms of delay, jitter, throughput and packet loss. Service performance evaluations are conducted on a point-to-point network topology under varying background traffic with different scheduling types. Background traffic is injected into the system to emulate varying traffic loads on the proposed system. The scheduling types used in this paper are Best Effort (BE), Non-Real-Time Polling Service (nrtPS), Real-Time Polling Service (rtPS) and Unsolicited Grant Service (UGS). The experimental results for IPTV service performance over the testbed network show that the maximum averages of delay, jitter, throughput and packet loss are 16.581 ms, 58.515 ms, 0.67 Mbps and 10.96%, respectively.

  13. Early Examples from the Integrated Multi-Satellite Retrievals for GPM (IMERG)

    Science.gov (United States)

    Huffman, George; Bolvin, David; Braithwaite, Daniel; Hsu, Kuolin; Joyce, Robert; Kidd, Christopher; Sorooshian, Soroosh; Xie, Pingping

    2014-05-01

    The U.S. GPM Science Team's Day-1 algorithm for computing combined precipitation estimates as part of GPM is the Integrated Multi-satellitE Retrievals for GPM (IMERG). The goal is to compute the best time series of (nearly) global precipitation from "all" precipitation-relevant satellites and global surface precipitation gauge analyses. IMERG is being developed as a unified U.S. algorithm drawing on strengths in the three contributing groups, whose previous work includes: 1) the TRMM Multi-satellite Precipitation Analysis (TMPA); 2) the CPC Morphing algorithm with Kalman Filtering (K-CMORPH); and 3) the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks using a Cloud Classification System (PERSIANN-CCS). We review the IMERG design and development, plans for testing, and current status. Some of the lessons learned in running and reprocessing the previous data sets include the importance of quality-controlling input data sets, strategies for coping with transitions in the various input data sets, and practical approaches to retrospective analysis of multiple output products (namely the real- and post-real-time data streams). IMERG output will be illustrated using early test data, including the variety of supporting fields, such as the merged-microwave and infrared estimates, and the precipitation type. We end by considering recent changes in input data specifications, the transition from TRMM-based calibration to GPM-based, and further "Day 2" development.

  14. Validation and Intercomparison of Ocean Color Algorithms for Estimating Particulate Organic Carbon in the Oceans

    Directory of Open Access Journals (Sweden)

    Hayley Evers-King

    2017-08-01

    Full Text Available Particulate Organic Carbon (POC) plays a vital role in the ocean carbon cycle. Though relatively small compared with other carbon pools, the POC pool is responsible for large fluxes and is linked to many important ocean biogeochemical processes. The satellite ocean-color signal is influenced by particle composition, size, and concentration and provides a way to observe variability in the POC pool at a range of temporal and spatial scales. To provide accurate estimates of POC concentration from satellite ocean color data requires algorithms that are well validated, with uncertainties characterized. Here, a number of algorithms to derive POC using different optical variables are applied to merged satellite ocean color data provided by the Ocean Color Climate Change Initiative (OC-CCI) and validated against the largest database of in situ POC measurements currently available. The results of this validation exercise indicate satisfactory levels of performance from several algorithms (highest performance was observed from the algorithms of Loisel et al., 2002; Stramski et al., 2008) and uncertainties that are within the requirements of the user community. Estimates of the standing stock of the POC can be made by applying these algorithms, and yield an estimated mixed-layer integrated global stock of POC between 0.77 and 1.3 Pg C. Performance of the algorithms varies regionally, suggesting that blending of region-specific algorithms may provide the best way forward for generating global POC products.
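
    For reference, the blue-to-green band-ratio POC algorithm mentioned above takes a simple power-law form; the coefficients below are the ones commonly cited for Stramski et al. (2008) and should be verified against the original paper and the OC-CCI processing before any operational use.

    ```python
    def poc_band_ratio(rrs443, rrs555, A=203.2, B=-1.034):
        """Blue-to-green band-ratio POC estimate (mg C m-3), of the power-law
        form POC = A * (Rrs(443)/Rrs(555))**B. A and B are the commonly cited
        coefficients and are treated here as indicative only."""
        return A * (rrs443 / rrs555) ** B
    ```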

  15. Satellite Retrieval of Atmospheric Water Budget over Gulf of Mexico- Caribbean Basin: Seasonal Variability

    Science.gov (United States)

    Smith, Eric A.; Santos, Pablo; Einaudi, Franco (Technical Monitor)

    2001-01-01

    This study presents results from a multi-satellite/multi-sensor retrieval system designed to obtain the atmospheric water budget over the open ocean. A combination of hourly-sampled monthly datasets derived from the GOES-8 5 Imager and the DMSP 7-channel passive microwave radiometer (SSM/I) have been acquired for the Gulf of Mexico-Caribbean Sea basin. Whereas the methodology is being tested over this basin, the retrieval system is designed for portability to any open-ocean region. Algorithm modules using the different datasets to retrieve individual geophysical parameters needed in the water budget equation are designed in a manner that takes advantage of the high temporal resolution of the GOES-8 measurements, as well as the physical relationships inherent to the SSM/I passive microwave signals in conjunction with water vapor, cloud liquid water, and rainfall. The methodology consists of retrieving the precipitation, surface evaporation, and vapor-cloud water storage terms in the atmospheric water balance equation from satellite techniques, with the water vapor advection term being obtained as the residue needed for balance. Thus, we have sought to develop a purely satellite-based method for obtaining the full set of terms in the atmospheric water budget equation without requiring in situ sounding information on the wind profile. The algorithm is partly validated by first cross-checking all the algorithm components through multiple-algorithm retrieval intercomparisons. More fundamental validation is obtained by directly comparing water vapor transports into the targeted basin diagnosed from the satellite algorithm to those obtained observationally from a network of land-based upper air stations that nearly uniformly surround the basin. Total columnar atmospheric water budget results will be presented for an extended annual cycle consisting of the months of October-97, January-98, April-98, July-98, October-98, and January-1999. These results are used to emphasize

  16. Using Fuzzy SOM Strategy for Satellite Image Retrieval and Information Mining

    Directory of Open Access Journals (Sweden)

    Yo-Ping Huang

    2008-02-01

    Full Text Available This paper proposes an efficient satellite image retrieval and knowledge discovery model. The strategy comprises two major parts. First, a computational algorithm is used for off-line satellite image feature extraction, image data representation and image retrieval. Low level features are automatically extracted from the segmented regions of satellite images. A self-organization feature map is used to construct a two-layer satellite image concept hierarchy. The events are stored in one layer and the corresponding feature vectors are categorized in the other layer. Second, a user friendly interface is provided that retrieves images of interest and mines useful information based on the events in the concept hierarchy. The proposed system is evaluated with prominent features such as typhoons or high-pressure masses.

  17. Real-time remote diagnostic monitoring test-bed in JET

    International Nuclear Information System (INIS)

    Castro, R.; Kneupner, K.; Vega, J.; De Arcas, G.; Lopez, J.M.; Purahoo, K.; Murari, A.; Fonseca, A.; Pereira, A.; Portas, A.

    2010-01-01

    Based on the remote experimentation concept oriented to long pulse shots, a test-bed system has been implemented in JET. Its main functionality is the real-time monitoring, on remote, of a reflectometer diagnostic, to visualize different data outputs and status information. The architecture of the system is formed by: the data generator components, the data distribution system, an access control service, and the client applications. In the test-bed there is one data generator, which is the acquisition equipment associated with the reflectometer diagnostic that generates data and status information. The data distribution system has been implemented using a publishing-subscribing technology that receives data from data generators and redistributes them to client applications. And finally, for monitoring, a client application based on JAVA Web Start technology has been used. There are three interesting results from this project. The first one is the analysis of different aspects (data formats, data frame rate, data resolution, etc) related with remote real-time diagnostic monitoring oriented to long pulse experiments. The second one is the definition and implementation of an architecture, flexible enough to be applied to different types of data generated from other diagnostics, and that fits with remote access requirements. Finally, the third result is a secure system, taking into account internal networks and firewalls aspects of JET, and securing the access from remote users. For this last issue, PAPI technology has been used, enabling access control based on user attributes, enabling mobile users to monitor diagnostics in real-time, and enabling the integration of this service into the EFDA Federation (Castro et al., 2008 ).

  18. Real-time remote diagnostic monitoring test-bed in JET

    Energy Technology Data Exchange (ETDEWEB)

    Castro, R., E-mail: rodrigo.castro@ciemat.e [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain); Kneupner, K. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Vega, J. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain); De Arcas, G.; Lopez, J.M. [Universidad Politecnica de Madrid, Grupo I2A2, Madrid (Spain); Purahoo, K. [EURATOM/UKAEA Fusion Association, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom); Murari, A. [Associazione EURATOM-ENEA per la Fusione, Consorzio RFX, 4-35127 Padova (Italy); Fonseca, A. [Associacao EURATOM/IST, Lisbon (Portugal); Pereira, A.; Portas, A. [Asociacion EURATOM/CIEMAT para Fusion, Madrid (Spain)

    2010-07-15

    Based on the remote experimentation concept oriented to long pulse shots, a test-bed system has been implemented in JET. Its main functionality is the real-time monitoring, on remote, of a reflectometer diagnostic, to visualize different data outputs and status information. The architecture of the system is formed by: the data generator components, the data distribution system, an access control service, and the client applications. In the test-bed there is one data generator, which is the acquisition equipment associated with the reflectometer diagnostic that generates data and status information. The data distribution system has been implemented using a publishing-subscribing technology that receives data from data generators and redistributes them to client applications. And finally, for monitoring, a client application based on JAVA Web Start technology has been used. There are three interesting results from this project. The first one is the analysis of different aspects (data formats, data frame rate, data resolution, etc) related with remote real-time diagnostic monitoring oriented to long pulse experiments. The second one is the definition and implementation of an architecture, flexible enough to be applied to different types of data generated from other diagnostics, and that fits with remote access requirements. Finally, the third result is a secure system, taking into account internal networks and firewalls aspects of JET, and securing the access from remote users. For this last issue, PAPI technology has been used, enabling access control based on user attributes, enabling mobile users to monitor diagnostics in real-time, and enabling the integration of this service into the EFDA Federation (Castro et al., 2008 ).

  19. A MEMS-based Adaptive AHRS for Marine Satellite Tracking Antenna

    DEFF Research Database (Denmark)

    Wang, Yunlong; Hussain, Dil Muhammed Akbar; Soltani, Mohsen

    2015-01-01

    Satellite tracking is a challenging task for marine applications. An attitude determination system should estimate the wave disturbances on the ship body accurately. To achieve this, an Attitude Heading Reference System (AHRS) based on Micro-Electro-Mechanical Systems (MEMS) sensors, composed of three-axis gyroscope, accelerometer and magnetometer, is developed for Marine Satellite Tracking Antenna (MSTA). In this paper, the attitude determination algorithm is improved using an adaptive mechanism that tunes the attitude estimator parameters based on an estimation of ship motion frequency...
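
    The abstract does not give the estimator equations; as a point of reference, a single-axis complementary filter is sketched below, since its blending gain is the kind of parameter an adaptive scheme driven by an estimated ship-motion frequency could retune. Names and the default gain are illustrative.

    ```python
    import numpy as np

    def complementary_filter(gyro_rate, accel_angle, dt, alpha=0.98):
        """Blend integrated gyro rate (rad/s) with the accelerometer-derived
        tilt angle (rad) for one axis; alpha sets the crossover frequency and
        is what an adaptive mechanism would retune on-line. Illustrative only."""
        angle = accel_angle[0]
        out = []
        for w, a in zip(gyro_rate, accel_angle):
            angle = alpha * (angle + w * dt) + (1.0 - alpha) * a
            out.append(angle)
        return np.array(out)
    ```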

  20. Improving Satellite Quantitative Precipitation Estimation Using GOES-Retrieved Cloud Optical Depth

    Energy Technology Data Exchange (ETDEWEB)

    Stenz, Ronald; Dong, Xiquan; Xi, Baike; Feng, Zhe; Kuligowski, Robert J.

    2016-02-01

    To address significant gaps in ground-based radar coverage and rain gauge networks in the U.S., geostationary satellite quantitative precipitation estimates (QPEs) such as the Self-Calibrating Multivariate Precipitation Retrievals (SCaMPR) can be used to fill in both the spatial and temporal gaps of ground-based measurements. Additionally, with the launch of GOES-R, the temporal resolution of satellite QPEs may be comparable to that of Weather Service Radar-1988 Doppler (WSR-88D) volume scans as GOES images will be available every five minutes. However, while satellite QPEs have strengths in spatial coverage and temporal resolution, they face limitations particularly during convective events. Deep Convective Systems (DCSs) have large cloud shields with similar brightness temperatures (BTs) over nearly the entire system, but widely varying precipitation rates beneath these clouds. Geostationary satellite QPEs relying on the indirect relationship between BTs and precipitation rates often suffer from large errors because anvil regions (little/no precipitation) cannot be distinguished from rain-cores (heavy precipitation) using only BTs. However, a combination of BTs and optical depth (τ) has been found to reduce overestimates of precipitation in anvil regions (Stenz et al. 2014). A new rain mask algorithm incorporating both τ and BTs has been developed, and its application to the existing SCaMPR algorithm was evaluated. The performance of the modified SCaMPR was evaluated using traditional skill scores and a more detailed analysis of performance in individual DCS components by utilizing the Feng et al. (2012) classification algorithm. SCaMPR estimates with the new rain mask applied benefited from significantly reduced overestimates of precipitation in anvil regions and overall improvements in skill scores.
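
    A minimal sketch of a combined brightness-temperature/optical-depth rain mask of the kind described above; the thresholds are illustrative guesses, not the values tuned for SCaMPR.

    ```python
    import numpy as np

    def rain_mask(bt_k, tau, bt_max=235.0, tau_min=20.0):
        """Allow rain only where the cloud is both cold (deep convection) and
        optically thick (rain core rather than thin anvil). The 235 K and
        tau >= 20 thresholds are assumed values for illustration."""
        return (np.asarray(bt_k) <= bt_max) & (np.asarray(tau) >= tau_min)
    ```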

  1. Efficient retrieval of vegetation leaf area index and canopy clumping factor from satellite data to support pollutant deposition assessments

    International Nuclear Information System (INIS)

    Nikolov, Ned; Zeller, Karl

    2006-01-01

    Canopy leaf area index (LAI) is an important structural parameter of the vegetation controlling pollutant uptake by terrestrial ecosystems. This paper presents a computationally efficient algorithm for retrieval of vegetation LAI and canopy clumping factor from satellite data using observed Simple Ratios (SR) of near-infrared to red reflectance. The method employs numerical inversion of a physics-based analytical canopy radiative transfer model that simulates the bi-directional reflectance distribution function (BRDF). The algorithm is independent of ecosystem type. The method is applied to 1-km resolution AVHRR satellite images to retrieve a geo-referenced data set of monthly LAI values for the conterminous USA. Satellite-based LAI estimates are compared against independent ground LAI measurements over a range of ecosystem types. Verification results suggest that the new algorithm represents a viable approach to LAI retrieval at continental scale, and can facilitate spatially explicit studies of regional pollutant deposition and trace gas exchange. - The paper presents a physics-based algorithm for retrieval of vegetation LAI and canopy-clumping factor from satellite data to assist research of pollutant deposition and trace-gas exchange. The method is employed to derive a monthly LAI dataset for the conterminous USA and verified at a continental scale
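
    A minimal sketch of the inversion idea: a one-dimensional numerical root-find of a forward model that maps LAI (and a clumping factor) to the Simple Ratio; the forward model here is a hypothetical saturating curve standing in for the paper's BRDF model, and all coefficients are placeholders.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    def simple_ratio_model(lai, clumping, sr_bare=1.5, sr_inf=12.0, k=0.5):
        """Hypothetical forward model: SR saturates exponentially with the
        effective LAI (clumping * LAI). Placeholder for the BRDF model."""
        return sr_inf - (sr_inf - sr_bare) * np.exp(-k * clumping * lai)

    def invert_lai(sr_obs, clumping=0.7, lai_max=10.0):
        """Invert the forward model for LAI by bracketing the root in
        [0, lai_max]; returns NaN if the observation is outside the model range."""
        f = lambda lai: simple_ratio_model(lai, clumping) - sr_obs
        if f(0.0) * f(lai_max) > 0:
            return np.nan
        return brentq(f, 0.0, lai_max)
    ```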

  2. Efficient retrieval of vegetation leaf area index and canopy clumping factor from satellite data to support pollutant deposition assessments

    Energy Technology Data Exchange (ETDEWEB)

    Nikolov, Ned [Natural Resource Research Center, 2150 Centre Avenue, Building A, Room 368, Fort Collins, CO 80526 (United States)]. E-mail: nnikolov@fs.fed.us; Zeller, Karl [USDA FS Rocky Mountain Research Station, 240 W. Prospect Road, Fort Collins, CO 80526 (United States)]. E-mail: kzeller@fs.fed.us

    2006-06-15

    Canopy leaf area index (LAI) is an important structural parameter of the vegetation controlling pollutant uptake by terrestrial ecosystems. This paper presents a computationally efficient algorithm for retrieval of vegetation LAI and canopy clumping factor from satellite data using observed Simple Ratios (SR) of near-infrared to red reflectance. The method employs numerical inversion of a physics-based analytical canopy radiative transfer model that simulates the bi-directional reflectance distribution function (BRDF). The algorithm is independent of ecosystem type. The method is applied to 1-km resolution AVHRR satellite images to retrieve a geo-referenced data set of monthly LAI values for the conterminous USA. Satellite-based LAI estimates are compared against independent ground LAI measurements over a range of ecosystem types. Verification results suggest that the new algorithm represents a viable approach to LAI retrieval at continental scale, and can facilitate spatially explicit studies of regional pollutant deposition and trace gas exchange. - The paper presents a physics-based algorithm for retrieval of vegetation LAI and canopy-clumping factor from satellite data to assist research of pollutant deposition and trace-gas exchange. The method is employed to derive a monthly LAI dataset for the conterminous USA and verified at a continental scale.

  3. First results of the Test-Bed Telescopes (TBT) project: Cebreros telescope commissioning

    Science.gov (United States)

    Ocaña, Francisco; Ibarra, Aitor; Racero, Elena; Montero, Ángel; Doubek, Jirí; Ruiz, Vicente

    2016-07-01

    The TBT project is being developed under ESA's General Studies and Technology Programme (GSTP), and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario within the Space Situational Awareness (SSA) programme of the European Space Agency (ESA). The goal of the project is to provide two fully robotic telescopes, which will serve as prototypes for development of a future network. The system consists of two telescopes, one in Spain and the second one in the Southern Hemisphere. The telescope is a fast astrograph with a large Field of View (FoV) of 2.5 x 2.5 square-degrees and a plate scale of 2.2 arcsec/pixel. The tube is mounted on a fast direct-drive mount moving at speeds of up to 20 degrees per second. The focal plane hosts a 2-port 4K x 4K back-illuminated CCD with readout speeds up to 1 MHz per port. All these characteristics ensure good survey performance for transients and fast-moving objects. Detection software and hardware are optimised for the detection of NEOs and objects in high Earth orbits (objects moving from 0.1-40 arcsec/second). Nominal exposures are in the range from 2 to 30 seconds, depending on the observational strategy. Part of the validation scenario involves the scheduling concept integrated in the robotic operations for both sensors. Every night the scheduler takes all the required inputs and prepares a schedule, following predefined rules to allocate tasks to the telescopes. Telescopes are managed by the RTS2 control software, which performs the real-time scheduling of the observation and manages all the devices at the observatory. At the end of the night the observing systems report astrometric positions and photometry of the objects detected. The first telescope was installed in Cebreros Satellite Tracking Station in mid-2015. It is currently in the commissioning phase and we present here the first results of the telescope. We evaluate the site characteristics and the performance of the TBT Cebreros

  4. Real-time maneuver optimization of space-based robots in a dynamic environment: Theory and on-orbit experiments

    Science.gov (United States)

    Chamitoff, Gregory E.; Saenz-Otero, Alvar; Katz, Jacob G.; Ulrich, Steve; Morrell, Benjamin J.; Gibbens, Peter W.

    2018-01-01

    This paper presents the development of a real-time path-planning optimization approach to controlling the motion of space-based robots. The algorithm is capable of planning three dimensional trajectories for a robot to navigate within complex surroundings that include numerous static and dynamic obstacles, path constraints and performance limitations. The methodology employs a unique transformation that enables rapid generation of feasible solutions for complex geometries, making it suitable for application to real-time operations and dynamic environments. This strategy was implemented on the Synchronized Position Hold Engage Reorient Experimental Satellite (SPHERES) test-bed on the International Space Station (ISS), and experimental testing was conducted onboard the ISS during Expedition 17 by the first author. Lessons learned from the on-orbit tests were used to further refine the algorithm for future implementations.

  5. Coral-based Proxy Records of Ocean Acidification: A Pilot Study at the Puerto Rico Test-bed Site

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Coral cores collected nearby the Atlantic Ocean Acidification Test-bed (AOAT) at La Parguera, Puerto Rico were used to characterize the relationship between...

  6. 3D reconstruction from multi-view VHR-satellite images in MicMac

    Science.gov (United States)

    Rupnik, Ewelina; Pierrot-Deseilligny, Marc; Delorme, Arthur

    2018-05-01

    This work addresses the generation of high-quality digital surface models by fusing multiple depth maps calculated with the dense image matching method. The algorithm is adapted to very high resolution multi-view satellite images, and the main contributions of this work are in the multi-view fusion. The algorithm is insensitive to outliers, takes into account the matching quality indicators, handles non-correlated zones (e.g. occlusions), and is solved with a multi-directional dynamic programming approach. No geometric constraints (e.g. surface planarity) or auxiliary data in the form of ground control points are required for its operation. Prior to the fusion procedures, the RPC geolocation parameters of all images are improved in a bundle block adjustment routine. The performance of the algorithm is evaluated on two VHR (Very High Resolution)-satellite image datasets (Pléiades, WorldView-3), revealing its good performance in reconstructing non-textured areas, repetitive patterns, and surface discontinuities.

  7. Development and Validation of Improved Techniques for Cloud Property Retrieval from Environmental Satellites

    National Research Council Canada - National Science Library

    Gustafson, Gary

    2000-01-01

    ...) develop extensible cloud property retrieval algorithms suitable for expanding existing cloud analysis capabilities to utilize data from new and future environmental satellite sensing systems; (2...

  8. The Living With a Star Program Space Environment Testbed

    Science.gov (United States)

    Barth, Janet; Day, John H. (Technical Monitor)

    2001-01-01

    This viewgraph presentation describes the objective, approach, and scope of the Living With a Star (LWS) program at the Marshall Space Flight Center. Scientists involved in the project seek to refine the understanding of space weather and the role of solar variability in terrestrial climate change. Research and the development of improved analytic methods have led to increased predictive capabilities and the improvement of environment specification models. Specifically, the Space Environment Testbed (SET) project of LWS is responsible for the implementation of improved engineering approaches to observing solar effects on climate change. This responsibility includes technology development, ground test protocol development, and the development of a technology application model/engineering tool.

  9. Smart Grid: Network simulator for smart grid test-bed

    International Nuclear Information System (INIS)

    Lai, L C; Ong, H S; Che, Y X; Do, N Q; Ong, X J

    2013-01-01

    As the Smart Grid becomes more popular, a smaller-scale smart grid test-bed has been set up at UNITEN to investigate its performance and to identify future enhancements of the smart grid in Malaysia. The fundamental requirement in this project is to design a network with low delay, no packet drops and a high data rate. Each type of traffic has its own characteristics and is suited to different types of network and requirements. However, the nature of traffic in a smart grid is not yet well understood. This paper presents a comparison between different types of traffic to find the most suitable traffic for optimal network performance.

  10. Multi-Satellite Observation Scheduling for Large Area Disaster Emergency Response

    Science.gov (United States)

    Niu, X. N.; Tang, H.; Wu, L. X.

    2018-04-01

    an optimal imaging plan, plays a key role in coordinating multiple satellites to monitor the disaster area. In this paper, to generate an imaging plan dynamically as disaster relief proceeds, we propose a dynamic satellite task scheduling method for large-area disaster response. First, an initial robust scheduling scheme is generated by a robust satellite scheduling model in which both the profit and the robustness of the schedule are simultaneously maximized. Then, we use a multi-objective optimization model to obtain a series of decomposition schemes. Based on the initial imaging plan, we propose a mixed optimization algorithm named HA_NSGA-II to allocate the decomposition results and thus obtain an adjusted imaging schedule. A real disaster scenario, i.e., the 2008 Wenchuan earthquake, is revisited in terms of rapid response using satellite resources and used to evaluate the performance of the proposed method against state-of-the-art approaches. We conclude that our satellite scheduling model can optimize the usage of satellite resources so as to obtain images for disaster response in a more timely and efficient manner.

  11. Research on Coal Exploration Technology Based on Satellite Remote Sensing

    Directory of Open Access Journals (Sweden)

    Dong Xiao

    2016-01-01

    Full Text Available Coal is the main source of energy. In China and Vietnam, coal resources are very rich, but the level of exploration is relatively low. This is mainly due to complicated geological structures, low efficiency, associated damage, and other unfavorable conditions. To this end, advanced technologies are needed to ensure that resource exploration proceeds smoothly and in an orderly manner. Numerous studies show that remote sensing is an effective technology for coal exploration and measurement. In this paper, we attempt to measure the distribution and reserves of an open-air coal area through satellite imagery. A satellite image of an open-air coal mining region in Quang Ninh Province, Vietnam, was collected as the experimental data. Firstly, the ENVI software is used to eliminate spectral interference in the satellite imagery. Then, an image classification model is built with the improved ELM algorithm. Finally, the effectiveness of the improved ELM algorithm is verified using MATLAB simulations. The results show that the accuracy on the testing set reaches 96.5%, and the image discernment precision reaches 83% when compared with the same image from Google.
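
    The record refers to an "improved ELM algorithm" for image classification. The following sketch shows only a basic Extreme Learning Machine (random hidden layer, closed-form output weights), assuming ELM stands for Extreme Learning Machine here; it is not the improved variant or the MATLAB implementation used in the paper.

```python
import numpy as np

class BasicELM:
    """Minimal Extreme Learning Machine classifier: random hidden layer,
    output weights solved in closed form with the pseudoinverse."""

    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        T = np.eye(n_classes)[y]                           # one-hot targets
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        self.beta = np.linalg.pinv(self._hidden(X)) @ T    # closed-form solve
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)

# Toy usage with random "pixel spectra" and two classes (coal / non-coal).
X = np.random.default_rng(1).normal(size=(200, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
model = BasicELM(n_hidden=50).fit(X, y)
print((model.predict(X) == y).mean())   # training accuracy of the sketch
```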

  12. Use of GOES, SSM/I, TRMM Satellite Measurements Estimating Water Budget Variations in Gulf of Mexico - Caribbean Sea Basins

    Science.gov (United States)

    Smith, Eric A.

    2004-01-01

    This study presents results from a multi-satellite/multi-sensor retrieval system designed to obtain the atmospheric water budget over the open ocean. A combination of hourly-sampled monthly datasets derived from the GOES-8 5-channel Imager, the TRMM TMI radiometer, and the DMSP 7-channel passive microwave radiometers (SSM/I) has been acquired for the combined Gulf of Mexico-Caribbean Sea basin. Whereas the methodology has been tested over this basin, the retrieval system is designed for portability to any open-ocean region. Algorithm modules using the different datasets to retrieve individual geophysical parameters needed in the water budget equation are designed in a manner that takes advantage of the high temporal resolution of the GOES-8 measurements, as well as the physical relationships inherent to the TRMM and SSM/I passive microwave measurements in conjunction with water vapor, cloud liquid water, and rainfall. The methodology consists of retrieving the precipitation, surface evaporation, and vapor-cloud water storage terms in the atmospheric water balance equation from satellite techniques, with the water vapor advection term being obtained as the residue needed for balance. Thus, the intent is to develop a purely satellite-based method for obtaining the full set of terms in the atmospheric water budget equation without requiring in situ sounding information on the wind profile. The algorithm is validated by cross-checking all the algorithm components through multiple-algorithm retrieval intercomparisons. A further check on the validation is obtained by directly comparing water vapor transports into the targeted basin diagnosed from the satellite algorithms to those obtained observationally from a network of land-based upper air stations that nearly uniformly surround the basin, although it is fair to say that these checks are more effective in identifying problems in estimating vapor transports from a leaky operational radiosonde network than in verifying
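
    For reference, the vertically integrated atmospheric water balance that the retrieval scheme works with can be written in a commonly used generic form (a standard expression, not a quotation from the paper), with the transport term recovered as the residual once the other terms are estimated from satellite data:

```latex
% W : precipitable water plus cloud water storage, E : surface evaporation,
% P : precipitation, Q : vertically integrated vapor/cloud water transport.
\frac{\partial W}{\partial t} + \nabla \cdot \mathbf{Q} = E - P
% With P, E and dW/dt estimated from satellite techniques, the advection
% (transport divergence) term follows as the residual needed for balance:
\nabla \cdot \mathbf{Q} = E - P - \frac{\partial W}{\partial t}
```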

  13. Meteosat SEVIRI Fire Radiative Power (FRP) products from the Land Surface Analysis Satellite Applications Facility (LSA SAF) - Part 1: Algorithms, product contents and analysis

    Science.gov (United States)

    Wooster, M. J.; Roberts, G.; Freeborn, P. H.; Xu, W.; Govaerts, Y.; Beeby, R.; He, J.; Lattanzio, A.; Mullen, R.

    2015-06-01

    Characterising changes in landscape-scale fire activity at very high temporal resolution is best achieved using thermal observations of actively burning fires made from geostationary Earth observation (EO) satellites. Over the last decade or more, a series of research and/or operational "active fire" products have been developed from these types of geostationary observations, often with the aim of supporting the generation of data related to biomass burning fuel consumption and trace gas and aerosol emission fields. The Fire Radiative Power (FRP) products generated by the Land Surface Analysis Satellite Applications Facility (LSA SAF) from data collected by the Meteosat Second Generation (MSG) Spinning Enhanced Visible and Infrared Imager (SEVIRI) are one such set of products, and are freely available in both near real-time and archived form. Every 15 min, the algorithms used to generate these products identify and map the location of new SEVIRI observations containing actively burning fires, and characterise their individual rates of radiative energy release (fire radiative power; FRP), which is believed to be proportional to rates of biomass consumption and smoke emission. The FRP-PIXEL product contains the highest spatial resolution FRP dataset, delivered for all of Europe, northern and southern Africa, and part of South America at a spatial resolution of 3 km (decreasing away from the west African sub-satellite point) at the full 15 min temporal resolution. The FRP-GRID product is an hourly summary of the FRP-PIXEL data, produced at a 5° grid cell size and including simple bias adjustments for meteorological cloud cover and for the regional underestimation of FRP caused, primarily, by the non-detection of low FRP fire pixels at SEVIRI's relatively coarse pixel size. Here we describe the enhanced geostationary Fire Thermal Anomaly (FTA) algorithm used to detect the SEVIRI active fire pixels, and detail methods used to deliver atmospherically corrected FRP information
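
    As context for the FRP values discussed above, one widely cited way to estimate fire radiative power from a middle-infrared (MIR) channel is the MIR radiance method sketched below; the empirical constant a is sensor- and band-dependent, so the expression is illustrative rather than the exact formulation used in the LSA SAF processing chain.

```latex
% MIR radiance method for FRP (a widely used form; the empirical constant a
% depends on the sensor and spectral band, so no numerical value is given here):
FRP \approx \frac{A_{pix}\,\sigma}{a}\,\left(L_{MIR} - L_{MIR,bg}\right)
% A_pix     : ground area of the fire pixel
% \sigma    : Stefan-Boltzmann constant
% L_{MIR}   : MIR radiance of the fire pixel
% L_{MIR,bg}: background (fire-free) MIR radiance
```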

  14. Optimizing Electric Vehicle Coordination Over a Heterogeneous Mesh Network in a Scaled-Down Smart Grid Testbed

    DEFF Research Database (Denmark)

    Bhattarai, Bishnu Prasad; Lévesque, Martin; Maier, Martin

    2015-01-01

    High penetration of renewable energy sources and electric vehicles (EVs) creates power imbalance and congestion in the existing power network, and hence causes significant problems in control and operation. Despite huge efforts invested by electric utilities, governments, and researchers, the smart grid (SG) is still at the developmental stage to address those issues. In this regard, a smart grid testbed (SGT) is desirable to develop, analyze, and demonstrate various novel SG solutions, namely demand response, real-time pricing, and congestion management. In this paper, a novel SGT is developed in a laboratory by scaling a 250 kVA, 0.4 kV real low-voltage distribution feeder down to 1 kVA, 0.22 kV. Information and communication technology is integrated in the scaled-down network to establish real-time monitoring and control. The novelty of the developed testbed is demonstrated......

  15. Semi-empirical Algorithm for the Retrieval of Ecology-Relevant Water Constituents in Various Aquatic Environments

    Directory of Open Access Journals (Sweden)

    Robert Shuchman

    2009-03-01

    Full Text Available An advanced operational semi-empirical algorithm for processing satellite remote sensing data in the visible region is described. Based on the Levenberg-Marquardt multivariate optimization procedure, the algorithm is developed for retrieving major water colour producing agents: chlorophyll-a, suspended minerals and dissolved organics. Two assurance units incorporated by the algorithm are intended to flag pixels with inaccurate atmospheric correction and specific hydro-optical properties not covered by the applied hydro-optical model. The hydro-optical model is a set of spectral cross-sections of absorption and backscattering of the colour producing agents. The combination of the optimization procedure and a replaceable hydro-optical model makes the developed algorithm not specific to a particular satellite sensor or a water body. The algorithm performance efficiency is amply illustrated for SeaWiFS, MODIS and MERIS images over a variety of water bodies.
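
    A minimal sketch of the kind of Levenberg-Marquardt retrieval described above, assuming a toy reflectance model R ≈ b_b/(a + b_b) and invented spectral cross-sections in place of the paper's replaceable hydro-optical model; it only illustrates the optimization structure, not the operational algorithm or its assurance units.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical spectral cross-sections (absorption A, backscattering B) of the
# three colour-producing agents in four visible bands.
A = np.array([[0.040, 0.020, 0.100],
              [0.025, 0.020, 0.060],
              [0.010, 0.025, 0.030],
              [0.005, 0.030, 0.010]])
B = np.array([[0.002, 0.010, 0.0005],
              [0.002, 0.011, 0.0004],
              [0.003, 0.012, 0.0003],
              [0.003, 0.013, 0.0002]])
AW = np.array([0.02, 0.04, 0.06, 0.30])      # pure-water absorption
BW = np.array([0.004, 0.003, 0.002, 0.001])  # pure-water backscattering

def forward(c):
    """Toy reflectance model R ~ b_b / (a + b_b) for concentrations
    c = [chlorophyll-a, suspended minerals, dissolved organics]."""
    a, bb = AW + A @ c, BW + B @ c
    return bb / (a + bb)

def retrieve(r_obs, c0=np.ones(3)):
    """Levenberg-Marquardt fit of the three concentrations to observed reflectance."""
    return least_squares(lambda c: forward(c) - r_obs, c0, method="lm").x

c_true = np.array([2.0, 5.0, 1.0])               # arbitrary units
print(np.round(retrieve(forward(c_true)), 2))     # recovers c_true in this noise-free test
```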

  16. Telescience testbed: Operational support functions for biomedical experiments

    Science.gov (United States)

    Yamashita, Masamichi; Watanabe, Satoru; Shoji, Takatoshi; Clarke, Andrew H.; Suzuki, Hiroyuki; Yanagihara, Dai

    A telescience testbed exercise was conducted to study the methodology of space biomedicine under the simulated constraints imposed on space experiments. The experimental task selected for this testbedding was elaborate animal surgery and electrophysiological measurements conducted by an onboard operator. The standing potential in the ampulla of the pigeon's semicircular canal was measured during gravitational and caloric stimulation. A principal investigator, isolated from the operation site, participated in the experiment interactively through telecommunication links. Reliability analysis was applied to all layers of the experimentation, including the design of experimental objectives and operational procedures. Engineering and technological aspects of telescience are discussed in terms of reliability to assure the quality of science. The feasibility of robotics was examined for supportive functions to reduce the workload of the onboard operator.

  17. Towards an integrated strategy for monitoring wetland inundation with virtual constellations of optical and radar satellites

    Science.gov (United States)

    DeVries, B.; Huang, W.; Huang, C.; Jones, J. W.; Lang, M. W.; Creed, I. F.; Carroll, M.

    2017-12-01

    The function of wetlandscapes in hydrological and biogeochemical cycles is largely governed by surface inundation, with small wetlands that experience periodic inundation playing a disproportionately large role in these processes. However, the spatial distribution and temporal dynamics of inundation in these wetland systems are still poorly understood, resulting in large uncertainties in global water, carbon and greenhouse gas budgets. Satellite imagery provides synoptic and repeat views of the Earth's surface and presents opportunities to fill this knowledge gap. Despite the proliferation of Earth Observation satellite missions in the past decade, no single satellite sensor can simultaneously provide the spatial and temporal detail needed to adequately characterize inundation in small, dynamic wetland systems. Surface water data products must therefore integrate observations from multiple satellite sensors in order to address this objective, requiring the development of improved and coordinated algorithms to generate consistent estimates of surface inundation. We present a suite of algorithms designed to detect surface inundation in wetlands using data from a virtual constellation of optical and radar sensors comprising the Landsat and Sentinel missions (DeVries et al., 2017). Both optical and radar algorithms were able to detect inundation in wetlands without the need for external training data, allowing for high-efficiency monitoring of wetland inundation at large spatial and temporal scales. Applying these algorithms across a gradient of wetlands in North America, preliminary findings suggest that while these fully automated algorithms can detect wetland inundation at higher spatial and temporal resolutions than currently available surface water data products, limitations specific to the satellite sensors and their acquisition strategies are responsible for uncertainties in inundation estimates. Further research is needed to investigate strategies for

  18. Rain detection over land surfaces using passive microwave satellite data

    NARCIS (Netherlands)

    Bauer, P.; Burose, D.; Schulz, J.

    2002-01-01

    An algorithm is presented for the detection of surface rainfall using passive microwave measurements by satellite radiometers. The technique consists of a two-stage approach to distinguish precipitation signatures from other effects: (1) Contributions from slowly varying parameters (surface type and

  19. A Real-Time GPP Software-Defined Radio Testbed for the Physical Layer of Wireless Standards

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.

    2005-01-01

    We present our contribution to the general-purpose-processor-(GPP)-based radio. We describe a baseband software-defined radio testbed for the physical layer of wireless LAN standards. All physical layer functions have been successfully mapped on a Pentium 4 processor that performs these functions in

  20. Arctic sea ice albedo - A comparison of two satellite-derived data sets

    Science.gov (United States)

    Schweiger, Axel J.; Serreze, Mark C.; Key, Jeffrey R.

    1993-01-01

    Spatial patterns of mean monthly surface albedo for May, June, and July, derived from DMSP Operational Line Scan (OLS) satellite imagery are compared with surface albedos derived from the International Satellite Cloud Climatology Program (ISCCP) monthly data set. Spatial patterns obtained by the two techniques are in general agreement, especially for June and July. Nevertheless, systematic differences in albedo of 0.05 - 0.10 are noted which are most likely related to uncertainties in the simple parameterizations used in the DMSP analyses, problems in the ISCCP cloud-clearing algorithm and other modeling simplifications. However, with respect to the eventual goal of developing a reliable automated retrieval algorithm for compiling a long-term albedo data base, these initial comparisons are very encouraging.

  1. Prediction of the Sun-Glint Locations for the Communication, Ocean and Meteorological Satellite

    Directory of Open Access Journals (Sweden)

    Jae-Ik Park

    2005-09-01

    Full Text Available For the Communication, Ocean and Meteorological Satellite (COMS), which will be launched in 2008, an algorithm for finding the precise location of the sun-glint point on the ocean surface is studied. The precise locations of the sun-glint are estimated by considering the azimuth and elevation angles of the Sun-satellite-Earth geometric position and the law of reflection. The obtained nonlinear equations are solved by using the Newton-Raphson method. As a result, when COMS is located at 116.2°E or 128.2°E longitude, the sun-glint covers a region of ±10° (N-S) latitude and 80-150° (E-W) longitude. The diurnal path of the sun-glint in the southern hemisphere is curved towards the North Pole, and the path in the northern hemisphere is curved towards the South Pole. The algorithm presented in this paper can be applied to predict the precise location of the sun-glint region for any other geostationary satellite.
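
    A highly simplified sketch of the reflection-law root finding mentioned above, restricted to an equatorial-plane geometry so that a single longitude is solved for with Newton-Raphson; the full algorithm works in three dimensions with the actual Sun-satellite-Earth geometry, so the function below is illustrative only.

```python
import numpy as np

R_EARTH, R_GEO = 6378.137, 42164.0          # km

def glint_longitude(lon_sun_deg, lon_sat_deg, tol=1e-10, max_iter=50):
    """Newton-Raphson search for the sun-glint longitude, assuming the Sun,
    the satellite and the glint point all lie in the equatorial plane."""
    lon_sun, lon_sat = np.radians([lon_sun_deg, lon_sat_deg])
    u_sun = np.array([np.cos(lon_sun), np.sin(lon_sun)])     # Sun treated as being at infinity
    sat = R_GEO * np.array([np.cos(lon_sat), np.sin(lon_sat)])

    def f(lam):
        p = R_EARTH * np.array([np.cos(lam), np.sin(lam)])   # candidate surface point
        n = p / R_EARTH                                      # outward surface normal
        u_sat = (sat - p) / np.linalg.norm(sat - p)
        return n @ u_sun - n @ u_sat                         # equal-angle (specular) condition

    lam = 0.5 * (lon_sun + lon_sat)      # start between sub-solar and sub-satellite points
    for _ in range(max_iter):
        h = 1e-7
        step = f(lam) / ((f(lam + h) - f(lam - h)) / (2 * h))   # numerical derivative
        lam -= step
        if abs(step) < tol:
            break
    return np.degrees(lam)

# Sun over 100 deg E, COMS-like satellite at 128.2 deg E:
print(round(glint_longitude(100.0, 128.2), 3))
```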

  2. A Method to Derive Monitoring Variables for a Cyber Security Test-bed of I and C System

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kyung Soo; Song, Jae Gu; Lee, Joung Woon; Lee, Cheol Kwon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    In the IT field, monitoring techniques have been developed to protect the systems connected by networks from cyber attacks and incidents. For the development of monitoring systems for I and C cyber security, it is necessary to review the monitoring systems in the IT field and derive cyber security-related monitoring variables from the proprietary operating information about the I and C systems. Tests for the development and application of these monitoring systems may cause adverse effects on the I and C systems. To analyze influences on the system and safely derive the intended variables, the construction of an I and C system test-bed should come first. This article proposes a method of deriving variables that should be monitored through a monitoring system for cyber security as a part of an I and C test-bed. The surveillance features and the monitored variables of NMS (Network Management System), a monitoring technique in the IT field, are reviewed in Section 2. In Section 3, the monitoring variables for I and C cyber security are derived from the review of NMS and an investigation of the information used in hacking techniques that can be employed against I and C systems. The monitoring variables of NMS in the IT field and the information about the malicious behaviors used for hacking were derived as expected variables to be monitored for I and C cyber security research. The derived monitoring variables were classified into the five functions of NMS for efficient management. For the cyber security of I and C systems, the vulnerabilities should be understood through penetration tests, etc., and an assessment of influences on the actual system should be carried out. Thus, constructing a test-bed of I and C systems is necessary for the safety system in operation. In the future, it will be necessary to develop a logging and monitoring system for studies on the vulnerabilities of I and C systems with test-beds.

  3. NASA Langley's AirSTAR Testbed: A Subscale Flight Test Capability for Flight Dynamics and Control System Experiments

    Science.gov (United States)

    Jordan, Thomas L.; Bailey, Roger M.

    2008-01-01

    As part of the Airborne Subscale Transport Aircraft Research (AirSTAR) project, NASA Langley Research Center (LaRC) has developed a subscaled flying testbed in order to conduct research experiments in support of the goals of NASA's Aviation Safety Program. This research capability consists of three distinct components. The first of these is the research aircraft, of which there are several in the AirSTAR stable. These aircraft range from a dynamically-scaled, twin turbine vehicle to a propeller driven, off-the-shelf airframe. Each of these airframes carves out its own niche in the research test program. All of the airplanes have sophisticated on-board data acquisition and actuation systems, recording, telemetering, processing, and/or receiving data from research control systems. The second piece of the testbed is the ground facilities, which encompass the hardware and software infrastructure necessary to provide comprehensive support services for conducting flight research using the subscale aircraft, including: subsystem development, integrated testing, remote piloting of the subscale aircraft, telemetry processing, experimental flight control law implementation and evaluation, flight simulation, data recording/archiving, and communications. The ground facilities comprise two major components: (1) The Base Research Station (BRS), a LaRC laboratory facility for system development, testing and data analysis, and (2) The Mobile Operations Station (MOS), a self-contained, motorized vehicle serving as a mobile research command/operations center, functionally equivalent to the BRS, capable of deployment to remote sites for supporting flight tests. The third piece of the testbed is the test facility itself. Research flights carried out by the AirSTAR team are conducted at NASA Wallops Flight Facility (WFF) on the Eastern Shore of Virginia. The UAV Island runway is a 50 x 1500 paved runway that lies within restricted airspace at Wallops Flight Facility. The

  4. A Method to Derive Monitoring Variables for a Cyber Security Test-bed of I and C System

    International Nuclear Information System (INIS)

    Han, Kyung Soo; Song, Jae Gu; Lee, Joung Woon; Lee, Cheol Kwon

    2013-01-01

    In the IT field, monitoring techniques have been developed to protect the systems connected by networks from cyber attacks and incidents. For the development of monitoring systems for I and C cyber security, it is necessary to review the monitoring systems in the IT field and derive cyber security-related monitoring variables from the proprietary operating information about the I and C systems. Tests for the development and application of these monitoring systems may cause adverse effects on the I and C systems. To analyze influences on the system and safely derive the intended variables, the construction of an I and C system test-bed should come first. This article proposes a method of deriving variables that should be monitored through a monitoring system for cyber security as a part of an I and C test-bed. The surveillance features and the monitored variables of NMS (Network Management System), a monitoring technique in the IT field, are reviewed in Section 2. In Section 3, the monitoring variables for I and C cyber security are derived from the review of NMS and an investigation of the information used in hacking techniques that can be employed against I and C systems. The monitoring variables of NMS in the IT field and the information about the malicious behaviors used for hacking were derived as expected variables to be monitored for I and C cyber security research. The derived monitoring variables were classified into the five functions of NMS for efficient management. For the cyber security of I and C systems, the vulnerabilities should be understood through penetration tests, etc., and an assessment of influences on the actual system should be carried out. Thus, constructing a test-bed of I and C systems is necessary for the safety system in operation. In the future, it will be necessary to develop a logging and monitoring system for studies on the vulnerabilities of I and C systems with test-beds.

  5. Derivation and evaluation of land surface temperature from the geostationary operational environmental satellite series

    Science.gov (United States)

    Fang, Li

    The Geostationary Operational Environmental Satellites (GOES) have been continuously monitoring the Earth's surface since 1970, providing valuable and intensive data from a very broad range of wavelengths, day and night. The National Oceanic and Atmospheric Administration's (NOAA's) National Environmental Satellite, Data, and Information Service (NESDIS) is currently operating GOES-15 and GOES-13. The design of the GOES series is now heading to the 4th generation. GOES-R, as a representative of the new generation of the GOES series, is scheduled to be launched in 2015 with higher spatial and temporal resolution images and full-time soundings. These frequent observations provided by the GOES Imager make them attractive for deriving information on the diurnal land surface temperature (LST) cycle and diurnal temperature range (DTR). These parameters are of great value for research on the Earth's diurnal variability and climate change. Accurate derivation of satellite-based LSTs from thermal infrared data has long been an interesting and challenging research area. To better support the research on climate change, the generation of consistent GOES LST products for both GOES-East and GOES-West from operational datasets as well as the historical archive is in great demand. The derivation of GOES LST products and the evaluation of proposed retrieval methods are two major objectives of this study. Literature relevant to satellite-based LST retrieval techniques was reviewed. Specifically, the evolution of two LST algorithm families and LST retrieval methods for geostationary satellites was summarized in this dissertation. Literature relevant to the evaluation of satellite-based LSTs was also reviewed. All the existing methods provide a valuable reference for developing the GOES LST product. The primary objective of this dissertation is the development of models for deriving consistent GOES LSTs with high spatial and high temporal coverage. Proper LST retrieval algorithms were studied
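
    Among the LST algorithm families reviewed in such work, a frequently used generalized split-window form is shown below for orientation; the coefficients are sensor- and regime-specific and are not taken from this dissertation.

```latex
% Generalized split-window form often used for geostationary LST retrieval.
% T_11, T_12 : brightness temperatures of the two thermal window bands,
% epsilon    : mean band emissivity, Delta-epsilon : emissivity difference,
% A_i, B_i, C: regression coefficients (sensor- and regime-specific, illustrative here).
LST = C
    + \left(A_1 + A_2\frac{1-\varepsilon}{\varepsilon}
            + A_3\frac{\Delta\varepsilon}{\varepsilon^{2}}\right)\frac{T_{11}+T_{12}}{2}
    + \left(B_1 + B_2\frac{1-\varepsilon}{\varepsilon}
            + B_3\frac{\Delta\varepsilon}{\varepsilon^{2}}\right)\frac{T_{11}-T_{12}}{2}
```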

  6. A Rapid Orbit Integration Algorithm for Multi-GNSS Satellites

    Institute of Scientific and Technical Information of China (English)

    范磊; 李敏; 宋伟伟; 施闯; 王成

    2016-01-01

    A rapid and efficient orbit numerical integration algorithm with high accuracy is needed in multi-GNSS rapid precise orbit determination. In order to improve the computational efficiency, an adaptive step-changing Adams integration method and a synchronous integration algorithm for multi-GNSS satellites are developed in this paper. To validate the precision and efficiency of the proposed method, the multi-GNSS precise orbit products calculated by Wuhan University (WHU) and the Center for Orbit Determination in Europe (CODE) are used for orbit fitting. Results show that the average 3D RMS of GPS, GLONASS, BDS and Galileo satellites is below 20 mm. Compared with the traditional fixed-step orbit integration method applied to each satellite separately, the computational efficiency of the proposed method is improved significantly: without degrading the accuracy, it takes only 0.09 s for a single satellite, which is 14 times faster than the traditional method. Moreover, the efficiency gain grows as the number of satellites increases.
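
    The synchronous-integration idea, i.e., propagating all satellites together rather than one at a time, can be illustrated with an off-the-shelf adaptive multistep integrator and a stacked state vector; the sketch below uses simple two-body dynamics and is not the paper's adaptive-step Adams implementation.

```python
import numpy as np
from scipy.integrate import solve_ivp

MU = 398600.4418  # km^3/s^2, Earth's gravitational parameter (two-body only)

def dynamics(t, y):
    """Two-body accelerations for all satellites stacked in one state vector
    y = [r1, v1, r2, v2, ...]; stacking lets a single adaptive-step integrator
    propagate the whole constellation synchronously."""
    y = y.reshape(-1, 6)
    dy = np.empty_like(y)
    dy[:, :3] = y[:, 3:]                                    # dr/dt = v
    r = np.linalg.norm(y[:, :3], axis=1, keepdims=True)
    dy[:, 3:] = -MU * y[:, :3] / r**3                       # dv/dt = -mu r / |r|^3
    return dy.ravel()

# Two illustrative satellites: GPS-like (~26560 km) and GEO (~42164 km), circular orbits.
states = np.array([
    [26560.0, 0.0, 0.0, 0.0, np.sqrt(MU / 26560.0), 0.0],
    [42164.0, 0.0, 0.0, 0.0, 0.0, np.sqrt(MU / 42164.0)],
])
sol = solve_ivp(dynamics, (0.0, 86400.0), states.ravel(),
                method="LSODA", rtol=1e-10, atol=1e-9)   # adaptive multistep (Adams/BDF)
print(sol.y[:, -1].reshape(-1, 6)[:, :3])                # positions after one day
```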

  7. Onboard autonomous mission re-planning for multi-satellite system

    Science.gov (United States)

    Zheng, Zixuan; Guo, Jian; Gill, Eberhard

    2018-04-01

    This paper presents an onboard autonomous mission re-planning system for a Multi-Satellite System (MSS) to perform onboard re-planning in disruptive situations. The proposed re-planning system can deal with different potential emergency situations. This paper uses the Multi-Objective Hybrid Dynamic Mutation Genetic Algorithm (MO-HDM GA) combined with re-planning techniques as the core algorithm. The Cyclically Re-planning Method (CRM) and the Near Real-time Re-planning Method (NRRM) are developed to meet different mission requirements. Simulation results show that both methods can provide feasible re-planning sequences under unforeseen situations. The comparisons illustrate that the CRM is on average 20% faster than the NRRM in computation time. However, by using the NRRM more raw data can be observed and transmitted than with the CRM within the same period. The usability of this onboard re-planning system is not limited to multi-satellite systems. The approach is also applicable to other mission planning and re-planning problems involving multiple autonomous vehicles with similar demands.

  8. Developing Information Services and Tools to Access and Evaluate Data Quality in Global Satellite-based Precipitation Products

    Science.gov (United States)

    Liu, Z.; Shie, C. L.; Meyer, D. J.

    2017-12-01

    Global satellite-based precipitation products have been widely used in research and applications around the world. Compared to ground-based observations, satellite-based measurements provide precipitation data on a global scale, especially over remote continents and oceans. Over the years, satellite-based precipitation products have evolved from single-sensor, single-algorithm products to multi-sensor, multi-algorithm products. As a result, many satellite-based precipitation products have been enhanced in areas such as spatial and temporal coverage. With the inclusion of ground-based measurements, biases of satellite-based precipitation products have been significantly reduced. However, data quality issues still exist and can be caused by many factors such as observations, satellite platform anomalies, algorithms, production, calibration, validation, data services, etc. The NASA Goddard Earth Sciences (GES) Data and Information Services Center (DISC) is home to NASA global precipitation product archives including the Tropical Rainfall Measuring Mission (TRMM), the Global Precipitation Measurement (GPM), as well as other global and regional precipitation products. Precipitation is one of the top downloaded and accessed parameters in the GES DISC data archive. Meanwhile, users want to easily locate and obtain data quality information at regional and global scales to better understand how precipitation products perform and how reliable they are. As data service providers, it is necessary to provide easy access to data quality information; however, such information is normally not available, and when it is available, it is not in one place and is difficult to locate. In this presentation, we will present challenges and activities at the GES DISC to address precipitation data quality issues.

  9. Planning the FUSE Mission Using the SOVA Algorithm

    Science.gov (United States)

    Lanzi, James; Heatwole, Scott; Ward, Philip R.; Civeit, Thomas; Calvani, Humberto; Kruk, Jeffrey W.; Suchkov, Anatoly

    2011-01-01

    Three documents discuss the Sustainable Objective Valuation and Attainability (SOVA) algorithm and software as used to plan tasks (principally, scientific observations and associated maneuvers) for the Far Ultraviolet Spectroscopic Explorer (FUSE) satellite. SOVA is a means of managing risk in a complex system, based on a concept of computing the expected return value of a candidate ordered set of tasks as a product of pre-assigned task values and assessments of attainability made against qualitatively defined strategic objectives. For the FUSE mission, SOVA autonomously assembles a week-long schedule of target observations and associated maneuvers so as to maximize the expected scientific return value while keeping the satellite stable, managing the angular momentum of spacecraft attitude- control reaction wheels, and striving for other strategic objectives. A six-degree-of-freedom model of the spacecraft is used in simulating the tasks, and the attainability of a task is calculated at each step by use of strategic objectives as defined by use of fuzzy inference systems. SOVA utilizes a variant of a graph-search algorithm known as the A* search algorithm to assemble the tasks into a week-long target schedule, using the expected scientific return value to guide the search.

  10. Fast Emission Estimates in China Constrained by Satellite Observations (Invited)

    Science.gov (United States)

    Mijling, B.; van der A, R.

    2013-12-01

    Emission inventories of air pollutants are crucial information for policy makers and form important input data for air quality models. Unfortunately, bottom-up emission inventories, compiled from large quantities of statistical data, are easily outdated for an emerging economy such as China, where rapid economic growth changes emissions accordingly. Alternatively, top-down emission estimates from satellite observations of air constituents have important advantages of being spatially consistent, having high temporal resolution, and enabling emission updates shortly after the satellite data become available. Constraining emissions from concentration measurements is, however, computationally challenging. Within the GlobEmission project of the European Space Agency (ESA) a new algorithm has been developed, specifically designed for fast daily emission estimates of short-lived atmospheric species on a mesoscopic scale (0.25 × 0.25 degree) from satellite observations of column concentrations. The algorithm needs only one forward model run from a chemical transport model to calculate the sensitivity of concentration to emission, using trajectory analysis to account for transport away from the source. By using a Kalman filter in the inverse step, optimal use of the a priori knowledge and the newly observed data is made. We apply the algorithm for NOx emission estimates in East China, using the CHIMERE model together with tropospheric NO2 column retrievals of the OMI and GOME-2 satellite instruments. The observations are used to construct a monthly emission time series, which reveals important emission trends such as the emission reduction measures during the Beijing Olympic Games, and the impact and recovery from the global economic crisis. The algorithm is also able to detect emerging sources (e.g. new power plants) and improve emission information for areas where proxy data are not or badly known (e.g. shipping emissions). The new emission estimates result in a better
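
    The inverse step described above can be illustrated with a single Kalman-filter analysis update; the sensitivity matrix, covariances and numbers below are invented for the example and do not come from the GlobEmission system.

```python
import numpy as np

def kalman_update(e_prior, P_prior, y_obs, H, R):
    """One Kalman-filter analysis step for an emission state vector.

    e_prior : a priori emissions per grid cell
    P_prior : a priori error covariance
    y_obs   : observed tropospheric columns
    H       : sensitivity of columns to emissions (illustrative matrix here)
    R       : observation error covariance
    """
    S = H @ P_prior @ H.T + R
    K = P_prior @ H.T @ np.linalg.inv(S)                  # Kalman gain
    e_post = e_prior + K @ (y_obs - H @ e_prior)          # analysis emissions
    P_post = (np.eye(len(e_prior)) - K @ H) @ P_prior     # analysis covariance
    return e_post, P_post

# Toy example: 3 emission cells observed through 2 column "super-observations".
e0 = np.array([1.0, 2.0, 1.5])
P0 = np.diag([0.5, 0.5, 0.5])
H  = np.array([[0.8, 0.2, 0.0],
               [0.0, 0.3, 0.7]])
R  = np.diag([0.05, 0.05])
y  = np.array([1.9, 2.4])
print(kalman_update(e0, P0, y, H, R)[0])
```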

  11. Aerodynamic design of the National Rotor Testbed.

    Energy Technology Data Exchange (ETDEWEB)

    Kelley, Christopher Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    A new wind turbine blade has been designed for the National Rotor Testbed (NRT) project and for future experiments at the Scaled Wind Farm Technology (SWiFT) facility with a specific focus on scaled wakes. This report shows the aerodynamic design of new blades that can produce a wake that has similitude to utility scale blades despite the difference in size and location in the atmospheric boundary layer. Dimensionless quantities circulation, induction, thrust coefficient, and tip-speed-ratio were kept equal between rotor scales in region 2 of operation. The new NRT design matched the aerodynamic quantities of the most common wind turbine in the United States, the GE 1.5sle turbine with 37c model blades. The NRT blade design is presented along with its performance subject to the winds at SWiFT. The design requirements determined by the SWiFT experimental test campaign are shown to be met.

  12. Test-bed Assessment of Communication Technologies for a Power-Balancing Controller

    DEFF Research Database (Denmark)

    Findrik, Mislav; Pedersen, Rasmus; Hasenleithner, Eduard

    2016-01-01

    Due to the growing need for sustainable energy, an increasing number of different renewable energy resources are being connected to distribution grids. In order to efficiently manage decentralized power generation units, the smart grid will rely on communication networks for information exchange and control. In this paper, we present a Smart Grid test-bed that integrates various communication technologies and deploys a power balancing controller for LV grids. The control performance of the introduced power balancing controller is subsequently investigated and its robustness to communication network cross......

  13. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    Science.gov (United States)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
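
    A schematic of how a point-operation test and a reference-image time-series test might be combined, in the spirit of the hybrid approach described above; the normalized-thermal-index formulation and both thresholds are illustrative placeholders rather than the operational MODVOLC or RST values.

```python
import numpy as np

def hybrid_hotspot_flags(nti_scene, nti_reference_stack,
                         nti_threshold=-0.80, z_threshold=3.0):
    """Flag thermally anomalous pixels by combining
      (1) a MODVOLC-style point operation on a normalized thermal index, and
      (2) an RST-style comparison of the current scene with the per-pixel mean
          and standard deviation of a reference stack (e.g. same calendar month)."""
    ref_mean = np.nanmean(nti_reference_stack, axis=0)
    ref_std = np.nanstd(nti_reference_stack, axis=0)
    z = (nti_scene - ref_mean) / np.where(ref_std > 0, ref_std, np.nan)
    point_flag = nti_scene > nti_threshold        # absolute "hot pixel" test
    series_flag = z > z_threshold                 # relative-to-history test
    return point_flag | series_flag

# Toy 2x2 scene: pixel (0,0) is anomalously warm relative to its history.
history = np.full((80, 2, 2), -0.95) + 0.01 * np.random.default_rng(0).standard_normal((80, 2, 2))
scene = np.array([[-0.70, -0.95],
                  [-0.94, -0.96]])
print(hybrid_hotspot_flags(scene, history))
```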

  14. AN ACTIVE-PASSIVE COMBINED ALGORITHM FOR HIGH SPATIAL RESOLUTION RETRIEVAL OF SOIL MOISTURE FROM SATELLITE SENSORS (Invited)

    Science.gov (United States)

    Lakshmi, V.; Mladenova, I. E.; Narayan, U.

    2009-12-01

    Soil moisture is known to be an essential factor in controlling the partitioning of rainfall into surface runoff and infiltration, and of solar energy into latent and sensible heat fluxes. Remote sensing has long proven its capability to obtain soil moisture in near real-time. However, at the present time the Advanced Microwave Scanning Radiometer (AMSR-E) on board NASA's AQUA platform is the only satellite sensor that supplies a soil moisture product. AMSR-E's coarse spatial resolution (~50 km at 6.9 GHz) strongly limits its applicability for small-scale studies. A very promising technique for spatial disaggregation by combining radar and radiometer observations has been demonstrated by the authors, using a methodology based on the assumption that any change in measured brightness temperature and backscatter from one time step to the next is due primarily to a change in soil wetness. The approach uses radiometric estimates of soil moisture at a lower resolution to compute the sensitivity of radar to soil moisture at the lower resolution. This estimate of sensitivity is then disaggregated using vegetation water content, vegetation type and soil texture information, which are the variables that determine the radar sensitivity to soil moisture and are generally available at the scale of the radar observation. This change detection algorithm is applied to several locations. We have used aircraft-observed active and passive data over the Walnut Creek watershed in central Iowa in 2002, the Little Washita Watershed in Oklahoma in 2003, and the Murrumbidgee Catchment in southeastern Australia for 2006. All of these locations have different soils and land cover conditions, which leads to a rigorous test of the disaggregation algorithm. Furthermore, we compare the derived high spatial resolution soil moisture to in-situ sampling and ground observation networks.

  15. Shadow imaging of geosynchronous satellites

    Science.gov (United States)

    Douglas, Dennis Michael

    Geosynchronous (GEO) satellites are essential for modern communication networks. If communication to a GEO satellite is lost, or a malfunction such as a solar panel failing to deploy occurs upon orbit insertion, there is no direct way to observe it from Earth. Due to the GEO orbit distance of ~36,000 km from Earth's surface, the Rayleigh criterion dictates that a 14 m telescope is required to conventionally image a satellite with spatial resolution down to 1 m using visible light. Furthermore, a telescope larger than 30 m is required under ideal conditions to obtain spatial resolution down to 0.4 m. This dissertation evaluates a method for obtaining high spatial resolution images of GEO satellites from an Earth-based system by measuring the irradiance distribution on the ground resulting from the occultation of the satellite passing in front of a star. The representative size of a GEO satellite combined with the orbital distance results in the ground shadow being consistent with a Fresnel diffraction pattern when observed at visible wavelengths. A measurement of the ground shadow irradiance is used as an amplitude constraint in a Gerchberg-Saxton phase retrieval algorithm that produces a reconstruction of the satellite's 2D transmission function, which is analogous to a reverse-contrast image of the satellite. The advantage of shadow imaging is that a terrestrial-based redundant set of linearly distributed, inexpensive small telescopes, each coupled to high-speed detectors, is a more effective resolved imaging system for GEO satellites than a very large telescope under ideal conditions. Modeling and simulation efforts indicate sub-meter spatial resolution can be readily achieved using collection apertures of less than 1 meter in diameter. A mathematical basis is established for the treatment of the physical phenomena involved in the shadow imaging process. This includes the source star brightness and angular extent, and the diffraction of starlight from the satellite
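
    The iterative amplitude-constraint structure of Gerchberg-Saxton-type phase retrieval can be sketched as below. Note that the real shadow-imaging problem constrains a measured Fresnel diffraction amplitude, whereas this toy example uses plain far-field (FFT) propagation and a simple support constraint purely for illustration.

```python
import numpy as np

def gerchberg_saxton(measured_amplitude, object_support, n_iter=200, seed=0):
    """Minimal Gerchberg-Saxton-type loop: alternate between the measurement
    plane (amplitude known) and the object plane (support known)."""
    rng = np.random.default_rng(seed)
    field = measured_amplitude * np.exp(1j * rng.uniform(0, 2 * np.pi,
                                                         measured_amplitude.shape))
    for _ in range(n_iter):
        obj = np.fft.ifft2(field)                       # back to the object plane
        obj = np.where(object_support, obj, 0.0)        # enforce known support
        field = np.fft.fft2(obj)                        # forward to measurement plane
        field = measured_amplitude * np.exp(1j * np.angle(field))  # keep measured amplitude
    return np.abs(np.fft.ifft2(field))                  # reconstructed object amplitude

# Toy usage: recover a small rectangular "satellite" from its Fourier amplitude.
truth = np.zeros((64, 64)); truth[28:36, 20:44] = 1.0
support = np.zeros_like(truth, dtype=bool); support[20:44, 12:52] = True
recon = gerchberg_saxton(np.abs(np.fft.fft2(truth)), support)
print(float(np.corrcoef(recon.ravel(), truth.ravel())[0, 1]))  # similarity to the truth
```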

  16. Ames life science telescience testbed evaluation

    Science.gov (United States)

    Haines, Richard F.; Johnson, Vicki; Vogelsong, Kristofer H.; Froloff, Walt

    1989-01-01

    Eight surrogate spaceflight mission specialists participated in a real-time evaluation of remote coaching using the Ames Life Science Telescience Testbed facility. This facility consisted of three remotely located nodes: (1) a prototype Space Station glovebox; (2) a ground control station; and (3) a principal investigator's (PI) work area. The major objective of this project was to evaluate the effectiveness of telescience techniques and hardware to support three realistic remote coaching science procedures: plant seed germinator charging, plant sample acquisition and preservation, and remote plant observation with ground coaching. Each scenario was performed by a subject acting as flight mission specialist, interacting with a payload operations manager and a principal investigator expert. All three groups were physically isolated from each other yet linked by duplex audio and color video communication channels and networked computer workstations. Workload ratings were made by the flight and ground crewpersons immediately after completing their assigned tasks. Time to complete each scientific procedural step was recorded automatically. Two expert observers also made performance ratings and various error assessments. The results are presented and discussed.

  17. Geographically weighted regression based methods for merging satellite and gauge precipitation

    Science.gov (United States)

    Chao, Lijun; Zhang, Ke; Li, Zhijia; Zhu, Yuelong; Wang, Jingfeng; Yu, Zhongbo

    2018-03-01

    Real-time precipitation data with high spatiotemporal resolutions are crucial for accurate hydrological forecasting. To improve the spatial resolution and quality of satellite precipitation, a three-step satellite and gauge precipitation merging method was formulated in this study: (1) bilinear interpolation is first applied to downscale coarser satellite precipitation to a finer resolution (PS); (2) the (mixed) geographically weighted regression methods coupled with a weighting function are then used to estimate biases of PS as functions of gauge observations (PO) and PS; and (3) biases of PS are finally corrected to produce a merged precipitation product. Based on the above framework, eight algorithms, a combination of two geographically weighted regression methods and four weighting functions, are developed to merge CMORPH (CPC MORPHing technique) precipitation with station observations on a daily scale in the Ziwuhe Basin of China. The geographical variables (elevation, slope, aspect, surface roughness, and distance to the coastline) and a meteorological variable (wind speed) were used for merging precipitation to avoid the artificial spatial autocorrelation resulting from traditional interpolation methods. The results show that the combination of the MGWR and BI-square function (MGWR-BI) has the best performance (R = 0.863 and RMSE = 7.273 mm/day) among the eight algorithms. The MGWR-BI algorithm was then applied to produce hourly merged precipitation product. Compared to the original CMORPH product (R = 0.208 and RMSE = 1.208 mm/hr), the quality of the merged data is significantly higher (R = 0.724 and RMSE = 0.706 mm/hr). The developed merging method not only improves the spatial resolution and quality of the satellite product but also is easy to implement, which is valuable for hydrological modeling and other applications.
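
    A bare-bones geographically weighted regression step, with the bi-square kernel mentioned above, might look like the following; the predictors, bandwidth and gauge data are invented, and this plain GWR sketch does not implement the mixed-GWR (MGWR-BI) variant the paper recommends.

```python
import numpy as np

def bisquare_weights(dists, bandwidth):
    """Bi-square kernel: nearby gauges get weights close to 1, gauges beyond
    the bandwidth get weight 0."""
    w = np.zeros_like(dists)
    inside = dists < bandwidth
    w[inside] = (1.0 - (dists[inside] / bandwidth) ** 2) ** 2
    return w

def gwr_bias_estimate(x0, gauge_xy, gauge_bias, predictors, bandwidth):
    """Locally weighted least-squares estimate of the satellite bias model at x0.

    predictors : per-gauge design matrix (e.g. a column of ones plus the
                 downscaled satellite rain, elevation, wind speed, ...)."""
    d = np.linalg.norm(gauge_xy - x0, axis=1)
    W = np.diag(bisquare_weights(d, bandwidth))
    beta, *_ = np.linalg.lstsq(predictors.T @ W @ predictors,
                               predictors.T @ W @ gauge_bias, rcond=None)
    return beta        # local coefficients; bias at x0 = predictors(x0) @ beta

# Toy usage with five gauges and two predictors (intercept + satellite rain).
rng = np.random.default_rng(0)
xy = rng.uniform(0, 50, size=(5, 2))
X = np.column_stack([np.ones(5), rng.uniform(0, 20, 5)])
bias = 0.5 + 0.1 * X[:, 1] + 0.05 * rng.standard_normal(5)
print(gwr_bias_estimate(np.array([25.0, 25.0]), xy, bias, X, bandwidth=60.0))
```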

  18. Adaptation of a fuzzy controller’s scaling gains using genetic algorithms for balancing an inverted pendulum

    Directory of Open Access Journals (Sweden)

    Duka Adrian-Vasile

    2011-12-01

    Full Text Available This paper examines the development of a genetic adaptive fuzzy control system for the Inverted Pendulum. The inverted pendulum is a classical problem in Control Engineering, used for testing different control algorithms. The goal is to balance the inverted pendulum in the upright position by controlling the horizontal force applied to its cart. Because it is unstable and has complicated nonlinear dynamics, the inverted pendulum is a good testbed for the development of nonconventional advanced control techniques. The fuzzy logic technique has been successfully applied to control this type of system; however, most of the time the design of the fuzzy controller is done in an ad-hoc manner, and choosing certain parameters (controller gains, membership functions) proves difficult. This paper examines the implementation of an adaptive control method based on genetic algorithms (GA), which can be used on-line to produce the adaptation of the fuzzy controller's gains in order to achieve the stabilization of the pendulum. The performance of the proposed control algorithms is evaluated and shown by means of digital simulation.
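
    A compact real-coded genetic algorithm for adapting two scaling gains is sketched below. Because a full fuzzy-controller-plus-pendulum simulation is beyond a short example, the cost function is a placeholder; the GA operators and settings are likewise illustrative rather than those of the paper.

```python
import numpy as np

def adapt_gains(cost, bounds, pop_size=30, generations=40, seed=0):
    """Real-coded genetic algorithm searching for controller scaling gains that
    minimize cost(gains): tournament selection, blend crossover, Gaussian
    mutation, and one elite individual carried over each generation."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    for _ in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        new_pop = [pop[np.argmin(fitness)]]                       # elitism
        while len(new_pop) < pop_size:
            i, j = rng.integers(pop_size, size=2)
            a = pop[i] if fitness[i] < fitness[j] else pop[j]     # tournament 1
            i, j = rng.integers(pop_size, size=2)
            b = pop[i] if fitness[i] < fitness[j] else pop[j]     # tournament 2
            alpha = rng.uniform(size=len(lo))
            child = alpha * a + (1 - alpha) * b                   # blend crossover
            child += rng.normal(scale=0.05 * (hi - lo))           # Gaussian mutation
            new_pop.append(np.clip(child, lo, hi))
        pop = np.array(new_pop)
    fitness = np.array([cost(ind) for ind in pop])
    return pop[np.argmin(fitness)]

# Placeholder cost standing in for a pendulum simulation driven by the fuzzy
# controller: pretend the best input/output gains are (2.0, 0.5).
placeholder_cost = lambda g: (g[0] - 2.0) ** 2 + (g[1] - 0.5) ** 2
print(adapt_gains(placeholder_cost, bounds=np.array([[0.0, 5.0], [0.0, 5.0]])))
```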

  19. A novel angle computation and calibration algorithm of bio-inspired sky-light polarization navigation sensor.

    Science.gov (United States)

    Xian, Zhiwen; Hu, Xiaoping; Lian, Junxiang; Zhang, Lilian; Cao, Juliang; Wang, Yujie; Ma, Tao

    2014-09-15

    Navigation plays a vital role in our daily life. As traditional and commonly used navigation technologies, the Inertial Navigation System (INS) and the Global Navigation Satellite System (GNSS) can provide accurate location information, but suffer, respectively, from the accumulative error of inertial sensors and from being unusable in a satellite-denied environment. The remarkable navigation ability of animals shows that the polarization pattern of the sky can be used for navigation. A bio-inspired POLarization Navigation Sensor (POLNS) is constructed to detect the polarization of skylight. Contrary to the previous approach, we utilize all the outputs of POLNS to compute the input polarization angle, based on Least Squares, which provides optimal angle estimation. In addition, a new sensor calibration algorithm is presented, in which the installation angle errors and sensor biases are taken into consideration. Derivation and implementation of our calibration algorithm are discussed in detail. To evaluate the performance of our algorithms, simulation and real-data tests are performed to compare our algorithms with several existing algorithms. Comparison results indicate that our algorithms are superior to the others and are more feasible and effective in practice.
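
    Using all channel outputs in a single linear least-squares fit, as the abstract describes, can be illustrated with a Malus-type channel model; the model form, channel angles and noise level below are assumptions made for the example and are not necessarily the exact POLNS formulation.

```python
import numpy as np

def polarization_angle(channel_outputs, analyzer_angles_rad):
    """Least-squares estimate of the skylight polarization angle from all
    channels at once. Each channel with analyzer angle a_i is modeled as
        I_i = c0 + c1*cos(2*a_i) + c2*sin(2*a_i),
    so the angle of polarization is 0.5*atan2(c2, c1)."""
    A = np.column_stack([np.ones_like(analyzer_angles_rad),
                         np.cos(2 * analyzer_angles_rad),
                         np.sin(2 * analyzer_angles_rad)])
    c, *_ = np.linalg.lstsq(A, channel_outputs, rcond=None)
    return 0.5 * np.arctan2(c[2], c[1])

# Toy usage: six channels, true polarization angle 35 degrees, small noise.
angles = np.radians([0, 30, 60, 90, 120, 150])
phi_true = np.radians(35.0)
outputs = 1.0 + 0.4 * np.cos(2 * (phi_true - angles))
outputs += 0.005 * np.random.default_rng(0).standard_normal(outputs.size)
print(np.degrees(polarization_angle(outputs, angles)))   # ~35
```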

  20. Fast emission estimates in China and South Africa constrained by satellite observations

    Science.gov (United States)

    Mijling, Bas; van der A, Ronald

    2013-04-01

    Emission inventories of air pollutants are crucial information for policy makers and form important input data for air quality models. Unfortunately, bottom-up emission inventories, compiled from large quantities of statistical data, are easily outdated for emerging economies such as China and South Africa, where rapid economic growth changes emissions accordingly. Alternatively, top-down emission estimates from satellite observations of air constituents have the important advantages of being spatially consistent, having high temporal resolution, and enabling emission updates shortly after the satellite data become available. However, constraining emissions from observations of concentrations is computationally challenging. Within the GlobEmission project (part of the Data User Element programme of ESA), a new algorithm has been developed, specifically designed for fast daily emission estimates of short-lived atmospheric species on a mesoscopic scale (0.25 × 0.25 degree) from satellite observations of column concentrations. The algorithm needs only one forward model run from a chemical transport model to calculate the sensitivity of concentration to emission, using trajectory analysis to account for transport away from the source. By using a Kalman filter in the inverse step, optimal use of the a priori knowledge and the newly observed data is made. We apply the algorithm for NOx emission estimates in East China and South Africa, using the CHIMERE chemical transport model together with tropospheric NO2 column retrievals of the OMI and GOME-2 satellite instruments. The observations are used to construct a monthly emission time series, which reveals important emission trends such as the emission reduction measures during the Beijing Olympic Games and the impact of and recovery from the global economic crisis. The algorithm is also able to detect emerging sources (e.g. new power plants) and improve emission information for areas where proxy data are missing or poorly known (e
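
    A generic Kalman-filter update of the kind used in such an inverse step, sketched in Python; the matrix names and dimensions are assumptions, and the sensitivity matrix H would come from the single forward chemical transport model run mentioned above.

      import numpy as np

      def kalman_update(x_prior, P_prior, y_obs, H, R):
          # x_prior : a priori emission vector, P_prior : its error covariance
          # y_obs   : observed column concentrations
          # H       : sensitivity of observed columns to emissions
          # R       : observation error covariance
          S = H @ P_prior @ H.T + R                  # innovation covariance
          K = P_prior @ H.T @ np.linalg.inv(S)       # Kalman gain
          x_post = x_prior + K @ (y_obs - H @ x_prior)
          P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior
          return x_post, P_post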

  1. Satellites

    International Nuclear Information System (INIS)

    Burns, J.A.; Matthews, M.S.

    1986-01-01

    The present work is based on a conference: Natural Satellites, Colloquium 77 of the IAU, held at Cornell University from July 5 to 9, 1983. Attention is given to the background and origins of satellites, protosatellite swarms, the tectonics of icy satellites, the physical characteristics of satellite surfaces, and the interactions of planetary magnetospheres with icy satellite surfaces. Other topics include the surface composition of natural satellites, the cratering of planetary satellites, the moon, Io, and Europa. Consideration is also given to Ganymede and Callisto, the satellites of Saturn, small satellites, satellites of Uranus and Neptune, and the Pluto-Charon system

  2. Evaluation of short-period rainfall estimates from Kalpana-1 satellite

    Indian Academy of Sciences (India)

    The INSAT Multispectral Rainfall Algorithm (IMSRA) technique for rainfall estimation has recently been developed to address the shortcomings of the Global Precipitation Index (GPI) technique of rainfall estimation from the data of geostationary satellites, especially for accurate short-period rainfall estimates. This study ...

  3. Heuristic approach to Satellite Range Scheduling with Bounds using Lagrangian Relaxation.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nathanael J. K.; Arguello, Bryan; Nozick, Linda Karen; Xu, Ningxiong [Cornell

    2017-03-01

    This paper focuses on scheduling antennas to track satellites using a heuristic method. In order to validate the performance of the heuristic, bounds are developed using Lagrangian relaxation. The performance of the algorithm is established using several illustrative problems.

  4. A testbed to explore the optimal electrical stimulation parameters for suppressing inter-ictal spikes in human hippocampal slices.

    Science.gov (United States)

    Min-Chi Hsiao; Pen-Ning Yu; Dong Song; Liu, Charles Y; Heck, Christi N; Millett, David; Berger, Theodore W

    2014-01-01

    New interventions using neuromodulatory devices such as vagus nerve stimulation, deep brain stimulation, and responsive neurostimulation are available or under study for the treatment of refractory epilepsy. Since the actual mechanisms of seizure onset and termination are still unclear, most researchers or clinicians determine the optimal stimulation parameters through trial-and-error procedures. It is necessary to further explore what types of electrical stimulation parameters (these may include stimulation frequency, amplitude, duration, interval pattern, and location) constitute a set of optimal stimulation paradigms to suppress seizures. In a previous study, we developed an in vitro epilepsy model using hippocampal slices from patients suffering from mesial temporal lobe epilepsy. Using a planar multi-electrode array system, inter-ictal activity from human hippocampal slices was consistently recorded. In this study, we have further transferred this in vitro seizure model to a testbed for exploring possible neurostimulation paradigms to inhibit inter-ictal spikes. The methodology used to collect the electrophysiological data and the approach to applying different electrical stimulation parameters to the slices are provided in this paper. The results show that this experimental testbed will provide a platform for testing the optimal stimulation parameters for seizure cessation. We expect this testbed will expedite the process of identifying the most effective parameters and may ultimately be used to guide programming of new stimulation paradigms for neuromodulatory devices.

  5. Japanese Global Precipitation Measurement (GPM) mission status and application of satellite-based global rainfall map

    Science.gov (United States)

    Kachi, Misako; Shimizu, Shuji; Kubota, Takuji; Yoshida, Naofumi; Oki, Riko; Kojima, Masahiro; Iguchi, Toshio; Nakamura, Kenji

    2010-05-01

    Collaboration with GCOM-W is not limited to its participation in the GPM constellation but also includes coordination in the areas of algorithm development and validation in Japan. Generation of a high-temporal-resolution and highly accurate global rainfall map is one of the targets of the GPM mission. As a prototype for the GPM era, JAXA has developed and operated the Global Precipitation Map algorithm in near real time since October 2008, with hourly, 0.1-degree-resolution binary data and images available at http://sharaku.eorc.jaxa.jp/GSMaP/ four hours after observation. The algorithms are based on outcomes from the Global Satellite Mapping for Precipitation (GSMaP) project, which was sponsored by the Japan Science and Technology Agency (JST) under the Core Research for Evolutional Science and Technology (CREST) framework between 2002 and 2007 (Okamoto et al., 2005; Aonashi et al., 2009; Ushio et al., 2009). The target of the GSMaP project is to produce global rainfall maps that are highly accurate and of high temporal and spatial resolution through the development of rain rate retrieval algorithms based on reliable precipitation physical models, using data from several microwave radiometers together with comprehensive use of precipitation radar and geostationary infrared imager data. Near-real-time GSMaP data are distributed via the internet and utilized by end users. Data utilization covers broad areas worldwide: science research (model validation, data assimilation, typhoon studies, etc.), weather forecasting and services, flood warning and rain analysis over river basins, oceanographic condition forecasting, agriculture, and education. Toward the GPM era, operational applications should be further emphasized as well as science applications. JAXA continues collaboration with hydrological communities to utilize satellite-based precipitation data as inputs to future flood prediction and warning systems, as well as with meteorological agencies to proceed with further data utilization in numerical weather prediction

  6. IDMA-Based MAC Protocol for Satellite Networks with Consideration on Channel Quality

    Directory of Open Access Journals (Sweden)

    Gongliang Liu

    2014-01-01

    Full Text Available In order to overcome the shortcomings of existing medium access control (MAC) protocols based on TDMA or CDMA in satellite networks, the interleave division multiple access (IDMA) technique is introduced into satellite communication networks. A novel wide-band IDMA MAC protocol based on channel quality is therefore proposed in this paper, consisting of a dynamic power allocation algorithm, a rate adaptation algorithm, and a call admission control (CAC) scheme. Firstly, the power allocation algorithm, combining the technique of IDMA SINR evolution with channel quality prediction, is developed to guarantee high power efficiency even in poor channel conditions. Secondly, an effective rate adaptation algorithm, based on accurate per-timeslot channel information and rate degradation, can be realized. Moreover, based on channel quality prediction, the CAC scheme, combining the new power allocation algorithm, rate scheduling, and buffering strategies, is proposed for the emerging IDMA systems, which can support a variety of traffic types and meet quality of service (QoS) requirements corresponding to different priority levels. Simulation results show that the new wide-band IDMA MAC protocol can make an accurate estimation of available resources considering the effect of multiuser detection (MUD) and the QoS requirements of multimedia traffic, leading to low outage probability as well as high overall system throughput.
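
    As a loose illustration of the power-allocation idea (not the paper's SINR-evolution formulation), the Python sketch below iterates a classic target-SINR power-control rule in which each user's power is set from the interference produced by the others; the gains, targets, and noise values are hypothetical.

      import numpy as np

      def allocate_power(gains, target_sinr, noise_power, iterations=100):
          # Iterate p_i <- target_sinr * (noise + sum_{j != i} g_j * p_j) / g_i
          p = np.ones_like(gains, dtype=float)
          for _ in range(iterations):
              received = gains * p
              interference = received.sum() - received    # power from the other users
              p = target_sinr * (noise_power + interference) / gains
          return p

      # Example: four users with different channel gains and a common SINR target.
      print(allocate_power(np.array([1.0, 0.8, 0.5, 0.3]), 0.2, 0.01))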

  7. Semi-automatic mapping of linear-trending bedforms using 'Self-Organizing Maps' algorithm

    Science.gov (United States)

    Foroutan, M.; Zimbelman, J. R.

    2017-09-01

    Increased application of high-resolution spatial data, such as high-resolution satellite or Unmanned Aerial Vehicle (UAV) images of Earth as well as High Resolution Imaging Science Experiment (HiRISE) images of Mars, makes it necessary to develop automated techniques capable of extracting detailed geomorphologic elements from such large data sets. Model validation by repeated images in environmental management studies, such as studies of climate-related changes, as well as increasing access to high-resolution satellite images, underlines the demand for detailed automatic image-processing techniques in remote sensing. This study presents a methodology based on an unsupervised Artificial Neural Network (ANN) algorithm, known as Self-Organizing Maps (SOM), to achieve the semi-automatic extraction of linear features with small footprints in satellite images. SOM is based on competitive learning and is efficient for handling huge data sets. We applied the SOM algorithm to high-resolution satellite images of Earth and Mars (Quickbird, WorldView, and HiRISE) in order to facilitate and speed up image analysis and improve the accuracy of the results. About 98% overall accuracy and a 0.001 quantization error in the recognition of small linear-trending bedforms demonstrate a promising framework.
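
    A compact Python sketch of Self-Organizing Map training on per-pixel feature vectors, using competitive learning with a Gaussian neighborhood; the grid size, learning schedule, and input features are illustrative assumptions rather than the study's configuration.

      import numpy as np

      def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
          rng = np.random.default_rng(seed)
          h, w = grid
          nodes = rng.random((h * w, data.shape[1]))            # node weight vectors
          rows, cols = np.divmod(np.arange(h * w), w)
          coords = np.column_stack([rows, cols]).astype(float)  # node positions on the map
          for t in range(epochs):
              lr = lr0 * np.exp(-t / epochs)                    # decaying learning rate
              sigma = sigma0 * np.exp(-t / epochs)              # shrinking neighborhood
              for x in data[rng.permutation(len(data))]:
                  bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))        # best matching unit
                  d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
                  neighborhood = np.exp(-d2 / (2.0 * sigma ** 2))
                  nodes += lr * neighborhood[:, None] * (x - nodes)      # competitive update
          return nodes.reshape(h, w, -1)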

  8. Improving retrieval of volcanic sulphur dioxide from backscattered UV satellite observations

    NARCIS (Netherlands)

    Yang, Kai; Krotkov, N.A.; Krueger, A.J.; Carn, S.A.; Bhartia, P.K.; Levelt, P.F.

    2009-01-01

    Existing algorithms that use satellite measurements of solar backscattered ultraviolet (BUV) radiances to retrieve sulfur dioxide (SO2) vertical columns underestimate the large SO2 amounts encountered in fresh volcanic eruption clouds. To eliminate this underestimation we have developed a new

  9. An SDR based AIS receiver for satellites

    DEFF Research Database (Denmark)

    Larsen, Jesper Abildgaard; Mortensen, Hans Peter; Nielsen, Jens Frederik Dalsgaard

    2011-01-01

    For a few years now, there has been high interest in monitoring global ship traffic from space. A few satellites capable of listening for ship-borne AIS transponders have already been launched, and soon AAUSAT3, carrying two different types of AIS receivers, will also be launched. One...... of the AIS receivers onboard AAUSAT3 is an SDR-based AIS receiver. This paper describes the background of the AIS system and how the SDR-based receiver has been integrated into the AAUSAT3 satellite. Among the benefits of using an SDR-based receiver is that, due to its versatility, new...... detection algorithms are easily deployed, and it is easily adapted to newly proposed AIS transmission channels....

  10. FloorNet: Deployment and Evaluation of a Multihop Wireless 802.11 Testbed

    Directory of Open Access Journals (Sweden)

    Zink Michael

    2010-01-01

    Full Text Available A lot of attention has been given to multihop wireless networks lately, but further research—in particular, through experimentation—is needed. This attention has motivated an increase in the number of 802.11-based deployments, both indoor and outdoor. These testbeds, which require a significant amount of resources during both deployment and maintenance, are used to run measurements in order to analyze and understand the limitations and differences between analytical or simulation-based figures and the results from real-life experimentation. This paper makes two major contributions: (i) first, we describe a novel wireless multihop testbed, which we name FloorNet, that is deployed and operated under the false floor of a lab in our Computer Science building. This false floor provides strong physical protection that prevents disconnections or misplacements, as well as radio shielding (to some extent) thanks to the false floor panels; this latter feature is assessed through experimentation. (ii) Second, by running exhaustive and controlled experiments we are able to analyze the performance limits of commercial off-the-shelf hardware, as well as to derive practical design criteria for the deployment and configuration of mesh networks. These results both provide valuable insights into wireless multihop performance and prove that FloorNet constitutes a valuable asset for research on wireless mesh networks.

  11. Interactive aircraft cabin testbed for stress-free air travel system experiment: an innovative concurrent design approach

    NARCIS (Netherlands)

    Tan, C.F.; Chen, W.; Rauterberg, G.W.M.

    2009-01-01

    In this paper, a study of the concurrent engineering design of an environmentally friendly, low-cost aircraft cabin simulator is presented. The study describes the use of the concurrent design technique in the design activity. The simulator is a testbed that was designed and built for research on

  12. NInFEA: an embedded framework for the real-time evaluation of fetal ECG extraction algorithms.

    Science.gov (United States)

    Pani, Danilo; Barabino, Gianluca; Raffo, Luigi

    2013-02-01

    Fetal electrocardiogram (ECG) extraction from non-invasive biopotential recordings is a long-standing research topic. Despite the significant number of algorithms presented in the scientific literature, it is difficult to find information about embedded hardware implementations able to provide real-time support for the required features, bridging the gap between theory and practice. This article presents the NInFEA (non-invasive fetal ECG analysis) tool, an embedded hardware/software framework based on the hybrid dual-core OMAP-L137 low-power processor for the real-time evaluation of fetal ECG extraction algorithms. The hybrid platform, including a digital signal processor (DSP) and a general-purpose processor (GPP), allows achieving the best performance compared with single-core architectures. The GPP provides a portable graphical user interface, whereas the DSP is extensively used for advanced signal processing tasks. As a case study, three state-of-the-art fetal ECG extraction algorithms have been ported onto NInFEA, along with some support routines needed to provide the additional information required by the clinicians and supported by the user interface. NInFEA can be regarded both as a reference design for similar applications and as a common embedded low-power testbed for real-time fetal ECG extraction algorithms.

  13. A FD/DAMA network architecture for the first generation land mobile satellite services

    Science.gov (United States)

    Yan, T.-Y.; Wang, C.; Cheng, U.; Dessouky, K.; Rafferty, W.

    1989-01-01

    A frequency division/demand assigned multiple access (FD/DAMA) network architecture for the first-generation land mobile satellite services is presented. Rationales and technical approaches are described. In this architecture, each mobile subscriber must follow a channel access protocol to make a service request to the network management center before transmission for either open-end or closed-end services. Open-end service requests will be processed on a blocked call cleared basis, while closed-end requests will be processed on a first-come-first-served basis. Two channel access protocols are investigated, namely, a recently proposed multiple channel collision resolution scheme which provides a significantly higher useful throughput, and the traditional slotted Aloha scheme. The number of channels allocated for either open-end or closed-end services can be adaptively changed according to aggregated traffic requests. Both theoretical and simulation results are presented. Theoretical results have been verified by simulation on the JPL network testbed.
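
    As a back-of-the-envelope companion to the two access mechanisms mentioned above, the Python sketch below evaluates the classical slotted-Aloha useful throughput S = G·exp(−G) for the request channel and an Erlang-B blocking probability for open-end services handled on a blocked-calls-cleared basis; these are textbook formulas, not the paper's simulation models.

      import math

      def slotted_aloha_throughput(offered_load_g):
          # Useful throughput of slotted Aloha at normalized offered load G.
          return offered_load_g * math.exp(-offered_load_g)

      def erlang_b(offered_erlangs, channels):
          # Recursive Erlang-B formula: blocking probability with n channels.
          b = 1.0
          for n in range(1, channels + 1):
              b = (offered_erlangs * b) / (n + offered_erlangs * b)
          return b

      print(slotted_aloha_throughput(1.0))   # about 0.368, the slotted-Aloha maximum
      print(erlang_b(10.0, 15))              # example blocking for 10 Erlang offered, 15 channels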

  14. Report of the Interagency Optical Network Testbeds Workshop 2 September 12-14, 2006 NASA Ames Research Center

    Energy Technology Data Exchange (ETDEWEB)

    Joe Mambretti; Richard desJardins

    2006-05-01

    A new generation of optical networking services and technologies is rapidly changing the world of communications. National and international networks are implementing optical services to supplement traditional packet routed services. On September 12-14, 2005, the Optical Network Testbeds Workshop 2 (ONT2), an invitation-only forum hosted by the NASA Research and Engineering Network (NREN) and co-sponsored by the Department of Energy (DOE), was held at NASA Ames Research Center in Mountain View, California. The aim of ONT2 was to help the Federal Large Scale Networking Coordination Group (LSN) and its Joint Engineering Team (JET) to coordinate testbed and network roadmaps describing agency and partner organization views and activities for moving toward next generation communication services based on leading edge optical networks in the 3-5 year time frame. ONT2 was conceived and organized as a sequel to the first Optical Network Testbeds Workshop (ONT1, August 2004, www.nren.nasa.gov/workshop7). ONT1 resulted in a series of recommendations to LSN. ONT2 was designed to move beyond recommendations to agree on a series of “actionable objectives” that would proactively help federal and partner optical network testbeds and advanced research and education (R&E) networks to begin incorporating technologies and services representing the next generation of advanced optical networks in the next 1-3 years. Participants in ONT2 included representatives from innovative prototype networks (Panel A), basic optical network research testbeds (Panel B), and production R&D networks (Panels C and D), including “JETnets,” selected regional optical networks (RONs), international R&D networks, commercial network technology and service providers (Panel F), and senior engineering and R&D managers from LSN agencies and partner organizations. The overall goal of ONT2 was to identify and coordinate short and medium term activities and milestones for researching, developing, identifying

  15. Utilizing the ISS Mission as a Testbed to Develop Cognitive Communications Systems

    Science.gov (United States)

    Jackson, Dan

    2016-01-01

    The ISS provides an excellent opportunity for pioneering artificial intelligence software to meet the challenges of real-time communications (comm) link management. This opportunity empowers the ISS Program to forge a testbed for developing cognitive communications systems for the benefit of the ISS mission, manned Low Earth Orbit (LEO) science programs, and future planetary exploration programs. In November 1998, the Flight Operations Directorate (FOD) started the ISS Antenna Manager (IAM) project to develop a single processor supporting multiple comm satellite tracking for two different antenna systems. Further, the processor was developed to be highly adaptable as it supported the ISS mission through all assembly stages. The ISS mission mandated communications specialists with complete knowledge of when the ISS was about to lose or gain comm link service. The current specialty mandates cognizance of large sun-tracking solar arrays and thermal management panels in addition to the highly dynamic satellite service schedules and rise/set tables. This mission requirement makes the ISS the ideal communications management analogue for future LEO space station and long-duration planetary exploration missions. Future missions, with their precision-pointed, dynamic, laser-based comm links, require complete autonomy for managing high-data-rate communications systems. Development of cognitive communications management systems that permit any crew member or payload science specialist, regardless of experience level, to control communications is one of the greater benefits the ISS can offer new space exploration programs. The IAM project met a new mission requirement never previously levied against US space-borne communications systems management: process and display the orientation of large solar arrays and thermal control panels based on real-time joint angle telemetry. However, IAM leaves the actual communications availability assessment to human judgement, which introduces

  16. Open-Source Based Testbed for Multioperator 4G/5G Infrastructure Sharing in Virtual Environments

    Directory of Open Access Journals (Sweden)

    Ricardo Marco Alaez

    2017-01-01

    Full Text Available Fourth-Generation (4G) mobile networks are based on Long-Term Evolution (LTE) technologies and are being deployed worldwide, while research on further evolution towards the Fifth Generation (5G) has recently been initiated. 5G will feature advanced network infrastructure sharing capabilities among different operators. Therefore, an open-source implementation of 4G/5G networks with this capability is crucial to enable early research in this area. The main contribution of this paper is the design and implementation of such a 4G/5G open-source testbed to investigate multioperator infrastructure sharing capabilities executed in virtual architectures. The proposed design and implementation enable the virtualization and sharing of some of the components of the LTE architecture. A testbed has been implemented and validated with intensive empirical experiments conducted to validate the suitability of virtualizing LTE components in virtual infrastructures (i.e., infrastructures with multitenancy sharing capabilities). The impact of the proposed technologies can lead to significant savings in both capital and operational costs for mobile telecommunication operators.

  17. Entropy-Based Block Processing for Satellite Image Registration

    Directory of Open Access Journals (Sweden)

    Ikhyun Lee

    2012-11-01

    Full Text Available Image registration is an important task in many computer vision applications such as fusion systems, 3D shape recovery, and Earth observation. In particular, registering satellite images is challenging and time-consuming due to limited resources and large image sizes. In such a scenario, state-of-the-art image registration methods such as the scale-invariant feature transform (SIFT) may not be suitable due to their high processing time. In this paper, we propose an algorithm based on block processing via entropy to register satellite images. The performance of the proposed method is evaluated using different real images. The comparative analysis shows that it not only reduces the processing time but also enhances the accuracy.
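
    A minimal Python sketch of the kind of per-block entropy screening the title suggests: each block's Shannon entropy is computed from its gray-level histogram and only the most informative blocks are kept for matching; the block size and selection count are assumptions, not the paper's settings.

      import numpy as np

      def block_entropy(block, bins=256):
          # Shannon entropy of the block's gray-level histogram, in bits.
          hist, _ = np.histogram(block, bins=bins, range=(0, 256))
          p = hist / hist.sum()
          p = p[p > 0]
          return float(-np.sum(p * np.log2(p)))

      def select_informative_blocks(image, block=64, top_k=20):
          h, w = image.shape
          scored = []
          for y in range(0, h - block + 1, block):
              for x in range(0, w - block + 1, block):
                  scored.append((block_entropy(image[y:y + block, x:x + block]), y, x))
          scored.sort(reverse=True)
          return scored[:top_k]   # (entropy, row, col) of the highest-entropy blocks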

  18. Feature extraction algorithm for space targets based on fractal theory

    Science.gov (United States)

    Tian, Balin; Yuan, Jianping; Yue, Xiaokui; Ning, Xin

    2007-11-01

    In order to offer the potential for extending the life of satellites and reducing launch and operating costs, satellite servicing, including conducting repairs, upgrading, and refueling spacecraft on orbit, will become much more frequent. Future space operations can be executed more economically and reliably using machine vision systems, which can meet the real-time and tracking reliability requirements of image tracking for a space surveillance system. Machine vision has been applied to research on the relative pose of spacecraft, and the feature extraction algorithm is the basis of relative pose estimation. In this paper, a fractal-geometry-based edge extraction algorithm is presented that can be used in determining and tracking the relative pose of an observed satellite during proximity operations in a machine vision system. The method obtains the fractal-dimension distribution of the gray-level image using the Differential Box-Counting (DBC) approach of fractal theory to restrain the noise. After this, we detect consecutive edges using mathematical morphology. The validity of the proposed method is examined by processing and analyzing images of space targets. The edge extraction method not only extracts the outline of the target but also keeps the inner details. Meanwhile, edge extraction is processed only in the moving area, which greatly reduces computation. Simulation results compare edge detection using the presented method with other detection methods. The results indicate that the presented algorithm is a valid method for solving the relative pose problem for spacecraft.
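
    A small Python sketch of the Differential Box-Counting (DBC) estimate mentioned above, computing a fractal dimension from the slope of log N_r versus log(1/r) over several box sizes; the box sizes and the assumption of 256 gray levels are illustrative, and this is a generic DBC illustration rather than the paper's implementation.

      import numpy as np

      def dbc_fractal_dimension(img, box_sizes=(2, 4, 8, 16)):
          # img: square gray-level image with values 0..255
          M = img.shape[0]
          G = 256.0                                  # number of gray levels
          log_inv_r, log_Nr = [], []
          for s in box_sizes:
              h = s * G / M                          # box height in gray-level units
              Nr = 0
              for y in range(0, M - s + 1, s):
                  for x in range(0, M - s + 1, s):
                      block = img[y:y + s, x:x + s]
                      Nr += int(np.ceil(block.max() / h) - np.ceil(block.min() / h)) + 1
              log_inv_r.append(np.log(M / s))
              log_Nr.append(np.log(Nr))
          slope, _ = np.polyfit(log_inv_r, log_Nr, 1)
          return slope                               # estimated fractal dimension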

  19. Guidance, Navigation, and Control Techniques and Technologies for Active Satellite Removal

    Science.gov (United States)

    Ortega Hernando, Guillermo; Erb, Sven; Cropp, Alexander; Voirin, Thomas; Dubois-Matra, Olivier; Rinalducci, Antonio; Visentin, Gianfranco; Innocenti, Luisa; Raposo, Ana

    2013-09-01

    This paper presents an internal feasibility analysis, by the Technical Directorate of the European Space Agency (ESA), of de-orbiting a non-functional satellite of large dimensions. The paper focuses specifically on the design of the techniques and technologies for the Guidance, Navigation, and Control (GNC) system of the spacecraft mission that will capture the satellite and ultimately de-orbit it in a controlled re-entry. The paper explains the guidance strategies to launch, rendezvous with, closely approach, and capture the target satellite. The guidance strategy uses chaser manoeuvres, hold points, and collision avoidance trajectories to ensure a safe capture. It also details the guidance profile to de-orbit the satellite in a controlled re-entry. The paper continues with an analysis of the required sensing suite and the navigation algorithms to allow the homing, fly-around, and capture of the target satellite. The emphasis is placed on the design of a system that allows rendezvous with an uncooperative target, including the autonomous acquisition of both the orbital elements and the attitude of the target satellite. Analysing the capture phase, the paper provides a trade-off between two selected capture systems: the net and the tentacles. Both are studied from the point of view of the GNC system. The paper also analyses the advanced algorithms proposed to control the final compound after capture, which will allow controlled de-orbiting of the assembly to a safe area on the Earth. The paper ends by proposing the continuation of this work with an extension to the analysis of the destruction process of the compound in consecutive segments, from the entry gate to rupture and break-up.

  20. Algorithms for the Computation of Debris Risk

    Science.gov (United States)

    Matney, Mark J.

    2017-01-01

    Determining the risks from space debris involves a number of statistical calculations. These calculations inevitably involve assumptions about geometry, including the physical geometry of orbits and the geometry of satellites. A number of tools have been developed in NASA’s Orbital Debris Program Office to handle these calculations, many of which have never been published before. These include algorithms that are used in NASA’s Orbital Debris Engineering Model ORDEM 3.0, as well as other tools useful for computing orbital collision rates and ground casualty risks. This paper presents an introduction to these algorithms and the assumptions upon which they are based.
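
    A minimal Python sketch of a standard debris-risk relation of the kind implied above: expected impacts N = flux × cross-sectional area × exposure time, with collision probability P = 1 − exp(−N) under a Poisson assumption; this is a generic illustration, not ORDEM's internal algorithms, and the example numbers are hypothetical.

      import math

      def collision_probability(flux_per_m2_per_year, cross_section_m2, years):
          # Expected number of impacts over the exposure, then Poisson probability
          # of at least one impact.
          n_expected = flux_per_m2_per_year * cross_section_m2 * years
          return 1.0 - math.exp(-n_expected)

      print(collision_probability(1e-5, 20.0, 5.0))   # example spacecraft exposure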