WorldWideScience

Sample records for ground sensor fusion

  1. Environmental Perception and Sensor Data Fusion for Unmanned Ground Vehicle

    Directory of Open Access Journals (Sweden)

    Yibing Zhao

    2013-01-01

    Full Text Available Unmanned Ground Vehicles (UGVs) that can drive autonomously in cross-country environments have received a good deal of attention in recent years. They must be able to determine whether the current terrain is traversable using onboard sensors. This paper explores new methods of environment perception based on computer image processing, pattern recognition, multisensor data fusion, and multidisciplinary theory. A Kalman filter is used for low-level fusion at the physical level, and D-S evidence theory is used for high-level data fusion. A probability test and a Gaussian mixture model are proposed to obtain the traversable region in the UGV's forward-facing camera view. One feature set including color and texture information is extracted from areas of interest and combined with a classifier to distinguish two types of terrain (traversable or not). Three-dimensional data are also employed; the feature set contains components such as the distance contrast of the three-dimensional data, the edge chain-code curvature of the camera image, and a covariance matrix based on the principal component method. The paper puts forward a new method for distributing the basic probability assignment (BPA), on the basis of which D-S evidence theory is employed to integrate sensor information and recognize obstacles. The membership degree obtained by fuzzy interpolation is used to calculate the basic probability assignment; the membership degree is assumed equal to the correlation coefficient in the formula. More accurate object identification results are achieved by using D-S evidence theory. Motion-behavior control and autonomous navigation for the UGV are based on this method, which is necessary for high-speed driving in cross-country environments. The experimental results demonstrate the viability of the new method.
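
    The evidence-combination step this record relies on can be sketched generically. Below is a minimal implementation of Dempster's rule of combination over a two-hypothesis frame (obstacle vs. free terrain); the sensor names and mass values are illustrative assumptions, not the paper's actual BPAs.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (BPAs) over the same
    frame of discernment with Dempster's rule; mass on the empty
    intersection (conflict) is renormalised away."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    k = 1.0 - conflict
    return {fs: m / k for fs, m in combined.items()}

OBST, FREE = frozenset({"obstacle"}), frozenset({"free"})
BOTH = OBST | FREE  # total ignorance

# Illustrative BPAs, e.g. derived from fuzzy memberships of two sensors
m_cam = {OBST: 0.6, FREE: 0.1, BOTH: 0.3}
m_lidar = {OBST: 0.7, FREE: 0.2, BOTH: 0.1}

fused = dempster_combine(m_cam, m_lidar)
```

    With two agreeing sensors, the fused mass on "obstacle" exceeds either input mass, which is the behaviour the abstract exploits for obstacle recognition.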

  2. Convolutional neural network based sensor fusion for forward looking ground penetrating radar

    Science.gov (United States)

    Sakaguchi, Rayn; Crosskey, Miles; Chen, David; Walenz, Brett; Morton, Kenneth

    2016-05-01

    Forward looking ground penetrating radar (FLGPR) is an alternative buried threat sensing technology designed to offer additional standoff compared to downward looking GPR systems. Due to additional flexibility in antenna configurations, FLGPR systems can accommodate multiple sensor modalities on the same platform that provide complementary information. The different sensor modalities present challenges both in developing informative feature extraction methods and in fusing sensor information to obtain the best discrimination performance. This work uses convolutional neural networks to jointly learn features across two sensor modalities and fuse the information to distinguish between target and non-target regions. This joint optimization is made possible by modifying the traditional image-based convolutional neural network configuration to extract data from multiple sources. The filters generated by this process create a learned feature extraction method that is optimized to provide the best discrimination performance when fused. This paper presents the results of applying convolutional neural networks and compares these results to fusion performed with a linear classifier. It also compares performance between convolutional neural network architectures to show the benefit of fusing the sensor information in different ways.
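
    The joint two-branch idea can be sketched without a deep-learning framework: each modality passes through its own convolution filter with ReLU and max-pooling, and the pooled responses are fused by a shared linear layer. All filters, weights, and signals below are hand-fixed stand-ins for learned parameters, not the authors' network.

```python
import numpy as np

def conv1d(x, k):
    """Valid-mode 1-D correlation, standing in for a learned conv layer."""
    n = len(x) - len(k) + 1
    return np.array([np.dot(x[i:i + len(k)], k) for i in range(n)])

def two_branch_score(sig_a, sig_b, k_a, k_b, w):
    """Each sensor branch: conv -> ReLU -> max-pool; then a shared
    linear layer fuses the two pooled features into one confidence."""
    f_a = np.maximum(conv1d(sig_a, k_a), 0.0).max()
    f_b = np.maximum(conv1d(sig_b, k_b), 0.0).max()
    return w[0] * f_a + w[1] * f_b + w[2]

k_edge = np.array([-1.0, 1.0])            # responds to rising edges
w = np.array([1.0, 1.0, 0.0])             # equal-weight fusion, no bias
target = np.array([0.0, 0.0, 1.0, 1.0])   # both modalities see an edge
clutter = np.zeros(4)

s_target = two_branch_score(target, target, k_edge, k_edge, w)
s_clutter = two_branch_score(clutter, clutter, k_edge, k_edge, w)
```

    A real implementation would learn the filters and fusion weights jointly by backpropagation; the point of the sketch is only the shared-score architecture.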

  3. Fusion of ground-penetrating radar and electromagnetic induction sensors for landmine detection and discrimination

    Science.gov (United States)

    Kolba, Mark P.; Torrione, Peter A.; Collins, Leslie M.

    2010-04-01

    Ground penetrating radar (GPR) and electromagnetic induction (EMI) sensors provide complementary capabilities in detecting buried targets such as landmines, suggesting that the fusion of GPR and EMI modalities may provide improved detection performance over that obtained using only a single modality. This paper considers both pre-screening and the discrimination of landmines from non-landmine objects using real landmine data collected from a U.S. government test site as part of the Autonomous Mine Detection System (AMDS) landmine program. GPR and EMI pre-screeners are first reviewed and then a fusion pre-screener is presented that combines the GPR and EMI pre-screeners using a distance-based likelihood ratio test (DLRT) classifier to produce a fused confidence for each pre-screener alarm. The fused pre-screener is demonstrated to provide substantially improved performance over the individual GPR and EMI pre-screeners. The discrimination of landmines from non-landmine objects using feature-based classifiers is also considered. The GPR feature utilized is a pre-processed, spatially filtered normalized energy metric. Features used for the EMI sensor include model-based features generated from the AETC model and a dipole model as well as features from a matched subspace detector. The EMI and GPR features are then fused using a random forest classifier. The fused classifier performance is superior to the performance of classifiers using GPR or EMI features alone, again indicating that performance improvements may be obtained through the fusion of GPR and EMI sensors. The performance improvements obtained both for pre-screening and for discrimination have been verified by blind test results scored by an independent U.S. government contractor.
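
    The DLRT fusion step can be sketched in a few lines: an alarm's (GPR, EMI) confidence pair is scored by comparing its k-th nearest-neighbour distance to labelled target and clutter training points. The training confidences below are synthetic illustrations, not the AMDS data.

```python
import numpy as np

def dlrt_confidence(x, targets, clutter, k=1):
    """Distance-based likelihood ratio test (DLRT): compare the k-th
    nearest-neighbour distance from alarm feature x to each training
    class; positive output favours the target hypothesis."""
    d_t = np.sort(np.linalg.norm(targets - x, axis=1))[k - 1]
    d_c = np.sort(np.linalg.norm(clutter - x, axis=1))[k - 1]
    return float(np.log(d_c / d_t))

rng = np.random.default_rng(0)
# Synthetic (GPR, EMI) pre-screener confidence pairs for alarms
targets = rng.normal([0.8, 0.7], 0.1, size=(50, 2))
clutter = rng.normal([0.2, 0.3], 0.1, size=(50, 2))

conf = dlrt_confidence(np.array([0.75, 0.70]), targets, clutter)
```

    Thresholding this fused confidence gives the ROC trade-off that the paper reports improving over either single-sensor pre-screener.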

  4. Robust Ground Target Detection by SAR and IR Sensor Fusion Using Adaboost-Based Feature Selection.

    Science.gov (United States)

    Kim, Sungho; Song, Woo-Jin; Kim, So-Hyun

    2016-07-19

    Long-range ground targets are difficult to detect in a noisy, cluttered environment using either synthetic aperture radar (SAR) images or infrared (IR) images. SAR-based detectors can provide a high detection rate but suffer a high false alarm rate due to background scatter noise. IR-based approaches can detect hot targets but are strongly affected by weather conditions. This paper proposes a novel target detection method by decision-level SAR and IR fusion using an Adaboost-based machine learning scheme to achieve a high detection rate and a low false alarm rate. The proposed method consists of an individual detection, registration, and fusion architecture. This paper presents a single framework for SAR and IR target detection using modified Boolean map visual theory (modBMVT) and feature-selection based fusion. Previous methods applied different algorithms to detect SAR and IR targets because of the different physical image characteristics; a method optimized for IR target detection produces unsuccessful results in SAR target detection. This study examined the image characteristics and proposed a unified SAR and IR target detection method by inserting a median local average filter (MLAF, pre-filter) and an asymmetric morphological closing filter (AMCF, post-filter) into the BMVT. The original BMVT was optimized to detect small infrared targets. The proposed modBMVT can remove thermal and scatter noise with the MLAF and detect extended targets by attaching the AMCF after the BMVT. Heterogeneous SAR and IR images were registered automatically using the proposed RANdom SAmple Region Consensus (RANSARC)-based homography optimization after a brute-force correspondence search using the detected target centers and regions. The final targets were detected by feature-selection based sensor fusion using Adaboost. The proposed method showed good SAR and IR target detection performance through feature selection-based decision fusion on a synthetic database generated
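
    The Adaboost-based decision fusion stage can be illustrated with a tiny stump-based booster over two detector scores. The synthetic (SAR, IR) confidences below are assumptions for the sketch, not the paper's database or its feature set.

```python
import numpy as np

def train_adaboost(X, y, rounds=5):
    """Tiny AdaBoost over threshold stumps. X: (n, d) feature matrix,
    e.g. one SAR score and one IR score per candidate detection;
    y: labels in {-1, +1}. Returns [(feature, threshold, sign, alpha)]."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        best = None
        for f in range(d):
            for thr in np.unique(X[:, f]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, f] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, f, thr, sign, pred)
        err, f, thr, sign, pred = best
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        w = w * np.exp(-alpha * y * pred)   # reweight hard examples
        w = w / w.sum()
        ensemble.append((f, thr, sign, alpha))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(alpha * sign * np.where(X[:, f] >= thr, 1, -1)
                for f, thr, sign, alpha in ensemble)
    return np.sign(score)

rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0.9, 0.8], 0.05, size=(20, 2)),   # targets
               rng.normal([0.1, 0.2], 0.05, size=(20, 2))])  # clutter
y = np.concatenate([np.ones(20), -np.ones(20)])
model = train_adaboost(X, y)
```

    Because boosting weights each stump by its accuracy, the final score implicitly selects the more informative sensor feature, which is the "feature-selection based fusion" idea in the abstract.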

  5. Robust Ground Target Detection by SAR and IR Sensor Fusion Using Adaboost-Based Feature Selection

    Directory of Open Access Journals (Sweden)

    Sungho Kim

    2016-07-01

    Full Text Available Long-range ground targets are difficult to detect in a noisy cluttered environment using either synthetic aperture radar (SAR) images or infrared (IR) images. SAR-based detectors can provide a high detection rate with a high false alarm rate to background scatter noise. IR-based approaches can detect hot targets but are affected strongly by the weather conditions. This paper proposes a novel target detection method by decision-level SAR and IR fusion using an Adaboost-based machine learning scheme to achieve a high detection rate and low false alarm rate. The proposed method consists of individual detection, registration, and fusion architecture. This paper presents a single framework of a SAR and IR target detection method using modified Boolean map visual theory (modBMVT) and feature-selection based fusion. Previous methods applied different algorithms to detect SAR and IR targets because of the different physical image characteristics. One method that is optimized for IR target detection produces unsuccessful results in SAR target detection. This study examined the image characteristics and proposed a unified SAR and IR target detection method by inserting a median local average filter (MLAF, pre-filter) and an asymmetric morphological closing filter (AMCF, post-filter) into the BMVT. The original BMVT was optimized to detect small infrared targets. The proposed modBMVT can remove the thermal and scatter noise by the MLAF and detect extended targets by attaching the AMCF after the BMVT. Heterogeneous SAR and IR images were registered automatically using the proposed RANdom SAmple Region Consensus (RANSARC)-based homography optimization after a brute-force correspondence search using the detected target centers and regions. The final targets were detected by feature-selection based sensor fusion using Adaboost. The proposed method showed good SAR and IR target detection performance through feature selection-based decision fusion on a synthetic

  7. Sensor Data Fusion

    DEFF Research Database (Denmark)

    Plascencia, Alfredo; Stepán, Petr

    2006-01-01

    The main contribution of this paper is to present a sensor fusion approach to scene environment mapping as part of a Sensor Data Fusion (SDF) architecture. This approach combines sonar array readings with stereo vision. Sonar readings are interpreted using probability density functions... to the occupied and empty regions. Scale Invariant Feature Transform (SIFT) feature descriptors are interpreted using Gaussian probabilistic error models. The use of occupancy grids is proposed for representing the sensor readings. The Bayesian estimation approach is applied to update the sonar array... and the SIFT descriptors' uncertainty grids. The sensor fusion yields a significant reduction in the uncertainty of the occupancy grid compared to the individual sensor readings.
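
    The occupancy-grid update described here has a standard log-odds form, sketched below. The two small probability grids stand in for the sonar and SIFT/vision uncertainty grids; the values are illustrative.

```python
import numpy as np

def logodds(p):
    return np.log(p / (1.0 - p))

def fuse_grids(readings, prior=0.5):
    """Bayesian occupancy-grid fusion: with independent sensor models,
    per-cell log-odds simply add. `readings` is a list of occupancy
    probability grids, one per sensor (e.g. sonar and SIFT/vision)."""
    L = np.full(readings[0].shape, logodds(prior))
    for grid in readings:
        L = L + logodds(grid) - logodds(prior)
    return 1.0 / (1.0 + np.exp(-L))   # back to probabilities

sonar = np.array([[0.7, 0.5], [0.3, 0.6]])
vision = np.array([[0.8, 0.5], [0.4, 0.6]])
fused = fuse_grids([sonar, vision])
```

    Where the sensors agree, the fused cell probability moves further from 0.5 than either input, which is exactly the uncertainty reduction the abstract reports.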

  8. Aspects of sensor data fusion in interoperable ISR systems of systems for wide-area ground surveillance

    Science.gov (United States)

    Koch, Wolfgang; Ulmke, Martin; Biermann, Joachim; Sielemann, Marion

    2010-04-01

    Within the context of C4ISTAR information "systems of systems", we discuss sensor data fusion aspects aimed at generating higher-level information according to the JDL model of data fusion. In particular, two issues are addressed: (1) Tracking-derived situation elements: standard target tracking applications gain information related to 'Level 1 Fusion' in the well-established terminology of the JDL model. Kinematic data of this type, however, are by no means the only information to be derived from target tracks. In many cases, reliable and quantitative higher-level information according to the JDL terminology can be obtained. (2) Anomaly detection in tracking databases: anomaly detection can be regarded as a process of information fusion that focuses the attention of human decision makers or decision-making systems on particular events that are "irregular" or may cause harm and thus require special actions.

  9. Soldier systems sensor fusion

    Science.gov (United States)

    Brubaker, Kathryne M.

    1998-08-01

    This paper addresses sensor fusion and its applications in emerging Soldier Systems integration and the unique challenges associated with the human platform. Technology that provides the highest operational payoff in a lightweight warrior system must not only have enhanced capabilities, but also low-power components, yielding order-of-magnitude power reductions coupled with significant cost reductions. These reductions in power and cost will be achieved through partnership with industry and by leveraging commercial state-of-the-art advancements in microelectronics and power sources. A new generation of full-solution fire control systems (including temperature, wind and range sensors) and target acquisition systems will accompany a new generation of individual combat weapons and upgrade existing weapon systems. Advanced lightweight thermal, IR, laser and video sensors will be used for surveillance, target acquisition, imaging and combat identification applications. Multifunctional sensors will provide embedded training features in combat configurations, allowing the soldier to 'train as he fights' without the traditional cost and weight penalties associated with separate systems. Personal status monitors (detecting pulse, respiration rate, muscle fatigue, core temperature, etc.) will provide commanders and the highest echelons with instantaneous medical data. Seamless integration of GPS and dead reckoning (compass and pedometer) and/or inertial sensors will aid navigation and increase position accuracy. Improved sensors and processing capability will provide earlier detection of battlefield hazards such as mines, enemy lasers and NBC (nuclear, biological, chemical) agents. Via the digitized network, the situational awareness database will automatically be updated with weapon, medical, position and battlefield hazard data. Soldier Systems Sensor Fusion will ultimately establish each individual soldier as a sensor on the battlefield.

  10. Sensor fusion for social robotics

    OpenAIRE

    Duffy, Brian R.; Garcia, C; Rooney, Colm, (Thesis); O'Hare, G.M.P.

    2000-01-01

    This paper advocates the application of sensor fusion for the visualisation of social robotic behaviour. Experiments with the Virtual Reality Workbench integrate the key elements of Virtual Reality and robotics in a coherent and systematic manner. The deliberative focusing of attention and sensor fusion between vision systems and sonar sensors is implemented on autonomous mobile robots functioning in standard office environments.

  11. An improved DS acoustic-seismic modality fusion algorithm based on a new cascaded fuzzy classifier for ground-moving targets classification in wireless sensor networks

    Science.gov (United States)

    Pan, Qiang; Wei, Jianming; Cao, Hongbing; Li, Na; Liu, Haitao

    2007-04-01

    A new cascaded fuzzy classifier (CFC) is proposed to perform ground-moving target classification locally at sensor nodes in wireless sensor networks (WSNs). The CFC is composed of three binary fuzzy classifiers (BFCs) in the seismic signal channel and two in the acoustic signal channel, in order to classify persons, light-wheeled (LW) vehicles, and heavy-wheeled (HW) vehicles in the presence of environmental background noise. Based on the CFC, a new basic belief assignment (bba) function is defined for each component BFC to give out a piece of evidence instead of a hard decision label. An evidence generator synthesizes the available evidence from the BFCs into channel evidence, which is further fused temporally. Finally, acoustic-seismic modality fusion using the Dempster-Shafer method is performed. Our implementation gives significantly better performance than an implementation with a majority-voting fusion method in leave-one-out experiments.
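
    The cascade-of-binary-classifiers structure can be sketched with two stages of fuzzy memberships. The features (signal energy, a spectral peak) and the triangular membership parameters below are illustrative assumptions, not the paper's trained classifiers.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c], peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def cascaded_classify(energy, spectral_peak):
    """Cascade of binary fuzzy classifiers: stage 1 separates person
    from vehicle on signal energy; stage 2 separates light- from
    heavy-wheeled vehicles on a spectral feature. Each stage returns a
    soft membership that could serve as a basic belief assignment
    instead of a hard label."""
    m_person = triangular(energy, 0.0, 0.1, 0.4)
    if m_person > 0.5:
        return "person", m_person
    m_light = triangular(spectral_peak, 20.0, 60.0, 100.0)
    if m_light > 0.5:
        return "LW vehicle", m_light
    return "HW vehicle", 1.0 - m_light

label, belief = cascaded_classify(0.8, 60.0)
```

    In the paper these soft outputs feed Dempster-Shafer combination across the acoustic and seismic channels rather than being thresholded directly as here.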

  12. Data Fusion and Sensors Model

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    In this paper, we take a laser range finder based on a synchronized scanner as an example to show how data fusion methods can be applied in sensor model design to obtain more robust output. We also present our view on the relationship among sensor models, data fusion, and system structure, and describe a solution that transforms the parameter space to obtain a linear model for the Kalman filter.

  13. Multi-sensor fusion development

    Science.gov (United States)

    Bish, Sheldon; Rohrer, Matthew; Scheffel, Peter; Bennett, Kelly

    2016-05-01

    The U.S. Army Research Laboratory (ARL) and McQ Inc. are developing a generic sensor fusion architecture that involves several diverse processes working in combination to create a dynamic task-oriented, real-time informational capability. Processes include sensor data collection, persistent and observational data storage, and multimodal and multisensor fusion that includes the flexibility to modify the fusion program rules for each mission. Such a fusion engine lends itself to a diverse set of sensing applications and architectures while using open-source software technologies. In this paper, we describe a fusion engine architecture that combines multimodal and multi-sensor fusion within an Open Standard for Unattended Sensors (OSUS) framework. The modular, plug-and-play architecture of OSUS allows future fusion plugin methodologies to have seamless integration into the fusion architecture at the conceptual and implementation level. Although beyond the scope of this paper, this architecture allows for data and information manipulation and filtering for an array of applications.

  14. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    Science.gov (United States)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions as well as a Monte Carlo analysis capability are included to enable statistical performance evaluations.
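
    The measurement-fusion core that such a toolkit builds on reduces, in the scalar case, to the standard Kalman update. The readings and variances below are illustrative, and the desensitized and sigma-point variants the record describes generalize this same step.

```python
def kalman_update(x, P, z, R):
    """Scalar Kalman measurement update: fold a sensor reading z with
    noise variance R into the state estimate (x, P)."""
    K = P / (P + R)                       # Kalman gain
    return x + K * (z - x), (1.0 - K) * P

x, P = 0.0, 1e6                           # diffuse prior
x, P = kalman_update(x, P, 10.2, 0.5)     # first sensor's reading
x, P = kalman_update(x, P, 9.8, 0.5)      # second, equally accurate sensor
```

    Two equally accurate sensors pull the estimate to their average and halve the posterior variance, the basic payoff of filtering-based fusion.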

  15. Optimal Fusion of Sensors

    DEFF Research Database (Denmark)

    Larsen, Thomas Dall

    within some global frame of reference using a wide variety of sensors providing odometric, inertial and absolute data concerning the robot and its surroundings. Kalman filters have for a long time been widely used to solve this problem. However, when measurements are delayed or the mobile robot...

  16. Sensor fusion for antipersonnel landmine detection, a case study

    NARCIS (Netherlands)

    Breejen, E. den; Schutte, K.; Cremer, F.

    1999-01-01

    In this paper the multi-sensor fusion results obtained within the European research project GEODE (Ground Explosive Ordnance Detection system) are presented. The layout of the test lane and the individual sensors used are described. The implementation of the SCOOP algorithm improves the ROC curves,

  18. A novel fuzzy sensor fusion algorithm

    Institute of Scientific and Technical Information of China (English)

    FU Hua; YANG Yi-kui; MA Ke; LIU Yu-jia

    2011-01-01

    A novel fusion algorithm is presented based on fuzzy similarity and fuzzy integral theory. First, it calculates the fuzzy similarity between a sensor's measurement values and the multiple sensors' objective prediction values to determine the importance weight of each sensor and realize multi-sensor data fusion. Then, according to the determined importance weights, an intelligent fusion system based on fuzzy integral theory is given, which can solve FEI-DEO and DEI-DEO fusion problems and realize decision fusion. Simulation results proved that the fuzzy integral algorithm enhances the capability of handling uncertain information and improves the degree of intelligence.

  19. Sensor fusion for airborne landmine detection

    Science.gov (United States)

    Schatten, Miranda A.; Gader, Paul D.; Bolton, Jeremy; Zare, Alina; Mendez-Vasquez, Andres

    2006-05-01

    Sensor fusion has become a vital research area for mine detection because of the countermine community's conclusion that no single sensor is capable of detecting mines at the necessary detection and false alarm rates over a wide variety of operating conditions. The U.S. Army Night Vision and Electronic Sensors Directorate (NVESD) evaluates sensors and algorithms for use in a multi-sensor, multi-platform airborne detection modality. A large dataset of hyperspectral and radar imagery exists from the four major data collections performed at U.S. Army temperate and arid testing facilities in Autumn 2002, Spring 2003, Summer 2004, and Summer 2005. A number of algorithm developers are working on single-sensor algorithms in order to optimize feature and classifier selection for each sensor type. However, a given sensor/algorithm system has an absolute limitation based on the physical phenomena that the system is capable of sensing. Therefore, we perform decision-level fusion of the outputs from single-channel algorithms, choosing to combine systems whose information is complementary across operating conditions. That way, the final fused system will be robust to a variety of conditions, which is a critical property of a countermine detection system. In this paper, we present the analysis of fusion algorithms on data from a sensor suite consisting of high-frequency radar imagery combined with hyperspectral long-wave infrared imagery. The main type of fusion considered is Choquet integral fusion. We evaluate the performance achieved using the Choquet integral method for sensor fusion versus Boolean and soft 'and' and 'or' rules, the mean, and majority voting.
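
    The discrete Choquet integral at the heart of this comparison is short to state. The fuzzy measure values below (expressing synergy between a radar and a hyperspectral channel) are illustrative assumptions, not the measures learned in the paper.

```python
def choquet_integral(scores, measure):
    """Discrete Choquet integral of per-sensor confidences `scores`
    (dict sensor -> value) with respect to a fuzzy measure `measure`
    (dict frozenset -> value; monotone, full set mapped to 1)."""
    items = sorted(scores.items(), key=lambda kv: kv[1])  # ascending
    names = [k for k, _ in items]
    total, prev = 0.0, 0.0
    for i, (_, val) in enumerate(items):
        coalition = frozenset(names[i:])   # sensors scoring >= val
        total += (val - prev) * measure[coalition]
        prev = val
    return total

# Illustrative fuzzy measure over {radar, hsi}: the pair is worth more
# than the sum of its parts (synergy)
g = {frozenset(): 0.0,
     frozenset({"radar"}): 0.3,
     frozenset({"hsi"}): 0.4,
     frozenset({"radar", "hsi"}): 1.0}

fused = choquet_integral({"radar": 0.9, "hsi": 0.6}, g)
```

    The Choquet integral always lies between the minimum and maximum input confidence, and reduces to a weighted mean when the measure is additive, which is why it subsumes the "and", "or", and mean rules the paper compares against.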

  20. Optimal decision fusion given sensor rules

    Institute of Scientific and Technical Information of China (English)

    Yunmin ZHU; Xiaorong LI

    2005-01-01

    When all the sensor decision rules are known, the optimal distributed decision fusion, which relies only on the joint conditional probability densities, can be derived for very general decision systems, including systems with interdependent sensor observations and any network structure. The result is also valid for m-ary Bayesian decision problems and binary problems under the Neyman-Pearson criterion. Local decision rules of a sensor with communication from other sensors that are optimal for the sensor itself are also presented; they take the form of a generalized likelihood ratio test. Numerical examples reveal an interesting phenomenon: communication between sensors can improve the performance of a sensor's own decision, but cannot guarantee improvement of the global fusion performance when the sensor rules are fixed before fusion.
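
    For fixed sensor rules, the optimal fusion center is a likelihood ratio test on the received decision vector, using only its joint conditional pmfs, which remains valid for dependent sensors. The joint pmfs below are illustrative numbers, not from the paper.

```python
def fusion_rule(p_joint_h1, p_joint_h0, threshold=1.0):
    """Optimal fusion rule given sensor rules: declare H1 for a received
    decision vector u iff the joint likelihood ratio exceeds the
    threshold (Bayes or Neyman-Pearson, depending on how it is set)."""
    return {u: 1 if p_joint_h1[u] >= threshold * p_joint_h0[u] else 0
            for u in p_joint_h1}

# Illustrative joint pmfs over two correlated sensors' decisions (u1, u2)
p_h1 = {(0, 0): 0.05, (0, 1): 0.15, (1, 0): 0.15, (1, 1): 0.65}
p_h0 = {(0, 0): 0.60, (0, 1): 0.15, (1, 0): 0.15, (1, 1): 0.10}

rule = fusion_rule(p_h1, p_h0)
```

    With these numbers the optimal rule happens to coincide with an OR rule; changing the joint pmfs (e.g. stronger correlation under H0) can make AND or other monotone rules optimal instead.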

  1. Sensor Fusion Based Model for Collision Free Mobile Robot Navigation

    Directory of Open Access Journals (Sweden)

    Marwah Almasri

    2015-12-01

    Full Text Available Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each is equipped with various types of sensors such as GPS, camera, infrared and ultrasonic sensors, which are used to observe the surrounding environment. However, these sensors sometimes fail or give inaccurate readings. Therefore, the integration of sensor fusion helps to solve this dilemma and enhances the overall performance. This paper presents collision-free mobile robot navigation based on a fuzzy logic fusion model. Eight distance sensors and a range finder camera are used for the collision avoidance approach, while three ground sensors are used for the line or path following approach. The fuzzy system is composed of nine inputs, which are the eight distance sensors and the camera; two outputs, which are the left and right velocities of the mobile robot’s wheels; and 24 fuzzy rules for the robot’s movement. The Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which includes the collision avoidance based on the fuzzy logic fusion model and the line following robot, has been implemented and tested through simulation and real-time experiments. Various scenarios are presented with static and dynamic obstacles, using one robot and two robots, while avoiding obstacles of different shapes and sizes.
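
    The sensors-to-wheel-velocities mapping can be sketched with a two-rule fuzzy controller. This uses only two distance inputs rather than the paper's nine inputs and 24 rules; the membership shape and velocities are illustrative assumptions.

```python
def near(d, d_max=1.0):
    """Fuzzy membership of 'obstacle near' for one distance reading."""
    return max(0.0, min(1.0, (d_max - d) / d_max))

def fuzzy_drive(left_dist, right_dist, v_max=1.0):
    """Two-rule sketch of fuzzy-fusion steering: an obstacle sensed on
    one side slows the opposite wheel, turning the robot away from it."""
    n_left, n_right = near(left_dist), near(right_dist)
    v_left = v_max * (1.0 - n_right)    # obstacle right -> slow left wheel
    v_right = v_max * (1.0 - n_left)    # obstacle left  -> slow right wheel
    return v_left, v_right

v_l, v_r = fuzzy_drive(0.2, 1.0)        # obstacle close on the left
```

    Because the memberships are graded, the turn sharpens smoothly as an obstacle gets closer, instead of the bang-bang behaviour a crisp threshold would give.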

  2. Sensor Fusion Based Model for Collision Free Mobile Robot Navigation.

    Science.gov (United States)

    Almasri, Marwah; Elleithy, Khaled; Alajlan, Abrar

    2015-12-26

    Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each is equipped with various types of sensors such as GPS, camera, infrared and ultrasonic sensors, which are used to observe the surrounding environment. However, these sensors sometimes fail or give inaccurate readings. Therefore, the integration of sensor fusion helps to solve this dilemma and enhances the overall performance. This paper presents collision-free mobile robot navigation based on a fuzzy logic fusion model. Eight distance sensors and a range finder camera are used for the collision avoidance approach, while three ground sensors are used for the line or path following approach. The fuzzy system is composed of nine inputs, which are the eight distance sensors and the camera; two outputs, which are the left and right velocities of the mobile robot's wheels; and 24 fuzzy rules for the robot's movement. The Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which includes the collision avoidance based on the fuzzy logic fusion model and the line following robot, has been implemented and tested through simulation and real-time experiments. Various scenarios are presented with static and dynamic obstacles, using one robot and two robots, while avoiding obstacles of different shapes and sizes.

  3. Fusion of Noisy Multi-sensor Imagery

    Directory of Open Access Journals (Sweden)

    Anima Mishra

    2008-01-01

    Full Text Available Interest in fusing multiple sensor data for both military and civil applications has been growing. Some of the important applications integrate image information from multiple sensors to aid in navigation guidance, object detection and recognition, medical diagnosis, data compression, etc. While human beings may visually inspect various images and integrate information, it is of interest to develop algorithms that can fuse various input imagery to produce a composite image. Fusion of images from various sensor modalities is expected to produce an output that captures all the relevant information in the input. The standard multi-resolution-based edge fusion scheme is reviewed in this paper. A theoretical framework is given for this edge fusion method by showing how edge fusion can be framed as information maximisation. However, the presence of noise complicates the situation. The framework developed is used to show that for noisy images, all edges no longer correspond to information. In this paper, various techniques are presented for fusion of noisy multi-sensor images. These techniques are developed for a single resolution as well as using multi-resolution decomposition. Some of the techniques are based on modifying edge maps by filtering images, while others depend on alternate definitions of information maps. Both these approaches can also be combined. Experiments show that the proposed algorithms work well for various kinds of noisy multi-sensor images.
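
    A single-resolution version of edge-based fusion can be sketched as a per-pixel choice driven by gradient magnitude. The step-edge test images and the optional pre-filter hook are illustrative; the paper's multi-resolution and information-map variants refine this same idea.

```python
import numpy as np

def grad_mag(img):
    """Per-pixel gradient magnitude, used as a crude edge/information map."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def edge_fuse(a, b, smooth=None):
    """Single-resolution edge-based fusion: keep, at each pixel, the
    input whose local gradient is stronger. For noisy inputs an optional
    `smooth` callable can filter the images before the edge maps are
    compared, in the spirit of the abstract's filtering-based variants."""
    ea = grad_mag(smooth(a) if smooth else a)
    eb = grad_mag(smooth(b) if smooth else b)
    return np.where(ea >= eb, a, b)

a = np.zeros((4, 4)); a[:, 2:] = 1.0   # image with a vertical step edge
b = np.zeros((4, 4))                   # featureless image
fused = edge_fuse(a, b)
```

    Without smoothing, noise spikes would win the gradient comparison, which is precisely the failure mode the abstract's "edges no longer correspond to information" observation describes.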

  4. Sensor Fusion for Autonomous Mobile Robot Navigation

    DEFF Research Database (Denmark)

    Plascencia, Alfredo

    Multi-sensor data fusion is a broad area of constant research which is applied to a wide variety of fields, such as the field of mobile robots. Mobile robots are complex systems where the design and implementation of sensor fusion is a complex task, but research applications are explored constantly. The scope of the thesis is limited to building a map for a laboratory robot by fusing range readings from a sonar array with landmarks extracted from stereo vision images using the Scale Invariant Feature Transform (SIFT) algorithm.

  5. Health-Enabled Smart Sensor Fusion Technology

    Science.gov (United States)

    Wang, Ray

    2012-01-01

    A process was designed to fuse data from multiple sensors in order to make a more accurate estimation of the environment and overall health in an intelligent rocket test facility (IRTF), to provide reliable, high-confidence measurements for a variety of propulsion test articles. The object of the technology is to provide sensor fusion based on a distributed architecture. Specifically, the fusion technology is intended to succeed in providing health condition monitoring capability at the intelligent transceiver, such as RF signal strength, battery reading, computing resource monitoring, and sensor data reading. The technology also provides analytic and diagnostic intelligence at the intelligent transceiver, enhancing the IEEE 1451.x-based standard for sensor data management and distributions, as well as providing appropriate communications protocols to enable complex interactions to support timely and high-quality flow of information among the system elements.

  6. Networked unattended ground sensors assessment

    Science.gov (United States)

    Bouguereau, Julien; Gattefin, Christian; Dupuy, Gilles

    2003-09-01

    Within the framework of the NATO AC 323 / RTO TG 25 group on advanced concepts of acoustic and seismic technology for military applications, the Technical Establishment of Bourges hosted and organized a joint experimental campaign intended to demonstrate the interest of networked unattended ground sensors for vehicle detection and tracking in an area defense context. Having recalled the principle of vehicle tracking, this paper describes the progress of the test campaign and details in particular the sensor and participant deployment, the interoperability solution chosen by the group, and the instrumentation used to acquire, network, process and publish in real time the data available during the test: meteorological data, trajectography data and target detection reports. Finally, some results of the campaign are presented.

  7. Sensor fusion method for machine performance enhancement

    Energy Technology Data Exchange (ETDEWEB)

    Mou, J.I. [Arizona State Univ., Tempe, AZ (United States); King, C.; Hillaire, R. [Sandia National Labs., Livermore, CA (United States). Integrated Manufacturing Systems Center; Jones, S.; Furness, R. [Ford Motor Co., Dearborn, MI (United States)

    1998-03-01

    A sensor fusion methodology was developed to uniquely integrate pre-process, process-intermittent, and post-process measurement and analysis technology to cost-effectively enhance the accuracy and capability of computer-controlled manufacturing equipment. Empirical models and computational algorithms were also developed to model, assess, and then enhance the machine performance.

  8. City Data Fusion: Sensor Data Fusion in the Internet of Things

    OpenAIRE

    2015-01-01

    Internet of Things (IoT) has gained substantial attention recently and plays a significant role in smart city application deployments. A number of such smart city applications depend on sensor fusion capabilities in the cloud from diverse data sources. We introduce the concept of IoT and present in detail ten different parameters that govern our sensor data fusion evaluation framework. We then evaluate the current state-of-the-art in sensor data fusion against our sensor data fusion framework....

  9. Sensor fusion for improved indoor navigation

    Science.gov (United States)

    Emilsson, Erika; Rydell, Joakim

    2012-09-01

    A reliable indoor positioning system providing high accuracy has the potential to significantly increase the safety of first responders and military personnel. To enable navigation in a broad range of environments and obtain more accurate and robust positioning results, we propose a multi-sensor fusion approach. We describe and evaluate a positioning system based on sensor fusion between a foot-mounted inertial measurement unit (IMU) and a camera-based system for simultaneous localization and mapping (SLAM). The complete system provides accurate navigation in many relevant environments without depending on preinstalled infrastructure. The camera-based system uses both inertial measurements and visual data, enabling navigation even in environments and scenarios where one of the sensors provides unreliable data for a few seconds. When sufficient light is available, the camera-based system generally provides good performance. The foot-mounted system provides accurate positioning when distinct steps can be detected, e.g., during walking and running, even in dark or smoke-filled environments. By combining the two systems, the integrated positioning system can be expected to enable accurate navigation in almost all kinds of environments and scenarios. In this paper we present results from initial tests, which show that the proposed sensor fusion improves the navigation solution considerably in scenarios where either the foot-mounted or the camera-based system is unable to navigate on its own.

  10. Multimodal Sensor Fusion for Personnel Detection

    Science.gov (United States)

    2011-07-01

    Multimodal Sensor Fusion for Personnel Detection. Xin Jin, Shalabh Gupta, Asok Ray; Department of Mechanical Engineering, The Pennsylvania State University. Only fragments of this record are recoverable: the authors note that prior work considered relations taken only two at a time, and propose exploring relations between higher-order cliques as future work.

  11. A comparison of decision-level sensor-fusion methods for anti-personnel landmine detection.

    NARCIS (Netherlands)

    Schutte, K.; Schavemaker, J.G.M.; Cremer, F.; Breejen, E. den

    2001-01-01

    We present the sensor-fusion results obtained from measurements within the European research project Ground Explosive Ordnance DEtection (GEODE) that strives for the realisation of a vehicle-mounted, multi-sensor, anti-personnel landmine-detection system for humanitarian de-mining.

  12. Infrared processing and sensor fusion for anti-personnel land-mine detection

    NARCIS (Netherlands)

    Schavemaker, J.G.M.; Cremer, F.; Schutte, K.; Breejen, E. den

    2000-01-01

    In this paper we present the results of infrared processing and sensor fusion obtained within the European research project GEODE (Ground Explosive Ordnance DEtection) that strives for the realization of a vehicle-mounted, multi-sensor anti-personnel land-mine detection system for humanitarian demining.

  14. Non-verbal communication through sensor fusion

    Science.gov (United States)

    Tairych, Andreas; Xu, Daniel; O'Brien, Benjamin M.; Anderson, Iain A.

    2016-04-01

    When we communicate face to face, we subconsciously engage our whole body to convey our message. In telecommunication, e.g. during phone calls, this powerful information channel cannot be used. Capturing nonverbal information from body motion and transmitting it to the receiver in parallel with speech would make these conversations feel much more natural. This requires a sensing device that is capable of capturing different types of movements, such as the flexion and extension of joints, and the rotation of limbs. In a first embodiment, we developed a sensing glove that is used to control a computer game. Capacitive dielectric elastomer (DE) sensors measure finger positions, and an inertial measurement unit (IMU) detects hand roll. These two sensor technologies complement each other, with the IMU allowing the player to move an avatar through a three-dimensional maze, and the DE sensors detecting finger flexion to fire weapons or open doors. After demonstrating the potential of sensor fusion in human-computer interaction, we take this concept to the next level and apply it to nonverbal communication between humans. The current fingerspelling glove prototype uses capacitive DE sensors to detect finger gestures performed by the sending person. These gestures are mapped to corresponding messages and transmitted wirelessly to another person. A concept for integrating an IMU into this system is presented. The fusion of the DE sensor and the IMU combines the strengths of both sensor types, and therefore enables very comprehensive body motion sensing, which makes a large repertoire of gestures available to nonverbal communication over distances.

  15. Sensor data fusion to predict multiple soil properties

    NARCIS (Netherlands)

    Mahmood, H.S.; Hoogmoed, W.B.; Henten, van E.J.

    2012-01-01

    The accuracy of a single sensor is often low because all proximal soil sensors respond to more than one soil property of interest. Sensor data fusion can potentially overcome this inability of a single sensor and can best extract useful and complementary information from multiple sensors or sources.
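    The record above describes combining readings from multiple proximal soil sensors. As a minimal, hedged illustration of one standard approach (not the authors' method), independent readings of the same property can be combined by inverse-variance weighting, which yields the minimum-variance fused estimate:

```python
def fuse_measurements(values, variances):
    """Inverse-variance weighted fusion of independent sensor readings
    of the same quantity; returns the fused value and its variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * x for w, x in zip(weights, values)) / total
    return fused, 1.0 / total

# Example: two sensors read 10.0 and 12.0 with equal variance 1.0
# -> fused value 11.0 with reduced variance 0.5
```

    Note how the fused variance is always smaller than the best individual sensor's variance, which is the formal sense in which fusion "overcomes the inability of a single sensor."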

  16. Sensor Fusion and Smart Sensor in Sports and Biomedical Applications

    Directory of Open Access Journals (Sweden)

    José Jair Alves Mendes Jr.

    2016-09-01

    The following work presents an overview of smart sensors and sensor fusion targeted at biomedical applications and sports areas. In this work, the integration of these areas is demonstrated, promoting a reflection about techniques and applications to collect, quantify and qualify some physical variables associated with the human body. These techniques are presented in various biomedical and sports applications, which cover areas related to diagnostics, rehabilitation, physical monitoring, and the development of performance in athletes, among others. Although some applications are described in only one of two fields of study (biomedicine and sports), it is very likely that the same application fits in both, with small peculiarities or adaptations. To illustrate the contemporaneity of applications, an analysis of specialized papers published in the last six years has been made. In this context, the main characteristic of this review is to present the largest quantity of relevant examples of sensor fusion and smart sensors focusing on their utilization and proposals, without deeply addressing one specific system or technique, to the detriment of the others.

  17. Sensor Fusion and Smart Sensor in Sports and Biomedical Applications

    Science.gov (United States)

    Mendes, José Jair Alves; Vieira, Mário Elias Marinho; Pires, Marcelo Bissi; Stevan, Sergio Luiz

    2016-01-01

    The following work presents an overview of smart sensors and sensor fusion targeted at biomedical applications and sports areas. In this work, the integration of these areas is demonstrated, promoting a reflection about techniques and applications to collect, quantify and qualify some physical variables associated with the human body. These techniques are presented in various biomedical and sports applications, which cover areas related to diagnostics, rehabilitation, physical monitoring, and the development of performance in athletes, among others. Although some applications are described in only one of two fields of study (biomedicine and sports), it is very likely that the same application fits in both, with small peculiarities or adaptations. To illustrate the contemporaneity of applications, an analysis of specialized papers published in the last six years has been made. In this context, the main characteristic of this review is to present the largest quantity of relevant examples of sensor fusion and smart sensors focusing on their utilization and proposals, without deeply addressing one specific system or technique, to the detriment of the others. PMID:27669260

  18. Desensitized Optimal Filtering and Sensor Fusion Tool Kit Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Research on desensitized optimal filtering techniques and a navigation and sensor fusion tool kit using advanced filtering techniques is proposed. Research focuses...

  19. Sensor Fusion of Force and Acceleration for Robot Force Control

    OpenAIRE

    Gámez García, Javier; Robertsson, Anders; Gómez Ortega, Juan; Johansson, Rolf

    2004-01-01

    In this paper, robotic sensor fusion of acceleration and force measurements is considered. We discuss the problem of using accelerometers close to the end-effectors of robotic manipulators and how it may improve force control performance. We introduce a new model-based observer approach to sensor fusion of information from various different sensors. During contact transition, accelerometers and force sensors play a very important role, and their fusion can overcome many of the difficulties of uncerta...

  20. Fluorescent sensors based on bacterial fusion proteins

    Science.gov (United States)

    Prats Mateu, Batirtze; Kainz, Birgit; Pum, Dietmar; Sleytr, Uwe B.; Toca-Herrera, José L.

    2014-06-01

    Fluorescence proteins are widely used as markers for biomedical and technological purposes. Therefore, the aim of this project was to create a fluorescent sensor, based in the green and cyan fluorescent protein, using bacterial S-layers proteins as scaffold for the fluorescent tag. We report the cloning, expression and purification of three S-layer fluorescent proteins: SgsE-EGFP, SgsE-ECFP and SgsE-13aa-ECFP, this last containing a 13-amino acid rigid linker. The pH dependence of the fluorescence intensity of the S-layer fusion proteins, monitored by fluorescence spectroscopy, showed that the ECFP tag was more stable than EGFP. Furthermore, the fluorescent fusion proteins were reassembled on silica particles modified with cationic and anionic polyelectrolytes. Zeta potential measurements confirmed the particle coatings and indicated their colloidal stability. Flow cytometry and fluorescence microscopy showed that the fluorescence of the fusion proteins was pH dependent and sensitive to the underlying polyelectrolyte coating. This might suggest that the fluorescent tag is not completely exposed to the bulk media as an independent moiety. Finally, it was found out that viscosity enhanced the fluorescence intensity of the three fluorescent S-layer proteins.

  1. Phase 1 report on sensor technology, data fusion and data interpretation for site characterization

    Energy Technology Data Exchange (ETDEWEB)

    Beckerman, M.

    1991-10-01

    In this report we discuss sensor technology, data fusion, and data interpretation approaches of potentially greatest usefulness for subsurface imaging and characterization of landfill waste sites. Two sensor technologies, terrain conductivity using electromagnetic induction and ground-penetrating radar, are described and the literature on the subject is reviewed. We identify the maximum entropy stochastic method as one providing a rigorously justifiable framework for fusing the sensor data, briefly summarize work done by us in this area, and examine some of the outstanding issues with regard to data fusion and interpretation. 25 refs., 17 figs.

  2. Advances in multi-sensor data fusion: algorithms and applications.

    Science.gov (United States)

    Dong, Jiang; Zhuang, Dafang; Huang, Yaohuan; Fu, Jingying

    2009-01-01

    With the development of satellite and remote sensing techniques, more and more image data from airborne/satellite sensors have become available. Multi-sensor image fusion seeks to combine information from different images to obtain more inferences than can be derived from a single sensor. In image-based application fields, image fusion has emerged as a promising research area since the end of the last century. The paper presents an overview of recent advances in multi-sensor satellite image fusion. Firstly, the most popular existing fusion algorithms are introduced, with emphasis on their recent improvements. Advances in main applications fields in remote sensing, including object identification, classification, change detection and maneuvering targets tracking, are described. Both advantages and limitations of those applications are then discussed. Recommendations are addressed, including: (1) Improvements of fusion algorithms; (2) Development of "algorithm fusion" methods; (3) Establishment of an automatic quality assessment scheme.
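    As a hedged sketch of the kind of pixel-level fusion rule surveyed in this literature (a generic choose-max rule, not a specific algorithm from this review), two co-registered grayscale images can be fused by keeping, at each pixel, the input sample with the larger local contrast:

```python
import numpy as np

def laplacian(img):
    # 4-neighbour discrete Laplacian via array shifts (edges wrap; acceptable for a sketch)
    return (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
            np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)

def max_detail_fusion(a, b):
    """Choose-max fusion: at each pixel, keep the input whose local
    contrast (absolute Laplacian response) is larger."""
    mask = np.abs(laplacian(a)) >= np.abs(laplacian(b))
    return np.where(mask, a, b)
```

    The same choose-max idea underlies many multiresolution fusion schemes, where it is applied per band of a pyramid or wavelet decomposition rather than directly on pixels.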

  3. Tracking and sensor data fusion methodological framework and selected applications

    CERN Document Server

    Koch, Wolfgang

    2013-01-01

    Sensor Data Fusion is the process of combining incomplete and imperfect pieces of mutually complementary sensor information in such a way that a better understanding of an underlying real-world phenomenon is achieved. Typically, this insight is either unobtainable otherwise or a fusion result exceeds what can be produced from a single sensor output in accuracy, reliability, or cost. This book provides an introduction to Sensor Data Fusion, as an information technology as well as a branch of engineering science and informatics. Part I presents a coherent methodological framework, thus providing th

  4. Presence detection under optimum fusion in an ultrasonic sensor system.

    Science.gov (United States)

    Srinivasan, Sriram; Pandharipande, Ashish

    2012-04-01

    Reliable presence detection is a requirement in energy-efficient occupancy-adaptive indoor lighting systems. A system of multiple ultrasonic sensors is considered for presence detection, and the performance gain from optimum fusion is studied. Two cases are considered wherein an individual sensor determines presence based on (i) local detection by processing echoes at its receiver, and (ii) the optimum Chair-Varshney fusion rule using multiple sensor detection results. The performance gains of using optimum fusion over local detection are characterized under different sensor system configurations and it is shown that improved detection sensitivity is obtained over a larger detection coverage region.
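    The Chair-Varshney rule referenced above has a simple closed form: it thresholds a sum of log-likelihood-ratio terms over the local binary decisions, with weights derived from each sensor's detection and false-alarm probabilities. A minimal sketch, assuming those probabilities and the prior are known (as the optimum-fusion setting presumes):

```python
import math

def chair_varshney(decisions, pd, pf, p1=0.5):
    """Optimum fusion of binary local decisions (Chair-Varshney rule).
    decisions: +1 (presence declared) or -1, one per sensor
    pd, pf:    per-sensor detection and false-alarm probabilities
    p1:        prior probability of presence
    Returns +1 if the fused log-likelihood ratio favours presence."""
    llr = math.log(p1 / (1.0 - p1))                  # prior term a0
    for u, d, f in zip(decisions, pd, pf):
        if u > 0:
            llr += math.log(d / f)                   # sensor said "present"
        else:
            llr += math.log((1.0 - d) / (1.0 - f))   # sensor said "absent"
    return 1 if llr > 0 else -1
```

    With two reliable sensors (pd = 0.9, pf = 0.1) both declaring presence the fused decision is +1, and a dissent from a much less reliable third sensor does not flip it; this weighting by reliability is what gives optimum fusion its gain over local detection.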

  5. Distributed data fusion across multiple hard and soft mobile sensor platforms

    Science.gov (United States)

    Sinsley, Gregory

    One of the biggest challenges currently facing the robotics field is sensor data fusion. Unmanned robots carry many sophisticated sensors including visual and infrared cameras, radar, laser range finders, chemical sensors, accelerometers, gyros, and global positioning systems. By effectively fusing the data from these sensors, a robot would be able to form a coherent view of its world that could then be used to facilitate both autonomous and intelligent operation. Another distinct fusion problem is that of fusing data from teammates with data from onboard sensors. If an entire team of vehicles has the same worldview they will be able to cooperate much more effectively. Sharing worldviews is made even more difficult if the teammates have different sensor types. The final fusion challenge the robotics field faces is that of fusing data gathered by robots with data gathered by human teammates (soft sensors). Humans sense the world completely differently from robots, which makes this problem particularly difficult. The advantage of fusing data from humans is that it makes more information available to the entire team, thus helping each agent to make the best possible decisions. This thesis presents a system for fusing data from multiple unmanned aerial vehicles, unmanned ground vehicles, and human observers. The first issue this thesis addresses is that of centralized data fusion. This is a foundational data fusion issue, which has been very well studied. Important issues in centralized fusion include data association, classification, tracking, and robotics problems. Because these problems are so well studied, this thesis does not make any major contributions in this area, but does review it for completeness. The chapter on centralized fusion concludes with an example unmanned aerial vehicle surveillance problem that demonstrates many of the traditional fusion methods. The second problem this thesis addresses is that of distributed data fusion. 

  6. Sensor fusion for intelligent process control.

    Energy Technology Data Exchange (ETDEWEB)

    Connors, John J. (PPG Industries, Inc., Harmar Township, PA); Hill, Kevin (PPG Industries, Inc., Harmar Township, PA); Hanekamp, David (PPG Industries, Inc., Harmar Township, PA); Haley, William F. (PPG Industries, Inc., Wichita Falls, TX); Gallagher, Robert J.; Gowin, Craig (PPG Industries, Inc., Batavia, IL); Farrar, Arthur R. (PPG Industries, Inc., Wichita Falls, TX); Sheaffer, Donald A.; DeYoung, Mark A. (PPG Industries, Inc., Mt. Zion, IL); Bertram, Lee A.; Dodge, Craig (PPG Industries, Inc., Mt. Zion, IL); Binion, Bruce (PPG Industries, Inc., Mt. Zion, IL); Walsh, Peter M.; Houf, William G.; Desam, Padmabhushana R. (University of Utah, Salt Lake City, UT); Tiwary, Rajiv (PPG Industries, Inc., Harmar Township, PA); Stokes, Michael R. (PPG Industries, Inc.); Miller, Alan J. (PPG Industries, Inc., Mt. Zion, IL); Michael, Richard W. (PPG Industries, Inc., Lincoln, AL); Mayer, Raymond M. (PPG Industries, Inc., Harmar Township, PA); Jiao, Yu (PPG Industries, Inc., Harmar Township, PA); Smith, Philip J. (University of Utah, Salt Lake City, UT); Arbab, Mehran (PPG Industries, Inc., Harmar Township, PA); Hillaire, Robert G.

    2004-08-01

    An integrated system for the fusion of product and process sensors and controls for production of flat glass was envisioned, having as its objective the maximization of throughput and product quality subject to emission limits, furnace refractory wear, and other constraints. Although the project was prematurely terminated, stopping the work short of its goal, the tasks that were completed show the value of the approach and objectives. Though the demonstration was to have been done on a flat glass production line, the approach is applicable to control of production in the other sectors of the glass industry. Furthermore, the system architecture is also applicable in other industries utilizing processes in which product uniformity is determined by ability to control feed composition, mixing, heating and cooling, chemical reactions, and physical processes such as distillation, crystallization, drying, etc. The first phase of the project, with Visteon Automotive Systems as industrial partner, was focused on simulation and control of the glass annealing lehr. That work produced the analysis and computer code that provide the foundation for model-based control of annealing lehrs during steady state operation and through color and thickness changes. In the second phase of the work, with PPG Industries as the industrial partner, the emphasis was on control of temperature and combustion stoichiometry in the melting furnace, to provide a wider operating window, improve product yield, and increase energy efficiency. A program of experiments with the furnace, CFD modeling and simulation, flow measurements, and sensor fusion was undertaken to provide the experimental and theoretical basis for an integrated, model-based control system utilizing the new infrastructure installed at the demonstration site for the purpose. 
In spite of the fact that the project was terminated during the first year of the second phase of the work, the results of these first steps toward implementation

  7. PERSON AUTHENTICATION USING MULTIPLE SENSOR DATA FUSION

    Directory of Open Access Journals (Sweden)

    S. Vasuhi

    2011-04-01

    This paper proposes a real-time system for face authentication based on fusion of infrared (IR) and visible images. Identifying unknown persons in highly secured areas requires multiple algorithms. Four well-known face-recognition algorithms are used to extract features: Block Independent Component Analysis (BICA), Kalman filtering (KF), Discrete Cosine Transform (DCT), and Orthogonal Locality Preserving Projections (OLPP). If the database is very large and the features are not distinct, ambiguity arises in face recognition; hence more than one sensor is needed for critical and/or highly secured areas. This paper deals with a multiple-fusion methodology using weighted averaging and fuzzy logic. The visible-sensor output depends on environmental conditions, namely lighting and illumination; to overcome this problem, a histogram technique is used to choose the appropriate algorithm. DCT and Kalman filtering are holistic approaches, BICA follows a feature-based approach, and OLPP preserves the Euclidean structure of the face space. These recognizers address the problem of dimensionality reduction by eliminating redundant features and reducing the feature space. The system can handle variations in illumination, pose, orientation, occlusion, etc. up to a significant level. The integrated system overcomes the drawbacks of the individual recognizers. The proposed system aims to increase the accuracy of person authentication while reducing the limitations of the individual algorithms. It is tested on a real-time database and the results are found to be 96% accurate.
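    A hedged sketch of the weighted-average score fusion described above (generic decision-level fusion; the min-max normalisation and the weights are illustrative assumptions, not the paper's exact scheme):

```python
import numpy as np

def fuse_scores(score_lists, weights):
    """Weighted-average fusion of match scores from several recognizers.
    Each recognizer's scores are min-max normalised to [0, 1] first, so
    no single recognizer dominates through its raw score scale."""
    normed = []
    for s in score_lists:
        s = np.asarray(s, dtype=float)
        span = s.max() - s.min()
        normed.append((s - s.min()) / span if span > 0 else np.zeros_like(s))
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # weights sum to 1
    return w @ np.array(normed)          # fused score per candidate
```

    In a fuzzy-logic variant, the fixed weights would instead be produced by membership functions evaluated on image-quality cues such as the intensity histogram.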

  8. Sensor Fusion and Model Verification for a Mobile Robot

    OpenAIRE

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck; Bendtsen, Jan Dimon; Izadi-Zamanabadi, Roozbeh

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms as well as slip. An Unscented Kalman Filter (UKF) based on the dynamic model is used for sensor fusion, feeding sensor measurements back to the robot controller in an intelligent manner. Through practi...

  9. Sensor Fusion Based Model for Collision Free Mobile Robot Navigation

    OpenAIRE

    Marwah Almasri; Khaled Elleithy; Abrar Alajlan

    2015-01-01

    Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each of them are equipped with various types of sensors such as GPS, camera, infrared and ultrasonic sensors. These sensors are used to observe the surrounding environment. However, these sensors sometimes fail and have inaccurate readings. Therefore, the integration of sensor fusion will help to solve this dilemma and enhance the overall performance. This paper presents a collision free mobile robot...

  10. Advances in Multi-Sensor Data Fusion: Algorithms and Applications

    Directory of Open Access Journals (Sweden)

    Jingying Fu

    2009-09-01

    With the development of satellite and remote sensing techniques, more and more image data from airborne/satellite sensors have become available. Multi-sensor image fusion seeks to combine information from different images to obtain more inferences than can be derived from a single sensor. In image-based application fields, image fusion has emerged as a promising research area since the end of the last century. The paper presents an overview of recent advances in multi-sensor satellite image fusion. Firstly, the most popular existing fusion algorithms are introduced, with emphasis on their recent improvements. Advances in main applications fields in remote sensing, including object identification, classification, change detection and maneuvering targets tracking, are described. Both advantages and limitations of those applications are then discussed. Recommendations are addressed, including: (1) Improvements of fusion algorithms; (2) Development of "algorithm fusion" methods; (3) Establishment of an automatic quality assessment scheme.

  11. Detection of anti-personnel land-mines using sensor-fusion techniques

    NARCIS (Netherlands)

    Cremer, F.; Schavemaker, J.G.M.; Breejen, E. den; Schutte, K.

    1999-01-01

    In this paper we present the sensor-fusion results based on the measurements obtained within the European research project GEODE (Ground Explosive Ordnance DEtection system) that strives for the realisation of a vehicle-mounted, multisensor, anti-personnel land-mine detection system for humanitarian

  12. Tier-scalable reconnaissance: the challenge of sensor optimization, sensor deployment, sensor fusion, and sensor interoperability

    Science.gov (United States)

    Fink, Wolfgang; George, Thomas; Tarbell, Mark A.

    2007-04-01

    Robotic reconnaissance operations are called for in extreme environments: not only in space, including planetary atmospheres, surfaces, and subsurfaces, but also in potentially hazardous or inaccessible operational areas on Earth, such as mine fields, battlefield environments, enemy-occupied territories, terrorist-infiltrated environments, or areas that have been exposed to biochemical agents or radiation. Real-time reconnaissance enables the identification and characterization of transient events. A fundamentally new mission concept for tier-scalable reconnaissance of operational areas, originated by Fink et al., is aimed at replacing the engineering- and safety-constrained mission designs of the past. The tier-scalable paradigm integrates multi-tier (orbit, atmosphere, surface/subsurface) and multi-agent (satellite, UAV/blimp, surface/subsurface sensing platforms) hierarchical mission architectures, introducing not only mission redundancy and safety, but also enabling and optimizing intelligent, less constrained, and distributed reconnaissance in real time. Given the mass, size, and power constraints faced by such a multi-platform approach, this is an ideal application scenario for a diverse set of MEMS sensors. To support such mission architectures, a high degree of operational autonomy is required. Essential elements of such operational autonomy are: (1) automatic mapping of an operational area from different vantage points (including vehicle health monitoring); (2) automatic feature extraction and target/region-of-interest identification within the mapped operational area; and (3) automatic target prioritization for close-up examination. These requirements imply the optimal deployment of MEMS sensors and sensor platforms, sensor fusion, and sensor interoperability.

  13. Uncooled microbolometer thermal imaging sensors for unattended ground sensor applications

    Science.gov (United States)

    Figler, Burton D.

    2001-09-01

    Starting in the early 1990s, uncooled microbolometer thermal imaging sensor technology began to move out of the basic development laboratories of the Honeywell Corporation in Minneapolis and into applied development at several companies that have licensed the basic technology. Now, this technology is addressing military, government, and commercial applications in the real world. Today, thousands of uncooled microbolometer thermal imaging sensors are being produced and sold annually. At the same time, applied research and development on the technology continues at an unabated pace. These research and development efforts have two primary goals: 1) improving sensor performance in terms of increased resolution and greater thermal sensitivity and 2) reducing sensor cost. Success is being achieved in both areas. In this paper we will describe advances in uncooled microbolometer thermal imaging sensor technology as they apply to the modern battlefield and to unattended ground sensor applications in particular. Improvements in sensor performance include: a) reduced size, b) increased spatial resolution, c) increased thermal sensitivity, d) reduced electrical power, and e) reduced weight. For battlefield applications, unattended sensors are used not only in fixed ground locations, but also on a variety of moving platforms, including remotely operated ground vehicles, as well as Micro and Miniature Aerial Vehicles. The use of uncooled microbolometer thermal imaging sensors on these platforms will be discussed, and the results from simulations of an uncooled microbolometer sensor flying on a Micro Aerial Vehicle will be presented. Finally, we will describe microbolometer technology advancements currently being made or planned at BAE SYSTEMS. Where possible, examples of actual improvements, in the form of real imagery and/or actual performance measurements, will be provided.

  14. Fusion of Radar and EO-sensors for Surveillance

    NARCIS (Netherlands)

    Kester, L.J.H.M.; Theil, A.

    2000-01-01

    Fusion of radar and EO-sensors for the purpose of surveillance is investigated. All sensors are considered to be co-located with respect to the distance of the area under surveillance. More specifically, the applicability for such multi-sensor systems is examined for surveillance in littoral

  15. Fusion of Radar and EO-sensors for Surveillance

    NARCIS (Netherlands)

    Kester, L.J.H.M.; Theil, A.

    2001-01-01

    Fusion of radar and EO-sensors is investigated for the purpose of surveillance in littoral waters. All sensors are considered to be co-located with respect to the distance, typically 1 to 10 km, of the area under surveillance. The sensor suite is a coherent polarimetric radar in combination with

  17. Assessing the Performance of Sensor Fusion Methods: Application to Magnetic-Inertial-Based Human Body Tracking.

    Science.gov (United States)

    Ligorio, Gabriele; Bergamini, Elena; Pasciuto, Ilaria; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria

    2016-01-26

    Information from complementary and redundant sensors is often combined within sensor fusion algorithms to obtain a single accurate observation of the system at hand. However, measurements from each sensor are characterized by uncertainties. When multiple data are fused, it is often unclear how all these uncertainties interact and influence the overall performance of the sensor fusion algorithm. To address this issue, a benchmarking procedure is presented, where simulated and real data are combined in different scenarios in order to quantify how each sensor's uncertainties influence the accuracy of the final result. The proposed procedure was applied to the estimation of the pelvis orientation using a waist-worn magnetic-inertial measurement unit. Ground-truth data were obtained from a stereophotogrammetric system and used to obtain simulated data. Two Kalman-based sensor fusion algorithms were submitted to the proposed benchmarking procedure. For the considered application, gyroscope uncertainties proved to be the main error source in orientation estimation accuracy for both tested algorithms. Moreover, although different performances were obtained using simulated data, these differences became negligible when real data were considered. The outcome of this evaluation may be useful both to improve the design of new sensor fusion methods and to drive the algorithm tuning process.

  18. Driver drowsiness detection using multimodal sensor fusion

    Science.gov (United States)

    Andreeva, Elena O.; Aarabi, Parham; Philiastides, Marios G.; Mohajer, Keyvan; Emami, Majid

    2004-04-01

    This paper proposes a multi-modal sensor fusion algorithm for the estimation of driver drowsiness. Driver sleepiness is believed to be responsible for more than 30% of passenger car accidents and for 4% of all accident fatalities. In commercial vehicles, drowsiness is blamed for 58% of single truck accidents and 31% of commercial truck driver fatalities. This work proposes an innovative automatic sleep-onset detection system. Using multiple sensors, the driver's body is studied as a mechanical structure of springs and dampeners. The sleep-detection system consists of highly sensitive triple-axial accelerometers to monitor the driver's upper body in 3-D. The subject is modeled as a linear time-variant (LTV) system. An LMS adaptive filter estimation algorithm generates the transfer function (i.e. weight coefficients) for this LTV system. Separate coefficients are generated for the awake and asleep states of the subject. These coefficients are then used to train a neural network. Once trained, the neural network classifies the condition of the driver as either awake or asleep. The system has been tested on a total of 8 subjects. The tests were conducted on sleep-deprived individuals for the sleep state and on fully awake individuals for the awake state. When trained and tested on the same subject, the system detected sleep and awake states of the driver with a success rate of 95%. When the system was trained on three subjects and then retested on a fourth "unseen" subject, the classification rate dropped to 90%. Furthermore, it was attempted to correlate driver posture and sleepiness by observing how car vibrations propagate through a person's body. Eight additional subjects were studied for this purpose. The results obtained in this experiment proved inconclusive, which was attributed to significant differences in the individual habitual postures.
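    The LMS identification step described above can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation; the tap count, step size, and "true" weights are arbitrary assumptions:

```python
import numpy as np

def lms_identify(x, d, n_taps=4, step=0.05):
    """Estimate FIR 'transfer function' weights mapping an input
    vibration signal x to a measured body response d, using the
    LMS update w <- w + step * e * u."""
    w = np.zeros(n_taps)
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]   # most recent sample first
        e = d[k] - w @ u                    # prediction error
        w = w + step * e * u                # LMS weight update
    return w

# Synthetic check: the "body" is a known FIR system driven by noise
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
true_w = np.array([0.5, -0.2, 0.1, 0.05])
d = np.convolve(x, true_w)[:len(x)]
w_hat = lms_identify(x, d)
```

    In the paper's setting, separate weight vectors learned for awake and asleep recordings would then form the feature vectors fed to the neural network.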

  19. Multiple sensor fusion under unknown distributions

    Energy Technology Data Exchange (ETDEWEB)

    Rao, N.S.V.

    1996-10-01

    In a system of N sensors, sensor S_i, i = 1, 2, ..., N, outputs Y^(i) ∈ ℝ according to an unknown probability distribution P_{Y^(i)|X}, corresponding to input X ∈ ℝ. A training n-sample (X_1, Y_1), (X_2, Y_2), ..., (X_n, Y_n) is given, where Y_i = (Y_i^(1), Y_i^(2), ..., Y_i^(N)) such that Y_i^(j) is the output of S_j in response to input X_i. The problem is to design a fusion rule f such that the expected square error I(f) = ∫ [X - f(Y)]^2 dP_{Y|X} dP_X, where Y = (Y^(1), Y^(2), ..., Y^(N)), is minimized over a family of functions F. Let f* minimize I(·) over F; in general, f* cannot be computed since the underlying distributions are unknown. We consider sufficient conditions based on smoothness and/or combinatorial dimensions of F to ensure that an estimator f̂ satisfies P[I(f̂) - I(f*) > ε] < δ for any ε > 0 and 0 < δ < 1. We present two methods for computing f̂, based on feedforward sigmoidal networks and the Nadaraya-Watson estimator. Design and performance characteristics of the two methods are discussed, based on both theoretical and simulation results.
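    As a rough illustration of the second method, a Nadaraya-Watson fusion rule maps the N-sensor output vector y to a weighted average of training inputs. The Gaussian kernel, bandwidth, and two-sensor toy data below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def nadaraya_watson(Y_train, X_train, y_query, h=0.1):
    """Fused estimate f̂(y) = Σ_i K((y - Y_i)/h) X_i / Σ_i K((y - Y_i)/h)
    with a Gaussian kernel over the N-sensor output vector y."""
    d2 = np.sum((Y_train - y_query) ** 2, axis=1) / h ** 2
    k = np.exp(-0.5 * d2)                   # Gaussian kernel weights
    return float(np.sum(k * X_train) / np.sum(k))

# Toy example: two sensors observe X with independent noise
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, 400)
Y = np.column_stack([X + 0.1 * rng.standard_normal(400),
                     X + 0.1 * rng.standard_normal(400)])
x_hat = nadaraya_watson(Y, X, y_query=np.array([0.5, 0.5]))
```

    For sensor outputs both near 0.5, the fused estimate lands near the true input 0.5 without any knowledge of the noise distributions.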

  20. Desensitized Optimal Filtering and Sensor Fusion Tool Kit Project

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop desensitized optimal filtering techniques and to implement these algorithms in a navigation and sensor fusion tool kit. These proposed...

  1. An Alternate View Of Munition Sensor Fusion

    Science.gov (United States)

    Mayersak, J. R.

    1988-08-01

    An alternate multimode sensor fusion scheme is treated. The concept is designed to acquire and engage high-value relocatable targets in a lock-on-after-launch sequence. The approach uses statistical decision concepts to determine the authority to be assigned to each mode in the acquisition-sequence voting and decision process. Statistical target classification and recognition in the engagement sequence is accomplished through variable-length feature vectors set by adaptive logics. The approach uses multiple decision spaces for acquisition and classification; the number of spaces selected is adaptively weighted and adjusted. The scheme uses the type of climate (arctic, temperate, desert, or equatorial), diurnal effects (time of day), the type of background, the type of countermeasures present (signature suppression or obscuration, false-target decoys, or electronic warfare), and other factors to make these selections. The approach is discussed in simple terms. Voids and deficiencies in the statistical data base used to train such algorithms are discussed. The approach is being developed to engage deep-battle targets such as surface-to-surface missile systems, air defense units, and self-propelled artillery.

  2. Distributed Fusion in Sensor Networks with Information Genealogy

    Science.gov (United States)

    2011-06-28

    11th International Conference on Information Fusion. [2] KC Chang, CY Chong, and Shozo Mori, "On Scalable Distributed Sensor Fusion," in Proc. 11... 2011. [8] KC Chang, Chee-Yee Chong, and Shozo Mori, "Analytical and Computational Evaluation of Scalable Distributed Fusion Algorithms," IEEE Trans... Zhejiang University, Hangzhou, China. He received the M.S. and Ph.D. in operations research from George Mason University, Fairfax, VA, in 2003 and

  3. Application of a sensor fusion algorithm for improving grasping stability

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Hyeon; Yoon, Hyun Suck; Moon, Hyung Pil; Choi, Hyouk Ryeol; Koo Ja Choon [Sungkyunkwan University, Suwon (Korea, Republic of)

    2015-07-15

    A robot hand normally employs various sensors that are packaged in a small form factor, must perform with delicate accuracy, and are mostly very expensive. The grasping operation of the hand relies especially on the accuracy of those sensors. Even with a set of advanced sensory systems embedded in a robot hand, securing a stable grasp is still a challenging task. The present work attempts to improve force sensor accuracy by applying a sensor fusion method. An optimal-weight sensor fusion method formulated with Kalman filters is presented and tested. Using a set of inexpensive sensors, the work achieves reliable force sensing and applies the enhanced sensor stability to an object pinch grasping.
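    The "optimal weight" idea can be illustrated with the standard minimum-variance (inverse-variance) combination of independent readings. This is a static sketch of the weighting principle, not the paper's Kalman formulation, and the readings and variances are invented:

```python
import numpy as np

def fuse_optimal(readings, variances):
    """Minimum-variance fusion of independent readings: weights are
    proportional to the inverse of each sensor's noise variance."""
    v = np.asarray(variances, dtype=float)
    w = (1.0 / v) / np.sum(1.0 / v)
    fused = float(w @ np.asarray(readings, dtype=float))
    fused_var = 1.0 / np.sum(1.0 / v)   # never worse than the best sensor
    return fused, fused_var

# Three force readings (N): two accurate gauges and one noisier one
force, var = fuse_optimal([2.1, 1.9, 2.4], [0.04, 0.04, 0.16])
```

    The fused variance is strictly smaller than that of the best individual sensor, which is why adding cheap sensors can still improve the force estimate.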

  4. Multi-sensor image fusion and its applications

    CERN Document Server

    Blum, Rick S

    2005-01-01

    Taking another lesson from nature, the latest advances in image processing technology seek to combine image data from several diverse types of sensors in order to obtain a more accurate view of the scene, much as we rely on our five senses. Multi-Sensor Image Fusion and Its Applications is the first text dedicated to the theory and practice of the registration and fusion of image data, covering such approaches as statistical methods, color-related techniques, model-based methods, and visual information display strategies. After a review of state-of-the-art image fusion techniques,

  5. Assessing the Performance of Sensor Fusion Methods: Application to Magnetic-Inertial-Based Human Body Tracking

    Directory of Open Access Journals (Sweden)

    Gabriele Ligorio

    2016-01-01

    Full Text Available Information from complementary and redundant sensors is often combined within sensor fusion algorithms to obtain a single accurate observation of the system at hand. However, measurements from each sensor are characterized by uncertainties. When multiple data are fused, it is often unclear how all these uncertainties interact and influence the overall performance of the sensor fusion algorithm. To address this issue, a benchmarking procedure is presented, where simulated and real data are combined in different scenarios in order to quantify how each sensor’s uncertainties influence the accuracy of the final result. The proposed procedure was applied to the estimation of the pelvis orientation using a waist-worn magnetic-inertial measurement unit. Ground-truth data were obtained from a stereophotogrammetric system and used to obtain simulated data. Two Kalman-based sensor fusion algorithms were submitted to the proposed benchmarking procedure. For the considered application, gyroscope uncertainties proved to be the main error source in orientation estimation accuracy for both tested algorithms. Moreover, although different performances were obtained using simulated data, these differences became negligible when real data were considered. The outcome of this evaluation may be useful both to improve the design of new sensor fusion methods and to drive the algorithm tuning process.

  6. Dynamic gesture recognition based on multiple sensors fusion technology.

    Science.gov (United States)

    Wenhui, Wang; Xiang, Chen; Kongqiao, Wang; Xu, Zhang; Jihai, Yang

    2009-01-01

    This paper investigates the roles of a three-axis accelerometer, surface electromyography sensors, and a webcam for dynamic gesture recognition. A decision-level multiple-sensor fusion method based on action elements is proposed to distinguish a set of 20 kinds of dynamic hand gestures. Experiments are designed and conducted to collect three kinds of sensor data streams simultaneously during gesture implementation and to compare the performance of different subsets in gesture recognition. Experimental results from three subjects show that the combination of the three kinds of sensors achieves recognition accuracies of 87.5%-91.8%, which are substantially higher than those of the single-sensor conditions. This study is valuable for realizing continuous and dynamic gesture recognition based on multiple-sensor fusion technology for multi-modal interaction.
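    Decision-level fusion of the three channels can be illustrated as a weighted combination of per-sensor class posteriors followed by an arg-max. The scores and equal weights below are invented for illustration, not taken from the paper:

```python
import numpy as np

def decision_level_fusion(posteriors, weights):
    """Fuse per-sensor class posteriors (one row per sensor) by a
    normalized weighted sum, then pick the arg-max class."""
    P = np.asarray(posteriors, dtype=float)
    w = np.asarray(weights, dtype=float)[:, None]
    fused = (w * P).sum(axis=0) / w.sum()
    return int(np.argmax(fused)), fused

# Hypothetical scores for 3 candidate gestures from the three channels
label, fused = decision_level_fusion(
    [[0.6, 0.3, 0.1],    # accelerometer
     [0.2, 0.5, 0.3],    # sEMG
     [0.1, 0.7, 0.2]],   # webcam
    weights=[1.0, 1.0, 1.0])
```

    Here the sEMG and webcam channels outvote the accelerometer, so the fused decision is class 1 even though the accelerometer alone would have picked class 0.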

  7. Force and Acceleration Sensor Fusion for Compliant Robot Motion Control

    OpenAIRE

    Gámez García, Javier; Robertsson, Anders; Gómez Ortega, Juan; Johansson, Rolf

    2005-01-01

    In this work, we present the implementation and experimental evaluation of dynamic force sensing for robotic manipulators, which uses a sensor fusion technique to extract the contact force exerted by the end-effector of the manipulator from measurements of a wrist force sensor, which are corrupted by the inertial forces on the end-effector. We propose a new control strategy based on multisensor fusion with three different sensors, that is, encoders mounted at each joint of the robot wi...

  8. Physiological sensor signals classification for healthcare using sensor data fusion and case-based reasoning.

    Science.gov (United States)

    Begum, Shahina; Barua, Shaibal; Ahmed, Mobyen Uddin

    2014-07-03

    Today, clinicians often diagnose and classify diseases based on information collected from several physiological sensor signals. However, sensor signals can easily be corrupted by noise or interference, and due to large individual variations, sensitivity to different physiological sensors can also vary. Therefore, fusing multiple sensor signals is valuable for providing a more robust and reliable decision. This paper demonstrates a physiological sensor signal classification approach using sensor signal fusion and case-based reasoning. The proposed approach has been evaluated for classifying Stressed or Relaxed individuals using sensor data fusion. Physiological sensor signals, i.e., Heart Rate (HR), Finger Temperature (FT), Respiration Rate (RR), Carbon dioxide (CO2) and Oxygen Saturation (SpO2), are collected during the data collection phase. Here, sensor fusion has been done in two different ways: (i) decision-level fusion using features extracted through traditional approaches; and (ii) data-level fusion using features extracted by means of Multivariate Multiscale Entropy (MMSE). Case-Based Reasoning (CBR) is applied for the classification of the signals. The experimental results show that the proposed system could classify Stressed or Relaxed individuals with 87.5% accuracy compared to an expert in the domain. It thus shows promising results in the psychophysiological domain, and it could be possible to adapt this approach to other relevant healthcare systems.

  9. Physiological Sensor Signals Classification for Healthcare Using Sensor Data Fusion and Case-Based Reasoning

    Directory of Open Access Journals (Sweden)

    Shahina Begum

    2014-07-01

    Full Text Available Today, clinicians often diagnose and classify diseases based on information collected from several physiological sensor signals. However, sensor signals can easily be corrupted by noise or interference, and due to large individual variations, sensitivity to different physiological sensors can also vary. Therefore, fusing multiple sensor signals is valuable for providing a more robust and reliable decision. This paper demonstrates a physiological sensor signal classification approach using sensor signal fusion and case-based reasoning. The proposed approach has been evaluated for classifying Stressed or Relaxed individuals using sensor data fusion. Physiological sensor signals, i.e., Heart Rate (HR), Finger Temperature (FT), Respiration Rate (RR), Carbon dioxide (CO2) and Oxygen Saturation (SpO2), are collected during the data collection phase. Here, sensor fusion has been done in two different ways: (i) decision-level fusion using features extracted through traditional approaches; and (ii) data-level fusion using features extracted by means of Multivariate Multiscale Entropy (MMSE). Case-Based Reasoning (CBR) is applied for the classification of the signals. The experimental results show that the proposed system could classify Stressed or Relaxed individuals with 87.5% accuracy compared to an expert in the domain. It thus shows promising results in the psychophysiological domain, and it could be possible to adapt this approach to other relevant healthcare systems.

  10. Statistical sensor fusion of ECG data using automotive-grade sensors

    Science.gov (United States)

    Koenig, A.; Rehg, T.; Rasshofer, R.

    2015-11-01

    Driver states such as fatigue, stress, aggression, distraction, or even medical emergencies continue to lead to severe driving mistakes and promote accidents. A pathway towards improving driver state assessment can be found in psycho-physiological measures that directly quantify the driver's state from physiological recordings. Although heart rate is a well-established physiological variable that reflects cognitive stress, obtaining heart rate contactlessly and reliably is a challenging task in an automotive environment. Our aim was to investigate how sensor fusion of two automotive-grade sensors would influence the accuracy of automatic classification of cognitive stress levels. We induced cognitive stress in subjects and estimated levels from their heart rate signals, acquired from automotive-ready ECG sensors. Using signal quality indices and Kalman filters, we were able to decrease the Root Mean Squared Error (RMSE) of heart rate recordings by 10 beats per minute. We then trained a neural network to classify the cognitive workload state of subjects from heart rate and compared classification performance for ground truth, the individual sensors, and the fused heart rate signal. We obtained 5% higher correct classification by fusing signals as compared to individual sensors, staying only 4% below the maximum classification accuracy achievable from ground truth. These results are a first step towards real-world applications of psycho-physiological measurements in vehicle settings. Future implementations of driver state modeling will be able to draw from a larger pool of data sources, such as additional physiological values or vehicle-related data, which can be expected to drive classification to significantly higher values.

  11. Portable sensor technology for rotational ground motions

    Science.gov (United States)

    Bernauer, Felix; Wassermann, Joachim; Guattari, Frédéric; Igel, Heiner

    2016-04-01

    In this contribution we present performance characteristics of a single-component interferometric fiber-optic gyroscope (IFOG). The prototype sensor is provided by iXBlue, France. It is tested in the framework of the European Research Council project ROMY (Rotational motions - a new observable for seismology) on its applicability as a portable and field-deployable sensor for rotational ground motions. To fully explore the benefits of this new seismic observable, especially in the fields of vulcanology, ocean-generated noise, and geophysical exploration, such a sensor has to fulfill certain requirements regarding portability, power consumption, time-stamping stability, and dynamic range. With GPS-synchronized time stamping and miniseed output format, data acquisition is customized for use in seismology. Testing time-stamping accuracy yields a time shift of less than 0.0001 s and a correlation coefficient of 0.99 in comparison to a commonly used data acquisition system, the Reftek 120. Sensor self-noise is below 5.0 · 10^-8 rad s^-1 Hz^-1/2 for a frequency band from 0.001 Hz to 5.0 Hz. Analysis of the Allan deviation shows an angle random walk of 3.5 · 10^-8 rad s^-1 Hz^-1/2. Additionally, the operating range diagram is shown and ambient noise analysis is performed. The sensitivity of the sensor self-noise to variations in surrounding temperature and magnetic field is tested in laboratory experiments. With a power consumption of less than 10 W, the whole system (single-component sensor + data acquisition) is appropriate for field use with autonomous power supply.

  12. Multi-Sensor Fusion with Interacting Multiple Model Filter for Improved Aircraft Position Accuracy

    Directory of Open Access Journals (Sweden)

    Changho Lee

    2013-03-01

    Full Text Available The International Civil Aviation Organization (ICAO has decided to adopt Communications, Navigation, and Surveillance/Air Traffic Management (CNS/ATM as the 21st century standard for navigation. Accordingly, ICAO members have provided an impetus to develop related technology and build sufficient infrastructure. For aviation surveillance with CNS/ATM, Ground-Based Augmentation System (GBAS, Automatic Dependent Surveillance-Broadcast (ADS-B, multilateration (MLAT and wide-area multilateration (WAM systems are being established. These sensors can track aircraft positions more accurately than existing radar and can compensate for the blind spots in aircraft surveillance. In this paper, we applied a novel sensor fusion method with Interacting Multiple Model (IMM filter to GBAS, ADS-B, MLAT, and WAM data in order to improve the reliability of the aircraft position. Results of performance analysis show that the position accuracy is improved by the proposed sensor fusion method with the IMM filter.
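    The IMM cycle (mixing, per-model Kalman update, mode-probability update, combination) can be sketched in scalar form. The two random-walk models, noise levels, transition matrix, and measurement sequence below are illustrative assumptions, not the paper's aircraft surveillance models:

```python
import numpy as np

def imm_step(x, P, mu, z, q=(0.01, 1.0), r=0.25,
             Pi=np.array([[0.95, 0.05], [0.05, 0.95]])):
    """One scalar IMM cycle with two random-walk models differing only
    in process noise q (quiet vs. maneuvering); z is a position fix."""
    x, P, mu = (np.asarray(a, dtype=float) for a in (x, P, mu))
    # 1) mixing: predicted mode probabilities and mixed initial conditions
    c = Pi.T @ mu
    w = Pi * mu[:, None] / c[None, :]          # w[i, j] = P(model i | j)
    x0 = w.T @ x
    P0 = w.T @ P + ((x[:, None] - x0[None, :]) ** 2 * w).sum(axis=0)
    # 2) per-model Kalman predict/update (F = H = 1)
    Pp = P0 + np.asarray(q, dtype=float)
    S = Pp + r
    K = Pp / S
    nu = z - x0                                # innovations
    xn = x0 + K * nu
    Pn = (1.0 - K) * Pp
    # 3) mode likelihoods and probability update
    L = np.exp(-0.5 * nu ** 2 / S) / np.sqrt(2.0 * np.pi * S)
    mu_new = c * L
    mu_new = mu_new / mu_new.sum()
    # 4) combined output estimate
    return xn, Pn, mu_new, float(mu_new @ xn)

# A quiet target that suddenly jumps: the maneuver model should take over
x, P, mu = np.zeros(2), np.ones(2), np.array([0.5, 0.5])
mu_hist, x_hat = [], 0.0
for z in [0.1, -0.2, 0.0, 5.0, 5.2, 5.1]:
    x, P, mu, x_hat = imm_step(x, P, mu, z)
    mu_hist.append(mu)
```

    When the large innovation arrives, the mode probability shifts almost entirely to the high-process-noise model, so the combined estimate snaps to the new position instead of lagging as a single low-noise Kalman filter would.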

  13. Decentralized Sensor Fusion for Ubiquitous Networking Robotics in Urban Areas

    Science.gov (United States)

    Sanfeliu, Alberto; Andrade-Cetto, Juan; Barbosa, Marco; Bowden, Richard; Capitán, Jesús; Corominas, Andreu; Gilbert, Andrew; Illingworth, John; Merino, Luis; Mirats, Josep M.; Moreno, Plínio; Ollero, Aníbal; Sequeira, João; Spaan, Matthijs T.J.

    2010-01-01

    In this article we explain the architecture for the environment and sensors that has been built for the European project URUS (Ubiquitous Networking Robotics in Urban Sites), a project whose objective is to develop an adaptable network robot architecture for cooperation between network robots and human beings and/or the environment in urban areas. The project goal is to deploy a team of robots in an urban area to give a set of services to a user community. This paper addresses the sensor architecture devised for URUS and the type of robots and sensors used, including environment sensors and sensors onboard the robots. Furthermore, we also explain how sensor fusion takes place to achieve urban outdoor execution of robotic services. Finally some results of the project related to the sensor network are highlighted. PMID:22294927

  14. Decentralized Sensor Fusion for Ubiquitous Networking Robotics in Urban Areas

    Directory of Open Access Journals (Sweden)

    Aníbal Ollero

    2010-03-01

    Full Text Available In this article we explain the architecture for the environment and sensors that has been built for the European project URUS (Ubiquitous Networking Robotics in Urban Sites), a project whose objective is to develop an adaptable network robot architecture for cooperation between network robots and human beings and/or the environment in urban areas. The project goal is to deploy a team of robots in an urban area to give a set of services to a user community. This paper addresses the sensor architecture devised for URUS and the type of robots and sensors used, including environment sensors and sensors onboard the robots. Furthermore, we also explain how sensor fusion takes place to achieve urban outdoor execution of robotic services. Finally some results of the project related to the sensor network are highlighted.

  15. Sensor fusion methods for high performance active vibration isolation systems

    Science.gov (United States)

    Collette, C.; Matichard, F.

    2015-04-01

    Sensor noise often limits the performance of active vibration isolation systems. Inertial sensors used in such systems can be selected from a wide variety of instrument noise and size characteristics. However, the most sensitive instruments are often the biggest and the heaviest. Consequently, high-performance active isolators sometimes embed many tens of kilograms of instrumentation. The weight and size of instrumentation can add unwanted constraints on the design. It tends to lower the structure's natural frequencies and reduces the collocation between sensors and actuators. Both effects tend to reduce feedback control performance and stability. This paper discusses sensor fusion techniques that can be used to increase the control bandwidth (and/or the stability). For this, the low-noise inertial instrument signal dominates the fusion at low frequency to provide vibration isolation. Other types of sensors (relative motion, smaller but noisier inertial, or force sensors) are used at higher frequencies to increase stability. Several sensor fusion configurations are studied. The paper shows the improvement that can be expected for several case studies, including a rigid equipment, a flexible equipment, and a flexible equipment mounted on a flexible support structure.
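    The frequency-split fusion described above can be sketched with a first-order complementary filter: one sensor is trusted below the crossover, the other above it, and the two filter paths sum to unity so a signal common to both sensors passes undistorted. The signals, disturbances, and crossover constant below are illustrative assumptions:

```python
import numpy as np

def complementary_fuse(lo, hi, alpha=0.98):
    """Fuse two measurements of the same quantity: 'lo' is trusted at
    low frequency, 'hi' at high frequency. The low-pass and high-pass
    paths sum to exactly 1 by construction."""
    fused = np.empty(len(lo))
    fused[0] = lo[0]
    for k in range(1, len(lo)):
        # integrate high-frequency increments, correct drift with 'lo'
        fused[k] = alpha * (fused[k - 1] + hi[k] - hi[k - 1]) \
                   + (1.0 - alpha) * lo[k]
    return fused

fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
truth = np.sin(2 * np.pi * 0.5 * t)              # motion to recover
lo = truth + 0.3 * np.sin(2 * np.pi * 20.0 * t)  # clean at low frequency
hi = truth + 0.5                                 # biased, clean at high frequency
fused = complementary_fuse(lo, hi)
rms = lambda s: float(np.sqrt(np.mean((s - truth) ** 2)))
```

    The fused signal rejects both the high-frequency disturbance of the first sensor and the low-frequency bias of the second, ending up more accurate than either alone.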

  16. Intelligent processing techniques for sensor fusion

    Science.gov (United States)

    Byrd, Katherine A.; Smith, Bart; Allen, Doug; Morris, Norman; Bjork, Charles A., Jr.; Deal-Giblin, Kim; Rushing, John A.

    1998-03-01

    Intelligent processing techniques that can effectively combine sensor data from disparate sensors by selecting and using only the most beneficial individual sensor data are a critical element of exoatmospheric interceptor systems. A major goal of these algorithms is to provide robust discrimination against stressing threats in poor a priori conditions and to incorporate adaptive approaches in off-nominal conditions. This paper summarizes the intelligent processing algorithms being developed, implemented, and tested to intelligently fuse data from passive infrared and active LADAR sensors at the measurement, feature, and decision levels. These intelligent algorithms employ dynamic selection of individual sensor features and the weighting of multiple classifier decisions to optimize performance in good a priori conditions and robustness in poor a priori conditions. Features can be dynamically selected based on an estimate of the feature confidence, which is determined from feature quality and weighting terms derived from the quality of sensor data and expected phenomenology. Multiple classifiers are employed which use both fuzzy logic and knowledge-based approaches to fuse the sensor data and to provide a target lethality estimate. Target designation decisions can be made by fusing weighted individual classifier decisions whose output contains an estimate of the confidence of the data and the discrimination decisions. The confidence in the data and decisions can be used in real time to dynamically select different sensor feature data or to request additional sensor data on specific objects that have not been confidently identified as being lethal or non-lethal. The algorithms are implemented in C within a graphical user interface framework. Dynamic memory allocation and sequential implementation of the feature algorithms are employed. The baseline set of fused sensor discrimination algorithms with intelligent processing is described in this paper, along with example results.

  17. Context extraction for local fusion for landmine detection with multi-sensor systems

    Science.gov (United States)

    Frigui, Hichem; Gader, Paul D.; Ben Abdallah, Ahmed Chamseddine

    2009-05-01

    We present a local method for fusing the results of several landmine detectors using Ground Penetrating Radar (GPR) and Wideband Electro-Magnetic Induction (WEMI) sensors. The detectors considered include the Edge Histogram Descriptor (EHD), Hidden Markov Models (HMM), and the Spectral Correlation Feature (SCF) for the GPR sensor, and a feature-based classifier for the metal detector. The above detectors use different types of features and different classification methods. Our approach, called Context Extraction for Local Fusion with Feature Discrimination (CELF-FD), is a local approach that adapts the fusion method to different regions of the feature space. It is based on a novel objective function that combines context identification and multi-algorithm fusion criteria into a joint objective function. The context identification component strives to partition the input feature space into clusters and identify the relevant features within each cluster. The fusion component strives to learn the optimal fusion parameters within each cluster. Results on large and diverse GPR and WEMI data collections show that the proposed method can identify meaningful and coherent clusters and that these clusters require different fusion parameters. Our initial experiments have also indicated that CELF-FD outperforms the original CELF algorithm and all individual detectors.

  18. Sensor Fusion-based Event Detection in Wireless Sensor Networks

    NARCIS (Netherlands)

    Bahrepour, M.; Meratnia, N.; Havinga, P.J.M.

    2009-01-01

    Recently, the Wireless Sensor Networks (WSN) community has witnessed an application focus shift. Although monitoring was the initial application of wireless sensor networks, in-network data processing and (near) real-time actuation capability have made wireless sensor networks a suitable candidate for ev

  19. Data Fusion in Distributed Multi-sensor System

    Institute of Scientific and Technical Information of China (English)

    GUO Hang; YU Min

    2004-01-01

    This paper presents a data fusion method in distributed multi-sensor system including GPS and INS sensors' data processing. First, a residual χ2-test strategy with the corresponding algorithm is designed. Then a coefficient matrices calculation method of the information sharing principle is derived. Finally, the federated Kalman filter is used to combine these independent, parallel, real-time data. A pseudolite (PL) simulation example is given.
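    The residual χ2-test used for fault detection before fusing GPS/INS data can be sketched for a 2-D residual as below. The covariance values are invented for illustration; 9.21 is the 99% quantile of the chi-square distribution with 2 degrees of freedom (= -2 ln 0.01):

```python
import numpy as np

CHI2_99_2DOF = 9.21   # 99% chi-square quantile, 2 degrees of freedom

def residual_chi2_test(r, S, threshold=CHI2_99_2DOF):
    """Flag a sensor as faulty when the normalized innovation
    r' S^-1 r of its residual exceeds the chi-square threshold."""
    stat = float(r @ np.linalg.solve(S, r))
    return stat, stat > threshold

S = np.diag([4.0, 4.0])                              # residual covariance
stat_ok, fault_ok = residual_chi2_test(np.array([1.0, -2.0]), S)
stat_bad, fault_bad = residual_chi2_test(np.array([10.0, 8.0]), S)
```

    Residuals consistent with the covariance pass the test; a gross GPS outlier fails it and can be excluded before the federated filter combines the local estimates.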

  20. Fuzzy-Based Sensor Fusion for Cognitive Radio-Based Vehicular Ad Hoc and Sensor Networks

    Directory of Open Access Journals (Sweden)

    Mohammad Jalil Piran

    2015-01-01

    Full Text Available In wireless sensor networks, sensor fusion is employed to integrate the acquired data from diverse sensors to provide a unified interpretation. The best and most salient advantage of sensor fusion is obtaining high-level information in both statistical and definitive aspects, which cannot be attained by a single sensor. In this paper, we propose a novel sensor fusion technique based on fuzzy theory for our earlier proposed Cognitive Radio-based Vehicular Ad Hoc and Sensor Networks (CR-VASNET). In the proposed technique, we considered four input sensor readings (antecedents) and one output (consequent). The mobile nodes employed in CR-VASNET are supposed to be equipped with diverse sensors, which cater to our antecedent variables, for example, Jerk, Collision Intensity, Temperature, and Inclination Degree. Crash_Severity is considered as the consequent variable. The processing and fusion of the diverse sensory signals are carried out by a fuzzy logic scenario. The accuracy and reliability of the proposed protocol, demonstrated by the simulation results, introduce it as an applicable system to be employed to reduce the casualty rate of vehicle crashes.
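    A toy zero-order Sugeno system over two of the antecedents illustrates the fuzzy fusion step. The membership functions, rule base, and consequent singletons below are invented for illustration and are not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def crash_severity(jerk, collision_intensity):
    """Zero-order Sugeno fusion of two normalized antecedents (0..1)
    into a Crash_Severity score via four min-AND rules."""
    low_j = tri(jerk, -0.5, 0.0, 0.6)
    high_j = tri(jerk, 0.4, 1.0, 1.5)
    low_c = tri(collision_intensity, -0.5, 0.0, 0.6)
    high_c = tri(collision_intensity, 0.4, 1.0, 1.5)
    # (firing strength, consequent singleton)
    rules = [(min(low_j, low_c), 0.1),    # both low  -> minor
             (min(low_j, high_c), 0.6),
             (min(high_j, low_c), 0.6),
             (min(high_j, high_c), 0.9)]  # both high -> severe
    den = sum(s for s, _ in rules)
    return sum(s * out for s, out in rules) / den if den else 0.0

mild = crash_severity(0.1, 0.1)
severe = crash_severity(0.9, 0.95)
```

    The weighted-average defuzzification maps gentle readings to a low severity score and a hard jerk plus intense collision to a high one, without any crisp thresholds.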

  1. Sensor fusion approaches for EMI and GPR-based subsurface threat identification

    Science.gov (United States)

    Torrione, Peter; Morton, Kenneth, Jr.; Besaw, Lance E.

    2011-06-01

    Despite advances in both electromagnetic induction (EMI) and ground penetrating radar (GPR) sensing and related signal processing, neither sensor alone provides a perfect tool for detecting the myriad of possible buried objects that threaten the lives of Soldiers and civilians. However, while neither GPR nor EMI sensing alone can provide optimal detection across all target types, the two approaches are highly complementary. As a result, many landmine systems seek to make use of both sensing modalities simultaneously and fuse the results from both sensors to improve detection performance for targets with widely varying metal content and GPR responses. Despite this, little work has focused on large-scale comparisons of different approaches to sensor fusion and machine learning for combining data from these highly orthogonal phenomenologies. In this work we explore a wide array of pattern recognition techniques for algorithm development and sensor fusion. Results with the ARA Nemesis landmine detection system suggest that nonlinear and non-parametric classification algorithms provide significant performance benefits for single-sensor algorithm development, and that fusion of multiple algorithms can be performed satisfactorily using basic parametric approaches, such as logistic discriminant classification, for the targets under consideration in our data sets.

  2. Kalman Filter Sensor Fusion for Mecanum Wheeled Automated Guided Vehicle Localization

    Directory of Open Access Journals (Sweden)

    Sang Won Yoon

    2015-01-01

    Full Text Available The Mecanum automated guided vehicle (AGV), which can move in any direction by using a special wheel structure with a LIM-wheel and a diagonally positioned roller, holds considerable promise for the field of industrial electronics. Conventional methods for Mecanum AGV localization have certain limitations, such as slip phenomena, caused by variations in the road surface and ground friction. Precise localization in the presence of this inevitable slip is therefore a very important issue, so a sensor fusion technique based on the Kalman filter was developed to cope with this drawback. An encoder and StarGazer were used for sensor fusion. StarGazer, a position sensor based on an image recognition device, always produces some error due to the limitations of image recognition; encoder errors, by contrast, are free of such momentary disturbances but accumulate over time. In this study, we developed a Mecanum AGV prototype system and showed by simulation that the disadvantages of each sensor can be eliminated. We obtained precise localization of the Mecanum AGV in a slip situation via sensor fusion using a Kalman filter.
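
    A one-dimensional sketch of the encoder/StarGazer fusion described above: the encoder drives the prediction (its error grows with each step) and the StarGazer fix drives the update (its error is bounded but ever-present). The noise variances are hypothetical, not taken from the paper:

    ```python
    def kalman_fuse(x, P, u, z, Q=0.05, R=0.2):
        """One predict/update cycle of a scalar Kalman filter.
        x, P : previous position estimate and its variance
        u    : encoder displacement since the last step (dead reckoning)
        z    : StarGazer absolute position fix
        Q, R : process (encoder) and measurement (StarGazer) noise variances
        """
        # Predict: dead-reckon with the encoder; uncertainty grows by Q.
        x_pred = x + u
        P_pred = P + Q
        # Update: blend in the absolute fix; uncertainty shrinks.
        K = P_pred / (P_pred + R)           # Kalman gain
        x_new = x_pred + K * (z - x_pred)
        P_new = (1.0 - K) * P_pred
        return x_new, P_new
    ```

    Run in a loop, the variance stays bounded: StarGazer fixes keep the encoder's drift from accumulating, while the encoder smooths out StarGazer's per-fix recognition error.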

  3. Oil exploration oriented multi-sensor image fusion algorithm

    Science.gov (United States)

    Xiaobing, Zhang; Wei, Zhou; Mengfei, Song

    2017-04-01

    In order to accurately forecast fractures and the fracture dominance direction in oil exploration, in this paper we propose a novel multi-sensor image fusion algorithm. The main innovations of this paper are that we introduce the dual-tree complex wavelet transform (DTCWT) into data fusion and divide an image into several regions before image fusion. DTCWT is a new type of wavelet transform, designed to solve the problem of signal decomposition and reconstruction based on two parallel transforms of a real wavelet. We utilize DTCWT to segment the features of the input images and generate a region map, and then exploit the normalized Shannon entropy of a region to design the priority function. To test the effectiveness of our proposed multi-sensor image fusion algorithm, four standard pairs of images are used to construct the dataset. Experimental results demonstrate that the proposed algorithm achieves high accuracy in multi-sensor image fusion, especially for images of oil exploration.

  4. Multiple image sensor data fusion through artificial neural networks

    Science.gov (United States)

    With multisensor data fusion technology, the data from multiple sensors are fused in order to make a more accurate estimation of the environment through measurement, processing and analysis. Artificial neural networks are the computational models that mimic biological neural networks. With high per...

  5. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...

  6. Fault-tolerant Sensor Fusion for Marine Navigation

    DEFF Research Database (Denmark)

    Blanke, Mogens

    2006-01-01

    where essential navigation information is provided even with multiple faults in instrumentation. The paper proposes a provable correct implementation through auto-generated state-event logics in a supervisory part of the algorithms. Test results from naval vessels document the performance and shows...... events where the fault-tolerant sensor fusion provided uninterrupted navigation data despite temporal instrument defects...

  8. Sensor fusion for active vibration isolation in precision equipment

    NARCIS (Netherlands)

    Tjepkema, D.; Dijk, van J.; Soemers, H.M.J.R.

    2012-01-01

    Sensor fusion is a promising control strategy to improve the performance of active vibration isolation systems that are used in precision equipment. Normally, those vibration isolation systems are only capable of realizing a low transmissibility. Additional objectives are to increase the damping rat

  9. Integration of multiple sensor fusion in controller design.

    Science.gov (United States)

    Abdelrahman, Mohamed; Kandasamy, Parameshwaran

    2003-04-01

    The main focus of this research is to reduce the risk of a catastrophic response of a feedback control system when some of the feedback data from the system sensors are not reliable, while maintaining a reasonable performance of the control system. In this paper a methodology for integrating multiple sensor fusion into the controller design is presented. The multiple sensor fusion algorithm produces, in addition to the estimate of the measurand, a parameter that measures the confidence in the estimated value. This confidence is integrated as a parameter into the controller to produce fast system response when the confidence in the estimate is high, and a slow response when the confidence in the estimate is low. Conditions for the stability of the system with the developed controller are discussed. This methodology is demonstrated on a cupola furnace model. The simulations illustrate the advantages of the new methodology.

  10. Resource-Aware Data Fusion Algorithms for Wireless Sensor Networks

    CERN Document Server

    Abdelgawad, Ahmed

    2012-01-01

    This book introduces resource-aware data fusion algorithms to gather and combine data from multiple sources (e.g., sensors) in order to achieve inferences.  These techniques can be used in centralized and distributed systems to overcome sensor failure, technological limitation, and spatial and temporal coverage problems. The algorithms described in this book are evaluated with simulation and experimental results to show they will maintain data integrity and make data useful and informative.   Describes techniques to overcome real problems posed by wireless sensor networks deployed in circumstances that might interfere with measurements provided, such as strong variations of pressure, temperature, radiation, and electromagnetic noise; Uses simulation and experimental results to evaluate algorithms presented and includes real test-bed; Includes case study implementing data fusion algorithms on a remote monitoring framework for sand production in oil pipelines.

  11. Extending lifetime of wireless sensor networks using multi-sensor data fusion

    Indian Academy of Sciences (India)

    SOUMITRA DAS; S BARANI; SANJEEV WAGH; S S SONAVANE

    2017-07-01

    In this paper, a multi-sensor data fusion approach for wireless sensor networks based on Bayesian methods and ant colony optimization techniques is proposed. In this method, each node is equipped with multiple sensors (i.e., temperature and humidity). Use of more than one sensor provides additional information about the environmental conditions. A data fusion approach based on competitive-type hierarchical processing is considered for experimentation. Initially, the data are collected by the sensors placed in the sensing fields, and then the data fusion probabilities are computed on the sensed data. In the proposed methodology, the collected temperature and humidity data are processed by multi-sensor data fusion techniques, which help in decreasing the energy consumption as well as the communication cost by fusing redundant data. The multi-sensor data fusion process improves the reliability and accuracy of the sensed information and simultaneously saves energy, which was our primary objective. The proposed algorithms were simulated using Matlab. The proposed and low-energy adaptive clustering hierarchy algorithms were executed, and the results show that the proposed algorithms efficiently reduce energy use and save more energy, thus increasing the overall network lifetime.

  12. Sensor Fusion for Nuclear Proliferation Activity Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Adel Ghanem, Ph D

    2007-03-30

    The objective of Phase 1 of this STTR project is to demonstrate a Proof-of-Concept (PoC) of the Geo-Rad system that integrates a location-aware SmartTag (made by ZonTrak) and a radiation detector (developed by LLNL). It also includes the ability to transmit the collected radiation data and location information to the ZonTrak server (ZonService). The collected data is further transmitted to a central server at LLNL (the Fusion Server) to be processed in conjunction with overhead imagery to generate location estimates of nuclear proliferation and radiation sources.

  13. Fusion techniques for hybrid ground-penetrating radar: electromagnetic induction landmine detection systems

    Science.gov (United States)

    Laffin, Matt; Mohamed, Magdi A.; Etebari, Ali; Hibbard, Mark

    2010-04-01

    Hybrid ground penetrating radar (GPR) and electromagnetic induction (EMI) sensors have advanced landmine detection far beyond the capabilities of a single sensing modality. Both probability of detection (PD) and false alarm rate (FAR) are impacted by the algorithms utilized by each sensing mode and the manner in which the information is fused. Algorithm development and fusion will be discussed, with an aim at achieving a threshold probability of detection (PD) of 0.98 with a low false alarm rate (FAR) of less than 1 false alarm per 2 square meters. Stochastic evaluation of prescreeners and classifiers is presented with subdivisions determined based on mine type, metal content, and depth. Training and testing of an optimal prescreener on lanes that contain mostly low metal anti-personnel mines is presented. Several fusion operators for pre-screeners and classifiers, including confidence map multiplication, will be investigated and discussed for integration into the algorithm architecture.

  14. Multi-sensor image fusion using discrete wavelet frame transform

    Institute of Scientific and Technical Information of China (English)

    Zhenhua Li(李振华); Zhongliang Jing(敬忠良); Shaoyuan Sun(孙韶媛)

    2004-01-01

    An algorithm is presented for multi-sensor image fusion using discrete wavelet frame transform (DWFT).The source images to be fused are firstly decomposed by DWFT. The fusion process is the combining of the source coefficients. Before the image fusion process, image segmentation is performed on each source image in order to obtain the region representation of each source image. For each source image, the salience of each region in its region representation is calculated. By overlapping all these region representations of all the source images, we produce a shared region representation to label all the input images. The fusion process is guided by these region representations. Region match measure of the source images is calculated for each region in the shared region representation. When fusing the similar regions, weighted averaging mode is performed; otherwise selection mode is performed. Experimental results using real data show that the proposed algorithm outperforms the traditional pyramid transform based or discrete wavelet transform (DWT) based algorithms in multi-sensor image fusion.
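
    The region-match rule above (weighted averaging for similar content, selection otherwise) can be sketched per coefficient with the classic salience/match formulation; this is a simplified stand-in for the paper's region-based version, with an illustrative match threshold:

    ```python
    import numpy as np

    def fuse_coeffs(a, b, threshold=0.75):
        """Fuse two wavelet-coefficient arrays from registered source images.
        Salience is local energy; match is a normalized correlation. High
        match -> weighted averaging mode, low match -> selection mode."""
        ea, eb = a * a, b * b
        match = 2 * a * b / (ea + eb + 1e-12)           # in [-1, 1]
        a_wins = ea >= eb
        hi = np.where(a_wins, a, b)                     # more salient source
        lo = np.where(a_wins, b, a)
        w = 0.5 + 0.5 * (1 - match) / (1 - threshold)   # weight for `hi`
        w = np.clip(w, 0.5, 1.0)
        return np.where(match > threshold,
                        w * hi + (1 - w) * lo,          # averaging mode
                        hi)                             # selection mode
    ```

    At match = 1 the modes coincide with a plain average; at the threshold the weight reaches 1 and averaging degenerates into selection, so the fused output varies smoothly with the match measure.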

  15. Sensor-fusion-based biometric identity verification

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W. [Sandia National Labs., Albuquerque, NM (United States); Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L. [New Mexico State Univ., Las Cruces, NM (United States). Electronic Vision Research Lab.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.

  17. Maneuvering Vehicle Tracking Based on Multi-sensor Fusion

    Institute of Scientific and Technical Information of China (English)

    CHEN Ying; HAN Chong-Zhao

    2005-01-01

    Maneuvering target tracking is a fundamental task in intelligent vehicle research. This paper focuses on the problem of fusion between radar and image sensors in target tracking. In order to improve positioning accuracy and narrow down the image working area, a novel method that integrates radar filtering with image intensity is proposed to establish an adaptive vision window. A weighted Hausdorff distance is introduced to define the functional relationship between the image and the model projection, and a modified simulated annealing algorithm is used to find the optimum orientation parameter. Furthermore, the global state is estimated using a distributed data fusion algorithm. Experiment results show that our method is accurate.

  18. Data fusion for target tracking and classification with wireless sensor network

    Science.gov (United States)

    Pannetier, Benjamin; Doumerc, Robin; Moras, Julien; Dezert, Jean; Canevet, Loic

    2016-10-01

    In this paper, we address the problem of multiple ground target tracking and classification with information obtained from an unattended wireless sensor network. A multiple target tracking (MTT) algorithm, taking into account road and vegetation information, is proposed based on a centralized architecture. One of the key issues is how to adapt the classical MTT approach to satisfy embedded processing. Based on track statistics, the classification algorithm uses estimated location, velocity and acceleration to help classify targets. The algorithm enables tracking of humans and vehicles driving both on and off road. We integrate road or trail width and vegetation cover as constraints in the target motion models to improve the performance of tracking under constraint with classification fusion. Our algorithm also uses different dynamic models to cope with target maneuvers. The tracking and classification algorithms are integrated into an operational platform (the fusion node). In order to handle realistic ground target tracking scenarios, we use an autonomous smart computer deposited in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of the system is evaluated in a real exercise for an intelligence operation ("hunter hunt" scenario).

  19. Data fusion of multiple kinect sensors for a rehabilitation system.

    Science.gov (United States)

    Huibin Du; Yiwen Zhao; Jianda Han; Zheng Wang; Guoli Song

    2016-08-01

    Kinect-like depth sensors have been widely used in rehabilitation systems. However, a single depth sensor copes poorly with limb blocking, data loss and data error, making it less reliable. This paper focuses on using two Kinect sensors and a data fusion method to solve these problems. First, the two Kinect sensors capture the motion data of the healthy arm of the hemiplegic patient. Second, the data are merged using the Set-Membership-Filter (SMF) method. The motion data are then mirrored about the middle plane. Finally, the wearable robotic arm drives the patient's paralytic arm, so that the patient can interactively and actively complete a variety of recovery actions prompted by a computer with 3D animation games.

  20. Non-Linear Fusion of Observations Provided by Two Sensors

    Directory of Open Access Journals (Sweden)

    Monir Azmani

    2013-07-01

    Full Text Available When we try to make the best estimate of some quantity, the problem of combining results from different experiments is encountered. In multi-sensor data fusion, the problem is seen as combining observations provided by different sensors. Sensors provide observations and information on an unknown quantity, which can differ in precision. We propose a combined estimate that uses prior information. We consider the simpler aspects of the problem, so that two sensors provide an observation of the same quantity. The standard error of the observations is supposed to be known. The prior information is an interval that bounds the parameter of the estimate. We derive the proposed combined estimate methodology, and we show its efficiency in the minimum mean square sense. The proposed combined estimate is assessed using synthetic data, and an application is presented.
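
    For two observations of the same quantity with known standard errors, the minimum mean-square combined estimate is the inverse-variance weighted mean; a bounding prior interval can then be imposed by projection. A minimal sketch (the interval handling here is a simple clamp, not the paper's exact derivation):

    ```python
    def combined_estimate(z1, s1, z2, s2, lo=None, hi=None):
        """Fuse observations z1, z2 with known standard errors s1, s2.
        Returns the combined estimate and its standard error; [lo, hi]
        optionally encodes the prior interval bounding the parameter."""
        w1, w2 = 1.0 / s1 ** 2, 1.0 / s2 ** 2
        z = (w1 * z1 + w2 * z2) / (w1 + w2)   # inverse-variance weighting
        s = (1.0 / (w1 + w2)) ** 0.5          # always <= min(s1, s2)
        if lo is not None and hi is not None:
            z = max(lo, min(hi, z))           # project onto the prior interval
        return z, s
    ```

    Note that the fused standard error is always below that of either sensor alone, which is the efficiency gain the abstract refers to.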

  1. Fusion of Onboard Sensors for Better Navigation

    Directory of Open Access Journals (Sweden)

    Ravi Shankar

    2013-03-01

    Full Text Available This paper presents simulation results of navigation sensors such as the integrated navigation system (INS), global navigation satellite system (GNSS) and TACAN sensors onboard an aircraft to find the navigation solution. Mathematical models for the INS, GNSS (GPS) satellite trajectories, GPS receiver and TACAN characteristics are simulated in Matlab. The INS simulation generates the output for position, velocity and attitude based on the Aerosonde dynamic model. The GPS constellation is generated based on the YUMA almanac data. The GPS dilution of precision (DOP) parameters are calculated, and the best combination of four satellites (minimum PDOP) is used for calculating the user position and velocity. The INS, GNSS and TACAN solutions are integrated through a loosely coupled extended Kalman filter to calculate the optimum navigation solution. This work is a stepping stone for providing an aircraft-based augmentation system for required navigation performance in terms of availability, accuracy, continuity and integrity. Defence Science Journal, 2013, 63(2), pp. 145-152, DOI: http://dx.doi.org/10.14429/dsj.63.4256

  2. Distributed Sensor Fusion for Scalar Field Mapping Using Mobile Sensor Networks.

    Science.gov (United States)

    La, Hung Manh; Sheng, Weihua

    2013-04-01

    In this paper, autonomous mobile sensor networks are deployed to measure a scalar field and build its map. We develop a novel method for multiple mobile sensor nodes to build this map using noisy sensor measurements. Our method consists of two parts. First, we develop a distributed sensor fusion algorithm by integrating two different distributed consensus filters to achieve cooperative sensing among sensor nodes. This fusion algorithm has two phases. In the first phase, the weighted average consensus filter is developed, which allows each sensor node to find an estimate of the value of the scalar field at each time step. In the second phase, the average consensus filter is used to allow each sensor node to find a confidence of the estimate at each time step. The final estimate of the value of the scalar field is iteratively updated during the movement of the mobile sensors via weighted average. Second, we develop the distributed flocking-control algorithm to drive the mobile sensors to form a network and track the virtual leader moving along the field when only a small subset of the mobile sensors know the information of the leader. Experimental results are provided to demonstrate our proposed algorithms.
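
    The average-consensus building block used in both phases can be sketched as follows; the ring topology and step size are illustrative (stability requires eps below the reciprocal of the maximum node degree):

    ```python
    import numpy as np

    def consensus_step(x, A, eps=0.2):
        """One discrete-time consensus iteration: each node moves toward
        its neighbors' estimates. A is the symmetric adjacency matrix."""
        deg = A.sum(axis=1)
        return x + eps * (A @ x - deg * x)

    # Four nodes on a ring, each holding a noisy reading of the same field value.
    A = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
    x = np.array([1.0, 2.0, 3.0, 2.0])
    for _ in range(100):
        x = consensus_step(x, A)
    # All nodes converge to the network-wide average reading.
    ```

    Because the update is symmetric, the network-wide mean is preserved at every step, which is what lets each node recover the average using only neighbor-to-neighbor communication.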

  3. Diagnostics and data fusion of robotic sensors

    Energy Technology Data Exchange (ETDEWEB)

    Dhar, M.; Bardsley, S; Cowper, L.; Hamm, R.; Jammu, V.; Wagner, J.

    1996-12-31

    Robotic systems for remediation of hazardous waste sites must be highly reliable to avoid equipment failures and subsequent possible exposure of personnel to hazardous environments. Safe, efficient cleanup operations also require accurate, complete knowledge of the task space. This paper presents progress made on an 18-month program to meet these needs. To enhance robot reliability, a conceptual design of a monitoring and diagnostic system is being developed to predict the onset of mechanical failure modes, provide maximum lead time to make operational changes or repairs, and minimize the occurrence of on-site breakdowns. To ensure safe operation, a comprehensive software package is being developed that will fuse data and poses from multiple surface mapping sensors so as to reduce the error effects in individual data points and provide accurate 3-D maps of a work space.

  4. Sensor fusion by pseudo information measure: a mobile robot application.

    Science.gov (United States)

    Asharif, Mohammad Reza; Moshiri, Behzad; HoseinNezhad, Reza

    2002-07-01

    In any autonomous mobile robot, one of the most important issues to be designed and implemented is environment perception. In this paper, a new approach is formulated to perform sensory data integration for generation of an occupancy grid map of the environment. This method is an extended version of the Bayesian fusion method for independent sources of information. The performance of the proposed fusion method and its sensitivity are discussed. Map-building simulation for a cylindrical robot with eight ultrasonic sensors and a mapping implementation for a Khepera robot were carried out separately in simulation and experimental work. A new neural structure is introduced for conversion of the proximity data given by the Khepera IR sensors to occupancy probabilities. Path planning experiments have also been applied to the resulting maps. For each map, two factors are considered and calculated: the fitness and the augmented occupancy of the map with respect to the ideal map. The length and the least distance to obstacles are the other two factors calculated for the routes produced by the path planning experiments. Experimental and simulation results show that with the new fusion formulas, more informative maps of the environment are obtained, from which more appropriate routes can be derived. In fact, there is a tradeoff between the length of the resulting routes and their safety, and by choosing the proper fusion function this tradeoff is suitably tuned for different map building applications.
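
    Bayesian fusion of independent readings into an occupancy grid is commonly written in log-odds form, where independent evidence simply adds per cell. A minimal sketch under a uniform 0.5 prior (the paper's neural conversion of IR proximity data to probabilities, and its extended fusion formulas, are outside this snippet):

    ```python
    import numpy as np

    def logodds(p):
        return np.log(p / (1.0 - p))

    def fuse_cell(readings, prior=0.5):
        """Fuse independent occupancy probabilities for one grid cell.
        Each reading is the probability of occupancy given one observation."""
        L = logodds(prior) + sum(logodds(p) - logodds(prior) for p in readings)
        return 1.0 / (1.0 + np.exp(-L))    # back to a probability
    ```

    Agreeing readings push the cell toward certainty while conflicting readings cancel back to the prior, which is the behavior the occupancy-probability conversion feeds into.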

  5. Ground strain measuring system using optical fiber sensors

    Science.gov (United States)

    Sato, Tadanobu; Honda, Riki; Shibata, Shunjiro; Takegawa, Naoki

    2001-08-01

    This paper presents a device to measure the dynamic horizontal shear strain of the ground during earthquake. The proposed device consists of a bronze plate with fiber Bragg grating sensors attached on it. The device is vertically installed in the ground, and horizontal shear strain of the ground is measured as deflection angle of the plate. Employment of optical fiber sensors makes the proposed device simple in mechanism and highly durable, which makes it easy to install our device in the ground. We conducted shaking table tests using ground model to verify applicability of the proposed device.

  6. Comparison and Intercalibration of Vegetation Indices from Different Sensors for Monitoring Above-Ground Plant Nitrogen Uptake in Winter Wheat

    Directory of Open Access Journals (Sweden)

    Yan Zhu

    2013-03-01

    Full Text Available Various sensors have been used to obtain the canopy spectral reflectance for monitoring above-ground plant nitrogen (N) uptake in winter wheat. Comparison and intercalibration of spectral reflectance and vegetation indices derived from different sensors are important for multi-sensor data fusion and utilization. In this study, the spectral reflectance and its derived vegetation indices from three ground-based sensors (ASD Field Spec Pro spectrometer, CropScan MSR 16 and GreenSeeker RT 100) in six winter wheat field experiments were compared. Then, the best sensor (ASD) and its normalized difference vegetation index NDVI (807, 736) for estimating above-ground plant N uptake were determined (R2 of 0.885 and RMSE of 1.440 g·N·m−2 for model calibration). In order to better utilize the spectral reflectance from the three sensors, intercalibration models for vegetation indices based on the different sensors were developed. The results indicated that the vegetation indices from different sensors could be intercalibrated, which should promote application of data fusion and make monitoring of above-ground plant N uptake more precise and accurate.
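
    As an illustration of the two ingredients above, the snippet below computes NDVI from two band reflectances and fits a linear intercalibration model mapping one sensor's index onto another's scale; the paired readings are fabricated for illustration, not data from the experiments:

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized difference vegetation index from band reflectances."""
        return (nir - red) / (nir + red)

    # Hypothetical paired NDVI readings of the same plots from two sensors.
    ndvi_a = np.array([0.42, 0.55, 0.61, 0.70, 0.78])   # reference sensor
    ndvi_b = np.array([0.40, 0.52, 0.59, 0.69, 0.76])   # sensor to calibrate
    # Least-squares line mapping sensor B's index onto sensor A's scale.
    slope, intercept = np.polyfit(ndvi_b, ndvi_a, 1)
    calibrated = slope * ndvi_b + intercept
    ```

    Once fitted, the slope/intercept pair lets indices from either sensor be used interchangeably in a common N-uptake model.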

  7. Towards an operational sensor-fusion system for anti-personnel landmine detection

    NARCIS (Netherlands)

    Cremer, F.; Schutte, K.; Schavemaker, J.G.M.; Breejen, E. den

    2000-01-01

    To acquire detection performance required for an operational system for the detection of anti-personnel landmines, it is necessary to use multiple sensors and sensor-fusion techniques. This paper describes five decision-level sensor-fusion techniques and their common optimisation method. The perform

  8. Cognitive foundations for model-based sensor fusion

    Science.gov (United States)

    Perlovsky, Leonid I.; Weijers, Bertus; Mutz, Chris W.

    2003-08-01

    Target detection, tracking, and sensor fusion are complicated problems, which usually are performed sequentially. First detecting targets, then tracking, then fusing multiple sensors reduces computations. This procedure however is inapplicable to difficult targets which cannot be reliably detected using individual sensors, on individual scans or frames. In such more complicated cases one has to perform functions of fusing, tracking, and detecting concurrently. This often has led to prohibitive combinatorial complexity and, as a consequence, to sub-optimal performance as compared to the information-theoretic content of all the available data. It is well appreciated that in this task the human mind is by far superior qualitatively to existing mathematical methods of sensor fusion, however, the human mind is limited in the amount of information and speed of computation it can cope with. Therefore, research efforts have been devoted toward incorporating "biological lessons" into smart algorithms, yet success has been limited. Why is this so, and how to overcome existing limitations? The fundamental reasons for current limitations are analyzed and a potentially breakthrough research and development effort is outlined. We utilize the way our mind combines emotions and concepts in the thinking process and present the mathematical approach to accomplishing this in the current technology computers. The presentation will summarize the difficulties encountered by intelligent systems over the last 50 years related to combinatorial complexity, analyze the fundamental limitations of existing algorithms and neural networks, and relate it to the type of logic underlying the computational structure: formal, multivalued, and fuzzy logic. A new concept of dynamic logic will be introduced along with algorithms capable of pulling together all the available information from multiple sources. This new mathematical technique, like our brain, combines conceptual understanding with

  9. Contribution of sensor fusion to urban mapping: application to simulated SPOT 5-6 data

    OpenAIRE

    1996-01-01

    International audience; This communication intends to enhance the contribution of a sensor fusion method to urban mapping using the simulated SPOT 5-6 data. A new scheme is proposed for cartography of urban areas which takes into account the multispectral and the multiresolution nature of the data. This process makes use of classification and segmentation. An application of the sensor fusion method to analyse the simulated SPOT 5-6 data is presented. The benefits of using sensor fusion before...

  10. AN INFORMATION FUSION METHOD FOR SENSOR DATA RECTIFICATION

    Institute of Scientific and Technical Information of China (English)

    Zhang Zhen; Xu Lizhong; Harry Hua Li; Shi Aiye; Han Hua; Wang Huibin

    2012-01-01

    In the applications of water regime monitoring, incompleteness and inaccuracy of sensor data may directly affect the reliability of the acquired monitoring information. Based on the spatial and temporal correlation of water regime monitoring information, this paper addresses this issue and proposes an information fusion method to implement data rectification. An improved Back Propagation (BP) neural network is used to perform data fusion on the hardware platform of a station unit, which takes a Field-Programmable Gate Array (FPGA) as the core component. In order to verify the effectiveness, five measurements including water level, discharge and velocity are selected from three different points in a water regime monitoring station. The simulation results show that this method can rectify random errors as well as gross errors significantly.

  11. Discrete Kalman Filter based Sensor Fusion for Robust Accessibility Interfaces

    Science.gov (United States)

    Ghersi, I.; Mariño, M.; Miralles, M. T.

    2016-04-01

    Human-machine interfaces have evolved, benefiting from the growing access to devices with superior, embedded signal-processing capabilities, as well as from new sensors that allow the estimation of movements and gestures, resulting in increasingly intuitive interfaces. In this context, sensor fusion for the estimation of the spatial orientation of body segments allows more robust solutions, overcoming specific disadvantages of isolated sensors, such as the sensitivity of magnetic-field sensors to external influences when used in uncontrolled environments. In this work, a method for combining image-processing data with angular-velocity registers from a 3D MEMS gyroscope, through a discrete-time Kalman filter, is proposed and deployed as an alternate user interface for mobile devices, in which an on-screen pointer is controlled with head movements. Results concerning the general performance of the method are presented, as well as a comparative analysis, under a dedicated test application, with results from a previous version of this system, in which the relative-orientation information was acquired directly from MEMS sensors (3D magnetometer-accelerometer). These results show an improved response for this new version of the pointer, both in terms of precision and response time, while keeping many of the benefits highlighted for its predecessor, yielding a complementary signal-acquisition method that can be used as an alternative input device, as well as in accessibility solutions.

  12. Distributed fusion and automated sensor tasking in ISR systems

    Science.gov (United States)

    Preden, Jurgo; Pahtma, Raido; Astapov, Sergei; Ehala, Johannes; Riid, Andri; Motus, Leo

    2014-06-01

    Modern Intelligence, Surveillance and Reconnaissance (ISR) systems are increasingly being assembled from autonomous systems, so the resulting ISR system is a System of Systems (SoS). In order to take full advantage of the capabilities of the ISR SoS, the architecture and the design of these SoS should be able to facilitate the benefits inherent in a SoS approach - high resilience, higher level of adaptability and higher diversity, enabling on-demand system composition. The tasks performed by an ISR SoS can well go beyond basic data acquisition, conditioning and communication, as data processing can be easily integrated in the SoS. Such an ISR SoS can perform data fusion, classification and tracking (and conditional sensor tasking for additional data acquisition); these are extremely challenging tasks in this context, especially if the fusion is performed in a distributed manner. Our premise for the ISR SoS design and deployment is that the system is not designed as a complete system, where the capabilities of individual data providers are considered and the interaction paths, including communication channel capabilities, are specified at design time. Instead, we assume a loosely coupled SoS, where the data needs for a specific fusion task are described at a high level at design time and the data providers (i.e., sensor systems) required for a specific fusion task are discovered dynamically at run time, the selection criteria for the data providers being the type and properties of data that each provider can supply. The paper describes some of the aspects of a distributed ISR SoS design and implementation, with examples of both the architectural design and the algorithm implementations.

  13. Sensor Fusion - Sonar and Stereo Vision, Using Occupancy Grids and SIFT

    DEFF Research Database (Denmark)

    Plascencia, Alfredo; Bendtsen, Jan Dimon

    2006-01-01

    The main contribution of this paper is to present a sensor fusion approach to scene environment mapping as part of a SDF (Sensor Data Fusion) architecture. This approach involves combined sonar and stereo vision readings. Sonar readings are interpreted using probability density functions...

  14. An alternative sensor fusion method for object orientation using low-cost MEMS inertial sensors

    Science.gov (United States)

    Bouffard, Joshua L.

    This thesis develops an alternative sensor fusion approach for object orientation using low-cost MEMS inertial sensors. The alternative approach focuses on the unique challenges of small UAVs. Such challenges include the vibration-induced noise on the accelerometer and the bias offset errors of the rate gyroscope. To overcome these challenges, a sensor fusion algorithm combines the measured data from the accelerometer and rate gyroscope to achieve a single output free from vibrational noise and bias offset errors. One of the most prevalent sensor fusion algorithms used for orientation estimation is the Extended Kalman filter (EKF). The EKF performs the fusion process by first creating a process model using the nonlinear equations of motion and then establishing a measurement model. With the process and measurement models established, the filter operates by propagating the mean and covariance of the states through time. The success of the EKF relies on the ability to establish a representative process and measurement model of the system. In most applications, the EKF measurement model utilizes the accelerometer and GPS-derived accelerations to determine an estimate of the orientation. However, if the GPS-derived accelerations are not available, the measurement model becomes less reliable when subjected to harsh vibrational environments. This situation led to the alternative approach, which focuses on the correlation between the rate gyroscope and the accelerometer-derived angle. The correlation between the two sensors then determines how much the algorithm will use one sensor over the other. The result is a measurement that suffers from neither the vibrational noise nor the bias offset errors.
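
The thesis's correlation-driven weighting is not reproduced here, but the underlying idea of trusting the gyro at short time scales while letting the accelerometer-derived angle bound the drift can be sketched with a fixed-weight complementary filter. The bias, noise levels, and blend factor below are invented.

```python
import numpy as np

def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Blend integrated gyro rate (high-pass path) with the accelerometer-derived
    angle (low-pass path) using a fixed weight alpha."""
    angle = accel_angles[0]
    out = []
    for w, a in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + w * dt) + (1.0 - alpha) * a
        out.append(angle)
    return np.array(out)

# Static case: true angle 5 deg; gyro has a +1 deg/s bias, accel has vibration noise
dt = 0.005
n = 4000
rng = np.random.default_rng(1)
gyro = 1.0 + rng.normal(0.0, 0.2, n)     # true rate is zero: reading is pure bias+noise
accel = 5.0 + rng.normal(0.0, 2.0, n)    # noisy but unbiased absolute angle
est = complementary_filter(gyro, accel, dt)
```

Pure gyro integration would drift by about 20 degrees over this run; the accelerometer term holds the blended estimate near the true angle while still suppressing most of the vibration noise.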

  15. OBSTACLE DETECTION SYSTEM INVOLVING FUSION OF MULTIPLE SENSOR TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    C. Giannì

    2017-08-01

    Full Text Available Obstacle detection is a fundamental task for Unmanned Aerial Vehicles (UAVs) as part of a Sense and Avoid system. In this study, we present a method of multi-sensor obstacle detection that demonstrated good results on different kinds of obstacles. This method can be implemented on low-cost platforms involving a DSP or a small FPGA. In this paper, we also present a study of the typical targets that can be tough to detect because of their reflectivity, form factor, and heterogeneity, and show how data fusion can often overcome the limitations of each technology.

  16. Hand-Writing Motion Tracking with Vision-Inertial Sensor Fusion: Calibration and Error Correction

    Directory of Open Access Journals (Sweden)

    Shengli Zhou

    2014-08-01

    Full Text Available The purpose of this study was to improve the accuracy of real-time ego-motion tracking through inertial and vision sensor fusion. Due to the low sampling rates supported by web-based vision sensors and the accumulation of errors in inertial sensors, ego-motion tracking with vision sensors is commonly afflicted by slow updating rates, while motion tracking with inertial sensors suffers from rapid deterioration in accuracy with time. This paper starts with a discussion of the developed algorithms for calibrating two relative rotations of the system using only one reference image. Next, the stochastic noises associated with the inertial sensor are identified using Allan variance analysis and modeled according to their characteristics. Finally, the proposed models are incorporated into an extended Kalman filter for inertial sensor and vision sensor fusion. Compared with results from conventional sensor fusion models, we have shown that ego-motion tracking can be greatly enhanced using the proposed error correction model.
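
Allan variance analysis, mentioned above for identifying the inertial sensor's stochastic noise terms, can be sketched as follows. The input is synthetic white rate noise, for which the Allan deviation should fall off as 1/sqrt(tau); all parameter values are arbitrary, and real sensor logs would show additional slopes (bias instability, rate random walk).

```python
import numpy as np

def allan_deviation(rate, dt, taus):
    """Overlapping Allan deviation of a rate signal for given cluster times."""
    theta = np.cumsum(rate) * dt                  # integrate rate to angle
    adev = []
    for tau in taus:
        m = int(round(tau / dt))                  # samples per cluster
        d = theta[2 * m:] - 2 * theta[m:-m] + theta[:-2 * m]
        avar = np.mean(d ** 2) / (2.0 * tau ** 2)
        adev.append(np.sqrt(avar))
    return np.array(adev)

# White rate noise (angle random walk only): sigma(tau) ~ 1/sqrt(tau)
rng = np.random.default_rng(2)
dt = 0.01
rate = rng.normal(0.0, 1.0, 200_000)
taus = np.array([0.1, 1.0, 10.0])
adev = allan_deviation(rate, dt, taus)
```

Reading the slope of log(adev) against log(tau) is how the individual noise terms are separated before building the filter's noise model.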

  17. Relative Vessel Motion Tracking using Sensor Fusion, Aruco Markers, and MRU Sensors

    Directory of Open Access Journals (Sweden)

    Sondre Sanden Tordal

    2017-04-01

    Full Text Available This paper presents a novel approach for estimating the relative motion between two moving offshore vessels. The method is based on a sensor fusion algorithm including a vision system and two motion reference units (MRUs). The vision system makes use of the open-source computer vision library OpenCV and a cube with Aruco markers placed onto each of the cube's sides. The Extended Quaternion Kalman Filter (EQKF) is used for bad-pose rejection for the vision system. The presented sensor fusion algorithm is based on the Indirect Feedforward Kalman Filter for error estimation. The system is self-calibrating in the sense that the Aruco cube can be placed in an arbitrary location on the secondary vessel. Experimental 6-DOF results demonstrate the accuracy and efficiency of the proposed sensor fusion method compared with the internal joint sensors of two Stewart platforms and an industrial robot. The standard deviation error was found to be 31 mm or better when the Aruco cube was placed at three different locations.

  18. Development of Mine Explosion Ground Truth Smart Sensors

    Science.gov (United States)

    2011-09-01

    Steven R. Taylor, Phillip E. Harben, Steve Jarpe, and David B. Harris. Rocky...improved location is the compilation of ground truth data sets for which origin time and location are accurately known. Substantial effort by the...National Laboratories and seismic monitoring groups has been undertaken to acquire and develop ground truth catalogs that form the basis of location...

  19. Context-aided sensor fusion for enhanced urban navigation.

    Science.gov (United States)

    Martí, Enrique David; Martín, David; García, Jesús; de la Escalera, Arturo; Molina, José Manuel; Armingol, José María

    2012-12-06

    The deployment of Intelligent Vehicles in urban environments requires reliable estimation of positioning for urban navigation. The inherent complexity of this kind of environment fosters the development of novel systems which should provide reliable and precise solutions to the vehicle. This article details an advanced GNSS/IMU fusion system based on a context-aided Unscented Kalman filter for navigation in urban conditions. The constrained non-linear filter is here conditioned by a contextual knowledge module which reasons about sensor quality and driving context in order to adapt it to the situation, while at the same time it carries out a continuous estimation and correction of INS drift errors. An exhaustive analysis has been carried out with available data in order to characterize the behavior of the available sensors and take it into account in the developed solution. The performance is then analyzed with an extensive dataset containing representative situations. The results demonstrate the suitability of the proposed fusion algorithms for deploying Intelligent Transport Systems in urban environments.

  20. Semantically enriched data for effective sensor data fusion

    Science.gov (United States)

    de Mel, Geeth; Pham, Tien; Damarla, Thyagaraju; Vasconcelos, Wamberto; Norman, Tim

    2011-06-01

    Data fusion plays a major role in assisting decision makers by providing them with an improved situational awareness so that informed decisions can be made about the events that occur in the field. This involves combining a multitude of sensor modalities such that the resulting output is better (i.e., more accurate, complete, dependable, etc.) than it would have been if the data streams (hereinafter referred to as 'feeds') from the resources were taken individually. However, these feeds lack any context-related information (e.g., detected event, event classification, relationships to other events, etc.). This hinders the fusion process and may result in an incorrect picture of the situation, which in turn produces false alarms and wastes valuable time and resources. In this paper, we propose an approach that enriches feeds with semantic attributes so that these feeds have proper meaning. This will assist underlying applications in presenting analysts with the correct feeds for a particular event for fusion. We argue that annotated stored feeds will assist in the easy retrieval of historical data that may be related to the current fusion. We use OWL-DL, a subset of the Web Ontology Language (OWL), to present a lightweight and efficient knowledge layer for feed annotation, and use rules to capture crucial domain concepts. We discuss a solution architecture and provide a proof-of-concept tool to evaluate the proposed approach. We discuss the importance of such an approach with a set of use cases and show how a tool like the one proposed could assist analysts and planners in making better-informed decisions.

  1. Applications of FBG-based sensors to ground stability monitoring

    Institute of Scientific and Technical Information of China (English)

    An-Bin Huang; Chien-Chih Wang; Jui-Ting Lee; Yen-Te Ho

    2016-01-01

    Over the past few decades, many optical fiber sensing techniques have been developed. Among these available sensing methods, optical fiber Bragg grating (FBG) is probably the most popular one. With its unique capabilities, FBG-based geotechnical sensors can be used as a sensor array for distributive (profile) measurements, deployed under water (submersible), for localized high resolution and/or differential measurements. The authors have developed a series of FBG-based transducers that include inclination, linear displacement and gauge/differential pore pressure sensors. Techniques that involve the field deployment of FBG inclination, extension and pore-pressure sensor arrays for automated slope stability and ground subsidence monitoring have been developed. The paper provides a background of FBG and the design concepts behind the FBG-based field monitoring sensors. Cases of field monitoring using the FBG sensor arrays are presented, and their practical implications are discussed.

  2. Sensor fusion III: 3-D perception and recognition; Proceedings of the Meeting, Boston, MA, Nov. 5-8, 1990

    Science.gov (United States)

    Schenker, Paul S. (Editor)

    1991-01-01

    The volume on data fusion from multiple sources discusses fusing multiple views, temporal analysis and 3D motion interpretation, sensor fusion and eye-to-hand coordination, and integration in human shape perception. Attention is given to surface reconstruction, statistical methods in sensor fusion, fusing sensor data with environmental knowledge, computational models for sensor fusion, and evaluation and selection of sensor fusion techniques. Topics addressed include the structure of a scene from two and three projections, optical flow techniques for moving target detection, tactical sensor-based exploration in a robotic environment, and the fusion of human and machine skills for remote robotic operations. Also discussed are K-nearest-neighbor concepts for sensor fusion, surface reconstruction with discontinuities, a sensor-knowledge-command fusion paradigm for man-machine systems, coordinating sensing and local navigation, and terrain map matching using multisensing techniques for applications to autonomous vehicle navigation.

  3. Optimal multi-sensor Kalman smoothing fusion for discrete multichannel ARMA signals

    Institute of Scientific and Technical Information of China (English)

    Shuli SUN

    2005-01-01

    Based on the multi-sensor optimal information fusion criterion weighted by matrices in the linear minimum variance sense, using white noise estimators, an optimal fusion distributed Kalman smoother is given for discrete multi-channel ARMA (autoregressive moving average) signals. The smoothing error cross-covariance matrices between any two sensors are given for measurement noises. Furthermore, the fusion smoother gives higher precision than any local smoother does.
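
The matrix-weighted minimum-variance combination at the heart of such fusion can be illustrated for two local estimates. This sketch assumes zero cross-covariance between the local estimation errors, whereas the paper explicitly derives and uses the cross-covariance matrices between sensors; the numbers are invented.

```python
import numpy as np

def fuse_two(x1, P1, x2, P2):
    """Minimum-variance fusion of two unbiased estimates with covariances P1, P2
    (cross-covariance between the local errors assumed zero in this sketch)."""
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(P1i + P2i)          # fused covariance
    x = P @ (P1i @ x1 + P2i @ x2)         # matrix-weighted combination
    return x, P

# Two local estimates that are each precise in a different component
x1 = np.array([1.0, 0.0]); P1 = np.diag([1.0, 4.0])
x2 = np.array([0.0, 2.0]); P2 = np.diag([4.0, 1.0])
x, P = fuse_two(x1, P1, x2, P2)
```

Each component of the fused estimate leans toward the sensor that measures it more precisely, and the fused covariance is smaller than either local one, mirroring the abstract's claim that the fusion smoother beats any local smoother.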

  4. Adaptive and mobile ground sensor array.

    Energy Technology Data Exchange (ETDEWEB)

    Holzrichter, Michael Warren; O'Rourke, William T.; Zenner, Jennifer; Maish, Alexander B.

    2003-12-01

    The goal of this LDRD was to demonstrate the use of robotic vehicles for deploying and autonomously reconfiguring seismic and acoustic sensor arrays with high (centimeter) accuracy to obtain enhancement of our capability to locate and characterize remote targets. The capability to accurately place sensors and then retrieve and reconfigure them allows sensors to be placed in phased arrays in an initial monitoring configuration and then to be reconfigured in an array tuned to the specific frequencies and directions of the selected target. This report reviews the findings and accomplishments achieved during this three-year project. This project successfully demonstrated autonomous deployment and retrieval of a payload package with an accuracy of a few centimeters using differential global positioning system (GPS) signals. It developed an autonomous, multisensor, temporally aligned, radio-frequency communication and signal processing capability, and an array optimization algorithm, which was implemented on a digital signal processor (DSP). Additionally, the project converted the existing single-threaded, monolithic robotic vehicle control code into a multi-threaded, modular control architecture that enhances the reuse of control code in future projects.

  5. Prospects of steady state magnetic diagnostic of fusion reactors based on metallic Hall sensors

    Science.gov (United States)

    Ďuran, I.; Sentkerestiová, J.; Kovařík, K.; Viererbl, L.

    2012-06-01

    Employment of sensors based on the Hall effect (Hall sensors) is one of the candidate approaches to detection of almost steady state magnetic fields in future fusion reactors based on magnetic confinement (tokamaks, stellarators, etc.), and also in possible fusion-fission hybrid systems having these fusion reactors as a neutron source and driver. This contribution reviews the initial considerations concerning the application of metallic Hall sensors in the fusion reactor's harsh environment, which includes high neutron loads (>10^18 cm^-2) and elevated temperatures (>200°C). In particular, the candidate sensing materials, candidate technologies for sensor production, an initial analysis of activation and transmutation of sensors under reactor-relevant neutron loads, and tests of the first samples of copper Hall sensors are presented.

  6. Application of Multi-Sensors Information Fusion for Self-protection System of Robot

    Directory of Open Access Journals (Sweden)

    Qiuhong Gao

    2013-01-01

    Full Text Available This paper develops a robot self-protection system using multi-sensor information fusion technology. The system uses five groups of photoelectric and ultrasonic sensors installed in different directions on the robot. Signals are gathered using the complementary ranging capabilities of the photoelectric and ultrasonic sensors and then sent to an MCU to achieve multi-sensor information fusion. The core fusion technique is the adaptive weighted fusion estimation algorithm, which makes the measurement data more accurate. With this technique, accurate self-protection commands are produced to avoid obstacles, judge narrow highland, and prevent dropping. The experimental results validated its good self-protection function.
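
Adaptive weighted fusion estimation of the kind the abstract refers to typically weights each sensor by the inverse of its estimated noise variance, so the weights sum to one and the fused value has minimum variance. The sketch below uses invented noise levels for the two sensor types; the robot's actual sensor layout and MCU implementation are not reproduced.

```python
import numpy as np

def adaptive_weighted_fusion(readings):
    """Fuse per-sensor measurement series of a common quantity: each sensor is
    weighted by the inverse of its estimated noise variance (weights sum to 1)."""
    readings = np.asarray(readings, dtype=float)
    variances = readings.var(axis=1, ddof=1)      # estimate each sensor's noise
    inv = 1.0 / variances
    weights = inv / inv.sum()
    fused = float(weights @ readings.mean(axis=1))
    return fused, weights

# Invented scenario: both sensors range the same obstacle at 50 cm
rng = np.random.default_rng(3)
true_dist = 50.0
n = 500
ultrasonic = true_dist + rng.normal(0.0, 2.0, n)     # noisier sensor
photoelectric = true_dist + rng.normal(0.0, 0.5, n)  # more precise sensor
fused, w = adaptive_weighted_fusion([ultrasonic, photoelectric])
```

The weights adapt automatically: the more precise photoelectric channel dominates without any hand-tuned constants.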

  7. Passive localization processing for tactical unattended ground sensors

    Energy Technology Data Exchange (ETDEWEB)

    Ng, L.C.; Breitfeller, E.F.

    1995-09-01

    This report summarizes our preliminary results of a development effort to assess the potential capability of a system of unattended ground sensors to detect, classify, and localize underground sources. This report also discusses the pertinent signal processing methodologies, demonstrates the approach with computer simulations, and validates the simulations with experimental data. Specific localization methods discussed include triangulation and measurement of time difference of arrival from multiple sensor arrays.
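
The time-difference-of-arrival measurement underlying the triangulation methods mentioned above can be sketched with a cross-correlation peak search between two sensor channels. The signals, sampling rate, and 20-sample delay are invented, and the propagation speed used for the range difference is ordinary air-borne sound rather than a seismic velocity.

```python
import numpy as np

def tdoa(sig_a, sig_b, fs):
    """Time difference of arrival (positive: sig_a arrives later than sig_b),
    from the peak of the full cross-correlation."""
    c = np.correlate(sig_a, sig_b, mode="full")
    lag = int(np.argmax(c)) - (len(sig_b) - 1)    # lag in samples
    return lag / fs

# Invented test signals: the same source waveform, delayed by 20 samples
fs = 1000.0
rng = np.random.default_rng(4)
src = rng.normal(0.0, 1.0, 256)
sig_b = np.concatenate([src, np.zeros(64)])
sig_a = np.concatenate([np.zeros(20), src, np.zeros(44)])  # arrives 20 samples later
delay = tdoa(sig_a, sig_b, fs)
range_diff = delay * 343.0    # metres, assuming speed of sound in air
```

With three or more sensors, the set of pairwise range differences defines hyperbolae whose intersection localizes the source.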

  8. Dynamic tire pressure sensor for measuring ground vibration.

    Science.gov (United States)

    Wang, Qi; McDaniel, James Gregory; Wang, Ming L

    2012-11-07

    This work presents a convenient and non-contact acoustic sensing approach for measuring ground vibration. This approach, which uses an instantaneous dynamic tire pressure sensor (DTPS), possesses the capability to replace the accelerometer or directional microphone currently being used for inspecting pavement conditions. By measuring dynamic pressure changes inside the tire, ground vibration can be amplified and isolated from environmental noise. In this work, verifications of the DTPS concept of sensing inside the tire have been carried out. In addition, comparisons between a DTPS, a ground-mounted accelerometer, and a directional microphone are made. A data analysis algorithm has been developed and optimized to reconstruct ground acceleration from DTPS data. Numerical and experimental studies of this DTPS reveal a strong potential for measuring ground vibration caused by a moving vehicle. A calibration of the transfer function between dynamic tire pressure change and ground acceleration may be needed for different tire systems or for more accurate applications.
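
The transfer-function calibration suggested at the end can be sketched with a standard H1 estimator (segment-averaged cross-spectrum divided by the input auto-spectrum). The flat gain of 2.5 relating pressure to acceleration below is invented purely to make the example checkable; a real tire system would show a frequency-dependent response.

```python
import numpy as np

def h1_estimate(x, y, nseg=16):
    """H1 transfer-function estimate from input x to output y, averaging
    windowed cross- and auto-spectra over nseg non-overlapping segments."""
    n = len(x) // nseg
    win = np.hanning(n)
    Sxy = np.zeros(n // 2 + 1, dtype=complex)
    Sxx = np.zeros(n // 2 + 1)
    for i in range(nseg):
        X = np.fft.rfft(x[i * n:(i + 1) * n] * win)
        Y = np.fft.rfft(y[i * n:(i + 1) * n] * win)
        Sxy += np.conj(X) * Y
        Sxx += np.abs(X) ** 2
    return Sxy / Sxx

# Invented calibration run: broadband pressure input, noisy acceleration output
rng = np.random.default_rng(6)
pressure = rng.normal(0.0, 1.0, 16_384)                   # tire pressure change
accel = 2.5 * pressure + rng.normal(0.0, 0.1, 16_384)     # ground acceleration
H = h1_estimate(pressure, accel)
gain = float(np.abs(H[1:-1]).mean())                      # flat gain estimate
```

Averaging over segments suppresses the measurement noise, which is why the H1 estimator is the usual choice when the noise sits on the output channel.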

  9. Compact networked radars for Army unattended ground sensors

    Science.gov (United States)

    Wikner, David A.; Viveiros, Edward A.; Wellman, Ronald; Clark, John; Kurtz, Jim; Pulskamp, Jeff; Proie, Robert; Ivanov, Tony; Polcawich, Ronald G.; Adler, Eric D.

    2010-04-01

    The Army Research Laboratory is in partnership with the University of Florida - Electronics Communications Laboratory to develop compact radar technology and demonstrate that it is scalable to a variety of ultra-lightweight platforms (<10 lbs.) to meet Army mission needs in persistent surveillance, unattended ground sensor (UGS), unmanned systems, and man-portable sensor applications. The advantage of this compact radar is its steerable beam technology and relatively long-range capability compared to other small, battery-powered radar concepts. This paper will review the ongoing development of the sensor and presents a sample of the collected data thus far.

  10. Knowledge assistant: A sensor fusion framework for robotic environmental characterization

    Energy Technology Data Exchange (ETDEWEB)

    Feddema, J.T.; Rivera, J.J.; Tucker, S.D.

    1996-12-01

    A prototype sensor fusion framework called the "Knowledge Assistant" has been developed and tested on a gantry robot at Sandia National Laboratories. This Knowledge Assistant guides the robot operator during the planning, execution, and post-analysis stages of the characterization process. During the planning stage, the Knowledge Assistant suggests robot paths and speeds based on knowledge of the sensors available and their physical characteristics. During execution, the Knowledge Assistant coordinates the collection of data through a data acquisition "specialist." During execution and post analysis, the Knowledge Assistant sends raw data to other "specialists," which include statistical pattern recognition software, a neural network, and model-based search software. After the specialists return their results, the Knowledge Assistant consolidates the information and returns a report to the robot control system where the sensed objects and their attributes (e.g. estimated dimensions, weight, material composition, etc.) are displayed in the world model. This paper highlights the major components of this system.

  11. A Fault-Tolerant Multiple Sensor Fusion Approach Applied to UAV Attitude Estimation

    Directory of Open Access Journals (Sweden)

    Yu Gu

    2016-01-01

    Full Text Available A novel sensor fusion design framework is presented with the objective of improving the overall multisensor measurement system performance and achieving graceful degradation following individual sensor failures. The Unscented Information Filter (UIF is used to provide a useful tool for combining information from multiple sources. A two-step off-line and on-line calibration procedure refines sensor error models and improves the measurement performance. A Fault Detection and Identification (FDI scheme crosschecks sensor measurements and simultaneously monitors sensor biases. Low-quality or faulty sensor readings are then rejected from the final sensor fusion process. The attitude estimation problem is used as a case study for the multiple sensor fusion algorithm design, with information provided by a set of low-cost rate gyroscopes, accelerometers, magnetometers, and a single-frequency GPS receiver’s position and velocity solution. Flight data collected with an Unmanned Aerial Vehicle (UAV research test bed verifies the sensor fusion, adaptation, and fault-tolerance capabilities of the designed sensor fusion algorithm.

  12. SENSOR FUSION CONTROL SYSTEM FOR COMPUTER INTEGRATED MANUFACTURING

    Directory of Open Access Journals (Sweden)

    C.M. Kumile

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Manufacturing companies of today face unpredictable, high frequency market changes driven by global competition. To stay competitive, these companies must have the characteristics of cost-effective rapid response to the market needs. As an engineering discipline, mechatronics strives to integrate mechanical, electronic, and computer systems optimally in order to create high precision products and manufacturing processes. This paper presents a methodology of increasing flexibility and reusability of a generic computer integrated manufacturing (CIM cell-control system using simulation and modelling of mechatronic sensory system (MSS concepts. The utilisation of sensors within the CIM cell is highlighted specifically for data acquisition, analysis, and multi-sensor data fusion. Thus the designed reference architecture provides comprehensive insight for the functions and methodologies of a generic shop-floor control system (SFCS, which consequently enables the rapid deployment of a flexible system.

    AFRIKAANSE OPSOMMING: Today's manufacturing enterprises regularly experience unpredictable market changes driven by worldwide competition. To remain competitive, these enterprises must exhibit cost-effectiveness and rapid response to market fluctuations. Mechatronics strives to integrate mechanical, electronic, and computer systems optimally in order to create high-precision products and production processes. This article suggests a methodology for increasing the adaptability and reusability of a generic computer-integrated manufacturing cell control system through simulation and the modelling of mechatronic sensor-system concepts. The application of sensors within the cell facilitates data capture, analysis, and multi-sensor data fusion. In this way the designed architecture provides insight into the function and methodology of a generic shop-floor control system, which enables the rapid

  13. Sensor fusion V; Proceedings of the Meeting, Boston, MA, Nov. 15-17, 1992

    Science.gov (United States)

    Schenker, Paul S.

    1992-11-01

    Topics addressed include 3D object perception, human-machine interface in multisensor systems, sensor fusion architecture, fusion of multiple and distributed sensors, interface and decision models for sensor fusion, computational networks, simple sensing for complex action, multisensor-based control, and metrology and calibration of multisensor systems. Particular attention is given to controlling 3D objects by sketching 2D views, the graphical simulation and animation environment for flexible structure robots, designing robotic systems from sensorimotor modules, cylindrical object reconstruction from a sequence of images, an accurate estimation of surface properties by integrating information using Bayesian networks, an adaptive fusion model for a distributed detection system, multiple concurrent object descriptions in support of autonomous navigation, robot control with multiple sensors and heuristic knowledge, and optical array detectors for image sensors calibration. (No individual items are abstracted in this volume)

  14. CONDITION MONITOR OF DEEP-HOLE DRILLING BASED ON MULTI-SENSOR INFORMATION FUSION

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A condition monitoring method for deep-hole drilling based on multi-sensor information fusion is discussed. The vibration and cutting force signals are collected when the condition of deep-hole drilling on stainless steel 0Cr17Ni4Cu4Nb is normal or abnormal. Four eigenvectors are extracted by time-domain and frequency-domain analysis of the signals. The four eigenvectors are then combined and sent to neural networks for processing. The fusion results indicate that multi-sensor information fusion is superior to single-sensor information, and that the cutting force signal reflects the condition of the cutting tool better than the vibration signal.

  15. Belief Function Based Decision Fusion for Decentralized Target Classification in Wireless Sensor Networks.

    Science.gov (United States)

    Zhang, Wenyu; Zhang, Zhenjiang

    2015-08-19

    Decision fusion in sensor networks enables sensors to improve classification accuracy while reducing the energy consumption and bandwidth demand for data transmission. In this paper, we focus on the decentralized multi-class classification fusion problem in wireless sensor networks (WSNs), and a new simple but effective decision fusion rule based on belief function theory is proposed. Unlike existing belief function based decision fusion schemes, the proposed approach is compatible with any type of classifier because the basic belief assignments (BBAs) of each sensor are constructed on the basis of the classifier's training-output confusion matrix and real-time observations. We also derive an explicit global BBA in the fusion center under Dempster's combination rule, greatly simplifying the decision-making operation in the fusion center. Sending the whole BBA structure to the fusion center is also avoided. Experimental results demonstrate that the proposed fusion rule achieves better fusion accuracy than the naïve Bayes rule and the weighted majority voting rule.
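
Dempster's combination rule, which the fusion center above applies to the sensors' BBAs, can be sketched directly. The two example BBAs over a two-class {car, truck} frame are invented; the paper's confusion-matrix-based BBA construction is not reproduced.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two BBAs whose focal elements are
    frozensets; mass on conflicting (disjoint) pairs is renormalized away."""
    fused, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            fused[inter] = fused.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb
    k = 1.0 - conflict                # normalization constant
    return {s: v / k for s, v in fused.items()}

# Invented BBAs from two sensors over the frame {car, truck}
A, B = frozenset({"car"}), frozenset({"truck"})
theta = A | B                         # total ignorance
m1 = {A: 0.6, theta: 0.4}             # sensor 1 leans toward "car"
m2 = {A: 0.5, B: 0.3, theta: 0.2}     # sensor 2: mixed evidence
m = dempster_combine(m1, m2)
```

Since both sensors put most of their mass on "car", the combined belief in "car" exceeds either individual assignment, which is the reinforcing behavior the rule is designed to produce.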

  16. Estimating Orientation Using Magnetic and Inertial Sensors and Different Sensor Fusion Approaches: Accuracy Assessment in Manual and Locomotion Tasks

    Directory of Open Access Journals (Sweden)

    Elena Bergamini

    2014-10-01

    Full Text Available Magnetic and inertial measurement units are an emerging technology for obtaining the 3D orientation of body segments in human movement analysis. In this respect, sensor fusion is used to limit the drift errors resulting from gyroscope data integration by exploiting accelerometer and magnetic aiding sensors. The present study aims at investigating the effectiveness of sensor fusion methods under different experimental conditions. Manual and locomotion tasks, differing in time duration, measurement volume, presence/absence of static phases, and out-of-plane movements, were performed by six subjects, and recorded by one unit located on the forearm or the lower trunk, respectively. Two sensor fusion methods, representative of the stochastic (Extended Kalman Filter) and complementary (Non-linear observer) filtering approaches, were selected, and their accuracy was assessed in terms of attitude (pitch and roll angle) and heading (yaw angle) errors using stereophotogrammetric data as a reference. The sensor fusion approaches provided significantly more accurate results than gyroscope data integration. Accuracy improved mostly for heading and when the movement exhibited stationary phases and evenly distributed 3D rotations, occurred in a small volume, and lasted longer than approximately 20 s. These results were independent of the specific sensor fusion method used. Practice guidelines for improving the outcome accuracy are provided.

  17. Soft sensor design by multivariate fusion of image features and process measurements

    DEFF Research Database (Denmark)

    Lin, Bao; Jørgensen, Sten Bay

    2011-01-01

    This paper presents a multivariate data fusion procedure for the design of dynamic soft sensors where suitably selected image features are combined with traditional process measurements to enhance the performance of data-driven soft sensors. A key issue of fusing multiple sensor data, i.e. to determine... oxides (NOx) emission of cement kilns. On-site tests demonstrate improved performance over soft sensors based on conventional process measurements only.

  18. Asynchronous Sensor fuSion for Improved Safety of air Traffic (ASSIST) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — SSCI proposes to develop, implement and test a collision detection system for unmanned aerial vehicles (UAV), referred to as the Asynchronous Sensor fuSion for...

  19. Fault diagnosis in neutral point indirectly grounded system based on information fusion

    Institute of Scientific and Technical Information of China (English)

    于飞; 鞠丽叶; 刘喜梅; 崔平远; 钟秋海

    2003-01-01

    In neutral point indirectly grounded systems, the phase-to-ground fault places new demands on fault diagnosis technology. Information fusion is applied to detect the phase-to-ground fault by integrating several sources of information, including line current, line voltage, zero-sequence current and voltage, and the quintic harmonic component. The method is verified through Matlab simulation. Simulation results show that the precision and reliability of the detection are greatly increased.

  20. Dynamic reweighting of three modalities for sensor fusion.

    Directory of Open Access Journals (Sweden)

    Sungjae Hwang

    Full Text Available We simultaneously perturbed visual, vestibular and proprioceptive modalities to understand how sensory feedback is re-weighted so that overall feedback remains suited to stabilizing upright stance. Ten healthy young subjects received an 80 Hz vibratory stimulus to their bilateral Achilles tendons (stimulus turns on-off at 0.28 Hz), a ± 1 mA binaural monopolar galvanic vestibular stimulus at 0.36 Hz, and a visual stimulus at 0.2 Hz during standing. The visual stimulus was presented at different amplitudes (0.2, 0.8 deg rotation about ankle axis) to measure: the change in gain (weighting) to vision, an intramodal effect; and a change in gain to vibration and galvanic vestibular stimulation, both intermodal effects. The results showed a clear intramodal visual effect, indicating a de-emphasis on vision when the amplitude of visual stimulus increased. At the same time, an intermodal visual-proprioceptive reweighting effect was observed with the addition of vibration, which is thought to change proprioceptive inputs at the ankles, forcing the nervous system to rely more on vision and vestibular modalities. Similar intermodal effects for visual-vestibular reweighting were observed, suggesting that vestibular information is not a "fixed" reference, but is dynamically adjusted in the sensor fusion process. This is the first time, to our knowledge, that the interplay between the three primary modalities for postural control has been clearly delineated, illustrating a central process that fuses these modalities for accurate estimates of self-motion.

  1. A radiosonde using a humidity sensor array with a platinum resistance heater and multi-sensor data fusion.

    Science.gov (United States)

    Shi, Yunbo; Luo, Yi; Zhao, Wenjie; Shang, Chunxue; Wang, Yadong; Chen, Yinsheng

    2013-07-12

    This paper describes the design and implementation of a radiosonde which can measure the meteorological temperature, humidity, pressure, and other atmospheric data. The system is composed of a CPU, microwave module, temperature sensor, pressure sensor and humidity sensor array. In order to effectively solve the humidity sensor condensation problem due to the low temperatures in the high altitude environment, a capacitive humidity sensor including four humidity sensors to collect meteorological humidity and a platinum resistance heater was developed using micro-electro-mechanical-system (MEMS) technology. A platinum resistance wire with 99.999% purity and 0.023 mm in diameter was used to obtain the meteorological temperature. A multi-sensor data fusion technique was applied to process the atmospheric data. Static and dynamic experimental results show that the designed humidity sensor with platinum resistance heater can effectively tackle the sensor condensation problem, shorten response times and enhance sensitivity. The humidity sensor array can improve measurement accuracy and obtain a reliable initial meteorological humidity data, while the multi-sensor data fusion technique eliminates the uncertainty in the measurement. The radiosonde can accurately reflect the meteorological changes.
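
    One standard way a multi-sensor fusion step "eliminates the uncertainty in the measurement," as the abstract puts it, is an inverse-variance weighted average of the redundant readings: the fused variance is never larger than the best individual sensor's. The sketch below is an illustration of that principle only; the numbers and the function are hypothetical, not the radiosonde's actual algorithm.

```python
import numpy as np

def fuse_redundant(readings, variances):
    """Minimum-variance (inverse-variance weighted) fusion of redundant
    sensor readings; returns the fused value and its variance."""
    w = 1.0 / np.asarray(variances, dtype=float)
    fused = float(np.sum(w * np.asarray(readings, dtype=float)) / np.sum(w))
    fused_var = float(1.0 / np.sum(w))
    return fused, fused_var

# Hypothetical humidity readings (%RH) from a four-element sensor array.
readings = [41.2, 40.8, 41.5, 40.9]
variances = [0.30, 0.25, 0.40, 0.25]
value, var = fuse_redundant(readings, variances)
```

    The more reliable elements dominate the weighted average, and the fused variance drops well below that of any single element of the array.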

  2. A Radiosonde Using a Humidity Sensor Array with a Platinum Resistance Heater and Multi-Sensor Data Fusion

    Directory of Open Access Journals (Sweden)

    Yadong Wang

    2013-07-01

    Full Text Available This paper describes the design and implementation of a radiosonde which can measure the meteorological temperature, humidity, pressure, and other atmospheric data. The system is composed of a CPU, microwave module, temperature sensor, pressure sensor and humidity sensor array. In order to effectively solve the humidity sensor condensation problem due to the low temperatures in the high altitude environment, a capacitive humidity sensor including four humidity sensors to collect meteorological humidity and a platinum resistance heater was developed using micro-electro-mechanical-system (MEMS) technology. A platinum resistance wire with 99.999% purity and 0.023 mm in diameter was used to obtain the meteorological temperature. A multi-sensor data fusion technique was applied to process the atmospheric data. Static and dynamic experimental results show that the designed humidity sensor with platinum resistance heater can effectively tackle the sensor condensation problem, shorten response times and enhance sensitivity. The humidity sensor array can improve measurement accuracy and obtain a reliable initial meteorological humidity data, while the multi-sensor data fusion technique eliminates the uncertainty in the measurement. The radiosonde can accurately reflect the meteorological changes.

  3. Application of D-S Evidence Fusion Method in the Fault Detection of Temperature Sensor

    Directory of Open Access Journals (Sweden)

    Zheng Dou

    2014-01-01

    Full Text Available Due to the complexity and dangerousness of the drying process, fault detection for its temperature sensors is difficult and hazardous in actual working practice, and its effectiveness is often unsatisfactory. To address this problem, based on the idea of information fusion and the requirements of the D-S evidence method, a two-layer D-S evidence fusion structure is introduced in this paper to detect temperature sensor faults in the drying process. The first layer is the data layer, which establishes the basic belief assignment function of the evidence and is realized by a BP neural network. The second layer is the decision layer, which detects and locates the sensor fault and is realized by the D-S evidence fusion method. The numerical simulation results show that the working conditions of the sensors are described effectively and accurately by this method, so it can be used to detect and locate sensor faults.
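
    The decision-layer combination step described above uses Dempster's rule, which multiplies basic probability assignments (BPAs) over intersecting focal elements and renormalizes away the conflicting mass. The sketch below shows the rule itself; the two BPAs are hypothetical stand-ins for the neural-network outputs, and the two-element frame {fault, normal} is an illustrative simplification.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments (BPAs) whose focal elements are frozensets."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb       # mass falling on the empty set
    norm = 1.0 - conflict             # renormalize non-conflicting mass
    return {s: v / norm for s, v in combined.items()}

FAULT = frozenset({"fault"})
THETA = frozenset({"fault", "normal"})   # the whole frame (ignorance)
# Hypothetical BPAs from two evidence sources for the same sensor.
m1 = {FAULT: 0.8, THETA: 0.2}
m2 = {FAULT: 0.6, THETA: 0.4}
fused = dempster_combine(m1, m2)
```

    Two agreeing sources reinforce each other: the combined fault mass (0.92) exceeds either input's, which is what lets the decision layer commit to a fault location.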

  4. An Investigation for Ground State Features of Some Structural Fusion Materials

    Science.gov (United States)

    Aytekin, H.; Tel, E.; Baldik, R.; Aydin, A.

    2011-02-01

    Environmental concerns associated with fossil fuels are creating increased interest in alternative non-fossil energy sources. Nuclear fusion can be one of the most attractive sources of energy from the viewpoint of safety and minimal environmental impact. Considered across all energy systems, the performance requirements for structural materials in a fusion reactor first wall, blanket or diverter are arguably the most demanding. Developing fusion materials and understanding their nuclear properties is important for the safety of fusion power systems. In this paper, the ground state properties of some structural fusion materials, namely 27Al, 51V, 52Cr, 55Mn, and 56Fe, are investigated using the Skyrme-Hartree-Fock method. The obtained results are discussed and compared with the available experimental data.

  5. Freeway Multisensor Data Fusion Approach Integrating Data from Cellphone Probes and Fixed Sensors

    Directory of Open Access Journals (Sweden)

    Shanglu He

    2016-01-01

    Full Text Available Freeway traffic state information from multiple sources provides sufficient support for traffic surveillance but also brings challenges. This paper investigates the fusion of a new data combination from a cellular handoff probe system and microwave sensors, and proposes a fusion method based on the neural network technique. To identify the factors influencing the accuracy of fusion results, we analyzed the sensitivity of those factors by changing the inputs of the neural-network-based fusion model. The results showed that handoff link length and sample size were the most influential parameters for the precision of fusion. The effectiveness and capability of the proposed fusion method under various traffic conditions were then evaluated, and a comparative analysis between the proposed method and other fusion approaches was conducted. The results of the simulation test and evaluation showed that the fusion method could complement the drawbacks of each collection method, improve the overall estimation accuracy, adapt to variable traffic conditions (free flow or incident state), suit the fusion of data from cellphone probes and fixed sensors, and outperform other fusion methods.

  6. Detecting Pedestrian Flocks by Fusion of Multi-Modal Sensors in Mobile Phones

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun; Wirz, Martin; Roggen, Daniel

    2012-01-01

    derived from multiple sensor modalities of modern smartphones. Automatic detection of flocks has several important applications, including evacuation management and socially aware computing. The novelty of this paper is, firstly, to use data fusion techniques to combine several sensor modalities (Wi...

  7. OPART: an intelligent sensor dedicated to ground robotics

    Science.gov (United States)

    Dalgalarrondo, Andre; Luzeaux, Dominique; Hoffmann, Patrik W.

    2001-09-01

    We present an intelligent sensor, consisting of two CCDs with different fields of view that share the same optical motion, can be controlled independently or jointly about their horizontal, vertical and rotational axes, and are connected in a closed loop to image processing resources. The goal of such a sensor is to serve as a testbed for image processing algorithms under real conditions. It illustrates the active perception paradigm and is used for autonomous navigation and target detection/tracking missions. Such a sensor has to meet many requirements: it is designed to be easily mounted on a standard tracked or wheeled military vehicle evolving in offroad conditions. Due to the rather wide range of missions UGVs may be involved in, and to the computing cost of image processing, its computing resources have to be reprogrammable, powerful (to meet real-time constraints), modular at both the software and hardware levels, and able to communicate with other systems. First, the paper details the mechanical, electronic and software design of the whole sensor. Then we explain its operation, the constraints imposed by its parallel processing architecture, the image processing algorithms implemented for it, and their current uses and performance. Finally, we describe experiments conducted on tracked and wheeled vehicles and conclude on the future development and use of this sensor for unmanned ground vehicles.

  8. Sensor Fusion of Cameras and a Laser for City-Scale 3D Reconstruction

    Directory of Open Access Journals (Sweden)

    Yunsu Bok

    2014-11-01

    Full Text Available This paper presents a sensor fusion system of cameras and a 2D laser sensor for large-scale 3D reconstruction. The proposed system is designed to capture data on a fast-moving ground vehicle. The system consists of six cameras and one 2D laser sensor, and they are synchronized by a hardware trigger. Reconstruction of 3D structures is done by estimating frame-by-frame motion and accumulating vertical laser scans, as in previous works. However, our approach does not assume near 2D motion, but estimates free motion (including absolute scale) in 3D space using both laser data and image features. In order to avoid the degeneration associated with typical three-point algorithms, we present a new algorithm that selects 3D points from two frames captured by multiple cameras. The problem of error accumulation is solved by loop closing, not by GPS. The experimental results show that the estimated path is successfully overlaid on the satellite images, such that the reconstruction result is very accurate.

  9. Sensor fusion of cameras and a laser for city-scale 3D reconstruction.

    Science.gov (United States)

    Bok, Yunsu; Choi, Dong-Geol; Kweon, In So

    2014-11-04

    This paper presents a sensor fusion system of cameras and a 2D laser sensor for large-scale 3D reconstruction. The proposed system is designed to capture data on a fast-moving ground vehicle. The system consists of six cameras and one 2D laser sensor, and they are synchronized by a hardware trigger. Reconstruction of 3D structures is done by estimating frame-by-frame motion and accumulating vertical laser scans, as in previous works. However, our approach does not assume near 2D motion, but estimates free motion (including absolute scale) in 3D space using both laser data and image features. In order to avoid the degeneration associated with typical three-point algorithms, we present a new algorithm that selects 3D points from two frames captured by multiple cameras. The problem of error accumulation is solved by loop closing, not by GPS. The experimental results show that the estimated path is successfully overlaid on the satellite images, such that the reconstruction result is very accurate.

  10. Distributed fusion estimation for sensor networks with communication constraints

    CERN Document Server

    Zhang, Wen-An; Song, Haiyu; Yu, Li

    2016-01-01

    This book systematically presents energy-efficient robust fusion estimation methods to achieve thorough and comprehensive results in the context of network-based fusion estimation. It summarizes recent findings on fusion estimation with communication constraints; several novel energy-efficient and robust design methods for dealing with energy constraints and network-induced uncertainties, such as delays, packet losses, and asynchronous information, are presented. All the results are presented as algorithms, which are convenient for practical applications.

  11. Fusion of airborne radar and FLIR sensors for runway incursion detection

    Science.gov (United States)

    White, Joseph H.; Haidt, James G.; Britt, Charles L.; Archer, Cynthia; Neece, Robert T.

    2009-08-01

    Forward looking infrared and Radar (X-band or Ku-band) sensors are potential components in external hazard monitoring systems for general aviation aircraft. We are investigating the capability of these sensors to provide hazard information to the pilot when normal visibility is reduced by meteorological conditions. Fusing detection results from FLIR and Radar sensors can improve hazard detection performance. We have developed a demonstration fusion system for the detection of runway incursions. In this paper, we present our fusion system, along with detection results from data recorded on approach to a landing during clear daylight, overcast daylight, and clear night conditions.
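
    A minimal way to fuse two detectors' outputs of the kind described here, assuming conditionally independent confidences and a symmetric prior, is to multiply their likelihood ratios (naive-Bayes fusion). This is a generic sketch with hypothetical numbers, not the demonstration system's actual fusion rule.

```python
def fuse_confidences(p_flir, p_radar):
    """Naive-Bayes fusion of two independent detector confidences
    (posterior probabilities under a 0.5 prior) into one fused score."""
    lr = (p_flir / (1.0 - p_flir)) * (p_radar / (1.0 - p_radar))
    return lr / (1.0 + lr)

# Two moderate detections reinforce each other...
strong = fuse_confidences(0.7, 0.7)
# ...while an uninformative sensor (0.5) leaves the other unchanged.
neutral = fuse_confidences(0.5, 0.6)
```

    Agreement between the sensors raises the fused confidence above either input, which is how fusing FLIR with radar can improve detection performance over each modality alone.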

  12. Gesture-Directed Sensor-Information Fusion for Communication in Hazardous Environments

    Science.gov (United States)

    2010-06-01

    sensors for gesture recognition [1], [2]. An important future step to enhance the effectiveness of the war fighter is to integrate CBRN and other...addition to the standard eGlove magnetic and motion gesture recognition sensors. War fighters progressing through a battlespace are now providing...a camera for gesture recognition is absolutely not an option for a CBRN war fighter in a battlefield scenario. Multi sensor fusion is commonly

  13. Computational Complexity Comparison Of Multi-Sensor Single Target Data Fusion Methods By Matlab

    OpenAIRE

    Hoseini, Sayed Amir; Ashraf, Mohammad Reza

    2013-01-01

    Target tracking using observations from multiple sensors can achieve better estimation performance than a single sensor. The most famous estimation tool in target tracking is the Kalman filter. There are several mathematical approaches to combine the observations of multiple sensors by use of the Kalman filter. An important issue in applying a proper approach is computational complexity. In this paper, four data fusion algorithms based on the Kalman filter are considered, including three centralized and o...
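
    A centralized scheme of the kind compared in such studies stacks both sensors' measurements into a single Kalman update per step. The sketch below uses a scalar random-walk state and illustrative noise values; it is one plausible centralized variant, not the paper's specific algorithms.

```python
import numpy as np

def centralized_kf(z1, z2, r1, r2, q=1e-4, x0=0.0, p0=1.0):
    """Centralized Kalman fusion: both sensors' measurements of a scalar
    random-walk state are stacked and used in one update per step."""
    x, p = x0, p0
    H = np.array([[1.0], [1.0]])        # both sensors observe the state
    R = np.diag([r1, r2])               # stacked measurement noise
    estimates = []
    for m1, m2 in zip(z1, z2):
        p = p + q                       # time update (random walk)
        z = np.array([m1, m2])
        S = p * (H @ H.T) + R           # innovation covariance (2x2)
        K = p * H.T @ np.linalg.inv(S)  # Kalman gain (1x2)
        x = x + float(K @ (z - x))      # measurement update
        p = float(p * (1.0 - K @ H))
        estimates.append(x)
    return np.array(estimates), p

# Hypothetical constant signal observed by two sensors of unequal quality.
rng = np.random.default_rng(1)
n, true = 300, 5.0
z1 = true + rng.normal(0.0, 1.0, n)     # noisier sensor (r1 = 1.0)
z2 = true + rng.normal(0.0, 0.5, n)     # better sensor  (r2 = 0.25)
est, p_final = centralized_kf(z1, z2, r1=1.0, r2=0.25)
```

    The fused posterior variance falls below what either sensor could achieve alone, at the cost of inverting a measurement-dimension matrix every step; that inversion cost is exactly the kind of computational-complexity issue the paper compares across fusion architectures.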

  14. Wrap-Around Out-the-Window Sensor Fusion System

    Science.gov (United States)

    Fox, Jeffrey; Boe, Eric A.; Delgado, Francisco; Secor, James B.; Clark, Michael R.; Ehlinger, Kevin D.; Abernathy, Michael F.

    2009-01-01

    The Advanced Cockpit Evaluation System (ACES) includes communication, computing, and display subsystems, mounted in a van, that synthesize out-the-window views to approximate the views of the outside world as they would be seen from the cockpit of a crewed spacecraft or aircraft, or from a remotely controlled ground vehicle or UAV (unmanned aerial vehicle). The system includes five flat-panel display units arranged approximately in a semicircle around an operator, like cockpit windows. The scene displayed on each panel represents the view through the corresponding cockpit window. Each display unit is driven by a personal computer equipped with a video-capture card that accepts live input from any of a variety of sensors (typically, visible and/or infrared video cameras). Software running in the computers blends the live video images with synthetic images that could be generated, for example, from heads-up-display outputs, waypoints, corridors, or from satellite photographs of the same geographic region. Data from a Global Positioning System receiver and an inertial navigation system aboard the remote vehicle are used by the ACES software to keep the synthetic and live views in registration. If the live image were to fail, the synthetic scenes could still be displayed to maintain situational awareness.

  15. Development of mine explosion ground truth smart sensors

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Steven R. [Rocky Mountain Geophysics, Inc., Los Alamos, NM (United States); Harben, Phillip E. [Rocky Mountain Geophysics, Inc., Los Alamos, NM (United States); Jarpe, Steve [Jarpe Data Solutions, Prescott, AZ (United States); Harris, David B. [Deschutes Signal Processing, Maupin, OR (United States)

    2015-09-14

    Accurate seismo-acoustic source location is one of the fundamental aspects of nuclear explosion monitoring. Critical to improved location is the compilation of ground truth data sets for which origin time and location are accurately known. Substantial effort by the National Laboratories and other seismic monitoring groups has been undertaken to acquire and develop ground truth catalogs that form the basis of location efforts (e.g. Sweeney, 1998; Bergmann et al., 2009; Waldhauser and Richards, 2004). In particular, more GT1 (Ground Truth 1 km) events are required to improve three-dimensional velocity models that are currently under development. Mine seismicity can form the basis of accurate ground truth datasets. Although the location of mining explosions can often be accurately determined using array methods (e.g. Harris, 1991) and from overhead observations (e.g. MacCarthy et al., 2008), accurate origin time estimation can be difficult. Occasionally, mine operators will share shot time, location, explosion size and even shot configuration, but this is rarely done, especially in foreign countries. Additionally, shot times provided by mine operators are often inaccurate. An inexpensive ground truth event detector that could be mailed to a contact, placed in close proximity (< 5 km) to mining regions or earthquake aftershock regions, and that automatically transmits back ground-truth parameters would greatly aid in the development of ground truth datasets that could be used to improve nuclear explosion monitoring capabilities. We are developing an inexpensive, compact, lightweight smart sensor unit (or units) that could be used in the development of ground truth datasets for the purpose of improving nuclear explosion monitoring capabilities. The units must be easy to deploy, be able to operate autonomously for a significant period of time (> 6 months) and inexpensive enough to be discarded after useful operations have expired (although this may not be part of our business

  16. The Rover Environmental Monitoring Station Ground Temperature Sensor: A Pyrometer for Measuring Ground Temperature on Mars

    OpenAIRE

    2010-01-01

    We describe the parameters that drive the design and modeling of the Rover Environmental Monitoring Station (REMS) Ground Temperature Sensor (GTS), an instrument aboard NASA’s Mars Science Laboratory, and report preliminary test results. REMS GTS is a lightweight, low-power, and low cost pyrometer for measuring the Martian surface kinematic temperature. The sensor’s main feature is its innovative design, based on a simple mechanical structure with no moving parts. It includes an in-flight cal...

  17. The Rover Environmental Monitoring Station Ground Temperature Sensor: A Pyrometer for Measuring Ground Temperature on Mars

    Directory of Open Access Journals (Sweden)

    Miguel Ramos

    2010-10-01

    Full Text Available We describe the parameters that drive the design and modeling of the Rover Environmental Monitoring Station (REMS) Ground Temperature Sensor (GTS), an instrument aboard NASA’s Mars Science Laboratory, and report preliminary test results. REMS GTS is a lightweight, low-power, and low cost pyrometer for measuring the Martian surface kinematic temperature. The sensor’s main feature is its innovative design, based on a simple mechanical structure with no moving parts. It includes an in-flight calibration system that permits sensor recalibration when sensor sensitivity has been degraded by deposition of dust over the optics. This paper provides the first results of a GTS engineering model working in a Martian-like, extreme environment.

  18. The fiber optic gyroscope - a portable rotational ground motion sensor

    Science.gov (United States)

    Wassermann, J. M.; Bernauer, F.; Guattari, F.; Igel, H.

    2016-12-01

    It has already been shown that a portable broadband rotational ground motion sensor will have a large impact on several fields of seismological research such as volcanology, marine geophysics, seismic tomography and planetary seismology. Here, we present results of tests and experiments with one of the first broadband rotational motion sensors available. BlueSeis-3A is a fiber optic gyroscope (FOG) especially designed for the needs of seismology, developed by iXBlue, France, in close collaboration with researchers financed by the European Research Council project ROMY (Rotational motions - a new observable for seismology). We first present the instrument characteristics, which were estimated by different standard laboratory tests, e.g. self noise using operational range diagrams or Allan deviation. Next we present the results of a field experiment designed to demonstrate the value of a 6C measurement (3 components of translation and 3 components of rotation). This field test took place at Mt. Stromboli volcano, Italy, and was accompanied by a seismic array installation to check the FOG output against the more commonly known array-derived rotation. As already shown with synthetic data, an additional direct measurement of three components of rotation can reduce the ambiguity in source mechanism estimation and can be used to correct for dynamic tilt of the translational sensors (i.e. seismometers). We can therefore demonstrate that deploying a weak-motion broadband rotational motion sensor produces superior results while reducing the number of deployed instruments.

  19. Multi-Sensor Fusion with Interaction Multiple Model and Chi-Square Test Tolerant Filter.

    Science.gov (United States)

    Yang, Chun; Mohammadi, Arash; Chen, Qing-Wei

    2016-11-02

    Motivated by the key importance of multi-sensor information fusion algorithms in state-of-the-art integrated navigation systems due to recent advancements in sensor technologies, telecommunication, and navigation systems, the paper proposes an improved and innovative fault-tolerant fusion framework. An integrated navigation system is considered consisting of four sensory sub-systems, i.e., Strap-down Inertial Navigation System (SINS), Global Positioning System (GPS), Bei-Dou2 (BD2) and Celestial Navigation System (CNS) navigation sensors. In such multi-sensor applications, on the one hand, the design of an efficient fusion methodology is extremely constrained, especially when no information regarding the system's error characteristics is available. On the other hand, the development of an accurate fault detection and integrity monitoring solution is both challenging and critical. The paper addresses the sensitivity issues of conventional fault detection solutions and the unavailability of a precisely known system model by jointly designing fault detection and information fusion algorithms. In particular, by using ideas from Interacting Multiple Model (IMM) filters, the uncertainty of the system will be adjusted adaptively by model probabilities and using the proposed fuzzy-based fusion framework. The paper also addresses the problem of using corrupted measurements for fault detection purposes by designing a two state propagator chi-square test jointly with the fusion algorithm. Two IMM predictors, running in parallel, are used and alternatively reactivated based on the information received from the fusion filter to increase the reliability and accuracy of the proposed detection solution. With the combination of the IMM and the proposed fusion method, we increase the failure sensitivity of the detection system and, thereby, significantly increase the overall reliability and accuracy of the integrated navigation system. Simulation results indicate that the
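
    The core of a chi-square measurement test of the kind used here is the normalized innovation squared: a measurement is rejected as faulty when the innovation, normalized by its predicted covariance, exceeds a chi-square threshold. The sketch below shows that core test only (the paper's two-propagator arrangement is not reproduced); the 6.63 threshold is an illustrative ~99th percentile for one degree of freedom.

```python
import numpy as np

def chi_square_test(innovation, S, threshold=6.63):
    """Normalized innovation squared (NIS) fault test.

    Flags a fault when nu^T S^-1 nu exceeds `threshold`
    (6.63 is roughly the 99% chi-square point for 1 DOF).
    """
    nu = np.atleast_1d(np.asarray(innovation, dtype=float))
    S = np.atleast_2d(np.asarray(S, dtype=float))
    stat = float(nu @ np.linalg.solve(S, nu))
    return stat, stat > threshold

# A 1-sigma innovation relative to its predicted covariance passes...
ok_stat, ok_fault = chi_square_test(1.0, 1.0)
# ...while a 5-sigma innovation is flagged as a measurement fault.
bad_stat, bad_fault = chi_square_test(5.0, 1.0)
```

    Running such a test on each sensor's innovations before they enter the fusion filter is what keeps corrupted measurements from degrading the fused navigation solution.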

  20. Sensor Data Fusion with Z-Numbers and Its Application in Fault Diagnosis.

    Science.gov (United States)

    Jiang, Wen; Xie, Chunhe; Zhuang, Miaoyan; Shou, Yehang; Tang, Yongchuan

    2016-09-15

    Sensor data fusion technology is widely employed in fault diagnosis. The information in a sensor data fusion system is characterized by not only fuzziness, but also partial reliability. Uncertain information of sensors, including randomness, fuzziness, etc., has been extensively studied recently. However, the reliability of a sensor is often overlooked or cannot be analyzed adequately. A Z-number, Z = (A, B), can represent the fuzziness and the reliability of information simultaneously, where the first component A represents a fuzzy restriction on the values of uncertain variables and the second component B is a measure of the reliability of A. In order to model and process the uncertainties in a sensor data fusion system reasonably, in this paper, a novel method combining the Z-number and Dempster-Shafer (D-S) evidence theory is proposed, where the Z-number is used to model the fuzziness and reliability of the sensor data and the D-S evidence theory is used to fuse the uncertain information of Z-numbers. The main advantages of the proposed method are that it provides a more robust measure of reliability to the sensor data, and the complementary information of multi-sensors reduces the uncertainty of the fault recognition, thus enhancing the reliability of fault detection.

  1. Sensor Data Fusion with Z-Numbers and Its Application in Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Wen Jiang

    2016-09-01

    Full Text Available Sensor data fusion technology is widely employed in fault diagnosis. The information in a sensor data fusion system is characterized by not only fuzziness, but also partial reliability. Uncertain information of sensors, including randomness, fuzziness, etc., has been extensively studied recently. However, the reliability of a sensor is often overlooked or cannot be analyzed adequately. A Z-number, Z = (A, B), can represent the fuzziness and the reliability of information simultaneously, where the first component A represents a fuzzy restriction on the values of uncertain variables and the second component B is a measure of the reliability of A. In order to model and process the uncertainties in a sensor data fusion system reasonably, in this paper, a novel method combining the Z-number and Dempster–Shafer (D-S) evidence theory is proposed, where the Z-number is used to model the fuzziness and reliability of the sensor data and the D-S evidence theory is used to fuse the uncertain information of Z-numbers. The main advantages of the proposed method are that it provides a more robust measure of reliability to the sensor data, and the complementary information of multi-sensors reduces the uncertainty of the fault recognition, thus enhancing the reliability of fault detection.
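
    One common way to inject the reliability component B before D-S combination is classical Shafer discounting: each focal mass is scaled by the reliability, and the remaining mass is moved to the whole frame (total ignorance). The sketch below illustrates that step only, with hypothetical numbers; the paper's Z-number-to-BPA conversion may differ in detail.

```python
def discount_bpa(bpa, reliability, frame):
    """Discount a BPA by source reliability: each focal mass is scaled
    by `reliability` and the remainder becomes mass on the full frame
    (total ignorance), per classical Shafer discounting."""
    theta = frozenset(frame)
    out = {theta: 1.0 - reliability}
    for focal, mass in bpa.items():
        out[focal] = out.get(focal, 0.0) + reliability * mass
    return out

FRAME = {"fault", "normal"}
# Hypothetical fuzzy evidence (the A component) with reliability B = 0.8.
m = {frozenset({"fault"}): 0.9, frozenset(FRAME): 0.1}
discounted = discount_bpa(m, 0.8, FRAME)
```

    A less reliable source thus commits less mass to specific hypotheses, so when the discounted BPAs are later fused with Dempster's rule, unreliable sensors carry proportionally less weight.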

  2. Information Fusion of Online Oil Monitoring System Using Multiple Sensors

    Institute of Scientific and Technical Information of China (English)

    高慧良; 周新聪; 程海明; 赵春华; 严新平

    2004-01-01

    Machine lubrication contains abundant information on the equipment operation. Nowadays, most measuring methods are based on offline sampling or on online measuring with a single sensor. An online oil monitoring system with multiple sensors was designed. The measurement data was processed with a fuzzy intelligence system. Information from the integrated sensors in the online oil monitoring system was evaluated using fuzzy logic. The analyses show that the multiple-sensor evaluation results are more reliable than those of single-sensor online monitoring systems.

  3. JDL level 0 and 1 algorithms for processing and fusion of hard sensor data

    Science.gov (United States)

    Rimland, Jeffrey C.; Iyer, Ganesh M.; Agumamidi, Rachana R.; Pisupati, Soumya V.; Graham, Jake

    2011-06-01

    A current trend in information fusion involves distributed methods of combining both conventional "hard" sensor data and human-based "soft" information in a manner that exploits the most useful and accurate capabilities of each modality. In addition, new and evolving technologies such as Flash LIDAR have greatly enhanced the ability of a single device to rapidly sense attributes of a scene in ways that were not previously possible. At the Pennsylvania State University we are participating in a multi-disciplinary university research initiative (MURI) program funded by the U.S. Army Research Office to investigate issues related to fusing hard and soft data in counterinsurgency (COIN) situations. We are developing level 0 and level 1 methods (using the Joint Directors of Laboratories (JDL) data fusion process model) for fusion of physical ("hard") sensor data. Techniques include methods for data alignment, tracking, recognition, and identification for a sensor suite that includes LIDAR, multi-camera systems, and acoustic sensors. The goal is to develop methods that dovetail with on-going research in soft sensor processing. This paper describes various hard sensor processing algorithms and their evolving roles and implementations within a distributed hard and soft information fusion system.

  4. Optical Communication System for Remote Monitoring and Adaptive Control of Distributed Ground Sensors Exhibiting Collective Intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Cameron, S.M.; Stantz, K.M.; Trahan, M.W.; Wagner, J.S.

    1998-11-01

    Comprehensive management of the battle-space has created new requirements in information management, communication, and interoperability as they affect surveillance and situational awareness. The objective of this proposal is to expand intelligent controls theory to produce a uniquely powerful implementation of distributed ground-based measurement incorporating both local collective behavior, and interoperative global optimization for sensor fusion and mission oversight. By using a layered hierarchal control architecture to orchestrate adaptive reconfiguration of autonomous robotic agents, we can improve overall robustness and functionality in dynamic tactical environments without information bottlenecks. In this concept, each sensor is equipped with a miniaturized optical reflectance modulator which is interactively monitored as a remote transponder using a covert laser communication protocol from a remote mothership or operative. Robot data-sharing at the ground level can be leveraged with global evaluation criteria, including terrain overlays and remote imaging data. Information sharing and distributed intelligence opens up a new class of remote-sensing applications in which small single-function autonomous observers at the local level can collectively optimize and measure large scale ground-level signals. As the need for coverage and the number of agents grows to improve spatial resolution, cooperative behavior orchestrated by a global situational awareness umbrella will be an essential ingredient to offset increasing bandwidth requirements within the net. A system of the type described in this proposal will be capable of sensitively detecting, tracking, and mapping spatial distributions of measurement signatures which are non-stationary or obscured by clutter and interfering obstacles by virtue of adaptive reconfiguration. This methodology could be used, for example, to field an adaptive ground-penetrating radar for detection of underground structures in

  5. Fusion of forward looking infrared and ground penetrating radar for improved stopping distances in landmine detection

    Science.gov (United States)

    Malof, Jordan M.; Morton, Kenneth D.; Collins, Leslie M.; Torrione, Peter A.

    2014-06-01

Ground penetrating radar (GPR) is a popular sensing modality for buried threat detection that offers low false alarm rates (FARs) but suffers from a short detection stopping, or standoff, distance. This short stopping distance leaves little time for the system operator to react when a threat is detected, limiting the speed of advance. This problem arises, in part, from the way GPR data is typically processed: GPR data is first prescreened to reduce the volume of data considered for higher-level feature processing. Although fast, prescreening introduces latency that delays the feature processing and lowers the stopping distance of the system. In this work we propose a novel sensor fusion framework where a forward looking infrared (FLIR) camera is used as a prescreener, providing suspicious locations to the GPR-based system with zero latency. The FLIR camera is another detection modality that typically yields a higher FAR than GPR while offering much larger stopping distances. This makes it well-suited to the role of a zero-latency prescreener. In this framework, GPR-based feature processing can begin without any latency, improving stopping distances. This framework was evaluated using well-known FLIR and GPR detection algorithms on a large dataset collected at a Western US test site. Experiments were conducted to investigate the tradeoff between early stopping distance and FAR. The results indicate that earlier stopping distances are achievable while maintaining effective FARs. However, because an earlier stopping distance yields less data for feature extraction, there is a general tradeoff between detection performance and stopping distance.

  6. Performance evaluation of multi-sensor data-fusion systems in launch vehicles

    Indian Academy of Sciences (India)

    B N Suresh; K Sivan

    2004-04-01

    In this paper, the utilization of multi-sensors of different types, their characteristics, and their data-fusion in launch vehicles to achieve the goal of injecting the satellite into a precise orbit is explained. Performance requirements of sensors and their redundancy management in a typical launch vehicle are also included. The role of an integrated system level-test bed for evaluating multi-sensors and mission performance in a typical launch vehicle mission is described. Some of the typical simulation results to evaluate the effect of the sensors on the overall system are highlighted.

  7. Battery-free power for unattended ground sensors

    Science.gov (United States)

    Moldt, Vera A.

    2003-09-01

In our current military environment, many operations are fought with small, highly mobile reconnaissance and strike forces that must move in and out of hostile terrain, setting up temporary bases and perimeters. As such, today's warfighter has to be well equipped to ensure independent operation and survival of small, deployed groups. The use of unattended ground sensors in reconfigurable sensor networks can provide portable perimeter security for such special operations. Since all of the equipment for the missions must be carried by the warfighter, weight is a critical issue. Currently, batteries constitute much of that weight, and batteries are short-lived and unreliable. An alternative power source is required to eliminate the need for carrying multiple replacement batteries to support special operations. Such a battery-free, replenishable, energy management technology has been developed by Ambient Control Systems. Ambient has developed an advanced mid-door photovoltaic technology, which converts light to energy over a wide range of lighting conditions. The energy is then stored in supercapacitors, a highly robust, long-term storage medium. Ambient's advanced energy management technology will power remote sensor and control systems 24 hours/day, 7 days/week for over 20 years, without batteries, providing for ongoing detection, surveillance and other remote operations.

  8. A fusion method that performs better than best sensor

    Energy Technology Data Exchange (ETDEWEB)

    Rao, N.S.V.

    1998-03-01

In multiple sensor systems, it is generally known that a good fuser will outperform the best sensor, while an inappropriate fuser can perform worse than the worst sensor. If the error distributions of the sensors are precisely known, an optimal fuser--one that performs at least as well as the best sensor--can be designed using statistical estimation methods. In engineering and robotic systems, however, it is too difficult and expensive to derive the closed-form error distributions required by these methods. This problem is further compounded by the variety and complexity of present-day sensor systems, wherein a number of sensing hardware units and computing modules may be integrated into a single sensor. In these systems, however, it is possible to collect sensor data by sensing objects with known parameters. Thus, it is very important to have sample-based methods that enable a fuser to perform at least as well as the best sensor. In this paper, the author presents a generic analytical formulation of this problem and provides a very simple property that yields such a fuser.
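
A minimal sample-based illustration of why such a fuser exists (a sketch, not the paper's formulation): a linear fuser's hypothesis class contains each individual sensor as a special case (weight 1 on that sensor, 0 elsewhere), so least-squares training on sample data can do no worse in-sample than the best single sensor. All sensor noise models below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.uniform(0.0, 10.0, size=200)          # ground-truth parameter being sensed
# Three hypothetical sensors with different biases and noise levels.
sensors = np.column_stack([
    truth + rng.normal(0.0, 0.3, 200),            # good sensor
    truth + rng.normal(0.5, 1.0, 200),            # biased, noisy sensor
    truth + rng.normal(0.0, 2.0, 200),            # very noisy sensor
])

# Linear fuser with intercept, fit by least squares on the training sample.
X = np.column_stack([sensors, np.ones(200)])
w, *_ = np.linalg.lstsq(X, truth, rcond=None)
fused = X @ w

mse = lambda est: float(np.mean((est - truth) ** 2))
per_sensor = [mse(sensors[:, i]) for i in range(3)]
print("per-sensor MSE:", per_sensor)
print("fused MSE:", mse(fused))
# In-sample, the fused MSE cannot exceed the best sensor's MSE, because each
# individual sensor is one candidate inside the linear fuser's hypothesis class.
```

Generalization to unseen data is the harder question the paper's sample-based analysis addresses; the in-sample inequality above is only the starting point.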

  9. A sensor fusion method for Wi-Fi-based indoor positioning

    Directory of Open Access Journals (Sweden)

    Dongsoo Han

    2016-06-01

Full Text Available This paper presents a sensor fusion method for a Wi-Fi-based indoor positioning system, named the KAist Indoor LOcating System (KAILOS), which was developed to realize a global indoor positioning system (GIPS) that utilizes crowd-sourced fingerprints. KAILOS supports the deployment of indoor positioning systems in buildings by collecting indoor maps and fingerprint DBs of buildings for the GIPS. Thereby, KAILOS provides a sensor-fusion-based method for volunteers to develop indoor positioning systems for their buildings. KAILOS has been made available online for public use. In addition, various location-based applications can also be developed using KAILOS.
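
The abstract does not detail KAILOS's positioning algorithm; the sketch below shows only the classic fingerprinting idea such systems build on: match a live RSSI sample against a crowd-sourced fingerprint database by nearest neighbor. All access-point values and location names are invented.

```python
import math

# Toy fingerprint DB: location label -> mean RSSI (dBm) seen from three access points.
# The locations and signal values are hypothetical.
fingerprint_db = {
    "room_101": (-40.0, -70.0, -80.0),
    "room_102": (-65.0, -45.0, -75.0),
    "lobby":    (-75.0, -72.0, -50.0),
}

def locate(rssi_sample):
    """Return the DB location whose fingerprint is nearest in Euclidean RSSI space."""
    return min(fingerprint_db,
               key=lambda loc: math.dist(fingerprint_db[loc], rssi_sample))

print(locate((-42.0, -68.0, -79.0)))  # sample close to room_101's fingerprint
```

A deployed system would average many crowd-sourced samples per location and typically use k nearest fingerprints rather than one.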

  10. Autonomous navigation vehicle system based on robot vision and multi-sensor fusion

    Science.gov (United States)

    Wu, Lihong; Chen, Yingsong; Cui, Zhouping

    2011-12-01

The architecture of an autonomous navigation vehicle based on robot vision and multi-sensor fusion technology is expounded in this paper. In order to achieve greater intelligence and robustness, accurate real-time collection and processing of information are realized by using this technology. The method to achieve robot vision and multi-sensor fusion is discussed in detail. The results, simulated in several operating modes, show that this intelligent vehicle performs well in obstacle identification and avoidance and in path planning, which can provide higher reliability while the vehicle is running.

  11. Sensor fusion to enable next generation low cost Night Vision systems

    Science.gov (United States)

    Schweiger, R.; Franz, S.; Löhlein, O.; Ritter, W.; Källhammer, J.-E.; Franks, J.; Krekels, T.

    2010-04-01

    The next generation of automotive Night Vision Enhancement systems offers automatic pedestrian recognition with a performance beyond current Night Vision systems at a lower cost. This will allow high market penetration, covering the luxury as well as compact car segments. Improved performance can be achieved by fusing a Far Infrared (FIR) sensor with a Near Infrared (NIR) sensor. However, fusing with today's FIR systems will be too costly to get a high market penetration. The main cost drivers of the FIR system are its resolution and its sensitivity. Sensor cost is largely determined by sensor die size. Fewer and smaller pixels will reduce die size but also resolution and sensitivity. Sensitivity limits are mainly determined by inclement weather performance. Sensitivity requirements should be matched to the possibilities of low cost FIR optics, especially implications of molding of highly complex optical surfaces. As a FIR sensor specified for fusion can have lower resolution as well as lower sensitivity, fusing FIR and NIR can solve performance and cost problems. To allow compensation of FIR-sensor degradation on the pedestrian detection capabilities, a fusion approach called MultiSensorBoosting is presented that produces a classifier holding highly discriminative sub-pixel features from both sensors at once. The algorithm is applied on data with different resolution and on data obtained from cameras with varying optics to incorporate various sensor sensitivities. As it is not feasible to record representative data with all different sensor configurations, transformation routines on existing high resolution data recorded with high sensitivity cameras are investigated in order to determine the effects of lower resolution and lower sensitivity to the overall detection performance. This paper also gives an overview of the first results showing that a reduction of FIR sensor resolution can be compensated using fusion techniques and a reduction of sensitivity can be

  12. Changing requirements and solutions for unattended ground sensors

    Science.gov (United States)

    Prado, Gervasio; Johnson, Robert

    2007-10-01

Unattended Ground Sensors (UGS) were first used to monitor Viet Cong activity along the Ho Chi Minh Trail in the 1960s. In the 1980s, significant improvement in the capabilities of UGS became possible with the development of digital signal processors; this led to their use as fire control devices for smart munitions (for example, the Wide Area Mine) and later to monitor the movements of mobile missile launchers. In these applications, the targets of interest were large military vehicles with strong acoustic, seismic and magnetic signatures. Currently, the requirements imposed by new terrorist threats and illegal border crossings have changed the emphasis to the monitoring of light vehicles and foot traffic. These new requirements have changed the way UGS are used. To improve performance against targets with lower emissions, sensors are used in multi-modal arrangements. Non-imaging sensors (acoustic, seismic, magnetic and passive infrared) are now being used principally as activity sensors to cue imagers and remote cameras. The availability of better imaging technology has made imagers the preferred source of "actionable intelligence". Infrared cameras are now based on un-cooled detector arrays that have made their application in UGS possible in terms of cost and power consumption. Visible light imagers are also more sensitive, extending their utility well beyond twilight. The imagers are equipped with sophisticated image processing capabilities (image enhancement, moving target detection and tracking, image compression). Various commercial satellite services now provide relatively inexpensive long-range communications, and the Internet provides fast worldwide access to the data.

  13. Data fusion on a distributed heterogeneous sensor network.

    Energy Technology Data Exchange (ETDEWEB)

    Lamborn, Peter; Williams, Pamela J.

    2006-02-01

Alarm-based sensor systems are being explored as a tool to expand perimeter security for facilities and force protection. However, the increased volume of collected sensor data has produced an unsatisfactory solution that includes faulty data points. Data analysis is needed to reduce nuisance and false alarms, which will improve officials' decision-making and confidence in the system's alarms. Moreover, operational costs can be allayed and losses mitigated if authorities are alerted only when a real threat is detected. In the current system, heuristics such as the persistence of an alarm and the type of sensor that detected an event are used to guide officials' responses. We hypothesize that fusing data from heterogeneous sensors in the sensor field can provide more complete situational awareness than looking at individual sensor data. We propose a two-stage approach to reduce false alarms. First, we use self-organizing maps to cluster sensors based on global positioning coordinates and then train classifiers on the within-cluster data to obtain a local view of the event. Next, we train a classifier on the local results to compute a global solution. We investigate the use of machine learning techniques, such as k-nearest neighbor, neural networks, and support vector machines, to improve alarm accuracy. On simulated sensor data, the proposed approach identifies false alarms with greater accuracy than a weighted voting algorithm.
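
A toy sketch of the two-stage idea, with plain k-means standing in for the self-organizing map and majority votes standing in for the trained local and global classifiers; the field layout and alarm readings are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy sensor field: two spatial clusters of ten sensors each (GPS-like coordinates).
positions = np.vstack([rng.normal((0.0, 0.0), 0.1, (10, 2)),
                       rng.normal((5.0, 5.0), 0.1, (10, 2))])

def kmeans(points, init_idx, iters=20):
    """Plain k-means on sensor coordinates, standing in for the SOM clustering step."""
    centers = points[init_idx].copy()
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels

labels = kmeans(positions, [0, -1])     # seed one center in each corner of the field

# Binary alarm readings (1 = alarm): the first cluster is mostly alarming.
readings = np.array([1] * 9 + [0] + [0] * 10)

# Stage 1: a "local classifier" per cluster -- here, a majority vote within the cluster.
local = [int(readings[labels == j].mean() > 0.5) for j in range(2)]

# Stage 2: a global decision fuses the local views -- alarm if any cluster agrees.
global_alarm = int(any(local))
print("local decisions:", local, "global:", global_alarm)
```

The paper trains real classifiers (k-NN, neural networks, SVMs) at both stages; the votes here only mirror the data flow.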

  14. STUDY ON THE COAL-ROCK INTER-FACE RECOGNITION METHOD BASED ON MULTI-SENSOR DATA FUSION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Ren Fang; Yang Zhaojian; Xiong Shibo

    2003-01-01

A coal-rock interface recognition method based on the multi-sensor data fusion technique is put forward to overcome the limitations of recognition methods based on a single type of sensor. The measuring theory based on the multi-sensor data fusion technique is analyzed, and on this basis a test platform for the recognition system was built. The advantage of data fusion with the fuzzy neural network (FNN) technique is examined. A two-level FNN is constructed and data fusion is carried out. The experiments show that, under various conditions, the method consistently achieves a much higher recognition rate than conventional ones.

  15. Comparison of pH Data Measured with a pH Sensor Array Using Different Data Fusion Methods

    Directory of Open Access Journals (Sweden)

    Yi-Hung Liao

    2012-09-01

Full Text Available This paper introduces different data fusion methods used for an electrochemical measurement with a sensor array. In this study, ruthenium dioxide sensing-membrane pH electrodes were used to form a sensor array, which was used for detecting the pH values of grape wine, a generic cola drink, and bottled base water. The measured pH data were fused to increase the reliability of the measured results, and the results of the different data fusion methods were compared.

  16. Advanced data visualization and sensor fusion: Conversion of techniques from medical imaging to Earth science

    Science.gov (United States)

    Savage, Richard C.; Chen, Chin-Tu; Pelizzari, Charles; Ramanathan, Veerabhadran

    1993-01-01

Hughes Aircraft Company and the University of Chicago propose to transfer existing medical imaging registration algorithms to the area of multi-sensor data fusion. The University of Chicago's algorithms have been successfully demonstrated to provide pixel-by-pixel comparison capability for medical sensors with different characteristics. The research will attempt to fuse GOES (Geostationary Operational Environmental Satellite), AVHRR (Advanced Very High Resolution Radiometer), and SSM/I (Special Sensor Microwave Imager) sensor data, which will benefit a wide range of researchers. The algorithms will utilize data visualization and algorithm development tools created by Hughes in its EOSDIS (Earth Observing System Data and Information System) prototyping. This will maximize the work on the fusion algorithms, since support software (e.g., input/output routines) will already exist. The research will produce a portable software library with documentation for use by other researchers.

  17. Delineation of Management Zones in Precision Agriculture by Integration of Proximal Sensing with Multivariate Geostatistics. Examples of Sensor Data Fusion

    Directory of Open Access Journals (Sweden)

    Annamaria Castrignanò

    2015-07-01

Full Text Available Fundamental to the philosophy of Precision Agriculture (PA) is the concept of matching inputs to needs. Recent research in PA has focused on the use of Management Zones (MZ): field areas characterised by homogeneous attributes in landscape and soil conditions. Proximal sensing (such as Electromagnetic Induction (EMI), Ground Penetrating Radar (GPR) and X-ray fluorescence) can complement direct sampling, and a multi-sensor platform can enable us to map soil features unambiguously. Several methods of multi-sensor data analysis have been developed to determine the location of subfield areas. Modern geostatistical techniques, treating variables as continua in a joint attribute and geographic space, offer the potential to analyse such data effectively. The objective of the paper is to show the potential of multivariate geostatistics to create MZ in the perspective of PA by integrating field data from different types of sensors, illustrated by two case studies. In the first case study, cokriging and factorial cokriging were employed to produce thematic maps of soil trace elements and to delineate homogeneous zones, respectively. In the second, a multivariate geostatistical data-fusion technique (multi-collocated cokriging) was applied to different geophysical sensor data (GPR and EMI) for stationary estimation of soil water content and for delineating within-field zones with different degrees of wetting. The results have shown that linking sensors of different types improves the overall assessment of soil, and sensor data fusion could be effectively applied to delineate MZs in Precision Agriculture. However, techniques of data integration are urgently required as a result of the proliferation of data from different sources.

  18. Developing a Model for Simplified Higher Level Sensor Fusion

    Science.gov (United States)

    2013-01-01

conveying its wide scope is to use a process model. The most referenced model within the DoD appears to be the Joint Directors of Laboratories (JDL) data fusion model shown in Figure 1 [5]. The JDL is an organization that no longer exists, but in the 1980s it was tasked to develop a model for data fusion. This JDL model, revised in 1999, was created to show a general process of data fusion with wide applicability for both government and academia. It

  19. Using Bayesian Programming for Multi-Sensor Data Fusion in Automotive Applications

    OpenAIRE

    Coué, Christophe; Fraichard, Thierry; Bessiere, Pierre; Mazer, Emmanuel

    2002-01-01

International audience; A prerequisite to the design of future Advanced Driver Assistance Systems for cars is a sensing system providing all the information required for high-level driving assistance tasks. Carsense is a European project whose purpose is to develop such a new sensing system. It will combine different sensors (laser, radar and video) and will rely on the fusion of the information coming from these sensors in order to achieve better accuracy, robustness and an increase of the in...

  20. Probabilistic Multi-Sensor Fusion Based Indoor Positioning System on a Mobile Device

    Directory of Open Access Journals (Sweden)

    Xiang He

    2015-12-01

Full Text Available Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design.

  1. Probabilistic Multi-Sensor Fusion Based Indoor Positioning System on a Mobile Device.

    Science.gov (United States)

    He, Xiang; Aloi, Daniel N; Li, Jia

    2015-12-14

    Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design.
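
A minimal one-dimensional sketch of the multimodal particle-filter update the abstract describes: particles are propagated by a motion model, weighted by the product of independent sensor likelihoods, and resampled. The two modalities, their noise levels, and the corridor geometry are invented, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def gaussian_like(z, x, sigma):
    """Unnormalized Gaussian likelihood of measurement z given particle states x."""
    return np.exp(-0.5 * ((z - x) / sigma) ** 2)

true_pos = 0.0
particles = rng.uniform(-10.0, 10.0, 5000)    # 1-D corridor positions

for _ in range(30):
    true_pos += 0.5                                          # user walks 0.5 m per step
    particles += 0.5 + rng.normal(0.0, 0.1, particles.size)  # motion (dead-reckoning) model
    z_wifi = true_pos + rng.normal(0.0, 2.0)                 # coarse Wi-Fi fix
    z_ble = true_pos + rng.normal(0.0, 0.8)                  # finer second modality
    # Multimodal update: multiply the independent sensor likelihoods.
    w = gaussian_like(z_wifi, particles, 2.0) * gaussian_like(z_ble, particles, 0.8)
    w /= w.sum()
    # Systematic resampling keeps the particle set focused on likely positions.
    u = (rng.random() + np.arange(w.size)) / w.size
    idx = np.minimum(np.searchsorted(np.cumsum(w), u), w.size - 1)
    particles = particles[idx]

estimate = particles.mean()
print(f"true {true_pos:.2f} m, estimated {estimate:.2f} m")
```

The paper's system also draws on an offline graph model and motion-sensor pedometry; this sketch keeps only the predict-weight-resample core.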

  2. Neural network implementations of data association algorithms for sensor fusion

    Science.gov (United States)

    Brown, Donald E.; Pittard, Clarence L.; Martin, Worthy N.

    1989-01-01

    The paper is concerned with locating a time varying set of entities in a fixed field when the entities are sensed at discrete time instances. At a given time instant a collection of bivariate Gaussian sensor reports is produced, and these reports estimate the location of a subset of the entities present in the field. A database of reports is maintained, which ideally should contain one report for each entity sensed. Whenever a collection of sensor reports is received, the database must be updated to reflect the new information. This updating requires association processing between the database reports and the new sensor reports to determine which pairs of sensor and database reports correspond to the same entity. Algorithms for performing this association processing are presented. Neural network implementation of the algorithms, along with simulation results comparing the approaches are provided.

  3. Health-Enabled Smart Sensor Fusion Technology Project

    Data.gov (United States)

    National Aeronautics and Space Administration — It has been proven that the combination of smart sensors with embedded metadata and wireless technologies present real opportunities for significant improvements in...

  4. Advanced array techniques for unattended ground sensor applications

    Energy Technology Data Exchange (ETDEWEB)

    Followill, F.E.; Wolford, J.K.; Candy, J.V.

    1997-05-06

Sensor arrays offer opportunities to beam-form, and time-frequency analyses offer additional insights into the wavefield data. Data collected while monitoring three different sources with unattended ground sensors in a 16-element, small-aperture (approximately 5 meters) geophone array are used as examples of model-based seismic signal processing on actual geophone array data. The three sources monitored were: (Source 01) A frequency-modulated chirp of an electromechanical shaker mounted on the floor of an underground bunker; three 60-second time-windows corresponding to (a) a 50 Hz to 55 Hz sweep, (b) a 60 Hz to 70 Hz sweep, and (c) an 80 Hz to 90 Hz sweep. (Source 02) A single transient impact of a hammer striking the floor of the bunker; twenty seconds of data, with the transient event approximately at the mid-point of the time window. (Source 11) The transient event of a diesel generator turning on, including a few seconds before the turn-on time and a few seconds after the generator reaches steady-state conditions. The high-frequency seismic array was positioned at the surface of the ground at a distance of 150 meters (North) of the underground bunker. Four Y-shaped subarrays (each with 2-meter apertures) in a Y-shaped pattern (with a 6-meter aperture) using a total of 16 3-component, high-frequency geophones were deployed. These 48 channels of seismic data were recorded at 6000 and 12000 samples per second on 16-bit data loggers. Representative examples of the data and analyses illustrate the results of this experiment.

  5. Joint-FACET: The Canada-Netherlands initiative to study multi-sensor data fusion systems

    NARCIS (Netherlands)

    Bossee, E.; Theil, A.; Huizing, A.G.; Aartsen, C.S. van

    1998-01-01

    This paper presents the progress of a collaborative effort between Canada and The Netherlands in analyzing multi-sensor data fusion systems, e.g. for potential application to their respective frigates. In view of the overlapping interest in studying and comparing applicability and performance and ad

  6. Comparison of belief functions and voting method for fusion of mine detection sensors

    NARCIS (Netherlands)

    Milisavljevic, N.; Broek, S.P. van den; Bloch, I.; Schwering, P.B.W.; Lensen, H.A.; Acheroy, M.

    2001-01-01

    In this paper, two methods for fusion of mine detection sensors are presented, based on belief functions and on voting procedures, respectively. Their application is illustrated and compared on a real multisensor data set collected at the TNO test facilities under the HOM 2000 project. This set cont

  7. Bathtub-Shaped Failure Rate of Sensors for Distributed Detection and Fusion

    Directory of Open Access Journals (Sweden)

    Junhai Luo

    2014-01-01

Log-likelihood Ratio Test (ELRT) rule is derived. Finally, the ROC curve for this model is presented. The simulation results show that the ELRT rule improves the robustness of the system, compared with the traditional fusion rule that does not consider sensor failures.
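
The abstract is truncated, so the paper's failure model is not available here; the sketch below shows only the standard way a bathtub-shaped failure rate is constructed: a decreasing Weibull hazard (infant mortality), a constant hazard (random failures), and an increasing Weibull hazard (wear-out). All parameter values are illustrative.

```python
def bathtub_hazard(t, a=0.5, b=0.5, c=0.02, d=0.001, e=3.0):
    """Bathtub-shaped hazard as a sum of three components:
    a decreasing Weibull term (shape b < 1), a constant term c,
    and an increasing Weibull term (shape e > 1). Parameters are
    illustrative, not taken from the paper."""
    infant = a * b * t ** (b - 1)      # dominates for small t
    wearout = d * e * t ** (e - 1)     # dominates for large t
    return infant + c + wearout

# High early in life, low in mid-life, high again late in life.
rates = [bathtub_hazard(t) for t in (0.1, 5.0, 40.0)]
print([round(r, 4) for r in rates])
```

In a distributed-detection setting, such a hazard would set the time-varying probability that a given sensor has failed, which the fusion rule then folds into the likelihood ratio.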

  8. Context-dependent fusion for landmine detection with ground-penetrating radar

    Science.gov (United States)

    Frigui, Hichem; Zhang, Lijun; Gader, Paul; Ho, Dominic

    2007-04-01

We present a novel method for fusing the results of multiple landmine detection algorithms that use different types of features and different classification methods. The proposed fusion method, called Context-Dependent Fusion (CDF), is motivated by the fact that the relative performance of different detectors can vary significantly depending on the mine type, geographical site, soil and weather conditions, and burial depth. The training part of CDF has two components: context extraction and algorithm fusion. In context extraction, the features used by the different algorithms are combined and used to partition the feature space into groups of similar signatures, or contexts. The algorithm fusion component assigns an aggregation weight to each detector in each context based on its relative performance within the context. Results on large and diverse Ground Penetrating Radar data collections show that the proposed method can identify meaningful and coherent clusters and that different expert algorithms can be identified for the different contexts. Our initial experiments have also indicated that the context-dependent fusion outperforms all individual detectors.
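
A stripped-down sketch of the context-dependent weighting step: each detector's confidence is weighted by its training accuracy within the current context, so the same pair of detector outputs can yield opposite decisions in different contexts. The accuracies, contexts, and confidences are invented; the real CDF learns its contexts by clustering the combined features.

```python
# Per-context detector accuracies, as if estimated on training data (illustrative):
train_acc = {0: {"A": 0.90, "B": 0.55},   # context 0 (e.g., dry soil): A is reliable
             1: {"A": 0.55, "B": 0.90}}   # context 1 (e.g., wet soil): B is reliable

def fuse(context, conf_a, conf_b):
    """Aggregate two detector confidences, weighting each by its
    within-context accuracy (normalized so weights sum to one)."""
    wa, wb = train_acc[context]["A"], train_acc[context]["B"]
    return (wa * conf_a + wb * conf_b) / (wa + wb)

# The detectors disagree on a target: A says mine (0.8), B says clear (0.3).
print(round(fuse(0, 0.8, 0.3), 3))  # context 0 trusts A: fused score above 0.5
print(round(fuse(1, 0.8, 0.3), 3))  # context 1 trusts B: fused score below 0.5
```

The point of the example is that the fused decision flips with the context even though the detector outputs are identical, which is exactly the behavior a context-independent fuser cannot express.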

  9. Botnet Detection Architecture Based on Heterogeneous Multi-sensor Information Fusion

    Directory of Open Access Journals (Sweden)

    HaiLong Wang

    2011-12-01

Full Text Available As technology has developed rapidly, botnet threats to the global cyber community are also increasing, and botnet detection has recently become a major research topic in the field of network security. Most current detection approaches work only on evidence from a single information source, which cannot capture all traces of a botnet and can hardly achieve high accuracy. In this paper, a novel botnet detection architecture based on heterogeneous multi-sensor information fusion is proposed. The architecture is designed to carry out information integration at the three fusion levels of data, feature, and decision. As the core component, a feature extraction module is also elaborately designed, and an extended algorithm of Dempster-Shafer (D-S) theory is proved and adopted in decision fusion. Furthermore, a representative case is provided to illustrate that the detection architecture can effectively fuse the complicated information from various sensors and thus achieve a better detection effect.

  10. Fusion: ultra-high-speed and IR image sensors

    Science.gov (United States)

    Etoh, T. Goji; Dao, V. T. S.; Nguyen, Quang A.; Kimata, M.

    2015-08-01

Most targets of ultra-high-speed video cameras operating at more than 1 Mfps, such as combustion, crack propagation, collision, plasma, spark discharge, an air bag in a car accident and a tire under sudden braking, generate sudden heat. Researchers in these fields require tools to measure the high-speed motion and heat simultaneously. Ultra-high frame rate imaging is achieved by an in-situ storage image sensor. Each pixel of the sensor is equipped with multiple memory elements to record a series of image signals simultaneously at all pixels. Image signals stored in each pixel are read out after an image capturing operation. In 2002, we developed an in-situ storage image sensor operating at 1 Mfps 1). However, the fill factor of the sensor was only 15% due to a light shield covering the wide in-situ storage area. Therefore, in 2011, we developed a backside-illuminated (BSI) in-situ storage image sensor to increase the sensitivity, with a 100% fill factor and a very high quantum efficiency 2). The sensor also achieved a much higher frame rate, 16.7 Mfps, thanks to the greater freedom in wiring on the front side 3). The BSI structure has a further advantage: it presents fewer difficulties in attaching an additional layer, such as scintillators, on the backside. This paper proposes the development of an ultra-high-speed IR image sensor that combines advanced nano-technologies for IR imaging with the in-situ storage technology for ultra-high-speed imaging, with discussion of integration issues.

  11. Embedded Relative Navigation Sensor Fusion Algorithms for Autonomous Rendezvous and Docking Missions

    Science.gov (United States)

    DeKock, Brandon K.; Betts, Kevin M.; McDuffie, James H.; Dreas, Christine B.

    2008-01-01

bd Systems (a subsidiary of SAIC) has developed a suite of embedded relative navigation sensor fusion algorithms to enable NASA autonomous rendezvous and docking (AR&D) missions. Translational and rotational Extended Kalman Filters (EKFs) were developed to integrate measurements based on the vehicles' orbital mechanics and high-fidelity sensor error models, and to provide a solution with increased accuracy and robustness relative to any single relative navigation sensor. The filters were tested through stand-alone covariance analysis, closed-loop testing with a high-fidelity multi-body orbital simulation, and hardware-in-the-loop (HWIL) testing in the Marshall Space Flight Center (MSFC) Flight Robotics Laboratory (FRL).

  12. Automatic noncontact 3-dimensional gauging via sensor fusion

    Science.gov (United States)

    Buckley, Shawn; Tavormina, Joseph J.

    1993-09-01

Manufacturers are now driving toward the increased use of automation and the goal of zero defects. As quality is improved and defect rates approach the popularized "Six-Sigma" level (customarily 3.4 defects per million), manual or sampled measurement techniques limit the achievement of product quality and manufacturing cost objectives. New automated inspection and gaging technology is required for process verification and control. To be competitive in the current manufacturing environment, new gaging technology must be integrated into the manufacturing process to provide on-line feedback. The co-authors are founders of CogniSense, a technology company dedicated to industrial inspection and gaging applications which use non-contact sensing techniques. CogniSense is currently applying its technology in the precision metalforming and other manufacturing industries to perform automatic dimensional measurement and provide real-time information used to control and fine-tune the manufacturing process. A variety of sensors are used to detect the characteristics of parts on-line as they are produced. Data from multiple sensors is "fused" and analyzed by a dedicated microcomputer which evaluates the sensory signature and calculates critical dimensions from the sensor input to determine whether parts are within the acceptable tolerance range. Pattern recognition algorithms are used to automatically select the sensors which provide the most important information about critical part characteristics and dimensions. These algorithms operate by observing the changes in sensor output as critical features of the part are varied. The decision-making algorithms
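
The selection idea described above (observe how each sensor's output changes as a critical part feature is varied) can be sketched as a simple correlation ranking. The sensors, noise levels, and the varied dimension are all invented; the actual system uses pattern recognition over full sensory signatures.

```python
import numpy as np

rng = np.random.default_rng(3)

# Vary a critical part dimension and record each non-contact sensor's output.
dimension = np.linspace(9.9, 10.1, 50)               # e.g., a flange width in mm
outputs = np.column_stack([
    2.0 * dimension + rng.normal(0, 0.005, 50),      # sensor 0: tracks the feature
    0.1 * dimension + rng.normal(0, 0.05, 50),       # sensor 1: weak, noisy response
    rng.normal(5.0, 0.05, 50),                       # sensor 2: insensitive to it
])

# Rank sensors by |correlation| between their output and the varied dimension --
# a stand-in for the pattern-recognition selection the abstract describes.
scores = [abs(np.corrcoef(outputs[:, i], dimension)[0, 1]) for i in range(3)]
best = int(np.argmax(scores))
print("sensitivity scores:", [round(s, 3) for s in scores], "-> use sensor", best)
```

Only the sensors that score highly for a given critical dimension would then feed the fused gaging decision, keeping the on-line computation small.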

  13. Sensor Data Fusion for Accurate Cloud Presence Prediction Using Dempster-Shafer Evidence Theory

    Directory of Open Access Journals (Sweden)

    Jesse S. Jin

    2010-10-01

    Full Text Available Sensor data fusion technology can be used to best extract useful information from multiple sensor observations. It has been widely applied in various applications such as target tracking, surveillance, robot navigation, signal and image processing. This paper introduces a novel data fusion approach in a multiple radiation sensor environment using Dempster-Shafer evidence theory. The methodology is used to predict cloud presence based on the inputs of radiation sensors. Different radiation data have been used for the cloud prediction. The potential application areas of the algorithm include renewable power for virtual power stations, where the prediction of cloud presence is the most challenging issue for photovoltaic output. The algorithm is validated by comparing the predicted cloud presence with the corresponding sunshine occurrence data that were recorded as the benchmark. Our experiments have indicated that, compared to approaches using individual sensors, the proposed data fusion approach can increase the correct rate of cloud prediction by ten percent and decrease the unknown rate of cloud prediction by twenty-three percent.
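Dempster's rule of combination, on which this approach rests, can be sketched for a two-hypothesis cloud/clear frame. The basic probability assignments below are hypothetical, not values from the paper:

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping
    frozenset hypotheses to mass) with Dempster's rule: multiply
    masses of intersecting focal elements, then renormalize by
    the non-conflicting mass."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule undefined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

CLOUD, CLEAR = frozenset({"cloud"}), frozenset({"clear"})
THETA = CLOUD | CLEAR   # the full frame of discernment ("unknown")
# hypothetical evidence from two radiation sensors
m1 = {CLOUD: 0.6, CLEAR: 0.1, THETA: 0.3}
m2 = {CLOUD: 0.5, CLEAR: 0.2, THETA: 0.3}
fused = dempster_combine(m1, m2)
```

Here the fused mass on CLOUD exceeds either sensor's individual mass, while the mass left on the whole frame (the "unknown" rate the abstract mentions) shrinks.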

  14. Sensor data fusion for accurate cloud presence prediction using Dempster-Shafer evidence theory.

    Science.gov (United States)

    Li, Jiaming; Luo, Suhuai; Jin, Jesse S

    2010-01-01

    Sensor data fusion technology can be used to best extract useful information from multiple sensor observations. It has been widely applied in various applications such as target tracking, surveillance, robot navigation, signal and image processing. This paper introduces a novel data fusion approach in a multiple radiation sensor environment using Dempster-Shafer evidence theory. The methodology is used to predict cloud presence based on the inputs of radiation sensors. Different radiation data have been used for the cloud prediction. The potential application areas of the algorithm include renewable power for virtual power stations, where the prediction of cloud presence is the most challenging issue for photovoltaic output. The algorithm is validated by comparing the predicted cloud presence with the corresponding sunshine occurrence data that were recorded as the benchmark. Our experiments have indicated that, compared to approaches using individual sensors, the proposed data fusion approach can increase the correct rate of cloud prediction by ten percent and decrease the unknown rate of cloud prediction by twenty-three percent.

  15. Modelling and Simulation of Multi-target Multi-sensor Data Fusion for Trajectory Tracking

    Directory of Open Access Journals (Sweden)

    A.K. Singh

    2009-05-01

    Full Text Available An implementation of track fusion using various algorithms has been demonstrated. The sensor measurements of these targets are modelled using a Kalman filter (KF) and an interacting multiple model (IMM) filter. The joint probabilistic data association filter (JPDAF) and neural network fusion (NNF) algorithms were used for tracking multiple manoeuvring targets. Track association and fusion algorithms are executed to obtain the fused track data for various scenarios, from two sensors tracking a single target to three sensors tracking three targets, to evaluate the effects of multiple and dispersed sensors for single, two, and multiple targets. The targets chosen were distantly spaced, closely spaced, and crossing. The performance of different filters was compared, and the fused trajectory is found to be closer to the true target trajectory than any of the sensor measurements of that target. Defence Science Journal, 2009, 59(3), pp. 205-214, DOI: http://dx.doi.org/10.14429/dsj.59.1513

  16. Design of Liquid Level Measurement System Using Multi Sensor Data Fusion for Improved Characteristics and Fault Detection

    OpenAIRE

    SANTHOSH K V Shashank Kumar

    2016-01-01

    Online validation of a multi sensor data fusion based liquid level measurement technique using a capacitance level sensor and an ultrasonic level sensor is implemented in this work. The objectives of the proposed work are to calibrate the level measurement system by fusing the outputs of fuzzy sets of the Capacitive Level Sensor (CLS) and Ultrasonic Level Sensor (ULS) such that (i) sensitivity and linearity should be improved as compared to ULS, (ii) reduction of nonlinear characteristics like offset and sat...

  17. All-IP-Ethernet architecture for real-time sensor-fusion processing

    Science.gov (United States)

    Hiraki, Kei; Inaba, Mary; Tezuka, Hiroshi; Tomari, Hisanobu; Koizumi, Kenichi; Kondo, Shuya

    2016-03-01

    Serendipter is a device that distinguishes and selects very rare particles and cells from a huge population. We are currently designing and constructing the information processing system for a Serendipter. The information processing system for Serendipter is a kind of sensor-fusion system, but with much greater difficulties. To fulfill these requirements, we adopt an all-IP-based architecture: the all-IP-Ethernet-based data processing system consists of (1) sensors/detectors that directly output data as an IP-Ethernet packet stream, (2) a single Ethernet/TCP/IP stream formed by an L2 100 Gbps Ethernet switch, and (3) an FPGA board with a 100 Gbps Ethernet interface connected to the switch and a Xeon-based server. Circuits in the FPGA include a 100 Gbps Ethernet MAC, buffers and preprocessing, and real-time deep learning circuits using multi-layer neural networks. The proposed all-IP architecture solves existing problems in constructing large-scale sensor-fusion systems.

  18. Multi-Source Sensor Fusion for Small Unmanned Aircraft Systems Using Fuzzy Logic

    Science.gov (United States)

    Cook, Brandon; Cohen, Kelly

    2017-01-01

    As the applications for using small Unmanned Aircraft Systems (sUAS) beyond visual line of sight (BVLOS) continue to grow in the coming years, it is imperative that intelligent sensor fusion techniques be explored. In BVLOS scenarios the vehicle position must be tracked accurately over time to ensure that no two vehicles collide with one another, that no vehicle crashes into surrounding structures, and to identify off-nominal scenarios. Therefore, in this study an intelligent systems approach is used to estimate the position of sUAS given a variety of sensor platforms, including GPS, radar, and on-board detection hardware. Common research challenges include asynchronous sensor rates and sensor reliability. To address these challenges, techniques such as Maximum a Posteriori estimation and a fuzzy logic based sensor confidence determination are used.

  19. Fusion of Smartphone Motion Sensors for Physical Activity Recognition

    NARCIS (Netherlands)

    Shoaib, Muhammad; Bosch, Stephan; Durmaz Incel, Ozlem; Scholten, Hans; Havinga, Paul J.M.

    2014-01-01

    For physical activity recognition, smartphone sensors, such as an accelerometer and a gyroscope, are being utilized in many research studies. So far, particularly, the accelerometer has been extensively studied. In a few recent studies, a combination of a gyroscope, a magnetometer (in a supporting r

  20. Fusion of smartphone motion sensors for physical activity recognition.

    Science.gov (United States)

    Shoaib, Muhammad; Bosch, Stephan; Incel, Ozlem Durmaz; Scholten, Hans; Havinga, Paul J M

    2014-06-10

    For physical activity recognition, smartphone sensors, such as an accelerometer and a gyroscope, are being utilized in many research studies. So far, particularly, the accelerometer has been extensively studied. In a few recent studies, a combination of a gyroscope, a magnetometer (in a supporting role) and an accelerometer (in a lead role) has been used with the aim to improve the recognition performance. How and when are various motion sensors, which are available on a smartphone, best used for better recognition performance, either individually or in combination? This is yet to be explored. In order to investigate this question, in this paper, we explore how these various motion sensors behave in different situations in the activity recognition process. For this purpose, we designed a data collection experiment where ten participants performed seven different activities carrying smart phones at different positions. Based on the analysis of this data set, we show that these sensors, except the magnetometer, are each capable of taking the lead roles individually, depending on the type of activity being recognized, the body position, the used data features and the classification method employed (personalized or generalized). We also show that their combination only improves the overall recognition performance when their individual performances are not very high, so that there is room for performance improvement. We have made our data set and our data collection application publicly available, thereby making our experiments reproducible.
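Studies like this typically reduce each motion-sensor stream to per-window statistical features before classification. A minimal sketch of that step is below; the window length, step, and feature set are illustrative assumptions, not this paper's exact configuration:

```python
import numpy as np

def window_features(signal, win=128, step=64):
    """Per-axis mean/std plus magnitude mean/std over sliding windows
    of a 3-axis motion sensor stream shaped (n_samples, 3)."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        mag = np.linalg.norm(w, axis=1)          # per-sample magnitude
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0),
                                     [mag.mean(), mag.std()]]))
    return np.array(feats)

# synthetic stand-in for 512 accelerometer samples
rng = np.random.default_rng(0)
X = window_features(rng.normal(size=(512, 3)))
```

Each window yields one feature vector; fusing accelerometer, gyroscope, and magnetometer then amounts to concatenating their feature vectors per window before feeding a classifier.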

  1. Fusion of Smartphone Motion Sensors for Physical Activity Recognition

    Directory of Open Access Journals (Sweden)

    Muhammad Shoaib

    2014-06-01

    Full Text Available For physical activity recognition, smartphone sensors, such as an accelerometer and a gyroscope, are being utilized in many research studies. So far, particularly, the accelerometer has been extensively studied. In a few recent studies, a combination of a gyroscope, a magnetometer (in a supporting role) and an accelerometer (in a lead role) has been used with the aim to improve the recognition performance. How and when are various motion sensors, which are available on a smartphone, best used for better recognition performance, either individually or in combination? This is yet to be explored. In order to investigate this question, in this paper, we explore how these various motion sensors behave in different situations in the activity recognition process. For this purpose, we designed a data collection experiment where ten participants performed seven different activities carrying smart phones at different positions. Based on the analysis of this data set, we show that these sensors, except the magnetometer, are each capable of taking the lead roles individually, depending on the type of activity being recognized, the body position, the used data features and the classification method employed (personalized or generalized). We also show that their combination only improves the overall recognition performance when their individual performances are not very high, so that there is room for performance improvement. We have made our data set and our data collection application publicly available, thereby making our experiments reproducible.

  2. Engineering of Sensor Network Structure for Dependable Fusion

    Science.gov (United States)

    2014-08-15

    design a novel sensor architecture that partitioned the overall design into two separate but interacting design spaces, (1) Information Space (IS...classification, Signal Processing (03 2012) Devesh K. Jha, Dheeraj S. Singh, S. Gupta, A. Ray. Fractal analysis of crack initiation in polycrystalline alloys

  3. Sensor fusion in head pose tracking for augmented reality

    NARCIS (Netherlands)

    Persa, S.F.

    2006-01-01

    The focus of this thesis is on studying diverse techniques, methods and sensors for position and orientation determination with application to augmented reality applications. In Chapter 2 we reviewed a variety of existing techniques and systems for position determination. From a practical point of v

  4. Multi-sensor data fusion for measurement of complex freeform surfaces

    Science.gov (United States)

    Ren, M. J.; Liu, M. Y.; Cheung, C. F.; Yin, Y. H.

    2016-01-01

    Along with the rapid development of science and technology in fields such as space optics, multi-scale enriched freeform surfaces are widely used to enhance the performance of optical systems in both functionality and size reduction. Multi-sensor technology is considered one of the promising methods to measure and characterize these surfaces at multiple scales. This paper presents a multi-sensor data fusion based measurement method to purposely extract the geometric information of the components at different scales, which is used to establish a holistic geometry of the surface via data fusion. To address the key problems of multi-sensor data fusion, an intrinsic feature pattern based surface registration method is developed to transform the measured datasets to a common coordinate frame. A Gaussian zero-order regression filter is then used to separate each measured dataset into different scales, and the datasets are fused based on an edge intensity data fusion algorithm within the same wavelength. The fused data at different scales are then merged to form a new surface with holistic multiscale information. An experimental study is presented to verify the effectiveness of the proposed method.

  5. Machine Learning and Sensor Fusion for Estimating Continuous Energy Expenditure

    OpenAIRE

    Vyas, Nisarg; BodyMedia, Inc.; Farringdon, Jonathan; BodyMedia Inc.; Andre, David; Cerebellum Capital, Inc.; Stivoric, John Ivo; BodyMedia

    2012-01-01

    In this article we provide insight into the BodyMedia FIT armband system — a wearable multi-sensor technology that continuously monitors physiological events related to energy expenditure for weight management using machine learning and data modeling methods. Since becoming commercially available in 2001, more than half a million users have used the system to track their physiological parameters and to achieve their individual health goals including weight-loss. We describe several challenges...

  6. Application of Track Fusion Algorithms in Multi-sensor Fusion

    Institute of Scientific and Technical Information of China (English)

    田雪怡; 李一兵; 李志刚

    2012-01-01

    Track fusion is an important aspect of multi-sensor data fusion. This work studies optimal control for homing guidance: since traditional single-sensor guidance cannot meet the performance requirements, multi-sensor compound guidance is proposed. Because of common process noise, the track estimation errors from different sensors in a state-estimate fusion system are not necessarily independent, which complicates the fusion problem. This article investigates simple fusion, adaptive track fusion, and weighted covariance fusion for track-to-track association and fusion. Simulation studies show that the adaptive track fusion and covariance-weighted track fusion algorithms are clearly effective for multi-sensor data fusion, give good fusion results, and provide a basis for the optimized design of compound homing guidance.
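The weighted covariance fusion the record compares can be sketched in its simplest form, the convex combination that treats the two track estimates as independent (i.e., it neglects the cross-covariance induced by the common process noise that the abstract warns about). The numbers are illustrative:

```python
import numpy as np

def weighted_covariance_fusion(x1, P1, x2, P2):
    """Fuse two track estimates by inverse-covariance weighting,
    assuming independent estimation errors (no cross-covariance)."""
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(P1i + P2i)          # fused covariance
    x = P @ (P1i @ x1 + P2i @ x2)         # covariance-weighted mean
    return x, P

# two 1-D track estimates of the same target with equal uncertainty
x1, P1 = np.array([1.0]), np.array([[2.0]])
x2, P2 = np.array([3.0]), np.array([[2.0]])
x, P = weighted_covariance_fusion(x1, P1, x2, P2)
```

With equal covariances the fused state is the plain average and the fused variance is halved; with correlated errors, as in the abstract, the covariance terms must additionally account for the cross-covariance.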

  7. Decision and feature fusion over the fractal inference network using camera and range sensors

    Science.gov (United States)

    Erkmen, Ismet; Erkmen, Aydan M.; Ucar, Ekin

    1998-10-01

    The objective of the ongoing work is to fuse information from uncertain environmental data taken by cameras and short-range sensors, including infrared and ultrasound sensors, for strategic target recognition and task-specific action in mobile robot applications. Our present goal in this paper is to demonstrate target recognition for a service robot in a simple office environment. It is proposed to fuse all sensory signals obtained from multiple sensors over a fully layer-connected sensor network system that provides an equal-opportunity competitive environment for sensory data, where data bearing less uncertainty, less complexity and fewer inconsistencies with the overall goal survive, while others fade out. In our work, this task is achieved as a decision fusion using the Fractal Inference Network (FIN), where information patterns or units -- modeled as textured belief functions bearing a fractal dimension due to uncertainty -- propagate while being processed at the nodes of the network. Each local process of a node generates a multiresolutional feature fusion. In this model, the environment is observed by multisensors of different types, different resolutions and different spatial locations without a prescheduled sensing scenario in data gathering. Node activation and flow control of information over the FIN are performed by a neuro-controller, a concept that has been developed recently as an improvement over the classical Fractal Inference Network. In this paper, the mathematical closed-form representation for decision fusion over the FIN is developed in a way suitable for analysis and is applied to a NOMAD mobile robot servicing an office environment.

  8. A Weighted Belief Entropy-Based Uncertainty Measure for Multi-Sensor Data Fusion.

    Science.gov (United States)

    Tang, Yongchuan; Zhou, Deyun; Xu, Shuai; He, Zichang

    2017-04-22

    In real applications, how to measure the uncertainty of sensor reports before applying sensor data fusion is a big challenge. In this paper, in the frame of Dempster-Shafer evidence theory, a weighted belief entropy based on Deng entropy is proposed to quantify the uncertainty of uncertain information. The weight of the proposed belief entropy is based on the relative scale of a proposition with regard to the frame of discernment (FOD). Compared with some other uncertainty measures in the Dempster-Shafer framework, the new measure focuses on the uncertain information represented by not only the mass function but also the scale of the FOD, which means less information loss in information processing. After that, a new multi-sensor data fusion approach based on the weighted belief entropy is proposed. The rationality and superiority of the new multi-sensor data fusion method are verified by an experiment on artificial data and an application to fault diagnosis of a motor rotor.
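Deng entropy, the starting point of this measure, can be sketched directly from its definition; the |A|/|X| weighting shown for the weighted variant is one plausible reading of the "relative scale" weight described in the abstract, not a verbatim transcription of the paper's formula:

```python
import math

def deng_entropy(m):
    """Deng entropy of a BPA m (dict: frozenset -> mass):
    E_d(m) = -sum m(A) * log2( m(A) / (2**|A| - 1) )."""
    return -sum(v * math.log2(v / (2 ** len(A) - 1))
                for A, v in m.items() if v > 0)

def weighted_belief_entropy(m, fod_size):
    """Scale-weighted variant: each term weighted by |A|/|X|,
    an assumed form of the paper's proposition-scale weighting."""
    return -sum((len(A) / fod_size) * v *
                math.log2(v / (2 ** len(A) - 1))
                for A, v in m.items() if v > 0)

# hypothetical BPA over a 2-element FOD {a, b}
m = {frozenset("a"): 0.4, frozenset("ab"): 0.6}
```

For a Bayesian BPA (singleton focal elements only), Deng entropy reduces to Shannon entropy, since 2**1 - 1 = 1; mass on larger focal elements adds extra uncertainty.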

  9. New results and implications for lunar crustal iron distribution using sensor data fusion techniques

    Science.gov (United States)

    Clark, P. E.; McFadden, L. A.

    2000-02-01

    Remote measurements of the Moon have provided iron maps, and thus essential constraints for models of lunar crustal formation and mare basalt petrogenesis. A bulk crustal iron map was produced for the equatorial region from Apollo gamma-ray (AGR) spectrometer measurements, and a global iron variation map from recent Clementine spectral reflectance (CSR) measurements. Both iron maps show bimodal distribution, but have significantly different peak values and variations. In this paper, CSR data have been recalibrated to pyroxene in lunar landing site soils. A residual iron map is derived from the difference between AGR (bulk) and recalibrated CSR (pyroxene) iron abundances. The most likely interpretation is that the residual represents ferrous iron in olivine. This residual iron is anticorrelated to basin age, with older basins containing less olivine, suggesting segregation of basin basalt sources from a progressively fractionating underlying source region at the time of basin formation. Results presented here provide a quantitative basis for (1) establishing the relationship between direct geochemical (gamma-ray, X-ray) and mineralogical (near-IR) remote sensing data sets using sensor data fusion techniques to allow (2) simultaneous determination of elemental and mineralogical component distribution on remote targets and (3) meaningful interpretation of orbital and ground-based spectral reflectance measurements. When calibrated data from the Lunar Prospector mission are available, mapping of bulk crustal iron and iron-bearing soil components will be possible for the entire Moon. Similar analyses for data from the Near Earth Asteroid Rendezvous (NEAR) mission to asteroid 433 Eros will constrain models of asteroid formation.

  10. Design of Liquid Level Measurement System Using Multi Sensor Data Fusion for Improved Characteristics and Fault Detection

    Directory of Open Access Journals (Sweden)

    SANTHOSH K V Shashank Kumar

    2016-10-01

    Full Text Available Online validation of a multi sensor data fusion based liquid level measurement technique using a capacitance level sensor and an ultrasonic level sensor is implemented in this work. The objectives of the proposed work are to calibrate the level measurement system by fusing the outputs of fuzzy sets of the Capacitive Level Sensor (CLS) and Ultrasonic Level Sensor (ULS) such that (i) sensitivity and linearity should be improved as compared to ULS, (ii) reduction of nonlinear characteristics like offset and saturation which persist in CLS, and (iii) detection and identification of faults in sensors, if any. These objectives are achieved by using the Joint Directors of Laboratories (JDL) multi sensor data fusion framework in cascade with the outputs of both sensors. The proposed liquid level measurement technique was subjected to testing with practical data, and results show successful implementation of the liquid level measurement system.

  11. Simple, High-Performance Fusion Rule for Censored Decisions in Wireless Sensor Networks

    Institute of Scientific and Technical Information of China (English)

    LIU Xiangyang; PENG Yingning; WANG Xiutan

    2008-01-01

    Data selection-based summation fusion (DSSF) was developed to overcome the shortcomings of previously developed likelihood ratio tests based on channel statistics (LRT-CS) for the problem of fusing censored binary decisions transmitted over Nakagami fading channels in a wireless sensor network (WSN). The LRT-CS relies on the detection probabilities of the local sensors, while the detection probabilities are a priori unknown for uncooperative targets. Also, for Nakagami fading channels, the LRT-CS involves an infinite series, which is cumbersome for real-time application. In contrast, the DSSF only involves data comparisons and additions and does not require the detection probabilities of local sensors. Furthermore, the performance of DSSF is only slightly degraded in comparison with the LRT-CS when the detection probabilities of local sensors are a priori unknown. Therefore, the DSSF should be used in a WSN with limited resources.

  12. Fast obstacle detection based on multi-sensor information fusion

    Science.gov (United States)

    Lu, Linli; Ying, Jie

    2014-11-01

    Obstacle detection is one of the key problems in areas such as driving assistance and mobile robot navigation, where a single sensor cannot meet the actual demand. A method is proposed to obtain, in real time, information about the obstacle in front of the robot and to calculate the real size of the obstacle area according to the triangle similarity inherent in the imaging process, by fusing data from a camera and an ultrasonic sensor, which supports local path planning decisions. In the image analysis, the obstacle detection region is limited according to a complementary principle: the ultrasonic detection range is chosen as the region for obstacle detection when the obstacle is relatively near the robot, while the travelling road area in front of the robot is the region for relatively long-distance detection. The obstacle detection algorithm is adapted from a powerful background subtraction algorithm, ViBe (Visual Background Extractor). An obstacle-free region in front of the robot is extracted in the initial frame; this region provides a reference sample set of gray-scale values for obstacle detection. Experiments detecting different obstacles at different distances give the accuracy of the obstacle detection and the error percentage between the calculated size and the actual size of the detected obstacle. Experimental results show that the detection scheme can effectively detect obstacles in front of the robot and provide the size of the obstacle with relatively high dimensional accuracy.
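The triangle-similarity step that fuses the two sensors is a one-line calculation: the ultrasonic sensor supplies the range, the camera supplies the pixel extent, and the pinhole model scales between them. A hedged sketch, with an assumed focal length and made-up numbers:

```python
def real_size(pixel_size, distance, focal_length_px):
    """Estimate the real-world extent of an obstacle from its pixel
    extent via similar triangles in the pinhole camera model:
    real = pixel_extent * range / focal_length (all lengths in
    consistent units; focal length expressed in pixels)."""
    return pixel_size * distance / focal_length_px

# hypothetical: ultrasonic range 2.0 m, obstacle blob 150 px wide,
# assumed camera focal length of 600 px
width_m = real_size(150, 2.0, 600)
```

This is why the fusion matters: the camera alone gives only angular size, and the ultrasonic sensor alone gives only range; the product of the two yields metric size for path planning.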

  13. Initial Realization of a Sensor Fusion Based Onboard Maritime Integrated PNT Unit

    Directory of Open Access Journals (Sweden)

    Ralf Ziebold

    2013-03-01

    Full Text Available This paper introduces the basic concept of the Position, Navigation and Timing (PNT) Module as a future part of a shipside Integrated Navigation System (INS). The core of the PNT Module is a sensor-fusion-based processing system (PNT Unit). The paper focuses on important aspects and first results of the initial practical realization of such a PNT Unit, including a realization of a Consistent Common Reference System (CCRS), GNSS/IMU tightly coupled positioning results, and the contingency performance of the inertial sensors.

  14. Indoor Positioning and Localisation System with Sensor Fusion : AN IMPLEMENTATION ON AN INDOOR AUTONOMOUS ROBOT AT ÅF

    OpenAIRE

    Ericsson, John-Eric; Eriksson, Daniel

    2014-01-01

    This thesis will present guidelines of how to select sensors and algorithms for indoor positioning and localisation systems with sensor fusion. These guidelines are based on an extensive theory and state of the art research. Different scenarios are presented to give some examples of proposed sensors and algorithms for certain applications. There are of course no right or wrong sensor combinations, but some factors are good to bear in mind when a system is designed. To give an example of the p...

  15. A Dempster-Shafer Method for Multi-Sensor Fusion

    Science.gov (United States)

    2012-03-01

    cadence, stride length, step length, joint angles (hip, knee, ankle), speed, and ground contact time [11; 27]. These measurements are easy to obtain because...

  16. Multi-sensor radiation detection, imaging, and fusion

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Kai [Department of Nuclear Engineering, University of California, Berkeley, CA 94720 (United States); Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2016-01-01

    Glenn Knoll was one of the leaders in the field of radiation detection and measurements and shaped this field through his outstanding scientific and technical contributions, as a teacher, his personality, and his textbook. His Radiation Detection and Measurement book guided me in my studies and is now the textbook in my classes in the Department of Nuclear Engineering at UC Berkeley. In the spirit of Glenn, I will provide an overview of our activities at the Berkeley Applied Nuclear Physics program reflecting some of the breadth of radiation detection technologies and their applications ranging from fundamental studies in physics to biomedical imaging and to nuclear security. I will conclude with a discussion of our Berkeley Radwatch and Resilient Communities activities as a result of the events at the Dai-ichi nuclear power plant in Fukushima, Japan more than 4 years ago. - Highlights: • Electron-tracking based gamma-ray momentum reconstruction. • 3D volumetric and 3D scene fusion gamma-ray imaging. • Nuclear Street View integrates and associates nuclear radiation features with specific objects in the environment. • The Institute for Resilient Communities combines science, education, and communities to minimize the impact of disastrous events.

  17. Sensor fusion IV: Control paradigms and data structures; Proceedings of the Meeting, Boston, MA, Nov. 12-15, 1991

    Science.gov (United States)

    Schenker, Paul S. (Editor)

    1992-01-01

    Various papers on control paradigms and data structures in sensor fusion are presented. The general topics addressed include: decision models and computational methods, sensor modeling and data representation, active sensing strategies, geometric planning and visualization, task-driven sensing, motion analysis, models motivated biology and psychology, decentralized detection and distributed decision, data fusion architectures, robust estimation of shapes and features, application and implementation. Some of the individual subjects considered are: the Firefly experiment on neural networks for distributed sensor data fusion, manifold traversing as a model for learning control of autonomous robots, choice of coordinate systems for multiple sensor fusion, continuous motion using task-directed stereo vision, interactive and cooperative sensing and control for advanced teleoperation, knowledge-based imaging for terrain analysis, physical and digital simulations for IVA robotics.

  19. Sensor fusion and nonlinear prediction for anomalous event detection

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, J.V.; Moore, K.R.; Elphic, R.C.

    1995-03-07

    The authors consider the problem of using the information from various time series, each one characterizing a different physical quantity, to predict the future state of the system and, based on that information, to detect and classify anomalous events. They stress the application of principal components analysis (PCA) to analyze and combine data from different sensors. They construct both linear and nonlinear predictors. In particular, for linear prediction the authors use the least-mean-square (LMS) algorithm, and for nonlinear prediction they use both backpropagation (BP) networks and fuzzy predictors (FP). As an application, they consider the prediction of gamma counts from past values of electron and gamma counts recorded by the instruments of a high-altitude satellite.
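The linear LMS predictor mentioned here can be sketched as a one-step-ahead adaptive filter. The filter order, step size, and the sinusoidal test series are illustrative assumptions, not the paper's satellite data:

```python
import numpy as np

def lms_predict(series, order=3, mu=0.01):
    """One-step-ahead LMS adaptive prediction: at each step the filter
    predicts series[t] from the previous `order` samples, then nudges
    the weights along the instantaneous error gradient."""
    w = np.zeros(order)
    errs = []
    for t in range(order, len(series)):
        x = series[t - order:t][::-1]   # most recent sample first
        e = series[t] - w @ x           # prediction error
        w += 2 * mu * e * x             # LMS weight update
        errs.append(e)
    return w, np.array(errs)

t = np.arange(400)
w, errs = lms_predict(np.sin(0.1 * t))
```

As the weights adapt, the prediction error shrinks; in an anomaly detection setting, a sudden growth in this residual is the signature of an anomalous event.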

  20. Step Characterization using Sensor Information Fusion and Machine Learning

    Directory of Open Access Journals (Sweden)

    Ricardo Anacleto

    2015-12-01

    Full Text Available A pedestrian inertial navigation system is typically used to overcome the limitations of Global Navigation Satellite Systems when tracking persons indoors or in dense environments. However, low-cost inertial systems produce huge location estimation errors due to the inherent characteristics of the sensors and of pedestrian dead reckoning. To suppress some of these errors, we propose a system that uses two inertial measurement units spread over the person's body, whose measurements are aggregated using learning algorithms that learn gait behaviors. In this work we present our results on using different machine learning algorithms to characterize the step according to its direction and length. This characterization is then used to adapt the navigation algorithm according to the performed classifications.

  1. Performance evaluation of multi-sensor data fusion technique for test range application

    Indian Academy of Sciences (India)

    Shrabani Bhattacharya; R Appavu Raj

    2004-04-01

    We have adopted the state-vector fusion technique for fusing track data from multiple sensors to provide complete and precise trajectory information about the flight vehicle under test, for the purpose of flight safety monitoring and decision-making at the Test Range. The present paper brings out the performance of the algorithm for different process noise and measurement noise using simulated as well as real track data.
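State-vector track fusion can be sketched in the scalar case as a covariance-weighted average of the two track estimates (the cross-covariance between the tracks is ignored here, and all numbers are illustrative):

```python
def fuse_tracks(x1, p1, x2, p2):
    """Covariance-weighted fusion of two scalar track estimates.

    Each track reports a state estimate x and its variance p; the
    cross-covariance between the two tracks is ignored for simplicity.
    """
    w1 = p2 / (p1 + p2)            # trust track 1 more when track 2 is noisier
    w2 = p1 / (p1 + p2)
    x_fused = w1 * x1 + w2 * x2
    p_fused = p1 * p2 / (p1 + p2)  # fused variance is below either input
    return x_fused, p_fused

# Two equally confident trackers reporting 10.0 and 12.0:
x_f, p_f = fuse_tracks(10.0, 4.0, 12.0, 4.0)
```

With equal variances the fused estimate is the midpoint, 11.0, with half the variance of either input.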

  2. Localization in orchards using Extended Kalman Filter for sensor-fusion - A FroboMind component

    DEFF Research Database (Denmark)

    Christiansen, Martin Peter; Jensen, Kjeld; Ellekilde, Lars-Peter

    Using the detected trees seen in figure 4(b), a localised SLAM map of the surrounding area can be created and used to determine the localisation of the tractor. This kind of sensor-fusion is used to keep the amount of prior information about the layout of the orchard to a minimum, so it can be used...... in orchards with different layouts of the trees....

  3. Resistive sensor and electromagnetic actuator for feedback stabilization of liquid metal walls in fusion reactors

    CERN Document Server

    Mirhoseini, S H M

    2016-01-01

    Liquid metal walls in fusion reactors will be subject to instabilities, turbulence, induced currents, error fields and temperature gradients that will make them locally bulge, thus entering into contact with the plasma, or deplete, hence exposing the underlying solid substrate. To prevent this, research has begun to actively stabilize static or flowing liquid metal layers by locally applying forces in feedback with thickness measurements. Here we present resistive sensors of liquid metal thickness and demonstrate j×B actuators to locally control it.

  4. Multiscale Recognition Algorithm for Eye Ground Texture Based on Fusion Threshold Equalization

    Directory of Open Access Journals (Sweden)

    Zhongsheng Qiu

    2014-09-01

    Full Text Available The eye ground texture is disturbed by non-ideal imaging factors such as noise, which affects clinical diagnosis in practice. An improved multi-scale retinal eye ground texture recognition algorithm based on a fused area threshold is therefore proposed. A nonlinear-sampling multi-scale transform is used to analyze the geometric space coefficients of retinal vessels, with multi-direction and shift-invariant features, and regional threshold filtering is integrated to suppress the effect of non-uniform blocks on texture recognition. Maximum-likelihood local mean and standard deviation analysis is used for texture parameter estimation and recognition. Noise is greatly reduced and accurate identification of texture features is obtained. Simulation results show that the algorithm characterizes retinal vascular texture well, performs well in recognizing different texture features, improves recognition accuracy, and has good robustness.

  5. Data fusion for the detection of buried land mines

    Energy Technology Data Exchange (ETDEWEB)

    Clark, G.A.; Sengupta, S.K.; Schaich, P.C.; Sherwood, R.J.; Buhl, M.R.; Hernandez, J.E.; Kane, R.J.; Barth, M.J.; Fields, D.J.; Carter, M.R.

    1993-10-01

    The authors conducted experiments to demonstrate the enhanced detectability of buried land mines using sensor fusion techniques. Multiple sensors, including visible imagery, infrared imagery, and ground-penetrating radar, have been used to acquire data on a number of buried mines and mine surrogates. The authors present this data along with a discussion of the application of sensor fusion techniques for this particular detection problem. The authors describe the data fusion architecture and discuss some relevant results of these classification methods.

  6. Sensor Fusion of Position- and Micro-Sensors (MEMS) integrated in a Wireless Sensor Network for movement detection in landslide areas

    Science.gov (United States)

    Arnhardt, Christian; Fernández-Steeger, Tomas; Azzam, Rafig

    2010-05-01

    Monitoring systems in landslide areas are important elements of effective Early Warning structures. Data acquisition and retrieval allows the detection of movement processes and thus is essential to generate warnings in time. Apart from precise measurement, the reliability of the data is fundamental, because outliers can trigger false alarms and lead to a loss of acceptance of such systems. For the monitoring of mass movements and their risk it is important to know whether there is movement, how fast it is and how trustworthy the information is. The joint project "Sensorbased landslide early warning system" (SLEWS) deals with these questions, and tries to improve data quality and to reduce false alarm rates through the combination of sensor data (sensor fusion). The project concentrates on the development of a prototypic Alarm- and Early Warning system (EWS) for different types of landslides by using various low-cost sensors, integrated in a wireless sensor network (WSN). The network consists of numerous connection points (nodes) that transfer data directly or over other nodes (multi-hop) in real time to a data collection point (gateway). From there, all data packages are transmitted to a spatial data infrastructure (SDI) for further processing, analysis and visualization with respect to end-user specifications. The ad-hoc characteristic of the network allows the autonomous crosslinking of the nodes according to existing connections and communication strength. Due to the independent finding of new or more stable connections (self-healing), a breakdown of the whole system is avoided. The bidirectional data stream enables receiving data from the network but also allows the transfer of commands and pointed requests into the WSN. For the detection of surface deformations in landslide areas, small low-cost Micro-Electro-Mechanical Systems (MEMS) and position sensors from the automobile industry, different industrial applications and from other measurement

  7. Fusion

    CERN Document Server

    Mahaffey, James A

    2012-01-01

    As energy problems of the world grow, work toward fusion power continues at a greater pace than ever before. The topic of fusion is one that is often met with the most recognition and interest in the nuclear power arena. Written in clear and jargon-free prose, Fusion explores the big bang of creation to the blackout death of worn-out stars. A brief history of fusion research, beginning with the first tentative theories in the early 20th century, is also discussed, as well as the race for fusion power. This brand-new, full-color resource examines the various programs currently being funded or p

  8. Person re-identification across aerial and ground-based cameras by deep feature fusion

    Science.gov (United States)

    Schumann, Arne; Metzler, Jürgen

    2017-05-01

    Person re-identification is the task of correctly matching visual appearances of the same person in image or video data while distinguishing appearances of different persons. The traditional setup for re-identification is a network of fixed cameras. However, in recent years mobile aerial cameras mounted on unmanned aerial vehicles (UAV) have become increasingly useful for security and surveillance tasks. Aerial data has many characteristics different from typical camera network data. Thus, re-identification approaches designed for a camera network scenario can be expected to suffer a drop in accuracy when applied to aerial data. In this work, we investigate the suitability of features, which were shown to give robust results for re-identification in camera networks, for the task of re-identifying persons between a camera network and a mobile aerial camera. Specifically, we apply hand-crafted region covariance features and features extracted by convolutional neural networks which were learned on separate data. We evaluate their suitability for this new and as yet unexplored scenario. We investigate common fusion methods to combine the hand-crafted and learned features and propose our own deep fusion approach which is already applied during training of the deep network. We evaluate features and fusion methods on our own dataset. The dataset consists of fourteen people moving through a scene recorded by four fixed ground-based cameras and one mobile camera mounted on a small UAV. We discuss strengths and weaknesses of the features in the new scenario and show that our fusion approach successfully leverages the strengths of each feature and outperforms all single features significantly.

  9. Swarm Robot Control for Human Services and Moving Rehabilitation by Sensor Fusion

    Directory of Open Access Journals (Sweden)

    Tresna Dewi

    2014-01-01

    Full Text Available A current trend in robotics is fusing different types of sensors having different characteristics to improve the performance of a robot system and also benefit from the reduced cost of sensors. One type of robot that requires sensor fusion for its application is the service robot. To achieve better performance, several service robots are preferred to work together, and, hence, this paper concentrates on swarm service robots. Swarm service mobile robots operating within a fixed area need to cope with dynamic changes in the environment, and they must also be capable of avoiding dynamic and static obstacles. This study applies sensor fusion and swarm concept for service mobile robots in human services and rehabilitation environment. The swarm robots follow the human moving trajectory to provide support to human moving and perform several tasks required in their living environment. This study applies a reference control and proportional-integral (PI control for the obstacle avoidance function. Various computer simulations are performed to verify the effectiveness of the proposed method.
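The PI control mentioned for the obstacle avoidance function can be sketched minimally as follows; the gains, time step and first-order plant model are illustrative assumptions, not the paper's tuning:

```python
class PI:
    """Discrete proportional-integral controller (illustrative gains)."""

    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, error):
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

# Drive a toy first-order plant toward a setpoint of 1.0.
ctrl = PI(kp=1.5, ki=0.5, dt=0.1)
state, setpoint = 0.0, 1.0
for _ in range(300):
    u = ctrl.step(setpoint - state)
    state += (u - state) * 0.1   # crude plant model, illustrative only
```

The integral term removes the steady-state error that a purely proportional controller would leave.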

  10. Two-level Robust Measurement Fusion Kalman Filter for Clustering Sensor Networks

    Institute of Scientific and Technical Information of China (English)

    ZHANG Peng; QI Wen-Juan; DENG Zi-Li

    2014-01-01

    This paper investigates the distributed fusion Kalman filtering over clustering sensor networks. The sensor network is partitioned as clusters by the nearest neighbor rule and each cluster consists of sensing nodes and cluster-head. Using the minimax robust estimation principle, based on the worst-case conservative system with the conservative upper bounds of noise variances, two-level robust measurement fusion Kalman filter is presented for the clustering sensor network systems with uncertain noise variances. It can significantly reduce the communication load and save energy when the number of sensors is very large. A Lyapunov equation approach for the robustness analysis is presented, by which the robustness of the local and fused Kalman filters is proved. The concept of the robust accuracy is presented, and the robust accuracy relations among the local and fused robust Kalman filters are proved. It is proved that the robust accuracy of the two-level weighted measurement fuser is equal to that of the global centralized robust fuser and is higher than those of each local robust filter and each local weighted measurement fuser. A simulation example shows the correctness and effectiveness of the proposed results.

  11. Position Estimation by Wearable Walking Navigation System for Visually Impaired with Sensor Fusion

    Science.gov (United States)

    Watanabe, Hiromi; Yamamoto, Yoshihiko; Tanzawa, Tsutomu; Kotani, Shinji

    A wearable walking navigation system without any special infrastructure has been developed to guide the visually impaired. It is important to estimate position correctly so that safe navigation can be realized. In our system, different sensor data are fused to estimate a pedestrian's position. An image processing system and a laser range finder (LRF) were used to estimate positions indoors. In this paper, we introduce the concept of “similarity” between map information and sensor data. This similarity is used to estimate the positions. Experimental results show that highly accurate position estimation can be achieved by sensor fusion. The positions in a linear passage were estimated using image processing data, and when the passage turns, the positions were estimated using LRF data.

  12. Improving Quantitative Precipitation Estimation via Data Fusion of High-Resolution Ground-based Radar Network and CMORPH Satellite-based Product

    Science.gov (United States)

    Cifelli, R.; Chen, H.; Chandrasekar, V.; Xie, P.

    2015-12-01

    A large number of precipitation products at multi-scales have been developed based upon satellite, radar, and/or rain gauge observations. However, how to produce optimal rainfall estimation for a given region is still challenging due to the spatial and temporal sampling difference of different sensors. In this study, we develop a data fusion mechanism to improve regional quantitative precipitation estimation (QPE) by utilizing the satellite-based CMORPH product, ground radar measurements, as well as numerical model simulations. The CMORPH global precipitation product is essentially derived based on retrievals from passive microwave measurements and infrared observations onboard satellites (Joyce et al. 2004). The fine spatial-temporal resolution of 0.05° lat/lon and 30 min is appropriate for regional hydrologic and climate studies. However, it is inadequate for localized hydrometeorological applications such as urban flash flood forecasting. Via fusion of the regional CMORPH product and local precipitation sensors, the high-resolution QPE performance can be improved. The area of interest is the Dallas-Fort Worth (DFW) Metroplex, which is the largest land-locked metropolitan area in the U.S. In addition to an NWS dual-polarization S-band WSR-88DP radar (i.e., KFWS radar), DFW hosts the high-resolution dual-polarization X-band radar network developed by the Center for Collaborative Adaptive Sensing of the Atmosphere (CASA). This talk will present a general framework of precipitation data fusion based on satellite and ground observations. The detailed prototype architecture of using regional rainfall instruments to improve the regional CMORPH precipitation product via multi-scale fusion techniques will also be discussed. Particularly, the temporal and spatial fusion algorithms developed for the DFW Metroplex will be described, which utilize the CMORPH product, S-band WSR-88DP, and X-band CASA radar measurements. In order to investigate the uncertainties associated with each

  13. Multi-sensor fusion techniques for state estimation of micro air vehicles

    Science.gov (United States)

    Donavanik, Daniel; Hardt-Stremayr, Alexander; Gremillion, Gregory; Weiss, Stephan; Nothwang, William

    2016-05-01

    Aggressive flight of micro air vehicles (MAVs) in unstructured, GPS-denied environments poses unique challenges for estimation of vehicle pose and velocity due to the noise, delay, and drift in individual sensor measurements. Maneuvering flight at speeds in excess of 5 m/s poses additional challenges even for active range sensors; in the case of LIDAR, an assembled scan of the vehicle's environment will in most cases be obsolete by the time it is processed. Multi-sensor fusion techniques which combine inertial measurements with passive vision techniques and/or LIDAR have achieved breakthroughs in the ability to maintain accurate state estimates without the use of external positioning sensors. In this paper, we survey algorithmic approaches to exploiting sensors with a wide range of nonlinear dynamics using filter and bundle-adjustment based approaches for state estimation and optimal control. From this foundation, we propose a biologically-inspired framework for incorporating the human operator in the loop as a privileged sensor in a combined human/autonomy paradigm.

  14. RGB-D, Laser and Thermal Sensor Fusion for People following in a Mobile Robot

    Directory of Open Access Journals (Sweden)

    Loreto Susperregi

    2013-06-01

    Full Text Available Detecting and tracking people is a key capability for robots that operate in populated environments. In this paper, we used a multiple sensor fusion approach that combines three kinds of sensors in order to detect people using RGB-D vision, lasers and a thermal sensor mounted on a mobile platform. The Kinect sensor offers a rich data set at a significantly low cost, however, there are some limitations to its use in a mobile platform, mainly that the Kinect algorithms for people detection rely on images captured by a static camera. To cope with these limitations, this work is based on the combination of the Kinect and a Hokuyo laser and a thermopile array sensor. A real-time particle filter system merges the information provided by the sensors and calculates the position of the target, using probabilistic leg and thermal patterns, image features and optical flow to this end. Experimental results carried out with a mobile platform in a Science museum have shown that the combination of different sensory cues increases the reliability of the people following system.

  15. Approach towards sensor placement, selection and fusion for real-time condition monitoring of precision machines

    Science.gov (United States)

    Er, Poi Voon; Teo, Chek Sing; Tan, Kok Kiong

    2016-02-01

    Moving mechanical parts in a machine will inevitably generate vibration profiles reflecting its operating conditions. Vibration profile analysis is a useful tool for real-time condition monitoring to avoid loss of performance and unwanted machine downtime. In this paper, we propose and validate an approach for sensor placement, selection and fusion for continuous machine condition monitoring. The main idea is to use a minimal series of sensors mounted at key locations of a machine to measure and infer the actual vibration spectrum at a critical point where it is not suitable to mount a sensor. The locations for sensors' mountings which are subsequently used for vibration inference are identified based on sensitivity calibration at these locations moderated with normalized Fisher Information (NFI) associated with the measurement quality of the sensor at that location. Each of the identified sensor placement location is associated with one or more sensitive frequencies for which it ranks top in terms of the moderated sensitivities calibrated. A set of Radial Basis Function (RBF), each of them associated with a range of sensitive frequencies, is used to infer the vibration at the critical point for that frequency. The overall vibration spectrum of the critical point is then fused from these components. A comprehensive set of experimental results for validation of the proposed approach is provided in the paper.

  17. Unmanned Ground Vehicle Navigation and Coverage Hole Patching in Wireless Sensor Networks

    Science.gov (United States)

    Zhang, Guyu

    2013-01-01

    This dissertation presents a study of an Unmanned Ground Vehicle (UGV) navigation and coverage hole patching in coordinate-free and localization-free Wireless Sensor Networks (WSNs). Navigation and coverage maintenance are related problems since coverage hole patching requires effective navigation in the sensor network environment. A…

  18. The Android smartphone as an inexpensive sentry ground sensor

    Science.gov (United States)

    Schwamm, Riqui; Rowe, Neil C.

    2012-06-01

    A key challenge of sentry and monitoring duties is detection of approaching people in areas of little human traffic. We are exploring smartphones as easily available, easily portable, and less expensive alternatives to traditional military sensors for this task, where the sensors are already integrated into the package. We developed an application program for the Android smartphone that uses its sensors to detect people passing nearby; it takes their pictures for subsequent transmission to a central monitoring station. We experimented with the microphone, light sensor, vibration sensor, proximity sensor, orientation sensor, and magnetic sensor of the Android. We got best results with the microphone (looking for footsteps) and light sensor (looking for abrupt changes in light), and sometimes good results with the vibration sensor. We ran a variety of tests with subjects walking at various distances from the phone under different environmental conditions to measure limits on acceptable detection. We got best results by combining average loudness over a 200 millisecond period with a brightness threshold adjusted to the background brightness, and we set our phones to trigger pictures no more than twice a second. Subjects needed to be within ten feet of the phone for reliable triggering, and some surfaces gave poorer results. We primarily tested using the Motorola Atrix 4G (Android 2.3.4) and HTC Evo 4G (Android 2.3.3) and found only a few differences in performance running the same program, which we attribute to differences in the hardware. We also tested two older Android phones that had problems with crashing when running our program. Our results provide good guidance for when and where to use this approach to inexpensive sensing.
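The triggering rule described above (average loudness over a short window combined with a background-adjusted brightness threshold, rate-limited to two pictures per second) can be sketched as a single predicate; the threshold values below are illustrative stand-ins, not the values tuned in the study:

```python
def should_trigger(loud_avg, brightness, background, t, last_t,
                   loud_thresh=0.6, bright_factor=1.5, min_gap=0.5):
    """Fire the camera when either cue trips, at most once per `min_gap` s.

    `loud_avg` is the average loudness over the last 200 ms window;
    `brightness` is compared against the background level. All thresholds
    here are illustrative, not the empirically tuned values.
    """
    cue = loud_avg > loud_thresh or brightness > bright_factor * background
    return cue and (t - last_t) >= min_gap

# Footsteps heard one second after the last picture -> trigger.
fired = should_trigger(0.7, 100.0, 100.0, t=1.0, last_t=0.0)
```

The rate limit keeps a continuous disturbance from flooding the central monitoring station with pictures.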

  19. Large-Scale, Parallel, Multi-Sensor Atmospheric Data Fusion Using Cloud Computing

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E. J.

    2013-12-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the 'A-Train' platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration analyses of important climate variables presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (MERRA), stratify the comparisons using a classification of the 'cloud scenes' from CloudSat, and repeat the entire analysis over 10 years of data. To efficiently assemble such datasets, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. However, these are data-intensive computing problems, so the data transfer times and storage costs (for caching) are key issues. SciReduce is a Hadoop-like parallel analysis system, programmed in parallel python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Figure 1 shows the architecture of the full computational system, with SciReduce at the core. Multi-year datasets are automatically 'sharded' by time and space across a cluster of nodes so that years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the cached input and intermediate datasets. We are using SciReduce to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a NASA MEASURES grant. We will

  20. Image-Based Multi-Sensor Data Representation and Fusion Via 2D Non-Linear Convolution

    OpenAIRE

    Aaron R. Rababaah

    2012-01-01

    Sensor data fusion is the process of combining data collected from multiple sensors of homogeneous or heterogeneous modalities to perform inferences that may not be possible using a single sensor. This process encompasses several stages to arrive at a sound, reliable decision-making end result. These stages include: sensor-signal preprocessing, sub-object refinement, object refinement, situation refinement, threat refinement and process refinement. Every stage draws from different domains to achie...

  1. Coordinated Workload Scheduling in Hierarchical Sensor Networks for Data Fusion Applications

    Institute of Scientific and Technical Information of China (English)

    Xiao-Lin Li; Jian-Nong Cao

    2008-01-01

    To minimize the execution time of a sensing task over a multi-hop hierarchical sensor network, we present a coordinated scheduling method following the divisible load scheduling paradigm. The proposed scheduling strategy builds on eliminating transmission collisions and idle gaps between two successive data transmissions. We consider a sensor network consisting of several clusters. In a cluster, after related raw data measured by source nodes are collected at the fusion node, in-network data aggregation is further considered. The scheduling strategies consist of two phases: intra-cluster scheduling and inter-cluster scheduling. Intra-cluster scheduling deals with assigning different fractions of a sensing workload among source nodes in each cluster; inter-cluster scheduling involves the distribution of fused data among all fusion nodes. Closed-form solutions to the problem of task scheduling are derived. Finally, numerical examples are presented to demonstrate the impacts of different system parameters, such as the number of sensor nodes and the measurement, communication, and processing speeds, on the finish time and energy consumption.

  2. Sensor fusion methods for reducing false alarms in heart rate monitoring.

    Science.gov (United States)

    Borges, Gabriel; Brusamarello, Valner

    2016-12-01

    Automatic patient monitoring is an essential resource in hospitals for good health care management. While alarms caused by abnormal physiological conditions are important for the delivery of fast treatment, they can also be a source of unnecessary noise because of false alarms caused by electromagnetic interference or motion artifacts. One significant source of false alarms is related to heart rate, which is triggered when the heart rhythm of the patient is too fast or too slow. In this work, the fusion of different physiological sensors is explored in order to create a robust heart rate estimation. A set of algorithms using a heart rate variability index, Bayesian inference, neural networks, fuzzy logic and majority voting is proposed to fuse the information from the electrocardiogram, arterial blood pressure and photoplethysmogram. Three kinds of information are extracted from each source, namely, heart rate variability, the heart rate difference between sensors and the spectral analysis of low and high noise of each sensor. This information is used as input to the algorithms. Twenty recordings selected from the MIMIC database were used to validate the system. The results showed that neural network fusion had the best false alarm reduction of 92.5 %, while the Bayesian technique had a reduction of 84.3 %, fuzzy logic 80.6 %, majority voter 72.5 % and the heart rate variability index 67.5 %. Therefore, the proposed algorithms showed good performance and could be useful in bedside monitors.
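Of the fusion schemes compared, the majority voter is the simplest to sketch; the predicate below is a generic voter, not the authors' implementation:

```python
def majority_vote(alarms):
    """Raise the fused alarm only when more than half the sensors agree."""
    return sum(alarms) > len(alarms) / 2

# Two of three per-sensor alarm decisions agree -> fused alarm stays on.
fused_alarm = majority_vote([True, True, False])
```

A single sensor corrupted by motion artifacts is thus outvoted by the clean channels, which is the mechanism behind the false alarm reduction reported above.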

  3. Fusion of WiFi, Smartphone Sensors and Landmarks Using the Kalman Filter for Indoor Localization

    Directory of Open Access Journals (Sweden)

    Zhenghua Chen

    2015-01-01

    Full Text Available Location-based services (LBS have attracted a great deal of attention recently. Outdoor localization can be solved by the GPS technique, but how to accurately and efficiently localize pedestrians in indoor environments is still a challenging problem. Recent techniques based on WiFi or pedestrian dead reckoning (PDR have several limiting problems, such as the variation of WiFi signals and the drift of PDR. An auxiliary tool for indoor localization is landmarks, which can be easily identified based on specific sensor patterns in the environment, and this will be exploited in our proposed approach. In this work, we propose a sensor fusion framework for combining WiFi, PDR and landmarks. Since the whole system is running on a smartphone, which is resource limited, we formulate the sensor fusion problem in a linear perspective, then a Kalman filter is applied instead of a particle filter, which is widely used in the literature. Furthermore, novel techniques to enhance the accuracy of individual approaches are adopted. In the experiments, an Android app is developed for real-time indoor localization and navigation. A comparison has been made between our proposed approach and individual approaches. The results show significant improvement using our proposed framework. Our proposed system can provide an average localization accuracy of 1 m.

  4. Fusion of WiFi, smartphone sensors and landmarks using the Kalman filter for indoor localization.

    Science.gov (United States)

    Chen, Zhenghua; Zou, Han; Jiang, Hao; Zhu, Qingchang; Soh, Yeng Chai; Xie, Lihua

    2015-01-05

    Location-based services (LBS) have attracted a great deal of attention recently. Outdoor localization can be solved by the GPS technique, but how to accurately and efficiently localize pedestrians in indoor environments is still a challenging problem. Recent techniques based on WiFi or pedestrian dead reckoning (PDR) have several limiting problems, such as the variation of WiFi signals and the drift of PDR. An auxiliary tool for indoor localization is landmarks, which can be easily identified based on specific sensor patterns in the environment, and this will be exploited in our proposed approach. In this work, we propose a sensor fusion framework for combining WiFi, PDR and landmarks. Since the whole system is running on a smartphone, which is resource limited, we formulate the sensor fusion problem in a linear perspective, then a Kalman filter is applied instead of a particle filter, which is widely used in the literature. Furthermore, novel techniques to enhance the accuracy of individual approaches are adopted. In the experiments, an Android app is developed for real-time indoor localization and navigation. A comparison has been made between our proposed approach and individual approaches. The results show significant improvement using our proposed framework. Our proposed system can provide an average localization accuracy of 1 m.
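A one-dimensional sketch of the Kalman-filter fusion the abstract formulates, with a PDR step as the motion model and a WiFi fix as the measurement (all variances and values below are illustrative, not from the paper):

```python
def kalman_step(x, p, u, q, z, r):
    """One cycle of a 1-D Kalman filter.

    Predict with the PDR displacement `u` (process noise `q`), then
    correct with the WiFi position fix `z` (measurement noise `r`).
    """
    x_pred, p_pred = x + u, p + q        # predict: dead-reckoning step
    k = p_pred / (p_pred + r)            # Kalman gain
    x_new = x_pred + k * (z - x_pred)    # correct: blend in the WiFi fix
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# PDR says we moved 1.0 m; WiFi places us at 1.5 m.
x_est, p_est = kalman_step(x=0.0, p=1.0, u=1.0, q=0.1, z=1.5, r=1.0)
```

The correction pulls the dead-reckoned position part-way toward the WiFi fix, bounding the PDR drift while smoothing the WiFi fluctuations.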

  5. On the use of sensor fusion to reduce the impact of rotational and additive noise in human activity recognition.

    Science.gov (United States)

    Banos, Oresti; Damas, Miguel; Pomares, Hector; Rojas, Ignacio

    2012-01-01

    The main objective of fusion mechanisms is to increase the individual reliability of the systems through the use of the collectivity knowledge. Moreover, fusion models are also intended to guarantee a certain level of robustness. This is particularly required for problems such as human activity recognition where runtime changes in the sensor setup seriously disturb the reliability of the initial deployed systems. For commonly used recognition systems based on inertial sensors, these changes are primarily characterized as sensor rotations, displacements or faults related to the batteries or calibration. In this work we show the robustness capabilities of a sensor-weighted fusion model when dealing with such disturbances under different circumstances. Using the proposed method, up to 60% outperformance is obtained when a minority of the sensors are artificially rotated or degraded, independent of the level of disturbance (noise) imposed. These robustness capabilities also apply for any number of sensors affected by a low to moderate noise level. The presented fusion mechanism compensates the poor performance that otherwise would be obtained when just a single sensor is considered.
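A sensor-weighted fusion model of the kind described can be sketched as a reliability-weighted average of per-sensor scores; the function and numbers below are illustrative, not the authors' model:

```python
def weighted_fusion(scores, weights):
    """Reliability-weighted average of per-sensor scores: lowering the
    weight of a rotated or degraded sensor limits its influence."""
    total = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total

# Third sensor is known to be degraded, so it gets a small weight.
fused = weighted_fusion([0.9, 0.8, 0.1], [1.0, 1.0, 0.2])
```

Down-weighting the disturbed minority is what lets the fused decision track the healthy sensors regardless of the noise level imposed on the bad one.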

  6. On the Use of Sensor Fusion to Reduce the Impact of Rotational and Additive Noise in Human Activity Recognition

    Directory of Open Access Journals (Sweden)

    Ignacio Rojas

    2012-06-01

    Full Text Available The main objective of fusion mechanisms is to increase the reliability of individual systems through the use of collective knowledge. Moreover, fusion models are also intended to guarantee a certain level of robustness. This is particularly required for problems such as human activity recognition, where runtime changes in the sensor setup seriously disturb the reliability of the initially deployed systems. For commonly used recognition systems based on inertial sensors, these changes primarily take the form of sensor rotations, displacements, or faults related to the batteries or calibration. In this work we show the robustness capabilities of a sensor-weighted fusion model when dealing with such disturbances under different circumstances. Using the proposed method, an improvement of up to 60% is obtained when a minority of the sensors are artificially rotated or degraded, independent of the level of disturbance (noise) imposed. These robustness capabilities also apply for any number of sensors affected by a low to moderate noise level. The presented fusion mechanism compensates for the poor performance that would otherwise be obtained when just a single sensor is considered.

  7. Identifying and tracking pedestrians based on sensor fusion and motion stability predictions.

    Science.gov (United States)

    Musleh, Basam; García, Fernando; Otamendi, Javier; Armingol, José Maria; de la Escalera, Arturo

    2010-01-01

    The lack of trustworthy sensors makes development of Advanced Driver Assistance System (ADAS) applications a tough task. It is necessary to develop intelligent systems by combining reliable sensors and real-time algorithms to send the proper, accurate messages to the drivers. In this article, an application to detect and predict the movement of pedestrians in order to prevent an imminent collision has been developed and tested under real conditions. The proposed application, first, accurately measures the position of obstacles using a two-sensor hybrid fusion approach: a stereo camera vision system and a laser scanner. Second, it correctly identifies pedestrians using intelligent algorithms based on polylines and pattern recognition related to leg positions (laser subsystem) and dense disparity maps and u-v disparity (vision subsystem). Third, it uses statistical validation gates and confidence regions to track the pedestrian within the detection zones of the sensors and predict their position in the upcoming frames. The intelligent sensor application has been experimentally tested with success while tracking pedestrians that cross and move in zigzag fashion in front of a vehicle.
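
    Statistical validation gates of the kind mentioned above are commonly implemented as chi-square gates on the Mahalanobis distance; a minimal sketch, with assumed covariances and a standard 99% gate threshold:

```python
# Illustrative sketch (positions and variances are assumed, not the
# authors' values): a chi-square validation gate that accepts a new
# detection only if its Mahalanobis distance from the predicted
# pedestrian position is small enough.

def mahalanobis2(z, pred, var):
    """Squared Mahalanobis distance with a diagonal covariance."""
    return sum((zi - pi) ** 2 / vi for zi, pi, vi in zip(z, pred, var))

def in_gate(z, pred, var, gate=9.21):   # 9.21 ≈ chi-square 99%, 2 dof
    return mahalanobis2(z, pred, var) <= gate

predicted = (10.0, 2.0)     # predicted pedestrian position (m)
variance = (0.5, 0.5)       # predicted position variance per axis
assert in_gate((10.4, 2.3), predicted, variance)      # plausible detection
assert not in_gate((14.0, 6.0), predicted, variance)  # clutter, rejected
```

    Detections passing the gate update the track; everything else is treated as clutter or a new object, which keeps the tracker stable in crowded scenes.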

  8. Identifying and Tracking Pedestrians Based on Sensor Fusion and Motion Stability Predictions

    Directory of Open Access Journals (Sweden)

    Arturo de la Escalera

    2010-08-01

    Full Text Available The lack of trustworthy sensors makes development of Advanced Driver Assistance System (ADAS) applications a tough task. It is necessary to develop intelligent systems by combining reliable sensors and real-time algorithms to send the proper, accurate messages to the drivers. In this article, an application to detect and predict the movement of pedestrians in order to prevent an imminent collision has been developed and tested under real conditions. The proposed application, first, accurately measures the position of obstacles using a two-sensor hybrid fusion approach: a stereo camera vision system and a laser scanner. Second, it correctly identifies pedestrians using intelligent algorithms based on polylines and pattern recognition related to leg positions (laser subsystem) and dense disparity maps and u-v disparity (vision subsystem). Third, it uses statistical validation gates and confidence regions to track the pedestrian within the detection zones of the sensors and predict their position in the upcoming frames. The intelligent sensor application has been experimentally tested with success while tracking pedestrians that cross and move in zigzag fashion in front of a vehicle.

  9. Autonomous landing of a helicopter UAV with a ground-based multisensory fusion system

    Science.gov (United States)

    Zhou, Dianle; Zhong, Zhiwei; Zhang, Daibing; Shen, Lincheng; Yan, Chengping

    2015-02-01

    This study focuses on the vision-based autonomous landing problem for a helicopter unmanned aerial vehicle (UAV), and proposes a ground-based multisensory fusion system to support it. The system includes an infrared camera, an ultra-wideband radar that measures the distance between the UAV and the ground-based system, and a pan-tilt unit (PTU). The infrared camera is used so that the UAV can be identified in all weather conditions. To avoid the complexity of computing the target's three-dimensional coordinates from stereo or single-camera vision, the ultra-wideband radar distance module provides the depth information while real-time image-based PTU tracking follows the UAV; together these are used to calculate the UAV's three-dimensional coordinates. Test results compared against DGPS show that the approach is effective and robust.

  10. Towards the development of tamper-resistant, ground-based mobile sensor nodes

    Science.gov (United States)

    Mascarenas, David; Stull, Christopher; Farrar, Charles

    2011-11-01

    Mobile sensor nodes hold great potential for collecting field data using fewer resources than human operators would require and potentially requiring fewer sensors than a fixed-position sensor array. It would be very beneficial to allow these mobile sensor nodes to operate unattended with a minimum of human intervention. In order to allow mobile sensor nodes to operate unattended in a field environment, it is imperative that they be capable of identifying and responding to external agents that may attempt to tamper with, damage or steal the mobile sensor nodes, while still performing their data collection mission. Potentially hostile external agents could include animals, other mobile sensor nodes, or humans. This work will focus on developing control policies to help enable a mobile sensor node to identify and avoid capture by a hostile human on foot. The work is developed in a simulation environment, and demonstrated using a non-holonomic, ground-based mobile sensor node. This work will be a preliminary step toward ensuring the cyber-physical security of ground-based mobile sensor nodes that operate unattended in potentially unfriendly environments.

  11. The addition of a sagittal image fusion improves the prostate cancer detection in a sensor-based MRI/ultrasound fusion guided targeted biopsy.

    Science.gov (United States)

    Günzel, Karsten; Cash, Hannes; Buckendahl, John; Königbauer, Maximilian; Asbach, Patrick; Haas, Matthias; Neymeyer, Jörg; Hinz, Stefan; Miller, Kurt; Kempkensteffen, Carsten

    2017-01-13

    To explore the diagnostic benefit of an image fusion of the sagittal plane in addition to the standard axial image fusion, using a sensor-based MRI/US fusion platform. Between July 2013 and September 2015, 251 patients with at least one suspicious lesion on mpMRI (rated by PI-RADS) were included in the analysis. All patients underwent MRI/US targeted biopsy (TB) in combination with a 10-core systematic prostate biopsy (SB). All biopsies were performed on a sensor-based fusion system. Group A included 162 men who received TB with an axial MRI/US image fusion. Group B comprised 89 men in whom the TB was performed with an additional sagittal image fusion. The median age in group A was 67 years (IQR 61-72) and in group B 68 years (IQR 60-71). The median PSA level in group A was 8.10 ng/ml (IQR 6.05-14) and in group B 8.59 ng/ml (IQR 5.65-12.32). In group A the proportion of patients with a suspicious digital rectal examination (DRE) (14 vs. 29%, p = 0.007) and the proportion of primary biopsies (33 vs. 46%, p = 0.046) were significantly lower. PI-RADS 3 lesions were overrepresented in group A compared to group B (19 vs. 9%; p = 0.044). Classified according to PI-RADS 3, 4 and 5, the detection rates of TB were 42, 48 and 75% in group A and 25, 74 and 90% in group B. The rate of PCa with a Gleason score ≥7 missed by TB was 33% (18 cases) in group A and 9% (5 cases) in group B (p = 0.072). An explorative multivariate binary logistic regression analysis revealed that PI-RADS score, a suspicious DRE and performing an additional sagittal image fusion were significant predictors of PCa detection in TB. Nine PCa were detected only by TB with sagittal fusion (sTB), and sTB identified 10 additional clinically significant PCa (Gleason ≥7). Performing an additional sagittal image fusion besides the standard axial fusion appears to improve the accuracy of the sensor-based MRI/US fusion platform.

  12. Coal blending scheduling in coal preparation plant based on multi-sensor information fusion

    Energy Technology Data Exchange (ETDEWEB)

    Gao, L.; Yu, H.; Wang, Y. [CUMT, Xuzhou (China). School of Information and Electrical Engineering

    2004-01-01

    It is important to devise a reasonable blending schedule according to customer requirements and the product situation in a coal preparation plant. To solve this problem, a mathematical model was set up on the basis of analysing the coal blending schedule. Multi-sensor information fusion was used to monitor the density and the amount of coal. A genetic algorithm was used to solve the nonlinear function in the mathematical model. A satisfactory result was obtained in simulation tests. 8 refs., 2 figs.

  13. Reconnaissance blind multi-chess: an experimentation platform for ISR sensor fusion and resource management

    Science.gov (United States)

    Newman, Andrew J.; Richardson, Casey L.; Kain, Sean M.; Stankiewicz, Paul G.; Guseman, Paul R.; Schreurs, Blake A.; Dunne, Jeffrey A.

    2016-05-01

    This paper introduces the game of reconnaissance blind multi-chess (RBMC) as a paradigm and test bed for understanding and experimenting with autonomous decision making under uncertainty and in particular managing a network of heterogeneous Intelligence, Surveillance and Reconnaissance (ISR) sensors to maintain situational awareness informing tactical and strategic decision making. The intent is for RBMC to serve as a common reference or challenge problem in fusion and resource management of heterogeneous sensor ensembles across diverse mission areas. We have defined a basic rule set and a framework for creating more complex versions, developed a web-based software realization to serve as an experimentation platform, and developed some initial machine intelligence approaches to playing it.

  14. Regularized discriminant analysis for multi-sensor decision fusion and damage detection with Lamb waves

    Science.gov (United States)

    Mishra, Spandan; Vanli, O. Arda; Huffer, Fred W.; Jung, Sungmoon

    2016-04-01

    In this study we propose a regularized linear discriminant analysis approach for damage detection which does not require an intermediate feature extraction step and is therefore more efficient in handling high-dimensional data. A robust discriminant model is obtained by shrinking the covariance matrix toward a diagonal matrix and thresholding redundant predictors without hurting the predictive power of the model. The shrinkage and threshold parameters of the discriminant function (decision boundary) are estimated to minimize the classification error. Furthermore, it is shown how the damage classification achieved by the proposed method can be extended to multiple sensors by following a Bayesian decision-fusion formulation. The detection probability of each sensor is used as a prior to estimate the posterior detection probability of the entire network, and the posterior detection probability is used as a quantitative basis for the final decision about the damage.
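
    The Bayesian decision-fusion step can be illustrated as follows (the prior and per-sensor likelihoods are invented, and sensors are assumed conditionally independent, which the paper's formulation need not require):

```python
# A minimal sketch of Bayesian decision fusion across sensors: each
# sensor reports how likely its observation is under "damage" vs.
# "healthy", and the independent likelihoods are combined with the prior
# into a posterior damage probability for the whole network.
# All numbers are invented for illustration.

def fuse_detections(prior, likelihoods):
    """prior: P(damage); likelihoods: per-sensor (P(obs|damage), P(obs|healthy))."""
    p_damage, p_healthy = prior, 1.0 - prior
    for l_d, l_h in likelihoods:          # assume conditionally independent sensors
        p_damage *= l_d
        p_healthy *= l_h
    return p_damage / (p_damage + p_healthy)

# Three sensors, each more likely to fire when damage is present.
posterior = fuse_detections(prior=0.1, likelihoods=[(0.9, 0.2), (0.8, 0.3), (0.85, 0.25)])
decision = posterior > 0.5
```

    Even with a skeptical 10% prior, three weakly informative sensors agreeing pushes the posterior well past the decision threshold.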

  15. Neuromorphic Audio-Visual Sensor Fusion on a Sound-Localising Robot

    Directory of Open Access Journals (Sweden)

    Vincent Yue-Sek Chan

    2012-02-01

    Full Text Available This paper presents the first robotic system featuring audio-visual sensor fusion with neuromorphic sensors. We combine a pair of silicon cochleae and a silicon retina on a robotic platform to allow the robot to learn sound localisation through self-motion and visual feedback, using an adaptive ITD-based sound localisation algorithm. After training, the robot can localise sound sources (white or pink noise) in a reverberant environment with an RMS error of 4 to 5 degrees in azimuth. In the second part of the paper, we investigate the source binding problem. An experiment is conducted to test the effectiveness of matching an audio event with a corresponding visual event based on their onset time. The results show that this technique can be quite effective, despite its simplicity.

  16. Integrated multi-sensor fusion for mapping and localization in outdoor environments for mobile robots

    Science.gov (United States)

    Emter, Thomas; Petereit, Janko

    2014-05-01

    An integrated multi-sensor fusion framework for localization and mapping for autonomous navigation in unstructured outdoor environments, based on extended Kalman filters (EKF), is presented. The sensors for localization include an inertial measurement unit, a GPS, a fiber optic gyroscope, and wheel odometry. Additionally, a 3D LIDAR is used for simultaneous localization and mapping (SLAM). A 3D map is built while a localization in the 2D map established so far is concurrently estimated from the current scan of the LIDAR. Despite the longer run-time of the SLAM algorithm compared to the EKF update, a high update rate is still guaranteed by carefully joining and synchronizing two parallel localization estimators.
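
    A single EKF predict/update cycle of the kind used for such localization can be sketched as below (a unicycle odometry model with assumed noise values; the paper's filter fuses far more sensors than this):

```python
# Hedged EKF sketch: wheel odometry drives a nonlinear unicycle
# prediction for state (x, y, heading); a GPS position fix drives the
# linear update. All noise values are assumed for illustration.

import math

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def mat_t(A):
    return [list(col) for col in zip(*A)]

def ekf_predict(x, P, v, w, dt, Q):
    px, py, th = x
    x = [px + v * dt * math.cos(th), py + v * dt * math.sin(th), th + w * dt]
    F = [[1, 0, -v * dt * math.sin(th)],   # Jacobian of the motion model
         [0, 1, v * dt * math.cos(th)],
         [0, 0, 1]]
    P = mat_mul(mat_mul(F, P), mat_t(F))
    return x, [[P[i][j] + (Q[i] if i == j else 0.0) for j in range(3)] for i in range(3)]

def ekf_update_gps(x, P, z, r):
    H = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]   # GPS observes position only
    PHt = mat_mul(P, mat_t(H))                # 3x2
    S = [[PHt[0][0] + r, PHt[0][1]],          # innovation covariance
         [PHt[1][0], PHt[1][1] + r]]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    S_inv = [[S[1][1] / det, -S[0][1] / det],
             [-S[1][0] / det, S[0][0] / det]]
    K = mat_mul(PHt, S_inv)                   # 3x2 Kalman gain
    y = [z[0] - x[0], z[1] - x[1]]            # innovation
    x = [x[i] + K[i][0] * y[0] + K[i][1] * y[1] for i in range(3)]
    KH = mat_mul(K, H)
    P = [[P[i][j] - sum(KH[i][k] * P[k][j] for k in range(3)) for j in range(3)]
         for i in range(3)]                   # P = (I - KH) P
    return x, P

x, P = [0.0, 0.0, 0.0], [[0.1, 0, 0], [0, 0.1, 0], [0, 0, 0.1]]
x, P = ekf_predict(x, P, v=1.0, w=0.0, dt=1.0, Q=[0.01, 0.01, 0.005])
x, P = ekf_update_gps(x, P, z=[1.1, -0.05], r=0.25)
```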

  17. A high precision, compact electromechanical ground rotation sensor

    Science.gov (United States)

    Dergachev, V.; DeSalvo, R.; Asadoor, M.; Bhawal, A.; Gong, P.; Kim, C.; Lottarini, A.; Minenkov, Y.; Murphy, C.; O'Toole, A.; Peña Arellano, F. E.; Rodionov, A. V.; Shaner, M.; Sobacchi, E.

    2014-05-01

    We present a mechanical rotation sensor consisting of a balance pivoting on a tungsten carbide knife edge. These sensors are important for precision seismic isolation systems, as employed in land-based gravitational wave interferometers, and for the new field of rotational seismology. The position sensor used is an air-core linear variable differential transformer with a demonstrated noise floor of 1 × 10⁻¹¹ m/√Hz. We describe the instrument construction and demonstrate low noise operation with a noise floor upper bound of 5.7 × 10⁻⁹ rad/√Hz at 10 mHz and 6.4 × 10⁻¹⁰ rad/√Hz at 0.1 Hz. The performance of the knife edge hinge is compatible with behavior free of noise from dislocation self-organized criticality.

  18. A high precision, compact electromechanical ground rotation sensor

    Energy Technology Data Exchange (ETDEWEB)

    Dergachev, V., E-mail: volodya@caltech.edu [LIGO Laboratory, California Institute of Technology, MS 100-36, Pasadena, California 91125 (United States); DeSalvo, R. [LIGO Laboratory, California Institute of Technology, MS 100-36, Pasadena, California 91125 (United States); University of Sannio, C.so Garibaldi 107, Benevento 82100 (Italy); Asadoor, M. [Mayfield Senior School, 500 Bellefontaine Street, Pasadena, California 91105 (United States); Oklahoma State University, 219 Student Union, Stillwater, Oklahoma 74074 (United States); Bhawal, A. [Arcadia High School, 180 Campus Drive, Arcadia, California 91007 (United States); Carnegie Mellon University, 5000 Forbes Ave, Pittsburgh, Pennsylvania 15213 (United States); Gong, P. [Department of Precision Instrument, Tsinghua University, Beijing 100084 (China); School of Industrial and System Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332-0205 (United States); Kim, C. [California Institute of Technology, Pasadena, California 91125 (United States); Lottarini, A. [Department of Computer Science, University of Pisa, Largo B. Pontecorvo 3, 56127 Pisa (Italy); Department of Computer Science, Columbia University, 1214 Amsterdam Avenue, New York, New York 10027 (United States); Minenkov, Y. [Sezione INFN Tor Vergata, via della Ricerca Scientifica  1, 00133 Roma (Italy); Murphy, C. [School of Physics, The University of Western Australia, 35 Stirling Highway, Crawley, Perth, Western Australia 6009 (Australia); University of Melbourne Grattan Street, Parkville VIC 3010 (Australia); O' Toole, A. [University of California, Los Angeles, 405 Hilgard Ave, Los Angeles, California 90095 (United States); Michigan Technological University, 1400 Townsend Dr, Houghton, Michigan 49931 (United States); Peña Arellano, F. E. [National Astronomical Observatory of Japan, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan); and others

    2014-05-15

    We present a mechanical rotation sensor consisting of a balance pivoting on a tungsten carbide knife edge. These sensors are important for precision seismic isolation systems, as employed in land-based gravitational wave interferometers, and for the new field of rotational seismology. The position sensor used is an air-core linear variable differential transformer with a demonstrated noise floor of 1 × 10⁻¹¹ m/√Hz. We describe the instrument construction and demonstrate low noise operation with a noise floor upper bound of 5.7 × 10⁻⁹ rad/√Hz at 10 mHz and 6.4 × 10⁻¹⁰ rad/√Hz at 0.1 Hz. The performance of the knife edge hinge is compatible with behavior free of noise from dislocation self-organized criticality.

  19. Pheromone-based coordination strategy to static sensors on the ground and unmanned aerial vehicles carried sensors

    Science.gov (United States)

    Pignaton de Freitas, Edison; Heimfarth, Tales; Pereira, Carlos Eduardo; Morado Ferreira, Armando; Rech Wagner, Flávio; Larsson, Tony

    2010-04-01

    A current trend that is gaining strength in the wireless sensor network area is the use of heterogeneous sensor nodes in one coordinated overall network, needed to fulfill the requirements of sophisticated emerging applications such as area surveillance systems. One of the main concerns when developing such sensor networks is how to provide coordination among the heterogeneous nodes, in order to enable them to respond efficiently to user needs. This study presents an investigation of strategies to coordinate a set of static sensor nodes on the ground cooperating with wirelessly connected Unmanned Aerial Vehicles (UAVs) carrying a variety of sensors, in order to provide efficient surveillance over an area of interest. The sensor nodes on the ground are set to issue alarms on the occurrence of a given event of interest, e.g., the entrance of an unauthorized vehicle into the area, while the UAVs receive the issued alarms and have to decide which of them is the most suitable to handle each alarm. A bio-inspired coordination strategy based on the concept of pheromones is presented. As a complement to this strategy, a utility-based decision making approach is proposed.
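
    A hypothetical sketch of the pheromone idea (names, constants, and the attraction rule are all invented for illustration): an alarm deposits pheromone that evaporates over time, and the UAV with the strongest attraction, trading pheromone level against distance, takes the task:

```python
# Invented pheromone-style task allocation: alarms deposit pheromone,
# pheromone evaporates each step, and each UAV's attraction to an alarm
# combines pheromone strength with proximity; the most attracted UAV
# handles the alarm, so stale alarms gradually fade away.

def evaporate(pheromone, rate=0.1):
    return {cell: level * (1.0 - rate) for cell, level in pheromone.items()}

def attraction(uav_pos, alarm_cell, pheromone):
    dx = uav_pos[0] - alarm_cell[0]
    dy = uav_pos[1] - alarm_cell[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return pheromone.get(alarm_cell, 0.0) / (1.0 + distance)

def assign(uavs, alarm_cell, pheromone):
    return max(uavs, key=lambda u: attraction(uavs[u], alarm_cell, pheromone))

pheromone = {(3, 4): 1.0}                       # ground node raised an alarm here
pheromone = evaporate(pheromone)                # one evaporation step
uavs = {"uav-1": (0.0, 0.0), "uav-2": (3.0, 3.0)}
chosen = assign(uavs, (3, 4), pheromone)        # the closer UAV wins
```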

  20. An Inertial and Optical Sensor Fusion Approach for Six Degree-of-Freedom Pose Estimation.

    Science.gov (United States)

    He, Changyu; Kazanzides, Peter; Sen, Hasan Tutkun; Kim, Sungmin; Liu, Yue

    2015-07-08

    Optical tracking provides relatively high accuracy over a large workspace but requires line-of-sight between the camera and the markers, which may be difficult to maintain in actual applications. In contrast, inertial sensing does not require line-of-sight but is subject to drift, which may cause large cumulative errors, especially during the measurement of position. To handle cases where some or all of the markers are occluded, this paper proposes an inertial and optical sensor fusion approach in which the bias of the inertial sensors is estimated when the optical tracker provides full six degree-of-freedom (6-DOF) pose information. As long as the position of at least one marker can be tracked by the optical system, the 3-DOF position can be combined with the orientation estimated from the inertial measurements to recover the full 6-DOF pose information. When all the markers are occluded, the position tracking relies on the inertial sensors that are bias-corrected by the optical tracking system. Experiments are performed with an augmented reality head-mounted display (ARHMD) that integrates an optical tracking system (OTS) and inertial measurement unit (IMU). Experimental results show that under partial occlusion conditions, the root mean square errors (RMSE) of orientation and position are 0.04° and 0.134 mm, and under total occlusion conditions for 1 s, the orientation and position RMSE are 0.022° and 0.22 mm, respectively. Thus, the proposed sensor fusion approach can provide reliable 6-DOF pose under long-term partial occlusion and short-term total occlusion conditions.

  1. An Inertial and Optical Sensor Fusion Approach for Six Degree-of-Freedom Pose Estimation

    Directory of Open Access Journals (Sweden)

    Changyu He

    2015-07-01

    Full Text Available Optical tracking provides relatively high accuracy over a large workspace but requires line-of-sight between the camera and the markers, which may be difficult to maintain in actual applications. In contrast, inertial sensing does not require line-of-sight but is subject to drift, which may cause large cumulative errors, especially during the measurement of position. To handle cases where some or all of the markers are occluded, this paper proposes an inertial and optical sensor fusion approach in which the bias of the inertial sensors is estimated when the optical tracker provides full six degree-of-freedom (6-DOF) pose information. As long as the position of at least one marker can be tracked by the optical system, the 3-DOF position can be combined with the orientation estimated from the inertial measurements to recover the full 6-DOF pose information. When all the markers are occluded, the position tracking relies on the inertial sensors that are bias-corrected by the optical tracking system. Experiments are performed with an augmented reality head-mounted display (ARHMD) that integrates an optical tracking system (OTS) and inertial measurement unit (IMU). Experimental results show that under partial occlusion conditions, the root mean square errors (RMSE) of orientation and position are 0.04° and 0.134 mm, and under total occlusion conditions for 1 s, the orientation and position RMSE are 0.022° and 0.22 mm, respectively. Thus, the proposed sensor fusion approach can provide reliable 6-DOF pose under long-term partial occlusion and short-term total occlusion conditions.

  2. Solid state magnetic field sensors for micro unattended ground networks using spin dependent tunneling

    Science.gov (United States)

    Tondra, Mark; Nordman, Catherine A.; Lange, Erik H.; Reed, Daniel; Jander, Albrect; Akou, Seraphin; Daughton, James

    2001-09-01

    Micro Unattended Ground Sensor Networks will likely employ magnetic sensors, primarily for discrimination of objects as opposed to initial detection. These magnetic sensors, then, must fit within very small cost, size, and power budgets to be compatible with the envisioned sensor suites. Also, a high degree of sensitivity is required to minimize the number of sensor cells required to survey a given area in the field. Solid state magnetoresistive sensors, with their low cost, small size, and ease of integration, are excellent candidates for these applications, assuming that their power and sensitivity performance are acceptable. SDT devices have been fabricated into prototype magnetic field sensors suitable for use in micro unattended ground sensor networks. They are housed in tiny SOIC 8-pin packages and mounted on a circuit board with the required voltage regulation, signal amplification and conditioning, and sensor control and communications functions. The best sensitivity results to date are 289 pT/√Hz at 1 Hz and 7 pT/√Hz at f > 10 kHz. Expected near-term improvements in performance would bring these levels to approximately 10 pT/√Hz at 1 Hz and approximately 1 pT/√Hz above 1 kHz.

  3. Sensor fusion: lane marking detection and autonomous intelligent cruise control system

    Science.gov (United States)

    Baret, Marc; Baillarin, S.; Calesse, C.; Martin, Lionel

    1995-12-01

    In the past few years MATRA and RENAULT have developed an Autonomous Intelligent Cruise Control (AICC) system based on a LIDAR sensor. This sensor, incorporating a charge coupled device, was designed to acquire pulsed laser diode emission reflected by standard car reflectors. The absence of moving mechanical parts, the large field of view, the high measurement rate and the very good accuracy for distance range and angular position of targets make this sensor very interesting. It provides the equipped car with the distance and the relative speed of other vehicles, enabling the safety distance to be controlled by acting on the throttle and the automatic gear box. Experiments in various real traffic situations have shown the limitations of this kind of system, especially on bends. All AICC sensors are unable to distinguish between a bend and a change of lane; this is easily understood if we consider a road without lane markings. This fact has led MATRA to improve its AICC system by providing lane marking information. Also, in the scope of the EUREKA PROMETHEUS project, MATRA and RENAULT have developed a lane keeping system to warn of a driver's lack of vigilance. MATRA has since extended this system to far-field lane marking detection and coupled it with the AICC system. Experiments will be carried out on roads to estimate the gain in performance and comfort due to this fusion.

  4. Fault tolerant multi-sensor fusion based on the information gain

    Science.gov (United States)

    Hage, Joelle Al; El Najjar, Maan E.; Pomorski, Denis

    2017-01-01

    In the last decade, multi-robot systems have been used in several applications: military operations, intervention in areas presenting danger to human life, natural disaster management, environmental monitoring, exploration, and agriculture. The integrity of the robots' localization must be ensured so that they can achieve their mission in the best conditions. Robots are equipped with proprioceptive (encoders, gyroscope) and exteroceptive (Kinect) sensors. However, these sensors can be affected by various fault types, such as erroneous measurements, biases, outliers, and drifts. In the absence of a sensor fault diagnosis step, the integrity and the continuity of the localization are affected. In this work, we present a multi-sensor fusion approach with Fault Detection and Exclusion (FDE) based on information theory. In this context, we are interested in the information gain provided by an observation, which is relevant when dealing with the fault tolerance aspect. Moreover, threshold optimization based on the quantity of information given by a decision on the true hypothesis is highlighted.
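
    Information gain in this setting can be measured as the reduction in Shannon entropy of the fault hypothesis after an observation; a minimal sketch with invented numbers (the paper's formulation and thresholds are more elaborate):

```python
# Illustrative sketch: the information gain of an observation is the
# drop in Shannon entropy of a Bernoulli fault hypothesis. A near-zero
# gain can flag an uninformative (possibly faulty) sensor as a
# candidate for exclusion. All probabilities are invented.

import math

def entropy(p):
    """Shannon entropy (bits) of a Bernoulli hypothesis with P(fault) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def information_gain(prior, posterior):
    return entropy(prior) - entropy(posterior)

# A measurement that sharpens the fault belief carries high gain...
high = information_gain(prior=0.5, posterior=0.95)
# ...while one that leaves the belief unchanged carries none.
none = information_gain(prior=0.5, posterior=0.5)
```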

  5. Tomographic Imaging on Distributed Unattended Ground Sensor Arrays

    Science.gov (United States)

    2007-11-02

    around the next corner, what is upstairs, where is the person in a red jacket, or even what was the person in the red jacket doing 5 minutes ago...cameras and detectors to seismic, acoustic, magnetic, smoke, toxin, and temperature sensors. A working example of just such a network was developed at

  6. A small, lightweight multipollutant sensor system for ground ...

    Science.gov (United States)

    Characterizing highly dynamic, transient, and vertically lofted emissions from open area sources poses unique measurement challenges. This study developed and applied a multipollutant sensor and integrated sampler system for use on mobile applications including tethered balloons (aerostats) and unmanned aerial vehicles (UAVs). The system is particularly applicable to open area sources, such as forest fires, due to its light weight (3.5 kg), compact size (6.75 L), and internal power supply. The sensor system, termed “Kolibri”, consists of sensors measuring CO2 and CO, and samplers for particulate matter (PM) and volatile organic compounds (VOCs). The Kolibri is controlled by a microcontroller which can record and transfer data in real time through a radio module. Selection of the sensors was based on laboratory testing for accuracy, response delay and recovery, cross-sensitivity, and precision. The Kolibri was compared against rack-mounted continuous emissions monitoring system (CEMs) and another mobile sampling instrument (the “Flyer”) that has been used in over ten open area pollutant sampling events. Our results showed that the time series of CO, CO2, and PM2.5 concentrations measured by the Kolibri agreed well with those from the CEMs and the Flyer, with a laboratory-tested percentage error of 4.9%, 3%, and 5.8%, respectively. The VOC emission factors obtained using the Kolibri were consistent with existing literature values that relate concentration

  7. Fusion

    Science.gov (United States)

    Herman, Robin

    1990-10-01

    The book abounds with fascinating anecdotes about fusion's rocky path: the spurious claim by Argentine dictator Juan Peron in 1951 that his country had built a working fusion reactor, the rush by the United States to drop secrecy and publicize its fusion work as a propaganda offensive after the Russian success with Sputnik; the fortune Penthouse magazine publisher Bob Guccione sank into an unconventional fusion device, the skepticism that met an assertion by two University of Utah chemists in 1989 that they had created "cold fusion" in a bottle. Aimed at a general audience, the book describes the scientific basis of controlled fusion--the fusing of atomic nuclei, under conditions hotter than the sun, to release energy. Using personal recollections of scientists involved, it traces the history of this little-known international race that began during the Cold War in secret laboratories in the United States, Great Britain and the Soviet Union, and evolved into an astonishingly open collaboration between East and West.

  8. A NOVEL ALGORITHM OF MULTI-SENSOR IMAGE FUSION BASED ON WAVELET PACKET TRANSFORM

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In order to enhance the image information from multiple sensors and to improve the abilities of information analysis and feature extraction, this letter proposes a new pixel-level fusion approach by means of the Wavelet Packet Transform (WPT). The WPT is able to decompose an image into a low frequency band and high frequency bands at higher scales, offering a more precise method for image analysis than the Wavelet Transform (WT). Firstly, the proposed approach employs the HIS (Hue, Intensity, Saturation) transform to obtain the intensity component of a CBERS (China-Brazil Earth Resource Satellite) multi-spectral image. Then the WPT is employed to decompose the intensity component and a SPOT (Système Pour l'Observation de la Terre) image into low frequency and high frequency bands over three levels. Next, the high frequency coefficients and low frequency coefficients of the two images are combined by linear weighting strategies. Finally, the fused image is obtained with the inverse WPT and inverse HIS transform. The results show the new approach can fuse details of the input images successfully, and thereby obtains a more satisfactory result than the HM (Histogram Matched)-based fusion algorithm and the WT-based fusion approach.
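
    The coefficient-combination step can be illustrated with a toy single-level 2D Haar fusion (the letter uses three-level wavelet packets, an HIS transform, and linear weighting; this sketch swaps in the simpler but common rules of averaging approximations and taking maximum-absolute details):

```python
# Toy wavelet-domain image fusion: one-level 2D Haar transform of two
# grayscale images, averaging the approximation (LL) coefficients and
# merging detail coefficients by maximum absolute value, then inverting.

def haar2d(img):
    """One-level 2D Haar transform of an even-sized grayscale image."""
    ll, lh, hl, hh = [], [], [], []
    for i in range(0, len(img), 2):
        rll, rlh, rhl, rhh = [], [], [], []
        for j in range(0, len(img[0]), 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            rll.append((a + b + c + d) / 2)   # approximation
            rlh.append((a - b + c - d) / 2)   # horizontal detail
            rhl.append((a + b - c - d) / 2)   # vertical detail
            rhh.append((a - b - c + d) / 2)   # diagonal detail
        ll.append(rll); lh.append(rlh); hl.append(rhl); hh.append(rhh)
    return ll, lh, hl, hh

def ihaar2d(bands):
    ll, lh, hl, hh = bands
    img = []
    for i in range(len(ll)):
        r0, r1 = [], []
        for j in range(len(ll[0])):
            s, h, v, d = ll[i][j], lh[i][j], hl[i][j], hh[i][j]
            r0 += [(s + h + v + d) / 2, (s - h + v - d) / 2]
            r1 += [(s + h - v - d) / 2, (s - h - v + d) / 2]
        img += [r0, r1]
    return img

def fuse(img_a, img_b):
    bands_a, bands_b = haar2d(img_a), haar2d(img_b)
    fused = [[[0.5 * (x + y) for x, y in zip(ra, rb)]      # average LL
              for ra, rb in zip(bands_a[0], bands_b[0])]]
    for ba, bb in zip(bands_a[1:], bands_b[1:]):           # max-abs details
        fused.append([[x if abs(x) >= abs(y) else y for x, y in zip(ra, rb)]
                      for ra, rb in zip(ba, bb)])
    return ihaar2d(fused)
```

    Fusing an image with itself reproduces it exactly, which is a handy sanity check for the forward/inverse transform pair.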

  9. Gyro Drift Correction for An Indirect Kalman Filter Based Sensor Fusion Driver

    Directory of Open Access Journals (Sweden)

    Chan-Gun Lee

    2016-06-01

    Full Text Available Sensor fusion techniques have made a significant contribution to the success of the recently emerging mobile applications era, because a variety of mobile applications operate based on multi-sensing information from the surrounding environment, such as navigation systems, fitness trackers, interactive virtual reality games, etc. For these applications, the accuracy of sensing information plays an important role in improving the user experience (UX) quality, especially with gyroscopes and accelerometers. Therefore, in this paper, we propose a novel mechanism to resolve the gyro drift problem, which negatively affects the accuracy of orientation computations in indirect Kalman filter based sensor fusion. Our mechanism focuses on addressing the issues of external feedback loops and non-gyro error elements contained in the state vectors of an indirect Kalman filter. Moreover, the mechanism is implemented in the device-driver layer, providing lower processing latency and transparency for the upper applications. These advances are relevant to millions of legacy applications, since utilizing our mechanism does not require the existing applications to be re-programmed. The experimental results show that our mechanism significantly reduces the root mean square error (RMSE) from 6.3 × 10−1 to 5.3 × 10−7.

  10. Optimization-Based Sensor Fusion of GNSS and IMU Using a Moving Horizon Approach

    Science.gov (United States)

    Girrbach, Fabian; Hol, Jeroen D.; Bellusci, Giovanni; Diehl, Moritz

    2017-01-01

    The rise of autonomous systems operating close to humans imposes new challenges in terms of robustness and precision on the estimation and control algorithms. Approaches based on nonlinear optimization, such as moving horizon estimation, have been shown to improve the accuracy of the estimated solution compared to traditional filter techniques. This paper introduces an optimization-based framework for multi-sensor fusion following a moving horizon scheme. The framework is applied to the often occurring estimation problem of motion tracking by fusing measurements of a global navigation satellite system receiver and an inertial measurement unit. The resulting algorithm is used to estimate position, velocity, and orientation of a maneuvering airplane and is evaluated against an accurate reference trajectory. A detailed study of the influence of the horizon length on the quality of the solution is presented and evaluated against filter-like and batch solutions of the problem. The versatile configuration possibilities of the framework are finally used to analyze the estimated solutions at different evaluation times exposing a nearly linear behavior of the sensor fusion problem. PMID:28534857
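
    The moving horizon scheme above can be illustrated on a deliberately simplified problem: over a window of position measurements, all states in the horizon are estimated at once by minimizing measurement and process residuals. For a linear constant-velocity model this reduces to one least-squares solve; the function name, weights `q` and `r`, and the 1-D setup are illustrative, not the paper's GNSS/IMU formulation.

```python
import numpy as np

def mhe_estimate(z, dt=1.0, q=1.0, r=1.0):
    """Moving-horizon estimate over a window z of position measurements.
    Unknowns are the stacked states [p0, v0, p1, v1, ...]; measurement
    residuals are weighted by 1/r and process residuals by 1/q."""
    n = len(z)
    rows, rhs = [], []
    # Measurement equations: p_k = z_k
    for k in range(n):
        row = np.zeros(2 * n)
        row[2 * k] = 1.0 / r
        rows.append(row); rhs.append(z[k] / r)
    # Process equations: p_{k+1} = p_k + dt*v_k  and  v_{k+1} = v_k
    for k in range(n - 1):
        row = np.zeros(2 * n)
        row[2 * k + 2] = 1 / q; row[2 * k] = -1 / q; row[2 * k + 1] = -dt / q
        rows.append(row); rhs.append(0.0)
        row = np.zeros(2 * n)
        row[2 * k + 3] = 1 / q; row[2 * k + 1] = -1 / q
        rows.append(row); rhs.append(0.0)
    x, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return x.reshape(n, 2)   # rows of (position, velocity)
```

    Sliding this window forward one sample at a time and re-solving gives the moving horizon behavior; a real implementation would use a nonlinear solver with orientation states, as in the paper.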

  11. Gyro Drift Correction for An Indirect Kalman Filter Based Sensor Fusion Driver.

    Science.gov (United States)

    Lee, Chan-Gun; Dao, Nhu-Ngoc; Jang, Seonmin; Kim, Deokhwan; Kim, Yonghun; Cho, Sungrae

    2016-06-11

    Sensor fusion techniques have made a significant contribution to the success of the recently emerging mobile applications era, because a variety of mobile applications operate based on multi-sensing information from the surrounding environment, such as navigation systems, fitness trackers, interactive virtual reality games, etc. For these applications, the accuracy of sensing information plays an important role in improving the user experience (UX) quality, especially with gyroscopes and accelerometers. Therefore, in this paper, we propose a novel mechanism to resolve the gyro drift problem, which negatively affects the accuracy of orientation computations in indirect Kalman filter based sensor fusion. Our mechanism focuses on addressing the issues of external feedback loops and non-gyro error elements contained in the state vectors of an indirect Kalman filter. Moreover, the mechanism is implemented in the device-driver layer, providing lower processing latency and transparency for the upper applications. These advances are relevant to millions of legacy applications, since utilizing our mechanism does not require the existing applications to be re-programmed. The experimental results show that our mechanism significantly reduces the root mean square error (RMSE) from 6.3 × 10−1 to 5.3 × 10−7.
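
    The gyro-drift correction idea in this record can be reduced to a 1-axis sketch: the Kalman filter jointly estimates the orientation error and the gyro bias, and feeds the bias back into the integration so drift does not accumulate. This toy version, with an aiding angle measurement standing in for the accelerometer, uses illustrative noise parameters and is not the paper's device-driver implementation.

```python
import numpy as np

def run_indirect_kf(gyro, aid_angle, dt=0.01, q_a=1e-4, q_b=1e-6, r=1e-2):
    """1-axis Kalman filter over [angle, gyro bias]; the bias estimate is
    fed back each step when integrating the gyro rate."""
    angle, bias = 0.0, 0.0
    P = np.eye(2)                          # error covariance
    F = np.array([[1.0, -dt], [0.0, 1.0]])  # bias couples into the angle
    Q = np.diag([q_a, q_b])
    H = np.array([[1.0, 0.0]])             # aiding sensor observes the angle
    for w, za in zip(gyro, aid_angle):
        angle += (w - bias) * dt           # propagate with bias-corrected rate
        P = F @ P @ F.T + Q
        y = za - angle                     # innovation from aiding sensor
        S = float(H @ P @ H.T) + r
        K = (P @ H.T / S).ravel()
        angle += K[0] * y                  # internal (driver-level) feedback
        bias += K[1] * y
        P = (np.eye(2) - np.outer(K, H.ravel())) @ P
    return angle, bias
```

    With a stationary device and a constantly biased gyro, the bias estimate converges to the true bias while the integrated angle stays near zero, which is the drift-free behavior the record reports.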

  12. Trust Model of Wireless Sensor Networks and Its Application in Data Fusion.

    Science.gov (United States)

    Chen, Zhenguo; Tian, Liqin; Lin, Chuang

    2017-03-28

    In order to ensure the reliability and credibility of data in wireless sensor networks (WSNs), this paper proposes a trust evaluation model and a trust-based data fusion mechanism. We first present the model structure and then the rules for calculating trust. In the trust evaluation model, comprehensive trust consists of three parts: behavior trust, data trust, and historical trust. Data trust is calculated by processing the sensor data; behavior trust is obtained from the behavior of nodes in sensing and forwarding; and historical trust is initialized to the maximum value and updated with comprehensive trust. Comprehensive trust is obtained by weighted calculation, and the model is then used to construct the trust list and guide the process of data fusion. Simulation results indicate that, using the trust model, energy consumption can be reduced by an average of 15%, and the detection rate of abnormal nodes is at least 10% higher than that of the lightweight and dependable trust system (LDTS) model. The model therefore performs well in ensuring the reliability and credibility of the data, while greatly reducing transmission energy consumption.
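
    The weighted-trust structure described above can be sketched directly. The weights, threshold, and function names below are illustrative stand-ins for the paper's calculation rules:

```python
def comprehensive_trust(behavior, data, history, w=(0.4, 0.4, 0.2)):
    """Comprehensive trust as a weighted sum of behavior trust, data
    trust, and historical trust (weights are illustrative)."""
    return w[0] * behavior + w[1] * data + w[2] * history

def trust_weighted_fusion(readings, trusts, threshold=0.5):
    """Fuse only readings from nodes whose trust exceeds the threshold,
    weighting each reading by its trust value."""
    pairs = [(x, t) for x, t in zip(readings, trusts) if t >= threshold]
    total = sum(t for _, t in pairs)
    return sum(x * t for x, t in pairs) / total
```

    A low-trust node reporting an outlier (e.g. a compromised sensor) is simply excluded from the fusion, which is how the trust list guides the data fusion process.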

  13. Approach to explosive hazard detection using sensor fusion and multiple kernel learning with downward-looking GPR and EMI sensor data

    Science.gov (United States)

    Pinar, Anthony; Masarik, Matthew; Havens, Timothy C.; Burns, Joseph; Thelen, Brian; Becker, John

    2015-05-01

    This paper explores the effectiveness of an anomaly detection algorithm for downward-looking ground penetrating radar (GPR) and electromagnetic inductance (EMI) data. Threat detection with GPR is challenged by high responses to non-target/clutter objects, leading to a large number of false alarms (FAs); because the responses of target and clutter signatures are so similar, classifier design is not trivial. We suggest a method based on a Run Packing (RP) algorithm to fuse GPR and EMI data into a composite confidence map to improve detection as measured by the normalized area under the ROC curve (NAUC). We examine the value of a multiple kernel learning (MKL) support vector machine (SVM) classifier using image features such as histogram of oriented gradients (HOG), local binary patterns (LBP), and local statistics. Experimental results on government furnished data show that use of our proposed fusion and classification methods improves the NAUC when compared with the results from individual sensors and a single kernel SVM classifier.
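
    The core fusion step, combining two per-pixel confidence maps into one composite map, can be sketched with a simple normalized weighted combination. This is a generic stand-in for the paper's Run Packing fusion, and the weight is illustrative:

```python
import numpy as np

def normalize(conf):
    """Min-max normalize a confidence map to [0, 1]."""
    lo, hi = conf.min(), conf.max()
    return (conf - lo) / (hi - lo) if hi > lo else np.zeros_like(conf)

def fuse_confidence(gpr, emi, w_gpr=0.6):
    """Per-pixel weighted combination of the two normalized confidence
    maps (a simplified stand-in for Run Packing fusion)."""
    return w_gpr * normalize(gpr) + (1 - w_gpr) * normalize(emi)
```

    Locations where both sensors respond strongly keep a high composite confidence, while clutter that excites only one sensor is attenuated, which is the mechanism by which fusion reduces false alarms.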

  14. Pose estimation of surgical instrument using sensor data fusion with optical tracker and IMU based on Kalman filter

    Directory of Open Access Journals (Sweden)

    Oh Hyunmin

    2015-01-01

    Full Text Available A tracking system is essential for Image Guided Surgery (IGS). The Optical Tracking Sensor (OTS) has been widely used as the tracking system for IGS due to its high accuracy and ease of use. However, the OTS has the limitation that tracking fails when the marker is occluded. In this paper, sensor fusion of the OTS and an Inertial Measurement Unit (IMU) is proposed to solve this problem. The proposed algorithm improves the accuracy of the tracking system by eliminating sensor scattering error, and compensates for the respective disadvantages of the OTS and IMU through Kalman filter based sensor fusion. A coordinate-axis calibration method that further improves accuracy is also introduced. Experiments verify the effectiveness of the proposed algorithm.

  15. Temporal Pattern Recognition: A Network Architecture For Multi-Sensor Fusion

    Science.gov (United States)

    Priebe, C. E.; Marchette, D. J.

    1989-03-01

    A self-organizing network architecture for the learning and recognition of temporal patterns is proposed. This multi-layered architecture has as its focal point a layer of multi-dimensional Gaussian classification nodes, and the learning scheme employed is based on standard statistical moving mean and moving covariance calculations. The nodes are implemented in the network architecture by using a Gaussian, rather than sigmoidal, transfer function acting on the input from numerous connections. Each connection is analogous to a separate dimension for the Gaussian function. The learning scheme is a one-pass method, eliminating the need for repetitive presentation of the teaching stimuli. The Gaussian classes developed are representative of the statistics of the teaching data and act as templates in classifying novel inputs. The input layer employs a time-based decay to develop a time-ordered representation of the input stimuli. This temporal pattern recognition architecture is used to perform multi-sensor fusion and scene analysis for ROBART II, an autonomous sentry robot employing heterogeneous and homogeneous binary (on / off) sensors. The system receives sensor packets from ROBART indicating which sensors are active. The packets from various sensors are integrated in the input layer. As time progresses these sensor outputs become ordered, allowing the system to recognize activities which are dependent, not only on the individual events which make up the activity, but also on the order in which these events occur and their relative spacing throughout time. Each Gaussian classification node, representing a learned activity as an ordered sequence of sensor outputs, calculates its activation value independently, based on the activity in the input layer. These Gaussian activation values are then used to determine which, if any, of the learned sequences are present and with what confidence. The classification system is capable of recognizing activities despite missing
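
    The one-pass learning scheme for the Gaussian classification nodes described above (running mean and covariance, then a Gaussian rather than sigmoidal transfer function) can be sketched as follows. The class name and the small regularization term are illustrative:

```python
import numpy as np

class GaussianNode:
    """One-pass learning of a multi-dimensional Gaussian class template:
    a moving mean and moving covariance (Welford's update), with a
    Gaussian transfer function as the node's activation."""
    def __init__(self, dim):
        self.n = 0
        self.mean = np.zeros(dim)
        self.m2 = np.zeros((dim, dim))  # sum of outer products of deviations

    def learn(self, x):
        """Single-pass update: no repeated presentation of stimuli needed."""
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += np.outer(d, x - self.mean)

    def activation(self, x):
        """Gaussian transfer function; 1.0 at the learned template."""
        cov = self.m2 / max(self.n - 1, 1) + 1e-6 * np.eye(len(self.mean))
        d = x - self.mean
        return float(np.exp(-0.5 * d @ np.linalg.solve(cov, d)))
```

    Each learned activity corresponds to one such node; at recognition time the node with the highest activation (if any exceeds a confidence threshold) indicates which learned sequence is present.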

  16. An Air-Ground Wireless Sensor Network for Crop Monitoring

    Directory of Open Access Journals (Sweden)

    Claudio Rossi

    2011-06-01

    Full Text Available This paper presents a collaborative system made up of a Wireless Sensor Network (WSN and an aerial robot, which is applied to real-time frost monitoring in vineyards. The core feature of our system is a dynamic mobile node carried by an aerial robot, which ensures communication between sparse clusters located at fragmented parcels and a base station. This system overcomes some limitations of the wireless networks in areas with such characteristics. The use of a dedicated communication channel enables data routing to/from unlimited distances.

  17. Multi sensor fusion framework for indoor-outdoor localization of limited resource mobile robots.

    Science.gov (United States)

    Marín, Leonardo; Vallés, Marina; Soriano, Ángel; Valera, Ángel; Albertos, Pedro

    2013-10-21

    This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information, when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorm NXT, and consist of a differential wheel mobile robot navigating indoors with a zenithal camera as global sensor, and an Ackermann steering mobile robot navigating outdoors with a SBG Systems GPS accessed through an IGEP board that also serves as datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from Hitecnic, placed according to a particle based dynamic model of the robots. The tests performed reflect the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments.
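
    The event-triggering rule, predicting with the cheap local sensor every step and fusing the global measurement only when the estimation error covariance exceeds a predefined limit, can be sketched in one dimension. The parameter values and function name are illustrative:

```python
def event_based_kf(odom, global_meas, p_limit=0.5, q=0.01, r=0.05):
    """1-D event-based fusion: predict with the local (odometry/IMU)
    measurement every step; request and fuse the global measurement only
    when the error variance P exceeds p_limit, saving bandwidth."""
    x, P = 0.0, 0.0
    events = 0
    for u, z in zip(odom, global_meas):
        x += u                    # prediction with the local sensor
        P += q
        if P > p_limit:           # event: global information is needed
            K = P / (P + r)
            x += K * (z - x)
            P *= (1 - K)
            events += 1
    return x, P, events
```

    With accurate odometry the number of events is a small fraction of the number of samples, which is where the execution-time and bandwidth savings come from.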

  18. Fusion of Haptic and Gesture Sensors for Rehabilitation of Bimanual Coordination and Dexterous Manipulation

    Directory of Open Access Journals (Sweden)

    Ningbo Yu

    2016-03-01

    Full Text Available Disabilities after neural injury, such as stroke, bring tremendous burden to patients, families and society. Besides conventional constraint-induced training with the paretic arm, bilateral rehabilitation training involves both the ipsilateral and contralateral sides of the neural injury, fits well with the fact that both arms are needed in common activities of daily living (ADLs), and can promote good functional recovery. In this work, the fusion of a gesture sensor and a haptic sensor with force feedback capabilities has enabled a bilateral rehabilitation training therapy. The Leap Motion gesture sensor detects the motion of the healthy hand, and the omega.7 device can detect and assist the paretic hand, according to the designed cooperative task paradigm, as much as needed, with active force feedback to accomplish the manipulation task. A virtual scenario has been built up, and the motion and force data facilitate instantaneous visual and audio feedback, as well as further analysis of the functional capabilities of the patient. This task-oriented bimanual training paradigm recruits the sensory, motor and cognitive aspects of the patient into one loop, encourages the active involvement of patients in rehabilitation training, strengthens the cooperation of both the healthy and impaired hands, challenges the dexterous manipulation capability of the paretic hand, suits ease of use at home or in centralized institutions and, thus, shows promising potential for rehabilitation training.

  19. Motion-sensor fusion-based gesture recognition and its VLSI architecture design for mobile devices

    Science.gov (United States)

    Zhu, Wenping; Liu, Leibo; Yin, Shouyi; Hu, Siqi; Tang, Eugene Y.; Wei, Shaojun

    2014-05-01

    With the rapid proliferation of smartphones and tablets, various embedded sensors are incorporated into these platforms to enable multimodal human-computer interfaces. Gesture recognition, as an intuitive interaction approach, has been extensively explored in the mobile computing community. However, most gesture recognition implementations to date are user-dependent and rely only on the accelerometer. In order to achieve competitive accuracy, users are required to hold the devices in a predefined manner during operation. In this paper, a high-accuracy human gesture recognition system is proposed based on multiple motion sensor fusion. Furthermore, to reduce the energy overhead resulting from frequent sensor sampling and data processing, a highly energy-efficient VLSI architecture implemented on a Xilinx Virtex-5 FPGA board is also proposed. Compared with the pure software implementation, a speed-up of approximately 45 times is achieved while operating at 20 MHz. The experiments show that the average accuracy for 10 gestures reaches 93.98% for the user-independent case and 96.14% for the user-dependent case when subjects hold the device randomly while completing the specified gestures. Although a few percent lower than the conventional best result, it still provides competitive accuracy acceptable for practical usage. Most importantly, the proposed system allows users to hold the device randomly while performing the predefined gestures, which substantially enhances the user experience.

  20. Fusion of Haptic and Gesture Sensors for Rehabilitation of Bimanual Coordination and Dexterous Manipulation.

    Science.gov (United States)

    Yu, Ningbo; Xu, Chang; Li, Huanshuai; Wang, Kui; Wang, Liancheng; Liu, Jingtai

    2016-03-18

    Disabilities after neural injury, such as stroke, bring tremendous burden to patients, families and society. Besides conventional constraint-induced training with the paretic arm, bilateral rehabilitation training involves both the ipsilateral and contralateral sides of the neural injury, fits well with the fact that both arms are needed in common activities of daily living (ADLs), and can promote good functional recovery. In this work, the fusion of a gesture sensor and a haptic sensor with force feedback capabilities has enabled a bilateral rehabilitation training therapy. The Leap Motion gesture sensor detects the motion of the healthy hand, and the omega.7 device can detect and assist the paretic hand, according to the designed cooperative task paradigm, as much as needed, with active force feedback to accomplish the manipulation task. A virtual scenario has been built up, and the motion and force data facilitate instantaneous visual and audio feedback, as well as further analysis of the functional capabilities of the patient. This task-oriented bimanual training paradigm recruits the sensory, motor and cognitive aspects of the patient into one loop, encourages the active involvement of patients in rehabilitation training, strengthens the cooperation of both the healthy and impaired hands, challenges the dexterous manipulation capability of the paretic hand, suits ease of use at home or in centralized institutions and, thus, shows promising potential for rehabilitation training.

  1. Fusion of Haptic and Gesture Sensors for Rehabilitation of Bimanual Coordination and Dexterous Manipulation

    Science.gov (United States)

    Yu, Ningbo; Xu, Chang; Li, Huanshuai; Wang, Kui; Wang, Liancheng; Liu, Jingtai

    2016-01-01

    Disabilities after neural injury, such as stroke, bring tremendous burden to patients, families and society. Besides conventional constraint-induced training with the paretic arm, bilateral rehabilitation training involves both the ipsilateral and contralateral sides of the neural injury, fits well with the fact that both arms are needed in common activities of daily living (ADLs), and can promote good functional recovery. In this work, the fusion of a gesture sensor and a haptic sensor with force feedback capabilities has enabled a bilateral rehabilitation training therapy. The Leap Motion gesture sensor detects the motion of the healthy hand, and the omega.7 device can detect and assist the paretic hand, according to the designed cooperative task paradigm, as much as needed, with active force feedback to accomplish the manipulation task. A virtual scenario has been built up, and the motion and force data facilitate instantaneous visual and audio feedback, as well as further analysis of the functional capabilities of the patient. This task-oriented bimanual training paradigm recruits the sensory, motor and cognitive aspects of the patient into one loop, encourages the active involvement of patients in rehabilitation training, strengthens the cooperation of both the healthy and impaired hands, challenges the dexterous manipulation capability of the paretic hand, suits ease of use at home or in centralized institutions and, thus, shows promising potential for rehabilitation training. PMID:26999149

  2. A weighted optimization approach to time-of-flight sensor fusion.

    Science.gov (United States)

    Schwarz, Sebastian; Sjostrom, Marten; Olsson, Roger

    2014-01-01

    Acquiring scenery depth is a fundamental task in computer vision, with many applications in manufacturing, surveillance, or robotics relying on accurate scenery information. Time-of-flight cameras can provide depth information in real-time and overcome shortcomings of traditional stereo analysis. However, they provide limited spatial resolution, and sophisticated upscaling algorithms are sought after. In this paper, we present a sensor fusion approach to time-of-flight super resolution, based on the combination of depth and texture sources. Unlike other texture guided approaches, we interpret the depth upscaling process as a weighted energy optimization problem. Three different weights are introduced, employing different available sensor data. The individual weights address object boundaries in depth, depth sensor noise, and temporal consistency. Applied in consecutive order, they form three weighting strategies for time-of-flight super resolution. Objective evaluations show advantages in depth accuracy and for depth image based rendering compared with state-of-the-art depth upscaling. Subjective view synthesis evaluation shows a significant increase in viewer preference, by a factor of four, in stereoscopic viewing conditions. To the best of our knowledge, this is the first extensive subjective test performed on time-of-flight depth upscaling. Objective and subjective results prove the suitability of our approach to time-of-flight super resolution for depth scenery capture.
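
    The "depth upscaling as a weighted energy optimization" view can be sketched with one texture-guided smoothness weight: each upscaled depth pixel balances a data term (stay near the upsampled low-resolution depth) against neighbor agreement, down-weighted across edges in the guide image. The solver, weight function, and parameters below are an illustrative simplification, not the paper's three-weight scheme:

```python
import numpy as np

def upscale_depth(depth_lr, guide, scale=2, lam=0.1, iters=50):
    """Texture-guided depth upscaling as a weighted least-squares energy:
    data term ties pixels to the nearest-neighbour upsampled depth, the
    smoothness term is attenuated across guide-image intensity edges.
    Minimized with simple Jacobi-style sweeps."""
    d = np.kron(depth_lr, np.ones((scale, scale)))  # initial upsampling
    h, w = d.shape
    target = d.copy()
    for _ in range(iters):
        dn = d.copy()
        for i in range(h):
            for j in range(w):
                acc, wsum = target[i, j], 1.0
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        # small weight across strong guide-image edges
                        wgt = lam * np.exp(-abs(guide[i, j] - guide[ni, nj]))
                        acc += wgt * d[ni, nj]
                        wsum += wgt
                dn[i, j] = acc / wsum
        d = dn
    return d
```

    The paper's two further weights (sensor noise and temporal consistency) would enter the same energy as additional per-pixel terms.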

  3. Multi Sensor Fusion Framework for Indoor-Outdoor Localization of Limited Resource Mobile Robots

    Directory of Open Access Journals (Sweden)

    Pedro Albertos

    2013-10-01

    Full Text Available This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event based Kalman Filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information, when the estimation error covariance exceeds a predefined limit. The proposed experimental platforms are based on the LEGO Mindstorm NXT, and consist of a differential wheel mobile robot navigating indoors with a zenithal camera as global sensor, and an Ackermann steering mobile robot navigating outdoors with a SBG Systems GPS accessed through an IGEP board that also serves as datalogger. The IMU in both robots is built using the NXT motor encoders along with one gyroscope, one compass and two accelerometers from Hitecnic, placed according to a particle based dynamic model of the robots. The tests performed reflect the correct performance and low execution time of the proposed framework. Robustness and stability are observed during a long walk test in both indoor and outdoor environments.

  4. Potential use of ground-based sensor technologies for weed detection.

    Science.gov (United States)

    Peteinatos, Gerassimos G; Weis, Martin; Andújar, Dionisio; Rueda Ayala, Victor; Gerhards, Roland

    2014-02-01

    Site-specific weed management is the part of precision agriculture (PA) that tries to effectively control weed infestations with the least economical and environmental burdens. This can be achieved with the aid of ground-based or near-range sensors in combination with decision rules and precise application technologies. Near-range sensor technologies, developed for mounting on a vehicle, have been emerging for PA applications during the last three decades. These technologies focus on identifying plants and measuring their physiological status with the aid of their spectral and morphological characteristics. Cameras, spectrometers, fluorometers and distance sensors are the most prominent sensors for PA applications. The objective of this article is to describe ground-based sensors that have the potential to be used for weed detection and for measuring the level of weed infestation. An overview of current sensor systems is presented, describing their concepts, the results that have been achieved, commercial systems already in use, and problems that persist. A perspective for the development of these sensors is given. © 2013 Society of Chemical Industry.

  5. Classification of paddy rice through multi-temporal multi-sensor data fusion

    Science.gov (United States)

    Im, Jungho; Park, Seonyoung

    2017-04-01

    Rice is one of the most important food resources in the world, and its consumption continues to increase with the growing world population. Accurate paddy rice mapping and monitoring are crucial for food security and agricultural mitigation because they enable us to forecast rice production. There have been studies of paddy rice classification using optical sensor data. However, optical sensor data are limited in acquisition due to cloud contamination. Active Synthetic Aperture Radar (SAR) data have been used to complement the cloud problems of optical sensor images. Integration of multispectral and SAR data can produce more reliable crop classification results than a single sensor alone. In addition, as paddy rice has a distinct phenology, many studies have used phenology features from multi-temporal data for detecting paddy rice. Thus, this study aims at mapping paddy rice by expanding the spectral and temporal dimensions of the data. In this study, we conducted paddy rice classification through fusion of multi-temporal optical sensor (Landsat) and SAR (RADARSAT-1 and ALOS PALSAR) data using two machine learning approaches, random forest (RF) and support vector machines (SVM), over two study sites (Dangjin-si in South Korea and Sutter County, California in the United States). This study examined six scenarios to identify the effect of the expansion of data dimension. Each scenario has a different combination of data sources and seasonal characteristics. We examined variable importance to identify which sensor data collected in which season are important for classifying paddy rice. In addition, this study proposed a new index called the Paddy rice Mapping Index (PMI) for effective paddy rice classification considering the spectral and temporal characteristics of paddy rice. Scenario 6, which uses multi-temporal optical sensor and SAR data, showed the highest overall accuracy (site 1: 98.67%; site 2: 93.87%) for paddy rice classification among the six scenarios. Both machine

  6. Enviro-Net: From Networks of Ground-Based Sensor Systems to a Web Platform for Sensor Data Management

    Directory of Open Access Journals (Sweden)

    Mario A. Nascimento

    2011-06-01

    Full Text Available Ecosystems monitoring is essential to properly understand their development and the effects of events, both climatological and anthropological in nature. The amount of data used in these assessments is increasing at very high rates. This is due to increasing availability of sensing systems and the development of new techniques to analyze sensor data. The Enviro-Net Project encompasses several of such sensor system deployments across five countries in the Americas. These deployments use a few different ground-based sensor systems, installed at different heights monitoring the conditions in tropical dry forests over long periods of time. This paper presents our experience in deploying and maintaining these systems, retrieving and pre-processing the data, and describes the Web portal developed to help with data management, visualization and analysis.

  7. 3D-information fusion from very high resolution satellite sensors

    Science.gov (United States)

    Krauss, T.; d'Angelo, P.; Kuschk, G.; Tian, J.; Partovi, T.

    2015-04-01

    In this paper we show the pre-processing and potential for environmental applications of very high resolution (VHR) satellite stereo imagery such as that from WorldView-2 or Pléiades, with ground sampling distances (GSD) of half a metre to a metre. To process such data, first a dense digital surface model (DSM) has to be generated. Afterwards, from this a digital terrain model (DTM) representing the ground and a so-called normalized digital elevation model (nDEM) representing off-ground objects are derived. Combining these elevation-based data with a spectral classification allows detection and extraction of objects from the satellite scenes. Besides object extraction, the DSM and DTM can be used directly for simulation and monitoring of environmental issues. Examples are the simulation of flooding, building-volume and population estimation, simulation of noise from roads, wave propagation for cellphones, wind and light for estimating renewable energy sources, 3D change detection, earthquake preparedness and crisis relief, urban development and the sprawl of informal settlements, and much more. Also outside of urban areas, volume information brings literally a new dimension to Earth observation tasks, such as volume estimation of forests and illegal logging, volume of (illegal) open pit mining activities, estimation of flooding or tsunami risks, dike planning, etc. In this paper we present the pre-processing from the original level-1 satellite data to digital surface models (DSMs), corresponding VHR ortho images and derived digital terrain models (DTMs). From these components we show how a monitoring and decision fusion based 3D change detection can be realized by using different acquisitions. The results are analyzed and assessed to derive quality parameters for the presented method. Finally, the usability of 3D information fusion from VHR satellite imagery is discussed and evaluated.
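
    The DSM/DTM/nDEM relationship described above is simple to state: the nDEM is the DSM minus the DTM, and thresholding it yields an off-ground object mask. The sketch below also includes a deliberately crude DTM estimate (a moving-window minimum of the DSM); real pipelines use far more robust ground filtering, and all names and thresholds are illustrative:

```python
import numpy as np

def estimate_dtm(dsm, win=3):
    """Crude DTM: moving-window minimum of the DSM, so isolated
    off-ground objects are suppressed (illustrative only)."""
    h, w = dsm.shape
    pad = win // 2
    padded = np.pad(dsm, pad, mode='edge')
    dtm = np.empty_like(dsm)
    for i in range(h):
        for j in range(w):
            dtm[i, j] = padded[i:i + win, j:j + win].min()
    return dtm

def ndem(dsm, dtm, min_height=2.0):
    """Normalized DEM (off-ground heights) plus an object mask for
    anything taller than min_height above the terrain."""
    n = dsm - dtm
    return n, n > min_height
```

    The object mask, intersected with a spectral classification, is what enables per-object applications such as building-volume estimation or 3D change detection.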

  8. Geometric calibration of multi-sensor image fusion system with thermal infrared and low-light camera

    Science.gov (United States)

    Peric, Dragana; Lukic, Vojislav; Spanovic, Milana; Sekulic, Radmila; Kocic, Jelena

    2014-10-01

    A calibration platform for geometric calibration of a multi-sensor image fusion system is presented in this paper. Accurate geometric calibration of the extrinsic parameters of the cameras is performed using a planar calibration pattern, and specific software was developed for the calibration procedure. The patterns used in geometric calibration are designed to obtain maximum contrast in both the visible and infrared spectral ranges, using chessboards whose fields are made of materials with different emissivities. Experiments were conducted in both indoor and outdoor scenarios. The key results of geometric calibration for the multi-sensor image fusion system are the extrinsic parameters in the form of homography matrices, used for homography transformation from the object plane to the image plane. For each camera a corresponding homography matrix is calculated; these matrices can be used for registration of images from the thermal and low-light cameras. We implemented such an image registration algorithm to confirm the accuracy of the geometric calibration procedure in the multi-sensor image fusion system, and report results for the selected patterns. For the final image registration algorithm in the surveillance system for object tracking, we chose a multi-resolution image registration algorithm, which combines naturally with a pyramidal fusion scheme: the image pyramids generated at each time step of the registration algorithm may be reused at the fusion stage, so that the overall number of calculations that must be performed is greatly reduced.
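
    The homography machinery this record relies on can be sketched with the standard (plain, unnormalized) DLT algorithm: estimate the 3×3 matrix from point correspondences on the planar pattern, then use it to map points from one camera's image plane to the other's. Function names are illustrative; production code would use a library routine such as OpenCV's findHomography:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography mapping src -> dst from >= 4 point
    pairs via the plain DLT algorithm (SVD null vector)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]          # fix the projective scale

def warp_points(H, pts):
    """Apply homography H to 2-D points (registration of one image's
    coordinates into the other's)."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]
```

    In the fusion system, the chessboard corners detected in the thermal and low-light images play the role of `src` and `dst`, and the resulting matrix registers one sensor's image onto the other before pyramidal fusion.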

  9. An Adaptive Multi-Sensor Data Fusion Method Based on Deep Convolutional Neural Networks for Fault Diagnosis of Planetary Gearbox

    Science.gov (United States)

    Jing, Luyang; Wang, Taiyong; Zhao, Ming; Wang, Peng

    2017-01-01

    A fault diagnosis approach based on multi-sensor data fusion is a promising tool to deal with complicated damage detection problems of mechanical systems. Nevertheless, this approach suffers from two challenges, which are (1) the feature extraction from various types of sensory data and (2) the selection of a suitable fusion level. It is usually difficult to choose an optimal feature or fusion level for a specific fault diagnosis task, and extensive domain expertise and human labor are also highly required during these selections. To address these two challenges, we propose an adaptive multi-sensor data fusion method based on deep convolutional neural networks (DCNN) for fault diagnosis. The proposed method can learn features from raw data and optimize a combination of different fusion levels adaptively to satisfy the requirements of any fault diagnosis task. The proposed method is tested through a planetary gearbox test rig. Handcrafted features, manually selected fusion levels, single sensory data, and two traditional intelligent models, back-propagation neural networks (BPNN) and a support vector machine (SVM), are used as comparisons in the experiment. The results demonstrate that the proposed method is able to detect the conditions of the planetary gearbox effectively, with the best diagnosis accuracy among all comparative methods in the experiment. PMID:28230767

  10. A New Multi-Sensor Fusion Scheme to Improve the Accuracy of Knee Flexion Kinematics for Functional Rehabilitation Movements

    Science.gov (United States)

    Tannous, Halim; Istrate, Dan; Benlarbi-Delai, Aziz; Sarrazin, Julien; Gamet, Didier; Ho Ba Tho, Marie Christine; Dao, Tien Tuan

    2016-01-01

    Exergames have been proposed as a potential tool to improve the current practice of musculoskeletal rehabilitation. Inertial or optical motion capture sensors are commonly used to track the subject's movements. However, these motion capture tools suffer from a lack of accuracy in estimating joint angles, which could lead to wrong data interpretation. In this study, we proposed a real-time, quaternion-based fusion scheme, based on the extended Kalman filter, between inertial and visual motion capture sensors, to improve the estimation accuracy of joint angles. The fusion outcome was compared to angles measured using a goniometer. The fusion output shows a better estimate than the inertial measurement unit and Kinect outputs, with a smaller error (3.96°) than that obtained using inertial sensors alone (5.04°). The proposed multi-sensor fusion system is therefore accurate enough to be applied, in future work, to our serious game for musculoskeletal rehabilitation. PMID:27854288
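
    The benefit of fusing the two modalities can be seen in a scalar stand-in for the quaternion-based EKF: a minimum-variance blend of two independent angle estimates (the readings and variances below are invented for illustration, not the paper's values):

```python
def fuse_angles(theta_imu, var_imu, theta_cam, var_cam):
    # Minimum-variance fusion of two independent estimates of the
    # same knee flexion angle; the fused variance is always smaller
    # than either input variance.
    k = var_imu / (var_imu + var_cam)
    theta = theta_imu + k * (theta_cam - theta_imu)
    var = var_imu * var_cam / (var_imu + var_cam)
    return theta, var

# Hypothetical readings: IMU says 42 deg, Kinect says 46 deg.
theta, var = fuse_angles(42.0, 25.0, 46.0, 9.0)
```

    The fused estimate lands between the two sensors, pulled toward the less noisy one, which is the same mechanism that lets the EKF beat either sensor on its own.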

  11. Real-Time Classification and Sensor Fusion with a Spiking Deep Belief Network

    Directory of Open Access Journals (Sweden)

    Peter eO'Connor

    2013-10-01

    Full Text Available Deep Belief Networks (DBNs) have recently shown impressive performance on a broad range of classification problems. Their generative properties allow better understanding of the performance, and provide a simpler solution for sensor fusion tasks. However, because of their inherent need for feedback and parallel update of large numbers of units, DBNs are expensive to implement on serial computers. This paper proposes a method based on the Siegert approximation for Integrate-and-Fire neurons to map an offline-trained DBN onto an efficient event-driven spiking neural network suitable for hardware implementation. The method is demonstrated in simulation and by a real-time implementation of a 3-layer network with 2694 neurons used for visual classification of MNIST handwritten digits with input from a 128x128 Dynamic Vision Sensor (DVS) silicon retina, and sensory-fusion using additional input from a 64-channel AER-EAR silicon cochlea. The system is implemented through the open-source software in the jAER project and runs in real-time on a laptop computer. It is demonstrated that the system can recognize digits in the presence of distractions, noise, scaling, translation and rotation, and that the degradation of recognition performance by using an event-based approach is less than 1%. Recognition is achieved in an average of 5.8 ms after the onset of the presentation of a digit. By cue integration from both silicon retina and cochlea outputs we show that the system can be biased to select the correct digit from otherwise ambiguous input.

  12. ADS-B and multilateration sensor fusion algorithm for air traffic control

    Science.gov (United States)

    Liang, Mengchen

    Air traffic is expected to increase rapidly in the next decade, but the current Air Traffic Control (ATC) system does not meet future demands for safety and efficiency. The Next Generation Air Transportation System (NextGen) is a transformation program for the ATC system in the United States. The latest estimates by the Federal Aviation Administration (FAA) show that by 2018 NextGen will reduce total flight delays by 35 percent and provide 23 billion dollars in cumulative benefits. A satellite-based technology called the Automatic Dependent Surveillance-Broadcast (ADS-B) system is one of the most important elements in NextGen. The FAA expects that ADS-B systems will be available in the National Airspace System (NAS) by 2020. However, an alternative surveillance system is needed because of vulnerabilities in ADS-B. Multilateration offers high accuracy and is considered an ideal back-up strategy for ADS-B systems. Thus, in this study, we develop an ADS-B and multilateration sensor fusion algorithm for aircraft tracking applications in ATC. The algorithm contains a fault detection function for monitoring ADS-B information, using Trajectory Change Point reports from ADS-B and numerical vectors from a hybrid estimation algorithm. We consider two types of faults in the ADS-B measurement model to show that the algorithm can cope with bad data from ADS-B systems and automatically select good data from multilateration systems. We apply fuzzy logic concepts to generate time-variant parameters during the fusion process; these parameters act as weights for combining data from the different sensors. The algorithm's performance is validated through two aircraft tracking examples.
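
    The fuzzy, time-variant weighting idea can be sketched as follows. The membership thresholds, positions and units below are invented for illustration; the thesis's actual membership functions are not reproduced here:

```python
def trust_weight(residual, full_trust=50.0, no_trust=200.0):
    # Fuzzy-style membership for ADS-B: full weight while the
    # innovation (residual, in meters) is small, falling linearly
    # to zero as it approaches an assumed fault threshold.
    r = abs(residual)
    if r <= full_trust:
        return 1.0
    if r >= no_trust:
        return 0.0
    return (no_trust - r) / (no_trust - full_trust)

def fuse_position(adsb_pos, mlat_pos, residual):
    # Weighted blend: faulty ADS-B data shifts the estimate toward
    # the multilateration solution automatically.
    w = trust_weight(residual)
    return tuple(w * a + (1.0 - w) * m for a, m in zip(adsb_pos, mlat_pos))
```

    A large residual (e.g. an ADS-B fault) drives the weight to zero, so the fused track falls back entirely on multilateration, which is the behavior the abstract describes.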

  13. Multiple sensor detection of process phenomena in laser powder bed fusion

    Science.gov (United States)

    Lane, Brandon; Whitenton, Eric; Moylan, Shawn

    2016-05-01

    Laser powder bed fusion (LPBF) is an additive manufacturing (AM) process in which a high power laser melts metal powder layers into complex, three-dimensional shapes. LPBF parts are known to exhibit relatively high residual stresses, anisotropic microstructure, and a variety of defects. To mitigate these issues, in-situ measurements of melt-pool phenomena may illuminate relationships between part quality and process signatures. However, phenomena such as spatter, plume formation, laser modulation, and melt-pool oscillations may require data acquisition rates exceeding 10 kHz. This hinders the use of relatively data-intensive, streaming imaging sensors in a real-time monitoring and feedback control system. Single-point sensors such as photodiodes provide the temporal bandwidth to capture process signatures, while providing little spatial information. This paper presents results from experiments conducted on a commercial LPBF machine with synchronized, in-situ acquisition of a thermal camera, a high-speed visible camera, a photodiode, and the laser modulation signal during fabrication of a nickel alloy 625 AM part with an overhang geometry. Data from the thermal camera provide temperature information, the visible camera provides observation of spatter, and the photodiode signal provides high-temporal-bandwidth relative brightness from the melt pool region. In addition, joint-time frequency analysis (JTFA) was performed on the photodiode signal. The JTFA results indicate what digital filtering and signal processing are required to highlight particular signatures. Image fusion of the synchronized data obtained over multiple build layers allows visual comparison between the photodiode signal and related phenomena observed in the imaging detectors.
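
    A minimal JTFA sketch: a Hann-windowed short-time Fourier transform is one plausible implementation. The 50 kHz sampling rate and the synthetic tones standing in for a real photodiode trace are assumptions, not the paper's data:

```python
import numpy as np

def spectrogram(signal, win=256, hop=128):
    # Minimal joint time-frequency analysis: magnitude of a
    # Hann-windowed short-time Fourier transform.
    window = np.hanning(win)
    frames = [signal[i:i + win] * window
              for i in range(0, len(signal) - win + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))  # (n_frames, win // 2 + 1)

fs = 50_000                      # assumed sampling rate
t = np.arange(fs) / fs           # one second of synthetic signal
melt_pool = np.sin(2 * np.pi * 1000 * t)        # 1 kHz "oscillation"
laser_mod = 0.5 * np.sin(2 * np.pi * 5000 * t)  # 5 kHz "modulation"
S = spectrogram(melt_pool + laser_mod)
freqs = np.fft.rfftfreq(256, 1 / fs)            # frequency axis of the bins
```

    The time-averaged spectrum peaks near 1 kHz, showing which band a digital filter would have to isolate to separate that signature from the modulation tone.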

  14. Real-time classification and sensor fusion with a spiking deep belief network

    Science.gov (United States)

    O'Connor, Peter; Neil, Daniel; Liu, Shih-Chii; Delbruck, Tobi; Pfeiffer, Michael

    2013-01-01

    Deep Belief Networks (DBNs) have recently shown impressive performance on a broad range of classification problems. Their generative properties allow better understanding of the performance, and provide a simpler solution for sensor fusion tasks. However, because of their inherent need for feedback and parallel update of large numbers of units, DBNs are expensive to implement on serial computers. This paper proposes a method based on the Siegert approximation for Integrate-and-Fire neurons to map an offline-trained DBN onto an efficient event-driven spiking neural network suitable for hardware implementation. The method is demonstrated in simulation and by a real-time implementation of a 3-layer network with 2694 neurons used for visual classification of MNIST handwritten digits with input from a 128 × 128 Dynamic Vision Sensor (DVS) silicon retina, and sensory-fusion using additional input from a 64-channel AER-EAR silicon cochlea. The system is implemented through the open-source software in the jAER project and runs in real-time on a laptop computer. It is demonstrated that the system can recognize digits in the presence of distractions, noise, scaling, translation and rotation, and that the degradation of recognition performance by using an event-based approach is less than 1%. Recognition is achieved in an average of 5.8 ms after the onset of the presentation of a digit. By cue integration from both silicon retina and cochlea outputs we show that the system can be biased to select the correct digit from otherwise ambiguous input. PMID:24115919

  15. Pose estimation of surgical instrument using sensor data fusion with optical tracker and IMU based on Kalman filter

    OpenAIRE

    Oh Hyunmin; Chae You Seong; An Jinung; Kim Min Young

    2015-01-01

    Tracking systems are essential for Image Guided Surgery (IGS). The Optical Tracking Sensor (OTS) has been widely used as a tracking system for IGS due to its high accuracy and easy usage. However, OTS has a limitation: tracking fails when occlusion of the marker occurs. In this paper, sensor fusion with OTS and an Inertial Measurement Unit (IMU) is proposed to solve this problem. The proposed algorithm improves the accuracy of the tracking system by eliminating scattering error of the sensor and supplements the...

  16. Spatial Aspects of Multi-Sensor Data Fusion: Aerosol Optical Thickness

    Science.gov (United States)

    Leptoukh, Gregory; Zubko, V.; Gopalan, A.

    2007-01-01

    The Goddard Earth Sciences Data and Information Services Center (GES DISC) investigated the applicability and limitations of combining multi-sensor data through data fusion, to increase the usefulness of the multitude of NASA remote sensing data sets, and as part of a larger effort to integrate this capability in the GES-DISC Interactive Online Visualization and Analysis Infrastructure (Giovanni). This initial study focused on merging daily mean Aerosol Optical Thickness (AOT), as measured by the Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the Terra and Aqua satellites, to increase spatial coverage and produce complete fields to facilitate comparison with models and station data. The fusion algorithm used the maximum likelihood technique to merge the pixel values where available. The algorithm was applied to two regional AOT subsets (with mostly regular and irregular gaps, respectively) and a set of AOT fields that differed only in the size and location of artificially created gaps. The Cumulative Semivariogram (CSV) was found to be sensitive to the spatial distribution of gap areas and, thus, useful for assessing the sensitivity of the fused data to spatial gaps.
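
    Under a Gaussian error model with known variances, the per-pixel maximum-likelihood merge reduces to an inverse-variance weighted mean; the sketch below (illustrative variances, NaN marking gaps) shows the pixel-wise rule, not the study's full algorithm:

```python
import numpy as np

def merge_aot(terra, aqua, var_terra=0.02, var_aqua=0.02):
    # Per-pixel maximum-likelihood merge: inverse-variance weighted
    # mean where both sensors report, the single available value
    # where only one does (NaN marks a gap in the swath).
    w1, w2 = 1.0 / var_terra, 1.0 / var_aqua
    both = ~np.isnan(terra) & ~np.isnan(aqua)
    out = np.where(np.isnan(terra), aqua, terra)
    out[both] = (w1 * terra[both] + w2 * aqua[both]) / (w1 + w2)
    return out

terra = np.array([0.2, np.nan, np.nan])  # illustrative AOT pixels
aqua = np.array([0.4, 0.3, np.nan])
merged = merge_aot(terra, aqua)
```

    Pixels where both instruments have gaps stay as gaps, which is exactly what the Cumulative Semivariogram diagnostic was used to assess.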

  17. Passive in-vehicle driver breath alcohol detection using advanced sensor signal acquisition and fusion.

    Science.gov (United States)

    Ljungblad, Jonas; Hök, Bertil; Allalou, Amin; Pettersson, Håkan

    2017-04-03

    The research objective of the present investigation is to demonstrate the present status of passive in-vehicle driver breath alcohol detection and to highlight the necessary conditions for large-scale implementation of such a system. Completely passive detection has remained a challenge mainly because of the requirements on signal resolution combined with the constraints of vehicle integration. The work is part of the DADSS (driver alcohol detection system for safety) program, which aims at massive deployment of alcohol sensing systems that could potentially save thousands of American lives annually. The work reported here builds on earlier investigations, in which it was shown that detection of alcohol vapor in the proximity of a human subject may be traced to that subject by means of simultaneous recording of carbon dioxide (CO2) at the same location. Sensors based on infrared spectroscopy were developed to detect and quantify low concentrations of alcohol and CO2. In the present investigation, alcohol and CO2 were recorded at various locations in a vehicle cabin while human subjects performed normal in-step procedures and driving preparations. A video camera directed at the driver position recorded images of the driver's upper body, including the face, and the images were analyzed with respect to features significant to breathing behavior and breath detection, such as mouth opening and head direction. Improvement of the sensor system with respect to signal resolution, including algorithm and software development, and fusion of the sensor and camera signals were successfully implemented and tested before starting the human study. In addition, experimental tests and simulations were performed with the purpose of connecting human subject data with repeatable experimental conditions. The results include occurrence statistics of detected breaths by signal peaks of CO2 and alcohol. From the statistical data, the accuracy of breath alcohol

  18. Evaluation of chemical sensors for in situ ground-water monitoring at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, E.M.; Hostetler, D.D.

    1989-03-01

    This report documents a preliminary review and evaluation of instrument systems and sensors that may be used to detect ground-water contaminants in situ at the Hanford Site. Three topics are covered in this report: (1) identification of a group of priority contaminants at Hanford that could be monitored in situ, (2) a review of current instrument systems and sensors for environmental monitoring, and (3) an evaluation of instrument systems that could be used to monitor Hanford contaminants. Thirteen priority contaminants were identified in Hanford ground water, including carbon tetrachloride and six related chlorinated hydrocarbons, cyanide, methyl ethyl ketone, chromium (VI), fluoride, nitrate, and uranium. Based on transduction principles, chemical sensors were divided into four classes, and ten specific types of instrument systems were considered: fluorescence spectroscopy, surface-enhanced Raman spectroscopy (SERS), spark excitation-fiber optic spectrochemical emission sensors (FOSES), chemical optrodes, stripping voltammetry, catalytic surface-modified ion electrodes, immunoassay sensors, resistance/capacitance devices, quartz piezobalances, and surface acoustic wave devices. Because the flow of heat is difficult to control, there are currently no environmental chemical sensors based on thermal transduction. The ability of these ten instrument systems to detect the thirteen priority contaminants at the Hanford Site at the required sensitivity was evaluated. In addition, all ten instrument systems were qualitatively evaluated for general selectivity, response time, reliability, and field operability. 45 refs., 23 figs., 7 tabs.

  19. Research and Analysis on Multi-sensor Data Fusion Algorithms for Intelligent Vehicles

    Institute of Scientific and Technical Information of China (English)

    宋维堂; 张鸰

    2012-01-01

    Multi-sensor data fusion is a technology that emerged in the 1980s. It synthesizes the data collected by an intelligent vehicle's multiple sensors and exploits the redundancy and complementarity among those data to derive accurate environmental information for ground vehicle localization, vehicle tracking, vehicle navigation, and related tasks. Based on a classification and summary of existing data fusion methods, this paper reviews research on multi-sensor data fusion algorithms and applications of data fusion technology, providing a reference for research on multi-sensor data fusion in intelligent vehicles.

  20. Urban-scale mapping of PM2.5 distribution via data fusion between high-density sensor network and MODIS Aerosol Optical Depth

    Science.gov (United States)

    Ba, Yu Tao; xian Liu, Bao; Sun, Feng; Wang, Li hua; Tang, Yu jia; Zhang, Da wei

    2017-04-01

    High-resolution mapping of PM2.5 is the prerequisite for precise analytics and subsequent anti-pollution interventions. Given the large variance of particulate distribution, urban-scale mapping is challenging whether with ground-based fixed stations, with satellites, or via models. In this study, a dynamic fusion method between a high-density sensor network and MODIS Aerosol Optical Depth (AOD) was introduced. The sensor network was deployed in Beijing ( > 1000 fixed monitors across a 16000 km2 area) to provide raw observations with high temporal resolution (sampling interval 5 km). The MODIS AOD was calibrated to provide a distribution map with low temporal resolution (daily) and moderate spatial resolution ( = 3 km). By encoding the data quality and defects (e.g. cloud, reflectance, abnormal values), a hybrid interpolation procedure with cross-validation generated PM2.5 distributions with both high temporal and high spatial resolution. Several no-pollution and high-pollution periods were tested to validate the proposed fusion method for capturing the instantaneous patterns of PM2.5 emission.
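
    One plausible building block of such a hybrid interpolation step is inverse-distance weighting from nearby monitors. This is a generic sketch, not the authors' actual procedure; coordinates and values are invented:

```python
def idw(stations, x, y, power=2.0):
    # Inverse-distance-weighted PM2.5 estimate at (x, y) from
    # (xi, yi, value) monitor tuples.
    num = den = 0.0
    for xi, yi, v in stations:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v            # exactly on a monitor: take its value
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

# Midpoint between two monitors reporting 10 and 30 ug/m3.
estimate = idw([(0.0, 0.0, 10.0), (2.0, 0.0, 30.0)], 1.0, 0.0)
```

    In a fusion scheme like the one described, the calibrated AOD field could additionally constrain the interpolation in areas far from any monitor.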

  1. Geographic information system for fusion and analysis of high-resolution remote sensing and ground data

    Science.gov (United States)

    Freeman, Anthony; Way, Jo Bea; Dubois, Pascale; Leberl, Franz

    1993-01-01

    We seek to combine high-resolution remotely sensed data with models and ground truth measurements, in the context of a Geographical Information System (GIS), integrated with specialized image processing software. We will use this integrated system to analyze the data from two Case Studies, one at a boreal forest site, the other a tropical forest site. We will assess the information content of the different components of the data, determine the optimum data combinations to study biogeophysical changes in the forest, assess the best way to visualize the results, and validate the models for the forest response to different radar wavelengths/polarizations. During the 1990s, unprecedented amounts of high-resolution images of the Earth's surface from space will become available to the applications scientist from the LANDSAT/TM series, European and Japanese ERS-1 satellites, RADARSAT and SIR-C missions. When the Earth Observation Systems (EOS) program is operational, the amount of data available for a particular site can only increase. The interdisciplinary scientist, seeking to use data from various sensors to study a site of interest, may be faced with massive difficulties in manipulating such large data sets, assessing their information content, determining the optimum combinations of data to study a particular parameter, visualizing the results and validating a model of the surface. The techniques to deal with these problems are also needed to support the analysis of data from NASA's current program of Multi-sensor Airborne Campaigns, which will also generate large volumes of data. In the Case Studies outlined in this proposal, we will have somewhat unique data sets. For the Bonanza Creek Experimental Forest (Case 1) calibrated DC-8 SAR (Synthetic Aperture Radar) data and extensive ground truth measurements are already at our disposal. The data set shows documented evidence of temporal change.
The Belize Forest Experiment (Case 2) will produce calibrated DC-8 SAR

  3. The application of machine learning in multi sensor data fusion for activity recognition in mobile device space

    Science.gov (United States)

    Marhoubi, Asmaa H.; Saravi, Sara; Edirisinghe, Eran A.

    2015-05-01

    The present generation of mobile handheld devices comes equipped with a large number of sensors. The key sensors include the Ambient Light Sensor, Proximity Sensor, Gyroscope, Compass and Accelerometer. Many mobile applications are driven by the readings obtained from one or two of these sensors. However, the presence of multiple sensors enables the determination of more detailed activities carried out by the user of a mobile device, and thus enables smarter mobile applications that respond more appropriately to user behavior and device usage. In the proposed research we use recent advances in machine learning to fuse the data obtained from all key sensors of a mobile device. We investigate the possible use of single and ensemble classifier based approaches to identify a mobile device's behavior in the space in which it is present. Feature selection algorithms are used to remove non-discriminant features that often lead to poor classifier performance. As the sensor readings are noisy and include a significant proportion of missing values and outliers, we use machine learning based approaches to clean the raw data obtained from the sensors before use. Based on selected practical case studies, we demonstrate the ability to accurately recognize device behavior based on multi-sensor data fusion.
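
    Two of the described steps - cleaning missing sensor readings and ensemble-based classification - can be sketched as follows. Median imputation and majority voting stand in for the ML-based cleaning and the trained ensemble; the sensor values and labels are invented:

```python
from collections import Counter

def impute_missing(rows):
    # Replace None readings with the per-feature (upper) median - a
    # simple stand-in for the ML-based cleaning the abstract describes.
    cols = list(zip(*rows))
    meds = []
    for col in cols:
        vals = sorted(v for v in col if v is not None)
        meds.append(vals[len(vals) // 2])
    return [[v if v is not None else meds[j] for j, v in enumerate(row)]
            for row in rows]

def majority_vote(predictions):
    # Decision-level ensemble fusion: each per-sensor classifier
    # casts one activity label; the majority wins.
    return Counter(predictions).most_common(1)[0][0]

cleaned = impute_missing([[1.0, None], [3.0, 4.0], [5.0, 6.0]])
activity = majority_vote(["walking", "walking", "idle"])
```

    A real pipeline would replace the voters with trained classifiers over the selected feature subset, but the fusion mechanics are the same.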

  4. A dynamic and context-aware semantic mediation service for discovering and fusion of heterogeneous sensor data

    Directory of Open Access Journals (Sweden)

    Mohamed Bakillah

    2013-06-01

    Full Text Available Sensors play an increasingly critical role in capturing and distributing observations of phenomena in our environment. The Semantic Sensor Web enables interoperability to support various applications that use data made available by semantically heterogeneous sensor services. However, several challenges still need to be addressed to achieve this vision. In particular, mechanisms are required that can support context-aware semantic mapping adapting to the dynamic metadata of sensors. Semantic mapping for the Sensor Web is required to support sensor data fusion, sensor data discovery and retrieval, and automatic semantic annotation, to name only a few applications. This paper presents a context-aware, ontology-based semantic mediation service for heterogeneous sensor services. The semantic mediation service is context-aware and dynamic because it takes into account the real-time variability of the thematic, spatial and temporal features that describe sensor data in different contexts. The semantic mediation service integrates rule-based reasoning to support the resolution of semantic heterogeneities. An application scenario shows how the semantic mediation service can improve sensor data interpretation, reuse, and sharing in static and dynamic settings.

  5. Flux Tensor Constrained Geodesic Active Contours with Sensor Fusion for Persistent Object Tracking

    Directory of Open Access Journals (Sweden)

    Filiz Bunyak

    2007-08-01

    Full Text Available This paper makes new contributions in motion detection, object segmentation and trajectory estimation to create a successful object tracking system. A new efficient motion detection algorithm, referred to as the flux tensor, is used to detect moving objects in infrared video without requiring background modeling or contour extraction. The flux tensor-based motion detector, when applied to infrared video, is more accurate than thresholding 'hot spots', and is insensitive to shadows as well as illumination changes in the visible channel. In real-world monitoring tasks, fusing scene information from multiple sensors and sources is a useful core mechanism for dealing with complex scenes, lighting conditions and environmental variables. The object segmentation algorithm uses level set-based geodesic active contour evolution that incorporates the fusion of visible color and infrared edge information in a novel manner. Touching or overlapping objects are further refined during the segmentation process using an appropriate shape-based model. Multiple object tracking using correspondence graphs is extended to handle groups of objects and occlusion events by Kalman filter-based cluster trajectory analysis and watershed segmentation. The proposed object tracking algorithm was successfully tested on several difficult outdoor multispectral videos from stationary sensors and is not confounded by shadows or illumination variations.
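
    The flux tensor's key property - motion detection from temporal derivatives, with no background model - can be illustrated by its simplest special case: the per-pixel mean squared temporal derivative, thresholded into a mask. This is a toy stand-in, not the full tensor computation, and the frames below are synthetic:

```python
import numpy as np

def motion_mask(frames, threshold):
    # Simplest cousin of the flux tensor trace: per-pixel mean
    # squared temporal derivative over a short frame window,
    # thresholded into a motion mask.
    f = np.asarray(frames, dtype=float)
    dt = np.diff(f, axis=0)           # temporal derivative
    energy = (dt ** 2).mean(axis=0)   # motion energy per pixel
    return energy > threshold

# Toy 2x2 "infrared" frames: one pixel flashes, the rest are static.
frames = [[[0, 0], [0, 0]],
          [[0, 10], [0, 0]],
          [[0, 0], [0, 0]]]
mask = motion_mask(frames, 1.0)
```

    The full flux tensor additionally averages spatial derivatives of the temporal gradient over a neighborhood, which is what gives it robustness to noise.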

  6. Fusion of Landsat TM and ground spectrometry data in monitoring of non-operating mine

    Science.gov (United States)

    Borisova, Denitsa; Nikolov, Hristo N.

    2009-09-01

    Surface mining activities in Europe are estimated to cover an area of 5-10 000 km2. In this paper we suggest that the availability of Landsat Thematic Mapper (TM) Earth observation data allows the collection of environmental and mine-related data for use in planning and undertaking mine restoration work on a cost-effective basis. The advantage is that these data are acquired digitally and can easily be processed and utilized in various information formats. An important step in the data processing is the verification of the airborne data; for this purpose, ground spectrometry measurements of samples taken from test sites were performed. In the last decade, several mining areas and their corresponding dumps have been subject to a reclamation process in Bulgaria. We focused our research on one of the most important sites for the country's copper production over a 20-year period - the Asarel-Medet deposit. This mining complex consists of an open mine, the dumps and a processing plant. After exploitation of the Medet deposit ceased in 1994, a rehabilitation program for the soil cover and hydrographic network was established and launched. A continuous task is the monitoring of these activities from the beginning for at least a 15-year period, which is to end this year. To process the data characterizing the progress of the land cover restoration, several techniques were used, both standard - basic and advanced statistics, image enhancement and data fusion - and novel methods for supervised classification. The results obtained show that the data used and the implemented approach are useful in environmental monitoring and economically attractive for the company responsible for the ecological state of the region.

  7. Pixel-by-pixel VIS/NIR and LIR sensor fusion system

    Science.gov (United States)

    Zhang, Evan; Zhang, James S.; Song, Vivian W.; Chin, Ken P.; Hu, Gelbert

    2003-01-01

    A visible (VIS) camera (such as a CCD) or near infrared (NIR) camera (such as a low light level CCD or image intensifier) has high resolution and makes it easy to distinguish friend from foe, but it cannot see through thin fog/cloud, heavy smoke/dust, foliage, camouflage, or darkness. The long-wave infrared (LIR) imager can overcome these problems, but its resolution is too low and it cannot see the NIR aiming light from the enemy. The best solution is to fuse the VIS/NIR and LIR sensors to overcome their shortcomings and take advantage of both. In order to see the same target without parallax, the fusion system must have a common optical aperture. In this paper, three common optical apertures are designed: a common reflective objective lens, a common beam splitter, and a common transmissive objective lens. The first has a very small field of view and the second needs two heads, so the best choice is the third, but we must find suitable optical materials and correct the color aberrations from 0.6 to 12 μm - a challenging task. By choosing ZnSe as the first common piece of the objective lens and using glass for NIR and Ge (or IR glass) for LIR for the remaining pieces, we only need to correct - and are able to correct - the aberrations from 0.6 to 1.0 μm for NIR and from 8 to 12 μm for LIR. Finally, a common reflective objective lens and the common beam splitter are also successfully designed. Five application examples are given. In the digital signal processing, we use only one Altera chip. After inserting data, scaling the image size, and adjusting the signal level, the LIR has the same format and pixel count as the VIS/NIR, so real-time pixel-by-pixel sensor fusion is realized. The digital output can be used for further image processing and automatic target recognition; for example, if we overlay the LIR image on the VIS/NIR image for missile guidance or a rifle sight, we no longer need to worry about time of day or the environment. A gum-size wireless transmitter is also designed that is

  8. Data dimensionality reduction and data fusion for fast characterization of green coffee samples using hyperspectral sensors.

    Science.gov (United States)

    Calvini, Rosalba; Foca, Giorgia; Ulrici, Alessandro

    2016-10-01

    Hyperspectral sensors represent a powerful tool for chemical mapping of solid-state samples, since they provide spectral information localized in the image domain in very short times and without the need for sample pretreatment. However, due to the large data size of each hyperspectral image, data dimensionality reduction (DR) is necessary in order to develop hyperspectral sensors for real-time monitoring of large sets of samples with different characteristics. In particular, in this work we focused on DR methods that convert the three-dimensional data array corresponding to each hyperspectral image into a one-dimensional signal (1D-DR) which retains spectral and/or spatial information. In this way, large datasets of hyperspectral images can be converted into matrices of signals, which in turn can easily be processed using suitable multivariate statistical methods. Obviously, different 1D-DR methods highlight different aspects of the hyperspectral image dataset. Therefore, in order to investigate their advantages and disadvantages, in this work we compared three different 1D-DR methods: average spectrum (AS), single space hyperspectrogram (SSH) and common space hyperspectrogram (CSH). In particular, we considered 370 NIR-hyperspectral images of a set of green coffee samples, and the three 1D-DR methods were tested for their effectiveness in sensor fault detection, data structure exploration and sample classification according to coffee variety and to coffee processing method. Principal component analysis and partial least squares-discriminant analysis were used to compare the three separate DR methods. Furthermore, low-level and mid-level data fusion were also employed to test the advantages of using AS, SSH and CSH altogether. Graphical Abstract: Key steps in hyperspectral data dimensionality reduction.
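
    Of the three 1D-DR methods, the average spectrum (AS) is the simplest to state: collapse the hypercube into one mean spectrum. A sketch with a toy cube (the hyperspectrogram variants, SSH and CSH, additionally encode the distribution of pixel scores, which AS discards):

```python
import numpy as np

def average_spectrum(cube):
    # AS reduction: collapse an (x, y, wavelength) hypercube into a
    # single mean spectrum, keeping spectral but not spatial detail.
    bands = np.shape(cube)[-1]
    return np.asarray(cube, dtype=float).reshape(-1, bands).mean(axis=0)

# Toy 2x2x3 hypercube with an identical spectrum at every pixel.
cube = [[[1, 2, 3], [1, 2, 3]],
        [[1, 2, 3], [1, 2, 3]]]
signal_1d = average_spectrum(cube)
```

    The resulting matrix of one signal per image is what makes PCA or PLS-DA over hundreds of images tractable.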

  9. Kalman Filters in Geotechnical Monitoring of Ground Subsidence Using Data from MEMS Sensors.

    Science.gov (United States)

    Li, Cheng; Azzam, Rafig; Fernández-Steeger, Tomás M

    2016-07-19

    The fast development of wireless sensor networks and MEMS makes it possible today to set up real-time wireless geotechnical monitoring. To handle interference and noise in the output data, a Kalman filter can be selected as a method to achieve a more realistic estimate of the observations. In this paper, a one-day wireless measurement campaign using accelerometers and inclinometers was deployed on top of a tunnel section under construction in order to monitor ground subsidence. The normal vectors of the sensors were first obtained with the help of rotation matrices and then projected onto the plane of the longitudinal section, from which the dip angles over time were obtained via a trigonometric function. Finally, a centralized Kalman filter was applied to estimate the tilt angles of the sensor nodes based on the data from the embedded accelerometer and the inclinometer. By comparing the results from two sensor nodes, one deployed on the track of the tunnel boring machine and one away from it, the passing of the machine can be identified from unusual readings. Using this method, the ground settlement due to excavation can be measured and real-time monitoring of ground subsidence can be realized.
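    The paper's centralized filter fuses accelerometer and inclinometer channels; the smoothing step at its core can be illustrated with a scalar Kalman filter on a single noisy tilt-angle channel (all noise parameters are invented for the sketch, not taken from the paper):

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=1e-2, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a slowly varying tilt angle: random-walk
    process noise variance q, measurement noise variance r."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: angle modelled as a random walk
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the measured angle z
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# invented data: a constant 2.0-degree tilt observed with 0.1-degree noise
rng = np.random.default_rng(1)
noisy = 2.0 + 0.1 * rng.standard_normal(200)
est = kalman_1d(noisy)
```

    With these settings the steady-state gain is small, so the estimate averages over roughly the last ten samples while still being able to track slow subsidence-induced tilt changes.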

  10. Commercial off the Shelf Ground Control Supports Calibration and Conflation from Ground to Space Based Sensors

    Science.gov (United States)

    Danielová, M.; Hummel, P.

    2016-06-01

    The need for rapid deployment of aerial and satellite imagery in support of GIS and engineering integration projects requires new sources of geodetic control to ensure the accuracy of geospatial projects. In the past, teams of surveyors would need to deploy to project areas to provide targeted or photo-identifiable points that supply data for orthorectification, QA/QC, and calibration of multi-platform sensors. The challenge of integrating street-view, UAS, airborne, and space-based sensors to produce a common operational picture requires control to tie the multiple sources together. Today, commercial off-the-shelf delivery of existing photo-identifiable control is increasing the speed with which such data can be deployed, without sites having to be revisited again and again. The presentation will discuss the processes developed by CompassData to build a global library of 40,000 control points available today. International Organization for Standardization (ISO)-based processes and initiatives ensure consistent quality of the survey data, of the photo-identifiable features selected, and of the metadata that supports photogrammetrists, engineers, and GIS professionals in quickly delivering projects with better accuracy.

  11. Embry-Riddle Aeronautical University multispectral sensor and data fusion laboratory: a model for distributed research and education

    Science.gov (United States)

    McMullen, Sonya A. H.; Henderson, Troy; Ison, David

    2017-05-01

    The miniaturization of unmanned systems and spacecraft, as well as of computing and sensor technologies, has opened new opportunities in the areas of remote sensing and multi-sensor data fusion for a variety of applications. Remote sensing and data fusion have historically been the purview of large government organizations, such as the Department of Defense (DoD), National Aeronautics and Space Administration (NASA), and National Geospatial-Intelligence Agency (NGA), due to the high cost and complexity of developing, fielding, and operating such systems. However, miniaturized computers with high-capacity processing capabilities, small and affordable sensors, and emerging, commercially available platforms such as UAS and CubeSats to carry such sensors have allowed for a vast range of novel applications. In order to leverage these developments, Embry-Riddle Aeronautical University (ERAU) has developed an advanced sensor and data fusion laboratory to research component capabilities and their employment on a wide range of autonomous, robotic, and transportation systems. This lab is unique in several ways: for example, it provides a traditional campus laboratory for students and faculty to model and test sensors in a range of scenarios, process multi-sensor data sets (both simulated and experimental), and analyze results. Moreover, it allows for "virtual" modeling, testing, and teaching capability reaching beyond the physical confines of the facility, for use among ERAU Worldwide students and faculty located around the globe. Although other institutions such as Georgia Institute of Technology, Lockheed Martin, University of Dayton, and University of Central Florida have optical sensor laboratories, the ERAU virtual concept is the first such lab to expand to multispectral sensors and data fusion, while focusing on data collection and data products rather than on the manufacturing aspect. Further, the initiative is a unique effort among Embry-Riddle faculty to develop multi

  12. Ground and river water quality monitoring using a smartphone-based pH sensor

    Directory of Open Access Journals (Sweden)

    Sibasish Dutta

    2015-05-01

    Full Text Available We report the working of a compact, handheld smartphone-based pH sensor for monitoring ground and river water quality. Using simple laboratory optical components and the camera of the smartphone, we develop a compact spectrophotometer that operates in the 400-700 nm wavelength range with a spectral resolution of 0.305 nm/pixel for our equipment. The sensor measures variations in the optical absorption band of a pH-sensitive dye sample in solutions of different pH. The transmission image spectrum formed by a transmission grating is captured by the smartphone and subsequently converted into intensity versus wavelength. Using the designed sensor, we measure the quality of ground water and river water from different locations in Assam; the results are found to be reliable when compared with a standard spectrophotometer. The overall cost involved in developing the sensor is relatively low. We envision that the designed sensing technique could emerge as an inexpensive, compact and portable pH sensor useful for in-field applications.
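    The quoted 0.305 nm/pixel figure implies a linear pixel-to-wavelength calibration for the captured grating image; a sketch (the reference pixel and its wavelength are assumed for illustration):

```python
def pixel_to_wavelength(pixel, ref_pixel=0, ref_wavelength=400.0,
                        dispersion=0.305):
    """Linear calibration: wavelength (nm) of a camera pixel, given a
    reference pixel of known wavelength and the nm-per-pixel dispersion."""
    return ref_wavelength + (pixel - ref_pixel) * dispersion

# at 0.305 nm/pixel, the 400-700 nm band spans roughly 984 pixels
print(pixel_to_wavelength(0))    # 400.0
print(pixel_to_wavelength(492))  # mid-band, ~550 nm
```

    In practice the reference pixel and dispersion would be fitted from known spectral lines rather than assumed.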

  13. Ground and river water quality monitoring using a smartphone-based pH sensor

    Science.gov (United States)

    Dutta, Sibasish; Sarma, Dhrubajyoti; Nath, Pabitra

    2015-05-01

    We report the working of a compact, handheld smartphone-based pH sensor for monitoring ground and river water quality. Using simple laboratory optical components and the camera of the smartphone, we develop a compact spectrophotometer that operates in the 400-700 nm wavelength range with a spectral resolution of 0.305 nm/pixel for our equipment. The sensor measures variations in the optical absorption band of a pH-sensitive dye sample in solutions of different pH. The transmission image spectrum formed by a transmission grating is captured by the smartphone and subsequently converted into intensity versus wavelength. Using the designed sensor, we measure the quality of ground water and river water from different locations in Assam; the results are found to be reliable when compared with a standard spectrophotometer. The overall cost involved in developing the sensor is relatively low. We envision that the designed sensing technique could emerge as an inexpensive, compact and portable pH sensor useful for in-field applications.

  14. Stochastic approximation methods for fusion-rule estimation in multiple sensor systems

    Energy Technology Data Exchange (ETDEWEB)

    Rao, N.S.V.

    1994-06-01

    A system of N sensors S_1, S_2, …, S_N is considered; corresponding to an object with parameter x ∈ R^d, sensor S_i yields output y^(i) ∈ R^d according to an unknown probability distribution p_i(y^(i)|x). A training l-sample (x_1, y_1), (x_2, y_2), …, (x_l, y_l) is given, where y_i = (y_i^(1), y_i^(2), …, y_i^(N)) and y_i^(j) is the output of S_j in response to input x_i. The problem is to estimate a fusion rule f : R^(Nd) → R^d, based on the sample, such that the expected square error I(f) = ∫ [x − f(y^(1), y^(2), …, y^(N))]^2 p(y^(1), y^(2), …, y^(N)|x) p(x) dy^(1) dy^(2) … dy^(N) dx is minimized over a family of fusion rules Λ based on the given l-sample. Let f_* ∈ Λ minimize I(f); f_* cannot be computed since the underlying probability distributions are unknown. Three stochastic approximation methods are presented to compute f̂ such that, under suitable conditions and for a sufficiently large sample, P[I(f̂) − I(f_*) > ε] < δ for arbitrarily specified ε > 0 and δ, 0 < δ < 1. The three methods are based on Robbins-Monro style algorithms, empirical risk minimization, and regression estimation algorithms.
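    As a toy instance of such a scheme (not one of the paper's three methods verbatim): two sensors observe a scalar parameter with different noise levels, and the weight of a convex-combination fusion rule is estimated by a Robbins-Monro recursion with decreasing gain:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two sensors observe the same scalar parameter x with noise std 0.1 and 0.2.
# Learn the weight of the fusion rule f(y1, y2) = theta*y1 + (1 - theta)*y2
# by a Robbins-Monro recursion on the squared fusion error.
theta = 0.5
for t in range(1, 5001):
    x = rng.uniform(-1.0, 1.0)                  # object parameter
    y1 = x + 0.1 * rng.standard_normal()        # output of sensor S1
    y2 = x + 0.2 * rng.standard_normal()        # output of sensor S2
    err = theta * y1 + (1.0 - theta) * y2 - x   # fusion error
    gain = 40.0 / (100.0 + t)                   # decreasing gain sequence
    theta -= gain * err * (y1 - y2)             # stochastic gradient step

# theta tends to the inverse-variance optimum 0.2**2 / (0.1**2 + 0.2**2) = 0.8
print(theta)
```

    The recursion needs no knowledge of the noise distributions, only samples, which is the point of the stochastic approximation approach.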

  15. Fusion rule estimation in multiple sensor systems with unknown noise distributions

    Energy Technology Data Exchange (ETDEWEB)

    Rao, N.S.V.

    1993-12-31

    A system of N sensors S_1, S_2, …, S_N is considered; corresponding to an object with parameter x ∈ R^d, sensor S_i yields output y^(i) ∈ R^d according to an unknown probability distribution p_i(y^(i)|x). A training l-sample (x_1, y_1), (x_2, y_2), …, (x_l, y_l) is given, where y_i = (y_i^(1), y_i^(2), …, y_i^(N)) and y_i^(j) is the output of S_j in response to input x_i. The problem is to estimate a fusion rule f : R^(Nd) → R^d, based on the sample, such that the expected square error I(f) = ∫ [x − f(y^(1), y^(2), …, y^(N))]^2 p(y^(1), y^(2), …, y^(N)|x) p(x) dy^(1) dy^(2) … dy^(N) dx is minimized over a family of fusion rules Λ based on the given l-sample. Let f_* ∈ Λ minimize I(f); f_* cannot be computed since the underlying probability distributions are unknown. Using Vapnik's empirical risk minimization method, we show that if Λ has finite capacity, then under bounded error and for a sufficiently large sample, f_emp can be obtained such that P[I(f_emp) − I(f_*) > ε] < δ for arbitrarily specified ε > 0 and δ, 0 < δ < 1. We identify several computational methods to obtain f_emp or its approximations based on neural networks, radial basis functions, wavelets, non-polynomial networks, and polynomials and splines. We then discuss linearly separable systems to identify objects from a finite class, where f_emp can be computed in polynomial time using quadratic programming methods.

  16. Human Arm Motion Tracking by Orientation-Based Fusion of Inertial Sensors and Kinect Using Unscented Kalman Filter.

    Science.gov (United States)

    Atrsaei, Arash; Salarieh, Hassan; Alasty, Aria

    2016-09-01

    Due to the various applications of human motion capture techniques, developing low-cost methods that are applicable in nonlaboratory environments is under consideration. MEMS inertial sensors and Kinect are two low-cost devices that can be utilized in home-based motion capture systems, e.g., home-based rehabilitation. In this work, an unscented Kalman filter approach was developed, based on the complementary properties of Kinect and the inertial sensors, to fuse the orientation data of these two devices for human arm motion tracking, both with a stationary shoulder joint and during whole-body movement. A new measurement model of the fusion algorithm was obtained that can compensate for the drift of the inertial sensors in highly dynamic motions and for joint occlusion in Kinect. The efficiency of the proposed algorithm was evaluated against an optical motion tracker system. The errors were reduced by almost 50% compared to cases where either the inertial sensors or Kinect alone were utilized.
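    A full unscented Kalman filter is too long to sketch here, but the complementary structure it exploits (gyroscope integration drifts, while Kinect orientation is drift-free but noisier and occlusion-prone) can be illustrated with a linear two-state Kalman filter that estimates an angle and a gyro bias from synthetic data; all models and noise figures are invented, not the paper's:

```python
import numpy as np

def fuse_gyro_kinect(gyro, kinect, dt=0.01, sig_g=0.01, sig_k=0.05, q_b=1e-8):
    """Linear Kalman filter, state [angle, gyro_bias]: the predict step
    integrates the bias-corrected gyro rate; the update step corrects the
    angle with the (noisy but drift-free) Kinect measurement."""
    x = np.zeros(2)                                # [angle, bias]
    P = np.diag([1.0, 0.1])
    F = np.array([[1.0, -dt], [0.0, 1.0]])
    Q = np.diag([(sig_g * dt) ** 2, q_b])
    R = sig_k ** 2
    angles = []
    for rate, z in zip(gyro, kinect):
        # predict
        x = np.array([x[0] + (rate - x[1]) * dt, x[1]])
        P = F @ P @ F.T + Q
        # update with Kinect angle z (H = [1, 0])
        K = P[:, 0] / (P[0, 0] + R)
        x = x + K * (z - x[0])
        P = P - np.outer(K, P[0, :])
        angles.append(x[0])
    return np.array(angles), x[1]

# synthetic 20 s arm motion with a constant 0.05 rad/s gyro bias
rng = np.random.default_rng(0)
n, dt = 2000, 0.01
t = np.arange(n) * dt
true_angle = 0.5 * np.sin(0.5 * t)
gyro = 0.25 * np.cos(0.5 * t) + 0.05 + 0.01 * rng.standard_normal(n)
kinect = true_angle + 0.05 * rng.standard_normal(n)
est, bias_est = fuse_gyro_kinect(gyro, kinect)
```

    The unscented version applies the same predict/update cycle to quaternion orientation states through the sigma-point transform, which handles the nonlinearity that this linear sketch avoids.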

  17. Experimental measurement of oil-water two-phase flow by data fusion of electrical tomography sensors and venturi tube

    Science.gov (United States)

    Liu, Yinyan; Deng, Yuchi; Zhang, Maomao; Yu, Peining; Li, Yi

    2017-09-01

    Oil-water two-phase flows are commonly found in the production processes of the petroleum industry. Accurate online measurement of flow rates is crucial to ensure the safety and efficiency of oil exploration and production. A research team from Tsinghua University has developed an experimental apparatus for multiphase flow measurement based on an electrical capacitance tomography (ECT) sensor, an electrical resistance tomography (ERT) sensor, and a venturi tube. This work presents phase fraction and flow rate measurements of oil-water two-phase flows based on the developed apparatus. The full-range phase fraction can be obtained by combining the ECT and ERT sensors. By fusing the differential pressure measured by the venturi tube with the phase fraction, the total flow rate and the single-phase flow rates can be calculated. Dynamic experiments were conducted on the multiphase flow loop in horizontal and vertical pipelines at various flow rates.
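    The flow-rate step combines the venturi differential pressure with the tomography-derived phase fraction. A hedged sketch using the classical venturi equation and a homogeneous-mixture density (geometry, densities, and the discharge coefficient are illustrative, not the apparatus parameters):

```python
import math

def mixture_density(alpha_water, rho_water=998.0, rho_oil=850.0):
    """Homogeneous-mixture density from the water volume fraction,
    which the apparatus obtains from the ECT/ERT tomograms."""
    return alpha_water * rho_water + (1.0 - alpha_water) * rho_oil

def venturi_flow_rate(dp, rho, d_throat=0.04, d_pipe=0.08, cd=0.98):
    """Volumetric flow rate (m^3/s) from the venturi differential pressure
    dp (Pa) via the classical venturi equation; geometry and discharge
    coefficient are illustrative, not the paper's apparatus."""
    a_throat = math.pi * d_throat ** 2 / 4.0
    beta = d_throat / d_pipe
    return cd * a_throat * math.sqrt(2.0 * dp / (rho * (1.0 - beta ** 4)))

rho_mix = mixture_density(0.6)               # 60 % water cut
q_total = venturi_flow_rate(5000.0, rho_mix)
q_water = 0.6 * q_total                      # split by phase fraction
q_oil = 0.4 * q_total
```

    Real two-phase venturi metering corrects for slip between the phases; the homogeneous split above is the simplest possible fusion of the two measurements.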

  18. Cooperative Surveillance and Pursuit Using Unmanned Aerial Vehicles and Unattended Ground Sensors

    Science.gov (United States)

    Las Fargeas, Jonathan; Kabamba, Pierre; Girard, Anouck

    2015-01-01

    This paper considers the problem of path planning for a team of unmanned aerial vehicles performing surveillance near a friendly base. The unmanned aerial vehicles do not possess sensors with automated target recognition capability and, thus, rely on communicating with unattended ground sensors placed on roads to detect and image potential intruders. The problem is motivated by persistent intelligence, surveillance, reconnaissance and base defense missions. The problem is formulated and shown to be intractable. A heuristic algorithm to coordinate the unmanned aerial vehicles during surveillance and pursuit is presented. Revisit deadlines are used to schedule the vehicles' paths nominally. The algorithm uses detections from the sensors to predict intruders' locations and selects the vehicles' paths by minimizing a linear combination of missed deadlines and the probability of not intercepting intruders. An analysis of the algorithm's completeness and complexity is then provided. The effectiveness of the heuristic is illustrated through simulations in a variety of scenarios. PMID:25591168
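    The selection criterion, a linear combination of missed deadlines and the probability of not intercepting intruders, can be sketched as a scoring function over candidate plans (the weight and the candidate numbers are invented):

```python
def path_cost(missed_deadlines, p_no_intercept, lam=0.7):
    """Rank a candidate path plan: lam trades off missed revisit deadlines
    against the probability of not intercepting the intruder (illustrative)."""
    return lam * missed_deadlines + (1.0 - lam) * p_no_intercept

# hypothetical candidate plans: (missed deadlines, P(no intercept))
candidates = {
    "keep_patrolling": (0, 0.60),
    "divert_to_intruder": (2, 0.10),
    "split_difference": (1, 0.30),
}
best = min(candidates, key=lambda name: path_cost(*candidates[name]))
```

    With the deadline weight this high, the scoring favors keeping the nominal patrol schedule; a smaller lam would favor diverting to intercept.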

  19. New Research on MEMS Acoustic Vector Sensors Used in Pipeline Ground Markers

    Directory of Open Access Journals (Sweden)

    Xiaopeng Song

    2014-12-01

    Full Text Available According to the demands of current pipeline detection systems, the above-ground marker (AGM) system based on the sound detection principle has become a major development trend in pipeline technology. A novel MEMS acoustic vector sensor for AGM systems, with the advantages of high sensitivity, high signal-to-noise ratio (SNR) and good low-frequency performance, is put forward. Firstly, it is shown that the frequency of the detected sound signal is concentrated in a lower frequency range, where sound attenuation in soil is relatively low. Secondly, the MEMS acoustic vector sensor structure and basic principles are introduced. Finally, experimental tests are conducted and the results show that, in the range of 0°~90° and at r = 5 m, the proposed MEMS acoustic vector sensor can effectively detect sound signals in soil. The measurement errors of all angles are less than 5°.

  20. Torsion pendulum facility for ground testing of gravitational sensors for LISA

    CERN Document Server

    Hüller, M; Dolesi, R; Vitale, S; Weber, W J

    2002-01-01

    We report here on a torsion pendulum facility for ground-based testing of the Laser Interferometer Space Antenna (LISA) gravitational sensors. We aim to measure weak forces exerted by a capacitive position sensor on a lightweight version of the LISA test mass, suspended from a thin torsion fibre. This facility will permit measurement of the residual, springlike coupling between the test mass and the sensor and characterization of other stray forces relevant to LISA drag-free control. The expected force sensitivity of the proposed torsion pendulum is limited by the intrinsic thermal noise at approximately 3×10⁻¹³ N Hz⁻¹/² at 1 mHz. We briefly describe the design and implementation of the apparatus, its expected performance and preliminary experimental data.

  1. Cooperative Surveillance and Pursuit Using Unmanned Aerial Vehicles and Unattended Ground Sensors

    Directory of Open Access Journals (Sweden)

    Jonathan Las Fargeas

    2015-01-01

    Full Text Available This paper considers the problem of path planning for a team of unmanned aerial vehicles performing surveillance near a friendly base. The unmanned aerial vehicles do not possess sensors with automated target recognition capability and, thus, rely on communicating with unattended ground sensors placed on roads to detect and image potential intruders. The problem is motivated by persistent intelligence, surveillance, reconnaissance and base defense missions. The problem is formulated and shown to be intractable. A heuristic algorithm to coordinate the unmanned aerial vehicles during surveillance and pursuit is presented. Revisit deadlines are used to schedule the vehicles’ paths nominally. The algorithm uses detections from the sensors to predict intruders’ locations and selects the vehicles’ paths by minimizing a linear combination of missed deadlines and the probability of not intercepting intruders. An analysis of the algorithm’s completeness and complexity is then provided. The effectiveness of the heuristic is illustrated through simulations in a variety of scenarios.

  2. Ground penetrating detection using miniaturized radar system based on solid state microwave sensor.

    Science.gov (United States)

    Yao, B M; Fu, L; Chen, X S; Lu, W; Guo, H; Gui, Y S; Hu, C-M

    2013-12-01

    We propose a solid-state-sensor-based miniaturized microwave radar technique that allows rapid microwave phase detection in continuous-wave operation using a lock-in amplifier rather than expensive and complicated instruments such as vector network analyzers. To demonstrate the capability of this sensor-based imaging technique, the miniaturized system has been used to detect targets embedded in sand by measuring the reflection of broadband microwaves. Using the reconstruction algorithm, a buried target with a diameter of less than 5 cm is clearly imaged at a depth of 5 cm or greater. The sensor-based approach thus emerges as an innovative and cost-effective way of performing ground penetrating detection.

  3. Statistical Sensor Fusion of a 9-DOF Mems Imu for Indoor Navigation

    Science.gov (United States)

    Chow, J. C. K.

    2017-09-01

    Sensor fusion of a MEMS IMU with a magnetometer is a popular system design, because such 9-DoF (degrees of freedom) systems are capable of achieving drift-free 3D orientation tracking. However, these systems are often vulnerable to ambient magnetic distortions and lack useful position information; in the absence of external position aiding (e.g. satellite/ultra-wideband positioning systems) the dead-reckoned position accuracy from a 9-DoF MEMS IMU deteriorates rapidly due to unmodelled errors. Positioning information is valuable in many satellite-denied geomatics applications (e.g. indoor navigation, location-based services, etc.). This paper proposes an improved 9-DoF IMU indoor pose tracking method using batch optimization. By adopting a robust in-situ user self-calibration approach to model the systematic errors of the accelerometer, gyroscope, and magnetometer simultaneously in a tightly-coupled post-processed least-squares framework, the accuracy of the estimated trajectory from a 9-DoF MEMS IMU can be improved. Through a combination of relative magnetic measurement updates and a robust weight function, the method is able to tolerate a high level of magnetic distortions. The proposed auto-calibration method was tested in-use under various heterogeneous magnetic field conditions to mimic a person walking with the sensor in their pocket, a person checking their phone, and a person walking with a smartwatch. In these experiments, the presented algorithm improved the in-situ dead-reckoning orientation accuracy by 79.8-89.5 % and the dead-reckoned positioning accuracy by 72.9-92.8 %, thus reducing the relative positioning error from metre-level to decimetre-level after ten seconds of integration, without making assumptions about the user's dynamics.
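    The in-situ self-calibration solves for sensor error terms in a batch least-squares adjustment; its flavor can be shown with a toy one-axis accelerometer calibration in which a scale factor and a bias are estimated from a handful of observations (all numbers invented, far simpler than the paper's tightly-coupled multi-sensor framework):

```python
import numpy as np

# Toy model: measured = scale * true + bias for a one-axis accelerometer
# held in known orientations (readings in m/s^2; all numbers invented).
a_true = np.array([9.81, -9.81, 0.0, 9.81])
rng = np.random.default_rng(2)
a_meas = 1.02 * a_true + 0.15 + 0.01 * rng.standard_normal(4)

# batch least-squares: solve [a_true, 1] @ [scale, bias] = a_meas
A = np.column_stack([a_true, np.ones_like(a_true)])
(scale_est, bias_est), *_ = np.linalg.lstsq(A, a_meas, rcond=None)
```

    The paper's framework solves an analogous but much larger adjustment, with accelerometer, gyroscope, and magnetometer error states estimated jointly along with the trajectory.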

  4. Trunk Motion System (TMS) Using Printed Body Worn Sensor (BWS) via Data Fusion Approach

    Science.gov (United States)

    Mokhlespour Esfahani, Mohammad Iman; Zobeiri, Omid; Moshiri, Behzad; Narimani, Roya; Mehravar, Mohammad; Rashedi, Ehsan; Parnianpour, Mohamad

    2017-01-01

    Human movement analysis is an important part of biomechanics and rehabilitation, for which many measurement systems have been introduced. Among these, wearable devices have substantial biomedical applications, primarily because they can be used both indoors and outdoors. In this study, a Trunk Motion System (TMS) using printed Body-Worn Sensors (BWS) is designed and developed. The TMS can measure three-dimensional (3D) trunk motions, is lightweight, and is a portable and non-invasive system. After identification of the sensor locations, twelve BWSs were printed on stretchable clothing for the purpose of measuring 3D trunk movements. To integrate the BWS data, a neural network data fusion algorithm was used. The outcome of this algorithm, along with the actual 3D anatomical movements (obtained by a Qualisys system), was used to calibrate the TMS. Three healthy participants with different physical characteristics took part in the calibration tests. Seven different tasks (each repeated three times) were performed, involving five planar and two multiplanar movements. Results showed that the accuracy of the TMS was less than 1.0°, 0.8°, 0.6°, 0.8°, 0.9°, and 1.3° for flexion/extension, left/right lateral bending, left/right axial rotation, and multi-planar motions, respectively. In addition, the accuracy of the TMS for the identified movement was less than 2.7°. The TMS, developed to monitor and measure trunk orientations, can have diverse applications in clinical, biomechanical, and ergonomic studies to prevent musculoskeletal injuries and to determine the impact of interventions. PMID:28075342

  5. STATISTICAL SENSOR FUSION OF A 9-DOF MEMS IMU FOR INDOOR NAVIGATION

    Directory of Open Access Journals (Sweden)

    J. C. K. Chow

    2017-09-01

    Full Text Available Sensor fusion of a MEMS IMU with a magnetometer is a popular system design, because such 9-DoF (degrees of freedom) systems are capable of achieving drift-free 3D orientation tracking. However, these systems are often vulnerable to ambient magnetic distortions and lack useful position information; in the absence of external position aiding (e.g. satellite/ultra-wideband positioning systems) the dead-reckoned position accuracy from a 9-DoF MEMS IMU deteriorates rapidly due to unmodelled errors. Positioning information is valuable in many satellite-denied geomatics applications (e.g. indoor navigation, location-based services, etc.). This paper proposes an improved 9-DoF IMU indoor pose tracking method using batch optimization. By adopting a robust in-situ user self-calibration approach to model the systematic errors of the accelerometer, gyroscope, and magnetometer simultaneously in a tightly-coupled post-processed least-squares framework, the accuracy of the estimated trajectory from a 9-DoF MEMS IMU can be improved. Through a combination of relative magnetic measurement updates and a robust weight function, the method is able to tolerate a high level of magnetic distortions. The proposed auto-calibration method was tested in-use under various heterogeneous magnetic field conditions to mimic a person walking with the sensor in their pocket, a person checking their phone, and a person walking with a smartwatch. In these experiments, the presented algorithm improved the in-situ dead-reckoning orientation accuracy by 79.8–89.5 % and the dead-reckoned positioning accuracy by 72.9–92.8 %, thus reducing the relative positioning error from metre-level to decimetre-level after ten seconds of integration, without making assumptions about the user’s dynamics.

  6. Wi-GIM system: a new wireless sensor network (WSN) for accurate ground instability monitoring

    Science.gov (United States)

    Mucchi, Lorenzo; Trippi, Federico; Schina, Rosa; Fornaciai, Alessandro; Gigli, Giovanni; Nannipieri, Luca; Favalli, Massimiliano; Marturia Alavedra, Jordi; Intrieri, Emanuele; Agostini, Andrea; Carnevale, Ennio; Bertolini, Giovanni; Pizziolo, Marco; Casagli, Nicola

    2016-04-01

    Landslides are among the most serious and common geologic hazards around the world. Their impact on human life is expected to increase in the near future as a consequence of human-induced climate change as well as population growth in proximity to unstable slopes. Therefore, developing better-performing technologies for monitoring landslides, and providing local authorities with new instruments able to help them in the decision-making process, is becoming more and more important. Recent progress in Information and Communication Technologies (ICT) allows us to extend the use of wireless technologies in landslide monitoring. In particular, developments in electronic components have lowered the price of sensors and, at the same time, enabled more efficient wireless communication. In this work we present a new wireless sensor network (WSN) system, designed and developed for landslide monitoring in the framework of the EU Wireless Sensor Network for Ground Instability Monitoring - Wi-GIM project (LIFE12 ENV/IT/001033). We show the preliminary performance of the Wi-GIM system after the first period of monitoring on the active Roncovetro landslide and on a large subsiding area in the neighbourhood of the village of Sallent. The Roncovetro landslide is located in the province of Reggio Emilia (Italy) and moved an inferred volume of about 3 million cubic meters. Sallent is located at the centre of the Catalan evaporitic basin in Spain. The Wi-GIM WSN monitoring system consists of three levels: 1) the Master/Gateway level coordinates the WSN and performs data aggregation and local storage; 2) the Master/Server level takes care of acquiring and storing data on a remote server; 3) the Nodes level is based on a mesh of peripheral nodes, each consisting of a sensor board equipped with sensors and a wireless module. The nodes are located in the landslide ground perimeter and are able to create an ad-hoc WSN. The location of each sensor on the ground is

  7. Sensors Fusion based Online Mapping and Features Extraction of Mobile Robot in the Road Following and Roundabout

    Science.gov (United States)

    Ali, Mohammed A. H.; Mailah, Musa; Yussof, Wan Azhar B.; Hamedon, Zamzuri B.; Yussof, Zulkifli B.; Majeed, Anwar P. P.

    2016-02-01

    A road-feature-extraction-based mapping system using a sensor fusion technique for mobile robot navigation in road environments is presented in this paper. Online mapping is performed continuously as the mobile robot moves through the road environment, to find the road properties that enable the robot to move from a given start position to a pre-determined goal while discovering and detecting any roundabout. The sensor fusion, involving a laser range finder, a camera and odometry installed on a new platform, is used to find the path of the robot and localize it within its environment. Local maps are developed using the camera and laser range finder to recognize road border parameters such as road width, curbs and roundabouts. Results show the capability of the robot, with the proposed algorithms, to effectively identify the road environment and build a local map for road following and roundabout detection.

  8. Multi-sensor fusion system using wavelet-based detection algorithm applied to physiological monitoring under high-G environment

    Science.gov (United States)

    Ryoo, Han Chool

    2000-06-01

    A significant problem in physiological state monitoring systems with single data channels is a high rate of false alarms. To reduce the false alarm probability, several data channels can be integrated to enhance system performance. In this work, we have investigated a sensor fusion methodology applicable to physiological state monitoring, which combines local decisions made by dispersed detectors. The difficulties in biophysical signal processing are associated with nonstationary signal patterns and the individual characteristics of human physiology, which result in nonidentical observation statistics. Thus a two-compartment design, a modified version of well-established fusion theory in communication systems, is presented and applied to biological signal processing, combining discrete wavelet transforms (DWT) with sensor fusion theory. The signals were decomposed in the time-frequency domain by the DWT to capture localized transient features. Local decisions by wavelet power analysis are followed by global decisions at the data fusion center, operating under an optimization criterion, i.e., the minimum error criterion (MEC). We used three signals acquired from human volunteers exposed to high-G forces at the human centrifuge/dynamic flight simulator facility in Warminster, PA. The subjects performed anti-G straining maneuvers (AGSM) to protect themselves from the adverse effects of high-G forces. These maneuvers require muscular tensing and altered breathing patterns. We attempted to determine the subject's state by detecting the presence or absence of the voluntary maneuvers. During the exposure to high G force, the respiratory patterns, blood pressure and electroencephalogram (EEG) were measured to determine changes in the subject's state. Experimental results show that the probability of false alarm under the MEC can be significantly reduced by applying the same rule found at local thresholds to all subjects, and the MEC can be employed as a
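    As a toy sketch of the two-stage architecture, wavelet-based local decisions followed by a fusion-center rule (here a simple k-out-of-N vote stands in for the minimum error criterion, and all signals and thresholds are invented):

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar DWT: approximation and detail coefficients."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def local_decision(signal, threshold):
    """Local detector: flag the channel if detail-band energy is high,
    a stand-in for the paper's wavelet power analysis."""
    _, detail = haar_dwt(signal)
    return 1 if np.sum(detail ** 2) > threshold else 0

def fusion_center(decisions, k):
    """k-out-of-N vote; the paper instead optimizes under the MEC."""
    return sum(decisions) >= k

# invented channels: two show abrupt straining transients, one is quiet
rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 64)
strain_a = np.sign(np.sin(40.0 * t)) + 0.05 * rng.standard_normal(64)
strain_b = np.sign(np.sin(40.0 * t)) + 0.05 * rng.standard_normal(64)
quiet = 0.05 * rng.standard_normal(64)

decisions = [local_decision(s, 1.0) for s in (strain_a, strain_b, quiet)]
alarm = fusion_center(decisions, k=2)
```

    Requiring agreement between channels is what suppresses single-channel false alarms; the MEC additionally tunes the local thresholds and the fusion rule to minimize the overall error probability.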

  9. Nano-based chemical sensor array systems for uninhabited ground and airborne vehicles

    Science.gov (United States)

    Brantley, Christina; Ruffin, Paul B.; Edwards, Eugene

    2009-03-01

    At a time when homemade explosive devices are being used against soldiers and in the homeland security environment, it is becoming increasingly evident that there is an urgent need for high-tech chemical sensor packages mounted aboard ground and air vehicles to aid soldiers in determining, from a safe distance, the location of explosive devices and the origin of bio-chemical warfare agents associated with terrorist activities. Current technologies utilize relatively large handheld detection systems that are housed on sizeable robotic vehicles. Research and development efforts are underway at the Army Aviation & Missile Research, Development, and Engineering Center (AMRDEC) to develop novel and less expensive nano-based chemical sensors for detecting explosives and chemical agents used against the soldier. More specifically, an array of chemical sensors integrated with an electronics control module on a flexible substrate, which can conform to and be surface-mounted on manned or unmanned vehicles to detect harmful species from bio-chemical warfare agents and other explosive devices, is being developed. The system under development is a voltammetry-based sensor system capable of aiding in the detection of any chemical agent and in the optimization of sensor microarray geometry, providing nonlinear Fourier algorithms to characterize the target area background (e.g., footprint areas). The status of the research project is reviewed in this paper. Critical technical challenges associated with achieving system cost, size, and performance requirements are discussed. The results obtained from field tests using an unmanned remote-controlled vehicle that houses a CO2/chemical sensor, which detects harmful chemical agents and wirelessly transmits warning signals back to the warfighter, are presented. Finally, the technical barriers associated with employing the sensor array system aboard small air vehicles are discussed.

  10. Fusion of multi-sensor surface elevation data for a better characterization of rapidly changing outlet glaciers in Greenland

    Science.gov (United States)

    Schenk, A. F.; Csatho, B. M.; McCormick, D. P.; Van der Veen, C. J.

    2013-12-01

    During the last two decades surface elevation data have been gathered over the Greenland Ice Sheet (GrIS) from a variety of different sensors, such as spaceborne and airborne laser altimetry (ICESat, ATM and LVIS) as well as stereo imaging systems, most notably ASTER and Worldview. The spatio-temporal resolution, the accuracy, and the spatial coverage of these data differ widely. For example, laser altimetry systems are much more accurate than DEMs derived by correlation from imaging systems. On the other hand, DEMs usually have a superior spatial resolution and extended spatial coverage. We have developed the SERAC (Surface Elevation Reconstruction And Change detection) system to cope with the data complexity and the computation of elevation change histories. SERAC simultaneously determines the ice sheet surface shape and the time-series of elevation changes for surface patches whose size depends on the ruggedness of the surface and the point distribution of the sensors involved. By incorporating different sensors, SERAC is a true fusion system that generates the best plausible result (a time-series of elevation changes), a result that is better than the sum of its individual parts. We present detailed examples from Kangerlussuaq and Helheim glaciers, involving ICESat, ATM and LVIS laser altimetry data together with ASTER DEMs. ASTER DEMs are readily available but notorious for their variable accuracy: errors may occasionally far exceed the nominally stated ~15 m. By embedding ASTER DEMs into the SERAC time-series of elevation changes, we are able to determine plausible corrections. Thus, we can use ASTER DEMs to temporally and spatially densify the elevation change record.
    This is especially important on rapidly changing outlet glaciers, where laser altimetry data might only be available sporadically. To investigate the mechanisms controlling their behavior, we reconstructed elevation change histories along the central flowlines of these

  11. Multi-sensors data and information fusion algorithm for indoor localization

    Directory of Open Access Journals (Sweden)

    XIA Jun

    2015-02-01

    Full Text Available The IMU-based localization algorithm is an autonomous localization method, but it suffers from drift and accumulated error. This paper therefore proposes a multi-sensor data and information fusion algorithm for indoor localization that combines wearable multiple IMUs with an indoor wireless sensor network (IWSN). On the one hand, almost all IMU-based indoor localization algorithms use only one IMU, and a single-IMU algorithm cannot judge a person's posture precisely; an appropriate solution is to let multiple IMUs cooperate in the localization process and to fuse the position information of the IMUs with a fuzzy voting scheme. On the other hand, to overcome drift and accumulated error, the indoor IWSN is combined with the IMUs, and the position information calculated by the IWSN and the multiple IMUs is fused via a Kalman filter. Experimental results show that, compared with the traditional IMU-based indoor localization algorithm, the proposed algorithm judges the posture of the person well and reduces drift and accumulated error.
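
    The Kalman-filter step of such a scheme can be sketched in one dimension; the process and measurement variances (`q`, `r`) and the drift figure below are invented for illustration, not taken from the paper:

```python
# Illustrative 1-D sketch (not the paper's implementation): an IMU-style
# dead-reckoning estimate drifts over time, and periodic IWSN position
# fixes are merged in with a scalar Kalman filter to bound that drift.

def kalman_fuse(increments, fixes, q=0.01, r=0.25):
    """increments: per-step displacements from the IMU.
    fixes: dict step -> absolute IWSN position (absent between fixes)."""
    x, p = 0.0, 1.0                      # state estimate and its variance
    track = []
    for k, dx in enumerate(increments):
        x, p = x + dx, p + q             # predict: integrate the IMU step
        z = fixes.get(k)
        if z is not None:                # update: fuse an IWSN fix
            gain = p / (p + r)
            x += gain * (z - x)
            p *= (1.0 - gain)
        track.append(x)
    return track

# A walker moves +1.0 per step but the IMU over-reads by 5% (drift);
# IWSN fixes every 5 steps pull the estimate back toward the truth.
steps = [1.05] * 20
fixes = {k: float(k + 1) for k in range(4, 20, 5)}
est = kalman_fuse(steps, fixes)
print(est[-1])   # close to the true 20.0 despite the IMU bias
```

Raw integration of the same increments would end at 21.0; the fused track stays markedly closer to the true 20.0.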

  12. A Locomotion Intent Prediction System Based on Multi-Sensor Fusion

    Directory of Open Access Journals (Sweden)

    Baojun Chen

    2014-07-01

    Full Text Available Locomotion intent prediction is essential for the control of powered lower-limb prostheses to realize smooth locomotion transitions. In this research, we develop a multi-sensor fusion based locomotion intent prediction system, which can recognize the current locomotion mode and detect locomotion transitions in advance. Seven able-bodied subjects were recruited for this research. Signals from two foot pressure insoles and three inertial measurement units (one on the thigh, one on the shank, and the other on the foot) are measured. A two-level recognition strategy with a linear discriminant classifier is used for the recognition. Six kinds of locomotion modes and ten kinds of locomotion transitions are tested in this study. Recognition accuracy during steady locomotion periods (i.e., no locomotion transitions) is 99.71% ± 0.05% for the seven able-bodied subjects. During locomotion transition periods, all the transitions are correctly detected and most of them can be detected before transiting to new locomotion modes. No significant deterioration in recognition performance is observed in the five hours after the system is trained, and a small number of experimental trials is required to train reliable classifiers.
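
    The linear-discriminant idea can be reduced to a toy one-dimensional case; the feature and the two locomotion modes below are hypothetical, not the authors' actual feature set or classes:

```python
# Toy 1-D linear discriminant (equal priors, equal variances assumed):
# class means define a midpoint decision threshold between two modes.

def train_lda_1d(class_a, class_b):
    ma = sum(class_a) / len(class_a)
    mb = sum(class_b) / len(class_b)
    return (ma + mb) / 2.0, ma < mb      # threshold, and whether b is high

def classify(x, threshold, b_is_high):
    high = x > threshold
    return "b" if high == b_is_high else "a"

# e.g. a shank-angle-like feature: low for level walking, high for stairs.
walk, stairs = [1.0, 1.2, 0.9], [2.4, 2.6, 2.5]
thr, b_high = train_lda_1d(walk, stairs)
print(classify(2.3, thr, b_high))   # -> b  (stair-like sample)
```

The real system uses multi-channel insole and IMU features and a two-level strategy; this only illustrates the midpoint-threshold form a linear discriminant takes in the simplest case.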

  13. Large-Scale, Multi-Sensor Atmospheric Data Fusion Using Hybrid Cloud Computing

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E. J.

    2015-12-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, MODIS, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration analyses presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over 10 years of data. HySDS is a Hybrid-Cloud Science Data System that has been developed and applied under NASA AIST, MEaSUREs, and ACCESS grants. HySDS uses the SciFlow workflow engine to partition analysis workflows into parallel tasks (e.g., segmenting by time or space) that are pushed into a durable job queue. The tasks are "pulled" from the queue by worker Virtual Machines (VMs) and executed in an on-premise Cloud (Eucalyptus or OpenStack) or at Amazon in the public Cloud or GovCloud. In this way, years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the transferred data. We are using HySDS to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a MEaSUREs grant. We will present the architecture of HySDS, describe the achieved "clock time" speedups in fusing datasets on our own nodes and in the Amazon Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer. Our system demonstrates how one can pull A-Train variables (Levels 2 & 3) on demand into the Amazon Cloud, and cache only those variables that are heavily used, so that any number of compute jobs can be
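
    The push/pull worker pattern described above can be sketched with a small stand-in; the task payloads and "work" below are invented, and threads stand in for the worker VMs:

```python
import queue
import threading

# Minimal sketch of the durable-queue pattern: a workflow is partitioned
# into independent tasks, and workers pull and execute them in parallel.

def worker(jobs, results):
    while True:
        try:
            year = jobs.get_nowait()
        except queue.Empty:
            return                       # queue drained, worker exits
        results.append((year, f"climatology-{year} done"))  # stand-in work
        jobs.task_done()

jobs, results = queue.Queue(), []
for year in range(2003, 2013):           # ten years of data, one task each
    jobs.put(year)
workers = [threading.Thread(target=worker, args=(jobs, results))
           for _ in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()
print(len(results))   # -> 10 tasks completed across 4 workers
```

The real system persists the queue durably and runs workers as cloud VMs, but the control flow (partition, enqueue, pull, execute) has this shape.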

  14. Real-time EO/IR sensor fusion on a portable computer and head-mounted display

    Science.gov (United States)

    Yue, Zhanfeng; Topiwala, Pankaj

    2007-04-01

    Multi-sensor platforms are widely used in surveillance video systems for both military and civilian applications. The complementary nature of different types of sensors (e.g., EO and IR sensors) makes it possible to observe the scene under almost any condition (day/night/fog/smoke). In this paper, we propose an innovative EO/IR sensor registration and fusion algorithm which runs in real time on a portable computing unit with a head-mounted display. The EO/IR sensor suite is mounted on the helmet of a dismounted soldier, and the fused scene is shown in the goggle display after processing on the portable computing unit. The linear homography transformation between images from the two sensors is precomputed for the mid-to-far scene, which reduces the computational cost of the online calibration of the sensors. The system is implemented in highly optimized C++ code with MMX/SSE and performs real-time registration. The experimental results on real captured video show that the system works very well in both speed and performance.
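
    The precomputed-homography step amounts to mapping each IR pixel into the EO frame through a fixed 3×3 matrix; the matrix values below are invented for illustration, and the paper's actual calibration is not shown:

```python
# Minimal sketch of homography-based registration: for mid-to-far scenes
# a fixed 3x3 matrix H maps IR pixel coordinates into the EO frame, so no
# per-frame calibration is needed.

def apply_homography(H, x, y):
    """Map pixel (x, y) through H using homogeneous coordinates."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return u / w, v / w                  # perspective divide

# Example: a similarity-like homography (scale 1.1, small offset).
H = [[1.1, 0.0, 4.0],
     [0.0, 1.1, -2.0],
     [0.0, 0.0, 1.0]]
u, v = apply_homography(H, 100.0, 50.0)
print(u, v)   # approximately (114, 53)
```

In the fused display, every IR pixel is resampled at its mapped EO location; since H is constant for the mid-to-far scene, this reduces to one matrix-vector product per pixel.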

  15. The research of auto-focusing method for the image mosaic and fusion system with multi-sensor

    Science.gov (United States)

    Pang, Ke; Yao, Suying; Shi, Zaifeng; Xu, Jiangtao; Liu, Jiangming

    2013-09-01

    In modern image processing, thanks to the development of digital image processing, the focus of a sensor can be set automatically by the digital processing system through computation. On the other hand, focusing all sensors synchronously and consistently is one of the most important factors for image mosaic and fusion processing, especially for a multi-sensor system whose sensors are placed in a line in order to capture wide-angle video information. Images sampled by sensors with different focal length values always increase the complexity of the affine matrix used in the subsequent image mosaic and fusion, potentially reducing the efficiency of the system and consuming more power. Here, a new fast evaluation method based on the gray-value variance of image pixels is proposed to find a common focal length value for all sensors that achieves good image sharpness. For the multi-frame pictures sampled from different sensors that have been adjusted and can be regarded as time-synchronized, the gray-value variances of adjacent pixels are computed to generate one curve. This curve is the focus measure function, which describes the relationship between image sharpness and the focal length value of the sensor. On the basis of all the focus measure functions of all sensors in the image processing system, this paper uses the least squares method to fit the discrete curves, gives one objective function for the multi-sensor system, and then finds the optimal solution corresponding to the extreme value of image sharpness by evaluating the objective function. This optimal focal length value is the common parameter for all sensors in the system.
    By setting the common focal length value, on the premise of ensuring image sharpness, the computation of the affine matrix, which is the core processing of the image mosaic and fusion that stitches all those pictures into one wide-angle image, will be
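
    The gray-value variance focus measure described above can be sketched directly; the tiny 2×2 "images" are invented stand-ins for real frames:

```python
# Hedged sketch of the variance focus measure: sharper images have larger
# gray-value variance, so sweeping focal settings and picking the variance
# peak approximates the best focus for each sensor.

def focus_measure(image):
    """Gray-value variance of a 2-D image given as a list of rows."""
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

sharp = [[0, 255], [255, 0]]        # high contrast -> in focus
blurry = [[120, 135], [135, 120]]   # low contrast -> defocused
print(focus_measure(sharp), focus_measure(blurry))
```

Evaluating this measure across a sweep of focal settings yields the per-sensor focus measure curve; the paper then least-squares-fits those curves and maximizes a joint objective to pick one common focal length.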

  16. Development of a Pedestrian Indoor Navigation System Based on Multi-Sensor Fusion and Fuzzy Logic Estimation Algorithms

    Science.gov (United States)

    Lai, Y. C.; Chang, C. C.; Tsai, C. M.; Lin, S. Y.; Huang, S. C.

    2015-05-01

    This paper presents a pedestrian indoor navigation system based on multi-sensor fusion and fuzzy logic estimation algorithms. The proposed navigation system is a self-contained dead reckoning navigation system, meaning that no outside signal is demanded. In order to achieve this self-contained capability, a portable and wearable inertial measurement unit (IMU) has been developed. Its adopted sensors are low-cost inertial sensors, an accelerometer and a gyroscope, based on micro electro-mechanical systems (MEMS). There are two types of IMU modules, handheld and waist-mounted. The low-cost MEMS sensors suffer from various errors resulting from manufacturing imperfections and other effects. Therefore, a sensor calibration procedure based on the scalar calibration and least squares methods has been introduced in this study to improve the accuracy of the inertial sensors. With the calibrated data acquired from the inertial sensors, the step length and strength of the pedestrian are estimated by the multi-sensor fusion and fuzzy logic estimation algorithms. The developed multi-sensor fusion algorithm provides the number of walking steps and the strength of each step in real time. Consequently, the estimated walking amount and strength per step are fed into the proposed fuzzy logic estimation algorithm to estimate the step lengths of the user. Since the walking length and direction are both required for dead reckoning navigation, the walking direction is calculated by integrating the angular rate acquired by the gyroscope of the developed IMU module. Both the walking length and direction are calculated on the IMU module and transmitted via Bluetooth to a smartphone, which performs the dead reckoning navigation in a self-developed app. Due to the error accumulation of dead reckoning navigation, a particle filter and a pre-loaded map of the indoor environment have been applied in the app of the proposed navigation system to extend its
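
    The core dead-reckoning update, integrating the gyro rate into a heading and advancing the position by one step length, can be sketched as follows; the step lengths, sample rate, and turn profile are invented:

```python
import math

# Illustrative dead-reckoning step (not the authors' code): heading is
# the integral of the gyro's angular rate, and each detected step moves
# the position by the estimated step length along that heading.

def dead_reckon(steps, dt=0.01):
    """steps: list of (gyro_rate_samples_rad_s, step_length_m) per step."""
    x, y, heading = 0.0, 0.0, 0.0
    for rates, length in steps:
        for w in rates:                  # integrate angular rate
            heading += w * dt
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    return x, y, heading

# Two 0.7 m steps straight ahead, then a 90-degree left turn (spread over
# one second of gyro samples) with a third 0.7 m step.
straight = ([0.0] * 100, 0.7)
left_turn = ([math.pi / 2] * 100, 0.7)
x, y, h = dead_reckon([straight, straight, left_turn])
print(round(x, 2), round(y, 2))   # -> 1.4 0.7
```

Because every step compounds the heading and length errors, the paper adds a particle filter constrained by a pre-loaded indoor map on top of exactly this kind of update.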

  17. DEVELOPMENT OF A PEDESTRIAN INDOOR NAVIGATION SYSTEM BASED ON MULTI-SENSOR FUSION AND FUZZY LOGIC ESTIMATION ALGORITHMS

    Directory of Open Access Journals (Sweden)

    Y. C. Lai

    2015-05-01

    Full Text Available This paper presents a pedestrian indoor navigation system based on multi-sensor fusion and fuzzy logic estimation algorithms. The proposed navigation system is a self-contained dead reckoning navigation system, meaning that no outside signal is demanded. In order to achieve this self-contained capability, a portable and wearable inertial measurement unit (IMU) has been developed. Its adopted sensors are low-cost inertial sensors, an accelerometer and a gyroscope, based on micro electro-mechanical systems (MEMS). There are two types of IMU modules, handheld and waist-mounted. The low-cost MEMS sensors suffer from various errors resulting from manufacturing imperfections and other effects. Therefore, a sensor calibration procedure based on the scalar calibration and least squares methods has been introduced in this study to improve the accuracy of the inertial sensors. With the calibrated data acquired from the inertial sensors, the step length and strength of the pedestrian are estimated by the multi-sensor fusion and fuzzy logic estimation algorithms. The developed multi-sensor fusion algorithm provides the number of walking steps and the strength of each step in real time. Consequently, the estimated walking amount and strength per step are fed into the proposed fuzzy logic estimation algorithm to estimate the step lengths of the user. Since the walking length and direction are both required for dead reckoning navigation, the walking direction is calculated by integrating the angular rate acquired by the gyroscope of the developed IMU module. Both the walking length and direction are calculated on the IMU module and transmitted via Bluetooth to a smartphone, which performs the dead reckoning navigation in a self-developed app. 
    Due to the error accumulation of dead reckoning navigation, a particle filter and a pre-loaded map of the indoor environment have been applied in the app of the proposed navigation system

  18. Improved Maturity and Ripeness Classifications of Magnifera Indica cv. Harumanis Mangoes through Sensor Fusion of an Electronic Nose and Acoustic Sensor

    Directory of Open Access Journals (Sweden)

    Latifah Munirah Kamarudin

    2012-05-01

    Full Text Available In recent years, there have been a number of reported studies on the use of non-destructive techniques to evaluate and determine mango maturity and ripeness levels. However, most of these reported works were conducted using single-modality sensing systems, either an electronic nose, acoustics, or other non-destructive measurements. This paper presents work on the classification of mango (Magnifera Indica cv. Harumanis) maturity and ripeness levels using fusion of the data of an electronic nose and an acoustic sensor. Three groups of samples, each from two different harvesting times (week 7 and week 8), were evaluated by the e-nose and then by the acoustic sensor. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) were able to discriminate the mangoes harvested at week 7 and week 8 based solely on the aroma and volatile gases released from the mangoes. However, when six different groups of different maturity and ripeness levels were combined in one classification analysis, both PCA and LDA were unable to discriminate the age difference of the Harumanis mangoes. Instead of six different groups, only four were observed using the LDA, while PCA showed only two distinct groups. By applying a low-level data fusion technique on the e-nose and acoustic data, the classification of maturity and ripeness levels using LDA was improved. However, no significant improvement was observed using PCA with the data fusion technique. Further work using a hybrid LDA-Competitive Learning Neural Network (LDA-CLNN) was performed to validate the fusion technique and classify the samples. It was found that the LDA-CLNN also improved significantly when data fusion was applied.
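
    Low-level data fusion typically means normalizing each modality's feature vector and concatenating them before classification; the details below (z-score normalization, the feature values) are assumptions for illustration, not the paper's exact recipe:

```python
# Sketch of low-level feature fusion: each sensor's features are z-score
# normalised separately (so neither modality dominates by scale), then the
# normalised vectors are concatenated for downstream PCA/LDA.

def zscore(features):
    mean = sum(features) / len(features)
    var = sum((f - mean) ** 2 for f in features) / len(features)
    sd = var ** 0.5 or 1.0               # guard against zero variance
    return [(f - mean) / sd for f in features]

def fuse(enose_features, acoustic_features):
    """Low-level fusion: normalise per modality, then concatenate."""
    return zscore(enose_features) + zscore(acoustic_features)

fused = fuse([2.1, 4.3, 3.0], [880.0, 910.0])
print(len(fused))   # -> 5 features feed the downstream classifier
```

The key property is that e-nose voltages and acoustic measurements end up on a comparable scale in one vector, which is what lets a single LDA operate on both modalities at once.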

  19. Crosstalk suppression in networked resistive sensor arrays using virtual ground technique

    Science.gov (United States)

    Sahai Saxena, Raghvendra; Semwal, Sushil Kumar; Singh Rana, Pratap; Bhan, R. K.

    2013-11-01

    In 2D resistive sensor arrays, the interconnections are reduced considerably by sharing rows and columns among the sensor elements in such a way that one end of each sensor is connected to a row node and the other end to a column node. This scheme results in a total of N + M interconnections for an N × M array of sensors. It thus simplifies the interconnect complexity but suffers from crosstalk among its elements. We experimentally demonstrate that this problem can be overcome by putting all the row nodes at virtually equal potential using the virtual ground of high-gain operational amplifiers in negative feedback. Although it requires a large number of op-amps, it solves the crosstalk problem to a large extent. Additionally, we obtain the response of all the sensors lying in a column simultaneously, resulting in a faster scanning capability. By performing lock-in-amplifier-based measurements on a light dependent resistor at a randomly selected location in a 4 × 4 array of otherwise fixed-valued resistors, we have shown that the technique can provide 86 dB of crosstalk suppression even with a simple op-amp. Finally, we demonstrate the circuit implementation of this technique for a 16 × 16 imaging array of light dependent resistors.
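
    The reason virtual grounding suppresses crosstalk is simple Ohm's-law arithmetic: with every row held at the same potential, no current can flow between rows through unselected elements, so each column current reads one sensor directly. The voltages and resistances below are invented example values:

```python
# Numerical sketch of the virtual-ground readout: with all rows at the
# same (virtual ground) potential, parasitic row-to-row paths carry no
# current, so each element's current is just V_drive / R, independent of
# its neighbours.

def column_current(v_drive, column_resistances):
    """Per-element currents for one driven column, rows at virtual ground."""
    return [v_drive / r for r in column_resistances]

# A 4-element column: three fixed 10 kOhm resistors and one light-
# dependent resistor that has dropped to 2 kOhm under illumination.
currents = column_current(5.0, [10e3, 10e3, 2e3, 10e3])
print([round(i * 1e3, 2) for i in currents])   # mA: the LDR stands out
```

In the real circuit each row feeds an op-amp transimpedance stage, so these currents are read out simultaneously as voltages, which is what gives the faster column-parallel scan.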

  20. The Phased Array Terrain Interferometer (PathIn): A New Sensor for UAS Synthetic Vision and Ground Collision Avoidance Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal introduces an innovative sensor to advance ground collision avoidance for UAS platforms by providing real-time height maps for hazard anomaly...

  1. Acoustic detection and localization of weapons fire by unattended ground sensors and aerostat-borne sensors

    Science.gov (United States)

    Naz, P.; Marty, Ch.; Hengy, S.; Miller, L. S.

    2009-05-01

    The detection and localization of artillery guns on the battlefield is envisaged by means of acoustic and seismic waves. The main objective of this work is to examine the different frequency ranges usable for the detection of small arms, mortars, and artillery guns on the same hardware platform. The main stages of this study have consisted of: data acquisition of the acoustic signals of the different weapons used, signal processing and evaluation of the localization performance for various types of individual arrays, and modeling of the wave propagation in the atmosphere. The study of the propagation effects on the signatures of these weapons is done by comparing the acoustic signals measured during various days, at ground level and at the altitude of our aerostat (typically 200 m). Numerical modeling has also been performed to reinforce the interpretation of the experimental results.
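
    A minimal form of the acoustic localization step is far-field direction finding from the time difference of arrival across a microphone baseline; the baseline length and delay below are invented example numbers:

```python
import math

# Back-of-envelope sketch of acoustic direction finding with a two-
# microphone baseline: the time difference of arrival dt over baseline d
# gives the bearing via cos(theta) = c * dt / d (far-field assumption,
# c = speed of sound).

def bearing_from_tdoa(dt, baseline, c=343.0):
    """Angle from the baseline axis, in degrees."""
    cos_theta = max(-1.0, min(1.0, c * dt / baseline))  # clamp for noise
    return math.degrees(math.acos(cos_theta))

# A muzzle blast arriving 1.0 ms earlier at one microphone of a 1 m pair:
b = bearing_from_tdoa(1.0e-3, 1.0)
print(round(b, 1))   # roughly 70 degrees off the baseline
```

Real arrays use several such pairs (and, here, both ground-level and aerostat-borne sensors) so that bearings can be intersected into a position estimate.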

  2. Hybrid Motion Planning Method for Autonomous Robots Using Kinect Based Sensor Fusion and Virtual Plane Approach in Dynamic Environments

    Directory of Open Access Journals (Sweden)

    Doopalam Tuvshinjargal

    2015-01-01

    Full Text Available A new reactive motion planning method for an autonomous vehicle in dynamic environments is proposed. The new dynamic motion planning method combines a virtual plane based reactive motion planning technique with a sensor fusion based obstacle detection approach, which improves the robustness and autonomy of vehicle navigation within unpredictable dynamic environments. The key feature of the new reactive motion planning method is a local observer in the virtual plane, which allows the effective transformation of complex dynamic planning problems into simple stationary ones in the virtual plane. In addition, a sensor fusion based obstacle detection technique provides the pose estimation of moving obstacles by using a Kinect sensor and a sonar sensor, which helps to improve the accuracy and robustness of the reactive motion planning approach in uncertain dynamic environments. The performance of the proposed method was demonstrated through not only simulation studies but also field experiments using multiple moving obstacles, even in hostile environments where conventional methods failed.

  3. Information-based sensor management for the intelligent tasking of ground penetrating radar and electromagnetic induction sensors in landmine detection pre-screening

    Science.gov (United States)

    Kolba, Mark P.; Collins, Leslie M.

    2010-04-01

    Previous work has introduced a framework for information-based sensor management that is capable of tasking multiple sensors searching for targets among a set of discrete objects or in a cell grid. However, in many real-world scenarios, such as detecting landmines along a lane or road, an unknown number of targets is present in a continuous spatial region of interest. Consequently, this paper introduces a grid-free sensor management approach that allows multiple sensors to be managed in a sequential search for targets in a grid-free spatial region. Simple yet expressive Gaussian target models are introduced to model the spatial target responses that are observed by the sensors. The sensor manager is then formulated using a Bayesian approach, and sensors are directed to make new observations that maximize the expected information gain between the posterior density on the target parameters after a new observation and the current posterior target parameter density. The grid-free sensor manager is applied to a set of real landmine detection data collected with ground penetrating radar (GPR) and electromagnetic induction (EMI) sensors at a U.S. government test site. Results are presented that compare the performance of the sensor manager with the performance of an unmanaged joint pre-screener that fuses individual GPR and EMI pre-screeners. The sensor manager is demonstrated to provide improved detection performance while requiring substantially fewer sensor observations than the unmanaged joint pre-screening approach.
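
    The expected-information-gain criterion can be illustrated with a two-hypothesis discrete toy problem (the paper works with continuous Gaussian target densities; all numbers below are invented):

```python
import math

# Toy discrete version of the information-based criterion: among candidate
# observations, pick the one whose expected posterior entropy is lowest,
# i.e. whose expected information gain (prior entropy minus expected
# posterior entropy) is highest.

def entropy(p):
    return -sum(q * math.log2(q) for q in p if q > 0)

def expected_gain(prior, likelihoods):
    """likelihoods[o][h] = P(observation o | hypothesis h)."""
    gain = entropy(prior)
    for like in likelihoods:
        evid = sum(l * p for l, p in zip(like, prior))   # P(observation o)
        if evid > 0:
            post = [l * p / evid for l, p in zip(like, prior)]
            gain -= evid * entropy(post)                 # expected posterior H
    return gain

prior = [0.5, 0.5]                       # target present / absent
informative = [[0.9, 0.1], [0.1, 0.9]]   # sensor separates the cases well
uninformative = [[0.5, 0.5], [0.5, 0.5]]
print(expected_gain(prior, informative),
      expected_gain(prior, uninformative))
```

The manager computes the continuous analogue of this quantity (a KL divergence between posterior and prior over target parameters) for each candidate sensor action and takes the maximizer.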

  4. Sensor data fusion and image processing for object and hazard detection; Sensordatenfusion und Bildverarbeitung zur Objekt- und Gefahrenerkennung

    Energy Technology Data Exchange (ETDEWEB)

    Catala Prat, Alvaro

    2011-03-15

    The present work deals with the automatic detection and tracking of objects in driving situations as well as the derivation of potential hazards. To do this, the data of a laser scanner and a camera are processed and fused. The work provides new methods in the area of immediate environment detection and modeling, and thus creates a basis for innovative driver assistance and automation systems. The aim of such systems is to improve driving safety, traffic efficiency, and driving comfort. The methods introduced in this work can be classified into different abstraction levels: At the sensor data level, the data is prepared and reduced. Here the focus is set especially on the detection of driving oscillations from camera images and on the detection of the driving corridor from the data of different sensors, used later as the primary area of interest. At the object level, the central data fusion is done. High reliability, availability, and sensor independence are achieved by choosing a competitive object fusion approach. As input to the data fusion, object observations are extracted from camera and laser scanner data. These are then fused with the aim of object detection and tracking, where aspects such as robustness against manoeuvring objects, measurement outliers, split and merge effects, and partial object observability are addressed. At the application level, early detection of potential hazards is addressed. A statistical approach has been chosen and developed in which hazards are handled as atypical situations. This general and expandable approach is shown exemplarily based on the detected object data. The presented strategies and methods have been developed systematically, implemented in a modular prototype, and tested with simulated and real data. The test results of the data fusion system show a gain in data quality and robustness, with which an improvement of driver assistance and automation systems can be reached. (orig.)

  5. A radar unattended ground sensor with micro-Doppler capabilities for false alarm reduction

    Science.gov (United States)

    Tahmoush, Dave; Silvious, Jerry; Burke, Ed

    2010-10-01

    Unattended ground sensors (UGS) provide the capability to inexpensively secure remote borders and other areas of interest. However, normal animal activity can often trigger a false alarm. Accurately detecting humans and distinguishing them from natural fauna is an important issue in security applications, in order to reduce false alarm rates and improve the probability of detection. In particular, it is important to detect and classify people who are moving in remote locations and to transmit back detections and analysis over extended periods at low cost and with minimal maintenance. We developed and demonstrate a compact radar technology that is scalable to a variety of ultra-lightweight and low-power platforms for wide-area persistent surveillance as an unattended, unmanned, and man-portable ground sensor. The radar uses micro-Doppler processing to characterize the tracks of moving targets, to eliminate unimportant detections due to animals, and to characterize the activity of human detections. False alarms are a major liability that hinders the widespread use of sensors. Incorporating rudimentary intelligence into sensors can reduce false alarms but can also reduce the probability of detection. Allowing an initial classification that can be updated with new observations and tracked over time provides a more robust framework for false alarm reduction, at the cost of additional sensor observations. This paper explores these tradeoffs with a small radar sensor for border security. Multiple measurements were performed to characterize the micro-Doppler of human versus animal and vehicular motion across a range of activities. Measurements were taken at multiple sites with realistic but low levels of clutter. Animals move with a quadrupedal motion, which can be distinguished from bipedal human motion. The micro-Doppler of a vehicle with rotating parts is also shown, along with ground truth images. Comparisons show large variations for
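
    Micro-Doppler processing boils down to short-time spectral analysis of the radar return; the minimal sketch below (invented signal, naive demo-sized DFT, not the sensor's processing chain) shows how a frame's dominant Doppler bin is found:

```python
import cmath
import math

# Minimal micro-Doppler sketch: a moving target returns a signal whose
# frequency content is modulated by limb or wheel motion; a short-time
# DFT over successive frames exposes that modulation. Here we locate the
# strongest bin of one frame with a naive O(n^2) DFT.

def dft_peak_bin(frame):
    """Index of the strongest DFT bin over the positive frequencies."""
    n = len(frame)
    mags = []
    for k in range(n // 2):
        s = sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        mags.append(abs(s))
    return max(range(len(mags)), key=mags.__getitem__)

# 64-sample frame of a tone at bin 7, standing in for a torso return:
frame = [math.cos(2 * math.pi * 7 * t / 64) for t in range(64)]
print(dft_peak_bin(frame))   # -> 7
```

Tracking how the peak (and the sidebands from limbs) moves frame-to-frame is what separates the bipedal, quadrupedal, and rotating-part signatures discussed above.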

  6. Detection capability of a pulsed Ground Penetrating Radar utilizing an oscilloscope and Radargram Fusion Approach for optimal signal quality

    Science.gov (United States)

    Seyfried, Daniel; Schoebel, Joerg

    2015-07-01

    In scientific research, pulsed radars often employ a digital oscilloscope as the sampling unit. The sensitivity of an oscilloscope is determined in general by the number of bits of its analog-to-digital converter and the selected full-scale vertical setting, i.e., the maximal voltage range displayed. Furthermore, oversampling or averaging of the input signal may increase the effective number of bits, and hence the sensitivity. High sensitivity is especially demanded of Ground Penetrating Radar systems, since the reflection amplitudes of buried objects are strongly attenuated in the ground; in order to achieve high detection capability, this parameter is one of the most crucial ones. In this paper we analyze the detection capability of our pulsed radar system utilizing a Rohde & Schwarz RTO 1024 oscilloscope as the sampling unit for Ground Penetrating Radar applications, such as the detection of pipes and cables in the ground. The effects of averaging and low-noise amplification of the received signal prior to sampling are also investigated by means of an appropriate laboratory setup. To underline our findings, we then present real-world radar measurements performed on our GPR test site, where we have buried pipes and cables of different types and materials at different depths. The results illustrate the need to choose the oscilloscope settings properly for optimal data recording. However, as we show, displaying both strong signal contributions, due to, e.g., antenna cross-talk and the direct ground bounce reflection, as well as weak reflections from objects buried deeper in the ground requires opposing trends in the oscilloscope's settings. We therefore present our Radargram Fusion Approach. By means of this approach, multiple radargrams recorded in parallel, each with an individual setting optimized for a certain type of contribution, can be fused in an appropriate way in order to finally achieve a single radargram which displays all
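
    The averaging effect mentioned above is easy to check numerically: averaging n repeated traces reduces the noise standard deviation by about √n, which is one way the effective number of bits rises. The simulation parameters below are invented:

```python
import math
import random

# Quick numerical check of trace averaging: the std of the mean of n
# i.i.d. Gaussian noise samples is sigma / sqrt(n), so 16x averaging
# should shrink the noise by roughly a factor of 4.

def noise_std_after_averaging(n_avg, n_trials=2000, sigma=1.0, seed=1):
    rng = random.Random(seed)
    means = [sum(rng.gauss(0, sigma) for _ in range(n_avg)) / n_avg
             for _ in range(n_trials)]
    m = sum(means) / n_trials
    return math.sqrt(sum((x - m) ** 2 for x in means) / n_trials)

single = noise_std_after_averaging(1)
avg16 = noise_std_after_averaging(16)
print(round(single / avg16, 1))   # close to sqrt(16) = 4
```

The practical caveat, as the abstract notes, is that averaging helps only the noise floor; the full-scale vertical setting still has to accommodate the strong cross-talk and ground-bounce contributions, which is what motivates the Radargram Fusion Approach.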

  7. Sensor Fusion of Monocular Cameras and Laser Rangefinders for Line-Based Simultaneous Localization and Mapping (SLAM) Tasks in Autonomous Mobile Robots

    Directory of Open Access Journals (Sweden)

    Xinzheng Zhang

    2012-01-01

    Full Text Available This paper presents a sensor fusion strategy applied to Simultaneous Localization and Mapping (SLAM) in dynamic environments. The designed approach consists of two features: (i) the first is a fusion module which synthesizes line segments obtained from the laser rangefinder and line features extracted from the monocular camera. This policy eliminates any pseudo segments that appear from any momentary pause of dynamic objects in the laser data. (ii) The second is a modified multi-sensor point estimation fusion SLAM (MPEF-SLAM) that incorporates two individual Extended Kalman Filter (EKF) based SLAM algorithms: monocular and laser SLAM. The localization error of the fused SLAM is reduced compared with that of each individual SLAM algorithm. Additionally, a new data association technique based on the homography transformation matrix is developed for monocular SLAM. This data association method reduces redundant computation. The experimental results validate the performance of the proposed sensor fusion and data association method.

  8. Sensor fusion of monocular cameras and laser rangefinders for line-based Simultaneous Localization and Mapping (SLAM) tasks in autonomous mobile robots.

    Science.gov (United States)

    Zhang, Xinzheng; Rad, Ahmad B; Wong, Yiu-Kwong

    2012-01-01

    This paper presents a sensor fusion strategy applied to Simultaneous Localization and Mapping (SLAM) in dynamic environments. The designed approach consists of two features: (i) the first is a fusion module which synthesizes line segments obtained from the laser rangefinder and line features extracted from the monocular camera. This policy eliminates any pseudo segments that appear from any momentary pause of dynamic objects in the laser data. (ii) The second is a modified multi-sensor point estimation fusion SLAM (MPEF-SLAM) that incorporates two individual Extended Kalman Filter (EKF) based SLAM algorithms: monocular and laser SLAM. The localization error of the fused SLAM is reduced compared with that of each individual SLAM algorithm. Additionally, a new data association technique based on the homography transformation matrix is developed for monocular SLAM. This data association method reduces redundant computation. The experimental results validate the performance of the proposed sensor fusion and data association method.
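
    The point-estimate fusion idea can be sketched in its simplest scalar form; the variances and coordinates below are invented, and the real MPEF-SLAM fuses full EKF state vectors with covariance matrices rather than scalars:

```python
# Hedged sketch of point-estimate fusion: two independent estimates of
# the same robot coordinate (say, from monocular and laser SLAM), each
# with its own variance, are merged by inverse-variance weighting, which
# never increases the fused variance.

def fuse_estimates(x1, var1, x2, var2):
    """Minimum-variance linear fusion of two independent estimates."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    return x, 1.0 / (w1 + w2)

# Laser SLAM is tighter (var 0.04) than monocular SLAM (var 0.09) here:
x, var = fuse_estimates(2.0, 0.09, 2.2, 0.04)
print(round(x, 3), round(var, 4))
```

The fused estimate lands between the inputs, weighted toward the more confident one, and its variance drops below either input's, which is why the fused SLAM's localization error is smaller than each individual algorithm's.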

  9. A Sensor Fusion Algorithm for Filtering Pyrometer Measurement Noise in the Czochralski Crystallization Process

    Directory of Open Access Journals (Sweden)

    M. Komperød

    2011-01-01

    Full Text Available The Czochralski (CZ) crystallization process is used to produce monocrystalline silicon for solar cell wafers and electronics. Tight temperature control of the molten silicon is most important for achieving high crystal quality. SINTEF Materials and Chemistry operates a CZ process. During one CZ batch, two pyrometers were used for temperature measurement. The silicon pyrometer measures the temperature of the molten silicon. This pyrometer is assumed to be accurate, but suffers from considerable high-frequency measurement noise. The graphite pyrometer measures the temperature of a graphite material. This pyrometer has little measurement noise, and the two pyrometer measurements are well correlated. This paper presents a sensor fusion algorithm that merges the two pyrometer signals to produce a temperature estimate with little measurement noise, while having significantly less phase lag than traditional lowpass filtering of the silicon pyrometer. The algorithm consists of two sub-algorithms: (i) a dynamic model that estimates the silicon temperature from the graphite pyrometer, and (ii) a lowpass filter and a highpass filter designed as complementary filters. The complementary filters are used to lowpass-filter the silicon pyrometer, highpass-filter the dynamic model output, and merge these filtered signals. Hence, the lowpass filter attenuates noise from the silicon pyrometer, while the graphite pyrometer and the dynamic model estimate those frequency components of the silicon temperature that are lost when lowpass-filtering the silicon pyrometer. The algorithm works well within a limited temperature range. To handle a larger temperature range, more research must be done to understand the process's nonlinear dynamics and build them into the dynamic model.
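
    The two-sub-algorithm structure described above (lowpass the noisy silicon pyrometer, highpass the model-driven estimate, and sum) can be sketched with first-order filters. This is a hedged illustration, not the authors' implementation: the filter form, the `alpha` constant, and the function names are assumptions.

```python
import numpy as np

def lowpass(x, alpha):
    """First-order IIR lowpass: y[k] = alpha*y[k-1] + (1-alpha)*x[k], seeded with x[0]."""
    y = np.empty_like(x, dtype=float)
    acc = x[0]
    for k, v in enumerate(x):
        acc = alpha * acc + (1.0 - alpha) * v
        y[k] = acc
    return y

def complementary_fuse(silicon, model_estimate, alpha=0.9):
    """Lowpass the noisy silicon pyrometer and highpass the dynamic-model
    output, then sum.  The highpass is built as (x - lowpass(x)), so the
    two filters are complementary by construction: their transfer
    functions sum to one and no frequency component is double-counted."""
    return lowpass(silicon, alpha) + (model_estimate - lowpass(model_estimate, alpha))
```

    If the model output tracks the true temperature, the fused signal keeps the slow trend from the silicon pyrometer while the high-frequency noise is attenuated by the lowpass branch.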

  10. Reliability of measured data for pH sensor arrays with fault diagnosis and data fusion based on LabVIEW.

    Science.gov (United States)

    Liao, Yi-Hung; Chou, Jung-Chuan; Lin, Chin-Yi

    2013-12-13

    Fault diagnosis (FD) and data fusion (DF) technologies implemented in the LabVIEW program were used for a ruthenium dioxide pH sensor array. The purpose of the fault diagnosis and data fusion technologies is to increase the reliability of measured data. Data fusion is a very useful statistical method used for sensor arrays in many fields. Fault diagnosis is used to avoid sensor faults and measurement errors in the electrochemical measurement system; therefore, in this study, we use fault diagnosis to remove any faulty sensors in advance, and then proceed with data fusion in the sensor array. The average, self-adaptive and coefficient-of-variation data fusion methods are used in this study. The pH electrode is fabricated by sputtering a ruthenium dioxide (RuO2) sensing membrane onto a silicon substrate, and eight RuO2 pH electrodes are fabricated to form a sensor array for this study.

  11. Reliability of Measured Data for pH Sensor Arrays with Fault Diagnosis and Data Fusion Based on LabVIEW

    Directory of Open Access Journals (Sweden)

    Yi-Hung Liao

    2013-12-01

    Full Text Available Fault diagnosis (FD) and data fusion (DF) technologies implemented in the LabVIEW program were used for a ruthenium dioxide pH sensor array. The purpose of the fault diagnosis and data fusion technologies is to increase the reliability of measured data. Data fusion is a very useful statistical method used for sensor arrays in many fields. Fault diagnosis is used to avoid sensor faults and measurement errors in the electrochemical measurement system; therefore, in this study, we use fault diagnosis to remove any faulty sensors in advance, and then proceed with data fusion in the sensor array. The average, self-adaptive and coefficient-of-variation data fusion methods are used in this study. The pH electrode is fabricated by sputtering a ruthenium dioxide (RuO2) sensing membrane onto a silicon substrate, and eight RuO2 pH electrodes are fabricated to form a sensor array for this study.
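
    A minimal sketch of the fault-diagnosis-then-fusion pipeline described above, under stated assumptions: faults are flagged by a coefficient-of-variation threshold, and the surviving sensors are fused by inverse-variance ("self-adaptive") weighting. The function names and the threshold value are hypothetical, not taken from the paper.

```python
import numpy as np

def coefficient_of_variation(readings):
    """Per-sensor CV for a (n_sensors, n_samples) array of readings."""
    return np.std(readings, axis=1) / np.mean(readings, axis=1)

def fuse_sensor_array(readings, cv_threshold=0.05):
    """Flag sensors whose CV exceeds the threshold as faulty (a stand-in
    for the paper's fault-diagnosis step), then fuse the survivors by
    inverse-variance weighting: less noisy sensors get larger weights."""
    cv = coefficient_of_variation(readings)
    good = cv <= cv_threshold
    means = readings[good].mean(axis=1)
    variances = readings[good].var(axis=1, ddof=1)
    weights = 1.0 / variances
    weights /= weights.sum()
    return float(np.dot(weights, means)), good
```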

  12. Height Compensation Using Ground Inclination Estimation in Inertial Sensor-Based Pedestrian Navigation

    Directory of Open Access Journals (Sweden)

    Sang Kyeong Park

    2011-08-01

    Full Text Available In an inertial sensor-based pedestrian navigation system, the position is estimated by double integrating external acceleration. A new algorithm is proposed to reduce the z-axis position (height) error. When a foot is on the ground, the foot angle is estimated using the accelerometer output. From the foot angle, the inclination angle of the road is estimated. Using this road inclination angle, the height difference over one walking step is estimated, and this estimate is used to reduce the height error. Walking experiments on roads with different inclination angles verify the usefulness of the proposed algorithm.
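
    The geometry behind the algorithm is compact enough to sketch: with the foot flat on the ground, the gravity components along and normal to the sole give the road inclination, and the per-step height increment follows. This is a hedged sketch; the axis convention and function names are assumptions, not the paper's.

```python
import math

def foot_inclination(ax, az):
    """Inclination of the foot (assumed equal to the road slope) from the
    accelerometer's gravity components while the foot is flat on the
    ground: ax along the sole, az normal to it."""
    return math.atan2(ax, az)

def step_height(step_length, theta):
    """Height gained over one walking step of the given length on a slope
    of angle theta (radians)."""
    return step_length * math.sin(theta)
```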

  13. High-stability temperature control for ST-7/LISA Pathfinder gravitational reference sensor ground verification testing

    Science.gov (United States)

    Higuchi, S.; Allen, G.; Bencze, W.; Byer, R.; Dang, A.; DeBra, D. B.; Lauben, D.; Dorlybounxou, S.; Hanson, J.; Ho, L.; Huffman, G.; Sabur, F.; Sun, K.; Tavernetti, R.; Rolih, L.; Van Patten, R.; Wallace, J.; Williams, S.

    2006-03-01

    This article demonstrates experimental results of a thermal control system developed for ST-7 gravitational reference sensor (GRS) ground verification testing, which provides the thermal stability (δT) control needed for the LISA spacecraft to compensate for 1/f fluctuations in solar irradiance. Although for ground testing these specifications can be met fairly readily with sufficient insulation and thermal mass, for a spacecraft the very limited thermal mass calls for an active control system that can simultaneously meet disturbance-rejection and stability requirements in the presence of long time delay; a considerable design challenge. Simple control laws presently provide ~1 mK/√Hz for >24 hours. Continuing development of a model-predictive feedforward control algorithm will extend performance to <1 mK/√Hz at f < 0.01 mHz and possibly lower, extending LISA coverage of supermassive black hole mergers.

  14. Toward High Altitude Airship Ground-Based Boresight Calibration of Hyperspectral Pushbroom Imaging Sensors

    Directory of Open Access Journals (Sweden)

    Aiwu Zhang

    2015-12-01

    Full Text Available Single-linear-array hyperspectral pushbroom imaging from a high altitude airship (HAA) without a three-axis stabilized platform is far more complex than spaceborne or airborne imaging. Owing to the effects of air pressure, temperature and airflow, large pitch and roll angles appear frequently, producing pushbroom images with severe geometric distortions. Thus, in-flight calibration procedures are not appropriate for single linear pushbroom sensors on an HAA without a three-axis stabilized platform. To address this problem, a new ground-based boresight calibration method is proposed. First, a coordinate transformation model is developed for direct georeferencing (DG) of the linear imaging sensor, and the linear error equation is then derived from it using the Taylor expansion formula. Second, the boresight misalignments are estimated using an iterative least squares method with a few ground control points (GCPs) and ground-based side-scanning experiments. The proposed method is demonstrated by three sets of experiments: (i) the stability and reliability of the method is verified through simulation-based experiments; (ii) the boresight calibration is performed using ground-based experiments; and (iii) validation is done by application to the orthorectification of real hyperspectral pushbroom images from an HAA Earth observation payload system developed by our research team, "LanTianHao". The test results show that the proposed boresight calibration approach significantly improves georeferencing quality by reducing the geometric distortions caused by boresight misalignments to a minimum.

  15. Robust Sequential Covariance Intersection Fusion Kalman Filtering over Multi-agent Sensor Networks with Measurement Delays and Uncertain Noise Variances

    Institute of Scientific and Technical Information of China (English)

    QI Wen-Juan; ZHANG Peng; DENG Zi-Li

    2014-01-01

    This paper deals with the problem of designing a robust sequential covariance intersection (SCI) fusion Kalman filter for clustering multi-agent sensor network systems with measurement delays and uncertain noise variances. The sensor network is partitioned into clusters by the nearest-neighbor rule. Using the minimax robust estimation principle, based on the worst-case conservative sensor network system with conservative upper bounds of noise variances, and applying the unbiased linear minimum variance (ULMV) optimal estimation rule, we present a two-layer SCI fusion robust steady-state Kalman filter which reduces communication and computation burdens, saves energy, and guarantees that the actual filtering error variances have a less conservative upper bound. A Lyapunov equation method for robustness analysis is proposed, by which the robustness of the local and fused Kalman filters is proved. The concept of robust accuracy is presented, and the robust accuracy relations of the local and fused robust Kalman filters are proved. It is shown that the robust accuracy of the global SCI fuser is higher than those of the local SCI fusers, and that the robust accuracies of all SCI fusers are higher than that of each local robust Kalman filter. A simulation example for a tracking system verifies the robustness and the robust accuracy relations.
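
    The covariance intersection rule at the core of an SCI fuser combines two estimates whose cross-covariance is unknown. A generic sketch under assumptions: a simple grid search over the weight omega, minimizing trace(P), is one common choice; the paper's sequential, minimax version is more elaborate.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=101):
    """Fuse two consistent estimates (x1, P1) and (x2, P2) without
    knowledge of their cross-correlation.  The fused information matrix
    is a convex combination of the individual information matrices; the
    weight is chosen by grid search to minimize trace(P)."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):
        P = np.linalg.inv(w * I1 + (1.0 - w) * I2)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * I1 @ x1 + (1.0 - w) * I2 @ x2)
            best = (np.trace(P), x, P)
    return best[1], best[2]
```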

  16. Applications of state estimation in multi-sensor information fusion for the monitoring of open pit mine slope deformation

    Institute of Scientific and Technical Information of China (English)

    FU Hua; LIU Yin-ping; XIAO Jian

    2008-01-01

    The traditional open-pit mine slope deformation monitoring system cannot use the monitoring information from many monitoring points at the same time; it can only use the data from a single key monitoring point, i.e., it can only handle a one-dimensional time series. Given this shortcoming, multi-sensor information fusion based on state estimation techniques is introduced into the slope deformation monitoring system. Owing to the dynamic characteristics of slope deformation, the open-pit slope is regarded as a dynamic target, and its condition monitoring as dynamic target tracking. Distributed information fusion with feedback is used to process the monitoring data, Kalman filtering algorithms are introduced on this basis, and simulation examples are used to prove the method's effectiveness.

  17. A Method for Improving the Pose Accuracy of a Robot Manipulator Based on Multi-Sensor Combined Measurement and Data Fusion

    Science.gov (United States)

    Liu, Bailing; Zhang, Fumin; Qu, Xinghua

    2015-01-01

    An improved method for the pose accuracy of a robot manipulator using a multiple-sensor combination measuring system (MCMS) is presented. The system is composed of a visual sensor, an angle sensor and a serial robot. The visual sensor measures the position of the manipulator in real time, and the angle sensor is rigidly attached to the manipulator to obtain its orientation. Two efficient data fusion approaches, the Kalman filter (KF) and the multi-sensor optimal information fusion algorithm (MOIFA), are used to fuse the position and orientation of the manipulator. The simulation and experimental results show that the pose accuracy of the robot manipulator is improved dramatically, by 38%∼78%, with multi-sensor data fusion. Compared with reported pose accuracy improvement methods, the primary advantage of this method is that it does not require the complex solution of the kinematics parameter equations, additional motion constraints, or the complicated procedures of traditional vision-based methods. It makes robot processing more autonomous and accurate. To improve the reliability and accuracy of the pose measurements of the MCMS, the visual sensor repeatability was studied experimentally. An optimal range of 1 × 0.8 × 1 ∼ 2 × 0.8 × 1 m in the field of view (FOV) is indicated by the experimental results. PMID:25850067
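
    Fusing the visual sensor's position and the angle sensor's orientation with a Kalman filter reduces, in the linear-Gaussian case, to one standard measurement update per sensor. This is a generic sketch, not the paper's MCMS implementation; the 3-state pose `[px, py, yaw]` is an assumed toy model.

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update, applied once per sensor
    (e.g. vision supplies position, the angle sensor supplies yaw)."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)                  # state correction
    P = (np.eye(len(x)) - K @ H) @ P         # covariance reduction
    return x, P
```

    Applying the vision update (observing `px, py`) and then the angle update (observing `yaw`) shrinks the pose covariance after each step.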

  18. Estimation of attitudes from a low-cost miniaturized inertial platform using Kalman Filter-based sensor fusion algorithm

    Indian Academy of Sciences (India)

    N Shantha Kumar; T Jann

    2004-04-01

    Due to cost, size and mass, commercially available inertial navigation systems are not suitable for small, autonomous flying vehicles like ALEX and other UAVs. In contrast, by using modern MEMS (or similar-class) sensors, hardware cost, size and mass can be reduced substantially. However, low-cost sensors often suffer from inaccuracy and are greatly influenced by temperature variation. In this work, such inaccuracies and temperature dependencies have been studied, modelled and compensated in order to reach an adequate quality of measurements for attitude estimation. This has been done by applying a Kalman Filter-based sensor fusion algorithm that combines sensor models, error parameters and an estimation scheme. Attitude estimation from low-cost sensors is first realized on a Matlab/Simulink platform, then implemented in hardware by programming the microcontroller, and validated. The accuracies of the estimated roll and pitch attitudes are well within the stipulated accuracy level of ±5° for the ALEX. However, the estimation of heading, which is mainly derived from the magnetometer readings, seems to be greatly influenced by variation in the local magnetic field.
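
    The accelerometer-based roll/pitch and magnetometer-based heading computations the abstract refers to can be sketched as follows, using one common axis and sign convention; the actual conventions used for ALEX are not given in the abstract, so treat these formulas as illustrative.

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Static attitude from the gravity vector measured in body axes
    (z toward the ground when level): valid when the vehicle is not
    accelerating."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

def heading_from_mag(mx, my, mz, roll, pitch):
    """Tilt-compensated heading: rotate the magnetometer vector back to
    the horizontal plane using roll and pitch, then take the plane angle.
    Sign conventions vary between references."""
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-yh, xh)
```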

  19. Fiber-optic ground settlement sensor based on low-coherent interferometry.

    Science.gov (United States)

    Zhang, Pinglei; Wei, Heming; Zhao, Xuefeng; Sun, Changsen

    2014-05-20

    Ground settlement (GS) monitoring is a basic prerequisite in civil engineering, and commercial instruments with millimeter accuracy are available. The major difficulty in improving this to the micrometer scale, needed in special cases such as high-speed railways, is the long-term stability of the sensor under extremely slow settlement. A fiber-optic GS method is proposed that uses a scanning low-coherence Michelson interferometer. One path of the interferometer is terminated by a liquid surface, so the interferometer readout can measure the position of that surface at the micrometer scale. The liquid-containing chambers are hydraulically connected at the bottom by a water-filled tube, so the liquid surface in each chamber is initially at the same level. One chamber is located on stable ground or at a point that can be easily surveyed; the others are located at the points where settlement or heave is to be measured. Differential settlement, or heave, between the chambers results in an apparent rise or fall of the liquid level away from the initial equilibrium. The experimental results demonstrated a best accuracy of ±20 μm for GS monitoring, obtained with a reference compensation sensor.

  20. Monitoring of Carbon Dioxide and Methane Plumes from Combined Ground-Airborne Sensors

    Science.gov (United States)

    Jacob, Jamey; Mitchell, Taylor; Honeycutt, Wes; Materer, Nicholas; Ley, Tyler; Clark, Peter

    2016-11-01

    A hybrid ground-airborne sensing network for real-time plume monitoring of CO2 and CH4 for carbon sequestration is investigated. Conventional soil gas monitoring has difficulty distinguishing gas flux signals due to leakage from those associated with meteorologically driven changes. A low-cost, lightweight sensor system has been developed and implemented onboard a small unmanned aircraft and is combined with a large-scale ground network that measures gas concentration. These are combined with other atmospheric diagnostics, including thermodynamic data and velocity from ultrasonic anemometers and multi-hole probes. To characterize the system behavior and verify its effectiveness, field tests have been conducted with simulated discharges of CO2 and CH4 from compressed gas tanks to mimic leaks and generate gaseous plumes, as well as over the Farnsworth CO2-EOR site in the Anadarko Basin. Since the sensor response time is a function of vehicle airspeed, dynamic calibration models are required to determine the accurate location of gas concentrations in space and time. Comparisons are made between the two tests, and the results are compared with historical models combining both flight and atmospheric dynamics. Supported by Department of Energy Award DE-FE0012173.

  1. Multi-band sensor-fused explosive hazards detection in forward-looking ground penetrating radar

    Science.gov (United States)

    Havens, Timothy C.; Becker, John; Pinar, Anthony; Schulz, Timothy J.

    2014-05-01

    Explosive hazard detection and remediation is a pertinent area of interest for the U.S. Army. There are many types of detection methods that the Army has or is currently investigating, including ground-penetrating radar, thermal and visible spectrum cameras, acoustic arrays, laser vibrometers, etc. Since standoff range is an important characteristic for sensor performance, forward-looking ground-penetrating radar has been investigated for some time. Recently, the Army has begun testing a forward-looking system that combines L-band and X-band radar arrays. Our work focuses on developing imaging and detection methods for this sensor-fused system. In this paper, we investigate approaches that fuse L-band radar and X-band radar for explosive hazard detection and false alarm rejection. We use multiple kernel learning with support vector machines as the classification method and histogram of gradients (HOG) and local statistics as the main feature descriptors. We also perform preliminary testing on a context aware approach for detection. Results on government furnished data show that our false alarm rejection method improves area-under-ROC by up to 158%.

  2. Fusion of KLMS and blob based pre-screener for buried landmine detection using ground penetrating radar

    Science.gov (United States)

    Baydar, Bora; Akar, Gözde Bozdaǧi.; Yüksel, Seniha E.; Öztürk, Serhat

    2016-05-01

    In this paper, a decision-level fusion of multiple pre-screener algorithms is proposed for the detection of buried landmines from Ground Penetrating Radar (GPR) data. The Kernel Least Mean Square (KLMS) and the Blob Filter pre-screeners are fused together to work in real time with fewer false alarms and higher true detection rates. The effect of the kernel variance is investigated for the KLMS algorithm. Also, the results of the KLMS and KLMS+Blob filter algorithms are compared to the LMS method in terms of processing time and false alarm rates. The proposed algorithm is tested on both simulated data and real data collected at the test field of IPA Defence at METU, Ankara, Turkey.

  3. On Solving the Problem of Identifying Unreliable Sensors Without a Knowledge of the Ground Truth: The Case of Stochastic Environments.

    Science.gov (United States)

    Yazidi, Anis; Oommen, B John; Goodwin, Morten

    2016-04-28

    The purpose of this paper is to propose a solution to an extremely pertinent problem, namely, that of identifying unreliable sensors (in a domain of reliable and unreliable ones) without any knowledge of the ground truth. This fascinating paradox can be formulated in simple terms as trying to identify stochastic liars without any additional information about the truth. Though apparently impossible, we will show that it is feasible to solve the problem, a claim that is counter-intuitive in and of itself. One aspect of our contribution is to show how redundancy can be introduced, and how it can be effectively utilized in resolving this paradox. Legacy work and the reported literature (for example, the so-called weighted majority algorithm) have merely addressed assessing the reliability of a sensor by comparing its reading to the ground truth, either in an online or an offline manner. Unfortunately, the fundamental assumption of revealing the ground truth cannot always be guaranteed (or even expected) in many real-life scenarios. While some extensions of the Condorcet jury theorem [9] can lead to a probabilistic guarantee on the quality of the fused process, they do not provide a solution to the unreliable sensor identification problem. The essence of our approach involves studying the agreement of each sensor with the rest of the sensors, rather than comparing the reading of the individual sensors with the ground truth, as advocated in the literature. Under some mild conditions on the reliability of the sensors, we can prove that we can, indeed, filter out the unreliable ones. Our approach leverages the power of the theory of learning automata (LA) so as to gradually learn the identity of the reliable and unreliable sensors. To achieve this, we resort to a team of LA, where a distinct automaton is associated with each sensor. The solution provided here has been subjected to rigorous experimental tests, and the results presented are, in our opinion, both novel and
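
    The agreement-based idea, scoring each sensor against its peers rather than against the ground truth, can be illustrated in batch form. The paper learns these scores online with a team of learning automata; the batch average below is a simplified stand-in, and the threshold is an assumption.

```python
import numpy as np

def flag_unreliable(readings, threshold=0.5):
    """readings: (n_sensors, n_events) matrix of binary reports.  Each
    sensor is scored by its mean agreement with every *other* sensor;
    no ground truth is consulted.  Sensors scoring below the threshold
    are flagged as unreliable: stochastic liars disagree with the
    (mostly truthful) majority far more often than reliable sensors do."""
    n = len(readings)
    scores = np.empty(n)
    for i in range(n):
        agree = [np.mean(readings[i] == readings[j]) for j in range(n) if j != i]
        scores[i] = np.mean(agree)
    return scores, scores < threshold
```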

  4. Optical embedded dust sensor for engine protection and early warning on M1 Abrams/ground combat vehicles

    Science.gov (United States)

    Lin, Hai; Waldherr, Gregor A.; Burch, Timothy

    2012-06-01

    The Dual Optical Embedded Dust Sensor (DOEDS) is designed for the sensitive, accurate detection of particles for preventive health monitoring of the AGT1500 engine and M1 Abrams/Ground Combat Vehicles (GCVs). DOEDS is a real-time sensor that uses an innovative combination of optical particle sensing technologies and mechanical packaging in a rugged, compact and non-intrusive optical design. The optical sensor, implementing both a single particle sensor and a mass sensor, can operate in harsh environments (up to 400°F) to meet the particle size, size distribution, mass concentration, and response time criteria. The sensor may be flush- or inline-mounted in multiple engine locations and environments.

  5. Exploiting Real-Time FPGA Based Adaptive Systems Technology for Real-Time Sensor Fusion in Next Generation Automotive Safety Systems

    CERN Document Server

    Chappell, Steve; Preston, Dan; Olmstead, Dave; Flint, Bob; Sullivan, Chris

    2011-01-01

    We present a system for the boresighting of sensors using inertial measurement devices as the basis for developing a range of dynamic real-time sensor fusion applications. The proof of concept utilizes a COTS FPGA platform for sensor fusion and real-time correction of a misaligned video sensor. We exploit a custom-designed 32-bit soft processor core and C-based design & synthesis for rapid, platform-neutral development. Kalman filter and sensor fusion techniques established in advanced aviation systems are applied to automotive vehicles, with results exceeding typical industry requirements for sensor alignment. Results of the static and dynamic tests demonstrate that inexpensive accelerometers mounted on (or during assembly of) a sensor, together with an Inertial Measurement Unit (IMU) fixed to a vehicle, can be used to compute the misalignment of the sensor to the IMU and thus to the vehicle. In some cases the model predictions and test results exceeded the requirements by an order of magnitude with a 3-sigma or ...

  6. Robust Fusion of Multiple Microphone and Geophone Arrays in a Ground Sensor Network

    Science.gov (United States)

    2006-10-01

    using modern pattern recognition methods that recognize patterns in, for instance, the emitted sound. Admittedly, there still are important obstacles ...d12, d13, and d23. Indeed, by knowing the distance d23 = t23/c, trigonometry gives the sought bearing angle θ. In theory, the t23 could be found as the

  7. Instrumental intelligent test of food sensory quality as mimic of human panel test combining multiple cross-perception sensors and data fusion.

    Science.gov (United States)

    Ouyang, Qin; Zhao, Jiewen; Chen, Quansheng

    2014-09-02

    Instrumental testing of food quality using perception sensors instead of a human panel has been attracting massive attention recently. A novel cross-perception multi-sensor data fusion imitating multiple mammalian perceptions was proposed for instrumental testing in this work. First, three mimic sensors, an electronic eye, an electronic nose and an electronic tongue, were used in sequence for data acquisition from rice wine samples. Then all data from the three different sensors were preprocessed and merged. Next, three cross-perception variables, i.e., color, aroma and taste, were constructed using principal component analysis (PCA) and multiple linear regression (MLR), and used as the input of the models. MLR, back-propagation artificial neural network (BPANN) and support vector machine (SVM) models were comparatively used for modeling, and an instrumental test of the comprehensive quality of the samples was achieved. Results showed that the proposed cross-perception multi-sensor data fusion is clearly superior to traditional data fusion methodologies, and it achieved a high correlation coefficient (>90%) with the human panel test results. This work demonstrates that an instrumental test based on cross-perception multi-sensor data fusion can actually mimic human test behavior, and is therefore of great significance for ensuring product quality and decreasing manufacturers' losses.
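
    The cross-perception construction (one principal-component score per instrument, then MLR on the resulting variables) can be sketched with plain least squares. This is a simplified stand-in for the paper's procedure; function names, shapes, and the single-component reduction are assumptions.

```python
import numpy as np

def first_principal_component(X):
    """Score of each sample on the first principal component of one
    sensor's feature block: PCA via SVD on the centred data matrix."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[0]

def fit_mlr(features, y):
    """Multiple linear regression of the panel score y on the
    cross-perception variables (least squares with an intercept).
    Returns [intercept, coef_1, ..., coef_k]."""
    A = np.column_stack([np.ones(len(y)), *features])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef
```

    Here each of `color`, `aroma`, `taste` would be `first_principal_component` of the electronic eye, nose, and tongue data respectively, and `fit_mlr` maps the three scores to the panel score.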

  8. Thermophysical properties along Curiosity's traverse in Gale crater, Mars, derived from the REMS ground temperature sensor

    Science.gov (United States)

    Vasavada, Ashwin R.; Piqueux, Sylvain; Lewis, Kevin W.; Lemmon, Mark T.; Smith, Michael D.

    2017-03-01

    The REMS instrument onboard the Mars Science Laboratory rover, Curiosity, has measured ground temperature nearly continuously at hourly intervals for two Mars years. Coverage of the entire diurnal cycle at 1 Hz is available every few martian days. We compare these measurements with predictions of surface-atmosphere thermal models to derive the apparent thermal inertia and thermally derived albedo along the rover's traverse after accounting for the radiative effects of atmospheric water ice during fall and winter, as is necessary to match the measured seasonal trend. The REMS measurements can distinguish between active sand, other loose materials, mudstone, and sandstone based on their thermophysical properties. However, the apparent thermal inertias of bedrock-dominated surfaces (∼350-550 J m^-2 K^-1 s^-1/2) are lower than expected. We use rover imagery and the detailed shape of the diurnal ground temperature curve to explore whether lateral or vertical heterogeneity in the surface materials within the sensor footprint might explain the low inertias. We find that the bedrock component of the surface can have a thermal inertia as high as 650-1700 J m^-2 K^-1 s^-1/2 for mudstone sites and ∼700 J m^-2 K^-1 s^-1/2 for sandstone sites in model runs that include lateral and vertical mixing. Although the results of our forward modeling approach may be non-unique, they demonstrate the potential to extract information about lateral and vertical variations in thermophysical properties from temporally resolved measurements of ground temperature.

  9. Unattended wireless proximity sensor networks for counterterrorism, force protection, littoral environments, PHM, and tamper monitoring ground applications

    Science.gov (United States)

    Forcier, Bob

    2003-09-01

    This paper describes a digital-ultrasonic ground network, which forms a unique "unattended mote sensor system" for monitoring the environment, personnel, facilities, vehicles, power generation systems or aircraft in counter-terrorism, force protection, Prognostic Health Monitoring (PHM) and other ground applications. Unattended wireless smart sensor/tags continuously monitor the environment and provide alerts upon changes or disruptions to the environment. These wireless smart sensor/tags are networked utilizing ultrasonic wireless motes, hybrid RF/ultrasonic network nodes and base stations. The network is monitored continuously with a 24/7 remote and secure monitoring system. This system utilizes physical objects such as a vehicle's structure or a building to provide the medium for two-way secure communication of key metrics and sensor data, and eliminates the "blind spots" that are common in RF solutions because of structural elements of buildings, etc. The digital-ultrasonic sensors have networking capability and a 32-bit identifier, which provide a platform for robust data acquisition (DAQ) from a large number of sensors. In addition, the network builds a unique "signature" of the environment by comparing sensor-to-sensor data to pick up minute changes, which would signal an intrusion of unknown elements or potential tampering with equipment or facilities. The system accommodates satellite and other secure network uplinks in either RF or UWB protocols. The wireless sensors can be dispersed by ground or air maneuvers. In addition, the sensors can be incorporated into the structure or surfaces of vehicles, buildings, or the clothing of field personnel.

  10. Developing paradigms of data fusion for sensor-actuator networks that perform engineering tasks.

    OpenAIRE

    Iyengar, SS; Sastry, S.; Balakrishnan, N.

    2003-01-01

    In this article we provided a new foundation for data fusion based on two concepts: a conceptual framework and the goal-seeking paradigm. The conceptual framework emphasizes the dominant structures in the system. The goal-seeking paradigm is a mechanism for representing system evolution that explicitly manages uncertainty. The goal-seeking formulation for data fusion helps to distinguish between subjective decisions that resolve uncertainty by involving humans and objective decisions that can...

  11. Multi-Sensor Data Fusion Using a Relevance Vector Machine Based on an Ant Colony for Gearbox Fault Detection

    Directory of Open Access Journals (Sweden)

    Zhiwen Liu

    2015-08-01

    Full Text Available Sensors play an important role in modern manufacturing and industrial processes. Their reliability is vital to ensure reliable and accurate information for condition-based maintenance. For the gearbox, a critical component in rotating machinery, the vibration signals collected by sensors are usually noisy. At the same time, fault detection results based on the vibration signals from a single sensor may be unreliable and unstable. To solve this problem, this paper proposes an intelligent multi-sensor data fusion method using a relevance vector machine (RVM) based on an ant colony optimization algorithm (ACO-RVM) for gearbox fault detection. RVM is a sparse probability model based on the support vector machine (SVM); RVM not only has higher detection accuracy, but also better real-time performance compared with SVM. The ACO algorithm is used to determine the kernel parameters of the RVM. Moreover, ensemble empirical mode decomposition (EEMD) is applied to preprocess the raw vibration signals to eliminate the influence of noise and other unrelated signals. The distance evaluation technique (DET) is employed to select dominant features as the input of the ACO-RVM, so that the redundancy and interference in a large set of features can be removed. Two gearboxes are used to demonstrate the performance of the proposed method. The experimental results show that the ACO-RVM has higher fault detection accuracy than the RVM with normal cross-validation (CV).

  12. Fuzzy Risk Evaluation in Failure Mode and Effects Analysis Using a D Numbers Based Multi-Sensor Information Fusion Method.

    Science.gov (United States)

    Deng, Xinyang; Jiang, Wen

    2017-09-12

    Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing methods to show the effectiveness of the proposed model.
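
For context, the traditional RPN approach the abstract criticizes simply multiplies three ordinal ratings. A minimal sketch with hypothetical failure modes and ratings (not from the paper's example):

```python
# Traditional RPN: severity (S), occurrence (O) and detection (D), each
# rated 1-10, multiplied together. Failure modes and ratings are hypothetical.
failure_modes = {
    "seal leak":    {"S": 7, "O": 4, "D": 3},
    "motor stall":  {"S": 9, "O": 2, "D": 5},
    "sensor drift": {"S": 5, "O": 6, "D": 2},
}

def rpn(r):
    return r["S"] * r["O"] * r["D"]

# Rank failure modes by descending RPN.
ranked = sorted(failure_modes, key=lambda m: rpn(failure_modes[m]), reverse=True)
print(ranked)  # motor stall (90) > seal leak (84) > sensor drift (60)
```

One well-known shortcoming is visible even here: very different (S, O, D) combinations can produce identical RPN values, which motivates the fuzzy alternatives the paper pursues.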

  13. Evaluation of Matrix Square Root Operations for UKF within a UAV GPS/INS Sensor Fusion Application

    Directory of Open Access Journals (Sweden)

    Matthew Rhudy

    2011-01-01

    Full Text Available Using an Unscented Kalman Filter (UKF) as the nonlinear estimator within a Global Positioning System/Inertial Navigation System (GPS/INS) sensor fusion algorithm for attitude estimation, various methods of calculating the matrix square root were discussed and compared. Specifically, the diagonalization method, Schur method, Cholesky method, and five different iterative methods were compared. Additionally, a different method of handling the matrix square root requirement, the square-root UKF (SR-UKF), was evaluated. The different matrix square root calculations were compared based on computational requirements and the sensor fusion attitude estimation performance, which was evaluated using flight data from an Unmanned Aerial Vehicle (UAV). The roll and pitch angle estimates were compared with independently measured values from a high quality mechanical vertical gyroscope. This manuscript represents the first comprehensive analysis of the matrix square root calculations in the context of UKF. From this analysis, it was determined that the best overall matrix square root calculation for UKF applications in terms of performance and execution time is the Cholesky method.
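
Two of the compared factorizations can be illustrated directly. A minimal NumPy sketch (not the paper's flight-data implementation) showing that the Cholesky and diagonalization square roots both reproduce the covariance matrix P yet differ as matrices:

```python
import numpy as np

P = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 0.2],
              [0.5, 0.2, 2.0]])   # example covariance (symmetric positive definite)

# Cholesky method: lower-triangular S with S @ S.T == P.
S_chol = np.linalg.cholesky(P)

# Diagonalization method: symmetric square root from the eigendecomposition.
w, V = np.linalg.eigh(P)
S_eig = V @ np.diag(np.sqrt(w)) @ V.T

assert np.allclose(S_chol @ S_chol.T, P)   # both factors reproduce P,
assert np.allclose(S_eig @ S_eig.T, P)     # yet the two matrices differ:
assert not np.allclose(S_chol, S_eig)
```

The Cholesky factorization costs roughly a third of an eigendecomposition, which is consistent with the paper's conclusion that it is the best trade-off of performance and execution time.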

  14. Geospace Science from Ground-based Magnetometer Arrays: Advances in Sensors, Data Collection, and Data Integration

    Science.gov (United States)

    Mann, Ian; Chi, Peter

    2016-07-01

    Networks of ground-based magnetometers now provide the basis for the diagnosis of magnetic disturbances associated with solar wind-magnetosphere-ionosphere coupling on a truly global scale. Advances in sensor and digitisation technologies offer increases in sensitivity in fluxgate, induction coil, and new micro-sensor technologies - including the promise of hybrid sensors. Similarly, advances in remote connectivity provide the capacity for truly real-time monitoring of global dynamics at cadences sufficient for monitoring and in many cases resolving system level spatio-temporal ambiguities, especially in combination with conjugate satellite measurements. A wide variety of the plasmaphysical processes active in driving geospace dynamics can be monitored based on the response of the electrical current system, including those associated with changes in global convection, magnetospheric substorms and nightside tail flows, as well as due to solar wind changes in both dynamic pressure and in response to rotations of the direction of the IMF. Significantly, any changes to the dynamical system must be communicated by the propagation of long-period Alfven and/or compressional waves. These wave populations hence provide diagnostics for not only the energy transport by the wave fields themselves, but also provide a mechanism for diagnosing the structure of the background plasma medium through which the waves propagate. Ultra-low frequency (ULF) waves are especially significant in offering a monitor for mass density profiles, often invisible to particle detectors because of their very low energy, through the application of a variety of magneto-seismology and cross-phase techniques. Renewed scientific interest in the plasma waves associated with near-Earth substorm dynamics, including magnetosphere-ionosphere coupling at substorm onset and their relation to magnetotail flows, as well as the importance of global scale ultra-low frequency waves for the energisation, transport

  15. Martian Surface Temperature and Spectral Response from the MSL REMS Ground Temperature Sensor

    Science.gov (United States)

    Martin-Torres, Javier; Martínez-Frías, Jesús; Zorzano, María-Paz; Serrano, María; Mendaza, Teresa; Hamilton, Vicky; Sebastián, Eduardo; Armiens, Carlos; Gómez-Elvira, Javier; REMS Team

    2013-04-01

    The Rover Environmental Monitoring Station (REMS) on the Mars Science Laboratory (MSL) offers the opportunity to explore the near surface atmospheric conditions and, in particular, will shed new light on the heat budget of the Martian surface. This is important for studies of the atmospheric boundary layer (ABL), as the ground and air temperatures measured directly by REMS control the coupling of the atmosphere with the surface [Zurek et al., 1992]. This coupling is driven by solar insolation. The ABL plays an important role in the general circulation and the local atmospheric dynamics of Mars. One of the REMS sensors, the ground temperature sensor (GTS), provides the data needed to study the thermal inertia properties of the regolith and rocks beneath the MSL rover. The GTS includes thermopile detectors, with infrared bands of 8-14 µm and 16-20 µm [Gómez-Elvira et al., 2012]. These sensors are clustered in a single location on the MSL mast and the 8-14 µm thermopile sounds the surface temperature. The infrared radiation reaching the thermopile is proportional to the emissivity of the surface minerals across these thermal wavelengths. We have developed a radiative transfer retrieval method for the REMS GTS using a database of thermal infrared laboratory spectra of analogue minerals and their mixtures [Martín-Redondo et al. 2009, Martínez-Frías et al. 2012 - FRISER-IRMIX database]. This method will be used to assess the performance of the REMS GTS as well as determine, through the error analysis, the surface temperature and emissivity values where MSL is operating. Comparisons with orbiter data will be performed. References Gómez-Elvira et al. [2012], REMS: The Environmental Sensor Suite for the Mars Science Laboratory Rover, Space Science Reviews, Volume 170, Issue 1-4, pp. 583-640. Martín-Redondo et al. [2009] Journal of Environmental Monitoring 11, pp. 1428-1432. Martínez-Frías et al. [2012] FRISER-IRMIX database http

  16. A Multi-Sensor Remote Sensing Approach for Railway Corridor Ground Hazard Management

    Science.gov (United States)

    Kromer, Ryan; Hutchinson, Jean; Lato, Matt; Gauthier, Dave; Edwards, Tom

    2015-04-01

    Characterizing and monitoring ground hazard processes is a difficult endeavor along mountainous transportation corridors. This is primarily due to the quantity of hazard sites, complex topography, limited and sometimes hazardous access to sites, and obstructed views. The current hazard assessment approach for Canadian railways partly relies on the ability of inspection employees to assess hazard from track level, which isn't practical in complex slope environments. Various remote sensing sensors, implemented on numerous platforms, have the potential to be used in these environments. They are frequently found to be complementary in their use; however, an optimum combination of these approaches has not yet been found for an operational rail setting. In this study, we investigate various cases where remote sensing technologies have been used to characterize and monitor ground hazards along railway corridors across the Canadian network, in order to better understand failure mechanisms, identify hazard source zones and to provide early warning. Since early 2012, a series of high resolution gigapixel images, Terrestrial Laser Scanning (TLS), Aerial Laser Scanning (ALS), ground based photogrammetry, and oblique aerial photogrammetry (from helicopter and Unmanned Aerial Vehicle (UAV) platforms) have been collected at ground hazard sites throughout the Canadian rail network. On a network level scale, comparison of sequential ALS scanning data has been found to be an ideal methodology for observing large-scale change and prioritizing high hazard sites for more detailed monitoring with terrestrial methods. The combination of TLS and high resolution gigapixel imagery at various temporal scales has allowed for a detailed characterization of the hazard level posed by the slopes, the identification of the main failure modes, an analysis of hazard activity, and the observation of failure precursors such as deformation, rockfall and tension crack opening. 
At sites not feasible for ground

  17. Highly precise distributed Brillouin scattering sensor for structural health monitoring of optical ground wire cable

    Science.gov (United States)

    Zou, Lufan; Ravet, Fabien; Bao, Xiaoyi; Chen, Liang

    2004-07-01

    A distributed Brillouin scattering sensor with high spatial precision has been developed for the measurement of small damages/cracks of 1.5 cm. The outer-layer damaged regions in an optical ground wire (OPGW) cable have been identified successfully by measuring the strain distributions every 5 cm using this technology. The load was increased to 127 kN, which corresponds to more than 7500 microstrain in the fibers. The locations of structural indentations comprising repaired and undamaged regions are found and distinguished using their corresponding strain data. The elongation of the repaired region increases with time at 127 kN. These results are quantified in terms of the fiber orientation, stress, and behavior relative to undamaged sections.
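
Brillouin strain sensing rests on a linear relation between the Brillouin frequency shift and strain. The sketch below assumes typical literature values for the coefficients, which are not given in the abstract:

```python
# Linear Brillouin frequency-shift model:  nu_B = nu_B0 + C_eps * eps.
# The coefficient values below are typical for standard single-mode fibre
# and are assumptions for illustration; they are not taken from the paper.
NU_B0 = 10.85e9     # unstrained Brillouin frequency, Hz
C_EPS = 0.05e6      # strain sensitivity, Hz per microstrain (~0.05 MHz/ue)

def strain_from_shift(nu_b_hz):
    """Invert the linear model: return strain in microstrain."""
    return (nu_b_hz - NU_B0) / C_EPS

# Under these assumed coefficients, the 7500 microstrain reported at 127 kN
# would correspond to a frequency shift of 375 MHz.
shift_hz = 7500 * C_EPS
print(shift_hz / 1e6, round(strain_from_shift(NU_B0 + shift_hz)))  # 375.0 7500
```

Measuring this shift at closely spaced positions along the fibre yields the distributed strain profile used to locate the damaged regions.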

  18. Kernel-Based Sensor Fusion With Application to Audio-Visual Voice Activity Detection

    Science.gov (United States)

    Dov, David; Talmon, Ronen; Cohen, Israel

    2016-12-01

    In this paper, we address the problem of multiple view data fusion in the presence of noise and interferences. Recent studies have approached this problem using kernel methods, by relying particularly on a product of kernels constructed separately for each view. From a graph theory point of view, we analyze this fusion approach in a discrete setting. More specifically, based on a statistical model for the connectivity between data points, we propose an algorithm for the selection of the kernel bandwidth, a parameter, which, as we show, has important implications on the robustness of this fusion approach to interferences. Then, we consider the fusion of audio-visual speech signals measured by a single microphone and by a video camera pointed to the face of the speaker. Specifically, we address the task of voice activity detection, i.e., the detection of speech and non-speech segments, in the presence of structured interferences such as keyboard taps and office noise. We propose an algorithm for voice activity detection based on the audio-visual signal. Simulation results show that the proposed algorithm outperforms competing fusion and voice activity detection approaches. In addition, we demonstrate that a proper selection of the kernel bandwidth indeed leads to improved performance.
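
The product-of-kernels fusion analyzed in the abstract can be sketched in a few lines. The toy views, sizes, and the median bandwidth heuristic below are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
view_a = rng.normal(size=(n, 1))                    # e.g. audio features
view_b = view_a + 0.1 * rng.normal(size=(n, 1))     # e.g. correlated video features

def pairwise_sq_dists(x):
    return ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)

def gaussian_kernel(x, eps):
    return np.exp(-pairwise_sq_dists(x) / eps)

def median_bandwidth(x):
    """A common bandwidth heuristic (a stand-in for the paper's selection rule)."""
    d2 = pairwise_sq_dists(x)
    return np.median(d2[d2 > 0])

# Fused affinity: the elementwise product keeps only connections that are
# strong in BOTH views, which is what suppresses view-specific interferences.
K = gaussian_kernel(view_a, median_bandwidth(view_a)) \
    * gaussian_kernel(view_b, median_bandwidth(view_b))
print(K.shape)  # (50, 50)
```

Because an interference typically corrupts only one view, edges it creates are weak in the other view's kernel and are attenuated in the product, which is the robustness property the bandwidth selection aims to preserve.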

  19. Fused performance of passive thermal and active polarimetric EO demining sensor

    Science.gov (United States)

    Liao, Wen-Jiao; Baertlein, Brian A.

    2002-08-01

    The potential for performance improvements through sensor fusion is explored for two electro-optical (EO) imaging sensors: a passive thermal IR camera and an active polarimetric system. Tests of decision-level fusion using a small data set (roughly 60 mine signatures) suggest that a significant performance improvement can be obtained by using an AND fusion approach. The source of this improvement derives from correlation among the sensors. Specifically, the sensors exhibit a strong positive correlation when a mine is present, and a negligible correlation when viewing clutter. The observed improvement is independent of the local ground clutter, but it depends strongly on the decision thresholds used for the individual sensors.
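
Decision-level AND fusion can be sketched with a toy model in which the two sensor scores are perfectly correlated on targets and independent on clutter, a caricature of the correlation structure the abstract reports; all numbers are synthetic:

```python
import random

# AND fusion: declare a mine only when BOTH sensors exceed their thresholds.
random.seed(1)

targets = []
for _ in range(100):
    s = random.gauss(1.0, 0.2)
    targets.append((s, s))          # identical scores: perfectly correlated
clutter = [(random.gauss(0.0, 0.3), random.gauss(0.0, 0.3))
           for _ in range(100)]     # independent scores on clutter

thr = 0.5
tp = sum(a > thr and b > thr for a, b in targets)   # detections on targets
fp = sum(a > thr and b > thr for a, b in clutter)   # false alarms on clutter
print(tp, fp)
```

Under this model the AND rule barely reduces target detections (the second sensor agrees whenever the first fires) while false alarms require two independent threshold crossings, illustrating why correlation on targets makes AND fusion pay off.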

  20. Distributed service-based approach for sensor data fusion in IoT environments.

    Science.gov (United States)

    Rodríguez-Valenzuela, Sandra; Holgado-Terriza, Juan A; Gutiérrez-Guerrero, José M; Muros-Cobos, Jesús L

    2014-10-15

    The Internet of Things (IoT) enables the communication among smart objects promoting the pervasive presence around us of a variety of things or objects that are able to interact and cooperate jointly to reach common goals. IoT objects can obtain data from their context, such as the home, office, industry or body. These data can be combined to obtain new and more complex information applying data fusion processes. However, to apply data fusion algorithms in IoT environments, the full system must deal with distributed nodes, decentralized communication and support scalability and nodes dynamicity, among others restrictions. In this paper, a novel method to manage data acquisition and fusion based on a distributed service composition model is presented, improving the data treatment in IoT pervasive environments.

  1. Distributed Service-Based Approach for Sensor Data Fusion in IoT Environments

    Directory of Open Access Journals (Sweden)

    Sandra Rodríguez-Valenzuela

    2014-10-01

    Full Text Available The Internet of Things (IoT) enables the communication among smart objects promoting the pervasive presence around us of a variety of things or objects that are able to interact and cooperate jointly to reach common goals. IoT objects can obtain data from their context, such as the home, office, industry or body. These data can be combined to obtain new and more complex information applying data fusion processes. However, to apply data fusion algorithms in IoT environments, the full system must deal with distributed nodes, decentralized communication and support scalability and nodes dynamicity, among others restrictions. In this paper, a novel method to manage data acquisition and fusion based on a distributed service composition model is presented, improving the data treatment in IoT pervasive environments.

  2. Data fusion in multi sensor platforms for wide-area perception

    NARCIS (Netherlands)

    Polychronopoulos, A.; Floudas, N.; Amditis, A.; Bank, D.; Broek, S.P. van den

    2006-01-01

    There is a strong belief that the improvement of preventive safety applications and the extension of their operative range will be achieved by the deployment of multiple sensors with wide fields of view (FOV). The paper contributes to the solution of the problem and introduces distributed sensor data

  3. Multi-sensor fusion using an adaptive multi-hypothesis tracking algorithm

    NARCIS (Netherlands)

    Kester, L.J.H.M.

    2003-01-01

    The purpose of a tracking algorithm is to associate data measured by one or more (moving) sensors to moving objects in the environment. The state of these objects that can be estimated with the tracking process depends on the type of data that is provided by these sensors. It is discussed how the

  4. Thin film metal sensors in fusion bonded glass chips for high-pressure microfluidics

    Science.gov (United States)

    Andersson, Martin; Ek, Johan; Hedman, Ludvig; Johansson, Fredrik; Sehlstedt, Viktor; Stocklassa, Jesper; Snögren, Pär; Pettersson, Victor; Larsson, Jonas; Vizuete, Olivier; Hjort, Klas; Klintberg, Lena

    2017-01-01

    High-pressure microfluidics offers fast analyses of thermodynamic parameters for compressed process solvents. However, microfluidic platforms handling highly compressible supercritical CO2 are difficult to control, and on-chip sensing would offer added control of the devices. Therefore, there is a need to integrate sensors into highly pressure tolerant glass chips. In this paper, thin film Pt sensors were embedded in shallow etched trenches in a glass wafer that was bonded with another glass wafer having microfluidic channels. The devices having sensors integrated into the flow channels sustained pressures up to 220 bar, typical for the operation of supercritical CO2. No leakage from the devices could be found. Integrated temperature sensors were capable of measuring local decompression cooling effects and integrated calorimetric sensors measured flow velocities over the range 0.5-13.8 mm s-1. By this, a better control of high-pressure microfluidic platforms has been achieved.

  5. Unsupervised learning in persistent sensing for target recognition by wireless ad hoc networks of ground-based sensors

    Science.gov (United States)

    Hortos, William S.

    2008-04-01

    In previous work by the author, effective persistent and pervasive sensing for recognition and tracking of battlefield targets was achieved using intelligent algorithms implemented by distributed mobile agents over a composite system of unmanned aerial vehicles (UAVs) for persistence and a wireless network of unattended ground sensors for pervasive coverage of the mission environment. While simulated performance results for the supervised algorithms of the composite system are shown to provide satisfactory target recognition over relatively brief periods of system operation, this performance can degrade by as much as 50% as target dynamics in the environment evolve beyond the period of system operation in which the training data are representative. To overcome this limitation, this paper applies the distributed approach using mobile agents to the network of ground-based wireless sensors alone, without the UAV subsystem, to provide persistent as well as pervasive sensing for target recognition and tracking. The supervised algorithms used in the earlier work are supplanted by unsupervised routines, including competitive-learning neural networks (CLNNs) and new versions of support vector machines (SVMs) for characterization of an unknown target environment. To capture the same physical phenomena from battlefield targets as the composite system, the suite of ground-based sensors can be expanded to include imaging and video capabilities. The spatial density of deployed sensor nodes is increased to allow more precise ground-based location and tracking of detected targets by active nodes. The "swarm" mobile agents enabling WSN intelligence are organized in three processing stages: detection, recognition and sustained tracking of ground targets. 
Features formed from the compressed sensor data are down-selected according to an information-theoretic algorithm that reduces redundancy within the feature set, reducing the dimension of samples used in the target
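
Redundancy-reducing feature down-selection can be sketched with a greedy correlation filter. This is a simple stand-in for the information-theoretic criterion named in the abstract (for Gaussian features, correlation and mutual information rank redundancy similarly); all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
f0 = rng.normal(size=n)
features = np.stack([f0,
                     f0 + 0.01 * rng.normal(size=n),   # near-duplicate of f0
                     rng.normal(size=n),               # independent
                     rng.normal(size=n)], axis=1)      # independent

def select(features, max_corr=0.9):
    """Greedily keep features whose |correlation| with every kept one is low."""
    kept = []
    for j in range(features.shape[1]):
        if all(abs(np.corrcoef(features[:, j], features[:, k])[0, 1]) < max_corr
               for k in kept):
            kept.append(j)
    return kept

print(select(features))  # the redundant near-duplicate (column 1) is dropped
```

Dropping near-duplicate features shrinks the sample dimension fed to the recognition stage, which is the stated purpose of the down-selection step.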

  6. Sensor Technology Baseline Study for Enabling Condition Based Maintenance Plus in Army Ground Vehicles

    Science.gov (United States)

    2012-03-01

    and mechanisms are identified. Based on this analysis, baseline sensor technologies are determined to prognosticate these types of failure causes early... Current/voltage sensor measured at sensor terminals; fluid level sensor; excessive slippage and clutch chatter; internal transmission failure...

  7. Improved GSO Optimized ESN Soft-Sensor Model of Flotation Process Based on Multisource Heterogeneous Information Fusion

    Directory of Open Access Journals (Sweden)

    Jie-sheng Wang

    2014-01-01

    Full Text Available For predicting the key technology indicators (concentrate grade and tailings recovery rate) of flotation process, an echo state network (ESN) based fusion soft-sensor model optimized by the improved glowworm swarm optimization (GSO) algorithm is proposed. Firstly, the color features (saturation and brightness) and texture features (angular second moment, sum entropy, inertia moment, etc.) based on the grey-level co-occurrence matrix (GLCM) are adopted to describe the visual characteristics of the flotation froth image. Then the kernel principal component analysis (KPCA) method is used to reduce the dimensionality of the high-dimensional input vector composed of the flotation froth image characteristics and process data, and extracts the nonlinear principal components in order to reduce the ESN dimension and network complexity. The ESN soft-sensor model of flotation process is optimized by the GSO algorithm with congestion factor. Simulation results show that the model has better generalization and prediction accuracy to meet the online soft-sensor requirements of the real-time control in the flotation process.

  8. Improved GSO optimized ESN soft-sensor model of flotation process based on multisource heterogeneous information fusion.

    Science.gov (United States)

    Wang, Jie-sheng; Han, Shuang; Shen, Na-na

    2014-01-01

    For predicting the key technology indicators (concentrate grade and tailings recovery rate) of flotation process, an echo state network (ESN) based fusion soft-sensor model optimized by the improved glowworm swarm optimization (GSO) algorithm is proposed. Firstly, the color features (saturation and brightness) and texture features (angular second moment, sum entropy, inertia moment, etc.) based on grey-level co-occurrence matrix (GLCM) are adopted to describe the visual characteristics of the flotation froth image. Then the kernel principal component analysis (KPCA) method is used to reduce the dimensionality of the high-dimensional input vector composed of the flotation froth image characteristics and process data, and extracts the nonlinear principal components in order to reduce the ESN dimension and network complexity. The ESN soft-sensor model of flotation process is optimized by the GSO algorithm with congestion factor. Simulation results show that the model has better generalization and prediction accuracy to meet the online soft-sensor requirements of the real-time control in the flotation process.
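
The GLCM texture features used in this record can be illustrated on a toy image. The sketch computes the co-occurrence matrix for horizontal neighbors and two derived features; note it computes plain GLCM entropy rather than the "sum entropy" variant the abstract lists:

```python
import numpy as np

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])       # toy image quantized to 4 grey levels

levels = 4
glcm = np.zeros((levels, levels))
# Count co-occurrences of horizontally adjacent grey levels.
for i, j in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
    glcm[i, j] += 1
glcm /= glcm.sum()                    # normalise to joint probabilities

asm = (glcm ** 2).sum()               # angular second moment (uniformity)
p = glcm[glcm > 0]
entropy = -(p * np.log(p)).sum()      # GLCM entropy (simpler than sum entropy)
print(round(asm, 3), round(entropy, 3))  # 0.167 1.864
```

In the soft-sensor pipeline, such scalar features from each froth image (together with the color features and process data) would form the high-dimensional input that KPCA then compresses.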

  9. Sensor Fusion for Measurement of Transonic Flow and Structural Dynamic Response Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses development of techniques that support experimental modeling, simulations, ground testing, wind tunnel tests, and flight experiments, with...

  10. Optimization of Automatic Target Recognition with a Reject Option Using Fusion and Correlated Sensor Data

    Science.gov (United States)

    2005-04-25

    ROC curve in the evaluation of machine learning algorithms,” Pattern Recognition, Vol 30, No 7: 1145-1159 (1997). Brown, Gerald G. “Top Ten Secrets...Kuo C. and Karp, Sherman. “Polarimetric fusion for synthetic aperture radar target classification,” Pattern Recognition, Vol 30, No 5: 769-775

  11. Bearings Only Tracking with Fusion from Heterogenous Passive Sensors: ESM/EO and Acoustic

    Science.gov (United States)

    2017-02-01

    speed 5 m/s. The simulation results were obtained from 100 Monte Carlo runs. The estimated position root mean square errors (RMSE) versus time are...ing with state-dependent propagation delay,” Proc. 17th International Conference on Information Fusion, Salamanca, Spain, Jul. 2014. [22] Yang, R

  12. Finite Element Modelling of a Field-Sensed Magnetic Suspended System for Accurate Proximity Measurement Based on a Sensor Fusion Algorithm with Unscented Kalman Filter

    Directory of Open Access Journals (Sweden)

    Amor Chowdhury

    2016-09-01

    Full Text Available The presented paper describes accurate distance measurement for a field-sensed magnetic suspension system. The proximity measurement is based on a Hall effect sensor. The proximity sensor is installed directly on the lower surface of the electro-magnet, which means that it is very sensitive to external magnetic influences and disturbances. External disturbances interfere with the information signal and reduce the usability and reliability of the proximity measurements and, consequently, the whole application operation. A sensor fusion algorithm is deployed for the aforementioned reasons. The sensor fusion algorithm is based on the Unscented Kalman Filter, where a nonlinear dynamic model was derived with the Finite Element Modelling approach. The advantage of such modelling is a more accurate dynamic model parameter estimation, especially in the case when the real structure, materials and dimensions of the real-time application are known. The novelty of the paper is the design of a compact electro-magnetic actuator with a built-in low cost proximity sensor for accurate proximity measurement of the magnetic object. The paper successively presents a modelling procedure with the finite element method, design and parameter settings of a sensor fusion algorithm with Unscented Kalman Filter and, finally, the implementation procedure and results of real-time operation.

  13. Finite Element Modelling of a Field-Sensed Magnetic Suspended System for Accurate Proximity Measurement Based on a Sensor Fusion Algorithm with Unscented Kalman Filter.

    Science.gov (United States)

    Chowdhury, Amor; Sarjaš, Andrej

    2016-09-15

    The presented paper describes accurate distance measurement for a field-sensed magnetic suspension system. The proximity measurement is based on a Hall effect sensor. The proximity sensor is installed directly on the lower surface of the electro-magnet, which means that it is very sensitive to external magnetic influences and disturbances. External disturbances interfere with the information signal and reduce the usability and reliability of the proximity measurements and, consequently, the whole application operation. A sensor fusion algorithm is deployed for the aforementioned reasons. The sensor fusion algorithm is based on the Unscented Kalman Filter, where a nonlinear dynamic model was derived with the Finite Element Modelling approach. The advantage of such modelling is a more accurate dynamic model parameter estimation, especially in the case when the real structure, materials and dimensions of the real-time application are known. The novelty of the paper is the design of a compact electro-magnetic actuator with a built-in low cost proximity sensor for accurate proximity measurement of the magnetic object. The paper successively presents a modelling procedure with the finite element method, design and parameter settings of a sensor fusion algorithm with Unscented Kalman Filter and, finally, the implementation procedure and results of real-time operation.
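
The sigma-point construction at the core of the Unscented Kalman Filter can be sketched as follows; parameter values are common defaults, not taken from the paper:

```python
import numpy as np

def sigma_points(mean, cov, alpha=0.1, kappa=0.0):
    """Generate 2n+1 sigma points and their mean weights (scaled UT)."""
    n = len(mean)
    lam = alpha ** 2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)   # matrix square root step
    pts = [mean]
    for i in range(n):
        pts.append(mean + S[:, i])
        pts.append(mean - S[:, i])
    weights = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    weights[0] = lam / (n + lam)
    return np.array(pts), weights

mean = np.array([0.0, 1.0])
cov = np.array([[2.0, 0.3],
                [0.3, 1.0]])
pts, w = sigma_points(mean, cov)

# The weighted sigma points reproduce the original mean and covariance.
assert np.allclose(w @ pts, mean)
d = pts - mean
assert np.allclose((w[:, None] * d).T @ d, cov)
```

In the full filter, each sigma point is propagated through the nonlinear dynamics and the weighted statistics of the transformed points give the predicted state estimate.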

  14. Instrumental intelligent test of food sensory quality as mimic of human panel test combining multiple cross-perception sensors and data fusion

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, Qin; Zhao, Jiewen; Chen, Quansheng, E-mail: qschen@ujs.edu.cn

    2014-09-02

    Highlights: • To develop a novel instrumental intelligent test methodology for food sensory analysis. • A novel data fusion was used in instrumental intelligent test methodology. • Linear and nonlinear tools were comparatively used for modeling. • The instrumental test methodology can be imitative of human test behavior. - Abstract: Instrumental test of food quality using perception sensors instead of human panel tests is attracting massive attention recently. A novel cross-perception multi-sensors data fusion imitating multiple mammal perception was proposed for the instrumental test in this work. First, three mimic sensors of electronic eye, electronic nose and electronic tongue were used in sequence for data acquisition of rice wine samples. Then all data from the three different sensors were preprocessed and merged. Next, three cross-perception variables, i.e., color, aroma and taste, were constructed using principal components analysis (PCA) and multiple linear regression (MLR), which were used as the input of models. MLR, back-propagation artificial neural network (BPANN) and support vector machine (SVM) were comparatively used for modeling, and the instrumental test was achieved for the comprehensive quality of samples. Results showed the proposed cross-perception multi-sensors data fusion presented obvious superiority to the traditional data fusion methodologies, and also achieved a high correlation coefficient (>90%) with the human panel test results. This work demonstrated that the instrumental test based on the cross-perception multi-sensors data fusion can actually mimic the human test behavior, and is therefore of great significance to ensure the quality of products and decrease the loss of the manufacturers.
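
A mid-level fusion pipeline in the spirit of the abstract (per-sensor PCA compression into cross-perception variables, then MLR) can be sketched on synthetic data; all names, sizes and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 60                                         # number of samples (synthetic)
latent = rng.normal(size=(n, 3))               # hidden "color", "aroma", "taste"
# Each mimic sensor observes one latent variable through 5 noisy channels.
blocks = [np.outer(latent[:, k], rng.normal(size=5))
          + 0.1 * rng.normal(size=(n, 5)) for k in range(3)]
panel = latent @ np.array([0.5, 0.3, 0.2]) + 0.05 * rng.normal(size=n)

def first_pc(block):
    """First principal component score of a centred sensor block."""
    centred = block - block.mean(0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[0]

# Cross-perception variables: one PC per sensor, then MLR with intercept.
X = np.column_stack([np.ones(n)] + [first_pc(b) for b in blocks])
coef, *_ = np.linalg.lstsq(X, panel, rcond=None)
r = np.corrcoef(X @ coef, panel)[0, 1]
assert r > 0.9   # the fused linear model tracks the synthetic panel score
```

The paper additionally compares nonlinear regressors (BPANN, SVM) on the same cross-perception inputs; the linear MLR step above is only the simplest of the three.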

  15. Self-Organizing Architecture for Information Fusion in Distributed Sensor Networks

    National Research Council Canada - National Science Library

    Bajo, Javier; De Paz, Juan F; Villarrubia, Gabriel; Corchado, Juan M

    2015-01-01

    The management of heterogeneous distributed sensor networks requires new solutions that can address the problem of automatically fusing the information coming from different sources in an efficient and effective manner...

  16. Sensor Fusion Applied To Shape Sensing: Theory and Numerical Proof-of-Concept (poster)

    NARCIS (Netherlands)

    De Mooij, C.; Martinez, M.; Benedictus, R.

    2015-01-01

    Existing shape sensing methods use individual sensor types, determining either strain or displacement well, but not both. More accurate shape sensing could improve load estimation, necessary for accurate life assessment of the structure.

  17. Realization to Extend the Orientation Estimation Range of Moving Target on the Ground by a Single Vector Sensor

    Directory of Open Access Journals (Sweden)

    Xiaopeng Song

    2013-07-01

    Full Text Available The DOA (direction of arrival) estimation of seismic signals from a moving target on the ground bears great significance for unattended ground systems. Traditional DOA estimation of seismic signals is achieved by a sensor array and its corresponding algorithms. A MEMS (Micro-Electro-Mechanical Systems) vector vibration sensor, however, captures vector information about the propagation of seismic signals and can therefore produce a DOA estimate within a certain range from a single vector sensor. This paper proposes a new method to extend the orientation range through rotation of the MEMS vector vibration axis. The experiment shows that this method offers a simple system structure, high sensitivity and an average error of less than 5 degrees, giving it wide application prospects.
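
The basic single-vector-sensor DOA principle can be sketched in a few lines: the azimuth follows from the arctangent of the two orthogonal horizontal components of the measured ground velocity. This shows only the principle, not the paper's axis-rotation extension:

```python
import math

def doa_azimuth(vx, vy):
    """Azimuth (degrees from the x axis) from two orthogonal velocity components."""
    return math.degrees(math.atan2(vy, vx))

# A wave arriving from 30 degrees projects as (cos 30, sin 30) onto the axes.
vx, vy = math.cos(math.radians(30.0)), math.sin(math.radians(30.0))
print(round(doa_azimuth(vx, vy), 1))  # 30.0
```

Because a single fixed axis pair leaves sign ambiguities over parts of the circle, rotating the sensing axis (as the paper proposes) is one way to extend the usable orientation range.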

  18. Gesture-Directed Sensor-Information Fusion (GDSIF) for Protection and Communication in Hazardous Environments

    Science.gov (United States)

    2009-11-20

    G. Rogers, R. Luna, and J. Ellen, “Wireless Communication Glove Apparatus for Motion Tracking, Gesture Recognition, Data Transmission, and Reception...and easier to deploy in a variety of ways. (See, for example, [1] and [3].) The current eGloves have magnetic and motion sensors for gesture recognition [5], [6]. An important future step to enhance the effectiveness of the war fighter is to integrate CBRN and other sensors into the eGloves

  19. Para-quinodimethane-bridged perylene dimers and pericondensed quaterrylenes: The effect of the fusion mode on the ground states and physical properties

    KAUST Repository

    Das, Soumyajit

    2014-07-23

    Polycyclic hydrocarbon compounds with a singlet biradical ground state show unique physical properties and promising material applications; therefore, it is important to understand the fundamental structure/biradical character/physical properties relationships. In this study, para-quinodimethane (p-QDM)-bridged quinoidal perylene dimers 4 and 5 with different fusion modes and their corresponding aromatic counterparts, the pericondensed quaterrylenes 6 and 7, were synthesized. Their ground-state electronic structures and physical properties were studied by using various experiments assisted with DFT calculations. The proaromatic p-QDM-bridged perylene monoimide dimer 4 has a singlet biradical ground state with a small singlet/triplet energy gap (-2.97 kcalmol-1), whereas the antiaromatic s-indacene-bridged N-annulated perylene dimer 5 exists as a closed-shell quinoid with an obvious intramolecular charge-transfer character. Both of these dimers showed shorter singlet excited-state lifetimes, larger two-photon-absorption cross sections, and smaller energy gaps than the corresponding aromatic quaterrylene derivatives 6 and 7, respectively. Our studies revealed how the fusion mode and aromaticity affect the ground state and, consequently, the photophysical properties and electronic properties of a series of extended polycyclic hydrocarbon compounds. A matter of fusion mode! Fusion of a para-quinodimethane (p-QDM) subunit at the peri and β positions of perylene dimers leads to systems with different ground states, that is, open and closed shell (see picture). These systems showed large two-photon absorption cross sections and ultrafast excited-state dynamics relative to their corresponding pericondensed aromatic quaterrylene counterparts. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Accurate positioning of pedestrians in mixed indoor/outdoor settings: A particle filter approach to sensor and map fusion

    DEFF Research Database (Denmark)

    Toftkjær, Thomas

    Pedestrian positioning with full coverage in urban environments is a long-sought research goal. This thesis proposes new techniques for handling the challenging task of truly pervasive pedestrian positioning. It shows that through sensor fusion one can both improve accuracy and extend...... the coverage of pedestrian positioning, for professional users, such as first responders, as well as for the ordinary citizen. Since GNSS alone cannot satisfy the availability and accuracy demands for all indoor settings, this thesis pursues a better understanding of the capabilities of GNSS indoors...... and an extension to ProPosition, that advances the state-of-the-art in pedestrian positioning towards the pervasive research goals of full urban environment coverage, seamless indoor and outdoor positioning and the ability to run on ordinary user devices such as smartphones. Through ProLoc it is possible...

  1. 3D-Information Fusion from Very High Resolution Satellite Sensors

    OpenAIRE

    Krauss, T.; d'Angelo, P.; Kuschk, G.; Tian, J.; Partovi, T.

    2015-01-01

    In this paper we show the pre-processing and potential for environmental applications of very high resolution (VHR) satellite stereo imagery such as that from WorldView-2 or Pléiades with ground sampling distances (GSD) of half a metre to a metre. To process such data, first a dense digital surface model (DSM) has to be generated. Afterwards, from this a digital terrain model (DTM) representing the ground and a so-called normalized digital elevation model (nDEM) representing off-ground ...
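    The DSM → DTM → nDEM chain described above can be sketched with a grey-scale morphological opening, a common (assumed here, not necessarily the authors') way to approximate the bare-earth DTM, after which the normalized model is simply nDEM = DSM − DTM:

```python
import numpy as np

def local_min_filter(a, k):
    # Naive sliding-window minimum (grey-scale erosion) with edge padding.
    p = k // 2
    ap = np.pad(a, p, mode="edge")
    out = np.empty_like(a)
    for i in range(a.shape[0]):
        for j in range(a.shape[1]):
            out[i, j] = ap[i:i + k, j:j + k].min()
    return out

def local_max_filter(a, k):
    # Sliding-window maximum (grey-scale dilation) with edge padding.
    p = k // 2
    ap = np.pad(a, p, mode="edge")
    out = np.empty_like(a)
    for i in range(a.shape[0]):
        for j in range(a.shape[1]):
            out[i, j] = ap[i:i + k, j:j + k].max()
    return out

def ndem_from_dsm(dsm, window=5):
    # Opening (erosion then dilation) removes above-ground objects smaller
    # than the window, approximating the bare-earth DTM.
    dtm = local_max_filter(local_min_filter(dsm, window), window)
    return dsm - dtm, dtm

# Flat terrain at 100 m with one 3x3 "building" 10 m tall.
dsm = np.full((20, 20), 100.0)
dsm[8:11, 8:11] += 10.0
ndem, dtm = ndem_from_dsm(dsm, window=5)
print(ndem.max(), dtm.max())   # building height recovered; DTM stays at ground level
```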

  2. Cooperative Navigation and Coverage Identification with Random Gossip and Sensor Fusion

    OpenAIRE

    2016-01-01

    This paper is concerned with cooperative Terrain Aided Navigation of a network of aircraft using fusion of Radar Altimeter and inter-node range measurements. State inference is performed using a Rao-Blackwellized Particle Filter with online measurement noise statistics estimation. For terrain coverage measurement noise parameter identification, an online Expectation Maximization algorithm is proposed, where local sufficient statistics at each node are calculated in the E-step, which are then ...

  3. A chitosan-coated humidity sensor based on Mach-Zehnder interferometer with waist-enlarged fusion bitapers

    Science.gov (United States)

    Ni, Kai; Chan, Chi Chiu; Chen, Lihan; Dong, Xinyong; Huang, Ran; Ma, Qifei

    2017-01-01

    A novel humidity sensor, which adopts a Mach-Zehnder interferometer (MZI) in normal single-mode fiber (SMF) modified by the deposition of chitosan (a moisture-sensitive natural polymer) on the cladding, is proposed and experimentally demonstrated. It is fabricated by fusion-splicing a fiber segment between two SMFs with waist-enlarged fusion bitapers. This all-fiber MZI based on SMF exploits intermodal interference between the core mode and the cladding mode. Because the interferometer is sensitive to the external refractive index, and the RI of the chitosan multilayer film depends on the environmental humidity, the SMF-MZI with a chitosan coating layer of nanometer thickness is employed for humidity measurement. A sensitivity of ∼119.6 pm/RH (relative humidity unit) is achieved experimentally within the range from 10% to 90%. Moreover, the chitosan coating has good biocompatibility for future in vivo biomedical applications such as immunosensing and DNA hybridization detection.
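    As a minimal illustration of using the reported sensitivity, the dip-wavelength shift can be converted to relative humidity with a linear calibration. The 119.6 pm per relative-humidity-unit slope is taken from the abstract (read here as pm per %RH); the 1550 nm reference wavelength at 10 %RH is a made-up example value:

```python
# Linear calibration sketch for the chitosan-coated MZI humidity sensor:
# convert a measured dip-wavelength shift into relative humidity.
SENS_PM_PER_RH = 119.6      # pm per %RH, from the abstract
LAMBDA_REF_NM = 1550.000    # assumed dip wavelength at the reference humidity
RH_REF = 10.0               # %RH at the reference point (assumed)

def humidity_from_dip(lambda_nm):
    shift_pm = (lambda_nm - LAMBDA_REF_NM) * 1000.0  # nm -> pm
    return RH_REF + shift_pm / SENS_PM_PER_RH

# An 80 %RH rise shifts the dip by 80 * 119.6 pm = 9.568 nm.
print(round(humidity_from_dip(1559.568), 1))  # → 90.0
```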

  4. Development of Deep Learning Based Data Fusion Approach for Accurate Rainfall Estimation Using Ground Radar and Satellite Precipitation Products

    Science.gov (United States)

    Chen, H.; Chandra, C. V.; Tan, H.; Cifelli, R.; Xie, P.

    2016-12-01

    Rainfall estimation based on onboard satellite measurements has been an important topic in satellite meteorology for decades. A number of precipitation products at multiple time and space scales have been developed based upon satellite observations. For example, the NOAA Climate Prediction Center has developed a morphing technique (i.e., CMORPH) to produce global precipitation products by combining existing space-based rainfall estimates. The CMORPH products are essentially derived from geostationary satellite IR brightness temperature information and retrievals from passive microwave measurements (Joyce et al. 2004). Although the space-based precipitation products provide an excellent tool for regional and global hydrologic and climate studies as well as improved situational awareness for operational forecasts, their accuracy is limited by sampling, particularly for extreme events such as very light and/or heavy rain. On the other hand, ground-based radar is a more mature science for quantitative precipitation estimation (QPE), especially after the implementation of the dual-polarization technique, further enhanced by urban-scale radar networks. Therefore, ground radars are often critical for providing local-scale rainfall estimation and a "heads-up" for operational forecasters to issue watches and warnings, as well as for validation of various space measurements and products. The CASA DFW QPE system, which is based on dual-polarization X-band CASA radars and a local S-band WSR-88DP radar, has demonstrated excellent performance during several years of operation in a variety of precipitation regimes. The real-time CASA DFW QPE products are used extensively for localized hydrometeorological applications such as urban flash flood forecasting. In this paper, a neural network based data fusion mechanism is introduced to improve the satellite-based CMORPH precipitation product by taking into account the ground radar measurements.
A deep learning system is
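    The record above is truncated, but the core fusion idea — correcting a biased satellite product with more accurate ground-radar estimates — can be sketched with synthetic data. A linear least-squares fit stands in for the paper's neural-network fusion stage, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic truth plus two biased, noisy estimates of it: stand-ins for a
# ground-radar QPE and a satellite precipitation product.
truth = rng.gamma(2.0, 2.0, size=2000)                    # "true" rain rate
radar = 1.1 * truth + rng.normal(0.0, 0.3, truth.shape)   # small bias, low noise
sat   = 0.7 * truth + rng.normal(0.0, 1.0, truth.shape)   # larger bias and noise

# Learned fusion: ordinary least squares maps (radar, satellite) -> truth,
# a linear stand-in for the neural-network fusion stage.
X = np.column_stack([np.ones_like(truth), radar, sat])
coef, *_ = np.linalg.lstsq(X, truth, rcond=None)
fused = X @ coef

rmse = lambda e: float(np.sqrt(np.mean(e ** 2)))
print(rmse(fused - truth) < rmse(sat - truth))   # fusion beats satellite alone
```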

  5. Particle Filter-Based Recursive Data Fusion With Sensor Indexing for Large Core Neutron Flux Estimation

    Science.gov (United States)

    Tamboli, Prakash Kumar; Duttagupta, Siddhartha P.; Roy, Kallol

    2017-06-01

    We introduce a sequential importance sampling particle filter (PF)-based multisensor multivariate nonlinear estimator for estimating the in-core neutron flux distribution for a pressurized heavy water reactor core. Many critical applications such as reactor protection and control rely upon neutron flux information, and thus their reliability is of utmost importance. The point kinetic model based on neutron transport conveniently explains the dynamics of a nuclear reactor. The neutron flux in a large, loosely coupled reactor core is sensed by multiple sensors measuring point fluxes at various locations inside the reactor core. The flux values are coupled to each other through the diffusion equation. The coupling provides redundancy in the information. It is shown that multiple independent data about the localized flux can be fused together to enhance the estimation accuracy to a great extent. We also propose a sensor anomaly handling feature in the multisensor PF to maintain the estimation process even when a sensor is faulty or generates anomalous data.
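    A toy version of the multisensor particle filter can make the fusion step concrete: a bootstrap (SIR) filter tracks one scalar state and fuses two point sensors by multiplying their measurement likelihoods. The dynamics, noise levels and sensor count are illustrative, not the reactor model:

```python
import numpy as np

rng = np.random.default_rng(2)

N, T = 2000, 50
x_true = 10.0
particles = rng.normal(10.0, 5.0, N)
est = []
for t in range(T):
    # Process model: slow decay plus small process noise.
    x_true = 0.99 * x_true + rng.normal(0, 0.05)
    particles = 0.99 * particles + rng.normal(0, 0.05, N)
    z1 = x_true + rng.normal(0, 0.5)    # sensor 1
    z2 = x_true + rng.normal(0, 1.0)    # sensor 2 (noisier)
    # Fusion: weight particles by the product of the two Gaussian likelihoods.
    logw = -0.5 * ((z1 - particles) / 0.5) ** 2 \
           - 0.5 * ((z2 - particles) / 1.0) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    est.append(float(np.sum(w * particles)))      # posterior-mean estimate
    particles = particles[rng.choice(N, N, p=w)]  # multinomial resampling

print(abs(est[-1] - x_true) < 0.5)
```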

  6. Part task investigation of multispectral image fusion using gray scale and synthetic color night-vision sensor imagery for helicopter pilotage

    Science.gov (United States)

    Steele, Paul M.; Perconti, Philip

    1997-06-01

    Today, night vision sensor and display systems used in the pilotage or navigation of military helicopters are either long-wave IR thermal sensors (8 - 12 microns) or image-intensified, visible and near-IR (0.6 - 0.9 microns) sensors. The sensor imagery is displayed using a monochrome phosphor on a Cathode Ray Tube or night vision goggle. Currently, there is no fielded capability to combine the best attributes of the emissive radiation sensed by the thermal sensor and the reflected radiation sensed by the image-intensified sensor into a single fused image. However, recent advances in signal processing have permitted the real-time image fusion and display of multispectral sensors in either monochrome or synthetic chromatic form. The merits of such signal processing are explored. A part-task simulation using a desktop computer, video playback unit, and a biocular head-mounted display was conducted. Response time and accuracy measures of test subject responses to visual perception tasks were taken. Subjective ratings were collected to determine levels of pilot acceptance. In general, fusion-based formats resulted in better subject performance. The benefit of integrating synthetic color into fused imagery, however, depends on the color algorithm used, the visual task performed, and the scene content.
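    The two display formats compared above — monochrome fusion versus synthetic color — can be mimicked in a few lines: average the two bands for a grey-scale product, or map thermal to red and image-intensified to green for a false-color composite. The band-to-channel assignment is an assumption; the study's color algorithms are more sophisticated, and the imagery here is synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)
thermal = rng.random((64, 64))          # stand-in for the LWIR band
intens  = rng.random((64, 64))          # stand-in for the image-intensified band
thermal[20:30, 20:30] = 1.0             # a hot target visible only in thermal

mono = 0.5 * thermal + 0.5 * intens     # grey-scale fused display
# Synthetic color: thermal -> red, intensified -> green, empty blue channel.
color = np.dstack([thermal, intens, np.zeros_like(thermal)])

print(mono.shape, color.shape)
```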

  7. Coincident Observation of Lightning using Spaceborne Spectrophotometer and Ground-Level Electromagnetic Sensors

    Science.gov (United States)

    Adachi, Toru; Cohen, Morris; Li, Jingbo; Cummer, Steve; Blakeslee, Richard; Marshall, Thomas; Stolzenburg, Maribeth; Karunarathne, Sumedhe; Hsu, Rue-Ron; Su, Han-Tzong; Chen, Alfred; Takahashi, Yukihiro; Frey, Harald; Mende, Stephen

    2012-01-01

    The present study aims at assessing a possible new way to reveal the properties of lightning flashes, using spectrophotometric data obtained by FORMOSAT-2/ISUAL, which is the first spaceborne multicolor lightning detector. The ISUAL data were analyzed in conjunction with ground-based electromagnetic data obtained by Duke magnetic field sensors, NLDN, the North Alabama Lightning Mapping Array (LMA), and Kennedy Space Center (KSC) electric field antennas. We first classified the observed events into cloud-to-ground (CG) and intra-cloud (IC) lightning based on the Duke and NLDN measurements and analyzed ISUAL data to clarify their optical characteristics. It was found that the ISUAL optical waveform of CG lightning was strongly correlated with the current moment waveform, suggesting that it is possible to evaluate the electrical properties of lightning from satellite optical measurements to some extent. The ISUAL data also indicated that the color of CG lightning turned to red at the time of the return stroke, while the color of IC pulses remained unchanged. Furthermore, in one CG event that was simultaneously detected by ISUAL and the LMA, the observed optical emissions slowly turned red as the altitude of the optical source gradually decreased. All of these results indicate that the color of a lightning flash depends on the source altitude and suggest that spaceborne optical measurement could be a new tool to discriminate CG and IC lightning. In the presentation, we will also show results of the comparison between the ISUAL and KSC electric field data to clarify characteristics of each lightning process, such as preliminary breakdown, return stroke, and subsequent upward illumination.

  8. Observability considerations for multi-sensor and product fusion: Bias, information content, and validation (Invited)

    Science.gov (United States)

    Reid, J. S.; Zhang, J.; Hyer, E. J.; Campbell, J. R.; Christopher, S. A.; Ferrare, R. A.; Leptoukh, G. G.; Stackhouse, P. W.

    2009-12-01

    With the successful development of many aerosol products from the NASA A-train as well as new operational geostationary and polar-orbiting sensors, the scientific community now has a host of new parameters to use in their analyses. The variety and quality of products has reached a point where the community has moved from basic observation-based science to sophisticated multi-component research that addresses the complex atmospheric environment. In order for these satellite data to contribute to the science, their uncertainty levels must move from semi-quantitative to quantitative. Initial attempts to quantify uncertainties have led to some recent debate in the community as to the efficacy of aerosol products from current and future NASA satellite sensors. In an effort to understand the state of satellite product fidelity, the Naval Research Laboratory and a newly reformed Global Energy and Water Cycle Experiment (GEWEX) aerosol panel have both initiated assessments of the nature of aerosol remote sensing uncertainty and bias. In this talk we go over areas of specific concern based on the authors’ experiences with the data, emphasizing the multi-sensor problem. We first enumerate potential biases, including retrieval, sampling/contextual, and cognitive bias. We show examples of how these biases can subsequently lead to the pitfalls of correlated/compensating errors, tautology, and confounding. The nature of bias is closely related to the information content of the sensor signal and its subsequent application to the derived aerosol quantity of interest (e.g., optical depth, flux, index of refraction, etc.). Consequently, purpose-specific validation methods must be employed, especially when generating multi-sensor products. Indeed, cloud and lower boundary condition biases in particular complicate the more typical methods of regressional bias elimination and histogram matching. We close with a discussion of sequestration of uncertainty in multi-sensor applications of

  9. Inertial Head-Tracker Sensor Fusion by a Complementary Separate-Bias Kalman Filter

    Science.gov (United States)

    Foxlin, Eric

    1996-01-01

    Current virtual environment and teleoperator applications are hampered by the need for an accurate, quick-responding head-tracking system with a large working volume. Gyroscopic orientation sensors can overcome problems with jitter, latency, interference, line-of-sight obscurations, and limited range, but suffer from slow drift. Gravimetric inclinometers can detect attitude without drifting, but are slow and sensitive to transverse accelerations. This paper describes the design of a Kalman filter to integrate the data from these two types of sensors in order to achieve the excellent dynamic response of an inertial system without drift, and without the acceleration sensitivity of inclinometers.
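    The gyro-plus-inclinometer integration described above can be sketched as a small separate-bias Kalman filter: the state is [tilt angle, gyro bias], the gyro drives the prediction, and the drift-free but noisy inclinometer provides the correction. All noise figures are invented for the demo:

```python
import numpy as np

rng = np.random.default_rng(4)
dt, T = 0.01, 2000
t = np.arange(T) * dt

# True tilt motion, and a gyro with a constant bias.
theta_true = 0.5 * np.sin(0.5 * t)
omega_true = 0.25 * np.cos(0.5 * t)
bias_true = 0.05                                          # rad/s gyro bias
gyro = omega_true + bias_true + rng.normal(0, 0.01, T)
incl = theta_true + rng.normal(0, 0.05, T)                # unbiased but noisy

# Two-state Kalman filter: x = [tilt angle, gyro bias].
x = np.zeros(2)
P = np.eye(2)
F = np.array([[1.0, -dt], [0.0, 1.0]])   # theta += (gyro - bias) * dt
Q = np.diag([1e-6, 1e-8])
H = np.array([[1.0, 0.0]])
R = np.array([[0.05 ** 2]])
for k in range(T):
    # Predict: integrate the (biased) gyro, propagate the bias state.
    x = F @ x + np.array([dt * gyro[k], 0.0])
    P = F @ P @ F.T + Q
    # Update with the inclinometer angle measurement.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([incl[k]]) - H @ x)
    P = (np.eye(2) - K @ H) @ P

print(round(x[1], 3))   # estimated gyro bias, close to 0.05 rad/s
```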

  10. Sensor Integration, Management and Data Fusion Concepts in a Naval Command and Control Perspective

    Science.gov (United States)

    2016-06-07

    October 1998. ...efforts to reduce uncertainty and is invaluable in resolving ambiguities. Complementarity results if the sensor suite is made up of sensors, each of which observes a subset of the environment state space, such that the union of these subsets makes up the whole environment state space which is of

  11. Tilt performance of the ground settlement sensor configured in a fiber-optic low-coherent interferometer.

    Science.gov (United States)

    Zhang, Pinglei; Wei, Heming; Guo, Jingjing; Sun, Changsen

    2016-10-01

    Ground settlement (GS) is one of the causes that destroy the durability of reinforced concrete structures. It could lead to a deterioration in the structural basement and increase the risk of collapse. The methods used for GS monitoring were mostly electronic-based sensors for reading the changes in resistance, resonant frequencies, etc. These sensors often bear low accuracy in the long term. Our published work demonstrated that a fiber-optic low-coherent interferometer configured in a Michelson interferometer was designed as a GS sensor, and a micro-meter resolution in the room environment was approached. However, the designed GS sensor, which in principle is based on a hydraulic connecting vessel, has to suffer from a tilt degeneration problem due to a strictly vertical requirement in practical installment. Here, we made a design for the GS sensor based on its robust tilt performance. The experimental tests show that the sensor can work well within a ±5° tilt. This could meet the requirements in most designed GS sensor installment applications.

  12. Cross-Characterization of Aerosol Properties from Multiple Spaceborne Sensors Facilitated by Regional Ground-Based Observations

    Science.gov (United States)

    Petrenko, Maksym; Ichoku, Charles; Leptoukh, Gregory

    2010-01-01

    Aerosol observations from space have become a standard source for retrieval of aerosol properties on both regional and global scales. Indeed, the large number of currently operational spaceborne sensors provides unprecedented access to the most complete set of complementary aerosol measurements ever to be available. Nonetheless, this resource remains under-utilized, largely due to the discrepancies and differences existing between the sensors and their aerosol products. To characterize the inconsistencies and bridge the gap that exists between the sensors, we have designed and implemented an online Multi-sensor Aerosol Products Sampling System (MAPSS) that facilitates the joint sampling of aerosol data from multiple sensors. MAPSS consistently samples aerosol products from multiple spaceborne sensors using a unified spatial and temporal resolution, where each dataset is sampled over Aerosol Robotic Network (AERONET) locations together with coincident AERONET data samples. In this way, MAPSS enables a direct cross-characterization and data integration between aerosol products from multiple sensors. Moreover, the well-characterized co-located ground-based AERONET data provides the basis for the integrated validation of these products.
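    The joint-sampling step can be sketched as follows: retrievals falling within a fixed radius of a ground site are averaged onto a common footprint, so products from different sensors become comparable. The equirectangular distance, the 27.5 km radius, and the synthetic AOD values are illustrative, not the actual MAPSS configuration:

```python
import numpy as np

def colocate(sat_lat, sat_lon, sat_aod, site_lat, site_lon, radius_km=27.5):
    # Small-angle equirectangular distance, adequate over tens of km.
    km_per_deg = 111.0
    dlat = (sat_lat - site_lat) * km_per_deg
    dlon = (sat_lon - site_lon) * km_per_deg * np.cos(np.radians(site_lat))
    d = np.hypot(dlat, dlon)
    sel = d <= radius_km
    return float(np.mean(sat_aod[sel])) if sel.any() else None

# Synthetic retrievals scattered around an AERONET-like site at (38N, 77W).
rng = np.random.default_rng(5)
lat = 38.0 + rng.uniform(-1, 1, 500)
lon = -77.0 + rng.uniform(-1, 1, 500)
aod = 0.2 + 0.02 * rng.standard_normal(500)
print(round(colocate(lat, lon, aod, 38.0, -77.0), 2))   # near the 0.2 background
```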

  13. 3D Buried Utility Location Using A Marching-Cross-Section Algorithm for Multi-Sensor Data Fusion

    Directory of Open Access Journals (Sweden)

    Qingxu Dou

    2016-11-01

    Full Text Available We address the problem of accurately locating buried utility segments by fusing data from multiple sensors using a novel Marching-Cross-Section (MCS) algorithm. Five types of sensors are used in this work: Ground Penetrating Radar (GPR), Passive Magnetic Fields (PMF), Magnetic Gradiometer (MG), Low Frequency Electromagnetic Fields (LFEM) and Vibro-Acoustics (VA). As part of the MCS algorithm, a novel formulation of the extended Kalman Filter (EKF) is proposed for marching existing utility tracks from a scan cross-section (scs) to the next one; novel rules for initializing utilities based on hypothesized detections on the first scs and for associating predicted utility tracks with hypothesized detections in the following scss are introduced. Algorithms are proposed for generating virtual scan lines based on given hypothesized detections when different sensors do not share common scan lines, or when only the coordinates of the hypothesized detections are provided without any information of the actual survey scan lines. The performance of the proposed system is evaluated with both synthetic data and real data. The experimental results in this work demonstrate that the proposed MCS algorithm can locate multiple buried utility segments simultaneously, including both straight and curved utilities, and can separate intersecting segments. By using the probabilities of a hypothesized detection being a pipe or a cable together with its 3D coordinates, the MCS algorithm is able to discriminate a pipe and a cable close to each other. The MCS algorithm can be used for both post- and on-site processing. When it is used on site, the detected tracks on the current scs can help to determine the location and direction of the next scan line. The proposed “multi-utility multi-sensor” system has no limit to the number of buried utilities or the number of sensors, and the more sensor data used, the more buried utility segments can be detected with more accurate location
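    A 1-D toy of the marching step may help: a track's lateral offset and slope are propagated from one cross-section to the next by a constant-slope model, and the nearest gated detection is used as the measurement. A linear Kalman update is used here; the paper's EKF, the depth dimension, track initiation, and multi-track bookkeeping are omitted:

```python
import numpy as np

def march(track, P, detections, ds=1.0, gate=0.5, r=0.05 ** 2):
    # Predict: constant-slope motion from one cross-section to the next.
    F = np.array([[1.0, ds], [0.0, 1.0]])
    Q = np.diag([1e-4, 1e-6])
    x = F @ track
    P = F @ P @ F.T + Q
    if detections.size:
        # Associate the nearest hypothesized detection, gated by distance.
        z = detections[np.argmin(np.abs(detections - x[0]))]
        if abs(z - x[0]) < gate:
            S = P[0, 0] + r                  # innovation variance (H = [1, 0])
            K = P[:, 0] / S                  # Kalman gain
            x = x + K * (z - x[0])
            P = P - np.outer(K, P[0, :])
    return x, P

# A straight pipe drifting laterally by 0.1 per cross-section, plus clutter.
rng = np.random.default_rng(6)
x, P = np.array([2.0, 0.0]), np.eye(2)
for s in range(1, 30):
    truth = 2.0 + 0.1 * s
    dets = np.array([truth + rng.normal(0, 0.05), truth + 1.5])  # real + clutter
    x, P = march(x, P, dets)
print(round(x[1], 2))   # estimated slope, near 0.1
```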

  14. Motor Function Evaluation of Hemiplegic Upper-Extremities Using Data Fusion from Wearable Inertial and Surface EMG Sensors.

    Science.gov (United States)

    Li, Yanran; Zhang, Xu; Gong, Yanan; Cheng, Ying; Gao, Xiaoping; Chen, Xiang

    2017-03-13

    Quantitative evaluation of motor function is in great demand for monitoring the clinical outcome of applied interventions and further guiding the establishment of therapeutic protocols. This study proposes a novel framework for evaluating upper-limb motor function based on data fusion from inertial measurement units (IMUs) and surface electromyography (EMG) sensors. With wearable sensors worn on the tested upper limbs, subjects were asked to perform eleven straightforward, specifically designed canonical upper-limb functional tasks. A series of machine learning algorithms were applied to the recorded motion data to produce evaluation indicators, which are able to reflect the level of upper-limb motor function abnormality. Sixteen healthy subjects and eighteen stroke subjects with substantial hemiparesis were recruited in the experiment. The combined IMU and EMG data yielded superior performance over the IMU data alone and the EMG data alone, in terms of a decreased normal data variation rate (NDVR) and an improved determination coefficient (DC) from a regression analysis between the derived indicator and a routine clinical assessment score. Three common unsupervised learning algorithms achieved comparable performance, with an NDVR around 10% and a strong DC around 0.85. By contrast, the use of a supervised algorithm was able to dramatically decrease the NDVR to 6.55%. With the proposed framework, all the produced indicators demonstrated high agreement with the routine clinical assessment scale, indicating their capability of assessing upper-limb motor functions. This study offers a feasible solution to motor function assessment in an objective and quantitative manner, especially suitable for home and community use.

  15. A Sensor Fusion Method Based on an Integrated Neural Network and Kalman Filter for Vehicle Roll Angle Estimation

    Directory of Open Access Journals (Sweden)

    Leandro Vargas-Meléndez

    2016-08-01

    Full Text Available This article presents a novel estimator based on sensor fusion, which combines the Neural Network (NN) with a Kalman filter in order to estimate the vehicle roll angle. The NN estimates a “pseudo-roll angle” through variables that are easily measured from Inertial Measurement Unit (IMU) sensors. An IMU is a device that is commonly used for vehicle motion detection, and its cost has decreased during recent years. The pseudo-roll angle is introduced in the Kalman filter in order to filter noise and minimize the variance of the norm and maximum errors’ estimation. The NN has been trained for J-turn maneuvers, double lane change maneuvers and lane change maneuvers at different speeds and road friction coefficients. The proposed method takes into account the vehicle non-linearities, thus yielding good roll angle estimation. Finally, the proposed estimator has been compared with one that uses the suspension deflections to obtain the pseudo-roll angle. Experimental results show the effectiveness of the proposed NN and Kalman filter-based estimator.

  16. A Sensor Fusion Method Based on an Integrated Neural Network and Kalman Filter for Vehicle Roll Angle Estimation.

    Science.gov (United States)

    Vargas-Meléndez, Leandro; Boada, Beatriz L; Boada, María Jesús L; Gauchía, Antonio; Díaz, Vicente

    2016-08-31

    This article presents a novel estimator based on sensor fusion, which combines the Neural Network (NN) with a Kalman filter in order to estimate the vehicle roll angle. The NN estimates a "pseudo-roll angle" through variables that are easily measured from Inertial Measurement Unit (IMU) sensors. An IMU is a device that is commonly used for vehicle motion detection, and its cost has decreased during recent years. The pseudo-roll angle is introduced in the Kalman filter in order to filter noise and minimize the variance of the norm and maximum errors' estimation. The NN has been trained for J-turn maneuvers, double lane change maneuvers and lane change maneuvers at different speeds and road friction coefficients. The proposed method takes into account the vehicle non-linearities, thus yielding good roll angle estimation. Finally, the proposed estimator has been compared with one that uses the suspension deflections to obtain the pseudo-roll angle. Experimental results show the effectiveness of the proposed NN and Kalman filter-based estimator.
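    The pseudo-measurement idea in this estimator can be sketched with a scalar Kalman filter: the roll-rate gyro drives the prediction and a noisy "pseudo-roll" (here a noisy copy of the truth, standing in for the NN output) provides the correction. All signals and noise levels are synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)
dt, T = 0.01, 1500
t = np.arange(T) * dt
roll_true = 0.1 * np.sin(1.5 * t)                              # rad, gentle sway
p_gyro = np.gradient(roll_true, dt) + rng.normal(0, 0.02, T)   # measured roll rate

# "Pseudo-roll": the paper's NN maps easily measured IMU signals to a noisy
# roll estimate; a noisy copy of the truth stands in for that output here.
pseudo = roll_true + rng.normal(0, 0.05, T)

# Scalar Kalman filter: propagate with the gyro, correct with the pseudo-roll.
x, P = 0.0, 1.0
q, r = (0.02 * dt) ** 2, 0.05 ** 2
est = np.empty(T)
for k in range(T):
    x += p_gyro[k] * dt        # predict
    P += q
    K = P / (P + r)            # update with the NN pseudo-measurement
    x += K * (pseudo[k] - x)
    P *= (1 - K)
    est[k] = x

rmse = float(np.sqrt(np.mean((est - roll_true) ** 2)))
print(rmse < 0.05)   # fused estimate is less noisy than the pseudo-roll alone
```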

  17. A Navigation System for the Visually Impaired: A Fusion of Vision and Depth Sensor.

    Science.gov (United States)

    Kanwal, Nadia; Bostanci, Erkan; Currie, Keith; Clark, Adrian F

    2015-01-01

    For a number of years, scientists have been trying to develop aids that can make visually impaired people more independent and aware of their surroundings. Computer-based automatic navigation tools are one example of this, motivated by the increasing miniaturization of electronics and the improvement in processing power and sensing capabilities. This paper presents a complete navigation system based on low cost and physically unobtrusive sensors such as a camera and an infrared sensor. The system is based around corners and depth values from Kinect's infrared sensor. Obstacles are found in images from a camera using corner detection, while input from the depth sensor provides the corresponding distance. The combination is both efficient and robust. The system not only identifies hurdles but also suggests a safe path (if available) to the left or right side and tells the user to stop, move left, or move right. The system has been tested in real time by both blindfolded and blind people at different indoor and outdoor locations, demonstrating that it operates adequately.
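    The steering logic described above (forward / move left / move right / stop) can be sketched by splitting the frame into thirds and checking each third for near obstacles. Corner positions and depths are passed in directly here (the paper detects corners in the camera image and reads depth from the infrared sensor), and the thresholds are illustrative:

```python
import numpy as np

def advise(corners_x, depths_m, frame_w=640, danger_m=1.5):
    # Split the frame into left / centre / right thirds and mark a third as
    # blocked if any obstacle corner in it is closer than danger_m.
    thirds = np.array([frame_w / 3, 2 * frame_w / 3])
    blocked = [False, False, False]
    for x, d in zip(corners_x, depths_m):
        if d < danger_m:
            blocked[int(np.searchsorted(thirds, x))] = True
    if not blocked[1]:
        return "forward"
    if not blocked[0]:
        return "move left"
    if not blocked[2]:
        return "move right"
    return "stop"

print(advise([320, 100], [1.0, 3.0]))   # centre blocked at 1 m, left clear
```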

  18. Camera-based platform and sensor motion tracking for data fusion in a landmine detection system

    NARCIS (Netherlands)

    Mark, W. van der; Heuvel, J.C. van den; Breejen, E. den; Groen, F.C.A.

    2003-01-01

    Vehicles that serve as landmine detection robots could be an important tool for demining former conflict areas. On the LOTUS platform for humanitarian demining, different sensors are used to detect a wide range of landmine types. Reliable and accurate detection depends on correctly combi

  19. The role of unattended ground sensors (UGS) in regional confidence building and arms control

    Energy Technology Data Exchange (ETDEWEB)

    Vannoni, M.; Duggan, R.

    1997-03-01

    Although the Cold War has ended, the world has not become more peaceful. Without the stability provided by an international system dominated by two super-powers, local conflicts are more likely to escalate. Agreements to counter destabilizing pressures in regional conflicts can benefit from the use of cooperative monitoring. Cooperative monitoring is the collecting, analyzing, and sharing of information among parties to an agreement. Ground sensor technologies can contribute to the collection of relevant information. If implemented with consideration for local conditions, cooperative monitoring can build confidence, strengthen existing agreements, and set the stage for continued progress. This presentation describes two examples: the Israeli-Egyptian Sinai agreements of the 1970s and a conceptual example for the contemporary Korean Peninsula. The Sinai was a precedent for the successful use of UGS within the context of cooperative monitoring. The Korean Peninsula is the site of the world's largest military confrontation. Future confidence building measures that address the security needs of both countries could decrease the danger of conflict and help create an environment for a peace agreement.

  20. Seismic Target Classification Using a Wavelet Packet Manifold in Unattended Ground Sensors Systems

    Directory of Open Access Journals (Sweden)

    Enliang Song

    2013-07-01

    Full Text Available One of the most challenging problems in target classification is the extraction of a robust feature that can effectively represent a specific type of target. The use of seismic signals in unattended ground sensor (UGS) systems makes this problem more complicated, because the seismic target signal is non-stationary, geology-dependent and has a high-dimensional feature space. This paper proposes a new feature extraction algorithm, called the wavelet packet manifold (WPM), which applies the neighborhood preserving embedding (NPE) algorithm of manifold learning to the wavelet packet node energy (WPNE) of seismic signals. By combining non-stationary information and low-dimensional manifold information, the WPM provides a more robust representation for seismic target classification. By using a K-nearest-neighbors classifier on the WPM signature, the algorithm of wavelet packet manifold classification (WPMC) is proposed. Experimental results show that the proposed WPMC can not only reduce feature dimensionality, but also improve the classification accuracy up to 95.03%. Moreover, compared with state-of-the-art methods, WPMC is more suitable for UGS in terms of recognition ratio and computational complexity.
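    A miniature stand-in for the WPNE-plus-classifier pipeline: Haar wavelet-packet node energies feed a 1-nearest-neighbour classifier. The paper additionally embeds the WPNE vectors with NPE manifold learning and uses K neighbours; the signals here are noisy tones, not real seismic data:

```python
import numpy as np

def haar_step(x):
    # One level of the Haar analysis filter pair.
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail (high-pass)
    return a, d

def wpne(x, levels=3):
    # Full wavelet-packet tree to the given depth; normalized node energies.
    nodes = [np.asarray(x, float)]
    for _ in range(levels):
        nodes = [half for n in nodes for half in haar_step(n)]
    e = np.array([np.sum(n ** 2) for n in nodes])
    return e / e.sum()

def knn1(train_X, train_y, x):
    # 1-nearest-neighbour classification in the energy-feature space.
    d = np.linalg.norm(train_X - x, axis=1)
    return train_y[int(np.argmin(d))]

rng = np.random.default_rng(8)
t = np.arange(256)
def make(f):
    # Noisy tone at frequency index f: a crude stand-in for a seismic signature.
    return np.sin(2 * np.pi * f * t / 256) + 0.3 * rng.standard_normal(t.size)

train_X = np.array([wpne(make(f)) for f in [8, 8, 8, 60, 60, 60]])
train_y = np.array([0, 0, 0, 1, 1, 1])     # two hypothetical target classes
print(knn1(train_X, train_y, wpne(make(60))))
```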

  1. Geographic information system for fusion and analysis of high-resolution remote sensing and ground truth data

    Science.gov (United States)

    Freeman, Anthony; Way, Jo Bea; Dubois, Pascale; Leberl, Franz

    1992-01-01

    We seek to combine high-resolution remotely sensed data with models and ground truth measurements, in the context of a Geographical Information System, integrated with specialized image processing software. We will use this integrated system to analyze the data from two Case Studies, one at a boreal forest site, the other a tropical forest site. We will assess the information content of the different components of the data, determine the optimum data combinations to study biogeophysical changes in the forest, assess the best way to visualize the results, and validate the models for the forest response to different radar wavelengths/polarizations. During the 1990's, unprecedented amounts of high-resolution images from space of the Earth's surface will become available to the applications scientist from the LANDSAT/TM series, European and Japanese ERS-1 satellites, RADARSAT and SIR-C missions. When the Earth Observation Systems (EOS) program is operational, the amount of data available for a particular site can only increase. The interdisciplinary scientist, seeking to use data from various sensors to study his site of interest, may be faced with massive difficulties in manipulating such large data sets, assessing their information content, determining the optimum combinations of data to study a particular parameter, visualizing his results and validating his model of the surface. The techniques to deal with these problems are also needed to support the analysis of data from NASA's current program of Multi-sensor Airborne Campaigns, which will also generate large volumes of data. In the Case Studies outlined in this proposal, we will have somewhat unique data sets. For the Bonanza Creek Experimental Forest (Case I) calibrated DC-8 SAR data and extensive ground truth measurements are already at our disposal. The data set shows documented evidence of temporal change.
The Belize Forest Experiment (Case II) will produce calibrated DC-8 SAR and AVIRIS data, together with

  2. Geographic information system for fusion and analysis of high-resolution remote sensing and ground truth data

    Science.gov (United States)

    Freeman, Anthony; Way, Jo Bea; Dubois, Pascale; Leberl, Franz

    1992-01-01

    We seek to combine high-resolution remotely sensed data with models and ground truth measurements, in the context of a Geographical Information System, integrated with specialized image processing software. We will use this integrated system to analyze the data from two Case Studies, one at a boreal forest site, the other at a tropical forest site. We will assess the information content of the different components of the data, determine the optimum data combinations to study biogeophysical changes in the forest, assess the best way to visualize the results, and validate the models for the forest response to different radar wavelengths/polarizations. During the 1990s, unprecedented amounts of high-resolution images of the Earth's surface from space will become available to the applications scientist from the LANDSAT/TM series, the European and Japanese ERS-1 satellites, and the RADARSAT and SIR-C missions. When the Earth Observation Systems (EOS) program is operational, the amount of data available for a particular site can only increase. The interdisciplinary scientist, seeking to use data from various sensors to study a site of interest, may be faced with massive difficulties in manipulating such large data sets, assessing their information content, determining the optimum combinations of data to study a particular parameter, visualizing the results and validating a model of the surface. The techniques to deal with these problems are also needed to support the analysis of data from NASA's current program of Multi-sensor Airborne Campaigns, which will also generate large volumes of data. In the Case Studies outlined in this proposal, we will have somewhat unique data sets. For the Bonanza Creek Experimental Forest (Case I), calibrated DC-8 SAR data and extensive ground truth measurements are already at our disposal. The data set shows documented evidence of temporal change.
The Belize Forest Experiment (Case II) will produce calibrated DC-8 SAR and AVIRIS data, together with

  3. Fusion of Redundant Aided-inertial Sensors with Decentralised Kalman Filter for Autonomous Underwater Vehicle Navigation

    Directory of Open Access Journals (Sweden)

    Vaibhav Awale

    2015-11-01

    Full Text Available Most submarines carry more than one inertial navigation system (INS) for redundancy and reliability. Apart from the INS, the submarine carries other sensors that provide different navigation information. A major challenge is to combine these sensors and the INS estimates in an optimal and robust manner for navigation. This issue has been addressed by Farrell [1]. The same approach is used in this paper to combine different sensor measurements with the INS. However, since more than one INS is available onboard, it is better to use multiple INS units at the same time to obtain a better estimate of the states and to provide autonomy in the event of failure of one INS. This requires combining the estimates obtained from the local filters (one INS integrated with external sensors) in some optimal way to provide a global estimate. Individual sensor and IMU measurements cannot be accessed in this scenario. Also, autonomous operation requires no sharing of information among local filters. Hence a decentralised Kalman filter approach is considered for combining the estimates of the local filters into a global estimate. This estimate is not optimal, however. A better estimate could be obtained by accessing individual measurements and augmenting the state vector of a centralised Kalman filter, but in that case corruption of one INS would lead to failure of the whole filter. Hence, to ensure satisfactory performance of the filter even in the event of failure of an INS, the decentralised Kalman filtering approach is adopted.
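The core of the decentralised approach described above is combining local filter outputs without access to the raw measurements. A minimal sketch of that fusion step, using inverse-variance (information) weighting on a scalar state with illustrative numbers (the real filter fuses full state vectors and covariance matrices):

```python
# Sketch of decentralised fusion of local INS filter estimates.
# Each local filter reports an estimate x_i with variance p_i; the
# global estimate is their information-weighted combination.
# Names and numbers are illustrative, not from the paper.

def fuse_estimates(estimates):
    """Fuse (value, variance) pairs by inverse-variance weighting."""
    info = sum(1.0 / p for _, p in estimates)          # total information
    state = sum(x / p for x, p in estimates) / info    # weighted mean
    return state, 1.0 / info                           # fused value, variance

# Two local INS/aiding filters disagree slightly on position (metres);
# the more confident filter (smaller variance) dominates the fusion.
local = [(10.2, 4.0), (9.8, 1.0)]
x, p = fuse_estimates(local)
```

Note that a faithful implementation must also account for correlated process noise between the local filters, which this scalar sketch ignores.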

  4. Noise modeling and analysis of an IMU-based attitude sensor: improvement of performance by filtering and sensor fusion

    Science.gov (United States)

    K., Nirmal; A. G., Sreejith; Mathew, Joice; Sarpotdar, Mayuresh; Suresh, Ambily; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2016-07-01

    We describe the characterization and removal of noise present in the Inertial Measurement Unit (IMU) MPU-6050, which was initially used in an attitude sensor, and later used in the development of a pointing system for small balloon-borne astronomical payloads. We found that the performance of the IMU degraded with time because of the accumulation of different errors. Using the Allan variance method, we identified the different components of noise present in the IMU, and verified the results by power spectral density (PSD) analysis. We tried to remove the high-frequency noise using smoothing filters such as the moving average filter and the Savitzky-Golay (SG) filter. Even though we managed to filter some high-frequency noise, the performance of these filters was not satisfactory for our application. We found the distribution of the random noise present in the IMU using probability density analysis and identified that the noise in our IMU was white Gaussian in nature. Hence, we used a Kalman filter to remove the noise, which gave us good real-time performance.
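The Allan variance analysis mentioned above bins the IMU samples at increasing cluster times and measures how adjacent bin averages differ; the slope of the variance-versus-cluster-time curve identifies each noise component. A minimal non-overlapping version (function name and data are illustrative):

```python
def allan_variance(samples, m):
    """Non-overlapping Allan variance for a cluster size of m samples.

    samples: a list of equally spaced sensor readings (e.g. gyro rate).
    m: number of samples per cluster; the cluster time is m * dt.
    """
    k = len(samples) // m                                   # number of clusters
    bins = [sum(samples[i*m:(i+1)*m]) / m for i in range(k)]  # cluster averages
    diffs = [(bins[i+1] - bins[i]) ** 2 for i in range(k - 1)]
    return sum(diffs) / (2 * (k - 1))
```

Sweeping m over a logarithmic range and plotting the square root (Allan deviation) against cluster time reveals white noise, bias instability and random walk as regions of characteristic slope.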

  5. Noise modeling and analysis of an IMU-based attitude sensor: improvement of performance by filtering and sensor fusion

    CERN Document Server

    Nirmal, K; Mathew, Joice; Sarpotdar, Mayuresh; Suresh, Ambily; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2016-01-01

    We describe the characterization and removal of noise present in the Inertial Measurement Unit (IMU) MPU-6050. This IMU was initially used in an attitude sensor (AS) developed in-house, and subsequently implemented in a pointing and stabilization platform developed for small balloon-borne astronomical payloads. We found that the performance of the IMU degrades with time due to the accumulation of different errors. Using the Allan variance analysis method, we identified the different components of noise present in the IMU and verified the results using a power spectral density analysis (PSD). We tried to remove the high-frequency noise using smoothing filters, such as moving average filter and Savitzky-Golay filter. Although we did manage to filter some of the high-frequency noise, the performance of these filters was not satisfactory for our application. We found the distribution of the random noise present in the IMU using a probability density analysis, and identified the noise to be white Gaussian in natur...

  6. Tomographic data fusion with CFD simulations associated with a planar sensor

    Science.gov (United States)

    Liu, J.; Liu, S.; Sun, S.; Zhou, W.; Schlaberg, I. H. I.; Wang, M.; Yan, Y.

    2017-04-01

    Tomographic techniques have a great ability to interrogate combustion processes, especially when combined with physical models of the combustion itself. In this study, a data fusion algorithm is developed to investigate the flame distribution of a swirl-induced environmental (EV) burner, a new type of burner for low-NOx combustion. An electrical capacitance tomography (ECT) system is used to acquire 3D flame images and computational fluid dynamics (CFD) is applied to calculate an initial distribution of the temperature profile for the EV burner. Experiments were also carried out to visualize flames at a series of locations above the burner. While the ECT images essentially agree with the CFD temperature distribution, discrepancies exist at certain heights. When data fusion is applied, the discrepancy is visibly reduced and the ECT images are improved. The methods used in this study can lead to a new route whereby combustion visualization can be much improved and applied to clean energy conversion and new burner development.

  7. Pose Estimation of Unmanned Aerial Vehicles Based on a Vision-Aided Multi-Sensor Fusion

    Science.gov (United States)

    Abdi, G.; Samadzadegan, F.; Kurz, F.

    2016-06-01

    GNSS/IMU navigation systems offer a low-cost and robust solution for navigating UAVs. Since redundant measurements greatly improve the reliability of navigation systems, extensive research has been conducted to enhance the efficiency and robustness of GNSS/IMU with additional sensors. This paper presents a method for integrating reference data, images taken from UAVs, barometric height data and GNSS/IMU data to estimate accurate and reliable pose parameters of UAVs. We provide improved pose estimates by integrating multi-sensor observations in an EKF algorithm with an IMU motion model. The implemented methodology has been demonstrated to be very efficient and reliable for automatic pose estimation. The calculated position and attitude of the UAV, especially when GNSS was removed from the working cycle, clearly indicate the ability of the proposed methodology.
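As a sketch of the filter-based integration described above, the following one-dimensional Kalman filter fuses a barometric height reading with a simple motion-model prediction. The real EKF estimates the full position and attitude state; all noise values and names here are illustrative assumptions:

```python
def kf_step(x, v, p, z, dt=0.1, q=0.01, r=1.0):
    """One predict/update cycle for height x with velocity v.

    p: current height variance; z: barometric height measurement;
    q: process noise variance; r: barometer noise variance.
    Returns the updated height and its variance.
    """
    x_pred = x + v * dt          # predict height with the motion model
    p_pred = p + q               # inflate uncertainty by process noise
    k = p_pred / (p_pred + r)    # Kalman gain for the barometer
    x_new = x_pred + k * (z - x_pred)   # correct with the measurement
    return x_new, (1 - k) * p_pred      # updated state and variance
```

Adding visual or GNSS observations amounts to repeating the update step with each sensor's own measurement model and noise variance.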

  8. Activity Recognition Using Fusion of Low-Cost Sensors on a Smartphone for Mobile Navigation Application

    Directory of Open Access Journals (Sweden)

    Sara Saeedi

    2015-08-01

    Full Text Available Low-cost inertial and motion sensors embedded on smartphones have provided a new platform for dynamic activity pattern inference. In this research, a comparison has been conducted on different sensor data, feature spaces and feature selection methods to increase the efficiency and reduce the computation cost of activity recognition on the smartphones. We evaluated a variety of feature spaces and a number of classification algorithms from the area of Machine Learning, including Naive Bayes, Decision Trees, Artificial Neural Networks and Support Vector Machine classifiers. A smartphone app that performs activity recognition is being developed to collect data and send them to a server for activity recognition. Using extensive experiments, the performance of various feature spaces has been evaluated. The results showed that the Bayesian Network classifier yields recognition accuracy of 96.21% using four features while requiring fewer computations.
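A typical feature space for such smartphone activity recognition is built from per-window statistics of the accelerometer axes and their magnitude, which are then fed to the classifier. A minimal sketch (the feature names are illustrative, not the paper's exact feature set):

```python
import math
import statistics

def window_features(ax, ay, az):
    """Mean and standard deviation features for one accelerometer window.

    ax, ay, az: equal-length lists of per-axis samples for the window.
    Also derives the acceleration magnitude, which is orientation-invariant.
    """
    mag = [math.sqrt(x*x + y*y + z*z) for x, y, z in zip(ax, ay, az)]
    feats = {}
    for name, axis in (("x", ax), ("y", ay), ("z", az), ("mag", mag)):
        feats["mean_" + name] = sum(axis) / len(axis)
        feats["std_" + name] = statistics.pstdev(axis)
    return feats
```

Each window's feature dictionary becomes one training instance for classifiers such as Naive Bayes or an SVM.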

  9. POSE ESTIMATION OF UNMANNED AERIAL VEHICLES BASED ON A VISION-AIDED MULTI-SENSOR FUSION

    Directory of Open Access Journals (Sweden)

    G. Abdi

    2016-06-01

    Full Text Available GNSS/IMU navigation systems offer low-cost and robust solution to navigate UAVs. Since redundant measurements greatly improve the reliability of navigation systems, extensive researches have been made to enhance the efficiency and robustness of GNSS/IMU by additional sensors. This paper presents a method for integrating reference data, images taken from UAVs, barometric height data and GNSS/IMU data to estimate accurate and reliable pose parameters of UAVs. We provide improved pose estimations by integrating multi-sensor observations in an EKF algorithm with IMU motion model. The implemented methodology has demonstrated to be very efficient and reliable for automatic pose estimation. The calculated position and attitude of the UAV especially when we removed the GNSS from the working cycle clearly indicate the ability of the purposed methodology.

  10. A Fusion Architecture for Tracking a Group of People Using a Distributed Sensor Network

    Science.gov (United States)

    2013-07-01

    assumptions in order to answer the above questions: • If somebody is riding an animal, the animal and the rider are considered as one target and...categorized as an animal, since the rider does not contribute to the seismic signals and contributes very little to PIR and ultrasonic signatures. This...these sensors brings a specific attribute to classify the targets and help in counting their number. Although we consider horses in this paper, the

  11. Multi-Sensor Image Fusion for Target Recognition in the Environment of Network Decision Support Systems

    Science.gov (United States)

    2015-12-01

    tool kit SURF Speeded-up robust features TNT Tactical network topology TP True positive TPR True positive rate UAS Unmanned airborne system UMS...can provide interconnection among such networks and existing terrestrial, satellite, or mobile Internet networks , such as 3G, 4G, or even 5G , and may...mesh sensor networks . If and when the classified COIs are positively identified, they become part of the image database, and the system has to be

  12. Low Complexity Track Initialization and Fusion for Multi-Modal Sensor Networks

    Science.gov (United States)

    2012-11-08

    that is, freely available to any visitor to IEEE Xplore . Authors who choose open access must do so by electing this option by notifying the Editor-In...Publications 1. Q. Le and L.M. Kaplan, “Target localization using proximity binary sensors,” Proceedings of IEEE Aerospace conference, Big Sky, MT... IEEE Aerospace Conference, Big Sky, MT, Mar. 2011. 5. Q. Le, and L. M. Kaplan, “Effects of Operation Parameters on Multitarget Tracking in Proximity

  13. Maneuvering Vehicle Tracking Based on Multi-sensor Fusion%基于多传感融合的路面机动目标跟踪

    Institute of Scientific and Technical Information of China (English)

    陈莹; 韩崇昭

    2005-01-01

    Maneuvering target tracking is a fundamental task in intelligent vehicle research. This paper focuses on the problem of fusion between radar and image sensors in target tracking. In order to improve positioning accuracy and narrow down the image working area, a novel method that integrates the radar filter with image intensity is proposed to establish an adaptive vision window. A weighted Hausdorff distance is introduced to define the functional relationship between the image and the model projection, and a modified simulated annealing algorithm is used to find the optimum orientation parameter. Furthermore, the global state is estimated using a distributed data fusion algorithm. Experimental results show that our method is accurate.

  14. Intelligent algorithms for persistent and pervasive sensing in systems comprised of wireless ad hoc networks of ground-based sensors and mobile infrastructures

    Science.gov (United States)

    Hortos, William S.

    2007-04-01

    With the development of low-cost, durable unmanned aerial vehicles (UAVs), it is now practical to perform persistent sensing and target tracking autonomously over broad surveillance areas. These vehicles can sense the environment directly through onboard active sensors, or indirectly, when aimed toward ground targets in a mission environment, through ground-based passive sensors operating wirelessly as an ad hoc network in the environment. The combination of the swarm intelligence of the airborne infrastructure of UAVs with the ant-like collaborative behavior of the unattended ground sensors creates a system capable of both persistent and pervasive sensing of the mission environment, such that continuous collection and analysis of sensor data received from the ground, and tracking of targets, can be achieved. Mobile software agents are used to implement intelligent algorithms for communications, formation control and sensor data processing in this composite configuration. The enabling mobile agents are organized in a hierarchy for the three stages of processing in the distributed system: target detection, location and recognition from collaborative data processing among active ground-sensor nodes; transfer of the target information processed on the ground to the UAV swarm overhead; and formation control and sensor activation of the UAV swarm for sustained ground-target surveillance and tracking. Intelligent algorithms are presented that can adapt the operation of the composite system to target dynamics and system resources. Established routines, appropriate to the processing needs of each stage, are selected as preferred based on their published use in similar scenarios, their ability to be implemented distributively over the set of processors at system nodes, and their ability to conserve the limited resources at the ground nodes to extend the lifetime of the pervasive network. In this paper, the performance of this distributed, collaborative system concept for

  15. Biomimetic fusion that enhances sensor performance in a bimodal surveillance system

    Science.gov (United States)

    Ziph-Schatzberg, Leah; Kelsall, Sarah; Hubbard, Allyn E.

    2011-06-01

    Algorithms for synergistically fusing acoustic and optical sensory inputs, thereby mimicking biological attentional processes, are described. Many existing perimeter defense surveillance systems that use more than one sensory modality combine different sensors' information to corroborate findings by other sensors and to add data from a second modality. In contrast to how conventional systems work, animals use information from multiple sensory inputs in a way that improves each sensory system's performance. We demonstrated that performance is enhanced when information in one modality is used to focus processing in the other modality (a form of attention). This synergistic bi-modal operation improves surveillance efficacy by focusing auditory and visual "attention" on a particular target or location. Algorithms for focusing auditory and visual sensors using detection information were developed. These combination algorithms perform "zoom-with-enhanced-acuity" in both the visual and auditory domains, triggered by detection in either domain. Sensory-input processing algorithms focus on specific locations, indicated by at least one of the modalities. This spatially focused processing emulates biological attention-driven focusing. We showed that, given information about the target, the acoustic algorithms were able to achieve over 80% correct target detection at signal-to-noise ratios (SNRs) of -20 dB and above, compared with similar performance at SNRs of -10 dB and above without target information from another modality. Similarly, the visual algorithm achieved over 80% detection with added noise variance of 0.001 without target indication, but maintained 100% detection at added noise variance of 0.05 when acoustic target information was taken into account.
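One simple way to express this kind of cross-modal attention is to let a detection cue from one modality lower the detection threshold applied to the other modality at the attended location. This is a hypothetical sketch of that gating idea, not the authors' algorithm:

```python
def detect(score, base_threshold=0.8, cue=False, cue_gain=0.5):
    """Return True if a detection fires in one modality.

    score: detection score at the attended location (0..1).
    cue: whether the other modality has flagged this location;
    a cue scales the threshold down by cue_gain (illustrative values).
    """
    threshold = base_threshold * (cue_gain if cue else 1.0)
    return score >= threshold
```

A weak signal that would be missed in isolation can thus be accepted when the other modality has already drawn "attention" to its location, mirroring the SNR improvement reported above.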

  16. Robot Path Planning Using SIFT and Sonar Sensor Fusion

    DEFF Research Database (Denmark)

    Plascencia, Alfredo; Raposo, Hector

    2007-01-01

    This paper presents a novel map building approach for path planning purposes, which takes into account the uncertainty inherent in sensor measurements. To this end, Bayesian estimation and Dempster-Shafer evidential theory are used to fuse the sensory information and to update the occupancy...... and evidential grid maps, respectively. The approach is illustrated using actual measurements from a laboratory robot. The sensory information is obtained from a sonar array and the Scale Invariant Feature Transform (SIFT) algorithm. Finally, the resulting two evidential maps based on Bayes and Dempster theories...... are used for path planning using the potential field method. Both yield satisfying results...
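The Bayesian occupancy-grid update used in this kind of map building is commonly implemented in log-odds form, so that successive sonar or SIFT observations of a cell accumulate by simple addition. A minimal per-cell sketch with illustrative sensor probabilities:

```python
import math

def logodds(p):
    """Convert a probability to log-odds form."""
    return math.log(p / (1 - p))

def update_cell(prior_l, hit, p_hit=0.7, p_miss=0.4):
    """Bayesian log-odds update of one grid cell from one observation.

    prior_l: cell's current log-odds of being occupied.
    hit: whether the sensor reported an obstacle in this cell.
    p_hit/p_miss: illustrative sensor model probabilities.
    """
    return prior_l + logodds(p_hit if hit else p_miss)

def prob(l):
    """Convert log-odds back to an occupancy probability."""
    return 1 - 1 / (1 + math.exp(l))
```

Starting from the uninformed prior (log-odds 0, probability 0.5), repeated hits drive a cell toward occupied and repeated misses toward free, which is exactly what the potential-field planner then consumes.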

  17. Autonomous underwater vehicle motion tracking using a Kalman filter for sensor fusion

    CSIR Research Space (South Africa)

    Holtzhausen, S

    2008-11-01

    Full Text Available to be discussed is the IMU. This sensor system consists of three accelerometers and three gyros. The accelerometers are positioned on the X, Y and Z axes and measure acceleration in their respective directions. 15th International conference on Mechatronics... of the vehicle. The accelerometer data can be used to determine linear motion by calculating the double integral of the accelerometer data over time. This will give the calculated movement of the vehicle in the given direction. The accelerometer data can also...
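The double integration of accelerometer data described above can be sketched with the trapezoidal rule. In practice, bias and noise make the result drift quadratically with time, which is why the IMU is fused with other sensors through a Kalman filter:

```python
def integrate_twice(acc, dt):
    """Trapezoidal double integration of acceleration to displacement.

    acc: list of acceleration samples along one axis (m/s^2).
    dt: sample interval (s). Assumes zero initial velocity and position.
    Returns the position at each sample time.
    """
    vel, pos = [0.0], [0.0]
    for i in range(1, len(acc)):
        vel.append(vel[-1] + 0.5 * (acc[i-1] + acc[i]) * dt)   # integrate acc
        pos.append(pos[-1] + 0.5 * (vel[-2] + vel[-1]) * dt)   # integrate vel
    return pos
```

For a constant acceleration the trapezoidal scheme is exact, but any constant bias b in `acc` produces a position error of b*t^2/2, illustrating why unaided integration diverges.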

  18. Microwave and camera sensor fusion for the shape extraction of metallic 3D space objects

    Science.gov (United States)

    Shaw, Scott W.; Defigueiredo, Rui J. P.; Krishen, Kumar

    1989-01-01

    The vacuum of space presents special problems for optical image sensors. Metallic objects in this environment can produce intense specular reflections and deep shadows. By combining the polarized RCS with an incomplete camera image, it has become possible to better determine the shape of some simple three-dimensional objects. The radar data are used in an iterative procedure that generates successive approximations to the target shape by minimizing the error between computed scattering cross-sections and the observed radar returns. Favorable results have been obtained for simulations and experiments reconstructing plates, ellipsoids, and arbitrary surfaces.

  19. Multi-Sensor Fusion for Enhanced Contextual Awareness of Everyday Activities with Ubiquitous Devices

    Directory of Open Access Journals (Sweden)

    John J. Guiry

    2014-03-01

    Full Text Available In this paper, the authors investigate the role that smart devices, including smartphones and smartwatches, can play in identifying activities of daily living. A feasibility study involving N = 10 participants was carried out to evaluate the devices’ ability to differentiate between nine everyday activities. The activities examined include walking, running, cycling, standing, sitting, elevator ascents, elevator descents, stair ascents and stair descents. The authors also evaluated the ability of these devices to differentiate indoors from outdoors, with the aim of enhancing contextual awareness. Data from this study was used to train and test five well known machine learning algorithms: C4.5, CART, Naïve Bayes, Multi-Layer Perceptrons and finally Support Vector Machines. Both single and multi-sensor approaches were examined to better understand the role each sensor in the device can play in unobtrusive activity recognition. The authors found overall results to be promising, with some models correctly classifying up to 100% of all instances.

  20. Multi-Sensor Fusion for Enhanced Contextual Awareness of Everyday Activities with Ubiquitous Devices

    Science.gov (United States)

    Guiry, John J.; van de Ven, Pepijn; Nelson, John

    2014-01-01

    In this paper, the authors investigate the role that smart devices, including smartphones and smartwatches, can play in identifying activities of daily living. A feasibility study involving N = 10 participants was carried out to evaluate the devices' ability to differentiate between nine everyday activities. The activities examined include walking, running, cycling, standing, sitting, elevator ascents, elevator descents, stair ascents and stair descents. The authors also evaluated the ability of these devices to differentiate indoors from outdoors, with the aim of enhancing contextual awareness. Data from this study was used to train and test five well known machine learning algorithms: C4.5, CART, Naïve Bayes, Multi-Layer Perceptrons and finally Support Vector Machines. Both single and multi-sensor approaches were examined to better understand the role each sensor in the device can play in unobtrusive activity recognition. The authors found overall results to be promising, with some models correctly classifying up to 100% of all instances. PMID:24662406

  1. Multi-Sensor Fusion of Infrared and Electro-Optic Signals for High Resolution Night Images

    Directory of Open Access Journals (Sweden)

    Victor Lawrence

    2012-07-01

    Full Text Available Electro-optic (EO) image sensors exhibit the properties of high resolution and low noise level at daytime, but they do not work in dark environments. Infrared (IR) image sensors exhibit poor resolution and cannot separate objects with similar temperatures. Therefore, we propose a novel framework of IR image enhancement based on information (e.g., edges) from EO images, which improves the resolution of IR images and helps us distinguish objects at night. Our framework superimposes/blends the edges of the EO image onto the corresponding transformed IR image, improving its resolution. In this framework, we adopt the theoretical point spread function (PSF) proposed by Hardie et al. for the IR image, which has the modulation transfer function (MTF) of a uniform detector array and the incoherent optical transfer function (OTF) of diffraction-limited optics. In addition, we design an inverse filter for the proposed PSF and use it for the IR image transformation. The framework requires four main steps: (1) inverse filter-based IR image transformation; (2) EO image edge detection; (3) registration; and (4) blending/superimposing of the obtained image pair. Simulation results show both blended and superimposed IR images, and demonstrate that blended IR images have better quality than the superimposed images. Additionally, based on the same steps, a simulation result shows a blended IR image of better quality when only the original IR image is available.
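The blending step (4) above amounts to a per-pixel weighted combination of the transformed IR image and the registered EO edge map. A minimal sketch with an illustrative blending weight:

```python
def blend(ir, eo_edges, alpha=0.3):
    """Per-pixel weighted blend of EO edge intensities into an IR image.

    ir and eo_edges: row-major lists of equal shape, already registered.
    alpha: weight given to the EO edge map (illustrative value).
    """
    return [[(1 - alpha) * p + alpha * e for p, e in zip(row_ir, row_eo)]
            for row_ir, row_eo in zip(ir, eo_edges)]
```

Superimposing, by contrast, would overwrite IR pixels wherever an edge is present; the weighted blend preserves the underlying thermal intensity, which is consistent with the better quality the abstract reports for blended images.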

  2. Measuring indoor occupancy in intelligent buildings using the fusion of vision sensors

    Science.gov (United States)

    Liu, Dixin; Guan, Xiaohong; Du, Youtian; Zhao, Qianchuan

    2013-07-01

    In intelligent buildings, practical sensing systems designed to gather indoor occupancy information play an indispensable role in improving occupant comfort and energy efficiency. In this paper, we propose a novel method for occupancy measurement based on the video surveillance now widely used in buildings. In our method, we analyze occupant detection both at the entrance and inside the room. A two-stage static detector is presented based on both appearances and shapes to find the human heads in rooms, and motion-based technology is used for occupant detection at the entrance. To model the change of occupancy and combine the detection results from multiple vision sensors located at entrances and inside rooms for more accurate occupancy estimation, we propose a dynamic Bayesian network-based method. The detection results of each vision sensor play the role of evidence nodes of this network, and thus, we can estimate the true occupancy at time t using the evidence prior to (and including) time t. Experimental results demonstrate the effectiveness and efficiency of the proposed method.

  3. WIRELESS SENSOR NETWORKS AND FUSION OF CONTEXTUAL INFORMATION FOR WEATHER OUTLIER DETECTION

    Directory of Open Access Journals (Sweden)

    A. Amidi

    2013-09-01

    Full Text Available Weather stations are often expensive, hence it may be difficult to obtain data with high spatial coverage. A low-cost alternative is the wireless sensor network (WSN), which can be deployed as a set of weather stations and address the aforementioned shortcoming. Owing to the imperfect sensors in a WSN, the raw data provided may be of low quality and reliability, hence the need for outlier detection methods. Outliers may include errors or potentially useful information called events. In this research, forecast values are utilized as contextual information for weather outlier detection. In this paper, outliers are identified by comparing the patterns of the WSN and the forecasts: temporal outliers are detected with respect to the slopes of the WSN and forecast series within a pre-defined tolerance. The experimental results from a real data set validate the applicability of using contextual information in the context of WSNs for outlier detection in terms of accuracy and energy efficiency.
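The slope comparison described above can be sketched as follows: compute the trend of a WSN measurement window and of the corresponding forecast window, and flag the window as an outlier when the trends differ by more than the tolerance (function names and the tolerance value are illustrative):

```python
def slope(series, dt=1.0):
    """Average rate of change over a measurement window."""
    return (series[-1] - series[0]) / ((len(series) - 1) * dt)

def is_outlier(wsn_window, forecast_window, tol=0.5):
    """Flag a WSN window whose trend departs from the forecast trend."""
    return abs(slope(wsn_window) - slope(forecast_window)) > tol
```

Comparing trends rather than absolute values makes the check robust to a constant calibration offset between the cheap WSN sensor and the forecast model, while still catching sudden sensor faults or genuine weather events.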

  4. Fusion of Inertial/Magnetic Sensor Measurements and Map Information for Pedestrian Tracking.

    Science.gov (United States)

    Bao, Shu-Di; Meng, Xiao-Li; Xiao, Wendong; Zhang, Zhi-Qiang

    2017-02-10

    Wearable inertial/magnetic sensor-based human motion analysis plays an important role in many biomedical applications, such as physical therapy, gait analysis and rehabilitation. One of the main challenges for lower-body bio-motion analysis is how to reliably provide position estimates of a human subject during walking. In this paper, we propose a particle filter based human position estimation method using a foot-mounted inertial and magnetic sensor module, which not only uses the traditional zero velocity update (ZUPT), but also applies map information to further correct the acceleration double-integration drift and thus improve estimation accuracy. In the proposed method, a simple stance phase detector is designed to identify the stance phase of a gait cycle based on gyroscope measurements. For the non-stance phase during a gait cycle, an acceleration control variable derived from ZUPT information is introduced in the process model, while vector map information is taken as binary pseudo-measurements to further enhance position estimation accuracy and reduce the uncertainty of walking trajectories. A particle filter is then designed to fuse the ZUPT information and binary pseudo-measurements together. The proposed human position estimation method has been evaluated with closed-loop walking experiments in indoor and outdoor environments. Results of the comparison study have illustrated the effectiveness of the proposed method for application scenarios with useful map information.
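The stance-phase detector described above can be sketched as a threshold on the gyroscope magnitude: intervals where the angular rate stays low for long enough are taken as stance phases, where the zero-velocity update applies (the threshold and minimum length are illustrative):

```python
def stance_phases(gyro_mag, threshold=0.3, min_len=3):
    """Return (start, end) index ranges where gyro magnitude stays low.

    gyro_mag: per-sample gyroscope magnitudes (rad/s).
    threshold: angular-rate level below which the foot is assumed still.
    min_len: minimum number of consecutive samples to accept a phase.
    """
    phases, start = [], None
    for i, g in enumerate(gyro_mag):
        if g < threshold and start is None:
            start = i                       # possible stance begins
        elif g >= threshold and start is not None:
            if i - start >= min_len:
                phases.append((start, i))   # accept the completed phase
            start = None
    if start is not None and len(gyro_mag) - start >= min_len:
        phases.append((start, len(gyro_mag)))  # phase runs to the end
    return phases
```

During each detected phase the filter can reset the velocity estimate to zero, bounding the drift of the acceleration double integration between steps.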

  5. Fusion of Inertial/Magnetic Sensor Measurements and Map Information for Pedestrian Tracking

    Science.gov (United States)

    Bao, Shu-Di; Meng, Xiao-Li; Xiao, Wendong; Zhang, Zhi-Qiang

    2017-01-01

    Wearable inertial/magnetic sensor-based human motion analysis plays an important role in many biomedical applications, such as physical therapy, gait analysis and rehabilitation. One of the main challenges for lower-body bio-motion analysis is how to reliably provide position estimates of a human subject during walking. In this paper, we propose a particle filter based human position estimation method using a foot-mounted inertial and magnetic sensor module, which not only uses the traditional zero velocity update (ZUPT), but also applies map information to further correct the acceleration double-integration drift and thus improve estimation accuracy. In the proposed method, a simple stance phase detector is designed to identify the stance phase of a gait cycle based on gyroscope measurements. For the non-stance phase during a gait cycle, an acceleration control variable derived from ZUPT information is introduced in the process model, while vector map information is taken as binary pseudo-measurements to further enhance position estimation accuracy and reduce the uncertainty of walking trajectories. A particle filter is then designed to fuse the ZUPT information and binary pseudo-measurements together. The proposed human position estimation method has been evaluated with closed-loop walking experiments in indoor and outdoor environments. Results of the comparison study have illustrated the effectiveness of the proposed method for application scenarios with useful map information. PMID:28208591

  6. Sensor Fusion and Autonomy as a Powerful Combination for Biological Assessment in the Marine Environment

    Directory of Open Access Journals (Sweden)

    Mark A. Moline

    2016-02-01

    Full Text Available The ocean environment and the physical and biological processes that govern its dynamics are complex. Sampling the ocean to better understand these processes is difficult given the temporal and spatial domains and the sampling tools available. Biological systems are especially difficult, as organisms possess behavior, operate at horizontal scales smaller than traditional shipboard sampling allows, and are often disturbed by the sampling platforms themselves. Sensors that measure biological processes have also generally not kept pace with the development of their physical counterparts, as their requirements are as complex as the target organisms. Here, we attempt to address this challenge by advocating the need for sensor-platform combinations that integrate and process data in real time and develop data products that are useful in increasing sampling efficiency. Too often, the data of interest are only obtained through post-processing after a sampling effort, and the opportunity to use that information to guide sampling is lost. Here we demonstrate a new autonomous platform where data are collected and analyzed, and data products are output, in real time to inform autonomous decision-making. This integrated capability allows for enhanced and informed sampling towards improving our understanding of the marine environment.

  7. The Distribution of Titanium in Lunar Soils on the Basis of Sensor and In Situ Data Fusion

    Science.gov (United States)

    Clark, P. E.; Evans, L.

    1999-01-01

    A variety of remote-sensing measurements have been used to map the distribution of elements on the Moon as a means of providing constraints on the processes from which its crust and major terranes originated. Discussed here is Ti, which is incorporated into refractory minerals such as ilmenite during the latter stages of differentiation, and is thus a most useful element for understanding mare basalt petrogenesis. One of the earliest Ti maps showed Ti variations in nearside maria on the basis of groundbased spectral reflectance measurements. A map of Ti derived from gamma-ray measurements on Apollo 15 and 16 was produced at about the same time, and was improved upon considerably by Davis and coworkers, who effectively removed sources of spurious variation from Fe and Al or REE (e.g., Th) interference, and calibrated Ti on the basis of landing-site soil averages. In recent years, spectral reflectance measurements from Clementine have been used by Lucey and coworkers to produce global Ti distribution maps as well. As we indicated previously, the Lucey and Davis maps agree to first order. Meanwhile, we are using the concept of sensor data fusion to combine measurements from the AGR (Apollo gamma-ray) and CSR (Clementine Spectral Reflectance) techniques with ground truth from lunar soils, utilizing the differences between the two maps to understand the distribution of Ti within lunar soil components, as we have done with Fe. This technique should be verified and applied to Lunar Prospector gamma-ray measurements of Ti as the calibrated data become available within the next couple of years. Lunar Ti is found principally in the mineral ilmenite, and is associated with certain components of lunar soil: crystalline ilmenite mineral fragments and high-Ti-bearing glass. All data indicate that Ti is associated with maria and mafic minerals. In the AGR and CSR datasets, Ti is highest on the nearside and in the maria, particularly in southern Serenitatis/northern Tranquillitatis.

  8. Classification Fusion in Wireless Sensor Networks

    Institute of Scientific and Technical Information of China (English)

    刘春婷; 霍宏; 方涛; 李德仁; 沈晓

    2006-01-01

    In wireless sensor networks, target classification differs from that in centralized sensing systems because of the distributed detection, wireless communication and limited resources. We study the classification problem of moving vehicles in wireless sensor networks using acoustic signals emitted from the vehicles. Three algorithms, including wavelet decomposition, weighted k-nearest-neighbor and Dempster-Shafer theory, are combined in this paper. Finally, we use real-world experimental data to validate the classification methods. The results show that the wavelet-based feature extraction method can extract stable features from acoustic signals. By fusing the classifier outputs with Dempster's rule, the classification performance is improved.
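Dempster's rule, used here to fuse the per-node classifier outputs, combines two basic probability assignments and renormalizes away the conflicting mass. A minimal sketch (the two example masses are illustrative, not the paper's data):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (BPAs) given as dicts
    mapping frozenset hypotheses to mass values."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:                      # compatible evidence reinforces
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:                          # incompatible evidence is conflict
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("totally conflicting sources cannot be combined")
    # normalize by the non-conflicting mass
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

# two sensor nodes' beliefs about a vehicle class (illustrative numbers)
m1 = {frozenset({"car"}): 0.7, frozenset({"car", "truck"}): 0.3}
m2 = {frozenset({"car"}): 0.6, frozenset({"truck"}): 0.2,
      frozenset({"car", "truck"}): 0.2}
fused = dempster_combine(m1, m2)
```

Agreeing evidence from the two nodes concentrates mass on the shared hypothesis, which is the effect the abstract reports as improved classification performance.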

  9. A Wearable Ground Reaction Force Sensor System and Its Application to the Measurement of Extrinsic Gait Variability

    Science.gov (United States)

    Liu, Tao; Inoue, Yoshio; Shibata, Kyoko

    2010-01-01

    Wearable sensors for gait analysis are attracting wide interest. In this paper, a wearable ground reaction force (GRF) sensor system and its application to measuring extrinsic gait variability are presented. To validate the GRF and centre of pressure (CoP) measurements of the sensor system and examine the effectiveness of the proposed method for gait analysis, we conducted an experimental study on seven volunteer subjects. In assessing the influence of the sensor system on natural gait, we found no significant differences for almost all measured gait parameters (p-values < 0.05). As for measurement accuracy, the root mean square (RMS) differences for the two transverse components and the vertical component of the GRF were 7.2% ± 0.8% and 9.0% ± 1% of the maximum of each transverse component and 1.5% ± 0.9% of the maximum vertical component of the GRF, respectively. The RMS distance between the two CoP measurements was 1.4% ± 0.2% of the length of the shoe. The area of CoP distribution on the foot-plate and the average coefficient of variation of the triaxial GRF are the parameters introduced for analysing extrinsic gait variability. Based on a statistical analysis of the results of the tests with subjects wearing the sensor system, we found that the proposed parameters changed according to walking speed and turning (p-values < 0.05). PMID:22163468
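The accuracy figures above are RMS differences between the wearable system and a reference, normalized by the peak of each force component. A small helper showing that computation (an illustrative sketch, not the authors' code):

```python
import numpy as np

def rms_percent(measured, reference):
    """RMS of the difference between two force traces, expressed as a
    percentage of the reference trace's peak absolute value."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    rms = np.sqrt(np.mean((measured - reference) ** 2))
    return 100.0 * rms / np.max(np.abs(reference))
```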

  10. A Wearable Ground Reaction Force Sensor System and Its Application to the Measurement of Extrinsic Gait Variability

    Directory of Open Access Journals (Sweden)

    Kyoko Shibata

    2010-11-01

    Full Text Available Wearable sensors for gait analysis are attracting wide interest. In this paper, a wearable ground reaction force (GRF) sensor system and its application to measuring extrinsic gait variability are presented. To validate the GRF and centre of pressure (CoP) measurements of the sensor system and examine the effectiveness of the proposed method for gait analysis, we conducted an experimental study on seven volunteer subjects. In assessing the influence of the sensor system on natural gait, we found no significant differences for almost all measured gait parameters (p-values < 0.05). As for measurement accuracy, the root mean square (RMS) differences for the two transverse components and the vertical component of the GRF were 7.2% ± 0.8% and 9.0% ± 1% of the maximum of each transverse component and 1.5% ± 0.9% of the maximum vertical component of the GRF, respectively. The RMS distance between the two CoP measurements was 1.4% ± 0.2% of the length of the shoe. The area of CoP distribution on the foot-plate and the average coefficient of variation of the triaxial GRF are the parameters introduced for analysing extrinsic gait variability. Based on a statistical analysis of the results of the tests with subjects wearing the sensor system, we found that the proposed parameters changed according to walking speed and turning (p-values < 0.05).

  11. Applied Research Based on Multi-sensor Data Fusion Technology

    Institute of Scientific and Technical Information of China (English)

    宋强; 王爱民; 张运素

    2013-01-01

    Multi-sensor data fusion for complex systems is an emerging technology: by processing data from multiple sensors at multiple levels and from multiple aspects, it produces more meaningful information than any single sensor can provide. Data fusion has great development and application prospects in both military and civil domains. This paper proposes a multi-sensor data fusion technique based on a neural network fusion algorithm and describes in detail its application to burn-through point (BTP) prediction in sintering. Simulation results demonstrate that the method is robust, accurate and generalizes well, and has strong practical and promotional value.
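The abstract does not specify the network architecture, so the following is only a minimal sketch of the idea: several sensor readings enter a one-hidden-layer feed-forward network whose output is the fused estimate (e.g. the burn-through point). The weights here are untrained placeholders and `mlp_fuse` is a hypothetical name:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_fuse(x, W1, b1, W2, b2):
    """One-hidden-layer feed-forward fusion: a vector of sensor
    readings in, a single fused estimate out."""
    h = np.tanh(W1 @ x + b1)       # nonlinear hidden layer
    return float(W2 @ h + b2)      # linear output neuron

# illustrative, untrained weights for a 4-sensor input and 6 hidden units;
# in practice these would be fitted to historical sintering data
W1, b1 = rng.normal(size=(6, 4)), np.zeros(6)
W2, b2 = rng.normal(size=(1, 6)), np.zeros(1)
y = mlp_fuse(np.array([0.2, 0.5, 0.1, 0.9]), W1, b1, W2, b2)
```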

  12. Development and Ground-Test Validation of Fiber Optic Sensor Attachment Techniques for Hot Structures Applications

    Science.gov (United States)

    Piazza, Anthony; Hudson, Larry D.; Richards, W. Lance

    2005-01-01

    Fiber Optic Strain Measurements: a) successfully attached silica fiber optic sensors to both metallics and composites; b) accomplished valid EFPI strain measurements to 1850 °F; c) successfully attached EFPI sensors to large-scale hot structures; and d) attached and thermally validated FBG bond and apparent strain (ε_app). Future Development: a) improve characterization of sensors on C-C and C-SiC substrates; b) extend the application to other composites such as SiC-SiC; c) assist development of the interferometer-based sapphire sensor currently being conducted under a Phase II SBIR; and d) complete combined thermal/mechanical testing of FBGs on composite substrates in a controlled laboratory environment.

  13. A Ubiquitous and Low-Cost Solution for Movement Monitoring and Accident Detection Based on Sensor Fusion

    Directory of Open Access Journals (Sweden)

    Filipe Felisberto

    2014-05-01

    Full Text Available The low average birth rate in developed countries and the increase in life expectancy have led society to face an ageing situation for the first time. This situation, together with the World’s economic crisis (which started in 2008), forces the need to devise better and more efficient ways of providing quality of life for the elderly. In this context, the solution presented in this work tackles the problem of monitoring the elderly in a way that is not restrictive for the life of the monitored person, avoiding the need for premature nursing home admissions. To this end, the system fuses sensory data provided by a network of wireless sensors placed on the periphery of the user. Our approach was also designed with low-cost deployment in mind, so that the target group may be as wide as possible. Regarding the detection of long-term problems, the tests conducted showed that the precision of the system in identifying and discerning body postures and body movements allows for valid monitoring and rehabilitation of the user. Moreover, concerning the detection of accidents, while the proposed solution presented near-100% precision at detecting normal falls, the detection of more complex falls (i.e., hampered falls) will require further study.

  14. Sensor fusion of 2D and 3D data for the processing of images of dental imprints

    Science.gov (United States)

    Methot, Jean-Francois; Mokhtari, Marielle; Laurendeau, Denis; Poussart, Denis

    1993-08-01

    This paper presents a computer vision system for the acquisition and processing of 3-D images of wax dental imprints. The ultimate goal of the system is to measure a set of 10 orthodontic parameters that will be fed to an expert system for automatic diagnosis of occlusion problems. An approach for the acquisition of range images of both sides of the imprint is presented. Range is obtained from a shape-from-absorption technique applied to a pair of grey-level images obtained at two different wavelengths. The accuracy of the range values is improved using sensor fusion between the initial range image and a reflectance image from the pair of grey-level images. The improved range image is segmented in order to find the interstices between teeth and, following further processing, the type of each tooth on the profile. Once each tooth has been identified, its accurate location on the imprint is found using a region-growing approach and its shape is reconstructed with third-degree polynomial functions. The reconstructed shape will later be used by the system to find specific features that are needed to estimate the orthodontic parameters.

  15. Sensor data fusion for body state estimation in a bipedal robot and its feedback control application for stable walking.

    Science.gov (United States)

    Chen, Ching-Pei; Chen, Jing-Yi; Huang, Chun-Kai; Lu, Jau-Ching; Lin, Pei-Chun

    2015-02-27

    We report on a sensor data fusion algorithm, via an extended Kalman filter, for estimating the spatial motion of a bipedal robot. By fusing sensory information from joint encoders, a 6-axis inertial measurement unit and a 2-axis inclinometer, the robot's body state at a specific fixed position can be estimated. This position also coincides with the CoM when the robot is in the standing posture, as suggested by the detailed CAD model of the robot. In addition, this body state is further utilized to provide sensory information for feedback control of a bipedal robot with a walking gait. The overall control strategy includes the proposed body state estimator as well as a damping controller, which regulates the body position state of the robot in real time based on instantaneous and historical position tracking errors. Moreover, a posture corrector for reducing unwanted torque during motion is addressed. The body state estimator and the feedback control structure are implemented in a child-size bipedal robot and the performance is experimentally evaluated.
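The paper's estimator is a full extended Kalman filter over the spatial body state; a much-reduced scalar analogue, fusing an integrated gyro rate (prediction) with an inclinometer angle (measurement), illustrates the same predict/update structure. The noise parameters are illustrative assumptions:

```python
import numpy as np

def kalman_tilt(gyro_rates, incl_angles, dt=0.01, q=1e-4, r=1e-2):
    """Scalar Kalman filter fusing gyro rate integration (process)
    with inclinometer angle readings (measurement).

    Returns the filtered angle estimate at each step."""
    theta, p = float(incl_angles[0]), 1.0
    out = []
    for w, z in zip(gyro_rates, incl_angles):
        # predict: integrate the angular rate, grow the variance
        theta += w * dt
        p += q
        # update: blend in the inclinometer reading via the Kalman gain
        k = p / (p + r)
        theta += k * (z - theta)
        p *= (1.0 - k)
        out.append(theta)
    return np.array(out)
```

The gyro keeps the estimate responsive between inclinometer samples, while the inclinometer anchors the long-term drift, which is the basic rationale for fusing the two sensors.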

  16. Sensor Data Fusion for Body State Estimation in a Bipedal Robot and Its Feedback Control Application for Stable Walking

    Directory of Open Access Journals (Sweden)

    Ching-Pei Chen

    2015-02-01

    Full Text Available We report on a sensor data fusion algorithm, via an extended Kalman filter, for estimating the spatial motion of a bipedal robot. By fusing sensory information from joint encoders, a 6-axis inertial measurement unit and a 2-axis inclinometer, the robot’s body state at a specific fixed position can be estimated. This position also coincides with the CoM when the robot is in the standing posture, as suggested by the detailed CAD model of the robot. In addition, this body state is further utilized to provide sensory information for feedback control of a bipedal robot with a walking gait. The overall control strategy includes the proposed body state estimator as well as a damping controller, which regulates the body position state of the robot in real time based on instantaneous and historical position tracking errors. Moreover, a posture corrector for reducing unwanted torque during motion is addressed. The body state estimator and the feedback control structure are implemented in a child-size bipedal robot and the performance is experimentally evaluated.

  17. Novel Paradigm for Constructing Masses in Dempster-Shafer Evidence Theory for Wireless Sensor Network’s Multisource Data Fusion

    Directory of Open Access Journals (Sweden)

    Zhenjiang Zhang

    2014-04-01

    Full Text Available Dempster-Shafer evidence theory (DSET) is a flexible and popular paradigm for multisource data fusion in wireless sensor networks (WSNs). This paper presents a novel, easily implemented method for computing masses from the hundreds of pieces of data collected by a WSN. The transfer model is based on the Mahalanobis distance (MD), an effective measure of the similarity between an object and a sample. Compared to existing methods, the proposed method accounts for the statistical features of the observed data and transforms multi-dimensional data into belief assignments correctly and effectively. The main processes of the proposed method, which include the calculation of the intersection classes of the power set and the algorithm mapping MDs to masses, are described in detail. Experimental results in transformer fault diagnosis show that the proposed method has high accuracy in constructing masses from multidimensional data for DSET. Additionally, the results show that higher-dimensional data yields higher accuracy in transforming data into masses.
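The core step, mapping Mahalanobis distances to a normalized belief assignment, can be sketched as follows. The inverse-distance transfer used here is a simplified stand-in for the paper's mapping; the class statistics are illustrative:

```python
import numpy as np

def md_to_masses(x, class_means, class_covs):
    """Map an observation's Mahalanobis distance to each class into a
    normalized mass vector: the closer the class, the larger its mass."""
    scores = []
    for mu, cov in zip(class_means, class_covs):
        d = x - mu
        md = float(np.sqrt(d @ np.linalg.inv(cov) @ d))  # Mahalanobis distance
        scores.append(1.0 / (1.0 + md))                  # closeness score
    total = sum(scores)
    return [s / total for s in scores]                   # masses sum to 1
```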

  18. Study on a Principal Component Method for Multi-sensor Data Fusion

    Institute of Scientific and Technical Information of China (English)

    董九英

    2009-01-01

    For the problem of fusing data from multiple sensors measuring the same characteristic index, a fusion method based on principal component analysis is proposed. The method treats the measurement data of each sensor as one variable and defines the principal components of the ensemble. Using the multiple correlation between the measured values and the principal components, the comprehensive support degree of each sensor is derived and a data fusion formula is obtained. An applied example verifies that the method is both effective and accurate.
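The idea of weighting each sensor by its correlation with the principal components can be sketched as follows, using only the first principal component as a simplified stand-in for the paper's full support-degree formula:

```python
import numpy as np

def pca_fuse(readings):
    """readings: (n_samples, n_sensors) matrix of measurements.

    Weight each sensor by the magnitude of its correlation with the
    first principal component, then return the per-sample weighted
    average (the fused values) and the weights."""
    X = readings - readings.mean(axis=0)
    # first principal component direction via SVD of the centered data
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    pc1 = X @ vt[0]
    corr = np.array([np.corrcoef(X[:, j], pc1)[0, 1]
                     for j in range(X.shape[1])])
    w = np.abs(corr) / np.abs(corr).sum()   # support degrees, sum to 1
    return readings @ w, w
```

Sensors that track the common signal closely receive large weights, while an outlying sensor is down-weighted in the fused value.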

  19. Estimating above-ground biomass by fusion of LiDAR and multispectral data in subtropical woody plant communities in topographically complex terrain in North-eastern Australia

    Institute of Scientific and Technical Information of China (English)

    Sisira Ediriweera; Sumith Pathirana; Tim Danaher; Doland Nichols

    2014-01-01

    We investigated a strategy to improve the predictive capacity of plot-scale above-ground biomass (AGB) estimation by fusing LiDAR- and Landsat5 TM-derived biophysical variables for subtropical rainforest and eucalypt-dominated forest in topographically complex landscapes in North-eastern Australia. The investigation was carried out in the two study areas separately and in combination. From each plot of both study areas, LiDAR-derived structural parameters of the vegetation, the reflectance of all Landsat bands and vegetation indices were employed. The regression analysis was carried out for the LiDAR- and Landsat-derived variables individually and in combination. Strong relationships were found with LiDAR alone for the eucalypt-dominated forest and the combined sites, compared to the accuracy of AGB estimates from Landsat data. Fusing LiDAR with Landsat5 TM derived variables increased overall performance for the eucalypt forest and the combined-site data by describing extra variation (3% for the eucalypt forest and 2% for the combined sites) of field-estimated plot-scale above-ground biomass. In contrast, separate LiDAR and imagery data, and the fusion of LiDAR and Landsat data, performed poorly across the structurally complex closed-canopy subtropical rainforest. These findings reinforce that obtaining accurate estimates of above-ground biomass using remotely sensed data is a function of the complexity of the horizontal and vertical structural diversity of the vegetation.

  20. Absolute Position Sensing Based on a Robust Differential Capacitive Sensor with a Grounded Shield Window.

    Science.gov (United States)

    Bai, Yang; Lu, Yunfeng; Hu, Pengcheng; Wang, Gang; Xu, Jinxin; Zeng, Tao; Li, Zhengkun; Zhang, Zhonghua; Tan, Jiubin

    2016-05-11

    A simple differential capacitive sensor is presented in this paper to measure the absolute positions of length measuring systems. By utilizing a shield window inside the differential capacitor, the measurement range and linearity range of the sensor can reach several millimeters. More interestingly, this differential capacitive sensor is only sensitive to movement along one translational degree of freedom (DOF), and is immune to vibration along the other two translational DOFs. In the experiment, we used a novel circuit based on an AC capacitance bridge to directly measure the differential capacitance value. The experimental results show that this differential capacitive sensor has a sensitivity of 2 × 10^-4 pF/μm with 0.08 μm resolution. The measurement range of this differential capacitive sensor is 6 mm, and the linearity error is less than 0.01% over the whole absolute position measurement range.
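Given the reported linear response, converting a differential-capacitance reading into a displacement is a single division by the sensitivity. A minimal sketch, assuming the stated 2 × 10^-4 pF/μm sensitivity and treating the 6 mm range as 0–6000 μm (an assumption; the paper does not state the sign convention):

```python
def position_from_capacitance(delta_c_pf, sensitivity_pf_per_um=2e-4,
                              range_um=6000.0):
    """Convert a measured differential capacitance (pF) into an
    absolute displacement (um) using the sensor's linear response;
    readings outside the assumed 0-6 mm range are rejected."""
    pos = delta_c_pf / sensitivity_pf_per_um
    if not 0.0 <= pos <= range_um:
        raise ValueError("reading outside the sensor's measurement range")
    return pos
```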

  1. Wastewater quality monitoring system using sensor fusion and machine learning techniques.

    Science.gov (United States)

    Qin, Xusong; Gao, Furong; Chen, Guohua

    2012-03-15

    A multi-sensor water quality monitoring system incorporating a UV/Vis spectrometer and a turbidimeter was used to monitor the Chemical Oxygen Demand (COD), Total Suspended Solids (TSS) and Oil & Grease (O&G) concentrations of the effluents from the Chinese restaurant on campus and an electrocoagulation-electroflotation (EC-EF) pilot plant. In order to handle the noise and information imbalance in the fused UV/Vis spectra and turbidity measurements during calibration model building, an improved boosting method, Boosting-Iterative Predictor Weighting-Partial Least Squares (Boosting-IPW-PLS), was developed in the present study. The Boosting-IPW-PLS method incorporates IPW into the boosting scheme to suppress quality-irrelevant variables by assigning them small weights, and builds up models for wastewater quality prediction based on the weighted variables. The monitoring system was tested in the field with satisfactory results, underlining the potential of this technique for the online monitoring of water quality.

  2. Sensor fusion of electron paramagnetic resonance and magnetorelaxometry data for quantitative magnetic nanoparticle imaging

    Science.gov (United States)

    Coene, A.; Leliaert, J.; Crevecoeur, G.; Dupré, L.

    2017-03-01

    Magnetorelaxometry (MRX) imaging and electron paramagnetic resonance (EPR) are two non-invasive techniques capable of recovering the magnetic nanoparticle (MNP) distribution. Both techniques solve an ill-posed inverse problem in order to find the spatial MNP distribution. Much research has been devoted to increasing the stability of these inverse problems, with the main objective of improving the quality of MNP imaging. In this paper a proof of concept is presented in which the sensor data of both techniques are fused into EPR–MRX, with the intention of stabilizing the inverse problem. First, both techniques are compared by reconstructing several phantoms of different sizes at various noise levels and calculating stability, sensitivity and reconstruction quality parameters for these cases. This study reveals that the two techniques are sensitive to different information from the MNP distributions and generate complementary measurement data. As such, their merging might stabilize the inverse problem. In a next step we investigated how the two techniques should be combined to reduce their respective drawbacks, such as a high number of required measurements and reduced stability, and to improve MNP reconstructions. We were able to stabilize both techniques, increase reconstruction quality by an average of 5% and reduce measurement times by 88%. These improvements could make EPR–MRX a valuable and accurate technique in a clinical environment.
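The stabilizing effect of fusing two modalities that probe the same unknown distribution can be illustrated with a generic Tikhonov-regularized joint least-squares solve. This is a sketch of the principle only, not the authors' EPR and MRX forward models:

```python
import numpy as np

def fused_reconstruction(A1, y1, A2, y2, lam=1e-6):
    """Stack two linear measurement models of the same unknown source
    (e.g. two imaging modalities) and solve the joint problem with
    Tikhonov regularization to stabilize an ill-posed inversion."""
    A = np.vstack([A1, A2])            # fused measurement operator
    y = np.concatenate([y1, y2])       # fused data vector
    n = A.shape[1]
    # normal equations with a small ridge term for stability
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
```

Because each modality constrains directions of the solution space that the other leaves poorly determined, the stacked operator is better conditioned than either operator alone.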

  3. Enhanced research in ground-penetrating radar and multisensor fusion with application to the detection and visualization of buried waste. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Devney, A.J.; DiMarzio, C.; Kokar, M.; Miller, E.L.; Rappaport, C.M.; Weedon, W.H.

    1996-05-14

    Recognizing the difficulty and importance of the landfill remediation problems faced by DOE, and the fact that no one sensor alone can provide complete environmental site characterization, a multidisciplinary team approach was chosen for this project. The authors have developed a multisensor fusion approach that is suitable for the wide variety of sensors available to DOE and that allows separate detection algorithms to be developed and custom-tailored to each sensor. This approach is currently being applied to the Geonics EM-61 and Coleman step-frequency radar data. High-resolution array processing techniques were developed for detecting and localizing buried waste containers. A soil characterization laboratory facility was developed using an HP-8510 network analyzer and a near-field coaxial probe. Both internal and external calibration procedures were developed for de-embedding the frequency-dependent soil electrical parameters from the measurements. Dispersive soil propagation modeling algorithms were also developed for simulating wave propagation in dispersive soil media. A study was performed on the application of infrared sensors to the landfill remediation problem, particularly for providing information on volatile organic compounds (VOCs) in the atmosphere. A dust-emission lidar system is proposed for landfill remediation monitoring. Design specifications are outlined for a system that could be used to monitor dust emissions in a landfill remediation effort. The detailed results of the investigations are contained herein.

  4. Practical Performance Analysis for Multiple Information Fusion Based Scalable Localization System Using Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yubin Zhao

    2016-08-01

    Full Text Available In practical localization system design, researchers need to consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment and the environment. These practical concerns then turn out to be technical problems, e.g., the sequential position state propagation, the target-anchor geometry effect, Non-line-of-sight (NLOS) identification and the related prior information. It is necessary to construct an efficient framework that can exploit multiple sources of available information and guide the system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér–Rao lower bound (CRLB), which can fuse all of the information adaptively. Firstly, we use an abstract function to represent the whole wireless localization system model. Then, the unknown vector of the CRLB consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike a purely theoretical analysis, our CRLB can serve as a practical fundamental limit for a system that fuses multiple sources of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, the map matching method and the NLOS identification and mitigation methods. Thus, the theoretical results come closer to the real case. In addition, our method is more adaptable than other CRLBs when considering more unknown important factors. We use the proposed method to analyze the wireless sensor network-based indoor localization system. The influence of the hybrid LOS/NLOS channels, the building layout information and the relative height differences between the target and anchors are analyzed. It is demonstrated that our method exploits all of the
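For intuition, the position-error CRLB in the simplest special case, independent Gaussian range measurements to known anchors, follows from inverting the Fisher information matrix; this textbook sketch illustrates the state-matrix part of the framework described above:

```python
import numpy as np

def range_crlb(target, anchors, sigma=0.1):
    """Cramér-Rao lower bound on 2-D position MSE for independent
    Gaussian range measurements (std dev sigma) to known anchors."""
    target = np.asarray(target, dtype=float)
    anchors = np.asarray(anchors, dtype=float)
    J = np.zeros((2, 2))
    for a in anchors:
        d = target - a
        u = d / np.linalg.norm(d)          # unit bearing vector to anchor
        J += np.outer(u, u) / sigma**2     # Fisher information of one range
    return float(np.trace(np.linalg.inv(J)))   # bound on total position MSE
```

The bound shrinks as anchors surround the target from diverse bearings, which is the target-anchor geometry effect the abstract refers to.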

  5. Practical Performance Analysis for Multiple Information Fusion Based Scalable Localization System Using Wireless Sensor Networks.

    Science.gov (United States)

    Zhao, Yubin; Li, Xiaofan; Zhang, Sha; Meng, Tianhui; Zhang, Yiwen

    2016-08-23

    In practical localization system design, researchers need to consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment and the environment. These practical concerns then turn out to be technical problems, e.g., the sequential position state propagation, the target-anchor geometry effect, Non-line-of-sight (NLOS) identification and the related prior information. It is necessary to construct an efficient framework that can exploit multiple sources of available information and guide the system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér-Rao lower bound (CRLB), which can fuse all of the information adaptively. Firstly, we use an abstract function to represent the whole wireless localization system model. Then, the unknown vector of the CRLB consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike a purely theoretical analysis, our CRLB can serve as a practical fundamental limit for a system that fuses multiple sources of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, the map matching method and the NLOS identification and mitigation methods. Thus, the theoretical results come closer to the real case. In addition, our method is more adaptable than other CRLBs when considering more unknown important factors. We use the proposed method to analyze the wireless sensor network-based indoor localization system. The influence of the hybrid LOS/NLOS channels, the building layout information and the relative height differences between the target and anchors are analyzed. It is demonstrated that our method exploits all of the available information for

  6. Practical Performance Analysis for Multiple Information Fusion Based Scalable Localization System Using Wireless Sensor Networks

    Science.gov (United States)

    Zhao, Yubin; Li, Xiaofan; Zhang, Sha; Meng, Tianhui; Zhang, Yiwen

    2016-01-01

    In practical localization system design, researchers need to consider several aspects to make positioning efficient and effective, e.g., the available auxiliary information, sensing devices, equipment deployment and the environment. These practical concerns then turn out to be technical problems, e.g., the sequential position state propagation, the target-anchor geometry effect, Non-line-of-sight (NLOS) identification and the related prior information. It is necessary to construct an efficient framework that can exploit multiple sources of available information and guide the system design. In this paper, we propose a scalable method to analyze system performance based on the Cramér–Rao lower bound (CRLB), which can fuse all of the information adaptively. Firstly, we use an abstract function to represent the whole wireless localization system model. Then, the unknown vector of the CRLB consists of two parts: the first part is the estimated vector, and the second part is the auxiliary vector, which helps improve the estimation accuracy. Accordingly, the Fisher information matrix is divided into two parts: the state matrix and the auxiliary matrix. Unlike a purely theoretical analysis, our CRLB can serve as a practical fundamental limit for a system that fuses multiple sources of information in a complicated environment, e.g., recursive Bayesian estimation based on the hidden Markov model, the map matching method and the NLOS identification and mitigation methods. Thus, the theoretical results come closer to the real case. In addition, our method is more adaptable than other CRLBs when considering more unknown important factors. We use the proposed method to analyze the wireless sensor network-based indoor localization system. The influence of the hybrid LOS/NLOS channels, the building layout information and the relative height differences between the target and anchors are analyzed. It is demonstrated that our method exploits all of the available information for

  7. Ground Optical Signal Processing Architecture for Contributing Space-Based SSA Sensor Data

    Science.gov (United States)

    2014-09-01

    [Figure: Orbit Propagation (both sensor and target); Sensor/Target State Vectors & Target Radiometric Properties; Millennium Space Systems Tasker/Scheduler] ...capability of converting visual magnitudes to radiometric quantities. However, since the model covers the full range of light (from UV through VLWIR) a

  8. Adaptation of Dubins Paths for UAV Ground Obstacle Avoidance When Using a Low Cost On-Board GNSS Sensor

    Directory of Open Access Journals (Sweden)

    Ramūnas Kikutis

    2017-09-01

    Full Text Available Current research on Unmanned Aerial Vehicles (UAVs) shows a lot of interest in autonomous UAV navigation. This interest is mainly driven by the necessity to meet the rules and restrictions for small UAV flights that are issued by various international and national legal organizations. In order to relax these restrictions, new levels of automation and flight safety must be reached. In this paper, a new method for ground obstacle avoidance, derived from UAV navigation based on the Dubins paths algorithm, is presented. The accuracy of the proposed method has been tested, and research results have been obtained using Software-in-the-Loop (SITL) simulation and real UAV flights, with the measurements made with a low-cost Global Navigation Satellite System (GNSS) sensor. All tests were carried out in three-dimensional space, but height accuracy was not assessed. The GNSS navigation data for the ground obstacle avoidance algorithm are evaluated statistically.
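Dubins paths stitch straight segments and circular arcs of a fixed minimum turning radius. A minimal clearance test for the straight-segment case, checking whether a leg of the flight path passes too close to a circular ground obstacle, can be sketched as follows (an illustrative geometry helper, not the paper's algorithm):

```python
import numpy as np

def path_hits_obstacle(p0, p1, center, radius):
    """True if the straight path segment p0->p1 passes within `radius`
    of a circular ground obstacle at `center` (all points are 2-D)."""
    p0, p1, c = (np.asarray(v, dtype=float) for v in (p0, p1, center))
    d = p1 - p0
    # parameter of the closest point on the segment, clamped to [0, 1]
    t = np.clip(np.dot(c - p0, d) / np.dot(d, d), 0.0, 1.0)
    closest = p0 + t * d
    return float(np.linalg.norm(c - closest)) <= radius
```

A planner would run such a check on each candidate segment and, on a hit, replace the segment with a Dubins detour around the obstacle.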

  9. Sensor fusion and computer vision for context-aware control of a multi degree-of-freedom prosthesis

    Science.gov (United States)

    Markovic, Marko; Dosen, Strahinja; Popovic, Dejan; Graimann, Bernhard; Farina, Dario

    2015-12-01

    Objective. Myoelectric activity volitionally generated by the user is often used for controlling hand prostheses in order to replicate the synergistic actions of muscles in healthy humans during grasping. Muscle synergies in healthy humans are based on the integration of visual perception, heuristics and proprioception. Here, we demonstrate how sensor fusion that combines artificial vision and proprioceptive information with the high-level processing characteristics of biological systems can be effectively used in transradial prosthesis control. Approach. We developed a novel context- and user-aware prosthesis (CASP) controller integrating computer vision and inertial sensing with myoelectric activity in order to achieve semi-autonomous and reactive control of a prosthetic hand. The presented method semi-automatically provides simultaneous and proportional control of multiple degrees-of-freedom (DOFs), thus decreasing overall physical effort while retaining full user control. The system was compared against a major commercial state-of-the-art myoelectric control system in ten able-bodied subjects and one amputee. All subjects used a transradial prosthesis with an active wrist to grasp objects typically associated with activities of daily living. Main results. The CASP significantly outperformed the myoelectric interface when controlling all of the prosthesis DOFs. However, when tested with a less complex prosthetic system (smaller number of DOFs), the CASP was slower but produced reaching motions that contained fewer compensatory movements. Another important finding is that the CASP system required minimal user adaptation and training. Significance. The CASP constitutes a substantial improvement for the control of multi-DOF prostheses. The application of the CASP will have a significant impact when translated to real-life scenarios, particularly with respect to improving the usability and acceptance of highly complex systems (e.g., full prosthetic arms) by amputees.

  10. Spatio-temporal monitoring of cotton cultivation using ground-based and airborne multispectral sensors in GIS environment.

    Science.gov (United States)

    Papadopoulos, Antonis; Kalivas, Dionissios; Theocharopoulos, Sid

    2017-07-01

    The multispectral sensor capability of capturing reflectance data at several spectral channels, together with the inherent reflectance responses of various soils and especially plant surfaces, has gained major interest in crop production. In the present study, two multispectral sensing systems, one ground-based and one aerial, were applied for the multispatial and temporal monitoring of two cotton fields in central Greece. The ground-based system was a Crop Circle ACS-430, while the aerial system consisted of a consumer-level quadcopter (Phantom 2) and a modified Hero3+ Black digital camera. The purpose of the research was to monitor crop growth with the two systems and investigate possible interrelations between the values of the well-known normalized difference vegetation index (NDVI) derived from each. Five data collection campaigns were conducted during the cultivation period, consisting of scanning soil and plants with the ground-based sensor and taking aerial photographs of the fields with the unmanned aerial system. According to the results, both systems successfully monitored cotton growth stages in terms of space and time. The mean values of NDVI changes through time as retrieved by the ground-based system were satisfactorily modelled by a second-order polynomial equation (R² = 0.96 in Field 1 and 0.99 in Field 2). Further, they were highly correlated (r = 0.90 in Field 1 and 0.74 in Field 2) with the corresponding values calculated via the aerial-based system. The unmanned aerial system (UAS) can potentially substitute for crop scouting as it offers a time-effective, non-destructive and reliable way of monitoring soil and plants.
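
    The NDVI workflow described above can be mimicked in a few lines. The sketch below uses made-up campaign values, not the paper's data, to show the NDVI formula, the second-order polynomial fit of mean NDVI over time, and the cross-system correlation.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# hypothetical mean NDVI per campaign (day of season) for the two systems
days        = np.array([20, 45, 70, 95, 120])
ndvi_ground = np.array([0.18, 0.42, 0.68, 0.74, 0.55])
ndvi_aerial = np.array([0.15, 0.40, 0.65, 0.70, 0.52])

coeffs = np.polyfit(days, ndvi_ground, 2)          # second-order polynomial fit
r2 = 1 - np.sum((np.polyval(coeffs, days) - ndvi_ground) ** 2) / \
         np.sum((ndvi_ground - ndvi_ground.mean()) ** 2)
r = np.corrcoef(ndvi_ground, ndvi_aerial)[0, 1]    # cross-system correlation
print(round(r2, 3), round(r, 3))
```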

  11. Wearable Sensor System for Human Dynamics Analysis

    OpenAIRE

    Liu, Tao; Inoue, Yoshio; Shibata, Kyoko; Zheng, Rencheng

    2010-01-01

    A new wearable sensor system was developed for measuring tri-directional ground reaction force (GRF) and segment orientations. A stationary force plate cannot measure more than one stride; moreover, in studies of stair ascent and descent, a complex system consisting of many stationary force plates and a data fusion method must be constructed (Stacoff et al., 2005; Della and Bonato, 2007). The wearable sensor system proposed in this chapter can be applied to successive walking tr...

  12. Monitoring soil moisture patterns in alpine meadows using ground sensor networks and remote sensing techniques

    Science.gov (United States)

    Bertoldi, Giacomo; Brenner, Johannes; Notarnicola, Claudia; Greifeneder, Felix; Nicolini, Irene; Della Chiesa, Stefano; Niedrist, Georg; Tappeiner, Ulrike

    2015-04-01

    Soil moisture content (SMC) is a key factor for numerous processes, including runoff generation, groundwater recharge, evapotranspiration, soil respiration, and biological productivity. Understanding the controls on the spatial and temporal variability of SMC in mountain catchments is an essential step towards improving quantitative predictions of catchment hydrological processes and related ecosystem services. The interacting influences of precipitation, soil properties, vegetation, and topography on SMC and the influence of SMC patterns on runoff generation processes have been extensively investigated (Vereecken et al., 2014). However, in mountain areas, obtaining reliable SMC estimations is still challenging, because of the high variability in topography, soil and vegetation properties. In the last few years, there has been an increasing interest in the estimation of surface SMC at local scales. On the one hand, low cost wireless sensor networks provide high-resolution SMC time series. On the other hand, active remote sensing microwave techniques, such as Synthetic Aperture Radars (SARs), show promising results (Bertoldi et al. 2014). As these data provide continuous coverage of large spatial extents with high spatial resolution (10-20 m), they are particularly in demand for mountain areas. However, there are still limitations related to the fact that the SAR signal can penetrate only a few centimeters in the soil. Moreover, the signal is strongly influenced by vegetation, surface roughness and topography. In this contribution, we analyse the spatial and temporal dynamics of surface and root-zone SMC (2.5 - 5 - 25 cm depth) of alpine meadows and pastures in the Long Term Ecological Research (LTER) Area Mazia Valley (South Tyrol - Italy) with different techniques: (I) a network of 18 stations; (II) field campaigns with mobile ground sensors; (III) 20-m resolution RADARSAT2 SAR images; (IV) numerical simulations using the GEOtop hydrological model (Rigon et al

  13. Assessment of Above-Ground Biomass of Borneo Forests through a New Data-Fusion Approach Combining Two Pan-Tropical Biomass Maps

    Directory of Open Access Journals (Sweden)

    Andreas Langner

    2015-08-01

    Full Text Available This study investigates how two existing pan-tropical above-ground biomass (AGB) maps (Saatchi 2011, Baccini 2012) can be combined to derive forest-ecosystem-specific carbon estimates. Several data-fusion models which combine these AGB maps according to their local correlations with independent datasets, such as the spectral bands of SPOT VEGETATION imagery, are analyzed. Indeed, these spectral bands convey information about vegetation type and structure which can be related to biomass values. Our study area is the island of Borneo. The data-fusion models are evaluated against a reference AGB map available for two forest concessions in Sabah. The highest accuracy was achieved by a model which combines the AGB maps according to the mean of the local correlation coefficients calculated over different kernel sizes. Combining the resulting AGB map with a new Borneo land cover map (whose overall accuracy has been estimated at 86.5%) leads to average AGB estimates of 279.8 t/ha and 233.1 t/ha for forests and degraded forests, respectively. Lowland dipterocarp and mangrove forests have the highest and lowest AGB values (305.8 t/ha and 136.5 t/ha, respectively). The AGB of all natural forests amounts to 10.8 Gt, mainly stemming from lowland dipterocarp (66.4%), upper dipterocarp (10.9%) and peat swamp forests (10.2%). Degraded forests account for another 2.1 Gt of AGB. One main advantage of our approach is that, once the best-fitting data-fusion model is selected, no further AGB reference dataset is required for implementing the data-fusion process. Furthermore, the local harmonization of AGB datasets leads to more spatially precise maps. This approach can easily be extended to other areas in Southeast Asia which are dominated by lowland dipterocarp forest, and can be repeated when newer or more accurate AGB maps become available.
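
    A minimal sketch of the local-correlation weighting idea follows (not the authors' exact model, and with entirely synthetic arrays): each AGB map is weighted pixel-wise by the absolute value of its correlation with a reference spectral band inside a sliding kernel, and the fused map is the weighted average.

```python
import numpy as np

def local_corr(x, ref, k):
    """Pearson r between x and ref in a (2k+1)x(2k+1) window around each
    pixel (naive sliding-window sketch, no edge padding)."""
    out = np.zeros(x.shape)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            win = (slice(max(i - k, 0), i + k + 1),
                   slice(max(j - k, 0), j + k + 1))
            out[i, j] = np.corrcoef(x[win].ravel(), ref[win].ravel())[0, 1]
    return out

def fuse_agb(agb1, agb2, ref, k=2):
    # weight each AGB map by its local correlation with the reference band
    w1 = np.abs(local_corr(agb1, ref, k))
    w2 = np.abs(local_corr(agb2, ref, k))
    return (w1 * agb1 + w2 * agb2) / (w1 + w2 + 1e-12)

rng = np.random.default_rng(0)
ref = rng.uniform(0, 1, (20, 20))                    # synthetic spectral band
agb1 = 300 * ref + rng.normal(0, 20, ref.shape)      # synthetic AGB maps
agb2 = 250 * ref + rng.normal(0, 40, ref.shape)
fused = fuse_agb(agb1, agb2, ref)
```

    Averaging the weights over several kernel sizes, as the best-performing model in the study does, is a straightforward extension of `fuse_agb`.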

  14. Mobile Ground-Based Radar Sensor for Localization and Mapping: An Evaluation of two Approaches

    Directory of Open Access Journals (Sweden)

    Damien Vivet

    2013-08-01

    Full Text Available This paper is concerned with robotic applications using a ground-based radar sensor for simultaneous localization and mapping problems. In mobile robotics, radar technology is interesting because of its long range and the robustness of radar waves to atmospheric conditions, making these sensors well-suited for extended outdoor robotic applications. Two localization and mapping approaches using data obtained from a 360° field-of-view microwave radar sensor are presented and compared. The first method is a trajectory-oriented simultaneous localization and mapping technique, which makes no landmark assumptions and avoids the data association problem. The estimation of the ego-motion makes use of the Fourier-Mellin transform for registering radar images in a sequence, from which the rotation and translation of the sensor motion can be estimated. The second approach exploits a consequence of using a rotating range sensor in high-speed robotics: in such a situation, movement combinations create distortions in the collected data. Velocimetry is achieved here by explicitly analysing these measurement distortions. As a result, the trajectory of the vehicle and then the radar map of outdoor environments can be obtained. An evaluation of experimental results obtained by the two methods is presented on real-world data from a vehicle moving at 30 km/h over a 2.5 km course.

  15. Accuracy of PARTwear Inertial Sensor and Optojump Optical Measurement System for Measuring Ground Contact Time During Running.

    Science.gov (United States)

    Ammann, Rahel; Taube, Wolfgang; Wyss, Thomas

    2016-07-01

    Ammann, R, Taube, W, and Wyss, T. Accuracy of PARTwear inertial sensor and Optojump optical measurement system for measuring ground contact time during running. J Strength Cond Res 30(7): 2057-2063, 2016. The aim of this study was to validate the detection of ground contact time (GCT) during running in 2 differently working systems: a small inertial measurement sensor, PARTwear (PW), worn on the shoe laces, and the optical measurement system, Optojump (OJ), placed on the track. Twelve well-trained subjects performed 12 runs each on an indoor track at speeds ranging from 3.0 to 9.0 m·s⁻¹. GCT of one step per run (total 144) was simultaneously obtained by the PW, the OJ, and a high-speed video camera (HSC), whereby the latter served as reference system. The sampling rate was 1,000 Hz for all methods. Compared with the HSC, the PW and the OJ systems underestimated GCT by -1.3 ± 6.1% and -16.5 ± 6.7% (p-values ≤ 0.05), respectively. The intraclass correlation coefficients between PW and HSC and between OJ and HSC were 0.984 and 0.853, respectively (p-values ≤ 0.05).

  16. Fusion of lidar and radar for detection of partially obscured objects

    Science.gov (United States)

    Hollinger, Jim; Kutscher, Brett; Close, Ryan

    2015-05-01

    The capability to detect partially obscured objects is of interest to many communities, including ground vehicle robotics. The ability to find partially obscured objects can aid in automated navigation and planning algorithms used by robots. Two sensors often used for this task are Lidar and Radar. Lidar and Radar systems provide complementary data about the environment. Both are active sensing modalities and provide direct range measurements. However, they operate in very different portions of the radio frequency spectrum. By exploiting properties associated with the different frequency spectra, the sensors are able to compensate for each other's shortcomings. This makes them excellent candidates for sensor processing and data fusion systems. The benefits associated with Lidar and Radar sensor fusion for a ground vehicle application, using economical variants of these sensors, are presented. Special consideration is given to detecting objects partially obscured by light to medium vegetation.

  17. Sensoring Fusion Data from the Optic and Acoustic Emissions of Electric Arcs in the GMAW-S Process for Welding Quality Assessment

    Directory of Open Access Journals (Sweden)

    Eber Huanca Cayo

    2012-05-01

    Full Text Available The present study shows the relationship between welding quality and optical-acoustic emissions from electric arcs during welding runs in the GMAW-S process. Bead-on-plate welding tests were carried out with pre-set parameters chosen from manufacturing standards. During the welding runs, interferences were induced on the welding path using paint, grease or gas faults. In each welding run, the arc voltage, welding current, infrared and acoustic emission values were acquired, and parameters such as arc power, acoustic peak rate and infrared radiation rate were computed. Data fusion algorithms were developed by assessing known welding quality parameters from arc emissions. These algorithms showed better responses when they were based on more than just one sensor. Finally, it was concluded that there is a close relation between arc emissions and welding quality, and that it can be measured by arc emission sensing and data fusion algorithms.
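
    The derived parameters mentioned above are simple to compute from the raw signals. The sketch below uses synthetic voltage, current and acoustic traces with an assumed sampling rate (none of the values are from the study): mean arc power is the average of instantaneous voltage times current, and the acoustic peak rate is obtained by counting threshold crossings.

```python
import numpy as np

fs = 10_000                                    # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(2)
voltage = 20 + 2 * rng.normal(size=t.size)     # synthetic arc voltage, V
current = 150 + 10 * rng.normal(size=t.size)   # synthetic welding current, A
acoustic = rng.normal(size=t.size)
acoustic[::500] += 8                           # inject short-circuit "pops"

arc_power = np.mean(voltage * current)         # mean instantaneous arc power, W
threshold = acoustic.mean() + 4 * acoustic.std()
# count upward threshold crossings as acoustic peaks
peaks = np.sum((acoustic[1:] > threshold) & (acoustic[:-1] <= threshold))
peak_rate = peaks / t[-1]                      # acoustic peaks per second
print(arc_power, peak_rate)
```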

  18. Sensoring fusion data from the optic and acoustic emissions of electric arcs in the GMAW-S process for welding quality assessment.

    Science.gov (United States)

    Alfaro, Sadek Crisóstomo Absi; Cayo, Eber Huanca

    2012-01-01

    The present study shows the relationship between welding quality and optical-acoustic emissions from electric arcs during welding runs in the GMAW-S process. Bead-on-plate welding tests were carried out with pre-set parameters chosen from manufacturing standards. During the welding runs, interferences were induced on the welding path using paint, grease or gas faults. In each welding run, the arc voltage, welding current, infrared and acoustic emission values were acquired, and parameters such as arc power, acoustic peak rate and infrared radiation rate were computed. Data fusion algorithms were developed by assessing known welding quality parameters from arc emissions. These algorithms showed better responses when they were based on more than just one sensor. Finally, it was concluded that there is a close relation between arc emissions and welding quality, and that it can be measured by arc emission sensing and data fusion algorithms.

  19. Flight Test Result for the Ground-Based Radio Navigation System Sensor with an Unmanned Air Vehicle.

    Science.gov (United States)

    Jang, Jaegyu; Ahn, Woo-Guen; Seo, Seungwoo; Lee, Jang Yong; Park, Jun-Pyo

    2015-11-11

    The Ground-based Radio Navigation System (GRNS) is an alternative/backup navigation system based on time synchronized pseudolites. It has been studied for some years due to the potential vulnerability issue of satellite navigation systems (e.g., GPS or Galileo). In the framework of our study, a periodic pulsed sequence was used instead of the randomized pulse sequence recommended as the RTCM (radio technical commission for maritime services) SC (special committee)-104 pseudolite signal, as a randomized pulse sequence with a long dwell time is not suitable for applications requiring high dynamics. This paper introduces a mathematical model of the post-correlation output in a navigation sensor, showing that the aliasing caused by the additional frequency term of a periodic pulsed signal leads to a false lock (i.e., Doppler frequency bias) during the signal acquisition process or in the carrier tracking loop of the navigation sensor. We suggest algorithms to resolve the frequency false lock issue in this paper, relying on the use of a multi-correlator. A flight test with an unmanned helicopter was conducted to verify the implemented navigation sensor. The results of this analysis show that there were no false locks during the flight test and that outliers stem from bad dilution of precision (DOP) or fluctuations in the received signal quality.

  20. Flight Test Result for the Ground-Based Radio Navigation System Sensor with an Unmanned Air Vehicle

    Directory of Open Access Journals (Sweden)

    Jaegyu Jang

    2015-11-01

    Full Text Available The Ground-based Radio Navigation System (GRNS) is an alternative/backup navigation system based on time synchronized pseudolites. It has been studied for some years due to the potential vulnerability issue of satellite navigation systems (e.g., GPS or Galileo). In the framework of our study, a periodic pulsed sequence was used instead of the randomized pulse sequence recommended as the RTCM (radio technical commission for maritime services) SC (special committee)-104 pseudolite signal, as a randomized pulse sequence with a long dwell time is not suitable for applications requiring high dynamics. This paper introduces a mathematical model of the post-correlation output in a navigation sensor, showing that the aliasing caused by the additional frequency term of a periodic pulsed signal leads to a false lock (i.e., Doppler frequency bias) during the signal acquisition process or in the carrier tracking loop of the navigation sensor. We suggest algorithms to resolve the frequency false lock issue in this paper, relying on the use of a multi-correlator. A flight test with an unmanned helicopter was conducted to verify the implemented navigation sensor. The results of this analysis show that there were no false locks during the flight test and that outliers stem from bad dilution of precision (DOP) or fluctuations in the received signal quality.

  1. Real-time phasing and co-phasing of a ground-based interferometer with a pyramid wavefront sensor.

    Science.gov (United States)

    Vérinaud, Christophe; Esposito, Simone

    The feasibility and remarkable performance of pyramid wavefront sensing in adaptive optics have already been demonstrated. In this paper, we investigate another potential use of the pyramid wavefront sensor: differential piston sensing in interferometry. This can be done by using a glass pyramid placed in a combined focal plane of the interferometer and a CCD sampling the usual four diffracted images of the pupil, composed here of the interferometer apertures. From a purely geometrical point of view, no information about the differential phase between two pupils could be retrieved. However, as the sensor's main component, the pyramid, is located directly in the interference pattern of the interferometer, the piston information present in the electric field of the combined focal plane modifies, after diffraction by the pyramid, the intensity distribution in the pupil plane. Thus, with only one sensor, the differential piston can be measured in addition to the classical local tilt determination. In this paper we present the concept and give simulation results showing the performance of a closed-loop adaptive optics correction for a ground-based two-telescope interferometer like the Large Binocular Telescope.

  2. Evaluation of event-based algorithms for optical flow with ground-truth from inertial measurement sensor

    Directory of Open Access Journals (Sweden)

    Bodo Rückauer

    2016-04-01

    Full Text Available In this study we compare nine optical flow algorithms that locally measure the flow normal to edges according to accuracy and computation cost. In contrast to conventional, frame-based motion flow algorithms, our open-source implementations compute optical flow based on address-events from a neuromorphic Dynamic Vision Sensor (DVS). For this benchmarking we created a dataset of two synthesized and three real samples recorded from a 240 × 180 pixel Dynamic and Active-pixel Vision Sensor (DAVIS). This dataset contains events from the DVS as well as conventional frames to support testing state-of-the-art frame-based methods. We introduce a new source for the ground truth: In the special case that the perceived motion stems solely from a rotation of the vision sensor around its three camera axes, the true optical flow can be estimated using gyro data from the inertial measurement unit integrated with the DAVIS camera. This provides a ground-truth to which we can compare algorithms that measure optical flow by means of motion cues. An analysis of error sources led to the use of a refractory period, more accurate numerical derivatives and a Savitzky-Golay filter to achieve significant improvements in accuracy. Our pure Java implementations of two recently published algorithms reduce computational cost by up to 29% compared to the original implementations. Two of the algorithms introduced in this paper further speed up processing by a factor of 10 compared with the original implementations, at equal or better accuracy. On a desktop PC, they run in real-time on dense natural input recorded by a DAVIS camera.
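
    The gyro-based ground truth relies on the standard rotational component of the optical flow field for a pinhole camera. The sketch below evaluates that field at a pixel; the focal length and angular rates are made-up values, not DAVIS calibration data, and sign conventions vary with the chosen axis definitions.

```python
def rotational_flow(x, y, omega, f):
    """Image-plane flow (pixels/s) at pixel (x, y), measured from the
    principal point, induced by pure camera rotation
    omega = (wx, wy, wz) in rad/s, for focal length f in pixels
    (one common sign convention)."""
    wx, wy, wz = omega
    u = (x * y / f) * wx - (f + x * x / f) * wy + y * wz
    v = (f + y * y / f) * wx - (x * y / f) * wy - x * wz
    return u, v

# pure yaw at the principal point moves the image horizontally only
u, v = rotational_flow(0.0, 0.0, (0.0, 0.1, 0.0), f=115.0)
print(u, v)  # → -11.5 0.0
```

    Note that this field is depth-independent, which is exactly why pure rotation admits an exact gyro-derived ground truth while translational motion does not.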

  3. Evaluation of Event-Based Algorithms for Optical Flow with Ground-Truth from Inertial Measurement Sensor.

    Science.gov (United States)

    Rueckauer, Bodo; Delbruck, Tobi

    2016-01-01

    In this study we compare nine optical flow algorithms that locally measure the flow normal to edges according to accuracy and computation cost. In contrast to conventional, frame-based motion flow algorithms, our open-source implementations compute optical flow based on address-events from a neuromorphic Dynamic Vision Sensor (DVS). For this benchmarking we created a dataset of two synthesized and three real samples recorded from a 240 × 180 pixel Dynamic and Active-pixel Vision Sensor (DAVIS). This dataset contains events from the DVS as well as conventional frames to support testing state-of-the-art frame-based methods. We introduce a new source for the ground truth: In the special case that the perceived motion stems solely from a rotation of the vision sensor around its three camera axes, the true optical flow can be estimated using gyro data from the inertial measurement unit integrated with the DAVIS camera. This provides a ground-truth to which we can compare algorithms that measure optical flow by means of motion cues. An analysis of error sources led to the use of a refractory period, more accurate numerical derivatives and a Savitzky-Golay filter to achieve significant improvements in accuracy. Our pure Java implementations of two recently published algorithms reduce computational cost by up to 29% compared to the original implementations. Two of the algorithms introduced in this paper further speed up processing by a factor of 10 compared with the original implementations, at equal or better accuracy. On a desktop PC, they run in real-time on dense natural input recorded by a DAVIS camera.

  4. Gulf stream ground truth project - Results of the NRL airborne sensors

    Science.gov (United States)

    Mcclain, C. R.; Chen, D. T.; Hammond, D. L.

    1980-01-01

    Results of an airborne study of the waves in the Gulf Stream are presented. These results show that the active microwave sensors (high-flight radar and wind-wave radar) provide consistent and accurate estimates of significant wave height and surface wind speed, respectively. The correlation between the wave height measurements of the high-flight radar and a laser profilometer is excellent.

  5. Fusion of Satellite Multispectral Images Based on Ground-Penetrating Radar (GPR Data for the Investigation of Buried Concealed Archaeological Remains

    Directory of Open Access Journals (Sweden)

    Athos Agapiou

    2017-06-01

    Full Text Available The paper investigates the superficial layers of an archaeological landscape based on the integration of various remote sensing techniques. It is well known in the literature that shallow depths may be rich in archaeological remains, which generate different signal responses depending on the applied technique. In this study three main technologies are examined, namely ground-penetrating radar (GPR), ground spectroscopy, and multispectral satellite imagery. The study aims to propose a methodology to enhance optical remote sensing satellite images, intended for archaeological research, based on the integration of ground-based and satellite datasets. For this task, a regression model between the ground spectroradiometer and GPR is established, which is then projected to a high-resolution sub-meter optical image. The overall methodology consists of nine steps. Beyond the acquisition of the in-situ measurements and their calibration (Steps 1–3), various regression models are examined for more than 70 different vegetation indices (Steps 4–5). The data analysis indicated that the red-edge position (REP) hyperspectral index was the most appropriate for developing a local fusion model between ground spectroscopy data and GPR datasets (Step 6), providing results comparable with the in situ GPR measurements (Step 7). Other vegetation indices, such as the normalized difference vegetation index (NDVI), have also been examined, providing significant correlation between the two datasets (R = 0.50). The model is then projected to a high-resolution image over the area of interest (Step 8). The proposed methodology was evaluated with a series of field data collected from the Vésztő-Mágor Tell in the eastern part of Hungary. The results were compared with in situ magnetic gradiometry measurements, indicating common interpretation results. The results were also compatible with the preliminary archaeological investigations of the area (Step 9). The overall
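
    A toy version of the Step 6-8 fusion model might look as follows; the REP and GPR values below are invented for illustration (the paper's actual calibration is more involved): fit a regression between co-located index and GPR samples, then apply it to an index raster derived from the satellite image.

```python
import numpy as np

# hypothetical co-located samples: red-edge position (nm) vs. GPR amplitude
rep = np.array([705.0, 712.0, 718.0, 722.0, 728.0, 731.0])
gpr = np.array([0.42, 0.51, 0.60, 0.66, 0.74, 0.77])

slope, intercept = np.polyfit(rep, gpr, 1)      # local linear fusion model
r = np.corrcoef(rep, gpr)[0, 1]                 # strength of the relation

# project the model onto a REP raster derived from the satellite image
rep_raster = np.full((4, 4), 715.0) + np.arange(16).reshape(4, 4)
gpr_pred = slope * rep_raster + intercept       # predicted GPR response map
```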

  6. An interprojection sensor fusion approach to estimate blocked projection signal in synchronized moving grid-based CBCT system

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Hong; Kong, Vic [Department of Radiation Oncology, Georgia Regents University, Augusta, Georgia 30912 (United States); Ren, Lei; Giles, William; Zhang, You [Department of Radiation Oncology, Duke University, Durham, North Carolina 27710 (United States); Jin, Jian-Yue, E-mail: jjin@gru.edu [Department of Radiation Oncology, Georgia Regents University, Augusta, Georgia 30912 and Department of Radiology, Georgia Regents University, Augusta, Georgia 30912 (United States)

    2016-01-15

    Purpose: A preobject grid can reduce and correct scatter in cone beam computed tomography (CBCT). However, half of the signal in each projection is blocked by the grid. A synchronized moving grid (SMOG) has been proposed to acquire two complementary projections at each gantry position and merge them into one complete projection. That approach, however, suffers from increased scanning time and the technical difficulty of accurately merging the two projections per gantry angle. Herein, the authors present a new SMOG approach which acquires a single projection per gantry angle, with complementary grid patterns for any two adjacent projections, and uses an interprojection sensor fusion (IPSF) technique to estimate the blocked signal in each projection. The method may have the additional benefit of reduced imaging dose due to the grid blocking half of the incident radiation. Methods: The IPSF considers multiple paired observations from two adjacent gantry angles as approximations of the blocked signal and uses a weighted least squares regression of these observations to determine the blocked signal. The method was first tested with a simulated SMOG on a head phantom. The signal-to-noise ratio (SNR), which represents the difference of the recovered CBCT image from the original image without the SMOG, was used to evaluate the ability of the IPSF to recover the missing signal. The IPSF approach was then tested using a Catphan phantom on a prototype SMOG assembly installed in a bench-top CBCT system. Results: In the simulated SMOG experiment, the SNRs were increased from 15.1 and 12.7 dB to 35.6 and 28.9 dB compared with a conventional interpolation (inpainting) method for a projection and the reconstructed 3D image, respectively, suggesting that the IPSF successfully recovered most of the blocked signal. In the prototype SMOG experiment, the authors successfully reconstructed a CBCT image using the IPSF-SMOG approach. The detailed geometric features in the
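
    For a single blocked pixel modeled as a constant, the weighted least-squares step of the IPSF reduces to a weighted mean of the candidate observations, since the minimizer of the weighted sum of squared residuals to a constant is the weighted average. A minimal sketch with hypothetical candidate values and weights:

```python
import numpy as np

def ipsf_estimate(candidates, weights):
    """Weighted least-squares fit of a constant s to several approximate
    observations c_k of one blocked pixel: minimizing
    sum_k w_k * (s - c_k)^2 gives the weighted mean of the candidates."""
    candidates = np.asarray(candidates, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return np.sum(weights * candidates) / np.sum(weights)

# hypothetical candidates from the two adjacent (complementary-grid)
# projections, weighted by how well each neighbourhood predicts the
# unblocked pixels
estimate = ipsf_estimate([102.0, 98.0, 105.0], [0.5, 0.3, 0.2])
print(estimate)
```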

  7. Optimal Fusion of Sensors

    DEFF Research Database (Denmark)

    Larsen, Thomas Dall

    This thesis deals with the problem of fusing and managing data concerning the state or identity of a given object. Focus is placed on the challenges arising in the field of mobile robot navigation. The main problem here will often be to keep track of the position and orientation of the robot...

  8. Comparison of buried soil sensors, surface chambers and above ground measurements of carbon dioxide fluxes

    Science.gov (United States)

    Soil carbon dioxide (CO2) flux is an important component of the terrestrial carbon cycle. Accurate measurements of soil CO2 flux aid determinations of carbon budgets. In this study, we investigated soil CO2 fluxes with time and depth, and above-ground CO2 fluxes, in a bare field. CO2 concentrations w...

  9. Anomaly Detection for Data Reduction in an Unattended Ground Sensor (UGS) Field

    Science.gov (United States)

    2014-09-01

    information (shown with solid lines in the diagram). Typically, this would be a mobile ad-hoc network (MANET). The clusters are connected to other nodes ... the detection algorithms must be able to coordinate observations over the local MANET that interconnects UGSs within the cluster. Extrapolating from that ... Acronyms: interquartile ranges; MANET, mobile ad-hoc network; OSUS, Open Standards for Unattended Sensors; TOC, tactical operations center; UAVs, unmanned aerial vehicles.

  10. Sensors

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, H. [PBI-Dansensor A/S (Denmark); Toft Soerensen, O. [Risoe National Lab., Materials Research Dept. (Denmark)

    1999-10-01

    A new type of ceramic oxygen sensor based on semiconducting oxides was developed in this project. The advantage of these sensors compared to standard ZrO2 sensors is that they do not require a reference gas and that they can be produced in small sizes. The sensor design and the techniques developed for the production of these sensors are judged suitable by the participating industry for niche production of a new generation of oxygen sensors. Materials research on new oxygen-ion-conducting materials, both for applications in oxygen sensors and in fuel cells, was also performed in this project, and finally a new process was developed for the fabrication of ceramic tubes by dip-coating. (EHS)

  11. Sensors

    CERN Document Server

    Pigorsch, Enrico

    1997-01-01

    This is the 5th edition of the Metra Martech Directory "EUROPEAN CENTRES OF EXPERTISE - SENSORS." The entries represent a survey of European sensors development. The new edition contains 425 detailed profiles of companies and research institutions in 22 countries. This is reflected in the diversity of sensors development programmes described, from sensors for physical parameters to biosensors and intelligent sensor systems. We do not claim that all European organisations developing sensors are included, but this is a good cross section from an invited list of participants. If you see gaps or omissions, or would like your organisation to be included, please send details. The data base invites the formation of effective joint ventures by identifying and providing access to specific areas in which organisations offer collaboration. This issue is recognised to be of great importance and most entrants include details of collaboration offered and sought. We hope the directory on Sensors will help you to find the ri...

  12. Multistatic Surveillance and Reconnaissance: Sensor, Signals and Data Fusion (Surveillance et Reconnaissance Multistatiques : Fusion des capteurs, des signaux et des donnees)

    Science.gov (United States)

    2009-04-01

    capteurs, des signaux et des données) Research and Technology Organisation (NATO), BP 25, F-92201 Neuilly-sur-Seine Cedex, France, RTO-EN-SET-133...this new paradigm, each radar/sonar can receive and process its own signal and/or the signal of other local sources. The application of bi-/multistatic...Multistatiques : Fusion des capteurs, des signaux et des données) The material in this publication was assembled to support a Lecture Series under the

  13. Approach to multisensor/multilook information fusion

    Science.gov (United States)

    Myler, Harley R.; Patton, Ronald

    1997-07-01

    We are developing a multi-sensor, multi-look Artificial Intelligence Enhanced Information Processor (AIEIP) that synergistically combines classification elements of geometric hashing, neural networks, and evolutionary algorithms. The fusion is coordinated using a piecewise-level fusion algorithm that operates on probability data derived from the statistics of the individual classifiers. Further, the AIEIP incorporates a knowledge-based system to aid a user in evaluating target data dynamically. The AIEIP is intended as a semi-autonomous system that not only fuses information from electronic data sources, but also has the capability to include human input derived from battlefield awareness and intelligence sources. The system would be useful either in advanced reconnaissance information fusion tasks, where multiple fixed sensors and human observer inputs must be combined, or in a dynamic fusion scenario incorporating an unmanned vehicle swarm with dynamic, multiple sensor data inputs. This paper presents our initial results from experiments and data analysis using the individual components of the AIEIP on FLIR target sets of ground vehicles.
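    As a rough illustration of fusing probability data from several classifiers, the sketch below combines class-posterior vectors with a reliability-weighted log-linear pool. The abstract does not specify the AIEIP's piecewise fusion rule, so the pooling scheme, the reliability weights, and all numbers here are assumptions.

    ```python
    import numpy as np

    def fuse_posteriors(posteriors, reliabilities):
        """Combine class-probability vectors from several classifiers,
        weighting each classifier by an estimated reliability.
        A generic weighted log-linear pool, illustrative only."""
        posteriors = np.asarray(posteriors, dtype=float)
        w = np.asarray(reliabilities, dtype=float)
        w = w / w.sum()
        # Weighted sum of log-probabilities, then renormalize.
        log_pool = (w[:, None] * np.log(posteriors + 1e-12)).sum(axis=0)
        fused = np.exp(log_pool)
        return fused / fused.sum()

    # Three hypothetical classifiers (e.g. hashing, neural net,
    # evolutionary) scoring the same target over three classes.
    p = [[0.7, 0.2, 0.1],
         [0.6, 0.3, 0.1],
         [0.2, 0.5, 0.3]]
    fused = fuse_posteriors(p, reliabilities=[0.9, 0.8, 0.4])
    print(int(fused.argmax()))  # most supported target class
    ```

    A log-linear pool rewards agreement between classifiers; a simple weighted average (linear pool) would be an equally plausible reading of "piecewise level fusion".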

  14. A Knowledge-Based Step Length Estimation Method Based on Fuzzy Logic and Multi-Sensor Fusion Algorithms for a Pedestrian Dead Reckoning System

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lai

    2016-05-01

    Full Text Available The demand for pedestrian navigation has increased along with the rapid progress in mobile and wearable devices. This study develops an accurate and usable Step Length Estimation (SLE) method for a Pedestrian Dead Reckoning (PDR) system with features including a wide range of step lengths, a self-contained system, and real-time computing, based on multi-sensor fusion and Fuzzy Logic (FL) algorithms. The wide-range SLE developed in this study was achieved by using a knowledge-based method to model the walking patterns of the user. The input variables of the FL are step strength and step frequency, and the output is the estimated step length. Moreover, a waist-mounted sensor module has been developed using low-cost inertial sensors. Since low-cost sensors suffer from various errors, a calibration procedure has been utilized to improve accuracy. The proposed PDR scheme demonstrates its ability to be implemented on waist-mounted devices in real time and is suitable for the indoor and outdoor environments considered in this study without the need for map information or any pre-installed infrastructure. The experiment results show that the maximum distance error was within 1.2% of 116.51 m in an indoor environment and within 1.78% of 385.2 m in an outdoor environment.
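    The step-strength/step-frequency fuzzy mapping described above can be sketched as a tiny Mamdani-style inference with singleton outputs. The membership breakpoints, rule base, and output step lengths below are illustrative guesses, not the calibrated values from the paper.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def estimate_step_length(step_freq_hz, step_strength_g):
        """Fuzzy estimate of step length in metres (illustrative parameters)."""
        # Fuzzify inputs: slow/normal/fast cadence, weak/strong impact.
        slow   = tri(step_freq_hz, 0.5, 1.0, 1.6)
        normal = tri(step_freq_hz, 1.2, 1.8, 2.4)
        fast   = tri(step_freq_hz, 2.0, 2.6, 3.2)
        weak   = tri(step_strength_g, 0.0, 0.3, 0.8)
        strong = tri(step_strength_g, 0.5, 1.0, 1.5)

        # Rule base: (firing strength, crisp output step length in metres).
        rules = [
            (min(slow, weak),     0.45),
            (min(slow, strong),   0.55),
            (min(normal, weak),   0.60),
            (min(normal, strong), 0.70),
            (min(fast, weak),     0.75),
            (min(fast, strong),   0.85),
        ]
        w = sum(r[0] for r in rules)
        # Weighted-average defuzzification (centroid of singletons).
        return sum(r[0] * r[1] for r in rules) / w if w > 0 else 0.65

    print(estimate_step_length(1.8, 1.0))  # normal cadence, strong impact
    ```

    Longer strides generally go with faster cadence and harder heel strikes, which is why both inputs increase the output in this toy rule base.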

  15. Using acoustic sensor technologies to create a more terrain capable unmanned ground vehicle

    OpenAIRE

    Odedra, Sid; Prior, Stephen D.; Karamanoglu, Mehmet; Erbil, Mehmet Ali; Shen, Siu-Tsen; International Conference on Engineering Psychology and Cognitive Ergonomics

    2009-01-01

    Unmanned Ground Vehicles (UGVs) have to cope with the most complex range of dynamic and variable obstacles and therefore need to be highly intelligent in order to navigate such cluttered environments. When traversing different terrains (whether in a UGV or a commercial manned vehicle), different drive styles and configuration settings need to be selected in order to travel successfully over each terrain type. These settings are usually selected by a human operator in ma...

  16. Satellite and ground-based sensors for the Urban Heat Island analysis in the city of Rome

    DEFF Research Database (Denmark)

    Fabrizi, Roberto; Bonafoni, Stefania; Biondi, Riccardo

    2010-01-01

    In this work, the trend of the Urban Heat Island (UHI) of Rome is analyzed by both ground-based weather stations and a satellite-based infrared sensor. First, we have developed a suitable algorithm employing satellite brightness temperatures for the estimation of the air temperature belonging to the layer of air closest to the surface. UHI spatial characteristics have been assessed using air temperatures measured by both weather stations and brightness temperature maps from the Advanced Along Track Scanning Radiometer (AATSR) on board the ENVISAT polar-orbiting satellite. In total, 634 daytime and nighttime scenes taken between 2003 and 2006 have been processed. Analysis of the Canopy Layer Heat Island (CLHI) during summer months reveals a mean growth in magnitude of 3-4 K during nighttime and a negative or almost zero CLHI intensity during daytime, confirmed by the weather stations. © 2010...

  17. Contact-Less High Speed Measurement over Ground with 61 GHz Radar Sensor

    OpenAIRE

    Imran, Muneeb

    2016-01-01

    The conventional FMCW radar principle was implemented on the Symeo 61 GHz LPR®-1DHP-R radar sensor system. The FMCW implementation had a few limitations that needed to be removed. First, target separation in a multi-target environment was not possible for objects at the same distance. For example, consider two targets, one moving and one static. When the moving target approaches the static target and draws level with it, the two are at the same distance. At this point...
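    The same-distance ambiguity described above follows directly from FMCW ranging: range is recovered from the beat frequency, so two targets at equal range produce the same beat frequency and cannot be separated on range alone. A minimal sketch of the textbook relation R = c·f_b·T / (2·B) follows; the sweep parameters are invented and this is not the Symeo LPR®-1DHP-R implementation.

    ```python
    def fmcw_range(beat_freq_hz, bandwidth_hz, sweep_time_s, c=3.0e8):
        """Range from beat frequency for a linear FMCW chirp:
        R = c * f_b * T / (2 * B). Generic textbook relation."""
        return c * beat_freq_hz * sweep_time_s / (2.0 * bandwidth_hz)

    # Hypothetical chirp: 150 MHz sweep in 1 ms.
    # A moving and a static target at the same range yield the
    # same beat frequency, hence the same range estimate.
    r = fmcw_range(beat_freq_hz=10e3, bandwidth_hz=150e6, sweep_time_s=1e-3)
    print(r)  # range in metres
    ```

    Resolving such targets requires an additional dimension, typically Doppler processing across sweeps, which separates them by velocity rather than range.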

  18. Hierarchy-based intra-layer data fusion in wireless sensor networks

    Institute of Scientific and Technical Information of China (English)

    李同锋; 杜秀娟; 牛昆

    2014-01-01

    Data fusion in wireless sensor networks can effectively reduce the data traffic among sensing nodes, lower the energy consumption of nodes, and prolong the life of the network. A node hierarchy algorithm is proposed in this paper, with a specific data fusion algorithm added to the sensing nodes within each layer. The Pauta criterion is used to detect abnormal data among the readings received by a node, and principal component analysis (PCA) is then applied at the upper-layer nodes to fuse the remaining data. Simulation experiments indicate that the algorithm achieves high data fusion accuracy.
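    The two steps named in the abstract, Pauta-criterion outlier rejection followed by PCA fusion, can be sketched as follows. The Pauta criterion is the 3-sigma rule; the node layout and readings are synthetic, and the paper's exact fusion pipeline may differ.

    ```python
    import numpy as np

    def pauta_filter(samples):
        """Discard readings outside mean ± 3*std (Pauta / 3-sigma criterion)."""
        mu, sigma = samples.mean(), samples.std()
        return samples[np.abs(samples - mu) <= 3.0 * sigma]

    def pca_fuse(readings):
        """Project each node's cleaned readings onto the first principal
        component. `readings` is (n_samples, n_nodes); returns the fused
        scalar sequence and each node's weight in it."""
        centered = readings - readings.mean(axis=0)
        cov = np.cov(centered, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
        w = eigvecs[:, -1]                       # leading principal direction
        return centered @ w, w

    rng = np.random.default_rng(0)

    # Outlier rejection at a single node: one wild reading among 100.
    data = np.append(rng.normal(0.0, 1.0, 100), 50.0)
    cleaned = pauta_filter(data)

    # Fusion at an upper-layer node: three nodes observing one signal.
    truth = rng.normal(25.0, 1.0, 200)                   # shared signal
    obs = truth[:, None] + rng.normal(0, 0.1, (200, 3))  # noisy copies
    fused, weights = pca_fuse(obs)
    ```

    Because the three nodes share one dominant signal, the leading principal component captures it, so the fused stream summarizes all three nodes in a single sequence.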

  19. An immune-agent data fusion algorithm for wireless sensor networks

    Institute of Scientific and Technical Information of China (English)

    孙子文; 刘加杰; 梁广玮

    2013-01-01

    To address network time delay and energy consumption, an immune agent-based data fusion algorithm is proposed. The free migration of agents reduces the transmission energy consumption of nodes, and immune selection of sensor data reduces the number of nodes participating in fusion, further lowering network energy consumption. An emergency channel is established to reduce network delay in emergency situations, and the fused sensor data are compressed by hexadecimal encoding. Simulation results show that the proposed algorithm effectively reduces both network energy consumption and delay.

  20. A New Proxy Measurement Algorithm with Application to the Estimation of Vertical Ground Reaction Forces Using Wearable Sensors

    Directory of Open Access Journals (Sweden)

    Yuzhu Guo

    2017-09-01

    Full Text Available Measurement of the ground reaction forces (GRF) during walking is typically limited to laboratory settings, and only short observations using wearable pressure insoles have been reported so far. In this study, a new proxy measurement method is proposed to estimate the vertical component of the GRF (vGRF) from wearable accelerometer signals. The accelerations are used as the proxy variable. An orthogonal forward regression (OFR) algorithm is employed to identify the dynamic relationships between the proxy variables and the vGRF measured using pressure-sensing insoles. The obtained model, which represents the connection between the proxy variable and the vGRF, is then used to predict the latter. The results have been validated using pressure-insole data collected from nine healthy individuals during two outdoor walking tasks in non-laboratory settings. The results show that the vGRF can be reconstructed with high accuracy (an average prediction error of less than 5.0%) using only one wearable sensor mounted at the waist (L5, fifth lumbar vertebra). Proxy measures with different sensor positions are also discussed: the waist acceleration-based proxy measurement is more stable, with less inter-task and inter-subject variability, than proxy measures based on forehead-level accelerations. The proposed proxy measure provides a promising low-cost method for monitoring ground reaction forces in real-life settings and introduces a novel generic approach for replacing the direct determination of difficult-to-measure variables in many applications.
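    The core idea, selecting a small dynamic model that maps proxy (acceleration) terms to the target (vGRF), can be illustrated with a greedy forward selection over lagged-acceleration regressors. This is a simplified stand-in for the paper's orthogonal forward regression; the lag structure, coefficients, and data are all invented for the sketch.

    ```python
    import numpy as np

    def forward_select(candidates, y, n_terms):
        """Greedy forward regression: at each step, add the candidate
        regressor that most reduces the residual sum of squares.
        Simplified stand-in for orthogonal forward regression (OFR)."""
        chosen = []
        for _ in range(n_terms):
            best, best_err = None, np.inf
            for j in range(candidates.shape[1]):
                if j in chosen:
                    continue
                X = candidates[:, chosen + [j]]
                theta, *_ = np.linalg.lstsq(X, y, rcond=None)
                err = np.sum((y - X @ theta) ** 2)
                if err < best_err:
                    best, best_err = j, err
            chosen.append(best)
        X = candidates[:, chosen]
        theta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return chosen, theta

    # Toy proxy problem: a vGRF-like target built from two acceleration lags.
    rng = np.random.default_rng(1)
    acc = rng.normal(size=300)                                # proxy signal
    lags = np.column_stack([np.roll(acc, k) for k in range(5)])  # candidates
    y = 2.0 * lags[:, 1] - 0.5 * lags[:, 3] + 0.01 * rng.normal(size=300)
    terms, theta = forward_select(lags, y, n_terms=2)
    ```

    Once the few dominant terms are identified, the fitted model predicts the target from accelerations alone, which is what allows the insoles to be dispensed with outside the laboratory.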