A perceptual reasoning system that adaptively extracts, associates, and fuses information from multiple sources, at various levels of abstraction, is considered the building block for the next generation of surveillance systems. A system architecture is presented which makes use of both centralized and distributed predetection fusion, combined with intelligent monitoring and control, coupling both on-platform and off-board track- and decision-level fusion results. The goal of this system is to create a "gestalt fused sensor system" whose information product is greater than the sum of the information products from the individual sensors and whose performance is superior to that of either an individual sensor or a sub-group of combined sensors. The application of this architectural concept to the law enforcement arena (e.g., drug interdiction), utilizing multiple spatially and temporally diverse surveillance platforms and/or information sources, is used to illustrate the benefits of the adaptive perceptual reasoning system concept.
Fitzgerald, D.S.; Adams, D.G.
Past attempts at sensor fusion have used some form of Boolean logic to combine the sensor information. As an alternative, an adaptive "fuzzy" sensor fusion technique is described in this paper. This technique exploits the robust capabilities of fuzzy logic in the decision process as well as the optimization features of the genetic algorithm. This paper presents a brief background on fuzzy logic and genetic algorithms and how they are used in an online implementation of adaptive sensor fusion.
This paper examines the evolution of the data fusion process in emerging Wireless Sensor Networks (WSNs) and the impact of various factors on energy consumption. There has always been a constant effort to enhance network efficiency without decreasing the quality of information. Based on the Adaptive Fusion Steiner Tree (AFST), this paper proposes a heuristic algorithm called Modified Adaptive Fusion Steiner Tree (M-AFST) for energy-efficient routing, which not only adaptively adjus...
Zhaohua Yu; Qiang Ling; Yi Yu
In wireless sensor networks, the fusion center collects the data from the sensor nodes and makes the optimal decision fusion, while the optimal decision fusion rules need the performance parameters of each sensor node. However, sensors, particularly low-cost and low-precision ones, are usually deployed in harsh environments; their performance parameters can be easily affected by the environment and can hardly be known in advance. In order to resolve this issue, we take a heterogeneous wire...
Becker, J.C. [Technical Univ. Braunschweig (Germany). Inst. of Control Engineering]
This paper describes an adaptive information filter for the fusion of sensor data of an autonomous vehicle. The vehicle sensor system for object detection consists of a stereo vision sensor, four laser scanners and a radar sensor, and provides high redundancy in the observed area in front of the vehicle. The derivation of the information filter as well as its application to sensor data fusion is presented. Maneuvers of observed targets are detected and the filter parameters are adapted accordingly. The information filter fusion is compared to Kalman filter based measurement fusion.
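The measurement-fusion step that such an information filter builds on can be sketched in information (inverse-covariance) form, where each sensor's contribution is simply additive. This is a generic illustration, not the paper's actual filter:

```python
import numpy as np

def information_fusion(z_list, H_list, R_list):
    """Fuse measurements from several sensors in information form:
    each sensor contributes H^T R^-1 H to the information matrix
    and H^T R^-1 z to the information vector."""
    n = H_list[0].shape[1]
    Y = np.zeros((n, n))   # fused information matrix
    y = np.zeros(n)        # fused information vector
    for z, H, R in zip(z_list, H_list, R_list):
        Ri = np.linalg.inv(R)
        Y += H.T @ Ri @ H
        y += H.T @ Ri @ z
    P = np.linalg.inv(Y)   # fused covariance
    x = P @ y              # fused state estimate
    return x, P
```

For two direct, equally noisy observations of a scalar state, this reduces to averaging the measurements and halving the variance, which is why the information form is convenient for redundant sensor suites.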
Inertial sensors are widely used in human body motion monitoring systems since they permit us to determine the position of the subject's limbs. Limb angle measurement is carried out through the integration of the angular velocity measured by a rate sensor and the decomposition of the components of static gravity acceleration measured by an accelerometer. Different factors derived from the sensors' nature, such as angle random walk and dynamic bias, lead to erroneous measurements. Dynamic bias effects can be reduced through the use of adaptive filtering based on sensor fusion concepts. Most existing published works use a Kalman filtering sensor fusion approach. Our aim is to perform a comparative study among different adaptive filters. Several least mean squares (LMS), recursive least squares (RLS) and Kalman filtering variations are tested for the purpose of finding the best method leading to a more accurate and robust limb angle measurement. A new angle wander compensation sensor fusion approach based on LMS and RLS filters has been developed.
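A minimal illustration of the drift-compensation idea behind this kind of gyro/accelerometer fusion is the complementary filter, which blends the integrated rate-sensor angle (accurate short-term, drifts long-term) with the accelerometer tilt angle (noisy but drift-free). This sketch is generic, not one of the paper's LMS/RLS/Kalman variants, and the gain `alpha` is an assumed tuning parameter:

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse integrated gyro rate with accelerometer tilt angle.
    alpha close to 1 trusts the gyro at short time scales while the
    accelerometer slowly pulls the estimate back, bounding bias drift."""
    angle = accel_angles[0]          # initialize from the drift-free sensor
    out = [angle]
    for rate, acc in zip(gyro_rates[1:], accel_angles[1:]):
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * acc
        out.append(angle)
    return out
```

With a constant gyro bias and a zero accelerometer reference, pure integration would drift without bound, whereas this filter converges to a bounded steady-state error of alpha*bias*dt/(1-alpha).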
This paper proposes an adaptive discrete finite-time synergetic control (ADFTSC) scheme based on a multi-rate sensor fusion estimator for flexible-joint mechanical systems in the presence of unmeasured states and dynamic uncertainties. Multi-rate sensors are employed to observe the system states which cannot be directly obtained by encoders due to the existence of joint flexibilities. By using an extended Kalman filter (EKF), the finite-time synergetic controller is designed based on a sensor fusion estimator which estimates states and parameters of the mechanical system with multi-rate measurements. The proposed controller can guarantee the finite-time convergence of tracking errors through theoretical derivation. Simulation and experimental studies are included to validate the effectiveness of the proposed approach.
Plascencia, Alfredo; Stepán, Petr
The main contribution of this paper is to present a sensor fusion approach to scene environment mapping as part of a Sensor Data Fusion (SDF) architecture. This approach combines sonar array readings with stereo vision readings. Sonar readings are interpreted using probability density functions ... to the occupied and empty regions. Scale Invariant Feature Transform (SIFT) feature descriptors are interpreted using Gaussian probabilistic error models. The use of occupancy grids is proposed for representing the sensor readings. The Bayesian estimation approach is applied to update the sonar array ... and the SIFT descriptors' uncertainty grids. The sensor fusion yields a significant reduction in the uncertainty of the occupancy grid compared to the individual sensor readings.
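The Bayesian occupancy-grid update used in approaches like this is commonly implemented in log-odds form, where evidence from independent readings simply adds. A minimal single-cell sketch, with the inverse sensor model probability assumed rather than taken from the paper:

```python
import math

def logodds(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1.0 - p))

def bayes_update(prior_p, inverse_model_p):
    """Bayesian update of one occupancy-grid cell in log-odds form:
    independent sensor readings add their log-odds evidence."""
    l = logodds(prior_p) + logodds(inverse_model_p)
    return 1.0 / (1.0 + math.exp(-l))     # back to probability
```

Starting from an uninformative prior of 0.5, a reading with inverse-model probability 0.8 yields 0.8; a second agreeing reading pushes the cell's occupancy belief above 0.9, which is the uncertainty reduction the abstract describes.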
The purpose of a tracking algorithm is to associate data measured by one or more (moving) sensors to moving objects in the environment. The state of these objects that can be estimated with the tracking process depends on the type of data that is provided by these sensors. It is discussed how the tr...
Duffy, Brian R.; Garcia, C.; Rooney, Colm (Thesis); O'Hare, G.M.P.
This paper advocates the application of sensor fusion for the visualisation of social robotic behaviour. Experiments with the Virtual Reality Workbench integrate the key elements of Virtual Reality and robotics in a coherent and systematic manner. The deliberative focusing of attention and sensor fusion between vision systems and sonar sensors is implemented on autonomous mobile robots functioning in standard office environments.
Scheffel, Peter; Fish, Robert; Knobler, Ron; Plummer, Thomas
McQ has developed a broad based capability to fuse information in a geographic area from multiple sensors to build a better understanding of the situation. The paper will discuss the fusion architecture implemented by McQ to use many sensors and share their information. This multi-sensor fusion architecture includes data sharing and analysis at the individual sensor, at communications nodes that connect many sensors together, at the system server/user interface, and across multi-source information available through networked services. McQ will present a data fusion architecture that integrates a "Feature Information Base" (FIB) with McQ's well-known Common Data Interchange Format (CDIF) data structure. The distributed multi-sensor fusion provides enhanced situation awareness for the user.
The authors review techniques for sensor fusion in robot navigation, emphasizing algorithms for self-location. These find use when the sensor suite of a mobile robot comprises several different sensors, some complementary and some redundant. Integrating the sensor readings, the robot seeks to accomplish tasks such as constructing a map of its environment, locating itself in that map, and recognizing objects that should be avoided or sought. The review describes integration techniques in two categories: low-level fusion is used for direct integration of sensory data, resulting in parameter and state estimates; high-level fusion is used for indirect integration of sensory data in hierarchical architectures, through command arbitration and integration of control signals suggested by different modules. The review provides an arsenal of tools for addressing this (rather ill-posed) problem in machine intelligence, including Kalman filtering, rule-based techniques, behavior-based algorithms and approaches that borrow from information theory, Dempster-Shafer reasoning, fuzzy logic and neural networks. It points to several further research needs, including: robustness of decision rules; simultaneous consideration of self-location, motion planning, motion control and vehicle dynamics; the effect of sensor placement and attention focusing on sensor fusion; and adaptation of techniques from biological sensor fusion.
In this paper, we take the model of a laser range finder based on a synchronized scanner as an example and show how to use data fusion methods in the process of sensor model design to obtain more robust output. We also present our view on the relation between the sensor model, data fusion and system structure, and provide a solution that transforms the parameter space to obtain a linear model for the Kalman filter.
Gustafsson, Fredrik; Schön, Thomas; Hol, Jeroen
The problem of estimating the position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and a camera. The sensor fusion approach described in this contribution is based on nonlinear filtering using the measurements from these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing comput...
Choi, Woo-Kyung; Kim, Seong-Joo; Jeon, Hong-Tae
The goal of this paper, concerning a snake robot and sensor fusion, is that the snake robot, which imitates a real snake's movement, adapts to the topography and carries multiple sensors, should operate well with consideration of the environment around it. To avoid overloading the processor and to process the huge volume of data from the multiple sensors in a distributed manner, the sensor fusion is organized into modules. At the low level of sensor processing, we performed sensor fusion.
Larsen, Thomas Dall
... within some global frame of reference using a wide variety of sensors providing odometric, inertial and absolute data concerning the robot and its surroundings. Kalman filters have for a long time been widely used to solve this problem. However, when measurements are delayed or the mobile robot...
Chang, K. C.
A distributed data fusion system consists of a network of sensors, each capable of local processing and fusion of sensor data. There has been a great deal of work in developing distributed fusion algorithms applicable to a network-centric architecture. Currently there are at least a few approaches, including naive fusion, cross-correlation fusion, information graph fusion, maximum a posteriori (MAP) fusion, channel filter fusion, and covariance intersection fusion. However, in general, in a distributed system such as an ad hoc sensor network, the communication architecture is not fixed. Each node has knowledge of only its local connectivity but not the global network topology. In those cases, a distributed fusion algorithm based on the information graph type of approach may not scale due to its requirement to carry long pedigree information for decorrelation. In this paper, we focus on scalable fusion algorithms and conduct analytical performance evaluation to compare their performance. The goal is to understand the performance of those algorithms under different operating conditions. Specifically, we evaluate the performance of channel filter fusion, Chernoff fusion, Shannon fusion, and Bhattacharyya fusion algorithms. We also compare their results to naive fusion and "optimal" centralized fusion algorithms under a specific communication pattern.
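Of the listed algorithms, covariance intersection has a particularly compact form: it fuses two estimates whose cross-correlation is unknown by taking a convex combination of their information matrices, which stays consistent for any weight in [0, 1]. A minimal sketch (in practice the weight `w` is chosen by an optimization criterion, e.g. minimizing the fused covariance's determinant, not shown here):

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, w):
    """Covariance intersection of two estimates (x1, P1) and (x2, P2)
    with unknown cross-correlation, using convex weight w in [0, 1]."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(w * I1 + (1.0 - w) * I2)   # fused covariance
    x = P @ (w * I1 @ x1 + (1.0 - w) * I2 @ x2)  # fused estimate
    return x, P
```

Note that, unlike naive independent fusion, the fused covariance does not shrink below the inputs for equal weights; that conservatism is exactly what avoids double-counting shared information in loopy network topologies.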
Chappell, Steve; Preston, Dan; Olmstead, Dave; Flint, Bob; Sullivan, Chris
We present a system for the boresighting of sensors using inertial measurement devices as the basis for developing a range of dynamic real-time sensor fusion applications. The proof of concept utilizes a COTS FPGA platform for sensor fusion and real-time correction of a misaligned video sensor. We exploit a custom-designed 32-bit soft processor core and C-based design & synthesis for rapid, platform-neutral development. Kalman filter and sensor fusion techniques established in advanced aviation systems are applied to automotive vehicles with results exceeding typical industry requirements for sensor alignment. Results of the static and the dynamic tests demonstrate that inexpensive accelerometers mounted on (or during assembly of) a sensor, together with an Inertial Measurement Unit (IMU) fixed to a vehicle, can be used to compute the misalignment of the sensor to the IMU and thus to the vehicle. In some cases the model predictions and test results exceeded the requirements by an order of magnitude with a 3-sigma or ...
The purpose of an intelligent alarm analysis system is to provide complete and manageable information to a central alarm station operator by applying alarm processing and fusion techniques to sensor information. This paper discusses the sensor fusion approach taken to perform intelligent alarm analysis for the Advanced Exterior Sensor (AES). The AES is an intrusion detection and assessment system designed for wide-area coverage, quick deployment, low false/nuisance alarm operation, and immediate visual assessment. It combines three sensor technologies (visible, infrared, and millimeter wave radar) collocated on a compact and portable remote sensor module. The remote sensor module rotates at a rate of 1 revolution per second to detect and track motion and provide assessment in a continuous 360 degree field-of-regard. Sensor fusion techniques are used to correlate and integrate the track data from these three sensors into a single track for operator observation. Additional inputs to the fusion process include environmental data, knowledge of sensor performance under certain weather conditions, sensor priority, and recent operator feedback. A confidence value is assigned to the track as a result of the fusion process. This helps to reduce nuisance alarms and to increase operator confidence in the system while reducing the workload of the operator
Multi-sensor data fusion is a broad area of constant research which is applied to a wide variety of fields, such as the field of mobile robots. Mobile robots are complex systems where the design and implementation of sensor fusion is a complex task, but research applications are explored constantly... The scope of the thesis is limited to building a map for a laboratory robot by fusing range readings from a sonar array with landmarks extracted from stereo vision images using the Scale Invariant Feature Transform (SIFT) algorithm.
In this article we explain the architecture for the environment and sensors that has been built for the European project URUS (Ubiquitous Networking Robotics in Urban Sites), a project whose objective is to develop an adaptable network robot architecture for cooperation between network robots and human beings and/or the environment in urban areas. The project goal is to deploy a team of robots in an urban area to give a set of services to a user community. This paper addresses the sensor architecture devised for URUS and the types of robots and sensors used, including environment sensors and sensors onboard the robots. Furthermore, we also explain how sensor fusion takes place to achieve urban outdoor execution of robotic services. Finally, some results of the project related to the sensor network are highlighted.
Yunmin ZHU; Xiaorong LI
When all the rules of sensor decision are known, the optimal distributed decision fusion, which relies only on the joint conditional probability densities, can be derived for very general decision systems. These include systems with interdependent sensor observations and any network structure. The result is also valid for m-ary Bayesian decision problems and binary problems under the Neyman-Pearson criterion. Local decision rules of a sensor with communication from other sensors that are optimal for the sensor itself are also presented, which take the form of a generalized likelihood ratio test. Numerical examples are given to reveal some interesting phenomena: communication between sensors can improve the performance of a sensor decision, but cannot guarantee to improve the global fusion performance when the sensor rules are given before fusing.
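For the special case of conditionally independent binary sensors, the optimal fusion rule reduces to the well-known Chair-Varshney weighted vote, where each local decision is weighted by that sensor's detection and false-alarm probabilities. This sketch illustrates only that special case, not the paper's more general interdependent-observation result:

```python
import math

def chair_varshney(decisions, pd, pf, prior_ratio=1.0):
    """Optimal fusion of independent binary local decisions:
    weight each vote by its sensor's detection probability pd[i] and
    false-alarm probability pf[i]; decide H1 when the summed
    log-likelihood ratio exceeds the prior threshold."""
    llr = 0.0
    for u, p_d, p_f in zip(decisions, pd, pf):
        if u == 1:
            llr += math.log(p_d / p_f)              # sensor said "target"
        else:
            llr += math.log((1.0 - p_d) / (1.0 - p_f))  # sensor said "no target"
    return 1 if llr > math.log(prior_ratio) else 0
```

A highly reliable sensor (large pd/pf ratio) thus contributes a large weight and can outvote several unreliable ones, which a simple majority vote cannot capture.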
Flaherty, Michael; Pritchett, Daniel; Cothren, Brian; Schwaiger, James
Torch Technologies Inc., is actively involved in chemical sensor networking and data fusion via multi-year efforts with Dugway Proving Ground (DPG) and the Defense Threat Reduction Agency (DTRA). The objective of these efforts is to develop innovative concepts and advanced algorithms that enhance our national Chemical Warfare (CW) test and warning capabilities via the fusion of traditional and non-traditional CW sensor data. Under Phase I, II, and III Small Business Innovative Research (SBIR) contracts with DPG, Torch developed the Advanced Chemical Release Evaluation System (ACRES) software to support non real-time CW sensor data fusion. Under Phase I and II SBIRs with DTRA in conjunction with the Edgewood Chemical Biological Center (ECBC), Torch is using the DPG ACRES CW sensor data fuser as a framework from which to develop the Cloud state Estimation in a Networked Sensor Environment (CENSE) data fusion system. Torch is currently developing CENSE to implement and test innovative real-time sensor network based data fusion concepts using CW and non-CW ancillary sensor data to improve CW warning and detection in tactical scenarios.
Devi, Gadi Gayathri; Kumari, Priya; Jyoshna, Eslavath; Deepika; Murthy, Garimella Rama
In this research paper, the problems dealing with sensor network architecture, sensor fusion are addressed. Time/Computationally optimal network architectures are investigated. Some novel ideas on sensor fusion are proposed.
In this paper, we propose a new scheme for sensor data fusion in machine vision. The proposed scheme uses Kalman filter as the sensor data integration tool and hierarchical B-spline surface as the recording data structure. Kalman filter is used to obtain statistically optimal estimations of the imaged surface structure based on external sensor measurements. Hierarchical B-spline surface maintains high-order surface derivative continuity, may be adaptively refined, possesses desirable local control property, and is storage efficient. Hence, it is used to record the reconstructed surface structure.
Maneuvering target tracking is a fundamental task in intelligent vehicle research. This paper focuses on the problem of fusion between radar and image sensors in target tracking. In order to improve positioning accuracy and narrow down the image working area, a novel method that integrates a radar filter with image intensity is proposed to establish an adaptive vision window. A weighted Hausdorff distance is introduced to define the functional relationship between the image and the model projection, and a modified simulated annealing algorithm is used to find the optimum orientation parameter. Furthermore, the global state is estimated, which refers to the distributed data fusion algorithm. Experimental results show that our method is accurate.
Interest in fusing multiple sensor data for both military and civil applications has been growing. Some of the important applications integrate image information from multiple sensors to aid in navigation guidance, object detection and recognition, medical diagnosis, data compression, etc. While human beings may visually inspect various images and integrate information, it is of interest to develop algorithms that can fuse various input imagery to produce a composite image. Fusion of images from various sensor modalities is expected to produce an output that captures all the relevant information in the input. The standard multi-resolution-based edge fusion scheme has been reviewed in this paper. A theoretical framework is given for this edge fusion method by showing how edge fusion can be framed as information maximisation. However, the presence of noise complicates the situation. The framework developed is used to show that for noisy images, all edges no longer correspond to information. In this paper, various techniques have been presented for fusion of noisy multi-sensor images. These techniques are developed for a single resolution as well as using multi-resolution decomposition. Some of the techniques are based on modifying edge maps by filtering images, while others depend on alternate definitions of information maps. Both these approaches can also be combined. Experiments show that the proposed algorithms work well for various kinds of noisy multi-sensor images.
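A minimal single-resolution illustration of edge-driven image fusion is to keep, at each pixel, the source whose local gradient magnitude (edge strength) is larger, treating edges as the information to preserve. This toy sketch is not the paper's multi-resolution or noise-aware scheme:

```python
import numpy as np

def edge_based_fusion(img_a, img_b):
    """Fuse two registered images: per pixel, select the source with
    the stronger local gradient magnitude (i.e., the stronger edge)."""
    def grad_mag(img):
        gy, gx = np.gradient(img.astype(float))
        return np.hypot(gx, gy)          # per-pixel edge strength
    mask = grad_mag(img_a) >= grad_mag(img_b)
    return np.where(mask, img_a, img_b)  # pick the stronger-edge source
```

As the abstract notes, on noisy inputs this naive rule also preserves noise edges, which is exactly why the reviewed techniques filter the images or redefine the information map before selection.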
Wilson, John; Tian, Gui; Morozov, Maxim; Qubaa, Abd
Sensor fusion for electromagnetic NDE at different stages and levels has been discussed and three case studies for fusion at sensor and feature levels have been investigated. Instead of applying innovative mathematical techniques to utilise multiple sensors to improve the fidelity of defect and material characterisation, physics based sensor fusion is investigated. It has been shown that the three types of sensing system fusion, feature selection and integration and information combination fo...
Lytrivis, Panagiotis; Thomaidis, George; Amditis, Angelos
This chapter has summarized the state-of-the-art in sensor data fusion for automotive applications, showing that this is a relatively new discipline in the automotive research area compared to signal processing, image processing or radar processing. Thus, there is a...
The paper describes the implementation of data-sensor fusion and sensor management technology for accident management through the simulated severe accident (SA) scenarios subjected to study. The organization of the present paper is as follows. As data-sensor fusion and sensor management is an emerging technology which is not widely known, in Sec. 2 the definition and goals of data-sensor fusion and sensor management technology are described. In Sec. 3, first, with reference to Kalman filtering as an information filter, statistical data-sensor fusion technology is described. This is followed by deterministic data-sensor fusion technology using gross plant state variables and neural networks (NN) and its implementation for severe accident management in NPPs. In Sec. 4, the sensor management technology is described. Finally, the performance of the data-sensor fusion technology for NPP safety is discussed. 12 refs, 6 figs
Information-based management of crop production systems known as precision agriculture relies on different sensor technologies aimed at characterization of spatial heterogeneity of a cropping environment. Remote and proximal sensing systems have been deployed to obtain high-resolution data pertainin...
In wireless sensor networks, sensor fusion is employed to integrate the acquired data from diverse sensors to provide a unified interpretation. The best and most salient advantage of sensor fusion is to obtain high-level information in both statistical and definitive aspects, which cannot be attained by a single sensor. In this paper, we propose a novel sensor fusion technique based on fuzzy theory for our earlier proposed Cognitive Radio-based Vehicular Ad Hoc and Sensor Networks (CR-VASNET)...
Wang, Meisong; Perera, Charith; Jayaraman, Prem Prakash; Zhang, Miranda; Strazdins, Peter; Ranjan, Rajiv
Internet of Things (IoT) has gained substantial attention recently and plays a significant role in smart city application deployments. A number of such smart city applications depend on sensor fusion capabilities in the cloud from diverse data sources. We introduce the concept of IoT and present in detail ten different parameters that govern our sensor data fusion evaluation framework. We then evaluate the current state of the art in sensor data fusion against our sensor data fusion framework...
This paper develops a robot self-protection system using multi-sensor information fusion technology. The system uses five groups of photoelectric and ultrasonic sensors installed in different directions on the robot. In this study, signals were gathered using the complementary ranging of the photoelectric and ultrasonic sensors, and were then sent to an MCU to achieve multi-sensor information fusion. The core fusion technology is the adaptive weighted fusion estimation algorithm, which can make the measurement data more accurate. With this technology, accurate robot self-protection commands were generated to avoid obstacles, to judge narrow high ground and to prevent dropping. The experimental results validated its good self-protection function.
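The adaptive weighted fusion estimation algorithm mentioned here typically weights each sensor inversely to its estimated noise variance, which minimizes the variance of the fused estimate. A minimal sketch under that standard formulation (the per-sensor variances are assumed known; in the adaptive version they would be estimated online):

```python
def adaptive_weighted_fusion(measurements, variances):
    """Inverse-variance weighted fusion of redundant range readings:
    noisier sensors receive smaller weights, and the fused variance
    is smaller than any individual sensor's variance."""
    inv = [1.0 / v for v in variances]
    total = sum(inv)
    weights = [i / total for i in inv]   # normalized inverse-variance weights
    estimate = sum(w * z for w, z in zip(weights, measurements))
    fused_variance = 1.0 / total
    return estimate, fused_variance
```

For example, a reading with variance 9 contributes only one ninth of the weight of a reading with variance 1, so an obstacle range from the noisier ultrasonic channel is dominated by the cleaner photoelectric channel rather than averaged equally with it.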
Hol, Jeroen; Schön, Thomas; Gustafsson, Fredrik; Slycke, Per
In Augmented Reality (AR), the position and orientation of the camera have to be estimated with high accuracy and low latency. This nonlinear estimation problem is studied in the present paper. The proposed solution makes use of measurements from inertial sensors and computer vision. These measurements are fused using a Kalman filtering framework, incorporating a rather detailed model for the dynamics of the camera. Experiments show that the resulting filter provides good estimates of the cam...
Emilsson, Erika; Rydell, Joakim
A reliable indoor positioning system providing high accuracy has the potential to increase the safety of first responders and military personnel significantly. To enable navigation in a broad range of environments and obtain more accurate and robust positioning results, we propose a multi-sensor fusion approach. We describe and evaluate a positioning system, based on sensor fusion between a foot-mounted inertial measurement unit (IMU) and a camera-based system for simultaneous localization and mapping (SLAM). The complete system provides accurate navigation in many relevant environments without depending on preinstalled infrastructure. The camera-based system uses both inertial measurements and visual data, thereby enabling navigation also in environments and scenarios where one of the sensors provides unreliable data during a few seconds. When sufficient light is available, the camera-based system generally provides good performance. The foot-mounted system provides accurate positioning when distinct steps can be detected, e.g., during walking and running, even in dark or smoke-filled environments. By combining the two systems, the integrated positioning system can be expected to enable accurate navigation in almost all kinds of environments and scenarios. In this paper we present results from initial tests, which show that the proposed sensor fusion improves the navigation solution considerably in scenarios where either the foot-mounted or camera-based system is unable to navigate on its own.
The main motivation of this thesis is to design low-complexity, high-efficiency noncoherent fusion rules for parallel triple-layer wireless sensor networks (WSNs) based on frequency-hopping M-ary frequency shift keying (FH/MFSK) techniques, which are hence referred to as FH/MFSK WSNs. The FH/MFSK WSNs may be employed to monitor single or multiple source events (SEs), with each SE having multiple states. In FH/MFSK WSNs, local decisions made by local sensor nodes (LSNs) are transmitted t...
Schenker, Paul S.
Topics addressed include 3D object perception, human-machine interface in multisensor systems, sensor fusion architecture, fusion of multiple and distributed sensors, interface and decision models for sensor fusion, computational networks, simple sensing for complex action, multisensor-based control, and metrology and calibration of multisensor systems. Particular attention is given to controlling 3D objects by sketching 2D views, the graphical simulation and animation environment for flexible structure robots, designing robotic systems from sensorimotor modules, cylindrical object reconstruction from a sequence of images, an accurate estimation of surface properties by integrating information using Bayesian networks, an adaptive fusion model for a distributed detection system, multiple concurrent object descriptions in support of autonomous navigation, robot control with multiple sensors and heuristic knowledge, and optical array detectors for image sensors calibration. (No individual items are abstracted in this volume)
This thesis addresses a transmission energy problem for wireless sensor networks. There are two types of wireless sensor networks. One is single-hop sensor network where data from each sensor is directly transmitted to a fusion center, and the other is multihop sensor network where data is relayed through adjacent sensors. In the absence of a moving agent for data collection, multihop sensor network is typically much more energy efficient than single-hop sensor network since the former avoids...
Bartolini, Novella; la Porta, Thomas; Petrioli, Chiara; Silvestri, Simone
In this paper we address the problem of prolonging the lifetime of wireless sensor networks (WSNs) deployed to monitor an area of interest. In this scenario, a helpful approach is to reduce coverage redundancy and therefore the energy expenditure due to coverage. We introduce the first algorithm which reduces coverage redundancy by means of Sensor Activation and sensing Radius Adaptation (SARA) in a general applicative scenario with two classes of devices: sensors that can adapt their sensing range (adjustable sensors) and sensors that cannot (fixed sensors). In particular, SARA activates only a subset of all the available sensors and reduces the sensing range of the adjustable sensors that have been activated. In doing so, SARA also takes possible heterogeneous coverage capabilities of sensors belonging to the same class into account. It specifically addresses device heterogeneity by modeling the coverage problem in the Laguerre geometry through Voronoi-Laguerre diagrams. SARA executes quickly and is guarante...
A novel sensor fusion design framework is presented with the objective of improving the overall multisensor measurement system performance and achieving graceful degradation following individual sensor failures. The Unscented Information Filter (UIF) is used to provide a useful tool for combining information from multiple sources. A two-step off-line and on-line calibration procedure refines sensor error models and improves the measurement performance. A Fault Detection and Identification (FDI) scheme crosschecks sensor measurements and simultaneously monitors sensor biases. Low-quality or faulty sensor readings are then rejected from the final sensor fusion process. The attitude estimation problem is used as a case study for the multiple sensor fusion algorithm design, with information provided by a set of low-cost rate gyroscopes, accelerometers, magnetometers, and a single-frequency GPS receiver's position and velocity solution. Flight data collected with an Unmanned Aerial Vehicle (UAV) research test bed verifies the sensor fusion, adaptation, and fault-tolerance capabilities of the designed sensor fusion algorithm.
National Aeronautics and Space Administration — Research on desensitized optimal filtering techniques and a navigation and sensor fusion tool kit using advanced filtering techniques is proposed. Research focuses...
Pannetier, B.; Moras, J.; Dezert, Jean; Sella, G.
In this paper, data obtained from a wireless unattended ground sensor network are used for tracking multiple ground targets (vehicles, pedestrians and animals) moving on and off the road network. The goal of the study is to evaluate several data fusion algorithms to select the best approach to establishing tactical situational awareness. The ground sensor network is composed of heterogeneous sensors (optronic, radar, seismic, acoustic, magnetic) and data fusion nodes. The fusion nodes are small hardware platforms placed on the surveillance area that communicate with each other. In order to satisfy operational needs and the limited communication bandwidth between the nodes, we study several data fusion algorithms to track and classify targets in real time. A multiple target tracking (MTT) algorithm is integrated in each data fusion node, taking embedded constraints into account. The choice of the MTT algorithm is motivated by the limits of the chosen technology. In the fusion nodes, the distributed MTT algorithm exploits the road network information in order to constrain the multiple dynamic models. Then, a variable structure interacting multiple model (VS-IMM) is adapted to the road network topology. This algorithm is well known in centralized architectures, but it implies a modification of the other data fusion algorithms to preserve the tracking performance under constraints. Based on such a VS-IMM MTT algorithm, we adapt classical data fusion techniques to make them work in three architectures: centralized, distributed and hierarchical. The sensor measurements are considered asynchronous, but the fusion steps are synchronized across all sensors. The performance of the data fusion algorithms is evaluated using simulated data and also validated on real data. The scenarios under analysis contain multiple targets with close and crossing trajectories involving data association uncertainties.
Mayo, Donald R.
This research provides analysis of several approaches to the fusion of multiple dissimilar sensors to supplement simple color vision detection and recognition. Non-visible sensor systems can enhance computer vision systems. Our research investigates using thermal infrared (IR) sensors in combination with color data for object detection and recognition. We analyze several types of high-level and low-level sensor fusion to compare error rates with raw color and raw IR error rates in detecti...
Haberjahn, Mathias; Kozempel, Karsten
A reference sensor system, consisting of a multilayer laser scanner and a stereo camera system, is used for detecting vehicle surroundings. Via a novel multi-level, multi-sensor fusion framework, the heterogeneous sensor information can be fused on three succeeding processing levels (low, mid and high level). Here the low-level fusion achieved the highest accuracy in the description of the object hypotheses. Detection and processing faults can be recognized and reduced by competing sensor in...
Prats Mateu, Batirtze; Kainz, Birgit; Pum, Dietmar; Sleytr, Uwe B.; Toca-Herrera, José L.
Fluorescence proteins are widely used as markers for biomedical and technological purposes. Therefore, the aim of this project was to create a fluorescent sensor, based in the green and cyan fluorescent protein, using bacterial S-layers proteins as scaffold for the fluorescent tag. We report the cloning, expression and purification of three S-layer fluorescent proteins: SgsE-EGFP, SgsE-ECFP and SgsE-13aa-ECFP, this last containing a 13-amino acid rigid linker. The pH dependence of the fluorescence intensity of the S-layer fusion proteins, monitored by fluorescence spectroscopy, showed that the ECFP tag was more stable than EGFP. Furthermore, the fluorescent fusion proteins were reassembled on silica particles modified with cationic and anionic polyelectrolytes. Zeta potential measurements confirmed the particle coatings and indicated their colloidal stability. Flow cytometry and fluorescence microscopy showed that the fluorescence of the fusion proteins was pH dependent and sensitive to the underlying polyelectrolyte coating. This might suggest that the fluorescent tag is not completely exposed to the bulk media as an independent moiety. Finally, it was found out that viscosity enhanced the fluorescence intensity of the three fluorescent S-layer proteins.
Zhang Yinan; Sun Qingwei; Quan He; Jin Yonggao; Quan Taifan
In practical multi-sensor information fusion systems, there exists uncertainty about the network structure, the active state of sensors, and the information itself (including fuzziness, randomness, incompleteness, and roughness). Hence the problem of uncertain information fusion must be investigated. A robust learning algorithm which adapts to complex environments and a fuzzy inference algorithm which handles fuzzy information are explored to solve the problem. Based on the fusion technology of neural networks and fuzzy inference algorithms, a multi-sensor uncertain information fusion system is modeled. Also, a RANFIS learning algorithm and a fusing-weight synthesized inference algorithm are developed from the ANFIS algorithm according to the concept of robust neural networks. This fusion system mainly consists of a RANFIS confidence estimator, a fusing-weight synthesized inference knowledge base, and a weighted fusion section. The simulation results demonstrate that the proposed fusion model and algorithm are capable of uncertain information fusion and are thus clearly advantageous compared with the conventional Kalman weighted fusion algorithm.
Sensor Data Fusion is the process of combining incomplete and imperfect pieces of mutually complementary sensor information in such a way that a better understanding of an underlying real-world phenomenon is achieved. Typically, this insight is either unobtainable otherwise or a fusion result exceeds what can be produced from a single sensor output in accuracy, reliability, or cost. This book provides an introduction to Sensor Data Fusion, as an information technology as well as a branch of engineering science and informatics. Part I presents a coherent methodological framework, thus providing th
The paper describes the implementation of data-sensor fusion and sensor management technology for accident management through the simulated severe accident (SA) scenarios subjected to study. By means of accident management, the appropriate prompt actions to be taken to avoid nuclear accidents are meant, while such accidents are deemed to somehow be imminent during plant operation. The organisation of the present paper is as follows. As data-sensor fusion and sensor management is an emerging technology which is not widely known, in Sec. 2 the definition and goals of data-sensor fusion and sensor management technology are described. In Sec. 3, with reference to Kalman filtering as an information filter, statistical data-sensor fusion technology is described first. This is followed by examples of deterministic data-sensor fusion technology using gross plant state variables and neural networks (NN), and its implementation for severe accident management in NPPs. In Sec. 4, the sensor management technology is described. Finally, the performance of the data-sensor fusion technology for NPP safety is discussed. (orig./WL)
This paper proposes a real-time system for face authentication, obtained through fusion of infrared (IR) and visible images. To authenticate unknown persons in highly secured areas, multiple algorithms are needed. Four well-known algorithms for face recognition, Block Independent Component Analysis (BICA), the Kalman Filtering (KF) method, the Discrete Cosine Transform (DCT) and Orthogonal Locality Preserving Projections (OLPP), are used to extract the features. If the database size is very large and the features are not distinct, then ambiguity will exist in face recognition. Hence more than one sensor is needed for critical and/or highly secured areas. This paper deals with a multiple fusion methodology using weighted averaging and fuzzy logic. The visible sensor output depends on environmental conditions, namely lighting and illumination; to overcome this problem, a histogram technique is used to choose the appropriate algorithm. DCT and Kalman filtering are holistic approaches, BICA follows a feature-based approach, and OLPP preserves the Euclidean structure of the face space. These recognizers are capable of addressing the problem of dimensionality reduction by eliminating redundant features and reducing the feature space. The system can handle variations like illumination, pose, orientation, occlusion, etc., up to a significant level. The integrated system overcomes the drawbacks of the individual recognizers. The proposed system is aimed at increasing the accuracy of the person authentication system and at the same time reducing the limitations of the individual algorithms. It is tested on a real-time database and the results are found to be 96% accurate.
Connors, John J. (PPG Industries, Inc., Harmar Township, PA); Hill, Kevin (PPG Industries, Inc., Harmar Township, PA); Hanekamp, David (PPG Industries, Inc., Harmar Township, PA); Haley, William F. (PPG Industries, Inc., Wichita Falls, TX); Gallagher, Robert J.; Gowin, Craig (PPG Industries, Inc., Batavia, IL); Farrar, Arthur R. (PPG Industries, Inc., Wichita Falls, TX); Sheaffer, Donald A.; DeYoung, Mark A. (PPG Industries, Inc., Mt. Zion, IL); Bertram, Lee A.; Dodge, Craig (PPG Industries, Inc., Mt. Zion, IL); Binion, Bruce (PPG Industries, Inc., Mt. Zion, IL); Walsh, Peter M.; Houf, William G.; Desam, Padmabhushana R. (University of Utah, Salt Lake City, UT); Tiwary, Rajiv (PPG Industries, Inc., Harmar Township, PA); Stokes, Michael R. (PPG Industries, Inc.); Miller, Alan J. (PPG Industries, Inc., Mt. Zion, IL); Michael, Richard W. (PPG Industries, Inc., Lincoln, AL); Mayer, Raymond M. (PPG Industries, Inc., Harmar Township, PA); Jiao, Yu (PPG Industries, Inc., Harmar Township, PA); Smith, Philip J. (University of Utah, Salt Lake City, UT); Arbab, Mehran (PPG Industries, Inc., Harmar Township, PA); Hillaire, Robert G.
An integrated system for the fusion of product and process sensors and controls for production of flat glass was envisioned, having as its objective the maximization of throughput and product quality subject to emission limits, furnace refractory wear, and other constraints. Although the project was prematurely terminated, stopping the work short of its goal, the tasks that were completed show the value of the approach and objectives. Though the demonstration was to have been done on a flat glass production line, the approach is applicable to control of production in the other sectors of the glass industry. Furthermore, the system architecture is also applicable in other industries utilizing processes in which product uniformity is determined by ability to control feed composition, mixing, heating and cooling, chemical reactions, and physical processes such as distillation, crystallization, drying, etc. The first phase of the project, with Visteon Automotive Systems as industrial partner, was focused on simulation and control of the glass annealing lehr. That work produced the analysis and computer code that provide the foundation for model-based control of annealing lehrs during steady state operation and through color and thickness changes. In the second phase of the work, with PPG Industries as the industrial partner, the emphasis was on control of temperature and combustion stoichiometry in the melting furnace, to provide a wider operating window, improve product yield, and increase energy efficiency. A program of experiments with the furnace, CFD modeling and simulation, flow measurements, and sensor fusion was undertaken to provide the experimental and theoretical basis for an integrated, model-based control system utilizing the new infrastructure installed at the demonstration site for the purpose. In spite of the fact that the project was terminated during the first year of the second phase of the work, the results of these first steps toward implementation
Marwah Almasri; Khaled Elleithy; Abrar Alajlan
Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each is equipped with various types of sensors, such as GPS, cameras, infrared and ultrasonic sensors. These sensors are used to observe the surrounding environment. However, these sensors sometimes fail and produce inaccurate readings. Therefore, the integration of sensor fusion will help to solve this dilemma and enhance the overall performance. This paper presents a collision-free mobile robot...
Guerrero, Rafael F; Kirkpatrick, Mark
We use forward and coalescent models of population genetics to study chromosome fusions that reduce the recombination between two locally adapted loci. Under a continent-island model, a fusion spreads and reaches a polymorphic equilibrium when it causes recombination between locally adapted alleles to be less than their selective advantage. In contrast, fusions in a two-deme model always spread; whether a fusion reaches a polymorphic equilibrium or becomes fixed depends on the relative recombination rates of fused homozygotes and heterozygotes. Neutral divergence around fusion polymorphisms is markedly increased, showing peaks at the point of fusion and at the locally adapted loci. Local adaptation could explain the evolution of many chromosome fusions, which are among the most common chromosome rearrangements in nature. PMID:24964074
Zou, Xiaotian; Tian, Ye; Wu, Nan; Sun, Kai; Wang, Xingwei
The adaptive neural network is a standard technique used in nonlinear system estimation and learning applications for dynamic models. In this paper, we introduce an adaptive sensor fusion algorithm for a helmet structural health monitoring system. The helmet structural health monitoring system is used to study the effects of ballistic/blast events on the helmet and human skull. Installed inside the helmet system is an optical fiber pressure sensor array. After implementing the adaptive estimation algorithm in the helmet system, a dynamic model for the sensor array has been developed. The dynamic response characteristics of the sensor network are estimated from the pressure data by applying an adaptive control algorithm using an artificial neural network. With the estimated parameters and position data from the dynamic model, the pressure distribution over the whole helmet can be calculated using Bézier surface interpolation. The distribution pattern inside the helmet will be very helpful for improving helmet design to provide better protection to soldiers from head injuries.
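The Bézier surface interpolation mentioned above evaluates a smooth surface from a grid of control values via Bernstein polynomials. The sketch below is a generic illustration of that interpolation step, not the paper's implementation; the grid shape and values stand in for estimated pressures at sensor positions.

```python
from math import comb

def bezier_surface(ctrl, u, v):
    """Evaluate a Bezier surface at (u, v) in [0,1]^2 from an
    (n+1) x (m+1) grid of control values (e.g. sensor pressures)."""
    n, m = len(ctrl) - 1, len(ctrl[0]) - 1

    def bern(k, deg, t):
        # Bernstein basis polynomial B_{k,deg}(t)
        return comb(deg, k) * t**k * (1 - t) ** (deg - k)

    return sum(bern(i, n, u) * bern(j, m, v) * ctrl[i][j]
               for i in range(n + 1) for j in range(m + 1))
```

By construction the surface interpolates the corner control values and blends smoothly in between, which is what makes it suitable for reconstructing a pressure distribution from sparsely placed sensors.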
Schwarz, Sebastian; Sjöström, Mårten; Olsson, Roger
Obtaining three-dimensional scenery data is an essential task in computer vision, with diverse applications in various areas such as manufacturing and quality control, security and surveillance, or user interaction and entertainment. Dedicated Time-of-Flight sensors can provide detailed scenery depth in real-time and overcome shortcomings of traditional stereo analysis. Nonetheless, they do not provide texture information and have limited spatial resolution. Therefore such sensors are typically combined with high resolution video sensors. Time-of-Flight Sensor Fusion is a highly active field of research. Over the recent years, there have been multiple proposals addressing important topics such as texture-guided depth upsampling and depth data denoising. In this article we take a step back and look at the underlying principles of ToF sensor fusion. We derive the ToF sensor fusion error model and evaluate its sensitivity to inaccuracies in camera calibration and depth measurements. In accordance with our findings, we propose certain courses of action to ensure high quality fusion results. With this multivariate sensitivity analysis of the ToF sensor fusion model, we provide an important guideline for designing, calibrating and running a sophisticated Time-of-Flight sensor fusion capture system.
López Milla, Javier
The Kalman filter is nowadays one of the most widely used tools to perform sensor fusion in navigation environments, i.e., to combine information from several different sensors to obtain the optimal navigation solution. However, there is no single algorithm for the Kalman filter, and each must be adapted to the concrete problem. This implies that no two Kalman filters are exactly the same, complicating an objective comparison between them, mainly if implemented by different people....
National Aeronautics and Space Administration — It is proposed to develop desensitized optimal filtering techniques and to implement these algorithms in a navigation and sensor fusion tool kit. These proposed...
Johnson, Donald H.; Shaw, Scott W.; Reynolds, Steven; Himayat, Nageen
Multi-sensor fusion, at the most basic level, can be cast into a concise, elegant model. Reality demands, however, that this model be modified and augmented. These modifications often result in software systems that are confusing in function and difficult to debug. This problem can be ameliorated by adopting an object-oriented, data-flow programming style. For real-time applications, this approach simplifies data communications and storage management. The concept of object-oriented, data-flow programming is conveniently embodied in the blackboard style of software architecture. Blackboard systems allow diverse programs access to a central database. When the blackboard is described as an object, it can be distributed over multiple processors for real-time applications. Choosing the appropriate parallel architecture is the subject of ongoing research. A prototype blackboard has been constructed to fuse optical image regions and Doppler radar events. The system maintains tracks of simulated targets in real time. The results of this simulation have been used to direct further research on real-time blackboard systems.
Martí Muñoz, Enrique David
Sensor fusion is a mature but very active research field, included in the more general discipline of information fusion. It studies how to combine data coming from different sensors in such a way that the resulting information is better in some sense –more complete, accurate or stable– than any of the original sources used individually. Context is defined as everything that constrains or affects the process of solving a problem, without being part of the problem or the solution itself. Over ...
SONG Kaichen; NIE Xili
Weighted fusion algorithms, which can be applied in the area of multi-sensor data fusion, are developed based on the weighted least-squares method. A weighted fusion algorithm, in which the relationship between the weight coefficients and the measurement noise is established, is proposed by giving attention to the correlation of the measurement noise. Then a simplified weighted fusion algorithm is deduced on the assumption that the measurement noise is uncorrelated. In addition, an algorithm which can adjust the weight coefficients in the simplified algorithm by estimating the measurement noise from the measurements is presented. It is proved by simulation and experiment that the precision of a multi-sensor system based on these algorithms is better than that of a multi-sensor system based on other algorithms.
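The simplified case described in this abstract, uncorrelated measurement noise with weights adjusted from estimated noise, can be sketched as follows. This is a minimal illustration of the classical inverse-variance weighting that weighted least squares reduces to in the uncorrelated case, not the paper's exact algorithm; the sample-based variance estimator is an illustrative assumption.

```python
def estimate_variances(samples_per_sensor):
    """Estimate each sensor's noise variance from its own repeated
    measurements of a (quasi-)constant quantity (unbiased estimator)."""
    ests = []
    for s in samples_per_sensor:
        mean = sum(s) / len(s)
        ests.append(sum((x - mean) ** 2 for x in s) / (len(s) - 1))
    return ests

def weighted_fusion(values, variances):
    """Weighted least-squares fusion for uncorrelated noise:
    weights inversely proportional to the noise variances."""
    w = [1.0 / v for v in variances]
    return sum(wi * x for wi, x in zip(w, values)) / sum(w)
```

A sensor with three times the noise variance of another contributes one third the weight, so the fused value leans toward the more precise sensor.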
Benaskeur, Abder R.; Roy, Jean
Sensor Management (SM) has to do with how to best manage, coordinate and organize the use of sensing resources in a manner that synergistically improves the process of data fusion. Based on the contextual information, SM develops options for collecting further information, allocates and directs the sensors towards the achievement of the mission goals and/or tunes the parameters for the real-time improvement of the effectiveness of the sensing process. Conscious of the important role that SM has to play in modern data fusion systems, we are currently studying advanced SM concepts that would help increase the survivability of the current Halifax and Iroquois Class ships, as well as their possible future upgrades. For this purpose, a hierarchical scheme has been proposed for data fusion and resource management adaptation, based on control theory and within the process refinement paradigm of the JDL data fusion model, and taking into account the multi-agent model put forward by the SASS Group for the situation analysis process. The novelty of this work lies in the unified framework that has been defined for tackling the adaptation of both the fusion process and the sensor/weapon management.
Blum, Rick S
Taking another lesson from nature, the latest advances in image processing technology seek to combine image data from several diverse types of sensors in order to obtain a more accurate view of the scene: very much the same as we rely on our five senses. Multi-Sensor Image Fusion and Its Applications is the first text dedicated to the theory and practice of the registration and fusion of image data, covering such approaches as statistical methods, color-related techniques, model-based methods, and visual information display strategies. After a review of state-of-the-art image fusion techniques,
Ng, Joseph; Piacentino, Michael; Caldwell, Brian
Mission success is highly dependent on the ability to accomplish Surveillance, Situation Awareness, Target Detection and Classification, but is challenging under adverse weather conditions. This paper introduces an engineering prototype to address the image collection challenges using a Common Optical Path, Multiple Sensors and an Intelligent Image Fusion System, and provides illustrations and sample fusion images. Panavision's advanced wide spectrum optical design has permitted a suite of imagers to perform observations through a common optical path with a common field of view, thereby aligning images and facilitating optimized downstream image processing. The adaptable design also supports continuous zoom or Galilean lenses for multiple fields of view. The Multiple Sensors include: (1) High-definition imaging sensors that are small, have low power consumption and a wide dynamic range; (2) EMCCD sensors that transition from daylight to starlight, even under poor weather conditions, with sensitivity down to 0.00025 lux; and (3) SWIR sensors that, with the advancement in InGaAs, are able to generate ultra-high sensitivity images from 1–1.7 μm reflective light and can achieve imaging through haze and some types of camouflage. The intelligent fusion of multiple sensors provides high-resolution color information with previously impossible sensitivity and contrast. With the integration of Field-Programmable Gate Arrays (FPGAs) and Application-Specific Integrated Circuits (ASICs), real-time Image Processing and Fusion Algorithms can facilitate mission success in a small, low power package.
Data fusion is the integration and analysis of data from multiple sensors to develop a more accurate understanding of a situation and determine how to respond to it. Although data fusion can be applied in many situations, this paper focuses on its application to manufacturing and how it changes some of the more traditional, less adaptive information models that support the design and manufacturing functions. The paper consists of four parts: Section 1 defines data fusion and explains its impact on manufacturing. Section 2 describes an information system architecture and explains the natural language-based information modeling methodology used by this research project. Section 3 identifies the major design and manufacturing functions, reviews the information models required to support them, and then shows how these models must be extended to support data fusion. Section 4 discusses the future directions of this work. This report is one of three produced by an FY93 LDRD project, Information Integration for Data Fusion. The project confirmed: (1) that the natural language-based information modeling methodology could be used effectively in data fusion areas, and (2) that commonalities could be found that would allow synergy across various data fusion areas, such as defense, manufacturing, and health care. The project found five common objects that are the basis for all of the data fusion areas examined: targets, behaviors, environments, signatures, and sensors. Many of these objects and the specific facts related to them were common across several models and could easily be reused. In some cases, even the terminology remained the same. This commonality is important with the growing use of multisensor data fusion. Data fusion is much more difficult if each type of sensor uses its own objects and models rather than building on a common set. Information model integration at the conceptual level is much easier than at the implementation level.
Breejen, E. den; Schutte, K.; Cremer, F.
In this paper the multi-sensor fusion results obtained within the European research project GEODE (Ground Explosive Ordnance Detection system) are presented. The layout of the test lane and the individual sensors used are described. The implementation of the SCOOP algorithm improves the ROC curves,
Frisk, Mikael; Nilsson, Albin
In order to determine the orientation and position of an object it is common to measure linear and angular motion with an inertial sensor. To further improve the positioning an ultra-wideband sensor can be used simultaneously and integrated in the final solution with sensor fusion. This study has evaluated an ultra-wideband sensor, and also integrated it with a pre-existing solution for positioning using inertial sensors, in order to determine if the solution is viable for positioning of an...
Shahina Begum; Shaibal Barua; Mobyen Uddin Ahmed
Today, clinicians often diagnose and classify diseases based on information collected from several physiological sensor signals. However, sensor signals can easily be corrupted by noise or interference, and due to large individual variations the sensitivity of different physiological sensors can also vary. Therefore, fusing multiple sensor signals is valuable for providing a more robust and reliable decision. This paper demonstrates a physiological sensor signal classificati...
He, Changyu; Liu, Yue
To solve the occlusion problem in an optical tracking system (OTS) for surgical navigation, this paper proposes a sensor fusion approach and an adaptive display method to handle cases where partial or total occlusion occurs. In the sensor fusion approach, the full 6D pose information provided by the optical tracker is used to estimate the bias of the inertial sensors when all of the markers are visible. When partial occlusion occurs, the optical system can track the position of at least one marker, which can be combined with the orientation estimated from the inertial measurements to recover the full 6D pose information. When all the markers are invisible, the position tracking is realized based on the outputs of the Inertial Measurement Unit (IMU), which may accumulate drift error. To alert the user when the drift error is large enough to influence the navigation, images adapted to the drift error are displayed in the user's field of view. The experiments are performed with an augmented reality HMD which displays the AR images and a hybrid tracking system (HTS) consisting of an OTS and an IMU. Experimental results show that with the proposed sensor fusion approach the 6D pose of the head with respect to the reference frame can be estimated even under partial occlusion conditions. With the help of the proposed adaptive display method, users can recover the scene of the markers when the error is considered to be relatively high.
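The three occlusion regimes described in this abstract amount to a source-selection policy. The sketch below is a hypothetical, simplified version of such a policy; the marker-count threshold, drift threshold, and placeholder pose values are all invented for illustration and are not the paper's implementation.

```python
def fuse_pose(markers_visible, ot_pos, ot_quat, imu_quat, imu_pos,
              drift, thresh=5.0):
    """Hypothetical occlusion-handling policy for an OTS + IMU hybrid:
    full optical pose when enough markers are seen, optical position +
    IMU orientation under partial occlusion, and IMU dead reckoning
    (with a drift warning flag) under total occlusion."""
    if markers_visible >= 3:                   # full 6D optical pose available
        return ot_pos, ot_quat, False
    if markers_visible >= 1:                   # partial: mix the two sources
        return ot_pos, imu_quat, False
    return imu_pos, imu_quat, drift > thresh   # total occlusion: warn on drift
```

The returned flag would drive the adaptive display, alerting the user only when dead-reckoning drift exceeds the tolerance.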
In a multiple-sensor system, each sensor produces an output which is related to the desired feature according to a certain probability distribution. We propose a fuser that combines the sensor outputs to more accurately predict the desired feature. The fuser utilizes the lower envelope of the sensors' regression curves to project the sensor with the least error at each point of the feature space. This fuser is optimal among all projective fusers and also satisfies the isolation property, which ensures performance at least as good as that of the best sensor. In case the sensor distributions are not known, we show that a consistent estimator of this fuser can be computed entirely from a training sample. Compared to linear fusers, the projective fusers provide complementary performance. We propose two classes of metafusers that utilize both linear and projective fusers to perform at least as well as the best sensor and the best fuser.
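The lower-envelope idea behind the projective fuser can be sketched directly: at each feature point, select the output of whichever sensor has the smallest error there. The error functions below are invented toy examples (one sensor accurate on each half of the feature space), not the paper's estimated regression curves.

```python
def make_projective_fuser(error_fns):
    """Projective fuser: at each feature point x, output the reading of
    the sensor whose (estimated) error is lowest there -- i.e. follow
    the lower envelope of the sensors' error curves."""
    def fuse(x, sensor_outputs):
        best = min(range(len(error_fns)), key=lambda i: error_fns[i](x))
        return sensor_outputs[best]
    return fuse

# Toy example: sensor 0 is accurate for x < 0, sensor 1 for x >= 0
err0 = lambda x: 0.1 if x < 0 else 1.0
err1 = lambda x: 1.0 if x < 0 else 0.1
fuse = make_projective_fuser([err0, err1])
```

Because the fuser never does worse than the pointwise-best sensor, it trivially exhibits the isolation property on this toy example: for negative x it returns sensor 0's output, for non-negative x sensor 1's.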
Almasri, Marwah; Elleithy, Khaled; Alajlan, Abrar
Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each is equipped with various types of sensors, such as GPS, cameras, infrared and ultrasonic sensors. These sensors are used to observe the surrounding environment. However, these sensors sometimes fail and produce inaccurate readings. Therefore, the integration of sensor fusion will help to solve this dilemma and enhance the overall performance. This paper presents a collision-free mobile robot navigation system based on a fuzzy logic fusion model. Eight distance sensors and a range finder camera are used for the collision avoidance approach, while three ground sensors are used for the line or path following approach. The fuzzy system is composed of nine inputs, which are the eight distance sensors and the camera; two outputs, which are the left and right velocities of the mobile robot's wheels; and 24 fuzzy rules for the robot's movement. The Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which includes the collision avoidance based on the fuzzy logic fusion model and the line following robot, has been implemented and tested through simulation and real-time experiments. Various scenarios have been presented with static and dynamic obstacles, using one robot and two robots, while avoiding obstacles of different shapes and sizes. PMID:26712766
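The fuzzy fusion controller described in this abstract maps sensor memberships through rules to wheel velocities. The toy controller below illustrates that pipeline with just two distance inputs and three rules; the membership function shape, rule table, and weighted-average defuzzification are illustrative assumptions, a drastic simplification of the paper's 9-input, 24-rule system.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def avoid(left_dist, right_dist):
    """Toy fuzzy controller: if an obstacle is near on one side, slow
    the opposite wheel to turn away (weighted-average defuzzification)."""
    near_l = tri(left_dist, -0.1, 0.0, 0.5)   # "obstacle near on the left"
    near_r = tri(right_dist, -0.1, 0.0, 0.5)  # "obstacle near on the right"
    clear = max(0.0, 1.0 - near_l - near_r)   # "path is clear"
    # Each rule fires with its antecedent strength -> (left_v, right_v)
    rules = [(near_l, (1.0, 0.2)),   # obstacle left  -> turn right
             (near_r, (0.2, 1.0)),   # obstacle right -> turn left
             (clear,  (1.0, 1.0))]   # clear          -> go straight
    tot = sum(w for w, _ in rules) or 1.0
    vl = sum(w * o[0] for w, o in rules) / tot
    vr = sum(w * o[1] for w, o in rules) / tot
    return vl, vr
```

With both sides clear the robot drives straight at full speed; an obstacle close on the left slows the right wheel's counterpart asymmetrically so the robot steers away, and intermediate distances blend the behaviors smoothly, which is the practical appeal of fuzzy fusion over crisp Boolean rules.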
Bahrepour, M.; Meratnia, N.; Havinga, P.J.M.
Recently, the Wireless Sensor Networks (WSN) community has witnessed an application focus shift. Although monitoring was the initial application of wireless sensor networks, in-network data processing and (near) real-time actuation capability have made wireless sensor networks a suitable candidate for ev...
Scanlon, Michael V.; Ludwig, William D.
A research-oriented Army Technology Objective (ATO) named Sensor and Information Fusion for Improved Hostile Fire Situational Awareness uniquely focuses on the underpinning technologies to detect and defeat any hostile threat before, during, and after its occurrence. This is a joint effort led by the Army Research Laboratory, with the Armaments and the Communications and Electronics Research, Development, and Engineering Centers (ARDEC and CERDEC) as partners. It addresses distributed sensor fusion and collaborative situational awareness enhancements, focusing on the underpinning technologies to detect/identify potential hostile shooters prior to firing a shot and to detect/classify/locate the firing point of hostile small arms, mortars, rockets, RPGs, and missiles after the first shot. A field experiment was conducted that not only addressed diverse-modality sensor performance and sensor fusion benefits, but also gathered useful data to develop and demonstrate the ad hoc networking and dissemination of relevant data and actionable intelligence. Represented at this field experiment were various sensor platforms such as UGS, soldier-worn sensors, manned ground vehicles, UGVs, UAVs, and helicopters. This ATO continues to evaluate applicable technologies, including retro-reflection, UV, IR, visible, glint, LADAR, radar, acoustic, seismic, E-field, narrow-band emission, and image-processing techniques to detect the threats with very high confidence. Networked fusion of multi-modal data will reduce false alarms and improve actionable intelligence by distributing grid coordinates, detection report features, and imagery of threats.
Mohammad Jalil Piran
In wireless sensor networks, sensor fusion is employed to integrate the acquired data from diverse sensors to provide a unified interpretation. The best and most salient advantage of sensor fusion is to obtain high-level information in both statistical and definitive aspects, which cannot be attained by a single sensor. In this paper, we propose a novel sensor fusion technique based on fuzzy theory for our earlier proposed Cognitive Radio-based Vehicular Ad Hoc and Sensor Networks (CR-VASNET). In the proposed technique, we considered four input sensor readings (antecedents) and one output (consequent). The mobile nodes employed in CR-VASNET are supposed to be equipped with diverse sensors, which cater to our antecedent variables, for example Jerk, Collision Intensity, Temperature, and Inclination Degree. Crash_Severity is considered as the consequent variable. The processing and fusion of the diverse sensory signals are carried out by a fuzzy logic scenario. The accuracy and reliability of the proposed protocol, demonstrated by the simulation results, introduce it as an applicable system to be employed to reduce the casualty rate of vehicle crashes.
GUO Hang; YU Min
This paper presents a data fusion method for a distributed multi-sensor system, including GPS and INS sensor data processing. First, a residual χ²-test strategy with the corresponding algorithm is designed. Then a coefficient-matrix calculation method based on the information sharing principle is derived. Finally, the federated Kalman filter is used to combine these independent, parallel, real-time data. A pseudolite (PL) simulation example is given.
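The two ingredients named above, a residual χ²-test to gate faulty measurements and an information-sharing combination of local filters, can be sketched in one dimension. The gate value and the toy local estimates below are illustrative assumptions, not values from the paper.

```python
# Hedged 1-D sketch of a federated architecture: each local filter
# produces an estimate (x, P); a chi-square test on the normalized
# residual rejects faulty locals before information-weighted fusion.

def chi2_ok(residual, s, gate=9.0):
    """Accept a measurement if residual^2 / S is below the chi-square
    gate (~99% acceptance region for 1 degree of freedom)."""
    return residual * residual / s <= gate

def fuse_information(estimates):
    """Master-filter step: combine (x, P) pairs by information (1/P)
    weighting, as in a federated Kalman filter."""
    info = sum(1.0 / p for _, p in estimates)
    x = sum(xi / p for xi, p in estimates) / info
    return x, 1.0 / info

# Two local filters agree; a third is faulty and gets gated out.
locals_ = [(10.0, 1.0), (10.4, 2.0), (25.0, 1.0)]
x_pred, s = 10.1, 2.0
accepted = [(x, p) for x, p in locals_ if chi2_ok(x - x_pred, s)]
print(fuse_information(accepted))
```

A real federated filter would also feed the fused state back to the locals according to the information-sharing coefficients; that step is omitted here.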
Shen, Yunhe; Wu, Fan; Tseng, Kuo-Shih; Ye, Ding; Raymond, John; Konety, Badrinath; Sweet, Robert
Here we introduce a motion tracking or navigation module for medical simulation systems. Our main contribution is a sensor fusion method for proximity or distance sensors integrated with inertial measurement unit (IMU). Since IMU rotation tracking has been widely studied, we focus on the position or trajectory tracking of the instrument moving freely within a given boundary. In our experiments, we have found that this module reliably tracks instrument motion. PMID:27046606
Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck
This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms as...
With multisensor data fusion technology, the data from multiple sensors are fused in order to make a more accurate estimation of the environment through measurement, processing and analysis. Artificial neural networks are computational models that mimic biological neural networks. With high per...
Reliability of navigation data is critical for steering and manoeuvring control, in particular at high speed or in critical phases of a mission. Should faults occur, faulty instruments need to be autonomously isolated and the faulty information discarded. This paper designs a navigation solution... events where the fault-tolerant sensor fusion provided uninterrupted navigation data despite temporal instrument defects...
Due to the limited localization precision of a single sensor, a sensor data fusion is introduced based on the Rao-Blackwellized Unscented Kalman Filter (RBUKF), which fuses the sensor data of a GPS receiver, one gyro, and one compass. The RBUKF algorithm is compared with the Extended Kalman Filter (EKF) and the Unscented Kalman Filter (UKF) in this study. The experimental results show that the RBUKF algorithm can more effectively improve tracking accuracy and reduce computational complexity than the other algorithms, and has practical significance.
Röhl, S.; Bodenstedt, S.; Küderle, C.; Suwelack, S.; Kenngott, H.; Müller-Stich, B. P.; Dillmann, R.; Speidel, S.
Minimally invasive surgery is medically complex and can benefit heavily from computer assistance. One way to help the surgeon is to integrate preoperative planning data into the surgical workflow. This information can be represented as a customized preoperative model of the surgical site. To use it intraoperatively, it has to be updated during the intervention due to the constantly changing environment. Hence, intraoperative sensor data has to be acquired and registered with the preoperative model. Haptic information, which could complement the visual sensor data, is still not established. In addition, biomechanical modeling of the surgical site can help in reflecting the changes which cannot be captured by intraoperative sensors. We present a setting where a force sensor is integrated into a laparoscopic instrument. In a test scenario using a silicone liver phantom, we register the measured forces with a surface model reconstructed from stereo endoscopic images and a finite element model. The endoscope, the instrument and the liver phantom are tracked with a Polaris optical tracking system. By fusing this information, we can transfer the deformation onto the finite element model. The purpose of this setting is to demonstrate the principles needed and the methods developed for intraoperative sensor data fusion. One emphasis lies on the calibration of the force sensor with the instrument and first experiments with soft tissue. We also present our solution and first results concerning the integration of the force sensor as well as the accuracy of the fusion of force measurements, surface reconstruction and biomechanical modeling.
To carry out exploration tasks in unknown or partially unknown environments, a mobile robot needs to acquire and maintain models of its environment. In doing so, several sensors of same nature and/or heterogeneous sensor configurations may be used by the robot to achieve reliable performances. However, this in turn poses the problem of sensor fusion-based map building: How to interpret, combine and integrate sensory information in order to build a proper representation of the environment. Specifically, the goal of this thesis is to probe integration algorithms for Occupancy Grid (OG) based map building using odometry, ultrasonic rangefinders, and stereo vision. Three different uncertainty calculi are presented here which are used for sensor fusion-based map building purposes. They are based on probability theory, Dempster-Shafer theory of evidence, and fuzzy set theory. Besides, two different sensor models are depicted which are used to translate sensing data into range information. Experimental examples of OGs built from real data recorded by two robots in office-like environment are presented. They show the feasibility of the proposed approach for building both sonar and visual based OGs. A comparison among the presented uncertainty calculi is performed in a sonar-based framework. Finally, the fusion of both sonar and visual information based of the fuzzy set theory is depicted. (author)
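The probabilistic calculus mentioned above is usually implemented as a log-odds update per grid cell, which makes fusing repeated sonar and vision measurements a simple summation. The inverse-sensor probabilities below are invented for illustration.

```python
# Illustrative log-odds occupancy-grid update for a single cell: each
# sensor reading contributes its evidence in log-odds form, so fusion
# across sensors and over time reduces to addition.

import math

def logodds(p):
    """Log-odds of an occupancy probability p."""
    return math.log(p / (1.0 - p))

def prob(l):
    """Back-convert log-odds to a probability."""
    return 1.0 / (1.0 + math.exp(-l))

# A cell starts at 0.5 occupancy (log-odds 0); two sonar hits and one
# stereo-vision hit (inverse-sensor probabilities are made up) accumulate.
cell = 0.0
for p_hit in (0.7, 0.7, 0.6):
    cell += logodds(p_hit)
print(round(prob(cell), 3))
```

Dempster-Shafer and fuzzy-set variants replace this additive rule with their own combination operators, but the per-cell update structure is the same.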
This book introduces resource-aware data fusion algorithms to gather and combine data from multiple sources (e.g., sensors) in order to achieve inferences. These techniques can be used in centralized and distributed systems to overcome sensor failure, technological limitation, and spatial and temporal coverage problems. The algorithms described in this book are evaluated with simulation and experimental results to show they will maintain data integrity and make data useful and informative. Describes techniques to overcome real problems posed by wireless sensor networks deployed in circumstances that might interfere with measurements provided, such as strong variations of pressure, temperature, radiation, and electromagnetic noise; Uses simulation and experimental results to evaluate algorithms presented and includes real test-bed; Includes case study implementing data fusion algorithms on a remote monitoring framework for sand production in oil pipelines.
Abdelrahman, Mohamed; Kandasamy, Parameshwaran
The main focus of this research is to reduce the risk of a catastrophic response of a feedback control system when some of the feedback data from the system sensors are not reliable, while maintaining a reasonable performance of the control system. In this paper a methodology for integrating multiple sensor fusion into the controller design is presented. The multiple sensor fusion algorithm produces, in addition to the estimate of the measurand, a parameter that measures the confidence in the estimated value. This confidence is integrated as a parameter into the controller to produce fast system response when the confidence in the estimate is high, and a slow response when the confidence in the estimate is low. Conditions for the stability of the system with the developed controller are discussed. This methodology is demonstrated on a cupola furnace model. The simulations illustrate the advantages of the new methodology. PMID:12708539
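The idea of folding a fusion confidence into the controller can be sketched in a few lines. The toy fusion rule, the proportional-control form, and all constants below are assumptions for illustration; they are not the cupola-furnace controller of the paper.

```python
# Sketch of confidence-scheduled control: the fusion step returns an
# estimate plus a confidence in [0, 1]; the confidence scales the
# effective gain, so the loop reacts fast when sensors agree and
# cautiously when they disagree. All forms/constants are illustrative.

def fused_estimate(readings):
    """Toy fusion: mean estimate, with confidence shrinking as the
    sensor readings spread apart."""
    m = sum(readings) / len(readings)
    spread = max(readings) - min(readings)
    confidence = 1.0 / (1.0 + spread)
    return m, confidence

def control(setpoint, readings, k_max=2.0):
    """Proportional action with a confidence-scaled gain."""
    est, conf = fused_estimate(readings)
    k = k_max * conf
    return k * (setpoint - est)

print(control(100.0, [90.0, 90.0]))   # sensors agree: fast correction
print(control(100.0, [80.0, 100.0]))  # sensors disagree: damped correction
```

Stability then hinges on bounding the gain between its slow and fast values, which is the condition the paper discusses.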
Duro, João A.; Padget, Julian A.; Bowen, Chris R.; Kim, H. Alicia; Nassehi, Aydin
Reliable machining monitoring systems are essential for lowering production time and manufacturing costs. Existing expensive monitoring systems focus on prevention/detection of tool malfunctions and provide information for process optimisation by force measurement. An alternative and cost-effective approach is to monitor acoustic emissions (AEs) from machining operations, which act as a robust proxy. The limitations of AEs include high sensitivity to sensor position and cutting parameters. In this paper, a novel multi-sensor data fusion framework is proposed to enable identification of the best sensor locations for monitoring cutting operations, identification of the sensors that provide the best signal, and derivation of signals with an enhanced periodic component. Our experimental results reveal that by utilising the framework, and using only three sensors, signal interpretation improves substantially and the monitoring system reliability is enhanced for a wide range of machining parameters. The framework provides a route to overcoming the major limitations of AE-based monitoring.
Adel Ghanem, PhD
The objective of Phase 1 of this STTR project is to demonstrate a Proof-of-Concept (PoC) of the Geo-Rad system that integrates a location-aware SmartTag (made by ZonTrak) and a radiation detector (developed by LLNL). It also includes the ability to transmit the collected radiation data and location information to the ZonTrak server (ZonService). The collected data is further transmitted to a central server at LLNL (the Fusion Server) to be processed in conjunction with overhead imagery to generate location estimates of nuclear proliferation and radiation sources.
Motion sensors (based on inertial measurement units) are limited by accumulative integration errors arising from sensor bias. This drift can partly be handled with adaptive filtering techniques. The aims were to: (1) compare adaptive filtering techniques used for sensor fusion and (2) evaluate the precision and accuracy for a chosen adaptive filter. To advance the measurement technique in this area, a new modified Kalman filter is developed. For all segments and tests, the modified Kalman filter and a quasi-static sensor fusion algorithm were equally accurate (precision and accuracy ∼2–3°) compared to normalized least mean squares filtering, recursive least-squares filtering and standard Kalman filtering. Differences in accuracy were observed during different tests, especially drift in the internal/external rotation angle. This drift can be minimized if the sensors include magnetometers.
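A minimal relative of the adaptive filters compared above is the complementary filter, which blends drifting gyro integration with a noisy but drift-free inclination reference. The blend factor, rates, and bias value below are illustrative assumptions, not the paper's parameters.

```python
# Complementary-filter sketch: the integrated gyro dominates over short
# horizons, while the inclination reference slowly corrects the drift
# caused by gyro bias. Alpha, dt, and the 1 deg/s bias are illustrative.

def complementary(angle, gyro_rate, incl_angle, dt, alpha=0.98):
    """Blend integrated gyro (short-term) with inclination (long-term)."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * incl_angle

# Gyro carries a constant 1 deg/s bias; the true angle stays at 0 deg.
angle = 0.0
for _ in range(1000):
    angle = complementary(angle, gyro_rate=1.0, incl_angle=0.0, dt=0.01)
print(round(angle, 3))  # drift is bounded rather than accumulating
```

Pure integration over the same 10 s would drift by 10 degrees; the correction term keeps the error bounded, which is the role the magnetometer plays for the rotation axis mentioned in the abstract.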
Zhenhua Li(李振华); Zhongliang Jing(敬忠良); Shaoyuan Sun(孙韶媛)
An algorithm is presented for multi-sensor image fusion using the discrete wavelet frame transform (DWFT). The source images to be fused are first decomposed by DWFT, and the fusion process combines the source coefficients. Before fusion, image segmentation is performed on each source image in order to obtain its region representation, and the salience of each region is calculated. By overlapping the region representations of all the source images, a shared region representation is produced to label all the input images. The fusion process is guided by these region representations: a region match measure of the source images is calculated for each region in the shared region representation. When fusing similar regions, a weighted averaging mode is used; otherwise, a selection mode is used. Experimental results using real data show that the proposed algorithm outperforms traditional pyramid transform based or discrete wavelet transform (DWT) based algorithms in multi-sensor image fusion.
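The select-vs-average rule described above can be sketched per region. The energy-based salience and the normalized-correlation match measure below are simplified stand-ins for the paper's definitions, and the threshold is an assumption.

```python
# Sketch of region-based coefficient fusion: similar regions are
# averaged (weighted by salience), dissimilar regions are resolved by
# selecting the more salient source. Measures are simplified stand-ins.

def fuse_region(coeffs_a, coeffs_b, match_threshold=0.75):
    """Fuse one region's coefficients from two source images."""
    energy_a = sum(c * c for c in coeffs_a)
    energy_b = sum(c * c for c in coeffs_b)
    cross = sum(a * b for a, b in zip(coeffs_a, coeffs_b))
    match = 2.0 * cross / (energy_a + energy_b)  # similarity in [-1, 1]
    if match >= match_threshold:
        # Similar regions: salience-weighted average
        w = energy_a / (energy_a + energy_b)
        return [w * a + (1 - w) * b for a, b in zip(coeffs_a, coeffs_b)]
    # Dissimilar regions: select the more salient (higher-energy) source
    return list(coeffs_a if energy_a >= energy_b else coeffs_b)

print(fuse_region([1.0, 2.0], [1.1, 1.9]))   # similar -> averaged
print(fuse_region([1.0, 2.0], [-1.0, 5.0]))  # dissimilar -> selected
```

In the full algorithm this rule is applied to DWFT subband coefficients region by region over the shared region representation.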
Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W. [Sandia National Labs., Albuquerque, NM (United States); Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L. [New Mexico State Univ., Las Cruces, NM (United States). Electronic Vision Research Lab.
Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.
Vicens Oviedo, Miguel Ángel
Radio-frequency positioning system that uses inertial sensors; the fusion of these two systems improves navigation. During the last few years, different navigation systems with varying degrees of accuracy have appeared, such as LORAN (Long Range Navigation), Decca, and GNSS (Global Navigation Satellite System), the latter being the most widely used nowadays. GNSS, which includes GPS (Global Positioning System) and GLONASS (Global Navigation Satellite System), consis...
Bullock, Michael E.; Miltonberger, Thomas W.; Reinholdsten, Paul A.; Wilson, Kathleen
A method for object recognition using a multisensor model-based approach has been developed. The Sensor Algorithm Research Expert System (SARES) is a Sun workstation-based environment for model-based object recognition algorithm development. SARES is a means to perform research into multiple levels of geometric and scattering models, image and signal feature extraction, hypothesis management, and matching strategies. SARES multisensor fusion allows for multiple geometric representations and decompositions, and sensor location transformations, as well as feature prediction, matching, and evidence accrual. It is shown that the fusion algorithm can exploit the synergistic information contained in IR and synthetic aperture radar (SAR) imagery, yielding increased object recognition accuracy and confidence over single-sensor exploitation alone. The fusion algorithm has the added benefit of reducing the number of computations by virtue of simplified object model combinatorics. That is, the additional sensor information eliminates a large number of the incorrect object hypotheses early in the algorithm. This provides a focus of attention on those object hypotheses which are closest to the correct hypothesis.
Wenyu Zhang; Zhenjiang Zhang
Decision fusion in sensor networks enables sensors to improve classification accuracy while reducing the energy consumption and bandwidth demand for data transmission. In this paper, we focus on the decentralized multi-class classification fusion problem in wireless sensor networks (WSNs) and a new simple but effective decision fusion rule based on belief function theory is proposed. Unlike existing belief function based decision fusion schemes, the proposed approach is compatible with any ty...
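The core operation behind belief-function decision fusion is Dempster's rule of combination, which the proposed scheme builds on. The sketch below combines two sensors' mass functions over two classes; the mass values are invented for illustration ("AB" denotes the ignorance set {A, B}).

```python
# Dempster's rule of combination over focal sets encoded as strings of
# class labels; the empty intersection is conflict mass, renormalized
# away. Mass assignments below are illustrative, not from the paper.

def dempster(m1, m2):
    """Combine two mass functions (dict: focal set -> mass)."""
    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = ''.join(c for c in s1 if c in s2)
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

s1 = {'A': 0.6, 'B': 0.1, 'AB': 0.3}  # sensor 1 leans towards class A
s2 = {'A': 0.5, 'B': 0.2, 'AB': 0.3}  # sensor 2 agrees, less strongly
print(dempster(s1, s2))
```

Agreeing evidence reinforces the belief in class A beyond either sensor's individual mass, which is the effect decision fusion exploits; schemes like the one proposed modify this rule to stay cheap and robust in a WSN setting.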
Daniels, J.; Yeh, J.; Illman, W.; Harri, S.; Kruger, A.; Parashar, M.
A stochastic information fusion methodology is developed to assimilate electrical resistivity tomography, high-frequency ground penetrating radar, mid-range-frequency radar, pneumatic/gas tracer tomography, and hydraulic/tracer tomography to image fractures, characterize hydrogeophysical properties, and monitor natural processes in the vadose zone. The information technology research will develop: 1) mechanisms and algorithms for fusion of large data volumes; 2) parallel adaptive computational engines supporting parallel adaptive algorithms and multi-physics/multi-model computations; 3) adaptive runtime mechanisms for proactive and reactive runtime adaptation and optimization of geophysical and hydrological models of the subsurface; and 4) technologies and infrastructure for remote (pervasive) and collaborative access to computational capabilities for monitoring subsurface processes through interactive visualization tools. The combination of the stochastic fusion approach and information technology can lead to a new level of capability for both hydrologists and geophysicists, enabling them to "see" into the earth at greater depths and resolutions than is possible today. Furthermore, the new computing strategies will make high-resolution and large-scale hydrological and geophysical modeling feasible for the private sector, scientists, and engineers who are unable to access supercomputers, i.e., an effective paradigm for technology transfer.
Elfes, Alberto; Podnar, Gregg W.; Dolan, John M.; Hosler, Jeffrey C.; Ames, Troy J.
The Tele-supervised Adaptive Ocean Sensor Fleet (TAOSF) is a multi-robot science exploration architecture and system that uses a group of robotic boats (the Ocean-Atmosphere Sensor Integration System, or OASIS) to enable in-situ study of ocean surface and subsurface characteristics and the dynamics of such ocean phenomena as coastal pollutants, oil spills, hurricanes, or harmful algal blooms (HABs). The OASIS boats are extended-deployment, autonomous ocean surface vehicles. The TAOSF architecture provides an integrated approach to multi-vehicle coordination and sliding human-vehicle autonomy. One feature of TAOSF is the adaptive re-planning of the activities of the OASIS vessels based on sensor input (smart sensing) and sensorial coordination among multiple assets. The architecture also incorporates Web-based communications that permit control of the assets over long distances and the sharing of data with remote experts. Autonomous hazard and assistance detection allows the automatic identification of hazards that require human intervention to ensure the safety and integrity of the robotic vehicles, or of science data that require human interpretation and response. Also, the architecture is designed for science analysis of acquired data in order to perform an initial onboard assessment of the presence of specific science signatures of immediate interest. TAOSF integrates and extends five subsystems developed by the participating institutions: Emergent Space Technologies, Wallops Flight Facility, NASA's Goddard Space Flight Center (GSFC), Carnegie Mellon University, and Jet Propulsion Laboratory (JPL). The OASIS Autonomous Surface Vehicle (ASV) system, which includes the vessels as well as the land-based control and communications infrastructure developed for them, controls the hardware of each platform (sensors, actuators, etc.), and also provides a low-level waypoint navigation capability. The Multi-Platform Simulation Environment from GSFC is a surrogate...
Robotic systems for remediation of hazardous waste sites must be highly reliable to avoid equipment failures and subsequent possible exposure of personnel to hazardous environments. Safe, efficient cleanup operations also require accurate, complete knowledge of the task space. This paper presents progress made on an 18-month program to meet these needs. To enhance robot reliability, a conceptual design of a monitoring and diagnostic system is being developed to predict the onset of mechanical failure modes, provide maximum lead time to make operational changes or repairs, and minimize the occurrence of on-site breakdowns. To ensure safe operation, a comprehensive software package is being developed that will fuse data from multiple surface mapping sensors and poses so as to reduce the error effects in individual data points and provide accurate 3-D maps of a work space.
Šerić, Ljiljana; Stipaničev, Darko; Štula, Maja
Inspired by the formal theory of perception and sensor network technology, we have introduced the idea of an observer network as a reliable framework for data and information fusion. Our ideas have been successfully tested in the case of a forest fire observer network. The observer network was implemented using multi-agent technology. A special multi-agent shell was designed for this purpose, having desirable software system features like modularity and flexibility. The system was implemented in nu...
Cvejic, N; Bull, DR; Canagarajah, CN
We present a novel image fusion algorithm based on ICA that has an improved performance over sensor networks. It employs segmentation to determine the most important regions in the input images and consequently fuses the ICA coefficients from the given regions. Sparse coding of the coefficients in the ICA domain is used to minimize noise transferred from input images into the fused output. Experimental results confirm that the proposed method outperforms other state-of-the-art methods in the sen...
Mohammad Momani; Subhash Challa; Rami Alhmouz
This paper introduces a new Bayesian fusion algorithm to combine more than one trust component (data trust and communication trust) to infer the overall trust between nodes. This research work proposes that one trust component is not enough when deciding on whether or not to trust a specific node in a wireless sensor network. This paper discusses and analyses the results from the communication trust component (binary) and the data trust component (continuous) and proves that either component ...
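One common way to formalize the two components is to model the binary communication outcomes as a Beta reputation and fold in the continuous data-trust score. The sketch below follows that spirit; the Beta-posterior form is standard, but the weighted combination and all values are assumptions, not the paper's algorithm.

```python
# Sketch of combining a binary trust component (communication
# success/failure counts, via a Beta posterior) with a continuous data
# trust score in [0, 1]. The weighting scheme is an assumption.

def beta_trust(successes, failures):
    """Expected trust from binary interaction counts: the mean of a
    Beta(successes+1, failures+1) posterior."""
    return (successes + 1.0) / (successes + failures + 2.0)

def overall_trust(comm_success, comm_fail, data_trust, w_comm=0.5):
    """Blend communication trust with an existing data-trust score."""
    t_comm = beta_trust(comm_success, comm_fail)
    return w_comm * t_comm + (1.0 - w_comm) * data_trust

# A node that communicates reliably but reports suspicious sensor data
# ends up with only middling overall trust:
print(round(overall_trust(48, 2, data_trust=0.2), 3))
```

This illustrates the paper's point: judged on communication alone the node looks highly trustworthy, and only the combined score exposes it.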
Asharif, Mohammad Reza; Moshiri, Behzad; HoseinNezhad, Reza
In any autonomous mobile robot, one of the most important issues to be designed and implemented is environment perception. In this paper, a new approach is formulated in order to perform sensory data integration for generation of an occupancy grid map of the environment. This method is an extended version of the Bayesian fusion method for independent sources of information. The performance of the proposed method of fusion and its sensitivity are discussed. Map building simulation for a cylindrical robot with eight ultrasonic sensors and mapping implementation for a Khepera robot have been separately tried in simulation and experimental works. A new neural structure is introduced for conversion of proximity data that are given by Khepera IR sensors to occupancy probabilities. Path planning experiments have also been applied to the resulting maps. For each map, two factors are considered and calculated: the fitness and the augmented occupancy of the map with respect to the ideal map. The length and the least distance to obstacles were the other two factors that were calculated for the routes that are resulted by path planning experiments. Experimental and simulation results show that by using the new fusion formulas, more informative maps of the environment are obtained. By these maps more appropriate routes could be achieved. Actually, there is a tradeoff between the length of the resulting routes and their safety and by choosing the proper fusion function, this tradeoff is suitably tuned for different map building applications. PMID:12160343
Parsons, Matthew M; Krapp, Holger G; Laughlin, Simon B
Animal locomotion often depends upon stabilization reflexes that use sensory feedback to maintain trajectories and orientation. Such stabilizing reflexes are critically important for the blowfly, whose aerodynamic instability permits outstanding maneuverability but increases the demands placed on flight control. Flies use several sensory systems to drive reflex responses, and recent studies have provided access to the circuitry responsible for combining and employing these sensory inputs. We report that lobula plate VS neurons combine inputs from two optical sensors, the ocelli and the compound eyes. Both systems deliver essential information on in-flight rotations, but our neuronal recordings reveal that the ocelli encode this information in three axes, whereas the compound eyes encode in nine. The difference in dimensionality is reconciled by tuning each VS neuron to the ocellar axis closest to its compound eye axis. We suggest that this simple projection combines the speed of the ocelli with the accuracy of the compound eyes without compromising either. Our findings also support the suggestion that the coordinates of sensory information processing are aligned with axes controlling the natural modes of the fly's flight to improve the efficiency with which sensory signals are transformed into appropriate motor commands. PMID:20303270
Lambrecht, Stefan; Nogueira, Samuel L; Bortole, Magdo; Siqueira, Adriano A G; Terra, Marco H; Rocon, Eduardo; Pons, José L
This paper presents the comparison between cooperative and local Kalman Filters (KF) for estimating the absolute segment angle, under two calibration conditions: a simplified calibration that can be replicated in most laboratories, and a complex calibration similar to that applied by commercial vendors. The cooperative filters use information either from all inertial sensors attached to the body (Matricial KF), or from the inertial sensors and the potentiometers of an exoskeleton (Markovian KF). A one-minute walking trial of a subject walking with a 6-DoF exoskeleton was used to assess the absolute segment angle of the trunk, thigh, shank, and foot. The results indicate that regardless of the segment and filter applied, the more complex calibration always results in a significantly better performance compared to the simplified calibration. The interaction between filter and calibration suggests that when the quality of the calibration is unknown, the Markovian KF is recommended. Applying the complex calibration, the Matricial and Markovian KF perform similarly, with average RMSE below 1.22 degrees. Cooperative KFs perform better than, or at least as well as, local KFs; we therefore recommend using cooperative KFs instead of local KFs for control or analysis of walking. PMID:26901198
Peng Geng; Shuaiqi Liu; Shanna Zhuang
Medical image fusion plays an important role in diagnosis and treatment of diseases such as image-guided radiotherapy and surgery. The modified local contrast information is proposed to fuse multimodal medical images. Firstly, the adaptive manifold filter is introduced into filtering source images as the low-frequency part in the modified local contrast. Secondly, the modified spatial frequency of the source images is adopted as the high-frequency part in the modified local contrast. Finally,...
The data fusion problem is one of the research hotspots in wireless sensor networks. To address the excessive energy consumption of existing fusion methods, a data fusion scheme based on minimum energy consumption is proposed according to the theory of compressive sensing. This paper first analyzes the impact of different fusion modes on data collection performance, considers the impact of routing and mixed Compressive Sensing (CS) fusion on energy optimization, and models the data fusion p...
Yang, R; Er, PV; Wang, Z.; Tan, KK
A radial basis function (RBF) neural network approach with a fusion of multiple signal candidates in precision motion control is studied in this paper. Sensor weightages are assigned to sensor measurements according to the selector attributes and approximated using RBF neural network in multi-sensor fusion. A specific application towards precision motion control of a linear motor system using a magnetic encoder and a soft position sensor in conjunction with an analog velocity sensor is demons...
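The weightage idea above can be sketched with Gaussian RBFs mapping an operating condition to sensor weights. The choice of speed as the condition, the centers, widths, and the low-speed/high-speed preference below are all illustrative assumptions, not the paper's trained network.

```python
# Toy RBF weighting of two position sensors: Gaussian activations over
# an operating condition (speed) yield normalized sensor weightages.
# Centers, widths, and the preference pattern are illustrative.

import math

def rbf(x, center, width=1.0):
    """Gaussian radial basis function activation."""
    return math.exp(-((x - center) / width) ** 2)

def fuse(speed, encoder_pos, soft_pos):
    """Weight the magnetic encoder at low speed and the soft position
    sensor at high speed, normalizing activations to sum to one."""
    w_enc = rbf(speed, center=0.0, width=2.0)
    w_soft = rbf(speed, center=5.0, width=2.0)
    total = w_enc + w_soft
    return (w_enc * encoder_pos + w_soft * soft_pos) / total

print(fuse(0.0, 10.0, 12.0))  # low speed: encoder reading dominates
print(fuse(5.0, 10.0, 12.0))  # high speed: soft sensor dominates
```

In the paper's setting the RBF network approximates the weightage as a function of the selector attributes rather than being hand-placed as here.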
Autonomous driving vehicles introduce challenging research areas combining different disciplines. One challenge is the detection of obstacles with different sensors and the combination of information to generate a comprehensive representation of the environment, which can be used for path planning and decision making. The sensor fusion is demonstrated using two Velodyne multi-beam laser scanners, but it is possible to extend the proposed sensor fusion framework for different sensor types. Sensor...
Wilson, Dean A.
In the Command and Control mission, new technologies such as 'sensor fusion' are designed to help reduce operator workload and increase situational awareness. This thesis explored the tracking characteristics of diverse sensors and sources of data and their contributions to a fused tactical picture. The fundamental building blocks of any sensor fusion algorithm are the tracking algorithms associated with each of the sensors on the sensor platform. In support of this study the MATLAB program 'f...
Perlovsky, Leonid I.; Weijers, Bertus; Mutz, Chris W.
Target detection, tracking, and sensor fusion are complicated problems, which usually are performed sequentially: first detecting targets, then tracking, then fusing multiple sensors reduces computations. This procedure, however, is inapplicable to difficult targets which cannot be reliably detected using individual sensors, on individual scans or frames. In such more complicated cases one has to perform the functions of fusing, tracking, and detecting concurrently. This has often led to prohibitive combinatorial complexity and, as a consequence, to sub-optimal performance as compared to the information-theoretic content of all the available data. It is well appreciated that in this task the human mind is qualitatively far superior to existing mathematical methods of sensor fusion; however, the human mind is limited in the amount of information and speed of computation it can cope with. Therefore, research efforts have been devoted toward incorporating "biological lessons" into smart algorithms, yet success has been limited. Why is this so, and how can existing limitations be overcome? The fundamental reasons for current limitations are analyzed and a potentially breakthrough research and development effort is outlined. We utilize the way our mind combines emotions and concepts in the thinking process and present a mathematical approach to accomplishing this on current-technology computers. The presentation summarizes the difficulties encountered by intelligent systems over the last 50 years related to combinatorial complexity, analyzes the fundamental limitations of existing algorithms and neural networks, and relates them to the type of logic underlying the computational structure: formal, multivalued, and fuzzy logic. A new concept of dynamic logic is introduced along with algorithms capable of pulling together all the available information from multiple sources. This new mathematical technique, like our brain, combines conceptual understanding with
Pietropaoli, Bastien; Dominici, Michele; Weis, Frédéric
Computing context is a major subject of interest in smart homes. In this paper, we present how we adapted a general purpose multi-level architecture for the computation of contextual data to a prototype of smart home. After a quick explanation of why we use different methods at different levels of abstraction, we focus more on the low-level data fusion. To do this, we present the basics of belief functions theory and how we apply this theory to sensors to obtain stable abstractions. By doing ...
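The low-level fusion with belief functions described above rests on Dempster's rule of combination. A minimal sketch follows; the occupancy scenario, sensor names, and mass assignments are illustrative assumptions, not values from the paper:

```python
# Dempster's rule of combination for two mass functions over the same
# frame of discernment. Focal elements are frozensets; masses sum to 1.
def dempster_combine(m1, m2):
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:  # non-empty intersection: compatible evidence
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:      # empty intersection: conflicting evidence
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    k = 1.0 - conflict  # renormalization constant
    return {s: v / k for s, v in combined.items()}

# Two smart-home sensors reporting on whether a room is occupied
# (hypothetical masses; theta = full ignorance).
occ, emp = frozenset({"occupied"}), frozenset({"empty"})
theta = occ | emp
m_motion = {occ: 0.7, theta: 0.3}
m_door = {occ: 0.5, emp: 0.2, theta: 0.3}
fused = dempster_combine(m_motion, m_door)
```

Combining the two sources concentrates mass on "occupied" while renormalizing away the conflicting 0.7 x 0.2 product, which is how belief-function fusion yields stable abstractions from partially disagreeing sensors.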
Juntao Fei; Hongfei Ding
This paper presents an adaptive control approach for Micro-Electro-Mechanical Systems (MEMS) z-axis gyroscope sensor. The dynamical model of MEMS gyroscope sensor is derived and adaptive state tracking control for MEMS gyroscope is developed. The proposed adaptive control approaches can estimate the angular velocity and the damping and stiffness coefficients including the coupling terms due to the fabrication imperfection. The stability of the closed-loop systems is established with the propo...
Chunlong Yao; Wei Pan; Lan Shen; Xu Li
With the development of image sensor technology, multi-sensor image fusion technology emerged and has been widely used in the fields of military surveillance, medical diagnosis, remote sensing, intelligent robots, and so on. However, current image fusion technology mainly focuses on gray images; color image fusion is rarely studied. Because a color image contains more information than a gray image, research on color image fusion technology is becoming more and more urgent. In...
This paper addresses the problem of designing a robust state estimator for multiple sensor networks with uncertain models and noisy measurements. Multi-sensor data fusion using both measurement-fusion and state-vector-fusion structures based on the Kalman filter is introduced. The standard Kalman filter requires an accurate system model. In order to obtain accurate information when modelling signals and sensors, an information-form robust Kalman filter based on the Krein-space approach is proposed. Also, a...
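The measurement-fusion structure (fuse the sensors first, then run one filter update) can be sketched for the scalar case as below; the numbers are illustrative, and the paper's robust Krein-space machinery is not reproduced:

```python
def fuse_measurements(z1, r1, z2, r2):
    """Minimum-variance fusion of two unbiased measurements of the
    same quantity: inverse-variance weighting."""
    w1, w2 = 1.0 / r1, 1.0 / r2
    return (w1 * z1 + w2 * z2) / (w1 + w2), 1.0 / (w1 + w2)

def kalman_update(x, p, z, r):
    """Scalar Kalman measurement update (H = 1)."""
    k = p / (p + r)  # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

# Measurement fusion: combine both sensors, then one filter update.
z, r = fuse_measurements(10.0, 1.0, 12.0, 1.0)   # -> (11.0, 0.5)
x, p = kalman_update(0.0, 100.0, z, r)
```

The fused pseudo-measurement has a smaller variance than either sensor alone; the alternative state-vector-fusion structure would instead run one filter per sensor and combine the resulting state estimates.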
Ghersi, I.; Mariño, M.; Miralles, M. T.
Human-machine interfaces have evolved, benefiting from the growing access to devices with superior, embedded signal-processing capabilities, as well as through new sensors that allow the estimation of movements and gestures, resulting in increasingly intuitive interfaces. In this context, sensor fusion for the estimation of the spatial orientation of body segments allows to achieve more robust solutions, overcoming specific disadvantages derived from the use of isolated sensors, such as the sensitivity of magnetic-field sensors to external influences, when used in uncontrolled environments. In this work, a method for the combination of image-processing data and angular-velocity registers from a 3D MEMS gyroscope, through a Discrete-time Kalman Filter, is proposed and deployed as an alternate user interface for mobile devices, in which an on-screen pointer is controlled with head movements. Results concerning general performance of the method are presented, as well as a comparative analysis, under a dedicated test application, with results from a previous version of this system, in which the relative-orientation information was acquired directly from MEMS sensors (3D magnetometer-accelerometer). These results show an improved response for this new version of the pointer, both in terms of precision and response time, while keeping many of the benefits that were highlighted for its predecessor, giving place to a complementary method for signal acquisition that can be used as an alternative-input device, as well as for accessibility solutions.
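The paper's tool is a discrete-time Kalman filter; as a simpler stand-in, a complementary filter shows the same idea of blending gyroscope rates with vision-derived angles (the 0.98 gain, time step, and signal values below are assumptions):

```python
def complementary_filter(angle, gyro_rate, vision_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth but drifting) with the
    vision-derived absolute angle (noisy but drift-free)."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * vision_angle

# Head held still at 10 degrees: gyro reads ~0, vision reads ~10.
angle = 0.0
for _ in range(300):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 vision_angle=10.0, dt=0.01)
```

The estimate converges to the vision angle while short-term motion would be tracked almost entirely by the gyro term, which is the robustness argument made in the abstract.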
This paper proposes a profile-based sensing framework for adaptive sensor systems based on models that relate possibly heterogeneous sensor data and profiles generated by the models to detect events. With these concepts, three phases for building the sensor systems are extracted from two examples: a combustion control sensor system for an automobile engine, and a sensor system for home security. The three phases are: modeling, profiling, and managing trade-offs. Designing and building a sensor system involves mapping the signals to a model to achieve a given mission.
Zhang Zhen; Xu Lizhong; Harry Hua Li; Shi Aiye; Han Hua; Wang Huibin
In water regime monitoring applications, incompleteness and inaccuracy of sensor data may directly affect the reliability of the acquired monitoring information. Based on the spatial and temporal correlation of water regime monitoring information, this paper addresses this issue and proposes an information fusion method to implement data rectification. An improved Back Propagation (BP) neural network is used to perform data fusion on the hardware platform of a station unit, which takes a Field-Programmable Gate Array (FPGA) as the core component. In order to verify the effectiveness, five measurements including water level, discharge, and velocity are selected from three different points in a water regime monitoring station. The simulation results show that this method can rectify random errors as well as gross errors significantly.
With the development of computer technology and informatization, network, sensor, and communication technologies have become three essential components of the information industry. As the core technique of sensor applications, signal processing largely determines sensor performance. For this reason, the study of signal-processing modes is very important to sensors and to sensor-network applications. In this paper, we introduce a new sensor coarse-signal processing mode based on an adaptive genetic algorithm. The algorithm selects crossover and mutation probabilities adaptively and applies multiple operators alternately to optimize the search process, so that the global optimum can be obtained. Based on the proposed algorithm, using an auto-correlative characteristic-parameter extraction method, a smaller test error is achieved when processing interference signals in the sensor coarse-signal processing mode. We evaluate the proposed approach on a set of data. The experimental results show that the proposed approach is able to improve performance in different experimental settings.
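Adaptive selection of crossover and mutation probabilities can be sketched with a Srinivas-Patnaik-style schedule; this is an assumed variant (the abstract does not give the paper's exact formulas), driving a toy real-valued GA:

```python
import random

def adaptive_probs(f, f_avg, f_max, k1=1.0, k2=0.5, k3=1.0, k4=0.5):
    """Adaptive crossover (pc) and mutation (pm) probabilities:
    above-average individuals are disturbed less, below-average more."""
    if f < f_avg:
        return k3, k4
    if f_max == f_avg:            # fully converged population
        return 0.0, 0.0
    scale = (f_max - f) / (f_max - f_avg)
    return k1 * scale, k2 * scale

def fitness(x):                   # toy objective: maximum at x = 3
    return -(x - 3.0) ** 2

random.seed(7)
pop = [random.uniform(-10.0, 10.0) for _ in range(30)]
for _ in range(80):
    fs = [fitness(x) for x in pop]
    f_avg, f_max = sum(fs) / len(fs), max(fs)
    new_pop = []
    while len(new_pop) < len(pop):
        parent = max(random.sample(pop, 2), key=fitness)   # tournament
        pc, pm = adaptive_probs(fitness(parent), f_avg, f_max)
        child = parent
        if random.random() < pc:
            mate = max(random.sample(pop, 2), key=fitness)
            child = 0.5 * (child + mate)                   # blend crossover
        if random.random() < pm:
            child += random.gauss(0.0, 0.5)                # Gaussian mutation
        new_pop.append(child)
    pop = new_pop
best = max(pop, key=fitness)
```

Because the best individuals receive pc = pm = 0, the schedule protects good solutions while keeping strong exploration pressure on the rest of the population.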
One of the biggest challenges currently facing the robotics field is sensor data fusion. Unmanned robots carry many sophisticated sensors including visual and infrared cameras, radar, laser range finders, chemical sensors, accelerometers, gyros, and global positioning systems. By effectively fusing the data from these sensors, a robot would be able to form a coherent view of its world that could then be used to facilitate both autonomous and intelligent operation. Another distinct fusion problem is that of fusing data from teammates with data from onboard sensors. If an entire team of vehicles has the same worldview they will be able to cooperate much more effectively. Sharing worldviews is made even more difficult if the teammates have different sensor types. The final fusion challenge the robotics field faces is that of fusing data gathered by robots with data gathered by human teammates (soft sensors). Humans sense the world completely differently from robots, which makes this problem particularly difficult. The advantage of fusing data from humans is that it makes more information available to the entire team, thus helping each agent to make the best possible decisions. This thesis presents a system for fusing data from multiple unmanned aerial vehicles, unmanned ground vehicles, and human observers. The first issue this thesis addresses is that of centralized data fusion. This is a foundational data fusion issue, which has been very well studied. Important issues in centralized fusion include data association, classification, tracking, and robotics problems. Because these problems are so well studied, this thesis does not make any major contributions in this area, but does review it for completeness. The chapter on centralized fusion concludes with an example unmanned aerial vehicle surveillance problem that demonstrates many of the traditional fusion methods. The second problem this thesis addresses is that of distributed data fusion. Distributed data fusion
Agogino, Alice; Goebel, Kai; Alag, Sanam
This study reports on a method to accomplish sensor validation and fusion in Intelligent Transportation Systems (ITS). The method is based on probabilistic and fuzzy techniques that express a confidence in the sensor data and take into account environmental factors and the state of the system. Sensor data fusion uses the confidence assigned to each sensor reading and integrates them into one reading. Noise and failure are filtered from the data and lead to a safety improvement in ITS.
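The confidence-weighted integration of sensor readings can be sketched as follows; the triangular membership function and its parameters are illustrative stand-ins for the paper's probabilistic/fuzzy confidence model:

```python
def confidence(reading, expected, tolerance):
    """Triangular fuzzy membership: 1 at the expected value, falling
    linearly to 0 at +/- tolerance."""
    return max(0.0, 1.0 - abs(reading - expected) / tolerance)

def fuse(readings, expected, tolerance):
    """Confidence-weighted average; readings with zero confidence
    (e.g. failed sensors) are filtered out automatically."""
    pairs = [(r, confidence(r, expected, tolerance)) for r in readings]
    total = sum(c for _, c in pairs)
    if total == 0.0:
        raise ValueError("no sensor reading is trusted")
    return sum(r * c for r, c in pairs) / total

# Two healthy sensors near 50 and one failed sensor reading 80.
fused = fuse([50.1, 50.3, 80.0], expected=50.0, tolerance=10.0)
```

The outlier at 80 receives zero confidence and contributes nothing, so the fused reading stays near the consistent pair, illustrating the noise/failure filtering claimed in the abstract.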
Daud, Taher; Stoica, Adrian; Tyson, Thomas; Li, Wei-te; Fabunmi, James
The paper presents the concept and initial tests from the hardware implementation of a low-power, high-speed reconfigurable sensor fusion processor. The Extended Logic Intelligent Processing System (ELIPS) processor is developed to seamlessly combine rule-based systems, fuzzy logic, and neural networks to achieve parallel fusion of sensor data in compact, low-power VLSI. The first demonstration of the ELIPS concept targets interceptor functionality; other applications, mainly in robotics and autonomous systems, are considered for the future. The main assumption behind ELIPS is that fuzzy, rule-based, and neural forms of computation can serve as the main primitives of an "intelligent" processor. Thus, in the same way classic processors are designed to optimize the hardware implementation of a set of fundamental operations, ELIPS is developed as an efficient implementation of computational intelligence primitives, and relies on a set of fuzzy set, fuzzy inference, and neural modules built in programmable analog hardware. The hardware programmability allows the processor to reconfigure into different machines, taking the most efficient hardware implementation during each phase of information processing. Following software demonstrations on several interceptor data sets, three important ELIPS building blocks (a fuzzy set preprocessor, a rule-based fuzzy system, and a neural network) have been fabricated in analog VLSI hardware and demonstrated microsecond processing times.
Timothy R. McJunkin; Milos Manic
Tomography, used to create images of the internal properties and features of an object from phased-array ultrasonics, is improved through many sophisticated methods of post-processing of data. One approach used to improve tomographic results is to prescribe the collection of more data, from different points of view, so that data fusion has a richer data set to work from. This approach can lead to a rapid increase in the data that must be stored and processed, and it does not necessarily yield the needed data. This article describes a novel approach that utilizes the acquired data as a basis for adapting the sensor's focusing parameters to locate features in the material more precisely: specifically, two evolutionary methods of autofocusing on a returned signal are coupled with derivations of the formulas for spatially locating the feature. Test results of the two novel methods of evolutionary-based focusing (EBF) illustrate the improved signal strength and the correction of feature position using the optimized focal timing parameters, called Focused Delay Identification (FoDI).
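The focal-timing side of phased-array focusing can be sketched with a standard delay law; the linear-array geometry, element positions, and the sound speed for steel below are assumed values, not the paper's:

```python
import math

def focal_delays(elements, focus, c=5900.0):
    """Per-element firing delays (seconds) so that all wavefronts
    arrive at `focus` simultaneously. `elements` and `focus` are
    (x, z) coordinates in metres; c is the longitudinal sound speed
    (value for steel, assumed)."""
    dists = [math.dist(e, focus) for e in elements]
    d_max = max(dists)
    # The farthest element fires first (zero delay); nearer elements wait.
    return [(d_max - d) / c for d in dists]

# Three-element linear array focusing 20 mm below its centre.
delays = focal_delays([(-0.01, 0.0), (0.0, 0.0), (0.01, 0.0)],
                      focus=(0.0, 0.02))
```

An evolutionary autofocus method like the EBF described above would perturb these timing parameters and score each candidate by the strength of the returned signal.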
Bein, Doina; Madan, Bharat B.; Phoha, Shashi; Rajtmajer, Sarah; Rish, Anna
Given a specific scenario for the border control problem, we propose a dynamic data-driven adaptation of the associated sensor network via embedded software agents which make sensor network control, adaptation, and collaboration decisions based on the contextual information value of competing data provided by different multi-modal sensors. We further propose the use of influence diagrams to guide data-driven decision making in selecting the appropriate action or course of action that maximizes a given utility function, by designing a sensor-embedded software agent that uses an influence diagram to decide whether or not to engage higher-level sensors for accurately detecting human presence in the region. The overarching goal of the sensor system is to increase the probability of target detection and classification and reduce the rate of false alarms. The proposed decision support software agent is validated experimentally on a laboratory testbed for multiple border control scenarios.
Lin, X.; Zhang, S. Q.; Ebtehaj, M.
Data assimilation (DA) based on estimation theories has been a powerful tool to extract information from a wide range of data sources to produce optimal estimates of physical parameters. In operational NWP applications, the available information essentially consists of observations, the physical laws governing the dynamical and physical processes of the atmosphere, as well as the associated uncertainties of these information sources. The DA framework can be adapted to the multi-sensor multi-scale data fusion problem, in which the primary goal is not to define the initial condition of a forecast as in operational DA systems, but to produce an estimate of the physical field of interest as accurate as possible. In this work we explore the DA framework in an application of combining multi-sensor multi-scale precipitation data to produce an integrated observation-based precipitation analysis. Similar to a DA system, an objective function is optimized with minimization of analysis error variance, provided that the error characteristics for each data source are estimated and described by its error covariance. Wavelet-transform is employed to obtain scale-decomposed and sparse representation of signal distribution. We present a prototype data fusion system including the estimation/fusion algorithm based on data assimilation methodologies, the estimation of uncertainty variance database for satellite retrievals, and the data registration for different platforms. A set of case studies such as hurricanes and mesoscale convective complexes are used to evaluate the general paradigm of the data fusion system and to investigate the impact from individual data source, different spatial resolutions and observation error distributions.
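The variance-minimizing objective described above has the classic optimal-interpolation (analysis-update) form. A minimal sketch on a toy two-gridpoint field with one observation follows; all matrices and numbers are illustrative assumptions:

```python
import numpy as np

def analysis_update(xb, B, y, H, R):
    """Optimal interpolation: xa = xb + K (y - H xb), with gain
    K = B H^T (H B H^T + R)^{-1} minimizing analysis error variance."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    xa = xb + K @ (y - H @ xb)
    A = (np.eye(len(xb)) - K @ H) @ B   # analysis error covariance
    return xa, A

# Two grid points with correlated background error; one observation
# of the first point only.
xb = np.array([1.0, 2.0])               # background (first guess)
B = np.array([[1.0, 0.5], [0.5, 1.0]])  # background error covariance
H = np.array([[1.0, 0.0]])              # observation operator
R = np.array([[0.25]])                  # observation error covariance
y = np.array([2.0])                     # observation
xa, A = analysis_update(xb, B, y, H, R) # xa is approximately [1.8, 2.4]
```

Note how the unobserved second point is also corrected, through the off-diagonal covariance in B; in the multi-sensor precipitation setting, the per-source error covariances play exactly this role in spreading information across scales.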
Schenker, Paul S. (Editor)
The volume on data fusion from multiple sources discusses fusing multiple views, temporal analysis and 3D motion interpretation, sensor fusion and eye-to-hand coordination, and integration in human shape perception. Attention is given to surface reconstruction, statistical methods in sensor fusion, fusing sensor data with environmental knowledge, computational models for sensor fusion, and evaluation and selection of sensor fusion techniques. Topics addressed include the structure of a scene from two and three projections, optical flow techniques for moving target detection, tactical sensor-based exploration in a robotic environment, and the fusion of human and machine skills for remote robotic operations. Also discussed are K-nearest-neighbor concepts for sensor fusion, surface reconstruction with discontinuities, a sensor-knowledge-command fusion paradigm for man-machine systems, coordinating sensing and local navigation, and terrain map matching using multisensing techniques for applications to autonomous vehicle navigation.
The structural layers and methods of multi-sensor information fusion technology are analysed, and its application in fault diagnosis of hydraulic systems is discussed. Aiming at hydraulic systems, a model of a hydraulic fault diagnosis system based on multi-sensor information fusion technology is presented. By choosing and implementing the information fusion method reasonably, the model can fuse and calculate various fault characteristic parameters in the hydraulic system effectively and provide more valuable results for fault diagnosis of hydraulic systems.
Junhai Luo; Tao Li
We study distributed detection and fusion in sensor networks with bathtub-shaped failure (BSF) rates for sensors which may or may not send data to the Fusion Center (FC). The reliability of semiconductor devices is usually represented by the failure rate curve (the "bathtub curve"), which can be divided into the three following regions: initial failure period, random failure period, and wear-out failure period. Considering the possibility of failed sensors which still work but in a ...
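Decision fusion at the FC of the kind described is classically done with the Chair-Varshney log-likelihood-ratio rule; the sketch below uses that standard rule (not necessarily the rule derived in the paper) with assumed per-sensor operating points:

```python
import math

def chair_varshney(decisions, pd, pf, log_prior_ratio=0.0):
    """Fuse independent binary local decisions (1 = target present),
    given each sensor's detection probability pd and false-alarm
    probability pf. Declares 'present' when the fused LLR is positive."""
    llr = log_prior_ratio
    for u, d, f in zip(decisions, pd, pf):
        llr += math.log(d / f) if u == 1 else math.log((1 - d) / (1 - f))
    return 1 if llr > 0 else 0

# Three identical sensors with pd = 0.9, pf = 0.1 (assumed values).
pd, pf = [0.9] * 3, [0.1] * 3
fused = chair_varshney([1, 1, 0], pd, pf)   # majority says present -> 1
```

Sensors in different regions of the bathtub curve would simply carry different (pd, pf) pairs, so the rule automatically down-weights decisions from degraded sensors.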
Xiang He; Aloi, Daniel N.; Jia Li
Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph struct...
Otazu Porter, Xavier; González-Audicana, María; Fors Aldrich, Octavi; Núñez de Murga, Jorge, 1955-
Usual image fusion methods inject features from a high spatial resolution panchromatic sensor into every low spatial resolution multispectral band trying to preserve spectral signatures and improve spatial resolution to that of the panchromatic sensor. The objective is to obtain the image that would be observed by a sensor with the same spectral response (i.e., spectral sensitivity and quantum efficiency) as the multispectral sensors and the spatial resolution of the panchromatic sensor. But ...
Susstrunk, Sabine E.; Holm, Jack M.; Finlayson, Graham D.
Chromatic adaptation transforms are used in imaging systems to map image appearance to colorimetry under different illumination sources. In this paper, the performance of different chromatic adaptation transforms (CATs) is compared with the performance of transforms based on RGB primaries that have been investigated in relation to standard color spaces for digital still camera characterization and image interchange. The chromatic adaptation transforms studied are von Kries, Bradford, Sharp, and CMCCAT2000. The RGB primaries investigated are ROMM, ITU-R BT.709, and 'prime wavelength' RGB. The chromatic adaptation model used is a von Kries model that linearly scales post-adaptation cone response with illuminant-dependent coefficients. The transforms were evaluated using 16 sets of corresponding color data. The actual and predicted tristimulus values were converted to CIELAB, and three different error prediction metrics, ΔE_Lab, ΔE_CIE94, and ΔE_CMC(1:1), were applied to the results. One-tailed Student's t-tests for matched pairs were calculated to determine whether the variations in errors are statistically significant. For the given corresponding color data sets, the traditional chromatic adaptation transforms, Sharp CAT and CMCCAT2000, performed best. However, some transforms based on RGB primaries also exhibit good chromatic adaptation behavior, leading to the conclusion that white-point-independent RGB spaces for image encoding can be defined. This conclusion holds only if the linear von Kries model is considered adequate to predict chromatic adaptation behavior.
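The linear von Kries model used in the study (diagonal scaling of post-adaptation responses by illuminant-dependent coefficients) can be sketched as below, with the standard Bradford matrix as the sharpened-sensor transform; the D65 and illuminant-A white points are illustrative values:

```python
import numpy as np

# Bradford "sharpened cone" matrix (XYZ -> RGB-like sensor space).
M_BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def von_kries_adapt(xyz, white_src, white_dst, M=M_BRADFORD):
    """Linear von Kries adaptation: transform to sensor space, scale
    each channel by the destination/source white-point ratio, and
    transform back to XYZ."""
    r_src = M @ white_src
    r_dst = M @ white_dst
    D = np.diag(r_dst / r_src)          # illuminant-dependent diagonal
    return np.linalg.inv(M) @ D @ M @ xyz

d65 = np.array([0.9505, 1.0000, 1.0890])    # XYZ white points (approx.)
ill_a = np.array([1.0985, 1.0000, 0.3558])
adapted = von_kries_adapt(d65, d65, ill_a)  # source white maps to dest white
```

By construction the source white maps exactly onto the destination white; the comparison in the paper is about how well the same diagonal scaling predicts non-white corresponding colors for each choice of M.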
Geng, Peng; Liu, Shuaiqi; Zhuang, Shanna
Medical image fusion plays an important role in the diagnosis and treatment of diseases, such as image-guided radiotherapy and surgery. The modified local contrast information is proposed to fuse multimodal medical images. Firstly, the adaptive manifold filter is introduced to filter the source images as the low-frequency part of the modified local contrast. Secondly, the modified spatial frequency of the source images is adopted as the high-frequency part of the modified local contrast. Finally, the pixel with the larger modified local contrast is selected into the fused image. The presented scheme outperforms the guided-filter method in the spatial domain, the dual-tree complex wavelet transform-based method, the nonsubsampled contourlet transform-based method, and four classic fusion methods in terms of visual quality. Furthermore, the mutual information values obtained by the presented method are on average 55%, 41%, and 62% higher than those of the three methods, and the values of the edge-based similarity measure by the presented method are on average 13%, 33%, and 14% higher than those of the three methods for the six pairs of source images. PMID:26664494
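The selection step (keep, per pixel, whichever source has the larger local contrast) can be sketched as below; the 3x3 box-filter contrast is a simplified stand-in for the paper's adaptive-manifold-filter-based modified local contrast:

```python
import numpy as np

def local_contrast(img, eps=1e-6):
    """Crude local contrast: |pixel - 3x3 local mean| / (local mean + eps).
    (Stand-in for the modified local contrast of the paper.)"""
    img = img.astype(float)
    h, w = img.shape
    pad = np.pad(img, 1, mode="edge")
    # 3x3 box mean computed from the nine shifted windows.
    mean = sum(pad[i:i + h, j:j + w]
               for i in range(3) for j in range(3)) / 9.0
    return np.abs(img - mean) / (mean + eps)

def fuse_by_contrast(img_a, img_b):
    """Pixel-wise selection: keep the pixel with the larger contrast."""
    mask = local_contrast(img_a) >= local_contrast(img_b)
    return np.where(mask, img_a, img_b)

# Toy example: a bright feature in image A, a flat image B.
a = np.zeros((5, 5)); a[2, 2] = 10.0
b = np.full((5, 5), 5.0)
fused = fuse_by_contrast(a, b)
```

The high-contrast feature pixel from A survives into the fused image, which is the behavior the selection rule is designed to guarantee for salient structures in either modality.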
WANG Dian-hong; FEI E; YAN Yu-jie
To efficiently utilize the limited computational resources in real-time sensor networks, this paper focuses on the challenge of computational resource allocation in sensor networks and provides a solution based on economic methods. It designs a microeconomic system in which the applications distribute their computational resource consumption across sensor networks by virtue of mobile agents. Further, it proposes a market-based computational resource allocation policy named MCRA which satisfies the uniform consumption of computational energy in the network and the optimal division of a single computational capacity among multiple tasks. The simulation in a target-tracing scenario demonstrates that MCRA realizes an efficient allocation of computational resources according to task priority, achieves superior allocation performance and equilibrium performance compared to traditional allocation policies, and ultimately prolongs the system lifetime.
ENGLISH ABSTRACT: Manufacturing companies of today face unpredictable, high-frequency market changes driven by global competition. To stay competitive, these companies must have the characteristics of cost-effective rapid response to market needs. As an engineering discipline, mechatronics strives to integrate mechanical, electronic, and computer systems optimally in order to create high-precision products and manufacturing processes. This paper presents a methodology for increasing the flexibility and reusability of a generic computer-integrated manufacturing (CIM) cell-control system using simulation and modelling of mechatronic sensory system (MSS) concepts. The utilisation of sensors within the CIM cell is highlighted specifically for data acquisition, analysis, and multi-sensor data fusion. Thus the designed reference architecture provides comprehensive insight into the functions and methodologies of a generic shop-floor control system (SFCS), which consequently enables the rapid deployment of a flexible system.
AFRIKAANSE OPSOMMING: Today's manufacturing companies regularly experience unpredictable market changes driven by worldwide competition. To remain competitive, these companies must exhibit cost-effectiveness and rapid response to market fluctuations. Mechatronics strives to integrate mechanical, electronic, and computer systems optimally to create high-precision products and production processes. This article suggests a methodology for increasing the adaptability and reusability of a generic computer-integrated manufacturing-cell control system through simulation and the modelling of mechatronic sensor-system concepts. The application of sensors within the cell facilitates data acquisition, analysis, and multi-sensor data fusion. The designed architecture thus provides insight into the function and methodology of a generic shop-floor control system, which enables the rapid...
Li, Xiaofan; Zhao, Yubin; Zhang, Sha; Fan, Xiaopeng
Particle filters (PFs) are widely used for nonlinear signal processing in wireless sensor networks (WSNs). However, the measurement uncertainty makes the WSN observations unreliable to the actual case and also degrades the estimation accuracy of the PFs. In addition to the algorithm design, few works focus on improving the likelihood calculation method, since it can be pre-assumed by a given distribution model. In this paper, we propose a novel PF method, which is based on a new likelihood fusion method for WSNs and can further improve the estimation performance. We firstly use a dynamic Gaussian model to describe the nonparametric features of the measurement uncertainty. Then, we propose a likelihood adaptation method that employs the prior information and a belief factor to reduce the measurement noise. The optimal belief factor is attained by deriving the minimum Kullback–Leibler divergence. The likelihood adaptation method can be integrated into any PFs, and we use our method to develop three versions of adaptive PFs for a target tracking system using wireless sensor network. The simulation and experimental results demonstrate that our likelihood adaptation method has greatly improved the estimation performance of PFs in a high noise environment. In addition, the adaptive PFs are highly adaptable to the environment without imposing computational complexity. PMID:27249002
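Tempering the likelihood with a belief factor inside a bootstrap particle filter can be sketched as below. The random-walk model, noise levels, and the exponential tempering form are assumptions for illustration; the paper derives its optimal belief factor via Kullback-Leibler divergence minimization, which is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, weights, z, sigma_meas, belief=1.0):
    """One bootstrap-PF update for a 1-D random-walk state. `belief`
    in (0, 1] tempers the Gaussian likelihood: smaller belief flattens
    it, discounting unreliable measurements."""
    particles = particles + rng.normal(0.0, 0.1, particles.shape)  # propagate
    loglik = -0.5 * ((z - particles) / sigma_meas) ** 2
    weights = weights * np.exp(belief * loglik)                    # tempered
    weights = weights / weights.sum()
    # Systematic-style resampling when the effective sample size drops.
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):
        idx = rng.choice(len(particles), len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

particles = rng.normal(0.0, 1.0, 500)
weights = np.full(500, 1.0 / 500)
for z in [1.0, 1.1, 0.9, 1.0, 1.05]:   # noisy observations of a state near 1
    particles, weights = pf_step(particles, weights, z,
                                 sigma_meas=0.3, belief=0.8)
estimate = float(np.sum(weights * particles))
```

With belief < 1 each single noisy observation moves the posterior less aggressively, which is the mechanism that makes the adapted filter robust in high-noise environments.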
Liao, Yi-Hung; Chou, Jung-Chuan; Lin, Chin-Yi
Fault diagnosis (FD) and data fusion (DF) technologies implemented in the LabVIEW program were used for a ruthenium dioxide pH sensor array. The purpose of the fault diagnosis and data fusion technologies is to increase the reliability of the measured data. Data fusion is a very useful statistical method used for sensor arrays in many fields. Fault diagnosis is used to avoid sensor faults and measurement errors in the electrochemical measurement system; therefore, in this study, we use fault diagnosis to remove any faulty sensors in advance, and then proceed with data fusion in the sensor array. The average, self-adaptive, and coefficient of variance data fusion methods are used in this study. The pH electrode is fabricated with a ruthenium dioxide (RuO2) sensing membrane using a sputtering system to deposit it onto a silicon substrate, and eight RuO2 pH electrodes are fabricated to form a sensor array for this study. PMID:24351636
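The fusion statistics named above can be sketched as follows; the inverse-squared-deviation weighting for the "self-adaptive" method and the sample pH readings are assumptions, not the paper's exact formulation:

```python
import statistics

def fuse_average(readings):
    """Plain arithmetic-mean fusion across the array."""
    return statistics.mean(readings)

def fuse_self_adaptive(readings):
    """Weight each electrode inversely to its squared deviation from
    the group mean (one common 'self-adaptive' weighting; assumed)."""
    m = statistics.mean(readings)
    eps = 1e-9  # avoids division by zero for an exactly-average sensor
    w = [1.0 / ((r - m) ** 2 + eps) for r in readings]
    return sum(wi * r for wi, r in zip(w, readings)) / sum(w)

def coefficient_of_variation(readings):
    """CV = stdev / mean; a screening statistic for array consistency
    (a faulty electrode inflates it sharply)."""
    return statistics.stdev(readings) / statistics.mean(readings)

# Eight RuO2 electrodes measuring the same buffer (illustrative values).
readings = [7.01, 6.98, 7.02, 7.00, 6.99, 7.01, 7.03, 6.96]
```

A fault-diagnosis stage would first drop electrodes whose readings push the CV past a threshold, and only then fuse the survivors, mirroring the two-stage FD-then-DF pipeline of the abstract.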
Angelov, Plamen; Kordon, Arthur
A new approach to the design and use of inferential sensors in the process industry is proposed in this paper, based on the recently introduced concept of evolving fuzzy models (EFMs). They address the challenge that the modern process industry faces today, namely, to develop adaptive and self-calibrating online inferential sensors that reduce maintenance costs while keeping high precision and interpretability/transparency. The proposed new methodology makes possible in...
Saad, E. M.; Awadalla, M. H.; Darwish, R. R.
The energy hole problem is considered one of the most severe threats in wireless sensor networks. In this paper, the idea of exploiting sink mobility to mitigate the energy hole problem in hierarchical large-scale wireless sensor networks, based on the bees algorithm, is presented. In the proposed scheme, a mobile sink equipped with a powerful transceiver and battery traverses the entire field, and periodically gathers data from network cluster heads. The mobile sink follows an adaptive ...
Saeedi, Sara; Moussa, Adel; El-Sheimy, Naser
Context-awareness is an interesting topic in mobile navigation scenarios where the context of the application is highly dynamic. Using context-aware computing, navigation services consider the situation of user, not only in the design process, but in real time while the device is in use. The basic idea is that mobile navigation services can provide different services based on different contexts-where contexts are related to the user's activity and the device placement. Context-aware systems are concerned with the following challenges which are addressed in this paper: context acquisition, context understanding, and context-aware application adaptation. The proposed approach in this paper is using low-cost sensors in a multi-level fusion scheme to improve the accuracy and robustness of context-aware navigation system. The experimental results demonstrate the capabilities of the context-aware Personal Navigation Systems (PNS) for outdoor personal navigation using a smartphone. PMID:24670715
Pasika, Hugh Joseph Christopher
Sensor fusion has become a significant area of signal processing research that draws on a variety of tools. Its goals are many; however, in this thesis, the creation of a virtual sensor is paramount. In particular, neural networks are used to simulate the output of a LIDAR (laser radar) that measures cloud-base height. Eye-safe LIDAR is more accurate than the standard tool that would be used for such measurement: the ceilometer. The desire is to make cloud-base height information available at a network of ground-based meteorological stations without actually installing LIDAR sensors. To accomplish this, fifty-seven sensors, ranging from multispectral satellite information to standard atmospheric measurements such as temperature and humidity, are fused in what can only be termed a very complex, nonlinear environment. The result is an accurate prediction of cloud-base height. Thus, a virtual sensor is created. A total of four different learning algorithms were studied; two global and two local. In each case, the very best state-of-the-art learning algorithms were selected. Local methods investigated are the regularized radial basis function network and the support vector machine. Global methods include the standard backpropagation-with-momentum-trained multilayer perceptron (used as a benchmark) and the multilayer perceptron trained via the Kalman filter algorithm. While accuracy is the primary concern, computational considerations potentially limit the application of several of the above techniques. Thus, in all cases care was taken to minimize computational cost. For example, in the case of the support vector machine, a method of partitioning the problem was employed in order to reduce memory requirements and make the optimization over a large data set feasible, and in the Kalman algorithm case, node-decoupling was used to dramatically reduce the number of operations required. Overall, the methods produced somewhat equivalent mean squared errors indicating
Magnetic and inertial measurement units are an emerging technology for obtaining the 3D orientation of body segments in human movement analysis. In this respect, sensor fusion is used to limit the drift errors resulting from gyroscope data integration by exploiting accelerometer and magnetic aiding sensors. The present study aims at investigating the effectiveness of sensor fusion methods under different experimental conditions. Manual and locomotion tasks, differing in time duration, measurement volume, presence/absence of static phases, and out-of-plane movements, were performed by six subjects, and recorded by one unit located on the forearm or the lower trunk, respectively. Two sensor fusion methods, representative of the stochastic (Extended Kalman Filter) and complementary (non-linear observer) filtering approaches, were selected, and their accuracy was assessed in terms of attitude (pitch and roll angle) and heading (yaw angle) errors using stereophotogrammetric data as a reference. The sensor fusion approaches provided significantly more accurate results than gyroscope data integration. Accuracy improved mostly for heading and when the movement exhibited stationary phases and evenly distributed 3D rotations, occurred in a small volume, and lasted longer than approximately 20 s. These results were independent of the specific sensor fusion method used. Practice guidelines for improving the outcome accuracy are provided.
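A minimal example of the complementary-filter family mentioned above, for a single attitude angle: the gyroscope is integrated for short-term accuracy while the accelerometer-derived angle bounds long-term drift. All constants (bias, noise levels, `alpha`) are invented for the demonstration.

```python
import numpy as np

def complementary_filter(gyro, accel_angle, dt=0.01, alpha=0.98):
    """Minimal complementary filter for one attitude angle: the gyro rate
    is integrated for short-term accuracy while the accelerometer-derived
    angle corrects long-term drift (alpha sets the crossover)."""
    angle = accel_angle[0]
    out = []
    for g, a in zip(gyro, accel_angle):
        angle = alpha * (angle + g * dt) + (1.0 - alpha) * a
        out.append(angle)
    return np.array(out)

# Stationary sensor: true angle 10 deg, biased gyro (drifts) + noisy accel.
rng = np.random.default_rng(2)
n, dt = 5000, 0.01
gyro = 0.5 + rng.normal(0, 0.2, n)            # deg/s, with 0.5 deg/s bias
accel = 10.0 + rng.normal(0, 2.0, n)          # noisy gravity-derived angle
drift_only = 10.0 + np.cumsum(gyro) * dt      # pure integration drifts away
fused = complementary_filter(gyro, accel, dt)
print(round(drift_only[-1], 1), round(fused[-1], 1))
```

After 50 s the pure integration has drifted by roughly 25 degrees, while the fused estimate stays near the true angle; this is the same drift-limiting role the aiding sensors play in the study.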
Holzrichter, Michael Warren; O'Rourke, William T.; Zenner, Jennifer; Maish, Alexander B.
The goal of this LDRD was to demonstrate the use of robotic vehicles for deploying and autonomously reconfiguring seismic and acoustic sensor arrays with high (centimeter) accuracy to obtain enhancement of our capability to locate and characterize remote targets. The capability to accurately place sensors and then retrieve and reconfigure them allows sensors to be placed in phased arrays in an initial monitoring configuration and then to be reconfigured in an array tuned to the specific frequencies and directions of the selected target. This report reviews the findings and accomplishments achieved during this three-year project. This project successfully demonstrated autonomous deployment and retrieval of a payload package with an accuracy of a few centimeters using differential global positioning system (GPS) signals. It developed an autonomous, multisensor, temporally aligned, radio-frequency communication and signal processing capability, and an array optimization algorithm, which was implemented on a digital signal processor (DSP). Additionally, the project converted the existing single-threaded, monolithic robotic vehicle control code into a multi-threaded, modular control architecture that enhances the reuse of control code in future projects.
The problem of landmines in Egypt and the detection techniques currently used for humanitarian demining are described and discussed. Most of these techniques depend on metal detectors. This makes the demining of vast areas of contaminated land a difficult, dangerous, slow and very costly process. Although some of these techniques are very effective at locating metal or metal-like anomalies, they suffer from high false-alarm rates because they are not capable of identifying these anomalies as mines. However, in the last four decades, techniques based on using neutrons of different energies have proved themselves powerful tools for elemental analysis and are therefore capable of identifying explosive materials and of confirming the presence or absence of a landmine. Results of the activities running through the IAEA TC project EGY1024, for developing and adapting the two main promising nuclear techniques based on measuring the density variation of hydrogen by measuring thermal neutrons backscattered from a buried object, are given and discussed. In addition, the use of two nuclear sensors together with two other sensors based on EMI and GPR in an integrated system with data fusion, and how to integrate these sensors onto a land vehicle, are described and discussed. (author)
Jones, Kennie H.; Jones, Michael G.; Nark, Douglas M.; Lodding, Kenneth N.
In the last decade, the realization of small, inexpensive, and powerful devices with sensors, computers, and wireless communication has promised the development of massive sensor networks with dense deployments over large areas capable of high-fidelity situational assessments. However, most management models have been based on centralized control, and research has concentrated on methods for passing data from sensor devices to the central controller. Most implementations have been small but, as it is not scalable, this methodology is insufficient for massive deployments. Here, a specific application of a large sensor network for adaptive noise reduction demonstrates a new paradigm in which communities of sensor/computer devices assess local conditions and make local decisions from which a global behaviour emerges. This approach obviates many of the problems of centralized control, as it is not prone to a single point of failure and is more scalable, efficient, robust, and fault tolerant.
Fafoutis, Xenofon; Dragoni, Nicola
ODMAC (On-Demand Media Access Control) is a recently proposed MAC protocol designed to support individual duty cycles for Energy-Harvesting Wireless Sensor Networks (EH-WSNs). Individual duty cycles are vital for EH-WSNs, because they allow nodes to adapt their energy consumption to the ever... ...three key properties of EH-WSNs: adaptability of energy consumption, distributed energy-aware load balancing and support for different application-specific requirements.
National Aeronautics and Space Administration — SSCI proposes to develop, implement and test a collision detection system for unmanned aerial vehicles (UAV), referred to as the Asynchronous Sensor fuSion for...
Hwang, Sungjae; Agada, Peter; Kiemel, Tim; Jeka, John J
We simultaneously perturbed visual, vestibular and proprioceptive modalities to understand how sensory feedback is re-weighted so that overall feedback remains suited to stabilizing upright stance. Ten healthy young subjects received an 80 Hz vibratory stimulus to their bilateral Achilles tendons (stimulus turns on-off at 0.28 Hz), a ± 1 mA binaural monopolar galvanic vestibular stimulus at 0.36 Hz, and a visual stimulus at 0.2 Hz during standing. The visual stimulus was presented at different amplitudes (0.2, 0.8 deg rotation about ankle axis) to measure: the change in gain (weighting) to vision, an intramodal effect; and a change in gain to vibration and galvanic vestibular stimulation, both intermodal effects. The results showed a clear intramodal visual effect, indicating a de-emphasis on vision when the amplitude of visual stimulus increased. At the same time, an intermodal visual-proprioceptive reweighting effect was observed with the addition of vibration, which is thought to change proprioceptive inputs at the ankles, forcing the nervous system to rely more on vision and vestibular modalities. Similar intermodal effects for visual-vestibular reweighting were observed, suggesting that vestibular information is not a "fixed" reference, but is dynamically adjusted in the sensor fusion process. This is the first time, to our knowledge, that the interplay between the three primary modalities for postural control has been clearly delineated, illustrating a central process that fuses these modalities for accurate estimates of self-motion. PMID:24498252
This paper introduces a new Bayesian fusion algorithm to combine more than one trust component (data trust and communication trust) to infer the overall trust between nodes. This research work proposes that one trust component is not enough when deciding whether or not to trust a specific node in a wireless sensor network. This paper discusses and analyses the results from the communication trust component (binary) and the data trust component (continuous) and proves that either component by itself can mislead the network and eventually cause a total breakdown of the network. As a result, new algorithms are needed to combine more than one trust component to infer the overall trust. The proposed algorithm is simple and generic as it allows trust components to be added and deleted easily. Simulation results demonstrate that a node is highly trustworthy provided that both trust components simultaneously confirm its trustworthiness and, conversely, a node is highly untrustworthy if its untrustworthiness is asserted by both components.
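The combination of a binary communication-trust component with a continuous data-trust score can be illustrated with a toy Bayesian odds-product model; the Beta-Bernoulli treatment of communication outcomes and the independence assumption are simplifications, not the paper's exact algorithm.

```python
def fuse_trust(comm_successes, comm_trials, data_trust, prior=0.5):
    """Toy Bayesian fusion of a binary communication-trust component
    (success counts, Beta-Bernoulli posterior) with a continuous
    data-trust score in [0, 1], treated as an independent likelihood
    ratio. Illustrative only."""
    # Communication trust: posterior mean of Beta(1 + s, 1 + f).
    comm_trust = (1 + comm_successes) / (2 + comm_trials)
    # Independent-evidence fusion in odds form.
    odds = (prior / (1 - prior)) \
        * (comm_trust / (1 - comm_trust)) \
        * (data_trust / (1 - data_trust))
    return odds / (1 + odds)

# Both components must agree for a high overall trust score; a conflict
# leaves the fused trust near 0.5, i.e. one component alone can mislead.
both_good = fuse_trust(19, 20, 0.9)
conflict  = fuse_trust(19, 20, 0.1)
print(round(both_good, 2), round(conflict, 2))
```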
Haiping Huang; Lei Chen; Xiao Cao; Ruchuan Wang; Qianyi Wang
Using a large number of wireless sensor nodes to detect a target signal is more accurate than the traditional single-radar detection method. Each local sensor detects the target signal in the region of interest and collects relevant data, and then sends the respective data to the data fusion center (DFC) for aggregation processing and a judgment on whether the target signal exists or not. However, the current judgment fusion rules such as the Counting Rule (CR) and Clustering-Counting Rul...
ZiQi Hao; ZhenJiang Zhang; Han-Chieh Chao
As limited energy is one of the tough challenges in wireless sensor networks (WSN), energy saving becomes important for increasing the lifetime of the network. Data fusion enables combining information from several sources to provide a unified scenario, which can significantly save sensor energy and enhance sensing data accuracy. In this paper, we propose a cluster-based data fusion algorithm for event detection. We use the k-means algorithm to form the nodes into clusters, which can signifi...
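A sketch of the clustering step, assuming plain k-means over node coordinates and in-cluster averaging as the fusion rule (field size, node count and readings are all invented):

```python
import numpy as np

rng = np.random.default_rng(3)

def kmeans(points, k, iters=20):
    """Plain k-means; each centroid stands in for a cluster-head location."""
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        d = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = np.argmin(d, axis=1)
        centers = np.array([points[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return centers, labels

# 60 nodes in a 100 m x 100 m field; each cluster head forwards one fused
# value to the sink instead of every member's raw reading.
nodes = rng.uniform(0, 100, (60, 2))
readings = rng.normal(25.0, 0.5, 60)          # e.g. a temperature field
centers, labels = kmeans(nodes, k=4)
fused = [readings[labels == j].mean() for j in range(4) if np.any(labels == j)]
print(len(fused), [round(f, 1) for f in fused])
```

The energy saving comes from the message count: at most k fused values travel to the sink per round instead of 60 raw readings.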
Yan Zhou; Dongli Wang; Tingrui Pei; Yonghong Lan
Optimizing the design of a tracking system under energy and bandwidth constraints in wireless sensor networks (WSN) is of paramount importance. In this paper, the problem of collaborative target tracking in WSNs is considered in a framework of quantized measurement fusion. First, the measurement in each local sensor is quantized by a probabilistic quantization scheme and transmitted to a fusion center (FC). Then, the quantized messages are fused and sequential importance resampling (SIR) particle...
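Probabilistic quantization can be sketched as dithered rounding: a measurement is mapped to one of the quantizer levels such that the quantized message is an unbiased estimate of the input, which is what lets the fusion center combine many coarse messages. The range and bit depth below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)

def prob_quantize(x, lo, hi, bits):
    """Probabilistic quantization: x is rounded up to the next level with
    probability proportional to its distance from the lower level, making
    the quantized message an unbiased estimate of x."""
    levels = (1 << bits) - 1
    step = (hi - lo) / levels
    t = (np.clip(x, lo, hi) - lo) / step
    low = np.floor(t)
    q = low + (rng.random(np.shape(x)) < (t - low))   # round up w.p. t - low
    return lo + q * step

# Averaging many 3-bit messages of the same value recovers it (unbiasedness),
# even though each individual message carries only 8 coarse levels.
x = 2.37
msgs = prob_quantize(np.full(20000, x), 0.0, 10.0, bits=3)
print(round(msgs.mean(), 2))
```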
Talabac, Stephen J.
Sensor Web observing systems may have the potential to significantly improve our ability to monitor, understand, and predict the evolution of rapidly evolving, transient, or variable environmental features and events. This improvement will come about by integrating novel data collection techniques, new or improved instruments, emerging communications technologies and protocols, sensor mark-up languages, and interoperable planning and scheduling systems. In contrast to today's observing systems, "event-driven" sensor webs will synthesize real- or near-real time measurements and information from other platforms and then react by reconfiguring the platforms and instruments to invoke new measurement modes and adaptive observation strategies. Similarly, "model-driven" sensor webs will utilize environmental prediction models to initiate targeted sensor measurements or to use a new observing strategy. The sensor web concept contrasts with today's data collection techniques and observing system operations concepts where independent measurements are made by remote sensing and in situ platforms that do not share, and therefore cannot act upon, potentially useful complementary sensor measurement data and platform state information. This presentation describes NASA's view of event-driven and model-driven Sensor Webs and highlights several research and development activities at the Goddard Space Flight Center.
Chioclea, Shmuel; Dickstein, Phineas
In recent years, there has been progress in the application of measurement and control systems that engage multi-sensor arrays. Several algorithms and techniques have been developed for the integration of the information obtained from the sensors. The fusion of the data may be complicated by the fact that each sensor has its own performance characteristics, and because different sensors may detect different physical phenomena. As a result, data fusion turns out to be a multidisciplinary field, which applies principles adopted from other fields such as signal processing, artificial intelligence, statistics, and information theory. The data fusion machine tries to imitate the human brain in combining data from numerous sensors and making optimal inferences about the environment. The present paper provides a critical review of data fusion algorithms and techniques and a trenchant summary of the experience gained to date from the several preliminary NDT studies which have applied multi-sensor data fusion systems. Consequently, this paper provides a list of rules and criteria to be followed in future applications of data fusion to nondestructive testing.
As a critical variable for characterizing biophysical processes in the ecological environment, and as a key indicator in the surface energy balance, evapotranspiration and urban heat islands, Land Surface Temperature (LST) retrieved from Thermal Infra-Red (TIR) images at both high temporal and spatial resolution is urgently needed. However, due to the limitations of the existing satellite sensors, there is no earth observation which can obtain TIR at detailed spatial and temporal resolution simultaneously. Thus, several attempts at image fusion, blending the TIR data from a high temporal resolution sensor with data from a high spatial resolution sensor, have been studied. This paper presents a novel data fusion method, integrating image fusion and spatio-temporal fusion techniques, for deriving LST datasets at 30 m spatial resolution from daily MODIS images and Landsat ETM+ images. The Landsat ETM+ TIR data were first enhanced from 60 m to 30 m resolution using an extreme learning machine (ELM) neural-network regression model. Then, the MODIS LST and enhanced Landsat ETM+ TIR data were fused by the Spatio-temporal Adaptive Data Fusion Algorithm for Temperature mapping (SADFAT) in order to derive high-resolution synthetic data. The synthetic images were evaluated for both testing and simulated satellite images. The average difference (AD) and absolute average difference (AAD) are smaller than 1.7 K, and the correlation coefficient (CC) and root-mean-square error (RMSE) are 0.755 and 1.824, respectively, showing that the proposed method enhances the spatial resolution of the predicted LST images while preserving the spectral information.
Koenig, A.; Rehg, T.; Rasshofer, R.
Driver states such as fatigue, stress, aggression, distraction or even medical emergencies continue to cause severe driving mistakes and promote accidents. A pathway towards improving driver state assessment can be found in psycho-physiological measures that directly quantify the driver's state from physiological recordings. Although heart rate is a well-established physiological variable that reflects cognitive stress, obtaining heart rate contactlessly and reliably is a challenging task in an automotive environment. Our aim was to investigate how sensory fusion of two automotive-grade sensors would influence the accuracy of automatic classification of cognitive stress levels. We induced cognitive stress in subjects and estimated levels from their heart rate signals, acquired from automotive-ready ECG sensors. Using signal quality indices and Kalman filters, we were able to decrease the root mean squared error (RMSE) of heart rate recordings by 10 beats per minute. We then trained a neural network to classify the cognitive workload state of subjects from heart rate and compared classification performance for ground truth, the individual sensors and the fused heart rate signal. Fusing the signals yielded a 5% higher correct classification rate than the individual sensors, staying only 4% below the maximally possible classification accuracy from ground truth. These results are a first step towards real-world applications of psycho-physiological measurements in vehicle settings. Future implementations of driver state modeling will be able to draw from a larger pool of data sources, such as additional physiological values or vehicle-related data, which can be expected to drive classification to significantly higher values.
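The combination of signal-quality indices and Kalman filtering can be sketched with a scalar random-walk filter in which the per-sample measurement variance is driven by a quality flag; all noise figures are invented and the study's actual model is surely richer.

```python
import numpy as np

rng = np.random.default_rng(5)

def kalman_hr(z, r, q=0.05):
    """Scalar Kalman filter for heart rate: the measurement variance r is
    set per-sample from a signal-quality index, so low-quality samples
    are down-weighted by a smaller Kalman gain."""
    x, p = z[0], 1.0
    out = []
    for zi, ri in zip(z, r):
        p = p + q                      # predict (random-walk HR model)
        k = p / (p + ri)               # gain shrinks when quality is poor
        x = x + k * (zi - x)
        p = (1 - k) * p
        out.append(x)
    return np.array(out)

# True HR 70 bpm; half the samples are corrupted (poor signal quality).
n = 2000
true_hr = 70.0
quality_bad = rng.random(n) < 0.5
z = true_hr + rng.normal(0, 2.0, n) + quality_bad * rng.normal(0, 15.0, n)
r = np.where(quality_bad, 15.0 ** 2, 2.0 ** 2)   # SQI-driven variance
est = kalman_hr(z, r)
rmse_raw = np.sqrt(((z - true_hr) ** 2).mean())
rmse_kf = np.sqrt(((est[200:] - true_hr) ** 2).mean())
print(round(rmse_raw, 1), round(rmse_kf, 1))
```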
Several nondestructive technologies have been developed for assessing the firmness and soluble solids content (SSC) of apples. Each of these technologies has its merits and limitations in predicting these quality parameters. With the concept of multi-sensor data fusion, different sensors would work ...
Kjærgaard, Mikkel Baun; Wirz, Martin; Roggen, Daniel;
derived from multiple sensor modalities of modern smartphones. Automatic detection of flocks has several important applications, including evacuation management and socially aware computing. The novelty of this paper is, firstly, to use data fusion techniques to combine several sensor modalities (WiFi...
Schutte, K.; Schavemaker, J.G.M.; Cremer, F.; Breejen, E. den
We present the sensor-fusion results obtained from measurements within the European research project Ground Explosive Ordnance DEtection (GEODE) that strives for the realisation of a vehicle-mounted, multi-sensor, anti-personnel landmine-detection system for humanitarian de-mining. The syste
Schavemaker, J.G.M.; Cremer, F.; Schutte, K.; Breejen, E. den
In this paper we present the results of infrared processing and sensor fusion obtained within the European research project GEODE (Ground Explosive Ordnance DEtection) that strives for the realization of a vehicle-mounted, multi-sensor anti-personnel land-mine detection system for humanitarian demin
Quaranta, Carlo; Balzarotti, Giorgio
An algorithm is proposed here that overcomes the problem that sensors of different kinds do not share an exhaustive set of common measurements. In the presence of a suite of heterogeneous sensors, the data fusion process has to deal with the problem of managing different, generally not directly comparable, information. The analysis of the mathematical model is carried out considering a data fusion system between a radar and an Infrared Search and Track (IRST) sensor, where the measurement of range is achieved by the radar only. Simulation results demonstrate the effectiveness of the algorithm as regards the fusion process, tracking, and correctness of association among tracks from different sensors. A comparison with a known approach from the literature concerning the fusion equation is also performed.
FBG (Fiber Bragg Grating) sensors have potential applications in fusion reactors, where intense electric and magnetic fields are present, because the sensors are immune to such fields. Optical sensors have received special attention in tokamak research for structural monitoring, strain sensing and temperature monitoring, given these advantages. The monitoring and preventive maintenance of such structures can be carried out more effectively with FBG sensors than with conventional sensors like thermocouples, RTDs, etc. We present the theoretical and experimental investigations carried out on the thermal response of FBGs and LPGs for the measurement of temperatures up to 250 °C. The results reveal good sensitivity and resolution in the measurement of temperatures. The present work reports the development and temperature characterization of FBG sensors and highlights the demand for their application in fusion reactor machines. (author)
Rand, Robert S.; Khuon, Timothy; Truslow, Eric
A proposed framework using spectral and spatial information is introduced for neural-net multisensor data fusion. This consists of a set of independent sensor neural nets, one for each sensor (type of data), coupled to a fusion net. The neural net of each sensor is trained from a representative data set of the particular sensor to map to a hypothesis space output. The decision outputs from the sensor nets are used to train the fusion net to an overall decision. During the initial processing, three-dimensional (3-D) point cloud data (PCD) are segmented using a multidimensional mean-shift algorithm into clustered objects. Concurrently, multiband spectral imagery data (multispectral or hyperspectral) are spectrally segmented by stochastic expectation-maximization into a cluster map containing (spectral-based) pixel classes. For the proposed sensor fusion, spatial detections and spectral detections complement each other. They are fused into final detections by a cascaded neural network, which consists of two levels of neural nets. The success of the approach in utilizing sensor synergism for enhanced classification is demonstrated for the specific case of classifying hyperspectral imagery and PCD extracted from LIDAR, obtained from an airborne data collection over the campus of the University of Southern Mississippi, Gulfport, Mississippi.
In this report we discuss sensor technology, and data fusion and data interpretation approaches, of possible maximal usefulness for subsurface imaging and characterization of landfill waste sites. Two sensor technologies, terrain conductivity using electromagnetic induction and ground-penetrating radar, are described and the literature on the subject is reviewed. We identify the maximum entropy stochastic method as one providing a rigorously justifiable framework for fusing the sensor data, briefly summarize work done by us in this area, and examine some of the outstanding issues with regard to data fusion and interpretation. 25 refs., 17 figs
Carlos Marqués; Eduardo Romero; Mónica Lovay; Gabriela Peretti
This paper presents an adaptive amplifier that is part of a sensor node in a wireless sensor network. The system presents a target gain that has to be maintained without direct human intervention despite the presence of faults. In addition, its bandwidth must be as large as possible. The system is composed of a software-based built-in self-test scheme implemented in the node that checks all the available gains in the amplifiers, a reconfigurable amplifier, and a genetic algorithm (GA) for rec...
Zhang, Wen-An; Song, Haiyu; Yu, Li
This book systematically presents energy-efficient robust fusion estimation methods to achieve thorough and comprehensive results in the context of network-based fusion estimation. It summarizes recent findings on fusion estimation with communication constraints; several novel energy-efficient and robust design methods for dealing with energy constraints and network-induced uncertainties are presented, such as delays, packet losses, and asynchronous information... All the results are presented as algorithms, which are convenient for practical applications.
The development of new imaging sensors gives rise to the need for image processing techniques that can effectively fuse images from different sensors into a single coherent composition for interpretation. In order to make use of the inherent redundancy and extended coverage of multiple sensors, we propose a multi-scale approach to pixel-level image fusion. The ultimate goal is to reduce human/machine error in the detection and recognition of objects. Results show that the proposed method has clear superiority o...
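One common pixel-level multi-scale scheme is the "choose-max" rule on detail bands; the sketch below uses repeated box blurs as a stand-in for a proper Gaussian/Laplacian pyramid, so it illustrates the general idea rather than the paper's specific method.

```python
import numpy as np

def blur(img):
    """Cheap 3x3 box blur with edge padding (stand-in for a Gaussian)."""
    p = np.pad(img, 1, mode="edge")
    return sum(p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

def fuse(a, b, levels=3):
    """Two-band multi-scale fusion: at each scale keep the detail
    coefficient with the larger magnitude (choose-max rule), then add
    back the averaged coarse residual."""
    out = 0.0
    for _ in range(levels):
        la, lb = blur(a), blur(b)
        da, db = a - la, b - lb                   # detail bands
        out = out + np.where(np.abs(da) >= np.abs(db), da, db)
        a, b = la, lb
    return out + (a + b) / 2.0                    # coarse approximation

# Sensor A sees the left feature sharply, sensor B the right one; the
# fused image keeps both, exploiting the sensors' extended coverage.
img = np.zeros((32, 32))
img[8, 8] = img[24, 24] = 1.0
a = img.copy(); a[24, 24] = 0.0                   # A misses the right feature
b = img.copy(); b[8, 8] = 0.0                     # B misses the left feature
f = fuse(a, b)
print(f[8, 8] > 0.5, f[24, 24] > 0.5)
```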
Hindo, Thamira; Chakrabartty, Shantanu
Even though current micro-nano fabrication technology has reached integration levels where ultra-sensitive sensors can be fabricated, the sensing performance (resolution per joule) of synthetic systems is still orders of magnitude inferior to that observed in neurobiology. For example, the filiform hairs in crickets operate at the fundamental limits of noise; auditory sensors in a parasitoid fly can overcome fundamental limitations to precisely localize ultra-faint acoustic signatures. Even though many of these biological marvels have served as inspiration for different types of neuromorphic sensors, the main focus of these designs has been to faithfully replicate the biological functionalities, without considering the constructive role of "noise". In man-made sensors, device and sensor noise are typically considered a nuisance, whereas in neurobiology "noise" has been shown to be a computational aid that enables biology to sense and operate at fundamental limits of energy efficiency and performance. In this paper, we describe some of the important noise-exploitation and adaptation principles observed in neurobiology and how they can be systematically used for designing neuromorphic sensors. Our focus will be on two types of noise-exploitation principles, namely (a) stochastic resonance and (b) noise-shaping, which are unified within our previously reported framework called ΣΔ learning. As a case study, we describe the application of ΣΔ learning to the design of a miniature acoustic source localizer whose performance matches that of its biological counterpart (Ormia ochracea).
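Stochastic resonance, the first principle named above, is easy to demonstrate numerically: a subthreshold periodic signal produces no threshold crossings on its own, but an intermediate amount of noise makes the crossings track the signal's phase. Threshold and noise levels are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(6)

def detections(signal, noise_std, threshold=1.0, trials=200):
    """Measure how well hard-threshold crossings track the signal: the
    crossing rate during the signal's positive phase minus the rate
    during its non-positive phase (a crude coherence score)."""
    n = len(signal)
    hits = (signal + rng.normal(0, noise_std, (trials, n))) > threshold
    return hits[:, signal > 0].mean() - hits[:, signal <= 0].mean()

t = np.linspace(0, 4 * np.pi, 400)
s = 0.6 * np.sin(t)                  # subthreshold: never reaches 1.0 alone
no_noise = detections(s, 1e-6)       # no crossings at all
some_noise = detections(s, 0.5)      # noise reveals the signal's phase
print(round(no_noise, 2), round(some_noise, 2))
```

Raising the noise further would wash the score back out, which is the characteristic resonance curve: detection quality peaks at a non-zero noise level.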
Gross, Jason Nicholas
Navigation-grade inertial sensors are often too expensive and too heavy for use in most Small Unmanned Aerial Vehicle (SUAV) systems. Low-cost Micro-Electro-Mechanical-Systems (MEMS) inertial sensors provide an attractive alternative, but currently do not provide an adequate navigation solution alone due to the presence of sensor bias. Toward addressing this problem, this research focuses on the development and experimental evaluation of sensor fusion algorithms that combine partially redundant information from low-cost sensors to achieve accurate SUAV attitude estimation. To conduct this research, several sets of SUAV flight data that include measurements from a low-cost MEMS-based Inertial Measurement Unit, a Global Positioning System receiver, and a set of low-grade tri-axial magnetometers are used to evaluate a variety of algorithms. In order to provide a baseline for performance evaluation, attitude measurements obtained directly with a high-quality mechanical vertical gyroscope are used as an independent attitude 'truth'. In addition, as a part of this project, a custom SUAV avionics system was developed to provide a platform for fault-tolerant flight control research. The overall goal of this research is to provide high-accuracy attitude estimation during nominal sensor performance conditions and in the event of sensor failures, while using only low-cost components. To achieve this goal, this study is carried out in three phases. The specific aim of the first phase is to obtain high accuracy under nominal sensor conditions. During this phase, two different nonlinear Kalman filtering methods are applied to various sensor fusion formulations and evaluated with respect to estimation accuracy over diverse sets of flight data. Next, during the second phase, sensor fusion based calibration techniques are explored to further enhance estimation accuracy. Finally, the third phase of the study considers the design of a sensor fusion attitude estimation architecture
Larsen, Thomas Dall; Andersen, Nils Axel; Ravn, Ole
This paper addresses the problem of identity fusion, i.e. the problem of selecting one of several identity hypotheses concerning an observed object. Two problems are considered. Firstly the problem of preserving the information in the representation and fusion of measurements relating to identity...
Sashima, Akio; Ikeda, Takeshi; Kurumatani, Koichi
In current implementations of healthcare services based on a mobile sensing architecture, sensor discovery and communication protocols are predefined. How a device can communicate with environmental sensors in an ad-hoc manner is an important issue. Users do not stay in their homes; they visit various places, such as commercial facilities. How can users access environmental sensors in such public spaces? A mechanism to discover and communicate with the environmental sensors is necessary to realize ...
The paper presents a multifunctional joint sensor with measurement adaptability for biological engineering applications, such as gait analysis and gesture recognition. The adaptability covers both static and dynamic environment measurements, for both body pose and motion capture. Its multifunctional capabilities lie in its ability to measure multiple degrees of freedom (MDOF) simultaneously with a single sensor, reducing system complexity. The basic working mode enables 2DOF spatial angle measurement over large ranges and stands out for its applicability to different joints of different individuals without recalibration. The optional advanced working mode enables an additional DOF measurement for various applications. By employing a corrugated tube as the main body, the sensor is also flexible and wearable, with fewer restraints. MDOF variations are converted to linear displacements of the sensing elements. The simple reconstruction algorithm and small output volume are capable of providing real-time angles and long-term monitoring. The performance assessment of the built prototype is promising enough to indicate the feasibility of the sensor.
Alam, Mushfiqul; Rohac, Jan
MEMS (micro-electro-mechanical system)-based inertial sensors, i.e., accelerometers and angular rate sensors, are commonly used as a cost-effective solution for the purposes of navigation in a broad spectrum of terrestrial and aerospace applications. These tri-axial inertial sensors form an inertial measurement unit (IMU), which is a core unit of navigation systems. Even if MEMS sensors have an advantage in their size, cost, weight and power consumption, they suffer from bias instability, noisy output and insufficient resolution. Furthermore, the sensor's behavior can be significantly affected by strong vibration when it operates in harsh environments. All of these conditions require treatment through data processing. For navigation solutions that are primarily based on inertial data alone, this paper proposes a novel concept in adaptive data pre-processing using variable bandwidth filtering. This approach utilizes sinusoidal estimation to continuously adapt the filtering bandwidth of the accelerometer's data in order to reduce the effects of vibration and sensor noise before attitude estimation is processed. Low frequency vibration generally limits the conditions under which the accelerometers can be used to aid the attitude estimation process, which is primarily based on angular rate data, and thus decreases its accuracy. In contrast, the proposed pre-processing technique enables using accelerometers as an aiding source through effective data smoothing, even when they are affected by low frequency vibration. Verification of the proposed concept is performed on simulation and real-flight data obtained on an ultra-light aircraft. The results of both types of experiments confirm the suitability of the concept for inertial data pre-processing. PMID:25648711
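The adaptive pre-processing idea above can be illustrated with a toy sketch: estimate the dominant vibration frequency (here via a crude zero-crossing count standing in for the paper's sinusoidal estimation) and place a first-order low-pass cutoff below it. All names, constants and signal parameters are illustrative assumptions, not the authors' implementation.

```python
import math

def dominant_freq(samples, fs):
    """Crude dominant-frequency estimate [Hz] from zero crossings about the mean
    (a hypothetical stand-in for the paper's sinusoidal estimation)."""
    mean = sum(samples) / len(samples)
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a - mean) * (b - mean) < 0
    )
    return crossings * fs / (2.0 * len(samples))

def adaptive_lowpass(samples, fs, margin=0.5):
    """First-order low-pass whose cutoff tracks below the estimated vibration
    frequency, smoothing accelerometer data before attitude estimation."""
    fc = max(0.1, margin * dominant_freq(samples, fs))   # adapted bandwidth [Hz]
    alpha = 2 * math.pi * fc / (2 * math.pi * fc + fs)   # discrete smoothing factor
    out, y = [], samples[0]
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out, fc

# 1 g signal corrupted by a 25 Hz vibration, sampled at 200 Hz
fs = 200.0
raw = [1.0 + 0.3 * math.sin(2 * math.pi * 25 * n / fs + 0.5) for n in range(400)]
smoothed, fc = adaptive_lowpass(raw, fs)
```

With the cutoff adapted to roughly half the 25 Hz vibration frequency, the vibration ripple is attenuated while the 1 g gravity component used for attitude aiding passes through.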
Cvejic, N; Canagarajah, CN; Bull, DR
In this paper, we present a novel multimodal image fusion algorithm in the ICA domain. It uses segmentation to determine the most important regions in the input images and consequently fuses the ICA coefficients from the given regions using the Piella fusion metric to maximise the quality of the fused image. The proposed method exhibits significantly higher performance than the basic ICA algorithm and an improvement over other state-of-the-art algorithms.
高慧良; 周新聪; 程海明; 赵春华; 严新平
Machine lubrication contains abundant information on the equipment operation. Nowadays, most measuring methods are based on offline sampling or on online measuring with a single sensor. An online oil monitoring system with multiple sensors was designed. The measurement data was processed with a fuzzy intelligence system. Information from integrated sensors in an oil online monitoring system was evaluated using fuzzy logic. The analyses show that the multiple-sensor evaluation results are more reliable than online monitoring systems with single sensors.
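A minimal sketch of fuzzy-logic evaluation over two oil-sensor readings, assuming hypothetical membership breakpoints and rules (the abstract does not give the system's actual rule base):

```python
def tri(x, a, b, c):
    """Triangular membership function with breakpoints a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_wear_level(particle_count, viscosity_drop):
    """Toy fuzzy evaluation of two oil-monitoring sensors; the breakpoints
    (50/100/150 particles, 5/15/25 % viscosity drop) are invented values."""
    high_particles = tri(particle_count, 50, 100, 150)
    high_visc_drop = tri(viscosity_drop, 5, 15, 25)
    severe = min(high_particles, high_visc_drop)   # rule: both sensors high
    warning = max(high_particles, high_visc_drop)  # rule: either sensor high
    if severe > 0.5:
        return "severe"
    if warning > 0.3:
        return "warning"
    return "normal"
```

Combining both sensors through the `min` rule is what makes the joint verdict more reliable than either reading alone: a single noisy sensor cannot trigger "severe" by itself.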
With the development of image sensor technology, multi-sensor image fusion technology emerged and has been widely used in the fields of military surveillance, medical diagnosis, remote sensing, intelligent robots and so on. However, current image fusion technology mainly focuses on gray images; color image fusion is rarely studied. Because a color image contains more information than a gray image, research on color image fusion technology is becoming more and more urgent. In this paper, the realization of several typical color image fusion algorithms is discussed, and their principles and respective advantages and disadvantages are analyzed. Secondly, according to the different characteristics of visible and infrared images, this paper proposes a color image fusion algorithm based on the Curvelet transform, which fuses the visible image with the infrared image and its negative, respectively, using color mapping rules matched to human visual characteristics. Experiments show that the resulting color fusion images are richer in color, contain more details and are easier to recognize.
Information from complementary and redundant sensors is often combined within sensor fusion algorithms to obtain a single accurate observation of the system at hand. However, measurements from each sensor are characterized by uncertainties. When multiple data are fused, it is often unclear how all these uncertainties interact and influence the overall performance of the sensor fusion algorithm. To address this issue, a benchmarking procedure is presented, where simulated and real data are combined in different scenarios in order to quantify how each sensor’s uncertainties influence the accuracy of the final result. The proposed procedure was applied to the estimation of the pelvis orientation using a waist-worn magnetic-inertial measurement unit. Ground-truth data were obtained from a stereophotogrammetric system and used to obtain simulated data. Two Kalman-based sensor fusion algorithms were submitted to the proposed benchmarking procedure. For the considered application, gyroscope uncertainties proved to be the main error source in orientation estimation accuracy for both tested algorithms. Moreover, although different performances were obtained using simulated data, these differences became negligible when real data were considered. The outcome of this evaluation may be useful both to improve the design of new sensor fusion methods and to drive the algorithm tuning process.
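The benchmarking idea, injecting chosen per-sensor noise levels into simulated data and scoring the fused estimate against ground truth, can be sketched as follows. The one-dimensional complementary filter and the noise figures are illustrative stand-ins, not the Kalman-based algorithms evaluated in the paper:

```python
import math, random

def fuse(gyro_rates, accel_angles, dt=0.01, k=0.02):
    """Minimal complementary filter: integrate the gyro rate, then nudge the
    estimate toward the accelerometer-derived angle."""
    theta, est = 0.0, []
    for w, a in zip(gyro_rates, accel_angles):
        theta += w * dt            # gyro propagation
        theta += k * (a - theta)   # accelerometer correction
        est.append(theta)
    return est

def rmse(est, truth):
    return math.sqrt(sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth))

def benchmark(gyro_sd, accel_sd, n=2000, dt=0.01, seed=0):
    """Inject the chosen noise levels into simulated data and score accuracy."""
    rng = random.Random(seed)
    truth = [math.sin(0.5 * i * dt) for i in range(n)]               # true angle
    rates = [(truth[i] - truth[i - 1]) / dt if i else 0.0 for i in range(n)]
    gyro = [w + rng.gauss(0, gyro_sd) for w in rates]
    accel = [t + rng.gauss(0, accel_sd) for t in truth]
    return rmse(fuse(gyro, accel, dt), truth)
```

Sweeping `gyro_sd` and `accel_sd` separately then shows which sensor's uncertainty dominates the fused error, mirroring the paper's finding that gyroscope noise was the main error source.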
A framework is proposed that consolidates the benefits of fuzzy logic and a neural network. The framework combines Kalman filtering with a soft computing technique, ANFIS, to form an effective data fusion strategy for the target tracking problem. A novel adaptive algorithm based on ANFIS is proposed to adjust to contextual changes and to attenuate the uncertain disturbance of measurement data from multiple sensors. The fuzzy adaptive fusion algorithm is an effective tool for keeping the actual value of the residual covariance consistent with its theoretical value. ANFIS shows good learning and prediction capabilities, which makes it an efficient tool to deal with the uncertainties encountered in any system. A neural network is introduced that can extract the statistical properties of the samples during the training sessions. Simulation results demonstrate that the algorithm can effectively adjust the system to contextual changes and has strong fusion capability in resisting uncertain information. This intelligent estimator is implemented using Matlab/Simulink and its performance is investigated.
Zhenhua Xu; Jianguo Huang; Hai Huang; Qunfei Zhang
In order to solve the distributed detection fusion problem of underwater target detection, when the signal to noise ratio (SNR) of the acoustic channel is low, a new strategy for united detection fusion and communication using multiple sensors was proposed. The performance of detection fusion was studied and compared based on the Neyman-Pearson principle when the binary phase shift keying (BPSK) and on-off keying (OOK) modes were used by the local sensors. The comparative simulation and analysis between the optimal likelihood ratio test and the proposed strategy was completed, and both the theoretical analysis and simulation indicate that using the proposed new strategy could improve the detection performance effectively. In theory, the proposed strategy of united detection fusion and communication is of great significance to the establishment of an underwater target detection system.
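Neyman-Pearson-style fusion of local binary decisions can be sketched with the classical Chair-Varshney log-likelihood-ratio rule. This is a generic illustration, not the paper's united detection-and-communication strategy, and the per-sensor probabilities below are invented:

```python
import math

def chair_varshney(decisions, pd, pf):
    """Log-likelihood-ratio fusion of local binary decisions (Chair-Varshney).
    decisions[i] in {0, 1}; pd[i] and pf[i] are sensor i's detection and
    false-alarm probabilities."""
    llr = 0.0
    for u, d, f in zip(decisions, pd, pf):
        if u:
            llr += math.log(d / f)               # sensor reports "target"
        else:
            llr += math.log((1 - d) / (1 - f))   # sensor reports "no target"
    return llr

def fuse(decisions, pd, pf, threshold=0.0):
    """Declare a target when the fused log-likelihood ratio exceeds threshold."""
    return 1 if chair_varshney(decisions, pd, pf) > threshold else 0

# Three hypothetical local sensors of decreasing quality
pd, pf = [0.9, 0.8, 0.6], [0.05, 0.1, 0.3]
decision = fuse([1, 1, 0], pd, pf)
```

Note how the rule automatically weights each vote by sensor quality: a detection from the reliable first sensor contributes far more than one from the noisy third sensor.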
B N Suresh; K Sivan
In this paper, the utilization of multi-sensors of different types, their characteristics, and their data-fusion in launch vehicles to achieve the goal of injecting the satellite into a precise orbit is explained. Performance requirements of sensors and their redundancy management in a typical launch vehicle are also included. The role of an integrated system level-test bed for evaluating multi-sensors and mission performance in a typical launch vehicle mission is described. Some of the typical simulation results to evaluate the effect of the sensors on the overall system are highlighted.
This report presents the results of a study to define several types of sensors in use, the qualitative reliability (failure modes) and quantitative reliability (average failure rates) for these types of process sensors. Temperature, pressure, flow, and level sensors are discussed for water coolant and for cryogenic coolants. The failure rates that have been found are useful for risk assessment and safety analysis. Repair times and calibration intervals are also given when found in the literature. All of these values can also be useful to plant operators and maintenance personnel. Designers may be able to make use of these data when planning systems. The final chapter in this report discusses failure rates for several types of personnel safety sensors, including ionizing radiation monitors, toxic and combustible gas detectors, humidity sensors, and magnetic field sensors. These data could be useful to industrial hygienists and other safety professionals when designing or auditing for personnel safety.
Maneuvering target tracking is a fundamental task in intelligent vehicle research. This paper focuses on the problem of fusion between radar and image sensors in target tracking. In order to improve positioning accuracy and narrow down the image working area, a novel method that integrates the radar filter with image intensity is proposed to establish an adaptive vision window. A weighted Hausdorff distance is introduced to define the functional relationship between the image and the model projection, and a modified simulated annealing algorithm is used to find the optimum orientation parameter. Furthermore, the global state is estimated using a distributed data fusion algorithm. Experiment results show that our method is accurate.
Schweiger, R.; Franz, S.; Löhlein, O.; Ritter, W.; Källhammer, J.-E.; Franks, J.; Krekels, T.
The next generation of automotive Night Vision Enhancement systems offers automatic pedestrian recognition with a performance beyond current Night Vision systems at a lower cost. This will allow high market penetration, covering the luxury as well as compact car segments. Improved performance can be achieved by fusing a Far Infrared (FIR) sensor with a Near Infrared (NIR) sensor. However, fusing with today's FIR systems will be too costly to achieve high market penetration. The main cost drivers of the FIR system are its resolution and its sensitivity. Sensor cost is largely determined by sensor die size. Fewer and smaller pixels will reduce die size but also resolution and sensitivity. Sensitivity limits are mainly determined by inclement weather performance. Sensitivity requirements should be matched to the possibilities of low-cost FIR optics, especially the implications of molding highly complex optical surfaces. As a FIR sensor specified for fusion can have lower resolution as well as lower sensitivity, fusing FIR and NIR can solve both performance and cost problems. To allow compensation of FIR-sensor degradation in the pedestrian detection capabilities, a fusion approach called MultiSensorBoosting is presented that produces a classifier holding highly discriminative sub-pixel features from both sensors at once. The algorithm is applied to data with different resolutions and to data obtained from cameras with varying optics to incorporate various sensor sensitivities. As it is not feasible to record representative data with all different sensor configurations, transformation routines on existing high-resolution data recorded with high-sensitivity cameras are investigated in order to determine the effects of lower resolution and lower sensitivity on the overall detection performance. This paper also gives an overview of the first results, showing that a reduction of FIR sensor resolution can be compensated using fusion techniques and a reduction of sensitivity can be compensated as well.
Wu, Lihong; Chen, Yingsong; Cui, Zhouping
The architecture of an autonomous navigation vehicle based on robot vision and multi-sensor fusion technology is described in this paper. In order to acquire more intelligence and robustness, accurate real-time collection and processing of information are realized by using this technology. The method to achieve robot vision and multi-sensor fusion is discussed in detail. The results simulated in several operating modes show that this intelligent vehicle performs better in obstacle identification and avoidance and in path planning. And this can provide higher reliability during vehicle running.
Stoica, Adrian; Thomas, Tyson; Li, Wei-Te; Daud, Taher; Fabunmi, James
The paper presents the hardware implementation and initial tests of a low-power, high-speed reconfigurable sensor fusion processor. The Extended Logic Intelligent Processing System (ELIPS) is described, which combines rule-based systems, fuzzy logic, and neural networks to achieve parallel fusion of sensor signals in compact low-power VLSI. The ELIPS concept is being developed to demonstrate interceptor functionality, which particularly underlines the high-speed and low-power requirements. The hardware programmability allows the processor to reconfigure into different machines, taking the most efficient hardware implementation during each phase of information processing. Processing speeds of microseconds have been demonstrated using our test hardware.
Lu, Kelin; Zhou, Rui
A sensor fusion methodology for the Gaussian mixtures model is proposed for ballistic target tracking with unknown ballistic coefficients. To improve the estimation accuracy, a track-to-track fusion architecture is proposed to fuse tracks provided by the local interacting multiple model filters. During the fusion process, the duplicate information is removed by considering the first order redundant information between the local tracks. With extensive simulations, we show that the proposed algorithm improves the tracking accuracy in ballistic target tracking in the re-entry phase applications. PMID:27537883
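One standard guard against double-counting shared information in track-to-track fusion is covariance intersection. The scalar sketch below is a generic illustration of that idea, not the paper's first-order redundancy-removal method:

```python
def covariance_intersection(x1, p1, x2, p2, w=0.5):
    """Scalar covariance intersection: fuse two track estimates (x1, p1) and
    (x2, p2) without knowing their cross-correlation, avoiding the
    overconfidence that double-counted shared information would cause."""
    p_inv = w / p1 + (1 - w) / p2                 # convex combination of informations
    p = 1.0 / p_inv                               # fused variance
    x = p * (w * x1 / p1 + (1 - w) * x2 / p2)     # fused state
    return x, p

# Two local tracks of the same target: a coarse one and a precise one
x, p = covariance_intersection(10.0, 4.0, 10.6, 1.0)
```

Unlike a naive information-filter merge, the fused variance here never claims more certainty than is justified when the two tracks may share measurement history.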
Lamborn, Peter; Williams, Pamela J.
Alarm-based sensor systems are being explored as a tool to expand perimeter security for facilities and force protection. However, the increased volume of collected sensor data includes faulty data points and is insufficient on its own. Data analysis is needed to reduce nuisance and false alarms, which will improve officials' decision making and confidence levels in the system's alarms. Moreover, operational costs can be allayed and losses mitigated if authorities are alerted only when a real threat is detected. In the current system, heuristics such as persistence of alarm and the type of sensor that detected an event are used to guide officials' responses. We hypothesize that fusing data from heterogeneous sensors in the sensor field can provide more complete situational awareness than looking at individual sensor data. We propose a two-stage approach to reduce false alarms. First, we use self-organizing maps to cluster sensors based on global positioning coordinates and then train classifiers on the within-cluster data to obtain a local view of the event. Next, we train a classifier on the local results to compute a global solution. We investigate the use of machine learning techniques, such as k-nearest neighbor, neural networks, and support vector machines, to improve alarm accuracy. On simulated sensor data, the proposed approach identifies false alarms with greater accuracy than a weighted voting algorithm.
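The two-stage approach can be sketched in miniature: a nearest-center assignment stands in for the self-organizing map, and a 1-nearest-neighbour classifier stands in for the trained local and global classifiers. Positions, readings and labels are invented for illustration:

```python
import math
from collections import Counter

def assign_clusters(positions, centers):
    """Stage 0: group sensors by nearest cluster center (SOM stand-in)."""
    return [min(range(len(centers)),
                key=lambda c: math.dist(p, centers[c])) for p in positions]

def knn_label(train, query, k=3):
    """Simple k-nearest-neighbour vote; train is [(feature_vector, label), ...]."""
    nearest = sorted(train, key=lambda t: math.dist(t[0], query))[:k]
    return Counter(lbl for _, lbl in nearest).most_common(1)[0][0]

def two_stage_alarm(clustered_train, cluster_of, readings, k=3):
    """Stage 1: each cluster classifies its members' readings (local view);
    stage 2: a majority vote over cluster verdicts gives the global decision."""
    local = []
    for cid, train in clustered_train.items():
        members = [r for s, r in enumerate(readings) if cluster_of[s] == cid]
        if members:
            local.append(knn_label(train, [sum(members) / len(members)], k))
    return Counter(local).most_common(1)[0][0]

# Four sensors in two spatial clusters, with per-cluster labeled training data
positions = [(0, 0), (0, 1), (5, 5), (5, 6)]
centers = [(0, 0.5), (5, 5.5)]
cluster_of = assign_clusters(positions, centers)
train = {0: [([0.1], "nuisance"), ([0.9], "threat")],
         1: [([0.1], "nuisance"), ([0.9], "threat")]}
verdict = two_stage_alarm(train, cluster_of, [0.9, 0.8, 0.85, 0.95], k=1)
```

Aggregating within a cluster before classifying is what suppresses single faulty readings: one spurious sensor cannot flip the local verdict, let alone the global one.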
Walls, Thomas J.; Wilson, Michael L.; Partridge, Darin C.; Haws, Jonathan R.; Jensen, Mark D.; Johnson, Troy R.; Petersen, Brad D.; Sullivan, Stephanie W.
The capabilities of tactical intelligence, surveillance, and reconnaissance (ISR) payloads are expanding from single sensor imagers to integrated systems-of-systems architectures. Increasingly, these systems-of-systems include multiple sensing modalities that can act as force multipliers for the intelligence analyst. Currently, the separate sensing modalities operate largely independently of one another, providing a selection of operating modes but not an integrated intelligence product. We describe here a Sensor Management System (SMS) designed to provide a small, compact processing unit capable of managing multiple collaborative sensor systems on board an aircraft. Its purpose is to increase sensor cooperation and collaboration to achieve intelligent data collection and exploitation. The SMS architecture is designed to be largely sensor and data agnostic and provide flexible networked access for both data providers and data consumers. It supports pre-planned and ad-hoc missions, with provisions for on-demand tasking and updates from users connected via data links. Management of sensors and user agents takes place over standard network protocols such that any number and combination of sensors and user agents, either on the local network or connected via data link, can register with the SMS at any time during the mission. The SMS provides control over sensor data collection to handle logging and routing of data products to subscribing user agents. It also supports the addition of algorithmic data processing agents for feature/target extraction and provides for subsequent cueing from one sensor to another. The SMS architecture was designed to scale from a small UAV carrying a limited number of payloads to an aircraft carrying a large number of payloads. The SMS system is STANAG 4575 compliant as a removable memory module (RMM) and can act as a vehicle specific module (VSM) to provide STANAG 4586 compliance (level-3 interoperability) to a non-compliant sensor system.
Ren Fang; Yang Zhaojian; Xiong Shibo
A coal-rock interface recognition method based on the multi-sensor data fusion technique is put forward to overcome the limitations of single-sensor recognition methods. The measuring theory based on the multi-sensor data fusion technique is analyzed, and accordingly a test platform for the recognition system is manufactured. The advantage of data fusion with the fuzzy neural network (FNN) technique is investigated. A two-level FNN is constructed and data fusion is carried out. The experiments show that in various conditions the method can always acquire a much higher recognition rate than conventional methods.
Podanchuk, Dmytro V; Goloborodko, Andrey A; Kotov, Myhailo M; Kovalenko, Andrey V; Kurashov, Vitalij N; Dan'ko, Volodymyr P
A new adaptive method of wavefront sensing is proposed and demonstrated. The method is based on the Talbot self-imaging effect, which is observed in an illuminating light beam with strong second-order aberration. Compensation of defocus and astigmatism is achieved with an appropriate choice of size of the rectangular unit cell of the diffraction grating, which is performed iteratively. A liquid-crystal spatial light modulator is used for this purpose. Self-imaging of a rectangular grating in the astigmatic light beam is demonstrated experimentally. High-order aberrations are detected with respect to the compensated second-order aberration. Comparative results of wavefront sensing with a Shack-Hartmann sensor and the proposed sensor are presented. PMID:27140122
Zhao Ji; Ma Zi; Lin Na; Zhu Quanmin
In this paper, a series of new techniques is used to optimize a typical laser scanning sensor. The integrated prototype is compared with the traditional approach to demonstrate the much improved performance. In the research and development, camera calibration is achieved by extracting characteristic points of the laser plane, so that the calibration efficiency is improved significantly. With feedback control of its intensity, the laser is automatically adjusted for different materials. A modified algorithm is presented to improve the accuracy of laser stripe extraction. The fusion of data extracted from the left and right cameras is completed with a re-sampling technique. The scanner is integrated with a robot arm and other machinery for on-line measurement and inspection, which provides a flexible measurement tool for reverse engineering.
Coué, Christophe; Fraichard, Thierry; Bessiere, Pierre; Mazer, Emmanuel
A prerequisite to the design of future Advanced Driver Assistance Systems for cars is a sensing system providing all the information required for high-level driving assistance tasks. Carsense is a European project whose purpose is to develop such a new sensing system. It will combine different sensors (laser, radar and video) and will rely on the fusion of the information coming from these sensors in order to achieve better accuracy, robustness and an increase of the information content. This...
Klimentjew, Denis; Hendrich, Norman; Zhang, Jianwei
This paper proposes multi-sensor fusion based on an effective calibration method for a perception system designed for mobile robots and intended for later object recognition. The perception system consists of a camera and a three-dimensional laser range finder. The three-dimensional laser range finder is based on a two-dimensional laser scanner and a pan-tilt unit as a moving platform. The calibration permits the coalescence of the two most important sensors for three-dim...
Pedro Albertos; Ángel Valera; Ángel Soriano; Marina Vallés; Leonardo Marín
This paper presents a sensor fusion framework that improves the localization of mobile robots with limited computational resources. It employs an event-based Kalman filter to combine the measurements of a global sensor and an inertial measurement unit (IMU) on an event-based schedule, using fewer resources (execution time and bandwidth) but with similar performance when compared to the traditional methods. The event is defined to reflect the necessity of the global information, when the estim...
This paper proposes a new approach for calibrating the dead-reckoning process. Using the well-known UMBmark (University of Michigan Benchmark) is not sufficient for a desirable calibration of dead reckoning. Besides, existing calibration methods usually require explicit measurement of the actual motion of the robot. Some recent methods use a smart encoder trailer or long-range finder sensors, such as ultrasonic or laser range finders, for automatic calibration. Manual measurement is necessary for robots that are not equipped with long-range detectors or such a smart encoder trailer. Our proposed approach uses an environment map, created by fusion of proximity data, in order to calibrate the odometry error automatically. In the new approach, the systematic part of the error is adaptively estimated and compensated by an efficient and incremental maximum likelihood algorithm. Environment map data are fused with the odometry and current sensory data in order to acquire the maximum likelihood estimation. The advantages of the proposed approach are demonstrated in experiments with a Khepera robot. It is shown that the pose estimation error is reduced by more than 80%.
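Incremental estimation of a systematic odometry error can be sketched as a running-mean update of a wheel-scale factor from map-corrected distances. The simple running-mean form and the numbers are illustrative assumptions, not the paper's maximum likelihood formulation:

```python
def update_scale_estimate(scale, count, odom_dist, map_dist):
    """Incrementally refine a systematic odometry scale error as the running
    mean of (map-corrected distance / odometry distance). The running-mean
    form is a hypothetical stand-in for the incremental ML update."""
    ratio = map_dist / odom_dist
    count += 1
    scale += (ratio - scale) / count   # incremental mean update
    return scale, count

# Each pair: distance reported by odometry vs. distance implied by map matching
scale, count = 1.0, 0
for odom, mapped in [(1.00, 1.05), (2.00, 2.08), (0.50, 0.53)]:
    scale, count = update_scale_estimate(scale, count, odom, mapped)
```

Multiplying subsequent odometry readings by the converged `scale` compensates the systematic part of the error without any manual measurement, which is the core of the automatic calibration idea.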
Mukherjee, Subhabrata; Mukherjee, Amitava
In recent years, the wireless sensor network (WSN) has been playing a key role in sensing, collecting and disseminating information in various applications. An important feature associated with WSNs is the development of an efficient data distribution and routing scheme to ensure better quality of service (QoS) that reduces the power consumption and the end-to-end data delivery time. In this work, we propose an adaptive framework to transmit data packets from a source to the sink in a WSN across multiple paths with strategically distributed data packets so as to minimize the power consumption as well as the end-to-end data delivery time.
He, Xiang; Aloi, Daniel N; Li, Jia
Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design. PMID:26694387
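The fusion scheme can be illustrated with a toy one-dimensional particle filter that propagates particles with a motion-sensor step and weights them by a WiFi radio-map likelihood. The radio map, noise levels and corridor geometry are invented for illustration and are not the paper's HMM/graph construction:

```python
import math, random

def multimodal_pf(steps, wifi_obs, wifi_map, n=500,
                  sd_motion=0.3, sd_wifi=4.0, seed=1):
    """Toy 1-D particle filter fusing a motion sensor (propagation) with WiFi
    signal strength (weighting). wifi_map(x) plays the role of the radio map
    built in the offline training phase."""
    rng = random.Random(seed)
    parts = [rng.uniform(0.0, 10.0) for _ in range(n)]  # prior over a 10 m corridor
    for step, rss in zip(steps, wifi_obs):
        # propagate with the motion-sensor displacement plus noise
        parts = [p + step + rng.gauss(0, sd_motion) for p in parts]
        # weight by the WiFi observation likelihood under the radio map
        w = [math.exp(-0.5 * ((rss - wifi_map(p)) / sd_wifi) ** 2) for p in parts]
        total = sum(w) or 1.0
        # resample (weighted draws, a simplified systematic resampling)
        parts = rng.choices(parts, weights=[x / total for x in w], k=n)
    return sum(parts) / n  # posterior-mean position estimate

# Hypothetical linear radio map: RSS falls off 3 dB per metre from -40 dBm
wifi_map = lambda x: -40.0 - 3.0 * x
truth = [2.0, 3.0, 4.0, 5.0]          # true positions after each 1 m step
est = multimodal_pf([1.0] * 4, [wifi_map(x) for x in truth], wifi_map)
```

Each sensing modality covers for the other's weakness: the step sensor keeps the particle cloud moving with the user, while the WiFi likelihood repeatedly collapses the cloud onto the true position.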
Di Mauro, Alessio; Dragoni, Nicola
Energy Harvesting - Wireless Sensor Networks (EH-WSNs) constitute systems of networked sensing nodes that are capable of extracting energy from the environment and that use the harvested energy to operate in a sustainable state. Sustainability, seen as a design goal, has a significant impact on the design of the security protocols for such networks, as the nodes have to adapt and optimize their behaviour according to the available energy. Traditional key management schemes do not take energy into account, making them unsuitable for EH-WSNs. In this paper we propose a new multipath key reinforcement scheme specifically designed for EH-WSNs. The proposed scheme allows each node to take into consideration and adapt to the amount of energy available in the system. In particular, we present two approaches, one static and one fully dynamic, and we discuss some experimental results.
National Aeronautics and Space Administration — It has been proven that the combination of smart sensors with embedded metadata and wireless technologies presents real opportunities for significant improvements in...
Palubinskas, Gintautas; Reinartz, Peter
Multi-resolution image fusion, also known as pansharpening, aims to include spatial information from a high-resolution image, e.g. a panchromatic or Synthetic Aperture Radar (SAR) image, in a low-resolution image, e.g. a multi-spectral or hyper-spectral image, while preserving the spectral properties of the low-resolution image. A signal processing view of this problem allowed us to perform a systematic classification of most of the known multi-resolution image fusion approaches and resul...
Calero Scanlan, David
This thesis analyses the performance that can be obtained in navigation applications by using the camera and sensors embedded in a mobile phone (and GPS when available). The project includes the development of image processing algorithms to extract useful observations for navigation. Navigation is based on the determination of the trajectory, i.e., time, position, velocity and attitude. A good navigation experience requires a period of calibration and characterization of the embedded mobile sensors such...
This thesis deals with feedback control using two different sensor types, force sensors and cameras. In many robotics tasks, compliance is required in order to avoid damage to the workpiece. Force and vision are the most useful sensing capabilities for a robot system operating in an unknown or uncalibrated environment. An overview of vision-based estimation, control, and vision/force control is given. Two different control algorithms based on a hybrid force/vision structure are presented, using...
Kermorgant, Olivier; Chaumette, F.
A low-level sensor fusion scheme is presented for the positioning of a multi-sensor robot. This non-hierarchical framework can be used for robot arms or other velocity-controlled robots, and is part of the task function approach. A stability analysis is presented for the general case; then several control laws illustrate the versatility of the framework. The approach is applied to the multi-camera eye-in-hand/eye-to-hand configuration in visual servoing. Experimental results point out the ...
Log-likelihood Ratio Test (ELRT) rule is derived. Finally, the ROC curve for this model is presented. The simulation results show that the ELRT rule improves the robustness of the system compared with the traditional fusion rule, which does not consider sensor failures.
Christiansen, Martin Peter; Jensen, Kjeld; Ellekilde, Lars-Peter;
Using the detected trees seen in figure 4(b), a localised SLAM map of the surrounding area can be created and used to determine the localisation of the tractor. This kind of sensor fusion is used to keep the amount of prior information about the layout of the orchard to a minimum, so it can be used...
Milisavljevic, N.; Broek, S.P. van den; Bloch, I.; Schwering, P.B.W.; Lensen, H.A.; Acheroy, M.
In this paper, two methods for fusion of mine detection sensors are presented, based on belief functions and on voting procedures, respectively. Their application is illustrated and compared on a real multisensor data set collected at the TNO test facilities under the HOM 2000 project. This set cont
Cremer, F.; Schavemaker, J.G.M.; Breejen, E. den; Schutte, K.
In this paper we present the sensor-fusion results based on the measurements obtained within the European research project GEODE (Ground Explosive Ordnance DEtection system) that strives for the realisation of a vehicle-mounted, multisensor, anti-personnel land-mine detection system for humanitarian
Bolshakova, I.; Ďuran, Ivan; Holyaka, R.; Hristoforou, E.; Marusenkov, A.
Roč. 5, č. 1 (2007), s. 283-288. ISSN 1546-198X R&D Projects: GA AV ČR KJB100430504 Institutional research plan: CEZ:AV0Z20430508 Keywords : Galvanomagnetic * Sensor * Fusion Reactor * Magnetic Diagnostics * Radiation Hardness Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders Impact factor: 1.587, year: 2007
Etoh, T. Goji; Dao, V. T. S.; Nguyen, Quang A.; Kimata, M.
Most targets of ultra-high-speed video cameras operating at more than 1 Mfps, such as combustion, crack propagation, collision, plasma, spark discharge, an air bag in a car accident or a tire under sudden braking, generate sudden heat. Researchers in these fields require tools to measure the high-speed motion and heat simultaneously. Ultra-high frame rate imaging is achieved by an in-situ storage image sensor: each pixel of the sensor is equipped with multiple memory elements to record a series of image signals simultaneously at all pixels, and the image signals stored in each pixel are read out after the image capturing operation. In 2002, we developed an in-situ storage image sensor operating at 1 Mfps 1). However, the fill factor of the sensor was only 15% due to a light shield covering the wide in-situ storage area. Therefore, in 2011, we developed a backside-illuminated (BSI) in-situ storage image sensor to increase the sensitivity, with a 100% fill factor and a very high quantum efficiency 2). The sensor also achieved a much higher frame rate, 16.7 Mfps, thanks to the wiring on the front side with more freedom 3). The BSI structure has the further advantage that it presents fewer difficulties in attaching an additional layer, such as a scintillator, on the backside. This paper proposes the development of an ultra-high-speed IR image sensor combining advanced nano-technologies for IR imaging with the in-situ storage technology for ultra-high-speed imaging, with a discussion of issues in the integration.
As technology develops rapidly, botnet threats to the global cyber community are also increasing, and botnet detection has recently become a major research topic in the field of network security. Most current detection approaches work only on evidence from a single information source, which cannot hold all the traces of a botnet and hardly achieves high accuracy. In this paper, a novel botnet detection architecture based on heterogeneous multi-sensor information fusion is proposed. The architecture is designed to carry out information integration at the three fusion levels of data, feature, and decision. As the core component, a feature extraction module is also elaborately designed, and an extended algorithm of Dempster-Shafer (D-S) theory is proved and adopted in decision fusion. Furthermore, a representative case is provided to illustrate that the detection architecture can effectively fuse complicated information from various sensors and thus achieve a better detection effect.
This paper is concerned with the estimation of a dynamic stochastic variable in a sensor network, where the quantization of scalar measurements, the optimization of bandwidth scheduling, and the characteristics of the transmission channels are considered. For imperfect channels with missing measurements, two weighted measurement fusion (WMF) quantized Kalman filters based on the quantized measurements arriving at the fusion center are presented. One depends on knowing whether a measurement was received; the other depends on the probability of missing measurements. Both have reduced computational cost and the same accuracy as the corresponding centralized fusion filter. An approximate solution for the optimal bandwidth-scheduling problem is given under a limited bandwidth constraint. Furthermore, the vector measurement case is also discussed. Simulation results show the effectiveness of the approach.
Mollenhauer, Hannes; Remmler, Paul; Schuhmann, Gudrun; Lausch, Angela; Merbach, Ines; Assing, Martin; Mollenhauer, Olaf; Dietrich, Peter; Bumberger, Jan
Nutrients such as nitrogen play a key role in the plant life cycle: they are needed for chlorophyll production and other plant cell components, so crop yield is strongly affected by plant nutrient status. Due to the spatial and temporal variability of soil characteristics and varying agricultural inputs, plant development varies within a field. The determination of these fluctuations in plant development is therefore valuable for detecting stress conditions and optimizing fertilisation, given its high environmental and economic impact. Plant parameters play crucial roles in plant growth estimation and prediction since they are used as indicators of plant performance. Indices derived from remote sensing techniques, in particular, provide quantitative information about agricultural crops instantaneously and, above all, non-destructively. Due to the specific absorption of certain plant pigments, a characteristic spectral signature can be seen in the visible and IR parts of the electromagnetic spectrum, known as narrow-band peaks. In an analogous manner, the presence and concentration of different nutrients cause a characteristic spectral signature. To this end, an adequate remote sensing monitoring concept is needed, considering the heterogeneity and dynamics of the plant population as well as economical aspects. This work presents the development and field investigation of an inexpensive multichannel radiation sensor that observes the incoming and reflected parts, or rather distinct wavelengths, of the solar light spectrum on the crop and facilitates the determination of different plant indices. Based on the selected sensor wavelengths, the sensing device allows the detection of specific parameters, e.g. plant vitality, chlorophyll content or nitrogen content. Besides the improvement of the sensor characteristic, the simple wavelength adaption, and the price-performance ratio, the achievement of appropriate energy efficiency as well as a
Ahmad, Omar; Bona, Basilio; Anjum, Muhammad Latif
This paper presents sensor data fusion using an Unscented Kalman Filter (UKF) to implement a high-performance vestibulo-ocular reflex (VOR) based vision tracking system for mobile robots. Information from various sensors must be integrated using an efficient sensor fusion algorithm to achieve a continuous and robust vision tracking system. We use data from a low-cost accelerometer, gyroscope, and encoders to calculate robot motion information. The Unscented Kalman Filter is used as an effi...
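The UKF used in this work is considerably more involved; as a minimal stand-in that still conveys why fusing gyroscope and accelerometer data helps, here is the classic complementary filter. All rates, gains, and the drift scenario below are illustrative assumptions, not the paper's setup.

```python
def complementary(theta, gyro_rate, accel_theta, dt, alpha=0.98):
    """Blend the integrated gyro rate (trusted over short horizons)
    with the accelerometer tilt (a drift-free long-term anchor)."""
    return alpha * (theta + gyro_rate * dt) + (1 - alpha) * accel_theta

# A gyro with a constant +0.05 rad/s bias would drift to 0.5 rad after
# 10 s of pure integration; the accelerometer term bounds that drift.
theta = 0.0
for _ in range(1000):                           # 10 s at 100 Hz
    theta = complementary(theta, gyro_rate=0.05, accel_theta=0.0, dt=0.01)
```

The residual steady-state offset is alpha * bias * dt / (1 - alpha), about 0.0245 rad here, instead of unbounded drift.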
Sang Won Yoon
The Mecanum automated guided vehicle (AGV), which can move in any direction by using a special wheel structure with a LIM-wheel and a diagonally positioned roller, holds considerable promise for the field of industrial electronics. Conventional methods for Mecanum AGV localization have certain limitations, such as slip phenomena, because there are variations in the road surface and ground friction. Precise localization is therefore a very important issue when slip is inevitable, so a sensor fusion technique based on the Kalman filter was developed to cope with this drawback. An encoder and StarGazer were used for the sensor fusion. StarGazer, a position sensor based on an image recognition device, always generates some errors due to the limitations of image recognition, while the encoder's errors accumulate over time, although it is free of the image-recognition errors. In this study, we developed a Mecanum AGV prototype system and showed by simulation that the disadvantages of each sensor can be eliminated. We obtained precise localization of the Mecanum AGV in slip situations via sensor fusion using a Kalman filter.
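The scalar essence of this encoder/absolute-fix fusion can be sketched with a 1-D Kalman filter: odometry drives the prediction, the StarGazer-like fix drives the update. Every noise figure below is an illustrative assumption, not a value from the paper.

```python
import random

def predict(x, P, u, Q):
    """Propagate the 1-D position estimate with an encoder displacement u."""
    return x + u, P + Q

def update(x, P, z, R):
    """Fuse an absolute position fix z (measurement variance R)."""
    K = P / (P + R)                         # Kalman gain
    return x + K * (z - x), (1.0 - K) * P

# Toy run: encoder odometry corrupted by wheel slip, corrected by fixes.
rng = random.Random(0)
truth, x, P = 0.0, 0.0, 1.0
for _ in range(100):
    truth += 0.1                            # AGV actually moves 0.1 m
    slip = rng.gauss(0.0, 0.05)             # slip corrupts the odometry
    x, P = predict(x, P, 0.1 + slip, 0.05 ** 2)
    z = truth + rng.gauss(0.0, 0.2)         # StarGazer-like absolute fix
    x, P = update(x, P, z, 0.2 ** 2)
```

The posterior variance P settles near 0.009 m², far below either sensor alone, which is the point the abstract makes about combining the two.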
Jesse S. Jin
Sensor data fusion technology can be used to extract useful information from multiple sensor observations. It has been widely applied in applications such as target tracking, surveillance, robot navigation, and signal and image processing. This paper introduces a novel data fusion approach for a multiple-radiation-sensor environment using Dempster-Shafer evidence theory. The methodology is used to predict cloud presence based on the inputs of radiation sensors, and different radiation data have been used for the cloud prediction. Potential application areas of the algorithm include renewable power and virtual power stations, where the prediction of cloud presence is the most challenging issue for photovoltaic output. The algorithm is validated by comparing the predicted cloud presence with the corresponding sunshine occurrence data recorded as the benchmark. Our experiments indicate that, compared to approaches using individual sensors, the proposed data fusion approach can increase the correct rate of cloud prediction by ten percent and decrease the unknown rate of cloud prediction by twenty-three percent.
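The core of such an approach, Dempster's rule of combination, can be sketched for a two-hypothesis cloud/clear frame of discernment. The mass assignments below are illustrative, not the paper's sensor models.

```python
def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions whose focal elements are
    frozensets; conflict mass is redistributed by normalization."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb             # empty intersection
    if conflict >= 1.0:
        raise ValueError("total conflict; sources are incompatible")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

CLOUD, CLEAR = frozenset({"cloud"}), frozenset({"clear"})
EITHER = CLOUD | CLEAR                          # ignorance
# Two radiation sensors, each leaving some mass on ignorance:
m1 = {CLOUD: 0.6, CLEAR: 0.1, EITHER: 0.3}
m2 = {CLOUD: 0.5, CLEAR: 0.2, EITHER: 0.3}
fused = dempster_combine(m1, m2)
```

Two weakly confident sources that agree produce a sharply more confident fused belief in "cloud", which is the mechanism behind the reported accuracy gain.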
Giompapa, S.; Croci, R.; Di Stefano, R.; Farina, A.; Gini, F.; Graziano, A.; Lapierre, F.
This paper describes the classification function of naval targets performed by an infrared camera (IR) and an electro-optical camera (EO) that operate in a more complex multisensor system for the surveillance of a coastal region. The following naval targets are considered: high speed dinghy, motor boat, fishing boat, oil tanker. Target classification is automatically performed by exploiting the knowledge of the sensor confusion matrix (CM). The CM is analytically computed as a function of the sensor noise features, the sensor resolution, and the dimension of the involved image database. For both the sensors, a database of images is generated exploiting a three-dimensional (3D) Computer Aided Design (CAD) of the target, for the four types of ship mentioned above. For the EO camera, the image generation is simply obtained by the projection of the 3D CAD on the camera focal plane. For the IR images simulation, firstly the surface temperatures are computed using an Open-source Software for Modelling and Simulation of Infrared Signatures (OSMOSIS) that efficiently integrates the dependence of the emissivity upon the surface temperature, the wavelength, and the elevation angle. The software is applicable to realistic ship geometries. Secondly, these temperatures and the environment features are used to predict realistic IR images. The local decisions on the class are made using the elements of the confusion matrix of each sensor and they are fused according to a maximum likelihood (ML) rule. The global performance of the classification process is measured in terms of the global confusion matrix of the integrated system. This analytical approach can effectively reduce the computational load of a Monte Carlo simulation, when the sensors described here are introduced in a more complex multisensor system for the maritime surveillance.
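The maximum-likelihood fusion of local class decisions via per-sensor confusion matrices can be sketched as follows. The toy confusion matrices are assumptions for illustration, not the CMs computed analytically in the paper.

```python
def ml_fuse(reports, confusions, classes):
    """Maximum-likelihood fusion of per-sensor class decisions.
    confusions[s][true][decided] = P(sensor s decides `decided` | `true`)."""
    best, best_l = None, -1.0
    for c in classes:
        likelihood = 1.0
        for decided, cm in zip(reports, confusions):
            likelihood *= cm[c][decided]
        if likelihood > best_l:
            best, best_l = c, likelihood
    return best

classes = ["dinghy", "motorboat", "fishing", "tanker"]

def make_cm(acc):
    """Toy symmetric confusion matrix with diagonal accuracy `acc`."""
    off = (1.0 - acc) / 3.0
    return {t: {d: (acc if d == t else off) for d in classes}
            for t in classes}

eo, ir = make_cm(0.8), make_cm(0.6)     # EO more reliable than IR here
# EO says motorboat, IR says dinghy: the more reliable EO should win.
fused_class = ml_fuse(["motorboat", "dinghy"], [eo, ir], classes)
```

With the assumed matrices the disagreement resolves in favour of the EO camera, since its likelihood column dominates the product.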
As part of the First Wall/Blanket/Shield Engineering Test Program, a test bed called FELIX (Fusion ELectromagnetic Induction eXperiment) is now under construction at ANL. Its purpose will be to test, evaluate, and develop computer codes for the prediction of electromagnetically induced phenomena in a magnetic environment modeling that of a fusion reactor. Crucial to this process is the sensing and recording of the various induced effects. Sensor evaluation for FELIX has reached the point where most sensor types have been evaluated and preliminary decisions are being made as to the type and quantity for the initial FELIX experiments. These early experiments, the first (flat plate) experiment in particular, will be aimed at testing the sensors as well as the pertinent theories involved. The reason for these evaluations, decisions, and proof tests is the harsh electrical and magnetic environment that FELIX presents.
The focus of this thesis is on studying diverse techniques, methods and sensors for position and orientation determination with application to augmented reality applications. In Chapter 2 we reviewed a variety of existing techniques and systems for position determination. From a practical point of v
An on-line adaptive technique is developed to provide a self-contained redundant-sensor navigation system with the capability to realize its full potential in reliability and performance. This adaptive system is structured as a multistage stochastic process of detection, identification, and compensation. It is shown that the detection system can be effectively constructed on the basis of a design value of the unknown parameter in the actual system, specified by mission requirements, and of a degradation mode in the form of a constant bias jump. A suboptimal detection system based on Wald's sequential analysis is developed using the concepts of information value and information feedback. The developed system is easily implemented and demonstrates performance remarkably close to that of the optimal nonlinear detection system. An invariant transformation is derived to eliminate the effect of nuisance parameters, so that the ambiguous identification system can be reduced to a set of disjoint simple hypothesis tests. By applying a technique of decoupled bias estimation in the compensation system, the adaptive system can be operated without any complicated reorganization.
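Wald's sequential test for a constant bias jump, the building block of the detection stage, can be sketched as follows. The thresholds, noise figures, and Gaussian residual model are illustrative assumptions, not the paper's design values.

```python
import math
import random

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test: H0 = no bias (mean mu0)
    vs H1 = bias jump (mean mu1), Gaussian residuals with known sigma."""
    upper = math.log((1.0 - beta) / alpha)      # accept H1 above this
    lower = math.log(beta / (1.0 - alpha))      # accept H0 below this
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # Log-likelihood ratio increment for a Gaussian mean shift.
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr >= upper:
            return "bias detected", n
        if llr <= lower:
            return "no bias", n
    return "undecided", len(samples)

# Toy run: sensor residuals carrying a 1.0 bias jump are flagged quickly.
rng = random.Random(3)
biased = [1.0 + rng.gauss(0.0, 0.5) for _ in range(50)]
decision, n_used = sprt(biased, mu0=0.0, mu1=1.0, sigma=0.5)
```

The appeal, as in the abstract, is that the test stops as soon as the accumulated evidence crosses either threshold rather than waiting for a fixed batch.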
The traditional way for a mobile robot to measure range is with a sonar sensor. Because of differing working environments, it is very difficult to obtain high precision using just one single method of range measurement, so a hybrid sonar sensor and laser scanner method is put forward to overcome these shortcomings. A novel fusion model is proposed based on the basic theory and methods of information fusion, and an optimal measurement result is obtained by fusing information from the different sensors. After a large number of experiments and performance analysis, it can be concluded that the laser scanner and sonar method with multi-sensor information fusion has higher precision than the sonar-only method, and that this holds across different environments.
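A common minimal model of such optimal fusion is the inverse-variance weighted average, sketched below. Whether this matches the paper's fusion model is not stated in the abstract, and the range and variance figures are assumptions.

```python
def fuse_ranges(readings):
    """Minimum-variance fusion of independent range estimates.
    `readings` is a list of (distance, variance) pairs; weights are
    inverse variances, so the more precise sensor dominates."""
    inv = [1.0 / var for _, var in readings]
    total = sum(inv)
    d = sum(w * r for (r, _), w in zip(readings, inv)) / total
    return d, 1.0 / total               # fused distance, fused variance

# Hypothetical figures: sonar is coarse, the laser scanner is precise.
d, var = fuse_ranges([(2.10, 0.04), (2.02, 0.0025)])
```

The fused variance is always smaller than the best individual sensor's, which is the quantitative sense in which fusion beats the sonar-only method.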
Cameron, S.M.; Stantz, K.M.; Trahan, M.W.; Wagner, J.S.
Comprehensive management of the battle-space has created new requirements in information management, communication, and interoperability as they affect surveillance and situational awareness. The objective of this proposal is to expand intelligent controls theory to produce a uniquely powerful implementation of distributed ground-based measurement incorporating both local collective behavior and interoperative global optimization for sensor fusion and mission oversight. By using a layered hierarchical control architecture to orchestrate adaptive reconfiguration of autonomous robotic agents, we can improve overall robustness and functionality in dynamic tactical environments without information bottlenecks. In this concept, each sensor is equipped with a miniaturized optical reflectance modulator which is interactively monitored as a remote transponder using a covert laser communication protocol from a remote mothership or operative. Robot data-sharing at the ground level can be leveraged with global evaluation criteria, including terrain overlays and remote imaging data. Information sharing and distributed intelligence open up a new class of remote-sensing applications in which small single-function autonomous observers at the local level can collectively optimize and measure large-scale ground-level signals. As the need for coverage and the number of agents grow to improve spatial resolution, cooperative behavior orchestrated by a global situational awareness umbrella will be an essential ingredient to offset increasing bandwidth requirements within the net. A system of the type described in this proposal will be capable of sensitively detecting, tracking, and mapping spatial distributions of measurement signatures which are non-stationary or obscured by clutter and interfering obstacles, by virtue of adaptive reconfiguration. This methodology could be used, for example, to field an adaptive ground-penetrating radar for detection of underground structures in
Ren, M. J.; Liu, M. Y.; Cheung, C. F.; Yin, Y. H.
Along with the rapid development of the science and technology in fields such as space optics, multi-scale enriched freeform surfaces are widely used to enhance the performance of the optical systems in both functionality and size reduction. Multi-sensor technology is considered as one of the promising methods to measure and characterize these surfaces at multiple scales. This paper presents a multi-sensor data fusion based measurement method to purposely extract the geometric information of the components with different scales which is used to establish a holistic geometry of the surface via data fusion. To address the key problems of multi-sensor data fusion, an intrinsic feature pattern based surface registration method is developed to transform the measured datasets to a common coordinate frame. Gaussian zero-order regression filter is then used to separate each measured data in different scales, and the datasets are fused based on an edge intensity data fusion algorithm within the same wavelength. The fused data at different scales is then merged to form a new surface with holistic multiscale information. Experimental study is presented to verify the effectiveness of the proposed method.
As limited energy is one of the tough challenges in wireless sensor networks (WSNs), energy saving is important for increasing the lifecycle of the network. Data fusion enables combining information from several sources to provide a unified view, which can significantly save sensor energy and enhance the accuracy of the sensed data. In this paper, we propose a cluster-based data fusion algorithm for event detection. We use the k-means algorithm to form the nodes into clusters, which significantly reduces the energy consumption of intra-cluster communication. The distances between cluster heads and the event, and the energy of the clusters, are fuzzified so that fuzzy logic can select the clusters that will participate in data uploading and fusion. A fuzzy logic method is also used by cluster heads for local decisions, and the local decision results are sent to the base station. Decision-level fusion for the final decision on the event is performed by the base station according to the uploaded local decisions and the fusion support degree of the clusters, calculated by the fuzzy logic method. The effectiveness of this algorithm is demonstrated by simulation results.
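The fuzzified cluster-selection step can be sketched with triangular membership functions and the min rule for fuzzy AND. All membership breakpoints below are illustrative assumptions, not the paper's tuning.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def participation_degree(distance, energy):
    """Fuzzy support for letting a cluster join the fusion: high when
    its head is near the event AND its residual energy is high."""
    near = tri(distance, -1.0, 0.0, 60.0)       # "near the event" (m)
    charged = tri(energy, 20.0, 100.0, 181.0)   # "well charged" (%)
    return min(near, charged)                   # fuzzy AND (min rule)

# A near, well-charged cluster outranks a distant, depleted one.
good = participation_degree(distance=10.0, energy=90.0)
poor = participation_degree(distance=55.0, energy=30.0)
```

Ranking clusters by this degree and uploading only from the top-ranked ones is what saves the intra-network transmission energy the abstract targets.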
Hellickson, Dean; Richards, Paul; Reynolds, Zane; Keener, Joshua
Traditional site security systems are susceptible to high individual-sensor nuisance alarm rates that reduce the overall system effectiveness. Visual assessment of intrusions can be intensive and manually difficult as cameras are slewed by the system to non-intrusion areas or as operators respond to nuisance alarms, and very little system intrusion performance data are available other than discrete sensor alarm indications that provide no real value. This paper discusses the system architecture, integration, and display of a multi-sensor data-fused system for wide-area surveillance, local site intrusion detection, and intrusion classification. The incorporation of a novel seismic array of smart sensors using FK-beamforming processing, which greatly enhances the overall detection and classification performance of the system, is discussed. Recent test data demonstrate the performance of the seismic array in several different installations and its ability to classify and track moving targets at significant standoff distances with exceptional immunity to background clutter and noise. Multi-sensor data fusion is applied across a suite of complementary sensors, eliminating almost all nuisance alarms, while integration with a geographical information system feeds a visual-fusion display of the area being secured. Real-time sensor detection and intrusion classification data are presented within the visual-fusion display, providing greatly enhanced situational awareness, system performance information, and real-time assessment of intrusions and situations of interest with limited security-operator involvement. The approach scales from a small local perimeter to a very large geographical area and can be used across multiple sites controlled from a single command and control station.
Wiley, J.C. [Univ. of Texas, Austin, TX (United States)
The author describes a general `hp` finite element method with adaptive grids. The code was based on the work of Oden et al. The term `hp` refers to the method of spatial refinement (h) in conjunction with the order of the polynomials used as part of the finite element discretization (p). This finite element code handles well the different mesh grid sizes occurring between abutted grids with different resolutions.
Ricardo Anacleto; Lino Figueiredo; Ana Almeida; Paulo Novais; António Meireles
A pedestrian inertial navigation system is typically used to overcome the limitations of the Global Navigation Satellite System when tracking persons indoors or in dense environments. However, low-cost inertial systems produce huge location estimation errors due to the inherent characteristics of the sensors and of pedestrian dead reckoning. To suppress some of these errors, we propose a system that uses two inertial measurement units spread over the person's body, whose measurements are aggregated using learning algorithm...
Ďuran, Ivan; Sentkerestiová, J.; Kohout, Michal; Mušálek, Radek; Viererbl, L.; Kovařík, Karel
Vol. 1612. MELVILLE: American Institute of Physics, 2014 - (Gorini, G.; Orsitto, F.; Sozzi, C.; Tardocchi, M.), s. 31-34. (AIP Conference Proceedings. 1612). ISBN 978-0-7354-1248-4. ISSN 0094-243X. [International Conference on Fusion Reactor Diagnostics. Villa Monastero,Varenna (IT), 09.09.2013-13.09.2013] R&D Projects: GA MŠk(CZ) LM2011021 Institutional support: RVO:61389021 ; RVO:68378271 Keywords : Hall sensors * fusion * magnetic diagnostic * radiation hardness Subject RIV: BL - Plasma and Gas Discharge Physics; BL - Plasma and Gas Discharge Physics (FZU-D) http://scitation.aip.org/content/aip/proceeding/aipcp/10.1063/1.4894020
This paper presents an adaptive amplifier that is part of a sensor node in a wireless sensor network. The system has a target gain that must be maintained, without direct human intervention, despite the presence of faults; in addition, its bandwidth must be as large as possible. The system is composed of a software-based built-in self-test scheme implemented in the node that checks all the available gains in the amplifiers, a reconfigurable amplifier, and a genetic algorithm (GA) for reconfiguring the node resources that runs on a host computer. We adopt a PSoC device from Cypress for the node implementation. The performance of the scheme is evaluated using four different types of fault models in the amplifier gains. The fault simulation results show that the GA finds the target gain with low error, maintains the bandwidth above the minimum tolerable bandwidth, and has a lower runtime than an exhaustive search.
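A minimal GA of the kind described, searching discrete amplifier-gain configurations for a target overall gain, might look as follows. The gain sets, GA parameters, and fitness function are illustrative assumptions, not the paper's PSoC setup.

```python
import random

def ga_search(fitness, genes, pop_size=30, generations=60, seed=7):
    """Minimal generational GA: truncation selection, one-point
    crossover, single-point mutation, best-ever tracking."""
    rng = random.Random(seed)
    pop = [[rng.choice(g) for g in genes] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(generations):
        parents = sorted(pop, key=fitness)[: pop_size // 2]
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(genes))
            child = a[:cut] + b[cut:]           # one-point crossover
            i = rng.randrange(len(genes))
            child[i] = rng.choice(genes[i])     # point mutation
            children.append(child)
        pop = children
        cand = min(pop, key=fitness)
        if fitness(cand) < fitness(best):
            best = cand
    return best

# Two cascaded gain stages, each offering a few discrete settings; find
# the configuration whose product is closest to a target gain of 48.
stages = [[1, 2, 4, 8, 16], [1, 2, 3, 5, 7]]
best = ga_search(lambda g: abs(g[0] * g[1] - 48), stages)
```

For a space this small an exhaustive search is of course trivial; the GA pays off when, as in the paper, many stages and fault-dependent constraints blow up the configuration space.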
In Wireless Sensor Networks (WSNs), when an event is detected there is an increase in data traffic that might lead to packets being transmitted through the network close to the packet handling capacity of the WSN. The WSN then experiences a decrease in network performance due to packet loss, long delays, and reduced throughput. In this paper we develop an adaptive congestion control algorithm that monitors network utilization and adjusts traffic levels and/or increases network resources to improve throughput and conserve energy. The traffic congestion control protocol DelStatic is developed by introducing a backpressure mechanism into NOAH. We analyzed various routing protocols and established that DSR has a higher resource congestion control capability. The proposed protocol, ACCP, uses a sink switching algorithm to trigger DelStatic or DSR feedback to a congested node based on its Node Rank. From the simulation results, the ACCP protocol not only improves throughput but also conserves energy, which is critical to sensor application survivability in the field. Our adaptive congestion control achieved reliability, high throughput, and energy efficiency.
Mukherjee, Subhabrata; Naskar, Mrinal K; Mukherjee, Amitava
The wireless sensor network is a collection of energy-constrained nodes whose objective is to sense, collect, and process information for some ad-hoc purpose. Typically the nodes are deployed in geographically inaccessible regions, so the most challenging task is to design a network with minimal power consumption. As the nodes have to collect and process data very fast, minimizing data delivery time is another objective. In addition, when multiple sources transmit data simultaneously, the network load gradually increases and may lead to congestion. In this paper we propose an adaptive framework in which multiple sources transmit data simultaneously with minimal end-to-end data delivery time and minimal energy consumption, while ensuring that congestion remains low so that a minimal number of data packets are dropped. This framework has been used over MAC 802.11 and extensive simulations have be...
Zhang, Qiong; Maldague, Xavier
A novel nonsubsampled contourlet transform (NSCT) based image fusion approach, implementing an adaptive-Gaussian (AG) fuzzy membership method, a compressed sensing (CS) technique, and a total variation (TV) based gradient descent reconstruction algorithm, is proposed for the fusion of infrared and visible images. Compared with wavelets, contourlets, or any other multi-resolution analysis method, NSCT has many evident advantages, such as multi-scale and multi-direction analysis and translation invariance. A fuzzy set is characterized by its membership function (MF), and the commonly used Gaussian fuzzy membership degree can be introduced to establish adaptive control of the fusion processing. The compressed sensing technique can sparsely sample the image information at a certain sampling rate, and the sparse signal can be recovered by solving a convex problem with a gradient descent based iterative algorithm. In the proposed fusion process, the pre-enhanced infrared image and the visible image are first decomposed into low-frequency and high-frequency subbands via the NSCT method. The low-frequency coefficients are fused using the adaptive regional average energy rule; the highest-frequency coefficients are fused using the maximum absolute selection rule; the other high-frequency coefficients are sparsely sampled, fused using the adaptive-Gaussian regional standard deviation rule, and then recovered by employing the total variation based gradient descent recovery algorithm. Experimental results and human visual perception illustrate the effectiveness and advantages of the proposed fusion approach. The efficiency and robustness are also analyzed and discussed through different evaluation methods, such as the standard deviation, Shannon entropy, root-mean-square error, mutual information, and an edge-based similarity index.
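The two simplest fusion rules mentioned, maximum-absolute selection for detail coefficients and averaging for approximation coefficients, can be sketched on plain coefficient lists. The NSCT decomposition itself is omitted, and the adaptive regional weighting is replaced by a fixed weight for brevity.

```python
def fuse_highfreq(a, b):
    """Maximum-absolute-selection rule for detail coefficients: keep,
    per position, the coefficient with the larger magnitude (the one
    presumed to carry the stronger edge or texture)."""
    return [x if abs(x) >= abs(y) else y for x, y in zip(a, b)]

def fuse_lowfreq(a, b, w=0.5):
    """Averaging rule for approximation coefficients; the paper uses a
    regional-energy-adaptive weight, fixed at 0.5 here for brevity."""
    return [w * x + (1.0 - w) * y for x, y in zip(a, b)]

# Toy detail coefficients from an IR and a visible subband:
ir_detail, vis_detail = [0.9, -0.1, 0.3], [-0.2, 0.8, -0.7]
fused = fuse_highfreq(ir_detail, vis_detail)
```

In the full method these rules are applied per subband and per direction of the NSCT pyramid before the inverse transform reassembles the fused image.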
An imaging system is described that applies knowledge-based technology to supervise and control both the sensor hardware and the computation in the imaging system. The work includes the development of an imaging-system breadboard that brings together into one system work that we and others have pursued for LaRC for several years. The goal is to combine digital signal processing (DSP) with knowledge-based processing and to include neural-net processing as well. The system is considered a smart camera. Imagine a microgravity experiment on board Space Station Freedom with a high-frame-rate, high-resolution camera: all the data cannot possibly be acquired from a laboratory on Earth; in fact, only a small fraction of the data will be received. Or imagine being responsible for experiments on Mars with the Mars Rover, where the data rate is a few kilobits per second for several sensors and instruments. Would it not be preferable to have a smart system with some human knowledge that could follow instructions and make the best use of the limited transmission bandwidth? The system concept, the current status of the breadboard system, and recent experiments at the Mars-like Amboy Lava Fields in California are discussed.
Nashman, Marilyn; Yoshimi, Billibon; Hong, Tsai Hong; Rippey, William G.; Herman, Martin
This paper describes a real-time hierarchical system that fuses data from vision and touch sensors to improve the performance of a coordinate measuring machine (CMM) used for dimensional inspection tasks. The system consists of sensory processing, world modeling, and task decomposition modules. It uses the strengths of each sensor -- the precision of the CMM scales and the analog touch probe and the global information provided by the low resolution camera -- to improve the speed and flexibility of the inspection task. In the experiment described, the vision module performs all computations in image coordinate space. The part's boundaries are extracted during an initialization process and then the probe's position is continuously updated as it scans and measures the part surface. The system fuses the estimated probe velocity and distance to the part boundary in image coordinates with the estimated velocity and probe position provided by the CMM controller. The fused information provides feedback to the monitor controller as it guides the touch probe to scan the part. We also discuss integrating information from the vision system and the probe to autonomously collect data for 2-D to 3-D calibration, and work to register computer aided design (CAD) models with images of parts in the workplace.
Hol, Jeroen D.
The use of inertial sensors has traditionally been confined primarily to the aviation and marine industries due to their cost and bulkiness. During the last decade, however, inertial sensors have undergone a dramatic reduction in both size and cost with the introduction of MEMS technology. As a result, inertial sensors have become commonplace in many applications and can even be found in many consumer products, for instance smart phones, cameras and game conso...
This paper introduces the basic concept of the Position, Navigation and Timing (PNT) Module as a future part of a shipside Integrated Navigation System (INS). The core of the PNT Module is a sensor-fusion-based processing system (PNT Unit). The paper focuses on important aspects and first results of the initial practical realization of such a PNT Unit, including the realization of a Consistent Common Reference System (CCRS), GNSS/IMU tightly coupled positioning results, and the contingency performance of the inertial sensors.
Changyu He; Peter Kazanzides; Hasan Tutkun Sen; Sungmin Kim; Yue Liu
Optical tracking provides relatively high accuracy over a large workspace but requires line-of-sight between the camera and the markers, which may be difficult to maintain in actual applications. In contrast, inertial sensing does not require line-of-sight but is subject to drift, which may cause large cumulative errors, especially during the measurement of position. To handle cases where some or all of the markers are occluded, this paper proposes an inertial and optical sensor fusion approa...
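The occlusion-handling idea above can be illustrated with a minimal 1-D complementary sketch. The fixed correction gain `k`, the sample period, and the use of `None` to mark occluded frames are all illustrative assumptions, not the paper's algorithm.

```python
def fuse_inertial_optical(accels, optical_fixes, dt=0.01, k=0.5):
    """Dead-reckon position from acceleration; whenever an optical fix
    is available (marker not occluded), pull the estimate toward it
    to bound the inertial drift."""
    pos, vel = 0.0, 0.0
    trace = []
    for a, z in zip(accels, optical_fixes):
        vel += a * dt             # integrate acceleration -> velocity
        pos += vel * dt           # integrate velocity -> position
        if z is not None:         # optical marker visible this frame
            pos += k * (z - pos)  # proportional drift correction
        trace.append(pos)
    return trace
```

With zero acceleration and a constant optical fix at 5.0 the estimate converges to 5.0; during occlusion (`z is None`) it simply coasts on the inertial integration, which is where drift accumulates.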
Shrabani Bhattacharya; R Appavu Raj
We have adopted the state-vector fusion technique for fusing track data from multiple sensors to provide complete and precise trajectory information about the flight vehicle under test, for the purpose of flight-safety monitoring and decision-making at the Test Range. The present paper brings out the performance of the algorithm for different process noise and measurement noise levels using simulated as well as real track data.
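A minimal sketch of state-vector fusion for two independent track estimates, using inverse-covariance weighting. The [position, velocity] state layout and the numbers are illustrative, and this simple rule ignores any track-to-track cross-correlation.

```python
import numpy as np

def state_vector_fusion(x1, P1, x2, P2):
    """Fuse two independent state estimates by inverse-covariance
    weighting: the fused track is pulled toward the more certain one."""
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    Pf = np.linalg.inv(P1i + P2i)      # fused covariance
    xf = Pf @ (P1i @ x1 + P2i @ x2)    # fused state vector
    return xf, Pf

# Illustrative: two sensors tracking [position, velocity] of one vehicle
x1, P1 = np.array([100.0, 10.0]), np.diag([4.0, 1.0])
x2, P2 = np.array([102.0, 9.5]), np.diag([2.0, 0.5])
xf, Pf = state_vector_fusion(x1, P1, x2, P2)
```

The fused covariance is smaller than either input covariance, reflecting the reduced uncertainty after combining the two tracks.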
Martí, Enrique David; García, Jesús; José M. Molina
Many works on context-aware systems make use of location, navigation or tracking services offered by an underlying sensor fusion module as part of the relevant contextual information. The obtained knowledge is typically consumed only by the high-level layers of the system, even though context itself represents a valuable source of information from which every part of the implemented system could benefit. This paper closes the loop, analyzing how context knowledge can be applied to imp...
Paillet, Philippe; Goiffon, Vincent; Chabane, Aziouz; Girard, Sylvain; Rousseau, Adrien; Darbon, Stéphane; Duhamel, Olivier; Raine, Mélanie; Cervantes, Paola; Gaillardin, Marc; Bourgade, Jean-Luc; Magnan, Pierre; Glebov, Vladimir Yu; Pien, Gregory
A hardening method is proposed to enable the use of CMOS image sensors for inertial confinement fusion diagnostics. The mitigation technique improves their radiation tolerance using a reset mode implemented in the device. The results show a reduction of more than 70% in the number of transient white pixels induced in the pixel array by the mixed neutron and γ-ray pulsed radiation environment.
Luo, Ren C.; Lai, Chun C.
This chapter presents consistent map construction in a unitary SLAM (simultaneous localization and mapping) process through a sensor fusion approach and optimal alignment technologies. The system autonomously provides the geometrical structure of the environment for intelligent robot service in a building. In order to build the consistent map, a CI (Covariance Intersection) rule fuses the uncertainty from the wheel encoder and the ICP (Iterative Closest Point) result as a robust initial value of t...
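The CI rule mentioned above combines two estimates whose cross-correlation is unknown, which is exactly the situation with wheel odometry and ICP (both depend on the same robot motion). A sketch, with the mixing weight chosen by a simple grid search (the grid resolution is an arbitrary choice):

```python
import numpy as np

def covariance_intersection(xa, Pa, xb, Pb, n_grid=101):
    """Covariance Intersection: P^-1 = w*Pa^-1 + (1-w)*Pb^-1.
    The fused estimate stays consistent even when the correlation
    between the inputs is unknown; w minimises trace(P)."""
    Pai, Pbi = np.linalg.inv(Pa), np.linalg.inv(Pb)
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):
        P = np.linalg.inv(w * Pai + (1.0 - w) * Pbi)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * Pai @ xa + (1.0 - w) * Pbi @ xb)
            best = (np.trace(P), x, P)
    return best[1], best[2]
```

Because the grid includes w = 0 and w = 1, the fused covariance trace is never worse than the better of the two inputs.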
Fernandez Prades, Carles; Vilà Valls, Jordi
This paper shows the applicability of recently developed Gaussian nonlinear filters to sensor data fusion for positioning purposes. After a brief review of Bayesian nonlinear filtering, we specifically address square-root, derivative-free algorithms based on the Gaussian assumption and on approximation rules for numerical integration, namely the Gauss-Hermite quadrature rule and the cubature rule. Then, we propose a motion model based on the observations taken by an Inertial Measurement U...
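The cubature rule named above approximates Gaussian expectations with 2n equally weighted points placed along the columns of a matrix square root of the covariance. A minimal sketch (the quadratic test function is only for illustration):

```python
import numpy as np

def cubature_points(x, P):
    """Third-degree spherical-radial cubature points for N(x, P):
    2n points at x +/- sqrt(n) * columns of chol(P), equal weights."""
    n = x.size
    S = np.linalg.cholesky(P)               # matrix square root of P
    pts = np.hstack([S, -S]) * np.sqrt(n)   # 2n scaled directions
    return (x[:, None] + pts).T, np.full(2 * n, 1.0 / (2 * n))

def cubature_expectation(g, x, P):
    """Approximate E[g(X)] for X ~ N(x, P) with the cubature rule."""
    pts, w = cubature_points(x, P)
    return sum(wi * g(p) for wi, p in zip(w, pts))
```

The rule is exact for polynomials up to third degree, e.g. E[x0^2] under N(0, I) evaluates to 1; in a cubature Kalman filter the same points propagate the state through the nonlinear motion and measurement models.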