WorldWideScience

Sample records for adaptive sensor fusion

  1. Adaptive sensor fusion using genetic algorithms

    International Nuclear Information System (INIS)

    Fitzgerald, D.S.; Adams, D.G.

    1994-01-01

    Past attempts at sensor fusion have used some form of Boolean logic to combine the sensor information. As an alternative, an adaptive 'fuzzy' sensor fusion technique is described in this paper. This technique exploits the robust capabilities of fuzzy logic in the decision process as well as the optimization features of the genetic algorithm. This paper presents a brief background on fuzzy logic and genetic algorithms and how they are used in an online implementation of adaptive sensor fusion.
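
    The record above describes fusing sensor outputs with fuzzy logic whose parameters are tuned by a genetic algorithm. The sketch below is a minimal illustration of that idea, not the paper's algorithm: two hypothetical sensor confidence values are fused by a tiny fuzzy rule base, and a toy genetic algorithm tunes the membership thresholds against made-up labelled examples.

```python
# Illustrative sketch only (not the paper's algorithm): two sensor confidence
# readings are fused by a small fuzzy rule base, and a genetic algorithm tunes
# the membership-function thresholds against labelled examples.
import random

def ramp_high(x, thr):
    """Ramp membership for 'high': 0 below thr, rising linearly to 1 at x = 1."""
    if thr >= 1.0:
        return 1.0 if x >= 1.0 else 0.0
    return min(max((x - thr) / (1.0 - thr), 0.0), 1.0)

def fuzzy_fuse(s1, s2, thr1, thr2):
    """Fuse two sensor confidences in [0, 1] into a single alarm confidence."""
    h1, h2 = ramp_high(s1, thr1), ramp_high(s2, thr2)
    l1, l2 = 1.0 - h1, 1.0 - h2
    # Rule strengths: both high -> alarm, mixed -> medium, both low -> no alarm.
    r_high = min(h1, h2)
    r_med = max(min(h1, l2), min(l1, h2))
    r_low = min(l1, l2)
    total = r_high + r_med + r_low
    return (1.0 * r_high + 0.5 * r_med + 0.0 * r_low) / total if total else 0.0

def fitness(thresholds, samples):
    """Negative mean squared error against labelled (s1, s2, target) examples."""
    t1, t2 = thresholds
    return -sum((fuzzy_fuse(s1, s2, t1, t2) - y) ** 2 for s1, s2, y in samples)

def genetic_tune(samples, pop_size=30, generations=50):
    """Toy GA: truncation selection, blend crossover, Gaussian mutation."""
    pop = [[random.random(), random.random()] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda ind: fitness(ind, samples), reverse=True)
        parents = scored[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 + random.gauss(0, 0.05) for x, y in zip(a, b)]
            children.append([min(max(g, 0.0), 0.99) for g in child])
        pop = parents + children
    return max(pop, key=lambda ind: fitness(ind, samples))

# Hypothetical training data: (sensor1, sensor2, desired alarm confidence).
data = [(0.9, 0.8, 1.0), (0.2, 0.1, 0.0), (0.7, 0.2, 0.5), (0.1, 0.9, 0.5)]
print(genetic_tune(data))
```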

  2. Multi-Sensor Optimal Data Fusion Based on the Adaptive Fading Unscented Kalman Filter.

    Science.gov (United States)

    Gao, Bingbing; Hu, Gaoge; Gao, Shesheng; Zhong, Yongmin; Gu, Chengfan

    2018-02-06

    This paper presents a new optimal data fusion methodology based on the adaptive fading unscented Kalman filter for multi-sensor nonlinear stochastic systems. This methodology has a two-level fusion structure: at the bottom level, an adaptive fading unscented Kalman filter based on the Mahalanobis distance is developed and serves as the local filter to improve the adaptability and robustness of local state estimations against process-modeling error; at the top level, an unscented transformation-based multi-sensor optimal data fusion for the case of N local filters is established according to the principle of linear minimum variance to calculate the globally optimal state estimation by fusion of local estimations. The proposed methodology effectively suppresses the influence of process-modeling error on the fusion solution, leading to improved adaptability and robustness of data fusion for multi-sensor nonlinear stochastic systems. It also achieves globally optimal fusion results based on the principle of linear minimum variance. Simulation and experimental results demonstrate the efficacy of the proposed methodology for INS/GNSS/CNS (inertial navigation system/global navigation satellite system/celestial navigation system) integrated navigation.
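
    A minimal numeric sketch of the top-level fusion step described above: combining N local state estimates by the linear-minimum-variance (inverse-covariance) rule, assuming uncorrelated local estimation errors. The adaptive fading UKF and the unscented-transformation details of the paper are not reproduced, and all numbers are hypothetical.

```python
# Minimal sketch of the top-level fusion step only: combining N local state
# estimates by the linear-minimum-variance rule, assuming the local estimation
# errors are uncorrelated. The adaptive fading UKF details are not reproduced.
import numpy as np

def fuse_estimates(states, covariances):
    """Fuse local estimates x_i with covariances P_i:
       P = (sum_i P_i^-1)^-1,  x = P * sum_i P_i^-1 x_i."""
    info = sum(np.linalg.inv(P) for P in covariances)
    P_fused = np.linalg.inv(info)
    x_fused = P_fused @ sum(np.linalg.inv(P) @ x for x, P in zip(states, covariances))
    return x_fused, P_fused

# Hypothetical local estimates from three filters (e.g. INS/GNSS, INS/CNS legs).
x1, P1 = np.array([1.02, 0.48]), np.diag([0.04, 0.09])
x2, P2 = np.array([0.98, 0.52]), np.diag([0.09, 0.04])
x3, P3 = np.array([1.05, 0.47]), np.diag([0.25, 0.25])
x, P = fuse_estimates([x1, x2, x3], [P1, P2, P3])
print(x, np.diag(P))
```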

  3. Accurate human limb angle measurement: sensor fusion through Kalman, least mean squares and recursive least-squares adaptive filtering

    Science.gov (United States)

    Olivares, A.; Górriz, J. M.; Ramírez, J.; Olivares, G.

    2011-02-01

    Inertial sensors are widely used in human body motion monitoring systems since they permit us to determine the position of the subject's limbs. Limb angle measurement is carried out through the integration of the angular velocity measured by a rate sensor and the decomposition of the components of static gravity acceleration measured by an accelerometer. Different factors derived from the sensors' nature, such as the angle random walk and dynamic bias, lead to erroneous measurements. Dynamic bias effects can be reduced through the use of adaptive filtering based on sensor fusion concepts. Most existing published works use a Kalman filtering sensor fusion approach. Our aim is to perform a comparative study among different adaptive filters. Several least mean squares (LMS), recursive least squares (RLS) and Kalman filtering variations are tested for the purpose of finding the best method leading to a more accurate and robust limb angle measurement. A new angle wander compensation sensor fusion approach based on LMS and RLS filters has been developed.

  4. Accurate human limb angle measurement: sensor fusion through Kalman, least mean squares and recursive least-squares adaptive filtering

    International Nuclear Information System (INIS)

    Olivares, A; Olivares, G; Górriz, J M; Ramírez, J

    2011-01-01

    Inertial sensors are widely used in human body motion monitoring systems since they permit us to determine the position of the subject's limbs. Limb angle measurement is carried out through the integration of the angular velocity measured by a rate sensor and the decomposition of the components of static gravity acceleration measured by an accelerometer. Different factors derived from the sensors' nature, such as the angle random walk and dynamic bias, lead to erroneous measurements. Dynamic bias effects can be reduced through the use of adaptive filtering based on sensor fusion concepts. Most existing published works use a Kalman filtering sensor fusion approach. Our aim is to perform a comparative study among different adaptive filters. Several least mean squares (LMS), recursive least squares (RLS) and Kalman filtering variations are tested for the purpose of finding the best method leading to a more accurate and robust limb angle measurement. A new angle wander compensation sensor fusion approach based on LMS and RLS filters has been developed.
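
    The two records above fuse an integrated rate-sensor angle with an accelerometer-derived angle to suppress drift. The sketch below shows a generic one-state Kalman fusion of those two signals; it is not the papers' LMS/RLS angle-wander compensation, and the signals and noise levels are made up.

```python
# Generic illustration (not the papers' LMS/RLS scheme): a one-state Kalman
# filter that fuses the integrated rate-gyro angle (prediction) with the
# accelerometer-derived inclination angle (measurement) to limit drift.
import numpy as np

def fuse_angle(gyro_rate, acc_angle, dt=0.01, q=1e-4, r=0.05):
    """gyro_rate: angular velocity samples [rad/s]; acc_angle: gravity-derived
       angle samples [rad]. q, r are process and measurement noise variances."""
    theta, p = acc_angle[0], 1.0
    out = []
    for w, z in zip(gyro_rate, acc_angle):
        theta += w * dt           # predict by integrating the rate sensor
        p += q
        k = p / (p + r)           # Kalman gain
        theta += k * (z - theta)  # correct with the accelerometer angle
        p *= (1.0 - k)
        out.append(theta)
    return np.array(out)

# Hypothetical signals: true angle is a slow sine, gyro has a constant bias.
t = np.arange(0, 10, 0.01)
true = 0.5 * np.sin(0.5 * t)
gyro = np.gradient(true, 0.01) + 0.05 + 0.01 * np.random.randn(t.size)  # biased
acc = true + 0.05 * np.random.randn(t.size)                             # noisy
est = fuse_angle(gyro, acc)
print(float(np.sqrt(np.mean((est - true) ** 2))))  # RMS angle error [rad]
```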

  5. Sensor fusion for mobile robot navigation

    International Nuclear Information System (INIS)

    Kam, M.; Zhu, X.; Kalata, P.

    1997-01-01

    The authors review techniques for sensor fusion in robot navigation, emphasizing algorithms for self-location. These find use when the sensor suite of a mobile robot comprises several different sensors, some complementary and some redundant. Integrating the sensor readings, the robot seeks to accomplish tasks such as constructing a map of its environment, locating itself in that map, and recognizing objects that should be avoided or sought. The review describes integration techniques in two categories: low-level fusion is used for direct integration of sensory data, resulting in parameter and state estimates; high-level fusion is used for indirect integration of sensory data in hierarchical architectures, through command arbitration and integration of control signals suggested by different modules. The review provides an arsenal of tools for addressing this (rather ill-posed) problem in machine intelligence, including Kalman filtering, rule-based techniques, behavior-based algorithms and approaches that borrow from information theory, Dempster-Shafer reasoning, fuzzy logic and neural networks. It points to several further research needs, including: robustness of decision rules; simultaneous consideration of self-location, motion planning, motion control and vehicle dynamics; the effect of sensor placement and attention focusing on sensor fusion; and adaptation of techniques from biological sensor fusion.

  6. Multi-rate sensor fusion-based adaptive discrete finite-time synergetic control for flexible-joint mechanical systems

    International Nuclear Information System (INIS)

    Xue Guang-Yue; Ren Xue-Mei; Xia Yuan-Qing

    2013-01-01

    This paper proposes an adaptive discrete finite-time synergetic control (ADFTSC) scheme based on a multi-rate sensor fusion estimator for flexible-joint mechanical systems in the presence of unmeasured states and dynamic uncertainties. Multi-rate sensors are employed to observe the system states which cannot be directly obtained by encoders due to the existence of joint flexibilities. By using an extended Kalman filter (EKF), the finite-time synergetic controller is designed based on a sensor fusion estimator which estimates states and parameters of the mechanical system with multi-rate measurements. The proposed controller can guarantee the finite-time convergence of tracking errors by the theoretical derivation. Simulation and experimental studies are included to validate the effectiveness of the proposed approach. (general)

  7. An Adaptive Multi-Sensor Data Fusion Method Based on Deep Convolutional Neural Networks for Fault Diagnosis of Planetary Gearbox

    Science.gov (United States)

    Jing, Luyang; Wang, Taiyong; Zhao, Ming; Wang, Peng

    2017-01-01

    A fault diagnosis approach based on multi-sensor data fusion is a promising tool to deal with complicated damage detection problems of mechanical systems. Nevertheless, this approach suffers from two challenges, which are (1) the feature extraction from various types of sensory data and (2) the selection of a suitable fusion level. It is usually difficult to choose an optimal feature or fusion level for a specific fault diagnosis task, and extensive domain expertise and human labor are also highly required during these selections. To address these two challenges, we propose an adaptive multi-sensor data fusion method based on deep convolutional neural networks (DCNN) for fault diagnosis. The proposed method can learn features from raw data and optimize a combination of different fusion levels adaptively to satisfy the requirements of any fault diagnosis task. The proposed method is tested through a planetary gearbox test rig. Handcraft features, manual-selected fusion levels, single sensory data, and two traditional intelligent models, back-propagation neural networks (BPNN) and a support vector machine (SVM), are used as comparisons in the experiment. The results demonstrate that the proposed method is able to detect the conditions of the planetary gearbox effectively with the best diagnosis accuracy among all comparative methods in the experiment. PMID:28230767
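
    The abstract above distinguishes between fusion levels (data-level, feature-level, decision-level) and notes the difficulty of choosing among them. The sketch below illustrates the three levels with stand-in feature-extraction and classification functions; it does not reproduce the paper's deep convolutional network, and the signals are synthetic.

```python
# Conceptual sketch of the three fusion levels the abstract refers to
# (data-level, feature-level, decision-level), using stand-in functions rather
# than the paper's deep convolutional network.
import numpy as np

def extract_features(signal):
    """Stand-in feature extractor (the paper learns features with a DCNN)."""
    return np.array([signal.mean(), signal.std(), np.abs(signal).max()])

def classify(features):
    """Stand-in classifier: returns a fault score in [0, 1]."""
    return float(1.0 / (1.0 + np.exp(-features.sum() + 1.0)))

vib = np.random.randn(1024) * 0.3       # hypothetical vibration channel
acoustic = np.random.randn(1024) * 0.2  # hypothetical acoustic-emission channel

# Data-level fusion: merge raw signals, then extract features and classify.
score_data = classify(extract_features(np.concatenate([vib, acoustic])))

# Feature-level fusion: extract features per sensor, concatenate, classify.
score_feat = classify(np.concatenate([extract_features(vib),
                                      extract_features(acoustic)]))

# Decision-level fusion: classify each sensor separately, combine the scores.
score_dec = float(np.mean([classify(extract_features(vib)),
                           classify(extract_features(acoustic))]))

print(score_data, score_feat, score_dec)
```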

  8. An Approach to Automated Fusion System Design and Adaptation

    Directory of Open Access Journals (Sweden)

    Alexander Fritze

    2017-03-01

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum.

  9. An Approach to Automated Fusion System Design and Adaptation.

    Science.gov (United States)

    Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker

    2017-03-16

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum.

  10. Sensor Data Fusion

    DEFF Research Database (Denmark)

    Plascencia, Alfredo; Stepán, Petr

    2006-01-01

    The main contribution of this paper is to present a sensor fusion approach to scene environment mapping as part of a Sensor Data Fusion (SDF) architecture. This approach combines sonar array readings with stereo vision readings. Sonar readings are interpreted using probability density functions...

  11. Sensor Fusion and Smart Sensor in Sports and Biomedical Applications

    Directory of Open Access Journals (Sweden)

    José Jair Alves Mendes Jr.

    2016-09-01

    The following work presents an overview of smart sensors and sensor fusion targeted at biomedical applications and sports areas. In this work, the integration of these areas is demonstrated, promoting a reflection about techniques and applications to collect, quantify and qualify some physical variables associated with the human body. These techniques are presented in various biomedical and sports applications, which cover areas related to diagnostics, rehabilitation, physical monitoring, and the development of performance in athletes, among others. Although some applications are described in only one of the two fields of study (biomedicine and sports), it is very likely that the same application fits in both, with small peculiarities or adaptations. To illustrate the contemporaneity of applications, an analysis of specialized papers published in the last six years has been made. In this context, the main characteristic of this review is to present the largest quantity of relevant examples of sensor fusion and smart sensors focusing on their utilization and proposals, without deeply addressing one specific system or technique, to the detriment of the others.

  12. Sensor fusion for intelligent alarm analysis

    International Nuclear Information System (INIS)

    Nelson, C.L.; Fitzgerald, D.S.

    1996-01-01

    The purpose of an intelligent alarm analysis system is to provide complete and manageable information to a central alarm station operator by applying alarm processing and fusion techniques to sensor information. This paper discusses the sensor fusion approach taken to perform intelligent alarm analysis for the Advanced Exterior Sensor (AES). The AES is an intrusion detection and assessment system designed for wide-area coverage, quick deployment, low false/nuisance alarm operation, and immediate visual assessment. It combines three sensor technologies (visible, infrared, and millimeter wave radar) collocated on a compact and portable remote sensor module. The remote sensor module rotates at a rate of 1 revolution per second to detect and track motion and provide assessment in a continuous 360 degree field-of-regard. Sensor fusion techniques are used to correlate and integrate the track data from these three sensors into a single track for operator observation. Additional inputs to the fusion process include environmental data, knowledge of sensor performance under certain weather conditions, sensor priority, and recent operator feedback. A confidence value is assigned to the track as a result of the fusion process. This helps to reduce nuisance alarms and to increase operator confidence in the system while reducing the workload of the operator.

  13. Information-Fusion Methods Based Simultaneous Localization and Mapping for Robot Adapting to Search and Rescue Postdisaster Environments

    Directory of Open Access Journals (Sweden)

    Hongling Wang

    2018-01-01

    The first application of unique information-fusion SLAM (IF-SLAM) methods is developed in this paper for mobile robots performing simultaneous localization and mapping (SLAM) adapted to search and rescue (SAR) environments. Several fusion approaches are proposed: parallel measurement filtering, exploration-trajectory fusing, and the combination of sensors' measurements and mobile robots' trajectories. Novel integration particle filter (IPF) and optimal improved EKF (IEKF) algorithms are derived for the information-fusion systems to perform the SLAM task in SAR scenarios. The information-fusion architecture consists of multiple robots and multiple sensors (MAM); the robots mount on-board laser range finder (LRF) sensors, localization sonars, gyro odometry, a Kinect sensor, an RGB-D camera, and other proprioceptive sensors. This information-fusion SLAM (IF-SLAM) is compared with conventional methods, which indicates that the fusion trajectory is more consistent with the estimated and real observation trajectories. Simulations and experiments of the SLAM process are conducted in both a cluttered indoor environment and an outdoor collapsed unstructured scenario, and the experimental results validate the effectiveness of the proposed information-fusion methods in improving SLAM performance in SAR scenarios.

  14. Efficient sensor selection for active information fusion.

    Science.gov (United States)

    Zhang, Yongmian; Ji, Qiang

    2010-06-01

    In our previous paper, we formalized an active information fusion framework based on dynamic Bayesian networks. This paper focuses on a central issue of active information fusion, i.e., the efficient identification of a subset of sensors that are most decision relevant and cost effective. Determining the most informative and cost-effective sensors requires an evaluation of all the possible subsets of sensors, which is computationally intractable, particularly when an information-theoretic criterion such as mutual information is used. To overcome this challenge, we propose a new quantitative measure for sensor synergy, based on which a sensor synergy graph is constructed. Using the sensor synergy graph, we first introduce an alternative measure to multisensor mutual information for characterizing the sensor information gain. We then propose an approximated nonmyopic sensor selection method that can efficiently and near-optimally select a subset of sensors for active fusion. The simulation study demonstrates both the performance and the efficiency of the proposed sensor selection method.
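
    The abstract above concerns choosing a decision-relevant, cost-effective sensor subset. The sketch below is a generic greedy selection under a cost budget using a user-supplied gain function with diminishing returns; it is not the paper's synergy-graph measure, and the sensors, costs and gain values are invented.

```python
# Generic greedy sketch, not the paper's synergy-graph method: select sensors
# one at a time, each step adding the sensor with the best ratio of marginal
# information gain to cost, until a cost budget is exhausted.
from typing import Callable, Dict, Set

def greedy_select(sensors: Dict[str, float],
                  gain: Callable[[Set[str]], float],
                  budget: float) -> Set[str]:
    """sensors maps sensor name -> cost; gain(S) scores a subset S."""
    chosen: Set[str] = set()
    spent = 0.0
    while True:
        best, best_ratio = None, 0.0
        for s, cost in sensors.items():
            if s in chosen or spent + cost > budget:
                continue
            marginal = gain(chosen | {s}) - gain(chosen)
            if cost > 0 and marginal / cost > best_ratio:
                best, best_ratio = s, marginal / cost
        if best is None:
            return chosen
        chosen.add(best)
        spent += sensors[best]

# Hypothetical gain with diminishing returns between partly redundant sensors.
def toy_gain(subset: Set[str]) -> float:
    base = {"ir": 0.6, "radar": 0.5, "acoustic": 0.3}
    g = sum(base[s] for s in subset)
    if {"ir", "radar"} <= subset:
        g -= 0.2  # the two sensors are partly redundant
    return g

print(greedy_select({"ir": 2.0, "radar": 3.0, "acoustic": 1.0}, toy_gain, budget=4.0))
```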

  15. Enhanced chemical weapon warning via sensor fusion

    Science.gov (United States)

    Flaherty, Michael; Pritchett, Daniel; Cothren, Brian; Schwaiger, James

    2011-05-01

    Torch Technologies Inc. is actively involved in chemical sensor networking and data fusion via multi-year efforts with Dugway Proving Ground (DPG) and the Defense Threat Reduction Agency (DTRA). The objective of these efforts is to develop innovative concepts and advanced algorithms that enhance our national Chemical Warfare (CW) test and warning capabilities via the fusion of traditional and non-traditional CW sensor data. Under Phase I, II, and III Small Business Innovative Research (SBIR) contracts with DPG, Torch developed the Advanced Chemical Release Evaluation System (ACRES) software to support non-real-time CW sensor data fusion. Under Phase I and II SBIRs with DTRA, in conjunction with the Edgewood Chemical Biological Center (ECBC), Torch is using the DPG ACRES CW sensor data fuser as a framework from which to develop the Cloud state Estimation in a Networked Sensor Environment (CENSE) data fusion system. Torch is currently developing CENSE to implement and test innovative real-time sensor-network-based data fusion concepts using CW and non-CW ancillary sensor data to improve CW warning and detection in tactical scenarios.

  16. Feasibility study on sensor data fusion for the CP-140 aircraft: fusion architecture analyses

    Science.gov (United States)

    Shahbazian, Elisa

    1995-09-01

    Loral Canada completed (May 1995) a Department of National Defense (DND) Chief of Research and Development (CRAD) contract to study the feasibility of implementing a multi-sensor data fusion (MSDF) system onboard the CP-140 Aurora aircraft. This system is expected to fuse data from: (a) attribute measurement oriented sensors (ESM, IFF, etc.); (b) imaging sensors (FLIR, SAR, etc.); (c) tracking sensors (radar, acoustics, etc.); (d) data from remote platforms (data links); and (e) non-sensor data (intelligence reports, environmental data, visual sightings, encyclopedic data, etc.). Based on purely theoretical considerations, a central-level fusion architecture will lead to a higher-performance fusion system. However, there are a number of system and fusion architecture issues involved in fusing such dissimilar data: (1) the currently existing sensors are not designed to provide the type of data required by a fusion system; (2) the different types of data (attribute, imaging, tracking, etc.) may require different degrees of processing before they can be used within a fusion system efficiently; (3) the data quality from different sensors, and more importantly from remote platforms via the data links, must be taken into account before fusing; and (4) the non-sensor data may impose specific requirements on the fusion architecture (e.g. variable weight/priority for the data from different sensors). This paper presents the analyses performed for the selection of the fusion architecture for the enhanced sensor suite planned for the CP-140 aircraft in the context of the mission requirements and environmental conditions.

  17. Context-Aided Sensor Fusion for Enhanced Urban Navigation

    Science.gov (United States)

    Martí, Enrique David; Martín, David; García, Jesús; de la Escalera, Arturo; Molina, José Manuel; Armingol, José María

    2012-01-01

    The deployment of Intelligent Vehicles in urban environments requires reliable estimation of positioning for urban navigation. The inherent complexity of this kind of environments fosters the development of novel systems which should provide reliable and precise solutions to the vehicle. This article details an advanced GNSS/IMU fusion system based on a context-aided Unscented Kalman filter for navigation in urban conditions. The constrained non-linear filter is here conditioned by a contextual knowledge module which reasons about sensor quality and driving context in order to adapt it to the situation, while at the same time it carries out a continuous estimation and correction of INS drift errors. An exhaustive analysis has been carried out with available data in order to characterize the behavior of available sensors and take it into account in the developed solution. The performance is then analyzed with an extensive dataset containing representative situations. The proposed solution suits the use of fusion algorithms for deploying Intelligent Transport Systems in urban environments. PMID:23223080

  18. Decentralized Sensor Fusion for Ubiquitous Networking Robotics in Urban Areas

    Science.gov (United States)

    Sanfeliu, Alberto; Andrade-Cetto, Juan; Barbosa, Marco; Bowden, Richard; Capitán, Jesús; Corominas, Andreu; Gilbert, Andrew; Illingworth, John; Merino, Luis; Mirats, Josep M.; Moreno, Plínio; Ollero, Aníbal; Sequeira, João; Spaan, Matthijs T.J.

    2010-01-01

    In this article we explain the architecture for the environment and sensors that has been built for the European project URUS (Ubiquitous Networking Robotics in Urban Sites), a project whose objective is to develop an adaptable network robot architecture for cooperation between network robots and human beings and/or the environment in urban areas. The project goal is to deploy a team of robots in an urban area to give a set of services to a user community. This paper addresses the sensor architecture devised for URUS and the type of robots and sensors used, including environment sensors and sensors onboard the robots. Furthermore, we also explain how sensor fusion takes place to achieve urban outdoor execution of robotic services. Finally some results of the project related to the sensor network are highlighted. PMID:22294927

  19. Freeway Multisensor Data Fusion Approach Integrating Data from Cellphone Probes and Fixed Sensors

    Directory of Open Access Journals (Sweden)

    Shanglu He

    2016-01-01

    Freeway traffic state information from multiple sources provides sufficient support for traffic surveillance but also brings challenges. This paper investigates the fusion of a new data combination from a cellular handoff probe system and microwave sensors, and a fusion method based on the neural network technique is proposed. To identify the factors influencing the accuracy of fusion results, we analyzed the sensitivity of those factors by changing the inputs of the neural-network-based fusion model. The results showed that handoff link length and sample size were the most influential parameters for the precision of fusion. Then, the effectiveness and capability of the proposed fusion method under various traffic conditions were evaluated, and a comparative analysis between the proposed method and other fusion approaches was conducted. The results of the simulation test and evaluation showed that the fusion method could complement the drawbacks of each collection method, improve the overall estimation accuracy, adapt to variable traffic conditions (free flow or incident state), suit the fusion of data from cellphone probes and fixed sensors, and outperform other fusion methods.

  20. Energy-efficient Organization of Wireless Sensor Networks with Adaptive Forecasting

    Directory of Open Access Journals (Sweden)

    Dao-Wei Bi

    2008-04-01

    Due to the wide potential applications of wireless sensor networks, this topic has attracted great attention. The strict energy constraints of sensor nodes result in great challenges for energy efficiency. This paper proposes an energy-efficient organization method. The organization of wireless sensor networks is formulated for target tracking. Target localization is achieved by collaborative sensing with multi-sensor fusion. The historical localization results are utilized for adaptive target trajectory forecasting. Combining an autoregressive moving average (ARMA) model and radial basis function networks (RBFNs), robust target position forecasting is performed. Moreover, an energy-efficient organization method is presented to enhance the energy efficiency of wireless sensor networks. The sensor nodes that implement sensing tasks are awakened in a distributed manner. When the sensor nodes transfer their observations to achieve data fusion, the routing scheme is obtained by ant colony optimization. Thus, both the operation and communication energy consumption can be minimized. Experimental results verify that the combination of the ARMA model and RBFN can estimate the target position efficiently and that energy saving is achieved by the proposed organization method in wireless sensor networks.
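
    The abstract above forecasts the target trajectory from past localizations (an ARMA model combined with an RBF network). The sketch below shows only the forecasting idea with a plain least-squares AR(2) predictor on a made-up track; the RBFN part and the ant-colony routing are not reproduced.

```python
# Sketch of the trajectory-forecasting idea only (a plain least-squares AR(2)
# predictor; the paper combines an ARMA model with an RBF network, which is
# not reproduced here). Forecasts the next target position from past fixes.
import numpy as np

def ar2_forecast(series):
    """Fit x[t] = a*x[t-1] + b*x[t-2] + c by least squares, predict x[t+1]."""
    x = np.asarray(series, dtype=float)
    A = np.column_stack([x[1:-1], x[:-2], np.ones(x.size - 2)])
    coef, *_ = np.linalg.lstsq(A, x[2:], rcond=None)
    return coef @ np.array([x[-1], x[-2], 1.0])

# Hypothetical track of x-positions from fused localizations.
track_x = [0.0, 0.9, 2.1, 2.8, 4.1, 5.0, 5.9]
print(ar2_forecast(track_x))  # forecast used to wake sensors near the target
```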

  1. Data fusion and sensor management for nuclear power plant safety

    Energy Technology Data Exchange (ETDEWEB)

    Ciftcioglu, O [Istanbul Technical Univ., Istanbul (Turkey). Nuclear Power Dept.; Turkcan, E [Netherlands Energy Research Foundation (ECN), Petten (Netherlands)

    1997-12-31

    The paper describes the implementation of the data-sensor fusion and sensor management technology for accident management through simulated severe accident (SA) scenarios subjected to study. The organization of the present paper is as follows. As data-sensor fusion and sensor management is an emerging technology which is not widely known, in Sec. 2 the definition and goals of data-sensor fusion and sensor management technology are described. In Sec. 3 first, with reference to Kalman filtering as an information filter, statistical data-sensor fusion technology is described. This is followed by deterministic data-sensor fusion technology using gross plant state variables and neural networks (NN) and the implementation for severe accident management in NPPs. In Sec. 4, the sensor management technology is described. Finally, the performance of the data-sensor fusion technology for NPP safety is discussed. 12 refs, 6 figs.

  2. Data fusion and sensor management for nuclear power plant safety

    International Nuclear Information System (INIS)

    Ciftcioglu, O.

    1996-01-01

    The paper describes the implementation of the data-sensor fusion and sensor management technology for accident management through simulated severe accident (SA) scenarios subjected to study. The organization of the present paper is as follows. As data-sensor fusion and sensor management is an emerging technology which is not widely known, in Sec. 2 the definition and goals of data-sensor fusion and sensor management technology are described. In Sec. 3 first, with reference to Kalman filtering as an information filter, statistical data-sensor fusion technology is described. This is followed by deterministic data-sensor fusion technology using gross plant state variables and neural networks (NN) and the implementation for severe accident management in NPPs. In Sec. 4, the sensor management technology is described. Finally, the performance of the data-sensor fusion technology for NPP safety is discussed. 12 refs, 6 figs.

  3. Reliability of measured data for pH sensor arrays with fault diagnosis and data fusion based on LabVIEW.

    Science.gov (United States)

    Liao, Yi-Hung; Chou, Jung-Chuan; Lin, Chin-Yi

    2013-12-13

    Fault diagnosis (FD) and data fusion (DF) technologies implemented in the LabVIEW program were used for a ruthenium dioxide pH sensor array. The purpose of the fault diagnosis and data fusion technologies is to increase the reliability of measured data. Data fusion is a very useful statistical method used for sensor arrays in many fields. Fault diagnosis is used to avoid sensor faults and measurement errors in the electrochemical measurement system; therefore, in this study, we use fault diagnosis to remove any faulty sensors in advance, and then proceed with data fusion in the sensor array. The average, self-adaptive and coefficient-of-variance data fusion methods are used in this study. The pH electrode is fabricated with a ruthenium dioxide (RuO2) sensing membrane using a sputtering system to deposit it onto a silicon substrate, and eight RuO2 pH electrodes are fabricated to form a sensor array for this study.

  4. Reliability of Measured Data for pH Sensor Arrays with Fault Diagnosis and Data Fusion Based on LabVIEW

    Directory of Open Access Journals (Sweden)

    Yi-Hung Liao

    2013-12-01

    Fault diagnosis (FD) and data fusion (DF) technologies implemented in the LabVIEW program were used for a ruthenium dioxide pH sensor array. The purpose of the fault diagnosis and data fusion technologies is to increase the reliability of measured data. Data fusion is a very useful statistical method used for sensor arrays in many fields. Fault diagnosis is used to avoid sensor faults and measurement errors in the electrochemical measurement system; therefore, in this study, we use fault diagnosis to remove any faulty sensors in advance, and then proceed with data fusion in the sensor array. The average, self-adaptive and coefficient-of-variance data fusion methods are used in this study. The pH electrode is fabricated with a ruthenium dioxide (RuO2) sensing membrane using a sputtering system to deposit it onto a silicon substrate, and eight RuO2 pH electrodes are fabricated to form a sensor array for this study.
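
    The two records above fuse readings from an eight-electrode pH array after removing faulty sensors. The sketch below illustrates one plausible reading of that pipeline: a median-distance fault check, a simple average, and an inverse-variance ("self-adaptive") weighting. The readings are invented and the weighting is not necessarily the papers' exact formulation.

```python
# Illustrative fusion of a pH sensor array: a simple average, an
# inverse-variance ("self-adaptive") weighting, and exclusion of a sensor
# flagged as faulty. All readings are hypothetical.
import numpy as np

readings = np.array([  # hypothetical repeated pH readings from 8 electrodes
    [7.01, 7.03, 6.99, 7.02],
    [7.00, 6.98, 7.01, 7.00],
    [6.97, 7.02, 7.00, 6.99],
    [7.05, 7.04, 7.06, 7.05],
    [7.02, 7.00, 7.01, 7.03],
    [6.99, 7.01, 7.00, 6.98],
    [7.03, 7.02, 7.04, 7.01],
    [5.10, 5.30, 5.20, 5.25],   # electrode with a fault (drifted membrane)
])

means = readings.mean(axis=1)
variances = readings.var(axis=1, ddof=1)

# Fault diagnosis step: drop electrodes far from the array median.
healthy = np.abs(means - np.median(means)) < 0.5

simple_avg = means[healthy].mean()
weights = 1.0 / variances[healthy]
self_adaptive = np.sum(weights * means[healthy]) / np.sum(weights)
print(simple_avg, self_adaptive)
```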

  5. Decentralized Sensor Fusion for Ubiquitous Networking Robotics in Urban Areas

    Directory of Open Access Journals (Sweden)

    Aníbal Ollero

    2010-03-01

    In this article we explain the architecture for the environment and sensors that has been built for the European project URUS (Ubiquitous Networking Robotics in Urban Sites), a project whose objective is to develop an adaptable network robot architecture for cooperation between network robots and human beings and/or the environment in urban areas. The project goal is to deploy a team of robots in an urban area to give a set of services to a user community. This paper addresses the sensor architecture devised for URUS and the type of robots and sensors used, including environment sensors and sensors onboard the robots. Furthermore, we also explain how sensor fusion takes place to achieve urban outdoor execution of robotic services. Finally some results of the project related to the sensor network are highlighted.

  6. Multivariate Sensitivity Analysis of Time-of-Flight Sensor Fusion

    Science.gov (United States)

    Schwarz, Sebastian; Sjöström, Mårten; Olsson, Roger

    2014-09-01

    Obtaining three-dimensional scenery data is an essential task in computer vision, with diverse applications in various areas such as manufacturing and quality control, security and surveillance, or user interaction and entertainment. Dedicated Time-of-Flight sensors can provide detailed scenery depth in real-time and overcome shortcomings of traditional stereo analysis. Nonetheless, they do not provide texture information and have limited spatial resolution. Therefore such sensors are typically combined with high resolution video sensors. Time-of-Flight Sensor Fusion is a highly active field of research. Over the recent years, there have been multiple proposals addressing important topics such as texture-guided depth upsampling and depth data denoising. In this article we take a step back and look at the underlying principles of ToF sensor fusion. We derive the ToF sensor fusion error model and evaluate its sensitivity to inaccuracies in camera calibration and depth measurements. In accordance with our findings, we propose certain courses of action to ensure high quality fusion results. With this multivariate sensitivity analysis of the ToF sensor fusion model, we provide an important guideline for designing, calibrating and running sophisticated Time-of-Flight sensor fusion capture systems.

  7. An adaptive Kalman filter approach for cardiorespiratory signal extraction and fusion of non-contacting sensors.

    Science.gov (United States)

    Foussier, Jerome; Teichmann, Daniel; Jia, Jing; Misgeld, Berno; Leonhardt, Steffen

    2014-05-09

    Extracting cardiorespiratory signals from non-invasive and non-contacting sensor arrangements, i.e. magnetic induction sensors, is a challenging task. The respiratory and cardiac signals are mixed on top of a large and time-varying offset and are likely to be disturbed by measurement noise. Basic filtering techniques fail to extract relevant information for monitoring purposes. We present a real-time filtering system based on an adaptive Kalman filter approach that separates signal offsets, respiratory and heart signals from three different sensor channels. It continuously estimates respiration and heart rates, which are fed back into the system model to enhance performance. Sensor and system noise covariance matrices are automatically adapted to the aimed application, thus improving the signal separation capabilities. We apply the filtering to two different subjects with different heart rates and sensor properties and compare the results to the non-adaptive version of the same Kalman filter. Also, the performance, depending on the initialization of the filters, is analyzed using three different configurations ranging from best to worst case. Extracted data are compared with reference heart rates derived from a standard pulse-photoplethysmographic sensor and respiration rates from a flowmeter. In the worst case for one of the subjects the adaptive filter obtains mean errors (standard deviations) of -0.2 min(-1) (0.3 min(-1)) and -0.7 bpm (1.7 bpm) (compared to -0.2 min(-1) (0.4 min(-1)) and 42.0 bpm (6.1 bpm) for the non-adaptive filter) for respiration and heart rate, respectively. In bad conditions the heart rate is only correctly measurable when the Kalman matrices are adapted to the target sensor signals. Also, the reduced mean error between the extracted offset and the raw sensor signal shows that adapting the Kalman filter continuously improves the ability to separate the desired signals from the raw sensor data. The average total computational time needed
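
    The abstract above adapts the Kalman noise covariances to the measured signals. The sketch below shows one common way to do this, re-estimating the measurement-noise variance from the recent innovation sequence for a scalar random-walk state; it does not reproduce the paper's offset/respiration/heart-rate model, and the test signal is synthetic.

```python
# Generic sketch of adapting the measurement-noise covariance from the
# innovation sequence (one common way to make a Kalman filter "adaptive");
# the paper's full offset/respiration/heart-rate model is not reproduced.
import numpy as np

def adaptive_kalman_1d(z, q=1e-4, r0=1.0, window=30):
    """Scalar random-walk state; R is re-estimated from recent innovations."""
    x, p, r = z[0], 1.0, r0
    innovations, out = [], []
    for meas in z:
        p += q                       # predict
        nu = meas - x                # innovation
        innovations.append(nu)
        if len(innovations) >= window:
            # Sample innovation covariance minus predicted state covariance.
            c = np.var(innovations[-window:])
            r = max(c - p, 1e-6)
        k = p / (p + r)
        x += k * nu                  # update
        p *= (1.0 - k)
        out.append(x)
    return np.array(out)

# Hypothetical raw channel: slow offset drift plus noise whose level changes.
n = 600
noise = np.r_[0.05 * np.random.randn(300), 0.5 * np.random.randn(300)]
signal = np.linspace(0.0, 2.0, n) + noise
print(adaptive_kalman_1d(signal)[-5:])
```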

  8. Context-Aided Sensor Fusion for Enhanced Urban Navigation

    Directory of Open Access Journals (Sweden)

    Enrique David Martí

    2012-12-01

    The deployment of Intelligent Vehicles in urban environments requires reliable estimation of positioning for urban navigation. The inherent complexity of this kind of environments fosters the development of novel systems which should provide reliable and precise solutions to the vehicle. This article details an advanced GNSS/IMU fusion system based on a context-aided Unscented Kalman filter for navigation in urban conditions. The constrained non-linear filter is here conditioned by a contextual knowledge module which reasons about sensor quality and driving context in order to adapt it to the situation, while at the same time it carries out a continuous estimation and correction of INS drift errors. An exhaustive analysis has been carried out with available data in order to characterize the behavior of available sensors and take it into account in the developed solution. The performance is then analyzed with an extensive dataset containing representative situations. The proposed solution suits the use of fusion algorithms for deploying Intelligent Transport Systems in urban environments.

  9. Data fusion and sensor management for nuclear power plant safety

    International Nuclear Information System (INIS)

    Ciftcioglu, Oe.

    1996-05-01

    The paper describes the implementation of the data-sensor fusion and sensor management technology for accident management through simulated severe accident (SA) scenarios subjected to study. By means of accident management, the appropriate prompt actions to be taken to avoid nuclear accidents are meant, while such accidents are deemed to somehow be imminent during plant operation. The organisation of the present paper is as follows. As data-sensor fusion and sensor management is an emerging technology which is not widely known, in Sec. 2 the definition and goals of data-sensor fusion and sensor management technology are described. In Sec. 3 first, with reference to Kalman filtering as an information filter, statistical data-sensor fusion technology is described. This is followed by examples of deterministic data-sensor fusion technology using gross plant state variables and neural networks (NN) and the implementation for severe accident management in NPPs. In Sec. 4, the sensor management technology is described. Finally, the performance of the data-sensor fusion technology for NPP safety is discussed. (orig./WL)

  10. Fuzzy-Based Sensor Fusion for Cognitive Radio-Based Vehicular Ad Hoc and Sensor Networks

    Directory of Open Access Journals (Sweden)

    Mohammad Jalil Piran

    2015-01-01

    In wireless sensor networks, sensor fusion is employed to integrate the acquired data from diverse sensors to provide a unified interpretation. The best and most salient advantage of sensor fusion is to obtain high-level information in both statistical and definitive aspects, which cannot be attained by a single sensor. In this paper, we propose a novel sensor fusion technique based on fuzzy theory for our earlier proposed Cognitive Radio-based Vehicular Ad Hoc and Sensor Networks (CR-VASNET). In the proposed technique, we considered four input sensor readings (antecedents) and one output (consequent). The mobile nodes employed in CR-VASNET are supposed to be equipped with diverse sensors, which cater to our antecedent variables, for example, Jerk, Collision Intensity, Temperature and Inclination Degree. Crash_Severity is considered as the consequent variable. The processing and fusion of the diverse sensory signals are carried out using fuzzy logic. The accuracy and reliability of the proposed protocol, demonstrated by the simulation results, introduce it as an applicable system to be employed to reduce the casualty rate of vehicle crashes.

  11. Advances in multi-sensor data fusion: algorithms and applications.

    Science.gov (United States)

    Dong, Jiang; Zhuang, Dafang; Huang, Yaohuan; Fu, Jingying

    2009-01-01

    With the development of satellite and remote sensing techniques, more and more image data from airborne/satellite sensors have become available. Multi-sensor image fusion seeks to combine information from different images to obtain more inferences than can be derived from a single sensor. In image-based application fields, image fusion has emerged as a promising research area since the end of the last century. The paper presents an overview of recent advances in multi-sensor satellite image fusion. Firstly, the most popular existing fusion algorithms are introduced, with emphasis on their recent improvements. Advances in the main application fields in remote sensing, including object identification, classification, change detection and maneuvering target tracking, are described. Both advantages and limitations of those applications are then discussed. Recommendations are addressed, including: (1) improvements of fusion algorithms; (2) development of "algorithm fusion" methods; (3) establishment of an automatic quality assessment scheme.

  12. Fusion of Images from Dissimilar Sensor Systems

    National Research Council Canada - National Science Library

    Chow, Khin

    2004-01-01

    Different sensors exploit different regions of the electromagnetic spectrum; therefore a multi-sensor image fusion system can take full advantage of the complementary capabilities of individual sensors in the suit...

  13. Distributed data fusion across multiple hard and soft mobile sensor platforms

    Science.gov (United States)

    Sinsley, Gregory

    One of the biggest challenges currently facing the robotics field is sensor data fusion. Unmanned robots carry many sophisticated sensors including visual and infrared cameras, radar, laser range finders, chemical sensors, accelerometers, gyros, and global positioning systems. By effectively fusing the data from these sensors, a robot would be able to form a coherent view of its world that could then be used to facilitate both autonomous and intelligent operation. Another distinct fusion problem is that of fusing data from teammates with data from onboard sensors. If an entire team of vehicles has the same worldview they will be able to cooperate much more effectively. Sharing worldviews is made even more difficult if the teammates have different sensor types. The final fusion challenge the robotics field faces is that of fusing data gathered by robots with data gathered by human teammates (soft sensors). Humans sense the world completely differently from robots, which makes this problem particularly difficult. The advantage of fusing data from humans is that it makes more information available to the entire team, thus helping each agent to make the best possible decisions. This thesis presents a system for fusing data from multiple unmanned aerial vehicles, unmanned ground vehicles, and human observers. The first issue this thesis addresses is that of centralized data fusion. This is a foundational data fusion issue, which has been very well studied. Important issues in centralized fusion include data association, classification, tracking, and robotics problems. Because these problems are so well studied, this thesis does not make any major contributions in this area, but does review it for completeness. The chapter on centralized fusion concludes with an example unmanned aerial vehicle surveillance problem that demonstrates many of the traditional fusion methods. The second problem this thesis addresses is that of distributed data fusion. Distributed data fusion

  14. Performance evaluation of multi-sensor data fusion technique for ...

    Indian Academy of Sciences (India)

    Multi-sensor data fusion; Test Range application; trajectory ... The Kalman filtering technique utilizes the noise statistics of the underlying system ... Hall D L 1992 Mathematical techniques in multi-sensor data fusion (Boston, MA) ...

  15. Multi-sensor image fusion and its applications

    CERN Document Server

    Blum, Rick S

    2005-01-01

    Taking another lesson from nature, the latest advances in image processing technology seek to combine image data from several diverse types of sensors in order to obtain a more accurate view of the scene: very much the same as we rely on our five senses. Multi-Sensor Image Fusion and Its Applications is the first text dedicated to the theory and practice of the registration and fusion of image data, covering such approaches as statistical methods, color-related techniques, model-based methods, and visual information display strategies.After a review of state-of-the-art image fusion techniques,

  16. Tracking and sensor data fusion methodological framework and selected applications

    CERN Document Server

    Koch, Wolfgang

    2013-01-01

    Sensor Data Fusion is the process of combining incomplete and imperfect pieces of mutually complementary sensor information in such a way that a better understanding of an underlying real-world phenomenon is achieved. Typically, this insight is either unobtainable otherwise or a fusion result exceeds what can be produced from a single sensor output in accuracy, reliability, or cost. This book provides an introduction to Sensor Data Fusion, as an information technology as well as a branch of engineering science and informatics. Part I presents a coherent methodological framework, thus providing the ...

  17. An adaptive secret key-directed cryptographic scheme for secure transmission in wireless sensor networks

    International Nuclear Information System (INIS)

    Muhammad, K.; Jan, Z.; Khan, Z

    2015-01-01

    Wireless Sensor Networks (WSNs) are memory and bandwidth limited networks whose main goals are to maximize the network lifetime and minimize the energy consumption and transmission cost. To achieve these goals, different techniques of compression and clustering have been used. However, security is an open and major issue in WSNs for which different approaches are used, both in centralized and distributed WSNs' environments. This paper presents an adaptive cryptographic scheme for secure transmission of various sensitive parameters, sensed by wireless sensors, to the fusion center for further processing in WSNs such as military networks. The proposed method encrypts the sensitive captured data of sensor nodes using various encryption procedures (bitxor operation, bit shuffling, and secret-key-based encryption) and then sends it to the fusion center. At the fusion center, the received encrypted data is decrypted for taking further necessary actions. The experimental results, with complexity analysis, validate the effectiveness and feasibility of the proposed method in terms of security in WSNs. (author)
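
    The abstract above mentions three lightweight steps (a bitxor operation, bit shuffling, and secret-key-based encryption). The toy sketch below illustrates the XOR-plus-shuffle idea only; it is not the paper's scheme, is not a secure construction for real deployments, and the key and payload are made up.

```python
# Toy illustration of the XOR-with-a-key-stream and position-shuffling steps
# the abstract mentions; this is a sketch of the idea, not the paper's exact
# scheme, and it should not be used where real security is required.
import hashlib
import random

def xor_shuffle_encrypt(data: bytes, key: bytes) -> bytes:
    stream = hashlib.sha256(key).digest()
    xored = bytes(b ^ stream[i % len(stream)] for i, b in enumerate(data))
    order = list(range(len(xored)))
    random.Random(key).shuffle(order)   # key-derived permutation of positions
    return bytes(xored[i] for i in order)

def xor_shuffle_decrypt(cipher: bytes, key: bytes) -> bytes:
    order = list(range(len(cipher)))
    random.Random(key).shuffle(order)   # rebuild the same permutation
    unshuffled = bytearray(len(cipher))
    for dst, src in enumerate(order):
        unshuffled[src] = cipher[dst]
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(unshuffled))

reading = b"temp=36.7;lat=34.05;lon=-118.24"   # hypothetical sensed values
key = b"shared-secret"
assert xor_shuffle_decrypt(xor_shuffle_encrypt(reading, key), key) == reading
print("round trip ok")
```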

  18. Sensor data fusion to predict multiple soil properties

    NARCIS (Netherlands)

    Mahmood, H.S.; Hoogmoed, W.B.; Henten, van E.J.

    2012-01-01

    The accuracy of a single sensor is often low because all proximal soil sensors respond to more than one soil property of interest. Sensor data fusion can potentially overcome this inability of a single sensor and can best extract useful and complementary information from multiple sensors or sources.

  19. Distributed Sensor Fusion for Scalar Field Mapping Using Mobile Sensor Networks.

    Science.gov (United States)

    La, Hung Manh; Sheng, Weihua

    2013-04-01

    In this paper, autonomous mobile sensor networks are deployed to measure a scalar field and build its map. We develop a novel method for multiple mobile sensor nodes to build this map using noisy sensor measurements. Our method consists of two parts. First, we develop a distributed sensor fusion algorithm by integrating two different distributed consensus filters to achieve cooperative sensing among sensor nodes. This fusion algorithm has two phases. In the first phase, the weighted average consensus filter is developed, which allows each sensor node to find an estimate of the value of the scalar field at each time step. In the second phase, the average consensus filter is used to allow each sensor node to find a confidence of the estimate at each time step. The final estimate of the value of the scalar field is iteratively updated during the movement of the mobile sensors via weighted average. Second, we develop the distributed flocking-control algorithm to drive the mobile sensors to form a network and track the virtual leader moving along the field when only a small subset of the mobile sensors know the information of the leader. Experimental results are provided to demonstrate our proposed algorithms.
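
    The abstract above builds the field map with a weighted-average consensus among neighbouring nodes. The sketch below shows the bare consensus-averaging iteration on a hypothetical five-node ring with a made-up doubly stochastic weight matrix; the confidence phase and the flocking control described in the paper are not reproduced.

```python
# Minimal sketch of the average-consensus idea: neighbouring nodes repeatedly
# average their local estimates of the field value so that all nodes converge
# towards a common estimate. The network and weights are made up.
import numpy as np

# Hypothetical ring network of 5 nodes; symmetric, doubly stochastic weights.
W = np.array([
    [0.5, 0.25, 0.0, 0.0, 0.25],
    [0.25, 0.5, 0.25, 0.0, 0.0],
    [0.0, 0.25, 0.5, 0.25, 0.0],
    [0.0, 0.0, 0.25, 0.5, 0.25],
    [0.25, 0.0, 0.0, 0.25, 0.5],
])

x = np.array([2.1, 1.8, 2.4, 2.0, 1.7])  # noisy local measurements of the field
for _ in range(50):                       # consensus iterations
    x = W @ x
print(x)  # all entries approach the average of the initial measurements
```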

  20. Relative Vessel Motion Tracking using Sensor Fusion, Aruco Markers, and MRU Sensors

    Directory of Open Access Journals (Sweden)

    Sondre Sanden Tordal

    2017-04-01

    This paper presents a novel approach for estimating the relative motion between two moving offshore vessels. The method is based on a sensor fusion algorithm including a vision system and two motion reference units (MRUs). The vision system makes use of the open-source computer vision library OpenCV and a cube with Aruco markers placed onto each of the cube sides. The Extended Quaternion Kalman Filter (EQKF) is used for bad pose rejection for the vision system. The presented sensor fusion algorithm is based on the Indirect Feedforward Kalman Filter for error estimation. The system is self-calibrating in the sense that the Aruco cube can be placed in an arbitrary location on the secondary vessel. Experimental 6-DOF results demonstrate the accuracy and efficiency of the proposed sensor fusion method compared with the internal joint sensors of two Stewart platforms and an industrial robot. The standard deviation error was found to be 31 mm or better when the Aruco cube was placed at three different locations.

  1. Advances in Multi-Sensor Information Fusion: Theory and Applications 2017.

    Science.gov (United States)

    Jin, Xue-Bo; Sun, Shuli; Wei, Hong; Yang, Feng-Bao

    2018-04-11

    The information fusion technique can integrate a large amount of data and knowledge representing the same real-world object and obtain a consistent, accurate, and useful representation of that object. The data may be independent or redundant, and can be obtained by different sensors at the same time or at different times. A suitable combination of investigative methods can substantially increase the profit of information in comparison with that from a single sensor. Multi-sensor information fusion has been a key issue in sensor research since the 1970s, and it has been applied in many fields. For example, manufacturing and process control industries can generate a lot of data, which have real, actionable business value. The fusion of these data can greatly improve productivity through digitization. The goal of this special issue is to report innovative ideas and solutions for multi-sensor information fusion in the emerging applications era, focusing on development, adoption, and applications.

  2. Advances in Multi-Sensor Information Fusion: Theory and Applications 2017

    Directory of Open Access Journals (Sweden)

    Xue-Bo Jin

    2018-04-01

    The information fusion technique can integrate a large amount of data and knowledge representing the same real-world object and obtain a consistent, accurate, and useful representation of that object. The data may be independent or redundant, and can be obtained by different sensors at the same time or at different times. A suitable combination of investigative methods can substantially increase the profit of information in comparison with that from a single sensor. Multi-sensor information fusion has been a key issue in sensor research since the 1970s, and it has been applied in many fields. For example, manufacturing and process control industries can generate a lot of data, which have real, actionable business value. The fusion of these data can greatly improve productivity through digitization. The goal of this special issue is to report innovative ideas and solutions for multi-sensor information fusion in the emerging applications era, focusing on development, adoption, and applications.

  3. Towards an operational sensor-fusion system for anti-personnel landmine detection

    NARCIS (Netherlands)

    Cremer, F.; Schutte, K.; Schavemaker, J.G.M.; Breejen, E. den

    2000-01-01

    To acquire detection performance required for an operational system for the detection of anti-personnel landmines, it is necessary to use multiple sensors and sensor-fusion techniques. This paper describes five decision-level sensor-fusion techniques and their common optimisation method. The

  4. Sensor Fusion for Autonomous Mobile Robot Navigation

    DEFF Research Database (Denmark)

    Plascencia, Alfredo

    Multi-sensor data fusion is a broad area of constant research which is applied to a wide variety of fields, such as the field of mobile robots. Mobile robots are complex systems where the design and implementation of sensor fusion is a complex task, but research applications are explored constantly ... The scope of the thesis is limited to building a map for a laboratory robot by fusing range readings from a sonar array with landmarks extracted from stereo vision images using the Scale Invariant Feature Transform (SIFT) algorithm.

  5. Data fusion for target tracking and classification with wireless sensor network

    Science.gov (United States)

    Pannetier, Benjamin; Doumerc, Robin; Moras, Julien; Dezert, Jean; Canevet, Loic

    2016-10-01

    In this paper, we address the problem of multiple ground target tracking and classification with information obtained from an unattended wireless sensor network. A multiple target tracking (MTT) algorithm, taking into account road and vegetation information, is proposed based on a centralized architecture. One of the key issues is how to adapt the classical MTT approach to satisfy embedded processing. Based on track statistics, the classification algorithm uses estimated location, velocity and acceleration to help classify targets. The algorithm enables tracking of humans and vehicles driving both on and off road. We integrate road or trail width and vegetation cover as constraints in the target motion models to improve tracking performance under constraint, with classification fusion. Our algorithm also includes different dynamic models to cope with target maneuvers. The tracking and classification algorithms are integrated into an operational platform (the fusion node). In order to handle realistic ground target tracking scenarios, we use an autonomous smart computer deployed in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of the system is evaluated in a real exercise for an intelligence operation ("hunter hunt" scenario).

  6. Neuromorphic Audio-Visual Sensor Fusion on a Sound-Localising Robot

    Directory of Open Access Journals (Sweden)

    Vincent Yue-Sek Chan

    2012-02-01

    This paper presents the first robotic system featuring audio-visual sensor fusion with neuromorphic sensors. We combine a pair of silicon cochleae and a silicon retina on a robotic platform to allow the robot to learn sound localisation through self-motion and visual feedback, using an adaptive ITD-based sound localisation algorithm. After training, the robot can localise sound sources (white or pink noise) in a reverberant environment with an RMS error of 4 to 5 degrees in azimuth. In the second part of the paper, we investigate the source binding problem. An experiment is conducted to test the effectiveness of matching an audio event with a corresponding visual event based on their onset time. The results show that this technique can be quite effective, despite its simplicity.

  7. Multi-Level Sensor Fusion Algorithm Approach for BMD Interceptor Applications

    National Research Council Canada - National Science Library

    Allen, Doug

    1998-01-01

    ... through fabrication and testing of advanced sensor hardware concepts and advanced sensor fusion algorithms. Advanced sensor concepts include onboard LADAR in conjunction with a multi-color passive IR sensor...

  8. Application of a sensor fusion algorithm for improving grasping stability

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Hyeon; Yoon, Hyun Suck; Moon, Hyung Pil; Choi, Hyouk Ryeol; Koo Ja Choon [Sungkyunkwan University, Suwon (Korea, Republic of)

    2015-07-15

    A robot hand normally employs various sensors that are packaged in a small form factor, must perform with delicate accuracy, and are mostly very expensive. The grasping operation of the hand relies especially on the accuracy of those sensors. Even with a set of advanced sensory systems embedded in a robot hand, securing a stable grasp is still a challenging task. The present work attempts to improve force sensor accuracy by applying a sensor fusion method. An optimal-weight sensor fusion method formulated with Kalman filters is presented and tested. Using a set of inexpensive sensors, the work achieves reliable force sensing and applies the enhanced sensor stability to an object pinch grasp.

  9. Application of a sensor fusion algorithm for improving grasping stability

    International Nuclear Information System (INIS)

    Kim, Jae Hyeon; Yoon, Hyun Suck; Moon, Hyung Pil; Choi, Hyouk Ryeol; Koo Ja Choon

    2015-01-01

    A robot hand normally employs various sensors that are packaged in a small form factor, must perform with delicate accuracy, and are mostly very expensive. The grasping operation of the hand relies especially on the accuracy of those sensors. Even with a set of advanced sensory systems embedded in a robot hand, securing a stable grasp is still a challenging task. The present work attempts to improve force sensor accuracy by applying a sensor fusion method. An optimal-weight sensor fusion method formulated with Kalman filters is presented and tested. Using a set of inexpensive sensors, the work achieves reliable force sensing and applies the enhanced sensor stability to an object pinch grasp.

  10. Adaptive Energy-Efficient Target Detection Based on Mobile Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Tengyue Zou

    2017-05-01

    Full Text Available Target detection is a widely used application for area surveillance, elder care, and fire alarms; its purpose is to find a particular object or event in a region of interest. Usually, fixed observing stations or static sensor nodes are arranged uniformly in the field. However, each part of the field has a different probability of being intruded upon; if an object suddenly enters an area with few guardian devices, a loss of detection will occur, and the stations in the safe areas will waste their energy for a long time without any discovery. Thus, mobile wireless sensor networks may benefit from adaptation and pertinence in detection. Sensor nodes equipped with wheels are able to move towards the risk area via an adaptive learning procedure based on Bayesian networks. Furthermore, a clustering algorithm based on k-means++ and an energy control mechanism are used to reduce the energy consumption of nodes. The extended Kalman filter and a voting data fusion method are employed to raise the localization accuracy of the target. The simulation and experimental results indicate that this new system with adaptive energy-efficient methods is able to achieve better performance than the traditional ones.

  11. Adaptive Energy-Efficient Target Detection Based on Mobile Wireless Sensor Networks.

    Science.gov (United States)

    Zou, Tengyue; Li, Zhenjia; Li, Shuyuan; Lin, Shouying

    2017-05-04

    Target detection is a widely used application for area surveillance, elder care, and fire alarms; its purpose is to find a particular object or event in a region of interest. Usually, fixed observing stations or static sensor nodes are arranged uniformly in the field. However, each part of the field has a different probability of being intruded upon; if an object suddenly enters an area with few guardian devices, a loss of detection will occur, and the stations in the safe areas will waste their energy for a long time without any discovery. Thus, mobile wireless sensor networks may benefit from adaptation and pertinence in detection. Sensor nodes equipped with wheels are able to move towards the risk area via an adaptive learning procedure based on Bayesian networks. Furthermore, a clustering algorithm based on k-means++ and an energy control mechanism are used to reduce the energy consumption of nodes. The extended Kalman filter and a voting data fusion method are employed to raise the localization accuracy of the target. The simulation and experimental results indicate that this new system with adaptive energy-efficient methods is able to achieve better performance than the traditional ones.

  12. Fusion of Radar and EO-sensors for Surveillance

    NARCIS (Netherlands)

    Kester, L.J.H.M.; Theil, A.

    2001-01-01

    Fusion of radar and EO-sensors is investigated for the purpose of surveillance in littoral waters. All sensors are considered to be co-located with respect to the distance, typically 1 to 10 km, of the area under surveillance. The sensor suite is a coherent polarimetric radar in combination with

  13. Integrative Multi-Spectral Sensor Device for Far-Infrared and Visible Light Fusion

    Science.gov (United States)

    Qiao, Tiezhu; Chen, Lulu; Pang, Yusong; Yan, Gaowei

    2018-06-01

    Infrared and visible light image fusion technology has been a hot spot in multi-sensor fusion research in recent years. Existing infrared and visible light fusion technologies need to register the images before fusion because two cameras are used; however, the performance of such registration technology has yet to be improved. Hence, a novel integrative multi-spectral sensor device is proposed for infrared and visible light fusion: using a beam splitter prism, the coaxial light incident from the same lens is projected onto the infrared charge coupled device (CCD) and the visible light CCD, respectively. In this paper, the imaging mechanism of the proposed sensor device is studied along with the process of signal acquisition and fusion. A simulation experiment, which covers the entire process of the optical system, signal acquisition, and signal fusion, is constructed based on an imaging effect model. Additionally, a quality evaluation index is adopted to analyze the simulation result. The experimental results demonstrate that the proposed sensor device is effective and feasible.

  14. An alternative sensor fusion method for object orientation using low-cost MEMS inertial sensors

    Science.gov (United States)

    Bouffard, Joshua L.

    This thesis develops an alternative sensor fusion approach for object orientation using low-cost MEMS inertial sensors. The alternative approach focuses on the unique challenges of small UAVs. Such challenges include vibration-induced noise on the accelerometer and bias offset errors of the rate gyroscope. To overcome these challenges, a sensor fusion algorithm combines the measured data from the accelerometer and rate gyroscope to achieve a single output free from vibrational noise and bias offset errors. One of the most prevalent sensor fusion algorithms used for orientation estimation is the Extended Kalman filter (EKF). The EKF performs the fusion process by first creating the process model using the nonlinear equations of motion and then establishing a measurement model. With the process and measurement models established, the filter operates by propagating the mean and covariance of the states through time. The success of the EKF relies on the ability to establish a representative process and measurement model of the system. In most applications, the EKF measurement model utilizes the accelerometer and GPS-derived accelerations to determine an estimate of the orientation. However, if the GPS-derived accelerations are not available, the measurement model becomes less reliable when subjected to harsh vibrational environments. This situation led to the alternative approach, which focuses on the correlation between the rate gyroscope and the accelerometer-derived angle. The correlation between the two sensors then determines how much the algorithm will use one sensor over the other. The result is a measurement that does not suffer from vibrational noise or from bias offset errors.
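
    The record describes blending a gyroscope-integrated angle with an accelerometer-derived angle so that bias drift and vibration noise offset each other. The sketch below shows a fixed-gain complementary filter, a simpler relative of the adaptive, correlation-weighted blend the thesis describes; the sensor samples, sample period dt and gain alpha are hypothetical.

      import math

      def accel_angle(ax, az):
          """Pitch angle (rad) derived from the gravity components measured by the accelerometer."""
          return math.atan2(ax, az)

      def complementary_update(angle, gyro_rate, ax, az, dt, alpha=0.98):
          """Blend the gyro-integrated angle (smooth but biased over time) with the
          accelerometer angle (noisy but unbiased) using a fixed gain alpha."""
          gyro_angle = angle + gyro_rate * dt                      # propagate with the rate gyroscope
          return alpha * gyro_angle + (1.0 - alpha) * accel_angle(ax, az)

      # Hypothetical samples (rate in rad/s, accelerations in m/s^2)
      angle = 0.0
      for gyro_rate, ax, az in [(0.01, 0.10, 9.80), (0.02, 0.20, 9.70), (0.00, 0.15, 9.75)]:
          angle = complementary_update(angle, gyro_rate, ax, az, dt=0.01)
      print(angle)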

  15. Non-verbal communication through sensor fusion

    Science.gov (United States)

    Tairych, Andreas; Xu, Daniel; O'Brien, Benjamin M.; Anderson, Iain A.

    2016-04-01

    When we communicate face to face, we subconsciously engage our whole body to convey our message. In telecommunication, e.g. during phone calls, this powerful information channel cannot be used. Capturing nonverbal information from body motion and transmitting it to the receiver parallel to speech would make these conversations feel much more natural. This requires a sensing device that is capable of capturing different types of movements, such as the flexion and extension of joints, and the rotation of limbs. In a first embodiment, we developed a sensing glove that is used to control a computer game. Capacitive dielectric elastomer (DE) sensors measure finger positions, and an inertial measurement unit (IMU) detects hand roll. These two sensor technologies complement each other, with the IMU allowing the player to move an avatar through a three-dimensional maze, and the DE sensors detecting finger flexion to fire weapons or open doors. After demonstrating the potential of sensor fusion in human-computer interaction, we take this concept to the next level and apply it in nonverbal communication between humans. The current fingerspelling glove prototype uses capacitive DE sensors to detect finger gestures performed by the sending person. These gestures are mapped to corresponding messages and transmitted wirelessly to another person. A concept for integrating an IMU into this system is presented. The fusion of the DE sensor and the IMU combines the strengths of both sensor types, and therefore enables very comprehensive body motion sensing, which makes a large repertoire of gestures available to nonverbal communication over distances.

  16. Motion Artifact Quantification and Sensor Fusion for Unobtrusive Health Monitoring.

    Science.gov (United States)

    Hoog Antink, Christoph; Schulz, Florian; Leonhardt, Steffen; Walter, Marian

    2017-12-25

    Sensors integrated into objects of everyday life potentially allow unobtrusive health monitoring at home. However, since the coupling of sensors and subject is not as well-defined as in a clinical setting, the signal quality is much more variable and can be disturbed significantly by motion artifacts. One way of tackling this challenge is the combined evaluation of multiple channels via sensor fusion. For robust and accurate sensor fusion, analyzing the influence of motion on different modalities is crucial. In this work, a multimodal sensor setup integrated into an armchair is presented that combines capacitively coupled electrocardiography, reflective photoplethysmography, two high-frequency impedance sensors and two types of ballistocardiography sensors. To quantify motion artifacts, a motion protocol performed by healthy volunteers is recorded with a motion capture system, and reference sensors perform cardiorespiratory monitoring. The shape-based signal-to-noise ratio SNR_S is introduced and used to quantify the effect of motion on different sensing modalities. Based on this analysis, an optimal combination of sensors and fusion methodology is developed and evaluated. Using the proposed approach, beat-to-beat heart rate is estimated with a coverage of 99.5% and a mean absolute error of 7.9 ms on 425 min of data from seven volunteers in a proof-of-concept measurement scenario.

  17. A hierarchical structure approach to MultiSensor Information Fusion

    Energy Technology Data Exchange (ETDEWEB)

    Maren, A.J. (Tennessee Univ., Tullahoma, TN (United States). Space Inst.); Pap, R.M.; Harston, C.T. (Accurate Automation Corp., Chattanooga, TN (United States))

    1989-01-01

    A major problem with image-based MultiSensor Information Fusion (MSIF) is establishing the level of processing at which information should be fused. Current methodologies, whether based on fusion at the pixel, segment/feature, or symbolic levels, are each inadequate for robust MSIF. Pixel-level fusion has problems with coregistration of the images or data. Attempts to fuse information using the features of segmented images or data rely on a presumed similarity between the segmentation characteristics of each image or data stream. Symbolic-level fusion requires too much advance processing to be useful, as we have seen in automatic target recognition tasks. Image-based MSIF systems need to operate in real-time, must perform fusion using a variety of sensor types, and should be effective across a wide range of operating conditions or deployment environments. We address this problem through developing a new representation level which facilitates matching and information fusion. The Hierarchical Scene Structure (HSS) representation, created using a multilayer, cooperative/competitive neural network, meets this need. The HSS is intermediate between a pixel-based representation and a scene interpretation representation, and represents the perceptual organization of an image. Fused HSSs will incorporate information from multiple sensors. Their knowledge-rich structure aids top-down scene interpretation via both model matching and knowledge-based region interpretation.

  18. A hierarchical structure approach to MultiSensor Information Fusion

    Energy Technology Data Exchange (ETDEWEB)

    Maren, A.J. [Tennessee Univ., Tullahoma, TN (United States). Space Inst.; Pap, R.M.; Harston, C.T. [Accurate Automation Corp., Chattanooga, TN (United States)

    1989-12-31

    A major problem with image-based MultiSensor Information Fusion (MSIF) is establishing the level of processing at which information should be fused. Current methodologies, whether based on fusion at the pixel, segment/feature, or symbolic levels, are each inadequate for robust MSIF. Pixel-level fusion has problems with coregistration of the images or data. Attempts to fuse information using the features of segmented images or data rely on a presumed similarity between the segmentation characteristics of each image or data stream. Symbolic-level fusion requires too much advance processing to be useful, as we have seen in automatic target recognition tasks. Image-based MSIF systems need to operate in real-time, must perform fusion using a variety of sensor types, and should be effective across a wide range of operating conditions or deployment environments. We address this problem through developing a new representation level which facilitates matching and information fusion. The Hierarchical Scene Structure (HSS) representation, created using a multilayer, cooperative/competitive neural network, meets this need. The HSS is intermediate between a pixel-based representation and a scene interpretation representation, and represents the perceptual organization of an image. Fused HSSs will incorporate information from multiple sensors. Their knowledge-rich structure aids top-down scene interpretation via both model matching and knowledge-based region interpretation.

  19. Sensor fusion in smart camera networks for ambient Intelligence

    NARCIS (Netherlands)

    Maatta, T.T.

    2013-01-01

    This short report introduces the topics of PhD research that was conducted in 2008-2013 and defended in July 2013. The PhD thesis covers sensor fusion theory, gathers it into a framework with design rules for fusion-friendly design of vision networks, and elaborates on the rules through fusion

  20. Motion Artifact Quantification and Sensor Fusion for Unobtrusive Health Monitoring

    Science.gov (United States)

    Hoog Antink, Christoph; Schulz, Florian; Walter, Marian

    2017-01-01

    Sensors integrated into objects of everyday life potentially allow unobtrusive health monitoring at home. However, since the coupling of sensors and subject is not as well-defined as in a clinical setting, the signal quality is much more variable and can be disturbed significantly by motion artifacts. One way of tackling this challenge is the combined evaluation of multiple channels via sensor fusion. For robust and accurate sensor fusion, analyzing the influence of motion on different modalities is crucial. In this work, a multimodal sensor setup integrated into an armchair is presented that combines capacitively coupled electrocardiography, reflective photoplethysmography, two high-frequency impedance sensors and two types of ballistocardiography sensors. To quantify motion artifacts, a motion protocol performed by healthy volunteers is recorded with a motion capture system, and reference sensors perform cardiorespiratory monitoring. The shape-based signal-to-noise ratio SNR_S is introduced and used to quantify the effect of motion on different sensing modalities. Based on this analysis, an optimal combination of sensors and fusion methodology is developed and evaluated. Using the proposed approach, beat-to-beat heart rate is estimated with a coverage of 99.5% and a mean absolute error of 7.9 ms on 425 min of data from seven volunteers in a proof-of-concept measurement scenario. PMID:29295594

  1. Motion Artifact Quantification and Sensor Fusion for Unobtrusive Health Monitoring

    Directory of Open Access Journals (Sweden)

    Christoph Hoog Antink

    2017-12-01

    Full Text Available Sensors integrated into objects of everyday life potentially allow unobtrusive health monitoring at home. However, since the coupling of sensors and subject is not as well-defined as in a clinical setting, the signal quality is much more variable and can be disturbed significantly by motion artifacts. One way of tackling this challenge is the combined evaluation of multiple channels via sensor fusion. For robust and accurate sensor fusion, analyzing the influence of motion on different modalities is crucial. In this work, a multimodal sensor setup integrated into an armchair is presented that combines capacitively coupled electrocardiography, reflective photoplethysmography, two high-frequency impedance sensors and two types of ballistocardiography sensors. To quantify motion artifacts, a motion protocol performed by healthy volunteers is recorded with a motion capture system, and reference sensors perform cardiorespiratory monitoring. The shape-based signal-to-noise ratio SNR_S is introduced and used to quantify the effect of motion on different sensing modalities. Based on this analysis, an optimal combination of sensors and fusion methodology is developed and evaluated. Using the proposed approach, beat-to-beat heart rate is estimated with a coverage of 99.5% and a mean absolute error of 7.9 ms on 425 min of data from seven volunteers in a proof-of-concept measurement scenario.

  2. Vehicle recognition and tracking using a generic multi-sensor and multi-algorithm fusion approach

    OpenAIRE

    Nashashibi, Fawzi; Khammari, Ayoub; Laurgeau, Claude

    2008-01-01

    This paper tackles the problem of improving the robustness of vehicle detection for Adaptive Cruise Control (ACC) applications. Our approach is based on multi-sensor and multi-algorithm data fusion for vehicle detection and recognition. Our architecture combines two sensors: a frontal camera and a laser scanner. The improvement of the robustness stems from two aspects. First, we addressed the vision-based detection by developing an original approach based on fine gr...

  3. Sensor Fusion Based Model for Collision Free Mobile Robot Navigation

    Science.gov (United States)

    Almasri, Marwah; Elleithy, Khaled; Alajlan, Abrar

    2015-01-01

    Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each of them is equipped with various types of sensors such as GPS, camera, infrared and ultrasonic sensors. These sensors are used to observe the surrounding environment. However, these sensors sometimes fail and produce inaccurate readings. Therefore, the integration of sensor fusion helps to solve this dilemma and enhance the overall performance. This paper presents collision-free mobile robot navigation based on a fuzzy logic fusion model. Eight distance sensors and a range finder camera are used for the collision avoidance approach, while three ground sensors are used for the line or path following approach. The fuzzy system is composed of nine inputs, which are the eight distance sensors and the camera, two outputs, which are the left and right velocities of the mobile robot’s wheels, and 24 fuzzy rules for the robot’s movement. The Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which includes collision avoidance based on the fuzzy logic fusion model and a line following robot, has been implemented and tested through simulation and real time experiments. Various scenarios have been presented with static and dynamic obstacles using one robot and two robots while avoiding obstacles of different shapes and sizes. PMID:26712766
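
    The record maps distance-sensor readings through fuzzy rules onto left and right wheel velocities. A minimal, self-contained sketch of that style of fuzzy fusion follows; the membership breakpoints, the three-rule base, and the cruise/slow speed values are invented for illustration and are far simpler than the paper's nine-input, 24-rule system.

      def near(d, full=0.2, zero=1.0):
          """Degree to which distance d (meters) is 'near': 1 at or below full, 0 at or above zero."""
          if d <= full:
              return 1.0
          if d >= zero:
              return 0.0
          return (zero - d) / (zero - full)

      def far(d):
          return 1.0 - near(d)

      def fuzzy_avoidance(left_dist, right_dist, cruise=0.5, slow=0.1):
          """Fuse two distance readings into wheel speeds with three fuzzy rules."""
          rules = [
              # (firing strength, left wheel speed, right wheel speed)
              (min(far(left_dist), far(right_dist)), cruise, cruise),   # path clear -> go straight
              (near(left_dist), cruise, slow),                          # obstacle left -> steer right
              (near(right_dist), slow, cruise),                         # obstacle right -> steer left
          ]
          total = sum(w for w, _, _ in rules) or 1.0
          v_left = sum(w * l for w, l, _ in rules) / total              # weighted-average defuzzification
          v_right = sum(w * r for w, _, r in rules) / total
          return v_left, v_right

      print(fuzzy_avoidance(0.3, 1.5))   # obstacle on the left -> right wheel slows, robot steers right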

  4. Sensor Fusion Based Model for Collision Free Mobile Robot Navigation

    Directory of Open Access Journals (Sweden)

    Marwah Almasri

    2015-12-01

    Full Text Available Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each of them is equipped with various types of sensors such as GPS, camera, infrared and ultrasonic sensors. These sensors are used to observe the surrounding environment. However, these sensors sometimes fail and produce inaccurate readings. Therefore, the integration of sensor fusion helps to solve this dilemma and enhance the overall performance. This paper presents collision-free mobile robot navigation based on a fuzzy logic fusion model. Eight distance sensors and a range finder camera are used for the collision avoidance approach, while three ground sensors are used for the line or path following approach. The fuzzy system is composed of nine inputs, which are the eight distance sensors and the camera, two outputs, which are the left and right velocities of the mobile robot’s wheels, and 24 fuzzy rules for the robot’s movement. The Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which includes collision avoidance based on the fuzzy logic fusion model and a line following robot, has been implemented and tested through simulation and real time experiments. Various scenarios have been presented with static and dynamic obstacles using one robot and two robots while avoiding obstacles of different shapes and sizes.

  5. Estimating Orientation Using Magnetic and Inertial Sensors and Different Sensor Fusion Approaches: Accuracy Assessment in Manual and Locomotion Tasks

    Directory of Open Access Journals (Sweden)

    Elena Bergamini

    2014-10-01

    Full Text Available Magnetic and inertial measurement units are an emerging technology for obtaining the 3D orientation of body segments in human movement analysis. In this respect, sensor fusion is used to limit the drift errors resulting from the gyroscope data integration by exploiting accelerometer and magnetic aiding sensors. The present study aims at investigating the effectiveness of sensor fusion methods under different experimental conditions. Manual and locomotion tasks, differing in time duration, measurement volume, presence/absence of static phases, and out-of-plane movements, were performed by six subjects, and recorded by one unit located on the forearm or the lower trunk, respectively. Two sensor fusion methods, representative of the stochastic (Extended Kalman Filter) and complementary (Non-linear observer) filtering approaches, were selected, and their accuracy was assessed in terms of attitude (pitch and roll angles) and heading (yaw angle) errors using stereophotogrammetric data as a reference. The sensor fusion approaches provided significantly more accurate results than gyroscope data integration. Accuracy improved mostly for heading and when the movement exhibited stationary phases and evenly distributed 3D rotations, occurred in a small volume, and lasted longer than approximately 20 s. These results were independent of the specific sensor fusion method used. Practice guidelines for improving the outcome accuracy are provided.

  6. Performance evaluation of multi-sensor data-fusion systems

    Indian Academy of Sciences (India)

    In this paper, the utilization of multi-sensors of different types, their characteristics, and their data-fusion in launch vehicles to achieve the goal of injecting the satellite into a precise orbit is explained. Performance requirements of sensors and their redundancy management in a typical launch vehicle are also included.

  7. Sensor fusion III: 3-D perception and recognition; Proceedings of the Meeting, Boston, MA, Nov. 5-8, 1990

    Science.gov (United States)

    Schenker, Paul S. (Editor)

    1991-01-01

    The volume on data fusion from multiple sources discusses fusing multiple views, temporal analysis and 3D motion interpretation, sensor fusion and eye-to-hand coordination, and integration in human shape perception. Attention is given to surface reconstruction, statistical methods in sensor fusion, fusing sensor data with environmental knowledge, computational models for sensor fusion, and evaluation and selection of sensor fusion techniques. Topics addressed include the structure of a scene from two and three projections, optical flow techniques for moving target detection, tactical sensor-based exploration in a robotic environment, and the fusion of human and machine skills for remote robotic operations. Also discussed are K-nearest-neighbor concepts for sensor fusion, surface reconstruction with discontinuities, a sensor-knowledge-command fusion paradigm for man-machine systems, coordinating sensing and local navigation, and terrain map matching using multisensing techniques for applications to autonomous vehicle navigation.

  8. Sensor fusion-based map building for mobile robot exploration

    International Nuclear Information System (INIS)

    Ribo, M.

    2000-01-01

    To carry out exploration tasks in unknown or partially unknown environments, a mobile robot needs to acquire and maintain models of its environment. In doing so, several sensors of the same nature and/or heterogeneous sensor configurations may be used by the robot to achieve reliable performance. However, this in turn poses the problem of sensor fusion-based map building: how to interpret, combine and integrate sensory information in order to build a proper representation of the environment. Specifically, the goal of this thesis is to probe integration algorithms for Occupancy Grid (OG) based map building using odometry, ultrasonic rangefinders, and stereo vision. Three different uncertainty calculi are presented which are used for sensor fusion-based map building purposes. They are based on probability theory, Dempster-Shafer theory of evidence, and fuzzy set theory. In addition, two different sensor models are described which are used to translate sensing data into range information. Experimental examples of OGs built from real data recorded by two robots in an office-like environment are presented. They show the feasibility of the proposed approach for building both sonar and visual based OGs. A comparison among the presented uncertainty calculi is performed in a sonar-based framework. Finally, the fusion of both sonar and visual information based on fuzzy set theory is presented. (author)
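
    The record contrasts probabilistic, evidential and fuzzy updates of an occupancy grid. The sketch below shows only the probabilistic (Bayesian log-odds) variant for a single cell; the hit/miss probabilities of the inverse sensor model and the example readings are hypothetical.

      import math

      def logit(p):
          return math.log(p / (1.0 - p))

      def update_cell(l, occupied_evidence, p_hit=0.7, p_miss=0.4):
          """Bayesian log-odds update of one grid cell from a single range reading.
          occupied_evidence is True if the reading suggests the cell is occupied."""
          l += logit(p_hit) if occupied_evidence else logit(p_miss)
          return l

      def probability(l):
          return 1.0 - 1.0 / (1.0 + math.exp(l))

      # Hypothetical: one cell observed occupied twice by sonar and free once by vision
      l = 0.0                               # log-odds of the uniform prior p = 0.5
      for evidence in (True, True, False):
          l = update_cell(l, evidence)
      print(round(probability(l), 3))       # posterior occupancy probability of the cell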

  9. Sensor Fusion Techniques for Phased-Array Eddy Current and Phased-Array Ultrasound Data

    Energy Technology Data Exchange (ETDEWEB)

    Arrowood, Lloyd F. [Y-12 National Security Complex, Oak Ridge, TN (United States)

    2018-03-15

    Sensor (or Data) fusion is the process of integrating multiple data sources to produce more consistent, accurate and comprehensive information than is provided by a single data source. Sensor fusion may also be used to combine multiple signals from a single modality to improve the performance of a particular inspection technique. Industrial nondestructive testing may utilize multiple sensors to acquire inspection data depending upon the object under inspection and the anticipated types of defects that can be identified. Sensor fusion can be performed at various levels of signal abstraction with each having its strengths and weaknesses. A multimodal data fusion strategy first proposed by Heideklang and Shokouhi that combines spatially scattered detection locations to improve detection performance of surface-breaking and near-surface cracks in ferromagnetic metals is shown using a surface inspection example and is then extended for volumetric inspections. Utilizing data acquired from an Olympus Omniscan MX2 from both phased array eddy current and ultrasound probes on test phantoms, single and multilevel fusion techniques are employed to integrate signals from the two modalities. Preliminary results demonstrate how confidence in defect identification and interpretation benefit from sensor fusion techniques. Lastly, techniques for integrating data into radiographic and volumetric imagery from computed tomography are described and results are presented.

  10. APPLICATION OF SENSOR FUSION TO IMPROVE UAV IMAGE CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    S. Jabari

    2017-08-01

    Full Text Available Image classification is one of the most important tasks of remote sensing projects, including those based on UAV images. Improving the quality of UAV images directly affects the classification results and can save a huge amount of time and effort in this area. In this study, we show that sensor fusion can improve image quality, which results in increasing the accuracy of image classification. Here, we tested two sensor fusion configurations by using a Panchromatic (Pan) camera along with either a colour camera or a four-band multi-spectral (MS) camera. We use the Pan camera to benefit from its higher sensitivity and the colour or MS camera to benefit from its spectral properties. The resulting images are then compared to the ones acquired by a high resolution single Bayer-pattern colour camera (here referred to as HRC). We assessed the quality of the output images by performing image classification tests. The outputs prove that the proposed sensor fusion configurations can achieve higher accuracies compared to the images of the single Bayer-pattern colour camera. Therefore, incorporating a Pan camera on-board in the UAV missions and performing image fusion can help achieve higher quality images and accordingly higher accuracy classification results.
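
    The record fuses a high-resolution panchromatic image with a lower-resolution colour or multi-spectral image before classification. The sketch below shows a Brovey-transform pan-sharpening step, one common way to perform such fusion and not necessarily the authors' method; it assumes the MS image has already been resampled to the panchromatic grid.

      import numpy as np

      def brovey_pansharpen(ms, pan, eps=1e-6):
          """Brovey-transform fusion: rescale each multispectral band so that the
          per-pixel intensity matches the high-resolution panchromatic band.
          ms  : (H, W, B) multispectral image, already resampled to the pan grid
          pan : (H, W)    panchromatic image"""
          intensity = ms.mean(axis=2) + eps
          return ms * (pan / intensity)[..., None]

      # Hypothetical 2x2 example with 3 bands
      ms = np.random.rand(2, 2, 3)
      pan = np.random.rand(2, 2)
      print(brovey_pansharpen(ms, pan).shape)   # (2, 2, 3)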

  11. Application of Sensor Fusion to Improve Uav Image Classification

    Science.gov (United States)

    Jabari, S.; Fathollahi, F.; Zhang, Y.

    2017-08-01

    Image classification is one of the most important tasks of remote sensing projects, including those based on UAV images. Improving the quality of UAV images directly affects the classification results and can save a huge amount of time and effort in this area. In this study, we show that sensor fusion can improve image quality, which results in increasing the accuracy of image classification. Here, we tested two sensor fusion configurations by using a Panchromatic (Pan) camera along with either a colour camera or a four-band multi-spectral (MS) camera. We use the Pan camera to benefit from its higher sensitivity and the colour or MS camera to benefit from its spectral properties. The resulting images are then compared to the ones acquired by a high resolution single Bayer-pattern colour camera (here referred to as HRC). We assessed the quality of the output images by performing image classification tests. The outputs prove that the proposed sensor fusion configurations can achieve higher accuracies compared to the images of the single Bayer-pattern colour camera. Therefore, incorporating a Pan camera on-board in the UAV missions and performing image fusion can help achieve higher quality images and accordingly higher accuracy classification results.

  12. CONTEXT-CAPTURE MULTI-VALUED DECISION FUSION WITH FAULT TOLERANT CAPABILITY FOR WIRELESS SENSOR NETWORKS

    OpenAIRE

    Jun Wu; Shigeru Shimamoto

    2011-01-01

    Wireless sensor networks (WSNs) are usually utilized to perform decision fusion for event detection. Current decision fusion schemes are based on binary-valued decisions and do not consider bursty context capture. However, bursty context and multi-valued data are important characteristics of WSNs. On one hand, the local decisions from sensors usually have bursty and contextual characteristics, and the fusion center must capture the bursty context information from the sensors. On the other hand, in pract...

  13. Kalman filter-based EM-optical sensor fusion for needle deflection estimation.

    Science.gov (United States)

    Jiang, Baichuan; Gao, Wenpeng; Kacher, Daniel; Nevo, Erez; Fetics, Barry; Lee, Thomas C; Jayender, Jagadeesan

    2018-04-01

    In many clinical procedures that involve needle insertion, such as cryoablation, accurate placement of the needle's tip at the desired target is the major issue for optimizing the treatment and minimizing damage to the neighboring anatomy. However, due to the interaction force between the needle and tissue, considerable error in intraoperative tracking of the needle tip can be observed as the needle deflects. In this paper, measurement data from an optical sensor at the needle base and a magnetic resonance (MR) gradient field-driven electromagnetic (EM) sensor placed 10 cm from the needle tip are used within a model-integrated Kalman filter-based sensor fusion scheme. Bending model-based estimations and the EM-based direct estimation are used as the measurement vectors in the Kalman filter, thus establishing an online estimation approach. Static tip bending experiments show that the fusion method can reduce the mean error of the tip position estimation from 29.23 mm of the optical sensor-based approach to 3.15 mm of the fusion-based approach and from 39.96 to 6.90 mm, at the MRI isocenter and the MRI entrance, respectively. This work established a novel sensor fusion scheme that incorporates model information, which enables real-time tracking of needle deflection with MRI compatibility, in a free-hand operating setup.
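
    The record stacks a bending-model estimate and an EM-sensor estimate into one measurement vector of a Kalman filter. The sketch below shows such a stacked update for a single scalar state; the state dimension, noise variances and sample values are hypothetical and much simpler than the paper's full scheme.

      import numpy as np

      def fuse_two_measurements(x_pred, p_pred, z1, r1, z2, r2):
          """One Kalman update that stacks two direct measurements of the same scalar
          state, e.g. a bending-model tip estimate and an EM-sensor tip estimate."""
          H = np.array([[1.0], [1.0]])                 # both sources observe the state directly
          R = np.diag([r1, r2])                        # assumed measurement noise variances
          z = np.array([z1, z2])
          P = np.array([[p_pred]])
          S = H @ P @ H.T + R                          # innovation covariance (2x2)
          K = P @ H.T @ np.linalg.inv(S)               # Kalman gain (1x2)
          x_new = x_pred + (K @ (z - H[:, 0] * x_pred)).item()
          p_new = ((np.eye(1) - K @ H) @ P).item()
          return x_new, p_new

      # Hypothetical: prior deflection 3.0 mm; model says 3.2 mm, EM sensor says 2.5 mm
      print(fuse_two_measurements(3.0, 4.0, 3.2, 1.0, 2.5, 0.5))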

  14. Dempster Shafer Sensor Fusion for Autonomously Driving Vehicles : Association Free Tracking of Dynamic Objects

    OpenAIRE

    Högger, Andreas

    2016-01-01

    Autonomous driving vehicles introduce challenging research areas combining different disciplines. One challenge is the detection of obstacles with different sensors and the combination of information to generate a comprehensive representation of the environment, which can be used for path planning and decision making. The sensor fusion is demonstrated using two Velodyne multi-beam laser scanners, but it is possible to extend the proposed sensor fusion framework for different sensor types. Sensor...

  15. Kalman Filter Sensor Fusion for Mecanum Wheeled Automated Guided Vehicle Localization

    Directory of Open Access Journals (Sweden)

    Sang Won Yoon

    2015-01-01

    Full Text Available The Mecanum automated guided vehicle (AGV), which can move in any direction by using a special wheel structure with a LIM-wheel and a diagonally positioned roller, holds considerable promise for the field of industrial electronics. A conventional method for Mecanum AGV localization has certain limitations, such as slip phenomena, because there are variations in the road surface and ground friction. Therefore, precise localization is a very important issue in the inevitable slip situations, and a sensor fusion technique based on the Kalman filter was developed to cope with this drawback. ENCODER and StarGazer were used for sensor fusion. StarGazer is a position sensor based on an image recognition device and always generates some errors due to the limitations of the image recognition device. ENCODER also has errors that accumulate over time; on the other hand, it has no moving errors. In this study, we developed a Mecanum AGV prototype system and showed by simulation that we can eliminate the disadvantages of each sensor. We obtained precise localization of the Mecanum AGV in a slip phenomenon situation via sensor fusion using a Kalman filter.
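
    The record fuses drifting odometry (ENCODER) with a noisy absolute position sensor (StarGazer) through a Kalman filter. A minimal one-dimensional sketch of that predict/update cycle follows; the noise variances and the sample data are hypothetical, and the real system estimates a full planar pose rather than a scalar position.

      def kalman_fuse(x, p, encoder_delta, stargazer_pos, q=0.01, r=0.25):
          """One predict/update cycle of a scalar Kalman filter.
          x, p          : previous position estimate and its variance
          encoder_delta : displacement reported by the wheel encoders (prediction input)
          stargazer_pos : absolute position reported by the external sensor (measurement)
          q, r          : assumed process and measurement noise variances"""
          # Predict with odometry (drift grows as the process variance q accumulates)
          x_pred = x + encoder_delta
          p_pred = p + q
          # Update with the absolute position measurement
          k = p_pred / (p_pred + r)                # Kalman gain
          x_new = x_pred + k * (stargazer_pos - x_pred)
          p_new = (1.0 - k) * p_pred
          return x_new, p_new

      # Hypothetical run: the encoder slips (over-reports motion), StarGazer corrects it
      x, p = 0.0, 1.0
      for delta, z in [(0.10, 0.09), (0.12, 0.18), (0.10, 0.27)]:
          x, p = kalman_fuse(x, p, delta, z)
      print(round(x, 3), round(p, 4))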

  16. Security of nuclear materials using fusion multi sensor wavelett

    International Nuclear Information System (INIS)

    Djoko Hari Nugroho

    2010-01-01

    Security of nuclear material in an installation is determined by the extent to which the installation can assure that the nuclear material remains at a predetermined location. This paper presents a preliminary design of a nuclear material tracking system for decision making support, based on multi-sensor fusion, that is reliable and accurate enough to ensure that the nuclear material remains inside the control area. The decision making capability of the Management Information System is represented by an understanding of perception at the third level of abstraction. The second level is achieved with the support of image analysis and data organization. The first level of abstraction is constructed by merging several CCD camera sensors distributed in a building into a data fusion representation. The data fusion is processed with a wavelet approach. Simulation using Matlab programming shows that the wavelet approach fuses the information from multiple sensors well. When the nuclear material moves out of the predetermined control regions, a warning alarm and a message are raised in the Management Information System display. Thus the time of the nuclear material movement event can be obtained and tracked as well. (author)

  17. Resource-Aware Data Fusion Algorithms for Wireless Sensor Networks

    CERN Document Server

    Abdelgawad, Ahmed

    2012-01-01

    This book introduces resource-aware data fusion algorithms to gather and combine data from multiple sources (e.g., sensors) in order to achieve inferences.  These techniques can be used in centralized and distributed systems to overcome sensor failure, technological limitation, and spatial and temporal coverage problems. The algorithms described in this book are evaluated with simulation and experimental results to show they will maintain data integrity and make data useful and informative.   Describes techniques to overcome real problems posed by wireless sensor networks deployed in circumstances that might interfere with measurements provided, such as strong variations of pressure, temperature, radiation, and electromagnetic noise; Uses simulation and experimental results to evaluate algorithms presented and includes real test-bed; Includes case study implementing data fusion algorithms on a remote monitoring framework for sand production in oil pipelines.

  18. Advancing of Land Surface Temperature Retrieval Using Extreme Learning Machine and Spatio-Temporal Adaptive Data Fusion Algorithm

    Directory of Open Access Journals (Sweden)

    Yang Bai

    2015-04-01

    Full Text Available As a critical variable for characterizing biophysical processes in the ecological environment, and as a key indicator in the surface energy balance, evapotranspiration and urban heat islands, Land Surface Temperature (LST) retrieved from Thermal Infra-Red (TIR) images at both high temporal and spatial resolution is urgently needed. However, due to the limitations of existing satellite sensors, there is no Earth observation which can obtain TIR data at detailed spatial and temporal resolution simultaneously. Thus, several attempts at image fusion, blending the TIR data from a high temporal resolution sensor with data from a high spatial resolution sensor, have been studied. This paper presents a novel data fusion method, integrating image fusion and spatio-temporal fusion techniques, for deriving LST datasets at 30 m spatial resolution from daily MODIS images and Landsat ETM+ images. The Landsat ETM+ TIR data were first enhanced from 60 m to 30 m resolution using a neural network regression model based on the extreme learning machine (ELM) algorithm. Then, the MODIS LST and enhanced Landsat ETM+ TIR data were fused by the Spatio-temporal Adaptive Data Fusion Algorithm for Temperature mapping (SADFAT) in order to derive high resolution synthetic data. The synthetic images were evaluated for both testing and simulated satellite images. The average difference (AD) and absolute average difference (AAD) are smaller than 1.7 K, and the correlation coefficient (CC) and root-mean-square error (RMSE) are 0.755 and 1.824, respectively, showing that the proposed method enhances the spatial resolution of the predicted LST images while preserving the spectral information.

  19. A Radiosonde Using a Humidity Sensor Array with a Platinum Resistance Heater and Multi-Sensor Data Fusion

    Science.gov (United States)

    Shi, Yunbo; Luo, Yi; Zhao, Wenjie; Shang, Chunxue; Wang, Yadong; Chen, Yinsheng

    2013-01-01

    This paper describes the design and implementation of a radiosonde which can measure meteorological temperature, humidity, pressure, and other atmospheric data. The system is composed of a CPU, microwave module, temperature sensor, pressure sensor and humidity sensor array. In order to effectively solve the humidity sensor condensation problem caused by the low temperatures of the high-altitude environment, a capacitive humidity sensor, including four humidity sensors to collect meteorological humidity and a platinum resistance heater, was developed using micro-electro-mechanical-system (MEMS) technology. A platinum resistance wire with 99.999% purity and 0.023 mm diameter was used to obtain the meteorological temperature. A multi-sensor data fusion technique was applied to process the atmospheric data. Static and dynamic experimental results show that the designed humidity sensor with platinum resistance heater can effectively tackle the condensation problem, shorten response times and enhance sensitivity. The humidity sensor array improves measurement accuracy and yields reliable initial meteorological humidity data, while the multi-sensor data fusion technique eliminates the uncertainty in the measurement. The radiosonde can accurately reflect meteorological changes. PMID:23857263
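
    The record fuses readings from a four-element humidity sensor array into one estimate. A common way to do this, shown below as a hedged sketch rather than the paper's exact scheme, is inverse-variance weighting, which down-weights the noisier elements; the readings and variances are invented for illustration.

      def inverse_variance_fusion(readings, variances):
          """Fuse several measurements of the same quantity by weighting each with
          the inverse of its (assumed known) noise variance."""
          weights = [1.0 / v for v in variances]
          fused = sum(w * x for w, x in zip(weights, readings)) / sum(weights)
          fused_variance = 1.0 / sum(weights)
          return fused, fused_variance

      # Hypothetical readings (% relative humidity) from four sensing elements
      readings = [41.2, 40.8, 41.5, 39.9]
      variances = [0.2, 0.2, 0.5, 1.0]     # the last element is assumed to be the noisiest
      print(inverse_variance_fusion(readings, variances))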

  20. Noncontact Sleep Study by Multi-Modal Sensor Fusion

    Directory of Open Access Journals (Sweden)

    Ku-young Chung

    2017-07-01

    Full Text Available Polysomnography (PSG) is considered the gold standard for determining sleep stages, but due to the obtrusiveness of its sensor attachments, sleep stage classification algorithms using noninvasive sensors have been developed throughout the years. However, the previous studies have not yet been proven reliable. In addition, most of the products are designed for healthy customers rather than for patients with sleep disorders. We present a novel approach to classify sleep stages via low-cost and noncontact multi-modal sensor fusion, which extracts sleep-related vital signals from radar signals and a sound-based context-awareness technique. This work is uniquely designed based on the PSG data of sleep disorder patients, which were received and certified by professionals at Hanyang University Hospital. The proposed algorithm further incorporates medical/statistical knowledge to determine personally adjusted thresholds and devise post-processing. The efficiency of the proposed algorithm is highlighted by contrasting sleep stage classification performance between single-sensor and sensor-fusion algorithms. To validate the possibility of commercializing this work, the classification results of this algorithm were compared with those of the commercialized sleep monitoring device, ResMed S+. The proposed algorithm was investigated with random patients following PSG examination, and the results show a promising novel approach for determining sleep stages in a low-cost and unobtrusive manner.

  1. A novel method of range measuring for a mobile robot based on multi-sensor information fusion

    International Nuclear Information System (INIS)

    Zhang Yi; Luo Yuan; Wang Jifeng

    2005-01-01

    Traditional range measurement for a mobile robot is based on a sonar sensor. Because of different working environments, it is very difficult to obtain high precision by using just one single method of range measurement. So, a hybrid sonar sensor and laser scanner method is put forward to overcome these shortcomings. A novel fusion model is proposed based on the basic theory and methods of information fusion. An optimal measurement result has been obtained with information fusion from different sensors. After a large number of experiments and performance analysis, the conclusion can be drawn that the laser scanner and sonar sensor method with multi-sensor information fusion has higher precision than the sonar-only method, and this also holds for different environments

  2. Sensor fusion to enable next generation low cost Night Vision systems

    Science.gov (United States)

    Schweiger, R.; Franz, S.; Löhlein, O.; Ritter, W.; Källhammer, J.-E.; Franks, J.; Krekels, T.

    2010-04-01

    The next generation of automotive Night Vision Enhancement systems offers automatic pedestrian recognition with a performance beyond current Night Vision systems at a lower cost. This will allow high market penetration, covering the luxury as well as compact car segments. Improved performance can be achieved by fusing a Far Infrared (FIR) sensor with a Near Infrared (NIR) sensor. However, fusing with today's FIR systems will be too costly to get a high market penetration. The main cost drivers of the FIR system are its resolution and its sensitivity. Sensor cost is largely determined by sensor die size. Fewer and smaller pixels will reduce die size but also resolution and sensitivity. Sensitivity limits are mainly determined by inclement weather performance. Sensitivity requirements should be matched to the possibilities of low cost FIR optics, especially implications of molding of highly complex optical surfaces. As a FIR sensor specified for fusion can have lower resolution as well as lower sensitivity, fusing FIR and NIR can solve performance and cost problems. To allow compensation of FIR-sensor degradation on the pedestrian detection capabilities, a fusion approach called MultiSensorBoosting is presented that produces a classifier holding highly discriminative sub-pixel features from both sensors at once. The algorithm is applied on data with different resolution and on data obtained from cameras with varying optics to incorporate various sensor sensitivities. As it is not feasible to record representative data with all different sensor configurations, transformation routines on existing high resolution data recorded with high sensitivity cameras are investigated in order to determine the effects of lower resolution and lower sensitivity to the overall detection performance. This paper also gives an overview of the first results showing that a reduction of FIR sensor resolution can be compensated using fusion techniques and a reduction of sensitivity can be

  3. A Method Based on Multi-Sensor Data Fusion for Fault Detection of Planetary Gearboxes

    Directory of Open Access Journals (Sweden)

    Detong Kong

    2012-02-01

    Full Text Available Studies on fault detection and diagnosis of planetary gearboxes are quite limited compared with those of fixed-axis gearboxes. Different from fixed-axis gearboxes, planetary gearboxes exhibit unique behaviors, which invalidate fault diagnosis methods that work well for fixed-axis gearboxes. It is a fact that for systems as complex as planetary gearboxes, multiple sensors mounted on different locations provide complementary information on the health condition of the systems. On this basis, a fault detection method based on multi-sensor data fusion is introduced in this paper. In this method, two features developed for planetary gearboxes are used to characterize the gear health conditions, and an adaptive neuro-fuzzy inference system (ANFIS is utilized to fuse all features from different sensors. In order to demonstrate the effectiveness of the proposed method, experiments are carried out on a planetary gearbox test rig, on which multiple accelerometers are mounted for data collection. The comparisons between the proposed method and the methods based on individual sensors show that the former achieves much higher accuracies in detecting planetary gearbox faults.

  4. Sensor Interoperability and Fusion in Fingerprint Verification: A Case Study using Minutiae-and Ridge-Based Matchers

    NARCIS (Netherlands)

    Alonso-Fernandez, F.; Veldhuis, Raymond N.J.; Bazen, A.M.; Fierrez-Aguilar, J.; Ortega-Garcia, J.

    2006-01-01

    Information fusion in fingerprint recognition has been studied in several papers. However, only a few papers have been focused on sensor interoperability and sensor fusion. In this paper, these two topics are studied using a multisensor database acquired with three different fingerprint sensors.

  5. Data Fusion Based on Node Trust Evaluation in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Zhou Jianming

    2014-01-01

    Full Text Available Traditional abnormal behavior detection and trust evaluation for sensor nodes serve a single function without considering all the relevant factors, and the trust value algorithms are relatively complicated. To avoid these disadvantages, a trust evaluation model based on the autonomous behavior of sensor nodes is proposed in this paper. Each sensor node has the monitoring privilege and obligation, and neighboring sensor nodes can monitor each other. Their direct and indirect trust values can be obtained with a relatively simple calculation method, and the synthesized trust value can be derived according to the combination rule of D-S evidence theory. First, the cluster head assigns a different weight to the data from each sensor node; the weight vector is set according to the synthesized trust value, data fusion is executed, and finally the cluster head transmits the fused result to the base station. Simulation results demonstrate that the trust evaluation model can rapidly, exactly, and effectively recognize malicious sensor nodes and prevent a malicious node from becoming the cluster head. The proposed algorithm can greatly increase the safety and accuracy of data fusion, improve communication efficiency, save sensor node energy, and suit different application fields and deployment environments.

  6. HEAT Sensor: Harsh Environment Adaptable Thermionic Sensor

    Energy Technology Data Exchange (ETDEWEB)

    Limb, Scott J. [Palo Alto Research Center, Palo Alto, CA (United States)

    2016-05-31

    This document is the final report for the “HARSH ENVIRONMENT ADAPTABLE THERMIONIC SENSOR” project under NETL’s Crosscutting contract DE-FE0013062. This report addresses sensors that can be made with thermionic thin films along with the required high temperature hermetic packaging process. These sensors can be placed in harsh high temperature environments and potentially be wireless and self-powered.

  7. Sensor Data Fusion for Accurate Cloud Presence Prediction Using Dempster-Shafer Evidence Theory

    Directory of Open Access Journals (Sweden)

    Jesse S. Jin

    2010-10-01

    Full Text Available Sensor data fusion technology can be used to best extract useful information from multiple sensor observations. It has been widely applied in various applications such as target tracking, surveillance, robot navigation, and signal and image processing. This paper introduces a novel data fusion approach in a multiple radiation sensor environment using Dempster-Shafer evidence theory. The methodology is used to predict cloud presence based on the inputs of radiation sensors. Different radiation data have been used for the cloud prediction. The potential application areas of the algorithm include renewable power for virtual power stations, where the prediction of cloud presence is the most challenging issue for photovoltaic output. The algorithm is validated by comparing the predicted cloud presence with the corresponding sunshine occurrence data that were recorded as the benchmark. Our experiments have indicated that, compared to approaches using individual sensors, the proposed data fusion approach can increase the correct rate of cloud prediction by ten percent and decrease the unknown rate of cloud prediction by twenty-three percent.
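
    The record combines evidence from several radiation sensors using Dempster-Shafer theory. A minimal sketch of Dempster's rule of combination over the frame {cloud, clear} follows; the basic belief assignments of the two sensors are invented for illustration and are not taken from the paper.

      from itertools import product

      def dempster_combine(m1, m2):
          """Combine two basic belief assignments (dicts mapping frozenset hypotheses
          to masses) with Dempster's rule, normalizing out the conflicting mass."""
          combined, conflict = {}, 0.0
          for (a, wa), (b, wb) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + wa * wb
              else:
                  conflict += wa * wb
          return {h: w / (1.0 - conflict) for h, w in combined.items()}

      CLOUD, CLEAR = frozenset({"cloud"}), frozenset({"clear"})
      EITHER = CLOUD | CLEAR                      # mass assigned to ignorance

      # Hypothetical evidence from two radiation sensors
      m_sensor1 = {CLOUD: 0.6, CLEAR: 0.1, EITHER: 0.3}
      m_sensor2 = {CLOUD: 0.5, CLEAR: 0.2, EITHER: 0.3}
      print(dempster_combine(m_sensor1, m_sensor2))   # combined belief favours "cloud"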

  8. Cooperative aquatic sensing using the telesupervised adaptive ocean sensor fleet

    Science.gov (United States)

    Dolan, John M.; Podnar, Gregg W.; Stancliff, Stephen; Low, Kian Hsiang; Elfes, Alberto; Higinbotham, John; Hosler, Jeffrey; Moisan, Tiffany; Moisan, John

    2009-09-01

    Earth science research must bridge the gap between the atmosphere and the ocean to foster understanding of Earth's climate and ecology. Typical ocean sensing is done with satellites or in situ buoys and research ships which are slow to reposition. Cloud cover inhibits study of localized transient phenomena such as Harmful Algal Blooms (HAB). A fleet of extended-deployment surface autonomous vehicles will enable in situ study of characteristics of HAB, coastal pollutants, and related phenomena. We have developed a multiplatform telesupervision architecture that supports adaptive reconfiguration based on environmental sensor inputs. Our system allows the autonomous repositioning of smart sensors for HAB study by networking a fleet of NOAA OASIS (Ocean Atmosphere Sensor Integration System) surface autonomous vehicles. In situ measurements intelligently modify the search for areas of high concentration. Inference Grid and complementary information-theoretic techniques support sensor fusion and analysis. Telesupervision supports sliding autonomy from high-level mission tasking, through vehicle and data monitoring, to teleoperation when direct human interaction is appropriate. This paper reports on experimental results from multi-platform tests conducted in the Chesapeake Bay and in Pittsburgh, Pennsylvania waters using OASIS platforms, autonomous kayaks, and multiple simulated platforms to conduct cooperative sensing of chlorophyll-a and water quality.

  9. Infrared processing and sensor fusion for anti-personnel land-mine detection

    NARCIS (Netherlands)

    Schavemaker, J.G.M.; Cremer, F.; Schutte, K.; Breejen, E. den

    2000-01-01

    In this paper we present the results of infrared processing and sensor fusion obtained within the European research project GEODE (Ground Explosive Ordnance DEtection) that strives for the realization of a vehicle-mounted, multi-sensor anti-personnel land-mine detection system for humanitarian

  10. Assessing the Performance of Sensor Fusion Methods: Application to Magnetic-Inertial-Based Human Body Tracking.

    Science.gov (United States)

    Ligorio, Gabriele; Bergamini, Elena; Pasciuto, Ilaria; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria

    2016-01-26

    Information from complementary and redundant sensors is often combined within sensor fusion algorithms to obtain a single accurate observation of the system at hand. However, measurements from each sensor are characterized by uncertainties. When multiple data are fused, it is often unclear how all these uncertainties interact and influence the overall performance of the sensor fusion algorithm. To address this issue, a benchmarking procedure is presented, where simulated and real data are combined in different scenarios in order to quantify how each sensor's uncertainties influence the accuracy of the final result. The proposed procedure was applied to the estimation of the pelvis orientation using a waist-worn magnetic-inertial measurement unit. Ground-truth data were obtained from a stereophotogrammetric system and used to obtain simulated data. Two Kalman-based sensor fusion algorithms were submitted to the proposed benchmarking procedure. For the considered application, gyroscope uncertainties proved to be the main error source in orientation estimation accuracy for both tested algorithms. Moreover, although different performances were obtained using simulated data, these differences became negligible when real data were considered. The outcome of this evaluation may be useful both to improve the design of new sensor fusion methods and to drive the algorithm tuning process.

  11. Comparison of pH Data Measured with a pH Sensor Array Using Different Data Fusion Methods

    Directory of Open Access Journals (Sweden)

    Yi-Hung Liao

    2012-09-01

    Full Text Available This paper introduces different data fusion methods for electrochemical measurements made with a sensor array. In this study, we used ruthenium dioxide sensing membrane pH electrodes to form a sensor array. The sensor array was used for detecting the pH values of grape wine, generic cola drink and bottled base water. The measured pH data were processed with the data fusion methods to increase the reliability of the measured results, and we compared the results obtained with the different data fusion methods.

  12. Application of D-S Evidence Fusion Method in the Fault Detection of Temperature Sensor

    Directory of Open Access Journals (Sweden)

    Zheng Dou

    2014-01-01

    Full Text Available Because the drying process is complex and hazardous, detecting temperature sensor faults during actual operation is difficult and dangerous, and the detection effectiveness has not been satisfactory. To address this problem, this paper introduces a two-layer D-S evidence fusion structure, based on the idea of information fusion and the requirements of the D-S evidence method, to detect temperature sensor faults in the drying process. The first layer is a data layer that establishes the basic belief assignment function of the evidence, realized with a BP neural network. The second layer is a decision layer that detects and locates the sensor fault, realized with the D-S evidence fusion method. According to the numerical simulation results, the working conditions of the sensors can be described effectively and accurately by this method, so it can be used to detect and locate sensor faults.
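
    The combination step in a scheme like the one above rests on Dempster's rule. A minimal sketch of that rule is given below; the two-state frame of discernment and the mass values are hypothetical placeholders, not anything taken from the paper.

```python
# Illustrative sketch of Dempster's rule of combination for two mass functions.
# The frame of discernment and mass values are hypothetical examples;
# hypotheses are represented as frozensets of sensor states.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic belief assignments with Dempster's rule."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence; rule undefined.")
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

# Hypothetical evidence from two sources about a temperature sensor:
# "ok", "fault", or undecided between the two.
m_bp_net = {frozenset({"ok"}): 0.6, frozenset({"fault"}): 0.3,
            frozenset({"ok", "fault"}): 0.1}
m_model  = {frozenset({"ok"}): 0.2, frozenset({"fault"}): 0.7,
            frozenset({"ok", "fault"}): 0.1}

print(dempster_combine(m_bp_net, m_model))
```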

  13. An epidemic model for biological data fusion in ad hoc sensor networks

    Science.gov (United States)

    Chang, K. C.; Kotari, Vikas

    2009-05-01

    Bioterrorism is a sophisticated and potentially catastrophic way of attacking a nation. Countering it requires a complete architecture designed specifically for this purpose, which includes but is not limited to sensing/detection, tracking and fusion, communication, and other components. In this paper we focus on one such architecture and evaluate its performance. Various sensors for this specific purpose have been studied. The focus has been on the use of distributed systems such as ad hoc networks and on the application of epidemic data fusion algorithms to better manage bio-threat data, and in particular on understanding the performance characteristics of these algorithms under diverse real-time scenarios implemented through extensive Java-based simulations. Through comparative studies on communication and fusion, the performance of the channel filter algorithm for biological sensor data fusion is validated.

  14. All-IP-Ethernet architecture for real-time sensor-fusion processing

    Science.gov (United States)

    Hiraki, Kei; Inaba, Mary; Tezuka, Hiroshi; Tomari, Hisanobu; Koizumi, Kenichi; Kondo, Shuya

    2016-03-01

    Serendipter is a device that distinguishes and selects very rare particles and cells from a huge population. We are currently designing and constructing the information processing system for a Serendipter. This system is a kind of sensor-fusion system, but with much more demanding requirements. To fulfill them, we adopt an all-IP architecture: the all-IP-Ethernet data processing system consists of (1) sensors/detectors that directly output data as IP-Ethernet packet streams, (2) a single Ethernet/TCP/IP stream formed by an L2 100 Gbps Ethernet switch, and (3) an FPGA board with a 100 Gbps Ethernet interface connected to the switch and to a Xeon-based server. Circuits in the FPGA include a 100 Gbps Ethernet MAC, buffers and preprocessing, and real-time deep learning circuits using multi-layer neural networks. The proposed all-IP architecture solves the existing problems of constructing large-scale sensor-fusion systems.

  15. Unmanned Aerial System (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine

    Science.gov (United States)

    Maimaitijiang, Maitiniyazi; Ghulam, Abduwasit; Sidike, Paheding; Hartling, Sean; Maimaitiyiming, Matthew; Peterson, Kyle; Shavers, Ethan; Fishman, Jack; Peterson, Jim; Kadam, Suhas; Burken, Joel; Fritschi, Felix

    2017-12-01

    Estimating crop biophysical and biochemical parameters with high accuracy at low cost is imperative for high-throughput phenotyping in precision agriculture. Although fusion of data from multiple sensors is a common application in remote sensing, less is known about the contribution of low-cost RGB, multispectral and thermal sensors to rapid crop phenotyping. This is due to the fact that (1) simultaneous collection of multi-sensor data using satellites is rare and (2) multi-sensor data collected during a single flight have not been accessible until recent developments in Unmanned Aerial Systems (UASs) and UAS-friendly sensors that allow efficient information fusion. The objective of this study was to evaluate the power of high spatial resolution RGB, multispectral and thermal data fusion to estimate soybean (Glycine max) biochemical parameters including chlorophyll content and nitrogen concentration, and biophysical parameters including Leaf Area Index (LAI), above ground fresh and dry biomass. Multiple low-cost sensors integrated on UASs were used to collect RGB, multispectral, and thermal images throughout the growing season at a site established near Columbia, Missouri, USA. From these images, vegetation indices were extracted, a Crop Surface Model (CSM) was generated, and a model to extract the vegetation fraction was developed. Then, spectral indices/features were combined to model and predict crop biophysical and biochemical parameters using Partial Least Squares Regression (PLSR), Support Vector Regression (SVR), and Extreme Learning Machine based Regression (ELR) techniques. Results showed that: (1) For biochemical variable estimation, multispectral and thermal data fusion provided the best estimate for nitrogen concentration and chlorophyll (Chl) a content (RMSE of 9.9% and 17.1%, respectively) and RGB color information based indices and multispectral data fusion exhibited the largest RMSE 22.6%; the highest accuracy for Chl a + b content estimation was

  16. Infrared sensors and sensor fusion; Proceedings of the Meeting, Orlando, FL, May 19-21, 1987

    International Nuclear Information System (INIS)

    Buser, R.G.; Warren, F.B.

    1987-01-01

    The present conference discusses topics in the fields of IR sensor multifunctional design; image modeling, simulation, and detection; IR sensor configurations and components; thermal sensor arrays; silicide-based IR sensors; and IR focal plane array utilization. Attention is given to the fusion of lidar and FLIR for target segmentation and enhancement, the synergetic integration of thermal and visual images for computer vision, the 'Falcon Eye' FLIR system, multifunctional electrooptics and multiaperture sensors for precision-guided munitions, and AI approaches to data integration. Also discussed are the comparative performance of Ir silicide and Pt silicide photodiodes, high fill-factor silicide monolithic arrays, and the characterization of noise in staring IR focal plane arrays

  17. A near-optimal low complexity sensor fusion technique for accurate indoor localization based on ultrasound time of arrival measurements from low-quality sensors

    Science.gov (United States)

    Mitilineos, Stelios A.; Argyreas, Nick D.; Thomopoulos, Stelios C. A.

    2009-05-01

    A fusion-based localization technique for location-based services in indoor environments is introduced herein, based on ultrasound time-of-arrival measurements from multiple off-the-shelf range estimating sensors which are used in a market-available localization system. In-situ field measurement results indicated that the respective off-the-shelf system was unable to estimate position in most of the cases, while the underlying sensors are of low quality and yield highly inaccurate range and position estimates. An extensive analysis is performed and a model of the sensor-performance characteristics is established. A low-complexity but accurate sensor fusion and localization technique is then developed, which consists of evaluating multiple sensor measurements and selecting the one that is considered most accurate based on the underlying sensor model. Optimality, in the sense of a genie selecting the optimum sensor, is subsequently evaluated and compared to the proposed technique. The experimental results indicate that the proposed fusion method exhibits near-optimal performance and, albeit being theoretically suboptimal, it largely overcomes most flaws of the underlying single-sensor system resulting in a localization system of increased accuracy, robustness and availability.
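
    The core of the technique above is a selection rule: evaluate the available range measurements and keep the one that the sensor model judges most accurate. A minimal sketch of such a rule follows; the variance model and readings are invented placeholders, not the sensor-performance model fitted in the paper.

```python
# Sketch of a "pick the most trustworthy measurement" fusion rule.
# The variance model below is a hypothetical placeholder; the paper fits its
# own sensor-performance model from in-situ field measurements.
from dataclasses import dataclass

@dataclass
class RangeMeasurement:
    sensor_id: int
    range_m: float

def modeled_variance(meas: RangeMeasurement) -> float:
    # Placeholder error model: uncertainty grows with the reported range.
    return 0.05 + 0.02 * meas.range_m

def select_most_accurate(measurements):
    """Return the measurement whose modeled error variance is smallest."""
    return min(measurements, key=modeled_variance)

readings = [RangeMeasurement(0, 3.2), RangeMeasurement(1, 2.9),
            RangeMeasurement(2, 7.5)]
best = select_most_accurate(readings)
print(f"Using sensor {best.sensor_id} with range {best.range_m} m")
```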

  18. Phase 1 report on sensor technology, data fusion and data interpretation for site characterization

    International Nuclear Information System (INIS)

    Beckerman, M.

    1991-10-01

    In this report we discuss sensor technology, data fusion and data interpretation approaches of possible maximal usefulness for subsurface imaging and characterization of land-fill waste sites. Two sensor technologies, terrain conductivity using electromagnetic induction and ground penetrating radar, are described and the literature on the subject is reviewed. We identify the maximum entropy stochastic method as one providing a rigorously justifiable framework for fusing the sensor data, briefly summarize work done by us in this area, and examine some of the outstanding issues with regard to data fusion and interpretation. 25 refs., 17 figs

  19. Multiplatform Mission Planning and Operations Simulation Environment for Adaptive Remote Sensors

    Science.gov (United States)

    Smith, G.; Ball, C.; O'Brien, A.; Johnson, J. T.

    2017-12-01

    We report on the design and development of mission simulator libraries to support the emerging field of adaptive remote sensors. We will outline the current state of the art in adaptive sensing, provide analysis of how the current approach to performing observing system simulation experiments (OSSEs) must be changed to enable adaptive sensors for remote sensing, and present an architecture to enable their inclusion in future OSSEs. The growing potential of sensors capable of real-time adaptation of their operational parameters calls for a new class of mission planning and simulation tools. Existing simulation tools used in OSSEs assume a fixed set of sensor parameters in terms of observation geometry, frequencies used, resolution, or observation time, which allows simplifications to be made in the simulation and allows sensor observation errors to be characterized a priori. Adaptive sensors may vary these parameters depending on the details of the scene observed, so that sensor performance is not simple to model without conducting OSSE simulations that include sensor adaptation in response to varying observational environment. Adaptive sensors are of significance to resource-constrained, small satellite platforms because they enable the management of power and data volumes while providing methods for multiple sensors to collaborate. The new class of OSSEs required to utilize adaptive sensors located on multiple platforms must answer the question: If the physical act of sensing has a cost, how does the system determine if the science value of a measurement is worth the cost and how should that cost be shared among the collaborating sensors? Here we propose to answer this question using an architecture structured around three modules: ADAPT, MANAGE and COLLABORATE. The ADAPT module is a set of routines to facilitate modeling of adaptive sensors, the MANAGE module will implement a set of routines to facilitate simulations of sensor resource management when power and data

  20. Comparison of pH Data Measured with a pH Sensor Array Using Different Data Fusion Methods

    OpenAIRE

    Yi-Hung Liao; Jung-Chuan Chou

    2012-01-01

    This paper introduces different data fusion methods which are used for an electrochemical measurement using a sensor array. In this study, we used ruthenium dioxide sensing membrane pH electrodes to form a sensor array. The sensor array was used for detecting the pH values of grape wine, generic cola drink and bottled base water. The measured pH data were used for data fusion methods to increase the reliability of the measured results, and we also compared the fusion results with other differ...

  1. Soft Thermal Sensor with Mechanical Adaptability.

    Science.gov (United States)

    Yang, Hui; Qi, Dianpeng; Liu, Zhiyuan; Chandran, Bevita K; Wang, Ting; Yu, Jiancan; Chen, Xiaodong

    2016-11-01

    A soft thermal sensor with mechanical adaptability is fabricated by the combination of single-wall carbon nanotubes with carboxyl groups and self-healing polymers. This study demonstrates that this soft sensor has excellent thermal response and mechanical adaptability. It shows tremendous promise for improving the service life of soft artificial-intelligence robots and protecting thermally sensitive electronics from the risk of damage by high temperature. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Detecting Pedestrian Flocks by Fusion of Multi-Modal Sensors in Mobile Phones

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun; Wirz, Martin; Roggen, Daniel

    2012-01-01

    derived from multiple sensor modalities of modern smartphones. Automatic detection of flocks has several important applications, including evacuation management and socially aware computing. The novelty of this paper is, firstly, to use data fusion techniques to combine several sensor modalities (WiFi...

  3. Sensor fusion for active vibration isolation in precision equipment

    NARCIS (Netherlands)

    Tjepkema, D.; van Dijk, Johannes; Soemers, Herman

    2012-01-01

    Sensor fusion is a promising control strategy to improve the performance of active vibration isolation systems that are used in precision equipment. Normally, those vibration isolation systems are only capable of realizing a low transmissibility. Additional objectives are to increase the damping

  4. A flexible data fusion architecture for persistent surveillance using ultra-low-power wireless sensor networks

    Science.gov (United States)

    Hanson, Jeffrey A.; McLaughlin, Keith L.; Sereno, Thomas J.

    2011-06-01

    We have developed a flexible, target-driven, multi-modal, physics-based fusion architecture that efficiently searches sensor detections for targets and rejects clutter while controlling the combinatoric problems that commonly arise in data-driven fusion systems. The informational constraints imposed by long lifetime requirements make systems vulnerable to false alarms. We demonstrate that our data fusion system significantly reduces false alarms while maintaining high sensitivity to threats. In addition, mission goals can vary substantially in terms of targets-of-interest, required characterization, acceptable latency, and false alarm rates. Our fusion architecture provides the flexibility to match these trade-offs with mission requirements, unlike many conventional systems that require significant modifications for each new mission. We illustrate our data fusion performance with case studies that span many of the potential mission scenarios including border surveillance, base security, and infrastructure protection. In these studies, we deployed multi-modal sensor nodes - including geophones, magnetometers, accelerometers and PIR sensors - with low-power processing algorithms and low-bandwidth wireless mesh networking to create networks capable of multi-year operation. The results show our data fusion architecture maintains high sensitivities while suppressing most false alarms for a variety of environments and targets.

  5. Fusion of intraoperative force sensoring, surface reconstruction and biomechanical modeling

    Science.gov (United States)

    Röhl, S.; Bodenstedt, S.; Küderle, C.; Suwelack, S.; Kenngott, H.; Müller-Stich, B. P.; Dillmann, R.; Speidel, S.

    2012-02-01

    Minimally invasive surgery is medically complex and can heavily benefit from computer assistance. One way to help the surgeon is to integrate preoperative planning data into the surgical workflow. This information can be represented as a customized preoperative model of the surgical site. To use it intraoperatively, it has to be updated during the intervention due to the constantly changing environment. Hence, intraoperative sensor data has to be acquired and registered with the preoperative model. Haptic information, which could complement the visual sensor data, is not yet established. In addition, biomechanical modeling of the surgical site can help in reflecting the changes which cannot be captured by intraoperative sensors. We present a setting where a force sensor is integrated into a laparoscopic instrument. In a test scenario using a silicone liver phantom, we register the measured forces with a reconstructed surface model from stereo endoscopic images and a finite element model. The endoscope, the instrument and the liver phantom are tracked with a Polaris optical tracking system. By fusing this information, we can transfer the deformation onto the finite element model. The purpose of this setting is to demonstrate the principles needed and the methods developed for intraoperative sensor data fusion. One emphasis lies on the calibration of the force sensor with the instrument and first experiments with soft tissue. We also present our solution and first results concerning the integration of the force sensor as well as the accuracy of the fusion of force measurements, surface reconstruction and biomechanical modeling.

  6. A Bayes-Maximum Entropy method for multi-sensor data fusion

    Energy Technology Data Exchange (ETDEWEB)

    Beckerman, M.

    1991-01-01

    In this paper we introduce a Bayes-Maximum Entropy formalism for multi-sensor data fusion, and present an application of this methodology to the fusion of ultrasound and visual sensor data as acquired by a mobile robot. In our approach the principle of maximum entropy is applied to the construction of priors and likelihoods from the data. Distances between ultrasound and visual points of interest in a dual representation are used to define Gibbs likelihood distributions. Both one- and two-dimensional likelihoods are presented, and cast into a form which makes explicit their dependence upon the mean. The Bayesian posterior distributions are used to test a null hypothesis, and Maximum Entropy Maps used for navigation are updated using the resulting information from the dual representation. 14 refs., 9 figs.

  7. Design and Analysis of a Data Fusion Scheme in Mobile Wireless Sensor Networks Based on Multi-Protocol Mobile Agents.

    Science.gov (United States)

    Wu, Chunxue; Wu, Wenliang; Wan, Caihua; Bekkering, Ernst; Xiong, Naixue

    2017-11-03

    Sensors are increasingly used in mobile environments with wireless network connections. Multiple sensor types measure distinct aspects of the same event. Their measurements are then combined to produce integrated, reliable results. As the number of sensors in networks increases, low energy requirements and changing network connections complicate event detection and measurement. We present a data fusion scheme for use in mobile wireless sensor networks with high energy efficiency and low network delays that still produces reliable results. In the first phase, we used a network simulation where mobile agents dynamically select the next hop migration node based on the stability parameter of the link, and perform the data fusion at the migration node. Agents use the fusion results to decide whether to return them to the processing center or continue to collect more data. In the second phase, the feasibility of data fusion at the node level is confirmed by an experimental design where fused data from color sensors show near-identical results to actual physical temperatures. These results are potentially important for new large-scale sensor network applications.

  8. Sensor-Data Fusion for Multi-Person Indoor Location Estimation.

    Science.gov (United States)

    Mohebbi, Parisa; Stroulia, Eleni; Nikolaidis, Ioanis

    2017-10-18

    We consider the problem of estimating the location of people as they move and work in indoor environments. More specifically, we focus on the scenario where one of the persons of interest is unable or unwilling to carry a smartphone, or any other "wearable" device, which frequently arises in caregiver/cared-for situations. We consider the case of indoor spaces populated with anonymous binary sensors (Passive Infrared motion sensors) and eponymous wearable sensors (smartphones interacting with Estimote beacons), and we propose a solution to the resulting sensor-fusion problem. Using a data set with sensor readings collected from one-person and two-person sessions engaged in a variety of activities of daily living, we investigate the relative merits of relying solely on anonymous sensors, solely on eponymous sensors, or on their combination. We examine how the lack of synchronization across different sensing sources impacts the quality of location estimates, and discuss how it could be mitigated without resorting to device-level mechanisms. Finally, we examine the trade-off between the sensors' coverage of the monitored space and the quality of the location estimates.

  9. A New Multi-Sensor Track Fusion Architecture for Multi-Sensor Information Integration

    Science.gov (United States)

    2004-09-01

    Performing organization: Lockheed Martin Aeronautical Systems Company, Marietta, GA. (Standard report documentation form fields omitted.) Surviving abstract fragment: "... tracking process and degrades the track accuracy. ARCHITECTURE OF MULTI-SENSOR TRACK FUSION MODEL: The Alpha ..."

  10. Sensor Fusion and Model Verification for a Mobile Robot

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms...

  11. An evidential sensor fusion method in fault diagnosis

    Directory of Open Access Journals (Sweden)

    Wen Jiang

    2016-03-01

    Full Text Available Dempster–Shafer evidence theory is widely used in information fusion. However, it may lead to unreasonable results when dealing with highly conflicting evidence. To solve this problem, we put forward a new method based on the credibility of the evidence. First, a novel belief entropy, Deng entropy, is applied to measure the information volume of the evidence, and the discounting coefficients of each piece of evidence are then obtained. Finally, after weighted averaging of the evidence in the system, the Dempster combination rule is used to realize information fusion. A weighted averaging combination rule is thus presented for multi-sensor data fusion in fault diagnosis; determining the weights with the new belief function appears more reasonable than earlier approaches. A numerical example illustrates that the proposed rule is more effective for fault diagnosis than classical evidence theory when fusing multi-symptom domains.
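
    A rough sketch of the two ingredients named above, Deng entropy and credibility-weighted averaging of evidence, is given below. The mass functions and the way entropy is mapped to a weight are illustrative assumptions only; in the described method the averaged evidence would then be combined with Dempster's rule.

```python
# Sketch of Deng entropy and credibility-weighted evidence averaging.
# Mass functions and the weighting scheme are illustrative placeholders,
# not the paper's exact discounting procedure.
import math

def deng_entropy(m):
    """Deng entropy of a basic belief assignment m: {frozenset: mass}."""
    return -sum(v * math.log2(v / (2 ** len(A) - 1))
                for A, v in m.items() if v > 0)

def weighted_average_evidence(bodies):
    """Average several mass functions, weighting each body of evidence by a
    (hypothetical) credibility proportional to its information volume."""
    weights = [deng_entropy(m) for m in bodies]
    total = sum(weights)
    weights = [w / total for w in weights]
    hypotheses = set().union(*bodies)
    return {A: sum(w * m.get(A, 0.0) for w, m in zip(weights, bodies))
            for A in hypotheses}

m1 = {frozenset({"F1"}): 0.7, frozenset({"F1", "F2"}): 0.3}
m2 = {frozenset({"F2"}): 0.6, frozenset({"F1", "F2"}): 0.4}
print(weighted_average_evidence([m1, m2]))
```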

  12. Robust site security using smart seismic array technology and multi-sensor data fusion

    Science.gov (United States)

    Hellickson, Dean; Richards, Paul; Reynolds, Zane; Keener, Joshua

    2010-04-01

    Traditional site security systems are susceptible to high individual sensor nuisance alarm rates that reduce the overall system effectiveness. Visual assessment of intrusions can be intensive and manually difficult as cameras are slewed by the system to non-intrusion areas or as operators respond to nuisance alarms. Very little system intrusion performance data are available other than discrete sensor alarm indications that provide no real value. This paper discusses the system architecture, integration and display of a multi-sensor data fused system for wide area surveillance, local site intrusion detection and intrusion classification. The incorporation of a novel seismic array of smart sensors using FK Beamforming processing that greatly enhances the overall detection and classification performance of the system is discussed. Recent test data demonstrates the performance of the seismic array within several different installations and its ability to classify and track moving targets at significant standoff distances with exceptional immunity to background clutter and noise. Multi-sensor data fusion is applied across a suite of complementary sensors eliminating almost all nuisance alarms while integrating within a geographical information system to feed a visual-fusion display of the area being secured. Real-time sensor detection and intrusion classification data is presented within a visual-fusion display providing greatly enhanced situational awareness, system performance information and real-time assessment of intrusions and situations of interest with limited security operator involvement. This approach scales from a small local perimeter to a very large geographical area and can be used across multiple sites controlled at a single command and control station.

  13. Sensor fusion IV: Control paradigms and data structures; Proceedings of the Meeting, Boston, MA, Nov. 12-15, 1991

    Science.gov (United States)

    Schenker, Paul S. (Editor)

    1992-01-01

    Various papers on control paradigms and data structures in sensor fusion are presented. The general topics addressed include: decision models and computational methods, sensor modeling and data representation, active sensing strategies, geometric planning and visualization, task-driven sensing, motion analysis, models motivated by biology and psychology, decentralized detection and distributed decision, data fusion architectures, robust estimation of shapes and features, application and implementation. Some of the individual subjects considered are: the Firefly experiment on neural networks for distributed sensor data fusion, manifold traversing as a model for learning control of autonomous robots, choice of coordinate systems for multiple sensor fusion, continuous motion using task-directed stereo vision, interactive and cooperative sensing and control for advanced teleoperation, knowledge-based imaging for terrain analysis, physical and digital simulations for IVA robotics.

  14. Biomimetic micromechanical adaptive flow-sensor arrays

    Science.gov (United States)

    Krijnen, Gijs; Floris, Arjan; Dijkstra, Marcel; Lammerink, Theo; Wiegerink, Remco

    2007-05-01

    We report current developments in biomimetic flow-sensors based on flow sensitive mechano-sensors of crickets. Crickets have one form of acoustic sensing evolved in the form of mechanoreceptive sensory hairs. These filiform hairs are highly perceptive to low-frequency sound with energy sensitivities close to thermal threshold. In this work we describe hair-sensors fabricated by a combination of sacrificial poly-silicon technology, to form silicon-nitride suspended membranes, and SU8 polymer processing for fabrication of hairs with diameters of about 50 μm and up to 1 mm length. The membranes have thin chromium electrodes on top forming variable capacitors with the substrate that allow for capacitive read-out. Previously these sensors have been shown to exhibit acoustic sensitivity. Like for the crickets, the MEMS hair-sensors are positioned on elongated structures, resembling the cercus of crickets. In this work we present optical measurements on acoustically and electrostatically excited hair-sensors. We present adaptive control of flow-sensitivity and resonance frequency by electrostatic spring stiffness softening. Experimental data and simple analytical models derived from transduction theory are shown to exhibit good correspondence, both confirming theory and the applicability of the presented approach towards adaptation.

  15. A comparison of decision-level sensor-fusion methods for anti-personnel landmine detection.

    NARCIS (Netherlands)

    Schutte, K.; Schavemaker, J.G.M.; Cremer, F.; Breejen, E. den

    2001-01-01

    We present the sensor-fusion results obtained from measurements within the European research project ground explosive ordinance detection (GEODE) system that strives for the realisation of a vehicle-mounted, multi-sensor, anti-personnel landmine-detection system for humanitarian de-mining. The

  16. Summary of sensor evaluation for the Fusion ELectromagnetic Induction eXperiment (FELIX)

    International Nuclear Information System (INIS)

    Knott, M.J.

    1982-08-01

    As part of the First Wall/Blanket/Shield Engineering Test Program, a test bed called FELIX (Fusion ELectromagnetic Induction eXperiment) is now under construction at ANL. Its purpose will be to test, evaluate, and develop computer codes for the prediction of electromagnetically induced phenomena in a magnetic environment modeling that of a fusion reactor. Crucial to this process is the sensing and recording of the various induced effects. Sensor evaluation for FELIX has reached the point where most sensor types have been evaluated and preliminary decisions are being made as to type and quantity for the initial FELIX experiments. These early experiments, the first flat-plate experiment in particular, will be aimed at testing the sensors as well as the pertinent theories involved. The reason for these evaluations, decisions, and proof tests is the harsh electrical and magnetic environment that FELIX presents.

  17. Design and Analysis of a Data Fusion Scheme in Mobile Wireless Sensor Networks Based on Multi-Protocol Mobile Agents

    Directory of Open Access Journals (Sweden)

    Chunxue Wu

    2017-11-01

    Full Text Available Sensors are increasingly used in mobile environments with wireless network connections. Multiple sensor types measure distinct aspects of the same event. Their measurements are then combined to produce integrated, reliable results. As the number of sensors in networks increases, low energy requirements and changing network connections complicate event detection and measurement. We present a data fusion scheme for use in mobile wireless sensor networks with high energy efficiency and low network delays that still produces reliable results. In the first phase, we used a network simulation where mobile agents dynamically select the next hop migration node based on the stability parameter of the link, and perform the data fusion at the migration node. Agents use the fusion results to decide whether to return them to the processing center or continue to collect more data. In the second phase, the feasibility of data fusion at the node level is confirmed by an experimental design where fused data from color sensors show near-identical results to actual physical temperatures. These results are potentially important for new large-scale sensor network applications.

  18. Coresident sensor fusion and compression using the wavelet transform

    Energy Technology Data Exchange (ETDEWEB)

    Yocky, D.A.

    1996-03-11

    Imagery from coresident sensor platforms, such as unmanned aerial vehicles, can be combined using multiresolution decomposition of the sensor images by means of the two-dimensional wavelet transform. The wavelet approach uses the combination of spatial/spectral information at multiple scales to create a fused image. This can be done in either an ad hoc or a model-based approach. We compare results from commercial "fusion" software and the ad hoc wavelet approach. Results show the wavelet approach outperforms the commercial algorithms and also supports efficient compression of the fused image.
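
    As a hedged illustration of the approach described above (not the commercial software or the authors' code), a single-level wavelet fusion of two coresident images can be sketched with PyWavelets: average the approximation band and keep the larger-magnitude detail coefficients.

```python
# Minimal sketch of wavelet-domain image fusion: blend the approximation
# coefficients and keep the detail coefficients with larger magnitude,
# then invert the transform. Single-level only; the paper's multiresolution
# and model-based variants are not reproduced here.
import numpy as np
import pywt  # PyWavelets

def wavelet_fuse(img_a: np.ndarray, img_b: np.ndarray, wavelet: str = "db2"):
    ca_a, (ch_a, cv_a, cd_a) = pywt.dwt2(img_a, wavelet)
    ca_b, (ch_b, cv_b, cd_b) = pywt.dwt2(img_b, wavelet)

    fused_approx = 0.5 * (ca_a + ca_b)                          # blend coarse content
    pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)  # keep salient details
    fused_details = (pick(ch_a, ch_b), pick(cv_a, cv_b), pick(cd_a, cd_b))

    return pywt.idwt2((fused_approx, fused_details), wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.random((128, 128))   # stand-ins for the two coresident sensor images
    b = rng.random((128, 128))
    print(wavelet_fuse(a, b).shape)
```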

  19. On the Use of Sensor Fusion to Reduce the Impact of Rotational and Additive Noise in Human Activity Recognition

    Directory of Open Access Journals (Sweden)

    Ignacio Rojas

    2012-06-01

    Full Text Available The main objective of fusion mechanisms is to increase the individual reliability of the systems through the use of collective knowledge. Moreover, fusion models are also intended to guarantee a certain level of robustness. This is particularly required for problems such as human activity recognition, where runtime changes in the sensor setup seriously disturb the reliability of the initially deployed systems. For commonly used recognition systems based on inertial sensors, these changes are primarily characterized as sensor rotations, displacements or faults related to the batteries or calibration. In this work we show the robustness capabilities of a sensor-weighted fusion model when dealing with such disturbances under different circumstances. Using the proposed method, up to 60% outperformance is obtained when a minority of the sensors are artificially rotated or degraded, independent of the level of disturbance (noise) imposed. These robustness capabilities also apply for any number of sensors affected by a low to moderate noise level. The presented fusion mechanism compensates for the poor performance that otherwise would be obtained when just a single sensor is considered.

  20. Two-level Robust Measurement Fusion Kalman Filter for Clustering Sensor Networks

    Institute of Scientific and Technical Information of China (English)

    ZHANG Peng; QI Wen-Juan; DENG Zi-Li

    2014-01-01

    This paper investigates the distributed fusion Kalman filtering over clustering sensor networks. The sensor network is partitioned as clusters by the nearest neighbor rule and each cluster consists of sensing nodes and cluster-head. Using the minimax robust estimation principle, based on the worst-case conservative system with the conservative upper bounds of noise variances, two-level robust measurement fusion Kalman filter is presented for the clustering sensor network systems with uncertain noise variances. It can significantly reduce the communication load and save energy when the number of sensors is very large. A Lyapunov equation approach for the robustness analysis is presented, by which the robustness of the local and fused Kalman filters is proved. The concept of the robust accuracy is presented, and the robust accuracy relations among the local and fused robust Kalman filters are proved. It is proved that the robust accuracy of the two-level weighted measurement fuser is equal to that of the global centralized robust fuser and is higher than those of each local robust filter and each local weighted measurement fuser. A simulation example shows the correctness and effectiveness of the proposed results.
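
    At its core, weighted measurement fusion combines local measurements with weights derived from their (conservative) noise variances. The scalar inverse-variance sketch below, with made-up numbers, illustrates only that step; the paper embeds it in a full two-level Kalman filtering structure with robustness analysis.

```python
# Minimal sketch of inverse-variance weighted measurement fusion.
# Values are made up; the paper uses conservative upper bounds of the
# (uncertain) noise variances inside a two-level robust Kalman filter.
import numpy as np

def fuse_measurements(z: np.ndarray, var: np.ndarray):
    """Fuse scalar measurements z with per-source variances var."""
    w = 1.0 / var
    fused_var = 1.0 / w.sum()
    fused_z = fused_var * (w * z).sum()
    return fused_z, fused_var

z = np.array([10.2, 9.7, 10.5])       # local (cluster-level) measurements
var = np.array([0.4, 0.9, 0.6])       # conservative variance upper bounds
print(fuse_measurements(z, var))
```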

  1. The development of fusion sensor techniques for condition monitoring of a check valve

    International Nuclear Information System (INIS)

    Seong, S.H.; Kim, J.S.; Hur, S.; Kim, J.T.; Park, W.M.; Cha, D.B.

    2004-01-01

    Check valve failures are one of the most important problems in nuclear power plants, because reverse flow through a failed check valve impacts the healthy hydraulic loop. The present test method for finding mechanical failures of a check valve is very risky in the irradiated environment during normal operation. In addition, detecting failures during the overhaul period is costly and tedious because many check valves are used in the plants and manual disassembly is required. We have suggested a fusion sensor technique for detecting check valve failures by measuring and analyzing the backward leakage flow and mechanical vibration without disassembling the check valve. Fusion sensing means that two or more sensors are used to identify and analyze the changes in frequency response between a failed and a healthy check valve. We use the accelerometer and the acoustic emission sensor as alternative sensors for the fusion methodology. We have found that the acoustic emission sensor is capable of directly detecting the high-frequency acoustic wave generated by the backward leakage flow itself at low pressure and temperature, while previous studies indicate that the accelerometer, which detects the mechanical vibration induced by leakage flows, is also useful at high pressure and temperature. The benefit of this system is that it enables predictive maintenance, captures information about the problem valve, and reduces the radiation exposure of maintenance personnel during power operation as well as during the maintenance period. (orig.)

  2. Rasiowa completion versus Keisler saturation: Towards a pragmatics of infinite fusion

    Energy Technology Data Exchange (ETDEWEB)

    Tomasik, J.A. [Univ. of Clermont (France)]

    1996-12-31

    The goal of this survey note is to make a step towards a semiotical approach to the control of Sensor Data Fusion systems. We try to adapt infinitistic methods of Rasiowa and Keisler as prototypes of a many-one pragmatics for the control of SDF in order to provide a faithful system - semantical or/and syntactical - adequate for simultaneous interpretation of signals received through several sensors. Intuitively, it is clear that the amount of the information captured during the fusion process depends strongly on the fusion itself. Nevertheless we can find (cf. works of Kokar's group) the following implicit heuristic in semiotical investigations on Sensory Data Fusion Z's accepted.

  3. Reliability of Measured Data for pH Sensor Arrays with Fault Diagnosis and Data Fusion Based on LabVIEW

    OpenAIRE

    Liao, Yi-Hung; Chou, Jung-Chuan; Lin, Chin-Yi

    2013-01-01

    Fault diagnosis (FD) and data fusion (DF) technologies implemented in the LabVIEW program were used for a ruthenium dioxide pH sensor array. The purpose of the fault diagnosis and data fusion technologies is to increase the reliability of measured data. Data fusion is a very useful statistical method used for sensor arrays in many fields. Fault diagnosis is used to avoid sensor faults and to measure errors in the electrochemical measurement system, therefore, in this study, we use fault diagn...

  4. IMU-Based Gait Recognition Using Convolutional Neural Networks and Multi-Sensor Fusion

    Directory of Open Access Journals (Sweden)

    Omid Dehzangi

    2017-11-01

    Full Text Available The widespread usage of wearable sensors, such as in smart watches, has provided continuous access to valuable user generated data such as human motion that could be used to identify an individual based on his/her motion patterns such as gait. Several methods have been suggested to extract various heuristic and high-level features from gait motion data to identify discriminative gait signatures and distinguish the target individual from others. However, manual and hand-crafted feature extraction is error prone and subjective. Furthermore, the motion data collected from inertial sensors have complex structure and the detachment between the manual feature extraction module and the predictive learning models might limit the generalization capabilities. In this paper, we propose a novel approach for human gait identification using time-frequency (TF) expansion of human gait cycles in order to capture joint two-dimensional (2D) spectral and temporal patterns of gait cycles. Then, we design a deep convolutional neural network (DCNN) learning to extract discriminative features from the 2D expanded gait cycles and jointly optimize the identification model and the spectro-temporal features in a discriminative fashion. We collect raw motion data from five inertial sensors placed at the chest, lower-back, right hand wrist, right knee, and right ankle of each human subject synchronously in order to investigate the impact of sensor location on the gait identification performance. We then present two methods for early (input-level) and late (decision-score-level) multi-sensor fusion to improve the gait identification generalization performance. We specifically propose the minimum error score fusion (MESF) method that discriminatively learns the linear fusion weights of individual DCNN scores at the decision level by minimizing the error rate on the training data in an iterative manner. 10 subjects participated in this study and hence, the problem is a 10-class
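
    A greatly simplified stand-in for the decision-score-level fusion described above is sketched below: per-sensor class scores are combined with linear weights chosen to minimize the training error, here by a crude grid search on synthetic data rather than the paper's iterative MESF procedure.

```python
# Simplified sketch of decision-score-level fusion across sensors. Per-sensor
# class scores are combined with linear weights chosen to minimize training
# error by a grid search; the scores below are random placeholders loosely
# correlated with the labels, not outputs of trained DCNNs.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_classes, n_sensors = 200, 10, 3
labels = rng.integers(0, n_classes, n_samples)

# Hypothetical per-sensor class scores, shape (n_sensors, n_samples, n_classes).
scores = rng.random((n_sensors, n_samples, n_classes))
for s in range(n_sensors):
    scores[s, np.arange(n_samples), labels] += 0.5 + 0.3 * s

def error_rate(weights):
    fused = np.tensordot(weights, scores, axes=1)     # (n_samples, n_classes)
    return np.mean(fused.argmax(axis=1) != labels)

grid = np.linspace(0.0, 1.0, 11)
best = min((w for w in itertools.product(grid, repeat=n_sensors) if sum(w) > 0),
           key=error_rate)
print("fusion weights:", best, "training error:", error_rate(best))
```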

  5. Sensor data fusion for textured reconstruction and virtual representation of alpine scenes

    Science.gov (United States)

    Häufel, Gisela; Bulatov, Dimitri; Solbrig, Peter

    2017-10-01

    The concept of remote sensing is to provide information about a wide-range area without making physical contact with this area. If, in addition to satellite imagery, images and videos taken by drones provide more up-to-date data at a higher resolution, or accurate vector data is downloadable from the Internet, one speaks of sensor data fusion. The concept of sensor data fusion is relevant for many applications, such as virtual tourism, automatic navigation, hazard assessment, etc. In this work, we describe sensor data fusion aiming to create a semantic 3D model of an extremely interesting yet challenging dataset: an alpine region in Southern Germany. A particular challenge of this work is that rock faces including overhangs are present in the input airborne laser point cloud. The proposed procedure for identification and reconstruction of overhangs from point clouds comprises four steps: point cloud preparation, filtering out vegetation, mesh generation and texturing. Further object types are extracted in several interesting subsections of the dataset: building models with textures from UAV (Unmanned Aerial Vehicle) videos, hills reconstructed as generic surfaces and textured by the orthophoto, individual trees detected by the watershed algorithm, as well as the vector data for roads retrieved from openly available shapefiles and GPS-device tracks. We pursue geo-specific reconstruction by assigning texture and width to roads of several pre-determined types and modeling isolated trees and rocks using commercial software. For visualization and simulation of the area, we have chosen the simulation system Virtual Battlespace 3 (VBS3). It becomes clear that the proposed concept of sensor data fusion allows a coarse reconstruction of a large scene and, at the same time, an accurate and up-to-date representation of its relevant subsections, in which simulation can take place.

  6. Fluorescent sensors based on bacterial fusion proteins

    International Nuclear Information System (INIS)

    Mateu, Batirtze Prats; Pum, Dietmar; Sleytr, Uwe B; Toca-Herrera, José L; Kainz, Birgit

    2014-01-01

    Fluorescent proteins are widely used as markers for biomedical and technological purposes. Therefore, the aim of this project was to create a fluorescent sensor, based on the green and cyan fluorescent proteins, using bacterial S-layer proteins as a scaffold for the fluorescent tag. We report the cloning, expression and purification of three S-layer fluorescent proteins: SgsE-EGFP, SgsE-ECFP and SgsE-13aa-ECFP, the last containing a 13-amino acid rigid linker. The pH dependence of the fluorescence intensity of the S-layer fusion proteins, monitored by fluorescence spectroscopy, showed that the ECFP tag was more stable than EGFP. Furthermore, the fluorescent fusion proteins were reassembled on silica particles modified with cationic and anionic polyelectrolytes. Zeta potential measurements confirmed the particle coatings and indicated their colloidal stability. Flow cytometry and fluorescence microscopy showed that the fluorescence of the fusion proteins was pH dependent and sensitive to the underlying polyelectrolyte coating. This might suggest that the fluorescent tag is not completely exposed to the bulk media as an independent moiety. Finally, it was found that viscosity enhanced the fluorescence intensity of the three fluorescent S-layer proteins. (paper)

  7. Conflict management based on belief function entropy in sensor fusion.

    Science.gov (United States)

    Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong

    2016-01-01

    Wireless sensor network plays an important role in intelligent navigation. It incorporates a group of sensors to overcome the limitation of single detection system. Dempster-Shafer evidence theory can combine the sensor data of the wireless sensor network by data fusion, which contributes to the improvement of accuracy and reliability of the detection system. However, due to different sources of sensors, there may be conflict among the sensor data under uncertain environment. Thus, this paper proposes a new method combining Deng entropy and evidence distance to address the issue. First, Deng entropy is adopted to measure the uncertain information. Then, evidence distance is applied to measure the conflict degree. The new method can cope with conflict effectually and improve the accuracy and reliability of the detection system. An example is illustrated to show the efficiency of the new method and the result is compared with that of the existing methods.

  8. Multi-sensor fusion using an adaptive multi-hypothesis tracking algorithm

    NARCIS (Netherlands)

    Kester, L.J.H.M.

    2003-01-01

    The purpose of a tracking algorithm is to associate data measured by one or more (moving) sensors to moving objects in the environment. The state of these objects that can be estimated with the tracking process depends on the type of data that is provided by these sensors. It is discussed how the

  9. Multi-Sensor Information Fusion for Optimizing Electric Bicycle Routes Using a Swarm Intelligence Algorithm

    Directory of Open Access Journals (Sweden)

    Daniel H. De La Iglesia

    2017-10-01

    Full Text Available The use of electric bikes (e-bikes) has grown in popularity, especially in large cities where overcrowding and traffic congestion are common. This paper proposes an intelligent engine management system for e-bikes which uses the information collected from sensors to optimize battery energy and time. The intelligent engine management system consists of a built-in network of sensors in the e-bike, which is used for multi-sensor data fusion; the collected data is analysed and fused and on the basis of this information the system can provide the user with optimal and personalized assistance. The user is given recommendations related to battery consumption, sensors, and other parameters associated with the route travelled, such as duration, speed, or variation in altitude. To provide a user with these recommendations, artificial neural networks are used to estimate speed and consumption for each of the segments of a route. These estimates are incorporated into evolutionary algorithms in order to make the optimizations. A comparative analysis of the results obtained has been conducted for when routes were travelled with and without the optimization system. From the experiments, it is evident that the use of an engine management system results in significant energy and time savings. Moreover, user satisfaction increases as the level of assistance adapts to user behavior and the characteristics of the route.

  10. Multi-Sensor Information Fusion for Optimizing Electric Bicycle Routes Using a Swarm Intelligence Algorithm

    Science.gov (United States)

    Villarubia, Gabriel; De Paz, Juan F.; Bajo, Javier

    2017-01-01

    The use of electric bikes (e-bikes) has grown in popularity, especially in large cities where overcrowding and traffic congestion are common. This paper proposes an intelligent engine management system for e-bikes which uses the information collected from sensors to optimize battery energy and time. The intelligent engine management system consists of a built-in network of sensors in the e-bike, which is used for multi-sensor data fusion; the collected data is analysed and fused and on the basis of this information the system can provide the user with optimal and personalized assistance. The user is given recommendations related to battery consumption, sensors, and other parameters associated with the route travelled, such as duration, speed, or variation in altitude. To provide a user with these recommendations, artificial neural networks are used to estimate speed and consumption for each of the segments of a route. These estimates are incorporated into evolutionary algorithms in order to make the optimizations. A comparative analysis of the results obtained has been conducted for when routes were travelled with and without the optimization system. From the experiments, it is evident that the use of an engine management system results in significant energy and time savings. Moreover, user satisfaction increases as the level of assistance adapts to user behavior and the characteristics of the route. PMID:29088087

  11. Multi-Sensor Information Fusion for Optimizing Electric Bicycle Routes Using a Swarm Intelligence Algorithm.

    Science.gov (United States)

    De La Iglesia, Daniel H; Villarrubia, Gabriel; De Paz, Juan F; Bajo, Javier

    2017-10-31

    The use of electric bikes (e-bikes) has grown in popularity, especially in large cities where overcrowding and traffic congestion are common. This paper proposes an intelligent engine management system for e-bikes which uses the information collected from sensors to optimize battery energy and time. The intelligent engine management system consists of a built-in network of sensors in the e-bike, which is used for multi-sensor data fusion; the collected data is analysed and fused and on the basis of this information the system can provide the user with optimal and personalized assistance. The user is given recommendations related to battery consumption, sensors, and other parameters associated with the route travelled, such as duration, speed, or variation in altitude. To provide a user with these recommendations, artificial neural networks are used to estimate speed and consumption for each of the segments of a route. These estimates are incorporated into evolutionary algorithms in order to make the optimizations. A comparative analysis of the results obtained has been conducted for when routes were travelled with and without the optimization system. From the experiments, it is evident that the use of an engine management system results in significant energy and time savings. Moreover, user satisfaction increases as the level of assistance adapts to user behavior and the characteristics of the route.
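
    As an illustration of the per-segment estimation step (not the authors' implementation), a small neural network can map segment features to expected speed and energy use; the feature set and the synthetic training data below are assumptions made only for this sketch.

```python
# Sketch of the per-segment estimation step: a small neural network maps
# segment features (length, gradient, assistance level) to expected speed
# and battery consumption. Feature names and the synthetic data are
# hypothetical; the paper trains on data collected from the e-bike sensors.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Features per segment: [length_m, gradient_pct, assistance_level]
X = rng.uniform([50, -8, 1], [500, 8, 5], size=(300, 3))
# Synthetic targets: [speed_kmh, consumption_wh]
speed = 22 - 0.8 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 1, 300)
energy = 0.02 * X[:, 0] * (1 + 0.1 * np.clip(X[:, 1], 0, None)) + rng.normal(0, 1, 300)
y = np.column_stack([speed, energy])

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X, y)

segment = np.array([[250.0, 3.5, 2]])          # a candidate route segment
print(model.predict(segment))                  # [estimated speed, estimated Wh]
```

    The per-segment predictions would then feed the route-level evolutionary optimization described in the abstract.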

  12. Sensor-Data Fusion for Multi-Person Indoor Location Estimation

    Directory of Open Access Journals (Sweden)

    Parisa Mohebbi

    2017-10-01

    Full Text Available We consider the problem of estimating the location of people as they move and work in indoor environments. More specifically, we focus on the scenario where one of the persons of interest is unable or unwilling to carry a smartphone, or any other “wearable” device, which frequently arises in caregiver/cared-for situations. We consider the case of indoor spaces populated with anonymous binary sensors (Passive Infrared motion sensors) and eponymous wearable sensors (smartphones interacting with Estimote beacons), and we propose a solution to the resulting sensor-fusion problem. Using a data set with sensor readings collected from one-person and two-person sessions engaged in a variety of activities of daily living, we investigate the relative merits of relying solely on anonymous sensors, solely on eponymous sensors, or on their combination. We examine how the lack of synchronization across different sensing sources impacts the quality of location estimates, and discuss how it could be mitigated without resorting to device-level mechanisms. Finally, we examine the trade-off between the sensors’ coverage of the monitored space and the quality of the location estimates.

  13. A Cluster-Based Fuzzy Fusion Algorithm for Event Detection in Heterogeneous Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    ZiQi Hao

    2015-01-01

    Full Text Available As limited energy is one of the tough challenges in wireless sensor networks (WSN), energy saving becomes important in increasing the life cycle of the network. Data fusion enables information from several sources to be combined into a unified picture, which can significantly save sensor energy and enhance sensing accuracy. In this paper, we propose a cluster-based data fusion algorithm for event detection. We use the k-means algorithm to form the nodes into clusters, which significantly reduces the energy consumed by intracluster communication. The distances between cluster heads and the event, together with the cluster energies, are fuzzified, and fuzzy logic is used to select the clusters that will participate in data uploading and fusion. Fuzzy logic is also used by cluster heads for local decisions, and the local decision results are then sent to the base station. Decision-level fusion for the final event decision is performed by the base station according to the uploaded local decisions and the fusion support degree of the clusters calculated by the fuzzy logic method. The effectiveness of this algorithm is demonstrated by simulation results.
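
    The clustering step described above can be sketched as follows; the node positions, energies and event location are random placeholders, and only the k-means grouping plus the quantities later fed to the fuzzy rules are shown.

```python
# Sketch of the clustering step: node positions are grouped with k-means,
# and per-cluster quantities (distance to the event, residual energy) are
# then available to the fuzzy selection logic. All values are placeholders.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
node_xy = rng.uniform(0, 100, size=(60, 2))     # sensor node positions (m)
node_energy = rng.uniform(0.2, 1.0, size=60)    # normalized residual energy

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(node_xy)
event_xy = np.array([70.0, 30.0])               # hypothetical event location

for c in range(kmeans.n_clusters):
    members = kmeans.labels_ == c
    head = kmeans.cluster_centers_[c]           # stand-in for the cluster head
    dist = np.linalg.norm(head - event_xy)      # input to the fuzzy rules
    energy = node_energy[members].mean()        # input to the fuzzy rules
    print(f"cluster {c}: distance={dist:.1f} m, mean energy={energy:.2f}")
```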

  14. Sensor management in RADAR/IRST track fusion

    Science.gov (United States)

    Hu, Shi-qiang; Jing, Zhong-liang

    2004-07-01

    In this paper, a novel radar management technique suitable for RADAR/IRST track fusion, based on the Fisher Information Matrix (FIM) and a fuzzy stochastic decision approach, is put forward. First, an optimal schedule of radar measurements is obtained by maximizing the determinant of the Fisher information matrix of the radar and IRST measurements, managed by an expert system. Then, a "pseudo sensor" is suggested to predict the possible target position with a polynomial method based on the radar and IRST measurements, so that the "pseudo sensor" model can estimate the target position even when the radar is turned off. Finally, based on the tracking performance and the state of the target maneuver, fuzzy stochastic decision making is used to adjust the optimal radar scheduling and to retrieve the model parameters of the "pseudo sensor". The experimental results indicate that the algorithm can not only limit radar activity effectively but also maintain the tracking accuracy of the active/passive system, and it eliminates the drawback of traditional radar management methods in which radar activity is fixed and not easy to control and protect.
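
    The scheduling criterion above, maximizing the determinant of the accumulated Fisher information, can be illustrated with a toy search over candidate radar dwells; the 2x2 information matrices below are invented examples, not the paper's radar/IRST models.

```python
# Sketch of the scheduling criterion: among candidate radar dwell patterns,
# pick the one whose accumulated Fisher information matrix (radar + IRST)
# has the largest determinant, subject to a dwell budget. Matrices are
# invented 2x2 examples.
import itertools
import numpy as np

fim_irst = np.array([[0.2, 0.0], [0.0, 4.0]])       # passive sensor, always on
candidate_radar_fims = [np.array([[5.0, 0.3], [0.3, 0.5]]),
                        np.array([[3.0, -0.2], [-0.2, 1.0]]),
                        np.array([[4.0, 0.0], [0.0, 0.8]])]

def best_schedule(max_dwells: int):
    """Choose at most max_dwells radar measurements maximizing det(FIM)."""
    best, best_det = None, -np.inf
    for k in range(max_dwells + 1):
        for subset in itertools.combinations(range(len(candidate_radar_fims)), k):
            fim = fim_irst + sum((candidate_radar_fims[i] for i in subset),
                                 start=np.zeros((2, 2)))
            det = np.linalg.det(fim)
            if det > best_det:
                best, best_det = subset, det
    return best, best_det

print(best_schedule(max_dwells=2))
```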

  15. A Remote Sensing Image Fusion Method based on adaptive dictionary learning

    Science.gov (United States)

    He, Tongdi; Che, Zongxi

    2018-01-01

    This paper discusses using a remote sensing fusion method, based on 'adaptive sparse representation (ASP)', to provide improved spectral information, reduce data redundancy and decrease system complexity. First, the training sample set is formed by taking random blocks from the images to be fused, the dictionary is then constructed using the training samples, and the remaining terms are clustered to obtain the complete dictionary by iterated processing at each step. Second, the self-adaptive weighted coefficient rule of regional energy is used to select the feature fusion coefficients and complete the reconstruction of the image blocks. Finally, the reconstructed image blocks are rearranged and an average is taken to obtain the final fused images. Experimental results show that the proposed method is superior to other traditional remote sensing image fusion methods in both spectral information preservation and spatial resolution.

  16. Autonomous sensor manager agents (ASMA)

    Science.gov (United States)

    Osadciw, Lisa A.

    2004-04-01

    Autonomous sensor manager agents are presented as an algorithm to perform sensor management within a multisensor fusion network. The design of the hybrid ant system/particle swarm agents is described in detail with some insight into their performance. Although the algorithm is designed for the general sensor management problem, a simulation example involving two radar systems is presented. Algorithmic parameters are determined by the size of the region covered by the sensor network, the number of sensors, and the number of parameters to be selected. With straightforward modifications, this algorithm can be adapted for most sensor management problems.

  17. Sensor Fusion - Sonar and Stereo Vision, Using Occupancy Grids and SIFT

    DEFF Research Database (Denmark)

    Plascencia, Alfredo; Bendtsen, Jan Dimon

    2006-01-01

    to the occupied and empty regions. SIFT (Scale Invariant Feature Transform) feature descriptors are interpreted using Gaussian probabilistic error models. The use of occupancy grids is proposed for representing the sonar readings as well as the feature descriptors. The Bayesian estimation approach is applied...... to update the sonar and the SIFT descriptors' uncertainty grids. The sensor fusion yields a significant reduction in the uncertainty of the occupancy grid compared to the individual sensor readings....
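
    A minimal log-odds occupancy grid update of the kind described above might look as follows; the grid size, cell indices and the inverse-sensor-model probabilities are illustrative assumptions rather than the authors' models.

```python
import numpy as np

# Minimal log-odds occupancy grid fused from two "sensors" (sonar and a SIFT-based
# stereo detector); probabilities below are illustrative assumptions.
grid = np.zeros((20, 20))                     # log-odds, 0 = unknown (p = 0.5)

def update(grid, cells, p_occ):
    """Bayesian update of the listed cells with an occupancy probability p_occ."""
    grid[tuple(np.array(cells).T)] += np.log(p_occ / (1.0 - p_occ))
    return grid

# Sonar reports an obstacle around (5, 7) with moderate confidence ...
grid = update(grid, [(5, 7), (5, 8)], p_occ=0.7)
# ... and the stereo/SIFT detector confirms (5, 7) with higher confidence.
grid = update(grid, [(5, 7)], p_occ=0.9)

prob = 1.0 - 1.0 / (1.0 + np.exp(grid))       # convert log-odds back to probabilities
print("P(occupied) at (5,7):", round(prob[5, 7], 3))
```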

  18. Multi-sensor fusion with interacting multiple model filter for improved aircraft position accuracy.

    Science.gov (United States)

    Cho, Taehwan; Lee, Changho; Choi, Sangbang

    2013-03-27

    The International Civil Aviation Organization (ICAO) has decided to adopt Communications, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) as the 21st century standard for navigation. Accordingly, ICAO members have provided an impetus to develop related technology and build sufficient infrastructure. For aviation surveillance with CNS/ATM, Ground-Based Augmentation System (GBAS), Automatic Dependent Surveillance-Broadcast (ADS-B), multilateration (MLAT) and wide-area multilateration (WAM) systems are being established. These sensors can track aircraft positions more accurately than existing radar and can compensate for the blind spots in aircraft surveillance. In this paper, we applied a novel sensor fusion method with Interacting Multiple Model (IMM) filter to GBAS, ADS-B, MLAT, and WAM data in order to improve the reliability of the aircraft position. Results of performance analysis show that the position accuracy is improved by the proposed sensor fusion method with the IMM filter.

  19. Sensor Fusion of Position- and Micro-Sensors (MEMS) integrated in a Wireless Sensor Network for movement detection in landslide areas

    Science.gov (United States)

    Arnhardt, Christian; Fernández-Steeger, Tomas; Azzam, Rafig

    2010-05-01

    Monitoring systems in landslide areas are important elements of effective Early Warning structures. Data acquisition and retrieval allows the detection of movement processes and thus is essential to generate warnings in time. Apart from precise measurement, the reliability of data is fundamental, because outliers can trigger false alarms and lead to a loss of acceptance of such systems. For the monitoring of mass movements and their risk it is important to know if there is movement, how fast it is and how trustworthy the information is. The joint project "Sensorbased landslide early warning system" (SLEWS) deals with these questions and tries to improve data quality and reduce false alarm rates through the combination of sensor data (sensor fusion). The project concentrates on the development of a prototypic Alarm- and Early Warning system (EWS) for different types of landslides by using various low-cost sensors integrated in a wireless sensor network (WSN). The network consists of numerous connection points (nodes) that transfer data directly or over other nodes (multi-hop) in real time to a data collection point (gateway). From there all the data packages are transmitted to a spatial data infrastructure (SDI) for further processing, analysis and visualization with respect to end-user specifications. The ad-hoc characteristic of the network allows the autonomous crosslinking of the nodes according to existing connections and communication strength. Due to the independent finding of new or more stable connections (self-healing), a breakdown of the whole system is avoided. The bidirectional data stream enables receiving data from the network but also allows the transfer of commands and pointed requests into the WSN. For the detection of surface deformations in landslide areas, small low-cost Micro-Electro-Mechanical-Systems (MEMS) and position sensors from the automotive industry, different industrial applications and from other measurement

  20. Advanced data visualization and sensor fusion: Conversion of techniques from medical imaging to Earth science

    Science.gov (United States)

    Savage, Richard C.; Chen, Chin-Tu; Pelizzari, Charles; Ramanathan, Veerabhadran

    1993-01-01

    Hughes Aircraft Company and the University of Chicago propose to transfer existing medical imaging registration algorithms to the area of multi-sensor data fusion. The University of Chicago's algorithms have been successfully demonstrated to provide pixel by pixel comparison capability for medical sensors with different characteristics. The research will attempt to fuse GOES (Geostationary Operational Environmental Satellite), AVHRR (Advanced Very High Resolution Radiometer), and SSM/I (Special Sensor Microwave Imager) sensor data which will benefit a wide range of researchers. The algorithms will utilize data visualization and algorithm development tools created by Hughes in its EOSDIS (Earth Observation System Data/Information System) prototyping. This will maximize the work on the fusion algorithms since support software (e.g. input/output routines) will already exist. The research will produce a portable software library with documentation for use by other researchers.

  1. Testbeam Studies on Pick-Up in Sensors with Embedded Pitch Adapters

    CERN Document Server

    Rehnisch, Laura; The ATLAS collaboration

    2017-01-01

    For silicon strip sensors, the tracking information specifications can lead to challenging requirements for wire bonding. A common strategy is to use external pitch adapters to facilitate this step in the production of detector modules. A novel approach, previously discussed in [1], is to implement the pitch adapters in the sensor itself by embedding a second layer of metal tracks. The use of these embedded pitch adapters (EPAs) decouples the bond pad layout of the sensor from its implant layout by moving the adaptation to the sensor production step. This solution, however, carries the risk of performance losses due to increased inter-strip capacitance, unwanted capacitive coupling between the metal layers (cross-talk), or coupling between the silicon bulk and the second metal layer (pick-up). In the prototyping stage of the ATLAS tracker end-cap upgrade, where different bond-pad layouts on sensor and readout chip lead to extremely challenging wire-bonding conditions, sensors with different geometries of EPA implementations ...

  2. PERSON AUTHENTICATION USING MULTIPLE SENSOR DATA FUSION

    Directory of Open Access Journals (Sweden)

    S. Vasuhi

    2011-04-01

    Full Text Available This paper proposes a real-time system for face authentication, obtained through fusion of Infra-Red (IR) and visible images. In order to authenticate unknown persons in highly secured areas, multiple algorithms are needed. Four well-known algorithms for face recognition, Block Independent Component Analysis (BICA), the Kalman Filtering (KF) method, the Discrete Cosine Transform (DCT) and Orthogonal Locality Preserving Projections (OLPP), are used to extract the features. If the database is very large and the features are not distinct, ambiguity will exist in face recognition. Hence more than one sensor is needed for critical and/or highly secured areas. This paper deals with a multiple-fusion methodology using weighted averaging and fuzzy logic. Because the visible sensor output depends on environmental conditions, namely lighting and illumination, a histogram technique is used to choose the appropriate algorithm. DCT and Kalman filtering are holistic approaches, BICA follows a feature-based approach and OLPP preserves the Euclidean structure of the face space. These recognizers address the problem of dimensionality reduction by eliminating redundant features and reducing the feature space. The system can handle variations such as illumination, pose, orientation and occlusion up to a significant level. The integrated system overcomes the drawbacks of the individual recognizers. The proposed system aims at increasing the accuracy of the person authentication system while reducing the limitations of the individual algorithms. It is tested on a real-time database and the results are found to be 96% accurate.
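
    The weighted-average score fusion step can be illustrated with the following sketch; the match scores, gallery size and per-algorithm weights are hypothetical values, and the weight choice merely stands in for the histogram-based algorithm selection described above.

```python
import numpy as np

# Illustrative match scores (0..1) from the four recognizers for one probe image,
# over a gallery of 5 enrolled identities; values are made up for the sketch.
scores = {
    "BICA": np.array([0.62, 0.10, 0.15, 0.08, 0.05]),
    "KF":   np.array([0.55, 0.20, 0.12, 0.07, 0.06]),
    "DCT":  np.array([0.48, 0.25, 0.14, 0.08, 0.05]),
    "OLPP": np.array([0.70, 0.09, 0.10, 0.06, 0.05]),
}

# Assumed per-algorithm weights, e.g. chosen from a histogram-based assessment of
# illumination quality (a bright, well-lit probe favouring OLPP/BICA here).
weights = {"BICA": 0.3, "KF": 0.2, "DCT": 0.2, "OLPP": 0.3}

fused = sum(weights[name] * s for name, s in scores.items())
identity = int(np.argmax(fused))
print("fused scores:", np.round(fused, 3), "-> accepted identity:", identity)
```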

  3. Adaptive Sensing Based on Profiles for Sensor Systems

    Directory of Open Access Journals (Sweden)

    Yoshiteru Ishida

    2009-10-01

    Full Text Available This paper proposes a profile-based sensing framework for adaptive sensor systems based on models that relate possibly heterogeneous sensor data and profiles generated by the models to detect events. With these concepts, three phases for building the sensor systems are extracted from two examples: a combustion control sensor system for an automobile engine, and a sensor system for home security. The three phases are: modeling, profiling, and managing trade-offs. Designing and building a sensor system involves mapping the signals to a model to achieve a given mission.

  4. A comparison of STARFM and an unmixing-based algorithm for Landsat and MODIS data fusion

    NARCIS (Netherlands)

    Gevaert, C.M.; Garcia-Haro, F.J.

    2015-01-01

    The focus of the current study is to compare data fusion methods applied to sensors with medium- and high-spatial resolutions. Two documented methods are applied, the spatial and temporal adaptive reflectance fusion model (STARFM) and an unmixing-based method which proposes a Bayesian formulation to

  5. Signal processing, sensor fusion, and target recognition; Proceedings of the Meeting, Orlando, FL, Apr. 20-22, 1992

    Science.gov (United States)

    Libby, Vibeke; Kadar, Ivan

    Consideration is given to a multiordered mapping technique for target prioritization, a neural network approach to multiple-target-tracking problems, a multisensor fusion algorithm for multitarget multibackground classification, deconvolution of multiple images of the same object, neural networks and genetic algorithms for combinatorial optimization of sensor data fusion, classification of atmospheric acoustic signals from fixed-wing aircraft, and an optics approach to sensor fusion for target recognition. Also treated are a zoom lens for automatic target recognition, a hybrid model for the analysis of radar sensors, an innovative test bed for developing and assessing air-to-air noncooperative target identification algorithms, SAR imagery scene segmentation using fractal processing, sonar feature-based bandwidth compression, laboratory experiments for a new sonar system, computational algorithms for discrete transform using fixed-size filter matrices, and pattern recognition for power systems.

  6. Fault-tolerant Sensor Fusion for Marine Navigation

    DEFF Research Database (Denmark)

    Blanke, Mogens

    2006-01-01

    Reliability of navigation data is critical for steering and manoeuvring control, in particular at high speed or in critical phases of a mission. Should faults occur, faulty instruments need to be autonomously isolated and faulty information discarded. This paper designs a navigation solution...... where essential navigation information is provided even with multiple faults in instrumentation. The paper proposes a provably correct implementation through auto-generated state-event logics in a supervisory part of the algorithms. Test results from naval vessels document the performance and show...... events where the fault-tolerant sensor fusion provided uninterrupted navigation data despite temporal instrument defects...

  7. Asynchronous Sensor fuSion for Improved Safety of air Traffic (ASSIST), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — SSCI proposes to develop, implement and test a collision detection system for unmanned aerial vehicles (UAV), referred to as the Asynchronous Sensor fuSion for...

  8. Mixed H2/H∞-Based Fusion Estimation for Energy-Limited Multi-Sensors in Wearable Body Networks

    Directory of Open Access Journals (Sweden)

    Chao Li

    2017-12-01

    Full Text Available In wireless sensor networks, sensor nodes collect plenty of data in each time period. If all of the data are transmitted to a Fusion Center (FC), the power of a sensor node runs out rapidly. On the other hand, the data also need filtering to remove noise. Therefore, an efficient fusion estimation model that can save the energy of the sensor nodes while maintaining high accuracy is needed. This paper proposes a novel mixed H2/H∞-based energy-efficient fusion estimation model (MHEEFE) for energy-limited Wearable Body Networks. In the proposed model, the communication cost is first reduced efficiently while keeping the estimation accuracy. Then, the parameters of the quantization method are discussed and confirmed by an optimization method with some prior knowledge. In addition, calculation methods for important parameters are investigated which make the final estimates more stable. Finally, an iteration-based weight calculation algorithm is presented, which improves the fault tolerance of the final estimate. In the simulation, the impacts of some pivotal parameters are discussed. Compared with other related models, the MHEEFE shows better performance in accuracy, energy-efficiency and fault tolerance.

  9. Soft sensor design by multivariate fusion of image features and process measurements

    DEFF Research Database (Denmark)

    Lin, Bao; Jørgensen, Sten Bay

    2011-01-01

    This paper presents a multivariate data fusion procedure for design of dynamic soft sensors where suitably selected image features are combined with traditional process measurements to enhance the performance of data-driven soft sensors. A key issue of fusing multiple sensor data, i.e. to determine...... with a multivariate analysis technique from RGB pictures. The color information is also transformed to hue, saturation and intensity components. Both sets of image features are combined with traditional process measurements to obtain an inferential model by partial least squares (PLS) regression. A dynamic PLS model...... oxides (NOx) emission of cement kilns. On-site tests demonstrate improved performance over soft sensors based on conventional process measurements only....
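
    A compact sketch of such a PLS soft sensor, with image features simply appended to the process measurements, is given below; the feature counts and the synthetic target are assumptions, not the cement-kiln data used in the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Stand-in data: 200 samples of 5 process measurements and 8 flame-image features
# (e.g. colour / HSI statistics); the target is a synthetic function of both blocks.
process = rng.normal(size=(200, 5))
image_feat = rng.normal(size=(200, 8))
X = np.hstack([process, image_feat])                 # multivariate fusion by augmentation
y = process @ rng.normal(size=5) + image_feat @ rng.normal(size=8) + rng.normal(0, 0.1, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=4).fit(X_tr, y_tr)  # latent-variable soft sensor
print("R^2 on held-out data:", round(pls.score(X_te, y_te), 3))
```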

  10. Adaptive inferential sensors based on evolving fuzzy models.

    Science.gov (United States)

    Angelov, Plamen; Kordon, Arthur

    2010-04-01

    A new technique for the design and use of inferential sensors in the process industry is proposed in this paper, which is based on the recently introduced concept of evolving fuzzy models (EFMs). They address the challenge that the modern process industry faces today, namely, to develop such adaptive and self-calibrating online inferential sensors that reduce the maintenance costs while keeping the high precision and interpretability/transparency. The proposed new methodology makes it possible for inferential sensors to recalibrate automatically, which significantly reduces the life-cycle efforts for their maintenance. This is achieved by the adaptive and flexible open-structure EFM used. The novelty of this paper lies in the following: (1) the overall concept of inferential sensors with evolving and self-developing structure from the data streams; (2) the new methodology for online automatic selection of input variables that are most relevant for the prediction; (3) the technique to detect automatically a shift in the data pattern using the age of the clusters (and fuzzy rules); (4) the online standardization technique used by the learning procedure of the evolving model; and (5) the application of this innovative approach to several real-life industrial processes from the chemical industry (evolving inferential sensors, namely, eSensors, were used for predicting the chemical properties of different products in The Dow Chemical Company, Freeport, TX). It should be noted, however, that the methodology and conclusions of this paper are valid for the broader area of chemical and process industries in general. The results demonstrate that well-interpretable inferential sensors with a simple structure can automatically be designed from the data stream in real time, predicting various process variables of interest. The proposed approach can be used as a basis for the development of a new generation of adaptive and evolving inferential sensors that can address the

  11. Probabilistic Multi-Sensor Fusion Based Indoor Positioning System on a Mobile Device

    Directory of Open Access Journals (Sweden)

    Xiang He

    2015-12-01

    Full Text Available Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design.
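
    The online tracking phase can be illustrated with a minimal particle filter that fuses a dead-reckoning motion model with a WiFi signal-strength likelihood; the floor dimensions, path-loss model and noise levels below are assumptions for the sketch and do not reproduce the paper's HMM formulation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Minimal particle filter: motion-sensor prediction plus WiFi observation update.
N = 500
particles = rng.uniform(0, 20, size=(N, 2))          # 20 m x 20 m floor (assumed)
weights = np.full(N, 1.0 / N)
ap = np.array([10.0, 15.0])                          # one access point (assumed position)

def rssi_model(pos):                                  # simple log-distance path-loss model
    d = np.linalg.norm(pos - ap, axis=-1) + 0.1
    return -40.0 - 20.0 * np.log10(d)

true_pos = np.array([4.0, 6.0])
for _ in range(20):
    true_pos = true_pos + np.array([0.5, 0.3])        # pedestrian walks one step
    # Predict: propagate particles with the (noisy) step from the motion sensors.
    particles += np.array([0.5, 0.3]) + rng.normal(0, 0.3, size=(N, 2))
    # Update: weight by the WiFi observation likelihood.
    z = rssi_model(true_pos) + rng.normal(0, 2.0)
    weights *= np.exp(-0.5 * ((z - rssi_model(particles)) / 2.0) ** 2)
    weights /= weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)

print("estimate:", np.round(weights @ particles, 2), "truth:", true_pos)
```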

  12. The pH sensor for flavivirus membrane fusion

    OpenAIRE

    Harrison, Stephen C.

    2008-01-01

    Viruses that infect cells by uptake through endosomes have generally evolved to "sense" the local pH as part of the mechanism by which they penetrate into the cytosol. Even for the very well studied fusion proteins of enveloped viruses, identification of the specific pH sensor has been a challenge, one that has now been met successfully, for flaviviruses, by Fritz et al. (Fritz, R., K. Stiasny, and F.X. Heinz. 2008. J. Cell Biol. 183:353–361) in this issue. Thorough mutational analysis of con...

  13. Lifetime Maximizing Adaptive Power Control in Wireless Sensor Networks

    National Research Council Canada - National Science Library

    Sun, Fangting; Shayman, Mark

    2006-01-01

    ...: adaptive power control. They focus on the sensor networks that consist of a sink and a set of homogeneous wireless sensor nodes, which are randomly deployed according to a uniform distribution...

  14. Pose estimation of surgical instrument using sensor data fusion with optical tracker and IMU based on Kalman filter

    Directory of Open Access Journals (Sweden)

    Oh Hyunmin

    2015-01-01

    Full Text Available Tracking systems are essential for Image Guided Surgery (IGS). The Optical Tracking Sensor (OTS) has been widely used as a tracking system for IGS due to its high accuracy and ease of use. However, OTS has the limitation that tracking fails when occlusion of the marker occurs. In this paper, sensor fusion with OTS and an Inertial Measurement Unit (IMU) is proposed to solve this problem. The proposed algorithm improves the accuracy of the tracking system by eliminating scattering error of the sensor and compensates for the disadvantages of OTS and IMU through sensor fusion based on a Kalman filter. A coordinate-axis calibration method that further improves the accuracy is also introduced. The performed experiment verifies the effectiveness of the proposed algorithm.
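
    A one-dimensional sketch of the OTS/IMU fusion idea is given below: the IMU acceleration drives the Kalman prediction and the optical position corrects it whenever the marker is visible. The noise levels and the simulated occlusion window are assumptions, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(4)
dt = 0.01

# Minimal 1-D Kalman filter along one axis: IMU acceleration as control input,
# OTS position as measurement when the marker is not occluded.
F = np.array([[1, dt], [0, 1]])             # state: [position, velocity]
B = np.array([[0.5 * dt**2], [dt]])
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)                         # process noise (IMU error growth), assumed
R = np.array([[1e-4]])                       # OTS measurement noise (~1 cm std), assumed

x, P = np.zeros((2, 1)), np.eye(2)
for k in range(300):
    a = 0.2 + rng.normal(0, 0.05)            # IMU-measured acceleration
    x = F @ x + B * a                         # predict
    P = F @ P @ F.T + Q
    marker_visible = not (100 <= k < 150)     # simulated occlusion of the OTS marker
    if marker_visible:
        z = 0.5 * 0.2 * (dt * (k + 1))**2 + rng.normal(0, 0.01)   # OTS position reading
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)              # Kalman gain, update
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P

print("final position estimate [m]:", round(float(x[0, 0]), 3))
```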

  15. Marker-Based Multi-Sensor Fusion Indoor Localization System for Micro Air Vehicles.

    Science.gov (United States)

    Xing, Boyang; Zhu, Quanmin; Pan, Feng; Feng, Xiaoxue

    2018-05-25

    A novel multi-sensor fusion indoor localization algorithm based on ArUco markers is designed in this paper. The proposed ArUco mapping algorithm can build and correct the map of markers online with the Grubbs criterion and k-means clustering, which avoids map distortion due to lack of correction. Based on the concept of multi-sensor information fusion, a federated Kalman filter is utilized to synthesize the multi-source information from markers, optical flow, an ultrasonic sensor and the inertial sensor, which yields a continuous localization result and effectively reduces the position drift caused by the long-term loss of markers in pure marker localization. The proposed algorithm can be easily implemented on hardware consisting of one Raspberry Pi Zero and two STM32 microcontrollers produced by STMicroelectronics (Geneva, Switzerland). Thus, a small-size and low-cost marker-based localization system is presented. The experimental results show that the speed estimation of the proposed system is better than Px4flow, and that it achieves centimeter-level accuracy in mapping and positioning. The presented system not only gives satisfying localization precision, but also has the potential to incorporate other sensors (such as visual odometry, ultra-wideband (UWB) beacons and lidar) to further improve the localization performance. The proposed system can be reliably employed in Micro Aerial Vehicle (MAV) visual localization and robotics control.

  16. Marker-Based Multi-Sensor Fusion Indoor Localization System for Micro Air Vehicles

    Directory of Open Access Journals (Sweden)

    Boyang Xing

    2018-05-01

    Full Text Available A novel multi-sensor fusion indoor localization algorithm based on ArUco markers is designed in this paper. The proposed ArUco mapping algorithm can build and correct the map of markers online with the Grubbs criterion and k-means clustering, which avoids map distortion due to lack of correction. Based on the concept of multi-sensor information fusion, a federated Kalman filter is utilized to synthesize the multi-source information from markers, optical flow, an ultrasonic sensor and the inertial sensor, which yields a continuous localization result and effectively reduces the position drift caused by the long-term loss of markers in pure marker localization. The proposed algorithm can be easily implemented on hardware consisting of one Raspberry Pi Zero and two STM32 microcontrollers produced by STMicroelectronics (Geneva, Switzerland). Thus, a small-size and low-cost marker-based localization system is presented. The experimental results show that the speed estimation of the proposed system is better than Px4flow, and that it achieves centimeter-level accuracy in mapping and positioning. The presented system not only gives satisfying localization precision, but also has the potential to incorporate other sensors (such as visual odometry, ultra-wideband (UWB) beacons and lidar) to further improve the localization performance. The proposed system can be reliably employed in Micro Aerial Vehicle (MAV) visual localization and robotics control.

  17. Embry-Riddle Aeronautical University multispectral sensor and data fusion laboratory: a model for distributed research and education

    Science.gov (United States)

    McMullen, Sonya A. H.; Henderson, Troy; Ison, David

    2017-05-01

    The miniaturization of unmanned systems and spacecraft, as well as computing and sensor technologies, has opened new opportunities in the areas of remote sensing and multi-sensor data fusion for a variety of applications. Remote sensing and data fusion historically have been the purview of large government organizations, such as the Department of Defense (DoD), National Aeronautics and Space Administration (NASA), and National Geospatial-Intelligence Agency (NGA) due to the high cost and complexity of developing, fielding, and operating such systems. However, miniaturized computers with high capacity processing capabilities, small and affordable sensors, and emerging, commercially available platforms such as UAS and CubeSats to carry such sensors, have allowed for a vast range of novel applications. In order to leverage these developments, Embry-Riddle Aeronautical University (ERAU) has developed an advanced sensor and data fusion laboratory to research component capabilities and their employment on a wide range of autonomous, robotic, and transportation systems. This lab is unique in several ways; for example, it provides a traditional campus laboratory for students and faculty to model and test sensors in a range of scenarios, process multi-sensor data sets (both simulated and experimental), and analyze results. Moreover, it allows for "virtual" modeling, testing, and teaching capability reaching beyond the physical confines of the facility for use among ERAU Worldwide students and faculty located around the globe. Although other institutions such as Georgia Institute of Technology, Lockheed Martin, University of Dayton, and University of Central Florida have optical sensor laboratories, the ERAU virtual concept is the first such lab to expand to multispectral sensors and data fusion, while focusing on the data collection and data products and not on the manufacturing aspect. Further, the initiative is a unique effort among Embry-Riddle faculty to develop multi

  18. An Extension to Deng's Entropy in the Open World Assumption with an Application in Sensor Data Fusion.

    Science.gov (United States)

    Tang, Yongchuan; Zhou, Deyun; Chan, Felix T S

    2018-06-11

    Quantification of uncertain degree in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, and even an unexplored field under the open-world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper focuses on extending a belief entropy to the open world by considering the uncertain information represented by the FOD and the nonzero mass function of the empty set simultaneously. An extension to Deng’s entropy in the open world assumption (EDEOW) is proposed as a generalization of Deng’s entropy, and it degenerates to the Deng entropy in the closed world wherever necessary. In order to test the reasonability and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertain circumstances. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. In addition, a few open issues still exist in the current work: the necessary properties for a belief entropy in the open world assumption, whether there exists a belief entropy that satisfies all the existing properties, and what the most proper fusion frame for sensor data fusion under uncertainty is.
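
    For reference, the classical (closed-world) Deng entropy that the EDEOW generalizes can be computed as below; the example mass function is made up, and the open-world extension itself, which additionally accounts for the empty-set mass and the FOD cardinality, is not reproduced here.

```python
import math

def deng_entropy(mass):
    """Deng entropy of a mass function; keys are frozensets of hypotheses."""
    e = 0.0
    for focal, m in mass.items():
        if m > 0 and len(focal) > 0:           # the empty set is excluded in the closed world
            e -= m * math.log2(m / (2 ** len(focal) - 1))
    return e

# Example mass function on the FOD {a, b, c} from a single sensor report (illustrative).
m = {frozenset({"a"}): 0.6,
     frozenset({"a", "b"}): 0.3,
     frozenset({"a", "b", "c"}): 0.1}
print("Deng entropy:", round(deng_entropy(m), 4))
```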

  19. Camera-laser fusion sensor system and environmental recognition for humanoids in disaster scenarios

    International Nuclear Information System (INIS)

    Lee, Inho; Oh, Jaesung; Oh, Jun-Ho; Kim, Inhyeok

    2017-01-01

    This research aims to develop a vision sensor system and a recognition algorithm to enable a humanoid to operate autonomously in a disaster environment. In disaster response scenarios, humanoid robots that perform manipulation and locomotion tasks must identify the objects in the environment specified in the challenge posed by the United States’ Defense Advanced Research Projects Agency, e.g., doors, valves, drills, debris, uneven terrain, and stairs, among others. In order for a humanoid to undertake a number of tasks, we construct a camera–laser fusion system and develop an environmental recognition algorithm. A laser distance sensor and a motor are used to obtain 3D cloud data. We project the 3D cloud data onto a 2D image according to the intrinsic parameters of the camera and the distortion model of the lens. In this manner, our fusion sensor system performs functions such as those performed by the RGB-D sensors generally used in segmentation research. Our recognition algorithm is based on super-pixel segmentation and random sampling. The proposed approach clusters the unorganized cloud data according to geometric characteristics, namely, proximity and co-planarity. To assess the feasibility of our system and algorithm, we utilize the humanoid robot DRC-HUBO, and the results are demonstrated in the accompanying video.

  20. Camera-laser fusion sensor system and environmental recognition for humanoids in disaster scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Inho [Institute for Human and Machine Cognition (IHMC), Florida (United States); Oh, Jaesung; Oh, Jun-Ho [Korea Advanced Institute of Science and Technology (KAIST), Daejeon (Korea, Republic of); Kim, Inhyeok [NAVER Green Factory, Seongnam (Korea, Republic of)

    2017-06-15

    This research aims to develop a vision sensor system and a recognition algorithm to enable a humanoid to operate autonomously in a disaster environment. In disaster response scenarios, humanoid robots that perform manipulation and locomotion tasks must identify the objects in the environment specified in the challenge posed by the United States’ Defense Advanced Research Projects Agency, e.g., doors, valves, drills, debris, uneven terrain, and stairs, among others. In order for a humanoid to undertake a number of tasks, we construct a camera–laser fusion system and develop an environmental recognition algorithm. A laser distance sensor and a motor are used to obtain 3D cloud data. We project the 3D cloud data onto a 2D image according to the intrinsic parameters of the camera and the distortion model of the lens. In this manner, our fusion sensor system performs functions such as those performed by the RGB-D sensors generally used in segmentation research. Our recognition algorithm is based on super-pixel segmentation and random sampling. The proposed approach clusters the unorganized cloud data according to geometric characteristics, namely, proximity and co-planarity. To assess the feasibility of our system and algorithm, we utilize the humanoid robot DRC-HUBO, and the results are demonstrated in the accompanying video.

  1. The role of data fusion in predictive maintenance using digital twin

    Science.gov (United States)

    Liu, Zheng; Meyendorf, Norbert; Mrad, Nezih

    2018-04-01

    The modern aerospace industry is migrating from reactive to proactive and predictive maintenance to increase platform operational availability and efficiency, extend its useful life cycle and reduce its life cycle cost. Multiphysics modeling together with data-driven analytics generates a new paradigm called the "Digital Twin." The digital twin is actually a living model of the physical asset or system, which continually adapts to operational changes based on the collected online data and information, and can forecast the future of the corresponding physical counterpart. This paper reviews the overall framework to develop a digital twin coupled with industrial Internet of Things technology to advance aerospace platform autonomy. Data fusion techniques play a particularly significant role in the digital twin framework. The flow of information from raw data to high-level decision making is propelled by sensor-to-sensor, sensor-to-model, and model-to-model fusion. This paper further discusses and identifies the role of data fusion in the digital twin framework for aircraft predictive maintenance.

  2. The use of proximal soil sensor data fusion and digital soil mapping for precision agriculture

    OpenAIRE

    Ji, Wenjun; Adamchuk, Viacheslav; Chen, Songchao; Biswas, Asim; Leclerc, Maxime; Viscarra Rossel, Raphael

    2017-01-01

    Proximal soil sensing (PSS) is a promising approach when it comes to detailed characterization of spatial soil heterogeneity. Since no existing PSS system can measure all the soil information needed for implementing precision agriculture, sensor data fusion can provide a reasonable alternative for characterizing the complexity of soils. In this study, we fused the data measured using a gamma-ray sensor, an apparent electrical conductivity (ECa) sensor, and a commercial Veris MS...

  3. arXiv Signal coupling to embedded pitch adapters in silicon sensors

    CERN Document Server

    Artuso, M.; Bezshyiko, I.; Blusk, S.; Bruendler, R.; Bugiel, S.; Dasgupta, R.; Dendek, A.; Dey, B.; Ely, S.; Lionetto, F.; Petruzzo, M.; Polyakov, I.; Rudolph, M.; Schindler, H.; Steinkamp, O.; Stone, S.

    2018-01-01

    We have examined the effects of embedded pitch adapters on signal formation in n-substrate silicon microstrip sensors with data from beam tests and simulation. According to simulation, the presence of the pitch adapter metal layer changes the electric field inside the sensor, resulting in slowed signal formation on the nearby strips and a pick-up effect on the pitch adapter. This can result in an inefficiency to detect particles passing through the pitch adapter region. All these effects have been observed in the beam test data.

  4. A Method for Improving the Pose Accuracy of a Robot Manipulator Based on Multi-Sensor Combined Measurement and Data Fusion

    Science.gov (United States)

    Liu, Bailing; Zhang, Fumin; Qu, Xinghua

    2015-01-01

    An improvement method for the pose accuracy of a robot manipulator using a multiple-sensor combination measuring system (MCMS) is presented. The system is composed of a visual sensor, an angle sensor and a series robot. The visual sensor is utilized to measure the position of the manipulator in real time, and the angle sensor is rigidly attached to the manipulator to obtain its orientation. To achieve higher accuracy from the multiple sensors, two efficient data fusion approaches, the Kalman filter (KF) and the multi-sensor optimal information fusion algorithm (MOIFA), are used to fuse the position and orientation of the manipulator. The simulation and experimental results show that the pose accuracy of the robot manipulator is improved dramatically, by 38%∼78%, with the multi-sensor data fusion. Compared with reported pose accuracy improvement methods, the primary advantage of this method is that it does not require the complex solution of the kinematics parameter equations, additional motion constraints, or the complicated procedures of traditional vision-based methods. It makes robot processing more autonomous and accurate. To improve the reliability and accuracy of the pose measurements of the MCMS, the visual sensor repeatability is experimentally studied. An optimal range of 1 × 0.8 × 1 ∼ 2 × 0.8 × 1 m in the field of view (FOV) is indicated by the experimental results. PMID:25850067
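
    The linear-minimum-variance idea behind such optimal information fusion can be shown for a single coordinate: each independent estimate is weighted by its inverse variance. The numbers below are illustrative, not the paper's measurements.

```python
# Linear-minimum-variance fusion of two independent estimates of the same pose
# coordinate; the values and variances are illustrative assumptions.
x_vision, var_vision = 0.512, 1.0e-6      # visual-sensor estimate [m] and variance
x_angle,  var_angle  = 0.508, 4.0e-6      # estimate propagated from the angle sensor

w_vision = (1.0 / var_vision) / (1.0 / var_vision + 1.0 / var_angle)
w_angle  = 1.0 - w_vision

x_fused = w_vision * x_vision + w_angle * x_angle
var_fused = 1.0 / (1.0 / var_vision + 1.0 / var_angle)   # never larger than either input
print(f"fused: {x_fused:.4f} m, variance: {var_fused:.2e}")
```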

  5. An integrated multi-sensor fusion-based deep feature learning approach for rotating machinery diagnosis

    Science.gov (United States)

    Liu, Jie; Hu, Youmin; Wang, Yan; Wu, Bo; Fan, Jikai; Hu, Zhongxu

    2018-05-01

    The diagnosis of complicated fault severity problems in rotating machinery systems is an important issue that affects the productivity and quality of manufacturing processes and industrial applications. However, it usually suffers from several deficiencies. (1) A considerable degree of prior knowledge and expertise is required not only to extract and select specific features from raw sensor signals, but also to choose a suitable fusion scheme for the sensor information. (2) Traditional artificial neural networks with shallow architectures are usually adopted and they have a limited ability to learn the complex and variable operating conditions. In multi-sensor-based diagnosis applications in particular, massive high-dimensional and high-volume raw sensor signals need to be processed. In this paper, an integrated multi-sensor fusion-based deep feature learning (IMSFDFL) approach is developed to identify the fault severity in rotating machinery processes. First, traditional statistics and energy spectrum features are extracted from multiple sensors with multiple channels and combined. Then, a fused feature vector is constructed from all of the acquisition channels. Further, deep feature learning with stacked auto-encoders is used to obtain the deep features. Finally, the traditional softmax model is applied to identify the fault severity. The effectiveness of the proposed IMSFDFL approach is primarily verified by a one-stage gearbox experimental platform that uses several accelerometers under different operating conditions. This approach can identify fault severity more effectively than the traditional approaches.

  6. Instrumental intelligent test of food sensory quality as mimic of human panel test combining multiple cross-perception sensors and data fusion

    International Nuclear Information System (INIS)

    Ouyang, Qin; Zhao, Jiewen; Chen, Quansheng

    2014-01-01

    Highlights: • To develop a novel instrumental intelligent test methodology for food sensory analysis. • A novel data fusion was used in instrumental intelligent test methodology. • Linear and nonlinear tools were comparatively used for modeling. • The instrumental test methodology can be imitative of human test behavior. - Abstract: Instrumental test of food quality using perception sensors instead of human panel test is attracting massive attention recently. A novel cross-perception multi-sensors data fusion imitating multiple mammal perception was proposed for the instrumental test in this work. First, three mimic sensors of electronic eye, electronic nose and electronic tongue were used in sequence for data acquisition of rice wine samples. Then all data from the three different sensors were preprocessed and merged. Next, three cross-perception variables i.e., color, aroma and taste, were constructed using principal components analysis (PCA) and multiple linear regression (MLR) which were used as the input of models. MLR, back-propagation artificial neural network (BPANN) and support vector machine (SVM) were comparatively used for modeling, and the instrumental test was achieved for the comprehensive quality of samples. Results showed the proposed cross-perception multi-sensors data fusion presented obvious superiority to the traditional data fusion methodologies, also achieved a high correlation coefficient (>90%) with the human panel test results. This work demonstrated that the instrumental test based on the cross-perception multi-sensors data fusion can actually mimic the human test behavior, therefore is of great significance to ensure the quality of products and decrease the loss of the manufacturers
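
    A compact sketch of the cross-perception fusion chain (one PCA component per instrument, then MLR to the panel score) is shown below; the instrument feature counts and the synthetic panel scores are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)

# Stand-in responses of the three instruments for 60 rice-wine samples
# (feature counts and the sensory target are assumed values for the sketch).
e_eye, e_nose, e_tongue = (rng.normal(size=(60, n)) for n in (12, 10, 7))
panel_score = rng.normal(size=60)                      # human panel reference

# Step 1: one cross-perception variable per instrument via PCA (first component).
colour = PCA(n_components=1).fit_transform(e_eye)
aroma  = PCA(n_components=1).fit_transform(e_nose)
taste  = PCA(n_components=1).fit_transform(e_tongue)

# Step 2: multiple linear regression from the fused variables to the panel score.
X = np.hstack([colour, aroma, taste])
mlr = LinearRegression().fit(X, panel_score)
print("fitted coefficients (colour, aroma, taste):", np.round(mlr.coef_, 3))
```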

  7. Instrumental intelligent test of food sensory quality as mimic of human panel test combining multiple cross-perception sensors and data fusion

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, Qin; Zhao, Jiewen; Chen, Quansheng, E-mail: qschen@ujs.edu.cn

    2014-09-02

    Highlights: • To develop a novel instrumental intelligent test methodology for food sensory analysis. • A novel data fusion was used in instrumental intelligent test methodology. • Linear and nonlinear tools were comparatively used for modeling. • The instrumental test methodology can be imitative of human test behavior. - Abstract: Instrumental test of food quality using perception sensors instead of human panel test is attracting massive attention recently. A novel cross-perception multi-sensors data fusion imitating multiple mammal perception was proposed for the instrumental test in this work. First, three mimic sensors of electronic eye, electronic nose and electronic tongue were used in sequence for data acquisition of rice wine samples. Then all data from the three different sensors were preprocessed and merged. Next, three cross-perception variables i.e., color, aroma and taste, were constructed using principal components analysis (PCA) and multiple linear regression (MLR) which were used as the input of models. MLR, back-propagation artificial neural network (BPANN) and support vector machine (SVM) were comparatively used for modeling, and the instrumental test was achieved for the comprehensive quality of samples. Results showed the proposed cross-perception multi-sensors data fusion presented obvious superiority to the traditional data fusion methodologies, also achieved a high correlation coefficient (>90%) with the human panel test results. This work demonstrated that the instrumental test based on the cross-perception multi-sensors data fusion can actually mimic the human test behavior, therefore is of great significance to ensure the quality of products and decrease the loss of the manufacturers.

  8. Instrumental intelligent test of food sensory quality as mimic of human panel test combining multiple cross-perception sensors and data fusion.

    Science.gov (United States)

    Ouyang, Qin; Zhao, Jiewen; Chen, Quansheng

    2014-09-02

    Instrumental test of food quality using perception sensors instead of human panel test is attracting massive attention recently. A novel cross-perception multi-sensors data fusion imitating multiple mammal perception was proposed for the instrumental test in this work. First, three mimic sensors of electronic eye, electronic nose and electronic tongue were used in sequence for data acquisition of rice wine samples. Then all data from the three different sensors were preprocessed and merged. Next, three cross-perception variables i.e., color, aroma and taste, were constructed using principal components analysis (PCA) and multiple linear regression (MLR) which were used as the input of models. MLR, back-propagation artificial neural network (BPANN) and support vector machine (SVM) were comparatively used for modeling, and the instrumental test was achieved for the comprehensive quality of samples. Results showed the proposed cross-perception multi-sensors data fusion presented obvious superiority to the traditional data fusion methodologies, also achieved a high correlation coefficient (>90%) with the human panel test results. This work demonstrated that the instrumental test based on the cross-perception multi-sensors data fusion can actually mimic the human test behavior, therefore is of great significance to ensure the quality of products and decrease the loss of the manufacturers. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. An Extension to Deng’s Entropy in the Open World Assumption with an Application in Sensor Data Fusion

    Directory of Open Access Journals (Sweden)

    Yongchuan Tang

    2018-06-01

    Full Text Available Quantification of uncertain degree in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, and even an unexplored field under the open-world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper focuses on extending a belief entropy to the open world by considering the uncertain information represented by the FOD and the nonzero mass function of the empty set simultaneously. An extension to Deng’s entropy in the open world assumption (EDEOW) is proposed as a generalization of Deng’s entropy, and it degenerates to the Deng entropy in the closed world wherever necessary. In order to test the reasonability and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertain circumstances. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. In addition, a few open issues still exist in the current work: the necessary properties for a belief entropy in the open world assumption, whether there exists a belief entropy that satisfies all the existing properties, and what the most proper fusion frame for sensor data fusion under uncertainty is.

  10. Adaptive neural network/expert system that learns fault diagnosis for different structures

    Science.gov (United States)

    Simon, Solomon H.

    1992-08-01

    Corporations need better real-time monitoring and control systems to improve productivity by watching quality and increasing production flexibility. The innovative technology to achieve this goal is evolving in the form of artificial intelligence and neural networks applied to sensor processing, fusion, and interpretation. By using these advanced AI techniques, we can leverage existing systems and add value to conventional techniques. Neural networks and knowledge-based expert systems can be combined into intelligent sensor systems which provide real-time monitoring, control, evaluation, and fault diagnosis for production systems. Neural network-based intelligent sensor systems are more reliable because they can provide continuous, non-destructive monitoring and inspection. Use of neural networks can result in sensor fusion and the ability to model highly non-linear systems. Improved models can provide a foundation for more accurate performance parameters and predictions. We discuss a research software/hardware prototype which integrates neural networks, expert systems, and sensor technologies and which can adapt across a variety of structures to perform fault diagnosis. The flexibility and adaptability of the prototype in learning two structures is presented. Potential applications are discussed.

  11. How Magnetic Disturbance Influences the Attitude and Heading in Magnetic and Inertial Sensor-Based Orientation Estimation.

    Science.gov (United States)

    Fan, Bingfei; Li, Qingguo; Liu, Tao

    2017-12-28

    With the advancements in micro-electromechanical systems (MEMS) technologies, magnetic and inertial sensors are becoming more and more accurate, lightweight, smaller in size as well as low-cost, which in turn boosts their applications in human movement analysis. However, challenges still exist in the field of sensor orientation estimation, where magnetic disturbance represents one of the obstacles limiting their practical application. The objective of this paper is to systematically analyze exactly how magnetic disturbance affects the attitude and heading estimation for a magnetic and inertial sensor. First, we reviewed four major components dealing with magnetic disturbance, namely decoupling attitude estimation from the magnetic reading, gyro bias estimation, adaptive strategies for compensating magnetic disturbance and sensor fusion algorithms. We review and analyze the features of the existing methods for each component. Second, to understand each component in magnetic disturbance rejection, four representative sensor fusion methods were implemented, including a gradient descent algorithm, an improved explicit complementary filter, a dual-linear Kalman filter and an extended Kalman filter. Finally, a new standardized testing procedure has been developed to objectively assess the performance of each method against magnetic disturbance. Based upon the testing results, the strengths and weaknesses of the existing sensor fusion methods were easily examined, and suggestions were presented for selecting a proper sensor fusion algorithm or developing new sensor fusion methods.
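
    As a minimal illustration of decoupling attitude estimation from the magnetic reading, the sketch below blends gyroscope integration with accelerometer-derived pitch in a complementary filter; the sampling rate, blending factor and noise levels are assumed values, and heading estimation (which still needs the magnetometer) is not covered.

```python
import numpy as np

# Complementary filter for pitch: the gyroscope integrates short-term motion while the
# accelerometer corrects long-term drift; no magnetometer is involved, so this part of
# the orientation stays immune to magnetic disturbance.
rng = np.random.default_rng(6)
dt, alpha = 0.01, 0.98                       # sample period and blending factor (assumed)

pitch = 0.0
true_pitch = np.deg2rad(10.0)                # static tilt for the sketch
for _ in range(500):
    gyro = 0.0 + rng.normal(0, 0.01) + 0.02           # rate [rad/s] with bias + noise
    ax = np.sin(true_pitch) + rng.normal(0, 0.02)      # accelerometer components (gravity)
    az = np.cos(true_pitch) + rng.normal(0, 0.02)
    pitch_acc = np.arctan2(ax, az)                     # attitude from gravity alone
    pitch = alpha * (pitch + gyro * dt) + (1 - alpha) * pitch_acc

print("estimated pitch [deg]:", round(np.rad2deg(pitch), 2))
```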

  12. A Real-Time Smooth Weighted Data Fusion Algorithm for Greenhouse Sensing Based on Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Tengyue Zou

    2017-11-01

    Full Text Available Wireless sensor networks are widely used to acquire environmental parameters to support agricultural production. However, data variation and noise caused by actuators often produce complex measurement conditions. These factors can lead to nonconformity in reporting samples from different nodes and cause errors when making a final decision. Data fusion is well suited to reduce the influence of actuator-based noise and improve automation accuracy. A key step is to identify the sensor nodes disturbed by actuator noise and reduce their degree of participation in the data fusion results. A smoothing value is introduced and a searching method based on Prim’s algorithm is designed to help obtain stable sensing data. A voting mechanism with dynamic weights is then proposed to obtain the data fusion result. The dynamic weighting process can sharply reduce the influence of actuator noise in data fusion and gradually condition the data to normal levels over time. To shorten the data fusion time in large networks, an acceleration method with prediction is also presented to reduce the data collection time. A real-time system is implemented on STMicroelectronics STM32F103 and NORDIC nRF24L01 platforms and the experimental results verify the improvement provided by these new algorithms.
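
    The dynamic-weight voting idea can be sketched as follows; the readings, the use of the median as the smoothing value and the exact weight function are illustrative assumptions rather than the algorithm's formulation.

```python
import numpy as np

# Dynamic-weight fusion: nodes whose readings stray from the smoothed consensus get
# their voting weight reduced, so an actuator-disturbed node barely influences the
# fused greenhouse temperature. Readings below are illustrative.
readings = np.array([24.1, 24.3, 23.9, 24.2, 29.8])   # node 4 sits next to a heater

smooth = np.median(readings)                           # smoothing value (consensus)
deviation = np.abs(readings - smooth)
weights = 1.0 / (1.0 + deviation ** 2)                 # assumed dynamic weight form
weights /= weights.sum()

fused = float(weights @ readings)
print("weights:", np.round(weights, 3), "fused temperature:", round(fused, 2))
```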

  13. Active Hearing Mechanisms Inspire Adaptive Amplification in an Acoustic Sensor System.

    Science.gov (United States)

    Guerreiro, Jose; Reid, Andrew; Jackson, Joseph C; Windmill, James F C

    2018-06-01

    Over many millions of years of evolution, nature has developed some of the most adaptable sensors and sensory systems possible, capable of sensing, conditioning and processing signals in a very power- and size-effective manner. By looking into biological sensors and systems as a source of inspiration, this paper presents the study of a bioinspired concept of signal processing at the sensor level. By exploiting a feedback control mechanism between a front-end acoustic receiver and back-end neuronal based computation, a nonlinear amplification with hysteretic behavior is created. Moreover, the transient response of the front-end acoustic receiver can also be controlled and enhanced. A theoretical model is proposed and the concept is prototyped experimentally through an embedded system setup that can provide dynamic adaptations of a sensory system comprising a MEMS microphone placed in a closed-loop feedback system. It faithfully mimics the mosquito's active hearing response as a function of the input sound intensity. This is an adaptive acoustic sensor system concept that can be exploited by sensor and system designers within acoustics and ultrasonic engineering fields.

  14. Coupled sensor/platform control design for low-level chemical detection with position-adaptive micro-UAVs

    Science.gov (United States)

    Goodwin, Thomas; Carr, Ryan; Mitra, Atindra K.; Selmic, Rastko R.

    2009-05-01

    We discuss the development of Position-Adaptive Sensors [1] for purposes of detecting embedded chemical substances in challenging environments. This concept is a generalization of patented Position-Adaptive Radar Concepts developed at AFRL for challenging conditions such as urban environments. For purposes of investigating the detection of chemical substances using multiple MAV (Micro-UAV) platforms, we have designed and implemented an experimental testbed with sample structures such as wooden carts that contain controlled leakage points. Under this general concept, some of the members of a MAV swarm can serve as external position-adaptive "transmitters" by blowing air over the cart and some of the members of a MAV swarm can serve as external position-adaptive "receivers" that are equipped with chemical or biological (chem/bio) sensors that function as "electronic noses". The objective can be defined as improving the particle count of chem/bio concentrations that impinge on a MAV-based position-adaptive sensor that surrounds a chemical repository, such as a cart, via the development of intelligent position-adaptive control algorithms. The overall effect is to improve the detection and false-alarm statistics of the overall system. Within the major sections of this paper, we discuss a number of different aspects of developing our initial MAV-Based Sensor Testbed. This testbed includes blowers to simulate position-adaptive excitations and a MAV from Draganfly Innovations Inc. with stable design modifications to accommodate our chem/bio sensor boom design. We include details with respect to several critical phases of the development effort including development of the wireless sensor network and experimental apparatus, development of the stable sensor boom for the MAV, integration of chem/bio sensors and sensor node onto the MAV and boom, development of position-adaptive control algorithms and initial tests at IDCAST (Institute for the Development and

  15. An Adaptive Sensor Mining Framework for Pervasive Computing Applications

    Science.gov (United States)

    Rashidi, Parisa; Cook, Diane J.

    Analyzing sensor data in pervasive computing applications brings unique challenges to the KDD community. The challenge is heightened when the underlying data source is dynamic and the patterns change. We introduce a new adaptive mining framework that detects patterns in sensor data, and more importantly, adapts to the changes in the underlying model. In our framework, the frequent and periodic patterns of data are first discovered by the Frequent and Periodic Pattern Miner (FPPM) algorithm; and then any changes in the discovered patterns over the lifetime of the system are discovered by the Pattern Adaptation Miner (PAM) algorithm, in order to adapt to the changing environment. This framework also captures vital context information present in pervasive computing applications, such as the startup triggers and temporal information. In this paper, we present a description of our mining framework and validate the approach using data collected in the CASAS smart home testbed.

  16. Track classification within wireless sensor network

    Science.gov (United States)

    Doumerc, Robin; Pannetier, Benjamin; Moras, Julien; Dezert, Jean; Canevet, Loic

    2017-05-01

    In this paper, we present our study on track classification, taking into account environmental information and estimated target states. The tracker uses several motion models adapted to different target dynamics (pedestrian, ground vehicle and SUAV, i.e. small unmanned aerial vehicle) and works in a centralized architecture. The main idea is to explore both the classification given by heterogeneous sensors and the classification obtained with our fusion module. The fusion module, presented in this paper, assigns a class to each track according to track location, velocity and associated uncertainty. To model the likelihood of each class, a fuzzy approach is used that considers constraints on the target's capability to move in the environment. The evidential reasoning approach based on Dempster-Shafer Theory (DST) is then used to perform a time integration of this classifier output. The fusion rules are tested and compared on real data obtained with our wireless sensor network. In order to handle realistic ground target tracking scenarios, we use an autonomous smart computer deposited in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of this system is evaluated in a real exercise for an intelligence operation ("hunter hunt" scenario).
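
    The DST integration step relies on Dempster's rule of combination, sketched below for two mass functions over {pedestrian, vehicle}; the mass values are made up for illustration.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions (keys: frozensets)."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                      # mass assigned to the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Illustrative single-frame classifications: the track-based fuzzy classifier and one
# sensor node both report masses over {pedestrian, vehicle} (values are made up).
P, V = frozenset({"pedestrian"}), frozenset({"vehicle"})
track_m  = {P: 0.6, V: 0.1, P | V: 0.3}
sensor_m = {P: 0.5, V: 0.2, P | V: 0.3}
print(dempster_combine(track_m, sensor_m))
```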

  17. Multi-sensor information fusion method for vibration fault diagnosis of rolling bearing

    Science.gov (United States)

    Jiao, Jing; Yue, Jianhai; Pei, Di

    2017-10-01

    The bearing is a key element in high-speed electric multiple units (EMU), and any defect in it can cause serious malfunction of the EMU at high operating speed. This paper presents a new method for bearing fault diagnosis based on a least squares support vector machine (LS-SVM) for feature-level fusion and Dempster-Shafer (D-S) evidence theory for decision-level fusion, which are used to address the low detection accuracy, the difficulty of extracting sensitive characteristics and the instability of single-sensor diagnosis systems in rolling bearing fault diagnosis. A wavelet de-noising technique was used to remove signal noise. LS-SVM was used for pattern recognition of the bearing vibration signal, and the fusion process was then carried out according to D-S evidence theory so as to recognize the bearing fault. The results indicated that the data fusion method significantly improved the performance of the intelligent approach in rolling bearing fault detection. Moreover, the results showed that this method can efficiently improve the accuracy of fault diagnosis.

  18. A New, Adaptable, Optical High-Resolution 3-Axis Sensor

    Directory of Open Access Journals (Sweden)

    Niels Buchhold

    2017-01-01

    Full Text Available This article presents a new optical, multi-functional, high-resolution 3-axis sensor which serves to navigate and can, for example, replace standard joysticks in medical devices such as electric wheelchairs, surgical robots or medical diagnosis devices. A light source, e.g., a laser diode, is affixed to a movable axis and projects a random geometric shape on an image sensor (CMOS or CCD). The downstream microcontroller’s software identifies the geometric shape’s center, distortion and size, and then calculates x, y, and z coordinates, which can be processed in attached devices. Depending on the image sensor in use (e.g., 6.41 megapixels), the 3-axis sensor features a resolution of 1544 digits from right to left and 1038 digits up and down. Through interpolation, these values rise by a factor of 100. A unique feature is the exact reproducibility (deflection to coordinates) and its precise ability to return to its neutral position. Moreover, optical signal processing provides a high level of protection against electromagnetic and radio frequency interference. The sensor is adaptive and adjustable to fit a user’s range of motion (stroke and force). This recommendation aims to optimize sensor systems such as joysticks in medical devices in terms of safety, ease of use, and adaptability.

  19. DEVELOPMENT OF A PEDESTRIAN INDOOR NAVIGATION SYSTEM BASED ON MULTI-SENSOR FUSION AND FUZZY LOGIC ESTIMATION ALGORITHMS

    Directory of Open Access Journals (Sweden)

    Y. C. Lai

    2015-05-01

    Full Text Available This paper presents a pedestrian indoor navigation system based on multi-sensor fusion and fuzzy logic estimation algorithms. The proposed navigation system is a self-contained dead reckoning navigation, meaning that no outside signal is required. In order to achieve this self-contained capability, a portable and wearable inertial measurement unit (IMU) has been developed. Its sensors are low-cost inertial sensors, an accelerometer and a gyroscope, based on micro electro-mechanical systems (MEMS). There are two types of IMU modules, handheld and waist-mounted. The low-cost MEMS sensors suffer from various errors due to manufacturing imperfections and other effects. Therefore, a sensor calibration procedure based on the scalar calibration and least squares methods has been introduced in this study to improve the accuracy of the inertial sensors. With the calibrated data acquired from the inertial sensors, the step length and strength of the pedestrian are estimated by the multi-sensor fusion and fuzzy logic estimation algorithms. The developed multi-sensor fusion algorithm provides the number of walking steps and the strength of each step in real time. The estimated walking amount and strength per step are then fed into the proposed fuzzy logic estimation algorithm to estimate the step lengths of the user. Since the walking length and direction are both required for dead reckoning navigation, the walking direction is calculated by integrating the angular rate acquired by the gyroscope of the developed IMU module. Both the walking length and direction are calculated on the IMU module and transmitted to a smartphone over Bluetooth to perform the dead reckoning navigation, which runs in a self-developed APP. Due to the error accumulation of dead reckoning navigation, a particle filter and a pre-loaded map of the indoor environment have been applied to the APP of the proposed navigation system

  20. Development of a Pedestrian Indoor Navigation System Based on Multi-Sensor Fusion and Fuzzy Logic Estimation Algorithms

    Science.gov (United States)

    Lai, Y. C.; Chang, C. C.; Tsai, C. M.; Lin, S. Y.; Huang, S. C.

    2015-05-01

    This paper presents a pedestrian indoor navigation system based on multi-sensor fusion and fuzzy logic estimation algorithms. The proposed navigation system is a self-contained dead reckoning navigation, meaning that no outside signal is required. In order to achieve this self-contained capability, a portable and wearable inertial measurement unit (IMU) has been developed. Its sensors are low-cost inertial sensors, an accelerometer and a gyroscope, based on micro electro-mechanical systems (MEMS). There are two types of IMU modules, handheld and waist-mounted. The low-cost MEMS sensors suffer from various errors due to manufacturing imperfections and other effects. Therefore, a sensor calibration procedure based on the scalar calibration and least squares methods has been introduced in this study to improve the accuracy of the inertial sensors. With the calibrated data acquired from the inertial sensors, the step length and strength of the pedestrian are estimated by the multi-sensor fusion and fuzzy logic estimation algorithms. The developed multi-sensor fusion algorithm provides the number of walking steps and the strength of each step in real time. The estimated walking amount and strength per step are then fed into the proposed fuzzy logic estimation algorithm to estimate the step lengths of the user. Since the walking length and direction are both required for dead reckoning navigation, the walking direction is calculated by integrating the angular rate acquired by the gyroscope of the developed IMU module. Both the walking length and direction are calculated on the IMU module and transmitted to a smartphone over Bluetooth to perform the dead reckoning navigation, which runs in a self-developed APP. Due to the error accumulation of dead reckoning navigation, a particle filter and a pre-loaded map of the indoor environment have been applied to the APP of the proposed navigation system to extend its

  1. Evolving RBF neural networks for adaptive soft-sensor design.

    Science.gov (United States)

    Alexandridis, Alex

    2013-12-01

    This work presents an adaptive framework for building soft-sensors based on radial basis function (RBF) neural network models. The adaptive fuzzy means algorithm is utilized in order to evolve an RBF network, which approximates the unknown system based on input-output data from it. The methodology gradually builds the RBF network model, based on two separate levels of adaptation: On the first level, the structure of the hidden layer is modified by adding or deleting RBF centers, while on the second level, the synaptic weights are adjusted with the recursive least squares with exponential forgetting algorithm. The proposed approach is tested on two different systems, namely a simulated nonlinear DC Motor and a real industrial reactor. The results show that the produced soft-sensors can be successfully applied to model the two nonlinear systems. A comparison with two different adaptive modeling techniques, namely a dynamic evolving neural-fuzzy inference system (DENFIS) and neural networks trained with online backpropagation, highlights the advantages of the proposed methodology.
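
    The second adaptation level mentioned above, recursive least squares with exponential forgetting, can be illustrated with a short generic sketch (this is not the authors' implementation; variable names, the forgetting factor and the toy data are illustrative assumptions):

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.98):
    """One recursive-least-squares step with exponential forgetting.

    theta : current weight vector (n,)
    P     : current inverse-correlation matrix (n, n)
    x     : regressor vector for this sample (n,)
    y     : observed target for this sample (scalar)
    lam   : forgetting factor in (0, 1]; smaller values track changes faster
    """
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    e = y - theta @ x                # a-priori prediction error
    theta = theta + k * e            # weight update
    P = (P - np.outer(k, Px)) / lam  # inverse-correlation update
    return theta, P

# toy usage: fit y = 2*x1 - x2 from streaming samples
rng = np.random.default_rng(0)
theta, P = np.zeros(2), np.eye(2) * 1e3
for _ in range(200):
    x = rng.normal(size=2)
    y = 2.0 * x[0] - x[1] + 0.01 * rng.normal()
    theta, P = rls_update(theta, P, x, y)
print(theta)   # approximately [2, -1]
```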

  2. Testbeam Studies on Pick-Up in Sensors with Embedded Pitch Adapters

    CERN Document Server

    Rehnisch, Laura; The ATLAS collaboration

    2018-01-01

    Embedded pitch adapters are an alternative solution to external pitch adapters widely used to facilitate the wire-bonding step when connecting silicon strip sensors and readout electronics of different pitch. The pad-pitch adaption can be moved into the sensor fabrication step by implementing a second layer of metal tracks, connected by vias to the primary metal layer of sensor strips. Such a solution, however, might bear the risk of performance losses introduced by various phenomena. One of these effects, the undesired capacitive coupling between the silicon bulk and this second metal layer (pick-up) has been investigated in photon testbeam measurements. For a worst-case embedded pitch adapter design, expected to be maximally susceptible to pick-up, a qualitative analysis has visualized the effect as a function of the location on the second metal layer structure. It was further found that the unwanted effect decreases towards expected values for operating thresholds of the binary readout used. Suggestions fo...

  3. Interference mitigation through adaptive power control in wireless sensor networks

    NARCIS (Netherlands)

    Chincoli, M.; Bacchiani, C.; Syed, Aly; Exarchakos, G.; Liotta, A.

    2016-01-01

    Adaptive transmission power control schemes have been introduced in wireless sensor networks to adjust energy consumption under different network conditions. This is a crucial goal, given the constraints under which sensor communications operate. Power reduction may however have counter-productive

  4. Identifying and tracking pedestrians based on sensor fusion and motion stability predictions.

    Science.gov (United States)

    Musleh, Basam; García, Fernando; Otamendi, Javier; Armingol, José Maria; de la Escalera, Arturo

    2010-01-01

    The lack of trustworthy sensors makes development of Advanced Driver Assistance System (ADAS) applications a tough task. It is necessary to develop intelligent systems by combining reliable sensors and real-time algorithms to send the proper, accurate messages to the drivers. In this article, an application to detect and predict the movement of pedestrians in order to prevent an imminent collision has been developed and tested under real conditions. The proposed application, first, accurately measures the position of obstacles using a two-sensor hybrid fusion approach: a stereo camera vision system and a laser scanner. Second, it correctly identifies pedestrians using intelligent algorithms based on polylines and pattern recognition related to leg positions (laser subsystem) and dense disparity maps and u-v disparity (vision subsystem). Third, it uses statistical validation gates and confidence regions to track the pedestrian within the detection zones of the sensors and predict their position in the upcoming frames. The intelligent sensor application has been experimentally tested with success while tracking pedestrians that cross and move in zigzag fashion in front of a vehicle.
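
    The statistical validation gates mentioned in this abstract are commonly implemented as a chi-square test on the Mahalanobis distance of the innovation; a minimal sketch follows (the threshold, dimensions and numbers are illustrative assumptions, not values from the paper):

```python
import numpy as np

def in_validation_gate(z, z_pred, S, gate=9.21):
    """Chi-square validation gate for a 2-D position measurement.

    z, z_pred : measured and predicted pedestrian position (2,)
    S         : innovation covariance (2, 2)
    gate      : chi-square threshold (9.21 is roughly the 99% point for 2 dof)
    """
    v = z - z_pred                    # innovation
    d2 = v @ np.linalg.solve(S, v)    # squared Mahalanobis distance
    return d2 <= gate

# accept a detection only if it falls inside the track's gate
z_pred = np.array([10.0, 2.0])
S = np.diag([0.25, 0.25])
print(in_validation_gate(np.array([10.3, 2.2]), z_pred, S))  # True
```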

  5. Identifying and Tracking Pedestrians Based on Sensor Fusion and Motion Stability Predictions

    Directory of Open Access Journals (Sweden)

    Arturo de la Escalera

    2010-08-01

    Full Text Available The lack of trustworthy sensors makes development of Advanced Driver Assistance System (ADAS) applications a tough task. It is necessary to develop intelligent systems by combining reliable sensors and real-time algorithms to send the proper, accurate messages to the drivers. In this article, an application to detect and predict the movement of pedestrians in order to prevent an imminent collision has been developed and tested under real conditions. The proposed application, first, accurately measures the position of obstacles using a two-sensor hybrid fusion approach: a stereo camera vision system and a laser scanner. Second, it correctly identifies pedestrians using intelligent algorithms based on polylines and pattern recognition related to leg positions (laser subsystem) and dense disparity maps and u-v disparity (vision subsystem). Third, it uses statistical validation gates and confidence regions to track the pedestrian within the detection zones of the sensors and predict their position in the upcoming frames. The intelligent sensor application has been experimentally tested with success while tracking pedestrians that cross and move in zigzag fashion in front of a vehicle.

  6. An Improved Evidential-IOWA Sensor Data Fusion Approach in Fault Diagnosis.

    Science.gov (United States)

    Tang, Yongchuan; Zhou, Deyun; Zhuang, Miaoyan; Fang, Xueyi; Xie, Chunhe

    2017-09-18

    As an important tool of information fusion, Dempster-Shafer evidence theory is widely applied in handling the uncertain information in fault diagnosis. However, an incorrect result may be obtained if the combined evidence is highly conflicting, which may lead to failure in locating the fault. To deal with this problem, an improved evidential-Induced Ordered Weighted Averaging (IOWA) sensor data fusion approach is proposed in the frame of Dempster-Shafer evidence theory. In the new method, the IOWA operator is used to determine the weight of each sensor data source; when determining the parameters of the IOWA, both the distance of evidence and the belief entropy are taken into consideration. First, based on the global distance of evidence and the global belief entropy, the α value of the IOWA is obtained. Simultaneously, a weight vector is given based on the maximum entropy model. Then, according to the IOWA operator, the evidence is modified before applying Dempster's combination rule. The proposed method has a better performance in conflict management and fault diagnosis because the information volume of each body of evidence is taken into consideration. A numerical example and a case study in fault diagnosis are presented to show the rationality and efficiency of the proposed method.
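
    Dempster's combination rule, which the modified evidence is finally passed through, can be sketched generically as follows (hypothesis labels and masses are illustrative; the IOWA weighting and evidence-modification steps of the paper are not shown here):

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over the same frame of discernment.

    m1, m2 : dicts mapping frozenset hypotheses to masses that each sum to 1.
    Returns the normalized combined mass function (raises on total conflict).
    """
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# two sensors reporting beliefs over hypothetical faults {F1, F2, F3}
F1, F2, F3 = frozenset({"F1"}), frozenset({"F2"}), frozenset({"F3"})
m_a = {F1: 0.6, F2: 0.3, F1 | F2 | F3: 0.1}
m_b = {F1: 0.5, F3: 0.2, F1 | F2 | F3: 0.3}
print(dempster_combine(m_a, m_b))
```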

  7. Multimode Adaptable Microwave Radar Sensor Based on Leaky-Wave Antennas

    Czech Academy of Sciences Publication Activity Database

    Hudec, P.; Pánek, Petr; Jeník, V.

    2017-01-01

    Roč. 65, č. 9 (2017), s. 3464-3473 ISSN 0018-9480 Institutional support: RVO:67985882 Keywords : adaptable sensor * low-range radar * multimode sensor Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering OBOR OECD: Electrical and electronic engineering Impact factor: 2.897, year: 2016

  8. Neuromorphic infrared focal plane performs sensor fusion on-plane local-contrast-enhancement spatial and temporal filtering

    Science.gov (United States)

    Massie, Mark A.; Woolaway, James T., II; Curzan, Jon P.; McCarley, Paul L.

    1993-08-01

    An infrared focal plane has been simulated, designed and fabricated which mimics the form and function of the vertebrate retina. The 'Neuromorphic' focal plane has the capability of performing pixel-based sensor fusion and real-time local contrast enhancement, much like the response of the human eye. The device makes use of an indium antimonide detector array with a 3-5 micrometer spectral response, and a switched-capacitor resistive network to compute a real-time 2D spatial average. This device permits the outputs of other sensors to be combined on-chip with the infrared detections of the focal plane itself. The resulting real-time analog processed information thus represents the combined information of many sensors, with the advantage that analog spatial and temporal signal processing is performed at the focal plane. A Gaussian subtraction method is used to produce the pixel output, which when displayed produces an image with enhanced edges, representing spatial and temporal derivatives in the scene. The spatial and temporal responses of the device are tunable during operation, permitting the operator to 'peak up' the response of the array to spatially and temporally varying signals. Such an array adapts to ambient illumination conditions without loss of detection performance. This paper reviews the Neuromorphic infrared focal plane from initial operational simulations to detailed design characteristics, and concludes with a presentation of preliminary operational data for the device as well as videotaped imagery.
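
    The Gaussian-subtraction (center minus local surround) idea described above is computed in analog hardware on the focal plane; purely as a digital illustration, the following sketch subtracts a local spatial average from each pixel (a box average stands in for the resistive-network average, and the window size is an arbitrary assumption):

```python
import numpy as np

def local_average(img, k=7):
    """Approximate the resistive-network spatial average with a k x k box filter."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def center_surround(img, k=7):
    """Gaussian-subtraction-style enhancement: each pixel minus its local average."""
    return img - local_average(img, k)

# toy usage on a synthetic frame containing a step edge
frame = np.zeros((64, 64)); frame[:, 32:] = 1.0
enhanced = center_surround(frame)   # edges stand out, flat regions go to ~0
```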

  9. A Markov game theoretic data fusion approach for cyber situational awareness

    Science.gov (United States)

    Shen, Dan; Chen, Genshe; Cruz, Jose B., Jr.; Haynes, Leonard; Kruger, Martin; Blasch, Erik

    2007-04-01

    This paper proposes an innovative data-fusion/data-mining game theoretic situation awareness and impact assessment approach for cyber network defense. Alerts generated by Intrusion Detection Sensors (IDSs) or Intrusion Prevention Sensors (IPSs) are fed into the data refinement (Level 0) and object assessment (L1) data fusion components. High-level situation/threat assessment (L2/L3) data fusion based on a Markov game model and Hierarchical Entity Aggregation (HEA) is proposed to refine the primitive prediction generated by adaptive feature/pattern recognition and capture new unknown features. A Markov (stochastic) game method is used to estimate the belief of each possible cyber attack pattern. Game theory captures the nature of cyber conflicts: determination of the attacking-force strategies is tightly coupled to determination of the defense-force strategies and vice versa. Also, Markov game theory deals with uncertainty and incompleteness of the available information. A software tool is developed to demonstrate the performance of the high-level information fusion for the cyber network defense situation, and a simulation example shows the enhanced understanding of cyber-network defense.

  10. An Adaptive Fault-Tolerant Communication Scheme for Body Sensor Networks

    Directory of Open Access Journals (Sweden)

    Zichuan Xu

    2010-10-01

    Full Text Available A high degree of reliability for critical data transmission is required in body sensor networks (BSNs). However, BSNs are usually vulnerable to channel impairments due to the body fading effect and RF interference, which may potentially cause data transmission to be unreliable. In this paper, an adaptive and flexible fault-tolerant communication scheme for BSNs, namely AFTCS, is proposed. AFTCS adopts a channel bandwidth reservation strategy to provide reliable data transmission when channel impairments occur. In order to fulfill the reliability requirements of critical sensors, fault-tolerant priority and queue are employed to adaptively adjust the channel bandwidth allocation. Simulation results show that AFTCS can alleviate the effect of channel impairments, while yielding lower packet loss rate and latency for critical sensors at runtime.

  11. Adaptive polarization image fusion based on regional energy dynamic weighted average

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yong-qiang; PAN Quan; ZHANG Hong-cai

    2005-01-01

    According to the principle of polarization imaging and the relation between the Stokes parameters and the degree of linear polarization, there is much redundant and complementary information in polarized images. Since man-made and natural objects can be easily distinguished in degree-of-linear-polarization images, and Stokes-parameter images contain rich detailed information about the scene, clutter in the images can be removed efficiently while the detailed information is maintained by combining these images. An algorithm of adaptive polarization image fusion based on regional energy dynamic weighted averaging is proposed in this paper to combine these images. In an experiment and simulations, most of the clutter is removed by this algorithm. The fusion method is applied under different lighting conditions in simulation, and the influence of lighting conditions on the fusion results is analyzed.
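
    A generic regional-energy weighted-average fusion of two co-registered images can be sketched as follows (this is a simplified stand-in for the paper's rule; the window size and the handling of the Stokes/degree-of-linear-polarization imagery are assumptions):

```python
import numpy as np

def regional_energy(img, win=3):
    """Sum of squared intensities over a win x win neighborhood of each pixel."""
    pad = win // 2
    sq = np.pad(img.astype(float) ** 2, pad, mode="edge")
    energy = np.zeros(img.shape, dtype=float)
    for dy in range(win):
        for dx in range(win):
            energy += sq[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return energy

def fuse_by_regional_energy(img_a, img_b, win=3, eps=1e-12):
    """Pixel-wise weighted average; each weight follows the image's local energy."""
    ea, eb = regional_energy(img_a, win), regional_energy(img_b, win)
    wa = ea / (ea + eb + eps)
    return wa * img_a + (1.0 - wa) * img_b

# toy usage with two random co-registered images
rng = np.random.default_rng(0)
fused = fuse_by_regional_energy(rng.random((32, 32)), rng.random((32, 32)))
```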

  12. Wearable sensors for human health monitoring

    Science.gov (United States)

    Asada, H. Harry; Reisner, Andrew

    2006-03-01

    Wearable sensors for continuous monitoring of vital signs for extended periods of weeks or months are expected to revolutionize healthcare services in the home and workplace as well as in hospitals and nursing homes. This invited paper describes recent research progress in wearable health monitoring technology and its clinical applications, with emphasis on blood pressure and circulatory monitoring. First, a finger ring-type wearable blood pressure sensor based on the photoplethysmogram is presented. Technical issues, including motion artifact reduction, power saving, and wearability enhancement, are addressed. Second, sensor fusion and sensor networking for integrating multiple sensors with diverse modalities are discussed for comprehensive monitoring and diagnosis of health status. Unlike traditional snapshot measurements, continuous monitoring with wearable sensors opens up the possibility of treating the physiological system as a dynamical process. This allows us to apply powerful system dynamics and control methodologies, such as adaptive filtering, single- and multi-channel system identification, active noise cancellation, and adaptive control, to the monitoring and treatment of highly complex physiological systems. A few clinical trials illustrate the potential of wearable sensor technology for future health care services.

  13. Sensor Virtual Adaptable de Concentración de Etanol para Fermentadores Industriales

    Directory of Open Access Journals (Sweden)

    Boris Martínez

    2009-07-01

    Full Text Available Abstract: Control systems use sensors to observe the state of the process and make decisions. Sometimes the process variables must be estimated because a suitable sensor does not exist, is prohibitively expensive, or the measurements are difficult to perform. One solution is to infer the unmeasured variables from other variables by means of virtual sensors or soft-sensors. In alcoholic fermentation processes, measuring the ethanol concentration is essential. However, there are no cheap and reliable sensors to measure it online, nor is there a universally accepted approach to modelling this variable. Moreover, fermentations are never identical, since the microorganisms are very sensitive to small deviations in the variables involved. Therefore, these processes require a highly robust, adaptive estimation system. This work presents an adaptive virtual sensor for a bioethanol fermentation process using an evolving fuzzy model built from process data. Furthermore, the obtained model is compact and has a structure suitable for future application in control strategies, in order to optimize the productivity of the process and reduce production costs. Keywords: bioethanol, fermentation processes, virtual sensors or soft-sensors, adaptive systems, fuzzy systems

  14. Weaving Hilbert space fusion frames

    OpenAIRE

    Neyshaburi, Fahimeh Arabyani; Arefijamaal, Ali Akbar

    2018-01-01

    A new notion in frame theory, so-called weaving frames, has recently been introduced to deal with some problems in signal processing and wireless sensor networks. Also, fusion frames are an important extension of frames, used in many areas, especially wireless sensor networks. In this paper, we survey the notion of weaving Hilbert space fusion frames. This concept can have potential applications in wireless sensor networks which require distributed processing using different fusion frames...

  15. Performance of Hall sensor-based devices for magnetic field diagnosis at fusion reactors

    Czech Academy of Sciences Publication Activity Database

    Bolshakova, I.; Ďuran, Ivan; Holyaka, R.; Hristoforou, E.; Marusenkov, A.

    2007-01-01

    Roč. 5, č. 1 (2007), s. 283-288 ISSN 1546-198X R&D Projects: GA AV ČR KJB100430504 Institutional research plan: CEZ:AV0Z20430508 Keywords : Galvanomagnetic * Sensor * Fusion Reactor * Magnetic Diagnostics * Radiation Hardness Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders Impact factor: 1.587, year: 2007

  16. Keystrokes Inference Attack on Android: A Comparative Evaluation of Sensors and Their Fusion

    Directory of Open Access Journals (Sweden)

    Ahmed Al-Haiqi

    2014-11-01

    Full Text Available Introducing motion sensors into smartphones contributed to a wide range of applications in human-phone interaction, gaming, and many others. However, built-in sensors that detect subtle motion changes (e.g., accelerometers) might also reveal information about taps on touch screens: the main user input mode. A few researchers have already demonstrated the idea of exploiting motion sensors as side-channels for inferring keystrokes. Taken at most as initial explorations, much research is still needed to analyze the practicality of the new threat and examine various aspects of its implementation. One important aspect directly affecting the attack effectiveness is the selection of the right combination of sensors to supply inference data. Although other aspects also play a crucial role (e.g., the feature set), we start in this paper by focusing on the comparison of different available sensors in terms of inference accuracy. We consider individual sensors shipped on Android phones, and study a few options for preprocessing their raw datasets as well as fusing several sensors' readings. Our results indicate an outstanding performance of the gyroscope, and the potential of sensor data fusion. However, it seems that sensors with a magnetometer component or the accelerometer alone have less benefit in the context of the adverted attack.

  17. The Joint Adaptive Kalman Filter (JAKF) for Vehicle Motion State Estimation.

    Science.gov (United States)

    Gao, Siwei; Liu, Yanheng; Wang, Jian; Deng, Weiwen; Oh, Heekuck

    2016-07-16

    This paper proposes a multi-sensor Joint Adaptive Kalman Filter (JAKF) that extends innovation-based adaptive estimation (IAE) to estimate the motion state of the moving vehicles ahead. JAKF views Lidar and Radar data as the sources of the local filters, which aim to adaptively adjust the measurement noise variance-covariance (V-C) matrix 'R' and the system noise V-C matrix 'Q'. Then, the global filter uses R to calculate the information allocation factor 'β' for data fusion. Finally, the global filter completes optimal data fusion and feeds back to the local filters to improve their measurement accuracy. Extensive simulation and experimental results show that the JAKF has better adaptive ability and fault tolerance. JAKF bridges the gap between the accuracies of the various sensors to improve the overall filtering effectiveness. If any sensor breaks down, the filtered results of JAKF can still maintain a stable convergence rate. Moreover, the JAKF outperforms the conventional Kalman filter (CKF) and the innovation-based adaptive Kalman filter (IAKF) with respect to the accuracy of displacement, velocity, and acceleration, respectively.
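
    The innovation-based adaptive estimation (IAE) idea that JAKF extends can be illustrated with a scalar Kalman filter whose measurement-noise variance is re-estimated from a sliding window of innovations (a toy sketch, not the paper's Lidar/Radar local-global structure; the window length and noise values are assumptions):

```python
import numpy as np

def iae_kalman(zs, q=1e-3, r0=1.0, window=20):
    """1-D constant-state Kalman filter whose R adapts to the innovation sequence.

    zs     : measurement sequence
    q, r0  : initial process- and measurement-noise variances
    window : number of recent innovations used to re-estimate R
    """
    x, p, r = float(zs[0]), 1.0, r0
    innovations, estimates = [], []
    for z in zs[1:]:
        p += q                               # predict
        nu = z - x                           # innovation
        innovations.append(nu)
        if len(innovations) >= window:
            c = np.mean(np.square(innovations[-window:]))
            r = max(c - p, 1e-9)             # IAE: R ~ innovation covariance minus H P H^T
        k = p / (p + r)                      # update
        x += k * nu
        p *= (1.0 - k)
        estimates.append(x)
    return np.array(estimates)

# toy usage: noisy measurements of a constant value
rng = np.random.default_rng(1)
z = 5.0 + rng.normal(scale=0.5, size=300)
print(iae_kalman(z)[-1])   # close to 5.0
```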

  18. Discrete Kalman Filter based Sensor Fusion for Robust Accessibility Interfaces

    International Nuclear Information System (INIS)

    Ghersi, I; Miralles, M T; Mariño, M

    2016-01-01

    Human-machine interfaces have evolved, benefiting from the growing access to devices with superior, embedded signal-processing capabilities, as well as from new sensors that allow the estimation of movements and gestures, resulting in increasingly intuitive interfaces. In this context, sensor fusion for the estimation of the spatial orientation of body segments makes it possible to achieve more robust solutions, overcoming specific disadvantages derived from the use of isolated sensors, such as the sensitivity of magnetic-field sensors to external influences when used in uncontrolled environments. In this work, a method for the combination of image-processing data and angular-velocity readings from a 3D MEMS gyroscope, through a discrete-time Kalman filter, is proposed and deployed as an alternate user interface for mobile devices, in which an on-screen pointer is controlled with head movements. Results concerning the general performance of the method are presented, as well as a comparative analysis, under a dedicated test application, with results from a previous version of this system, in which the relative-orientation information was acquired directly from MEMS sensors (3D magnetometer-accelerometer). These results show an improved response for this new version of the pointer, both in terms of precision and response time, while keeping many of the benefits that were highlighted for its predecessor, providing a complementary method for signal acquisition that can be used as an alternative-input device, as well as for accessibility solutions. (paper)

  19. Extending the Lifetime of Sensor Networks through Adaptive Reclustering

    Directory of Open Access Journals (Sweden)

    Gianluigi Ferrari

    2007-06-01

    Full Text Available We analyze the lifetime of clustered sensor networks with decentralized binary detection under a physical layer quality-of-service (QoS) constraint, given by the maximum tolerable probability of decision error at the access point (AP). In order to properly model the network behavior, we consider four different distributions (exponential, uniform, Rayleigh, and lognormal) for the lifetime of a single sensor. We show the benefits, in terms of longer network lifetime, of adaptive reclustering. We also derive an analytical framework for the computation of the network lifetime and the penalty, in terms of time delay and energy consumption, brought by adaptive reclustering. On the other hand, absence of reclustering leads to a shorter network lifetime, and we show the impact of various clustering configurations under different QoS conditions. Our results show that the organization of sensors in a few big clusters is the winning strategy to maximize the network lifetime. Moreover, the observation of the phenomenon should be frequent in order to limit the penalties associated with the reclustering procedure. We also apply the developed framework to analyze the energy consumption associated with the proposed reclustering protocol, obtaining results in good agreement with the performance of realistic wireless sensor networks. Finally, we present simulation results on the lifetime of IEEE 802.15.4 wireless sensor networks, which enrich the proposed analytical framework and show that typical networking performance metrics (such as throughput and delay) are influenced by the sensor network lifetime.

  20. Extending the Lifetime of Sensor Networks through Adaptive Reclustering

    Directory of Open Access Journals (Sweden)

    Ferrari Gianluigi

    2007-01-01

    Full Text Available We analyze the lifetime of clustered sensor networks with decentralized binary detection under a physical layer quality-of-service (QoS) constraint, given by the maximum tolerable probability of decision error at the access point (AP). In order to properly model the network behavior, we consider four different distributions (exponential, uniform, Rayleigh, and lognormal) for the lifetime of a single sensor. We show the benefits, in terms of longer network lifetime, of adaptive reclustering. We also derive an analytical framework for the computation of the network lifetime and the penalty, in terms of time delay and energy consumption, brought by adaptive reclustering. On the other hand, absence of reclustering leads to a shorter network lifetime, and we show the impact of various clustering configurations under different QoS conditions. Our results show that the organization of sensors in a few big clusters is the winning strategy to maximize the network lifetime. Moreover, the observation of the phenomenon should be frequent in order to limit the penalties associated with the reclustering procedure. We also apply the developed framework to analyze the energy consumption associated with the proposed reclustering protocol, obtaining results in good agreement with the performance of realistic wireless sensor networks. Finally, we present simulation results on the lifetime of IEEE 802.15.4 wireless sensor networks, which enrich the proposed analytical framework and show that typical networking performance metrics (such as throughput and delay) are influenced by the sensor network lifetime.

  1. Adapting Mobile Beacon-Assisted Localization in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Wei Dong

    2009-04-01

    Full Text Available The ability to automatically locate sensor nodes is essential in many Wireless Sensor Network (WSN) applications. To reduce the number of beacons, many mobile-assisted approaches have been proposed. Current mobile-assisted approaches for localization require special hardware or belong to centralized localization algorithms involving deterministic approaches, because they explicitly consider the impreciseness of location estimates. In this paper, we first propose a range-free, distributed and probabilistic Mobile Beacon-assisted Localization (MBL) approach for static WSNs. Then, we propose another approach based on MBL, called Adapting MBL (A-MBL), to increase the efficiency and accuracy of MBL by adapting the size of sample sets and the parameter of the dynamic model during the estimation process. Evaluation results show that MBL and A-MBL outperform both Mobile and Static sensor network Localization (MSL) and Arrival and Departure Overlap (ADO) in accuracy when each uses only a single mobile beacon for localization in static WSNs.

  2. Adapting mobile beacon-assisted localization in wireless sensor networks.

    Science.gov (United States)

    Teng, Guodong; Zheng, Kougen; Dong, Wei

    2009-01-01

    The ability to automatically locate sensor nodes is essential in many Wireless Sensor Network (WSN) applications. To reduce the number of beacons, many mobile-assisted approaches have been proposed. Current mobile-assisted approaches for localization require special hardware or belong to centralized localization algorithms involving deterministic approaches, because they explicitly consider the impreciseness of location estimates. In this paper, we first propose a range-free, distributed and probabilistic Mobile Beacon-assisted Localization (MBL) approach for static WSNs. Then, we propose another approach based on MBL, called Adapting MBL (A-MBL), to increase the efficiency and accuracy of MBL by adapting the size of sample sets and the parameter of the dynamic model during the estimation process. Evaluation results show that MBL and A-MBL outperform both Mobile and Static sensor network Localization (MSL) and Arrival and Departure Overlap (ADO) in accuracy when each uses only a single mobile beacon for localization in static WSNs.

  3. Embedded pitch adapters: A high-yield interconnection solution for strip sensors

    Energy Technology Data Exchange (ETDEWEB)

    Ullán, M., E-mail: miguel.ullan@imb-cnm.csic.es [Centro Nacional de Microelectronica (IMB-CNM, CSIC), Campus UAB-Bellaterra, 08193 Barcelona (Spain); Allport, P.P.; Baca, M.; Broughton, J.; Chisholm, A.; Nikolopoulos, K.; Pyatt, S.; Thomas, J.P.; Wilson, J.A. [School of Physics and Astronomy, University of Birmingham, Birmingham B15 2TT (United Kingdom); Kierstead, J.; Kuczewski, P.; Lynn, D. [Brookhaven National Laboratory, Physics Department and Instrumentation Division, Upton, NY 11973-5000 (United States); Hommels, L.B.A. [Cavendish Laboratory, University of Cambridge, JJ Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Fleta, C.; Fernandez-Tejero, J.; Quirion, D. [Centro Nacional de Microelectronica (IMB-CNM, CSIC), Campus UAB-Bellaterra, 08193 Barcelona (Spain); Bloch, I.; Díez, S.; Gregor, I.M.; Lohwasser, K. [DESY, Notkestrasse 85, 22607 Hamburg (Germany); and others

    2016-09-21

    A proposal to fabricate large area strip sensors with integrated, or embedded, pitch adapters is presented for the End-cap part of the Inner Tracker in the ATLAS experiment. To implement the embedded pitch adapters, a second metal layer is used in the sensor fabrication, for signal routing to the ASICs. Sensors with different embedded pitch adapters have been fabricated in order to optimize the design and technology. Inter-strip capacitance, noise, pick-up, cross-talk, signal efficiency, and fabrication yield have been taken into account in their design and fabrication. Inter-strip capacitance tests taking into account all channel neighbors reveal the important differences between the various designs considered. These tests have been correlated with noise figures obtained in full assembled modules, showing that the tests performed on the bare sensors are a valid tool to estimate the final noise in the full module. The full modules have been subjected to test beam experiments in order to evaluate the incidence of cross-talk, pick-up, and signal loss. The detailed analysis shows no indication of cross-talk or pick-up as no additional hits can be observed in any channel not being hit by the beam above 170 mV threshold, and the signal in those channels is always below 1% of the signal recorded in the channel being hit, above 100 mV threshold. First results on irradiated mini-sensors with embedded pitch adapters do not show any change in the interstrip capacitance measurements with only the first neighbors connected.

  4. Metal Hall sensors for the new generation fusion reactors of DEMO scale

    Science.gov (United States)

    Bolshakova, I.; Bulavin, M.; Kargin, N.; Kost, Ya.; Kuech, T.; Kulikov, S.; Radishevskiy, M.; Shurygin, F.; Strikhanov, M.; Vasil'evskii, I.; Vasyliev, A.

    2017-11-01

    For the first time, the results of on-line testing of metal Hall sensors based on nano-thickness (50-70 nm) gold films, conducted under irradiation by high-energy neutrons up to high fluences of 1·10^24 n·m^-2, are presented. The testing has been carried out in the IBR-2 fast pulsed reactor in a neutron flux with an intensity of 1.5·10^17 n·m^-2·s^-1 at the Joint Institute for Nuclear Research. The energy spectrum of the neutron flux was very close to that expected for the ex-vessel sensor locations in the ITER experimental reactor. The magnetic field sensitivity of the gold sensors was stable within the whole fluence range under research. Also, the sensitivity values at the start and at the end of the irradiation session were equal within the measurement error (<1%). The results obtained make it possible to recommend gold sensors for magnetic diagnostics in the new generation fusion reactors of DEMO scale.

  5. A new approach to self-localization for mobile robots using sensor data fusion

    International Nuclear Information System (INIS)

    Moshiri, B.; Asharif, M.; Hoseim Nezhad, R.

    2002-01-01

    This paper proposes a new approach for calibration of the dead reckoning process. Using the well-known UMBmark (University of Michigan Benchmark) is not sufficient for a desirable calibration of dead reckoning. Besides, existing calibration methods usually require explicit measurement of the actual motion of the robot. Some recent methods use a smart encoder trailer or long-range finder sensors, such as ultrasonic or laser range finders, for automatic calibration. Manual measurement is necessary for robots that are not equipped with long-range detectors or such a smart encoder trailer. Our proposed approach uses an environment map, created by fusion of proximity data, in order to calibrate the odometry error automatically. In the new approach, the systematic part of the error is adaptively estimated and compensated by an efficient, incremental maximum likelihood algorithm. Environment map data are fused with the odometry and current sensory data in order to obtain the maximum likelihood estimate. The advantages of the proposed approach are demonstrated in experiments with a Khepera robot. It is shown that the pose estimation error is reduced by more than 80%

  6. Speech Adaptation to Kinematic Recording Sensors: Perceptual and Acoustic Findings

    Science.gov (United States)

    Dromey, Christopher; Hunter, Elise; Nissen, Shawn L.

    2018-01-01

    Purpose: This study used perceptual and acoustic measures to examine the time course of speech adaptation after the attachment of electromagnetic sensor coils to the tongue, lips, and jaw. Method: Twenty native English speakers read aloud stimulus sentences before the attachment of the sensors, immediately after attachment, and again 5, 10, 15,…

  7. Sensor data fusion of radar, ESM, IFF, and data LINK of the Canadian Patrol Frigate and the data alignment issues

    Science.gov (United States)

    Couture, Jean; Boily, Edouard; Simard, Marc-Alain

    1996-05-01

    The research and development group at Loral Canada is now at the second phase of the development of a data fusion demonstration model (DFDM) for naval anti-air warfare, to be used as a workbench tool to perform exploratory research. This project has specifically addressed how the concepts related to fusion could be implemented within the Canadian Patrol Frigate (CPF) software environment. The project has been designed to read data passively on the CPF bus without any modification to the CPF software. This has brought to light important time alignment issues, since the CPF sensors and the CPF command and control system were not originally designed to support a track management function which fuses information. The fusion of data from non-organic sensors with the tactical Link-11 data has produced stimulating spatial alignment problems, which have been overcome by the use of a geodetic referencing coordinate system. Some benchmark scenarios have been selected to quantitatively demonstrate the capabilities of this fusion implementation. This paper describes the implementation design of DFDM (version 2), and summarizes the results obtained so far when fusing the scenarios' simulated data.

  8. Adaptive structured dictionary learning for image fusion based on group-sparse-representation

    Science.gov (United States)

    Yang, Jiajie; Sun, Bin; Luo, Chengwei; Wu, Yuzhong; Xu, Limei

    2018-04-01

    Dictionary learning is the key process of sparse representation, which is one of the most widely used image representation theories in image fusion. Existing dictionary learning methods do not make good use of the group structure information and the sparse coefficients. In this paper, we propose a new adaptive structured dictionary learning algorithm and an l1-norm maximum fusion rule that innovatively utilizes grouped sparse coefficients to merge the images. In the dictionary learning algorithm, we do not need prior knowledge about any group structure of the dictionary. By using the characteristics of the dictionary in expressing the signal, our algorithm can automatically find the desired potential structure information hidden in the dictionary. The fusion rule takes the physical meaning of the group structure dictionary, and makes activity-level judgements on the structure information when the images are merged. Therefore, the fused image can retain more significant information. Comparisons have been made with several state-of-the-art dictionary learning methods and fusion rules. The experimental results demonstrate that the dictionary learning algorithm and the fusion rule both outperform others in terms of several objective evaluation metrics.
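
    An l1-norm maximum fusion rule over grouped sparse coefficients can be sketched as below (the grouping would come from the learned structured dictionary; here the groups and coefficient values are illustrative assumptions):

```python
import numpy as np

def l1_max_fuse(coeffs_a, coeffs_b, groups):
    """Pick, group by group, the coefficient block with the larger l1 norm.

    coeffs_a, coeffs_b : sparse coefficient vectors for the two source patches
    groups             : list of index arrays, one per coefficient group
    """
    fused = np.zeros_like(coeffs_a)
    for idx in groups:
        if np.abs(coeffs_a[idx]).sum() >= np.abs(coeffs_b[idx]).sum():
            fused[idx] = coeffs_a[idx]
        else:
            fused[idx] = coeffs_b[idx]
    return fused

# toy usage with two coefficient groups
groups = [np.array([0, 1]), np.array([2, 3, 4])]
a = np.array([0.5, -0.2, 0.0, 0.1, 0.0])
b = np.array([0.1, 0.1, 0.9, -0.4, 0.2])
print(l1_max_fuse(a, b, groups))   # first group taken from a, second from b
```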

  9. Gyro Drift Correction for An Indirect Kalman Filter Based Sensor Fusion Driver

    Directory of Open Access Journals (Sweden)

    Chan-Gun Lee

    2016-06-01

    Full Text Available Sensor fusion techniques have made a significant contribution to the success of the recently emerging mobile applications era, because a variety of mobile applications operate based on multi-sensing information from the surrounding environment, such as navigation systems, fitness trackers, interactive virtual reality games, etc. For these applications, the accuracy of sensing information plays an important role in improving the user experience (UX) quality, especially with gyroscopes and accelerometers. Therefore, in this paper, we propose a novel mechanism to resolve the gyro drift problem, which negatively affects the accuracy of orientation computations in indirect Kalman filter based sensor fusion. Our mechanism focuses on addressing the issues of external feedback loops and non-gyro error elements contained in the state vectors of an indirect Kalman filter. Moreover, the mechanism is implemented in the device-driver layer, providing lower process latency and transparency capabilities for the upper applications. These advances are relevant to millions of legacy applications since utilizing our mechanism does not require the existing applications to be re-programmed. The experimental results show that the root mean square errors (RMSE) before and after applying our mechanism are significantly reduced from 6.3 × 10^-1 to 5.3 × 10^-7, respectively.

  10. RGB-D, Laser and Thermal Sensor Fusion for People following in a Mobile Robot

    Directory of Open Access Journals (Sweden)

    Loreto Susperregi

    2013-06-01

    Full Text Available Detecting and tracking people is a key capability for robots that operate in populated environments. In this paper, we used a multiple-sensor fusion approach that combines three kinds of sensors in order to detect people using RGB-D vision, lasers and a thermal sensor mounted on a mobile platform. The Kinect sensor offers a rich data set at significantly low cost; however, there are some limitations to its use on a mobile platform, mainly that the Kinect algorithms for people detection rely on images captured by a static camera. To cope with these limitations, this work is based on the combination of the Kinect with a Hokuyo laser and a thermopile array sensor. A real-time particle filter system merges the information provided by the sensors and calculates the position of the target, using probabilistic leg and thermal patterns, image features and optical flow to this end. Experimental results carried out with a mobile platform in a science museum have shown that the combination of different sensory cues increases the reliability of the people following system.

  11. Combination Adaptive Traffic Algorithm and Coordinated Sleeping in Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    M. Udin Harun Al Rasyid

    2014-12-01

    Full Text Available A wireless sensor network (WSN) uses a battery as its primary power source, so a WSN is limited by battery power for long operations. The WSN should therefore be able to save energy in order to operate for a long time. WSNs have the potential to be the future of wireless communications solutions. WSN nodes are small but offer a variety of functions that can help human life. With a wide variety of sensors and fast communication, WSNs make it easier for people to obtain information accurately and quickly. In this study, we combine an adaptive traffic algorithm and coordinated sleeping as a power-efficient WSN solution. We compared the performance of our proposed combination of the adaptive traffic and coordinated sleeping algorithms with a non-adaptive scheme. From the simulation results, our proposed idea offers good-quality data transmission and is more efficient in energy consumption, but it has higher delay than the non-adaptive scheme. Keywords: WSN, adaptive traffic, coordinated sleeping, beacon order, superframe order.

  12. The assessment of multi-sensor image fusion using wavelet transforms for mapping the Brazilian Savanna

    NARCIS (Netherlands)

    Weimar Acerbi, F.; Clevers, J.G.P.W.; Schaepman, M.E.

    2006-01-01

    Multi-sensor image fusion using the wavelet approach provides a conceptual framework for the improvement of the spatial resolution with minimal distortion of the spectral content of the source image. This paper assesses whether images with a large ratio of spatial resolution can be fused, and

  13. New distributed fusion filtering algorithm based on covariances over sensor networks with random packet dropouts

    Science.gov (United States)

    Caballero-Águila, R.; Hermoso-Carazo, A.; Linares-Pérez, J.

    2017-07-01

    This paper studies the distributed fusion estimation problem from multisensor measured outputs perturbed by correlated noises and uncertainties modelled by random parameter matrices. Each sensor transmits its outputs to a local processor over a packet-erasure channel and, consequently, random losses may occur during transmission. Different white sequences of Bernoulli variables are introduced to model the transmission losses. For the estimation, each lost output is replaced by its estimator based on the information received previously, and only the covariances of the processes involved are used, without requiring the signal evolution model. First, a recursive algorithm for the local least-squares filters is derived by using an innovation approach. Then, the cross-correlation matrices between any two local filters are obtained. Finally, the distributed fusion filter weighted by matrices is obtained from the local filters by applying the least-squares criterion. The performance of the estimators and the influence of both sensor uncertainties and transmission losses on the estimation accuracy are analysed in a numerical example.
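
    The matrix-weighted least-squares fusion of local estimates can be illustrated for the simplest case of two local filters with uncorrelated errors (the paper additionally handles cross-correlations between local filters and transmission losses, which this sketch omits; all numbers are illustrative):

```python
import numpy as np

def fuse_two_local_estimates(x1, P1, x2, P2):
    """Matrix-weighted least-squares fusion of two local filter outputs.

    Assumes uncorrelated local estimation errors; x1, x2 are local state
    estimates and P1, P2 their error covariance matrices.
    """
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(P1i + P2i)           # fused error covariance
    x = P @ (P1i @ x1 + P2i @ x2)          # fused state estimate
    return x, P

# toy usage with two local estimates of a 2-D state
x1, P1 = np.array([1.0, 0.5]), np.diag([0.2, 0.3])
x2, P2 = np.array([1.2, 0.4]), np.diag([0.4, 0.1])
print(fuse_two_local_estimates(x1, P1, x2, P2))
```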

  14. An Inertial and Optical Sensor Fusion Approach for Six Degree-of-Freedom Pose Estimation

    Science.gov (United States)

    He, Changyu; Kazanzides, Peter; Sen, Hasan Tutkun; Kim, Sungmin; Liu, Yue

    2015-01-01

    Optical tracking provides relatively high accuracy over a large workspace but requires line-of-sight between the camera and the markers, which may be difficult to maintain in actual applications. In contrast, inertial sensing does not require line-of-sight but is subject to drift, which may cause large cumulative errors, especially during the measurement of position. To handle cases where some or all of the markers are occluded, this paper proposes an inertial and optical sensor fusion approach in which the bias of the inertial sensors is estimated when the optical tracker provides full six degree-of-freedom (6-DOF) pose information. As long as the position of at least one marker can be tracked by the optical system, the 3-DOF position can be combined with the orientation estimated from the inertial measurements to recover the full 6-DOF pose information. When all the markers are occluded, the position tracking relies on the inertial sensors that are bias-corrected by the optical tracking system. Experiments are performed with an augmented reality head-mounted display (ARHMD) that integrates an optical tracking system (OTS) and inertial measurement unit (IMU). Experimental results show that under partial occlusion conditions, the root mean square errors (RMSE) of orientation and position are 0.04° and 0.134 mm, and under total occlusion conditions for 1 s, the orientation and position RMSE are 0.022° and 0.22 mm, respectively. Thus, the proposed sensor fusion approach can provide reliable 6-DOF pose under long-term partial occlusion and short-term total occlusion conditions. PMID:26184191

  15. An Adaptive Orientation Estimation Method for Magnetic and Inertial Sensors in the Presence of Magnetic Disturbances

    Directory of Open Access Journals (Sweden)

    Bingfei Fan

    2017-05-01

    Full Text Available Magnetic and inertial sensors have been widely used to estimate the orientation of human segments due to their low cost, compact size and light weight. However, the accuracy of the estimated orientation is easily affected by external factors, especially when the sensor is used in an environment with magnetic disturbances. In this paper, we propose an adaptive method to improve the accuracy of orientation estimation in the presence of magnetic disturbances. The method is based on existing gradient descent algorithms, and it is performed prior to the sensor fusion algorithms. The proposed method includes stationary state detection and magnetic disturbance severity determination. The stationary state detection makes this method immune to magnetic disturbances in the stationary state, while the magnetic disturbance severity determination helps to determine the credibility of magnetometer data under dynamic conditions, so as to mitigate the negative effect of the magnetic disturbances. The proposed method was validated through experiments performed on a customized three-axis instrumented gimbal with known orientations. The errors of the proposed method and the original gradient descent algorithms were calculated and compared. Experimental results demonstrate that in the stationary state, the proposed method is completely immune to magnetic disturbances, and in dynamic conditions, the error caused by magnetic disturbance is reduced by 51.2% compared with the original MIMU gradient descent algorithm.
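
    A very simple severity measure for magnetic disturbances, used here only to illustrate the idea of down-weighting magnetometer data under disturbance, is the deviation of the measured field magnitude from a local reference (the paper's determination differs; the reference magnitude and tolerance below are assumptions):

```python
import numpy as np

def magnetometer_weight(mag, ref_norm=50.0, tol=0.1):
    """Return a credibility weight in [0, 1] for a magnetometer sample.

    The weight is 1 when the measured field magnitude is within `tol` of the
    local reference `ref_norm` (e.g. ~50 uT), and fades towards 0 as the
    magnitude deviation (a crude disturbance-severity measure) grows.
    """
    deviation = abs(np.linalg.norm(mag) - ref_norm) / ref_norm
    if deviation <= tol:
        return 1.0                                      # undisturbed: trust the magnetometer
    return max(0.0, 1.0 - (deviation - tol) / tol)      # fade out with growing disturbance

print(magnetometer_weight(np.array([30.0, 20.0, 35.0])))  # magnitude ~50 -> weight ~1.0
```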

  16. Multi-Sensor Building Fire Alarm System with Information Fusion Technology Based on D-S Evidence Theory

    Directory of Open Access Journals (Sweden)

    Qian Ding

    2014-10-01

    Full Text Available Multi-sensor information fusion technology based on Dempster-Shafer evidence theory is applied in a building fire alarm system to realize early detection and alarming. By using multiple sensors to monitor the parameters of the fire process, such as light, smoke, temperature, gas and moisture, the range of fire monitoring in space and time is expanded compared with a single-sensor system. Then, D-S evidence theory is applied to fuse the information from the multiple sensors with the specific fire model, and the fire alarm is more accurate and timely. The proposed method can effectively avoid failures of the monitoring data, robustly deal with conflicting evidence from the multiple sensors and significantly improve the reliability of fire warning.

  17. Spatial Aspects of Multi-Sensor Data Fusion: Aerosol Optical Thickness

    Science.gov (United States)

    Leptoukh, Gregory; Zubko, V.; Gopalan, A.

    2007-01-01

    The Goddard Earth Sciences Data and Information Services Center (GES DISC) investigated the applicability and limitations of combining multi-sensor data through data fusion, to increase the usefulness of the multitude of NASA remote sensing data sets, and as part of a larger effort to integrate this capability in the GES-DISC Interactive Online Visualization and Analysis Infrastructure (Giovanni). This initial study focused on merging daily mean Aerosol Optical Thickness (AOT), as measured by the Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the Terra and Aqua satellites, to increase spatial coverage and produce complete fields to facilitate comparison with models and station data. The fusion algorithm used the maximum likelihood technique to merge the pixel values where available. The algorithm was applied to two regional AOT subsets (with mostly regular and irregular gaps, respectively) and a set of AOT fields that differed only in the size and location of artificially created gaps. The Cumulative Semivariogram (CSV) was found to be sensitive to the spatial distribution of gap areas and, thus, useful for assessing the sensitivity of the fused data to spatial gaps.
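
    For Gaussian errors, a maximum-likelihood merge of two gridded fields reduces to inverse-variance weighting wherever at least one sensor reports a value; a minimal sketch, with assumed per-sensor error variances and NaN marking missing pixels (not the GES DISC implementation), is:

```python
import numpy as np

def merge_aot(aot_terra, aot_aqua, var_terra, var_aqua):
    """Merge two gridded AOT fields (NaN where missing) by inverse-variance weighting."""
    w1 = np.where(np.isnan(aot_terra), 0.0, 1.0 / var_terra)
    w2 = np.where(np.isnan(aot_aqua), 0.0, 1.0 / var_aqua)
    num = np.nan_to_num(aot_terra) * w1 + np.nan_to_num(aot_aqua) * w2
    den = w1 + w2
    # keep NaN where both sensors are missing; merged value elsewhere
    return np.where(den > 0, num / np.maximum(den, 1e-12), np.nan)

# toy usage on a 2x2 grid with complementary gaps
terra = np.array([[0.20, np.nan], [0.40, 0.30]])
aqua  = np.array([[np.nan, 0.25], [0.50, np.nan]])
print(merge_aot(terra, aqua, var_terra=0.01, var_aqua=0.02))
```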

  18. Localization in orchards using Extended Kalman Filter for sensor-fusion - A FroboMind component

    DEFF Research Database (Denmark)

    Christiansen, Martin Peter; Jensen, Kjeld; Ellekilde, Lars-Peter

    Using the detected trees seen in figure 4(b), a localised SLAM map of the surrounding area can be created and used to determine the localisation of the tractor. This kind of sensor fusion is used to keep the amount of prior information about the layout of the orchard to a minimum, so it can be used...

  19. Adaptive LINE-P: An Adaptive Linear Energy Prediction Model for Wireless Sensor Network Nodes.

    Science.gov (United States)

    Ahmed, Faisal; Tamberg, Gert; Le Moullec, Yannick; Annus, Paul

    2018-04-05

    In the context of wireless sensor networks, energy prediction models are increasingly useful tools that can facilitate the power management of wireless sensor network (WSN) nodes. However, most of the existing models suffer from the so-called fixed weighting parameter, which limits their applicability when it comes to, e.g., solar energy harvesters with varying characteristics. Thus, in this article we propose the Adaptive LINE-P (all cases) model, which calculates adaptive weighting parameters based on the stored energy profiles. Furthermore, we also present a profile compression method to reduce the memory requirements. To determine the performance of our proposed model, we have used real data for the solar and wind energy profiles. The simulation results show that our model achieves 90-94% accuracy and that the compression method reduces the memory overhead by 50% as compared to state-of-the-art models.
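
    Purely as an illustration of blending a stored energy profile with the latest observation via an adaptive weight, the following sketch uses a simple accuracy-based weighting (this is a generic stand-in, not the Adaptive LINE-P formula; the profile values and weighting rule are assumptions):

```python
def predict_energy(profile, observed, slot, eps=1e-9):
    """Predict the next slot's harvested energy from a stored profile and the latest sample.

    profile  : list of average harvested energy per time slot (from stored history)
    observed : energy actually harvested in the current slot
    slot     : index of the current slot
    The blending weight rises when the profile matched the current slot closely.
    """
    err = abs(profile[slot] - observed) / (profile[slot] + eps)
    alpha = max(0.0, 1.0 - min(err, 1.0))      # profile trusted more when it was accurate
    nxt = (slot + 1) % len(profile)
    return alpha * profile[nxt] + (1.0 - alpha) * observed

# toy usage with a 4-slot solar profile
print(predict_energy([5.0, 8.0, 12.0, 9.0], observed=7.5, slot=1))
```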

  20. A methodology for hard/soft information fusion in the condition monitoring of aircraft

    Science.gov (United States)

    Bernardo, Joseph T.

    2013-05-01

    Condition-based maintenance (CBM) refers to the philosophy of performing maintenance when the need arises, based upon indicators of deterioration in the condition of the machinery. Traditionally, CBM involves equipping machinery with electronic sensors that continuously monitor components and collect data for analysis. The addition of the multisensory capability of human cognitive functions (i.e., sensemaking, problem detection, planning, adaptation, coordination, naturalistic decision making) to traditional CBM may create a fuller picture of machinery condition. Cognitive systems engineering techniques provide an opportunity to utilize a dynamic resource: people acting as soft sensors. The literature is extensive on techniques to fuse data from electronic sensors, but little work exists on fusing data from humans with that from electronic sensors (i.e., hard/soft fusion). The purpose of my research is to explore, observe, investigate, analyze, and evaluate the fusion of pilot and maintainer knowledge, experiences, and sensory perceptions with digital maintenance resources. Hard/soft information fusion has the potential to increase problem detection capability, improve flight safety, and increase mission readiness. This proposed project consists of the creation of a methodology based upon the Living Laboratories framework, a research methodology built upon cognitive engineering principles [1]. This study performs a critical assessment of the concept, which will support development of activities to demonstrate hard/soft information fusion in operationally relevant scenarios of aircraft maintenance. It consists of fieldwork and knowledge elicitation to inform a simulation and a prototype.

  1. Base isolation technique for tokamak type fusion reactor using adaptive control

    International Nuclear Information System (INIS)

    Koizumi, T.; Tsujiuchi, N.; Kishimoto, F.; Iida, H.; Fujita, T.

    1991-01-01

    In this paper, relating to the isolation device of heavy structures such as a nuclear fusion reactor, a control rule for simultaneously reducing the response acceleration and relative displacement was formulated, and the aseismic performance was improved by employing an adaptive control method that changes the damping factors of the system at every moment. The control rule was studied by computer simulation, and the aseismic effect was evaluated in an experiment employing a scale model. As a result, the following conclusions were obtained. (1) By employing the control rule presented in this paper, both absolute acceleration and relative displacement can be reduced simultaneously without making the system unstable. (2) By introducing this control rule in a scale model representing a Tokamak-type fusion reactor, the response acceleration can be suppressed to 78% and the relative displacement to 79% as compared with the conventional aseismic method. (3) The sensitivities of absolute acceleration and relative displacement with respect to the control gain are not equal. However, by employing the relative weighting factor between the absolute acceleration and relative displacement, it is possible to increase the control capability for any kind of objective structures and appliances. (author)

  2. Detection scheme for a partially occluded pedestrian based on occluded depth in lidar-radar sensor fusion

    Science.gov (United States)

    Kwon, Seong Kyung; Hyun, Eugin; Lee, Jin-Hee; Lee, Jonghun; Son, Sang Hyuk

    2017-11-01

    Object detection is a critical technology for the safety of pedestrians and drivers in autonomous vehicles. Above all, occluded pedestrian detection is still a challenging topic. We propose a new detection scheme for occluded pedestrian detection by means of lidar-radar sensor fusion. In the proposed method, the lidar and radar regions of interest (RoIs) have been selected based on the respective sensor measurement. Occluded depth is a new means to determine whether an occluded target exists or not. The occluded depth is a region projected out by expanding the longitudinal distance while maintaining the angle formed by the outermost two end points of the lidar RoI. The occlusion RoI is the overlapped region made by superimposing the radar RoI and the occluded depth. The object within the occlusion RoI is detected by the radar measurement information, and the occluded object is estimated as a pedestrian based on the human Doppler distribution. Additionally, various experiments are performed in detecting a partially occluded pedestrian in outdoor as well as indoor environments. According to the experimental results, the proposed sensor fusion scheme has much better detection performance than the case without our proposed method.

  3. Optical Communication System for Remote Monitoring and Adaptive Control of Distributed Ground Sensors Exhibiting Collective Intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Cameron, S.M.; Stantz, K.M.; Trahan, M.W.; Wagner, J.S.

    1998-11-01

    Comprehensive management of the battle-space has created new requirements in information management, communication, and interoperability as they affect surveillance and situational awareness. The objective of this proposal is to expand intelligent controls theory to produce a uniquely powerful implementation of distributed ground-based measurement incorporating both local collective behavior and interoperative global optimization for sensor fusion and mission oversight. By using a layered hierarchical control architecture to orchestrate adaptive reconfiguration of autonomous robotic agents, we can improve overall robustness and functionality in dynamic tactical environments without information bottlenecks. In this concept, each sensor is equipped with a miniaturized optical reflectance modulator which is interactively monitored as a remote transponder using a covert laser communication protocol from a remote mothership or operative. Robot data-sharing at the ground level can be leveraged with global evaluation criteria, including terrain overlays and remote imaging data. Information sharing and distributed intelligence open up a new class of remote-sensing applications in which small single-function autonomous observers at the local level can collectively optimize and measure large-scale ground-level signals. As the need for coverage and the number of agents grows to improve spatial resolution, cooperative behavior orchestrated by a global situational awareness umbrella will be an essential ingredient to offset increasing bandwidth requirements within the net. A system of the type described in this proposal will be capable of sensitively detecting, tracking, and mapping spatial distributions of measurement signatures which are non-stationary or obscured by clutter and interfering obstacles by virtue of adaptive reconfiguration. This methodology could be used, for example, to field an adaptive ground-penetrating radar for detection of underground structures in

  4. Noise-exploitation and adaptation in neuromorphic sensors

    Science.gov (United States)

    Hindo, Thamira; Chakrabartty, Shantanu

    2012-04-01

    Even though current micro-nano fabrication technology has reached integration levels where ultra-sensitive sensors can be fabricated, the sensing performance (resolution per joule) of synthetic systems is still orders of magnitude inferior to that observed in neurobiology. For example, the filiform hairs in crickets operate at fundamental limits of noise; auditory sensors in a parasitoid fly can overcome fundamental limitations to precisely localize ultra-faint acoustic signatures. Even though many of these biological marvels have served as inspiration for different types of neuromorphic sensors, the main focus of these designs has been to faithfully replicate the biological functionalities, without considering the constructive role of "noise". In man-made sensors, device and sensor noise are typically considered a nuisance, whereas in neurobiology "noise" has been shown to be a computational aid that enables biology to sense and operate at fundamental limits of energy efficiency and performance. In this paper, we describe some of the important noise-exploitation and adaptation principles observed in neurobiology and how they can be systematically used for designing neuromorphic sensors. Our focus will be on two types of noise-exploitation principles, namely, (a) stochastic resonance; and (b) noise-shaping, which are unified within our previously reported framework called ΣΔ learning. As a case study, we describe the application of ΣΔ learning for the design of a miniature acoustic source localizer whose performance matches that of its biological counterpart (Ormia ochracea).

  5. Multi-rate cubature Kalman filter based data fusion method with residual compensation to adapt to sampling rate discrepancy in attitude measurement system.

    Science.gov (United States)

    Guo, Xiaoting; Sun, Changku; Wang, Peng

    2017-08-01

    This paper investigates the multi-rate inertial and vision data fusion problem in nonlinear attitude measurement systems, where the sampling rate of the inertial sensor is much faster than that of the vision sensor. To fully exploit the high-frequency inertial data and obtain favorable fusion results, a multi-rate CKF (Cubature Kalman Filter) algorithm with estimated residual compensation is proposed in order to adapt to the problem of sampling rate discrepancy. During the inter-sample periods of the slow observation data, the observation noise can be regarded as infinite. The Kalman gain is unknown and approaches zero, and the residual is also unknown. Therefore, the filter's estimated state cannot be compensated. To obtain compensation at these moments, the state error and residual formulas are modified relative to the moments when observation data are available. A self-propagation equation of the state error is established to propagate this quantity from the moments with observations to the moments without. In addition, a multiplicative adjustment factor is introduced as the Kalman gain acting on the residual. Then the filter's estimated state can be compensated even when there are no visual observation data. The proposed method is tested and verified in a practical setup. Compared with the multi-rate CKF without residual compensation and the single-rate CKF, a significant improvement is obtained on attitude measurement by using the proposed multi-rate CKF with inter-sampling residual compensation.
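
    As a rough illustration of the multi-rate setting (not the cited CKF with residual compensation), the toy filter below propagates the state with a fast inertial-like velocity measurement every step and fuses a slow vision-like position observation only every tenth step; all noise levels are assumed values.

```python
import numpy as np

def multirate_kf(fast_vel, slow_pos, ratio=10, dt=0.01, q=1e-3, r_slow=0.5):
    """Toy 1-D multi-rate filter: a fast (inertial-like) velocity measurement
    drives the prediction every step, while a slow (vision-like) position
    observation is fused only every `ratio`-th step."""
    x = np.zeros(2)                      # state: [position, velocity]
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.eye(2)
    estimates = []
    for k, v in enumerate(fast_vel):
        x[1] = v                         # inject the fast velocity measurement
        x = F @ x                        # predict
        P = F @ P @ F.T + Q
        if k % ratio == 0 and k // ratio < len(slow_pos):
            innov = slow_pos[k // ratio] - x[0]    # position innovation
            S = P[0, 0] + r_slow
            K = P[:, 0] / S                        # Kalman gain for H = [1, 0]
            x = x + K * innov
            P = P - np.outer(K, P[0, :])
        estimates.append(x[0])
    return np.array(estimates)

rng = np.random.default_rng(0)
vel = 1.0 + 0.05 * rng.standard_normal(50)                 # ~1 m/s at the fast rate
pos = np.arange(5) * 0.1 + 0.02 * rng.standard_normal(5)   # slow position fixes
print(multirate_kf(vel, pos)[:5])
```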

  6. Finite Element Modelling of a Field-Sensed Magnetic Suspended System for Accurate Proximity Measurement Based on a Sensor Fusion Algorithm with Unscented Kalman Filter.

    Science.gov (United States)

    Chowdhury, Amor; Sarjaš, Andrej

    2016-09-15

    The presented paper describes accurate distance measurement for a field-sensed magnetic suspension system. The proximity measurement is based on a Hall effect sensor. The proximity sensor is installed directly on the lower surface of the electro-magnet, which means that it is very sensitive to external magnetic influences and disturbances. External disturbances interfere with the information signal and reduce the usability and reliability of the proximity measurements and, consequently, the whole application operation. A sensor fusion algorithm is deployed for the aforementioned reasons. The sensor fusion algorithm is based on the Unscented Kalman Filter, where a nonlinear dynamic model was derived with the Finite Element Modelling approach. The advantage of such modelling is a more accurate dynamic model parameter estimation, especially in the case when the real structure, materials and dimensions of the real-time application are known. The novelty of the paper is the design of a compact electro-magnetic actuator with a built-in low cost proximity sensor for accurate proximity measurement of the magnetic object. The paper successively presents a modelling procedure with the finite element method, design and parameter settings of a sensor fusion algorithm with Unscented Kalman Filter and, finally, the implementation procedure and results of real-time operation.

  7. A Neural Network Approach for Building An Obstacle Detection Model by Fusion of Proximity Sensors Data

    Science.gov (United States)

    Peralta, Emmanuel; Vargas, Héctor; Hermosilla, Gabriel

    2018-01-01

    Proximity sensors are broadly used in mobile robots for obstacle detection. The traditional calibration process of this kind of sensor could be a time-consuming task because it is usually done by identification in a manual and repetitive way. The resulting obstacle detection models are usually nonlinear functions that can be different for each proximity sensor attached to the robot. In addition, the model is highly dependent on the type of sensor (e.g., ultrasonic or infrared), on changes in light intensity, and on the properties of the obstacle such as shape, colour, and surface texture, among others. That is why in some situations it could be useful to gather all the measurements provided by different kinds of sensors in order to build a unique model that estimates the distances to the obstacles around the robot. This paper presents a novel approach to obtain an obstacle detection model based on the fusion of sensor data and automatic calibration by using artificial neural networks. PMID:29495338
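
    A minimal sketch of the idea, assuming synthetic ultrasonic and infrared response curves and a small scikit-learn MLP rather than the authors' network, is shown below; a single fused regressor replaces the per-sensor calibration curves.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic training data: an ultrasonic and an infrared reading for the same
# obstacle distance, each with its own (made-up) nonlinearity and noise.
rng = np.random.default_rng(0)
true_dist = rng.uniform(0.1, 2.0, size=2000)                             # metres
ultrasonic = true_dist + 0.02 * rng.standard_normal(2000)                # roughly linear
infrared = 1.0 / (true_dist + 0.1) + 0.05 * rng.standard_normal(2000)    # inverse response

X = np.column_stack([ultrasonic, infrared])
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
model.fit(X, true_dist)   # one fused model instead of per-sensor calibration curves

print(model.predict([[0.5, 1.0 / 0.6]]))   # query: readings consistent with ~0.5 m
```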

  8. Secure Adaptive Topology Control for Wireless Ad-Hoc Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yen-Chieh Ouyang

    2010-02-01

    Full Text Available This paper presents a secure decentralized clustering algorithm for wireless ad-hoc sensor networks. The algorithm operates without a centralized controller, operates asynchronously, and does not require that the location of the sensors be known a priori. Based on the cluster-based topology, secure hierarchical communication protocols and dynamic quarantine strategies are introduced to defend against spam attacks, since this type of attack can exhaust the energy of sensor nodes and drastically shorten the lifetime of a sensor network. By adjusting the threshold of the infected percentage of the cluster coverage, our scheme can dynamically coordinate the proportion of the quarantine region and adaptively achieve the cluster control and the neighborhood control of attacks. Simulation results show that the proposed approach is feasible and cost effective for wireless sensor networks.

  9. Accurate positioning of pedestrians in mixed indoor/outdoor settings: A particle filter approach to sensor and map fusion

    DEFF Research Database (Denmark)

    Toftkjær, Thomas

    , through an extensive GNSS measurement campaign. The results of this campaign provide researchers with a foundation for choosing and designing complementary technologies and systems. The contributions in this thesis are novel sensor fusion methods based on particle filters for improved positioning, hence......Pedestrian positioning with full coverage in urban environments is a long-sought-after research goal. This thesis proposes new techniques for handling the challenging task of truly pervasive pedestrian positioning. It shows that through sensor fusion one can both improve accuracy and extend...... the following four aspects all advance this research direction. Firstly, this thesis proposes ProPosition, a system that utilizes GNSS-complementary technologies such as WiFi and Dead Reckoning by Inertial Measurement Units. We show that through ProPosition's use of the probabilistic Bayes filter technique...

  10. Visualization of graphical information fusion results

    Science.gov (United States)

    Blasch, Erik; Levchuk, Georgiy; Staskevich, Gennady; Burke, Dustin; Aved, Alex

    2014-06-01

    Graphical fusion methods are popular for describing distributed sensor applications such as target tracking and pattern recognition. Additional graphical methods include network analysis for social, communications, and sensor management. With the growing availability of various data modalities, graphical fusion methods are widely used to combine data from multiple sensors and modalities. To better understand the usefulness of graph fusion approaches, we address visualization to increase user comprehension of multi-modal data. The paper demonstrates a use case that combines graphs from text reports and target tracks to associate events and activities of interest, with visualization for testing Measures of Performance (MOP) and Measures of Effectiveness (MOE). The analysis includes the presentation of the separate graphs and then graph-fusion visualization for linking network graphs for tracking and classification.

  11. Clay content prediction using on-the-go proximal soil sensor fusion

    DEFF Research Database (Denmark)

    Tabatabai, Salman; Knadel, Maria; Greve, Mogens Humlekrog

    on soil usability, very few studies so far have provided robust and accurate predictions for fields with high clay content variability. An on-the-go multi-sensor platform was used to measure topsoil (25cm) VNIR spectra and temperature as well as electrical conductivity of top 30cm and top 90cm in 5 fields...... least squares regression (PLSR) and support vector machines regression (SVMR) were performed using VNIR spectra, EC and soil temperature as predictors and clay content as the response variable. PLSR and SVMR models were validated using full and 20-segment cross-validation respectively. The results were...... highly accurate with R2 of 0.91 and 0.93, root mean square error (RMSE) of 1.19 and 1.08, and ratio of performance to interquartile range (RPIQ) of 4.6 and 5.1 for PLSR and SVMR respectively. This shows the high potential of on-the-go soil sensor fusion to predict soil clay content and automate...

  12. Reovirus FAST Proteins Drive Pore Formation and Syncytiogenesis Using a Novel Helix-Loop-Helix Fusion-Inducing Lipid Packing Sensor.

    Directory of Open Access Journals (Sweden)

    Jolene Read

    2015-06-01

    Full Text Available Pore formation is the most energy-demanding step during virus-induced membrane fusion, where high curvature of the fusion pore rim increases the spacing between lipid headgroups, exposing the hydrophobic interior of the membrane to water. How protein fusogens breach this thermodynamic barrier to pore formation is unclear. We identified a novel fusion-inducing lipid packing sensor (FLiPS) in the cytosolic endodomain of the baboon reovirus p15 fusion-associated small transmembrane (FAST) protein that is essential for pore formation during cell-cell fusion and syncytiogenesis. NMR spectroscopy and mutational studies indicate the dependence of this FLiPS on a hydrophobic helix-loop-helix structure. Biochemical and biophysical assays reveal the p15 FLiPS preferentially partitions into membranes with high positive curvature, and this partitioning is impeded by bis-ANS, a small molecule that inserts into hydrophobic defects in membranes. Most notably, the p15 FLiPS can be functionally replaced by heterologous amphipathic lipid packing sensors (ALPS), but not by other membrane-interactive amphipathic helices. Furthermore, a previously unrecognized amphipathic helix in the cytosolic domain of the reptilian reovirus p14 FAST protein can functionally replace the p15 FLiPS, and is itself replaceable by a heterologous ALPS motif. Anchored near the cytoplasmic leaflet by the FAST protein transmembrane domain, the FLiPS is perfectly positioned to insert into hydrophobic defects that begin to appear in the highly curved rim of nascent fusion pores, thereby lowering the energy barrier to stable pore formation.

  13. Sensors Fusion based Online Mapping and Features Extraction of Mobile Robot in the Road Following and Roundabout

    International Nuclear Information System (INIS)

    Ali, Mohammed A H; Yussof, Wan Azhar B.; Hamedon, Zamzuri B; Yussof, Zulkifli B.; Majeed, Anwar P P; Mailah, Musa

    2016-01-01

    A road feature extraction based mapping system using a sensor fusion technique for mobile robot navigation in road environments is presented in this paper. The online mapping of the mobile robot is performed continuously in the road environments to find the road properties that enable the robot to move from a certain start position to a pre-determined goal while discovering and detecting roundabouts. The sensor fusion, involving a laser range finder, camera and odometry installed on a new platform, is used to find the path of the robot and localize it within its environment. The local maps are developed using the camera and laser range finder to recognize road border parameters such as road width, curbs and roundabouts. Results show the capability of the robot with the proposed algorithms to effectively identify the road environment and build local maps for road following and roundabouts. (paper)

  14. Reconnaissance blind multi-chess: an experimentation platform for ISR sensor fusion and resource management

    Science.gov (United States)

    Newman, Andrew J.; Richardson, Casey L.; Kain, Sean M.; Stankiewicz, Paul G.; Guseman, Paul R.; Schreurs, Blake A.; Dunne, Jeffrey A.

    2016-05-01

    This paper introduces the game of reconnaissance blind multi-chess (RBMC) as a paradigm and test bed for understanding and experimenting with autonomous decision making under uncertainty and in particular managing a network of heterogeneous Intelligence, Surveillance and Reconnaissance (ISR) sensors to maintain situational awareness informing tactical and strategic decision making. The intent is for RBMC to serve as a common reference or challenge problem in fusion and resource management of heterogeneous sensor ensembles across diverse mission areas. We have defined a basic rule set and a framework for creating more complex versions, developed a web-based software realization to serve as an experimentation platform, and developed some initial machine intelligence approaches to playing it.

  15. Integrated multi-sensor fusion for mapping and localization in outdoor environments for mobile robots

    Science.gov (United States)

    Emter, Thomas; Petereit, Janko

    2014-05-01

    An integrated multi-sensor fusion framework for localization and mapping for autonomous navigation in unstructured outdoor environments based on extended Kalman filters (EKF) is presented. The sensors for localization include an inertial measurement unit, a GPS, a fiber optic gyroscope, and wheel odometry. Additionally, a 3D LIDAR is used for simultaneous localization and mapping (SLAM). A 3D map is built while, concurrently, a localization in the 2D map established so far is estimated with the current scan of the LIDAR. Despite the longer run-time of the SLAM algorithm compared to the EKF update, a high update rate is still guaranteed by carefully joining and synchronizing two parallel localization estimators.
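
    The sketch below shows one predict/update cycle of a planar EKF fusing wheel odometry (prediction) with a GPS position fix (update); it is a simplified stand-in for the full framework, which also fuses an IMU, a fiber optic gyroscope and LIDAR SLAM, and all noise parameters are assumed.

```python
import numpy as np

def ekf_step(x, P, odo, gps, dt=0.1, q=1e-3, r_gps=1.0):
    """One predict/update cycle of a planar EKF.
    State x = [px, py, heading]; odo = (v, omega) from wheel odometry;
    gps = (px, py) measurement. Noise levels are illustrative only."""
    v, w = odo
    px, py, th = x
    # Prediction with a unicycle motion model
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    F = np.array([[1, 0, -v * dt * np.sin(th)],
                  [0, 1,  v * dt * np.cos(th)],
                  [0, 0, 1]])
    P_pred = F @ P @ F.T + q * np.eye(3)
    # GPS update (observes position only)
    H = np.array([[1, 0, 0], [0, 1, 0]])
    S = H @ P_pred @ H.T + r_gps * np.eye(2)
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (np.asarray(gps) - H @ x_pred)
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new

x, P = np.zeros(3), np.eye(3)
x, P = ekf_step(x, P, odo=(1.0, 0.1), gps=(0.12, 0.01))
print(x)
```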

  16. A Combined Approach of Sensor Data Fusion and Multivariate Geostatistics for Delineation of Homogeneous Zones in an Agricultural Field

    Directory of Open Access Journals (Sweden)

    Annamaria Castrignanò

    2017-12-01

    Full Text Available To assess spatial variability at the very fine scale required by Precision Agriculture, different proximal and remote sensors have been used. They provide large amounts and different types of data which need to be combined. An integrated approach, using multivariate geostatistical data-fusion techniques and multi-source geophysical sensor data to determine simple summary scale-dependent indices, is described here. These indices can be used to delineate management zones to be submitted to differential management. Such a data fusion approach with geophysical sensors was applied in a soil of an agronomic field cropped with tomato. The synthetic regionalized factors determined contributed to splitting the 3D edaphic environment into two main horizontal structures with different hydraulic properties and to disclosing two main horizons in the 0–1.0-m depth with a discontinuity probably occurring between 0.40 m and 0.70 m. Comparing this partition with the soil properties measured with a shallow sampling, it was possible to verify the coherence in the topsoil between the dielectric properties and other properties more directly related to agronomic management. These results confirm the advantages of using proximal sensing as a preliminary step in the application of site-specific management. Combining disparate spatial data (data fusion) is not at all a naive problem and novel and powerful methods need to be developed.

  17. A Combined Approach of Sensor Data Fusion and Multivariate Geostatistics for Delineation of Homogeneous Zones in an Agricultural Field.

    Science.gov (United States)

    Castrignanò, Annamaria; Buttafuoco, Gabriele; Quarto, Ruggiero; Vitti, Carolina; Langella, Giuliano; Terribile, Fabio; Venezia, Accursio

    2017-12-03

    To assess spatial variability at the very fine scale required by Precision Agriculture, different proximal and remote sensors have been used. They provide large amounts and different types of data which need to be combined. An integrated approach, using multivariate geostatistical data-fusion techniques and multi-source geophysical sensor data to determine simple summary scale-dependent indices, is described here. These indices can be used to delineate management zones to be submitted to differential management. Such a data fusion approach with geophysical sensors was applied in a soil of an agronomic field cropped with tomato. The synthetic regionalized factors determined contributed to splitting the 3D edaphic environment into two main horizontal structures with different hydraulic properties and to disclosing two main horizons in the 0-1.0-m depth with a discontinuity probably occurring between 0.40 m and 0.70 m. Comparing this partition with the soil properties measured with a shallow sampling, it was possible to verify the coherence in the topsoil between the dielectric properties and other properties more directly related to agronomic management. These results confirm the advantages of using proximal sensing as a preliminary step in the application of site-specific management. Combining disparate spatial data (data fusion) is not at all a naive problem and novel and powerful methods need to be developed.

  18. Sensor fusion: lane marking detection and autonomous intelligent cruise control system

    Science.gov (United States)

    Baret, Marc; Baillarin, S.; Calesse, C.; Martin, Lionel

    1995-12-01

    In the past few years MATRA and RENAULT have developed an Autonomous Intelligent Cruise Control (AICC) system based on a LIDAR sensor. This sensor, incorporating a charge-coupled device, was designed to acquire pulsed laser diode emission reflected by standard car reflectors. The absence of moving mechanical parts, the large field of view, the high measurement rate and the very good accuracy for distance range and angular position of targets make this sensor very interesting. It provides the equipped car with the distance and the relative speed of other vehicles, enabling the safety distance to be controlled by acting on the throttle and the automatic gear box. Experiments in various real traffic situations have shown the limitations of this kind of system, especially on bends. All AICC sensors are unable to distinguish between a bend and a change of lane. This is easily understood if we consider a road without lane markings. This fact has led MATRA to improve its AICC system by providing the lane marking information. Also in the scope of the EUREKA PROMETHEUS project, MATRA and RENAULT have developed a lane keeping system in order to warn of the driver's lack of vigilance. Thus, MATRA has extended this system to far-field lane marking detection and has coupled it with the AICC system. Experiments will be carried out on roads to estimate the gain in performance and comfort due to this fusion.

  19. Multistream sensor fusion-based prognostics model for systems with single failure modes

    International Nuclear Information System (INIS)

    Fang, Xiaolei; Paynabar, Kamran; Gebraeel, Nagi

    2017-01-01

    Advances in sensor technology have facilitated the capability of monitoring the degradation of complex engineering systems through the analysis of multistream degradation signals. However, the varying levels of correlation with the physical degradation process for different sensors, the high dimensionality of the degradation signals, and the cross-correlation among different signal streams pose significant challenges in monitoring and prognostics of such systems. To address the foregoing challenges, we develop a three-step multi-sensor prognostic methodology that utilizes multistream signals to predict residual useful lifetimes of partially degraded systems. We first identify the informative sensors via penalized (log)-location-scale regression. Then, we fuse the degradation signals of the informative sensors using multivariate functional principal component analysis, which is capable of modeling the cross-correlation of signal streams. Finally, the third step focuses on utilizing the fused signal features for prognostics via adaptive penalized (log)-location-scale regression. We validate our multi-sensor prognostic methodology using a simulation study as well as a case study of aircraft turbofan engines available from the NASA repository.

  20. Attack Detection/Isolation via a Secure Multisensor Fusion Framework for Cyberphysical Systems

    Directory of Open Access Journals (Sweden)

    Arash Mohammadi

    2018-01-01

    Full Text Available Motivated by rapid growth of cyberphysical systems (CPSs and the necessity to provide secure state estimates against potential data injection attacks in their application domains, the paper proposes a secure and innovative attack detection and isolation fusion framework. The proposed multisensor fusion framework provides secure state estimates by using ideas from interactive multiple models (IMM combined with a novel fuzzy-based attack detection/isolation mechanism. The IMM filter is used to adjust the system’s uncertainty adaptively via model probabilities by using a hybrid state model consisting of two behaviour modes, one corresponding to the ideal scenario and one associated with the attack behaviour mode. The state chi-square test is then incorporated through the proposed fuzzy-based fusion framework to detect and isolate potential data injection attacks. In other words, the validation probability of each sensor is calculated based on the value of the chi-square test. Finally, by incorporation of the validation probability of each sensor, the weights of its associated subsystem are computed. To be concrete, an integrated navigation system is simulated with three types of attacks ranging from a constant bias attack to a non-Gaussian stochastic attack to evaluate the proposed attack detection and isolation fusion framework.

  1. Inductive learning as a fusion engine for mine detection

    Energy Technology Data Exchange (ETDEWEB)

    Kercel, S.W.; Dress, W.B.

    1997-08-01

    Semiotics is defined by some researchers as "the study of the appearance (visual or otherwise), meaning, and use of symbols and symbol systems." Semiotic fusion of data from multiple sensory sources is a potential solution to the problem of landmine detection. This turns out to be significant, because notwithstanding the diversity of sensor technologies being used to attack the problem, there is no single effective landmine sensor technology. The only practical, general-purpose mine detector presently available is the trained dog. Most research into mine-detection technology seeks to emulate the dog's seemingly uncanny abilities. An ideal data-fusion system would mimic animal reaction, with the brain's perceptive power melding multiple sensory cues into an awareness of the size and location of a mine. Furthermore, the fusion process should be adaptive, with the skill at combining cues into awareness improving with experience. Electronic data-fusion systems reported in the countermine literature use conventional vector-based pattern recognition methods. Although neural nets are popular, they have never satisfactorily met the challenge. Despite years of investigation, nobody has ever found a vector space representation that reliably characterizes mine identity. This strongly suggests that the features have not been found because researchers have been looking for the wrong characteristics. It is worth considering that dogs probably do not represent data as mathematical number lists, but they almost certainly represent data via semiotic structures.

  2. Detection of water-quality contamination events based on multi-sensor fusion using an extended Dempster–Shafer method

    International Nuclear Information System (INIS)

    Hou, Dibo; He, Huimei; Huang, Pingjie; Zhang, Guangxin; Loaiciga, Hugo

    2013-01-01

    This study presents a method for detecting contamination events of sources of drinking water based on the Dempster–Shafer (D-S) evidence theory. The detection method has the purpose of protecting water supply systems against accidental and intentional contamination events. This purpose is achieved by first predicting future water-quality parameters using an autoregressive (AR) model. The AR model predicts future water-quality parameters using recent measurements of these parameters made with automated (on-line) water-quality sensors. Next, a probabilistic method assigns probabilities to the time series of residuals formed by comparing predicted water-quality parameters with threshold values. Finally, the D-S fusion method searches for anomalous probabilities of the residuals and uses the result of that search to determine whether the current water quality is normal (that is, free of pollution) or contaminated. The D-S fusion method is extended and improved in this paper by weighted averaging of water-contamination evidence and by the analysis of the persistence of anomalous probabilities of water-quality parameters. The extended D-S fusion method makes determinations that have a high probability of being correct concerning whether or not a source of drinking water has been contaminated. This paper's method for detecting water-contamination events was tested with water-quality time series from automated (on-line) water quality sensors. In addition, a small-scale, experimental, water-pipe network was tested to detect water-contamination events. The two tests demonstrated that the extended D-S fusion method achieves a low false alarm rate and high probabilities of detecting water contamination events. (paper)
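
    The core of any Dempster–Shafer detector is the combination rule itself; the sketch below implements plain Dempster combination for two bodies of evidence over a {normal, contaminated} frame, with illustrative masses. The paper's extensions (weighted averaging of evidence and persistence analysis) are not reproduced.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.
    Focal elements are frozensets over the frame of discernment; conflicting
    mass (empty intersections) is removed and the rest renormalised."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

N, C = frozenset({"normal"}), frozenset({"contaminated"})
THETA = N | C   # the full frame expresses ignorance

# Evidence from two water-quality parameters (illustrative masses)
m_ph     = {N: 0.6, C: 0.3, THETA: 0.1}
m_turbid = {N: 0.2, C: 0.7, THETA: 0.1}
print(dempster_combine(m_ph, m_turbid))
```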

  3. Communal Sensor Network for Adaptive Noise Reduction in Aircraft Engine Nacelles

    Science.gov (United States)

    Jones, Kennie H.; Nark, Douglas M.; Jones, Michael G.

    2011-01-01

    Emergent behavior, a subject of much research in biology, sociology, and economics, is a foundational element of Complex Systems Science and is apropos in the design of sensor network systems. To demonstrate engineering for emergent behavior, a novel approach in the design of a sensor/actuator network is presented maintaining optimal noise attenuation as an adaptation to changing acoustic conditions. Rather than use the conventional approach where sensors are managed by a central controller, this new paradigm uses a biomimetic model where sensor/actuators cooperate as a community of autonomous organisms, sharing with neighbors to control impedance based on local information. From the combination of all individual actions, an optimal attenuation emerges for the global system.

  4. Real-Time Motion Tracking for Mobile Augmented/Virtual Reality Using Adaptive Visual-Inertial Fusion.

    Science.gov (United States)

    Fang, Wei; Zheng, Lianyu; Deng, Huanjun; Zhang, Hongbo

    2017-05-05

    In mobile augmented/virtual reality (AR/VR), real-time 6-Degree of Freedom (DoF) motion tracking is essential for the registration between virtual scenes and the real world. However, due to the limited computational capacity of mobile terminals today, the latency between consecutive arriving poses would damage the user experience in mobile AR/VR. Thus, a visual-inertial based real-time motion tracking for mobile AR/VR is proposed in this paper. By means of high frequency and passive outputs from the inertial sensor, the real-time performance of arriving poses for mobile AR/VR is achieved. In addition, to alleviate the jitter phenomenon during the visual-inertial fusion, an adaptive filter framework is established to cope with different motion situations automatically, enabling the real-time 6-DoF motion tracking by balancing the jitter and latency. Besides, the robustness of the traditional visual-only based motion tracking is enhanced, giving rise to a better mobile AR/VR performance when motion blur is encountered. Finally, experiments are carried out to demonstrate the proposed method, and the results show that this work is capable of providing a smooth and robust 6-DoF motion tracking for mobile AR/VR in real-time.
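
    A much-reduced illustration of adaptive visual-inertial blending is given below: a one-axis complementary filter whose correction gain is lowered during fast motion and raised when nearly static. The gains, motion threshold and signals are assumptions, not the filter framework of the paper.

```python
import numpy as np

def adaptive_complementary(gyro_rate, vision_angle, dt=0.005,
                           k_static=0.10, k_dynamic=0.02):
    """Fuse a high-rate gyroscope with low-rate, laggy vision for one rotation
    axis. The correction gain is reduced during fast motion (trust inertia,
    avoid jitter) and raised when nearly static (pull back drift)."""
    angle = 0.0
    fused = []
    for w, a_vis in zip(gyro_rate, vision_angle):
        angle += w * dt                              # integrate gyro (fast path)
        k = k_dynamic if abs(w) > 1.0 else k_static  # adapt gain to motion level
        if not np.isnan(a_vis):                      # vision sample available
            angle += k * (a_vis - angle)             # pull toward vision estimate
        fused.append(angle)
    return np.array(fused)

t = np.arange(0, 1, 0.005)
true = 0.5 * np.sin(2 * np.pi * t)
gyro = np.gradient(true, 0.005) + 0.05 * np.random.randn(t.size) + 0.02   # noisy, biased
vision = np.where(np.arange(t.size) % 40 == 0,
                  true + 0.01 * np.random.randn(t.size), np.nan)          # sparse vision
print(adaptive_complementary(gyro, vision)[:5])
```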

  5. Accurate 3D Positioning for a Mobile Platform in Non-Line-of-Sight Scenarios Based on IMU/Magnetometer Sensor Fusion.

    Science.gov (United States)

    Hellmers, Hendrik; Kasmi, Zakaria; Norrdine, Abdelmoumen; Eichhorn, Andreas

    2018-01-04

    In recent years, a variety of real-time applications benefit from services provided by localization systems due to the advent of sensing and communication technologies. Since the Global Navigation Satellite System (GNSS) enables localization only outside buildings, applications for indoor positioning and navigation use alternative technologies. Ultra Wide Band Signals (UWB), Wireless Local Area Network (WLAN), ultrasonic or infrared are common examples. However, these technologies suffer from fading and multipath effects caused by objects and materials in the building. In contrast, magnetic fields are able to pass through obstacles without significant propagation errors, i.e. in Non-Line of Sight Scenarios (NLoS). The aim of this work is to propose a novel indoor positioning system based on artificially generated magnetic fields in combination with Inertial Measurement Units (IMUs). In order to reach a better coverage, multiple coils are used as reference points. A basic algorithm for three-dimensional applications is demonstrated as well as evaluated in this article. The established system is then realized by a sensor fusion principle as well as a kinematic motion model on the basis of a Kalman filter. Furthermore, a pressure sensor is used in combination with an adaptive filtering method to reliably estimate the platform's altitude.

  6. Real-time classification and sensor fusion with a spiking deep belief network.

    Science.gov (United States)

    O'Connor, Peter; Neil, Daniel; Liu, Shih-Chii; Delbruck, Tobi; Pfeiffer, Michael

    2013-01-01

    Deep Belief Networks (DBNs) have recently shown impressive performance on a broad range of classification problems. Their generative properties allow better understanding of the performance, and provide a simpler solution for sensor fusion tasks. However, because of their inherent need for feedback and parallel update of large numbers of units, DBNs are expensive to implement on serial computers. This paper proposes a method based on the Siegert approximation for Integrate-and-Fire neurons to map an offline-trained DBN onto an efficient event-driven spiking neural network suitable for hardware implementation. The method is demonstrated in simulation and by a real-time implementation of a 3-layer network with 2694 neurons used for visual classification of MNIST handwritten digits with input from a 128 × 128 Dynamic Vision Sensor (DVS) silicon retina, and sensory-fusion using additional input from a 64-channel AER-EAR silicon cochlea. The system is implemented through the open-source software in the jAER project and runs in real-time on a laptop computer. It is demonstrated that the system can recognize digits in the presence of distractions, noise, scaling, translation and rotation, and that the degradation of recognition performance by using an event-based approach is less than 1%. Recognition is achieved in an average of 5.8 ms after the onset of the presentation of a digit. By cue integration from both silicon retina and cochlea outputs we show that the system can be biased to select the correct digit from otherwise ambiguous input.

  7. Modeling for deformable mirrors and the adaptive optics optimization program

    International Nuclear Information System (INIS)

    Henesian, M.A.; Haney, S.W.; Trenholme, J.B.; Thomas, M.

    1997-01-01

    We discuss aspects of adaptive optics optimization for large fusion laser systems such as the 192-arm National Ignition Facility (NIF) at LLNL. By way of example, we considered the discrete actuator deformable mirror and Hartmann sensor system used on the Beamlet laser. Beamlet is a single-aperture prototype of the 11-0-5 slab amplifier design for NIF, and so we expect similar optical distortion levels and deformable mirror correction requirements. We are now in the process of developing a numerically efficient object oriented C++ language implementation of our adaptive optics and wavefront sensor code, but this code is not yet operational. Results are based instead on the prototype algorithms, coded-up in an interpreted array processing computer language

  8. Estimation of tool wear during CNC milling using neural network-based sensor fusion

    Science.gov (United States)

    Ghosh, N.; Ravi, Y. B.; Patra, A.; Mukhopadhyay, S.; Paul, S.; Mohanty, A. R.; Chattopadhyay, A. B.

    2007-01-01

    Cutting tool wear degrades the product quality in manufacturing processes. Monitoring the tool wear value online is therefore needed to prevent degradation in machining quality. Unfortunately, there is no direct way of measuring the tool wear online. Therefore, one has to adopt an indirect method wherein the tool wear is estimated from several sensors measuring related process variables. In this work, a neural network-based sensor fusion model has been developed for tool condition monitoring (TCM). Features extracted from a number of machining zone signals, namely cutting forces, spindle vibration, spindle current, and sound pressure level, have been fused to estimate the average flank wear of the main cutting edge. Novel strategies such as signal-level segmentation for temporal registration, feature-space filtering, outlier removal, and estimation-space filtering have been proposed. The proposed approach has been validated by both laboratory and industrial implementations.

  9. Energy Logic (EL): a novel fusion engine of multi-modality multi-agent data/information fusion for intelligent surveillance systems

    Science.gov (United States)

    Rababaah, Haroun; Shirkhodaie, Amir

    2009-04-01

    Rapidly advancing hardware technology, smart sensors, and sensor networks are advancing environmental sensing. One major potential of this technology is Large-Scale Surveillance Systems (LS3), especially for homeland security, battlefield intelligence, facility guarding, and other civilian applications. The efficient and effective deployment of LS3 requires addressing a number of aspects impacting the scalability of such systems. The scalability factors are related to: computation and memory utilization efficiency, communication bandwidth utilization, network topology (e.g., centralized, ad-hoc, hierarchical or hybrid), network communication protocol and data routing schemes; and local and global data/information fusion schemes for situational awareness. Although many models have been proposed to address one aspect or another of these issues, few have addressed the need for a multi-modality multi-agent data/information fusion that has characteristics satisfying the requirements of current and future intelligent sensors and sensor networks. In this paper, we have presented a novel scalable fusion engine for multi-modality multi-agent information fusion for LS3. The new fusion engine is based on a concept we call Energy Logic. Experimental results of this work, as compared to a fuzzy logic model, strongly supported the validity of the new model and inspired future directions for different levels of fusion and different applications.

  10. Block Fusion on Dynamically Adaptive Spacetree Grids for Shallow Water Waves

    KAUST Repository

    Weinzierl, Tobias

    2014-09-01

    © 2014 World Scientific Publishing Company. Spacetrees are a popular formalism to describe dynamically adaptive Cartesian grids. Even though they directly yield a mesh, it is often computationally reasonable to embed regular Cartesian blocks into their leaves. This promotes stencils working on homogeneous data chunks. The choice of a proper block size is sensitive. While large block sizes foster loop parallelism and vectorisation, they restrict the adaptivity's granularity and hence increase the memory footprint and lower the numerical accuracy per byte. In the present paper, we therefore use a multiscale spacetree-block coupling admitting blocks on all spacetree nodes. We propose to find sets of blocks on the finest scale throughout the simulation and to replace them by fused big blocks. Such a replacement strategy can pick up hardware characteristics, i.e. which block size yields the highest throughput, while the dynamic adaptivity of the fine grid mesh is not constrained - applications can work with fine granular blocks. We study the fusion with a state-of-the-art shallow water solver on an Intel Sandy Bridge and a Xeon Phi processor, where we anticipate their reaction to selected block optimisation and vectorisation.

  11. FuzzyFusion: an application architecture for multisource information fusion

    Science.gov (United States)

    Fox, Kevin L.; Henning, Ronda R.

    2009-04-01

    The correlation of information from disparate sources has long been an issue in data fusion research. Traditional data fusion addresses the correlation of information from sources as diverse as single-purpose sensors and all-source multi-media information. Information system vulnerability information is similar in its diversity of sources and content, and in the desire to draw a meaningful conclusion, namely, the security posture of the system under inspection. FuzzyFusion™, a data fusion model being applied to the computer network operations domain, is presented. This model has been successfully prototyped in an applied research environment and represents a next-generation assurance tool for system and network security.

  12. Sensor Fusion-based Event Detection in Wireless Sensor Networks

    NARCIS (Netherlands)

    Bahrepour, M.; Meratnia, Nirvana; Havinga, Paul J.M.

    2009-01-01

    Recently, the Wireless Sensor Networks (WSN) community has witnessed a shift in application focus. Although monitoring was the initial application of wireless sensor networks, in-network data processing and (near) real-time actuation capability have made wireless sensor networks a suitable candidate for

  13. An airport surface surveillance solution based on fusion algorithm

    Science.gov (United States)

    Liu, Jianliang; Xu, Yang; Liang, Xuelin; Yang, Yihuang

    2017-01-01

    In this paper, we propose an airport surface surveillance solution combining Multilateration (MLAT) and Automatic Dependent Surveillance-Broadcast (ADS-B). The moving target to be monitored is regarded as a linear stochastic hybrid system moving freely, and each surveillance technology is simplified as a sensor with white Gaussian noise. The dynamic model of the target and the observation model of each sensor are established in this paper. The measurements of the sensors are filtered by estimators to obtain the estimation results for the current time. Then, we analyze the characteristics of the two proposed fusion solutions and decide to use the scheme based on sensor estimation fusion for our surveillance solution. In the proposed fusion algorithm, according to the output of the estimators, the estimation error is quantified and the fusion weight of each sensor is calculated. The two estimation results are fused with these weights, and the position estimate of the target is computed accurately. Finally, the proposed solution and algorithm are validated by an illustrative target tracking simulation.
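
    The estimation-fusion step can be illustrated by weighting each sensor's estimate inversely to its quantified error, as sketched below; the variances and positions are illustrative, and the exact weight computation of the cited algorithm may differ.

```python
import numpy as np

def fuse_estimates(x_mlat, var_mlat, x_adsb, var_adsb):
    """Fuse two position estimates of the same target with weights inversely
    proportional to their quantified estimation error (variance). Returns the
    fused estimate and its variance."""
    w_mlat = 1.0 / var_mlat
    w_adsb = 1.0 / var_adsb
    x_fused = (w_mlat * np.asarray(x_mlat) + w_adsb * np.asarray(x_adsb)) / (w_mlat + w_adsb)
    var_fused = 1.0 / (w_mlat + w_adsb)
    return x_fused, var_fused

# Surface target position estimates in metres (illustrative values)
pos, var = fuse_estimates(x_mlat=[1203.0, 886.0], var_mlat=25.0,
                          x_adsb=[1199.0, 890.0], var_adsb=100.0)
print(pos, var)
```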

  14. Model wavefront sensor for adaptive confocal microscopy

    Science.gov (United States)

    Booth, Martin J.; Neil, Mark A. A.; Wilson, Tony

    2000-05-01

    A confocal microscope permits 3D imaging of volume objects by the inclusion of a pinhole in the detector path which eliminates out of focus light. This configuration is however very sensitive to aberrations induced by the specimen or the optical system and would therefore benefit from an adaptive optics approach. We present a wavefront sensor capable of measuring directly the Zernike components of an aberrated wavefront and show that it is particularly applicable to the confocal microscope since only those wavefronts originating in the focal region contribute to the measured aberration.

  15. Adaptive Media Access Control for Energy Harvesting - Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Fafoutis, Xenofon; Dragoni, Nicola

    2012-01-01

    ODMAC (On-Demand Media Access Control) is a recently proposed MAC protocol designed to support individual duty cycles for Energy Harvesting — Wireless Sensor Networks (EH-WSNs). Individual duty cycles are vital for EH-WSNs, because they allow nodes to adapt their energy consumption to the ever-ch...

  16. Adaptive Home System Using Wireless Sensor Network And Multi Agent System

    OpenAIRE

    Jayarani Kamble; Prof.Nandini Dhole

    2014-01-01

    The Smart Home is a continuously growing technology that incorporates a number of new technologies to help improve people's quality of living. This paper proposes an adaptive home system for optimum utilization of power through Artificial Intelligence and Wireless Sensor Networks. Artificial Intelligence is a technology for developing adaptive systems that can perceive the environment, learn from it, and make decisions using a rule-based system. Zigbee is a w...

  17. An Embodied Multi-Sensor Fusion Approach to Visual Motion Estimation Using Unsupervised Deep Networks.

    Science.gov (United States)

    Shamwell, E Jared; Nothwang, William D; Perlis, Donald

    2018-05-04

    Aimed at improving size, weight, and power (SWaP)-constrained robotic vision-aided state estimation, we describe our unsupervised, deep convolutional-deconvolutional sensor fusion network, Multi-Hypothesis DeepEfference (MHDE). MHDE learns to intelligently combine noisy heterogeneous sensor data to predict several probable hypotheses for the dense, pixel-level correspondence between a source image and an unseen target image. We show how our multi-hypothesis formulation provides increased robustness against dynamic, heteroscedastic sensor and motion noise by computing hypothesis image mappings and predictions at 76–357 Hz depending on the number of hypotheses being generated. MHDE fuses noisy, heterogeneous sensory inputs using two parallel, inter-connected architectural pathways and n (1–20 in this work) multi-hypothesis generating sub-pathways to produce n global correspondence estimates between a source and a target image. We evaluated MHDE on the KITTI Odometry dataset and benchmarked it against the vision-only DeepMatching and Deformable Spatial Pyramids algorithms and were able to demonstrate a significant runtime decrease and a performance increase compared to the next-best performing method.

  18. An ADC-free adaptive interface circuit of resistive sensor for electronic nose system.

    Science.gov (United States)

    Chang, Chia-Lin; Chiu, Shih-Wen; Tang, Kea-Tiong

    2013-01-01

    The initial resistance of chemiresistive gas sensors can be affected by temperature, humidity, and background odors. In a sensing system, the traditional interface circuit always requires an ADC to convert the analog signal to a digital signal. In this paper, we propose an ADC-free adaptive interface circuit for a resistive gas sensor to read the sensor signal and cancel baseline drift. Furthermore, methanol was used to test the proposed interface circuit, which was connected to a FIGARO® gas sensor. This circuit was fabricated in a TSMC 0.18 µm CMOS process and consumes 86.41 µW under a 1 V supply voltage.

  19. Adaptation of sensor morphology: an integrative view of perception from biologically inspired robotics perspective

    Science.gov (United States)

    Nurzaman, Surya G.

    2016-01-01

    Sensor morphology, the morphology of a sensing mechanism that shapes the desired response to physical stimuli from the surroundings into signals usable as sensory information, is one of the key common aspects of sensing processes. This paper presents a structured review of research on bioinspired sensor morphology implemented in robotic systems and discusses the fundamental design principles. Based on a literature review, we propose two key arguments: first, owing to its synthetic nature, the biologically inspired robotics approach is a unique and powerful methodology for understanding the role of sensor morphology and how it can evolve and adapt to its task and environment. Second, a consideration of an integrative view of perception, by looking into multidisciplinary and overarching mechanisms of sensor morphology adaptation across biology and engineering, enables us to extract relevant design principles that are important to extend our understanding of the unfinished concepts in sensing and perception. PMID:27499843

  20. Distributed adaptive diagnosis of sensor faults using structural response data

    Science.gov (United States)

    Dragos, Kosmas; Smarsly, Kay

    2016-10-01

    The reliability and consistency of wireless structural health monitoring (SHM) systems can be compromised by sensor faults, leading to miscalibrations, corrupted data, or even data loss. Several research approaches towards fault diagnosis, referred to as ‘analytical redundancy’, have been proposed that analyze the correlations between different sensor outputs. In wireless SHM, most analytical redundancy approaches require centralized data storage on a server for data analysis, while other approaches exploit the on-board computing capabilities of wireless sensor nodes, analyzing the raw sensor data directly on board. However, using raw sensor data poses an operational constraint due to the limited power resources of wireless sensor nodes. In this paper, a new distributed autonomous approach towards sensor fault diagnosis based on processed structural response data is presented. The inherent correlations among Fourier amplitudes of acceleration response data, at peaks corresponding to the eigenfrequencies of the structure, are used for diagnosis of abnormal sensor outputs at a given structural condition. Representing an entirely data-driven analytical redundancy approach that does not require any a priori knowledge of the monitored structure or of the SHM system, artificial neural networks (ANN) are embedded into the sensor nodes enabling cooperative fault diagnosis in a fully decentralized manner. The distributed analytical redundancy approach is implemented into a wireless SHM system and validated in laboratory experiments, demonstrating the ability of wireless sensor nodes to self-diagnose sensor faults accurately and efficiently with minimal data traffic. Besides enabling distributed autonomous fault diagnosis, the embedded ANNs are able to adapt to the actual condition of the structure, thus ensuring accurate and efficient fault diagnosis even in case of structural changes.
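
    A minimal stand-in for the embedded analytical-redundancy check is sketched below: a small neural network learns the healthy correlation between a node's peak Fourier amplitudes and those of its neighbours, and flags a fault when the residual exceeds a threshold. The data, network size and threshold rule are assumptions, not the embedded ANN design of the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Peak Fourier amplitudes at the structure's eigenfrequencies, as reported by
# three neighbouring nodes (inputs) and the node being checked (target).
rng = np.random.default_rng(1)
neighbours = rng.uniform(0.5, 2.0, size=(500, 3))
own = neighbours @ np.array([0.4, 0.35, 0.25]) + 0.02 * rng.standard_normal(500)

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=1)
net.fit(neighbours, own)                                   # learn the healthy correlation
threshold = 3 * np.std(own - net.predict(neighbours))      # residual-based fault threshold

def diagnose(neigh_peaks, own_peak):
    """Flag a sensor fault when the node's reported peak amplitude deviates
    from the value predicted from its neighbours by more than the threshold."""
    residual = abs(own_peak - net.predict([neigh_peaks])[0])
    return "faulty" if residual > threshold else "healthy"

print(diagnose([1.0, 1.2, 0.9], 0.4 * 1.0 + 0.35 * 1.2 + 0.25 * 0.9))  # consistent reading
print(diagnose([1.0, 1.2, 0.9], 2.5))                                   # abnormal reading
```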

  1. Anti-Personnel Landmine detection using depth fusion

    NARCIS (Netherlands)

    Schutte, K.; Cremer, F.; Breejen, E. den; Schavemaker, J.G.M.; Benoist, K.W.

    2001-01-01

    Detection of Anti-Personnel Landmines is a challenging task for any sensor currently available. Using sensor fusion allows combining individual sensor results such that for each sensor its advantages remain while compensating for its disadvantages by using other sensor types. This paper provides an

  2. Multisensor Fusion for Change Detection

    Science.gov (United States)

    Schenk, T.; Csatho, B.

    2005-12-01

    Combining sensors that record different properties of a 3-D scene leads to complementary and redundant information. If fused properly, a more robust and complete scene description becomes available. Moreover, fusion facilitates automatic procedures for object reconstruction and modeling. For example, aerial imaging sensors, hyperspectral scanning systems, and airborne laser scanning systems generate complementary data. We describe how data from these sensors can be fused for such diverse applications as mapping surface erosion and landslides, reconstructing urban scenes, monitoring urban land use and urban sprawl, and deriving velocities and surface changes of glaciers and ice sheets. An absolute prerequisite for successful fusion is a rigorous co-registration of the sensors involved. We establish a common 3-D reference frame by using sensor invariant features. Such features are caused by the same object space phenomena and are extracted in multiple steps from the individual sensors. After extracting, segmenting and grouping the features into more abstract entities, we discuss ways to automatically establish correspondences. This is followed by a brief description of rigorous mathematical models suitable for dealing with linear and areal features. In contrast to traditional, point-based registration methods, linear and areal features lend themselves to a more robust and more accurate registration. More importantly, the chance to automate the registration process increases significantly. The result of the co-registration of the sensors is a unique transformation between the individual sensors and the object space. This makes spatial reasoning over extracted information more versatile; reasoning can be performed in sensor space or in 3-D space, where domain knowledge about features and objects constrains reasoning processes, reduces the search space, and helps to make the problem well-posed. We demonstrate the feasibility of the proposed multisensor fusion approach

  3. Improvement of force-sensor-based heart rate estimation using multichannel data fusion.

    Science.gov (United States)

    Bruser, Christoph; Kortelainen, Juha M; Winter, Stefan; Tenhunen, Mirja; Parkka, Juha; Leonhardt, Steffen

    2015-01-01

    The aim of this paper is to present and evaluate algorithms for heartbeat interval estimation from multiple spatially distributed force sensors integrated into a bed. Moreover, the benefit of using multichannel systems as opposed to a single sensor is investigated. While it might seem intuitive that multiple channels are superior to a single channel, the main challenge lies in finding suitable methods to actually leverage this potential. To this end, two algorithms for heart rate estimation from multichannel vibration signals are presented and compared against a single-channel sensing solution. The first method operates by analyzing the cepstrum computed from the average spectra of the individual channels, while the second method applies Bayesian fusion to three interval estimators, such as the autocorrelation, which are applied to each channel. This evaluation is based on 28 night-long sleep lab recordings during which an eight-channel polyvinylidene fluoride-based sensor array was used to acquire cardiac vibration signals. The recruited patients suffered from different sleep disorders of varying severity. From the sensor array data, a virtual single-channel signal was also derived for comparison by averaging the channels. The single-channel results achieved a beat-to-beat interval error of 2.2% with a coverage (i.e., percentage of the recording which could be analyzed) of 68.7%. In comparison, the best multichannel results attained a mean error and coverage of 1.0% and 81.0%, respectively. These results present statistically significant improvements of both metrics over the single-channel results (p < 0.05).
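
    To make the first of the two ideas above concrete, here is a minimal sketch of cepstral beat-interval estimation from channel-averaged spectra; the sampling rate, search window and synthetic input are assumptions and the code is not taken from the paper.

```python
import numpy as np

def heartbeat_interval_cepstrum(channels, fs):
    """Estimate the dominant beat-to-beat interval (seconds) by averaging the
    channel magnitude spectra and locating the cepstral peak (illustrative)."""
    spectra = [np.abs(np.fft.rfft(ch)) for ch in channels]
    avg_spectrum = np.mean(spectra, axis=0)
    cepstrum = np.abs(np.fft.irfft(np.log(avg_spectrum + 1e-12)))
    lo, hi = int(0.4 * fs), int(2.0 * fs)       # 30-150 bpm search window (assumed)
    return (np.argmax(cepstrum[lo:hi]) + lo) / fs

# Synthetic 8-channel test: one "heartbeat" pulse every 1.0 s plus noise.
fs = 100
beat = np.zeros(30 * fs)
beat[::fs] = 1.0
rng = np.random.default_rng(0)
channels = [np.convolve(beat, np.hanning(20), mode="same")
            + 0.1 * rng.normal(size=beat.size) for _ in range(8)]
print(round(heartbeat_interval_cepstrum(channels, fs), 2))    # expected near 1.0
```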

  4. Sensor Fusion of Cameras and a Laser for City-Scale 3D Reconstruction

    Directory of Open Access Journals (Sweden)

    Yunsu Bok

    2014-11-01

    Full Text Available This paper presents a sensor fusion system of cameras and a 2D laser sensor for large-scale 3D reconstruction. The proposed system is designed to capture data on a fast-moving ground vehicle. The system consists of six cameras and one 2D laser sensor, and they are synchronized by a hardware trigger. Reconstruction of 3D structures is done by estimating frame-by-frame motion and accumulating vertical laser scans, as in previous works. However, our approach does not assume near-2D motion, but estimates free motion (including absolute scale) in 3D space using both laser data and image features. In order to avoid the degeneration associated with typical three-point algorithms, we present a new algorithm that selects 3D points from two frames captured by multiple cameras. The problem of error accumulation is solved by loop closing, not by GPS. The experimental results show that the estimated path is successfully overlaid on the satellite images, such that the reconstruction result is very accurate.

  5. A Weighted Combination Method for Conflicting Evidence in Multi-Sensor Data Fusion

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2018-05-01

    Full Text Available Dempster–Shafer evidence theory is widely applied in various fields related to information fusion. However, how to avoid the counter-intuitive results is an open issue when combining highly conflicting pieces of evidence. In order to handle such a problem, a weighted combination method for conflicting pieces of evidence in multi-sensor data fusion is proposed by considering both the interplay between the pieces of evidence and the impacts of the pieces of evidence themselves. First, the degree of credibility of the evidence is determined on the basis of the modified cosine similarity measure of basic probability assignment. Then, the degree of credibility of the evidence is adjusted by leveraging the belief entropy function to measure the information volume of the evidence. Finally, the final weight of each piece of evidence generated from the above steps is obtained and adopted to modify the bodies of evidence before using Dempster’s combination rule. A numerical example is provided to illustrate that the proposed method is reasonable and efficient in handling the conflicting pieces of evidence. In addition, applications in data classification and motor rotor fault diagnosis validate the practicability of the proposed method with better accuracy.
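
    The overall flow (weight each body of evidence, average, then apply Dempster's rule) can be sketched as below. The credibility and entropy computations of the paper are not reproduced; the weights in the example are simply assumed inputs, and the hypothesis sets are invented for illustration.

```python
import numpy as np
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for two BPAs given as {frozenset: mass} dicts."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

def weighted_average_bpa(bpas, weights):
    """Discount the bodies of evidence into a single weighted-average BPA."""
    focal = set().union(*[set(m) for m in bpas])
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    return {f: float(sum(wi * m.get(f, 0.0) for wi, m in zip(w, bpas)))
            for f in focal}

# Example: three sensors reporting on hypotheses {A}, {B} and {A, B}.
A, B, AB = frozenset("A"), frozenset("B"), frozenset("AB")
bpas = [{A: 0.9, AB: 0.1}, {B: 0.8, AB: 0.2}, {A: 0.7, AB: 0.3}]
weights = [0.45, 0.10, 0.45]            # e.g. derived from similarity and entropy

avg = weighted_average_bpa(bpas, weights)
fused = avg
for _ in range(len(bpas) - 1):          # combine the averaged BPA n-1 times
    fused = dempster_combine(fused, avg)
print(fused)
```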

  6. Location-Based Self-Adaptive Routing Algorithm for Wireless Sensor Networks in Home Automation

    Directory of Open Access Journals (Sweden)

    Hong SeungHo

    2011-01-01

    Full Text Available The use of wireless sensor networks in home automation (WSNHA) is attractive due to their characteristics of self-organization, high sensing fidelity, low cost, and potential for rapid deployment. Although the AODVjr routing algorithm in IEEE 802.15.4/ZigBee and other routing algorithms have been designed for wireless sensor networks, not all are suitable for WSNHA. In this paper, we propose a location-based self-adaptive routing algorithm for WSNHA called WSNHA-LBAR. It confines route discovery flooding to a cylindrical request zone, which reduces the routing overhead and decreases broadcast storm problems in the MAC layer. It also automatically adjusts the size of the request zone using a self-adaptive algorithm based on Bayes' theorem. This makes WSNHA-LBAR more adaptable to the changes of the network state and easier to implement. Simulation results show improved network reliability as well as reduced routing overhead.

  7. Parameter Selection Method for Support Vector Regression Based on Adaptive Fusion of the Mixed Kernel Function

    Directory of Open Access Journals (Sweden)

    Hailun Wang

    2017-01-01

    Full Text Available The support vector regression algorithm is widely used in fault diagnosis of rolling bearings. A new model parameter selection method for support vector regression based on adaptive fusion of the mixed kernel function is proposed in this paper. We choose the mixed kernel function as the kernel function of support vector regression. The fusion coefficients of the mixed kernel function, the kernel function parameters, and the regression parameters are combined together as the parameters of the state vector. Thus, the model selection problem is transformed into a nonlinear system state estimation problem. We use a 5th-degree cubature Kalman filter to estimate the parameters. In this way, we realize the adaptive selection of the mixed kernel function weighting coefficients, the kernel parameters, and the regression parameters. Compared with a single kernel function, unscented Kalman filter (UKF) support vector regression algorithms, and genetic algorithms, the decision regression function obtained by the proposed method has better generalization ability and higher prediction accuracy.

  8. Robust Sequential Covariance Intersection Fusion Kalman Filtering over Multi-agent Sensor Networks with Measurement Delays and Uncertain Noise Variances

    Institute of Scientific and Technical Information of China (English)

    QI Wen-Juan; ZHANG Peng; DENG Zi-Li

    2014-01-01

    This paper deals with the problem of designing a robust sequential covariance intersection (SCI) fusion Kalman filter for the clustering multi-agent sensor network system with measurement delays and uncertain noise variances. The sensor network is partitioned into clusters by the nearest neighbor rule. Using the minimax robust estimation principle, based on the worst-case conservative sensor network system with conservative upper bounds of noise variances, and applying the unbiased linear minimum variance (ULMV) optimal estimation rule, we present the two-layer SCI fusion robust steady-state Kalman filter, which can reduce the communication and computation burdens, save energy, and guarantee that the actual filtering error variances have a less conservative upper bound. A Lyapunov equation method for robustness analysis is proposed, by which the robustness of the local and fused Kalman filters is proved. The concept of robust accuracy is presented and the robust accuracy relations of the local and fused robust Kalman filters are proved. It is proved that the robust accuracy of the global SCI fuser is higher than those of the local SCI fusers and the robust accuracies of all SCI fusers are higher than that of each local robust Kalman filter. A simulation example for a tracking system verifies the robustness and robust accuracy relations.
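
    The pairwise building block of such a sequential fuser, the covariance intersection step itself, can be sketched as follows; the grid search over the weight and the example estimates are assumptions, not the paper's derivation.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=101):
    """Fuse two estimates with unknown cross-correlation using covariance
    intersection: P^-1 = w*P1^-1 + (1-w)*P2^-1, with w chosen on a simple
    grid to minimise trace(P)."""
    i1, i2 = np.linalg.inv(P1), np.linalg.inv(P2)
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):
        P = np.linalg.inv(w * i1 + (1.0 - w) * i2)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * i1 @ x1 + (1.0 - w) * i2 @ x2)
            best = (np.trace(P), x, P)
    return best[1], best[2]

# Sequential CI over three local (Kalman-style) estimates, fused pairwise.
estimates = [(np.array([1.0, 0.1]), np.diag([0.5, 0.2])),
             (np.array([1.2, 0.0]), np.diag([0.3, 0.4])),
             (np.array([0.9, 0.2]), np.diag([0.6, 0.1]))]
x_f, P_f = estimates[0]
for x_i, P_i in estimates[1:]:
    x_f, P_f = covariance_intersection(x_f, P_f, x_i, P_i)
print(x_f, np.diag(P_f))
```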

  9. Fusion of imaging and nonimaging data for surveillance aircraft

    Science.gov (United States)

    Shahbazian, Elisa; Gagnon, Langis; Duquet, Jean Remi; Macieszczak, Maciej; Valin, Pierre

    1997-06-01

    This paper describes a phased incremental integration approach for application of image analysis and data fusion technologies to provide automated intelligent target tracking and identification for airborne surveillance on board an Aurora Maritime Patrol Aircraft. The sensor suite of the Aurora consists of a radar, an identification friend or foe (IFF) system, an electronic support measures (ESM) system, a spotlight synthetic aperture radar (SSAR), a forward looking infra-red (FLIR) sensor and a link-11 tactical datalink system. Lockheed Martin Canada (LMCan) is developing a testbed, which will be used to analyze and evaluate approaches for combining the data provided by the existing sensors, which were initially not designed to feed a fusion system. Three concurrent research proof-of-concept activities provide techniques, algorithms and methodology into three sequential phases of integration of this testbed. These activities are: (1) analysis of the fusion architecture (track/contact/hybrid) most appropriate for the type of data available, (2) extraction and fusion of simple features from the imaging data into the fusion system performing automatic target identification, and (3) development of a unique software architecture which will permit integration and independent evolution, enhancement and optimization of various decision aid capabilities, such as multi-sensor data fusion (MSDF), situation and threat assessment (STA) and resource management (RM).

  10. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors

    Science.gov (United States)

    Srbinovski, Bruno; Magno, Michele; Edwards-Murphy, Fiona; Pakrashi, Vikram; Popovici, Emanuel

    2016-01-01

    Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN) are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA) for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind). Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA) in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources) and power hungry sensors (ultrasonic wind sensor and gas sensors). The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA. PMID:27043559
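
    The core idea of scaling the sampling interval with the node's energy state can be illustrated with a toy controller such as the one below; the thresholds and scaling factors are assumptions for illustration, not the published EASA rules.

```python
def next_sampling_interval(base_interval_s, battery_level, harvest_rate_w,
                           consumption_per_sample_j):
    """Toy energy-aware sampling rule: sample at full rate when the battery is
    well charged or harvesting covers the per-sample cost, slower otherwise."""
    # Energy margin per base interval (J); positive means energy is being gained.
    margin = harvest_rate_w * base_interval_s - consumption_per_sample_j
    if battery_level > 0.8 or margin > 0:
        return base_interval_s                  # plenty of energy: full rate
    if battery_level > 0.4:
        return 2 * base_interval_s              # moderate energy: halve the rate
    return 4 * base_interval_s                  # low energy: quarter rate

print(next_sampling_interval(60, battery_level=0.3,
                             harvest_rate_w=0.002, consumption_per_sample_j=0.5))
```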

  11. An Energy Aware Adaptive Sampling Algorithm for Energy Harvesting WSN with Energy Hungry Sensors

    Directory of Open Access Journals (Sweden)

    Bruno Srbinovski

    2016-03-01

    Full Text Available Wireless sensor nodes have a limited power budget, though they are often expected to be functional in the field once deployed for extended periods of time. Therefore, minimization of energy consumption and energy harvesting technology in Wireless Sensor Networks (WSN) are key tools for maximizing network lifetime, and achieving self-sustainability. This paper proposes an energy aware Adaptive Sampling Algorithm (ASA) for WSN with power hungry sensors and harvesting capabilities, an energy management technique that can be implemented on any WSN platform with enough processing power to execute the proposed algorithm. An existing state-of-the-art ASA developed for wireless sensor networks with power hungry sensors is optimized and enhanced to adapt the sampling frequency according to the available energy of the node. The proposed algorithm is evaluated using two in-field testbeds that are supplied by two different energy harvesting sources (solar and wind). Simulation and comparison between the state-of-the-art ASA and the proposed energy aware ASA (EASA) in terms of energy durability are carried out using in-field measured harvested energy (using both wind and solar sources) and power hungry sensors (ultrasonic wind sensor and gas sensors). The simulation results demonstrate that using ASA in combination with an energy aware function on the nodes can drastically increase the lifetime of a WSN node and enable self-sustainability. In fact, the proposed EASA in conjunction with energy harvesting capability can lead towards perpetual WSN operation and significantly outperform the state-of-the-art ASA.

  12. An Autonomous Self-Aware and Adaptive Fault Tolerant Routing Technique for Wireless Sensor Networks.

    Science.gov (United States)

    Abba, Sani; Lee, Jeong-A

    2015-08-18

    We propose an autonomous self-aware and adaptive fault-tolerant routing technique (ASAART) for wireless sensor networks. We address the limitations of self-healing routing (SHR) and self-selective routing (SSR) techniques for routing sensor data. We also examine the integration of autonomic self-aware and adaptive fault detection and resiliency techniques for route formation and route repair to provide resilience to errors and failures. We achieved this by using a combined continuous and slotted prioritized transmission back-off delay to obtain local and global network state information, as well as multiple random functions for attaining faster routing convergence and reliable route repair despite transient and permanent node failure rates and efficient adaptation to instantaneous network topology changes. The results of simulations based on a comparison of the ASAART with the SHR and SSR protocols for five different simulated scenarios in the presence of transient and permanent node failure rates exhibit a greater resiliency to errors and failure and better routing performance in terms of the number of successfully delivered network packets, end-to-end delay, delivered MAC layer packets, packet error rate, as well as efficient energy conservation in a highly congested, faulty, and scalable sensor network.

  13. Data fusion mathematics theory and practice

    CERN Document Server

    Raol, Jitendra R

    2015-01-01

    Fills the Existing Gap of Mathematics for Data Fusion. Data fusion (DF) combines large amounts of information from a variety of sources and fuses this data algorithmically, logically and, if required, intelligently, using artificial intelligence (AI). Also known as sensor data fusion (SDF), the DF system is an important component for use in various applications that include the monitoring of vehicles, aerospace systems, large-scale structures, and large industrial automation plants. Data Fusion Mathematics: Theory and Practice offers a comprehensive overview of data fusion, and provides a

  14. An Emergency-Adaptive Routing Scheme for Wireless Sensor Networks for Building Fire Hazard Monitoring

    Directory of Open Access Journals (Sweden)

    Guilin Zheng

    2011-03-01

    Full Text Available Fire hazard monitoring and evacuation for building environments is a novel application area for the deployment of wireless sensor networks. In this context, adaptive routing is essential in order to ensure safe and timely data delivery in building evacuation and fire fighting resource applications. Existing routing mechanisms for wireless sensor networks are not well suited for building fires, especially as they do not consider critical and dynamic network scenarios. In this paper, an emergency-adaptive, real-time and robust routing protocol is presented for emergency situations such as building fire hazard applications. The protocol adapts to handle dynamic emergency scenarios and copes well with the routing hole problem. Theoretical analysis and simulation results indicate that our protocol provides a real-time routing mechanism that is well suited for dynamic emergency scenarios in building fires when compared with other related work.

  15. Experimental measurement of oil–water two-phase flow by data fusion of electrical tomography sensors and venturi tube

    International Nuclear Information System (INIS)

    Liu, Yinyan; Deng, Yuchi; Zhang, Maomao; Yu, Peining; Li, Yi

    2017-01-01

    Oil–water two-phase flows are commonly found in the production processes of the petroleum industry. Accurate online measurement of flow rates is crucial to ensure the safety and efficiency of oil exploration and production. A research team from Tsinghua University has developed an experimental apparatus for multiphase flow measurement based on an electrical capacitance tomography (ECT) sensor, an electrical resistance tomography (ERT) sensor, and a venturi tube. This work presents the phase fraction and flow rate measurements of oil–water two-phase flows based on the developed apparatus. Full-range phase fraction can be obtained by the combination of the ECT sensor and the ERT sensor. By data fusion of differential pressures measured by venturi tube and the phase fraction, the total flow rate and single-phase flow rate can be calculated. Dynamic experiments were conducted on the multiphase flow loop in horizontal and vertical pipelines and at various flow rates. (paper)
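
    The final fusion step, combining the tomography-derived water fraction with the venturi differential pressure, can be sketched with the standard venturi relation under a homogeneous-mixture assumption; the discharge coefficient, pipe dimensions and fluid densities below are assumed values, not the calibration of the apparatus.

```python
import math

def oil_water_flow_rates(water_fraction, dp_pa, d_pipe_m, d_throat_m,
                         rho_water=998.0, rho_oil=850.0, cd=0.98):
    """Estimate total and single-phase volumetric flow rates (m^3/s) from the
    measured water fraction and venturi pressure drop, treating the oil-water
    mixture as homogeneous (illustrative assumption)."""
    rho_mix = water_fraction * rho_water + (1.0 - water_fraction) * rho_oil
    beta = d_throat_m / d_pipe_m
    a_throat = math.pi * d_throat_m ** 2 / 4.0
    q_total = cd * a_throat * math.sqrt(2.0 * dp_pa / (rho_mix * (1.0 - beta ** 4)))
    return q_total, water_fraction * q_total, (1.0 - water_fraction) * q_total

print(oil_water_flow_rates(water_fraction=0.6, dp_pa=12e3,
                           d_pipe_m=0.05, d_throat_m=0.025))
```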

  16. A Cluster-Based Dual-Adaptive Topology Control Approach in Wireless Sensor Networks.

    Science.gov (United States)

    Gui, Jinsong; Zhou, Kai; Xiong, Naixue

    2016-09-25

    Multi-Input Multi-Output (MIMO) can improve wireless network performance. Sensors are usually single-antenna devices due to the high hardware complexity and cost, so several sensors are used to form a virtual MIMO array, which is a desirable approach to efficiently take advantage of MIMO gains. Also, in large Wireless Sensor Networks (WSNs), clustering can improve the network scalability, which is an effective topology control approach. The existing virtual MIMO-based clustering schemes either do not fully explore the benefits of MIMO or do not adaptively determine the clustering ranges. Also, the clustering mechanism needs to be further improved to extend the lifetime of the cluster structure. In this paper, we propose an improved clustering scheme for virtual MIMO-based topology construction (ICV-MIMO), which can adaptively determine not only the inter-cluster transmission modes but also the clustering ranges. Through the rational division of cluster head functions and the optimization of the cluster head selection criteria and information exchange process, the ICV-MIMO scheme effectively reduces the network energy consumption and improves the lifetime of the cluster structure when compared with the existing typical virtual MIMO-based scheme. Moreover, the message overhead and time complexity are still in the same order of magnitude.

  17. Sensor selection and chemo-sensory optimization: toward an adaptable chemo-sensory system

    Directory of Open Access Journals (Sweden)

    Alexander eVergara

    2012-01-01

    Full Text Available Over the past two decades, despite the tremendous research effort performed on chemical sensors and machine olfaction to develop micro-sensory systems that will accomplish the growing needs in personal health (implantable sensors), environment monitoring (widely distributed sensor networks), and security/threat detection (chemo/bio warfare agents), simple, low-cost molecular sensing platforms capable of long-term autonomous operation remain beyond the current state-of-the-art of chemical sensing. A fundamental issue within this context is that most of the chemical sensors depend on interactions between the targeted species and the surfaces functionalized with receptors that bind the target species selectively, and that these binding events are coupled with transduction processes that begin to change when they are exposed to the messy world of real samples. With the advent of fundamental breakthroughs at the intersection of materials science, micro/nano-technology, and signal processing, hybrid chemo-sensory systems have incorporated tunable, optimizable operating parameters, through which changes in the response characteristics can be modeled and compensated as the environmental conditions or application needs change. The objective of this article, in this context, is to bring together the key advances at the device, data processing, and system levels that enable chemo-sensory systems to adapt in response to their environments. Accordingly, in this review we will feature the research effort made by selected experts on chemical sensing and information theory, whose work has been devoted to developing strategies that provide tunability and adaptability to single sensor devices or sensory array systems. Particularly, we consider sensor-array selection, modulation of internal sensing parameters, and active sensing. The article ends with some conclusions drawn from the results presented and a visionary look toward the future in terms of how the

  18. Fuzzy Risk Evaluation in Failure Mode and Effects Analysis Using a D Numbers Based Multi-Sensor Information Fusion Method.

    Science.gov (United States)

    Deng, Xinyang; Jiang, Wen

    2017-09-12

    Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and another existing method to show the effectiveness of the proposed model.

  19. Assessment of fusion facility dose rate map using mesh adaptivity enhancements of hybrid Monte Carlo/deterministic techniques

    International Nuclear Information System (INIS)

    Ibrahim, Ahmad M.; Wilson, Paul P.; Sawan, Mohamed E.; Mosher, Scott W.; Peplow, Douglas E.; Grove, Robert E.

    2014-01-01

    Highlights:
    • Calculate the prompt dose rate everywhere throughout the entire fusion energy facility.
    • Utilize FW-CADIS to accurately perform difficult neutronics calculations for fusion energy systems.
    • Develop three mesh adaptivity algorithms to enhance FW-CADIS efficiency in fusion-neutronics calculations.

    Abstract: Three mesh adaptivity algorithms were developed to facilitate and expedite the use of the CADIS and FW-CADIS hybrid Monte Carlo/deterministic techniques in accurate full-scale neutronics simulations of fusion energy systems with immense sizes and complicated geometries. First, a macromaterial approach enhances the fidelity of the deterministic models without changing the mesh. Second, a deterministic mesh refinement algorithm generates meshes that capture as much geometric detail as possible without exceeding a specified maximum number of mesh elements. Finally, a weight window coarsening algorithm decouples the weight window mesh and energy bins from the mesh and energy group structure of the deterministic calculations in order to remove the memory constraint of the weight window map from the deterministic mesh resolution. The three algorithms were used to enhance an FW-CADIS calculation of the prompt dose rate throughout the ITER experimental facility and resulted in a 23.3% increase in the number of mesh tally elements in which the dose rates were calculated in a 10-day Monte Carlo calculation. Additionally, because of the significant increase in the efficiency of FW-CADIS simulations, the three algorithms enabled this difficult calculation to be accurately solved on a regular computer cluster, eliminating the need for a world-class super computer

  20. Information fusion in aquaculture: a state-of the art review

    Directory of Open Access Journals (Sweden)

    Shahbaz Gul HASSAN,Murtaza HASAN,Daoliang LI

    2016-09-01

    Full Text Available Efficient fish feeding is currently one of the biggest challenges in aquaculture for enhancing the quality and quantity of fish production. In this review, an information fusion approach was used to integrate multi-sensor and computer vision techniques to make fish feeding more efficient and accurate. Information fusion is a well-known technology that has been used in different fields of artificial intelligence, robotics, image processing, computer vision, sensors and wireless sensor networks. Information fusion in aquaculture is a growing field of research that is used to enhance the performance of an industrialized ecosystem. This review surveys different fish feeding systems using multi-sensor data fusion, computer vision technology, and different food intake models. In addition, different fish behavior monitoring techniques are discussed, and the water parameters (pH, dissolved oxygen, turbidity, temperature, etc.) necessary for the fish feeding process are examined. Moreover, the different waste management and fish disease diagnosis techniques using different technologies, expert systems and modeling are also reviewed.

  1. Multi-Sensor Architectures

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar; Ahmed, Zaki; Khan, M. Z.

    2012-01-01

    The use of multiple sensors typically requires the fusion of data from different type of sensors. The combined use of such a data has the potential to give an efficient, high quality and reliable estimation. Input data from different sensors allows the introduction of target attributes (target ty...

  2. Incremental Support Vector Machine Framework for Visual Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yuichi Motai

    2007-01-01

    Full Text Available Motivated by the emerging requirements of surveillance networks, we present in this paper an incremental multiclassification support vector machine (SVM) technique as a new framework for action classification based on real-time multivideo collected by homogeneous sites. The technique is based on an adaptation of the least squares SVM (LS-SVM) formulation but extends beyond the static image-based learning of current SVM methodologies. In applying the technique, an initial supervised offline learning phase is followed by a visual behavior data acquisition and an online learning phase during which the cluster head performs an ensemble of model aggregations based on the sensor nodes' inputs. The cluster head then selectively switches on designated sensor nodes for future incremental learning. Combining sensor data offers an improvement over single camera sensing especially when the latter has an occluded view of the target object. The optimization involved alleviates the burdens of power consumption and communication bandwidth requirements. The resulting misclassification error rate, the iterative error reduction rate of the proposed incremental learning, and the decision fusion technique prove its validity when applied to visual sensor networks. Furthermore, the enabled online learning allows an adaptive domain knowledge insertion and offers the advantage of reducing both the model training time and the information storage requirements of the overall system which makes it even more attractive for distributed sensor networks communication.

  3. Self-adaptive calibration for staring infrared sensors

    Science.gov (United States)

    Kendall, William B.; Stocker, Alan D.

    1993-10-01

    This paper presents a new, self-adaptive technique for the correction of non-uniformities (fixed-pattern noise) in high-density infrared focal-plane detector arrays. We have developed a new approach to non-uniformity correction in which we use multiple image frames of the scene itself, and take advantage of the aim-point wander caused by jitter, residual tracking errors, or deliberately induced motion. Such wander causes each detector in the array to view multiple scene elements, and each scene element to be viewed by multiple detectors. It is therefore possible to formulate (and solve) a set of simultaneous equations from which correction parameters can be computed for the detectors. We have tested our approach with actual images collected by the ARPA-sponsored MUSIC infrared sensor. For these tests we employed a 60-frame (0.75-second) sequence of terrain images for which an out-of-date calibration was deliberately used. The sensor was aimed at a point on the ground via an operator-assisted tracking system having a maximum aim point wander on the order of ten pixels. With these data, we were able to improve the calibration accuracy by a factor of approximately 100.
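
    The idea of solving simultaneous equations from jittered views can be illustrated with a tiny one-dimensional least-squares sketch; the offset-only noise model, the known integer shifts and the synthetic data are simplifying assumptions and not the technique reported in the paper.

```python
import numpy as np

def estimate_offsets(frames, shifts, scene_len):
    """Solve frames[f, i] = scene[i + shifts[f]] + offset[i] in the least-squares
    sense for a 1-D detector line (offset-only fixed-pattern-noise model)."""
    n_frames, n_det = frames.shape
    n_unknowns = scene_len + n_det
    rows, rhs = [], []
    for f in range(n_frames):
        for i in range(n_det):
            row = np.zeros(n_unknowns)
            row[i + shifts[f]] = 1.0           # scene sample seen by detector i
            row[scene_len + i] = 1.0           # fixed offset of detector i
            rows.append(row)
            rhs.append(frames[f, i])
    # Pin the mean offset to zero to remove the scene/offset ambiguity.
    anchor = np.zeros(n_unknowns)
    anchor[scene_len:] = 1.0
    rows.append(anchor)
    rhs.append(0.0)
    sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return sol[scene_len:]                     # estimated detector offsets

# Synthetic test: 8 detectors, 5 jittered frames of an unknown scene.
rng = np.random.default_rng(0)
scene = rng.normal(size=12)
true_offsets = rng.normal(scale=0.3, size=8)
shifts = [0, 1, 2, 3, 4]
frames = np.array([scene[s:s + 8] + true_offsets for s in shifts])
est = estimate_offsets(frames, shifts, scene_len=12)
# Offsets are only recoverable up to a common constant; compare mean-removed values.
print(np.round((est - est.mean()) - (true_offsets - true_offsets.mean()), 3))
```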

  4. A data fusion approach for track monitoring from multiple in-service trains

    Science.gov (United States)

    Lederman, George; Chen, Siheng; Garrett, James H.; Kovačević, Jelena; Noh, Hae Young; Bielak, Jacobo

    2017-10-01

    We present a data fusion approach for enabling data-driven rail-infrastructure monitoring from multiple in-service trains. A number of researchers have proposed using vibration data collected from in-service trains as a low-cost method to monitor track geometry. The majority of this work has focused on developing novel features to extract information about the tracks from data produced by individual sensors on individual trains. We extend this work by presenting a technique to combine extracted features from multiple passes over the tracks from multiple sensors aboard multiple vehicles. There are a number of challenges in combining multiple data sources, like different relative position coordinates depending on the location of the sensor within the train. Furthermore, as the number of sensors increases, the likelihood that some will malfunction also increases. We use a two-step approach that first minimizes position offset errors through data alignment, then fuses the data with a novel adaptive Kalman filter that weights data according to its estimated reliability. We show the efficacy of this approach both through simulations and on a data-set collected from two instrumented trains operating over a one-year period. Combining data from numerous in-service trains allows for more continuous and more reliable data-driven monitoring than analyzing data from any one train alone; as the number of instrumented trains increases, the proposed fusion approach could facilitate track monitoring of entire rail-networks.
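
    The reliability-weighting idea can be illustrated by inflating each source's measurement noise in a scalar Kalman-style update according to an estimated reliability; the specific weighting rule used by the authors is not reproduced here, and the numbers below are invented for the example.

```python
def fuse_track_feature(prior_mean, prior_var, measurements, variances, reliabilities):
    """Scalar Kalman-style fusion of one track-geometry feature observed on
    several train passes, de-weighting unreliable sources by inflating their
    effective measurement noise."""
    mean, var = prior_mean, prior_var
    for z, r, rel in zip(measurements, variances, reliabilities):
        r_eff = r / max(rel, 1e-6)        # low reliability -> large effective noise
        k = var / (var + r_eff)           # Kalman gain
        mean = mean + k * (z - mean)
        var = (1.0 - k) * var
    return mean, var

# Three passes over the same track segment; the second sensor is suspect.
print(fuse_track_feature(0.0, 1.0,
                         measurements=[0.9, 3.0, 1.1],
                         variances=[0.2, 0.2, 0.2],
                         reliabilities=[0.9, 0.1, 0.9]))
```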

  5. Remote sensing image fusion

    CERN Document Server

    Alparone, Luciano; Baronti, Stefano; Garzelli, Andrea

    2015-01-01

    A synthesis of more than ten years of experience, Remote Sensing Image Fusion covers methods specifically designed for remote sensing imagery. The authors supply a comprehensive classification system and rigorous mathematical description of advanced and state-of-the-art methods for pansharpening of multispectral images, fusion of hyperspectral and panchromatic images, and fusion of data from heterogeneous sensors such as optical and synthetic aperture radar (SAR) images and integration of thermal and visible/near-infrared images. They also explore new trends of signal/image processing, such as

  6. An Autonomous Self-Aware and Adaptive Fault Tolerant Routing Technique for Wireless Sensor Networks

    Science.gov (United States)

    Abba, Sani; Lee, Jeong-A

    2015-01-01

    We propose an autonomous self-aware and adaptive fault-tolerant routing technique (ASAART) for wireless sensor networks. We address the limitations of self-healing routing (SHR) and self-selective routing (SSR) techniques for routing sensor data. We also examine the integration of autonomic self-aware and adaptive fault detection and resiliency techniques for route formation and route repair to provide resilience to errors and failures. We achieved this by using a combined continuous and slotted prioritized transmission back-off delay to obtain local and global network state information, as well as multiple random functions for attaining faster routing convergence and reliable route repair despite transient and permanent node failure rates and efficient adaptation to instantaneous network topology changes. The results of simulations based on a comparison of the ASAART with the SHR and SSR protocols for five different simulated scenarios in the presence of transient and permanent node failure rates exhibit a greater resiliency to errors and failure and better routing performance in terms of the number of successfully delivered network packets, end-to-end delay, delivered MAC layer packets, packet error rate, as well as efficient energy conservation in a highly congested, faulty, and scalable sensor network. PMID:26295236

  7. Robust Ground Target Detection by SAR and IR Sensor Fusion Using Adaboost-Based Feature Selection

    Science.gov (United States)

    Kim, Sungho; Song, Woo-Jin; Kim, So-Hyun

    2016-01-01

    Long-range ground targets are difficult to detect in a noisy cluttered environment using either synthetic aperture radar (SAR) images or infrared (IR) images. SAR-based detectors can provide a high detection rate with a high false alarm rate to background scatter noise. IR-based approaches can detect hot targets but are affected strongly by the weather conditions. This paper proposes a novel target detection method by decision-level SAR and IR fusion using an Adaboost-based machine learning scheme to achieve a high detection rate and low false alarm rate. The proposed method consists of individual detection, registration, and fusion architecture. This paper presents a single framework of a SAR and IR target detection method using modified Boolean map visual theory (modBMVT) and feature-selection based fusion. Previous methods applied different algorithms to detect SAR and IR targets because of the different physical image characteristics. One method that is optimized for IR target detection produces unsuccessful results in SAR target detection. This study examined the image characteristics and proposed a unified SAR and IR target detection method by inserting a median local average filter (MLAF, pre-filter) and an asymmetric morphological closing filter (AMCF, post-filter) into the BMVT. The original BMVT was optimized to detect small infrared targets. The proposed modBMVT can remove the thermal and scatter noise by the MLAF and detect extended targets by attaching the AMCF after the BMVT. Heterogeneous SAR and IR images were registered automatically using the proposed RANdom SAmple Region Consensus (RANSARC)-based homography optimization after a brute-force correspondence search using the detected target centers and regions. The final targets were detected by feature-selection based sensor fusion using Adaboost. The proposed method showed good SAR and IR target detection performance through feature selection-based decision fusion on a synthetic database generated
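
    Only the final decision-fusion stage lends itself to a compact sketch: an Adaboost ensemble of decision stumps over concatenated SAR and IR candidate features, where each stump effectively selects one feature. The feature layout and data below are synthetic stand-ins, not the modBMVT/RANSARC pipeline described above.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(1)
n = 400
# Assumed feature layout: first 4 columns from SAR candidate regions,
# last 4 from the co-registered IR candidates (synthetic stand-ins).
X = rng.normal(size=(n, 8))
y = (X[:, 0] + 0.8 * X[:, 4] + 0.3 * rng.normal(size=n) > 0).astype(int)

# Each weak learner (a decision stump) picks the single most discriminative
# SAR or IR feature, so boosting doubles as feature selection.
clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X[:300], y[:300])
print("held-out accuracy:", clf.score(X[300:], y[300:]))
```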

  8. Robust Ground Target Detection by SAR and IR Sensor Fusion Using Adaboost-Based Feature Selection

    Directory of Open Access Journals (Sweden)

    Sungho Kim

    2016-07-01

    Full Text Available Long-range ground targets are difficult to detect in a noisy cluttered environment using either synthetic aperture radar (SAR) images or infrared (IR) images. SAR-based detectors can provide a high detection rate with a high false alarm rate to background scatter noise. IR-based approaches can detect hot targets but are affected strongly by the weather conditions. This paper proposes a novel target detection method by decision-level SAR and IR fusion using an Adaboost-based machine learning scheme to achieve a high detection rate and low false alarm rate. The proposed method consists of individual detection, registration, and fusion architecture. This paper presents a single framework of a SAR and IR target detection method using modified Boolean map visual theory (modBMVT) and feature-selection based fusion. Previous methods applied different algorithms to detect SAR and IR targets because of the different physical image characteristics. One method that is optimized for IR target detection produces unsuccessful results in SAR target detection. This study examined the image characteristics and proposed a unified SAR and IR target detection method by inserting a median local average filter (MLAF, pre-filter) and an asymmetric morphological closing filter (AMCF, post-filter) into the BMVT. The original BMVT was optimized to detect small infrared targets. The proposed modBMVT can remove the thermal and scatter noise by the MLAF and detect extended targets by attaching the AMCF after the BMVT. Heterogeneous SAR and IR images were registered automatically using the proposed RANdom SAmple Region Consensus (RANSARC)-based homography optimization after a brute-force correspondence search using the detected target centers and regions. The final targets were detected by feature-selection based sensor fusion using Adaboost. The proposed method showed good SAR and IR target detection performance through feature selection-based decision fusion on a synthetic

  9. Score level fusion scheme based on adaptive local Gabor features for face-iris-fingerprint multimodal biometric

    Science.gov (United States)

    He, Fei; Liu, Yuanning; Zhu, Xiaodong; Huang, Chun; Han, Ye; Chen, Ying

    2014-05-01

    A multimodal biometric system has been considered a promising technique to overcome the defects of unimodal biometric systems. We have introduced a fusion scheme to gain a better understanding and fusion method for a face-iris-fingerprint multimodal biometric system. In our case, we use particle swarm optimization to train a set of adaptive Gabor filters in order to achieve the proper Gabor basis functions for each modality. For a closer analysis of texture information, two different local Gabor features for each modality are produced from the corresponding Gabor coefficients. Next, all matching scores of the two Gabor features for each modality are projected to a single scalar score via a trained support vector regression model for a final decision. A large-scale dataset is formed to validate the proposed scheme using the Facial Recognition Technology database-fafb and CASIA-V3-Interval together with FVC2004-DB2a datasets. The experimental results demonstrate that, in addition to achieving more powerful local Gabor features for the individual modalities and better recognition performance through their fusion, our architecture also outperforms some state-of-the-art individual methods and other fusion approaches for face-iris-fingerprint multimodal biometric systems.

  10. Projective Method for Generic Sensor Fusion Problem

    International Nuclear Information System (INIS)

    Rao, N.S.V.

    1999-01-01

    In a multiple sensor system, each sensor produces an output which is related to the desired feature according to a certain probability distribution. We propose a fuser that combines the sensor outputs to more accurately predict the desired feature. The fuser utilizes the lower envelope of the sensors' regression curves to project onto the sensor with the least error at each point of the feature space. This fuser is optimal among all projective fusers and also satisfies the isolation property that ensures a performance at least as good as the best sensor. In case the sensor distributions are not known, we show that a consistent estimator of this fuser can be computed entirely based on a training sample. Compared to linear fusers, the projective fusers provide a complementary performance. We propose two classes of metafusers that utilize both linear and projective fusers to perform at least as well as the best sensor as well as the best fuser
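
    A minimal sketch of the lower-envelope idea, estimated from a training sample: fit each sensor's squared-error curve over the feature space and, at each query point, project onto the sensor whose estimated error is smallest. The nearest-neighbour error regressor and the synthetic sensors are assumptions made for illustration.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def train_projective_fuser(x_train, sensor_outputs, y_train, k=15):
    """Fit, for each sensor, a regressor of its squared error over the feature
    space; the fuser later projects onto the sensor with the lowest estimate."""
    error_models = []
    for s in sensor_outputs:
        err = (s - y_train) ** 2
        error_models.append(KNeighborsRegressor(k).fit(x_train, err))
    return error_models

def projective_fuse(x_query, sensor_values, error_models):
    errs = np.column_stack([m.predict(x_query) for m in error_models])
    best = np.argmin(errs, axis=1)              # lower envelope of the error curves
    return sensor_values[np.arange(len(best)), best]

# Synthetic example: sensor 0 is accurate for x < 0, sensor 1 for x >= 0.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(500, 1))
y = np.sin(3 * x[:, 0])
s0 = y + np.where(x[:, 0] < 0, 0.01, 0.5) * rng.normal(size=500)
s1 = y + np.where(x[:, 0] < 0, 0.5, 0.01) * rng.normal(size=500)
models = train_projective_fuser(x, [s0, s1], y)
fused = projective_fuse(x, np.column_stack([s0, s1]), models)
print("fused MSE:", np.mean((fused - y) ** 2))
```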

  11. An Effective Collaborative Mobile Weighted Clustering Schemes for Energy Balancing in Wireless Sensor Networks.

    Science.gov (United States)

    Tang, Chengpei; Shokla, Sanesy Kumcr; Modhawar, George; Wang, Qiang

    2016-02-19

    Collaborative strategies for mobile sensor nodes ensure the efficiency and the robustness of data processing, while limiting the required communication bandwidth. In order to solve the problem of pipeline inspection and oil leakage monitoring, a collaborative weighted mobile sensing scheme is proposed. By adopting a weighted mobile sensing scheme, the adaptive collaborative clustering protocol can realize an even distribution of energy load among the mobile sensor nodes in each round, and make the best use of battery energy. A detailed theoretical analysis and experimental results revealed that the proposed protocol is an energy efficient collaborative strategy such that the sensor nodes can communicate with a fusion center and produce high power gain.

  12. Multi-Source Multi-Sensor Information Fusion

    Indian Academy of Sciences (India)

    applications in biomedical, industrial automation, aerospace systems and environmental ... performance evaluation and achievable accuracy, mainly for aerospace ... system and sensors, sensor data processing and performance requirement, ...

  13. A Cluster-Based Dual-Adaptive Topology Control Approach in Wireless Sensor Networks

    Science.gov (United States)

    Gui, Jinsong; Zhou, Kai; Xiong, Naixue

    2016-01-01

    Multi-Input Multi-Output (MIMO) can improve wireless network performance. Sensors are usually single-antenna devices due to the high hardware complexity and cost, so several sensors are used to form a virtual MIMO array, which is a desirable approach to efficiently take advantage of MIMO gains. Also, in large Wireless Sensor Networks (WSNs), clustering can improve the network scalability, which is an effective topology control approach. The existing virtual MIMO-based clustering schemes either do not fully explore the benefits of MIMO or do not adaptively determine the clustering ranges. Also, the clustering mechanism needs to be further improved to extend the lifetime of the cluster structure. In this paper, we propose an improved clustering scheme for virtual MIMO-based topology construction (ICV-MIMO), which can adaptively determine not only the inter-cluster transmission modes but also the clustering ranges. Through the rational division of cluster head functions and the optimization of the cluster head selection criteria and information exchange process, the ICV-MIMO scheme effectively reduces the network energy consumption and improves the lifetime of the cluster structure when compared with the existing typical virtual MIMO-based scheme. Moreover, the message overhead and time complexity are still in the same order of magnitude. PMID:27681731

  14. A Cluster-Based Dual-Adaptive Topology Control Approach in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Jinsong Gui

    2016-09-01

    Full Text Available Multi-Input Multi-Output (MIMO) can improve wireless network performance. Sensors are usually single-antenna devices due to the high hardware complexity and cost, so several sensors are used to form a virtual MIMO array, which is a desirable approach to efficiently take advantage of MIMO gains. Also, in large Wireless Sensor Networks (WSNs), clustering can improve the network scalability, which is an effective topology control approach. The existing virtual MIMO-based clustering schemes either do not fully explore the benefits of MIMO or do not adaptively determine the clustering ranges. Also, the clustering mechanism needs to be further improved to extend the lifetime of the cluster structure. In this paper, we propose an improved clustering scheme for virtual MIMO-based topology construction (ICV-MIMO), which can adaptively determine not only the inter-cluster transmission modes but also the clustering ranges. Through the rational division of cluster head functions and the optimization of the cluster head selection criteria and information exchange process, the ICV-MIMO scheme effectively reduces the network energy consumption and improves the lifetime of the cluster structure when compared with the existing typical virtual MIMO-based scheme. Moreover, the message overhead and time complexity are still in the same order of magnitude.

  15. Thermonuclear fusion

    International Nuclear Information System (INIS)

    Weisse, J.

    2000-01-01

    This document takes stock of the two approaches to thermonuclear fusion research being explored today: magnetic confinement fusion and inertial confinement fusion. The basic physical principles are recalled first: fundamental nuclear reactions, high temperatures, elementary properties of plasmas, the ignition criterion, magnetic confinement (charged particle in a uniform magnetic field, confinement and Tokamak principle, heating of magnetized plasmas (ohmic, neutral particles, high frequency waves, other heating means), results obtained so far (scale laws and extrapolation of performances, tritium experiments, ITER project)), inertial fusion (hot spot ignition, instabilities, results (Centurion-Halite program, laser experiments)). The second part presents the fusion reactor and its associated technologies: principle (tritium production, heat source, neutron protection, tritium generation, materials), magnetic fusion (superconducting magnets, divertor (role, principle, realization)), inertial fusion (energy vector, laser adaptation, particle beams, reaction chamber, stresses, chamber concepts (dry and wet walls, liquid walls), targets (fabrication, injection and pointing)). The third chapter concerns the socio-economic aspects of thermonuclear fusion: safety (normal operation and accidents, wastes), costs (cost structure and elementary comparison, ecological impact and external costs). (J.S.)

  16. Quantitative characterization of pulverized coal and biomass–coal blends in pneumatic conveying pipelines using electrostatic sensor arrays and data fusion techniques

    International Nuclear Information System (INIS)

    Qian, Xiangchen; Wang, Chao; Yan, Yong; Shao, Jiaqing; Wang, Lijuan; Zhou, Hao

    2012-01-01

    Quantitative data about the dynamic behaviour of pulverized coal and biomass–coal blends in fuel injection pipelines allow power plant operators to detect variations in fuel supply and oscillations in the flow at an early stage, enable them to balance fuel distribution between fuel feeding pipes and ultimately to achieve higher combustion efficiency and lower greenhouse gas emissions. Electrostatic sensor arrays and data fusion algorithms are combined to provide a non-intrusive solution to the measurement of fuel particle velocity, relative solid concentration and flow stability under pneumatic conveying conditions. Electrostatic sensor arrays with circular and arc-shaped electrodes are integrated in the same sensing head to measure ‘averaged’ and ‘localized’ characteristics of pulverized fuel flow. Data fusion techniques are applied to optimize and integrate the results from the sensor arrays. Experimental tests were conducted on the horizontal section of a 150 mm bore pneumatic conveyor circulating pulverized coal and sawdust under various flow conditions. Test results suggest that pure coal particles travel faster and carry more electrostatic charge than biomass–coal blends. As more biomass particles are added to the flow, the overall velocity of the flow reduces, the electrostatic charge level on particles decreases and the flow becomes less stable compared to the pure coal flow. (paper)

  17. Quantitative characterization of pulverized coal and biomass-coal blends in pneumatic conveying pipelines using electrostatic sensor arrays and data fusion techniques

    Science.gov (United States)

    Qian, Xiangchen; Yan, Yong; Shao, Jiaqing; Wang, Lijuan; Zhou, Hao; Wang, Chao

    2012-08-01

    Quantitative data about the dynamic behaviour of pulverized coal and biomass-coal blends in fuel injection pipelines allow power plant operators to detect variations in fuel supply and oscillations in the flow at an early stage, enable them to balance fuel distribution between fuel feeding pipes and ultimately to achieve higher combustion efficiency and lower greenhouse gas emissions. Electrostatic sensor arrays and data fusion algorithms are combined to provide a non-intrusive solution to the measurement of fuel particle velocity, relative solid concentration and flow stability under pneumatic conveying conditions. Electrostatic sensor arrays with circular and arc-shaped electrodes are integrated in the same sensing head to measure ‘averaged’ and ‘localized’ characteristics of pulverized fuel flow. Data fusion techniques are applied to optimize and integrate the results from the sensor arrays. Experimental tests were conducted on the horizontal section of a 150 mm bore pneumatic conveyor circulating pulverized coal and sawdust under various flow conditions. Test results suggest that pure coal particles travel faster and carry more electrostatic charge than biomass-coal blends. As more biomass particles are added to the flow, the overall velocity of the flow reduces, the electrostatic charge level on particles decreases and the flow becomes less stable compared to the pure coal flow.

  18. Efficient and Adaptive Node Selection for Target Tracking in Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Juan Feng

    2016-01-01

    Full Text Available In a target tracking wireless sensor network, choosing the proper working nodes can not only minimize the number of active nodes but also satisfy the tracking reliability requirement. However, most existing works focus on selecting the sensor nodes nearest to the target for tracking missions and do not consider the correlation of the sensor node locations, so these approaches cannot meet all the goals of the network. This work proposes an efficient and adaptive node selection (EANS) approach for tracking a target in a distributed wireless sensor network. The proposed approach combines a distance-based node selection strategy and particle filter prediction while considering the spatial correlation of the different sensing nodes. Moreover, a joint distance weighted measurement is proposed to estimate the information utility of sensing nodes. Experimental results show that EANS outperforms the state-of-the-art approaches by reducing the energy cost and computational complexity while guaranteeing the tracking accuracy.

  19. Adaptive Reliable Routing Based on Cluster Hierarchy for Wireless Multimedia Sensor Networks

    Directory of Open Access Journals (Sweden)

    Kai Lin

    2010-01-01

    Full Text Available As a multimedia information acquisition and processing method, the wireless multimedia sensor network (WMSN) has great application potential in military and civilian areas. Compared with traditional wireless sensor networks, the routing design of a WMSN should pay more attention to the quality of transmission. This paper proposes an adaptive reliable routing scheme based on cluster hierarchy, named ARCH, which includes energy prediction and a power allocation mechanism. To obtain better performance, the cluster structure is formed based on a cellular topology. The introduced prediction mechanism enables the sensor nodes to predict the remaining energy of other nodes, which dramatically reduces the overall information needed for energy balancing. ARCH can dynamically balance the energy consumption of nodes based on the predicted results provided by power allocation. The simulation results prove the efficiency of the proposed ARCH routing.

  20. Adaptive Multichannel Radiation Sensors for Plant Parameter Monitoring

    Science.gov (United States)

    Mollenhauer, Hannes; Remmler, Paul; Schuhmann, Gudrun; Lausch, Angela; Merbach, Ines; Assing, Martin; Mollenhauer, Olaf; Dietrich, Peter; Bumberger, Jan

    2016-04-01

    Nutrients such as nitrogen play a key role in the plant life cycle. They are much needed for chlorophyll production and other plant cell components. Therefore, the crop yield is strongly affected by plant nutrient status. Due to the spatial and temporal variability of soil characteristics and varying agricultural inputs, plant development varies within a field. Thus, determining these fluctuations in plant development is valuable for detecting stress conditions and optimizing fertilisation, given its high environmental and economic impact. Plant parameters play crucial roles in plant growth estimation and prediction since they are used as indicators of plant performance. Indices derived from remote sensing techniques, in particular, provide quantitative information about agricultural crops instantaneously and, above all, non-destructively. Due to the specific absorption of certain plant pigments, a characteristic spectral signature can be seen in the visible and IR part of the electromagnetic spectrum, known as narrow-band peaks. In an analogous manner, the presence and concentration of different nutrients cause a characteristic spectral signature. To this end, an adequate remote sensing monitoring concept is needed, considering the heterogeneity and dynamics of the plant population as well as economic aspects. This work presents the development and field investigation of an inexpensive multichannel radiation sensor that observes specific parts, or rather distinct wavelengths, of the incoming and crop-reflected solar spectrum and facilitates the determination of different plant indices. Based on the selected sensor wavelengths, the sensing device allows the detection of specific parameters, e.g. plant vitality, chlorophyll content or nitrogen content. Besides the improvement of the sensor characteristic, the simple wavelength adaption, and the price-performance ratio, the achievement of appropriate energy efficiency as well as a

  1. Adaptive multi-node multiple input and multiple output (MIMO) transmission for mobile wireless multimedia sensor networks.

    Science.gov (United States)

    Cho, Sunghyun; Choi, Ji-Woong; You, Cheolwoo

    2013-10-02

    Mobile wireless multimedia sensor networks (WMSNs), which consist of mobile sink or sensor nodes and use rich sensing information, require much faster and more reliable wireless links than static wireless sensor networks (WSNs). This paper proposes an adaptive multi-node (MN) multiple input and multiple output (MIMO) transmission to improve the transmission reliability and capacity of mobile sink nodes when they experience spatial correlation. Unlike conventional single-node (SN) MIMO transmission, the proposed scheme considers the use of transmission antennas from more than two sensor nodes. To find an optimal antenna set and a MIMO transmission scheme, a MN MIMO channel model is introduced first, followed by derivation of closed-form ergodic capacity expressions with different MIMO transmission schemes, such as space-time transmit diversity coding and spatial multiplexing. The capacity varies according to the antenna correlation and the path gain from multiple sensor nodes. Based on these statistical results, we propose an adaptive MIMO mode and antenna set switching algorithm that maximizes the ergodic capacity of mobile sink nodes. The ergodic capacity of the proposed scheme is compared with conventional SN MIMO schemes, where the gain increases as the antenna correlation and path gain ratio increase.
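
    The capacity comparison that drives the switching decision can be approximated numerically; the sketch below evaluates Monte Carlo ergodic capacity for a simple common transmit-correlation model and is only a stand-in for the closed-form expressions derived in the paper.

```python
import numpy as np

def ergodic_capacity(n_tx, n_rx, snr, corr=0.0, trials=2000, seed=0):
    """Monte Carlo ergodic capacity (bit/s/Hz) of an equal-power MIMO link
    with identical transmit-antenna correlation (an assumed correlation model)."""
    rng = np.random.default_rng(seed)
    r_tx = np.full((n_tx, n_tx), corr) + (1 - corr) * np.eye(n_tx)
    r_sqrt = np.linalg.cholesky(r_tx)
    caps = []
    for _ in range(trials):
        h_iid = (rng.normal(size=(n_rx, n_tx))
                 + 1j * rng.normal(size=(n_rx, n_tx))) / np.sqrt(2)
        h = h_iid @ r_sqrt.T                    # impose transmit-side correlation
        det = np.real(np.linalg.det(np.eye(n_rx) + snr / n_tx * h @ h.conj().T))
        caps.append(np.log2(det))
    return float(np.mean(caps))

# Toy stand-in for the switching decision: a correlated single-node 2x2 array
# versus a multi-node antenna set whose antennas are uncorrelated.
print(ergodic_capacity(2, 2, snr=10.0, corr=0.7),
      ergodic_capacity(2, 2, snr=10.0, corr=0.0))
```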

  2. Image sensor system with bio-inspired efficient coding and adaptation.

    Science.gov (United States)

    Okuno, Hirotsugu; Yagi, Tetsuya

    2012-08-01

    We designed and implemented an image sensor system equipped with three bio-inspired coding and adaptation strategies: logarithmic transform, local average subtraction, and feedback gain control. The system comprises a field-programmable gate array (FPGA), a resistive network, and active pixel sensors (APS), whose light intensity-voltage characteristics are controllable. The system employs multiple time-varying reset voltage signals for APS in order to realize multiple logarithmic intensity-voltage characteristics, which are controlled so that the entropy of the output image is maximized. The system also employs local average subtraction and gain control in order to obtain images with an appropriate contrast. The local average is calculated by the resistive network instantaneously. The designed system was successfully used to obtain appropriate images of objects that were subjected to large changes in illumination.
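
    The coding steps are realized in hardware in the original system (APS characteristics, a resistive network and an FPGA); the NumPy sketch below only mimics the three strategies, logarithmic transform, local average subtraction and feedback gain control, on an ordinary image array. The window size, target contrast and test scene are assumptions:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def bio_inspired_encode(intensity, eps=1e-3, window=7, target_rms=0.2):
            # Logarithmic transform, local average subtraction, and a simple feedback gain control.
            img = np.asarray(intensity, dtype=float)
            log_img = np.log(img + eps)                        # compress the dynamic range
            local_avg = uniform_filter(log_img, size=window)   # resistive-network-like local mean
            contrast = log_img - local_avg                     # local average subtraction
            rms = np.sqrt(np.mean(contrast ** 2)) + 1e-12
            gain = target_rms / rms                            # feedback gain toward a target contrast level
            return gain * contrast

        # Hypothetical scene spanning a large illumination range.
        scene = np.exp(np.random.default_rng(1).uniform(0, 8, size=(64, 64)))
        encoded = bio_inspired_encode(scene)
        print(encoded.mean(), encoded.std())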

  3. Average Throughput Performance of Myopic Policy in Energy Harvesting Wireless Sensor Networks.

    Science.gov (United States)

    Gul, Omer Melih; Demirekler, Mubeccel

    2017-09-26

    This paper considers a single-hop wireless sensor network where a fusion center collects data from M energy harvesting wireless sensors. The harvested energy is stored losslessly in an infinite-capacity battery at each sensor. In each time slot, the fusion center schedules K sensors for data transmission over K orthogonal channels. The fusion center does not have direct knowledge on the battery states of sensors, or the statistics of their energy harvesting processes. The fusion center only has information of the outcomes of previous transmission attempts. It is assumed that the sensors are data backlogged, there is no battery leakage and the communication is error-free. An energy harvesting sensor can transmit data to the fusion center whenever being scheduled only if it has enough energy for data transmission. We investigate average throughput of Round-Robin type myopic policy both analytically and numerically under an average reward (throughput) criterion. We show that Round-Robin type myopic policy achieves optimality for some class of energy harvesting processes although it is suboptimal for a broad class of energy harvesting processes.
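
    A toy simulation of the Round-Robin type myopic policy analysed here: in each slot the fusion center schedules K of M backlogged sensors in cyclic order, and a scheduled sensor transmits only if its (hidden) battery holds at least one energy unit. The Bernoulli harvesting rates and the unit energy cost per transmission are assumptions made for illustration:

        import numpy as np

        rng = np.random.default_rng(42)
        M, K, slots = 6, 2, 10_000
        harvest_prob = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.7])  # hypothetical Bernoulli harvesting rates
        battery = np.zeros(M)          # infinite-capacity batteries, initially empty
        throughput = 0
        pointer = 0                    # round-robin pointer

        for _ in range(slots):
            battery += rng.random(M) < harvest_prob      # each sensor may harvest one energy unit
            scheduled = [(pointer + i) % M for i in range(K)]
            for s in scheduled:
                if battery[s] >= 1:                      # transmit only with enough stored energy
                    battery[s] -= 1
                    throughput += 1
            pointer = (pointer + K) % M

        print(f"average throughput: {throughput / slots:.3f} packets/slot (upper bound {min(K, harvest_prob.sum()):.3f})")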

  4. A Fusion Approach to Feature Extraction by Wavelet Decomposition and Principal Component Analysis in Transient Signal Processing of SAW Odor Sensor Array

    Directory of Open Access Journals (Sweden)

    Prashant SINGH

    2011-03-01

    Full Text Available This paper presents theoretical analysis of a new approach for development of surface acoustic wave (SAW) sensor array based odor recognition system. The construction of sensor array employs a single polymer interface for selective sorption of odorant chemicals in vapor phase. The individual sensors are however coated with different thicknesses. The idea of sensor coating thickness variation is for terminating solvation and diffusion kinetics of vapors into polymer up to different stages of equilibration on different sensors. This is expected to generate diversity in information content of the sensors transient. The analysis is based on wavelet decomposition of transient signals. The single sensor transients have been used earlier for generating odor identity signatures based on wavelet approximation coefficients. In the present work, however, we exploit variability in diffusion kinetics due to polymer thicknesses for making odor signatures. This is done by fusion of the wavelet coefficients from different sensors in the array, and then applying the principal component analysis. We find that the present approach substantially enhances the vapor class separability in feature space. The validation is done by generating synthetic sensor array data based on well-established SAW sensor theory.
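
    A rough sketch of the feature-fusion step, assuming synthetic first-order SAW transients whose different time constants stand in for different coating thicknesses: approximation coefficients of a discrete wavelet decomposition of each sensor's transient are concatenated across the array and projected with PCA. The signal model, wavelet choice and dimensions are assumptions:

        import numpy as np
        import pywt
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)

        def synthetic_transient(tau, n=256):
            # Hypothetical SAW sensor transient: exponential approach to equilibrium plus noise.
            t = np.linspace(0, 1, n)
            return (1 - np.exp(-t / tau)) + 0.01 * rng.standard_normal(n)

        def fused_features(transients, wavelet="db4", level=4):
            # Concatenate wavelet approximation coefficients from all sensors in the array.
            feats = [pywt.wavedec(x, wavelet, level=level)[0] for x in transients]
            return np.concatenate(feats)

        # Hypothetical array of 4 sensors with different coating thicknesses -> different time constants.
        taus_per_class = {"vapor_A": [0.05, 0.1, 0.2, 0.4], "vapor_B": [0.08, 0.16, 0.32, 0.64]}
        X, y = [], []
        for label, taus in taus_per_class.items():
            for _ in range(20):
                X.append(fused_features([synthetic_transient(tau) for tau in taus]))
                y.append(label)

        scores = PCA(n_components=2).fit_transform(np.array(X))
        print(scores.shape)  # samples projected into a 2-D feature space for class-separability inspection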

  5. High Level Information Fusion (HLIF) with nested fusion loops

    Science.gov (United States)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

    Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analyses within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analyst's efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  6. A Modified Adaptive Stochastic Resonance for Detecting Faint Signal in Sensors

    Directory of Open Access Journals (Sweden)

    Hengwei Li

    2007-02-01

    Full Text Available In this paper, an approach is presented to detect faint signals buried in strong noise in sensors by stochastic resonance (SR). We adopt the power spectrum as the evaluation tool of SR, which can be obtained by the fast Fourier transform (FFT). Furthermore, we introduce an adaptive filtering scheme to realize signal processing automatically. The key of the scheme is how to adjust the barrier height to satisfy the optimal condition of SR in the presence of any input. For the given input signal, we present an operable procedure to execute the adjustment scheme. An example utilizing one audio sensor to detect fault information from the power supply is given. Simulation results show that th
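
    The abstract is truncated, so the sketch below only illustrates the classic bistable stochastic resonance set-up it refers to: a weak periodic drive plus noise acts on the overdamped double-well system dx/dt = a*x - b*x^3 + s(t) + noise, and the FFT power at the drive frequency is used as the evaluation measure. The adaptive barrier adjustment is reduced here to a simple grid search over the parameter a; all values are assumptions:

        import numpy as np

        rng = np.random.default_rng(3)

        def sr_output_power(a, b=1.0, f0=0.1, amp=0.1, noise=0.6, dt=0.01, n=2**15):
            # Simulate the bistable SR system (Euler-Maruyama) and return FFT power at the drive frequency f0.
            t = np.arange(n) * dt
            drive = amp * np.sin(2 * np.pi * f0 * t)
            x = np.zeros(n)
            for i in range(1, n):
                dx = a * x[i-1] - b * x[i-1]**3 + drive[i-1]
                x[i] = x[i-1] + dt * dx + np.sqrt(2 * noise * dt) * rng.standard_normal()
            spec = np.abs(np.fft.rfft(x)) ** 2
            freqs = np.fft.rfftfreq(n, dt)
            return spec[np.argmin(np.abs(freqs - f0))]

        # Crude "adaptive" stage: pick the barrier parameter a that maximises power at f0.
        best = max((sr_output_power(a), a) for a in (0.2, 0.5, 1.0, 2.0, 4.0))
        print(f"best barrier parameter a = {best[1]} (spectral power {best[0]:.1f})")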

  7. Extending lifetime of wireless sensor networks using multi-sensor ...

    Indian Academy of Sciences (India)

    SOUMITRA DAS

    In this paper a multi-sensor data fusion approach for wireless sensor networks based on Bayesian methods and ant colony ... niques for efficiently routing the data from source to the BS ... Literature review ... efficient scheduling and lot more to increase the lifetime of ... Nature-inspired algorithms such as ACO algorithms have.

  8. Fusion of Haptic and Gesture Sensors for Rehabilitation of Bimanual Coordination and Dexterous Manipulation.

    Science.gov (United States)

    Yu, Ningbo; Xu, Chang; Li, Huanshuai; Wang, Kui; Wang, Liancheng; Liu, Jingtai

    2016-03-18

    Disabilities after neural injury, such as stroke, bring a tremendous burden to patients, families and society. Besides the conventional constraint-induced training with a paretic arm, bilateral rehabilitation training involves both the ipsilateral and contralateral sides of the neural injury, fitting well with the fact that both arms are needed in common activities of daily living (ADLs), and can promote good functional recovery. In this work, the fusion of a gesture sensor and a haptic sensor with force feedback capabilities has enabled a bilateral rehabilitation training therapy. The Leap Motion gesture sensor detects the motion of the healthy hand, and the omega.7 device can detect and assist the paretic hand, according to the designed cooperative task paradigm, as much as needed, with active force feedback to accomplish the manipulation task. A virtual scenario has been built up, and the motion and force data facilitate instantaneous visual and audio feedback, as well as further analysis of the functional capabilities of the patient. This task-oriented bimanual training paradigm recruits the sensory, motor and cognitive aspects of the patient into one loop, encourages the active involvement of the patients in rehabilitation training, strengthens the cooperation of both the healthy and impaired hands, challenges the dexterous manipulation capability of the paretic hand, is easy to use at home or in centralized institutions and, thus, shows effective potential for rehabilitation training.

  9. Motion-sensor fusion-based gesture recognition and its VLSI architecture design for mobile devices

    Science.gov (United States)

    Zhu, Wenping; Liu, Leibo; Yin, Shouyi; Hu, Siqi; Tang, Eugene Y.; Wei, Shaojun

    2014-05-01

    With the rapid proliferation of smartphones and tablets, various embedded sensors are incorporated into these platforms to enable multimodal human-computer interfaces. Gesture recognition, as an intuitive interaction approach, has been extensively explored in the mobile computing community. However, most gesture recognition implementations to date are user-dependent and rely only on the accelerometer. In order to achieve competitive accuracy, users are required to hold the devices in a predefined manner during operation. In this paper, a high-accuracy human gesture recognition system is proposed based on multiple motion sensor fusion. Furthermore, to reduce the energy overhead resulting from frequent sensor sampling and data processing, a highly energy-efficient VLSI architecture implemented on a Xilinx Virtex-5 FPGA board is also proposed. Compared with the pure software implementation, an approximately 45 times speed-up is achieved while operating at 20 MHz. The experiments show that the average accuracy for 10 gestures reaches 93.98% for the user-independent case and 96.14% for the user-dependent case when subjects hold the device randomly while completing the specified gestures. Although a few percent lower than the conventional best result, this still provides competitive accuracy acceptable for practical usage. Most importantly, the proposed system allows users to hold the device randomly while performing the predefined gestures, which substantially enhances the user experience.

  10. Fusion of Haptic and Gesture Sensors for Rehabilitation of Bimanual Coordination and Dexterous Manipulation

    Directory of Open Access Journals (Sweden)

    Ningbo Yu

    2016-03-01

    Full Text Available Disabilities after neural injury, such as stroke, bring a tremendous burden to patients, families and society. Besides the conventional constraint-induced training with a paretic arm, bilateral rehabilitation training involves both the ipsilateral and contralateral sides of the neural injury, fitting well with the fact that both arms are needed in common activities of daily living (ADLs), and can promote good functional recovery. In this work, the fusion of a gesture sensor and a haptic sensor with force feedback capabilities has enabled a bilateral rehabilitation training therapy. The Leap Motion gesture sensor detects the motion of the healthy hand, and the omega.7 device can detect and assist the paretic hand, according to the designed cooperative task paradigm, as much as needed, with active force feedback to accomplish the manipulation task. A virtual scenario has been built up, and the motion and force data facilitate instantaneous visual and audio feedback, as well as further analysis of the functional capabilities of the patient. This task-oriented bimanual training paradigm recruits the sensory, motor and cognitive aspects of the patient into one loop, encourages the active involvement of the patients in rehabilitation training, strengthens the cooperation of both the healthy and impaired hands, challenges the dexterous manipulation capability of the paretic hand, is easy to use at home or in centralized institutions and, thus, shows effective potential for rehabilitation training.

  11. Pragmatic data fusion uncertainty concerns: Tribute to Dave L. Hall

    CSIR Research Space (South Africa)

    Blasch, E

    2016-07-01

    Full Text Available Over the course of Dave Hall's career, he highlighted various concerns associated with the implementation of data fusion methods. Many of the issues included the role of uncertainty in data fusion, practical implementation of sensor fusion systems...

  12. Three-Dimensional Sensor Common Operating Picture (3-D Sensor COP)

    Science.gov (United States)

    2017-01-01

    DEMs that have been computed from the point clouds. Additionally, Fusion3D can also display 3-D data created using photogrammetry software... Picture (3-D Sensor COP). To test the 3-D Sensor COP, we took advantage of a sensor network that had been deployed for the Enterprise Challenge 2016 (EC16) at Fort Huachuca in Sierra Vista, Arizona. The

  13. Distributed fusion estimation for sensor networks with communication constraints

    CERN Document Server

    Zhang, Wen-An; Song, Haiyu; Yu, Li

    2016-01-01

    This book systematically presents energy-efficient robust fusion estimation methods to achieve thorough and comprehensive results in the context of network-based fusion estimation. It summarizes recent findings on fusion estimation with communication constraints; several novel energy-efficient and robust design methods for dealing with energy constraints and network-induced uncertainties are presented, such as delays, packet losses, and asynchronous information... All the results are presented as algorithms, which are convenient for practical applications.

  14. Towards adaptive security for convergent wireless sensor networks in beyond 3G environments

    DEFF Research Database (Denmark)

    Mitseva, Anelia; Aivaloglou, Efthimia; Marchitti, Maria-Antonietta

    2010-01-01

    The integration of wireless sensor networks with different network systems gives rise to many research challenges to ensure security, privacy and trust in the overall architecture. The main contribution of this paper is a generic security, privacy and trust framework providing context-aware adapt...

  15. An adaptive wing for a small-aircraft application with a configuration of fibre Bragg grating sensors

    International Nuclear Information System (INIS)

    Mieloszyk, M; Krawczuk, M; Zak, A; Ostachowicz, W

    2010-01-01

    In this paper a concept of an adaptive wing for small-aircraft applications with an array of fibre Bragg grating (FBG) sensors has been presented and discussed. In this concept the shape of the wing can be controlled and altered thanks to the wing design and the use of integrated shape memory alloy actuators. The concept has been tested numerically by the use of the finite element method. For numerical calculations the commercial finite element package ABAQUS ® has been employed. A finite element model of the wing has been prepared in order to estimate the values of the wing twisting angles and distributions of the twist for various activation scenarios. Based on the results of numerical analysis the locations and numbers of the FBG sensors have also been determined. The results of numerical calculations obtained by the authors confirmed the usefulness of the assumed wing control strategy. Based on them and the concept developed of the adaptive wing, a wing demonstration stand has been designed and built. The stand has been used to verify experimentally the performance of the adaptive wing and the usefulness of the FBG sensors for evaluation of the wing condition

  16. Performance of UWB Array-Based Radar Sensor in a Multi-Sensor Vehicle-Based Suit for Landmine Detection

    NARCIS (Netherlands)

    Yarovoy, A.; Savelyev, T.; Zhuge, X.; Aubry, P.; Ligthart, L.; Schavemaker, J.G.M.; Tettelaar, P.; Breejen, E. de

    2008-01-01

    In this paper, integration of an UWB array-based timedomain radar sensor in a vehicle-mounted multi-sensor system for landmine detection is described. Dedicated real-time signal processing algorithms are developed to compute the radar sensor confidence map which is used for sensor fusion.

  17. Complimentary Advanced Fusion Exploration

    National Research Council Canada - National Science Library

    Alford, Mark G; Jones, Eric C; Bubalo, Adnan; Neumann, Melissa; Greer, Michael J

    2005-01-01

    .... The focus areas were in the following regimes: multi-tensor homographic computer vision image fusion, out-of-sequence measurement and track data handling, Nash bargaining approaches to sensor management, pursuit-evasion game theoretic modeling...

  18. Adaptive Square-Shaped Trajectory-Based Service Location Protocol in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hwa-Jung Lim

    2010-04-01

    Full Text Available In this paper we propose an adaptive square-shaped trajectory (ASST)-based service location method to ensure load scalability in wireless sensor networks. The method first establishes a square-shaped trajectory over the nodes that surround a target point computed by the hash function, which any user can access using the hash. Both the width and the size of the trajectory are dynamically adjustable, depending on the number of queries made to the service information on the trajectory. The number of sensor nodes on the trajectory varies in proportion to the changing trajectory shape, allowing high loads to be distributed around the hot spot area.

  19. Audio-Visual Fusion for Sound Source Localization and Improved Attention

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Byoung Gi; Choi, Jong Suk; Yoon, Sang Suk; Choi, Mun Taek; Kim, Mun Sang [Korea Institute of Science and Technology, Daejeon (Korea, Republic of); Kim, Dai Jin [Pohang University of Science and Technology, Pohang (Korea, Republic of)

    2011-07-15

    Service robots are equipped with various sensors such as vision camera, sonar sensor, laser scanner, and microphones. Although these sensors have their own functions, some of them can be made to work together and perform more complicated functions. Audio-visual fusion is a typical and powerful combination of audio and video sensors, because audio information is complementary to visual information and vice versa. Human beings also mainly depend on visual and auditory information in their daily life. In this paper, we conduct two studies using audio-visual fusion: one is on enhancing the performance of sound localization, and the other is on improving robot attention through sound localization and face detection.

  20. Audio-Visual Fusion for Sound Source Localization and Improved Attention

    International Nuclear Information System (INIS)

    Lee, Byoung Gi; Choi, Jong Suk; Yoon, Sang Suk; Choi, Mun Taek; Kim, Mun Sang; Kim, Dai Jin

    2011-01-01

    Service robots are equipped with various sensors such as vision camera, sonar sensor, laser scanner, and microphones. Although these sensors have their own functions, some of them can be made to work together and perform more complicated functions. Audio-visual fusion is a typical and powerful combination of audio and video sensors, because audio information is complementary to visual information and vice versa. Human beings also mainly depend on visual and auditory information in their daily life. In this paper, we conduct two studies using audio-visual fusion: one is on enhancing the performance of sound localization, and the other is on improving robot attention through sound localization and face detection.

  1. Online Sensor Drift Compensation for E-Nose Systems Using Domain Adaptation and Extreme Learning Machine

    Science.gov (United States)

    Luo, Guangchun; Qin, Ke; Wang, Nan; Niu, Weina

    2018-01-01

    Sensor drift is a common issue in E-Nose systems and various drift compensation methods have achieved fruitful results in recent years. Although the accuracy for recognizing diverse gases under drift conditions has been largely enhanced, few of these methods considered online processing scenarios. In this paper, we focus on building an online drift compensation model by transforming two domain-adaptation-based methods into their online learning versions, which allow the recognition models to adapt to the changes of sensor responses in a time-efficient manner without losing the high accuracy. Experimental results using three different settings confirm that the proposed methods save substantial processing time when compared with their offline versions, and outperform other drift compensation methods in recognition accuracy. PMID:29494543

  2. Online Sensor Drift Compensation for E-Nose Systems Using Domain Adaptation and Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Zhiyuan Ma

    2018-03-01

    Full Text Available Sensor drift is a common issue in E-Nose systems and various drift compensation methods have achieved fruitful results in recent years. Although the accuracy for recognizing diverse gases under drift conditions has been largely enhanced, few of these methods considered online processing scenarios. In this paper, we focus on building an online drift compensation model by transforming two domain-adaptation-based methods into their online learning versions, which allow the recognition models to adapt to the changes of sensor responses in a time-efficient manner without losing the high accuracy. Experimental results using three different settings confirm that the proposed methods save substantial processing time when compared with their offline versions, and outperform other drift compensation methods in recognition accuracy.
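
    The two records above combine domain adaptation with an extreme learning machine (ELM); the sketch below shows only a minimal batch ELM classifier (random hidden layer plus ridge-regularised output weights), without the online domain-adaptation part. The placeholder gas-sensor data and dimensions are assumptions:

        import numpy as np

        class ELMClassifier:
            # Minimal extreme learning machine: random hidden layer + ridge-regularised output weights.
            def __init__(self, n_hidden=200, reg=1e-2, seed=0):
                self.n_hidden, self.reg = n_hidden, reg
                self.rng = np.random.default_rng(seed)

            def _hidden(self, X):
                return np.tanh(X @ self.W + self.b)

            def fit(self, X, y):
                classes, y_idx = np.unique(y, return_inverse=True)
                self.classes_ = classes
                T = np.eye(len(classes))[y_idx]                  # one-hot targets
                self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
                self.b = self.rng.standard_normal(self.n_hidden)
                H = self._hidden(X)
                # Output weights via regularised least squares: beta = (H'H + reg*I)^-1 H'T
                self.beta = np.linalg.solve(H.T @ H + self.reg * np.eye(self.n_hidden), H.T @ T)
                return self

            def predict(self, X):
                return self.classes_[np.argmax(self._hidden(X) @ self.beta, axis=1)]

        # Placeholder "gas sensor" data: 128-dimensional responses for 3 gas classes.
        rng = np.random.default_rng(1)
        X = rng.standard_normal((300, 128)) + np.repeat(np.arange(3)[:, None], 100, axis=0)
        y = np.repeat(np.arange(3), 100)
        clf = ELMClassifier().fit(X, y)
        print("training accuracy:", (clf.predict(X) == y).mean())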

  3. Multipoint dynamically reconfigure adaptive distributed fiber optic acoustic emission sensor (FAESense) system for condition based maintenance

    Science.gov (United States)

    Mendoza, Edgar; Prohaska, John; Kempen, Connie; Esterkin, Yan; Sun, Sunjian; Krishnaswamy, Sridhar

    2010-09-01

    This paper describes preliminary results obtained under a Navy SBIR contract by Redondo Optics Inc. (ROI), in collaboration with Northwestern University towards the development and demonstration of a next generation, stand-alone and fully integrated, dynamically reconfigurable, adaptive fiber optic acoustic emission sensor (FAESense™) system for the in-situ unattended detection and localization of shock events, impact damage, cracks, voids, and delaminations in new and aging critical infrastructures found in ships, submarines, aircraft, and in next generation weapon systems. ROI's FAESense™ system is based on the integration of proven state-of-the-art technologies: 1) distributed array of in-line fiber Bragg gratings (FBGs) sensors sensitive to strain, vibration, and acoustic emissions, 2) adaptive spectral demodulation of FBG sensor dynamic signals using two-wave mixing interferometry on photorefractive semiconductors, and 3) integration of all the sensor system passive and active optoelectronic components within a 0.5-cm x 1-cm photonic integrated circuit microchip. The adaptive TWM demodulation methodology allows the measurement of dynamic high frequency acoustic emission events, while compensating for passive quasi-static strain and temperature drifts. It features a compact, low power, environmentally robust 1-inch x 1-inch x 4-inch small form factor (SFF) package with no moving parts. The FAESense™ interrogation system is microprocessor-controlled using high data rate signal processing electronics for the FBG sensors calibration, temperature compensation and the detection and analysis of acoustic emission signals. Its miniaturized package, low power operation, state-of-the-art data communications, and low cost makes it a very attractive solution for a large number of applications in naval and maritime industries, aerospace, civil structures, the oil and chemical industry, and for homeland security applications.

  4. Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.

    Science.gov (United States)

    Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David

    2013-12-01

    Recently, a new biometrics identifier, namely finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user flexibility in positioning fingers also leads to a certain degree of pose variations in the collected query FKP images. The widely used Gabor filtering based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can greatly reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances will be reduced. We then propose a score level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce the false rejections without substantially increasing the false acceptances. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves the FKP verification accuracy.

  5. Change Detection of High-Resolution Remote Sensing Images Based on Adaptive Fusion of Multiple Features

    Science.gov (United States)

    Wang, G. H.; Wang, H. B.; Fan, W. F.; Liu, Y.; Chen, C.

    2018-04-01

    Traditional change detection algorithms rely mainly on the spectral information of image patches and fail to effectively mine and fuse the advantages of multiple image features. Borrowing ideas from object-oriented analysis, this article proposes a change detection algorithm for remote sensing images based on the adaptive fusion of multiple features. First, image objects are obtained by multi-scale segmentation; then the color histogram and the linear gradient histogram of each object are calculated; the EMD statistical operator is used to measure the color distance and the edge-line feature distance between corresponding objects from different periods, and an adaptive weighting method combines the color feature distance and the edge-line distance into an object heterogeneity measure. Finally, curvature histogram analysis of the image patches yields the change detection results. The experimental results show that the method can fully fuse the color and edge-line features, thus improving the accuracy of the change detection.

  6. Sensor Validation using Bayesian Networks

    Data.gov (United States)

    National Aeronautics and Space Administration — One of NASA’s key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in...

  7. A Novel Evidence Theory and Fuzzy Preference Approach-Based Multi-Sensor Data Fusion Technique for Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Fuyuan Xiao

    2017-10-01

    Full Text Available The multi-sensor data fusion technique plays a significant role in fault diagnosis and in a variety of such applications, and the Dempster–Shafer evidence theory is employed to improve the system performance; whereas, it may generate a counter-intuitive result when the pieces of evidence highly conflict with each other. To handle this problem, a novel multi-sensor data fusion approach on the basis of the distance of evidence, belief entropy and fuzzy preference relation analysis is proposed. A function of evidence distance is first leveraged to measure the conflict degree among the pieces of evidence; thus, the support degree can be obtained to represent the reliability of the evidence. Next, the uncertainty of each piece of evidence is measured by means of the belief entropy. Based on the quantitative uncertainty measured above, the fuzzy preference relations are applied to represent the relative credibility preference of the evidence. Afterwards, the support degree of each piece of evidence is adjusted by taking advantage of the relative credibility preference of the evidence that can be utilized to generate an appropriate weight with respect to each piece of evidence. Finally, the modified weights of the evidence are adopted to adjust the bodies of the evidence in the advance of utilizing Dempster’s combination rule. A numerical example and a practical application in fault diagnosis are used as illustrations to demonstrate that the proposal is reasonable and efficient in the management of conflict and fault diagnosis.
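
    To make the role of Dempster's combination rule concrete, the snippet below fuses two basic probability assignments over a small frame of discernment of fault hypotheses; the distance-of-evidence, belief-entropy and fuzzy-preference weighting steps of the proposed method are omitted. The hypotheses and mass values are hypothetical:

        def dempster_combine(m1, m2):
            # Dempster's rule of combination for mass functions keyed by frozensets of hypotheses.
            combined, conflict = {}, 0.0
            for a, ma in m1.items():
                for b, mb in m2.items():
                    inter = a & b
                    if inter:
                        combined[inter] = combined.get(inter, 0.0) + ma * mb
                    else:
                        conflict += ma * mb            # mass assigned to the empty set
            if conflict >= 1.0:
                raise ValueError("total conflict, combination undefined")
            return {k: v / (1.0 - conflict) for k, v in combined.items()}, conflict

        # Hypothetical evidence from two sensors about fault classes F1, F2, F3.
        F1, F2, F3 = frozenset({"F1"}), frozenset({"F2"}), frozenset({"F3"})
        theta = frozenset({"F1", "F2", "F3"})
        m_sensor1 = {F1: 0.6, F2: 0.3, theta: 0.1}
        m_sensor2 = {F1: 0.5, F3: 0.4, theta: 0.1}
        fused, k = dempster_combine(m_sensor1, m_sensor2)
        print("conflict:", round(k, 3))
        for subset, mass in sorted(fused.items(), key=lambda kv: -kv[1]):
            print(set(subset), round(mass, 3))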

  8. Information fusion under consideration of conflicting input signals

    CERN Document Server

    Mönks, Uwe

    2017-01-01

    This work proposes the multilayered information fusion system MACRO (multilayer attribute-based conflict-reducing observation) and the µBalTLCS (fuzzified balanced two-layer conflict solving) fusion algorithm to reduce the impact of conflicts on the fusion result. In addition, a sensor defect detection method, which is based on the continuous monitoring of sensor reliabilities, is presented. The performances of the contributions are shown by their evaluation in the scope of both a publicly available data set and a machine condition monitoring application under laboratory conditions. Here, the MACRO system yields the best results compared to state-of-the-art fusion mechanisms. The author Dr.-Ing. Uwe Mönks studied Electrical Engineering and Information Technology at the OWL University of Applied Sciences (Lemgo), Halmstad University (Sweden), and Aalborg University (Denmark). Since 2009 he is employed at the Institute Industrial IT (inIT) as research associate with project leading responsibilities. During th...

  9. Decentralized Hypothesis Testing in Energy Harvesting Wireless Sensor Networks

    Science.gov (United States)

    Tarighati, Alla; Gross, James; Jalden, Joakim

    2017-09-01

    We consider the problem of decentralized hypothesis testing in a network of energy harvesting sensors, where sensors make noisy observations of a phenomenon and send quantized information about the phenomenon towards a fusion center. The fusion center makes a decision about the present hypothesis using the aggregate received data during a time interval. We explicitly consider a scenario under which the messages are sent through parallel access channels towards the fusion center. To avoid limited lifetime issues, we assume each sensor is capable of harvesting all the energy it needs for the communication from the environment. Each sensor has an energy buffer (battery) to save its harvested energy for use in other time intervals. Our key contribution is to formulate the problem of decentralized detection in a sensor network with energy harvesting devices. Our analysis is based on a queuing-theoretic model for the battery and we propose a sensor decision design method by considering long term energy management at the sensors. We show how the performance of the system changes for different battery capacities. We then numerically show how our findings can be used in the design of sensor networks with energy harvesting sensors.

  10. Fault Diagnosis for Satellite Sensors and Actuators using Nonlinear Geometric Approach and Adaptive Observers

    DEFF Research Database (Denmark)

    Baldi, P.; Blanke, Mogens; Castaldi, P.

    2018-01-01

    This paper presents a novel scheme for diagnosis of faults affecting sensors that measure the satellite attitude, body angular velocity, flywheel spin rates, and defects in control torques from reaction wheel motors. The proposed methodology uses adaptive observers to provide fault estimates that...

  11. Engineering workstation: Sensor modeling

    Science.gov (United States)

    Pavel, M; Sweet, B.

    1993-01-01

    The purpose of the engineering workstation is to provide an environment for rapid prototyping and evaluation of fusion and image processing algorithms. Ideally, the algorithms are designed to optimize the extraction of information that is useful to a pilot for all phases of flight operations. Successful design of effective fusion algorithms depends on the ability to characterize both the information available from the sensors and the information useful to a pilot. The workstation is comprised of subsystems for simulation of sensor-generated images, image processing, image enhancement, and fusion algorithms. As such, the workstation can be used to implement and evaluate both short-term solutions and long-term solutions. The short-term solutions are being developed to enhance a pilot's situational awareness by providing information in addition to his direct vision. The long term solutions are aimed at the development of complete synthetic vision systems. One of the important functions of the engineering workstation is to simulate the images that would be generated by the sensors. The simulation system is designed to use the graphics modeling and rendering capabilities of various workstations manufactured by Silicon Graphics Inc. The workstation simulates various aspects of the sensor-generated images arising from phenomenology of the sensors. In addition, the workstation can be used to simulate a variety of impairments due to mechanical limitations of the sensor placement and due to the motion of the airplane. Although the simulation is currently not performed in real-time, sequences of individual frames can be processed, stored, and recorded in a video format. In that way, it is possible to examine the appearance of different dynamic sensor-generated and fused images.

  12. Hardware implementation of adaptive filtering using charge-coupled devices. [For perimeter security sensors

    Energy Technology Data Exchange (ETDEWEB)

    Donohoe, G.W.

    1977-01-01

    Sandia Laboratories' Digital Systems Division/1734, as part of its work on the Base and Installation Security Systems (BISS) program has been making use of adaptive digital filters to improve the signal-to-noise ratio of perimeter sensor signals. In particular, the Widrow-Hoff least-mean-squares algorithm has been used extensively. This non-recursive linear predictor has been successful in extracting aperiodic signals from periodic noise. The adaptive filter generates a predictor signal which is subtracted from the input signal to produce an error signal. The value of this error is fed back to the filter to improve the quality of the next prediction. Implementation of the Widrow adaptive filter using a Charge-Coupled Device tapped analog delay line, analog voltage multipliers and operational amplifiers is described. The resulting filter adapts to signals with frequency components as high as several megahertz.
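
    A software counterpart of the Widrow-Hoff least-mean-squares predictor described here, assuming a synthetic sensor signal made of periodic noise plus a short aperiodic event: the filter predicts each sample from past samples, and the prediction error exposes the aperiodic component. Filter length, step size and the test signal are assumptions:

        import numpy as np

        def lms_predictor(x, taps=32, mu=0.01):
            # Widrow-Hoff LMS linear predictor; returns the prediction-error (novelty) signal.
            w = np.zeros(taps)
            error = np.zeros_like(x)
            for n in range(taps, len(x)):
                window = x[n-taps:n][::-1]          # most recent samples first
                prediction = w @ window
                error[n] = x[n] - prediction
                w += 2 * mu * error[n] * window     # LMS weight update
            return error

        rng = np.random.default_rng(0)
        n = 2000
        t = np.arange(n)
        periodic_noise = np.sin(2 * np.pi * t / 50) + 0.5 * np.sin(2 * np.pi * t / 23)
        signal = periodic_noise + 0.05 * rng.standard_normal(n)
        signal[1200:1210] += 2.0                    # hypothetical aperiodic intrusion event
        err = lms_predictor(signal)
        print("peak of error signal near sample", int(np.argmax(np.abs(err))))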

  13. An Adaptive Technique for a Redundant-Sensor Navigation System. Ph.D. Thesis

    Science.gov (United States)

    Chien, T. T.

    1972-01-01

    An on-line adaptive technique is developed to provide a self-contained redundant-sensor navigation system with a capability to utilize its full potentiality in reliability and performance. The gyro navigation system is modeled as a Gauss-Markov process, with degradation modes defined as changes in characteristics specified by parameters associated with the model. The adaptive system is formulated as a multistage stochastic process: (1) a detection system, (2) an identification system and (3) a compensation system. It is shown that the sufficient statistics for the partially observable process in the detection and identification system is the posterior measure of the state of degradation, conditioned on the measurement history.

  14. Sensor management for identity fusion on a mobile robot

    DEFF Research Database (Denmark)

    Larsen, Thomas Dall; Andersen, Nils Axel; Ravn, Ole

    1998-01-01

    This paper addresses the problem of identity fusion, i.e. the problem of selecting one of several identity hypotheses concerning an observed object. Two problems are considered. Firstly the problem of preserving the information in the representation and fusion of measurements relating to identity...

  15. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving.

    Science.gov (United States)

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-10-11

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle's surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  16. Cell-cycle dependent expression of a translocation-mediated fusion oncogene mediates checkpoint adaptation in rhabdomyosarcoma.

    Directory of Open Access Journals (Sweden)

    Ken Kikuchi

    2014-01-01

    Full Text Available Rhabdomyosarcoma is the most commonly occurring soft-tissue sarcoma in childhood. Most rhabdomyosarcoma falls into one of two biologically distinct subgroups represented by alveolar or embryonal histology. The alveolar subtype harbors a translocation-mediated PAX3:FOXO1A fusion gene and has an extremely poor prognosis. However, tumor cells have heterogeneous expression of the fusion gene. Using a conditional genetic mouse model as well as human tumor cell lines, we show that Pax3:Foxo1a expression is enriched in G2 and triggers a transcriptional program conducive to checkpoint adaptation under stress conditions such as irradiation in vitro and in vivo. Pax3:Foxo1a also tolerizes tumor cells to clinically-established chemotherapy agents and emerging molecularly-targeted agents. Thus, the surprisingly dynamic regulation of the Pax3:Foxo1a locus is a paradigm that has important implications for the way in which oncogenes are modeled in cancer cells.

  17. Adaptive Pulsed Laser Line Extraction for Terrain Reconstruction using a Dynamic Vision Sensor

    Directory of Open Access Journals (Sweden)

    Christian eBrandli

    2014-01-01

    Full Text Available Mobile robots need to know the terrain in which they are moving for path planning and obstacle avoidance. This paper proposes the combination of a bio-inspired, redundancy-suppressing dynamic vision sensor with a pulsed line laser to allow fast terrain reconstruction. A stable laser stripe extraction is achieved by exploiting the sensor’s ability to capture the temporal dynamics in a scene. An adaptive temporal filter for the sensor output allows a reliable reconstruction of 3D terrain surfaces. Laser stripe extractions up to pulsing frequencies of 500 Hz were achieved using a line laser of 3 mW at a distance of 45 cm using an event-based algorithm that exploits the sparseness of the sensor output. As a proof of concept, unstructured rapid prototype terrain samples have been successfully reconstructed with an accuracy of 2 mm.

  18. An Efficient and Self-Adapting Localization in Static Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Wei Dong

    2009-08-01

    Full Text Available Localization is one of the most important subjects in Wireless Sensor Networks (WSNs). To reduce the number of beacons and adopt probabilistic methods, some particle filter-based mobile beacon-assisted localization approaches have been proposed, such as Mobile Beacon-assisted Localization (MBL), Adapting MBL (A-MBL), and the method proposed by Hang et al. Some new significant problems arise in these approaches, however. The first question is which probability distribution should be selected as the dynamic model in the prediction stage. The second is whether the unknown node adopts neighbors’ observation in the update stage. The third is how to find a self-adapting mechanism to achieve more flexibility in the adapting stage. In this paper, we give the theoretical analysis and experimental evaluations to suggest which probability distribution in the dynamic model should be adopted to improve the efficiency in the prediction stage. We also give the condition for whether the unknown node should use the observations from its neighbors to improve the accuracy. Finally, we propose a Self-Adapting Mobile Beacon-assisted Localization (SA-MBL) approach to achieve more flexibility and achieve almost the same performance as A-MBL.

  19. A systematic review of gait analysis methods based on inertial sensors and adaptive algorithms.

    Science.gov (United States)

    Caldas, Rafael; Mundt, Marion; Potthast, Wolfgang; Buarque de Lima Neto, Fernando; Markert, Bernd

    2017-09-01

    The conventional methods to assess human gait are either expensive or too complex to be applied regularly in clinical practice. To reduce the cost and simplify the evaluation, inertial sensors and adaptive algorithms have been utilized, respectively. This paper aims to summarize studies that applied adaptive, also called artificial intelligence (AI), algorithms to gait analysis based on inertial sensor data, verifying whether they can support the clinical evaluation. Articles were identified through searches of the main databases, covering the period from 1968 to October 2016. We identified 22 studies that met the inclusion criteria. The included papers were analyzed with specific questionnaires regarding their data acquisition and processing methods. Concerning the data acquisition, the mean score is 6.1±1.62, which implies that 13 of 22 papers failed to report relevant outcomes. The quality assessment of AI algorithms presents an above-average rating (8.2±1.84). Therefore, AI algorithms seem to be able to support gait analysis based on inertial sensor data. Further research, however, is necessary to enhance and standardize the application in patients, since most of the studies used distinct methods to evaluate healthy subjects. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Adaptive Opportunistic Cooperative Control Mechanism Based on Combination Forecasting and Multilevel Sensing Technology of Sensors for Mobile Internet of Things

    Directory of Open Access Journals (Sweden)

    Yong Jin

    2014-01-01

    Full Text Available In the mobile Internet of Things, there are many challenges, including the sensing technology of sensors, how and when to join cooperative transmission, and how to select the cooperative sensors. To address these problems, we studied combination forecasting based on the multilevel sensing technology of sensors, building upon which we propose an adaptive opportunistic cooperative control mechanism based on threshold values such as activity probability, distance, transmitting power, and number of relay sensors, in consideration of the signal-to-noise ratio and outage probability. More importantly, the relay sensors perform real-time self-tests in order to judge whether to join the cooperative transmission, so as to maintain the optimal cooperative transmission state with high performance. The mathematical analysis results show that the proposed adaptive opportunistic cooperative control approach performs better in terms of throughput ratio, packet error rate, delay, and energy efficiency, compared with the direct transmission and opportunistic cooperative approaches.

  1. Soft sensor modelling by time difference, recursive partial least squares and adaptive model updating

    International Nuclear Information System (INIS)

    Fu, Y; Xu, O; Yang, W; Zhou, L; Wang, J

    2017-01-01

    To investigate time-variant and nonlinear characteristics in industrial processes, a soft sensor modelling method based on time difference, moving-window recursive partial least square (PLS) and adaptive model updating is proposed. In this method, time difference values of input and output variables are used as training samples to construct the model, which can reduce the effects of the nonlinear characteristic on modelling accuracy and retain the advantages of recursive PLS algorithm. To solve the high updating frequency of the model, a confidence value is introduced, which can be updated adaptively according to the results of the model performance assessment. Once the confidence value is updated, the model can be updated. The proposed method has been used to predict the 4-carboxy-benz-aldehyde (CBA) content in the purified terephthalic acid (PTA) oxidation reaction process. The results show that the proposed soft sensor modelling method can reduce computation effectively, improve prediction accuracy by making use of process information and reflect the process characteristics accurately. (paper)
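
    A minimal sketch of the time-difference idea in this record: differences of consecutive input and output samples are used to train a PLS model, and a prediction is rebuilt by adding the predicted difference to the previous output value. The synthetic drifting process, dimensions and train/test split are placeholders; the moving-window recursion and the confidence-based model updating are not shown:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(7)

        # Synthetic slowly drifting process: 5 inputs, 1 quality variable with a time-varying bias.
        n, d = 500, 5
        X = rng.standard_normal((n, d)).cumsum(axis=0) * 0.05 + rng.standard_normal((n, d)) * 0.1
        drift = np.linspace(0, 2.0, n)
        y = X @ np.array([0.5, -0.3, 0.2, 0.1, 0.4]) + drift + 0.05 * rng.standard_normal(n)

        # Time-difference samples remove the slowly varying bias before modelling.
        dX, dy = np.diff(X, axis=0), np.diff(y)
        split = 350
        pls = PLSRegression(n_components=3).fit(dX[:split], dy[:split].reshape(-1, 1))

        # Predict y(t) = y(t-1) + f(x(t) - x(t-1)) on the held-out segment.
        dy_hat = pls.predict(dX[split:]).ravel()
        y_hat = y[split:-1] + dy_hat
        rmse = np.sqrt(np.mean((y_hat - y[split+1:]) ** 2))
        print(f"hold-out RMSE of time-difference PLS soft sensor: {rmse:.3f}")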

  2. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    Directory of Open Access Journals (Sweden)

    Jos Elfring

    2016-10-01

    Full Text Available The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  3. An optical liquid level sensor based on core-offset fusion splicing method using polarization-maintaining fiber

    Science.gov (United States)

    Lou, Weimin; Chen, Debao; Shen, Changyu; Lu, Yanfang; Liu, Huanan; Wei, Jian

    2016-01-01

    A simple liquid level sensor using a small piece of hydrofluoric acid (HF) etched polarization-maintaining fiber (PMF), with an SMF-PMF-SMF fiber structure based on the Mach-Zehnder interference (MZI) mechanism, is proposed. The cladding modes induced by the core-offset fusion splicing method interfere with the core mode. Moreover, a changing liquid level influences the optical path difference of the MZI since the effective refractive indices of the air and the liquid are different. Both the wavelength shift and the power intensity attenuation corresponding to the liquid level can be obtained, with sensitivities of 0.4956 nm/mm and 0.2204 dB/mm, respectively.

  4. Proceedings of the Adaptive Sensor Array Processing Workshop (12th) Held in Lexington, MA on 16-18 March 2004 (CD-ROM)

    National Research Council Canada - National Science Library

    James, F

    2004-01-01

    ...: The twelfth annual workshop on Adaptive Sensor Array Processing presented a diverse agenda featuring new work on adaptive methods for communications, radar and sonar, algorithmic challenges posed...

  5. Projection-based circular constrained state estimation and fusion over long-haul links

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Qiang [ORNL; Rao, Nageswara S. [ORNL

    2017-07-01

    In this paper, we consider a scenario where sensors are deployed over a large geographical area for tracking a target with circular nonlinear constraints on its motion dynamics. The sensor state estimates are sent over long-haul networks to a remote fusion center for fusion. We are interested in different ways to incorporate the constraints into the estimation and fusion process in the presence of communication loss. In particular, we consider closed-form projection-based solutions, including rules for fusing the estimates and for incorporating the constraints, which jointly can guarantee timely fusion often required in realtime systems. We test the performance of these methods in the long-haul tracking environment using a simple example.
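
    As a small illustration of incorporating a circular constraint by projection, the function below maps a 2-D position estimate onto a known circle (the Euclidean projection of the position components); the covariance handling and the long-haul fusion rules discussed in the paper are not shown, and the circle parameters and the estimate are hypothetical:

        import numpy as np

        def project_to_circle(pos, center, radius):
            # Euclidean projection of a 2-D position estimate onto the circle |x - center| = radius.
            center = np.asarray(center, dtype=float)
            v = np.asarray(pos, dtype=float) - center
            norm = np.linalg.norm(v)
            if norm == 0.0:
                # Degenerate case: every point on the circle is equally close; pick one deterministically.
                return center + np.array([radius, 0.0])
            return center + radius * v / norm

        # Hypothetical fused position estimate that drifted off the known circular track.
        estimate = np.array([10.4, 2.1])
        constrained = project_to_circle(estimate, center=(0.0, 0.0), radius=10.0)
        print(constrained, np.linalg.norm(constrained))   # the result lies exactly on the constraint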

  6. Reliability estimates for selected sensors in fusion applications

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1996-09-01

    This report presents the results of a study to define several types of sensors in use, the qualitative reliability (failure modes) and quantitative reliability (average failure rates) for these types of process sensors. Temperature, pressure, flow, and level sensors are discussed for water coolant and for cryogenic coolants. The failure rates that have been found are useful for risk assessment and safety analysis. Repair times and calibration intervals are also given when found in the literature. All of these values can also be useful to plant operators and maintenance personnel. Designers may be able to make use of these data when planning systems. The final chapter in this report discusses failure rates for several types of personnel safety sensors, including ionizing radiation monitors, toxic and combustible gas detectors, humidity sensors, and magnetic field sensors. These data could be useful to industrial hygienists and other safety professionals when designing or auditing for personnel safety

  7. Infrared and visible image fusion using discrete cosine transform and swarm intelligence for surveillance applications

    Science.gov (United States)

    Paramanandham, Nirmala; Rajendiran, Kishore

    2018-01-01

    A novel image fusion technique is presented for integrating infrared and visible images. Integration of images from the same or various sensing modalities can deliver the required information that cannot be delivered by viewing the sensor outputs individually and consecutively. In this paper, a swarm intelligence based image fusion technique using the discrete cosine transform (DCT) domain is proposed for surveillance applications, which integrates the infrared image with the visible image for generating a single informative fused image. Particle swarm optimization (PSO) is used in the fusion process for obtaining the optimized weighting factor. These optimized weighting factors are used for fusing the DCT coefficients of the visible and infrared images. Inverse DCT is applied for obtaining the initial fused image. An enhanced fused image is obtained through adaptive histogram equalization for a better visual understanding and target detection. The proposed framework is evaluated using quantitative metrics such as standard deviation, spatial frequency, entropy and mean gradient. The experimental results demonstrate the outperformance of the proposed algorithm over many other state-of-the-art techniques reported in the literature.
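
    A bare-bones version of the DCT-domain fusion step described here, assuming two pre-registered single-channel images: their 2-D DCT coefficients are blended with a single weighting factor and the result is inverse transformed. In the paper the weight comes from particle swarm optimization and is followed by adaptive histogram equalization; here the weight is fixed and the images are synthetic:

        import numpy as np
        from scipy.fft import dctn, idctn

        def dct_fuse(img_visible, img_infrared, w=0.6):
            # Blend the 2-D DCT coefficients of two registered images with weighting factor w.
            c_vis = dctn(img_visible, norm="ortho")
            c_ir = dctn(img_infrared, norm="ortho")
            fused_coeffs = w * c_vis + (1.0 - w) * c_ir
            return idctn(fused_coeffs, norm="ortho")

        # Synthetic stand-ins for a visible image (texture) and an infrared image (hot target).
        rng = np.random.default_rng(0)
        visible = rng.random((128, 128))
        infrared = np.zeros((128, 128))
        infrared[40:60, 40:60] = 1.0
        fused = dct_fuse(visible, infrared, w=0.6)  # in the paper, w would come from PSO
        print(fused.shape, float(fused.min()), float(fused.max()))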

  8. Multi-Sensor Fusion using Observation Merging with Central Level Architecture

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar; Ahmed, Zaki; Khan, M. Z.

    2011-01-01

    The use of multiple sensors typically requires the fusion of data from different types of sensors. The combined use of such data has the potential to give an efficient, high quality and reliable estimation. Input data from different sensors allows the introduction of target attributes (target ty...

  9. Robust and reliable banknote authentification and print flaw detection with opto-acoustical sensor fusion methods

    Science.gov (United States)

    Lohweg, Volker; Schaede, Johannes; Türke, Thomas

    2006-02-01

    The authenticity checking and inspection of bank notes is a highly labour-intensive process where traditionally every note on every sheet is inspected manually. However, with the advent of more and more sophisticated security features, both visible and invisible, and the requirement of cost reduction in the printing process, it is clear that automation is required. As more and more printing techniques and new security features are established, total quality in security, authenticity and bank note printing must be assured; this necessitates a broader sensorial concept in general. We propose a concept for both authenticity checking and inspection methods for pattern recognition and classification for securities and banknotes, which is based on the concept of sensor fusion and fuzzy interpretation of data measures. In this approach, different methods of authenticity analysis and print flaw detection are combined, which can be used for vending or sorting machines, as well as for printing machines. Usually only the existence or appearance of colours and their textures are checked by cameras. Our method combines the visible camera images with IR-spectral sensitive sensors, acoustical and other measurements such as temperature and pressure of printing machines.

  10. A New Multi-Sensor Track Fusion Architecture for Multi-Sensor Information Integration

    National Research Council Canada - National Science Library

    Jean, Buddy H; Younker, John; Hung, Chih-Cheng

    2004-01-01

    .... This new technology will integrate multi-sensor information and extract integrated multi-sensor information to detect, track and identify multiple targets at any time, in any place under all weather conditions...

  11. Graphical Model Theory for Wireless Sensor Networks

    International Nuclear Information System (INIS)

    Davis, William B.

    2002-01-01

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm

  12. Dependence of the compensation error on the error of a sensor and corrector in an adaptive optics phase-conjugating system

    International Nuclear Information System (INIS)

    Kiyko, V V; Kislov, V I; Ofitserov, E N

    2015-01-01

    In the framework of a statistical model of an adaptive optics system (AOS) of phase conjugation, three algorithms based on an integrated mathematical approach are considered, each of them intended for minimisation of one of the following characteristics: the sensor error (in the case of an ideal corrector), the corrector error (in the case of ideal measurements) and the compensation error (with regard to discreteness and measurement noises and to incompleteness of a system of response functions of the corrector actuators). Functional and statistical relationships between the algorithms are studied and a relation is derived to ensure calculation of the mean-square compensation error as a function of the errors of the sensor and corrector with an accuracy better than 10%. Because in adjusting the AOS parameters, it is reasonable to proceed from the equality of the sensor and corrector errors, in the case the Hartmann sensor is used as a wavefront sensor, the required number of actuators in the absence of the noise component in the sensor error turns out 1.5 – 2.5 times less than the number of counts, and that difference grows with increasing measurement noise. (adaptive optics)

  13. Dependence of the compensation error on the error of a sensor and corrector in an adaptive optics phase-conjugating system

    Energy Technology Data Exchange (ETDEWEB)

    Kiyko, V V; Kislov, V I; Ofitserov, E N [A M Prokhorov General Physics Institute, Russian Academy of Sciences, Moscow (Russian Federation)

    2015-08-31

    In the framework of a statistical model of an adaptive optics system (AOS) of phase conjugation, three algorithms based on an integrated mathematical approach are considered, each of them intended for minimisation of one of the following characteristics: the sensor error (in the case of an ideal corrector), the corrector error (in the case of ideal measurements) and the compensation error (with regard to discreteness and measurement noises and to incompleteness of a system of response functions of the corrector actuators). Functional and statistical relationships between the algorithms are studied and a relation is derived to ensure calculation of the mean-square compensation error as a function of the errors of the sensor and corrector with an accuracy better than 10%. Because, in adjusting the AOS parameters, it is reasonable to proceed from equality of the sensor and corrector errors, when the Hartmann sensor is used as the wavefront sensor the required number of actuators in the absence of a noise component in the sensor error turns out to be 1.5 – 2.5 times smaller than the number of counts, and this difference grows with increasing measurement noise. (adaptive optics)

  14. A Hybrid Adaptive Routing Algorithm for Event-Driven Wireless Sensor Networks

    Science.gov (United States)

    Figueiredo, Carlos M. S.; Nakamura, Eduardo F.; Loureiro, Antonio A. F.

    2009-01-01

    Routing is a basic function in wireless sensor networks (WSNs). For these networks, routing algorithms depend on the characteristics of the applications and, consequently, there is no self-contained algorithm suitable for every case. In some scenarios, the network behavior (traffic load) may vary considerably, as in an event-driven application, favoring different algorithms at different instants. This work presents a hybrid and adaptive algorithm for routing in WSNs, called Multi-MAF, that adapts its behavior autonomously in response to the variation of network conditions. In particular, the proposed algorithm applies both reactive and proactive strategies for routing infrastructure creation, and uses an event-detection estimation model to change between the strategies and save energy. To show the advantages of the proposed approach, it is evaluated through simulations. Comparisons with independent reactive and proactive algorithms show improvements in energy consumption. PMID:22423207

  15. Large-Scale, Multi-Sensor Atmospheric Data Fusion Using Hybrid Cloud Computing

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E. J.

    2015-12-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, MODIS, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration studies presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over 10 years of data. HySDS is a Hybrid-Cloud Science Data System that has been developed and applied under NASA AIST, MEaSUREs, and ACCESS grants. HySDS uses the SciFlow workflow engine to partition analysis workflows into parallel tasks (e.g. segmenting by time or space) that are pushed into a durable job queue. The tasks are "pulled" from the queue by worker Virtual Machines (VMs) and executed in an on-premise Cloud (Eucalyptus or OpenStack) or at Amazon in the public Cloud or govCloud. In this way, years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the transferred data. We are using HySDS to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a MEaSUREs grant. We will present the architecture of HySDS, describe the achieved "clock time" speedups in fusing datasets on our own nodes and in the Amazon Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer. Our system demonstrates how one can pull A-Train variables (Levels 2 & 3) on-demand into the Amazon Cloud, and cache only those variables that are heavily used, so that any number of compute jobs can be

  16. A Novel Energy-Efficient Multi-Sensor Fusion Wake-Up Control Strategy Based on a Biomimetic Infectious-Immune Mechanism for Target Tracking.

    Science.gov (United States)

    Zhou, Jie; Liang, Yan; Shen, Qiang; Feng, Xiaoxue; Pan, Quan

    2018-04-18

    A biomimetic distributed infection-immunity model (BDIIM), inspired by the immune mechanism of an infected organism, is proposed in order to achieve a high-efficiency wake-up control strategy based on multi-sensor fusion for target tracking. The resultant BDIIM consists of six sub-processes reflecting the infection-immunity mechanism: occurrence probabilities of direct-infection (DI) and cross-infection (CI), immunity/immune-deficiency of DI and CI, pathogen amount of DI and CI, immune cell production, immune memory, and pathogen accumulation under immunity state. Furthermore, a corresponding relationship between the BDIIM and sensor wake-up control is established to form the collaborative wake-up method. Finally, joint surveillance and target tracking are formulated in the simulation, in which we show that the energy cost and position tracking error are reduced to 50.8% and 78.9%, respectively. Effectiveness of the proposed BDIIM algorithm is shown, and this model is expected to have a significant role in guiding the performance improvement of multi-sensor networks.

  17. Recent developments concerning the fusion; Developpements recents sur la fusion

    Energy Technology Data Exchange (ETDEWEB)

    Jacquinot, J. [CEA/Cadarache, Dept. de Recherches sur la Fusion Controlee, DRFC, 13 - Saint Paul lez Durance (France); Andre, M. [CEA/DAM Ile de France, 91 - Bruyeres Le Chatel (France); Aymar, R. [ITER Joint Central Team Garching, Muenchen (Germany)] [and others

    2000-09-04

    Organized on 9 March 2000 by the SFEN, this meeting on the European fusion program showed the value of exploiting and enhancing the existing technology (JET, Tore Supra, ASDEX) and the importance of Europe's engagement in the ITER program. The physics stakes of magnetic fusion were presented, including progress in the knowledge of stability limits. A paper on inertial fusion was based on the LMJ (Laser MegaJoule) project. The two blanket concepts chosen within the scope of the European program on tritium blankets were discussed. These concepts will be validated by irradiation tests in ITER-FEAT and adapted for a future reactor. (A.L.B.)

  18. Adaptive Iterated Extended Kalman Filter and Its Application to Autonomous Integrated Navigation for Indoor Robot

    Directory of Open Access Journals (Sweden)

    Yuan Xu

    2014-01-01

    Full Text Available As the core of an integrated navigation system, the data fusion algorithm should be designed carefully. In order to improve the accuracy of data fusion, this work proposes an adaptive iterated extended Kalman filter (AIEKF), which adds a noise statistics estimator to the iterated extended Kalman filter (IEKF); the AIEKF is then used to deal with the nonlinear problem in an inertial navigation system (INS)/wireless sensor network (WSN) integrated navigation system. A practical test has been done to evaluate the performance of the proposed method. The results show that the proposed method is effective in reducing the mean root-mean-square error (RMSE) of position by about 92.53%, 67.93%, 55.97%, and 30.09% compared with the INS only, the WSN only, the EKF, and the IEKF, respectively.
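
    The abstract does not reproduce the estimator equations; the following scalar sketch only illustrates the general innovation-based idea of adapting the measurement-noise variance inside a Kalman filter, under assumed models and parameter values, and is not the AIEKF itself.

```python
# Hedged sketch: a scalar Kalman filter whose measurement-noise variance R is
# re-estimated online from a sliding window of innovations (one common way to
# build a "noise statistics estimator"; the actual AIEKF equations are not
# reproduced here). State model, window length and values are assumptions.
from collections import deque

class AdaptiveKF1D:
    def __init__(self, x0, p0, q, r0, window=20):
        self.x, self.p, self.q, self.r = x0, p0, q, r0
        self.innovations = deque(maxlen=window)

    def step(self, z):
        # Predict (random-walk state model).
        x_pred, p_pred = self.x, self.p + self.q
        # Innovation and adaptive R estimate: R ~ mean(v^2) - H P_pred H^T, H = 1.
        v = z - x_pred
        self.innovations.append(v)
        c_v = sum(i * i for i in self.innovations) / len(self.innovations)
        self.r = max(c_v - p_pred, 1e-6)
        # Update.
        k = p_pred / (p_pred + self.r)
        self.x = x_pred + k * v
        self.p = (1.0 - k) * p_pred
        return self.x

kf = AdaptiveKF1D(x0=0.0, p0=1.0, q=0.01, r0=1.0)
for z in [0.2, 0.1, 0.4, 0.3, 5.0, 0.35]:   # one outlier inflates the estimated R
    kf.step(z)
print(kf.x, kf.r)
```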

  19. New architecture for the sensor web: the SWAP framework.

    CSIR Research Space (South Africa)

    Moodley, D

    2006-11-01

    Full Text Available Sensor Web is a revolutionary concept towards achieving a collaborative, coherent, consistent, and consolidated sensor data collection, fusion and distribution system. Sensor Webs can perform as an extensive monitoring and sensing system...

  20. Self-adapted and tunable graphene strain sensors for detecting both subtle and large human motions.

    Science.gov (United States)

    Tao, Lu-Qi; Wang, Dan-Yang; Tian, He; Ju, Zhen-Yi; Liu, Ying; Pang, Yu; Chen, Yuan-Quan; Yang, Yi; Ren, Tian-Ling

    2017-06-22

    Conventional strain sensors rarely have both a high gauge factor and a large strain range simultaneously, so they can only be used in specific situations where only a high sensitivity or a large strain range is required. However, for detecting human motions that include both subtle and large motions, these strain sensors cannot meet the diverse demands simultaneously. Here, we present laser-patterned graphene strain sensors with self-adapted and tunable performance for the first time. A series of strain sensors with either an ultrahigh gauge factor or a preferable strain range can be fabricated simultaneously via one-step laser patterning, and are suitable for detecting all human motions. The strain sensors have a GF of up to 457 with a strain range of 35%, or a strain range of up to 100% with a GF of 268. Most importantly, the performance of the strain sensors can be easily tuned by adjusting the patterns of the graphene, so that the sensors can meet diverse demands in both subtle and large motion situations. The graphene strain sensors show significant potential in applications such as wearable electronics, health monitoring and intelligent robots. Furthermore, the facile, fast and low-cost fabrication method will make it possible and practical to use them in commercial applications in the future.

  1. Bi-channel Sensor Fusion for Automatic Sign Language Recognition

    DEFF Research Database (Denmark)

    Kim, Jonghwa; Wagner, Johannes; Rehm, Matthias

    2008-01-01

    In this paper, we investigate the mutual-complementary functionality of accelerometer (ACC) and electromyogram (EMG) for recognizing seven word-level sign vocabularies in German sign language (GSL). Results are discussed for the single channels and for feature-level fusion for the bichannel senso......-independent condition, where subjective differences do not allow for high recognition rates. Finally we discuss a problem of feature-level fusion caused by high disparity between accuracies of each single channel classification....

  2. Adaptive Security in ODMAC for Multihop Energy Harvesting Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Di Mauro, Alessio; Fafoutis, Xenofon; Dragoni, Nicola

    2015-01-01

    Energy Harvesting Wireless Sensor Networks (EH-WSNs) represent an interesting new paradigm where individual nodes forming a network are powered by energy sources scavenged from the surrounding environment. This technique provides numerous advantages, but also new design challenges. Securing...... the communications under energy constraints represents one of these key challenges. The amount of energy available is theoretically infinite in the long run but highly variable over short periods of time, and managing it is a crucial aspect. In this paper we present an adaptive approach for security in multihop EH...

  3. Illumination adaptation with rapid-response color sensors

    Science.gov (United States)

    Zhang, Xinchi; Wang, Quan; Boyer, Kim L.

    2014-09-01

    Smart lighting solutions based on imaging sensors such as webcams or time-of-flight sensors suffer from rising privacy concerns. In this work, we use low-cost non-imaging color sensors to measure local luminous flux of different colors in an indoor space. These sensors have much higher data acquisition rate and are much cheaper than many off-the-shelf commercial products. We have developed several applications with these sensors, including illumination feedback control and occupancy-driven lighting.

  4. The fusion of satellite and UAV data: simulation of high spatial resolution band

    Science.gov (United States)

    Jenerowicz, Agnieszka; Siok, Katarzyna; Woroszkiewicz, Malgorzata; Orych, Agata

    2017-10-01

    Remote sensing techniques for precision agriculture and farming that use imagery data obtained with sensors mounted on UAV platforms have become more popular in the last few years due to the availability of low-cost UAV platforms and low-cost sensors. Data obtained from low altitudes with low-cost sensors can be characterised by high spatial and radiometric resolution but quite low spectral resolution; therefore the application of such imagery data is quite limited and can be used only for basic land cover classification. To enrich the spectral resolution of imagery data acquired with low-cost sensors from low altitudes, the authors propose the fusion of RGB data obtained with a UAV platform with multispectral satellite imagery. The fusion is based on the pansharpening process, which aims to integrate the spatial details of the high-resolution panchromatic image with the spectral information of lower-resolution multispectral or hyperspectral imagery to obtain multispectral or hyperspectral images with high spatial resolution. The key to pansharpening is to properly estimate the missing spatial details of the multispectral images while preserving their spectral properties. In the research, the authors present the fusion of RGB images (with high spatial resolution) obtained with sensors mounted on low-cost UAV platforms and multispectral satellite imagery from satellite sensors, i.e. Landsat 8 OLI. To perform the fusion of UAV data with satellite imagery, panchromatic bands were simulated from the RGB data as linear combinations of the spectral channels. Next, for the simulated bands and multispectral satellite images, the Gram-Schmidt pansharpening method was applied. As a result of the fusion, the authors obtained several multispectral images with very high spatial resolution and then analysed the spatial and spectral accuracies of the processed images.
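
    As a rough illustration of the band-simulation step described above, the sketch below forms a synthetic panchromatic band as a weighted linear combination of the RGB channels and then applies a simple additive detail injection. The weights and the injection rule are placeholder assumptions; the paper itself uses Gram-Schmidt pansharpening.

```python
# Sketch of simulating a pan band from RGB plus a simple additive detail
# injection (not Gram-Schmidt); weights and data are invented for illustration.
import numpy as np

def simulate_pan(rgb, weights=(0.3, 0.4, 0.3)):
    """Simulate a high-resolution panchromatic band as a linear combination of RGB."""
    w = np.asarray(weights, dtype=float)
    return np.tensordot(rgb.astype(float), w / w.sum(), axes=([-1], [0]))

def inject_details(ms_upsampled, pan):
    """Add the pan detail (pan minus its low-pass proxy) to each multispectral band."""
    intensity = ms_upsampled.mean(axis=-1)
    detail = pan - intensity
    return ms_upsampled + detail[..., None]

rgb = np.random.rand(8, 8, 3)                 # high-resolution UAV RGB tile
ms = np.random.rand(8, 8, 4)                  # multispectral bands, already upsampled
sharpened = inject_details(ms, simulate_pan(rgb))
print(sharpened.shape)
```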

  5. Sensor fusion and computer vision for context-aware control of a multi degree-of-freedom prosthesis.

    Science.gov (United States)

    Markovic, Marko; Dosen, Strahinja; Popovic, Dejan; Graimann, Bernhard; Farina, Dario

    2015-12-01

    Myoelectric activity volitionally generated by the user is often used for controlling hand prostheses in order to replicate the synergistic actions of muscles in healthy humans during grasping. Muscle synergies in healthy humans are based on the integration of visual perception, heuristics and proprioception. Here, we demonstrate how sensor fusion that combines artificial vision and proprioceptive information with the high-level processing characteristics of biological systems can be effectively used in transradial prosthesis control. We developed a novel context- and user-aware prosthesis (CASP) controller integrating computer vision and inertial sensing with myoelectric activity in order to achieve semi-autonomous and reactive control of a prosthetic hand. The presented method semi-automatically provides simultaneous and proportional control of multiple degrees-of-freedom (DOFs), thus decreasing overall physical effort while retaining full user control. The system was compared against the major commercial state-of-the-art myoelectric control system in ten able-bodied subjects and one amputee. All subjects used a transradial prosthesis with an active wrist to grasp objects typically associated with activities of daily living. The CASP significantly outperformed the myoelectric interface when controlling all of the prosthesis DOFs. However, when tested with a less complex prosthetic system (smaller number of DOFs), the CASP was slower but resulted in reaching motions that contained fewer compensatory movements. Another important finding is that the CASP system required minimal user adaptation and training. The CASP constitutes a substantial improvement for the control of multi-DOF prostheses. The application of the CASP will have a significant impact when translated to real-life scenarios, particularly with respect to improving the usability and acceptance of highly complex systems (e.g., full prosthetic arms) by amputees.

  6. Multiple Ca2+ sensors in secretion

    DEFF Research Database (Denmark)

    Walter, Alexander M; Groffen, Alexander J; Sørensen, Jakob Balslev

    2011-01-01

    Regulated neurotransmitter secretion depends on Ca(2+) sensors, C2 domain proteins that associate with phospholipids and soluble N-ethylmaleimide-sensitive factor attachment protein receptor (SNARE) complexes to trigger release upon Ca(2+) binding. Ca(2+) sensors are thought to prevent spontaneous

  7. C2-domain containing calcium sensors in neuroendocrine secretion

    DEFF Research Database (Denmark)

    Pinheiro, Paulo S; Houy, Sébastien; Sørensen, Jakob B

    2016-01-01

    The molecular mechanisms for calcium-triggered membrane fusion have long been sought for, and detailed models now exist that account for at least some of the functions of the many proteins involved in the process. Key players in the fusion reaction are a group of proteins that, upon binding...... to calcium, trigger the merger of cargo-filled vesicles with the plasma membrane. Low-affinity, fast-kinetics calcium sensors of the synaptotagmin family - especially synaptotagmin-1 and synaptotagmin-2 - are the main calcium sensors for fast exocytosis triggering in many cell types. Their functions extend...... beyond fusion triggering itself, having been implicated in the calcium-dependent vesicle recruitment during activity, docking of vesicles to the plasma membrane and priming, and even in post-fusion steps, such as fusion pore expansion and endocytosis. Furthermore, synaptotagmin diversity imparts distinct...

  8. A two-hop based adaptive routing protocol for real-time wireless sensor networks.

    Science.gov (United States)

    Rachamalla, Sandhya; Kancherla, Anitha Sheela

    2016-01-01

    One of the most important and challenging issues in wireless sensor networks (WSNs) is to optimally manage the limited energy of nodes without degrading the routing efficiency. In this paper, we propose an energy-efficient adaptive routing mechanism for WSNs, which saves energy of nodes by removing the much-delayed packets without degrading the real-time performance of the used routing protocol. It uses an adaptive transmission power algorithm based on the attenuation of the wireless link to improve energy efficiency. The proposed routing mechanism can be associated with any geographic routing protocol, and its performance is evaluated by integrating it with the well-known two-hop-based real-time routing protocol PATH; the resulting protocol is the energy-efficient adaptive routing protocol (EE-ARP). The EE-ARP performs well in terms of energy consumption, deadline miss ratio, packet drop and end-to-end delay.

  9. Distributed Fusion Estimation for Multisensor Multirate Systems with Stochastic Observation Multiplicative Noises

    Directory of Open Access Journals (Sweden)

    Peng Fangfang

    2014-01-01

    Full Text Available This paper studies the fusion estimation problem of a class of multisensor multirate systems with observation multiplicative noises. The dynamic system is sampled uniformly. The sampling period of each sensor is uniform and an integer multiple of the state update period. Moreover, different sensors have different sampling rates, and the observations of the sensors are subject to the stochastic uncertainties of multiplicative noises. At first, local filters at the observation sampling points are obtained based on the observations of each sensor. Further, local estimators at the state update points are obtained by predictions of local filters at the observation sampling points. They have reduced computational cost and good real-time performance. Then, the cross-covariance matrices between any two local estimators are derived at the state update points. At last, using the matrix-weighted optimal fusion estimation algorithm in the linear minimum variance sense, the distributed optimal fusion estimator is obtained based on the local estimators and the cross-covariance matrices. An example shows the effectiveness of the proposed algorithms.
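
    For reference, the matrix-weighted fusion step in the linear minimum variance sense can be sketched as follows: stacking N unbiased local estimates with their joint error covariance (including the cross-covariance blocks), the optimal weights are A = (e^T Sigma^{-1} e)^{-1} e^T Sigma^{-1} with e = [I, ..., I]^T. The numbers in the example are invented, and the sketch is generic rather than the paper's multirate derivation.

```python
# Hedged sketch of matrix-weighted fusion in the linear-minimum-variance sense.
# All matrices below are placeholder values for illustration.
import numpy as np

def fuse(estimates, sigma):
    """estimates: list of N state vectors (dim n); sigma: (N*n, N*n) joint error covariance."""
    n = estimates[0].shape[0]
    N = len(estimates)
    e = np.vstack([np.eye(n)] * N)                        # (N*n, n) stacked identities
    sigma_inv = np.linalg.inv(sigma)
    p_fused = np.linalg.inv(e.T @ sigma_inv @ e)          # fused covariance
    weights = p_fused @ e.T @ sigma_inv                   # (n, N*n) block weights
    x_stack = np.concatenate(estimates)                   # (N*n,)
    return weights @ x_stack, p_fused

x1, x2 = np.array([1.0, 0.5]), np.array([1.2, 0.4])
P1, P2 = np.diag([0.2, 0.3]), np.diag([0.3, 0.2])
P12 = 0.05 * np.eye(2)                                    # cross-covariance between local errors
sigma = np.block([[P1, P12], [P12.T, P2]])
x_f, P_f = fuse([x1, x2], sigma)
print(x_f, np.diag(P_f))
```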

  10. A Novel Low-Cost Adaptive Scanner Concept for Mobile Robots

    Directory of Open Access Journals (Sweden)

    Ivo Stančić

    2014-09-01

    Full Text Available A fundamental problem in mobile robot applications is the need for accurate knowledge of the position of a vehicle for localizing itself and for avoiding obstacles in its path. In the search for a solution to this problem, researchers and engineers have developed different sensors, systems and techniques. Modern mobile robots rely on information obtained from a variety of sensors and on sophisticated data fusion algorithms. In this paper, a novel concept for a low-cost adaptive scanner based on a projected light pattern is proposed. The main advantage of the proposed system is its adaptivity, which enables the rapid scanning of the robot’s surroundings in search of obstacles and a more detailed scan of a single object to retrieve its surface configuration and perform some limited analyses. This paper addresses the concept behind such a scanner, where a proof-of-concept is achieved using an office DLP projector. During the measurements, the accuracy of the proposed system was tested on obstacles and objects with known configurations. The obtained results are presented and analyzed, and conclusions about the system’s performance and possible improvements are discussed.

  11. From Data Acquisition to Data Fusion: A Comprehensive Review and a Roadmap for the Identification of Activities of Daily Living Using Mobile Devices

    Directory of Open Access Journals (Sweden)

    Ivan Miguel Pires

    2016-02-01

    Full Text Available This paper focuses on the research on the state of the art for sensor fusion techniques, applied to the sensors embedded in mobile devices, as a means to help identify the mobile device user’s daily activities. Sensor data fusion techniques are used to consolidate the data collected from several sensors, increasing the reliability of the algorithms for the identification of the different activities. However, mobile devices have several constraints, e.g., low memory, low battery life and low processing power, and some data fusion techniques are not suited to this scenario. The main purpose of this paper is to present an overview of the state of the art to identify examples of sensor data fusion techniques that can be applied to the sensors available in mobile devices aiming to identify activities of daily living (ADLs).

  12. From Data Acquisition to Data Fusion: A Comprehensive Review and a Roadmap for the Identification of Activities of Daily Living Using Mobile Devices

    Science.gov (United States)

    Pires, Ivan Miguel; Garcia, Nuno M.; Pombo, Nuno; Flórez-Revuelta, Francisco

    2016-01-01

    This paper focuses on the research on the state of the art for sensor fusion techniques, applied to the sensors embedded in mobile devices, as a means to help identify the mobile device user’s daily activities. Sensor data fusion techniques are used to consolidate the data collected from several sensors, increasing the reliability of the algorithms for the identification of the different activities. However, mobile devices have several constraints, e.g., low memory, low battery life and low processing power, and some data fusion techniques are not suited to this scenario. The main purpose of this paper is to present an overview of the state of the art to identify examples of sensor data fusion techniques that can be applied to the sensors available in mobile devices aiming to identify activities of daily living (ADLs). PMID:26848664

  13. A Novel Multi-Sensor Environmental Perception Method Using Low-Rank Representation and a Particle Filter for Vehicle Reversing Safety

    Directory of Open Access Journals (Sweden)

    Zutao Zhang

    2016-06-01

    Full Text Available Environmental perception and information processing are two key steps of active safety for vehicle reversing. Single-sensor environmental perception cannot meet the need for vehicle reversing safety due to its low reliability. In this paper, we present a novel multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. The proposed system consists of four main steps, namely multi-sensor environmental perception, information fusion, target recognition and tracking using low-rank representation and a particle filter, and vehicle reversing speed control modules. First of all, the multi-sensor environmental perception module, based on a binocular-camera system and ultrasonic range finders, obtains the distance data for obstacles behind the vehicle when the vehicle is reversing. Secondly, an information fusion algorithm using an adaptive Kalman filter is used to process the data obtained with the multi-sensor environmental perception module, which greatly improves the robustness of the sensors. Then the framework of a particle filter and low-rank representation is used to track the main obstacles, where the low-rank representation is used to optimize an objective particle template that has the smallest L-1 norm. Finally, the electronic throttle opening and automatic braking are controlled by the proposed vehicle reversing control strategy prior to any potential collision, making reversing control safer and more reliable. The final system simulation and practical testing results demonstrate the validity of the proposed multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety.

  14. DYNAMIC AND INCREMENTAL EXPLORATION STRATEGY IN FUSION ADAPTIVE RESONANCE THEORY FOR ONLINE REINFORCEMENT LEARNING

    Directory of Open Access Journals (Sweden)

    Budhitama Subagdja

    2016-06-01

    Full Text Available One of the fundamental challenges in reinforcement learning is to set up a proper balance between exploration and exploitation to obtain the maximum cumulative reward in the long run. Most protocols for exploration bound the overall values to a convergent level of performance. If new knowledge is inserted or the environment is suddenly changed, the issue becomes more intricate as the exploration must compromise the pre-existing knowledge. This paper presents a type of multi-channel adaptive resonance theory (ART) neural network model called fusion ART, which serves as a fuzzy approximator for reinforcement learning with inherent features that can regulate the exploration strategy. This intrinsic regulation is driven by the condition of the knowledge learnt so far by the agent. The model offers stable but incremental reinforcement learning that can involve prior rules as bootstrap knowledge for guiding the agent to select the right action. Experiments in obstacle avoidance and navigation tasks demonstrate that, in the configuration where the agent learns from scratch, the inherent exploration model in fusion ART is comparable to the basic E-greedy policy. On the other hand, the model is demonstrated to deal with prior knowledge and strike a balance between exploration and exploitation.
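
    The fusion ART model itself is not reproduced here; the sketch below only illustrates, with invented names and thresholds, the general idea of letting the exploration rate be driven by how well the knowledge learnt so far matches the current state (a hypothetical "knowledge confidence" score), as opposed to a fixed E-greedy schedule.

```python
# Not the fusion ART model -- a minimal illustration of knowledge-driven
# exploration: explore more when the learned categories match the current
# state poorly. The confidence score and epsilon bounds are assumptions.
import random

def select_action(q_values, knowledge_confidence, eps_max=0.9, eps_min=0.05):
    """knowledge_confidence in [0, 1]: high confidence -> mostly exploit."""
    epsilon = eps_min + (eps_max - eps_min) * (1.0 - knowledge_confidence)
    if random.random() < epsilon:
        return random.randrange(len(q_values))                        # explore
    return max(range(len(q_values)), key=q_values.__getitem__)        # exploit

print(select_action([0.1, 0.7, 0.3], knowledge_confidence=0.2))
```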

  15. Intelligent Networks Data Fusion Web-based Services for Ad-hoc Integrated WSNs-RFID

    Directory of Open Access Journals (Sweden)

    Falah Alshahrany

    2016-01-01

    Full Text Available The use of a variety of data fusion tools and techniques for big data processing poses the problem of data and information integration, called data fusion, whose objectives can differ from one application to another. The design of network data fusion systems aimed at meeting these objectives needs to take into account the synergy that can result from distributed data processing within the data networks and data centres, involving increased computation and communication. This paper reports on how this processing distribution is functionally structured as configurable, integrated web-based support services, in the context of an ad-hoc wireless sensor network used for sensing and tracking, and of distributed detection based on complete observations to support real-time decision making. The interrelated functional and hardware RFID-WSN integration is an essential aspect of the data fusion framework, which focuses on multi-sensor collaboration as an innovative approach to extending the heterogeneity of the devices and sensor nodes of ad-hoc networks, which generate a huge amount of heterogeneous soft and hard raw data. The deployment and configuration of these networks require data fusion processing that includes network and service management and enhances the performance and reliability of network data fusion support systems providing intelligent capabilities for real-time access control and fire detection.

  16. Adaptive Sensor Optimization and Cognitive Image Processing Using Autonomous Optical Neuroprocessors; TOPICAL

    International Nuclear Information System (INIS)

    CAMERON, STEWART M.

    2001-01-01

    Measurement and signal intelligence demands have created new requirements for information management and interoperability as they affect surveillance and situational awareness. Integration of on-board autonomous learning and adaptive control structures within a remote sensing platform architecture would substantially improve the utility of intelligence collection by facilitating real-time optimization of measurement parameters for variable field conditions. A problem faced by conventional digital implementations of intelligent systems is the conflict between a distributed parallel structure and a sequential serial interface, which functionally degrades bandwidth and response time. In contrast, optically designed networks exhibit the massive parallelism and interconnect density needed to perform complex cognitive functions within a dynamic asynchronous environment. Recently, all-optical self-organizing neural networks exhibiting emergent collective behavior which mimics perception, recognition, association, and contemplative learning have been realized using photorefractive holography in combination with sensory systems for feature maps, threshold decomposition, image enhancement, and nonlinear matched filters. Such hybrid information processors depart from the classical computational paradigm based on analytic rules-based algorithms and instead utilize unsupervised generalization and perceptron-like exploratory or improvisational behaviors to evolve toward optimized solutions. These systems are robust to instrumental systematics or corrupting noise and can enrich knowledge structures by allowing competition between multiple hypotheses. This property enables them to rapidly adapt or self-compensate for dynamic or imprecise conditions which would be unstable using conventional linear control models. By incorporating an intelligent optical neuroprocessor in the back plane of an imaging sensor, a broad class of high-level cognitive image analysis problems including geometric

  17. Effect of cross-correlation on track-to-track fusion

    Science.gov (United States)

    Saha, Rajat K.

    1994-07-01

    Since the advent of target tracking systems employing a diverse mixture of sensors, there has been increasing recognition by air defense system planners and other military system analysts of the need to integrate these tracks so that a clear air picture can be obtained in a command center. A popular methodology to achieve this goal is to perform track-to-track fusion, which performs track-to-track association as well as kinematic state vector fusion. This paper seeks to answer analytically the extent of improvement achievable by means of kinematic state vector fusion when the tracks are obtained from dissimilar sensors (e.g., Radar/ESM/IRST/IFF). It is well known that evaluation of the performance of state vector fusion algorithms at steady state must take into account the effects of cross-correlation between eligible tracks introduced by the input noise which, unfortunately, is often neglected because of added computational complexity. In this paper, an expression for the steady-state cross-covariance matrix for a 2D state vector track-to-track fusion is obtained. This matrix is shown to be a function of the parameters of the Kalman filters associated with the candidate tracks being fused. Conditions for positive definiteness of the cross-covariance matrix have been derived, and the effect of positive definiteness on the performance of track-to-track fusion is also discussed.
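
    The paper's steady-state expressions are not reproduced here; as a generic reference point, a fused estimate that accounts for the cross-covariance P12 between two track errors takes the well-known Bar-Shalom–Campo form sketched below, with placeholder numbers.

```python
# Illustrative two-track fusion that accounts for the cross-covariance between
# the local track errors (Bar-Shalom–Campo form). Matrices are placeholders.
import numpy as np

def fuse_tracks(x1, P1, x2, P2, P12):
    P21 = P12.T
    S = P1 + P2 - P12 - P21
    K = (P1 - P12) @ np.linalg.inv(S)
    x_fused = x1 + K @ (x2 - x1)
    P_fused = P1 - K @ (P1 - P21)
    return x_fused, P_fused

x1 = np.array([10.0, 1.0])          # position, velocity from sensor 1 track
x2 = np.array([10.4, 0.8])          # same target from sensor 2 track
P1 = np.diag([1.0, 0.2])
P2 = np.diag([1.5, 0.3])
P12 = 0.1 * np.eye(2)               # cross-covariance induced by common process noise
print(fuse_tracks(x1, P1, x2, P2, P12))
```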

  18. Adaptive Multi-Sensor Perception for Driving Automation in Outdoor Contexts

    Directory of Open Access Journals (Sweden)

    Annalisa Milella

    2014-08-01

    Full Text Available In this research, adaptive perception for driving automation is discussed so as to enable a vehicle to automatically detect driveable areas and obstacles in the scene. It is especially designed for outdoor contexts where conventional perception systems that rely on a priori knowledge of the terrain's geometric properties, appearance properties, or both, are prone to fail, due to the variability in the terrain properties and environmental conditions. In contrast, the proposed framework uses a self-learning approach to build a model of the ground class that is continuously adjusted online to reflect the latest ground appearance. The system also features high flexibility, as it can work using a single sensor modality or a multi-sensor combination. In the context of this research, different embodiments have been demonstrated using range data coming from either a radar or a stereo camera, and adopting self-supervised strategies where monocular vision is automatically trained by radar or stereo vision. A comprehensive set of experimental results, obtained with different ground vehicles operating in the field, are presented to validate and assess the performance of the system.

  19. Robust Adaptive Beamforming with Sensor Position Errors Using Weighted Subspace Fitting-Based Covariance Matrix Reconstruction.

    Science.gov (United States)

    Chen, Peng; Yang, Yixin; Wang, Yong; Ma, Yuanliang

    2018-05-08

    When sensor position errors exist, the performance of recently proposed interference-plus-noise covariance matrix (INCM)-based adaptive beamformers may be severely degraded. In this paper, we propose a weighted subspace fitting-based INCM reconstruction algorithm to overcome sensor displacement for linear arrays. By estimating the rough signal directions, we construct a novel possible mismatched steering vector (SV) set. We analyze the proximity of the signal subspace from the sample covariance matrix (SCM) and the space spanned by the possible mismatched SV set. After solving an iterative optimization problem, we reconstruct the INCM using the estimated sensor position errors. Then we estimate the SV of the desired signal by solving an optimization problem with the reconstructed INCM. The main advantage of the proposed algorithm is its robustness against SV mismatches dominated by unknown sensor position errors. Numerical examples show that even if the position errors are up to half of the assumed sensor spacing, the output signal-to-interference-plus-noise ratio is only reduced by 4 dB. Beam patterns plotted using experiment data show that the interference suppression capability of the proposed beamformer outperforms other tested beamformers.
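
    The reconstruction procedure itself (weighted subspace fitting, position-error estimation) is not reproduced here; for orientation only, the final adaptive weight computation from a reconstructed interference-plus-noise covariance matrix and a corrected steering vector follows the standard MVDR form w = R_in^{-1} a / (a^H R_in^{-1} a), as in the hedged sketch below with a made-up array geometry.

```python
# Sketch of the final beamforming step only, once R_in and the steering vector
# have been reconstructed. Array geometry and interference power are invented.
import numpy as np

def steering_vector(n_sensors, theta_deg, spacing=0.5):
    """Uniform linear array steering vector; spacing in wavelengths."""
    k = np.arange(n_sensors)
    phase = 2j * np.pi * spacing * k * np.sin(np.deg2rad(theta_deg))
    return np.exp(phase)

def mvdr_weights(r_in, a):
    """Minimum-variance distortionless-response weights for steering vector a."""
    r_inv_a = np.linalg.solve(r_in, a)
    return r_inv_a / (a.conj() @ r_inv_a)

n = 8
a_desired = steering_vector(n, 0.0)
a_interf = steering_vector(n, 30.0)
r_in = 100.0 * np.outer(a_interf, a_interf.conj()) + np.eye(n)   # interference + noise
w = mvdr_weights(r_in, a_desired)
print(abs(w.conj() @ a_desired), abs(w.conj() @ a_interf))       # ~1 vs ~0
```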

  20. Passive in-vehicle driver breath alcohol detection using advanced sensor signal acquisition and fusion.

    Science.gov (United States)

    Ljungblad, Jonas; Hök, Bertil; Allalou, Amin; Pettersson, Håkan

    2017-05-29

    The research objective of the present investigation is to demonstrate the present status of passive in-vehicle driver breath alcohol detection and highlight the necessary conditions for large-scale implementation of such a system. Completely passive detection has remained a challenge mainly because of the requirements on signal resolution combined with the constraints of vehicle integration. The work is part of the Driver Alcohol Detection System for Safety (DADSS) program aiming at massive deployment of alcohol sensing systems that could potentially save thousands of American lives annually. The work reported here builds on earlier investigations, in which it has been shown that detection of alcohol vapor in the proximity of a human subject may be traced to that subject by means of simultaneous recording of carbon dioxide (CO 2 ) at the same location. Sensors based on infrared spectroscopy were developed to detect and quantify low concentrations of alcohol and CO 2 . In the present investigation, alcohol and CO 2 were recorded at various locations in a vehicle cabin while human subjects were performing normal in-step procedures and driving preparations. A video camera directed to the driver position was recording images of the driver's upper body parts, including the face, and the images were analyzed with respect to features of significance to the breathing behavior and breath detection, such as mouth opening and head direction. Improvement of the sensor system with respect to signal resolution including algorithm and software development, and fusion of the sensor and camera signals was successfully implemented and tested before starting the human study. In addition, experimental tests and simulations were performed with the purpose of connecting human subject data with repeatable experimental conditions. The results include occurrence statistics of detected breaths by signal peaks of CO 2 and alcohol. From the statistical data, the accuracy of breath alcohol

  1. Trust metrics in information fusion

    Science.gov (United States)

    Blasch, Erik

    2014-05-01

    Trust is an important concept for machine intelligence and is not consistent across many applications. In this paper, we seek to understand trust from a variety of factors: humans, sensors, communications, intelligence processing algorithms and human-machine displays of information. In modeling the various aspects of trust, we provide an example from machine intelligence that supports the various attributes of measuring trust such as sensor accuracy, communication timeliness, machine processing confidence, and display throughput to convey the various attributes that support user acceptance of machine intelligence results. The example used is fusing video and text whereby an analyst needs trust information in the identified imagery track. We use the proportional conflict redistribution rule as an information fusion technique that handles conflicting data from trusted and mistrusted sources. The discussion of the many forms of trust explored in the paper seeks to provide a systems-level design perspective for information fusion trust quantification.

  2. Rapid and highly integrated FPGA-based Shack-Hartmann wavefront sensor for adaptive optics system

    Science.gov (United States)

    Chen, Yi-Pin; Chang, Chia-Yuan; Chen, Shean-Jen

    2018-02-01

    In this study, a field programmable gate array (FPGA)-based Shack-Hartmann wavefront sensor (SHWS) programmed in LabVIEW is highly integrable into customized applications such as an adaptive optics system (AOS) for performing real-time wavefront measurement. Further, a Camera Link frame grabber with an embedded FPGA is adopted to enhance the speed with which the sensor reacts to variation, given its advantage of the highest data transmission bandwidth. Instead of waiting for a full frame image to be captured by the FPGA, the Shack-Hartmann algorithm is implemented as parallel processing blocks so that image data transmission is synchronized with wavefront reconstruction. On the other hand, we design a mechanism to control the deformable mirror in the same FPGA and verify the Shack-Hartmann sensor speed by controlling the frequency of the deformable mirror dynamic surface deformation. Currently, this FPGA-based SHWS design achieves a 266 Hz cyclic speed, limited by the camera frame rate, while leaving 40% of the logic slices for additional flexible designs.
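
    The FPGA implementation itself is not shown in the record; purely as an illustration of the per-subaperture processing that such a sensor parallelizes, the sketch below computes spot centroids for a grid of lenslet windows and converts centroid shifts into local wavefront slopes. Window size, pixel pitch and focal length are placeholder values.

```python
# Sketch of Shack-Hartmann spot-centroid and slope computation per subaperture.
# Parameters below are placeholders, not the values used in the cited system.
import numpy as np

def centroids(image, win=16):
    """Centre-of-mass spot positions for a grid of win x win subapertures."""
    rows, cols = image.shape[0] // win, image.shape[1] // win
    y_idx, x_idx = np.mgrid[0:win, 0:win]
    out = np.zeros((rows, cols, 2))
    for r in range(rows):
        for c in range(cols):
            sub = image[r*win:(r+1)*win, c*win:(c+1)*win].astype(float)
            total = sub.sum() + 1e-12
            out[r, c] = (sub * y_idx).sum() / total, (sub * x_idx).sum() / total
    return out

def slopes(spot, reference, pixel_pitch=5e-6, focal_length=5e-3):
    """Convert centroid displacement (pixels) into local wavefront slopes (radians)."""
    return (spot - reference) * pixel_pitch / focal_length

frame = np.random.poisson(5.0, size=(128, 128))
ref = centroids(np.ones((128, 128)))          # flat reference: window centres
print(slopes(centroids(frame), ref).shape)    # (8, 8, 2) slope map
```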

  3. An Adaptive Time-Spread Multiple-Access Policy for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Konstantinos Oikonomou

    2007-05-01

    Full Text Available Sensor networks require a simple and efficient medium access control policy achieving high system throughput with no or limited control overhead in order to increase the network lifetime by minimizing the energy consumed during transmission attempts. Time-spread multiple-access (TSMA) policies that have been proposed for ad hoc network environments can also be employed in sensor networks, since no control overhead is introduced. However, they do not take advantage of any cross-layer information in order to exploit the idiosyncrasies of the particular sensor network environment, such as the presence of typically static nodes and a common destination for the forwarded data. An adaptive probabilistic TSMA-based policy, which is proposed and analyzed in this paper, exploits these idiosyncrasies and achieves higher system throughput than the existing TSMA-based policies without any need for extra control overhead. As it is analytically shown in this paper, the proposed policy always outperforms the existing TSMA-based policies, provided that certain parameter values are properly set; the analysis also provides these proper values. It is also shown that the proposed policy is characterized by a certain convergence period and that high system throughput is achieved for long convergence periods. The claims and expectations of the provided analysis are supported by simulation results presented in this paper.

  4. A distance-aware replica adaptive data gathering protocol for Delay Tolerant Mobile Sensor Networks.

    Science.gov (United States)

    Feng, Yong; Gong, Haigang; Fan, Mingyu; Liu, Ming; Wang, Xiaomin

    2011-01-01

    In Delay Tolerant Mobile Sensor Networks (DTMSNs), which have the inherent features of intermittent connectivity and frequently changing network topology, it is reasonable to utilize multi-replica schemes to improve the data gathering performance. However, most existing multi-replica approaches inject a large amount of message copies into the network to increase the probability of message delivery, which may drain each mobile node's limited battery supply faster and result in too much contention for the restricted resources of the DTMSN, so a proper data gathering scheme needs a trade-off between the number of replica messages and network performance. In this paper, we propose a new data gathering protocol called DRADG (for Distance-aware Replica Adaptive Data Gathering protocol), which economizes network resource consumption through making use of a self-adapting algorithm to cut down the number of redundant replicas of messages, and achieves a good network performance by leveraging the delivery probabilities of the mobile sensors as the main routing metrics. Simulation results have shown that the proposed DRADG protocol achieves comparable or higher message delivery ratios at the cost of much lower transmission overhead than several current DTMSN data gathering schemes.

  5. Distributed detection in UWB sensor networks under non-orthogonal Nakagami-m fading

    KAUST Repository

    Mehbodniya, Abolfazl

    2011-09-01

    Several attractive features of ultra wideband (UWB) communications make it a good candidate for the physical layer of wireless sensor networks (WSNs). These features include low power consumption, low complexity and low cost of implementation. In this paper, we present an opportunistic power assignment strategy for distributed detection in parallel fusion WSNs, considering a Nakagami-m fading model for the communication channel and time-hopping (TH) UWB for the transmitter circuit of the sensor nodes. In a parallel fusion WSN, local decisions are made by local sensors and transmitted through wireless channels to a fusion center. The fusion center processes the information and makes the final decision. Simulation results are provided for the global probability of detection error and the relative performance gain to evaluate the efficiency of the proposed power assignment strategy in different fading environments. © 2011 IEEE.

  6. Semantic Edge Based Disparity Estimation Using Adaptive Dynamic Programming for Binocular Sensors.

    Science.gov (United States)

    Zhu, Dongchen; Li, Jiamao; Wang, Xianshun; Peng, Jingquan; Shi, Wenjun; Zhang, Xiaolin

    2018-04-03

    Disparity calculation is crucial for binocular sensor ranging. The disparity estimation based on edges is an important branch in the research of sparse stereo matching and plays an important role in visual navigation. In this paper, we propose a robust sparse stereo matching method based on semantic edges. Some simple matching costs are used first, and then a novel adaptive dynamic programming algorithm is proposed to obtain optimal solutions. This algorithm makes use of the disparity or semantic consistency constraint between the stereo images to adaptively search parameters, which can improve the robustness of our method. The proposed method is compared quantitatively and qualitatively with the traditional dynamic programming method, some dense stereo matching methods, and the advanced edge-based method. Experiments show that our method can provide superior performance in the above comparisons.

  7. Biometric image enhancement using decision rule based image fusion techniques

    Science.gov (United States)

    Sagayee, G. Mary Amirtha; Arumugam, S.

    2010-02-01

    Introducing biometrics into information systems may result in considerable benefits. Most researchers confirm that the fingerprint is more widely used than the iris or face and, moreover, is the primary choice for most privacy-concerned applications. For fingerprint applications, choosing a proper sensor is a critical issue. The proposed work deals with how image quality can be improved by introducing an image fusion technique at the sensor level. The resulting images, after applying the decision rule based image fusion technique, are evaluated and analyzed in terms of their entropy levels and root mean square error.

  8. Scene Recognition for Indoor Localization Using a Multi-Sensor Fusion Approach

    Directory of Open Access Journals (Sweden)

    Mengyun Liu

    2017-12-01

    Full Text Available After decades of research, there is still no solution for indoor localization like the GNSS (Global Navigation Satellite System) solution for outdoor environments. The major reasons for this phenomenon are the complex spatial topology and the RF transmission environment. To deal with these problems, an indoor scene constrained method for localization is proposed in this paper, which is inspired by the visual cognition ability of the human brain and the progress in the computer vision field regarding high-level image understanding. Furthermore, a multi-sensor fusion method is implemented on a commercial smartphone including cameras, WiFi and inertial sensors. Compared to former research, the camera on a smartphone is used to “see” which scene the user is in. With this information, a particle filter algorithm constrained by scene information is adopted to determine the final location. For indoor scene recognition, we take advantage of deep learning, which has been proven to be highly effective in the computer vision community. For the particle filter, both WiFi and magnetic field signals are used to update the weights of particles. Similar to other fingerprinting localization methods, there are two stages in the proposed system: offline training and online localization. In the offline stage, an indoor scene model is trained by Caffe (one of the most popular open source frameworks for deep learning) and a fingerprint database is constructed from user trajectories in different scenes. To reduce the volume of training data required for deep learning, a fine-tuning method is adopted for model training. In the online stage, a camera in a smartphone is used to recognize the initial scene. Then a particle filter algorithm is used to fuse the sensor data and determine the final location. To prove the effectiveness of the proposed method, an Android client and a web server are implemented. The Android client is used to collect data and locate a user. The web
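
    As a generic illustration of the fusion step described above (not the paper's exact formulation), the sketch below updates particle weights with the product of a WiFi-fingerprint likelihood and a magnetic-field likelihood; the map lookups, noise sigmas and toy signal models are assumptions.

```python
# Minimal particle-filter weight update fusing WiFi and magnetic observations.
# Fingerprint maps, sigmas and signal models are invented for illustration.
import numpy as np

def gaussian_likelihood(observed, predicted, sigma):
    return np.exp(-0.5 * ((observed - predicted) / sigma) ** 2)

def update_weights(particles, weights, wifi_obs, mag_obs, wifi_map, mag_map,
                   sigma_wifi=6.0, sigma_mag=3.0):
    """particles: (N, 2) positions; *_map: callables mapping a position to a predicted value."""
    for i, p in enumerate(particles):
        lw = gaussian_likelihood(wifi_obs, wifi_map(p), sigma_wifi)
        lm = gaussian_likelihood(mag_obs, mag_map(p), sigma_mag)
        weights[i] *= lw * lm
    weights /= weights.sum()
    return weights

particles = np.random.uniform(0, 10, size=(500, 2))
weights = np.full(500, 1 / 500)
wifi_map = lambda p: -40.0 - 2.0 * np.linalg.norm(p)       # toy RSSI model (dBm)
mag_map = lambda p: 45.0 + 0.5 * p[0]                      # toy field magnitude (uT)
weights = update_weights(particles, weights, wifi_obs=-55.0, mag_obs=47.0,
                         wifi_map=wifi_map, mag_map=mag_map)
print(particles[np.argmax(weights)])                       # most plausible position
```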

  9. Multimode delta-E effect magnetic field sensors with adapted electrodes

    Energy Technology Data Exchange (ETDEWEB)

    Zabel, Sebastian; Fichtner, Simon; Kirchhof, Christine; Quandt, Eckhard; Faupel, Franz, E-mail: ff@tf.uni-kiel.de [Faculty of Engineering, Institute for Materials Science, Kiel University, Kaiserstraße 2, 24143 Kiel (Germany); Reermann, Jens; Schmidt, Gerhard [Faculty of Engineering, Institute for Electrical Engineering, Kiel University, Kaiserstraße 2, 24143 Kiel (Germany); Wagner, Bernhard [Fraunhofer Institute for Silicon Technology ISIT, Fraunhoferstraße 1, 25524 Itzehoe (Germany)

    2016-05-30

    We present an analytical and experimental study on low-noise piezoelectric thin film resonators that utilize the delta-E effect of a magnetostrictive layer to measure magnetic fields at low frequencies. Calculations from a physical model of the electromechanical resonator enable electrode designs to efficiently operate in the first and second transversal bending modes. As predicted by our calculations, the adapted electrode design improves the sensitivity by a factor of 6 and reduces the dynamic range of the sensor output by 16 dB, which significantly eases the requirements on readout electronics. Magnetic measurements show a bandwidth of 100 Hz at a noise level of about 100 pT/√Hz.

  10. A Long-Distance RF-Powered Sensor Node with Adaptive Power Management for IoT Applications.

    Science.gov (United States)

    Pizzotti, Matteo; Perilli, Luca; Del Prete, Massimo; Fabbri, Davide; Canegallo, Roberto; Dini, Michele; Masotti, Diego; Costanzo, Alessandra; Franchi Scarselli, Eleonora; Romani, Aldo

    2017-07-28

    We present a self-sustained battery-less multi-sensor platform with RF harvesting capability down to -17 dBm and implementing a standard DASH7 wireless communication interface. The node operates at distances up to 17 m from a 2 W UHF carrier. RF power transfer allows operation when common energy scavenging sources (e.g., sun, heat, etc.) are not available, while the DASH7 communication protocol makes it fully compatible with a standard IoT infrastructure. An optimized energy-harvesting module has been designed, including a rectifying antenna (rectenna) and an integrated nano-power DC/DC converter performing maximum-power-point-tracking (MPPT). A nonlinear/electromagnetic co-design procedure is adopted to design the rectenna, which is optimized to operate at ultra-low power levels. An ultra-low power microcontroller controls on-board sensors and wireless protocol, to adapt the power consumption to the available detected power by changing wake-up policies. As a result, adaptive behavior can be observed in the designed platform, to the extent that the transmission data rate is dynamically determined by RF power. Among the novel features of the system, we highlight the use of nano-power energy harvesting, the implementation of specific hardware/software wake-up policies, optimized algorithms for best sampling rate implementation, and adaptive behavior by the node based on the power received.

  11. A Long-Distance RF-Powered Sensor Node with Adaptive Power Management for IoT Applications

    Directory of Open Access Journals (Sweden)

    Matteo Pizzotti

    2017-07-01

    Full Text Available We present a self-sustained battery-less multi-sensor platform with RF harvesting capability down to −17 dBm and implementing a standard DASH7 wireless communication interface. The node operates at distances up to 17 m from a 2 W UHF carrier. RF power transfer allows operation when common energy scavenging sources (e.g., sun, heat, etc.) are not available, while the DASH7 communication protocol makes it fully compatible with a standard IoT infrastructure. An optimized energy-harvesting module has been designed, including a rectifying antenna (rectenna) and an integrated nano-power DC/DC converter performing maximum-power-point-tracking (MPPT). A nonlinear/electromagnetic co-design procedure is adopted to design the rectenna, which is optimized to operate at ultra-low power levels. An ultra-low power microcontroller controls on-board sensors and wireless protocol, to adapt the power consumption to the available detected power by changing wake-up policies. As a result, adaptive behavior can be observed in the designed platform, to the extent that the transmission data rate is dynamically determined by RF power. Among the novel features of the system, we highlight the use of nano-power energy harvesting, the implementation of specific hardware/software wake-up policies, optimized algorithms for best sampling rate implementation, and adaptive behavior by the node based on the power received.
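
    The platform's actual wake-up policies are not detailed in the record; the toy sketch below only illustrates the adaptive idea of mapping the detected RF input power to a wake-up interval, with thresholds and intervals invented for the example.

```python
# Toy illustration: pick a sampling / transmission interval from the detected
# RF input power so consumption follows the harvested budget. Thresholds and
# intervals are assumptions, not taken from the paper.
def wakeup_interval_s(rf_power_dbm):
    """Map detected RF power to a wake-up interval (longer when power is scarce)."""
    if rf_power_dbm >= -5:
        return 1            # plenty of power: sample every second
    if rf_power_dbm >= -10:
        return 10
    if rf_power_dbm >= -17:
        return 60           # near the -17 dBm harvesting limit: once a minute
    return None             # below the harvesting threshold: stay asleep

for p in (-3, -8, -15, -20):
    print(p, "dBm ->", wakeup_interval_s(p))
```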

  12. Wireless sensor network adaptive cooling

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, T. [SynapSense Corp., Folsom, CA (United States)

    2009-07-01

    Options for reducing data centre cooling energy requirements and their cost savings were discussed with particular reference to a wireless control solution developed by SynapSense Corporation. The wireless sensor network reduces cooling energy use at data centres by providing improved air flow management through the installation of cold aisle containment. The use of this low cost, non-invasive wireless sensor network has reduced the cooling energy use in a data center at BC Hydro by 30 per cent. The system also reduced the server and storage fan energy by 3 per cent by maintaining inlet air temperature below ASHRAE recommended operating range. The distribution of low power, low cost wireless sensors has enabled visualization tools that are changing the way that data centres are managed. The annual savings have been estimated at 4,560,000 kWh and the annual carbon dioxide abatement is approximately 1,400 metric tons. tabs., figs.

  13. Sensoring fusion data from the optic and acoustic emissions of electric arcs in the GMAW-S process for welding quality assessment.

    Science.gov (United States)

    Alfaro, Sadek Crisóstomo Absi; Cayo, Eber Huanca

    2012-01-01

    The present study shows the relationship between welding quality and optical-acoustic emissions from electric arcs, during welding runs, in the GMAW-S process. Bead-on-plate welding tests were carried out with pre-set parameters chosen from manufacturing standards. During the welding runs, interferences were induced on the welding path using paint, grease or gas faults. In each welding run, arc voltage, welding current, infrared and acoustic emission values were acquired, and parameters such as arc power, acoustic peak rate and infrared radiation rate were computed. Data fusion algorithms were developed by assessing known welding quality parameters from arc emissions. These algorithms showed better responses when based on more than just one sensor. Finally, it was concluded that there is a close relation between arc emissions and quality in welding, and that it can be measured from arc emission sensing and data fusion algorithms.

  14. Sensoring Fusion Data from the Optic and Acoustic Emissions of Electric Arcs in the GMAW-S Process for Welding Quality Assessment

    Directory of Open Access Journals (Sweden)

    Eber Huanca Cayo

    2012-05-01

    Full Text Available The present study shows the relationship between welding quality and optical-acoustic emissions from electric arcs, during welding runs, in the GMAW-S process. Bead-on-plate welding tests were carried out with pre-set parameters chosen from manufacturing standards. During the welding runs, interferences were induced on the welding path using paint, grease or gas faults. In each welding run, arc voltage, welding current, infrared and acoustic emission values were acquired, and parameters such as arc power, acoustic peak rate and infrared radiation rate were computed. Data fusion algorithms were developed by assessing known welding quality parameters from arc emissions. These algorithms showed better responses when based on more than just one sensor. Finally, it was concluded that there is a close relation between arc emissions and quality in welding, and that it can be measured from arc emission sensing and data fusion algorithms.

  15. Scalable sensor management for automated fusion and tactical reconnaissance

    Science.gov (United States)

    Walls, Thomas J.; Wilson, Michael L.; Partridge, Darin C.; Haws, Jonathan R.; Jensen, Mark D.; Johnson, Troy R.; Petersen, Brad D.; Sullivan, Stephanie W.

    2013-05-01

    The capabilities of tactical intelligence, surveillance, and reconnaissance (ISR) payloads are expanding from single sensor imagers to integrated systems-of-systems architectures. Increasingly, these systems-of-systems include multiple sensing modalities that can act as force multipliers for the intelligence analyst. Currently, the separate sensing modalities operate largely independent of one another, providing a selection of operating modes but not an integrated intelligence product. We describe here a Sensor Management System (SMS) designed to provide a small, compact processing unit capable of managing multiple collaborative sensor systems on-board an aircraft. Its purpose is to increase sensor cooperation and collaboration to achieve intelligent data collection and exploitation. The SMS architecture is designed to be largely sensor and data agnostic and provide flexible networked access for both data providers and data consumers. It supports pre-planned and ad-hoc missions, with provisions for on-demand tasking and updates from users connected via data links. Management of sensors and user agents takes place over standard network protocols such that any number and combination of sensors and user agents, either on the local network or connected via data link, can register with the SMS at any time during the mission. The SMS provides control over sensor data collection to handle logging and routing of data products to subscribing user agents. It also supports the addition of algorithmic data processing agents for feature/target extraction and provides for subsequent cueing from one sensor to another. The SMS architecture was designed to scale from a small UAV carrying a limited number of payloads to an aircraft carrying a large number of payloads. The SMS system is STANAG 4575 compliant as a removable memory module (RMM) and can act as a vehicle specific module (VSM) to provide STANAG 4586 compliance (level-3 interoperability) to a non-compliant sensor system

  16. Biomimetic micromechanical adaptive flow-sensor arrays

    NARCIS (Netherlands)

    Krijnen, Gijsbertus J.M.; Floris, J.; Dijkstra, Marcel; Lammerink, Theodorus S.J.; Wiegerink, Remco J.

    2007-01-01

    We report current developments in biomimetic flow-sensors based on flow sensitive mechano-sensors of crickets. Crickets have one form of acoustic sensing evolved in the form of mechanoreceptive sensory hairs. These filiform hairs are highly perceptive to low-frequency sound with energy sensitivities

  17. Data Fusion for a Vision-Radiological System: a Statistical Calibration Algorithm

    International Nuclear Information System (INIS)

    Enqvist, Andreas; Koppal, Sanjeev; Riley, Phillip

    2015-01-01

    Presented here is a fusion system based on simple, low-cost computer vision and radiological sensors for tracking of multiple objects and identifying potential radiological materials being transported or shipped. The main focus of this work is the development of calibration algorithms for characterizing the fused sensor system as a single entity. There is an apparent need for correcting for a scene deviation from the basic inverse distance-squared law governing the detection rates even when evaluating system calibration algorithms. In particular, the computer vision system enables a map of distance-dependence of the sources being tracked, to which the time-dependent radiological data can be incorporated by means of data fusion of the two sensors' output data. (authors)

  18. Data Fusion for a Vision-Radiological System: a Statistical Calibration Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Enqvist, Andreas; Koppal, Sanjeev; Riley, Phillip [University of Florida, Gainesville, FL 32611 (United States)

    2015-07-01

    Presented here is a fusion system based on simple, low-cost computer vision and radiological sensors for tracking of multiple objects and identifying potential radiological materials being transported or shipped. The main focus of this work is the development of calibration algorithms for characterizing the fused sensor system as a single entity. There is an apparent need for correcting for a scene deviation from the basic inverse distance-squared law governing the detection rates even when evaluating system calibration algorithms. In particular, the computer vision system enables a map of distance-dependence of the sources being tracked, to which the time-dependent radiological data can be incorporated by means of data fusion of the two sensors' output data. (authors)

  19. Distributed cluster management techniques for unattended ground sensor networks

    Science.gov (United States)

    Essawy, Magdi A.; Stelzig, Chad A.; Bevington, James E.; Minor, Sharon

    2005-05-01

    Smart Sensor Networks are becoming important target detection and tracking tools. The challenging problems in such networks include the sensor fusion, data management and communication schemes. This work discusses techniques used to distribute sensor management and multi-target tracking responsibilities across an ad hoc, self-healing cluster of sensor nodes. Although miniaturized computing resources possess the ability to host complex tracking and data fusion algorithms, there still exist inherent bandwidth constraints on the RF channel. Therefore, special attention is placed on the reduction of node-to-node communications within the cluster by minimizing unsolicited messaging, and distributing the sensor fusion and tracking tasks onto local portions of the network. Several challenging problems are addressed in this work including track initialization and conflict resolution, track ownership handling, and communication control optimization. Emphasis is also placed on increasing the overall robustness of the sensor cluster through independent decision capabilities on all sensor nodes. Track initiation is performed using collaborative sensing within a neighborhood of sensor nodes, allowing each node to independently determine if initial track ownership should be assumed. This autonomous track initiation prevents the formation of duplicate tracks while eliminating the need for a central "management" node to assign tracking responsibilities. Track update is performed as an ownership node requests sensor reports from neighboring nodes based on track error covariance and the neighboring nodes geo-positional location. Track ownership is periodically recomputed using propagated track states to determine which sensing node provides the desired coverage characteristics. High fidelity multi-target simulation results are presented, indicating the distribution of sensor management and tracking capabilities to not only reduce communication bandwidth consumption, but to also

  20. Joint sensor location/power rating optimization for temporally-correlated source estimation

    KAUST Repository

    Bushnaq, Osama M.

    2017-12-22

    The optimal sensor selection for scalar state parameter estimation in wireless sensor networks is studied in this paper. A subset of the N candidate sensing locations is selected to measure a state parameter and send the observation to a fusion center via a wireless AWGN channel. In addition to selecting the optimal sensing locations, the sensor type to be placed at these locations is selected from a pool of T sensor types such that different sensor types have different power ratings and costs. The sensor transmission power is limited based on the amount of energy harvested at the sensing location and the sensor type. The Kalman filter is used to efficiently obtain the MMSE estimator at the fusion center. Sensors are selected such that the MMSE estimation error is minimized subject to a prescribed system budget. This goal is achieved using convex relaxation and greedy algorithm approaches.
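    As a rough illustration of the greedy selection strategy mentioned in the record above, the sketch below adds sensing locations one at a time so as to minimize the posterior error variance of a scalar parameter under a budget constraint. The candidate noise variances, costs, prior and budget are invented for the example and do not come from the paper.

```python
# Greedy sensor selection for scalar parameter estimation (illustrative only).
# Each candidate is (measurement noise variance, cost); all values are made up.
candidates = [(0.5, 3.0), (1.0, 1.0), (0.8, 2.0), (2.0, 0.5), (0.3, 4.0)]
prior_var = 4.0      # prior variance of the scalar state parameter
budget = 5.0         # assumed total system budget

def posterior_var(selected):
    """MMSE posterior variance with independent Gaussian measurements (information form)."""
    info = 1.0 / prior_var + sum(1.0 / candidates[i][0] for i in selected)
    return 1.0 / info

selected, spent = [], 0.0
while True:
    best, best_var = None, posterior_var(selected)
    for i, (noise_var, cost) in enumerate(candidates):
        if i in selected or spent + cost > budget:
            continue
        trial_var = posterior_var(selected + [i])
        if trial_var < best_var:
            best, best_var = i, trial_var
    if best is None:          # no affordable candidate improves the estimate
        break
    selected.append(best)
    spent += candidates[best][1]

print("chosen sensors:", selected, "error variance:", round(posterior_var(selected), 3))
```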

  1. Towards a Unified Approach to Information Integration - A review paper on data/information fusion

    Energy Technology Data Exchange (ETDEWEB)

    Whitney, Paul D.; Posse, Christian; Lei, Xingye C.

    2005-10-14

    Information or data fusion from different sources is ubiquitous in many applications, from epidemiology, medical, biological, political, and intelligence to military applications. Data fusion involves the integration of spectral, imaging, text, and many other sensor data. For example, in epidemiology, information is often obtained from many studies conducted by different researchers in different regions with different protocols. In the medical field, the diagnosis of a disease is often based on imaging (MRI, X-Ray, CT), clinical examination, and lab results. In the biological field, information is obtained from studies conducted on many different species. In the military field, information is obtained from radar sensors, text messages, chemical and biological sensors, acoustic sensors, optical warnings and many other sources. Many methodologies are used in the data integration process, from classical and Bayesian methods to evidence-based expert systems. The implementation of the data integration ranges from pure software design to a mixture of software and hardware. In this review we summarize the methodologies and implementations of the data fusion process, and illustrate in more detail the methodologies involved in three examples. We propose a unified multi-stage and multi-path mapping approach to the data fusion process, and point out future prospects and challenges.

  2. Sensor fusion and computer vision for context-aware control of a multi degree-of-freedom prosthesis

    Science.gov (United States)

    Markovic, Marko; Dosen, Strahinja; Popovic, Dejan; Graimann, Bernhard; Farina, Dario

    2015-12-01

    Objective. Myoelectric activity volitionally generated by the user is often used for controlling hand prostheses in order to replicate the synergistic actions of muscles in healthy humans during grasping. Muscle synergies in healthy humans are based on the integration of visual perception, heuristics and proprioception. Here, we demonstrate how sensor fusion that combines artificial vision and proprioceptive information with the high-level processing characteristics of biological systems can be effectively used in transradial prosthesis control. Approach. We developed a novel context- and user-aware prosthesis (CASP) controller integrating computer vision and inertial sensing with myoelectric activity in order to achieve semi-autonomous and reactive control of a prosthetic hand. The presented method semi-automatically provides simultaneous and proportional control of multiple degrees-of-freedom (DOFs), thus decreasing overall physical effort while retaining full user control. The system was compared against a major commercial state-of-the-art myoelectric control system in ten able-bodied subjects and one amputee subject. All subjects used a transradial prosthesis with an active wrist to grasp objects typically associated with activities of daily living. Main results. The CASP significantly outperformed the myoelectric interface when controlling all of the prosthesis DOFs. However, when tested with a less complex prosthetic system (smaller number of DOFs), the CASP was slower but resulted in reaching motions that contained fewer compensatory movements. Another important finding is that the CASP system required minimal user adaptation and training. Significance. The CASP constitutes a substantial improvement for the control of multi-DOF prostheses. The application of the CASP will have a significant impact when translated to real-life scenarios, particularly with respect to improving the usability and acceptance of highly complex systems (e.g., full prosthetic arms) by amputees.

  3. A practical approach for active camera coordination based on a fusion-driven multi-agent system

    Science.gov (United States)

    Bustamante, Alvaro Luis; Molina, José M.; Patricio, Miguel A.

    2014-04-01

    In this paper, we propose a multi-agent system architecture to manage spatially distributed active (or pan-tilt-zoom) cameras. Traditional video surveillance algorithms are of no use for active cameras, and we have to look at different approaches. Such multi-sensor surveillance systems have to be designed to solve two related problems: data fusion and coordinated sensor-task management. Generally, architectures proposed for the coordinated operation of multiple cameras are based on the centralisation of management decisions at the fusion centre. However, the existence of intelligent sensors capable of decision making brings with it the possibility of conceiving alternative decentralised architectures. This problem is approached by means of a MAS, integrating data fusion as an integral part of the architecture for distributed coordination purposes. This paper presents the MAS architecture and system agents.

  4. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm

  5. Joint sparsity based heterogeneous data-level fusion for target detection and estimation

    Science.gov (United States)

    Niu, Ruixin; Zulch, Peter; Distasio, Marcello; Blasch, Erik; Shen, Dan; Chen, Genshe

    2017-05-01

    Typical surveillance systems employ decision- or feature-level fusion approaches to integrate heterogeneous sensor data, which are sub-optimal and incur information loss. In this paper, we investigate data-level heterogeneous sensor fusion. Since the sensors monitor the common targets of interest, whose states can be determined by only a few parameters, it is reasonable to assume that the measurement domain has a low intrinsic dimensionality. For heterogeneous sensor data, we develop a joint-sparse data-level fusion (JSDLF) approach based on the emerging joint sparse signal recovery techniques by discretizing the target state space. This approach is applied to fuse signals from multiple distributed radio frequency (RF) signal sensors and a video camera for joint target detection and state estimation. The JSDLF approach is data-driven and requires minimum prior information, since there is no need to know the time-varying RF signal amplitudes, or the image intensity of the targets. It can handle non-linearity in the sensor data due to state space discretization and the use of frequency/pixel selection matrices. Furthermore, for a multi-target case with J targets, the JSDLF approach only requires discretization in a single-target state space, instead of discretization in a J-target state space, as in the case of the generalized likelihood ratio test (GLRT) or the maximum likelihood estimator (MLE). Numerical examples are provided to demonstrate that the proposed JSDLF approach achieves excellent performance with near real-time accurate target position and velocity estimates.

  6. Fusion instrumentation and control: a development strategy

    International Nuclear Information System (INIS)

    Hsu, P.Y.; Greninger, R.C.; Longhurst, G.R.; Madden, P.

    1981-01-01

    We have examined requirements for a fusion instrumentation and control development program to determine where emphasis is needed. The complex, fast, and closely coupled system dynamics of fusion reactors reveal a need for a rigorous approach to the development of instrumentation and control systems. A framework for such a development program should concentrate on three principal need areas: the operator-machine interface, the data and control system architecture, and fusion compatible instruments and sensors. System dynamics characterization of the whole fusion reactor system is also needed to facilitate the implementation process in each of these areas. Finally, the future need to make the instrumentation and control system compatible with the requirements of a commercial plant is met by applying transition technology. These needs form the basis for the program tasks suggested

  7. Joint sensor placement and power rating selection in energy harvesting wireless sensor networks

    KAUST Repository

    Bushnaq, Osama M.

    2017-11-02

    In this paper, the focus is on optimal sensor placement and power rating selection for parameter estimation in wireless sensor networks (WSNs). We take into account the amount of energy harvested by the sensing nodes, communication link quality, and the observation accuracy at the sensor level. In particular, the aim is to reconstruct the estimation parameter with minimum error at a fusion center under a system budget constraint. To achieve this goal, a subset of sensing locations is selected from a large pool of candidate sensing locations. Furthermore, the type of sensor to be placed at those locations is selected from a given set of sensor types (e.g., sensors with different power ratings). We further investigate whether it is better to install a large number of cheap sensors, a few expensive sensors or a combination of different sensor types at the optimal locations.

  8. Research on the Fusion of Dependent Evidence Based on Rank Correlation Coefficient

    Directory of Open Access Journals (Sweden)

    Fengjian Shi

    2017-10-01

    Full Text Available In order to meet the higher accuracy and system reliability requirements, the information fusion for multi-sensor systems is an increasing concern. Dempster–Shafer evidence theory (D–S theory) has been investigated for many applications in multi-sensor information fusion due to its flexibility in uncertainty modeling. However, classical evidence theory assumes that the evidence is independent of each other, which is often unrealistic. Ignoring the relationship between the evidence may lead to unreasonable fusion results, and even lead to wrong decisions. This assumption severely prevents D–S evidence theory from practical application and further development. In this paper, an innovative evidence fusion model to deal with dependent evidence based on rank correlation coefficient is proposed. The model first uses rank correlation coefficient to measure the dependence degree between different evidence. Then, total discount coefficient is obtained based on the dependence degree, which also considers the impact of the reliability of evidence. Finally, the discount evidence fusion model is presented. An example is illustrated to show the use and effectiveness of the proposed method.

  9. Research on the Fusion of Dependent Evidence Based on Rank Correlation Coefficient.

    Science.gov (United States)

    Shi, Fengjian; Su, Xiaoyan; Qian, Hong; Yang, Ning; Han, Wenhua

    2017-10-16

    In order to meet the higher accuracy and system reliability requirements, the information fusion for multi-sensor systems is an increasing concern. Dempster-Shafer evidence theory (D-S theory) has been investigated for many applications in multi-sensor information fusion due to its flexibility in uncertainty modeling. However, classical evidence theory assumes that the evidence is independent of each other, which is often unrealistic. Ignoring the relationship between the evidence may lead to unreasonable fusion results, and even lead to wrong decisions. This assumption severely prevents D-S evidence theory from practical application and further development. In this paper, an innovative evidence fusion model to deal with dependent evidence based on rank correlation coefficient is proposed. The model first uses rank correlation coefficient to measure the dependence degree between different evidence. Then, total discount coefficient is obtained based on the dependence degree, which also considers the impact of the reliability of evidence. Finally, the discount evidence fusion model is presented. An example is illustrated to show the use and effectiveness of the proposed method.
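    The dependence-discounting idea described in the two records above can be sketched as follows: the rank correlation between two bodies of evidence is used to derive a discount factor, which is applied before a classical Dempster combination. This is a minimal sketch with invented mass assignments and a simple correlation-to-discount mapping; it is not the authors' exact formulation.

```python
from itertools import product
from scipy.stats import spearmanr  # used only for the rank correlation

# Frame of discernment and two (illustrative) bodies of evidence; masses on
# singletons plus the full frame Theta. These numbers are made up.
A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
THETA = A | B | C
m1 = {A: 0.6, B: 0.2, C: 0.1, THETA: 0.1}
m2 = {A: 0.5, B: 0.3, C: 0.1, THETA: 0.1}

def dempster(ma, mb):
    """Classical Dempster combination of two mass functions."""
    combined, conflict = {}, 0.0
    for (x, px), (y, py) in product(ma.items(), mb.items()):
        inter = x & y
        if inter:
            combined[inter] = combined.get(inter, 0.0) + px * py
        else:
            conflict += px * py
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

def discount(m, alpha):
    """Shafer discounting: shift a fraction (1 - alpha) of each mass to Theta."""
    out = {k: alpha * v for k, v in m.items()}
    out[THETA] = out.get(THETA, 0.0) + (1.0 - alpha)
    return out

# Dependence measured by the rank correlation of the two mass assignments,
# then turned into a discount factor (one of many possible mappings).
keys = sorted(m1, key=lambda s: "".join(sorted(s)))
rho, _ = spearmanr([m1[k] for k in keys], [m2[k] for k in keys])
alpha = 1.0 - max(rho, 0.0) / 2.0   # stronger dependence -> heavier discount

fused = dempster(m1, discount(m2, alpha))
print("discount factor:", round(alpha, 3))
print("fused masses:", {"".join(sorted(k)): round(v, 3) for k, v in fused.items()})
```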

  10. On state estimation and fusion with elliptical constraints

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S. [ORNL; Liu, Qiang [ORNL

    2017-11-01

    We consider tracking of a target with elliptical nonlinear constraints on its motion dynamics. The state estimates are generated by sensors and sent over long-haul links to a remote fusion center for fusion. We show that the constraints can be projected onto the known ellipse and hence incorporated into the estimation and fusion process. In particular, two methods based on (i) direct connection to the center, and (ii) shortest distance to the ellipse are discussed. A tracking example is used to illustrate the tracking performance using projection-based methods with various fusers in the lossy long-haul tracking environment.
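    The "direct connection to the center" projection mentioned in the record above has a simple closed form: the raw estimate is scaled along the ray from the ellipse center until it lies on the ellipse. The sketch below assumes an axis-aligned ellipse and invented numbers; it illustrates only this projection step, not the paper's full tracking-and-fusion pipeline.

```python
import math

def project_to_ellipse_via_center(x, y, a, b, cx=0.0, cy=0.0):
    """Project (x, y) onto the ellipse ((x-cx)/a)^2 + ((y-cy)/b)^2 = 1
    by scaling along the ray that connects the point to the ellipse center."""
    dx, dy = x - cx, y - cy
    # Value of the ellipse quadratic at the raw estimate; 1.0 means already on the ellipse.
    q = (dx / a) ** 2 + (dy / b) ** 2
    s = 1.0 / math.sqrt(q)   # scale factor that brings the point onto the ellipse
    return cx + s * dx, cy + s * dy

if __name__ == "__main__":
    # A raw fused state estimate that violates the known elliptical constraint.
    est = (3.0, 2.5)
    proj = project_to_ellipse_via_center(*est, a=4.0, b=2.0)
    print("raw estimate:", est, "-> constrained estimate:", tuple(round(v, 3) for v in proj))
```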

  11. Neural network fusion capabilities for efficient implementation of tracking algorithms

    Science.gov (United States)

    Sundareshan, Malur K.; Amoozegar, Farid

    1997-03-01

    The ability to efficiently fuse information of different forms to facilitate intelligent decision making is one of the major capabilities of trained multilayer neural networks that is now being recognized. While development of innovative adaptive control algorithms for nonlinear dynamical plants that attempt to exploit these capabilities seems to be more popular, a corresponding development of nonlinear estimation algorithms using these approaches, particularly for application in target surveillance and guidance operations, has not received similar attention. We describe the capabilities and functionality of neural network algorithms for data fusion and implementation of tracking filters. To discuss details and to serve as a vehicle for quantitative performance evaluations, the illustrative case of estimating the position and velocity of surveillance targets is considered. Efficient target-tracking algorithms that can utilize data from a host of sensing modalities and are capable of reliably tracking even uncooperative targets executing fast and complex maneuvers are of interest in a number of applications. The primary motivation for employing neural networks in these applications comes from the efficiency with which more features extracted from different sensor measurements can be utilized as inputs for estimating target maneuvers. A system architecture that efficiently integrates the fusion capabilities of a trained multilayer neural net with the tracking performance of a Kalman filter is described. The innovation lies in the way the fusion of multisensor data is accomplished to facilitate improved estimation without increasing the computational complexity of the dynamical state estimator itself.

  12. Possible application of electromagnetic guns to impact fusion

    Science.gov (United States)

    Kostoff, R. N.; Peaslee, A. T., Jr.; Ribe, F. L.

    1982-01-01

    The possible application of electromagnetic guns to impact fusion for the generation of electric power is discussed, and advantages of impact fusion over the more conventional inertial confinement fusion concepts are examined. It is shown that impact fusion can achieve the necessary high yields, of the order of a few gigajoules, which are difficult to achieve with lasers except at unrealistically high target gains. The rail gun accelerator is well adapted to the delivery of some 10-100 megajoules of energy to the fusion target, and the electrical technology involved is relatively simple: inductive storage or rotating machinery and capacitors. It is concluded that the rail gun has the potential of developing into an impact fusion macroparticle accelerator.

  13. A Novel Feature-Level Data Fusion Method for Indoor Autonomous Localization

    Directory of Open Access Journals (Sweden)

    Minxiang Liu

    2013-01-01

    Full Text Available We present a novel feature-level data fusion method for autonomous localization in an inactive multiple reference unknown indoor environment. Since monocular sensors cannot provide the depth information directly, the proposed method incorporates the edge information of images from a camera with homologous depth information received from an infrared sensor. Real-time experimental results demonstrate that the accuracies of position and orientation are greatly improved by using the proposed fusion method in an unknown complex indoor environment. Compared to monocular localization, the proposed method is found to have up to 70 percent improvement in accuracy.

  14. A New Localization System for Indoor Service Robots in Low Luminance and Slippery Indoor Environment Using Afocal Optical Flow Sensor Based Sensor Fusion

    Directory of Open Access Journals (Sweden)

    Dong-Hoon Yi

    2018-01-01

    Full Text Available In this paper, a new localization system utilizing afocal optical flow sensor (AOFS)-based sensor fusion for indoor service robots in low luminance and slippery environments is proposed, where conventional localization systems do not perform well. To accurately estimate the moving distance of a robot in a slippery environment, the robot was equipped with an AOFS along with two conventional wheel encoders. To estimate the orientation of the robot, we adopted a forward-viewing mono-camera and a gyroscope. In a very low luminance environment, it is hard to conduct conventional feature extraction and matching for localization. Instead, the interior space structure from an image and the robot orientation were assessed. To enhance the appearance of image boundaries, a rolling guidance filter was applied after histogram equalization. The proposed system was developed to be operable on a low-cost processor and implemented on a consumer robot. Experiments were conducted in a low illumination condition of 0.1 lx in a carpeted environment. The robot moved 20 times along a 1.5 × 2.0 m square trajectory. When only wheel encoders and a gyroscope were used for robot localization, the maximum position error was 10.3 m and the maximum orientation error was 15.4°. Using the proposed system, the maximum position error and orientation error were found to be 0.8 m and within 1.0°, respectively.

  15. A New Localization System for Indoor Service Robots in Low Luminance and Slippery Indoor Environment Using Afocal Optical Flow Sensor Based Sensor Fusion.

    Science.gov (United States)

    Yi, Dong-Hoon; Lee, Tae-Jae; Cho, Dong-Il Dan

    2018-01-10

    In this paper, a new localization system utilizing afocal optical flow sensor (AOFS)-based sensor fusion for indoor service robots in low luminance and slippery environments is proposed, where conventional localization systems do not perform well. To accurately estimate the moving distance of a robot in a slippery environment, the robot was equipped with an AOFS along with two conventional wheel encoders. To estimate the orientation of the robot, we adopted a forward-viewing mono-camera and a gyroscope. In a very low luminance environment, it is hard to conduct conventional feature extraction and matching for localization. Instead, the interior space structure from an image and the robot orientation were assessed. To enhance the appearance of image boundaries, a rolling guidance filter was applied after histogram equalization. The proposed system was developed to be operable on a low-cost processor and implemented on a consumer robot. Experiments were conducted in a low illumination condition of 0.1 lx in a carpeted environment. The robot moved 20 times along a 1.5 × 2.0 m square trajectory. When only wheel encoders and a gyroscope were used for robot localization, the maximum position error was 10.3 m and the maximum orientation error was 15.4°. Using the proposed system, the maximum position error and orientation error were found to be 0.8 m and within 1.0°, respectively.

  16. Semiotic foundation for multisensor-multilook fusion

    Science.gov (United States)

    Myler, Harley R.

    1998-07-01

    This paper explores the concept of an application of semiotic principles to the design of a multisensor-multilook fusion system. Semiotics is an approach to analysis that attempts to process media in a unified way using qualitative methods as opposed to quantitative ones. The term semiotic refers to signs, or signatory data that encapsulates information. Semiotic analysis involves the extraction of signs from information sources and the subsequent processing of the signs into meaningful interpretations of the information content of the source. The multisensor fusion problem predicated on a semiotic system structure and incorporating semiotic analysis techniques is examined, and the design of a multisensor system as an information fusion system is explored. Semiotic analysis opens the possibility of using non-traditional sensor sources and modalities in the fusion process, such as verbal and textual intelligence derived from human observers. Examples of how multisensor/multimodality data might be analyzed semiotically are shown, and a discussion of how a semiotic system for multisensor fusion could be realized is outlined. The architecture of a semiotic multisensor fusion processor that can accept situational awareness data is described, although an implementation has not as yet been constructed.

  17. Multisensor Super Resolution Using Directionally-Adaptive Regularization for UAV Images.

    Science.gov (United States)

    Kang, Wonseok; Yu, Soohwan; Ko, Seungyong; Paik, Joonki

    2015-05-22

    In various unmanned aerial vehicle (UAV) imaging applications, the multisensor super-resolution (SR) technique has become a chronic problem and attracted increasing attention. Multisensor SR algorithms utilize multispectral low-resolution (LR) images to make a higher resolution (HR) image to improve the performance of the UAV imaging system. The primary objective of the paper is to develop a multisensor SR method based on the existing multispectral imaging framework instead of using additional sensors. In order to restore image details without noise amplification or unnatural post-processing artifacts, this paper presents an improved regularized SR algorithm by combining the directionally-adaptive constraints and multiscale non-local means (NLM) filter. As a result, the proposed method can overcome the physical limitation of multispectral sensors by estimating the color HR image from a set of multispectral LR images using intensity-hue-saturation (IHS) image fusion. Experimental results show that the proposed method provides better SR results than existing state-of-the-art SR methods in the sense of objective measures.

  18. Image fusion for enhanced forest structural assessment

    CSIR Research Space (South Africa)

    Roberts, JW

    2011-01-01

    Full Text Available This research explores the potential benefits of fusing active and passive medium resolution satellite-borne sensor data for forest structural assessment. Image fusion was applied as a means of retaining disparate data features relevant to modeling...

  19. The optimal algorithm for Multi-source RS image fusion.

    Science.gov (United States)

    Fu, Wei; Huang, Shui-Guang; Li, Zeng-Shun; Shen, Hao; Li, Jun-Shuai; Wang, Peng-Yuan

    2016-01-01

    In order to solve the issue that the fusion rules cannot be self-adaptively adjusted by the available fusion methods according to the subsequent processing requirements of Remote Sensing (RS) images, this paper puts forward GSDA (genetic-iterative self-organizing data analysis algorithm), which integrates the merits of the genetic algorithm with the advantages of the iterative self-organizing data analysis algorithm for multi-source RS image fusion. The proposed algorithm considers the translation-invariant wavelet transform as the model operator and regards the contrast pyramid conversion as the observed operator. The algorithm then designs the objective function as the weighted sum of evaluation indices, and optimizes the objective function by employing GSDA so as to obtain a higher-resolution RS image. The main points of the work are summarized as follows.
    • The contribution proposes the iterative self-organizing data analysis algorithm for multi-source RS image fusion.
    • The article presents the GSDA algorithm for the self-adaptive adjustment of the fusion rules.
    • The text proposes the model operator and the observed operator as the fusion scheme for RS images based on GSDA.
    The proposed algorithm opens up a novel algorithmic pathway for multi-source RS image fusion by means of GSDA.

  20. Application of data fusion techniques and technologies for wearable health monitoring.

    Science.gov (United States)

    King, Rachel C; Villeneuve, Emma; White, Ruth J; Sherratt, R Simon; Holderbaum, William; Harwin, William S

    2017-04-01

    Technological advances in sensors and communications have enabled discrete integration into everyday objects, both in the home and about the person. Information gathered by monitoring physiological, behavioural, and social aspects of our lives, can be used to achieve a positive impact on quality of life, health, and well-being. Wearable sensors are at the cusp of becoming truly pervasive, and could be woven into the clothes and accessories that we wear such that they become ubiquitous and transparent. To interpret the complex multidimensional information provided by these sensors, data fusion techniques are employed to provide a meaningful representation of the sensor outputs. This paper is intended to provide a short overview of data fusion techniques and algorithms that can be used to interpret wearable sensor data in the context of health monitoring applications. The application of these techniques are then described in the context of healthcare including activity and ambulatory monitoring, gait analysis, fall detection, and biometric monitoring. A snap-shot of current commercially available sensors is also provided, focusing on their sensing capability, and a commentary on the gaps that need to be bridged to bring research to market. Copyright © 2017. Published by Elsevier Ltd.

  1. Variable self-powered light detection CMOS chip with real-time adaptive tracking digital output based on a novel on-chip sensor.

    Science.gov (United States)

    Wang, HongYi; Fan, Youyou; Lu, Zhijian; Luo, Tao; Fu, Houqiang; Song, Hongjiang; Zhao, Yuji; Christen, Jennifer Blain

    2017-10-02

    This paper provides a solution for self-powered light direction detection with digitized output. Light direction sensors, energy-harvesting photodiodes, a real-time adaptive tracking digital output unit and other necessary circuits are integrated on a single chip based on a standard 0.18 µm CMOS process. The proposed light direction sensors have an accuracy of 1.8 degrees over a 120 degree range. In order to improve the accuracy, a compensation circuit is presented for the photodiodes' forward currents. The actual measurement precision of the output is approximately 7 ENOB. In addition, an adaptive under-voltage protection circuit is designed for the variable supply power, which may fluctuate with temperature and process.

  2. Pixel-level multisensor image fusion based on matrix completion and robust principal component analysis

    Science.gov (United States)

    Wang, Zhuozheng; Deller, J. R.; Fleet, Blair D.

    2016-01-01

    Acquired digital images are often corrupted by a lack of camera focus, faulty illumination, or missing data. An algorithm is presented for fusion of multiple corrupted images of a scene using the lifting wavelet transform. The method employs adaptive fusion arithmetic based on matrix completion and self-adaptive regional variance estimation. Characteristics of the wavelet coefficients are used to adaptively select fusion rules. Robust principal component analysis is applied to low-frequency image components, and regional variance estimation is applied to high-frequency components. Experiments reveal that the method is effective for multifocus, visible-light, and infrared image fusion. Compared with traditional algorithms, the new algorithm not only increases the amount of preserved information and clarity but also improves robustness.

  3. Adaptive neuro-fuzzy based inferential sensor model for estimating the average air temperature in space heating systems

    Energy Technology Data Exchange (ETDEWEB)

    Jassar, S.; Zhao, L. [Department of Electrical and Computer Engineering, Ryerson University, 350 Victoria Street, Toronto, ON (Canada); Liao, Z. [Department of Architectural Science, Ryerson University (Canada)

    2009-08-15

    The heating systems are conventionally controlled by open-loop control systems because of the absence of practical methods for estimating average air temperature in the built environment. An inferential sensor model, based on adaptive neuro-fuzzy inference system modeling, for estimating the average air temperature in multi-zone space heating systems is developed. This modeling technique has the advantage of expert knowledge of fuzzy inference systems (FISs) and learning capability of artificial neural networks (ANNs). A hybrid learning algorithm, which combines the least-square method and the back-propagation algorithm, is used to identify the parameters of the network. This paper describes an adaptive network based inferential sensor that can be used to design closed-loop control for space heating systems. The research aims to improve the overall performance of heating systems, in terms of energy efficiency and thermal comfort. The average air temperature results estimated by using the developed model are strongly in agreement with the experimental results. (author)

  4. Recognition of Wheat Spike from Field Based Phenotype Platform Using Multi-Sensor Fusion and Improved Maximum Entropy Segmentation Algorithms

    Directory of Open Access Journals (Sweden)

    Chengquan Zhou

    2018-02-01

    Full Text Available To obtain an accurate count of wheat spikes, which is crucial for estimating yield, this paper proposes a new algorithm that uses computer vision to achieve this goal from an image. First, a home-built semi-autonomous multi-sensor field-based phenotype platform (FPP is used to obtain orthographic images of wheat plots at the filling stage. The data acquisition system of the FPP provides high-definition RGB images and multispectral images of the corresponding quadrats. Then, the high-definition panchromatic images are obtained by fusion of three channels of RGB. The Gram–Schmidt fusion algorithm is then used to fuse these multispectral and panchromatic images, thereby improving the color identification degree of the targets. Next, the maximum entropy segmentation method is used to do the coarse-segmentation. The threshold of this method is determined by a firefly algorithm based on chaos theory (FACT, and then a morphological filter is used to de-noise the coarse-segmentation results. Finally, morphological reconstruction theory is applied to segment the adhesive part of the de-noised image and realize the fine-segmentation of the image. The computer-generated counting results for the wheat plots, using independent regional statistical function in Matlab R2017b software, are then compared with field measurements which indicate that the proposed method provides a more accurate count of wheat spikes when compared with other traditional fusion and segmentation methods mentioned in this paper.
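    The maximum-entropy coarse-segmentation step described above can be illustrated with Kapur's entropy criterion: the threshold is the grey level that maximizes the summed entropies of the background and foreground histograms. In the sketch below an exhaustive search stands in for the chaos-based firefly optimizer (FACT) used in the paper, and the synthetic image is invented for the example.

```python
import numpy as np

def kapur_threshold(image, levels=256):
    """Return the grey level maximizing background + foreground entropy (Kapur's criterion)."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist.astype(float) / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, levels - 1):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1
        h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))   # background entropy
        h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))   # foreground entropy
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "bright spikes over darker canopy" image, purely for demonstration.
    img = rng.normal(80, 10, (64, 64))
    img[20:30, 20:30] += 100
    t = kapur_threshold(np.clip(img, 0, 255).astype(np.uint8))
    print("Kapur threshold:", t)
```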

  5. A New Developed GIHS-BT-SFIM Fusion Method Based On Edge and Class Data

    Directory of Open Access Journals (Sweden)

    S. Dehnavi

    2013-09-01

    Full Text Available The objective of image fusion (or sometimes pan-sharpening) is to produce a single image containing the best aspects of the source images. Some desirable aspects are high spatial resolution and high spectral resolution. With the development of space-borne imaging sensors, a unified image fusion approach suitable for all employed imaging sources becomes necessary. Among various image fusion methods, the intensity-hue-saturation (IHS) and Brovey Transform (BT) methods can quickly merge huge amounts of imagery. However, they often face color distortion problems in the fused images. The SFIM fusion is one of the most frequently employed approaches in practice to control the tradeoff between spatial and spectral information. In addition, it preserves more spectral information but suffers more spatial information loss. Its effectiveness heavily depends on the filter design. In this work, two modifications were tested to improve the spectral quality of the images, and class-based fusion results were also investigated. First, a Generalized Intensity-Hue-Saturation (GIHS), Brovey Transform (BT) and smoothing-filter-based intensity modulation (SFIM) approach was implemented. This kind of algorithm has shown computational advantages over other fusion methods such as wavelets, and can be extended to different numbers of bands, as discussed in the literature. The used IHS-BT-SFIM algorithm incorporates the IHS, IHS-BT, BT, BT-SFIM and SFIM methods through two adjustable parameters. Second, a method was proposed to add edge information to the previous GIHS_BT_SFIM and to perform edge enhancement with the panchromatic image. Adding panchromatic data to the images yielded little improvement. Third, an edge-adaptive GIHS_BT_SFIM was proposed to enforce fidelity away from the edges. Using the MS image away from edges has shown spectral improvement in some fusion methods. Fourth, a class-based fusion was tested, which uses different coefficients for each class. The best parameters for vegetated areas were k1 = 0.6, k2

  6. A Data Fusion System for the Nondestructive Evaluation of Non-Piggable Pipes

    Energy Technology Data Exchange (ETDEWEB)

    Shreekanth Mandayam; Robi Polikar; John C. Chen

    2006-02-01

    The objectives of this research project are: (1) To design sensor data fusion algorithms that can synergistically combine defect related information from heterogeneous sensors used in gas pipeline inspection for reliably and accurately predicting the condition of the pipe-wall; and (2) To develop efficient data management techniques for signals obtained during multisensor interrogation of a gas pipeline. This final report summarizes all research activities conducted by Rowan University during the project period. This includes the design and development of experimental validation test platforms, the design and development of data fusion algorithms for defect identification and sizing, and finally, the design and development of advanced visualization algorithms for the effective management of data resulting from multi-sensor interrogation of gas transmission pipelines.

  7. A Least Square-Based Self-Adaptive Localization Method for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Baoguo Yu

    2016-01-01

    Full Text Available In wireless sensor network (WSN) localization methods based on the Received Signal Strength Indicator (RSSI), it is usually required to determine the parameters of the radio signal propagation model before estimating the distance between an anchor node and an unknown node from their communication RSSI value; a localization algorithm is then used to estimate the location of the unknown node. However, this localization method, though high in localization accuracy, has weaknesses such as a complex working procedure and poor system versatility. To address these defects, a self-adaptive WSN localization method based on least squares is proposed, which uses the least-squares criterion to estimate the parameters of the radio signal propagation model, thereby reducing the computational load of the estimation process. The experimental results show that the proposed self-adaptive localization method achieves high processing efficiency while satisfying the high localization accuracy requirement, and is therefore of definite practical value.
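    A minimal sketch of the least-squares step described above, assuming a standard log-distance path-loss model: the RSSI at 1 m and the path-loss exponent are fitted from calibration pairs, after which node distances follow from the inverted model. The measurement values are invented for the example, and the model form is a common assumption rather than necessarily the paper's exact one.

```python
import numpy as np

# Log-distance model: RSSI(d) = A - 10 * n * log10(d), with A the RSSI at 1 m
# and n the path-loss exponent. Fit (A, n) by least squares from calibration pairs.
distances_m = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
rssi_dbm = np.array([-40.2, -46.5, -52.8, -58.9, -66.1])   # made-up measurements

X = np.column_stack([np.ones_like(distances_m), -10.0 * np.log10(distances_m)])
(A_fit, n_fit), *_ = np.linalg.lstsq(X, rssi_dbm, rcond=None)

def rssi_to_distance(rssi):
    """Invert the fitted model to estimate the anchor-to-node distance in meters."""
    return 10 ** ((A_fit - rssi) / (10.0 * n_fit))

print(f"fitted A = {A_fit:.1f} dBm, n = {n_fit:.2f}")
print(f"RSSI of -55 dBm -> estimated distance {rssi_to_distance(-55.0):.2f} m")
```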

  8. Large-Scale, Parallel, Multi-Sensor Atmospheric Data Fusion Using Cloud Computing

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E. J.

    2013-12-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the 'A-Train' platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration analyses of important climate variables presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (MERRA), stratify the comparisons using a classification of the 'cloud scenes' from CloudSat, and repeat the entire analysis over 10 years of data. To efficiently assemble such datasets, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. However, these problems are Data Intensive computing so the data transfer times and storage costs (for caching) are key issues. SciReduce is a Hadoop-like parallel analysis system, programmed in parallel python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Figure 1 shows the architecture of the full computational system, with SciReduce at the core. Multi-year datasets are automatically 'sharded' by time and space across a cluster of nodes so that years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the cached input and intermediate datasets. We are using SciReduce to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a NASA MEASURES grant. We will

  9. Blob-level active-passive data fusion for Benthic classification

    Science.gov (United States)

    Park, Joong Yong; Kalluri, Hemanth; Mathur, Abhinav; Ramnath, Vinod; Kim, Minsu; Aitken, Jennifer; Tuell, Grady

    2012-06-01

    We extend data fusion from the pixel level to the more semantically meaningful blob level, using the mean-shift algorithm to form labeled blobs having high similarity in the feature domain and connectivity in the spatial domain. We have also developed Bhattacharyya Distance (BD) and rule-based classifiers, and have implemented these higher-level data fusion algorithms in the CZMIL Data Processing System. Applying these new algorithms to recent SHOALS and CASI data at Plymouth Harbor, Massachusetts, we achieved improved benthic classification accuracies over those produced with either single-sensor or pixel-level fusion strategies. These results appear to validate the hypothesis that classification accuracy may be generally improved by adopting higher spatial and semantic levels of fusion.
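    For intuition on the Bhattacharyya-distance classifier mentioned above, the sketch below computes the distance between Gaussian class models in a fused blob-level feature space and assigns a blob to the nearest class. The class statistics and features are fabricated for the example and are not taken from the CZMIL system.

```python
import numpy as np

def bhattacharyya_gaussian(mu1, cov1, mu2, cov2):
    """Bhattacharyya distance between two multivariate Gaussian distributions."""
    cov = 0.5 * (cov1 + cov2)
    diff = mu1 - mu2
    term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
    term2 = 0.5 * np.log(np.linalg.det(cov) /
                         np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return term1 + term2

# Made-up benthic class models in a 2-D fused feature space (e.g., a lidar depth
# statistic and a spectral index); purely illustrative.
classes = {
    "sand":     (np.array([0.8, 0.2]), np.diag([0.02, 0.01])),
    "seagrass": (np.array([0.3, 0.7]), np.diag([0.03, 0.02])),
}

blob_mean, blob_cov = np.array([0.35, 0.65]), np.diag([0.02, 0.02])
scores = {name: bhattacharyya_gaussian(blob_mean, blob_cov, mu, cov)
          for name, (mu, cov) in classes.items()}
print(scores, "->", min(scores, key=scores.get))
```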

  10. Heterogeneous Multi-Metric Learning for Multi-Sensor Fusion

    Science.gov (United States)

    2011-07-01


  11. Fault detection and diagnosis of an industrial steam turbine using fusion of SVM (support vector machine) and ANFIS (adaptive neuro-fuzzy inference system) classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Salahshoor, Karim [Department of Instrumentation and Automation, Petroleum University of Technology, Tehran (Iran, Islamic Republic of); Kordestani, Mojtaba; Khoshro, Majid S. [Department of Control Engineering, Islamic Azad University South Tehran branch (Iran, Islamic Republic of)

    2010-12-15

    The subject of FDD (fault detection and diagnosis) has gained widespread industrial interest in machine condition monitoring applications. This is mainly due to the potential advantages to be achieved from reduced maintenance costs, improved productivity and increased machine availability. This paper presents a new FDD scheme for condition monitoring of an industrial steam turbine using a data fusion methodology. Fusion of an SVM (support vector machine) classifier with an ANFIS (adaptive neuro-fuzzy inference system) classifier, integrated into a common framework, is utilized to enhance the fault detection and diagnostic tasks. For this purpose, multi-attribute data are fused into aggregated values of a single attribute by OWA (ordered weighted averaging) operators. The simulation studies indicate that the resulting fusion-based scheme outperforms the individual SVM and ANFIS systems in detecting and diagnosing incipient steam turbine faults. (author)
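    The ordered-weighted-averaging step can be sketched as follows: per-class confidences from the two classifiers are sorted and combined with a fixed weight vector, and the fused scores drive the final decision. The confidences, class labels and weights below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def owa(scores, weights):
    """Ordered weighted averaging: weights are applied to the scores sorted in descending order."""
    ordered = np.sort(scores)[::-1]
    return float(np.dot(weights, ordered))

# Hypothetical per-fault-class confidences from the two classifiers.
svm_scores = np.array([0.70, 0.20, 0.10])     # e.g., [fault A, fault B, normal]
anfis_scores = np.array([0.55, 0.35, 0.10])
owa_weights = np.array([0.7, 0.3])            # favors the more confident source

fused = [owa(np.array([s, a]), owa_weights) for s, a in zip(svm_scores, anfis_scores)]
print("fused class scores:", [round(v, 3) for v in fused],
      "-> decision: class", int(np.argmax(fused)))
```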

  12. An Ensemble Deep Convolutional Neural Network Model with Improved D-S Evidence Fusion for Bearing Fault Diagnosis.

    Science.gov (United States)

    Li, Shaobo; Liu, Guokai; Tang, Xianghong; Lu, Jianguang; Hu, Jianjun

    2017-07-28

    Intelligent machine health monitoring and fault diagnosis are becoming increasingly important for modern manufacturing industries. Current fault diagnosis approaches mostly depend on expert-designed features for building prediction models. In this paper, we proposed IDSCNN, a novel bearing fault diagnosis algorithm based on ensemble deep convolutional neural networks and an improved Dempster-Shafer theory based evidence fusion. The convolutional neural networks take the root mean square (RMS) maps from the FFT (Fast Fourier Transformation) features of the vibration signals from two sensors as inputs. The improved D-S evidence theory is implemented via distance matrix from evidences and modified Gini Index. Extensive evaluations of the IDSCNN on the Case Western Reserve Dataset showed that our IDSCNN algorithm can achieve better fault diagnosis performance than existing machine learning methods by fusing complementary or conflicting evidences from different models and sensors and adapting to different load conditions.

  13. Economic potential of inertial fusion

    International Nuclear Information System (INIS)

    Nuckolls, J.H.

    1984-04-01

    Beyond the achievement of scientific feasibility, the key question for fusion energy is: does it have the economic potential to be significantly cheaper than fission and coal energy? If fusion has this high economic potential then there are compelling commercial and geopolitical incentives to accelerate the pace of the fusion program in the near term, and to install a global fusion energy system in the long term. Without this high economic potential, fusion's success depends on the failure of all alternatives, and there is no real incentive to accelerate the program. If my conjectures on the economic potential of inertial fusion are approximately correct, then inertial fusion energy's ultimate costs may be only half to two-thirds those of advanced fission and coal energy systems. Relative cost escalation is not assumed and could increase this advantage. Both magnetic and inertial approaches to fusion potentially have a two-fold economic advantage which derives from two fundamental properties: negligible fuel costs and high quality energy which makes possible more efficient generation of electricity. The winning approach to fusion may excel in three areas: electrical generating efficiency, minimum material costs, and adaptability to manufacture in automated factories. The winning approach must also rate highly in environmental potential, safety, availability factor, lifetime, low O&M costs, and no possibility of utility-disabling accidents

  14. Inertial Sensor Error Reduction through Calibration and Sensor Fusion

    Directory of Open Access Journals (Sweden)

    Stefan Lambrecht

    2016-02-01

    Full Text Available This paper presents a comparison between cooperative and local Kalman Filters (KF) for estimating the absolute segment angle, under two calibration conditions: a simplified calibration that can be replicated in most laboratories, and a complex calibration similar to that applied by commercial vendors. The cooperative filters use information either from all inertial sensors attached to the body (Matricial KF) or from the inertial sensors and the potentiometers of an exoskeleton (Markovian KF). A one-minute walking trial of a subject walking with a 6-DoF exoskeleton was used to assess the absolute segment angle of the trunk, thigh, shank, and foot. The results indicate that regardless of the segment and filter applied, the more complex calibration always results in a significantly better performance compared to the simplified calibration. The interaction between filter and calibration suggests that when the quality of the calibration is unknown the Markovian KF is recommended. Applying the complex calibration, the Matricial and Markovian KF perform similarly, with average RMSE below 1.22 degrees. Cooperative KFs perform better than, or at least as well as, local KFs; we therefore recommend using cooperative KFs instead of local KFs for the control or analysis of walking.
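    For intuition on what an individual angle filter of this kind does, the sketch below fuses a gyroscope rate with an accelerometer-derived inclination for a single segment using a one-state Kalman filter. The noise settings and synthetic signals are assumptions made for the example; the cooperative (Matricial and Markovian) formulations of the paper are not reproduced here.

```python
import numpy as np

def fuse_segment_angle(gyro_rate, accel_angle, dt=0.01, q=0.01, r=4.0):
    """One-state Kalman filter: predict with the gyro rate, correct with the accel angle."""
    theta, p = accel_angle[0], 1.0
    out = []
    for w, z in zip(gyro_rate, accel_angle):
        theta += w * dt            # prediction by integrating the angular rate
        p += q
        k = p / (p + r)            # Kalman gain
        theta += k * (z - theta)   # correction with the accelerometer inclination
        p *= (1.0 - k)
        out.append(theta)
    return np.array(out)

if __name__ == "__main__":
    t = np.arange(0, 5, 0.01)
    true = 20 * np.sin(0.5 * t)                        # synthetic segment angle [deg]
    gyro = np.gradient(true, 0.01) + np.random.normal(0, 0.5, t.size)
    accel = true + np.random.normal(0, 2.0, t.size)    # noisy inclination measurement
    est = fuse_segment_angle(gyro, accel)
    print("RMSE [deg]:", round(float(np.sqrt(np.mean((est - true) ** 2))), 2))
```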

  15. State fusion with unknown correlation: ellipsoidal intersection

    NARCIS (Netherlands)

    Sijs, J.; Lazar, M.; Bosch, P.P.J. van den

    2010-01-01

    Some crucial challenges of estimation over sensor networks are reaching consensus on the estimates of different systems in the network and separating the mutual information of two estimates from their exclusive information. Current fusion methods of two estimates tend to bypass the mutual

  16. State fusion with unknown correlation : ellipsoidal intersection

    NARCIS (Netherlands)

    Sijs, J.; Lazar, M.; Bosch, van den P.P.J.

    2010-01-01

    Some crucial challenges of estimation over sensor networks are reaching consensus on the estimates of different systems in the network and separating the mutual information of two estimates from their exclusive information. Current fusion methods of two estimates tend to bypass the mutual

  17. Development of a Low-Cost Attitude Sensor for Agricultural Vehicles

    Science.gov (United States)

    The objective of this research was to develop a low-cost attitude sensor for agricultural vehicles. The attitude sensor was composed of three vibratory gyroscopes and two inclinometers. A sensor fusion algorithm was developed to estimate the tilt angles (roll and pitch) by the least-squares method. In the a...
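    A common way to obtain tilt angles from a static gravity measurement is shown below; it is only a baseline illustration, since the paper's redundant-sensor configuration and least-squares weighting are not reproduced, and the sample reading is invented.

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Roll and pitch (degrees) from a static accelerometer reading of gravity."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return math.degrees(roll), math.degrees(pitch)

# Invented reading in g for a vehicle tilted slightly nose-up and to the right.
roll_deg, pitch_deg = tilt_from_gravity(ax=-0.10, ay=0.17, az=0.98)
print(f"roll = {roll_deg:.1f} deg, pitch = {pitch_deg:.1f} deg")
```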

  18. Improving Planetary Rover Attitude Estimation via MEMS Sensor Characterization

    Science.gov (United States)

    Hidalgo, Javier; Poulakis, Pantelis; Köhler, Johan; Del-Cerro, Jaime; Barrientos, Antonio

    2012-01-01

    Micro Electro-Mechanical Systems (MEMS) are currently being considered in the space sector due to their suitable level of performance for spacecraft in terms of mechanical robustness with low power consumption, small mass and size, and significant advantages in system design and accommodation. However, there is still a lack of understanding regarding the performance and testing of these new sensors, especially in planetary robotics. This paper presents what is missing in the field: a complete methodology for the characterization and modeling of MEMS sensors with direct application. A reproducible and complete approach including all the intermediate steps, tools and laboratory equipment is described. The process of sensor error characterization and modeling through to the final integration in the sensor fusion scheme is explained in detail. Although the concept of fusion is relatively easy to comprehend, carefully characterizing and filtering sensor information is not an easy task and is essential for good performance. The strength of the approach has been verified with representative tests of novel high-grade MEMS inertial sensors and exemplary planetary rover platforms, with promising results. PMID:22438761

  19. Indirect adaptive fuzzy fault-tolerant tracking control for MIMO nonlinear systems with actuator and sensor failures.

    Science.gov (United States)

    Bounemeur, Abdelhamid; Chemachema, Mohamed; Essounbouli, Najib

    2018-05-10

    In this paper, an active fuzzy fault tolerant tracking control (AFFTTC) scheme is developed for a class of multi-input multi-output (MIMO) unknown nonlinear systems in the presence of unknown actuator faults, sensor failures and external disturbance. The developed control scheme deals with four kinds of faults for both sensors and actuators. The bias, drift, and loss of accuracy additive faults are considered along with the loss of effectiveness multiplicative fault. A fuzzy adaptive controller based on back-stepping design is developed to deal with actuator failures and unknown system dynamics. However, an additional robust control term is added to deal with sensor faults, approximation errors, and external disturbances. Lyapunov theory is used to prove the stability of the closed loop system. Numerical simulations on a quadrotor are presented to show the effectiveness of the proposed approach. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  20. Multi-focus Image Fusion Using Epifluorescence Microscopy for Robust Vascular Segmentation

    OpenAIRE

    Pelapur, Rengarajan; Prasath, Surya; Palaniappan, Kannappan

    2014-01-01

    We are building a computerized image analysis system for Dura Mater vascular network from fluorescence microscopy images. We propose a system that couples a multi-focus image fusion module with a robust adaptive filtering based segmentation. The robust adaptive filtering scheme handles noise without destroying small structures, and the multi focal image fusion considerably improves the overall segmentation quality by integrating information from multiple images. Based on the segmenta...

  1. Introduction to DSmT for Information Fusion

    CERN Document Server

    Dezert, J; The 7th International Conference on Information Fusion

    2004-01-01

    DSmT (Dezert-Smarandache Theory) is a new alternative to Dempster-Shafer Theory (DST) which: 1) proposes a new mathematical framework for information fusion; 2) incorporates any kind of model (free, hybrid DSm models, and/or Shafer's model) for taking into account any integrity constraints of the fusion problem; 3) combines uncertain, highly conflicting, and imprecise sources of evidence with a new rule of combination and overcomes the limitations of Dempster's rule; 4) is adapted to static or dynamic fusion applications represented in terms of belief functions based on the same general formalism.

  2. Adaptive Fusion of Information for Seeing into Ordos Basin, China: A China-Germany-US Joint Venture.

    Science.gov (United States)

    Yeh, T. C. J.; Yin, L.; Sauter, M.; Hu, R.; Ptak, T.; Hou, G. C.

    2014-12-01

    Adaptive fusion of information for seeing into geological basins is the theme of this joint venture. The objective of this venture is to initiate possible collaborations between scientists from China, Germany, and US to develop innovative technologies, which can be utilized to characterize geological and hydrological structures and processes as well as other natural resources in regional-scale geological basins of hundreds of thousands of kilometers (i.e., the Ordos Basin, China). This adaptive fusion of information aims to assimilate active (manmade) and passive (natural) hydrologic and geophysical tomography surveys to enhance our ability of seeing into hydrogeological basins at the resolutions of our interests. The active hydrogeophysical tomography refers to recently developed hydraulic tomographic surveys by Chinese and German scientists, as well as well-established geophysical tomography surveys (such as electrical resistivity tomography, cross-borehole radars, electrical magnetic surveys). These active hydrogeophysical tomographic surveys have been proven to be useful high-resolution surveys for geological media of tens and hundreds of meters wide and deep. For basin-scale (i.e., tens and hundreds of kilometers) problems, their applicability is, however, rather limited. The passive hydrogeophysical tomography refers to unexplored technologies that exploit natural stimuli as energy sources for tomographic surveys, which include direct lightning strikes, groundwater level fluctuations due to earthquakes, river stage fluctuations, precipitation storms, barometric pressure variations, and long term climate changes. These natural stimuli are spatially varying, recurrent, and powerful, influencing geological media over great distances and depths (e.g., tens and hundreds of kilometers). Monitoring hydrological and geophysical responses of geological media to these stimuli at different locations is tantamount to collecting data of naturally occurring tomographic

  3. QOS-aware error recovery in wireless body sensor networks using adaptive network coding.

    Science.gov (United States)

    Razzaque, Mohammad Abdur; Javadi, Saeideh S; Coulibaly, Yahaya; Hira, Muta Tah

    2014-12-29

    Wireless body sensor networks (WBSNs) for healthcare and medical applications are real-time and life-critical infrastructures, which require a strict guarantee of quality of service (QoS), in terms of latency, error rate and reliability. Considering the criticality of healthcare and medical applications, WBSNs need to fulfill users/applications and the corresponding network's QoS requirements. For instance, for a real-time application to support on-time data delivery, a WBSN needs to guarantee a constrained delay at the network level. A network coding-based error recovery mechanism is an emerging mechanism that can be used in these systems to support QoS at very low energy, memory and hardware cost. However, in dynamic network environments and user requirements, the original non-adaptive version of network coding fails to support some of the network and user QoS requirements. This work explores the QoS requirements of WBSNs in both perspectives of QoS. Based on these requirements, this paper proposes an adaptive network coding-based, QoS-aware error recovery mechanism for WBSNs. It utilizes network-level and user-/application-level information to make it adaptive in both contexts. Thus, it provides improved QoS support adaptively in terms of reliability, energy efficiency and delay. Simulation results show the potential of the proposed mechanism in terms of adaptability, reliability, real-time data delivery and network lifetime compared to its counterparts.
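
    As background for the error-recovery mechanism discussed above, the sketch below shows only the non-adaptive building block: random linear network coding over GF(2), where coded packets are XOR combinations of source packets and the receiver decodes by Gaussian elimination. The adaptive, QoS-aware policy of the paper is not reproduced, and the packet sizes, generation size, and function names are illustrative assumptions.

```python
import numpy as np

def encode(packets, n_coded, rng):
    """Produce n_coded packets, each the XOR of a random subset of the k
    source packets, tagged with its GF(2) coefficient vector."""
    k = len(packets)
    coded = []
    for _ in range(n_coded):
        c = rng.integers(0, 2, size=k)
        y = np.zeros_like(packets[0])
        for p, bit in zip(packets, c):
            if bit:
                y = y ^ p
        coded.append((c, y))
    return coded

def decode(received, k):
    """Gaussian elimination over GF(2); returns the k source packets once the
    received coefficient vectors have full rank, otherwise None."""
    A = np.array([c for c, _ in received], dtype=np.int64)
    Y = np.array([y for _, y in received])
    rows = A.shape[0]
    for r in range(k):
        piv = next((i for i in range(r, rows) if A[i, r]), None)
        if piv is None:
            return None                       # not yet decodable
        A[[r, piv]] = A[[piv, r]]             # swap pivot row into place
        Y[[r, piv]] = Y[[piv, r]]
        for i in range(rows):
            if i != r and A[i, r]:
                A[i] ^= A[r]                  # eliminate column r elsewhere
                Y[i] ^= Y[r]
    return Y[:k]

rng = np.random.default_rng(1)
source = [rng.integers(0, 256, size=8, dtype=np.uint8) for _ in range(4)]
coded = encode(source, n_coded=6, rng=rng)    # redundancy against packet loss
recovered = decode(coded, k=4)
# recovered is None when the random coefficients happen to be rank-deficient;
# a receiver would then simply wait for more coded packets.
print("decoded" if recovered is not None else "need more packets")
```

    An adaptive scheme in the spirit of the record would tune the amount of redundancy (n_coded) to the observed link error rate and the application's delay budget.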

  4. SAR and LIDAR fusion: experiments and applications

    Science.gov (United States)

    Edwards, Matthew C.; Zaugg, Evan C.; Bradley, Joshua P.; Bowden, Ryan D.

    2013-05-01

    In recent years ARTEMIS, Inc. has developed a series of compact, versatile Synthetic Aperture Radar (SAR) systems which have been operated on a variety of small manned and unmanned aircraft. The multi-frequency-band SlimSAR has demonstrated a variety of capabilities including maritime and littoral target detection, ground moving target indication, polarimetry, interferometry, change detection, and foliage penetration. ARTEMIS also continues to build upon the radar's capabilities through fusion with other sensors, such as electro-optical and infrared camera gimbals and light detection and ranging (LIDAR) devices. In this paper we focus on experiments and applications employing SAR and LIDAR fusion. LIDAR is similar to radar in that it transmits a signal which, after being reflected or scattered by a target area, is recorded by the sensor. The differences are that a LIDAR uses a laser as a transmitter and optical sensors as a receiver, and the wavelengths used exhibit a very different scattering phenomenology than the microwaves used in radar, making SAR and LIDAR good complementary technologies. LIDAR is used in many applications including agriculture, archeology, geo-science, and surveying. Some typical data products include digital elevation maps of a target area and features and shapes extracted from the data. The experiments conducted to demonstrate the fusion of SAR and LIDAR data include using a LIDAR DEM to accurately process the SAR data of a high-relief area (mountainous, urban). Also, feature extraction is used to improve the geolocation accuracy of the SAR and LIDAR data.

  5. Distributed estimation based on observations prediction in wireless sensor networks

    KAUST Repository

    Bouchoucha, Taha; Ahmed, Mohammed F A; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2015-01-01

    We consider wireless sensor networks (WSNs) used for distributed estimation of unknown parameters. Due to the limited bandwidth, sensor nodes quantize their noisy observations before transmission to a fusion center (FC) for the estimation process

  6. Enhancement of Tropical Land Cover Mapping with Wavelet-Based Fusion and Unsupervised Clustering of SAR and Landsat Image Data

    Science.gov (United States)

    LeMoigne, Jacqueline; Laporte, Nadine; Netanyahuy, Nathan S.; Zukor, Dorothy (Technical Monitor)

    2001-01-01

    The characterization and the mapping of land cover/land use of forest areas, such as the Central African rainforest, is a very complex task. This complexity is mainly due to the extent of such areas and, as a consequence, to the lack of full and continuous cloud-free coverage of those large regions by one single remote sensing instrument. In order to provide improved vegetation maps of Central Africa and to develop forest monitoring techniques for applications at the local and regional scales, we propose to utilize multi-sensor remote sensing observations coupled with in-situ data. Fusion and clustering of multi-sensor data are the first steps towards the development of such a forest monitoring system. In this paper, we will describe some preliminary experiments involving the fusion of SAR and Landsat image data of the Lope Reserve in Gabon. Similarly to previous fusion studies, our fusion method is wavelet-based. The fusion provides a new image data set which contains more detailed texture features and preserves the large homogeneous regions that are observed by the Thematic Mapper sensor. The fusion step is followed by unsupervised clustering and provides a vegetation map of the area.
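
    The fusion step described above is wavelet-based. As a generic, hedged sketch of that idea (not the authors' exact rule set), the example below fuses two co-registered images with PyWavelets by averaging the approximation coefficients and keeping the larger-magnitude detail coefficients; the wavelet choice, decomposition level, and function name are assumptions.

```python
import numpy as np
import pywt

def wavelet_fuse(img_a, img_b, wavelet="db2", level=2):
    """Fuse two co-registered images of equal size: average the coarse
    approximation bands, keep whichever detail coefficient has the larger
    magnitude so fine texture from either sensor is preserved."""
    ca = pywt.wavedec2(img_a.astype(float), wavelet, level=level)
    cb = pywt.wavedec2(img_b.astype(float), wavelet, level=level)
    fused = [(ca[0] + cb[0]) / 2.0]
    for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
        fused.append(tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                           for x, y in ((ha, hb), (va, vb), (da, db))))
    return pywt.waverec2(fused, wavelet)

# Example with synthetic stand-ins for a SAR band and a Landsat TM band
rng = np.random.default_rng(0)
sar, tm = rng.random((128, 128)), rng.random((128, 128))
print(wavelet_fuse(sar, tm).shape)
```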

  7. An Articulated Inspection Arm for fusion purposes

    Energy Technology Data Exchange (ETDEWEB)

    Villedieu, E., E-mail: eric.villedieu@cea.fr [CEA-IRFM, 13108 Saint Paul lez Durance (France); Bruno, V.; Pastor, P.; Gargiulo, L. [CEA-IRFM, 13108 Saint Paul lez Durance (France); Song, Y.T.; Cheng, Y.; Feng, H. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei (China); Liu, C. [CEA-IRFM, 13108 Saint Paul lez Durance (France); Institute of Plasma Physics, Chinese Academy of Sciences, Hefei (China); Shi, S.S. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei (China)

    2016-11-01

    Highlights: • Requirements for in vacuum tokamak inspection are presented. • Development of a prototype of the Articulated Inspection Arm is described. • The upgrade of the prototype to convert it into a fully operational device is detailed. • Future applications of inspection robots in the new fusion reactors is discussed. - Abstract: Fusion Tokamaks are complex machines which require special conditions for their operation, in particular, high vacuum inside the vessel and high temperature of the vessel walls. During plasma phases, the first wall components are highly stressed and a control is necessary in case of doubt about their condition. To be able to make safely such an inspection in a short period of time is a great advantage. The Articulated Inspection Arm (AIA) developed by the CEA provides the capability for fast inspections of the first wall overall surface keeping the vacuum and temperature conditions of the vessel. The robot prototype was validated in Tore Supra in 2008. In the frame of a joint laboratory, CEA/IRFM and ASIPP have decided to upgrade the existing AIA prototype to use it routinely in the EAST and WEST tokamaks. The robot has followed an important upgrade program in 2013 and 2014. The document presents the various upgrades made on the mechanics, the sensors, the electronics, the control station and the integration adaptation for the operation on EAST. From the AIA experience, thoughts for future inspection robots are given.

  8. An Articulated Inspection Arm for fusion purposes

    International Nuclear Information System (INIS)

    Villedieu, E.; Bruno, V.; Pastor, P.; Gargiulo, L.; Song, Y.T.; Cheng, Y.; Feng, H.; Liu, C.; Shi, S.S.

    2016-01-01

    Highlights: • Requirements for in vacuum tokamak inspection are presented. • Development of a prototype of the Articulated Inspection Arm is described. • The upgrade of the prototype to convert it into a fully operational device is detailed. • Future applications of inspection robots in the new fusion reactors is discussed. - Abstract: Fusion Tokamaks are complex machines which require special conditions for their operation, in particular, high vacuum inside the vessel and high temperature of the vessel walls. During plasma phases, the first wall components are highly stressed and a control is necessary in case of doubt about their condition. To be able to make safely such an inspection in a short period of time is a great advantage. The Articulated Inspection Arm (AIA) developed by the CEA provides the capability for fast inspections of the first wall overall surface keeping the vacuum and temperature conditions of the vessel. The robot prototype was validated in Tore Supra in 2008. In the frame of a joint laboratory, CEA/IRFM and ASIPP have decided to upgrade the existing AIA prototype to use it routinely in the EAST and WEST tokamaks. The robot has followed an important upgrade program in 2013 and 2014. The document presents the various upgrades made on the mechanics, the sensors, the electronics, the control station and the integration adaptation for the operation on EAST. From the AIA experience, thoughts for future inspection robots are given.

  9. Adaptive Naive Bayes classification for wireless sensor networks

    NARCIS (Netherlands)

    Zwartjes, G.J.

    2017-01-01

    Wireless Sensor Networks are tiny devices equipped with sensors and wireless communication. These devices observe environments and communicate about these observations. Machine Learning techniques are of interest for Wireless Sensor Network applications since they can reduce the amount of needed

  10. Intelligent Flood Adaptive Context-aware System: How Wireless Sensors Adapt their Configuration based on Environmental Phenomenon Events

    Directory of Open Access Journals (Sweden)

    Jie SUN

    2016-11-01

    Full Text Available Henceforth, new generations of Wireless Sensor Networks (WSNs) have to be able to adapt their behavior to collect quality data from the studied phenomenon over long periods of time. We have thus proposed a new formalization for the design and the implementation of context-aware systems relying on a WSN for data collection. To illustrate this proposal, we also present an environmental use case: the study of flood events in a watershed. In this paper, we detail the simulation tool that we have developed in order to implement our model. We simulate several scenarios of context-aware systems to monitor a watershed. The data used for the simulation are the observation data of the French Orgeval watershed.

  11. Path Planning Algorithms for the Adaptive Sensor Fleet

    Science.gov (United States)

    Stoneking, Eric; Hosler, Jeff

    2005-01-01

    The Adaptive Sensor Fleet (ASF) is a general purpose fleet management and planning system being developed by NASA in coordination with NOAA. The current mission of ASF is to provide the capability for autonomous cooperative survey and sampling of dynamic oceanographic phenomena such as current systems and algae blooms. Each ASF vessel is a software model that represents a real world platform that carries a variety of sensors. The OASIS platform will provide the first physical vessel, outfitted with the systems and payloads necessary to execute the oceanographic observations described in this paper. The ASF architecture is being designed for extensibility to accommodate heterogeneous fleet elements, and is not limited to using the OASIS platform to acquire data. This paper describes the path planning algorithms developed for the acquisition phase of a typical ASF task. Given a polygonal target region to be surveyed, the region is subdivided according to the number of vessels in the fleet. The subdivision algorithm seeks a solution in which all subregions have equal area and minimum mean radius. Once the subregions are defined, a dynamic programming method is used to find a minimum-time path for each vessel from its initial position to its assigned region. This path plan includes the effects of water currents as well as avoidance of known obstacles. A fleet-level planning algorithm then shuffles the individual vessel assignments to find the overall solution which puts all vessels in their assigned regions in the minimum time. This shuffle algorithm may be described as a process of elimination on the sorted list of permutations of a cost matrix. All these path planning algorithms are facilitated by discretizing the region of interest onto a hexagonal tiling.
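
    The fleet-level "shuffle" step above searches permutations of a vessel-to-region cost matrix for the assignment that gets every vessel in place soonest. A hedged, brute-force sketch of that idea (without the sorted-elimination speedup the abstract mentions) is shown below; the cost values and function name are illustrative.

```python
from itertools import permutations
import numpy as np

def best_assignment(cost):
    """Exhaustive shuffle over vessel-to-region assignments.

    cost[i][j] is the travel time for vessel i to reach region j (e.g. from a
    dynamic-programming path planner). The fleet is in place when the slowest
    vessel arrives, so we minimise the maximum entry of each permutation."""
    cost = np.asarray(cost, dtype=float)
    n = cost.shape[0]
    best_perm, best_time = None, np.inf
    for perm in permutations(range(n)):
        t = max(cost[i, perm[i]] for i in range(n))
        if t < best_time:
            best_perm, best_time = perm, t
    return best_perm, best_time

# Example: 3 vessels, 3 regions
print(best_assignment([[4, 9, 7], [3, 5, 8], [6, 2, 4]]))   # -> ((0, 1, 2), 5.0)
```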

  12. Wireless sensor networks distributed consensus estimation

    CERN Document Server

    Chen, Cailian; Guan, Xinping

    2014-01-01

    This SpringerBrief evaluates the cooperative effort of sensor nodes to accomplish high-level tasks with sensing, data processing and communication. The metrics of network-wide convergence, unbiasedness, consistency and optimality are discussed through network topology, distributed estimation algorithms and consensus strategy. Systematic analysis reveals that proper deployment of sensor nodes and a small number of low-cost relays (without sensing function) can speed up the information fusion and thus improve the estimation capability of wireless sensor networks (WSNs). This brief also investiga

  13. Improved laser-based triangulation sensor with enhanced range and resolution through adaptive optics-based active beam control.

    Science.gov (United States)

    Reza, Syed Azer; Khwaja, Tariq Shamim; Mazhar, Mohsin Ali; Niazi, Haris Khan; Nawab, Rahma

    2017-07-20

    Various existing target ranging techniques are limited in terms of the dynamic range of operation and measurement resolution. These limitations arise as a result of a particular measurement methodology, the finite processing capability of the hardware components deployed within the sensor module, and the medium through which the target is viewed. Generally, improving the sensor range adversely affects its resolution and vice versa. Often, a distance sensor is designed for an optimal range/resolution setting depending on its intended application. Optical triangulation is broadly classified as a spatial-signal-processing-based ranging technique and measures target distance from the location of the reflected spot on a position sensitive detector (PSD). In most triangulation sensors that use lasers as a light source, beam divergence, which severely affects sensor measurement range, is often ignored in calculations. In this paper, we first discuss in detail the limitations to ranging imposed by beam divergence, which, in effect, sets the sensor dynamic range. Next, we show how the resolution of laser-based triangulation sensors is limited by the interpixel pitch of a finite-sized PSD. In this paper, through the use of tunable focus lenses (TFLs), we propose a novel design of a triangulation-based optical rangefinder that improves both the sensor resolution and its dynamic range through adaptive electronic control of beam propagation parameters. We present the theory and operation of the proposed sensor and clearly demonstrate a range and resolution improvement with the use of TFLs. Experimental results in support of our claims are shown to be in strong agreement with theory.
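
    For readers unfamiliar with the underlying geometry, the toy functions below evaluate the classic triangulation relations the abstract builds on: range from the PSD spot position via similar triangles, and the depth resolution implied by a finite pixel pitch. The numeric values are made-up placeholders, and the simple thin-lens geometry is an assumption rather than the authors' exact sensor model.

```python
def triangulation_range(x_spot, baseline, focal_len):
    """Classic laser-triangulation geometry: target distance follows from
    similar triangles, z = f * b / x, where x is the laser-spot position on
    the PSD, b the emitter/receiver baseline and f the receiver focal length
    (all in consistent units)."""
    return focal_len * baseline / x_spot

def range_resolution(z, baseline, focal_len, pixel_pitch):
    """Depth uncertainty implied by a finite PSD pixel pitch:
    dz ~ z**2 * dx / (f * b)."""
    return z ** 2 * pixel_pitch / (focal_len * baseline)

# Example: 50 mm baseline, 16 mm lens, spot at 0.8 mm, 10 micron pixel pitch
z = triangulation_range(0.8, 50.0, 16.0)          # -> 1000 mm
print(z, range_resolution(z, 50.0, 16.0, 0.010))  # ~12.5 mm resolution at 1 m
```

    The quadratic growth of the resolution term with z is exactly why a fixed design trades dynamic range against resolution, the trade-off the TFL-based approach in the record tries to relax.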

  14. Monitoring of slope-instabilities and deformations with Micro-Electro-Mechanical-Systems (MEMS) in wireless ad-hoc Sensor Networks

    Science.gov (United States)

    Arnhardt, C.; Fernández-Steeger, T. M.; Azzam, R.

    2009-04-01

    efficiency that permits measurements over a long period of time. A special sensor-board that accommodates the measuring sensors and the node of the WSN was developed. The standardized interfaces of the measuring sensors permit an easy interaction with the node and thus enable an uncomplicated data transfer to the gateway. The 3-axis acceleration sensor (measuring range: +/- 2g), the 2-axis inclination sensor (measuring range: +/- 30°) for measuring tilt and the barometric pressure sensor (measuring range: 30 kPa to 120 kPa) for measuring sub-meter height changes (altimeter) are currently integrated into the sensor network and are tested in realistic experiments. In addition, sensor nodes with precise potentiometric displacement and linear magnetostrictive position transducers are used for extension and convergence measurements. Based on the accuracy of the first test stations developed, the results of the experiments showed that the selected sensors meet the requirement profile, as the stability is satisfactory and the spread of the data is quite low. Therefore the developed sensor boards can be tested in a larger sensor network environment. In order to get more information about accuracy in detail, experiments in a new, more precise test bed and tests with different sampling rates will follow. Another increasingly important aspect for the future is the fusion of sensor data (i.e. combination and comparison) to identify malfunctions and to reduce false alarm rates, while increasing data quality at the same time. The correlation of different sensor types (complementary sensor fusion) as well as identical sensor types (redundant sensor fusion) permits validation of the measured data. In a further step, the development of special algorithms allows the data from all nodes of the network to be analyzed and evaluated together (sensor node fusion). The sensor fusion contributes to the decision making of alarm and early warning systems and allows a better interpretation of data. The network

  15. Proximal soil sensors and data fusion for precision agriculture

    NARCIS (Netherlands)

    Mahmood, H.S.

    2013-01-01

    different remote and proximal soil sensors are available today that can scan entire fields and give detailed information on various physical, chemical, mechanical and biological soil properties. The first objective of this thesis was to evaluate different proximal soil sensors available today and to

  16. An Autonomous Sensor System Architecture for Active Flow and Noise Control Feedback

    Science.gov (United States)

    Humphreys, William M, Jr.; Culliton, William G.

    2008-01-01

    Multi-channel sensor fusion represents a powerful technique to simply and efficiently extract information from complex phenomena. While the technique has traditionally been used for military target tracking and situational awareness, a study has been successfully completed that demonstrates that sensor fusion can be applied equally well to aerodynamic applications. A prototype autonomous hardware processor was successfully designed and used to detect in real-time the two-dimensional flow reattachment location generated by a simple separated-flow wind tunnel model. The success of this demonstration illustrates the feasibility of using autonomous sensor processing architectures to enhance flow control feedback signal generation.

  17. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    Directory of Open Access Journals (Sweden)

    Kaijuan Yuan

    2016-01-01

    Full Text Available Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, in situations where the evidence is highly conflicting, it may produce a counterintuitive result. To address the issue, a new method is proposed in this paper. Not only the static sensor reliability, but also the dynamic sensor reliability, is taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method has better performance in conflict management and fault diagnosis due to the fact that the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods.
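
    A compact sketch of the two ingredients named above, Dempster's rule of combination and reliability-weighted averaging of evidence, is given below. The fault labels, weights, and mass values are invented for illustration, and the exact discounting scheme of the paper (evidence distance plus belief entropy) is not reproduced.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions over frozenset focal elements.
    Conflicting mass (empty intersections) is discarded and the rest renormalised."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict, combination undefined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

def weighted_average_evidence(masses, weights):
    """Discount-style preprocessing: replace the reports by their
    reliability-weighted average so unreliable sensors count for less."""
    keys = set().union(*masses)
    total = sum(weights)
    return {k: sum(w * m.get(k, 0.0) for m, w in zip(masses, weights)) / total
            for k in keys}

# Two sensor reports over candidate faults {F1, F2}, sensor 1 deemed more reliable
m1 = {frozenset({"F1"}): 0.8, frozenset({"F1", "F2"}): 0.2}
m2 = {frozenset({"F2"}): 0.6, frozenset({"F1", "F2"}): 0.4}
avg = weighted_average_evidence([m1, m2], weights=[0.9, 0.5])
print(dempster_combine(avg, avg))   # averaged evidence combined n-1 times
```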

  18. Decentralized vs. centralized scheduling in wireless sensor networks for data fusion

    OpenAIRE

    Mitici, M.A.; Goseling, Jasper; de Graaf, Maurits; Boucherie, Richardus J.

    2014-01-01

    We consider the problem of data estimation in a sensor wireless network where sensors transmit their observations according to decentralized and centralized transmission schedules. A data collector is interested in achieving a data estimation using several sensor observations such that the variance of the estimation is below a targeted threshold. We analyze the waiting time for a collector to receive sufficient sensor observations. We show that, for sufficiently large sensor sets, the decentr...

  19. Model-based satellite image fusion

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Sveinsson, J. R.; Nielsen, Allan Aasbjerg

    2008-01-01

    A method is proposed for pixel-level satellite image fusion derived directly from a model of the imaging sensor. By design, the proposed method is spectrally consistent. It is argued that the proposed method needs regularization, as is the case for any method for this problem. A framework for pixel neighborhood regularization is presented. This framework enables the formulation of the regularization in a way that corresponds well with our prior assumptions about the image data. The proposed method is validated and compared with other approaches on several data sets. Lastly, the intensity-hue-saturation method is revisited in order to gain additional insight into what implications spectral consistency has for an image fusion method.

  20. The ABC Adaptive Fusion Architecture

    DEFF Research Database (Denmark)

    Bunde-Pedersen, Jonathan; Mogensen, Martin; Bardram, Jakob Eyvind

    2006-01-01

    and early implementation of a system capable of adapting to its operating environment, choosing the best-fit combination of the client-server and peer-to-peer architectures. The architecture creates a seamless integration between a centralized hybrid architecture and a decentralized architecture, relying on what...

  1. Data fusion in cyber security: first order entity extraction from common cyber data

    Science.gov (United States)

    Giacobe, Nicklaus A.

    2012-06-01

    The Joint Directors of Labs Data Fusion Process Model (JDL Model) provides a framework for how to handle sensor data to develop higher levels of inference in a complex environment. Beginning from a call to leverage data fusion techniques in intrusion detection, there have been a number of advances in the use of data fusion algorithms in this subdomain of cyber security. While it is tempting to jump directly to situation-level or threat-level refinement (levels 2 and 3) for more exciting inferences, a proper fusion process starts with lower levels of fusion in order to provide a basis for the higher fusion levels. The process begins with first order entity extraction, or the identification of important entities represented in the sensor data stream. Current cyber security operational tools and their associated data are explored for potential exploitation, identifying the first order entities that exist in the data and the properties of these entities that are described by the data. Cyber events that are represented in the data stream are added to the first order entities as their properties. This work explores typical cyber security data and the inferences that can be made at the lower fusion levels (0 and 1) with simple metrics. Depending on the types of events that are expected by the analyst, these relatively simple metrics can provide insight on their own, or could be used in fusion algorithms as a basis for higher levels of inference.
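
    The lower-level fusion described in this record starts with pulling first order entities out of raw sensor records. The fragment below is a deliberately small, hypothetical illustration of that step: it extracts IP-address entities from firewall-style log lines and attaches a simple occurrence-count metric; the log format and field names are invented.

```python
import re
from collections import Counter

IP = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def extract_entities(log_lines):
    """Level 0/1 style processing: pull first-order entities (here, IP
    addresses) out of raw sensor records and attach a simple metric
    (occurrence count) as an entity property."""
    hits = Counter()
    for line in log_lines:
        for ip in IP.findall(line):
            hits[ip] += 1
    return hits

logs = [
    "Jan 10 12:00:01 fw DROP src=203.0.113.7 dst=198.51.100.2",
    "Jan 10 12:00:02 fw DROP src=203.0.113.7 dst=198.51.100.9",
]
print(extract_entities(logs).most_common())
```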

  2. A Cubature-Principle-Assisted IMM-Adaptive UKF Algorithm for Maneuvering Target Tracking Caused by Sensor Faults

    Directory of Open Access Journals (Sweden)

    Huan Zhou

    2017-09-01

    Full Text Available Aimed at solving the problem of decreased filtering precision in maneuvering target tracking caused by non-Gaussian distributions and sensor faults, we developed an efficient interacting multiple model-unscented Kalman filter (IMM-UKF) algorithm. By dividing the IMM-UKF into two links, the algorithm introduces the cubature principle to approximate the probability density of the random variable, after the interaction, by considering the external link of IMM-UKF, which constitutes the cubature-principle-assisted IMM method (CPIMM) for solving the non-Gaussian problem, and leads to an adaptive matrix to balance the contribution of the state. The algorithm provides filtering solutions by considering the internal link of IMM-UKF, which is called a new adaptive UKF algorithm (NAUKF) to address sensor faults. The proposed CPIMM-NAUKF is evaluated in a numerical simulation and two practical experiments including one navigation experiment and one maneuvering target tracking experiment. The simulation and experiment results show that the proposed CPIMM-NAUKF has greater filtering precision and faster convergence than the existing IMM-UKF. The proposed algorithm achieves a very good tracking performance, and will be effective and applicable in the field of maneuvering target tracking.

  3. Covariance descriptor fusion for target detection

    Science.gov (United States)

    Cukur, Huseyin; Binol, Hamidullah; Bal, Abdullah; Yavuz, Fatih

    2016-05-01

    Target detection is one of the most important topics for military or civilian applications. In order to address such detection tasks, hyperspectral imaging sensors provide useful images data containing both spatial and spectral information. Target detection has various challenging scenarios for hyperspectral images. To overcome these challenges, covariance descriptor presents many advantages. Detection capability of the conventional covariance descriptor technique can be improved by fusion methods. In this paper, hyperspectral bands are clustered according to inter-bands correlation. Target detection is then realized by fusion of covariance descriptor results based on the band clusters. The proposed combination technique is denoted Covariance Descriptor Fusion (CDF). The efficiency of the CDF is evaluated by applying to hyperspectral imagery to detect man-made objects. The obtained results show that the CDF presents better performance than the conventional covariance descriptor.
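
    The core object here, the region covariance descriptor, is easy to state in code: stack the per-pixel feature vectors of an image region and take their covariance. The sketch below assumes a (H, W, d) feature block (for example, a few clustered hyperspectral bands) and is a generic illustration rather than the proposed CDF fusion itself.

```python
import numpy as np

def covariance_descriptor(region):
    """Region covariance descriptor: for an (H, W, d) block of per-pixel
    feature vectors, return the d x d covariance matrix summarising the region."""
    feats = region.reshape(-1, region.shape[-1]).astype(float)
    return np.cov(feats, rowvar=False)

# Example: an 8x8 patch with 5 features per pixel
patch = np.random.default_rng(1).normal(size=(8, 8, 5))
print(covariance_descriptor(patch).shape)   # (5, 5)
```

    A fusion scheme along the lines of the record would compute one such descriptor per band cluster and combine the per-cluster detection scores.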

  4. An Energy-Efficient Adaptive Clustering Protocol for Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Lü Tao

    2013-05-01

    Full Text Available An energy-efficient adaptive clustering hierarchy (EEACH) for wireless sensor networks, based on LEACH and LEACH-C, is proposed in this paper. The main considerations are that the LEACH cluster structure does not give each cluster uniform energy consumption, and that LEACH-C can achieve better clustering with a centralized algorithm but does not lend itself to a distributed implementation. In EEACH, we analyze the effect of different numbers of cluster member nodes on the network energy consumption, re-plan the time slices to balance the energy consumption of each cluster, and avoid the energy-hole problem through a reasonable cluster-head selection algorithm. Its objective is to balance energy consumption and maximize the network lifetime. Analysis and simulation results show that EEACH provides more uniform energy consumption among nodes and can prolong network lifetime compared to LEACH and LEACH-C.
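
    For context, the LEACH baseline that EEACH builds on elects cluster heads stochastically with the threshold T(n) = p / (1 - p * (r mod 1/p)). The sketch below shows only that baseline election rule; EEACH's modified selection and time-slice planning are not reproduced, and the parameter values are illustrative.

```python
import random

def is_cluster_head(node_id, round_no, p=0.05, chosen_recently=frozenset()):
    """LEACH-style stochastic cluster-head election: a node elects itself
    with probability T(n) = p / (1 - p * (r mod 1/p)), provided it has not
    already served as a head in the current epoch of 1/p rounds."""
    if node_id in chosen_recently:
        return False
    threshold = p / (1.0 - p * (round_no % int(1.0 / p)))
    return random.random() < threshold

heads = [n for n in range(100) if is_cluster_head(n, round_no=3)]
print(len(heads), "cluster heads elected this round")
```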

  5. Joint Multi-Focus Fusion and Bayer Image Restoration

    Institute of Scientific and Technical Information of China (English)

    Ling Guo; Bin Yang; Chao Yang

    2015-01-01

    In this paper, a joint multifocus image fusion and Bayer pattern image restoration algorithm for raw images of single-sensor color imaging devices is proposed. Different from traditional fusion schemes, the raw Bayer pattern images are fused before color restoration. Therefore, the Bayer image restoration operation is only performed one time. Thus, the proposed algorithm is more efficient than traditional fusion schemes. In detail, a clarity measurement of Bayer pattern image is defined for raw Bayer pattern images, and the fusion operator is performed on superpixels which provide powerful grouping cues of local image features. The raw images are merged with refined weight map to get the fused Bayer pattern image, which is restored by the demosaicing algorithm to get the full resolution color image. Experimental results demonstrate that the proposed algorithm can obtain better fused results with more natural appearance and fewer artifacts than the traditional algorithms.

  6. Fusion of footsteps and face biometrics on an unsupervised and uncontrolled environment

    Science.gov (United States)

    Vera-Rodriguez, Ruben; Tome, Pedro; Fierrez, Julian; Ortega-Garcia, Javier

    2012-06-01

    This paper reports for the first time experiments on the fusion of footsteps and face on an unsupervised and not controlled environment for person authentication. Footstep recognition is a relatively new biometric based on signals extracted from people walking over floor sensors. The idea of the fusion between footsteps and face starts from the premise that in an area where footstep sensors are installed it is very simple to place a camera to capture also the face of the person that walks over the sensors. This setup may find application in scenarios like ambient assisted living, smart homes, eldercare, or security access. The paper reports a comparative assessment of both biometrics using the same database and experimental protocols. In the experimental work we consider two different applications: smart homes (small group of users with a large set of training data) and security access (larger group of users with a small set of training data) obtaining results of 0.9% and 5.8% EER respectively for the fusion of both modalities. This is a significant performance improvement compared with the results obtained by the individual systems.

  7. Bayesian and Dempster–Shafer fusion

    Indian Academy of Sciences (India)

    a large number devoted to the theory of information fusion: its algorithms and .... to isolate the relevant parts, making a decision, and finally carrying out that .... used to adapt the amount of process noise used by the Kalman Filter to account for.

  8. A DATA FUSION SYSTEM FOR THE NONDESTRUCTIVE EVALUATION OF NON-PIGGABLE PIPES

    Energy Technology Data Exchange (ETDEWEB)

    Shreekanth Mandayam; Robi Polikar; John C. Chen

    2004-04-01

    The objectives of this research project are: (1) To design sensor data fusion algorithms that can synergistically combine defect related information from heterogeneous sensors used in gas pipeline inspection for reliably and accurately predicting the condition of the pipe-wall. (2) To develop efficient data management techniques for signals obtained during multisensor interrogation of a gas pipeline. During this reporting period, Rowan University designed, developed and exercised multisensor data fusion algorithms for identifying defect related information present in magnetic flux leakage, ultrasonic testing and thermal imaging nondestructive evaluation signatures of a test-specimen suite representative of benign and anomalous indications in gas transmission pipelines.

  9. Adaptive Modeling of the International Space Station Electrical Power System

    Science.gov (United States)

    Thomas, Justin Ray

    2007-01-01

    Software simulations provide NASA engineers the ability to experiment with spacecraft systems in a computer-imitated environment. Engineers currently develop software models that encapsulate spacecraft system behavior. These models can be inaccurate due to invalid assumptions, erroneous operation, or system evolution. Increasing accuracy requires manual calibration and domain-specific knowledge. This thesis presents a method for automatically learning system models without any assumptions regarding system behavior. Data stream mining techniques are applied to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). We also explore a knowledge fusion approach that uses traditional engineered EPS models to supplement the learned models. We observed that these engineered EPS models provide useful background knowledge to reduce predictive error spikes when confronted with making predictions in situations that are quite different from the training scenarios used when learning the model. Evaluations using ISS sensor data and existing EPS models demonstrate the success of the adaptive approach. Our experimental results show that adaptive modeling provides reductions in model error anywhere from 80% to 96% over these existing models. Final discussions include impending use of adaptive modeling technology for ISS mission operations and the need for adaptive modeling in future NASA lunar and Martian exploration.

  10. PCA-based spatially adaptive denoising of CFA images for single-sensor digital cameras.

    Science.gov (United States)

    Zheng, Lei; Lukac, Rastislav; Wu, Xiaolin; Zhang, David

    2009-04-01

    Single-sensor digital color cameras use a process called color demosaicking to produce full color images from the data captured by a color filter array (CFA). The quality of demosaicked images is degraded due to the sensor noise introduced during the image acquisition process. The conventional solution to combating CFA sensor noise is demosaicking first, followed by a separate denoising process. This strategy will generate many noise-caused color artifacts in the demosaicking process, which are hard to remove in the denoising process. Few denoising schemes that work directly on the CFA images have been presented because of the difficulties arising from the red, green and blue interlaced mosaic pattern, yet a well-designed "denoising first and demosaicking later" scheme can have advantages such as fewer noise-caused color artifacts and cost-effective implementation. This paper presents a principal component analysis (PCA)-based spatially adaptive denoising algorithm, which works directly on the CFA data using a supporting window to analyze the local image statistics. By exploiting the spatial and spectral correlations existing in the CFA image, the proposed method can effectively suppress noise while preserving color edges and details. Experiments using both simulated and real CFA images indicate that the proposed scheme outperforms many existing approaches, including those sophisticated demosaicking and denoising schemes, in terms of both objective measurement and visual evaluation.
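
    A minimal sketch of the PCA-shrinkage idea behind such schemes is shown below; it works on generic vectorised grayscale patches rather than on interlaced CFA data, and the noise level, patch size, and Wiener-style gain are illustrative assumptions, not the published spatially adaptive algorithm.

```python
import numpy as np

def pca_denoise_patches(patches, noise_var):
    """Minimal PCA shrinkage on a stack of vectorised patches: project onto
    the local principal axes and apply a Wiener-like gain that suppresses
    components whose energy is close to the noise floor."""
    mean = patches.mean(axis=0)
    X = patches - mean
    cov = X.T @ X / len(X)
    eigval, eigvec = np.linalg.eigh(cov)
    coeffs = X @ eigvec
    signal_var = np.maximum(eigval - noise_var, 0.0)
    gain = signal_var / np.maximum(eigval, 1e-12)       # per-axis Wiener gain
    return (coeffs * gain) @ eigvec.T + mean

# Example: 500 noisy 6x6 patches flattened to 36-vectors
rng = np.random.default_rng(0)
clean = rng.normal(size=(500, 36)) @ rng.normal(size=(36, 36)) * 0.1
noisy = clean + rng.normal(scale=0.5, size=clean.shape)
denoised = pca_denoise_patches(noisy, noise_var=0.25)
print(float(np.mean((noisy - clean) ** 2)),
      float(np.mean((denoised - clean) ** 2)))          # MSE before / after
```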

  11. Sensor Fusion for Nuclear Proliferation Activity Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Adel Ghanem, Ph D

    2007-03-30

    The objective of Phase 1 of this STTR project is to demonstrate a Proof-of-Concept (PoC) of the Geo-Rad system that integrates a location-aware SmartTag (made by ZonTrak) and a radiation detector (developed by LLNL). It also includes the ability to transmit the collected radiation data and location information to the ZonTrak server (ZonService). The collected data is further transmitted to a central server at LLNL (the Fusion Server) to be processed in conjunction with overhead imagery to generate location estimates of nuclear proliferation and radiation sources.

  12. Canadian fusion fuels technology project

    International Nuclear Information System (INIS)

    1986-01-01

    The Canadian Fusion Fuels Technology Project was launched in 1982 to coordinate Canada's provision of fusion fuels technology to international fusion power development programs. The project has a mandate to extend and adapt existing Canadian tritium technologies for use in international fusion power development programs. 1985-86 represents the fourth year of the first five-year term of the Canadian Fusion Fuels Technology Project (CFFTP). This reporting period coincides with an increasing trend in global fusion R and D to direct more effort towards the management of tritium. This has resulted in an increased linking of CFFTP activities and objectives with those of facilities abroad. In this way there has been a continuing achievement resulting from CFFTP efforts to have cooperative R and D and service activities with organizations abroad. All of this is aided by the cooperative international atmosphere within the fusion community. This report summarizes our past year and provides some highlights of the upcoming year 1986/87, which is the final year of the first five-year phase of the program. AECL (representing the Federal Government), the Ministry of Energy (representing Ontario) and Ontario Hydro, have given formal indication of their intent to continue with a second five-year program. Plans for the second phase will continue to emphasize tritium technology and remote handling

  13. Minimum Energy Decentralized Estimation in a Wireless Sensor Network with Correlated Sensor Noises

    Directory of Open Access Journals (Sweden)

    Krasnopeev Alexey

    2005-01-01

    Full Text Available Consider the problem of estimating an unknown parameter by a sensor network with a fusion center (FC). Sensor observations are corrupted by additive noises with an arbitrary spatial correlation. Due to bandwidth and energy limitation, each sensor is only able to transmit a finite number of bits to the FC, while the latter must combine the received bits to estimate the unknown parameter. We require the decentralized estimator to have a mean-squared error (MSE) that is within a constant factor to that of the best linear unbiased estimator (BLUE). We minimize the total sensor transmitted energy by selecting sensor quantization levels using the knowledge of noise covariance matrix while meeting the target MSE requirement. Computer simulations show that our designs can achieve energy savings of up to 70% when compared to the uniform quantization strategy whereby each sensor generates the same number of bits, irrespective of the quality of its observation and the condition of its channel to the FC.

  14. Minimum Energy Decentralized Estimation in a Wireless Sensor Network with Correlated Sensor Noises

    Directory of Open Access Journals (Sweden)

    Krasnopeev Alexey

    2005-01-01

    Full Text Available Consider the problem of estimating an unknown parameter by a sensor network with a fusion center (FC. Sensor observations are corrupted by additive noises with an arbitrary spatial correlation. Due to bandwidth and energy limitation, each sensor is only able to transmit a finite number of bits to the FC, while the latter must combine the received bits to estimate the unknown parameter. We require the decentralized estimator to have a mean-squared error (MSE that is within a constant factor to that of the best linear unbiased estimator (BLUE. We minimize the total sensor transmitted energy by selecting sensor quantization levels using the knowledge of noise covariance matrix while meeting the target MSE requirement. Computer simulations show that our designs can achieve energy savings up to 70 % when compared to the uniform quantization strategy whereby each sensor generates the same number of bits, irrespective of the quality of its observation and the condition of its channel to the FC.
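
    The benchmark both versions of this record refer to is the BLUE for a scalar parameter observed through correlated additive noise. A small sketch of that estimator follows (only the benchmark, not the quantization-level optimization the paper actually solves); the covariance values and function name are invented.

```python
import numpy as np

def blue_estimate(observations, noise_cov):
    """Best linear unbiased estimator of a scalar parameter theta observed by
    N sensors as x_i = theta + n_i with correlated noise covariance C:
    theta_hat = (1' C^-1 x) / (1' C^-1 1), with variance 1 / (1' C^-1 1)."""
    x = np.asarray(observations, dtype=float)
    Cinv = np.linalg.inv(np.asarray(noise_cov, dtype=float))
    ones = np.ones_like(x)
    denom = ones @ Cinv @ ones
    w = Cinv @ ones / denom          # optimal fusion weights
    return w @ x, 1.0 / denom        # (estimate, estimator variance)

C = np.array([[1.0, 0.3, 0.0],
              [0.3, 1.5, 0.2],
              [0.0, 0.2, 2.0]])
print(blue_estimate([10.2, 9.7, 10.5], C))
```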

  15. Information Fusion of Conflicting Input Data

    Directory of Open Access Journals (Sweden)

    Uwe Mönks

    2016-10-01

    Full Text Available Sensors, and also actuators or external sources such as databases, serve as data sources in order to realise condition monitoring of industrial applications or the acquisition of characteristic parameters like production speed or reject rate. Modern facilities create such a large amount of complex data that a machine operator is unable to comprehend and process the information contained in the data. Thus, information fusion mechanisms gain increasing importance. Besides the management of large amounts of data, further challenges towards the fusion algorithms arise from epistemic uncertainties (incomplete knowledge) in the input signals as well as conflicts between them. These aspects must be considered during information processing to obtain reliable results, which are in accordance with the real world. The analysis of the scientific state of the art shows that current solutions fulfil said requirements at most only partly. This article proposes the multilayered information fusion system MACRO (multilayer attribute-based conflict-reducing observation) employing the μBalTLCS (fuzzified balanced two-layer conflict solving) fusion algorithm to reduce the impact of conflicts on the fusion result. The performance of the contribution is shown by its evaluation in the scope of a machine condition monitoring application under laboratory conditions. Here, the MACRO system yields the best results compared to state-of-the-art fusion mechanisms. The utilised data is published and freely accessible.

  16. Information Fusion of Conflicting Input Data.

    Science.gov (United States)

    Mönks, Uwe; Dörksen, Helene; Lohweg, Volker; Hübner, Michael

    2016-10-29

    Sensors, and also actuators or external sources such as databases, serve as data sources in order to realise condition monitoring of industrial applications or the acquisition of characteristic parameters like production speed or reject rate. Modern facilities create such a large amount of complex data that a machine operator is unable to comprehend and process the information contained in the data. Thus, information fusion mechanisms gain increasing importance. Besides the management of large amounts of data, further challenges towards the fusion algorithms arise from epistemic uncertainties (incomplete knowledge) in the input signals as well as conflicts between them. These aspects must be considered during information processing to obtain reliable results, which are in accordance with the real world. The analysis of the scientific state of the art shows that current solutions fulfil said requirements at most only partly. This article proposes the multilayered information fusion system MACRO (multilayer attribute-based conflict-reducing observation) employing the μ BalTLCS (fuzzified balanced two-layer conflict solving) fusion algorithm to reduce the impact of conflicts on the fusion result. The performance of the contribution is shown by its evaluation in the scope of a machine condition monitoring application under laboratory conditions. Here, the MACRO system yields the best results compared to state-of-the-art fusion mechanisms. The utilised data is published and freely accessible.

  17. Recent Advances in Registration, Integration and Fusion of Remotely Sensed Data: Redundant Representations and Frames

    Science.gov (United States)

    Czaja, Wojciech; Le Moigne-Stewart, Jacqueline

    2014-01-01

    In recent years, sophisticated mathematical techniques have been successfully applied to the field of remote sensing to produce significant advances in applications such as registration, integration and fusion of remotely sensed data. Registration, integration and fusion of multiple source imagery are the most important issues when dealing with Earth Science remote sensing data where information from multiple sensors, exhibiting various resolutions, must be integrated. Issues ranging from different sensor geometries, different spectral responses, differing illumination conditions, different seasons, and various amounts of noise need to be dealt with when designing an image registration, integration or fusion method. This tutorial will first define the problems and challenges associated with these applications and then will review some mathematical techniques that have been successfully utilized to solve them. In particular, we will cover topics on geometric multiscale representations, redundant representations and fusion frames, graph operators, diffusion wavelets, as well as spatial-spectral and operator-based data fusion. All the algorithms will be illustrated using remotely sensed data, with an emphasis on current and operational instruments.

  18. New Low Cost Structure for Dual Axis Mount Solar Tracking System Using Adaptive Solar Sensor

    DEFF Research Database (Denmark)

    Argeseanu, Alin; Ritchie, Ewen; Leban, Krisztina Monika

    2010-01-01

    A solar tracking system is designed to optimize the operation of solar energy receivers. The objective of this paper is proposing a new tracking system structure with two axis. The success strategy of this new project focuses on the economical analysis of solar energy. Therefore it is important...... to determine the most cost effective design, to consider the costs of production and maintenance, and operating. The proposed tracking system uses a new solar sensor position with an adaptive feature....

  19. Image Fusion Technologies In Commercial Remote Sensing Packages

    OpenAIRE

    Al-Wassai, Firouz Abdullah; Kalyankar, N. V.

    2013-01-01

    Several remote sensing software packages are used for the explicit purpose of analyzing and visualizing remotely sensed data, following the development of remote sensing sensor technologies over the last ten years. According to the literature, remote sensing still lacks software tools for effective information extraction from remote sensing data. This paper therefore provides a state of the art of multi-sensor image fusion technologies as well as a review of the quality evaluation of the single image or f...

  20. Distributed sensor networks

    CERN Document Server

    Rubin, Donald B; Carlin, John B; Iyengar, S Sitharama; Brooks, Richard R; Clemson University

    2014-01-01

    An Overview, S.S. Iyengar, Ankit Tandon, and R.R. Brooks; Microsensor Applications, David Shepherd; A Taxonomy of Distributed Sensor Networks, Shivakumar Sastry and S.S. Iyengar; Contrast with Traditional Systems, R.R. Brooks; Digital Signal Processing Background, Yu Hen Hu; Image-Processing Background, Lynne Grewe and Ben Shahshahani; Object Detection and Classification, Akbar M. Sayeed; Parameter Estimation, David Friedlander; Target Tracking with Self-Organizing Distributed Sensors, R.R. Brooks, C. Griffin, D.S. Friedlander, and J.D. Koch; Collaborative Signal and Information Processing: An Information-Directed Approach, Feng Zhao, Jie Liu, Juan Liu, Leonidas Guibas, and James Reich; Environmental Effects, David C. Swanson; Detecting and Counteracting Atmospheric Effects, Lynne L. Grewe; Signal Processing and Propagation for Aeroacoustic Sensor Networks, Richard J. Kozick, Brian M. Sadler, and D. Keith Wilson; Distributed Multi-Target Detection in Sensor Networks, Xiaoling Wang, Hairong Qi, and Steve Beck; Foundations of Data Fusion f...

  1. Process for manufacture of inertial confinement fusion targets and resulting product

    International Nuclear Information System (INIS)

    Solomon, D.E.; Wise, K.D.; Wuttke, G.H.; Masnari, N.A.; Rensel, W.B.; Robinson, M.G.

    1980-01-01

    A method of manufacturing inertial confinement fusion targets is described which is adaptable for high volume production of low cost targets in a wide variety of sizes. The targets include a spherical pellet of fusion fuel surrounded by a protective concentric shell. (UK)

  2. QoS-Aware Error Recovery in Wireless Body Sensor Networks Using Adaptive Network Coding

    Directory of Open Access Journals (Sweden)

    Mohammad Abdur Razzaque

    2014-12-01

    Full Text Available Wireless body sensor networks (WBSNs) for healthcare and medical applications are real-time and life-critical infrastructures, which require a strict guarantee of quality of service (QoS), in terms of latency, error rate and reliability. Considering the criticality of healthcare and medical applications, WBSNs need to fulfill users/applications and the corresponding network’s QoS requirements. For instance, for a real-time application to support on-time data delivery, a WBSN needs to guarantee a constrained delay at the network level. A network coding-based error recovery mechanism is an emerging mechanism that can be used in these systems to support QoS at very low energy, memory and hardware cost. However, in dynamic network environments and user requirements, the original non-adaptive version of network coding fails to support some of the network and user QoS requirements. This work explores the QoS requirements of WBSNs in both perspectives of QoS. Based on these requirements, this paper proposes an adaptive network coding-based, QoS-aware error recovery mechanism for WBSNs. It utilizes network-level and user-/application-level information to make it adaptive in both contexts. Thus, it provides improved QoS support adaptively in terms of reliability, energy efficiency and delay. Simulation results show the potential of the proposed mechanism in terms of adaptability, reliability, real-time data delivery and network lifetime compared to its counterparts.

  3. QoS-Aware Error Recovery in Wireless Body Sensor Networks Using Adaptive Network Coding

    Science.gov (United States)

    Razzaque, Mohammad Abdur; Javadi, Saeideh S.; Coulibaly, Yahaya; Hira, Muta Tah

    2015-01-01

    Wireless body sensor networks (WBSNs) for healthcare and medical applications are real-time and life-critical infrastructures, which require a strict guarantee of quality of service (QoS), in terms of latency, error rate and reliability. Considering the criticality of healthcare and medical applications, WBSNs need to fulfill users/applications and the corresponding network's QoS requirements. For instance, for a real-time application to support on-time data delivery, a WBSN needs to guarantee a constrained delay at the network level. A network coding-based error recovery mechanism is an emerging mechanism that can be used in these systems to support QoS at very low energy, memory and hardware cost. However, in dynamic network environments and user requirements, the original non-adaptive version of network coding fails to support some of the network and user QoS requirements. This work explores the QoS requirements of WBSNs in both perspectives of QoS. Based on these requirements, this paper proposes an adaptive network coding-based, QoS-aware error recovery mechanism for WBSNs. It utilizes network-level and user-/application-level information to make it adaptive in both contexts. Thus, it provides improved QoS support adaptively in terms of reliability, energy efficiency and delay. Simulation results show the potential of the proposed mechanism in terms of adaptability, reliability, real-time data delivery and network lifetime compared to its counterparts. PMID:25551485

  4. Dense range map reconstruction from a versatile robotic sensor system with an active trinocular vision and a passive binocular vision.

    Science.gov (United States)

    Kim, Min Young; Lee, Hyunkee; Cho, Hyungsuck

    2008-04-10

    One major research issue associated with 3D perception by robotic systems is the creation of efficient sensor systems that can generate dense range maps reliably. A visual sensor system for robotic applications is developed that is inherently equipped with two types of sensor, an active trinocular vision and a passive stereo vision. Unlike conventional active vision systems, which use a large number of images with varied projected patterns for dense range map acquisition, or conventional passive vision systems, which work well only in specific environments with sufficient feature information, a cooperative bidirectional sensor fusion method for this visual sensor system enables us to acquire a reliable dense range map using active and passive information simultaneously. The fusion algorithms are composed of two parts, one in which the passive stereo vision helps active vision and the other in which the active trinocular vision helps the passive one. The first part matches the laser patterns in stereo laser images with the help of intensity images; the second part utilizes an information fusion technique using the dynamic programming method in which image regions between laser patterns are matched pixel-by-pixel with the help of the fusion results obtained in the first part. To determine how the proposed sensor system and fusion algorithms can work in real applications, the sensor system is implemented on a robotic system, and the proposed algorithms are applied. A series of experimental tests is performed for a variety of configurations of robot and environments. The performance of the sensor system is discussed in detail.

  5. Decentralized vs. centralized scheduling in wireless sensor networks for data fusion

    NARCIS (Netherlands)

    Mitici, M.A.; Goseling, Jasper; de Graaf, Maurits; Boucherie, Richardus J.

    2014-01-01

    We consider the problem of data estimation in a wireless sensor network where sensors transmit their observations according to decentralized and centralized transmission schedules. A data collector is interested in achieving a data estimation using several sensor observations such that the variance

  6. Decentralized vs. centralized scheduling in wireless sensor networks for data fusion

    NARCIS (Netherlands)

    Mitici, Mihaela; Goseling, Jasper; de Graaf, Maurits; Boucherie, Richardus J.

    2013-01-01

    We consider the problem of data estimation in a wireless sensor network where sensors transmit their observations according to decentralized and centralized transmission schedules. A data collector is interested in achieving a data estimation using several sensor observations such that the variance

  7. Disaster Monitoring using Grid Based Data Fusion Algorithms

    Directory of Open Access Journals (Sweden)

    Cătălin NAE

    2010-12-01

    Full Text Available This is a study of the application of Grid technology and high-performance parallel computing to a candidate algorithm for jointly accomplishing data fusion from different sensors. This includes applications for both image analysis and/or data processing for simultaneously tracking multiple targets in real time. The emphasis is on comparing the architectures of the serial and parallel algorithms, and characterizing the performance benefits achieved by the parallel algorithm with both on-ground and in-space hardware implementations. The improved performance levels achieved by the use of Grid technology (middleware) for Parallel Data Fusion are presented for the main metrics of interest in near real-time applications, namely latency, total computation load, and total sustainable throughput. The objective of this analysis is, therefore, to demonstrate an implementation of multi-sensor data fusion and/or multi-target tracking functions within an integrated multi-node portable HPC architecture based on emerging Grid technology. The key metrics to be determined in support of ongoing system analyses include: required computational throughput in MFLOPS; latency between receipt of input data and resulting outputs; and scalability, processor utilization and memory requirements. Furthermore, the standard MPI functions are considered for inter-node communications in order to promote code portability across multiple HPC computer platforms, both in space and on-ground.

  8. Decision Fusion System for Bolted Joint Monitoring

    Directory of Open Access Journals (Sweden)

    Dong Liang

    2015-01-01

    Full Text Available Bolted joints are widely used in mechanical and architectural structures, such as machine tools, industrial robots, transport machines, power plants, aviation stiffened plates, bridges, and steel towers. Bolt loosening induced by flight loads and environmental factors can cause joint failure, leading to disastrous accidents. Hence, structural health monitoring is critical for bolted joint detection. In order to realize real-time and convenient monitoring and to satisfy the requirements of advanced maintenance of the structure, this paper proposes an intelligent bolted joint failure monitoring approach using a decision fusion system integrated with a Lamb wave propagation-based actuator-sensor monitoring method. First, the basic knowledge of decision fusion and classifier selection techniques is briefly introduced. Then, the developed decision fusion system is presented. Finally, three fusion algorithms, namely majority voting, Bayesian belief, and a multiagent method, are compared in a real-world monitoring experiment on a large aviation aluminum plate. The experimental results show great potential for real-time application: the method can accurately and rapidly identify bolt loosening by analyzing the acquired strain signals with the proposed decision fusion system.
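
    As a rough illustration of the first two fusion rules named above, the snippet below fuses hard decisions from several classifiers by plain majority voting and by an accuracy-weighted (log-odds) vote, a common Bayesian-flavoured variant; the classifier outputs and accuracies are invented for the example, and the multiagent method is not shown.

    ```python
    import math
    from collections import Counter

    def majority_vote(labels):
        """Fuse hard decisions from several classifiers by simple majority."""
        return Counter(labels).most_common(1)[0][0]

    def weighted_vote(labels, accuracies):
        """Weight each classifier's vote by the log-odds of its estimated accuracy."""
        scores = {}
        for label, acc in zip(labels, accuracies):
            scores[label] = scores.get(label, 0.0) + math.log(acc / (1.0 - acc))
        return max(scores, key=scores.get)

    # Three classifiers inspecting the same strain signal, say:
    print(majority_vote(["loose", "tight", "loose"]))                  # -> loose
    print(weighted_vote(["loose", "tight", "loose"], [0.7, 0.9, 0.6]))  # -> tight
    ```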

  9. Perception-oriented fusion of multi-sensor imagery: visible, IR, and SAR

    Science.gov (United States)

    Sidorchuk, D.; Volkov, V.; Gladilin, S.

    2018-04-01

    This paper addresses the problem of fusing optical (visible and thermal domain) data and radar data for the purpose of visualization. These types of images typically contain a great deal of complementary information, and their joint visualization can be more useful and convenient for a human user than a set of individual images. To solve the image fusion problem, we propose a novel algorithm that exploits peculiarities of human color perception and is based on grey-scale structural visualization. Benefits of the presented algorithm are exemplified with satellite imagery.
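
    The perception-based algorithm itself is not reproduced here, but a much simpler false-colour assignment conveys the general idea of fusing complementary modalities for display: structural detail drives luminance while thermal and radar responses modulate colour. The weights and channel mapping below are illustrative assumptions only.

    ```python
    import numpy as np

    def false_color_fusion(visible, thermal, sar):
        """Naive false-colour fusion of co-registered visible, thermal and SAR images."""
        def norm(x):
            x = x.astype(np.float32)
            return (x - x.min()) / (x.max() - x.min() + 1e-6)
        v, t, s = norm(visible), norm(thermal), norm(sar)
        luminance = 0.5 * v + 0.25 * t + 0.25 * s        # grey-scale structural base
        rgb = np.stack([
            np.clip(luminance + 0.5 * (t - s), 0, 1),    # warmer where thermal dominates
            luminance,
            np.clip(luminance + 0.5 * (s - t), 0, 1),    # cooler where radar dominates
        ], axis=-1)
        return (rgb * 255).astype(np.uint8)
    ```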

  10. An LPV Adaptive Observer for Updating a Map Applied to an MAF Sensor in a Diesel Engine.

    Science.gov (United States)

    Liu, Zhiyuan; Wang, Changhui

    2015-10-23

    In this paper, a new method is developed for compensating mass air flow (MAF) sensor error due to installation and aging in a diesel engine and for updating the error map (or lookup table) online. Since the MAF sensor error depends on the engine operating point, the error model is represented as a two-dimensional (2D) map with two inputs, fuel mass injection quantity and engine speed. The 2D map representing the MAF sensor error is described as a piecewise bilinear interpolation model, which can be written as a dot product between a regression vector and a parameter vector using a membership function. Combining the 2D map regression model with the diesel engine air path system, an LPV adaptive observer with low computational load is designed to estimate states and parameters jointly. The convergence of the proposed algorithm is proven under conditions of persistent excitation and given inequalities. The observer is validated against simulation data from the engine software enDYNA provided by Tesis. The results demonstrate that the operating-point-dependent error of the MAF sensor can be approximated acceptably by the 2D map obtained from the proposed method.
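
    The "dot product between a regression vector and a parameter vector" formulation can be made concrete with a short sketch: for an operating point (fuel quantity, engine speed), the regression vector holds the four bilinear membership weights of the surrounding grid nodes, so the map value is simply phi @ theta. The grid values and variable names below are illustrative assumptions; the LPV observer that adapts theta online is not shown.

    ```python
    import numpy as np

    def regression_vector(x, y, x_grid, y_grid):
        """Membership (regression) vector phi such that the bilinear map value is
        phi @ theta, where theta stacks the map values at all grid nodes."""
        i = int(np.clip(np.searchsorted(x_grid, x) - 1, 0, len(x_grid) - 2))
        j = int(np.clip(np.searchsorted(y_grid, y) - 1, 0, len(y_grid) - 2))
        ax = (x - x_grid[i]) / (x_grid[i + 1] - x_grid[i])
        ay = (y - y_grid[j]) / (y_grid[j + 1] - y_grid[j])
        phi = np.zeros(len(x_grid) * len(y_grid))
        for di, dj, w in [(0, 0, (1 - ax) * (1 - ay)), (1, 0, ax * (1 - ay)),
                          (0, 1, (1 - ax) * ay), (1, 1, ax * ay)]:
            phi[(i + di) * len(y_grid) + (j + dj)] = w
        return phi

    x_grid = np.array([0.0, 10.0, 20.0, 30.0])   # fuel mass injection grid (illustrative)
    y_grid = np.array([800.0, 1600.0, 2400.0])   # engine speed grid in rpm (illustrative)
    theta = np.zeros(len(x_grid) * len(y_grid))  # map parameters, adapted online
    phi = regression_vector(12.0, 1500.0, x_grid, y_grid)
    error_estimate = phi @ theta
    ```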

  11. Distributed service-based approach for sensor data fusion in IoT environments.

    Science.gov (United States)

    Rodríguez-Valenzuela, Sandra; Holgado-Terriza, Juan A; Gutiérrez-Guerrero, José M; Muros-Cobos, Jesús L

    2014-10-15

    The Internet of Things (IoT) enables communication among smart objects, promoting the pervasive presence around us of a variety of things or objects that are able to interact and cooperate to reach common goals. IoT objects can obtain data from their context, such as the home, office, industry or body. These data can be combined to obtain new and more complex information by applying data fusion processes. However, to apply data fusion algorithms in IoT environments, the full system must deal with distributed nodes and decentralized communication and must support scalability and node dynamicity, among other constraints. In this paper, a novel method to manage data acquisition and fusion based on a distributed service composition model is presented, improving data treatment in pervasive IoT environments.

  12. The Radio Frequency Health Node Wireless Sensor System

    Science.gov (United States)

    Valencia, J. Emilio; Stanley, Priscilla C.; Mackey, Paul J.

    2009-01-01

    The Radio Frequency Health Node (RFHN) wireless sensor system differs from other wireless sensor systems in ways originally intended to enhance its utility as an instrumentation system for a spacecraft. The RFHN can also be adapted to use in terrestrial applications in which there are requirements for operational flexibility and integrability into higher-level instrumentation and data acquisition systems. As shown in the figure, the heart of the system is the RFHN, which is a unit that passes commands and data between (1) one or more commercially available wireless sensor units (optionally, also including wired sensor units) and (2) command and data interfaces with a local control computer that may be part of the spacecraft or other engineering system in which the wireless sensor system is installed. In turn, the local control computer can be in radio or wire communication with a remote control computer that may be part of a higher-level system. The remote control computer, acting via the local control computer and the RFHN, can not only monitor readout data from the sensor units but also remotely configure (program or reprogram) the RFHN and the sensor units during operation. In a spacecraft application, the RFHN and the sensor units can also be configured more nearly directly, prior to launch, via a serial interface that includes an umbilical cable between the spacecraft and ground support equipment. In either case, the RFHN wireless sensor system has the flexibility to be configured, as required, with different numbers and types of sensors for different applications. The RFHN can be used to effect real-time transfer of data from, and commands to, the wireless sensor units. It can also store data for later retrieval by an external computer. The RFHN communicates with the wireless sensor units via a radio transceiver module. The modular design of the RFHN makes it possible to add radio transceiver modules as needed to accommodate additional sets of wireless sensor units.

  13. Use of data fusion to optimize contaminant transport predictions

    International Nuclear Information System (INIS)

    Eeckhout, E. van

    1997-10-01

    The original data fusion workstation, as envisioned by Coleman Research Corp., was constructed under funding from DOE (EM-50) in the early 1990s. The intent was to demonstrate the viability of fusion and analysis of data from various types of sensors, primarily geophysical, for waste site characterization. This overall concept changed over time and evolved more towards hydrogeological (groundwater) data fusion after some initial geophysical fusion work focused at Coleman. This initial geophysical fusion platform was tested at Hanford and Fernald, and the later hydrogeological fusion work has been demonstrated at Pantex, Savannah River, the US Army Letterkenny Depot, a DoD Massachusetts site and a DoD California site. The hydrogeologic data fusion package has been spun off to a company named Fusion and Control Technology, Inc. This package is called the Hydrological Fusion And Control Tool (Hydro-FACT) and is being sold as a product that links with the software package MS-VMS (MODFLOW-SURFACT Visual Modeling System), sold by HydroGeoLogic, Inc. MODFLOW is a USGS development and is in the public domain. Since the government paid for the data fusion development at Coleman, the government and its contractors have access to the data fusion technology in this hydrogeologic package for certain computer platforms, but would probably have to hire FACT (Fusion and Control Technology, Inc.) and/or HydroGeoLogic for some level of software and services. Further discussion in this report concentrates on the hydrogeologic fusion module that is being sold as Hydro-FACT, which can be linked with MS-VMS.

  14. Data Reduction with Quantization Constraints for Decentralized Estimation in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yang Weng

    2014-01-01

    Full Text Available The problem of estimating an unknown vector with a bandwidth-constrained wireless sensor network is considered. In such networks, sensor nodes make distributed observations on the unknown vector and collaborate with a fusion center to generate a final estimate. Due to power and communication bandwidth limitations, each sensor node must compress its data and transmit it to the fusion center. In this paper, both centralized and decentralized estimation frameworks are developed. A closed-form solution for the centralized estimation framework is proposed. The decentralized estimation problem is proven to be NP-hard, and a Gauss-Seidel algorithm to search for an optimal solution is proposed. Simulation results show the good performance of the proposed algorithms.
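
    The Gauss-Seidel idea mentioned above is the generic one of updating one block of variables at a time while holding the others fixed. The sketch below shows only that generic iteration on a small linear system, under the assumption of a nonzero (here diagonally dominant) diagonal; the paper's algorithm applies the same pattern to the per-sensor compression rules rather than to a linear system.

    ```python
    import numpy as np

    def gauss_seidel(A, b, iters=100, tol=1e-8):
        """Generic Gauss-Seidel iteration for A x = b: each coordinate is updated
        in turn using the most recent values of the others."""
        n = len(b)
        x = np.zeros(n)
        for _ in range(iters):
            x_old = x.copy()
            for i in range(n):
                sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
                x[i] = (b[i] - sigma) / A[i, i]
            if np.linalg.norm(x - x_old) < tol:
                break
        return x

    A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 5.0]])
    b = np.array([1.0, 2.0, 3.0])
    print(gauss_seidel(A, b))   # agrees with np.linalg.solve(A, b)
    ```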

  15. Kinect Fusion improvement using depth camera calibration

    Science.gov (United States)

    Pagliari, D.; Menna, F.; Roncella, R.; Remondino, F.; Pinto, L.

    2014-06-01

    3D scene modelling, gesture recognition and motion tracking are fields in rapid and continuous development, which have caused a growing demand for interactivity in the video-game and e-entertainment market. Starting from the idea of creating a sensor that allows users to play without having to hold any remote controller, the Microsoft Kinect device was created. The Kinect has always attracted researchers in different fields, from robotics to Computer Vision (CV) and biomedical engineering, as well as third-party communities that have released several Software Development Kit (SDK) versions for the Kinect in order to use it not only as a game device but as a measurement system. The Microsoft Kinect Fusion control libraries (first released in March 2013) allow using the device as a 3D scanner and producing meshed polygonal models of a static scene just by moving the Kinect around. A drawback of this sensor is the geometric quality of the delivered data and its low repeatability. For this reason the authors carried out an investigation to evaluate the accuracy and repeatability of the depth measurements delivered by the Kinect. The paper presents a thorough calibration analysis of the Kinect imaging sensor, with the aim of establishing the accuracy and precision of the delivered information: a straightforward calibration of the depth sensor is presented, and the 3D data are then corrected accordingly. By integrating the depth correction algorithm and correcting the IR camera interior and exterior orientation parameters, the Fusion libraries are improved and new reconstruction software is created to produce more accurate models.
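
    The depth correction step can be pictured as fitting a mapping from raw Kinect depths to reference depths during calibration and applying it to new frames. The sketch below uses a simple global polynomial fit on synthetic example values; the paper's calibration model (per-pixel behaviour plus interior/exterior orientation of the IR camera) is considerably richer, so this is only an assumption-laden illustration of the idea.

    ```python
    import numpy as np

    def fit_depth_correction(measured, reference, degree=2):
        """Least-squares polynomial relating raw depth to reference depth;
        applying it to new frames compensates the systematic distance error."""
        coeffs = np.polyfit(np.ravel(measured), np.ravel(reference), degree)
        return np.poly1d(coeffs)

    # Synthetic calibration samples (metres), NOT values from the paper.
    measured = np.array([0.8, 1.2, 1.6, 2.0, 2.4, 2.8])
    reference = np.array([0.82, 1.21, 1.63, 2.05, 2.47, 2.90])
    correct = fit_depth_correction(measured, reference)
    print(correct(1.8))   # corrected depth for a new raw measurement
    ```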

  16. Kinect Fusion improvement using depth camera calibration

    Directory of Open Access Journals (Sweden)

    D. Pagliari

    2014-06-01

    Full Text Available 3D scene modelling, gesture recognition and motion tracking are fields in rapid and continuous development, which have caused a growing demand for interactivity in the video-game and e-entertainment market. Starting from the idea of creating a sensor that allows users to play without having to hold any remote controller, the Microsoft Kinect device was created. The Kinect has always attracted researchers in different fields, from robotics to Computer Vision (CV) and biomedical engineering, as well as third-party communities that have released several Software Development Kit (SDK) versions for the Kinect in order to use it not only as a game device but as a measurement system. The Microsoft Kinect Fusion control libraries (first released in March 2013) allow using the device as a 3D scanner and producing meshed polygonal models of a static scene just by moving the Kinect around. A drawback of this sensor is the geometric quality of the delivered data and its low repeatability. For this reason the authors carried out an investigation to evaluate the accuracy and repeatability of the depth measurements delivered by the Kinect. The paper presents a thorough calibration analysis of the Kinect imaging sensor, with the aim of establishing the accuracy and precision of the delivered information: a straightforward calibration of the depth sensor is presented, and the 3D data are then corrected accordingly. By integrating the depth correction algorithm and correcting the IR camera interior and exterior orientation parameters, the Fusion libraries are improved and new reconstruction software is created to produce more accurate models.

  17. An Adaptive Channel Access Method for Dynamic Super Dense Wireless Sensor Networks.

    Science.gov (United States)

    Lei, Chunyang; Bie, Hongxia; Fang, Gengfa; Zhang, Xuekun

    2015-12-03

    Super dense and distributed wireless sensor networks have become very popular with the development of small cell technology, the Internet of Things (IoT), Machine-to-Machine (M2M) communications, Vehicular-to-Vehicular (V2V) communications and public safety networks. While densely deployed wireless networks provide one of the most important and sustainable solutions to improve the accuracy of sensing and spectral efficiency, a new channel access scheme needs to be designed to solve the channel congestion problem introduced by the high dynamics of competing nodes accessing the channel simultaneously. In this paper, we first analyze the channel contention problem using a novel normalized channel contention analysis model, which provides information on how to tune the contention window according to the state of channel contention. We then propose an adaptive channel contention window tuning algorithm in which the contention window tuning rate is set dynamically based on the estimated channel contention level. Simulation results show that the proposed adaptive channel access algorithm based on fast contention window tuning can achieve more than 95% of the theoretical optimal throughput and a fairness index of 0.97, especially in dynamic and dense networks.
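
    As a rough sketch of the adaptive tuning idea (not the authors' exact tuning law), the snippet below grows the contention window at a rate proportional to how far the estimated contention level exceeds a target, and shrinks it gently otherwise; the constants are illustrative assumptions.

    ```python
    def tune_contention_window(cw, contention_level, target=0.5,
                               cw_min=16, cw_max=1024):
        """Multiplicative contention-window adaptation driven by the estimated
        channel contention level (e.g. an observed collision ratio)."""
        if contention_level > target:
            rate = 1.0 + (contention_level - target)   # tuning rate grows with contention
            return min(int(cw * rate), cw_max)
        return max(int(cw * 0.9), cw_min)

    cw = 32
    for observed in [0.8, 0.7, 0.55, 0.4, 0.3]:   # estimated contention per round
        cw = tune_contention_window(cw, observed)
        print(cw)
    ```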

  18. An Adaptive Channel Access Method for Dynamic Super Dense Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Chunyang Lei

    2015-12-01

    Full Text Available Super dense and distributed wireless sensor networks have become very popular with the development of small cell technology, the Internet of Things (IoT), Machine-to-Machine (M2M) communications, Vehicular-to-Vehicular (V2V) communications and public safety networks. While densely deployed wireless networks provide one of the most important and sustainable solutions to improve the accuracy of sensing and spectral efficiency, a new channel access scheme needs to be designed to solve the channel congestion problem introduced by the high dynamics of competing nodes accessing the channel simultaneously. In this paper, we first analyze the channel contention problem using a novel normalized channel contention analysis model, which provides information on how to tune the contention window according to the state of channel contention. We then propose an adaptive channel contention window tuning algorithm in which the contention window tuning rate is set dynamically based on the estimated channel contention level. Simulation results show that the proposed adaptive channel access algorithm based on fast contention window tuning can achieve more than 95% of the theoretical optimal throughput and a fairness index of 0.97, especially in dynamic and dense networks.

  19. Cermet coatings for magnetic fusion reactors

    International Nuclear Information System (INIS)

    Smith, M.F.; Whitley, J.B.; McDonald, J.M.

    1984-01-01

    Cermet coatings consisting of SiC particles in an aluminum matrix were produced by a low pressure chamber plasma spray process. Properties of these coatings are being investigated to evaluate their suitability for use in the next generation of magnetic confinement fusion reactors. Although this preliminary study has focused primarily upon SiC-Al cermets, the deposition process can be adapted to other ceramic-metal combinations. Potential applications for cermet coatings in magnetic fusion devices are presented along with experimental results from thermal tests of candidate coatings. (Auth.)

  20. Introduction to adaptive arrays

    CERN Document Server

    Monzingo, Bob; Haupt, Randy

    2011-01-01

    This second edition is an extensive modernization of the bestselling introduction to the subject of adaptive array sensor systems. With the number of applications of adaptive array sensor systems growing each year, this look at the principles and fundamental techniques that are critical to these systems is more important than ever before. Introduction to Adaptive Arrays, 2nd Edition is organized as a tutorial, taking the reader by the hand and leading them through the maze of jargon that often surrounds this highly technical subject. It is easy to read and easy to follow as fundamental concept