WorldWideScience

Sample records for based map detection

  1. Contour Detection for UAV-Based Cadastral Mapping

    Directory of Open Access Journals (Sweden)

    Sophie Crommelinck

    2017-02-01

    Unmanned aerial vehicles (UAVs) provide a flexible and low-cost solution for the acquisition of high-resolution data. The potential of high-resolution UAV imagery to create and update cadastral maps is being increasingly investigated. Existing procedures generally involve substantial fieldwork and many manual processes. Arguably, multiple parts of UAV-based cadastral mapping workflows could be automated. Specifically, as many cadastral boundaries coincide with visible boundaries, they could be extracted automatically using image analysis methods. This study investigates the transferability of gPb contour detection, a state-of-the-art computer vision method, to remotely sensed UAV images and UAV-based cadastral mapping. Results show that the approach is transferable to UAV data and automated cadastral mapping: object contours are comprehensively detected at completeness and correctness rates of up to 80%. The detection quality is optimal when the entire scene is covered with one orthoimage, due to the global optimization of gPb contour detection. However, a balance between high completeness and correctness is hard to achieve, so a combination with area-based segmentation and further object knowledge is proposed. The localization quality exhibits the usual dependency on ground resolution. The approach has the potential to accelerate the process of general boundary delineation during the creation and updating of cadastral maps.

  2. Object detection system based on multimodel saliency maps

    Science.gov (United States)

    Guo, Ya'nan; Luo, Chongfan; Ma, Yide

    2017-03-01

    Detection of visually salient image regions is extensively applied in computer vision and computer graphics, for tasks such as object detection, adaptive compression, and object recognition, but any single model has its limitations on various images. In our work, we therefore establish an object detection method based on multimodel saliency maps, which absorbs the merits of several individual saliency detection models to achieve promising results. The method can be roughly divided into three steps: in the first step, a decision-making system evaluates the saliency maps obtained by seven competitive methods and selects only the three most valuable ones; in the second step, a heterogeneous PCNN algorithm is introduced to obtain three prime foregrounds, and a self-designed nonlinear fusion method is then proposed to merge these saliency maps; in the last step, an adaptive improved and simplified PCNN (SPCNN) model is used to detect the object. The proposed method constitutes an object detection system for different occasions that requires no training, is simple, and is highly efficient. The proposed saliency fusion technique shows better performance over a broad range of images and broadens the applicability range by fusing different individual saliency models, making the proposed system a strong model. Moreover, the adaptive improved SPCNN model stems from Eckhorn's neuron model, which is well suited to image segmentation because of its biological background, and all of its parameters adapt to the image information. We extensively evaluate our algorithm on a classical salient object detection database, and the experimental results demonstrate that the aggregation of saliency maps outperforms the best single saliency model in all cases, yielding the highest precision of 89.90%, better recall of 98.20%, the greatest F-measure of 91.20%, and the lowest mean absolute error of 0.057, the value of the proposed saliency evaluation
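    The selection-and-fusion idea can be sketched in a few lines of Python. This is a minimal illustration only: the consensus-correlation selection score and the geometric-mean fusion are assumptions standing in for the paper's decision-making system, PCNN foreground extraction and self-designed fusion rule.

    ```python
    import numpy as np

    def fuse_saliency_maps(saliency_maps, keep=3):
        """Select the most consistent saliency maps and fuse them nonlinearly.

        `saliency_maps` is a list of 2-D float arrays. The scoring and fusion
        rules below are illustrative stand-ins, not the paper's exact method.
        """
        stack = np.stack([m / (m.max() + 1e-12) for m in saliency_maps])
        consensus = stack.mean(axis=0)                       # rough agreement map
        scores = [np.corrcoef(m.ravel(), consensus.ravel())[0, 1] for m in stack]
        best = stack[np.argsort(scores)[-keep:]]             # keep the top-scoring maps
        fused = np.prod(best, axis=0) ** (1.0 / keep)        # nonlinear (geometric) fusion
        return (fused - fused.min()) / (fused.max() - fused.min() + 1e-12)
    ```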

  3. Risk-based fault detection using Self-Organizing Map

    International Nuclear Information System (INIS)

    Yu, Hongyang; Khan, Faisal; Garaniya, Vikram

    2015-01-01

    The complexity of modern systems is increasing rapidly, and the dominating relationships among system variables have become highly non-linear. This makes it difficult to identify a system's operating states, which in turn affects the sensitivity of fault detection and poses a challenge to ensuring safe operation. In recent years, the Self-Organizing Map (SOM) has gained popularity in system monitoring as a robust non-linear dimensionality reduction tool. The Self-Organizing Map is able to capture non-linear variations of the system; it is therefore sensitive to changes in the system's states, leading to early detection of faults. In this paper, a new approach based on the Self-Organizing Map is proposed to detect faults and assess their risk. In addition, probabilistic analysis is applied to characterize the risk of fault into different levels according to the hazard potential, enabling refined monitoring of the system. The proposed approach is applied to two experimental systems. The results from both systems show high sensitivity of the proposed approach in detecting and identifying the root cause of faults. The refined monitoring facilitates the determination of the risk of fault and early deployment of remedial actions and safety measures to minimize the potential impact of a fault. - Highlights: • A new approach based on the Self-Organizing Map is proposed to detect faults. • Integration of fault detection with a risk assessment methodology. • Fault risk is characterized into different levels to enable focused system monitoring.
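    A minimal sketch of how an SOM trained on fault-free data can be used this way, assuming the third-party minisom package; the quantization-error alarm limit and the coarse risk levels are illustrative choices, not the paper's probabilistic risk characterization.

    ```python
    import numpy as np
    from minisom import MiniSom   # assumed third-party package (pip install minisom)

    def train_som_monitor(normal_data, grid=(10, 10), iters=5000):
        """Fit an SOM on fault-free operating data and derive an alarm limit
        from the distribution of training quantization errors."""
        som = MiniSom(grid[0], grid[1], normal_data.shape[1],
                      sigma=1.0, learning_rate=0.5, random_seed=0)
        som.random_weights_init(normal_data)
        som.train_random(normal_data, iters)
        q_err = np.array([np.linalg.norm(x - som.get_weights()[som.winner(x)])
                          for x in normal_data])
        return som, np.percentile(q_err, 99)        # 99th percentile as alarm limit

    def risk_level(som, limit, sample):
        """Map a new sample's quantization error to a coarse risk level."""
        err = np.linalg.norm(sample - som.get_weights()[som.winner(sample)])
        if err <= limit:
            return "normal"
        return "high risk" if err > 2 * limit else "elevated risk"
    ```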

  4. Mobile Anomaly Detection Based on Improved Self-Organizing Maps

    Directory of Open Access Journals (Sweden)

    Chunyong Yin

    2017-01-01

    Anomaly detection has always been a focus of researchers, and the development of mobile devices in particular raises new challenges. For example, mobile devices keep their Internet connection and are rarely turned off, even at night. This means mobile devices can attack nodes or be attacked at night without being perceived by users, and their behavior differs from typical Internet behavior. The introduction of data mining has brought leaps forward in this field. The self-organizing map (SOM), one of the well-known clustering algorithms, is affected by its initial weight vectors, which makes the clustering result unstable. In this work, the optimal method of selecting initial clustering centers is transplanted from K-means to SOM. To evaluate the performance of the improved SOM, we use diverse datasets as well as the KDD Cup99 dataset to compare it with the traditional SOM. The experimental results show that the improved SOM achieves a higher accuracy rate on the universal datasets, and on the KDD Cup99 dataset it achieves higher recall and precision.
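    The abstract does not spell out which K-means initialization is transplanted; the sketch below assumes a K-means++-style seeding of the initial SOM weight vectors, one common "optimal initial centers" scheme.

    ```python
    import numpy as np

    def kmeanspp_init_weights(data, n_units, rng=None):
        """K-means++-style selection of initial SOM weight vectors.

        Illustrates the general idea of choosing well-spread initial centers;
        the paper's exact selection rule may differ.
        """
        rng = np.random.default_rng(rng)
        centers = [data[rng.integers(len(data))]]          # first center: random sample
        while len(centers) < n_units:
            # squared distance of every sample to its nearest chosen center
            d2 = np.min([np.sum((data - c) ** 2, axis=1) for c in centers], axis=0)
            probs = d2 / d2.sum()                          # far samples are more likely
            centers.append(data[rng.choice(len(data), p=probs)])
        return np.array(centers)                           # one row per SOM neuron
    ```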

  5. An Anomaly Detection Algorithm of Cloud Platform Based on Self-Organizing Maps

    Directory of Open Access Journals (Sweden)

    Jun Liu

    2016-01-01

    Virtual machines (VMs) on a cloud platform can be influenced by a variety of factors which can lead to decreased performance and downtime, affecting the reliability of the cloud platform. Traditional anomaly detection algorithms and strategies for cloud platforms have some flaws in their detection accuracy, detection speed, and adaptability. In this paper, a dynamic and adaptive anomaly detection algorithm for virtual machines based on Self-Organizing Maps (SOM) is proposed. A unified SOM-based modeling method that captures machine performance within the detection region is presented, which avoids the cost of modeling each virtual machine individually and enhances the detection speed and reliability for large-scale virtual machines on a cloud platform. The important parameters that affect the modeling speed are optimized in the SOM process to significantly improve the accuracy of the SOM modeling and therefore the anomaly detection accuracy for the virtual machines.

  6. Subpixel Mapping of Hyperspectral Image Based on Linear Subpixel Feature Detection and Object Optimization

    Science.gov (United States)

    Liu, Zhaoxin; Zhao, Liaoying; Li, Xiaorun; Chen, Shuhan

    2018-04-01

    Owing to the limited spatial resolution of imaging sensors and the variability of ground surfaces, mixed pixels are widespread in hyperspectral imagery. Traditional subpixel mapping algorithms treat all mixed pixels as boundary mixed pixels and ignore the existence of linear subpixels. To address this problem, this paper proposes a new subpixel mapping method based on linear subpixel feature detection and object optimization. First, the fraction value of each class is obtained by spectral unmixing. Second, linear subpixel features are pre-determined based on the hyperspectral characteristics, and linear subpixel features in the remaining mixed pixels are detected based on maximum linearization index analysis; the classes of linear subpixels are determined using a template matching method. Finally, the whole subpixel mapping result is iteratively optimized by a binary particle swarm optimization algorithm. The performance of the proposed subpixel mapping method is evaluated via experiments on simulated and real hyperspectral data sets. The experimental results demonstrate that the proposed method can improve the accuracy of subpixel mapping.

  7. Weed map generation from UAV image mosaics based on crop row detection

    DEFF Research Database (Denmark)

    Midtiby, Henrik Skov

    To control weed in a field effectively with a minimum of herbicides, knowledge about the weed patches is required. Based on images acquired by Unmanned Aerial Vehicles (UAVs), a vegetation map of the entire field can be generated. Manual analysis, which is often required, to detect weed patches...... is used as input for the method. Issues related to perspective distortion are reduced by using an orthomosaic, which is a high resolution image of the entire field, built from hundreds of images taken by a UAV. A vegetation map is generated from the orthomosaic by calculating the excess green color index...
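    A minimal sketch of the excess green (ExG) vegetation index mentioned above; the chromaticity normalisation and the suggestion to threshold the result (for example with Otsu's method) are standard practice rather than details taken from this record.

    ```python
    import numpy as np

    def excess_green(rgb):
        """Excess green index ExG = 2g - r - b on chromaticity-normalised bands.

        `rgb` is an (H, W, 3) float array; thresholding ExG yields the binary
        vegetation map used as input to crop-row detection.
        """
        rgb = rgb.astype(float)
        total = rgb.sum(axis=2, keepdims=True) + 1e-12
        r, g, b = np.moveaxis(rgb / total, 2, 0)   # chromaticity coordinates
        return 2 * g - r - b
    ```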

  8. Accelerometer-based automatic voice onset detection in speech mapping with navigated repetitive transcranial magnetic stimulation.

    Science.gov (United States)

    Vitikainen, Anne-Mari; Mäkelä, Elina; Lioumis, Pantelis; Jousmäki, Veikko; Mäkelä, Jyrki P

    2015-09-30

    The use of navigated repetitive transcranial magnetic stimulation (rTMS) in mapping of speech-related brain areas has recently been shown to be useful in the preoperative workflow of epilepsy and tumor patients. However, substantial inter- and intraobserver variability and non-optimal replicability of the rTMS results have been reported, and a need for additional development of the methodology is recognized. In TMS motor cortex mappings the evoked responses can be quantitatively monitored by electromyographic recordings; however, no such easily available setup exists for speech mappings. We present an accelerometer-based setup for detection of vocalization-related larynx vibrations, combined with an automatic routine for voice onset detection, for rTMS speech mapping applying naming. The results produced by the automatic routine were compared with manually reviewed video recordings. The new method was applied in routine navigated rTMS speech mapping for 12 consecutive patients during preoperative workup for epilepsy or tumor surgery. The automatic routine correctly detected 96% of the voice onsets, resulting in 96% sensitivity and 71% specificity. The majority (63%) of the misdetections were related to visible throat movements, extra voices before the response, or delayed naming of the previous stimuli. The no-response errors were correctly detected in 88% of events. The proposed setup for automatic detection of voice onsets provides quantitative additional data for analysis of the rTMS-induced speech response modifications. The objectively defined speech response latencies increase the repeatability, reliability and stratification of the rTMS results.
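    A simple energy-threshold onset detector of the kind that could serve as the automatic routine described above; the smoothing window, baseline length and threshold factor are assumptions for illustration.

    ```python
    import numpy as np

    def detect_voice_onset(acc, fs, baseline_s=0.1, k=5.0, smooth_ms=10):
        """Return the voice-onset latency (s) from an accelerometer trace.

        The rectified signal is smoothed and the onset is the first sample
        exceeding `k` times the noise level of a pre-stimulus baseline segment.
        """
        win = max(1, int(fs * smooth_ms / 1000.0))
        env = np.convolve(np.abs(acc), np.ones(win) / win, mode="same")  # amplitude envelope
        noise = env[: int(fs * baseline_s)]              # assumed pre-stimulus baseline
        threshold = noise.mean() + k * noise.std()
        above = np.flatnonzero(env > threshold)
        return above[0] / fs if above.size else None     # onset latency in seconds
    ```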

  9. Rapid detection of structural variation in a human genome using nanochannel-based genome mapping technology

    DEFF Research Database (Denmark)

    Cao, Hongzhi; Hastie, Alex R.; Cao, Dandan

    2014-01-01

    mutations; however, none of the current detection methods are comprehensive, and currently available methodologies are incapable of providing sufficient resolution and unambiguous information across complex regions in the human genome. To address these challenges, we applied a high-throughput, cost-effective genome mapping technology to comprehensively discover genome-wide SVs and characterize complex regions of the YH genome using long single molecules (>150 kb) in a global fashion. RESULTS: Utilizing nanochannel-based genome mapping technology, we obtained 708 insertions/deletions and 17 inversions larger...... fosmid data. Of the remaining 270 SVs, 260 are insertions and 213 overlap known SVs in the Database of Genomic Variants. Overall, 609 out of 666 (90%) variants were supported by experimental orthogonal methods or historical evidence in public databases. At the same time, genome mapping also provides...

  10. Smartphone-Based Mobile Detection Platform for Molecular Diagnostics and Spatiotemporal Disease Mapping.

    Science.gov (United States)

    Song, Jinzhao; Pandian, Vikram; Mauk, Michael G; Bau, Haim H; Cherry, Sara; Tisi, Laurence C; Liu, Changchun

    2018-04-03

    Rapid and quantitative molecular diagnostics in the field, at home, and at remote clinics is essential for evidence-based disease management, control, and prevention. Conventional molecular diagnostics requires extensive sample preparation, relatively sophisticated instruments, and trained personnel, restricting its use to centralized laboratories. To overcome these limitations, we designed a simple, inexpensive, hand-held, smartphone-based mobile detection platform, dubbed "smart-connected cup" (SCC), for rapid, connected, and quantitative molecular diagnostics. Our platform combines bioluminescent assay in real-time and loop-mediated isothermal amplification (BART-LAMP) technology with smartphone-based detection, eliminating the need for an excitation source and optical filters that are essential in fluorescent-based detection. The incubation heating for the isothermal amplification is provided, electricity-free, with an exothermic chemical reaction, and incubation temperature is regulated with a phase change material. A custom Android App was developed for bioluminescent signal monitoring and analysis, target quantification, data sharing, and spatiotemporal mapping of disease. SCC's utility is demonstrated by quantitative detection of Zika virus (ZIKV) in urine and saliva and HIV in blood within 45 min. We demonstrate SCC's connectivity for disease spatiotemporal mapping with a custom-designed website. Such a smart- and connected-diagnostic system does not require any lab facilities and is suitable for use at home, in the field, in the clinic, and particularly in resource-limited settings in the context of Internet of Medical Things (IoMT).

  11. Landslide Mapping in Vegetated Areas Using Change Detection Based on Optical and Polarimetric SAR Data

    Directory of Open Access Journals (Sweden)

    Simon Plank

    2016-04-01

    Mapping of landslides, quickly providing information about the extent of the affected area and type and grade of damage, is crucial to enable fast crisis response, i.e., to support rescue and humanitarian operations. Most synthetic aperture radar (SAR) data-based landslide detection approaches reported in the literature use change detection techniques, requiring very high resolution (VHR) SAR imagery acquired shortly before the landslide event, which is commonly not available. Modern VHR SAR missions, e.g., Radarsat-2, TerraSAR-X, or COSMO-SkyMed, do not systematically cover the entire world, due to limitations in onboard disk space and downlink transmission rates. Here, we present a fast and transferable procedure for mapping of landslides, based on change detection between pre-event optical imagery and the polarimetric entropy derived from post-event VHR polarimetric SAR data. Pre-event information is derived from high resolution optical imagery of Landsat-8 or Sentinel-2, which are freely available and systematically acquired over the entire Earth’s landmass. The landslide mapping is refined by slope information from a digital elevation model generated from bi-static TanDEM-X imagery. The methodology was successfully applied to two landslide events of different characteristics: a rotational slide near Charleston, West Virginia, USA and a mining waste earthflow near Bolshaya Talda, Russia.

  12. Hardware Implementation of a Modified Delay-Coordinate Mapping-Based QRS Complex Detection Algorithm

    Directory of Open Access Journals (Sweden)

    Andrej Zemva

    2007-01-01

    We present a modified delay-coordinate mapping-based QRS complex detection algorithm, suitable for hardware implementation. In the original algorithm, the phase-space portrait of an electrocardiogram signal is reconstructed in a two-dimensional plane using the method of delays. Geometrical properties of the obtained phase-space portrait are exploited for QRS complex detection. In our solution, a bandpass filter is used for ECG signal prefiltering and an improved method for detection threshold-level calculation is utilized. We developed the algorithm on the MIT-BIH Arrhythmia Database (sensitivity of 99.82% and positive predictivity of 99.82%) and tested it on the long-term ST database (sensitivity of 99.72% and positive predictivity of 99.37%). Our algorithm outperforms several well-known QRS complex detection algorithms, including the original algorithm.
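    The delay-coordinate idea can be illustrated with a short numpy sketch; the embedding delay and the centroid-distance feature are illustrative choices, and the band-pass prefiltering, adaptive threshold calculation and refractory logic of the algorithm above are omitted.

    ```python
    import numpy as np

    def phase_space_qrs_feature(ecg, delay=8):
        """Delay-coordinate embedding of an ECG trace and a simple geometric
        QRS feature derived from it.

        The signal is mapped to the plane as (x[n], x[n + delay]); during a QRS
        complex the trajectory makes a large excursion from the centroid, so the
        distance from the centroid serves as a detection feature.
        """
        x = ecg - np.mean(ecg)
        pts = np.column_stack([x[:-delay], x[delay:]])      # 2-D phase-space portrait
        return np.linalg.norm(pts - pts.mean(axis=0), axis=1)

    # usage: feature = phase_space_qrs_feature(filtered_ecg); samples above an
    # adaptive threshold mark candidate QRS complexes.
    ```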

  13. Cloud-based computation for accelerating vegetation mapping and change detection at regional to national scales

    Science.gov (United States)

    Matthew J. Gregory; Zhiqiang Yang; David M. Bell; Warren B. Cohen; Sean Healey; Janet L. Ohmann; Heather M. Roberts

    2015-01-01

    Mapping vegetation and landscape change at fine spatial scales is needed to inform natural resource and conservation planning, but such maps are expensive and time-consuming to produce. For Landsat-based methodologies, mapping efforts are hampered by the daunting task of manipulating multivariate data for millions to billions of pixels. The advent of cloud-based...

  14. Automatic Pedestrian Crossing Detection and Impairment Analysis Based on Mobile Mapping System

    Science.gov (United States)

    Liu, X.; Zhang, Y.; Li, Q.

    2017-09-01

    Pedestrian crossings, as an important part of transportation infrastructure, serve to secure pedestrians' lives and possessions and keep traffic flow in order. As a prominent feature in the street scene, detection of pedestrian crossings contributes to 3D road marking reconstruction and to diminishing the adverse impact of outliers in 3D street scene reconstruction. Since pedestrian crossings are subject to wear and tear from heavy traffic flow, it is imperative to monitor their condition. On this account, an approach for automatic pedestrian crossing detection using images from a vehicle-based Mobile Mapping System is put forward, and the defilement and impairment of crossings are analyzed in this paper. First, a pedestrian crossing classifier is trained with a low recall rate. Initial detections are then refined by utilizing projection filtering, contour information analysis, and monocular vision. Finally, a pedestrian crossing detection and analysis system with high recall rate, precision and robustness is achieved. This system works for pedestrian crossing detection under different situations and light conditions. It can also recognize defiled and impaired crossings automatically, which facilitates monitoring and maintenance of traffic facilities, so as to reduce potential traffic safety problems and protect lives and property.

  15. AUTOMATIC PEDESTRIAN CROSSING DETECTION AND IMPAIRMENT ANALYSIS BASED ON MOBILE MAPPING SYSTEM

    Directory of Open Access Journals (Sweden)

    X. Liu

    2017-09-01

    Pedestrian crossings, as an important part of transportation infrastructure, serve to secure pedestrians' lives and possessions and keep traffic flow in order. As a prominent feature in the street scene, detection of pedestrian crossings contributes to 3D road marking reconstruction and to diminishing the adverse impact of outliers in 3D street scene reconstruction. Since pedestrian crossings are subject to wear and tear from heavy traffic flow, it is imperative to monitor their condition. On this account, an approach for automatic pedestrian crossing detection using images from a vehicle-based Mobile Mapping System is put forward, and the defilement and impairment of crossings are analyzed in this paper. First, a pedestrian crossing classifier is trained with a low recall rate. Initial detections are then refined by utilizing projection filtering, contour information analysis, and monocular vision. Finally, a pedestrian crossing detection and analysis system with high recall rate, precision and robustness is achieved. This system works for pedestrian crossing detection under different situations and light conditions. It can also recognize defiled and impaired crossings automatically, which facilitates monitoring and maintenance of traffic facilities, so as to reduce potential traffic safety problems and protect lives and property.

  16. Ischemia Detection Using Supervised Learning for Hierarchical Neural Networks Based on Kohonen-Maps

    National Research Council Canada - National Science Library

    Vladutu, L

    2001-01-01

    .... The motivation for developing the Supervising Network - Self Organizing Map (sNet-SOM) model is to design computationally effective solutions for the particular problem of ischemia detection and other similar applications...

  17. Web Based Rapid Mapping of Disaster Areas using Satellite Images, Web Processing Service, Web Mapping Service, Frequency Based Change Detection Algorithm and J-iView

    Science.gov (United States)

    Bandibas, J. C.; Takarada, S.

    2013-12-01

    Timely identification of areas affected by natural disasters is very important for successful rescue and effective emergency relief efforts. This research focuses on the development of a cost-effective and efficient system for identifying areas affected by natural disasters and for efficiently distributing the information. The developed system is composed of three modules: the Web Processing Service (WPS), the Web Map Service (WMS) and the user interface provided by J-iView. WPS is an online system that provides computation, storage and data access services. In this study, the WPS module provides online access to the software implementing the developed frequency-based change detection algorithm for the identification of areas affected by natural disasters. It also sends requests to WMS servers to get the remotely sensed data to be used in the computation. WMS is a standard protocol that provides a simple HTTP interface for requesting geo-registered map images from one or more geospatial databases. In this research, the WMS component provides remote access to the satellite images which are used as inputs for land cover change detection. The user interface in this system is provided by J-iView, an online mapping system developed at the Geological Survey of Japan (GSJ). The three modules are seamlessly integrated into a single package using J-iView, which can rapidly generate a map of disaster areas that is instantaneously viewable online. The developed system was tested using ASTER images covering the areas damaged by the March 11, 2011 tsunami in northeastern Japan. The developed system efficiently generated a map showing areas devastated by the tsunami. Based on the initial results of the study, the developed system proved to be a useful tool for emergency workers to quickly identify areas affected by natural disasters.

  18. People Detection Based on Spatial Mapping of Friendliness and Floor Boundary Points for a Mobile Navigation Robot

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Tasaki

    2011-01-01

    Navigation robots must single out partners requiring navigation and move in cluttered environments where people walk around. Developing such robots requires two different kinds of people detection: detecting partners and detecting all moving people around the robot. For detecting partners, we design divided spaces based on spatial relationships and sensing ranges. By mapping the friendliness of each divided space, based on the stimuli from multiple sensors, so as to detect people who actively call the robot, the robot detects its partner in the space with the highest friendliness. For detecting moving people, we regard objects' floor boundary points in an omnidirectional image as obstacles. We classify obstacles as moving people by comparing the movement of each point with the robot's movement using odometry data, with dynamically changing detection thresholds. Our robot detected 95.0% of partners while standing by and interacting with people, and detected 85.0% of moving people while moving, which is four times higher than previous methods achieved.

  19. Development of a sensitive Luminex xMAP-based microsphere immunoassay for specific detection of Iris yellow spot virus.

    Science.gov (United States)

    Yu, Cui; Yang, Cuiyun; Song, Shaoyi; Yu, Zixiang; Zhou, Xueping; Wu, Jianxiang

    2018-04-04

    Iris yellow spot virus (IYSV) is an Orthotospovirus that infects most Allium species. Very few approaches for specific detection of IYSV from infected plants are available to date. We report the development of a highly sensitive Luminex xMAP-based microsphere immunoassay (MIA) for specific detection of IYSV. The nucleocapsid (N) gene of IYSV was cloned and expressed in Escherichia coli to produce the His-tagged recombinant N protein. A panel of monoclonal antibodies (MAbs) against IYSV was generated by immunizing mice with the recombinant N protein. Five specific MAbs (16D9, 11C6, 7F4, 12C10, and 14H12) were identified and used for developing the Luminex xMAP-based MIA systems along with a polyclonal antibody against IYSV. Comparative analyses of their sensitivity and specificity in detecting IYSV from infected tobacco leaves identified 7F4 as the best-performing MAb in MIA. We then optimized the working conditions of the Luminex xMAP-based MIA for specific detection of IYSV from infected tobacco leaves by using an appropriate blocking buffer and proper concentration of biotin-labeled antibodies, as well as a suitable ratio between the antibodies and the streptavidin R-phycoerythrin (SA-RPE). Under the optimized conditions the Luminex xMAP-based MIA was able to specifically detect IYSV with much higher sensitivity than conventional enzyme-linked immunosorbent assay (ELISA). Importantly, the Luminex xMAP-based MIA is time-saving, and the whole procedure can be completed within 2.5 h. We generated five specific MAbs against IYSV and developed the Luminex xMAP-based MIA method for specific detection of IYSV in plants. This assay provides a sensitive, highly specific, easy to perform and likely cost-effective approach for IYSV detection in infected plants, suggesting the potential broad usefulness of MIA in plant virus diagnosis.

  20. Landslide Inventory Mapping from Bitemporal 10 m SENTINEL-2 Images Using Change Detection Based Markov Random Field

    Science.gov (United States)

    Qin, Y.; Lu, P.; Li, Z.

    2018-04-01

    Landslide inventory mapping is essential for hazard assessment and mitigation. In most previous studies, landslide mapping was achieved by visual interpretation of aerial photos and remote sensing images. However, this approach is labor-intensive and time-consuming, especially over large areas. Although a number of semi-automatic landslide mapping methods have been proposed over the past few years, limitations remain in terms of their applicability across different study areas and data, and there is large room for improvement in accuracy and degree of automation. For these reasons, we developed a change detection-based Markov Random Field (CDMRF) method for landslide inventory mapping. The proposed method mainly includes two steps: 1) change detection-based multi-thresholding for training sample generation and 2) MRF for landslide inventory mapping. Compared with previous methods, the proposed method has three advantages: 1) it combines multiple image difference techniques with a multi-threshold method to generate reliable training samples; 2) it takes the spectral characteristics of landslides into account; and 3) it is highly automatic, with little parameter tuning. The proposed method was applied to regional landslide mapping from 10 m Sentinel-2 images in Western China. Results corroborated the effectiveness and applicability of the proposed method, especially its capability for rapid landslide mapping. Some directions for future research are offered. This study is, to our knowledge, the first attempt to map landslides from free and medium resolution satellite (i.e., Sentinel-2) images in China.
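    Step 1) of the workflow, change detection-based training sample generation, can be sketched as follows; a single brightness-difference image and two z-score thresholds are used here for illustration, whereas the paper combines multiple image-difference techniques and thresholds.

    ```python
    import numpy as np

    def cd_training_samples(pre, post, k_change=2.0, k_stable=0.5):
        """Generate landslide / non-landslide training samples by change detection.

        `pre` and `post` are co-registered single-band images of the same area.
        Pixels whose brightness increase exceeds mean + k_change*std of the
        difference image are labelled candidate landslides; pixels within
        k_stable*std of the mean are labelled confidently stable.
        """
        diff = post.astype(float) - pre.astype(float)
        mu, sigma = diff.mean(), diff.std()
        landslide = diff > mu + k_change * sigma       # fresh landslide scars brighten
        stable = np.abs(diff - mu) < k_stable * sigma  # confidently unchanged pixels
        return landslide, stable
    ```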

  1. On a Hopping-Points SVD and Hough Transform-Based Line Detection Algorithm for Robot Localization and Mapping

    Directory of Open Access Journals (Sweden)

    Abhijeet Ravankar

    2016-05-01

    Line detection is an important problem in computer vision, graphics and autonomous robot navigation. Lines detected using a laser range sensor (LRS) mounted on a robot can be used as features to build a map of the environment, and later to localize the robot in the map, in a process known as Simultaneous Localization and Mapping (SLAM). We propose an efficient algorithm for line detection from LRS data using a novel hopping-points Singular Value Decomposition (SVD) and Hough transform-based algorithm, in which SVD is applied to intermittent LRS points to accelerate the algorithm. A reverse-hop mechanism ensures that the end points of the line segments are accurately extracted. Line segments extracted from the proposed algorithm are used to form a map and, subsequently, LRS data points are matched with the line segments to localize the robot. The proposed algorithm eliminates the drawbacks of point-based matching algorithms like the Iterative Closest Points (ICP) algorithm, the performance of which degrades with an increasing number of points. We tested the proposed algorithm for mapping and localization in both simulated and real environments, and found it to detect lines accurately and build maps with good self-localization.
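    The core SVD step, fitting a line to a subset of laser range points, is a standard total-least-squares fit; the hopping-point selection and reverse-hop end-point refinement of the algorithm above are omitted in this sketch.

    ```python
    import numpy as np

    def fit_line_svd(points):
        """Total-least-squares line fit to 2-D laser range points using SVD.

        Returns a point on the line (the centroid) and a unit direction vector.
        In a hopping-points scheme, this fit is applied to intermittent subsets
        of the scan rather than to every point.
        """
        pts = np.asarray(points, dtype=float)
        centroid = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - centroid)
        direction = vt[0]                 # dominant right singular vector
        return centroid, direction

    # usage: c, d = fit_line_svd(scan_segment); residuals against the line decide
    # whether to extend the segment or start a new one.
    ```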

  2. A Depth Map Generation Algorithm Based on Saliency Detection for 2D to 3D Conversion

    Science.gov (United States)

    Yang, Yizhong; Hu, Xionglou; Wu, Nengju; Wang, Pengfei; Xu, Dong; Rong, Shen

    2017-09-01

    In recent years, 3D movies attract people's attention more and more because of their immersive stereoscopic experience. However, 3D movies is still insufficient, so estimating depth information for 2D to 3D conversion from a video is more and more important. In this paper, we present a novel algorithm to estimate depth information from a video via scene classification algorithm. In order to obtain perceptually reliable depth information for viewers, the algorithm classifies them into three categories: landscape type, close-up type, linear perspective type firstly. Then we employ a specific algorithm to divide the landscape type image into many blocks, and assign depth value by similar relative height cue with the image. As to the close-up type image, a saliency-based method is adopted to enhance the foreground in the image and the method combine it with the global depth gradient to generate final depth map. By vanishing line detection, the calculated vanishing point which is regarded as the farthest point to the viewer is assigned with deepest depth value. According to the distance between the other points and the vanishing point, the entire image is assigned with corresponding depth value. Finally, depth image-based rendering is employed to generate stereoscopic virtual views after bilateral filter. Experiments show that the proposed algorithm can achieve realistic 3D effects and yield satisfactory results, while the perception scores of anaglyph images lie between 6.8 and 7.8.

  3. MODIS 250m burned area mapping based on an algorithm using change point detection and Markov random fields.

    Science.gov (United States)

    Mota, Bernardo; Pereira, Jose; Campagnolo, Manuel; Killick, Rebeca

    2013-04-01

    Area burned in the tropical savannas of Brazil was mapped using MODIS-AQUA daily 250 m resolution imagery by adapting one of the European Space Agency fire_CCI project burned area algorithms, based on change point detection and Markov random fields. The study area covers 1.44 Mkm2, and the analysis was performed with data from 2005. The daily 1000 m image quality layer was used for cloud and cloud shadow screening. The algorithm treats each pixel as a time series and detects changes in the statistical properties of NIR reflectance values to identify potential burning dates. The first step of the algorithm is robust filtering, to exclude outlier observations, followed by application of the Pruned Exact Linear Time (PELT) change point detection technique. Near-infrared (NIR) spectral reflectance changes between time segments and post-change NIR reflectance values are combined into a fire likelihood score. Change points corresponding to an increase in reflectance are dismissed as potential burn events, as are those occurring outside of a pre-defined fire season. In the last step of the algorithm, monthly burned area probability maps and detection date maps are converted to dichotomous (burned/unburned) maps using Markov random fields, which take into account both spatial and temporal relations in the potential burned area maps. A preliminary assessment of our results is performed by comparison with data from the MODIS 1 km active fires and the 500 m burned area products, taking into account the differences in spatial resolution between the two sensors.
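    A per-pixel sketch of the change-point step, assuming the third-party ruptures package (which implements PELT); the l2 cost, minimum segment size and penalty are illustrative, and the robust pre-filtering, fire-likelihood scoring and Markov random field post-processing described above are omitted.

    ```python
    import numpy as np
    import ruptures as rpt   # assumed third-party package implementing PELT

    def burn_candidates(nir_series, dates, penalty=3.0):
        """Detect candidate burn dates in a per-pixel NIR reflectance time series.

        PELT segments the series into statistically homogeneous pieces; a drop in
        mean NIR between consecutive segments marks a potential burning date.
        """
        series = np.asarray(nir_series, dtype=float)
        breaks = [0] + rpt.Pelt(model="l2", min_size=3).fit(series).predict(pen=penalty)
        candidates = []
        for i in range(1, len(breaks) - 1):              # last break is len(series)
            before = series[breaks[i - 1]:breaks[i]].mean()
            after = series[breaks[i]:breaks[i + 1]].mean()
            if after < before:                           # NIR decrease -> possible burn
                candidates.append(dates[breaks[i]])
        return candidates
    ```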

  4. LANDSLIDE INVENTORY MAPPING FROM BITEMPORAL 10 m SENTINEL-2 IMAGES USING CHANGE DETECTION BASED MARKOV RANDOM FIELD

    Directory of Open Access Journals (Sweden)

    Y. Qin

    2018-04-01

    Landslide inventory mapping is essential for hazard assessment and mitigation. In most previous studies, landslide mapping was achieved by visual interpretation of aerial photos and remote sensing images. However, this approach is labor-intensive and time-consuming, especially over large areas. Although a number of semi-automatic landslide mapping methods have been proposed over the past few years, limitations remain in terms of their applicability across different study areas and data, and there is large room for improvement in accuracy and degree of automation. For these reasons, we developed a change detection-based Markov Random Field (CDMRF) method for landslide inventory mapping. The proposed method mainly includes two steps: 1) change detection-based multi-thresholding for training sample generation and 2) MRF for landslide inventory mapping. Compared with previous methods, the proposed method has three advantages: 1) it combines multiple image difference techniques with a multi-threshold method to generate reliable training samples; 2) it takes the spectral characteristics of landslides into account; and 3) it is highly automatic, with little parameter tuning. The proposed method was applied to regional landslide mapping from 10 m Sentinel-2 images in Western China. Results corroborated the effectiveness and applicability of the proposed method, especially its capability for rapid landslide mapping. Some directions for future research are offered. This study is, to our knowledge, the first attempt to map landslides from free and medium resolution satellite (i.e., Sentinel-2) images in China.

  5. An evaluation of image based techniques for wildfire detection and fuel mapping

    Science.gov (United States)

    Gabbert, Dustin W.

    Few events can cause the catastrophic impact to ecology, infrastructure, and human safety of a wildland fire along the wildland-urban interface. The suppression of natural wildland fires over the past decade has caused a buildup of dry, dead surface fuels: a condition that, coupled with the right weather conditions, can cause large destructive wildfires capable of threatening both ancient tree stands and manmade infrastructure. Firefighters use fire danger models to determine staffing needs on high fire-risk days; however, models are only as effective as the spatial and temporal density of their observations. OKFIRE, an Oklahoma initiative created by a partnership between Oklahoma State University and the University of Oklahoma, has proven that fire danger assessments close to the fire - both geographically and temporally - can give firefighters a significant increase in their situational awareness while fighting a wildland fire. This paper investigates several possible solutions for a small Unmanned Aerial System (UAS) which could gather information useful for detecting ground fires and constructing fire danger maps. Multiple fire detection and fuel mapping programs utilize satellites, manned aircraft, and large UAS equipped with hyperspectral sensors to gather useful information. Their success provides convincing proof of the utility that could be gained from low-altitude UAS gathering information at the exact time and place firefighters and land managers are interested in. Close proximity to the end user, both geographically and operationally, can reduce latency times below what could ever be possible with satellite observation. This paper expands on recent advances in computer vision, photogrammetry, and infrared and color imagery to develop a framework for a next-generation UAS which can assess fire danger and aid firefighters in real time as they observe, contain, or extinguish wildland fires. It also investigates the impact of information gained by this

  6. Fall Detection for Elderly from Partially Observed Depth-Map Video Sequences Based on View-Invariant Human Activity Representation

    Directory of Open Access Journals (Sweden)

    Rami Alazrai

    2017-03-01

    This paper presents a new approach for fall detection from partially observed depth-map video sequences. The proposed approach utilizes the 3D skeletal joint positions obtained from the Microsoft Kinect sensor to build a view-invariant descriptor for human activity representation, called the motion-pose geometric descriptor (MPGD). Furthermore, we have developed a histogram-based representation (HBR) based on the MPGD to construct a length-independent representation of the observed video subsequences. Using the constructed HBR, we formulate the fall detection problem as a posterior-maximization problem in which the posterior probability for each observed video subsequence is estimated using a multi-class SVM (support vector machine) classifier. Then, we combine the computed posterior probabilities from all of the observed subsequences to obtain an overall class posterior probability for the entire partially observed depth-map video sequence. To evaluate the performance of the proposed approach, we used the Kinect sensor to record a dataset of depth-map video sequences that simulates four fall-related activities of elderly people: walking, sitting, falling from standing and falling from sitting. Using the collected dataset, we developed three evaluation scenarios based on the number of unobserved video subsequences in the testing videos: a fully observed video sequence scenario, a scenario with a single unobserved video subsequence of random length, and a scenario with two unobserved video subsequences of random lengths. Experimental results show that the proposed approach achieved average recognition accuracies of 93.6%, 77.6% and 65.1% in recognizing the activities during the first, second and third evaluation scenarios, respectively. These results demonstrate the feasibility of the proposed approach to detect falls from partially observed videos.
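    The posterior-combination step can be sketched with scikit-learn; the synthetic placeholder descriptors, the RBF kernel and the log-probability summation (a product of per-subsequence posteriors) are assumptions used for illustration.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    train_hbr = rng.random((200, 32))            # placeholder HBR descriptors
    train_labels = rng.integers(0, 4, 200)       # 4 fall-related activity classes

    clf = SVC(kernel="rbf", probability=True).fit(train_hbr, train_labels)

    def classify_sequence(observed_subsequence_hbrs):
        """Combine per-subsequence class posteriors into one decision for the
        whole partially observed sequence (product of posteriors in log space)."""
        probs = clf.predict_proba(np.asarray(observed_subsequence_hbrs))
        log_post = np.log(probs + 1e-12).sum(axis=0)
        return clf.classes_[int(np.argmax(log_post))]
    ```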

  7. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Fogh Olsen, Ole; Sporring, Jon

    2007-01-01

    . To address this problem we introduce a novel photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across. In this way we preserve the important illumination features......, while eliminating noise. We call our method diffusion based photon mapping....

  8. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Olsen, Ole Fogh; Sporring, Jon

    2006-01-01

    . To address this problem we introduce a novel photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across. In this way we preserve the important illumination features......, while eliminating noise. We call our method diffusion based photon mapping....

  9. Reliable allele detection using SNP-based PCR primers containing Locked Nucleic Acid: application in genetic mapping

    Directory of Open Access Journals (Sweden)

    Trognitz Friederike

    2007-02-01

    Background: The diploid Solanum caripense, a wild relative of potato and tomato, possesses valuable resistance to potato late blight, and we are interested in the genetic basis of this resistance. Due to extremely low levels of genetic variation within the S. caripense genome, it proved impossible to generate a dense genetic map and to assign individual Solanum chromosomes through the use of conventional chromosome-specific SSR, RFLP and AFLP markers, as well as gene- or locus-specific markers. The ease of detection of DNA polymorphisms depends on both the frequency and the form of sequence variation. The narrow genetic background of close relatives and inbreds complicates the detection of the persisting, reduced polymorphism and is a challenge to the development of reliable molecular markers. Nonetheless, monomorphic DNA fragments representing conventional markers that are not directly usable can contain considerable variation at the level of single nucleotide polymorphisms (SNPs). This can be used for the design of allele-specific molecular markers. The reproducible detection of allele-specific markers based on SNPs has been a technical challenge. Results: We present a fast and cost-effective protocol for the detection of allele-specific SNPs by applying Sequence Polymorphism-Derived (SPD) markers. These markers proved highly efficient for fingerprinting of individuals possessing a homogeneous genetic background. SPD markers are obtained from within non-informative, conventional molecular marker fragments that are screened for SNPs to design allele-specific PCR primers. The method makes use of primers containing a single, 3'-terminal Locked Nucleic Acid (LNA) base. We demonstrate the applicability of the technique by successful genetic mapping of allele-specific SNP markers derived from monomorphic Conserved Ortholog Set II (COSII) markers mapped to Solanum chromosomes, in S. caripense. By using SPD markers it was possible for the first time to map the S. caripense alleles

  10. Landslide susceptibility mapping using decision-tree based CHi-squared automatic interaction detection (CHAID) and Logistic regression (LR) integration

    International Nuclear Information System (INIS)

    Althuwaynee, Omar F; Pradhan, Biswajeet; Ahmad, Noordin

    2014-01-01

    This article uses a methodology based on chi-squared automatic interaction detection (CHAID), a multivariate method with an automatic classification capacity, to analyse large numbers of landslide conditioning factors. The approach was developed to overcome the subjectivity of manually categorizing the scale data of landslide conditioning factors, and to predict a rainfall-induced landslide susceptibility map for Kuala Lumpur city and surrounding areas using a geographic information system (GIS). The main objective of this article is to use the CHAID method to find the best classification fit for each conditioning factor and then combine it with logistic regression (LR). The LR model was used to find the corresponding coefficients of the best-fitting function that assesses the optimal terminal nodes. A cluster pattern of landslide locations was extracted in a previous study using the nearest neighbour index (NNI), which was then used to identify the range of clustered landslide locations. Clustered locations were used as model training data together with 14 landslide conditioning factors, such as topographically derived parameters, lithology, NDVI, and land use and land cover maps. The Pearson chi-squared value was used to find the best classification fit between the dependent variable and the conditioning factors. Finally, the relationships between the conditioning factors were assessed and the landslide susceptibility map (LSM) was produced. The area under the curve (AUC) was used to test the model's reliability and prediction capability with the training and validation landslide locations, respectively. This study demonstrated the efficiency and reliability of the decision tree (DT) model in landslide susceptibility mapping and provided a valuable scientific basis for spatial decision making in planning and urban management studies.
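    A compact scikit-learn sketch of the decision-tree / logistic-regression integration; CART trees are used here as a stand-in for CHAID (which scikit-learn does not provide), and the tree depth and one-hot encoding are illustrative choices.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.preprocessing import OneHotEncoder
    from sklearn.linear_model import LogisticRegression

    def dt_lr_susceptibility(factors, labels):
        """Decision-tree + logistic-regression integration for susceptibility mapping.

        Each conditioning factor (column of `factors`) is discretised by a shallow
        tree fitted against binary landslide/non-landslide `labels`; the resulting
        terminal-node memberships are one-hot encoded and fed to logistic
        regression, whose predicted probabilities form the susceptibility index.
        """
        leaf_ids = np.column_stack([
            DecisionTreeClassifier(max_depth=2, random_state=0)
            .fit(col.reshape(-1, 1), labels)
            .apply(col.reshape(-1, 1))                 # terminal node of every sample
            for col in factors.T
        ])
        X = OneHotEncoder(handle_unknown="ignore").fit_transform(leaf_ids)
        lr = LogisticRegression(max_iter=1000).fit(X, labels)
        return lr.predict_proba(X)[:, 1]               # landslide susceptibility values
    ```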

  11. Landslide susceptibility mapping using decision-tree based CHi-squared automatic interaction detection (CHAID) and Logistic regression (LR) integration

    Science.gov (United States)

    Althuwaynee, Omar F.; Pradhan, Biswajeet; Ahmad, Noordin

    2014-06-01

    This article uses a methodology based on chi-squared automatic interaction detection (CHAID), a multivariate method with an automatic classification capacity, to analyse large numbers of landslide conditioning factors. The approach was developed to overcome the subjectivity of manually categorizing the scale data of landslide conditioning factors, and to predict a rainfall-induced landslide susceptibility map for Kuala Lumpur city and surrounding areas using a geographic information system (GIS). The main objective of this article is to use the CHAID method to find the best classification fit for each conditioning factor and then combine it with logistic regression (LR). The LR model was used to find the corresponding coefficients of the best-fitting function that assesses the optimal terminal nodes. A cluster pattern of landslide locations was extracted in a previous study using the nearest neighbour index (NNI), which was then used to identify the range of clustered landslide locations. Clustered locations were used as model training data together with 14 landslide conditioning factors, such as topographically derived parameters, lithology, NDVI, and land use and land cover maps. The Pearson chi-squared value was used to find the best classification fit between the dependent variable and the conditioning factors. Finally, the relationships between the conditioning factors were assessed and the landslide susceptibility map (LSM) was produced. The area under the curve (AUC) was used to test the model's reliability and prediction capability with the training and validation landslide locations, respectively. This study demonstrated the efficiency and reliability of the decision tree (DT) model in landslide susceptibility mapping and provided a valuable scientific basis for spatial decision making in planning and urban management studies.

  12. Human Detection System by Fusing Depth Map-Based Method and Convolutional Neural Network-Based Method

    Directory of Open Access Journals (Sweden)

    Anh Vu Le

    2017-01-01

    In this paper, the depth images and the colour images provided by Kinect sensors are used to enhance the accuracy of human detection. The depth-based human detection method is fast but less accurate. On the other hand, the faster region convolutional neural network-based human detection method is accurate but requires a rather complex hardware configuration. To simultaneously leverage the advantages and relieve the drawbacks of each method, a system with one master and one client is proposed. The final goal is to build a novel Robot Operating System (ROS)-based Perception Sensor Network (PSN) system, which is more accurate and ready for real-time application. The experimental results demonstrate that the proposed method outperforms other conventional methods in the challenging scenarios.

  13. A voting-based statistical cylinder detection framework applied to fallen tree mapping in terrestrial laser scanning point clouds

    Science.gov (United States)

    Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe

    2017-07-01

    This paper introduces a statistical framework for detecting cylindrical shapes in dense point clouds. We target the application of mapping fallen trees in datasets obtained through terrestrial laser scanning. This is a challenging task due to the presence of ground vegetation, standing trees, DTM artifacts, as well as the fragmentation of dead trees into non-collinear segments. Our method shares the concept of voting in parameter space with the generalized Hough transform, however two of its significant drawbacks are improved upon. First, the need to generate samples on the shape's surface is eliminated. Instead, pairs of nearby input points lying on the surface cast a vote for the cylinder's parameters based on the intrinsic geometric properties of cylindrical shapes. Second, no discretization of the parameter space is required: the voting is carried out in continuous space by means of constructing a kernel density estimator and obtaining its local maxima, using automatic, data-driven kernel bandwidth selection. Furthermore, we show how the detected cylindrical primitives can be efficiently merged to obtain object-level (entire tree) semantic information using graph-cut segmentation and a tailored dynamic algorithm for eliminating cylinder redundancy. Experiments were performed on 3 plots from the Bavarian Forest National Park, with ground truth obtained through visual inspection of the point clouds. It was found that relative to sample consensus (SAC) cylinder fitting, the proposed voting framework can improve the detection completeness by up to 10 percentage points while maintaining the correctness rate.

  14. Gains in QTL detection using an ultra-high density SNP map based on population sequencing relative to traditional RFLP/SSR markers.

    Directory of Open Access Journals (Sweden)

    Huihui Yu

    Huge efforts have been invested in the last two decades to dissect the genetic bases of complex traits, including the yields of many crop plants, through quantitative trait locus (QTL) analyses. However, almost all of these studies were based on linkage maps constructed using low-throughput molecular markers, e.g. restriction fragment length polymorphisms (RFLPs) and simple sequence repeats (SSRs), and thus are mostly of low density and unable to provide precise and complete information about the numbers and locations of the genes or QTLs controlling the traits. In this study, we constructed an ultra-high density genetic map based on high quality single nucleotide polymorphisms (SNPs) from low-coverage sequences of a recombinant inbred line (RIL) population of rice, generated using new sequencing technology. The quality of the map was assessed by validating the positions of several cloned genes, including GS3 and GW5/qSW5, two major QTLs for grain length and grain width respectively, and OsC1, a qualitative trait locus for pigmentation. In all cases the loci could be precisely resolved to the bins where the genes are located, indicating the high quality and accuracy of the map. The SNP map was used to perform QTL analysis for yield and three yield-component traits (number of tillers per plant, number of grains per panicle and grain weight), using data from field trials conducted over years, in comparison to QTL mapping based on RFLPs/SSRs. The SNP map detected more QTLs, especially for grain weight, with precise map locations, demonstrating advantages in detection power and resolution relative to the RFLP/SSR map. Thus this study provides an example of ultra-high density map construction using sequencing technology. Moreover, the results obtained are helpful for understanding the genetic bases of the yield traits and for fine mapping and cloning of QTLs.

  15. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Sporring, Jon; Fogh Olsen, Ole

    2008-01-01

    . To address this problem, we introduce a photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across. In this way, we preserve important illumination features, while...

  16. BaseMap

    Data.gov (United States)

    California Natural Resource Agency — The goal of this project is to provide a convenient base map that can be used as a starting point for CA projects. It's simple, but designed to work at a number of...

  17. Sea Ice Detection Based on Differential Delay-Doppler Maps from UK TechDemoSat-1

    Directory of Open Access Journals (Sweden)

    Yongchao Zhu

    2017-07-01

    Global Navigation Satellite System (GNSS) signals can be exploited to remotely sense the atmosphere and the land and ocean surface to retrieve a range of geophysical parameters. This paper proposes two new methods, termed power-summation of differential Delay-Doppler Maps (PS-D) and pixel-number of differential Delay-Doppler Maps (PN-D), to distinguish between sea ice and sea water using differential Delay-Doppler Maps (dDDMs). PS-D and PN-D use the power summation and the pixel number of dDDMs, respectively, to measure the degree of difference between two DDMs and thereby determine the transition state (water-water, water-ice, ice-ice or ice-water), so that ice and water are detected. Moreover, adaptive incoherent averaging of DDMs is employed to improve the computational efficiency. A large number of DDMs recorded by UK TechDemoSat-1 (TDS-1) over the Arctic region are used to test the proposed sea ice detection methods. Evaluated against ground-truth measurements from the Ocean and Sea Ice SAF, the proposed PS-D and PN-D methods achieve a probability of detection of 99.72% and 99.69% respectively, while the probability of false detection is 0.28% and 0.31% respectively.
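    A minimal numpy sketch of the PS-D idea; the peak normalisation and the single fixed threshold are assumptions, and the PN-D variant and the adaptive incoherent averaging are omitted.

    ```python
    import numpy as np

    def ps_d_transition(ddm_prev, ddm_curr, threshold):
        """Power-summation test on a differential Delay-Doppler Map (PS-D).

        The absolute power difference between two consecutive, normalised DDMs
        is summed; a small sum means the surface type is unchanged, while a
        large sum indicates a water-ice or ice-water transition.
        """
        a = ddm_prev / (np.max(ddm_prev) + 1e-12)
        b = ddm_curr / (np.max(ddm_curr) + 1e-12)
        score = np.abs(b - a).sum()                  # power summation of the dDDM
        return "transition" if score > threshold else "same surface"
    ```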

  18. Detection of myocardial 123I-BMIPP distribution abnormality in patients with ischemic heart disease based on normal data file in Bull's-eye polar map

    International Nuclear Information System (INIS)

    Takahashi, Nobukazu; Ishida, Yoshio; Hirose, Yoshiaki; Kawano, Shigeo; Fukuoka, Syuji; Hayashida, Kohei; Takamiya, Makoto; Nonogi, Hiroshi

    1995-01-01

    Visual interpretation of 123I-BMIPP (BMIPP) myocardial images has difficulty in detecting mild reductions in tracer uptake. We studied the value of objective assessment of myocardial BMIPP maldistribution at rest, using a Bull's-eye map and its normal data file, for detecting ischemic heart disease. Twenty-nine patients, 15 with prior myocardial infarction and 14 with effort angina, were studied. The initial 15-min BMIPP image was evaluated by visual analysis and by generating the extent Bull's-eye map, which highlights regions with % uptake reduced below the mean - 2SD of 10 normal controls. The sensitivity for determining coronary lesions in non-infarcted myocardial regions with the extent map was superior to that of visual analysis (67% vs. 33%). Among the regions supplied by a stenotic coronary artery, those that were visually negative but positive in the map, and those that were positive in both, had a higher incidence of wall motion abnormalities and severe coronary stenosis than those with normal findings in both. These results suggest that objective assessment based on the normal data file in a Bull's-eye polar map is clinically important for overcoming the limitations of visual interpretation in 123I-BMIPP imaging. (author)
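    The mean - 2SD rule used to build the extent map translates directly into a few lines of numpy; the array shapes (one value per polar-map sector) are an assumption made for this sketch.

    ```python
    import numpy as np

    def extent_map(polar_map, normal_maps):
        """Flag polar-map sectors with abnormally low tracer uptake.

        `polar_map` holds the per-sector %uptake of one patient, `normal_maps`
        the corresponding values of the normal database (one row per control).
        Sectors below mean - 2*SD of the controls form the extent map.
        """
        mu = normal_maps.mean(axis=0)
        sd = normal_maps.std(axis=0, ddof=1)
        return polar_map < (mu - 2.0 * sd)        # boolean extent map of reduced uptake
    ```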

  19. Forest Disturbance Mapping Using Dense Synthetic Landsat/MODIS Time-Series and Permutation-Based Disturbance Index Detection

    Directory of Open Access Journals (Sweden)

    David Frantz

    2016-03-01

    Full Text Available Spatio-temporal information on process-based forest loss is essential for a wide range of applications. Despite remote sensing being the only feasible means of monitoring forest change at regional or greater scales, there is no retrospectively available remote sensor that meets the demand of monitoring forests with the required spatial detail and guaranteed high temporal frequency. As an alternative, we employed the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) to produce a dense synthetic time series by fusing Landsat and Moderate Resolution Imaging Spectroradiometer (MODIS) nadir Bidirectional Reflectance Distribution Function (BRDF) adjusted reflectance. Forest loss was detected by applying a multi-temporal disturbance detection approach implementing a Disturbance Index-based detection strategy. The detection thresholds were permuted with random numbers drawn from the normal distribution in order to generate a multi-dimensional threshold confidence area. As a result, a more robust parameterization and a spatially more coherent detection could be achieved. (i) The original Landsat time series; (ii) the synthetic time series; and (iii) a combined hybrid approach were used to identify the timing and extent of disturbances. The identified clearings in the Landsat detection were verified using an annual woodland clearing dataset from Queensland's Statewide Landcover and Trees Study. Disturbances caused by stand-replacing events were successfully identified. The increased temporal resolution of the synthetic time series indicated promising additional information on disturbance timing. The results of the hybrid detection unified the benefits of both approaches, i.e., the spatial quality and general accuracy of the Landsat detection and the increased temporal information of the synthetic time series. Results indicated that a temporal improvement in the detection of the disturbance date could be achieved relative to the irregularly spaced Landsat

  20. A Hybrid Vision-Map Method for Urban Road Detection

    Directory of Open Access Journals (Sweden)

    Carlos Fernández

    2017-01-01

    Full Text Available A hybrid vision-map system is presented to solve the road detection problem in urban scenarios. The standardized use of machine learning techniques in classification problems has been merged with digital navigation map information to increase system robustness. The objective of this paper is to create a new environment perception method to detect the road in urban environments, fusing stereo vision with digital maps by detecting road appearance and road limits such as lane markings or curbs. Deep learning approaches make the system hard-coupled to the training set. Even though our approach is based on machine learning techniques, the features are calculated from different sources (GPS, map, curbs, etc.), making our system less dependent on the training set.

  1. USGS Topo Base Map from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Topographic Base Map from The National Map. This tile cached web map service combines the most current data services (Boundaries, Names, Transportation,...

  2. Detecting domestic violence: Showcasing a knowledge browser based on formal concept analysis and emergent self organizing maps

    NARCIS (Netherlands)

    Elzinga, P.; Poelmans, J.; Viaene, S.; Dedene, G.; Cordeiro, J.; Filipe, J.

    2009-01-01

    Over 90% of the case data from police inquiries is stored as unstructured text in police databases. We use the combination of Formal Concept Analysis and Emergent Self Organizing Maps for exploring a dataset of unstructured police reports out of the Amsterdam-Amstelland police region in the

  3. A combined approach based on MAF analysis and AHP method to fault detection mapping: A case study from a gas field, southwest of Iran

    Science.gov (United States)

    Shakiba, Sima; Asghari, Omid; Khah, Nasser Keshavarz Faraj

    2018-01-01

    A combined geostatistical methodology based on Min/Max Auto-correlation Factor (MAF) analysis and the Analytical Hierarchy Process (AHP) is presented to generate a suitable Fault Detection Map (FDM) from seismic attributes. Five seismic attributes derived from a 2D time slice of data from a gas field located in southwest Iran are used: instantaneous amplitude, similarity, energy, frequency, and Fault Enhancement Filter (FEF). The MAF analysis is implemented to reduce the dimension of the input variables, and the AHP method is then applied to the three resulting de-correlated MAF factors as evidential layers. Three Decision Makers (DMs) are used to construct pairwise comparison matrices (PCMs) for determining the weights of the selected evidential layers. Finally, the weights obtained by AHP are multiplied by the normalized values of each alternative (the MAF layers), and the resulting weighted layers are integrated to prepare the final FDM. The results show that the proposed algorithm generates a map more acceptable than each individual attribute, sharpening non-surface discontinuities as well as enhancing the continuity of detected faults.
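
    The AHP weighting step lends itself to a short sketch: criterion weights are taken as the normalized principal eigenvector of a pairwise comparison matrix, and the weighted, normalized MAF layers are then summed into the fault detection map. The 3x3 matrix below is a purely hypothetical example, not the judgements of the study's decision makers.

      import numpy as np

      def ahp_weights(pcm):
          """AHP weights = normalized principal eigenvector of the pairwise
          comparison matrix (PCM)."""
          eigvals, eigvecs = np.linalg.eig(pcm)
          principal = eigvecs[:, np.argmax(eigvals.real)].real
          return principal / principal.sum()

      def fuse_layers(layers, weights):
          """Weighted linear combination of normalized evidential layers."""
          return sum(w * layer for w, layer in zip(weights, layers))

      # Hypothetical PCM for the three de-correlated MAF factors
      pcm = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
      weights = ahp_weights(pcm)   # roughly [0.65, 0.23, 0.12]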

  4. Visually directed vs. software-based targeted biopsy compared to transperineal template mapping biopsy in the detection of clinically significant prostate cancer.

    Science.gov (United States)

    Valerio, Massimo; McCartan, Neil; Freeman, Alex; Punwani, Shonit; Emberton, Mark; Ahmed, Hashim U

    2015-10-01

    Targeted biopsy based on cognitive or software magnetic resonance imaging (MRI) to transrectal ultrasound registration seems to increase the detection rate of clinically significant prostate cancer as compared with standard biopsy. However, these strategies have not yet been directly compared against an accurate reference test. The aim of this study was to obtain pilot data on the diagnostic ability of visually directed targeted biopsy vs. software-based targeted biopsy, considering transperineal template mapping (TPM) biopsy as the reference test. This prospective paired cohort study included 50 consecutive men undergoing TPM with one or more visible targets detected on preoperative multiparametric MRI. Targets were contoured on the Biojet software. Patients initially underwent software-based targeted biopsies, then visually directed targeted biopsies, and finally systematic TPM. The detection rate of clinically significant disease (Gleason score ≥3+4 and/or maximum cancer core length ≥4 mm) of one strategy against another was compared by 3×3 contingency tables. Secondary analyses were performed using a stricter threshold of significance (Gleason score ≥4+3 and/or maximum cancer core length ≥6 mm). Median age was 68 (interquartile range: 63-73); median prostate-specific antigen level was 7.9 ng/mL (6.4-10.2). A total of 79 targets were detected with a mean of 1.6 targets per patient. Of these, 27 (34%), 28 (35%), and 24 (31%) were scored 3, 4, and 5, respectively. At a patient level, the detection rate was 32 (64%), 34 (68%), and 38 (76%) for visually directed targeted biopsy, software-based targeted biopsy, and TPM, respectively. Combining the 2 targeted strategies would have led to a detection rate of 39 (78%). At a patient level and at a target level, software-based targeted biopsy found more clinically significant disease than did visually directed targeted biopsy, although this was not statistically significant (22% vs. 14%, P = 0.48; 51.9% vs. 44.3%, P = 0.24). Secondary

  5. Epileptic Seizure Detection based on Wavelet Transform Statistics Map and EMD Method for Hilbert-Huang Spectral Analyzing in Gamma Frequency Band of EEG Signals

    Directory of Open Access Journals (Sweden)

    Morteza Behnam

    2015-08-01

    Full Text Available Seizure detection using brain signal (EEG) analysis is an important clinical method for drug therapy and for decisions before brain surgery. In this paper, after signal conditioning with suitable filtering, the Gamma frequency band is extracted and the other brain rhythms, ambient noise and other bio-signals are canceled. Then, the wavelet transform of the brain signal is computed and its map is formed at multiple levels. By dividing the color map into epochs, the histogram of each sub-image is obtained and its statistics, based on statistical moments and negentropy values, are calculated. The statistical feature vector is reduced to one dimension using Principal Component Analysis (PCA). Using the EMD algorithm and the sifting procedure to analyze the data with Intrinsic Mode Functions (IMFs), and computing the residues of the brain signal via the Hilbert transform spectrum and the resulting Hilbert-Huang spectrum, one spatial feature based on the Euclidean distance is obtained for signal classification. With a K-Nearest Neighbor (KNN) classifier and an optimal neighbor parameter, EEG signals are classified into two classes, seizure and non-seizure, with an accuracy of 76.54% and an error variance of 0.3685 across the different tests.
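
    A compact sketch of the final classification stage is given below, assuming the preceding steps (gamma-band filtering, wavelet map statistics, EMD and the Hilbert-Huang distance feature) have already produced per-epoch feature arrays; the function and parameter names are illustrative, and scikit-learn is used merely as a convenient stand-in for the classifiers described.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier

      def classify_epochs(stat_features, spectral_distance, labels, k=5):
          """Reduce the histogram/negentropy statistics to one dimension with PCA,
          pair them with the Hilbert-Huang distance feature, and train a KNN
          classifier separating seizure from non-seizure epochs."""
          stat_1d = PCA(n_components=1).fit_transform(stat_features)   # (n_epochs, 1)
          X = np.column_stack([stat_1d, spectral_distance])            # (n_epochs, 2)
          return KNeighborsClassifier(n_neighbors=k).fit(X, labels)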

  6. Mapping epistasis and environment × QTX interaction based on four -omics genotypes for the detected QTX loci controlling complex traits in tobacco

    Directory of Open Access Journals (Sweden)

    Liyuan Zhou

    2013-12-01

    Full Text Available Using newly developed methods and software, association mapping was conducted for chromium content and total sugar in tobacco leaf, based on four -omics datasets. Our objective was to collect data on genotype and phenotype for 60 leaf samples at four developmental stages, from three plant architectural positions and for three cultivars that were grown in two locations. Association mapping was conducted to detect genetic variants at quantitative trait SNP (QTS) loci, quantitative trait transcript (QTT) differences, quantitative trait protein (QTP) variability, and quantitative trait metabolite (QTM) changes, which can be summarized as QTX locus variation. The total heritabilities of the four -omics loci for both traits tested were 23.60% for epistasis and 15.26% for treatment interaction. Epistasis and environment × treatment interaction had important impacts on complex traits at all -omics levels. For decreasing chromium content and increasing total sugar in tobacco leaf, six methylated loci can be directly used for marker-assisted selection, and expression of ten QTTs, seven QTPs and six QTMs can be modified by selection or cultivation.

  7. Positive maps, majorization, entropic inequalities and detection of entanglement

    International Nuclear Information System (INIS)

    Augusiak, R; Stasinska, J

    2009-01-01

    In this paper, we discuss some general connections between the notions of positive map, weak majorization and entropic inequalities in the context of detection of entanglement among bipartite quantum systems. First, based on the fact that any positive map Λ: M_d(C) → M_d(C) can be written as the difference between two completely positive maps, Λ = Λ_1 − Λ_2, we propose a possible way to generalize the Nielsen-Kempe majorization criterion. Then, we present two methods of derivation of some general classes of entropic inequalities useful for the detection of entanglement. While the first one follows from the aforementioned generalized majorization relation and the concept of Schur-concave decreasing functions, the second is based on some functional inequalities. What is important is that, contrary to the Nielsen-Kempe majorization criterion and entropic inequalities, our criteria allow for the detection of entangled states with positive partial transposition when using indecomposable positive maps. We also point out that if a state with at least one maximally mixed subsystem is detected by some necessary criterion based on the positive map Λ, then there exist entropic inequalities derived from Λ (by both procedures) that also detect this state. In this sense, they are equivalent to the necessary criterion [I⊗Λ](ρ_AB) ≥ 0. Moreover, our inequalities provide a way of constructing multi-copy entanglement witnesses and therefore are promising from the experimental point of view. Finally, we discuss some of the derived inequalities in the context of the recently introduced protocol of state merging and the possibility of approximating the mean value of a linear entanglement witness.

  8. AN INVESTIGATION OF AUTOMATIC CHANGE DETECTION FOR TOPOGRAPHIC MAP UPDATING

    Directory of Open Access Journals (Sweden)

    P. Duncan

    2012-08-01

    Full Text Available Changes to the landscape are constantly occurring and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured, so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting and capturing these changes. These manual methods are time consuming and labour intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated detects changes through image classification as well as spatial analysis and is focussed on urban landscapes. The major data inputs into this study are high resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large scale land-use mapping and that object-oriented approaches hold more promise. Even in the instance of object-oriented image classification, generalization of techniques on a broad scale has provided inconsistent results. A solution may lie with a hybrid approach of pixel and object-oriented techniques.

  9. USGS Imagery Only Base Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — USGS Imagery Only is a tile cache base map of orthoimagery in The National Map visible to the 1:18,000 scale. Orthoimagery data are typically high resolution images...

  10. Simultaneous analysis of cerebrospinal fluid biomarkers using microsphere-based xMAP multiplex technology for early detection of Alzheimer's disease.

    Science.gov (United States)

    Kang, Ju-Hee; Vanderstichele, Hugo; Trojanowski, John Q; Shaw, Leslie M

    2012-04-01

    The xMAP-Luminex multiplex platform for measurement of Alzheimer's disease (AD) cerebrospinal fluid (CSF) biomarkers using Innogenetics AlzBio3 immunoassay reagents, which are for research use only, has been shown to be an effective tool for early detection of an AD-like biomarker signature based on concentrations of CSF Aβ(1-42), t-tau and p-tau(181). Among the several advantages of the xMAP-Luminex platform for AD CSF biomarkers are: a wide dynamic range of ready-to-use calibrators, time savings from the simultaneous analysis of three biomarkers in one analytical run, reduction of human error, potentially reduced reagent costs, and a modest reduction of sample volume as compared to conventional enzyme-linked immunosorbent assay (ELISA) methodology. Recent clinical studies support the use of CSF Aβ(1-42), t-tau and p-tau(181) measurement using the xMAP-Luminex platform for the early detection of AD pathology in cognitively normal individuals, and for prediction of progression to AD dementia in subjects with mild cognitive impairment (MCI). Studies that have shown prediction of the risk of progression to AD dementia in MCI patients provide the basis for the use of CSF Aβ(1-42), t-tau and p-tau(181) testing to assign risk of progression in patients enrolled in therapeutic trials. Furthermore, emerging study data suggest that these pathologic changes occur in cognitively normal subjects 20 or more years before the onset of clinically detectable memory changes, thus providing an objective measurement for use in the assessment of treatment effects in primary treatment trials. However, numerous previous ELISA and Luminex-based multiplex studies reported a wide range of absolute values of CSF Aβ(1-42), t-tau and p-tau(181), indicative of substantial inter-laboratory variability as well as varying degrees of intra-laboratory imprecision. In order to address these issues, a recent inter-laboratory investigation that included a common set of CSF pool aliquots from

  11. European wet deposition maps based on measurements

    NARCIS (Netherlands)

    Leeuwen EP van; Erisman JW; Draaijers GPJ; Potma CJM; Pul WAJ van; LLO

    1995-01-01

    To date, wet deposition maps on a European scale have been based on long-range transport model results. For most components wet deposition maps based on measurements are only available on national scales. Wet deposition maps of acidifying components and base cations based on measurements are needed

  12. USGS Imagery Topo Base Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — USGS Imagery Topo is a topographic tile cache base map with orthoimagery as a backdrop, and combines the most current data (Boundaries, Names, Transportation,...

  13. USGS Hill Shade Base Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — USGS Hill Shade (or Shaded Relief) is a tile cache base map created from the National Elevation Dataset (NED), a seamless dataset of best available raster elevation...

  14. USGS Topo Base Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — USGS Topo is a topographic tile cache base map that combines the most current data (Boundaries, Names, Transportation, Elevation, Hydrography, Land Cover, and other...

  15. Vision-based mapping with cooperative robots

    Science.gov (United States)

    Little, James J.; Jennings, Cullen; Murray, Don

    1998-10-01

    Two stereo-vision-based mobile robots navigate and autonomously explore their environment safely while building occupancy grid maps of the environment. The robots maintain position estimates within a global coordinate frame using landmark recognition. This allows them to build a common map by sharing position information and stereo data. Stereo vision processing and map updates are done at 3 Hz and the robots move at speeds of 200 cm/s. Cooperative mapping is achieved through autonomous exploration of unstructured and dynamic environments. The map is constructed conservatively, so as to be useful for collision-free path planning. Each robot maintains a separate copy of a shared map, and then posts updates to the common map when it returns to observe a landmark at home base. Issues include synchronization, mutual localization, navigation, exploration, registration of maps, merging repeated views (fusion), centralized vs decentralized maps.

  16. Spectroscopic detection and mapping of vinyl cyanide on Titan

    Science.gov (United States)

    Cordiner, Martin; Yukiko Palmer, Maureen; Lai, James; Nixon, Conor A.; Teanby, Nicholas; Charnley, Steven B.; Vuitton, Veronique; Kisiel, Zbigniew; Irwin, Patrick; Molter, Ned; Mumma, Michael J.

    2017-10-01

    The first spectroscopic detection of vinyl cyanide (otherwise known as acrylonitrile; C2H3CN) on Titan was obtained by Palmer et al. (2017), based on three rotational emission lines observed with ALMA at millimeter wavelengths (in receiver band 6). The astrobiological significance of this detection was highlighted due to the theorized ability of C2H3CN molecules to combine into cell membrane-like structures under the cold conditions found in Titan's hydrocarbon lakes. Here we report the detection of three additional C2H3CN transitions at higher frequencies (from ALMA band 7 flux calibration data). We present the first emission maps for this gas on Titan, and compare the molecular distribution with that of other nitriles observed with ALMA including HC3N, CH3CN, C2H5CN and HNC. The molecular abundance patterns are interpreted based on our understanding of Titan's high-altitude photochemistry and time-variable global circulation. Similar to the short-lived HC3N molecule, vinyl cyanide is found to be most abundant in the vicinity of the southern (winter) pole, whereas the longer-lived CH3CN is more concentrated in the north. The vertical abundance profile of C2H3CN (from radiative transfer modeling), as well as its latitudinal distribution, are consistent with a short photochemical lifetime for this species. Complementary results from our more recent (2017) nitrile mapping studies at higher spatial resolution will also be discussed. REFERENCES: Palmer, M. Y., Cordiner, M. A., Nixon, C. A. et al. "ALMA detection and astrobiological potential of vinyl cyanide on Titan", Sci. Adv. 2017, 3, e1700022

  17. Smartphones Based Mobile Mapping Systems

    Directory of Open Access Journals (Sweden)

    A. Al-Hamad

    2014-06-01

    Full Text Available The past 20 years have witnessed an explosive growth in the demand for geo-spatial data. This demand has numerous sources and takes many forms; however, the net effect is an ever-increasing thirst for data that is more accurate, has higher density, is produced more rapidly, and is acquired less expensively. For mapping and Geographic Information Systems (GIS) projects, this has been achieved through the major development of Mobile Mapping Systems (MMS). MMS integrate various navigation and remote sensing technologies which allow mapping from moving platforms (e.g. cars, airplanes, boats, etc.) to obtain the 3D coordinates of the points of interest. Such systems obtain accuracies that are suitable for all but the most demanding mapping and engineering applications. However, this accuracy doesn't come cheaply. As a consequence of the platform and navigation and mapping technologies used, even an "inexpensive" system costs well over 200 000 USD. Today's mobile phones are getting ever more sophisticated. Phone makers are determined to reduce the gap between computers and mobile phones. Smartphones, in addition to becoming status symbols, are increasingly being equipped with extended Global Positioning System (GPS) capabilities, Micro Electro Mechanical System (MEMS) inertial sensors, extremely powerful computing power and very high resolution cameras. Using all of these components, smartphones have the potential to replace the traditional land MMS and portable GPS/GIS equipment. This paper introduces an innovative application of smartphones as a very low cost portable MMS for mapping and GIS applications.

  18. Smartphones Based Mobile Mapping Systems

    Science.gov (United States)

    Al-Hamad, A.; El-Sheimy, N.

    2014-06-01

    The past 20 years have witnessed an explosive growth in the demand for geo-spatial data. This demand has numerous sources and takes many forms; however, the net effect is an ever-increasing thirst for data that is more accurate, has higher density, is produced more rapidly, and is acquired less expensively. For mapping and Geographic Information Systems (GIS) projects, this has been achieved through the major development of Mobile Mapping Systems (MMS). MMS integrate various navigation and remote sensing technologies which allow mapping from moving platforms (e.g. cars, airplanes, boats, etc.) to obtain the 3D coordinates of the points of interest. Such systems obtain accuracies that are suitable for all but the most demanding mapping and engineering applications. However, this accuracy doesn't come cheaply. As a consequence of the platform and navigation and mapping technologies used, even an "inexpensive" system costs well over 200 000 USD. Today's mobile phones are getting ever more sophisticated. Phone makers are determined to reduce the gap between computers and mobile phones. Smartphones, in addition to becoming status symbols, are increasingly being equipped with extended Global Positioning System (GPS) capabilities, Micro Electro Mechanical System (MEMS) inertial sensors, extremely powerful computing power and very high resolution cameras. Using all of these components, smartphones have the potential to replace the traditional land MMS and portable GPS/GIS equipment. This paper introduces an innovative application of smartphones as a very low cost portable MMS for mapping and GIS applications.

  19. Comprehensive comparison of two image-based point clouds from aerial photos with airborne lidar for large-scale mapping : Door detection to envelope reconstruction

    NARCIS (Netherlands)

    Widyaningrum, E.; Gorte, B.G.H.

    2017-01-01

    The integration of computer vision and photogrammetry to generate three-dimensional (3D) information from images has contributed to a wider use of point clouds, for mapping purposes. Large-scale topographic map production requires 3D data with high precision and

  20. Nonlinear Algorithms for Channel Equalization and Map Symbol Detection.

    Science.gov (United States)

    Giridhar, K.

    The transfer of information through a communication medium invariably results in various kinds of distortion to the transmitted signal. In this dissertation, a feed-forward neural network-based equalizer, and a family of maximum a posteriori (MAP) symbol detectors are proposed for signal recovery in the presence of intersymbol interference (ISI) and additive white Gaussian noise. The proposed neural network-based equalizer employs a novel bit-mapping strategy to handle multilevel data signals in an equivalent bipolar representation. It uses a training procedure to learn the channel characteristics, and at the end of training, the multilevel symbols are recovered from the corresponding inverse bit-mapping. When the channel characteristics are unknown and no training sequences are available, blind estimation of the channel (or its inverse) and simultaneous data recovery is required. Convergence properties of several existing Bussgang-type blind equalization algorithms are studied through computer simulations, and a unique gain independent approach is used to obtain a fair comparison of their rates of convergence. Although simple to implement, the slow convergence of these Bussgang-type blind equalizers makes them unsuitable for many high data-rate applications. Rapidly converging blind algorithms based on the principle of MAP symbol-by-symbol detection are proposed, which adaptively estimate the channel impulse response (CIR) and simultaneously decode the received data sequence. Assuming a linear and Gaussian measurement model, the near-optimal blind MAP symbol detector (MAPSD) consists of a parallel bank of conditional Kalman channel estimators, where the conditioning is done on each possible data subsequence that can convolve with the CIR. This algorithm is also extended to the recovery of convolutionally encoded waveforms in the presence of ISI. Since the complexity of the MAPSD algorithm increases exponentially with the length of the assumed CIR, a suboptimal

  1. Exploiting Surroundedness for Saliency Detection: A Boolean Map Approach.

    Science.gov (United States)

    Zhang, Jianming; Sclaroff, Stan

    2016-05-01

    We demonstrate the usefulness of surroundedness for eye fixation prediction by proposing a Boolean Map based Saliency model (BMS). In our formulation, an image is characterized by a set of binary images, which are generated by randomly thresholding the image's feature maps in a whitened feature space. Based on a Gestalt principle of figure-ground segregation, BMS computes a saliency map by discovering surrounded regions via topological analysis of Boolean maps. Furthermore, we draw a connection between BMS and the Minimum Barrier Distance to provide insight into why and how BMS can properly capture the surroundedness cue via Boolean maps. The strength of BMS is verified by its simplicity, efficiency and superior performance compared with 10 state-of-the-art methods on seven eye tracking benchmark datasets.
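
    The surroundedness computation can be illustrated with a toy version of the Boolean-map idea, assuming the input is a list of 2-D feature maps (e.g. whitened color channels); the threshold count and the use of SciPy's hole-filling are implementation choices of this sketch, not details taken from the paper.

      import numpy as np
      from scipy.ndimage import binary_fill_holes

      def boolean_map_saliency(feature_maps, n_thresholds=8, seed=0):
          """Accumulate regions that do not touch the image border (i.e. are
          surrounded) over randomly thresholded Boolean maps."""
          rng = np.random.default_rng(seed)
          saliency = np.zeros_like(feature_maps[0], dtype=float)
          for fmap in feature_maps:
              for t in rng.uniform(fmap.min(), fmap.max(), n_thresholds):
                  for bmap in (fmap > t, fmap <= t):
                      # holes of bmap = regions of its complement enclosed by it
                      surrounded = binary_fill_holes(bmap) & ~bmap
                      saliency += surrounded
          peak = saliency.max()
          return saliency / peak if peak > 0 else saliency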

  2. Rate based failure detection

    Science.gov (United States)

    Johnson, Brett Emery Trabun; Gamage, Thoshitha Thanushka; Bakken, David Edward

    2018-01-02

    This disclosure describes, in part, a system management component and failure detection component for use in a power grid data network to identify anomalies within the network and systematically adjust the quality of service of data published by publishers and subscribed to by subscribers within the network. In one implementation, subscribers may identify a desired data rate, a minimum acceptable data rate, desired latency, minimum acceptable latency and a priority for each subscription. The failure detection component may identify an anomaly within the network and a source of the anomaly. Based on the identified anomaly, data rates and/or data paths may be adjusted in real-time to ensure that the power grid data network does not become overloaded and/or fail.

  3. High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.

    Science.gov (United States)

    Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue

    2010-11-13

    Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when those drugs are approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, was introduced by Google to support large-scale data intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it in mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model could improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
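
    As an illustration of how the PRR computation maps onto the paradigm, the toy sketch below emulates the map and reduce phases in memory over a list of spontaneous reports and gives the PRR formula; the record layout and function names are assumptions, and a real deployment would run the same logic on a MapReduce framework such as Hadoop.

      from collections import defaultdict
      from itertools import product

      def map_phase(report):
          """Map: one spontaneous report -> (drug, event) co-occurrence counts."""
          for drug, event in product(report["drugs"], report["events"]):
              yield (drug, event), 1

      def reduce_phase(key_value_pairs):
          """Reduce: aggregate counts per (drug, event) key."""
          counts = defaultdict(int)
          for key, value in key_value_pairs:
              counts[key] += value
          return counts

      def prr(a, b, c, d):
          """Proportional Reporting Ratio: (a/(a+b)) / (c/(c+d)), where a = reports
          with drug and event, b = drug without event, c = event without drug,
          d = neither."""
          return (a / (a + b)) / (c / (c + d))

      reports = [{"drugs": {"drugA"}, "events": {"nausea"}},
                 {"drugs": {"drugA", "drugB"}, "events": {"rash", "nausea"}}]
      cooccurrence = reduce_phase(kv for r in reports for kv in map_phase(r))

    The marginal totals needed for b, c and d can be accumulated in the same pass by also emitting per-drug and per-event counts.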

  4. Map based localization to assist commercial fleet operations.

    Science.gov (United States)

    2014-08-01

    This report outlines key recent contributions to the state of the art in lane detection, lane departure warning, and map-based sensor fusion algorithms. These key studies are used as a basis for a discussion about the limitations of systems that ...

  5. Mapping specific soil functions based on digital soil property maps

    Science.gov (United States)

    Pásztor, László; Fodor, Nándor; Farkas-Iványi, Kinga; Szabó, József; Bakacsi, Zsófia; Koós, Sándor

    2016-04-01

    Quantification of soil functions and services is a great challenge in itself, even before their spatial relevance is identified and regionalized. Proxies and indicators are widely used in ecosystem service mapping. Soil services could also be approximated by elementary soil features. One solution is the association of soil types with services as a basic principle. Soil property maps, however, provide quantified spatial information, which can be utilized more versatilely for the spatial inference of soil functions and services. In the frame of the activities referred to as "Digital, Optimized, Soil Related Maps and Information in Hungary" (DOSoReMI.hu), numerous soil property maps have been compiled so far with proper DSM techniques, partly according to GSM.net specifications, partly by slightly or more strictly changing some of its predefined parameters (depth intervals, pixel size, property etc.). The elaborated maps have been further utilized, since DOSoReMI.hu was also intended to take steps toward the regionalization of higher-level soil information (secondary properties, functions, services). In the meantime, the recently started AGRAGIS project requested spatial soil-related information in order to estimate agri-environmental impacts of climate change and support the associated vulnerability assessment. One of the most vulnerable services of soils in the context of climate change is their provisioning service. In our work it was approximated by productivity, which was estimated by sequential scenario-based crop modelling. The modelling took into consideration long-term (50-year) time series of both measured and predicted climatic parameters, as well as potential differences in agricultural practice and crop production. The flexible parametrization and multiple results of the modelling were then applied for the spatial assessment of sensitivity, vulnerability, exposure and adaptive capacity of soils in the context of the forecasted changes in

  6. Terrain Mapping and Obstacle Detection Using Gaussian Processes

    DEFF Research Database (Denmark)

    Kjærgaard, Morten; Massaro, Alessandro Salvatore; Bayramoglu, Enis

    2011-01-01

    In this paper we consider a probabilistic method for extracting terrain maps from a scene and use the information to detect potential navigation obstacles within it. The method uses Gaussian process regression (GPR) to predict an estimate function and its relative uncertainty. To test the new...... show that the estimated maps follow the terrain shape, while protrusions are identified and may be isolated as potential obstacles. Representing the data with a covariance function allows a dramatic reduction of the amount of data to process, while maintaining the statistical properties of the measured...... and interpolated features....
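
    A minimal sketch of the idea, assuming scattered terrain samples (x, y, z) from the robot's sensors; the kernel choice, the protrusion margin and scikit-learn (as a stand-in for the GPR described) are assumptions of this illustration.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      def fit_terrain(xy, z):
          """Fit a GP terrain model to (n, 2) ground coordinates and n heights."""
          kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
          return GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(xy, z)

      def detect_protrusions(gp, xy, z, margin=0.3, k=2.0):
          """Flag samples that rise above the predicted terrain by more than a
          fixed margin plus k times the local predictive uncertainty."""
          mean, std = gp.predict(xy, return_std=True)
          return z > mean + margin + k * std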

  7. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation result is obtained based on K-means clustering and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is applied. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge of the possible region segmentation for the next step (MRF), which gives an image that contains all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of the neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.

  8. Novelty Detection Classifiers in Weed Mapping: Silybum marianum Detection on UAV Multispectral Images.

    Science.gov (United States)

    Alexandridis, Thomas K; Tamouridou, Afroditi Alexandra; Pantazi, Xanthoula Eirini; Lagopodi, Anastasia L; Kashefi, Javid; Ovakoglou, Georgios; Polychronos, Vassilios; Moshou, Dimitrios

    2017-09-01

    In the present study, the detection and mapping of Silybum marianum (L.) Gaertn. weed using novelty detection classifiers is reported. A multispectral camera (green-red-NIR) on board a fixed wing unmanned aerial vehicle (UAV) was employed for obtaining high-resolution images. Four novelty detection classifiers were used to identify S. marianum between other vegetation in a field. The classifiers were One Class Support Vector Machine (OC-SVM), One Class Self-Organizing Maps (OC-SOM), Autoencoders and One Class Principal Component Analysis (OC-PCA). As input features to the novelty detection classifiers, the three spectral bands and texture were used. The S. marianum identification accuracy using OC-SVM reached an overall accuracy of 96%. The results show the feasibility of effective S. marianum mapping by means of novelty detection classifiers acting on multispectral UAV imagery.
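
    A sketch of the one-class formulation, assuming a (rows, cols, 4) feature image holding the green, red and NIR bands plus one texture measure, and training pixels sampled from known S. marianum patches; the scikit-learn classes stand in for the OC-SVM described, and the nu/gamma values are illustrative.

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import OneClassSVM

      def train_weed_detector(weed_pixels):
          """Fit a one-class SVM on (n_samples, n_features) vectors taken from
          known S. marianum locations."""
          scaler = StandardScaler().fit(weed_pixels)
          model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
          model.fit(scaler.transform(weed_pixels))
          return scaler, model

      def map_weeds(scaler, model, feature_image):
          """Classify every pixel; True where the model predicts the target class."""
          rows, cols, nfeat = feature_image.shape
          flat = scaler.transform(feature_image.reshape(-1, nfeat))
          return (model.predict(flat) == 1).reshape(rows, cols)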

  9. Hierarchical Self Organizing Map for Novelty Detection using Mobile Robot with Robust Sensor

    International Nuclear Information System (INIS)

    Sha'abani, M N A H; Miskon, M F; Sakidin, H

    2013-01-01

    This paper presents a novelty detection method based on a Self Organizing Map neural network using a mobile robot. Following a hierarchical neural network design, the network is divided into three networks: position, orientation and sensor measurement. A simulation was carried out to demonstrate and validate the proposed method using MobileSim. Three cases of abnormal events (new, missing and shifted objects) are employed for performance evaluation. The detection results were then filtered for false positive detections. The results show that the inspection produced less than 2% false positive detections at high sensitivity settings.

  10. Anomaly Detection for Beam Loss Maps in the Large Hadron Collider

    Science.gov (United States)

    Valentino, Gianluca; Bruce, Roderik; Redaelli, Stefano; Rossi, Roberto; Theodoropoulos, Panagiotis; Jaster-Merz, Sonja

    2017-07-01

    In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss map campaigns, as well as in standard operation. Due to the complexity of the system, consisting of more than 50 collimators per beam, it is difficult to identify small changes in the collimation hierarchy, which may be due to setting errors or beam orbit drifts with such methods. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy.
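
    The anomaly check described above can be sketched as follows, assuming each loss map is flattened into one vector of normalized beam loss monitor readings; the number of components, neighbours and the contamination fraction are illustrative choices, not the values used at the LHC.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neighbors import LocalOutlierFactor

      def flag_anomalous_loss_maps(loss_maps, n_components=10, contamination=0.02):
          """Project loss maps onto their principal components and score each map
          with Local Outlier Factor; True marks maps that deviate from the
          expected collimation hierarchy."""
          reduced = PCA(n_components=n_components).fit_transform(loss_maps)
          lof = LocalOutlierFactor(n_neighbors=20, contamination=contamination)
          return lof.fit_predict(reduced) == -1     # -1 = outlier, +1 = inlier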

  11. Anomaly Detection for Beam Loss Maps in the Large Hadron Collider

    International Nuclear Information System (INIS)

    Valentino, Gianluca; Bruce, Roderik; Redaelli, Stefano; Rossi, Roberto; Theodoropoulos, Panagiotis; Jaster-Merz, Sonja

    2017-01-01

    In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss map campaigns, as well as in standard operation. Due to the complexity of the system, consisting of more than 50 collimators per beam, it is difficult to identify small changes in the collimation hierarchy, which may be due to setting errors or beam orbit drifts with such methods. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy. (paper)

  12. Object-based Classification for Detecting Landslides and Stochastic Procedure to landslide susceptibility maps - A Case at Baolai Village, SW Taiwan

    Science.gov (United States)

    Lin, Ying-Tong; Chang, Kuo-Chen; Yang, Ci-Jian

    2017-04-01

    As a result of global warming in the past decades, Taiwan has experienced more and more extreme typhoons with hazardous massive landslides. In this study, we use an object-oriented analysis method to classify landslide areas at Baolai village using Formosat-2 satellite images. Multiresolution segmentation was used to generate the image objects, and hierarchical logic was used to classify five different kinds of features; the landslides were then classified into different types. In addition, we use a stochastic procedure to integrate landslide susceptibility maps. This study assumed the extreme event of 2009 Typhoon Morakot, in which precipitation reached 1991.5 mm in 5 days, together with the highest landslide susceptibility. The results show that the landslide area in the study area changed greatly; most landslides were caused by gully erosion producing dip-slope slides, or by stream erosion, especially at undercut banks. From the landslide susceptibility maps, we know that old landslide areas have a high potential for renewed landsliding in an extreme event. This study demonstrates the changes in landslide area and the landslide-susceptible areas. Keywords: Formosat-2, object-oriented, segmentation, classification, landslide, Baolai Village, SW Taiwan, FS

  13. Object-based Landslide Mapping: Examples, Challenges and Opportunities

    Science.gov (United States)

    Hölbling, Daniel; Eisank, Clemens; Friedl, Barbara; Chang, Kang-Tsung; Tsai, Tsai-Tsung; Birkefeldt Møller Pedersen, Gro; Betts, Harley; Cigna, Francesca; Chiang, Shou-Hao; Aubrey Robson, Benjamin; Bianchini, Silvia; Füreder, Petra; Albrecht, Florian; Spiekermann, Raphael; Weinke, Elisabeth; Blaschke, Thomas; Phillips, Chris

    2016-04-01

    Over the last decade, object-based image analysis (OBIA) has been increasingly used for mapping landslides that occur after triggering events such as heavy rainfall. The increasing availability and quality of Earth Observation (EO) data in terms of temporal, spatial and spectral resolution allows for comprehensive mapping of landslides at multiple scales. Most often very high resolution (VHR) or high resolution (HR) optical satellite images are used in combination with a digital elevation model (DEM) and its products such as slope and curvature. Semi-automated object-based mapping makes use of various characteristics of image objects that are derived through segmentation. OBIA enables numerous spectral, spatial, contextual and textural image object properties to be applied during an analysis. This is especially useful when mapping complex natural features such as landslides and constitutes an advantage over pixel-based image analysis. However, several drawbacks in the process of object-based landslide mapping have not been overcome yet. The developed classification routines are often rather complex and limited regarding their transferability across areas and sensors. There is still more research needed to further improve present approaches and to fully exploit the capabilities of OBIA for landslide mapping. In this study several examples of object-based landslide mapping from various geographical regions with different characteristics are presented. Examples from the Austrian and Italian Alps are shown, whereby one challenge lies in the detection of small-scale landslides on steep slopes while preventing the classification of false positives with similar spectral properties (construction areas, utilized land, etc.). Further examples feature landslides mapped in Iceland, where the differentiation of landslides from other landscape-altering processes in a highly dynamic volcanic landscape poses a very distinct challenge, and in Norway, which is exposed to multiple

  14. Global contrast based salient region detection

    KAUST Repository

    Cheng, Ming-Ming

    2011-08-25

    Reliable estimation of visual saliency allows appropriate processing of images without prior knowledge of their contents, and thus remains an important step in many computer vision tasks including image segmentation, object recognition, and adaptive compression. We propose a regional contrast based saliency extraction algorithm, which simultaneously evaluates global contrast differences and spatial coherence. The proposed algorithm is simple, efficient, and yields full resolution saliency maps. Our algorithm consistently outperformed existing saliency detection methods, yielding higher precision and better recall rates, when evaluated using one of the largest publicly available data sets. We also demonstrate how the extracted saliency map can be used to create high quality segmentation masks for subsequent image processing.

  15. Global contrast based salient region detection

    KAUST Repository

    Cheng, Ming-Ming; Zhang, Guo-Xin; Mitra, Niloy J.; Huang, Xiaolei; Hu, Shi-Min

    2011-01-01

    Reliable estimation of visual saliency allows appropriate processing of images without prior knowledge of their contents, and thus remains an important step in many computer vision tasks including image segmentation, object recognition, and adaptive compression. We propose a regional contrast based saliency extraction algorithm, which simultaneously evaluates global contrast differences and spatial coherence. The proposed algorithm is simple, efficient, and yields full resolution saliency maps. Our algorithm consistently outperformed existing saliency detection methods, yielding higher precision and better recall rates, when evaluated using one of the largest publicly available data sets. We also demonstrate how the extracted saliency map can be used to create high quality segmentation masks for subsequent image processing.

  16. Image denoising based on noise detection

    Science.gov (United States)

    Jiang, Yuanxiang; Yuan, Rui; Sun, Yuqiu; Tian, Jinwen

    2018-03-01

    Because of noise points in an image, any denoising operation will also change the original information of non-noise pixels. A noise detection algorithm based on fractional calculus is therefore proposed for denoising in this paper. First, the image is convolved to obtain directional gradient masks. Then, the mean gray level is calculated to obtain gradient detection maps, and a logical product is applied to acquire a noise-position image. Comparing the visual effect and evaluation parameters after processing, the experimental results show that the denoising algorithm based on noise detection outperforms traditional methods in both subjective and objective aspects.

  17. Generating Impact Maps from Automatically Detected Bomb Craters in Aerial Wartime Images Using Marked Point Processes

    Science.gov (United States)

    Kruse, Christian; Rottensteiner, Franz; Hoberg, Thorsten; Ziems, Marcel; Rebke, Julia; Heipke, Christian

    2018-04-01

    The aftermath of wartime attacks is often felt long after the war ended, as numerous unexploded bombs may still exist in the ground. Typically, such areas are documented in so-called impact maps which are based on the detection of bomb craters. This paper proposes a method for the automatic detection of bomb craters in aerial wartime images that were taken during the Second World War. The object model for the bomb craters is represented by ellipses. A probabilistic approach based on marked point processes determines the most likely configuration of objects within the scene. Adding and removing new objects to and from the current configuration, respectively, changing their positions and modifying the ellipse parameters randomly creates new object configurations. Each configuration is evaluated using an energy function. High gradient magnitudes along the border of the ellipse are favored and overlapping ellipses are penalized. Reversible Jump Markov Chain Monte Carlo sampling in combination with simulated annealing provides the global energy optimum, which describes the conformance with a predefined model. For generating the impact map a probability map is defined which is created from the automatic detections via kernel density estimation. By setting a threshold, areas around the detections are classified as contaminated or uncontaminated sites, respectively. Our results show the general potential of the method for the automatic detection of bomb craters and its automated generation of an impact map in a heterogeneous image stock.
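
    The final impact-map step (kernel density estimation over the detected crater centres, followed by thresholding) can be sketched as below; the bandwidth and density threshold are purely illustrative, and scikit-learn's KernelDensity stands in for whichever estimator the authors used.

      import numpy as np
      from sklearn.neighbors import KernelDensity

      def impact_map(crater_xy, grid_xy, bandwidth=50.0, threshold=1e-6):
          """Estimate a crater density over the detections and label grid cells
          as contaminated (True) or uncontaminated (False)."""
          kde = KernelDensity(kernel="gaussian", bandwidth=bandwidth).fit(crater_xy)
          density = np.exp(kde.score_samples(grid_xy))   # score_samples is log-density
          return density > threshold, density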

  18. Machine-based mapping of innovation portfolios

    NARCIS (Netherlands)

    de Visser, Matthias; Miao, Shengfa; Englebienne, Gwenn; Sools, Anna Maria; Visscher, Klaasjan

    2017-01-01

    Machine learning techniques show great promise for improving innovation portfolio management. In this paper we experiment with different methods to classify innovation projects of a high-tech firm as either explorative or exploitative, and compare the results with a manual, theory-based mapping of

  19. Mapping of networks to detect priority zoonoses in Jordan

    Directory of Open Access Journals (Sweden)

    Erin M Sorrell

    2015-10-01

    Full Text Available Early detection of emerging disease events is a priority focus area for cooperative bioengagement programs. Communication and coordination among national disease surveillance and response networks are essential for timely detection and control of a public health event. Although systematic information sharing between the human and animal health sectors can help stakeholders detect and respond to zoonotic diseases rapidly, resource constraints and other barriers often prevent efficient cross-sector reporting. The purpose of this research project was to map the laboratory and surveillance networks currently in place for detecting and reporting priority zoonotic diseases in Jordan in order to identify the nodes of communication, coordination, and decision-making where health and veterinary sectors intersect, and to identify priorities and gaps that limit information-sharing for action. We selected three zoonotic diseases as case studies: highly pathogenic avian influenza (HPAI) H5N1, rabies, and brucellosis. Through meetings with government agencies and health officials, and desk research, we mapped each system from the index case through response – including both surveillance and laboratory networks, highlighting both areas of strength and those that would benefit from capacity-building resources. Our major findings indicate informal communication exists across sectors; in the event of emergence of one of the priority zoonoses studied there is effective coordination across the Ministry of Health and Ministry of Agriculture. However, routine formal coordination is lacking. Overall, there is a strong desire and commitment for multi-sectoral coordination in detection and response to zoonoses across public health and veterinary sectors. Our analysis indicates that the networks developed in response to HPAI can and should be leveraged to develop a comprehensive laboratory and surveillance One Health network.

  20. CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.

    Science.gov (United States)

    Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng

    2017-01-01

    Standardized terminology is the prerequisite of data exchange in analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is therefore necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidential model to determine how terminologies are mapped to a standard system, like ICD-10. The model uses mappings from different health care organizations and evaluates the diversity of the mappings to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping result and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.

  1. Mapping yeast origins of replication via single-stranded DNA detection.

    Science.gov (United States)

    Feng, Wenyi; Raghuraman, M K; Brewer, Bonita J

    2007-02-01

    Studies in the yeast Saccharomyces cerevisiae have provided a framework for understanding how eukaryotic cells replicate their chromosomal DNA to ensure faithful transmission of genetic information to their daughter cells. In particular, S. cerevisiae is the first eukaryote to have its origins of replication mapped on a genomic scale, by three independent groups using three different microarray-based approaches. Here we describe a new technique of origin mapping via detection of single-stranded DNA in yeast. This method not only identified the majority of previously discovered origins, but also detected new ones. We have also shown that this technique can identify origins in Schizosaccharomyces pombe, illustrating the utility of this method for origin mapping in other eukaryotes.

  2. Construction of microsatellite-based linkage map and mapping of nectarilessness and hairiness genes in Gossypium tomentosum.

    Science.gov (United States)

    Hou, Meiying; Cai, Caiping; Zhang, Shuwen; Guo, Wangzhen; Zhang, Tianzhen; Zhou, Baoliang

    2013-12-01

    Gossypium tomentosum, a wild tetraploid cotton species with AD genomes, possesses genes conferring strong fibers and high heat tolerance. To effectively transfer these genes into Gossypium hirsutum, an entire microsatellite (simple sequence repeat, SSR)-based genetic map was constructed using the interspecific cross of G. hirsutum x G. tomentosum (HT). We detected 1800 loci from 1347 pairs of polymorphic primers. Of these, 1204 loci were grouped into 35 linkage groups at LOD ≥ 4. The map covers 3320.8 cM, with a mean density of 2.76 cM per locus. We detected 420 common loci (186 in the At subgenome and 234 in Dt) between the HT map and the map of TM-1 (G. hirsutum) and Hai 7124 (G. barbadense; HB map). The linkage groups were assigned chromosome numbers based on the location of common loci, with the HB map as reference. A comparison of common markers revealed that no significant chromosomal rearrangement exists between G. tomentosum and G. barbadense. Interestingly, however, we detected numerous (33.7%) segregation loci deviating from the expected 3:1 ratio (P < …). The map constructed in this study will be useful for further genetic studies on cotton breeding, including mapping loci controlling quantitative traits associated with fiber quality and stress tolerance, and developing chromosome segment specific introgression lines from G. tomentosum into G. hirsutum using marker-assisted selection.

  3. Topographical Hill Shading Map Production Based Tianditu (map World)

    Science.gov (United States)

    Wang, C.; Zha, Z.; Tang, D.; Yang, J.

    2018-04-01

    TIANDITU (Map World) is the public version of the National Platform for Common Geospatial Information Service, and the terrain service is an important channel for users of the platform. With the development of TIANDITU, topographical hill shading map production for providing and updating the global terrain map online has become necessary, because hill shading offers strong intuitiveness, a three-dimensional impression and aesthetic appeal. As such, the terrain service of TIANDITU focuses on displaying topographical data globally at different scales. This paper mainly researches a method of global topographical hill shading map production using DEM (Digital Elevation Model) data for display scales of about 1:140,000,000 to 1:4,000,000, corresponding to display levels 2 to 7 on the TIANDITU website.
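
    Hill shading itself follows the classic Lambertian formula, illustrated below for a DEM held in a NumPy array; the cell size, sun azimuth and altitude are example values, and the aspect convention is one of several in common use.

      import numpy as np

      def hillshade(dem, cellsize=30.0, azimuth_deg=315.0, altitude_deg=45.0):
          """Shaded relief of a DEM: sin(alt)*cos(slope) +
          cos(alt)*sin(slope)*cos(azimuth - aspect), clipped to [0, 1]."""
          az, alt = np.radians(azimuth_deg), np.radians(altitude_deg)
          dz_dy, dz_dx = np.gradient(dem, cellsize)
          slope = np.arctan(np.hypot(dz_dx, dz_dy))
          aspect = np.arctan2(-dz_dx, dz_dy)
          shaded = (np.sin(alt) * np.cos(slope) +
                    np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
          return np.clip(shaded, 0.0, 1.0)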

  4. Impact of Aerosol Dust on xMAP Multiplex Detection of Different Class Pathogens

    Directory of Open Access Journals (Sweden)

    Denis A. Kleymenov

    2017-11-01

    Full Text Available Environmental or city-scale bioaerosol surveillance can provide additional value for biodefense and public health. Efficient bioaerosol monitoring should rely on multiplex systems capable of detecting a wide range of biologically hazardous components potentially present in air (bacteria, viruses, toxins and allergens). xMAP technology from Luminex™ allows multiplex bead-based detection of antigens or nucleic acids, but its use for simultaneous detection of different classes of pathogens (bacteria, virus, toxin) is questionable. Another problem is the detection of pathogens in complex matrices, e.g., in the presence of dust. In this research, we developed the model xMAP multiplex test-system aiRDeTeX 1.0, which enables detection of influenza A virus, adenovirus type 6, Salmonella typhimurium, and cholera toxin B subunit, representing an RNA virus, a DNA virus, a gram-negative bacterium and a toxin, respectively, as model organisms of biologically hazardous components potentially present in or spreadable through the air. We extensively studied the effect of the matrix solution (PBS, distilled water, environmental dust) and ultrasound treatment on the monoplex and multiplex detection efficiency of individual targets. All targets were efficiently detectable in PBS and in the presence of dust. Ultrasound does not improve the detection except for bacterial LPS.

  5. Hash function based on chaotic map lattices.

    Science.gov (United States)

    Wang, Shihong; Hu, Gang

    2007-06-01

    A new hash function system based on coupled chaotic map dynamics is suggested. By combining floating-point chaotic computation with some simple algebraic operations, the system reaches very high bit confusion and diffusion rates, which gives it the desired statistical properties and strong collision resistance. The chaos-based hash function offers high security and fast performance, making it a highly competitive candidate for practical hash function applications in software implementations and secure information communication in computer networks.
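
    The record above does not give the exact construction, but the general shape of a coupled chaotic map lattice hash can be illustrated with a small Python sketch. Everything below (lattice size, logistic parameter, coupling strength, the way message bytes perturb the lattice, and the digest quantization) is an illustrative assumption, not the scheme of the cited paper.

        # Toy sketch of a hash built on a coupled logistic map lattice.
        # All parameters and mixing steps are illustrative only.
        def chaotic_hash(message: bytes, lattice_size: int = 8, rounds: int = 16) -> str:
            state = [(i + 1) / (lattice_size + 1) for i in range(lattice_size)]
            r, eps = 3.99, 0.3  # logistic parameter and coupling strength

            def logistic(x):
                return r * x * (1.0 - x)

            for byte in message:
                # Perturb one lattice site with the message byte.
                idx = byte % lattice_size
                state[idx] = (state[idx] + (byte + 1) / 257.0) % 1.0
                # Iterate the coupled map lattice for diffusion.
                for _ in range(rounds):
                    new_state = []
                    for i, x in enumerate(state):
                        left = state[(i - 1) % lattice_size]
                        right = state[(i + 1) % lattice_size]
                        mixed = (1 - eps) * logistic(x) + 0.5 * eps * (logistic(left) + logistic(right))
                        new_state.append(mixed % 1.0)
                    state = new_state

            # Quantize the floating-point states into a fixed-length digest.
            digest = b"".join(int(x * 2**32).to_bytes(4, "big") for x in state)
            return digest.hex()

        print(chaotic_hash(b"hello world"))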

  6. Alteration mineral mapping in inaccessible regions using target detection algorithms to ASTER data

    International Nuclear Information System (INIS)

    Pour, A B; Hashim, M; Park, Y

    2017-01-01

    In this study, the application of target detection algorithms such as Constrained Energy Minimization (CEM), Orthogonal Subspace Projection (OSP) and the Adaptive Coherence Estimator (ACE) to the shortwave infrared bands of Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data was investigated to extract geological information for alteration mineral mapping in poorly exposed lithologies in inaccessible domains. The Oscar II Coast area in north-eastern Graham Land, Antarctic Peninsula (AP), was selected for this satellite-based remote sensing mapping study. It is an inaccessible region due to the remoteness of many rock exposures and the necessity to travel over severe mountainous and glacier-covered terrain for geological field mapping and sample collection. Fractional abundances of alteration minerals such as muscovite, kaolinite, illite, montmorillonite, epidote, chlorite and biotite were identified in alteration zones using the CEM, OSP and ACE algorithms in poorly mapped and unmapped zones at district scale for the Oscar II Coast area. The results of this investigation demonstrated the applicability of ASTER shortwave infrared spectral data for lithological and alteration mineral mapping in poorly exposed lithologies and inaccessible regions, particularly using image processing algorithms capable of detecting sub-pixel targets in remotely sensed images where no prior information is available. (paper)
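
    Of the detectors named above, Constrained Energy Minimization has a particularly compact closed form: the filter is w = R⁻¹d / (dᵀR⁻¹d), where R is the sample correlation matrix of the image pixels and d is the target (mineral) spectrum. A minimal numpy sketch is given below; the synthetic data standing in for the ASTER SWIR bands and the small ridge term added for numerical stability are assumptions, not details from the paper.

        import numpy as np

        def cem_detector(cube: np.ndarray, target: np.ndarray, reg: float = 1e-6) -> np.ndarray:
            """Constrained Energy Minimization on a (rows, cols, bands) reflectance cube."""
            rows, cols, bands = cube.shape
            X = cube.reshape(-1, bands).astype(float)   # pixels x bands
            R = X.T @ X / X.shape[0]                     # sample correlation matrix
            R += reg * np.eye(bands)                     # small ridge for numerical stability
            Rinv_d = np.linalg.solve(R, target)
            w = Rinv_d / (target @ Rinv_d)               # CEM filter weights
            return (X @ w).reshape(rows, cols)           # abundance-like detection map

        # Illustrative use with random data standing in for the six ASTER SWIR bands.
        cube = np.random.rand(100, 100, 6)
        target_spectrum = np.random.rand(6)
        detection_map = cem_detector(cube, target_spectrum)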

  7. USGS Imagery Topo Large-scale Base Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Imagery Topo Large service from The National Map (TNM) is a dynamic topographic base map service that combines the best available data (Boundaries,...

  8. Map-based mobile services design, interaction and usability

    CERN Document Server

    Meng, Liqiu; Winter, Stephan; Popovich, Vasily

    2008-01-01

    This book reports the newest research and technical achievements on the following theme blocks: Design of mobile map services and its constraints; Typology and usability of mobile map services; Visualization solutions on small displays for time-critical tasks; Mobile map users; Interaction and adaptation in mobile environments; and Applications of map-based mobile services.

  9. Accurate lateral positioning from map data and road marking detection

    OpenAIRE

    GRUYER, Dominique; BELAROUSSI, Rachid; REVILLOUD, Marc

    2015-01-01

    We are witnessing the clash of two industries and the remaking of in-car market order, as the world of digital knowledge recently made a significant move toward the automotive industry. Mobile operating system providers are battling between each other to take over the in-vehicle entertainment and information systems, while car makers either line up behind their technology or try to keep control over the in-car experience. What is at stake is the map content and location-based services, two ke...

  10. Spectrally based mapping of riverbed composition

    Science.gov (United States)

    Legleiter, Carl; Stegman, Tobin K.; Overstreet, Brandon T.

    2016-01-01

    Remote sensing methods provide an efficient means of characterizing fluvial systems. This study evaluated the potential to map riverbed composition based on in situ and/or remote measurements of reflectance. Field spectra and substrate photos from the Snake River, Wyoming, USA, were used to identify different sediment facies and degrees of algal development and to quantify their optical characteristics. We hypothesized that accounting for the effects of depth and water column attenuation to isolate the reflectance of the streambed would enhance distinctions among bottom types and facilitate substrate classification. A bottom reflectance retrieval algorithm adapted from coastal research yielded realistic spectra for the 450 to 700 nm range; but bottom reflectance-based substrate classifications, generated using a random forest technique, were no more accurate than classifications derived from above-water field spectra. Additional hypothesis testing indicated that a combination of reflectance magnitude (brightness) and indices of spectral shape provided the most accurate riverbed classifications. Convolving field spectra to the response functions of a multispectral satellite and a hyperspectral imaging system did not reduce classification accuracies, implying that high spectral resolution was not essential. Supervised classifications of algal density produced from hyperspectral data and an inferred bottom reflectance image were not highly accurate, but unsupervised classification of the bottom reflectance image revealed distinct spectrally based clusters, suggesting that such an image could provide additional river information. We attribute the failure of bottom reflectance retrieval to yield more reliable substrate maps to a latent correlation between depth and bottom type. Accounting for the effects of depth might have eliminated a key distinction among substrates and thus reduced discriminatory power. Although further, more systematic study across a broader

  11. ActionMap: A web-based software that automates loci assignments to framework maps.

    Science.gov (United States)

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  12. Solution of the problem of superposing image and digital map for detection of new objects

    Science.gov (United States)

    Rizaev, I. S.; Miftakhutdinov, D. I.; Takhavova, E. G.

    2018-01-01

    The problem of superposing a map of the terrain with an image of the same terrain is considered. The image of the terrain may be represented in different frequency bands. Further analysis of the results of collating the digital map with the image of the corresponding terrain is described. An approach to detecting differences between information represented on the digital map and information in the image of the corresponding area is also offered. An algorithm for calculating the brightness values of the converted image area on the original picture is proposed. The calculation is based on navigation parameters and on information from arranged benchmarks. Experiments were performed to solve the posed problem, and their results are shown in this paper. The presented algorithms are applicable in ground-based processing of remote sensing data to assess differences between the resulting images and accurate geopositional data. They are also suitable for detecting new objects in the image, based on analysis of the match between the digital map and the image of the corresponding locality.

  13. Multispectral Image Road Extraction Based Upon Automated Map Conflation

    Science.gov (United States)

    Chen, Bin

    Road network extraction from remotely sensed imagery enables many important and diverse applications such as vehicle tracking, drone navigation, and intelligent transportation studies. There are, however, a number of challenges to road detection from an image. Road pavement material, width, direction, and topology vary across a scene. Complete or partial occlusions caused by nearby buildings, trees, and the shadows cast by them, make maintaining road connectivity difficult. The problems posed by occlusions are exacerbated with the increasing use of oblique imagery from aerial and satellite platforms. Further, common objects such as rooftops and parking lots are made of materials similar or identical to road pavements. This problem of common materials is a classic case of a single land cover material existing for different land use scenarios. This work addresses these problems in road extraction from geo-referenced imagery by leveraging the OpenStreetMap digital road map to guide image-based road extraction. The crowd-sourced cartography has the advantages of worldwide coverage that is constantly updated. The derived road vectors follow only roads and so can serve to guide image-based road extraction with minimal confusion from occlusions and changes in road material. On the other hand, the vector road map has no information on road widths and misalignments between the vector map and the geo-referenced image are small but nonsystematic. Properly correcting misalignment between two geospatial datasets, also known as map conflation, is an essential step. A generic framework requiring minimal human intervention is described for multispectral image road extraction and automatic road map conflation. The approach relies on the road feature generation of a binary mask and a corresponding curvilinear image. A method for generating the binary road mask from the image by applying a spectral measure is presented. The spectral measure, called anisotropy-tunable distance (ATD

  14. A review of feature detection and match algorithms for localization and mapping

    Science.gov (United States)

    Li, Shimiao

    2017-09-01

    Localization and mapping is an essential ability of a robot to keep track of its own location in an unknown environment. Among existing methods for this purpose, vision-based methods are effective solutions, being accurate, inexpensive and versatile. Vision-based methods can generally be categorized as feature-based approaches and appearance-based approaches. Feature-based approaches achieve higher performance in textured scenarios; however, their performance depends heavily on the applied feature-detection algorithms. In this paper, we survey algorithms for feature detection, an essential step in achieving vision-based localization and mapping, and present the mathematical models of the algorithms one after another. To compare the performance of the algorithms, we conducted a series of experiments on their accuracy, speed, scale invariance and rotation invariance. The results showed that ORB is the fastest algorithm in detecting and matching features, with a speed more than 10 times that of SURF and approximately 40 times that of SIFT. SIFT, although having no advantage in terms of speed, produces the most correct matching pairs and proves its accuracy.
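
    As an illustration of the kind of detector-plus-matcher pipeline compared in the survey, a minimal ORB example using OpenCV is sketched below; the synthetic images and parameter choices are placeholders, not the data or settings of the paper's experiments.

        import cv2
        import numpy as np

        # Synthetic test images: a random texture and a shifted copy, standing in for two frames.
        rng = np.random.default_rng(0)
        img1 = (rng.random((480, 640)) * 255).astype(np.uint8)
        img2 = np.roll(img1, shift=(5, 12), axis=(0, 1))

        orb = cv2.ORB_create(nfeatures=1000)
        kp1, des1 = orb.detectAndCompute(img1, None)
        kp2, des2 = orb.detectAndCompute(img2, None)

        # Hamming distance is the appropriate metric for ORB's binary descriptors.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
        print(f"{len(matches)} cross-checked matches")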

  15. Cosmic String Detection with Tree-Based Machine Learning

    Science.gov (United States)

    Vafaei Sadr, A.; Farhang, M.; Movahed, S. M. S.; Bassett, B.; Kunz, M.

    2018-05-01

    We explore the use of random forest and gradient boosting, two powerful tree-based machine learning algorithms, for the detection of cosmic strings in maps of the cosmic microwave background (CMB), through their unique Gott-Kaiser-Stebbins effect on the temperature anisotropies. The information in the maps is compressed into feature vectors before being passed to the learning units. The feature vectors contain various statistical measures of the processed CMB maps that boost cosmic string detectability. Our proposed classifiers, after training, give results similar to or better than claimed detectability levels from other methods for string tension, Gμ. They can make 3σ detection of strings with Gμ ≳ 2.1 × 10-10 for noise-free, 0.9΄-resolution CMB observations. The minimum detectable tension increases to Gμ ≳ 3.0 × 10-8 for a more realistic, CMB S4-like (II) strategy, improving over previous results.

  16. PCR-Based EST Mapping in Wheat (Triticum aestivum L.

    Directory of Open Access Journals (Sweden)

    J. PERRY GUSTAFSON

    2009-04-01

    Full Text Available Mapping expressed sequence tags (ESTs) in hexaploid wheat aims to reveal the structure and function of the hexaploid wheat genome. Sixty-eight ESTs representing 26 genes were mapped to all seven homologous chromosome groups of wheat (Triticum aestivum L.) using a polymerase chain reaction technique. The majority of the ESTs were mapped to homologous chromosome group 2, and the fewest to homologous chromosome group 6. Comparative analysis between the EST map from this study and the EST map based on RFLPs showed that 14 genes mapped by both approaches were located on the same arm of the same homologous chromosome, indicating that PCR-based EST mapping is a reliable approach in hexaploid wheat.

  17. Development of Geospatial Map Based Election Portal

    Science.gov (United States)

    Gupta, A. Kumar Chandra; Kumar, P.; Vasanth Kumar, N.

    2014-11-01

    Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide geospatial information on the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs, such as DDA, MCD, DJB, the State Election Department and DMRC, for the benefit of all citizens of NCTD. This paper describes the development of the Geospatial Map based Election Portal (GMEP) of NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for planning and management by the Department of the Chief Electoral Officer, and as an election-related information search tool (polling station, assembly and parliamentary constituency, etc.) for the citizens of NCTD. The GMEP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with a J2EE front-end in a Microsoft Windows environment. The GMEP is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMEP include delimited precinct boundaries of the voter areas of polling stations, assembly constituencies, parliamentary constituencies and election districts, as well as landmark locations of polling stations and basic amenities (police stations, hospitals, schools, fire stations, etc.). The GMEP helps achieve not only the desired transparency and ease in the planning process but also provides efficient and effective tools for election management. It enables a faster response to changing ground realities in development planning, owing to its built-in scientific approach and open-ended design.

  18. Agroforestry suitability analysis based upon nutrient availability mapping: a GIS based suitability mapping

    Directory of Open Access Journals (Sweden)

    Firoz Ahmad

    2017-05-01

    Full Text Available Agroforestry has drawn the attention of researchers due to its capacity to reduce poverty and land degradation, improve food security and mitigate climate change. However, progress in promoting agroforestry is held back by the lack of reliable data sets and of appropriate tools for accurate mapping and adequate decision making for agroforestry modules. Agroforestry suitability, a special form of land suitability, is especially pertinent to study at a time when there is tremendous pressure on land as a limited commodity. The study applies geospatial tools to visualize various soil and environmental data, reveal trends and interrelationships, and produce nutrient availability and agroforestry suitability maps. Using a weight matrix and ranks, individual maps were developed on the ArcGIS 10.1 platform to generate a nutrient availability map, which was later used to develop an agroforestry suitability map. Watersheds were delineated using the DEM in part of the study area and evaluated for prioritization, and the agroforestry suitability of the watersheds was also assessed following the schematic flowchart. Agroforestry suitability regions were delineated by integrated mapping based on the weights and ranks. The total open area was identified as 42.4%, of which 21.6% was found to be highly suitable for agroforestry. Within the watersheds, 22 village points were generated for creating buffers, which were further evaluated and shown to be in proximity to highly suitable agroforestry sites, generating a tremendous opportunity for villagers to carry out agroforestry projects locally. This research shows the capability of remote sensing in studying agroforestry practices and in estimating the prominent factors for its optimal productivity. The ongoing agroforestry projects can potentially be extended into the areas of high suitability. The use of ancillary data in GIS
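
    The study's actual weight matrix and ranks are not reproduced in this record, but the weighted-overlay integration it describes can be sketched in a few lines of numpy; the layer names, weights and class breaks below are hypothetical stand-ins.

        import numpy as np

        # Illustrative raster layers already ranked on a common 1-5 suitability scale
        # (synthetic values standing in for soil nutrient and terrain criteria).
        layers = {
            "nitrogen":   np.random.randint(1, 6, (200, 200)),
            "phosphorus": np.random.randint(1, 6, (200, 200)),
            "ph":         np.random.randint(1, 6, (200, 200)),
            "slope":      np.random.randint(1, 6, (200, 200)),
        }
        # Hypothetical weights summing to 1, standing in for the study's weight matrix.
        weights = {"nitrogen": 0.35, "phosphorus": 0.25, "ph": 0.20, "slope": 0.20}

        suitability = sum(weights[name] * layers[name].astype(float) for name in layers)

        # Reclassify the continuous score into low / moderate / high suitability classes.
        classes = np.digitize(suitability, bins=[2.5, 3.5])  # 0 = low, 1 = moderate, 2 = high
        print(np.bincount(classes.ravel()))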

  19. Risk maps for targeting exotic plant pest detection programs in the United States

    Science.gov (United States)

    R.D. Magarey; D.M. Borchert; J.S. Engle; M Garcia-Colunga; Frank H. Koch; et al

    2011-01-01

    In the United States, pest risk maps are used by the Cooperative Agricultural Pest Survey for spatial and temporal targeting of exotic plant pest detection programs. Methods are described to create standardized host distribution, climate and pathway risk maps for the top nationally ranked exotic pest targets. Two examples are provided to illustrate the risk mapping...

  20. Construction of Fisheye Lens Inverse Perspective Mapping Model and Its Applications of Obstacle Detection

    Directory of Open Access Journals (Sweden)

    Chin-Teng Lin

    2010-01-01

    Full Text Available In this paper, we develop a vision-based obstacle detection system by utilizing our proposed fisheye lens inverse perspective mapping (FLIPM) method. New mapping equations are derived to transform the images captured by the fisheye lens camera into undistorted remapped ones under practical circumstances. In the obstacle detection, we make use of the features of vertical edges on objects in the remapped images to indicate the relative positions of obstacles. The static information of the remapped image in the current frame is used to determine the features of the source images in the searching stage, from either the profile image or the temporal IPM difference image. The profile image can be acquired by several processes, such as sharpening, edge detection, morphological operations and a modified thinning algorithm, applied to the remapped image. The temporal IPM difference image can be obtained by a spatial shift of the remapped image from the previous frame. Moreover, a polar histogram and its post-processing procedures are used to indicate the position and length of feature vectors and to remove noise. Our obstacle detection can give drivers warning signals within a limited distance from nearby vehicles, even when the detected obstacles have only quasi-vertical edges.
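
    The paper derives dedicated fisheye (FLIPM) mapping equations that are not reproduced in this record. For orientation only, the sketch below shows the classical pinhole inverse perspective mapping via a ground-plane homography, followed by a vertical-edge response on the remapped image; the synthetic frame and the point correspondences are placeholders.

        import cv2
        import numpy as np

        # Classical IPM via a ground-plane homography (NOT the fisheye FLIPM equations).
        img = np.tile(np.linspace(0, 255, 1280, dtype=np.uint8), (720, 1))  # synthetic road frame

        # Four image points assumed to lie on the road plane and their bird's-eye-view targets.
        src = np.float32([[420, 430], [860, 430], [1180, 700], [100, 700]])
        dst = np.float32([[300, 0], [500, 0], [500, 600], [300, 600]])

        H = cv2.getPerspectiveTransform(src, dst)
        birdseye = cv2.warpPerspective(img, H, (800, 600))

        # In the remapped view, vertical edges of obstacles give strong horizontal-gradient responses.
        edges = cv2.Sobel(birdseye, cv2.CV_32F, 1, 0, ksize=3)
        print(birdseye.shape, float(np.abs(edges).max()))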

  1. Mapping population-based structural connectomes.

    Science.gov (United States)

    Zhang, Zhengwu; Descoteaux, Maxime; Zhang, Jingwen; Girard, Gabriel; Chamberland, Maxime; Dunson, David; Srivastava, Anuj; Zhu, Hongtu

    2018-05-15

    Advances in understanding the structural connectomes of the human brain require improved approaches for the construction, comparison and integration of high-dimensional whole-brain tractography data from a large number of individuals. This article develops a population-based structural connectome (PSC) mapping framework to address these challenges. PSC simultaneously characterizes a large number of white matter bundles within and across different subjects by registering different subjects' brains based on coarse cortical parcellations, compressing the bundles of each connection, and extracting novel connection weights. A robust tractography algorithm and streamline post-processing techniques, including dilation of gray matter regions, streamline cutting, and outlier streamline removal are applied to improve the robustness of the extracted structural connectomes. The developed PSC framework can be used to reproducibly extract binary networks, weighted networks and streamline-based brain connectomes. We apply the PSC to Human Connectome Project data to illustrate its application in characterizing normal variations and heritability of structural connectomes in healthy subjects. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. Presenting a mapping method based on fuzzy Logic and TOPSIS multi criteria decision-making methods to detect promising porphyry copper mineralization areas in the east of the Sarcheshmeh copper metallogenic district

    Directory of Open Access Journals (Sweden)

    Shokouh Riahi

    2017-11-01

    Full Text Available Introduction The growing demand for base metals such as iron, copper, lead and zinc on the one hand, and the depletion of surficial and shallow resources of these elements on the other, have forced explorationists to focus on detecting deep deposits of these metals. As a result, the discovery of such deep deposits requires more advanced and sophisticated methods in the course of the preliminary prospecting stages. Since the discovery of new deposits is becoming increasingly difficult, deploying new prospecting technologies that employ more deposit attributes when combining exploratory evidence may reduce exploration costs and uncertainties. In the past two decades, a number of new data mining and integration approaches capable of incorporating direct and indirect mineralization indicators, based on expert knowledge, data, or a combination of both, have been proposed (Bonham-Carter, 1994). In the first step, the input exploratory data layers are corrected and validated by applying statistical pre-processing algorithms such as background and outlier removal methods. In order to detect a mineralization occurrence, it is necessary to find the proper exploratory geological, geochemical and geophysical data layers which have direct or indirect associations with the governing mineralization, followed by constructing these models in an appropriate GIS platform (Malkzewski, 1999). Due to the imperfect available data and a number of unknown parameters affecting the mineralization process, the application of conventional GIS integration methods such as Boolean or weighted overlay, or even fuzzy logic methods alone, may not produce appropriate results, pointing to a need for deploying multi-criteria decision-making methods such as TOPSIS. In the present study, the pre-processed exploratory data, including geological, remotely sensed, geophysical and geochemical imagery, were used to detect favorable mineralization zones.
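
    For readers unfamiliar with TOPSIS, its core ranking step (vector normalization, weighting, distances to the positive and negative ideal solutions, and the closeness coefficient) can be sketched as follows; the weights and the small decision matrix are illustrative, not the study's actual evidence layers.

        import numpy as np

        def topsis(decision_matrix: np.ndarray, weights: np.ndarray, benefit: np.ndarray) -> np.ndarray:
            """Rank alternatives (rows) against criteria (columns); higher closeness is better."""
            norm = decision_matrix / np.linalg.norm(decision_matrix, axis=0)   # vector normalization
            v = norm * weights                                                 # weighted normalized matrix
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))            # positive ideal solution
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))             # negative ideal solution
            d_pos = np.linalg.norm(v - ideal, axis=1)
            d_neg = np.linalg.norm(v - anti, axis=1)
            return d_neg / (d_pos + d_neg)

        # Illustrative use: 5 cells scored on 3 evidence layers (geology, geochemistry, geophysics).
        scores = np.random.rand(5, 3)
        closeness = topsis(scores, np.array([0.5, 0.3, 0.2]), np.array([True, True, True]))
        print(np.argsort(closeness)[::-1])  # cells ranked from most to least favorable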

  3. Particle filter based MAP state estimation: A comparison

    NARCIS (Netherlands)

    Saha, S.; Boers, Y.; Driessen, J.N.; Mandal, Pranab K.; Bagchi, Arunabha

    2009-01-01

    MAP estimation is a good alternative to MMSE for certain applications involving nonlinear non Gaussian systems. Recently a new particle filter based MAP estimator has been derived. This new method extracts the MAP directly from the output of a running particle filter. In the recent past, a Viterbi

  4. BEE FORAGE MAPPING BASED ON MULTISPECTRAL IMAGES LANDSAT

    Directory of Open Access Journals (Sweden)

    A. Moskalenko

    2016-10-01

    Full Text Available Possibilities of bee forage identification and mapping based on multispectral images have been shown in the research. Spectral brightness of bee forage has been determined with the use of satellite images. The effectiveness of some methods of image classification for mapping of bee forage is shown. Keywords: bee forage, mapping, multispectral images, image classification.

  5. Performance of T2 Maps in the Detection of Prostate Cancer.

    Science.gov (United States)

    Chatterjee, Aritrick; Devaraj, Ajit; Mathew, Melvy; Szasz, Teodora; Antic, Tatjana; Karczmar, Gregory S; Oto, Aytekin

    2018-05-03

    This study compares the performance of T2 maps in the detection of prostate cancer (PCa) with that of T2-weighted (T2W) magnetic resonance images. The prospective study was institutional review board approved. Consenting patients (n = 45) with histologically confirmed PCa underwent preoperative 3-T magnetic resonance imaging with or without an endorectal coil. Two radiologists, working independently, marked regions of interest (ROIs) on PCa lesions separately on T2W images and T2 maps. Each ROI was assigned a score of 1-5 based on the confidence in accurately detecting cancer, with 5 being the highest confidence. Subsequently, the histologically confirmed PCa lesions (n = 112) on whole-mount sections were matched with ROIs to calculate sensitivity, positive predictive value (PPV), and radiologist confidence score. Quantitative T2 values of PCa and benign tissue ROIs were measured. Sensitivity and confidence score for PCa detection were similar for T2W images (51%, 4.5 ± 0.8) and T2 maps (52%, 4.5 ± 0.6). However, PPV was significantly higher (P = .001) for T2 maps (88%) than for T2W images (72%). The use of endorectal coils nominally improved sensitivity (T2W: 55% vs 47%; T2 map: 54% vs 48%) compared to no endorectal coil, but not the PPV or the confidence score. Quantitative T2 values for PCa (105 ± 28 milliseconds) were significantly (P = 9.3 × 10⁻¹⁴) lower than for benign peripheral zone tissue (211 ± 71 milliseconds), with a moderate significant correlation with Gleason score (ρ = -0.284). Our study shows that review of T2 maps by radiologists has similar sensitivity but higher PPV compared to T2W images. Additional quantitative information obtained from T2 maps is helpful in differentiating cancer from normal prostate tissue and determining its aggressiveness. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  6. Dynamic approximate entropy electroanatomic maps detect rotors in a simulated atrial fibrillation model.

    Science.gov (United States)

    Ugarte, Juan P; Orozco-Duque, Andrés; Tobón, Catalina; Kremen, Vaclav; Novak, Daniel; Saiz, Javier; Oesterlein, Tobias; Schmitt, Clauss; Luik, Armin; Bustamante, John

    2014-01-01

    There is evidence that rotors could be drivers that maintain atrial fibrillation. Complex fractionated atrial electrograms have been located in rotor tip areas. However, the concept of electrogram fractionation, defined using time intervals, is still controversial as a tool for locating target sites for ablation. We hypothesize that the fractionation phenomenon is better described using non-linear dynamic measures, such as approximate entropy, and that this tool could be used for locating the rotor tip. The aim of this work has been to determine the relationship between approximate entropy and fractionated electrograms, and to develop a new tool for rotor mapping based on fractionation levels. Two episodes of chronic atrial fibrillation were simulated in a 3D human atrial model, in which rotors were observed. Dynamic approximate entropy maps were calculated using unipolar electrogram signals generated over the whole surface of the 3D atrial model. In addition, we optimized the approximate entropy calculation using two real multi-center databases of fractionated electrogram signals, labeled in 4 levels of fractionation. We found that the values of approximate entropy and the levels of fractionation are positively correlated. This allows the dynamic approximate entropy maps to localize the tips from stable and meandering rotors. Furthermore, we assessed the optimized approximate entropy using bipolar electrograms generated over a vicinity enclosing a rotor, achieving rotor detection. Our results suggest that high approximate entropy values are able to detect a high level of fractionation and to locate rotor tips in simulated atrial fibrillation episodes. We suggest that dynamic approximate entropy maps could become a tool for atrial fibrillation rotor mapping.
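
    A minimal Python sketch of the approximate entropy computation underlying such maps is given below; the embedding dimension m = 2 and tolerance r = 0.2 times the signal's standard deviation are common textbook choices and are assumptions here, not the optimized settings reported in the study.

        import numpy as np

        def approximate_entropy(signal, m: int = 2, r_factor: float = 0.2) -> float:
            """Approximate entropy of a 1-D electrogram segment (illustrative parameters)."""
            x = np.asarray(signal, dtype=float)
            n = x.size
            r = r_factor * x.std()

            def phi(mm: int) -> float:
                emb = np.array([x[i:i + mm] for i in range(n - mm + 1)])          # embedded vectors
                dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)  # Chebyshev distances
                counts = np.mean(dist <= r, axis=1)  # self-matches included, as in Pincus' definition
                return np.mean(np.log(counts))

            return phi(m) - phi(m + 1)

        # Illustrative use on a synthetic signal standing in for a unipolar electrogram.
        rng = np.random.default_rng(0)
        segment = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.3 * rng.standard_normal(1000)
        print(approximate_entropy(segment))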

  7. Dynamic Approximate Entropy Electroanatomic Maps Detect Rotors in a Simulated Atrial Fibrillation Model

    Science.gov (United States)

    Ugarte, Juan P.; Orozco-Duque, Andrés; Tobón, Catalina; Kremen, Vaclav; Novak, Daniel; Saiz, Javier; Oesterlein, Tobias; Schmitt, Clauss; Luik, Armin; Bustamante, John

    2014-01-01

    There is evidence that rotors could be drivers that maintain atrial fibrillation. Complex fractionated atrial electrograms have been located in rotor tip areas. However, the concept of electrogram fractionation, defined using time intervals, is still controversial as a tool for locating target sites for ablation. We hypothesize that the fractionation phenomenon is better described using non-linear dynamic measures, such as approximate entropy, and that this tool could be used for locating the rotor tip. The aim of this work has been to determine the relationship between approximate entropy and fractionated electrograms, and to develop a new tool for rotor mapping based on fractionation levels. Two episodes of chronic atrial fibrillation were simulated in a 3D human atrial model, in which rotors were observed. Dynamic approximate entropy maps were calculated using unipolar electrogram signals generated over the whole surface of the 3D atrial model. In addition, we optimized the approximate entropy calculation using two real multi-center databases of fractionated electrogram signals, labeled in 4 levels of fractionation. We found that the values of approximate entropy and the levels of fractionation are positively correlated. This allows the dynamic approximate entropy maps to localize the tips from stable and meandering rotors. Furthermore, we assessed the optimized approximate entropy using bipolar electrograms generated over a vicinity enclosing a rotor, achieving rotor detection. Our results suggest that high approximate entropy values are able to detect a high level of fractionation and to locate rotor tips in simulated atrial fibrillation episodes. We suggest that dynamic approximate entropy maps could become a tool for atrial fibrillation rotor mapping. PMID:25489858

  8. Comparison between genetic algorithm and self organizing map to detect botnet network traffic

    Science.gov (United States)

    Yugandhara Prabhakar, Shinde; Parganiha, Pratishtha; Madhu Viswanatham, V.; Nirmala, M.

    2017-11-01

    In the cyber security world, botnet attacks are increasing. Detecting botnets is a challenging task. A botnet is a group of computers connected in a coordinated fashion to perform malicious activities. Many techniques have been developed and used to detect and prevent botnet traffic and attacks. In this paper, a comparative study is conducted on the Genetic Algorithm (GA) and the Self-Organizing Map (SOM) for detecting botnet network traffic. Both are soft computing techniques and are used in this paper as data analytics systems. GA is based on the natural evolution process, while SOM is a type of artificial neural network that uses unsupervised learning. SOM classifies the data according to its neurons. A sample of the KDD99 dataset is used as input to GA and SOM.
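
    A bare-bones numpy self-organizing map illustrating the unsupervised mapping described above is sketched below; the grid size, learning-rate and neighborhood schedules, and the synthetic feature vectors standing in for KDD99 records are all illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(42)
        features = rng.random((500, 6))          # stand-in for normalized KDD99-style flow features

        grid_h, grid_w, dim = 8, 8, features.shape[1]
        weights = rng.random((grid_h, grid_w, dim))

        def best_matching_unit(x):
            d = np.linalg.norm(weights - x, axis=2)
            return np.unravel_index(np.argmin(d), d.shape)

        for epoch in range(20):
            lr = 0.5 * (1 - epoch / 20)              # decaying learning rate
            sigma = 3.0 * (1 - epoch / 20) + 0.5     # decaying neighborhood radius
            for x in features:
                bi, bj = best_matching_unit(x)
                ii, jj = np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij")
                h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
                weights += lr * h[..., None] * (x - weights)

        # After training, flows that map to sparsely populated neurons can be flagged as anomalous.
        print(best_matching_unit(features[0]))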

  9. Active and Passive Remote Sensing Data Time Series for Flood Detection and Surface Water Mapping

    Science.gov (United States)

    Bioresita, Filsa; Puissant, Anne; Stumpf, André; Malet, Jean-Philippe

    2017-04-01

    A Modified Split Based Approach (MSBA) is used in order to focus on surface water areas automatically and facilitate the estimation of class models for water and non-water areas. A Finite Mixture Model is employed as the underlying statistical model to produce probabilistic maps. Subsequently, bilateral filtering is applied to take spatial neighborhood relationships into account in the generation of the final map. Shadow effects are eliminated in a post-processing step. The processing chain is tested on three case studies. The first case is a flood event in central Ireland, the second is located in Yorkshire, Great Britain, and the third covers a recent flood event in northern Italy. The tests showed that the modified SBA step and the Finite Mixture Models can be applied for automatic surface water detection in a variety of test cases. An evaluation against Copernicus products derived from very high resolution imagery was performed and showed a high overall accuracy and F-measure for the obtained maps. This evaluation also showed that the use of probability maps and bilateral filtering improved the accuracy of the classification results significantly. Based on this quantitative evaluation, it is concluded that the processing chain can be applied for flood mapping from Sentinel-1 data. To estimate robust statistical distributions, the method requires sufficient surface water areas in the observed zone and sufficient contrast between surface water and other land use classes. Ongoing research addresses the fusion of Sentinel-1 and passive remote sensing data (e.g. Sentinel-2) in order to reduce the current shortcomings of the developed processing chain. In this work, fusion is performed at the feature level to better account for the different image properties of SAR and optical sensors. Further, the processing chain is currently being optimized in terms of calculation time for further integration as a flood mapping service on the A2S (Alsace Aval
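
    The finite mixture modelling step can be illustrated with a two-component Gaussian mixture on SAR backscatter, where the low-backscatter component is interpreted as open water and its posterior probability gives the probabilistic map. The sketch below uses synthetic backscatter values and scikit-learn, both assumptions rather than details of the actual processing chain.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        # Synthetic backscatter (dB): smooth open water is dark, land is brighter.
        rng = np.random.default_rng(1)
        water = rng.normal(-20.0, 1.5, 2000)
        land = rng.normal(-8.0, 2.5, 8000)
        backscatter = np.concatenate([water, land]).reshape(-1, 1)

        gmm = GaussianMixture(n_components=2, random_state=0).fit(backscatter)
        water_component = int(np.argmin(gmm.means_.ravel()))     # the darker class is water

        # Probabilistic water map: posterior probability of the low-backscatter component.
        prob_water = gmm.predict_proba(backscatter)[:, water_component]
        print(f"estimated water fraction: {(prob_water > 0.5).mean():.2f}")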

  10. Mediastinal lymph node detection and station mapping on chest CT using spatial priors and random forest

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Jiamin; Hoffman, Joanne; Zhao, Jocelyn; Yao, Jianhua; Lu, Le; Kim, Lauren; Turkbey, Evrim B.; Summers, Ronald M., E-mail: rms@nih.gov [Imaging Biomarkers and Computer-aided Diagnosis Laboratory, Radiology and Imaging Sciences, National Institutes of Health Clinical Center Building, 10 Room 1C224 MSC 1182, Bethesda, Maryland 20892-1182 (United States)

    2016-07-15

    Purpose: To develop an automated system for mediastinal lymph node detection and station mapping for chest CT. Methods: The contextual organs, trachea, lungs, and spine are first automatically identified to locate the region of interest (ROI) (mediastinum). The authors employ shape features derived from Hessian analysis, local object scale, and circular transformation that are computed per voxel in the ROI. Eight more anatomical structures are simultaneously segmented by multiatlas label fusion. Spatial priors are defined as the relative multidimensional distance vectors corresponding to each structure. Intensity, shape, and spatial prior features are integrated and parsed by a random forest classifier for lymph node detection. The detected candidates are then segmented by the following curve evolution process. Texture features are computed on the segmented lymph nodes and a support vector machine committee is used for final classification. For lymph node station labeling, based on the segmentation results of the above anatomical structures, the textual definitions of mediastinal lymph node map according to the International Association for the Study of Lung Cancer are converted into patient-specific color-coded CT image, where the lymph node station can be automatically assigned for each detected node. Results: The chest CT volumes from 70 patients with 316 enlarged mediastinal lymph nodes are used for validation. For lymph node detection, their system achieves 88% sensitivity at eight false positives per patient. For lymph node station labeling, 84.5% of lymph nodes are correctly assigned to their stations. Conclusions: Multiple-channel shape, intensity, and spatial prior features aggregated by a random forest classifier improve mediastinal lymph node detection on chest CT. Using the location information of segmented anatomic structures from the multiatlas formulation enables accurate identification of lymph node stations.
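
    The two-stage classification described above (a random forest on per-voxel features to propose candidates, followed by a support vector machine on per-candidate texture features) can be outlined generically with scikit-learn. The synthetic features, dimensions and threshold below are placeholders, and a single SVM stands in for the SVM committee used by the authors.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        voxel_features = rng.random((5000, 12))      # stand-in for intensity + shape + spatial priors
        voxel_labels = rng.integers(0, 2, 5000)      # 1 = lymph node voxel (synthetic labels)

        stage1 = RandomForestClassifier(n_estimators=200, random_state=0)
        stage1.fit(voxel_features, voxel_labels)
        candidate_mask = stage1.predict_proba(voxel_features)[:, 1] > 0.5

        # Stage 2: texture features computed on the segmented candidates, classified by an SVM.
        n_candidates = int(candidate_mask.sum())
        candidate_texture = rng.random((n_candidates, 20))
        candidate_labels = rng.integers(0, 2, n_candidates)
        stage2 = SVC(probability=True).fit(candidate_texture, candidate_labels)
        print(stage2.predict_proba(candidate_texture[:5]))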

  11. Mediastinal lymph node detection and station mapping on chest CT using spatial priors and random forest

    International Nuclear Information System (INIS)

    Liu, Jiamin; Hoffman, Joanne; Zhao, Jocelyn; Yao, Jianhua; Lu, Le; Kim, Lauren; Turkbey, Evrim B.; Summers, Ronald M.

    2016-01-01

    Purpose: To develop an automated system for mediastinal lymph node detection and station mapping for chest CT. Methods: The contextual organs, trachea, lungs, and spine are first automatically identified to locate the region of interest (ROI) (mediastinum). The authors employ shape features derived from Hessian analysis, local object scale, and circular transformation that are computed per voxel in the ROI. Eight more anatomical structures are simultaneously segmented by multiatlas label fusion. Spatial priors are defined as the relative multidimensional distance vectors corresponding to each structure. Intensity, shape, and spatial prior features are integrated and parsed by a random forest classifier for lymph node detection. The detected candidates are then segmented by the following curve evolution process. Texture features are computed on the segmented lymph nodes and a support vector machine committee is used for final classification. For lymph node station labeling, based on the segmentation results of the above anatomical structures, the textual definitions of mediastinal lymph node map according to the International Association for the Study of Lung Cancer are converted into patient-specific color-coded CT image, where the lymph node station can be automatically assigned for each detected node. Results: The chest CT volumes from 70 patients with 316 enlarged mediastinal lymph nodes are used for validation. For lymph node detection, their system achieves 88% sensitivity at eight false positives per patient. For lymph node station labeling, 84.5% of lymph nodes are correctly assigned to their stations. Conclusions: Multiple-channel shape, intensity, and spatial prior features aggregated by a random forest classifier improve mediastinal lymph node detection on chest CT. Using the location information of segmented anatomic structures from the multiatlas formulation enables accurate identification of lymph node stations.

  12. Standard practice for detection sensitivity mapping of In-Plant Walk-through metal detectors

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This standard practice covers a procedure for determining the weakest detection path through the portal aperture and the worst-case orthogonal orientation of metallic test objects. It results in detection sensitivity maps, which model the detection zone in terms related to detection sensitivity and identify the weakest detection paths. Detection sensitivity maps support sensitivity adjustment and performance evaluation procedures (see Practices C1269 and C1309). Note 1—Unsymmetrical metal objects possessing a primary longitudinal component, such as handguns and knives, usually have one particular orientation that produces the weakest detection signal. The orientation and the path through the detector aperture where the weakest response is produced may not be the same for all test objects, even those with very similar appearance. Note 2—In the case of multiple specified test objects or for test objects that are orientation sensitive, it may be necessary to map each object several times to determine ...

  13. Change Detection Algorithm for the Production of Land Cover Change Maps over the European Union Countries

    Directory of Open Access Journals (Sweden)

    Sebastian Aleksandrowicz

    2014-06-01

    Full Text Available Contemporary satellite Earth Observation systems provide growing amounts of very high spatial resolution data that can be used in various applications. An increasing number of sensors make it possible to monitor selected areas in great detail. However, in order to handle the volume of data, a high level of automation is required. The semi-automatic change detection methodology described in this paper was developed to annually update land cover maps prepared in the context of Geoland2. The proposed algorithm was tailored to work with different very high spatial resolution images acquired over different European landscapes. The methodology is a fusion of various change detection methods: (1) layer arithmetic; (2) vegetation index (NDVI) differencing; (3) texture calculation; and (4) methods based on canonical correlation analysis (multivariate alteration detection, MAD). User intervention during the production of the change map is limited to the selection of the input data, the size of the initial segments and, optionally, the threshold for texture classification. To achieve a high level of automation, statistical thresholds were applied in most of the processing steps. Tests showed an overall change recognition accuracy of 89%, and the change type classification methodology can accurately classify transitions between classes.
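
    One of the method families listed above, NDVI differencing with an automatic statistical threshold, can be sketched in a few lines of numpy; the band arrays are synthetic and the two-standard-deviation rule is an illustrative choice, not necessarily the threshold used in the paper.

        import numpy as np

        def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
            return (nir - red) / (nir + red + 1e-6)

        # Synthetic red / near-infrared bands for two acquisition dates.
        red_t1, nir_t1 = np.random.rand(512, 512), np.random.rand(512, 512)
        red_t2, nir_t2 = np.random.rand(512, 512), np.random.rand(512, 512)

        diff = ndvi(red_t2, nir_t2) - ndvi(red_t1, nir_t1)
        change_mask = np.abs(diff - diff.mean()) > 2 * diff.std()   # statistical threshold
        print(f"flagged {change_mask.mean():.1%} of pixels as changed")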

  14. Activity Based Costing in Value Stream Mapping

    Directory of Open Access Journals (Sweden)

    S. S. Abuthakeer

    2010-12-01

    Full Text Available This paper attempts to integrate the Value Stream Map (VSM) with cost aspects. A value stream map provides a blueprint for implementing lean manufacturing concepts by illustrating information and material flows in a value stream. The objective of the present work is to integrate the various cost aspects. The idea is to introduce a cost line, which enhances clarity in decision making. The redesigned map proves to be effective in highlighting the improvement areas in terms of quantitative data. A TAKT time calculation is carried out to set the pace of production. A target cost is set as a benchmark for product cost. The results of the study indicate that implementing VSM led to reductions in the following areas: processing lead time by 34%, processing cycle time by 35%, inventory level by 66%, and product cost from Rs 137 to Rs 125. It was found that adopting VSM in a small-scale industry can bring significant improvements.

  15. Acoustic Longitudinal Field NIF Optic Feature Detection Map Using Time-Reversal & MUSIC

    Energy Technology Data Exchange (ETDEWEB)

    Lehman, S K

    2006-02-09

    We developed an ultrasonic longitudinal-field time-reversal and MUltiple SIgnal Classification (MUSIC) based detection algorithm for identifying and mapping flaws in fused silica NIF optics. The algorithm requires a fully multistatic data set, that is, one with multiple, independently operated, spatially diverse transducers, in which each transmitter, in succession, launches a pulse into the optic and the scattered signal is measured and recorded at every receiver. We have successfully localized engineered "defects" larger than 1 mm in an optic. We confirmed detection and localization of 3 mm and 5 mm features in experimental data, and of a 0.5 mm feature in simulated data with sufficiently high signal-to-noise ratio. We present the theory, experimental results, and simulated results.

  16. Damage Detection Method of Wind Turbine Blade Using Acoustic Emission Signal Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Han, Byeong Hee; Yoon, Dong JIn [Korea Research Institute of Standards and Seience, Daejeon (Korea, Republic of)

    2011-02-15

    Acoustic emission (AE) has emerged as a powerful nondestructive tool to detect any further growth or expansion of preexisting defects or to characterize failure mechanisms. Recently, this kind of technique, namely in-situ monitoring of damage inside materials or structures, has become increasingly popular for monitoring the integrity of large structures such as huge wind turbine blades. It is therefore necessary to find symptoms of damage propagation before catastrophic failure through continuous monitoring. In this study, a new damage location method based on a signal mapping algorithm is proposed, and experimental verification is conducted using a small wind turbine blade specimen: a part of a real 750 kW blade. The results show that the new signal mapping method has advantages such as flexibility in sensor location, improved accuracy and high detectability. The newly proposed method was compared with the traditional AE source location method based on arrival-time differences.

  17. Cognitive mapping based on synthetic vision?

    Science.gov (United States)

    Helmetag, Arnd; Halbig, Christian; Kubbat, Wolfgang; Schmidt, Rainer

    1999-07-01

    The analysis of accidents focused our work on the avoidance of 'Controlled Flight Into Terrain' caused by insufficient situation awareness. Analysis of safety concepts led us to the design of the proposed synthetic vision system (SVS) that will be described. Since most information on these 3D displays is shown graphically, it can be understood intuitively by the pilot. What new possibilities does an SVS offer for enhancing situation awareness? First, detection of a ground collision hazard is possible by monitoring a perspective Primary Flight Display. From a psychological point of view, this is based on the perception of expanding objects in the visual flow field. Supported by a Navigation Display, a local conflict resolution can be worked out mentally very quickly. Secondly, it is possible to follow a 3D flight path visualized as a 'Tunnel in the sky.' This can be further improved by using a flight path prediction. These are the prerequisites for safe and adequate movement in any kind of spatial environment. However, situation awareness also requires the abilities of navigation and spatial problem solving. Both abilities are based on higher cognitive functions, in a real as well as in a synthetic environment. In this paper, the current training concept will be analyzed. Advantages resulting from the integration of an SVS for pilot training will be discussed, and necessary requirements for terrain depiction will be pinpointed. Finally, a modified Computer Based Training for familiarization with Salzburg Airport for an SVS-equipped aircraft will be presented. It is developed by Darmstadt University of Technology in co-operation with Lufthansa Flight Training.

  18. Real-time method for establishing a detection map for a network of sensors

    Science.gov (United States)

    Nguyen, Hung D; Koch, Mark W; Giron, Casey; Rondeau, Daniel M; Russell, John L

    2012-09-11

    A method for establishing a detection map of a dynamically configurable sensor network. This method determines an appropriate set of locations for a plurality of sensor units of a sensor network and establishes a detection map for the network of sensors while the network is being set up; the detection map includes the effects of the local terrain and individual sensor performance. Sensor performance is characterized during the placement of the sensor units, which enables dynamic adjustment or reconfiguration of the placement of individual elements of the sensor network during network set-up to accommodate variations in local terrain and individual sensor performance. The reconfiguration of the network during initial set-up to accommodate deviations from idealized individual sensor detection zones improves the effectiveness of the sensor network in detecting activities at a detection perimeter and can provide the desired sensor coverage of an area while minimizing unintentional gaps in coverage.

  19. A Novel Vehicle Stationary Detection Utilizing Map Matching and IMU Sensors

    Directory of Open Access Journals (Sweden)

    Md. Syedul Amin

    2014-01-01

    Full Text Available Precise navigation is a vital need for many modern vehicular applications. The global positioning system (GPS) cannot provide continuous navigation information in urban areas. The widely used inertial navigation system (INS) can provide the full vehicle state at high rates. However, its accuracy diverges quickly in low-cost microelectromechanical systems (MEMS) based INS due to bias, drift, noise, and other errors. These errors can be corrected in a stationary state, but detecting the stationary state is a challenging task. A novel stationary state detection technique based on the variation of acceleration, heading, and pitch and roll of an attitude heading reference system (AHRS) built from the inertial measurement unit (IMU) sensors is proposed. In addition, a map matching (MM) algorithm detects the intersections where the vehicle is likely to stop. Combining these two results, the stationary state is detected with a smaller timing window of 3 s. A longer timing window of 5 s is used when the stationary state is detected from the AHRS alone. The experimental results show that the stationary state is correctly identified and the position error is reduced by 90%, outperforming previously reported work. The proposed algorithm would help to reduce INS errors and enhance the performance of the navigation system.
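
    The AHRS-based part of the detection, declaring a stationary state when acceleration and heading barely vary over a short window, can be sketched as follows; the tolerances, sampling rate and window length are illustrative assumptions, not the paper's calibrated values.

        import numpy as np

        def is_stationary(accel: np.ndarray, heading: np.ndarray, window: int,
                          accel_tol: float = 0.05, heading_tol: float = 0.2) -> bool:
            """Stationary if acceleration and heading variation stay below the tolerances."""
            a = accel[-window:]
            h = heading[-window:]
            return bool(a.std(axis=0).max() < accel_tol and h.std() < heading_tol)

        # Illustrative use with a 3 s window at 100 Hz (the shorter window used when the
        # map-matching module also reports a likely stop at an intersection).
        rng = np.random.default_rng(3)
        accel = rng.normal(0.0, 0.01, (300, 3))    # x, y, z accelerometer noise at rest
        heading = rng.normal(90.0, 0.05, 300)      # nearly constant heading
        print(is_stationary(accel, heading, window=300))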

  20. Early detection of sporadic pancreatic cancer: strategic map for innovation--a white paper.

    Science.gov (United States)

    Kenner, Barbara J; Chari, Suresh T; Cleeter, Deborah F; Go, Vay Liang W

    2015-07-01

    Innovation leading to significant advances in research and subsequent translation to clinical practice is urgently necessary in early detection of sporadic pancreatic cancer. Addressing this need, the Early Detection of Sporadic Pancreatic Cancer Summit Conference was conducted by Kenner Family Research Fund in conjunction with the 2014 American Pancreatic Association and Japan Pancreas Society Meeting. International interdisciplinary scientific representatives engaged in strategic facilitated conversations based on distinct areas of inquiry: Case for Early Detection: Definitions, Detection, Survival, and Challenges; Biomarkers for Early Detection; Imaging; and Collaborative Studies. Ideas generated from the summit have led to the development of a Strategic Map for Innovation built upon 3 components: formation of an international collaborative effort, design of an actionable strategic plan, and implementation of operational standards, research priorities, and first-phase initiatives. Through invested and committed efforts of leading researchers and institutions, philanthropic partners, government agencies, and supportive business entities, this endeavor will change the future of the field and consequently the survival rate of those diagnosed with pancreatic cancer.

  1. Iris Matching Based on Personalized Weight Map.

    Science.gov (United States)

    Dong, Wenbo; Sun, Zhenan; Tan, Tieniu

    2011-09-01

    Iris recognition typically involves three steps, namely, iris image preprocessing, feature extraction, and feature matching. The first two steps of iris recognition have been well studied, but the last step is less addressed. Each human iris has its unique visual pattern and local image features also vary from region to region, which leads to significant differences in robustness and distinctiveness among the feature codes derived from different iris regions. However, most state-of-the-art iris recognition methods use a uniform matching strategy, where features extracted from different regions of the same person or the same region for different individuals are considered to be equally important. This paper proposes a personalized iris matching strategy using a class-specific weight map learned from the training images of the same iris class. The weight map can be updated online during the iris recognition procedure when the successfully recognized iris images are regarded as the new training data. The weight map reflects the robustness of an encoding algorithm on different iris regions by assigning an appropriate weight to each feature code for iris matching. Such a weight map trained by sufficient iris templates is convergent and robust against various noise. Extensive and comprehensive experiments demonstrate that the proposed personalized iris matching strategy achieves much better iris recognition performance than uniform strategies, especially for poor quality iris images.
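
    The matching step described above amounts to a weighted Hamming distance in which every bit of the iris code carries a class-specific weight learned from training images of the same iris. A minimal sketch, with random codes and a made-up weight map, is given below; the shapes and weighting scheme are illustrative, not the paper's exact formulation.

        import numpy as np

        def weighted_hamming(code_a: np.ndarray, code_b: np.ndarray, weight_map: np.ndarray) -> float:
            """Weighted Hamming distance between two binary iris codes."""
            disagreement = np.bitwise_xor(code_a, code_b)
            return float(np.sum(weight_map * disagreement) / np.sum(weight_map))

        # Illustrative use: 64 x 512 binary codes and a weight map favouring the inner iris rows.
        rng = np.random.default_rng(7)
        code_a = rng.integers(0, 2, (64, 512))
        code_b = rng.integers(0, 2, (64, 512))
        weights = np.linspace(1.0, 0.2, 64)[:, None] * np.ones((64, 512))
        print(weighted_hamming(code_a, code_b, weights))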

  2. Mapping genomic deletions down to the base

    DEFF Research Database (Denmark)

    Dunø, Morten; Hove, Hanne; Kirchhoff, Maria

    2004-01-01

    the breakpoint of the third patient was mapped to a region previously predicted to be prone for rearrangements. One patient also harboured an inversion in connection with the deletion that disrupted the HDAC9 gene. All three patients showed clinical characteristics reminiscent of the hand-foot-genital syndrome...

  3. Detection and Mapping of the Geomorphic Effects of Flooding Using UAV Photogrammetry

    Science.gov (United States)

    Langhammer, Jakub; Vacková, Tereza

    2018-04-01

    In this paper, we present a novel technique for the objective detection of the geomorphological effects of flooding in riverbeds and floodplains using imagery acquired by unmanned aerial vehicles (UAVs, also known as drones) equipped with a panchromatic camera. The proposed method is based on the fusion of the two key data products of UAV photogrammetry, the digital elevation model (DEM) and the orthoimage, as well as derived qualitative information, which together serve as the basis for object-based segmentation and the supervised classification of fluvial forms. The orthoimage is used to calculate textural features, enabling detection of the structural properties of the image area and supporting the differentiation of features with similar spectral responses but different surface structures. The DEM is used to derive a flood depth model and the terrain ruggedness index, supporting the detection of bank erosion. All the newly derived information layers are merged with the orthoimage to form a multi-band data set, which is used for object-based segmentation and the supervised classification of key fluvial forms resulting from flooding, i.e., fresh and old gravel accumulations, sand accumulations, and bank erosion. The method was tested on the effects of a snowmelt flood that occurred in December 2015 in a montane stream in the Sumava Mountains, Czech Republic, Central Europe. A multi-rotor UAV was used to collect images of a 1-km-long and 200-m-wide stretch of meandering stream with fresh traces of fluvial activity. The performed segmentation and classification proved that the fusion of 2D and 3D data with the derived qualitative layers significantly enhanced the reliability of the fluvial form detection process. The assessment accuracy for all of the detected classes exceeded 90%. The proposed technique proved its potential for application in rapid mapping and detection of the geomorphological effects of flooding.
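
    The terrain ruggedness index derived from the DEM can be computed with a simple neighborhood operation; the sketch below follows the common Riley-style definition (square root of the summed squared elevation differences to the eight neighbours), which is an assumption about the exact variant used in the study.

        import numpy as np

        def terrain_ruggedness_index(dem: np.ndarray) -> np.ndarray:
            """TRI per cell; border cells are left as 0 for simplicity (illustrative sketch)."""
            tri = np.zeros_like(dem, dtype=float)
            center = dem[1:-1, 1:-1]
            sq_sum = np.zeros_like(center, dtype=float)
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    if di == 0 and dj == 0:
                        continue
                    neighbour = dem[1 + di:dem.shape[0] - 1 + di, 1 + dj:dem.shape[1] - 1 + dj]
                    sq_sum += (neighbour - center) ** 2
            tri[1:-1, 1:-1] = np.sqrt(sq_sum)
            return tri

        # Illustrative use on a synthetic DEM standing in for the UAV photogrammetric model.
        dem = np.cumsum(np.random.rand(200, 200), axis=0)
        print(terrain_ruggedness_index(dem).max())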

  4. Potential of Pest and Host Phenological Data in the Attribution of Regional Forest Disturbance Detection Maps According to Causal Agent

    Science.gov (United States)

    Spruce, Joseph; Hargrove, William; Norman Steve; Christie, William

    2014-01-01

    Near real time forest disturbance detection maps from MODIS NDVI phenology data have been produced since 2010 for the conterminous U.S., as part of the on-line ForWarn national forest threat early warning system. The latter has been used by the forest health community to identify and track many regional forest disturbances caused by multiple biotic and abiotic damage agents. Attribution of causal agents for detected disturbances has been a goal since project initiation in 2006. Combined with detailed cover type maps, geospatial pest phenology data offer a potential means for narrowing the candidate causal agents responsible for a given biotic disturbance. U.S. Aerial Detection Surveys (ADS) employ such phenology data. Historic ADS products provide general locational data on recent insect-induced forest type specific disturbances that may help in determining candidate causal agents for MODIS-based disturbance maps, especially when combined with other historic geospatial disturbance data (e.g., wildfire burn scars and drought maps). Historic ADS disturbance detection polygons can show severe and extensive regional forest disturbances, though they also can show polygons with sparsely scattered or infrequent disturbances. Examples will be discussed that use various historic disturbance data to help determine potential causes of MODIS-detected regional forest disturbance anomalies.

  5. Domain similarity based orthology detection.

    Science.gov (United States)

    Bitard-Feildel, Tristan; Kemena, Carsten; Greenwood, Jenny M; Bornberg-Bauer, Erich

    2015-05-13

    Orthologous protein detection software mostly uses pairwise comparisons of amino-acid sequences to assert whether two proteins are orthologous or not. Accordingly, when the number of sequences for comparison increases, the number of comparisons to compute grows quadratically. A current challenge of bioinformatics research, especially when taking into account the increasing number of sequenced organisms available, is to make this ever-growing number of comparisons computationally feasible in a reasonable amount of time. We propose to speed up the detection of orthologous proteins by using strings of domains to characterize the proteins. We present two new protein similarity measures, a cosine score and a maximal weight matching score based on domain content similarity, and new software, named porthoDom. The quality of the cosine and maximal weight matching similarity measures is assessed against curated datasets. The measures show that domain content similarities are able to correctly group proteins into their families. Accordingly, the cosine similarity measure is used inside porthoDom, the wrapper developed for proteinortho. porthoDom makes use of domain content similarity measures to group proteins together before searching for orthologs. By using domains instead of amino acid sequences, the reduction of the search space decreases the computational complexity of an all-against-all sequence comparison. We demonstrate that representing and comparing proteins as strings of discrete domains, i.e. as a concatenation of their unique identifiers, allows a drastic simplification of the search space. porthoDom has the advantage of speeding up orthology detection while maintaining a degree of accuracy similar to proteinortho. The implementation of porthoDom is written in Python and C++ and is available under the GNU GPL licence 3 at http://www.bornberglab.org/pages/porthoda .
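
    To illustrate the idea of comparing proteins by domain content rather than by amino-acid sequence, here is a minimal, hypothetical sketch of a cosine similarity over domain-count vectors; the Pfam-style identifiers and the exact weighting are assumptions, not porthoDom's actual code.

```python
# Minimal, hypothetical sketch of a domain-content cosine similarity (not porthoDom's
# code): proteins are represented by the multiset of their domain identifiers.
from collections import Counter
from math import sqrt

def domain_cosine(domains_a, domains_b):
    """Cosine similarity between two proteins given as lists of domain identifiers."""
    a, b = Counter(domains_a), Counter(domains_b)
    dot = sum(a[d] * b[d] for d in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# example with made-up Pfam-style identifiers
print(domain_cosine(["PF00069", "PF07714"], ["PF00069", "PF00169"]))  # 0.5
```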

  6. State Base Map for GIS – New Digital Topographic Map of the Republic of Macedonia

    Directory of Open Access Journals (Sweden)

    Zlatko Srbinoski

    2009-12-01

    Full Text Available The basic aim of the National Spatial Data Infrastructure (NSDI), built in accordance with the INSPIRE directive, is to standardize spatial data infrastructure at the national level. In that direction, topographic maps are a basic platform for acquiring spatial data within geoinformation systems and one of the most important segments of the NSDI. This paper presents the methodology for establishing the new digital topographic map of the Republic of Macedonia, titled “State Base Map for GIS in Macedonia”, and analyzes the geometrical accuracy of the new digital topographic maps. Production of the new digital topographic map has been the most important cartographic project in the Republic of Macedonia since it became independent.

  7. Detecting mutually exclusive interactions in protein-protein interaction maps.

    KAUST Repository

    Sánchez Claros, Carmen; Tramontano, Anna

    2012-01-01

    Comprehensive protein interaction maps can complement genetic and biochemical experiments and allow the formulation of new hypotheses to be tested in the system of interest. The computational analysis of the maps may help to focus on interesting cases and thereby to appropriately prioritize the validation experiments. We show here that, by automatically comparing and analyzing structurally similar regions of proteins of known structure interacting with a common partner, it is possible to identify mutually exclusive interactions present in the maps with a sensitivity of 70% and a specificity higher than 85%, and that, in about three-fourths of the correctly identified complexes, we also correctly recognize at least one residue (five on average) belonging to the interaction interface. Given the present and continuously increasing number of proteins of known structure, the requirement of knowing the structure of the interacting proteins does not substantially impact the coverage of our strategy, which can be estimated to be around 25%. We also introduce the Estrella server, which embodies this strategy, is designed for users interested in validating specific hypotheses about the functional role of a protein-protein interaction, and also allows access to pre-computed data for seven organisms.

  8. Detecting mutually exclusive interactions in protein-protein interaction maps.

    KAUST Repository

    Sánchez Claros, Carmen

    2012-06-08

    Comprehensive protein interaction maps can complement genetic and biochemical experiments and allow the formulation of new hypotheses to be tested in the system of interest. The computational analysis of the maps may help to focus on interesting cases and thereby to appropriately prioritize the validation experiments. We show here that, by automatically comparing and analyzing structurally similar regions of proteins of known structure interacting with a common partner, it is possible to identify mutually exclusive interactions present in the maps with a sensitivity of 70% and a specificity higher than 85%, and that, in about three-fourths of the correctly identified complexes, we also correctly recognize at least one residue (five on average) belonging to the interaction interface. Given the present and continuously increasing number of proteins of known structure, the requirement of knowing the structure of the interacting proteins does not substantially impact the coverage of our strategy, which can be estimated to be around 25%. We also introduce the Estrella server, which embodies this strategy, is designed for users interested in validating specific hypotheses about the functional role of a protein-protein interaction, and also allows access to pre-computed data for seven organisms.

  9. Generalized logistic map and its application in chaos based cryptography

    Science.gov (United States)

    Lawnik, M.

    2017-12-01

    The logistic map is commonly used in, for example, chaos-based cryptography. However, its properties do not allow a safe construction of encryption algorithms. The scope of the paper is therefore a proposal to generalize the logistic map by means of a well-recognized family of chaotic maps. In the next step, the Lyapunov exponent and the distribution of the iterated variable are analyzed. The obtained results confirm that the analyzed model can safely and effectively replace the classic logistic map in applications involving chaotic cryptography.
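
    For readers unfamiliar with the underlying dynamics, the short sketch below estimates the Lyapunov exponent of the classic logistic map; it illustrates the kind of analysis mentioned in the abstract but does not reproduce the paper's generalized map, and the parameter values are arbitrary.

```python
# Minimal sketch: estimating the Lyapunov exponent of the classic logistic map
# x_{n+1} = r * x_n * (1 - x_n). This illustrates the analysis mentioned above but
# does not reproduce the paper's generalized map; parameters are arbitrary.
import math

def logistic_lyapunov(r, x0=0.3, n_transient=1000, n_iter=100000):
    x = x0
    for _ in range(n_transient):          # discard transient iterations
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        acc += math.log(abs(r * (1 - 2 * x)))   # log of the local derivative
    return acc / n_iter

print(logistic_lyapunov(4.0))   # ~ln(2) = 0.693 in the fully chaotic regime
```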

  10. Method for Stereo Mapping Based on Objectarx and Pipeline Technology

    Science.gov (United States)

    Liu, F.; Chen, T.; Lin, Z.; Yang, Y.

    2012-07-01

    Stereo mapping is an important way to acquire 4D products. Based on developments in stereo mapping and on the characteristics of ObjectARX and pipeline technology, a new stereo mapping scheme that realizes interaction between AutoCAD and a digital photogrammetry system is proposed. An experiment using the software MAP-AT (Modern Aerial Photogrammetry Automatic Triangulation) was conducted to verify the feasibility of the scheme; the results show that the scheme is feasible and is of great significance for integrating data acquisition and editing.

  11. Broadband illusion optical devices based on conformal mappings

    Science.gov (United States)

    Xiong, Zhan; Xu, Lin; Xu, Ya-Dong; Chen, Huan-Yang

    2017-10-01

    In this paper, we propose a simple method of illusion optics based on conformal mappings. By carefully designing specific conformal mappings, one can make an object look like another with a significantly different shape. In addition, the illusion optical devices can work over a broad band of frequencies.
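
    As a toy illustration of how a conformal mapping reshapes a boundary, the sketch below applies the Joukowski map w = z + 1/z to a circle; this is a textbook example chosen for brevity, not the specific mapping used in the paper.

```python
# Toy illustration of a conformal mapping (a textbook example, not the paper's design):
# the Joukowski map w = z + 1/z turns a circle into an aerofoil-like contour.
import numpy as np

theta = np.linspace(0.0, 2.0 * np.pi, 200)
z = -0.1 + 0.1j + 1.1 * np.exp(1j * theta)   # a circle offset from the origin
w = z + 1.0 / z                               # its conformal image

print(w[:3])   # plot z and w to compare the original and the "illusion" shape
```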

  12. Airborne detection and mapping of oil spills, Grand Bahamas, February 1973

    Energy Technology Data Exchange (ETDEWEB)

    Devilliers, J N

    1973-09-01

    An airborne exercise is described employing various sensors to investigate their ability to detect and map Louisiana crude and naphtha oil spills, both by day and by night. It is shown that photographic, infrared scanning, and low light level television all have some ability to detect Louisiana crude, but only infrared scanning detected naphtha. None of these sensors could identify the anomalies as oil. A laser fluorosensor showed promise in detecting oil at night. (Author) (GRA)

  13. Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    Science.gov (United States)

    Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.

    2016-08-01

    Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.

  14. Cellular telephone-based radiation detection instrument

    Science.gov (United States)

    Craig, William W [Pittsburg, CA]; Labov, Simon E [Berkeley, CA]

    2011-06-14

    A network of radiation detection instruments, each having a small solid state radiation sensor module integrated into a cellular phone for providing radiation detection data and analysis directly to a user. The sensor module includes a solid-state crystal bonded to an ASIC readout providing a low cost, low power, light weight compact instrument to detect and measure radiation energies in the local ambient radiation field. In particular, the photon energy, time of event, and location of the detection instrument at the time of detection is recorded for real time transmission to a central data collection/analysis system. The collected data from the entire network of radiation detection instruments are combined by intelligent correlation/analysis algorithms which map the background radiation and detect, identify and track radiation anomalies in the region.

  15. Measurable realistic image-based 3D mapping

    Science.gov (United States)

    Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.

    2011-12-01

    Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data are obtained for the construction of 3D visualized models. The 3D map not only provides the capabilities of 3D measurement and knowledge mining, but also provides the virtual experience of places of interest, as demonstrated in Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high-quality 3D models is time consuming and requires robust hardware and powerful software to handle the enormous amount of data. This is especially true for the automatic generation of 3D models and the representation of complicated surfaces, which still need improvements in visualisation techniques. The shortcoming of 3D model-based maps is their limited coverage of detail, since a user can only view and measure objects that are already modelled in the virtual environment. This paper proposes and demonstrates a 3D map concept that is realistic and image-based, and that enables geometric measurements and geo-location services. Additionally, image-based 3D maps provide more detailed information about the real world than 3D model-based maps. Image-based 3D maps use geo-referenced stereo images or panoramic images. The geometric relationships between objects in the images can be resolved from the geometric model of the stereo images. The panoramic function makes 3D maps more interactive with users and also creates an immersive experience. Unmeasurable image-based 3D maps already exist, such as Google Street View, but they only provide virtual experiences in terms of photos; topographic and terrain attributes, such as shapes and heights, are omitted. This paper also discusses the potential for using a low-cost land Mobile Mapping System (MMS) to implement realistic image-based 3D mapping, and evaluates the positioning accuracy that a measurable

  16. Detecting coevolving amino acid sites using Bayesian mutational mapping

    DEFF Research Database (Denmark)

    Dimmic, Matthew W.; Hubisz, Melissa J.; Bustamente, Carlos D.

    2005-01-01

    Motivation: The evolution of protein sequences is constrained by complex interactions between amino acid residues. Because harmful substitutions may be compensated for by other substitutions at neighboring sites, residues can coevolve. We describe a Bayesian phylogenetic approach to the detection...

  17. Approach of simultaneous localization and mapping based on local maps for robot

    Institute of Scientific and Technical Information of China (English)

    CHEN Bai-fan; CAI Zi-xing; HU De-wen

    2006-01-01

    An extended Kalman filter approach to simultaneous localization and mapping (SLAM) based on local maps was proposed. A local frame of reference was established periodically at the position of the robot, and the observations of the robot and landmarks were then fused into the global frame of reference. Because of the independence of the local maps, the approach does not accumulate the estimation and computation errors that are produced when SLAM uses the Kalman filter directly. At the same time, it reduces the computational complexity. The method is shown to be correct and feasible in simulation experiments.

  18. A perceptual map for gait symmetry quantification and pathology detection.

    Science.gov (United States)

    Moevus, Antoine; Mignotte, Max; de Guise, Jacques A; Meunier, Jean

    2015-10-29

    Gait is an essential human activity and the result of collaborative interactions between the neurological, articular and musculoskeletal systems working efficiently together. This explains why gait analysis is important and increasingly used nowadays for the diagnosis of many different types (neurological, muscular, orthopedic, etc.) of diseases. This paper introduces a novel method to quickly visualize the different parts of the body related to an asymmetric movement in the human gait of a patient for daily clinical usage. The proposed gait analysis algorithm relies on the fact that a healthy walk has (temporally shift-invariant) symmetry properties in the coronal plane. The goal is to provide an inexpensive and easy-to-use method, exploiting an affordable consumer depth sensor, the Kinect, to measure gait asymmetry and display the results in a perceptual way. We propose a multi-dimensional scaling mapping using a temporally shift-invariant distance, allowing us to efficiently visualize (in terms of perceptual color difference) the asymmetric body parts of the gait cycle of a subject. We also propose an index computed from this map that quantifies the degree of asymmetry locally and globally. The proposed index is shown to be statistically significant, and this new, inexpensive, marker-less, non-invasive, easy-to-set-up gait analysis system offers a readable and flexible tool for clinicians to analyze gait characteristics and provide a fast diagnosis. This system, which estimates a perceptual color map providing a quick overview of the asymmetry existing in the gait cycle of a subject, can be easily exploited for monitoring disease progression, for tracking recovery after surgery (e.g., to check the healing process or the effect of a treatment or a prosthesis), or for other pathologies where gait asymmetry may be a symptom.
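
    The following hypothetical sketch conveys the flavour of a multi-dimensional scaling (MDS) perceptual map: pairwise dissimilarities between body-region signals are embedded in three dimensions and rescaled for use as colours. The distance used here is a plain Euclidean placeholder, not the paper's temporally shift-invariant measure, and the data are random.

```python
# Hypothetical sketch of an MDS "perceptual map": pairwise dissimilarities between
# body-region signals are embedded in 3D and rescaled for use as colours. The plain
# Euclidean distance and random data are placeholders, not the paper's
# shift-invariant measure or Kinect depth data.
import numpy as np
from sklearn.manifold import MDS

signals = np.random.rand(6, 120)              # toy per-region gait signals (n_regions, n_frames)

# placeholder dissimilarity matrix between regions
diss = np.linalg.norm(signals[:, None, :] - signals[None, :, :], axis=-1)

embedding = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
coords = embedding.fit_transform(diss)        # one 3D point per body region
colors = (coords - coords.min()) / (np.ptp(coords) + 1e-9)
print(colors)                                 # rows can be read as RGB colours
```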

  19. Multi-Modal Detection and Mapping of Static and Dynamic Obstacles in Agriculture for Process Evaluation

    Directory of Open Access Journals (Sweden)

    Timo Korthals

    2018-03-01

    Full Text Available Today, agricultural vehicles are available that can automatically perform tasks such as weed detection and spraying, mowing, and sowing while being steered automatically. However, for such systems to be fully autonomous and self-driven, not only their specific agricultural tasks must be automated. An accurate and robust perception system automatically detecting and avoiding all obstacles must also be realized to ensure safety of humans, animals, and other surroundings. In this paper, we present a multi-modal obstacle and environment detection and recognition approach for process evaluation in agricultural fields. The proposed pipeline detects and maps static and dynamic obstacles globally, while providing process-relevant information along the traversed trajectory. Detection algorithms are introduced for a variety of sensor technologies, including range sensors (lidar and radar and cameras (stereo and thermal. Detection information is mapped globally into semantical occupancy grid maps and fused across all sensors with late fusion, resulting in accurate traversability assessment and semantical mapping of process-relevant categories (e.g., crop, ground, and obstacles. Finally, a decoding step uses a Hidden Markov model to extract relevant process-specific parameters along the trajectory of the vehicle, thus informing a potential control system of unexpected structures in the planned path. The method is evaluated on a public dataset for multi-modal obstacle detection in agricultural fields. Results show that a combination of multiple sensor modalities increases detection performance and that different fusion strategies must be applied between algorithms detecting similar and dissimilar classes.

  20. An Approach of Dynamic Object Removing for Indoor Mapping Based on UGV SLAM

    Directory of Open Access Journals (Sweden)

    Jian Tang

    2015-07-01

    Full Text Available The study of indoor mapping for Location Based Services (LBS) has become more and more popular in recent years. LiDAR-SLAM-based mapping appears to be a promising indoor mapping solution. However, dynamic objects such as pedestrians and indoor vehicles exist in the raw LiDAR range data and have to be removed for mapping purposes. In this paper, a new approach to dynamic object removal called Likelihood Grid Voting (LGV) is presented. It is a model-free method and takes full advantage of the high scanning rate of the LiDAR, which moves at a relatively low speed in indoor environments. In this method, a counting grid is allocated to record how often each map position is occupied by laser scans; positions with low counter values are recognized as dynamic objects, and the corresponding points are removed from the map. This work is part of the algorithms in our self-developed Unmanned Ground Vehicle (UGV) Simultaneous Localization and Mapping (SLAM) system, NAVIS. Field tests were carried out in an indoor parking garage with NAVIS to evaluate the effectiveness of the proposed method. The results show that small objects such as pedestrians can be detected and removed quickly, while large objects such as cars can be detected and partially removed.
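
    A hedged sketch of the counting-grid idea is given below: cells hit by many scans are treated as static, while rarely hit cells are treated as dynamic and dropped. The cell size, vote threshold and toy data are assumptions, and the code only approximates the LGV method described above.

```python
# Hedged sketch of a likelihood-grid-voting style filter (an approximation of the
# LGV idea, not the paper's implementation). Cells hit by many scans are kept as
# static; rarely hit cells are treated as dynamic and dropped.
import numpy as np

def filter_dynamic_points(scans, cell_size=0.1, min_votes=5):
    """scans: list of (N_i, 2) arrays of world-frame LiDAR points from successive scans."""
    votes = {}
    for scan in scans:
        cells = {tuple(c) for c in np.floor(scan / cell_size).astype(int)}
        for c in cells:                       # one vote per cell per scan
            votes[c] = votes.get(c, 0) + 1
    kept = []
    for scan in scans:
        cells = np.floor(scan / cell_size).astype(int)
        mask = np.array([votes[tuple(c)] >= min_votes for c in cells])
        kept.append(scan[mask])
    return np.vstack(kept)

# toy usage: two fixed (static) points plus a few random (dynamic) points per scan
static = np.array([[1.0, 1.0], [2.0, 2.0]])
scans = [np.vstack([static, np.random.rand(5, 2) * 50]) for _ in range(10)]
print(filter_dynamic_points(scans).shape)     # mostly the repeated static points survive
```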

  1. A fast image encryption algorithm based on chaotic map

    Science.gov (United States)

    Liu, Wenhao; Sun, Kehui; Zhu, Congxu

    2016-09-01

    Derived from the Sine map and the iterative chaotic map with infinite collapse (ICMIC), a new two-dimensional Sine ICMIC modulation map (2D-SIMM) is proposed based on a close-loop modulation coupling (CMC) model, and its chaotic performance is analyzed by means of phase diagrams, the Lyapunov exponent spectrum and complexity. The analysis shows that this map has good ergodicity, hyperchaotic behavior, a large maximum Lyapunov exponent and high complexity. Based on this map, a fast image encryption algorithm is proposed, in which the confusion and diffusion processes are combined into a single stage. A chaotic shift transform (CST) is proposed to efficiently change the image pixel positions, and row and column substitutions are applied to scramble the pixel values simultaneously. The simulation and analysis results show that this algorithm has high security, low time complexity, and the ability to resist statistical, differential, brute-force, known-plaintext and chosen-plaintext attacks.

  2. Multi-lane detection based on multiple vanishing points detection

    Science.gov (United States)

    Li, Chuanxiang; Nie, Yiming; Dai, Bin; Wu, Tao

    2015-03-01

    Lane detection plays a significant role in Advanced Driver Assistance Systems (ADAS) for intelligent vehicles. In this paper we present a multi-lane detection method based on the detection of multiple vanishing points. A new multi-lane model assumes that a single lane, which has two approximately parallel boundaries, may not be parallel to others on the road plane. Non-parallel lanes are associated with different vanishing points. A biologically plausible model is used to detect multiple vanishing points and fit the lane model. Experimental results show that the proposed method can detect both parallel and non-parallel lanes.

  3. Detection and mapping of polar stratospheric clouds using limb scattering observations

    Directory of Open Access Journals (Sweden)

    C. von Savigny

    2005-01-01

    Full Text Available Satellite-based measurements of visible/NIR limb-scattered solar radiation are well suited for the detection and mapping of polar stratospheric clouds (PSCs). This publication describes a method to detect PSCs from limb scattering observations with the Scanning Imaging Absorption spectroMeter for Atmospheric CartograpHY (SCIAMACHY) on the European Space Agency's Envisat spacecraft. The method is based on a color-index approach and requires a priori knowledge of the stratospheric background aerosol loading in order to avoid false PSC identifications caused by background aerosol. The method is applied to a sample data set including the 2003 PSC season in the Southern Hemisphere. The PSCs are correlated with coincident UKMO model temperature data, and with very few exceptions, the detected PSCs occur at temperatures below 195–198 K. Monthly averaged PSC descent rates are about 1.5 km/month for the −50° S to −75° S latitude range and reach a maximum of about 2.5 km/month between August and September. The main cause of the PSC descent is the slow descent of the lower stratospheric temperature minimum.

  4. Pseudo-random bit generator based on Chebyshev map

    Science.gov (United States)

    Stoyanov, B. P.

    2013-10-01

    In this paper, we study a pseudo-random bit generator based on two Chebyshev polynomial maps. The novel derivative algorithm shows perfect statistical properties, as established by a number of statistical tests.
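
    As a purely illustrative sketch (not the paper's generator), the snippet below produces bits by iterating the Chebyshev map and thresholding the state; the seed and map order are arbitrary, and no statistical-test guarantees are implied.

```python
# Purely illustrative bit generator (not the paper's algorithm): iterate the
# Chebyshev map x_{n+1} = cos(k * arccos(x_n)) and threshold the state.
# Seed and map order are arbitrary; no statistical guarantees are implied.
import math

def chebyshev_bits(x0=0.7, k=4, n_bits=32):
    x, bits = x0, []
    for _ in range(n_bits):
        x = math.cos(k * math.acos(x))
        bits.append(1 if x > 0 else 0)
    return bits

print(chebyshev_bits())
```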

  5. An integrated genetic map based on four mapping populations and quantitative trait loci associated with economically important traits in watermelon (Citrullus lanatus)

    Science.gov (United States)

    2014-01-01

    Background Modern watermelon (Citrullus lanatus L.) cultivars share a narrow genetic base due to many years of selection for desirable horticultural qualities. Wild subspecies within C. lanatus are important potential sources of novel alleles for watermelon breeding, but successful trait introgression into elite cultivars has had limited success. The application of marker assisted selection (MAS) in watermelon is yet to be realized, mainly due to the past lack of high quality genetic maps. Recently, a number of useful maps have become available, however these maps have few common markers, and were constructed using different marker sets, thus, making integration and comparative analysis among maps difficult. The objective of this research was to use single-nucleotide polymorphism (SNP) anchor markers to construct an integrated genetic map for C. lanatus. Results Under the framework of the high density genetic map, an integrated genetic map was constructed by merging data from four independent mapping experiments using a genetically diverse array of parental lines, which included three subspecies of watermelon. The 698 simple sequence repeat (SSR), 219 insertion-deletion (InDel), 36 structure variation (SV) and 386 SNP markers from the four maps were used to construct an integrated map. This integrated map contained 1339 markers, spanning 798 cM with an average marker interval of 0.6 cM. Fifty-eight previously reported quantitative trait loci (QTL) for 12 traits in these populations were also integrated into the map. In addition, new QTL identified for brix, fructose, glucose and sucrose were added. Some QTL associated with economically important traits detected in different genetic backgrounds mapped to similar genomic regions of the integrated map, suggesting that such QTL are responsible for the phenotypic variability observed in a broad array of watermelon germplasm. Conclusions The integrated map described herein enhances the utility of genomic tools over

  6. An integrated genetic map based on four mapping populations and quantitative trait loci associated with economically important traits in watermelon (Citrullus lanatus).

    Science.gov (United States)

    Ren, Yi; McGregor, Cecilia; Zhang, Yan; Gong, Guoyi; Zhang, Haiying; Guo, Shaogui; Sun, Honghe; Cai, Wantao; Zhang, Jie; Xu, Yong

    2014-01-20

    Modern watermelon (Citrullus lanatus L.) cultivars share a narrow genetic base due to many years of selection for desirable horticultural qualities. Wild subspecies within C. lanatus are important potential sources of novel alleles for watermelon breeding, but successful trait introgression into elite cultivars has had limited success. The application of marker assisted selection (MAS) in watermelon is yet to be realized, mainly due to the past lack of high quality genetic maps. Recently, a number of useful maps have become available, however these maps have few common markers, and were constructed using different marker sets, thus, making integration and comparative analysis among maps difficult. The objective of this research was to use single-nucleotide polymorphism (SNP) anchor markers to construct an integrated genetic map for C. lanatus. Under the framework of the high density genetic map, an integrated genetic map was constructed by merging data from four independent mapping experiments using a genetically diverse array of parental lines, which included three subspecies of watermelon. The 698 simple sequence repeat (SSR), 219 insertion-deletion (InDel), 36 structure variation (SV) and 386 SNP markers from the four maps were used to construct an integrated map. This integrated map contained 1339 markers, spanning 798 cM with an average marker interval of 0.6 cM. Fifty-eight previously reported quantitative trait loci (QTL) for 12 traits in these populations were also integrated into the map. In addition, new QTL identified for brix, fructose, glucose and sucrose were added. Some QTL associated with economically important traits detected in different genetic backgrounds mapped to similar genomic regions of the integrated map, suggesting that such QTL are responsible for the phenotypic variability observed in a broad array of watermelon germplasm. The integrated map described herein enhances the utility of genomic tools over previous watermelon genetic maps. A

  7. Simulation of seagrass bed mapping by satellite images based on the radiative transfer model

    Science.gov (United States)

    Sagawa, Tatsuyuki; Komatsu, Teruhisa

    2015-06-01

    Seagrass and seaweed beds play important roles in coastal marine ecosystems. They are food sources and habitats for many marine organisms, and influence the physical, chemical, and biological environment. They are sensitive to human impacts such as reclamation and pollution. Therefore, their management and preservation are necessary for a healthy coastal environment. Satellite remote sensing is a useful tool for mapping and monitoring seagrass beds. The efficiency of seagrass mapping, seagrass bed classification in particular, has been evaluated by mapping accuracy using an error matrix. However, mapping accuracies are influenced by coastal environments such as seawater transparency, bathymetry, and substrate type. Coastal management requires sufficient accuracy and an understanding of mapping limitations for monitoring coastal habitats including seagrass beds. Previous studies are mainly based on case studies in specific regions and seasons. Extensive data are required to generalise assessments of classification accuracy from case studies, which has proven difficult. This study aims to build a simulator based on a radiative transfer model to produce modelled satellite images and assess the visual detectability of seagrass beds under different transparencies and seagrass coverages, as well as to examine mapping limitations and classification accuracy. Our simulations led to the development of a model of water transparency and the mapping of depth limits and indicated the possibility for seagrass density mapping under certain ideal conditions. The results show that modelling satellite images is useful in evaluating the accuracy of classification and that establishing seagrass bed monitoring by remote sensing is a reliable tool.

  8. Automatic concrete cracks detection and mapping of terrestrial laser scan data

    Directory of Open Access Journals (Sweden)

    Mostafa Rabah

    2013-12-01

    The current paper presents a method for automatic concrete crack detection and mapping from data obtained during a laser scanning survey. Crack detection and mapping are achieved in three steps: shading correction of the original image, crack detection, and crack mapping and processing. The detected crack is defined in a pixel coordinate system. To remap the crack into the reference coordinate system, a reverse-engineering approach is used, based on a hybrid concept of terrestrial laser-scanner point clouds and the corresponding camera image, i.e., a conversion from the pixel coordinate system to the terrestrial laser-scanner or global coordinate system. The results of the experiment show that the mean differences between the terrestrial laser scan and the total station are about 30.5, 16.4 and 14.3 mm in the x, y and z directions, respectively.

  9. Microcontroller based driver alertness detection systems to detect drowsiness

    Science.gov (United States)

    Adenin, Hasibah; Zahari, Rahimi; Lim, Tiong Hoo

    2018-04-01

    The advancement of embedded systems for detecting and preventing drowsiness in a vehicle is a major challenge for road traffic accident prevention. To prevent drowsiness while driving, it is necessary to have an alert system that can detect a decline in driver concentration and send a signal to the driver. Studies have shown that traffic accidents usually occur when the driver is distracted while driving. In this paper, we review a number of detection systems that monitor the concentration of a car driver and propose a portable Driver Alertness Detection System (DADS) that determines the level of concentration of the driver based on a pixelated coloration detection technique using facial recognition. A portable camera is placed at the front visor to capture facial expressions and eye activity. We evaluated DADS using 26 participants and achieved a 100% detection rate under good lighting conditions and a lower detection rate at night.

  10. Absence of rotational activity detected using 2-dimensional phase mapping in the corresponding 3-dimensional phase maps in human persistent atrial fibrillation.

    Science.gov (United States)

    Pathik, Bhupesh; Kalman, Jonathan M; Walters, Tomos; Kuklik, Pawel; Zhao, Jichao; Madry, Andrew; Sanders, Prashanthan; Kistler, Peter M; Lee, Geoffrey

    2018-02-01

    Current phase mapping systems for atrial fibrillation create 2-dimensional (2D) maps. This process may affect the accurate detection of rotors. We developed a 3-dimensional (3D) phase mapping technique that uses the 3D locations of basket electrodes to project phase onto patient-specific left atrial 3D surface anatomy. We sought to determine whether rotors detected in 2D phase maps were present at the corresponding time segments and anatomical locations in 3D phase maps. One-minute left atrial atrial fibrillation recordings were obtained in 14 patients using the basket catheter and analyzed off-line. Using the same phase values, 2D and 3D phase maps were created. Analysis involved determining the dominant propagation patterns in 2D phase maps and evaluating the presence of rotors detected in 2D phase maps in the corresponding 3D phase maps. Using 2D phase mapping, the dominant propagation pattern was single wavefront (36.6%) followed by focal activation (34.0%), disorganized activity (23.7%), rotors (3.3%), and multiple wavefronts (2.4%). Ten transient rotors were observed in 9 of 14 patients (64%). The mean rotor duration was 1.1 ± 0.7 seconds. None of the 10 rotors observed in 2D phase maps were seen at the corresponding time segments and anatomical locations in 3D phase maps; 4 of 10 corresponded with single wavefronts in 3D phase maps, 2 of 10 with 2 simultaneous wavefronts, 1 of 10 with disorganized activity, and in 3 of 10 there was no coverage by the basket catheter at the corresponding 3D anatomical location. Rotors detected in 2D phase maps were not observed in the corresponding 3D phase maps. These findings may have implications for current systems that use 2D phase mapping. Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  11. ETHYL CYANIDE ON TITAN: SPECTROSCOPIC DETECTION AND MAPPING USING ALMA

    Energy Technology Data Exchange (ETDEWEB)

    Cordiner, M. A.; Palmer, M. Y.; Nixon, C. A.; Charnley, S. B.; Mumma, M. J.; Serigano, J. [NASA Goddard Space Flight Center, 8800 Greenbelt Road, Greenbelt, MD 20771 (United States); Irwin, P. G. J. [Atmospheric, Oceanic and Planetary Physics, Clarendon Laboratory, University of Oxford, Parks Road, Oxford, OX1 3PU (United Kingdom); Teanby, N. A. [School of Earth Sciences, University of Bristol, Wills Memorial Building, Queen’s Road, Bristol, BS8 1RJ (United Kingdom); Kisiel, Z. [Institute of Physics, Polish Academy of Sciences, Al. Lotnikøw 32/46, 02-668 Warszawa (Poland); Kuan, Y.-J.; Chuang, Y.-L. [National Taiwan Normal University, Taipei 116, Taiwan (China); Wang, K.-S., E-mail: martin.cordiner@nasa.gov [Institute of Astronomy and Astrophysics, Academia Sinica, Taipei 106, Taiwan (China)

    2015-02-10

    We report the first spectroscopic detection of ethyl cyanide (C2H5CN) in Titan's atmosphere, obtained using spectrally and spatially resolved observations of multiple emission lines with the Atacama Large Millimeter/submillimeter Array (ALMA). The presence of C2H5CN in Titan's ionosphere was previously inferred from Cassini ion mass spectrometry measurements of C2H5CNH+. Here we report the detection of 27 rotational lines from C2H5CN (in 19 separate emission features detected at >3σ confidence) in the frequency range 222–241 GHz. Simultaneous detections of multiple emission lines from HC3N, CH3CN, and CH3CCH were also obtained. In contrast to HC3N, CH3CN, and CH3CCH, which peak in Titan's northern (spring) hemisphere, the emission from C2H5CN is found to be concentrated in the southern (autumn) hemisphere, suggesting a distinctly different chemistry for this species, consistent with a relatively short chemical lifetime for C2H5CN. Radiative transfer models show that C2H5CN is most concentrated at altitudes ≳200 km, suggesting production predominantly in the stratosphere and above. Vertical column densities are found to be in the range (1–5) × 10^14 cm^−2.

  12. A Lithology Based Map Unit Schema For Onegeology Regional Geologic Map Integration

    Science.gov (United States)

    Moosdorf, N.; Richard, S. M.

    2012-12-01

    A system of lithogenetic categories for a global lithological map (GLiM, http://www.ifbm.zmaw.de/index.php?id=6460&L=3) has been compiled based on analysis of lithology/genesis categories for regional geologic maps for the entire globe. The scheme is presented for discussion and comment. Analysis of units on a variety of regional geologic maps indicates that units are defined based on assemblages of rock types, as well as their genetic type. In this compilation of continental geology, outcropping surface materials are dominantly sediment/sedimentary rock; major subdivisions of the sedimentary category include clastic sediment, carbonate sedimentary rocks, clastic sedimentary rocks, mixed carbonate and clastic sedimentary rock, colluvium and residuum. Significant areas of mixed igneous and metamorphic rock are also present. A system of global categories to characterize the lithology of regional geologic units is important for Earth System models of matter fluxes to soils, ecosystems, rivers and oceans, and for regional analysis of Earth surface processes at global scale. Because different applications of the classification scheme will focus on different lithologic constituents in mixed units, an ontology-type representation of the scheme that assigns properties to the units in an analyzable manner will be pursued. The OneGeology project is promoting deployment of geologic map services at million scale for all nations. Although initial efforts are commonly simple scanned map WMS services, the intention is to move towards data-based map services that categorize map units with standard vocabularies to allow use of a common map legend for better visual integration of the maps (e.g. see OneGeology Europe, http://onegeology-europe.brgm.fr/geoportal/viewer.jsp). Current categorization of regional units with a single lithology from the CGI SimpleLithology (http://resource.geosciml.org/201202/Vocab2012html/SimpleLithology201012.html) vocabulary poorly captures the

  13. The evolution of mapping habitat for northern spotted owls (Strix occidentalis caurina): A comparison of photo-interpreted, Landsat-based, and lidar-based habitat maps

    Science.gov (United States)

    Ackers, Steven H.; Davis, Raymond J.; Olsen, K.; Dugger, Catherine

    2015-01-01

    Wildlife habitat mapping has evolved at a rapid pace over the last few decades. Beginning with simple, often subjective, hand-drawn maps, habitat mapping now involves complex species distribution models (SDMs) using mapped predictor variables derived from remotely sensed data. For species that inhabit large geographic areas, remote sensing technology is often essential for producing range-wide maps. Habitat monitoring for northern spotted owls (Strix occidentalis caurina), whose geographic range covers about 23 million ha, is based on SDMs that use Landsat Thematic Mapper imagery to create forest vegetation data layers using gradient nearest neighbor (GNN) methods. Vegetation data layers derived from GNN are modeled relationships between forest inventory plot data, climate and topographic data, and the spectral signatures acquired by the satellite. When used as predictor variables for SDMs, there is some transference of the GNN modeling error to the final habitat map. Recent increases in the use of light detection and ranging (lidar) data, coupled with the need to produce spatially accurate and detailed forest vegetation maps, have spurred interest in its use for SDMs and habitat mapping. Instead of modeling predictor variables from remotely sensed spectral data, lidar provides direct measurements of vegetation height for use in SDMs. We expect an SDM habitat map produced from directly measured predictor variables to be more accurate than one produced from modeled predictors. We used maximum entropy (Maxent) SDM modeling software to compare predictive performance and estimates of habitat area between Landsat-based and lidar-based northern spotted owl SDMs and habitat maps. We explored the differences and similarities between these maps and a pre-existing aerial photo-interpreted habitat map produced by local wildlife biologists. The lidar-based map had the highest predictive performance based on 10 bootstrapped replicate models (AUC = 0.809 ± 0.011), but the

  14. Clustering of the Self-Organizing Map based Approach in Induction Machine Rotor Faults Diagnostics

    Directory of Open Access Journals (Sweden)

    Ahmed TOUMI

    2009-12-01

    Full Text Available Self-Organizing Maps (SOM) are an excellent method for analyzing multidimensional data. SOM-based classification is attractive due to its unsupervised learning and topology-preserving properties. In this paper, the performance of self-organizing methods is investigated for induction motor rotor fault detection and severity evaluation. The SOM is based on motor current signature analysis (MCSA). An agglomerative hierarchical algorithm using Ward's method is applied to automatically divide the map into interesting, interpretable groups of map units that correspond to clusters in the input data. The results obtained with this approach make it possible to detect a rotor bar fault directly from the visualization results. The system is also able to estimate the extent of rotor faults.
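
    The sketch below is a minimal, from-scratch self-organizing map intended only to show the mechanics (best-matching-unit search and neighbourhood-weighted updates); the grid size, learning schedule and toy two-cluster data standing in for healthy and faulty current signatures are assumptions, not the paper's MCSA pipeline.

```python
# Minimal from-scratch self-organizing map, shown only to illustrate the mechanics
# (best-matching-unit search and neighbourhood-weighted updates). Grid size, learning
# schedule and the toy two-cluster data are assumptions, not the paper's MCSA pipeline.
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=2.0):
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    ys, xs = np.mgrid[0:h, 0:w]
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in data:
            frac = step / n_steps
            lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
            d = np.linalg.norm(weights - x, axis=2)            # distance to every map unit
            by, bx = np.unravel_index(np.argmin(d), d.shape)   # best-matching unit
            g = np.exp(-((ys - by) ** 2 + (xs - bx) ** 2) / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)       # neighbourhood update
            step += 1
    return weights

# toy data: two clusters standing in for "healthy" and "broken rotor bar" signatures
data = np.vstack([rng.normal(0, 0.1, (50, 3)), rng.normal(1, 0.1, (50, 3))])
print(train_som(data).shape)   # (8, 8, 3): the trained map units
```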

  15. A Mapping Method of SLAM Based on a Look-Up Table

    Science.gov (United States)

    Wang, Z.; Li, J.; Wang, A.; Wang, J.

    2017-09-01

    In recent years several V-SLAM (Visual Simultaneous Localization and Mapping) approaches have appeared, showing impressive reconstructions of the world. However, these maps are built with far more information than required, a limitation that comes from processing the whole of each key-frame. In this paper we present, for the first time, a mapping method for visual SLAM based on a look-up table (LUT) that can improve mapping effectively. As the method relies on extracting features in each cell into which the image is divided, it can obtain a camera pose that is more representative of the whole key-frame. The tracking direction of key-frames is obtained by counting the parallax directions of the feature points. The LUT stores, for each tracking direction, the cells needed for mapping, which reduces redundant information in the key-frame and makes mapping more efficient. The results show that a better map with less noise is built in less than one-third of the time. We believe that the capacity of the LUT to build maps efficiently makes it a good choice for the community to investigate in scene reconstruction problems.

  16. Potential fire detection based on Kalman-driven change detection

    CSIR Research Space (South Africa)

    Van Den Bergh, F

    2009-07-01

    Full Text Available A new active fire event detection algorithm for data collected with the Spinning Enhanced Visible and Infrared Imager (SEVIRI) sensor, based on the extended Kalman filter, is introduced. Instead of using the observed temperatures of the spatial...
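
    The core idea of Kalman-driven change detection can be conveyed with a hedged scalar sketch: a Kalman filter tracks the expected background value of a pixel, and observations far above the prediction are flagged. The random-walk model, noise values and threshold below are assumptions, not the SEVIRI algorithm itself.

```python
# Hedged scalar sketch of Kalman-driven change detection (not the SEVIRI algorithm):
# a random-walk Kalman filter tracks the expected background brightness temperature
# of a pixel; observations far above the prediction are flagged as potential fires.
import numpy as np

def detect_anomalies(obs, q=0.05, r=1.0, threshold=3.0):
    x, p = obs[0], 1.0                   # state estimate and its variance
    flags = []
    for z in obs:
        p = p + q                        # predict (random-walk background model)
        innov, s = z - x, p + r          # innovation and its variance
        flags.append(innov / np.sqrt(s) > threshold)
        k = p / s                        # Kalman gain, then update
        x, p = x + k * innov, (1 - k) * p
    return np.array(flags)

temps = np.r_[300.0 + np.random.randn(20), 315.0]   # sudden jump = fire-like anomaly
print(detect_anomalies(temps))
```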

  17. Comparison of halo detection from noisy weak lensing convergence maps with Gaussian smoothing and MRLens treatment

    International Nuclear Information System (INIS)

    Jiao Yangxiu; Shan Huanyuan; Fan Zuhui

    2011-01-01

    Taking into account the noise from intrinsic ellipticities of source galaxies, we study the efficiency and completeness of halo detections from weak lensing convergence maps. Particularly, with numerical simulations, we compare the Gaussian filter with the so-called MRLens treatment based on the modification of the Maximum Entropy Method. For a pure noise field without lensing signals, a Gaussian smoothing results in a residual noise field that is approximately Gaussian in terms of statistics if a large enough number of galaxies are included in the smoothing window. On the other hand, the noise field after the MRLens treatment is significantly non-Gaussian, resulting in complications in characterizing the noise effects. Considering weak-lensing cluster detections, although the MRLens treatment effectively deletes false peaks arising from noise, it removes the real peaks heavily due to its inability to distinguish real signals with relatively low amplitudes from noise in its restoration process. The higher the noise level is, the larger the removal effects are for the real peaks. For a survey with a source density n_g ∼ 30 arcmin^−2, the number of peaks found in an area of 3 × 3 deg^2 after MRLens filtering is only ∼ 50 for the detection threshold κ = 0.02, while the number of halos with M > 5 × 10^13 M_⊙ and with redshift z ≤ 2 in the same area is expected to be ∼ 530. For the Gaussian smoothing treatment, the number of detections is ∼ 260, much larger than that of the MRLens. The Gaussianity of the noise statistics in the Gaussian smoothing case adds further advantages for this method to circumvent the problem of the relatively low efficiency in weak-lensing cluster detections. Therefore, in studies aiming to construct large cluster samples from weak-lensing surveys, the Gaussian smoothing method performs significantly better than the MRLens treatment.
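
    A minimal sketch of the Gaussian-smoothing route to peak detection is shown below for a noise-only convergence field; the smoothing scale, grid size and κ = 0.02 threshold follow the spirit of the abstract, but the code is illustrative and does not reproduce the MRLens treatment.

```python
# Minimal sketch of Gaussian smoothing plus peak detection on a noise-only
# convergence field; smoothing scale, grid size and the kappa = 0.02 threshold are
# illustrative, and the MRLens treatment is not reproduced.
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

noisy_kappa = np.random.normal(0.0, 0.02, size=(256, 256))   # noise-only field
smoothed = gaussian_filter(noisy_kappa, sigma=3)

threshold = 0.02
local_max = smoothed == maximum_filter(smoothed, size=9)
peaks = np.argwhere(local_max & (smoothed > threshold))
print(len(peaks), "peaks above threshold")   # few or none expected for pure noise
```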

  18. A Game Map Complexity Measure Based on Hamming Distance

    Science.gov (United States)

    Li, Yan; Su, Pan; Li, Wenliang

    With the booming PC game market, game AI has attracted more and more research. The interest and difficulty of a game are related to the map used in its scenarios, and the path-finding efficiency in a game is also affected by the complexity of the map. In this paper, a novel complexity measure based on Hamming distance, called the Hamming complexity, is introduced. This measure is able to estimate the complexity of a binary tile world. We experimentally demonstrated that the Hamming complexity is highly correlated with the efficiency of the A* algorithm, and it is therefore a useful reference for designers when developing game maps.
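
    The abstract does not give the exact definition of the Hamming complexity, so the sketch below uses an assumed variant (average Hamming distance between adjacent rows and columns of a binary tile map) purely to illustrate the idea.

```python
# Assumed variant of a Hamming-based map complexity (the paper's exact definition is
# not reproduced): average Hamming distance between adjacent rows and columns of a
# binary tile map.
import numpy as np

def hamming_complexity(tiles):
    """tiles: 2D 0/1 array, 1 = blocked tile, 0 = walkable tile."""
    row_d = np.mean(tiles[1:, :] != tiles[:-1, :])    # adjacent rows
    col_d = np.mean(tiles[:, 1:] != tiles[:, :-1])    # adjacent columns
    return (row_d + col_d) / 2

maze = np.random.randint(0, 2, size=(32, 32))
print(hamming_complexity(maze))   # ~0.5 for random maps, lower for structured ones
```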

  19. Map-based model of the cardiac action potential

    International Nuclear Information System (INIS)

    Pavlov, Evgeny A.; Osipov, Grigory V.; Chan, C.K.; Suykens, Johan A.K.

    2011-01-01

    A simple computationally efficient model which is capable of replicating the basic features of cardiac cell action potential is proposed. The model is a four-dimensional map and demonstrates good correspondence with real cardiac cells. Various regimes of cardiac activity, which can be reproduced by the proposed model, are shown. Bifurcation mechanisms of these regimes transitions are explained using phase space analysis. The dynamics of 1D and 2D lattices of coupled maps which model the behavior of electrically connected cells is discussed in the context of synchronization theory. -- Highlights: → Recent experimental-data based models are complicated for analysis and simulation. → The simplified map-based model of the cardiac cell is constructed. → The model is capable for replication of different types of cardiac activity. → The spatio-temporal dynamics of ensembles of coupled maps are investigated. → Received data are analyzed in context of biophysical processes in the myocardium.

  20. Map-based model of the cardiac action potential

    Energy Technology Data Exchange (ETDEWEB)

    Pavlov, Evgeny A., E-mail: genie.pavlov@gmail.com [Department of Computational Mathematics and Cybernetics, Nizhny Novgorod State University, 23, Gagarin Avenue, 603950 Nizhny Novgorod (Russian Federation); Osipov, Grigory V. [Department of Computational Mathematics and Cybernetics, Nizhny Novgorod State University, 23, Gagarin Avenue, 603950 Nizhny Novgorod (Russian Federation); Chan, C.K. [Institute of Physics, Academia Sinica, 128 Sec. 2, Academia Road, Nankang, Taipei 115, Taiwan (China); Suykens, Johan A.K. [K.U. Leuven, ESAT-SCD/SISTA, Kasteelpark Arenberg 10, B-3001 Leuven (Heverlee) (Belgium)

    2011-07-25

    A simple computationally efficient model which is capable of replicating the basic features of cardiac cell action potential is proposed. The model is a four-dimensional map and demonstrates good correspondence with real cardiac cells. Various regimes of cardiac activity, which can be reproduced by the proposed model, are shown. Bifurcation mechanisms of these regimes transitions are explained using phase space analysis. The dynamics of 1D and 2D lattices of coupled maps which model the behavior of electrically connected cells is discussed in the context of synchronization theory. -- Highlights: → Recent experimental-data based models are complicated for analysis and simulation. → The simplified map-based model of the cardiac cell is constructed. → The model is capable for replication of different types of cardiac activity. → The spatio-temporal dynamics of ensembles of coupled maps are investigated. → Received data are analyzed in context of biophysical processes in the myocardium.

  1. Base Closure: A Road Map for Completion

    Science.gov (United States)

    1991-03-22

    leadership. At the same time, he should issue written press releases. Selected senior leaders from the base should be briefed separately from key community...they will hold with other agencies on base. These matters should be the commander's call; they will reflect his style of leadership and preference for...base closure. He alone must foresee what can most likely go wrong as well as what should go right in this arduous process. DOWNSIZING AN ORGANIZATION The

  2. Color encryption scheme based on adapted quantum logistic map

    Science.gov (United States)

    Zaghloul, Alaa; Zhang, Tiejun; Amin, Mohamed; Abd El-Latif, Ahmed A.

    2014-04-01

    This paper presents a new color image encryption scheme based on a quantum chaotic system. In this scheme, encryption is accomplished by generating an intermediate chaotic key stream with the help of a quantum chaotic logistic map. Then, each pixel is encrypted using the cipher value of the previous pixel and the adapted quantum logistic map. The results show that the proposed scheme has adequate security for the confidentiality of color images.

  3. Pemanfaatan Google Maps API untuk Visualisasi Data Base Transceiver Station [Utilization of the Google Maps API for Base Transceiver Station Data Visualization]

    OpenAIRE

    Rani, Septia

    2016-01-01

    This paper discusses the use of the Google Maps API to perform data visualization for Base Transceiver Station (BTS) data. BTS are typically used by telecommunications companies to facilitate wireless communication between communication devices and the network operator. Each BTS has important information such as its location, its transaction traffic, as well as information about revenue. With the implementation of BTS data visualization using the Google Maps API, key information owned by e...

  4. A novel block cryptosystem based on iterating a chaotic map

    International Nuclear Information System (INIS)

    Xiang Tao; Liao Xiaofeng; Tang Guoping; Chen Yong; Wong, Kwok-wo

    2006-01-01

    A block cryptographic scheme based on iterating a chaotic map is proposed. With random binary sequences generated from the real-valued chaotic map, the plaintext block is permuted by a key-dependent shift approach and then encrypted by the classical chaotic masking technique. Simulation results show that performance and security of the proposed cryptographic scheme are better than those of existing algorithms. Advantages and security of our scheme are also discussed in detail

  7. Fault detection of sensors in nuclear reactors using self-organizing maps

    Energy Technology Data Exchange (ETDEWEB)

    Barbosa, Paulo Roberto; Tiago, Graziela Marchi [Instituto Federal de Educacao, Ciencia e Tecnologia de Sao Paulo (IFSP), Sao Paulo, SP (Brazil); Bueno, Elaine Inacio [Instituto Federal de Educacao, Ciencia e Tecnologia de Sao Paulo (IFSP), Guarulhos, SP (Brazil); Pereira, Iraci Martinez, E-mail: martinez@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    In this work, a fault detection system was developed based on the self-organizing map methodology. This method was applied to the IEA-R1 research reactor at IPEN using a database generated by a theoretical model of the reactor. The IEA-R1 research reactor is a 5 MW pool-type reactor, cooled and moderated by light water, which uses graphite and beryllium as reflectors. The theoretical model was developed using the Matlab Guide toolbox. The equations are based on the IEA-R1 mass and energy inventory balance, and physical as well as operational aspects are taken into consideration. In order to test the model's ability for fault detection, faults were artificially produced. As the maximum calibration error for special thermocouples is ±0.5 °C, faults were inserted into the sensor signals in order to produce the database considered in this work. The results show a high percentage of correct classification, encouraging the use of the technique for this type of industrial application. (author)

  8. Fault detection of sensors in nuclear reactors using self-organizing maps

    International Nuclear Information System (INIS)

    Barbosa, Paulo Roberto; Tiago, Graziela Marchi; Bueno, Elaine Inacio; Pereira, Iraci Martinez

    2011-01-01

    In this work, a fault detection system was developed based on the self-organizing map methodology. This method was applied to the IEA-R1 research reactor at IPEN using a database generated by a theoretical model of the reactor. The IEA-R1 research reactor is a 5 MW pool-type reactor, cooled and moderated by light water, which uses graphite and beryllium as reflectors. The theoretical model was developed using the Matlab Guide toolbox. The equations are based on the IEA-R1 mass and energy inventory balance, and physical as well as operational aspects are taken into consideration. In order to test the model's ability for fault detection, faults were artificially produced. As the maximum calibration error for special thermocouples is ±0.5 °C, faults were inserted into the sensor signals in order to produce the database considered in this work. The results show a high percentage of correct classification, encouraging the use of the technique for this type of industrial application. (author)

  9. Detecting ICRS grade 1 cartilage lesions in anterior cruciate ligament injury using T1ρ and T2 mapping

    Energy Technology Data Exchange (ETDEWEB)

    Nishioka, Hiroaki, E-mail: kinuhnishiok@fc.kuh.kumamoto-u.ac.jp [Department of Orthopaedic Surgery, Faculty of Life Sciences, Kumamoto University, 1-1-1 Honjo, Kumamoto 860-8556 (Japan); Hirose, Jun, E-mail: hirojun-mk@umin.ac.jp [Department of Orthopaedic Surgery, Kumamoto University Hospital, 1-1-1 Honjo, Kumamoto 860-8556 (Japan); Nakamura, Eiichi, E-mail: h@kumamoto-u.ac.jp [Department of Orthopaedic Surgery, Faculty of Life Sciences, Kumamoto University, 1-1-1 Honjo, Kumamoto 860-8556 (Japan); Okamoto, Nobukazu, E-mail: nobuoka9999@fc.kuh.kumamoto-u.ac.jp [Department of Orthopaedic Surgery, Faculty of Life Sciences, Kumamoto University, 1-1-1 Honjo, Kumamoto 860-8556 (Japan); Karasugi, Tatsuki, E-mail: tatsukik@fc.kuh.kumamoto-u.ac.jp [Department of Orthopaedic Surgery, Faculty of Life Sciences, Kumamoto University, 1-1-1 Honjo, Kumamoto 860-8556 (Japan); Taniwaki, Takuya, E-mail: takuyataniwaki@fc.kuh.kumamoto-u.ac.jp [Department of Orthopaedic Surgery, Faculty of Life Sciences, Kumamoto University, 1-1-1 Honjo, Kumamoto 860-8556 (Japan); Okada, Tatsuya, E-mail: tatsuya-okada@fc.kuh.kumamoto-u.ac.jp [Department of Orthopaedic Surgery, Faculty of Life Sciences, Kumamoto University, 1-1-1 Honjo, Kumamoto 860-8556 (Japan); Yamashita, Yasuyuki, E-mail: yama@kumamoto-u.ac.jp [Department of Diagnostic Radiology, Faculty of Life Sciences, Kumamoto University, 1-1-1 Honjo, Kumamoto 860-8556 (Japan); Mizuta, Hiroshi, E-mail: mizuta@kumamoto-u.ac.jp [Department of Orthopaedic Surgery, Faculty of Life Sciences, Kumamoto University, 1-1-1 Honjo, Kumamoto 860-8556 (Japan)

    2013-09-15

    Objective: The purpose of this study was to clarify the detectability of the International Cartilage Repair Society (ICRS) grade 1 cartilage lesions in anterior cruciate ligament (ACL)-injured knees using T1ρ and T2 mapping. Materials and Methods: We performed preoperative T1ρ and T2 mapping and 3D gradient-echo with water-selective excitation (WATS) sequences on 37 subjects with ACL injuries. We determined the detectability on 3D WATS based on arthroscopic findings. The T1ρ and T2 values (ms) were measured in the regions of interest that were placed on the weight-bearing cartilage of the femoral condyle. The receiver operating characteristic (ROC) curve based on these values was constructed using the arthroscopic findings as a reference standard. The evaluation of cartilage was carried out only in the weight-bearing cartilage. The cut-off values for determining the presence of a cartilage injury were determined using each ROC curve, and the detectability was calculated for the T1ρ and T2 mapping. Results: The cut-off values for the T1ρ and T2 were 41.6 and 41.2 ms, respectively. The sensitivity and specificity of T1ρ were 91.2% and 89.5%, respectively, while those of T2 were 76.5% and 81.6%, respectively. For the 3D WATS images, the same values were 58.8% and 78.9%, respectively. Conclusions: Our study demonstrated that the T1ρ and T2 values were significantly higher for ICRS grade 1 cartilage lesions than for normal cartilage and that the two mappings were able to non-invasively detect ICRS grade 1 cartilage lesions in the ACL-injured knee with a higher detectability than were 3D WATS images.
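    The cut-off selection step can be sketched as follows, assuming per-region relaxation values and arthroscopic labels are available as arrays; the use of Youden's J statistic to pick the cut-off from the ROC curve is an assumption, since the abstract does not state the criterion used, and the values below are placeholders.

```python
import numpy as np
from sklearn.metrics import roc_curve

# t1rho: relaxation times (ms) in weight-bearing cartilage ROIs (illustrative values)
# label: 1 = ICRS grade 1 lesion on arthroscopy, 0 = normal cartilage
t1rho = np.array([38.0, 39.5, 40.2, 42.8, 43.5, 44.1, 45.0, 40.9])
label = np.array([0, 0, 0, 1, 1, 1, 1, 0])

fpr, tpr, thresholds = roc_curve(label, t1rho)
best = np.argmax(tpr - fpr)                  # Youden's J statistic (assumed criterion)
cutoff = thresholds[best]

pred = t1rho >= cutoff
sensitivity = (pred & (label == 1)).sum() / (label == 1).sum()
specificity = (~pred & (label == 0)).sum() / (label == 0).sum()
print(f"cut-off {cutoff:.1f} ms, sensitivity {sensitivity:.2f}, specificity {specificity:.2f}")
```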

  10. Modeling, Designing, and Implementing an Avatar-based Interactive Map

    Directory of Open Access Journals (Sweden)

    Stefan Andrei

    2016-03-01

    Full Text Available Designing interactive maps has always been a challenge due to the geographical complexity of the earth's landscape and the difficulty of resolving details to a high resolution. In the past decade or so, one of the most impressive map-based software applications, the Global Positioning System (GPS), has probably the highest level of interaction with the user. This article describes an innovative technique for designing an avatar-based virtual interactive map for the Lamar University Campus, which will entail the buildings' exteriors as well as their interiors. Many universities provide 2D or 3D maps and even interactive maps. However, these maps do not provide a complete interaction with the user. To the best of our knowledge, this project is the first avatar-based interaction game that allows 100% interaction with the user. This work provides tremendous help to the freshman students and visitors of Lamar University. As an important marketing tool, the main objective is to get better visibility of the campus worldwide and to increase the number of students attending Lamar University.

  11. Detection and Characterization of Single-Trial fMRI BOLD Responses : Paradigm Free Mapping

    NARCIS (Netherlands)

    Gaudes, Cesar Caballero; Petridou, Natalia; Dryden, Ian L.; Bai, Li; Francis, Susan T.; Gowland, Penny A.

    This work presents a novel method of mapping the brain's response to single stimuli in space and time without prior knowledge of the paradigm timing: paradigm free mapping (PFM). This method is based on deconvolution of the hemodynamic response from the voxel time series assuming a linear response

  12. BAC-end sequence-based SNPs and Bin mapping for rapid integration of physical and genetic maps in apple.

    Science.gov (United States)

    Han, Yuepeng; Chagné, David; Gasic, Ksenija; Rikkerink, Erik H A; Beever, Jonathan E; Gardiner, Susan E; Korban, Schuyler S

    2009-03-01

    A genome-wide BAC physical map of the apple, Malus x domestica Borkh., has been recently developed. Here, we report on integrating the physical and genetic maps of the apple using a SNP-based approach in conjunction with bin mapping. Briefly, BAC clones located at the ends of BAC contigs were selected and sequenced at both ends. The BAC end sequences (BESs) were used to identify candidate SNPs. Subsequently, these candidate SNPs were genetically mapped using a bin mapping strategy for the purpose of mapping the physical onto the genetic map. Using this approach, 52 (23%) out of 228 BESs tested were successfully exploited to develop SNPs. These SNPs anchored 51 contigs, spanning approximately 37 Mb in cumulative physical length, onto 14 linkage groups. The reliability of the integration of the physical and genetic maps using this SNP-based strategy is described, and the results confirm the feasibility of this approach to construct an integrated physical and genetic map for apple.

  13. Vision-based Vehicle Detection Survey

    Directory of Open Access Journals (Sweden)

    Alex David S

    2016-03-01

    Full Text Available Nowadays thousands of drivers and passengers lose their lives every year in road accidents caused by deadly crashes involving more than one vehicle. Over the past decade, many research efforts have been dedicated to the development of intelligent driver assistance systems and autonomous vehicles, which reduce the danger by monitoring the on-road environment. In particular, researchers have been attracted to the on-road detection of vehicles in recent years. This paper analyzes different aspects of the problem, including camera placement and the various applications of monocular vehicle detection, common features and common classification methods, motion-based approaches, nighttime vehicle detection, and monocular pose estimation. Previous works on vehicle detection are listed according to camera position, feature-based detection, motion-based detection, and nighttime detection.

  14. Population-based screening versus case detection.

    Directory of Open Access Journals (Sweden)

    Thomas Ravi

    2002-01-01

    Full Text Available India has a large burden of blindness and population-based screening is a strategy commonly employed to detect disease and prevent morbidity. However, not all diseases are amenable to screening. This communication examines the issue of "population-based screening" versus "case detection" in the Indian scenario. Using the example of glaucoma, it demonstrates that given the poor infrastructure, for a "rare" disease, case detection is more effective than population-based screening.

  15. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    Science.gov (United States)

    Avti, Pramod K; Hu, Song; Favazza, Christopher; Mikos, Antonios G; Jansen, John A; Shroyer, Kenneth R; Wang, Lihong V; Sitharaman, Balaji

    2012-01-01

    In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) photoacoustic microscopy (PAM) was employed to detect, map and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  16. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    Directory of Open Access Journals (Sweden)

    Pramod K Avti

    Full Text Available In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) photoacoustic microscopy (PAM) was employed to detect, map and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  17. Managing mapping data using commercial data base management software.

    Science.gov (United States)

    Elassal, A.A.

    1985-01-01

    Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper. -Author

  18. Convolutional neural network features based change detection in satellite images

    Science.gov (United States)

    Mohammed El Amin, Arabi; Liu, Qingjie; Wang, Yunhong

    2016-07-01

    With the popular use of high resolution remote sensing (HRRS) satellite images, huge research effort has been devoted to the change detection (CD) problem. An effective feature selection method can significantly boost the final result. While hand-designed features have proven difficult to craft so that they effectively capture high- and mid-level representations, recent developments in machine learning (deep learning) avoid this problem by learning hierarchical representations in an unsupervised manner directly from data, without human intervention. In this letter, we propose approaching the change detection problem from a feature learning perspective. A novel change detection method for HR satellite images based on deep Convolutional Neural Network (CNN) features is proposed. The main guideline is to produce a change detection map directly from two images using a pretrained CNN, which avoids the limited performance of hand-crafted features. Firstly, CNN features are extracted through different convolutional layers. Then, a concatenation step is evaluated after a normalization step, resulting in a unique higher dimensional feature map. Finally, a change map is computed using the pixel-wise Euclidean distance. Our method has been validated on real bitemporal HRRS satellite images through qualitative and quantitative analyses. The results obtained confirm the interest of the proposed method.
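    A minimal sketch of this pipeline, assuming PyTorch and a pretrained VGG-16 as the feature extractor (the backbone, layer depth and threshold are illustrative choices, not necessarily those of the paper): features of two co-registered images are L2-normalized and compared with a pixel-wise Euclidean distance to form the change map.

```python
import torch
import torch.nn.functional as F
from torchvision import models

def feature_map(img, extractor, n_layers=17):
    """Pass a (1, 3, H, W) tensor through the first n_layers modules of the conv trunk."""
    x = img
    with torch.no_grad():
        for layer in list(extractor)[:n_layers]:
            x = layer(x)
    return F.normalize(x, dim=1)                     # channel-wise L2 normalization

backbone = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.eval()

img_t1 = torch.rand(1, 3, 256, 256)                  # co-registered bitemporal image pair (placeholders)
img_t2 = torch.rand(1, 3, 256, 256)

f1 = feature_map(img_t1, backbone)
f2 = feature_map(img_t2, backbone)
dist = (f1 - f2).pow(2).sum(dim=1, keepdim=True).sqrt()   # pixel-wise Euclidean distance
change_map = F.interpolate(dist, size=img_t1.shape[2:], mode="bilinear", align_corners=False)[0, 0]
changed = change_map > change_map.mean() + 2 * change_map.std()   # simple threshold (illustrative)
```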

  19. Creating soil moisture maps based on radar satellite imagery

    Science.gov (United States)

    Hnatushenko, Volodymyr; Garkusha, Igor; Vasyliev, Volodymyr

    2017-10-01

    The presented work is related to a study of mapping soil moisture based on radar data from Sentinel-1 and a test of the adequacy of the models constructed on the basis of data obtained from alternative sources. Radar signals are reflected from the ground differently, depending on its properties. In radar images obtained, for example, in the C band of the electromagnetic spectrum, soils saturated with moisture usually appear in dark tones. Although, at first glance, the problem of constructing moisture maps based on radar data seems intuitively clear, an implementation based on Sentinel-1 data on an industrial scale and in the public domain is not yet available. In the mapping process, measurements of soil moisture obtained from logs of the NOAA US Climate Reference Network (USCRN) of climate stations were used to verify the results. This network covers almost the entire territory of the United States. Data from the passive microwave radiometers of the Aqua and SMAP satellites were used for comparison. In addition, other supplementary cartographic materials were used, such as maps of soil types and existing moisture maps. The paper presents a comparison of the effect of certain methods of degrading the quality of radar data on the resulting moisture maps. Regression models were constructed showing the dependence of the backscatter coefficient Sigma0, for calibrated radar data of different spatial resolution obtained at different times, on soil moisture values. The resulting soil moisture maps of the study areas are presented, together with conceptual solutions for automating the construction of such digital maps. A comparative assessment of the time required to process a given set of radar scenes with the developed tools and with the ESA SNAP product was also carried out.
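    The regression step can be sketched as a simple linear fit between backscatter and station moisture, which is then applied to a whole Sigma0 raster; the linear form, the values and the library choice (scikit-learn) are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# sigma0: calibrated backscatter coefficient (dB) sampled at station locations (illustrative values)
# sm: volumetric soil moisture from USCRN station logs (illustrative values)
sigma0 = np.array([-14.2, -12.8, -11.5, -10.9, -9.7, -8.4]).reshape(-1, 1)
sm = np.array([0.08, 0.12, 0.17, 0.20, 0.26, 0.31])

model = LinearRegression().fit(sigma0, sm)
print("slope, intercept, R^2:",
      model.coef_[0], model.intercept_, r2_score(sm, model.predict(sigma0)))

# Apply the fitted model to a whole Sigma0 raster to produce a moisture map
raster = np.random.uniform(-15, -8, size=(512, 512))
moisture_map = model.predict(raster.reshape(-1, 1)).reshape(raster.shape)
```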

  20. Wide-Baseline Stereo-Based Obstacle Mapping for Unmanned Surface Vehicles

    Science.gov (United States)

    Mou, Xiaozheng; Wang, Han

    2018-01-01

    This paper proposes a wide-baseline stereo-based static obstacle mapping approach for unmanned surface vehicles (USVs). The proposed approach eliminates the complicated calibration work and the bulky rig in our previous binocular stereo system, and raises the ranging ability from 500 to 1000 m with an even larger baseline obtained from the motion of USVs. Integrating a monocular camera with GPS and compass information in this proposed system, the world locations of the detected static obstacles are reconstructed while the USV is traveling, and an obstacle map is then built. To achieve more accurate and robust performance, multiple pairs of frames are leveraged to synthesize the final reconstruction results in a weighting model. Experimental results based on our own dataset demonstrate the high efficiency of our system. To the best of our knowledge, we are the first to address the task of wide-baseline stereo-based obstacle mapping in a maritime environment. PMID:29617293

  1. Radar-based hail detection

    Czech Academy of Sciences Publication Activity Database

    Skripniková, Kateřina; Řezáčová, Daniela

    2014-01-01

    Vol. 144, No. 1 (2014), pp. 175-185 ISSN 0169-8095 R&D Projects: GA ČR(CZ) GAP209/11/2045; GA MŠk LD11044 Institutional support: RVO:68378289 Keywords: hail detection * weather radar * hail damage risk Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 2.844, year: 2014 http://www.sciencedirect.com/science/article/pii/S0169809513001804

  2. Domain similarity based orthology detection

    OpenAIRE

    Bitard-Feildel, Tristan; Kemena, Carsten; Greenwood, Jenny M; Bornberg-Bauer, Erich

    2015-01-01

    Background Orthologous protein detection software mostly uses pairwise comparisons of amino-acid sequences to assert whether two proteins are orthologous or not. Accordingly, when the number of sequences for comparison increases, the number of comparisons to compute grows in a quadratic order. A current challenge of bioinformatic research, especially when taking into account the increasing number of sequenced organisms available, is to make this ever-growing number of comparisons computationa...

  3. Hash function based on piecewise nonlinear chaotic map

    International Nuclear Information System (INIS)

    Akhavan, A.; Samsudin, A.; Akhshani, A.

    2009-01-01

    Chaos-based cryptography appeared in the early 1990s as an original application of nonlinear dynamics in the chaotic regime. In this paper, an algorithm for one-way hash function construction based on a piecewise nonlinear chaotic map with a variant probability parameter is proposed. The proposed algorithm is also an attempt to present a new chaotic hash function based on multithreaded programming. In this chaotic scheme, the message is connected to the chaotic map using the probability parameter and other parameters of the chaotic map, such as the control parameter and initial condition, so that the generated hash value is highly sensitive to the message. Simulation results indicate that the proposed algorithm presents several interesting features, such as high flexibility, good statistical properties, high key sensitivity and message sensitivity. These properties make the scheme a suitable choice for practical applications.
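    The following is only an illustrative toy in the same spirit, not the authors' construction: a piecewise nonlinear map (a logistic map applied per sub-interval) is iterated while each message byte perturbs the control parameter, and bits of the final orbit are collected into a 128-bit digest. It carries none of the security analysis of the paper.

```python
def piecewise_map(x, p):
    """Piecewise nonlinear map on (0, 1): a logistic map applied on each sub-interval."""
    y = x / p if x < p else (x - p) / (1.0 - p)
    return 4.0 * y * (1.0 - y) * 0.9999 + 1e-6      # keep the orbit away from fixed points

def chaotic_hash(message: bytes, rounds: int = 16) -> bytes:
    x, p = 0.632, 0.417                             # initial condition and control parameter (keys)
    digest = bytearray()
    for block_start in range(0, max(len(message), 1), 16):
        for byte in message[block_start:block_start + 16]:
            p = 0.01 + 0.98 * ((p + byte / 255.0) % 1.0)   # message byte perturbs the parameter
            for _ in range(rounds):
                x = piecewise_map(x, p)
    for _ in range(16):                             # squeeze 128 bits from the final orbit
        for _ in range(rounds):
            x = piecewise_map(x, p)
        digest.append(int(x * 256) % 256)
    return bytes(digest)

print(chaotic_hash(b"hello world").hex())
print(chaotic_hash(b"hello worle").hex())           # a one-byte change typically changes the whole digest
```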

  4. A real time QRS detection using delay-coordinate mapping for the microcontroller implementation.

    Science.gov (United States)

    Lee, Jeong-Whan; Kim, Kyeong-Seop; Lee, Bongsoo; Lee, Byungchae; Lee, Myoung-Ho

    2002-01-01

    In this article, we propose a new algorithm that uses the characteristics of phase portraits reconstructed by delay-coordinate mapping utilizing lag rotundity for real-time detection of QRS complexes in ECG signals. In reconstructing the phase portrait, the mapping parameters, time delay and mapping dimension, play important roles in shaping the portraits drawn in the new dimensional space. Experimentally, the optimal mapping time delay for detection of QRS complexes turned out to be 20 ms. To explore the meaning of this time delay and the proper mapping dimension, we applied fill factor, mutual information, and autocorrelation function algorithms that are generally used to analyze the chaotic characteristics of sampled signals. From these results, we found that the performance of our proposed algorithm relies mainly on a geometrical property, namely the area of the reconstructed phase portrait. For a real application, we applied our algorithm to the design of a small cardiac event recorder. This system was built to record patients' ECG and R-R intervals for 1 h to investigate the HRV characteristics of patients with vasovagal syncope symptoms. For the evaluation, we implemented our algorithm in C language and applied it to the MIT/BIH arrhythmia database of 48 subjects. Our proposed algorithm achieved a 99.58% detection rate of QRS complexes.
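    A rough sketch of the idea, assuming a sampled ECG array: the signal is embedded in two dimensions with a 20 ms delay, an area-like measure of the trajectory (summed cross products of successive segments) is computed in a sliding window, and a simple threshold with a refractory period marks QRS complexes. The windowing, threshold and synthetic test signal are illustrative, not the authors' exact procedure.

```python
import numpy as np

def qrs_detect(ecg, fs=360, delay_ms=20, win_ms=120, refractory_ms=200):
    """Flag QRS complexes from the area swept in a delay-coordinate phase portrait."""
    d = int(round(delay_ms * fs / 1000.0))
    x, y = ecg[d:], ecg[:-d]                          # 2-D reconstructed phase portrait
    # area-like measure: cross products of successive trajectory segments
    seg_area = np.abs(np.diff(x) * y[:-1] - np.diff(y) * x[:-1])
    win = int(round(win_ms * fs / 1000.0))
    feature = np.convolve(seg_area, np.ones(win) / win, mode="same")
    threshold = 0.4 * feature.max()                   # simple fraction-of-max threshold (illustrative)
    refractory = int(round(refractory_ms * fs / 1000.0))
    onsets, last = [], -np.inf
    for i in np.flatnonzero(feature > threshold):
        if i - last > refractory:
            onsets.append(i + d)                      # onset of each detected QRS (sample index)
            last = i
    return np.array(onsets)

# Usage with a synthetic ECG-like signal: one Gaussian-shaped R wave per second plus noise
fs = 360
t = np.arange(10 * fs)
ecg = 0.05 * np.random.randn(t.size)
for r in range(0, t.size, fs):
    ecg += np.exp(-0.5 * ((t - r - 40) / 12.0) ** 2)
print(qrs_detect(ecg, fs))
```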

  5. Flood Extent Mapping for Namibia Using Change Detection and Thresholding with SAR

    Science.gov (United States)

    Long, Stephanie; Fatoyinbo, Temilola E.; Policelli, Frederick

    2014-01-01

    A new method for flood detection, change detection and thresholding (CDAT), was used with synthetic aperture radar (SAR) imagery to delineate the extent of flooding for the Chobe floodplain in the Caprivi region of Namibia. This region experiences annual seasonal flooding and has seen a recent renewal of severe flooding after a long dry period in the 1990s. Flooding in this area has caused loss of life and livelihoods for the surrounding communities and has caught the attention of disaster relief agencies. There is a need for flood extent mapping techniques that can be used to process images quickly, providing near real-time flooding information to relief agencies. ENVISAT/ASAR and Radarsat-2 images were acquired for several flooding seasons from February 2008 to March 2013. The CDAT method was used to determine flooding from these images and includes the use of image subtraction, decision-based classification with threshold values, and segmentation of SAR images. The total extent of flooding determined for 2009, 2011 and 2012 was about 542 km2, 720 km2, and 673 km2 respectively. Pixels determined to be flooded in vegetation were typically less than 0.5% of the entire scene, with the exception of 2009, where the detection of flooding in vegetation was much greater (almost one third of the total flooded area). The time to maximum flooding for the 2013 flood season was determined to be about 27 days. Landsat water classification was used to compare the results from the new CDAT with SAR method; the results show good spatial agreement with Landsat scenes.
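    The core of the CDAT idea can be sketched as follows, assuming two co-registered, calibrated SAR backscatter arrays in dB: the dry-reference image is subtracted from the flood image, a threshold on the backscatter decrease marks candidate open water, and connected-component segmentation removes small speckle clusters. The threshold and minimum-region size are illustrative, not the values used in the study.

```python
import numpy as np
from scipy import ndimage

def cdat_flood_mask(sar_flood_db, sar_dry_db, diff_threshold=-3.0, min_region_px=50):
    """Change detection and thresholding (CDAT)-style flood mask from a SAR image pair."""
    diff = sar_flood_db - sar_dry_db              # open water darkens the scene (negative change)
    candidate = diff < diff_threshold
    labels, n = ndimage.label(candidate)          # segmentation of candidate flood pixels
    sizes = ndimage.sum(candidate, labels, index=np.arange(1, n + 1))
    keep_labels = np.flatnonzero(sizes >= min_region_px) + 1
    return np.isin(labels, keep_labels)

# Usage with synthetic scenes: a dark (flooded) rectangle appears in the second acquisition
dry = np.full((400, 400), -8.0) + np.random.randn(400, 400)
flood = dry.copy()
flood[100:250, 120:300] -= 6.0
mask = cdat_flood_mask(flood, dry)
print("flooded area (pixels):", int(mask.sum()))
```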

  6. DETECTION AND CLASSIFICATION OF POLE-LIKE OBJECTS FROM MOBILE MAPPING DATA

    Directory of Open Access Journals (Sweden)

    K. Fukano

    2015-08-01

    Full Text Available Laser scanners on a vehicle-based mobile mapping system can capture 3D point-clouds of roads and roadside objects. Since roadside objects have to be maintained periodically, their 3D models are useful for planning maintenance tasks. In our previous work, we proposed a method for detecting cylindrical poles and planar plates in a point-cloud. However, it is often required to further classify pole-like objects into utility poles, streetlights, traffic signals and signs, which are managed by different organizations. In addition, our previous method may fail to extract low pole-like objects, which are often observed in urban residential areas. In this paper, we propose new methods for extracting and classifying pole-like objects. In our method, we robustly extract a wide variety of poles by converting point-clouds into wireframe models and calculating cross-sections between wireframe models and horizontal cutting planes. For classifying pole-like objects, we subdivide a pole-like object into five subsets by extracting poles and planes, and calculate feature values of each subset. Then we apply a supervised machine learning method using feature variables of subsets. In our experiments, our method could achieve excellent results for detection and classification of pole-like objects.

  7. Audiovisual laughter detection based on temporal features

    NARCIS (Netherlands)

    Petridis, Stavros; Nijholt, Antinus; Nijholt, A.; Pantic, M.; Pantic, Maja; Poel, Mannes; Poel, M.; Hondorp, G.H.W.

    2008-01-01

    Previous research on automatic laughter detection has mainly been focused on audio-based detection. In this study we present an audiovisual approach to distinguishing laughter from speech based on temporal features and we show that the integration of audio and visual information leads to improved

  8. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    NARCIS (Netherlands)

    Avti, P.K.; Hu, S.; Favazza, C.; Mikos, A.G.; Jansen, J.A.; Shroyer, K.R.; Wang, L.V.; Sitharaman, B.

    2012-01-01

    AIMS: In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (microg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies

  9. Assessing the utility of the spot 6 sensor in detecting and mapping ...

    African Journals Online (AJOL)

    Lantana camara is a significant weed in South Africa which is causing severe impacts on agriculture by reducing grazing areas. This study assessed the potential of the SPOT 6 multispectral sensor and two broadband vegetation indices (NDVI and SR) for detecting and mapping Lantana camara in a community grazing ...

  10. Electron dose map inversion based on several algorithms

    International Nuclear Information System (INIS)

    Li Gui; Zheng Huaqing; Wu Yican; Fds Team

    2010-01-01

    The reconstruction of the electron dose map in radiation therapy was investigated by constructing an inversion model of the electron dose map with different algorithms. An inversion model based on nonlinear programming was used, in which the penetration dose map is used to invert the dose map over the whole space. This inversion model was realized with several inversion algorithms. The test results with seven samples show that, except for the NMinimize algorithm, which worked for just one sample and with great error, all the inversion algorithms solved our inversion model rapidly and accurately. The Levenberg-Marquardt algorithm, having the greatest accuracy and speed, can be considered the first choice for electron dose map inversion. Further tests show that more error is introduced when data close to the electron range are used (tail error). The tail error might be caused by the approximation of the mean energy spectra, and this should be considered to improve the method. The time-saving and accurate algorithms could be used to achieve real-time dose map inversion. By selecting the best inversion algorithm, the clinical need for real-time dose verification can be satisfied. (authors)
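    The abstract does not give the forward model, so the sketch below only illustrates the inversion setup: a toy forward operator maps dose-profile parameters to a penetration-depth curve, and SciPy's Levenberg-Marquardt least-squares solver recovers the parameters from a noisy measurement. The Gaussian forward model and all values are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

depths = np.linspace(0.0, 4.0, 40)                    # measurement depths (cm), illustrative

def forward(params, z):
    """Toy forward model: Gaussian-shaped dose deposition (not the paper's model)."""
    amplitude, peak_depth, width = params
    return amplitude * np.exp(-0.5 * ((z - peak_depth) / width) ** 2)

true_params = np.array([1.0, 1.8, 0.7])
measured = forward(true_params, depths) + 0.01 * np.random.randn(depths.size)

def residuals(params):
    return forward(params, depths) - measured

fit = least_squares(residuals, x0=[0.5, 1.0, 1.0], method="lm")   # Levenberg-Marquardt
print("recovered parameters:", fit.x)
```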

  11. Stochastic Wheel-Slip Compensation Based Robot Localization and Mapping

    Directory of Open Access Journals (Sweden)

    SIDHARTHAN, R. K.

    2016-05-01

    Full Text Available Wheel slip compensation is vital for building accurate and reliable dead reckoning based robot localization and mapping algorithms. This investigation presents a stochastic slip compensation scheme for robot localization and mapping. The main idea of the slip compensation technique is to use wheel-slip data obtained from experiments to model the variations in slip velocity as Gaussian distributions. This leads to a family of models that are switched depending on the input command. To obtain the wheel-slip measurements, experiments are conducted on a wheeled mobile robot and the measurements thus obtained are used to build the Gaussian models. Then the localization and mapping algorithm is tested on an experimental terrain and a new metric called the map spread factor is used to evaluate the ability of the slip compensation technique. Our results clearly indicate that the proposed methodology improves the accuracy by 72.55% for rotation and 66.67% for translation motion as against an uncompensated mapping system. The proposed compensation technique eliminates the need for exteroceptive sensors for slip compensation and for complex feature extraction and association algorithms. As a result, we obtain a simple slip compensation scheme for localization and mapping.
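    A minimal sketch of the compensation idea, assuming a differential-drive robot and per-command Gaussian slip models fitted beforehand from experiments; the model values, the way slip is subtracted from both wheel speeds and the kinematic update are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Per-command slip-velocity models fitted from experiments as Gaussians (illustrative values)
slip_models = {
    "forward": {"mean": 0.03, "std": 0.01},   # m/s of slip when driving straight
    "rotate":  {"mean": 0.08, "std": 0.02},   # m/s of slip at each wheel while turning
}

def dead_reckon_step(pose, v_left, v_right, command, dt, wheel_base=0.4):
    """Differential-drive dead reckoning with stochastic wheel-slip compensation."""
    m = slip_models[command]                  # switch the Gaussian model by input command
    slip = rng.normal(m["mean"], m["std"])    # sample the slip velocity for this step
    v_l, v_r = v_left - slip, v_right - slip  # compensate the commanded wheel speeds
    x, y, theta = pose
    v = 0.5 * (v_l + v_r)
    omega = (v_r - v_l) / wheel_base
    return (x + v * np.cos(theta) * dt,
            y + v * np.sin(theta) * dt,
            theta + omega * dt)

pose = (0.0, 0.0, 0.0)
for _ in range(100):                          # one second of straight driving at 10 ms steps
    pose = dead_reckon_step(pose, 0.5, 0.5, "forward", dt=0.01)
print("slip-compensated pose estimate:", pose)
```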

  12. Machine learning-based dual-energy CT parametric mapping.

    Science.gov (United States)

    Su, Kuan-Hao; Kuo, Jung-Wen; Jordan, David W; Van Hedent, Steven; Klahr, Paul; Wei, Zhouping; Al Helo, Rose; Liang, Fan; Qian, Pengjiang; Pereira, Gisele C; Rassouli, Negin; Gilkeson, Robert C; Traughber, Bryan J; Cheng, Chee-Wai; Muzic, Raymond F

    2018-05-22

    The aim is to develop and evaluate machine learning methods for generating quantitative parametric maps of effective atomic number (Zeff), relative electron density (ρe), mean excitation energy (Ix), and relative stopping power (RSP) from clinical dual-energy CT data. The maps could be used for material identification and radiation dose calculation. Machine learning methods of historical centroid (HC), random forest (RF), and artificial neural networks (ANN) were used to learn the relationship between dual-energy CT input data and ideal output parametric maps calculated for phantoms from the known compositions of 13 tissue substitutes. After training and model selection steps, the machine learning predictors were used to generate parametric maps from independent phantom and patient input data. Precision and accuracy were evaluated using the ideal maps. This process was repeated for a range of exposure doses, and performance was compared to that of the clinically-used dual-energy, physics-based method which served as the reference. The machine learning methods generated more accurate and precise parametric maps than those obtained using the reference method. Their performance advantage was particularly evident when using data from the lowest exposure, one-fifth of a typical clinical abdomen CT acquisition. The RF method achieved the greatest accuracy. In comparison, the ANN method was only 1% less accurate but had much better computational efficiency than RF, being able to produce parametric maps in 15 seconds. Machine learning methods outperformed the reference method in terms of accuracy and noise tolerance when generating parametric maps, encouraging further exploration of the techniques. Among the methods we evaluated, ANN is the most suitable for clinical use due to its combination of accuracy, excellent low-noise performance, and computational efficiency.
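    The supervised mapping step can be sketched as below, assuming paired low/high-kVp CT numbers for tissue substitutes with known parametric targets; the training values are placeholders, only two of the four parametric maps (Zeff and relative electron density) are shown, and the forest settings are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Training data: dual-energy CT numbers (HU at low/high kVp) for tissue substitutes,
# with ideal targets (Zeff, relative electron density) derived from known compositions (placeholders)
X_train = np.array([[-950, -940], [-80, -60], [0, 0], [60, 45], [250, 180], [900, 700]], float)
y_train = np.array([[7.6, 0.05], [6.4, 0.95], [7.4, 1.00], [7.6, 1.05], [11.6, 1.15], [13.6, 1.50]])

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Apply voxel-wise to a dual-energy slice to obtain Zeff and electron-density maps
low_kvp = np.random.uniform(-100, 300, size=(128, 128))
high_kvp = low_kvp * 0.8
voxels = np.stack([low_kvp.ravel(), high_kvp.ravel()], axis=1)
zeff_map, rho_e_map = rf.predict(voxels).T
zeff_map, rho_e_map = zeff_map.reshape(128, 128), rho_e_map.reshape(128, 128)
```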

  13. Global trends in satellite-based emergency mapping

    Science.gov (United States)

    Voigt, Stefan; Giulio-Tonolo, Fabio; Lyons, Josh; Kučera, Jan; Jones, Brenda; Schneiderhan, Tobias; Platzeck, Gabriel; Kaku, Kazuya; Hazarika, Manzul Kumar; Czaran, Lorant; Li, Suju; Pedersen, Wendi; James, Godstime Kadiri; Proy, Catherine; Muthike, Denis Macharia; Bequignon, Jerome; Guha-Sapir, Debarati

    2016-01-01

    Over the past 15 years, scientists and disaster responders have increasingly used satellite-based Earth observations for global rapid assessment of disaster situations. We review global trends in satellite rapid response and emergency mapping from 2000 to 2014, analyzing more than 1000 incidents in which satellite monitoring was used for assessing major disaster situations. We provide a synthesis of spatial patterns and temporal trends in global satellite emergency mapping efforts and show that satellite-based emergency mapping is most intensively deployed in Asia and Europe and follows well the geographic, physical, and temporal distributions of global natural disasters. We present an outlook on the future use of Earth observation technology for disaster response and mitigation by putting past and current developments into context and perspective.

  14. Updating National Topographic Data Base Using Change Detection Methods

    Science.gov (United States)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect that the existing database will portray the current reality. Global mapping projects that are based on community volunteers, such as OSM, update their database every day based on crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing algorithms and computer vision, together with the development of digital aerial cameras with a NIR band and very high resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating the NTDB at the Survey of Israel.

  15. Dipole-magnet field models based on a conformal map

    Directory of Open Access Journals (Sweden)

    P. L. Walstrom

    2012-10-01

    Full Text Available In general, generation of charged-particle transfer maps for conventional iron-pole-piece dipole magnets to third and higher order requires a model for the midplane field profile and its transverse derivatives (soft-edge model to high order and numerical integration of map coefficients. An exact treatment of the problem for a particular magnet requires use of measured magnetic data. However, in initial design of beam transport systems, users of charged-particle optics codes generally rely on magnet models built into the codes. Indeed, if maps to third order are adequate for the problem, an approximate analytic field model together with numerical map coefficient integration can capture the important features of the transfer map. The model described in this paper is based on the fact that, except at very large distances from the magnet, the magnetic field for parallel pole-face magnets with constant pole gap height and wide pole faces is basically two dimensional (2D. The field for all space outside of the pole pieces is given by a single (complex analytic expression and includes a parameter that controls the rate of falloff of the fringe field. Since the field function is analytic in the complex plane outside of the pole pieces, it satisfies two basic requirements of a field model for higher-order map codes: it is infinitely differentiable at the midplane and also a solution of the Laplace equation. It is apparently the only simple model available that combines an exponential approach to the central field with an inverse cubic falloff of field at large distances from the magnet in a single expression. The model is not intended for detailed fitting of magnetic field data, but for use in numerical map-generating codes for studying the effect of extended fringe fields on higher-order transfer maps. It is based on conformally mapping the area between the pole pieces to the upper half plane, and placing current filaments on the pole faces. An

  16. Vision based speed breaker detection for autonomous vehicle

    Science.gov (United States)

    C. S., Arvind; Mishra, Ritesh; Vishal, Kumar; Gundimeda, Venugopal

    2018-04-01

    In this paper, we present a robust, real-time, vision-based approach to detect speed breakers in urban environments for autonomous vehicles. Our method is designed to detect the speed breaker using visual inputs obtained from a camera mounted on top of a vehicle. The method performs inverse perspective mapping to generate a top view of the road and segments out the region of interest based on difference of Gaussian and median filtered images. Furthermore, the algorithm performs RANSAC line fitting to identify the possible speed breaker candidate region. This initial region, guessed via RANSAC, is validated using a support vector machine. Our algorithm can detect different categories of speed breakers on cement, asphalt and interlock roads under various conditions and has achieved a recall of 0.98.

  17. Hyperbolic mapping of complex networks based on community information

    Science.gov (United States)

    Wang, Zuxi; Li, Qingguang; Jin, Fengdong; Xiong, Wei; Wu, Yao

    2016-08-01

    To improve hyperbolic mapping methods in terms of both accuracy and running time, a novel mapping method called Community and Hyperbolic Mapping (CHM) is proposed based on community information in this paper. Firstly, an index called Community Intimacy (CI) is presented to measure the adjacency relationship between communities, based on which a community ordering algorithm is introduced. According to the proposed Community-Sector hypothesis, which supposes that most nodes of one community gather in the same sector in hyperbolic space, CHM maps the ordered communities into hyperbolic space, and the angular coordinates of nodes are then randomly initialized within the sector that they belong to. In this way, all the network nodes are mapped to hyperbolic space, and the initialized angular coordinates can then be optimized by employing the information of all nodes, which greatly improves the algorithm's precision. By applying the proposed dual-layer angle sampling method in the optimization procedure, CHM reduces the time complexity to O(n²). The experiments show that our algorithm outperforms the state-of-the-art methods.

  18. Flood extent mapping for Namibia using change detection and thresholding with SAR

    International Nuclear Information System (INIS)

    Long, Stephanie; Fatoyinbo, Temilola E; Policelli, Frederick

    2014-01-01

    A new method for flood detection change detection and thresholding (CDAT) was used with synthetic aperture radar (SAR) imagery to delineate the extent of flooding for the Chobe floodplain in the Caprivi region of Namibia. This region experiences annual seasonal flooding and has seen a recent renewal of severe flooding after a long dry period in the 1990s. Flooding in this area has caused loss of life and livelihoods for the surrounding communities and has caught the attention of disaster relief agencies. There is a need for flood extent mapping techniques that can be used to process images quickly, providing near real-time flooding information to relief agencies. ENVISAT/ASAR and Radarsat-2 images were acquired for several flooding seasons from February 2008 to March 2013. The CDAT method was used to determine flooding from these images and includes the use of image subtraction, decision-based classification with threshold values, and segmentation of SAR images. The total extent of flooding determined for 2009, 2011 and 2012 was about 542 km2, 720 km2, and 673 km2 respectively. Pixels determined to be flooded in vegetation were typically <0.5% of the entire scene, with the exception of 2009 where the detection of flooding in vegetation was much greater (almost one third of the total flooded area). The time to maximum flooding for the 2013 flood season was determined to be about 27 days. Landsat water classification was used to compare the results from the new CDAT with SAR method; the results show good spatial agreement with Landsat scenes. (paper)

  19. Synchronous Adversarial Feature Learning for LiDAR based Loop Closure Detection

    OpenAIRE

    Yin, Peng; He, Yuqing; Xu, Lingyun; Peng, Yan; Han, Jianda; Xu, Weiliang

    2018-01-01

    Loop Closure Detection (LCD) is an essential module in the simultaneous localization and mapping (SLAM) task. In current appearance-based SLAM methods, the visual inputs are usually affected by illumination, appearance and viewpoint changes. Compared to visual inputs, light detection and ranging (LiDAR) based point-cloud inputs, thanks to their active sensing property, are invariant to illumination and appearance changes. In this paper, we extract 3D voxel maps and 2D top view maps from LiDAR ...

  20. Laser-based optical detection of explosives

    CERN Document Server

    Pellegrino, Paul M; Farrell, Mikella E

    2015-01-01

    Laser-Based Optical Detection of Explosives offers a comprehensive review of past, present, and emerging laser-based methods for the detection of a variety of explosives. This book: Considers laser propagation safety and explains standard test material preparation for standoff optical-based detection system evaluation Explores explosives detection using deep ultraviolet native fluorescence, Raman spectroscopy, laser-induced breakdown spectroscopy, reflectometry, and hyperspectral imaging Examines photodissociation followed by laser-induced fluorescence, photothermal methods, cavity-enhanced absorption spectrometry, and short-pulse laser-based techniques Describes the detection and recognition of explosives using terahertz-frequency spectroscopic techniques Each chapter is authored by a leading expert on the respective technology, and is structured to supply historical perspective, address current advantages and challenges, and discuss novel research and applications. Readers are left with an in-depth understa...

  1. Performance metrics for state-of-the-art airborne magnetic and electromagnetic systems for mapping and detection of unexploded ordnance

    Science.gov (United States)

    Doll, William E.; Bell, David T.; Gamey, T. Jeffrey; Beard, Les P.; Sheehan, Jacob R.; Norton, Jeannemarie

    2010-04-01

    Over the past decade, notable progress has been made in the performance of airborne geophysical systems for mapping and detection of unexploded ordnance in terrestrial and shallow marine environments. For magnetometer systems, the most significant improvements include development of denser magnetometer arrays and vertical gradiometer configurations. In prototype analyses and recent Environmental Security Technology Certification Program (ESTCP) assessments using new production systems the greatest sensitivity has been achieved with a vertical gradiometer configuration, despite model-based survey design results which suggest that dense total-field arrays would be superior. As effective as magnetometer systems have proven to be at many sites, they are inadequate at sites where basalts and other ferrous geologic formations or soils produce anomalies that approach or exceed those of target ordnance items. Additionally, magnetometer systems are ineffective where detection of non-ferrous ordnance items is of primary concern. Recent completion of the Battelle TEM-8 airborne time-domain electromagnetic system represents the culmination of nearly nine years of assessment and development of airborne electromagnetic systems for UXO mapping and detection. A recent ESTCP demonstration of this system in New Mexico showed that it was able to detect 99% of blind-seeded ordnance items, 81mm and larger, and that it could be used to map in detail a bombing target on a basalt flow where previous airborne magnetometer surveys had failed. The probability of detection for the TEM-8 in the blind-seeded study area was better than that reported for a dense-array total-field magnetometer demonstration of the same blind-seeded site, and the TEM-8 system successfully detected these items with less than half as many anomaly picks as the dense-array total-field magnetometer system.

  2. The ETLMR MapReduce-Based ETL Framework

    DEFF Research Database (Denmark)

    Xiufeng, Liu; Thomsen, Christian; Pedersen, Torben Bach

    2011-01-01

    This paper presents ETLMR, a parallel Extract--Transform--Load (ETL) programming framework based on MapReduce. It has built-in support for high-level ETL-specific constructs including star schemas, snowflake schemas, and slowly changing dimensions (SCDs). ETLMR gives both high programming...

  3. Tissue-based map of the human proteome

    DEFF Research Database (Denmark)

    Uhlén, Mathias; Fagerberg, Linn; Hallström, Björn M.

    2015-01-01

    Resolving the molecular details of proteome variation in the different tissues and organs of the human body will greatly increase our knowledge of human biology and disease. Here, we present a map of the human tissue proteome based on an integrated omics approach that involves quantitative transc...

  4. Rule-based Test Generation with Mind Maps

    Directory of Open Access Journals (Sweden)

    Dimitry Polivaev

    2012-02-01

    Full Text Available This paper introduces basic concepts of rule based test generation with mind maps, and reports experiences learned from industrial application of this technique in the domain of smart card testing by Giesecke & Devrient GmbH over the last years. It describes the formalization of test selection criteria used by our test generator, our test generation architecture and test generation framework.

  5. Balanced Civilization Map Generation based on Open Data

    DEFF Research Database (Denmark)

    Barros, Gabriella; Togelius, Julian

    2015-01-01

    This work investigates how to incorporate real-world data into game content so that the content is playable and enjoyable while not misrepresenting the data. We propose a method for generating balanced Civilization maps based on Open Data, describing how to acquire, transform and integrate...

  6. An improved map based graphical android authentication system ...

    African Journals Online (AJOL)

    Currently, graphical password methods are available for android and other devices, but the major problem is vulnerability issue. A map graphical-based authentication system (Dheeraj et al, 2013) was designed on mobile android devices, but it did not provide a large choice or multiple sequence to user for selecting ...

  7. An Improved Consensus Linkage Map of Barley Based on Flow-Sorted Chromosomes and Single Nucleotide Polymorphism Markers

    Directory of Open Access Journals (Sweden)

    María Muñoz-Amatriaín

    2011-11-01

    Full Text Available Recent advances in high-throughput genotyping have made it easier to combine information from different mapping populations into consensus genetic maps, which provide increased marker density and genome coverage compared to individual maps. Previously, a single nucleotide polymorphism (SNP)-based genotyping platform was developed and used to genotype 373 individuals in four barley (Hordeum vulgare L.) mapping populations. This led to a 2943 SNP consensus genetic map with 975 unique positions. In this work, we add data from six additional populations and more individuals from one of the original populations to develop an improved consensus map from 1133 individuals. A stringent and systematic analysis of each of the 10 populations was performed to achieve uniformity. This involved reexamination of the four populations included in the previous map. As a consequence, we present a robust consensus genetic map that contains 2994 SNP loci mapped to 1163 unique positions. The map spans 1137.3 cM with an average density of one marker bin per 0.99 cM. A novel application of the genotyping platform for gene detection allowed the assignment of 2930 genes to flow-sorted chromosomes or arms, confirmed the position of 2545 SNP-mapped loci, added chromosome or arm allocations to an additional 370 SNP loci, and delineated pericentromeric regions for chromosomes 2H to 7H. Marker order has been improved and map resolution has been increased by almost 20%. These increased precision outcomes enable more optimized SNP selection for marker-assisted breeding and support association genetic analysis and map-based cloning. It will also improve the anchoring of DNA sequence scaffolds and the barley physical map to the genetic map.

  8. Community-Based Intrusion Detection

    OpenAIRE

    Weigert, Stefan

    2017-01-01

    Today, virtually every company world-wide is connected to the Internet. This wide-spread connectivity has given rise to sophisticated, targeted, Internet-based attacks. For example, between 2012 and 2013 security researchers counted an average of about 74 targeted attacks per day. These attacks are motivated by economical, financial, or political interests and commonly referred to as “Advanced Persistent Threat (APT)” attacks. Unfortunately, many of these attacks are successful and the advers...

  9. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Directory of Open Access Journals (Sweden)

    Qi Zheng

    2016-10-01

    Full Text Available Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.

  10. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Science.gov (United States)

    Zheng, Qi; Grice, Elizabeth A

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
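    The following is not AlignerBoost's actual model, only an illustration of the Bayesian idea: the alignment scores of all candidate hits for one read are treated as log-likelihoods up to a constant, the posterior of the best hit is its share of the total, and the result is reported as a Phred-scaled mapping quality.

```python
import math

def mapping_quality(alignment_scores, scale=1.0):
    """Posterior-based mapping quality for one read given all its candidate alignment scores."""
    # Interpret each score as a log-likelihood up to a constant (assumption for illustration).
    likelihoods = [math.exp(scale * s) for s in alignment_scores]
    posterior_best = max(likelihoods) / sum(likelihoods)
    p_error = max(1.0 - posterior_best, 1e-10)                  # probability the best hit is wrong
    return min(60, int(round(-10.0 * math.log10(p_error))))     # Phred-scaled, capped at 60

print(mapping_quality([95.0]))                 # unique hit -> high mapping quality
print(mapping_quality([95.0, 94.0, 93.5]))     # repetitive region -> ambiguous, low quality
```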

  11. Sensor fusion-based map building for mobile robot exploration

    International Nuclear Information System (INIS)

    Ribo, M.

    2000-01-01

    To carry out exploration tasks in unknown or partially unknown environments, a mobile robot needs to acquire and maintain models of its environment. In doing so, several sensors of the same nature and/or heterogeneous sensor configurations may be used by the robot to achieve reliable performance. However, this in turn poses the problem of sensor fusion-based map building: how to interpret, combine and integrate sensory information in order to build a proper representation of the environment. Specifically, the goal of this thesis is to probe integration algorithms for Occupancy Grid (OG) based map building using odometry, ultrasonic rangefinders, and stereo vision. Three different uncertainty calculi are presented here which are used for sensor fusion-based map building purposes. They are based on probability theory, Dempster-Shafer theory of evidence, and fuzzy set theory. Besides, two different sensor models are depicted which are used to translate sensing data into range information. Experimental examples of OGs built from real data recorded by two robots in an office-like environment are presented. They show the feasibility of the proposed approach for building both sonar and vision based OGs. A comparison among the presented uncertainty calculi is performed in a sonar-based framework. Finally, the fusion of both sonar and visual information based on fuzzy set theory is depicted. (author)
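    Of the three uncertainty calculi mentioned, the probabilistic one can be sketched as a standard log-odds occupancy-grid update; the inverse sensor model probabilities below are assumed values for illustration, not those used in the thesis.

```python
import numpy as np

def logodds(p):
    return np.log(p / (1.0 - p))

P_OCC_HIT, P_OCC_MISS = 0.7, 0.35      # inverse sensor model probabilities (assumed values)

def update_cell(l, measurement_says_occupied):
    """Bayesian log-odds update of one grid cell from one range reading."""
    p = P_OCC_HIT if measurement_says_occupied else P_OCC_MISS
    return l + logodds(p)              # a prior of 0.5 contributes log-odds 0

# Fuse several sonar readings for one cell
l = 0.0                                # start at unknown (p = 0.5)
for hit in [True, True, False, True]:
    l = update_cell(l, hit)
occupancy_probability = 1.0 / (1.0 + np.exp(-l))
print(f"occupancy probability after fusion: {occupancy_probability:.2f}")
```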

  12. Daytime Water Detection Based on Sky Reflections

    Science.gov (United States)

    Rankin, Arturo; Matthies, Larry; Bellutta, Paolo

    2011-01-01

    A water body's surface can be modeled as a horizontal mirror. Water detection based on sky reflections and color variation are complementary. A reflection coefficient model suggests sky reflections dominate the color of water at ranges > 12 meters. Water detection based on sky reflections: (1) geometrically locates the pixel in the sky that is reflecting on a candidate water pixel on the ground (2) predicts if the ground pixel is water based on color similarity and local terrain features. Water detection has been integrated on XUVs.

  13. CLOUD-BASED PLATFORM FOR CREATING AND SHARING WEB MAPS

    Directory of Open Access Journals (Sweden)

    Jean Pierre Gatera

    2014-01-01

    Full Text Available The rise of cloud computing is one of the most important things happening in information technology today. While many things are moving into the cloud, this trend has also reached the Geographic Information System (GIS) world. For users of GIS technology, the cloud opens new possibilities for sharing web maps, applications and spatial data. The goal of this presentation/demo is to demonstrate ArcGIS Online, a cloud-based collaborative platform that allows users to easily and quickly create interactive web maps that can be shared with anyone. With ready-to-use content, apps, and templates you can produce web maps right away. And no matter what you use - desktops, browsers, smartphones, or tablets - you always have access to your content.

  14. Rapid Land Cover Map Updates Using Change Detection and Robust Random Forest Classifiers

    Directory of Open Access Journals (Sweden)

    Konrad J. Wessels

    2016-10-01

    Full Text Available The paper evaluated the Landsat Automated Land Cover Update Mapping (LALCUM) system designed to rapidly update a land cover map to a desired nominal year using a pre-existing reference land cover map. The system uses the Iteratively Reweighted Multivariate Alteration Detection (IRMAD) to identify areas of change and no change. The system then automatically generates large amounts of training samples (n > 1 million) in the no-change areas as input to an optimized Random Forest classifier. Experiments were conducted in the KwaZulu-Natal Province of South Africa using a reference land cover map from 2008, a change mask between 2008 and 2011 and Landsat ETM+ data for 2011. The entire system took 9.5 h to process. We expected that the use of the change mask would improve classification accuracy by reducing the number of mislabeled training data caused by land cover change between 2008 and 2011. However, this was not the case due to the exceptional robustness of the Random Forest classifier to mislabeled training samples. The system achieved an overall accuracy of 65%–67% using 22 detailed classes and 72%–74% using 12 aggregated national classes. “Water”, “Plantations”, “Plantations—clearfelled”, “Orchards—trees”, “Sugarcane”, “Built-up/dense settlement”, “Cultivation—Irrigated” and “Forest (indigenous)” had user’s accuracies above 70%. Other detailed classes (e.g., “Low density settlements”, “Mines and Quarries”, and “Cultivation, subsistence, drylands”) which are required for operational, provincial-scale land use planning and are usually mapped using manual image interpretation, could not be mapped using Landsat spectral data alone. However, the system was able to map the 12 national classes, at a sufficiently high level of accuracy for national scale land cover monitoring. This update approach and the highly automated, scalable LALCUM system can improve the efficiency and update rate of regional land
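
    A schematic sketch of the classification step (illustrative only; the band stack, mask and class codes below are made-up stand-ins for the real rasters): training samples are drawn from no-change pixels labelled by the reference map, and a Random Forest then predicts the updated land cover for every pixel.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        # Hypothetical arrays: a Landsat band stack for the target year, reference-year
        # labels, and a boolean no-change mask from the change detection step.
        bands = np.random.rand(6, 200, 200)                 # 6 spectral bands
        reference = np.random.randint(0, 12, (200, 200))    # reference land cover classes
        no_change = np.random.rand(200, 200) > 0.3          # True where no change was detected

        X = bands.reshape(6, -1).T                          # pixels as rows, bands as columns
        y = reference.ravel()

        # Train only on pixels believed unchanged, so the reference labels are still valid.
        rf = RandomForestClassifier(n_estimators=100, n_jobs=-1)
        rf.fit(X[no_change.ravel()], y[no_change.ravel()])

        updated_map = rf.predict(X).reshape(200, 200)       # land cover for the new nominal year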

  15. Spatiotemporal High-Resolution Cloud Mapping with a Ground-Based IR Scanner

    Directory of Open Access Journals (Sweden)

    Benjamin Brede

    2017-01-01

    Full Text Available The high spatiotemporal variability of clouds requires automated monitoring systems. This study presents a retrieval algorithm that evaluates observations of a hemispherically scanning thermal infrared radiometer, the NubiScope, to produce georeferenced, spatially explicit cloud maps. The algorithm uses atmospheric temperature and moisture profiles and an atmospheric radiative transfer code to differentiate between cloudy and cloudless measurements. In case of a cloud, it estimates its position by using the temperature profile and viewing geometry. The proposed algorithm was tested with 25 cloud maps generated by the Fmask algorithm from Landsat 7 images. The overall cloud detection rate ranged from 0.607 for zenith angles of 0 to 10° to 0.298 for 50–60° on a pixel basis. The overall detection of cloudless pixels was 0.987 for zenith angles of 30–40° and much more stable over the whole range of zenith angles compared to cloud detection. This demonstrates the algorithm's capability in detecting clouds, and even more so cloudless areas. Cloud-base height was estimated well up to a height of 4000 m when compared to ceilometer base heights, but showed large deviations above that level. This study shows the potential of the NubiScope system to produce high spatial and temporal resolution cloud maps. Future development is needed for a more accurate determination of cloud height with thermal infrared measurements.

  16. Intensity Based Seismic Hazard Map of Republic of Macedonia

    Science.gov (United States)

    Dojcinovski, Dragi; Dimiskovska, Biserka; Stojmanovska, Marta

    2016-04-01

    The territory of the Republic of Macedonia and the border terrains are among the most seismically active parts of the Balkan Peninsula, belonging to the Mediterranean-Trans-Asian seismic belt. The seismological data on the R. Macedonia from the past 16 centuries point to the occurrence of very strong catastrophic earthquakes. The hypocenters of these earthquakes are located above the Mohorovicic discontinuity, most frequently at a depth of 10-20 km. Accurate short-term prognosis of earthquake occurrence, i.e., simultaneous prognosis of the time, place and intensity of an earthquake, is still not possible. The present methods of seismic zoning have advanced to such an extent that, with great probability, they enable efficient protection against earthquake effects. The seismic hazard maps of the Republic of Macedonia are the result of analysis and synthesis of data from seismological, seismotectonic and other corresponding investigations necessary for definition of the expected level of seismic hazard for certain time periods. These should be amended, from time to time, with new data and scientific knowledge. The elaboration of this map does not completely solve all issues related to earthquakes, but it provides basic empirical data necessary for updating the existing regulations for construction of engineering structures in seismically active areas, which are governed by legal regulations and technical norms of which the seismic hazard map is a constituent part. The map has been elaborated based on complex seismological and geophysical investigations of the considered area and synthesis of the results from these investigations. There were two phases of elaboration of the map. In the first phase, the map of focal zones characterized by maximum magnitudes of possible earthquakes was elaborated. In the second phase, the intensities of expected earthquakes were computed according to the MCS scale. The map is prognostic, i.e., it provides assessment of the

  17. Parallel image encryption algorithm based on discretized chaotic map

    International Nuclear Information System (INIS)

    Zhou Qing; Wong Kwokwo; Liao Xiaofeng; Xiang Tao; Hu Yue

    2008-01-01

    Recently, a variety of chaos-based algorithms were proposed for image encryption. Nevertheless, none of them works efficiently in a parallel computing environment. In this paper, we propose a framework for parallel image encryption. Based on this framework, a new algorithm is designed using the discretized Kolmogorov flow map. It fulfills all the requirements for a parallel image encryption algorithm. Moreover, it is secure and fast. These properties make it a good choice for image encryption on parallel computing platforms.
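
    A toy sketch of the block-parallel idea, using a plain logistic map keystream as a stand-in (the paper itself uses the discretized Kolmogorov flow map within a full permutation-diffusion design): each image block is masked with a keystream derived from its own sub-key, so blocks can be processed by independent workers.

        import numpy as np
        from multiprocessing import Pool

        def logistic_keystream(seed, length, r=3.99):
            # Iterate the logistic map and quantize each state to one byte.
            x, out = seed, np.empty(length, dtype=np.uint8)
            for i in range(length):
                x = r * x * (1.0 - x)
                out[i] = int(x * 256) % 256
            return out

        def encrypt_block(args):
            block, seed = args
            ks = logistic_keystream(seed, block.size).reshape(block.shape)
            return block ^ ks                       # XOR masking; the identical call decrypts

        if __name__ == "__main__":
            image = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
            blocks = np.split(image, 4)             # four independent row-band blocks
            seeds = [0.1234 + 0.01 * i for i in range(4)]   # per-block sub-keys
            with Pool(4) as pool:
                cipher = np.vstack(pool.map(encrypt_block, zip(blocks, seeds)))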

  18. Photonic crystal fiber based antibody detection

    DEFF Research Database (Denmark)

    Duval, A; Lhoutellier, M; Jensen, J B

    2004-01-01

    An original approach for detecting labeled antibodies based on strong penetration photonic crystal fibers is introduced. The target antibody is immobilized inside the air-holes of a photonic crystal fiber and the detection is realized by means of evanescent-wave fluorescence spectroscopy...

  19. Delving Deep into Multiscale Pedestrian Detection via Single Scale Feature Maps

    Directory of Open Access Journals (Sweden)

    Xinchuan Fu

    2018-04-01

    Full Text Available The standard pipeline in pedestrian detection is sliding a pedestrian model over an image feature pyramid to detect pedestrians of different scales. In this pipeline, feature pyramid construction is time consuming and becomes the bottleneck for fast detection. Recently, a method called multiresolution filtered channels (MRFC) was proposed which uses only single-scale feature maps to achieve fast detection. However, there are two shortcomings in MRFC which limit its accuracy. One is that the receptive field correspondence across different scales is weak. The other is that the features used are not scale invariant. In this paper, two solutions are proposed to tackle these two shortcomings. Specifically, scale-aware pooling is proposed to achieve better receptive field correspondence, and a soft decision tree is proposed to relieve the scale variance problem. When coupled with an efficient sliding window classification strategy, our detector achieves fast detection speed while maintaining state-of-the-art accuracy.

  20. Landslide Change Detection Based on Multi-Temporal Airborne LiDAR-Derived DEMs

    Directory of Open Access Journals (Sweden)

    Omar E. Mora

    2018-01-01

    Full Text Available Remote sensing technologies have seen extraordinary improvements in both spatial resolution and accuracy recently. In particular, airborne laser scanning systems can now provide data for surface modeling with unprecedented resolution and accuracy, which can effectively support the detection of sub-meter surface features, vital for landslide mapping. Also, the easy repeatability of data acquisition offers the opportunity to monitor temporal surface changes, which are essential to identifying developing or active slides. Specific methods are needed to detect and map surface changes due to landslide activities. In this paper, we present a methodology that is based on fusing probabilistic change detection and landslide surface feature extraction utilizing multi-temporal Light Detection and Ranging (LiDAR) derived Digital Elevation Models (DEMs) to map surface changes demonstrating landslide activity. The proposed method was tested in an area with numerous slides ranging from 200 m2 to 27,000 m2 in area, under low vegetation and tree cover, in Zanesville, Ohio, USA. The surface changes observed are probabilistically evaluated to determine the likelihood of the changes being landslide activity related. Next, based on surface features, a Support Vector Machine (SVM) quantifies and maps the topographic signatures of landslides in the entire area. Finally, these two processes are fused to detect landslide-prone changes. The results demonstrate that 53 out of 80 inventory-mapped landslides were identified using this method. Additionally, some areas that were not mapped in the inventory map displayed changes that are likely to be developing landslides.
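
    A condensed sketch of the two fused steps (feature values and thresholds are hypothetical; the published workflow is more elaborate): the elevation change between two LiDAR DEMs is tested against the combined DEM uncertainty, and an SVM trained on terrain features flags landslide-like topography; a pixel is reported only when both sources agree.

        import numpy as np
        from scipy.stats import norm
        from sklearn.svm import SVC

        def change_probability(dem_t1, dem_t2, sigma1=0.15, sigma2=0.15):
            """Probability that the elevation difference exceeds the combined DEM noise."""
            diff = dem_t2 - dem_t1
            sigma = np.sqrt(sigma1**2 + sigma2**2)
            return 2 * norm.cdf(np.abs(diff) / sigma) - 1     # two-sided exceedance probability

        # Hypothetical training rows of terrain features [slope (deg), roughness, curvature],
        # labelled 1 for landslide terrain and 0 otherwise.
        X_train = np.array([[32, 0.8, -0.2], [5, 0.1, 0.0], [28, 0.7, -0.3], [3, 0.05, 0.1]])
        y_train = np.array([1, 0, 1, 0])
        svm = SVC(kernel="rbf").fit(X_train, y_train)

        # Four hypothetical pixels: their terrain features and their DEM elevation change (m).
        X_pixels = np.array([[30, 0.75, -0.25], [4, 0.08, 0.05], [27, 0.6, -0.2], [6, 0.1, 0.0]])
        dz = np.array([0.9, 0.04, 1.1, 0.5])

        p_change = change_probability(np.zeros(4), dz)
        slide_score = svm.decision_function(X_pixels)
        flagged = (p_change > 0.95) & (slide_score > 0)       # fuse change evidence and topography
        print(flagged)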

  1. The effects of a concept map-based support tool on simulation-based inquiry learning

    NARCIS (Netherlands)

    Hagemans, M.G.; van der Meij, Hans; de Jong, Anthonius J.M.

    2013-01-01

    Students often need support to optimize their learning in inquiry learning environments. In 2 studies, we investigated the effects of adding concept-map-based support to a simulation-based inquiry environment on kinematics. The concept map displayed the main domain concepts and their relations,

  2. The Effects of a Concept Map-Based Support Tool on Simulation-Based Inquiry Learning

    Science.gov (United States)

    Hagemans, Mieke G.; van der Meij, Hans; de Jong, Ton

    2013-01-01

    Students often need support to optimize their learning in inquiry learning environments. In 2 studies, we investigated the effects of adding concept-map-based support to a simulation-based inquiry environment on kinematics. The concept map displayed the main domain concepts and their relations, while dynamic color coding of the concepts displayed…

  3. Collaborative regression-based anatomical landmark detection

    International Nuclear Information System (INIS)

    Gao, Yaozong; Shen, Dinggang

    2015-01-01

    Anatomical landmark detection plays an important role in medical image analysis, e.g. for registration, segmentation and quantitative analysis. Among the various existing methods for landmark detection, regression-based methods have recently attracted much attention due to their robustness and efficiency. In these methods, landmarks are localised through voting from all image voxels, which is completely different from the classification-based methods that use voxel-wise classification to detect landmarks. Despite their robustness, the accuracy of regression-based landmark detection methods is often limited due to (1) the inclusion of uninformative image voxels in the voting procedure, and (2) the lack of effective ways to incorporate inter-landmark spatial dependency into the detection step. In this paper, we propose a collaborative landmark detection framework to address these limitations. The concept of collaboration is reflected in two aspects. (1) Multi-resolution collaboration. A multi-resolution strategy is proposed to hierarchically localise landmarks by gradually excluding uninformative votes from faraway voxels. Moreover, for informative voxels near the landmark, a spherical sampling strategy is also designed at the training stage to improve their prediction accuracy. (2) Inter-landmark collaboration. A confidence-based landmark detection strategy is proposed to improve the detection accuracy of ‘difficult-to-detect’ landmarks by using spatial guidance from ‘easy-to-detect’ landmarks. To evaluate our method, we conducted experiments extensively on three datasets for detecting prostate landmarks and head and neck landmarks in computed tomography images, and also dental landmarks in cone beam computed tomography images. The results show the effectiveness of our collaborative landmark detection framework in improving landmark detection accuracy, compared to other state-of-the-art methods. (paper)

  4. Apriori-based network intrusion detection system

    International Nuclear Information System (INIS)

    Wang Wenjin; Liu Junrong; Liu Baoxu

    2012-01-01

    With the development of network communication technology, more and more social activities are run over the Internet. In the meantime, network information security is becoming an increasingly serious concern. Intrusion Detection Systems (IDS) have greatly improved the general security level of whole networks. But there are still many problems in current IDSs, e.g., high missed-detection and false alarm rates, and feature libraries that need frequent upgrades. This paper presents an association-rule-based IDS. The system can detect unknown attacks by generating rules from training data. Experiments in the last chapter show that the system achieves high accuracy in unknown attack detection. (authors)
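
    A self-contained miniature of the association-rule idea (illustrative only, with brute-force itemset counting standing in for full Apriori pruning): frequent feature combinations are mined from labelled connection records, and combinations seen only in attack traffic become detection rules.

        from itertools import combinations
        from collections import Counter

        # Toy connection records: each record is a set of discretized traffic features.
        attacks = [{"dst_port=23", "flag=SYN", "dur=short"},
                   {"dst_port=23", "flag=SYN", "bytes=low"},
                   {"dst_port=23", "flag=SYN", "dur=short"}]
        normal  = [{"dst_port=80", "flag=ACK", "dur=short"},
                   {"dst_port=443", "flag=ACK", "bytes=high"}]

        def frequent_itemsets(records, min_support=2, max_size=2):
            # Count every feature combination up to max_size and keep the frequent ones.
            counts = Counter()
            for rec in records:
                for size in range(1, max_size + 1):
                    counts.update(frozenset(c) for c in combinations(sorted(rec), size))
            return {items for items, n in counts.items() if n >= min_support}

        attack_patterns = frequent_itemsets(attacks)
        normal_patterns = frequent_itemsets(normal, min_support=1)
        rules = attack_patterns - normal_patterns          # patterns seen only in attack traffic

        def is_suspicious(record):
            return any(rule <= record for rule in rules)   # subset test against the mined rules

        print(is_suspicious({"dst_port=23", "flag=SYN", "bytes=low"}))   # True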

  5. Power Consumption Based Android Malware Detection

    Directory of Open Access Journals (Sweden)

    Hongyu Yang

    2016-01-01

    Full Text Available In order to solve the problem that the Android platform's sand-box mechanism prevents security protection software from accessing effective information to detect malware, this paper proposes a malicious software detection method based on power consumption. Firstly, the mobile battery consumption status information was obtained, and a Gaussian mixture model (GMM) was built by using Mel frequency cepstral coefficients (MFCC). Then, the GMM was used to analyze power consumption; malicious software can be classified and detected through classification processing. Experimental results demonstrate that the function of an application and its power consumption have a close relationship, and our method can detect some typical malicious application software accurately.
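
    A schematic sketch of the detection idea only (the paper first derives MFCC features from the power trace, which is omitted here; feature values and the threshold are illustrative): a Gaussian mixture model is fitted on power-consumption features of benign applications, and traces that score poorly under that model are flagged.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)

        # Hypothetical feature vectors (e.g., summarized per-interval power readings)
        # for benign applications; malware is assumed to draw power differently.
        benign_features = rng.normal(loc=[1.0, 0.5], scale=0.1, size=(500, 2))
        suspect_features = rng.normal(loc=[1.8, 1.4], scale=0.1, size=(5, 2))

        gmm = GaussianMixture(n_components=2, random_state=0).fit(benign_features)

        # Flag a trace as potential malware when its log-likelihood is far below
        # what benign traces typically achieve.
        threshold = np.percentile(gmm.score_samples(benign_features), 1)
        print(gmm.score_samples(suspect_features) < threshold)   # expected: all True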

  6. Using Map Service API for Driving Cycle Detection for Wearable GPS Data: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Lei [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gonder, Jeffrey D [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-06

    Following advancements in smartphone and portable global positioning system (GPS) data collection, wearable GPS data have realized extensive use in transportation surveys and studies. The task of detecting driving cycles (driving or car-mode trajectory segments) from wearable GPS data has been the subject of much research. Specifically, distinguishing driving cycles from other motorized trips (such as taking a bus) is the main research problem in this paper. Many mode detection methods only focus on raw GPS speed data while some studies apply additional information, such as geographic information system (GIS) data, to obtain better detection performance. Procuring and maintaining dedicated road GIS data are costly and not trivial, whereas the technical maturity and broad use of map service application program interface (API) queries offers opportunities for mode detection tasks. The proposed driving cycle detection method takes advantage of map service APIs to obtain high-quality car-mode API route information and uses a trajectory segmentation algorithm to find the best-matched API route. The car-mode API route data combined with the actual route information, including the actual mode information, are used to train a logistic regression machine learning model, which estimates car modes and non-car modes with probability rates. The experimental results show promise for the proposed method's ability to detect vehicle mode accurately.
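
    A toy sketch of the final classification stage (feature names are hypothetical; in the paper the features come from matching the GPS trace against car-mode API routes): a logistic regression outputs the probability that a trajectory segment is a driving cycle.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical per-segment features: [mean speed (km/h), share of points matched
        # to the best car-mode API route, number of stops per km].
        X = np.array([[45.0, 0.95, 0.4],   # car
                      [18.0, 0.55, 2.5],   # bus
                      [52.0, 0.90, 0.3],   # car
                      [15.0, 0.60, 3.0],   # bus
                      [60.0, 0.97, 0.2],   # car
                      [20.0, 0.50, 2.2]])  # bus
        y = np.array([1, 0, 1, 0, 1, 0])   # 1 = driving cycle, 0 = other motorized mode

        clf = LogisticRegression().fit(X, y)
        print(clf.predict_proba([[48.0, 0.92, 0.5]])[0, 1])   # probability of car mode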

  7. Planimetric Features Generalization for the Production of Small-Scale Map by Using Base Maps and the Existing Algorithms

    Directory of Open Access Journals (Sweden)

    M. Modiri

    2014-10-01

    Full Text Available Cartographic maps are representations of the Earth upon a flat surface at a smaller scale than true scale. Large scale maps cover relatively small regions in great detail and small scale maps cover large regions such as nations, continents and the whole globe. The logical connection between features and the map scale must be maintained when the scale changes, and it is important to recognize that even the most accurate maps sacrifice a certain amount of accuracy in scale to deliver greater visual usefulness to the user. Cartographic generalization, or map generalization, is the method whereby information is selected and represented on a map in a way that adapts to the scale of the display medium of the map, not necessarily preserving all intricate geographical or other cartographic details. Due to the problems facing the small-scale map production process and the time and money that surveying requires, generalization is used today as a practical approach. The software proposed in this paper converts various data and information into a certain data model. This software can produce a generalized map from a base map using existing algorithms. Planimetric generalization algorithms and rules are described in this article. Finally, small-scale maps at 1:100,000, 1:250,000 and 1:500,000 scale are produced automatically and are shown at the end.

  8. Scalable, incremental learning with MapReduce parallelization for cell detection in high-resolution 3D microscopy data

    KAUST Repository

    Sung, Chul

    2013-08-01

    Accurate estimation of neuronal count and distribution is central to the understanding of the organization and layout of cortical maps in the brain, and changes in the cell population induced by brain disorders. High-throughput 3D microscopy techniques such as Knife-Edge Scanning Microscopy (KESM) are enabling whole-brain survey of neuronal distributions. Data from such techniques pose serious challenges to quantitative analysis due to the massive, growing, and sparsely labeled nature of the data. In this paper, we present a scalable, incremental learning algorithm for cell body detection that can address these issues. Our algorithm is computationally efficient (linear mapping, non-iterative) and does not require retraining (unlike gradient-based approaches) or retention of old raw data (unlike instance-based learning). We tested our algorithm on our rat brain Nissl data set, showing superior performance compared to an artificial neural network-based benchmark, and also demonstrated robust performance in a scenario where the data set is rapidly growing in size. Our algorithm is also highly parallelizable due to its incremental nature, and we demonstrated this empirically using a MapReduce-based implementation of the algorithm. We expect our scalable, incremental learning approach to be widely applicable to medical imaging domains where there is a constant flux of new data. © 2013 IEEE.

  9. Automated Land Cover Change Detection and Mapping from Hidden Parameter Estimates of Normalized Difference Vegetation Index (NDVI) Time-Series

    Science.gov (United States)

    Chakraborty, S.; Banerjee, A.; Gupta, S. K. S.; Christensen, P. R.; Papandreou-Suppappola, A.

    2017-12-01

    Multitemporal observations acquired frequently by satellites with short revisit periods, such as the Moderate Resolution Imaging Spectroradiometer (MODIS), are an important source for modeling land cover. Due to the inherent seasonality of the land cover, harmonic modeling reveals hidden state parameters characteristic of it, which are used in classifying different land cover types and in detecting changes due to natural or anthropogenic factors. In this work, we use an eight-day MODIS composite to create a Normalized Difference Vegetation Index (NDVI) time-series of ten years. Improved hidden parameter estimates of the nonlinear harmonic NDVI model are obtained using the Particle Filter (PF), a sequential Monte Carlo estimator. The nonlinear estimation based on the PF is shown to improve parameter estimation for different land cover types compared to existing techniques that use the Extended Kalman Filter (EKF) and therefore require linearization of the harmonic model. As these parameters are representative of a given land cover, their applicability in near real-time detection of land cover change is also studied by formulating a metric that captures parameter deviation due to change. The detection methodology is evaluated by treating change as a rare class problem. This approach is shown to detect change with minimum delay. Additionally, the degree of change within the change perimeter is non-uniform. By clustering the deviation in parameters due to change, this spatial variation in change severity is effectively mapped and validated with high spatial resolution change maps of the given regions.
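
    A compact sketch of the estimation idea, using a bootstrap particle filter on a single annual harmonic (the study's state model and change metric are richer than this): the hidden mean, amplitude and phase of the NDVI seasonality are tracked, and a jump in the estimated parameters signals a possible land cover change.

        import numpy as np

        rng = np.random.default_rng(1)

        def harmonic(params, t, period=46):           # 46 eight-day composites per year
            mean, amp, phase = params.T
            return mean + amp * np.cos(2 * np.pi * t / period + phase)

        # Simulated NDVI series with an abrupt change (e.g., clearing) at t = 200.
        t_axis = np.arange(460)
        truth = 0.5 + 0.25 * np.cos(2 * np.pi * t_axis / 46)
        truth[200:] = 0.2 + 0.05 * np.cos(2 * np.pi * t_axis[200:] / 46)
        obs = truth + rng.normal(0, 0.03, truth.size)

        n = 2000
        particles = np.column_stack([rng.uniform(0, 1, n),             # mean
                                     rng.uniform(0, 0.5, n),           # amplitude
                                     rng.uniform(-np.pi, np.pi, n)])   # phase
        estimates = []
        for t, y in zip(t_axis, obs):
            particles += rng.normal(0, [0.01, 0.01, 0.02], particles.shape)  # random-walk dynamics
            w = np.exp(-0.5 * ((y - harmonic(particles, t)) / 0.03) ** 2)    # observation likelihood
            w /= w.sum()
            estimates.append(w @ particles)                                  # posterior mean
            particles = particles[rng.choice(n, n, p=w)]                     # multinomial resampling

        estimates = np.array(estimates)
        # Mean and amplitude estimates well before vs. well after the change point.
        print(estimates[190, :2], estimates[320, :2])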

  10. The feasibility of colorectal cancer detection using dual-energy computed tomography with iodine mapping

    International Nuclear Information System (INIS)

    Boellaard, T.N.; Henneman, O.D.F.; Streekstra, G.J.; Venema, H.W.; Nio, C.Y.; Dorth-Rombouts, M.C. van; Stoker, J.

    2013-01-01

    Aim: To assess the feasibility of colorectal cancer detection using dual-energy computed tomography with iodine mapping and without bowel preparation or bowel distension. Materials and methods: Consecutive patients scheduled for preoperative staging computed tomography (CT) because of diagnosed or high suspicion for colorectal cancer were prospectively included in the study. A single contrast-enhanced abdominal CT acquisition using dual-source mode (100 kV/140 kV) was performed without bowel preparation. Weighted average 120 kV images and iodine maps were created with post-processing. Two observers performed a blinded read for colorectal lesions after being trained on three colorectal cancer patients. One observer performed an unblinded read for lesion detectability and placed a region of interest (ROI) within each lesion. Results: In total 21 patients were included and 18 had a colorectal cancer at the time of the CT acquisition. Median cancer size was 43 mm [interquartile range (IQR) 27–60 mm] and all 18 colorectal cancers were visible on the 120 kV images and iodine map during the unblinded read. During the blinded read, observers found 90% (27/30) of the cancers with 120 kV images only and 96.7% (29/30) after viewing the iodine map in addition (p = 0.5). Median enhancement of colorectal cancers was 29.9 HU (IQR 23.1–34.6). The largest benign lesions (70 and 25 mm) were visible on the 120 kV images and iodine map, whereas four smaller benign lesions (7–15 mm) were not. Conclusion: Colorectal cancers are visible on the contrast-enhanced dual-energy CT without bowel preparation or insufflation. Because of the patient-friendly nature of this approach, further studies should explore its use for colorectal cancer detection in frail and elderly patients

  11. Linear-fitting-based similarity coefficient map for tissue dissimilarity analysis in -w magnetic resonance imaging

    International Nuclear Information System (INIS)

    Yu Shao-De; Wu Shi-Bin; Xie Yao-Qin; Wang Hao-Yu; Wei Xin-Hua; Chen Xin; Pan Wan-Long; Hu Jiani

    2015-01-01

    Similarity coefficient mapping (SCM) aims to improve the morphological evaluation of weighted magnetic resonance imaging. However, how to interpret the generated SCM map is still an open question. Moreover, is it possible to extract tissue dissimilarity information based on the theory behind SCM? The primary purpose of this paper is to address these two questions. First, the theory of SCM was interpreted from the perspective of linear fitting. Then, a term was embedded for tissue dissimilarity information. Finally, our method was validated with sixteen human brain image series from multi-echo imaging. Generated maps were investigated in terms of signal-to-noise ratio (SNR) and perceived visual quality, and then interpreted in terms of intra- and inter-tissue intensity. Experimental results show that both the perceptibility of anatomical structures and the tissue contrast are improved. More importantly, tissue similarity or dissimilarity can be quantified and cross-validated from pixel intensity analysis. This method benefits image enhancement, tissue classification, malformation detection and morphological evaluation. (paper)

  12. An authenticated image encryption scheme based on chaotic maps and memory cellular automata

    Science.gov (United States)

    Bakhshandeh, Atieh; Eslami, Ziba

    2013-06-01

    This paper introduces a new image encryption scheme based on chaotic maps, cellular automata and permutation-diffusion architecture. In the permutation phase, a piecewise linear chaotic map is utilized to confuse the plain-image and in the diffusion phase, we employ the Logistic map as well as a reversible memory cellular automata to obtain an efficient and secure cryptosystem. The proposed method admits advantages such as highly secure diffusion mechanism, computational efficiency and ease of implementation. A novel property of the proposed scheme is its authentication ability which can detect whether the image is tampered during the transmission or not. This is particularly important in applications where image data or part of it contains highly sensitive information. Results of various analyses manifest high security of this new method and its capability for practical image encryption.

  13. Failure detection in high-performance clusters and computers using chaotic map computations

    Science.gov (United States)

    Rao, Nageswara S.

    2015-09-01

    A programmable media includes a processing unit capable of independent operation in a machine that is capable of executing 10^18 floating point operations per second. The processing unit is in communication with a memory element and an interconnect that couples computing nodes. The programmable media includes a logical unit configured to execute arithmetic functions, comparative functions, and/or logical functions. The processing unit is configured to detect computing component failures, memory element failures and/or interconnect failures by executing programming threads that generate one or more chaotic map trajectories. The central processing unit or graphical processing unit is configured to detect a computing component failure, memory element failure and/or an interconnect failure through an automated comparison of signal trajectories generated by the chaotic maps.
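
    A minimal sketch of the underlying principle (illustrative, not the patented system): because chaotic trajectories diverge exponentially after any perturbation, replicas of the same map iteration run on different computing units will disagree soon after even a tiny arithmetic fault, which makes the fault detectable by trajectory comparison.

        def logistic_trajectory(x0, steps, r=3.99, fault_at=None):
            """Iterate the logistic map; optionally inject a tiny fault at one step."""
            xs, x = [], x0
            for i in range(steps):
                x = r * x * (1.0 - x)
                if i == fault_at:
                    x += 1e-12            # simulated bit-flip-sized error in one unit
                xs.append(x)
            return xs

        healthy = logistic_trajectory(0.3, 100)
        faulty = logistic_trajectory(0.3, 100, fault_at=50)

        # Trajectories agree up to the fault, then diverge measurably within a few
        # dozen iterations because the error is amplified at each step.
        first_divergence = next(i for i, (a, b) in enumerate(zip(healthy, faulty))
                                if abs(a - b) > 1e-6)
        print(first_divergence)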

  14. Intelligent Machine Vision for Automated Fence Intruder Detection Using Self-organizing Map

    OpenAIRE

    Veldin A. Talorete Jr.; Sherwin A Guirnaldo

    2017-01-01

    This paper presents an intelligent machine vision system for automated fence intruder detection. A series of still images containing fence events, captured using Internet Protocol cameras, was used as input data to the system. Two classifiers were used; the first classifies human posture and the second classifies intruder location. The system classifiers were implemented using a Self-Organizing Map after several image segmentation processes. The human posture classifie...

  15. LIDAR-INCORPORATED TRAFFIC SIGN DETECTION FROM VIDEO LOG IMAGES OF MOBILE MAPPING SYSTEM

    Directory of Open Access Journals (Sweden)

    Y. Li

    2016-06-01

    Full Text Available A Mobile Mapping System (MMS) simultaneously collects Lidar points and video log images of a scene with a laser profiler and a digital camera. Besides the textural details of the video log images, it also captures the 3D geometric shape of the point cloud. It is widely used to survey the street view and roadside transportation infrastructure, such as traffic signs, guardrails, etc., in many transportation agencies. Although much literature on traffic sign detection is available, it focuses on either the Lidar or the imagery data of traffic signs. Based on the well-calibrated extrinsic parameters of the MMS, 3D Lidar points are, for the first time, incorporated into 2D video log images to enhance the detection of traffic signs both physically and visually. Based on the local elevation, the 3D pavement area is first located. Within a certain distance and height of the pavement, points of overhead and roadside traffic signs can be obtained according to the setup specifications of traffic signs in different transportation agencies. The 3D candidate planes of traffic signs are then fitted using RANSAC plane-fitting of those points. By projecting the candidate planes onto the image, Regions of Interest (ROIs) of traffic signs are found physically with the geometric constraints between laser profiling and camera imaging. Random forest learning of the visual color and shape features of traffic signs is adopted to validate the sign ROIs from the video log images. The sequential occurrence of a traffic sign among consecutive video log images is defined by the geometric constraint of the imaging geometry and GPS movement. Candidate ROIs are predicted in this temporal context to double-check the salient traffic sign among video log images. The proposed algorithm is tested on a diverse set of scenarios on the interstate highway G-4 near Beijing, China under varying lighting conditions and occlusions. Experimental results show the proposed algorithm enhances the
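
    A bare-bones sketch of the RANSAC plane-fitting step applied to a cluster of candidate sign points (illustrative only; the MMS calibration, image projection and Random forest validation described above are omitted):

        import numpy as np

        def ransac_plane(points, iterations=200, threshold=0.05, rng=np.random.default_rng(0)):
            """Fit a plane n.x + d = 0 to 3D points, robust to outliers."""
            best_inliers, best_model = None, None
            for _ in range(iterations):
                sample = points[rng.choice(len(points), 3, replace=False)]
                normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
                norm = np.linalg.norm(normal)
                if norm < 1e-9:
                    continue                        # degenerate (collinear) sample
                normal /= norm
                d = -normal @ sample[0]
                dist = np.abs(points @ normal + d)  # point-to-plane distances
                inliers = dist < threshold
                if best_inliers is None or inliers.sum() > best_inliers.sum():
                    best_inliers, best_model = inliers, (normal, d)
            return best_model, best_inliers

        # Synthetic sign points lying on the vertical plane x = 5 m, plus noise and clutter.
        rng = np.random.default_rng(1)
        plane_pts = np.column_stack([np.full(200, 5.0) + rng.normal(0, 0.01, 200),
                                     rng.uniform(-0.5, 0.5, 200),
                                     rng.uniform(2.0, 3.0, 200)])
        clutter = rng.uniform(0, 6, (20, 3))
        model, inliers = ransac_plane(np.vstack([plane_pts, clutter]))
        print(model[0], inliers.sum())    # normal close to (+-1, 0, 0), about 200 inliers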

  16. Alteration zone mapping for detecting potential mineralized areas in Kaladawan of north altyn tagh using ASTER data

    International Nuclear Information System (INIS)

    Yong-gui, Zhou; Bai-lin, Chen; Xing-tong, Chen; Zheng-le, Chen

    2014-01-01

    The Kaladawan area has been found, during earlier geological investigations, to contain intensely hydrothermally altered rocks associated with mineralized areas such as the Kaladaban Pb-Zn deposit and the A-bei Ag-Pb deposit. The sparse vegetation cover and excellent bedrock exposure make it a suitable place for the use of remote sensing methods for lithological mapping. ASTER data has been used in this study to identify alteration zones, and then to detect potential mineralized areas. Band ratio and PCA procedures were applied based on the analysis of the spectral properties of typical alteration minerals. The band 4/2 ratio and the mineralogic indices proposed by Ninomiya were designed to map the distribution of Fe-oxides and alteration zones. Selected band combinations were transformed in a PCA procedure to map the Al-OH, Mg-OH, CO3^2− and Fe-oxide altered minerals. The analysis focused on the spatial distribution of hydrothermally altered minerals. The band ratio result images, including both the Fe-oxide and mineralogic indices, show high-level similarity with the PCA transform procedure. They both show intense hydrothermal alteration zones in Kaladaban, west Kaladawan and the A-bei area. Hence, these areas are considered to have potential for further mineralogic exploration. The results were validated by field work in the Kaladaban and west Kaladawan areas, indicating that this method can be a useful tool for detecting potential mineralization areas in Kaladawan and similar areas elsewhere.
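
    A small sketch of the band-ratio step (assuming the ASTER bands are already available as co-registered reflectance arrays; the arrays and the highlighting threshold below are illustrative): the 4/2 ratio emphasizes Fe-oxide-bearing alteration, and similar ratios can be combined into the mineralogic indices used in the study.

        import numpy as np

        def band_ratio(numerator, denominator, eps=1e-6):
            """Pixel-wise ratio of two co-registered bands, safe against division by zero."""
            return numerator / (denominator + eps)

        # Hypothetical reflectance arrays standing in for ASTER band 2 (VNIR) and band 4 (SWIR).
        band2 = np.random.uniform(0.05, 0.4, (300, 300)).astype(np.float32)
        band4 = np.random.uniform(0.05, 0.6, (300, 300)).astype(np.float32)

        ferric_ratio = band_ratio(band4, band2)            # high values suggest Fe-oxide alteration
        anomaly_mask = ferric_ratio > np.percentile(ferric_ratio, 95)
        print(anomaly_mask.sum(), "pixels flagged as potential alteration")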

  17. Name-Based Address Mapping for Virtual Private Networks

    Science.gov (United States)

    Surányi, Péter; Shinjo, Yasushi; Kato, Kazuhiko

    IPv4 private addresses are commonly used in local area networks (LANs). With the increasing popularity of virtual private networks (VPNs), it has become common that a user connects to multiple LANs at the same time. However, private address ranges for LANs frequently overlap. In such cases, existing systems do not allow the user to access the resources on all LANs at the same time. In this paper, we propose name-based address mapping for VPNs, a novel method that allows connecting to hosts through multiple VPNs at the same time, even when the address ranges of the VPNs overlap. In name-based address mapping, rather than using the IP addresses used on the LANs (the real addresses), we assign a unique virtual address to each remote host based on its domain name. The local host uses the virtual addresses to communicate with remote hosts. We have implemented name-based address mapping for layer 3 OpenVPN connections on Linux and measured its performance. The communication overhead of our system is less than 1.5% for throughput and less than 0.2ms for each name resolution.
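
    A toy illustration of the mapping idea (not the authors' implementation; the virtual range and hashing scheme are invented for the example): each remote host name is deterministically mapped to a unique virtual address drawn from a range that cannot collide with the overlapping real LAN ranges.

        import hashlib
        import ipaddress

        VIRTUAL_NET = ipaddress.ip_network("198.18.0.0/15")   # example range reserved for virtual addresses

        def virtual_address(hostname, assigned={}):
            """Deterministically map a domain name to a unique virtual IPv4 address."""
            if hostname in assigned:
                return assigned[hostname]
            digest = int(hashlib.sha256(hostname.encode()).hexdigest(), 16)
            addr = VIRTUAL_NET[digest % VIRTUAL_NET.num_addresses]
            while addr in assigned.values():                   # resolve rare hash collisions
                digest += 1
                addr = VIRTUAL_NET[digest % VIRTUAL_NET.num_addresses]
            assigned[hostname] = addr
            return addr

        # Two hosts on different VPNs may share the real address 192.168.1.10,
        # but their virtual addresses are distinct and stable across name lookups.
        print(virtual_address("fileserver.office-a.example"))
        print(virtual_address("fileserver.office-b.example"))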

  18. VISION BASED OBSTACLE DETECTION IN UAV IMAGING

    Directory of Open Access Journals (Sweden)

    S. Badrloo

    2017-08-01

    Full Text Available Detecting and preventing collisions with obstacles is crucial in UAV navigation and control. Most of the common obstacle detection techniques are currently sensor-based. Small UAVs are not able to carry obstacle detection sensors such as radar; therefore, vision-based methods are considered, which can be divided into stereo-based and mono-based techniques. Mono-based methods are classified into two groups: foreground-background separation, and brain-inspired methods. Brain-inspired methods are highly efficient in obstacle detection; hence, this research aims to detect obstacles using brain-inspired techniques, which exploit the fact that an obstacle appears to enlarge as it is approached. Recent research in this field has concentrated on matching SIFT points, along with the SIFT size-ratio factor and the area ratio of convex hulls in two consecutive frames, to detect obstacles. This method is not able to distinguish between near and far obstacles or obstacles in complex environments, and is sensitive to wrongly matched points. In order to solve the above-mentioned problems, this research calculates the dist-ratio of matched points. Then, each point is investigated to distinguish between far and close obstacles. The results demonstrated the high efficiency of the proposed method in complex environments.

  19. Islands of biogeodiversity in arid lands on a polygons map study: Detecting scale invariance patterns from natural resources maps.

    Science.gov (United States)

    Ibáñez, J J; Pérez-Gómez, R; Brevik, Eric C; Cerdà, A

    2016-12-15

    Many maps (geology, hydrology, soil, vegetation, etc.) are created to inventory natural resources. Each of these resources is mapped using a unique set of criteria, including scales and taxonomies. Past research indicates that comparing results of related maps (e.g., soil and geology maps) may aid in identifying mapping deficiencies. Therefore, this study was undertaken in Almeria Province, Spain to (i) compare the underlying map structures of soil and vegetation maps and (ii) investigate if a vegetation map can provide useful soil information that was not shown on a soil map. Soil and vegetation maps were imported into ArcGIS 10.1 for spatial analysis, and results then exported to Microsoft Excel worksheets for statistical analyses to evaluate fits to linear and power law regression models. Vegetative units were grouped according to the driving forces that determined their presence or absence: (i) climatophilous; (ii) lithologic-climate; and (iii) edaphophylous. The rank abundance plots for both the soil and vegetation maps conformed to Willis or Hollow Curves, meaning the underlying structures of both maps were the same. Edaphophylous map units, which represent 58.5% of the vegetation units in the study area, did not show a good correlation with the soil map. Further investigation revealed that 87% of the edaphohygrophilous units were found in ramblas, ephemeral riverbeds that are not typically classified and mapped as soils in modern systems, even though they meet the definition of soil given by the most commonly used and most modern soil taxonomic systems. Furthermore, these edaphophylous map units tend to be islands of biodiversity that are threatened by anthropogenic activity in the region. Therefore, this study revealed areas that need to be revisited and studied pedologically. The vegetation mapped in these areas and the soils that support it are key components of the earth's critical zone that must be studied, understood, and preserved.

  20. Breaking an encryption scheme based on chaotic baker map

    International Nuclear Information System (INIS)

    Alvarez, Gonzalo; Li, Shujun

    2006-01-01

    In recent years, a growing number of cryptosystems based on chaos have been proposed, many of them fundamentally flawed by a lack of robustness and security. This Letter describes the security weaknesses of a recently proposed cryptographic algorithm with chaos at the physical level based on the baker map. It is shown that the security is trivially compromised for practical implementations of the cryptosystem with finite computing precision and for the use of the iteration number n as the secret key. Some possible countermeasures to enhance the security of the chaos-based cryptographic algorithm are also discussed

  1. Small-size pedestrian detection in large scene based on fast R-CNN

    Science.gov (United States)

    Wang, Shengke; Yang, Na; Duan, Lianghua; Liu, Lu; Dong, Junyu

    2018-04-01

    Pedestrian detection is a canonical sub-problem of object detection that has been in high demand in recent years. Although recent deep learning object detectors such as Fast/Faster R-CNN have shown excellent performance for general object detection, they have limited success for small-size pedestrian detection in large-view scenes. We find that the insufficient resolution of feature maps leads to unsatisfactory accuracy when handling small instances. In this paper, we investigate issues involving Fast R-CNN for pedestrian detection. Driven by these observations, we propose a very simple but effective baseline for pedestrian detection based on Fast R-CNN, employing the DPM detector to generate proposals for accuracy, and training a Fast R-CNN style network to jointly optimize small-size pedestrian detection, with skip connections concatenating features from different layers to address the coarseness of the feature maps. Accuracy is thereby improved for small-size pedestrian detection in real large scenes.

  2. Pseudo random number generator based on quantum chaotic map

    Science.gov (United States)

    Akhshani, A.; Akhavan, A.; Mobaraki, A.; Lim, S.-C.; Hassan, Z.

    2014-01-01

    For many years dissipative quantum maps were widely used as informative models of quantum chaos. In this paper, a new scheme for generating good pseudo-random numbers (PRNG), based on the quantum logistic map, is proposed. Note that the PRNG relies merely on the equations used in the quantum chaotic map. The algorithm is not complex, so it does not impose high requirements on computer hardware and thus computation speed is fast. In order to face the challenge of using the proposed PRNG in quantum cryptography and other practical applications, the proposed PRNG is subjected to statistical tests using well-known test suites such as NIST, DIEHARD, ENT and TestU01. The results of the statistical tests were promising, as the proposed PRNG successfully passed all these tests. Moreover, the degree of non-periodicity of the chaotic sequences of the quantum map is investigated through the scale index technique. The obtained result shows that the sequence is highly non-periodic. From these results it can be concluded that the new scheme can generate a high percentage of usable pseudo-random numbers for simulation and other applications in scientific computing.

  3. Self-organizing maps based on limit cycle attractors.

    Science.gov (United States)

    Huang, Di-Wei; Gentili, Rodolphe J; Reggia, James A

    2015-03-01

    Recent efforts to develop large-scale brain and neurocognitive architectures have paid relatively little attention to the use of self-organizing maps (SOMs). Part of the reason for this is that most conventional SOMs use a static encoding representation: each input pattern or sequence is effectively represented as a fixed point activation pattern in the map layer, something that is inconsistent with the rhythmic oscillatory activity observed in the brain. Here we develop and study an alternative encoding scheme that instead uses sparsely-coded limit cycles to represent external input patterns/sequences. We establish conditions under which learned limit cycle representations arise reliably and dominate the dynamics in a SOM. These limit cycles tend to be relatively unique for different inputs, robust to perturbations, and fairly insensitive to timing. In spite of the continually changing activity in the map layer when a limit cycle representation is used, map formation continues to occur reliably. In a two-SOM architecture where each SOM represents a different sensory modality, we also show that after learning, limit cycles in one SOM can correctly evoke corresponding limit cycles in the other, and thus there is the potential for multi-SOM systems using limit cycles to work effectively as hetero-associative memories. While the results presented here are only first steps, they establish the viability of SOM models based on limit cycle activity patterns, and suggest that such models merit further study. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Pedestrian detection based on redundant wavelet transform

    Science.gov (United States)

    Huang, Lin; Ji, Liping; Hu, Ping; Yang, Tiejun

    2016-10-01

    Intelligent video surveillance is the analysis of video or image sequences captured by a fixed or mobile surveillance camera, including moving object detection, segmentation and recognition. By using it, we can be notified immediately of an abnormal situation. Pedestrian detection plays an important role in an intelligent video surveillance system, and it is also a key technology in the field of intelligent vehicles. So pedestrian detection has vital significance in traffic management optimization, security early warning and abnormal behavior detection. Generally, pedestrian detection can be summarized as: first, estimate moving areas; then, extract features of regions of interest; finally, classify using a classifier. The redundant wavelet transform (RWT) overcomes the shift-variance deficiency of the discrete wavelet transform, and it has better performance in motion estimation when compared to the discrete wavelet transform. Addressing the problem of detecting multiple pedestrians moving at different speeds, we present an algorithm for pedestrian detection based on motion estimation using RWT, combining the histogram of oriented gradients (HOG) and a support vector machine (SVM). Firstly, three intensities of movement (IoM) are estimated using RWT and the corresponding areas are segmented. According to the different IoM, a region proposal (RP) is generated. Then, the features of an RP are extracted using HOG. Finally, the features are fed into an SVM trained on pedestrian databases and the final detection results are obtained. Experiments show that the proposed algorithm can detect pedestrians accurately and efficiently.

  5. Generating a Danish raster-based topsoil property map combining choropleth maps and point information

    DEFF Research Database (Denmark)

    Greve, Mogens H.; Greve, Mette B.; Bøcher, Peder K.

    2007-01-01

    The Danish environmental authorities have posed a soil type dependent restriction on the application of nitrogen. The official Danish soil map is a choropleth topsoil map classifying the agricultural land into eight classes. The use of the soil map has shown that the maps have serious classification flaws. The objective of this work is to compile a continuous national topsoil texture map to replace the old topsoil map. Approximately 45,000 point samples were interpolated using ordinary kriging in 250 m x 250 m cells. To reduce variability and to obtain more homogeneous strata, the samples were stratified according to landscape types. Five new soil texture maps were compiled; one for each of the five textural classes, and a new categorical soil type map was compiled using the old classification system. Both the old choropleth map and the new continuous soil maps were compared to 354...

  6. Rapid Prototyping of a Map-Based Android App

    OpenAIRE

    Flanagan, Nicholas M; Theller, Eric; Theller, Larry

    2013-01-01

    This project tries to provide a mobile phone-based solution app named “DriftWatch Pollinator Mapper” that will allow beekeepers, apiary inspectors, and association staff to easily register and map a hive into the Driftwatch system, where local pesticide applicators will notice it and be aware of the presence of pollinators. The purpose of the mobile application is to speed the process of registering beekeepers within DriftWatch, since many beekeepers have significant trouble using only web-ba...

  7. SODIM: Service Oriented Data Integration based on MapReduce

    Directory of Open Access Journals (Sweden)

    Ghada ElSheikh

    2013-09-01

    Data integration systems can benefit from innovative dynamic infrastructure solutions such as Clouds, with their greater agility, lower cost, device independence, location independence, and scalability. This study consolidates data integration systems, Service Orientation, and distributed processing to develop a new data integration system called Service Oriented Data Integration based on MapReduce (SODIM), which improves system performance, especially with large numbers of data sources, and which can efficiently be hosted on modern dynamic infrastructures such as Clouds.

  8. VoIP attacks detection engine based on neural network

    Science.gov (United States)

    Safarik, Jakub; Slachta, Jiri

    2015-05-01

    Security is crucial for any system nowadays, especially communications. One of the most successful protocols in the field of communication over IP networks is the Session Initiation Protocol (SIP). It is an open protocol used by different kinds of applications, both open-source and proprietary. Its high penetration and text-based principle have made SIP the number one target in IP telephony infrastructure, so the security of the SIP server is essential. To keep up with hackers and to detect potential malicious attacks, a security administrator needs to monitor and evaluate SIP traffic in the network. But monitoring and subsequent evaluation could easily overwhelm the security administrator, typically in networks with a number of SIP servers, users and logically or geographically separated networks. The proposed solution lies in automatic attack detection systems. The article covers the detection of VoIP attacks through a distributed network of nodes. An aggregation server then analyzes the gathered data with an artificial neural network, namely a multilayer perceptron trained with a set of collected attacks. Attack data could also be preprocessed and verified with a self-organizing map. The source data is detected by a distributed network of detection nodes. Each node contains a honeypot application and a traffic monitoring mechanism. Aggregation of data from each node creates the input for the neural network. The automatic classification on a centralized server with low false positive detection reduces the cost of attack detection resources. The detection system uses a modular design for easy deployment in the final infrastructure. The centralized server collects and processes the detected traffic. It also maintains all detection nodes.
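
    A compact sketch of the classification stage (the feature vectors and attack labels are hypothetical; in the deployed system they are aggregated from the honeypot nodes): a multilayer perceptron is trained on per-source SIP traffic features and then labels new traffic as a known attack type or benign.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Hypothetical per-source features aggregated from the detection nodes:
        # [REGISTER msgs/min, INVITE msgs/min, distinct extensions probed, error replies/min]
        X = np.array([[300, 2, 250, 280],    # registration scan
                      [5, 120, 3, 90],       # call flooding
                      [2, 1, 1, 0],          # benign
                      [280, 1, 300, 260],    # registration scan
                      [3, 2, 2, 1],          # benign
                      [4, 150, 2, 110]])     # call flooding
        y = np.array(["scan", "flood", "benign", "scan", "benign", "flood"])

        model = make_pipeline(StandardScaler(),
                              MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
        model.fit(X, y)
        print(model.predict([[310, 3, 270, 290]]))   # expected: ['scan']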

  9. Fine mapping quantitative trait loci under selective phenotyping strategies based on linkage and linkage disequilibrium criteria

    DEFF Research Database (Denmark)

    Ansari-Mahyari, S; Berg, P; Lund, M S

    2009-01-01

    Linkage-based sampling criteria (LAC) and linkage disequilibrium-based sampling criteria (LDC) for selecting individuals to phenotype are compared to random phenotyping in a quantitative trait loci (QTL) verification experiment using stochastic simulation. Several strategies based on LAC and LDC for selecting the most informative 30%, 40% or 50% of individuals for phenotyping, to extract maximum power and precision in a QTL fine mapping experiment, were developed and assessed. Linkage analyses for the mapping were performed for individuals sampled on LAC within families, and combined linkage disequilibrium and linkage analyses were performed for individuals sampled across the whole population based on LDC. The results showed that selecting individuals with haplotypes similar to the paternal haplotypes (minimum recombination criterion) using LAC, compared to random phenotyping, gave at least the same power to detect a QTL but decreased the accuracy of the QTL position. However...

  10. Utilizing Multi-Sensor Fire Detections to Map Fires in the United States

    Science.gov (United States)

    Howard, S. M.; Picotte, J. J.; Coan, M. J.

    2014-11-01

    In 2006, the Monitoring Trends in Burn Severity (MTBS) project began as a cooperative effort between the US Forest Service (USFS) and the U.S. Geological Survey (USGS) to map and assess burn severity for all large fires that have occurred in the United States since 1984. Using Landsat imagery, MTBS is mandated to map wildfires and prescribed fires that meet specific size criteria: greater than 1000 acres in the west and 500 acres in the east, regardless of ownership. Relying mostly on federal and state fire occurrence records, over 15,300 individual fires have been mapped. While mapping recorded fires, an additional 2,700 "unknown" or undocumented fires were discovered and assessed. It has become apparent that there are perhaps thousands of undocumented fires in the US that are yet to be mapped. Fire occurrence records alone are inadequate if MTBS is to provide a comprehensive accounting of fire across the US. Additionally, the sheer number of fires to assess has overwhelmed current manual procedures. To address these problems, the National Aeronautics and Space Administration (NASA) Applied Sciences Program is helping to fund the efforts of the USGS and its MTBS partners (USFS, National Park Service) to develop and implement a system to automatically identify fires using satellite data. In near real time, the USGS will combine active fire satellite detections from the MODIS, AVHRR and GOES satellites with Landsat acquisitions. Newly acquired Landsat imagery will be routinely scanned to identify freshly burned area pixels, derive an initial perimeter and tag the burned area with the satellite date and time of detection. Landsat imagery from the early archive will be scanned to identify undocumented fires. Additional automated fire assessment processes will be developed. The USGS will develop these processes using open source software packages in order to provide freely available tools to local land managers, providing them with the capability to assess fires at the local level.

  11. Synergy Maps: exploring compound combinations using network-based visualization.

    Science.gov (United States)

    Lewis, Richard; Guha, Rajarshi; Korcsmaros, Tamás; Bender, Andreas

    2015-01-01

    The phenomenon of super-additivity of biological response to compounds applied jointly, termed synergy, has the potential to provide many therapeutic benefits. Therefore, high throughput screening of compound combinations has recently received a great deal of attention. Large compound libraries and the feasibility of all-pairs screening can easily generate large, information-rich datasets. Previously, these datasets have been visualized using either a heat-map or a network approach; however, these visualizations only partially represent the information encoded in the dataset. A new visualization technique for pairwise combination screening data, termed "Synergy Maps", is presented. In a Synergy Map, information about the synergistic interactions of compounds is integrated with information about their properties (chemical structure, physicochemical properties, bioactivity profiles) to produce a single visualization. As a result the relationships between compound and combination properties may be investigated simultaneously, and thus may afford insight into the synergy observed in the screen. An interactive web app implementation, available at http://richlewis42.github.io/synergy-maps, has been developed for public use, which may find use in navigating and filtering larger scale combination datasets. This tool is applied to a recent all-pairs dataset of anti-malarials, tested against Plasmodium falciparum, and a preliminary analysis is given as an example, illustrating the disproportionate synergism of histone deacetylase inhibitors previously described in the literature, as well as suggesting new hypotheses for future investigation. Synergy Maps improve the state of the art in compound combination visualization, by simultaneously representing individual compound properties and their interactions. The web-based tool allows straightforward exploration of combination data, and easier identification of correlations between compound properties and interactions.

  12. Power Consumption Based Android Malware Detection

    OpenAIRE

    Hongyu Yang; Ruiwen Tang

    2016-01-01

    In order to solve the problem that Android platform’s sand-box mechanism prevents security protection software from accessing effective information to detect malware, this paper proposes a malicious software detection method based on power consumption. Firstly, the mobile battery consumption status information was obtained, and the Gaussian mixture model (GMM) was built by using Mel frequency cepstral coefficients (MFCC). Then, the GMM was used to analyze power consumption; malicious software...

  13. Plagiarism Detection Based on SCAM Algorithm

    DEFF Research Database (Denmark)

    Anzelmi, Daniele; Carlone, Domenico; Rizzello, Fabio

    2011-01-01

    Plagiarism is a complex problem and considered one of the biggest in the publishing of scientific, engineering and other types of documents. Plagiarism has also increased with the widespread use of the Internet, as large amounts of digital data are available. Plagiarism is not just direct copying but also paraphrasing, rewording, adapting parts, missing references or wrong citations. This makes the problem more difficult to handle adequately. Plagiarism detection techniques are applied by making a distinction between natural and programming languages. Our proposed detection process is based on natural language documents. Our plagiarism detection system, like many Information Retrieval systems, is evaluated with metrics of precision and recall....

  14. Pedestrian detection from thermal images: A sparse representation based approach

    Science.gov (United States)

    Qi, Bin; John, Vijay; Liu, Zheng; Mita, Seiichi

    2016-05-01

    Pedestrian detection, a key technology in computer vision, plays a paramount role in the applications of advanced driver assistant systems (ADASs) and autonomous vehicles. The objective of pedestrian detection is to identify and locate people in a dynamic environment so that accidents can be avoided. With significant variations introduced by illumination, occlusion, articulated pose, and complex background, pedestrian detection is a challenging task for visual perception. Different from visible images, thermal images are captured and presented with intensity maps based on objects' emissivity, and thus have an enhanced spectral range to make human beings perceptible against the cool background. In this study, a sparse representation based approach is proposed for pedestrian detection from thermal images. We first adopted the histogram of sparse codes to represent image features and then detect pedestrians with the extracted features in a unimodal and a multimodal framework, respectively. In the unimodal framework, two types of dictionaries, i.e. a joint dictionary and an individual dictionary, are built by learning from prepared training samples. In the multimodal framework, a weighted fusion scheme is proposed to further highlight the contributions from features with higher separability. To validate the proposed approach, experiments were conducted to compare with three widely used features: Haar wavelets (HWs), histogram of oriented gradients (HOG), and histogram of phase congruency (HPC), as well as two classification methods, i.e. AdaBoost and support vector machine (SVM). Experimental results on a publicly available data set demonstrate the superiority of the proposed approach.

  15. Integrating collaborative concept mapping in case based learning

    Directory of Open Access Journals (Sweden)

    Alfredo Tifi

    2013-03-01

    Full Text Available Different significance of collaborative concept mapping and collaborative argumentation in Case Based Learning are discussed and compared in the different perspectives of answering focus questions, of fostering reflective thinking skills and of managing uncertainty in problem solving in a scaffolded environment. Marked differences are pointed out between the way concepts are used in constructing concept maps and the way meanings are adopted in case based learning through guided argumentation activities. Shared concept maps should be given different scopes, for example (a) as an advance organizer in preparing a background system of concepts that will undergo transformation while accompanying the inquiry activities on case studies or problems; (b) together with narratives, to enhance awareness of the situated epistemologies that are being entailed in choosing certain concepts during more complex case studies; and (c) after learning, for construction of a holistic vision of the whole domain by means of the most inclusive concepts, while scaffolded collaborative writing of narratives and arguments in describing and treating cases could better serve as a source of situated-inspired tools to create and refine meanings for particular concepts.

  16. Affordance-based individuation of junctions in Open Street Map

    Directory of Open Access Journals (Sweden)

    Simon Scheider

    2012-06-01

    Full Text Available We propose an algorithm that can be used to identify automatically the subset of street segments of a road network map that corresponds to a junction. The main idea is to use turn-compliant locomotion affordances, i.e., restricted patterns of supported movement, in order to specify junctions independently of their data representation, and in order to motivate tractable individuation and classification strategies. We argue that common approaches based solely on geometry or topology of the street segment graph are useful but insufficient proxies. They miss certain turn restrictions essential to junctions. From a computational viewpoint, the main challenge of affordance-based individuation of junctions lies in its complex recursive definition. In this paper, we show how Open Street Map data can be interpreted into locomotion affordances, and how the recursive junction definition can be translated into a deterministic algorithm. We evaluate this algorithm by applying it to small map excerpts in order to delineate the contained junctions.

  17. Agent-based mapping of credit risk for sustainable microfinance.

    Directory of Open Access Journals (Sweden)

    Joung-Hun Lee

    Full Text Available By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk--a major obstacle to the sustainability of lenders outreaching to the poor. Specifically, using the elements of network theory, we constructed an agent-based model that obeys the stylized rules of microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital.

  18. Agent-based mapping of credit risk for sustainable microfinance.

    Science.gov (United States)

    Lee, Joung-Hun; Jusup, Marko; Podobnik, Boris; Iwasa, Yoh

    2015-01-01

    By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk--a major obstacle to the sustainability of lenders outreaching to the poor. Specifically, using the elements of network theory, we constructed an agent-based model that obeys the stylized rules of microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital.

  19. Agroclimatic mapping of maize crop based on soil physical properties

    International Nuclear Information System (INIS)

    Dourado Neto, Durval; Sparovek, G.; Reichardt, K.; Timm, Luiz Carlos; Nielsen, D.R.

    2004-01-01

    With the purpose of estimating water deficit to forecast yield knowing productivity (potential yield), the water balance is a useful tool to recommend maize exploration and to define the sowing date. The computation can be done for each region with the objective of mapping maize grain yield based on agro-climatic data and soil physical properties. Based on agro-climatic data, air temperature and solar radiation, a model was built to estimate corn grain productivity (the energy conversion results in dry mass production). The carbon dioxide (CO2) fixation by plants is related to gross carbohydrate (CH2O) production and solar radiation. The CO2 assimilation by C4 plants depends on the photosynthetically active radiation and temperature. From agro-climatic data and soil physical properties, a map with region identification can be built for solar radiation, air temperature, rainfall, maize grain productivity and yield, potential and real evapotranspiration and water deficit. The map allows identification of the agro-climatic and soil physical restrictions. This procedure can be used at different spatial (farm to State) and temporal (daily to monthly data) scales. The statistical analysis allows comparison of estimated and observed values in different situations to validate the model and to verify which scale is more appropriate.

  20. Dataflow-Based Mapping of Computer Vision Algorithms onto FPGAs

    Directory of Open Access Journals (Sweden)

    Ivan Corretjer

    2007-01-01

    Full Text Available We develop a design methodology for mapping computer vision algorithms onto an FPGA through the use of coarse-grain reconfigurable dataflow graphs as a representation to guide the designer. We first describe a new dataflow modeling technique called homogeneous parameterized dataflow (HPDF), which effectively captures the structure of an important class of computer vision applications. This form of dynamic dataflow takes advantage of the property that in a large number of image processing applications, data production and consumption rates can vary, but are equal across dataflow graph edges for any particular application iteration. After motivating and defining the HPDF model of computation, we develop an HPDF-based design methodology that offers useful properties in terms of verifying correctness and exposing performance-enhancing transformations; we discuss and address various challenges in efficiently mapping an HPDF-based application representation into target-specific HDL code; and we present experimental results pertaining to the mapping of a gesture recognition application onto the Xilinx Virtex II FPGA.

  1. Keyframes Global Map Establishing Method for Robot Localization through Content-Based Image Matching

    Directory of Open Access Journals (Sweden)

    Tianyang Cao

    2017-01-01

    Full Text Available Self-localization and mapping are important for indoor mobile robots. We report a robust algorithm for map building and subsequent localization especially suited for indoor floor-cleaning robots. Common methods, for example SLAM, can easily be "kidnapped" by collisions or disturbed by similar objects. Therefore, a keyframes-based global map establishing method for robot localization in multiple rooms and corridors is needed. Content-based image matching is the core of this method. It is designed for this situation by establishing keyframes containing both floor and distorted wall images. Image distortion, caused by robot view angle and movement, is analyzed and deduced. An image matching solution is presented, consisting of extraction of overlap regions between keyframes and overlap-region rebuilding through subblock matching. To improve accuracy, ceiling-point detection and mismatched-subblock checking methods are incorporated. This matching method can process environment video effectively. In experiments, less than 5% of frames are extracted as keyframes to build the global map; these keyframes have large space distance and overlap each other. Through this method, the robot can localize itself by matching its real-time vision frames with our keyframes map. Even with many similar objects or backgrounds in the environment, or when the robot is kidnapped, robot localization is achieved with position RMSE <0.5 m.

  2. A Parallel Encryption Algorithm Based on Piecewise Linear Chaotic Map

    Directory of Open Access Journals (Sweden)

    Xizhong Wang

    2013-01-01

    Full Text Available We introduce a parallel chaos-based encryption algorithm for taking advantage of multicore processors. The chaotic cryptosystem is generated by the piecewise linear chaotic map (PWLCM). The parallel algorithm is designed with a master/slave communication model using the Message Passing Interface (MPI). The algorithm is suitable not only for multicore processors but also for the single-processor architecture. The experimental results show that the chaos-based cryptosystem possesses good statistical properties. The parallel algorithm provides much better performance than the serial one and would be useful for encrypting/decrypting large files or multimedia.
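
    The abstract does not give the map's parameters or the keystream construction, so the sketch below is only a generic illustration of how a piecewise linear chaotic map can drive a stream-style cipher: iterate the PWLCM from a key-derived state, quantize each iterate to a byte and XOR it with the data. The control parameter, quantization and key handling are assumptions, and the MPI master/slave parallelization of the paper is not shown.

```python
import numpy as np

def pwlcm(x, p):
    """One iteration of the piecewise linear chaotic map on [0, 1], parameter p in (0, 0.5)."""
    if x >= 0.5:                      # the map is symmetric about 0.5
        x = 1.0 - x
    return x / p if x < p else (x - p) / (0.5 - p)

def keystream(n, x0=0.3141592, p=0.23, burn_in=500):
    """Generate n pseudo-random bytes by iterating the PWLCM and quantizing each state."""
    x, out = x0, np.empty(n, dtype=np.uint8)
    for _ in range(burn_in):
        x = pwlcm(x, p)
    for i in range(n):
        x = pwlcm(x, p)
        out[i] = int(x * 256) % 256
    return out

def xor_cipher(data: bytes, x0=0.3141592, p=0.23) -> bytes:
    """Encrypt/decrypt by XOR with the chaotic keystream (the operation is its own inverse)."""
    ks = keystream(len(data), x0, p)
    return bytes(np.bitwise_xor(np.frombuffer(data, dtype=np.uint8), ks))

plaintext = b"parallel chaos-based encryption demo"
ciphertext = xor_cipher(plaintext)
assert xor_cipher(ciphertext) == plaintext
print(ciphertext.hex())
```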

  3. Understanding of Object Detection Based on CNN Family and YOLO

    Science.gov (United States)

    Du, Juan

    2018-04-01

    As a key application of image processing, object detection has boomed along with the unprecedented advancement of the Convolutional Neural Network (CNN) and its variants since 2012. As the CNN series developed to the Faster Region-based CNN (Faster R-CNN), the mean average precision (mAP) reached 76.4, whereas the frames per second (FPS) of Faster R-CNN remains 5 to 18, which is far slower than real-time performance. Thus, the most urgent requirement for improving object detection is to accelerate the speed. Following a general introduction to the background and the core solution, the CNN, this paper presents one of the best CNN representatives, You Only Look Once (YOLO), which breaks with the CNN family's tradition and introduces a completely new, simple and highly efficient way of solving object detection. Its fastest configuration achieves an unparalleled 155 FPS, and its mAP can reach up to 78.6, both of which greatly surpass the performance of Faster R-CNN. Additionally, compared with the latest advanced solutions, YOLOv2 achieves an excellent tradeoff between speed and accuracy, as well as being an object detector with strong generalization ability that represents the whole image.
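
    The record above is a survey rather than a recipe, but one concrete step shared by YOLO-style and R-CNN-style detectors is the post-processing of overlapping box predictions. The sketch below shows intersection-over-union and greedy non-maximum suppression with NumPy; the boxes, scores and IoU threshold are made-up illustrative values, not output from any of the cited models.

```python
import numpy as np

def iou(box, boxes):
    """Intersection-over-union between one box and an array of boxes (x1, y1, x2, y2)."""
    x1 = np.maximum(box[0], boxes[:, 0]); y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2]); y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area = (box[2] - box[0]) * (box[3] - box[1])
    areas = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return inter / (area + areas - inter)

def non_max_suppression(boxes, scores, iou_thresh=0.5):
    """Greedy NMS: keep the highest-scoring box, drop heavily overlapping ones, repeat."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size:
        best, rest = order[0], order[1:]
        keep.append(int(best))
        order = rest[iou(boxes[best], boxes[rest]) < iou_thresh]
    return keep

boxes = np.array([[10, 10, 60, 80], [12, 12, 62, 82], [100, 40, 150, 120]], float)
scores = np.array([0.9, 0.75, 0.8])
print(non_max_suppression(boxes, scores))  # -> [0, 2]
```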

  4. SU-E-J-193: Application of Surface Mapping in Detecting Swallowing for Head-&-Neck Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Cao, D; Xie, X; Mehta, V; Shepard, D [Swedish Cancer Institute, Seattle, WA (United States)

    2015-06-15

    Purpose: Recent evidence is emerging that long term swallowing function may be improved after radiotherapy for head-&-neck cancer if doses are limited to certain swallowing structures. Immobilization of patients with head-&-neck cancer is typically done with a mask. This mask, however, doesn’t limit patient swallowing. Patient voluntary or involuntary swallowing may introduce significant tumor motion, which can lead to suboptimal delivery. In this study, we have examined the feasibility of using surface mapping technology to detect patient swallowing during treatment and evaluated its magnitude. Methods: The C-RAD Catalyst system was used to detect the patient surface map. A volunteer lying on the couch was used to simulate the patient under treatment. A virtual marker was placed near the throat and was used to monitor the swallowing action. The target motion calculated by the Catalyst system through deformable registration was also collected. Two treatment isocenters, one placed close to the throat and the other placed posterior to the base-of-tongue, were used to check the sensitivity of surface mapping technique. Results: When the patient’s throat is not in the shadow of the patient’s chest, the Catalyst system can clearly identify the swallowing motion. In our tests, the vertical motion of the skin can reach to about 5mm. The calculated target motion can reach up to 1 cm. The magnitude of this calculated target motion is more dramatic when the plan isocenter is closer to the skin surface, which suggests that the Catalyst motion tracking technique is more sensitive to the swallowing motion with a shallower isocenter. Conclusion: Surface mapping can clearly identify patient swallowing during radiation treatment. This information can be used to evaluate the dosimetric impact of the involuntary swallowing. It may also be used to potentially gate head-&-neck radiation treatments. A prospective IRB approved study is currently enrolling patients in our

  5. PERCEPTUAL MAPPING BASED ON IDIOSYNCRATIC SETS OF ATTRIBUTES

    NARCIS (Netherlands)

    STEENKAMP, JBEM; VANTRIJP, HCM; TENBERGE, JMF

    The authors describe a compositional perceptual mapping procedure, unrestricted attribute-elicitation mapping (UAM), which allows consumers to describe and rate the brands in their own terminology and thus relaxes the restrictive assumptions of traditional compositional mapping techniques regarding

  6. Performance Evaluation of Java Based Object Relational Mapping Tools

    Directory of Open Access Journals (Sweden)

    Shoaib Mahmood Bhatti

    2013-04-01

    Full Text Available Object persistence is a hot issue in the form of ORM (Object Relational Mapping) tools in industry, as developers use these tools during software development. This paper presents a performance evaluation of Java-based ORM tools. For this purpose, Hibernate, Ebean and TopLink have been selected as ORM tools that are popular and open source. Their performance has been measured from an execution point of view. The results show that ORM tools are a good option for developers considering system throughput with short setbacks, and that they can be used efficiently and effectively for mapping objects into the relation-dominated world of databases, thus creating hope for a better future for this technology.

  7. Low Cost Vision Based Personal Mobile Mapping System

    Directory of Open Access Journals (Sweden)

    M. M. Amami

    2014-03-01

    Full Text Available Mobile mapping systems (MMS) can be used for several purposes, such as transportation, highway infrastructure mapping and GIS data collection. However, the acceptance of these systems is not widespread and their use is still limited due to the high cost and dependency on the Global Navigation Satellite System (GNSS). A low cost vision based personal MMS has been produced with an aim to overcome these limitations. The system has been designed to depend mainly on cameras and use of low cost GNSS and inertial sensors to provide a bundle adjustment solution with initial values. The system has the potential to be used indoors and outdoors. The system has been tested indoors and outdoors with different GPS coverage, surrounding features, and narrow and curvy paths. Tests show that the system is able to work in such environments, providing 3D coordinates of better than 10 cm accuracy.

  8. Low Cost Vision Based Personal Mobile Mapping System

    Science.gov (United States)

    Amami, M. M.; Smith, M. J.; Kokkas, N.

    2014-03-01

    Mobile mapping systems (MMS) can be used for several purposes, such as transportation, highway infrastructure mapping and GIS data collection. However, the acceptance of these systems is not widespread and their use is still limited due to the high cost and dependency on the Global Navigation Satellite System (GNSS). A low cost vision based personal MMS has been produced with an aim to overcome these limitations. The system has been designed to depend mainly on cameras and use of low cost GNSS and inertial sensors to provide a bundle adjustment solution with initial values. The system has the potential to be used indoors and outdoors. The system has been tested indoors and outdoors with different GPS coverage, surrounding features, and narrow and curvy paths. Tests show that the system is able to work in such environments, providing 3D coordinates of better than 10 cm accuracy.

  9. A new cryptosystem based on chaotic map and operations algebraic

    International Nuclear Information System (INIS)

    Yang Huaqian; Liao Xiaofeng; Wong, Kwok-wo; Zhang Wei; Wei Pengcheng

    2009-01-01

    Based on the study of some existing chaotic encryption algorithms, a new block cipher is proposed. The proposed cipher encrypts 128-bit plaintext to 128-bit ciphertext blocks, using a 128-bit key K and the initial value x0 and the control parameter μ of the logistic map. It consists of an initial permutation and eight computationally identical rounds followed by an output transformation. Round r uses a 128-bit round key K(r) to transform a 128-bit input C(r-1), which is fed to the next round. The output after round 8 enters the output transformation to produce the final ciphertext. All round keys are derived from K and a 128-bit random binary sequence generated from a chaotic map. Analysis shows that the proposed block cipher does not suffer from the flaws of pure chaotic cryptosystems and possesses high security.

  10. Empty tracks optimization based on Z-Map model

    Science.gov (United States)

    Liu, Le; Yan, Guangrong; Wang, Zaijun; Zang, Genao

    2017-12-01

    For parts with many features, there are many empty tracks during machining. If these tracks are not optimized, machining efficiency is seriously affected. In this paper, the characteristics of the empty tracks are studied in detail. Combined with existing optimization algorithms, a new track optimization method based on the Z-Map model is proposed. In this method, the tool tracks are divided into unit processing segments, and the Z-Map model simulation technique is then used to analyze the order constraints between the unit segments. The empty-stroke optimization problem is transformed into a TSP with sequential constraints, and the established TSP is then solved with a genetic algorithm. This optimization method can optimize not only simple structural parts but also complex structural parts, so as to effectively plan the empty tracks and greatly improve machining efficiency.
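
    The paper itself solves the sequence-constrained TSP with a genetic algorithm; the sketch below only illustrates the objective with a much simpler greedy nearest-neighbour ordering that respects precedence constraints between unit segments. The segment coordinates and the precedence pairs are invented for the example.

```python
import math

# Hypothetical unit machining segments: (entry point, exit point) in XY.
segments = {
    'A': ((0, 0), (5, 0)),
    'B': ((20, 5), (25, 5)),
    'C': ((6, 1), (10, 1)),
    'D': ((26, 6), (30, 6)),
}
# Precedence constraints (in the paper, derived from Z-Map simulation): prerequisite -> dependent.
must_precede = {('A', 'C'), ('B', 'D')}

def empty_move(a, b):
    """Length of the empty (non-cutting) move from the exit of segment a to the entry of b."""
    (x1, y1), (x2, y2) = segments[a][1], segments[b][0]
    return math.hypot(x2 - x1, y2 - y1)

def greedy_order(start='A'):
    """Pick, at each step, the nearest segment whose prerequisites are already machined."""
    order, remaining = [start], set(segments) - {start}
    while remaining:
        allowed = [s for s in remaining
                   if all(p in order for p, d in must_precede if d == s)]
        nxt = min(allowed, key=lambda s: empty_move(order[-1], s))
        order.append(nxt)
        remaining.remove(nxt)
    return order

route = greedy_order()
total = sum(empty_move(a, b) for a, b in zip(route, route[1:]))
print(route, f"empty travel = {total:.1f}")
```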

  11. Seizure detection algorithms based on EMG signals

    DEFF Research Database (Denmark)

    Conradsen, Isa

    Background: the currently used non-invasive seizure detection methods are not reliable. Muscle fibers are directly connected to the nerves, whereby electric signals are generated during activity. Therefore, an alarm system on electromyography (EMG) signals is a theoretical possibility. Objective...... on the amplitude of the signal. The other algorithm was based on information of the signal in the frequency domain, and it focused on synchronisation of the electrical activity in a single muscle during the seizure. Results: The amplitude-based algorithm reliably detected seizures in 2 of the patients, while...... the frequency-based algorithm was efficient for detecting the seizures in the third patient. Conclusion: Our results suggest that EMG signals could be used to develop an automatic seizuredetection system. However, different patients might require different types of algorithms /approaches....
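
    The thesis abstract above is truncated, but it describes an amplitude-based detector. The sketch below is a minimal reading of that idea: rectify the EMG, smooth it into an envelope, and raise an alarm when the envelope stays above a threshold for a minimum duration. The sampling rate, thresholds and synthetic signal are placeholders, and the frequency-domain (synchronisation) algorithm is not shown.

```python
import numpy as np

FS = 1000  # Hz, assumed sampling rate

def amplitude_alarm(emg, threshold, min_duration_s=2.0, win_s=0.25):
    """Return True if the smoothed rectified EMG exceeds `threshold`
    continuously for at least `min_duration_s` seconds."""
    win = int(win_s * FS)
    envelope = np.convolve(np.abs(emg), np.ones(win) / win, mode='same')
    above = envelope > threshold
    run, longest = 0, 0
    for a in above:
        run = run + 1 if a else 0
        longest = max(longest, run)
    return longest >= int(min_duration_s * FS)

# Synthetic example: 10 s of baseline noise with a 3 s high-amplitude burst.
rng = np.random.default_rng(1)
emg = 0.05 * rng.standard_normal(10 * FS)
emg[4 * FS:7 * FS] += 0.5 * rng.standard_normal(3 * FS)
print(amplitude_alarm(emg, threshold=0.2))  # -> True
```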

  12. A LiDAR based analysis of hydraulic hazard mapping

    Science.gov (United States)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a ticklish procedure as it involves technical and socio-economic aspects. On the one hand, no dangerous areas should be excluded; on the other hand, it is important not to extend the surface assigned to use limitations beyond what is necessary. The availability of a high resolution topographic survey nowadays allows this task to be faced with innovative procedures, both in the planning (mapping) and in the map validation phases. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore does not intend in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This has the consequence of unrealistically considering the vegetation as a geometric obstacle to water flow. In some cases the topographic model construction requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible, and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated to the talweg) is often inverted. In the second step the segments are analysed

  13. UTE-T2* mapping detects sub-clinical meniscus injury after anterior cruciate ligament tear

    Science.gov (United States)

    Williams, A.; Qian, Y.; Golla, S.; Chu, C.R.

    2018-01-01

    SUMMARY Objective Meniscus tear is a known risk factor for osteoarthritis (OA). Quantitative assessment of meniscus degeneration, prior to surface break-down, is important to identification of early disease potentially amenable to therapeutic interventions. This work examines the diagnostic potential of ultrashort echo time-enhanced T2* (UTE-T2*) mapping to detect human meniscus degeneration in vitro and in vivo in subjects at risk of developing OA. Design UTE-T2* maps of 16 human cadaver menisci were compared to histological evaluations of meniscal structural integrity and clinical magnetic resonance imaging (MRI) assessment by a musculoskeletal radiologist. In vivo UTE-T2* maps were compared in 10 asymptomatic subjects and 25 ACL-injured patients with and without concomitant meniscal tear. Results In vitro, UTE-T2* values tended to be lower in histologically and clinically normal meniscus tissue and higher in torn or degenerate tissue. UTE-T2* map heterogeneity reflected collagen disorganization. In vivo, asymptomatic meniscus UTE-T2* values were repeatable within 9% (root-mean-square average coefficient of variation). Posteromedial meniscus UTE-T2* values in ACL-injured subjects with clinically diagnosed medial meniscus tear (n = 10) were 87% higher than asymptomatics (n = 10, P meniscus degeneration. Further study is needed to determine whether elevated subsurface meniscus UTE-T2* values predict progression of meniscal degeneration and development of OA. PMID:22306000

  14. Spectral features based tea garden extraction from digital orthophoto maps

    Science.gov (United States)

    Jamil, Akhtar; Bayram, Bulent; Kucuk, Turgay; Zafer Seker, Dursun

    2018-05-01

    Advancements in photogrammetry and remote sensing technologies have made it possible to extract useful tangible information from data, which plays a pivotal role in various applications such as the management and monitoring of forests and agricultural lands. This study aimed to evaluate the effectiveness of spectral signatures for extraction of tea gardens from 1 : 5000 scaled digital orthophoto maps obtained from Rize city in Turkey. First, the normalized difference vegetation index (NDVI) was derived from the input images to suppress the non-vegetation areas. NDVI values less than zero were discarded and the output image was normalized to the range 0-255. Individual pixels were then mapped into meaningful objects using a global region growing technique. The resulting image was filtered and smoothed to reduce the impact of noise. Furthermore, geometrical constraints were applied to remove small objects (less than 500 pixels), followed by a morphological opening operator to enhance the results. These objects served as building blocks for further image analysis. Finally, for the classification stage, a range of spectral values was empirically calculated for each band and applied to candidate objects to extract tea gardens. For accuracy assessment, we employed an area-based similarity metric, overlapping the obtained tea garden boundaries with manually digitized tea garden boundaries created by photogrammetry experts. The overall accuracy of the proposed method scored 89 % for tea gardens from 10 sample orthophoto maps. We concluded that exploiting the spectral signatures using object-based analysis is an effective technique for extraction of dominant tree species from digital orthophoto maps.
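
    A minimal sketch of the pre-processing chain described above: compute NDVI from red and near-infrared bands, discard negative values, apply an illustrative threshold, drop connected objects smaller than 500 pixels and clean the mask with a morphological opening. The band arrays are synthetic placeholders and the threshold is an assumption; the per-band spectral classification rules and the area-based accuracy assessment of the paper are not reproduced.

```python
import numpy as np
from scipy import ndimage

def tea_garden_mask(red, nir, min_pixels=500):
    """Vegetation mask from NDVI with small-object removal and morphological opening."""
    ndvi = (nir - red) / (nir + red + 1e-6)
    ndvi[ndvi < 0] = 0                                    # suppress non-vegetation
    scaled = (255 * ndvi / max(ndvi.max(), 1e-6)).astype(np.uint8)
    mask = scaled > 100                                   # illustrative spectral threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, list(range(1, n + 1)))
    keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_pixels))
    return ndimage.binary_opening(keep, structure=np.ones((3, 3)))

# Placeholder orthophoto bands (reflectance-like values).
rng = np.random.default_rng(0)
red = rng.random((200, 200)) * 0.3
nir = rng.random((200, 200)) * 0.3
nir[50:150, 50:150] += 0.5                                # a synthetic vegetated block
print(tea_garden_mask(red, nir).sum(), "pixels flagged as tea garden")
```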

  15. Construction of a genetic linkage map in Lilium using a RIL mapping population based on SRAP marker

    Directory of Open Access Journals (Sweden)

    Chen Li-Jing

    2015-01-01

    Full Text Available A genetic linkage map of lily was constructed using a RIL (recombinant inbred line) population of 180 individuals. This mapping population was developed by crossing the Raizan No. 1 (Formolongo) and Gelria (Longiflorum) cultivars through single-seed descent (SSD). SRAPs were generated by using restriction enzymes EcoRI in combination with either MseI. The resulting products were separated by electrophoresis on a 6% denaturing polyacrylamide gel and visualized by silver staining. The segregation of each marker and the linkage analysis were done using the program Mapmaker 3.0. With 50 primer pairs, a total of 189 parental polymorphic bands were detected and 78 were used for mapping. The total map length was 2,135.5 cM, comprising 16 linkage groups. The number of markers per linkage group varied from 1 to 12. The lengths of the linkage groups ranged from 11.2 cM to 425.9 cM, and the mean marker interval distance per group ranged from 9.4 cM to 345.4 cM. The overall mean interval distance between markers was 27.4 cM. The map developed in the present study is the first sequence-related amplified polymorphism marker map of lily constructed with recombinant inbred lines, and it could be used for genetic mapping, molecular marker-assisted breeding and quantitative trait locus mapping of Lilium.

  16. Detection of a weak meddy-like anomaly from high-resolution satellite SST maps

    Directory of Open Access Journals (Sweden)

    Mikhail Emelianov

    2012-09-01

    Full Text Available Despite the considerable impact of meddies on climate through the long-distance transport of properties, a consistent observation of meddy generation and propagation in the ocean is rather elusive. Meddies propagate at about 1000 m below the ocean surface, so satellite sensors are not able to detect them directly and finding them in the open ocean is more fortuitous than intentional. However, a consistent census of meddies and their paths is required in order to gain knowledge about their role in transporting properties such as heat and salt. In this paper we propose a new methodology for processing high-resolution sea surface temperature maps in order to detect meddy-like anomalies in the open ocean on a near-real-time basis. We present an example of detection, involving an atypical meddy-like anomaly that was confirmed as such by in situ measurements.

  17. UPDATING NATIONAL TOPOGRAPHIC DATA BASE USING CHANGE DETECTION METHODS

    Directory of Open Access Journals (Sweden)

    E. Keinan

    2016-06-01

    Full Text Available The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects which are based on community volunteers, such as OSM, update their database every day based on crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing algorithms and computer vision, together with the development of digital aerial cameras with a NIR band and very-high-resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, multispectral (MS) classification, MS segmentation, object analysis and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating the NTDB at the Survey of Israel.

  18. Improved biosensor-based detection system

    DEFF Research Database (Denmark)

    2015-01-01

    Described is a new biosensor-based detection system for effector compounds, useful for in vivo applications in e.g. screening and selecting of cells which produce a small molecule effector compound or which take up a small molecule effector compound from its environment. The detection system...... comprises a protein or RNA-based biosensor for the effector compound which indirectly regulates the expression of a reporter gene via two hybrid proteins, providing for fewer false signals or less 'noise', tuning of sensitivity or other advantages over conventional systems where the biosensor directly...

  19. Real-time flood extent maps based on social media

    Science.gov (United States)

    Eilander, Dirk; van Loenen, Arnejan; Roskam, Ruud; Wagemaker, Jurjen

    2015-04-01

    During a flood event it is often difficult to get accurate information about the flood extent and the people affected. This information is very important for disaster risk reduction management and crisis relief organizations. In the post flood phase, information about the flood extent is needed for damage estimation and calibrating hydrodynamic models. Currently, flood extent maps are derived from a few sources such as satellite images, aerial images and post-flooding flood marks. However, getting accurate real-time or maximum flood extent maps remains difficult. With the rise of social media, we now have a new source of information with large numbers of observations. In the city of Jakarta, Indonesia, the intensity of unique flood related tweets during a flood event peaked at 8 tweets per second during floods in early 2014. A fair number of these tweets also contain observations of water depth and location. Our hypothesis is that based on the large numbers of tweets it is possible to generate real-time flood extent maps. In this study we use tweets from the city of Jakarta, Indonesia, to generate these flood extent maps. The data-mining procedure looks for tweets with a mention of 'banjir', the Bahasa Indonesia word for flood. It then removes modified and retweeted messages in order to keep unique tweets only. Since tweets are not always sent directly from the location of observation, the geotag in the tweets is unreliable. We therefore extract location information using mentions of names of neighborhoods and points of interest. Finally, where encountered, a mention of a length measure is extracted as water depth. These tweets containing a location reference and a water level are considered to be flood observations. The strength of this method is that it can easily be extended to other regions and languages. Based on the intensity of tweets in Jakarta during a flood event we can provide a rough estimate of the flood extent. To provide more accurate flood extent
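
    As a rough illustration of the tweet-filtering step described above, the sketch below keeps unique, non-retweeted messages mentioning 'banjir', matches them against a small gazetteer of neighbourhood names, and extracts a water-depth mention in centimetres. The gazetteer, the regular expression and the example tweets are all invented for illustration, not the authors' implementation.

```python
import re

# Hypothetical gazetteer: neighbourhood name -> (lat, lon)
GAZETTEER = {"kampung melayu": (-6.224, 106.865), "grogol": (-6.157, 106.790)}
DEPTH_RE = re.compile(r"(\d+)\s*cm", re.IGNORECASE)

def flood_observations(tweets):
    """Yield (place, coords, depth_cm) for unique 'banjir' tweets that
    mention a known neighbourhood and a water depth."""
    seen = set()
    for t in tweets:
        text = t.lower()
        if "banjir" not in text or text.startswith("rt ") or text in seen:
            continue                    # drop non-flood, retweeted or duplicate messages
        seen.add(text)
        depth = DEPTH_RE.search(text)
        for place, coords in GAZETTEER.items():
            if place in text and depth:
                yield place, coords, int(depth.group(1))

tweets = [
    "Banjir 60 cm di Kampung Melayu pagi ini",
    "RT Banjir 60 cm di Kampung Melayu pagi ini",
    "Grogol banjir lagi, air sekitar 30cm",
]
print(list(flood_observations(tweets)))
```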

  20. Detection of Second Sound in He-II for Thermal Quench Mapping of Superconducting Radio Frequency Accelerating Cavities

    CERN Document Server

    Stegmaier, Tobias; Kind, Matthias; Furci, Hernán; Koettig, Torsten; Peters, Benedikt

    The development of future particle accelerators requires intensive testing of superconducting radio frequency cavities with different sizes and geometries. Non-contact thermometry quench localisation techniques proved to be beneficial for the localisation of surface defects that can originate a quench (sudden loss of superconducting state). These techniques are based on the detection of second sound in helium II. Transition Edge Sensors (TES) are highly sensitive thin film thermometers with fast time response. In the present work, their capability as a thermal quench mapping device for superconducting radio frequency cavities is proven experimentally by detecting second sound waves emitted by SMD heaters in a He-II bath at saturated vapour pressure. A characterisation of the sensors at steady bath temperatures was conducted to calculate the thermal sensitivity. An intense metallurgical study of gold-tin TES with different compositions revealed important relations between the superconducting behaviour and the ...

  1. A new methodology for strategic planning using technological maps and detection of emerging research fronts applied to radiopharmacy

    International Nuclear Information System (INIS)

    Didio, Robert Joseph

    2011-01-01

    This research aims at the development of a new methodology to support strategic planning, using the elaboration of technological maps (TRM - Technological Roadmaps) combined with the detection of emerging research fronts in databases of scientific publications and patents. The innovation introduced in this research is the customization of the TRM process to radiopharmacy and, specifically, its association with the technique of detecting emerging research fronts, in order to validate results and to establish a new and very useful methodology for the strategic planning of this business area. The business unit DIRF - Diretoria de Radiofarmacia - of IPEN CNEN/SP was used as the basis for the study and implementation of the methodology presented in this work. (author)

  2. Confirmation of the detection of B modes in the Planck polarization maps

    DEFF Research Database (Denmark)

    Nørgaard-Nielsen, H. U.

    2018-01-01

    One of the main problems of extracting the cosmic microwave background (CMB) from submm/mm observations is correcting for the galactic components, mainly synchrotron, free–free, and thermal dust emission, with the required accuracy. Through a series of papers, it has been demonstrated that this task can be fulfilled by means of simple neural networks with high confidence. The main purpose of this paper is to demonstrate that the CMB BB power spectrum detected in the Planck 2015 polarization maps is present in the improved Planck 2017 maps with a higher signal-to-noise ratio. Two features have...

  3. A detailed study of the generation of optically detectable watermarks using the logistic map

    International Nuclear Information System (INIS)

    Mooney, Aidan; Keating, John G.; Heffernan, Daniel M.

    2006-01-01

    A digital watermark is a visible, or preferably invisible, identification code that is permanently embedded in digital media, to prove owner authentication and provide protection for documents. Given the interest in watermark generation using chaotic functions, a detailed study of one chaotic function for this purpose is performed. In this paper, we present an approach for the generation of watermarks using the logistic map. Using this function, in conjunction with seed management, it is possible to generate chaotic sequences that may be used to create highpass or lowpass digital watermarks. We provide a detailed study of the generation of optically detectable watermarks, give some guidelines on successful chaotic watermark generation using the logistic map, and show, using a recently published scheme, how care must be taken in the selection of the function seed
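
    The sketch below only illustrates the basic step the paper builds on: iterating the logistic map x(n+1) = r*x(n)*(1-x(n)) from a seed, discarding a transient, and thresholding the sequence into a binary watermark pattern. The parameter r, the seed, the burn-in and the 0.5 threshold are assumptions, and the paper's seed management and highpass/lowpass construction are not reproduced.

```python
import numpy as np

def logistic_sequence(seed, r=3.99, n=4096, burn_in=1000):
    """Iterate the logistic map x <- r*x*(1-x), discarding a burn-in transient."""
    x = seed
    out = np.empty(n)
    for i in range(burn_in + n):
        x = r * x * (1.0 - x)
        if i >= burn_in:
            out[i - burn_in] = x
    return out

def binary_watermark(seed, shape=(64, 64)):
    """Threshold the chaotic sequence at 0.5 to obtain a 0/1 watermark image."""
    seq = logistic_sequence(seed, n=shape[0] * shape[1])
    return (seq > 0.5).astype(np.uint8).reshape(shape)

wm = binary_watermark(seed=0.3141592653)
print(wm.shape, wm.mean())   # roughly balanced 0/1 pattern for a chaotic seed
```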

  4. Mapping accuracy via spectrally and structurally based filtering techniques: comparisons through visual observations

    Science.gov (United States)

    Chockalingam, Letchumanan

    2005-01-01

    The data of the Gunung Ledang region of Malaysia acquired through LANDSAT are considered to map certain hydrogeological features. To map these significant features, image-processing tools such as contrast enhancement and edge detection techniques are employed. The advantages of these techniques over other methods are evaluated, and their validity in properly isolating features of hydrogeological interest is discussed. As these techniques exploit the spectral aspects of the images, they have several limitations in meeting the objectives. To discuss these limitations, a morphological transformation, which generally considers the structural aspects of the image rather than the spectral aspects, is applied to provide comparisons between the results derived from the spectrally based and the structurally based filtering techniques.

  5. Brain-wide mapping of axonal connections: workflow for automated detection and spatial analysis of labeling in microscopic sections

    Directory of Open Access Journals (Sweden)

    Eszter Agnes ePapp

    2016-04-01

    Full Text Available Axonal tracing techniques are powerful tools for exploring the structural organization of neuronal connections. Tracers such as biotinylated dextran amine (BDA) and Phaseolus vulgaris leucoagglutinin (Pha-L) allow brain-wide mapping of connections through analysis of large series of histological section images. We present a workflow for efficient collection and analysis of tract-tracing datasets with a focus on newly developed modules for image processing and assignment of anatomical location to tracing data. New functionality includes automatic detection of neuronal labeling in large image series, alignment of images to a volumetric brain atlas, and analytical tools for measuring the position and extent of labeling. To evaluate the workflow, we used high-resolution microscopic images from axonal tracing experiments in which different parts of the rat primary somatosensory cortex had been injected with BDA or Pha-L. Parameters from a set of representative images were used to automate detection of labeling in image series covering the entire brain, resulting in binary maps of the distribution of labeling. For high to medium labeling densities, automatic detection was found to provide reliable results when compared to manual analysis, whereas weak labeling required manual curation for optimal detection. To identify brain regions corresponding to labeled areas, section images were aligned to the Waxholm Space (WHS) atlas of the Sprague Dawley rat brain (v2) by custom-angle slicing of the MRI template to match individual sections. Based on the alignment, WHS coordinates were obtained for labeled elements and transformed to stereotaxic coordinates. The new workflow modules increase the efficiency and reliability of labeling detection in large series of images from histological sections, and enable anchoring to anatomical atlases for further spatial analysis and comparison with other data.

  6. Utilizing Yagi antennas in Lightning Mapping Array to detect low-power VHF signals

    Science.gov (United States)

    Tilles, J.; Thomas, R. J.; Edens, H. E.; Krehbiel, P. R.; Rison, W.

    2013-12-01

    The New Mexico Tech VHF Lightning Mapping Array (LMA) being operated at Langmuir Laboratory in central New Mexico is comprised of 22 time-of-arrival stations spanning an area approximately 60 km north-south and 45 km east-west. Nine stations are at high altitude (3.1-3.3 km GPS) over a 3 x 4 km area around the mountain-top Laboratory, and 13 are on the surrounding plains and the Rio Grande valley, at altitudes between 1.4 and 2.2 km. Each station utilizes a vertical half-wave dipole antenna having about 2 dBi gain at horizontal incidence and providing omnidirectional azimuthal coverage. In 2012, four additional stations utilizing higher gain (11 dBi) Yagi antennas were co-located at four of the surrounding sites within 10-15 km of the laboratory, each pointed over the laboratory area. The purpose was to test if directional antennas would improve detection of low-power sources in the laboratory vicinity, such as those associated with positive breakdown or weak precursor events. The test involved comparing the number and quality of radiation sources obtained by processing data from two sets of stations: first for a 17-station network in which all stations were omnidirectional, and then for the same network with Yagi-based measurements substituted in place of the omni measurements at the four co-located stations. For radiation events located in both datasets, the indicated source power values from Yagi stations were typically 5-10 dB greater than their omnidirectional counterpart for sources over or near the laboratory, consistent with the 9 dB difference in on-axis gain values. The difference decreased through zero and to negative values with increasing distance from the laboratory, confirming that it was due to the directionality of the Yagi antennas. It was expected that a network having Yagi antennas at all outlying stations would improve the network's detection of lower power sources in its central region. Rather, preliminary results show that there is no

  7. Constructing a Soil Class Map of Denmark based on the FAO Legend Using Digital Techniques

    DEFF Research Database (Denmark)

    Adhikari, Kabindra; Minasny, Budiman; Greve, Mette Balslev

    2014-01-01

    Soil mapping in Denmark has a long history and a series of soil maps based on conventional mapping approaches have been produced. In this study, a national soil map of Denmark was constructed based on the FAO–Unesco Revised Legend 1990 using digital soil mapping techniques, existing soil profile......) confirmed that the output is reliable and can be used in various soil and environmental studies without major difficulties. This study also verified the importance of GlobalSoilMap products and a priori pedological information that improved prediction performance and quality of the new FAO soil map...

  8. Mapping Base Modifications in DNA by Transverse-Current Sequencing

    Science.gov (United States)

    Alvarez, Jose R.; Skachkov, Dmitry; Massey, Steven E.; Kalitsov, Alan; Velev, Julian P.

    2018-02-01

    Sequencing DNA modifications and lesions, such as methylation of cytosine and oxidation of guanine, is even more important and challenging than sequencing the genome itself. The traditional methods for detecting DNA modifications are either insensitive to these modifications or require additional processing steps to identify a particular type of modification. Transverse-current sequencing in nanopores can potentially identify the canonical bases and base modifications in the same run. In this work, we demonstrate that the most common DNA epigenetic modifications and lesions can be detected with any predefined accuracy based on their tunneling current signature. Our results are based on simulations of the nanopore tunneling current through DNA molecules, calculated using nonequilibrium electron-transport methodology within an effective multiorbital model derived from first-principles calculations, followed by a base-calling algorithm accounting for neighbor current-current correlations. This methodology can be integrated with existing experimental techniques to improve base-calling fidelity.

  9. Feature selection based on SVM significance maps for classification of dementia

    NARCIS (Netherlands)

    E.E. Bron (Esther); M. Smits (Marion); J.C. van Swieten (John); W.J. Niessen (Wiro); S. Klein (Stefan)

    2014-01-01

    Support vector machine significance maps (SVM p-maps) previously showed clusters of significantly different voxels in dementia-related brain regions. We propose a novel feature selection method for classification of dementia based on these p-maps. In our approach, the SVM p-maps are

  10. Mapping site-based construction workers’ motivation: Expectancy theory approach

    Directory of Open Access Journals (Sweden)

    Parviz Ghoddousi

    2014-03-01

    Full Text Available The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in the construction context and confirm the validity of this model for the construction industry. The study drew upon data from 194 site-based construction workers in Iran to test the proposed model of motivation. To this end, the structural equation modelling (SEM) approach based on the confirmatory factor analysis (CFA) technique was deployed. The study reveals that the proposed model of expectancy theory incorporating five indicators (i.e. intrinsic instrumentality, extrinsic instrumentality, intrinsic valence, extrinsic valence and expectancy) is able to map the process of construction workers' motivation. Nonetheless, the findings posit that intrinsic indicators could be more effective than extrinsic ones. This suggests that construction managers need to place further focus on intrinsic motivators to motivate workers.

  11. Mapping site-based construction workers’ motivation: Expectancy theory approach

    Directory of Open Access Journals (Sweden)

    Parviz Ghoddousi

    2014-03-01

    Full Text Available The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in the construction context and confirm the validity of this model for the construction industry. The study drew upon data from 194 site-based construction workers in Iran to test the proposed model of motivation. To this end, the structural equation modelling (SEM) approach based on the confirmatory factor analysis (CFA) technique was deployed. The study reveals that the proposed model of expectancy theory incorporating five indicators (i.e. intrinsic instrumentality, extrinsic instrumentality, intrinsic valence, extrinsic valence and expectancy) is able to map the process of construction workers' motivation. Nonetheless, the findings posit that intrinsic indicators could be more effective than extrinsic ones. This suggests that construction managers need to place further focus on intrinsic motivators to motivate workers.

  12. iMAR: An Interactive Web-Based Application for Mapping Herbicide Resistant Weeds.

    Directory of Open Access Journals (Sweden)

    Silvia Panozzo

    Full Text Available Herbicides are the major weed control tool in most cropping systems worldwide. However, the high reliance on herbicides has led to environmental issues as well as to the evolution of herbicide-resistant biotypes. Resistance is a major concern in modern agriculture and early detection of resistant biotypes is therefore crucial for its management and prevention. In this context, a timely update of the distribution of resistant biotypes is fundamental to devise and implement efficient resistance management strategies. Here we present an innovative web-based application called iMAR (interactive MApping of Resistance) for the mapping of herbicide-resistant biotypes. It is based on open source software tools and translates into maps the data reported in the GIRE (Italian herbicide resistance working group) database of herbicide resistance at the national level. iMAR allows an automatic, easy and cost-effective updating of the maps and provides two different systems, "static" and "dynamic". In the first one, the user's choices are guided by a hierarchical tree menu, whereas the latter is more flexible and includes multiple choice criteria (type of resistance, weed species, region, cropping system) that permit customized maps to be created. The generated information can be useful to various stakeholders who are involved in weed resistance management: farmers, advisors, national and local decision makers as well as the agrochemical industry. iMAR is freely available, and the system has the potential to handle large datasets and to be used for other purposes with geographical implications, such as the mapping of invasive plants or pests.

  13. Real-Time Vision-Based Stiffness Mapping †.

    Science.gov (United States)

    Faragasso, Angela; Bimbo, João; Stilli, Agostino; Wurdemann, Helge Arne; Althoefer, Kaspar; Asama, Hajime

    2018-04-26

    This paper presents new findings concerning a hand-held stiffness probe for the medical diagnosis of abnormalities during palpation of soft-tissue. Palpation is recognized by the medical community as an essential and low-cost method to detect and diagnose disease in soft-tissue. However, differences are often subtle and clinicians need to train for many years before they can conduct a reliable diagnosis. The probe presented here fills this gap providing a means to easily obtain stiffness values of soft tissue during a palpation procedure. Our stiffness sensor is equipped with a multi degree of freedom (DoF) Aurora magnetic tracker, allowing us to track and record the 3D position of the probe whilst examining a tissue area, and generate a 3D stiffness map in real-time. The stiffness probe was integrated in a robotic arm and tested in an artificial environment representing a good model of soft tissue organs; the results show that the sensor can accurately measure and map the stiffness of a silicon phantom embedded with areas of varying stiffness.

  14. Real-Time Vision-Based Stiffness Mapping

    Directory of Open Access Journals (Sweden)

    Angela Faragasso

    2018-04-01

    Full Text Available This paper presents new findings concerning a hand-held stiffness probe for the medical diagnosis of abnormalities during palpation of soft-tissue. Palpation is recognized by the medical community as an essential and low-cost method to detect and diagnose disease in soft-tissue. However, differences are often subtle and clinicians need to train for many years before they can conduct a reliable diagnosis. The probe presented here fills this gap providing a means to easily obtain stiffness values of soft tissue during a palpation procedure. Our stiffness sensor is equipped with a multi degree of freedom (DoF) Aurora magnetic tracker, allowing us to track and record the 3D position of the probe whilst examining a tissue area, and generate a 3D stiffness map in real-time. The stiffness probe was integrated in a robotic arm and tested in an artificial environment representing a good model of soft tissue organs; the results show that the sensor can accurately measure and map the stiffness of a silicon phantom embedded with areas of varying stiffness.

  15. A human motion model based on maps for navigation systems

    Directory of Open Access Journals (Sweden)

    Kaiser Susanna

    2011-01-01

    Full Text Available Foot-mounted indoor positioning systems work remarkably well when the knowledge of floor-plans is additionally used in the localization algorithm. Walls and other structures naturally restrict the motion of pedestrians. No pedestrian can walk through walls or jump from one floor to another when considering a building with different floor-levels. By incorporating known floor-plans in sequential Bayesian estimation processes such as particle filters (PFs), long-term error stability can be achieved as long as the map is sufficiently accurate and the environment sufficiently constrains pedestrians' motion. In this article, a new motion model based on maps and floor-plans is introduced that is capable of weighting the possible headings of the pedestrian as a function of the local environment. The motion model is derived from a diffusion algorithm that makes use of the principle of a source effusing gas and is used in the weighting step of a PF implementation. The diffusion algorithm is capable of including floor-plans as well as maps with areas of different degrees of accessibility. The motion model more effectively represents the probability density function of possible headings that are restricted by maps and floor-plans than a simple binary weighting of particles (i.e., eliminating those that crossed walls and keeping the rest). We will show that the motion model helps to obtain better performance in critical navigation scenarios where two or more modes may be competing for some of the time (multi-modal scenarios).
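
    The article's contribution is a diffusion-based heading prior; the sketch below only illustrates the simpler binary map weighting it is compared against: in the particle-filter weighting step, particles whose proposed step crosses a wall in an occupancy-grid floor plan receive zero weight. The grid, the cell size and the step samples are placeholders.

```python
import numpy as np

# Occupancy grid of a floor plan: 1 = wall, 0 = free space (1 cell = 0.5 m, assumed).
grid = np.zeros((40, 40), dtype=np.uint8)
grid[:, 20] = 1          # a wall splitting the space
grid[18:22, 20] = 0      # with a doorway

def crosses_wall(p0, p1, n_checks=20):
    """Sample points along the segment p0 -> p1 and test them against the grid."""
    for t in np.linspace(0.0, 1.0, n_checks):
        x, y = p0 + t * (p1 - p0)
        if grid[int(round(y)), int(round(x))] == 1:
            return True
    return False

def weight_particles(particles, steps):
    """Binary map weighting: zero weight for particles whose step hits a wall."""
    w = np.array([0.0 if crosses_wall(p, p + s) else 1.0
                  for p, s in zip(particles, steps)])
    return w / w.sum() if w.sum() > 0 else np.full(len(w), 1.0 / len(w))

rng = np.random.default_rng(0)
particles = rng.uniform(5, 15, size=(100, 2))           # all start left of the wall
steps = rng.normal([8.0, 0.0], 1.0, size=(100, 2))      # a stride towards the wall
print("surviving particles:", (weight_particles(particles, steps) > 0).sum())
```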

  16. Family-Based Benchmarking of Copy Number Variation Detection Software.

    Science.gov (United States)

    Nutsua, Marcel Elie; Fischer, Annegret; Nebel, Almut; Hofmann, Sylvia; Schreiber, Stefan; Krawczak, Michael; Nothnagel, Michael

    2015-01-01

    The analysis of structural variants, in particular of copy-number variations (CNVs), has proven valuable in unraveling the genetic basis of human diseases. Hence, a large number of algorithms have been developed for the detection of CNVs in SNP array signal intensity data. Using the European and African HapMap trio data, we undertook a comparative evaluation of six commonly used CNV detection software tools, namely Affymetrix Power Tools (APT), QuantiSNP, PennCNV, GLAD, R-gada and VEGA, and assessed their level of pair-wise prediction concordance. The tool-specific CNV prediction accuracy was assessed in silico by way of intra-familial validation. Software tools differed greatly in terms of the number and length of the CNVs predicted as well as the number of markers included in a CNV. All software tools predicted substantially more deletions than duplications. Intra-familial validation revealed consistently low levels of prediction accuracy as measured by the proportion of validated CNVs (34-60%). Moreover, up to 20% of apparent family-based validations were found to be due to chance alone. Software using Hidden Markov models (HMM) showed a trend to predict fewer CNVs than segmentation-based algorithms albeit with greater validity. PennCNV yielded the highest prediction accuracy (60.9%). Finally, the pairwise concordance of CNV prediction was found to vary widely with the software tools involved. We recommend HMM-based software, in particular PennCNV, rather than segmentation-based algorithms when validity is the primary concern of CNV detection. QuantiSNP may be used as an additional tool to detect sets of CNVs not detectable by the other tools. Our study also reemphasizes the need for laboratory-based validation, such as qPCR, of CNVs predicted in silico.

  17. QTL detection and elite alleles mining for stigma traits in Oryza sativa by association mapping

    Directory of Open Access Journals (Sweden)

    Xiaojing Dang

    2016-08-01

    Full Text Available Stigma traits are very important for hybrid seed production in Oryza sativa, which is a self-pollinated crop; however, the genetic mechanism controlling the traits is poorly understood. In this study, we investigated the phenotypic data of 227 accessions across two years and assessed their genotypic variation with 249 simple sequence repeat (SSR) markers. By combining phenotypic and genotypic data, a genome-wide association (GWA) map was generated. Large phenotypic variations in stigma length (STL), stigma brush-shaped part length (SBPL) and stigma non-brush-shaped part length (SNBPL) were found. Significant positive correlations were identified among stigma traits. In total, 2,072 alleles were detected among 227 accessions, with an average of 8.3 alleles per SSR locus. GWA mapping detected 6 quantitative trait loci (QTLs) for the STL, 2 QTLs for the SBPL and 7 QTLs for the SNBPL. Eleven, 5, and 12 elite alleles were found for the STL, SBPL and SNBPL, respectively. Optimal cross designs were predicted for improving the target traits. The detected genetic variation in stigma traits and QTLs provides helpful information for cloning candidate STL genes and breeding rice cultivars with longer STLs in the future.

  18. An Anomaly Detector Based on Multi-aperture Mapping for Hyperspectral Data

    Directory of Open Access Journals (Sweden)

    LI Min

    2016-10-01

    Full Text Available Because the spectral content of anomalies is correlated with that of the cluttered background, inaccurate selection of background pixels introduces estimation errors into the background model. To address this problem, an anomaly detector based on multi-aperture mapping is proposed in this paper. First, unlike background models that focus on extracting features of the background alone, the multi-aperture mapping of hyperspectral data characterizes the whole data set. From the constructed basis set of the multi-aperture mapping, an anomaly salience index is computed for every test pixel to measure its relative statistical difference. Second, to analyze moderately salient anomalies more precisely, a membership value based on fuzzy logic theory is constructed to grade the anomaly salience of test pixels continuously; using the membership value as a weight, a weighted iterative estimation of the multi-aperture mapping is expected to converge adaptively. Third, classical defuzzification is used to fuse the different detection results. Experiments on hyperspectral data were carried out to test the robustness of the proposed detector and its sensitivity to anomalies of low salience.

  19. Wireless Falling Detection System Based on Community.

    Science.gov (United States)

    Xia, Yun; Wu, Yanqi; Zhang, Bobo; Li, Zhiyang; He, Nongyue; Li, Song

    2015-06-01

    The elderly are more likely to suffer aches or pains from accidental falls, and both the physiology and psychology of patients can be disturbed over the long term, especially when emergency treatment is not given timely and properly. Although many methods and devices have been developed creatively and shown their efficiency in experiments, few of them are suitable for routine commercial application. Here, we design a wearable falling detector as a mobile terminal, and utilize wireless technology to transfer and monitor the activity data of the host in a relatively small community. With the help of the accelerometer sensor and the Google Mapping service, information on the location and the activity data is sent to the remote server for downstream processing. The experimental results show that an SA (sum-vector of all axes) value of 2.5 g is the threshold value to distinguish falls from other activities. A three-stage detection algorithm was adopted to increase the accuracy of the real alarm, and the accuracy rate of our system was more than 95%. With further improvement, this low-cost, accurate and user-friendly fall-detection device could become more and more common in everyday life.
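    The first detection stage reduces to a simple threshold test on the sum-vector. The sketch below, assuming 3-axis accelerometer readings in m/s² and the reported 2.5 g threshold, flags a candidate fall; the names sum_vector_g and is_fall are illustrative, and the later confirmation stages of the three-stage algorithm are not shown.

```python
import numpy as np

G = 9.81  # standard gravity in m/s^2

def sum_vector_g(ax, ay, az):
    """Sum-vector of all axes (SA), expressed in units of g."""
    return np.sqrt(ax**2 + ay**2 + az**2) / G

def is_fall(accel_samples, threshold_g=2.5):
    """Flag a potential fall when SA exceeds the reported 2.5 g threshold.

    accel_samples: iterable of (ax, ay, az) readings in m/s^2.
    A complete system would follow this with further stages (posture and
    inactivity checks, for example) before raising an alarm.
    """
    return any(sum_vector_g(ax, ay, az) > threshold_g for ax, ay, az in accel_samples)
```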

  20. Detection and mapping of organic molecules in Titan's atmosphere using ALMA

    Science.gov (United States)

    Cordiner, Martin

    2016-06-01

    Titan's atmospheric photochemistry results in the production of a wide range of organic molecules, including hydrocarbons, nitriles, aromatics and other complex species of possible pre-biotic relevance. Studies of Titan's atmospheric chemistry thus provide a unique opportunity to explore the origin and evolution of organic matter in primitive (terrestrial) planetary atmospheres. The Atacama Large Millimeter/submillimeter Array (ALMA) is a powerful new facility, well suited to the study of molecular emission from Titan's upper and middle-atmosphere. Results will be presented from our ongoing studies of Titan using ALMA data obtained during the period 2012-2014 [1,2], including detection and mapping of emission from C2H5CN, HNC, HC3N, CH3CN and CH3CCH. In addition, combining data from multiple ALMA Band 6 observations, we obtained high-resolution spectra with unprecedented sensitivity, enabling the first detection of C2H3CN (vinyl cyanide) on Titan, and derived a mean C2H3CN/C2H5CN abundance ratio of 0.3 above 300 km. Vinyl cyanide has recently been investigated as a possible constituent of (pre-biotic) vesicle membranes in Titan's liquid CH4 oceans [3]. Radiative transfer models and possible chemical formation pathways for the detected molecules will be discussed. ALMA observations provide instantaneous snapshot mapping of Titan's entire Earth-facing hemisphere for gases inaccessible to previous studies, and therefore provide new insights into photochemical production and transport, particularly at higher altitudes. Our maps show spatially resolved peaks in Titan's northern and southern hemispheres, consistent with the molecular distributions found in previous studies at infrared wavelengths by Voyager and Cassini, but high-altitude longitudinal asymmetries in our nitrile data indicate that the mesosphere may be more spatially variable than previously thought.

  1. A consensus linkage map of lentil based on DArT markers from three RIL mapping populations.

    Directory of Open Access Journals (Sweden)

    Duygu Ates

    Full Text Available Lentil (Lens culinaris ssp. culinaris Medikus) is a diploid (2n = 2x = 14), self-pollinating grain legume with a haploid genome size of about 4 Gbp and is grown throughout the world with current annual production of 4.9 million tonnes. A consensus map of lentil (Lens culinaris ssp. culinaris Medikus) was constructed using three different lentil recombinant inbred line (RIL) populations, including "CDC Redberry" x "ILL7502" (LR8), "ILL8006" x "CDC Milestone" (LR11) and "PI320937" x "Eston" (LR39). The lentil consensus map was composed of 9,793 DArT markers, covered a total of 977.47 cM with an average distance of 0.10 cM between adjacent markers and constructed 7 linkage groups representing 7 chromosomes of the lentil genome. The consensus map had no gap larger than 12.67 cM and only 5 gaps were found to be between 12.67 cM and 6.0 cM (on LG3 and LG4). The localization of the SNP markers on the lentil consensus map were in general consistent with their localization on the three individual genetic linkage maps and the lentil consensus map has longer map length, higher marker density and shorter average distance between the adjacent markers compared to the component linkage maps. This high-density consensus map could provide insight into the lentil genome. The consensus map could also help to construct a physical map using a Bacterial Artificial Chromosome library and map based cloning studies. Sequence information of DArT may help localization of orientation scaffolds from Next Generation Sequencing data.

  2. A consensus linkage map of lentil based on DArT markers from three RIL mapping populations.

    Science.gov (United States)

    Ates, Duygu; Aldemir, Secil; Alsaleh, Ahmad; Erdogmus, Semih; Nemli, Seda; Kahriman, Abdullah; Ozkan, Hakan; Vandenberg, Albert; Tanyolac, Bahattin

    2018-01-01

    Lentil (Lens culinaris ssp. culinaris Medikus) is a diploid (2n = 2x = 14), self-pollinating grain legume with a haploid genome size of about 4 Gbp and is grown throughout the world with current annual production of 4.9 million tonnes. A consensus map of lentil (Lens culinaris ssp. culinaris Medikus) was constructed using three different lentils recombinant inbred line (RIL) populations, including "CDC Redberry" x "ILL7502" (LR8), "ILL8006" x "CDC Milestone" (LR11) and "PI320937" x "Eston" (LR39). The lentil consensus map was composed of 9,793 DArT markers, covered a total of 977.47 cM with an average distance of 0.10 cM between adjacent markers and constructed 7 linkage groups representing 7 chromosomes of the lentil genome. The consensus map had no gap larger than 12.67 cM and only 5 gaps were found to be between 12.67 cM and 6.0 cM (on LG3 and LG4). The localization of the SNP markers on the lentil consensus map were in general consistent with their localization on the three individual genetic linkage maps and the lentil consensus map has longer map length, higher marker density and shorter average distance between the adjacent markers compared to the component linkage maps. This high-density consensus map could provide insight into the lentil genome. The consensus map could also help to construct a physical map using a Bacterial Artificial Chromosome library and map based cloning studies. Sequence information of DArT may help localization of orientation scaffolds from Next Generation Sequencing data.

  3. A regularized, model-based approach to phase-based conductivity mapping using MRI.

    Science.gov (United States)

    Ropella, Kathleen M; Noll, Douglas C

    2017-11-01

    To develop a novel regularized, model-based approach to phase-based conductivity mapping that uses structural information to improve the accuracy of conductivity maps. The inverse of the three-dimensional Laplacian operator is used to model the relationship between measured phase maps and the object conductivity in a penalized weighted least-squares optimization problem. Spatial masks based on structural information are incorporated into the problem to preserve data near boundaries. The proposed Inverse Laplacian method was compared against a restricted Gaussian filter in simulation, phantom, and human experiments. The Inverse Laplacian method resulted in lower reconstruction bias and error due to noise in simulations than the Gaussian filter. The Inverse Laplacian method also produced conductivity maps closer to the measured values in a phantom and with reduced noise in the human brain, as compared to the Gaussian filter. The Inverse Laplacian method calculates conductivity maps with less noise and more accurate values near boundaries. Improving the accuracy of conductivity maps is integral for advancing the applications of conductivity mapping. Magn Reson Med 78:2011-2021, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
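    The abstract does not give the exact cost function, but a generic penalized weighted least-squares problem of the kind described can be written as follows; the notation here is hypothetical and only meant to make the structure explicit:

```latex
\hat{\sigma} \;=\; \arg\min_{\sigma}\;
\left\| W \left( A\,\sigma - \phi \right) \right\|_2^2
\;+\; \lambda \left\| M\, C\, \sigma \right\|_2^2
```

    Here A denotes an inverse-Laplacian forward operator relating the conductivity sigma to the measured phase phi, W is a data-weighting matrix, C is a roughness (regularization) operator, M is a spatial mask built from structural information so that smoothing is not applied across tissue boundaries, and lambda balances the two terms. The operators actually used in the paper may differ from this sketch.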

  4. Provisional maps of thermal areas in Yellowstone National Park, based on satellite thermal infrared imaging and field observations

    Science.gov (United States)

    Vaughan, R. Greg; Heasler, Henry; Jaworowski, Cheryl; Lowenstern, Jacob B.; Keszthelyi, Laszlo P.

    2014-01-01

    Maps that define the current distribution of geothermally heated ground are useful toward setting a baseline for thermal activity to better detect and understand future anomalous hydrothermal and (or) volcanic activity. Monitoring changes in the dynamic thermal areas also supports decisions regarding the development of Yellowstone National Park infrastructure, preservation and protection of park resources, and ensuring visitor safety. Because of the challenges associated with field-based monitoring of a large, complex geothermal system that is spread out over a large and remote area, satellite-based thermal infrared images from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) were used to map the location and spatial extent of active thermal areas, to generate thermal anomaly maps, and to quantify the radiative component of the total geothermal heat flux. ASTER thermal infrared data acquired during winter nights were used to minimize the contribution of solar heating of the surface. The ASTER thermal infrared mapping results were compared to maps of thermal areas based on field investigations and high-resolution aerial photos. Field validation of the ASTER thermal mapping is an ongoing task. The purpose of this report is to make available ASTER-based maps of Yellowstone’s thermal areas. We include an appendix containing the names and characteristics of Yellowstone’s thermal areas, georeferenced TIFF files containing ASTER thermal imagery, and several spatial data sets in Esri shapefile format.

  5. Water Detection Based on Object Reflections

    Science.gov (United States)

    Rankin, Arturo L.; Matthies, Larry H.

    2012-01-01

    Water bodies are challenging terrain hazards for terrestrial unmanned ground vehicles (UGVs) for several reasons. Traversing through deep water bodies could cause costly damage to the electronics of UGVs. Additionally, a UGV that is either broken down due to water damage or becomes stuck in a water body during an autonomous operation will require rescue, potentially drawing critical resources away from the primary operation and increasing the operation cost. Thus, robust water detection is a critical perception requirement for UGV autonomous navigation. One of the properties useful for detecting still water bodies is that their surface acts as a horizontal mirror at high incidence angles. Still water bodies in wide-open areas can be detected by geometrically locating the exact pixels in the sky that are reflecting on candidate water pixels on the ground, predicting if ground pixels are water based on color similarity to the sky and local terrain features. But in cluttered areas where reflections of objects in the background dominate the appearance of the surface of still water bodies, detection based on sky reflections is of marginal value. Specifically, this software attempts to solve the problem of detecting still water bodies on cross-country terrain in cluttered areas at low cost.

  6. Detection of fire protection and mineral glasses in industrial recycling using Raman mapping spectroscopy

    Science.gov (United States)

    De Biasio, Martin; Arnold, Thomas; McGunnigle, Gerald; Kraft, Martin; Leitner, Raimund; Balthasar, Dirk; Rehrmann, Volker

    2011-06-01

    Recycling of glass requires the removal of specialist glasses, such as fireproof and mineral glasses, and glass ceramics, which are regarded as contaminants. The sorting must take place before melting for efficient glass recycling. Here, we demonstrate the feasibility of a real-time Raman mapping system for detecting and discriminating a range of industrially relevant glass contaminants in recovered glass streams. The components used are suitable for industrial conditions and the chemometric model is robust against imaging geometry and excitation intensity. The proposed approach is a novel alternative to established glass sorting sensors.

  7. Finger Vein Recognition Based on a Personalized Best Bit Map

    Science.gov (United States)

    Yang, Gongping; Xi, Xiaoming; Yin, Yilong

    2012-01-01

    Finger vein patterns have recently been recognized as an effective biometric identifier. In this paper, we propose a finger vein recognition method based on a personalized best bit map (PBBM). Our method is rooted in a local binary pattern based method and then inclined to use the best bits only for matching. We first present the concept of PBBM and the generating algorithm. Then we propose the finger vein recognition framework, which consists of preprocessing, feature extraction, and matching. Finally, we design extensive experiments to evaluate the effectiveness of our proposal. Experimental results show that PBBM achieves not only better performance, but also high robustness and reliability. In addition, PBBM can be used as a general framework for binary pattern based recognition. PMID:22438735

  8. Intelligent Machine Vision for Automated Fence Intruder Detection Using Self-organizing Map

    Directory of Open Access Journals (Sweden)

    Veldin A. Talorete Jr.

    2017-03-01

    Full Text Available This paper presents an intelligent machine vision system for automated fence intruder detection. A series of still images containing fence events, captured with Internet Protocol cameras, was used as input data to the system. Two classifiers were used: the first classifies human posture and the second classifies intruder location. The classifiers were implemented using Self-Organizing Maps after several image segmentation processes. The human posture classifier classifies the detected subject's posture patterns from the subject's silhouette, while the intruder localization classifier estimates the location of the intruder with respect to the fence, using geometric features extracted from the images as inputs. The system is capable of activating the alarm, displaying the actual image, and depicting the location of the intruder when an intruder is detected. In detecting intruder posture, the system achieved a success rate of 88%. The overall accuracy for day-time intruder localization is 83%, and 88% for night-time intruder localization.

  9. Gene-based single nucleotide polymorphism markers for genetic and association mapping in common bean.

    Science.gov (United States)

    Galeano, Carlos H; Cortés, Andrés J; Fernández, Andrea C; Soler, Álvaro; Franco-Herrera, Natalia; Makunde, Godwill; Vanderleyden, Jos; Blair, Matthew W

    2012-06-26

    In common bean, expressed sequence tags (ESTs) are an underestimated source of gene-based markers such as insertion-deletions (Indels) or single-nucleotide polymorphisms (SNPs). However, because these sequences are conserved, marker detection is difficult and the resulting markers show low levels of polymorphism. Therefore, development of intron-spanning EST-SNP markers can be a valuable resource for genetic experiments such as genetic mapping and association studies. In this study, a total of 313 new gene-based markers were developed at target genes. Intronic variation was deeply explored in order to capture more polymorphism. Introns were putatively identified after comparing the common bean ESTs with the soybean genome, and the primers were designed over intron-flanking regions. The intronic regions were evaluated for parental polymorphisms using the single strand conformational polymorphism (SSCP) technique and Sequenom MassARRAY system. A total of 53 new marker loci were placed on an integrated molecular map in the DOR364 × G19833 recombinant inbred line (RIL) population. The new linkage map was used to build a consensus map, merging the linkage maps of the BAT93 × JALO EEP558 and DOR364 × BAT477 populations. A total of 1,060 markers were mapped, with a total map length of 2,041 cM across 11 linkage groups. As a second application of the generated resource, a diversity panel with 93 genotypes was evaluated with 173 SNP markers using the MassARRAY-platform and KASPar technology. These results were coupled with previous SSR evaluations and drought tolerance assays carried out on the same individuals. This agglomerative dataset was examined, in order to discover marker-trait associations, using general linear model (GLM) and mixed linear model (MLM). Some significant associations with yield components were identified, and were consistent with previous findings. In short, this study illustrates the power of intron-based markers for linkage and association mapping in

  10. Road network selection for small-scale maps using an improved centrality-based algorithm

    Directory of Open Access Journals (Sweden)

    Roy Weiss

    2014-12-01

    Full Text Available The road network is one of the key feature classes in topographic maps and databases. In the task of deriving road networks for products at smaller scales, road network selection forms a prerequisite for all other generalization operators, and is thus a fundamental operation in the overall process of topographic map and database production. The objective of this work was to develop an algorithm for automated road network selection from a large-scale (1:10,000) to a small-scale (1:200,000) database. The project was pursued in collaboration with swisstopo, the national mapping agency of Switzerland, with generic mapping requirements in mind. Preliminary experiments suggested that a selection algorithm based on betweenness centrality performed best for this purpose, yet also exposed problems. The main contribution of this paper thus consists of four extensions that address deficiencies of the basic centrality-based algorithm and lead to a significant improvement of the results. The first two extensions improve the formation of strokes concatenating the road segments, which is crucial since strokes provide the foundation upon which the network centrality measure is computed. The first extension ensures that roundabouts are detected and collapsed, avoiding interruptions of strokes by roundabouts, while the second introduces additional semantics into the process of stroke formation, allowing longer and more plausible strokes to be built. The third extension detects areas of high road density (i.e., urban areas) using density-based clustering and then locally increases the threshold of the centrality measure used to select road segments, such that more thinning takes place in those areas. Finally, since the basic algorithm tends to create dead-ends, which are not tolerated in small-scale maps, the fourth extension reconnects these dead-ends to the main network, searching for the best path in the main heading of the dead-end.
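    As a minimal, hedged illustration of the centrality measure that drives the selection, the following Python sketch ranks road segments by betweenness centrality using networkx. The toy graph and keep_fraction value are fictitious, and the paper actually computes centrality on strokes (concatenated segments) with locally varying thresholds and dead-end reconnection, none of which is reproduced here.

```python
import networkx as nx

# Toy road graph: nodes are junctions, edges are road segments (ids are hypothetical).
g = nx.Graph()
g.add_edges_from([(1, 2), (2, 3), (3, 4), (2, 5), (5, 6), (4, 6), (6, 7)])

# Betweenness centrality of each segment: how often it lies on shortest paths.
centrality = nx.edge_betweenness_centrality(g)

# Keep the most central segments; in the paper the threshold is raised locally
# in dense (urban) clusters, and dead-ends are reconnected afterwards.
keep_fraction = 0.5
ranked = sorted(centrality, key=centrality.get, reverse=True)
selected_segments = ranked[: int(keep_fraction * len(ranked))]
print(selected_segments)
```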

  11. Robust Vehicle Detection in Aerial Images Based on Cascaded Convolutional Neural Networks.

    Science.gov (United States)

    Zhong, Jiandan; Lei, Tao; Yao, Guangle

    2017-11-24

    Vehicle detection in aerial images is an important and challenging task. Traditionally, many target detection models based on sliding-window fashion were developed and achieved acceptable performance, but these models are time-consuming in the detection phase. Recently, with the great success of convolutional neural networks (CNNs) in computer vision, many state-of-the-art detectors have been designed based on deep CNNs. However, these CNN-based detectors are inefficient when applied in aerial image data due to the fact that the existing CNN-based models struggle with small-size object detection and precise localization. To improve the detection accuracy without decreasing speed, we propose a CNN-based detection model combining two independent convolutional neural networks, where the first network is applied to generate a set of vehicle-like regions from multi-feature maps of different hierarchies and scales. Because the multi-feature maps combine the advantage of the deep and shallow convolutional layer, the first network performs well on locating the small targets in aerial image data. Then, the generated candidate regions are fed into the second network for feature extraction and decision making. Comprehensive experiments are conducted on the Vehicle Detection in Aerial Imagery (VEDAI) dataset and Munich vehicle dataset. The proposed cascaded detection model yields high performance, not only in detection accuracy but also in detection speed.

  12. Color image encryption based on Coupled Nonlinear Chaotic Map

    International Nuclear Information System (INIS)

    Mazloom, Sahar; Eftekhari-Moghadam, Amir Masud

    2009-01-01

    Image encryption is somewhat different from text encryption due to some inherent features of images, such as bulk data capacity and high correlation among pixels, which are generally difficult to handle by conventional methods. The desirable cryptographic properties of chaotic maps, such as sensitivity to initial conditions and random-like behavior, have attracted the attention of cryptographers to develop new encryption algorithms. Therefore, recent research on image encryption algorithms has been increasingly based on chaotic systems, though the drawbacks of small key space and weak security in one-dimensional chaotic cryptosystems are obvious. This paper proposes a Coupled Nonlinear Chaotic Map, called CNCM, and a novel chaos-based image encryption algorithm to encrypt color images by using CNCM. The chaotic cryptography technique used in this paper is a symmetric key cryptography with a stream cipher structure. In order to increase the security of the proposed algorithm, a 240-bit secret key is used to generate the initial conditions and parameters of the chaotic map by making some algebraic transformations to the key. These transformations, as well as the nonlinearity and coupling structure of the CNCM, enhance the cryptosystem security. For higher security and higher complexity, the current paper incorporates the image size and color components into the cryptosystem, thereby significantly increasing the resistance to known/chosen-plaintext attacks. The results of several experiments, statistical analyses and key sensitivity tests show that the proposed image encryption scheme provides an efficient and secure way for real-time image encryption and transmission.
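    For readers unfamiliar with chaos-based stream ciphers, the sketch below shows the general idea using the plain one-dimensional logistic map, i.e. exactly the kind of simple system whose small key space the proposed coupled nonlinear map is designed to overcome. It is not the CNCM algorithm; the parameter values and function names are illustrative.

```python
import numpy as np

def logistic_keystream(x0, r, n, burn_in=200):
    """Generate n keystream bytes from a logistic-map orbit (illustration only).

    x0 in (0, 1) and r near 4.0 act as the secret key. This simple 1-D map has
    the small-key-space weaknesses the paper's coupled nonlinear map addresses.
    """
    x = x0
    out = []
    for i in range(burn_in + n):
        x = r * x * (1.0 - x)
        if i >= burn_in:
            out.append(int(x * 256) & 0xFF)      # quantize the orbit to a byte
    return np.array(out, dtype=np.uint8)

def xor_encrypt(pixels, x0=0.3702, r=3.9998):
    """Stream-cipher-style encryption: XOR flattened pixel bytes with the keystream.
    Applying the same call to the ciphertext with the same key decrypts it."""
    flat = np.asarray(pixels, dtype=np.uint8).ravel()
    keystream = logistic_keystream(x0, r, flat.size)
    return flat ^ keystream
```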

  13. High resolution mapping of development in the wildland-urban interface using object based image extraction

    Science.gov (United States)

    Caggiano, Michael D.; Tinkham, Wade T.; Hoffman, Chad; Cheng, Antony S.; Hawbaker, Todd J.

    2016-01-01

    The wildland-urban interface (WUI), the area where human development encroaches on undeveloped land, is expanding throughout the western United States resulting in increased wildfire risk to homes and communities. Although census based mapping efforts have provided insights into the pattern of development and expansion of the WUI at regional and national scales, these approaches do not provide sufficient detail for fine-scale fire and emergency management planning, which requires maps of individual building locations. Although fine-scale maps of the WUI have been developed, they are often limited in their spatial extent, have unknown accuracies and biases, and are costly to update over time. In this paper we assess a semi-automated Object Based Image Analysis (OBIA) approach that utilizes 4-band multispectral National Aerial Image Program (NAIP) imagery for the detection of individual buildings within the WUI. We evaluate this approach by comparing the accuracy and overall quality of extracted buildings to a building footprint control dataset. In addition, we assessed the effects of buffer distance, topographic conditions, and building characteristics on the accuracy and quality of building extraction. The overall accuracy and quality of our approach was positively related to buffer distance, with accuracies ranging from 50 to 95% for buffer distances from 0 to 100 m. Our results also indicate that building detection was sensitive to building size, with smaller outbuildings (footprints less than 75 m2) having detection rates below 80% and larger residential buildings having detection rates above 90%. These findings demonstrate that this approach can successfully identify buildings in the WUI in diverse landscapes while achieving high accuracies at buffer distances appropriate for most fire management applications while overcoming cost and time constraints associated with traditional approaches. This study is unique in that it evaluates the ability of an OBIA

  14. High resolution mapping of development in the wildland-urban interface using object based image extraction

    Directory of Open Access Journals (Sweden)

    Michael D. Caggiano

    2016-10-01

    Full Text Available The wildland-urban interface (WUI), the area where human development encroaches on undeveloped land, is expanding throughout the western United States resulting in increased wildfire risk to homes and communities. Although census based mapping efforts have provided insights into the pattern of development and expansion of the WUI at regional and national scales, these approaches do not provide sufficient detail for fine-scale fire and emergency management planning, which requires maps of individual building locations. Although fine-scale maps of the WUI have been developed, they are often limited in their spatial extent, have unknown accuracies and biases, and are costly to update over time. In this paper we assess a semi-automated Object Based Image Analysis (OBIA) approach that utilizes 4-band multispectral National Aerial Image Program (NAIP) imagery for the detection of individual buildings within the WUI. We evaluate this approach by comparing the accuracy and overall quality of extracted buildings to a building footprint control dataset. In addition, we assessed the effects of buffer distance, topographic conditions, and building characteristics on the accuracy and quality of building extraction. The overall accuracy and quality of our approach was positively related to buffer distance, with accuracies ranging from 50 to 95% for buffer distances from 0 to 100 m. Our results also indicate that building detection was sensitive to building size, with smaller outbuildings (footprints less than 75 m2) having detection rates below 80% and larger residential buildings having detection rates above 90%. These findings demonstrate that this approach can successfully identify buildings in the WUI in diverse landscapes while achieving high accuracies at buffer distances appropriate for most fire management applications while overcoming cost and time constraints associated with traditional approaches. This study is unique in that it evaluates the ability

  15. Object-based analysis of multispectral airborne laser scanner data for land cover classification and map updating

    Science.gov (United States)

    Matikainen, Leena; Karila, Kirsi; Hyyppä, Juha; Litkey, Paula; Puttonen, Eetu; Ahokas, Eero

    2017-06-01

    During the last 20 years, airborne laser scanning (ALS), often combined with passive multispectral information from aerial images, has shown its high feasibility for automated mapping processes. The main benefits have been achieved in the mapping of elevated objects such as buildings and trees. Recently, the first multispectral airborne laser scanners have been launched, and active multispectral information is for the first time available for 3D ALS point clouds from a single sensor. This article discusses the potential of this new technology in map updating, especially in automated object-based land cover classification and change detection in a suburban area. For our study, Optech Titan multispectral ALS data over a suburban area in Finland were acquired. Results from an object-based random forests analysis suggest that the multispectral ALS data are very useful for land cover classification, considering both elevated classes and ground-level classes. The overall accuracy of the land cover classification results with six classes was 96% compared with validation points. The classes under study included building, tree, asphalt, gravel, rocky area and low vegetation. Compared to classification of single-channel data, the main improvements were achieved for ground-level classes. According to feature importance analyses, multispectral intensity features based on several channels were more useful than those based on one channel. Automatic change detection for buildings and roads was also demonstrated by utilising the new multispectral ALS data in combination with old map vectors. In change detection of buildings, an old digital surface model (DSM) based on single-channel ALS data was also used. Overall, our analyses suggest that the new data have high potential for further increasing the automation level in mapping. Unlike passive aerial imaging commonly used in mapping, the multispectral ALS technology is independent of external illumination conditions, and there are

  16. Land use mapping and change detection using ERTS imagery in Montgomery County, Alabama

    Science.gov (United States)

    Wilms, R. P.

    1973-01-01

    The feasibility of using remotely sensed data from ERTS-1 for mapping land use and detecting land use change was investigated. Land use information was gathered from 1964 air photo mosaics and from 1972 ERTS data. The 1964 data provided the basis for comparison with ERTS-1 imagery. From this comparison, urban sprawl was quite evident for the city of Montgomery. A significant trend from forestland to agricultural was also discovered. The development of main traffic arteries between 1964 and 1972 was a vital factor in the development of some of the urban centers. Even though certain problems in interpreting and correlating land use data from ERTS imagery were encountered, it has been demonstrated that remotely sensed data from ERTS is useful for inventorying land use and detecting land use change.

  17. Detecting chaos in particle accelerators through the frequency map analysis method.

    Science.gov (United States)

    Papaphilippou, Yannis

    2014-06-01

    The motion of beams in particle accelerators is dominated by a plethora of non-linear effects, which can enhance chaotic motion and limit their performance. The application of advanced non-linear dynamics methods for detecting and correcting these effects, and thereby increasing the region of beam stability, plays an essential role not only during the accelerator design phase but also during operation. After describing the nature of non-linear effects and their impact on performance parameters of different particle accelerator categories, the theory of non-linear particle motion is outlined. Recent developments in the methods employed for the analysis of chaotic beam motion are detailed. In particular, the ability of the frequency map analysis method to detect chaotic motion and guide the correction of non-linear effects is demonstrated in particle tracking simulations as well as in experimental data.
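    The core quantity behind frequency map analysis is a tune-diffusion indicator: the betatron frequency is estimated over two successive stretches of turn-by-turn data, and a large change between the two estimates flags chaotic motion. The Python sketch below illustrates this with a crude FFT-based tune estimate on synthetic data; a real analysis would use a NAFF-type refined frequency determination, which is not shown here.

```python
import numpy as np

def tune(x):
    """Crude estimate of the fundamental tune from turn-by-turn position data:
    location of the largest FFT peak (frequency map analysis normally uses a
    NAFF-type refinement for higher precision)."""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    k = np.argmax(spectrum[1:]) + 1          # skip the DC bin
    return k / len(x)

def diffusion_index(x_turns):
    """Tune change between the first and second half of the tracking data.
    Regular motion gives a very small value; a large value flags chaos."""
    half = len(x_turns) // 2
    nu1, nu2 = tune(x_turns[:half]), tune(x_turns[half:])
    return np.log10(abs(nu2 - nu1) + 1e-16)

# Example with synthetic turn-by-turn data oscillating at a fixed tune of 0.31
turns = np.arange(4096)
x = np.cos(2 * np.pi * 0.31 * turns)
print(diffusion_index(x))   # strongly negative: the motion looks regular
```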

  18. Nanomaterials based biosensors for cancer biomarker detection

    International Nuclear Information System (INIS)

    Malhotra, Bansi D; Kumar, Saurabh; Pandey, Chandra Mouli

    2016-01-01

    Biosensors have enormous potential to contribute to the evolution of new molecular diagnostic techniques for patients suffering with cancerous diseases. A major obstacle preventing faster development of biosensors pertains to the fact that cancer is a highly complex set of diseases. The oncologists currently rely on a few biomarkers and histological characterization of tumors. Some of the signatures include epigenetic and genetic markers, protein profiles, changes in gene expression, and post-translational modifications of proteins. These molecular signatures offer new opportunities for development of biosensors for cancer detection. In this context, conducting paper has recently been found to play an important role towards the fabrication of a biosensor for cancer biomarker detection. In this paper we will focus on results of some of the recent studies obtained in our laboratories relating to fabrication and application of nanomaterial modified paper based biosensors for cancer biomarker detection. (paper)

  19. An extended anchored linkage map and virtual mapping for the american mink genome based on homology to human and dog

    DEFF Research Database (Denmark)

    Anistoroaei, Razvan Marian; Ansari, S.; Farid, A.

    2009-01-01

    hybridization (FISH) and/or by means of human/dog/mink comparative homology. The average interval between markers is 8.5 cM and the linkage groups collectively span 1340 cM. In addition, 217 and 275 mink microsatellites have been placed on human and dog genomes, respectively. In conjunction with the existing...... comparative human/dog/mink data, these assignments represent useful virtual maps for the American mink genome. Comparison of the current human/dog assembled sequential map with the existing Zoo-FISH-based human/dog/mink maps helped to refine the human/dog/mink comparative map. Furthermore, comparison...... of the human and dog genome assemblies revealed a number of large synteny blocks, some of which are corroborated by data from the mink linkage map....

  20. Regulation of microtubule-based transport by MAP4

    Science.gov (United States)

    Semenova, Irina; Ikeda, Kazuho; Resaul, Karim; Kraikivski, Pavel; Aguiar, Mike; Gygi, Steven; Zaliapin, Ilya; Cowan, Ann; Rodionov, Vladimir

    2014-01-01

    Microtubule (MT)-based transport of organelles driven by the opposing MT motors kinesins and dynein is tightly regulated in cells, but the underlying molecular mechanisms remain largely unknown. Here we tested the regulation of MT transport by the ubiquitous protein MAP4 using Xenopus melanophores as an experimental system. In these cells, pigment granules (melanosomes) move along MTs to the cell center (aggregation) or to the periphery (dispersion) by means of cytoplasmic dynein and kinesin-2, respectively. We found that aggregation signals induced phosphorylation of threonine residues in the MT-binding domain of the Xenopus MAP4 (XMAP4), thus decreasing binding of this protein to MTs. Overexpression of XMAP4 inhibited pigment aggregation by shortening dynein-dependent MT runs of melanosomes, whereas removal of XMAP4 from MTs reduced the length of kinesin-2–dependent runs and suppressed pigment dispersion. We hypothesize that binding of XMAP4 to MTs negatively regulates dynein-dependent movement of melanosomes and positively regulates kinesin-2–based movement. Phosphorylation during pigment aggregation reduces binding of XMAP4 to MTs, thus increasing dynein-dependent and decreasing kinesin-2–dependent motility of melanosomes, which stimulates their accumulation in the cell center, whereas dephosphorylation of XMAP4 during dispersion has an opposite effect. PMID:25143402

  1. Water Detection Based on Color Variation

    Science.gov (United States)

    Rankin, Arturo L.

    2012-01-01

    This software has been designed to detect water bodies that are out in the open on cross-country terrain at close range (out to 30 meters), using imagery acquired from a stereo pair of color cameras mounted on a terrestrial, unmanned ground vehicle (UGV). This detector exploits the fact that the color variation across water bodies is generally larger and more uniform than that of other naturally occurring types of terrain, such as soil and vegetation. Non-traversable water bodies, such as large puddles, ponds, and lakes, are detected based on color variation, image intensity variance, image intensity gradient, size, and shape. At ranges beyond 20 meters, water bodies out in the open can be indirectly detected by detecting reflections of the sky below the horizon in color imagery. But at closer range, the color coming out of a water body dominates sky reflections, and the water cue from sky reflections is of marginal use. Since there may be times during UGV autonomous navigation when a water body does not come into a perception system's field of view until it is at close range, the ability to detect water bodies at close range is critical. Factors that influence the perceived color of a water body at close range are the amount and type of sediment in the water, the water's depth, and the angle of incidence to the water body. Developing a single model of the mixture ratio of light reflected off the water surface (to the camera) to light coming out of the water body (to the camera) for all water bodies would be fairly difficult. Instead, this software detects close water bodies based on local terrain features and the natural, uniform change in color that occurs across the surface from the leading edge to the trailing edge.
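    As a rough, hedged sketch of the color-variation cue described above, the Python fragment below measures local color spread in small patches of an RGB image; pixels with large, smoothly varying spread are candidate water. The patch size and the particular notion of "spread" are illustrative assumptions, and the detector's additional intensity-variance, gradient, size and shape tests are omitted.

```python
import numpy as np

def color_variation_cue(rgb, patch=15):
    """Per-pixel water cue from local color variation (simplified sketch).

    rgb : float image of shape (H, W, 3), values in [0, 1].
    Returns the average per-channel standard deviation within each patch;
    still water tends to show a larger, smoother variation from its leading
    edge to its trailing edge than soil or vegetation.
    """
    h, w, _ = rgb.shape
    cue = np.zeros((h, w))
    r = patch // 2
    for i in range(r, h - r):
        for j in range(r, w - r):
            window = rgb[i - r:i + r + 1, j - r:j + r + 1].reshape(-1, 3)
            cue[i, j] = window.std(axis=0).mean()   # average per-channel spread
    return cue
```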

  2. Skeleton-Based Abnormal Gait Detection

    Directory of Open Access Journals (Sweden)

    Trong-Nguyen Nguyen

    2016-10-01

    Full Text Available Human gait analysis plays an important role in musculoskeletal disorder diagnosis. Detecting anomalies in human walking, such as shuffling gait, stiff leg or unsteady gait, can be difficult if the prior knowledge of such a gait pattern is not available. We propose an approach for detecting abnormal human gait based on a normal gait model. Instead of employing the color image, silhouette, or spatio-temporal volume, our model is created based on human joint positions (skeleton) in time series. We decompose each sequence of normal gait images into gait cycles. Each human instant posture is represented by a feature vector which describes relationships between pairs of bone joints located in the lower body. Such vectors are then converted into codewords using a clustering technique. The normal human gait model is created based on multiple sequences of codewords corresponding to different gait cycles. In the detection stage, a gait cycle with normality likelihood below a threshold, which is determined automatically in the training step, is assumed as an anomaly. The experimental results on both marker-based mocap data and Kinect skeleton show that our method is very promising in distinguishing normal and abnormal gaits with an overall accuracy of 90.12%.
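    The pipeline described above (joint-pair features per posture, clustering into codewords, likelihood thresholding per gait cycle) can be sketched as follows. The joint-pair list, the number of codewords, and the use of scikit-learn's KMeans are illustrative assumptions, not the exact choices made in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical lower-body joint index pairs in a Kinect-style skeleton
LOWER_BODY_PAIRS = [(0, 1), (1, 2), (2, 3), (0, 4), (4, 5), (5, 6)]

def posture_features(joints):
    """Feature vector for one posture: distances between pairs of lower-body joints.

    joints: (J, 3) array of 3-D joint positions for a single frame.
    """
    return np.array([np.linalg.norm(joints[a] - joints[b]) for a, b in LOWER_BODY_PAIRS])

def build_codebook(training_postures, n_codewords=16):
    """Cluster posture feature vectors into codewords for the normal-gait model."""
    feats = np.vstack([posture_features(j) for j in training_postures])
    return KMeans(n_clusters=n_codewords, n_init=10, random_state=0).fit(feats)

def encode_cycle(codebook, cycle_postures):
    """Convert one gait cycle into a codeword sequence; a cycle whose likelihood
    under the normal model falls below a learned threshold is flagged as abnormal."""
    feats = np.vstack([posture_features(j) for j in cycle_postures])
    return codebook.predict(feats)
```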

  3. Ionizing particle detection based on phononic crystals

    Energy Technology Data Exchange (ETDEWEB)

    Aly, Arafa H., E-mail: arafa16@yahoo.com, E-mail: arafa.hussien@science.bsu.edu.eg; Mehaney, Ahmed; Eissa, Mostafa F. [Physics Department, Faculty of Science, Beni-Suef University, Beni-Suef (Egypt)

    2015-08-14

    Most conventional radiation detectors are based on electronic or photon collection. In this work, we introduce a novel type of ionizing particle detector based on phonon collection. Helium ion radiation treats tumors with better precision. There are nine known isotopes of helium, but only helium-3 and helium-4 are stable. Helium-4 is formed in fusion reactor technology and in enormous quantities during Big Bang nucleosynthesis. In this study, we introduce a technique for helium-4 ion detection (sensing) based on the innovative properties of the new composite materials known as phononic crystals (PnCs). PnCs can provide an easy and cheap technique for ion detection compared with conventional methods. PnC structures commonly consist of a periodic array of two or more materials with different elastic properties. The two materials are polymethyl-methacrylate and polyethylene polymers. The calculations showed that the energies lost to target phonons are maximized at 1 keV helium-4 ion energy. There is a correlation between the total phonon energies and the transmittance of PnC structures. The maximum phonon transmission due to the passage of helium-4 ions was found when polyethylene was used as the first layer in the PnC structure. Therefore, the concept of ion detection based on PnC structures is achievable.

  4. A Vision-Based Approach to Fire Detection

    Directory of Open Access Journals (Sweden)

    Pedro Gomes

    2014-09-01

    Full Text Available This paper presents a vision-based method for fire detection from fixed surveillance smart cameras. The method integrates several well-known techniques properly adapted to cope with the challenges related to the actual deployment of the vision system. Concretely, background subtraction is performed with a context-based learning mechanism so as to attain higher accuracy and robustness. The computational cost of a frequency analysis of potential fire regions is reduced by means of focusing its operation with an attentive mechanism. For fast discrimination between fire regions and fire-coloured moving objects, a new colour-based model of fire's appearance and a new wavelet-based model of fire's frequency signature are proposed. To reduce the false alarm rate due to the presence of fire-coloured moving objects, the category and behaviour of each moving object is taken into account in the decision-making. To estimate the expected object's size in the image plane and to generate geo-referenced alarms, the camera-world mapping is approximated with a GPS-based calibration process. Experimental results demonstrate the ability of the proposed method to detect fires with an average success rate of 93.1% at a processing rate of 10 Hz, which is often sufficient for real-life applications.
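    To give one concrete example of a colour-based fire cue, the snippet below applies the classic RGB rule (red channel dominant and above a threshold) often used to pre-select candidate fire pixels. This is a generic heuristic rather than the specific colour model proposed in the paper, and in the full system such candidates would still have to pass the wavelet-based frequency check and the object-category and behaviour analysis before an alarm is raised.

```python
import numpy as np

def fire_colored_mask(rgb, r_min=180):
    """Candidate fire pixels by the classic RGB rule: R > r_min and R >= G >= B.

    rgb: uint8 image of shape (H, W, 3). Returns a boolean mask of candidates.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > r_min) & (r >= g) & (g >= b)
```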

  5. An FPGA-Based People Detection System

    Directory of Open Access Journals (Sweden)

    James J. Clark

    2005-05-01

    Full Text Available This paper presents an FPGA-based system for detecting people from video. The system is designed to use JPEG-compressed frames from a network camera. Unlike previous approaches that use techniques such as background subtraction and motion detection, we use a machine-learning-based approach to train an accurate detector. We address the hardware design challenges involved in implementing such a detector, along with JPEG decompression, on an FPGA. We also present an algorithm that efficiently combines JPEG decompression with the detection process. This algorithm carries out the inverse DCT step of JPEG decompression only partially. Therefore, it is computationally more efficient and simpler to implement, and it takes up less space on the chip than the full inverse DCT algorithm. The system is demonstrated on an automated video surveillance application and the performance of both hardware and software implementations is analyzed. The results show that the system can detect people accurately at a rate of about 2.5 frames per second on a Virtex-II 2V1000 using a MicroBlaze processor running at 75 MHz, communicating with dedicated hardware over FSL links.

  6. Proteomics goes forensic: Detection and mapping of blood signatures in fingermarks.

    Science.gov (United States)

    Deininger, Lisa; Patel, Ekta; Clench, Malcolm R; Sears, Vaughn; Sammon, Chris; Francese, Simona

    2016-06-01

    A bottom up in situ proteomic method has been developed enabling the mapping of multiple blood signatures on the intact ridges of blood fingermarks by Matrix Assisted Laser Desorption Mass Spectrometry Imaging (MALDI-MSI). This method, at a proof of concept stage, builds upon recently published work demonstrating the opportunity to profile and identify multiple blood signatures in bloodstains via a bottom up proteomic approach. The present protocol addresses the limitation of the previously developed profiling method with respect to destructivity; destructivity should be avoided for evidence such as blood fingermarks, where the ridge detail must be preserved in order to provide the associative link between the biometric information and the events of bloodshed. Using a blood mark reference model, trypsin concentration and spraying conditions have been optimised within the technical constraints of the depositor eventually employed; the application of MALDI-MSI and Ion Mobility MS have enabled the detection, confirmation and visualisation of blood signatures directly onto the ridge pattern. These results are to be considered a first insight into a method eventually informing investigations (and judicial debates) of violent crimes in which the reliable and non-destructive detection and mapping of blood in fingermarks is paramount to reconstruct the events of bloodshed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Investigating hyperoxic effects in the rat brain using quantitative susceptibility mapping based on MRI phase.

    Science.gov (United States)

    Hsieh, Meng-Chi; Kuo, Li-Wei; Huang, Yun-An; Chen, Jyh-Horng

    2017-02-01

    To test whether susceptibility imaging can detect microvenous oxygen saturation changes, induced by hyperoxia, in the rat brain. A three-dimensional gradient-echo with a flow compensation sequence was used to acquire T2*-weighted images of rat brains during hyperoxia and normoxia. Quantitative susceptibility mapping (QSM) and QSM-based microvenous oxygenation venography were computed from gradient-echo (GRE) phase images and compared between the two conditions. Pulse oxygen saturation (SpO2) in the cortex was examined and compared with venous oxygen saturation (SvO2) estimated by QSM. Oxygen saturation change calculated by a conventional ΔR2* map was also compared with the ΔSvO2 estimated by QSM. Susceptibilities of five venous and tissue regions were quantified separately by QSM. Venous susceptibility was reduced by nearly 10%, with an SvO2 shift of 10% during hyperoxia. A hyperoxic effect, confirmed by SpO2 measurement, resulted in an SvO2 increase in the cortex. The ΔSvO2 between hyperoxia and normoxia was consistent with what was estimated by the ΔR2* map in five regions. These findings suggest that a quantitative susceptibility map is a promising technique for SvO2 measurement. This method may be useful for quantitatively investigating oxygenation-dependent functional MRI studies. Magn Reson Med 77:592-602, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
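    The abstract does not state the conversion used, but a relation commonly assumed when estimating venous oxygenation from QSM links the vein-tissue susceptibility difference to SvO2 and hematocrit; the symbols below are generic and not necessarily those of the paper:

```latex
\Delta\chi_{\mathrm{vein\text{-}tissue}} \;=\; \Delta\chi_{do} \cdot \mathrm{Hct} \cdot \left( 1 - \mathrm{SvO_2} \right)
```

    where the deoxygenation constant is the susceptibility difference between fully deoxygenated and fully oxygenated erythrocytes and Hct is the hematocrit; under a model of this form, a measured drop in venous susceptibility maps directly onto an increase in SvO2.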

  8. Gold nanoparticle-based probes for the colorimetric detection of Mycobacterium avium subspecies paratuberculosis DNA.

    Science.gov (United States)

    Ganareal, Thenor Aristotile Charles S; Balbin, Michelle M; Monserate, Juvy J; Salazar, Joel R; Mingala, Claro N

    2018-02-12

    Gold nanoparticle (AuNP) is considered to be the most stable metal nanoparticle having the ability to be functionalized with biomolecules. Recently, AuNP-based DNA detection methods have captured the interest of researchers worldwide. Paratuberculosis or Johne's disease, a chronic gastroenteritis in ruminants caused by Mycobacterium avium subsp. paratuberculosis (MAP), has a negative effect on the livestock industry. In this study, AuNP-based probes were evaluated for the specific and sensitive detection of MAP DNA. AuNP-based probe was produced by functionalization of AuNPs with thiol-modified oligonucleotide and was confirmed by Fourier-Transform Infrared (FTIR) spectroscopy. UV-Vis spectroscopy and Scanning Electron Microscopy (SEM) were used to characterize AuNPs. DNA detection was done by hybridization of 10 μL of DNA with 5 μL of probe at 63 °C for 10 min and addition of 3 μL salt solution. The method was specific to MAP with detection limit of 103 ng. UV-Vis and SEM showed dispersion and aggregation of the AuNPs for the positive and negative results, respectively, with no observed particle growth. This study therefore reports an AuNP-based probes which can be used for the specific and sensitive detection of MAP DNA. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. A secure key agreement protocol based on chaotic maps

    International Nuclear Information System (INIS)

    Wang Xing-Yuan; Luan Da-Peng

    2013-01-01

    To guarantee the security of communication in the public channel, many key agreement protocols have been proposed. Recently, Gong et al. proposed a key agreement protocol based on chaotic maps with password sharing. In this paper, Gong et al.'s protocol is analyzed, and we find that this protocol exhibits key management issues and potential security problems. Furthermore, the paper presents a new key agreement protocol based on enhanced Chebyshev polynomials to overcome these problems. Through our analysis, our key agreement protocol not only provides mutual authentication and the ability to resist a variety of common attacks, but also solves the key management and security problems present in Gong et al.'s protocol.
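    The algebraic property that makes Chebyshev-polynomial (chaotic map) key agreement possible is the semigroup/commutativity relation T_r(T_s(x)) = T_rs(x) = T_s(T_r(x)). The sketch below demonstrates it numerically over the real interval using the trigonometric definition; the actual protocol works with enhanced Chebyshev polynomials over a larger domain and adds password-based authentication, neither of which is shown here.

```python
import math

def chebyshev(n, x):
    """Chebyshev polynomial T_n(x) via the trigonometric definition (|x| <= 1):
    T_n(x) = cos(n * arccos(x))."""
    return math.cos(n * math.acos(x))

# Semigroup property underlying chaotic-map key agreement:
#   T_r(T_s(x)) == T_s(T_r(x)), so two parties holding private integers r and s
#   can exchange T_r(x) and T_s(x) and derive the same shared value.
x, r, s = 0.53, 7, 11
alice_public = chebyshev(r, x)
bob_public = chebyshev(s, x)
shared_alice = chebyshev(r, bob_public)
shared_bob = chebyshev(s, alice_public)
assert abs(shared_alice - shared_bob) < 1e-9
```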

  10. Topology Optimization of Passive Micromixers Based on Lagrangian Mapping Method

    Directory of Open Access Journals (Sweden)

    Yuchen Guo

    2018-03-01

    Full Text Available This paper presents an optimization-based design method for passive micromixers for immiscible fluids, which means that the Peclet number is infinitely large. Based on the topology optimization method, an optimization model is constructed to find the optimal layout of the passive micromixers. Unlike topology optimization methods that use an Eulerian description of the convection-diffusion dynamics, the proposed method considers the extreme case in which mixing is dominated completely by convection, with negligible diffusion. In this method, the mixing dynamics is modeled by the mapping method, a Lagrangian description that can deal with the convection-dominated case. Several numerical examples are presented to demonstrate the validity of the proposed method.

  11. GPU-BSM: a GPU-based tool to map bisulfite-treated reads.

    Directory of Open Access Journals (Sweden)

    Andrea Manconi

    Full Text Available Cytosine DNA methylation is an epigenetic mark implicated in several biological processes. Bisulfite treatment of DNA is acknowledged as the gold standard technique to study methylation. This technique introduces changes in the genomic DNA by converting cytosines to uracils while 5-methylcytosines remain nonreactive. During PCR amplification 5-methylcytosines are amplified as cytosine, whereas uracils and thymines as thymine. To detect the methylation levels, reads treated with the bisulfite must be aligned against a reference genome. Mapping these reads to a reference genome represents a significant computational challenge mainly due to the increased search space and the loss of information introduced by the treatment. To deal with this computational challenge we devised GPU-BSM, a tool based on modern Graphics Processing Units. Graphics Processing Units are hardware accelerators that are increasingly being used successfully to accelerate general-purpose scientific applications. GPU-BSM is a tool able to map bisulfite-treated reads from whole genome bisulfite sequencing and reduced representation bisulfite sequencing, and to estimate methylation levels, with the goal of detecting methylation. Due to the massive parallelization obtained by exploiting graphics cards, GPU-BSM aligns bisulfite-treated reads faster than other cutting-edge solutions, while outperforming most of them in terms of unique mapped reads.
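    The alignment challenge comes from the conversion itself, and most bisulfite mappers deal with it by in-silico converting reads (and one copy of the reference) before alignment. The sketch below illustrates that generic conversion step in Python; it is a simplification for explanation and does not reflect GPU-BSM's actual GPU pipeline.

```python
def bisulfite_convert(seq, strand="plus"):
    """In-silico bisulfite conversion commonly applied before alignment.

    Reads (and one copy of the reference) have C converted to T so that fully
    converted, unmethylated reads still align; for the complementary strand
    the conversion is G -> A. Methylation is later inferred from reads whose
    original bases retained a C at a reference cytosine position.
    """
    seq = seq.upper()
    return seq.replace("C", "T") if strand == "plus" else seq.replace("G", "A")

# Example: a fictitious read prepared for alignment against a C->T converted reference
read = "ACGTTCGA"
converted = bisulfite_convert(read)   # "ATGTTTGA"
```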

  12. Developing a scientific procedure for community based hazard mapping and risk mitigation

    Science.gov (United States)

    Verrier, M.

    2011-12-01

    projects that are being conducted alongside the community hazard map include marking evacuation routes with painted bamboo signs, creating a meaningful landslide awareness mural, and installing simple early warning systems that detect land movement and alert residents that evacuation routes should be used. KKN-PPM is scheduled to continue until August 25th, 2011. In the future, research will be done into using the model for community based hazard mapping outlined here in the Geological Sciences Department at SDSU to increase georisk awareness and improve mitigation of landslides in local areas of need such as Tijuana, Mexico.

  13. Vibration Based Sun Gear Damage Detection

    Science.gov (United States)

    Hood, Adrian; LaBerge, Kelsen; Lewicki, David; Pines, Darryll

    2013-01-01

    Seeded fault experiments were conducted on the planetary stage of an OH-58C helicopter transmission. Two vibration based methods are discussed that isolate the dynamics of the sun gear from that of the planet gears, bearings, input spiral bevel stage, and other components in and around the gearbox. Three damaged sun gears: two spalled and one cracked, serve as the focus of this current work. A non-sequential vibration separation algorithm was developed and the resulting signals analyzed. The second method uses only the time synchronously averaged data but takes advantage of the signal/source mapping required for vibration separation. Both algorithms were successful in identifying the spall damage. Sun gear damage was confirmed by the presence of sun mesh groups. The sun tooth crack condition was inconclusive.
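    Both methods above start from time synchronous averaging (TSA) of the vibration signal. The Python fragment below shows the averaging step in its simplest form, assuming the signal has already been resampled to a fixed number of samples per shaft revolution; the vibration separation that isolates the sun gear from the planets, bearings and input stage builds on this and is not shown.

```python
import numpy as np

def time_synchronous_average(vibration, samples_per_rev):
    """Time synchronous averaging (TSA): average the vibration signal over an
    integer number of shaft revolutions so that components synchronous with the
    shaft are reinforced and asynchronous content averages out. In practice the
    signal is first resampled to a constant number of samples per revolution
    using a tachometer signal; that resampling step is omitted here."""
    n_revs = len(vibration) // samples_per_rev
    revs = np.reshape(vibration[: n_revs * samples_per_rev], (n_revs, samples_per_rev))
    return revs.mean(axis=0)
```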

  14. Combining voxel-based morphometry and diffusion tensor imaging to detect age-related brain changes.

    Science.gov (United States)

    Lehmbeck, Jan T; Brassen, Stefanie; Weber-Fahr, Wolfgang; Braus, Dieter F

    2006-04-03

    The present study combined optimized voxel-based morphometry and diffusion tensor imaging to detect age-related brain changes. We compared grey matter density maps (grey matter voxel-based morphometry) and white matter fractional anisotropy maps (diffusion tensor imaging-voxel-based morphometry) between two groups of 17 younger and 17 older women. Older women exhibited reduced white matter fractional anisotropy as well as decreased grey matter density most prominently in the frontal, limbic, parietal and temporal lobes. A discriminant analysis identified four frontal and limbic grey and white matter areas that separated the two groups most effectively. We conclude that grey matter voxel-based morphometry and diffusion tensor imaging voxel-based morphometry are well suited for the detection of age-related changes and their combination provides high accuracy when detecting the neural correlates of aging.

  15. Tensor-based spatiotemporal saliency detection

    Science.gov (United States)

    Dou, Hao; Li, Bin; Deng, Qianqian; Zhang, LiRui; Pan, Zhihong; Tian, Jinwen

    2018-03-01

    This paper proposes an effective tensor-based spatiotemporal saliency computation model for saliency detection in videos. First, we construct the tensor representation of video frames. Then, the spatiotemporal saliency can be directly computed using the tensor distance between different tensors, which preserves the complete temporal and spatial structure information of objects in the spatiotemporal domain. Experimental results demonstrate that our method achieves encouraging performance in comparison with the state-of-the-art methods.

  16. Landslide susceptibility mapping using GIS-based statistical models and Remote sensing data in tropical environment.

    Science.gov (United States)

    Shahabi, Himan; Hashim, Mazlan

    2015-04-22

    This research presents GIS-based statistical models for generating landslide susceptibility maps using a geographic information system (GIS) and remote sensing data for the Cameron Highlands area in Malaysia. Ten factors, including slope, aspect, soil, lithology, NDVI, land cover, distance to drainage, precipitation, distance to fault, and distance to road, were extracted from SAR data, SPOT 5 and WorldView-1 images. The relationships between the detected landslide locations and these ten related factors were identified using GIS-based statistical models including the analytical hierarchy process (AHP), weighted linear combination (WLC) and spatial multi-criteria evaluation (SMCE) models. The landslide inventory map, which has a total of 92 landslide locations, was created from numerous resources such as digital aerial photographs, AIRSAR data, WorldView-1 images, and field surveys. Then, 80% of the landslide inventory was used for training the statistical models and the remaining 20% was used for validation. The validation results using the relative landslide density index (R-index) and the receiver operating characteristic (ROC) demonstrated that the SMCE model (accuracy 96%) predicts better than the AHP (accuracy 91%) and WLC (accuracy 89%) models. These landslide susceptibility maps would be useful for hazard mitigation and regional planning.
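
    The WLC step in particular is easy to illustrate. Assuming min-max normalised factor rasters and user-chosen weights (the study's actual factor weights are not given in this record), a weighted linear combination susceptibility index could be sketched as follows; the function name and the synthetic layers are illustrative only.

      import numpy as np

      def weighted_linear_combination(factors, weights):
          """Combine factor rasters (equal-shape 2-D arrays) into a susceptibility index in [0, 1]."""
          weights = np.asarray(weights, dtype=float)
          weights = weights / weights.sum()                       # weights must sum to 1
          susceptibility = np.zeros_like(factors[0], dtype=float)
          for layer, w in zip(factors, weights):
              layer = np.asarray(layer, dtype=float)
              span = layer.max() - layer.min()
              susceptibility += w * (layer - layer.min()) / (span + 1e-12)  # min-max rescale
          return susceptibility

      if __name__ == "__main__":
          slope = np.random.rand(100, 100) * 45          # degrees (synthetic stand-in)
          dist_drainage = np.random.rand(100, 100)       # km (synthetic stand-in)
          ndvi = np.random.rand(100, 100)                # unitless (synthetic stand-in)
          s = weighted_linear_combination([slope, dist_drainage, ndvi], [0.5, 0.3, 0.2])
          print(s.min(), s.max())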

  17. Automated detection of qualitative spatio-temporal features in electrocardiac activation maps.

    Science.gov (United States)

    Ironi, Liliana; Tentoni, Stefania

    2007-02-01

    This paper describes a piece of work aiming at the realization of a tool for the automated interpretation of electrocardiac maps. Such maps can capture a number of electrical conduction pathologies, such as arrhythmia, that can be missed by the analysis of traditional electrocardiograms. However, their introduction into clinical practice is still far away, as their interpretation requires skills that belong to very few experts. An automated interpretation tool would therefore bridge the gap between the established research outcome and clinical practice, with a consequent great impact on health care. Qualitative spatial reasoning can play a crucial role in the identification of spatio-temporal patterns and salient features that characterize the heart's electrical activity. We adopted the spatial aggregation (SA) conceptual framework and an interplay of numerical and qualitative information to extract features from epicardial maps and to make them available for reasoning tasks. Our focus is on epicardial activation isochrone maps, as they are a synthetic representation of spatio-temporal aspects of the propagation of the electrical excitation. We provide a computational SA-based methodology to extract, from 3D epicardial data gathered over time, (1) the excitation wavefront structure, and (2) the salient features that characterize wavefront propagation and visually correspond to specific geometric objects. The proposed methodology provides a robust and efficient way to identify salient pieces of information in activation time maps. The hierarchical structure of the abstracted geometric objects, crucial in capturing the prominent information, facilitates the definition of general rules necessary to infer the correlation between pathophysiological patterns and wavefront structure and propagation.

  18. The value of bladder mapping and prostatic urethra biopsies for detection of carcinoma in situ (CIS).

    Science.gov (United States)

    Gudjónsson, Sigurdur; Bläckberg, Mats; Chebil, Gunilla; Jahnson, Staffan; Olsson, Hans; Bendahl, Pär-Ola; Månsson, Wiking; Liedberg, Fredrik

    2012-07-01

    It is well known that CIS is a major risk factor for muscle-invasive bladder cancer and that this entity can be difficult to diagnose. Taking cold-cup mapping biopsies from different areas of the bladder (BMAP) is commonly used in patients at risk of harbouring CIS. The diagnostic accuracy of this approach has not been assessed until now. By using the CIS found in the cystoprostatectomy specimen as an indicator of the true occurrence of CIS and comparing that with the findings of BMAP, it is clear that the sensitivity of BMAP to detect CIS when present is low and that negative findings should be considered unreliable. To assess the value of bladder mapping and prostatic urethra biopsies for detection of urothelial carcinoma in situ (CIS). CIS of the urinary bladder is a flat high-grade lesion of the mucosa associated with a significant risk of progression to muscle-invasive disease. CIS is difficult to identify on cystoscopy, and definite diagnosis requires histopathology. Traditionally, if CIS is suspected, multiple cold-cup biopsies are taken from the bladder mucosa, and resection biopsies are obtained from the prostatic urethra in males. This approach is often called bladder mapping (BMAP). The accuracy of BMAP as a diagnostic tool is not known. Male patients with bladder cancer scheduled for cystectomy underwent cold-cup bladder biopsies (sidewalls, posterior wall, dome, trigone), and resection biopsies were taken from the prostatic urethra. After cystectomy, the surgical specimen was investigated in a standardised manner and subsequently compared with the BMAP biopsies for the presence of CIS. The histopathology reports of 162 patients were analysed. CIS was detected in 46% of the cystoprostatectomy specimens, and multiple (≥2) CIS lesions were found in 30%. BMAP (cold-cup bladder biopsies + resection biopsies from the prostatic urethra) provided sensitivity of 51% for any CIS, and 55% for multiple CIS lesions. The cold-cup biopsies for CIS in the bladder

  19. Region-Based Building Rooftop Extraction and Change Detection

    Science.gov (United States)

    Tian, J.; Metzlaff, L.; d'Angelo, P.; Reinartz, P.

    2017-09-01

    Automatic extraction of building changes is important for many applications like disaster monitoring and city planning. Although a lot of research work is available based on 2D as well as 3D data, an improvement in accuracy and efficiency is still needed. The introduction of digital surface models (DSMs) into building change detection has strongly improved the resulting accuracy. In this paper, a post-classification approach is proposed for building change detection using satellite stereo imagery. Firstly, DSMs are generated from satellite stereo imagery and further refined by using a segmentation result obtained from the Sobel gradients of the panchromatic image. Besides the refined DSMs, the panchromatic image and the pansharpened multispectral image are used as input features for mean-shift segmentation. The DSM is used to calculate the nDSM, out of which the initial building candidate regions are extracted. The candidate mask is further refined by morphological filtering and by excluding shadow regions. Following this, all segments that overlap with a building candidate region are determined. A building-oriented segment merging procedure is introduced to generate a final building rooftop mask. As the last step, object-based change detection is performed by directly comparing the building rooftops extracted from the pre- and post-event imagery and by fusing the change indicators with the rooftop region map. A quantitative and qualitative assessment of the proposed approach is provided by using WorldView-2 satellite data from Istanbul, Turkey.

  20. REGION-BASED BUILDING ROOFTOP EXTRACTION AND CHANGE DETECTION

    Directory of Open Access Journals (Sweden)

    J. Tian

    2017-09-01

    Full Text Available Automatic extraction of building changes is important for many applications like disaster monitoring and city planning. Although a lot of research work is available based on 2D as well as 3D data, an improvement in accuracy and efficiency is still needed. The introduction of digital surface models (DSMs) into building change detection has strongly improved the resulting accuracy. In this paper, a post-classification approach is proposed for building change detection using satellite stereo imagery. Firstly, DSMs are generated from satellite stereo imagery and further refined by using a segmentation result obtained from the Sobel gradients of the panchromatic image. Besides the refined DSMs, the panchromatic image and the pansharpened multispectral image are used as input features for mean-shift segmentation. The DSM is used to calculate the nDSM, out of which the initial building candidate regions are extracted. The candidate mask is further refined by morphological filtering and by excluding shadow regions. Following this, all segments that overlap with a building candidate region are determined. A building-oriented segment merging procedure is introduced to generate a final building rooftop mask. As the last step, object-based change detection is performed by directly comparing the building rooftops extracted from the pre- and post-event imagery and by fusing the change indicators with the rooftop region map. A quantitative and qualitative assessment of the proposed approach is provided by using WorldView-2 satellite data from Istanbul, Turkey.
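
    A small sketch may help with the nDSM thresholding and mask comparison steps described in both records above. It assumes an already computed nDSM raster (height above terrain, in metres) and uses generic morphological cleaning in place of the paper's segment-merging procedure; thresholds and function names are illustrative.

      import numpy as np
      from scipy import ndimage

      def building_candidates(ndsm, height_thresh=3.0, min_pixels=50):
          """Threshold an nDSM and clean the mask to get initial building candidate regions."""
          mask = ndsm > height_thresh
          mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
          labels, n = ndimage.label(mask)
          sizes = ndimage.sum(mask, labels, range(1, n + 1))
          keep_ids = np.nonzero(sizes >= min_pixels)[0] + 1      # labels of large-enough blobs
          return np.isin(labels, keep_ids)

      def rooftop_change(mask_pre, mask_post):
          """Pixel-wise comparison of pre- and post-event rooftop masks."""
          newly_built = mask_post & ~mask_pre
          demolished = mask_pre & ~mask_post
          return newly_built, demolished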

  1. Automated image based prominent nucleoli detection.

    Science.gov (United States)

    Yap, Choon K; Kalaw, Emarene M; Singh, Malay; Chong, Kian T; Giron, Danilo M; Huang, Chao-Hui; Cheng, Li; Law, Yan N; Lee, Hwee Kuan

    2015-01-01

    Nucleolar changes in cancer cells are one of the cytologic features important to the tumor pathologist in cancer assessments of tissue biopsies. However, inter-observer variability and the manual approach to this work hamper the accuracy of the assessment by pathologists. In this paper, we propose a computational method for prominent nucleoli pattern detection. Thirty-five hematoxylin and eosin stained images were acquired from prostate cancer, breast cancer, renal clear cell cancer and renal papillary cell cancer tissues. Prostate cancer images were used for the development of a computer-based automated prominent nucleoli pattern detector built on a cascade farm. An ensemble of approximately 1000 cascades was constructed by permuting different combinations of classifiers such as support vector machines, eXclusive component analysis, boosting, and logistic regression. The output of the cascades was then combined using the RankBoost algorithm. The output of our prominent nucleoli pattern detector is a ranked set of detected image patches of patterns of prominent nucleoli. The mean number of detected prominent nucleoli patterns in the top 100 ranked detected objects was 58 in the prostate cancer dataset, 68 in the breast cancer dataset, 86 in the renal clear cell cancer dataset, and 76 in the renal papillary cell cancer dataset. The proposed cascade farm performs twice as well as the single cascade proposed in the seminal paper by Viola and Jones. For comparison, a naive algorithm that randomly chooses a pixel as a nucleoli pattern would detect five correct patterns in the first 100 ranked objects. Detection of sparse nucleoli patterns in a large background of highly variable tissue patterns is a difficult challenge our method has overcome. This study developed an accurate prominent nucleoli pattern detector with the potential to be used in clinical settings.

  2. Automated image based prominent nucleoli detection

    Directory of Open Access Journals (Sweden)

    Choon K Yap

    2015-01-01

    Full Text Available Introduction: Nucleolar changes in cancer cells are one of the cytologic features important to the tumor pathologist in cancer assessments of tissue biopsies. However, inter-observer variability and the manual approach to this work hamper the accuracy of the assessment by pathologists. In this paper, we propose a computational method for prominent nucleoli pattern detection. Materials and Methods: Thirty-five hematoxylin and eosin stained images were acquired from prostate cancer, breast cancer, renal clear cell cancer and renal papillary cell cancer tissues. Prostate cancer images were used for the development of a computer-based automated prominent nucleoli pattern detector built on a cascade farm. An ensemble of approximately 1000 cascades was constructed by permuting different combinations of classifiers such as support vector machines, eXclusive component analysis, boosting, and logistic regression. The output of the cascades was then combined using the RankBoost algorithm. The output of our prominent nucleoli pattern detector is a ranked set of detected image patches of patterns of prominent nucleoli. Results: The mean number of detected prominent nucleoli patterns in the top 100 ranked detected objects was 58 in the prostate cancer dataset, 68 in the breast cancer dataset, 86 in the renal clear cell cancer dataset, and 76 in the renal papillary cell cancer dataset. The proposed cascade farm performs twice as well as the single cascade proposed in the seminal paper by Viola and Jones. For comparison, a naive algorithm that randomly chooses a pixel as a nucleoli pattern would detect five correct patterns in the first 100 ranked objects. Conclusions: Detection of sparse nucleoli patterns in a large background of highly variable tissue patterns is a difficult challenge our method has overcome. This study developed an accurate prominent nucleoli pattern detector with the potential to be used in clinical settings.

  3. Chaotic map based key agreement with/out clock synchronization

    International Nuclear Information System (INIS)

    Han, S.; Chang, E.

    2009-01-01

    In order to address Bergamo et al.'s attack, Xiao et al. proposed a key agreement protocol using chaotic maps. Han then presented three attacks on Xiao et al.'s protocol. To enhance the security of key agreement based on chaotic maps, Chang et al. proposed a new key agreement using a passphrase, which works in a clock synchronization environment. However, their protocol still has some issues: first, the passphrase is not easy to remember and is much longer than a password; second, the protocol cannot resist guessing attacks if the constructed passphrase is easy to remember and already exists in common dictionaries; third, it cannot work without clock synchronization. In this paper, we present two different key agreement protocols which can resist guessing attacks. The first one works in a clock synchronization environment. The second one can work without clock synchronization. Both use an authenticated password for secure communications. The protocols are secure against replay attacks, and a shared session key can be established.
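
    For context, chaotic-map key agreement schemes of this family are typically built on the semigroup property of Chebyshev polynomials, T_r(T_s(x)) = T_rs(x). The sketch below only demonstrates that underlying property in the textbook, unauthenticated form that Bergamo et al. attacked; it is not the protocol proposed in this record, and the seed and exponents are arbitrary toy values.

      import math

      def chebyshev(n: int, x: float) -> float:
          """T_n(x) = cos(n * arccos(x)) for x in [-1, 1]; satisfies T_r(T_s(x)) = T_rs(x)."""
          return math.cos(n * math.acos(max(-1.0, min(1.0, x))))

      # Toy Diffie-Hellman-style exchange using the semigroup property (illustrative only).
      x = 0.53             # public seed
      r, s = 37, 58        # Alice's and Bob's secret integers (unrealistically small)
      A = chebyshev(r, x)  # Alice -> Bob
      B = chebyshev(s, x)  # Bob -> Alice
      k_alice = chebyshev(r, B)
      k_bob = chebyshev(s, A)
      print(abs(k_alice - k_bob) < 1e-9)   # True: both sides derive the same session key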

  4. VISUAL UAV TRAJECTORY PLAN SYSTEM BASED ON NETWORK MAP

    Directory of Open Access Journals (Sweden)

    X. L. Li

    2012-07-01

    Full Text Available The base map of UP-30, the software currently used for Unmanned Aerial Vehicle trajectory planning, is a vector diagram, and UP-30 requires navigation points to be drawn manually. In field operation, the efficiency and quality of the work suffer from insufficient information, screen reflection, inconvenient calculation and other factors. If this work is done indoors, the effect of external factors on the results is eliminated: users of the network earth service can browse free high-definition satellite images of the whole world through a client software and export high-resolution images in a standard file format. This brings unprecedented convenience to trajectory planning, although the images must first be processed by coordinate transformation and geometric correction. In addition, according to the required mapping scale, the camera parameters and the overlap, the exposure interval and the distance between adjacent trajectories can be calculated automatically, which improves the degree of automation of data collection. The software judges the position of the next point from the intersection of the trajectory with the survey area and fixes the point positions according to the trajectory distance; points can also be adjusted manually, so trajectory planning is both automatic and flexible. For safety, the data can be used for flying only after a simulated flight. Finally, all of the data can be exported with a single key press.

  5. Visual Uav Trajectory Plan System Based on Network Map

    Science.gov (United States)

    Li, X. L.; Lin, Z. J.; Su, G. Z.; Wu, B. Y.

    2012-07-01

    The base map of UP-30, the software currently used for Unmanned Aerial Vehicle trajectory planning, is a vector diagram, and UP-30 requires navigation points to be drawn manually. In field operation, the efficiency and quality of the work suffer from insufficient information, screen reflection, inconvenient calculation and other factors. If this work is done indoors, the effect of external factors on the results is eliminated: users of the network earth service can browse free high-definition satellite images of the whole world through a client software and export high-resolution images in a standard file format. This brings unprecedented convenience to trajectory planning, although the images must first be processed by coordinate transformation and geometric correction. In addition, according to the required mapping scale, the camera parameters and the overlap, the exposure interval and the distance between adjacent trajectories can be calculated automatically, which improves the degree of automation of data collection. The software judges the position of the next point from the intersection of the trajectory with the survey area and fixes the point positions according to the trajectory distance; points can also be adjusted manually, so trajectory planning is both automatic and flexible. For safety, the data can be used for flying only after a simulated flight. Finally, all of the data can be exported with a single key press.
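
    The automatic computation of exposure interval and line spacing mentioned in both versions of this record follows standard photogrammetric relations. The sketch below uses those generic formulas with hypothetical camera parameters; it is not the UP-30 implementation.

      def flight_plan(focal_mm, pixel_um, img_w_px, img_h_px, altitude_m,
                      forward_overlap=0.8, side_overlap=0.6):
          """Return ground sample distance, exposure interval and flight-line spacing (metres)."""
          gsd_m = (pixel_um * 1e-6) * altitude_m / (focal_mm * 1e-3)   # ground sample distance
          footprint_w = gsd_m * img_w_px          # across-track ground coverage of one frame
          footprint_h = gsd_m * img_h_px          # along-track ground coverage of one frame
          exposure_interval = footprint_h * (1.0 - forward_overlap)    # distance between exposures
          line_spacing = footprint_w * (1.0 - side_overlap)            # distance between flight lines
          return gsd_m, exposure_interval, line_spacing

      if __name__ == "__main__":
          # Hypothetical camera: 35 mm lens, 4.6 um pixels, 7360 x 4912 sensor, 400 m altitude.
          print(flight_plan(focal_mm=35, pixel_um=4.6, img_w_px=7360, img_h_px=4912,
                            altitude_m=400))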

  6. An Isometric Mapping Based Co-Location Decision Tree Algorithm

    Science.gov (United States)

    Zhou, G.; Wei, J.; Zhou, X.; Zhang, R.; Huang, W.; Sha, H.; Chen, J.

    2018-05-01

    Decision tree (DT) induction has been widely used in different pattern classification tasks. However, most traditional DTs have the disadvantage that they consider only non-spatial attributes (i.e., spectral information) when classifying pixels, which can result in objects being misclassified. Therefore, some researchers have proposed a co-location decision tree (Cl-DT) method, which combines co-location patterns and decision trees to solve the above-mentioned problems of traditional decision trees. Cl-DT overcomes the shortcomings of existing DT algorithms, which create a node for each value of a given attribute, and has a higher accuracy than the existing decision tree approach. However, for non-linearly distributed data instances, the Euclidean distance between instances does not reflect the true positional relationship between them. In order to overcome these shortcomings, this paper proposes an isometric mapping method based on Cl-DT (called Isomap-based Cl-DT), which combines isometric mapping (Isomap) and Cl-DT. Because isometric mapping uses geodesic distances instead of Euclidean distances between non-linearly distributed instances, the true distance between instances can be reflected. The experimental results and several comparative analyses show that: (1) the extraction method for exposed carbonate rocks is of high accuracy; (2) the proposed method has many advantages, because the total number of nodes and the number of leaf nodes are greatly reduced compared to Cl-DT. Therefore, the Isomap-based Cl-DT algorithm can construct a more accurate and faster decision tree.

  7. AN ISOMETRIC MAPPING BASED CO-LOCATION DECISION TREE ALGORITHM

    Directory of Open Access Journals (Sweden)

    G. Zhou

    2018-05-01

    Full Text Available Decision tree (DT) induction has been widely used in different pattern classification tasks. However, most traditional DTs have the disadvantage that they consider only non-spatial attributes (i.e., spectral information) when classifying pixels, which can result in objects being misclassified. Therefore, some researchers have proposed a co-location decision tree (Cl-DT) method, which combines co-location patterns and decision trees to solve the above-mentioned problems of traditional decision trees. Cl-DT overcomes the shortcomings of existing DT algorithms, which create a node for each value of a given attribute, and has a higher accuracy than the existing decision tree approach. However, for non-linearly distributed data instances, the Euclidean distance between instances does not reflect the true positional relationship between them. In order to overcome these shortcomings, this paper proposes an isometric mapping method based on Cl-DT (called Isomap-based Cl-DT), which combines isometric mapping (Isomap) and Cl-DT. Because isometric mapping uses geodesic distances instead of Euclidean distances between non-linearly distributed instances, the true distance between instances can be reflected. The experimental results and several comparative analyses show that: (1) the extraction method for exposed carbonate rocks is of high accuracy; (2) the proposed method has many advantages, because the total number of nodes and the number of leaf nodes are greatly reduced compared to Cl-DT. Therefore, the Isomap-based Cl-DT algorithm can construct a more accurate and faster decision tree.
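
    The Isomap step can be illustrated independently of the decision-tree induction. The snippet below uses scikit-learn's Isomap on a synthetic non-linear point cloud to show how geodesic (graph shortest-path) distances replace Euclidean ones before the embedded coordinates are handed to a classifier; the data set and parameters are illustrative and unrelated to the carbonate-rock experiment.

      from sklearn.datasets import make_swiss_roll
      from sklearn.manifold import Isomap

      # A non-linearly distributed point cloud where Euclidean distance is misleading.
      X, _ = make_swiss_roll(n_samples=800, random_state=0)

      # Isomap builds a k-nearest-neighbour graph and approximates geodesic distances
      # by shortest paths on that graph before embedding into a low-dimensional space.
      embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
      print(embedding.shape)   # (800, 2): features a decision tree could be grown on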

  8. Detect, map, and preserve Bronze & Iron Age monuments along the pre-historic Silk Road

    Science.gov (United States)

    Balz, Timo; Caspari, Gino; Fu, Bihong

    2017-02-01

    Central Asia is rich in cultural heritage generated by thousands of years of human occupation. Aiming for a better understanding of Central Asia’s archaeology and how this unique heritage can be protected, the region should be studied as a whole with regard to its cultural ties with China and combined efforts should be undertaken in shielding the archaeological monuments from destruction. So far, international research campaigns have focused predominantly on single-sites or small-scale surveys, mainly due to the bureaucratic and security related issues involved in cross-border research. This is why we created the Dzungaria Landscape Project. Since 2013, we have worked on collecting remote sensing data of Xinjiang including IKONOS, WorldView-2, and TerraSAR-X data. We have developed a method for the automatic detection of larger grave mound structures in optical and SAR data. Gravemounds are typically spatially clustered and the detection of larger mound structures is a sufficient hint towards areas of high archaeological interest in a region. A meticulous remote sensing survey is the best planning tool for subsequent ground surveys and excavation. In summer 2015, we undertook a survey in the Chinese Altai in order to establish ground-truth in the Hailiutan valley. We categorized over 1000 monuments in just three weeks thanks to the previous detection and classification work using remote sensing data. Creating accurate maps of the cemeteries in northern Xinjiang is a crucial step to preserving the cultural heritage of the region since graves in remote areas are especially prone to looting. We will continue our efforts with the ultimate aim to map and monitor all large gravemounds in Dzungaria and potentially neighbouring eastern Kazakhstan.

  9. Performance of UWB Array-Based Radar Sensor in a Multi-Sensor Vehicle-Based Suit for Landmine Detection

    NARCIS (Netherlands)

    Yarovoy, A.; Savelyev, T.; Zhuge, X.; Aubry, P.; Ligthart, L.; Schavemaker, J.G.M.; Tettelaar, P.; Breejen, E. de

    2008-01-01

    In this paper, the integration of a UWB array-based time-domain radar sensor into a vehicle-mounted multi-sensor system for landmine detection is described. Dedicated real-time signal processing algorithms are developed to compute the radar sensor confidence map, which is used for sensor fusion.

  10. Advances in neutron based bulk explosive detection

    Science.gov (United States)

    Gozani, Tsahi; Strellis, Dan

    2007-08-01

    Neutron based explosive inspection systems can detect a wide variety of national security threats. The inspection is founded on the detection of characteristic gamma rays emitted as the result of neutron interactions with materials. Generally these are gamma rays resulting from thermal neutron capture and inelastic scattering reactions in most materials, and fast and thermal neutron fission in fissile (e.g., 235U and 239Pu) and fertile (e.g., 238U) materials. Cars or trucks laden with explosives, drugs, chemical agents and hazardous materials can be detected. Cargo material classification via its main elements and nuclear materials detection can also be accomplished with such neutron based platforms, when appropriate neutron sources, gamma ray spectroscopy, neutron detectors and suitable decision algorithms are employed. Neutron based techniques can be used in a variety of scenarios and operational modes. They can be used stand-alone for complete scans of objects such as vehicles, or for spot-checks to clear (or validate) alarms indicated by another inspection system such as X-ray radiography. The technologies developed over the last two decades are now being implemented with good results. Further advances have been made over the last few years that increase the sensitivity, applicability and robustness of these systems. The advances range from the synchronous inspection of two sides of vehicles, increasing throughput and sensitivity and reducing the imparted dose to the inspected object and its occupants (if any), to taking advantage of the neutron kinetic behavior of cargo to remove systematic errors, reducing background effects and improving fast neutron signals.

  11. Scanning Electron Microscope Mapping System Developed for Detecting Surface Defects in Fatigue Specimens

    Science.gov (United States)

    Bonacuse, Peter J.; Kantzos, Peter T.

    2002-01-01

    An automated two-degree-of-freedom specimen positioning stage has been developed at the NASA Glenn Research Center to map and monitor defects in fatigue specimens. This system expedites the examination of the entire gauge section of fatigue specimens so that defects can be found using scanning electron microscopy (SEM). Translation and rotation stages are driven by microprocessor-based controllers that are, in turn, interfaced to a computer running custom-designed software. This system is currently being used to find and record the location of ceramic inclusions in powder metallurgy materials. The mapped inclusions are periodically examined during interrupted fatigue experiments. The number of cycles to initiate cracks from these inclusions and the rate of growth of initiated cracks can then be quantified. This information is necessary to quantify the effect of this type of defect on the durability of powder metallurgy materials. This system was developed with support of the Ultra Safe program.

  12. Body-Sensor-Network-Based Spasticity Detection.

    Science.gov (United States)

    Misgeld, Berno J E; Luken, Markus; Heitzmann, Daniel; Wolf, Sebastian I; Leonhardt, Steffen

    2016-05-01

    Spasticity is a common disorder of the skeletal muscle with a high incidence in industrialised countries. A quantitative measure of spasticity using body-worn sensors is important in order to assess rehabilitative motor training and to adjust the rehabilitative therapy accordingly. We present a new approach to spasticity detection using the Integrated Posture and Activity Network by Medit Aachen body sensor network (BSN). For this, a new electromyography (EMG) sensor node was developed and employed in human locomotion. Following an analysis of the clinical gait data of patients with unilateral cerebral palsy, a novel algorithm was developed based on the idea to detect coactivation of antagonistic muscle groups as observed in the exaggerated stretch reflex with associated joint rigidity. The algorithm applies a cross-correlation function to the EMG signals of two antagonistically working muscles and subsequent weighting using a Blackman window. The result is a coactivation index which is also weighted by the signal equivalent energy to exclude positive detection of inactive muscles. Our experimental study indicates good performance in the detection of coactive muscles associated with spasticity from clinical data as well as measurements from a BSN in qualitative comparison with the Modified Ashworth Scale as classified by clinical experts. Possible applications of the new algorithm include (but are not limited to) use in robotic sensorimotor therapy to reduce the effect of spasticity.
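
    As a rough illustration of the coactivation idea described above (cross-correlation of antagonistic EMG channels, Blackman-window weighting of the lags, and scaling by signal energy), the sketch below may help; it is a hedged approximation, and the exact normalisation, windowing and thresholds of the published algorithm are not reproduced.

      import numpy as np

      def coactivation_index(emg_a, emg_b):
          """Toy coactivation index for two rectified EMG channels of antagonistic muscles."""
          a = np.abs(emg_a - np.mean(emg_a))                 # crude rectification
          b = np.abs(emg_b - np.mean(emg_b))
          xcorr = np.correlate(a - a.mean(), b - b.mean(), mode="full")
          norm = np.sqrt(np.sum((a - a.mean())**2) * np.sum((b - b.mean())**2)) + 1e-12
          xcorr = xcorr / norm                               # normalised cross-correlation
          window = np.blackman(len(xcorr))                   # emphasise lags near zero
          energy = np.sqrt(np.mean(a**2) * np.mean(b**2))    # signal-equivalent energy term
          return float(np.sum(xcorr * window) * energy)      # low for inactive muscles

      if __name__ == "__main__":
          t = np.linspace(0, 1, 1000)
          gate = (np.sin(2 * np.pi * 2 * t) > 0).astype(float)
          agonist = np.random.randn(1000) * gate             # both channels active together,
          antagonist = np.random.randn(1000) * gate          # i.e. co-activation
          print(coactivation_index(agonist, antagonist))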

  13. A street rubbish detection algorithm based on Sift and RCNN

    Science.gov (United States)

    Yu, XiPeng; Chen, Zhong; Zhang, Shuo; Zhang, Ting

    2018-02-01

    This paper presents a street rubbish detection algorithm based on image registration with SIFT features and an RCNN-style region classifier. First, rubbish region proposals are obtained on the real-time street image, and a CNN (convolutional neural network) is trained on a sample set consisting of rubbish and non-rubbish images. Second, for every clean street image, SIFT features are extracted and used to register the clean image against the real-time street image; the resulting difference image filters out most of the background information, and the selective search algorithm is applied to the difference image to obtain the region proposals where rubbish may appear. The CNN model is then used to classify the pixel data of each region proposal on the real-time street image, and according to the output vector of the CNN it is judged whether the proposal contains rubbish; if so, the region proposal is marked on the real-time street image. Because the CNN only examines the regions of the real-time image where rubbish may appear, the algorithm avoids the large number of false detections that detection on the whole image would produce. Unlike traditional region-proposal-based object detection algorithms, the region proposals are obtained on the difference image rather than on the whole real-time street image, which greatly reduces the number of invalid proposals. The algorithm achieves a high mean average precision (mAP).
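
    The registration-and-differencing front end can be sketched with OpenCV. The snippet below assumes grayscale images, omits the CNN classification stage, and uses illustrative thresholds; it is a hedged sketch of the general technique, not the paper's implementation.

      import cv2
      import numpy as np

      def rubbish_proposals(clean_img, live_img):
          """Register a clean reference image onto the live image and propose change regions."""
          sift = cv2.SIFT_create()
          kp1, des1 = sift.detectAndCompute(clean_img, None)
          kp2, des2 = sift.detectAndCompute(live_img, None)
          matcher = cv2.BFMatcher(cv2.NORM_L2)
          matches = matcher.knnMatch(des1, des2, k=2)
          good = [m for m, n in matches if m.distance < 0.75 * n.distance]   # Lowe ratio test
          src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
          dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
          H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
          warped = cv2.warpPerspective(clean_img, H, live_img.shape[1::-1])
          diff = cv2.absdiff(live_img, warped)               # background largely cancels out
          _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
          contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
          return [cv2.boundingRect(c) for c in contours]     # candidate regions for the CNN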

  14. QRS detection based ECG quality assessment

    International Nuclear Information System (INIS)

    Hayn, Dieter; Jammerbund, Bernhard; Schreier, Günter

    2012-01-01

    Although immediate feedback concerning ECG signal quality during recording is useful, up to now not much literature describing quality measures is available. We have implemented and evaluated four ECG quality measures. The empty lead criterion (A), spike detection criterion (B) and lead crossing point criterion (C) were calculated from basic signal properties. Measure D quantified the robustness of QRS detection when applied to the signal. An advanced Matlab-based algorithm combining all four measures and a simplified algorithm for Android platforms, excluding measure D, were developed. Both algorithms were evaluated by taking part in the Computing in Cardiology Challenge 2011. Each measure's accuracy and computing time were evaluated separately. During the challenge, the advanced algorithm correctly classified 93.3% of the ECGs in the training set and 91.6% in the test set. Scores for the simplified algorithm were 0.834 in event 2 and 0.873 in event 3. Computing time for measure D was almost five times higher than for the other measures. Required accuracy levels depend on the application and are related to computing time. While our simplified algorithm may be accurate enough for real-time feedback during ECG self-recordings, QRS detection based measures can further increase the performance if sufficient computing power is available. (paper)
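
    Two of the basic-signal-property measures (the empty-lead and spike criteria) can be illustrated with simple checks; the thresholds and function names below are placeholders, not the values or code used in the paper.

      import numpy as np

      def empty_lead(sig, flat_tol=1e-4):
          """Measure A sketch: a lead is 'empty' if its amplitude range is negligible."""
          return (np.max(sig) - np.min(sig)) < flat_tol

      def spike_detected(sig, fs, max_slope_mv_per_ms=2.0):
          """Measure B sketch: implausibly steep sample-to-sample jumps indicate spikes."""
          slope = np.abs(np.diff(sig)) * fs / 1000.0   # mV per ms, assuming sig is in mV
          return bool(np.any(slope > max_slope_mv_per_ms))

      def acceptable(leads, fs):
          """Toy combination of measures A and B only (measure D, QRS robustness, omitted)."""
          return all(not empty_lead(l) and not spike_detected(l, fs) for l in leads)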

  15. Transit detections of extrasolar planets around main-sequence stars. I. Sky maps for hot Jupiters

    Science.gov (United States)

    Heller, R.; Mislis, D.; Antoniadis, J.

    2009-12-01

    Context: The findings of more than 350 extrasolar planets, most of them nontransiting Hot Jupiters, have revealed correlations between the metallicity of the main-sequence (MS) host stars and planetary incidence. This connection can be used to calculate the planet formation probability around other stars not yet known to have planetary companions. Numerous wide-field surveys have recently been initiated, aiming at the transit detection of extrasolar planets in front of their host stars. Depending on instrumental properties and the planetary distribution probability, the promising transit locations on the celestial plane will differ among these surveys. Aims: We want to locate the promising spots for transit surveys on the celestial plane and strive for absolute values of the expected number of transits in general. Our study will also clarify the impact of instrumental properties such as pixel size, field of view (FOV), and magnitude range on the detection probability. Methods: We used data from the Tycho catalog for ≈1 million objects to locate all the stars with 0 ≲ m_V ≲ 11.5 on the celestial plane. We took several empirical relations between the parameters listed in the Tycho catalog, such as distance to Earth, m_V, and (B-V), and those parameters needed to account for the probability of a star to host an observable, transiting exoplanet. The empirical relations between stellar metallicity and planet occurrence, combined with geometrical considerations, were used to yield transit probabilities for the MS stars in the Tycho catalog. Magnitude variations in the FOV were simulated to test whether these fluctuations would be detected by BEST, XO, SuperWASP and HATNet. Results: We present a sky map of the expected number of Hot Jupiter transit events on the basis of the Tycho catalog. Conditioned by the accumulation of stars towards the galactic plane, the zone of the highest number of transits follows the same trace, interrupted by spots of very low and high

  16. Map-Based Channel Model for Urban Macrocell Propagation Scenarios

    Directory of Open Access Journals (Sweden)

    Jose F. Monserrat

    2015-01-01

    Full Text Available The evolution of LTE towards 5G has started and different research projects and institutions are in the process of verifying new technology components through simulations. Coordination between groups is strongly recommended and, in this sense, a common definition of test cases and simulation models is needed. The scope of this paper is to present a realistic channel model for urban macrocell scenarios. This model is map-based and takes into account the layout of buildings situated in the area under study. A detailed description of the model is given, together with a comparison with other widely used channel models. The benchmark includes a measurement campaign in which the proposed model is shown to be much closer to the actual behavior of a cellular system. Particular attention is given to the outdoor component of the model, since it is here that the proposed approach differs most from previous models.

  17. An Improved Piecewise Linear Chaotic Map Based Image Encryption Algorithm

    Directory of Open Access Journals (Sweden)

    Yuping Hu

    2014-01-01

    Full Text Available An image encryption algorithm based on an improved piecewise linear chaotic map (MPWLCM) model was proposed. The algorithm uses the MPWLCM to permute and diffuse the plain image simultaneously. Owing to the chaotic system's sensitivity to initial key values and system parameters, and to its ergodicity, two pseudorandom sequences are designed and used in the processes of permutation and diffusion. Pixels are not processed in index order but alternately from the beginning and the end of the image. Cipher feedback is introduced in the diffusion process. Test results and security analysis show that the scheme not only achieves good encryption results but also has a key space large enough to resist brute-force attack.
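
    The underlying map is easy to state. Using the classical piecewise linear chaotic map, a keystream for permutation or diffusion can be derived as in the sketch below; the published scheme's alternating pixel order, improved map variant and cipher feedback are not reproduced, and the key values are arbitrary.

      import numpy as np

      def pwlcm(x, p):
          """One iteration of the classical piecewise linear chaotic map, x in [0, 1], 0 < p < 0.5."""
          if x < p:
              return x / p
          if x <= 0.5:
              return (x - p) / (0.5 - p)
          return pwlcm(1.0 - x, p)                 # symmetric upper half

      def keystream(x0, p, n, burn_in=200):
          """Derive a byte keystream from PWLCM orbits (illustrative key schedule only)."""
          x, out = x0, np.empty(n, dtype=np.uint8)
          for _ in range(burn_in):
              x = pwlcm(x, p)
          for i in range(n):
              x = pwlcm(x, p)
              out[i] = int(x * 256) % 256
          return out

      if __name__ == "__main__":
          img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in image
          ks = keystream(x0=0.37, p=0.23, n=img.size).reshape(img.shape)
          cipher = img ^ ks                          # toy diffusion by XOR with the keystream
          print(np.array_equal(cipher ^ ks, img))    # True: decryption recovers the image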

  18. A novel image encryption scheme based on spatial chaos map

    International Nuclear Information System (INIS)

    Sun Fuyan; Liu Shutang; Li Zhongqin; Lue Zongwang

    2008-01-01

    In recent years, chaos-based cryptographic algorithms have suggested some new and efficient ways to develop secure image encryption techniques, but the drawbacks of small key space and weak security in one-dimensional chaotic cryptosystems are obvious. In this paper, a spatial chaos system is used for high-security image encryption while its speed remains acceptable. The proposed algorithm is described in detail. The basic idea is to encrypt the image in space with the spatial chaos map pixel by pixel, and then to confuse the pixels in multiple directions of space. After one cycle of this method, the image becomes indistinguishable in space due to the inherent properties of spatial chaotic systems. Several experimental results, key sensitivity tests, key space analysis, and statistical analysis show that the approach provides an efficient and secure way for real-time image encryption and transmission from the cryptographic viewpoint

  19. Mapping shape to visuomotor mapping: learning and generalisation of sensorimotor behaviour based on contextual information.

    Directory of Open Access Journals (Sweden)

    Loes C J van Dam

    2015-03-01

    Full Text Available Humans can learn and store multiple visuomotor mappings (dual-adaptation) when feedback for each is provided alternately. Moreover, learned context cues associated with each mapping can be used to switch between the stored mappings. However, little is known about the associative learning between cue and required visuomotor mapping, and how learning generalises to novel but similar conditions. To investigate these questions, participants performed a rapid target-pointing task while we manipulated the offset between visual feedback and movement end-points. The visual feedback was presented with horizontal offsets of different amounts, dependent on the target's shape. Participants thus needed to use different visuomotor mappings between target location and required motor response depending on the target shape in order to "hit" it. The target shapes were taken from a continuous set of shapes, morphed between spiky and circular shapes. After training, we tested participants' performance, without feedback, on different target shapes that had not been learned previously. We compared two hypotheses. First, we hypothesised that participants could (explicitly) extract the linear relationship between target shape and visuomotor mapping and generalise accordingly. Second, using previous findings of visuomotor learning, we developed an (implicit) Bayesian learning model that predicts generalisation that is more consistent with categorisation (i.e. use one mapping or the other). The experimental results show that, although learning the associations requires explicit awareness of the cues' role, participants apply the mapping corresponding to the trained shape that is most similar to the current one, consistent with the Bayesian learning model. Furthermore, the Bayesian learning model predicts that learning should slow down with increased numbers of training pairs, which was confirmed by the present results. In short, we found a good correspondence between the

  20. [MapDraw: a microsoft excel macro for drawing genetic linkage maps based on given genetic linkage data].

    Science.gov (United States)

    Liu, Ren-Hu; Meng, Jin-Ling

    2003-05-01

    MAPMAKER is one of the most widely used computer software packages for constructing genetic linkage maps. However, the PC version, MAPMAKER 3.0 for PC, cannot draw the genetic linkage maps that its Macintosh version, MAPMAKER 3.0 for Macintosh, is able to draw. Especially in recent years, the Macintosh computer has become much less popular than the PC, and most geneticists use a PC to analyze their genetic linkage data. A new program that can draw on a PC the same genetic linkage maps that MAPMAKER for Macintosh draws on a Macintosh has therefore long been needed. Microsoft Excel, one component of the Microsoft Office package, is one of the most popular programs for laboratory data processing, and Microsoft Visual Basic for Applications (VBA) is one of the most powerful functions of Microsoft Excel. Using this programming language, we can take creative control of Excel, including genetic linkage map construction, automatic data processing and more. In this paper, a Microsoft Excel macro called MapDraw is constructed to draw genetic linkage maps on a PC based on given genetic linkage data. Using this software, you can construct genetic linkage maps in Excel and freely edit and copy them to Word or other applications. The software is just an Excel-format file. You can freely copy it from ftp://211.69.140.177 or ftp://brassica.hzau.edu.cn, and the source code can be found in Excel's Visual Basic Editor.

  1. Point source detection using the Spherical Mexican Hat Wavelet on simulated all-sky Planck maps

    Science.gov (United States)

    Vielva, P.; Martínez-González, E.; Gallegos, J. E.; Toffolatti, L.; Sanz, J. L.

    2003-09-01

    We present an estimation of the point source (PS) catalogue that could be extracted from the forthcoming ESA Planck mission data. We have applied the Spherical Mexican Hat Wavelet (SMHW) to simulated all-sky maps that include cosmic microwave background (CMB), Galactic emission (thermal dust, free-free and synchrotron), thermal Sunyaev-Zel'dovich effect and PS emission, as well as instrumental white noise. This work is an extension of the one presented in Vielva et al. We have developed an algorithm focused on a fast local determination of the optimal scale, which is crucial to achieve a PS catalogue with a large number of detections and a low flux limit. An important effort has also been made to reduce the CPU time required for the spherical harmonic transforms, in order to perform the PS detection in a reasonable time. The presented algorithm is able to provide a PS catalogue above the following fluxes: 0.48 Jy (857 GHz), 0.49 Jy (545 GHz), 0.18 Jy (353 GHz), 0.12 Jy (217 GHz), 0.13 Jy (143 GHz), 0.16 Jy (100 GHz HFI), 0.19 Jy (100 GHz LFI), 0.24 Jy (70 GHz), 0.25 Jy (44 GHz) and 0.23 Jy (30 GHz). We detect around 27 700 PS at the highest frequency Planck channel and 2900 at the 30-GHz one. The completeness levels are: 70 per cent (857 GHz), 75 per cent (545 GHz), 70 per cent (353 GHz), 80 per cent (217 GHz), 90 per cent (143 GHz), 85 per cent (100 GHz HFI), 80 per cent (100 GHz LFI), 80 per cent (70 GHz), 85 per cent (44 GHz) and 80 per cent (30 GHz). In addition, we can find several PS at different channels, allowing the study of the spectral behaviour and the physical processes acting on them. We also present the basic procedure to apply the method to maps convolved with asymmetric beams. The algorithm takes ~72 h for the most CPU time-demanding channel (857 GHz) on a Compaq HPC320 (Alpha EV68 1-GHz processor) and requires 4 GB of RAM memory; the CPU time scales as O[N_Ro N_pix^(3/2) log(N_pix)], where N_pix is the number of pixels in the map and N_Ro is the number of optimal scales needed.
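
    On small flat patches the Mexican hat wavelet is, up to normalisation, the negative Laplacian of a Gaussian, so the filtering step can be approximated as in the toy sketch below. This is a flat-patch analogue for illustration only, not the SMHW pipeline on the sphere; the injected sources, scales and thresholds are arbitrary.

      import numpy as np
      from scipy import ndimage

      def mhw_detect(patch, scale_pix, nsigma=5.0):
          """Filter a flat sky patch with a Mexican-hat-like kernel and return peak positions."""
          filtered = -ndimage.gaussian_laplace(patch.astype(float), sigma=scale_pix)
          thresh = nsigma * np.std(filtered)
          peaks = (filtered == ndimage.maximum_filter(filtered, size=5)) & (filtered > thresh)
          return np.argwhere(peaks)                    # pixel coordinates of candidate sources

      if __name__ == "__main__":
          sky = np.random.normal(0.0, 1.0, (512, 512))   # white-noise stand-in for CMB + noise
          sky[100, 200] += 200.0                         # two bright synthetic "point sources"
          sky[300, 50] += 200.0
          sky = ndimage.gaussian_filter(sky, 2.0)        # crude beam smoothing
          print(mhw_detect(sky, scale_pix=2.0))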

  2. Sensors Fusion based Online Mapping and Features Extraction of Mobile Robot in the Road Following and Roundabout

    International Nuclear Information System (INIS)

    Ali, Mohammed A H; Yussof, Wan Azhar B.; Hamedon, Zamzuri B; Yussof, Zulkifli B.; Majeed, Anwar P P; Mailah, Musa

    2016-01-01

    A road feature extraction based mapping system using a sensor fusion technique for mobile robot navigation in road environments is presented in this paper. The online mapping of the mobile robot is performed continuously in the road environment to find the road properties that enable the robot to move from a certain start position to a pre-determined goal while discovering and detecting roundabouts. The sensor fusion, involving a laser range finder, a camera and odometry installed on a new platform, is used to find the path of the robot and localize it within its environment. Local maps are developed using the camera and laser range finder to recognize road border parameters such as road width, curbs and roundabouts. Results show the capability of the robot, with the proposed algorithms, to effectively identify the road environment and build a local map for road following and roundabouts. (paper)

  3. Template based rodent brain extraction and atlas mapping.

    Science.gov (United States)

    Weimin Huang; Jiaqi Zhang; Zhiping Lin; Su Huang; Yuping Duan; Zhongkang Lu

    2016-08-01

    Accurate rodent brain extraction is the basic step for many translational studies using MR imaging. This paper presents a template-based approach with multi-expert refinement for automatic rodent brain extraction. We first build the brain appearance model based on the learning exemplars. Together with template matching, we encode the rodent brain position into the search space to reliably locate the rodent brain and estimate a rough segmentation. With the initial mask, a level-set segmentation and a mask-based template learning are then applied to the brain region. Multi-expert fusion is used to generate a new mask. Finally, we combine region growing, based on the learned histogram distribution, to delineate the final brain mask. A high-resolution rodent atlas is used to illustrate that the segmented low-resolution anatomic image can be well mapped to the atlas. Tested on a public data set, all brains are located reliably and we achieve a mean Jaccard similarity score of 94.99% for brain segmentation, which is a statistically significant improvement compared to two other rodent brain extraction methods.

  4. Finger Vein Recognition Based on Personalized Weight Maps

    Science.gov (United States)

    Yang, Gongping; Xiao, Rongyang; Yin, Yilong; Yang, Lu

    2013-01-01

    Finger vein recognition is a promising biometric recognition technology, which verifies identities via the vein patterns in the fingers. Binary pattern based methods have been thoroughly studied in order to cope with the difficulties of extracting the blood vessel network. However, current binary pattern based finger vein matching methods treat every bit of the feature codes derived from different images of various individuals as equally important and assign the same weight value to them. In this paper, we propose a finger vein recognition method based on personalized weight maps (PWMs). Different bits have different weight values according to their stabilities in a certain number of training samples from an individual. We first present the concept of the PWM, and then propose the finger vein recognition framework, which mainly consists of preprocessing, feature extraction, and matching. Finally, we design extensive experiments to evaluate the effectiveness of our proposal. Experimental results show that PWM achieves not only better performance, but also high robustness and reliability. In addition, PWM can be used as a general framework for binary pattern based recognition. PMID:24025556

  5. Finger Vein Recognition Based on Personalized Weight Maps

    Directory of Open Access Journals (Sweden)

    Lu Yang

    2013-09-01

    Full Text Available Finger vein recognition is a promising biometric recognition technology, which verifies identities via the vein patterns in the fingers. Binary pattern based methods have been thoroughly studied in order to cope with the difficulties of extracting the blood vessel network. However, current binary pattern based finger vein matching methods treat every bit of the feature codes derived from different images of various individuals as equally important and assign the same weight value to them. In this paper, we propose a finger vein recognition method based on personalized weight maps (PWMs). Different bits have different weight values according to their stabilities in a certain number of training samples from an individual. We first present the concept of the PWM, and then propose the finger vein recognition framework, which mainly consists of preprocessing, feature extraction, and matching. Finally, we design extensive experiments to evaluate the effectiveness of our proposal. Experimental results show that PWM achieves not only better performance, but also high robustness and reliability. In addition, PWM can be used as a general framework for binary pattern based recognition.
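
    The personalized-weight-map idea in both versions of this record can be sketched as a bit-stability weighting applied to a Hamming-type matcher. The exact weight definition and matching rule used in the paper may differ; the functions and toy data below are illustrative only.

      import numpy as np

      def personalized_weight_map(train_codes):
          """Bits that are stable across an individual's training codes get high weight."""
          train_codes = np.asarray(train_codes, dtype=float)   # shape (n_samples, n_bits)
          p = train_codes.mean(axis=0)                         # fraction of ones per bit
          return np.abs(2.0 * p - 1.0)                         # 1 = perfectly stable, 0 = random

      def weighted_hamming(code_a, code_b, weights):
          """Weighted Hamming distance between two binary feature codes."""
          diff = (np.asarray(code_a) != np.asarray(code_b)).astype(float)
          return float(np.sum(weights * diff) / (np.sum(weights) + 1e-12))

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          enrol = rng.integers(0, 2, size=(5, 64))      # five training codes of one finger
          w = personalized_weight_map(enrol)
          probe = enrol[0].copy()
          probe[:4] ^= 1                                # probe with a few flipped bits
          print(weighted_hamming(enrol[0], probe, w))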

  6. Projector primary-based optimization for superimposed projection mappings

    Science.gov (United States)

    Ahmed, Bilal; Lee, Jong Hun; Lee, Yong Yi; Lee, Kwan H.

    2018-01-01

    Recently, many researchers have focused on fully overlapping projections for three-dimensional (3-D) projection mapping systems, but reproducing a high-quality appearance using this technology still remains a challenge. On top of existing color compensation-based methods, much effort is still required to faithfully reproduce an appearance that is free from artifacts, colorimetric inconsistencies, and inappropriate illuminance over the 3-D projection surface. According to our observation, this is because overlapping projections are usually treated as an additive-linear mixture of color, which our elaborated observations show is not the case. We propose a method that enables us to use high-quality appearance data measured from original objects and regenerate the same appearance by projecting optimized images using multiple projectors, ensuring that the projection-rendered results look visually close to the real object. We prepare our target appearances by photographing original objects. Then, using calibrated projector-camera pairs, we compensate for missing geometric correspondences to make our method robust against noise. The heart of our method is a target appearance-driven adaptive sampling of the projection surface, followed by a representation of the overlapping projections in terms of the projector-primary response. This yields projector-primary weights that facilitate blending, and the system is applied with constraints. These samples are used to populate a light transport-based system, which is then solved by minimizing the error, utilizing inter-sample overlaps to obtain the projection images in a noise-free manner. We ensure that we make the best use of the available hardware resources to recreate projection-mapped appearances that look as close to the original object as possible. Our experiments show compelling results in terms of visual similarity and colorimetric error.

  7. Reset Tree-Based Optical Fault Detection

    Directory of Open Access Journals (Sweden)

    Howon Kim

    2013-05-01

    Full Text Available In this paper, we present a new reset tree-based scheme to protect cryptographic hardware against optical fault injection attacks. As one of the most powerful invasive attacks on cryptographic hardware, optical fault attacks cause semiconductors to misbehave by injecting high-energy light into a decapped integrated circuit. The contaminated result from the affected chip is then used to reveal secret information, such as a key, from the cryptographic hardware. Since the advent of such attacks, various countermeasures have been proposed. Although most of these countermeasures are strong, there is still the possibility of attack. In this paper, we present a novel optical fault detection scheme that utilizes the buffers on a circuit’s reset signal tree as a fault detection sensor. To evaluate our proposal, we model radiation-induced currents into circuit components and perform a SPICE simulation. The proposed scheme is expected to be used as a supplemental security tool.

  8. Frequency Based Fault Detection in Wind Turbines

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob

    2014-01-01

    In order to obtain a lower cost of energy for wind turbines, fault detection and accommodation are important. Expensive condition monitoring systems are often used to monitor the condition of rotating and vibrating system parts. One example is the gearbox in a wind turbine. Such a system is operated in parallel to the control system, using different computers and additional, often expensive, sensors. In this paper a simple filter based algorithm is proposed to detect changes in a resonance frequency in a system, exemplified with faults resulting in changes in the resonance frequency of the wind turbine gearbox. Only the generator speed measurement, which is available in even simple wind turbine control systems, is used as input. Consequently this proposed scheme does not need additional sensors and computers for monitoring the condition of the wind turbine gearbox. The scheme is evaluated on a wide-spread wind
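
    As a toy analogue of the filter-based idea, assuming only a sampled generator-speed signal, one can band-pass around the expected resonance and track the band energy; a persistent loss of energy in that band hints at a shifted resonance. The estimator, frequencies and thresholds used in the paper are not reproduced here.

      import numpy as np
      from scipy import signal

      def band_energy(speed, fs, f_lo, f_hi):
          """Mean energy of the generator-speed signal inside a frequency band (Hz)."""
          b, a = signal.butter(4, [f_lo, f_hi], btype="bandpass", fs=fs)
          filtered = signal.filtfilt(b, a, speed)
          return float(np.mean(filtered ** 2))

      if __name__ == "__main__":
          fs = 100.0
          t = np.arange(0, 60, 1 / fs)
          healthy = np.sin(2 * np.pi * 11.0 * t) + 0.2 * np.random.randn(t.size)
          faulty = np.sin(2 * np.pi * 9.5 * t) + 0.2 * np.random.randn(t.size)  # shifted resonance
          # Energy in the nominal 10.5-11.5 Hz band drops when the resonance moves away.
          print(band_energy(healthy, fs, 10.5, 11.5), band_energy(faulty, fs, 10.5, 11.5))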

  9. Comparing registration methods for mapping brain change using tensor-based morphometry.

    Science.gov (United States)

    Yanovsky, Igor; Leow, Alex D; Lee, Suh; Osher, Stanley J; Thompson, Paul M

    2009-10-01

    Measures of brain changes can be computed from sequential MRI scans, providing valuable information on disease progression for neuroscientific studies and clinical trials. Tensor-based morphometry (TBM) creates maps of these brain changes, visualizing the 3D profile and rates of tissue growth or atrophy. In this paper, we examine the power of different nonrigid registration models to detect changes in TBM, and their stability when no real changes are present. Specifically, we investigate an asymmetric version of a recently proposed Unbiased registration method, using mutual information as the matching criterion. We compare matching functionals (sum of squared differences and mutual information), as well as large-deformation registration schemes (viscous fluid and inverse-consistent linear elastic registration methods versus Symmetric and Asymmetric Unbiased registration) for detecting changes in serial MRI scans of 10 elderly normal subjects and 10 patients with Alzheimer's Disease scanned at 2-week and 1-year intervals. We also analyzed registration results when matching images corrupted with artificial noise. We demonstrated that the unbiased methods, both symmetric and asymmetric, have higher reproducibility. The unbiased methods were also less likely to detect changes in the absence of any real physiological change. Moreover, they measured biological deformations more accurately by penalizing bias in the corresponding statistical maps.

  10. Damage detection on mesosurfaces using distributed sensor network and spectral diffusion maps

    International Nuclear Information System (INIS)

    Chinde, V; Vaidya, U; Laflamme, S; Cao, L

    2016-01-01

In this work, we develop a data-driven method for the diagnosis of damage in mesoscale mechanical structures using an array of distributed sensor networks. The proposed approach relies on comparing intrinsic geometries of data sets corresponding to the undamaged and damaged states of the system. We use a spectral diffusion map approach to identify the intrinsic geometry of the data set. In particular, time series data from distributed sensors are used for the construction of diffusion maps. The low dimensional embedding of the data set corresponding to different damage levels is obtained using a singular value decomposition of the diffusion map. We construct appropriate metrics in the diffusion space to compare the different data sets corresponding to different damage cases. The developed algorithm is applied for damage diagnosis of wind turbine blades. To achieve this goal, we developed a detailed finite element-based model of the CX-100 blade in ANSYS using shell elements. Typical damage, such as a crack or delamination, which leads to a loss of stiffness, is modeled by altering the stiffness of the laminate layer. One of the main challenges in the development of health monitoring algorithms is the ability to use sensor data with a relatively small signal-to-noise ratio. Our developed diffusion map-based algorithm is shown to be robust to the presence of sensor noise. The proposed diffusion map-based algorithm is advantageous in that it enables the comparison of data from numerous sensors of similar or different types through data fusion, thereby making it attractive to exploit the distributed nature of sensor arrays. This distributed nature is further exploited for the purpose of damage localization. We perform extensive numerical simulations to demonstrate that the proposed method can successfully determine the extent of damage on the wind turbine blade and also localize the damage. We also present preliminary results for the application of the developed algorithm on
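
As an illustration of the diffusion-map step (not the paper's implementation), the sketch below embeds feature vectors from two assumed data sets with a Gaussian-kernel diffusion operator and compares the resulting clusters with a simple centroid distance. The kernel scale, feature dimensions and the synthetic "healthy" and "damaged" statistics are assumptions; the paper uses an SVD of the diffusion map, for which the eigendecomposition used here is a close stand-in.

```python
import numpy as np

def diffusion_embedding(X, eps=None, n_coords=2):
    """Spectral diffusion-map embedding of the rows of X (one feature vector
    per sensor/time window).  eps is the Gaussian kernel scale."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    if eps is None:
        eps = np.median(d2)                        # common heuristic for the kernel scale
    K = np.exp(-d2 / eps)
    P = K / K.sum(axis=1, keepdims=True)           # row-stochastic diffusion operator
    w, V = np.linalg.eig(P)
    order = np.argsort(-w.real)
    # Skip the trivial constant eigenvector; scale coordinates by the eigenvalues.
    idx = order[1:n_coords + 1]
    return V.real[:, idx] * w.real[idx]

# Toy "undamaged" vs. "damaged" sensor feature sets (assumed data).
rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, size=(50, 10))
damaged = rng.normal(0.4, 1.2, size=(50, 10))      # shifted statistics mimic stiffness loss
emb = diffusion_embedding(np.vstack([healthy, damaged]))
metric = np.linalg.norm(emb[:50].mean(axis=0) - emb[50:].mean(axis=0))
print(f"centroid distance in diffusion space: {metric:.4f}")
```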

  11. Seep Detection using E/V Nautilus Integrated Seafloor Mapping and Remotely Operated Vehicles on the United States West Coast

    Science.gov (United States)

    Gee, L. J.; Raineault, N.; Kane, R.; Saunders, M.; Heffron, E.; Embley, R. W.; Merle, S. G.

    2017-12-01

Exploration Vessel (E/V) Nautilus has been mapping the seafloor off the west coast of the United States, from Washington to California, for the past three years with a Kongsberg EM302 multibeam sonar. This system simultaneously collects bathymetry, seafloor and water column backscatter data, allowing an integrated approach to mapping that more completely characterizes a region, and has identified over 1,000 seafloor seeps. Hydrographic multibeam sonars like the EM302 were designed for mapping the bathymetry. It is only in the last decade that major mapping projects have included an integrated approach that utilizes the seabed and water column backscatter information in addition to the bathymetry. Nautilus mapping in the Eastern Pacific over the past three years has included a number of seep-specific expeditions, and utilized and adapted the preliminary mapping guidelines that have emerged from research. The likelihood of seep detection is affected by many factors: the environment (seabed geomorphology, surficial sediment, seep location/depth, regional oceanography and biology), the nature of the seeps themselves (size variation, varying flux, depth, and transience), the detection system (the design of hydrographic multibeam sonars limits their use for water column detection), and the platform (variations in the vessel and operations such as noise, speed, and swath overlap). Nautilus integrated seafloor mapping provided multiple indicators of seep locations, but it remains difficult to assess the probability of seep detection. Even when seeps were detected, they have not always been located during ROV dives. However, the presence of associated features (methane hydrate and bacterial mats) serves as evidence of potential seep activity and reinforces the transient nature of the seeps. Not detecting a seep in the water column data does not necessarily indicate that there is not a seep at a given location, but with multiple passes over an area and by the use of other contextual data, an area may

  12. Land degradation mapping based on hyperion data in desertification region of northwest China

    Science.gov (United States)

    Cheng, Penggen; Wu, Jian; Ouyang, Ping; He, Ting

    2008-10-01

Desertification is an alarming sign of land degradation in Hengshan County of northwest China. Due to the considerable costs of detailed ground surveys of this phenomenon, remote sensing is an appropriate alternative for analyzing and evaluating the risks of the expansion of land degradation. Degradation features can be detected directly or indirectly by using image data. In this paper, based on the Hyperion images of the Hengshan desertification region of northwest China, a new algorithm aimed at land degradation mapping, called the Land Degradation Index (LDI), was put forward. This new algorithm is based on a classification process. We applied the linear spectral unmixing algorithm with the training samples derived from the former classification so as to find new endmembers in the RMS error image. After that, using neural net mapping with the new training samples, the classified result was obtained. In addition, after applying mask processing, the soils were grouped into 3 types (Kappa = 0.90): highly degraded soils, moderately degraded soils and slightly degraded soils. By analyzing 3 mapping methods (mixture-classification, the spectral angle mapper and mixture-tuned matched filtering), the results suggest that the mixture-classification has higher accuracy (Kappa = 0.7075) than the spectral angle mapper (Kappa = 0.5418) and the mixture-tuned matched filter (Kappa = 0.6039). As a result, the mixture-classification was selected to carry out the Land Degradation Index analysis.

  13. Analysis of individual brain activation maps using hierarchical description and multiscale detection

    International Nuclear Information System (INIS)

    Poline, J.B.; Mazoyer, B.M.

    1994-01-01

    The authors propose a new method for the analysis of brain activation images that aims at detecting activated volumes rather than pixels. The method is based on Poisson process modeling, hierarchical description, and multiscale detection (MSD). Its performances have been assessed using both Monte Carlo simulated images and experimental PET brain activation data. As compared to other methods, the MSD approach shows enhanced sensitivity with a controlled overall type I error, and has the ability to provide an estimate of the spatial limits of the detected signals. It is applicable to any kind of difference image for which the spatial autocorrelation function can be approximated by a stationary Gaussian function

  14. Spatio Temporal Detection and Virtual Mapping of Landslide Using High-Resolution Airborne Laser Altimetry (lidar) in Densely Vegetated Areas of Tropics

    Science.gov (United States)

    Bibi, T.; Azahari Razak, K.; Rahman, A. Abdul; Latif, A.

    2017-10-01

Landslides are an inescapable natural disaster, resulting in massive social, environmental and economic impacts all over the world. The tropical, mountainous landscape throughout Malaysia, especially in the eastern peninsula (Borneo), is highly susceptible to landslides because of heavy rainfall and tectonic disturbances. The purpose of landslide hazard mapping is to identify hazardous regions for the execution of mitigation plans which can reduce the loss of life and property from future landslide incidences. Currently, Malaysian research bodies, e.g. academic institutions and government agencies, are trying to develop a landslide hazard and risk database for susceptible areas to support prevention, mitigation, and evacuation planning. However, there is a lack of attention to landslide inventory mapping as an elementary input of landslide susceptibility, hazard and risk mapping. Developing techniques based on remote sensing technologies (satellite, terrestrial and airborne) are promising for accelerating the production of landslide maps, shrinking the time and resources essential for their compilation and orderly updates. The aim of the study is to provide a better perception of the virtual mapping of landslides with the help of LiDAR technology. The focus of the study is spatio-temporal detection and virtual mapping of a landslide inventory via visualization and interpretation of very high-resolution (VHR) data in the forested terrain of the Mesilau river, Kundasang. To cope with the challenges of virtual inventory mapping in forested terrain, high-resolution LiDAR derivatives are used. This study shows that airborne LiDAR technology can be an effective tool for mapping landslide inventories in complex climatic and geological conditions, and a quick way of mapping regional hazards in the tropics.

  15. COMPARISON of FUZZY-BASED MODELS in LANDSLIDE HAZARD MAPPING

    Directory of Open Access Journals (Sweden)

    N. Mijani

    2017-09-01

Full Text Available Landslide is one of the main geomorphic processes which affects development prospects in mountainous areas and causes disastrous accidents. A landslide is an event governed by different uncertain criteria such as altitude, slope, aspect, land use, vegetation density, precipitation, distance from the river and distance from the road network. This research aims to compare and evaluate different fuzzy-based models including the Fuzzy Analytic Hierarchy Process (Fuzzy-AHP), Fuzzy Gamma and Fuzzy-OR. The main contribution of this paper is the comprehensive treatment of the criteria causing landslide hazard, considering their uncertainties, and the comparison of different fuzzy-based models. The evaluation is quantified by the Density Ratio (DR) and Quality Sum (QS). The proposed methodology was implemented in Sari, a city of Iran which has faced multiple landslide accidents in recent years due to its particular environmental conditions. The accuracy assessment based on these quantifiers demonstrated that the Fuzzy-AHP model has higher accuracy than the other two models in landslide hazard zonation. The accuracy of the zoning obtained from the Fuzzy-AHP model is 0.92 and 0.45 based on the Precision (P) and QS indicators, respectively. Based on the obtained landslide hazard maps, Fuzzy-AHP, Fuzzy Gamma and Fuzzy-OR respectively cover 13, 26 and 35 percent of the study area with a very high risk level. Based on these findings, the Fuzzy-AHP model has been selected as the most appropriate method for landslide zoning in the city of Sari, with the Fuzzy Gamma method, by a minor difference, in second place.
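
The record names the Fuzzy Gamma and Fuzzy-OR operators without giving their parameters, so the sketch below only illustrates the standard definitions of these operators applied to fuzzified criterion layers. The gamma value and the toy rasters are assumptions, not the study's settings.

```python
import numpy as np

def fuzzy_or(layers):
    """Fuzzy-OR combination: cell-wise maximum of the membership layers."""
    return np.max(layers, axis=0)

def fuzzy_gamma(layers, gamma=0.9):
    """Fuzzy Gamma operator: (fuzzy algebraic sum)**gamma * (fuzzy algebraic
    product)**(1 - gamma).  gamma = 0.9 is a placeholder, not the study's value."""
    prod = np.prod(layers, axis=0)                  # fuzzy algebraic product
    asum = 1.0 - np.prod(1.0 - layers, axis=0)      # fuzzy algebraic sum
    return asum ** gamma * prod ** (1.0 - gamma)

# Three toy fuzzified criterion rasters (e.g., slope, distance from river, land use).
rng = np.random.default_rng(1)
layers = rng.uniform(0.0, 1.0, size=(3, 4, 4))
print("Fuzzy-OR hazard map:\n", fuzzy_or(layers).round(2))
print("Fuzzy Gamma hazard map:\n", fuzzy_gamma(layers).round(2))
```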

  16. GPS-based handheld device for mapping contaminated areas

    International Nuclear Information System (INIS)

    Paridaens, J.

    2005-01-01

Sometimes one is confronted with the challenge of mapping large areas with enhanced radioactivity. Examples are mine tailings or waste rock piles, deposits of the phosphate industry, flooding zones contaminated by effluents of plants processing ores containing enhanced natural radiation, nuclear accident sites, etc. Car-borne measuring equipment is not always an option, as the terrain might be rough and only accessible by foot. Airborne mapping with helicopters on the other hand is fast, but expensive, not readily available, shows difficulties with complex topography and lacks the necessary detail. The objective of this study was to create a portable and easily usable tool for the real-time logging of radiation and location data, allowing the radioactivity to be mapped by simply walking over any kind of terrain with the portable equipment and post-processing the data in the office. We also assessed the performance of the GPS-based system on contaminated sites with areas varying from less than a hectare to several tens of hectares, with respect to speed, precision and ease of use. At sites of large scale mining and processing of uranium ore, tailings and waste rock piles are today the most visible relics of the uranium extractive industry. These mining relics are constantly subjected to weathering and leaching processes causing the dissemination of radioactive and toxic elements and sometimes requiring remedial operations. The in situ remediation of waste rock piles usually includes their revegetation for minimizing the water infiltration and for increasing surface soil stability. Thanks to its biomass density and longevity, the perennial vegetation plays an important role in the stabilisation of water cycling. The buffer role of forest vegetation can reduce water export from watersheds as well as erosion and hydrological losses of chemicals, including radionuclides, from contaminated sites. If long term reduction of contaminant dispersion at revegetated uranium mining sites is

  17. Waveguide-Based Biosensors for Pathogen Detection

    Directory of Open Access Journals (Sweden)

    Nile Hartman

    2009-07-01

Full Text Available Optical phenomena such as fluorescence, phosphorescence, polarization, interference and non-linearity have been extensively used for biosensing applications. Optical waveguides (both planar and fiber-optic) are comprised of a material with high permittivity/high refractive index surrounded on all sides by materials with lower refractive indices, such as a substrate and the media to be sensed. This arrangement allows coupled light to propagate through the high refractive index waveguide by total internal reflection and generates an electromagnetic wave—the evanescent field—whose amplitude decreases exponentially as the distance from the surface increases. Excitation of fluorophores within the evanescent wave allows for sensitive detection while minimizing background fluorescence from complex, “dirty” biological samples. In this review, we will describe the basic principles, advantages and disadvantages of planar optical waveguide-based biodetection technologies. This discussion will include already commercialized technologies (e.g., Corning’s EPIC®, SRU Biosystems’ BIND™, Zeptosense®, etc.) and new technologies that are under research and development. We will also review differing assay approaches for the detection of various biomolecules, as well as the thin-film coatings that are often required for waveguide functionalization and effective detection. Finally, we will discuss reverse-symmetry waveguides, resonant waveguide grating sensors and metal-clad leaky waveguides as alternative signal transducers in optical biosensing.
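
As a worked illustration of the exponential decay mentioned above (not part of the original record), the snippet below evaluates the commonly quoted expression for the 1/e penetration depth of the evanescent field amplitude under total internal reflection; conventions differ by a factor of two between field and intensity, and the refractive indices, wavelength and angle used here are illustrative assumptions.

```python
import numpy as np

def penetration_depth_nm(wavelength_nm, n_waveguide, n_cover, theta_deg):
    """1/e penetration depth of the evanescent field amplitude for total internal
    reflection at angle theta (measured from the surface normal)."""
    theta = np.radians(theta_deg)
    root = np.sqrt(n_waveguide ** 2 * np.sin(theta) ** 2 - n_cover ** 2)
    return wavelength_nm / (2.0 * np.pi * root)

# Example: a high-index waveguide (n = 1.80) against an aqueous sample (n = 1.33),
# probed at 633 nm with a 75 degree internal angle (all values assumed).
print(f"penetration depth ~ {penetration_depth_nm(633, 1.80, 1.33, 75):.0f} nm")
```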

  18. DNA & Protein detection based on microbead agglutination

    KAUST Repository

    Kodzius, Rimantas

    2012-06-06

We report a simple and rapid room temperature assay for point-of-care (POC) testing that is based on specific agglutination. Agglutination tests are based on aggregation of microparticles in the presence of a specific analyte, thus enabling macroscopic observation. Agglutination-based tests are most often used to explore antibody-antigen reactions. Agglutination has been used for model protein assays using a biotin/streptavidin two-component system, as well as a hybridization-based two-component assay; however, as our work shows, two-component systems are prone to self-termination of the linking analyte and thus have a lower sensitivity. Three-component systems have also been used with DNA hybridization, as in our work; however, their assay requires 48 hours for incubation, while our assay is performed in 5 minutes, making it a real candidate for POC testing. We demonstrate three assays: a two-component biotin/streptavidin assay, a three-component hybridization assay using single-stranded DNA (ssDNA) molecules and a stepped three-component hybridization assay. The comparison of these three assays shows our simple stepped three-component agglutination assay to be rapid at room temperature and more sensitive than the two-component version by an order of magnitude. An agglutination assay was also performed in a PDMS microfluidic chip where agglutinated beads were trapped by filter columns for easy observation. We developed a rapid (5 minute) room temperature assay, which is based on microbead agglutination. Our three-component assay solves the linker self-termination issue, allowing an order of magnitude increase in sensitivity over two-component assays. Our stepped version of the three-component assay solves the issue of probe site saturation, thus enabling a wider range of detection. Detection of the agglutinated beads with the naked eye by trapping in microfluidic channels has been shown.

  19. GeneRecon Users' Manual — A coalescent based tool for fine-scale association mapping

    DEFF Research Database (Denmark)

    Mailund, T

    2006-01-01

GeneRecon is a software package for linkage disequilibrium mapping using coalescent theory. It is based on a Bayesian Markov chain Monte Carlo (MCMC) method for fine-scale linkage-disequilibrium gene mapping using high-density marker maps. GeneRecon explicitly models the genealogy of a sample of th...

  20. An Electronic Map Data Model Based on PDF

    Science.gov (United States)

    Zhou, Xiaodong; Yang, Chuncheng; Meng, Nina; Peng, Peng

    2018-05-01

In this paper, we propose the PDFEMAP (PDF electronic map), a new kind of electronic map product aimed at the current situation and demands of electronic map use. We first give the definition and characteristics of the PDFEMAP, followed by a detailed description of the data model and the method for generating a PDFEMAP, and finally describe application modes of the PDFEMAP, whose feasibility and effectiveness are verified.

  1. Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery

    Science.gov (United States)

    Gao, Junfeng; Liao, Wenzhi; Nuyttens, David; Lootens, Peter; Vangeyte, Jürgen; Pižurica, Aleksandra; He, Yong; Pieters, Jan G.

    2018-05-01

The developments in the use of unmanned aerial vehicles (UAVs) and advanced imaging sensors provide new opportunities for ultra-high resolution (e.g., less than a 10 cm ground sampling distance (GSD)) crop field monitoring and mapping in precision agriculture applications. In this study, we developed a strategy for inter- and intra-row weed detection in early season maize fields from aerial visual imagery. More specifically, the Hough transform algorithm (HT) was applied to the orthomosaicked images for inter-row weed detection. A semi-automatic Object-Based Image Analysis (OBIA) procedure was developed with Random Forests (RF) combined with feature selection techniques to classify soil, weeds and maize. Furthermore, the two binary weed masks generated from HT and OBIA were fused to obtain an accurate binary weed image. The developed RF classifier was evaluated by 5-fold cross validation, and it obtained an overall accuracy of 0.945 and a Kappa value of 0.912. Finally, the relationship between detected weeds and their ground truth densities was quantified by a fitted linear model with a coefficient of determination of 0.895 and a root mean square error of 0.026. In addition, the importance of input features was evaluated, and it was found that the ratio of vegetation length to width was the most significant feature for the classification model. Overall, our approach can yield a satisfactory weed map, and we expect that the obtained accurate and timely weed map from UAV imagery will be applicable to realize site-specific weed management (SSWM) in early season crop fields for reducing the spraying of non-selective herbicides and costs.
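
The inter-row step is based on the Hough transform; the sketch below (an illustration, not the authors' code) builds an excess-green vegetation mask, extracts edges and detects dominant row lines with OpenCV's Hough transform. The ExG threshold, Canny parameters and vote threshold are assumed values, and the OBIA/Random Forests stage is not reproduced.

```python
import cv2
import numpy as np

def crop_row_lines(bgr_image, exg_thresh=0.1, votes=150):
    """Detect dominant crop-row lines: excess-green vegetation mask, edge map,
    then a standard Hough line transform (all thresholds are assumed values)."""
    b, g, r = cv2.split(bgr_image.astype(np.float32) / 255.0)
    exg = 2 * g - r - b                                      # excess-green index
    mask = (exg > exg_thresh).astype(np.uint8) * 255
    edges = cv2.Canny(mask, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, votes)
    return mask, ([] if lines is None else lines[:, 0, :])   # (rho, theta) pairs

# Quick self-test on a synthetic tile with three vertical "crop rows".
tile = np.zeros((200, 200, 3), np.uint8)
for col in (40, 100, 160):
    tile[:, col:col + 8, 1] = 255
mask, lines = crop_row_lines(tile)
print(f"{len(lines)} candidate row lines detected")
```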

  2. Assessing the MODIS crop detection algorithm for soybean crop area mapping and expansion in the Mato Grosso state, Brazil.

    Science.gov (United States)

    Gusso, Anibal; Arvor, Damien; Ducati, Jorge Ricardo; Veronez, Mauricio Roberto; da Silveira, Luiz Gonzaga

    2014-01-01

Estimations of crop area were made based on the temporal profiles of the Enhanced Vegetation Index (EVI) obtained from moderate resolution imaging spectroradiometer (MODIS) images. Evaluation of the ability of the MODIS crop detection algorithm (MCDA) to estimate soybean crop areas was performed for fields in the Mato Grosso state, Brazil. Using the MCDA approach, soybean crop area estimations can be provided for December (first forecast) using images from the sowing period and for February (second forecast) using images from the sowing period and the maximum crop development period. The area estimates were compared to official agricultural statistics from the Brazilian Institute of Geography and Statistics (IBGE) and from the National Company of Food Supply (CONAB) at different crop levels from 2000/2001 to 2010/2011. At the municipality level, the estimates were highly correlated, with R² = 0.97 and RMSD = 13,142 ha. The MCDA was validated using field campaign data from the 2006/2007 crop year. The overall map accuracy was 88.25%, and the Kappa Index of Agreement was 0.765. By using pre-defined parameters, MCDA is able to provide the evolution of annual soybean maps, forecast of soybean cropping areas, and the crop area expansion in the Mato Grosso state.

  3. Detecting Buried Archaeological Remains by the Use of Geophysical Data Processing with 'Diffusion Maps' Methodology

    Science.gov (United States)

    Eppelbaum, Lev

    2015-04-01

Geophysical methods are prompt, non-invasive and low-cost tools for quantitative delineation of buried archaeological targets. However, taking into account the complexity of geological-archaeological media, some unfavourable environments and the known ambiguity of geophysical data analysis, a single geophysical method examination might be insufficient (Khesin and Eppelbaum, 1997). Besides this, it is well-known that the majority of inverse-problem solutions in geophysics are ill-posed (e.g., Zhdanov, 2002), which means, according to Hadamard (1902), that the solution does not exist, or is not unique, or is not a continuous function of the observed geophysical data (small perturbations in the observations may cause arbitrary mistakes in the solution). This fact motivates the wide application of informational, probabilistic and wavelet methodologies in archaeological geophysics (Eppelbaum, 2014a). The goal of modern geophysical data examination is to detect the geophysical signatures of buried targets in noisy areas via the analysis of some physical parameters with a minimal number of false alarms and missed detections (Eppelbaum et al., 2011; Eppelbaum, 2014b). The proposed wavelet approach to the recognition of archaeological targets (AT) by the examination of geophysical method integration consists of advanced processing of each geophysical method and nonconventional integration of the different geophysical methods between themselves. The recently developed technique of diffusion clustering combined with the abovementioned wavelet methods was utilized to integrate the geophysical data and detect existing irregularities. The approach is based on wavelet packet techniques applied to the geophysical images (or graphs) versus coordinates. Practically all geophysical methods (magnetic, gravity, seismic, GPR, ERT, self-potential, etc.) may be utilized for such an analysis. In the first stage of the proposed investigation a few tens of typical physical-archaeological models (PAM

  4. Direct 13C-detected NMR experiments for mapping and characterization of hydrogen bonds in RNA

    International Nuclear Information System (INIS)

    Fürtig, Boris; Schnieders, Robbin; Richter, Christian; Zetzsche, Heidi; Keyhani, Sara; Helmling, Christina; Kovacs, Helena; Schwalbe, Harald

    2016-01-01

In RNA secondary structure determination, it is essential to determine whether a nucleotide is base-paired or not. Base-pairing of nucleotides is mediated by hydrogen bonds. The NMR characterization of hydrogen bonds relies on experiments correlating the NMR resonances of exchangeable protons and can be best performed for structured parts of the RNA, where labile hydrogen atoms are protected from solvent exchange. Functionally important regions in RNA, however, frequently reveal increased dynamic disorder which often leads to NMR signals of exchangeable protons that are broadened beyond 1H detection. Here, we develop 13C direct detected experiments to observe all nucleotides in RNA irrespective of whether they are involved in hydrogen bonds or not. Exploiting the self-decoupling of scalar couplings due to the exchange process, the hydrogen bonding behavior of the hydrogen bond donor of each individual nucleotide can be determined. Furthermore, the adaptation of HNN-COSY experiments for 13C direct detection allows correlations of donor–acceptor pairs and the localization of hydrogen-bond acceptor nucleotides. The proposed 13C direct detected experiments therefore provide information about molecular sites not amenable to conventional proton-detected methods. Such information makes RNA secondary structure determination by NMR more accurate and helps to validate secondary structure predictions based on bioinformatics.

  5. Robust Ground Target Detection by SAR and IR Sensor Fusion Using Adaboost-Based Feature Selection

    Directory of Open Access Journals (Sweden)

    Sungho Kim

    2016-07-01

    Full Text Available Long-range ground targets are difficult to detect in a noisy cluttered environment using either synthetic aperture radar (SAR images or infrared (IR images. SAR-based detectors can provide a high detection rate with a high false alarm rate to background scatter noise. IR-based approaches can detect hot targets but are affected strongly by the weather conditions. This paper proposes a novel target detection method by decision-level SAR and IR fusion using an Adaboost-based machine learning scheme to achieve a high detection rate and low false alarm rate. The proposed method consists of individual detection, registration, and fusion architecture. This paper presents a single framework of a SAR and IR target detection method using modified Boolean map visual theory (modBMVT and feature-selection based fusion. Previous methods applied different algorithms to detect SAR and IR targets because of the different physical image characteristics. One method that is optimized for IR target detection produces unsuccessful results in SAR target detection. This study examined the image characteristics and proposed a unified SAR and IR target detection method by inserting a median local average filter (MLAF, pre-filter and an asymmetric morphological closing filter (AMCF, post-filter into the BMVT. The original BMVT was optimized to detect small infrared targets. The proposed modBMVT can remove the thermal and scatter noise by the MLAF and detect extended targets by attaching the AMCF after the BMVT. Heterogeneous SAR and IR images were registered automatically using the proposed RANdom SAmple Region Consensus (RANSARC-based homography optimization after a brute-force correspondence search using the detected target centers and regions. The final targets were detected by feature-selection based sensor fusion using Adaboost. The proposed method showed good SAR and IR target detection performance through feature selection-based decision fusion on a synthetic
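
Since the record only outlines the fusion stage, the following sketch illustrates decision-level fusion with AdaBoost on concatenated per-candidate SAR and IR features; the feature layout, the synthetic data and the classifier settings are assumptions, and AdaBoost's stump-based weighting stands in for the feature selection described in the paper.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Toy per-candidate features: [SAR score, SAR extent, IR score, IR contrast, two clutter cues].
rng = np.random.default_rng(42)
n = 600
labels = np.hstack([np.ones(n // 2), np.zeros(n // 2)])            # 1 = real target
features = rng.normal(0.0, 1.0, size=(n, 6))
features[labels == 1] += np.array([1.5, 0.8, 1.2, 0.9, 0.0, 0.0])  # informative SAR/IR cues

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, test_size=0.3, random_state=0)
fusion = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

print("fused detection accuracy:", round(fusion.score(X_te, y_te), 3))
print("feature importances (implicit selection):", fusion.feature_importances_.round(3))
```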

  6. Robust Ground Target Detection by SAR and IR Sensor Fusion Using Adaboost-Based Feature Selection

    Science.gov (United States)

    Kim, Sungho; Song, Woo-Jin; Kim, So-Hyun

    2016-01-01

    Long-range ground targets are difficult to detect in a noisy cluttered environment using either synthetic aperture radar (SAR) images or infrared (IR) images. SAR-based detectors can provide a high detection rate with a high false alarm rate to background scatter noise. IR-based approaches can detect hot targets but are affected strongly by the weather conditions. This paper proposes a novel target detection method by decision-level SAR and IR fusion using an Adaboost-based machine learning scheme to achieve a high detection rate and low false alarm rate. The proposed method consists of individual detection, registration, and fusion architecture. This paper presents a single framework of a SAR and IR target detection method using modified Boolean map visual theory (modBMVT) and feature-selection based fusion. Previous methods applied different algorithms to detect SAR and IR targets because of the different physical image characteristics. One method that is optimized for IR target detection produces unsuccessful results in SAR target detection. This study examined the image characteristics and proposed a unified SAR and IR target detection method by inserting a median local average filter (MLAF, pre-filter) and an asymmetric morphological closing filter (AMCF, post-filter) into the BMVT. The original BMVT was optimized to detect small infrared targets. The proposed modBMVT can remove the thermal and scatter noise by the MLAF and detect extended targets by attaching the AMCF after the BMVT. Heterogeneous SAR and IR images were registered automatically using the proposed RANdom SAmple Region Consensus (RANSARC)-based homography optimization after a brute-force correspondence search using the detected target centers and regions. The final targets were detected by feature-selection based sensor fusion using Adaboost. The proposed method showed good SAR and IR target detection performance through feature selection-based decision fusion on a synthetic database generated

  7. Molecular polarization potential maps of the nucleic acid bases

    International Nuclear Information System (INIS)

    Alkorta, I.; Perez, J.J.

    1996-01-01

Ab initio calculations at the SCF level were carried out to compute the polarization potential maps of the nucleic acid bases: cytosine, thymine, uracil, adenine, and guanine. For this purpose, Dunning's 9s5p basis set, contracted to split-valence form, was selected to perform the calculations. The molecular polarization potential (MPP) at each point was evaluated by the difference between the interaction energy of the molecule with a unit point charge and the molecular electrostatic potential (MEP) at that point. MEPs and MPPs for the different molecules were computed with a density of 5 points/Å² on the van der Waals surface of each molecule, defined using the van der Waals radii. Due to the symmetry of the molecules, only half the points were computed. The total number of points calculated was 558 for cytosine, 621 for thymine, 526 for uracil, 666 for adenine, and 699 for guanine. The results of these calculations are analyzed in terms of their implications for the molecular interactions between pairs of nucleic acid bases. 23 refs., 5 figs., 1 tab

  8. Vision-based topological map building and localisation using persistent features

    CSIR Research Space (South Africa)

    Sabatta, DG

    2008-11-01

Full Text Available Vision-based Topological Map... of topological mapping was introduced into the field of robotics following studies of human cognitive mapping undertaken by Kuipers [8]. Since then, much progress has been made in the field of vision-based topological mapping. Topological mapping lends...

  9. Attribute and topology based change detection in a constellation of previously detected objects

    Science.gov (United States)

    Paglieroni, David W.; Beer, Reginald N.

    2016-01-19

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
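
One plausible reading of the comparison step, shown here purely as an illustration, is to match the latest detections to the stored constellation by location and then flag new, attribute-changed, or missing objects; the attribute set (x, y, size) and the distance and size thresholds are assumptions, not the system's actual schema.

```python
import numpy as np
from scipy.spatial import cKDTree

def compare_to_constellation(previous, current, max_dist=2.0, size_tol=0.5):
    """Flag changes between a constellation of previously detected objects and
    the latest scan.  Each object is (x, y, size); thresholds are assumed."""
    tree = cKDTree(previous[:, :2])
    dist, idx = tree.query(current[:, :2])
    matched = dist <= max_dist
    new_objects = current[~matched]
    changed = current[matched][
        np.abs(current[matched, 2] - previous[idx[matched], 2]) > size_tol]
    missing_idx = np.setdiff1d(np.arange(len(previous)), idx[matched])
    return new_objects, changed, previous[missing_idx]

# Toy constellation (previous scans) and latest detections.
previous = np.array([[0.0, 0.0, 3.0], [10.0, 5.0, 2.0], [20.0, 1.0, 4.0]])
current = np.array([[0.2, -0.1, 3.1], [10.1, 5.2, 3.5], [35.0, 2.0, 1.0]])
new, changed, missing = compare_to_constellation(previous, current)
print(f"new: {len(new)}, attribute-changed: {len(changed)}, missing: {len(missing)}")
```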

  10. Overlay improvement by exposure map based mask registration optimization

    Science.gov (United States)

    Shi, Irene; Guo, Eric; Chen, Ming; Lu, Max; Li, Gordon; Li, Rivan; Tian, Eric

    2015-03-01

Along with the increased miniaturization of semiconductor electronic devices, the design rules of advanced semiconductor devices shrink dramatically. [1] One of the main challenges of the lithography step is layer-to-layer overlay control. Furthermore, as DPT (Double Patterning Technology) has been adopted for advanced technology nodes like 28nm and 14nm, the corresponding overlay budget becomes even tighter. [2][3] After in-die mask registration (pattern placement) measurement was introduced, model analysis with a KLA SOV (sources of variation) tool showed that the registration difference between masks is a significant error source of wafer layer-to-layer overlay at the 28nm process. [4][5] Mask registration optimization would therefore greatly improve wafer overlay performance. It was reported that a laser-based registration control (RegC) process could be applied after pattern generation or after pellicle mounting and allows fine tuning of the mask registration. [6] In this paper we propose a novel method of mask registration correction, which can be applied before mask writing based on the mask exposure map, considering the factors of mask chip layout, writing sequence, and pattern density distribution. Our experimental data show that if the pattern density on the mask is kept at a low level, the in-die mask registration residual error (3 sigma) stays under 5nm regardless of the blank type and the related writer POSCOR (position correction) file applied; this indicates that random error induced by material or equipment occupies a relatively fixed error budget as an error source of mask registration. In real production, comparing the mask registration differences across critical production layers reveals that the registration residual error of line/space layers with higher pattern density is always much larger than that of contact hole layers with lower pattern density. Additionally, the mask registration difference between layers with similar pattern density

  11. Non-Markovianity Measure Based on Brukner-Zeilinger Invariant Information for Unital Quantum Dynamical Maps

    Science.gov (United States)

    He, Zhi; Zhu, Lie-Qiang; Li, Li

    2017-03-01

A non-Markovianity measure based on Brukner-Zeilinger invariant information to characterize the non-Markovian effects of open systems undergoing unital dynamical maps is proposed. The method takes advantage of the non-increasing property of the Brukner-Zeilinger invariant information under completely positive and trace-preserving unital maps. The simplicity of computing the Brukner-Zeilinger invariant information, which mainly depends on the purity of the quantum state, is an advantage of the proposed measure. The measure effectively captures the characteristics of non-Markovianity of unital dynamical maps. As a concrete application, we consider two typical non-Markovian noise channels, i.e., the phase damping channel and the random unitary channel, to show the sensitivity of the proposed measure. By investigation, we find that the conditions for detecting non-Markovianity of the phase damping channel are consistent with the results of existing measures of non-Markovianity, i.e., information flow, divisibility and quantum mutual information. However, for the random unitary channel the non-Markovian conditions are the same as those of the information flow, but differ from those of the divisibility and quantum mutual information. Supported by the National Natural Science Foundation of China under Grant No. 61505053, the Natural Science Foundation of Hunan Province under Grant No. 2015JJ3092, the Research Foundation of Education Bureau of Hunan Province, China under Grant No. 16B177, the School Foundation from the Hunan University of Arts and Science under Grant No. 14ZD01
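
The record does not reproduce the formulas. Under the usual conventions, the Brukner-Zeilinger invariant information is, up to a normalization factor, the shifted purity, and non-Markovianity measures of this kind are commonly built from its temporary increases along the evolution. The expressions below are a hedged sketch of that construction, not necessarily the exact definitions used in the paper.

```latex
% Brukner-Zeilinger invariant information of a d-dimensional state (up to normalization):
I(\rho) = \mathrm{Tr}(\rho^{2}) - \tfrac{1}{d}

% A non-Markovianity witness built from temporary increases of I under a unital map \Lambda_t:
\mathcal{N} = \max_{\rho(0)} \int_{\dot{I}(t) > 0} \frac{d}{dt}\, I\bigl(\rho(t)\bigr)\, dt
```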

  12. Non-Markovianity Measure Based on Brukner–Zeilinger Invariant Information for Unital Quantum Dynamical Maps

    International Nuclear Information System (INIS)

    He Zhi; Zhu Lie-Qiang; Li Li

    2017-01-01

A non-Markovianity measure based on Brukner–Zeilinger invariant information to characterize the non-Markovian effects of open systems undergoing unital dynamical maps is proposed. The method takes advantage of the non-increasing property of the Brukner–Zeilinger invariant information under completely positive and trace-preserving unital maps. The simplicity of computing the Brukner–Zeilinger invariant information, which mainly depends on the purity of the quantum state, is an advantage of the proposed measure. The measure effectively captures the characteristics of non-Markovianity of unital dynamical maps. As a concrete application, we consider two typical non-Markovian noise channels, i.e., the phase damping channel and the random unitary channel, to show the sensitivity of the proposed measure. By investigation, we find that the conditions for detecting non-Markovianity of the phase damping channel are consistent with the results of existing measures of non-Markovianity, i.e., information flow, divisibility and quantum mutual information. However, for the random unitary channel the non-Markovian conditions are the same as those of the information flow, but differ from those of the divisibility and quantum mutual information. (paper)

  13. Identification of probabilistic approaches and map-based navigation ...

    Indian Academy of Sciences (India)

    B Madhevan

    2018-02-07

... consists of three processes: map learning (ML), localization and PP [73–76]. ...

  14. Remote sensing-based fire frequency mapping in a savannah ...

    African Journals Online (AJOL)

    Moderate Resolution Imaging Spectroradiometer (MODIS) images from 2000 to 2006 were obtained and classified for burnt area mapping. Linear pixel unmixing was used for image classification and subsequent mapping of burnt areas. The results showed that it was feasible to have discrimination of burnt areas and ...

  15. Creating Geologically Based Radon Potential Maps for Kentucky

    Science.gov (United States)

    Overfield, B.; Hahn, E.; Wiggins, A.; Andrews, W. M., Jr.

    2017-12-01

    Radon potential in the United States, Kentucky in particular, has historically been communicated using a single hazard level for each county; however, physical phenomena are not controlled by administrative boundaries, so single-value county maps do not reflect the significant variations in radon potential in each county. A more accurate approach uses bedrock geology as a predictive tool. A team of nurses, health educators, statisticians, and geologists partnered to create 120 county maps showing spatial variations in radon potential by intersecting residential radon test kit results (N = 60,000) with a statewide 1:24,000-scale bedrock geology coverage to determine statistically valid radon-potential estimates for each geologic unit. Maps using geology as a predictive tool for radon potential are inherently more detailed than single-value county maps. This mapping project revealed that areas in central and south-central Kentucky with the highest radon potential are underlain by shales and karstic limestones.

  16. The MAPS based PXL vertex detector for the STAR experiment

    Science.gov (United States)

    Contin, G.; Anderssen, E.; Greiner, L.; Schambach, J.; Silber, J.; Stezelberger, T.; Sun, X.; Szelezniak, M.; Vu, C.; Wieman, H.; Woodmansee, S.

    2015-03-01

The Heavy Flavor Tracker (HFT) was installed in the STAR experiment for the 2014 heavy ion run of RHIC. Designed to improve the vertex resolution and extend the measurement capabilities in the heavy flavor domain, the HFT is composed of three different silicon detectors based on CMOS monolithic active pixels (MAPS), pads and strips respectively, arranged in four concentric cylinders close to the STAR interaction point. The two innermost HFT layers are placed at a radius of 2.7 and 8 cm from the beam line, respectively, and accommodate 400 ultra-thin (50 μm) high resolution MAPS sensors arranged in 10-sensor ladders to cover a total silicon area of 0.16 m². Each sensor includes a pixel array of 928 rows and 960 columns with a 20.7 μm pixel pitch, providing a sensitive area of ~3.8 cm². The architecture is based on a column parallel readout with amplification and correlated double sampling inside each pixel. Each column is terminated with a high precision discriminator, is read out in a rolling shutter mode and the output is processed through an integrated zero suppression logic. The results are stored in two SRAM with ping-pong arrangement for a continuous readout. The sensor features 185.6 μs readout time and 170 mW/cm² power dissipation. The detector is air-cooled, allowing a global material budget as low as 0.39% on the inner layer. A novel mechanical approach to detector insertion enables effective installation and integration of the pixel layers within an 8 hour shift during the on-going STAR run. In addition to a detailed description of the detector characteristics, the experience of the first months of data taking will be presented in this paper, with a particular focus on sensor threshold calibration, latch-up protection procedures and general system operations aimed at stabilizing the running conditions. Issues faced during the 2014 run will be discussed together with the implemented solutions. A preliminary analysis of the detector performance

  17. The MAPS based PXL vertex detector for the STAR experiment

    International Nuclear Information System (INIS)

    Contin, G.; Anderssen, E.; Greiner, L.; Silber, J.; Stezelberger, T.; Vu, C.; Wieman, H.; Woodmansee, S.; Schambach, J.; Sun, X.; Szelezniak, M.

    2015-01-01

The Heavy Flavor Tracker (HFT) was installed in the STAR experiment for the 2014 heavy ion run of RHIC. Designed to improve the vertex resolution and extend the measurement capabilities in the heavy flavor domain, the HFT is composed of three different silicon detectors based on CMOS monolithic active pixels (MAPS), pads and strips respectively, arranged in four concentric cylinders close to the STAR interaction point. The two innermost HFT layers are placed at a radius of 2.7 and 8 cm from the beam line, respectively, and accommodate 400 ultra-thin (50 μm) high resolution MAPS sensors arranged in 10-sensor ladders to cover a total silicon area of 0.16 m². Each sensor includes a pixel array of 928 rows and 960 columns with a 20.7 μm pixel pitch, providing a sensitive area of ∼3.8 cm². The architecture is based on a column parallel readout with amplification and correlated double sampling inside each pixel. Each column is terminated with a high precision discriminator, is read out in a rolling shutter mode and the output is processed through an integrated zero suppression logic. The results are stored in two SRAM with ping-pong arrangement for a continuous readout. The sensor features 185.6 μs readout time and 170 mW/cm² power dissipation. The detector is air-cooled, allowing a global material budget as low as 0.39% on the inner layer. A novel mechanical approach to detector insertion enables effective installation and integration of the pixel layers within an 8 hour shift during the on-going STAR run. In addition to a detailed description of the detector characteristics, the experience of the first months of data taking will be presented in this paper, with a particular focus on sensor threshold calibration, latch-up protection procedures and general system operations aimed at stabilizing the running conditions. Issues faced during the 2014 run will be discussed together with the implemented solutions. A preliminary analysis of the detector

  18. Features of the organization of bread wheat chromosome 5BS based on physical mapping.

    Science.gov (United States)

    Salina, Elena A; Nesterov, Mikhail A; Frenkel, Zeev; Kiseleva, Antonina A; Timonova, Ekaterina M; Magni, Federica; Vrána, Jan; Šafář, Jan; Šimková, Hana; Doležel, Jaroslav; Korol, Abraham; Sergeeva, Ekaterina M

    2018-02-09

    The IWGSC strategy for construction of the reference sequence of the bread wheat genome is based on first obtaining physical maps of the individual chromosomes. Our aim is to develop and use the physical map for analysis of the organization of the short arm of wheat chromosome 5B (5BS) which bears a number of agronomically important genes, including genes conferring resistance to fungal diseases. A physical map of the 5BS arm (290 Mbp) was constructed using restriction fingerprinting and LTC software for contig assembly of 43,776 BAC clones. The resulting physical map covered ~ 99% of the 5BS chromosome arm (111 scaffolds, N50 = 3.078 Mb). SSR, ISBP and zipper markers were employed for anchoring the BAC clones, and from these 722 novel markers were developed based on previously obtained data from partial sequencing of 5BS. The markers were mapped using a set of Chinese Spring (CS) deletion lines, and F2 and RICL populations from a cross of CS and CS-5B dicoccoides. Three approaches have been used for anchoring BAC contigs on the 5BS chromosome, including clone-by-clone screening of BACs, GenomeZipper analysis, and comparison of BAC-fingerprints with in silico fingerprinting of 5B pseudomolecules of T. dicoccoides. These approaches allowed us to reach a high level of BAC contig anchoring: 96% of 5BS BAC contigs were located on 5BS. An interesting pattern was revealed in the distribution of contigs along the chromosome. Short contigs (200-999 kb) containing markers for the regions interrupted by tandem repeats, were mainly localized to the 5BS subtelomeric block; whereas the distribution of larger 1000-3500 kb contigs along the chromosome better correlated with the distribution of the regions syntenic to rice, Brachypodium, and sorghum, as detected by the Zipper approach. The high fingerprinting quality, LTC software and large number of BAC clones selected by the informative markers in screening of the 43,776 clones allowed us to significantly increase the

  19. Strategies for haplotype-based association mapping in complex pedigreed populations

    DEFF Research Database (Denmark)

    Boleckova, J; Christensen, Ole Fredslund; Sørensen, Peter

    2012-01-01

    In association mapping, haplotype-based methods are generally regarded to provide higher power and increased precision than methods based on single markers. For haplotype-based association mapping most studies use a fixed haplotype effect in the model. However, an increase in haplotype length inc...

  20. Object-based landslide detection in different geographic regions

    Science.gov (United States)

    Friedl, Barbara; Hölbling, Daniel; Eisank, Clemens; Blaschke, Thomas

    2015-04-01

Landslides occur in almost all mountainous regions of the world and rank among the most severe natural hazards. In the last decade - according to the world disaster report 2014 published by the International Federation of Red Cross and Red Crescent Societies (IFRC) - more than 9,000 people were killed by mass movements, more than 3.2 million people were affected and the total estimated disaster damage amounts to more than 1,700 million US dollars. The application of remote sensing data for mapping landslides can contribute to post-disaster reconstruction or hazard mitigation, either by providing rapid information about the spatial distribution and location of landslides in the aftermath of triggering events or by creating and updating landslide inventories. This is especially valid for remote and inaccessible areas, where information on landslides is often lacking. However, reliable methods are needed for extracting timely and relevant information about landslides from remote sensing data. In recent years, novel methods such as object-based image analysis (OBIA) have been successfully employed for semi-automated landslide mapping. Several studies revealed that OBIA frequently outperforms pixel-based approaches, as a range of image object properties (spectral, spatial, morphometric, contextual) can be exploited during the analysis. However, object-based methods are often tailored to specific study areas, and thus the transferability to regions with different geological settings is often limited. The present case study evaluates the transferability and applicability of an OBIA approach for landslide detection in two distinct regions, i.e. the island of Taiwan and Austria. In Taiwan, sub-areas in the Baichi catchment in the North and in the Huaguoshan catchment in the southern-central part of the island are selected; in Austria, landslide-affected sites in the Upper Salzach catchment in the federal state of Salzburg are investigated. For both regions

  1. Blanding’s Turtle (Emydoidea blandingii) Potential Habitat Mapping Using Aerial Orthophotographic Imagery and Object Based Classification

    Directory of Open Access Journals (Sweden)

    Douglas J. King

    2012-01-01

Full Text Available Blanding’s turtle (Emydoidea blandingii) is a threatened species under Canada’s Species at Risk Act. In southern Québec, field based inventories are ongoing to determine its abundance and potential habitat. The goal of this research was to develop means for mapping of potential habitat based on primary habitat attributes that can be detected with high-resolution remotely sensed imagery. Using existing spring leaf-off 20 cm resolution aerial orthophotos of a portion of Gatineau Park where some Blanding’s turtle observations had been made, habitat attributes were mapped at two scales: (1) whole wetlands; (2) within-wetland habitat features of open water, vegetation (used for camouflage and thermoregulation), and logs (used for spring sun-basking). The processing steps involved initial pixel-based classification to eliminate most areas of non-wetland, followed by object-based segmentations and classifications using a customized rule sequence to refine the wetland map and to map the within-wetland habitat features. Variables used as inputs to the classifications were derived from the orthophotos and included image brightness, texture, and segmented object shape and area. Independent validation using field data and visual interpretation showed classification accuracy for all habitat attributes to be generally over 90% with a minimum of 81.5% for the producer’s accuracy of logs. The maps for each attribute were combined to produce a habitat suitability map for Blanding’s turtle. Of the 115 existing turtle observations, 92.3% were closest to a wetland of the two highest suitability classes. High-resolution imagery combined with object-based classification and habitat suitability mapping methods such as those presented provide a much more spatially explicit representation of detailed habitat attributes than can be obtained through field work alone. They can complement field efforts to document and track turtle activities and can contribute to

  2. A buffer overflow detection based on inequalities solution

    International Nuclear Information System (INIS)

    Xu Guoai; Zhang Miao; Yang Yixian

    2007-01-01

A new buffer overflow detection model based on inequality solving was designed. It starts from an analysis of the disadvantages of older buffer overflow detection techniques and converts buffer overflow detection into the solution of a system of inequalities. The new model overcomes the disadvantages of the older techniques and improves the efficiency of buffer overflow detection. (authors)

  3. Creation Greenhouse Environment Map Using Localization of Edge of Cultivation Platforms Based on Stereo Vision

    Directory of Open Access Journals (Sweden)

    A Nasiri

    2017-10-01

Full Text Available Introduction Stereo vision means the capability of extracting depth based on the analysis of two images taken from different angles of one scene. The result of stereo vision is a collection of three-dimensional points which describes the details of the scene in proportion to the resolution of the obtained images. Automatic vehicle steering and crop growth monitoring are two important operations in precision agriculture. The essential aspects of automated steering are the position and orientation of the agricultural equipment in relation to the crop row, the detection of obstacles and the design of path planning between the crop rows. The developed map can provide this information in real time. Machine vision has the capability to perform these tasks in order to execute operations such as cultivation, spraying and harvesting. In a greenhouse environment, it is possible to develop a map and perform automatic control by detecting and localizing the cultivation platforms as the main moving obstacles. The current work presents a method based on stereo vision for detecting and localizing platforms, and then providing a two-dimensional map of the cultivation platforms in the greenhouse environment. Materials and Methods In this research, two webcams, made by Microsoft Corporation with a resolution of 960×544, are connected to the computer via USB2 in order to produce a parallel stereo camera. Due to the structure of the cultivation platforms, the number of points in the point cloud is decreased by extracting only the upper and lower edges of the platform. The proposed method aims at extracting the edges based on depth-discontinuity features in the region of the platform edge. By obtaining the disparity image of the platform edges from the rectified stereo images and translating its data to 3D space, the point cloud model of the environment is constructed. Then by projecting the points to the XZ plane and putting local maps together
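
The following sketch, an illustration rather than the authors' implementation, shows the geometric core described above: compute a disparity map from a rectified stereo pair, reproject to 3D, keep points in an assumed height band corresponding to the platform edges, and accumulate them on the XZ plane as a 2D occupancy grid. The matcher settings, reprojection matrix, height band and grid parameters are all assumptions.

```python
import cv2
import numpy as np

def platform_occupancy_map(left_gray, right_gray, q_matrix,
                           y_band=(0.6, 1.2), cell=0.05, extent=5.0):
    """Build a 2D XZ occupancy grid of platform edges from a rectified stereo pair.
    q_matrix is the 4x4 reprojection matrix from stereo calibration; the height
    band (metres), cell size and map extent are assumed values."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    points = cv2.reprojectImageTo3D(disparity, q_matrix)
    xyz = points[disparity > 0]
    # Keep only points whose height (Y) falls inside the platform-edge band.
    edges = xyz[(xyz[:, 1] > y_band[0]) & (xyz[:, 1] < y_band[1])]
    # Accumulate the remaining points on the XZ plane as an occupancy grid.
    n = int(2 * extent / cell)
    grid = np.zeros((n, n), np.uint16)
    ix = np.clip(((edges[:, 0] + extent) / cell).astype(int), 0, n - 1)
    iz = np.clip((edges[:, 2] / cell).astype(int), 0, n - 1)
    np.add.at(grid, (iz, ix), 1)
    return grid

# Usage with a hypothetical rectified pair and Q matrix from cv2.stereoRectify:
# grid = platform_occupancy_map(left, right, Q)
```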

  4. Vulnerability mapping in kelud volcano based on village information

    Science.gov (United States)

    Hisbaron, D. R.; Wijayanti, H.; Iffani, M.; Winastuti, R.; Yudinugroho, M.

    2018-04-01

Kelud Volcano is a basaltic andesitic stratovolcano situated 27 km to the east of Kediri, Indonesia. Historically, Kelud Volcano has erupted with a return period of 9-75 years, placing nearly 160,000 people living in Tulungagung, Blitar and Kediri Districts in high-risk areas. This study aims to map vulnerability to lava flows in Kediri and Malang at a detailed scale. There are four major variables, namely demography, assets, hazard, and land use. PGIS (Participatory Geographic Information System) is employed to collect data, while ancillary data are derived from statistical information, interpretation of high-resolution satellite imagery and Unmanned Aerial Vehicles (UAVs). Data were obtained from field checks and partly from high-resolution satellite imagery and UAVs. The output of this research is village-based vulnerability information that becomes a valuable input for local stakeholders to improve local preparedness and disaster resilience in hazard-prone areas. The results indicated that the highest vulnerability to lava flood disaster around Kelud Volcano is found in Kandangan Hamlet, Pandean Hamlet and Kacangan Hamlet, because these hamlets are in the dominant high-vulnerability position in 3 out of 4 scenarios (economic, social and equal).

  5. Identification of Coupled Map Lattice Based on Compressed Sensing

    Directory of Open Access Journals (Sweden)

    Dong Xie

    2016-01-01

    Full Text Available A novel approach for the parameter identification of coupled map lattice (CML based on compressed sensing is presented in this paper. We establish a meaningful connection between these two seemingly unrelated study topics and identify the weighted parameters using the relevant recovery algorithms in compressed sensing. Specifically, we first transform the parameter identification problem of CML into the sparse recovery problem of underdetermined linear system. In fact, compressed sensing provides a feasible method to solve underdetermined linear system if the sensing matrix satisfies some suitable conditions, such as restricted isometry property (RIP and mutual coherence. Then we give a low bound on the mutual coherence of the coefficient matrix generated by the observed values of CML and also prove that it satisfies the RIP from a theoretical point of view. If the weighted vector of each element is sparse in the CML system, our proposed approach can recover all the weighted parameters using only about M samplings, which is far less than the number of the lattice elements N. Another important and significant advantage is that if the observed data are contaminated with some types of noises, our approach is still effective. In the simulations, we mainly show the effects of coupling parameter and noise on the recovery rate.
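
To make the reduction to a sparse recovery problem concrete, the toy sketch below builds the underdetermined linear system for one lattice element, whose next-step value is assumed to be a sparse weighted sum of the logistic map applied to all current states, and recovers the weights with orthogonal matching pursuit. The lattice states are drawn at random rather than iterated, and the sizes, sparsity and recovery algorithm are assumptions; recovery quality depends on the coherence of the resulting matrix, as the paper discusses.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def logistic(x, mu=3.9):
    """Local map of the lattice (the logistic map, a common choice in CML studies)."""
    return mu * x * (1.0 - x)

# Toy CML identification: one element is assumed to evolve as
# x_i(t+1) = sum_j w_ij * f(x_j(t)) with a sparse weight vector w_i.
rng = np.random.default_rng(3)
N, M, k = 100, 50, 3                        # lattice size, samplings (M << N), non-zeros
w_true = np.zeros(N)
support = rng.choice(N, k, replace=False)
w_true[support] = rng.uniform(0.5, 1.0, k)

X = rng.uniform(0.0, 1.0, size=(M, N))      # observed lattice states at M time steps
A = logistic(X)                             # sensing matrix built from the observations
y = A @ w_true                              # observed next-step values of element i

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k).fit(A, y)
print("true support:     ", np.sort(support))
print("recovered support:", np.sort(np.flatnonzero(omp.coef_)))
print("weight error:", round(float(np.linalg.norm(omp.coef_ - w_true)), 6))
```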

  6. Comic image understanding based on polygon detection

    Science.gov (United States)

    Li, Luyuan; Wang, Yongtao; Tang, Zhi; Liu, Dong

    2013-01-01

    Comic image understanding aims to automatically decompose scanned comic page images into storyboards and then identify their reading order, which is the key technique for producing digital comic documents that are suitable for reading on mobile devices. In this paper, we propose a novel comic image understanding method based on polygon detection. First, we segment a comic page image into storyboards by finding the polygonal enclosing box of each storyboard. Then, each storyboard can be represented by a polygon, and the reading order is determined by analyzing the relative geometric relationship between each pair of polygons. The proposed method is tested on 2,000 comic images from ten printed comic series, and the experimental results demonstrate that it works well on different types of comic images.
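
    A rough sketch of the storyboard-extraction idea is given below: polygonal enclosing contours are found with OpenCV and then ordered by a simple top-to-bottom, left-to-right rule, which is a simplification of the pairwise geometric analysis described in the abstract; the thresholds are illustrative.

```python
# Rough sketch: find storyboard polygons on a comic page and order them
# top-to-bottom, left-to-right (a simplification of the pairwise geometric
# analysis described in the abstract).
import cv2
import numpy as np

def storyboards(page_bgr, min_area_ratio=0.01):
    gray = cv2.cvtColor(page_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    page_area = gray.shape[0] * gray.shape[1]
    polygons = []
    for c in contours:
        if cv2.contourArea(c) < min_area_ratio * page_area:
            continue                        # ignore speech bubbles and small marks
        poly = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        polygons.append(poly)

    # Reading order: sort by top edge, then by left edge
    def key(poly):
        x, y, w, h = cv2.boundingRect(poly)
        return (y, x)
    return sorted(polygons, key=key)
```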

  7. Low complexity pixel-based halftone detection

    Science.gov (United States)

    Ok, Jiheon; Han, Seong Wook; Jarno, Mielikainen; Lee, Chulhee

    2011-10-01

    With the rapid advances of the internet and other multimedia technologies, the digital document market has been growing steadily. Since most digital images use halftone technologies, quality degradation occurs when one tries to scan and reprint them. Therefore, it is necessary to extract the halftone areas to produce high quality printing. In this paper, we propose a low complexity pixel-based halftone detection algorithm. For each pixel, we considered a surrounding block. If the block contained any flat background regions, text, thin lines, or continuous or non-homogeneous regions, the pixel was classified as a non-halftone pixel. After excluding those non-halftone pixels, the remaining pixels were considered to be halftone pixels. Finally, documents were classified as pictures or photo documents by calculating the halftone pixel ratio. The proposed algorithm proved to be memory-efficient and required low computation costs. The proposed algorithm was easily implemented using GPU.
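
    A simplified sketch of the block-based idea follows: pixels whose surrounding block is nearly flat or strongly edge-like are treated as non-halftone, and the remaining pixels are counted toward the halftone ratio; the block size and thresholds are placeholders, not the paper's values.

```python
# Simplified sketch of block-based halftone detection: a pixel whose surrounding
# block is nearly flat (background) or strongly edge-like (text/thin lines) is
# treated as non-halftone; the rest are counted as halftone. Thresholds are
# illustrative, not the paper's.
import numpy as np
from scipy.ndimage import uniform_filter

def halftone_ratio(gray, block=9, flat_std=4.0, edge_grad=60.0):
    gray = gray.astype(np.float32)
    mean = uniform_filter(gray, block)
    sq_mean = uniform_filter(gray * gray, block)
    local_std = np.sqrt(np.maximum(sq_mean - mean * mean, 0.0))

    gy, gx = np.gradient(gray)
    grad_mag = uniform_filter(np.hypot(gx, gy), block)

    non_halftone = (local_std < flat_std) | (grad_mag > edge_grad)
    halftone = ~non_halftone
    # The ratio is what classifies a page as a picture/photo document
    return halftone.mean(), halftone
```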

  8. Aptamer Based Microsphere Biosensor for Thrombin Detection

    Directory of Open Access Journals (Sweden)

    Xudong Fan

    2006-08-01

    Full Text Available We have developed an optical microsphere resonator biosensor using an aptamer as receptor for the measurement of the important biomolecule thrombin. The sphere surface is modified with anti-thrombin aptamer, which has excellent binding affinity and selectivity for thrombin. Binding of the thrombin at the sphere surface is monitored by the spectral position of the microsphere's whispering gallery mode resonances. A detection limit on the order of 1 NIH Unit/mL is demonstrated. Control experiments with non-aptamer oligonucleotide and BSA are also carried out to confirm the specific binding between aptamer and thrombin. We expect that this demonstration will lead to the development of highly sensitive biomarker sensors based on aptamers with lower cost and higher throughput than current technology.

  9. Detecting Soft Errors in Stencil based Computations

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, V. [Univ. of Utah, Salt Lake City, UT (United States); Gopalkrishnan, G. [Univ. of Utah, Salt Lake City, UT (United States); Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-06

    Given the growing emphasis on system resilience, it is important to develop software-level error detectors that help trap hardware-level faults with reasonable accuracy while minimizing false alarms as well as the performance overhead introduced. We present a technique that approaches this idea by taking stencil computations as our target, and synthesizing detectors based on machine learning. In particular, we employ linear regression to generate computationally inexpensive models which form the basis for error detection. Our technique has been incorporated into a new open-source library called SORREL. In addition to reporting encouraging experimental results, we demonstrate techniques that help reduce the size of training data. We also discuss the efficacy of various detectors synthesized, as well as our future plans.
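
    The core idea (predict each updated cell from its old neighborhood with a linear model and flag large residuals) can be sketched as below for a 1D three-point stencil; this illustrates the approach only and is not the SORREL implementation.

```python
# Sketch of a regression-based soft-error detector for a 1D 3-point stencil:
# fit a linear model that predicts the updated value from the old neighborhood,
# then flag cells whose prediction error exceeds a threshold.
import numpy as np
from sklearn.linear_model import LinearRegression

def stencil_step(u, alpha=0.1):
    return u + alpha * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

rng = np.random.default_rng(1)
u = rng.random(10_000)
v = stencil_step(u)

# Training data: (left, center, right) -> updated value
X = np.stack([np.roll(u, 1), u, np.roll(u, -1)], axis=1)
model = LinearRegression().fit(X, v)

# Inject a bit-flip-like perturbation and detect it from the residual
v_faulty = v.copy()
v_faulty[1234] += 0.5
residual = np.abs(model.predict(X) - v_faulty)
threshold = 10 * residual.std()
print("suspected faulty indices:", np.flatnonzero(residual > threshold))
```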

  10. Web-Based Urban Metabolic Mapping for Bangalore, India

    Science.gov (United States)

    Mehta, V. K.; Kemp-Benedict, E.; Wang, G.; Malghan, D.

    2012-12-01

    Cities are like living entities, needing a continuous throughput of resources and energy for survival and growth, creating waste in the process. This paper documents the Bangalore Urban Mapping Project: an initiative that uses this metabolic concept [1],[2]. to inform comprehensive planning in the rapidly growing software capital of Bangalore city in India. Focusing on demographic growth, and water supply and consumption in its first phase, a web-based geo-portal has been developed for two purposes - interactive information communication and delivery, and online planning in the water supply sector. The application, titled Bangalore Urban Mapping Project (BUMP) is built on a free and open source web GIS stack consisting of a Postgis database, PHP, OpenLayers, and Apache Web Server deployed on a 64-bit Ubuntu Linux server platform. The interactive planning portion of the application allows BUMP users to build, run and visualize demographic growth, water supply, and growth scenarios on the browser. Application logic is written in PHP to connect the many components of the interactive application, which is available on the BUMP website (http://www.seimapping.org/bump/index.php). It relies on AJAX to fetch layer data from the server and render the layer using OpenLayers on the fly. This allows users to view multiple layers at the same time without refreshing the page. Data is packed in GeoJSON format and is compressed to reduce traffic. The information communication portion of the application provides thematic representation of each of twenty different map layers, graphical and tabular summaries of demographic and water data that are presented dynamically using Javascript libraries including the Google Chart API. The application also uses other common Javascript libraries/plug-ins, like jQuery, jQuery UI, qTip, to ease the development and to ensure cross-browser compatibility. The planning portion of the platform allows the user to interact with a scenario explorer

  11. Single electron based binary multipliers with overflow detection ...

    African Journals Online (AJOL)

    electron based device. Multipliers with overflow detection based on serial and parallel prefix computation algorithm are elaborately discussed analytically and designed. The overflow detection circuits works in parallel with a simplified multiplier to ...

  12. Automated detection and mapping of crown discolouration caused by jack pine budworm with 2.5 m resolution multispectral imagery

    Science.gov (United States)

    Leckie, Donald G.; Cloney, Ed; Joyce, Steve P.

    2005-05-01

    Jack pine budworm (Choristoneura pinus pinus (Free.)) is a native insect defoliator, mainly of jack pine (Pinus banksiana Lamb.), in North America east of the Rocky Mountains. Periodic outbreaks of this insect, which generally last two to three years, can cause growth loss and mortality and have an important impact ecologically and economically in terms of timber production and harvest. The jack pine budworm prefers to feed on current-year needles. Its characteristic feeding habits cause discolouration or reddening of the canopy. This red colouration is used to map the distribution and intensity of defoliation that has taken place that year (current defoliation). An accurate and consistent map of the distribution and intensity of budworm defoliation (as represented by the red discolouration) at the stand and within-stand level is desirable. Automated classification of multispectral imagery, such as is available from airborne and new high-resolution satellite systems, was explored as a viable tool for objectively classifying current discolouration. Airborne multispectral imagery was acquired at a 2.5 m resolution with the Multispectral Electro-optical Imaging Sensor (MEIS). It recorded imagery in six nadir-looking spectral bands specifically designed to detect discolouration caused by budworm; a near-infrared band viewing forward at 35° was also used. A 2200 nm middle-infrared image was acquired with a Daedalus scanner. Training and test areas of different levels of discolouration were created based on field observations, and a maximum likelihood supervised classification was used to estimate four classes of discolouration (nil-trace, light, moderate and severe). Good discrimination was achieved, with an overall accuracy of 84% for the four discolouration levels. The moderate discolouration class was the poorest at 73%, because of confusion with both the severe and light classes. Accuracy on a stand basis was also good, and regional and within stand
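
    Per-pixel maximum likelihood classification with Gaussian class models corresponds to quadratic discriminant analysis; a minimal sketch with synthetic training spectra (the band values and class means below are placeholders, not field data) might look like this.

```python
# Minimal sketch of maximum-likelihood classification of multispectral pixels into
# four discolouration classes (nil-trace, light, moderate, severe). QDA with
# per-class Gaussian models is equivalent to the classic ML classifier; the
# training spectra here are synthetic placeholders.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)
classes = ["nil-trace", "light", "moderate", "severe"]
n_bands = 7   # six nadir bands plus a forward-looking NIR band, as in the abstract

# Synthetic training pixels: one spectral mean per class plus noise
means = rng.uniform(50, 200, size=(len(classes), n_bands))
X_train = np.vstack([m + rng.normal(0, 8, size=(200, n_bands)) for m in means])
y_train = np.repeat(np.arange(len(classes)), 200)

clf = QuadraticDiscriminantAnalysis(store_covariance=True).fit(X_train, y_train)

# Classify an image of shape (rows, cols, bands)
image = rng.uniform(50, 200, size=(100, 100, n_bands))
labels = clf.predict(image.reshape(-1, n_bands)).reshape(100, 100)
print({classes[i]: int((labels == i).sum()) for i in range(len(classes))})
```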

  13. Cellular telephone-based wide-area radiation detection network

    Science.gov (United States)

    Craig, William W [Pittsburg, CA; Labov, Simon E [Berkeley, CA

    2009-06-09

    A network of radiation detection instruments, each having a small solid-state radiation sensor module integrated into a cellular phone, for providing radiation detection data and analysis directly to a user. The sensor module includes a solid-state crystal bonded to an ASIC readout, providing a low-cost, low-power, lightweight, compact instrument to detect and measure radiation energies in the local ambient radiation field. In particular, the photon energy, time of event, and location of the detection instrument at the time of detection are recorded for real-time transmission to a central data collection/analysis system. The collected data from the entire network of radiation detection instruments are combined by intelligent correlation/analysis algorithms which map the background radiation and detect, identify and track radiation anomalies in the region.

  14. Structural mapping based on potential field and remote sensing data ...

    Indian Academy of Sciences (India)

    Swarnapriya Chowdari

    2017-08-31

    Aug 31, 2017 ... to comprehend the tectonic development of the ... software for the analysis and interpretation of G– .... The application of remote sensing for mapping ..... Pf1 and Pf2 show profile locations adopted for joint G–M modelling.

  15. Standards-Based Open-Source Planetary Map Server: Lunaserv

    Science.gov (United States)

    Estes, N. M.; Silva, V. H.; Bowley, K. S.; Lanjewar, K. K.; Robinson, M. S.

    2018-04-01

    Lunaserv is a planetary capable Web Map Service developed by the LROC SOC. It enables researchers to serve their own planetary data to a wide variety of GIS clients without any additional processing or download steps.

  16. Development of a web based GIS for health facilities mapping ...

    African Journals Online (AJOL)

    Hilary Mushonga

    Key Words: Spatial Decision Support System, Web GIS, Mapping, Health geography. 1. Introduction ... Health geography is an area of medical research that incorporates geographic techniques into the study of ... street water pump. Once the ...

  17. Aircraft route planning based on digital map pre-treatment

    Directory of Open Access Journals (Sweden)

    Ran ZHEN

    2015-04-01

    Full Text Available Aiming at flight path planning in complex low-altitude airspace, the influence of terrain conditions and ground threats on aircraft flight is studied. Through the analysis of the digital map and static threats, the paper explores a processing method for the digital map and uses the Hermite function to smooth the map, reducing the search range for the optimal trajectory. By designing terrain-following, terrain-avoidance and threat-avoidance behaviours, the safety of the aircraft can be guaranteed. An in-depth analysis of the particle swarm optimization (PSO) algorithm realizes three-dimensional path planning before the aircraft performs a task. Through simulation, the difference between the maps before and after processing is shown, and offline planning of the three-dimensional optimal path is achieved.
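
    A toy sketch of the PSO step is shown below: each particle encodes waypoint altitudes over a smoothed terrain profile, and the cost penalizes insufficient clearance plus excess altitude; the terrain, weights and PSO coefficients are all illustrative assumptions rather than the paper's settings.

```python
# Toy sketch of particle swarm optimization for a flight path: each particle is a
# vector of waypoint altitudes, and the cost penalizes flying too close to a
# smoothed terrain profile plus excess altitude. Terrain and weights are synthetic.
import numpy as np

rng = np.random.default_rng(2)
n_wp = 20
terrain = np.convolve(rng.uniform(0, 100, n_wp), np.ones(5) / 5, mode="same")

def cost(path):
    clearance = path - terrain
    collision = np.maximum(50.0 - clearance, 0).sum()   # stay >= 50 m above terrain
    return collision * 10.0 + path.sum() * 0.01          # prefer lower safe altitudes

n_particles, iters = 40, 200
pos = rng.uniform(0, 300, (n_particles, n_wp))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, n_wp))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("best cost:", pbest_cost.min())
```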

  18. Privacy-Aware Image Encryption Based on Logistic Map and Data Hiding

    Science.gov (United States)

    Sun, Jianglin; Liao, Xiaofeng; Chen, Xin; Guo, Shangwei

    The increasing need for image communication and storage has created a great necessity for securely transferring and storing images over a network. Whereas traditional image encryption algorithms usually consider the security of the whole plain image, region of interest (ROI) encryption schemes, which are of great importance in practical applications, protect the privacy regions of plain images. Existing ROI encryption schemes usually adopt approximate techniques to detect the privacy region and measure the quality of encrypted images; however, their performance is usually inconsistent with the human visual system (HVS) and is sensitive to statistical attacks. In this paper, we propose a novel privacy-aware ROI image encryption (PRIE) scheme based on logistic mapping and data hiding. The proposed scheme utilizes salient object detection to automatically, adaptively and accurately detect the privacy region of a given plain image. After private pixels have been encrypted using chaotic cryptography, the significant bits are embedded into the non-privacy region of the plain image using data hiding. Extensive experiments are conducted to illustrate the consistency between our automatic ROI detection and the HVS. Our experimental results also demonstrate that the proposed scheme exhibits satisfactory security performance.
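
    A minimal sketch of the chaotic-encryption step is given below: a logistic-map keystream seeded by the key is XORed with the pixels of the detected privacy region; the map parameters stand in for the key, and the paper's data-hiding step is not shown.

```python
# Minimal sketch of logistic-map-based pixel encryption for a privacy region (ROI):
# a chaotic sequence seeded by the key is quantized to bytes and XORed with the ROI
# pixels. Parameters (r, x0) stand in for the key; the embedding of significant bits
# into the non-private region (data hiding), as in the paper, is omitted.
import numpy as np

def logistic_keystream(n, x0=0.3579, r=3.99):
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def encrypt_roi(image, roi):
    top, bottom, left, right = roi              # detected privacy region
    block = image[top:bottom, left:right]
    ks = logistic_keystream(block.size).reshape(block.shape)
    out = image.copy()
    out[top:bottom, left:right] = block ^ ks    # XOR is its own inverse, so the same call decrypts
    return out

img = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
enc = encrypt_roi(img, (10, 40, 10, 40))
dec = encrypt_roi(enc, (10, 40, 10, 40))
assert np.array_equal(dec, img)
```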

  19. Video-based Mobile Mapping System Using Smartphones

    Science.gov (United States)

    Al-Hamad, A.; Moussa, A.; El-Sheimy, N.

    2014-11-01

    The last two decades have witnessed a huge growth in the demand for geo-spatial data. This demand has encouraged researchers around the world to develop new algorithms and design new mapping systems in order to obtain reliable sources of geo-spatial data. Mobile Mapping Systems (MMS) are one of the main sources of mapping and Geographic Information Systems (GIS) data. MMS integrate various remote sensing sensors, such as cameras and LiDAR, along with navigation sensors to provide the 3D coordinates of points of interest from a moving platform (e.g. cars, airplanes, etc.). Although MMS can provide accurate mapping solutions for different GIS applications, the cost of these systems is not affordable for many users, and only large-scale companies and institutions can benefit from them. The main objective of this paper is to propose a new low-cost MMS with reasonable accuracy using the sensors available in smartphones, in particular the video camera. Using the smartphone video camera, instead of capturing individual images, makes the system easier to use for non-professional users, since the system automatically extracts the highly overlapping frames out of the video without user intervention. Results of the proposed system are presented, demonstrating the effect of the number of images used on the mapping solution. In addition, the accuracy of the mapping results obtained from capturing a video is compared to the results obtained from using separately captured images instead of video.

  20. Detection and mapping of illicit drugs and their metabolites in fingermarks by MALDI MS and compatibility with forensic techniques

    Science.gov (United States)

    Groeneveld, G.; de Puit, M.; Bleay, S.; Bradshaw, R.; Francese, S.

    2015-06-01

    Despite the proven capabilities of Matrix Assisted Laser Desorption Ionisation Mass Spectrometry (MALDI MS) in laboratory settings, research is still needed to integrate this technique into current forensic fingerprinting practice. Optimised protocols enabling the compatible application of MALDI to developed fingermarks will allow additional intelligence to be gathered around a suspect’s lifestyle and activities prior to the deposition of their fingermarks while committing a crime. The detection and mapping of illicit drugs and metabolites in latent fingermarks would provide intelligence that is beneficial for both police investigations and court cases. This study investigated MALDI MS detection and mapping capabilities for a large range of drugs of abuse and their metabolites in fingermarks; the detection and mapping of a mixture of these drugs in marks, with and without prior development with cyanoacrylate fuming or Vacuum Metal Deposition, was also examined. Our findings indicate the versatility of MALDI technology and its ability to retrieve chemical intelligence either by detecting the compounds investigated or by using their ion signals to reconstruct 2D maps of fingermark ridge details.

  1. Towards Stable Adversarial Feature Learning for LiDAR based Loop Closure Detection

    OpenAIRE

    Xu, Lingyun; Yin, Peng; Luo, Haibo; Liu, Yunhui; Han, Jianda

    2017-01-01

    Stable feature extraction is the key to the loop closure detection (LCD) task in the simultaneous localization and mapping (SLAM) framework. In our paper, the feature extraction is performed using generative adversarial network (GAN) based unsupervised learning. GANs are powerful generative models; however, GAN-based adversarial learning suffers from training instability. We find that the data-code joint distribution in the adversarial learning is a more complex manifold than in the...

  2. A SNP and SSR Based Genetic Map of Asparagus Bean (Vigna. unguiculata ssp. sesquipedialis) and Comparison with the Broader Species

    Science.gov (United States)

    Xu, Pei; Wu, Xiaohua; Wang, Baogen; Liu, Yonghua; Ehlers, Jeffery D.; Close, Timothy J.; Roberts, Philip A.; Diop, Ndeye-Ndack; Qin, Dehui; Hu, Tingting; Lu, Zhongfu; Li, Guojing

    2011-01-01

    Asparagus bean (Vigna. unguiculata ssp. sesquipedialis) is a distinctive subspecies of cowpea [Vigna. unguiculata (L.) Walp.] that apparently originated in East Asia and is characterized by extremely long and thin pods and an aggressive climbing growth habit. The crop is widely cultivated throughout Asia for the production of immature pods known as ‘long beans’ or ‘asparagus beans’. While the genome of cowpea ssp. unguiculata has been characterized recently by high-density genetic mapping and partial sequencing, little is known about the genome of asparagus bean. We report here the first genetic map of asparagus bean based on SNP and SSR markers. The current map consists of 375 loci mapped onto 11 linkage groups (LGs), with 191 loci detected by SNP markers and 184 loci by SSR markers. The overall map length is 745 cM, with an average marker distance of 1.98 cM. There are four high marker-density blocks distributed on three LGs and three regions of segregation distortion (SDRs) identified on two other LGs, two of which co-locate in chromosomal regions syntenic to SDRs in soybean. Synteny between asparagus bean and the model legume Lotus. japonica was also established. This work provides the basis for mapping and functional analysis of genes/QTLs of particular interest in asparagus bean, as well as for comparative genomics study of cowpea at the subspecies level. PMID:21253606

  3. A SNP and SSR based genetic map of asparagus bean (Vigna. unguiculata ssp. sesquipedialis and comparison with the broader species.

    Directory of Open Access Journals (Sweden)

    Pei Xu

    Full Text Available Asparagus bean (Vigna. unguiculata ssp. sesquipedialis is a distinctive subspecies of cowpea [Vigna. unguiculata (L. Walp.] that apparently originated in East Asia and is characterized by extremely long and thin pods and an aggressive climbing growth habit. The crop is widely cultivated throughout Asia for the production of immature pods known as 'long beans' or 'asparagus beans'. While the genome of cowpea ssp. unguiculata has been characterized recently by high-density genetic mapping and partial sequencing, little is known about the genome of asparagus bean. We report here the first genetic map of asparagus bean based on SNP and SSR markers. The current map consists of 375 loci mapped onto 11 linkage groups (LGs, with 191 loci detected by SNP markers and 184 loci by SSR markers. The overall map length is 745 cM, with an average marker distance of 1.98 cM. There are four high marker-density blocks distributed on three LGs and three regions of segregation distortion (SDRs identified on two other LGs, two of which co-locate in chromosomal regions syntenic to SDRs in soybean. Synteny between asparagus bean and the model legume Lotus. japonica was also established. This work provides the basis for mapping and functional analysis of genes/QTLs of particular interest in asparagus bean, as well as for comparative genomics study of cowpea at the subspecies level.

  4. 12MAP: Cloud Disaster Recovery Based on Image-Instance Mapping

    OpenAIRE

    Nadgowda, Shripad; Jayachandran, Praveen; Verma, Akshat

    2013-01-01

    Part 2: Cloud Computing; International audience; Virtual machines (VMs) in a cloud use standardized ‘golden master’ images, standard software catalog and management tools. This facilitates quick provisioning of VMs and helps reduce the cost of managing the cloud by reducing the need for specialized software skills. However, knowledge of this similarity is lost post-provisioning, as VMs could experience different changes and may drift away from one another. In this work, we propose the 12MAP s...

  5. A consensus linkage map of the grass carp (Ctenopharyngodon idella based on microsatellites and SNPs

    Directory of Open Access Journals (Sweden)

    Li Jiale

    2010-02-01

    Full Text Available Abstract Background Grass carp (Ctenopharyngodon idella) belongs to the family Cyprinidae, which includes more than 2000 fish species. It is one of the most important freshwater food fish species in world aquaculture. A linkage map is an essential framework for mapping traits of interest and is often the first step towards understanding genome evolution. The aim of this study is to construct a first-generation genetic map of grass carp using microsatellites and SNPs to generate a new resource for mapping QTL for economically important traits and to conduct a comparative mapping analysis to shed new insights into the evolution of fish genomes. Results We constructed a first-generation linkage map of grass carp with a mapping panel containing two F1 families including 192 progenies. Sixteen SNPs in genes and 263 microsatellite markers were mapped to twenty-four linkage groups (LGs). The number of LGs corresponded to the haploid chromosome number of grass carp. The sex-specific map was 1149.4 and 888.8 cM long in females and males respectively, whereas the sex-averaged map spanned 1176.1 cM. The average resolution of the map was 4.2 cM/locus. BLAST searches of sequences of mapped markers of grass carp against the whole genome sequence of zebrafish revealed a substantial macrosynteny relationship and extensive colinearity of markers between grass carp and zebrafish. Conclusions The linkage map of grass carp presented here is the first linkage map of a food fish species based on co-dominant markers in the family Cyprinidae. This map provides a valuable resource for mapping phenotypic variations, serves as a reference for comparative genomics and for understanding the evolution of fish genomes, and could be complementary to the grass carp genome sequencing project.

  6. Stereo-vision-based terrain mapping for off-road autonomous navigation

    Science.gov (United States)

    Rankin, Arturo L.; Huertas, Andres; Matthies, Larry H.

    2009-05-01

    Successful off-road autonomous navigation by an unmanned ground vehicle (UGV) requires reliable perception and representation of natural terrain. While perception algorithms are used to detect driving hazards, terrain mapping algorithms are used to represent the detected hazards in a world model a UGV can use to plan safe paths. There are two primary ways to detect driving hazards with perception sensors mounted to a UGV: binary obstacle detection and traversability cost analysis. Binary obstacle detectors label terrain as either traversable or non-traversable, whereas, traversability cost analysis assigns a cost to driving over a discrete patch of terrain. In uncluttered environments where the non-obstacle terrain is equally traversable, binary obstacle detection is sufficient. However, in cluttered environments, some form of traversability cost analysis is necessary. The Jet Propulsion Laboratory (JPL) has explored both approaches using stereo vision systems. A set of binary detectors has been implemented that detect positive obstacles, negative obstacles, tree trunks, tree lines, excessive slope, low overhangs, and water bodies. A compact terrain map is built from each frame of stereo images. The mapping algorithm labels cells that contain obstacles as nogo regions, and encodes terrain elevation, terrain classification, terrain roughness, traversability cost, and a confidence value. The single frame maps are merged into a world map where temporal filtering is applied. In previous papers, we have described our perception algorithms that perform binary obstacle detection. In this paper, we summarize the terrain mapping capabilities that JPL has implemented during several UGV programs over the last decade and discuss some challenges to building terrain maps with stereo range data.
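
    A much-simplified single-frame version of such a terrain map can be sketched by binning 3D points into a horizontal grid and storing per-cell elevation, roughness and a no-go flag; the cell size and step-height threshold below are illustrative only, not the JPL system's values.

```python
# Simplified sketch of single-frame terrain mapping: bin 3D points (x, y, z) into a
# horizontal grid and store per-cell elevation, roughness, and a no-go flag based on
# a step-height threshold.
import numpy as np

def terrain_map(points, cell=0.4, extent=20.0, step_thresh=0.3):
    n = int(2 * extent / cell)
    elev = np.full((n, n), np.nan)
    rough = np.zeros((n, n))
    ix = np.clip(((points[:, 0] + extent) / cell).astype(int), 0, n - 1)
    iy = np.clip(((points[:, 1] + extent) / cell).astype(int), 0, n - 1)

    for i, j in set(zip(ix, iy)):
        z = points[(ix == i) & (iy == j), 2]
        elev[i, j] = z.max()                   # highest return in the cell
        rough[i, j] = z.max() - z.min()        # simple roughness proxy

    nogo = rough > step_thresh                 # cells with large vertical spread
    return elev, rough, nogo

# Synthetic point cloud standing in for stereo range data
pts = np.random.default_rng(0).uniform(-20, 20, (5000, 3)) * [1, 1, 0.05]
elev, rough, nogo = terrain_map(pts)
print("no-go cells:", int(nogo.sum()))
```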

  7. Face recognition based on depth maps and surface curvature

    Science.gov (United States)

    Gordon, Gaile G.

    1991-09-01

    This paper explores the representation of the human face by features based on the curvature of the face surface. Curvature captures many features necessary to accurately describe the face, such as the shape of the forehead, jawline, and cheeks, which are not easily detected from standard intensity images. Moreover, the value of curvature at a point on the surface is also viewpoint invariant. Until recently, range data of high enough resolution and accuracy to perform useful curvature calculations on the scale of the human face had been unavailable. Although several researchers have worked on the problem of interpreting range data from curved (although usually highly geometrically structured) surfaces, the main approaches have centered on segmentation by the signs of mean and Gaussian curvature, which have not proved sufficient in themselves for the case of the human face. This paper details the calculation of principal curvature for a particular data set, the calculation of general surface descriptors based on curvature, and the calculation of face-specific descriptors based both on curvature features and a priori knowledge about the structure of the face. These face-specific descriptors can be incorporated into many different recognition strategies. A system that implements one such strategy, depth template comparison, giving recognition rates between 80% and 90%, is described.
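
    Mean, Gaussian and principal curvature of a depth map z(x, y) follow from its first and second derivatives; a minimal sketch using finite differences is given below (smoothing and the face-specific descriptors are omitted).

```python
# Sketch of curvature computation on a depth map z(x, y): Gaussian (K) and mean (H)
# curvature from first and second derivatives, then principal curvatures.
import numpy as np

def curvature_maps(z):
    zy, zx = np.gradient(z)          # np.gradient returns (d/drow, d/dcol)
    zxy, zxx = np.gradient(zx)
    zyy, _ = np.gradient(zy)
    denom = 1.0 + zx**2 + zy**2
    K = (zxx * zyy - zxy**2) / denom**2
    H = ((1 + zx**2) * zyy - 2 * zx * zy * zxy + (1 + zy**2) * zxx) / (2 * denom**1.5)
    root = np.sqrt(np.maximum(H**2 - K, 0.0))
    return K, H, H + root, H - root   # Gaussian, mean, principal k1, k2

# Example: a synthetic spherical cap has positive Gaussian curvature (~1/R^2)
y, x = np.mgrid[-30:31, -30:31].astype(float)
z = np.sqrt(np.maximum(50.0**2 - x**2 - y**2, 0.0))
K, H, k1, k2 = curvature_maps(z)
print("median Gaussian curvature near the center:", float(np.median(K[20:40, 20:40])))
```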

  8. Functional cine MR imaging for the detection and mapping of intraabdominal adhesions: method and surgical correlation

    Energy Technology Data Exchange (ETDEWEB)

    Buhmann-Kirchhoff, Sonja; Reiser, Maximilian; Lienemann, Andreas [University Hospital Grosshadern, Ludwig-Maximilians-University Munich, Department of Clinical Radiology, Munich (Germany); Lang, Reinhold; Steitz, Heinrich O.; Jauch, Karl W. [University Hospital Munich-Grosshadern, Department of Surgery, Munich (Germany); Kirchhoff, Chlodwig [University Hospital Munich-Innenstadt, Department of Surgery, Munich (Germany)

    2008-06-15

    The purpose of this study was to evaluate the presence and localization of intraabdominal adhesions using functional cine magnetic resonance imaging (MRI) and to correlate the MR findings with intraoperative results. In a retrospective study, patients who had undergone previous abdominal surgery and had suspected intraabdominal adhesions were examined. A true fast imaging with steady state precession sequence in transverse/sagittal orientation was used for a section-by-section dynamic depiction of visceral slide on a 1.5-Tesla system. After MRI, all patients underwent repeat surgery. A nine-segment abdominal map was used to document the location and type of the adhesions. The intraoperative results were taken as the standard of reference. Ninety patients were enrolled. During surgery 71 adhesions were detected; MRI depicted 68 intraabdominal adhesions. The most common type of adhesion in MRI was found between the anterior abdominal wall and small bowel loops (n = 22, 32.5%) and between small bowel loops and pelvic organs (n = 14, 20.6%). Comparing MRI with the intraoperative findings, sensitivity varied between 31 and 75% with a varying specificity between 65 and 92% in the different segments, leading to an overall MRI accuracy of 89%. Functional cine MRI proved to be a useful examination technique for the identification of intraabdominal adhesions in patients with acute or chronic pain and corresponding clinical findings, providing accurate results. However, no differentiation between symptomatic and asymptomatic adhesions is possible. (orig.)

  9. White matter mapping by DTI-based tractography for neurosurgery

    International Nuclear Information System (INIS)

    Kamada, Kyousuke

    2009-01-01

    To validate the corticospinal tract (CST) and arcuate fasciculus (AF) illustrated by diffusion tensor imaging (DTI), we used CST- and AF-tractography integrated neuronavigation and monopolar and bipolar direct fiber stimulation. Forty seven patients with brain lesions adjacent to the CST and AF were studied. During lesion resection, direct fiber stimulation was applied to the CST and AF to elicit motor responses (fiber-motor evoked potential (MEP)) and the impairment of language-related functions to identify the CST and AF. The minimum distance between the resection border and illustrated CST was measured on postoperative images. Direct fiber stimulation demonstrated that CST- and AF-tractography accurately reflected anatomical CST functioning. The cortical stimulation to the gyrus, including the language-functional MRI (fMRI) activation, evoked speech arrest, while the subcortical stimulation close to the AF reproducibly caused 'paranomia' without speech arrest. There were strong correlations between stimulus intensity for the fiber-MEP and the distance between eloquent fibers and the stimulus points. The convergent calculation formulated 1.8 mA as the electrical threshold of CST for the fiber-MEP, which was much smaller than that of the hand motor area. Validated tractography demonstrated the mean distance and intersection angle between CST and AF were 5 mm and 107 deg, respectively. In addition, the anisotropic diffusion-weighted image (ADWI) and CST-tractography clearly indicated the locations of the primary motor area (PMA) and the central sulcus and well reflected the anatomical characteristics of the corticospinal tract in the human brain. DTI-based tractography is a reliable way to map the white matter connections in the entire brain in clinical and basic neuroscience. By combining these techniques, investigating the cortico-subcortical connections in the human central nervous system could contribute to elucidating the neural networks of the human brain and

  10. White matter mapping by DTI-based tractography for neurosurgery

    International Nuclear Information System (INIS)

    Kamada, Kyousuke

    2011-01-01

    The purpose of this study was to validate the corticospinal tract (CST) and arcuate fasciculus (AF) illustrated by diffusion tensor imaging (DTI). We used CST- and AF-tractography integrated neuronavigation and monopolar and bipolar direct fiber stimulation. Forty-seven patients with brain lesions adjacent to the CST and AF were studied. During lesion resection, direct fiber stimulation was applied to the CST and AF to elicit motor responses (fiber-MEP) and the impairment of language-related functions to identify the CST and AF. The minimum distance between the resection border and illustrated CST was measured on postoperative images. Direct fiber stimulation demonstrated that CST- and AF-tractography accurately reflected anatomical CST functioning. The cortical stimulation to the gyrus, including the language-fMRI activation, evoked speech arrest, while the subcortical stimulation close to the AF reproducibly caused 'paranomia' without speech arrest. There were strong correlations between stimulus intensity for the fiber-MEP and the distance between eloquent fibers and the stimulus points. The convergent calculation formulated 1.8 mA as the electrical threshold of CST for the fiber-MEP, which was much smaller than that of the hand motor area. Validated tractography demonstrated the mean distance and intersection angle between CST and AF were 5 mm and 107 deg, respectively. In addition, the anisotropic diffusion-weighted image (ADWI) and CST-tractography clearly indicated the locations of the primary motor area (PMA) and the central sulcus and well reflected the anatomical characteristics of the corticospinal tract in the human brain. DTI-based tractography is a reliable way to map the white matter connections in the entire brain in clinical and basic neuroscience. By combining these techniques, investigating the cortico-subcortical connections in the human central nervous system could contribute to elucidating the neural networks of the human brain and shed light

  11. Extrinsic Calibration for Vehicle-based Mobile Mapping System

    Directory of Open Access Journals (Sweden)

    SHI Limei

    2015-01-01

    Full Text Available Having the advantages of 360° imaging and rotation invariance, panoramic cameras have gradually been adopted in mobile mapping systems (MMS). Calibration is an essential requirement to ensure that an MMS can produce high-quality geo-information. This paper presents a way to address the extrinsic calibration of a vehicle-based MMS composed of a panoramic camera and a Position and Orientation System (POS). Firstly, control points are set up in the natural scene and their spatial coordinates are measured with high precision. Secondly, a panoramic spherical model is constructed, and the panoramic image can be projected onto this model by means of spherical reverse-transformation projection. Then, by localizing and selecting the control points in the 3D spherical panoramic view rather than directly in the distorted panoramic image, the spherical coordinates of the control points in the panoramic image are obtained. After point correspondences are established, the translation and rotation parameters of the panoramic camera relative to the POS are computed using the direct geo-referencing equation and coordinate transformations. Experiments were conducted with our approach separately at the Space City calibration site in Beijing and in the Binhai New Area in Tianjin. The test results are as follows. When the GPS signal is of good quality, the absolute positioning mean square error of a point is 10.3 cm in the two-dimensional plane and 16.5 cm in the height direction; otherwise, it is 35.4 cm in the two-dimensional plane and 54.8 cm in the height direction. The maximum relative error of distance measurement is about 5 cm over a short distance (distance < 3 km), which is not obviously affected by the GPS signal quality.

  12. Estimating missing hourly climatic data using artificial neural network for energy balance based ET mapping applications

    Science.gov (United States)

    Remote sensing based evapotranspiration (ET) mapping has become an important tool for water resources management at a regional scale. Accurate hourly climatic data and reference ET are crucial input for successfully implementing remote sensing based ET models such as Mapping ET with internal calibra...

  13. Comparison of model reference and map based control method for vehicle stability enhancement

    NARCIS (Netherlands)

    Baek, S.; Son, M.; Song, J.; Boo, K.; Kim, H.

    2012-01-01

    A map-based control method to improve vehicle lateral stability is proposed in this study and compared with the conventional method, a model-referenced controller. A model-referenced controller to determine the compensated yaw moment uses the sliding mode method, but the proposed map based

  14. Vehicle Detection in Aerial Images Based on Region Convolutional Neural Networks and Hard Negative Example Mining.

    Science.gov (United States)

    Tang, Tianyu; Zhou, Shilin; Deng, Zhipeng; Zou, Huanxin; Lei, Lin

    2017-02-10

    Detecting vehicles in aerial imagery plays an important role in a wide range of applications. The current vehicle detection methods are mostly based on sliding-window search and handcrafted or shallow-learning-based features, having limited description capability and heavy computational costs. Recently, due to the powerful feature representations, region convolutional neural networks (CNN) based detection methods have achieved state-of-the-art performance in computer vision, especially Faster R-CNN. However, directly using it for vehicle detection in aerial images has many limitations: (1) region proposal network (RPN) in Faster R-CNN has poor performance for accurately locating small-sized vehicles, due to the relatively coarse feature maps; and (2) the classifier after RPN cannot distinguish vehicles and complex backgrounds well. In this study, an improved detection method based on Faster R-CNN is proposed in order to accomplish the two challenges mentioned above. Firstly, to improve the recall, we employ a hyper region proposal network (HRPN) to extract vehicle-like targets with a combination of hierarchical feature maps. Then, we replace the classifier after RPN by a cascade of boosted classifiers to verify the candidate regions, aiming at reducing false detection by negative example mining. We evaluate our method on the Munich vehicle dataset and the collected vehicle dataset, with improvements in accuracy and robustness compared to existing methods.
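
    The hard negative mining step can be illustrated independently of the network: train a classifier, score a pool of background regions, and feed the highest-scoring false positives back into the training set; the features below are random placeholders for the CNN region features described in the abstract.

```python
# Illustrative sketch of hard negative example mining for a candidate-region
# classifier: train, score the negative pool, and add the highest-scoring false
# positives back into the training set.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
pos = rng.normal(1.0, 1.0, (500, 64))         # vehicle region features (synthetic)
neg_pool = rng.normal(0.0, 1.0, (20000, 64))  # background region features (synthetic)

X = np.vstack([pos, neg_pool[:500]])
y = np.r_[np.ones(500), np.zeros(500)]

for round_ in range(3):
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    scores = clf.predict_proba(neg_pool)[:, 1]           # confidence of being "vehicle"
    hard = neg_pool[np.argsort(scores)[-500:]]           # most confusing negatives
    X = np.vstack([X, hard])
    y = np.r_[y, np.zeros(len(hard))]
    print(f"round {round_}: hardest negative score = {scores.max():.3f}")
```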

  15. Cognitive Mapping Based on Conjunctive Representations of Space and Movement

    Directory of Open Access Journals (Sweden)

    Taiping Zeng

    2017-11-01

    Full Text Available It is a challenge to build robust simultaneous localization and mapping (SLAM system in dynamical large-scale environments. Inspired by recent findings in the entorhinal–hippocampal neuronal circuits, we propose a cognitive mapping model that includes continuous attractor networks of head-direction cells and conjunctive grid cells to integrate velocity information by conjunctive encodings of space and movement. Visual inputs from the local view cells in the model provide feedback cues to correct drifting errors of the attractors caused by the noisy velocity inputs. We demonstrate the mapping performance of the proposed cognitive mapping model on an open-source dataset of 66 km car journey in a 3 km × 1.6 km urban area. Experimental results show that the proposed model is robust in building a coherent semi-metric topological map of the entire urban area using a monocular camera, even though the image inputs contain various changes caused by different light conditions and terrains. The results in this study could inspire both neuroscience and robotic research to better understand the neural computational mechanisms of spatial cognition and to build robust robotic navigation systems in large-scale environments.
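
    A highly simplified sketch of the underlying idea (integrate noisy angular velocity to track heading, as the head-direction attractor would, and correct the accumulating drift whenever a familiar local view is recognized) is given below; all signals are synthetic and no attractor dynamics are modelled.

```python
# Highly simplified sketch: velocity integration of heading with periodic correction
# from recognized "local views". This illustrates the integration-plus-feedback idea
# only, not the continuous attractor network of the paper.
import numpy as np

rng = np.random.default_rng(3)
steps = 2000
true_heading = np.cumsum(rng.normal(0.0, 0.02, steps))              # ground truth (rad)
measured_omega = np.diff(true_heading, prepend=0.0) + rng.normal(0, 0.01, steps)

estimate = np.zeros(steps)
for t in range(1, steps):
    estimate[t] = estimate[t - 1] + measured_omega[t]                # velocity integration
    if t % 200 == 0:                                                 # local-view correction
        observed = true_heading[t] + rng.normal(0, 0.005)
        estimate[t] += 0.8 * (observed - estimate[t])                # pull estimate toward the cue

drift = np.abs(estimate - true_heading)
print(f"maximum heading drift: {drift.max():.3f} rad")
```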

  16. Mapping of a major QTL for salt tolerance of mature field-grown maize plants based on SNP markers.

    Science.gov (United States)

    Luo, Meijie; Zhao, Yanxin; Zhang, Ruyang; Xing, Jinfeng; Duan, Minxiao; Li, Jingna; Wang, Naishun; Wang, Wenguang; Zhang, Shasha; Chen, Zhihui; Zhang, Huasheng; Shi, Zi; Song, Wei; Zhao, Jiuran

    2017-08-15

    Salt stress significantly restricts plant growth and production. Maize is an important food and economic crop but is also a salt-sensitive crop. Identification of the genetic architecture controlling salt tolerance helps breeders select salt-tolerant lines. However, the critical quantitative trait loci (QTLs) responsible for the salt tolerance of field-grown maize plants are still unknown. To map the main genetic factors contributing to salt tolerance in mature maize, a double haploid population (240 individuals) and 1317 single nucleotide polymorphism (SNP) markers were employed to produce a genetic linkage map covering 1462.05 cM. Plant height of mature maize cultivated in the saline field (SPH) and a plant height-based salt tolerance index (ratio of plant height between saline and control fields, PHI) were used to evaluate salt tolerance of mature maize plants. A major QTL for SPH was detected on Chromosome 1 with a LOD score of 22.4, which explained 31.2% of the phenotypic variation. In addition, the major QTL conditioning PHI was also mapped at the same position on Chromosome 1, and two candidate genes involved in ion homeostasis were identified within the confidence interval of this QTL. The detection of the major QTL in adult maize plants establishes the basis for the map-based cloning of genes associated with salt tolerance and provides a potential target for marker-assisted selection in developing maize varieties with salt tolerance.

  17. A BAC/BIBAC-based physical map of chickpea, Cicer arietinum L

    Directory of Open Access Journals (Sweden)

    Abbo Shahal

    2010-09-01

    Full Text Available Abstract Background Chickpea (Cicer arietinum L.) is the third most important pulse crop worldwide. Despite its importance, relatively little is known about its genome. The availability of a genome-wide physical map allows rapid fine mapping of QTL, development of high-density genome maps, and sequencing of the entire genome. However, no such physical map had been developed in chickpea. Results We present a genome-wide, BAC/BIBAC-based physical map of chickpea developed by fingerprint analysis. Four chickpea BAC and BIBAC libraries, two of which were constructed in this study, were used. A total of 67,584 clones were fingerprinted, and 64,211 (~11.7×) of the fingerprints were validated and used in the physical map assembly. The physical map consists of 1,945 BAC/BIBAC contigs, with each containing an average of 28.3 clones and having an average physical length of 559 kb. The contigs collectively span approximately 1,088 Mb. By using the physical map, we identified the BAC/BIBAC contigs containing or closely linked to QTL4.1 for resistance to Didymella rabiei (RDR) and QTL8 for days to first flower (DTF), thus further verifying the physical map and confirming its utility in fine mapping and cloning of QTL. Conclusion The physical map represents the first genome-wide, BAC/BIBAC-based physical map of chickpea. This map, along with other genomic resources previously developed in the species and the genome sequences of related species (soybean, Medicago and Lotus), will provide a foundation necessary for many areas of advanced genomics research in chickpea and other legume species. The inclusion of transformation-ready BIBACs in the map greatly facilitates its utility in functional analysis of the legume genomes.

  18. Methodical bases of perceptual mapping of printing industry companies

    Directory of Open Access Journals (Sweden)

    Kalinin Pavel

    2017-01-01

    Full Text Available This study examines the methodological foundations of perceptual mapping for printing industry enterprises. The research has a practical focus, which affects the choice of its methodological framework. The authors use scientific research methods such as analysis of cause-effect relationships, synthesis, problem analysis, expert evaluation and image visualization. In this paper, the authors present their assessment of the competitive environment of major printing industry companies in Kirov oblast; the assessment employs perceptual mapping carried out in Minitab 14. This technique can be used by experts in the field of marketing and branding to assess the competitive environment in any market. The object of research is the printing industry in Kirov oblast. The most important conclusion of this study is that in perceptual mapping, all the parameters are integrated in a single system and provide a more objective view of the company's market situation.

  19. Lagrangian based methods for coherent structure detection

    Energy Technology Data Exchange (ETDEWEB)

    Allshouse, Michael R., E-mail: mallshouse@chaos.utexas.edu [Center for Nonlinear Dynamics and Department of Physics, University of Texas at Austin, Austin, Texas 78712 (United States); Peacock, Thomas, E-mail: tomp@mit.edu [Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)

    2015-09-15

    There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
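
    The canonical double-gyre flow mentioned above is defined by a time-dependent stream function; a short sketch of the velocity field and plain forward-Euler tracer advection, using the commonly quoted parameters (A = 0.1, eps = 0.25, omega = 2*pi/10), is given below.

```python
# Sketch of the canonical double-gyre flow and simple particle advection, the test
# case each Lagrangian method in the review is applied to. Integration is plain
# forward Euler for brevity.
import numpy as np

A, eps, omega = 0.1, 0.25, 2 * np.pi / 10

def velocity(x, y, t):
    a = eps * np.sin(omega * t)
    f = a * x**2 + (1 - 2 * a) * x
    dfdx = 2 * a * x + (1 - 2 * a)
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

# Advect a grid of tracers over t in [0, 10] on the [0, 2] x [0, 1] domain
nx, ny, dt, T = 50, 25, 0.05, 10.0
x, y = np.meshgrid(np.linspace(0, 2, nx), np.linspace(0, 1, ny))
for step in range(int(T / dt)):
    u, v = velocity(x, y, step * dt)
    x, y = x + dt * u, y + dt * v

print("tracer spread in x:", float(x.std()))
```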

  20. Intelligent-based Structural Damage Detection Model

    International Nuclear Information System (INIS)

    Lee, Eric Wai Ming; Yu, K.F.

    2010-01-01

    This paper presents the application of a novel Artificial Neural Network (ANN) model for the diagnosis of structural damage. The ANN model, denoted as the GRNNFA, is a hybrid model combining the General Regression Neural Network Model (GRNN) and the Fuzzy ART (FA) model. It not only retains the important features of the GRNN and FA models (i.e. fast and stable network training and incremental growth of network structure) but also facilitates the removal of the noise embedded in the training samples. Structural damage alters the stiffness distribution of the structure, thereby changing the natural frequencies and mode shapes of the system. The measured modal parameter changes due to a particular damage are treated as patterns for that damage. The proposed GRNNFA model was trained to learn those patterns in order to detect the possible damage location of the structure. Simulated data are employed to verify and illustrate the procedures of the proposed ANN-based damage diagnosis methodology. The results of this study have demonstrated the feasibility of applying the GRNNFA model to structural damage diagnosis even when the training samples were noise contaminated.

  1. Intelligent-based Structural Damage Detection Model

    Science.gov (United States)

    Lee, Eric Wai Ming; Yu, Kin Fung

    2010-05-01

    This paper presents the application of a novel Artificial Neural Network (ANN) model for the diagnosis of structural damage. The ANN model, denoted as the GRNNFA, is a hybrid model combining the General Regression Neural Network Model (GRNN) and the Fuzzy ART (FA) model. It not only retains the important features of the GRNN and FA models (i.e. fast and stable network training and incremental growth of network structure) but also facilitates the removal of the noise embedded in the training samples. Structural damage alters the stiffness distribution of the structure, thereby changing the natural frequencies and mode shapes of the system. The measured modal parameter changes due to a particular damage are treated as patterns for that damage. The proposed GRNNFA model was trained to learn those patterns in order to detect the possible damage location of the structure. Simulated data are employed to verify and illustrate the procedures of the proposed ANN-based damage diagnosis methodology. The results of this study have demonstrated the feasibility of applying the GRNNFA model to structural damage diagnosis even when the training samples were noise contaminated.

  2. The Hazard Mapping System (HMS)-a Multiplatform Remote Sensing Approach to Fire and Smoke Detection

    Science.gov (United States)

    Kibler, J.; Ruminski, M. G.

    2003-12-01

    smaller and/or cooler burning fires. Each of the algorithms utilizes a number of temporal, thermal and contextual filters in an attempt to screen out false detects. However, false detects do get processed by the algorithms to varying degrees. Therefore, the automated fire detects from each algorithm are quality controlled by an analyst who scans the imagery and may either accept or delete fire points. The analyst also has the ability to manually add additional fire points based on the imagery. Smoke is outlined by the analyst using visible imagery, primarily GOES which provides 1 km resolution. Occasionally a smoke plume seen in visible imagery is the only indicator of a fire and would be manually added to the fire detect file. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) is a forecast model that projects the trajectory and dispersion of a smoke plume over a period of time. The HYSPLIT is run for fires that are selected by the analyst that are seen to be producing a significant smoke plume. The analyst defines a smoke producing area commensurate with the size of the fire and amount of smoke detected. The output is hosted on an Air Resources Lab (ARL) web site which can be accessed from the web site listed below. All of the information is posted to the web page noted below. Besides the interactive GIS presentation users can view the product in graphical jpg format. The analyst edited points as well as the unedited automated fire detects are available for users to view directly on the web page or to download. All of the data is also archived and accessed via ftp.

  3. Water Pollution Detection Based on Hypothesis Testing in Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xu Luo

    2017-01-01

    Full Text Available Water pollution detection is of great importance in water conservation. In this paper, water pollution detection problems at the network level and at the node level in sensor networks are discussed. Detection is considered both when the monitoring noise is normally distributed and when it is not. The pollution detection problems are first analyzed based on hypothesis testing theory; then, the specific detection algorithms are given. Finally, two implementation examples are given to illustrate how the proposed detection methods are used for water pollution detection in sensor networks and to demonstrate their effectiveness.
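
    At the node level, the simplest version of such a test is a one-sided z-test of the sample mean against a known clean-water background; the sketch below uses synthetic readings and an assumed known noise standard deviation, which are placeholders rather than the paper's formulation.

```python
# Minimal sketch of node-level pollution detection as a hypothesis test: under H0
# the sensor readings have the clean-water background mean; H0 is rejected when the
# standardized sample mean exceeds the threshold for a chosen false-alarm rate.
import numpy as np
from scipy.stats import norm

mu0, sigma = 2.0, 0.5          # background concentration mean / std (assumed known)
alpha = 0.01                   # false-alarm probability
readings = np.random.default_rng(0).normal(2.6, 0.5, 30)   # possibly polluted node

z = (readings.mean() - mu0) / (sigma / np.sqrt(len(readings)))
threshold = norm.ppf(1 - alpha)
print(f"z = {z:.2f}, threshold = {threshold:.2f}, pollution detected: {z > threshold}")
```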

  4. A third-generation microsatellite-based linkage map of the honey bee, Apis mellifera, and its comparison with the sequence-based physical map.

    Science.gov (United States)

    Solignac, Michel; Mougel, Florence; Vautrin, Dominique; Monnerot, Monique; Cornuet, Jean-Marie

    2007-01-01

    The honey bee is a key model for social behavior and this feature led to the selection of the species for genome sequencing. A genetic map is a necessary companion to the sequence. In addition, because there was originally no physical map for the honey bee genome project, a meiotic map was the only resource for organizing the sequence assembly on the chromosomes. We present the genetic (meiotic) map here and describe the main features that emerged from comparison with the sequence-based physical map. The genetic map of the honey bee is saturated and the chromosomes are oriented from the centromeric to the telomeric regions. The map is based on 2,008 markers and is about 40 Morgans (M) long, resulting in a marker density of one every 2.05 centiMorgans (cM). For the 186 megabases (Mb) of the genome mapped and assembled, this corresponds to a very high average recombination rate of 22.04 cM/Mb. Honey bee meiosis shows a relatively homogeneous recombination rate along and across chromosomes, as well as within and between individuals. Interference is higher than inferred from the Kosambi function of distance. In addition, numerous recombination hotspots are dispersed over the genome. The very large genetic length of the honey bee genome, its small physical size and an almost complete genome sequence with a relatively low number of genes suggest a very promising future for association mapping in the honey bee, particularly as the existence of haploid males allows easy bulk segregant analysis.

  5. Developing web map application based on user centered design

    Directory of Open Access Journals (Sweden)

    Petr Voldan

    2012-03-01

    Full Text Available User centred design is an approach to developing any kind of product in which the main idea is to create the product for the end user. This article presents the user centred design method in the development of web mapping services. The method can be split into four main phases - user research, creation of concepts, development with usability research, and launch of the product. The article describes each of these phases with the aim of providing guidelines for developers and, primarily, of improving the usability of web mapping services.

  6. Affine-Invariant Geometric Constraints-Based High Accuracy Simultaneous Localization and Mapping

    Directory of Open Access Journals (Sweden)

    Gangchen Hua

    2017-01-01

    Full Text Available In this study we describe a new appearance-based loop-closure detection method for online incremental simultaneous localization and mapping (SLAM) using affine-invariant-based geometric constraints. Unlike other pure bag-of-words-based approaches, our proposed method uses geometric constraints as a supplement to improve accuracy. By establishing an affine-invariant hypothesis, the proposed method excludes incorrectly matched visual words and calculates the dispersion of correctly matched visual words to improve the accuracy of the likelihood calculation. In addition, the camera's intrinsic parameters and distortion coefficients are sufficient for this method; 3D measurement is not necessary. We use the mechanism of Long-Term Memory and Working Memory (WM) to manage the memory. Only a limited portion of the WM is used for loop-closure detection; therefore the proposed method is suitable for large-scale real-time SLAM. We tested our method using the CityCenter and Lip6Indoor datasets. Our proposed method can effectively correct the typical false-positive localizations of previous methods, thus achieving better recall ratios and better precision.

  7. Performance Analysis of Long-Reach Coherent Detection OFDM-PON Downstream Transmission Using m-QAM-Mapped OFDM Signal

    Science.gov (United States)

    Pandey, Gaurav; Goel, Aditya

    2017-12-01

    In this paper, orthogonal frequency division multiplexing (OFDM)-passive optical network (PON) downstream transmission is demonstrated over different lengths of fiber at the remote node (RN) for different m-QAM (quadrature amplitude modulation)-mapped OFDM signals (m = 4, 16, 32 and 64) transmitted from the central office (CO) at different data rates (10, 20, 30 and 40 Gbps) using coherent detection at the user end or optical network unit (ONU). The investigation is performed with different numbers of subcarriers (32, 64, 128, 512 and 1,024), back-to-back optical signal-to-noise ratio (OSNR), and transmitted and received constellation diagrams for m-QAM-mapped coherent OFDM downstream transmission at different speeds over different transmission distances. Received optical power is calculated for different bit error rates (BERs) at different speeds using m-QAM-mapped coherent detection OFDM downstream transmission. No dispersion compensation is utilized within the fiber span. Simulation results suggest the lengths and data rates that can be used for different m-QAM-mapped coherent detection OFDM downstream transmission, and the proposed system may be implemented in next-generation high-speed PONs (NG-PONs).
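
    The transmitter-side mapping can be sketched as follows: bits are mapped onto a square m-QAM constellation, an IFFT is applied across the subcarriers, and a cyclic prefix is prepended; the subcarrier count, prefix length and the natural (non-Gray) bit mapping are illustrative simplifications, not the simulated system's settings.

```python
# Sketch of m-QAM-mapped OFDM symbol generation (transmitter side): map bits onto a
# square QAM constellation, apply an IFFT over the subcarriers, and prepend a
# cyclic prefix.
import numpy as np

def qam_symbols(bits, m=16):
    k = int(np.log2(m))                        # bits per symbol
    side = int(np.sqrt(m))
    b = bits.reshape(-1, k)
    i = b[:, :k // 2] @ (2 ** np.arange(k // 2)[::-1])
    q = b[:, k // 2:] @ (2 ** np.arange(k // 2)[::-1])
    return (2 * i - (side - 1)) + 1j * (2 * q - (side - 1))   # un-normalized grid

def ofdm_symbol(syms, n_sc=64, cp=16):
    x = np.fft.ifft(syms[:n_sc], n_sc)
    return np.concatenate([x[-cp:], x])        # cyclic prefix + useful part

bits = np.random.default_rng(0).integers(0, 2, 64 * 4)
tx = ofdm_symbol(qam_symbols(bits, m=16))
print("time-domain samples per OFDM symbol:", tx.size)
```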

  8. 1H-detected MAS solid-state NMR experiments enable the simultaneous mapping of rigid and dynamic domains of membrane proteins

    Science.gov (United States)

    Gopinath, T.; Nelson, Sarah E. D.; Veglia, Gianluigi

    2017-12-01

    Magic angle spinning (MAS) solid-state NMR (ssNMR) spectroscopy is emerging as a unique method for the atomic resolution structure determination of native membrane proteins in lipid bilayers. Although 13C-detected ssNMR experiments continue to play a major role, recent technological developments have made it possible to carry out 1H-detected experiments, boosting both sensitivity and resolution. Here, we describe a new set of 1H-detected hybrid pulse sequences that combine through-bond and through-space correlation elements into single experiments, enabling the simultaneous detection of rigid and dynamic domains of membrane proteins. As proof-of-principle, we applied these new pulse sequences to the membrane protein phospholamban (PLN) reconstituted in lipid bilayers under moderate MAS conditions. The cross-polarization (CP) based elements enabled the detection of the relatively immobile residues of PLN in the transmembrane domain using through-space correlations; whereas the most dynamic region, which is in equilibrium between folded and unfolded states, was mapped by through-bond INEPT-based elements. These new 1H-detected experiments will enable one to detect not only the most populated (ground) states of biomacromolecules, but also sparsely populated high-energy (excited) states for a complete characterization of protein free energy landscapes.

  9. Feature Matching for SAR and Optical Images Based on Gaussian-Gamma-shaped Edge Strength Map

    Directory of Open Access Journals (Sweden)

    CHEN Min

    2016-03-01

    Full Text Available A matching method for SAR and optical images, robust to pixel noise and nonlinear grayscale differences, is presented. First, a rough correction is performed to eliminate rotation and scale change between the images. Second, features robust to the speckle noise of SAR images are detected by improving the original phase-congruency-based method. Then, feature descriptors are constructed on the Gaussian-Gamma-shaped edge strength map according to the histogram of oriented gradient pattern. Finally, descriptor similarity and geometrical relationships are combined to constrain the matching process. The experimental results demonstrate that the proposed method provides a significant improvement in the number of correct matches and in image registration accuracy compared with other traditional methods.

  10. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Full Text Available Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.
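
    Since the record above centres on wavelet approximation of traffic features, the following minimal sketch shows one way such an idea can be realised: smooth a per-second traffic feature with a wavelet approximation and flag bins whose residual is unusually large. It uses PyWavelets and synthetic data; the feature, wavelet, level, and threshold are illustrative assumptions, not the authors' modelling technique.

```python
# Illustrative wavelet-approximation anomaly scoring on a synthetic traffic feature.
import numpy as np
import pywt

def wavelet_anomaly_scores(signal, wavelet="db4", level=3):
    """Deviation of each sample from the smooth wavelet approximation of the signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]  # keep approximation only
    approx = pywt.waverec(coeffs, wavelet)[: len(signal)]
    return np.abs(signal - approx)

rng = np.random.default_rng(0)
flows = rng.poisson(100, size=512).astype(float)  # flows per second (synthetic)
flows[300:310] += 400                              # injected anomaly
scores = wavelet_anomaly_scores(flows)
print("anomalous bins:", np.where(scores > scores.mean() + 3 * scores.std())[0])
```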

  11. Dangerous gas detection based on infrared video

    Science.gov (United States)

    Ding, Kang; Hong, Hanyu; Huang, Likun

    2018-03-01

    As gas leak infrared imaging detection technology has the significant advantages of high efficiency and remote imaging detection, and in order to enhance observers' perception of detail and correspondingly improve the detection limit, we propose a new gas leak infrared image detection method that combines a background difference method with a multi-frame interval difference method. Compared with traditional frame-difference methods, the proposed multi-frame interval difference method can extract a more complete target image. By fusing the background difference image and the multi-frame interval difference image, we accumulate information about the infrared target image of the gas leak from several aspects. The experiments demonstrate that the completeness of the gas leakage trace information is significantly enhanced, and that real-time detection can be achieved.
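
    A minimal sketch of the general fusion idea described above (background difference combined with a multi-frame interval difference) is given below, assuming grayscale frames as NumPy arrays and illustrative threshold and interval values; it is not the authors' implementation.

```python
# Fuse a background-difference mask with a multi-frame interval-difference mask.
import cv2

def detect_gas_trace(frames, background, interval=5, thresh=15):
    """frames: list of >= interval+1 grayscale uint8 frames; background: grayscale uint8 model."""
    current = frames[-1]
    # Background difference: everything that differs from the static scene.
    bg_mask = cv2.threshold(cv2.absdiff(current, background), thresh, 255, cv2.THRESH_BINARY)[1]
    # Interval difference against an older frame: slowly drifting plume structure.
    int_mask = cv2.threshold(cv2.absdiff(current, frames[-1 - interval]), thresh, 255, cv2.THRESH_BINARY)[1]
    # Fusion accumulates the gas leak trace from both cues.
    return cv2.bitwise_or(bg_mask, int_mask)

# Usage (hypothetical variables): trace = detect_gas_trace(frame_buffer, background_frame)
```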

  12. Network Anomaly Detection Based on Wavelet Analysis

    Science.gov (United States)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.

  13. MapReduce-based Dimensional ETL Made Easy

    DEFF Research Database (Denmark)

    Xiufeng, Liu; Thomsen, Christian; Pedersen, Torben Bach

    2012-01-01

    This paper demonstrates ETLMR, a novel dimensional Extract–Transform–Load (ETL) programming framework that uses MapReduce to achieve scalability. ETLMR has built-in native support for data warehouse (DW)-specific constructs such as star schemas, snowflake schemas, and slowly changing dimensions (SCDs...

  14. Improving Seroreactivity-Based Detection of Glioma

    Directory of Open Access Journals (Sweden)

    Nicole Ludwig

    2009-12-01

    Full Text Available Seroreactivity profiling is emerging as a valuable technique for minimally invasive cancer detection. Recently, we provided first evidence for the applicability of serum profiling of glioma using a limited number of immunogenic antigens. Here, we screened 57 glioma and 60 healthy sera for autoantibodies against 1827 Escherichia coli expressed clones, including 509 in-frame peptide sequences. Using a linear support vector machine approach, we calculated the mean specificity, sensitivity, and accuracy of 100 repeated classifications. We were able to differentiate glioma sera from sera of the healthy controls with a specificity of 90.28%, a sensitivity of 87.31% and an accuracy of 88.84%. We were also able to differentiate World Health Organization grade IV glioma sera from healthy sera with a specificity of 98.45%, a sensitivity of 80.93%, and an accuracy of 92.88%. To rank the antigens according to their information content, we computed the area under the receiver operating characteristic curve for each clone. Altogether, we found 46 immunogenic clones, including 16 in-frame clones, that were informative for the classification of glioma sera versus healthy sera. For the separation of glioblastoma versus healthy sera, we found 91 informative clones, including 26 in-frame clones. The best-suited in-frame clone for the classification of glioma sera versus healthy sera corresponded to the vimentin gene (VIM), which was previously associated with glioma. In the future, autoantibody signatures in glioma may not only prove useful for diagnosis but also offer the prospect of a personalized immune-based therapy.
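
    The evaluation protocol described above (a linear SVM scored over 100 repeated classifications, reporting mean sensitivity, specificity and accuracy) can be sketched as follows. The data here are synthetic stand-ins, and the split ratio and solver settings are assumptions, not the study's exact configuration.

```python
# Repeated linear-SVM classification with mean sensitivity/specificity/accuracy.
import numpy as np
from sklearn.model_selection import StratifiedShuffleSplit
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
X = rng.normal(size=(117, 1827))      # 57 glioma + 60 control sera, 1827 clone intensities (synthetic)
y = np.array([1] * 57 + [0] * 60)     # 1 = glioma, 0 = healthy control

sens, spec, acc = [], [], []
for train, test in StratifiedShuffleSplit(n_splits=100, test_size=0.2, random_state=0).split(X, y):
    pred = LinearSVC(dual=False).fit(X[train], y[train]).predict(X[test])
    sens.append(np.mean(pred[y[test] == 1] == 1))   # sensitivity on this split
    spec.append(np.mean(pred[y[test] == 0] == 0))   # specificity on this split
    acc.append(np.mean(pred == y[test]))

print(f"sensitivity={np.mean(sens):.3f} specificity={np.mean(spec):.3f} accuracy={np.mean(acc):.3f}")
```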

  15. Usability evaluation of cloud-based mapping tools for the display of very large datasets

    Science.gov (United States)

    Stotz, Nicole Marie

    The elasticity and on-demand nature of cloud services have made it easier to create web maps. Users only need access to a web browser and the Internet to utilize cloud-based web maps, eliminating the need for specialized software. To encourage a wide variety of users, a map must be well designed; usability is a very important concept in designing a web map. Fusion Tables, a new product from Google, is one example of newer cloud-based distributed GIS services. It allows for easy spatial data manipulation and visualization within the Google Maps framework. ESRI has also introduced a cloud-based version of their software, called ArcGIS Online, built on Amazon's EC2 cloud. Utilizing a user-centered design framework, two prototype maps were created with data from the San Diego East County Economic Development Council. One map was built on Fusion Tables, and another on ESRI's ArcGIS Online. A usability analysis was conducted and used to compare both map prototypes in terms of design and functionality. Load tests were also run, and performance metrics gathered on both map prototypes. The usability analysis was taken by 25 geography students and consisted of time-based tasks and questions on map design and functionality. Survey participants completed the time-based tasks for the Fusion Tables map prototype more quickly than those of the ArcGIS Online map prototype. While responses were generally positive towards the design and functionality of both prototypes, overall the Fusion Tables map prototype was preferred. For the load tests, the data set was broken into 22 groups for a total of 44 tests. While the Fusion Tables map prototype performed more efficiently than the ArcGIS Online prototype, the differences are almost unnoticeable. A SWOT analysis was conducted for each prototype. The results from this research point to the Fusion Tables map prototype. A redesign of this prototype would incorporate design suggestions from the usability survey, while some functionality would

  16. Lymphatic mapping and sentinel lymph node detection in patients with breast cancer

    International Nuclear Information System (INIS)

    Chen, S.L.; Du, Q.Q.; Shi, H.C.; Chen, J.X.; Wang, H.

    2002-01-01

    Objectives: To localize the sentinel lymph node (SLN) and to test the hypothesis that the histologic characteristics of the SLN can predict the histologic characteristics of the remaining lymph nodes along the lymphatic chain. To calculate the absorbed dose for patients, doctors and nurses. Methods: Seventy-one patients with early-stage breast cancer underwent SLN localization using filtered technetium-99m labeled sulfur colloid, blue dye, or a combination of the two. The SLN was identified as a blue lymph node and/or a 'hot lymph node' detected by an ex vivo gamma probe. A 'hot lymph node' was defined as a lymph node whose radioactivity was 10 times higher than that of the background. Pathological examination was performed on all resected lymph nodes. The approximate absorbed dose for the patients, doctors and nurses was calculated using MIRD techniques. Results: For patients injected with only blue dye, the sensitivity, accuracy and false negative rate were 80.0%, 90.7% and 20.0% respectively. For patients injected with only radioactive colloids, the sensitivity, accuracy and false negative rate were 100%, 100% and 0% respectively. For patients injected with both blue dye and radioactive colloids, the sensitivity, accuracy and false negative rate were 100%, 100% and 0% respectively. The absorbed dose of breast tissue was 26.52 rad. The absorbed dose of nuclear medicine doctors, surgeons, nurses and pathologists was 1.9 × 10⁻² rad, 9.6 × 10⁻³ rad, 3.8 × 10⁻⁴ rad and 9.6 × 10⁻³ rad respectively. Conclusions: Lymphatic mapping and SLN biopsy were most effective when a combination of blue dye and radio-labeled sulfur colloid was used. Radio-labeled sulfur colloid was safe for patients and the medical staff. SLN biopsy has potential value for avoiding unnecessary axillary lymph node resection in patients with early-stage breast cancer.

  17. Mapping of Rill Erosion of Arable Soils Based on Unmanned Aerial Vehicles Survey

    Science.gov (United States)

    Kashtanov, A. N.; Vernyuk, Yu. I.; Savin, I. Yu.; Shchepot'ev, V. V.; Dokukin, P. A.; Sharychev, D. V.; Li, K. A.

    2018-04-01

    Possibilities of using data obtained from unmanned aerial vehicles for detection and mapping of rill erosion on arable lands are analyzed. Identification and mapping of rill erosion was performed on a key plot with a predominance of arable gray forest soils (Greyzemic Phaeozems) under winter wheat in Tula oblast. This plot was surveyed from different heights and in different periods to determine the reliability of identification of rill erosion on the basis of automated procedures in a GIS. It was found that, despite changes in the pattern of rills during the warm season, only one survey during this season is sufficient for adequate assessment of the area of eroded soils. According to our data, the most reliable identification of rill erosion is based on the aerial survey from the height of 50 m above the soil surface. When the height of the flight is more than 200 m, erosional rills virtually escape identification. The efficiency of identification depends on the type of crops, their status, and time of the survey. The surveys of bare soil surface in periods with maximum possible interval from the previous rain or snowmelt season are most efficient. The results of our study can be used in the systems of remote sensing monitoring of erosional processes on arable fields. Application of multi- and hyperspectral cameras can improve the efficiency of monitoring.

  18. THE PERFORMANCE ANALYSIS OF A UAV BASED MOBILE MAPPING SYSTEM PLATFORM

    Directory of Open Access Journals (Sweden)

    M. L. Tsai

    2013-08-01

    Full Text Available To facilitate applications such as environment detection or disaster monitoring, the development of rapid low cost systems for collecting near real-time spatial information is very critical. Rapid spatial information collection has become an emerging trend for remote sensing and mapping applications. This study develops a Direct Georeferencing (DG) based fixed-wing Unmanned Aerial Vehicle (UAV) photogrammetric platform where an Inertial Navigation System (INS)/Global Positioning System (GPS) integrated Positioning and Orientation System (POS) system is implemented to provide the DG capability of the platform. The performance verification indicates that the proposed platform can capture aerial images successfully. A flight test is performed to verify the positioning accuracy in DG mode without using Ground Control Points (GCP). The preliminary results illustrate that horizontal DG positioning accuracies in the x and y axes are around 5 m with 300 m flight height. The positioning accuracy in the z axis is less than 10 m. Such accuracy is good for near real-time disaster relief. The DG ready function of proposed platform guarantees mapping and positioning capability even in GCP free environments, which is very important for rapid urgent response for disaster relief. Generally speaking, the data processing time for the DG module, including POS solution generalization, interpolation, Exterior Orientation Parameters (EOP) generation, and feature point measurements, is less than one hour.

  19. The Performance Analysis of a Uav Based Mobile Mapping System Platform

    Science.gov (United States)

    Tsai, M. L.; Chiang, K. W.; Lo, C. F.; Ch, C. H.

    2013-08-01

    To facilitate applications such as environment detection or disaster monitoring, the development of rapid low cost systems for collecting near real-time spatial information is very critical. Rapid spatial information collection has become an emerging trend for remote sensing and mapping applications. This study develops a Direct Georeferencing (DG) based fixed-wing Unmanned Aerial Vehicle (UAV) photogrammetric platform where an Inertial Navigation System (INS)/Global Positioning System (GPS) integrated Positioning and Orientation System (POS) system is implemented to provide the DG capability of the platform. The performance verification indicates that the proposed platform can capture aerial images successfully. A flight test is performed to verify the positioning accuracy in DG mode without using Ground Control Points (GCP). The preliminary results illustrate that horizontal DG positioning accuracies in the x and y axes are around 5 m with 300 m flight height. The positioning accuracy in the z axis is less than 10 m. Such accuracy is good for near real-time disaster relief. The DG ready function of proposed platform guarantees mapping and positioning capability even in GCP free environments, which is very important for rapid urgent response for disaster relief. Generally speaking, the data processing time for the DG module, including POS solution generalization, interpolation, Exterior Orientation Parameters (EOP) generation, and feature point measurements, is less than one hour.

  20. Construction of an SNP-based high-density linkage map for flax (Linum usitatissimum L.) using specific length amplified fragment sequencing (SLAF-seq) technology.

    Science.gov (United States)

    Yi, Liuxi; Gao, Fengyun; Siqin, Bateer; Zhou, Yu; Li, Qiang; Zhao, Xiaoqing; Jia, Xiaoyun; Zhang, Hui

    2017-01-01

    Flax is an important crop for oil and fiber; however, no high-density genetic maps have been reported for this species. Specific length amplified fragment sequencing (SLAF-seq) is a high-resolution strategy for large-scale de novo discovery and genotyping of single nucleotide polymorphisms. In this study, SLAF-seq was employed to develop SNP markers in an F2 population to construct a high-density genetic map for flax. In total, 196.29 million paired-end reads were obtained. The average sequencing depth was 25.08 in the male parent, 32.17 in the female parent, and 9.64 in each F2 progeny. In total, 389,288 polymorphic SLAFs were detected, from which 260,380 polymorphic SNPs were developed. After filtering, 4,638 SNPs were found suitable for genetic map construction. The final genetic map included 4,145 SNP markers on 15 linkage groups and was 2,632.94 cM in length, with an average distance of 0.64 cM between adjacent markers. To our knowledge, this map is the densest SNP-based genetic map for flax. The SNP markers and genetic map reported here will serve as a foundation for the fine mapping of quantitative trait loci (QTLs), map-based gene cloning and marker-assisted selection (MAS) for flax.

  1. Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles

    Science.gov (United States)

    Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.

    2015-08-01

    The automated and cost-effective detection of buildings in ultra-high spatial resolution imagery is of major importance for various engineering and smart-city applications. To this end, in this paper, a model-based building detection technique has been developed that is able to extract and reconstruct buildings from UAV aerial imagery and low-cost imaging sensors. In particular, the developed approach computes, through advanced structure from motion, bundle adjustment and dense image matching, a DSM and a true orthomosaic from the numerous GoPro images, which are characterised by important geometric distortions and a fish-eye effect. An unsupervised multi-region graph-cut segmentation and a rule-based classification are responsible for delivering the initial multi-class classification map. The DTM is then calculated based on an inpainting and mathematical morphology process. A data fusion process between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based building reconstruction, from which the scene's buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating object-level detection rates of 88% for correctness and above 75% for completeness.

  2. A Deep Convolutional Coupling Network for Change Detection Based on Heterogeneous Optical and Radar Images.

    Science.gov (United States)

    Liu, Jia; Gong, Maoguo; Qin, Kai; Zhang, Puzhao

    2018-03-01

    We propose an unsupervised deep convolutional coupling network for change detection based on two heterogeneous images acquired by optical sensors and radars on different dates. Most existing change detection methods are based on homogeneous images. Due to the complementary properties of optical and radar sensors, there is an increasing interest in change detection based on heterogeneous images. The proposed network is symmetric, with each side consisting of one convolutional layer and several coupling layers. The two input images, connected to the two sides of the network, respectively, are transformed into a feature space where their feature representations become more consistent. In this feature space, the difference map is calculated, which then leads to the ultimate detection map by applying a thresholding algorithm. The network parameters are learned by optimizing a coupling function. The learning process is unsupervised, which is different from most existing change detection methods based on heterogeneous images. Experimental results on both homogeneous and heterogeneous images demonstrate the promising performance of the proposed network compared with several existing approaches.
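
    Only the final stage mentioned above (turning a difference map into a detection map with a thresholding algorithm) is easy to illustrate without the network itself; the sketch below applies Otsu thresholding to the absolute difference of two co-registered feature images. The input arrays are hypothetical, and the deep coupling network that produces consistent features is not shown.

```python
# Threshold a per-pixel difference map into a binary change map (illustrative only).
import numpy as np
from skimage.filters import threshold_otsu

def change_map(feat_t1, feat_t2):
    diff = np.abs(feat_t1.astype(float) - feat_t2.astype(float))  # difference map
    return diff > threshold_otsu(diff)                            # binary detection map

# Usage (hypothetical arrays): binary = change_map(features_date1, features_date2)
```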

  3. CdTe detector based PIXE mapping of geological samples

    Energy Technology Data Exchange (ETDEWEB)

    Chaves, P.C., E-mail: cchaves@ctn.ist.utl.pt [Centro de Física Atómica da Universidade de Lisboa, Av. Prof. Gama Pinto 2, 1649-003 Lisboa (Portugal); IST/ITN, Instituto Superior Técnico, Universidade Técnica de Lisboa, Campus Tecnológico e Nuclear, EN10, 2686-953 Sacavém (Portugal); Taborda, A. [Centro de Física Atómica da Universidade de Lisboa, Av. Prof. Gama Pinto 2, 1649-003 Lisboa (Portugal); IST/ITN, Instituto Superior Técnico, Universidade Técnica de Lisboa, Campus Tecnológico e Nuclear, EN10, 2686-953 Sacavém (Portugal); Oliveira, D.P.S. de [Laboratório Nacional de Energia e Geologia (LNEG), Apartado 7586, 2611-901 Alfragide (Portugal); Reis, M.A. [Centro de Física Atómica da Universidade de Lisboa, Av. Prof. Gama Pinto 2, 1649-003 Lisboa (Portugal); IST/ITN, Instituto Superior Técnico, Universidade Técnica de Lisboa, Campus Tecnológico e Nuclear, EN10, 2686-953 Sacavém (Portugal)

    2014-01-01

    A sample collected from a borehole drilled approximately 10 km ESE of Bragança, Trás-os-Montes, was analysed by standard and high-energy PIXE at both CTN (previously ITN) PIXE setups. The sample is a fine-grained metapyroxenite grading to coarse-grained at the base, with disseminated sulphides and fine veinlets of pyrrhotite and pyrite. The matrix composition was obtained at the standard PIXE setup using a 1.25 MeV H⁺ beam at three different spots. Medium- and high-Z elemental concentrations were then determined using the DT2fit and DT2simul codes (Reis et al., 2008, 2013 [1,2]) on the spectra obtained in the High Resolution and High Energy (HRHE)-PIXE setup (Chaves et al., 2013 [3]) by irradiation of the sample with a 3.8 MeV proton beam provided by the CTN 3 MV Tandetron accelerator. In this paper we present the results, discuss the detection limits of the method and the added value of the use of the CdTe detector in this context.

  4. Spatio-temporal dimension of lightning flashes based on three-dimensional Lightning Mapping Array

    Science.gov (United States)

    López, Jesús A.; Pineda, Nicolau; Montanyà, Joan; Velde, Oscar van der; Fabró, Ferran; Romero, David

    2017-11-01

    3D mapping systems like the LMA - Lightning Mapping Array - are a leap forward in lightning observation. LMA measurements have led to an improvement in the analysis of the fine structure of lightning, allowing the duration and maximum extension of the cloud fraction of a lightning flash to be characterized. During several years of operation, the first LMA deployed in Europe has been providing a large amount of data which now allows a statistical approach to compute the full duration and horizontal extension of the in-cloud phase of a lightning flash. The "Ebro Lightning Mapping Array" (ELMA) is used in the present study. Summer and winter lightning were analyzed for seasonal periods (Dec-Feb and Jun-Aug). A simple method based on an ellipse fitting technique (EFT) has been used to characterize the spatio-temporal dimensions of a set of about 29,000 lightning flashes including both summer and winter events. Results show an average lightning flash duration of 440 ms (450 ms in winter) and a maximum horizontal length of 15.0 km (18.4 km in winter). The uncertainties for summer lightning lengths were about ± 1.2 km and ± 0.7 km for the mean and median values, respectively. In the case of winter lightning, the level of uncertainty reaches 1 km and 0.7 km for the mean and median values. The CG discharges successfully correlated with the EFT method represent 6.9% and 35.5% of the total LMA flashes detected in summer and winter, respectively. Additionally, the median value of lightning lengths calculated through this correlative method was approximately 17 km for both seasons. On the other hand, the highest median ratios of lightning length to CG discharges in both summer and winter were reported for positive CG discharges.
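
    A generic way to obtain a flash length from mapped sources, in the spirit of the ellipse fitting technique mentioned above, is to fit an ellipse to the horizontal source locations via their covariance and take the major axis as the length. The scaling factor and the synthetic source cloud below are assumptions, not necessarily the exact EFT used in the study.

```python
# PCA-style ellipse fit to LMA source locations; the major axis approximates flash length.
import numpy as np

def flash_length_km(x_km, y_km, n_std=2.0):
    pts = np.column_stack([x_km, y_km])
    cov = np.cov(pts, rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)          # ascending eigenvalues of the 2x2 covariance
    return 2 * n_std * np.sqrt(eigvals[-1])    # full major-axis length in km

rng = np.random.default_rng(5)
x = rng.normal(0.0, 5.0, 300)                  # synthetic, elongated source cloud
y = rng.normal(0.0, 1.0, 300)
print(f"estimated flash length: {flash_length_km(x, y):.1f} km")
```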

  5. The impact of sample size on the reproducibility of voxel-based lesion-deficit mappings.

    Science.gov (United States)

    Lorca-Puls, Diego L; Gajardo-Vidal, Andrea; White, Jitrachote; Seghier, Mohamed L; Leff, Alexander P; Green, David W; Crinion, Jenny T; Ludersdorfer, Philipp; Hope, Thomas M H; Bowman, Howard; Price, Cathy J

    2018-07-01

    This study investigated how sample size affects the reproducibility of findings from univariate voxel-based lesion-deficit analyses (e.g., voxel-based lesion-symptom mapping and voxel-based morphometry). Our effect of interest was the strength of the mapping between brain damage and speech articulation difficulties, as measured in terms of the proportion of variance explained. First, we identified a region of interest by searching on a voxel-by-voxel basis for brain areas where greater lesion load was associated with poorer speech articulation using a large sample of 360 right-handed English-speaking stroke survivors. We then randomly drew thousands of bootstrap samples from this data set that included either 30, 60, 90, 120, 180, or 360 patients. For each resample, we recorded effect size estimates and p values after conducting exactly the same lesion-deficit analysis within the previously identified region of interest and holding all procedures constant. The results show (1) how often small effect sizes in a heterogeneous population fail to be detected; (2) how effect size and its statistical significance varies with sample size; (3) how low-powered studies (due to small sample sizes) can greatly over-estimate as well as under-estimate effect sizes; and (4) how large sample sizes (N ≥ 90) can yield highly significant p values even when effect sizes are so small that they become trivial in practical terms. The implications of these findings for interpreting the results from univariate voxel-based lesion-deficit analyses are discussed. Copyright © 2018 The Author(s). Published by Elsevier Ltd.. All rights reserved.

  6. Trojan detection model based on network behavior analysis

    International Nuclear Information System (INIS)

    Liu Junrong; Liu Baoxu; Wang Wenjin

    2012-01-01

    Based on an analysis of existing Trojan detection technology, this paper presents a Trojan detection model based on network behavior analysis. First, we give an abstract description of Trojan network behavior; then, according to certain rules, we establish a library of characteristic behaviors; finally, we use a support vector machine algorithm to determine whether a Trojan intrusion has occurred. Intrusion detection experiments show that this model can effectively detect Trojans. (authors)

  7. EXTINCTION MAP OF THE SMALL MAGELLANIC CLOUD BASED ON THE SIRIUS AND 6X 2MASS POINT SOURCE CATALOGS

    International Nuclear Information System (INIS)

    Dobashi, Kazuhito; Egusa, Fumi; Bernard, Jean-Philippe; Paradis, Deborah; Kawamura, Akiko; Hughes, Annie; Bot, Caroline; Reach, William T.

    2009-01-01

    In this paper, we present the first extinction map of the Small Magellanic Cloud (SMC) constructed using the color excess at near-infrared wavelengths. Using a new technique named the "X percentile method", which we developed recently to measure the color excess of dark clouds embedded within a star distribution, we have derived an E(J - H) map based on the SIRIUS and 6X Two Micron All Sky Survey (2MASS) star catalogs. Several dark clouds are detected in the map derived from the SIRIUS star catalog, which is deeper than the 6X 2MASS catalog. We have compared the E(J - H) map with a model calculation in order to infer the locations of the clouds along the line of sight, and found that many of them are likely to be located in or elongated toward the far side of the SMC. Most of the dark clouds found in the E(J - H) map have counterparts in the CO clouds detected by Mizuno et al. with the NANTEN telescope. A comparison of the E(J - H) map with the virial mass derived from the CO data indicates that the dust-to-gas ratio in the SMC varies in the range A_V/N_H = 1-2 × 10⁻²² mag H⁻¹ cm², with a mean value of ∼1.5 × 10⁻²² mag H⁻¹ cm². If the virial mass underestimates the true cloud mass by a factor of ∼2, as recently suggested by Bot et al., the mean value would decrease to ∼8 × 10⁻²³ mag H⁻¹ cm², in good agreement with the value reported by Gordon et al., 7.59 × 10⁻²³ mag H⁻¹ cm².

  8. Performance Comparison of Reputation Assessment Techniques Based on Self-Organizing Maps in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sabrina Sicari

    2017-01-01

    Full Text Available Many solutions based on machine learning techniques have been proposed in the literature aimed at detecting and promptly counteracting various kinds of malicious attacks (data violation, clone, sybil, neglect, greed, and DoS attacks), which frequently affect Wireless Sensor Networks (WSNs). Besides recognizing the corrupted or violated information, the attackers themselves should also be identified, in order to activate the proper countermeasures to preserve the network's resources and to mitigate their malicious effects. To this end, techniques adopting Self-Organizing Maps (SOM) for intrusion detection in WSNs have proved to represent a valuable and effective solution to the problem. In this paper, the mechanism named Good Network (GoNe), which is based on SOM and is able to assess the reliability of the sensor nodes, is compared with another relevant and similar work existing in the literature. Extensive performance simulations, in terms of nodes' classification, attack identification, data accuracy, energy consumption, and signalling overhead, have been carried out in order to demonstrate the better feasibility and efficiency of the proposed solution in the WSN field.

  9. Precise mapping of annual river bed changes based on airborne laser bathymetry

    Science.gov (United States)

    Mandlburger, Gottfried; Wieser, Martin; Pfeifer, Norbert; Pfennigbauer, Martin; Steinbacher, Frank; Aufleger, Markus

    2014-05-01

    three epochs constituting an excellent basis for both the visual and quantitative estimation of the changes over the year. It turned out that remarkable differences could be detected even between the April and May flights, although there was no major precipitation event in between and the flow conditions were entirely below mean flow. In contrast to the moderate changes between April and May, the flood event in June 2013 (HQ1) resulted in a radical change of the river bed topography, documented by the October flight. Since the study site (Neubacher Au) is a Natura 2000 conservation area, space for a meandering flow is allowed. Entire gravel bars have been removed and new bars were deposited downstream. Furthermore, the river axis was locally shifted by more than 1 m during the flood event. The results demonstrate the high potential of laser bathymetry for precise mapping of river bed changes. This opens new perspectives for the validation of sediment transport models and a much better understanding of river morphology (e.g. the formation and changes of sand and gravel banks). The traditional approach in sediment transport modelling, based on a limited number of cross sections, can be complemented or even replaced by a more comprehensive and more reliable concept on the basis of spatially distributed river bed data. Valuable calibration data of a new quality will become available.

  10. QRS Detection Based on Improved Adaptive Threshold

    Directory of Open Access Journals (Sweden)

    Xuanyu Lu

    2018-01-01

    Full Text Available Cardiovascular disease is the leading cause of death around the world. In achieving quick and accurate diagnosis, automatic electrocardiogram (ECG) analysis algorithms play an important role, and their first step is QRS detection. The threshold algorithm for QRS complex detection is known for its high-speed computation and minimal memory storage. In this mobile era, threshold algorithms can easily be ported to portable, wearable, and wireless ECG systems. However, the detection rate of the threshold algorithm still calls for improvement. An improved adaptive threshold algorithm for QRS detection is reported in this paper. The main steps of this algorithm are preprocessing, peak finding, and adaptive-threshold QRS detection. The detection rate is 99.41%, the sensitivity (Se) is 99.72%, and the specificity (Sp) is 99.69% on the MIT-BIH Arrhythmia database. A comparison is also made with two other algorithms to demonstrate its superiority. The suspicious abnormal area is shown at the end of the algorithm, and an RR-Lorenz plot is drawn for doctors and cardiologists to use as an aid for diagnosis.
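
    The three stages named above (preprocessing, peak finding, adaptive-threshold decision) can be roughly sketched as below; the filter length, refractory period, and threshold update weights are illustrative choices, not the paper's tuned parameters.

```python
# Rough threshold-based QRS detection: emphasise QRS energy, find peaks, adapt the threshold.
import numpy as np
from scipy.signal import find_peaks

def detect_qrs(ecg, fs=360):
    # Preprocessing: differentiate, square, and smooth over ~150 ms to emphasise QRS energy.
    diff = np.diff(ecg, prepend=ecg[0])
    win = int(0.15 * fs)
    energy = np.convolve(diff ** 2, np.ones(win) / win, mode="same")
    # Peak finding with a ~200 ms refractory period between candidate beats.
    peaks, _ = find_peaks(energy, distance=int(0.2 * fs))
    qrs, threshold = [], 0.0
    for p in peaks:
        if energy[p] > threshold:
            qrs.append(p)
            # Adaptive threshold: track a fraction of the running QRS peak level.
            threshold = 0.5 * energy[p] if threshold == 0.0 else 0.875 * threshold + 0.0625 * energy[p]
    return np.array(qrs)

# Usage (hypothetical signal): r_peaks = detect_qrs(ecg_samples, fs=360)
```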

  11. Design of Intelligent Transportation Inquiry System Based on MapX in the Environment of VC++

    Directory of Open Access Journals (Sweden)

    Cheng Juan

    2016-01-01

    Full Text Available This paper uses MapInfo, a professional GIS software tool, for integrated secondary development combined with electronic maps, employing Visual C++ as the visual development platform and embedding MapX, a control of MapInfo, to integrate the two. The paper designs an intelligent transportation inquiry system, including query subsystems for road information, bus information, and district information, which can carry out GIS-based spatial analysis and query functions. SQL Server is adopted to manage the attribute data; through data binding, the attribute data in SQL Server and the vector map data are combined.

  12. CloudAligner: A fast and full-featured MapReduce based tool for sequence mapping

    Directory of Open Access Journals (Sweden)

    Shi Weisong

    2011-06-01

    Full Text Available Abstract Background Research in genetics has developed rapidly recently due to the aid of next generation sequencing (NGS). However, massively-parallel NGS produces enormous amounts of data, which leads to storage, compatibility, scalability, and performance issues. The Cloud Computing and MapReduce framework, which utilizes hundreds or thousands of shared computers to map sequencing reads quickly and efficiently to reference genome sequences, appears to be a very promising solution for these issues. Consequently, it has been adopted by many organizations recently, and the initial results are very promising. However, since these are only initial steps toward this trend, the developed software does not provide adequate primary functions like bisulfite, pair-end mapping, etc., in on-site software such as RMAP or BS Seeker. In addition, existing MapReduce-based applications were not designed to process the long reads produced by the most recent second-generation and third-generation NGS instruments and, therefore, are inefficient. Last, it is difficult for a majority of biologists untrained in programming skills to use these tools because most were developed on Linux with a command line interface. Results To urge the trend of using Cloud technologies in genomics and prepare for advances in second- and third-generation DNA sequencing, we have built a Hadoop MapReduce-based application, CloudAligner, which achieves higher performance, covers most primary features, is more accurate, and has a user-friendly interface. It was also designed to be able to deal with long sequences. The performance gain of CloudAligner over Cloud-based counterparts (35 to 80%) mainly comes from the omission of the reduce phase. In comparison to local-based approaches, the performance gain of CloudAligner is from the partition and parallel processing of the huge reference genome as well as the reads. The source code of CloudAligner is available at http

  13. CloudAligner: A fast and full-featured MapReduce based tool for sequence mapping.

    Science.gov (United States)

    Nguyen, Tung; Shi, Weisong; Ruden, Douglas

    2011-06-06

    Research in genetics has developed rapidly recently due to the aid of next generation sequencing (NGS). However, massively-parallel NGS produces enormous amounts of data, which leads to storage, compatibility, scalability, and performance issues. The Cloud Computing and MapReduce framework, which utilizes hundreds or thousands of shared computers to map sequencing reads quickly and efficiently to reference genome sequences, appears to be a very promising solution for these issues. Consequently, it has been adopted by many organizations recently, and the initial results are very promising. However, since these are only initial steps toward this trend, the developed software does not provide adequate primary functions like bisulfite, pair-end mapping, etc., in on-site software such as RMAP or BS Seeker. In addition, existing MapReduce-based applications were not designed to process the long reads produced by the most recent second-generation and third-generation NGS instruments and, therefore, are inefficient. Last, it is difficult for a majority of biologists untrained in programming skills to use these tools because most were developed on Linux with a command line interface. To urge the trend of using Cloud technologies in genomics and prepare for advances in second- and third-generation DNA sequencing, we have built a Hadoop MapReduce-based application, CloudAligner, which achieves higher performance, covers most primary features, is more accurate, and has a user-friendly interface. It was also designed to be able to deal with long sequences. The performance gain of CloudAligner over Cloud-based counterparts (35 to 80%) mainly comes from the omission of the reduce phase. In comparison to local-based approaches, the performance gain of CloudAligner is from the partition and parallel processing of the huge reference genome as well as the reads. The source code of CloudAligner is available at http://cloudaligner.sourceforge.net/ and its web version is at http
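
    The design point that both records emphasise, a map-only job with no reduce phase, can be illustrated with a toy Hadoop-Streaming-style mapper: each input line is a read, and alignment records are emitted directly as final output. The tab-separated input format, the tiny in-line reference string, and exact substring search are all simplifying assumptions standing in for a real aligner, not CloudAligner's code.

```python
# Toy map-only mapper: reads "read_id<TAB>sequence" lines from stdin and emits
# "read_id<TAB>position" for exact matches; with no reducer, this output is final.
import sys

REFERENCE = "ACGTACGTTAGCCGATACGGATCGATTACA"  # hypothetical; a real task would load its partition

def map_reads(stream, reference=REFERENCE):
    for line in stream:
        read_id, seq = line.rstrip("\n").split("\t")
        pos = reference.find(seq)              # stand-in for a real alignment routine
        if pos >= 0:
            print(f"{read_id}\t{pos}")

if __name__ == "__main__":
    map_reads(sys.stdin)
```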

  14. A Karnaugh-Map based fingerprint minutiae extraction method

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Full Text Available Fingerprint is one of the most promising methods among all the biometric techniques and has been used for personal authentication for a long time because of its wide acceptance and reliability. Features (minutiae) are extracted from the fingerprint in question and are compared with the features already stored in the database for authentication. Crossing number (CN) is the most commonly used minutiae extraction method for fingerprints. In this paper, a new Karnaugh-Map-based fingerprint minutiae extraction method is proposed and discussed. In the proposed algorithm, the 8 neighbors of a pixel in a 3×3 window are arranged as the 8 bits of a byte and the corresponding hexadecimal (hex) value is calculated. These hex values are simplified using the standard Karnaugh-Map (K-map) technique to obtain the minimized logical expression. Experiments conducted on the FVC2002/Db1_a database reveal that the developed method is better than the crossing number (CN) method.
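
    The neighbourhood encoding described above is simple to make concrete: for every ridge pixel of a thinned binary fingerprint, pack its 8 neighbours into a byte and record the hexadecimal value. The sketch below stops there; the K-map simplification applied to those values in the paper is not reproduced, and the clockwise bit ordering is an assumption.

```python
# Encode each ridge pixel's 3x3 neighbourhood as one byte (reported as a hex value).
import numpy as np

OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]  # clockwise

def neighbour_codes(skeleton):
    """skeleton: 2-D array of 0/1 ridge pixels; returns {(row, col): hex code}."""
    codes = {}
    rows, cols = skeleton.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            if skeleton[r, c]:
                byte = 0
                for bit, (dr, dc) in enumerate(OFFSETS):
                    byte |= int(skeleton[r + dr, c + dc]) << bit
                codes[(r, c)] = hex(byte)
    return codes
```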

  15. An annotated genetic map of loblolly pine based on microsatellite and cDNA markers

    Science.gov (United States)

    Craig S. Echt; Surya Saha; Konstantin V. Krutovsky; Kokulapalan Wimalanathan; John E. Erpelding; Chun Liang; C Dana Nelson

    2011-01-01

    Previous loblolly pine (Pinus taeda L.) genetic linkage maps have been based on a variety of DNA polymorphisms, such as AFLPs, RAPDs, RFLPs, and ESTPs, but only a few SSRs (simple sequence repeats), also known as simple tandem repeats or microsatellites, have been mapped in P. taeda. The objective of this study was to integrate a large set of SSR markers from a variety...

  16. Identification of p38α MAP kinase inhibitors by pharmacophore based virtual screening

    DEFF Research Database (Denmark)

    Gangwal, Rahul P; Das, Nihar R; Thanki, Kaushik

    2014-01-01

    The p38α mitogen-activated protein (MAP) kinase plays a vital role in treating many inflammatory diseases. In the present study, a combined ligand and structure based pharmacophore model was developed to identify potential DFG-in selective p38 MAP kinase inhibitors. Conformations of co...

  17. Machine vision-based high-resolution weed mapping and patch-sprayer performance simulation

    NARCIS (Netherlands)

    Tang, L.; Tian, L.F.; Steward, B.L.

    1999-01-01

    An experimental machine vision-based patch-sprayer was developed. This sprayer was primarily designed to do real-time weed density estimation and variable herbicide application rate control. However, the sprayer also had the capability to do high-resolution weed mapping if proper mapping techniques

  18. Map-IT! A Web-Based GIS Tool for Watershed Science Education.

    Science.gov (United States)

    Curtis, David H.; Hewes, Christopher M.; Lossau, Matthew J.

    This paper describes the development of a prototypic, Web-accessible GIS solution for K-12 science education and citizen-based watershed monitoring. The server side consists of ArcView IMS running on an NT workstation. The client is built around MapCafe. The client interface, which runs through a standard Web browser, supports standard MapCafe…

  19. The research of selection model based on LOD in multi-scale display of electronic map

    Science.gov (United States)

    Zhang, Jinming; You, Xiong; Liu, Yingzhen

    2008-10-01

    This paper proposes a selection model based on LOD to aid the multi-scale display of electronic maps. The ratio of display scale to map scale is regarded as the LOD operator. The categorization rule, classification rule, elementary rule and spatial geometry character rule for setting the LOD operator are also summarized.

  20. An annotated genetic map of loblolly pine based on microsatellite and cDNA markers

    Science.gov (United States)

    Previous loblolly pine (Pinus taeda L.) genetic linkage maps have been based on a variety of DNA polymorphisms, such as AFLPs, RAPDs, RFLPs, and ESTPs, but only a few SSRs (simple sequence repeats), also known as simple tandem repeats or microsatellites, have been mapped in P. taeda. The objective o...

  1. Mapping query terms to data and schema using content based similarity search in clinical information systems.

    Science.gov (United States)

    Safari, Leila; Patrick, Jon D

    2013-01-01

    This paper reports on the issues in mapping the terms of a query to the field names of the schema of an Entity Relationship (ER) model or to the data part of the Entity Attribute Value (EAV) model using a similarity-based Top-K algorithm in a clinical information system, together with an extension of EAV mapping for medication names. In addition, the details of the mapping algorithm and the required pre-processing, including NLP (Natural Language Processing) tasks to prepare resources for mapping, are explained. The experimental results on an example clinical information system demonstrate more than 84 per cent accuracy in mapping. The results will be integrated into our proposed Clinical Data Analytics Language (CliniDAL) to automate the mapping process in CliniDAL.
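
    A bare-bones version of similarity-based Top-K term mapping, ranking candidate schema field names by string similarity to a query term, might look like the sketch below; the field list, the query, and the use of a simple sequence-similarity ratio are illustrative assumptions rather than CliniDAL's actual algorithm.

```python
# Rank candidate field names by string similarity and keep the top-k matches.
from difflib import SequenceMatcher

def top_k_matches(term, candidates, k=3):
    scored = [(SequenceMatcher(None, term.lower(), c.lower()).ratio(), c) for c in candidates]
    return sorted(scored, reverse=True)[:k]

fields = ["patient_dob", "admission_date", "discharge_date", "medication_name", "diagnosis_code"]
print(top_k_matches("date of admission", fields))
```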

  2. Blockchain-based Smart Contracts: A Systematic Mapping Study

    OpenAIRE

    Alharby, Maher; van Moorsel, Aad

    2017-01-01

    An appealing feature of blockchain technology is smart contracts. A smart contract is executable code that runs on top of the blockchain to facilitate, execute and enforce an agreement between untrusted parties without the involvement of a trusted third party. In this paper, we conduct a systematic mapping study to collect all research that is relevant to smart contracts from a technical perspective. The aim of doing so is to identify current research topics and open challenges for future stu...

  3. Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.

    Science.gov (United States)

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A

    2012-08-01

    To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way for conducting dynamic mapping. Copyright © 2011 Elsevier Inc. All rights reserved.

  4. Relaxation-based viscosity mapping for magnetic particle imaging

    Science.gov (United States)

    Utkur, M.; Muslu, Y.; Saritas, E. U.

    2017-05-01

    Magnetic particle imaging (MPI) has been shown to provide remarkable contrast for imaging applications such as angiography, stem cell tracking, and cancer imaging. Recently, there is growing interest in the functional imaging capabilities of MPI, where ‘color MPI’ techniques have explored separating different nanoparticles, which could potentially be used to distinguish nanoparticles in different states or environments. Viscosity mapping is a promising functional imaging application for MPI, as increased viscosity levels in vivo have been associated with numerous diseases such as hypertension, atherosclerosis, and cancer. In this work, we propose a viscosity mapping technique for MPI through the estimation of the relaxation time constant of the nanoparticles. Importantly, the proposed time constant estimation scheme does not require any prior information regarding the nanoparticles. We validate this method with extensive experiments in an in-house magnetic particle spectroscopy (MPS) setup at four different frequencies (between 250 Hz and 10.8 kHz) and at three different field strengths (between 5 mT and 15 mT) for viscosities ranging between 0.89 mPa·s and 15.33 mPa·s. Our results demonstrate the viscosity mapping ability of MPI in the biologically relevant viscosity range.
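
    As a generic illustration of recovering a relaxation time constant from measured data (not the calibration-free estimator proposed in the paper), one can fit a mono-exponential decay with non-linear least squares; the time axis, noise level, and initial guess below are arbitrary assumptions.

```python
# Fit a mono-exponential relaxation model and report the estimated time constant.
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(t, amplitude, tau):
    return amplitude * np.exp(-t / tau)

t = np.linspace(0, 50e-6, 200)                                          # seconds
signal = mono_exp(t, 1.0, 8e-6) + 0.01 * np.random.default_rng(2).normal(size=t.size)

(amp_hat, tau_hat), _ = curve_fit(mono_exp, t, signal, p0=(1.0, 5e-6))
print(f"estimated relaxation time constant: {tau_hat * 1e6:.2f} microseconds")
```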

  5. Mapping the Indonesian territory, based on pollution, social demography and geographical data, using self organizing feature map

    Science.gov (United States)

    Hernawati, Kuswari; Insani, Nur; Bambang S. H., M.; Nur Hadi, W.; Sahid

    2017-08-01

    This research aims to map the 33 (thirty-three) provinces of Indonesia into a clustered model, based on data on air, water and soil pollution as well as social demographic and geographic data. The method used in this study was an unsupervised method built on the basic concept of the Kohonen Self-Organizing Feature Map (SOFM). The design parameters for the model are derived from data related directly or indirectly to pollution: demographic and social data, pollution levels of air, water and soil, and the geographical situation of each province. The parameters consist of 19 features/characteristics, including the human development index, the number of vehicles, the availability of plants for water absorption and flood prevention, and the geographic and demographic situation. The data used were secondary data from the Central Statistics Agency (BPS), Indonesia. The data are mapped by the SOFM from a high-dimensional vector space onto a two-dimensional space according to closeness of location in terms of Euclidean distance. The resulting output is a clustered grouping: the thirty-three provinces are grouped into five clusters, each with different features/characteristics and levels of pollution. The results can be used to support effective and efficient prevention and resolution of pollution problems in each cluster.
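
    The core of the approach above is a Kohonen SOFM that places each province's feature vector on a low-dimensional grid by Euclidean closeness. A minimal NumPy sketch of such a map is given below with stand-in random data; the grid size, learning-rate schedule, and neighbourhood radius are illustrative assumptions, not the study's settings.

```python
# Minimal Kohonen SOFM: map 33 feature vectors (19 features each) onto a 5x5 grid.
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(size=(33, 19))                 # 33 provinces x 19 standardised features (stand-in)
grid_h, grid_w = 5, 5
weights = rng.normal(size=(grid_h, grid_w, 19))

def best_matching_unit(x):
    dist = np.linalg.norm(weights - x, axis=2)   # Euclidean distance to every grid node
    return np.unravel_index(np.argmin(dist), dist.shape)

for epoch in range(200):
    lr = 0.5 * (1 - epoch / 200)                 # decaying learning rate
    radius = max(1.0, 3.0 * (1 - epoch / 200))   # decaying neighbourhood radius
    for x in data:
        bi, bj = best_matching_unit(x)
        for i in range(grid_h):
            for j in range(grid_w):
                d2 = (i - bi) ** 2 + (j - bj) ** 2
                h = np.exp(-d2 / (2 * radius ** 2))        # neighbourhood function
                weights[i, j] += lr * h * (x - weights[i, j])

# Each province is assigned to the grid cell of its best-matching unit.
print([best_matching_unit(x) for x in data])
```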

  6. User Experience Design in Professional Map-Based Geo-Portals

    Directory of Open Access Journals (Sweden)

    Bastian Zimmer

    2013-10-01

    Full Text Available We have recently been witnessing the growing establishment of map-centered web-based geo-portals on national, regional and local levels. However, a particular issue with these geo-portals is that each instance has been implemented in a different way in terms of design, usability, functionality, interaction possibilities, map size and symbologies. In this paper, we try to tackle these shortcomings by analyzing and formalizing the requirements for map-based geo-portals in a user-experience-based approach. First, we propose a holistic definition of the term “geo-portal”. Then, we present our approach to user experience design for map-based geo-portals by defining the functional requirements of a geo-portal, by analyzing previous geo-portal developments, by distilling the results of our empirical user study into practically oriented user requirements, and finally by establishing a set of user experience design guidelines for the creation of map-based geo-portals. These design guidelines have been extracted for each of the main components of a geo-portal, i.e., the map, the search dialogue, the presentation of the search results, symbologies, and other aspects. These guidelines shall constitute the basis for future geo-portal developments to achieve standardization in the user-experience design of map-based geo-portals.

  7. An Entropy-Based Network Anomaly Detection Method

    Directory of Open Access Journals (Sweden)

    Przemysław Bereziński

    2015-04-01

    Full Text Available Data mining is an interdisciplinary subfield of computer science involving methods at the intersection of artificial intelligence, machine learning and statistics. One of the data mining tasks is anomaly detection, which is the analysis of large quantities of data to identify items, events or observations which do not conform to an expected pattern. Anomaly detection is applicable in a variety of domains, e.g., fraud detection, fault detection, and system health monitoring, but this article focuses on the application of anomaly detection in the field of network intrusion detection. The main goal of the article is to prove that an entropy-based approach is suitable to detect modern botnet-like malware based on anomalous patterns in the network. This aim is achieved by realization of the following points: (i) preparation of a concept of an original entropy-based network anomaly detection method, (ii) implementation of the method, (iii) preparation of an original dataset, (iv) evaluation of the method.
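
    The entropy-based idea the article builds on can be illustrated very simply: compute the Shannon entropy of a traffic feature per time window and flag windows that deviate strongly from the baseline. The feature (destination ports), the toy window contents, and the 3-sigma rule below are illustrative assumptions, not the article's method.

```python
# Flag time windows whose destination-port entropy deviates strongly from the baseline.
import math
from collections import Counter

def shannon_entropy(values):
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def flag_anomalous_windows(windows, z=3.0):
    entropies = [shannon_entropy(w) for w in windows]
    mean = sum(entropies) / len(entropies)
    std = (sum((e - mean) ** 2 for e in entropies) / len(entropies)) ** 0.5 or 1e-9
    return [i for i, e in enumerate(entropies) if abs(e - mean) > z * std]

# Toy usage: a port scan spreads traffic over many distinct ports, raising the entropy sharply.
normal = [[80, 443, 80, 80, 443, 22] for _ in range(20)]
scan = [list(range(1000, 1060))]
print(flag_anomalous_windows(normal + scan))   # expected to flag the scan window
```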

  8. Vision-based vehicle detection and tracking algorithm design

    Science.gov (United States)

    Hwang, Junyeon; Huh, Kunsoo; Lee, Donghwi

    2009-12-01

    The vision-based vehicle detection in front of an ego-vehicle is regarded as promising for driver assistance as well as for autonomous vehicle guidance. The feasibility of vehicle detection in a passenger car requires accurate and robust sensing performance. A multivehicle detection system based on stereo vision has been developed for better accuracy and robustness. This system utilizes morphological filter, feature detector, template matching, and epipolar constraint techniques in order to detect the corresponding pairs of vehicles. After the initial detection, the system executes the tracking algorithm for the vehicles. The proposed system can detect front vehicles such as the leading vehicle and side-lane vehicles. The position parameters of the vehicles located in front are obtained based on the detection information. The proposed vehicle detection system is implemented on a passenger car, and its performance is verified experimentally.

  9. Mapping landslide source and transport areas in VHR images with Object-Based Analysis and Support Vector Machines

    Science.gov (United States)

    Heleno, Sandra; Matias, Magda; Pina, Pedro

    2015-04-01

    Visual interpretation of satellite imagery remains extremely demanding in terms of resources and time, especially when dealing with numerous multi-scale landslides affecting wide areas, such as is the case of rainfall-induced shallow landslides. Applying automated methods can contribute to more efficient landslide mapping and updating of existing inventories, and in recent years the number and variety of approaches is rapidly increasing. Very High Resolution (VHR) images, acquired by space-borne sensors with sub-metric precision, such as Ikonos, Quickbird, Geoeye and Worldview, are increasingly being considered as the best option for landslide mapping, but these new levels of spatial detail also present n