WorldWideScience

Sample records for sampling based map

  1. The impact of sample size on the reproducibility of voxel-based lesion-deficit mappings.

    Science.gov (United States)

    Lorca-Puls, Diego L; Gajardo-Vidal, Andrea; White, Jitrachote; Seghier, Mohamed L; Leff, Alexander P; Green, David W; Crinion, Jenny T; Ludersdorfer, Philipp; Hope, Thomas M H; Bowman, Howard; Price, Cathy J

    2018-07-01

    This study investigated how sample size affects the reproducibility of findings from univariate voxel-based lesion-deficit analyses (e.g., voxel-based lesion-symptom mapping and voxel-based morphometry). Our effect of interest was the strength of the mapping between brain damage and speech articulation difficulties, as measured in terms of the proportion of variance explained. First, we identified a region of interest by searching on a voxel-by-voxel basis for brain areas where greater lesion load was associated with poorer speech articulation, using a large sample of 360 right-handed English-speaking stroke survivors. We then randomly drew thousands of bootstrap samples from this data set that included either 30, 60, 90, 120, 180, or 360 patients. For each resample, we recorded effect size estimates and p values after conducting exactly the same lesion-deficit analysis within the previously identified region of interest and holding all procedures constant. The results show (1) how often small effect sizes in a heterogeneous population fail to be detected; (2) how effect size and its statistical significance vary with sample size; (3) how low-powered studies (due to small sample sizes) can greatly over-estimate as well as under-estimate effect sizes; and (4) how large sample sizes (N ≥ 90) can yield highly significant p values even when effect sizes are so small that they become trivial in practical terms. The implications of these findings for interpreting the results from univariate voxel-based lesion-deficit analyses are discussed. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
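
    The resampling scheme described above can be sketched with synthetic data. Everything below (the toy effect size, noise level, and resample counts) is an illustrative assumption, not the study's actual data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "full sample" of 360 patients: lesion load weakly predicts
# the deficit score (a small true effect, echoing the study's theme).
N = 360
lesion_load = rng.normal(size=N)
deficit = 0.3 * lesion_load + rng.normal(size=N)

def r_squared(x, y):
    """Proportion of variance explained (squared Pearson correlation)."""
    r = np.corrcoef(x, y)[0, 1]
    return r * r

# Bootstrap: redo the same analysis on thousands of resamples per sample size.
for n in (30, 60, 90, 120, 180, 360):
    est = []
    for _ in range(2000):
        idx = rng.choice(N, size=n, replace=True)
        est.append(r_squared(lesion_load[idx], deficit[idx]))
    est = np.array(est)
    lo, hi = np.quantile(est, [0.05, 0.95])
    print(f"n={n:3d}  mean R^2={est.mean():.3f}  5-95% range=[{lo:.3f}, {hi:.3f}]")
```

    The 5-95% range of the effect-size estimates shrinks sharply as n grows, mirroring points (2) and (3) above: small resamples both over- and under-estimate the small true effect.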

  2. Generating samples for association studies based on HapMap data

    Directory of Open Access Journals (Sweden)

    Chen Yixuan

    2008-01-01

    Full Text Available Abstract Background With the completion of the HapMap project, a variety of computational algorithms and tools have been proposed for haplotype inference, tag SNP selection and genome-wide association studies. Simulated data are commonly used in evaluating these newly developed approaches. In addition to simulations based on population models, empirical data generated by perturbing real data have also been used because they may inherit specific properties from real data. However, there is no tool that is publicly available to generate large-scale simulated variation data by taking into account knowledge from the HapMap project. Results A computer program (gs) was developed to quickly generate a large number of samples based on real data that are useful for a variety of purposes, including evaluating methods for haplotype inference, tag SNP selection and association studies. Two approaches have been implemented to generate dense SNP haplotype/genotype data that share similar local linkage disequilibrium (LD) patterns as those in human populations. The first approach takes haplotype pairs from samples as inputs, and the second approach takes patterns of haplotype block structures as inputs. Both quantitative and qualitative traits have been incorporated in the program. Phenotypes are generated based on a disease model, or based on the effect of a quantitative trait nucleotide, both of which can be specified by users. In addition to single-locus disease models, two-locus disease models have also been implemented that can incorporate any degree of epistasis. Users are allowed to specify all nine parameters in a 3 × 3 penetrance table. For several commonly used two-locus disease models, the program can automatically calculate penetrances based on the population prevalence and marginal effects of a disease that users can conveniently specify. Conclusion The program gs can effectively generate large-scale genetic and phenotypic variation data that can be

  3. Sampling for validation of digital soil maps

    NARCIS (Netherlands)

    Brus, D.J.; Kempen, B.; Heuvelink, G.B.M.

    2011-01-01

    The increase in digital soil mapping around the world means that appropriate and efficient sampling strategies are needed for validation. Data used for calibrating a digital soil mapping model typically are non-random samples. In such a case we recommend collection of additional independent data and

  4. Geographical affinities of the HapMap samples.

    Directory of Open Access Journals (Sweden)

    Miao He

    Full Text Available The HapMap samples were collected for medical-genetic studies, but are also widely used in population-genetic and evolutionary investigations. Yet the ascertainment of the samples differs from most population-genetic studies which collect individuals who live in the same local region as their ancestors. What effects could this non-standard ascertainment have on the interpretation of HapMap results? We compared the HapMap samples with more conventionally-ascertained samples used in population- and forensic-genetic studies, including the HGDP-CEPH panel, making use of published genome-wide autosomal SNP data and Y-STR haplotypes, as well as producing new Y-STR data. We found that the HapMap samples were representative of their broad geographical regions of ancestry according to all tests applied. The YRI and JPT were indistinguishable from independent samples of Yoruba and Japanese in all ways investigated. However, both the CHB and the CEU were distinguishable from all other HGDP-CEPH populations with autosomal markers, and both showed Y-STR similarities to unusually large numbers of populations, perhaps reflecting their admixed origins. The CHB and JPT are readily distinguished from one another with both autosomal and Y-chromosomal markers, and results obtained after combining them into a single sample should be interpreted with caution. The CEU are better described as being of Western European ancestry than of Northern European ancestry as often reported. Both the CHB and CEU show subtle but detectable signs of admixture. Thus the YRI and JPT samples are well-suited to standard population-genetic studies, but the CHB and CEU less so.

  5. Synchrotron-based FTIR microspectroscopy for the mapping of photo-oxidation and additives in acrylonitrile-butadiene-styrene model samples and historical objects.

    Science.gov (United States)

    Saviello, Daniela; Pouyet, Emeline; Toniolo, Lucia; Cotte, Marine; Nevin, Austin

    2014-09-16

    Synchrotron-based Fourier transform infrared micro-spectroscopy (SR-μFTIR) was used to map photo-oxidative degradation of acrylonitrile-butadiene-styrene (ABS) and to investigate the presence and the migration of additives in historical samples from important Italian design objects. High resolution (3 × 3 μm²) molecular maps were obtained by FTIR microspectroscopy in transmission mode, using a new method for the preparation of polymer thin sections. The depth of photo-oxidation in the samples was evaluated and found to be accompanied by the formation of ketones, aldehydes, esters, and unsaturated carbonyl compounds. This study demonstrates selective surface oxidation and a probable passivation of the material against further degradation. In polymer fragments from design objects made of ABS from the 1960s, UV-stabilizers were detected and mapped, and microscopic inclusions of proteinaceous material were identified and mapped for the first time. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Regional geochemical maps of the Tonopah 1 degree by 2 degrees Quadrangle, Nevada, based on samples of stream sediment and nonmagnetic heavy-mineral concentrate

    Science.gov (United States)

    Nash, J.T.; Siems, D.F.

    1988-01-01

    This report is part of a series of geologic, geochemical, and geophysical maps of the Tonopah 1° x 2° quadrangle, Nevada, prepared during studies of the area for the Conterminous United States Mineral Assessment Program (CUSMAP). Included here are 21 maps showing the distributions of selected elements or combinations of elements. These regional geochemical maps are based on chemical analyses of the minus-60 mesh (0.25 mm) fraction of stream-sediment samples and the nonmagnetic heavy-mineral concentrate derived from stream sediment. Stream sediments were collected at 1,217 sites. Our geochemical studies of mineralized rock samples provide a framework for evaluating the results from stream sediments.

  7. Spatial Mapping of Organic Carbon in Returned Samples from Mars

    Science.gov (United States)

    Siljeström, S.; Fornaro, T.; Greenwalt, D.; Steele, A.

    2018-04-01

    Spatially mapping organic material to the minerals present in a sample will be essential for understanding the origin of any organics in samples returned from Mars. It will be shown how ToF-SIMS may be used to map organics in samples from Mars.

  8. Using Environmental Variables for Studying of the Quality of Sampling in Soil Mapping

    Directory of Open Access Journals (Sweden)

    A. Jafari

    2016-02-01

    Full Text Available Introduction: Methods of soil survey are generally empirical and based on the mental development of the surveyor, correlating soil with underlying geology, landforms, vegetation and air-photo interpretation. Since there are no statistical criteria for traditional soil sampling, this may lead to bias in the areas being sampled. In digital soil mapping, soil samples may be used to elaborate quantitative relationships or models between soil attributes and soil covariates. Because the relationships are based on the soil observations, the quality of the resulting soil map depends also on the soil observation quality. An appropriate sampling design for digital soil mapping depends on how much data is available and where the data is located. Some statistical methods have been developed for optimizing data sampling for soil surveys. Some of these methods deal with the use of ancillary information. The purpose of this study was to evaluate the quality of sampling of existing data. Materials and Methods: The study area is located in the central basin of the Iranian plateau (Figure 1). The geologic infrastructure of the area is mainly Cretaceous limestone, Mesozoic shale and sandstone. Air photo interpretation (API) was used to differentiate geomorphic patterns based on their formation processes, general structure and morphometry. The patterns were differentiated through a nested geomorphic hierarchy (Fig. 2). A four-level geomorphic hierarchy is used to break down the complexity of different landscapes of the study area. In the lower level of the hierarchy, the geomorphic surfaces, which were formed by a unique process during a specific geologic time, were defined. A stratified sampling scheme was designed based on geomorphic mapping. In the stratified simple random sampling, the area was divided into sub-areas referred to as strata based on geomorphic surfaces, and within each stratum, sampling locations were randomly selected (Figure 2).
This resulted in 191
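
    The stratified random scheme described above (strata = geomorphic surfaces, random sites within each stratum) can be sketched as follows; the raster, labels, and per-stratum counts are hypothetical stand-ins for the study's geomorphic map:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 100 x 100 raster of geomorphic-surface labels (4 strata).
strata = rng.integers(0, 4, size=(100, 100))

def stratified_sample(strata, n_per_stratum, rng):
    """Select random cell locations independently within each stratum."""
    sites = {}
    for s in np.unique(strata):
        rows, cols = np.nonzero(strata == s)
        idx = rng.choice(rows.size, size=n_per_stratum, replace=False)
        sites[int(s)] = list(zip(rows[idx].tolist(), cols[idx].tolist()))
    return sites

sites = stratified_sample(strata, n_per_stratum=10, rng=rng)
```

    Each stratum contributes the same number of randomly placed sites, so no geomorphic surface is left unsampled, which is the point of stratifying.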

  9. Using NDVI and guided sampling to develop yield prediction maps of processing tomato crop

    Energy Technology Data Exchange (ETDEWEB)

    Fortes, A.; Henar Prieto, M. del; García-Martín, A.; Córdoba, A.; Martínez, L.; Campillo, C.

    2015-07-01

    The use of yield prediction maps is an important tool for the delineation of within-field management zones. Vegetation indices based on crop reflectance are of potential use in the attainment of this objective. There are different types of vegetation indices based on crop reflectance, the most commonly used of which is the NDVI (normalized difference vegetation index). NDVI values are reported to correlate well with several vegetation parameters, including the ability to predict yield. The field research was conducted on two commercial processing tomato farms, Cantillana and Enviciados. An NDVI prediction map developed through the ordinary kriging technique was used for guided sampling of processing tomato yield. Yield was studied and related to NDVI, and finally a prediction map of crop yield for the entire plot was generated using two geostatistical methodologies (ordinary and regression kriging). Finally, a comparison was made between the yield obtained at validation points and the yield values according to the prediction maps. The most precise yield maps were obtained with the regression kriging methodology, with RRMSE values of 14% and 17% in Cantillana and Enviciados, respectively, using the NDVI as predictor. The correlation coefficient between NDVI and yield, computed from the point samples taken at the two locations, was 0.71 in Cantillana and 0.67 in Enviciados. The results suggest that a massively sampled parameter such as NDVI is a good indicator of the distribution of within-field yield variation. (Author)
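
    A minimal ordinary-kriging interpolator, of the kind used here to turn point samples into a prediction surface, can be sketched in a few lines. The exponential variogram parameters and the toy NDVI field below are assumptions; in practice the variogram is fitted to the data:

```python
import numpy as np

def semivariogram(h, nugget=0.0, sill=1.0, rang=30.0):
    """Exponential semivariogram model (parameters assumed, not fitted)."""
    return nugget + sill * (1.0 - np.exp(-h / rang))

def ordinary_krige(xy, z, xy0):
    """Ordinary kriging estimate at xy0 from samples (xy, z)."""
    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))          # kriging system with Lagrange row
    A[:n, :n] = semivariogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = semivariogram(np.linalg.norm(xy - xy0, axis=1))
    w = np.linalg.solve(A, b)[:n]        # weights sum to 1 by construction
    return float(w @ z)

# Toy NDVI field sampled at 50 points on a 100 m x 100 m plot.
rng = np.random.default_rng(1)
xy = rng.uniform(0, 100, size=(50, 2))
ndvi = 0.4 + 0.004 * xy[:, 0] + 0.02 * rng.normal(size=50)
estimate = ordinary_krige(xy, ndvi, np.array([50.0, 50.0]))
```

    Regression kriging, which the paper found more precise, would first regress yield on NDVI and then krige the regression residuals with the same machinery.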

  10. Using Environmental Variables for Studying of the Quality of Sampling in Soil Mapping

    OpenAIRE

    A. Jafari; Norair Toomanian; R. Taghizadeh Mehrjerdi

    2016-01-01

    Introduction: Methods of soil survey are generally empirical and based on the mental development of the surveyor, correlating soil with underlying geology, landforms, vegetation and air-photo interpretation. Since there are no statistical criteria for traditional soil sampling, this may lead to bias in the areas being sampled. In digital soil mapping, soil samples may be used to elaborate quantitative relationships or models between soil attributes and soil covariates. Because the relationshi...

  11. Electron dose map inversion based on several algorithms

    International Nuclear Information System (INIS)

    Li Gui; Zheng Huaqing; Wu Yican; Fds Team

    2010-01-01

    The reconstruction of the electron dose map in radiation therapy was investigated by constructing an inversion model of the dose map with different algorithms. An inversion model based on nonlinear programming was used, in which the penetration dose map is inverted to recover the full spatial dose map. This inversion model was realized with several inversion algorithms. Test results with seven samples show that, except for the NMinimize algorithm, which worked for just one sample and with large error at that, all the inversion algorithms solved our inversion model rapidly and accurately. The Levenberg-Marquardt algorithm, having the greatest accuracy and speed, can be considered the first choice for electron dose map inversion. Further tests show that larger errors arise when data close to the electron range are used (tail error). The tail error may be caused by the approximation of the mean energy spectrum, and this should be considered to improve the method. The time-saving and accurate algorithms could be used to achieve real-time dose map inversion. By selecting the best inversion algorithm, the clinical need for real-time dose verification can be satisfied. (authors)
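
    As a hedged illustration of this kind of inversion, here is a small Levenberg-Marquardt fit of a toy depth-dose model. The exponential forward model and all parameters are stand-ins; the abstract does not specify the paper's actual dose model:

```python
import numpy as np

def lm_fit(residual, x0, lam=1e-3, iters=60):
    """Minimal Levenberg-Marquardt least-squares solver (numerical Jacobian)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = np.empty((r.size, x.size))
        for j in range(x.size):              # forward-difference Jacobian
            h = np.zeros_like(x)
            h[j] = 1e-6
            J[:, j] = (residual(x + h) - r) / 1e-6
        step = np.linalg.solve(J.T @ J + lam * np.eye(x.size), -J.T @ r)
        if np.sum(residual(x + step) ** 2) < np.sum(r ** 2):
            x, lam = x + step, lam * 0.5     # accept step, relax damping
        else:
            lam *= 10.0                      # reject step, increase damping
    return x

# Toy forward model: exponentially attenuated depth dose (assumed shape).
def forward(p, depth):
    return p[0] * np.exp(-p[1] * depth)

depth = np.linspace(0.0, 4.0, 40)            # cm
rng = np.random.default_rng(0)
measured = forward([2.0, 0.8], depth) + 0.01 * rng.normal(size=depth.size)

params = lm_fit(lambda p: forward(p, depth) - measured, x0=[1.0, 0.5])
```

    The damping parameter interpolates between gradient descent (large lam) and Gauss-Newton (small lam), which is what gives LM its speed and robustness here.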

  12. Testing of Alignment Parameters for Ancient Samples: Evaluating and Optimizing Mapping Parameters for Ancient Samples Using the TAPAS Tool

    Directory of Open Access Journals (Sweden)

    Ulrike H. Taron

    2018-03-01

    Full Text Available High-throughput sequence data retrieved from ancient or other degraded samples have led to unprecedented insights into the evolutionary history of many species, but the analysis of such sequences also poses specific computational challenges. The most commonly used approach involves mapping sequence reads to a reference genome. However, this process becomes increasingly challenging with an elevated genetic distance between target and reference or with the presence of contaminant sequences with high sequence similarity to the target species. The evaluation and testing of mapping efficiency and stringency are thus paramount for the reliable identification and analysis of ancient sequences. In this paper, we present ‘TAPAS’ (Testing of Alignment Parameters for Ancient Samples), a computational tool that enables the systematic testing of mapping tools for ancient data by simulating sequence data reflecting the properties of an ancient dataset and performing test runs using the mapping software and parameter settings of interest. We showcase TAPAS by using it to assess and improve the mapping strategy for a degraded sample from a banded linsang (Prionodon linsang), for which no closely related reference is currently available. This enables a 1.8-fold increase in the number of mapped reads without sacrificing mapping specificity. The increase in mapped reads effectively reduces the need for additional sequencing, thus making more economical use of time, resources, and sample material.
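
    The core of such a simulation, drawing short reads and applying the C→T deamination damage typical of ancient DNA toward the fragment ends, can be sketched as follows. TAPAS's actual damage model and parameters are not reproduced here; the geometric decay profile and all numbers below are illustrative:

```python
import random

random.seed(0)

def simulate_ancient_read(genome, length=60, damage=0.3):
    """Draw one read and apply position-dependent C->T deamination:
    the substitution probability is highest at both fragment ends and
    decays geometrically toward the interior (an assumed profile)."""
    start = random.randrange(len(genome) - length)
    read = list(genome[start:start + length])
    for i, base in enumerate(read):
        p = damage * 0.5 ** min(i, length - 1 - i)
        if base == "C" and random.random() < p:
            read[i] = "T"
    return start, "".join(read)

genome = "".join(random.choice("ACGT") for _ in range(10_000))
reads = [simulate_ancient_read(genome) for _ in range(200)]
```

    Mapping such simulated reads back with the aligner settings under test, and checking how many recover their true origin, is the kind of experiment TAPAS automates.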

  13. Demonstration of inaccuracy of QTL mapping due to sampling error

    African Journals Online (AJOL)

    Bertrand Collard

    2012-06-12

    Jun 12, 2012 ... finding in simple terms so that findings and implications can be easily ... While sampling from the true populations for n = 94 and n =190, individual mapping .... Overview of QTL mapping experimental design. Four true ...

  14. USGS Topo Base Map from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Topographic Base Map from The National Map. This tile cached web map service combines the most current data services (Boundaries, Names, Transportation,...

  15. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Fogh Olsen, Ole; Sporring, Jon

    2007-01-01

    . To address this problem we introduce a novel photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across. In this way we preserve the important illumination features......, while eliminating noise. We call our method diffusion based photon mapping....

  16. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Olsen, Ole Fogh; Sporring, Jon

    2006-01-01

    . To address this problem we introduce a novel photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across. In this way we preserve the important illumination features......, while eliminating noise. We call our method diffusion based photon mapping....

  17. Elemental mapping of large samples by external ion beam analysis with sub-millimeter resolution and its applications

    Science.gov (United States)

    Silva, T. F.; Rodrigues, C. L.; Added, N.; Rizzutto, M. A.; Tabacniks, M. H.; Mangiarotti, A.; Curado, J. F.; Aguirre, F. R.; Aguero, N. F.; Allegro, P. R. P.; Campos, P. H. O. V.; Restrepo, J. M.; Trindade, G. F.; Antonio, M. R.; Assis, R. F.; Leite, A. R.

    2018-05-01

    The elemental mapping of large areas using ion beam techniques is a desired capability for several scientific communities, working on topics ranging from geoscience to cultural heritage. Usually, the constraints for large-area mapping are not met in setups employing the micro- and nano-probes implemented all over the world. A novel setup for mapping large-sized samples in an external beam was recently built at the University of São Paulo, employing a broad MeV-proton probe with sub-millimeter dimension coupled to a high-precision, large-range XYZ robotic stage (60 cm range on all axes and precision of 5 μm ensured by optical sensors). An important issue in large-area mapping is how to deal with the irregularities of the sample's surface, which may introduce artifacts in the images due to the variation of the measuring conditions. In our setup, we implemented an automatic system based on machine vision to correct the position of the sample to compensate for its surface irregularities. As an additional benefit, a 3D digital reconstruction of the scanned surface can also be obtained. Using this new and unique setup, we have produced large-area elemental maps of ceramics, stones, fossils, and other sorts of samples.

  18. An Integrated Approach Using Chaotic Map & Sample Value Difference Method for Electrocardiogram Steganography and OFDM Based Secured Patient Information Transmission.

    Science.gov (United States)

    Pandey, Anukul; Saini, Barjinder Singh; Singh, Butta; Sood, Neetu

    2017-10-18

    This paper presents a patient's confidential data hiding scheme in the electrocardiogram (ECG) signal and its subsequent wireless transmission. The patient's confidential data is embedded in the ECG (called stego-ECG) using a chaotic map and the sample value difference approach. The sample value difference approach effectually hides the patient's confidential data in ECG sample pairs at predefined locations. The chaotic map generates these predefined locations through the use of selective control parameters. Subsequently, the wireless transmission of the stego-ECG is analyzed using the Orthogonal Frequency Division Multiplexing (OFDM) system in a Rayleigh fading scenario for telemedicine applications. Evaluation of the proposed method on all 48 records of the MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of the cover ECG. The secret data imperceptibility in the stego-ECG is evident through the statistical and clinical performance measures. Statistical measures comprise Percentage Root-mean-square Difference (PRD), Peak Signal to Noise Ratio (PSNR), and Kullback-Leibler Divergence (KL-Div), etc., while clinical metrics include wavelet Energy Based Diagnostic Distortion (WEDD) and Wavelet based Weighted PRD (WWPRD). Various channel Signal-to-Noise Ratio scenarios are simulated for wireless communication of the stego-ECG in the OFDM system. Over all 48 records of the MIT-BIH arrhythmia database, the proposed method resulted, on average, in PRD = 0.26, PSNR = 55.49, KL-Div = 3.34 × 10⁻⁶, WEDD = 0.02, and WWPRD = 0.10 with a secret data size of 21 Kb. Further, a comparative analysis of the proposed method against recent existing works was also performed. The results clearly demonstrated the superiority of the proposed method.
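
    A toy version of the two ingredients, a logistic chaotic map selecting embedding locations from a secret key, and a sample-pair difference rule carrying one bit per pair, might look like this. The parity-of-difference rule is a simplified stand-in for the paper's actual sample value difference scheme, and all parameters are illustrative:

```python
import numpy as np

def logistic_positions(n_bits, n_samples, r=3.99, x0=0.7):
    """Logistic chaotic map: (r, x0) act as the secret key and
    deterministically generate non-overlapping sample-pair locations."""
    x, used, order = x0, set(), []
    while len(order) < n_bits:
        x = r * x * (1.0 - x)
        i = int(x * (n_samples - 1))
        if i not in used and (i + 1) not in used:
            used.update((i, i + 1))          # reserve the whole pair
            order.append(i)
    return order

def embed(ecg, bits, positions):
    """Force the parity of each pair difference to the secret bit."""
    stego = ecg.copy()
    for bit, i in zip(bits, positions):
        if (stego[i + 1] - stego[i]) % 2 != bit:
            stego[i + 1] += 1                # at most LSB-scale distortion
    return stego

def extract(stego, positions):
    return [int((stego[i + 1] - stego[i]) % 2) for i in positions]

ecg = np.random.default_rng(0).integers(0, 2048, size=5000)
bits = [1, 0, 1, 1, 0, 0, 1, 0]
pos = logistic_positions(len(bits), ecg.size)
stego = embed(ecg, bits, pos)
```

    Because the chaotic sequence is fully determined by (r, x0), the receiver regenerates the same positions from the shared key; the per-sample distortion never exceeds one quantization step.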

  19. Mapping cell populations in flow cytometry data for cross-sample comparison using the Friedman-Rafsky test statistic as a distance measure.

    Science.gov (United States)

    Hsiao, Chiaowen; Liu, Mengya; Stanton, Rick; McGee, Monnie; Qian, Yu; Scheuermann, Richard H

    2016-01-01

    Flow cytometry (FCM) is a fluorescence-based single-cell experimental technology that is routinely applied in biomedical research for identifying cellular biomarkers of normal physiological responses and abnormal disease states. While many computational methods have been developed that focus on identifying cell populations in individual FCM samples, very few have addressed how the identified cell populations can be matched across samples for comparative analysis. This article presents FlowMap-FR, a novel method for cell population mapping across FCM samples. FlowMap-FR is based on the Friedman-Rafsky nonparametric test statistic (FR statistic), which quantifies the equivalence of multivariate distributions. As applied to FCM data by FlowMap-FR, the FR statistic objectively quantifies the similarity between cell populations based on the shapes, sizes, and positions of fluorescence data distributions in the multidimensional feature space. To test and evaluate the performance of FlowMap-FR, we simulated the kinds of biological and technical sample variations that are commonly observed in FCM data. The results show that FlowMap-FR is able to effectively identify equivalent cell populations between samples under scenarios of proportion differences and modest position shifts. As a statistical test, FlowMap-FR can be used to determine whether the expression of a cellular marker is statistically different between two cell populations, suggesting candidates for new cellular phenotypes by providing an objective statistical measure. In addition, FlowMap-FR can indicate situations in which inappropriate splitting or merging of cell populations has occurred during gating procedures. We compared the FR statistic with the symmetric version of Kullback-Leibler divergence measure used in a previous population matching method with both simulated and real data. The FR statistic outperforms the symmetric version of KL-distance in distinguishing equivalent from nonequivalent cell
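
    The FR statistic itself is easy to sketch: pool the two samples, build the Euclidean minimum spanning tree, and count edges joining points from different samples; few cross-sample edges mean the two distributions occupy different regions. This is a generic sketch of the statistic, not FlowMap-FR's implementation:

```python
import numpy as np

def mst_edges(points):
    """Prim's algorithm on the complete Euclidean graph; returns edge list."""
    n = len(points)
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    in_tree = np.zeros(n, dtype=bool)
    in_tree[0] = True
    best, parent = d[0].copy(), np.zeros(n, dtype=int)
    edges = []
    for _ in range(n - 1):
        j = int(np.argmin(np.where(in_tree, np.inf, best)))  # nearest outside node
        edges.append((int(parent[j]), j))
        in_tree[j] = True
        closer = d[j] < best
        parent[closer] = j
        best = np.minimum(best, d[j])
    return edges

def fr_cross_count(x, y):
    """Friedman-Rafsky style statistic: cross-sample edges in the pooled MST."""
    pooled = np.vstack([x, y])
    label = np.r_[np.zeros(len(x), dtype=int), np.ones(len(y), dtype=int)]
    return int(sum(label[a] != label[b] for a, b in mst_edges(pooled)))

rng = np.random.default_rng(0)
x = rng.normal(size=(30, 2))
y_same = rng.normal(size=(30, 2))        # same distribution as x
y_far = x + np.array([100.0, 0.0])       # clearly different population
```

    Two equivalent populations interleave freely in the tree (many cross edges); two well-separated ones are joined by a single bridge.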

  20. CdTe detector based PIXE mapping of geological samples

    Energy Technology Data Exchange (ETDEWEB)

    Chaves, P.C., E-mail: cchaves@ctn.ist.utl.pt [Centro de Física Atómica da Universidade de Lisboa, Av. Prof. Gama Pinto 2, 1649-003 Lisboa (Portugal); IST/ITN, Instituto Superior Técnico, Universidade Técnica de Lisboa, Campus Tecnológico e Nuclear, EN10, 2686-953 Sacavém (Portugal); Taborda, A. [Centro de Física Atómica da Universidade de Lisboa, Av. Prof. Gama Pinto 2, 1649-003 Lisboa (Portugal); IST/ITN, Instituto Superior Técnico, Universidade Técnica de Lisboa, Campus Tecnológico e Nuclear, EN10, 2686-953 Sacavém (Portugal); Oliveira, D.P.S. de [Laboratório Nacional de Energia e Geologia (LNEG), Apartado 7586, 2611-901 Alfragide (Portugal); Reis, M.A. [Centro de Física Atómica da Universidade de Lisboa, Av. Prof. Gama Pinto 2, 1649-003 Lisboa (Portugal); IST/ITN, Instituto Superior Técnico, Universidade Técnica de Lisboa, Campus Tecnológico e Nuclear, EN10, 2686-953 Sacavém (Portugal)

    2014-01-01

    A sample collected from a borehole drilled approximately 10 km ESE of Bragança, Trás-os-Montes, was analysed by standard and high energy PIXE at both CTN (previously ITN) PIXE setups. The sample is a fine-grained metapyroxenite, grading to coarse-grained at the base, with disseminated sulphides and fine veinlets of pyrrhotite and pyrite. Matrix composition was obtained at the standard PIXE setup using a 1.25 MeV H⁺ beam at three different spots. Medium and high Z elemental concentrations were then determined using the DT2fit and DT2simul codes (Reis et al., 2008, 2013 [1,2]), on the spectra obtained in the High Resolution and High Energy (HRHE)-PIXE setup (Chaves et al., 2013 [3]) by irradiation of the sample with a 3.8 MeV proton beam provided by the CTN 3 MV Tandetron accelerator. In this paper we present the results, discuss the detection limits of the method, and assess the added value of the CdTe detector in this context.

  1. High-resolution AUV mapping and sampling of a deep hydrocarbon plume in the Gulf of Mexico

    Science.gov (United States)

    Ryan, J. P.; Zhang, Y.; Thomas, H.; Rienecker, E.; Nelson, R.; Cummings, S.

    2010-12-01

    During NOAA cruise GU-10-02 on the Ship Gordon Gunter, the Monterey Bay Aquarium Research Institute (MBARI) autonomous underwater vehicle (AUV) Dorado was deployed to map and sample a deep (900-1200 m) volume centered approximately seven nautical miles southwest of the Deepwater Horizon wellhead. Dorado was equipped to detect optical and chemical signals of hydrocarbons and to acquire targeted samples. The primary sensor reading used for hydrocarbon detection was colored dissolved organic matter (CDOM) fluorescence (CF). On June 2 and 3, ship cast and subsequent AUV surveys detected elevated CF in a layer between 1100 and 1200 m depth. While the deep volume was mapped in a series of parallel vertical sections, the AUV ran a peak-capture algorithm to target sample acquisition at layer signal peaks. Samples returned by ship CTD/CF rosette sampling and by AUV were preliminarily examined at sea, and they exhibited odor and fluorometric signal consistent with oil. More definitive and detailed results on these samples are forthcoming from shore-based laboratory analyses. During post-cruise analysis, all of the CF data were analyzed to objectively define and map the deep plume feature. Specifically, the maximum expected background CF over the depth range 1000-1200 m was extrapolated from a linear relationship between depth and maximum CF over the depth range 200 to 1000 m. Values exceeding the maximum expected background in the depth range 1000-1200 m were interpreted as signal from a hydrocarbon-enriched plume. Using this definition we examine relationships between CF and other AUV measurements within the plume, illustrate the three-dimensional structure of the plume boundary region that was mapped, describe small-scale layering on isopycnals, and examine short-term variations in plume depth, intensity and hydrographic relationships. Three-dimensional representation of part of a deep hydrocarbon plume mapped and sampled by AUV on June 2-3, 2010.

  2. Hyperbolic mapping of complex networks based on community information

    Science.gov (United States)

    Wang, Zuxi; Li, Qingguang; Jin, Fengdong; Xiong, Wei; Wu, Yao

    2016-08-01

    To improve hyperbolic mapping methods in terms of both accuracy and running time, a novel mapping method called Community and Hyperbolic Mapping (CHM) is proposed in this paper, based on community information. Firstly, an index called Community Intimacy (CI) is presented to measure the adjacency relationship between communities, based on which a community ordering algorithm is introduced. According to the proposed Community-Sector hypothesis, which supposes that most nodes of one community gather in the same sector of hyperbolic space, CHM maps the ordered communities into hyperbolic space, and the angular coordinates of nodes are then randomly initialized within the sector they belong to. In this way, all network nodes are mapped to hyperbolic space, and the initialized angular coordinates can then be optimized by employing the information of all nodes, which greatly improves the algorithm's precision. By applying the proposed dual-layer angle sampling method in the optimization procedure, CHM reduces the time complexity to O(n²). The experiments show that our algorithm outperforms the state-of-the-art methods.
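
    Methods in this family place each node at polar coordinates (r, θ) in the hyperbolic disk and score embeddings with the native hyperbolic distance. A hedged sketch of that distance (curvature −1, the standard hyperbolic law of cosines; this is generic geometry, not CHM-specific code):

```python
import math

def hyperbolic_distance(r1, t1, r2, t2):
    """Hyperbolic law of cosines in native polar coordinates (curvature -1)."""
    # Angular separation folded into [0, pi].
    dtheta = math.pi - abs(math.pi - abs(t1 - t2) % (2.0 * math.pi))
    c = (math.cosh(r1) * math.cosh(r2)
         - math.sinh(r1) * math.sinh(r2) * math.cos(dtheta))
    return math.acosh(max(c, 1.0))   # clamp guards rounding just below 1.0
```

    For large radii this is approximately r1 + r2 + 2 ln sin(Δθ/2), which is why nodes of one community, packed into a narrow sector, end up mutually close.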

  3. Generating a Danish raster-based topsoil property map combining choropleth maps and point information

    DEFF Research Database (Denmark)

    Greve, Mogens H.; Greve, Mette B.; Bøcher, Peder K.

    2007-01-01

    The Danish environmental authorities have posed a soil type dependent restriction on the application of nitrogen. The official Danish soil map is a choropleth topsoil map classifying the agricultural land into eight classes. The use of the soil map has shown that the maps have serious classification flaws. The objective of this work is to compile a continuous national topsoil texture map to replace the old topsoil map. Approximately 45,000 point samples were interpolated using ordinary kriging in 250 m × 250 m cells. To reduce variability and to obtain more homogeneous strata, the samples were stratified according to landscape types. Five new soil texture maps were compiled, one for each of the five textural classes, and a new categorical soil type map was compiled using the old classification system. Both the old choropleth map and the new continuous soil maps were compared to 354

  4. Terrestrial gamma radiation baseline mapping using ultra low density sampling methods

    International Nuclear Information System (INIS)

    Kleinschmidt, R.; Watson, D.

    2016-01-01

    Baseline terrestrial gamma radiation maps are indispensable for providing basic reference information that may be used in assessing the impact of a radiation related incident, performing epidemiological studies, remediating land contaminated with radioactive materials, assessment of land use applications and resource prospectivity. For a large land mass, such as Queensland, Australia (over 1.7 million km²), it is prohibitively expensive and practically difficult to undertake detailed in-situ radiometric surveys of this scale. It is proposed that an existing, ultra-low density sampling program already undertaken for the purpose of a nationwide soil survey project be utilised to develop a baseline terrestrial gamma radiation map. Geoelement data derived from the National Geochemistry Survey of Australia (NGSA) was used to construct a baseline terrestrial gamma air kerma rate map, delineated by major drainage catchments, for Queensland. Three drainage catchments (sampled at the catchment outlet) spanning low, medium and high radioelement concentrations were selected for validation of the methodology using radiometric techniques including in-situ measurements and soil sampling for high resolution gamma spectrometry, and comparative non-radiometric analysis. A Queensland mean terrestrial air kerma rate, as calculated from the NGSA outlet sediment uranium, thorium and potassium concentrations, of 49 ± 69 nGy h⁻¹ (n = 311, 3σ 99% confidence level) is proposed as being suitable for use as a generic terrestrial air kerma rate background range. Validation results indicate that catchment outlet measurements are representative of the range of results obtained across the catchment and that the NGSA geoelement data is suitable for calculation and mapping of terrestrial air kerma rate. - Highlights: • A baseline terrestrial air kerma map of Queensland, Australia was developed using geochemical data from a major drainage catchment ultra-low density sampling program
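
    The conversion behind such maps, from U, Th and K concentrations to a terrestrial air kerma rate, can be sketched with commonly quoted coefficients. The activity-per-concentration and dose-per-activity factors below are the usual UNSCEAR/IAEA values, stated here as assumptions rather than the exact factors the authors used:

```python
# Activity per unit concentration (assumed standard conversion values).
BQ_PER_PPM_U = 12.35    # Bq/kg of 238U per ppm U
BQ_PER_PPM_TH = 4.06    # Bq/kg of 232Th per ppm Th
BQ_PER_PCT_K = 313.0    # Bq/kg of 40K per % K

# Air dose rate 1 m above ground per unit soil activity (nGy/h per Bq/kg).
D_U, D_TH, D_K = 0.462, 0.604, 0.0417

def terrestrial_air_kerma_rate(u_ppm, th_ppm, k_pct):
    """Terrestrial gamma air kerma rate (nGy/h) from soil concentrations."""
    return (D_U * BQ_PER_PPM_U * u_ppm
            + D_TH * BQ_PER_PPM_TH * th_ppm
            + D_K * BQ_PER_PCT_K * k_pct)

rate = terrestrial_air_kerma_rate(2.7, 11.0, 1.5)  # roughly crustal-average soil
```

    For typical crustal abundances this lands near the worldwide average of ~60 nGy/h, consistent in magnitude with the Queensland mean quoted above.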

  5. GeneRecon Users' Manual — A coalescent based tool for fine-scale association mapping

    DEFF Research Database (Denmark)

    Mailund, T

    2006-01-01

    GeneRecon is a software package for linkage disequilibrium mapping using coalescent theory. It is based on a Bayesian Markov chain Monte Carlo (MCMC) method for fine-scale linkage-disequilibrium gene mapping using high-density marker maps. GeneRecon explicitly models the genealogy of a sample of th...

  6. Local activation time sampling density for atrial tachycardia contact mapping: how much is enough?

    Science.gov (United States)

    Williams, Steven E; Harrison, James L; Chubb, Henry; Whitaker, John; Kiedrowicz, Radek; Rinaldi, Christopher A; Cooklin, Michael; Wright, Matthew; Niederer, Steven; O'Neill, Mark D

    2018-02-01

    Local activation time (LAT) mapping forms the cornerstone of atrial tachycardia diagnosis. Although the anatomic and positional accuracy of electroanatomic mapping (EAM) systems has been validated, the effect of electrode sampling density on LAT map reconstruction is not known. Here, we study the effect of chamber geometry and activation complexity on optimal LAT sampling density using a combined in silico and in vivo approach. In vivo, 21 atrial tachycardia maps were studied in three groups: (1) focal activation, (2) macro-re-entry, and (3) localized re-entry. In silico, activation was simulated on a 4 × 4 cm atrial monolayer, sampled randomly at 0.25-10 points/cm² and used to re-interpolate LAT maps. Activation patterns were studied in the geometrically simple porcine right atrium (RA) and the complex human left atrium (LA). Activation complexity was introduced into the porcine RA by incomplete inter-caval linear ablation. In all cases, optimal sampling density was defined as the highest density resulting in minimal further error reduction in the re-interpolated maps. Optimal sampling densities for LA tachycardias were 0.67 ± 0.17 points/cm² (focal activation), 1.05 ± 0.32 points/cm² (macro-re-entry) and 1.23 ± 0.26 points/cm² (localized re-entry), P = 0.0031. Increasing activation complexity was associated with increased optimal sampling density both in silico (focal activation 1.09 ± 0.14 points/cm²; re-entry 1.44 ± 0.49 points/cm²; spiral-wave 1.50 ± 0.34 points/cm², P ...) and ... density (0.61 ± 0.22 points/cm² vs. 1.0 ± 0.34 points/cm², P = 0.0015). Optimal sampling densities can be identified to maximize the diagnostic yield of LAT maps. Greater sampling density is required to correctly reveal complex activation and represent activation across complex geometries. Overall, the optimal sampling density for LAT map interpolation defined in this study was ∼1.0-1.5 points/cm². Published on behalf of the European Society of
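The idea of an optimal sampling density can be sketched numerically: sample a known activation field at random points, rebuild the map by interpolation, and watch the reconstruction error plateau as density rises. The sketch below is not the authors' in silico model; the focal-source field, conduction velocity and nearest-neighbour interpolation are illustrative assumptions on a 4 × 4 cm sheet.

```python
import math, random

def focal_lat(x, y, v=0.5):
    """Local activation time (ms) for a focal source at the origin,
    with conduction velocity v in mm/ms (illustrative values)."""
    return math.hypot(x, y) / v

def interpolation_error(n_samples, size=40.0, grid=20, seed=1):
    """Sample n_samples random points on a size x size mm sheet, rebuild the
    LAT map by nearest-neighbour interpolation and return the RMS error (ms)."""
    rng = random.Random(seed)
    pts = [(rng.uniform(0, size), rng.uniform(0, size)) for _ in range(n_samples)]
    err2 = 0.0
    for i in range(grid):
        for j in range(grid):
            x = (i + 0.5) * size / grid
            y = (j + 0.5) * size / grid
            nearest = min(pts, key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)
            err2 += (focal_lat(*nearest) - focal_lat(x, y)) ** 2
    return math.sqrt(err2 / grid ** 2)

# Error falls steeply at first, then plateaus as sampling density rises
for n in (4, 16, 64, 256):
    print(n / 16.0, "points/cm^2 ->", round(interpolation_error(n), 2), "ms RMS")
```

The "optimal" density in the abstract corresponds to the knee of this error curve, beyond which extra points buy little accuracy.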

  7. Large area synchrotron X-ray fluorescence mapping of biological samples

    International Nuclear Information System (INIS)

    Kempson, I.; Thierry, B.; Smith, E.; Gao, M.; De Jonge, M.

    2014-01-01

    Large-area mapping of inorganic material in biological samples has suffered severely from prohibitively long acquisition times. With the advent of new detector technology we can now generate statistically relevant information for studying cell populations, inter-variability and bioinorganic chemistry in large specimens. We have been implementing ultrafast synchrotron-based XRF mapping, afforded by the MAIA detector, for large-area mapping of biological material. For example, a 2.5 million pixel map can be acquired in 3 hours, compared to a typical synchrotron XRF set-up needing over 1 month of uninterrupted beamtime. Of particular focus to us is the fate of metals and nanoparticles in cells, 3D tissue models and animal tissues. Large-area scanning has for the first time provided statistically significant information on sufficiently large numbers of cells to provide data on intercellular variability in uptake of nanoparticles. Techniques such as flow cytometry generally require analysis of thousands of cells for statistically meaningful comparison, due to the large degree of variability. Large-area XRF now gives comparable information in a quantifiable manner. Furthermore, we can now image localised deposition of nanoparticles in tissues that would be highly improbable to 'find' by typical XRF imaging. In addition, the ultrafast nature also makes it viable to conduct 3D XRF tomography over large dimensions. This technology opens new opportunities in biomonitoring and understanding metal and nanoparticle fate ex vivo. Following from this is extension to molecular imaging through specific antibody-targeted nanoparticles to label specific tissues and monitor cellular processes or biological consequences.

  8. BaseMap

    Data.gov (United States)

    California Natural Resource Agency — The goal of this project is to provide a convenient base map that can be used as a starting point for CA projects. It's simple, but designed to work at a number of...

  9. Soil map disaggregation improved by soil-landscape relationships, area-proportional sampling and random forest implementation

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Malone, Brendan P.; Odgers, Nathan

    Detailed soil information is often needed to support agricultural practices, environmental protection and policy decisions. Several digital approaches can be used to map soil properties based on field observations. When soil observations are sparse or missing, an alternative approach is to disaggregate existing conventional soil maps. At present, the DSMART algorithm represents the most sophisticated approach for disaggregating conventional soil maps (Odgers et al., 2014). The algorithm relies on classification trees trained from resampled points, which are assigned classes according... implementation generally improved the algorithm's ability to predict the correct soil class. The implementation of soil-landscape relationships and area-proportional sampling generally increased the calculation time, while the random forest implementation reduced the calculation time. In the most successful...

  10. CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.

    Science.gov (United States)

    Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng

    2017-01-01

    Standardized terminology is a prerequisite of data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidential model to determine how terminologies are mapped to a standard system, like ICD-10. The model uses mappings from different health care organizations and evaluates the diversity of the mapping to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping result and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of terminology mapping.

  11. Vision-based mapping with cooperative robots

    Science.gov (United States)

    Little, James J.; Jennings, Cullen; Murray, Don

    1998-10-01

    Two stereo-vision-based mobile robots navigate and autonomously explore their environment safely while building occupancy grid maps of the environment. The robots maintain position estimates within a global coordinate frame using landmark recognition. This allows them to build a common map by sharing position information and stereo data. Stereo vision processing and map updates are done at 3 Hz and the robots move at speeds of 200 cm/s. Cooperative mapping is achieved through autonomous exploration of unstructured and dynamic environments. The map is constructed conservatively, so as to be useful for collision-free path planning. Each robot maintains a separate copy of a shared map, and then posts updates to the common map when it returns to observe a landmark at home base. Issues include synchronization, mutual localization, navigation, exploration, registration of maps, merging repeated views (fusion), centralized vs decentralized maps.
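Occupancy grid maps of this kind are conventionally updated in log-odds form, which makes fusing repeated observations (including those contributed by a second robot) a simple per-cell addition. A minimal sketch, with an assumed 0.7/0.3 inverse sensor model rather than anything from the paper:

```python
import math

# Assumed inverse sensor model: P(occupied | hit) = 0.7, P(occupied | miss) = 0.3
L_OCC = math.log(0.7 / 0.3)
L_FREE = math.log(0.3 / 0.7)

def update(logodds, cell, hit):
    """Bayesian occupancy update in log-odds form: each observation is additive."""
    logodds[cell] = logodds.get(cell, 0.0) + (L_OCC if hit else L_FREE)

def prob(logodds, cell):
    """Convert a cell's log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds.get(cell, 0.0)))

grid = {}                       # sparse map: cell -> log-odds
for _ in range(3):              # three passes (or robots) report the same obstacle
    update(grid, (4, 2), hit=True)
update(grid, (1, 1), hit=False)
print(round(prob(grid, (4, 2)), 3))   # confidence in the obstacle rises toward 1
print(round(prob(grid, (1, 1)), 3))   # a miss pushes the cell toward free
```

Because updates commute, two robots posting their observations to a shared map in any order converge to the same grid, which is what makes the deferred "post updates at home base" scheme workable.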

  12. European wet deposition maps based on measurements

    NARCIS (Netherlands)

    Leeuwen EP van; Erisman JW; Draaijers GPJ; Potma CJM; Pul WAJ van; LLO

    1995-01-01

    To date, wet deposition maps on a European scale have been based on long-range transport model results. For most components wet deposition maps based on measurements are only available on national scales. Wet deposition maps of acidifying components and base cations based on measurements are needed

  13. Wind class sampling of satellite SAR imagery for offshore wind resource mapping

    DEFF Research Database (Denmark)

    Badger, Merete; Badger, Jake; Nielsen, Morten

    2010-01-01

    High-resolution wind fields retrieved from satellite synthetic aperture radar (SAR) imagery are combined for mapping of wind resources offshore where site measurements are costly and sparse. A new sampling strategy for the SAR scenes is introduced, based on a method for statistical-dynamical downscaling of large-scale wind conditions using a set of wind classes that describe representative wind situations. One or more SAR scenes are then selected to represent each wind class and the classes are weighted according to their frequency of occurrence. The wind class methodology was originally developed for mesoscale modeling of wind resources. Its performance in connection with sampling of SAR scenes is tested against two sets of random SAR samples and meteorological observations at three sites in the North Sea during 2005-08. Predictions of the mean wind speed and the Weibull scale parameter...

  14. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Sporring, Jon; Fogh Olsen, Ole

    2008-01-01

    ... To address this problem, we introduce a photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across them. In this way, we preserve important illumination features, while...
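The abstract is truncated, but the core idea of edge-preserving nonlinear diffusion can be illustrated on a 1-D signal with a Perona-Malik-style conductance (an assumed stand-in for the photon-map filter, not the authors' algorithm): smoothing is strong where gradients are small and suppressed across sharp transitions.

```python
def perona_malik_1d(signal, n_iter=20, kappa=0.3, dt=0.2):
    """Edge-preserving nonlinear diffusion on a 1-D signal: the conductance
    shrinks where the local gradient is large, so noise is averaged out
    while sharp transitions survive (Perona-Malik style, assumed here)."""
    u = list(signal)
    for _ in range(n_iter):
        new = u[:]
        for i in range(1, len(u) - 1):
            ge = u[i + 1] - u[i]                  # east gradient
            gw = u[i - 1] - u[i]                  # west gradient
            ce = 1.0 / (1.0 + (ge / kappa) ** 2)  # conductance, small at edges
            cw = 1.0 / (1.0 + (gw / kappa) ** 2)
            new[i] = u[i] + dt * (ce * ge + cw * gw)
        u = new
    return u

step = [0.0] * 10 + [1.0] * 10                       # a hard illumination edge
noisy = [v + 0.05 * ((-1) ** i) for i, v in enumerate(step)]
smoothed = perona_malik_1d(noisy)
print(round(smoothed[5], 2), round(smoothed[14], 2))  # noise damped, edge kept
```

In the photon-mapping setting the same principle operates in 2-D over the photon density, so caustics and shadow boundaries are not blurred away by the radiance estimate.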

  15. Mapping of depleted uranium with in situ spectrometry and soil samples

    International Nuclear Information System (INIS)

    Shebell, P.; Reginatto, M.; Monetti, M.; Faller, S.; Davis, L.

    1999-01-01

    Depleted uranium (DU) has been developed over the past two decades as a highly effective material for armor-penetrating rounds and vehicle shielding. There is now a growing interest in the defense community in determining the presence and extent of DU contamination quickly and with a minimum amount of intrusive sampling. We report on a new approach using deconvolution techniques to quantitatively map DU contamination in surface soil. This approach combines data from soil samples with data from in situ gamma-ray spectrometry measurements to produce an accurate and detailed map of DU contamination. Results of a field survey at the Aberdeen Proving Ground are presented. (author)

  16. Measurable realistic image-based 3D mapping

    Science.gov (United States)

    Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.

    2011-12-01

    Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data are obtained for the construction of 3D visualized models. The 3D map not only provides the capabilities of 3D measurements and knowledge mining, but also provides the virtual experience of places of interest, as demonstrated in Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high-quality 3D models is time consuming, and requires robust hardware and powerful software to handle the enormous amount of data. This is especially true for the automatic implementation of 3D models and the representation of complicated surfaces, which still need improvements in visualisation techniques. The shortcoming of 3D model-based maps is the limited coverage of detail, since a user can only view and measure objects that are already modelled in the virtual environment. This paper proposes and demonstrates a 3D map concept that is realistic and image-based, and that enables geometric measurements and geo-location services. Additionally, image-based 3D maps provide more detailed information about the real world than 3D model-based maps. Image-based 3D maps use geo-referenced stereo images or panoramic images. The geometric relationships between objects in the images can be resolved from the geometric model of stereo images. The panoramic function makes 3D maps more interactive with users and also creates an interesting immersive circumstance. Unmeasurable image-based 3D maps already exist, such as Google Street View, but they only provide virtual experiences in terms of photos; topographic and terrain attributes, such as shapes and heights, are omitted.
    This paper also discusses the potential for using a low-cost land Mobile Mapping System (MMS) to implement realistic image 3D mapping, and evaluates the positioning accuracy that a measureable...

  17. USGS Imagery Only Base Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — USGS Imagery Only is a tile cache base map of orthoimagery in The National Map visible to the 1:18,000 scale. Orthoimagery data are typically high resolution images...

  18. 36 CFR 9.42 - Well records and reports, plots and maps, samples, tests and surveys.

    Science.gov (United States)

    2010-07-01

    ... Well records and reports, plots and maps, samples, tests and surveys. Any technical data gathered... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Well records and reports, plots and maps, samples, tests and surveys. 9.42 Section 9.42 Parks, Forests, and Public Property...

  19. Interfacial orientation and misorientation relationships in nanolamellar Cu/Nb composites using transmission-electron-microscope-based orientation and phase mapping

    International Nuclear Information System (INIS)

    Liu, X.; Nuhfer, N.T.; Rollett, A.D.; Sinha, S.; Lee, S.-B.; Carpenter, J.S.; LeDonne, J.E.; Darbal, A.; Barmak, K.

    2014-01-01

    A transmission-electron-microscope-based orientation mapping technique that makes use of beam precession to achieve near-kinematical conditions was used to map the phase and crystal orientations in nanolamellar Cu/Nb composites with average layer thicknesses of 86, 30 and 18 nm. Maps of high quality and reliability were obtained by comparing the recorded diffraction patterns with pre-calculated templates. Particular care was taken in optimizing the dewarping parameters and in calibrating the frames of reference. Layers with thicknesses as low as 4 nm were successfully mapped. Heterophase interface plane and character distributions (HIPD and HICD, respectively) of Cu and Nb phases from the samples were determined from the orientation maps. In addition, local orientation relation stereograms of the Cu/Nb interfaces were calculated, and these revealed the detailed layer-to-layer texture information. The results are in agreement with previously reported neutron-diffraction-based and precession-electron-diffraction-based measurements on an accumulated roll bonding (ARB)-fabricated Cu/Nb sample with an average layer thickness of 30 nm as well as scanning-electron-microscope-based electron backscattered diffraction HIPD/HICD plots of ARB-fabricated Cu/Nb samples with layer thicknesses between 200 and 600 nm.

  20. Terrestrial gamma radiation baseline mapping using ultra low density sampling methods.

    Science.gov (United States)

    Kleinschmidt, R; Watson, D

    2016-01-01

    Baseline terrestrial gamma radiation maps are indispensable for providing basic reference information that may be used in assessing the impact of a radiation related incident, performing epidemiological studies, remediating land contaminated with radioactive materials, assessment of land use applications and resource prospectivity. For a large land mass, such as Queensland, Australia (over 1.7 million km²), it is prohibitively expensive and practically difficult to undertake detailed in-situ radiometric surveys of this scale. It is proposed that an existing, ultra-low density sampling program already undertaken for the purpose of a nationwide soil survey project be utilised to develop a baseline terrestrial gamma radiation map. Geoelement data derived from the National Geochemistry Survey of Australia (NGSA) was used to construct a baseline terrestrial gamma air kerma rate map, delineated by major drainage catchments, for Queensland. Three drainage catchments (sampled at the catchment outlet) spanning low, medium and high radioelement concentrations were selected for validation of the methodology using radiometric techniques including in-situ measurements and soil sampling for high resolution gamma spectrometry, and comparative non-radiometric analysis. A Queensland mean terrestrial air kerma rate, as calculated from the NGSA outlet sediment uranium, thorium and potassium concentrations, of 49 ± 69 nGy h⁻¹ (n = 311, 3σ 99% confidence level) is proposed as being suitable for use as a generic terrestrial air kerma rate background range. Validation results indicate that catchment outlet measurements are representative of the range of results obtained across the catchment and that the NGSA geoelement data is suitable for calculation and mapping of terrestrial air kerma rate. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
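The conversion from soil radioelement concentrations to a terrestrial gamma dose rate in air is conventionally a linear combination of the potassium, uranium and thorium contents. The coefficients below are approximate values from IAEA/UNSCEAR compilations, taken as assumptions here rather than the ones used in the paper:

```python
# Approximate conversion coefficients (nGy/h per unit concentration) after
# IAEA/UNSCEAR compilations -- assumptions here, not the paper's own values.
C_K = 13.08    # per % potassium
C_U = 5.67     # per ppm uranium
C_TH = 2.49    # per ppm thorium

def terrestrial_dose_rate(k_pct, u_ppm, th_ppm):
    """Terrestrial gamma dose rate in air (nGy/h) from soil radioelement content."""
    return C_K * k_pct + C_U * u_ppm + C_TH * th_ppm

# Illustrative, roughly crustal-average abundances: 2% K, 2.7 ppm U, 9.6 ppm Th
print(round(terrestrial_dose_rate(2.0, 2.7, 9.6), 1), "nGy/h")
```

Applying such a formula to each catchment-outlet sediment sample is what turns the NGSA geochemical table into an air kerma rate map.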

  1. Mapping Rice Cropping Systems in Vietnam Using an NDVI-Based Time-Series Similarity Measurement Based on DTW Distance

    Directory of Open Access Journals (Sweden)

    Xudong Guan

    2016-01-01

    Full Text Available Normalized Difference Vegetation Index (NDVI derived from Moderate Resolution Imaging Spectroradiometer (MODIS time-series data has been widely used in the fields of crop and rice classification. The cloudy and rainy weather characteristics of the monsoon season greatly reduce the likelihood of obtaining high-quality optical remote sensing images. In addition, the diverse crop-planting system in Vietnam also hinders the comparison of NDVI among different crop stages. To address these problems, we apply a Dynamic Time Warping (DTW distance-based similarity measure approach and use the entire yearly NDVI time series to reduce the inaccuracy of classification using a single image. We first de-noise the NDVI time series using Savitzky-Golay (S-G filtering based on the TIMESAT software. Then, a standard NDVI time-series base for rice growth is established based on field survey data and Google Earth sample data. NDVI time-series data for each pixel are constructed and the DTW distance with the standard rice growth NDVI time series is calculated. Then, we apply thresholds to extract rice growth areas. A qualitative assessment using statistical data and a spatial assessment using sampled data from the rice-cropping map reveal a high mapping accuracy at the national scale against the statistical data, with the corresponding R² being as high as 0.809; however, the mapped rice accuracy decreased at the provincial scale due to the reduced number of rice planting areas per province. An analysis of the results indicates that the 500-m resolution MODIS data are limited in terms of mapping scattered rice parcels. The results demonstrate that the DTW-based similarity measure of the NDVI time series can be effectively used to map large-area rice cropping systems with diverse cultivation processes.
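The DTW distance at the heart of this approach is a short dynamic program; a minimal sketch (with made-up NDVI curves) shows why it tolerates the phase shifts that diverse planting calendars introduce:

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-time-warping distance between
    two time series given as lists of floats."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

reference = [0.2, 0.4, 0.7, 0.8, 0.5, 0.3]   # hypothetical rice-growth NDVI curve
shifted   = [0.2, 0.2, 0.4, 0.7, 0.8, 0.5]   # same curve, delayed one step
flat      = [0.5, 0.5, 0.5, 0.5, 0.5, 0.5]   # non-rice pixel
print(dtw_distance(reference, shifted) < dtw_distance(reference, flat))
```

A delayed planting date leaves the DTW distance small, whereas a Euclidean point-by-point comparison would penalize the shift heavily; thresholding this distance is what separates rice from non-rice pixels in the paper's workflow.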

  2. A cache-friendly sampling strategy for texture-based volume rendering on GPU

    Directory of Open Access Journals (Sweden)

    Junpeng Wang

    2017-06-01

    Full Text Available Texture-based volume rendering is a memory-intensive algorithm. Its performance relies heavily on the performance of the texture cache. However, most existing texture-based volume rendering methods blindly map computational resources to texture memory and result in incoherent memory access patterns, causing low cache hit rates in certain cases. The distance between samples taken by threads of an atomic scheduling unit (e.g. a warp of 32 threads in CUDA) of the GPU is a crucial factor that affects texture cache performance. Based on this fact, we present a new sampling strategy, called Warp Marching, for the ray-casting algorithm of texture-based volume rendering. The effects of different sample organizations and different thread-pixel mappings in the ray-casting algorithm are thoroughly analyzed. Also, a pipelined color-blending approach is introduced, and the power of warp-level GPU operations is leveraged to improve the efficiency of parallel executions on the GPU. In addition, the rendering performance of the Warp Marching is view-independent, and it outperforms existing empty-space-skipping techniques in scenarios that need to render large dynamic volumes in a low-resolution image. Through a series of micro-benchmarking and real-life data experiments, we rigorously analyze our sampling strategies and demonstrate significant performance enhancements over existing sampling methods.

  3. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    International Nuclear Information System (INIS)

    Campolina, Daniel; Lima, Paulo Rubens I.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2015-01-01

    Sample size and computational uncertainty were varied in order to investigate sampling efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate an LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. Mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 replicates of n samples was adopted as the convergence criterion for the method. An uncertainty of 75 pcm on the reactor k-eff was estimated using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found, for the example under investigation, that it is preferable to double the sample size rather than to double the number of particles followed in the Monte Carlo process of the MCNPX code. (author)
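The sampling-based propagation scheme itself is generic: draw the uncertain input from its distribution, run the model once per draw, and summarize the outputs. A toy sketch, in which an invented linear response stands in for the MCNPX transport run (the slope, input mean and spread are all assumptions):

```python
import random
import statistics

def model_keff(radius_cm):
    """Stand-in for the transport calculation: a toy linear response of k-eff
    to the burnable poison radius. The slope and values are invented; a real
    study would call MCNPX here, at far greater cost per sample."""
    return 1.000 - 0.05 * (radius_cm - 0.5)

def propagate(n_samples, mu=0.5, sigma=0.01, seed=7):
    """Draw the uncertain input, evaluate the model per draw, summarize outputs."""
    rng = random.Random(seed)
    keffs = [model_keff(rng.gauss(mu, sigma)) for _ in range(n_samples)]
    return statistics.mean(keffs), statistics.stdev(keffs)

mean_keff, sd_keff = propagate(93)        # sample size used in the abstract
print(round(mean_keff, 5), round(sd_keff * 1e5), "pcm")
```

The trade-off the abstract describes is between two noise sources: the spread of the sample (reduced by more draws) and the per-run computational uncertainty (reduced by more particles per transport run); here only the former exists, since the toy model is deterministic.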

  4. Bridging scale gaps between regional maps of forest aboveground biomass and field sampling plots using TanDEM-X data

    Science.gov (United States)

    Ni, W.; Zhang, Z.; Sun, G.

    2017-12-01

    Several large-scale maps of forest AGB have been released [1] [2] [3]. However, these existing global or regional datasets were only approximations based on combining land cover type and representative values instead of measurements of actual forest aboveground biomass or forest heights [4]. Rodríguez-Veiga et al. [5] reported obvious discrepancies of existing forest biomass stock maps with in-situ observations in Mexico. One of the biggest challenges to the credibility of these maps comes from the scale gaps between the size of the field sampling plots used to develop (or validate) estimation models and the pixel size of these maps, and from the limited availability of field sampling plots of sufficient size for the verification of these products [6]. It is time-consuming and labor-intensive to collect a sufficient number of field sampling data over plots of the same size as the resolutions of regional maps. Smaller field sampling plots cannot fully represent the spatial heterogeneity of forest stands, as shown in Figure 1. Forest AGB is directly determined by forest heights, diameter at breast height (DBH) of each tree, forest density and tree species. What is measured in field sampling are the geometrical characteristics of forest stands, including the DBH, tree heights and forest densities. LiDAR data are considered the best dataset for the estimation of forest AGB. The main reason is that LiDAR can directly capture geometrical features of forest stands through its range detection capabilities. A remotely sensed dataset that is capable of direct measurements of forest spatial structures may serve as a ladder to bridge the scale gaps between the pixel size of regional maps of forest AGB and field sampling plots. Several studies report that TanDEM-X data can be used to characterize forest spatial structures [7, 8]. In this study, the forest AGB map of northeast China was produced using ALOS/PALSAR data, taking TanDEM-X data as a bridge.
The TanDEM-X InSAR data used in

  5. Fine mapping quantitative trait loci under selective phenotyping strategies based on linkage and linkage disequilibrium criteria

    DEFF Research Database (Denmark)

    Ansari-Mahyari, S; Berg, P; Lund, M S

    2009-01-01

    ... disequilibrium-based sampling criteria (LDC) for selecting individuals to phenotype are compared to random phenotyping in a quantitative trait loci (QTL) verification experiment using stochastic simulation. Several strategies based on LAC and LDC for selecting the most informative 30%, 40% or 50% of individuals for phenotyping to extract maximum power and precision in a QTL fine mapping experiment were developed and assessed. Linkage analyses for the mapping were performed for individuals sampled on LAC within families, and combined linkage disequilibrium and linkage analyses were performed for individuals sampled across the whole population based on LDC. The results showed that selecting individuals with similar haplotypes to the paternal haplotypes (minimum recombination criterion) using LAC compared to random phenotyping gave at least the same power to detect a QTL but decreased the accuracy of the QTL position. However...

  6. X-ray fluorescence microscopy artefacts in elemental maps of topologically complex samples: Analytical observations, simulation and a map correction method

    Science.gov (United States)

    Billè, Fulvio; Kourousias, George; Luchinat, Enrico; Kiskinova, Maya; Gianoncelli, Alessandra

    2016-08-01

    XRF spectroscopy is among the most widely used non-destructive techniques for elemental analysis. Despite the known angular dependence of X-ray fluorescence (XRF), topological artefacts remain an unresolved issue when using X-ray micro- or nano-probes. In this work we investigate the origin of the artefacts in XRF imaging of topologically complex samples, a particular problem in studies of organic matter due to the limited travel distance of low-energy XRF emission from the light elements. In particular, we mapped Human Embryonic Kidney (HEK293T) cells. The exemplary results with biological samples, obtained with a soft X-ray scanning microscope installed at a synchrotron facility, were used for testing a mathematical model based on detector response simulations, and for proposing an artefact correction method based on directional derivatives. Despite the peculiar and specific application, the methodology can be easily extended to hard X-rays and to set-ups with multi-array detector systems when the dimensions of surface reliefs are on the order of the probing beam size.

  7. A Lithology Based Map Unit Schema For Onegeology Regional Geologic Map Integration

    Science.gov (United States)

    Moosdorf, N.; Richard, S. M.

    2012-12-01

    A system of lithogenetic categories for a global lithological map (GLiM, http://www.ifbm.zmaw.de/index.php?id=6460&L=3) has been compiled based on analysis of lithology/genesis categories for regional geologic maps for the entire globe. The scheme is presented for discussion and comment. Analysis of units on a variety of regional geologic maps indicates that units are defined based on assemblages of rock types, as well as their genetic type. In this compilation of continental geology, outcropping surface materials are dominantly sediment/sedimentary rock; major subdivisions of the sedimentary category include clastic sediment, carbonate sedimentary rocks, clastic sedimentary rocks, mixed carbonate and clastic sedimentary rock, colluvium and residuum. Significant areas of mixed igneous and metamorphic rock are also present. A system of global categories to characterize the lithology of regional geologic units is important for Earth System models of matter fluxes to soils, ecosystems, rivers and oceans, and for regional analysis of Earth surface processes at global scale. Because different applications of the classification scheme will focus on different lithologic constituents in mixed units, an ontology-type representation of the scheme that assigns properties to the units in an analyzable manner will be pursued. The OneGeology project is promoting deployment of geologic map services at million scale for all nations. Although initial efforts are commonly simple scanned map WMS services, the intention is to move towards data-based map services that categorize map units with standard vocabularies to allow use of a common map legend for better visual integration of the maps (e.g. see OneGeology Europe, http://onegeology-europe.brgm.fr/geoportal/viewer.jsp). Current categorization of regional units with a single lithology from the CGI SimpleLithology (http://resource.geosciml.org/201202/Vocab2012html/SimpleLithology201012.html) vocabulary poorly captures the

  8. Estimating cross-validatory predictive p-values with integrated importance sampling for disease mapping models.

    Science.gov (United States)

    Li, Longhai; Feng, Cindy X; Qiu, Shi

    2017-06-30

    An important statistical task in disease mapping problems is to identify divergent regions with unusually high or low risk of disease. Leave-one-out cross-validatory (LOOCV) model assessment is the gold standard for estimating predictive p-values that can flag such divergent regions. However, actual LOOCV is time-consuming because one needs to rerun a Markov chain Monte Carlo analysis for each posterior distribution in which an observation is held out as a test case. This paper introduces a new method, called integrated importance sampling (iIS), for estimating LOOCV predictive p-values with only Markov chain samples drawn from the posterior based on the full data set. The key step in iIS is that we integrate away the latent variables associated with the test observation with respect to their conditional distribution without reference to the actual observation. By following the general theory for importance sampling, the formula used by iIS can be proved to be equivalent to the LOOCV predictive p-value. We compare iIS and three other existing methods in the literature on two disease mapping datasets. Our empirical results show that the predictive p-values estimated with iIS are almost identical to the predictive p-values estimated with actual LOOCV and outperform those given by the existing three methods, namely, the posterior predictive checking, the ordinary importance sampling, and the ghosting method by Marshall and Spiegelhalter (2003). Copyright © 2017 John Wiley & Sons, Ltd.
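For a toy conjugate model, the ordinary importance-sampling baseline that iIS improves upon can be written in a few lines: reweight full-data posterior draws by 1/p(y_i | θ) to mimic the leave-one-out posterior, then average the tail probability. The unit-variance normal-mean model and the data below are invented for illustration; they are not the paper's disease-mapping model.

```python
import math
import random

def phi(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def tail(z):
    """P(Z >= z) for a standard normal."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def is_loo_pvalue(y, i, n_draws=20000, seed=3):
    """Ordinary importance-sampling estimate of the LOOCV predictive p-value
    P(y_rep >= y_i | y_-i) for a unit-variance normal mean with a flat prior.
    Posterior draws come from the full data; the weight 1/p(y_i | mu)
    removes the held-out observation's contribution."""
    rng = random.Random(seed)
    n, ybar = len(y), sum(y) / len(y)
    num = den = 0.0
    for _ in range(n_draws):
        mu = rng.gauss(ybar, 1.0 / math.sqrt(n))  # full-data posterior draw
        w = 1.0 / phi(y[i] - mu)                  # importance weight
        num += w * tail(y[i] - mu)
        den += w
    return num / den

y = [0.1, -0.4, 0.3, 0.2, -0.1, 0.0, 0.4, -0.2, 4.0]  # last value is divergent
print(round(is_loo_pvalue(y, len(y) - 1), 4))  # tiny p-value flags the outlier
print(round(is_loo_pvalue(y, 0), 2))           # an ordinary point is unremarkable
```

The paper's point is that when latent variables sit between θ and y_i (as in disease mapping), weighting by 1/p(y_i | θ, latent) is unstable, and integrating the latent variables out first (iIS) recovers estimates close to actual LOOCV.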

  9. USGS Imagery Topo Large-scale Base Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS Imagery Topo Large service from The National Map (TNM) is a dynamic topographic base map service that combines the best available data (Boundaries,...

  10. Relationship between accuracy and number of samples on statistical quantity and contour map of environmental gamma-ray dose rate. Example of random sampling

    International Nuclear Information System (INIS)

    Matsuda, Hideharu; Minato, Susumu

    2002-01-01

    The accuracy of statistical quantities such as the mean value and the contour map obtained by measurement of the environmental gamma-ray dose rate was evaluated by random sampling of 5 different model distribution maps constructed with the mean slope, -1.3, of power spectra calculated from actually measured values. The values were derived from 58 natural gamma dose rate data sets reported worldwide, with means ranging over 10-100 nGy/h and areas over 10⁻³-10⁷ km². The accuracy of the mean value was around ±7% even for 60 or 80 samplings (the most frequent number), and the standard deviation had an accuracy less than 1/4-1/3 of the mean. The correlation coefficient of the frequency distribution was 0.860 or more for 200-400 samplings (the most frequent number), but that of the contour map was 0.502-0.770. (K.H.)
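The dependence of the mean-value error on sampling density can be reproduced in a few lines (a hypothetical log-normal grid stands in for a measured dose-rate map; NumPy assumed, not the study's actual data):

```python
import numpy as np

def rel_error_of_mean(field, n, n_rep=500, seed=0):
    """Relative error (%) of the sample mean for random samplings of
    size n from a gridded field, estimated by repeated subsampling."""
    rng = np.random.default_rng(seed)
    means = [rng.choice(field, size=n, replace=False).mean()
             for _ in range(n_rep)]
    return float(np.std(means) / field.mean() * 100)

# synthetic log-normal "dose-rate map" standing in for a measured field
field = np.random.default_rng(1).lognormal(mean=4.0, sigma=0.3, size=10_000)
errors = {n: rel_error_of_mean(field, n) for n in (60, 200, 1000)}
```

As in the abstract, the error shrinks far more slowly than the sampling effort grows (roughly as 1/√n), which is why halving the effort costs only a few percent in precision.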

  11. ActionMap: A web-based software that automates loci assignments to framework maps.

    Science.gov (United States)

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation may be a repetitive and time-consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software tool that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms was designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  12. Kalman/Map Filtering-Aided Fast Normalized Cross Correlation-Based Wi-Fi Fingerprinting Location Sensing

    Directory of Open Access Journals (Sweden)

    Yongliang Sun

    2013-11-01

    A Kalman/map filtering (KMF)-aided fast normalized cross correlation (FNCC)-based Wi-Fi fingerprinting location sensing system is proposed in this paper. Compared with conventional neighbor selection algorithms that calculate localization results with received signal strength (RSS) mean samples, the proposed FNCC algorithm makes use of all the on-line RSS samples and reference point RSS variations to achieve higher fingerprinting accuracy. The FNCC computes efficiently while maintaining the same accuracy as the basic normalized cross correlation. Additionally, a KMF is also proposed to process fingerprinting localization results. It employs a new map matching algorithm to nonlinearize the linear location prediction process of Kalman filtering (KF) that takes advantage of spatial proximities of consecutive localization results. With a calibration model integrated into an indoor map, the map matching algorithm corrects unreasonable prediction locations of the KF according to the building interior structure. Thus, more accurate prediction locations are obtained. Using these locations, the KMF considerably improves fingerprinting algorithm performance. Experimental results demonstrate that the FNCC algorithm with reduced computational complexity outperforms other neighbor selection algorithms and the KMF effectively improves location sensing accuracy by using indoor map information and spatial proximities of consecutive localization results.
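A toy sketch of the normalized cross correlation matching that FNCC builds on (plain NumPy; the fast computation scheme and the Kalman/map filter are not reproduced, and the fingerprint layout below is invented for illustration):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross correlation between two RSS vectors."""
    a = np.asarray(a, float) - np.mean(a)
    b = np.asarray(b, float) - np.mean(b)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def locate(online_rss, fingerprints):
    """Return the reference point whose stored fingerprint correlates
    best with the on-line RSS vector; fingerprints: {location: RSS list}."""
    return max(fingerprints, key=lambda loc: ncc(online_rss, fingerprints[loc]))
```

Correlating against the whole RSS vector, rather than comparing mean values, is what lets the approach exploit per-access-point variations.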

  14. Accuracy assessment of the National Forest Inventory map of Mexico: sampling designs and the fuzzy characterization of landscapes

    Directory of Open Access Journals (Sweden)

    Stéphane Couturier

    2009-10-01

    There is no record so far in the literature of a comprehensive method to assess the accuracy of regional-scale Land Cover/Land Use (LCLU) maps in the sub-tropical belt. The elevated biodiversity and the presence of highly fragmented classes hamper the use of sampling designs commonly employed in previous assessments, mainly of temperate zones. A sampling design for assessing the accuracy of the Mexican National Forest Inventory (NFI) map at community level is presented. A pilot study was conducted on the Cuitzeo Lake watershed region covering 400 000 ha of the 2000 Landsat-derived map. Various sampling designs were tested in order to find a trade-off between operational costs, a good spatial distribution of the sample and the inclusion of all scarcely distributed classes (‘rare classes’). A two-stage sampling design, in which Primary Sampling Units (PSU) were selected under separate schemes for commonly and scarcely distributed classes, showed the best characteristics. A total of 2 023 point secondary sampling units were verified against their NFI map label. Issues regarding the assessment strategy and trends in class confusions are discussed.
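Accuracy figures from such an assessment are ultimately derived from the sample confusion matrix; a small sketch (NumPy assumed; the 2 × 2 matrix in the test is a toy example, not the study's data):

```python
import numpy as np

def overall_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a confusion matrix whose
    rows are map classes and columns are reference classes."""
    c = np.asarray(confusion, dtype=float)
    n = c.sum()
    po = np.trace(c) / n                 # observed agreement
    pe = (c.sum(0) @ c.sum(1)) / n**2    # chance agreement from marginals
    return po, (po - pe) / (1 - pe)
```

Under a two-stage design the cell counts would additionally be weighted by inclusion probabilities before computing these statistics.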

  15. The evolution of mapping habitat for northern spotted owls (Strix occidentalis caurina): A comparison of photo-interpreted, Landsat-based, and lidar-based habitat maps

    Science.gov (United States)

    Ackers, Steven H.; Davis, Raymond J.; Olsen, K.; Dugger, Catherine

    2015-01-01

    Wildlife habitat mapping has evolved at a rapid pace over the last few decades. Beginning with simple, often subjective, hand-drawn maps, habitat mapping now involves complex species distribution models (SDMs) using mapped predictor variables derived from remotely sensed data. For species that inhabit large geographic areas, remote sensing technology is often essential for producing range-wide maps. Habitat monitoring for northern spotted owls (Strix occidentalis caurina), whose geographic range covers about 23 million ha, is based on SDMs that use Landsat Thematic Mapper imagery to create forest vegetation data layers using gradient nearest neighbor (GNN) methods. Vegetation data layers derived from GNN are modeled relationships between forest inventory plot data, climate and topographic data, and the spectral signatures acquired by the satellite. When these are used as predictor variables for SDMs, some of the GNN modeling error is transferred to the final habitat map. Recent increases in the use of light detection and ranging (lidar) data, coupled with the need to produce spatially accurate and detailed forest vegetation maps, have spurred interest in its use for SDMs and habitat mapping. Instead of modeling predictor variables from remotely sensed spectral data, lidar provides direct measurements of vegetation height for use in SDMs. We expect an SDM habitat map produced from directly measured predictor variables to be more accurate than one produced from modeled predictors. We used maximum entropy (Maxent) SDM modeling software to compare predictive performance and estimates of habitat area between Landsat-based and lidar-based northern spotted owl SDMs and habitat maps. We explored the differences and similarities between these maps and a pre-existing aerial photo-interpreted habitat map produced by local wildlife biologists. The lidar-based map had the highest predictive performance based on 10 bootstrapped replicate models (AUC = 0.809 ± 0.011), but the

  16. Integrating field sampling, geostatistics and remote sensing to map wetland vegetation in the Pantanal, Brazil

    Directory of Open Access Journals (Sweden)

    J. Arieira

    2011-03-01

    Development of efficient methodologies for mapping wetland vegetation is of key importance to wetland conservation. Here we propose the integration of a number of statistical techniques, in particular cluster analysis, universal kriging and error propagation modelling, to integrate observations from remote sensing and field sampling for mapping vegetation communities and estimating uncertainty. The approach results in seven vegetation communities with a known floral composition that can be mapped over large areas using remotely sensed data. The relationships between remotely sensed data and vegetation patterns, captured in four factorial axes, were described using multiple linear regression models. These were then used in a universal kriging procedure to reduce the mapping uncertainty. Cross-validation procedures and Monte Carlo simulations were used to quantify the uncertainty in the resulting map. Cross-validation showed that classification accuracy varies according to the community type, as a result of sampling density and configuration. A map of uncertainty derived from Monte Carlo simulations revealed significant spatial variation in classification, but this had little impact on the proportion and arrangement of the communities observed. These results suggest that mapping improvement could be achieved by increasing the number of field observations of those communities with a scattered and small patch-size distribution, or by including a larger number of digital images as explanatory variables in the model. Comparison of the resulting plant community map with a flood duration map revealed that flooding duration is an important driver of vegetation zonation. This mapping approach is able to integrate field point data and high-resolution remote-sensing images, providing a new basis to map wetland vegetation and allow its future application in habitat management, conservation assessment and long-term ecological monitoring in wetland

  17. PCR-Based EST Mapping in Wheat (Triticum aestivum L.)

    Directory of Open Access Journals (Sweden)

    J. PERRY GUSTAFSON

    2009-04-01

    Mapping expressed sequence tags (ESTs) to hexaploid wheat aims to reveal the structure and function of the hexaploid wheat genome. Sixty-eight ESTs representing 26 genes were mapped into all seven homologous chromosome groups of wheat (Triticum aestivum L.) using a polymerase chain reaction technique. The majority of the ESTs were mapped to homologous chromosome group 2, and the fewest were mapped to homologous chromosome group 6. Comparative analysis between the EST map from this study and the EST map based on RFLPs showed that the 14 genes mapped by both approaches were placed on the same arm of the same homologous chromosome, which indicated that using PCR-based ESTs is a reliable approach for mapping ESTs in hexaploid wheat.

  18. USGS Topo Base Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — USGS Topo is a topographic tile cache base map that combines the most current data (Boundaries, Names, Transportation, Elevation, Hydrography, Land Cover, and other...

  19. Landslide Inventory Mapping from Bitemporal 10 m SENTINEL-2 Images Using Change Detection Based Markov Random Field

    Science.gov (United States)

    Qin, Y.; Lu, P.; Li, Z.

    2018-04-01

    Landslide inventory mapping is essential for hazard assessment and mitigation. In most previous studies, landslide mapping was achieved by visual interpretation of aerial photos and remote sensing images. However, such a method is labor-intensive and time-consuming, especially over large areas. Although a number of semi-automatic landslide mapping methods have been proposed over the past few years, limitations remain in terms of their applicability across different study areas and data, and there is considerable room for improvement in accuracy and degree of automation. For these reasons, we developed a change detection-based Markov Random Field (CDMRF) method for landslide inventory mapping. The proposed method mainly includes two steps: 1) change detection-based multi-thresholding for training sample generation and 2) MRF for landslide inventory mapping. Compared with previous methods, the proposed method has three advantages: 1) it combines multiple image difference techniques with a multi-threshold method to generate reliable training samples; 2) it takes the spectral characteristics of landslides into account; and 3) it is highly automatic, with little parameter tuning. The proposed method was applied to regional landslide mapping from 10 m Sentinel-2 images in Western China. Results corroborated the effectiveness and applicability of the proposed method, especially its capability for rapid landslide mapping. Some directions for future research are offered. To our knowledge, this study is the first attempt to map landslides from free and medium-resolution satellite (i.e., Sentinel-2) images in China.
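Step 1 of such a pipeline can be sketched as follows: a simple mean ± k·std rule over a single band difference labels confident change/no-change pixels as training samples, leaving the ambiguous middle to the MRF stage. This is a minimal stand-in (NumPy; the paper combines several image difference techniques, which is not reproduced here, and the threshold factors are assumptions):

```python
import numpy as np

def training_masks(pre, post, k=1.5):
    """Label confident change / stable training pixels from a bitemporal
    image difference with a mean +/- k*std multi-threshold; pixels in
    neither mask remain unlabelled for the MRF classification stage."""
    diff = post.astype(float) - pre.astype(float)
    mu, sd = diff.mean(), diff.std()
    change = diff > mu + k * sd              # strong brightening, e.g. fresh scars
    stable = np.abs(diff - mu) < 0.5 * sd    # clearly unchanged background
    return change, stable
```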

  20. User Experience Design in Professional Map-Based Geo-Portals

    Directory of Open Access Journals (Sweden)

    Bastian Zimmer

    2013-10-01

    We have recently been witnessing the growing establishment of map-centered, web-based geo-portals at national, regional and local levels. However, a particular issue with these geo-portals is that each instance has been implemented differently in terms of design, usability, functionality, interaction possibilities, map size and symbology. In this paper, we try to tackle these shortcomings by analyzing and formalizing the requirements for map-based geo-portals in a user-experience-based approach. First, we propose a holistic definition of the term “geo-portal”. Then, we present our approach to user experience design for map-based geo-portals by defining the functional requirements of a geo-portal, by analyzing previous geo-portal developments, by distilling the results of our empirical user study into practically oriented user requirements, and finally by establishing a set of user experience design guidelines for the creation of map-based geo-portals. These design guidelines have been extracted for each of the main components of a geo-portal, i.e., the map, the search dialogue, the presentation of the search results, symbologies, and other aspects. These guidelines shall constitute the basis for future geo-portal developments to achieve standardization in the user-experience design of map-based geo-portals.

  1. Reliability of different sampling densities for estimating and mapping lichen diversity in biomonitoring studies

    International Nuclear Information System (INIS)

    Ferretti, M.; Brambilla, E.; Brunialti, G.; Fornasier, F.; Mazzali, C.; Giordani, P.; Nimis, P.L.

    2004-01-01

    Sampling requirements related to lichen biomonitoring include optimal sampling density for obtaining precise and unbiased estimates of population parameters and maps of known reliability. Two available datasets on a sub-national scale in Italy were used to determine a cost-effective sampling density to be adopted in medium-to-large-scale biomonitoring studies. As expected, the relative error in the mean Lichen Biodiversity (Italian acronym: BL) values and the error associated with the interpolation of BL values for (unmeasured) grid cells increased as the sampling density decreased. However, the increase in size of the error was not linear and even a considerable reduction (up to 50%) in the original sampling effort led to a far smaller increase in errors in the mean estimates (<6%) and in mapping (<18%) as compared with the original sampling densities. A reduction in the sampling effort can result in considerable savings of resources, which can then be used for a more detailed investigation of potentially problematic areas. It is, however, necessary to decide the acceptable level of precision at the design stage of the investigation, so as to select the proper sampling density. - An acceptable level of precision must be decided before determining a sampling design

  2. Object-based Landslide Mapping: Examples, Challenges and Opportunities

    Science.gov (United States)

    Hölbling, Daniel; Eisank, Clemens; Friedl, Barbara; Chang, Kang-Tsung; Tsai, Tsai-Tsung; Birkefeldt Møller Pedersen, Gro; Betts, Harley; Cigna, Francesca; Chiang, Shou-Hao; Aubrey Robson, Benjamin; Bianchini, Silvia; Füreder, Petra; Albrecht, Florian; Spiekermann, Raphael; Weinke, Elisabeth; Blaschke, Thomas; Phillips, Chris

    2016-04-01

    Over the last decade, object-based image analysis (OBIA) has been increasingly used for mapping landslides that occur after triggering events such as heavy rainfall. The increasing availability and quality of Earth Observation (EO) data in terms of temporal, spatial and spectral resolution allows for comprehensive mapping of landslides at multiple scales. Most often very high resolution (VHR) or high resolution (HR) optical satellite images are used in combination with a digital elevation model (DEM) and its products such as slope and curvature. Semi-automated object-based mapping makes use of various characteristics of image objects that are derived through segmentation. OBIA enables numerous spectral, spatial, contextual and textural image object properties to be applied during an analysis. This is especially useful when mapping complex natural features such as landslides and constitutes an advantage over pixel-based image analysis. However, several drawbacks in the process of object-based landslide mapping have not been overcome yet. The developed classification routines are often rather complex and limited regarding their transferability across areas and sensors. There is still more research needed to further improve present approaches and to fully exploit the capabilities of OBIA for landslide mapping. In this study several examples of object-based landslide mapping from various geographical regions with different characteristics are presented. Examples from the Austrian and Italian Alps are shown, whereby one challenge lies in the detection of small-scale landslides on steep slopes while preventing the classification of false positives with similar spectral properties (construction areas, utilized land, etc.). Further examples feature landslides mapped in Iceland, where the differentiation of landslides from other landscape-altering processes in a highly dynamic volcanic landscape poses a very distinct challenge, and in Norway, which is exposed to multiple

  3. A regularized, model-based approach to phase-based conductivity mapping using MRI.

    Science.gov (United States)

    Ropella, Kathleen M; Noll, Douglas C

    2017-11-01

    To develop a novel regularized, model-based approach to phase-based conductivity mapping that uses structural information to improve the accuracy of conductivity maps. The inverse of the three-dimensional Laplacian operator is used to model the relationship between measured phase maps and the object conductivity in a penalized weighted least-squares optimization problem. Spatial masks based on structural information are incorporated into the problem to preserve data near boundaries. The proposed Inverse Laplacian method was compared against a restricted Gaussian filter in simulation, phantom, and human experiments. The Inverse Laplacian method resulted in lower reconstruction bias and error due to noise in simulations than the Gaussian filter. The Inverse Laplacian method also produced conductivity maps closer to the measured values in a phantom and with reduced noise in the human brain, as compared to the Gaussian filter. The Inverse Laplacian method calculates conductivity maps with less noise and more accurate values near boundaries. Improving the accuracy of conductivity maps is integral for advancing the applications of conductivity mapping. Magn Reson Med 78:2011-2021, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
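For context, the conventional phase-only reconstruction that such methods regularize can be sketched in a few lines: conductivity is proportional to the Laplacian of the transceive phase, σ ≈ ∇²φ/(2μ₀ω). This is the noise-sensitive direct approach, not the paper's Inverse Laplacian method; the 3 T Larmor frequency and isotropic voxel size below are assumptions (NumPy):

```python
import numpy as np

MU0 = 4e-7 * np.pi          # vacuum permeability [H/m]
OMEGA = 2 * np.pi * 128e6   # Larmor angular frequency at 3 T (assumed)

def phase_based_sigma(phi, voxel=1e-3):
    """Conductivity [S/m] from a 3-D transceive phase map [rad] via the
    standard phase-only approximation sigma = lap(phi) / (2*mu0*omega),
    using finite-difference second derivatives along each axis."""
    lap = sum(np.gradient(g, voxel, axis=ax)
              for ax, g in enumerate(np.gradient(phi, voxel)))
    return lap / (2 * MU0 * OMEGA)
```

Applying the Laplacian directly amplifies noise, which is precisely what motivates inverting it inside a penalized least-squares problem instead.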

  4. Map-based mobile services design, interaction and usability

    CERN Document Server

    Meng, Liqiu; Winter, Stephan; Popovich, Vasily

    2008-01-01

    This book reports the newest research and technical achievements on the following theme blocks: Design of mobile map services and its constraints; Typology and usability of mobile map services; Visualization solutions on small displays for time-critical tasks; Mobile map users; Interaction and adaptation in mobile environments; and Applications of map-based mobile services.

  5. AMCO Scribe Sampling Data Map Service, Oakland CA, 2017, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — This map service contains a single layer: Groundwater Samples. The layer draws at all scales. Full FGDC metadata for the layer may be found by clicking the layer...

  6. Contour Detection for UAV-Based Cadastral Mapping

    Directory of Open Access Journals (Sweden)

    Sophie Crommelinck

    2017-02-01

    Unmanned aerial vehicles (UAVs) provide a flexible and low-cost solution for the acquisition of high-resolution data. The potential of high-resolution UAV imagery to create and update cadastral maps is being increasingly investigated. Existing procedures generally involve substantial fieldwork and many manual processes. Arguably, multiple parts of UAV-based cadastral mapping workflows could be automated. Specifically, as many cadastral boundaries coincide with visible boundaries, they could be extracted automatically using image analysis methods. This study investigates the transferability of gPb contour detection, a state-of-the-art computer vision method, to remotely sensed UAV images and UAV-based cadastral mapping. Results show that the approach is transferable to UAV data and automated cadastral mapping: object contours are comprehensively detected at completeness and correctness rates of up to 80%. The detection quality is optimal when the entire scene is covered with one orthoimage, due to the global optimization of gPb contour detection. However, a balance between high completeness and correctness is hard to achieve, so a combination with area-based segmentation and further object knowledge is proposed. The localization quality exhibits the usual dependency on ground resolution. The approach has the potential to accelerate the process of general boundary delineation during the creation and updating of cadastral maps.
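The completeness/correctness figures quoted above are standard overlap metrics for extracted versus reference boundaries; a minimal sketch (NumPy; pixel-exact matching for simplicity, whereas evaluations of this kind typically allow a small buffer tolerance):

```python
import numpy as np

def boundary_quality(extracted, reference):
    """Completeness (share of reference found) and correctness (share of
    extraction that is right) for binary boundary masks of equal shape."""
    extracted = np.asarray(extracted, bool)
    reference = np.asarray(reference, bool)
    tp = np.logical_and(extracted, reference).sum()  # true positives
    return tp / reference.sum(), tp / extracted.sum()
```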

  7. Self-Organizing Maps Neural Networks Applied to the Classification of Ethanol Samples According to the Region of Commercialization

    Directory of Open Access Journals (Sweden)

    Aline Regina Walkoff

    2017-10-01

    Physical-chemical analysis data were collected from 998 samples of automotive ethanol commercialized in the northern, midwestern and eastern regions of the state of Paraná. The data were presented to self-organizing map (SOM) neural networks, which classified them according to those regions. The best self-organizing map configuration had a 45 × 45 topology and 5000 training epochs, with a final learning rate of 6.7×10⁻⁴, a final neighborhood relationship of 3×10⁻² and a mean quantization error of 2×10⁻². This neural network provided a topological map depicting three separate groups, each one corresponding to samples from the same region of commercialization. Four maps of weights, one for each parameter, were presented. The network established that pH was the most important variable for classification and electrical conductivity the least important one. The self-organizing map application allowed the segmentation of the ethanol samples, therefore identifying them according to the region of commercialization. DOI: http://dx.doi.org/10.17807/orbital.v9i4.982
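A toy SOM in plain NumPy illustrates the training loop behind such maps; the grid size, rates and two-cluster data below are invented for illustration (the study used a 45 × 45 map on normalized physico-chemical variables):

```python
import numpy as np

def train_som(data, grid=(5, 5), epochs=500, lr0=0.5, sigma0=2.0, seed=0):
    """Train a minimal self-organizing map on (n_samples, n_features)
    data; returns the (rows, cols, n_features) weight grid."""
    rng = np.random.default_rng(seed)
    w = rng.random((grid[0], grid[1], data.shape[1]))
    ii, jj = np.meshgrid(np.arange(grid[0]), np.arange(grid[1]), indexing="ij")
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                  # decaying learning rate
        sigma = max(sigma0 * (1 - t / epochs), 0.5)  # shrinking neighborhood
        x = data[rng.integers(len(data))]            # random training sample
        d = np.linalg.norm(w - x, axis=2)
        bi, bj = np.unravel_index(d.argmin(), d.shape)  # best-matching unit
        h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma**2))
        w += lr * h[..., None] * (x - w)             # pull BMU neighborhood
    return w

def bmu(w, x):
    """Grid coordinates of the best-matching unit for sample x."""
    d = np.linalg.norm(w - x, axis=2)
    return np.unravel_index(d.argmin(), d.shape)
```

After training, samples from distinct groups land on separated map units, which is exactly the segmentation effect described in the abstract.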

  8. Agroforestry suitability analysis based upon nutrient availability mapping: a GIS based suitability mapping

    Directory of Open Access Journals (Sweden)

    Firoz Ahmad

    2017-05-01

    Agroforestry has drawn the attention of researchers due to its capacity to reduce poverty and land degradation, improve food security and mitigate climate change. However, progress in promoting agroforestry is held back by the lack of reliable data sets and of appropriate tools to accurately map and support decision making for agroforestry modules. Agroforestry suitability, being one special form of land suitability, is very pertinent to study at a time when there is tremendous pressure on land as a limited commodity. The study aims to apply geo-spatial tools to visualize various soil and environmental data, reveal trends and interrelationships, and derive a nutrient availability and agroforestry suitability map. Using a weight matrix and ranks, individual maps were developed on the ArcGIS 10.1 platform to generate a nutrient availability map, which was later used to develop the agroforestry suitability map. Watersheds were delineated using a DEM in part of the study area, evaluated for prioritization, and assessed for agroforestry suitability as per the schematic flowchart. Agroforestry suitability regions were delineated based upon the weights and ranks by integrated mapping. The total open area was identified as 42.4%, of which 21.6% was found to have high suitability for agroforestry. Within the watersheds, 22 village points were generated for creating buffers, which were further evaluated, showing their proximity to highly suitable agroforestry sites and thus offering villagers tremendous opportunity to carry out agroforestry projects locally. This research shows the capability of remote sensing in studying agroforestry practices and in estimating the prominent factors for its optimal productivity. The ongoing agroforestry projects can potentially be extended into the areas of high suitability. The use of ancillary data in GIS…
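The rank-and-weight overlay described above can be sketched in a few lines (NumPy assumed; the layers, ranks and weights below are hypothetical stand-ins for the soil and environmental criteria):

```python
import numpy as np

def weighted_overlay(layers, weights):
    """Combine rank rasters (higher rank = more suitable) into a single
    suitability score using analyst-assigned weights that sum to 1."""
    weights = np.asarray(weights, dtype=float)
    assert np.isclose(weights.sum(), 1.0)
    return sum(wt * np.asarray(lay, float) for wt, lay in zip(weights, layers))

# toy 2x2 rank rasters (hypothetical criteria, rank 1-3)
nutrient = [[3, 1], [2, 3]]
slope    = [[2, 2], [1, 3]]
rain     = [[3, 3], [2, 1]]
score = weighted_overlay([nutrient, slope, rain], [0.5, 0.2, 0.3])
```

Thresholding `score` then yields the high/medium/low suitability zones of the final map.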

  9. The Metacognitive Anger Processing (MAP) Scale - Validation in a Mixed Clinical and a Forensic In-Patient Sample

    DEFF Research Database (Denmark)

    Moeller, Stine Bjerrum; Bech, Per

    2018-01-01

    BACKGROUND: The metacognitive approach by Wells and colleagues has gained empirical support with a broad range of symptoms. The Metacognitive Anger Processing (MAP) scale was developed to provide a metacognitive measure of anger (Moeller, 2016). In the preliminary validation, three components were identified (positive beliefs, negative beliefs and rumination) to be positively correlated with anger. AIMS: To validate the MAP in a sample of mixed clinical patients (n = 88) and a sample of male forensic patients (n = 54). METHOD: The MAP was administered together with measures of metacognition, anger, rumination, anxiety and depressive symptoms. RESULTS: The MAP showed acceptable scalability and excellent reliability. Convergent validity was evidenced using the general metacognitive measure (MCQ-30), and concurrent validity was supported using two different anger measures (STAXI-2 and NAS). CONCLUSIONS…

  10. Spectral features based tea garden extraction from digital orthophoto maps

    Science.gov (United States)

    Jamil, Akhtar; Bayram, Bulent; Kucuk, Turgay; Zafer Seker, Dursun

    2018-05-01

    Advancements in photogrammetry and remote sensing technologies have made it possible to extract useful tangible information from data, which plays a pivotal role in various applications such as the management and monitoring of forests and agricultural lands. This study aimed to evaluate the effectiveness of spectral signatures for the extraction of tea gardens from 1:5000 scale digital orthophoto maps obtained from the city of Rize in Turkey. First, the normalized difference vegetation index (NDVI) was derived from the input images to suppress non-vegetation areas. NDVI values less than zero were discarded and the output image was normalized in the range 0-255. Individual pixels were then mapped into meaningful objects using a global region growing technique. The resulting image was filtered and smoothed to reduce the impact of noise. Furthermore, geometrical constraints were applied to remove small objects (less than 500 pixels), followed by a morphological opening operator to enhance the results. These objects served as building blocks for further image analysis. Finally, in the classification stage, a range of spectral values was empirically calculated for each band and applied to candidate objects to extract tea gardens. For accuracy assessment, we employed an area-based similarity metric, overlapping the obtained tea garden boundaries with tea garden boundaries manually digitized by photogrammetry experts. The overall accuracy of the proposed method was 89% for tea gardens over 10 sample orthophoto maps. We concluded that exploiting spectral signatures using object-based analysis is an effective technique for the extraction of dominant tree species from digital orthophoto maps.
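The NDVI suppression and 0-255 normalization steps described above, as a sketch (NumPy; the band arrays are assumed already co-registered, and the function name is ours):

```python
import numpy as np

def ndvi_scaled(nir, red):
    """NDVI from NIR and red bands; non-vegetation (NDVI <= 0) is
    discarded and the remainder normalized to 0-255 as uint8."""
    nir, red = nir.astype(float), red.astype(float)
    denom = np.where(nir + red == 0, 1.0, nir + red)  # guard zero division
    ndvi = (nir - red) / denom
    veg = np.clip(ndvi, 0.0, None)                    # drop NDVI <= 0
    top = veg.max()
    scaled = ((255 * veg / top).astype(np.uint8) if top > 0
              else np.zeros(veg.shape, np.uint8))
    return ndvi, scaled
```

The `scaled` image is then the input to the region growing and morphological steps.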

  11. Approach of simultaneous localization and mapping based on local maps for robot

    Institute of Scientific and Technical Information of China (English)

    CHEN Bai-fan; CAI Zi-xing; HU De-wen

    2006-01-01

    An extended Kalman filter approach to simultaneous localization and mapping (SLAM) based on local maps was proposed. A local frame of reference is established periodically at the position of the robot, and the observations of the robot and landmarks are then fused into the global frame of reference. Because of the independence of the local maps, the approach does not accumulate the estimation and computation errors produced when SLAM is run with a Kalman filter directly. At the same time, it reduces the computational complexity. The method was shown to be correct and feasible in simulation experiments.
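The core fusion step, expressing a pose estimated in a periodically re-anchored local frame back in the global frame, is a rigid-body transform composition; a sketch (NumPy; covariance propagation, which an EKF would also carry out, is omitted):

```python
import numpy as np

def to_global(local_pose, frame):
    """Transform a (x, y, theta) pose expressed in a local frame into the
    global frame, where `frame` is the local frame's global (x, y, theta)."""
    x, y, th = local_pose
    fx, fy, fth = frame
    c, s = np.cos(fth), np.sin(fth)
    # rotate into the frame's orientation, then translate to its origin
    return (fx + c * x - s * y, fy + s * x + c * y, fth + th)
```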

  12. Particle filter based MAP state estimation: A comparison

    NARCIS (Netherlands)

    Saha, S.; Boers, Y.; Driessen, J.N.; Mandal, Pranab K.; Bagchi, Arunabha

    2009-01-01

    MAP estimation is a good alternative to MMSE for certain applications involving nonlinear, non-Gaussian systems. Recently a new particle filter based MAP estimator has been derived. This new method extracts the MAP estimate directly from the output of a running particle filter. In the recent past, a Viterbi
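The MMSE/MAP distinction the comparison rests on is easy to see with a weighted particle set. The KDE-based MAP surrogate below is only an illustration of why the two estimates differ on multimodal posteriors; it is not the paper's estimator, which evaluates the filtering density directly (NumPy; bandwidth is an assumption):

```python
import numpy as np

def mmse_estimate(particles, weights):
    """Posterior-mean (MMSE) estimate from a weighted particle set."""
    particles = np.asarray(particles, float)
    w = np.asarray(weights, float) / np.sum(weights)
    return np.sum(w[:, None] * particles, axis=0)

def crude_map_estimate(particles, weights, bandwidth=0.5):
    """Naive MAP surrogate: build a weighted Gaussian KDE from the
    particles, evaluate it at each particle, return the argmax."""
    particles = np.asarray(particles, float)
    w = np.asarray(weights, float) / np.sum(weights)
    d2 = ((particles[:, None, :] - particles[None, :, :]) ** 2).sum(-1)
    dens = (w * np.exp(-d2 / (2 * bandwidth**2))).sum(axis=1)
    return particles[dens.argmax()]
```

On a bimodal particle cloud the MMSE estimate falls between the modes, a state that may have negligible posterior probability, while the MAP surrogate picks the dominant mode.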

  13. USGS Imagery Topo Base Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — USGS Imagery Topo is a topographic tile cache base map with orthoimagery as a backdrop, and combines the most current data (Boundaries, Names, Transportation,...

  14. Land degradation mapping based on hyperion data in desertification region of northwest China

    Science.gov (United States)

    Cheng, Penggen; Wu, Jian; Ouyang, Ping; He, Ting

    2008-10-01

    Desertification is an alarming sign of land degradation in Hengshan county of northwest China. Due to the considerable cost of detailed ground surveys of this phenomenon, remote sensing is an appropriate alternative for analyzing and evaluating the risks of the expansion of land degradation. Degradation features can be detected directly or indirectly using image data. In this paper, based on Hyperion images of the Hengshan desertification region of northwest China, a new algorithm for land degradation mapping, called the Land Degradation Index (LDI), was put forward. The new algorithm builds on the classification process. We applied the linear spectral unmixing algorithm with training samples derived from the earlier classification in order to find new endmembers in the RMS error image. After that, using neural network mapping with the new training samples, the classified result was obtained. In addition, after applying mask processing, the soils were grouped into 3 types (Kappa = 0.90): highly degraded soils, moderately degraded soils and slightly degraded soils. Comparing 3 mapping methods (mixture-classification, the spectral angle mapper and mixture-tuned matched filtering), the results suggest that mixture-classification has higher accuracy (Kappa = 0.7075) than the spectral angle mapper (Kappa = 0.5418) and the mixture-tuned matched filter (Kappa = 0.6039). As a result, mixture-classification was selected to carry out the Land Degradation Index analysis.
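    The linear spectral unmixing step can be sketched for a single pixel: abundances are solved by least squares against an endmember matrix, and the residual feeds the RMS error image mentioned above. This is a simplification that omits the sum-to-one and non-negativity constraints commonly added in practice:

```python
import numpy as np

def unmix_pixel(spectrum, endmembers):
    """Unconstrained linear spectral unmixing of one pixel.

    `endmembers` is a (bands x k) matrix of endmember spectra. The
    per-pixel RMS residual is what would populate an RMS error image."""
    abundances, *_ = np.linalg.lstsq(endmembers, spectrum, rcond=None)
    residual = spectrum - endmembers @ abundances
    rms = np.sqrt(np.mean(residual ** 2))
    return abundances, rms
```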

  15. Advancing the quantification of humid tropical forest cover loss with multi-resolution optical remote sensing data: Sampling & wall-to-wall mapping

    Science.gov (United States)

    Broich, Mark

    Humid tropical forest cover loss is threatening the sustainability of ecosystem goods and services as vast forest areas are rapidly cleared for industrial-scale agriculture and tree plantations. Despite the importance of humid tropical forest in the provision of ecosystem services and economic development opportunities, the spatial and temporal distribution of forest cover loss across large areas is not well quantified. Here I improve the quantification of humid tropical forest cover loss using two remote sensing-based methods: sampling and wall-to-wall mapping. In all of the presented studies, the integration of coarse spatial, high temporal resolution data with moderate spatial, low temporal resolution data enables advances in quantifying forest cover loss in the humid tropics. Imagery from the Moderate Resolution Imaging Spectroradiometer (MODIS) is used as the source of coarse spatial, high temporal resolution data, and imagery from the Landsat Enhanced Thematic Mapper Plus (ETM+) sensor is used as the source of moderate spatial, low temporal resolution data. In a first study, I compare the precision of different sampling designs for the Brazilian Amazon using the annual deforestation maps derived by the Brazilian Space Agency for reference. I show that sampling designs can provide reliable deforestation estimates; furthermore, sampling designs guided by MODIS data can provide more efficient estimates than the systematic design used for the United Nations Food and Agriculture Organization Forest Resource Assessment 2010. Sampling approaches, such as the one demonstrated, are viable in regions where data limitations, such as cloud contamination, limit exhaustive mapping methods. Cloud-contaminated regions experiencing high rates of change include Insular Southeast Asia, specifically Indonesia and Malaysia. Due to persistent cloud cover, forest cover loss in Indonesia has only been mapped at a 5-10 year interval using photo interpretation of single

  16. Mapping Typical Urban LULC from Landsat Imagery without Training Samples or Self-Defined Parameters

    Directory of Open Access Journals (Sweden)

    Hui Li

    2017-07-01

    Full Text Available Land use/land cover (LULC) change is one of the most important indicators in understanding the interactions between humans and the environment. Traditionally, when LULC maps are produced yearly, most existing remote-sensing methods have to collect ground reference data annually, as the classifiers have to be trained individually for each corresponding year. This study presents a novel strategy to map LULC classes without training samples or assigned parameters. First, several novel indices were carefully selected from the index pool, each able to highlight a certain LULC class very well. Next, a common unsupervised classifier was employed to extract the LULC from the associated index image without assigning thresholds. Finally, a supervised classification was implemented with samples automatically collected from the unsupervised classification outputs. Results illustrate that the proposed method achieves satisfactory performance, reaching accuracies similar to traditional approaches. The findings of this study demonstrate that the proposed strategy is a simple and effective alternative for mapping urban LULC. With the proposed strategy, the budget and time required for remote-sensing data processing could be reduced dramatically.
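    The unsupervised step ("without assigning thresholds") can be illustrated with a tiny 1-D k-means over an index image's values: the class boundaries emerge from the data rather than from a hand-picked threshold. This is an illustrative stand-in, not the classifier used in the paper:

```python
import numpy as np

def kmeans_1d(values, k=2, iters=20):
    """Tiny 1-D k-means: splits index values into k classes without any
    hand-picked threshold. Returns per-value labels (ordered by class
    center, lowest first) and the sorted centers."""
    values = np.asarray(values, dtype=float)
    # Deterministic initialization: spread centers over the value range
    centers = np.quantile(values, np.linspace(0, 1, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    order = np.argsort(centers)
    remap = np.empty_like(order)
    remap[order] = np.arange(k)
    return remap[labels], centers[order]
```

Confident members of each resulting cluster could then serve as the automatically collected samples for the supervised stage.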

  17. Vision-based topological map building and localisation using persistent features

    CSIR Research Space (South Africa)

    Sabatta, DG

    2008-11-01

    Full Text Available Vision-based Topological Map... of topological mapping was introduced into the field of robotics following studies of human cognitive mapping undertaken by Kuipers [8]. Since then, much progress has been made in the field of vision-based topological mapping. Topological mapping lends...

  18. Hyperspectral Data for Mangrove Species Mapping: A Comparison of Pixel-Based and Object-Based Approach

    Directory of Open Access Journals (Sweden)

    Muhammad Kamal

    2011-10-01

    Full Text Available Visual image interpretation and digital image classification have been used to map and monitor mangrove extent and composition for decades. The presence of a high-spatial-resolution hyperspectral sensor can potentially improve our ability to differentiate mangrove species. However, little research has explored the use of pixel-based and object-based approaches on high-spatial-resolution hyperspectral datasets for this purpose. This study assessed the ability of CASI-2 data for mangrove species mapping using pixel-based and object-based approaches at the mouth of the Brisbane River area, southeast Queensland, Australia. Three mapping techniques were used in this study: spectral angle mapper (SAM) and linear spectral unmixing (LSU) for the pixel-based approaches, and multi-scale segmentation for the object-based image analysis (OBIA). The endmembers for the pixel-based approach were collected based on an existing vegetation community map. Nine targeted classes were mapped in the study area by each approach, including three mangrove species: Avicennia marina, Rhizophora stylosa, and Ceriops australis. The mapping results showed that SAM produced accurate class polygons with only a few unclassified pixels (overall accuracy 69%, Kappa 0.57), the LSU resulted in a patchy polygon pattern with many unclassified pixels (overall accuracy 56%, Kappa 0.41), and the object-based mapping produced the most accurate results (overall accuracy 76%, Kappa 0.67). Our results demonstrate that the object-based approach, which combined a rule-based and a nearest-neighbor classification method, was the best classifier for mapping mangrove species and their adjacent environments.
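    The spectral angle mapper (SAM) used in the pixel-based approach reduces to a simple formula: the angle between a pixel's spectrum and a reference endmember, with smaller angles meaning a closer spectral match (it is insensitive to overall brightness because only the direction of the spectral vector matters):

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper score: angle (radians) between a pixel
    spectrum and a reference endmember; smaller means a closer match."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))  # clip guards against rounding
```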

  19. USGS Hill Shade Base Map Service from The National Map

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — USGS Hill Shade (or Shaded Relief) is a tile cache base map created from the National Elevation Dataset (NED), a seamless dataset of best available raster elevation...

  20. Projector primary-based optimization for superimposed projection mappings

    Science.gov (United States)

    Ahmed, Bilal; Lee, Jong Hun; Lee, Yong Yi; Lee, Kwan H.

    2018-01-01

    Recently, many researchers have focused on fully overlapping projections for three-dimensional (3-D) projection mapping systems, but reproducing a high-quality appearance with this technology remains a challenge. On top of existing color compensation-based methods, much effort is still required to faithfully reproduce an appearance that is free from artifacts, colorimetric inconsistencies, and inappropriate illuminance over the 3-D projection surface. In our observation, this is because overlapping projections are treated as an additive, linear mixture of color, which our measurements show is not the case. We propose a method that enables us to use high-quality appearance data measured from original objects and to regenerate the same appearance by projecting optimized images using multiple projectors, ensuring that the projection-rendered results look visually close to the real object. We prepare our target appearances by photographing original objects. Then, using calibrated projector-camera pairs, we compensate for missing geometric correspondences to make our method robust against noise. The heart of our method is a target appearance-driven adaptive sampling of the projection surface, followed by a representation of overlapping projections in terms of the projector-primary response. This yields projector-primary weights that facilitate blending, and constraints are applied to the system. These samples are used to populate a light transport-based system, which is then solved by minimizing the error to obtain the projection images in a noise-free manner, utilizing inter-sample overlaps. We make the best use of available hardware resources to recreate projection-mapped appearances that look as close to the original object as possible. Our experimental results are compelling in terms of visual similarity and colorimetric error.
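    The "populate a light-transport system and solve it minimizing the error" step can be sketched as a non-negative least-squares solve: given a matrix mapping projector-primary weights to observed intensities at the sampled points, physically valid (non-negative) weights are recovered. The matrix here is a tiny stand-in for a real calibrated system:

```python
import numpy as np
from scipy.optimize import nnls

def solve_projector_inputs(light_transport, target):
    """Solve a small light-transport system of the kind described above.

    `light_transport` (samples x primaries) maps projector-primary weights
    to observed sample intensities; `target` is the desired appearance at
    the samples. Non-negative least squares keeps the weights physically
    valid. Illustrative sketch, not the paper's full constrained solver."""
    weights, residual = nnls(np.asarray(light_transport, float),
                             np.asarray(target, float))
    return weights, residual
```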

  1. Global trends in satellite-based emergency mapping

    Science.gov (United States)

    Voigt, Stefan; Giulio-Tonolo, Fabio; Lyons, Josh; Kučera, Jan; Jones, Brenda; Schneiderhan, Tobias; Platzeck, Gabriel; Kaku, Kazuya; Hazarika, Manzul Kumar; Czaran, Lorant; Li, Suju; Pedersen, Wendi; James, Godstime Kadiri; Proy, Catherine; Muthike, Denis Macharia; Bequignon, Jerome; Guha-Sapir, Debarati

    2016-01-01

    Over the past 15 years, scientists and disaster responders have increasingly used satellite-based Earth observations for global rapid assessment of disaster situations. We review global trends in satellite rapid response and emergency mapping from 2000 to 2014, analyzing more than 1000 incidents in which satellite monitoring was used for assessing major disaster situations. We provide a synthesis of spatial patterns and temporal trends in global satellite emergency mapping efforts and show that satellite-based emergency mapping is most intensively deployed in Asia and Europe and follows well the geographic, physical, and temporal distributions of global natural disasters. We present an outlook on the future use of Earth observation technology for disaster response and mitigation by putting past and current developments into context and perspective.

  2. LANDSLIDE INVENTORY MAPPING FROM BITEMPORAL 10 m SENTINEL-2 IMAGES USING CHANGE DETECTION BASED MARKOV RANDOM FIELD

    Directory of Open Access Journals (Sweden)

    Y. Qin

    2018-04-01

    Full Text Available Landslide inventory mapping is essential for hazard assessment and mitigation. In most previous studies, landslide mapping was achieved by visual interpretation of aerial photos and remote sensing images. However, such methods are labor-intensive and time-consuming, especially over large areas. Although a number of semi-automatic landslide mapping methods have been proposed over the past few years, limitations remain in terms of their applicability across different study areas and data, and there is large room for improvement in accuracy and degree of automation. For these reasons, we developed a change detection-based Markov Random Field (CDMRF) method for landslide inventory mapping. The proposed method mainly includes two steps: (1) change detection-based multi-thresholding for training sample generation and (2) MRF for landslide inventory mapping. Compared with previous methods, the proposed method has three advantages: (1) it combines multiple image difference techniques with a multi-threshold method to generate reliable training samples; (2) it takes the spectral characteristics of landslides into account; and (3) it is highly automatic with little parameter tuning. The proposed method was applied to regional landslide mapping from 10 m Sentinel-2 images in Western China. Results corroborate the effectiveness and applicability of the proposed method, especially its capability for rapid landslide mapping. Some directions for future research are offered. To our knowledge, this study is the first attempt to map landslides from free, medium-resolution satellite (i.e., Sentinel-2) images in China.

  3. BEE FORAGE MAPPING BASED ON MULTISPECTRAL IMAGES LANDSAT

    Directory of Open Access Journals (Sweden)

    A. Moskalenko

    2016-10-01

    Full Text Available Possibilities of bee forage identification and mapping based on multispectral images have been shown in the research. Spectral brightness of bee forage has been determined with the use of satellite images. The effectiveness of some methods of image classification for mapping of bee forage is shown. Keywords: bee forage, mapping, multispectral images, image classification.

  4. Modeling, Designing, and Implementing an Avatar-based Interactive Map

    Directory of Open Access Journals (Sweden)

    Stefan Andrei

    2016-03-01

    Full Text Available Designing interactive maps has always been a challenge due to the geographical complexity of the earth's landscape and the difficulty of resolving details to a high resolution. In the past decade or so, one of the most impressive map-based software applications, the Global Positioning System (GPS), has probably the highest level of interaction with the user. This article describes an innovative technique for designing an avatar-based virtual interactive map for the Lamar University campus, covering the buildings' exteriors as well as their interiors. Many universities provide 2D or 3D maps and even interactive maps. However, these maps do not provide complete interaction with the user. To the best of our knowledge, this project is the first avatar-based interactive game that allows 100% interaction with the user. This work provides tremendous help to freshman students and visitors of Lamar University. As an important marketing tool, the main objective is to achieve better visibility of the campus worldwide and to increase the number of students attending Lamar University.

  5. Investigating the tradeoffs between spatial resolution and diffusion sampling for brain mapping with diffusion tractography: time well spent?

    Science.gov (United States)

    Calabrese, Evan; Badea, Alexandra; Coe, Christopher L; Lubach, Gabriele R; Styner, Martin A; Johnson, G Allan

    2014-11-01

    Interest in mapping white matter pathways in the brain has peaked with the recognition that altered brain connectivity may contribute to a variety of neurologic and psychiatric diseases. Diffusion tractography has emerged as a popular method for postmortem brain mapping initiatives, including the ex-vivo component of the human connectome project, yet it remains unclear to what extent computer-generated tracks fully reflect the actual underlying anatomy. Of particular concern is the fact that diffusion tractography results vary widely depending on the choice of acquisition protocol. The two major acquisition variables that consume scan time, spatial resolution, and diffusion sampling, can each have profound effects on the resulting tractography. In this analysis, we determined the effects of the temporal tradeoff between spatial resolution and diffusion sampling on tractography in the ex-vivo rhesus macaque brain, a close primate model for the human brain. We used the wealth of autoradiography-based connectivity data available for the rhesus macaque brain to assess the anatomic accuracy of six time-matched diffusion acquisition protocols with varying balance between spatial and diffusion sampling. We show that tractography results vary greatly, even when the subject and the total acquisition time are held constant. Further, we found that focusing on either spatial resolution or diffusion sampling at the expense of the other is counterproductive. A balanced consideration of both sampling domains produces the most anatomically accurate and consistent results. Copyright © 2014 Wiley Periodicals, Inc.

  6. Mapping of marine benthic invertebrates in the Oslofjord and the Skagerrak: sampling data of museum collections from 1950-1955 and from recent investigations

    Directory of Open Access Journals (Sweden)

    Eivind Oug

    2015-12-01

    Full Text Available Data from large sampling programmes for the mapping of marine invertebrates in the Oslofjord, Norway, and the Skagerrak, spanning more than six decades, are compiled and digitized to provide easy access in modern data repositories. Two sampling programmes undertaken in the period 1950–55 are still the most extensive mapping of marine benthic fauna in the area. Information from a total of more than 900 localities, or sampling events, covering all benthic habitats in the Oslofjord and coastal waters to Kvitsøy in Rogaland county, have been carefully digitized from field notes, original sea charts, and primary observations from sample handling in the field. Geographical coordinates referred to WGS84 chart datum have been fixed with a general accuracy of 20 m in the Oslofjord and 100–250 m in coastal areas, based on precise map sketches with cross-bearings to land objects and chart annotations. Most samples were collected using triangular, Agassiz and lightweight dredges. The collected material has been deposited in the collections of the Natural History Museum, University of Oslo. Two recent projects, ‘Polyskag’ and ‘Bioskag’ (2006–2014, are briefly described. The projects focused on the diversity of marine bristle worms (Polychaeta, inter alia providing material for molecular genetic analyses. Type localities for early described species and generally understudied biotopes were visited. The data from the 1950s, together with recent studies, constitute a considerable resource for studies of biodiversity, facilitated through the sharing of species records from the museum collections in modern data repositories. The accurate positioning of sampling localities in the 1950s is of particular value for documenting species distributions over long time spans, thus providing a reference base for studying present and future species changes and assessing the effects of human influence and environmental changes in the Oslofjord and the Skagerrak.

  7. Finger Vein Recognition Based on Personalized Weight Maps

    Science.gov (United States)

    Yang, Gongping; Xiao, Rongyang; Yin, Yilong; Yang, Lu

    2013-01-01

    Finger vein recognition is a promising biometric recognition technology, which verifies identities via the vein patterns in the fingers. Binary pattern based methods have been thoroughly studied in order to cope with the difficulties of extracting the blood vessel network. However, current binary pattern based finger vein matching methods treat every bit of the feature codes derived from different images of various individuals as equally important and assign the same weight value to them. In this paper, we propose a finger vein recognition method based on personalized weight maps (PWMs). Different bits have different weight values according to their stabilities across a certain number of training samples from an individual. First, we present the concept of the PWM, and then propose the finger vein recognition framework, which mainly consists of preprocessing, feature extraction, and matching. Finally, we design extensive experiments to evaluate the effectiveness of our proposal. Experimental results show that PWM achieves not only better performance, but also high robustness and reliability. In addition, PWM can be used as a general framework for binary pattern based recognition. PMID:24025556

  8. Finger Vein Recognition Based on Personalized Weight Maps

    Directory of Open Access Journals (Sweden)

    Lu Yang

    2013-09-01

    Full Text Available Finger vein recognition is a promising biometric recognition technology, which verifies identities via the vein patterns in the fingers. Binary pattern based methods have been thoroughly studied in order to cope with the difficulties of extracting the blood vessel network. However, current binary pattern based finger vein matching methods treat every bit of the feature codes derived from different images of various individuals as equally important and assign the same weight value to them. In this paper, we propose a finger vein recognition method based on personalized weight maps (PWMs). Different bits have different weight values according to their stabilities across a certain number of training samples from an individual. First, we present the concept of the PWM, and then propose the finger vein recognition framework, which mainly consists of preprocessing, feature extraction, and matching. Finally, we design extensive experiments to evaluate the effectiveness of our proposal. Experimental results show that PWM achieves not only better performance, but also high robustness and reliability. In addition, PWM can be used as a general framework for binary pattern based recognition.
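    The PWM idea, weighting each bit of a binary code by its stability across an individual's training samples, can be sketched as follows. The exact weighting rule in the paper may differ, so treat this as an illustration of the concept rather than the published method:

```python
import numpy as np

def personalized_weight_map(codes):
    """Per-bit weights from one individual's training codes.

    `codes` is an (n_samples x n_bits) 0/1 array of binary feature codes.
    A bit that takes the same value in most samples is stable and gets a
    weight near 1; an unstable bit gets a weight near 0."""
    codes = np.asarray(codes)
    p = codes.mean(axis=0)                 # fraction of 1s per bit
    stability = np.maximum(p, 1.0 - p)     # majority agreement in [0.5, 1]
    return 2.0 * (stability - 0.5)         # rescale to [0, 1]

def weighted_hamming(a, b, w):
    """Matching score: weighted Hamming distance between two codes."""
    a, b = np.asarray(a), np.asarray(b)
    return float(np.sum(w * (a != b)))
```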

  9. Sampling and Mapping Soil Erosion Cover Factor for Fort Richardson, Alaska. Integrating Stratification and an Up-Scaling Method

    National Research Council Canada - National Science Library

    Wang, Guangxing; Gertner, George; Anderson, Alan B; Howard, Heidi

    2006-01-01

    When a ground and vegetation cover factor related to soil erosion is mapped with the aid of remotely sensed data, a cost-efficient sample design to collect ground data and obtain an accurate map is required...

  10. Application of In-Segment Multiple Sampling in Object-Based Classification

    Directory of Open Access Journals (Sweden)

    Nataša Đurić

    2014-12-01

    Full Text Available When object-based analysis is applied to very high-resolution imagery, pixels within the segments reveal large spectral inhomogeneity; their distribution can be considered complex rather than normal. When normality is violated, classification methods that rely on the assumption of normally distributed data are not as successful or accurate. It is hard to detect normality violations in small samples, and the segmentation process produces segments that vary greatly in size: samples can be very big or very small. This paper investigates whether the complexity within a segment can be addressed using multiple random sampling of segment pixels and multiple calculations of similarity measures. In order to analyze the effect sampling has on classification results, the test statistics and probability values of the non-parametric two-sample Kolmogorov-Smirnov test and the parametric Student's t-test are selected as similarity measures in the classification process. The performance of both classifiers was assessed on a WorldView-2 image for four land cover classes (roads, buildings, grass and trees) and compared to two commonly used object-based classifiers, k-Nearest Neighbor (k-NN) and Support Vector Machine (SVM). Both proposed classifiers showed a slight improvement in overall classification accuracy and produced more accurate classification maps when compared to the ground truth image.
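    The in-segment multiple-sampling idea can be sketched with SciPy's two-sample Kolmogorov-Smirnov test: repeatedly draw random pixel subsets from a segment, compare each draw against a class reference sample, and average the p-values; the candidate class with the highest average similarity would win. The draw count and sample size here are assumptions, not the paper's settings:

```python
import numpy as np
from scipy.stats import ks_2samp

def segment_class_similarity(segment_pixels, class_reference,
                             n_draws=10, sample_size=50, seed=0):
    """Average KS-test p-value over multiple random draws from a segment.

    Sketch of in-segment multiple sampling: each draw is compared to a
    class reference sample; a high average p-value means the segment is
    spectrally similar to that class."""
    rng = np.random.default_rng(seed)
    pvals = []
    for _ in range(n_draws):
        draw = rng.choice(segment_pixels,
                          size=min(sample_size, len(segment_pixels)),
                          replace=False)
        pvals.append(ks_2samp(draw, class_reference).pvalue)
    return float(np.mean(pvals))
```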

  11. The Mapping X-ray Fluorescence Spectrometer (MapX)

    Science.gov (United States)

    Sarrazin, P.; Blake, D. F.; Marchis, F.; Bristow, T.; Thompson, K.

    2017-12-01

    Many planetary surface processes leave traces of their actions as features in the size range 10s to 100s of microns. The Mapping X-ray Fluorescence Spectrometer (MapX) will provide elemental imaging at 100 micron spatial resolution, yielding elemental chemistry at a scale where many relict physical, chemical, or biological features can be imaged and interpreted in ancient rocks on planetary bodies and planetesimals. MapX is an arm-based instrument positioned on a rock or regolith with touch sensors. During an analysis, an X-ray source (tube or radioisotope) bombards the sample with X-rays or alpha-particles / gamma-rays, resulting in sample X-ray Fluorescence (XRF). X-rays emitted in the direction of an X-ray sensitive CCD imager pass through a 1:1 focusing lens (X-ray micro-pore Optic (MPO)) that projects a spatially resolved image of the X-rays onto the CCD. The CCD is operated in single photon counting mode so that the energies and positions of individual X-ray photons are recorded. In a single analysis, several thousand frames are both stored and processed in real-time. Higher level data products include single-element maps with a lateral spatial resolution of 100 microns and quantitative XRF spectra from ground- or instrument- selected Regions of Interest (ROI). XRF spectra from ROI are compared with known rock and mineral compositions to extrapolate the data to rock types and putative mineralogies. When applied to airless bodies and implemented with an appropriate radioisotope source for alpha-particle excitation, MapX will be able to analyze biogenic elements C, N, O, P, S, in addition to the cations of the rock-forming elements >Na, accessible with either X-ray or gamma-ray excitation. The MapX concept has been demonstrated with a series of lab-based prototypes and is currently under refinement and TRL maturation.

  12. BAC-end sequence-based SNPs and Bin mapping for rapid integration of physical and genetic maps in apple.

    Science.gov (United States)

    Han, Yuepeng; Chagné, David; Gasic, Ksenija; Rikkerink, Erik H A; Beever, Jonathan E; Gardiner, Susan E; Korban, Schuyler S

    2009-03-01

    A genome-wide BAC physical map of the apple, Malus x domestica Borkh., has been recently developed. Here, we report on integrating the physical and genetic maps of the apple using a SNP-based approach in conjunction with bin mapping. Briefly, BAC clones located at ends of BAC contigs were selected, and sequenced at both ends. The BAC end sequences (BESs) were used to identify candidate SNPs. Subsequently, these candidate SNPs were genetically mapped using a bin mapping strategy for the purpose of mapping the physical onto the genetic map. Using this approach, 52 (23%) out of 228 BESs tested were successfully exploited to develop SNPs. These SNPs anchored 51 contigs, spanning approximately 37 Mb in cumulative physical length, onto 14 linkage groups. The reliability of the integration of the physical and genetic maps using this SNP-based strategy is described, and the results confirm the feasibility of this approach to construct an integrated physical and genetic maps for apple.

  13. Computerized mappings of the cerebral cortex: a multiresolution flattening method and a surface-based coordinate system

    Science.gov (United States)

    Drury, H. A.; Van Essen, D. C.; Anderson, C. H.; Lee, C. W.; Coogan, T. A.; Lewis, J. W.

    1996-01-01

    We present a new method for generating two-dimensional maps of the cerebral cortex. Our computerized, two-stage flattening method takes as its input any well-defined representation of a surface within the three-dimensional cortex. The first stage rapidly converts this surface to a topologically correct two-dimensional map, without regard for the amount of distortion introduced. The second stage reduces distortions using a multiresolution strategy that makes gross shape changes on a coarsely sampled map and further shape refinements on progressively finer resolution maps. We demonstrate the utility of this approach by creating flat maps of the entire cerebral cortex in the macaque monkey and by displaying various types of experimental data on such maps. We also introduce a surface-based coordinate system that has advantages over conventional stereotaxic coordinates and is relevant to studies of cortical organization in humans as well as non-human primates. Together, these methods provide an improved basis for quantitative studies of individual variability in cortical organization.

  14. Canopy Fuel Load Mapping of Mediterranean Pine Sites Based on Individual Tree-Crown Delineation

    Directory of Open Access Journals (Sweden)

    Giorgos Mallinis

    2013-12-01

    Full Text Available This study presents an individual tree-crown-based approach for canopy fuel load estimation and mapping in two Mediterranean pine stands. Based on destructive sampling, an allometric equation was developed for the estimation of crown fuel weight considering only pine crown width, a tree characteristic that can be estimated from passive imagery. Two high-resolution images were first used to discriminate Aleppo and Calabrian pine crown regions through a geographic object-based image analysis approach. Subsequently, the crown region images were segmented using a watershed segmentation algorithm and crown width was extracted. The overall accuracy of the tree crown isolation, expressed as a perfect match between the reference and the delineated crowns, was 34.00% for the Kassandra site and 48.11% for the Thessaloniki site, while the coefficient of determination between the ground-measured and satellite-extracted crown width was 0.5. Canopy fuel load values estimated in the current study presented mean values from 1.29 ± 0.6 to 1.65 ± 0.7 kg/m2, similar to other conifers worldwide. Despite the modest accuracies attained in this first study of individual tree crown fuel load mapping, the combination of the allometric equations with satellite-extracted crown width information can contribute to the spatially explicit mapping of canopy fuel load in Mediterranean areas. These maps can be used, among other applications, in fire behavior prediction, in prioritizing fuel reduction treatments, and during active fire suppression.

  15. Strategies for haplotype-based association mapping in complex pedigreed populations

    DEFF Research Database (Denmark)

    Boleckova, J; Christensen, Ole Fredslund; Sørensen, Peter

    2012-01-01

    In association mapping, haplotype-based methods are generally regarded to provide higher power and increased precision than methods based on single markers. For haplotype-based association mapping most studies use a fixed haplotype effect in the model. However, an increase in haplotype length inc...

  16. Hash function based on piecewise nonlinear chaotic map

    International Nuclear Information System (INIS)

    Akhavan, A.; Samsudin, A.; Akhshani, A.

    2009-01-01

    Chaos-based cryptography appeared in the early 1990s as an original application of nonlinear dynamics in the chaotic regime. In this paper, an algorithm for one-way hash function construction based on a piecewise nonlinear chaotic map with a variable probability parameter is proposed. The proposed algorithm is also an attempt to present a new chaotic hash function based on multithreaded programming. In this chaotic scheme, the message is connected to the chaotic map through the probability parameter and the other parameters of the chaotic map, such as the control parameter and initial condition, so that the generated hash value is highly sensitive to the message. Simulation results indicate that the proposed algorithm presents several interesting features, such as high flexibility, good statistical properties, high key sensitivity and high message sensitivity. These properties make the scheme a suitable choice for practical applications.
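    A toy version of a chaotic-map hash can illustrate the mechanism: a piecewise linear chaotic map (a simpler relative of the piecewise nonlinear map in the paper) is iterated while message bytes perturb the state, and the final orbit is quantized into a digest. This is purely illustrative, is not the paper's algorithm, and carries none of its security analysis:

```python
def pwlcm(x, p):
    """Piecewise linear chaotic map on [0, 1] with control parameter p in (0, 0.5)."""
    if x < p:
        return x / p
    if x <= 0.5:
        return (x - p) / (0.5 - p)
    return pwlcm(1.0 - x, p)

def chaotic_hash(message: bytes, x0=0.2341, p=0.3571, rounds=4):
    """Toy keyed hash driven by a chaotic map (illustrative only).

    Each message byte perturbs the map state; after several rounds the
    orbit is quantized into a 16-byte (32-hex-digit) digest. The initial
    state x0 and parameter p act as the key."""
    x = x0
    for _ in range(rounds):
        for b in message:
            x = pwlcm((x + b / 256.0) % 1.0, p)  # mix each byte into the orbit
    digest = bytearray()
    for _ in range(16):
        x = pwlcm(x, p)
        digest.append(int(x * 255.999))          # quantize orbit to bytes
    return digest.hex()
```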

  17. Data splitting for artificial neural networks using SOM-based stratified sampling.

    Science.gov (United States)

    May, R J; Maier, H R; Dandy, G C

    2010-03-01

    Data splitting is an important consideration during artificial neural network (ANN) development, where hold-out cross-validation is commonly employed to ensure generalization. Even for a moderate sample size, the sampling methodology used for data splitting can have a significant effect on the quality of the subsets used for training, testing and validating an ANN. Poor data splitting can result in inaccurate and highly variable model performance; however, the choice of sampling methodology is rarely given due consideration by ANN modellers. Increased confidence in the sampling is of paramount importance, since the hold-out sampling is generally performed only once during ANN development. This paper considers the variability in the quality of subsets that are obtained using different data splitting approaches. A novel approach to stratified sampling, based on Neyman sampling of the self-organizing map (SOM), is developed, with several guidelines identified for setting the SOM size and sample allocation in order to minimize the bias and variance in the datasets. Using an example ANN function approximation task, the SOM-based approach is evaluated in comparison to random sampling, DUPLEX, systematic stratified sampling, and trial-and-error sampling to minimize the statistical differences between data sets. Of these approaches, DUPLEX is found to provide benchmark performance, yielding good model performance with no variability. The results show that the SOM-based approach also reliably generates high-quality samples and can therefore be used with greater confidence than other approaches, especially in the case of non-uniform datasets, with the benefit of scalability to perform data splitting on large datasets. Copyright 2009 Elsevier Ltd. All rights reserved.
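
    The SOM-plus-allocation idea above can be sketched from scratch; this is a minimal illustration only (proportional allocation is used below as a simplified stand-in for the paper's Neyman allocation, and all grid sizes, rates and iteration counts are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def train_som(data, grid=(4, 4), iters=2000, lr0=0.5, sigma0=1.5):
        """Train a small self-organizing map; returns unit weights, shape (gx*gy, d)."""
        gx, gy = grid
        coords = np.array([(i, j) for i in range(gx) for j in range(gy)], float)
        w = rng.uniform(data.min(0), data.max(0), size=(gx * gy, data.shape[1]))
        for t in range(iters):
            x = data[rng.integers(len(data))]
            bmu = np.argmin(((w - x) ** 2).sum(1))        # best-matching unit
            frac = t / iters
            lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 1e-3
            h = np.exp(-((coords - coords[bmu]) ** 2).sum(1) / (2 * sigma ** 2))
            w += lr * h[:, None] * (x - w)                # neighborhood update
        return w

    def som_stratified_split(data, w, n_train):
        """Treat each SOM unit as a stratum and allocate training samples to it
        (proportional allocation; the paper uses Neyman allocation)."""
        cell = np.argmin(((data[:, None, :] - w[None]) ** 2).sum(-1), axis=1)
        train_idx = []
        for c in np.unique(cell):
            members = np.where(cell == c)[0]
            k = max(1, round(n_train * len(members) / len(data)))
            train_idx.extend(rng.choice(members, size=min(k, len(members)), replace=False))
        return np.array(train_idx)
    ```

    Training points chosen this way cover every region of the input space the SOM has discovered, which is what reduces the bias and variance of the resulting subsets relative to plain random splitting.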

  18. State Base Map for GIS – New Digital Topographic Map of the Republic of Macedonia

    Directory of Open Access Journals (Sweden)

    Zlatko Srbinoski

    2009-12-01

    Full Text Available The basic aim of the National Spatial Data Infrastructure (NSDI), built in accordance with the INSPIRE directive, is to standardize spatial data infrastructure at the national level. In that direction, topographic maps are a basic platform for acquiring spatial data within geoinformation systems and one of the most important segments of the NSDI. This paper presents the methodology of establishing the new digital topographic map of the Republic of Macedonia, titled “State Base Map for GIS in Macedonia”, and analyzes its geometrical accuracy. Production of the new digital topographic map has been the most important cartographic project in the Republic of Macedonia since it became independent.

  19. Latin Hypercube Sampling (LHS) at variable resolutions for enhanced watershed scale Soil Sampling and Digital Soil Mapping.

    Science.gov (United States)

    Hamalainen, Sampsa; Geng, Xiaoyuan; He, Juanxia

    2017-04-01

    Latin Hypercube Sampling (LHS) at variable resolutions for enhanced watershed scale Soil Sampling and Digital Soil Mapping. Sampsa Hamalainen, Xiaoyuan Geng, and Juanxia He. AAFC - Agriculture and Agri-Food Canada, Ottawa, Canada. The Latin Hypercube Sampling (LHS) approach to assist with Digital Soil Mapping has been developed for some time now; the purpose of this work was to complement LHS with the use of multiple spatial resolutions of covariate datasets and variability in the range of sampling points produced. This allowed specific sets of LHS points to be produced to fulfil the needs of various partners from multiple projects working in the Ontario and Prince Edward Island provinces of Canada. Secondary soil and environmental attributes are critical inputs required in the development of sampling points by LHS. These include a required Digital Elevation Model (DEM) and subsequent covariate datasets produced by a digital terrain analysis performed on the DEM. These additional covariates often include, but are not limited to, Topographic Wetness Index (TWI), Length-Slope (LS) Factor, and Slope, which are continuous data. The number of points created by LHS ranged from 50 to 200, depending on the size of the watershed and, more importantly, the number of soil types found within it. The spatial resolution of the covariates included in the work ranged from 5 to 30 m. The iterations within the LHS sampling were run at an optimal level so that the LHS model provided a good spatial representation of the environmental attributes within the watershed. Additional covariates that are categorical in nature, such as external surficial geology data, were also included in the Latin Hypercube Sampling approach. Initial results of the work include using 1000 iterations within the LHS model; 1000 iterations was consistently a reasonable value for producing sampling points that provided a good spatial representation of the environmental
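
    The core LHS construction itself is compact. As a hedged illustration (the covariate names and ranges below are invented, and conditioning on real covariate layers is omitted), a unit-cube sampler plus a scaling step to covariate ranges such as TWI or slope might look like:

    ```python
    import numpy as np

    def latin_hypercube(n, d, seed=None):
        """n samples in d dimensions: each axis is cut into n equal strata and
        exactly one sample lands in every stratum of every axis."""
        rng = np.random.default_rng(seed)
        strata = np.stack([rng.permutation(n) for _ in range(d)], axis=1)
        return (strata + rng.random((n, d))) / n

    def scale(u, lo, hi):
        """Map unit-cube samples onto real covariate ranges."""
        lo, hi = np.asarray(lo, float), np.asarray(hi, float)
        return lo + u * (hi - lo)

    # e.g. 100 candidate points over (TWI, slope %, elevation m) -- invented ranges
    pts = scale(latin_hypercube(100, 3, seed=42), [2.0, 0.0, 50.0], [25.0, 30.0, 400.0])
    ```

    The stratification guarantee (one point per stratum per axis) is what gives LHS good coverage of the covariate space with relatively few field samples.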

  20. Vaccination with map specific peptides reduces map burden in tissues of infected goats

    DEFF Research Database (Denmark)

    Melvang, Heidi Mikkelsen; Hassan, Sufia Butt; Thakur, Aneesh

    As an alternative to protein-based vaccines, we investigated the effect of post-exposure vaccination with Map specific peptides in a goat model aiming at developing a Map vaccine that will neither interfere with diagnosis of paratuberculosis nor bovine tuberculosis. Peptides were initially select...... in the unvaccinated control group seroconverted in ID Screen® ELISA at last sampling prior to euthanasia. These results indicate that a subunit vaccine against Map can induce a protective immune response against paratuberculosis in goats....

  1. A fast image encryption algorithm based on chaotic map

    Science.gov (United States)

    Liu, Wenhao; Sun, Kehui; Zhu, Congxu

    2016-09-01

    Derived from the Sine map and the iterative chaotic map with infinite collapse (ICMIC), a new two-dimensional Sine ICMIC modulation map (2D-SIMM) is proposed based on a closed-loop modulation coupling (CMC) model, and its chaotic performance is analyzed by means of phase diagrams, the Lyapunov exponent spectrum and complexity. It shows that this map has good ergodicity, hyperchaotic behavior, a large maximum Lyapunov exponent and high complexity. Based on this map, a fast image encryption algorithm is proposed. In this algorithm, the confusion and diffusion processes are combined into one stage. A chaotic shift transform (CST) is proposed to efficiently change the image pixel positions, and the row and column substitutions are applied to scramble the pixel values simultaneously. The simulation and analysis results show that this algorithm has high security, low time complexity, and the ability to resist statistical, differential, brute-force, known-plaintext and chosen-plaintext attacks.
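
    The 2D-SIMM equations and the CST cannot be reproduced from the abstract alone, so the sketch below swaps in the plain logistic map to show the generic permute-and-XOR (confusion plus diffusion) structure such algorithms share. Everything here is illustrative, not the paper's cipher:

    ```python
    import numpy as np

    def logistic_stream(n, x0=0.61, mu=3.99):
        """Chaotic keystream from the logistic map (stand-in for 2D-SIMM)."""
        xs = np.empty(n)
        x = x0
        for i in range(n):
            x = mu * x * (1.0 - x)
            xs[i] = x
        return xs

    def encrypt(img, x0=0.61):
        """Permute pixel positions (confusion) and XOR a keystream (diffusion)."""
        flat = img.ravel()
        xs = logistic_stream(flat.size, x0)
        perm = np.argsort(xs)                  # chaotic permutation of positions
        key = (xs * 256).astype(np.uint8)      # keystream bytes
        return (flat[perm] ^ key).reshape(img.shape)

    def decrypt(cipher, x0=0.61):
        xs = logistic_stream(cipher.size, x0)  # same key -> same trajectory
        perm = np.argsort(xs)
        key = (xs * 256).astype(np.uint8)
        flat = cipher.ravel() ^ key
        out = np.empty_like(flat)
        out[perm] = flat                       # undo the permutation
        return out.reshape(cipher.shape)
    ```

    A uint8 image round-trips through encrypt/decrypt, with the map's initial condition x0 playing the role of the secret key; the paper's contribution is a stronger map and a single combined confusion-diffusion stage, not this toy structure.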

  2. Application of self-organising maps towards segmentation of soybean samples by determination of amino acids concentration.

    Science.gov (United States)

    Silva, Lívia Ramazzoti Chanan; Angilelli, Karina Gomes; Cremasco, Hágata; Romagnoli, Érica Signori; Galão, Olívio Fernandes; Borsato, Dionisio; Moraes, Larissa Alexandra Cardoso; Mandarino, José Marcos Gontijo

    2016-09-01

    Soybeans are widely used both for human nutrition and animal feed, since they are an important source of protein and also provide components such as phytosterols, isoflavones, and amino acids. In this study, the concentrations of the amino acids lysine, histidine, arginine, asparagine, glutamic acid, glycine, alanine, valine, isoleucine, leucine, tyrosine, and phenylalanine were determined in 14 samples of conventional soybeans and 6 transgenic samples, cultivated in two cities of the state of Paraná, Londrina and Ponta Grossa. The results were tabulated and presented to a self-organising map for segmentation according to planting region and conventional or transgenic variety. A network with 7000 training epochs and a 10 × 10 topology was used, and it proved appropriate for segmenting the samples using the data analysed. The weight maps provided by the network showed that all the amino acids were important in segmenting the samples, especially isoleucine. Three clusters were formed: one with only Ponta Grossa samples (including transgenic (PGT) and common (PGC)), a second group with Londrina transgenic (LT) samples, and a third with Londrina common (LC) samples. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  3. Map-based model of the cardiac action potential

    International Nuclear Information System (INIS)

    Pavlov, Evgeny A.; Osipov, Grigory V.; Chan, C.K.; Suykens, Johan A.K.

    2011-01-01

    A simple computationally efficient model which is capable of replicating the basic features of cardiac cell action potential is proposed. The model is a four-dimensional map and demonstrates good correspondence with real cardiac cells. Various regimes of cardiac activity, which can be reproduced by the proposed model, are shown. Bifurcation mechanisms of these regimes transitions are explained using phase space analysis. The dynamics of 1D and 2D lattices of coupled maps which model the behavior of electrically connected cells is discussed in the context of synchronization theory. -- Highlights: → Recent experimental-data based models are complicated for analysis and simulation. → The simplified map-based model of the cardiac cell is constructed. → The model is capable for replication of different types of cardiac activity. → The spatio-temporal dynamics of ensembles of coupled maps are investigated. → Received data are analyzed in context of biophysical processes in the myocardium.

  4. Map-based model of the cardiac action potential

    Energy Technology Data Exchange (ETDEWEB)

    Pavlov, Evgeny A., E-mail: genie.pavlov@gmail.com [Department of Computational Mathematics and Cybernetics, Nizhny Novgorod State University, 23, Gagarin Avenue, 603950 Nizhny Novgorod (Russian Federation); Osipov, Grigory V. [Department of Computational Mathematics and Cybernetics, Nizhny Novgorod State University, 23, Gagarin Avenue, 603950 Nizhny Novgorod (Russian Federation); Chan, C.K. [Institute of Physics, Academia Sinica, 128 Sec. 2, Academia Road, Nankang, Taipei 115, Taiwan (China); Suykens, Johan A.K. [K.U. Leuven, ESAT-SCD/SISTA, Kasteelpark Arenberg 10, B-3001 Leuven (Heverlee) (Belgium)

    2011-07-25

    A simple computationally efficient model which is capable of replicating the basic features of cardiac cell action potential is proposed. The model is a four-dimensional map and demonstrates good correspondence with real cardiac cells. Various regimes of cardiac activity, which can be reproduced by the proposed model, are shown. Bifurcation mechanisms of these regimes transitions are explained using phase space analysis. The dynamics of 1D and 2D lattices of coupled maps which model the behavior of electrically connected cells is discussed in the context of synchronization theory. -- Highlights: → Recent experimental-data based models are complicated for analysis and simulation. → The simplified map-based model of the cardiac cell is constructed. → The model is capable for replication of different types of cardiac activity. → The spatio-temporal dynamics of ensembles of coupled maps are investigated. → Received data are analyzed in context of biophysical processes in the myocardium.

  5. A third-generation microsatellite-based linkage map of the honey bee, Apis mellifera, and its comparison with the sequence-based physical map.

    Science.gov (United States)

    Solignac, Michel; Mougel, Florence; Vautrin, Dominique; Monnerot, Monique; Cornuet, Jean-Marie

    2007-01-01

    The honey bee is a key model for social behavior and this feature led to the selection of the species for genome sequencing. A genetic map is a necessary companion to the sequence. In addition, because there was originally no physical map for the honey bee genome project, a meiotic map was the only resource for organizing the sequence assembly on the chromosomes. We present the genetic (meiotic) map here and describe the main features that emerged from comparison with the sequence-based physical map. The genetic map of the honey bee is saturated and the chromosomes are oriented from the centromeric to the telomeric regions. The map is based on 2,008 markers and is about 40 Morgans (M) long, resulting in a marker density of one every 2.05 centiMorgans (cM). For the 186 megabases (Mb) of the genome mapped and assembled, this corresponds to a very high average recombination rate of 22.04 cM/Mb. Honey bee meiosis shows a relatively homogeneous recombination rate along and across chromosomes, as well as within and between individuals. Interference is higher than inferred from the Kosambi function of distance. In addition, numerous recombination hotspots are dispersed over the genome. The very large genetic length of the honey bee genome, its small physical size and an almost complete genome sequence with a relatively low number of genes suggest a very promising future for association mapping in the honey bee, particularly as the existence of haploid males allows easy bulk segregant analysis.

  6. The effects of a concept map-based support tool on simulation-based inquiry learning

    NARCIS (Netherlands)

    Hagemans, M.G.; van der Meij, Hans; de Jong, Anthonius J.M.

    2013-01-01

    Students often need support to optimize their learning in inquiry learning environments. In 2 studies, we investigated the effects of adding concept-map-based support to a simulation-based inquiry environment on kinematics. The concept map displayed the main domain concepts and their relations,

  7. Broadband illusion optical devices based on conformal mappings

    Science.gov (United States)

    Xiong, Zhan; Xu, Lin; Xu, Ya-Dong; Chen, Huan-Yang

    2017-10-01

    In this paper, we propose a simple method of illusion optics based on conformal mappings. By carefully developing designs with specific conformal mappings, one can make an object look like another with a significantly different shape. In addition, the illusion optical devices can work over a broad band of frequencies.
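
    The paper's specific mappings are not given in the abstract; as a generic illustration of the idea, the classical Joukowski transform w = z + 1/z reshapes a circle into an ellipse, which is the prototype of making one shape "look like" another under a conformal map:

    ```python
    import numpy as np

    def joukowski(z):
        """Classical conformal map w = z + 1/z (analytic away from z = 0)."""
        return z + 1.0 / z

    theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
    circle = 1.2 * np.exp(1j * theta)   # circle of radius r = 1.2
    image = joukowski(circle)           # ellipse with semi-axes r + 1/r and r - 1/r
    ```

    A circle of radius r centered at the origin maps to an ellipse with semi-axes r + 1/r and r − 1/r; circles offset from the origin map to airfoil-like shapes. Because the map is conformal (angle-preserving), Maxwell's equations keep their form under the coordinate change, which is what makes such designs inherently broadband.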

  8. Assessing Volunteered Geographic Information (vgi) Quality Based on CONTRIBUTORS' Mapping Behaviours

    Science.gov (United States)

    Bégin, D.; Devillers, R.; Roche, S.

    2013-05-01

    VGI changed the mapping landscape by allowing people who are not professional cartographers to contribute to large mapping projects, resulting at the same time in concerns about the quality of the data produced. While a number of early VGI studies used conventional methods to assess data quality, such approaches are not always well adapted to VGI. Since VGI is user-generated content, we posit that the features and places mapped by contributors largely reflect contributors' personal interests. This paper proposes studying contributors' mapping processes to understand the characteristics and quality of the data produced. We argue that contributors' behaviour when mapping reflects their motivation and individual preferences in selecting mapped features and delineating mapped areas. Such knowledge of contributors' behaviour could allow for the derivation of information about the quality of VGI datasets. This approach was tested using a sample area from OpenStreetMap, leading to a better understanding of data completeness for contributors' preferred features.

  9. Data Assimilation with Optimal Maps

    Science.gov (United States)

    El Moselhy, T.; Marzouk, Y.

    2012-12-01

    We present a new approach to Bayesian inference that entirely avoids Markov chain simulation and sequential importance resampling, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. The map is written as a multivariate polynomial expansion and computed efficiently through the solution of a stochastic optimization problem. While our previous work [1] focused on static Bayesian inference problems, we now extend the map-based approach to sequential data assimilation, i.e., nonlinear filtering and smoothing. One scheme involves pushing forward a fixed reference measure to each filtered state distribution, while an alternative scheme computes maps that push forward the filtering distribution from one stage to the next. We compare the performance of these schemes and extend the former to problems of smoothing, using a map implementation of the forward-backward smoothing formula. Advantages of a map-based representation of the filtering and smoothing distributions include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent uniformly-weighted posterior samples without additional evaluations of the dynamical model. Perhaps the main advantage, however, is that the map approach inherently avoids issues of sample impoverishment, since it explicitly represents the posterior as the pushforward of a reference measure, rather than with a particular set of samples. The computational complexity of our algorithm is comparable to state-of-the-art particle filters. Moreover, the accuracy of the approach is controlled via the convergence criterion of the underlying optimization problem. We demonstrate the efficiency and accuracy of the map approach via data assimilation in
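
    The maps in this work are multivariate polynomials found by stochastic optimization; as a minimal hedged illustration of the pushforward idea itself (not the authors' construction), here is the exact 1D transport map, an inverse-CDF composition, that pushes a standard-normal reference to an Exponential(1) "posterior", yielding arbitrarily many uniformly-weighted samples:

    ```python
    import math
    import numpy as np

    def transport_map(x):
        """T = F_target^{-1} o Phi: pushes N(0,1) samples to Exponential(1).
        In the paper this role is played by an optimized polynomial map."""
        u = 0.5 * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))  # Phi(x)
        return -np.log1p(-u)    # inverse CDF of Exponential(1)

    rng = np.random.default_rng(0)
    reference = rng.standard_normal(100_000)   # cheap reference samples
    posterior = transport_map(reference)       # uniformly-weighted target samples
    ```

    Moments of the pushforward are then computed directly from the mapped samples, with no reweighting or resampling — which is exactly the sample-impoverishment advantage described above.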

  10. Bedrock Geologic Map of Vermont - Geochronology Sample Locations

    Data.gov (United States)

    Vermont Center for Geographic Information — The bedrock geology was last mapped at a statewide scale 50 years ago at a scale of 1:250,000 (Doll and others, 1961). The 1961 map was compiled from 1:62,500-scale...

  11. Soil classification basing on the spectral characteristics of topsoil samples

    Science.gov (United States)

    Liu, Huanjun; Zhang, Xiaokang; Zhang, Xinle

    2016-04-01

    Soil taxonomy plays an important role in soil use and management, but China has only a coarse soil map, created from 1980s data. New technology, e.g. spectroscopy, could simplify soil classification. This study tries to classify soils based on the spectral characteristics of topsoil samples. 148 topsoil samples of typical soils, including Black soil, Chernozem, Blown soil and Meadow soil, were collected from the Songnen plain, Northeast China, and their laboratory-measured spectral reflectance in the visible and near-infrared region (400-2500 nm) was processed with weighted moving averaging, resampling, and continuum removal. Spectral indices were extracted from the soil spectral characteristics, including the second absorption position of the spectral curve, the first absorption valley's area, and the slope of the spectral curve at 500-600 nm and 1340-1360 nm. Then K-means clustering and a decision tree were used respectively to build soil classification models. The results indicated that 1) the second absorption positions of Black soil and Chernozem were located at 610 nm and 650 nm respectively; 2) the spectral curve of Meadow soil is similar to that of its adjacent soil, which could be due to soil erosion; 3) the decision tree model showed higher classification accuracy, with accuracies for Black soil, Chernozem, Blown soil and Meadow soil of 100%, 88%, 97% and 50% respectively, and the accuracy for Blown soil could be increased to 100% by adding one more spectral index (the first two valleys' areas) to the model, which showed that the model could be used for soil classification and soil mapping in the near future.
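
    The clustering step can be sketched as follows; all spectral-index values below are synthetic, loosely inspired by the reported 610 nm vs. 650 nm absorption positions, and this is plain k-means, not the study's fitted models:

    ```python
    import numpy as np

    def kmeans(X, k, iters=50, seed=0):
        """Plain k-means, as used for the unsupervised soil grouping."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)].copy()
        for _ in range(iters):
            labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
            for j in range(k):
                if (labels == j).any():
                    centers[j] = X[labels == j].mean(axis=0)
        return labels, centers

    # Synthetic (absorption position nm, first valley area) indices for two soils
    rng = np.random.default_rng(1)
    soil_a = rng.normal([610.0, 0.20], [2.0, 0.02], size=(30, 2))  # "Black soil"-like
    soil_b = rng.normal([650.0, 0.35], [2.0, 0.02], size=(30, 2))  # "Chernozem"-like
    X = np.vstack([soil_a, soil_b])
    labels, centers = kmeans(X, 2)
    ```

    With absorption positions this well separated, the clusters recover the two soil groups exactly; the study's decision tree then improves on this by thresholding the indices with supervision.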

  12. Mapping Vineyard Leaf Area Using Mobile Terrestrial Laser Scanners: Should Rows be Scanned On-the-Go or Discontinuously Sampled?

    Directory of Open Access Journals (Sweden)

    Ignacio del-Moral-Martínez

    2016-01-01

    Full Text Available The leaf area index (LAI) is defined as the one-sided leaf area per unit ground area, and is probably the most widely used index to characterize grapevine vigor. However, LAI varies spatially within vineyard plots. Mapping and quantifying this variability is very important for improving management decisions and agricultural practices. In this study, a mobile terrestrial laser scanner (MTLS) was used to map the LAI of a vineyard, and then to examine how different scanning methods (on-the-go or discontinuous systematic sampling) may affect the reliability of the resulting raster maps. The MTLS allows calculation of the enveloping vegetative area of the canopy, which is the sum of the leaf wall areas for both sides of the row (excluding gaps) and the projected upper area. Obtaining the enveloping areas requires scanning a one-meter-long section of the row from both sides at each systematic sampling point. By converting the enveloping areas into LAI values, a raster map of the latter can be obtained by spatial interpolation (kriging). However, the user can opt for scanning on-the-go in a continuous way and computing 1-m LAI values along the rows, or instead performing the scanning at discontinuous systematic sampling points within the plot. An analysis of correlation between maps indicated that the MTLS can be used discontinuously in specific sampling sections separated by up to 15 m along the rows. This capability significantly reduces the amount of data to be acquired at field level, the data storage capacity and the processing power required of computers.
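
    The interpolation step that turns sampled 1-m LAI sections into a continuous raster can be sketched with inverse-distance weighting, used here as a simpler stand-in for the kriging actually applied in the study (all coordinates and LAI values are invented):

    ```python
    import numpy as np

    def idw(xy_known, values, xy_query, power=2.0):
        """Inverse-distance-weighted interpolation of sampled LAI values."""
        d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=-1)
        d = np.maximum(d, 1e-9)            # guard against division by zero
        w = 1.0 / d ** power
        return (w * values).sum(axis=1) / w.sum(axis=1)

    # 1-m LAI sections sampled every 15 m along a row (x, y in meters)
    known = np.array([[0.0, 0.0], [15.0, 0.0], [30.0, 0.0]])
    lai = np.array([1.0, 1.4, 1.2])
    estimate = idw(known, lai, np.array([[7.5, 0.0]]))   # value between samples
    ```

    Unlike kriging, IDW gives no variance estimate, but the workflow is the same: the denser the sampling sections, the closer the interpolated raster tracks the true within-plot LAI variability.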

  13. One-dimensional map-based neuron model: A logistic modification

    International Nuclear Information System (INIS)

    Mesbah, Samineh; Moghtadaei, Motahareh; Hashemi Golpayegani, Mohammad Reza; Towhidkhah, Farzad

    2014-01-01

    A one-dimensional map is proposed for modeling some neuronal activities, including different spiking and bursting behaviors. The model is obtained by applying some modifications to the well-known Logistic map and is named the Modified and Confined Logistic (MCL) model. Map-based neuron models are known as phenomenological models and have recently been widely applied in modeling tasks due to their computational efficiency. Most discrete map-based models involve two variables representing the slow-fast prototype. There are also some one-dimensional maps that can replicate some neuronal activities. However, the existence of four bifurcation parameters in the MCL model gives rise to the reproduction of spiking behavior with control over the frequency of the spikes, together with the imitation of chaotic and regular bursting responses. It is also shown that the proposed model has the potential to reproduce more realistic bursting activity by adding a second variable. Moreover, the MCL model is able to replicate a considerable number of the experimentally observed neuronal responses introduced in Izhikevich (2004) [23]. Some analytical and numerical analyses of the MCL model dynamics are presented to explain the emergence of complex dynamics from this one-dimensional map
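
    The MCL equations themselves are not given in the abstract. As a well-known point of comparison for the map-based (phenomenological) style it belongs to, here is the classic two-variable Rulkov map, the slow-fast prototype mentioned above; the parameter values are standard illustrative choices, not taken from this paper:

    ```python
    import numpy as np

    def rulkov(n, alpha=4.5, mu=0.001, sigma=0.14, x0=-1.0, y0=-3.0):
        """Rulkov's map-based neuron: x is the fast (membrane-like) variable,
        y the slow one. For alpha > 4 the map produces spiking-bursting activity."""
        x, y = x0, y0
        xs = np.empty(n)
        for i in range(n):
            # simultaneous update: both right-hand sides use the old (x, y)
            x, y = alpha / (1.0 + x * x) + y, y - mu * (x + 1.0 - sigma)
            xs[i] = x
        return xs

    trace = rulkov(20_000)
    ```

    One map evaluation per time step is what gives these models their computational efficiency relative to ODE-based neuron models; the MCL model's contribution is achieving comparable behavioral repertoires with a single variable and four bifurcation parameters.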

  14. Construction of microsatellite-based linkage map and mapping of nectarilessness and hairiness genes in Gossypium tomentosum.

    Science.gov (United States)

    Hou, Meiying; Cai, Caiping; Zhang, Shuwen; Guo, Wangzhen; Zhang, Tianzhen; Zhou, Baoliang

    2013-12-01

    Gossypium tomentosum, a wild tetraploid cotton species with AD genomes, possesses genes conferring strong fibers and high heat tolerance. To effectively transfer these genes into Gossypium hirsutum, an entire microsatellite (simple sequence repeat, SSR)-based genetic map was constructed using the interspecific cross of G. hirsutum x G. tomentosum (HT). We detected 1800 loci from 1347 pairs of polymorphic primers. Of these, 1204 loci were grouped into 35 linkage groups at LOD ≥ 4. The map covers 3320.8 cM, with a mean density of 2.76 cM per locus. We detected 420 common loci (186 in the At subgenome and 234 in Dt) between the HT map and the map of TM-1 (G. hirsutum) x Hai 7124 (G. barbadense; the HB map). The linkage groups were assigned chromosome numbers based on the location of common loci, with the HB map as reference. A comparison of common markers revealed that no significant chromosomal rearrangement exists between G. tomentosum and G. barbadense. Interestingly, however, we detected numerous (33.7%) loci with segregation deviating from the expected 3:1 ratio. The map constructed in this study will be useful for further genetic studies in cotton breeding, including mapping loci controlling quantitative traits associated with fiber quality and stress tolerance, and developing chromosome-segment-specific introgression lines from G. tomentosum into G. hirsutum using marker-assisted selection.

  15. The Effects of a Concept Map-Based Support Tool on Simulation-Based Inquiry Learning

    Science.gov (United States)

    Hagemans, Mieke G.; van der Meij, Hans; de Jong, Ton

    2013-01-01

    Students often need support to optimize their learning in inquiry learning environments. In 2 studies, we investigated the effects of adding concept-map-based support to a simulation-based inquiry environment on kinematics. The concept map displayed the main domain concepts and their relations, while dynamic color coding of the concepts displayed…

  16. Feature selection based on SVM significance maps for classification of dementia

    NARCIS (Netherlands)

    E.E. Bron (Esther); M. Smits (Marion); J.C. van Swieten (John); W.J. Niessen (Wiro); S. Klein (Stefan)

    2014-01-01

    Support vector machine significance maps (SVM p-maps) previously showed clusters of significantly different voxels in dementia-related brain regions. We propose a novel feature selection method for classification of dementia based on these p-maps. In our approach, the SVM p-maps are

  17. Recovering Sample Diversity in Rao-Blackwellized Particle Filters for Simultaneous Localization and Mapping

    National Research Council Canada - National Science Library

    Anderson, Andrew D

    2006-01-01

    ...) in simultaneous localization and mapping (SLAM) situations that arises when precise feature measurements yield a limited perceptual distribution relative to a motion-based proposal distribution...

  18. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera

    Directory of Open Access Journals (Sweden)

    Hyungjin Kim

    2015-08-01

    Full Text Available Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in a simulation and in real environments

  19. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera.

    Science.gov (United States)

    Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun

    2015-08-31

    Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in a simulation and in real environments.
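
    The optimization step weights each 3D-to-2D feature residual by its uncertainty. A minimal sketch of that Mahalanobis error (all values below are invented for illustration) is:

    ```python
    import numpy as np

    def mahalanobis(observed, predicted, cov):
        """Mahalanobis distance between an observed image feature and the
        projection of a probabilistic map feature with covariance cov."""
        d = np.asarray(observed, float) - np.asarray(predicted, float)
        return float(np.sqrt(d @ np.linalg.solve(cov, d)))

    # Anisotropic feature uncertainty: 2 px of error along the uncertain axis
    # counts for less than 2 px along the well-constrained axis.
    cov = np.array([[4.0, 0.0], [0.0, 1.0]])
    d_u = mahalanobis([2.0, 0.0], [0.0, 0.0], cov)   # along the uncertain axis
    d_v = mahalanobis([0.0, 2.0], [0.0, 0.0], cov)   # along the certain axis
    ```

    Summing squared distances of this kind over all matched features gives the objective that is minimized starting from the PnP initial pose, so well-localized features dominate the refinement.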

  20. Name-Based Address Mapping for Virtual Private Networks

    Science.gov (United States)

    Surányi, Péter; Shinjo, Yasushi; Kato, Kazuhiko

    IPv4 private addresses are commonly used in local area networks (LANs). With the increasing popularity of virtual private networks (VPNs), it has become common for a user to connect to multiple LANs at the same time. However, private address ranges for LANs frequently overlap. In such cases, existing systems do not allow the user to access the resources on all LANs at the same time. In this paper, we propose name-based address mapping for VPNs, a novel method that allows connecting to hosts through multiple VPNs at the same time, even when the address ranges of the VPNs overlap. In name-based address mapping, rather than using the IP addresses used on the LANs (the real addresses), we assign a unique virtual address to each remote host based on its domain name. The local host uses the virtual addresses to communicate with remote hosts. We have implemented name-based address mapping for layer-3 OpenVPN connections on Linux and measured its performance. The communication overhead of our system is less than 1.5% for throughput and less than 0.2 ms for each name resolution.
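
    The paper's actual assignment scheme and OpenVPN integration go beyond what the abstract states; the following is only a hedged sketch of the core idea, with an invented reserved pool and function names: hash the domain name into a virtual pool, keep the mapping stable, and probe past collisions so two distinct names never share a virtual address.

    ```python
    import hashlib
    import ipaddress

    VIRTUAL_POOL = ipaddress.ip_network("10.64.0.0/10")  # assumed reserved pool

    _assigned: dict[str, ipaddress.IPv4Address] = {}

    def virtual_address(fqdn: str) -> ipaddress.IPv4Address:
        """Map a domain name to a stable virtual IPv4 address inside the pool.
        Hash-based, so the same name always maps to the same address; hash
        collisions are resolved by linear probing over the pool."""
        if fqdn in _assigned:
            return _assigned[fqdn]
        size = VIRTUAL_POOL.num_addresses
        h = int.from_bytes(hashlib.sha256(fqdn.lower().encode()).digest()[:8], "big")
        taken = set(_assigned.values())
        for probe in range(size):
            addr = VIRTUAL_POOL[(h + probe) % size]
            if addr not in taken:
                _assigned[fqdn] = addr
                return addr
        raise RuntimeError("virtual address pool exhausted")
    ```

    Because the virtual address is derived from the name rather than from any LAN's real addressing plan, two hosts that happen to share the same overlapping private address on different VPNs still receive distinct local addresses.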

  1. Ensemble based system for whole-slide prostate cancer probability mapping using color texture features.

    LENUS (Irish Health Repository)

    DiFranco, Matthew D

    2011-01-01

    We present a tile-based approach for producing clinically relevant probability maps of prostatic carcinoma in histological sections from radical prostatectomy. Our methodology incorporates ensemble learning for feature selection and classification on expert-annotated images. Random forest feature selection performed over varying training sets provides a subset of generalized CIEL*a*b* co-occurrence texture features, while sample selection strategies with minimal constraints reduce training data requirements to achieve reliable results. Ensembles of classifiers are built using expert-annotated tiles from training images, and scores for the probability of cancer presence are calculated from the responses of each classifier in the ensemble. Spatial filtering of tile-based texture features prior to classification results in increased heat-map coherence as well as AUC values of 95% using ensembles of either random forests or support vector machines. Our approach is designed for adaptation to different imaging modalities, image features, and histological decision domains.

  2. Evaluating web-based static, animated and interactive maps for injury prevention

    Directory of Open Access Journals (Sweden)

    Jonathan Cinnamon

    2009-11-01

    Full Text Available Public health planning can benefit from visual exploration and analysis of geospatial data. Maps and geovisualization tools must be developed with the user group in mind. User-needs assessment and usability testing are crucial elements in the iterative process of map design and implementation. This study presents the results of a usability test, conducted with a sample of potential end-users in Toronto, Canada, of static, animated and interactive maps of injury rates and socio-demographic determinants of injury. The results of the user testing suggest that different map types are useful for different purposes and for the varying skill levels of individual users. The static maps were deemed easy to use and versatile, while the animated maps could be made more useful if animation controls were provided. The split-screen concept of the interactive maps was highlighted as particularly effective for map comparison. Overall, interactive maps were identified as the preferred map type for comparing patterns of injury and related socio-demographic risk factors. Information collected from the user tests is being used to expand and refine the injury web maps for Toronto, and could inform other public-health-related geovisualization projects.

  3. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    Science.gov (United States)

    Sidick, Erkin

    2012-01-01

This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also while eliminating existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished based on the analytical expressions of Zernike polynomials and a power spectral density model, such re-sampling does not introduce the aliasing and interpolation errors produced by conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial-filtering methods. Also, this new method automatically eliminates the measurement noise and other measurement errors such as artificial discontinuity. The developmental cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specs on the optical quality of individual optics before they are fabricated, through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match it with the gridded data format of a model validation tool, and (2) eliminating the surface measurement noise or measurement errors such that the resulting surface height map is continuous or smoothly varying. So far, the preferred method used for re-sampling a surface map has been two-dimensional interpolation. The main problem of this method is that the same pixel can take different values depending on the interpolation method used, such as the "nearest," "linear," "cubic," and "spline" fitting in Matlab. 
The conventional, FFT-based spatial filtering method used to
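The interpolation ambiguity noted above is easy to demonstrate. A minimal 1-D sketch (the 2-D case behaves the same way) shows one sample location taking two different values under two interpolation rules:

```python
# Sketch: the same sample point takes different values under different
# interpolation rules, which is why interpolation-based re-sampling is
# method-dependent. Pure-Python 1-D stand-in for the 2-D surface-map case.

def nearest(xs, ys, x):
    """Nearest-neighbor interpolation."""
    i = min(range(len(xs)), key=lambda i: abs(xs[i] - x))
    return ys[i]

def linear(xs, ys, x):
    """Piecewise-linear interpolation."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] * (1 - t) + ys[i + 1] * t
    raise ValueError("x outside sample range")

xs = [0.0, 1.0, 2.0]
ys = [0.0, 1.0, 0.0]
x = 0.75
print(nearest(xs, ys, x), linear(xs, ys, x))  # 1.0 vs 0.75: same point, two values
```

An analytical Zernike/PSD representation avoids this ambiguity because the surface is re-evaluated from closed-form expressions rather than from neighboring pixels.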

  4. CloudAligner: A fast and full-featured MapReduce based tool for sequence mapping

    Directory of Open Access Journals (Sweden)

    Shi Weisong

    2011-06-01

Full Text Available Abstract Background Research in genetics has developed rapidly recently due to the aid of next generation sequencing (NGS). However, massively-parallel NGS produces enormous amounts of data, which leads to storage, compatibility, scalability, and performance issues. The Cloud Computing and MapReduce framework, which utilizes hundreds or thousands of shared computers to map sequencing reads quickly and efficiently to reference genome sequences, appears to be a very promising solution for these issues. Consequently, it has been adopted by many organizations recently, and the initial results are very promising. However, since these are only initial steps toward this trend, the developed software does not provide adequate primary functions like bisulfite, pair-end mapping, etc., in on-site software such as RMAP or BS Seeker. In addition, existing MapReduce-based applications were not designed to process the long reads produced by the most recent second-generation and third-generation NGS instruments and, therefore, are inefficient. Last, it is difficult for a majority of biologists untrained in programming skills to use these tools because most were developed on Linux with a command line interface. Results To urge the trend of using Cloud technologies in genomics and prepare for advances in second- and third-generation DNA sequencing, we have built a Hadoop MapReduce-based application, CloudAligner, which achieves higher performance, covers most primary features, is more accurate, and has a user-friendly interface. It was also designed to be able to deal with long sequences. The performance gain of CloudAligner over Cloud-based counterparts (35 to 80%) mainly comes from the omission of the reduce phase. In comparison to local-based approaches, the performance gain of CloudAligner is from the partition and parallel processing of the huge reference genome as well as the reads. The source code of CloudAligner is available at http

  5. CloudAligner: A fast and full-featured MapReduce based tool for sequence mapping.

    Science.gov (United States)

    Nguyen, Tung; Shi, Weisong; Ruden, Douglas

    2011-06-06

    Research in genetics has developed rapidly recently due to the aid of next generation sequencing (NGS). However, massively-parallel NGS produces enormous amounts of data, which leads to storage, compatibility, scalability, and performance issues. The Cloud Computing and MapReduce framework, which utilizes hundreds or thousands of shared computers to map sequencing reads quickly and efficiently to reference genome sequences, appears to be a very promising solution for these issues. Consequently, it has been adopted by many organizations recently, and the initial results are very promising. However, since these are only initial steps toward this trend, the developed software does not provide adequate primary functions like bisulfite, pair-end mapping, etc., in on-site software such as RMAP or BS Seeker. In addition, existing MapReduce-based applications were not designed to process the long reads produced by the most recent second-generation and third-generation NGS instruments and, therefore, are inefficient. Last, it is difficult for a majority of biologists untrained in programming skills to use these tools because most were developed on Linux with a command line interface. To urge the trend of using Cloud technologies in genomics and prepare for advances in second- and third-generation DNA sequencing, we have built a Hadoop MapReduce-based application, CloudAligner, which achieves higher performance, covers most primary features, is more accurate, and has a user-friendly interface. It was also designed to be able to deal with long sequences. The performance gain of CloudAligner over Cloud-based counterparts (35 to 80%) mainly comes from the omission of the reduce phase. In comparison to local-based approaches, the performance gain of CloudAligner is from the partition and parallel processing of the huge reference genome as well as the reads. The source code of CloudAligner is available at http://cloudaligner.sourceforge.net/ and its web version is at http
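The map-only design that gives CloudAligner its speed advantage can be illustrated with a toy, pure-Python mapper: each mapper matches its batch of reads against a reference partition and emits final positions directly, so no reduce phase is needed. The exact-match search below is a hypothetical stand-in for the real alignment logic.

```python
# Sketch of a map-only alignment pass in the spirit of CloudAligner: the
# mapper emits final (read, position) results, so the reduce phase is omitted.
# Reference and reads are toy data, not real NGS input.

def map_reads(reference, reads):
    """Emit (read, position) pairs; -1 if the read does not occur."""
    return [(read, reference.find(read)) for read in reads]

reference_partition = "ACGTACGTTAGC"
reads = ["GTAC", "TTAG", "AAAA"]
for read, pos in map_reads(reference_partition, reads):
    print(read, pos)
```

In Hadoop terms, this corresponds to running the job with the number of reduce tasks set to zero, so mapper output is written straight to the output files.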

  6. Remote Sensing Based Two-Stage Sampling for Accuracy Assessment and Area Estimation of Land Cover Changes

    Directory of Open Access Journals (Sweden)

    Heinz Gallaun

    2015-09-01

    Full Text Available Land cover change processes are accelerating at the regional to global level. The remote sensing community has developed reliable and robust methods for wall-to-wall mapping of land cover changes; however, land cover changes often occur at rates below the mapping errors. In the current publication, we propose a cost-effective approach to complement wall-to-wall land cover change maps with a sampling approach, which is used for accuracy assessment and accurate estimation of areas undergoing land cover changes, including provision of confidence intervals. We propose a two-stage sampling approach in order to keep accuracy, efficiency, and effort of the estimations in balance. Stratification is applied in both stages in order to gain control over the sample size allocated to rare land cover change classes on the one hand and the cost constraints for very high resolution reference imagery on the other. Bootstrapping is used to complement the accuracy measures and the area estimates with confidence intervals. The area estimates and verification estimations rely on a high quality visual interpretation of the sampling units based on time series of satellite imagery. To demonstrate the cost-effective operational applicability of the approach we applied it for assessment of deforestation in an area characterized by frequent cloud cover and very low change rate in the Republic of Congo, which makes accurate deforestation monitoring particularly challenging.
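The bootstrapped confidence intervals for area estimates can be sketched as follows, with an illustrative low-change-rate sample rather than the study's actual reference data:

```python
import random

# Sketch: bootstrap confidence interval for a land-cover-change proportion
# estimated from interpreted sampling units. Values are illustrative.

def bootstrap_ci(sample, n_boot=2000, alpha=0.05, seed=42):
    rng = random.Random(seed)
    means = sorted(
        sum(rng.choices(sample, k=len(sample))) / len(sample)
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# 1 = unit interpreted as deforested, 0 = stable; a very low change rate,
# as in the Republic of Congo study area
units = [1] * 5 + [0] * 95
lo, hi = bootstrap_ci(units)
print(f"estimated change proportion: 0.05, 95% CI ({lo:.3f}, {hi:.3f})")
```

In the actual design the resampling would respect the two-stage stratification rather than treating units as a simple random sample.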

  7. Machine-based mapping of innovation portfolios

    NARCIS (Netherlands)

    de Visser, Matthias; Miao, Shengfa; Englebienne, Gwenn; Sools, Anna Maria; Visscher, Klaasjan

    2017-01-01

Machine learning techniques show great promise for improving innovation portfolio management. In this paper we experiment with different methods to classify innovation projects of a high-tech firm as either explorative or exploitative, and compare the results with a manual, theory-based mapping of

  8. Sample selection based on kernel-subclustering for the signal reconstruction of multifunctional sensors

    International Nuclear Information System (INIS)

    Wang, Xin; Wei, Guo; Sun, Jinwei

    2013-01-01

Signal reconstruction methods based on inverse modeling for multifunctional sensors have been widely studied in recent years. To improve the accuracy, the reconstruction methods have become more and more complicated because of the increase in the model parameters and sample points. However, there is another factor that affects the reconstruction accuracy, the position of the sample points, which has not been studied. A reasonable selection of the sample points could improve the signal reconstruction quality in at least two ways: improved accuracy with the same number of sample points or the same accuracy obtained with a smaller number of sample points. Both ways are valuable for improving the accuracy and decreasing the workload, especially for large batches of multifunctional sensors. In this paper, we propose a sample selection method based on kernel-subclustering to distill groupings of the sample data and produce a representation of the data set for inverse modeling. The method calculates the distance between two data points based on the kernel-induced distance instead of the conventional distance. The kernel function is a generalization of the distance metric by mapping the data that are non-separable in the original space into homogeneous groups in the high-dimensional space. The method obtained the best results compared with the other three methods in the simulation. (paper)
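A minimal sketch of the kernel-induced distance, assuming an RBF kernel (the abstract does not fix a specific kernel): for any kernel with k(x, x) = 1, the distance in the induced feature space reduces to sqrt(2 - 2 k(x, y)).

```python
import math

# Sketch: kernel-induced distance for sub-clustering sample points.
# With an RBF kernel k(x, y) = exp(-||x - y||^2 / (2 s^2)), the feature-space
# distance is d(x, y) = sqrt(k(x,x) + k(y,y) - 2 k(x,y)) = sqrt(2 - 2 k(x,y)).
# Kernel choice and bandwidth s are illustrative assumptions.

def rbf(x, y, s=1.0):
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq / (2 * s * s))

def kernel_distance(x, y, s=1.0):
    return math.sqrt(max(0.0, 2.0 - 2.0 * rbf(x, y, s)))

print(kernel_distance((0.0, 0.0), (0.0, 0.0)))  # 0.0 for identical points
print(kernel_distance((0.0, 0.0), (3.0, 4.0)))  # saturates near sqrt(2) for far points
```

The saturation at sqrt(2) is what makes the kernel-induced metric group distant outliers less aggressively than Euclidean distance.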

  9. An image-space parallel convolution filtering algorithm based on shadow map

    Science.gov (United States)

    Li, Hua; Yang, Huamin; Zhao, Jianping

    2017-07-01

Shadow mapping is commonly used in real-time rendering. In this paper, we present an accurate and efficient method of soft shadow generation from planar area lights. First, this method generates a depth map from the light's view and analyzes the depth-discontinuity areas as well as shadow boundaries. Then these areas are described as binary values in a texture map called the binary light-visibility map, and a GPU-based parallel convolution filtering algorithm is applied to smooth out the boundaries with a box filter. Experiments show that our algorithm is an effective shadow-map-based method that produces perceptually accurate soft shadows in real time, with more detail at shadow boundaries than previous works.
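The box-filter smoothing of the binary light-visibility map can be sketched serially (the paper runs this as a GPU-parallel convolution; the filter radius here is illustrative):

```python
# Sketch: smoothing a binary light-visibility map with a box filter.
# A serial stand-in for the GPU convolution described above.

def box_filter(grid, radius=1):
    h, w = len(grid), len(grid[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [
                grid[j][i]
                for j in range(max(0, y - radius), min(h, y + radius + 1))
                for i in range(max(0, x - radius), min(w, x + radius + 1))
            ]
            out[y][x] = sum(vals) / len(vals)  # average over the box neighborhood
    return out

visibility = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
smoothed = box_filter(visibility)
print(smoothed[1][1])  # the hard 0/1 boundary becomes a fractional penumbra value
```

The fractional values along the former 0/1 edge are exactly the soft-shadow penumbra the method is after.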

  10. Improving snow density estimation for mapping SWE with Lidar snow depth: assessment of uncertainty in modeled density and field sampling strategies in NASA SnowEx

    Science.gov (United States)

    Raleigh, M. S.; Smyth, E.; Small, E. E.

    2017-12-01

    The spatial distribution of snow water equivalent (SWE) is not sufficiently monitored with either remotely sensed or ground-based observations for water resources management. Recent applications of airborne Lidar have yielded basin-wide mapping of SWE when combined with a snow density model. However, in the absence of snow density observations, the uncertainty in these SWE maps is dominated by uncertainty in modeled snow density rather than in Lidar measurement of snow depth. Available observations tend to have a bias in physiographic regime (e.g., flat open areas) and are often insufficient in number to support testing of models across a range of conditions. Thus, there is a need for targeted sampling strategies and controlled model experiments to understand where and why different snow density models diverge. This will enable identification of robust model structures that represent dominant processes controlling snow densification, in support of basin-scale estimation of SWE with remotely-sensed snow depth datasets. The NASA SnowEx mission is a unique opportunity to evaluate sampling strategies of snow density and to quantify and reduce uncertainty in modeled snow density. In this presentation, we present initial field data analyses and modeling results over the Colorado SnowEx domain in the 2016-2017 winter campaign. We detail a framework for spatially mapping the uncertainty in snowpack density, as represented across multiple models. Leveraging the modular SUMMA model, we construct a series of physically-based models to assess systematically the importance of specific process representations to snow density estimates. We will show how models and snow pit observations characterize snow density variations with forest cover in the SnowEx domains. Finally, we will use the spatial maps of density uncertainty to evaluate the selected locations of snow pits, thereby assessing the adequacy of the sampling strategy for targeting uncertainty in modeled snow density.

  11. Two-dimensional T2 distribution mapping in rock core plugs with optimal k-space sampling.

    Science.gov (United States)

    Xiao, Dan; Balcom, Bruce J

    2012-07-01

Spin-echo single point imaging has been employed for 1D T2 distribution mapping, but a simple extension to 2D is challenging since the time increase is n-fold, where n is the number of pixels in the second dimension. Nevertheless, 2D T2 mapping in fluid-saturated rock core plugs is highly desirable because the bedding plane structure in rocks often results in different pore properties within the sample. The acquisition time can be improved by undersampling k-space. The cylindrical shape of rock core plugs yields well defined intensity distributions in k-space that may be efficiently determined by new k-space sampling patterns that are developed in this work. These patterns acquire 22.2% and 11.7% of the k-space data points. Companion density images may be employed, in a keyhole imaging sense, to improve image quality. T2-weighted images are fit to extract T2 distributions, pixel by pixel, employing an inverse Laplace transform. Images reconstructed with compressed sensing, with similar acceleration factors, are also presented. The results show that restricted k-space sampling, in this application, provides high quality results. Copyright © 2012 Elsevier Inc. All rights reserved.
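As a simplified stand-in for the pixel-by-pixel inverse Laplace transform (which recovers full T2 distributions), a single-exponential fit shows how a T2 value is recovered from T2-weighted intensities at several echo times, using synthetic data:

```python
import math

# Sketch: per-pixel T2 estimation from T2-weighted intensities.
# A mono-exponential log-linear fit stands in for the inverse Laplace
# transform used in the paper; the echo times and signal are synthetic.

def fit_t2(echo_times, signals):
    """Least-squares line through (t, ln S) gives S = S0 * exp(-t / T2)."""
    logs = [math.log(s) for s in signals]
    n = len(echo_times)
    mt = sum(echo_times) / n
    ml = sum(logs) / n
    slope = (sum((t - mt) * (l - ml) for t, l in zip(echo_times, logs))
             / sum((t - mt) ** 2 for t in echo_times))
    return -1.0 / slope  # T2 in the same units as the echo times

# Synthetic decay with T2 = 50 ms
times = [10.0, 20.0, 40.0, 80.0]
signal = [math.exp(-t / 50.0) for t in times]
print(fit_t2(times, signal))  # recovers 50.0 (ms)
```

A real rock pixel contains a distribution of T2 values, which is why the paper uses an inverse Laplace transform rather than a single-exponential fit.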

  12. Accounting for access costs in validation of soil maps

    NARCIS (Netherlands)

    Yang, Lin; Brus, Dick J.; Zhu, A.X.; Li, Xinming; Shi, Jingjing

    2018-01-01

    The quality of soil maps can best be estimated by collecting additional data at locations selected by probability sampling. These data can be used in design-based estimation of map quality measures such as the population mean of the squared prediction errors (MSE) for continuous soil maps and

  13. A BAC-based physical map of the Drosophila buzzatii genome

    Energy Technology Data Exchange (ETDEWEB)

Gonzalez, Josefa; Nefedov, Michael; Bosdet, Ian; Casals, Ferran; Calvete, Oriol; Delprat, Alejandra; Shin, Heesun; Chiu, Readman; Mathewson, Carrie; Wye, Natasja; Hoskins, Roger A.; Schein, Jacqueline E.; de Jong, Pieter; Ruiz, Alfredo

    2005-03-18

Large-insert genomic libraries facilitate cloning of large genomic regions, allow the construction of clone-based physical maps and provide useful resources for sequencing entire genomes. Drosophila buzzatii is a representative species of the repleta group in the Drosophila subgenus, which is being widely used as a model in studies of genome evolution, ecological adaptation and speciation. We constructed a Bacterial Artificial Chromosome (BAC) genomic library of D. buzzatii using the shuttle vector pTARBAC2.1. The library comprises 18,353 clones with an average insert size of 152 kb and a ~18X expected representation of the D. buzzatii euchromatic genome. We screened the entire library with six euchromatic gene probes and estimated the actual genome representation to be ~23X. In addition, we fingerprinted by restriction digestion and agarose gel electrophoresis a sample of 9,555 clones, and assembled them using Finger Printed Contigs (FPC) software and manual editing into 345 contigs (mean of 26 clones per contig) and 670 singletons. Finally, we anchored 181 large contigs (containing 7,788 clones) to the D. buzzatii salivary gland polytene chromosomes by in situ hybridization of 427 representative clones. The BAC library and a database with all the information regarding the high coverage BAC-based physical map described in this paper are available to the research community.

  14. Analysis of spatial distribution of land cover maps accuracy

    Science.gov (United States)

    Khatami, R.; Mountrakis, G.; Stehman, S. V.

    2017-12-01

Land cover maps have become one of the most important products of remote sensing science. However, classification errors will exist in any classified map and affect the reliability of subsequent map usage. Moreover, classification accuracy often varies over different regions of a classified map. These variations of accuracy will affect the reliability of subsequent analyses of different regions based on the classified maps. The traditional approach of map accuracy assessment based on an error matrix does not capture the spatial variation in classification accuracy. Here, per-pixel accuracy prediction methods are proposed based on interpolating accuracy values from a test sample to produce wall-to-wall accuracy maps. Different accuracy prediction methods were developed based on four factors: predictive domain (spatial versus spectral), interpolation function (constant, linear, Gaussian, and logistic), incorporation of class information (interpolating each class separately versus grouping them together), and sample size. This research is the first to use the spectral domain as an explanatory feature space for interpolating classification accuracy. Performance of the prediction methods was evaluated using 26 test blocks, with 10 km × 10 km dimensions, dispersed throughout the United States. The performance of the predictions was evaluated using the area under the curve (AUC) of the receiver operating characteristic. Relative to existing accuracy prediction methods, our proposed methods resulted in improvements of AUC of 0.15 or greater. Evaluation of the four factors comprising the accuracy prediction methods demonstrated that: i) interpolations should be done separately for each class instead of grouping all classes together; ii) if an all-classes approach is used, the spectral domain will result in substantially greater AUC than the spatial domain; iii) for the smaller sample size and per-class predictions, the spectral and spatial domain
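A spatial-domain accuracy interpolation of this kind can be sketched with inverse-distance weighting; the sample locations and correctness labels below are synthetic, and IDW is one plausible interpolation function, not necessarily among the constant/linear/Gaussian/logistic functions the authors compare:

```python
# Sketch: predicting per-pixel accuracy by interpolating 0/1 correctness
# values from a test sample over space. Locations and labels are synthetic.

def idw_accuracy(samples, px, py, power=2.0):
    """Inverse-distance-weighted accuracy prediction at pixel (px, py)."""
    num = den = 0.0
    for (x, y, correct) in samples:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0:
            return float(correct)  # exactly on a test-sample location
        w = 1.0 / d2 ** (power / 2)
        num += w * correct
        den += w
    return num / den

# test sample: (x, y, 1 if the map was correct there, else 0)
test_sample = [(0, 0, 1), (10, 0, 1), (0, 10, 0), (10, 10, 0)]
print(idw_accuracy(test_sample, 2.0, 2.0))  # pulled toward the nearby correct pixels
```

Running this over every pixel produces the wall-to-wall accuracy map the abstract describes; the spectral-domain variant interpolates over band values instead of coordinates.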

  15. Fuzzy Shannon Entropy: A Hybrid GIS-Based Landslide Susceptibility Mapping Method

    Directory of Open Access Journals (Sweden)

    Majid Shadman Roodposhti

    2016-09-01

Full Text Available Assessing Landslide Susceptibility Mapping (LSM) contributes to reducing the risk of living with landslides. Handling the vagueness associated with LSM is a challenging task. Here we show the application of a hybrid GIS-based LSM. The hybrid approach embraces fuzzy membership functions (FMFs) in combination with Shannon entropy, a well-known information-theory-based method. Nine landslide-related criteria, along with an inventory of landslides containing 108 recent and historic landslide points, are used to prepare a susceptibility map. A random split into training (≈70%) and testing (≈30%) samples is used for training and validation of the LSM model. The study area—Izeh—is located in the Khuzestan province of Iran, a highly susceptible landslide zone. The performance of the hybrid method is evaluated using receiver operating characteristic (ROC) curves in combination with the area under the curve (AUC). With an AUC of 0.934, the proposed hybrid method outperforms a previous study of the same dataset and study area that used extended fuzzy multi-criteria evaluation built on a subjective scheme of decision makers' evaluations, which achieved an AUC of 0.894.
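The Shannon-entropy weighting at the core of such hybrid methods can be sketched as follows; the criterion matrix is toy data, not the study's nine landslide criteria:

```python
import math

# Sketch: Shannon-entropy criterion weighting as used in entropy-based LSM.
# Rows are map units, columns are criterion values; the data is illustrative.

def entropy_weights(matrix):
    n_rows = len(matrix)
    n_cols = len(matrix[0])
    weights = []
    for j in range(n_cols):
        col = [row[j] for row in matrix]
        total = sum(col)
        probs = [v / total for v in col]
        # normalized Shannon entropy of the criterion's distribution
        h = -sum(p * math.log(p) for p in probs if p > 0) / math.log(n_rows)
        weights.append(1.0 - h)  # low entropy -> more discriminating criterion
    s = sum(weights)
    return [w / s for w in weights]

# Three map units x two criteria: the second criterion varies far more.
data = [[0.33, 0.9], [0.33, 0.05], [0.34, 0.05]]
w = entropy_weights(data)
print([round(x, 3) for x in w])  # nearly all weight goes to the variable criterion
```

These weights are then combined with the fuzzy membership values of each criterion to score susceptibility per map unit.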

  16. Spectrally based bathymetric mapping of a dynamic, sand‐bedded channel: Niobrara River, Nebraska, USA

    Science.gov (United States)

    Dilbone, Elizabeth; Legleiter, Carl; Alexander, Jason S.; McElroy, Brandon

    2018-01-01

    Methods for spectrally based mapping of river bathymetry have been developed and tested in clear‐flowing, gravel‐bed channels, with limited application to turbid, sand‐bed rivers. This study used hyperspectral images and field surveys from the dynamic, sandy Niobrara River to evaluate three depth retrieval methods. The first regression‐based approach, optimal band ratio analysis (OBRA), paired in situ depth measurements with image pixel values to estimate depth. The second approach used ground‐based field spectra to calibrate an OBRA relationship. The third technique, image‐to‐depth quantile transformation (IDQT), estimated depth by linking the cumulative distribution function (CDF) of depth to the CDF of an image‐derived variable. OBRA yielded the lowest depth retrieval mean error (0.005 m) and highest observed versus predicted R2 (0.817). Although misalignment between field and image data did not compromise the performance of OBRA in this study, poor georeferencing could limit regression‐based approaches such as OBRA in dynamic, sand‐bedded rivers. Field spectroscopy‐based depth maps exhibited a mean error with a slight shallow bias (0.068 m) but provided reliable estimates for most of the study reach. IDQT had a strong deep bias but provided informative relative depth maps. Overprediction of depth by IDQT highlights the need for an unbiased sampling strategy to define the depth CDF. Although each of the techniques we tested demonstrated potential to provide accurate depth estimates in sand‐bed rivers, each method also was subject to certain constraints and limitations.
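OBRA's core idea, regressing depth against the log of a band ratio and keeping the best-performing band pair, can be sketched for a single pair with synthetic calibration data:

```python
import math

# Sketch: optimal band ratio analysis (OBRA) in miniature. Depth is regressed
# against X = ln(b1 / b2) for each band pair; the pair maximizing R^2 is kept.
# Band values and depths below are synthetic, not the Niobrara data.

def fit(xs, ys):
    """Ordinary least squares: returns intercept, slope, and R^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# depth grows with ln(green/red) in this toy calibration set
green = [120.0, 100.0, 80.0, 60.0]
red = [100.0, 70.0, 40.0, 20.0]
depth = [0.2, 0.5, 1.0, 1.6]
x = [math.log(g / r) for g, r in zip(green, red)]
a, b, r2 = fit(x, depth)
print(round(r2, 3))  # how well this band pair explains depth
```

In full OBRA, this fit is repeated over all band pairs of the hyperspectral image and the pair with the highest R^2 defines the depth-retrieval relation.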

  17. An Effective NoSQL-Based Vector Map Tile Management Approach

    Directory of Open Access Journals (Sweden)

    Lin Wan

    2016-11-01

Full Text Available Within a digital map service environment, the rapid growth of Spatial Big-Data is driving new requirements for effective mechanisms for massive online vector map tile processing. The emergence of Not Only SQL (NoSQL) databases has resulted in a new data storage and management model for scalable spatial data deployments and fast tracking. They better suit the scenario of high-volume, low-latency network map services than traditional standalone high-performance computers (HPC) or relational databases. In this paper, we propose a flexible storage framework that provides feasible methods for tiled map data parallel clipping and retrieval operations within a distributed NoSQL database environment. We illustrate the parallel vector tile generation and querying algorithms with the MapReduce programming model. Three different processing approaches, including local caching, distributed file storage, and the NoSQL-based method, are compared by analyzing the concurrent load and calculation time. An online geological vector tile map service prototype was developed to embed our processing framework in the China Geological Survey Information Grid. Experimental results show that our NoSQL-based parallel tile management framework can support applications that process huge volumes of vector tile data and improve performance of the tiled map service.

  18. Chaotic maps-based password-authenticated key agreement using smart cards

    Science.gov (United States)

    Guo, Cheng; Chang, Chin-Chen

    2013-06-01

    Password-based authenticated key agreement using smart cards has been widely and intensively researched. Inspired by the semi-group property of Chebyshev maps and key agreement protocols based on chaotic maps, we proposed a novel chaotic maps-based password-authenticated key agreement protocol with smart cards. In our protocol, we avoid modular exponential computing or scalar multiplication on elliptic curve used in traditional authenticated key agreement protocols using smart cards. Our analysis shows that our protocol has comprehensive characteristics and can withstand attacks, including the insider attack, replay attack, and others, satisfying essential security requirements. Performance analysis shows that our protocol can refrain from consuming modular exponential computing and scalar multiplication on an elliptic curve. The computational cost of our protocol compared with related protocols is acceptable.
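The semi-group property these protocols exploit, T_r(T_s(x)) = T_s(T_r(x)) = T_{rs}(x), can be verified numerically; the degrees and seed below are toy values, and a real protocol works over large parameters with additional safeguards:

```python
# Sketch: the semi-group property of Chebyshev polynomials that underpins
# chaotic-map key agreement. Each party applies its secret degree to the
# other's public value and both arrive at the same shared secret.

def chebyshev(n, x):
    """Evaluate T_n(x) via the recurrence T_n = 2x T_{n-1} - T_{n-2}."""
    t0, t1 = 1.0, x
    if n == 0:
        return t0
    for _ in range(n - 1):
        t0, t1 = t1, 2 * x * t1 - t0
    return t1

x = 0.3         # public seed
r, s = 5, 7     # toy private "keys" of the two parties
shared_a = chebyshev(r, chebyshev(s, x))  # party A: T_r(T_s(x))
shared_b = chebyshev(s, chebyshev(r, x))  # party B: T_s(T_r(x))
print(round(shared_a, 6), round(shared_b, 6))  # both equal T_{35}(x)
```

Computing T_n(x) costs only the linear recurrence above, which is why such schemes avoid the modular exponentiation and elliptic-curve scalar multiplication mentioned in the abstract.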

  19. Use of GIS-Based Sampling to Inform Food Security Assessments and Decision Making in Kenya

    Science.gov (United States)

    Wahome, A.; Ndubi, A. O.; Ndungu, L. W.; Mugo, R. M.; Flores Cordova, A. I.

    2017-12-01

Kenya relies on agricultural production for supporting local consumption and other processing value chains. With a changing climate in a rain-fed agricultural production system, cropping zones are shifting, and proper decision making will require updated data. Where up-to-date data are not available, it is important that they are generated and passed to relevant stakeholders to inform their decision making. The process of generating these data should be cost effective and less time consuming. The Kenyan State Department of Agriculture (SDA) runs an insurance programme for maize farmers in a number of counties in Kenya. Previously, SDA was using a list of farmers to identify the crop fields for this insurance programme. However, listing all farmers in each Unit Area of Insurance (UAI) proved to be tedious and very costly, hence the need for an alternative but still acceptable sampling methodology. Building on existing cropland maps, SERVIR, a joint NASA-USAID initiative that brings Earth observations (EO) to improve environmental decision making in developing countries, through its hub in Eastern and Southern Africa, developed a high-resolution map based on 10 m Sentinel satellite images, from which a GIS-based sampling frame for identifying maize fields was developed. Sampling points were randomly generated in each UAI and navigated to using hand-held GPS units to identify maize farmers. With GIS-based identification of farmers, SDA covers in 1 day an area that took 1 week with list-based identification. Similarly, SDA spends approximately 3,000 USD per sub-county to locate maize fields using GIS-based sampling, compared with the 10,000 USD it used to spend before. This has resulted in a 70% cost reduction.

  20. Selecting the optimum plot size for a California design-based stream and wetland mapping program.

    Science.gov (United States)

    Lackey, Leila G; Stein, Eric D

    2014-04-01

Accurate estimates of the extent and distribution of wetlands and streams are the foundation of wetland monitoring, management, restoration, and regulatory programs. Traditionally, these estimates have relied on comprehensive mapping. However, this approach is prohibitively resource-intensive over large areas, making it both impractical and statistically unreliable. Probabilistic (design-based) approaches to evaluating status and trends provide a more cost-effective alternative because, compared with comprehensive mapping, overall extent is inferred from mapping a statistically representative, randomly selected subset of the target area. In this type of design, the size of sample plots has a significant impact on program costs and on statistical precision and accuracy; however, no consensus exists on the appropriate plot size for remote monitoring of stream and wetland extent. This study utilized simulated sampling to assess the performance of four plot sizes (1, 4, 9, and 16 km²) for three geographic regions of California. Simulation results showed smaller plot sizes (1 and 4 km²) were most efficient for achieving desired levels of statistical accuracy and precision. However, larger plot sizes were more likely to contain rare and spatially limited wetland subtypes. Balancing these considerations led to selection of 4 km² for the California status and trends program.

  1. Generalized double-humped logistic map-based medical image encryption

    Directory of Open Access Journals (Sweden)

    Samar M. Ismail

    2018-03-01

Full Text Available This paper presents the design of the generalized Double Humped (DH) logistic map, used for pseudo-random number key generation (PRNG). The generalized parameter added to the map provides more control over the map's chaotic range. A new special map with a zooming effect of the bifurcation diagram is obtained by manipulating the generalization parameter value. The dynamic behavior of the generalized map is analyzed, including the study of the fixed points and stability ranges, the Lyapunov exponent, and the complete bifurcation diagram. The option of designing any specific map is made possible through changing the general parameter, increasing the randomness and controllability of the map. An image encryption algorithm is introduced based on pseudo-random sequence generation using the proposed generalized DH map, offering secure communication transfer of medical MRI and X-ray images. Security analyses are carried out to consolidate system efficiency, including key sensitivity and key-space analyses, histogram analysis, correlation coefficients, MAE, NPCR and UACI calculations. System robustness against noise attacks has been proved, along with the NIST test ensuring the system efficiency. A comparison of the proposed system with previous works is presented.
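A chaotic-map keystream generator can be sketched with the standard logistic map standing in for the generalized double-humped map (whose exact form and parameters are not reproduced here); the quantization to key bytes is likewise illustrative:

```python
# Sketch: chaotic-map PRNG keystream in the spirit of the paper. The standard
# logistic map x -> a x (1 - x) stands in for the generalized DH map;
# parameter a, burn-in, and byte quantization are illustrative assumptions.

def keystream(seed, a=3.99, n=8, burn_in=100):
    x = seed
    for _ in range(burn_in):            # discard the transient
        x = a * x * (1 - x)
    out = []
    for _ in range(n):
        x = a * x * (1 - x)
        out.append(int(x * 256) % 256)  # quantize the state to a key byte
    return out

k1 = keystream(0.123456789)
k2 = keystream(0.123456790)  # tiny seed change -> entirely different keystream
print(k1)
print(k2)
```

The divergence of k1 and k2 under a 1e-9 seed perturbation is the key-sensitivity property the security analysis measures.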

  2. Design of an image encryption scheme based on a multiple chaotic map

    Science.gov (United States)

    Tong, Xiao-Jun

    2013-07-01

In order to solve the problems that chaos degenerates under limited computer precision and that the Cat map has a small key space, this paper presents a chaotic map based on topological conjugacy, whose chaotic characteristics are proved by the Devaney definition. In order to produce a large key space, a Cat map named the block Cat map is also designed for the permutation process, based on multiple-dimensional chaotic maps. The image encryption algorithm is based on permutation-substitution, and each key is controlled by a different chaotic map. Entropy analysis, differential analysis, weak-key analysis, statistical analysis, cipher randomness analysis, and cipher sensitivity analysis depending on key and plaintext are introduced to test the security of the new image encryption scheme. Through comparison of the proposed scheme with the AES, DES and Logistic encryption methods, we conclude that the image encryption method solves the problem of the low precision of one-dimensional chaotic functions and has higher speed and higher security.
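The permutation stage can be illustrated with the classic Arnold cat map on an N x N image; the paper's block Cat map generalizes this construction to enlarge the key space:

```python
# Sketch: the cat-map pixel permutation step of a permutation-substitution
# cipher. The classic Arnold cat map (x, y) -> (x + y, x + 2y) mod N is a
# bijection (its matrix has determinant 1), so no pixel value is lost.

def cat_map(image):
    n = len(image)
    out = [[0] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            out[(x + y) % n][(x + 2 * y) % n] = image[y][x]
    return out

img = [[y * 4 + x for x in range(4)] for y in range(4)]
scrambled = cat_map(img)
print(scrambled)  # same 16 pixel values, in scrambled positions
```

The substitution stage then alters the pixel values themselves, typically by mixing them with a chaotic keystream.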

  3. Creating soil moisture maps based on radar satellite imagery

    Science.gov (United States)

    Hnatushenko, Volodymyr; Garkusha, Igor; Vasyliev, Volodymyr

    2017-10-01

The presented work relates to a study of mapping soil moisture based on radar data from Sentinel-1 and a test of the adequacy of models constructed on the basis of data obtained from alternative sources. Radar signals are reflected from the ground differently, depending on its properties. In radar images obtained, for example, in the C band of the electromagnetic spectrum, soils saturated with moisture usually appear in dark tones. Although, at first glance, the problem of constructing moisture maps based on radar data seems intuitively clear, its implementation on the basis of Sentinel-1 data on an industrial scale and in the public domain is not yet available. In the mapping process, measurements of soil moisture obtained from logs of the NOAA US Climate Reference Network (USCRN) of climate stations were used to verify the results. This network covers almost the entire territory of the United States. Data from the passive microwave radiometers of the Aqua and SMAP satellites were used for comparison. In addition, other supplementary cartographic materials were used, such as maps of soil types and existing moisture maps. The paper compares how certain methods of degrading the quality of the radar data affect the resulting moisture maps. Regression models were constructed showing the dependence of backscatter coefficient (Sigma0) values, for calibrated radar data of different spatial resolutions obtained at different times, on soil moisture values. The resulting soil moisture maps of the study territories, as well as conceptual solutions for automating the construction of such digital maps, are presented. A comparative assessment was carried out of the time required to process a given set of radar scenes with the developed tools and with the ESA SNAP product.

  4. Mapping specific soil functions based on digital soil property maps

    Science.gov (United States)

    Pásztor, László; Fodor, Nándor; Farkas-Iványi, Kinga; Szabó, József; Bakacsi, Zsófia; Koós, Sándor

    2016-04-01

    Quantifying soil functions and services is a great challenge in itself, even before their spatial relevance is identified and regionalized. Proxies and indicators are widely used in ecosystem service mapping, and soil services can likewise be approximated by elementary soil features. One solution takes the association of soil types with services as its basic principle. Soil property maps, however, provide quantified spatial information, which can be used more versatilely for the spatial inference of soil functions and services. In the frame of the activities referred to as "Digital, Optimized, Soil Related Maps and Information in Hungary" (DOSoReMI.hu), numerous soil property maps have been compiled so far with proper DSM techniques, partly according to GSM.net specifications and partly by slightly or more strictly changing some of its predefined parameters (depth intervals, pixel size, property, etc.). The elaborated maps have been further utilized, since DOSoReMI.hu was intended to take steps toward the regionalization of higher-level soil information (secondary properties, functions, services). In the meantime, the recently started AGRAGIS project requested spatial soil-related information in order to estimate the agri-environmental impacts of climate change and support the associated vulnerability assessment. One of the most vulnerable services of soils in the context of climate change is their provisioning service. In our work it was approximated by productivity, which was estimated by sequential, scenario-based crop modelling. The modelling took into consideration long-term (50-year) time series of both measured and predicted climatic parameters and accounted for potential differences in agricultural practice and crop production. The flexible parametrization and multiple results of the modelling were then applied for the spatial assessment of sensitivity, vulnerability, exposure and adaptive capacity of soils in the context of the forecasted changes in

  5. Generalized logistic map and its application in chaos based cryptography

    Science.gov (United States)

    Lawnik, M.

    2017-12-01

    The logistic map is commonly used in, for example, chaos-based cryptography. However, its properties do not permit a safe construction of encryption algorithms. Thus, the scope of the paper is a proposed generalization of the logistic map by means of a well-recognized family of chaotic maps. In the next step, the Lyapunov exponent and the distribution of the iterative variable are analyzed. The obtained results confirm that the analyzed model can safely and effectively replace the classic logistic map in applications involving chaotic cryptography.
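
The Lyapunov-exponent analysis mentioned in this record can be illustrated on the classic (ungeneralized) logistic map, where a positive exponent indicates chaos. This is a generic sketch, not the paper's generalized map:

```python
import math

# Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
# by averaging log|f'(x)| = log|r*(1 - 2x)| along an orbit.
def lyapunov_logistic(r, x0=0.3, n_transient=1000, n_iter=10000):
    x = x0
    for _ in range(n_transient):          # discard transient behaviour
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        x = r * x * (1.0 - x)
        total += math.log(abs(r * (1.0 - 2.0 * x)))
    return total / n_iter

# At r = 4 the map is fully chaotic; the exact exponent is ln 2 ~ 0.693.
# At r = 3.2 the orbit settles onto a stable 2-cycle (negative exponent).
print(lyapunov_logistic(4.0), lyapunov_logistic(3.2))
```

The same orbit-averaging procedure applies to any one-dimensional chaotic map once its derivative is known, which is how such generalizations are typically assessed.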

  6. Spectrally based mapping of riverbed composition

    Science.gov (United States)

    Legleiter, Carl; Stegman, Tobin K.; Overstreet, Brandon T.

    2016-01-01

    Remote sensing methods provide an efficient means of characterizing fluvial systems. This study evaluated the potential to map riverbed composition based on in situ and/or remote measurements of reflectance. Field spectra and substrate photos from the Snake River, Wyoming, USA, were used to identify different sediment facies and degrees of algal development and to quantify their optical characteristics. We hypothesized that accounting for the effects of depth and water column attenuation to isolate the reflectance of the streambed would enhance distinctions among bottom types and facilitate substrate classification. A bottom reflectance retrieval algorithm adapted from coastal research yielded realistic spectra for the 450 to 700 nm range; but bottom reflectance-based substrate classifications, generated using a random forest technique, were no more accurate than classifications derived from above-water field spectra. Additional hypothesis testing indicated that a combination of reflectance magnitude (brightness) and indices of spectral shape provided the most accurate riverbed classifications. Convolving field spectra to the response functions of a multispectral satellite and a hyperspectral imaging system did not reduce classification accuracies, implying that high spectral resolution was not essential. Supervised classifications of algal density produced from hyperspectral data and an inferred bottom reflectance image were not highly accurate, but unsupervised classification of the bottom reflectance image revealed distinct spectrally based clusters, suggesting that such an image could provide additional river information. We attribute the failure of bottom reflectance retrieval to yield more reliable substrate maps to a latent correlation between depth and bottom type. Accounting for the effects of depth might have eliminated a key distinction among substrates and thus reduced discriminatory power. Although further, more systematic study across a broader

  7. A consensus linkage map of lentil based on DArT markers from three RIL mapping populations.

    Directory of Open Access Journals (Sweden)

    Duygu Ates

    Full Text Available Lentil (Lens culinaris ssp. culinaris Medikus) is a diploid (2n = 2x = 14), self-pollinating grain legume with a haploid genome size of about 4 Gbp; it is grown throughout the world, with a current annual production of 4.9 million tonnes. A consensus map of lentil (Lens culinaris ssp. culinaris Medikus) was constructed using three different lentil recombinant inbred line (RIL) populations: "CDC Redberry" x "ILL7502" (LR8), "ILL8006" x "CDC Milestone" (LR11) and "PI320937" x "Eston" (LR39). The lentil consensus map was composed of 9,793 DArT markers, covered a total of 977.47 cM with an average distance of 0.10 cM between adjacent markers, and formed 7 linkage groups representing the 7 chromosomes of the lentil genome. The consensus map had no gap larger than 12.67 cM, and only 5 gaps were between 6.0 cM and 12.67 cM (on LG3 and LG4). The localization of the SNP markers on the lentil consensus map was in general consistent with their localization on the three individual genetic linkage maps, and the consensus map has a longer map length, higher marker density and shorter average distance between adjacent markers than the component linkage maps. This high-density consensus map could provide insight into the lentil genome. It could also help to construct a physical map using a Bacterial Artificial Chromosome library and support map-based cloning studies. Sequence information from DArT markers may help localize and orient scaffolds from Next Generation Sequencing data.

  8. A consensus linkage map of lentil based on DArT markers from three RIL mapping populations.

    Science.gov (United States)

    Ates, Duygu; Aldemir, Secil; Alsaleh, Ahmad; Erdogmus, Semih; Nemli, Seda; Kahriman, Abdullah; Ozkan, Hakan; Vandenberg, Albert; Tanyolac, Bahattin

    2018-01-01

    Lentil (Lens culinaris ssp. culinaris Medikus) is a diploid (2n = 2x = 14), self-pollinating grain legume with a haploid genome size of about 4 Gbp; it is grown throughout the world, with a current annual production of 4.9 million tonnes. A consensus map of lentil (Lens culinaris ssp. culinaris Medikus) was constructed using three different lentil recombinant inbred line (RIL) populations: "CDC Redberry" x "ILL7502" (LR8), "ILL8006" x "CDC Milestone" (LR11) and "PI320937" x "Eston" (LR39). The lentil consensus map was composed of 9,793 DArT markers, covered a total of 977.47 cM with an average distance of 0.10 cM between adjacent markers, and formed 7 linkage groups representing the 7 chromosomes of the lentil genome. The consensus map had no gap larger than 12.67 cM, and only 5 gaps were between 6.0 cM and 12.67 cM (on LG3 and LG4). The localization of the SNP markers on the lentil consensus map was in general consistent with their localization on the three individual genetic linkage maps, and the consensus map has a longer map length, higher marker density and shorter average distance between adjacent markers than the component linkage maps. This high-density consensus map could provide insight into the lentil genome. It could also help to construct a physical map using a Bacterial Artificial Chromosome library and support map-based cloning studies. Sequence information from DArT markers may help localize and orient scaffolds from Next Generation Sequencing data.
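
As a quick arithmetic check of the reported map statistics, 9,793 markers spanning 977.47 cM do give the stated average adjacent-marker spacing of about 0.10 cM:

```python
# Reported consensus-map figures: total length and marker count.
total_cm = 977.47
n_markers = 9793

# Average spacing over the n_markers - 1 adjacent intervals
# (dividing by n_markers instead changes the result only in the
# fourth decimal place; both round to 0.10 cM).
avg_spacing = total_cm / (n_markers - 1)
print(round(avg_spacing, 2))  # prints 0.1
```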

  9. Sensor fusion-based map building for mobile robot exploration

    International Nuclear Information System (INIS)

    Ribo, M.

    2000-01-01

    To carry out exploration tasks in unknown or partially unknown environments, a mobile robot needs to acquire and maintain models of its environment. In doing so, several sensors of the same nature and/or heterogeneous sensor configurations may be used by the robot to achieve reliable performance. However, this in turn poses the problem of sensor fusion-based map building: how to interpret, combine and integrate sensory information in order to build a proper representation of the environment. Specifically, the goal of this thesis is to investigate integration algorithms for Occupancy Grid (OG) based map building using odometry, ultrasonic rangefinders, and stereo vision. Three different uncertainty calculi used for sensor fusion-based map building are presented, based on probability theory, the Dempster-Shafer theory of evidence, and fuzzy set theory. In addition, two different sensor models, used to translate sensing data into range information, are described. Experimental examples of OGs built from real data recorded by two robots in an office-like environment are presented; they show the feasibility of the proposed approach for building both sonar-based and vision-based OGs. A comparison among the presented uncertainty calculi is performed in a sonar-based framework. Finally, the fusion of both sonar and visual information based on fuzzy set theory is described. (author)
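
The probability-theoretic uncertainty calculus named in this record is commonly realized as a log-odds occupancy-grid update. A minimal sketch of that update for a single cell follows; the function names and measurement probabilities are illustrative, not taken from the thesis:

```python
import math

def logodds(p):
    """Convert an occupancy probability to log-odds form."""
    return math.log(p / (1.0 - p))

def update_cell(l_prior, p_meas):
    """Bayesian log-odds update of one grid cell for one measurement:
    additions in log-odds space correspond to multiplying likelihoods."""
    return l_prior + logodds(p_meas)

def probability(l):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

# A cell reported occupied twice (inverse sensor model p = 0.7) and
# free once (p = 0.3), starting from an uninformative prior p = 0.5:
l = 0.0
for p in (0.7, 0.7, 0.3):
    l = update_cell(l, p)
print(round(probability(l), 3))
```

One contradictory reading exactly cancels one confirming reading in log-odds space, which is why this representation is the usual choice for fusing many range measurements per cell.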

  10. A BAC/BIBAC-based physical map of chickpea, Cicer arietinum L

    Directory of Open Access Journals (Sweden)

    Abbo Shahal

    2010-09-01

    Full Text Available Abstract Background Chickpea (Cicer arietinum L.) is the third most important pulse crop worldwide. Despite its importance, relatively little is known about its genome. The availability of a genome-wide physical map allows rapid fine mapping of QTL, development of high-density genome maps, and sequencing of the entire genome. However, no such physical map has been developed in chickpea. Results We present a genome-wide, BAC/BIBAC-based physical map of chickpea developed by fingerprint analysis. Four chickpea BAC and BIBAC libraries, two of which were constructed in this study, were used. A total of 67,584 clones were fingerprinted, and 64,211 (~11.7x) of the fingerprints were validated and used in the physical map assembly. The physical map consists of 1,945 BAC/BIBAC contigs, each containing an average of 28.3 clones and having an average physical length of 559 kb. The contigs collectively span approximately 1,088 Mb. Using the physical map, we identified the BAC/BIBAC contigs containing or closely linked to QTL4.1 for resistance to Didymella rabiei (RDR) and QTL8 for days to first flower (DTF), further verifying the physical map and confirming its utility in fine mapping and cloning of QTL. Conclusion The physical map represents the first genome-wide, BAC/BIBAC-based physical map of chickpea. This map, along with other genomic resources previously developed in the species and the genome sequences of related species (soybean, Medicago and Lotus), will provide a foundation necessary for many areas of advanced genomics research in chickpea and other legume species. The inclusion of transformation-ready BIBACs in the map greatly facilitates its utility in functional analysis of the legume genomes.

  11. Clinical concept mapping: Does it improve discipline-based critical thinking of nursing students?

    Science.gov (United States)

    Moattari, Marzieh; Soleimani, Sara; Moghaddam, Neda Jamali; Mehbodi, Farkhondeh

    2014-01-01

    Background: Enhancing nursing students’ critical thinking is a challenge faced by nurse educators. This study aimed at determining the effect of clinical concept mapping on the discipline-based critical thinking of nursing students. Materials and Methods: In this quasi-experimental post-test-only design, a convenience sample of 4th-year nursing students (N = 32) participated. They were randomly divided into two groups. The experimental group participated in a 1-day workshop on clinical concept mapping and was also assigned to construct at least two clinical concept maps during clinical practice. The post-test used a specially designed package of vignettes measuring 17 dimensions of critical thinking in nursing under two categories: cognitive critical thinking skills and habits of mind. Students were required to write about how they would use a designated critical thinking skill or habit of mind to accomplish the nursing actions. The students’ responses were evaluated based on identification of critical thinking, justification, and quality of the response. The mean scores of the two groups were compared by the Mann-Whitney test using SPSS version 16.5. Results: The results of the study revealed a significant difference between the two groups’ critical thinking regarding identification, justification, and quality of responses, as well as overall critical thinking scores, cognitive thinking skills, and habits of mind. The two groups also differed significantly from each other in 11 out of 17 dimensions of critical thinking. Conclusion: Clinical concept mapping is a valuable strategy for improving the critical thinking of nursing students. However, further studies are recommended to generalize this result to nursing students at earlier stages of education. PMID:24554963

  12. Clinical concept mapping: Does it improve discipline-based critical thinking of nursing students?

    Science.gov (United States)

    Moattari, Marzieh; Soleimani, Sara; Moghaddam, Neda Jamali; Mehbodi, Farkhondeh

    2014-01-01

    Enhancing nursing students' critical thinking is a challenge faced by nurse educators. This study aimed at determining the effect of clinical concept mapping on the discipline-based critical thinking of nursing students. In this quasi-experimental post-test-only design, a convenience sample of 4th-year nursing students (N = 32) participated. They were randomly divided into two groups. The experimental group participated in a 1-day workshop on clinical concept mapping and was also assigned to construct at least two clinical concept maps during clinical practice. The post-test used a specially designed package of vignettes measuring 17 dimensions of critical thinking in nursing under two categories: cognitive critical thinking skills and habits of mind. Students were required to write about how they would use a designated critical thinking skill or habit of mind to accomplish the nursing actions. The students' responses were evaluated based on identification of critical thinking, justification, and quality of the response. The mean scores of the two groups were compared by the Mann-Whitney test using SPSS version 16.5. The results of the study revealed a significant difference between the two groups' critical thinking regarding identification, justification, and quality of responses, as well as overall critical thinking scores, cognitive thinking skills, and habits of mind. The two groups also differed significantly from each other in 11 out of 17 dimensions of critical thinking. Clinical concept mapping is a valuable strategy for improving the critical thinking of nursing students. However, further studies are recommended to generalize this result to nursing students at earlier stages of education.

  13. Small-angle X-ray scattering tensor tomography: model of the three-dimensional reciprocal-space map, reconstruction algorithm and angular sampling requirements.

    Science.gov (United States)

    Liebi, Marianne; Georgiadis, Marios; Kohlbrecher, Joachim; Holler, Mirko; Raabe, Jörg; Usov, Ivan; Menzel, Andreas; Schneider, Philipp; Bunk, Oliver; Guizar-Sicairos, Manuel

    2018-01-01

    Small-angle X-ray scattering tensor tomography, which allows reconstruction of the local three-dimensional reciprocal-space map within a three-dimensional sample as introduced by Liebi et al. [Nature (2015), 527, 349-352], is described in more detail with regard to the mathematical framework and the optimization algorithm. For the case of trabecular bone samples from vertebrae it is shown that the model of the three-dimensional reciprocal-space map using spherical harmonics can adequately describe the measured data. The method enables the determination of nanostructure orientation and degree of orientation as demonstrated previously in a single momentum transfer q range. This article presents a reconstruction of the complete reciprocal-space map for the case of bone over extended ranges of q. In addition, it is shown that uniform angular sampling and advanced regularization strategies help to reduce the amount of data required.

  14. Statistical methods in physical mapping

    International Nuclear Information System (INIS)

    Nelson, D.O.

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and in past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.

  15. Statistical methods in physical mapping

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, David O. [Univ. of California, Berkeley, CA (United States)

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and in past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.

  16. Managing mapping data using commercial data base management software.

    Science.gov (United States)

    Elassal, A.A.

    1985-01-01

    Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine-readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper. -Author

  17. Integrating collaborative concept mapping in case based learning

    Directory of Open Access Journals (Sweden)

    Alfredo Tifi

    2013-03-01

    Full Text Available The different significance of collaborative concept mapping and collaborative argumentation in Case Based Learning is discussed and compared from the perspectives of answering focus questions, fostering reflective thinking skills, and managing uncertainty in problem solving in a scaffolded environment. Marked differences are pointed out between the way concepts are used in constructing concept maps and the way meanings are adopted in case based learning through guided argumentation activities. Shared concept maps should be given different scopes, for example: (a) as an advance organizer, preparing a background system of concepts that will undergo transformation while accompanying the inquiry activities on case studies or problems; (b) together with narratives, to enhance awareness of the situated epistemologies that are entailed in choosing certain concepts during more complex case studies; and (c) for after-learning construction of a holistic vision of the whole domain by means of the most inclusive concepts. Meanwhile, scaffolded collaborative writing of narratives and arguments in describing and treating cases could better serve as a source of situated tools to create and refine meanings for particular concepts.

  18. Smartphone-Based Mobile Detection Platform for Molecular Diagnostics and Spatiotemporal Disease Mapping.

    Science.gov (United States)

    Song, Jinzhao; Pandian, Vikram; Mauk, Michael G; Bau, Haim H; Cherry, Sara; Tisi, Laurence C; Liu, Changchun

    2018-04-03

    Rapid and quantitative molecular diagnostics in the field, at home, and at remote clinics is essential for evidence-based disease management, control, and prevention. Conventional molecular diagnostics requires extensive sample preparation, relatively sophisticated instruments, and trained personnel, restricting its use to centralized laboratories. To overcome these limitations, we designed a simple, inexpensive, hand-held, smartphone-based mobile detection platform, dubbed "smart-connected cup" (SCC), for rapid, connected, and quantitative molecular diagnostics. Our platform combines bioluminescent assay in real-time and loop-mediated isothermal amplification (BART-LAMP) technology with smartphone-based detection, eliminating the need for an excitation source and optical filters that are essential in fluorescent-based detection. The incubation heating for the isothermal amplification is provided, electricity-free, with an exothermic chemical reaction, and incubation temperature is regulated with a phase change material. A custom Android App was developed for bioluminescent signal monitoring and analysis, target quantification, data sharing, and spatiotemporal mapping of disease. SCC's utility is demonstrated by quantitative detection of Zika virus (ZIKV) in urine and saliva and HIV in blood within 45 min. We demonstrate SCC's connectivity for disease spatiotemporal mapping with a custom-designed website. Such a smart- and connected-diagnostic system does not require any lab facilities and is suitable for use at home, in the field, in the clinic, and particularly in resource-limited settings in the context of Internet of Medical Things (IoMT).

  19. GIS-based interactive tool to map the advent of world conquerors

    Science.gov (United States)

    Lakkaraju, Mahesh

    The objective of this thesis is to show the scale and extent of some of the greatest empires the world has ever seen. It is a hybrid project combining a GIS-based interactive tool with a web-based JavaScript tool. This approach lets students learn effectively about the emperors themselves while understanding how long their empires lasted and how far they spread. In the GIS-based tool, a map is displayed with various points on it; when a user clicks on a point, the relevant information about what happened at that particular place is displayed. Users can also select the interactive animation button and walk through a set of battles in chronological order. This tool uses Java as the main programming language together with MOJO (Map Objects Java Objects) provided by ESRI. MOJO is very effective because its GIS-related features can be included directly in the application. The application is a simple tool developed for university or high-school level students. D3.js is an interactive animation and visualization library built on the JavaScript framework. Though HTML5, CSS3, JavaScript and SVG animations can be used to build custom animations, D3.js produces results with less effort and more ease of use, and has hence become a widely sought-after visualization tool for many applications. D3.js provides map-based visualization features so that text-based data can easily be displayed in a map-based interface. To draw the map and the points on it, D3.js uses data rendered in TopoJSON format. Latitudes and longitudes can be provided, which are interpolated into the map SVG. One of the main advantages of this approach is that more information is retained when a visual medium is used.

  20. A sampling and metagenomic sequencing-based methodology for monitoring antimicrobial resistance in swine herds

    DEFF Research Database (Denmark)

    Munk, Patrick; Dalhoff Andersen, Vibe; de Knegt, Leonardo

    2016-01-01

    Objectives Reliable methods for monitoring antimicrobial resistance (AMR) in livestock and other reservoirs are essential to understand the trends, transmission and importance of agricultural resistance. Quantification of AMR is mostly done using culture-based techniques, but metagenomic read...... mapping shows promise for quantitative resistance monitoring. Methods We evaluated the ability of: (i) MIC determination for Escherichia coli; (ii) cfu counting of E. coli; (iii) cfu counting of aerobic bacteria; and (iv) metagenomic shotgun sequencing to predict expected tetracycline resistance based...... cultivation-based techniques in terms of predicting expected tetracycline resistance based on antimicrobial consumption. Our metagenomic approach had sufficient resolution to detect antimicrobial-induced changes to individual resistance gene abundances. Pen floor manure samples were found to represent rectal...

  1. Method for Stereo Mapping Based on Objectarx and Pipeline Technology

    Science.gov (United States)

    Liu, F.; Chen, T.; Lin, Z.; Yang, Y.

    2012-07-01

    Stereo mapping is an important way to acquire 4D products. Based on the development of stereo mapping and the characteristics of ObjectARX and pipeline technology, a new stereo mapping scheme that realizes interaction between AutoCAD and a digital photogrammetry system is offered, built on ObjectARX and pipeline technology. An experiment was conducted to verify feasibility using the software MAP-AT (Modern Aerial Photogrammetry Automatic Triangulation) as an example; the experimental results show that the scheme is feasible and is of significant value for integrating data acquisition and editing.

  2. The ETLMR MapReduce-Based ETL Framework

    DEFF Research Database (Denmark)

    Xiufeng, Liu; Thomsen, Christian; Pedersen, Torben Bach

    2011-01-01

    This paper presents ETLMR, a parallel Extract--Transform--Load (ETL) programming framework based on MapReduce. It has built-in support for high-level ETL-specific constructs including star schemas, snowflake schemas, and slowly changing dimensions (SCDs). ETLMR gives both high programming...

  3. Topographical Hill Shading Map Production Based Tianditu (map World)

    Science.gov (United States)

    Wang, C.; Zha, Z.; Tang, D.; Yang, J.

    2018-04-01

    TIANDITU (Map World) is the public version of the National Platform for Common Geospatial Information Service, and the terrain service is an important channel for users of the platform. With the development of TIANDITU, producing topographical hill shading maps for providing and updating the global terrain map online has become necessary, because hill shading offers strong intuitiveness, a three-dimensional impression, and aesthetic appeal. The terrain service of TIANDITU accordingly focuses on displaying topographical data globally at different scales. This paper mainly investigates a method of producing topographical hill shading maps globally from DEM (Digital Elevation Model) data for display scales from about 1 : 140,000,000 to 1 : 4,000,000, corresponding to display levels 2 through 7 on the TIANDITU website.
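
Hill shading from a DEM is conventionally computed from slope and aspect under an assumed sun azimuth and altitude. A minimal sketch of that standard computation follows; the record does not state the exact algorithm used for TIANDITU, so this is the textbook formulation only:

```python
import numpy as np

# Analytical hill shading: illumination of each DEM cell from a sun at
# the given azimuth/altitude, using slope and aspect from finite
# differences. Output is in [0, 1] (0 = fully shaded).
def hillshade(dem, cellsize=30.0, azimuth_deg=315.0, altitude_deg=45.0):
    az = np.radians(360.0 - azimuth_deg + 90.0)   # compass -> math angle
    alt = np.radians(altitude_deg)
    dzdx = np.gradient(dem, cellsize, axis=1)
    dzdy = np.gradient(dem, cellsize, axis=0)
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(-dzdy, dzdx)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# A plane tilted to the east: uniform shading value over the grid.
dem = np.fromfunction(lambda i, j: 2.0 * j, (5, 5))
print(hillshade(dem).round(3))
```

A flat DEM yields sin(altitude) everywhere, which is a convenient sanity check before applying the function to real elevation tiles at the various display levels.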

  4. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    Science.gov (United States)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

    The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design was applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles from the density functions of two land-surface parameters derived from a digital elevation model: the topographic wetness index and potential incoming solar radiation. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times, and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that either had not been considered during the first phase at all or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points.
The selection of sample point locations has been done using
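The quantile-based stratification at the core of this sampling design can be sketched as follows; this is a minimal illustration only, and the function names, class count and per-stratum sample size are our assumptions, not the survey's actual tooling:

```python
import random

def quantile_strata(values, n_classes=3):
    # Assign each candidate location a stratum from quantiles of a
    # land-surface parameter (e.g. the topographic wetness index).
    ranked = sorted(range(len(values)), key=lambda i: values[i])
    strata = [0] * len(values)
    for rank, idx in enumerate(ranked):
        strata[idx] = rank * n_classes // len(values)
    return strata

def sample_per_stratum(strata, k=2, seed=42):
    # Second stage: randomly draw up to k locations from every stratum.
    rng = random.Random(seed)
    return {s: rng.sample([i for i, v in enumerate(strata) if v == s],
                          min(k, strata.count(s)))
            for s in sorted(set(strata))}
```

In the real survey the strata additionally combine two land-surface parameters with geological units, and the random draws operate on polygons rather than individual points.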

  5. Analyzing the Scientific Evolution of Social Work Using Science Mapping

    Science.gov (United States)

    Martínez, Ma Angeles; Cobo, Manuel Jesús; Herrera, Manuel; Herrera-Viedma, Enrique

    2015-01-01

    Objectives: This article reports the first science mapping analysis of the social work field, which shows its conceptual structure and scientific evolution. Methods: Science Mapping Analysis Software Tool, a bibliometric science mapping tool based on co-word analysis and h-index, is applied using a sample of 18,794 research articles published from…

  6. An object-based approach for tree species extraction from digital orthophoto maps

    Science.gov (United States)

    Jamil, Akhtar; Bayram, Bulent

    2018-05-01

    Tree segmentation is an active and ongoing research area in the fields of photogrammetry and remote sensing. It is especially challenging due to both intra-class and inter-class similarities among various tree species. In this study, we exploited various statistical features for the extraction of hazelnut trees from 1:5000 scale digital orthophoto maps. Initially, the non-vegetation areas were eliminated using the traditional normalized difference vegetation index (NDVI), followed by the application of mean-shift segmentation to transform the pixels into meaningful homogeneous objects. In order to eliminate false positives, morphological opening and closing were employed on the candidate objects. A number of heuristics, based for example on shadows and bounding-box aspect ratios, were also derived to eliminate unwanted objects before passing them to the classification stage. Finally, a knowledge-based decision tree was constructed to distinguish the hazelnut trees from the remaining objects, which include man-made objects and other types of vegetation. We evaluated the proposed methodology on 10 sample orthophoto maps obtained from Giresun province in Turkey. The manually digitized hazelnut tree boundaries were taken as the reference data for accuracy assessment. Both the manually digitized and the segmented tree borders were converted into binary images and the differences were calculated. According to the obtained results, the proposed methodology achieved an overall accuracy of more than 85% for all sample images.
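The NDVI screening step described above can be sketched as follows; the 0.2 threshold and function names are illustrative assumptions, not values from the paper:

```python
def ndvi(nir, red):
    # Normalized difference vegetation index for one pixel
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

def vegetation_mask(nir_band, red_band, threshold=0.2):
    # Pixels whose NDVI exceeds the (assumed) threshold count as vegetation;
    # everything else is discarded before segmentation.
    return [ndvi(n, r) > threshold for n, r in zip(nir_band, red_band)]
```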

  7. Pseudo-random bit generator based on Chebyshev map

    Science.gov (United States)

    Stoyanov, B. P.

    2013-10-01

    In this paper, we study a pseudo-random bit generator based on two Chebyshev polynomial maps. The novel derivative algorithm shows perfect statistical properties, as established by a number of statistical tests.
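A toy version of a two-map Chebyshev bit generator might look like the following; the polynomial orders, seeds and bit-extraction rule are our assumptions, not the generator actually analyzed in the paper:

```python
import math

def chebyshev_map(x, k):
    # Chebyshev polynomial map T_k(x) = cos(k * arccos(x)) on [-1, 1]
    return math.cos(k * math.acos(x))

def prbg(seed1=0.3, seed2=0.7, n=64):
    # Toy generator: iterate two Chebyshev maps of different order and
    # combine their sign bits by XOR.
    x, y = seed1, seed2
    bits = []
    for _ in range(n):
        x = chebyshev_map(x, 4)
        y = chebyshev_map(y, 5)
        bits.append(int((x > 0) ^ (y > 0)))
    return bits
```

A real design would of course be validated against statistical test suites before use.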

  8. Computer-aided diagnosis of early knee osteoarthritis based on MRI T2 mapping.

    Science.gov (United States)

    Wu, Yixiao; Yang, Ran; Jia, Sen; Li, Zhanjun; Zhou, Zhiyang; Lou, Ting

    2014-01-01

    This work was aimed at studying a method for the computer-aided diagnosis of early knee OA (OA: osteoarthritis). Based on the technique of MRI (MRI: Magnetic Resonance Imaging) T2 mapping, and through computer image processing, feature extraction, calculation and analysis via a constructed classifier, an effective computer-aided diagnosis method for knee OA was created to assist doctors in the accurate, timely and convenient detection of potential risk of OA. In order to evaluate this method, a total of 1380 data points from the MRI images of 46 samples of knee joints were collected. These data were then modeled through linear regression on an offline general platform using the ImageJ software, and a map of the physical parameter T2 was reconstructed. After the image processing, the T2 values of ten regions in the WORMS (WORMS: Whole-Organ Magnetic Resonance Imaging Score) areas of the articular cartilage were extracted to be used as the eigenvalues in data mining. Then, an RBF (RBF: Radial Basis Function) network classifier was built to classify and identify the collected data. The classifier exhibited a final identification accuracy of 75%, indicating a good result for assisting diagnosis. Since the knee OA classifier, constituted by a weights-directly-determined RBF neural network, did not require any iteration, our results demonstrated that the optimal weights, appropriate center and variance could be yielded through simple procedures. Furthermore, the accuracy for both the training samples and the testing samples from the normal group could reach 100%. Finally, the classifier was superior, both in time efficiency and in classification performance, to the frequently used classifiers based on iterative learning. It is thus suitable for use as an aid to the computer-aided diagnosis of early knee OA.

  9. Enhanced Map-Matching Algorithm with a Hidden Markov Model for Mobile Phone Positioning

    Directory of Open Access Journals (Sweden)

    An Luo

    2017-10-01

    Full Text Available Numerous map-matching techniques have been developed to improve positioning using Global Positioning System (GPS) data and other sensors. However, most existing map-matching algorithms process GPS data with high sampling rates to achieve a higher correct-match rate and strong universality. This paper introduces a novel map-matching algorithm based on a hidden Markov model (HMM) for GPS positioning and mobile phone positioning with a low sampling rate. The HMM is a statistical model well known for providing solutions to temporal recognition applications such as text and speech recognition. In this work, a hidden Markov chain model was built to establish the map-matching process, using the geometric data, the topology matrix of road links in the road network, and a refined quad-tree data structure. HMM-based map-matching exploits the Viterbi algorithm to find the optimized road-link sequence, whose elements are the hidden states of the HMM. The HMM-based map-matching algorithm is validated on a vehicle trajectory using GPS and mobile phone data. The results show a significant improvement in mobile phone positioning, for both high and low sampling rates of GPS data.
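The Viterbi decoding step that HMM-based map-matching relies on can be sketched in a few lines. In a real matcher, the emission log-likelihoods would come from GPS-to-road-link distances and the transition log-probabilities from road topology; the values used here are placeholders:

```python
def viterbi(obs_loglik, trans_loglik, init_loglik):
    # obs_loglik[t][s]: log-likelihood of the fix at time t given road link s
    # trans_loglik[s][s2]: log transition probability between road links
    # init_loglik[s]: log prior over the initial road link
    n = len(init_loglik)
    delta = [init_loglik[s] + obs_loglik[0][s] for s in range(n)]
    back = []
    for t in range(1, len(obs_loglik)):
        pointers, new_delta = [], []
        for s2 in range(n):
            best = max(range(n), key=lambda s: delta[s] + trans_loglik[s][s2])
            pointers.append(best)
            new_delta.append(delta[best] + trans_loglik[best][s2] + obs_loglik[t][s2])
        back.append(pointers)
        delta = new_delta
    # Backtrack from the most likely final state
    path = [max(range(n), key=lambda s: delta[s])]
    for pointers in reversed(back):
        path.append(pointers[path[-1]])
    return path[::-1]
```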

  10. Planimetric Features Generalization for the Production of Small-Scale Map by Using Base Maps and the Existing Algorithms

    Directory of Open Access Journals (Sweden)

    M. Modiri

    2014-10-01

    Full Text Available Cartographic maps are representations of the Earth on a flat surface at a smaller scale than its true size. Large-scale maps cover relatively small regions in great detail, and small-scale maps cover large regions such as nations, continents and the whole globe. A logical connection between the features and the map scale must be maintained when the scale is changed, and it is important to recognize that even the most accurate maps sacrifice a certain amount of scale accuracy to deliver greater visual usefulness to their users. Cartographic generalization, or map generalization, is the method whereby information is selected and represented on a map in a way that adapts to the scale of the display medium of the map, not necessarily preserving all intricate geographical or other cartographic details. Owing to the problems facing the small-scale map production process and the time and money that surveying requires, generalization is used today as an executive approach. The software proposed in this paper converts various data and information into a certain data model. This software can produce a generalized map from a base map using existing algorithms. Planimetric generalization algorithms and rules are described in this article. Finally, small-scale maps at the 1:100,000, 1:250,000 and 1:500,000 scales are produced automatically and are shown at the end.
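A classic building block of planimetric generalization is line simplification. The paper does not name its specific algorithms, but the widely used Douglas-Peucker procedure illustrates the idea of dropping vertices that contribute little at the target scale:

```python
def point_line_dist(p, a, b):
    # Perpendicular distance from point p to the line through a and b
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * (px - ax) - dx * (py - ay)) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(points, tol):
    # Keep the endpoints; recurse on the farthest vertex if it exceeds tol
    if len(points) < 3:
        return points[:]
    dists = [point_line_dist(p, points[0], points[-1]) for p in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] <= tol:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:i + 1], tol)
    return left[:-1] + douglas_peucker(points[i:], tol)
```

The tolerance would be chosen from the target scale, e.g. larger for 1:500,000 than for 1:100,000.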

  11. Learning process mapping heuristics under stochastic sampling overheads

    Science.gov (United States)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to find the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. The statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance obtained under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.

  12. Physico-empirical approach for mapping soil hydraulic behaviour

    Directory of Open Access Journals (Sweden)

    G. D'Urso

    1997-01-01

    Full Text Available Abstract: Pedo-transfer functions are largely used in the soil hydraulic characterisation of large areas. The use of physico-empirical approaches for the derivation of soil hydraulic parameters from disturbed-sample data can be greatly enhanced if a characterisation performed on undisturbed cores of the same type of soil is available. In this study, an experimental procedure for deriving maps of soil hydraulic behaviour is discussed with reference to its application in an irrigation district (30 km²) in southern Italy. The main steps of the proposed procedure are: (i) the precise identification of soil hydraulic functions from undisturbed sampling of main horizons in representative profiles for each soil map unit; (ii) the determination of pore-size distribution curves from larger disturbed sampling data sets within the same soil map unit; (iii) the calibration of physico-empirical methods for retrieving soil hydraulic parameters from particle-size data and undisturbed soil sample analysis; (iv) the definition of functional hydraulic properties from water balance output; and (v) the delimitation of soil hydraulic map units based on functional properties.

  13. Generation of a BAC-based physical map of the melon genome

    Directory of Open Access Journals (Sweden)

    Puigdomènech Pere

    2010-05-01

    Full Text Available Abstract Background Cucumis melo (melon belongs to the Cucurbitaceae family, whose economic importance among horticulture crops is second only to Solanaceae. Melon has high intra-specific genetic variation, morphologic diversity and a small genome size (450 Mb, which make this species suitable for a great variety of molecular and genetic studies that can lead to the development of tools for breeding varieties of the species. A number of genetic and genomic resources have already been developed, such as several genetic maps and BAC genomic libraries. These tools are essential for the construction of a physical map, a valuable resource for map-based cloning, comparative genomics and assembly of whole genome sequencing data. However, no physical map of any Cucurbitaceae has yet been developed. A project has recently been started to sequence the complete melon genome following a whole-genome shotgun strategy, which makes use of massive sequencing data. A BAC-based melon physical map will be a useful tool to help assemble and refine the draft genome data that is being produced. Results A melon physical map was constructed using a 5.7 × BAC library and a genetic map previously developed in our laboratories. High-information-content fingerprinting (HICF was carried out on 23,040 BAC clones, digesting with five restriction enzymes and SNaPshot labeling, followed by contig assembly with FPC software. The physical map has 1,355 contigs and 441 singletons, with an estimated physical length of 407 Mb (0.9 × coverage of the genome and the longest contig being 3.2 Mb. The anchoring of 845 BAC clones to 178 genetic markers (100 RFLPs, 76 SNPs and 2 SSRs also allowed the genetic positioning of 183 physical map contigs/singletons, representing 55 Mb (12% of the melon genome, to individual chromosomal loci. The melon FPC database is available for download at http://melonomics.upv.es/static/files/public/physical_map/. Conclusions Here we report the construction

  14. New segmentation-based tone mapping algorithm for high dynamic range image

    Science.gov (United States)

    Duan, Weiwei; Guo, Huinan; Zhou, Zuofeng; Huang, Huimin; Cao, Jianzhong

    2017-07-01

    Traditional tone mapping algorithms for the display of high dynamic range (HDR) images have the drawback of losing the impression of brightness, contrast and color information. To overcome this, we propose a new tone mapping algorithm based on dividing the image into different exposure regions. First, the over-exposure region is determined using the local binary pattern information of the HDR image. Then, based on the peak and average gray levels of the histogram, the under-exposure and normal-exposure regions of the HDR image are selected separately. Finally, the different exposure regions are mapped by differentiated tone mapping methods to produce the final result. The experimental results show that the proposed algorithm achieves better performance, both in visual quality and in an objective contrast criterion, than other algorithms.
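The split into exposure regions can be illustrated with a simple thresholding sketch; the fixed thresholds below stand in for the LBP- and histogram-based selection the paper actually uses:

```python
def classify_exposure(gray, low_t=0.2, high_t=0.8):
    # gray: luminance values normalized to [0, 1]; thresholds are assumptions.
    # Each pixel index is routed to the region that its own tone mapping
    # operator would later process.
    regions = {"under": [], "normal": [], "over": []}
    for i, v in enumerate(gray):
        if v < low_t:
            regions["under"].append(i)
        elif v > high_t:
            regions["over"].append(i)
        else:
            regions["normal"].append(i)
    return regions
```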

  15. Mapping visual cortex in monkeys and humans using surface-based atlases

    Science.gov (United States)

    Van Essen, D. C.; Lewis, J. W.; Drury, H. A.; Hadjikhani, N.; Tootell, R. B.; Bakircioglu, M.; Miller, M. I.

    2001-01-01

    We have used surface-based atlases of the cerebral cortex to analyze the functional organization of visual cortex in humans and macaque monkeys. The macaque atlas contains multiple partitioning schemes for visual cortex, including a probabilistic atlas of visual areas derived from a recent architectonic study, plus summary schemes that reflect a combination of physiological and anatomical evidence. The human atlas includes a probabilistic map of eight topographically organized visual areas recently mapped using functional MRI. To facilitate comparisons between species, we used surface-based warping to bring functional and geographic landmarks on the macaque map into register with corresponding landmarks on the human map. The results suggest that extrastriate visual cortex outside the known topographically organized areas is dramatically expanded in human compared to macaque cortex, particularly in the parietal lobe.

  16. Communication: Quantitative multi-site frequency maps for amide I vibrational spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Reppert, Mike [Department of Chemistry, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States); Department of Chemistry, University of Chicago, Chicago, Illinois 60637 (United States); Tokmakoff, Andrei, E-mail: tokmakoff@uchicago.edu [Department of Chemistry, University of Chicago, Chicago, Illinois 60637 (United States)

    2015-08-14

    An accurate method for predicting the amide I vibrational spectrum of a given protein structure has been sought for many years. Significant progress has been made recently by sampling structures from molecular dynamics simulations and mapping local electrostatic variables onto the frequencies of individual amide bonds. Agreement with experiment, however, has remained largely qualitative. Previously, we used dipeptide fragments and isotope-labeled constructs of the protein G mimic NuG2b as experimental standards for developing and testing amide I frequency maps. Here, we combine these datasets to test different frequency-map models and develop a novel method to produce an optimized four-site potential (4P) map based on the CHARMM27 force field. Together with a charge correction for glycine residues, the optimized map accurately describes both experimental datasets, with average frequency errors of 2–3 cm⁻¹. This 4P map is shown to be convertible to a three-site field map which provides equivalent performance, highlighting the viability of both field- and potential-based maps for amide I spectral modeling. The use of multiple sampling points for local electrostatics is found to be essential for accurate map performance.
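The general form of an electrostatic frequency map, which shifts a gas-phase site frequency linearly in electrostatic variables sampled at several sites of the amide unit, can be sketched as follows; the base frequency and coefficients are illustrative, not the published 4P map parameters:

```python
OMEGA_0 = 1650.0  # assumed unperturbed amide I site frequency, in cm^-1

def amide_i_frequency(potentials, coeffs):
    # Multi-site electrostatic map: the site frequency is shifted linearly in
    # the electrostatic potential sampled at each of the (here four) map sites.
    return OMEGA_0 + sum(c * p for c, p in zip(coeffs, potentials))
```

Fitting the coefficients against experimental standards, as the paper does, is what turns this functional form into a quantitative map.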

  17. Tissue-based map of the human proteome

    DEFF Research Database (Denmark)

    Uhlén, Mathias; Fagerberg, Linn; Hallström, Björn M.

    2015-01-01

    Resolving the molecular details of proteome variation in the different tissues and organs of the human body will greatly increase our knowledge of human biology and disease. Here, we present a map of the human tissue proteome based on an integrated omics approach that involves quantitative transc...

  18. Dipole-magnet field models based on a conformal map

    Directory of Open Access Journals (Sweden)

    P. L. Walstrom

    2012-10-01

    Full Text Available In general, generation of charged-particle transfer maps for conventional iron-pole-piece dipole magnets to third and higher order requires a model for the midplane field profile and its transverse derivatives (a soft-edge model) to high order, and numerical integration of map coefficients. An exact treatment of the problem for a particular magnet requires use of measured magnetic data. However, in initial design of beam transport systems, users of charged-particle optics codes generally rely on magnet models built into the codes. Indeed, if maps to third order are adequate for the problem, an approximate analytic field model together with numerical map coefficient integration can capture the important features of the transfer map. The model described in this paper is based on the fact that, except at very large distances from the magnet, the magnetic field for parallel pole-face magnets with constant pole gap height and wide pole faces is basically two dimensional (2D). The field for all space outside of the pole pieces is given by a single (complex) analytic expression and includes a parameter that controls the rate of falloff of the fringe field. Since the field function is analytic in the complex plane outside of the pole pieces, it satisfies two basic requirements of a field model for higher-order map codes: it is infinitely differentiable at the midplane and also a solution of the Laplace equation. It is apparently the only simple model available that combines an exponential approach to the central field with an inverse cubic falloff of field at large distances from the magnet in a single expression. The model is not intended for detailed fitting of magnetic field data, but for use in numerical map-generating codes for studying the effect of extended fringe fields on higher-order transfer maps. It is based on conformally mapping the area between the pole pieces to the upper half plane, and placing current filaments on the pole faces. An

  19. Machine learning-based dual-energy CT parametric mapping.

    Science.gov (United States)

    Su, Kuan-Hao; Kuo, Jung-Wen; Jordan, David W; Van Hedent, Steven; Klahr, Paul; Wei, Zhouping; Al Helo, Rose; Liang, Fan; Qian, Pengjiang; Pereira, Gisele C; Rassouli, Negin; Gilkeson, Robert C; Traughber, Bryan J; Cheng, Chee-Wai; Muzic, Raymond F

    2018-05-22

    The aim is to develop and evaluate machine learning methods for generating quantitative parametric maps of effective atomic number (Zeff), relative electron density (ρe), mean excitation energy (Ix), and relative stopping power (RSP) from clinical dual-energy CT data. The maps could be used for material identification and radiation dose calculation. Machine learning methods of historical centroid (HC), random forest (RF), and artificial neural networks (ANN) were used to learn the relationship between dual-energy CT input data and ideal output parametric maps calculated for phantoms from the known compositions of 13 tissue substitutes. After training and model selection steps, the machine learning predictors were used to generate parametric maps from independent phantom and patient input data. Precision and accuracy were evaluated using the ideal maps. This process was repeated for a range of exposure doses, and performance was compared to that of the clinically-used dual-energy, physics-based method which served as the reference. The machine learning methods generated more accurate and precise parametric maps than those obtained using the reference method. Their performance advantage was particularly evident when using data from the lowest exposure, one-fifth of a typical clinical abdomen CT acquisition. The RF method achieved the greatest accuracy. In comparison, the ANN method was only 1% less accurate but had much better computational efficiency than RF, being able to produce parametric maps in 15 seconds. Machine learning methods outperformed the reference method in terms of accuracy and noise tolerance when generating parametric maps, encouraging further exploration of the techniques. Among the methods we evaluated, ANN is the most suitable for clinical use due to its combination of accuracy, excellent low-noise performance, and computational efficiency. © 2018 Institute of Physics and Engineering in
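The historical centroid (HC) idea can be illustrated as a nearest-centroid lookup in dual-energy HU space; the tissue names, HU values and output parameters below are illustrative placeholders, not the paper's calibration data:

```python
import math

# Hypothetical reference tissues: (HU_low_kVp, HU_high_kVp) -> (Zeff, rho_e)
REFERENCE_TISSUES = {
    "adipose": ((-100.0, -90.0), (6.3, 0.95)),
    "muscle":  ((60.0, 55.0),    (7.6, 1.04)),
    "bone":    ((700.0, 500.0),  (12.3, 1.28)),
}

def predict(hu_low, hu_high):
    # Assign the voxel the parameters of the nearest reference centroid
    # in (HU_low, HU_high) space.
    def dist(entry):
        (lo, hi), _ = entry
        return math.hypot(hu_low - lo, hu_high - hi)
    _, params = min(REFERENCE_TISSUES.values(), key=dist)
    return params  # (Zeff, rho_e)
```

The RF and ANN predictors in the paper replace this lookup with learned regression models, which is what yields their accuracy advantage.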

  20. Stochastic Wheel-Slip Compensation Based Robot Localization and Mapping

    Directory of Open Access Journals (Sweden)

    SIDHARTHAN, R. K.

    2016-05-01

    Full Text Available Wheel-slip compensation is vital for building accurate and reliable dead-reckoning-based robot localization and mapping algorithms. This investigation presents a stochastic slip compensation scheme for robot localization and mapping. The main idea of the slip compensation technique is to use wheel-slip data obtained from experiments to model the variations in slip velocity as Gaussian distributions. This leads to a family of models that are switched depending on the input command. To obtain the wheel-slip measurements, experiments are conducted on a wheeled mobile robot, and the measurements thus obtained are used to build the Gaussian models. The localization and mapping algorithm is then tested on an experimental terrain, and a new metric called the map spread factor is used to evaluate the ability of the slip compensation technique. Our results clearly indicate that the proposed methodology improves the accuracy by 72.55% for rotation and 66.67% for translation as against an uncompensated mapping system. The proposed compensation technique eliminates the need for exteroceptive sensors for slip compensation, and for complex feature extraction and association algorithms. As a result, we obtain a simple slip compensation scheme for localization and mapping.
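The command-switched Gaussian slip model can be sketched as follows; the means and standard deviations are made-up placeholders for the experimentally fitted values:

```python
import random

# Per command class, slip velocity is modeled as a Gaussian fitted to
# experimental wheel-slip data (values below are illustrative only).
SLIP_MODELS = {
    "forward": (0.02, 0.005),  # (mean, std) of slip velocity, m/s
    "rotate":  (0.05, 0.010),
}

def compensated_velocity(command, commanded_v, rng=random):
    # Switch to the Gaussian model matching the input command and subtract
    # a sampled slip velocity from the commanded velocity before dead reckoning.
    mean, std = SLIP_MODELS[command]
    return commanded_v - rng.gauss(mean, std)
```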

  1. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    Science.gov (United States)

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Nowadays, studies related to the distribution of metallic elements in biological samples are one of the most important issues. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging of the metallic elements in various kinds of biological samples. However, such literature lacks articles reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize the analytical calibration in the (bio)imaging/mapping of the metallic elements in biological samples, including (1) nomenclature; (2) definitions; and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, this work aims to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from those hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. A Game Map Complexity Measure Based on Hamming Distance

    Science.gov (United States)

    Li, Yan; Su, Pan; Li, Wenliang

    With the booming PC game market, game AI has attracted more and more research. The interest and difficulty of a game are related to the map used in its scenarios. Besides, the path-finding efficiency in a game is also affected by the complexity of the map used. In this paper, a novel complexity measure based on Hamming distance, called the Hamming complexity, is introduced. This measure is able to estimate the complexity of a binary tileworld. We experimentally demonstrate that Hamming complexity is strongly correlated with the efficiency of the A* algorithm, and it is therefore a useful reference for designers when developing a game map.
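One plausible reading of a Hamming-distance-based complexity measure for a binary tile map is the average distance between adjacent rows; the exact definition in the paper may differ:

```python
def hamming(a, b):
    # Hamming distance between two equal-length binary strings
    if len(a) != len(b):
        raise ValueError("strings must have equal length")
    return sum(x != y for x, y in zip(a, b))

def map_complexity(rows):
    # Sketch: average Hamming distance between vertically adjacent rows
    # of a binary tile map (higher = more structural variation).
    dists = [hamming(r1, r2) for r1, r2 in zip(rows, rows[1:])]
    return sum(dists) / len(dists)
```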

  3. Affordance-based individuation of junctions in Open Street Map

    Directory of Open Access Journals (Sweden)

    Simon Scheider

    2012-06-01

    Full Text Available We propose an algorithm that can be used to identify automatically the subset of street segments of a road network map that corresponds to a junction. The main idea is to use turn-compliant locomotion affordances, i.e., restricted patterns of supported movement, in order to specify junctions independently of their data representation, and in order to motivate tractable individuation and classification strategies. We argue that common approaches based solely on geometry or topology of the street segment graph are useful but insufficient proxies. They miss certain turn restrictions essential to junctions. From a computational viewpoint, the main challenge of affordance-based individuation of junctions lies in its complex recursive definition. In this paper, we show how Open Street Map data can be interpreted into locomotion affordances, and how the recursive junction definition can be translated into a deterministic algorithm. We evaluate this algorithm by applying it to small map excerpts in order to delineate the contained junctions.

  4. Fourier-Mellin moment-based intertwining map for image encryption

    Science.gov (United States)

    Kaur, Manjit; Kumar, Vijay

    2018-03-01

    In this paper, a robust image encryption technique that utilizes Fourier-Mellin moments and intertwining logistic map is proposed. Fourier-Mellin moment-based intertwining logistic map has been designed to overcome the issue of low sensitivity of an input image. Multi-objective Non-Dominated Sorting Genetic Algorithm (NSGA-II) based on Reinforcement Learning (MNSGA-RL) has been used to optimize the required parameters of intertwining logistic map. Fourier-Mellin moments are used to make the secret keys more secure. Thereafter, permutation and diffusion operations are carried out on input image using secret keys. The performance of proposed image encryption technique has been evaluated on five well-known benchmark images and also compared with seven well-known existing encryption techniques. The experimental results reveal that the proposed technique outperforms others in terms of entropy, correlation analysis, a unified average changing intensity and the number of changing pixel rate. The simulation results reveal that the proposed technique provides high level of security and robustness against various types of attacks.
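As an illustration of chaotic-map keystream encryption, the sketch below uses the plain logistic map as a simplified stand-in for the intertwining variant (which couples several such maps to enlarge the key space); the seed and control parameter are arbitrary, not optimized values from the paper:

```python
def logistic_keystream(seed, mu, n):
    # Iterate the logistic map x -> mu*x*(1-x) and quantize each state to a byte.
    x = seed
    out = []
    for _ in range(n):
        x = mu * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return out

def xor_encrypt(data, seed=0.3141, mu=3.99):
    # XOR with the chaotic keystream; applying the same keys twice decrypts.
    ks = logistic_keystream(seed, mu, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))
```

Real schemes like the paper's add permutation stages and key derivation (here via Fourier-Mellin moments) on top of the diffusion step sketched above.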

  5. Dataflow-Based Mapping of Computer Vision Algorithms onto FPGAs

    Directory of Open Access Journals (Sweden)

    Ivan Corretjer

    2007-01-01

    Full Text Available We develop a design methodology for mapping computer vision algorithms onto an FPGA through the use of coarse-grain reconfigurable dataflow graphs as a representation to guide the designer. We first describe a new dataflow modeling technique called homogeneous parameterized dataflow (HPDF, which effectively captures the structure of an important class of computer vision applications. This form of dynamic dataflow takes advantage of the property that in a large number of image processing applications, data production and consumption rates can vary, but are equal across dataflow graph edges for any particular application iteration. After motivating and defining the HPDF model of computation, we develop an HPDF-based design methodology that offers useful properties in terms of verifying correctness and exposing performance-enhancing transformations; we discuss and address various challenges in efficiently mapping an HPDF-based application representation into target-specific HDL code; and we present experimental results pertaining to the mapping of a gesture recognition application onto the Xilinx Virtex II FPGA.

  6. a Mapping Method of Slam Based on Look up Table

    Science.gov (United States)

    Wang, Z.; Li, J.; Wang, A.; Wang, J.

    2017-09-01

    In recent years, several V-SLAM (Visual Simultaneous Localization and Mapping) approaches have appeared, showing impressive reconstructions of the world. However, these maps are built with far more than the required information. This limitation comes from processing the whole of each key-frame. In this paper we present, for the first time, a mapping method based on a look-up table (LUT) for visual SLAM that can improve the mapping effectively. As this method relies on extracting features in each cell of a divided image, it can obtain a camera pose that is more representative of the whole key-frame. The tracking direction of key-frames is obtained by counting the number of parallax directions of feature points. The LUT stores, for each tracking direction, the number of cells needed for mapping, which reduces the redundant information in the key-frame and makes mapping more efficient. The results show that a better map with less noise is built using less than one-third of the time. We believe that the capacity of the LUT to build maps efficiently makes it a good choice for the community to investigate in scene reconstruction problems.

  7. Usability evaluation of cloud-based mapping tools for the display of very large datasets

    Science.gov (United States)

    Stotz, Nicole Marie

    The elasticity and on-demand nature of cloud services have made it easier to create web maps. Users only need access to a web browser and the Internet to utilize cloud-based web maps, eliminating the need for specialized software. To encourage a wide variety of users, a map must be well designed; usability is a very important concept in designing a web map. Fusion Tables, a new product from Google, is one example of newer cloud-based distributed GIS services. It allows for easy spatial data manipulation and visualization within the Google Maps framework. ESRI has also introduced a cloud-based version of their software, called ArcGIS Online, built on Amazon's EC2 cloud. Utilizing a user-centered design framework, two prototype maps were created with data from the San Diego East County Economic Development Council. One map was built on Fusion Tables, and another on ESRI's ArcGIS Online. A usability analysis was conducted and used to compare both map prototypes in terms of design and functionality. Load tests were also run, and performance metrics were gathered on both map prototypes. The usability analysis was taken by 25 geography students, and consisted of time-based tasks and questions on map design and functionality. Survey participants completed the time-based tasks for the Fusion Tables map prototype quicker than those of the ArcGIS Online map prototype. While response was generally positive towards the design and functionality of both prototypes, overall the Fusion Tables map prototype was preferred. For the load tests, the data set was broken into 22 groups for a total of 44 tests. While the Fusion Tables map prototype performed more efficiently than the ArcGIS Online prototype, the differences are almost unnoticeable. A SWOT analysis was conducted for each prototype. The results from this research point to the Fusion Tables map prototype. A redesign of this prototype would incorporate design suggestions from the usability survey, while some functionality would

  8. Wide-Baseline Stereo-Based Obstacle Mapping for Unmanned Surface Vehicles

    Science.gov (United States)

    Mou, Xiaozheng; Wang, Han

    2018-01-01

    This paper proposes a wide-baseline stereo-based static obstacle mapping approach for unmanned surface vehicles (USVs). The proposed approach eliminates the complicated calibration work and the bulky rig of our previous binocular stereo system, and raises the ranging ability from 500 to 1000 m with an even larger baseline obtained from the motion of USVs. By integrating a monocular camera with GPS and compass information, the proposed system reconstructs the world locations of the detected static obstacles while the USV is traveling, and an obstacle map is then built. To achieve more accurate and robust performance, multiple pairs of frames are leveraged to synthesize the final reconstruction results in a weighting model. Experimental results based on our own dataset demonstrate the high efficiency of our system. To the best of our knowledge, we are the first to address the task of wide-baseline stereo-based obstacle mapping in a maritime environment. PMID:29617293
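The ranging gain from a wider baseline follows directly from pinhole stereo geometry. A minimal sketch (not the authors' implementation; the focal length and disparity values below are hypothetical):

```python
# Illustrative sketch: depth from disparity in a stereo pair, showing why a
# wider baseline extends the usable range. Numbers are hypothetical.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo depth: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

focal = 2000.0  # focal length in pixels (hypothetical camera)

# At a fixed one-pixel disparity resolution, the maximum resolvable range
# grows linearly with the baseline:
max_range_short = depth_from_disparity(focal, 0.25, 1.0)  # fixed 0.25 m rig
max_range_wide = depth_from_disparity(focal, 0.5, 1.0)    # motion-derived baseline
```

Doubling the baseline doubles the maximum resolvable range, mirroring the 500 m to 1000 m improvement reported above.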

  9. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Directory of Open Access Journals (Sweden)

    Qi Zheng

    2016-10-01

    Full Text Available Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
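The Bayesian idea behind mapping-quality estimation can be sketched as follows: treat each candidate alignment's score as a scaled log-likelihood, normalize to a posterior over candidates, and Phred-scale the probability that the reported hit is wrong. This is a simplification for illustration, not AlignerBoost's exact model; the score scaling is an assumption.

```python
import math

def mapping_quality(scores):
    """Phred-scaled confidence that the highest-scoring hit is correct.
    Scores are treated as log10-likelihoods scaled by 10 (hypothetical)."""
    likelihoods = [10 ** (s / 10.0) for s in scores]
    p_best = max(likelihoods) / sum(likelihoods)
    err = max(1.0 - p_best, 1e-6)  # guard against p_best == 1
    return min(60, round(-10.0 * math.log10(err)))

# A uniquely best hit earns a high MAPQ; two equally good hits earn MAPQ 3,
# i.e. roughly a 50% chance the reported hit is the wrong one.
```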

  10. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    Science.gov (United States)

    Zheng, Qi; Grice, Elizabeth A

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.

  11. A recognition method research based on the heart sound texture map

    Directory of Open Access Journals (Sweden)

    Huizhong Cheng

    2016-06-01

    Full Text Available In order to improve the Heart Sound recognition rate and reduce the recognition time, in this paper we introduce a new method for Heart Sound pattern recognition using the Heart Sound Texture Map. Based on the Heart Sound model, we give the definitions of the Heart Sound time-frequency diagram and the Heart Sound Texture Map, study the principle and realization of the Heart Sound Window Function, and then discuss how to use the Heart Sound Window Function and the Short-Time Fourier Transform to obtain a two-dimensional Heart Sound time-frequency diagram. We then propose a corner-correlation recognition algorithm based on the Heart Sound Texture Map according to the characteristics of Heart Sound. The simulation results show that the Heart Sound Window Function, compared with traditional window functions, makes the first (S1) and second (S2) Heart Sound textures clearer, and that the corner-correlation recognition algorithm based on the Heart Sound Texture Map can significantly improve the recognition rate and reduce the computational expense, making it an effective Heart Sound recognition method.

  12. Color encryption scheme based on adapted quantum logistic map

    Science.gov (United States)

    Zaghloul, Alaa; Zhang, Tiejun; Amin, Mohamed; Abd El-Latif, Ahmed A.

    2014-04-01

    This paper presents a new color image encryption scheme based on a quantum chaotic system. In this scheme, encryption is accomplished by generating an intermediate chaotic key stream with the help of the quantum chaotic logistic map. Each pixel is then encrypted using the cipher value of the previous pixel and the adapted quantum logistic map. The results show that the proposed scheme has adequate security for the confidentiality of color images.
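The pixel-chaining scheme described above can be illustrated with a classical logistic map standing in for the quantum chaotic map (whose dissipative terms are omitted here); the parameters and byte quantization are hypothetical:

```python
# Sketch: a chaotic keystream encrypts each pixel, and the previous cipher
# byte is mixed in so that identical pixels encrypt differently (diffusion).

def encrypt(pixels, x0=0.3, r=3.99):
    x, prev, out = x0, 0, []
    for p in pixels:
        x = r * x * (1.0 - x)      # logistic map iteration
        key = int(x * 256) % 256   # quantize chaotic state to a byte
        c = p ^ key ^ prev         # mix in the previous cipher byte
        out.append(c)
        prev = c
    return out

def decrypt(cipher, x0=0.3, r=3.99):
    x, prev, out = x0, 0, []
    for c in cipher:
        x = r * x * (1.0 - x)
        key = int(x * 256) % 256
        out.append(c ^ key ^ prev)
        prev = c
    return out

cipher = encrypt([10, 200, 37])
```

Decryption regenerates the same keystream from the shared initial condition (the key), so XORing twice restores the plaintext.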

  13. Validation and application of Acoustic Mapping Velocimetry

    Science.gov (United States)

    Baranya, Sandor; Muste, Marian

    2016-04-01

    The goal of this paper is to introduce a novel methodology to estimate bedload transport in rivers based on an improved bedform-tracking procedure. The measurement technique combines components and processing protocols from two contemporary nonintrusive instruments: acoustic and image-based. The bedform mapping is conducted with acoustic surveys, while the velocity of the bedforms is estimated with processing techniques pertaining to image-based velocimetry. The technique is therefore called Acoustic Mapping Velocimetry (AMV). The implementation of this technique produces a whole-field velocity map associated with the multi-directional bedform movement. Based on the calculated two-dimensional bedform migration velocity field, the bedload transport is estimated using the Exner equation. A proof-of-concept experiment was performed to validate the AMV-based bedload estimation in a laboratory flume at IIHR-Hydroscience & Engineering (IIHR). The bedform migration was analysed at three different flow discharges. Repeated bed geometry mapping, using a multiple transducer array (MTA), provided acoustic maps, which were post-processed with a particle image velocimetry (PIV) method. Bedload transport rates were calculated along longitudinal sections using the streamwise components of the bedform velocity vectors and the measured bedform heights. The bulk transport rates were compared with the results from concurrent direct physical samplings, and acceptable agreement was found. As a first field implementation of the AMV, an attempt was made to estimate bedload transport for a section of the Ohio River in the United States, where bed geometry maps resulting from repeated multibeam echo sounder (MBES) surveys served as input data. 
Cross-sectional distributions of bedload transport rates from the AMV-based method were compared with those obtained from another non-intrusive technique (due to the lack of direct samplings), ISSDOTv2, developed by the US Army
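For migrating dune-like bedforms, the Exner-based conversion from bedform kinematics to transport rate is commonly written as q_b = (1 - p) · α · V_c · H. A sketch with hypothetical values (the paper's exact formulation may differ):

```python
# Sketch of the bedform-tracking bedload estimate: p is bed porosity, V_c the
# streamwise bedform migration speed (from the AMV velocity map), H the
# bedform height, and alpha a shape factor (~0.5 for triangular dunes).
# All numbers below are hypothetical.

def bedload_rate(migration_speed, bedform_height, porosity=0.4, shape=0.5):
    """Unit bedload transport rate (m^2/s per unit width)."""
    return (1.0 - porosity) * shape * migration_speed * bedform_height

q_b = bedload_rate(migration_speed=1e-4, bedform_height=0.05)
```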

  14. Analyzing thematic maps and mapping for accuracy

    Science.gov (United States)

    Rosenfield, G.H.

    1982-01-01

    Two problems which exist while attempting to test the accuracy of thematic maps and mapping are: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both of these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given classification category. The results of an accuracy test are presented in a contingency table, sometimes called a classification error matrix. Usually the rows represent the interpretation, and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors of commission, and the remaining elements of the columns represent errors of omission. For tests of hypothesis that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification. 
The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by
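The error-matrix bookkeeping described above can be made concrete with an invented three-class example: rows are the interpreted (mapped) classes, columns the verified (reference) classes, and the diagonal holds the correct classifications.

```python
# Invented 3-class classification error matrix for illustration.
matrix = [
    [45, 3, 2],   # mapped class A
    [4, 38, 5],   # mapped class B
    [1, 4, 48],   # mapped class C
]

total = sum(sum(row) for row in matrix)
correct = sum(matrix[i][i] for i in range(len(matrix)))
overall_accuracy = correct / total

# Commission error of a class: off-diagonal share of its row (interpretation).
commission = [1 - matrix[i][i] / sum(matrix[i]) for i in range(3)]
# Omission error of a class: off-diagonal share of its column (verification).
omission = [1 - matrix[i][i] / sum(row[i] for row in matrix) for i in range(3)]
```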

  15. Risk-based fault detection using Self-Organizing Map

    International Nuclear Information System (INIS)

    Yu, Hongyang; Khan, Faisal; Garaniya, Vikram

    2015-01-01

    The complexity of modern systems is increasing rapidly and the dominating relationships among system variables have become highly non-linear. This results in difficulty in the identification of a system's operating states. In turn, this difficulty affects the sensitivity of fault detection and imposes a challenge on ensuring the safety of operation. In recent years, Self-Organizing Maps have gained popularity in system monitoring as a robust non-linear dimensionality reduction tool. The Self-Organizing Map is able to capture non-linear variations of the system. Therefore, it is sensitive to changes of a system's states, leading to early detection of faults. In this paper, a new approach based on the Self-Organizing Map is proposed to detect and assess the risk of faults. In addition, probabilistic analysis is applied to characterize the risk of fault into different levels according to the hazard potential, to enable refined monitoring of the system. The proposed approach is applied to two experimental systems. The results from both systems have shown the high sensitivity of the proposed approach in detecting faults and identifying their root causes. The refined monitoring facilitates the determination of the risk of fault and the early deployment of remedial actions and safety measures to minimize the potential impact of faults. - Highlights: • A new approach based on Self-Organizing Map is proposed to detect faults. • Integration of fault detection with risk assessment methodology. • Fault risk characterization into different levels to enable focused system monitoring
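The detection principle can be sketched with a nearest-prototype quantization error: a Self-Organizing Map trained on normal operating data acts as a codebook of prototype states, and a sample far from its best-matching unit is flagged. The prototypes and threshold below are invented stand-ins for a trained map, and the paper's risk characterization adds a probabilistic layer on top of this.

```python
import math

# Hypothetical prototype (codebook) vectors standing in for a trained SOM.
prototypes = [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]

def quantization_error(sample):
    """Distance from a sample to its best-matching unit."""
    return min(math.dist(sample, p) for p in prototypes)

def is_fault(sample, threshold=0.3):
    """Flag samples that fall far from every learned normal state."""
    return quantization_error(sample) > threshold

normal = (0.45, 0.55)
abnormal = (2.0, 2.0)
```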

  16. Silicon based ultrafast optical waveform sampling

    DEFF Research Database (Denmark)

    Ji, Hua; Galili, Michael; Pu, Minhao

    2010-01-01

    A 300 nm x 450 nm x 5 mm silicon nanowire is designed and fabricated for a four-wave-mixing based non-linear optical gate. Based on this silicon nanowire, an ultra-fast optical sampling system is successfully demonstrated using a free-running fiber laser with a carbon nanotube-based mode-locker as the sampling source. A clear eye-diagram of a 320 Gbit/s data signal is obtained. The temporal resolution of the sampling system is estimated to be 360 fs.

  17. Constructing a Soil Class Map of Denmark based on the FAO Legend Using Digital Techniques

    DEFF Research Database (Denmark)

    Adhikari, Kabindra; Minasny, Budiman; Greve, Mette Balslev

    2014-01-01

    Soil mapping in Denmark has a long history and a series of soil maps based on conventional mapping approaches have been produced. In this study, a national soil map of Denmark was constructed based on the FAO–Unesco Revised Legend 1990 using digital soil mapping techniques, existing soil profile...... confirmed that the output is reliable and can be used in various soil and environmental studies without major difficulties. This study also verified the importance of GlobalSoilMap products and a priori pedological information that improved prediction performance and quality of the new FAO soil map...

  18. LARGE-SCALE INDICATIVE MAPPING OF SOIL RUNOFF

    Directory of Open Access Journals (Sweden)

    E. Panidi

    2017-11-01

    Full Text Available In our study we estimate relationships between quantitative parameters of relief, the soil runoff regime, and the spatial distribution of radioactive pollutants in the soil. The study is conducted on a test arable area located in the basin of the upper Oka River (Orel region, Russia). Previously we collected a large number of soil samples, which made it possible to investigate the redistribution of Chernobyl-origin cesium-137 in the soil material and, as a consequence, the soil runoff magnitude at the sampling points. Here we describe and discuss the technique applied to large-scale mapping of the soil runoff. The technique is based upon measurement of cesium-137 radioactivity in the different relief structures. Key stages are the allocation of the soil sampling points (we used very high resolution space imagery as supporting data); soil sample collection and analysis; calibration of the mathematical model (using the estimated background value of cesium-137 radioactivity); and automated compilation of a predictive map of the studied territory (a digital elevation model is used for this purpose, as cesium-137 radioactivity can be predicted using quantitative parameters of the relief). The maps can be used as supporting data for precision agriculture and for recultivation or melioration purposes.
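The calibration step can be illustrated with the simple proportional Cs-137 budgeting model, in which the fractional reduction of the Cs-137 inventory relative to an undisturbed reference site is taken as the fraction of plough-layer soil lost. The study's actual calibrated model may differ, and the inventory values below are hypothetical.

```python
# Proportional Cs-137 model sketch: points with less Cs-137 than the
# reference have eroded; points with more have received deposition.

def soil_loss_fraction(measured_bq_m2, reference_bq_m2):
    """Fraction of the plough-layer soil lost at a sampling point.
    Negative values indicate deposition (Cs-137 gain)."""
    return (reference_bq_m2 - measured_bq_m2) / reference_bq_m2

eroded = soil_loss_fraction(measured_bq_m2=1800.0, reference_bq_m2=2400.0)
deposited = soil_loss_fraction(measured_bq_m2=3000.0, reference_bq_m2=2400.0)
```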

  19. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  20. Map based localization to assist commercial fleet operations.

    Science.gov (United States)

    2014-08-01

    This report outlines key recent contributions to the state of the art in lane detection, lane departure warning, : and map-based sensor fusion algorithms. These key studies are used as a basis for a discussion about the : limitations of systems that ...

  1. An Educational Data Mining Approach to Concept Map Construction for Web based Learning

    Directory of Open Access Journals (Sweden)

    Anal ACHARYA

    2017-01-01

    Full Text Available The aim of this article is to study the use of Educational Data Mining (EDM) techniques in constructing concept maps for organizing knowledge in web based learning systems, and thereby to study their synergistic effects in enhancing learning. This article first provides a tutorial-based introduction to EDM. The applicability of web based learning systems in enhancing the efficiency of EDM techniques in a real-time environment is investigated. Web based learning systems often use a tool for organizing knowledge; this article explores the use of one such tool, the concept map, for this purpose. The pioneering works by various researchers who proposed web based learning systems in personalized and collaborative environments in this arena are next presented. A set of parameters is proposed based on which personalized and collaborative learning applications may be generalized and their performances compared. It is found that personalized learning environments use EDM techniques more exhaustively than collaborative learning environments for concept map construction in a web based setting. This article can be used as a starting point for newcomers who would like to use EDM techniques for concept map construction for web based learning purposes.

  2. A novel algorithm for image encryption based on mixture of chaotic maps

    International Nuclear Information System (INIS)

    Behnia, S.; Akhshani, A.; Mahmodi, H.; Akhavan, A.

    2008-01-01

    Chaos-based encryption appeared in the early 1990s as an original application of nonlinear dynamics in the chaotic regime. In this paper, an implementation of a digital image encryption scheme based on a mixture of chaotic systems is reported. The chaotic cryptography technique used in this paper is symmetric key cryptography. In this algorithm, a typical coupled map was mixed with a one-dimensional chaotic map and used for high-security image encryption while keeping an acceptable speed. The proposed algorithm is described in detail, along with its security analysis and implementation. The experimental results based on the mixture of chaotic maps confirm the effectiveness of the proposed method and its implementation. This mixture of chaotic maps offers the advantages of a large key space and high-level security. The ciphertext generated by this method is the same size as the plaintext and is suitable for practical use in the secure transmission of confidential information over the Internet
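The mixture idea can be illustrated by blending two classical one-dimensional maps into a single keystream; the paper's coupled maps and key schedule are more elaborate, and all parameters here are invented:

```python
# Two different chaotic maps are iterated in parallel and their states mixed
# into one keystream byte per plaintext byte.

def logistic(x, r=3.99):
    return r * x * (1.0 - x)

def tent(x, mu=1.99):
    return mu * x if x < 0.5 else mu * (1.0 - x)

def keystream(n, x0=0.42, y0=0.17):
    """Generate n keystream bytes from two mixed chaotic trajectories.
    The initial conditions (x0, y0) play the role of the secret key."""
    x, y, out = x0, y0, []
    for _ in range(n):
        x, y = logistic(x), tent(y)
        out.append(int(((x + y) / 2.0) * 256) % 256)  # mix the two states
    return out

cipher = [p ^ k for p, k in zip([12, 34, 56], keystream(3))]
```

XORing the ciphertext with the same keystream (regenerated from the shared key) recovers the plaintext.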

  3. Area–Oriented Technology Mapping for LUT–Based Logic Blocks

    Directory of Open Access Journals (Sweden)

    Kubica Marcin

    2017-03-01

    Full Text Available One of the main aspects of logic synthesis dedicated to FPGA is the problem of technology mapping, which is directly associated with the logic decomposition technique. This paper focuses on using configurable properties of CLBs in the process of logic decomposition and technology mapping. A novel theory and a set of efficient techniques for logic decomposition based on a BDD are proposed. The paper shows that logic optimization can be efficiently carried out by using multiple decomposition. The essence of the proposed synthesis method is multiple cutting of a BDD. A new diagram form called an SMTBDD is proposed. Moreover, techniques that allow finding the best technology mapping oriented to configurability of CLBs are presented. In the experimental section, the presented method (MultiDec is compared with academic and commercial tools. The experimental results show that the proposed technology mapping strategy leads to good results in terms of the number of CLBs.
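The BDD-cutting approach rests on Shannon expansion: cutting at a variable x splits a function f into the cofactors f|x=0 and f|x=1, each a candidate for a smaller LUT. A minimal truth-table sketch of that expansion (not the SMTBDD machinery itself):

```python
# Shannon expansion on a flat truth table: f = x'*f0 + x*f1.

def cofactors(truth, var, n_vars):
    """Split a truth table (list of 2**n_vars bits, var 0 = MSB) at one
    variable, returning the negative and positive cofactors."""
    f0, f1 = [], []
    for idx, bit in enumerate(truth):
        if (idx >> (n_vars - 1 - var)) & 1:
            f1.append(bit)
        else:
            f0.append(bit)
    return f0, f1

# f(a, b) = a XOR b, truth table ordered by (a, b): 00, 01, 10, 11
f0, f1 = cofactors([0, 1, 1, 0], var=0, n_vars=2)
```

Each cofactor depends on one fewer variable, which is exactly why repeated cutting lets a wide function be packed into the fracturable LUTs of a CLB.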

  4. Mapping population-based structural connectomes.

    Science.gov (United States)

    Zhang, Zhengwu; Descoteaux, Maxime; Zhang, Jingwen; Girard, Gabriel; Chamberland, Maxime; Dunson, David; Srivastava, Anuj; Zhu, Hongtu

    2018-05-15

    Advances in understanding the structural connectomes of the human brain require improved approaches for the construction, comparison and integration of high-dimensional whole-brain tractography data from a large number of individuals. This article develops a population-based structural connectome (PSC) mapping framework to address these challenges. PSC simultaneously characterizes a large number of white matter bundles within and across different subjects by registering different subjects' brains based on coarse cortical parcellations, compressing the bundles of each connection, and extracting novel connection weights. A robust tractography algorithm and streamline post-processing techniques, including dilation of gray matter regions, streamline cutting, and outlier streamline removal are applied to improve the robustness of the extracted structural connectomes. The developed PSC framework can be used to reproducibly extract binary networks, weighted networks and streamline-based brain connectomes. We apply the PSC to Human Connectome Project data to illustrate its application in characterizing normal variations and heritability of structural connectomes in healthy subjects. Copyright © 2018 Elsevier Inc. All rights reserved.
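The final counting step of connectome construction can be sketched as follows: after registration and parcellation, each streamline increments the connectivity-matrix cell indexed by the parcels containing its two endpoints, yielding both weighted and binary networks. The labels below are invented; the PSC pipeline's connection weights are richer than raw counts.

```python
# Build a symmetric streamline-count network from endpoint parcel labels.

def streamline_count_matrix(endpoint_labels, n_regions):
    """Weighted network: cell (a, b) counts streamlines linking parcels a, b."""
    m = [[0] * n_regions for _ in range(n_regions)]
    for a, b in endpoint_labels:
        m[a][b] += 1
        if a != b:
            m[b][a] += 1
    return m

weighted = streamline_count_matrix([(0, 1), (0, 1), (1, 2)], n_regions=3)
binary = [[1 if w > 0 else 0 for w in row] for row in weighted]
```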

  5. Topographical memory for newly-learned maps is differentially affected by route-based versus landmark-based learning

    DEFF Research Database (Denmark)

    Beatty, Erin L.; Muller-Gass, Alexandra; Wojtarowicz, Dorothy

    2018-01-01

    on their ability to distinguish previously studied 'old' maps from completely unfamiliar 'new' maps under conditions of high and low working memory load in the functional MRI scanner. Viewing old versus new maps was associated with relatively greater activation in a distributed set of regions including bilateral...... inferior temporal gyrus - an important region for recognizing visual objects. Critically, whereas the performance of participants who had followed a route-based strategy dropped to chance level under high working memory load, participants who had followed a landmark-based strategy performed at above chance...... levels under both high and low working memory load - reflected by relatively greater activation in the left inferior parietal lobule (i.e. rostral part of the supramarginal gyrus known as area PFt). Our findings suggest that landmark-based learning may buffer against the effects of working memory load...

  6. Investigating the origins of double photopeaks in CsI:Tl samples through activator mapping

    Science.gov (United States)

    Onken, Drew R.; Gridin, Sergii; Williams, Richard T.; Williams, Charles B.; Donati, George L.; Gayshan, Vadim; Vasyukov, Sergey; Gektin, Alex

    2018-06-01

    Careful examination of the origins of double photopeaks in CsI:Tl provides a foundation for exploring the relationship between activator homogeneity and photopeak resolution in scintillators. In rare cases, certain CsI:Tl crystals exhibit a second photopeak in the pulse-height spectrum. A combination of optical mapping and ICP-MS measurements reveals the presence of two distinct regions with differing Tl concentrations in these crystals. The oscillator strength of the 299 nm absorption A-band of Tl in CsI was measured to be 0.0526 ± 0.0008; this parameter can be used to quantify activator concentration from the optical absorption. Using published measurements of luminescence intensity versus Tl concentration, the distributions of Tl measured from optical absorption maps of the samples were reconstructed into photopeaks in good agreement with experiment. The distribution of Tl concentrations in these particular crystals made it possible to examine the luminescence pulse shape as a function of Tl concentration.
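Converting a measured A-band absorption into an activator density typically goes through Smakula's equation (here in its Gaussian-band form, N·f = 0.87×10¹⁷ · n/(n²+2)² · α_max · W). The oscillator strength is the paper's value; the refractive index, peak absorption coefficient and band width below are placeholders, not measured values.

```python
# Smakula's equation sketch: alpha_max is the peak absorption coefficient
# (cm^-1), fwhm_ev the band full width at half maximum (eV), n_refr the host
# refractive index at the band wavelength, f the oscillator strength.

def tl_concentration(alpha_max_cm, fwhm_ev, n_refr=1.85, f=0.0526):
    """Absorbing-center density (cm^-3) inferred from the 299 nm A-band."""
    smakula = 0.87e17 * n_refr / (n_refr ** 2 + 2) ** 2 * alpha_max_cm * fwhm_ev
    return smakula / f

density = tl_concentration(alpha_max_cm=50.0, fwhm_ev=0.3)
```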

  7. Integrating pipeline data management application and Google Maps dataset on web based GIS application using open source technologies SharpMap and OpenLayers

    Energy Technology Data Exchange (ETDEWEB)

    Wisianto, Arie; Sania, Hidayatus [PT PERTAMINA GAS, Bontang (Indonesia); Gumilar, Oki [PT PERTAMINA GAS, Jakarta (Indonesia)

    2010-07-01

    PT Pertamina Gas operates 3 pipe segments carrying natural gas from producers to PT Pupuk Kaltim in the Kalimantan area. The company wants to build a pipeline data management system consisting of pipeline facilities, inspections and risk assessments which would run on Geographic Information System (GIS) platforms. The aim of this paper is to present the integration of the pipeline data management system with GIS. A web based GIS application is developed using the combination of Google Maps datasets with local spatial datasets. In addition, OpenLayers is used to integrate the pipeline data model and the Google Maps dataset into a single map display together with SharpMap. The GIS based pipeline data management system developed herein constitutes a low cost, powerful and efficient web based GIS solution.

  8. Domain XML semantic integration based on extraction rules and ontology mapping

    Directory of Open Access Journals (Sweden)

    Huayu LI

    2016-08-01

    Full Text Available A large number of XML documents exist in the petroleum engineering field, but traditional XML integration solutions cannot provide semantic query, which leads to low data use efficiency. In light of the semantic integration and query requirements of WeXML (oil & gas well XML) data, this paper proposes a semantic integration method based on extraction rules and ontology mapping. The method firstly defines a series of extraction rules with which elements and properties of the WeXML Schema are mapped to classes and properties in the WeOWL ontology, respectively; secondly, an algorithm is used to transform WeXML documents into WeOWL instances. Because WeOWL provides limited semantics, ontology mappings between the two ontologies are then built to explain classes and properties of the global ontology with terms of WeOWL, and semantic query based on the global domain concept model is provided. By constructing a WeXML data semantic integration prototype system, the proposed transformation rules, the transfer algorithm and the mapping rules are tested.
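The extraction-rule step can be sketched with Python's standard XML parser: elements become ontology class instances and attributes become properties, emitted as triples. The element and property names below are invented; real WeXML extraction rules are richer.

```python
import xml.etree.ElementTree as ET

def xml_to_triples(xml_text):
    """Map each XML element to an (instance, rdf:type, Class) triple plus one
    triple per attribute, mimicking rule-driven XML-to-OWL extraction."""
    triples = []
    root = ET.fromstring(xml_text)
    for i, elem in enumerate(root.iter()):
        inst = f"{elem.tag}_{i}"
        triples.append((inst, "rdf:type", elem.tag.capitalize()))
        for attr, value in elem.attrib.items():
            triples.append((inst, attr, value))
    return triples

# Hypothetical well document, not actual WeXML.
triples = xml_to_triples('<well name="W-1"><depth unit="m">2500</depth></well>')
```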

  9. CLOUD-BASED PLATFORM FOR CREATING AND SHARING WEB MAPS

    Directory of Open Access Journals (Sweden)

    Jean Pierre Gatera

    2014-01-01

    Full Text Available The rise of cloud computing is one of the most important things happening in information technology today. While many things are moving into the cloud, this trend has also reached the Geographic Information System (GIS) world. For users of GIS technology, the cloud opens new possibilities for sharing web maps, applications and spatial data. The goal of this presentation/demo is to demonstrate ArcGIS Online, a cloud-based collaborative platform that allows you to easily and quickly create interactive web maps that you can share with anyone. With ready-to-use content, apps, and templates you can produce web maps right away. And no matter what you use - desktops, browsers, smartphones, or tablets - you always have access to your content.

  10. Comparison of model reference and map based control method for vehicle stability enhancement

    NARCIS (Netherlands)

    Baek, S.; Son, M.; Song, J.; Boo, K.; Kim, H.

    2012-01-01

    A map based control method to improve vehicle lateral stability is proposed in this study and compared with the conventional method, a model referenced controller. A model referenced controller to determine the compensated yaw moment uses the sliding mode method, but the proposed map based

  11. Gradient-based reliability maps for ACM-based segmentation of hippocampus.

    Science.gov (United States)

    Zarpalas, Dimitrios; Gkontra, Polyxeni; Daras, Petros; Maglaveras, Nicos

    2014-04-01

    Automatic segmentation of deep brain structures, such as the hippocampus (HC), in MR images has attracted considerable scientific attention due to the widespread use of MRI and to the principal role of some structures in various mental disorders. In the literature, there exists a substantial amount of work relying on deformable models incorporating prior knowledge about structures' anatomy and shape information. However, shape priors capture global shape characteristics and thus fail to model boundaries of varying properties; HC boundaries present rich, poor, and missing gradient regions. On top of that, shape prior knowledge is blended with image information in the evolution process through global weighting of the two terms, again neglecting the spatially varying boundary properties and causing segmentation errors. An innovative method is hereby presented that aims to achieve highly accurate HC segmentation in MR images, based on the modeling of boundary properties at each anatomical location and the inclusion of appropriate image information for each of those, within an active contour model framework. Hence, blending of image information and prior knowledge is based on a local weighting map, which mixes gradient information, regional and whole brain statistical information with a multi-atlas-based spatial distribution map of the structure's labels. Experimental results on three different datasets demonstrate the efficacy and accuracy of the proposed method.
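The spatially varying blending of image and prior terms can be sketched per voxel: the mixing weight tracks local gradient strength, so strong edges are driven by image evidence while weak or missing boundaries fall back on the atlas-based prior. This is a toy reduction of the paper's local weighting map, and all values are invented.

```python
# Per-voxel blend of an image-driven force and a prior-driven force, with a
# weight that grows with local gradient magnitude (clamped to [0, 1]).

def blended_force(gradient_mag, image_term, prior_term, g_max=1.0):
    """Spatially varying blend: strong gradients trust the image, weak
    gradients trust the spatial prior."""
    w = min(gradient_mag / g_max, 1.0)
    return w * image_term + (1.0 - w) * prior_term

# At a strong edge the image term dominates; at a weak edge the prior does.
strong_edge = blended_force(gradient_mag=0.9, image_term=1.0, prior_term=-1.0)
weak_edge = blended_force(gradient_mag=0.1, image_term=1.0, prior_term=-1.0)
```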

  12. Smartphones Based Mobile Mapping Systems

    Science.gov (United States)

    Al-Hamad, A.; El-Sheimy, N.

    2014-06-01

    The past 20 years have witnessed an explosive growth in the demand for geo-spatial data. This demand has numerous sources and takes many forms; however, the net effect is an ever-increasing thirst for data that is more accurate, has higher density, is produced more rapidly, and is acquired less expensively. For mapping and Geographic Information Systems (GIS) projects, this has been achieved through the major development of Mobile Mapping Systems (MMS). MMS integrate various navigation and remote sensing technologies which allow mapping from moving platforms (e.g. cars, airplanes, boats, etc.) to obtain the 3D coordinates of the points of interest. Such systems obtain accuracies that are suitable for all but the most demanding mapping and engineering applications. However, this accuracy doesn't come cheaply. As a consequence of the platform and navigation and mapping technologies used, even an "inexpensive" system costs well over 200 000 USD. Today's mobile phones are getting ever more sophisticated. Phone makers are determined to reduce the gap between computers and mobile phones. Smartphones, in addition to becoming status symbols, are increasingly being equipped with extended Global Positioning System (GPS) capabilities, Micro Electro Mechanical System (MEMS) inertial sensors, extremely powerful computing power and very high resolution cameras. Using all of these components, smartphones have the potential to replace the traditional land MMS and portable GPS/GIS equipment. This paper introduces an innovative application of smartphones as a very low cost portable MMS for mapping and GIS applications.

  13. Current trends in satellite based emergency mapping - the need for harmonisation

    Science.gov (United States)

    Voigt, Stefan

    2013-04-01

    In recent years, the availability and use of satellite image data to support disaster management and humanitarian relief organisations have increased greatly. Automation and data processing techniques are improving, and the capacity for accessing and processing satellite imagery is getting better globally. More and more global activities, via the internet and through organisations such as the United Nations or the International Charter on Space and Major Disasters, engage in the topic, while at the same time more and more national or local centres take up rapid mapping operations and activities. To make even more effective use of this very positive increase in capacity, rapid mapping results need to be better harmonised, if not standardised: for the operational provision of analysis results, for fast validation of satellite-derived damage assessments, for better cooperation in the joint inter-agency generation of rapid mapping products, and for general scientific use. In this presentation, experiences from many years of rapid mapping gained by the DLR Center for Satellite Based Crisis Information (ZKI) within the context of national activities, the International Charter on Space and Major Disasters, GMES/Copernicus, etc. are reported. Furthermore, an overview is given of how automation, quality assurance and optimisation can be achieved through standard operating procedures within a rapid mapping workflow. Building on this long-term rapid mapping experience, and on the DLR initiative to set in place an "International Working Group on Satellite Based Emergency Mapping", current trends in rapid mapping are discussed, and thoughts are presented on how the sharing of rapid mapping information can be optimised by harmonising analysis results and data structures.
Such a harmonisation of analysis procedures, nomenclatures and representations of data, as well as metadata, is the basis for better cooperation within

  14. Balanced Civilization Map Generation based on Open Data

    DEFF Research Database (Denmark)

    Barros, Gabriella; Togelius, Julian

    2015-01-01

    This work investigates how to incorporate real-world data into game content so that the content is playable and enjoyable while not misrepresenting the data. We propose a method for generating balanced Civilization maps based on Open Data, describing how to acquire, transform and integrate...

  15. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    Science.gov (United States)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as a concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through a dissimilarity matrix based on modified cross-sample entropy, and then provide three-dimensional perceptual maps of the results through multidimensional scaling. Two modified multidimensional scaling methods are proposed in this paper: multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to have similar irregularity than others, and that differences between stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can time series generated by different models be distinguished, but series generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups which, through analysis, we find correspond to five regions: Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in these experiments than MDSC.
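A minimal sketch of the Kronecker-delta notion of cross-sample entropy described above, in pure Python. The abstract does not give the exact definition, so the conventions below (exact-equality template matches, probability ratio between template lengths m and m+1) are assumptions for illustration only:

```python
import math

def kronecker_cross_sample_entropy(u, v, m=2):
    """Cross-sample entropy between two symbol sequences where a template
    match requires exact (Kronecker-delta) equality. An illustrative
    reading of MDS-KCSE's dissimilarity; the paper's normalisation and
    parameter choices may differ."""
    def match_rate(length):
        count = total = 0
        for i in range(len(u) - length + 1):
            for j in range(len(v) - length + 1):
                total += 1
                if u[i:i + length] == v[j:j + length]:
                    count += 1
        return count / total

    b = match_rate(m)      # fraction of matching length-m template pairs
    a = match_rate(m + 1)  # fraction of matching length-(m+1) template pairs
    if a == 0 or b == 0:
        return float("inf")  # no matches: maximal dissimilarity
    return -math.log(a / b)
```

A matrix of such pairwise entropies over a set of series could then stand in for the dissimilarity matrix fed to any classical MDS implementation.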

  16. Color Doppler Ultrasonography-Targeted Perforator Mapping and Angiosome-Based Flap Reconstruction

    DEFF Research Database (Denmark)

    Gunnarsson, Gudjon Leifur; Tei, Troels; Thomsen, Jørn Bo

    2016-01-01

    Knowledge about perforators and angiosomes has inspired new and innovative flap designs for reconstruction of defects throughout the body. The purpose of this article is to share our experience using color Doppler ultrasonography (CDU)-targeted perforator mapping and angiosome-based flap reconstruction...

  17. Improved chaotic maps-based password-authenticated key agreement using smart cards

    Science.gov (United States)

    Lin, Han-Yu

    2015-02-01

    Elaborating on the security of password-based authenticated key agreement, in this paper, the author cryptanalyzes a chaotic maps-based password-authenticated key agreement proposed by Guo and Chang recently. Specifically, their protocol could not achieve strong user anonymity due to a fixed parameter and a malicious adversary is able to derive the shared session key by manipulating the property of Chebyshev chaotic maps. Additionally, the author also presents an improved scheme to eliminate the above weaknesses and still maintain the efficiency.

  18. Rule-based Test Generation with Mind Maps

    Directory of Open Access Journals (Sweden)

    Dimitry Polivaev

    2012-02-01

    Full Text Available This paper introduces the basic concepts of rule-based test generation with mind maps, and reports experience gained from industrial application of this technique in the domain of smart card testing by Giesecke & Devrient GmbH over recent years. It describes the formalization of the test selection criteria used by our test generator, our test generation architecture and our test generation framework.

  19. A novel image encryption algorithm based on a 3D chaotic map

    Science.gov (United States)

    Kanso, A.; Ghebleh, M.

    2012-07-01

    Recently [Solak E, Çokal C, Yildiz OT, Biyikoǧlu T. Cryptanalysis of Fridrich's chaotic image encryption. Int J Bifur Chaos 2010;20:1405-1413] cryptanalyzed the chaotic image encryption algorithm of [Fridrich J. Symmetric ciphers based on two-dimensional chaotic maps. Int J Bifur Chaos 1998;8(6):1259-1284], which was considered a benchmark for measuring the security of many image encryption algorithms. This attack can also be applied to other encryption algorithms that have a structure similar to Fridrich's, such as that of [Chen G, Mao Y, Chui C. A symmetric image encryption scheme based on 3D chaotic cat maps. Chaos Soliton Fract 2004;21:749-761]. In this paper, we suggest a novel image encryption algorithm based on a three-dimensional (3D) chaotic map that can defeat the aforementioned attack, among other existing attacks. The design of the proposed algorithm is simple and efficient, and is based on three phases which provide the necessary properties for a secure image encryption algorithm, including the confusion and diffusion properties. In phase I, the image pixels are shuffled according to a search rule based on the 3D chaotic map. In phases II and III, 3D chaotic maps are used to scramble the shuffled pixels through mixing and masking rules, respectively. Simulation results show that the suggested algorithm satisfies the required performance tests, such as a high level of security, a large key space and an acceptable encryption speed. These characteristics make it a suitable candidate for use in cryptographic applications.
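The shuffling phase can be illustrated with a much simpler one-dimensional stand-in: deriving a key-dependent pixel permutation by ranking a logistic-map orbit. This is not the authors' 3D map or search rule, only a sketch of the general idea that a chaotic orbit, seeded by a secret key, yields an invertible pixel rearrangement:

```python
def logistic_permutation(n, key, r=3.99, burn_in=100):
    """Derive a pixel-shuffling permutation of size n from a logistic-map
    orbit seeded with `key` in (0, 1). A 1D stand-in for the paper's
    3D chaotic map, for illustration only."""
    x = key
    for _ in range(burn_in):  # discard transient iterations
        x = r * x * (1 - x)
    values = []
    for _ in range(n):
        x = r * x * (1 - x)
        values.append(x)
    # ranking the chaotic orbit yields a key-dependent permutation
    return sorted(range(n), key=lambda i: values[i])

def shuffle(pixels, perm):
    return [pixels[p] for p in perm]

def unshuffle(pixels, perm):
    out = [None] * len(pixels)
    for i, p in enumerate(perm):
        out[p] = pixels[i]
    return out
```

Decryption simply inverts the permutation with the same key, so the shuffle leaks nothing about pixel values, only their positions; the mixing and masking phases would then alter the values themselves.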

  20. Topsoil organic carbon content of Europe, a new map based on a generalised additive model

    Science.gov (United States)

    de Brogniez, Delphine; Ballabio, Cristiano; Stevens, Antoine; Jones, Robert J. A.; Montanarella, Luca; van Wesemael, Bas

    2014-05-01

    There is an increasing demand for up-to-date, spatially continuous organic carbon (OC) data for global environmental and climatic modeling. Whilst the current map of topsoil organic carbon content for Europe (Jones et al., 2005) was produced by applying expert-knowledge-based pedo-transfer rules to large soil mapping units, the aim of this study was to replace it by applying digital soil mapping techniques to the first European harmonised geo-referenced topsoil (0-20 cm) database, which arises from the LUCAS (land use/cover area frame statistical survey) survey. A generalized additive model (GAM) was calibrated on 85% of the dataset (ca. 17 000 soil samples), and a backward stepwise approach selected slope, land cover, temperature, net primary productivity, latitude and longitude as environmental covariates (500 m resolution). The validation of the model (applied to the remaining 15% of the dataset) gave an R2 of 0.27. We observed that most organic soils were under-predicted by the model and that soils of Scandinavia were also poorly predicted. The model showed an RMSE of 42 g kg-1 for mineral soils and of 287 g kg-1 for organic soils. The map of predicted OC content showed the lowest values in Mediterranean countries and in croplands across Europe, whereas the highest OC contents were predicted in wetlands, woodlands and mountainous areas. The map of the standard error of the OC model predictions showed high values in northern latitudes, wetlands, moors and heathlands, whereas low uncertainty was mostly found in croplands. A comparison of our results with the map of Jones et al. (2005) showed general agreement on the prediction of mineral soils' OC content, most probably because the models use some common covariates, namely land cover and temperature. Our model however failed to predict values of OC content greater than 200 g kg-1, which we explain by the imposed unimodal distribution of our model, whose mean is tilted towards the majority of soils, which are mineral.
Finally, average
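The per-class error reporting used in the validation (separate RMSE for mineral and organic soils) can be sketched as follows; the mineral/organic labelling of each sample is assumed to be given:

```python
import math

def stratified_rmse(y_true, y_pred, is_organic):
    """RMSE reported separately for mineral and organic soils, mirroring
    the validation described above. The labels in `is_organic` are
    assumed to come from an external soil classification."""
    def rmse(pairs):
        return math.sqrt(sum((t - p) ** 2 for t, p in pairs) / len(pairs))

    groups = {False: [], True: []}
    for t, p, organic in zip(y_true, y_pred, is_organic):
        groups[bool(organic)].append((t, p))
    # return (mineral RMSE, organic RMSE)
    return rmse(groups[False]), rmse(groups[True])
```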

  1. An extended anchored linkage map and virtual mapping for the american mink genome based on homology to human and dog

    DEFF Research Database (Denmark)

    Anistoroaei, Razvan Marian; Ansari, S.; Farid, A.

    2009-01-01

    hybridization (FISH) and/or by means of human/dog/mink comparative homology. The average interval between markers is 8.5 cM and the linkage groups collectively span 1340 cM. In addition, 217 and 275 mink microsatellites have been placed on human and dog genomes, respectively. In conjunction with the existing...... comparative human/dog/mink data, these assignments represent useful virtual maps for the American mink genome. Comparison of the current human/dog assembled sequential map with the existing Zoo-FISH-based human/dog/mink maps helped to refine the human/dog/mink comparative map. Furthermore, comparison...... of the human and dog genome assemblies revealed a number of large synteny blocks, some of which are corroborated by data from the mink linkage map....

  2. Elemental mapping in scanning transmission electron microscopy

    International Nuclear Information System (INIS)

    Allen, L J; D'Alfonso, A J; Lugg, N R; Findlay, S D; LeBeau, J M; Stemmer, S

    2010-01-01

    We discuss atomic resolution chemical mapping in scanning transmission electron microscopy (STEM) based on core-loss electron energy loss spectroscopy (EELS) and also on energy dispersive X-ray (EDX) imaging. Chemical mapping using EELS can yield counterintuitive results which, however, can be understood using first principles calculations. Experimental chemical maps based on EDX bear out the thesis that such maps are always likely to be directly interpretable. This can be explained in terms of the local nature of the effective optical potential for ionization under those imaging conditions. This is followed by an excursion into the complementary technique of elemental mapping using energy-filtered transmission electron microscopy (EFTEM) in a conventional transmission electron microscope. We will then consider the widely used technique of Z-contrast or high-angle annular dark field (HAADF) imaging, which is based on phonon excitation, where it has recently been shown that intensity variations can be placed on an absolute scale by normalizing the measured intensities to the incident beam. Results, showing excellent agreement between theory and experiment to within a few percent, are shown for Z-contrast imaging from a sample of PbWO4.

  3. BAC-HAPPY mapping (BAP mapping): a new and efficient protocol for physical mapping.

    Directory of Open Access Journals (Sweden)

    Giang T H Vu

    2010-02-01

    Full Text Available Physical and linkage mapping underpin efforts to sequence and characterize the genomes of eukaryotic organisms by providing a skeleton framework for whole genome assembly. Hitherto, linkage and physical "contig" maps were generated independently prior to merging. Here, we develop a new and easy method, BAC HAPPY MAPPING (BAP mapping), that utilizes BAC library pools as a HAPPY mapping panel together with an Mbp-sized DNA panel to integrate the linkage and physical mapping efforts into one pipeline. Using Arabidopsis thaliana as an exemplar, a set of 40 Sequence Tagged Site (STS) markers spanning approximately 10% of chromosome 4 were simultaneously assembled onto a BAP map compiled using both a series of BAC pools each comprising 0.7x genome coverage and dilute (0.7x genome) samples of sheared genomic DNA. The resultant BAP map overcomes the need for polymorphic loci to separate genetic loci by recombination and allows physical mapping in segments of suppressed recombination that are difficult to analyze using traditional mapping techniques. Even virtual "BAC-HAPPY-mapping" to convert BAC landing data into BAC linkage contigs is possible.

  4. [MapDraw: a microsoft excel macro for drawing genetic linkage maps based on given genetic linkage data].

    Science.gov (United States)

    Liu, Ren-Hu; Meng, Jin-Ling

    2003-05-01

    MAPMAKER is one of the most widely used software packages for constructing genetic linkage maps. However, the PC version, MAPMAKER 3.0 for PC, cannot draw the genetic linkage maps that the Macintosh version, MAPMAKER 3.0 for Macintosh, can. In recent years especially, Macintosh computers have become much less popular than PCs, and most geneticists use a PC to analyze their genetic linkage data, so software that can draw on a PC the same genetic linkage maps that MAPMAKER for Macintosh draws on a Macintosh has long been needed. Microsoft Excel, one component of the Microsoft Office package, is one of the most popular programs for laboratory data processing, and Microsoft Visual Basic for Applications (VBA) is one of its most powerful features. Using this programming language, we can take creative control of Excel, including genetic linkage map construction, automatic data processing and more. In this paper, a Microsoft Excel macro called MapDraw is presented that draws genetic linkage maps on a PC from given genetic linkage data. With this software, you can construct a genetic linkage map in Excel and freely edit it or copy it to Word or other applications. The software is simply an Excel-format file. You can copy it from ftp://211.69.140.177 or ftp://brassica.hzau.edu.cn, and the source code can be found in Excel's Visual Basic Editor.

  5. Hash function based on chaotic map lattices.

    Science.gov (United States)

    Wang, Shihong; Hu, Gang

    2007-06-01

    A new hash function system, based on coupled chaotic map dynamics, is suggested. By combining floating point computation of chaos and some simple algebraic operations, the system reaches very high bit confusion and diffusion rates, and this enables the system to have desired statistical properties and strong collision resistance. The chaos-based hash function has its advantages for high security and fast performance, and it serves as one of the most highly competitive candidates for practical applications of hash function for software realization and secure information communications in computer networks.
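As an illustration of the general construction, a toy hash built on a ring lattice of diffusively coupled logistic maps is sketched below. The coupling scheme, byte-injection rule and 64-bit digest size are invented for the example; this is not the authors' algorithm and offers no real security:

```python
def chaotic_hash(message: bytes, lattice_size=8, rounds=8):
    """Toy hash over a ring lattice of coupled logistic maps.
    Illustrative only: not the authors' construction, and not secure."""
    r, eps = 3.99, 0.25  # chaotic regime and coupling strength (assumed)
    # initialise lattice sites at distinct points in (0, 1)
    x = [(i + 1) / (lattice_size + 1) for i in range(lattice_size)]

    def step(state):
        f = [r * v * (1 - v) for v in state]
        n = len(f)
        # diffusive coupling of each site with its ring neighbours
        return [(1 - eps) * f[i] + eps / 2 * (f[i - 1] + f[(i + 1) % n])
                for i in range(n)]

    for byte in message:
        site = byte % lattice_size
        # inject the message byte, keeping the site inside (0, 1)
        x[site] = (x[site] + (byte + 1) / 257) % 1.0 or 0.5
        for _ in range(rounds):
            x = step(x)
    # squeeze one byte per lattice site into a 64-bit digest
    digest = 0
    for v in x:
        digest = (digest << 8) | int(v * 256)
    return digest
```

The two properties the abstract emphasises show up even in the toy: floating-point chaos gives strong diffusion of each input byte across the whole lattice, and the algebraic injection/quantisation steps give bit confusion in the digest.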

  6. Smartphones Based Mobile Mapping Systems

    Directory of Open Access Journals (Sweden)

    A. Al-Hamad

    2014-06-01

    Full Text Available The past 20 years have witnessed an explosive growth in the demand for geo-spatial data. This demand has numerous sources and takes many forms; however, the net effect is an ever-increasing thirst for data that is more accurate, has higher density, is produced more rapidly, and is acquired less expensively. For mapping and Geographic Information Systems (GIS) projects, this has been achieved through the major development of Mobile Mapping Systems (MMS). MMS integrate various navigation and remote sensing technologies which allow mapping from moving platforms (e.g. cars, airplanes, boats, etc.) to obtain the 3D coordinates of the points of interest. Such systems obtain accuracies that are suitable for all but the most demanding mapping and engineering applications. However, this accuracy doesn't come cheaply. As a consequence of the platform and navigation and mapping technologies used, even an "inexpensive" system costs well over 200 000 USD. Today's mobile phones are getting ever more sophisticated. Phone makers are determined to reduce the gap between computers and mobile phones. Smartphones, in addition to becoming status symbols, are increasingly being equipped with extended Global Positioning System (GPS) capabilities, Micro Electro Mechanical System (MEMS) inertial sensors, extremely powerful computing power and very high resolution cameras. Using all of these components, smartphones have the potential to replace the traditional land MMS and portable GPS/GIS equipment. This paper introduces an innovative application of smartphones as a very low cost portable MMS for mapping and GIS applications.

  7. Probabilistic mapping of descriptive health status responses onto health state utilities using Bayesian networks: an empirical analysis converting SF-12 into EQ-5D utility index in a national US sample.

    Science.gov (United States)

    Le, Quang A; Doctor, Jason N

    2011-05-01

    As quality-adjusted life years have become the standard metric in health economic evaluations, mapping health-profile or disease-specific measures onto preference-based measures to obtain quality-adjusted life years has become a solution when health utilities are not directly available. However, current mapping methods are limited due to their predictive validity, reliability, and/or other methodological issues. We employ probability theory together with a graphical model, called a Bayesian network, to convert health-profile measures into preference-based measures and to compare the results to those estimated with current mapping methods. A sample of 19,678 adults who completed both the 12-item Short Form Health Survey (SF-12v2) and EuroQoL 5D (EQ-5D) questionnaires from the 2003 Medical Expenditure Panel Survey was split into training and validation sets. Bayesian networks were constructed to explore the probabilistic relationships between each EQ-5D domain and 12 items of the SF-12v2. The EQ-5D utility scores were estimated on the basis of the predicted probability of each response level of the 5 EQ-5D domains obtained from the Bayesian inference process using the following methods: Monte Carlo simulation, expected utility, and most-likely probability. Results were then compared with current mapping methods including multinomial logistic regression, ordinary least squares, and censored least absolute deviations. The Bayesian networks consistently outperformed other mapping models in the overall sample (mean absolute error=0.077, mean square error=0.013, and R overall=0.802), in different age groups, number of chronic conditions, and ranges of the EQ-5D index. Bayesian networks provide a new robust and natural approach to map health status responses into health utility measures for health economic evaluations.
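The expected-utility step can be sketched as follows: once the Bayesian network has produced a probability for each response level of each EQ-5D domain, an additive scoring function turns those probabilities into a single utility. Both the additivity assumption and the decrement values used in the test are placeholders for illustration, not a published EQ-5D tariff:

```python
def expected_eq5d_utility(domain_probs, decrements, full_health=1.0):
    """Expected-utility mapping sketch.

    domain_probs[d][l]: predicted probability of response level l on
    domain d (the Bayesian network's output for one respondent).
    decrements[d][l]: utility decrement for that level under an assumed
    additive scoring function (placeholder values, not a real tariff)."""
    utility = full_health
    for probs, decs in zip(domain_probs, decrements):
        # expected decrement for this domain, weighted by level probabilities
        utility -= sum(p * dec for p, dec in zip(probs, decs))
    return utility
```

The Monte Carlo and most-likely-probability variants mentioned in the abstract differ only in how the level probabilities are collapsed before scoring (sampling levels versus taking the modal level).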

  8. GeneRecon—A coalescent based tool for fine-scale association mapping

    DEFF Research Database (Denmark)

    Mailund, Thomas; Schierup, Mikkel Heide; Pedersen, Christian Nørgaard Storm

    2006-01-01

    GeneRecon is a tool for fine-scale association mapping using a coalescence model. GeneRecon takes as input case-control data from phased or unphased SNP and micro-satellite genotypes. The posterior distribution of disease locus position is obtained by Metropolis Hastings sampling in the state space...

  9. Agent-based mapping of credit risk for sustainable microfinance.

    Directory of Open Access Journals (Sweden)

    Joung-Hun Lee

    Full Text Available By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk--a major obstacle to the sustainability of lenders outreaching to the poor. Specifically, using the elements of network theory, we constructed an agent-based model that obeys the stylized rules of microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital.

  10. Agent-based mapping of credit risk for sustainable microfinance.

    Science.gov (United States)

    Lee, Joung-Hun; Jusup, Marko; Podobnik, Boris; Iwasa, Yoh

    2015-01-01

    By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk--a major obstacle to the sustainability of lenders outreaching to the poor. Specifically, using the elements of network theory, we constructed an agent-based model that obeys the stylized rules of microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital.
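A drastically simplified, mean-field flavour of such a model can be sketched in a few lines: each agent repays with a probability that rises with the economic environment and with the fraction of peers who repaid in the previous round (social enforcement), so a worsening environment can drag the population from a high- to a low-payment regime. The functional form and parameters are illustrative, not calibrated to the paper:

```python
import random

def simulate_payment_rate(economy, steps=200, n=500, coupling=0.6, seed=1):
    """Toy mean-field version of an agent-based repayment model.
    `economy` is an exogenous environment term in [0, 1]; `coupling`
    scales the peer-pressure feedback. Illustrative only."""
    rng = random.Random(seed)
    rate = 0.9  # start in a high-payment regime
    for _ in range(steps):
        # repayment probability: environment plus peer feedback, clamped
        p = min(1.0, max(0.0, economy + coupling * rate))
        paid = sum(1 for _ in range(n) if rng.random() < p)
        rate = paid / n
    return rate
```

With a healthy environment the feedback sustains near-universal repayment, while with the environment term removed the same feedback decays the payment rate towards zero, which loosely mirrors the regime shift described in the abstract.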

  11. Method for mapping population-based case-control studies: an application using generalized additive models

    Directory of Open Access Journals (Sweden)

    Aschengrau Ann

    2006-06-01

    Full Text Available Background: Mapping spatial distributions of disease occurrence and risk can serve as a useful tool for identifying exposures of public health concern. Disease registry data are often mapped by town or county of diagnosis and contain limited data on covariates. These maps often possess poor spatial resolution, the potential for spatial confounding, and the inability to consider latency. Population-based case-control studies can provide detailed information on residential history and covariates. Results: Generalized additive models (GAMs) provide a useful framework for mapping point-based epidemiologic data. Smoothing on location while controlling for covariates produces adjusted maps. We generate maps of odds ratios using the entire study area as a reference. We smooth using a locally weighted regression smoother (loess), a method that combines the advantages of nearest neighbor and kernel methods. We choose an optimal degree of smoothing by minimizing Akaike's Information Criterion. We use a deviance-based test to assess the overall importance of location in the model and pointwise permutation tests to locate regions of significantly increased or decreased risk. The method is illustrated with synthetic data and data from a population-based case-control study, using S-Plus and ArcView software. Conclusion: Our goal is to develop practical methods for mapping population-based case-control and cohort studies. The method described here performs well for our synthetic data, reproducing important features of the data and adequately controlling the covariate. When applied to the population-based case-control data set, the method suggests spatial confounding and identifies statistically significant areas of increased and decreased odds ratios.
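The loess smoother at the heart of the method can be sketched in one dimension (the paper smooths on two-dimensional location): tricube weights over the nearest fraction of the data, followed by a weighted linear fit at the query point:

```python
def loess_at(x0, xs, ys, span=0.5):
    """Locally weighted linear regression (loess-style) evaluated at x0,
    with tricube weights over the nearest `span` fraction of the data.
    A 1D sketch; the paper's smoother operates on 2D location."""
    k = max(2, int(span * len(xs)))
    # distance to the k-th nearest point sets the local bandwidth
    d = sorted(abs(x - x0) for x in xs)
    h = d[k - 1] or 1.0
    w = [(1 - min(1.0, abs(x - x0) / h) ** 3) ** 3 for x in xs]
    # closed-form weighted least squares for y = a + b*x
    sw = sum(w)
    swx = sum(wi * x for wi, x in zip(w, xs))
    swy = sum(wi * y for wi, y in zip(w, ys))
    swxx = sum(wi * x * x for wi, x in zip(w, xs))
    swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    denom = sw * swxx - swx * swx
    if denom == 0:
        return swy / sw  # degenerate neighbourhood: weighted mean
    b = (sw * swxy - swx * swy) / denom
    a = (swy - b * swx) / sw
    return a + b * x0
```

The span plays the role of the paper's degree of smoothing; in the full method it would be chosen by minimizing AIC rather than fixed.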

  12. Prediction of CT Substitutes from MR Images Based on Local Diffeomorphic Mapping for Brain PET Attenuation Correction.

    Science.gov (United States)

    Wu, Yao; Yang, Wei; Lu, Lijun; Lu, Zhentai; Zhong, Liming; Huang, Meiyan; Feng, Yanqiu; Feng, Qianjin; Chen, Wufan

    2016-10-01

    Attenuation correction is important for PET reconstruction. In PET/MR, MR intensities are not directly related to the attenuation coefficients that are needed in PET imaging. The attenuation coefficient map can be derived from CT images. Therefore, prediction of CT substitutes from MR images is desired for attenuation correction in PET/MR. This study presents a patch-based method for CT prediction from MR images, generating attenuation maps for PET reconstruction. Because no global relation exists between MR and CT intensities, we propose local diffeomorphic mapping (LDM) for CT prediction. In LDM, we assume that MR and CT patches are located on two nonlinear manifolds and that the mapping from the MR manifold to the CT manifold approximates a diffeomorphism under a local constraint. Locality is important in LDM and is enforced by the following techniques. The first is local dictionary construction: for each patch in the testing MR image, a local search window is used to extract patches from training MR/CT pairs to construct MR and CT dictionaries, and the k-nearest neighbors and an outlier detection strategy are then used to constrain the locality in the MR and CT dictionaries. The second is local linear representation: local anchor embedding is used to solve the MR dictionary coefficients when representing the MR testing sample. Under these local constraints, dictionary coefficients are linearly transferred from the MR manifold to the CT manifold and used to combine CT training samples to generate CT predictions. Our dataset contains 13 healthy subjects, each with T1- and T2-weighted MR and CT brain images. The method provides CT predictions with a mean absolute error of 110.1 Hounsfield units, a Pearson linear correlation of 0.82, a peak signal-to-noise ratio of 24.81 dB, and a Dice coefficient in bone regions of 0.84 as compared with real CTs. CT substitute-based PET reconstruction has a regression slope of 1.0084 and an R2 of 0.9903 compared with real CT-based PET.
In this method, no
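The patch-transfer idea can be sketched with plain inverse-distance k-nearest-neighbour weights standing in for local anchor embedding: represent the test MR patch by its nearest dictionary patches, then apply the same weights to the paired CT patches:

```python
def predict_ct_patch(mr_patch, mr_dict, ct_dict, k=3):
    """Nearest-neighbour sketch of patch-based CT prediction.
    mr_dict[i] and ct_dict[i] are a paired MR/CT training patch.
    The paper solves the weights by local anchor embedding; simple
    inverse-distance weights stand in here for illustration."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    # k nearest MR dictionary patches to the test patch
    ranked = sorted(range(len(mr_dict)),
                    key=lambda i: dist(mr_patch, mr_dict[i]))[:k]
    weights = [1.0 / (1e-6 + dist(mr_patch, mr_dict[i])) for i in ranked]
    total = sum(weights)
    weights = [w / total for w in weights]
    # transfer the weights to the paired CT patches and combine linearly
    n = len(ct_dict[0])
    return [sum(w * ct_dict[i][j] for w, i in zip(weights, ranked))
            for j in range(n)]
```

In the full method the candidate patches would come from a local search window around the test location, with outlier rejection applied before the weights are solved.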

  13. Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.

    Science.gov (United States)

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A

    2012-08-01

    To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way for conducting dynamic mapping. Copyright © 2011 Elsevier Inc. All rights reserved.
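The evaluation against the gold standard reduces to precision and recall over term-to-code pairs. One common convention is sketched below; the paper's exact denominators (e.g. how terms the tool leaves unmapped are counted) are not specified in the abstract, so this is an assumption:

```python
def mapping_precision_recall(proposed, gold):
    """Evaluate an automated term mapping against a gold standard.
    `proposed` and `gold` are dicts from local terms to target codes;
    terms the tool leaves unmapped are simply absent from `proposed`.
    Convention assumed: precision over proposed pairs, recall over
    gold pairs."""
    tp = sum(1 for term, code in proposed.items() if gold.get(term) == code)
    precision = tp / len(proposed) if proposed else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall
```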

  14. Vaccination with peptides of Mycobacterium avium subsp. paratuberculosis (MAP) reduces MAP burden of infected goats

    DEFF Research Database (Denmark)

    Melvang, Heidi Mikkelsen; Hassan, Sufia Butt; Thakur, Aneesh

    Mycobacterium avium subsp. paratuberculosis (Map) is the cause of paratuberculosis, a chronic enteritis of ruminants that is widespread worldwide. We investigated the effect of post-exposure vaccination with Map specific peptides in a goat model aiming at developing a Map vaccine that will neither...... unique to Map from selected proteins (n =68). For vaccination, 23 MAP peptides (20 µg each) were selected and formulated with Montanide ISA 61 VG adjuvant. At age three weeks 10 goats were orally inoculated with 4x10E9 live Map and assigned to two groups of 5 goats each: 5 vaccinated (V) at 14 and 18...... weeks post inoculation (PI) and 5 unvaccinated (C). At termination 32 weeks PI, Map burdens in 15 intestinal tissues and lymph nodes were determined by IS900 qPCR. Of the 75 tissue samples from the 5 C goats only 5 samples were IS900 qPCR negative. In contrast, only 9 samples in total from 5 V goats...

  15. USGS "Did You Feel It?" internet-based macroseismic intensity maps

    Science.gov (United States)

    Wald, D.J.; Quitoriano, V.; Worden, B.; Hopper, M.; Dewey, J.W.

    2011-01-01

    The U.S. Geological Survey (USGS) "Did You Feel It?" (DYFI) system is an automated approach for rapidly collecting macroseismic intensity data from Internet users' shaking and damage reports and generating intensity maps immediately following earthquakes; it has been operating for over a decade (1999-2011). DYFI-based intensity maps made rapidly available through the DYFI system fundamentally depart from more traditional maps made available in the past. The maps are made more quickly, provide more complete coverage and higher resolution, provide for citizen input and interaction, and allow data collection at rates and quantities never before considered. These aspects of Internet data collection, in turn, allow for data analyses, graphics, and ways to communicate with the public, opportunities not possible with traditional data-collection approaches. Yet web-based contributions also pose considerable challenges, as discussed herein. After a decade of operational experience with the DYFI system and users, we document refinements to the processing and algorithmic procedures since DYFI was first conceived. We also describe a number of automatic post-processing tools, operations, applications, and research directions, all of which utilize the extensive DYFI intensity datasets now gathered in near-real time. DYFI can be found online at the website http://earthquake.usgs.gov/dyfi/. ?? 2011 by the Istituto Nazionale di Geofisica e Vulcanologia.

  16. An Anomaly Detector Based on Multi-aperture Mapping for Hyperspectral Data

    Directory of Open Access Journals (Sweden)

    LI Min

    2016-10-01

    Because the spectral content of anomalies is correlated with that of the cluttered background, inaccurate selection of background pixels introduces estimation error into the background model. To address this problem, a multi-aperture-mapping-based anomaly detector is proposed in this paper. First, unlike a background model focused on extracting features of the background alone, the multi-aperture mapping of the hyperspectral data characterizes the whole data set. From the constructed basis set of the multi-aperture mapping, an anomaly salience index is computed for every test pixel to measure its relative statistical difference. Second, to analyze moderately salient anomalies precisely, a membership value based on fuzzy logic theory is constructed to grade the anomaly salience of test pixels continuously; with the membership value as a weight, a weighted iterative estimate of the multi-aperture mapping converges adaptively. Third, classical defuzzification is used to fuse the different detection results. Experiments on hyperspectral data demonstrate the robustness of the proposed detector and its sensitivity to anomalies of low salience.

  17. Subpixel Mapping of Hyperspectral Image Based on Linear Subpixel Feature Detection and Object Optimization

    Science.gov (United States)

    Liu, Zhaoxin; Zhao, Liaoying; Li, Xiaorun; Chen, Shuhan

    2018-04-01

    Owing to the limited spatial resolution of the imaging sensor and the variability of ground surfaces, mixed pixels are widespread in hyperspectral imagery. Traditional subpixel mapping algorithms treat all mixed pixels as boundary-mixed pixels, ignoring the existence of linear subpixels. To address this problem, this paper proposes a new subpixel mapping method based on linear subpixel feature detection and object optimization. First, the fraction value of each class is obtained by spectral unmixing. Second, linear subpixel features are pre-determined from the hyperspectral characteristics, and the remaining mixed pixels are detected by maximum linearization index analysis; the classes of linear subpixels are determined using a template matching method. Finally, the whole subpixel mapping result is iteratively optimized by a binary particle swarm optimization algorithm. The performance of the proposed subpixel mapping method is evaluated via experiments on simulated and real hyperspectral data sets. The experimental results demonstrate that the proposed method improves the accuracy of subpixel mapping.

  18. Self-organizing adaptive map: autonomous learning of curves and surfaces from point samples.

    Science.gov (United States)

    Piastra, Marco

    2013-05-01

    Competitive Hebbian Learning (CHL) (Martinetz, 1993) is a simple and elegant method for estimating the topology of a manifold from point samples. The method has been adopted in a number of self-organizing networks described in the literature and has given rise to related studies in the fields of geometry and computational topology. Recent results from these fields have shown that a faithful reconstruction can be obtained using the CHL method only for curves and surfaces. Within these limitations, these findings constitute a basis for defining a CHL-based, growing self-organizing network that produces a faithful reconstruction of an input manifold. The SOAM (Self-Organizing Adaptive Map) algorithm adapts its local structure autonomously in such a way that it can match the features of the manifold being learned. The adaptation process is driven by the defects arising when the network structure is inadequate, which cause a growth in the density of units. Regions of the network undergo a phase transition and change their behavior whenever a simple, local condition of topological regularity is met. The phase transition is eventually completed across the entire structure and the adaptation process terminates. In specific conditions, the structure thus obtained is homeomorphic to the input manifold. During the adaptation process, the network also has the capability to focus on the acquisition of input point samples in critical regions, with a substantial increase in efficiency. The behavior of the network has been assessed experimentally with typical data sets for surface reconstruction, including suboptimal conditions, e.g. with undersampling and noise. Copyright © 2012 Elsevier Ltd. All rights reserved.
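
    The CHL rule itself is compact: for each input sample, connect the two reference units closest to it; the resulting edge set approximates the topology of the manifold. A minimal sketch of that single step (unit positions and the squared-Euclidean metric are illustrative; the SOAM's growth and phase-transition machinery is not shown):

```python
# Minimal sketch of the Competitive Hebbian Learning rule the SOAM
# builds on: for every input sample, add an edge between the two
# reference units nearest to it. Units and samples are coordinate
# tuples; distances are squared Euclidean.
def chl_edges(units, samples):
    edges = set()
    for s in samples:
        # rank unit indices by squared distance to the sample
        ranked = sorted(range(len(units)),
                        key=lambda i: sum((u - v) ** 2
                                          for u, v in zip(units[i], s)))
        edges.add(tuple(sorted(ranked[:2])))  # connect the two winners
    return edges
```

    With enough samples drawn densely from the manifold, the edge set converges to a subcomplex of the Delaunay triangulation restricted to the manifold, which is the faithfulness result the abstract refers to.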

  19. Automatic Mapping of Forest Stands Based on Three-Dimensional Point Clouds Derived from Terrestrial Laser-Scanning

    Directory of Open Access Journals (Sweden)

    Tim Ritter

    2017-07-01

    Mapping of exact tree positions can be regarded as a crucial task of field work associated with forest monitoring, especially on intensive research plots. We propose a two-stage density-clustering approach for the automatic mapping of tree positions, and an algorithm for automatic tree diameter estimation, based on terrestrial laser scanning (TLS) point cloud data sampled under limited sighting conditions. We show that our novel approach is able to detect tree positions in a mixed and vertically structured stand with an overall accuracy of 91.6%, and with omission and commission errors of only 5.7% and 2.7%, respectively. Moreover, we were able to reproduce the stand's diameter at breast height (DBH) distribution, and to estimate single-tree DBH with a mean average deviation of ±2.90 cm compared with tape measurements as reference.
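
    The core idea of detecting stems as dense clusters of points can be sketched in 2-D as follows (a simplified single-pass DBSCAN-style clustering, not the authors' two-stage algorithm; the eps and min_pts values are illustrative):

```python
# Loose sketch of density clustering for tree detection: group 2-D
# stem hits whose eps-neighborhood contains at least min_pts points,
# then report each cluster's centroid as a candidate tree position.
def cluster_tree_positions(points, eps=0.3, min_pts=3):
    eps2 = eps * eps
    labels = [-1] * len(points)  # -1 = unassigned / noise
    cluster = 0
    for i, p in enumerate(points):
        if labels[i] != -1:
            continue
        neighbors = [j for j, q in enumerate(points)
                     if (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= eps2]
        if len(neighbors) < min_pts:
            continue  # not a core point
        # flood-fill the density-connected region
        stack = list(neighbors)
        while stack:
            j = stack.pop()
            if labels[j] != -1:
                continue
            labels[j] = cluster
            q = points[j]
            nn = [k for k, r in enumerate(points)
                  if (q[0] - r[0]) ** 2 + (q[1] - r[1]) ** 2 <= eps2]
            if len(nn) >= min_pts:
                stack.extend(nn)  # expand only from core points
        cluster += 1
    centroids = []
    for c in range(cluster):
        members = [points[i] for i in range(len(points)) if labels[i] == c]
        centroids.append((sum(x for x, _ in members) / len(members),
                          sum(y for _, y in members) / len(members)))
    return centroids
```

    In practice the input would be a horizontal slice of the TLS point cloud around breast height, and the diameter estimate would follow from a circle fit to each cluster's points.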

  20. Calculation of upper confidence bounds on not-sampled vegetation types using a systematic grid sample: An application to map unit definition for existing vegetation maps

    Science.gov (United States)

    Paul L. Patterson; Mark Finco

    2009-01-01

    This paper explores the information FIA data can produce regarding forest types that were not sampled and develops the equations necessary to define the upper confidence bounds on not-sampled forest types. The problem is reduced to a Bernoulli variable. This simplification allows the upper confidence bounds to be calculated based on Cochran (1977). Examples are...
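
    The Bernoulli reduction described above has a closed form worth noting: if a forest type is absent from all n sample plots, the exact one-sided upper confidence bound on its occurrence probability solves (1 − p)^n = α. A sketch of that textbook result (following Cochran's treatment in spirit, not the paper's own code):

```python
# Illustrative sketch: exact one-sided upper confidence bound on the
# occurrence probability of a vegetation type observed in none of
# n systematic sample plots. With zero successes in n Bernoulli
# trials, the (1 - alpha) upper bound p_U solves (1 - p)^n = alpha,
# giving p_U = 1 - alpha**(1/n).
def upper_bound_not_sampled(n_plots: int, alpha: float = 0.05) -> float:
    """Upper confidence bound for a type absent from all n_plots."""
    if n_plots <= 0:
        raise ValueError("n_plots must be positive")
    return 1.0 - alpha ** (1.0 / n_plots)
```

    For example, with 300 plots and α = 0.05 the bound is roughly 1%, close to the familiar "rule of three" approximation 3/n.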

  1. Cloud-based computation for accelerating vegetation mapping and change detection at regional to national scales

    Science.gov (United States)

    Matthew J. Gregory; Zhiqiang Yang; David M. Bell; Warren B. Cohen; Sean Healey; Janet L. Ohmann; Heather M. Roberts

    2015-01-01

    Mapping vegetation and landscape change at fine spatial scales is needed to inform natural resource and conservation planning, but such maps are expensive and time-consuming to produce. For Landsat-based methodologies, mapping efforts are hampered by the daunting task of manipulating multivariate data for millions to billions of pixels. The advent of cloud-based...

  2. Whole brain MP2RAGE-based mapping of the longitudinal relaxation time at 9.4T.

    Science.gov (United States)

    Hagberg, G E; Bause, J; Ethofer, T; Ehses, P; Dresler, T; Herbert, C; Pohmann, R; Shajan, G; Fallgatter, A; Pavlova, M A; Scheffler, K

    2017-01-01

    Mapping of the longitudinal relaxation time (T1) with high accuracy and precision is central for neuroscientific and clinical research, since it opens up the possibility to obtain accurate brain tissue segmentation and gain myelin-related information. An ideal, quantitative method should enable whole brain coverage within a limited scan time yet allow for detailed sampling with sub-millimeter voxel sizes. The use of ultra-high magnetic fields is well suited for this purpose; however, the inhomogeneous transmit field potentially hampers its use. In the present work, we conducted whole brain T1 mapping based on the MP2RAGE sequence at 9.4T and explored potential pitfalls for automated tissue classification compared with 3T. Data accuracy and the T2-dependent variation of the adiabatic inversion efficiency were investigated by single-slice T1 mapping with inversion recovery EPI measurements, quantitative T2 mapping using multi-echo techniques, and simulations of the Bloch equations. We found that the prominent spatial variation of the transmit field at 9.4T (yielding flip angles between 20% and 180% of nominal values) profoundly affected the results of image segmentation and T1 mapping. These effects could be mitigated by correcting for both flip angle and inversion efficiency deviations. Based on the corrected T1 maps, new, 'flattened', MP2RAGE contrast images were generated that were no longer affected by variations of the transmit field. Unlike the uncorrected MP2RAGE contrast images acquired at 9.4T, these flattened images yielded image segmentations comparable to 3T, making bias-field correction prior to image segmentation and tissue classification unnecessary. In terms of the T1 estimates at high field, the proposed correction methods resulted in an improved precision, with test-retest variability below 1% and a coefficient-of-variation across 25 subjects below 3%. Copyright © 2016 Elsevier Inc. All rights reserved.
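
    The inversion-recovery relation underlying the reference measurements can be sketched as follows, assuming perfect inversion efficiency and a known equilibrium signal S0 = 1 (precisely the idealizations whose violation at 9.4T the study corrects for). A coarse grid search recovers T1 from noiseless samples:

```python
# Toy sketch of inversion-recovery T1 fitting (not the study's
# pipeline): the signed IR signal is S(TI) = S0 * (1 - 2*exp(-TI/T1)),
# assuming perfect inversion efficiency. A grid search over candidate
# T1 values minimizes the sum of squared errors.
import math

def ir_signal(ti, s0, t1):
    """Signed inversion-recovery signal at inversion time ti (ms)."""
    return s0 * (1.0 - 2.0 * math.exp(-ti / t1))

def fit_t1(tis, signals, t1_grid):
    """Return the grid T1 (ms) with the smallest squared-error misfit."""
    def sse(t1):
        return sum((s - ir_signal(ti, 1.0, t1)) ** 2
                   for ti, s in zip(tis, signals))
    return min(t1_grid, key=sse)
```

    In a real acquisition S0 and the inversion efficiency would be fitted jointly with T1, and at ultra-high field the flip-angle deviations discussed above must enter the model as well.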

  3. Pemanfaatan Google Maps Api Untuk Visualisasi Data Base Transceiver Station

    OpenAIRE

    Rani, Septia

    2016-01-01

    This paper discusses the use of the Google Maps API to perform data visualization for Base Transceiver Station (BTS) data. BTS are typically used by telecommunications companies to facilitate wireless communication between communication devices and the network operator. Each BTS has important information such as its location, its transaction traffic, as well as information about revenue. With the implementation of BTS data visualization using the Google Maps API, key information owned by e...

  5. A novel block cryptosystem based on iterating a chaotic map

    International Nuclear Information System (INIS)

    Xiang Tao; Liao Xiaofeng; Tang Guoping; Chen Yong; Wong, Kwok-wo

    2006-01-01

    A block cryptographic scheme based on iterating a chaotic map is proposed. With random binary sequences generated from the real-valued chaotic map, the plaintext block is permuted by a key-dependent shift approach and then encrypted by the classical chaotic masking technique. Simulation results show that performance and security of the proposed cryptographic scheme are better than those of existing algorithms. Advantages and security of our scheme are also discussed in detail
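
    The two ingredients named above, a key-dependent shift permutation followed by chaotic masking, can be illustrated with a toy version driven by the logistic map (the parameters, block handling and bit extraction here are illustrative, not the authors' scheme, and offer no real security):

```python
# Toy illustration of a chaotic block cipher: derive a binary keystream
# from the real-valued logistic map x -> r*x*(1-x), rotate (shift-
# permute) the plaintext block by a key-dependent amount, then mask it
# by XOR with the keystream. NOT secure; for illustration only.
def logistic_bits(x0: float, n: int, r: float = 3.9999) -> list[int]:
    x, bits = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)  # threshold the orbit to a bit
    return bits

def _keystream(x0: float, nbytes: int) -> bytes:
    stream = logistic_bits(x0, 8 * nbytes)
    return bytes(int("".join(map(str, stream[i:i + 8])), 2)
                 for i in range(0, len(stream), 8))

def encrypt_block(block: bytes, x0: float, shift: int) -> bytes:
    s = shift % len(block)
    rotated = block[s:] + block[:s]          # key-dependent shift
    key = _keystream(x0, len(rotated))
    return bytes(a ^ b for a, b in zip(rotated, key))  # chaotic masking

def decrypt_block(ct: bytes, x0: float, shift: int) -> bytes:
    key = _keystream(x0, len(ct))
    rotated = bytes(a ^ b for a, b in zip(ct, key))
    s = shift % len(ct)
    return rotated[-s:] + rotated[:-s] if s else rotated
```

    The key here is the pair (x0, shift); sensitivity of the logistic map to x0 is what gives such schemes their key-dependence, while the shift provides the permutation stage.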

  7. The first generation of a BAC-based physical map of Brassica rapa

    Directory of Open Access Journals (Sweden)

    Lee Soo

    2008-06-01

    Abstract Background The genus Brassica includes the most extensively cultivated vegetable crops worldwide. Investigation of the Brassica genome presents excellent challenges to study plant genome evolution and divergence of gene function associated with polyploidy and genome hybridization. A physical map of the B. rapa genome is a fundamental tool for analysis of Brassica "A" genome structure. Integration of a physical map with an existing genetic map, by linking genetic markers and BAC clones in the sequencing pipeline, provides a crucial resource for the ongoing genome sequencing effort and assembly of whole genome sequences. Results A genome-wide physical map of the B. rapa genome was constructed by capillary electrophoresis-based fingerprinting of 67,468 Bacterial Artificial Chromosome (BAC) clones using the five restriction enzyme SNaPshot technique. The clones were assembled into contigs by means of FPC v8.5.3. After contig validation and manual editing, the resulting assembly consists of 1,428 contigs and is estimated to span 717 Mb in physical length. This map provides 242 anchored contigs on 10 linkage groups to serve as seed points from which to continue bidirectional chromosome extension for genome sequencing. Conclusion The map reported here is the first physical map for the Brassica "A" genome based on the High Information Content Fingerprinting (HICF) technique. This physical map will serve as a fundamental genomic resource for accelerating genome sequencing, assembly of BAC sequences, and comparative genomics between Brassica genomes. The current build of the B. rapa physical map is available at the B. rapa Genome Project website for the user community.

  8. Map-IT! A Web-Based GIS Tool for Watershed Science Education.

    Science.gov (United States)

    Curtis, David H.; Hewes, Christopher M.; Lossau, Matthew J.

    This paper describes the development of a prototypic, Web-accessible GIS solution for K-12 science education and citizen-based watershed monitoring. The server side consists of ArcView IMS running on an NT workstation. The client is built around MapCafe. The client interface, which runs through a standard Web browser, supports standard MapCafe…

  9. Semantic Modeling for SNPs Associated with Ethnic Disparities in HapMap Samples

    Directory of Open Access Journals (Sweden)

    HyoYoung Kim

    2014-03-01

    Single-nucleotide polymorphisms (SNPs) have been emerging out of the efforts to research human diseases and ethnic disparities. A semantic network is needed for in-depth understanding of the impacts of SNPs, because phenotypes are modulated by complex networks, including biochemical and physiological pathways. We identified ethnicity-specific SNPs by eliminating overlapping SNPs from HapMap samples, and the ethnicity-specific SNPs were mapped to the UCSC RefGene lists. Ethnicity-specific genes were identified as follows: 22 genes in the USA (CEU) individuals, 25 genes in the Japanese (JPT) individuals, and 332 genes in the African (YRI) individuals. To analyze the biologically functional implications of ethnicity-specific SNPs, we focused on constructing a semantic network model. Entities for the network, represented by "Gene," "Pathway," "Disease," "Chemical," "Drug," "ClinicalTrials," and "SNP," and the relationships between entities were obtained through curation. Our semantic modeling for ethnicity-specific SNPs showed interesting results in three categories, including three diseases ("AIDS-associated nephropathy," "Hypertension," and "Pelvic infection"), one drug ("Methylphenidate"), and five pathways ("Hemostasis," "Systemic lupus erythematosus," "Prostate cancer," "Hepatitis C virus," and "Rheumatoid arthritis"). We found ethnicity-specific genes using the semantic modeling, and the majority of our findings were consistent with previous studies - that an understanding of genetic variability explains ethnicity-specific disparities.

  10. Groundwater potentiality mapping using geoelectrical-based aquifer hydraulic parameters: A GIS-based multi-criteria decision analysis modeling approach

    Directory of Open Access Journals (Sweden)

    Kehinde Anthony Mogaji Hwee San Lim

    2017-01-01

    This study conducted a robust analysis of acquired 2D resistivity imaging data and borehole pumping test (BPT) records to optimize groundwater potentiality mapping in Perak province, Malaysia, using derived aquifer hydraulic properties. The transverse resistance (TR) parameter was determined from the interpreted 2D resistivity imaging data by applying the Dar-Zarrouk parameter equation. Linear regression and GIS techniques were used to regress the estimated TR values against the aquifer transmissivity values extracted from the geospatially produced BPT-records-based aquifer transmissivity map, to develop an aquifer transmissivity parameter predictive (ATPP) model. The ATPP model, whose reliability was evaluated using the Theil inequality coefficient measurement approach, was used to establish geoelectrical-based hydraulic parameter (GHP) modeling equations for the modeling of transmissivity (Tr), hydraulic conductivity (K), storativity (St), and hydraulic diffusivity (D) properties. Applying the GHP modeling equations to the delineated aquifer media produced aquifer potential conditioning factor maps for Tr, K, St, and D. The maps were combined to develop an aquifer potential mapping index (APMI) model by applying the multi-criteria decision analysis-analytic hierarchy process principle. The area groundwater reservoir productivity potential model map, produced from the processed APMI model estimates in the GIS environment, was found to be 71% accurate. This study establishes a good alternative approach for determining aquifer hydraulic parameters, even in areas where pumping test information is unavailable, using cost-effective geophysical data. The produced map can be explored for hydrological decision making.
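
    The Dar-Zarrouk transverse resistance used above has a simple closed form for a layered section: TR = Σ h_i ρ_i over layer thicknesses h_i and resistivities ρ_i. A one-function sketch (the layer values in the test are illustrative):

```python
# Sketch of the Dar-Zarrouk transverse resistance for a layered earth:
# TR = sum over layers of thickness (m) times resistivity (ohm-m),
# giving units of ohm-m^2.
def transverse_resistance(layers):
    """layers: iterable of (thickness_m, resistivity_ohm_m) tuples."""
    return sum(h * rho for h, rho in layers)
```

    It is this aggregate quantity that the study regresses against pumping-test transmissivity, since both integrate properties over the aquifer's thickness.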

  11. Application of terrestrial laser scanning to the development and updating of the base map

    Directory of Open Access Journals (Sweden)

    Klapa Przemysław

    2017-06-01

    The base map provides basic information about land to individuals, companies, developers, design engineers, organizations, and government agencies. Its contents include spatial location data for control network points, buildings, land lots, infrastructure facilities, and topographic features. As the primary map of the country, it must be developed in accordance with specific laws and regulations and be continuously updated. The base map is a data source used for the development and updating of derivative maps and other large-scale cartographic materials such as thematic or topographic maps. Thanks to the advancement of science and technology, the quality of land surveys carried out by means of terrestrial laser scanning (TLS) matches that of traditional surveying methods in many respects.

  12. A natural-color mapping for single-band night-time image based on FPGA

    Science.gov (United States)

    Wang, Yilun; Qian, Yunsheng

    2018-01-01

    A natural-color mapping method for single-band night-time images based on an FPGA transfers the color of a reference image to the single-band night-time image, which is consistent with human visual habits and can help observers identify targets. This paper introduces the processing chain of the natural-color mapping algorithm on an FPGA. First, the image is transformed by histogram equalization, and the intensity and standard-deviation features of the reference image are stored in SRAM. Then, the intensity and standard-deviation features of the real-time digital images are calculated by the FPGA. Finally, the FPGA completes the color mapping by matching pixels between images using the features in the luminance channel.
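
    The statistics-matching step described above amounts to a Reinhard-style transfer: shift and scale the night-time image's luminance so that its mean and standard deviation match those of the reference. A minimal sketch (plain Python on a flat list of luminance values, not the FPGA implementation):

```python
# Sketch of mean/standard-deviation matching in the luminance channel:
# out = (v - mean_target) * (std_ref / std_target) + mean_ref.
import statistics

def match_luminance(target, ref_mean, ref_std):
    """Remap target luminances to the reference's mean and std."""
    t_mean = statistics.fmean(target)
    t_std = statistics.pstdev(target) or 1.0  # guard against flat images
    return [(v - t_mean) * (ref_std / t_std) + ref_mean for v in target]
```

    On hardware this reduces to one multiply and one add per pixel once the two statistics are known, which is why it suits a real-time FPGA pipeline.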

  13. Real-time flood extent maps based on social media

    Science.gov (United States)

    Eilander, Dirk; van Loenen, Arnejan; Roskam, Ruud; Wagemaker, Jurjen

    2015-04-01

    During a flood event it is often difficult to get accurate information about the flood extent and the people affected. This information is very important for disaster risk reduction management and crisis relief organizations. In the post-flood phase, information about the flood extent is needed for damage estimation and for calibrating hydrodynamic models. Currently, flood extent maps are derived from a few sources such as satellite images, aerial images and post-flooding flood marks. However, getting accurate real-time or maximum flood extent maps remains difficult. With the rise of social media, we now have a new source of information with large numbers of observations. In the city of Jakarta, Indonesia, the intensity of unique flood-related tweets during a flood event peaked at 8 tweets per second during floods in early 2014. A fair number of these tweets also contain observations of water depth and location. Our hypothesis is that, based on the large numbers of tweets, it is possible to generate real-time flood extent maps. In this study we use tweets from the city of Jakarta, Indonesia, to generate these flood extent maps. The data-mining procedure looks for tweets with a mention of 'banjir', the Bahasa Indonesia word for flood. It then removes modified and retweeted messages in order to keep unique tweets only. Since tweets are not always sent directly from the location of observation, the geotag in the tweets is unreliable. We therefore extract location information using mentions of names of neighborhoods and points of interest. Finally, where encountered, a mention of a length measure is extracted as water depth. Tweets containing a location reference and a water level are considered to be flood observations. The strength of this method is that it can easily be extended to other regions and languages. Based on the intensity of tweets in Jakarta during a flood event we can provide a rough estimate of the flood extent. To provide more accurate flood extent...
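
    The keyword filtering, deduplication and depth extraction described above can be sketched as follows (the regex, place names and tweet handling are hypothetical simplifications, not the study's pipeline):

```python
# Hypothetical sketch of the tweet-mining step: keep unique tweets
# mentioning 'banjir', match a known place name, and pull out a
# water-depth mention in centimetres.
import re

DEPTH_RE = re.compile(r"(\d+)\s*cm")  # illustrative depth pattern

def flood_observations(tweets, places):
    """Return (place, depth_cm) pairs from raw tweet texts."""
    seen, obs = set(), []
    for text in tweets:
        if text.startswith("RT ") or text in seen or "banjir" not in text.lower():
            continue  # drop retweets, duplicates, off-topic messages
        seen.add(text)
        place = next((p for p in places if p.lower() in text.lower()), None)
        depth = DEPTH_RE.search(text)
        if place and depth:
            obs.append((place, int(depth.group(1))))
    return obs
```

    Aggregating such observations per neighborhood, weighted by tweet intensity, is what yields the rough real-time extent estimate the abstract describes.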

  14. NaviCell: a web-based environment for navigation, curation and maintenance of large molecular interaction maps.

    Science.gov (United States)

    Kuperstein, Inna; Cohen, David P A; Pook, Stuart; Viara, Eric; Calzone, Laurence; Barillot, Emmanuel; Zinovyev, Andrei

    2013-10-07

    Molecular biology knowledge can be formalized and systematically represented in a computer-readable form as a comprehensive map of molecular interactions. There is an increasing number of maps of molecular interactions containing detailed and step-wise descriptions of various cell mechanisms. It is difficult to explore these large maps, to organize discussion of their content and to maintain them. Several efforts were recently made to combine these capabilities together in one environment, and NaviCell is one of them. NaviCell is a web-based environment for exploiting large maps of molecular interactions, created in CellDesigner, allowing their easy exploration, curation and maintenance. It is characterized by a combination of three essential features: (1) efficient map browsing based on Google Maps; (2) semantic zooming for viewing different levels of details or of abstraction of the map and (3) an integrated web-based blog for collecting community feedback. NaviCell can be easily used by experts in the field of molecular biology for studying molecular entities of interest in the context of signaling pathways and crosstalk between pathways within a global signaling network. NaviCell allows both exploration of detailed molecular mechanisms represented on the map and a more abstract view of the map up to a top-level modular representation. NaviCell greatly facilitates curation, maintenance and updating of comprehensive maps of molecular interactions in an interactive and user-friendly fashion due to an embedded blogging system. NaviCell provides user-friendly exploration of large-scale maps of molecular interactions, thanks to Google Maps and WordPress interfaces, with which many users are already familiar. Semantic zooming, which is used for navigating geographical maps, is adopted for molecular maps in NaviCell, making any level of visualization readable. In addition, NaviCell provides a framework for community-based curation of maps.

  15. An improved map based graphical android authentication system ...

    African Journals Online (AJOL)

    Currently, graphical password methods are available for Android and other devices, but the major problem is their vulnerability. A map-based graphical authentication system (Dheeraj et al., 2013) was designed for mobile Android devices, but it did not provide a large choice or multiple sequences to the user for selecting ...

  16. PENERAPAN METODE MIND MAPPING PADA PEMBELAJARAN MATEMATIKA

    Directory of Open Access Journals (Sweden)

    Rahma Faelasofi

    2016-09-01

    The objective of this research was to determine how to increase students' mathematics learning achievement on the subject of statistics. This study aims to determine whether the mathematics learning achievement of students of SMP Muhammadiyah 1 Gadingrejo in the academic year 2014-2015 taught with the mind mapping learning method on the subject of statistics is higher than that of students taught without the mind mapping method. This research takes a quantitative approach. The population was all first-grade students of SMP Muhammadiyah 1 Gadingrejo in the academic year 2014-2015. The research samples were taken using the cluster random sampling technique. Based on the hypothesis test, it can be concluded that there is a difference in average student learning achievement between the mind mapping learning method and the lecture method on the subject of statistics for first-grade students of SMP Muhammadiyah 1 Gadingrejo in the academic year 2014-2015. Keywords: mind mapping learning method, students' mathematics learning achievement

  17. Machine vision-based high-resolution weed mapping and patch-sprayer performance simulation

    NARCIS (Netherlands)

    Tang, L.; Tian, L.F.; Steward, B.L.

    1999-01-01

    An experimental machine vision-based patch-sprayer was developed. This sprayer was primarily designed to do real-time weed density estimation and variable herbicide application rate control. However, the sprayer also had the capability to do high-resolution weed mapping if proper mapping techniques

  18. PhyloGeoViz: a web-based program that visualizes genetic data on maps.

    Science.gov (United States)

    Tsai, Yi-Hsin E

    2011-05-01

    The first step of many population genetic studies is the simple visualization of allele frequencies on a landscape. This basic data exploration can be challenging without proprietary software, and the manual plotting of data is cumbersome and unfeasible at large sample sizes. I present an open source, web-based program that plots any kind of frequency or count data as pie charts in Google Maps (Google Inc., Mountain View, CA). Pie polygons are then exportable to Google Earth (Google Inc.), a free Geographic Information Systems platform. Import of genetic data into Google Earth allows phylogeographers access to a wealth of spatial information layers integral to forming hypotheses and understanding patterns in the data. © 2010 Blackwell Publishing Ltd.
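
    The underlying transformation is simple: per-site allele counts become pie-wedge angles that a mapping layer can draw at the site's coordinates. A sketch of that step (an illustration of the idea, not PhyloGeoViz's own code):

```python
# Sketch: turn per-site allele counts into the start/end angles (in
# degrees) of the pie wedges a map layer would draw at that site.
def pie_angles(counts):
    """counts: dict mapping allele name -> count. Returns
    dict mapping allele name -> (start_deg, end_deg)."""
    total = sum(counts.values())
    start, wedges = 0.0, {}
    for allele, n in counts.items():
        sweep = 360.0 * n / total
        wedges[allele] = (start, start + sweep)
        start += sweep
    return wedges
```

    Exporting each wedge as a filled polygon anchored at the site's latitude/longitude is then enough for a KML layer that Google Earth can render.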

  19. High-resolution mapping of forest carbon stocks in the Colombian Amazon

    Directory of Open Access Journals (Sweden)

    G. P. Asner

    2012-07-01

    High-resolution mapping of tropical forest carbon stocks can assist forest management and improve implementation of large-scale carbon retention and enhancement programs. Previous high-resolution approaches have relied on field plot and/or light detection and ranging (LiDAR) samples of aboveground carbon density, which are typically upscaled to larger geographic areas using stratification maps. Such efforts often rely on detailed vegetation maps to stratify the region for sampling, but existing tropical forest maps are often too coarse and field plots too sparse for high-resolution carbon assessments. We developed a top-down approach for high-resolution carbon mapping in a 16.5 million ha region (> 40% of the Colombian Amazon), a remote landscape seldom documented. We report on three advances for large-scale carbon mapping: (i) employing a universal approach to airborne LiDAR calibration with limited field data; (ii) quantifying environmental controls over carbon densities; and (iii) developing stratification- and regression-based approaches for scaling up to regions outside of LiDAR coverage. We found that carbon stocks are predicted by a combination of satellite-derived elevation, fractional canopy cover and terrain ruggedness, allowing upscaling of the LiDAR samples to the full 16.5 million ha region. LiDAR-derived carbon maps have 14% uncertainty at 1 ha resolution, and the regional map based on stratification has 28% uncertainty in any given hectare. High-resolution approaches with quantifiable pixel-scale uncertainties will provide the most confidence for monitoring changes in tropical forest carbon stocks. Improved confidence will allow resource managers and decision makers to more rapidly and effectively implement actions that better conserve and utilize forests in tropical regions.

  20. Comparing the performance of cluster random sampling and integrated threshold mapping for targeting trachoma control, using computer simulation.

    Directory of Open Access Journals (Sweden)

    Jennifer L Smith

    Implementation of trachoma control strategies requires reliable district-level estimates of trachomatous inflammation-follicular (TF), generally collected using the recommended gold-standard cluster randomized surveys (CRS). Integrated Threshold Mapping (ITM) has been proposed as an integrated and cost-effective means of rapidly surveying trachoma in order to classify districts according to treatment thresholds. ITM differs from CRS in a number of important ways, including the use of a school-based sampling platform for children aged 1-9 and a different age distribution of participants. This study uses computerised sampling simulations to compare the performance of these survey designs and evaluate the impact of varying key parameters. Realistic pseudo gold-standard data for 100 districts were generated that maintained the relative risk of disease between important sub-groups and incorporated empirical estimates of disease clustering at the household, village and district level. To simulate the different sampling approaches, 20 clusters were selected from each district, with individuals sampled according to the protocol for ITM and CRS. Results showed that ITM generally under-estimated the true prevalence of TF over a range of epidemiological settings and introduced more district misclassification according to treatment thresholds than did CRS. However, the extent of underestimation and resulting misclassification was found to be dependent on three main factors: (i) the district prevalence of TF; (ii) the relative risk of TF between enrolled and non-enrolled children within clusters; and (iii) the enrollment rate in schools. Although in some contexts the two methodologies may be equivalent, ITM can introduce a bias-dependent shift as prevalence of TF increases, resulting in a greater risk of misclassification around treatment thresholds. In addition to strengthening the evidence base around choice of trachoma survey methodologies, this study illustrates...

  1. IDAS, software support for mathematical models and map-based graphics

    International Nuclear Information System (INIS)

    Birnbaum, M.D.; Wecker, D.B.

    1984-01-01

    IDAS (Intermediate Dose Assessment System) was developed for the U.S. Nuclear Regulatory Commission as a hardware/software host for radiological models and display of map-based plume graphics at the Operations Center (HQ), regional incident response centers, and site emergency facilities. IDAS design goals acknowledged the likelihood of future changes in the suite of models and the composition of map features for analysis and graphical display. IDAS provides a generalized software support environment to programmers and users of modeling programs. A database manager process provides multi-user access control to all input and output data for modeling programs. A programmer-created data description file (schema) specifies data field names, data types, legal and recommended ranges, default values, preferred units of measurement, and ''help'' text. Subroutine calls to IDAS from a model program invoke a consistent user interface which can show any of the schema contents, convert units of measurement, and route data to multiple logical devices, including the database. A stand-alone data editor allows the user to read and write model data records without execution of a model. IDAS stores digitized map features in a 4-level naming hierarchy. A user can select the map icon, color, and whether to show a stored name tag, for each map feature. The user also selects image scale (zoom) within limits set by map digitization. The resulting image combines static map information, computed analytic modeling results, and the user's feature selections for display to decision-makers

  2. Sustainability-Based Flood Hazard Mapping of the Swannanoa River Watershed

    Directory of Open Access Journals (Sweden)

    Ebrahim Ahmadisharaf

    2017-09-01

    Full Text Available An integrated framework is presented for sustainability-based flood hazard mapping of the Swannanoa River watershed in the state of North Carolina, U.S. The framework uses a hydrologic model for rainfall–runoff transformation, a two-dimensional unsteady hydraulic model for flood simulation, and a GIS-based multi-criteria decision-making technique for flood hazard mapping. Economic, social, and environmental flood hazards are taken into account. The importance of each hazard is quantified through a survey of experts. Utilizing the proposed framework, sustainability-based flood hazard mapping is performed for the 100-year design event. As a result, the overall flood hazard is provided for each geographic location. The sensitivity of the overall hazard with respect to the weights of the three hazard components was also investigated. While the conventional flood management approach is to assess the environmental impacts of mitigation measures after a set of feasible options is selected, the presented framework incorporates the environmental impacts into the analysis concurrently with the economic and social influences. It thereby provides a more sustainable perspective on flood management and can greatly help decision makers make better-informed decisions by clearly understanding the impacts of flooding on the economy, society and the environment.

  3. Optimization of the sampling scheme for maps of physical and chemical properties estimated by kriging

    Directory of Open Access Journals (Sweden)

    Gener Tadeu Pereira

    2013-10-01

    Full Text Available The sampling scheme is essential in the investigation of the spatial variability of soil properties in Soil Science studies. The high costs of sampling schemes optimized with additional sampling points for each physical and chemical soil property prevent their use in precision agriculture. The purpose of this study was to obtain an optimal sampling scheme for physical and chemical property sets and investigate its effect on the quality of soil sampling. Soil was sampled in a 42-ha area, with 206 geo-referenced points arranged in a regular grid spaced 50 m from each other, in a depth range of 0.00-0.20 m. In order to obtain an optimal sampling scheme for every physical and chemical property, a sample grid, a medium-scale variogram and the extended Spatial Simulated Annealing (SSA) method were used to minimize the kriging variance. The optimization procedure was validated by constructing maps of relative improvement comparing the sample configuration before and after the process. A greater concentration of recommended points in specific areas (NW-SE direction) was observed, which also reflects a greater estimation variance at these locations. The addition of optimal samples for specific regions increased the accuracy by up to 2 % for chemical and 1 % for physical properties. The use of a sample grid and a medium-scale variogram as prior information for the design of additional sampling schemes proved very promising for determining the locations of these additional points for all physical and chemical soil properties, enhancing the accuracy of kriging estimates of the physical-chemical properties.
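The optimization step described above — choosing extra sample points that minimize kriging variance — can be sketched with a greedy stand-in for the paper's extended SSA method. All numbers below (exponential covariance with sill 1 and range 150 m, the grid, the candidate points) are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented exponential covariance (sill 1.0, range 150 m).
def cov(h, sill=1.0, range_m=150.0):
    return sill * np.exp(-h / range_m)

def mean_kriging_var(samples, targets):
    """Mean simple-kriging variance at target locations for given sample sites."""
    d_ss = np.linalg.norm(samples[:, None] - samples[None, :], axis=-1)
    K_inv = np.linalg.inv(cov(d_ss) + 1e-9 * np.eye(len(samples)))
    k = cov(np.linalg.norm(samples[:, None] - targets[None, :], axis=-1))
    return float(np.mean(1.0 - np.einsum('it,ij,jt->t', k, K_inv, k)))

# A coarse regular grid of existing samples plus random candidate infill points.
grid = np.stack(np.meshgrid(np.arange(0.0, 600.0, 100.0),
                            np.arange(0.0, 600.0, 100.0)), -1).reshape(-1, 2)
targets = rng.uniform(0.0, 600.0, size=(200, 2))
candidates = rng.uniform(0.0, 600.0, size=(100, 2))

# Greedy stand-in for SSA: add the candidate that most reduces mean variance.
base = mean_kriging_var(grid, targets)
gains = [base - mean_kriging_var(np.vstack([grid, c[None]]), targets)
         for c in candidates]
best = candidates[int(np.argmax(gains))]
improved = mean_kriging_var(np.vstack([grid, best[None]]), targets)
```

A full SSA implementation would instead perturb the whole configuration and accept worsening moves with decreasing probability, but the objective (mean kriging variance) is the same.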

  4. Pseudo random number generator based on quantum chaotic map

    Science.gov (United States)

    Akhshani, A.; Akhavan, A.; Mobaraki, A.; Lim, S.-C.; Hassan, Z.

    2014-01-01

    For many years, dissipative quantum maps have been widely used as informative models of quantum chaos. In this paper, a new scheme for a pseudo-random number generator (PRNG) based on the quantum logistic map is proposed. Note that the PRNG relies merely on the equations used in the quantum chaotic map. The algorithm is not complex, so it imposes no high requirements on computer hardware and computation is fast. To address the challenge of using the proposed PRNG in quantum cryptography and other practical applications, the proposed PRNG is subjected to statistical tests using well-known test suites such as NIST, DIEHARD, ENT and TestU01. The results of the statistical tests are promising, as the proposed PRNG successfully passed all these tests. Moreover, the degree of non-periodicity of the chaotic sequences of the quantum map is investigated using the scale-index technique. The obtained result shows that the sequence is highly non-periodic. From these results it can be concluded that the new scheme can generate a high percentage of usable pseudo-random numbers for simulation and other applications in scientific computing.
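The general idea of extracting pseudo-random bits from a chaotic map iteration can be illustrated with the classical logistic map (the paper's quantum logistic map equations are not reproduced here; the seed, parameter and thresholding rule below are invented for illustration):

```python
# Bits extracted from the classical logistic map x -> r*x*(1-x).
def logistic_bits(x0=0.3141592, r=3.99, n_bits=64, burn_in=1000):
    x = x0
    for _ in range(burn_in):              # discard the transient
        x = r * x * (1.0 - x)
    bits = []
    for _ in range(n_bits):
        x = r * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)  # threshold the chaotic orbit
    return bits

stream = logistic_bits()
```

A real PRNG built this way would still need the statistical validation the abstract describes (NIST, DIEHARD, ENT, TestU01) before any cryptographic use.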

  5. Single Image Super-Resolution Using Global Regression Based on Multiple Local Linear Mappings.

    Science.gov (United States)

    Choi, Jae-Seok; Kim, Munchurl

    2017-03-01

    Super-resolution (SR) has become more vital because of its capability to generate high-quality ultra-high-definition (UHD) high-resolution (HR) images from low-resolution (LR) input images. Conventional SR methods entail high computational complexity, which makes them difficult to implement for up-scaling of full-high-definition input images into UHD-resolution images. Nevertheless, our previous super-interpolation (SI) method showed a good compromise between peak signal-to-noise ratio (PSNR) performance and computational complexity. However, since SI only utilizes simple linear mappings, it may fail to precisely reconstruct HR patches with complex texture. In this paper, we present a novel SR method which inherits the large-to-small patch conversion scheme from SI but uses global regression based on local linear mappings (GLM). Thus, our new SR method is called GLM-SI. In GLM-SI, each LR input patch is divided into 25 overlapped subpatches. Next, based on the local properties of these subpatches, 25 different local linear mappings are applied to the current LR input patch to generate 25 HR patch candidates, which are then regressed into one final HR patch using a global regressor. The local linear mappings are learned cluster-wise in our off-line training phase. The main contribution of this paper is as follows: previously, linear-mapping-based conventional SR methods, including SI, used only one simple yet coarse linear mapping per patch to reconstruct its HR version. In contrast, for each LR input patch, our GLM-SI is the first to apply a combination of multiple local linear mappings, where each local linear mapping is found according to the local properties of the current LR patch. Therefore, it can better approximate nonlinear LR-to-HR mappings for HR patches with complex texture. Experimental results show that the proposed GLM-SI method outperforms most of the state-of-the-art methods, and shows comparable PSNR performance with much lower
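The multi-candidate step described above can be sketched in a few lines: many local linear maps each produce an HR candidate, and a global regressor combines them. The patch dimensions, the random mapping matrices and the uniform regressor weights below are invented stand-ins, not the cluster-wise learned quantities of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sizes: a 6x6 LR patch mapped to a 12x12 HR patch by 25 maps.
lr_dim, hr_dim, n_maps = 36, 144, 25
local_maps = 0.1 * rng.standard_normal((n_maps, hr_dim, lr_dim))
global_w = np.full(n_maps, 1.0 / n_maps)   # stand-in global regressor weights

def super_resolve(lr_patch):
    candidates = local_maps @ lr_patch     # 25 HR patch candidates
    return global_w @ candidates           # regress into one final HR patch

hr_patch = super_resolve(rng.standard_normal(lr_dim))
```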

  6. Methodology for Evaluating the Quality of Ecosystem Maps: A Case Study in the Andes

    Directory of Open Access Journals (Sweden)

    Dolors Armenteras

    2016-08-01

    Full Text Available Uncertainty in thematic maps has been tested mainly in maps with discrete or fuzzy classifications based on spectral data. However, many ecosystem maps in tropical countries consist of discrete polygons containing information on various ecosystem properties such as vegetation cover, soil, climate, geomorphology and biodiversity. The combination of these properties into one class leads to error. We propose a probability-based sampling design with two domains, multiple stages, and stratification with selection of primary sampling units (PSUs) proportional to the richness of strata present. Validation is undertaken through field visits and fine resolution remote sensing data. A pilot site in the center of the Colombian Andes was chosen to validate a government official ecosystem map. Twenty primary sampling units (PSUs) of 10 × 15 km were selected, and the final numbers of final sampling units (FSUs) were 76 for the terrestrial domain and 46 for the aquatic domain. Our results showed a confidence level of 95%, with the accuracy in the terrestrial domain varying between 51.8% and 64.3% and in the aquatic domain varying between 75% and 92%. Governments need to account for uncertainty since they rely on the quality of these maps to make decisions and guide policies.
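Selecting PSUs with probability proportional to strata richness, as the design above prescribes, can be sketched directly (the richness values and counts below are invented, not the study's):

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented strata richness (number of ecosystem strata) per candidate PSU.
richness = rng.integers(1, 9, size=60)
p = richness / richness.sum()      # selection probability proportional to richness

# Draw 20 PSUs with probability proportional to strata richness.
chosen = rng.choice(len(richness), size=20, replace=False, p=p)
```

Richer PSUs are thus more likely to be selected, concentrating field effort where more ecosystem classes can be validated per visit.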

  7. Improved regional-scale Brazilian cropping systems' mapping based on a semi-automatic object-based clustering approach

    Science.gov (United States)

    Bellón, Beatriz; Bégué, Agnès; Lo Seen, Danny; Lebourgeois, Valentine; Evangelista, Balbino Antônio; Simões, Margareth; Demonte Ferraz, Rodrigo Peçanha

    2018-06-01

    Cropping systems' maps at fine scale over large areas provide key information for further agricultural production and environmental impact assessments, and thus represent a valuable tool for effective land-use planning. There is, therefore, a growing interest in mapping cropping systems in an operational manner over large areas, and remote sensing approaches based on vegetation index time series analysis have proven to be an efficient tool. However, supervised pixel-based approaches are commonly adopted, requiring resource-consuming field campaigns to gather training data. In this paper, we present a new object-based unsupervised classification approach tested on an annual MODIS 16-day composite Normalized Difference Vegetation Index time series and a Landsat 8 mosaic of the State of Tocantins, Brazil, for the 2014-2015 growing season. Two variants of the approach are compared: a hyperclustering approach, and a landscape-clustering approach involving a prior stratification of the study area into landscape units on which the clustering is then performed. The main cropping systems of Tocantins, characterized by the crop types and cropping patterns, were efficiently mapped with the landscape-clustering approach. Results show that stratification prior to clustering significantly improves the classification accuracies for underrepresented and sparsely distributed cropping systems. This study illustrates the potential of unsupervised classification for large area cropping systems' mapping and contributes to the development of generic tools for supporting large-scale agricultural monitoring across regions.
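The core of the unsupervised step above is clustering annual vegetation-index profiles. A minimal sketch with plain k-means on synthetic NDVI-like time series (23 dates, two invented cropping patterns; the real study clusters object statistics, not raw pixels):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for annual 16-day NDVI profiles (23 dates) of 300
# objects: invented single-crop vs. double-crop patterns plus noise.
t = np.linspace(0.0, 2.0 * np.pi, 23)
single = np.sin(t) ** 2
double = np.sin(2.0 * t) ** 2
X = np.vstack([single + 0.05 * rng.standard_normal((150, 23)),
               double + 0.05 * rng.standard_normal((150, 23))])

def kmeans(X, centers, iters=20):
    """Plain Lloyd k-means on the time-series profiles."""
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.stack([X[labels == j].mean(0) for j in range(len(centers))])
    return labels

# Seed one centre per expected pattern so this toy run is deterministic.
labels = kmeans(X, X[[0, 150]].copy())
purity = max((labels[:150] == 0).mean() + (labels[150:] == 1).mean(),
             (labels[:150] == 1).mean() + (labels[150:] == 0).mean()) / 2.0
```

The single-peak and double-peak profiles separate cleanly; the labelling of clusters as cropping systems is then done by interpretation, as in the paper.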

  8. Mapping of unknown industrial plant using ROS-based navigation mobile robot

    Science.gov (United States)

    Priyandoko, G.; Ming, T. Y.; Achmad, M. S. H.

    2017-10-01

    This research examines how humans work with a teleoperated unmanned mobile robot to inspect an industrial plant area, producing a 2D/3D map for further critical evaluation. The experiment focuses on two parts: the way the human and robot perform remote interactions using a robust method, and the way the robot perceives the surrounding environment as a 2D/3D perspective map. ROS (Robot Operating System) was utilized as a tool during development and implementation; it provides a robust data communication method in the form of messages and topics. RGBD SLAM performs the visual mapping function to construct the 2D/3D map using a Kinect sensor. The results showed that the teleoperated mobile robot system successfully extends human perspective for remote surveillance of a large industrial plant area. It was concluded that the proposed work is a robust solution for large-scale mapping within an unknown construction building.

  9. Using a Metro Map Metaphor for organizing Web-based learning resources

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Bang, Tove; Hansen, Per Steen

    2002-01-01

    This paper briefly describes the WebNize system and how it applies a Metro Map metaphor for organizing guided tours in Web based resources. Then, experiences in using the Metro Map based tours in a Knowledge Sharing project at the library at Aarhus School of Business (ASB) in Denmark, are discussed...... is to create models for Intelligent Knowledge Solutions that can contribute to form the learning environments of the School in the 21st century. The WebNize system is used for sharing of knowledge through metro maps for specific subject areas made available in the Learning Resource Centre at ASB. The metro....... The Library has been involved in establishing a Learning Resource Center (LRC). The LRC serves as an exploratorium for the development and the testing of new forms of communication and learning, at the same time as it integrates the information resources of the electronic research library. The objective...

  10. Application of terrestrial laser scanning to the development and updating of the base map

    Science.gov (United States)

    Klapa, Przemysław; Mitka, Bartosz

    2017-06-01

    The base map provides basic information about land to individuals, companies, developers, design engineers, organizations, and government agencies. Its contents include spatial location data for control network points, buildings, land lots, infrastructure facilities, and topographic features. As the primary map of the country, it must be developed in accordance with specific laws and regulations and be continuously updated. The base map is a data source used for the development and updating of derivative maps and other large scale cartographic materials such as thematic or topographic maps. Thanks to the advancement of science and technology, the quality of land surveys carried out by means of terrestrial laser scanning (TLS) matches that of traditional surveying methods in many respects. This paper discusses the potential application of output data from laser scanners (point clouds) to the development and updating of cartographic materials, taking Poland's base map as an example. A few research sites were chosen to present the method and the process of conducting a TLS land survey: a fragment of a residential area, a street, the surroundings of buildings, and an undeveloped area. The entire map that was drawn as a result of the survey was checked by comparing it to a map obtained from PODGiK (pol. Powiatowy Ośrodek Dokumentacji Geodezyjnej i Kartograficznej - Regional Centre for Geodetic and Cartographic Records) and by conducting a field inspection. An accuracy and quality analysis of the conducted fieldwork and deskwork yielded very good results, which provide solid grounds for concluding that cartographic materials based on a TLS point cloud are a reliable source of information about land. The contents of the map that had been created with the use of the obtained point cloud were very accurately located in space (x, y, z). The conducted accuracy analysis and the inspection of the performed works showed that high quality is characteristic of TLS surveys. The

  11. Comparison of sampling strategies for object-based classification of urban vegetation from Very High Resolution satellite images

    Science.gov (United States)

    Rougier, Simon; Puissant, Anne; Stumpf, André; Lachiche, Nicolas

    2016-09-01

    Vegetation monitoring is becoming a major issue in the urban environment due to the services vegetation provides, and it necessitates accurate and up-to-date mapping. Very High Resolution satellite images enable a detailed mapping of urban tree and herbaceous vegetation. Several supervised classifications with statistical learning techniques have provided good results for the detection of urban vegetation but necessitate a large amount of training data. In this context, this study investigates the performance of different sampling strategies in order to reduce the number of examples needed. Two window-based active learning algorithms from the state of the art are compared to a classical stratified random sampling, and a third strategy combining active learning and stratification is proposed. The efficiency of these strategies is evaluated on two medium-size French cities, Strasbourg and Rennes, associated with different datasets. Results demonstrate that classical stratified random sampling can in some cases be just as effective as active learning methods and that it should be used more frequently to evaluate new active learning methods. Moreover, the active learning strategies proposed in this work make it possible to reduce the computational runtime by selecting multiple windows at each iteration without increasing the number of windows needed.
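The stratified random sampling baseline mentioned above is simple to sketch: group candidate windows by stratum and draw a fixed number from each. The window pool and strata below are invented for illustration:

```python
import random

random.seed(42)

# Invented pool of candidate windows, each tagged with a stratum
# (e.g. a coarse vegetation class).
windows = [{"id": i, "stratum": s}
           for i, s in enumerate(["tree", "herbaceous", "built"] * 100)]

def stratified_sample(items, key, n_per_stratum):
    by_stratum = {}
    for item in items:
        by_stratum.setdefault(item[key], []).append(item)
    sample = []
    for group in by_stratum.values():
        sample.extend(random.sample(group, n_per_stratum))
    return sample

training = stratified_sample(windows, "stratum", 10)
```

An active learning strategy would instead iteratively pick the windows the current classifier is least certain about; the study's point is that this baseline is sometimes just as effective.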

  12. Active Collection of Land Cover Sample Data from Geo-Tagged Web Texts

    Directory of Open Access Journals (Sweden)

    Dongyang Hou

    2015-05-01

    Full Text Available Sample data play an important role in land cover (LC) map validation. Traditionally, they are collected through field survey or image interpretation, either of which is costly, labor-intensive and time-consuming. In recent years, massive geo-tagged texts have been emerging on the web, and they contain valuable information for LC map validation. However, this kind of special textual data has seldom been analyzed and used for supporting LC map validation. This paper examines the potential of geo-tagged web texts as a new cost-free sample data source to assist LC map validation and proposes an active data collection approach. The proposed approach uses a customized deep web crawler to search for geo-tagged web texts based on land cover-related keywords and string-based rule matching. A data transformation based on buffer analysis is then performed to convert the collected web texts into LC sample data. Using three provinces and three municipalities directly under the Central Government in China as study areas, geo-tagged web texts were collected to validate the artificial surface class of China's 30-meter global land cover dataset (GlobeLand30-2010). A total of 6283 geo-tagged web texts were collected at a speed of 0.58 texts per second. The collected texts about built-up areas were transformed into sample data. A user's accuracy of 82.2% was achieved, which is close to that derived from formal expert validation. The preliminary results show that geo-tagged web texts are valuable ancillary data for LC map validation and the proposed approach can improve the efficiency of sample data collection.
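The buffer-analysis transformation described above can be sketched as a distance filter: a geo-tagged text becomes a sample for a map location only if it falls within a buffer around it. The coordinates, texts, and 500 m radius below are invented; the crawling and keyword-matching stages are omitted:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Invented geo-tagged texts: (text, lat, lon).
texts = [("new residential block", 39.9042, 116.4074),
         ("mountain forest trail", 40.5000, 115.9000)]
cell_centre = (39.9050, 116.4080)   # invented map-cell centre

# Keep only texts inside a 500 m buffer of the cell centre.
samples = [t for t in texts
           if haversine_m(t[1], t[2], *cell_centre) < 500.0]
```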

  13. Behavior Analysis of Novel Wearable Indoor Mapping System Based on 3D-SLAM.

    Science.gov (United States)

    Lagüela, Susana; Dorado, Iago; Gesto, Manuel; Arias, Pedro; González-Aguilera, Diego; Lorenzo, Henrique

    2018-03-02

    This paper presents a Wearable Prototype for indoor mapping developed by the University of Vigo. The system is based on a Velodyne LiDAR, acquiring points with 16 rays for a simplistic or low-density 3D representation of reality. With this, a Simultaneous Localization and Mapping (3D-SLAM) method is developed for the mapping and generation of 3D point clouds of scenarios deprived of GNSS signal. The quality of the system presented is validated through comparison with a commercial indoor mapping system, Zeb-Revo, from the company GeoSLAM, and with a terrestrial LiDAR, Faro Focus 3D X330. The first is considered a relative reference among mobile systems and is chosen due to its use of the same mapping principle, SLAM techniques based on the Robot Operating System (ROS), while the second is taken as ground truth for the determination of the final accuracy of the system with regard to reality. Results show that the accuracy of the system is mainly determined by the accuracy of the sensor, with little increment in the error introduced by the mapping algorithm.
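The accuracy comparison against a ground-truth scan is typically a cloud-to-cloud nearest-neighbour error. A toy sketch on invented point clouds (brute-force distances; real clouds would use a k-d tree, and this is not the paper's exact evaluation protocol):

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented clouds: a terrestrial-LiDAR ground-truth cloud and a noisy
# mobile-mapping cloud derived from part of it.
truth = rng.uniform(0.0, 10.0, size=(400, 3))           # reference cloud
mobile = truth[:300] + rng.normal(0.0, 0.03, (300, 3))  # noisy SLAM cloud

# Nearest-neighbour distance from each mobile point to the truth cloud.
d = np.linalg.norm(mobile[:, None] - truth[None], axis=-1)
nn_err = d.min(axis=1)
rmse = float(np.sqrt((nn_err ** 2).mean()))
```

With per-axis noise of 3 cm, the RMSE lands around 5 cm, dominated by the simulated sensor noise — mirroring the paper's finding that sensor accuracy, not the mapping algorithm, dominates the error budget.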

  14. Behavior Analysis of Novel Wearable Indoor Mapping System Based on 3D-SLAM

    Directory of Open Access Journals (Sweden)

    Susana Lagüela

    2018-03-01

    Full Text Available This paper presents a Wearable Prototype for indoor mapping developed by the University of Vigo. The system is based on a Velodyne LiDAR, acquiring points with 16 rays for a simplistic or low-density 3D representation of reality. With this, a Simultaneous Localization and Mapping (3D-SLAM) method is developed for the mapping and generation of 3D point clouds of scenarios deprived of GNSS signal. The quality of the system presented is validated through comparison with a commercial indoor mapping system, Zeb-Revo, from the company GeoSLAM, and with a terrestrial LiDAR, Faro Focus3D X330. The first is considered a relative reference among mobile systems and is chosen due to its use of the same mapping principle, SLAM techniques based on the Robot Operating System (ROS), while the second is taken as ground truth for the determination of the final accuracy of the system with regard to reality. Results show that the accuracy of the system is mainly determined by the accuracy of the sensor, with little increment in the error introduced by the mapping algorithm.

  15. Sampling in image space for vision based SLAM

    NARCIS (Netherlands)

    Booij, O.; Zivkovic, Z.; Kröse, B.

    2008-01-01

    Loop closing in vision based SLAM applications is a difficult task. Comparing new image data with all previous image data acquired for the map is practically impossible because of the high computational costs. This problem is part of the bigger problem to acquire local geometric constraints from

  16. Fast and robust generation of feature maps for region-based visual attention.

    Science.gov (United States)

    Aziz, Muhammad Zaheer; Mertsching, Bärbel

    2008-05-01

    Visual attention is one of the important phenomena in biological vision which can be followed to achieve more efficiency, intelligence, and robustness in artificial vision systems. This paper investigates a region-based approach that performs pixel clustering prior to the processes of attention in contrast to late clustering as done by contemporary methods. The foundation steps of feature map construction for the region-based attention model are proposed here. The color contrast map is generated based upon the extended findings from the color theory, the symmetry map is constructed using a novel scanning-based method, and a new algorithm is proposed to compute a size contrast map as a formal feature channel. Eccentricity and orientation are computed using the moments of obtained regions and then saliency is evaluated using the rarity criteria. The efficient design of the proposed algorithms allows incorporating five feature channels while maintaining a processing rate of multiple frames per second. Another salient advantage over the existing techniques is the reusability of the salient regions in the high-level machine vision procedures due to preservation of their shapes and precise locations. The results indicate that the proposed model has the potential to efficiently integrate the phenomenon of attention into the main stream of machine vision and systems with restricted computing resources such as mobile robots can benefit from its advantages.
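The rarity criterion mentioned above — a region is salient when few other regions share its feature value — can be sketched in a few lines. The feature values below are invented (e.g. one size-contrast class per region), not the model's actual channels:

```python
import numpy as np

# Invented discrete feature values, one per segmented region.
features = np.array([3, 3, 3, 3, 7, 3, 3, 3, 3, 3])
values, counts = np.unique(features, return_counts=True)
freq = dict(zip(values.tolist(), counts.tolist()))

# Saliency inversely proportional to feature frequency: the lone
# value-7 region stands out against the common value-3 regions.
saliency = np.array([1.0 / freq[v] for v in features.tolist()])
most_salient = int(np.argmax(saliency))
```

The full model combines several such channels (color contrast, symmetry, size contrast, eccentricity, orientation) before evaluating rarity.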

  17. Application of self-organising maps towards segmentation of soybean samples by determination of inorganic compounds content.

    Science.gov (United States)

    Cremasco, Hágata; Borsato, Dionísio; Angilelli, Karina Gomes; Galão, Olívio Fernandes; Bona, Evandro; Valle, Marcos Eduardo

    2016-01-15

    In this study, 20 samples of soybean, both transgenic and conventional cultivars, planted in two different regions, Londrina and Ponta Grossa, both located in Paraná, Brazil, were analysed. In order to verify whether the inorganic compound levels in soybeans varied with the region of planting, K, P, Ca, Mg, S, Zn, Mn, Fe, Cu and B contents were analysed with an artificial neural network self-organising map. It was observed that with a 10 × 10 topology, 8000 epochs, an initial learning rate of 0.1 and an initial neighbourhood ratio of 4.5, the network was able to differentiate samples according to region of origin. Among all of the variables analysed by the artificial neural network, the elements Zn, Ca and Mn contributed most to the classification of the samples. The results indicated that samples planted in these two regions differ in their mineral content; however, conventional and transgenic samples grown in the same region show no difference in mineral contents in the grain. © 2015 Society of Chemical Industry.
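A self-organising map of the kind used above can be sketched minimally: units compete for each sample, and the winner and its neighbours move toward it. This toy uses a 1-D map of 5 units over 3 invented "mineral content" features (the study uses a 10 × 10 map over 10 elements):

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented mineral-content features for samples from two regions.
X = np.vstack([rng.normal(0.0, 0.1, (20, 3)),   # region A samples
               rng.normal(1.0, 0.1, (20, 3))])  # region B samples

W = rng.standard_normal((5, 3))                 # 1-D map of 5 units
for epoch in range(200):
    lr = 0.5 * (1.0 - epoch / 200.0)            # decaying learning rate
    for x in rng.permutation(X):
        bmu = int(np.argmin(((W - x) ** 2).sum(1)))   # best-matching unit
        for j in range(5):                      # Gaussian neighbourhood update
            h = np.exp(-((j - bmu) ** 2) / 2.0)
            W[j] += lr * h * (x - W[j])

# After training, the two regions should map to different units.
unit_a = int(np.argmin(((W - X[:20].mean(0)) ** 2).sum(1)))
unit_b = int(np.argmin(((W - X[20:].mean(0)) ** 2).sum(1)))
```

Inspecting which units win for which samples is what lets the study segment samples by region of origin.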

  18. Designing problem-based curricula: The role of concept mapping in scaffolding learning for the health sciences

    Directory of Open Access Journals (Sweden)

    Susan M. Bridges

    2015-03-01

    Full Text Available While the utility of concept mapping (CM) has been widely reported in primary and secondary educational contexts, its application in the health sciences in higher education has been less frequently noted. Two case studies of the application of concept mapping in undergraduate and postgraduate health sciences are detailed in this paper. The case in undergraduate dental education examines the role of concept mapping in supporting problem-based learning and explores how explicit induction into the principles and practices of CM has add-on benefits to learning in an inquiry-based curriculum. The case in postgraduate medical education describes the utility of concept mapping in an online inquiry-based module design. Specific attention is given to applications of CMapTools™ software to support the implementation of Novakian concept mapping in both inquiry-based curricular contexts.

  19. Changes in hydrogen isotope ratios in sequential plumage stages: an implication for the creation of isotope-base maps for tracking migratory birds.

    Science.gov (United States)

    Duxbury, J M; Holroyd, G L; Muehlenbachs, K

    2003-09-01

    Accurate reference maps are important in the use of stable isotopes to track the movements of migratory birds. Reference maps created from the analysis of samples collected from young at the nest site are more accurate than simply referring to naturally occurring patterns of hydrogen isotope ratios created by precipitation cycles. Ratios of hydrogen isotopes in the nutrients incorporated early in the development of young birds can be derived from endogenous, maternal sources. Base-maps should therefore be created from the analysis of tissue samples taken from hatchlings after the local isotopic signature of exogenous nutrients has become dominant. Migratory species such as Peregrine Falcons are known to use endogenous sources in the creation of their eggs; therefore, knowledge of which plumage stage best represents the local hydrogen ratios would assist in the planning of nest visits. We conducted diet manipulation experiments involving Japanese Quail and Peregrine Falcons to determine the plumage stage at which hydrogen isotope ratios were indicative of a switch in food source. The natal down of both the quail and falcons reflected the diet of breeding adult females. The hydrogen isotope ratios of a new food source were dominant in the juvenile down of the young falcons, although a further shift was detected in the final juvenile plumage. The juvenile plumage grows during weeks 3-4 after hatch in Peregrine Falcons. Nest visits for the purpose of collecting feathers for isotope-base-map creation should be made around 4 weeks after the presumed hatch of the young falcons.

  20. Self-organizing maps based on limit cycle attractors.

    Science.gov (United States)

    Huang, Di-Wei; Gentili, Rodolphe J; Reggia, James A

    2015-03-01

    Recent efforts to develop large-scale brain and neurocognitive architectures have paid relatively little attention to the use of self-organizing maps (SOMs). Part of the reason for this is that most conventional SOMs use a static encoding representation: each input pattern or sequence is effectively represented as a fixed point activation pattern in the map layer, something that is inconsistent with the rhythmic oscillatory activity observed in the brain. Here we develop and study an alternative encoding scheme that instead uses sparsely-coded limit cycles to represent external input patterns/sequences. We establish conditions under which learned limit cycle representations arise reliably and dominate the dynamics in a SOM. These limit cycles tend to be relatively unique for different inputs, robust to perturbations, and fairly insensitive to timing. In spite of the continually changing activity in the map layer when a limit cycle representation is used, map formation continues to occur reliably. In a two-SOM architecture where each SOM represents a different sensory modality, we also show that after learning, limit cycles in one SOM can correctly evoke corresponding limit cycles in the other, and thus there is the potential for multi-SOM systems using limit cycles to work effectively as hetero-associative memories. While the results presented here are only first steps, they establish the viability of SOM models based on limit cycle activity patterns, and suggest that such models merit further study. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. A fast approach to generate large-scale topographic maps based on new Chinese vehicle-borne Lidar system

    International Nuclear Information System (INIS)

    Youmei, Han; Bogang, Yang

    2014-01-01

    Large-scale topographic maps are important basic information for city and regional planning and management. Traditional large-scale mapping methods are based mostly on manual surveying and photogrammetry. Manual surveying is inefficient and limited by the environment, while photogrammetric methods (such as low-altitude aerial mapping) are an economical and effective way to map wide areas but do not work well in small areas due to the high cost in manpower and resources. In recent years, vehicle-borne LiDAR technology has developed rapidly, and its application in surveying and mapping is becoming a new topic. The main objective of this investigation is to explore the potential of vehicle-borne LiDAR technology for fast mapping of large-scale topographic maps, based on a new Chinese vehicle-borne LiDAR system. After field data capture, maps can be produced in the office from the LiDAR point cloud using software that we developed. In addition, the detailed process and an accuracy analysis are presented through an actual case. The results show that this new technology provides a fast method to generate large-scale topographic maps that is highly efficient and accurate compared to traditional methods.

  2. National-scale crop type mapping and area estimation using multi-resolution remote sensing and field survey

    Science.gov (United States)

    Song, X. P.; Potapov, P.; Adusei, B.; King, L.; Khan, A.; Krylov, A.; Di Bella, C. M.; Pickens, A. H.; Stehman, S. V.; Hansen, M.

    2016-12-01

    Reliable and timely information on agricultural production is essential for ensuring world food security. Freely available medium-resolution satellite data (e.g. Landsat, Sentinel) offer the possibility of improved global agriculture monitoring. Here we develop and test a method for estimating in-season crop acreage using a probability sample of field visits and producing wall-to-wall crop type maps at national scales. The method is first illustrated for soybean cultivated area in the US for 2015. A stratified, two-stage cluster sampling design was used to collect field data to estimate national soybean area. The field-based estimate employed historical soybean extent maps from the U.S. Department of Agriculture (USDA) Cropland Data Layer to delineate and stratify U.S. soybean growing regions. The estimated 2015 U.S. soybean cultivated area based on the field sample was 341,000 km2 with a standard error of 23,000 km2. This result is 1.0% lower than USDA's 2015 June survey estimate and 1.9% higher than USDA's 2016 January estimate. Our area estimate was derived in early September, about 2 months ahead of harvest. To map soybean cover, the Landsat image archive for the year 2015 growing season was processed using an active learning approach. Overall accuracy of the soybean map was 84%. The field-based sample estimated area was then used to calibrate the map such that the soybean acreage of the map derived through pixel counting matched the sample-based area estimate. The strength of the sample-based area estimation lies in the stratified design that takes advantage of the spatially explicit cropland layers to construct the strata. The success of the mapping was built upon an automated system which transforms Landsat images into standardized time-series metrics. The developed method produces reliable and timely information on soybean area in a cost-effective way and could be implemented in an operational mode. The approach has also been applied for other crops in
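The calibration step described above — scaling the pixel-count map area so it matches the sample-based estimate — is simple arithmetic. The sample estimate (341,000 km²) and standard error (23,000 km²) are from the abstract; the pixel-count figure below is an invented stand-in:

```python
# Sample-based (design-based) estimate and standard error, from the abstract.
sample_estimate_km2 = 341_000.0
sample_se_km2 = 23_000.0

# Invented pixel-count area from the wall-to-wall soybean map.
map_pixel_area_km2 = 362_000.0

# Calibrate the map so pixel counting reproduces the sample estimate.
calibration = sample_estimate_km2 / map_pixel_area_km2
calibrated_area_km2 = map_pixel_area_km2 * calibration

# 95% confidence interval from the sample standard error.
ci = (sample_estimate_km2 - 1.96 * sample_se_km2,
      sample_estimate_km2 + 1.96 * sample_se_km2)
```

The sample carries the statistically defensible area total, while the calibrated map carries its spatial distribution.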

  3. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
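The post-processing step described above reduces to a simple frequency count: across many equally likely simulated contamination grids, the probability of exceeding a threshold at a cell is the fraction of realizations that exceed it there. The grids and threshold below are synthetic, purely to illustrate the mechanics.

```python
# Sketch of probability mapping from geostatistical simulations: count, per
# cell, how many of the equally likely realizations exceed the threshold.
import random

random.seed(0)
n_sims, rows, cols = 200, 4, 4
threshold = 35.0  # illustrative clean-up level (e.g. ppm)

# Each realization: a rows x cols grid of simulated concentrations.
sims = [[[random.gauss(30, 10) for _ in range(cols)] for _ in range(rows)]
        for _ in range(n_sims)]

# Exceedance probability map: fraction of realizations above the threshold.
prob_map = [[sum(sim[r][c] > threshold for sim in sims) / n_sims
             for c in range(cols)] for r in range(rows)]
```

In practice each realization would come from a conditional geostatistical simulation honoring the measured sample values; here independent draws stand in for them.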

  4. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  5. Multispectral Image Road Extraction Based Upon Automated Map Conflation

    Science.gov (United States)

    Chen, Bin

    Road network extraction from remotely sensed imagery enables many important and diverse applications such as vehicle tracking, drone navigation, and intelligent transportation studies. There are, however, a number of challenges to road detection from an image. Road pavement material, width, direction, and topology vary across a scene. Complete or partial occlusions caused by nearby buildings, trees, and the shadows cast by them, make maintaining road connectivity difficult. The problems posed by occlusions are exacerbated with the increasing use of oblique imagery from aerial and satellite platforms. Further, common objects such as rooftops and parking lots are made of materials similar or identical to road pavements. This problem of common materials is a classic case of a single land cover material existing for different land use scenarios. This work addresses these problems in road extraction from geo-referenced imagery by leveraging the OpenStreetMap digital road map to guide image-based road extraction. The crowd-sourced cartography has the advantages of worldwide coverage that is constantly updated. The derived road vectors follow only roads and so can serve to guide image-based road extraction with minimal confusion from occlusions and changes in road material. On the other hand, the vector road map has no information on road widths and misalignments between the vector map and the geo-referenced image are small but nonsystematic. Properly correcting misalignment between two geospatial datasets, also known as map conflation, is an essential step. A generic framework requiring minimal human intervention is described for multispectral image road extraction and automatic road map conflation. The approach relies on the road feature generation of a binary mask and a corresponding curvilinear image. A method for generating the binary road mask from the image by applying a spectral measure is presented. The spectral measure, called anisotropy-tunable distance (ATD

  6. Towards an EO-based Landslide Web Mapping and Monitoring Service

    Science.gov (United States)

    Hölbling, Daniel; Weinke, Elisabeth; Albrecht, Florian; Eisank, Clemens; Vecchiotti, Filippo; Friedl, Barbara; Kociu, Arben

    2017-04-01

    National and regional authorities and infrastructure maintainers in mountainous regions require accurate knowledge of the location and spatial extent of landslides for hazard and risk management. Information on landslides is often collected through a combination of ground surveying and manual image interpretation following landslide-triggering events. However, the high workload and limited time for data acquisition result in a trade-off between completeness, accuracy and detail. Remote sensing data offer great potential for mapping and monitoring landslides in a fast and efficient manner. Despite the increasing availability of high-quality Earth Observation (EO) data and new computational methods, there is still a lack of science-policy interaction and of innovative tools and methods that stakeholders and users can easily employ to support their daily work. Taking up this issue, we introduce an innovative and user-oriented EO-based web service for landslide mapping and monitoring. Three central design components of the service are presented: (1) the user requirements definition, (2) the semi-automated image analysis methods implemented in the service, and (3) the web mapping application with its responsive user interface. User requirements were gathered during semi-structured interviews with regional authorities. The potential users were asked if and how they employ remote sensing data for landslide investigation and what they expect from a landslide web mapping service in terms of reliability and usability. The interviews revealed the capability of our service for landslide documentation and mapping as well as monitoring of selected landslide sites, for example to complete and update landslide inventory maps. In addition, the users see considerable potential for landslide rapid mapping. The user requirements analysis served as the basis for the service concept definition. 
Optical satellite imagery from different high resolution (HR) and very high

  7. Procedure for extraction of disparate data from maps into computerized data bases

    Science.gov (United States)

    Junkin, B. G.

    1979-01-01

    A procedure is presented for extracting disparate sources of data from geographic maps and for the conversion of these data into a suitable format for processing on a computer-oriented information system. Several graphic digitizing considerations are included and related to the NASA Earth Resources Laboratory's Digitizer System. Current operating procedures for the Digitizer System are given in a simplified and logical manner. The report serves as a guide to those organizations interested in converting map-based data by using a comparable map digitizing system.

  8. An Approach of Dynamic Object Removing for Indoor Mapping Based on UGV SLAM

    Directory of Open Access Journals (Sweden)

    Jian Tang

    2015-07-01

    Full Text Available The study of indoor mapping for Location Based Services (LBS) has become more and more popular in recent years. LiDAR-SLAM-based mapping appears to be a promising indoor mapping solution. However, dynamic objects such as pedestrians and indoor vehicles exist in the raw LiDAR range data and have to be removed for mapping purposes. In this paper, a new approach to dynamic object removal called Likelihood Grid Voting (LGV) is presented. It is a model-free method that takes full advantage of the high scanning rate of a LiDAR moving at a relatively low speed in an indoor environment. In this method, a counting grid records how often each map position is occupied by laser scans; positions with low counter values are recognized as dynamic objects and their point clouds are removed from the map. This work is part of the algorithms in our self-developed Unmanned Ground Vehicle (UGV) Simultaneous Localization and Mapping (SLAM) system, NAVIS. Field tests were carried out in an indoor parking area with NAVIS to evaluate the effectiveness of the proposed method. The results show that small objects such as pedestrians can be detected and removed quickly, while large objects such as cars can be detected and partly removed.
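The voting idea described above can be sketched in a few lines. This is a minimal, model-free illustration under assumptions of my own (grid cells as `(x, y)` tuples, a fixed vote threshold), not the NAVIS implementation.

```python
# Sketch of Likelihood Grid Voting: each scan votes for the grid cells its
# returns occupy; cells hit in only a few scans are treated as dynamic
# objects and dropped from the map.
from collections import Counter

def lgv_filter(scans, min_votes):
    """scans: list of scans, each a set of (x, y) grid cells hit by the laser.
    Returns the static map cells, i.e. cells seen in at least min_votes scans."""
    counts = Counter()
    for scan in scans:
        counts.update(scan)
    return {cell for cell, votes in counts.items() if votes >= min_votes}

# A wall cell hit in every scan is kept; a cell crossed once by a pedestrian
# accumulates too few votes and is removed.
scans = [{(0, 0), (1, 2)}, {(0, 0)}, {(0, 0)}]
static = lgv_filter(scans, min_votes=2)
```

The high scanning rate matters here: a static surface is revisited many times while the platform moves slowly, so its counters dominate those of transient objects.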

  9. One-way hash function construction based on chaotic map network

    International Nuclear Information System (INIS)

    Yang Huaqian; Wong, K.-W.; Liao Xiaofeng; Wang Yong; Yang Degang

    2009-01-01

    A novel chaotic hash algorithm based on a network structure formed by 16 chaotic maps is proposed. The original message is first padded with zeros to make its length a multiple of four, then divided into blocks of 4 bytes each. In the hashing process, the blocks are mixed together by the chaotic map network, since the initial value and the control parameter of each tent map are dynamically determined by the outputs of its neighbors. To enhance the confusion and diffusion effect, the cipher block chaining (CBC) mode is adopted in the algorithm. Theoretical analyses and numerical simulations both show that the proposed hash algorithm possesses good statistical properties, strong collision resistance and high flexibility, as required of practical keyed hash functions.
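A toy sketch of the scheme's ingredients follows: a tent map iterated per 4-byte block, with CBC-style chaining so each block's output perturbs the next block's initial value. This is not the published 16-map network algorithm; the constants, iteration count and mixing are illustrative assumptions.

```python
# Toy chaotic hash: zero-pad to 4-byte blocks, iterate a tent map per block,
# chain the state CBC-style, and XOR the quantized outputs into a digest.
def tent(x, r=1.9999):
    """Tent map on [0, 1); r just below 2 keeps orbits inside the interval."""
    return r * x if x < 0.5 else r * (1.0 - x)

def chaotic_hash(message: bytes) -> int:
    data = message + b"\x00" * (-len(message) % 4)  # zero-pad to 4-byte blocks
    state = 0.1
    digest = 0
    for i in range(0, len(data), 4):
        block = int.from_bytes(data[i:i + 4], "big")
        # CBC-style chaining: mix the previous output into the initial value.
        x = ((state + block / 2**32) % 1.0) or 0.1
        for _ in range(16):             # iterate the tent map
            x = tent(x)
        state = x
        digest ^= int(x * 2**32)        # quantize the orbit into 32 bits
    return digest

h1 = chaotic_hash(b"abc")
h2 = chaotic_hash(b"abd")
```

Because the tent map's slope magnitude is nearly 2, a one-byte change in a block is amplified roughly 2^16-fold over the 16 iterations, which is the diffusion effect the abstract refers to.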

  10. MAP as a model for practice-based learning and improvement in child psychiatry training.

    Science.gov (United States)

    Kataoka, Sheryl H; Podell, Jennifer L; Zima, Bonnie T; Best, Karin; Sidhu, Shawn; Jura, Martha Bates

    2014-01-01

    A growing literature not only demonstrates the positive outcomes that result from implementing evidence-based treatments (EBTs), but also suggests that these EBTs are rarely delivered in "usual care" practices. One way to address this deficit is to improve the quality of psychotherapy teaching for clinicians-in-training. The Accreditation Council for Graduate Medical Education (ACGME) requires all training programs to assess residents in a number of competencies, including Practice-Based Learning and Improvement (PBLI). This article describes the piloting of Managing and Adapting Practice (MAP) for child psychiatry fellows, to teach them both EBT and PBLI skills. Eight child psychiatry trainees received 5 full days of MAP training and are delivering MAP in a year-long outpatient teaching clinic. In this setting, MAP is applied to the complex, multiply diagnosed psychiatric patients that present to this clinic. This article describes how MAP tools and resources assist in teaching trainees each of the eight required competency components of PBLI, including identifying deficits in expertise, setting learning goals, performing learning activities, conducting quality improvement methods in practice, incorporating formative feedback, using scientific studies to inform practice, using technology for learning, and participating in patient education. A case example illustrates the use of MAP in teaching PBLI. MAP provides a unique way to teach important quality improvement and practice-based learning skills to trainees while training them in important psychotherapy competence.

  11. Mapping species distributions with MAXENT using a geographically biased sample of presence data: a performance assessment of methods for correcting sampling bias.

    Science.gov (United States)

    Fourcade, Yoan; Engler, Jan O; Rödder, Dennis; Secondi, Jean

    2014-01-01

    MAXENT is now a common species distribution modeling (SDM) tool used by conservation practitioners for predicting the distribution of a species from a set of records and environmental predictors. However, datasets of species occurrence used to train the model are often geographically biased because of unequal sampling effort across the study area. This bias may be a source of strong inaccuracy in the resulting model and could lead to incorrect predictions. Although a number of sampling bias correction methods have been proposed, there is no consensus guideline for accounting for it. We compared here the performance of five methods of bias correction on three datasets of species occurrence: one "virtual" dataset derived from a land cover map, and two actual datasets for a turtle (Chrysemys picta) and a salamander (Plethodon cylindraceus). We subjected these datasets to four types of sampling biases corresponding to potential types of empirical biases. We applied five correction methods to the biased samples and compared the outputs of distribution models to unbiased datasets to assess the overall correction performance of each method. The results revealed that the ability of methods to correct the initial sampling bias varied greatly depending on bias type, bias intensity and species. However, simple systematic sampling of records consistently ranked among the best performing methods across the range of conditions tested, whereas other methods performed more poorly in most cases. The strong effect of initial conditions on correction performance highlights the need for further research towards a step-by-step guideline for handling sampling bias. In the meantime, systematic sampling of records appears to be the most efficient correction and can be advised in most cases.
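The systematic-sampling correction that performed best above amounts to spatial thinning: keep at most one occurrence record per grid cell so densely surveyed areas no longer dominate the training data. The sketch below is a generic illustration (cell size and records are made up), not the exact procedure used in the study.

```python
# Hedged sketch of systematic sampling of occurrence records: retain one
# record per grid cell to even out geographically uneven sampling effort.
def systematic_sample(records, cell_size):
    """records: (lon, lat) tuples; returns at most one record per grid cell."""
    seen = {}
    for lon, lat in records:
        cell = (int(lon // cell_size), int(lat // cell_size))
        seen.setdefault(cell, (lon, lat))   # keep the first record in each cell
    return list(seen.values())

# Five clustered records collapse to one; the isolated record survives.
records = [(0.01, 0.02), (0.03, 0.04), (0.02, 0.01), (0.04, 0.03),
           (0.01, 0.03), (5.0, 5.0)]
thinned = systematic_sample(records, cell_size=1.0)
```

The choice of cell size trades off bias removal against loss of records, which is one reason correction performance depends so strongly on the initial conditions.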

  12. Exploration and implementation of ontology-based cultural relic knowledge map integration platform

    Science.gov (United States)

    Yang, Weiqiang; Dong, Yiqiang

    2018-05-01

    To help designers better carry out creative design and to improve the ability to search traditional cultural relic information, an ontology-based knowledge map construction method was explored and an integrated platform for the cultural relic knowledge map was developed. First, a construction method for the cultural relic ontology was put forward, and the knowledge map of cultural relics was built on the constructed ontology. Then, a personalized semantic retrieval framework for creative design was proposed. Finally, the integrated cultural relic knowledge map platform was designed and implemented. The platform is divided into two parts: a foreground display system, used by designers to search and browse cultural relics, and a background management system, used by cultural experts to manage cultural relic knowledge. The results showed that the platform improves the retrieval of cultural relic information and can provide good support for designers' creative work.

  13. Development of Geospatial Map Based Election Portal

    Science.gov (United States)

    Gupta, A. Kumar Chandra; Kumar, P.; Vasanth Kumar, N.

    2014-11-01

    Geospatial Delhi Limited (GSDL) is a Govt. of NCT of Delhi company formed to provide geospatial information on the National Capital Territory of Delhi (NCTD) to the Government of National Capital Territory of Delhi (GNCTD) and its organs, such as the DDA, MCD, DJB, State Election Department and DMRC, for the benefit of all citizens of the NCTD. This paper describes the development of the Geospatial Map based Election Portal (GMEP) of NCT of Delhi. The portal has been developed as a map-based spatial decision support system (SDSS) for planning and management by the Department of the Chief Electoral Officer, and as an election-related information search tool (polling stations, assembly and parliamentary constituencies, etc.) for the citizens of NCTD. The GMEP is based on a client-server architecture model. It has been developed using ArcGIS Server 10.0 with a J2EE front-end in a Microsoft Windows environment, and is scalable to an enterprise SDSS with an enterprise geodatabase and Virtual Private Network (VPN) connectivity. Spatial data in the GMEP include delimited precinct boundaries of polling station voter areas, assembly constituencies, parliamentary constituencies, election districts, landmark locations of polling stations and basic amenities (police stations, hospitals, schools, fire stations, etc.). The GMEP helps achieve not only the desired transparency and ease in the planning process but also provides efficient and effective tools for election management. It enables a faster response to changing ground realities in development planning, owing to its built-in scientific approach and open-ended design.

  14. Ab initio and template-based prediction of multi-class distance maps by two-dimensional recursive neural networks

    Directory of Open Access Journals (Sweden)

    Martin Alberto JM

    2009-01-01

    Full Text Available Abstract Background Prediction of protein structures from their sequences is still one of the open grand challenges of computational biology. Some approaches to protein structure prediction, especially ab initio ones, rely to some extent on the prediction of residue contact maps. Residue contact map predictions have been assessed at the CASP competition for several years now. Although it has been shown that exact contact maps generally yield correct three-dimensional structures, this is true only at a relatively low resolution (3–4 Å from the native structure. Another known weakness of contact maps is that they are generally predicted ab initio, that is not exploiting information about potential homologues of known structure. Results We introduce a new class of distance restraints for protein structures: multi-class distance maps. We show that Cα trace reconstructions based on 4-class native maps are significantly better than those from residue contact maps. We then build two predictors of 4-class maps based on recursive neural networks: one ab initio, or relying on the sequence and on evolutionary information; one template-based, or in which homology information to known structures is provided as a further input. We show that virtually any level of sequence similarity to structural templates (down to less than 10% yields more accurate 4-class maps than the ab initio predictor. We show that template-based predictions by recursive neural networks are consistently better than the best template and than a number of combinations of the best available templates. We also extract binary residue contact maps at an 8 Å threshold (as per CASP assessment from the 4-class predictors and show that the template-based version is also more accurate than the best template and consistently better than the ab initio one, down to very low levels of sequence identity to structural templates. Furthermore, we test both ab-initio and template-based 8
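The relationship between multi-class distance maps and binary contact maps described above can be illustrated with a simple binning step. The class boundaries below are assumptions for illustration; only the 8 Å contact cutoff is stated in the abstract.

```python
# Bin a Calpha distance matrix (in angstroms) into a 4-class distance map,
# and extract the binary 8 A contact map as the lowest class.
def to_multiclass(dist_matrix, thresholds=(8.0, 13.0, 19.0)):
    def cls(d):
        for k, t in enumerate(thresholds):
            if d <= t:
                return k
        return len(thresholds)          # class 3: beyond the last threshold
    return [[cls(d) for d in row] for row in dist_matrix]

def to_contact_map(dist_matrix, cutoff=8.0):
    # The binary contact map is just the lowest class of the multi-class map.
    return [[int(d <= cutoff) for d in row] for row in dist_matrix]

dists = [[0.0, 6.2, 14.7], [6.2, 0.0, 21.3], [14.7, 21.3, 0.0]]
four = to_multiclass(dists)
```

The extra classes carry coarse distance information that a binary contact map discards, which is why reconstructions from 4-class maps can be more accurate.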

  15. Aerosol Extinction Profile Mapping with Lognormal Distribution Based on MPL Data

    Science.gov (United States)

    Lin, T. H.; Lee, T. T.; Chang, K. E.; Lien, W. H.; Liu, G. R.; Liu, C. Y.

    2017-12-01

    This study addresses the mapping of the aerosol extinction profile by a mathematical function. Given the similarity in distribution pattern, a lognormal distribution is examined for mapping the aerosol extinction profile based on in-situ MPL (Micro Pulse LiDAR) measurements. The variables of the lognormal distribution are the log mean (μ) and log standard deviation (σ), which are correlated with aerosol optical depth (AOD) and planetary boundary layer height (PBLH), associated with the altitude of the extinction peak (Mode) defined in this study. Based on 10 years of MPL data with a single peak, the mapping results showed mean errors of 16.1% and 25.3% in the Mode and σ retrievals, respectively. The mean error of the σ retrieval can be reduced to 16.5% for cases with a larger distance between PBLH and Mode. The proposed method is further applied to the MODIS AOD product to map extinction profiles for the retrieval of PM2.5 from satellite observations. The results indicated good agreement between retrievals and ground measurements when aerosols below 525 meters are well mixed. The case study also suggests the feasibility of applying the proposed method to satellite remote sensing. Keywords: Aerosol extinction profile, Lognormal distribution, MPL, Planetary boundary layer height (PBLH), Aerosol optical depth (AOD), Mode
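The parameterization above can be sketched as a lognormal-shaped extinction coefficient in altitude, scaled so the column integral equals the AOD; the mode altitude then follows analytically as exp(μ − σ²). The parameter values below are made up for illustration.

```python
# Sketch of a lognormal extinction profile: sigma_ext(z) = AOD * lognormal
# pdf(z; mu, sigma), so integrating over altitude recovers the AOD.
import math

def extinction(z, aod, mu, sigma):
    """Lognormal-shaped extinction coefficient (km^-1) at altitude z (km)."""
    pdf = (math.exp(-(math.log(z) - mu) ** 2 / (2 * sigma ** 2))
           / (z * sigma * math.sqrt(2 * math.pi)))
    return aod * pdf

aod, mu, sigma = 0.4, 0.0, 0.5          # illustrative values
mode = math.exp(mu - sigma ** 2)        # altitude of the extinction peak (km)
```

With μ and σ retrieved from AOD and PBLH, this closed form is what allows a full profile to be mapped from column-integrated satellite observations.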

  16. An Algorithm Based on the Self-Organized Maps for the Classification of Facial Features

    Directory of Open Access Journals (Sweden)

    Gheorghe Gîlcă

    2015-12-01

    Full Text Available This paper deals with an algorithm based on Self-Organized Map (SOM) networks for classifying facial features. The proposed algorithm categorizes the facial features defined by the input variables (eyebrows, mouth, eyelids) into a map of their groupings. The grouping map is based on calculating the distance between each input vector and each neuron of the output layer, the neuron with the minimum distance being declared the winner. The network structure consists of two levels: the first level contains three input vectors, each with forty-one values, while the second level contains the SOM competitive network, which consists of 100 neurons. The proposed system can classify facial features quickly and easily using the proposed SOM-based algorithm.
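The winner-take-all step described above can be sketched in a few lines. The weight vectors and input below are toy two-dimensional values, not the paper's 41-value feature vectors or 100-neuron map.

```python
# Minimal sketch of the SOM winner selection: the output neuron whose weight
# vector is nearest (Euclidean distance) to the input is declared the winner.
import math

def winner(weights, x):
    """weights: list of neuron weight vectors; returns the winning neuron index."""
    def dist(w):
        return math.sqrt(sum((wi - xi) ** 2 for wi, xi in zip(w, x)))
    return min(range(len(weights)), key=lambda i: dist(weights[i]))

neurons = [[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]]
idx = winner(neurons, [0.9, 1.1])
```

In a full SOM, training would also pull the winner's (and its neighbors') weights toward the input; classification alone only needs the winner lookup shown here.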

  17. CONCEPT MAPS – IMPROVEMENT TOOL FOR ACCOUNTING INFORMATION

    Directory of Open Access Journals (Sweden)

    Oana DRĂGAN

    2014-12-01

    Full Text Available Concept maps, viewed as an innovative method for learning and evaluation, are used to synthesize the knowledge of the participants in the learning process and are based on the main concepts and the relationships between them. They offer a visual representation of the information held by an individual, captured through his or her ability to synthesize the key notions and concepts. The current study intends to show the importance and efficiency of using concept maps in economics, especially in accounting, as a method designed to support the learning process and to offer sustainable value. The empirical study is based on the manner in which accounting knowledge is displayed by a sample group of 19 practitioners. The originality and relevance of the concept map method are underlined by having the practitioners create their own concept maps, designed to point out the importance of the cognitive structure when describing the relationships between different accounting principles.

  18. Epitope mapping: the first step in developing epitope-based vaccines.

    Science.gov (United States)

    Gershoni, Jonathan M; Roitburd-Berman, Anna; Siman-Tov, Dror D; Tarnovitski Freund, Natalia; Weiss, Yael

    2007-01-01

    Antibodies are an effective line of defense in preventing infectious diseases. Highly potent neutralizing antibodies can intercept a virus before it attaches to its target cell and, thus, inactivate it. This ability is based on the antibodies' specific recognition of epitopes, the sites of the antigen to which antibodies bind. Thus, understanding the antibody/epitope interaction provides a basis for the rational design of preventive vaccines. It is assumed that immunization with the precise epitope, corresponding to an effective neutralizing antibody, would elicit the generation of similarly potent antibodies in the vaccinee. Such a vaccine would be a 'B-cell epitope-based vaccine', the implementation of which requires the ability to backtrack from a desired antibody to its corresponding epitope. In this article we discuss a range of methods that enable epitope discovery based on a specific antibody. Such a reversed immunological approach is the first step in the rational design of an epitope-based vaccine. Undoubtedly, the gold standard for epitope definition is x-ray analyses of crystals of antigen:antibody complexes. This method provides atomic resolution of the epitope; however, it is not readily applicable to many antigens and antibodies, and requires a very high degree of sophistication and expertise. Most other methods rely on the ability to monitor the binding of the antibody to antigen fragments or mutated variations. In mutagenesis of the antigen, loss of binding due to point modification of an amino acid residue is often considered an indication of an epitope component. In addition, computational combinatorial methods for epitope mapping are also useful. These methods rely on the ability of the antibody of interest to affinity isolate specific short peptides from combinatorial phage display peptide libraries. The peptides are then regarded as leads for the definition of the epitope corresponding to the antibody used to screen the peptide library. For

  19. A LiDAR based analysis of hydraulic hazard mapping

    Science.gov (United States)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure, as it involves technical and socio-economic aspects. On the one hand, no dangerous areas should be excluded; on the other hand, it is important not to extend, beyond what is necessary, the surface subject to use limitations. The availability of a high-resolution topographic survey nowadays makes it possible to face this task with innovative procedures, both in the planning (mapping) phase and in the map validation phase. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore does not intend in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This has the consequence of unrealistically treating the vegetation as a geometric obstacle to water flow. In some cases the topographic model construction requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated with the talweg) is often inverted. 
In the second step the segments are analysed

  20. A highly efficient approach to protein interactome mapping based on collaborative filtering framework.

    Science.gov (United States)

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-09

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Carefully designed small-scale experiments are not only very expensive but also inefficient for identifying numerous interactomes, despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data as an interactome weight matrix, from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins, which drives the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach significantly outperforms several sophisticated topology-based approaches.
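The exact rescaled cosine coefficient is not given in the abstract; as a stand-in, the sketch below scores the similarity between two proteins with plain cosine similarity over their rows in a toy interactome weight matrix, which is the neighborhood-similarity idea the CF framework builds on.

```python
# Cosine similarity between two proteins' interaction profiles (rows of the
# interactome weight matrix). The actual method rescales this coefficient.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Rows of a toy interactome weight matrix (1.0 = putative interaction).
p1 = [1.0, 0.0, 1.0, 1.0]
p2 = [1.0, 0.0, 1.0, 0.0]
sim = cosine(p1, p2)
```

In a CF setting, such similarities between "neighbor" proteins are then used to predict the missing entries of the sparse interaction matrix, analogous to predicting unseen ratings from similar users.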

  1. A systematic review of concept mapping-based formative assessment processes in primary and secondary science education

    DEFF Research Database (Denmark)

    Hartmeyer, Rikke; Stevenson, Matt P.; Bentsen, Peter

    2017-01-01

    In this paper, we present and discuss the results of a systematic review of concept mapping-based interventions in primary and secondary science education. We identified the following recommendations for science educators on how to successfully apply concept mapping as a method for formative assessment: firstly, concept mapping should be conducted in teaching, preferably on repeated occasions. Secondly, concept mapping should be carried out individually if personal understanding is to be elicited; however, collaborative concept mapping might foster discussions valuable for developing students' understanding and for activating them as instructional resources and owners of their own learning. Thirdly, low-directed mapping seems most suitable for formative assessment. Fourthly, technology-based or peer assessments are useful strategies likely to reduce the load of interpretation for the educator.

  2. Earth-Base: testing the temporal congruency of paleontological collections and geologic maps of North America

    Science.gov (United States)

    Heim, N. A.; Kishor, P.; McClennen, M.; Peters, S. E.

    2012-12-01

    Free and open source software and data facilitate novel research by allowing geoscientists to quickly and easily bring together disparate data that have been independently collected for many different purposes. The Earth-Base project brings together several datasets using a common space-time framework that is managed and analyzed using open source software. Earth-Base currently draws on stratigraphic, paleontologic, tectonic, geodynamic, seismic, botanical, hydrologic and cartographic data. Furthermore, Earth-Base is powered by RESTful data services operating on top of PostgreSQL and MySQL databases and the R programming environment, making much of the functionality accessible to third-parties even though the detailed data schemas are unknown to them. We demonstrate the scientific potential of Earth-Base and other FOSS by comparing the stated age of fossil collections to the age of the bedrock upon which they are geolocated. This analysis makes use of web services for the Paleobiology Database (PaleoDB), Macrostrat, the 2005 Geologic Map of North America (Garrity et al. 2009) and geologic maps of the conterminous United States. This analysis is a way to quickly assess the accuracy of temporal and spatial congruence of the paleontologic and geologic map datasets. We find that 56.1% of the 52,593 PaleoDB collections have temporally consistent ages with the bedrock upon which they are located based on the Geologic Map of North America. Surprisingly, fossil collections within the conterminous United States are more consistently located on bedrock with congruent geological ages, even though the USA maps are spatially and temporally more precise. Approximately 57% of the 37,344 PaleoDB collections in the USA are located on similarly aged geologic map units. Increased accuracy is attributed to the lumping of Pliocene and Quaternary geologic map units along the Atlantic and Gulf coastal plains in the Geologic Map of North America. The abundant Pliocene fossil collections
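The temporal-congruency test reduces to checking whether a collection's stated age interval overlaps the age interval of the map unit it sits on. A minimal sketch, with hypothetical age bounds rather than actual PaleoDB/Macrostrat fields:

```python
def ages_congruent(fossil_age, bedrock_age):
    """Return True when the fossil collection's age interval (Ma)
    overlaps the geologic map unit's age interval (Ma).
    Intervals are (older_bound, younger_bound) tuples."""
    f_old, f_young = fossil_age
    b_old, b_young = bedrock_age
    # Two closed intervals overlap iff each starts before the other ends.
    return f_old >= b_young and b_old >= f_young

# A Late Cretaceous collection on a map unit spanning the Cretaceous:
print(ages_congruent((72.1, 66.0), (145.0, 66.0)))   # True
# The same collection geolocated on a Pliocene unit is incongruent:
print(ages_congruent((72.1, 66.0), (5.333, 2.58)))   # False
```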

  3. A Systematic Mapping Study of Software Architectures for Cloud Based Systems

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Context: Cloud computing has gained significant attention of researchers and practitioners. This emerging paradigm is being used to provide solutions in multiple domains without huge upfront investment because of its on demand recourse-provisioning model. However, the information about how software...... of this study is to systematically identify and analyze the currently published research on the topics related to software architectures for cloud-based systems in order to identify architecture solutions for achieving quality requirements. Method: We decided to carry out a systematic mapping study to find...... as much peer-reviewed literature on the topics related to software architectures for cloud-based systems as possible. This study has been carried out by following the guidelines for conducting systematic literature reviews and systematic mapping studies as reported in the literature. Based on our paper...

  4. Quantitative mapping of chemical compositions with MRI using compressed sensing.

    Science.gov (United States)

    von Harbou, Erik; Fabich, Hilary T; Benning, Martin; Tayler, Alexander B; Sederman, Andrew J; Gladden, Lynn F; Holland, Daniel J

    2015-12-01

    In this work, a magnetic resonance (MR) imaging method for accelerating the acquisition of two-dimensional concentration maps of different chemical species in mixtures by the use of compressed sensing (CS) is presented. Whilst 2D concentration maps with a high spatial resolution are prohibitively time-consuming to acquire using full k-space sampling techniques, CS enables the reconstruction of quantitative concentration maps from sub-sampled k-space data. First, the method was tested by reconstructing simulated data. Then, the CS algorithm was used to reconstruct concentration maps of binary mixtures of 1,4-dioxane and cyclooctane in different samples with a field-of-view of 22 mm and a spatial resolution of 344 μm × 344 μm. Spiral-based trajectories were used as sampling schemes. For the data acquisition, eight scans with slightly different trajectories were applied, resulting in a total acquisition time of about 8 min. In contrast, a conventional chemical shift imaging experiment at the same resolution would require about 17 h. To get quantitative results, a careful weighting of the regularisation parameter (via the L-curve approach) or contrast-enhancing Bregman iterations are applied for the reconstruction of the concentration maps. Both approaches yield relative errors of the concentration map of less than 2 mol-% without any calibration prior to the measurement. The accuracy of the reconstructed concentration maps deteriorates when the reconstruction model is biased by systematic errors such as large inhomogeneities in the static magnetic field. The presented method is a powerful tool for the fast acquisition of concentration maps that can provide valuable information for the investigation of many phenomena in chemical engineering applications. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
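The paper's reconstruction works on spiral k-space data with L-curve or Bregman machinery; as a minimal stand-in, the soft-thresholding (ISTA) iteration at the core of many L1-regularized CS solvers can be sketched on a toy real-valued problem. The matrix, data, and parameters below are illustrative assumptions, not the paper's setup:

```python
def soft_threshold(x, t):
    """Proximal operator of t*|x|: shrink x toward zero by t."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def ista(A, y, lam, step, n_iter=500):
    """Iterative shrinkage-thresholding for min ||Ax - y||^2 + lam*||x||_1.
    A is a list of rows (the sub-sampling/encoding matrix), y the data."""
    n, m = len(A[0]), len(A)
    x = [0.0] * n
    for _ in range(n_iter):
        # Residual and gradient of the data-fidelity term: 2*A^T(Ax - y)
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [2.0 * sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = [soft_threshold(x[j] - step * g[j], step * lam) for j in range(n)]
    return x

# Toy problem: recover a sparse 3-vector from only 2 linear measurements.
A = [[1.0, 0.0, 1.0],
     [0.0, 1.0, 1.0]]
y = [1.0, 1.0]          # consistent with the sparse solution x = [0, 0, 1]
x_hat = ista(A, y, lam=0.05, step=0.1)
```

The L1 penalty selects the sparse solution near [0, 0, 1] (slightly shrunken by the regularization), even though the system is underdetermined.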

  5. Web Based Rapid Mapping of Disaster Areas using Satellite Images, Web Processing Service, Web Mapping Service, Frequency Based Change Detection Algorithm and J-iView

    Science.gov (United States)

    Bandibas, J. C.; Takarada, S.

    2013-12-01

    Timely identification of areas affected by natural disasters is very important for a successful rescue and effective emergency relief efforts. This research focuses on the development of a cost effective and efficient system of identifying areas affected by natural disasters, and the efficient distribution of the information. The developed system is composed of 3 modules which are the Web Processing Service (WPS), Web Map Service (WMS) and the user interface provided by J-iView (fig. 1). WPS is an online system that provides computation, storage and data access services. In this study, the WPS module provides online access of the software implementing the developed frequency based change detection algorithm for the identification of areas affected by natural disasters. It also sends requests to WMS servers to get the remotely sensed data to be used in the computation. WMS is a standard protocol that provides a simple HTTP interface for requesting geo-registered map images from one or more geospatial databases. In this research, the WMS component provides remote access of the satellite images which are used as inputs for land cover change detection. The user interface in this system is provided by J-iView, which is an online mapping system developed at the Geological Survey of Japan (GSJ). The 3 modules are seamlessly integrated into a single package using J-iView, which could rapidly generate a map of disaster areas that is instantaneously viewable online. The developed system was tested using ASTER images covering the areas damaged by the March 11, 2011 tsunami in northeastern Japan. The developed system efficiently generated a map showing areas devastated by the tsunami. Based on the initial results of the study, the developed system proved to be a useful tool for emergency workers to quickly identify areas affected by natural disasters.
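A WMS GetMap request of the kind the WPS module would issue is a plain HTTP query. A sketch follows; the endpoint, layer name, and bounding box are hypothetical, not the GSJ/J-iView services:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request URL. bbox is
    (min_lat, min_lon, max_lat, max_lon), the axis order that
    WMS 1.3.0 mandates for EPSG:4326."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical request for a post-event ASTER layer over northeastern Japan:
url = wms_getmap_url("https://example.org/wms", "aster_post_event",
                     (37.5, 140.5, 39.0, 142.0), 1024, 768)
```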

  6. [An AIDS-related cognitive map in a sample of drug abusers in Buenos Aires City].

    Science.gov (United States)

    Kornblit, A L; Bilyk, A

    1990-01-01

    This paper approaches AIDS as a topic among a sample of drug abusers in the city of Buenos Aires. Research was carried out on the basis of a qualitative methodology. In an attempt at surveying the opinions and attitudes of such a sample as regards AIDS (i.e. the subjects' cognitive map), 21 drug abusers from three different rehabilitation programs operating in the Buenos Aires area were interviewed. On the basis of the research performed, the authors elaborate communication strategies among drug abusers that would be helpful for authorities engaged in AIDS prevention to adopt. A strategy likely to break up the AIDS-drug association existing in the minds of many abusers would be highly advisable, so that the two representations are separated, giving drug abusers a higher motivation for self-care practices.

  7. Switching-based Mapping and Control for Haptic Teleoperation of Aerial Robots

    NARCIS (Netherlands)

    Mersha, A.Y.; Stramigioli, Stefano; Carloni, Raffaella

    2012-01-01

    This paper deals with the bilateral teleoperation of underactuated aerial robots by means of a haptic interface. In particular, we propose a switching-based state mapping and control algorithm between a rate-based passive controller, which addresses the workspace incompatibility between the master

  8. Nominal 30-m Cropland Extent Map of Continental Africa by Integrating Pixel-Based and Object-Based Algorithms Using Sentinel-2 and Landsat-8 Data on Google Earth Engine

    Directory of Open Access Journals (Sweden)

    Jun Xiong

    2017-10-01

    five bands (blue, green, red, near-infrared, NDVI) during each of the two periods (period 1: January–June 2016; period 2: July–December 2015), plus a 30-m slope layer derived from the Shuttle Radar Topographic Mission (SRTM) elevation dataset. Second, we selected Cropland/Non-cropland training samples (sample size = 9791) from various sources in GEE to create pixel-based classifications. As supervised classification algorithm, Random Forest (RF) was used as the primary classifier because of its efficiency, and when over-fitting issues of RF happened due to the noise of input training data, Support Vector Machine (SVM) was applied to compensate for such defects in specific areas. Third, the Recursive Hierarchical Segmentation (RHSeg) algorithm was employed to generate an object-oriented segmentation layer based on spectral and spatial properties from the same input data. This layer was merged with the pixel-based classification to improve segmentation accuracy. Accuracies of the merged 30-m crop extent product were computed using an error matrix approach in which 1754 independent validation samples were used. In addition, a comparison was performed with other available cropland maps as well as with LULC maps to show spatial similarity. Finally, the cropland area results derived from the map were compared with UN FAO statistics. The independent accuracy assessment showed a weighted overall accuracy of 94%, with a producer's accuracy of 85.9% (omission error of 14.1%) and user's accuracy of 68.5% (commission error of 31.5%) for the cropland class. The total net cropland area (TNCA) of Africa was estimated as 313 Mha for the nominal year 2015. The online product, referred to as the Global Food Security-support Analysis Data @ 30-m for the African Continent, Cropland Extent product (GFSAD30AFCE), is distributed through NASA's Land Processes Distributed Active Archive Center (LP DAAC) (available for download by 10 November 2017 or earlier: https://doi.org/10
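The error-matrix accuracy figures quoted above (producer's, user's, and overall accuracy) are standard quantities. A sketch of their computation; the counts below are illustrative, not the actual GFSAD30AFCE validation data:

```python
def accuracy_metrics(confusion):
    """Overall, producer's, and user's accuracy from a square error
    (confusion) matrix: rows = reference class, cols = mapped class."""
    n = len(confusion)
    total = sum(sum(row) for row in confusion)
    correct = sum(confusion[i][i] for i in range(n))
    overall = correct / total
    # Producer's accuracy: correct / reference total (1 - omission error).
    producers = [confusion[i][i] / sum(confusion[i]) for i in range(n)]
    # User's accuracy: correct / mapped total (1 - commission error).
    users = [confusion[i][i] / sum(confusion[r][i] for r in range(n))
             for i in range(n)]
    return overall, producers, users

# Toy 2-class matrix (cropland, non-cropland):
cm = [[86, 14],     # reference cropland: 86 mapped correctly, 14 omitted
      [40, 860]]    # reference non-cropland: 40 committed to cropland
overall, prod, user = accuracy_metrics(cm)
# prod[0] = 0.86 -> omission error 14%; user[0] = 86/126, commission ~32%
```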

  9. Distinguishability notion based on Wootters statistical distance: Application to discrete maps

    Science.gov (United States)

    Gomez, Ignacio S.; Portesi, M.; Lamberti, P. W.

    2017-08-01

    We study the distinguishability notion given by Wootters for states represented by probability density functions. This presents the particularity that it can also be used for defining a statistical distance in chaotic unidimensional maps. Based on that definition, we provide a metric d ¯ for an arbitrary discrete map. Moreover, from d ¯ , we associate a metric space with each invariant density of a given map, which results to be the set of all distinguished points when the number of iterations of the map tends to infinity. Also, we give a characterization of the wandering set of a map in terms of the metric d ¯ , which allows us to identify the dissipative regions in the phase space. We illustrate the results in the case of the logistic and the circle maps numerically and analytically, and we obtain d ¯ and the wandering set for some characteristic values of their parameters. Finally, an extension of the metric space associated for arbitrary probability distributions (not necessarily invariant densities) is given along with some consequences. The statistical properties of distributions given by histograms are characterized in terms of the cardinal of the associated metric space. For two conjugate variables, the uncertainty principle is expressed in terms of the diameters of the associated metric space with those variables.
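Wootters' statistical distance between two discrete probability distributions has a compact closed form, d(p, q) = arccos(Σᵢ √(pᵢ qᵢ)), which can be computed directly; the toy distributions below are illustrative:

```python
import math

def wootters_distance(p, q):
    """Wootters statistical distance between two discrete probability
    distributions: d(p, q) = arccos( sum_i sqrt(p_i * q_i) )."""
    overlap = sum(math.sqrt(a * b) for a, b in zip(p, q))
    # Clamp against floating-point overshoot slightly beyond 1.
    return math.acos(min(1.0, max(-1.0, overlap)))

# Histograms from iterating a chaotic map could be compared this way;
# here, two hand-made toy distributions:
p = [0.5, 0.5, 0.0]
q = [0.25, 0.25, 0.5]
d = wootters_distance(p, q)      # > 0: the distributions are distinguishable
same = wootters_distance(p, p)   # 0.0: indistinguishable from itself
```

Identical distributions are at distance 0; distributions with disjoint support reach the maximum arccos(0) = π/2.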

  10. Identification of p38α MAP kinase inhibitors by pharmacophore based virtual screening

    DEFF Research Database (Denmark)

    Gangwal, Rahul P; Das, Nihar R; Thanki, Kaushik

    2014-01-01

    The p38α mitogen-activated protein (MAP) kinase plays a vital role in treating many inflammatory diseases. In the present study, a combined ligand and structure based pharmacophore model was developed to identify potential DFG-in selective p38 MAP kinase inhibitors. Conformations of co...

  11. Adaptive sampling based on the cumulative distribution function of order statistics to delineate heavy-metal contaminated soils using kriging

    International Nuclear Information System (INIS)

    Juang, K.-W.; Lee, D.-Y.; Teng, Y.-L.

    2005-01-01

    Correctly classifying 'contaminated' areas in soils, based on the threshold for a contaminated site, is important for determining effective clean-up actions. Pollutant mapping by means of kriging is increasingly being used for the delineation of contaminated soils. However, those areas where the kriged pollutant concentrations are close to the threshold have a high possibility for being misclassified. In order to reduce the misclassification due to the over- or under-estimation from kriging, an adaptive sampling using the cumulative distribution function of order statistics (CDFOS) was developed to draw additional samples for delineating contaminated soils, while kriging. A heavy-metal contaminated site in Hsinchu, Taiwan was used to illustrate this approach. The results showed that compared with random sampling, adaptive sampling using CDFOS reduced the kriging estimation errors and misclassification rates, and thus would appear to be a better choice than random sampling, as additional sampling is required for delineating the 'contaminated' areas. - A sampling approach was derived for drawing additional samples while kriging
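The idea of resampling where kriged values sit close to the threshold can be sketched with a Gaussian kriging-error model. The CDFOS criterion in the paper is more elaborate; the threshold, standard deviations, and ambiguity band below are illustrative assumptions:

```python
import math

def prob_exceeds(kriged_mean, kriged_sd, threshold):
    """Probability that the true concentration exceeds the regulatory
    threshold, assuming a Gaussian kriging error distribution."""
    z = (threshold - kriged_mean) / kriged_sd
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def needs_extra_sample(kriged_mean, kriged_sd, threshold, band=(0.2, 0.8)):
    """Flag a location for additional sampling when its classification is
    ambiguous, i.e. the exceedance probability falls inside `band`.
    (A simplified stand-in for the paper's CDFOS-based criterion.)"""
    p = prob_exceeds(kriged_mean, kriged_sd, threshold)
    return band[0] < p < band[1]

# Illustrative threshold of 120 mg/kg: a location kriged at 118 +/- 10 is
# ambiguous and would be resampled; one at 80 +/- 10 is safely 'clean'.
print(needs_extra_sample(118, 10, 120))   # True
print(needs_extra_sample(80, 10, 120))    # False
```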

  12. Intensity Based Seismic Hazard Map of Republic of Macedonia

    Science.gov (United States)

    Dojcinovski, Dragi; Dimiskovska, Biserka; Stojmanovska, Marta

    2016-04-01

    The territory of the Republic of Macedonia and the bordering terrains are among the most seismically active parts of the Balkan Peninsula, belonging to the Mediterranean-Trans-Asian seismic belt. The seismological data on the R. Macedonia from the past 16 centuries point to the occurrence of very strong catastrophic earthquakes. The hypocenters of the earthquakes are located above the Mohorovicic discontinuity, most frequently at a depth of 10-20 km. Accurate short-term prognosis of earthquake occurrence, i.e., simultaneous prognosis of the time, place and intensity of an earthquake, is still not possible. However, the present methods of seismic zoning have advanced to such an extent that, with great probability, they enable efficient protection against earthquake effects. The seismic hazard maps of the Republic of Macedonia are the result of analysis and synthesis of data from seismological, seismotectonic and other corresponding investigations necessary for defining the expected level of seismic hazard for certain time periods. These should be amended, from time to time, with new data and scientific knowledge. The elaboration of this map does not completely solve all issues related to earthquakes, but it provides the basic empirical data necessary for updating the existing regulations for the construction of engineering structures in seismically active areas, which are governed by legal regulations and technical norms whose constituent part is the seismic hazard map. The map has been elaborated based on complex seismological and geophysical investigations of the considered area and a synthesis of the results from these investigations. There were two phases of elaboration of the map. In the first phase, the map of focal zones characterized by maximum magnitudes of possible earthquakes was elaborated. In the second phase, the intensities of expected earthquakes were computed according to the MCS scale. The map is prognostic, i.e., it provides assessment of the

  13. Accuracy assessment of planimetric large-scale map data for decision-making

    Directory of Open Access Journals (Sweden)

    Doskocz Adam

    2016-06-01

    This paper presents decision-making risk estimation based on planimetric large-scale map data, i.e. data sets or databases useful for creating planimetric maps at scales of 1:5,000 or larger. The studies were conducted on four sets of large-scale map data. Errors in the map data were used for a risk assessment of decisions about the localization of objects, e.g. for land-use planning in the realization of investments. The analysis was performed on a large statistical sample of shift vectors of control points, which were identified with the position errors of these points (errors of map data).

  14. Environmental Risk Assessment Based on High-Resolution Spatial Maps of Potentially Toxic Elements Sampled on Stream Sediments of Santiago, Cape Verde

    Directory of Open Access Journals (Sweden)

    Marina M. S. Cabral Pinto

    2014-10-01

    Geochemical mapping is the base knowledge for identifying the regions of the planet with critical contents of potentially toxic elements from either natural or anthropogenic sources. Sediments, soils and waters are the vehicles which link the inorganic environment to life through the supply of essential macro- and micronutrients. The chemical composition of surface geological materials may cause metabolic changes which may favor the occurrence of endemic diseases in humans. In order to better understand the relationships between environmental geochemistry and public health, we present environmental risk maps of some harmful elements (As, Cd, Co, Cr, Cu, Hg, Mn, Ni, Pb, V, and Zn) in the stream sediments of Santiago, Cape Verde, identifying the potentially harmful areas on this island. The Estimated Background Values (EBV) of Cd, Co, Cr, Ni and V were found to be above the Canadian guidelines for any type of use of stream sediments and also above the target values of the Dutch and United States guidelines. The Probable Effect Concentrations (PEC), above which harmful effects are likely in sediment-dwelling organisms, were found for Cr and Ni. Some associations between the geological formations of the island and the composition of stream sediments were identified and confirmed by descriptive statistics and by Principal Component Analysis (PCA). The EBV spatial distribution of the metals and the results of PCA allowed us to establish relationships between the EBV maps and the geological formations. The first two PCA modes indicate that heavy metals in Santiago stream sediments mainly originate from weathering of the underlying bedrocks. The first metal association (Co, V, Cr, and Mn; first PCA mode) consists of elements enriched in basic rocks and compatible elements. The second association of variables (Zn and Cd as opposed to Ni; second PCA mode) appears to be strongly controlled by the composition of alkaline volcanic rocks and pyroclastic rocks. So, the

  15. Comparing orbiter and rover image-based mapping of an ancient sedimentary environment, Aeolis Palus, Gale crater, Mars

    Science.gov (United States)

    Stack, Kathryn M.; Edwards, Christopher; Grotzinger, J. P.; Gupta, S.; Sumner, D.; Edgar, Lauren; Fraeman, A.; Jacob, S.; LeDeit, L.; Lewis, K.W.; Rice, M.S.; Rubin, D.; Calef, F.; Edgett, K.; Williams, R.M.E.; Williford, K.H.

    2016-01-01

    This study provides the first systematic comparison of orbital facies maps with detailed ground-based geology observations from the Mars Science Laboratory (MSL) Curiosity rover to examine the validity of geologic interpretations derived from orbital image data. Orbital facies maps were constructed for the Darwin, Cooperstown, and Kimberley waypoints visited by the Curiosity rover using High Resolution Imaging Science Experiment (HiRISE) images. These maps, which represent the most detailed orbital analysis of these areas to date, were compared with rover image-based geologic maps and stratigraphic columns derived from Curiosity’s Mast Camera (Mastcam) and Mars Hand Lens Imager (MAHLI). Results show that bedrock outcrops can generally be distinguished from unconsolidated surficial deposits in high-resolution orbital images and that orbital facies mapping can be used to recognize geologic contacts between well-exposed bedrock units. However, process-based interpretations derived from orbital image mapping are difficult to infer without known regional context or observable paleogeomorphic indicators, and layer-cake models of stratigraphy derived from orbital maps oversimplify depositional relationships as revealed from a rover perspective. This study also shows that fine-scale orbital image-based mapping of current and future Mars landing sites is essential for optimizing the efficiency and science return of rover surface operations.

  16. Malaria Disease Mapping in Malaysia based on Besag-York-Mollie (BYM) Model

    Science.gov (United States)

    Azah Samat, Nor; Mey, Liew Wan

    2017-09-01

    Disease mapping is the visual representation of the geographical distribution of disease incidence within a population, based on spatial epidemiological data. The resulting maps help in monitoring and planning resource needs at all levels of health care and in designing appropriate interventions, tailored towards areas that deserve closer scrutiny or communities that warrant further investigation to identify important risk factors. The choice of statistical model used for relative-risk estimation is therefore important, because the production of a disease risk map relies on the model used. This paper proposes the Besag-York-Mollie (BYM) model to estimate the relative risk of Malaria in Malaysia. The analysis used the numbers of Malaria cases obtained from the Ministry of Health Malaysia. The outcomes of the analysis are displayed through graphs and maps, including a Malaria disease risk map constructed according to the estimated relative risks. The distribution of high- and low-risk areas of Malaria disease occurrence for all states in Malaysia can be identified in the risk map.
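The relative risks that the BYM model smooths are usually benchmarked against the naive standardized morbidity ratio (SMR), which is simple to compute. Fitting the BYM model itself needs an MCMC or INLA tool, so only the raw estimator is sketched here; the counts are illustrative, not Ministry of Health data:

```python
def standardized_morbidity_ratio(observed, expected):
    """Raw relative-risk estimate per area: SMR_i = O_i / E_i.
    The BYM model shrinks these toward their neighbors by adding
    spatially structured and unstructured random effects."""
    return [o / e for o, e in zip(observed, expected)]

# Illustrative observed and expected Malaria counts for three states:
observed = [30, 12, 5]
expected = [20.0, 15.0, 5.0]
smr = standardized_morbidity_ratio(observed, expected)
# smr = [1.5, 0.8, 1.0]: risk above / below / at the national level
```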

  17. Comparison of prevalence estimation of Mycobacterium avium subsp. paratuberculosis infection by sampling slaughtered cattle with macroscopic lesions vs. systematic sampling.

    Science.gov (United States)

    Elze, J; Liebler-Tenorio, E; Ziller, M; Köhler, H

    2013-07-01

    The objective of this study was to identify the most reliable approach for prevalence estimation of Mycobacterium avium ssp. paratuberculosis (MAP) infection in clinically healthy slaughtered cattle. Sampling of macroscopically suspect tissue was compared to systematic sampling. Specimens of ileum, jejunum, mesenteric and caecal lymph nodes were examined for MAP infection using bacterial microscopy, culture, histopathology and immunohistochemistry. MAP was found most frequently in caecal lymph nodes, but sampling more tissues optimized the detection rate. Examination by culture was most efficient while combination with histopathology increased the detection rate slightly. MAP was detected in 49/50 animals with macroscopic lesions representing 1.35% of the slaughtered cattle examined. Of 150 systematically sampled macroscopically non-suspect cows, 28.7% were infected with MAP. This indicates that the majority of MAP-positive cattle are slaughtered without evidence of macroscopic lesions and before clinical signs occur. For reliable prevalence estimation of MAP infection in slaughtered cattle, systematic random sampling is essential.
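The observation that sampling more tissues optimizes the detection rate follows from combining per-tissue detection probabilities. A sketch assuming independent detection (a simplification) with hypothetical per-tissue sensitivities:

```python
def combined_detection_prob(sensitivities):
    """Probability that at least one of several examined tissues yields
    a positive result, assuming independent detection with the given
    per-tissue sensitivities (hypothetical illustrative values)."""
    miss = 1.0
    for s in sensitivities:
        miss *= (1.0 - s)
    return 1.0 - miss

# Caecal lymph node alone vs. four tissues (illustrative sensitivities):
single = combined_detection_prob([0.60])
four = combined_detection_prob([0.60, 0.30, 0.25, 0.20])
# four > single: each extra tissue raises the chance of detecting MAP
```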

  18. Soil sampling and analytical strategies for mapping fallout in nuclear emergencies based on the Fukushima Dai-ichi Nuclear Power Plant accident

    International Nuclear Information System (INIS)

    Onda, Yuichi; Kato, Hiroaki; Hoshi, Masaharu; Takahashi, Yoshio; Nguyen, Minh-Long

    2015-01-01

    The Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident resulted in extensive radioactive contamination of the environment via deposited radionuclides such as radiocesium and 131I. Evaluating the extent and level of environmental contamination is critical to protecting citizens in affected areas and to planning decontamination efforts. However, a standardized soil sampling protocol is needed in such emergencies to facilitate the collection of large, tractable samples for measuring gamma-emitting radionuclides. In this study, we developed an emergency soil sampling protocol based on preliminary sampling from the FDNPP accident-affected area. We also present the results of a preliminary experiment aimed at evaluating the influence of various procedures (e.g., mixing, number of samples) on measured radioactivity. Results show that sample mixing strongly affects measured radioactivity in soil samples. Furthermore, for homogenization, shaking the plastic sample container at least 150 times or disaggregating soil by hand-rolling in a disposable plastic bag is required. Finally, we determined that five soil samples within a 3 m × 3 m area are the minimum number required for reducing measurement uncertainty in the emergency soil sampling protocol proposed here. - Highlights: • Emergency soil sampling protocol was proposed for nuclear hazards. • Various sampling procedures were tested and evaluated in the Fukushima area. • Soil sample mixing procedure was of key importance for measured radioactivity. • Minimum number of samples was determined for reducing measurement uncertainty

  19. iMAR: An Interactive Web-Based Application for Mapping Herbicide Resistant Weeds.

    Directory of Open Access Journals (Sweden)

    Silvia Panozzo

    Herbicides are the major weed control tool in most cropping systems worldwide. However, the high reliance on herbicides has led to environmental issues as well as to the evolution of herbicide-resistant biotypes. Resistance is a major concern in modern agriculture, and early detection of resistant biotypes is therefore crucial for its management and prevention. In this context, a timely update of the distribution of resistant biotypes is fundamental to devise and implement efficient resistance management strategies. Here we present an innovative web-based application called iMAR (interactive MApping of Resistance) for the mapping of herbicide-resistant biotypes. It is based on open source software tools and translates into maps the data reported in the GIRE (Italian herbicide resistance working group) database of herbicide resistance at the national level. iMAR allows an automatic, easy and cost-effective updating of the maps and provides two different systems, "static" and "dynamic". In the first one, the user's choices are guided by a hierarchical tree menu, whereas the latter is more flexible and includes multiple choice criteria (type of resistance, weed species, region, cropping system) that permit customized maps to be created. The generated information can be useful to the various stakeholders involved in weed resistance management: farmers, advisors, national and local decision makers, as well as the agrochemical industry. iMAR is freely available, and the system has the potential to handle large datasets and to be used for other purposes with geographical implications, such as the mapping of invasive plants or pests.

  20. A novel image encryption algorithm based on chaos maps with Markov properties

    Science.gov (United States)

    Liu, Quan; Li, Pei-yue; Zhang, Ming-chao; Sui, Yong-xin; Yang, Huai-jiang

    2015-02-01

    In order to construct a high-complexity, secure and low-cost image encryption algorithm, a class of chaos with Markov properties was researched and such an algorithm was proposed. This kind of chaos has higher complexity than the Logistic map and the Tent map, while keeping uniformity and low autocorrelation. An improved coupled map lattice based on the chaos with Markov properties is also employed to cover the phase space of the chaos and enlarge the key space, and it has better performance than the original one. A novel image encryption algorithm is constructed on the new coupled map lattice, which is used as a key stream generator. A true random number is used to disturb the key, which can dynamically change the permutation matrix and the key stream. From the experiments, it is known that the key stream can pass the SP800-22 test. The novel image encryption algorithm can resist chosen-plaintext (CPA), chosen-ciphertext (CCA) and differential attacks. The algorithm is sensitive to the initial key and can change the distribution of the pixel values of the image. The correlation of adjacent pixels can also be eliminated. When compared with the algorithm based on the Logistic map, it has higher complexity and better uniformity, and is nearer to a true random sequence. It is also efficient to implement, which shows its value for common use.
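A chaotic key-stream cipher of this general kind can be sketched with the classical logistic map; the paper argues its Markov-property chaos has better statistics, so the map, parameters, and key below are purely illustrative:

```python
def logistic_keystream(x0, r=3.99, n=16, burn_in=100):
    """Byte keystream from the logistic map x -> r*x*(1-x),
    seeded by the key x0 in (0, 1)."""
    x = x0
    for _ in range(burn_in):           # discard the transient
        x = r * x * (1.0 - x)
    stream = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) & 0xFF)
    return bytes(stream)

def xor_cipher(data, key_x0):
    """Encrypt/decrypt by XOR with the chaotic keystream (symmetric)."""
    ks = logistic_keystream(key_x0, n=len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

plain = b"pixel data here!"
cipher = xor_cipher(plain, 0.3141592653)
assert xor_cipher(cipher, 0.3141592653) == plain   # XOR round-trips
```

Sensitivity to the initial key comes from the map's exponential divergence: a key differing in the last decimal produces an unrelated keystream after the burn-in.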

  1. A new integrated statistical approach to the diagnostic use of two-dimensional maps.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Gianotti, Valentina; Righetti, Pier Giorgio; Cecconi, Daniela; Domenici, Enrico

    2003-01-01

    Two-dimensional (2-D) electrophoresis is a very useful technique for the analysis of proteins in biological tissues. The complexity of the 2-D maps obtained causes many difficulties in the comparison of different samples. A new method is proposed for comparing different 2-D maps, based on five steps: (i) the digitalisation of the image; (ii) the transformation of the digitalised map in a fuzzy entity, in order to consider the variability of the 2-D electrophoretic separation; (iii) the calculation of a similarity index for each pair of maps; (iv) the analysis by multidimensional scaling of the previously obtained similarity matrix; (v) the analysis by classification or cluster analysis techniques of the resulting map co-ordinates. The method adopted was first tested on some simulated samples in order to evaluate its sensitivity to small changes in the spots position and size. The optimal setting of the method parameters was also investigated. Finally, the method was successfully applied to a series of real samples corresponding to the electrophoretic bidimensional analysis of sera from normal and nicotine-treated rats. Multidimensional scaling allowed the separation of the two classes of samples without any misclassification.
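Steps (ii)-(iii), turning each map into a fuzzy entity and scoring pairwise similarity, can be sketched by treating every spot as a Gaussian blob; this is a simplified stand-in for the paper's exact index, with hypothetical spot coordinates and volumes:

```python
import math

def _overlap(map_a, map_b, sigma):
    """Gaussian-kernel overlap between two spot lists of (x, y, volume)."""
    total = 0.0
    for xa, ya, va in map_a:
        for xb, yb, vb in map_b:
            d2 = (xa - xb) ** 2 + (ya - yb) ** 2
            total += va * vb * math.exp(-d2 / (2 * sigma ** 2))
    return total

def spot_similarity(map_a, map_b, sigma=1.0):
    """Similarity index in [0, 1]: each spot becomes a Gaussian blob of
    width sigma (the 'fuzzy entity' of step ii); identical maps score 1."""
    return _overlap(map_a, map_b, sigma) / math.sqrt(
        _overlap(map_a, map_a, sigma) * _overlap(map_b, map_b, sigma))

# Two maps differing only by a 0.5-unit shift of one spot:
m1 = [(10.0, 20.0, 1.0), (30.0, 5.0, 0.5)]
m2 = [(10.5, 20.0, 1.0), (30.0, 5.0, 0.5)]
s = spot_similarity(m1, m2)   # close to, but below, 1.0
```

The resulting pairwise similarity matrix is what multidimensional scaling (step iv) would then embed into low-dimensional coordinates.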

  2. Geodesy- and geology-based slip-rate models for the Western United States (excluding California) national seismic hazard maps

    Science.gov (United States)

    Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne

    2014-01-01

    The 2014 National Seismic Hazard Maps for the conterminous United States incorporate more uncertainty in the fault slip-rate parameters that control earthquake-activity rates than was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. The models considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight, shared equally, to the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. The resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.
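Logic-tree weighting of alternative slip-rate models amounts to a weighted average of the hazard values each model implies. A sketch; the weights below echo the 20 percent shared by the two fault-based models but are illustrative, not the official NSHM weights, and the PGA values are invented:

```python
def weighted_hazard(model_values, weights):
    """Logic-tree combination of hazard estimates (here, peak ground
    acceleration in g) from alternative slip-rate models."""
    assert abs(sum(weights) - 1.0) < 1e-9   # weights must sum to 1
    return sum(v * w for v, w in zip(model_values, weights))

# PGA estimates from a geologic model and two fault-based inversions:
pga = weighted_hazard([0.42, 0.47, 0.45], [0.80, 0.10, 0.10])
# -> 0.428 g, within 0.05 g of the geologic-only value
```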

  3. How relevant is heterogeneous chemistry on Mars? Strong tests via global mapping of water and ozone (sampled via O2 dayglow)

    Science.gov (United States)

    Villanueva, Geronimo Luis; Mumma, Michael J.; Novak, Robert E.

    2015-11-01

    Ozone and water are powerful tracers of photochemical processes on Mars. Considering that water is condensable, with a multifaceted hydrological cycle, and that ozone is continuously being produced and destroyed on short time scales, maps of the two species can test the validity of current 3D photochemical and dynamical models. Comparisons of modern GCM models (e.g., Lefèvre et al. 2004) with certain datasets (e.g., Clancy et al. 2012; Bertaux et al. 2012) point to significant disagreements, which in some cases have been related to heterogeneous (gas-dust) chemistry beyond the classical gas-gas homogeneous reactions. We address these concerns by acquiring full 2D maps of water and ozone (via O2 dayglow) on Mars, employing high spectral resolution infrared spectrometers at ground-based telescopes (CRIRES/VLT and CSHELL/NASA-IRTF). By performing a rotational analysis on the O2 lines, we derive molecular temperature maps that we use to derive the vertical level of the emission (e.g., Novak et al. 2002). Our maps sample the full observable disk of Mars on March 25, 2008 (Ls=50°, northern winter) and on January 29, 2014 (Ls=83°, northern spring). The maps reveal a strong dependence of the O2 emission and water burden on local orography, while the temperature maps are in strong disagreement with current models. Could this be the signature of heterogeneous chemistry? We will present the global maps and discuss possible scenarios to explain the observations. This work was partially funded by grants from NASA's Planetary Astronomy Program (344-32-51-96), NASA's Mars Fundamental Research Program (203959.02.02.20.29), NASA's Astrobiology Program (344-53-51), and the NSF-RUI Program (AST-805540). We thank the administration and staff of the European Southern Observatory/VLT and NASA-IRTF for awarding observing time and coordinating our observations. Bertaux, J.-L., Gondet, B., Lefèvre, F., et al. 2012. J. Geophys. Res. Pl. 117. pp. 1-9. Clancy, R.T., Sandor, B.J., Wolff, M.J., et al. 2012. J. Geophys. Res

  4. Sample sizes and model comparison metrics for species distribution models

    Science.gov (United States)

    B.B. Hanberry; H.S. He; D.C. Dey

    2012-01-01

    Species distribution models use small samples to produce continuous distribution maps. The question of how small a sample can be while still producing an accurate model has generally been answered by comparison with maximum sample sizes of 200 observations or fewer. In addition, model comparisons often are made with the kappa statistic, which has become controversial....
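
    Since the abstract is truncated, only the kappa statistic it mentions is illustrated here; the sketch below is a generic Cohen's kappa for binary presence/absence predictions (the example vectors are invented), not the authors' code:

```python
def cohens_kappa(pred, actual):
    """Cohen's kappa: observed agreement corrected for chance agreement,
    here for binary presence (1) / absence (0) predictions."""
    n = len(pred)
    agree = sum(p == a for p, a in zip(pred, actual)) / n
    # chance agreement from the marginal class frequencies
    p_yes = (sum(pred) / n) * (sum(actual) / n)
    p_no = (1 - sum(pred) / n) * (1 - sum(actual) / n)
    p_chance = p_yes + p_no
    return (agree - p_chance) / (1 - p_chance)

pred   = [1, 1, 1, 0, 0, 0, 1, 0]   # model predictions at 8 sites
actual = [1, 1, 0, 0, 0, 0, 1, 1]   # observed presences
print(cohens_kappa(pred, actual))   # 0.5
```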

  5. Effective sampling range of a synthetic protein-based attractant for Ceratitis capitata (Diptera: Tephritidae).

    Science.gov (United States)

    Epsky, Nancy D; Espinoza, Hernán R; Kendra, Paul E; Abernathy, Robert; Midgarden, David; Heath, Robert R

    2010-10-01

    Studies were conducted in Honduras to determine the effective sampling range of a female-targeted protein-based synthetic attractant for the Mediterranean fruit fly, Ceratitis capitata (Wiedemann) (Diptera: Tephritidae). Multilure traps were baited with ammonium acetate, putrescine, and trimethylamine lures (three-component attractant) and sampled over eight consecutive weeks. The field design consisted of 38 traps (over 0.5 ha) placed in a combination of standard and high-density grids to facilitate geostatistical analysis, and tests were conducted in coffee (Coffea arabica L.), mango (Mangifera indica L.), and orthanique (Citrus sinensis X Citrus reticulata). The effective sampling range, as determined from the range parameter obtained from experimental variograms that fit a spherical model, was approximately 30 m for flies captured in tests in coffee or mango and approximately 40 m for flies captured in orthanique. For comparison, a release-recapture study was conducted in mango using wild (field-collected) mixed-sex C. capitata and an array of 20 baited traps spaced 10-50 m from the release point. Contour analysis was used to document the spatial distribution of fly recaptures and to estimate the effective sampling range, defined by the area that encompassed 90% of the recaptures. With this approach, the effective range of the three-component attractant was estimated to be approximately 28 m, similar to results obtained from variogram analysis. Contour maps indicated that wind direction had a strong influence on sampling range, which was approximately 15 m greater upwind compared with downwind from the release point. Geostatistical analysis of field-captured insects in appropriately designed trapping grids may provide a supplement or alternative to release-recapture studies for estimating sampling ranges of semiochemical-based trapping systems.
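
    The range parameter of a spherical variogram model, which the study reads off as the effective sampling range, can be sketched as follows (a hedged illustration: the fixed nugget/sill, the grid-search fit, and the synthetic variogram points are simplifying assumptions):

```python
def spherical(h, nugget, sill, a):
    """Spherical variogram model: rises to nugget + sill at the range a."""
    if h >= a:
        return nugget + sill
    r = h / a
    return nugget + sill * (1.5 * r - 0.5 * r ** 3)

def fit_range(lags, gammas, nugget=0.0, sill=1.0):
    """Grid-search the range parameter `a` minimising squared error
    against an empirical variogram (lag distances and semivariances)."""
    best_a, best_err = None, float("inf")
    for a10 in range(10, 1000):          # candidate ranges 1.0 .. 99.9 m
        a = a10 / 10.0
        err = sum((spherical(h, nugget, sill, a) - g) ** 2
                  for h, g in zip(lags, gammas))
        if err < best_err:
            best_a, best_err = a, err
    return best_a

# synthetic empirical variogram generated from a 30 m range model
lags = [5, 10, 15, 20, 25, 30, 35, 40]
gammas = [spherical(h, 0.0, 1.0, 30.0) for h in lags]
print(fit_range(lags, gammas))  # recovers the 30 m effective range
```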

  6. A GIS-based extended fuzzy multi-criteria evaluation for landslide susceptibility mapping.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Shadman Roodposhti, Majid; Jankowski, Piotr; Blaschke, Thomas

    2014-12-01

    Landslide susceptibility mapping (LSM) is making increasing use of GIS-based spatial analysis in combination with multi-criteria evaluation (MCE) methods. We have developed a new multi-criteria decision analysis (MCDA) method for LSM and applied it to the Izeh River basin in south-western Iran. Our method is based on fuzzy membership functions (FMFs) derived from GIS analysis, and makes use of nine causal landslide factors identified by local landslide experts. Fuzzy set theory was first integrated with an analytical hierarchy process (AHP) in order to use pairwise comparisons to rank the LSM criteria. FMFs were then applied to determine the criteria weights used in developing the landslide susceptibility map. Finally, a landslide inventory database was used to validate the LSM map by comparing it with known landslides within the study area. Results indicated that the integration of fuzzy set theory with AHP produced significantly improved accuracy and a high level of reliability in the resulting landslide susceptibility map. Approximately 53% of known landslides within our study area fell within zones classified as having "very high susceptibility", with a further 31% falling into zones classified as having "high susceptibility".
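
    The AHP step, in which pairwise comparisons are turned into criteria weights, can be sketched with the common row-geometric-mean approximation (an assumption: the authors do not state which prioritization scheme they used, and the 3-factor comparison matrix below is hypothetical):

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the row geometric mean of a
    reciprocal pairwise-comparison matrix (Saaty 1-9 scale entries)."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# hypothetical comparison of three factors, e.g. slope vs lithology vs land use
m = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
w = ahp_weights(m)
print([round(x, 3) for x in w])  # weights sum to 1, ordered by importance
```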

  7. The MAPS based PXL vertex detector for the STAR experiment

    Science.gov (United States)

    Contin, G.; Anderssen, E.; Greiner, L.; Schambach, J.; Silber, J.; Stezelberger, T.; Sun, X.; Szelezniak, M.; Vu, C.; Wieman, H.; Woodmansee, S.

    2015-03-01

    The Heavy Flavor Tracker (HFT) was installed in the STAR experiment for the 2014 heavy ion run of RHIC. Designed to improve the vertex resolution and extend the measurement capabilities in the heavy flavor domain, the HFT is composed of three different silicon detectors based on CMOS monolithic active pixels (MAPS), pads and strips respectively, arranged in four concentric cylinders close to the STAR interaction point. The two innermost HFT layers are placed at radii of 2.7 and 8 cm from the beam line, respectively, and accommodate 400 ultra-thin (50 μm) high resolution MAPS sensors arranged in 10-sensor ladders to cover a total silicon area of 0.16 m2. Each sensor includes a pixel array of 928 rows and 960 columns with a 20.7 μm pixel pitch, providing a sensitive area of ~3.8 cm2. The architecture is based on a column-parallel readout with amplification and correlated double sampling inside each pixel. Each column is terminated with a high precision discriminator, is read out in a rolling shutter mode, and the output is processed through integrated zero suppression logic. The results are stored in two SRAMs with a ping-pong arrangement for continuous readout. The sensor features a 185.6 μs readout time and 170 mW/cm2 power dissipation. The detector is air-cooled, allowing a global material budget as low as 0.39% on the inner layer. A novel mechanical approach to detector insertion enables effective installation and integration of the pixel layers within an 8 hour shift during the on-going STAR run. In addition to a detailed description of the detector characteristics, the experience of the first months of data taking will be presented in this paper, with a particular focus on sensor threshold calibration, latch-up protection procedures and general system operations aimed at stabilizing the running conditions. Issues faced during the 2014 run will be discussed together with the implemented solutions. A preliminary analysis of the detector performance

  8. Mapping the Indonesian territory, based on pollution, social demography and geographical data, using self organizing feature map

    Science.gov (United States)

    Hernawati, Kuswari; Insani, Nur; Bambang S. H., M.; Nur Hadi, W.; Sahid

    2017-08-01

    This research aims to map the 33 provinces of Indonesia into a clustered model, based on data on air, water and soil pollution as well as social demography and geography. The method used in this study was an unsupervised approach built on the basic concept of the Kohonen or Self-Organizing Feature Map (SOFM). Design parameters for the model were drawn from data related directly or indirectly to pollution: demographic and social data, pollution levels of air, water and soil, and the geographical situation of each province. The parameters consist of 19 features/characteristics, including the human development index, the number of vehicles, the availability of plants for water absorption and flood prevention, and the geographic and demographic situation. The data used were secondary data from the Central Statistics Agency (BPS), Indonesia. The data are mapped by the SOFM from a high-dimensional vector space onto a two-dimensional vector space according to closeness of location in terms of Euclidean distance, and the resulting outputs are represented as clustered groupings. The thirty-three provinces are grouped into five clusters, where each cluster has different features/characteristics and levels of pollution. The result can be used to support efforts to prevent and resolve pollution problems in each cluster in an effective and efficient way.
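
    A minimal pure-Python Kohonen SOFM might look like the sketch below; the synthetic "province" feature vectors, the 3x3 grid and all training parameters are illustrative assumptions (the actual study used 19 features for 33 provinces):

```python
import math
import random

def train_som(data, grid_w=3, grid_h=3, epochs=200, lr0=0.5, sigma0=1.5):
    """Train a small Kohonen SOFM: find the best-matching unit (BMU) for
    each sample, then pull the BMU and its grid neighbours toward it."""
    random.seed(0)
    dim = len(data[0])
    nodes = {(i, j): [random.random() for _ in range(dim)]
             for i in range(grid_w) for j in range(grid_h)}
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (1 - frac)                 # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5     # shrinking neighbourhood
        for x in data:
            bmu = min(nodes, key=lambda k: sum((a - b) ** 2
                                               for a, b in zip(nodes[k], x)))
            for k, w in nodes.items():
                d2 = (k[0] - bmu[0]) ** 2 + (k[1] - bmu[1]) ** 2
                h = math.exp(-d2 / (2 * sigma ** 2))
                for i in range(dim):
                    w[i] += lr * h * (x[i] - w[i])
    return nodes

def bmu_of(nodes, x):
    return min(nodes, key=lambda k: sum((a - b) ** 2
                                        for a, b in zip(nodes[k], x)))

# two artificial "province" groups: low-pollution vs high-pollution profiles
low  = [[0.1, 0.2, 0.1], [0.15, 0.1, 0.2], [0.2, 0.15, 0.1]]
high = [[0.9, 0.8, 0.9], [0.85, 0.9, 0.8], [0.8, 0.85, 0.9]]
nodes = train_som(low + high)
print(bmu_of(nodes, low[0]), bmu_of(nodes, high[0]))  # distinct grid cells
```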

  9. Fuzzy rule-based landslide susceptibility mapping in Yığılca Forest District (Northwest of Turkey)

    Directory of Open Access Journals (Sweden)

    Abdurrahim Aydın

    2016-07-01

    Full Text Available A landslide susceptibility map of Yığılca Forest District was produced from fuzzy rules developed in the GIS-based FuzzyCell software. An inventory of 315 landslides was updated through fieldwork, following the inventory map previously generated by the authors. For comparison with an earlier landslide susceptibility mapping study in the same area, the same 8 landslide conditioning parameters were selected and then fuzzified: land use, lithology, elevation, slope, aspect, distance to streams, distance to roads, and plan curvature. The Mamdani model was selected as the fuzzy inference system, and after the fuzzy rules were defined, Center of Area (COA) was selected as the defuzzification method. The output of the developed model was normalized between 0 and 1, and then divided into five classes: very low, low, moderate, high, and very high. According to the developed model based on the 8 conditioning parameters, landslide susceptibility in Yığılca Forest District varies between 32 and 67 (in the range 0-100) with an Area Under the Curve (AUC) value of 0.703. According to the classified landslide susceptibility map, 32.89% of the total area of Yığılca Forest District has high or very high susceptibility, while 29.59% has low or very low susceptibility and the rest is of moderate susceptibility. The developed fuzzy rule-based model was compared with a landslide map previously generated with logistic regression (LR). The comparison shows considerable differences in AUC value and in the distribution of susceptibility classes, because a fuzzy rule-based model depends entirely on how the parameters are classified and fuzzified, and on how well the expert composed the rules. Even so, GIS-based fuzzy applications provide very valuable facilities for reasoning, making it possible to take inaccuracies and uncertainties into account.

  10. Parallel image encryption algorithm based on discretized chaotic map

    International Nuclear Information System (INIS)

    Zhou Qing; Wong Kwokwo; Liao Xiaofeng; Xiang Tao; Hu Yue

    2008-01-01

    Recently, a variety of chaos-based algorithms have been proposed for image encryption. Nevertheless, none of them works efficiently in a parallel computing environment. In this paper, we propose a framework for parallel image encryption. Based on this framework, a new algorithm is designed using the discretized Kolmogorov flow map. It fulfills all the requirements for a parallel image encryption algorithm and is, moreover, secure and fast. These properties make it a good choice for image encryption on parallel computing platforms
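
    The paper's algorithm is built on the discretized Kolmogorov flow map; as a stand-in illustration of the same idea (a bijective, chaotic, trivially invertible pixel shuffle), the sketch below uses the better-known discretized Arnold cat map instead:

```python
def cat_map_permute(image, rounds=3):
    """Permute an N x N image with the discretized Arnold cat map:
    (x, y) -> (x + y, x + 2y) mod N, applied for several rounds."""
    n = len(image)
    for _ in range(rounds):
        out = [[0] * n for _ in range(n)]
        for x in range(n):
            for y in range(n):
                out[(x + y) % n][(x + 2 * y) % n] = image[x][y]
        image = out
    return image

def cat_map_inverse(image, rounds=3):
    """Undo the permutation using the inverse matrix [[2, -1], [-1, 1]]."""
    n = len(image)
    for _ in range(rounds):
        out = [[0] * n for _ in range(n)]
        for x in range(n):
            for y in range(n):
                out[(2 * x - y) % n][(y - x) % n] = image[x][y]
        image = out
    return image

img = [[10 * r + c for c in range(4)] for r in range(4)]
scrambled = cat_map_permute(img)
print(cat_map_inverse(scrambled) == img)  # True: the shuffle is invertible
```

    Because each round of the shuffle is independent per pixel, such a permutation stage parallelises naturally, which is the property the paper's framework exploits.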

  11. Laser-Based Slam with Efficient Occupancy Likelihood Map Learning for Dynamic Indoor Scenes

    Science.gov (United States)

    Li, Li; Yao, Jian; Xie, Renping; Tu, Jinge; Feng, Chen

    2016-06-01

    Location-Based Services (LBS) have attracted growing attention in recent years, especially in indoor environments. The fundamental technique behind LBS is map building for unknown environments, also known in the robotics community as simultaneous localization and mapping (SLAM). In this paper, we propose a novel approach for SLAM in dynamic indoor scenes based on a 2D laser scanner mounted on a mobile Unmanned Ground Vehicle (UGV), with the help of a grid-based occupancy likelihood map. Instead of applying scan matching to two adjacent scans, we match the current scan against the occupancy likelihood map learned from all previous scans at multiple scales, to avoid the accumulation of matching errors. Because the points in a scan are acquired sequentially rather than simultaneously, each scan is unavoidably distorted to some extent. To compensate for the scan distortion caused by the motion of the UGV, we integrate the velocity of the laser range finder (LRF) into the scan matching optimization framework. Besides, to reduce as much as possible the effect of dynamic objects such as the walking pedestrians often present in indoor scenes, we propose a new occupancy likelihood map learning strategy that increases or decreases the probability of each occupancy grid cell after each scan matching. Experimental results in several challenging indoor scenes demonstrate that our proposed approach is capable of providing high-precision SLAM results.
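
    The occupancy likelihood map learning strategy (raising or lowering each cell's probability after scan matching) can be sketched in log-odds form; the increment values, clamping and cell bookkeeping below are illustrative assumptions, not the paper's parameters:

```python
import math

def update_map(logodds, scan_cells, hit_inc=0.9, miss_dec=0.4, clamp=5.0):
    """Adjust per-cell log-odds after a matched scan: cells observed
    occupied move toward 'static obstacle'; cells seen free decay, so
    transient objects (e.g. pedestrians) fade from the likelihood map."""
    for cell, hit in scan_cells:
        delta = hit_inc if hit else -miss_dec
        logodds[cell] = max(-clamp, min(clamp, logodds.get(cell, 0.0) + delta))
    return logodds

def probability(logodds, cell):
    """Convert a cell's log-odds back to occupancy probability."""
    return 1.0 / (1.0 + math.exp(-logodds.get(cell, 0.0)))

m = {}
for _ in range(5):                       # a wall seen in 5 consecutive scans
    update_map(m, [((3, 7), True)])
update_map(m, [((9, 2), True)])          # a pedestrian: one hit...
for _ in range(5):
    update_map(m, [((9, 2), False)])     # ...then repeatedly seen free
print(probability(m, (3, 7)), probability(m, (9, 2)))  # wall high, pedestrian low
```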

  12. Agroclimatic mapping of maize crop based on soil physical properties

    International Nuclear Information System (INIS)

    Dourado Neto, Durval; Sparovek, G.; Reichardt, K.; Timm, Luiz Carlos; Nielsen, D.R.

    2004-01-01

    With the purpose of estimating water deficit to forecast yield from potential productivity, the water balance is a useful tool for recommending maize cultivation and defining the sowing date. The computation can be done for each region with the objective of mapping maize grain yield based on agro-climatic data and soil physical properties. From agro-climatic data, air temperature and solar radiation, a model was built to estimate corn grain productivity (the conversion of energy into dry mass production). The carbon dioxide (CO2) fixation by plants is related to gross carbohydrate (CH2O) production and solar radiation, and CO2 assimilation by C4 plants depends on the photosynthetically active radiation and temperature. From agro-climatic data and soil physical properties, maps identifying regions can be built for solar radiation, air temperature, rainfall, maize grain productivity and yield, potential and real evapotranspiration, and water deficit. Such maps allow the agro-climatic and soil physical restrictions to be identified. This procedure can be used at different spatial (farm to State) and temporal (daily to monthly data) scales. Statistical analysis allows estimated and observed values to be compared in different situations, to validate the model and to verify which scale is more appropriate

  13. Object detection system based on multimodel saliency maps

    Science.gov (United States)

    Guo, Ya'nan; Luo, Chongfan; Ma, Yide

    2017-03-01

    Detection of visually salient image regions is extensively applied in computer vision and computer graphics, for tasks such as object detection, adaptive compression, and object recognition, but any single model has its limitations on various images. In this work, we establish an object detection method based on multimodel saliency maps, which intelligently absorbs the merits of various individual saliency detection models to achieve promising results. The method can be roughly divided into three steps: first, we propose a decision-making system that evaluates saliency maps obtained by seven competitive methods and selects only the three most valuable ones; second, we introduce a heterogeneous PCNN algorithm to obtain three prime foregrounds, after which a self-designed nonlinear fusion method merges these saliency maps; last, an adaptive improved and simplified PCNN (SPCNN) model is used to detect the object. Our proposed method constitutes an object detection system for different occasions that requires no training and is simple and highly efficient. The proposed saliency fusion technique shows better performance over a broad range of images and broadens the applicability range by fusing different individual saliency models, making the proposed system worthy of being called a strong model. Moreover, the proposed adaptive improved SPCNN model stems from Eckhorn's neuron model, which is skilled in image segmentation because of its biological background, and all of its parameters adapt to image information. We extensively appraise our algorithm on a classical salient object detection database, and the experimental results demonstrate that the aggregation of saliency maps outperforms the best single saliency model in all cases, yielding the highest precision of 89.90%, better recall rate of 98.20%, greatest F-measure of 91.20%, and lowest mean absolute error of 0.057, the value of proposed saliency evaluation

  14. Extended substitution-diffusion based image cipher using chaotic standard map

    Science.gov (United States)

    Kumar, Anil; Ghose, M. K.

    2011-01-01

    This paper proposes an extended substitution-diffusion based image cipher using the chaotic standard map [1] and a linear feedback shift register, overcoming the weakness of the previous technique by adding nonlinearity. The first stage consists of row and column rotation and permutation, controlled by pseudo-random sequences generated by the standard chaotic map and the linear feedback shift register. In the second stage, further diffusion and confusion are obtained in the horizontal and vertical pixels by mixing the properties of horizontally and vertically adjacent pixels, respectively, with the help of the chaotic standard map. The number of rounds in both stages is controlled by a combination of the pseudo-random sequence and the original image. The performance is evaluated through various analyses, including entropy analysis, difference analysis, statistical analysis, key sensitivity analysis, key space analysis and speed analysis. The experimental results illustrate that the scheme is highly secure and fast.
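
    A heavily simplified sketch of the two key-stream ingredients, a chaotic (Chirikov) standard map and a 16-bit LFSR, is shown below. The real cipher performs staged permutation and diffusion rather than a plain XOR stream, so this only illustrates the pseudo-random generators; all constants are invented:

```python
import math

def standard_map_stream(theta, p, k, n):
    """Byte keystream from the chaotic (Chirikov) standard map:
    p' = p + k sin(theta), theta' = theta + p' (mod 2*pi)."""
    out = []
    for _ in range(n):
        p = (p + k * math.sin(theta)) % (2 * math.pi)
        theta = (theta + p) % (2 * math.pi)
        out.append(int(theta / (2 * math.pi) * 256) % 256)
    return out

def lfsr_stream(state, n, taps=(16, 14, 13, 11)):
    """Byte keystream from a 16-bit Fibonacci LFSR (the added source of
    nonlinearity when mixed with the chaotic stream)."""
    out = []
    for _ in range(n):
        byte = 0
        for _ in range(8):
            bit = 0
            for t in taps:
                bit ^= (state >> (t - 1)) & 1
            state = ((state << 1) | bit) & 0xFFFF
            byte = (byte << 1) | bit
        out.append(byte)
    return out

def encrypt(data, key=(0.3, 0.7, 18.9, 0xACE1)):
    """XOR the plaintext with both keystreams; XOR makes it involutive."""
    theta, p, k, seed = key
    ks1 = standard_map_stream(theta, p, k, len(data))
    ks2 = lfsr_stream(seed, len(data))
    return bytes(b ^ a ^ c for b, a, c in zip(data, ks1, ks2))

msg = b"pixel row example"
ct = encrypt(msg)
print(encrypt(ct) == msg, ct != msg)  # decrypts correctly, and is not plaintext
```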

  15. RF-Based Location Using Interpolation Functions to Reduce Fingerprint Mapping

    Science.gov (United States)

    Ezpeleta, Santiago; Claver, José M.; Pérez-Solano, Juan J.; Martí, José V.

    2015-01-01

    Indoor RF-based localization using fingerprint mapping requires an initial training step, which is a time-consuming process. This localization methodology needs a database of RSSI (Received Signal Strength Indicator) measures from the communication transceivers, taken at specific locations within the localization area. But the real-world localization environment is dynamic, and it is necessary to rebuild the fingerprint database when environmental changes occur. This paper explores the use of different interpolation functions to complete the fingerprint map needed to achieve the sought accuracy, thereby reducing the effort of the training step. Different distributions of test maps and reference points have also been evaluated, showing the validity of this proposal and the necessary trade-offs. Results reported show that the same or similar localization accuracy can be achieved even when only 50% of the initial fingerprint reference points are taken. PMID:26516862
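
    One of the simplest interpolation functions that could fill in missing fingerprint reference points is inverse-distance weighting; the sketch below (survey coordinates and RSSI values are made up, and the paper may use different interpolants) estimates the RSSI at an unsurveyed location:

```python
def idw_rssi(known, x, y, power=2.0):
    """Inverse-distance-weighted estimate of RSSI at an unsurveyed point.

    `known` maps surveyed (x, y) reference points to measured RSSI (dBm);
    interpolation fills the gaps so fewer points need hand measurement."""
    num = den = 0.0
    for (kx, ky), rssi in known.items():
        d2 = (x - kx) ** 2 + (y - ky) ** 2
        if d2 == 0:
            return rssi                  # exactly on a surveyed point
        w = 1.0 / d2 ** (power / 2)
        num += w * rssi
        den += w
    return num / den

survey = {(0, 0): -40.0, (10, 0): -60.0, (0, 10): -55.0, (10, 10): -70.0}
print(idw_rssi(survey, 5, 5))   # lies between the four measured values
```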

  16. On the downscaling of actual evapotranspiration maps based on combination of MODIS and landsat-based actual evapotranspiration estimates

    Science.gov (United States)

    Singh, Ramesh K.; Senay, Gabriel B.; Velpuri, Naga Manohar; Bohms, Stefanie; Verdin, James P.

    2014-01-01

    Downscaling is one of the important ways of utilizing the combined benefits of the high temporal resolution of Moderate Resolution Imaging Spectroradiometer (MODIS) images and the fine spatial resolution of Landsat images. We evaluated the output regression with intercept method and developed the Linear with Zero Intercept (LinZI) method for downscaling MODIS-based monthly actual evapotranspiration (AET) maps to Landsat-scale monthly AET maps for the Colorado River Basin for 2010. We used the 8-day MODIS land surface temperature product (MOD11A2) and 328 cloud-free Landsat images for computing AET maps and downscaling. The regression with intercept method has limitations in downscaling if the slope and intercept are computed over a large area. A good agreement was obtained between the monthly AET downscaled using the LinZI method and eddy covariance measurements from seven flux sites within the Colorado River Basin: the mean bias ranged from −16 mm (underestimation) to 22 mm (overestimation) per month, and the coefficient of determination varied from 0.52 to 0.88. Some discrepancies between measured and downscaled monthly AET at two flux sites were found to be due to the prevailing flux footprint. A reasonable comparison was also obtained between the downscaled monthly AET and the gridded FLUXNET dataset, and the downscaled monthly AET nicely captured the temporal variation in the sampled land cover classes. The proposed LinZI method can be used at finer temporal resolution (such as 8 days) with further evaluation, and will be very useful in advancing the application of remotely sensed images in water resources planning and management.
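
    The LinZI idea, a least-squares linear relation through the origin between coarse- and fine-scale AET, can be sketched as follows (the AET values are invented for illustration; the paper's workflow computes the relation from collocated MODIS and Landsat maps):

```python
def linzi_slope(modis, landsat):
    """Least-squares slope for a linear model through the origin (LinZI):
    landsat_AET ~ slope * modis_AET, with zero intercept so that zero
    coarse-pixel AET downscales to zero fine-pixel AET."""
    return sum(m, l in ()) if False else (
        sum(m * l for m, l in zip(modis, landsat)) / sum(m * m for m in modis)
    )

modis_aet   = [20.0, 45.0, 80.0, 110.0]   # mm/month, coarse pixels (illustrative)
landsat_aet = [18.0, 40.0, 74.0, 100.0]   # collocated fine-scale values
s = linzi_slope(modis_aet, landsat_aet)
downscaled = [round(s * m, 1) for m in modis_aet]
print(round(s, 3), downscaled)  # slope just below 1, applied per coarse pixel
```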

  17. MAP Fault Localization Based on Wide Area Synchronous Phasor Measurement Information

    Science.gov (United States)

    Zhang, Yagang; Wang, Zengping

    2015-02-01

    In complicated electrical engineering research, the emergence of phasor measurement units (PMUs) is a landmark event. The establishment and application of wide area measurement systems (WAMS) in power systems has had a widespread and profound influence on the safe and stable operation of complicated power systems. In this paper, taking full advantage of the wide area synchronous phasor measurement information provided by PMUs, we carry out precise fault localization based on the principle of maximum a posteriori (MAP) probability. A large number of simulation experiments has confirmed that the results of MAP fault localization are accurate and reliable: even in the presence of white Gaussian noise, the MAP classification results remain consistent with the actual situation.
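
    The MAP decision rule itself can be sketched as follows, assuming Gaussian measurement noise, a hypothetical phasor-derived feature vector per candidate fault section, and uniform priors (all illustrative; the paper's construction of features from PMU phasors is not reproduced here):

```python
import math

def map_fault_section(measurement, models, priors, sigma=0.1):
    """Pick the fault section maximising the posterior P(section | data).
    With Gaussian noise, this maximises log prior - ||z - mu||^2 / (2 sigma^2)."""
    best, best_score = None, -float("inf")
    for section, mu in models.items():
        dist2 = sum((z - m) ** 2 for z, m in zip(measurement, mu))
        score = math.log(priors[section]) - dist2 / (2 * sigma ** 2)
        if score > best_score:
            best, best_score = section, score
    return best

# hypothetical expected feature vectors for three candidate line sections
models = {"AB": [0.9, 0.1], "BC": [0.5, 0.5], "CD": [0.1, 0.9]}
priors = {"AB": 1 / 3, "BC": 1 / 3, "CD": 1 / 3}
noisy = [0.48, 0.55]          # measurement corrupted by Gaussian noise
print(map_fault_section(noisy, models, priors))  # BC
```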

  18. Generalized Smooth Transition Map Between Tent and Logistic Maps

    Science.gov (United States)

    Sayed, Wafaa S.; Fahmy, Hossam A. H.; Rezk, Ahmed A.; Radwan, Ahmed G.

    There is a continuous demand for novel chaotic generators to be employed in various modeling and pseudo-random number generation applications. This paper proposes a new chaotic map which is a general form for one-dimensional discrete-time maps employing the power function, with the tent and logistic maps as special cases. The proposed map uses extra parameters to provide responses that fit multiple applications for which conventional maps were not sufficient. The proposed generalization also covers maps whose iterative relations are not based on polynomials, i.e. those with fractional powers. We introduce a framework for analyzing the proposed map mathematically and predicting its behavior for various combinations of its parameters. In addition, we present and explain the transition map, which produces intermediate responses as the parameters vary from the values corresponding to the tent map to those corresponding to the logistic map. We study the properties of the proposed map, including the graph of the map equation, the general bifurcation diagram and its key points, output sequences, and the maximum Lyapunov exponent. We present further explorations such as the effects of scaling, the system response with respect to the new parameters, and operating ranges other than the transition region. Finally, a stream cipher system based on the generalized transition map validates its utility for image encryption applications. The system allows the construction of more efficient encryption keys, which enhances its sensitivity and other cryptographic properties.
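
    The maximum Lyapunov exponent examined in the paper can be estimated numerically for any 1-D map by averaging log|f'(x)| along an orbit; the sketch below applies this to the logistic map special case (r = 4, theoretical value ln 2 ≈ 0.693), not the authors' generalized map:

```python
import math

def lyapunov(f, df, x0, n=5000, burn=100):
    """Estimate the maximum Lyapunov exponent of a 1-D map as the
    orbit average of log|f'(x)| after a burn-in transient."""
    x = x0
    for _ in range(burn):
        x = f(x)
    acc = 0.0
    for _ in range(n):
        # guard against the measure-zero case f'(x) == 0
        acc += math.log(max(abs(df(x)), 1e-300))
        x = f(x)
    return acc / n

logistic  = lambda x: 4.0 * x * (1.0 - x)
dlogistic = lambda x: 4.0 - 8.0 * x
print(lyapunov(logistic, dlogistic, 0.123))  # theory: ln 2 ~ 0.693
```

    A positive exponent confirms sensitive dependence on initial conditions, the property that makes such maps usable as pseudo-random generators in the stream cipher application above.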

  19. Color reproduction system based on color appearance model and gamut mapping

    Science.gov (United States)

    Cheng, Fang-Hsuan; Yang, Chih-Yuan

    2000-06-01

    With the progress of computers, peripherals such as color monitors and printers are often used to generate color images. However, color reproduction across media as perceived by humans is usually inconsistent. Basically, the influencing factors are device calibration and characterization, viewing conditions, device gamut and human psychology. In this thesis, a color reproduction system based on a color appearance model and gamut mapping is proposed. It consists of four parts: device characterization, color management technique, color appearance model and gamut mapping.

  20. Low Cost Vision Based Personal Mobile Mapping System

    Science.gov (United States)

    Amami, M. M.; Smith, M. J.; Kokkas, N.

    2014-03-01

    Mobile mapping systems (MMS) can be used for several purposes, such as transportation, highway infrastructure mapping and GIS data collection. However, the acceptance of these systems is not widespread and their use is still limited due to the high cost and the dependency on the Global Navigation Satellite System (GNSS). A low cost vision-based personal MMS has been produced with the aim of overcoming these limitations. The system has been designed to depend mainly on cameras, using low cost GNSS and inertial sensors only to provide the bundle adjustment solution with initial values. The system has the potential to be used indoors and outdoors. It has been tested indoors and outdoors with different GPS coverage, surrounding features, and narrow and curvy paths. Tests show that the system is able to work in such environments, providing 3D coordinates with better than 10 cm accuracy.

  1. Low Cost Vision Based Personal Mobile Mapping System

    Directory of Open Access Journals (Sweden)

    M. M. Amami

    2014-03-01

    Full Text Available Mobile mapping systems (MMS) can be used for several purposes, such as transportation, highway infrastructure mapping and GIS data collection. However, the acceptance of these systems is not widespread and their use is still limited due to the high cost and the dependency on the Global Navigation Satellite System (GNSS). A low cost vision-based personal MMS has been produced with the aim of overcoming these limitations. The system has been designed to depend mainly on cameras, using low cost GNSS and inertial sensors only to provide the bundle adjustment solution with initial values. The system has the potential to be used indoors and outdoors. It has been tested indoors and outdoors with different GPS coverage, surrounding features, and narrow and curvy paths. Tests show that the system is able to work in such environments, providing 3D coordinates with better than 10 cm accuracy.

  2. Empty tracks optimization based on Z-Map model

    Science.gov (United States)

    Liu, Le; Yan, Guangrong; Wang, Zaijun; Zang, Genao

    2017-12-01

    For parts with many features, there are many empty (non-cutting) tool tracks during machining; if these tracks are not optimized, machining efficiency is seriously affected. In this paper, the characteristics of empty tracks are studied in detail and, combining existing optimization algorithms, a new track optimization method based on the Z-Map model is proposed. In this method, the tool tracks are divided into unit processing sections, and Z-Map model simulation is used to analyze the order constraints between the unit segments. The empty-stroke optimization problem is transformed into a TSP with sequential constraints, which is then solved with a genetic algorithm. This optimization method can handle not only simple but also complex structural parts, effectively planning the empty tracks and greatly improving processing efficiency.
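
    A bare-bones genetic algorithm for the resulting TSP might look like the sketch below; it omits the sequential (precedence) constraints of the real problem, and the segment start points and GA parameters are invented:

```python
import random

def tour_length(points, order):
    """Total empty-travel length of an open path visiting points in order."""
    return sum(
        ((points[order[i]][0] - points[order[i - 1]][0]) ** 2
         + (points[order[i]][1] - points[order[i - 1]][1]) ** 2) ** 0.5
        for i in range(1, len(order))
    )

def ga_tsp(points, pop_size=60, gens=300, seed=1):
    """Evolve a visiting order for machining-segment start points."""
    random.seed(seed)
    n = len(points)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda o: tour_length(points, o))
        next_pop = pop[:10]                       # elitism
        while len(next_pop) < pop_size:
            a, b = random.sample(pop[:30], 2)
            cut = random.randrange(1, n)          # ordered crossover
            child = a[:cut] + [g for g in b if g not in a[:cut]]
            if random.random() < 0.3:             # swap mutation
                i, j = random.randrange(n), random.randrange(n)
                child[i], child[j] = child[j], child[i]
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=lambda o: tour_length(points, o))

# hypothetical segment start points on a 2-D part
pts = [(0, 0), (9, 0), (1, 1), (8, 1), (2, 0), (7, 2), (0, 2), (9, 3)]
best = ga_tsp(pts)
print(tour_length(pts, best))  # much shorter than the naive segment order
```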

  3. a Fast and Flexible Method for Meta-Map Building for Icp Based Slam

    Science.gov (United States)

    Kurian, A.; Morin, K. W.

    2016-06-01

    Recent developments in LiDAR sensors make mobile mapping fast and cost effective. These sensors generate a large amount of data, which in turn improves the coverage and detail of the map. Due to the limited range of the sensor, one has to collect a series of scans to build the entire map of the environment. With good GNSS coverage, building a map is a well-addressed problem; but in an indoor environment GNSS reception is limited, and an inertial solution, if available, can quickly diverge. In such situations, simultaneous localization and mapping (SLAM) is used to generate a navigation solution and map concurrently. SLAM using point clouds poses a number of computational challenges even with modern hardware, due to the sheer amount of data. In this paper, we propose two strategies for minimizing the cost of computation and storage when a 3D point cloud is used for navigation and real-time map building. We have used the 3D point cloud generated by Leica Geosystems' Pegasus Backpack, which is equipped with Velodyne VLP-16 LiDAR scanners. To improve the speed of the conventional iterative closest point (ICP) algorithm, we propose a point cloud sub-sampling strategy that does not throw away any key features and yet significantly reduces the number of points that need to be processed and stored. To speed up the correspondence finding step, a dual kd-tree and circular buffer architecture is proposed. We show that the proposed method can run in real time and has excellent navigation accuracy characteristics.
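
    The sub-sampling idea can be illustrated with a plain voxel-grid downsample (a simplification: the paper's strategy additionally preserves key features, which this sketch does not attempt):

```python
import random

def voxel_downsample(points, voxel=0.5):
    """Keep one representative point (the centroid) per occupied voxel,
    bounding the point count per unit volume before ICP matching."""
    buckets = {}
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        buckets.setdefault(key, []).append((x, y, z))
    return [
        tuple(sum(c) / len(pts) for c in zip(*pts))
        for pts in buckets.values()
    ]

random.seed(0)
cloud = [(random.uniform(0, 2), random.uniform(0, 2), 0.0)
         for _ in range(10000)]
small = voxel_downsample(cloud, voxel=0.5)
print(len(cloud), "->", len(small))  # at most 16 voxels survive here
```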

  4. Synergy Maps: exploring compound combinations using network-based visualization.

    Science.gov (United States)

    Lewis, Richard; Guha, Rajarshi; Korcsmaros, Tamás; Bender, Andreas

    2015-01-01

    The phenomenon of super-additivity of biological response to compounds applied jointly, termed synergy, has the potential to provide many therapeutic benefits, so high-throughput screening of compound combinations has recently received a great deal of attention. Large compound libraries and the feasibility of all-pairs screening can easily generate large, information-rich datasets. Previously, these datasets have been visualized using either a heat-map or a network approach; however, these visualizations only partially represent the information encoded in the dataset. A new visualization technique for pairwise combination screening data, termed "Synergy Maps", is presented. In a Synergy Map, information about the synergistic interactions of compounds is integrated with information about their properties (chemical structure, physicochemical properties, bioactivity profiles) to produce a single visualization. As a result, the relationships between compound and combination properties may be investigated simultaneously, affording insight into the synergy observed in the screen. An interactive web app implementation, available at http://richlewis42.github.io/synergy-maps, has been developed for public use, and may find use in navigating and filtering larger-scale combination datasets. The tool is applied to a recent all-pairs dataset of anti-malarials tested against Plasmodium falciparum, and a preliminary analysis is given as an example, illustrating the disproportionate synergism of histone deacetylase inhibitors previously described in the literature, as well as suggesting new hypotheses for future investigation. Synergy Maps improve the state of the art in compound combination visualization by simultaneously representing individual compound properties and their interactions. The web-based tool allows straightforward exploration of combination data and easier identification of correlations between compound properties and interactions.

  5. NeuroMap: A spline-based interactive open-source software for spatiotemporal mapping of 2D and 3D MEA data

    Directory of Open Access Journals (Sweden)

    Oussama eAbdoun

    2011-01-01

    Full Text Available A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrode array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several mm²) of whole neural structures combined with microscopic resolution (about 50 µm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current source localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under the GNU General Public License (GPL) and available at http://sites.google.com/site/neuromapsoftware.

  6. NeuroMap: A Spline-Based Interactive Open-Source Software for Spatiotemporal Mapping of 2D and 3D MEA Data.

    Science.gov (United States)

    Abdoun, Oussama; Joucla, Sébastien; Mazzocco, Claire; Yvert, Blaise

    2011-01-01

    A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrode array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several square millimeters) of whole neural structures combined with microscopic resolution (about 50 μm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current source localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under the GNU General Public License and available at http://sites.google.com/site/neuromapsoftware.
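    The thin-plate spline interpolation at the core of NeuroMap can be sketched outside Matlab as well. The following Python sketch is illustrative only (the electrode layout, the Gaussian "source" and the grid are assumptions, not NeuroMap code); it shows the key property described above, namely that interpolated extrema need not coincide with recording sites:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical 8x8 regular MEA layout (positions in micrometres)
pitch = 50.0
xs, ys = np.meshgrid(np.arange(8) * pitch, np.arange(8) * pitch)
electrodes = np.column_stack([xs.ravel(), ys.ravel()])   # (64, 2)

# Synthetic voltage snapshot: a Gaussian "source" centred between electrodes
centre = np.array([125.0, 175.0])
v = np.exp(-np.sum((electrodes - centre) ** 2, axis=1) / (2 * 60.0 ** 2))

# Thin-plate spline interpolation onto a fine grid; the same call works
# for any scattered electrode arrangement, regular or irregular
tps = RBFInterpolator(electrodes, v, kernel='thin_plate_spline')
gx, gy = np.meshgrid(np.linspace(0, 350, 71), np.linspace(0, 350, 71))
grid = np.column_stack([gx.ravel(), gy.ravel()])
v_map = tps(grid).reshape(gx.shape)

# The interpolated extremum lands near (125, 175), between recording sites
i, j = np.unravel_index(np.argmax(v_map), v_map.shape)
print(gx[i, j], gy[i, j])
```

    Because radial basis function interpolation imposes no grid, the same code path covers 2D or 3D, regular or irregular MEA designs, which is the flexibility the abstract attributes to the spline approach.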

  7. An internal reference model-based PRF temperature mapping method with Cramer-Rao lower bound noise performance analysis.

    Science.gov (United States)

    Li, Cheng; Pan, Xinyi; Ying, Kui; Zhang, Qiang; An, Jing; Weng, Dehe; Qin, Wen; Li, Kuncheng

    2009-11-01

    The conventional phase difference method for MR thermometry suffers from disturbances caused by the presence of lipid protons, motion-induced error, and field drift. A signal model is presented with multi-echo gradient echo (GRE) sequence using a fat signal as an internal reference to overcome these problems. The internal reference signal model is fit to the water and fat signals by the extended Prony algorithm and the Levenberg-Marquardt algorithm to estimate the chemical shifts between water and fat which contain temperature information. A noise analysis of the signal model was conducted using the Cramer-Rao lower bound to evaluate the noise performance of various algorithms, the effects of imaging parameters, and the influence of the water:fat signal ratio in a sample on the temperature estimate. Comparison of the calculated temperature map and thermocouple temperature measurements shows that the maximum temperature estimation error is 0.614 degrees C, with a standard deviation of 0.06 degrees C, confirming the feasibility of this model-based temperature mapping method. The influence of sample water:fat signal ratio on the accuracy of the temperature estimate is evaluated in a water-fat mixed phantom experiment with an optimal ratio of approximately 0.66:1. (c) 2009 Wiley-Liss, Inc.
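    The Prony (linear-prediction) step of such a two-component signal model can be illustrated on a synthetic noiseless example. Echo spacing, frequencies and amplitudes below are hypothetical stand-ins, and the Levenberg-Marquardt refinement and Cramer-Rao analysis of the paper are omitted:

```python
import numpy as np

# Hypothetical multi-echo GRE signal: water + fat, each a complex
# exponential; their frequency difference carries the temperature information.
dt = 1.0e-3                      # echo spacing in seconds (assumed)
f_water, f_fat = 40.0, -180.0    # off-resonance frequencies in Hz (assumed)
n = np.arange(16)
s = 1.0 * np.exp(2j * np.pi * f_water * n * dt) \
  + 0.66 * np.exp(2j * np.pi * f_fat * n * dt)

# Prony / linear-prediction step: s[k] = a1*s[k-1] + a2*s[k-2]
A = np.column_stack([s[1:-1], s[:-2]])
a = np.linalg.lstsq(A, s[2:], rcond=None)[0]

# The roots of z^2 - a1*z - a2 are the two complex poles of the signal
z = np.roots([1.0, -a[0], -a[1]])
freqs = np.sort(np.angle(z) / (2 * np.pi * dt))
print(freqs)   # ≈ [-180, 40]
```

    With noise added, the least-squares pole estimates degrade, which is exactly why the paper evaluates estimator performance against the Cramer-Rao lower bound.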

  8. Performance Evaluation of Java Based Object Relational Mapping Tools

    Directory of Open Access Journals (Sweden)

    Shoaib Mahmood Bhatti

    2013-04-01

    Full Text Available Object persistency is a hot issue in the form of ORM (Object Relational Mapping) tools in industry, as developers use these tools during software development. This paper presents a performance evaluation of Java-based ORM tools. For this purpose, Hibernate, Ebean and TopLink have been selected as ORM tools that are popular and open source. Their performance has been measured from an execution point of view. The results show that ORM tools are a good option for developers considering the system throughput in shorter setbacks, and that they can be used efficiently and effectively for mapping objects into the relational world of databases, thus creating hope for a promising future for this technology.

  9. A microarray-based genotyping and genetic mapping approach for highly heterozygous outcrossing species enables localization of a large fraction of the unassembled Populus trichocarpa genome sequence.

    Science.gov (United States)

    Drost, Derek R; Novaes, Evandro; Boaventura-Novaes, Carolina; Benedict, Catherine I; Brown, Ryan S; Yin, Tongming; Tuskan, Gerald A; Kirst, Matias

    2009-06-01

    Microarrays have demonstrated significant power for genome-wide analyses of gene expression, and recently have also revolutionized the genetic analysis of segregating populations by genotyping thousands of loci in a single assay. Although microarray-based genotyping approaches have been successfully applied in yeast and several inbred plant species, their power has not been proven in an outcrossing species with extensive genetic diversity. Here we have developed methods for high-throughput microarray-based genotyping in such species using a pseudo-backcross progeny of 154 individuals of Populus trichocarpa and P. deltoides analyzed with long-oligonucleotide in situ-synthesized microarray probes. Our analysis resulted in high-confidence genotypes for 719 single-feature polymorphism (SFP) and 1014 gene expression marker (GEM) candidates. Using these genotypes and an established microsatellite (SSR) framework map, we produced a high-density genetic map comprising over 600 SFPs, GEMs and SSRs. The abundance of gene-based markers allowed us to localize over 35 million base pairs of previously unplaced whole-genome shotgun (WGS) scaffold sequence to putative locations in the genome of P. trichocarpa. A high proportion of sampled scaffolds could be verified for their placement with independently mapped SSRs, demonstrating the previously un-utilized power that high-density genotyping can provide in the context of map-based WGS sequence reassembly. Our results provide a substantial contribution to the continued improvement of the Populus genome assembly, while demonstrating the feasibility of microarray-based genotyping in a highly heterozygous population. The strategies presented are applicable to genetic mapping efforts in all plant species with similarly high levels of genetic diversity.

  10. Nano-scale orientation mapping of graphite in cast irons

    International Nuclear Information System (INIS)

    Theuwissen, Koenraad; Lacaze, Jacques; Véron, Muriel; Laffont, Lydia

    2014-01-01

    A diametrical section of a graphite spheroid from a ductile iron sample was prepared using the focused ion beam lift-out technique. Characterization of this section was carried out through automated crystal orientation mapping in a transmission electron microscope. This new technique automatically collects electron diffraction patterns and matches them with precalculated templates. The results of this investigation are crystal orientation and phase maps of the specimen, which shed new light on the understanding of growth mechanisms of this peculiar graphite morphology. This article shows that mapping the orientation of carbon-based materials such as graphite, which is difficult to achieve with conventional techniques, can be performed automatically and at high spatial resolution using automated crystal orientation mapping in a transmission electron microscope. - Highlights: • ACOM/TEM can be used to study the crystal orientation of carbon-based materials. • A spheroid is formed by conical sectors radiating from a central nucleus. • Misorientations exist within the conical sectors, defining various orientation domains
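    The template-matching principle behind automated crystal orientation mapping (score each acquired diffraction pattern against a bank of precalculated templates and keep the best-scoring orientation) can be sketched with toy patterns. The six-spot "templates" below are illustrative stand-ins, not dynamical diffraction simulations:

```python
import numpy as np

rng = np.random.default_rng(0)

def spot_pattern(angle_deg, size=64):
    """Toy diffraction template: a six-fold ring of spots rotated by angle_deg."""
    img = np.zeros((size, size))
    for k in range(6):
        a = np.deg2rad(angle_deg + 60 * k)
        img[int(size / 2 + 20 * np.sin(a)), int(size / 2 + 20 * np.cos(a))] = 1.0
    return img

# Precalculated template bank, one per candidate orientation
angles = np.arange(0, 60, 5)            # six-fold symmetry -> 0-55 degrees
bank = np.stack([spot_pattern(a) for a in angles])

# "Experimental" pattern: the 25-degree template plus background noise
exp = spot_pattern(25) + 0.1 * rng.random((64, 64))

# Normalised correlation index against every template; the best match
# assigns the orientation for this probe position
t = bank.reshape(len(angles), -1)
p = exp.ravel()
q = (t @ p) / (np.linalg.norm(t, axis=1) * np.linalg.norm(p))
best = angles[np.argmax(q)]
print(best)   # 25
```

    Repeating this per scan position yields the orientation map; a phase map follows from also keeping which template bank (phase) scored best.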

  11. [A Study on the Cognitive Learning Effectiveness of Scenario-Based Concept Mapping in a Neurological Nursing Course].

    Science.gov (United States)

    Pan, Hui-Ching; Hsieh, Suh-Ing; Hsu, Li-Ling

    2015-12-01

    The multiple levels of knowledge related to the neurological system deter many students from pursuing studies on this topic. Thus, in facing complicated and uncertain medical circumstances, nursing students have difficulty adjusting and using basic neurological-nursing knowledge and skills. Scenario-based concept-mapping teaching has been shown to promote the integration of complicated data, clarify related concepts, and increase the effectiveness of cognitive learning. To investigate the effect on the neurological-nursing cognition and learning attitude of nursing students of a scenario-based concept-mapping strategy that was integrated into the neurological nursing unit of a medical and surgical nursing course. This quasi-experimental study used experimental and control groups and a pre-test/post-test design. Sophomore (second-year) students in a four-year program at a university of science and technology in Taiwan were convenience sampled using cluster randomization that was run under SPSS 17.0. Concept-mapping lessons were used as the intervention for the experimental group. The control group followed traditional lesson plans only. The cognitive learning outcome was measured using the neurological nursing-learning examination. Both concept-mapping and traditional lessons significantly improved post-test neurological nursing learning scores (p < .05), as did learning attitude with regard to the teaching material. Furthermore, a significant number in the experimental group expressed the desire to add more lessons on anatomy, physiology, and pathology. These results indicate that this intervention strategy may help change the widespread fear and refusal of nursing students with regard to neurological lessons and may facilitate interest and positively affect learning in this important subject area. Integrating the concept-mapping strategy and traditional clinical-case lessons into neurological nursing lessons holds the potential to increase post-test scores significantly.

  12. Regulation of microtubule-based transport by MAP4

    Science.gov (United States)

    Semenova, Irina; Ikeda, Kazuho; Resaul, Karim; Kraikivski, Pavel; Aguiar, Mike; Gygi, Steven; Zaliapin, Ilya; Cowan, Ann; Rodionov, Vladimir

    2014-01-01

    Microtubule (MT)-based transport of organelles driven by the opposing MT motors kinesins and dynein is tightly regulated in cells, but the underlying molecular mechanisms remain largely unknown. Here we tested the regulation of MT transport by the ubiquitous protein MAP4 using Xenopus melanophores as an experimental system. In these cells, pigment granules (melanosomes) move along MTs to the cell center (aggregation) or to the periphery (dispersion) by means of cytoplasmic dynein and kinesin-2, respectively. We found that aggregation signals induced phosphorylation of threonine residues in the MT-binding domain of the Xenopus MAP4 (XMAP4), thus decreasing binding of this protein to MTs. Overexpression of XMAP4 inhibited pigment aggregation by shortening dynein-dependent MT runs of melanosomes, whereas removal of XMAP4 from MTs reduced the length of kinesin-2–dependent runs and suppressed pigment dispersion. We hypothesize that binding of XMAP4 to MTs negatively regulates dynein-dependent movement of melanosomes and positively regulates kinesin-2–based movement. Phosphorylation during pigment aggregation reduces binding of XMAP4 to MTs, thus increasing dynein-dependent and decreasing kinesin-2–dependent motility of melanosomes, which stimulates their accumulation in the cell center, whereas dephosphorylation of XMAP4 during dispersion has an opposite effect. PMID:25143402

  13. A technology mapping based on graph of excitations and outputs for finite state machines

    Science.gov (United States)

    Kania, Dariusz; Kulisz, Józef

    2017-11-01

    A new, efficient technology mapping method for FSMs, dedicated to PAL-based PLDs, is proposed. The essence of the method consists in searching for the minimal set of PAL-based logic blocks that covers a set of multiple-output implicants describing the transition and output functions of an FSM. The method is based on a new kind of graph: the Graph of Excitations and Outputs. The proposed algorithm was tested on standard FSM benchmarks, and the results were compared with classical technology mapping of FSMs.

  14. A first generation BAC-based physical map of the rainbow trout genome

    Directory of Open Access Journals (Sweden)

    Thorgaard Gary H

    2009-10-01

    Full Text Available Abstract Background Rainbow trout (Oncorhynchus mykiss) are the most widely cultivated cold freshwater fish in the world and an important model species for many research areas. Coupling great interest in this species as a research model with the need for genetic improvement of aquaculture production efficiency traits justifies the continued development of genomics research resources. Many quantitative trait loci (QTL) have been identified for production and life-history traits in rainbow trout. A bacterial artificial chromosome (BAC) physical map is needed to facilitate fine mapping of QTL and the selection of positional candidate genes for incorporation in marker-assisted selection (MAS) for improving rainbow trout aquaculture production. This resource will also facilitate efforts to obtain and assemble a whole-genome reference sequence for this species. Results The physical map was constructed from DNA fingerprinting of 192,096 BAC clones using the 4-color high-information content fingerprinting (HICF) method. The clones were assembled into physical map contigs using the fingerprinting contig (FPC) program. The map is composed of 4,173 contigs and 9,379 singletons. The total number of unique fingerprinting fragments (consensus bands) in contigs is 1,185,157, which corresponds to an estimated physical length of 2.0 Gb. The map assembly was validated by (1) comparison with probe hybridization results and agarose gel fingerprinting contigs; and (2) anchoring large contigs to the microsatellite-based genetic linkage map. Conclusion The production and validation of the first BAC physical map of the rainbow trout genome is described in this paper. We are currently integrating this map with the NCCCWA genetic map using more than 200 microsatellites isolated from BAC end sequences and by identifying BACs that harbor more than 300 previously mapped markers. The availability of an integrated physical and genetic map will enable detailed comparative genome

  15. An integrated genetic map based on four mapping populations and quantitative trait loci associated with economically important traits in watermelon (Citrullus lanatus)

    Science.gov (United States)

    2014-01-01

    Background Modern watermelon (Citrullus lanatus L.) cultivars share a narrow genetic base due to many years of selection for desirable horticultural qualities. Wild subspecies within C. lanatus are important potential sources of novel alleles for watermelon breeding, but successful trait introgression into elite cultivars has had limited success. The application of marker assisted selection (MAS) in watermelon is yet to be realized, mainly due to the past lack of high quality genetic maps. Recently, a number of useful maps have become available, however these maps have few common markers, and were constructed using different marker sets, thus, making integration and comparative analysis among maps difficult. The objective of this research was to use single-nucleotide polymorphism (SNP) anchor markers to construct an integrated genetic map for C. lanatus. Results Under the framework of the high density genetic map, an integrated genetic map was constructed by merging data from four independent mapping experiments using a genetically diverse array of parental lines, which included three subspecies of watermelon. The 698 simple sequence repeat (SSR), 219 insertion-deletion (InDel), 36 structure variation (SV) and 386 SNP markers from the four maps were used to construct an integrated map. This integrated map contained 1339 markers, spanning 798 cM with an average marker interval of 0.6 cM. Fifty-eight previously reported quantitative trait loci (QTL) for 12 traits in these populations were also integrated into the map. In addition, new QTL identified for brix, fructose, glucose and sucrose were added. Some QTL associated with economically important traits detected in different genetic backgrounds mapped to similar genomic regions of the integrated map, suggesting that such QTL are responsible for the phenotypic variability observed in a broad array of watermelon germplasm. Conclusions The integrated map described herein enhances the utility of genomic tools over

  16. An integrated genetic map based on four mapping populations and quantitative trait loci associated with economically important traits in watermelon (Citrullus lanatus).

    Science.gov (United States)

    Ren, Yi; McGregor, Cecilia; Zhang, Yan; Gong, Guoyi; Zhang, Haiying; Guo, Shaogui; Sun, Honghe; Cai, Wantao; Zhang, Jie; Xu, Yong

    2014-01-20

    Modern watermelon (Citrullus lanatus L.) cultivars share a narrow genetic base due to many years of selection for desirable horticultural qualities. Wild subspecies within C. lanatus are important potential sources of novel alleles for watermelon breeding, but successful trait introgression into elite cultivars has had limited success. The application of marker assisted selection (MAS) in watermelon is yet to be realized, mainly due to the past lack of high quality genetic maps. Recently, a number of useful maps have become available, however these maps have few common markers, and were constructed using different marker sets, thus, making integration and comparative analysis among maps difficult. The objective of this research was to use single-nucleotide polymorphism (SNP) anchor markers to construct an integrated genetic map for C. lanatus. Under the framework of the high density genetic map, an integrated genetic map was constructed by merging data from four independent mapping experiments using a genetically diverse array of parental lines, which included three subspecies of watermelon. The 698 simple sequence repeat (SSR), 219 insertion-deletion (InDel), 36 structure variation (SV) and 386 SNP markers from the four maps were used to construct an integrated map. This integrated map contained 1339 markers, spanning 798 cM with an average marker interval of 0.6 cM. Fifty-eight previously reported quantitative trait loci (QTL) for 12 traits in these populations were also integrated into the map. In addition, new QTL identified for brix, fructose, glucose and sucrose were added. Some QTL associated with economically important traits detected in different genetic backgrounds mapped to similar genomic regions of the integrated map, suggesting that such QTL are responsible for the phenotypic variability observed in a broad array of watermelon germplasm. 
The integrated map described herein enhances the utility of genomic tools over previous watermelon genetic maps. A

  17. GLIDERS - A web-based search engine for genome-wide linkage disequilibrium between HapMap SNPs

    Directory of Open Access Journals (Sweden)

    Broxholme John

    2009-10-01

    Full Text Available Abstract Background A number of tools for the examination of linkage disequilibrium (LD) patterns between nearby alleles exist, but none are available for quickly and easily investigating LD at longer ranges (>500 kb). We have developed a web-based query tool (GLIDERS: Genome-wide LInkage DisEquilibrium Repository and Search engine) that enables the retrieval of pairwise associations with r2 ≥ 0.3 across the human genome for any SNP genotyped within HapMap phase 2 and 3, regardless of distance between the markers. Description GLIDERS is an easy-to-use web tool that only requires the user to enter rs numbers of the SNPs they want to retrieve genome-wide LD for (both nearby and long-range). The intuitive web interface handles both manual entry of SNP IDs as well as allowing users to upload files of SNP IDs. The user can limit the resulting inter-SNP associations with easy-to-use menu options. These include MAF limit (5-45%), distance limits between SNPs (minimum and maximum), r2 (0.3 to 1), HapMap population sample (CEU, YRI and JPT+CHB combined) and HapMap build/release. All resulting genome-wide inter-SNP associations are displayed on a single output page, which has a link to a downloadable tab-delimited text file. Conclusion GLIDERS is a quick and easy way to retrieve genome-wide inter-SNP associations and to explore LD patterns for any number of SNPs of interest. GLIDERS can be useful in identifying SNPs with long-range LD. This can highlight mis-mapping or other potential association signal localisation problems.

  18. ReactionMap: an efficient atom-mapping algorithm for chemical reactions.

    Science.gov (United States)

    Fooshee, David; Andronico, Alessio; Baldi, Pierre

    2013-11-25

    Large databases of chemical reactions provide new data-mining opportunities and challenges. Key challenges result from the imperfect quality of the data and the fact that many of these reactions are not properly balanced or atom-mapped. Here, we describe ReactionMap, an efficient atom-mapping algorithm. Our approach uses a combination of maximum common chemical subgraph search and minimization of an assignment cost function derived empirically from training data. We use a set of over 259,000 balanced atom-mapped reactions from the SPRESI commercial database to train the system, and we validate it on random sets of 1000 and 17,996 reactions sampled from this pool. These large test sets represent a broad range of chemical reaction types, and ReactionMap correctly maps about 99% of the atoms and about 96% of the reactions, with a mean time per mapping of 2 s. Most correctly mapped reactions are mapped with high confidence. Mapping accuracy compares favorably with ChemAxon's AutoMapper, versions 5 and 6.1, and the DREAM Web tool. These approaches correctly map 60.7%, 86.5%, and 90.3% of the reactions, respectively, on the same data set. A ReactionMap server is available on the ChemDB Web portal at http://cdb.ics.uci.edu .
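    The assignment-cost component of such an approach can be illustrated with a toy example. The atoms, features and costs below are purely hypothetical; ReactionMap additionally uses maximum common subgraph search and an empirically trained cost function:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Toy reactant/product heavy atoms: (element, crude environment descriptor).
# These descriptors are illustrative stand-ins for MCS-derived features.
reactant = [('C', 3), ('C', 2), ('O', 1)]
product  = [('C', 2), ('C', 2), ('O', 2)]

BIG = 1e6   # effectively forbids mapping atoms of different elements
cost = np.zeros((3, 3))
for i, (ei, fi) in enumerate(reactant):
    for j, (ej, fj) in enumerate(product):
        cost[i, j] = abs(fi - fj) if ei == ej else BIG

# Minimum-cost one-to-one atom assignment (Hungarian algorithm)
rows, cols = linear_sum_assignment(cost)
mapping = dict(zip(rows, cols))
print(mapping)
```

    Forbidding cross-element assignments guarantees that the oxygen maps to the oxygen; the remaining cost terms break ties among chemically plausible carbon mappings, which is the role the trained cost function plays at scale.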

  19. Evaluating the Use of an Object-Based Approach to Lithological Mapping in Vegetated Terrain

    Directory of Open Access Journals (Sweden)

    Stephen Grebby

    2016-10-01

    Full Text Available Remote sensing-based approaches to lithological mapping are traditionally pixel-oriented, with classification performed on either a per-pixel or sub-pixel basis with complete disregard for contextual information about neighbouring pixels. However, intra-class variability due to heterogeneous surface cover (i.e., vegetation and soil) or regional variations in mineralogy and chemical composition can result in the generation of unrealistic, generalised lithological maps that exhibit the “salt-and-pepper” artefact of spurious pixel classifications, as well as poorly defined contacts. In this study, an object-based image analysis (OBIA) approach to lithological mapping is evaluated with respect to its ability to overcome these issues by instead classifying groups of contiguous pixels (i.e., objects). Due to significant vegetation cover in the study area, the OBIA approach incorporates airborne multispectral and LiDAR data to indirectly map lithologies by exploiting associations with both topography and vegetation type. The resulting lithological maps were assessed both in terms of their thematic accuracy and ability to accurately delineate lithological contacts. The OBIA approach is found to be capable of generating maps with an overall accuracy of 73.5% through integrating spectral and topographic input variables. When compared to equivalent per-pixel classifications, the OBIA approach achieved thematic accuracy increases of up to 13.1%, whilst also reducing the “salt-and-pepper” artefact to produce more realistic maps. Furthermore, the OBIA approach was also generally capable of mapping lithological contacts more accurately. The importance of optimising the segmentation stage of the OBIA approach is also highlighted. Overall, this study clearly demonstrates the potential of OBIA for lithological mapping applications, particularly in significantly vegetated and heterogeneous terrain.
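    The core OBIA idea, classifying contiguous objects rather than individual pixels, can be sketched with a per-object majority vote. The scene, noise level and (idealised) segmentation below are synthetic assumptions, used only to show how object-level labelling suppresses the "salt-and-pepper" artefact:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy scene: two lithological units with 10% per-pixel classification
# noise, i.e. the "salt-and-pepper" artefact of per-pixel classifiers
truth = np.zeros((40, 40), dtype=int)
truth[:, 20:] = 1
per_pixel = truth.copy()
flip = rng.random(truth.shape) < 0.10
per_pixel[flip] = 1 - per_pixel[flip]

# Object-based step: assume an idealised segmentation into two contiguous
# objects, then relabel each object by majority vote over its pixels
segments = (np.arange(40) >= 20)[None, :] * np.ones((40, 1), dtype=int)
obia = per_pixel.copy()
for s in np.unique(segments):
    mask = segments == s
    obia[mask] = np.bincount(per_pixel[mask]).argmax()

print((per_pixel == truth).mean(), (obia == truth).mean())
```

    In practice the segmentation itself is imperfect, which is why the study stresses optimising the segmentation stage: objects that straddle a lithological contact vote their way to the wrong label.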

  20. Breaking an encryption scheme based on chaotic baker map

    International Nuclear Information System (INIS)

    Alvarez, Gonzalo; Li, Shujun

    2006-01-01

    In recent years, a growing number of cryptosystems based on chaos have been proposed, many of them fundamentally flawed by a lack of robustness and security. This Letter describes the security weaknesses of a recently proposed cryptographic algorithm with chaos at the physical level based on the baker map. It is shown that the security is trivially compromised for practical implementations of the cryptosystem with finite computing precision and for the use of the iteration number n as the secret key. Some possible countermeasures to enhance the security of the chaos-based cryptographic algorithm are also discussed
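    The weakness of using the iteration number n as the secret key can be demonstrated with a toy discretised baker-map cipher (the permutation below is one common discretisation, not the exact scheme attacked in the Letter). A known-plaintext brute force over n succeeds immediately because the keyspace is tiny:

```python
import numpy as np

def baker(img):
    """One step of a discretised baker-map pixel permutation (n even)."""
    n = img.shape[0]
    out = np.empty_like(img)
    for i in range(n):
        for j in range(n):
            b = j & 1                      # column bit folded into the row
            if i < n // 2:
                out[2 * i + b, j // 2] = img[i, j]
            else:
                out[2 * i + b - n, j // 2 + n // 2] = img[i, j]
    return out

def encrypt(img, n_iter):
    for _ in range(n_iter):
        img = baker(img)
    return img

rng = np.random.default_rng(7)
plain = rng.integers(0, 256, (16, 16), dtype=np.uint8)
cipher = encrypt(plain, n_iter=5)          # the iteration count is the key

# Known-plaintext attack: with n as the only secret, exhaustive search
# over plausible iteration counts recovers an equivalent key at once
found = next(g for g in range(1, 64)
             if np.array_equal(encrypt(plain, g), cipher))
print("recovered key:", found)
```

    Worse still, on small images the permutation has a short period, so even large values of n collapse onto a handful of distinct ciphertexts, echoing the Letter's point that finite-precision implementations are trivially compromised.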

  1. Cropland Mapping over Sahelian and Sudanian Agrosystems: A Knowledge-Based Approach Using PROBA-V Time Series at 100-m

    Directory of Open Access Journals (Sweden)

    Marie-Julie Lambert

    2016-03-01

    Full Text Available Early warning systems for food security require accurate and up-to-date information on the location of major crops in order to prevent hazards. A recent systematic analysis of existing cropland maps identified priority areas for cropland mapping and highlighted a major need for the Sahelian and Sudanian agrosystems. This paper proposes a knowledge-based approach to map cropland in the Sahelian and Sudanian agrosystems that benefits from the 100-m spatial resolution of the recent PROBA-V sensor. The methodology uses five temporal features characterizing crop development throughout the vegetative season to optimize cropland discrimination. A feature importance analysis validates the efficiency of using a diversity of temporal features. The fully-automated method offers the first cropland map at 100-m using the PROBA-V sensor with an overall accuracy of 84% and an F-score for the cropland class of 74%. The improvements observed compared to existing cropland products are related to the hectometric resolution, to the methodology and to the quality of the labeling layer from which reliable training samples were automatically extracted. Classification errors are mainly explained by data availability and landscape fragmentation. Further improvements are expected with the upcoming enhanced cloud screening of the PROBA-V sensor.
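    The idea of separating cropland by temporal features of the vegetative season can be sketched on synthetic NDVI profiles. The profiles and the five metrics below are illustrative assumptions, not the paper's actual feature set:

```python
import numpy as np

# Synthetic 36-dekad NDVI profiles: cropland shows a short, steep season;
# natural vegetation a flatter, longer one (values are assumed)
t = np.arange(36)
crop = 0.15 + 0.55 * np.exp(-0.5 * ((t - 20) / 3.0) ** 2)
natural = 0.25 + 0.25 * np.exp(-0.5 * ((t - 18) / 8.0) ** 2)

def temporal_features(ndvi):
    """Five simple seasonal metrics of the kind used to separate cropland."""
    amp = ndvi.max() - ndvi.min()
    return np.array([
        ndvi.max(),                              # peak greenness
        ndvi.min(),                              # dry-season baseline
        amp,                                     # seasonal amplitude
        float(np.argmax(ndvi)),                  # timing of the peak
        (ndvi > ndvi.min() + 0.5 * amp).sum(),   # season-length proxy
    ])

f_crop, f_nat = temporal_features(crop), temporal_features(natural)
print(f_crop[2] > f_nat[2], f_crop[4] < f_nat[4])   # True True
```

    A classifier trained on such per-pixel feature vectors exploits exactly the contrast printed above: cropland combines a larger seasonal amplitude with a shorter growing season.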

  2. Estimating missing hourly climatic data using artificial neural network for energy balance based ET mapping applications

    Science.gov (United States)

    Remote sensing based evapotranspiration (ET) mapping has become an important tool for water resources management at a regional scale. Accurate hourly climatic data and reference ET are crucial input for successfully implementing remote sensing based ET models such as Mapping ET with internal calibra...

  3. The comparative effect of individually-generated vs. collaboratively-generated computer-based concept mapping on science concept learning

    Science.gov (United States)

    Kwon, So Young

    Using a quasi-experimental design, the researcher investigated the comparative effects of individually-generated and collaboratively-generated computer-based concept mapping on middle school science concept learning. Qualitative data were analyzed to explain quantitative findings. One hundred sixty-one students (74 boys and 87 girls) in eight seventh-grade science classes at a middle school in Southeast Texas completed the entire study. Using prior science performance scores to assure equivalence of student achievement across groups, the researcher assigned the teacher's classes to one of the three experimental groups. The independent variable, group, consisted of three levels: 40 students in a control group, 59 students trained to individually generate concept maps on computers, and 62 students trained to collaboratively generate concept maps on computers. The dependent variables were science concept learning as demonstrated by comprehension test scores, and quality of concept maps created by students in experimental groups as demonstrated by rubric scores. Students in the experimental groups received concept mapping training and used their newly acquired concept mapping skills to individually or collaboratively construct computer-based concept maps during study time. The control group, the individually-generated concept mapping group, and the collaboratively-generated concept mapping group had equivalent learning experiences for 50 minutes during five days, except that students in the control group worked independently without concept mapping activities, students in the individual group worked individually to construct concept maps, and students in the collaborative group worked collaboratively to construct concept maps during their study time. Both collaboratively and individually generated computer-based concept mapping had a positive effect on seventh-grade middle school science concept learning, but neither strategy was more effective than the other.
However

  4. Geologic Mapping and Paired Geochemical-Paleomagnetic Sampling of Reference Sections in the Grande Ronde Basalt: An Example from the Bingen Section, Columbia River Gorge, Washington

    Science.gov (United States)

    Sawlan, M.; Hagstrum, J. T.; Wells, R. E.

    2011-12-01

    We have completed comprehensive geochemical (GC) and paleomagnetic (PM) sampling of individual lava flows from eight reference stratigraphic sections in the Grande Ronde Basalt (GRB), Columbia River Basalt Group [Hagstrum et al., 2009, GSA Ann. Mtg, Portland (abst); Hagstrum et al., 2010, AGU Fall Mtg, San Francisco (abst)]. These sections, distributed across the Columbia Plateau and eastern Columbia River Gorge, contain as many as 30 flows, are up to 670 m thick, span upper magneto-stratigraphic zones R2 and N2, and, in some locations, also contain one or more N1 flows. In concert with GC and PM sampling, we have carried out detailed geologic mapping of these sections, typically at a scale of 1:3,000 to 1:5,000, using GPS, digital imagery from the National Aerial Imagery Program (NAIP), and compilation in GIS. GRB member and informal unit names of Reidel et al. [1989, GSA Sp. Paper 239] generally have been adopted, although two new units are identified and named within the N2 zone. Notably, a distinctive PM direction for intercalated lavas of several lower N2 units indicates coeval eruption of compositionally distinct units; this result contrasts with the scenario of serial stratigraphic succession of GRB units proposed by Reidel et al. [1989]. Our objectives in the mapping include: Confirming the integrity of the stratigraphic sequences by documenting flow contacts and intraflow horizons (changes in joint patterns or vesicularity); assessing fault displacements; and, establishing precisely located samples in geologic context such that selected sites can be unambiguously reoccupied. A geologic map and GC-PM data for the Bingen section, along the north side of the Columbia River, are presented as an example of our GRB reference section mapping and sampling. One of our thicker sections (670 m) along which 30 flows are mapped, the Bingen section spans 7 km along WA State Hwy 14, from near the Hood River Bridge ESE to Locke Lake. 
This section cuts obliquely through a

  5. Quantum image encryption based on generalized affine transform and logistic map

    Science.gov (United States)

    Liang, Hao-Ran; Tao, Xiang-Yang; Zhou, Nan-Run

    2016-07-01

    Quantum circuits of the generalized affine transform are devised based on the novel enhanced quantum representation of digital images. A novel quantum image encryption algorithm combining the generalized affine transform with logistic map is suggested. The gray-level information of the quantum image is encrypted by the XOR operation with a key generator controlled by the logistic map, while the position information of the quantum image is encoded by the generalized affine transform. The encryption keys include the independent control parameters used in the generalized affine transform and the logistic map. Thus, the key space is large enough to frustrate the possible brute-force attack. Numerical simulations and analyses indicate that the proposed algorithm is realizable, robust and has a better performance than its classical counterpart in terms of computational complexity.
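
The keystream idea described above can be sketched classically: a logistic map, seeded from the secret key, drives an XOR over pixel gray levels. This is only an illustrative analogue of the paper's scheme, which is quantum and additionally permutes pixel positions via the generalized affine transform; the parameter values here are assumptions.

```python
def logistic_keystream(x0, r, n):
    """Generate n keystream bytes by iterating the logistic map x -> r*x*(1-x)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)            # logistic map iteration
        out.append(int(x * 256) % 256)   # quantize each state to one byte
    return out

def xor_pixels(pixels, x0=0.3141, r=3.99):
    """XOR gray levels with the key-controlled keystream (hypothetical key values)."""
    ks = logistic_keystream(x0, r, len(pixels))
    return [p ^ k for p, k in zip(pixels, ks)]

img = [12, 200, 45, 99]
cipher = xor_pixels(img)
plain = xor_pixels(cipher)   # XOR with the same keystream is its own inverse
```

Because XOR is self-inverse, decryption is simply re-encryption with the same key, which is why the key space (here the continuous parameters x0 and r) carries all the security.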

  6. Phase mapping of iron-based rapidly quenched alloys using precession electron diffraction

    International Nuclear Information System (INIS)

    Svec, P.; Janotova, I.; Hosko, J.; Matko, I.; Janickovic, D.; Svec, P. Sr.; Kepaptsoglou, D. M.

    2013-01-01

    The present contribution is focused on the application of PED and phase/orientation mapping to nanocrystals of bcc-Fe formed during the first crystallization stage of an amorphous Fe-Co-Si-B ribbon. Using precession electron diffraction and phase/orientation mapping, the formation of the primary crystalline phase, bcc-Fe, from amorphous Fe-Co-Si-B has been analyzed. Important information about the mutual orientation of the phase in individual submicron grains, as well as relative to the sample surface, has been obtained. This information contributes to the understanding of the micromechanisms controlling crystallization from an amorphous rapidly quenched structure and of the structure of the original amorphous state. Due to its high spatial resolution, speed, and information content, the presented technique complements classical techniques well, especially for nanocrystalline materials. (authors)

  7. Simulation of seagrass bed mapping by satellite images based on the radiative transfer model

    Science.gov (United States)

    Sagawa, Tatsuyuki; Komatsu, Teruhisa

    2015-06-01

    Seagrass and seaweed beds play important roles in coastal marine ecosystems. They are food sources and habitats for many marine organisms, and influence the physical, chemical, and biological environment. They are sensitive to human impacts such as reclamation and pollution. Therefore, their management and preservation are necessary for a healthy coastal environment. Satellite remote sensing is a useful tool for mapping and monitoring seagrass beds. The efficiency of seagrass mapping, seagrass bed classification in particular, has been evaluated by mapping accuracy using an error matrix. However, mapping accuracies are influenced by coastal environments such as seawater transparency, bathymetry, and substrate type. Coastal management requires sufficient accuracy and an understanding of mapping limitations for monitoring coastal habitats including seagrass beds. Previous studies are mainly based on case studies in specific regions and seasons. Extensive data are required to generalise assessments of classification accuracy from case studies, which has proven difficult. This study aims to build a simulator based on a radiative transfer model to produce modelled satellite images and assess the visual detectability of seagrass beds under different transparencies and seagrass coverages, as well as to examine mapping limitations and classification accuracy. Our simulations led to a model of water transparency, mapped depth limits, and indicated the possibility of seagrass density mapping under certain ideal conditions. The results show that modelling satellite images is useful in evaluating classification accuracy and that remote sensing is a reliable tool for seagrass bed monitoring.

  8. Towards the XML schema measurement based on mapping between XML and OO domain

    Science.gov (United States)

    Rakić, Gordana; Budimac, Zoran; Heričko, Marjan; Pušnik, Maja

    2017-07-01

    Measuring the quality of IT solutions is a priority in software engineering. Although numerous metrics for measuring object-oriented code already exist, measuring the quality of UML models or XML schemas is still developing. One of the research questions in the overall research led by the ideas described in this paper is whether already defined object-oriented design metrics can be applied to XML schemas based on predefined mappings. In this paper, the basic ideas for this mapping are presented. The mapping is a prerequisite for a future approach to measuring XML schema quality with object-oriented metrics.

  9. Alpha Matting with KL-Divergence Based Sparse Sampling.

    Science.gov (United States)

    Karacan, Levent; Erdem, Aykut; Erdem, Erkut

    2017-06-22

    In this paper, we present a new sampling-based alpha matting approach for the accurate estimation of the foreground and background layers of an image. Previous sampling-based methods typically rely on certain heuristics in collecting representative samples from known regions, and thus their performance deteriorates if the underlying assumptions are not satisfied. To alleviate this, we take an entirely new approach and formulate sampling as a sparse subset selection problem where we propose to pick a small set of candidate samples that best explains the unknown pixels. Moreover, we describe a new dissimilarity measure for comparing two samples which is based on the KL-divergence between the distributions of features extracted in the vicinity of the samples. The proposed framework is general and could be easily extended to video matting by additionally taking temporal information into account in the sampling process. Evaluation on standard benchmark datasets for image and video matting demonstrates that our approach provides more accurate results compared to the state-of-the-art methods.
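
The dissimilarity measure above can be sketched as a KL-divergence between two discrete feature distributions, e.g. color histograms collected around two samples. The histogram construction here is a stand-in assumption; the paper's actual feature extraction is not specified in this abstract.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) for discrete distributions given as lists summing to 1.
    A small eps guards against log(0) for empty histogram bins."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Hypothetical 3-bin feature histograms around two candidate samples
hist_a = [0.7, 0.2, 0.1]
hist_b = [0.1, 0.2, 0.7]

d_ab = kl_divergence(hist_a, hist_b)   # > 0: the distributions differ
d_aa = kl_divergence(hist_a, hist_a)   # = 0: identical distributions
```

Note that KL-divergence is asymmetric (D_KL(p||q) generally differs from D_KL(q||p)), so a matting pipeline must fix a convention for which sample supplies p and which q.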

  11. Developing a climate-based risk map of fascioliasis outbreaks in Iran.

    Science.gov (United States)

    Halimi, Mansour; Farajzadeh, Manuchehr; Delavari, Mahdi; Arbabi, Mohsen

    2015-01-01

    The strong relationship between climate and fascioliasis outbreaks enables the development of climate-based models to estimate the potential risk of fascioliasis outbreaks. This work aims to develop a climate-based risk map of fascioliasis outbreaks in Iran using Ollerenshaw's fascioliasis risk index incorporating a geographical information system (GIS). Using this index, a risk map of fascioliasis outbreaks for the entire country was developed. We determined that the country can be divided into 4 fascioliasis outbreak risk categories. Class 1, in which the Mt value is less than 100, includes more than 0.91 of the country's area. The climate in this class is not conducive to fascioliasis outbreaks in any month. Dryness and low temperature in the wet season (December to April) are the key barriers against fascioliasis outbreaks in this class. The risk map developed based on climatic factors indicated that only 0.03 of the country's area, including Gilan province in the northern region of Iran, is highly suitable for fascioliasis outbreaks during September to January. The Mt value is greater than 500 in this class. Heavy rainfall in the summer and fall, especially in Rasht, Astara and Bandar Anzaly (≥ 1000 mm/year), creates more suitable breeding places for snail intermediate hosts. Copyright © 2015 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  12. Evaluation of Sampling Methods for Bacillus Spore ...

    Science.gov (United States)

    Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have compared the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.

  13. Clustering of the Self-Organizing Map based Approach in Induction Machine Rotor Faults Diagnostics

    Directory of Open Access Journals (Sweden)

    Ahmed TOUMI

    2009-12-01

    Self-Organizing Maps (SOM) are an excellent method for analyzing multidimensional data. SOM-based classification is attractive due to its unsupervised learning and topology-preserving properties. In this paper, the performance of self-organizing methods is investigated for induction motor rotor fault detection and severity evaluation. The SOM is based on motor current signature analysis (MCSA). An agglomerative hierarchical algorithm using Ward's method is applied to automatically divide the map into interesting, interpretable groups of map units that correspond to clusters in the input data. The results obtained with this approach make it possible to detect a rotor bar fault directly from the visualization results. The system is also able to estimate the extent of rotor faults.

  14. Color image encryption based on Coupled Nonlinear Chaotic Map

    International Nuclear Information System (INIS)

    Mazloom, Sahar; Eftekhari-Moghadam, Amir Masud

    2009-01-01

    Image encryption differs from text encryption due to some inherent features of images, such as bulk data capacity and high correlation among pixels, which are generally difficult to handle by conventional methods. The desirable cryptographic properties of chaotic maps, such as sensitivity to initial conditions and random-like behavior, have attracted the attention of cryptographers to develop new encryption algorithms. Recent research on image encryption algorithms has therefore increasingly been based on chaotic systems, although the drawbacks of small key space and weak security in one-dimensional chaotic cryptosystems are obvious. This paper proposes a Coupled Nonlinear Chaotic Map, called CNCM, and a novel chaos-based image encryption algorithm that uses CNCM to encrypt color images. The chaotic cryptography technique used in this paper is a symmetric-key cryptography with a stream cipher structure. To increase the security of the proposed algorithm, a 240-bit secret key is used to generate the initial conditions and parameters of the chaotic map by applying some algebraic transformations to the key. These transformations, as well as the nonlinearity and coupling structure of the CNCM, enhance the cryptosystem's security. For higher security and complexity, the scheme incorporates the image size and color components into the cryptosystem, thereby significantly increasing the resistance to known/chosen-plaintext attacks. The results of several experimental tests, statistical analyses and key sensitivity tests show that the proposed image encryption scheme provides an efficient and secure way for real-time image encryption and transmission.

  15. A GIS-based extended fuzzy multi-criteria evaluation for landslide susceptibility mapping

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Shadman Roodposhti, Majid; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    Landslide susceptibility mapping (LSM) is making increasing use of GIS-based spatial analysis in combination with multi-criteria evaluation (MCE) methods. We have developed a new multi-criteria decision analysis (MCDA) method for LSM and applied it to the Izeh River basin in south-western Iran. Our method is based on fuzzy membership functions (FMFs) derived from GIS analysis. It makes use of nine causal landslide factors identified by local landslide experts. Fuzzy set theory was first integrated with an analytical hierarchy process (AHP) in order to use pairwise comparisons to compare LSM criteria for ranking purposes. FMFs were then applied in order to determine the criteria weights to be used in the development of a landslide susceptibility map. Finally, a landslide inventory database was used to validate the LSM map by comparing it with known landslides within the study area. Results indicated that the integration of fuzzy set theory with AHP produced significantly improved accuracies and a high level of reliability in the resulting landslide susceptibility map. Approximately 53% of known landslides within our study area fell within zones classified as having “very high susceptibility”, with a further 31% falling into zones classified as having “high susceptibility”. PMID:26089577

  16. Impact of population structure, effective bottleneck time, and allele frequency on linkage disequilibrium maps.

    Science.gov (United States)

    Zhang, Weihua; Collins, Andrew; Gibson, Jane; Tapper, William J; Hunt, Sarah; Deloukas, Panos; Bentley, David R; Morton, Newton E

    2004-12-28

    Genetic maps in linkage disequilibrium (LD) units play the same role for association mapping as maps in centimorgans provide at much lower resolution for linkage mapping. Association mapping of genes determining disease susceptibility and other phenotypes is based on the theory of LD, here applied to relations with three phenomena. To test the theory, markers at high density along a 10-Mb continuous segment of chromosome 20q were studied in African-American, Asian, and Caucasian samples. Population structure, whether created by pooling samples from divergent populations or by the mating pattern in a mixed population, is accurately bioassayed from genotype frequencies. The effective bottleneck time for Eurasians is substantially less than for migration out of Africa, reflecting later bottlenecks. The classical dependence of allele frequency on mutation age does not hold for the generally shorter time span of inbreeding and LD. Limitation of the classical theory to mutation age justifies the assumption of constant time in a LD map, except for alleles that were rare at the effective bottleneck time or have arisen since. This assumption is derived from the Malecot model and verified in all samples. Tested measures of relative efficiency, support intervals, and localization error determine the operating characteristics of LD maps that are applicable to every sexually reproducing species, with implications for association mapping, high-resolution linkage maps, evolutionary inference, and identification of recombinogenic sequences.
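
The Malecot model referred to above is commonly written, in the LD-map literature, as an exponential decline of association with distance; the form below is a sketch from that general literature (symbols follow common usage) rather than a formula reproduced from this abstract:

```latex
\hat{\rho}(d) = (1 - L)\, M e^{-\varepsilon d} + L
```

where \(\hat{\rho}\) is the predicted association between a marker pair, \(d\) their distance, \(M\) the association at zero distance, \(\varepsilon\) the rate of exponential decline, and \(L\) the asymptotic background association. Because \(\varepsilon d\) is additive over adjacent intervals, cumulating it along a chromosome yields the additive distances in LD units (LDU) that make such maps analogous to centimorgan maps.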

  17. MITIE: Simultaneous RNA-Seq-based transcript identification and quantification in multiple samples.

    Science.gov (United States)

    Behr, Jonas; Kahles, André; Zhong, Yi; Sreedharan, Vipin T; Drewe, Philipp; Rätsch, Gunnar

    2013-10-15

    High-throughput sequencing of mRNA (RNA-Seq) has led to tremendous improvements in the detection of expressed genes and reconstruction of RNA transcripts. However, the extensive dynamic range of gene expression, technical limitations and biases, as well as the observed complexity of the transcriptional landscape, pose profound computational challenges for transcriptome reconstruction. We present the novel framework MITIE (Mixed Integer Transcript IdEntification) for simultaneous transcript reconstruction and quantification. We define a likelihood function based on the negative binomial distribution, use a regularization approach to select a few transcripts collectively explaining the observed read data and show how to find the optimal solution using Mixed Integer Programming. MITIE can (i) take advantage of known transcripts, (ii) reconstruct and quantify transcripts simultaneously in multiple samples, and (iii) resolve the location of multi-mapping reads. It is designed for genome- and assembly-based transcriptome reconstruction. We present an extensive study based on realistic simulated RNA-Seq data. When compared with state-of-the-art approaches, MITIE proves to be significantly more sensitive and overall more accurate. Moreover, MITIE yields substantial performance gains when used with multiple samples. We applied our system to 38 Drosophila melanogaster modENCODE RNA-Seq libraries and estimated the sensitivity of reconstructing omitted transcript annotations and the specificity with respect to annotated transcripts. Our results corroborate that a well-motivated objective paired with appropriate optimization techniques lead to significant improvements over the state-of-the-art in transcriptome reconstruction. MITIE is implemented in C++ and is available from http://bioweb.me/mitie under the GPL license.

  18. A FAST AND FLEXIBLE METHOD FOR META-MAP BUILDING FOR ICP BASED SLAM

    Directory of Open Access Journals (Sweden)

    A. Kurian

    2016-06-01

    Recent developments in LiDAR sensors make mobile mapping fast and cost effective. These sensors generate a large amount of data, which in turn improves the coverage and details of the map. Due to the limited range of the sensor, one has to collect a series of scans to build the entire map of the environment. If we have good GNSS coverage, building a map is a well addressed problem. But in an indoor environment, we have limited GNSS reception and an inertial solution, if available, can quickly diverge. In such situations, simultaneous localization and mapping (SLAM) is used to generate a navigation solution and map concurrently. SLAM using point clouds poses a number of computational challenges even with modern hardware due to the sheer amount of data. In this paper, we propose two strategies for minimizing the cost of computation and storage when a 3D point cloud is used for navigation and real-time map building. We have used the 3D point cloud generated by Leica Geosystems' Pegasus Backpack, which is equipped with Velodyne VLP-16 LiDAR scanners. To improve the speed of the conventional iterative closest point (ICP) algorithm, we propose a point cloud sub-sampling strategy which does not throw away any key features and yet significantly reduces the number of points that need to be processed and stored. In order to speed up the correspondence-finding step, a dual kd-tree and circular buffer architecture is proposed. We have shown that the proposed method can run in real time and has excellent navigation accuracy characteristics.
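
A generic point-cloud sub-sampling step can be sketched with a voxel grid: keep one representative point per occupied cell. This is a common reduction strategy and only an illustrative stand-in for the authors' feature-preserving scheme, which is not detailed in the abstract; the voxel size is an assumption.

```python
def voxel_downsample(points, voxel=0.5):
    """points: iterable of (x, y, z) tuples.
    Returns one representative point per occupied voxel cell."""
    cells = {}
    for x, y, z in points:
        # integer cell index of the voxel containing this point
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        cells.setdefault(key, (x, y, z))   # keep the first point seen per cell
    return list(cells.values())

cloud = [(0.1, 0.1, 0.0), (0.2, 0.15, 0.05), (3.0, 3.0, 3.0)]
reduced = voxel_downsample(cloud, voxel=0.5)   # two nearby points collapse to one
```

Such a reduction shrinks both the ICP correspondence search (fewer kd-tree queries) and the stored map, at the cost of losing sub-voxel detail, which is why the voxel size must be chosen relative to the sensor's range accuracy.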

  19. Gene-based single nucleotide polymorphism markers for genetic and association mapping in common bean.

    Science.gov (United States)

    Galeano, Carlos H; Cortés, Andrés J; Fernández, Andrea C; Soler, Álvaro; Franco-Herrera, Natalia; Makunde, Godwill; Vanderleyden, Jos; Blair, Matthew W

    2012-06-26

    In common bean, expressed sequence tags (ESTs) are an underestimated source of gene-based markers such as insertion-deletions (indels) or single-nucleotide polymorphisms (SNPs). However, due to the conserved nature of these sequences, marker detection is difficult and yields low levels of polymorphism. Development of intron-spanning EST-SNP markers can therefore be a valuable resource for genetic experiments such as genetic mapping and association studies. In this study, a total of 313 new gene-based markers were developed at target genes. Intronic variation was explored in depth in order to capture more polymorphism. Introns were putatively identified by comparing the common bean ESTs with the soybean genome, and primers were designed over intron-flanking regions. The intronic regions were evaluated for parental polymorphisms using the single strand conformational polymorphism (SSCP) technique and the Sequenom MassARRAY system. A total of 53 new marker loci were placed on an integrated molecular map in the DOR364 × G19833 recombinant inbred line (RIL) population. The new linkage map was used to build a consensus map, merging the linkage maps of the BAT93 × JALO EEP558 and DOR364 × BAT477 populations. A total of 1,060 markers were mapped, with a total map length of 2,041 cM across 11 linkage groups. As a second application of the generated resource, a diversity panel with 93 genotypes was evaluated with 173 SNP markers using the MassARRAY platform and KASPar technology. These results were coupled with previous SSR evaluations and drought tolerance assays carried out on the same individuals. This agglomerative dataset was examined, in order to discover marker-trait associations, using a general linear model (GLM) and a mixed linear model (MLM). Some significant associations with yield components were identified, and were consistent with previous findings. In short, this study illustrates the power of intron-based markers for linkage and association mapping in

  1. White matter fiber-based analysis of T1w/T2w ratio map

    Science.gov (United States)

    Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D.; Entringer, Sonja; Buss, Claudia; Styner, Martin

    2017-02-01

    Purpose: To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. Background: The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. Methods: We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. Results: We developed an open-source tool for the fiber-based analysis of T1w/T2w ratio maps and tested it in a study of brain development.

  2. Mass spectrometric-based stable isotopic 2-aminobenzoic acid glycan mapping for rapid glycan screening of biotherapeutics.

    Science.gov (United States)

    Prien, Justin M; Prater, Bradley D; Qin, Qiang; Cockrill, Steven L

    2010-02-15

    Fast, sensitive, robust methods for "high-level" glycan screening are necessary during various stages of a biotherapeutic product's lifecycle, including clone selection, process changes, and quality control for lot release testing. Traditional glycan screening involves chromatographic or electrophoretic separation-based methods, and, although reproducible, these methods can be time-consuming. Even ultrahigh-performance chromatographic and microfluidic integrated LC/MS systems, which work on the tens of minute time scale, become lengthy when hundreds of samples are to be analyzed. Comparatively, a direct infusion mass spectrometry (MS)-based glycan screening method acquires data on a millisecond time scale, exhibits exquisite sensitivity and reproducibility, and is amenable to automated peak annotation. In addition, characterization of glycan species via sequential mass spectrometry can be performed simultaneously. Here, we demonstrate a quantitative high-throughput MS-based mapping approach using stable isotope 2-aminobenzoic acid (2-AA) for rapid "high-level" glycan screening.

  3. Tree Cover Mapping Tool—Documentation and user manual

    Science.gov (United States)

    Cotillon, Suzanne E.; Mathis, Melissa L.

    2016-06-02

    The Tree Cover Mapping (TCM) tool was developed by scientists at the U.S. Geological Survey Earth Resources Observation and Science Center to allow a user to quickly map tree cover density over large areas using visual interpretation of high resolution imagery within a geographic information system interface. The TCM tool uses a systematic sample grid to produce maps of tree cover. The TCM tool allows the user to define sampling parameters to estimate tree cover within each sample unit. This mapping method generated the first on-farm tree cover maps of vast regions of Niger and Burkina Faso. The approach contributes to implementing integrated landscape management to scale up re-greening and restore degraded land in the drylands of Africa. The TCM tool is easy to operate, practical, and can be adapted to many other applications such as crop mapping, settlements mapping, or other features. This user manual provides step-by-step instructions for installing and using the tool, and creating tree cover maps. Familiarity with ArcMap tools and concepts is helpful for using the tool.

  4. Map-based multicriteria analysis to support interactive land use allocation

    NARCIS (Netherlands)

    Arciniegas Lopez, G.A.; Janssen, R.; Omtzigt, A.Q.A.

    2011-01-01

    This article focuses on the use of map-based multicriteria analysis to develop a negotiation support tool for land use allocation. Spatial multicriteria analysis is used to make explicit trade-offs between objectives and to provide guidance and feedback on the land use changes negotiated by the

  5. Creation of a Cell-Based Digital Cadastral Mapping System (Digital ...

    African Journals Online (AJOL)

    A digital cadastre enables land transaction activities to be conducted in a businesslike manner. Similarly, land subdivision or boundary redefinition, land registration and land marketing are achieved with better accuracy. This paper discusses the need to introduce a national Cell-Based Digital Cadastral Mapping System model ...

  6. Optimized preparation of urine samples for two-dimensional electrophoresis and initial application to patient samples

    DEFF Research Database (Denmark)

    Lafitte, Daniel; Dussol, Bertrand; Andersen, Søren

    2002-01-01

    OBJECTIVE: We optimized the preparation of urinary samples to obtain a comprehensive map of the urinary proteins of healthy subjects and then compared this map with those obtained from patient samples to show that the pattern was specific to their kidney disease. DESIGN AND METHODS: The urinary...

  7. PERCEPTUAL MAPPING BASED ON IDIOSYNCRATIC SETS OF ATTRIBUTES

    NARCIS (Netherlands)

    STEENKAMP, JBEM; VANTRIJP, HCM; TENBERGE, JMF

    The authors describe a compositional perceptual mapping procedure, unrestricted attribute-elicitation mapping (UAM), which allows consumers to describe and rate the brands in their own terminology and thus relaxes the restrictive assumptions of traditional compositional mapping techniques regarding

  8. Identification of Coupled Map Lattice Based on Compressed Sensing

    Directory of Open Access Journals (Sweden)

    Dong Xie

    2016-01-01

    A novel approach for the parameter identification of coupled map lattices (CML) based on compressed sensing is presented in this paper. We establish a meaningful connection between these two seemingly unrelated study topics and identify the weighted parameters using the relevant recovery algorithms in compressed sensing. Specifically, we first transform the parameter identification problem of CML into the sparse recovery problem of an underdetermined linear system. Compressed sensing provides a feasible method to solve an underdetermined linear system if the sensing matrix satisfies some suitable conditions, such as the restricted isometry property (RIP) and mutual coherence. We then give a lower bound on the mutual coherence of the coefficient matrix generated by the observed values of the CML and also prove that it satisfies the RIP from a theoretical point of view. If the weighted vector of each element is sparse in the CML system, our proposed approach can recover all the weighted parameters using only about M samplings, which is far less than the number of lattice elements N. Another significant advantage is that the approach remains effective if the observed data are contaminated with certain types of noise. In the simulations, we mainly show the effects of the coupling parameter and noise on the recovery rate.
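
A one-dimensional CML of the kind whose coupling weights are identified above can be sketched with logistic local dynamics and diffusive nearest-neighbour coupling. The update rule and parameter values here are illustrative assumptions, not the paper's exact system.

```python
def logistic(x, r=4.0):
    """Local map of each lattice site; r=4 keeps the unit interval invariant."""
    return r * x * (1.0 - x)

def cml_step(state, eps=0.1):
    """Diffusively coupled update with periodic boundary: each site mixes its
    own logistic output with that of its two neighbours."""
    n = len(state)
    f = [logistic(x) for x in state]
    return [(1 - eps) * f[i] + eps / 2.0 * (f[i - 1] + f[(i + 1) % n])
            for i in range(n)]

state = [0.1, 0.3, 0.5, 0.7]
for _ in range(50):
    state = cml_step(state)
# the logistic map keeps values in [0, 1], and the coupled update is a convex
# combination of such values, so the whole lattice stays in [0, 1]
```

Parameter identification in this setting means recovering the coupling weights (here the fixed pattern 1-eps, eps/2, eps/2) from observed trajectories; sparsity of those weights per site is what makes the compressed-sensing formulation applicable.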

  9. Big Data: A Parallel Particle Swarm Optimization-Back-Propagation Neural Network Algorithm Based on MapReduce.

    Science.gov (United States)

    Cao, Jianfang; Cui, Hongyan; Shi, Hao; Jiao, Lijuan

    2016-01-01

    A back-propagation (BP) neural network can solve complicated random nonlinear mapping problems; therefore, it can be applied to a wide range of problems. However, as the sample size increases, the time required to train BP neural networks becomes lengthy. Moreover, the classification accuracy decreases as well. To improve the classification accuracy and runtime efficiency of the BP neural network algorithm, we proposed a parallel design and realization method for a particle swarm optimization (PSO)-optimized BP neural network based on MapReduce on the Hadoop platform. The PSO algorithm was used to optimize the BP neural network's initial weights and thresholds and improve the accuracy of the classification algorithm. The MapReduce parallel programming model was utilized to achieve parallel processing of the BP algorithm, thereby solving the problems of hardware and communication overhead when the BP neural network addresses big data. Datasets on 5 different scales were constructed using the scene image library from the SUN Database. The classification accuracy of the parallel PSO-BP neural network algorithm is approximately 92%, and the system efficiency is approximately 0.85, which presents obvious advantages when processing big data. The algorithm proposed in this study demonstrated both higher classification accuracy and improved time efficiency, representing a significant improvement obtained from applying parallel processing to an intelligent algorithm on big data.
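    The particle swarm step used here to seed the BP network's weights can be illustrated in isolation. The sketch below minimizes a stand-in objective (a sphere function rather than an actual BP training error); the swarm size, inertia, and acceleration constants are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):                       # stand-in for the BP training error
    return np.sum(x**2, axis=-1)

# Minimal particle swarm optimizer (hypothetical hyperparameters).
n_particles, dim, iters = 30, 5, 200
w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration constants

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), sphere(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = sphere(pos)
    better = val < pbest_val
    pbest[better], pbest_val[better] = pos[better], val[better]
    gbest = pbest[np.argmin(pbest_val)]

print(float(sphere(gbest)))
```

    In the full method, `gbest` would be unpacked into the BP network's initial weights and thresholds before gradient training begins.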

  10. Rapid Prototyping of a Map-Based Android App

    OpenAIRE

    Flanagan, Nicholas M; Theller, Eric; Theller, Larry

    2013-01-01

    This project provides a mobile app named “DriftWatch Pollinator Mapper” that allows beekeepers, apiary inspectors, and association staff to easily register and map a hive into the DriftWatch system, where local pesticide applicators will notice it and be aware of the presence of pollinators. The purpose of the mobile application is to speed the process of registering beekeepers within DriftWatch, since many beekeepers have significant trouble using only web-ba...

  11. Blanding’s Turtle (Emydoidea blandingii) Potential Habitat Mapping Using Aerial Orthophotographic Imagery and Object Based Classification

    Directory of Open Access Journals (Sweden)

    Douglas J. King

    2012-01-01

    Full Text Available Blanding’s turtle (Emydoidea blandingii) is a threatened species under Canada’s Species at Risk Act. In southern Québec, field-based inventories are ongoing to determine its abundance and potential habitat. The goal of this research was to develop means for mapping of potential habitat based on primary habitat attributes that can be detected with high-resolution remotely sensed imagery. Using existing spring leaf-off 20 cm resolution aerial orthophotos of a portion of Gatineau Park where some Blanding’s turtle observations had been made, habitat attributes were mapped at two scales: (1) whole wetlands; (2) within-wetland habitat features of open water, vegetation (used for camouflage and thermoregulation), and logs (used for spring sun-basking). The processing steps involved initial pixel-based classification to eliminate most areas of non-wetland, followed by object-based segmentations and classifications using a customized rule sequence to refine the wetland map and to map the within-wetland habitat features. Variables used as inputs to the classifications were derived from the orthophotos and included image brightness, texture, and segmented object shape and area. Independent validation using field data and visual interpretation showed classification accuracy for all habitat attributes to be generally over 90% with a minimum of 81.5% for the producer’s accuracy of logs. The maps for each attribute were combined to produce a habitat suitability map for Blanding’s turtle. Of the 115 existing turtle observations, 92.3% were closest to a wetland of the two highest suitability classes. High-resolution imagery combined with object-based classification and habitat suitability mapping methods such as those presented provide a much more spatially explicit representation of detailed habitat attributes than can be obtained through field work alone. They can complement field efforts to document and track turtle activities and can contribute to

  12. A Simple K-Map Based Variable Selection Scheme in the Direct ...

    African Journals Online (AJOL)

    A multiplexer with (n-1) data select inputs can realise directly a function of n variables. In this paper, a simple k-map based variable selection scheme is proposed such that an n variable logic function can be synthesised using a multiplexer with (n-q) data input variables and q data select variables. The procedure is based on ...

  13. Effectiveness of higher order thinking skills (HOTS) based i-Think map concept towards primary students

    Science.gov (United States)

    Ping, Owi Wei; Ahmad, Azhar; Adnan, Mazlini; Hua, Ang Kean

    2017-05-01

    Higher Order Thinking Skills (HOTS) is a new education reform concept based on Bloom's Taxonomy. The concept concentrates on student understanding in the learning process through students' own methods. HOTS questions train students to think creatively, critically and innovatively. The aim of this study was to identify students' proficiency in solving HOTS Mathematics questions by using the i-Think map. This research took place in Sabak Bernam, Selangor. The method applied was a quantitative approach involving approximately all of the standard five students. A pre- and post-test was conducted before and after the intervention of using the i-Think map to solve HOTS questions. The results indicate a significant improvement in the post-test, which proves that applying the i-Think map enhances students' ability to solve HOTS questions. Survey analysis showed that 90% of the students agreed that the i-Think map helped them analyse the questions carefully and use keywords in the map to solve them. In conclusion, this process helps students minimize mistakes when solving the questions. Therefore, teachers need to guide students in applying the appropriate i-Think map and methods for analyzing questions by finding keywords.

  14. Efficient crop type mapping based on remote sensing in the Central Valley, California

    Science.gov (United States)

    Zhong, Liheng

    Most agricultural systems in California's Central Valley are purposely flexible and intentionally designed to meet the demands of dynamic markets. Agricultural land use is also impacted by climate change and urban development. As a result, crops change annually and semiannually, which makes estimating agricultural water use difficult, especially given the existing method by which agricultural land use is identified and mapped. A minor portion of agricultural land is surveyed annually for land-use type, and every 5 to 8 years the entire valley is completely evaluated. So far no effort has been made to effectively and efficiently identify specific crop types on an annual basis in this area. The potential of satellite imagery to map agricultural land cover and estimate water usage in the Central Valley is explored. Efforts are made to minimize the cost and reduce the time of production during the mapping process. The land use change analysis shows that a remote sensing based mapping method is the only means to map the frequent change of major crop types. The traditional maximum likelihood classification approach is first utilized to map crop types to test the classification capacity of existing algorithms. High accuracy is achieved with sufficient ground truth data for training, and crop maps of moderate quality can be timely produced to facilitate a near-real-time water use estimate. However, the large set of ground truth data required by this method results in high costs in data collection. It is difficult to reduce the cost because a trained classification algorithm is not transferable between different years or different regions. A phenology based classification (PBC) approach is developed which extracts phenological metrics from annual vegetation index profiles and identifies crop types based on these metrics using decision trees. 
According to the comparison with traditional maximum likelihood classification, this phenology-based approach shows great advantages
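    The phenology-based idea, extracting metrics from annual vegetation-index profiles and branching on them, can be sketched with synthetic profiles. The crop classes, curve shapes, and thresholds below are illustrative assumptions, not the paper's fitted decision trees:

```python
import numpy as np

doy = np.arange(0, 365, 16)            # 16-day composite dates (day of year)

def ndvi_profile(peak_doy, amplitude, width):
    """Synthetic annual vegetation-index curve: Gaussian-shaped green-up."""
    return 0.2 + amplitude * np.exp(-((doy - peak_doy) / width) ** 2)

# Hypothetical crop classes with distinct phenology.
profiles = {
    "winter_grain": ndvi_profile(120, 0.6, 40),        # early-season peak
    "summer_row_crop": ndvi_profile(210, 0.6, 35),     # mid-summer peak
    "alfalfa": 0.2 + 0.4 * (1 + np.sin(doy / 20)) / 2,  # repeated cuttings
}

def metrics(vi):
    """Phenological metrics: peak timing, amplitude, green-up cycle count."""
    peak_doy = doy[np.argmax(vi)]
    amplitude = vi.max() - vi.min()
    cycles = int(np.sum(np.diff(np.sign(np.diff(vi))) < 0))  # local maxima
    return peak_doy, amplitude, cycles

def classify(vi):
    """Toy decision tree over the metrics (thresholds are illustrative)."""
    peak, amp, cycles = metrics(vi)
    if cycles >= 3:                    # multiple green-ups per year
        return "alfalfa"
    return "winter_grain" if peak < 160 else "summer_row_crop"

for crop, vi in profiles.items():
    print(crop, classify(vi))
```

    Because the rules operate on profile shape rather than raw reflectance values, a tree of this kind transfers between years more readily than a classifier trained on per-date imagery, which is the cost advantage the record describes.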

  15. Development of admixture mapping panels for African Americans from commercial high-density SNP arrays

    Directory of Open Access Journals (Sweden)

    Dunston Georgia M

    2010-07-01

    Full Text Available Abstract Background Admixture mapping is a powerful approach for identifying genetic variants involved in human disease that exploits the unique genomic structure in recently admixed populations. To use existing published panels of ancestry-informative markers (AIMs) for admixture mapping, markers have to be genotyped de novo for each admixed study sample and samples representing the ancestral parental populations. The increased availability of dense marker data on commercial chips has made it feasible to develop panels wherein the markers need not be predetermined. Results We developed two panels of AIMs (~2,000 markers each) based on the Affymetrix Genome-Wide Human SNP Array 6.0 for admixture mapping with African American samples. These two AIM panels had good map power that was higher than that of a denser panel of ~20,000 random markers as well as other published panels of AIMs. As a test case, we applied the panels in an admixture mapping study of hypertension in African Americans in the Washington, D.C. metropolitan area. Conclusions Developing marker panels for admixture mapping from existing genome-wide genotype data offers two major advantages: (1) no de novo genotyping needs to be done, thereby saving costs, and (2) markers can be filtered for various quality measures and replacement markers (to minimize gaps) can be selected at no additional cost. Panels of carefully selected AIMs have two major advantages over panels of random markers: (1) the map power from sparser panels of AIMs is higher than that of ~10-fold denser panels of random markers, and (2) clusters can be labeled based on information from the parental populations. With current technology, chip-based genome-wide genotyping is less expensive than genotyping ~20,000 random markers. The major advantage of using random markers is the absence of ascertainment effects resulting from the process of selecting markers. 
The ability to develop marker panels informative for ancestry from

  16. Mapping cell populations in flow cytometry data for cross‐sample comparison using the Friedman–Rafsky test statistic as a distance measure

    Science.gov (United States)

    Hsiao, Chiaowen; Liu, Mengya; Stanton, Rick; McGee, Monnie; Qian, Yu

    2015-01-01

    Abstract Flow cytometry (FCM) is a fluorescence‐based single‐cell experimental technology that is routinely applied in biomedical research for identifying cellular biomarkers of normal physiological responses and abnormal disease states. While many computational methods have been developed that focus on identifying cell populations in individual FCM samples, very few have addressed how the identified cell populations can be matched across samples for comparative analysis. This article presents FlowMap‐FR, a novel method for cell population mapping across FCM samples. FlowMap‐FR is based on the Friedman–Rafsky nonparametric test statistic (FR statistic), which quantifies the equivalence of multivariate distributions. As applied to FCM data by FlowMap‐FR, the FR statistic objectively quantifies the similarity between cell populations based on the shapes, sizes, and positions of fluorescence data distributions in the multidimensional feature space. To test and evaluate the performance of FlowMap‐FR, we simulated the kinds of biological and technical sample variations that are commonly observed in FCM data. The results show that FlowMap‐FR is able to effectively identify equivalent cell populations between samples under scenarios of proportion differences and modest position shifts. As a statistical test, FlowMap‐FR can be used to determine whether the expression of a cellular marker is statistically different between two cell populations, suggesting candidates for new cellular phenotypes by providing an objective statistical measure. In addition, FlowMap‐FR can indicate situations in which inappropriate splitting or merging of cell populations has occurred during gating procedures. We compared the FR statistic with the symmetric version of Kullback–Leibler divergence measure used in a previous population matching method with both simulated and real data. The FR statistic outperforms the symmetric version of KL‐distance in distinguishing
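    The Friedman–Rafsky statistic at the heart of FlowMap-FR can be sketched directly: pool two samples, build a Euclidean minimum spanning tree, and count the edges joining points from different samples; few cross-sample edges indicate differing distributions. The Gaussian toy data and sample sizes below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def mst_edges(points):
    """Prim's algorithm on the complete Euclidean graph; returns edge list."""
    n = len(points)
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    in_tree = np.zeros(n, bool)
    in_tree[0] = True
    best, parent, edges = d[0].copy(), np.zeros(n, int), []
    for _ in range(n - 1):
        j = int(np.argmin(np.where(in_tree, np.inf, best)))
        edges.append((parent[j], j))
        in_tree[j] = True
        closer = d[j] < best
        best[closer], parent[closer] = d[j][closer], j
    return edges

def fr_cross_edges(x, y):
    """Friedman-Rafsky: MST edges that join points from different samples."""
    pooled = np.vstack([x, y])
    labels = np.r_[np.zeros(len(x)), np.ones(len(y))]
    return sum(labels[a] != labels[b] for a, b in mst_edges(pooled))

# Equivalent "cell populations" share many cross edges; shifted ones do not.
same = fr_cross_edges(rng.normal(0, 1, (50, 2)), rng.normal(0, 1, (50, 2)))
diff = fr_cross_edges(rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2)))
print(same, diff)
```

    Under the null hypothesis of equal distributions the expected cross-edge count is roughly 2·n1·n2/(n1+n2); a count far below that is evidence the two populations differ.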

  17. Geologic map of the Zarkashan-Anguri copper and gold deposits, Ghazni Province, Afghanistan, modified from the 1968 original map compilation of E.P. Meshcheryakov and V.P. Sayapin

    Science.gov (United States)

    Peters, Stephen G.; Stettner, Will R.; Masonic, Linda M.; Moran, Thomas W.

    2011-01-01

    This map is a modified version of Geological map of the area of Zarkashan-Anguri gold deposits, scale 1:50,000, which was compiled by E.P. Meshcheryakov and V.P. Sayapin in 1968. Scientists from the U.S. Geological Survey, in cooperation with the Afghan Geological Survey and the Task Force for Business and Stability Operations of the U.S. Department of Defense, studied the original document and related reports and also visited the field area in April 2010. This modified map, which includes a cross section, illustrates the geologic setting of the Zarkashan-Anguri copper and gold deposits. The map reproduces the topology (contacts, faults, and so forth) of the original Soviet map and cross section and includes modifications based on our examination of that and other documents, and based on observations made and sampling undertaken during our field visit. (Refer to the Introduction and the References in the Map PDF for an explanation of our methodology and for complete citations of the original map and related reports.) Elevations on the cross section are derived from the original Soviet topography and may not match the newer topography used on the current map.

  18. Introducing students to digital geological mapping: A workflow based on cheap hardware and free software

    Science.gov (United States)

    Vrabec, Marko; Dolžan, Erazem

    2016-04-01

    The undergraduate field course in Geological Mapping at the University of Ljubljana involves 20-40 students per year, which precludes the use of specialized rugged digital field equipment as the costs would be way beyond the capabilities of the Department. A different mapping area is selected each year with the aim to provide typical conditions that a professional geologist might encounter when doing fieldwork in Slovenia, which includes rugged relief, dense tree cover, and moderately-well- to poorly-exposed bedrock due to vegetation and urbanization. It is therefore mandatory that the digital tools and workflows are combined with classical methods of fieldwork, since, for example, full-time precise GNSS positioning is not viable under such circumstances. Additionally, due to the prevailing combination of complex geological structure with generally poor exposure, students cannot be expected to produce line (vector) maps of geological contacts on the go, so there is no need for such functionality in hardware and software that we use in the field. Our workflow therefore still relies on paper base maps, but is strongly complemented with digital tools to provide robust positioning, track recording, and acquisition of various point-based data. Primary field hardware are students' Android-based smartphones and optionally tablets. For our purposes, the built-in GNSS chips provide adequate positioning precision most of the time, particularly if they are GLONASS-capable. We use Oruxmaps, a powerful free offline map viewer for the Android platform, which facilitates the use of custom-made geopositioned maps. For digital base maps, which we prepare in free Windows QGIS software, we use scanned topographic maps provided by the National Geodetic Authority, but also other maps such as aerial imagery, processed Digital Elevation Models, scans of existing geological maps, etc. Point data, like important outcrop locations or structural measurements, are entered into Oruxmaps as

  19. An annotated genetic map of loblolly pine based on microsatellite and cDNA markers

    Science.gov (United States)

    Previous loblolly pine (Pinus taeda L.) genetic linkage maps have been based on a variety of DNA polymorphisms, such as AFLPs, RAPDs, RFLPs, and ESTPs, but only a few SSRs (simple sequence repeats), also known as simple tandem repeats or microsatellites, have been mapped in P. taeda. The objective o...

  20. Automated Plantation Mapping in Indonesia Using Remote Sensing Data

    Science.gov (United States)

    Karpatne, A.; Jia, X.; Khandelwal, A.; Kumar, V.

    2017-12-01

    Plantation mapping is critical for understanding and addressing deforestation, a key driver of climate change and ecosystem degradation. Unfortunately, most plantation maps are limited to small areas for specific years because they rely on visual inspection of imagery. In this work, we propose a data-driven approach which automatically generates yearly plantation maps for large regions using MODIS multi-spectral data. While traditional machine learning algorithms face manifold challenges in this task, e.g. imperfect training labels, spatio-temporal data heterogeneity, noisy and high-dimensional data, lack of evaluation data, etc., we introduce a novel deep learning-based framework that combines existing imperfect plantation products as training labels and models the spatio-temporal relationships of land covers. We also explore post-processing steps based on a Hidden Markov Model that further improve the detection accuracy. We then conduct an extensive evaluation of the generated plantation maps. Specifically, by randomly sampling and comparing with high-resolution Digital Globe imagery, we demonstrate that the generated plantation maps achieve both high precision and high recall. When compared with existing plantation mapping products, our detection can avoid both false positives and false negatives. Finally, we utilize the generated plantation maps in analyzing the relationship between forest fires and the growth of plantations, which assists in better understanding the causes of deforestation in Indonesia.
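    The Hidden Markov Model post-processing idea, smoothing each pixel's yearly label sequence using the fact that land cover rarely flips back and forth, can be sketched with a two-state Viterbi decoder. The transition and emission probabilities below are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

# Viterbi smoothing of a noisy yearly label sequence (forest vs plantation).
# Persistence-heavy transitions encode that land cover rarely flips yearly.
states = ["forest", "plantation"]
trans = np.log(np.array([[0.95, 0.05],
                         [0.05, 0.95]]))
emit = np.log(np.array([[0.8, 0.2],      # P(observed label | true state)
                        [0.2, 0.8]]))

def viterbi(obs):
    n = len(obs)
    v, back = np.zeros((n, 2)), np.zeros((n, 2), int)
    v[0] = np.log(0.5) + emit[:, obs[0]]          # uniform prior
    for t in range(1, n):
        for s in range(2):
            scores = v[t - 1] + trans[:, s]
            back[t, s] = int(np.argmax(scores))
            v[t, s] = scores.max() + emit[s, obs[t]]
    path = [int(np.argmax(v[-1]))]
    for t in range(n - 1, 0, -1):                 # backtrack
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# A spurious one-year "forest" blip inside a plantation run gets smoothed.
obs = [0, 0, 0, 1, 1, 0, 1, 1, 1]
print([states[s] for s in viterbi(obs)])
```

    With these settings the decoder relabels the single-year blip as plantation while keeping the genuine forest-to-plantation transition in year four, which is exactly the kind of temporal cleanup the record attributes to its HMM step.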

  1. Performance of non-parametric algorithms for spatial mapping of tropical forest structure

    Directory of Open Access Journals (Sweden)

    Liang Xu

    2016-08-01

    Full Text Available Abstract Background Mapping tropical forest structure is a critical requirement for accurate estimation of emissions and removals from land use activities. With the availability of a wide range of remote sensing imagery of vegetation characteristics from space, development of finer resolution and more accurate maps has advanced in recent years. However, the mapping accuracy relies heavily on the quality of input layers, the algorithm chosen, and the size and quality of inventory samples for calibration and validation. Results By using airborne lidar data as the “truth” and focusing on the mean canopy height (MCH) as a key structural parameter, we test two commonly-used non-parametric techniques, maximum entropy (ME) and random forest (RF), for developing maps over a study site in Central Gabon. Results of mapping show that both approaches have improved accuracy with more input layers in mapping canopy height at 100 m (1-ha) pixels. The bias-corrected spatial models further improve estimates for small and large trees across the tails of height distributions with a trade-off in increasing overall mean squared error that can be readily compensated by increasing the sample size. Conclusions A significant improvement in tropical forest mapping can be achieved by weighting the number of inventory samples against the choice of image layers and the non-parametric algorithms. Without future satellite observations with better sensitivity to forest biomass, the maps based on existing data will remain slightly biased towards the mean of the distribution, under-estimating the upper and over-estimating the lower tails of the distribution.

  2. Sample Based Unit Liter Dose Estimates

    International Nuclear Information System (INIS)

    JENSEN, L.

    2000-01-01

    The Tank Waste Characterization Program has taken many core samples, grab samples, and auger samples from the single-shell and double-shell tanks during the past 10 years. Consequently, the amount of sample data available has increased, both in terms of quantity of sample results and the number of tanks characterized. More and better data are available than when the current radiological and toxicological source terms used in the Basis for Interim Operation (BIO) (FDH 1999a) and the Final Safety Analysis Report (FSAR) (FDH 1999b) were developed. The Nuclear Safety and Licensing (NS and L) organization wants to use the new data to upgrade the radiological and toxicological source terms used in the BIO and FSAR. The NS and L organization requested assistance in producing a statistically based process for developing the source terms. This report describes the statistical techniques used and the assumptions made to support the development of a new radiological source term for liquid and solid wastes stored in single-shell and double-shell tanks. The results given in this report are a revision of similar results given in an earlier version of the document (Jensen and Wilmarth 1999). The main difference between the results in this document and the earlier version is that the dose conversion factors (DCF) for converting μCi/g or μCi/L to Sv/L (sieverts per liter) have changed. There are now two DCFs, one based on ICRP-68 and one based on ICRP-71 (Brevick 2000)

  3. Video-based Mobile Mapping System Using Smartphones

    Science.gov (United States)

    Al-Hamad, A.; Moussa, A.; El-Sheimy, N.

    2014-11-01

    The last two decades have witnessed a huge growth in the demand for geo-spatial data. This demand has encouraged researchers around the world to develop new algorithms and design new mapping systems in order to obtain reliable sources for geo-spatial data. Mobile Mapping Systems (MMS) are one of the main sources of mapping and Geographic Information Systems (GIS) data. MMS integrate various remote sensing sensors, such as cameras and LiDAR, along with navigation sensors to provide the 3D coordinates of points of interest from moving platforms (e.g. cars, airplanes, etc.). Although MMS can provide accurate mapping solutions for different GIS applications, the cost of these systems is not affordable for many users, and only large-scale companies and institutions can benefit from them. The main objective of this paper is to propose a new low-cost MMS with reasonable accuracy using the sensors available in smartphones and their video cameras. Using the smartphone video camera, instead of capturing individual images, makes the system easier to use for non-professional users, since the system automatically extracts the highly overlapping frames out of the video without user intervention. Results of the proposed system are presented which demonstrate the effect of the number of images used on the mapping solution. In addition, the accuracy of the mapping results obtained from capturing a video is compared to the results obtained from using separately captured images instead of video.

  4. The Global Trachoma Mapping Project: Methodology of a 34-Country Population-Based Study

    Science.gov (United States)

    Solomon, Anthony W.; Pavluck, Alexandre L.; Courtright, Paul; Aboe, Agatha; Adamu, Liknaw; Alemayehu, Wondu; Alemu, Menbere; Alexander, Neal D. E.; Kello, Amir Bedri; Bero, Berhanu; Brooker, Simon J.; Chu, Brian K.; Dejene, Michael; Emerson, Paul M.; Flueckiger, Rebecca M.; Gadisa, Solomon; Gass, Katherine; Gebre, Teshome; Habtamu, Zelalem; Harvey, Erik; Haslam, Dominic; King, Jonathan D.; Mesurier, Richard Le; Lewallen, Susan; Lietman, Thomas M.; MacArthur, Chad; Mariotti, Silvio P.; Massey, Anna; Mathieu, Els; Mekasha, Addis; Millar, Tom; Mpyet, Caleb; Muñoz, Beatriz E.; Ngondi, Jeremiah; Ogden, Stephanie; Pearce, Joseph; Sarah, Virginia; Sisay, Alemayehu; Smith, Jennifer L.; Taylor, Hugh R.; Thomson, Jo; West, Sheila K.; Willis, Rebecca; Bush, Simon; Haddad, Danny; Foster, Allen

    2015-01-01

    ABSTRACT Purpose: To complete the baseline trachoma map worldwide by conducting population-based surveys in an estimated 1238 suspected endemic districts of 34 countries. Methods: A series of national and sub-national projects owned, managed and staffed by ministries of health, conduct house-to-house cluster random sample surveys in evaluation units, which generally correspond to “health district” size: populations of 100,000–250,000 people. In each evaluation unit, we invite all residents aged 1 year and older from h households in each of c clusters to be examined for clinical signs of trachoma, where h is the number of households that can be seen by 1 team in 1 day, and the product h × c is calculated to facilitate recruitment of 1019 children aged 1–9 years. In addition to individual-level demographic and clinical data, household-level water, sanitation and hygiene data are entered into the purpose-built LINKS application on Android smartphones, transmitted to the Cloud, and cleaned, analyzed and ministry-of-health-approved via a secure web-based portal. The main outcome measures are the evaluation unit-level prevalence of follicular trachoma in children aged 1–9 years, prevalence of trachomatous trichiasis in adults aged 15 + years, percentage of households using safe methods for disposal of human feces, and percentage of households with proximate access to water for personal hygiene purposes. Results: In the first year of fieldwork, 347 field teams commenced work in 21 projects in 7 countries. Conclusion: With an approach that is innovative in design and scale, we aim to complete baseline mapping of trachoma throughout the world in 2015. PMID:26158580

  5. Generation and Assessment of Urban Land Cover Maps Using High-Resolution Multispectral Aerial Images

    DEFF Research Database (Denmark)

    Höhle, Joachim; Höhle, Michael

    2013-01-01

    a unique method for the automatic generation of urban land cover maps. In the present paper, imagery of a new medium-format aerial camera and advanced geoprocessing software are applied to derive normalized digital surface models and vegetation maps. These two intermediate products then become input...... to a tree structured classifier, which automatically derives land cover maps in 2D or 3D. We investigate the thematic accuracy of the produced land cover map by a class-wise stratified design and provide a method for deriving necessary sample sizes. Corresponding survey adjusted accuracy measures...... and their associated confidence intervals are used to adequately reflect uncertainty in the assessment based on the chosen sample size. Proof of concept for the method is given for an urban area in Switzerland. Here, the produced land cover map with six classes (building, wall and carport, road and parking lot, hedge...

  6. New approach based on fuzzy logic and principal component analysis for the classification of two-dimensional maps in health and disease. Application to lymphomas.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Righetti, Pier Giorgio; Antonucci, Francesca

    2003-07-04

    Two-dimensional (2D) electrophoresis is the most widespread technique for the separation of proteins in biological systems. This technique produces 2D maps of high complexity, which creates difficulties in the comparison of different samples. The method proposed in this paper for the comparison of different 2D maps can be summarised in four steps: (a) digitalisation of the image; (b) fuzzyfication of the digitalised map in order to account for the variability of the two-dimensional electrophoretic separation; (c) decoding by principal component analysis of the previously obtained fuzzy maps, in order to reduce the system dimensionality; (d) classification analysis (linear discriminant analysis), in order to separate the samples according to the classes present in the dataset. This method was applied to a dataset constituted by eight samples: four belonging to healthy human lymph-nodes and four deriving from non-Hodgkin lymphomas. The amount of fuzzyfication of the original map is governed by the sigma parameter: the larger the value, the more fuzzy the resulting transformed map. The effect of the fuzzyfication parameter was investigated, the optimal results being obtained for sigma = 1.75 and 2.25. Principal component analysis and linear discriminant analysis allowed the separation of the two classes of samples without any misclassification.
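    Steps (b)-(d) of the pipeline can be sketched on synthetic spot maps: Gaussian blurring plays the role of fuzzyfication, an SVD supplies the principal components, and a two-class Fisher-style discriminant separates the groups. The map sizes, spot placements, and amplitudes below are illustrative assumptions, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(4)

def gaussian_blur(img, sigma):
    """Separable Gaussian filter: the 'fuzzyfication' of a digitised spot map."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    img = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 0, img)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 1, img)

def spot_map(class_spot):
    """A 40x40 map: one strong class-specific spot plus random background spots."""
    img = np.zeros((40, 40))
    img[class_spot] = 3.0
    for _ in range(5):
        img[tuple(rng.integers(0, 40, 2))] += 1.0
    return gaussian_blur(img, sigma=1.75)

# Eight samples in two classes, matching the dataset size in the record.
maps = np.array([spot_map((10, 10)) for _ in range(4)] +
                [spot_map((30, 30)) for _ in range(4)])
labels = np.array([0] * 4 + [1] * 4)

# (c) PCA by SVD on the flattened, mean-centred fuzzy maps.
X = maps.reshape(8, -1)
X = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt[:5].T               # keep 5 principal components

# (d) Two-class discriminant along the direction between class means.
m0, m1 = scores[labels == 0].mean(0), scores[labels == 1].mean(0)
proj = scores @ (m1 - m0)
threshold = (proj[labels == 0].mean() + proj[labels == 1].mean()) / 2
pred = (proj > threshold).astype(int)
print((pred == labels).all())
```

    The blur makes spot positions tolerant to small run-to-run shifts, which is exactly why the paper fuzzyfies the maps before comparing them.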

  7. GenomeVx: simple web-based creation of editable circular chromosome maps.

    Science.gov (United States)

    Conant, Gavin C; Wolfe, Kenneth H

    2008-03-15

    We describe GenomeVx, a web-based tool for making editable, publication-quality, maps of mitochondrial and chloroplast genomes and of large plasmids. These maps show the location of genes and chromosomal features as well as a position scale. The program takes as input either raw feature positions or GenBank records. In the latter case, features are automatically extracted and colored, an example of which is given. Output is in the Adobe Portable Document Format (PDF) and can be edited by programs such as Adobe Illustrator. GenomeVx is available at http://wolfe.gen.tcd.ie/GenomeVx

  8. Model-Based Geostatistical Mapping of the Prevalence of Onchocerca volvulus in West Africa.

    Directory of Open Access Journals (Sweden)

    Simon J O'Hanlon

    2016-01-01

    Full Text Available The initial endemicity (pre-control) prevalence of onchocerciasis has been shown to be an important determinant of the feasibility of elimination by mass ivermectin distribution. We present the first geostatistical map of microfilarial prevalence in the former Onchocerciasis Control Programme in West Africa (OCP) before commencement of antivectorial and antiparasitic interventions. Pre-control microfilarial prevalence data from 737 villages across the 11 constituent countries in the OCP epidemiological database were used as ground-truth data. These 737 data points, plus a set of statistically selected environmental covariates, were used in a Bayesian model-based geostatistical (B-MBG) approach to generate a continuous surface (at a pixel resolution of 5 km x 5 km) of microfilarial prevalence in West Africa prior to the commencement of the OCP. Uncertainty in model predictions was measured using a suite of validation statistics, performed on bootstrap samples of held-out validation data. The mean Pearson's correlation between observed and estimated prevalence at validation locations was 0.693; the mean prediction error (average difference between observed and estimated values) was 0.77%, and the mean absolute prediction error (average magnitude of difference between observed and estimated values) was 12.2%. Within OCP boundaries, 17.8 million people were deemed to have been at risk, 7.55 million to have been infected, and mean microfilarial prevalence to have been 45% (range: 2-90%) in 1975. This is the first map of initial onchocerciasis prevalence in West Africa using B-MBG. Important environmental predictors of infection prevalence were identified and used in a model out-performing those without spatial random effects or environmental covariates. Results may be compared with recent epidemiological mapping efforts to find areas of persisting transmission. These methods may be extended to areas where data are sparse, and may be used to help inform the

  9. Visual map and instruction-based bicycle navigation: a comparison of effects on behaviour.

    Science.gov (United States)

    de Waard, Dick; Westerhuis, Frank; Joling, Danielle; Weiland, Stella; Stadtbäumer, Ronja; Kaltofen, Leonie

    2017-09-01

    Cycling with a classic paper map was compared with navigating with a moving map displayed on a smartphone, and with auditory and visual turn-by-turn route guidance. Spatial skills were found to be related to navigation performance, though only when navigating from a paper or electronic map, not with turn-by-turn (instruction-based) navigation. While navigating, cyclists fixated on the devices that present visual information 25% of the time. Navigating from a paper map required the most mental effort, and both young and older cyclists preferred electronic over paper map navigation. In particular, a dedicated turn-by-turn guidance device was favoured. Visual maps are particularly useful for cyclists with higher spatial skills. Turn-by-turn information is used by all cyclists, and it is useful to make these directions available in all devices. Practitioner Summary: Electronic navigation devices are preferred over a paper map. People with lower spatial skills benefit most from turn-by-turn guidance information, presented either auditorily or on a dedicated device. People with higher spatial skills perform well with all devices. Keep in mind that all users benefit from turn-by-turn information when developing a navigation device for cyclists.

  10. Map-based cloning and expression analysis of BMR-6 in sorghum

    Indian Academy of Sciences (India)

    CAD), using a map-based cloning approach. Genetic complementation confirmed that CAD is responsible for the BMR-6 phenotype. The BMR-6 gene was expressed in all tested sorghum tissues, with the highest expression in the midrib and stem. Transient ...

  11. Utility assessment of a map-based online geo-collaboration tool.

    Science.gov (United States)

    Sidlar, Christopher L; Rinner, Claus

    2009-05-01

    Spatial group decision-making processes often include both informal and analytical components. Discussions among stakeholders or planning experts are an example of an informal component. When participants discuss spatial planning projects they typically express concerns and comments by pointing to places on a map. The Argumentation Map model provides a conceptual basis for collaborative tools that enable explicit linkages of arguments to the places to which they refer. These tools allow for the input of explicitly geo-referenced arguments as well as the visual access to arguments through a map interface. In this paper, we will review previous utility studies in geo-collaboration and evaluate a case study of a Web-based Argumentation Map application. The case study was conducted in the summer of 2005 when student participants discussed planning issues on the University of Toronto St. George campus. During a one-week unmoderated discussion phase, 11 participants wrote 60 comments on issues such as safety, facilities, parking, and building aesthetics. By measuring the participants' use of geographic references, we draw conclusions on how well the software tool supported the potential of the underlying concept. This research aims to contribute to a scientific approach to geo-collaboration in which the engineering of novel spatial decision support methods is complemented by a critical assessment of their utility in controlled, realistic experiments.

  12. The MAPS based PXL vertex detector for the STAR experiment

    International Nuclear Information System (INIS)

    Contin, G.; Anderssen, E.; Greiner, L.; Silber, J.; Stezelberger, T.; Vu, C.; Wieman, H.; Woodmansee, S.; Schambach, J.; Sun, X.; Szelezniak, M.

    2015-01-01

    The Heavy Flavor Tracker (HFT) was installed in the STAR experiment for the 2014 heavy ion run of RHIC. Designed to improve the vertex resolution and extend the measurement capabilities in the heavy flavor domain, the HFT is composed of three different silicon detectors based on CMOS monolithic active pixel sensors (MAPS), pads, and strips, respectively, arranged in four concentric cylinders close to the STAR interaction point. The two innermost HFT layers are placed at radii of 2.7 and 8 cm from the beam line, respectively, and accommodate 400 ultra-thin (50 μm) high-resolution MAPS sensors arranged in 10-sensor ladders to cover a total silicon area of 0.16 m². Each sensor includes a pixel array of 928 rows and 960 columns with a 20.7 μm pixel pitch, providing a sensitive area of ∼3.8 cm². The architecture is based on a column-parallel readout with amplification and correlated double sampling inside each pixel. Each column is terminated with a high-precision discriminator, is read out in rolling shutter mode, and the output is processed through an integrated zero-suppression logic. The results are stored in two SRAMs with a ping-pong arrangement for continuous readout. The sensor features a 185.6 μs readout time and 170 mW/cm² power dissipation. The detector is air-cooled, allowing a global material budget as low as 0.39% on the inner layer. A novel mechanical approach to detector insertion enables effective installation and integration of the pixel layers within an 8-hour shift during the on-going STAR run. In addition to a detailed description of the detector characteristics, the experience of the first months of data taking will be presented in this paper, with a particular focus on sensor threshold calibration, latch-up protection procedures and general system operations aimed at stabilizing the running conditions. Issues faced during the 2014 run will be discussed together with the implemented solutions. A preliminary analysis of the detector

  13. Assessment of Photogrammetric Mapping Accuracy Based on Variation Flying Altitude Using Unmanned Aerial Vehicle

    International Nuclear Information System (INIS)

    Udin, W S; Ahmad, A

    2014-01-01

    Photogrammetry is the earliest technique used to collect data for topographic mapping. A recent development in aerial photogrammetry is the use of large-format digital aerial cameras for producing topographic maps. The aerial photograph can be in the form of metric or non-metric imagery. The cost of mapping using aerial photogrammetry is very high. In certain applications, there is a need to map a small area with a limited budget. Owing to developments in technology, small-format aerial photogrammetry has been introduced and offers many advantages. Currently, a digital map can be extracted from digital aerial imagery of a small-format camera mounted on a light-weight platform such as an unmanned aerial vehicle (UAV). This study utilizes a UAV system for large-scale stream mapping. The first objective of this study is to investigate the use of a light-weight rotary-wing UAV for stream mapping based on different flying heights. Aerial photographs were acquired at 60% forward lap and 30% sidelap specifications. Ground control points and check points were established using the Total Station technique. The digital camera attached to the UAV was calibrated and the recovered camera calibration parameters were then used in the digital image processing. The second objective is to determine the accuracy of the photogrammetric output. In this study, photogrammetric outputs such as the stereomodel in three dimensions (3D), contour lines, a digital elevation model (DEM) and an orthophoto were produced from a small stream 200 m long and 10 m wide. The research output is evaluated for planimetric and vertical accuracy using the root mean square error (RMSE). Based on the findings, sub-meter accuracy is achieved and the RMSE value decreases as the flying height increases; the difference is relatively small. Finally, this study shows that the UAV is a very useful platform for obtaining aerial photographs for photogrammetric mapping and other applications.
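
The RMSE computation used in the accuracy assessment can be sketched as follows; the check-point residuals are hypothetical, chosen only to mirror the reported sub-meter behaviour:

```python
import math

def rmse(errors):
    """Root mean square error of residuals at check points."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical vertical residuals (metres) between photogrammetric
# coordinates and Total Station check points at two flying heights.
dz_low = [0.12, -0.08, 0.05, -0.11, 0.09]
dz_high = [0.07, -0.05, 0.04, -0.06, 0.03]
print(rmse(dz_low), rmse(dz_high))  # both sub-metre
```

The same function applies to planimetric residuals by passing easting/northing differences instead.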

  14. An Improved Consensus Linkage Map of Barley Based on Flow-Sorted Chromosomes and Single Nucleotide Polymorphism Markers

    Directory of Open Access Journals (Sweden)

    María Muñoz-Amatriaín

    2011-11-01

    Recent advances in high-throughput genotyping have made it easier to combine information from different mapping populations into consensus genetic maps, which provide increased marker density and genome coverage compared to individual maps. Previously, a single nucleotide polymorphism (SNP)-based genotyping platform was developed and used to genotype 373 individuals in four barley (Hordeum vulgare L.) mapping populations. This led to a 2943-SNP consensus genetic map with 975 unique positions. In this work, we add data from six additional populations and more individuals from one of the original populations to develop an improved consensus map from 1133 individuals. A stringent and systematic analysis of each of the 10 populations was performed to achieve uniformity. This involved reexamination of the four populations included in the previous map. As a consequence, we present a robust consensus genetic map that contains 2994 SNP loci mapped to 1163 unique positions. The map spans 1137.3 cM with an average density of one marker bin per 0.99 cM. A novel application of the genotyping platform for gene detection allowed the assignment of 2930 genes to flow-sorted chromosomes or arms, confirmed the position of 2545 SNP-mapped loci, added chromosome or arm allocations to an additional 370 SNP loci, and delineated pericentromeric regions for chromosomes 2H to 7H. Marker order has been improved and map resolution has been increased by almost 20%. This increased precision enables more optimized SNP selection for marker-assisted breeding and supports association genetic analysis and map-based cloning. It will also improve the anchoring of DNA sequence scaffolds and the barley physical map to the genetic map.

  15. Resection of highly language-eloquent brain lesions based purely on rTMS language mapping without awake surgery.

    Science.gov (United States)

    Ille, Sebastian; Sollmann, Nico; Butenschoen, Vicki M; Meyer, Bernhard; Ringel, Florian; Krieg, Sandro M

    2016-12-01

    The resection of left-sided perisylvian brain lesions harbours the risk of postoperative language impairment. Therefore the individual patient's language distribution is investigated by intraoperative direct cortical stimulation (DCS) during awake surgery. Yet, not all patients qualify for awake surgery. Non-invasive language mapping by repetitive navigated transcranial magnetic stimulation (rTMS) has frequently shown a high correlation in comparison with the results of DCS language mapping in terms of language-negative brain regions. The present study analyses the extent of resection (EOR) and functional outcome of patients who underwent left-sided perisylvian resection of brain lesions based purely on rTMS language mapping. Four patients with left-sided perisylvian brain lesions (two gliomas WHO III, one glioblastoma, one cavernous angioma) underwent rTMS language mapping prior to surgery. Data from rTMS language mapping and rTMS-based diffusion tensor imaging fibre tracking (DTI-FT) were transferred to the intraoperative neuronavigation system. Preoperatively, 5 days after surgery (POD5), and 3 months after surgery (POM3) clinical follow-up examinations were performed. No patient suffered from a new surgery-related aphasia at POM3. Three patients underwent complete resection immediately, while one patient required a second rTMS-based resection some days later to achieve the final, complete resection. The present study shows for the first time the feasibility of successfully resecting language-eloquent brain lesions based purely on the results of negative language maps provided by rTMS language mapping and rTMS-based DTI-FT. In very select cases, this technique can provide a rescue strategy with an optimal functional outcome and EOR when awake surgery is not feasible.

  16. Regional soil erosion assessment based on a sample survey and geostatistics

    Science.gov (United States)

    Yin, Shuiqing; Zhu, Zhengyuan; Wang, Li; Liu, Baoyuan; Xie, Yun; Wang, Guannan; Li, Yishan

    2018-03-01

    Soil erosion is one of the most significant environmental problems in China. From 2010 to 2012, the fourth national census for soil erosion sampled 32 364 PSUs (Primary Sampling Units, small watersheds) with areas of 0.2-3 km². Land use and soil erosion controlling factors, including rainfall erosivity, soil erodibility, slope length, slope steepness, biological practice, engineering practice, and tillage practice, were surveyed for the PSUs, and the soil loss rate for each land use in the PSUs was estimated using an empirical model, the Chinese Soil Loss Equation (CSLE). Although the information collected from the sample units can be aggregated to estimate soil erosion conditions on a large scale, the problem of estimating soil erosion conditions on a regional scale has not been addressed well. The aim of this study is to introduce a new model-based regional soil erosion assessment method combining a sample survey and geostatistics. We compared seven spatial interpolation models based on the bivariate penalized spline over triangulation (BPST) method to generate a regional soil erosion assessment from the PSUs. Shaanxi Province (3116 PSUs) in China was selected for the comparison and assessment as it is one of the areas with the most serious erosion problems. Ten-fold cross-validation based on the PSU data showed that the model assisted by the land use, rainfall erosivity factor (R), soil erodibility factor (K), slope steepness factor (S), and slope length factor (L) derived from a 1:10 000 topography map is the best one, with a model efficiency coefficient (ME) of 0.75 and an MSE 55.8% of that for the model assisted by land use alone. Among the four erosion factors as covariates, the S factor contributed the most information, followed by the K and L factors, while the R factor made almost no contribution to the spatial estimation of soil loss. The LS factor derived from 30 or 90 m Shuttle Radar Topography Mission (SRTM) digital elevation model (DEM) data
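
A minimal sketch of the ten-fold cross-validation bookkeeping and the model efficiency coefficient (ME) used to rank the interpolation models; the round-robin fold assignment is a simplification of whatever sampling design the study used:

```python
def model_efficiency(observed, predicted):
    """Model efficiency coefficient: ME = 1 - SSE/SST, where SST is the
    total sum of squares about the observed mean (1.0 = perfect model)."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

def k_fold_indices(n, k=10):
    """Assign n PSU indices to k folds, round-robin, for cross-validation."""
    folds = [[] for _ in range(k)]
    for i in range(n):
        folds[i % k].append(i)
    return folds
```

Each fold is held out in turn, the model is fit on the remaining PSUs, and ME is computed on the held-out soil loss rates.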

  17. Construction and Implementation of Teaching Mode for Digital Mapping based on Interactive Micro-course Technology

    Directory of Open Access Journals (Sweden)

    Ning Gao

    2018-02-01

    The era of "Internet + education" has caused reforms in teaching ideas, teaching modes, and learning styles. The emergence of micro-course technology provides new strategies for integrating learning styles. Task-driven teaching of digital mapping, the traditional form of classroom organization, has a poor teaching effect owing to its single learning style and strategy. A new teaching mode for digital mapping was constructed in this study based on micro-course technology by combining interactive micro-course technology and digital mapping teaching to meet the demands of modern teaching. This teaching mode comprises four modules, namely micro-courseware, micro-video, micro-exercise, and micro-examination. It realizes the hierarchical teaching of knowledge points in the digital mapping course, simplification of basic principles, simulation of engineering cases, and self-evaluation of learning outcomes. The teaching mode was applied to 114 students from the Mapping Engineering Department of Henan University of Urban Construction. Results indicate that the proposed teaching mode based on interactive micro-course technology promotes the students' independent after-class learning, stimulates their learning enthusiasm, enhances their practical abilities, and improves the teaching effect. This teaching mode provides a new concept for the reform of other courses in mapping engineering.

  18. Spatio-Temporal Story Mapping Animation Based On Structured Causal Relationships Of Historical Events

    Science.gov (United States)

    Inoue, Y.; Tsuruoka, K.; Arikawa, M.

    2014-04-01

    In this paper, we propose a user interface that displays visual animations on geographic maps and timelines to depict historical stories, representing causal relationships among events as time series. We have been developing an experimental software system for the spatio-temporal visualization of historical stories on tablet computers. Our proposed system helps people learn historical stories effectively through visual animations based on hierarchical structures of maps and timelines at different scales.

  19. On the security of a novel key agreement protocol based on chaotic maps

    International Nuclear Information System (INIS)

    Xiang Tao; Wong, K.-W.; Liao Xiaofeng

    2009-01-01

    Recently, Xiao et al. proposed a novel key agreement protocol based on the Chebyshev chaotic map. In this paper, the security of the protocol is analyzed, and two attack methods are found in different scenarios. The essential principle of Xiao et al.'s scheme is summarized. It is also pointed out, with proof, that any attempt along this line to improve the security of the Chebyshev map is redundant.
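
Key agreement of this family rests on the semigroup property of Chebyshev polynomials, T_r(T_s(x)) = T_rs(x) = T_s(T_r(x)); the sketch below illustrates it numerically for |x| ≤ 1 (the known attacks exploit the fact that many different exponents map x to the same value):

```python
import math

def chebyshev(n: int, x: float) -> float:
    """Chebyshev polynomial T_n(x) for |x| <= 1, via T_n(x) = cos(n*arccos(x))."""
    return math.cos(n * math.acos(x))

# Semigroup property underlying the Diffie-Hellman-like exchange:
# both parties arrive at T_{rs}(x) from the public value x.
x = 0.53           # public seed
r, s = 7, 11       # private exponents of the two parties
shared_a = chebyshev(r, chebyshev(s, x))
shared_b = chebyshev(s, chebyshev(r, x))
assert abs(shared_a - chebyshev(r * s, x)) < 1e-9
assert abs(shared_a - shared_b) < 1e-9
```

The exponents and seed here are toy values for illustration; real protocols use large integers, and the paper's point is that this structure itself admits attacks.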

  20. CRISM Multispectral and Hyperspectral Mapping Data - A Global Data Set for Hydrated Mineral Mapping

    Science.gov (United States)

    Seelos, F. P.; Hash, C. D.; Murchie, S. L.; Lim, H.

    2017-12-01

    The Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) is a visible through short-wave infrared hyperspectral imaging spectrometer (VNIR S-detector: 364-1055 nm; IR L-detector: 1001-3936 nm; 6.55 nm sampling) that has been in operation on the Mars Reconnaissance Orbiter (MRO) since 2006. Over the course of the MRO mission, CRISM has acquired 290,000 individual mapping observation segments (mapping strips) with a variety of observing modes and data characteristics (VNIR/IR; 100/200 m/pxl; multi-/hyper-spectral band selection) over a wide range of observing conditions (atmospheric state, observation geometry, instrument state). CRISM mapping data coverage density varies primarily with latitude and secondarily with seasonal and operational considerations. The aggregate global IR mapping data coverage currently stands at ∼85% (∼80% at the equator with ∼40% repeat sampling), which is sufficient spatial sampling density to support the assembly of empirically optimized, radiometrically consistent mapping mosaic products. The CRISM project has defined a number of mapping mosaic data products (e.g., Multispectral Reduced Data Record (MRDR) map tiles) with varying degrees of observation-specific processing and correction applied prior to mosaic assembly. A commonality among the mosaic products is the presence of inter-observation radiometric discrepancies which are traceable to variable observation circumstances or associated atmospheric/photometric correction residuals. The empirical approach to radiometric reconciliation leverages inter-observation spatial overlaps and proximal relationships to construct a graph that encodes the mosaic structure and radiometric discrepancies. The graph theory abstraction allows the underlying structure of the mosaic to be evaluated and the corresponding optimization problem configured so that it is well-posed. Linear and non-linear least squares optimization is then employed to derive a set of observation- and wavelength-specific model
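
One way to read the reconciliation step: treat each mapping strip as a graph node, each overlap discrepancy as an edge constraint, and solve for per-observation radiometric corrections by least squares. A minimal additive-offset sketch with hypothetical discrepancies (the actual CRISM model is observation- and wavelength-specific, not a single scalar per strip):

```python
# Measured radiometric discrepancies d[(i, j)] ~ c_j - c_i in the
# overlap region of observations i and j (hypothetical values).
overlaps = {(0, 1): 0.04, (1, 2): -0.01, (0, 2): 0.03}

def solve_offsets(n_obs, overlaps, n_iter=200):
    """Least-squares additive offsets via Gauss-Seidel sweeps,
    anchoring observation 0 as the radiometric reference."""
    c = [0.0] * n_obs
    for _ in range(n_iter):
        for j in range(1, n_obs):  # observation 0 stays fixed at 0
            terms = []
            for (a, b), d in overlaps.items():
                if b == j:
                    terms.append(c[a] + d)   # constraint c_j = c_a + d
                elif a == j:
                    terms.append(c[b] - d)   # constraint c_j = c_b - d
            if terms:
                c[j] = sum(terms) / len(terms)
    return c

offsets = solve_offsets(3, overlaps)
```

Anchoring one node makes the otherwise rank-deficient problem well-posed, which is the role of the graph analysis described above.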

  1. Phenology-based Spartina alterniflora mapping in coastal wetland of the Yangtze Estuary using time series of GaoFen satellite no. 1 wide field of view imagery

    Science.gov (United States)

    Ai, Jinquan; Gao, Wei; Gao, Zhiqiang; Shi, Runhe; Zhang, Chao

    2017-04-01

    Spartina alterniflora is an aggressive invasive plant species that replaces native species, changes the structure and function of the ecosystem across coastal wetlands in China, and is thus a major conservation concern. Mapping the spread of its invasion is a necessary first step for the implementation of effective ecological management strategies. The performance of a phenology-based approach for S. alterniflora mapping is explored in the coastal wetland of the Yangtze Estuary using a time series of GaoFen satellite no. 1 wide field of view camera (GF-1 WFV) imagery. First, a time series of the normalized difference vegetation index (NDVI) was constructed to evaluate the phenology of S. alterniflora. Two phenological stages (the senescence stage from November to mid-December and the green-up stage from late April to May) were determined as important for S. alterniflora detection in the study area based on NDVI temporal profiles, spectral reflectance curves of S. alterniflora and its coexistent species, and field surveys. Three phenology feature sets representing three major phenology-based detection strategies were then compared to map S. alterniflora: (1) the single-date imagery acquired within the optimal phenological window, (2) the multitemporal imagery, including four images from the two important phenological windows, and (3) the monthly NDVI time series imagery. Support vector machines and maximum likelihood classifiers were applied to each phenology feature set at different training sample sizes. For all phenology feature sets, the overall results were produced consistently with high mapping accuracies under sufficient training sample sizes, although significantly improved classification accuracies (10%) were obtained when the monthly NDVI time series imagery was employed. The optimal single-date imagery had the lowest accuracies of all detection strategies. 
The multitemporal analysis demonstrated little reduction in the overall accuracy compared with the
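
The NDVI underlying the time series, and a two-date phenology rule of the kind the detection strategies build on, can be sketched as follows; the thresholds are illustrative, not those used in the study:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Hypothetical two-date rule: S. alterniflora keeps a higher NDVI in the
# November-December senescence window than the coexistent native species,
# and greens up strongly by late April-May.
def looks_like_spartina(ndvi_senescence: float, ndvi_greenup: float) -> bool:
    return ndvi_senescence > 0.3 and ndvi_greenup > 0.5
```

The study's classifiers (SVM, maximum likelihood) learn such separating boundaries from training samples rather than using fixed thresholds.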

  2. Elevation data for floodplain mapping

    National Research Council Canada - National Science Library

    Committee on Floodplain Mapping Technologies; National Research Council; Division on Earth and Life Studies; National Research Council

    2007-01-01

    Elevation Data for Floodplain Mapping shows that there is sufficient two-dimensional base map imagery to meet FEMA's flood map modernization goals, but that the three-dimensional base elevation data...

  3. Designing a Sustainable Noise Mapping System Based on Citizen Scientists Smartphone Sensor Data.

    Directory of Open Access Journals (Sweden)

    Eunyoung Shim

    In this study, we attempted to assess the feasibility of collecting population health data via mobile devices. Specifically, we constructed noise maps based on sound information monitored by individuals' smartphones. We designed a sustainable way of creating noise maps that can overcome the shortcomings of existing station-based noise-monitoring systems. Three hundred and nine Seoul residents aged 20-49 years who used Android-based smartphones were recruited, and the subjects installed a special application that we developed for this study. This application collected information on sound and geographical location every 10 min for 7 days. Using GIS, we were able to construct various types of noise maps of Seoul (e.g., daytime/nighttime and weekdays/weekends) using the information on sound and geographical location obtained via the users' smartphones. Despite the public health importance of noise management, a number of countries and cities lack a sustainable system to monitor noise. This pilot study showed the possibility of using the smartphones of citizen scientists as an economical and sustainable way of monitoring noise, particularly in an urban context in developing countries.
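
Aggregating crowd-sourced (latitude, longitude, dB) samples into map cells can be sketched as below; the cell size and sample values are hypothetical, and note that decibel readings must be averaged in the energy domain, not arithmetically:

```python
import math
from collections import defaultdict

def grid_noise_map(samples, cell=0.01):
    """Aggregate (lat, lon, dB) smartphone samples into grid cells,
    averaging in the energy domain (correct for decibel values)."""
    cells = defaultdict(list)
    for lat, lon, db in samples:
        key = (round(lat // cell), round(lon // cell))
        cells[key].append(db)
    return {
        key: 10.0 * math.log10(sum(10 ** (db / 10.0) for db in vals) / len(vals))
        for key, vals in cells.items()
    }

# Hypothetical readings around Seoul: two nearby points plus one farther away.
samples = [(37.5665, 126.9780, 60.0), (37.5668, 126.9783, 70.0),
           (37.5801, 126.9768, 55.0)]
noise = grid_noise_map(samples)
```

The energetic mean of 60 dB and 70 dB is about 67.4 dB, not 65 dB, which is why the conversion through 10^(dB/10) matters.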

  4. Geological hazards investigation - relative slope stability map

    Energy Technology Data Exchange (ETDEWEB)

    Han, Dae Suk; Kim, Won Young; Yu, Il Hyon; Kim, Kyeong Su; Lee, Sa Ro; Choi, Young Sup [Korea Institute of Geology Mining and Materials, Taejon (Korea, Republic of)

    1997-12-01

    The Republic of Korea is a mountainous country; mountains occupy about three quarters of its land area, and increasing urban development is taking place along the mountainsides. For this reason, planners as well as developers and others must realize that some urban areas may be threatened by geologic hazards such as landslides and accelerated soil and rock creep. For the purpose of environmental land-use planning, a mapping project on relative slope stability was established in 1996. The selected area encompasses about 5,900 km², including the topographic maps of Ulsan, Yongchon, Kyongju, Pulguksa, and Kampo, all at a scale of 1:50,000. Many disturbed and undisturbed soil samples, which were collected from the areas of landslides and unstable slopes, were tested for their physical properties and shear strength. They were classified as GC, SP, SC, SM, SP-SM, SC-SM, CL, ML, and MH according to the Unified Soil Classification System, their liquid limit and plasticity index ranging from 25.3% to as high as 81.3% and from 4.1% to 41.5%, respectively. X-ray analysis revealed that many of the soils contained a certain amount of montmorillonite. Based on the available information as well as both field and laboratory investigation, it was found that the most common types of slope failure in the study area were debris and mud flows induced by heavy rainfall during the rainy season; the flows mostly occurred in the colluvial deposits at the middle and foot of mountains. These deposits thus generally appear to be the most unstable slope-forming materials in the study area. Six different maps were produced for the study area, consisting of a slope classification map, soil classification map, lineament density map, landslide distribution map, zonal map of rainfall, and geology map, most of them stored in a database. Using the first four maps and GIS, two sheets of relative slope-stability maps were constructed, each at a scale of 1

  5. Provisional maps of thermal areas in Yellowstone National Park, based on satellite thermal infrared imaging and field observations

    Science.gov (United States)

    Vaughan, R. Greg; Heasler, Henry; Jaworowski, Cheryl; Lowenstern, Jacob B.; Keszthelyi, Laszlo P.

    2014-01-01

    Maps that define the current distribution of geothermally heated ground are useful for setting a baseline for thermal activity, making it easier to detect and understand future anomalous hydrothermal and (or) volcanic activity. Monitoring changes in these dynamic thermal areas also supports decisions regarding the development of Yellowstone National Park infrastructure, the preservation and protection of park resources, and visitor safety. Because of the challenges associated with field-based monitoring of a large, complex geothermal system spread out over a large and remote area, satellite-based thermal infrared images from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) were used to map the location and spatial extent of active thermal areas, to generate thermal anomaly maps, and to quantify the radiative component of the total geothermal heat flux. ASTER thermal infrared data acquired during winter nights were used to minimize the contribution of solar heating of the surface. The ASTER thermal infrared mapping results were compared to maps of thermal areas based on field investigations and high-resolution aerial photos. Field validation of the ASTER thermal mapping is an ongoing task. The purpose of this report is to make available ASTER-based maps of Yellowstone's thermal areas. We include an appendix containing the names and characteristics of Yellowstone's thermal areas, georeferenced TIFF files containing ASTER thermal imagery, and several spatial data sets in Esri shapefile format.

  6. Pixel-based parametric source depth map for Cerenkov luminescence imaging

    International Nuclear Information System (INIS)

    Altabella, L.; Spinelli, A.E.; Boschi, F.

    2016-01-01

    Optical tomography represents a challenging problem in optical imaging because of the intrinsically ill-posed inverse problem due to photon diffusion. Cerenkov luminescence tomography (CLT) for optical photons produced in tissues by several radionuclides (e.g., 32P, 18F, 90Y) has been investigated using both a 3D multispectral approach and multiview methods. Difficulty in the convergence of 3D algorithms can discourage the use of this technique to obtain information on source depth and intensity. For these reasons, we developed a faster, corrected 2D approach based on multispectral acquisitions to obtain the source depth and intensity using a pixel-based fitting of source intensity. Monte Carlo simulations and experimental data were used to develop and validate the method for obtaining the parametric map of source depth. With this approach we obtain parametric source depth maps with a precision between 3% and 7% for MC simulations and 5-6% for experimental data. Using this method we are able to obtain reliable information about the depth of the Cerenkov luminescence source with a simple and flexible procedure.
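
A pixel-wise sketch of the underlying idea: if each spectral band attenuates as I(λ) = I0·exp(-μ(λ)·d), then log-intensity is linear in μ, and a per-pixel least-squares line fit recovers the depth d and intensity I0. The attenuation coefficients and the single-exponential model below are illustrative assumptions, not the study's calibrated tissue optics:

```python
import math

# Hypothetical effective attenuation coefficients (1/mm) at three bands.
mu = [0.20, 0.35, 0.50]

def fit_depth(intensities, mu):
    """Per-pixel log-linear least-squares fit of source depth d and
    surface intensity I0 from multispectral Cerenkov measurements."""
    y = [math.log(i) for i in intensities]      # log I = log I0 - mu * d
    n = len(mu)
    mx = sum(mu) / n
    my = sum(y) / n
    slope = sum((m - mx) * (v - my) for m, v in zip(mu, y)) / \
            sum((m - mx) ** 2 for m in mu)
    d = -slope                                   # depth (mm)
    i0 = math.exp(my + d * mx)                   # intensity at d = 0
    return d, i0

# Synthetic pixel: source at 4 mm depth with I0 = 100.
meas = [100.0 * math.exp(-m * 4.0) for m in mu]
d, i0 = fit_depth(meas, mu)
```

Repeating the fit over every pixel yields the parametric depth map described above.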

  7. GIS-based rare events logistic regression for mineral prospectivity mapping

    Science.gov (United States)

    Xiong, Yihui; Zuo, Renguang

    2018-02-01

    Mineralization is a special type of singularity event and can be considered a rare event because, within a specific study area, the number of prospective locations (1s) is considerably smaller than the number of non-prospective locations (0s). In this study, GIS-based rare events logistic regression (RELR) was used to map mineral prospectivity in southwestern Fujian Province, China. An odds ratio was used to measure the relative importance of the evidence variables with respect to mineralization. The results suggest that formations, granites, and skarn alterations, followed by faults and aeromagnetic anomalies, are the most important indicators for the formation of Fe-related mineralization in the study area. The prediction rate and the area under the curve (AUC) values show that areas with higher probability have a strong spatial relationship with the known mineral deposits. Comparing the results with original logistic regression (OLR) demonstrates that GIS-based RELR performs better than OLR. The prospectivity map obtained in this study benefits the search for skarn Fe-related mineralization in the study area.
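
The two evaluation quantities used above can be sketched directly; the 2x2 counts and classifier scores are hypothetical, chosen only to reflect the rare-events setting (far fewer 1s than 0s):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a = deposits on the evidence layer,      b = deposits off it,
    c = non-deposits on the evidence layer,  d = non-deposits off it."""
    return (a * d) / (b * c)

def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic:
    fraction of (positive, negative) pairs ranked correctly."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical: 18 of 20 deposits fall on a skarn-alteration layer that
# covers only 300 of 1000 non-deposit cells.
print(odds_ratio(18, 2, 300, 700))   # 21.0: strong spatial association
print(auc([0.9, 0.8, 0.7], [0.6, 0.4, 0.2]))  # 1.0: perfect ranking
```

An odds ratio well above 1 marks an evidence layer as a strong indicator; AUC summarizes how well the fitted probabilities separate deposits from non-deposits.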

  8. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on the computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  9. The EnMAP Spaceborne Imaging Spectroscopy Mission for Earth Observation

    Directory of Open Access Journals (Sweden)

    Luis Guanter

    2015-07-01

    Imaging spectroscopy, also known as hyperspectral remote sensing, is based on the characterization of Earth surface materials and processes through spectrally-resolved measurements of the light interacting with matter. The potential of imaging spectroscopy for Earth remote sensing has been demonstrated since the 1980s. However, most of the developments and applications in imaging spectroscopy have largely relied on airborne spectrometers, as the amount and quality of space-based imaging spectroscopy data remain relatively low to date. The upcoming Environmental Mapping and Analysis Program (EnMAP), a German imaging spectroscopy mission, is intended to fill this gap. An overview of the main characteristics and current status of the mission is provided in this contribution. The core payload of EnMAP consists of a dual-spectrometer instrument measuring in the optical spectral range between 420 and 2450 nm with a spectral sampling distance varying between 5 and 12 nm and a reference signal-to-noise ratio of 400:1 in the visible and near-infrared and 180:1 in the shortwave-infrared parts of the spectrum. EnMAP images will cover a 30 km-wide area in the across-track direction with a ground sampling distance of 30 m. An across-track tilted observation capability will enable a target revisit time of up to four days at the Equator, and better at high latitudes. EnMAP will contribute to the development and exploitation of spaceborne imaging spectroscopy applications by making high-quality data freely available to scientific users worldwide.

  10. Rapid genotyping with DNA micro-arrays for high-density linkage mapping and QTL mapping in common buckwheat (Fagopyrum esculentum Moench)

    Science.gov (United States)

    Yabe, Shiori; Hara, Takashi; Ueno, Mariko; Enoki, Hiroyuki; Kimura, Tatsuro; Nishimura, Satoru; Yasui, Yasuo; Ohsawa, Ryo; Iwata, Hiroyoshi

    2014-01-01

    For genetic studies and genomics-assisted breeding, particularly of minor crops, a genotyping system that does not require a priori genomic information is preferable. Here, we demonstrated the potential of a novel array-based genotyping system for the rapid construction of a high-density linkage map and quantitative trait loci (QTL) mapping. By using the system, we successfully constructed an accurate, high-density linkage map for common buckwheat (Fagopyrum esculentum Moench); the map was composed of 756 loci and included 8,884 markers. The number of linkage groups converged to eight, which is the basic number of chromosomes in common buckwheat. The sizes of the linkage groups of the P1 and P2 maps were 773.8 and 800.4 cM, respectively. The average interval between adjacent loci was 2.13 cM. The linkage map constructed here will be useful for the analysis of other common buckwheat populations. We also performed QTL mapping for main stem length and detected four QTL. It took 37 days to process 178 samples from DNA extraction to genotyping, indicating that the system enables genotyping of genome-wide markers for a few hundred buckwheat plants before the plants mature. The novel system will be useful for genomics-assisted breeding of minor crops without a priori genomic information. PMID:25914583

  11. Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery

    Science.gov (United States)

    Gao, Junfeng; Liao, Wenzhi; Nuyttens, David; Lootens, Peter; Vangeyte, Jürgen; Pižurica, Aleksandra; He, Yong; Pieters, Jan G.

    2018-05-01

    The developments in the use of unmanned aerial vehicles (UAVs) and advanced imaging sensors provide new opportunities for ultra-high-resolution (e.g., less than 10 cm ground sampling distance (GSD)) crop field monitoring and mapping in precision agriculture applications. In this study, we developed a strategy for inter- and intra-row weed detection in early-season maize fields from aerial visual imagery. More specifically, the Hough transform (HT) algorithm was applied to the orthomosaicked images for inter-row weed detection. A semi-automatic Object-Based Image Analysis (OBIA) procedure was developed with Random Forests (RF) combined with feature selection techniques to classify soil, weeds and maize. Furthermore, the two binary weed masks generated from HT and OBIA were fused to produce an accurate binary weed image. The developed RF classifier was evaluated by 5-fold cross-validation, and it obtained an overall accuracy of 0.945 and a Kappa value of 0.912. Finally, the relationship between detected weeds and their ground-truth densities was quantified by a fitted linear model with a coefficient of determination of 0.895 and a root mean square error of 0.026. In addition, the importance of the input features was evaluated, and it was found that the ratio of vegetation length to width was the most significant feature for the classification model. Overall, our approach can yield a satisfactory weed map, and we expect that the accurate and timely weed maps obtained from UAV imagery will be applicable to realizing site-specific weed management (SSWM) in early-season crop fields, reducing both the spraying of non-selective herbicides and costs.
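The mask-fusion step in this record combines the Hough-transform inter-row detections with the OBIA intra-row detections. A minimal sketch of that pixel-wise fusion (toy arrays, not the study's imagery) could look like:

```python
import numpy as np

def fuse_weed_masks(ht_mask: np.ndarray, obia_mask: np.ndarray) -> np.ndarray:
    """Fuse two binary weed masks (1 = weed) with a pixel-wise logical OR,
    so a pixel flagged by either the HT (inter-row) step or the OBIA
    (intra-row) step appears in the final weed map."""
    return np.logical_or(ht_mask.astype(bool), obia_mask.astype(bool)).astype(np.uint8)

# Toy 4x4 example: HT finds weeds between crop rows, OBIA within them.
ht = np.array([[0, 1, 0, 0],
               [0, 1, 0, 0],
               [0, 0, 0, 0],
               [0, 0, 0, 0]], dtype=np.uint8)
obia = np.array([[0, 0, 0, 1],
                 [0, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 0, 0, 0]], dtype=np.uint8)
fused = fuse_weed_masks(ht, obia)
print(int(fused.sum()))  # number of weed pixels in the fused map
```

The OR fusion is the conservative choice for spraying decisions: a pixel is treated as weed if either detector flags it.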

  12. Construction of Polarimetric Radar-Based Reference Rain Maps for the Iowa Flood Studies Campaign

    Science.gov (United States)

    Petersen, Walt; Krajewski, Witek; Wolff, David; Gatlin, Patrick

    2015-04-01

    The Global Precipitation Measurement (GPM) Mission Iowa Flood Studies (IFloodS) campaign was conducted in central and northeastern Iowa during the months of April-June 2013. Specific science objectives for IFloodS included quantification of uncertainties in satellite and ground-based estimates of precipitation, 4-D characterization of precipitation physical processes and associated parameters (e.g., size distributions, water contents, types, structure, etc.), assessment of the impact of precipitation estimation uncertainty and physical processes on hydrologic predictive skill, and refinement of field observations and data analysis approaches as they pertain to future GPM integrated hydrologic validation and related field studies. In addition to the field campaign archival of raw and processed satellite data (including precipitation products), key ground-based platforms such as the NASA NPOL S-band and D3R Ka/Ku-band dual-polarimetric radars, University of Iowa X-band dual-polarimetric radars, a large network of paired rain gauge platforms, and a large network of 2D Video and Parsivel disdrometers were deployed. In something of a canonical approach, the radar (NPOL in particular), gauge and disdrometer observational assets were deployed to create a consistent, high-quality, distributed (in time and space sampling) radar-based ground "reference" rainfall dataset, with known uncertainties, that could be used for assessing the satellite-based precipitation products at a range of space/time scales. Subsequently, the impact of uncertainties in the satellite products could be evaluated relative to the ground benchmark in coupled weather, land-surface and distributed hydrologic modeling frameworks as related to flood prediction. Relative to establishing the ground-based benchmark, numerous avenues were pursued in the making and verification of IFloodS "reference" dual-polarimetric radar-based rain maps, and this study documents the process and results as they pertain specifically

  14. Dynamic Feedforward Control of a Diesel Engine Based on Optimal Transient Compensation Maps

    Directory of Open Access Journals (Sweden)

    Giorgio Mancini

    2014-08-01

    Full Text Available To satisfy increasingly stringent emission regulations and the demand for ever-lower fuel consumption, diesel engines have become complex systems with many interacting actuators. As a consequence, these requirements are pushing control and calibration to their limits. The calibration procedure nowadays is still based mainly on engineering experience, which results in a highly iterative process to derive a complete engine calibration. Moreover, automatic tools are available only for stationary operation, to obtain control maps that are optimal with respect to some predefined objective function. Therefore, the exploitation of any leftover potential during transient operation is crucial. This paper proposes an approach to derive a transient feedforward (FF) control system in an automated way. It relies on optimal control theory to solve a dynamic optimization problem for fast transients. A partially physics-based model is thereby used to replace the engine. From the optimal solutions, the relevant information is extracted and stored in maps spanned by the engine speed and the torque gradient. These maps complement the static control maps by accounting for the dynamic behavior of the engine. The procedure is implemented on a real engine, and experimental results are presented along with the development of the methodology.
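At run time, a compensation map of the kind described above is just a 2-D lookup table interpolated between calibration points. A minimal sketch (the grid, the correction values, and the quantity being corrected are all illustrative assumptions, not the authors' calibration) of a bilinear lookup over speed and torque gradient:

```python
import bisect

# Illustrative transient compensation map: a feedforward correction (e.g. a
# boost-pressure offset) spanned by engine speed [rpm] and torque gradient
# [Nm/s]. All numbers are made up for the sketch.
SPEEDS = [1000.0, 2000.0, 3000.0]
TORQUE_GRADS = [0.0, 250.0, 500.0]
FF_MAP = [
    [0.0, 0.10, 0.20],
    [0.0, 0.15, 0.30],
    [0.0, 0.20, 0.40],
]

def lookup(speed: float, tgrad: float) -> float:
    """Bilinear interpolation in the compensation map (edges clamped)."""
    def bracket(axis, x):
        x = min(max(x, axis[0]), axis[-1])            # clamp to map range
        i = min(bisect.bisect_right(axis, x) - 1, len(axis) - 2)
        t = (x - axis[i]) / (axis[i + 1] - axis[i])   # fractional position
        return i, t
    i, u = bracket(SPEEDS, speed)
    j, v = bracket(TORQUE_GRADS, tgrad)
    return ((1 - u) * (1 - v) * FF_MAP[i][j] + (1 - u) * v * FF_MAP[i][j + 1]
            + u * (1 - v) * FF_MAP[i + 1][j] + u * v * FF_MAP[i + 1][j + 1])

print(lookup(1500.0, 125.0))  # midway between four surrounding map points
```

During steady-state operation the torque gradient is near zero and the correction vanishes, so the static calibration maps remain in charge; the transient map only adds its contribution during fast load changes.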

  15. A Map/INS/Wi-Fi Integrated System for Indoor Location-Based Service Applications.

    Science.gov (United States)

    Yu, Chunyang; Lan, Haiyu; Gu, Fuqiang; Yu, Fei; El-Sheimy, Naser

    2017-06-02

    In this research, a new Map/INS/Wi-Fi integrated system for indoor location-based service (LBS) applications, based on a cascaded Particle/Kalman filter framework, is proposed. Two-dimensional indoor map information, together with measurements from an inertial measurement unit (IMU) and Received Signal Strength Indicator (RSSI) values, is integrated for estimating positioning information. The main challenge of this research is how to make effective use of various measurements that complement each other in order to obtain an accurate, continuous, and low-cost position solution without increasing the computational burden of the system. Therefore, to eliminate the cumulative drift caused by low-cost IMU sensor errors, the ubiquitous Wi-Fi signal and non-holonomic constraints are rationally used to correct the IMU-derived navigation solution through an extended Kalman filter (EKF). Moreover, the map-aiding method and map-matching method are innovatively combined to constrain the primary Wi-Fi/IMU-derived position through an Auxiliary Value Particle Filter (AVPF). Different sources of information are incorporated through a cascaded EKF/AVPF filter algorithm. Indoor tests show that the proposed method can effectively reduce the accumulation of positioning errors of a stand-alone Inertial Navigation System (INS), and provide a stable, continuous and reliable indoor location service.
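The drift-correction idea in this record can be reduced to a one-dimensional sketch: a dead-reckoned position whose uncertainty grows between fixes, corrected by a noisy absolute (Wi-Fi/RSSI-derived) position through a standard Kalman measurement update. This is a minimal illustration with assumed noise values, not the paper's full cascaded EKF/AVPF:

```python
def kalman_update(x: float, P: float, z: float, R: float):
    """Scalar Kalman measurement update.
    x, P: predicted state and variance; z, R: measurement and its variance."""
    K = P / (P + R)          # Kalman gain
    x = x + K * (z - x)      # corrected state
    P = (1 - K) * P          # corrected variance
    return x, P

x, P = 0.0, 1.0              # start at the origin with modest uncertainty
for step in range(5):
    x += 1.0 + 0.2           # IMU-derived step with a constant 0.2 m drift/step
    P += 0.5                 # uncertainty grows while dead reckoning
    # Wi-Fi fix at the true position (step+1), with large measurement noise
    x, P = kalman_update(x, P, z=float(step + 1), R=4.0)

print(round(x, 3), round(P, 3))
```

Even with a noisy fix (R = 4.0 here), the corrected track stays closer to the true position (5.0) than pure dead reckoning (which would drift to 6.0), and the variance stays bounded instead of growing without limit.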

  16. Toggle PRM: Simultaneous mapping of C-free and C-obstacle - A study in 2D -

    KAUST Repository

    Denny, Jory

    2011-09-01

    Motion planning is known to be difficult. Probabilistic planners have made great advances, but still have difficulty with problems that require planning in narrow passages or on surfaces in C-space. This work proposes Toggle PRM, a new methodology for PRMs that simultaneously maps both free and obstacle space. In this paper, we focus on 2-DOF problems and show that mapping both spaces leads to increased sampling density in narrow passages and to improved overall efficiency as compared to previous sampling-based approaches.
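The core idea of keeping every sample, routing it into either a free-space or an obstacle-space roadmap, can be shown in a tiny 2-D sketch (a toy environment with a narrow slit; this simplification omits the connection and witness machinery of the actual algorithm):

```python
import random

random.seed(0)

def in_collision(p) -> bool:
    """Toy environment: a vertical rectangular obstacle with a narrow slit."""
    x, y = p
    return 0.4 <= x <= 0.6 and not (0.48 <= y <= 0.52)

# Unlike a plain PRM, no sample is discarded: colliding samples populate an
# obstacle-space roadmap, which helps delineate the narrow passage boundary.
free_nodes, obstacle_nodes = [], []
for _ in range(2000):
    p = (random.random(), random.random())
    (obstacle_nodes if in_collision(p) else free_nodes).append(p)

in_passage = [p for p in free_nodes if 0.4 <= p[0] <= 0.6]
print(len(free_nodes), len(obstacle_nodes), len(in_passage))
```

The free nodes that survive inside the obstacle's x-range are exactly the ones in the slit; in the full method these get connected into the free roadmap while their colliding neighbors map C-obstacle, which is what boosts effective sampling density in the passage.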

  18. Methodology developed to make the Quebec indoor radon potential map

    International Nuclear Information System (INIS)

    Drolet, Jean-Philippe; Martel, Richard; Poulin, Patrick; Dessau, Jean-Claude

    2014-01-01

    This paper presents a relevant approach to predict the indoor radon potential based on the combination of radiogeochemical data and indoor radon measurements in the Quebec province territory (Canada). The Quebec ministry of health asked for such a map to identify the radon-prone areas and to manage the risk related to indoor radon exposure for the population. Three radiogeochemical criteria, including (1) equivalent uranium (eU) concentration from airborne surface gamma-ray surveys, (2) uranium concentration measurements in sediments, and (3) bedrock and surficial geology, were combined with 3082 basement radon concentration measurements to identify the radon-prone areas. It was shown that it is possible to determine thresholds for the three criteria that imply statistically significantly different levels of radon potential, using Kruskal–Wallis one-way analyses of variance by ranks. The three discretized radiogeochemical datasets were combined into a total predicted radon potential that sampled 98% of the studied area. The combination process was also based on Kruskal–Wallis one-way ANOVA. Four statistically significantly different predicted radon potential levels were created: low, medium, high and very high. Respectively, 10 and 13% of the dwellings exceed the Canadian radon guideline of 200 Bq/m³ in low and medium predicted radon potentials. These proportions rise to 22 and 45%, respectively, for high and very high predicted radon potentials. This predictive map of indoor radon potential based on the radiogeochemical data was validated using a map of confirmed radon exposure in homes based on the basement radon measurements. It was shown that the map of predicted radon potential based on the radiogeochemical data was reliable for identifying radon-prone areas, even in zones where no indoor radon measurement exists. - Highlights: • 5 radiogeochemical datasets were used to map the geogenic indoor radon potential. • An indoor radon potential was determined for each
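The Kruskal–Wallis test used above checks whether radon measurements ranked across the candidate potential classes differ significantly. A tie-free sketch of the H statistic (the data are illustrative, not the study's measurements):

```python
# Kruskal-Wallis H statistic, assuming no tied values:
#   H = 12 / (N (N + 1)) * sum_i( R_i^2 / n_i ) - 3 (N + 1)
# where R_i is the rank sum of group i and N the pooled sample size.
def kruskal_wallis_h(groups) -> float:
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}   # ranks 1..N
    n_total = len(pooled)
    rank_sum_sq = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n_total * (n_total + 1)) * rank_sum_sq - 3.0 * (n_total + 1)

# Hypothetical basement radon readings (Bq/m3) in three potential classes.
low = [20.0, 35.0, 50.0]
medium = [60.0, 80.0, 95.0]
high = [120.0, 150.0, 200.0]
print(round(kruskal_wallis_h([low, medium, high]), 4))
```

A large H (compared against a chi-squared distribution with k-1 degrees of freedom) is what justifies treating the discretized classes as genuinely different radon-potential levels; in practice `scipy.stats.kruskal` also handles ties.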

  19. Open Land-Use Map: A Regional Land-Use Mapping Strategy for Incorporating OpenStreetMap with Earth Observations

    Science.gov (United States)

    Yang, D.; Fu, C. S.; Binford, M. W.

    2017-12-01

    The southeastern United States has high landscape heterogeneity, with heavily managed forestlands, highly developed agricultural lands, and multiple metropolitan areas. Human activities are transforming and altering land patterns and structures in both negative and positive ways. A land-use map at the regional scale is a heavy computational task but is critical to most landowners, researchers, and decision makers, enabling them to make informed decisions for varying objectives. There are two major difficulties in generating classification maps at the regional scale: the necessity of large training point sets and the expensive computational cost, in terms of both money and time, of classifier modeling. Volunteered Geographic Information (VGI) opens a new era in mapping and visualizing our world: the platform is open for collecting valuable georeferenced information from volunteer citizens, and the data are freely available to the public. As one of the most well-known VGI initiatives, OpenStreetMap (OSM) contributes not only road network distribution, but also the potential for using these data to justify land-cover and land-use classifications. Google Earth Engine (GEE) is a platform designed for cloud-based mapping with robust and fast computing power. Most large-scale and national mapping approaches confuse "land cover" and "land use", or build up the land-use database based on modeled land-cover datasets. Unlike most other large-scale approaches, we distinguish and differentiate land use from land cover. By focusing on our prime objective of mapping land use and management practices, a robust regional land-use mapping approach is developed by incorporating the OpenStreetMap dataset into Earth-observation remote sensing imagery instead of the often-used land-cover base maps.

  20. Progressive Amalgamation of Building Clusters for Map Generalization Based on Scaling Subgroups

    Directory of Open Access Journals (Sweden)

    Xianjin He

    2018-03-01

    Full Text Available Map generalization utilizes transformation operations to derive smaller-scale maps from larger-scale maps, and is a key procedure for the modelling and understanding of geographic space. Studies to date have largely applied a fixed tolerance to aggregate clustered buildings into a single object, resulting in the loss of details that meet cartographic constraints and may be of importance for users. This study aims to develop a method that amalgamates clustered buildings gradually without significant modification of geometry, while preserving the map details as much as possible under cartographic constraints. The amalgamation process consists of three key steps. First, individual buildings are grouped into distinct clusters by using the graph-based spatial clustering application with random forest (GSCARF) method. Second, building clusters are decomposed into scaling subgroups according to homogeneity with regard to the mean distance of subgroups. Thus, hierarchies of building clusters can be derived based on scaling subgroups. Finally, an amalgamation operation is progressively performed from the bottom-level subgroups to the top-level subgroups using the maximum distance of each subgroup as the amalgamating tolerance instead of using a fixed tolerance. As a consequence of this step, generalized intermediate scaling results are available, which can form the multi-scale representation of buildings. The experimental results show that the proposed method can generate amalgams with correct details, statistical area balance and orthogonal shape while satisfying cartographic constraints (e.g., minimum distance and minimum area).
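The subgroup decomposition step can be sketched in one dimension: split a cluster of building centroids wherever a gap exceeds the cluster's mean gap, then use each subgroup's own maximum gap as its amalgamation tolerance. This is an assumed simplification for illustration, not the paper's 2-D implementation:

```python
def decompose(positions):
    """Split a sorted run of centroid positions into homogeneous subgroups:
    a new subgroup starts wherever a gap exceeds the mean gap."""
    gaps = [b - a for a, b in zip(positions, positions[1:])]
    mean_gap = sum(gaps) / len(gaps)
    groups, current = [], [positions[0]]
    for pos, gap in zip(positions[1:], gaps):
        if gap > mean_gap:            # gap too large: close current subgroup
            groups.append(current)
            current = []
        current.append(pos)
    groups.append(current)
    return groups

# Two visually distinct building rows along a street (positions in metres).
centroids = [0.0, 5.0, 10.0, 40.0, 44.0, 49.0]
subgroups = decompose(centroids)
# Per-subgroup amalgamation tolerance: the subgroup's own maximum gap,
# instead of one fixed global tolerance.
tolerances = [max(b - a for a, b in zip(g, g[1:])) if len(g) > 1 else 0.0
              for g in subgroups]
print(subgroups, tolerances)
```

Because each subgroup carries its own tolerance, nearby buildings amalgamate early while the large inter-row gap is preserved, which is what yields usable intermediate scales.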

  1. Mapping shape to visuomotor mapping: learning and generalisation of sensorimotor behaviour based on contextual information.

    Directory of Open Access Journals (Sweden)

    Loes C J van Dam

    2015-03-01

    Full Text Available Humans can learn and store multiple visuomotor mappings (dual-adaptation) when feedback for each is provided alternately. Moreover, learned context cues associated with each mapping can be used to switch between the stored mappings. However, little is known about the associative learning between cue and required visuomotor mapping, and how learning generalises to novel but similar conditions. To investigate these questions, participants performed a rapid target-pointing task while we manipulated the offset between visual feedback and movement end-points. The visual feedback was presented with horizontal offsets of different amounts, dependent on the target's shape. Participants thus needed to use different visuomotor mappings between target location and required motor response depending on the target shape in order to "hit" it. The target shapes were taken from a continuous set of shapes, morphed between spiky and circular shapes. After training, we tested participants' performance, without feedback, on different target shapes that had not been learned previously. We compared two hypotheses. First, we hypothesised that participants could (explicitly) extract the linear relationship between target shape and visuomotor mapping and generalise accordingly. Second, using previous findings of visuomotor learning, we developed an (implicit) Bayesian learning model that predicts generalisation that is more consistent with categorisation (i.e. use one mapping or the other). The experimental results show that, although learning the associations requires explicit awareness of the cues' role, participants apply the mapping corresponding to the trained shape that is most similar to the current one, consistent with the Bayesian learning model. Furthermore, the Bayesian learning model predicts that learning should slow down with increased numbers of training pairs, which was confirmed by the present results. In short, we found a good correspondence between the
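The categorisation-like generalisation reported here contrasts with linear interpolation between mappings. A minimal sketch of the two rules (shape parameters and offsets are invented for illustration; this is not the authors' model):

```python
# Trained associations: shape parameter (0 = spiky ... 1 = circular)
# -> horizontal visuomotor offset in cm (hypothetical values).
TRAINED = {0.2: -3.0, 0.8: 3.0}

def generalise_categorical(shape: float) -> float:
    """Apply the mapping of the most similar trained shape (the behaviour
    the study observed, consistent with its Bayesian learning model)."""
    nearest = min(TRAINED, key=lambda s: abs(s - shape))
    return TRAINED[nearest]

def generalise_linear(shape: float) -> float:
    """Alternative hypothesis: linearly extrapolate the shape-offset relation."""
    (s1, o1), (s2, o2) = sorted(TRAINED.items())
    return o1 + (o2 - o1) * (shape - s1) / (s2 - s1)

# A novel shape at 0.35 morph: the two rules predict different offsets.
print(generalise_categorical(0.35), generalise_linear(0.35))
```

For a test shape near one of the trained shapes, the categorical rule snaps to that shape's full offset, whereas the linear rule predicts an intermediate value; the data favoured the categorical pattern.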

  2. MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control

    Science.gov (United States)

    Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming

    2017-09-01

    The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches in the circumstances of dynamic production. A Bayesian network and big data analytics integrated approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, quality-related data of big volume and variety generated during the manufacturing process can be dealt with. Artificial intelligence algorithms, including Bayesian network learning, classification and reasoning, are embedded into the Reduce process. Relying on the ability of the Bayesian network to deal with dynamic and uncertain problems and the parallel computing power of MapReduce, a Bayesian network of impact factors on quality is built based on the prior probability distribution and modified with the posterior probability distribution. A case study on hull segment manufacturing precision management for ship and offshore platform building shows that computing speed accelerates almost in direct proportion to the increase in computing nodes. It is also proved that the proposed model is feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision-making for precision problem solving. The integration of big data analytics and the BN method offers a whole new perspective on manufacturing quality control.
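The map-shuffle-reduce pattern the approach builds on can be illustrated in-process (the inspection records and field names are hypothetical; the real system runs the same pattern distributed on Hadoop): map tasks emit (factor, defect) pairs, and the reduce step aggregates them into the per-factor frequencies a Bayesian-network learner would use as sufficient statistics.

```python
from collections import defaultdict

# Hypothetical inspection records for one process factor.
records = [
    {"temperature": "high", "defect": 1},
    {"temperature": "high", "defect": 0},
    {"temperature": "low", "defect": 0},
    {"temperature": "low", "defect": 0},
]

def map_phase(record):
    """Map task: emit a (key, value) pair per record."""
    yield (record["temperature"], record["defect"])

# Shuffle: group intermediate pairs by key.
grouped = defaultdict(list)
for rec in records:
    for key, value in map_phase(rec):
        grouped[key].append(value)

# Reduce: per-key aggregation -> conditional defect frequencies,
# i.e. estimates of P(defect | temperature) for the network's CPTs.
defect_rate = {k: sum(v) / len(v) for k, v in grouped.items()}
print(defect_rate)
```

Because each reduce key is independent, the aggregation parallelizes across nodes, which is why the case study sees near-linear speed-up with node count.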

  3. On the map: Nature and Science editorials.

    Science.gov (United States)

    Waaijer, Cathelijn J F; van Bochove, Cornelis A; van Eck, Nees Jan

    2011-01-01

    Bibliometric mapping of scientific articles based on keywords and technical terms in abstracts is now frequently used to chart scientific fields. In contrast, no significant mapping has been applied to the full texts of non-specialist documents. Editorials in Nature and Science are such non-specialist documents, reflecting the views of the two most read scientific journals on science, technology and policy issues. We use the VOSviewer mapping software to chart the topics of these editorials. A term map and a document map are constructed and clusters are distinguished in both of them. The validity of the document clustering is verified by a manual analysis of a sample of the editorials. This analysis confirms the homogeneity of the clusters obtained by mapping and augments the latter with further detail. As a result, the analysis provides reliable information on the distribution of the editorials over topics, and on differences between the journals. The most striking difference is that Nature devotes more attention to internal science policy issues and Science more to the political influence of scientists. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s11192-010-0205-9) contains supplementary material, which is available to authorized users.

  4. Planetary Geologic Mapping Handbook - 2009

    Science.gov (United States)

    Tanaka, K. L.; Skinner, J. A.; Hare, T. M.

    2009-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces (e.g., Varnes, 1974). Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962 (Hackman, 1962). Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete

  5. FANTOM5 CAGE profiles of human and mouse samples

    NARCIS (Netherlands)

    Noguchi, Shuhei; Arakawa, Takahiro; Fukuda, Shiro; Furuno, Masaaki; Hasegawa, Akira; Hori, Fumi; Ishikawa-Kato, Sachi; Kaida, Kaoru; Kaiho, Ai; Kanamori-Katayama, Mutsumi; Kawashima, Tsugumi; Kojima, Miki; Kubosaki, Atsutaka; Manabe, Ri-ichiroh; Murata, Mitsuyoshi; Nagao-Sato, Sayaka; Nakazato, Kenichi; Ninomiya, Noriko; Nishiyori-Sueki, Hiromi; Noma, Shohei; Saijyo, Eri; Saka, Akiko; Sakai, Mizuho; Simon, Christophe; Suzuki, Naoko; Tagami, Michihira; Watanabe, Shoko; Yoshida, Shigehiro; Arner, Peter; Axton, Richard A.; Babina, Magda; Baillie, J. Kenneth; Barnett, Timothy C.; Beckhouse, Anthony G.; Blumenthal, Antje; Bodega, Beatrice; Bonetti, Alessandro; Briggs, James; Brombacher, Frank; Carlisle, Ailsa J.; Clevers, Hans C.; Davis, Carrie A.; Detmar, Michael; Dohi, Taeko; Edge, Albert S. B.; Edinger, Matthias; Ehrlund, Anna; Ekwall, Karl; Endoh, Mitsuhiro; Enomoto, Hideki; Eslami, Afsaneh; Fagiolini, Michela; Fairbairn, Lynsey; Farach-Carson, Mary C.; Faulkner, Geoffrey J.; Ferrai, Carmelo; Fisher, Malcolm E.; Forrester, Lesley M.; Fujita, Rie; Furusawa, Jun-ichi; Geijtenbeek, Teunis B.; Gingeras, Thomas; Goldowitz, Daniel; Guhl, Sven; Guler, Reto; Gustincich, Stefano; Ha, Thomas J.; Hamaguchi, Masahide; Hara, Mitsuko; Hasegawa, Yuki; Herlyn, Meenhard; Heutink, Peter; Hitchens, Kelly J.; Hume, David A.; Ikawa, Tomokatsu; Ishizu, Yuri; Kai, Chieko; Kawamoto, Hiroshi; Kawamura, Yuki I.; Kempfle, Judith S.; Kenna, Tony J.; Kere, Juha; Khachigian, Levon M.; Kitamura, Toshio; Klein, Sarah; Klinken, S. 
Peter; Knox, Alan J.; Kojima, Soichi; Koseki, Haruhiko; Koyasu, Shigeo; Lee, Weonju; Lennartsson, Andreas; Mackay-sim, Alan; Mejhert, Niklas; Mizuno, Yosuke; Morikawa, Hiromasa; Morimoto, Mitsuru; Moro, Kazuyo; Morris, Kelly J.; Motohashi, Hozumi; Mummery, Christine L.; Nakachi, Yutaka; Nakahara, Fumio; Nakamura, Toshiyuki; Nakamura, Yukio; Nozaki, Tadasuke; Ogishima, Soichi; Ohkura, Naganari; Ohno, Hiroshi; Ohshima, Mitsuhiro; Okada-Hatakeyama, Mariko; Okazaki, Yasushi; Orlando, Valerio; Ovchinnikov, Dmitry A.; Passier, Robert; Patrikakis, Margaret; Pombo, Ana; Pradhan-Bhatt, Swati; Qin, Xian-Yang; Rehli, Michael; Rizzu, Patrizia; Roy, Sugata; Sajantila, Antti; Sakaguchi, Shimon; Sato, Hiroki; Satoh, Hironori; Savvi, Suzana; Saxena, Alka; Schmidl, Christian; Schneider, Claudio; Schulze-Tanzil, Gundula G.; Schwegmann, Anita; Sheng, Guojun; Shin, Jay W.; Sugiyama, Daisuke; Sugiyama, Takaaki; Summers, Kim M.; Takahashi, Naoko; Takai, Jun; Tanaka, Hiroshi; Tatsukawa, Hideki; Tomoiu, Andru; Toyoda, Hiroo; van de Wetering, Marc; van den Berg, Linda M.; Verardo, Roberto; Vijayan, Dipti; Wells, Christine A.; Winteringham, Louise N.; Wolvetang, Ernst; Yamaguchi, Yoko; Yamamoto, Masayuki; Yanagi-Mizuochi, Chiyo; Yoneda, Misako; Yonekura, Yohei; Zhang, Peter G.; Zucchelli, Silvia; Abugessaisa, Imad; Arner, Erik; Harshbarger, Jayson; Kondo, Atsushi; Lassmann, Timo; Lizio, Marina; Sahin, Serkan; Sengstag, Thierry; Severin, Jessica; Shimoji, Hisashi; Suzuki, Masanori; Suzuki, Harukazu; Kawai, Jun; Kondo, Naoto; Itoh, Masayoshi; Daub, Carsten O.; Kasukawa, Takeya; Kawaji, Hideya; Carninci, Piero; Forrest, Alistair R. R.; Hayashizaki, Yoshihide

    2017-01-01

    In the FANTOM5 project, transcription initiation events across the human and mouse genomes were mapped at single base-pair resolution and their frequencies were monitored by CAGE (Cap Analysis of Gene Expression) coupled with single-molecule sequencing. Approximately three thousand samples,

  6. An efficient hole-filling method based on depth map in 3D view generation

    Science.gov (United States)

    Liang, Haitao; Su, Xiu; Liu, Yilin; Xu, Huaiyuan; Wang, Yi; Chen, Xiaodong

    2018-01-01

    A new virtual view is synthesized through depth-image-based rendering (DIBR) using a single color image and its associated depth map in 3D view generation. Holes are unavoidably generated in the 2D-to-3D conversion process. We propose a hole-filling method based on the depth map to address the problem. Firstly, we improve the process of DIBR by proposing a one-to-four (OTF) algorithm. The "z-buffer" algorithm is used to solve the overlap problem. Then, based on the classical patch-based algorithm of Criminisi et al., we propose a hole-filling algorithm that uses the information of the depth map to handle the image after DIBR. In order to improve the accuracy of the virtual image, inpainting starts from the background side. In the calculation of the priority, in addition to the confidence term and the data term, we add a depth term. In the search for the most similar patch in the source region, we define a depth similarity to improve the accuracy of the search. Experimental results show that the proposed method can effectively improve the quality of the 3D virtual view both subjectively and objectively.
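The extended priority can be sketched as the Criminisi-style product of a confidence term and a data term, multiplied by an added depth term so that background-side patches are inpainted first. The exact weighting here is an illustrative assumption, not the paper's formula:

```python
def priority(confidence: float, data_term: float,
             patch_depth: float, max_depth: float) -> float:
    """Inpainting priority P(p) = C(p) * D(p) * Z(p), where the added depth
    term Z(p) in [0, 1] favours patches on the background (far) side.
    Depth convention assumed here: larger value = farther from the camera."""
    z = patch_depth / max_depth
    return confidence * data_term * z

# Two candidate boundary patches with similar confidence/data terms:
# one on the foreground object, one on the background behind the hole.
foreground = priority(confidence=0.9, data_term=0.6, patch_depth=30, max_depth=255)
background = priority(confidence=0.8, data_term=0.6, patch_depth=240, max_depth=255)
print(background > foreground)
```

Without the depth term the foreground patch (slightly higher confidence) would win and foreground texture would bleed into the disocclusion hole; the depth term reverses the order, matching the "inpaint from the background side" rule.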

  7. Comparing 511 keV Attenuation Maps Obtained from Different Energy Mapping Methods for CT Based Attenuation Correction of PET Data

    Directory of Open Access Journals (Sweden)

    Maryam Shirmohammad

    2008-06-01

    Full Text Available Introduction: The advent of dual-modality PET/CT scanners has revolutionized clinical oncology by improving lesion localization and facilitating treatment planning for radiotherapy. In addition, the use of CT images for CT-based attenuation correction (CTAC) decreases the overall scanning time and creates a noise-free attenuation map (μ-map). CTAC methods include scaling, segmentation, hybrid scaling/segmentation, bilinear and dual-energy methods. All CTAC methods require the transformation of CT Hounsfield units (HU) to linear attenuation coefficients (LAC) at 511 keV. The aim of this study is to compare the results of implementing different methods of energy mapping in PET/CT scanners. Materials and Methods: This study was conducted in two phases, the first in a phantom and the second on patient data. In the first phase, a cylindrical phantom with different concentrations of K2HPO4 inserts was CT scanned and the energy mapping methods were applied to it. In the second phase, the different energy mapping methods were applied to several clinical studies and compared to the transmission (TX) image derived using a Ga-68 radionuclide source acquired on the GE Discovery LS PET/CT scanner. Results: An ROI analysis was performed on different positions of the resultant μ-maps and the average μ-value of each ROI was compared to the reference value. The μ-maps obtained for 511 keV, compared to the theoretical values, showed that in the phantom, for low concentrations of K2HPO4, all of these methods produce 511 keV attenuation maps with a small relative difference compared to the gold standard. The relative difference for the scaling, segmentation, hybrid, bilinear and dual-energy methods was 4.92, 3.21, 4.43, 2.24 and 2.29%, respectively. Although for high concentration
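The bilinear energy-mapping idea named above converts each CT number with one of two slopes, since bone-like material attenuates relatively less at 511 keV than at CT energies. A sketch with illustrative placeholder coefficients (not the calibrated values from this study or any scanner):

```python
MU_WATER_511 = 0.096   # linear attenuation coefficient of water at 511 keV, 1/cm

def hu_to_mu_511(hu: float) -> float:
    """Bilinear HU -> 511 keV LAC conversion (illustrative slopes):
    one segment for the air..water range (HU <= 0), a reduced slope
    for bone-like values (HU > 0)."""
    if hu <= 0:
        return MU_WATER_511 * (1.0 + hu / 1000.0)
    # Assumed reduced bone slope: half the soft-tissue slope.
    return MU_WATER_511 * (1.0 + 0.5 * hu / 1000.0)

print(hu_to_mu_511(-1000), hu_to_mu_511(0))  # air -> 0.0, water -> 0.096
```

The break point at HU = 0 (water) and the flatter bone segment are what distinguish the bilinear method from simple global scaling, which over-corrects dense bone when extrapolated linearly.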

  8. A UAV-Based Fog Collector Design for Fine-Scale Aerobiological Sampling

    Science.gov (United States)

    Gentry, Diana; Guarro, Marcello; Demachkie, Isabella Siham; Stumfall, Isabel; Dahlgren, Robert P.

    2017-01-01

Airborne microbes are found throughout the troposphere and into the stratosphere. Knowing how the activity of airborne microorganisms can alter water, carbon, and other geochemical cycles is vital to a full understanding of local and global ecosystems. Just as on the land or in the ocean, atmospheric regions vary in habitability; the underlying geochemical, climatic, and ecological dynamics must be characterized at different scales to be effectively modeled. Most aerobiological studies have focused on a high level: 'How high are airborne microbes found?' and 'How far can they travel?' Most fog and cloud water studies collect from stationary ground stations (point) or along flight transects (1D). To complement and provide context for this data, we have designed a UAV-based modified fog and cloud water collector to retrieve 4D-resolved samples for biological and chemical analysis. Our design uses a passive impacting collector hanging from a rigid rod suspended between two multi-rotor UAVs. The suspension design reduces the effect of turbulence and potential for contamination from the UAV downwash. The UAVs are currently modeled in a leader-follower configuration, taking advantage of recent advances in modular UAVs, UAV swarming, and flight planning. The collector itself is a hydrophobic mesh. Materials including Tyvek, PTFE, nylon, and polypropylene monofilament fabricated via laser cutting, CNC knife, or 3D printing were characterized for droplet collection efficiency using a benchtop atomizer and particle counter. Because the meshes can be easily and inexpensively fabricated, a set can be pre-sterilized and brought to the field for 'hot swapping' to decrease cross-contamination between flight sessions or use as negative controls. An onboard sensor and logging system records the time and location of each sample; when combined with flight tracking data, the samples can be resolved into a 4D volumetric map of the fog bank.
Collected samples can be returned to the lab for

  9. A symmetric image encryption scheme based on 3D chaotic cat maps

    International Nuclear Information System (INIS)

    Chen Guanrong; Mao Yaobin; Chui, Charles K.

    2004-01-01

    Encryption of images is different from that of texts due to some intrinsic features of images such as bulk data capacity and high redundancy, which are generally difficult to handle by traditional methods. Due to the exceptionally desirable properties of mixing and sensitivity to initial conditions and parameters of chaotic maps, chaos-based encryption has suggested a new and efficient way to deal with the intractable problem of fast and highly secure image encryption. In this paper, the two-dimensional chaotic cat map is generalized to 3D for designing a real-time secure symmetric encryption scheme. This new scheme employs the 3D cat map to shuffle the positions (and, if desired, grey values as well) of image pixels and uses another chaotic map to confuse the relationship between the cipher-image and the plain-image, thereby significantly increasing the resistance to statistical and differential attacks. Thorough experimental tests are carried out with detailed analysis, demonstrating the high security and fast encryption speed of the new scheme
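The position-shuffling stage can be illustrated with the classic 2D cat map that the paper generalizes to 3D. A minimal sketch of the pixel permutation, where the integer parameters `a` and `b` act as part of the key:

```python
import numpy as np

def cat_map_shuffle(img, a=1, b=1, rounds=1):
    """Permute the pixel positions of a square image with Arnold's cat map
    (x, y) -> (x + a*y, b*x + (a*b + 1)*y) mod n. The map matrix has
    determinant 1, so it is a bijection and the shuffle is invertible."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "cat map needs a square image"
    out = img.copy()
    for _ in range(rounds):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        xn = (x + a * y) % n
        yn = (b * x + (a * b + 1) * y) % n
        shuffled = np.empty_like(out)
        shuffled[xn, yn] = out[x, y]  # move each pixel to its mapped position
        out = shuffled
    return out
```

The shuffle only rearranges pixels (a permutation), which is why the scheme pairs it with a second chaotic map for diffusion of the grey values.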

  10. A 2.5D Map-Based Mobile Robot Localization via Cooperation of Aerial and Ground Robots.

    Science.gov (United States)

    Nam, Tae Hyeon; Shim, Jae Hong; Cho, Young Im

    2017-11-25

Recently, there has been increasing interest in studying the task coordination of aerial and ground robots. When a robot begins navigation in an unknown area, it has no information about the surrounding environment. Accordingly, for robots to perform tasks based on location information, they need a simultaneous localization and mapping (SLAM) process that uses sensor information to draw a map of the environment, while simultaneously estimating the current location of the robot on the map. This paper presents a localization method based on cooperation between aerial and ground robots in an indoor environment. The proposed method allows a ground robot to reach its destination accurately by using a 2.5D elevation map built by a low-cost RGB-D (Red Green and Blue-Depth) sensor and a 2D laser sensor attached to an aerial robot. A 2.5D elevation map is formed by projecting the height information of an obstacle, obtained from the RGB-D sensor's depth data, onto a grid map generated using the 2D laser sensor and scan matching. Experimental results demonstrate the effectiveness of the proposed method in terms of accuracy of location recognition and computing speed.
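The projection step can be sketched as dropping 3D points from the RGB-D sensor onto a 2D grid and keeping the maximum height per cell. This is a simplified, hypothetical version of the paper's map-building step; the cell size and the assumption that points are already in the grid frame are illustrative:

```python
import numpy as np

def build_elevation_map(points, grid_size, cell=0.05):
    """Project 3D points (x, y, z) onto a 2D grid, keeping the maximum
    height per cell -> a 2.5D elevation map. Points outside the grid
    extent are ignored."""
    elev = np.zeros((grid_size, grid_size))
    for x, y, z in points:
        i, j = int(x / cell), int(y / cell)
        if 0 <= i < grid_size and 0 <= j < grid_size:
            elev[i, j] = max(elev[i, j], z)
    return elev
```

In the paper this elevation layer is fused with the occupancy grid built from the 2D laser scans, so the ground robot can distinguish traversable cells from obstacles of significant height.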

  11. A 2.5D Map-Based Mobile Robot Localization via Cooperation of Aerial and Ground Robots

    Directory of Open Access Journals (Sweden)

    Tae Hyeon Nam

    2017-11-01

Full Text Available Recently, there has been increasing interest in studying the task coordination of aerial and ground robots. When a robot begins navigation in an unknown area, it has no information about the surrounding environment. Accordingly, for robots to perform tasks based on location information, they need a simultaneous localization and mapping (SLAM) process that uses sensor information to draw a map of the environment, while simultaneously estimating the current location of the robot on the map. This paper presents a localization method based on cooperation between aerial and ground robots in an indoor environment. The proposed method allows a ground robot to reach its destination accurately by using a 2.5D elevation map built by a low-cost RGB-D (Red Green and Blue-Depth) sensor and a 2D laser sensor attached to an aerial robot. A 2.5D elevation map is formed by projecting the height information of an obstacle, obtained from the RGB-D sensor's depth data, onto a grid map generated using the 2D laser sensor and scan matching. Experimental results demonstrate the effectiveness of the proposed method in terms of accuracy of location recognition and computing speed.

  12. Network-level accident-mapping: Distance based pattern matching using artificial neural network.

    Science.gov (United States)

    Deka, Lipika; Quddus, Mohammed

    2014-04-01

The objective of an accident-mapping algorithm is to snap traffic accidents onto the correct road segments. Assigning accidents to the correct segments facilitates several key analyses in accident research, including the identification of accident hot-spots, network-level risk mapping and segment-level accident risk modelling. Existing risk-mapping algorithms have some severe limitations: (i) they are not easily 'transferable', as the algorithms are specific to given accident datasets; (ii) they do not perform well in all road-network environments, such as areas of dense road network; and (iii) the methods used do not perform well in addressing the inaccuracies inherent in the recorded data across different types of road environment. The purpose of this paper is to develop a new accident-mapping algorithm based on the common variables observed in most accident databases (e.g. road name and type, direction of vehicle movement before the accident and recorded accident location). The challenges here are to: (i) develop a method that takes into account uncertainties inherent to the recorded traffic accident data and the underlying digital road network data, (ii) accurately determine the type and proportion of inaccuracies, and (iii) develop a robust algorithm that can be adapted for any accident dataset and road network of varying complexity. In order to overcome these challenges, a distance-based pattern-matching approach is used to identify the correct road segment. This is based on vectors containing feature values that are common to the accident data and the network data. Since each feature does not contribute equally towards the identification of the correct road segment, an ANN approach using a single-layer perceptron is used to assist in "learning" the relative importance of each feature in the distance calculation and hence the correct link identification.
The performance of the developed algorithm was evaluated based on a reference accident dataset from the UK, confirming that
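The distance-based matching can be sketched as a weighted feature distance between an accident record and each candidate road segment. In the paper the weights are learned by a single-layer perceptron; here they are fixed, illustrative values:

```python
import numpy as np

def weighted_distance(acc_feats, seg_feats, weights):
    """Weighted L1 distance between an accident's feature vector and a
    candidate segment's feature vector. Higher weights make a feature
    more decisive in the match."""
    diff = np.abs(np.asarray(acc_feats, dtype=float) - np.asarray(seg_feats, dtype=float))
    return float(np.dot(weights, diff))

def snap_to_segment(acc_feats, candidates, weights):
    """Snap the accident onto the candidate segment with the smallest
    weighted distance; returns the candidate's index."""
    dists = [weighted_distance(acc_feats, c, weights) for c in candidates]
    return int(np.argmin(dists))
```

With learned weights, features that reliably identify the correct link (e.g. road type agreement) dominate the distance, while noisy features (e.g. the recorded coordinates) contribute less.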

  13. SIMULATION AND PREDICTION OF THE PROCESS BASED ON THE GENERAL LOGISTIC MAPPING

    Directory of Open Access Journals (Sweden)

    V. V. Skalozub

    2013-11-01

Full Text Available Purpose. The aim of the research is to build a model of the generalized logistic mapping and to assess the possibilities of its use for forming a mathematical description, as well as operational forecasts of parameters, of complex dynamic processes described by time series. Methodology. The research results are obtained on the basis of mathematical modeling and simulation of nonlinear systems using the tools of chaotic dynamics. Findings. A model of the generalized logistic mapping, which is used to interpret the characteristics of dynamic processes, was proposed. We consider some examples of representations of processes based on the generalized logistic mapping, varying the values of the model parameters. Procedures were proposed for modeling and interpreting the data on the investigated processes, represented by time series, as well as for operational forecasting of parameters using the generalized logistic mapping model. Originality. The paper proposes an improved mathematical model, the generalized logistic mapping, designed for the study of nonlinear discrete dynamic processes. Practical value. The research carried out using the generalized logistic mapping on railway transport processes, in particular in assessing traffic volume parameters, indicates its great potential for practical application in solving problems of analysis, modeling and forecasting of complex nonlinear discrete dynamical processes. The proposed model can be used under conditions of uncertainty, irregularity and manifestations of chaotic behavior in technical, economic and other processes, including railway ones.
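One common way to generalize the logistic map is to raise the state to a power; the abstract does not give the exact functional form used in the paper, so treat the following as an illustrative sketch only:

```python
def generalized_logistic(x0, r, p=1.0, steps=100):
    """Iterate a generalized logistic map x_{n+1} = r * x^p * (1 - x^p).
    With p = 1 this reduces to the classic logistic map; other values of
    p change the shape of the hump and hence the dynamics."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(r * x**p * (1 - x**p))
    return xs
```

For forecasting, the model parameters (`r`, `p`) would be fitted so that the iterated map best reproduces the observed time series, and the next iterate then serves as the operational forecast.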

  14. Large-Scale Mapping and Predictive Modeling of Submerged Aquatic Vegetation in a Shallow Eutrophic Lake

    Directory of Open Access Journals (Sweden)

    Karl E. Havens

    2002-01-01

Full Text Available A spatially intensive sampling program was developed for mapping the submerged aquatic vegetation (SAV) over an area of approximately 20,000 ha in a large, shallow lake in Florida, U.S. The sampling program integrates Geographic Information System (GIS) technology with traditional field sampling of SAV and has the capability of producing robust vegetation maps under a wide range of conditions, including high turbidity, variable depth (0 to 2 m), and variable sediment types. Based on sampling carried out in August-September 2000, we measured 1,050 to 4,300 ha of vascular SAV species and approximately 14,000 ha of the macroalga Chara spp. The results were similar to those reported in the early 1990s, when the last large-scale SAV sampling occurred. Occurrence of Chara was strongly associated with peat sediments, and maximal depths of occurrence varied between sediment types (mud, sand, rock, and peat). A simple model of Chara occurrence, based only on water depth, had an accuracy of 55%. It predicted occurrence of Chara over large areas where the plant actually was not found. A model based on sediment type and depth had an accuracy of 75% and produced a spatial map very similar to that based on observations. While this approach needs to be validated with independent data in order to test its general utility, we believe it may have application elsewhere. The simple modeling approach could serve as a coarse-scale tool for evaluating effects of water level management on Chara populations.

  15. Large-scale mapping and predictive modeling of submerged aquatic vegetation in a shallow eutrophic lake.

    Science.gov (United States)

    Havens, Karl E; Harwell, Matthew C; Brady, Mark A; Sharfstein, Bruce; East, Therese L; Rodusky, Andrew J; Anson, Daniel; Maki, Ryan P

    2002-04-09

    A spatially intensive sampling program was developed for mapping the submerged aquatic vegetation (SAV) over an area of approximately 20,000 ha in a large, shallow lake in Florida, U.S. The sampling program integrates Geographic Information System (GIS) technology with traditional field sampling of SAV and has the capability of producing robust vegetation maps under a wide range of conditions, including high turbidity, variable depth (0 to 2 m), and variable sediment types. Based on sampling carried out in August-September 2000, we measured 1,050 to 4,300 ha of vascular SAV species and approximately 14,000 ha of the macroalga Chara spp. The results were similar to those reported in the early 1990s, when the last large-scale SAV sampling occurred. Occurrence of Chara was strongly associated with peat sediments, and maximal depths of occurrence varied between sediment types (mud, sand, rock, and peat). A simple model of Chara occurrence, based only on water depth, had an accuracy of 55%. It predicted occurrence of Chara over large areas where the plant actually was not found. A model based on sediment type and depth had an accuracy of 75% and produced a spatial map very similar to that based on observations. While this approach needs to be validated with independent data in order to test its general utility, we believe it may have application elsewhere. The simple modeling approach could serve as a coarse-scale tool for evaluating effects of water level management on Chara populations.
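The sediment-plus-depth model can be sketched as a simple rule set: predict Chara presence when the water depth is within the maximal occurrence depth for the cell's sediment type. The depth thresholds below are hypothetical, chosen only to illustrate the structure of such a model:

```python
def predict_chara(depth_m, sediment):
    """Rule-based presence model: Chara is predicted present when the
    depth does not exceed the (hypothetical) maximal occurrence depth
    for the given sediment type. Unknown sediments predict absence."""
    max_depth = {"peat": 2.0, "mud": 1.2, "sand": 1.5, "rock": 1.0}
    return depth_m <= max_depth.get(sediment, 0.0)
```

Applying such a rule cell-by-cell over the GIS grid yields the predicted spatial map that the study compared against the observation-based map.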

  16. A universal algorithm to generate pseudo-random numbers based on uniform mapping as homeomorphism

    International Nuclear Information System (INIS)

    Fu-Lai, Wang

    2010-01-01

A specific uniform map is constructed as a homeomorphism mapping chaotic time series into [0,1] to obtain sequences of standard uniform distribution. Under the uniform map, a chaotic orbit and the sequence obtained from it are topologically equivalent to each other, so the map preserves most dynamic properties of chaotic systems, such as permutation entropy. Based on the uniform map, a universal algorithm to generate pseudo-random numbers is proposed, and the pseudo-random series is tested to follow the standard 0–1 random distribution both theoretically and experimentally. The algorithm is not complex, so it does not impose high requirements on computer hardware, and thus computation is fast. The method not only extends the parameter space but also avoids the drawback of a small function space caused by constraints on the chaotic maps used to generate pseudo-random numbers. The algorithm can be applied to any chaotic system and can produce pseudo-random sequences of high quality, and thus can serve as a good universal pseudo-random number generator. (general)
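For the logistic map with r = 4 the uniformizing homeomorphism is known in closed form, u = (2/π)·asin(√x), which sends the map's invariant density to the uniform distribution on [0, 1]. This gives one concrete instance of the idea; the paper's construction is more general and applies to arbitrary chaotic maps:

```python
import math

def logistic_uniform_prng(seed=0.123456, n=10):
    """Generate pseudo-random numbers by iterating the logistic map
    (r = 4) and passing each state through the uniformizing map
    u = (2/pi) * asin(sqrt(x))."""
    x = seed
    out = []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        x = min(max(x, 0.0), 1.0)  # guard against floating-point drift
        out.append((2.0 / math.pi) * math.asin(math.sqrt(x)))
    return out
```

Because the transform is a homeomorphism, the output sequence inherits the orbit's dynamical properties while its marginal distribution becomes uniform, which is exactly the property a PRNG needs.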

  17. A universal algorithm to generate pseudo-random numbers based on uniform mapping as homeomorphism

    Science.gov (United States)

    Wang, Fu-Lai

    2010-09-01

A specific uniform map is constructed as a homeomorphism mapping chaotic time series into [0,1] to obtain sequences of standard uniform distribution. Under the uniform map, a chaotic orbit and the sequence obtained from it are topologically equivalent to each other, so the map preserves most dynamic properties of chaotic systems, such as permutation entropy. Based on the uniform map, a universal algorithm to generate pseudo-random numbers is proposed, and the pseudo-random series is tested to follow the standard 0-1 random distribution both theoretically and experimentally. The algorithm is not complex, so it does not impose high requirements on computer hardware, and thus computation is fast. The method not only extends the parameter space but also avoids the drawback of a small function space caused by constraints on the chaotic maps used to generate pseudo-random numbers. The algorithm can be applied to any chaotic system and can produce pseudo-random sequences of high quality, and thus can serve as a good universal pseudo-random number generator.

  18. The research of selection model based on LOD in multi-scale display of electronic map

    Science.gov (United States)

    Zhang, Jinming; You, Xiong; Liu, Yingzhen

    2008-10-01

This paper proposes a selection model based on LOD (level of detail) to aid the display of electronic maps. The ratio of display scale to map scale is regarded as the LOD operator. The categorization rule, classification rule, elementary rule and spatial geometry character rule for setting the LOD operator are also summarized.
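The LOD operator and a selection rule built on it can be sketched as follows; scales are given as ratios (e.g. 1:25,000 → 1/25000) and the per-feature thresholds are hypothetical:

```python
def lod_operator(display_scale, map_scale):
    """The LOD operator: the ratio of display scale to map scale.
    A ratio > 1 means the map is shown at larger than source scale."""
    return display_scale / map_scale

def select_features(features, display_scale, map_scale):
    """Keep a feature class only when the LOD operator reaches its
    minimum display ratio. `features` is a list of (name, threshold)
    pairs with illustrative thresholds."""
    r = lod_operator(display_scale, map_scale)
    return [name for name, thr in features if r >= thr]
```

For example, zooming from 1:50,000 source data to a 1:10,000 display gives an operator of 5, so feature classes whose thresholds exceed 5 would be suppressed.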

  19. Shared protection based virtual network mapping in space division multiplexing optical networks

    Science.gov (United States)

    Zhang, Huibin; Wang, Wei; Zhao, Yongli; Zhang, Jie

    2018-05-01

Space Division Multiplexing (SDM) has been introduced to improve the capacity of optical networks. In SDM optical networks, there are multiple cores/modes in each fiber link, and spectrum resources are multiplexed in both the frequency and core/mode dimensions. Enabled by network virtualization technology, one SDM optical network substrate can be shared by several virtual network operators. As with point-to-point connection services, virtual networks (VNs) also need a certain level of survivability to guard against network failures. Based on customers' heterogeneous requirements for the survivability of their virtual networks, this paper studies the shared-protection-based VN mapping problem and proposes a Minimum Free Frequency Slots (MFFS) mapping algorithm to improve spectrum efficiency. Simulation results show that the proposed algorithm optimizes SDM optical networks significantly in terms of blocking probability and spectrum utilization.
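The abstract does not give the algorithm's details, but the core MFFS idea — prefer the candidate core that can fit the demand while having the fewest free frequency slots — can be sketched as a best-fit selection over the slot bitmaps:

```python
def mffs_select(cores, demand):
    """Among candidate cores (each a list of booleans, True = free slot),
    pick the index of the core that can fit `demand` contiguous slots
    while having the fewest free slots overall -- a best-fit heuristic
    in the spirit of MFFS. Returns None if no core fits."""
    def fits(core):
        run = 0
        for free in core:
            run = run + 1 if free else 0
            if run >= demand:
                return True
        return False
    candidates = [(sum(c), i) for i, c in enumerate(cores) if fits(c)]
    return min(candidates)[1] if candidates else None
```

Packing demands into already-fragmented cores keeps large contiguous blocks available on other cores, which is what improves spectrum utilization and lowers blocking probability.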

  20. Mapping query terms to data and schema using content based similarity search in clinical information systems.

    Science.gov (United States)

    Safari, Leila; Patrick, Jon D

    2013-01-01

This paper reports on the issues in mapping the terms of a query to the field names of the schema of an Entity Relationship (ER) model, or to the data part of the Entity Attribute Value (EAV) model, using a similarity-based Top-K algorithm in a clinical information system, together with an extension of EAV mapping for medication names. In addition, the details of the mapping algorithm and the required pre-processing, including NLP (Natural Language Processing) tasks to prepare resources for mapping, are explained. The experimental results on an example clinical information system demonstrate mapping accuracy of more than 84 per cent. The results will be integrated into our proposed Clinical Data Analytics Language (CliniDAL) to automate the mapping process in CliniDAL.

  1. A novel image encryption scheme based on the ergodicity of baker map

    Science.gov (United States)

    Ye, Ruisong; Chen, Yonghong

    2012-01-01

Thanks to the exceptionally good properties of chaotic systems, such as sensitivity to initial conditions and control parameters, pseudo-randomness and ergodicity, chaos-based image encryption algorithms have been widely studied and developed in recent years. A novel digital image encryption scheme based on the chaotic ergodicity of the Baker map is proposed in this paper. Unlike traditional encryption schemes based on the Baker map, we permute the pixel positions by their corresponding order numbers, derived from the approximating points in one chaotic orbit. To enhance the resistance to statistical and differential attacks, a diffusion process is suggested as well in the proposed scheme. The proposed scheme enlarges the key space significantly to resist brute-force attacks. Additionally, the distribution of gray values in the cipher-image has random-like behavior to resist statistical analysis. The proposed scheme is robust against cropping, tampering and noising attacks as well. It therefore offers a highly secure and efficient way for real-time image encryption and transmission in practice.
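The ordering trick — permuting pixels by the order numbers (ranks) of points in a chaotic orbit — can be sketched as follows. For brevity a logistic map stands in for the Baker map used in the paper; the seed and parameter act as the key:

```python
import numpy as np

def chaotic_permutation(n, seed=0.37, r=3.99):
    """Derive a permutation of n pixel positions from the ranks of the
    points in a chaotic orbit: sorting the orbit values and taking the
    resulting index order gives a key-dependent shuffle."""
    x = seed
    orbit = []
    for _ in range(n):
        x = r * x * (1 - x)  # logistic map as a stand-in chaotic orbit
        orbit.append(x)
    return np.argsort(orbit)  # order numbers of the orbit points

def permute_pixels(img_flat, perm):
    """Apply the permutation to a flattened image."""
    return img_flat[perm]
```

Because `argsort` of the orbit is key-dependent and highly sensitive to the seed, a tiny key change yields an entirely different permutation, which is the source of the enlarged key space.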

  2. Creating Geologically Based Radon Potential Maps for Kentucky

    Science.gov (United States)

    Overfield, B.; Hahn, E.; Wiggins, A.; Andrews, W. M., Jr.

    2017-12-01

    Radon potential in the United States, Kentucky in particular, has historically been communicated using a single hazard level for each county; however, physical phenomena are not controlled by administrative boundaries, so single-value county maps do not reflect the significant variations in radon potential in each county. A more accurate approach uses bedrock geology as a predictive tool. A team of nurses, health educators, statisticians, and geologists partnered to create 120 county maps showing spatial variations in radon potential by intersecting residential radon test kit results (N = 60,000) with a statewide 1:24,000-scale bedrock geology coverage to determine statistically valid radon-potential estimates for each geologic unit. Maps using geology as a predictive tool for radon potential are inherently more detailed than single-value county maps. This mapping project revealed that areas in central and south-central Kentucky with the highest radon potential are underlain by shales and karstic limestones.
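The per-unit aggregation step can be sketched as grouping radon test results by the geologic unit each address intersects; the choice of the median as the summary statistic here is an assumption, not the project's documented method:

```python
from statistics import median

def radon_potential_by_unit(tests):
    """Aggregate residential radon test results by intersected geologic
    unit and report a per-unit summary (median, as an illustrative
    statistic). `tests` is a list of (unit_name, value) pairs."""
    by_unit = {}
    for unit, value in tests:
        by_unit.setdefault(unit, []).append(value)
    return {u: median(v) for u, v in by_unit.items()}
```

Mapping these per-unit estimates back onto the 1:24,000-scale geology polygons is what produces the within-county variation that single-value county maps cannot show.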

  3. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain

    Science.gov (United States)

    2011-01-01

Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This

  4. Development of spatial density maps based on geoprocessing web services: application to tuberculosis incidence in Barcelona, Spain.

    Science.gov (United States)

    Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A

    2011-11-29

Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact.
This study is an attempt to demonstrate how web
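A density map over geocoded cases can be sketched as a count of points within a fixed radius of each grid-cell centre. This is a simplified stand-in for the geoprocessing service's estimator; the cell size and search radius are arbitrary illustration values:

```python
import numpy as np

def density_grid(points, cell=100.0, nx=50, ny=50, radius=300.0):
    """For each grid-cell centre, count the geocoded cases (x, y in the
    same projected units as the grid) falling within `radius`. The
    resulting array is the raw density surface to be rendered as a map."""
    grid = np.zeros((ny, nx))
    pts = np.asarray(points, dtype=float)
    for j in range(ny):
        for i in range(nx):
            cx, cy = (i + 0.5) * cell, (j + 0.5) * cell
            if len(pts):
                d2 = (pts[:, 0] - cx) ** 2 + (pts[:, 1] - cy) ** 2
                grid[j, i] = np.sum(d2 <= radius ** 2)
    return grid
```

A production service would typically replace the flat circular count with a smooth kernel (e.g. Gaussian) and expose the computation behind a REST endpoint, as the paper's architecture describes.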

  5. A Electronic Map Data Model Based on PDF

    Science.gov (United States)

    Zhou, Xiaodong; Yang, Chuncheng; Meng, Nina; Peng, Peng

    2018-05-01

In this paper, we propose the PDFEMAP (PDF electronic map), a new kind of electronic map product, in view of the current situation and demands of electronic map use. We first give the definition and characteristics of the PDFEMAP, followed by a detailed description of the data model and the method for generating it, and finally describe application modes of the PDFEMAP, whose feasibility and effectiveness are verified.

  6. Design of Intelligent Transportation Inquiry System Based on MapX in the Environment of VC++

    Directory of Open Access Journals (Sweden)

    Cheng Juan

    2016-01-01

Full Text Available This paper applies MapInfo, a professional GIS software tool, for integrated secondary development combined with electronic maps, using Visual C++ as the visual development platform and integrating it with MapX, an ActiveX control of MapInfo. The paper designs an inquiry system for intelligent transportation, including a query system for road information, a query system for bus information, and a query system for district information. Spatial analysis and query functions can be carried out based on GIS. SQL Server is adopted to manage the attribute data; through data binding, the attribute data in SQL Server and the vector map data are combined.

  7. Recognition of building group patterns in topographic maps based on graph partitioning and random forest

    Science.gov (United States)

    He, Xianjin; Zhang, Xinchang; Xin, Qinchuan

    2018-02-01

    Recognition of building group patterns (i.e., the arrangement and form exhibited by a collection of buildings at a given mapping scale) is important to the understanding and modeling of geographic space and is hence essential to a wide range of downstream applications such as map generalization. Most of the existing methods develop rigid rules based on the topographic relationships between building pairs to identify building group patterns and thus their applications are often limited. This study proposes a method to identify a variety of building group patterns that allow for map generalization. The method first identifies building group patterns from potential building clusters based on a machine-learning algorithm and further partitions the building clusters with no recognized patterns based on the graph partitioning method. The proposed method is applied to the datasets of three cities that are representative of the complex urban environment in Southern China. Assessment of the results based on the reference data suggests that the proposed method is able to recognize both regular (e.g., the collinear, curvilinear, and rectangular patterns) and irregular (e.g., the L-shaped, H-shaped, and high-density patterns) building group patterns well, given that the correctness values are consistently nearly 90% and the completeness values are all above 91% for three study areas. The proposed method shows promises in automated recognition of building group patterns that allows for map generalization.

  8. Raman spectroscopic analysis of real samples: Brazilian bauxite mineralogy

    Science.gov (United States)

    Faulstich, Fabiano Richard Leite; Castro, Harlem V.; de Oliveira, Luiz Fernando Cappa; Neumann, Reiner

    2011-10-01

In this investigation, Raman spectroscopy with 1064 and 632.8 nm excitation was used to analyse real mineral samples of bauxite ore from mines of Northern Brazil, together with Raman mapping and X-ray diffraction. The results show clearly that micro-Raman spectroscopy is a powerful tool for the identification of all the minerals usually found in bauxites: gibbsite, kaolinite, goethite, hematite, anatase and quartz. Bulk samples can also be analysed, and FT-Raman is more adequate due to its better signal-to-noise ratio and representativity, although it is not efficient for kaolinite. The identification of fingerprint vibrations for all the minerals allows the acquisition of Raman-based chemical maps, potentially powerful tools for process mineralogy applied to bauxite ores.

  9. A Distributed Public Key Infrastructure Based on Threshold Cryptography for the HiiMap Next Generation Internet Architecture

    Directory of Open Access Journals (Sweden)

    Oliver Hanka

    2011-02-01

Full Text Available In this article, a security extension for the HiiMap Next Generation Internet Architecture is presented. We describe a public key infrastructure that is integrated into the mapping infrastructure of the locator/identifier-split addressing scheme. The security approach is based on Threshold Cryptography, which enables sharing of keys among the mapping servers. Hence, a more trustworthy and fair approach for a Next Generation Internet Architecture, as compared to the state-of-the-art approach, is fostered. Additionally, we give an evaluation based on IETF AAA recommendations for security-related systems.

  10. A Fast and Robust Feature-Based Scan-Matching Method in 3D SLAM and the Effect of Sampling Strategies

    Directory of Open Access Journals (Sweden)

    Cihan Ulas

    2013-11-01

Full Text Available Simultaneous localization and mapping (SLAM) plays an important role in fully autonomous systems when a GNSS (global navigation satellite system) is not available. Studies in both 2D indoor and 3D outdoor SLAM are based on the appearance of environments and utilize scan-matching methods to find the rigid body transformation parameters between two consecutive scans. In this study, a fast and robust scan-matching method based on feature extraction is introduced. Since the method is based on the matching of certain geometric structures, like plane segments, the outliers and noise in the point cloud are considerably eliminated. Therefore, the proposed scan-matching algorithm is more robust than conventional methods. Besides, the registration time and the number of iterations are significantly reduced, since the number of matching points is efficiently decreased. As a scan-matching framework, an improved version of the normal distribution transform (NDT) is used. The probability density functions (PDFs) of the reference scan are generated as in the traditional NDT, and the feature extraction, based on stochastic plane detection, is applied only to the input scan. Using an experimental dataset from an outdoor environment (a university campus), we obtained satisfactory performance results. Moreover, the feature extraction part of the algorithm is considered as a special sampling strategy for scan-matching and compared to other sampling strategies, such as random sampling and grid-based sampling, the latter of which was first used in the NDT. Thus, this study also shows the effect of subsampling on the performance of the NDT.
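The grid-based sampling strategy the study compares against can be sketched as voxel subsampling — keep one point per occupied voxel, which thins dense regions while preserving coverage:

```python
import numpy as np

def grid_subsample(points, voxel=0.5):
    """Grid-based (voxel) subsampling: assign each 3D point to a voxel
    of side `voxel` and keep only the first point seen per voxel."""
    pts = np.asarray(points, dtype=float)
    keys = np.floor(pts / voxel).astype(int)
    seen = {}
    for p, k in zip(points, map(tuple, keys)):
        seen.setdefault(k, p)  # keep the first representative per voxel
    return list(seen.values())
```

Feature-based sampling, as proposed in the paper, goes further: instead of keeping arbitrary representatives it keeps only points belonging to detected plane segments, which is why it removes outliers as a side effect.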

  11. Spectral properties and ASTER-based alteration mapping of Masahim volcano facies, SE Iran

    Science.gov (United States)

    Tayebi, Mohammad H.; Tangestani, Majid H.; Vincent, Robert K.; Neal, Devin

    2014-10-01

    This study applies Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data and the Mixture Tuned Matched Filtering (MTMF) algorithm to map the sub-pixel distribution of alteration minerals associated with the Masahim volcano, SE Iran, in order to understand the spatial relationship between alteration minerals and volcano facies. Investigations of the alteration mineralogy were conducted using field spectroscopy, X-ray diffraction (XRD) analysis and ASTER Short Wave Infrared (SWIR) spectral data. In order to spectrally characterize the stratovolcano deposits, lithological units and alteration minerals, the volcano was divided into three facies: the Central, Proximal, and Medial-distal facies. The reflectance spectra of rock samples show absorption features of a number of minerals including white mica, kaolinite, montmorillonite, illite, goethite, hematite, jarosite, opal, and chlorite. The end-members of key alteration minerals including sericite (phyllic zone), kaolinite (argillic zone) and chlorite (propylitic zone) were extracted from imagery using the Pixel Purity Index (PPI) method and were used to map alteration minerals. Accuracy assessment through field observations was used to verify the fraction maps. The results showed that the most prominently altered rocks are situated in the central facies of the volcano. The alteration minerals were discriminated with coefficients of determination (R²) of 0.74, 0.81, and 0.68 for kaolinite, sericite, and chlorite, respectively. The results of this study have the potential to refine the map of alteration zones in the Masahim volcano.

  12. Developing a scientific procedure for community based hazard mapping and risk mitigation

    Science.gov (United States)

    Verrier, M.

    2011-12-01

    As an international exchange student from the Geological Sciences Department at San Diego State University (SDSU), I joined the KKN-PPM program at Universitas Gadjah Mada (UGM), Yogyakarta, Indonesia, in July 2011 for 12 days (July 4th to July 16th) of its two-month duration (July 4th to August 25th). The KKN-PPM group I was attached to was designated 154 and was focused on Plosorejo Village, Karanganyar, Kerjo, Central Java, Indonesia. The mission of KKN-PPM 154 was to survey Plosorejo village for existing landslides, to generate a simple hazard susceptibility map that can be understood by local villagers, and then to begin dissemination of that map into the community. To generate our susceptibility map, we first conducted a geological survey of the existing landslides in the field study area, with a focus on determining landslide triggers and gauging areas' susceptibility to future landslides. The methods for gauging susceptibility included lithological observation, the presence of linear cracking, and visible loss of structural integrity in structures such as villager homes, as well as collaboration with local residents and with the local rescue and response team. Three colors were used to represent susceptibility: green, where there is no immediate danger of landslide damage; orange, where transportation routes are at risk of being disrupted by landslides; and red, where imminent landslide potential puts a home in direct danger. The landslide inventory and susceptibility data were compiled into digital media such as CorelDraw, ArcGIS and Google Earth. Once a technical map was generated, we presented it to the village leadership for confirmation and modification based on their experience. Finally, we began to use the technical susceptibility map to draft evacuation routes and meeting points in the event of landslides, as well as simple susceptibility maps that can be understood and utilized by local villagers. Landslide mitigation

  13. Sandwich mapping of schistosomiasis risk in Anhui Province, China.

    Science.gov (United States)

    Hu, Yi; Bergquist, Robert; Lynn, Henry; Gao, Fenghua; Wang, Qizhi; Zhang, Shiqing; Li, Rui; Sun, Liqian; Xia, Congcong; Xiong, Chenglong; Zhang, Zhijie; Jiang, Qingwu

    2015-06-03

    Schistosomiasis mapping using data obtained from parasitological surveys is frequently used in planning and evaluation of disease control strategies. The available geostatistical approaches are, however, subject to the assumption of stationarity, i.e., that the joint probability distribution of the underlying stochastic process does not change when shifted in space. As this is impractical for large areas, we introduce here the sandwich method, the basic idea of which is to divide the study area (with its attributes) into homogeneous subareas and estimate the values for the reporting units using spatial stratified sampling. The sandwich method was applied to map the county-level prevalence of schistosomiasis japonica in Anhui Province, China, based on parasitological data collected from sample villages and land use data. We first mapped the county-level prevalence using the sandwich method, then compared our findings with block Kriging. The sandwich estimates ranged from 0.17 to 0.21% with a lower level of uncertainty, while the Kriging estimates varied from 0 to 0.97% with a higher level of uncertainty, indicating that the former is smoother and more stable than the latter. Aside from supporting various forms of reporting units, the sandwich method has the particular merit of a simple model assumption coupled with full utilization of sample data. It performs well when a disease presents stratified heterogeneity over space.
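
    The stratified estimation step at the heart of this approach can be illustrated with a toy example: stratum means are estimated from sampled villages and then propagated to a reporting unit according to its area composition. All numbers below are hypothetical:

```python
# Hypothetical sample prevalence observations keyed by stratum (e.g., land-use class)
samples = {
    "paddy":  [0.30, 0.25, 0.35],
    "upland": [0.10, 0.08],
    "urban":  [0.02, 0.04, 0.03],
}

# Stratum means estimated from the sample villages
stratum_mean = {s: sum(v) / len(v) for s, v in samples.items()}

# A reporting unit (county) described by the share of its area in each stratum
county_composition = {"paddy": 0.5, "upland": 0.3, "urban": 0.2}

# Sandwich-style estimate: propagate stratum means to the reporting unit
county_estimate = sum(w * stratum_mean[s] for s, w in county_composition.items())
print(round(county_estimate, 4))  # prints 0.183
```

    The real method also carries stratum variances through to the reporting units, which is where its lower uncertainty relative to Kriging comes from.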

  14. Evidence of Allopolyploidy in Urochloa humidicola Based on Cytological Analysis and Genetic Linkage Mapping.

    Directory of Open Access Journals (Sweden)

    Bianca B Z Vigna

    Full Text Available The African species Urochloa humidicola (Rendle) Morrone & Zuloaga (syn. Brachiaria humidicola (Rendle) Schweick.) is an important perennial forage grass found throughout the tropics. This species is polyploid, ranging from tetraploid to nonaploid, and apomictic, which makes genetic studies challenging; therefore, the number of currently available genetic resources is limited. The genomic architecture and evolution of U. humidicola and the molecular markers linked to apomixis were investigated in a full-sib F1 population obtained by crossing the sexual accession H031 and the apomictic cultivar U. humidicola cv. BRS Tupi, both of which are hexaploid. A simple sequence repeat (SSR)-based linkage map was constructed for the species from 102 polymorphic and specific SSR markers based on simplex and double-simplex markers. The map consisted of 49 linkage groups (LGs) and had a total length of 1702.82 cM, with 89 microsatellite loci and an average map density of 10.6 cM. Eight homology groups (HGs) were formed, comprising 22 LGs, and the other LGs remained ungrouped. The locus that controls apospory (apo-locus) was mapped in LG02 and was located 19.4 cM from the locus Bh027.c.D2. In the cytological analyses of some hybrids, bi- to hexavalents at diakinesis were observed, as well as two nucleoli in some meiocytes, smaller chromosomes with preferential allocation within the first metaphase plate and asynchronous chromosome migration to the poles during anaphase. The linkage map and the meiocyte analyses confirm previous reports of hybridization and suggest an allopolyploid origin of the hexaploid U. humidicola. This is the first linkage map of an Urochloa species, and it will be useful for future quantitative trait locus (QTL) analysis after saturation of the map and for genome assembly and evolutionary studies in Urochloa spp. 
Moreover, the results of the apomixis mapping are consistent with previous reports and confirm the need for additional studies to search for

  15. CT in the staging of bronchogenic carcinoma: Analysis by correlative lymph node mapping and sampling

    International Nuclear Information System (INIS)

    McLoud, T.C.; Woldenberg, R.; Mathisen, D.J.; Grillo, H.C.; Bourgoulin, P.M.; Shepard, J.O.; Moore, E.H.

    1987-01-01

    Although previous studies have evaluated the accuracy of CT in staging the mediastinum in bronchogenic carcinoma, none has determined the sensitivity and specificity of CT in the assessment of individual lymph node groups by correlative nodal sampling at surgery. CT scans were performed on 84 patients with bronchogenic carcinoma. Abnormal nodes (≥ 1 cm) were localized according to the ATS classification of regional lymph node mapping. Seventy-nine patients had mediastinoscopy and 64 patients underwent thoracotomy. In each case, biopsies of lymph node groups 2R, 4R, 2L, 4L (paratracheal), 7 (subcarinal), and 5 (aorticopulmonary) were performed on the appropriate side. Hilar nodes (10R and 11R, 10L and 11L) were resected with the surgical specimen. A total of 292 nodes were sampled. Overall sensitivity for all lymph node groups was 40%, and specificity, 81%. Sensitivity was highest for the 4R (paratracheal) group (82%) and lowest for the subcarinal area (20%). Specificity ranged from 71% for 11R nodes (right hilar) to 94% for 10L (left peribronchial). The positive predictive value was 34%, and the negative predictive value, 84%. This study suggests that the more optimistic results previously reported may have resulted from lack of correlation of individual lymph node groups identified on CT with those sampled at surgery
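
    The sensitivity, specificity and predictive values reported above follow from a standard 2x2 contingency table. A small sketch with hypothetical counts (not the study's data, though chosen to give similar magnitudes):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 contingency-table metrics for a diagnostic test."""
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for one nodal station (illustrative only)
m = diagnostic_metrics(tp=18, fp=35, fn=27, tn=149)
print({k: round(v, 2) for k, v in m.items()})
# prints {'sensitivity': 0.4, 'specificity': 0.81, 'ppv': 0.34, 'npv': 0.85}
```

    Note that PPV and NPV, unlike sensitivity and specificity, depend on how common nodal involvement is in the sampled population.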

  16. Forest Biomass Mapping From Lidar and Radar Synergies

    Science.gov (United States)

    Sun, Guoqing; Ranson, K. Jon; Guo, Z.; Zhang, Z.; Montesano, P.; Kimes, D.

    2011-01-01

    The use of lidar and radar instruments to measure forest structure attributes such as height and biomass at global scales is being considered for a future Earth Observation satellite mission, DESDynI (Deformation, Ecosystem Structure, and Dynamics of Ice). Large footprint lidar makes a direct measurement of the heights of scatterers in the illuminated footprint and can yield accurate information about the vertical profile of the canopy within lidar footprint samples. Synthetic Aperture Radar (SAR) is known to sense the canopy volume, especially at longer wavelengths, and provides image data. Methods for biomass mapping by a combination of lidar sampling and radar mapping need to be developed. In this study, several issues in this respect were investigated using airborne lidar and SAR data in Howland, Maine, USA. Stepwise regression selected the height indices rh50 and rh75 of the Laser Vegetation Imaging Sensor (LVIS) data for predicting field-measured biomass, with an R² of 0.71 and an RMSE of 31.33 Mg/ha. The above-ground biomass map generated from this regression model was considered to represent the true biomass of the area and used as a reference map, since no better biomass map exists for the area. Random samples were taken from the biomass map and the correlation between the sampled biomass and the co-located SAR signature was studied. The best models were used to extend the biomass from lidar samples into all forested areas in the study area, which mimics a procedure that could be used for the future DESDynI Mission. It was found that, depending on the data types used (quad-pol or dual-pol), the SAR data can predict the lidar biomass samples with an R² of 0.63-0.71 and an RMSE of 32.0-28.2 Mg/ha up to biomass levels of 200-250 Mg/ha. The mean biomass of the study area calculated from the biomass maps generated by lidar-SAR synergy was within 10% of the reference biomass map derived from LVIS data. 
The results from this study are preliminary, but do show the
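
    The lidar-then-SAR extrapolation described above can be sketched as two chained regressions on synthetic data. All coefficients, noise levels, and the SAR backscatter model below are invented for illustration; they are not the study's models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-level data (Mg/ha); coefficients are illustrative only
n = 200
rh50 = rng.uniform(5, 25, n)                   # lidar height index (m)
biomass = 8.0 * rh50 + rng.normal(0, 20, n)    # field-measured biomass

# Stage 1: regress biomass on the lidar height index at the footprint samples
A = np.column_stack([rh50, np.ones(n)])
coef_lidar, *_ = np.linalg.lstsq(A, biomass, rcond=None)
biomass_lidar = A @ coef_lidar                 # "reference" biomass at lidar samples

# Stage 2: regress lidar biomass on SAR backscatter to extend beyond the samples
sar_db = 10 * np.log10(biomass_lidar) + rng.normal(0, 0.5, n)
B = np.column_stack([sar_db, np.ones(n)])
coef_sar, *_ = np.linalg.lstsq(B, biomass_lidar, rcond=None)
pred = B @ coef_sar

r2 = 1 - np.sum((biomass_lidar - pred) ** 2) / np.sum((biomass_lidar - biomass_lidar.mean()) ** 2)
print(f"stage-2 R^2 = {r2:.2f}")
```

    The stage-2 model is what would be applied wall-to-wall to the SAR imagery; the saturation of backscatter at high biomass (200-250 Mg/ha in the study) is the main limit of this step.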

  17. A method of recovering the initial vectors of globally coupled map lattices based on symbolic dynamics

    International Nuclear Information System (INIS)

    Sun Li-Sha; Kang Xiao-Yun; Zhang Qiong; Lin Lan-Xin

    2011-01-01

    Based on symbolic dynamics, a novel computationally efficient algorithm is proposed to estimate the unknown initial vectors of globally coupled map lattices (CMLs). It is proved that not all inverse chaotic mapping functions are satisfied for contraction mapping. It is found that the values in phase space do not always converge on their initial values with respect to sufficient backward iteration of the symbolic vectors in terms of global convergence or divergence (CD). Both CD property and the coupling strength are directly related to the mapping function of the existing CML. Furthermore, the CD properties of Logistic, Bernoulli, and Tent chaotic mapping functions are investigated and compared. Various simulation results and the performances of the initial vector estimation with different signal-to-noise ratios (SNRs) are also provided to confirm the proposed algorithm. Finally, based on the spatiotemporal chaotic characteristics of the CML, the conditions of estimating the initial vectors using symbolic dynamics are discussed. The presented method provides both theoretical and experimental results for better understanding and characterizing the behaviours of spatiotemporal chaotic systems. (general)
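
    For context, a globally coupled map lattice of the kind analysed here mixes each site's local (e.g., logistic) dynamics with the lattice mean field. The sketch below is a forward simulation of such a system, not the paper's recovery algorithm; the coupling strength and map parameter are arbitrary:

```python
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def gcml_step(state, eps=0.1):
    """One iteration of a globally coupled logistic map lattice:
    each site mixes its own dynamics with the lattice mean field."""
    mapped = [logistic(x) for x in state]
    mean_field = sum(mapped) / len(mapped)
    return [(1 - eps) * m + eps * mean_field for m in mapped]

state = [0.1, 0.3, 0.5, 0.7, 0.9]  # the initial vector the paper seeks to recover
for _ in range(50):
    state = gcml_step(state)
print([round(x, 3) for x in state])
```

    The estimation problem the paper addresses is the inverse of this loop: given the observed trajectory (or its symbolic coding), run the dynamics backwards to recover the initial vector.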

  19. A consensus linkage map of the grass carp (Ctenopharyngodon idella) based on microsatellites and SNPs

    Directory of Open Access Journals (Sweden)

    Li Jiale

    2010-02-01

    Full Text Available Abstract Background Grass carp (Ctenopharyngodon idella) belongs to the family Cyprinidae, which includes more than 2000 fish species. It is one of the most important freshwater food fish species in world aquaculture. A linkage map is an essential framework for mapping traits of interest and is often the first step towards understanding genome evolution. The aim of this study is to construct a first generation genetic map of grass carp using microsatellites and SNPs to generate a new resource for mapping QTL for economically important traits and to conduct a comparative mapping analysis to shed new insights into the evolution of fish genomes. Results We constructed a first generation linkage map of grass carp with a mapping panel containing two F1 families including 192 progenies. Sixteen SNPs in genes and 263 microsatellite markers were mapped to twenty-four linkage groups (LGs). The number of LGs corresponded to the haploid chromosome number of grass carp. The sex-specific maps were 1149.4 and 888.8 cM long in females and males, respectively, whereas the sex-averaged map spanned 1176.1 cM. The average resolution of the map was 4.2 cM/locus. BLAST searches of sequences of mapped markers of grass carp against the whole genome sequence of zebrafish revealed a substantial macrosynteny relationship and extensive colinearity of markers between grass carp and zebrafish. Conclusions The linkage map of grass carp presented here is the first linkage map of a food fish species based on co-dominant markers in the family Cyprinidae. This map provides a valuable resource for mapping phenotypic variations, serves as a reference for comparative genomics and for understanding the evolution of fish genomes, and could be complementary to the grass carp genome sequencing project.

  20. Consensus of heterogeneous multi-agent systems based on sampled data with a small sampling delay

    International Nuclear Information System (INIS)

    Wang Na; Wu Zhi-Hai; Peng Li

    2014-01-01

    In this paper, consensus problems of heterogeneous multi-agent systems based on sampled data with a small sampling delay are considered. First, a consensus protocol based on sampled data with a small sampling delay for heterogeneous multi-agent systems is proposed. Then, algebraic graph theory, the matrix method, the stability theory of linear systems, and some other techniques are employed to derive necessary and sufficient conditions guaranteeing that the heterogeneous multi-agent systems asymptotically achieve stationary consensus. Finally, simulations are performed to demonstrate the correctness of the theoretical results. (interdisciplinary physics and related areas of science and technology)
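
    The flavour of a sampled-data consensus protocol can be shown with first-order agents on a small undirected graph. This sketch omits the paper's heterogeneous dynamics and sampling delay and keeps only the sampled update x_i(k+1) = x_i(k) + h * sum_j a_ij (x_j(k) - x_i(k)); the graph and sampling period are illustrative:

```python
# Ring of four first-order agents; a_ij = 1 if i and j are neighbours
adjacency = [
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
]
h = 0.2  # sampling period, small enough for stability on this graph
x = [4.0, -1.0, 2.5, 0.5]  # initial agent states (average = 1.5)

for _ in range(200):
    x = [x[i] + h * sum(adjacency[i][j] * (x[j] - x[i]) for j in range(4))
         for i in range(4)]

print([round(v, 3) for v in x])  # prints [1.5, 1.5, 1.5, 1.5]
```

    On an undirected graph this update preserves the state average, so the agents agree on the average of the initial states; too large a sampling period h would destabilize the iteration, which is why the paper's conditions involve the sampling period and delay.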

  1. HULU SUNGAI PERAK BED SEDIMENT MAPPING USING UNDERWATER ACOUSTIC SONAR

    Directory of Open Access Journals (Sweden)

    N. Arriafdi

    2016-09-01

    Full Text Available Developments in acoustic survey techniques, in particular side scan sonar, have revolutionized the way we are able to image, map and understand the riverbed environment. It is now cost effective to image large areas of the riverbed using these techniques, and the backscatter image created from surveys provides baseline data from which thematic maps of the riverbed environment, including maps of morphological geology, can be derived when interpreted in conjunction with in situ sampling data. This article focuses on investigating the characteristics of sediments and the correlation of the side scan backscatter image with signal strength. The interpretation of acoustic backscatter relies on experienced interpretation, by eye, of the grey scale images produced from the data. A 990F Starfish Side Scan Sonar was used to collect and develop a series of sonar images along 6 km of Hulu Sungai Perak. Background sediments could be delineated accurately and the image textures could be linked to the actual river floor appearance through grab sampling. A major difference was found in the acoustic returns from the two research areas studied: the upstream area shows much rougher textures. This is due to actual differences in riverbed roughness, caused by a difference in bottom currents and sediment dynamics in the two areas. The highest backscatter correlates with the coarsest and roughest sediment. Results suggest that image-based backscatter classification shows considerable promise for the interpretation of side scan sonar data for the production of geological maps.

  2. On the security of 3D Cat map based symmetric image encryption scheme

    International Nuclear Information System (INIS)

    Wang Kai; Pei, W.-J.; Zou, Liuhua; Song Aiguo; He Zhenya

    2005-01-01

    A 3D Cat map based symmetric image encryption algorithm, which significantly increases the resistance against statistical and differential attacks, has been proposed recently. It employs a 3D Cat map to shuffle the positions of image pixels and uses the Logistic map to diffuse the relationship between the cipher-image and the plain-image. Based on the fact that it is sufficient to break this cryptosystem with only the equivalent control parameters, some fundamental weaknesses of the cryptosystem are pointed out. With the knowledge of symbolic dynamics and some specially designed plain-images, we can calculate the equivalent initial condition of the diffusion process and rebuild a valid equivalent 3D Cat matrix. In this Letter, we propose a successful chosen-plaintext cryptanalytic attack, which is composed of two mutually independent procedures: the cryptanalysis of the diffusion process and the cryptanalysis of the spatial permutation process. Both theoretical and experimental results show that the lack of security discourages the use of these cryptosystems for practical applications
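
    For illustration, the 2D ancestor of the 3D Cat map used in such schemes permutes pixel positions with a unimodular integer matrix, so the shuffling is a bijection of the pixel grid. A minimal sketch (the parameters a and b are arbitrary; the attacked scheme uses a 3D generalization):

```python
def cat_map_2d(x, y, n, a=1, b=1):
    """One application of the generalized 2D Arnold cat map on an n x n grid.
    The matrix [[1, a], [b, a*b + 1]] has determinant 1, so the map is a bijection mod n."""
    return (x + a * y) % n, (b * x + (a * b + 1) * y) % n

n = 8
positions = [cat_map_2d(x, y, n) for x in range(n) for y in range(n)]
# Every pixel position appears exactly once after the shuffle
print(len(set(positions)) == n * n)  # prints True
```

    The attack summarized above exploits exactly this structure: once the equivalent permutation matrix and the diffusion keystream are recovered from chosen plaintexts, the secret parameters themselves are no longer needed.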

  3. An Improved Information Value Model Based on Gray Clustering for Landslide Susceptibility Mapping

    Directory of Open Access Journals (Sweden)

    Qianqian Ba

    2017-01-01

    Full Text Available Landslides, as geological hazards, cause significant casualties and economic losses. Therefore, it is necessary to identify areas prone to landslides for prevention work. This paper proposes an improved information value model based on gray clustering (IVM-GC) for landslide susceptibility mapping. This method uses the information value derived from an information value model to achieve susceptibility classification and weight determination of landslide predisposing factors and, hence, obtain the landslide susceptibility of each study unit based on the clustering analysis. Using a landslide inventory of Chongqing, China, which contains 8435 landslides, three landslide susceptibility maps were generated based on the common information value model (IVM), an information value model improved by an analytic hierarchy process (IVM-AHP) and our new improved model. Approximately 70% (5905) of the inventory landslides were used to generate the susceptibility maps, while the remaining 30% (2530) were used to validate the results. The training accuracies of the IVM, IVM-AHP and IVM-GC were 81.8%, 78.7% and 85.2%, respectively, and the prediction accuracies were 82.0%, 78.7% and 85.4%, respectively. The results demonstrate that all three methods perform well in evaluating landslide susceptibility. Among them, IVM-GC has the best performance.
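
    The information value of one class of a predisposing factor is the log ratio of the class's landslide density to the overall study-area density. A sketch with hypothetical cell counts (not the Chongqing inventory):

```python
import math

# Hypothetical counts for one class of one predisposing factor (e.g., a slope band)
landslide_cells_in_class = 120   # landslide cells falling in this class
cells_in_class = 4000            # all cells in this class
landslide_cells_total = 800      # landslide cells in the study area
cells_total = 100000             # all cells in the study area

# Information value: log of the class's landslide density relative to the overall density
iv = math.log((landslide_cells_in_class / cells_in_class)
              / (landslide_cells_total / cells_total))
print(round(iv, 3))  # prints 1.322; positive means the class favours landslides
```

    Summing these values over all factor classes at a location gives its susceptibility score; the paper's contribution lies in how gray clustering groups those scores into susceptibility levels.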

  4. An annotated genetic map of loblolly pine based on microsatellite and cDNA markers

    Science.gov (United States)

    Craig S. Echt; Surya Saha; Konstantin V. Krutovsky; Kokulapalan Wimalanathan; John E. Erpelding; Chun Liang; C Dana Nelson

    2011-01-01

    Previous loblolly pine (Pinus taeda L.) genetic linkage maps have been based on a variety of DNA polymorphisms, such as AFLPs, RAPDs, RFLPs, and ESTPs, but only a few SSRs (simple sequence repeats), also known as simple tandem repeats or microsatellites, have been mapped in P. taeda. The objective of this study was to integrate a large set of SSR markers from a variety...

  5. A new block cipher based on chaotic map and group theory

    International Nuclear Information System (INIS)

    Yang Huaqian; Liao Xiaofeng; Wong Kwokwo; Zhang Wei; Wei Pengcheng

    2009-01-01

    Based on the study of some existing chaotic encryption algorithms, a new block cipher is proposed. In the proposed cipher, two sequences of decimal numbers individually generated by two chaotic piecewise linear maps are used to determine the noise vectors by comparing the elements of the two sequences. Then a sequence of decimal numbers is used to define a bijection map. The modular multiplication operation in the multiplicative group Z*_(2^8+1) and permutations are alternately applied on plaintext with a block length of multiples of 64 bits to produce ciphertext blocks of the same length. Analyses show that the proposed block cipher does not suffer from the flaws of pure chaotic cryptosystems.

  6. Efficient sampling algorithms for Monte Carlo based treatment planning

    International Nuclear Information System (INIS)

    DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.

    1998-01-01

    Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed
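
    The cutpoint method mentioned above (often attributed to Chen and Asau) precomputes a table of starting indices into the CDF so that each draw needs only a short sequential search instead of scanning from the start. A sketch, with the table size m as a tunable parameter:

```python
import bisect
import random

def build_cutpoints(probs, m):
    """Precompute the CDF and a table of m starting indices into it."""
    cdf, acc = [], 0.0
    for p in probs:
        acc += p
        cdf.append(acc)
    cdf[-1] = 1.0  # guard against floating-point drift
    cuts = [bisect.bisect_left(cdf, k / m) for k in range(m)]
    return cdf, cuts

def sample(cdf, cuts, m):
    """Draw one index: jump to the cutpoint, then search forward briefly."""
    u = random.random()
    i = cuts[int(u * m)]
    while cdf[i] < u:
        i += 1
    return i

random.seed(0)
cdf, cuts = build_cutpoints([0.1, 0.2, 0.3, 0.4], m=16)
draws = [sample(cdf, cuts, 16) for _ in range(10000)]
print([round(draws.count(i) / 10000, 2) for i in range(4)])  # close to the target pmf
```

    With m comparable to the number of outcomes, the expected number of comparisons per draw approaches one, which is where the order-of-magnitude speedup over sequential CDF search comes from.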

  7. Point-Cloud Compression for Vehicle-Based Mobile Mapping Systems Using Portable Network Graphics

    Science.gov (United States)

    Kohira, K.; Masuda, H.

    2017-09-01

    A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured over large areas is enormously large. A large storage capacity is required to store such point-clouds, and heavy loads will be placed on the network if point-clouds are transferred through it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then, the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating the quality.
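
    The core idea, serializing a 2D image of quantized scanner measurements as a PNG, can be sketched with a minimal stdlib-only grayscale PNG writer. The 80 m range cap and 8-bit quantization below are hypothetical; the paper's pixel layout is derived from GPS time and the scanner parameters:

```python
import struct
import zlib

def write_png_gray8(path, rows):
    """Minimal 8-bit grayscale PNG writer (stdlib only) for an image
    given as a list of equal-length rows of 0-255 values."""
    h, w = len(rows), len(rows[0])

    def chunk(tag, payload):
        data = tag + payload  # CRC covers the chunk type and its data
        return struct.pack(">I", len(payload)) + data + struct.pack(">I", zlib.crc32(data))

    ihdr = struct.pack(">IIBBBBB", w, h, 8, 0, 0, 0, 0)  # 8-bit depth, grayscale
    raw = b"".join(b"\x00" + bytes(row) for row in rows)  # filter type 0 per scanline
    png = (b"\x89PNG\r\n\x1a\n"
           + chunk(b"IHDR", ihdr)
           + chunk(b"IDAT", zlib.compress(raw, 9))
           + chunk(b"IEND", b""))
    with open(path, "wb") as f:
        f.write(png)

# Quantize scanner ranges (metres) into one pixel per (scan line, beam) cell
ranges = [[5.0, 5.2, 60.1], [4.9, 30.0, 59.8]]
rows = [[min(255, int(r / 80.0 * 255)) for r in line] for line in ranges]
write_png_gray8("ranges.png", rows)
```

    PNG's lossless DEFLATE compression then exploits the strong spatial coherence of neighbouring measurements, which is why the method compresses well without degrading the point-cloud.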

  9. SODIM: Service Oriented Data Integration based on MapReduce

    Directory of Open Access Journals (Sweden)

    Ghada ElSheikh

    2013-09-01

    Data integration systems can benefit from innovative dynamic infrastructure solutions such as Clouds, with their greater agility, lower cost, device independence, location independence, and scalability. This study consolidates data integration, Service Orientation, and distributed processing to develop a new data integration system called Service Oriented Data Integration based on MapReduce (SODIM), which improves system performance, especially with a large number of data sources, and which can efficiently be hosted on modern dynamic infrastructures such as Clouds.
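
    The MapReduce pattern underlying such a system can be illustrated with a toy in-memory integration of records from two sources joined on a shared key. The source data and field names are invented; a real deployment would distribute the map and reduce phases across cluster nodes:

```python
from collections import defaultdict

# Records from two hypothetical source services, keyed by a shared customer id
source_a = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
source_b = [{"id": 1, "email": "alice@example.com"}, {"id": 3, "email": "eve@example.com"}]

def map_phase(records):
    for rec in records:
        yield rec["id"], rec  # emit (join key, record)

def reduce_phase(pairs):
    grouped = defaultdict(dict)
    for key, rec in pairs:        # shuffle: group records by key
        grouped[key].update(rec)  # reduce: merge fields from all sources
    return dict(grouped)

pairs = list(map_phase(source_a)) + list(map_phase(source_b))
integrated = reduce_phase(pairs)
print(integrated[1])  # prints {'id': 1, 'name': 'Alice', 'email': 'alice@example.com'}
```

    Because each mapper touches one source and each reducer one key, the join parallelizes naturally, which is what makes the approach attractive when the number of data sources grows.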

  10. The development of a high density linkage map for black tiger shrimp (Penaeus monodon) based on cSNPs.

    Directory of Open Access Journals (Sweden)

    Matthew Baranski

    Full Text Available Transcriptome sequencing using Illumina RNA-seq was performed on populations of black tiger shrimp from India. Samples were collected from (i) four landing centres around the east coastline (EC) of India, (ii) survivors of a severe WSSV infection during pond culture (SUR) and (iii) the Andaman Islands (AI) in the Bay of Bengal. Equal quantities of purified total RNA from homogenates of hepatopancreas, muscle, nervous tissue, intestinal tract, heart, gonad, gills, pleopod and lymphoid organs were combined to create AI, EC and SUR pools for RNA sequencing. De novo transcriptome assembly resulted in 136,223 contigs (minimum size 100 base pairs, bp) with a total length of 61 Mb, an average length of 446 bp and an average coverage of 163× across all pools. Approximately 16% of contigs were annotated with BLAST hit information and gene ontology annotations. A total of 473,620 putative SNPs/indels were identified. An Illumina iSelect genotyping array containing 6,000 SNPs was developed and used to genotype 1024 offspring belonging to seven full-sibling families. A total of 3959 SNPs were mapped to 44 linkage groups. The linkage groups consisted of between 16-129 and 13-130 markers, of length between 139-10.8 and 109.1-10.5 cM, and with intervals averaging 1.2 and 0.9 cM for the female and male maps, respectively. The female map was 28% longer than the male map (4060 and 2917 cM, respectively), with a 1.6-fold higher recombination rate observed for female compared to male meioses. This approach has substantially increased expressed sequence and DNA marker resources for tiger shrimp and is a useful resource for QTL mapping and association studies of evolutionarily and commercially important traits.

  11. An Isometric Mapping Based Co-Location Decision Tree Algorithm

    Science.gov (United States)

    Zhou, G.; Wei, J.; Zhou, X.; Zhang, R.; Huang, W.; Sha, H.; Chen, J.

    2018-05-01

    Decision tree (DT) induction has been widely used in different pattern classification tasks. However, most traditional DTs have the disadvantage that they consider only non-spatial attributes (i.e., spectral information) when classifying pixels, which can result in objects being misclassified. Therefore, some researchers have proposed a co-location decision tree (Cl-DT) method, which combines co-location mining and decision trees to solve the above-mentioned traditional decision tree problems. Cl-DT overcomes the shortcomings of existing DT algorithms, which create a node for each value of a given attribute, and has a higher accuracy than the existing decision tree approach. However, for non-linearly distributed data instances, the Euclidean distance between instances does not reflect the true positional relationship between them. In order to overcome these shortcomings, this paper proposes an isometric mapping method based on Cl-DT (called Isomap-based Cl-DT), which combines isometric mapping and Cl-DT. Because isometric mapping methods use geodesic distances instead of Euclidean distances between non-linearly distributed instances, the true distance between instances can be reflected. The experimental results and several comparative analyses show that: (1) the extraction of exposed carbonate rocks is of high accuracy; and (2) the proposed method has many advantages, because the total number of nodes and the number of leaf nodes are greatly reduced compared with Cl-DT. Therefore, the Isomap-based Cl-DT algorithm can construct a more accurate and faster decision tree.
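
    The geodesic distances that distinguish isometric mapping from Euclidean methods can be sketched by building a k-nearest-neighbour graph and running all-pairs shortest paths over it. The value of k and the sample points below are illustrative:

```python
import math

def geodesic_distances(points, k=2):
    """Isomap-style geodesic distances: build a k-nearest-neighbour graph with
    Euclidean edge weights, then run Floyd-Warshall shortest paths."""
    n = len(points)
    INF = float("inf")
    d = [[INF] * n for _ in range(n)]
    for i in range(n):
        d[i][i] = 0.0
        nbrs = sorted(range(n), key=lambda j: math.dist(points[i], points[j]))[1:k + 1]
        for j in nbrs:
            w = math.dist(points[i], points[j])
            d[i][j] = min(d[i][j], w)
            d[j][i] = min(d[j][i], w)  # symmetrize the kNN graph
    for m in range(n):  # Floyd-Warshall relaxation
        for i in range(n):
            for j in range(n):
                if d[i][m] + d[m][j] < d[i][j]:
                    d[i][j] = d[i][m] + d[m][j]
    return d

# On a curved arc, the geodesic between the endpoints follows the arc and
# exceeds the straight-line (Euclidean) distance between them
arc = [(math.cos(t), math.sin(t)) for t in [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]]
d = geodesic_distances(arc, k=2)
print(d[0][6] > math.dist(arc[0], arc[6]))  # prints True
```

    Replacing Euclidean with graph-geodesic distances is exactly what lets the classifier respect the manifold structure of non-linearly distributed instances.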

  12. AN ISOMETRIC MAPPING BASED CO-LOCATION DECISION TREE ALGORITHM

    Directory of Open Access Journals (Sweden)

    G. Zhou

    2018-05-01

    Full Text Available Decision tree (DT) induction has been widely used in pattern classification. However, most traditional DTs have the disadvantage that they consider only non-spatial attributes (i.e., spectral information) when classifying pixels, which can result in objects being misclassified. Therefore, some researchers have proposed a co-location decision tree (Cl-DT) method, which combines co-location mining and decision trees to solve the above-mentioned problems of traditional decision trees. Cl-DT overcomes the shortcoming of existing DT algorithms, which create a node for each value of a given attribute, and achieves higher accuracy than the existing decision tree approach. However, for non-linearly distributed data instances, the Euclidean distance between instances does not reflect the true positional relationship between them. To overcome this shortcoming, this paper proposes an isometric mapping method based on Cl-DT (called Isomap-based Cl-DT), which combines isometric mapping (Isomap) and Cl-DT. Because isometric mapping uses geodesic distances instead of Euclidean distances between non-linearly distributed instances, the true distance between instances can be reflected. The experimental results and several comparative analyses show that: (1) the extraction of exposed carbonate rocks is of high accuracy; and (2) the proposed method has many advantages, because the total number of nodes and the number of leaf nodes are greatly reduced compared to Cl-DT. Therefore, the Isomap-based Cl-DT algorithm can construct a more accurate and faster decision tree.
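
    The key idea the record describes — replacing Euclidean distances with geodesic distances for non-linearly distributed instances — can be sketched in a few lines. This is a minimal illustration of the Isomap distance step only (not the Cl-DT algorithm itself): geodesic distances are approximated by shortest paths over a k-nearest-neighbour graph, so for points on a curve the end-to-end distance follows the curve instead of cutting across it. The neighbourhood size `k` and the quarter-circle test data are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def geodesic_distances(X, k=5):
    """Approximate geodesic distances as shortest paths on a
    k-nearest-neighbour graph, as Isomap does."""
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise Euclidean
    graph = np.full((n, n), np.inf)                             # inf = no edge
    for i in range(n):
        nn = np.argsort(d[i])[1:k + 1]      # k nearest neighbours (skip self)
        graph[i, nn] = d[i, nn]
    graph = np.minimum(graph, graph.T)      # symmetrise the neighbourhood graph
    np.fill_diagonal(graph, 0.0)
    return shortest_path(graph, method="D") # Dijkstra over the graph

# Points on a quarter circle: the Euclidean distance between the endpoints
# "cuts the corner", while the geodesic distance follows the curve.
theta = np.linspace(0, np.pi / 2, 20)
X = np.c_[np.cos(theta), np.sin(theta)]
geo = geodesic_distances(X, k=2)
euc = np.linalg.norm(X[0] - X[-1])
print(euc, geo[0, -1])  # chord ~1.414 vs. along-the-curve distance ~1.57
```

    In the real method these graph distances would feed the Isomap embedding before Cl-DT induction; the sketch only shows why the two distance notions disagree on curved data.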

  13. A New Approach to High-accuracy Road Orthophoto Mapping Based on Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Ming Yang

    2011-12-01

    Full Text Available Existing orthophoto maps based on satellite and aerial photography are not precise enough for road marking. This paper proposes a new approach to high-accuracy orthophoto mapping. The approach uses an inverse perspective transformation to process the image information and generate orthophoto fragments. An offline interpolation algorithm is used to process the location information: it fuses the dead-reckoning and EKF location estimates, and uses the result to transform the fragments into the global coordinate system. Finally, a wavelet transform divides the image into two frequency bands, which are processed separately with a weighted median algorithm. Experimental results show that the map produced with this method has high accuracy.
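
    The final step above splits the image into two frequency bands with a wavelet transform. The record does not say which wavelet is used; as a minimal sketch, a single-level Haar transform illustrates the band split and its perfect reconstruction (the weighted-median fusion applied to each band in the paper is not reproduced here).

```python
import numpy as np

def haar_split(x):
    """One-level Haar wavelet transform: split an even-length 1-D signal into
    a low-frequency (approximation) and a high-frequency (detail) band."""
    x = np.asarray(x, dtype=float)
    low = (x[0::2] + x[1::2]) / np.sqrt(2)
    high = (x[0::2] - x[1::2]) / np.sqrt(2)
    return low, high

def haar_merge(low, high):
    """Inverse of haar_split: perfect reconstruction of the original signal."""
    x = np.empty(low.size * 2)
    x[0::2] = (low + high) / np.sqrt(2)
    x[1::2] = (low - high) / np.sqrt(2)
    return x

row = np.array([4.0, 4.0, 8.0, 8.0, 2.0, 6.0, 9.0, 1.0])
low, high = haar_split(row)   # smooth pairs give zero detail coefficients
rec = haar_merge(low, high)
print(np.allclose(rec, row))  # True: the two bands carry all the information
```

    Processing the two bands separately (as the paper does with a weighted median) and merging them back preserves the image as long as the per-band processing is mild, since the transform itself is lossless.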

  14. Development of a new comprehensive and reliable endometrial receptivity map (ER Map/ER Grade) based on RT-qPCR gene expression analysis.

    Science.gov (United States)

    Enciso, M; Carrascosa, J P; Sarasa, J; Martínez-Ortiz, P A; Munné, S; Horcajadas, J A; Aizpurua, J

    2018-02-01

    Is it possible to determine the receptivity status of an endometrium by combined quantitative reverse transcription PCR (RT-qPCR) expression analysis of genes involved in endometrial proliferation and immunity? The new ER Map®/ER Grade® test can predict endometrial receptivity status by RT-qPCR using a new panel of genes involved in endometrial proliferation and the maternal immune response associated with embryonic implantation. The human endometrium reaches a receptive status adequate for embryonic implantation around Days 19-21 of the menstrual cycle. During this period, known as the window of implantation (WOI), the endometrium shows a specific gene expression profile suitable for endometrial function evaluation. The number of molecular diagnostic tools currently available to characterize this process is very limited. In this study, a new system for human endometrial receptivity evaluation was optimized and presented for the first time. ER Map®/ER Grade® validation was achieved on 312 endometrial samples, including fertile women and patients undergoing fertility treatment between July 2014 and March 2016. Expression analyses of 184 genes involved in endometrial receptivity and immune response were performed. Samples were additionally tested with an independent endometrial receptivity test. A total of 96 fertile women and 120 assisted reproduction treatment (ART) patients participated in the study. Endometrial biopsy samples were obtained at LH + 2 and LH + 7 days in fertile subjects in a natural cycle and at the WOI in patients in a hormone-replacement therapy (HRT) cycle. Total RNA was purified, quality-checked and reverse-transcribed. Gene expression was quantified by high-throughput RT-qPCR and statistically analyzed. Informative genes were selected and used to classify samples into four different groups of endometrial receptivity status. Significantly different gene expression levels were found in 85 out of 184 selected genes when

  15. The Use of Concept Map as a Consolidation Phase Based STAD to Enhance Students’ Comprehension about Environmental Pollution

    Science.gov (United States)

    Nugroho, O. F.; Chandra, D. T.; Sanjaya, Y.; Pendidikan Indonesia, Universitas

    2017-02-01

    The purpose of this study was to improve students' concept comprehension using a concept map as a consolidation phase in STAD (Student Teams-Achievement Divisions). The study used a randomized control group pretest-posttest design. Data were collected using an instrument test to evaluate the effect of the concept map as a consolidation phase in STAD on students' understanding of environmental pollution, and were analyzed using normalized gain (n-gain) and an independent t-test. The n-gain analysis shows that the increase in students' understanding of environmental pollution in the experimental group was higher than in the control group: students' comprehension in the experimental class (0.53) exceeded that of the control group (0.23). The t-test analysis shows a significant effect of concept mapping as a consolidation phase in STAD on students' concept comprehension. It can be concluded that the implementation of concept mapping based on STAD may improve students' understanding of science concepts.
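
    The n-gain statistic used above is Hake's normalized gain: the score improvement expressed as a fraction of the improvement that was still possible. The raw pre/post scores below are hypothetical (the record reports only the resulting gains of 0.53 and 0.23):

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: (post - pre) / (max - pre),
    i.e. the fraction of the possible improvement actually achieved."""
    return (post - pre) / (max_score - pre)

# Hypothetical scores on a 100-point test:
g = normalized_gain(pre=40, post=68)
print(round(g, 2))  # 0.47 -- a "medium" gain on Hake's conventional scale
```

    A value of 0.53 (the experimental class) means students recovered just over half of the room they had for improvement, regardless of where they started.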

  16. An Image Encryption Scheme Based on Hyperchaotic Rabinovich and Exponential Chaos Maps

    Directory of Open Access Journals (Sweden)

    Xiaojun Tong

    2015-01-01

    Full Text Available This paper proposes a new four-dimensional hyperchaotic map based on the Rabinovich system to realize chaotic encryption in higher dimensions and improve security. The chaotic sequences generated by the Runge-Kutta method are combined with the chaotic sequences generated by an exponential chaos map to generate key sequences, which are used for image encryption. The security test results indicate that the new hyperchaotic system has high security and complexity. A comparison between the new hyperchaotic system and several low-dimensional chaotic systems shows that the proposed system performs more efficiently.
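
    The record does not give the equations of the hyperchaotic Rabinovich system or the exponential chaos map, so the sketch below uses the ordinary logistic map purely as a stand-in chaotic source. It shows the general pattern such schemes share: iterate a chaotic map, quantise the orbit into a byte keystream, and XOR it with the plaintext bytes. The map, parameters and key `x0` are illustrative assumptions, not the paper's scheme.

```python
import numpy as np

def logistic_keystream(x0, n, r=3.99):
    """Byte keystream from a chaotic orbit. The logistic map stands in for
    the paper's Rabinovich/exponential-map construction (an assumption)."""
    x = x0
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)        # chaotic iteration, x stays in (0, 1)
        out[i] = int(x * 256)        # quantise the state to one byte
    return out

plain = np.frombuffer(b"secret image row", dtype=np.uint8)
ks = logistic_keystream(0.3141, plain.size)   # x0 plays the role of the key
cipher = plain ^ ks
print(bytes(cipher ^ ks))  # b'secret image row' -- XOR keystream is its own inverse
```

    Decryption regenerates the identical keystream from the same initial condition, which is why sensitivity to `x0` (a defining property of chaos) acts as key sensitivity in these ciphers.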

  17. Weed map generation from UAV image mosaics based on crop row detection

    DEFF Research Database (Denmark)

    Midtiby, Henrik Skov

    To control weeds in a field effectively with a minimum of herbicides, knowledge of the weed patches is required. Based on images acquired by Unmanned Aerial Vehicles (UAVs), a vegetation map of the entire field can be generated. Manual analysis, which is often required to detect weed patches...... is used as input for the method. Issues related to perspective distortion are reduced by using an orthomosaic, which is a high-resolution image of the entire field built from hundreds of images taken by a UAV. A vegetation map is generated from the orthomosaic by calculating the excess green colour index......
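
    The excess green index mentioned above is commonly computed as ExG = 2g - r - b on chromaticity-normalised channels, so that green vegetation scores high while soil scores near zero. A minimal sketch (the thresholding and crop-row steps of the method are not reproduced):

```python
import numpy as np

def excess_green(rgb):
    """Excess green index ExG = 2g - r - b on chromaticity-normalised
    channels; vegetation pixels score high, soil pixels near zero or below."""
    rgb = np.asarray(rgb, dtype=float)
    s = rgb.sum(axis=-1, keepdims=True)
    s[s == 0] = 1.0                       # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / s, -1, 0)
    return 2 * g - r - b

# 1x2 "image": one green (vegetation) pixel, one brown (soil) pixel
img = np.array([[[40, 160, 40], [120, 90, 60]]])
exg = excess_green(img)
print(exg)  # vegetation pixel clearly positive, soil pixel near zero
```

    Thresholding the ExG image (e.g. with Otsu's method) then yields the binary vegetation map from which rows and weed patches are separated.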

  18. Solar resources and power potential mapping in Vietnam using satellite-derived and GIS-based information

    International Nuclear Information System (INIS)

    Polo, J.; Bernardos, A.; Navarro, A.A.; Fernandez-Peruchena, C.M.; Ramírez, L.; Guisado, María V.; Martínez, S.

    2015-01-01

    Highlights: • Satellite-based data, reanalysis data and measurements are combined for solar mapping. • Plant output modeling for PV and CSP results in simple expressions of solar potential. • Solar resource and solar potential are combined in a GIS to determine the technical solar potential. • Solar resource and potential maps of Vietnam are presented. - Abstract: This paper presents maps of the solar resources in Vietnam and of the solar potential for concentrating solar power (CSP) and for grid-connected photovoltaic (PV) technology. The mapping of solar radiation components has been calculated from satellite-derived data combined with solar radiation derived from sunshine duration, and from additional sources of information based on reanalysis for the atmospheric and meteorological parameters involved. Two scenarios have been selected for the study of the solar potential: a CSP parabolic trough plant of 50 MWe and a grid-connected flat-plate PV plant of around 1 MWe. For each scenario, plant performance simulations have been computed to develop simple expressions that allow estimation of the solar potential from the annual solar irradiation and the latitude of any site in Vietnam. Finally, Geographic Information Systems (GIS) have been used to combine the solar potential with the land availability under each scenario to deliver the technical solar potential maps of Vietnam.
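
    The paper fits its own plant-specific expressions from performance simulations; those coefficients are not given in the record. As a generic first-order stand-in, the sketch below shows the usual form such an expression takes for PV: annual energy = installed capacity × annual in-plane irradiation × performance ratio (divided by the 1 kW/m² reference irradiance). The performance ratio of 0.75 is an assumed typical value, not a figure from the study.

```python
def pv_annual_energy(capacity_kwp, annual_irradiation_kwh_m2, performance_ratio=0.75):
    """First-order PV yield estimate in kWh/year.
    capacity_kwp:             installed DC capacity at STC (1 kW/m2 reference)
    annual_irradiation_kwh_m2: annual global irradiation on the array plane
    performance_ratio:        lumped system losses (assumed value, not the paper's)"""
    reference_irradiance = 1.0  # kW/m2 (STC), makes the units cancel
    return capacity_kwp * annual_irradiation_kwh_m2 * performance_ratio / reference_irradiance

# Hypothetical 1 MWp plant at a site receiving 1700 kWh/m2/year:
energy = pv_annual_energy(1000, 1700)
print(energy)  # 1275000.0 kWh/year
```

    The study's fitted expressions additionally make the yield a function of latitude; this sketch omits that refinement.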

  19. A human motion model based on maps for navigation systems

    Directory of Open Access Journals (Sweden)

    Kaiser Susanna

    2011-01-01

    Full Text Available Abstract Foot-mounted indoor positioning systems work remarkably well when the localization algorithm additionally uses knowledge of floor-plans. Walls and other structures naturally restrict the motion of pedestrians: no pedestrian can walk through walls or jump from one floor to another in a building with different floor levels. By incorporating known floor-plans in sequential Bayesian estimation processes such as particle filters (PFs), long-term error stability can be achieved as long as the map is sufficiently accurate and the environment sufficiently constrains pedestrians' motion. In this article, a new motion model based on maps and floor-plans is introduced that is capable of weighting the possible headings of the pedestrian as a function of the local environment. The motion model is derived from a diffusion algorithm based on the principle of a source effusing gas, and is used in the weighting step of a PF implementation. The diffusion algorithm can incorporate floor-plans as well as maps with areas of different degrees of accessibility. The motion model represents the probability density function of headings restricted by maps and floor-plans more effectively than a simple binary weighting of particles (i.e., eliminating those that crossed walls and keeping the rest). We show that the motion model yields better performance in critical navigation scenarios where two or more modes may be competing for some of the time (multi-modal scenarios).
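
    For context, here is the simple binary weighting the article improves upon: particles whose step crosses a wall get zero weight, all others keep full weight. The toy floor-plan (one wall with a doorway) and all parameters are illustrative; the paper's diffusion-based heading weighting, which replaces this hard 0/1 rule with graded weights, is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy floor-plan: a vertical wall at x = 5 with a doorway between y = 4.8 and 5.2.
def crosses_wall(p_old, p_new):
    """True if the step p_old -> p_new is blocked by the wall."""
    x0, y0 = p_old
    x1, y1 = p_new
    if (x0 - 5.0) * (x1 - 5.0) >= 0:      # both endpoints on the same side
        return False
    t = (5.0 - x0) / (x1 - x0)            # intersection with the wall line
    y = y0 + t * (y1 - y0)
    return not (4.8 <= y <= 5.2)          # blocked unless inside the doorway

# Propagate particles with a noisy step, then apply the binary map weighting.
particles = rng.normal([4.5, 5.0], 0.3, size=(500, 2))
steps = rng.normal([1.0, 0.0], 0.4, size=(500, 2))
moved = particles + steps
weights = np.array([0.0 if crosses_wall(p, q) else 1.0
                    for p, q in zip(particles, moved)])
weights /= weights.sum()                  # normalise surviving particles
print(f"{(weights > 0).mean():.0%} of particles survive the wall constraint")
```

    The binary rule concentrates all probability mass on the survivors at once; the diffusion-based model instead lowers the weight of headings that point toward walls before the step is taken, which is what helps in the multi-modal scenarios mentioned above.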

  20. Template based rodent brain extraction and atlas mapping.

    Science.gov (United States)

    Weimin Huang; Jiaqi Zhang; Zhiping Lin; Su Huang; Yuping Duan; Zhongkang Lu

    2016-08-01

    Accurate rodent brain extraction is the basic step for many translational studies using MR imaging. This paper presents a template-based approach with multi-expert refinement for automatic rodent brain extraction. We first build a brain appearance model from learning exemplars. Together with template matching, we encode the rodent brain position into the search space to reliably locate the brain and estimate a rough segmentation. Starting from this initial mask, a level-set segmentation and mask-based template learning are then applied to refine the brain region, and multi-expert fusion is used to generate a new mask. Finally, we combine region growing based on histogram distribution learning to delineate the final brain mask. A high-resolution rodent atlas is used to illustrate that the segmented low-resolution anatomic image can be well mapped to the atlas. Tested on a public data set, all brains are located reliably and we achieve a mean Jaccard similarity score of 94.99% for brain segmentation, a statistically significant improvement over two other rodent brain extraction methods.
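
    The Jaccard similarity score reported above measures the overlap between the predicted and reference brain masks: intersection size divided by union size. A minimal sketch on a tiny binary "image" (the masks here are made-up examples, not data from the study):

```python
import numpy as np

def jaccard(mask_a, mask_b):
    """Jaccard similarity |A ∩ B| / |A ∪ B| between two binary masks."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0                      # two empty masks agree perfectly
    return np.logical_and(a, b).sum() / union

pred  = np.array([[0, 1, 1], [0, 1, 1]])
truth = np.array([[0, 1, 1], [1, 1, 0]])
print(jaccard(pred, truth))  # 3 shared voxels / 5 in the union = 0.6
```

    A score of 94.99%, as reported in the paper, thus means the predicted and manual masks share almost all of their combined voxels.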