WorldWideScience

Sample records for mapping approach defines

  1. Defining Mapping Mashups with BioXMash

    Directory of Open Access Journals (Sweden)

    Hunt Ela

    2007-12-01

Full Text Available We present a novel approach to XML data integration that allows a biologist to select data from a large XML file repository, add it to a genome map, and produce a mapping mashup showing integrated data in map context. This approach can be used to produce contextual views of arbitrary XML data relating to objects shown on a map. A biologist using BioXMash searches in XML tags and is guided by XML path data availability, shown as the number of values reachable via a path, in both a global (genome-wide) and a local (per-gene) context. She then examines sample values in an area of interest on the map. If required, the resulting data is dumped to files for subsequent analysis.
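The path-availability guidance described above (counting how many values are reachable via each XML path) can be sketched in a few lines; the XML snippet, tag names, and function name below are illustrative, not part of BioXMash:

```python
import xml.etree.ElementTree as ET
from collections import Counter

def path_counts(xml_text):
    """Count how many non-empty text values are reachable via each tag path."""
    counts = Counter()

    def walk(node, prefix):
        path = f"{prefix}/{node.tag}"
        if node.text and node.text.strip():
            counts[path] += 1
        for child in node:
            walk(child, path)

    walk(ET.fromstring(xml_text), "")
    return counts

sample = """<genes>
  <gene><name>BRCA1</name><go>DNA repair</go></gene>
  <gene><name>TP53</name></gene>
</genes>"""
print(path_counts(sample))  # '/genes/gene/name' is reachable via 2 values, '/genes/gene/go' via 1
```

A user browsing such counts sees at a glance which paths are densely populated genome-wide before drilling into per-gene values.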

  2. Towards Technological Approaches for Concept Maps Mining from Text

    Directory of Open Access Journals (Sweden)

    Camila Zacche Aguiar

    2018-04-01

Full Text Available Concept maps are resources for the representation and construction of knowledge. They allow showing, through concepts and relationships, how knowledge about a subject is organized. Technological advances have boosted the development of approaches for the automatic construction of concept maps, to facilitate and provide the benefits of that resource more broadly. Given the need to better identify and analyze the functionalities and characteristics of those approaches, we conducted a detailed study of technological approaches for the automatic construction of concept maps published between 1994 and 2016 in the IEEE Xplore, ACM and Elsevier Science Direct databases. From this study, we elaborated a categorization defined by two perspectives, Data Source and Graphic Representation, and fourteen categories. The study collected 30 relevant articles, which were mapped onto the proposed categorization to identify the main features and limitations of each approach. A detailed view of these approaches, their characteristics and techniques is presented, enabling a quantitative analysis. In addition, the categorization has given us objective conditions to establish new specification requirements for a technological approach aimed at mining concept maps from texts.

  3. Analytic mappings: a new approach in particle production by accelerated observers

    International Nuclear Information System (INIS)

    Sanchez, N.

    1982-01-01

This is a summary of the author's recent results on the physical consequences of analytic mappings in space-time. Classically, the mapping defines an accelerated frame. At the quantum level it gives rise to particle production. Statistically, the real singularities of the mapping have associated temperatures. This constitutes a new approach to Q.F.T. as formulated in accelerated frames. It can be considered a first step towards understanding the deep connection that could exist between the structure (geometry and topology) of space-time and thermodynamics, mainly motivated by the works of Hawking since 1975. (Auth.)

  4. Evaluating the Use of an Object-Based Approach to Lithological Mapping in Vegetated Terrain

    Directory of Open Access Journals (Sweden)

    Stephen Grebby

    2016-10-01

Full Text Available Remote sensing-based approaches to lithological mapping are traditionally pixel-oriented, with classification performed on either a per-pixel or sub-pixel basis with complete disregard for contextual information about neighbouring pixels. However, intra-class variability due to heterogeneous surface cover (i.e., vegetation and soil) or regional variations in mineralogy and chemical composition can result in the generation of unrealistic, generalised lithological maps that exhibit the “salt-and-pepper” artefact of spurious pixel classifications, as well as poorly defined contacts. In this study, an object-based image analysis (OBIA) approach to lithological mapping is evaluated with respect to its ability to overcome these issues by instead classifying groups of contiguous pixels (i.e., objects). Due to significant vegetation cover in the study area, the OBIA approach incorporates airborne multispectral and LiDAR data to indirectly map lithologies by exploiting associations with both topography and vegetation type. The resulting lithological maps were assessed both in terms of their thematic accuracy and ability to accurately delineate lithological contacts. The OBIA approach is found to be capable of generating maps with an overall accuracy of 73.5% through integrating spectral and topographic input variables. When compared to equivalent per-pixel classifications, the OBIA approach achieved thematic accuracy increases of up to 13.1%, whilst also reducing the “salt-and-pepper” artefact to produce more realistic maps. Furthermore, the OBIA approach was also generally capable of mapping lithological contacts more accurately. The importance of optimising the segmentation stage of the OBIA approach is also highlighted. Overall, this study clearly demonstrates the potential of OBIA for lithological mapping applications, particularly in significantly vegetated and heterogeneous terrain.
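The core OBIA move, classifying contiguous groups of pixels rather than individual pixels, can be illustrated with a minimal majority-vote sketch; the segment labels and class codes below are synthetic, and a real OBIA workflow would also optimize the segmentation itself:

```python
import numpy as np

def object_majority(labels, pixel_classes):
    """Assign each object (a group of pixels sharing a segment label)
    the majority class of its member pixels, suppressing the
    'salt-and-pepper' artefact of isolated misclassified pixels."""
    out = np.empty_like(pixel_classes)
    for obj in np.unique(labels):
        mask = labels == obj
        vals, counts = np.unique(pixel_classes[mask], return_counts=True)
        out[mask] = vals[np.argmax(counts)]
    return out

# two segments; one stray per-pixel misclassification in segment 0
labels = np.array([[0, 0, 1],
                   [0, 0, 1]])
pix    = np.array([[2, 2, 5],
                   [3, 2, 5]])
print(object_majority(labels, pix))  # the stray '3' is voted back to '2'
```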

  5. Theoretical approaches to elections defining

    OpenAIRE

    Natalya V. Lebedeva

    2011-01-01

Theoretical approaches to defining elections develop the nature, essence and content of elections, and help to determine their place and role as one of the major institutions of national law in a democratic system.

  6. Theoretical approaches to elections defining

    Directory of Open Access Journals (Sweden)

    Natalya V. Lebedeva

    2011-01-01

Full Text Available Theoretical approaches to defining elections develop the nature, essence and content of elections, and help to determine their place and role as one of the major institutions of national law in a democratic system.

  7. Localization of canine brachycephaly using an across breed mapping approach.

    Directory of Open Access Journals (Sweden)

    Danika Bannasch

    2010-03-01

Full Text Available The domestic dog, Canis familiaris, exhibits profound phenotypic diversity and is an ideal model organism for the genetic dissection of simple and complex traits. However, some of the most interesting phenotypes are fixed in particular breeds and are therefore less tractable to genetic analysis using classical segregation-based mapping approaches. We implemented an across breed mapping approach using a moderately dense SNP array, a low number of animals and breeds carefully selected for the phenotypes of interest to identify genetic variants responsible for breed-defining characteristics. Using a modest number of affected (10-30) and control (20-60) samples from multiple breeds, the correct chromosomal assignment was identified in a proof of concept experiment using three previously defined loci: hyperuricosuria, white spotting and chondrodysplasia. Genome-wide association was performed in a similar manner for one of the most striking morphological traits in dogs: brachycephalic head type. Although candidate gene approaches based on comparable phenotypes in mice and humans have been utilized for this trait, the causative gene has remained elusive using this method. Samples from nine affected breeds and thirteen control breeds identified strong genome-wide associations for brachycephalic head type on Cfa 1. Two independent datasets identified the same genomic region. Levels of relative heterozygosity in the associated region indicate that it has been subjected to a selective sweep, consistent with it being a breed defining morphological characteristic. Genotyping additional dogs in the region confirmed the association. To date, the genetic structure of dog breeds has primarily been exploited for genome wide association for segregating traits. These results demonstrate that non-segregating traits under strong selection are equally tractable to genetic analysis using small sample numbers.
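An across-breed case/control association of this kind can be sketched as a per-SNP Pearson chi-square on pooled allele counts; the counts below are invented for illustration and are not from the study:

```python
def allele_chi2(case_alt, case_n, ctrl_alt, ctrl_n):
    """Pearson chi-square on a 2x2 table of alternate/reference allele
    counts in affected vs control breed pools (2 alleles per diploid dog)."""
    a, b = case_alt, 2 * case_n - case_alt
    c, d = ctrl_alt, 2 * ctrl_n - ctrl_alt
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# e.g. 10 affected dogs nearly fixed for an allele vs 20 controls
print(allele_chi2(19, 10, 4, 20))   # large statistic -> strong association
print(allele_chi2(10, 10, 20, 20))  # identical allele frequencies -> 0.0
```

In practice this runs once per SNP across the array, and significance thresholds must correct for multiple testing and breed stratification.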

  8. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    Science.gov (United States)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

Spatial information on physical soil properties is in strong demand to support environment-related and land-use management decisions. One of the most widely used properties to characterize soils physically is particle size distribution (PSD), which determines soil water management and cultivability. According to their size, different particles can be categorized as clay, silt, or sand. The size intervals are defined by national or international textural classification systems. The relative percentages of sand, silt, and clay in the soil constitute textural classes, which are themselves specified differently in various national and/or specialty systems. The most commonly used is the classification system of the United States Department of Agriculture (USDA). Soil texture information is essential input data in meteorological, hydrological and agricultural prediction modelling. Although Hungary has a great deal of legacy soil maps and other relevant soil information, it often occurs that maps do not exist for a certain characteristic with the required thematic and/or spatial representation. The recent developments in digital soil mapping (DSM), however, provide wide opportunities for the elaboration of object-specific soil maps (OSSM) with predefined parameters (resolution, accuracy, reliability, etc.). Due to the simultaneous richness of available Hungarian legacy soil data, spatial inference methods and auxiliary environmental information, there is a high versatility of possible approaches for the compilation of a given soil map. This suggests the opportunity of optimization. For the creation of an OSSM one might intend to identify the optimum set of soil data, method and auxiliary co-variables optimized for the resources (data costs, computation requirements, etc.). We have started a comprehensive analysis of the effects of the various DSM components on the accuracy of the output maps on pilot areas. The aim of this study is to compare and evaluate different
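As an illustration of the textural classification mentioned above, a heavily simplified subset of USDA-style rules; only three corner classes are handled, and the exact thresholds should be checked against the full USDA texture triangle:

```python
def usda_texture(sand, silt, clay):
    """Classify a particle-size distribution (percentages summing to 100)
    into a simplified USDA texture class; only a few corner classes are
    implemented here, everything else falls through to 'other'."""
    assert abs(sand + silt + clay - 100) < 1e-6
    if silt + 1.5 * clay < 15:
        return "sand"
    if clay >= 40 and sand <= 45 and silt < 40:
        return "clay"
    if silt >= 80 and clay < 12:
        return "silt"
    return "other"

print(usda_texture(92, 5, 3))    # sand
print(usda_texture(20, 25, 55))  # clay
```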

  9. New approach on seismic hazard isoseismal map for Romania

    International Nuclear Information System (INIS)

    Marmureanu, Gheorghe; Cioflan, Carmen Ortanza; Marmureanu, Alexandru

    2008-01-01

The seismicity of Romania comes from the energy released by crustal earthquakes, which have a depth of not more than 40 km, and by intermediate-depth earthquakes originating in the Vrancea region (a unique case in Europe) with depths between 60 and 200 km. The authors developed the concept of a 'control earthquake' and equations to obtain the banana shape of the attenuation curves of the macroseismic intensity I (along the directions defined by azimuth Az) for a Vrancea earthquake at a depth 80 < x < 160 km. Deterministic and probabilistic approaches were used, both linear and nonlinear. The final map is in MMI intensity (isoseismal map) for the maximum possible Vrancea earthquake with Richter magnitude MGR 7.5. This will avoid drawbacks for civil structural designers and for insurance companies, which pay all damages and life losses as a function of earthquake intensity. (authors)

  10. Approach of simultaneous localization and mapping based on local maps for robot

    Institute of Scientific and Technical Information of China (English)

    CHEN Bai-fan; CAI Zi-xing; HU De-wen

    2006-01-01

An extended Kalman filter approach to simultaneous localization and mapping (SLAM) was proposed based on local maps. A local frame of reference was established periodically at the position of the robot, and the observations of the robot and landmarks were then fused into the global frame of reference. Because of the independence of the local map, the approach does not accumulate the estimation and calculation errors produced by SLAM using the Kalman filter directly. At the same time, it reduces the computational complexity. Simulation experiments show the method to be correct and feasible.
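The key operation of the local-map approach, re-expressing landmark estimates from the robot-centred local frame in the global frame, is a rigid transform. A minimal 2D sketch of that step (not the full EKF; pose and landmark values are invented):

```python
import numpy as np

def local_to_global(pose, landmarks_local):
    """Transform landmark estimates from a robot-centred local frame
    into the global frame; pose = (x, y, heading theta in radians)."""
    x, y, th = pose
    R = np.array([[np.cos(th), -np.sin(th)],
                  [np.sin(th),  np.cos(th)]])
    return landmarks_local @ R.T + np.array([x, y])

# robot at (2, 1) heading 90 degrees; a landmark 1 m ahead on the local x-axis
pts = local_to_global((2.0, 1.0, np.pi / 2), np.array([[1.0, 0.0]]))
print(pts)  # lands at (2, 2) in the global frame
```

Periodically restarting estimation in a fresh local frame and fusing only the finished local map into the global map is what keeps linearization errors from accumulating.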

  11. A comparison of top-down and bottom-up approaches to benthic habitat mapping to inform offshore wind energy development

    Science.gov (United States)

    LaFrance, Monique; King, John W.; Oakley, Bryan A.; Pratt, Sheldon

    2014-07-01

    Recent interest in offshore renewable energy within the United States has amplified the need for marine spatial planning to direct management strategies and address competing user demands. To assist this effort in Rhode Island, benthic habitat classification maps were developed for two sites in offshore waters being considered for wind turbine installation. Maps characterizing and representing the distribution and extent of benthic habitats are valuable tools for improving understanding of ecosystem patterns and processes, and promoting scientifically-sound management decisions. This project presented the opportunity to conduct a comparison of the methodologies and resulting map outputs of two classification approaches, “top-down” and “bottom-up” in the two study areas. This comparison was undertaken to improve understanding of mapping methodologies and their applicability, including the bottom-up approach in offshore environments where data density tends to be lower, as well as to provide case studies for scientists and managers to consider for their own areas of interest. Such case studies can offer guidance for future work for assessing methodologies and translating them to other areas. The traditional top-down mapping approach identifies biological community patterns based on communities occurring within geologically defined habitat map units, under the concept that geologic environments contain distinct biological assemblages. Alternatively, the bottom-up approach aims to establish habitat map units centered on biological similarity and then uses statistics to identify relationships with associated environmental parameters and determine habitat boundaries. When applied to the two study areas, both mapping approaches produced habitat classes with distinct macrofaunal assemblages and each established statistically strong and significant biotic-abiotic relationships with geologic features, sediment characteristics, water depth, and/or habitat

  12. Mapping Second Chromosome Mutations to Defined Genomic Regions in Drosophila melanogaster.

    Science.gov (United States)

    Kahsai, Lily; Cook, Kevin R

    2018-01-04

Hundreds of Drosophila melanogaster stocks are currently maintained at the Bloomington Drosophila Stock Center with mutations that have not been associated with sequence-defined genes. They have been preserved because they have interesting loss-of-function phenotypes. The experimental value of these mutations would be increased by tying them to specific genomic intervals so that geneticists can more easily associate them with annotated genes. Here, we report the mapping of 85 second chromosome complementation groups in the Bloomington collection to specific, small clusters of contiguous genes or individual genes in the sequenced genome. This information should prove valuable to Drosophila geneticists interested in processes associated with particular phenotypes and those searching for mutations affecting specific sequence-defined genes. Copyright © 2018 Kahsai, Cook.

  13. Mapping Second Chromosome Mutations to Defined Genomic Regions in Drosophila melanogaster

    Directory of Open Access Journals (Sweden)

    Lily Kahsai

    2018-01-01

    Full Text Available Hundreds of Drosophila melanogaster stocks are currently maintained at the Bloomington Drosophila Stock Center with mutations that have not been associated with sequence-defined genes. They have been preserved because they have interesting loss-of-function phenotypes. The experimental value of these mutations would be increased by tying them to specific genomic intervals so that geneticists can more easily associate them with annotated genes. Here, we report the mapping of 85 second chromosome complementation groups in the Bloomington collection to specific, small clusters of contiguous genes or individual genes in the sequenced genome. This information should prove valuable to Drosophila geneticists interested in processes associated with particular phenotypes and those searching for mutations affecting specific sequence-defined genes.

  14. Engineering geological mapping in Wallonia (Belgium) : present state and recent computerized approach

    Science.gov (United States)

    Delvoie, S.; Radu, J.-P.; Ruthy, I.; Charlier, R.

    2012-04-01

An engineering geological map can be defined as a geological map with a generalized representation of all the components of a geological environment that are strongly required for spatial planning, design, construction and maintenance of civil engineering works. In Wallonia (Belgium), 24 engineering geological maps were developed between the 70s and the 90s at 1/5,000 or 1/10,000 scale, covering some areas of the most industrialized and urbanized cities (Liège, Charleroi and Mons). They were based on soil and subsoil data points (borings, drillings, penetration tests, geophysical tests, outcrops…). Some displayed data present the depth (with isoheights) or the thickness (with isopachs) of the different subsoil layers down to about 50 m depth. Information about the geomechanical properties of each subsoil layer, useful for engineers and urban planners, is also synthesized. However, these maps were produced only on paper and progressively needed to be updated with new soil and subsoil data. The Public Service of Wallonia and the University of Liège have recently initiated a study to evaluate the feasibility of developing engineering geological mapping with a computerized approach. Numerous and varied data (about soil and subsoil) are stored in a georelational database (the geotechnical database, using Access, Microsoft®). All the data are geographically referenced. The database is linked to a GIS project (using ArcGIS, ESRI®). Together, the database and the GIS project constitute a powerful tool for spatial data management and analysis. This approach involves a methodology using interpolation methods to update the previous maps and to extend the coverage to new areas. The location (x, y, z) of each subsoil layer is then computed from the data points. The geomechanical data of these layers are synthesized in an explanatory booklet accompanying the maps.

  15. Fast periodic stimulation (FPS): a highly effective approach in fMRI brain mapping.

    Science.gov (United States)

    Gao, Xiaoqing; Gentile, Francesco; Rossion, Bruno

    2018-03-03

    Defining the neural basis of perceptual categorization in a rapidly changing natural environment with low-temporal resolution methods such as functional magnetic resonance imaging (fMRI) is challenging. Here, we present a novel fast periodic stimulation (FPS)-fMRI approach to define face-selective brain regions with natural images. Human observers are presented with a dynamic stream of widely variable natural object images alternating at a fast rate (6 images/s). Every 9 s, a short burst of variable face images contrasting with object images in pairs induces an objective face-selective neural response at 0.111 Hz. A model-free Fourier analysis achieves a twofold increase in signal-to-noise ratio compared to a conventional block-design approach with identical stimuli and scanning duration, allowing to derive a comprehensive map of face-selective areas in the ventral occipito-temporal cortex, including the anterior temporal lobe (ATL), in all individual brains. Critically, periodicity of the desired category contrast and random variability among widely diverse images effectively eliminates the contribution of low-level visual cues, and lead to the highest values (80-90%) of test-retest reliability in the spatial activation map yet reported in imaging higher level visual functions. FPS-fMRI opens a new avenue for understanding brain function with low-temporal resolution methods.
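The model-free Fourier analysis can be sketched on a synthetic voxel time series: take the amplitude spectrum and compare the bin at the stimulation frequency (0.111 Hz) against neighbouring bins. The sampling rate, scan length, and noise level below are assumptions for illustration:

```python
import numpy as np

def fps_response(ts, fs, f_stim):
    """Amplitude spectrum of a time series and the SNR of the bin at the
    stimulation frequency relative to the mean of neighbouring bins."""
    amp = np.abs(np.fft.rfft(ts)) / len(ts)
    freqs = np.fft.rfftfreq(len(ts), d=1.0 / fs)
    k = np.argmin(np.abs(freqs - f_stim))
    neigh = np.r_[amp[max(k - 5, 1):k], amp[k + 1:k + 6]]
    return amp[k], amp[k] / neigh.mean()

fs = 1.0                      # hypothetical 1 Hz sampling (TR = 1 s)
t = np.arange(360) / fs       # 6-minute synthetic run
rng = np.random.default_rng(0)
ts = 2.0 * np.sin(2 * np.pi * 0.111 * t) + rng.normal(0, 0.1, t.size)
peak, snr = fps_response(ts, fs, 0.111)
print(peak, snr)
```

Because the response of interest is confined to one known frequency bin, no hemodynamic model or baseline condition is needed, which is the appeal of the periodic design.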

  16. Defining mental disorder. Exploring the 'natural function' approach.

    Science.gov (United States)

    Varga, Somogy

    2011-01-21

Due to several socio-political factors, to many psychiatrists only a strictly objective definition of mental disorder, free of value components, seems really acceptable. In this paper, I will explore a variant of such an objectivist approach to defining mental disorder, natural function objectivism. Proponents of this approach make recourse to the notion of natural function in order to reach a value-free definition of mental disorder. The exploration of Christopher Boorse's 'biostatistical' account of natural function (1) will be followed by an investigation of the 'hybrid naturalism' approach to natural functions by Jerome Wakefield (2). In the third part, I will explore two proposals that call into question the whole attempt to define mental disorder (3). I will conclude that while 'natural function objectivism' accounts fail to provide the backdrop for a reliable definition of mental disorder, there is no compelling reason to conclude that a definition cannot be achieved.

  17. A practical and automated approach to large area forest disturbance mapping with remote sensing.

    Directory of Open Access Journals (Sweden)

    Mutlu Ozdogan

Full Text Available In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: (i) creating masks for water, non-forested areas, clouds, and cloud shadows; (ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; (iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, (iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of mis-classification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. 
Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for
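Step (ii), selecting candidate training pixels from the SWIR difference image by a standard-deviation threshold, might look like this on synthetic data; the threshold k and the definition of the 'stable' band are illustrative choices, not the paper's exact parameters:

```python
import numpy as np

def candidate_training_pixels(swir_diff, k=2.0):
    """Flag pixels whose SWIR-difference value lies more than k standard
    deviations above the scene mean as candidate 'disturbed' training
    samples, and pixels near the mean as candidate 'stable' samples."""
    mu, sd = swir_diff.mean(), swir_diff.std()
    disturbed = swir_diff > mu + k * sd
    stable = np.abs(swir_diff - mu) < 0.5 * sd
    return disturbed, stable

rng = np.random.default_rng(1)
diff = rng.normal(0, 1, (100, 100))   # synthetic SWIR difference image
diff[:5, :5] += 8                     # synthetic disturbance patch
dist, stab = candidate_training_pixels(diff)
print(dist.sum(), stab.sum())
```

These automatically harvested samples would then pass through the cross-validated filtering of step (iii) before training the final classifier, so a few false flags here are tolerable.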

  18. NOAA's Shoreline Survey Maps - Raster NOAA-NOS Shoreline Survey Manuscripts that define the shoreline and alongshore natural and man-made features

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NOS coastal survey maps (often called t-sheet or tp-sheet maps) are special use planimetric or topographic maps that precisely define the shoreline and alongshore...

  19. Defining mental disorder. Exploring the 'natural function' approach

    Directory of Open Access Journals (Sweden)

    Varga Somogy

    2011-01-01

Full Text Available Abstract Due to several socio-political factors, to many psychiatrists only a strictly objective definition of mental disorder, free of value components, seems really acceptable. In this paper, I will explore a variant of such an objectivist approach to defining mental disorder, natural function objectivism. Proponents of this approach make recourse to the notion of natural function in order to reach a value-free definition of mental disorder. The exploration of Christopher Boorse's 'biostatistical' account of natural function (1) will be followed by an investigation of the 'hybrid naturalism' approach to natural functions by Jerome Wakefield (2). In the third part, I will explore two proposals that call into question the whole attempt to define mental disorder (3). I will conclude that while 'natural function objectivism' accounts fail to provide the backdrop for a reliable definition of mental disorder, there is no compelling reason to conclude that a definition cannot be achieved.

  20. The sustainability, a relevant approach for defining the road-map for future nuclear fuel cycles

    Energy Technology Data Exchange (ETDEWEB)

    Poinssot, C.; Bourg, S.; Grandjean, S. [CEA Centre de Macoule, Nuclear Energy Division, Radiochemistry and Processes Department, BP11, F-30207 Bagnols sur Ceze (France); Boullis, B. [CEA Centre de Saclay, Nuclear Energy Division, Innovation and Industrial Support Division, F-91191 Gif-sur-Yvette (France)

    2016-07-01

Developing sustainable energy systems is a key driver for mitigating global climate change within the framework of the COP21 conference commitments. Nuclear energy is strongly concerned by such a perspective, and we have developed a thorough approach to defining the relevant objectives to fulfill in order to meet this requirement. On this basis, nuclear energy systems need to be improved to increase the preservation of the natural uranium resource and to reduce their environmental footprint. In both cases this requires increased recycling of energetic materials, based first on the current LWR fleet and, in the future, on SFRs, which would allow a much more efficient use of neutrons to consume uranium and produce energy. Furthermore, this reactor type would also allow minor actinide recycling, which would significantly reduce the toxicity and lifetime of the ultimate waste, as well as the repository footprint. In this perspective, recycling the actinides is clearly the cornerstone of any sustainable nuclear fuel cycle.

  1. The sustainability, a relevant approach for defining the road-map for future nuclear fuel cycles

    International Nuclear Information System (INIS)

    Poinssot, C.; Bourg, S.; Grandjean, S.; Boullis, B.

    2016-01-01

Developing sustainable energy systems is a key driver for mitigating global climate change within the framework of the COP21 conference commitments. Nuclear energy is strongly concerned by such a perspective, and we have developed a thorough approach to defining the relevant objectives to fulfill in order to meet this requirement. On this basis, nuclear energy systems need to be improved to increase the preservation of the natural uranium resource and to reduce their environmental footprint. In both cases this requires increased recycling of energetic materials, based first on the current LWR fleet and, in the future, on SFRs, which would allow a much more efficient use of neutrons to consume uranium and produce energy. Furthermore, this reactor type would also allow minor actinide recycling, which would significantly reduce the toxicity and lifetime of the ultimate waste, as well as the repository footprint. In this perspective, recycling the actinides is clearly the cornerstone of any sustainable nuclear fuel cycle.

  2. Functional physiology of the human terminal antrum defined by high-resolution electrical mapping and computational modeling.

    Science.gov (United States)

    Berry, Rachel; Miyagawa, Taimei; Paskaranandavadivel, Niranchan; Du, Peng; Angeli, Timothy R; Trew, Mark L; Windsor, John A; Imai, Yohsuke; O'Grady, Gregory; Cheng, Leo K

    2016-11-01

High-resolution (HR) mapping has been used to study gastric slow-wave activation; however, the specific characteristics of antral electrophysiology remain poorly defined. This study applied HR mapping and computational modeling to define functional human antral physiology. HR mapping was performed in 10 subjects using flexible electrode arrays (128-192 electrodes; 16-24 cm²) arranged from the pylorus to mid-corpus. Anatomical registration was by photographs and anatomical landmarks. Slow-wave parameters were computed, and resultant data were incorporated into a computational fluid dynamics (CFD) model of gastric flow to calculate impact on gastric mixing. In all subjects, extracellular mapping demonstrated normal aboral slow-wave propagation and a region of increased amplitude and velocity in the prepyloric antrum. On average, the high-velocity region commenced 28 mm proximal to the pylorus, and activation ceased 6 mm from the pylorus. Within this region, velocity increased 0.2 mm/s per mm of tissue, from the mean 3.3 ± 0.1 mm/s to 7.5 ± 0.6 mm/s (P human terminal antral contraction is controlled by a short region of rapid high-amplitude slow-wave activity. Distal antral wave acceleration plays a major role in antral flow and mixing, increasing particle strain and trituration. Copyright © 2016 the American Physiological Society.

  3. Constant-scale natural boundary mapping to reveal global and cosmic processes

    CERN Document Server

    Clark, Pamela Elizabeth

    2013-01-01

    Whereas conventional maps can be expressed as outward-expanding formulae with well-defined central features and relatively poorly defined edges, Constant Scale Natural Boundary (CSNB) maps have well-defined boundaries that result from natural processes and thus allow spatial and dynamic relationships to be observed in a new way useful to understanding these processes. CSNB mapping presents a new approach to visualization that produces maps markedly different from those produced by conventional cartographic methods. In this approach, any body can be represented by a 3D coordinate system. For a regular body, with its surface relatively smooth on the scale of its size, locations of features can be represented by definite geographic grid (latitude and longitude) and elevation, or deviation from the triaxial ellipsoid defined surface. A continuous surface on this body can be segmented, its distinctive regional terranes enclosed, and their inter-relationships defined, by using selected morphologically identifiable ...

  4. The frequency-domain approach for apparent density mapping

    Science.gov (United States)

    Tong, T.; Guo, L.

    2017-12-01

Apparent density mapping is a technique to estimate the density distribution in a subsurface layer from observed gravity data. It has been widely applied for geologic mapping, tectonic study and mineral exploration for decades. Apparent density mapping usually models the density layer as a collection of vertical, juxtaposed prisms in both horizontal directions, whose top and bottom surfaces are assumed to be horizontal or variable-depth, and then inverts or deconvolves the gravity anomalies to determine the density of each prism. Conventionally, the frequency-domain approach, which assumes that both the top and bottom surfaces of the layer are horizontal, is utilized for fast density mapping. However, this assumption is not always valid in the real world, since either the top surface or the bottom surface may be variable-depth. Here, we present a frequency-domain approach for apparent density mapping that permits both the top and bottom surfaces of the layer to be variable-depth. We first derive the formula for forward calculation of the gravity anomalies caused by a density layer whose top and bottom surfaces are variable-depth, and the formula for inversion of the gravity anomalies for the density distribution. We then propose a procedure for density mapping based on both the inversion and forward-calculation formulas. We tested the approach on synthetic data, which verified its effectiveness. We also tested the approach on real Bouguer gravity anomaly data from central South China. The top surface was assumed to be flat and at sea level, and the bottom surface was taken as the Moho surface. The result presented the crustal density distribution, which coincided well with the basic tectonic features of the study area.
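For a layer with horizontal top and bottom surfaces (the conventional assumption this paper relaxes), the forward and inverse mappings reduce to multiplication and division by an analytic response in the wavenumber domain. A 1D profile sketch under that simplified flat-layer assumption, with invented anomaly values:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, SI units

def layer_response(n, dx, z_top, z_bot):
    """Wavenumber-domain response of a flat layer (depths z_top..z_bot):
    (exp(-k z_top) - exp(-k z_bot)) / k, with the k -> 0 limit z_bot - z_top."""
    k = 2 * np.pi * np.abs(np.fft.fftfreq(n, d=dx))
    safe_k = np.where(k > 0, k, 1.0)
    return np.where(k > 0,
                    (np.exp(-k * z_top) - np.exp(-k * z_bot)) / safe_k,
                    z_bot - z_top)

def gravity_of_layer(rho, dx, z_top, z_bot):
    """Forward model: surface gravity from laterally varying density rho(x)."""
    resp = layer_response(rho.size, dx, z_top, z_bot)
    return np.fft.ifft(2 * np.pi * G * np.fft.fft(rho) * resp).real

def density_of_layer(g, dx, z_top, z_bot):
    """Inverse mapping: deconvolve the same response to recover rho(x)."""
    resp = layer_response(g.size, dx, z_top, z_bot)
    return np.fft.ifft(np.fft.fft(g) / (2 * np.pi * G * resp)).real

x = np.arange(256) * 1000.0                       # 1 km station spacing
rho = 100.0 * np.exp(-((x - 128e3) / 20e3) ** 2)  # density anomaly, kg/m^3
g = gravity_of_layer(rho, 1000.0, 0.0, 30e3)      # sea level down to ~Moho
print(np.max(np.abs(density_of_layer(g, 1000.0, 0.0, 30e3) - rho)))  # round-trip error
```

The paper's contribution is the harder case where the exponential depth terms vary laterally, so the response is no longer a simple per-wavenumber divisor and the inversion must be iterated.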

  5. Determination of contact maps in proteins: A combination of structural and chemical approaches

    Energy Technology Data Exchange (ETDEWEB)

    Wołek, Karol; Cieplak, Marek, E-mail: mc@ifpan.edu.pl [Institute of Physics, Polish Academy of Science, Al. Lotników 32/46, 02-668 Warsaw (Poland); Gómez-Sicilia, Àngel [Instituto Cajal, Consejo Superior de Investigaciones Cientificas (CSIC), Av. Doctor Arce, 37, 28002 Madrid (Spain); Instituto Madrileño de Estudios Avanzados en Nanociencia (IMDEA-Nanociencia), C/Faraday 9, 28049 Cantoblanco (Madrid) (Spain)

    2015-12-28

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, takes chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as the formation of ionic bridges and the destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for the validity of an inter-residue contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, neither of these maps correlates well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties, but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.
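
    In its simplest reading, the OV criterion declares a contact whenever enlarged spheres intersect. The sketch below reduces each residue to a single effective sphere, whereas the actual method uses clusters of atomic spheres; the enlargement factor (~1.24 in the structure-based modelling literature) and the sequence-separation cutoff are assumptions:

```python
import numpy as np

def overlap_contact_map(coords, radii, scale=1.24, min_seq_sep=3):
    """Toy overlap (OV) criterion: one effective sphere per residue.
    Two residues i, j (at least `min_seq_sep` apart in sequence) are in
    contact when their spheres, enlarged by `scale`, intersect."""
    n = len(coords)
    contacts = set()
    for i in range(n):
        for j in range(i + min_seq_sep, n):
            if np.linalg.norm(coords[i] - coords[j]) <= scale * (radii[i] + radii[j]):
                contacts.add((i, j))
    return contacts
```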

  6. A tiered approach for ecosystem services mapping

    OpenAIRE

    Grêt-Regamey, Adrienne; Weibel, Bettina; Rabe, Sven-Erik; Burkhard, Benjamin

    2017-01-01

    Mapping ecosystem services delivers essential insights into the spatial characteristics of various goods’ and services’ flows from nature to human society. It has become a central topic of science, policy, business and society – all depending on functioning ecosystems. This textbook summarises the current state-of-the-art of ecosystem services mapping, related theory and methods, different ecosystem service quantification and modelling approaches as well as practical applications. The book...

  7. Comparison of spatial association approaches for landscape mapping of soil organic carbon stocks

    Science.gov (United States)

    Miller, B. A.; Koszinski, S.; Wehrhan, M.; Sommer, M.

    2015-03-01

    The distribution of soil organic carbon (SOC) can be variable at small analysis scales, but consideration of its role in regional and global issues demands the mapping of large extents. There are many different strategies for mapping SOC, among which is to model the variables needed to calculate the SOC stock indirectly or to model the SOC stock directly. The purpose of this research is to compare direct and indirect approaches to mapping SOC stocks from rule-based, multiple linear regression models applied at the landscape scale via spatial association. The final products for both strategies are high-resolution maps of SOC stocks (kg m-2), covering an area of 122 km2, with accompanying maps of estimated error. For the direct modelling approach, the estimated error map was based on the internal error estimations from the model rules. For the indirect approach, the estimated error map was produced by spatially combining the error estimates of component models via standard error propagation equations. We compared these two strategies for mapping SOC stocks on the basis of the qualities of the resulting maps as well as the magnitude and distribution of the estimated error. The direct approach produced a map with less spatial variation than the map produced by the indirect approach. The increased spatial variation represented by the indirect approach improved R2 values for the topsoil and subsoil stocks. Although the indirect approach had a lower mean estimated error for the topsoil stock, the mean estimated error for the total SOC stock (topsoil + subsoil) was lower for the direct approach. For these reasons, we recommend the direct approach to modelling SOC stocks be considered a more conservative estimate of the SOC stocks' spatial distribution.
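
    For the indirect strategy, the errors of the component models must be combined. A minimal sketch of standard error propagation for a stock computed as a product of independently modelled terms (e.g. concentration x bulk density x depth); the paper's exact propagation equations are not reproduced here:

```python
import math

def propagate_product_error(values, std_errors):
    """Standard error propagation for a product of independent terms:
    relative variances add. Returns (product, standard error of product).
    A simplified sketch; covariances between components are ignored."""
    product = 1.0
    rel_var = 0.0
    for v, se in zip(values, std_errors):
        product *= v
        rel_var += (se / v) ** 2
    return product, abs(product) * math.sqrt(rel_var)
```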

  8. Towards Technological Approaches for Concept Maps Mining from Text

    OpenAIRE

    Camila Zacche Aguiar; Davidson Cury; Amal Zouaq

    2018-01-01

    Concept maps are resources for the representation and construction of knowledge. They allow showing, through concepts and relationships, how knowledge about a subject is organized. Technological advances have boosted the development of approaches for the automatic construction of a concept map, to facilitate and provide the benefits of that resource more broadly. Due to the need to better identify and analyze the functionalities and characteristics of those approaches, we conducted a detailed...

  9. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    Full Text Available In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is worth reconsidering the concept from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper concludes that pipelining is a useful general design guideline for modular software-defined radio.
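
    The pipelining guideline can be illustrated with a generic longest-path layering of an acyclic module graph: each module is placed in the earliest pipeline stage its dependencies allow. This is a textbook scheduling sketch, not the paper's mapping algorithm, and the module names are hypothetical:

```python
from collections import deque

def pipeline_stages(nodes, edges):
    """Assign each module of an acyclic signal-processing graph to the
    earliest pipeline stage permitted by its predecessors (Kahn-style
    topological traversal with longest-path layering)."""
    preds = {n: set() for n in nodes}
    succs = {n: set() for n in nodes}
    for a, b in edges:
        preds[b].add(a)
        succs[a].add(b)
    remaining = {n: len(preds[n]) for n in nodes}
    ready = deque(n for n in nodes if not preds[n])
    stage = {}
    while ready:
        n = ready.popleft()
        stage[n] = 1 + max((stage[p] for p in preds[n]), default=0)
        for s in succs[n]:
            remaining[s] -= 1
            if remaining[s] == 0:
                ready.append(s)
    return stage
```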

  10. Clustering of color map pixels: an interactive approach

    Science.gov (United States)

    Moon, Yiu Sang; Luk, Franklin T.; Yuen, K. N.; Yeung, Hoi Wo

    2003-12-01

    The demand for digital maps continues to grow as mobile electronic devices become more popular. Instead of creating the entire map from scratch, we may convert a scanned paper map into a digital one. Color clustering is the very first step of the conversion process. Currently, most of the existing clustering algorithms are fully automatic. They are fast and efficient but may not work well in map conversion because of the numerous ambiguities associated with printed maps. Here we introduce two interactive approaches for color clustering on the map: color clustering with pre-calculated index colors (PCIC) and color clustering with pre-calculated color ranges (PCCR). We also introduce a memory model that could enhance and integrate different image processing techniques for fine-tuning the clustering results. Problems and examples of the algorithms are discussed in the paper.
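
    The core of clustering with pre-calculated index colors (PCIC) can be read as assigning each pixel to its nearest palette colour. The published method is interactive and more elaborate, so the following is only a nearest-neighbour sketch:

```python
import numpy as np

def cluster_to_index_colors(image, palette):
    """Label every pixel with the index of its nearest pre-calculated
    colour (Euclidean distance in RGB space).
    image: (H, W, 3) array; palette: (K, 3) array; returns (H, W) labels."""
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3).astype(float)
    # (N, K) distance matrix via broadcasting, then argmin per pixel
    dists = np.linalg.norm(pixels[:, None, :] - palette[None, :, :].astype(float), axis=2)
    return dists.argmin(axis=1).reshape(h, w)
```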

  11. DEFINING THE NOTION OF CONCEPT MAPS 3.0

    DEFF Research Database (Denmark)

    Jensen, Jesper; Johnsen, Lars

    The aim of this poster is to present a proposal of how concept maps may be described, annotated and exposed on the Web of Data, also frequently known as the Semantic Web or Web 3.0. In doing so, the poster will first introduce the concept of concept maps 3.0 – that is, concept maps which utilize..., and are enriched by, Web 3.0 technologies and resources. While concept maps 1.0 and 2.0 may be said to reflect earlier generations of the Web, the web of documents and the social web, the utilization of Web 3.0 technologies allows concept maps 3.0 to become machine-interpretable semantic web resources, and perhaps... even semantic learning resources. This has several implications. One is that concept map discoverability can undoubtedly be improved through metadata annotation and the use of search engine interpretable vocabularies such as https://schema.org/. Also, a key feature of Web 3.0 is that it supports...

  12. Definably compact groups definable in real closed fields.II

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We continue the analysis of definably compact groups definable in a real closed field $\mathcal{R}$. In [3], we proved that for every definably compact definably connected semialgebraic group $G$ over $\mathcal{R}$ there are a connected $R$-algebraic group $H$, a definable injective map $\phi$ from a generic definable neighborhood of the identity of $G$ into the group $H\left(R\right)$ of $R$-points of $H$ such that $\phi$ acts as a group homomorphism inside its domain. The above result and o...

  13. Lipid and protein maps defining arterial layers in atherosclerotic aorta

    Directory of Open Access Journals (Sweden)

    Marta Martin-Lorenzo

    2015-09-01

    Full Text Available Subclinical atherosclerosis cannot be predicted and novel therapeutic targets are needed. The molecular anatomy of healthy and atherosclerotic tissue is pursued to identify ongoing molecular changes in atherosclerosis development. Mass Spectrometry Imaging (MSI) offers the unique advantage of analyzing proteins and metabolites (lipids) while preserving their original localization; thus two-dimensional maps can be obtained. Main molecular alterations were investigated in a rabbit model in response to early development of atherosclerosis. Aortic arterial layers (intima and media) and calcified regions were investigated in detail by MALDI-MSI, and proteins and lipids specifically defining those areas of interest were identified. These data further complement main findings previously published in J Proteomics (M. Martin-Lorenzo et al., J. Proteomics, in press; M. Martin-Lorenzo et al., J. Proteomics 108 (2014) 465–468) [1,2].

  14. An optimization approach for extracting and encoding consistent maps in a shape collection

    KAUST Repository

    Huang, Qi-Xing

    2012-11-01

    We introduce a novel approach for computing high quality point-to-point maps among a collection of related shapes. The proposed approach takes as input a sparse set of imperfect initial maps between pairs of shapes and builds a compact data structure which implicitly encodes an improved set of maps between all pairs of shapes. These maps align well with point correspondences selected from the initial maps; they map neighboring points to neighboring points; and they provide cycle-consistency, so that map compositions along cycles approximate the identity map. The proposed approach is motivated by the fact that a complete set of maps between all pairs of shapes that admits nearly perfect cycle-consistency is highly redundant and can be represented by compositions of maps through a single base shape. In general, multiple base shapes are needed to adequately cover a diverse collection. Our algorithm sequentially extracts such a small collection of base shapes and creates correspondences from each of these base shapes to all other shapes. These correspondences are found by global optimization on candidate correspondences obtained by diffusing the initial maps. They are then used to create a compact graphical data structure from which globally optimal cycle-consistent maps can be extracted using simple graph algorithms. Experimental results on benchmark datasets show that the proposed approach yields significantly better results than state-of-the-art data-driven shape matching methods. © 2012 ACM.
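
    The redundancy argument, that a cycle-consistent set of pairwise maps can be represented by compositions through a base shape, can be sketched with maps stored as point-index dictionaries. This is a single-base simplification of the paper's multi-base algorithm:

```python
def compose(f, g):
    """Map composition: apply f, then g, on point indices."""
    return {p: g[f[p]] for p in f}

def pairwise_from_base(to_base, from_base):
    """Build maps between all shape pairs by routing through one base
    shape: map(i -> j) = from_base[j] . to_base[i]. By construction every
    cycle composes to the identity (perfect cycle-consistency)."""
    shapes = list(to_base)
    return {(i, j): compose(to_base[i], from_base[j])
            for i in shapes for j in shapes if i != j}
```

    Composing any such map with its reverse returns the identity, which is exactly the cycle-consistency property the encoded structure is meant to guarantee.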

  15. Application of a new genetic classification and semi-automated geomorphic mapping approach in the Perth submarine canyon, Australia

    Science.gov (United States)

    Picard, K.; Nanson, R.; Huang, Z.; Nichol, S.; McCulloch, M.

    2017-12-01

    The acquisition of high resolution marine geophysical data has intensified in recent years (e.g. multibeam echo-sounding, sub-bottom profiling). This progress provides the opportunity to classify and map the seafloor in greater detail, using new methods that preserve the links between processes and morphology. Geoscience Australia has developed a new genetic classification approach, nested within the Harris et al. (2014) global seafloor mapping framework. The approach divides parent units into sub-features based on established classification schemes and feature descriptors defined by Bradwell et al. (2016: http://nora.nerc.ac.uk/), the International Hydrographic Organization (https://www.iho.int) and the Coastal Marine and Ecological Classification Standard (https://www.cmecscatalog.org). Owing to the ecological significance of submarine canyon systems in particular, much recent attention has focused on defining their variation in form and process, whereby they can be classified using a range of topographic metrics, fluvial dis/connection and shelf-incising status. The Perth Canyon is incised into the continental slope and shelf of southwest Australia, covering an area of >1500 km² and extending from 4700 m water depth to the shelf break at 170 m. The canyon sits within a Marine Protected Area, incorporating a Marine National Park and Habitat Protection Zone in recognition of its benthic and pelagic biodiversity values. However, detailed information on the spatial patterns of the seabed habitats that influence this biodiversity is lacking. Here we use 20 m resolution bathymetry and acoustic backscatter data acquired in 2015 by the Schmidt Ocean Institute, plus sub-bottom datasets and sediment samples collected by Geoscience Australia in 2005, to apply the new geomorphic classification system to the Perth Canyon. This presentation will show the results of the geomorphic feature mapping of the canyon and its application to better defining potential benthic habitats.

  16. Changing energy-related behavior: An Intervention Mapping approach

    International Nuclear Information System (INIS)

    Kok, Gerjo; Lo, Siu Hing; Peters, Gjalt-Jorn Y.; Ruiter, Robert A.C.

    2011-01-01

    This paper's objective is to apply Intervention Mapping, a planning process for the systematic development of theory- and evidence-based health promotion interventions, to the development of interventions to promote energy conservation behavior. Intervention Mapping (IM) consists of six steps: needs assessment, program objectives, methods and applications, program development, planning for program implementation, and planning for program evaluation. Examples from the energy conservation field are provided to illustrate the activities associated with these steps. It is concluded that applying IM in the energy conservation field may help the development of effective behavior change interventions, and thus develop a domain specific knowledge-base for effective intervention design. - Highlights: → Intervention Mapping (IM) is a planning process for developing evidence-based interventions.→ IM takes a problem-driven rather than theory-driven approach. → IM can be applied to the promotion of energy-conservation in a multilevel approach. → IM helps identifying determinants of behaviors and environmental conditions. → IM helps selecting appropriate theory-based methods and practical applications.

  17. Changing energy-related behavior: An Intervention Mapping approach

    Energy Technology Data Exchange (ETDEWEB)

    Kok, Gerjo, E-mail: g.kok@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Lo, Siu Hing, E-mail: siu-hing.lo@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Peters, Gjalt-Jorn Y., E-mail: gj.peters@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Ruiter, Robert A.C., E-mail: r.ruiter@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands)

    2011-09-15

    This paper's objective is to apply Intervention Mapping, a planning process for the systematic development of theory- and evidence-based health promotion interventions, to the development of interventions to promote energy conservation behavior. Intervention Mapping (IM) consists of six steps: needs assessment, program objectives, methods and applications, program development, planning for program implementation, and planning for program evaluation. Examples from the energy conservation field are provided to illustrate the activities associated with these steps. It is concluded that applying IM in the energy conservation field may help the development of effective behavior change interventions, and thus develop a domain specific knowledge-base for effective intervention design. - Highlights: > Intervention Mapping (IM) is a planning process for developing evidence-based interventions.> IM takes a problem-driven rather than theory-driven approach. > IM can be applied to the promotion of energy-conservation in a multilevel approach. > IM helps identifying determinants of behaviors and environmental conditions. > IM helps selecting appropriate theory-based methods and practical applications.

  18. An automated approach to mapping corn from Landsat imagery

    Science.gov (United States)

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.
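
    One hypothetical way to use areal estimates for automation is to choose the classification threshold so that the mapped corn area matches the reported estimate, then split pixels into the three likelihood classes around that threshold. The sketch below illustrates this idea only; it is not the authors' published algorithm, and the score band separating 'highly likely' from 'likely' is an assumption:

```python
import numpy as np

def classify_by_areal_estimate(scores, pixel_area_ha, reported_corn_ha, band=0.1):
    """Label pixels as 'highly likely corn' / 'likely corn' / 'unlikely corn'
    by setting a per-pixel score cutoff such that the mapped corn area equals
    a published agricultural areal estimate (hypothetical scheme)."""
    order = np.sort(scores.ravel())[::-1]                      # scores, descending
    n_corn = min(len(order), int(round(reported_corn_ha / pixel_area_ha)))
    cutoff = order[n_corn - 1] if n_corn > 0 else np.inf
    return np.where(scores >= cutoff * (1 + band), 'highly likely corn',
           np.where(scores >= cutoff, 'likely corn', 'unlikely corn'))
```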

  19. Genetic fine-mapping and genomic annotation defines causal mechanisms at type 2 diabetes susceptibility loci

    Science.gov (United States)

    Mahajan, Anubha; Locke, Adam; Rayner, N William; Robertson, Neil; Scott, Robert A; Prokopenko, Inga; Scott, Laura J; Green, Todd; Sparso, Thomas; Thuillier, Dorothee; Yengo, Loic; Grallert, Harald; Wahl, Simone; Frånberg, Mattias; Strawbridge, Rona J; Kestler, Hans; Chheda, Himanshu; Eisele, Lewin; Gustafsson, Stefan; Steinthorsdottir, Valgerdur; Thorleifsson, Gudmar; Qi, Lu; Karssen, Lennart C; van Leeuwen, Elisabeth M; Willems, Sara M; Li, Man; Chen, Han; Fuchsberger, Christian; Kwan, Phoenix; Ma, Clement; Linderman, Michael; Lu, Yingchang; Thomsen, Soren K; Rundle, Jana K; Beer, Nicola L; van de Bunt, Martijn; Chalisey, Anil; Kang, Hyun Min; Voight, Benjamin F; Abecasis, Goncalo R; Almgren, Peter; Baldassarre, Damiano; Balkau, Beverley; Benediktsson, Rafn; Blüher, Matthias; Boeing, Heiner; Bonnycastle, Lori L; Borringer, Erwin P; Burtt, Noël P; Carey, Jason; Charpentier, Guillaume; Chines, Peter S; Cornelis, Marilyn C; Couper, David J; Crenshaw, Andrew T; van Dam, Rob M; Doney, Alex SF; Dorkhan, Mozhgan; Edkins, Sarah; Eriksson, Johan G; Esko, Tonu; Eury, Elodie; Fadista, João; Flannick, Jason; Fontanillas, Pierre; Fox, Caroline; Franks, Paul W; Gertow, Karl; Gieger, Christian; Gigante, Bruna; Gottesman, Omri; Grant, George B; Grarup, Niels; Groves, Christopher J; Hassinen, Maija; Have, Christian T; Herder, Christian; Holmen, Oddgeir L; Hreidarsson, Astradur B; Humphries, Steve E; Hunter, David J; Jackson, Anne U; Jonsson, Anna; Jørgensen, Marit E; Jørgensen, Torben; Kerrison, Nicola D; Kinnunen, Leena; Klopp, Norman; Kong, Augustine; Kovacs, Peter; Kraft, Peter; Kravic, Jasmina; Langford, Cordelia; Leander, Karin; Liang, Liming; Lichtner, Peter; Lindgren, Cecilia M; Lindholm, Eero; Linneberg, Allan; Liu, Ching-Ti; Lobbens, Stéphane; Luan, Jian’an; Lyssenko, Valeriya; Männistö, Satu; McLeod, Olga; Meyer, Julia; Mihailov, Evelin; Mirza, Ghazala; Mühleisen, Thomas W; Müller-Nurasyid, Martina; Navarro, Carmen; Nöthen, Markus M; Oskolkov, Nikolay N; Owen, 
Katharine R; Palli, Domenico; Pechlivanis, Sonali; Perry, John RB; Platou, Carl GP; Roden, Michael; Ruderfer, Douglas; Rybin, Denis; van der Schouw, Yvonne T; Sennblad, Bengt; Sigurðsson, Gunnar; Stančáková, Alena; Steinbach, Gerald; Storm, Petter; Strauch, Konstantin; Stringham, Heather M; Sun, Qi; Thorand, Barbara; Tikkanen, Emmi; Tonjes, Anke; Trakalo, Joseph; Tremoli, Elena; Tuomi, Tiinamaija; Wennauer, Roman; Wood, Andrew R; Zeggini, Eleftheria; Dunham, Ian; Birney, Ewan; Pasquali, Lorenzo; Ferrer, Jorge; Loos, Ruth JF; Dupuis, Josée; Florez, Jose C; Boerwinkle, Eric; Pankow, James S; van Duijn, Cornelia; Sijbrands, Eric; Meigs, James B; Hu, Frank B; Thorsteinsdottir, Unnur; Stefansson, Kari; Lakka, Timo A; Rauramaa, Rainer; Stumvoll, Michael; Pedersen, Nancy L; Lind, Lars; Keinanen-Kiukaanniemi, Sirkka M; Korpi-Hyövälti, Eeva; Saaristo, Timo E; Saltevo, Juha; Kuusisto, Johanna; Laakso, Markku; Metspalu, Andres; Erbel, Raimund; Jöckel, Karl-Heinz; Moebus, Susanne; Ripatti, Samuli; Salomaa, Veikko; Ingelsson, Erik; Boehm, Bernhard O; Bergman, Richard N; Collins, Francis S; Mohlke, Karen L; Koistinen, Heikki; Tuomilehto, Jaakko; Hveem, Kristian; Njølstad, Inger; Deloukas, Panagiotis; Donnelly, Peter J; Frayling, Timothy M; Hattersley, Andrew T; de Faire, Ulf; Hamsten, Anders; Illig, Thomas; Peters, Annette; Cauchi, Stephane; Sladek, Rob; Froguel, Philippe; Hansen, Torben; Pedersen, Oluf; Morris, Andrew D; Palmer, Collin NA; Kathiresan, Sekar; Melander, Olle; Nilsson, Peter M; Groop, Leif C; Barroso, Inês; Langenberg, Claudia; Wareham, Nicholas J; O’Callaghan, Christopher A; Gloyn, Anna L; Altshuler, David; Boehnke, Michael; Teslovich, Tanya M; McCarthy, Mark I; Morris, Andrew P

    2015-01-01

    We performed fine-mapping of 39 established type 2 diabetes (T2D) loci in 27,206 cases and 57,574 controls of European ancestry. We identified 49 distinct association signals at these loci, including five mapping in/near KCNQ1. “Credible sets” of variants most likely to drive each distinct signal mapped predominantly to non-coding sequence, implying that T2D association is mediated through gene regulation. Credible set variants were enriched for overlap with FOXA2 chromatin immunoprecipitation binding sites in human islet and liver cells, including at MTNR1B, where fine-mapping implicated rs10830963 as driving T2D association. We confirmed that this T2D-risk allele increases FOXA2-bound enhancer activity in islet- and liver-derived cells. We observed allele-specific differences in NEUROD1 binding in islet-derived cells, consistent with evidence that the T2D-risk allele increases islet MTNR1B expression. Our study demonstrates how integration of genetic and genomic information can define molecular mechanisms through which variants underlying association signals exert their effects on disease. PMID:26551672

  20. Soil erodibility mapping using three approaches in the Tangiers province –Northern Morocco

    Directory of Open Access Journals (Sweden)

    Hamza Iaaich

    2016-09-01

    Full Text Available Soil erodibility is a key factor in assessing soil loss rates. In fact, soil loss is the most common form of land degradation in Morocco, affecting vulnerable rural and urban areas. This work deals with large-scale mapping of soil erodibility using three mapping approaches: (i) the CORINE approach developed for Europe by the JRC; (ii) the UNEP/FAO approach developed within the frame of the United Nations Environmental Program for the Mediterranean area; (iii) the Universal Soil Loss Equation (USLE) K factor. Our study zone is the province of Tangiers, North-West Morocco. For each approach, we mapped and analyzed different erodibility factors in terms of parent material, topography and soil attributes. The thematic maps were then integrated using a Geographic Information System to elaborate a soil erodibility map for each of the three approaches. Finally, the validity of each approach was checked in the field, focusing on highly eroded areas, by comparing the estimated soil erodibility with the erosion state observed in the field. We used three statistical indicators for validation: overall accuracy, weighted Kappa factor and omission/commission errors. We found that the UNEP/FAO approach, based principally on lithofacies and topography as mapping inputs, is best suited to our study zone, followed by the CORINE approach. The USLE K factor underestimated soil erodibility, especially for highly eroded areas.

  1. Unity-Based Diversity: System Approach to Defining Information

    Directory of Open Access Journals (Sweden)

    Yixin Zhong

    2011-07-01

    Full Text Available What is information? This is the first question that information science should answer clearly. However, the definitions of information have become so diversified that people question whether there is any unity among the diversity, raising suspicion about whether it is possible to establish a unified theory of information at all. To answer this question, a system approach to defining information is introduced in this paper. It is shown that the unity of information definitions can be maintained with this approach. As a by-product, an important concept, the information eco-system, is also obtained.

  2. Concept maps and nursing theory: a pedagogical approach.

    Science.gov (United States)

    Hunter Revell, Susan M

    2012-01-01

    Faculty seek to teach nursing students how to link clinical and theoretical knowledge with the intent of improving patient outcomes. The author discusses an innovative 9-week concept mapping activity as a pedagogical approach to teach nursing theory in a graduate theory course. Weekly concept map building increased student engagement and fostered theoretical thinking. Unexpectedly, this activity also benefited students through group work and its ability to enhance theory-practice knowledge.

  3. Defining Leadership: Collegiate Women's Learning Circles: A Qualitative Approach

    Science.gov (United States)

    Preston-Cunningham, Tammie; Elbert, Chanda D.; Dooley, Kim E.

    2017-01-01

    The researchers employed qualitative methods to evaluate first-year female students' definition of "leadership" through involvement in the Women's Learning Circle. The findings revealed that students defined leadership in two dimensions: traits and behaviors. The qualitative findings explore a multidimensional approach to the voices of…

  4. Interest rates mapping

    Science.gov (United States)

    Kanevski, M.; Maignan, M.; Pozdnoukhov, A.; Timonin, V.

    2008-06-01

    The present study deals with the analysis and mapping of Swiss franc interest rates. Interest rates depend on time and maturity, defining the term structure of interest rate curves (IRC). In the present study, IRC are considered in a two-dimensional feature space: time and maturity. Exploratory data analysis includes a variety of tools widely used in econophysics and geostatistics. Geostatistical models and machine learning algorithms (multilayer perceptron and Support Vector Machines) were applied to produce interest rate maps. IR maps can be used for visualisation and pattern perception, to develop and explore economic hypotheses, to produce dynamic asset-liability simulations and for financial risk assessments. The feasibility of applying the interest rate mapping approach to IRC forecasting is considered as well.
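
    As a simple stand-in for the geostatistical and machine-learning interpolators used in the study (kriging, multilayer perceptron, SVM), an interest rate surface over the (time, maturity) plane can be sketched with inverse-distance weighting:

```python
import numpy as np

def idw_rate_map(obs_xy, obs_rates, grid_xy, power=2.0):
    """Inverse-distance-weighted interpolation of observed rates at
    (time, maturity) points `obs_xy` onto query points `grid_xy`.
    A basic deterministic interpolator, not the study's models."""
    d = np.linalg.norm(grid_xy[:, None, :] - obs_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)              # avoid division by zero at data points
    w = 1.0 / d ** power
    return (w * obs_rates[None, :]).sum(axis=1) / w.sum(axis=1)
```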

  5. New GIS approaches to wild land mapping in Europe

    Science.gov (United States)

    Steffen Fritz; Steve Carver; Linda See

    2000-01-01

    This paper outlines modifications and new approaches to wild land mapping developed specifically for the United Kingdom and European areas. In particular, national level reconnaissance and local level mapping of wild land in the UK and Scotland are presented. A national level study for the UK is undertaken, and a local study focuses on the Cairngorm Mountains in...

  6. Tropical forest carbon assessment: integrating satellite and airborne mapping approaches

    International Nuclear Information System (INIS)

    Asner, Gregory P

    2009-01-01

    Large-scale carbon mapping is needed to support the UNFCCC program to reduce deforestation and forest degradation (REDD). Managers of forested land can potentially increase their carbon credits via detailed monitoring of forest cover, loss and gain (hectares), and periodic estimates of changes in forest carbon density (tons ha⁻¹). Satellites provide an opportunity to monitor changes in forest carbon caused by deforestation and degradation, but only after initial carbon densities have been assessed. New airborne approaches, especially light detection and ranging (LiDAR), provide a means to estimate forest carbon density over large areas, which greatly assists in the development of practical baselines. Here I present an integrated satellite-airborne mapping approach that supports high-resolution carbon stock assessment and monitoring in tropical forest regions. The approach yields a spatially resolved, regional state-of-the-forest carbon baseline, followed by high-resolution monitoring of forest cover and disturbance to estimate carbon emissions. Rapid advances and decreasing costs in the satellite and airborne mapping sectors are already making high-resolution carbon stock and emissions assessments viable anywhere in the world.
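
    LiDAR-based carbon density estimation is commonly reduced to a regional calibration from top-of-canopy height to aboveground carbon density; the power-law form below is a widely used shape for such calibrations, and the coefficients are placeholders rather than values from this paper:

```python
import numpy as np

def carbon_density_from_tch(tch_m, a=0.8, b=1.4):
    """Estimate aboveground carbon density (Mg C ha^-1) from LiDAR
    top-of-canopy height TCH (m) via a power law ACD = a * TCH**b.
    `a` and `b` are hypothetical; real values are fitted against
    field plots for each region before mapping."""
    return a * np.asarray(tch_m, dtype=float) ** b
```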

  7. Entropic Phase Maps in Discrete Quantum Gravity

    Directory of Open Access Journals (Sweden)

    Benjamin F. Dribus

    2017-06-01

    Full Text Available Path summation offers a flexible general approach to quantum theory, including quantum gravity. In the latter setting, summation is performed over a space of evolutionary pathways in a history configuration space. Discrete causal histories called acyclic directed sets offer certain advantages over similar models appearing in the literature, such as causal sets. Path summation defined in terms of these histories enables derivation of discrete Schrödinger-type equations describing quantum spacetime dynamics for any suitable choice of algebraic quantities associated with each evolutionary pathway. These quantities, called phases, collectively define a phase map from the space of evolutionary pathways to a target object, such as the unit circle S¹ ⊂ ℂ, or an analogue such as S³ or S⁷. This paper explores the problem of identifying suitable phase maps for discrete quantum gravity, focusing on a class of S¹-valued maps defined in terms of “structural increments” of histories, called terminal states. Invariants such as state automorphism groups determine multiplicities of states, and induce families of natural entropy functions. A phase map defined in terms of such a function is called an entropic phase map. The associated dynamical law may be viewed as an abstract combination of Schrödinger’s equation and the second law of thermodynamics.

  8. Stakeholder approach, Stakeholders mental model: A visualization test with cognitive mapping technique

    Directory of Open Access Journals (Sweden)

    Garoui Nassreddine

    2012-04-01

    Full Text Available The idea of this paper is to determine the mental models of actors in the firm with respect to the stakeholder approach of corporate governance. We use cognitive maps to visualize these models, showing how actors think about and conceptualize the stakeholder approach. The paper takes a corporate governance perspective, discusses the stakeholder model, and applies a cognitive mapping technique.

  9. Partnering with Youth to Map Their Neighborhood Environments: A Multi-Layered GIS Approach

    Science.gov (United States)

    Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T.; Raleigh, Kevin; Miller-Francis, Jenni

    2014-01-01

    Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multi-layered approach for gaining local knowledge of neighborhood environments that engages youth as co-researchers and active knowledge producers. By integrating geographic information systems (GIS) with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youth. Youth report safety and a lack of recreational resources as inhibiting physical activity. Maps reflecting youth perceptions aid policy-makers in making place-based improvements for youth neighborhood environments. PMID:25423245

  10. Maps help protect sensitive areas from spills : an integrated approach to environmental mapping

    International Nuclear Information System (INIS)

    Laflamme, A.; Leblanc, S.R.; Percy, R.J.

    2001-01-01

    The Atlantic Sensitivity Mapping Program (ASMP) is underway in Canada's Atlantic Region to develop and maintain the best possible sensitivity mapping system to provide planners and managers with the full range of information they would need in the event of a coastal oil spill drill or spill incident. This initiative also provides recommendations concerning resource protection at the time of a spill. ASMP has become a powerful tool, providing a consistent and standardized terminology throughout the range of spill planning, preparedness and real-time response activities. The desktop mapping system provides an easy-to-use approach for a wide range of technical and support data and information stored in various databases. The data and information are based on a consistent set of terms and definitions that describe the character of the shore zone, the objective and strategies for a specific response, and the methods for achieving those objectives. The data are linked with other resource information in a GIS-based system and can be updated quickly and easily as new information becomes available. The mapping program keeps evolving to better serve the needs of environmental emergency responders. In addition, all components will soon be integrated into a web-based mapping format for broader accessibility. Future work will focus on developing a pre-spill database for Labrador. 3 refs., 8 figs

  11. Advancing the STMS genomic resources for defining new locations on the intraspecific genetic linkage map of chickpea (Cicer arietinum L.)

    Directory of Open Access Journals (Sweden)

    Shokeen Bhumika

    2011-02-01

    Full Text Available Abstract Background Chickpea (Cicer arietinum L.) is an economically important cool season grain legume crop that is valued for its nutritive seeds having high protein content. However, several biotic and abiotic stresses and the low genetic variability in the chickpea genome have continuously hindered the chickpea molecular breeding programs. STMS (Sequence Tagged Microsatellite Site) markers, which are preferred for the construction of saturated linkage maps in several crop species, have also emerged as the most efficient and reliable source for detecting allelic diversity in chickpea. However, the number of STMS markers reported in chickpea is still limited, and moreover they exhibit low rates of both inter- and intraspecific polymorphism, thereby limiting the positions of the SSR markers especially on the intraspecific linkage maps of chickpea. Hence, this study was undertaken with the aim of developing additional STMS markers and utilizing them for advancing the genetic linkage map of chickpea, which would have applications in QTL identification, MAS and the de novo assembly of high-throughput whole genome sequence data. Results A microsatellite-enriched library of chickpea (enriched for (GT/CA)n and (GA/CT)n repeats) was constructed, from which 387 putative microsatellite-containing clones were identified. From these, 254 STMS primers were designed, of which 181 were developed as functional markers. An intraspecific mapping population of chickpea, [ICCV-2 (single podded) × JG-62 (double podded)], comprising 126 RILs, was genotyped for mapping. Of the 522 chickpea STMS markers (including the double-podding trait) screened for parental polymorphism, 226 (43.3%) were polymorphic in the parents and were used to genotype the RILs. At a LOD score of 3.5, eight linkage groups defining the position of 138 markers were obtained that spanned 630.9 cM with an average marker density of 4.57 cM. Further, based on the common loci present between the current map

  12. Advancing the STMS genomic resources for defining new locations on the intraspecific genetic linkage map of chickpea (Cicer arietinum L.).

    Science.gov (United States)

    Gaur, Rashmi; Sethy, Niroj K; Choudhary, Shalu; Shokeen, Bhumika; Gupta, Varsha; Bhatia, Sabhyata

    2011-02-17

    Chickpea (Cicer arietinum L.) is an economically important cool season grain legume crop that is valued for its nutritive seeds having high protein content. However, several biotic and abiotic stresses and the low genetic variability in the chickpea genome have continuously hindered the chickpea molecular breeding programs. STMS (Sequence Tagged Microsatellite Site) markers, which are preferred for the construction of saturated linkage maps in several crop species, have also emerged as the most efficient and reliable source for detecting allelic diversity in chickpea. However, the number of STMS markers reported in chickpea is still limited, and moreover they exhibit low rates of both inter- and intraspecific polymorphism, thereby limiting the positions of the SSR markers especially on the intraspecific linkage maps of chickpea. Hence, this study was undertaken with the aim of developing additional STMS markers and utilizing them for advancing the genetic linkage map of chickpea, which would have applications in QTL identification, MAS and the de novo assembly of high-throughput whole genome sequence data. A microsatellite-enriched library of chickpea (enriched for (GT/CA)n and (GA/CT)n repeats) was constructed, from which 387 putative microsatellite-containing clones were identified. From these, 254 STMS primers were designed, of which 181 were developed as functional markers. An intraspecific mapping population of chickpea, [ICCV-2 (single podded) × JG-62 (double podded)], comprising 126 RILs, was genotyped for mapping. Of the 522 chickpea STMS markers (including the double-podding trait) screened for parental polymorphism, 226 (43.3%) were polymorphic in the parents and were used to genotype the RILs. At a LOD score of 3.5, eight linkage groups defining the position of 138 markers were obtained that spanned 630.9 cM with an average marker density of 4.57 cM. Further, based on the common loci present between the current map and the previously published chickpea

  13. Physico-empirical approach for mapping soil hydraulic behaviour

    Directory of Open Access Journals (Sweden)

    G. D'Urso

    1997-01-01

    Full Text Available Abstract: Pedo-transfer functions are largely used in the soil hydraulic characterisation of large areas. The use of physico-empirical approaches for the derivation of soil hydraulic parameters from disturbed-sample data can be greatly enhanced if a characterisation performed on undisturbed cores of the same type of soil is available. In this study, an experimental procedure for deriving maps of soil hydraulic behaviour is discussed with reference to its application in an irrigation district (30 km²) in southern Italy. The main steps of the proposed procedure are: (i) the precise identification of soil hydraulic functions from undisturbed sampling of main horizons in representative profiles for each soil map unit; (ii) the determination of pore-size distribution curves from larger disturbed sampling data sets within the same soil map unit; (iii) the calibration of physico-empirical methods for retrieving soil hydraulic parameters from particle-size data and undisturbed soil sample analysis; (iv) the definition of functional hydraulic properties from water balance output; and (v) the delimitation of soil hydraulic map units based on functional properties.

  14. An Odometry-free Approach for Simultaneous Localization and Online Hybrid Map Building

    Directory of Open Access Journals (Sweden)

    Wei Hong Chin

    2016-11-01

    Full Text Available In this paper, a new approach is proposed for simultaneous mobile robot localization and hybrid map building without using any odometry hardware. The proposed method, termed Genetic Bayesian ARAM, comprises two main components: (1) a steady-state genetic algorithm (SSGA) for self-localization and occupancy grid map building; and (2) a Bayesian Adaptive Resonance Associative Memory (ARAM) for online topological map building. The model of the explored environment is formed as a hybrid representation, both topological and grid-based, and is incrementally constructed during the exploration process. During occupancy map building, the robot's estimated self-position is updated by the SSGA. At the same time, the estimated self-position is transmitted to the Bayesian ARAM for topological map building and localization. The effectiveness of the proposed approach is validated on a number of standardized benchmark datasets and in real experiments carried out on a mobile robot. The benchmark datasets verify that the proposed method is capable of generating topological maps under different environmental conditions; the real-robot experiment verifies that the method can be implemented in the real world.

  15. Hyperspectral Data for Mangrove Species Mapping: A Comparison of Pixel-Based and Object-Based Approach

    Directory of Open Access Journals (Sweden)

    Muhammad Kamal

    2011-10-01

    Full Text Available Visual image interpretation and digital image classification have been used to map and monitor mangrove extent and composition for decades. The presence of a high-spatial-resolution hyperspectral sensor can potentially improve our ability to differentiate mangrove species. However, little research has explored the use of pixel-based and object-based approaches on high-spatial-resolution hyperspectral datasets for this purpose. This study assessed the ability of CASI-2 data for mangrove species mapping using pixel-based and object-based approaches at the mouth of the Brisbane River area, southeast Queensland, Australia. Three mapping techniques were used in this study: spectral angle mapper (SAM) and linear spectral unmixing (LSU) for the pixel-based approaches, and multi-scale segmentation for the object-based image analysis (OBIA). The endmembers for the pixel-based approaches were collected based on an existing vegetation community map. Nine targeted classes were mapped in the study area with each approach, including three mangrove species: Avicennia marina, Rhizophora stylosa, and Ceriops australis. The mapping results showed that SAM produced accurate class polygons with only a few unclassified pixels (overall accuracy 69%, Kappa 0.57), the LSU resulted in a patchy polygon pattern with many unclassified pixels (overall accuracy 56%, Kappa 0.41), and the object-based mapping produced the most accurate results (overall accuracy 76%, Kappa 0.67). Our results demonstrate that the object-based approach, which combined a rule-based and nearest-neighbor classification method, was the best classifier for mapping mangrove species and their adjacent environments.
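The spectral angle mapper named above has a standard formulation: each pixel spectrum is compared to an endmember spectrum by the angle between them in band space, and the pixel is assigned to the closest endmember within an angle threshold. A minimal sketch of that general technique (the four-band values, species endmembers, and 0.1 rad threshold below are invented for illustration, not taken from the CASI-2 study):

```python
import numpy as np

def spectral_angle(target, reference):
    """Spectral Angle Mapper: angle (radians) between a pixel spectrum
    and a reference endmember spectrum; smaller angle = better match."""
    t = np.asarray(target, dtype=float)
    r = np.asarray(reference, dtype=float)
    cos_angle = np.dot(t, r) / (np.linalg.norm(t) * np.linalg.norm(r))
    return np.arccos(np.clip(cos_angle, -1.0, 1.0))

def classify_pixel(pixel, endmembers, max_angle=0.1):
    """Assign the pixel to the endmember with the smallest spectral angle,
    or None ("unclassified") if no angle is below the threshold."""
    angles = {name: spectral_angle(pixel, spec) for name, spec in endmembers.items()}
    best = min(angles, key=angles.get)
    return best if angles[best] <= max_angle else None

# Hypothetical 4-band reflectance spectra (values are illustrative only)
endmembers = {
    "Avicennia marina": [0.05, 0.08, 0.40, 0.35],
    "Rhizophora stylosa": [0.04, 0.07, 0.30, 0.28],
}
print(classify_pixel([0.05, 0.08, 0.41, 0.34], endmembers))
```

Pixels whose best angle exceeds the threshold stay unclassified, which is the behavior behind the "few unclassified pixels" reported for SAM.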

  16. Mapping Typical Urban LULC from Landsat Imagery without Training Samples or Self-Defined Parameters

    Directory of Open Access Journals (Sweden)

    Hui Li

    2017-07-01

    Full Text Available Land use/land cover (LULC) change is one of the most important indicators in understanding the interactions between humans and the environment. Traditionally, when LULC maps are produced yearly, most existing remote-sensing methods have to collect ground reference data annually, as the classifiers have to be trained individually for each corresponding year. This study presents a novel strategy to map LULC classes without training samples or manually assigned parameters. First, several novel indices were carefully selected from the index pool, each able to highlight a certain LULC class well. Following this, a common unsupervised classifier was employed to extract the LULC from the associated index image without assigning thresholds. Finally, a supervised classification was implemented with samples automatically collected from the unsupervised classification outputs. Results illustrated that the proposed method achieves satisfactory performance, reaching accuracies similar to traditional approaches. The findings demonstrate that the proposed strategy is a simple and effective alternative for mapping urban LULC, with which the budget and time required for remote-sensing data processing could be reduced dramatically.

  17. The projective heat map

    CERN Document Server

    Schwartz, Richard Evan

    2017-01-01

    This book introduces a simple dynamical model for a planar heat map that is invariant under projective transformations. The map is defined by iterating a polygon map, where one starts with a finite planar N-gon and produces a new N-gon by a prescribed geometric construction. One of the appeals of the topic of this book is the simplicity of the construction that yet leads to deep and far reaching mathematics. To construct the projective heat map, the author modifies the classical affine invariant midpoint map, which takes a polygon to a new polygon whose vertices are the midpoints of the original. The author provides useful background which makes this book accessible to a beginning graduate student or advanced undergraduate as well as researchers approaching this subject from other fields of specialty. The book includes many illustrations, and there is also a companion computer program.

  18. Fund Finder: A case study of database-to-ontology mapping

    OpenAIRE

    Barrasa Rodríguez, Jesús; Corcho, Oscar; Gómez-Pérez, A.

    2003-01-01

    The mapping between databases and ontologies is a basic problem when trying to "upgrade" deep web content to the semantic web. Our approach suggests the declarative definition of mappings as a way to achieve domain independence and reusability. A specific language (expressive enough to cover some real-world mapping situations, such as lightly structured or non-first-normal-form databases) is defined for this purpose. Along with this mapping description language, the ODEMapster processor is in ...

  19. Physical Mapping of Bread Wheat Chromosome 5A: An Integrated Approach

    Directory of Open Access Journals (Sweden)

    Delfina Barabaschi

    2015-11-01

    Full Text Available The huge size, redundancy, and highly repetitive nature of the bread wheat [Triticum aestivum L.] genome make it among the most difficult species to be sequenced. To overcome these limitations, a strategy based on the separation of individual chromosomes or chromosome arms and the subsequent production of physical maps was established within the frame of the International Wheat Genome Sequencing Consortium (IWGSC). A total of 95,812 bacterial artificial chromosome (BAC) clones of short-arm chromosome 5A (5AS) and long-arm chromosome 5A (5AL) arm-specific BAC libraries were fingerprinted and assembled into contigs by complementary analytical approaches based on the FingerPrinted Contig (FPC) and Linear Topological Contig (LTC) tools. Combined anchoring approaches based on polymerase chain reaction (PCR) marker screening, microarray, and sequence homology searches applied to several genomic tools (i.e., genetic maps, deletion bin map, neighbor maps, BAC end sequences (BESs), genome zipper, and chromosome survey sequences) allowed the development of a high-quality physical map with an anchored physical coverage of 75% for 5AS and 53% for 5AL, with high portions (64 and 48%, respectively) of contigs ordered along the chromosome. In the genomes of the grasses Brachypodium [Brachypodium distachyon (L.) Beauv.], rice (Oryza sativa L.), and sorghum [Sorghum bicolor (L.) Moench], homologs of genes on wheat chromosome 5A were separated into syntenic blocks on different chromosomes as a result of translocations and inversions during evolution. The physical map presented represents an essential resource for fine genetic mapping and map-based cloning of agronomically relevant traits and a reference for the 5A sequencing projects.

  20. Using a Similarity Matrix Approach to Evaluate the Accuracy of Rescaled Maps

    Directory of Open Access Journals (Sweden)

    Peijun Sun

    2018-03-01

    Full Text Available Rescaled maps have been extensively utilized to provide data at the appropriate spatial resolution for use in various Earth science models. However, a simple and easy way to evaluate these rescaled maps has not been developed. We propose a similarity matrix approach using a contingency table to compute three measures: overall similarity (OS), omission error (OE), and commission error (CE) to evaluate the rescaled maps. The Majority Rule Based aggregation (MRB) method was employed to produce the upscaled maps to demonstrate this approach. In addition, previously created, coarser-resolution land cover maps from other research projects were also available for comparison. The question of which is better, a map initially produced at coarse resolution or a fine-resolution map rescaled to a coarse resolution, has not been quantitatively investigated. To address these issues, we selected study sites at three different extent levels. First, we selected twelve regions covering the continental USA, then we selected nine states (from the whole continental USA), and finally we selected nine Agriculture Statistical Districts (ASDs) (from within the nine selected states) as study sites. Crop/non-crop maps derived from the USDA Crop Data Layer (CDL) at 30 m were used as base maps for the upscaling, and existing maps at 250 m and 1 km were utilized for the comparison. The results showed that a similarity matrix can effectively provide the map user with the information needed to assess the rescaling. Additionally, the upscaled maps can provide higher accuracy and better represent landscape pattern compared to the existing coarser maps. Therefore, we strongly recommend that an evaluation of the upscaled map and the existing coarser-resolution map using a similarity matrix be conducted before deciding which dataset to use for the modelling. Overall, extending our understanding on how to perform an evaluation of the rescaled map and investigation of the applicability
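The three measures can be computed directly from a class-by-class contingency table between the base map and the rescaled map. A sketch of one plausible formulation, assuming OS is the diagonal agreement fraction and OE/CE are per-class row- and column-wise error rates (the paper's exact definitions may differ, and the two 3×3 maps below are invented):

```python
import numpy as np

def similarity_measures(map_a, map_b, classes=(0, 1)):
    """Build a contingency table between two co-registered class maps
    (rows = map_a, columns = map_b) and derive overall similarity (OS)
    plus per-class omission (OE) and commission (CE) errors."""
    a = np.asarray(map_a).ravel()
    b = np.asarray(map_b).ravel()
    k = len(classes)
    table = np.zeros((k, k), dtype=int)
    for i, ca in enumerate(classes):
        for j, cb in enumerate(classes):
            table[i, j] = np.sum((a == ca) & (b == cb))
    total = table.sum()
    os_ = np.trace(table) / total                 # agreement on the diagonal
    oe = 1 - np.diag(table) / table.sum(axis=1)   # base-map pixels missed per class
    ce = 1 - np.diag(table) / table.sum(axis=0)   # rescaled-map pixels wrongly assigned
    return os_, oe, ce

base     = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 1]])  # crop/non-crop base map
rescaled = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]])  # rescaled map to evaluate
os_, oe, ce = similarity_measures(base, rescaled)
print(round(os_, 3), oe, ce)
```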

  1. A Hierarchical and Distributed Approach for Mapping Large Applications to Heterogeneous Grids using Genetic Algorithms

    Science.gov (United States)

    Sanyal, Soumya; Jain, Amit; Das, Sajal K.; Biswas, Rupak

    2003-01-01

    In this paper, we propose a distributed approach for mapping a single large application to a heterogeneous grid environment. To minimize the execution time of the parallel application, we distribute the mapping overhead to the available nodes of the grid. This approach not only provides a fast mapping of tasks to resources but is also scalable. We adopt a hierarchical grid model and accomplish the job of mapping tasks to this topology using a scheduler tree. Results show that our three-phase algorithm provides high quality mappings, and is fast and scalable.

  2. Assessment of landslide distribution map reliability in Niigata prefecture - Japan using frequency ratio approach

    Science.gov (United States)

    Rahardianto, Trias; Saputra, Aditya; Gomez, Christopher

    2017-07-01

    Research on landslide susceptibility has evolved rapidly over the last few decades thanks to the availability of large databases. Landslide research used to focus on discrete events, but the use of large inventory datasets has become a central pillar of landslide susceptibility, hazard, and risk assessment. Extracting meaningful information from large databases is now at the forefront of geoscientific research, following the big-data research trend. Indeed, the more comprehensive the available information on past landslides in a particular area is, the better the produced map will be at supporting effective decision making, planning, and engineering practice. Landslide inventory data that are freely accessible online give many researchers and decision makers an opportunity to prevent casualties and economic loss caused by future landslides. These data are advantageous especially for areas with poor landslide historical records. Since the construction criteria for landslide inventory maps and their quality evaluation remain poorly defined, an assessment of the reliability of open-source landslide inventory maps is required. The present contribution aims to assess the reliability of open-source landslide inventory data based on the particular topographical setting of the observed area in Niigata prefecture, Japan. A Geographic Information System (GIS) platform and a statistical approach are applied to analyze the data. The frequency ratio method is utilized to model and assess the landslide map. The generated model showed unsatisfactory results, with an AUC value of 0.603 indicating the low prediction accuracy and unreliability of the model.
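The frequency ratio method used above has a simple core: for each class of a conditioning factor (slope, lithology, etc.), divide the share of landslide pixels falling in that class by the share of all pixels in that class; ratios above 1 mark classes that are more landslide-prone than average. A minimal sketch with an invented slope-class raster (the class coding and data are illustrative, not the Niigata dataset):

```python
import numpy as np

def frequency_ratio(factor_map, landslide_mask):
    """Frequency ratio per factor class: the percentage of landslide
    pixels in a class divided by the percentage of all pixels in it.
    FR > 1 means the class is more landslide-prone than average."""
    factor = np.asarray(factor_map).ravel()
    slides = np.asarray(landslide_mask, dtype=bool).ravel()
    n_total = factor.size
    n_slides = slides.sum()
    ratios = {}
    for cls in np.unique(factor):
        in_cls = factor == cls
        pct_slides = slides[in_cls].sum() / n_slides
        pct_pixels = in_cls.sum() / n_total
        ratios[int(cls)] = pct_slides / pct_pixels
    return ratios

# Hypothetical slope-class raster (1 = gentle, 2 = moderate, 3 = steep)
slope = np.array([1, 1, 1, 2, 2, 3, 3, 3, 3, 3])
landslides = np.array([0, 0, 0, 0, 1, 1, 1, 1, 0, 0])
print(frequency_ratio(slope, landslides))
```

Summing the per-factor ratios pixel by pixel yields the susceptibility index that the AUC then evaluates against the inventory.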

  3. Development of erosion risk map using fuzzy logic approach

    Directory of Open Access Journals (Sweden)

    Fauzi Manyuk

    2017-01-01

    Full Text Available Erosion-hazard assessment is an important aspect of the management of a river basin such as the Siak River Basin, Riau Province, Indonesia. This study presents an application of a fuzzy logic approach to develop an erosion risk map based on a geographic information system. Fuzzy logic is a computing approach based on "degrees of truth" rather than the usual "true or false" (1 or 0) Boolean logic on which the modern computer is based. The results of the erosion risk map were verified using field measurements. The verification shows that the soil-erodibility parameter (K) is in good agreement with the field measurement data. The validated classification of soil erodibility (K) was: very low (0.0–0.1), medium (0.21–0.32), high (0.44–0.55), and very high (0.56–0.64). The results obtained from this study show that the erosion risk of the Siak River Basin is predominantly classified as medium, covering about 68.54% of the area. The other classifications were high and very low erosion levels, covering about 28.84% and 2.61% respectively.
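As a rough illustration of the fuzzy-logic idea, a soil-erodibility value K can be given graded memberships in overlapping classes instead of a crisp "true or false" bin. The break-points below loosely follow the validated ranges in the abstract, but the triangular shapes, overlaps, and maximum-membership defuzzification are assumptions, not the study's actual inference system:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership with peak at b and support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def erodibility_memberships(k):
    """Degrees of membership of a soil-erodibility value K in
    hypothetical overlapping fuzzy sets."""
    return {
        "very low":  tri(k, -0.01, 0.05, 0.21),
        "medium":    tri(k, 0.10, 0.265, 0.44),
        "high":      tri(k, 0.32, 0.495, 0.60),
        "very high": tri(k, 0.50, 0.60, 0.70),
    }

degrees = erodibility_memberships(0.30)
label = max(degrees, key=degrees.get)  # defuzzify by maximum membership
print(label, round(degrees[label], 3))
```

A K value near a class boundary would belong partially to two classes, which is exactly what the crisp Boolean bins cannot express.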

  4. Considerations for Software Defined Networking (SDN): Approaches and use cases

    Science.gov (United States)

    Bakshi, K.

    Software Defined Networking (SDN) is an evolutionary approach to network design and functionality based on the ability to programmatically modify the behavior of network devices. SDN uses user-customizable and configurable software that's independent of hardware to enable networked systems to expand data flow control. SDN is in large part about understanding and managing a network as a unified abstraction. It will make networks more flexible, dynamic, and cost-efficient, while greatly simplifying operational complexity. And this advanced solution provides several benefits including network and service customizability, configurability, improved operations, and increased performance. There are several approaches to SDN and its practical implementation. Among them, two have risen to prominence with differences in pedigree and implementation. This paper's main focus will be to define, review, and evaluate salient approaches and use cases of the OpenFlow and Virtual Network Overlay approaches to SDN. OpenFlow is a communication protocol that gives access to the forwarding plane of a network's switches and routers. The Virtual Network Overlay relies on a completely virtualized network infrastructure and services to abstract the underlying physical network, which allows the overlay to be mobile to other physical networks. This is an important requirement for cloud computing, where applications and associated network services are migrated to cloud service providers and remote data centers on the fly as resource demands dictate. The paper will discuss how and where SDN can be applied and implemented, including research and academia, virtual multitenant data center, and cloud computing applications. Specific attention will be given to the cloud computing use case, where automated provisioning and programmable overlay for scalable multi-tenancy is leveraged via the SDN approach.

  5. Defining European Wholesale Electricity Markets. An 'And/Or' Approach

    International Nuclear Information System (INIS)

    Dijkgraaf, E.; Janssen, M.C.W.

    2009-09-01

    An important question in the dynamic European wholesale markets for electricity is whether to define the geographical market at the level of an individual member state or more broadly. We show that if we currently take the traditional approach by considering for each member state whether there is one single other country that provides a substitute for domestic production, the market in each separate member state has still to be considered a separate market. However, if we allow for the possibility that at different moments in time there is another country that provides a substitute for domestic production, then the conclusion should be that certain member states do not constitute a separate geographical market. This is in particular true for Belgium, but also for The Netherlands, France, and to some extent also for Germany and Austria. We call this alternative approach the 'and/or' approach.

  6. Mapping Aquatic Vegetation in a Large, Shallow Eutrophic Lake: A Frequency-Based Approach Using Multiple Years of MODIS Data

    Directory of Open Access Journals (Sweden)

    Xiaohan Liu

    2015-08-01

    Full Text Available Aquatic vegetation serves many important ecological and socioeconomic functions in lake ecosystems. The presence of floating algae poses difficulties for accurately estimating the distribution of aquatic vegetation in eutrophic lakes. We present an approach to map the distribution of aquatic vegetation in Lake Taihu (a large, shallow eutrophic lake in China) and reduce the influence of floating algae on aquatic vegetation mapping. Our approach involved a frequency analysis over a 2003–2013 time series of the floating algal index (FAI) based on moderate-resolution imaging spectroradiometer (MODIS) data. Three phenological periods were defined based on the vegetation presence frequency (VPF) and the growth of algae and aquatic vegetation: December and January composed the period of wintering aquatic vegetation; February and March composed the period of prolonged coexistence of algal blooms and wintering aquatic vegetation; and June to October was the peak period of the coexistence of algal blooms and aquatic vegetation. By comparing and analyzing the satellite-derived aquatic vegetation distribution and 244 in situ measurements made in 2013, we established a FAI threshold of −0.025 and VPF thresholds of 0.55, 0.45 and 0.85 for the three phenological periods. We validated the accuracy of our approach by comparing the satellite-derived maps against the in situ results obtained from 2008–2012. The overall classification accuracy was 87%, 81%, 77%, 88% and 73% in the five years from 2008–2012, respectively. We then applied the approach to the MODIS images from 2003–2013 and obtained the total area of the aquatic vegetation, which varied from 265.94 km2 in 2007 to 503.38 km2 in 2008, with an average area of 359.62 ± 69.20 km2 over the 11 years. Our findings suggest that (1) the proposed approach can be used to map the distribution of aquatic vegetation in eutrophic algae-rich waters and (2) dramatic changes occurred in the
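The frequency step of the approach reduces to counting, per pixel, how often the FAI time series exceeds the FAI threshold and comparing that frequency to a period-specific VPF threshold. A simplified sketch using the −0.025 and 0.55 thresholds quoted in the abstract (the toy FAI values are invented, and the full method additionally uses all three phenological periods to separate persistent vegetation from transient algal blooms):

```python
import numpy as np

# Hypothetical per-pixel FAI time series (rows = dates, columns = pixels)
fai = np.array([
    [-0.10, 0.02, -0.01],
    [-0.03, 0.01, -0.02],
    [-0.05, 0.03, -0.01],
    [-0.02, 0.02, -0.04],
])

FAI_THRESHOLD = -0.025  # FAI threshold from the abstract
VPF_THRESHOLD = 0.55    # wintering-period VPF threshold from the abstract

# Vegetation presence frequency: fraction of dates each pixel exceeds the FAI threshold
vpf = (fai > FAI_THRESHOLD).mean(axis=0)
vegetation = vpf >= VPF_THRESHOLD
print(vpf, vegetation)
```

A pixel that only occasionally exceeds the FAI threshold (a passing bloom) fails the VPF test, which is how the frequency analysis suppresses floating algae.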

  7. A highly efficient approach to protein interactome mapping based on collaborative filtering framework.

    Science.gov (United States)

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-09

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Finely tuned small-scale experiments are not only very expensive but also inefficient for identifying numerous interactomes, despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data as an interactome weight matrix, from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins and carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach significantly outperforms several sophisticated topology-based approaches.
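The collaborative-filtering idea can be sketched as neighborhood-based prediction on the interactome weight matrix: estimate a missing protein-pair score as a similarity-weighted average over the other proteins' scores with the same partner. Plain cosine similarity stands in here for the paper's rescaled cosine coefficient, whose exact rescaling is not reproduced, and the toy matrix is invented:

```python
import numpy as np

def cosine_sim(u, v):
    """Plain cosine similarity between two protein feature vectors
    (a stand-in for the paper's rescaled cosine coefficient)."""
    den = np.linalg.norm(u) * np.linalg.norm(v)
    return np.dot(u, v) / den if den else 0.0

def predict_interaction(W, i, j):
    """Neighborhood-based CF estimate of the unknown entry W[i, j]:
    a similarity-weighted average of other proteins' scores with j."""
    others = [k for k in range(W.shape[0]) if k != i]
    sims = np.array([cosine_sim(W[i], W[k]) for k in others])
    scores = np.array([W[k, j] for k in others])
    return np.dot(sims, scores) / sims.sum() if sims.sum() else 0.0

# Toy interactome weight matrix (1.0 = observed interaction, 0.0 = unknown)
W = np.array([
    [1.0, 1.0, 0.0],
    [1.0, 1.0, 1.0],
    [0.0, 1.0, 1.0],
])
print(round(predict_interaction(W, 0, 2), 3))
```

Thresholding the predicted scores then yields the binary interactome map.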

  8. Edge maps: Representing flow with bounded error

    KAUST Repository

    Bhatia, Harsh

    2011-03-01

    Robust analysis of vector fields has been established as an important tool for deriving insights from the complex systems these fields model. Many analysis techniques rely on computing streamlines, a task often hampered by numerical instabilities. Approaches that ignore the resulting errors can lead to inconsistencies that may produce unreliable visualizations and ultimately prevent in-depth analysis. We propose a new representation for vector fields on surfaces that replaces numerical integration through triangles with linear maps defined on its boundary. This representation, called edge maps, is equivalent to computing all possible streamlines at a user defined error threshold. In spite of this error, all the streamlines computed using edge maps will be pairwise disjoint. Furthermore, our representation stores the error explicitly, and thus can be used to produce more informative visualizations. Given a piecewise-linear interpolated vector field, a recent result [15] shows that there are only 23 possible map classes for a triangle, permitting a concise description of flow behaviors. This work describes the details of computing edge maps, provides techniques to quantify and refine edge map error, and gives qualitative and visual comparisons to more traditional techniques. © 2011 IEEE.

  9. Architectonic Mapping of the Human Brain beyond Brodmann.

    Science.gov (United States)

    Amunts, Katrin; Zilles, Karl

    2015-12-16

    Brodmann pioneered structural brain mapping. He considered functional and pathological criteria for defining cortical areas in addition to cytoarchitecture. Starting from this idea of structural-functional relationships at the level of cortical areas, we will argue that the cortical architecture is more heterogeneous than Brodmann's map suggests. A triple-scale concept is proposed that includes repetitive modular-like structures and micro- and meso-maps. Criteria for defining a cortical area will be discussed, considering novel preparations, imaging and optical methods, 2D and 3D quantitative architectonics, as well as high-performance computing including analyses of big data. These new approaches contribute to an understanding of the brain on multiple levels and challenge the traditional, mosaic-like segregation of the cerebral cortex. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Defining a Cancer Dependency Map.

    Science.gov (United States)

    Tsherniak, Aviad; Vazquez, Francisca; Montgomery, Phil G; Weir, Barbara A; Kryukov, Gregory; Cowley, Glenn S; Gill, Stanley; Harrington, William F; Pantel, Sasha; Krill-Burger, John M; Meyers, Robin M; Ali, Levi; Goodale, Amy; Lee, Yenarae; Jiang, Guozhi; Hsiao, Jessica; Gerath, William F J; Howell, Sara; Merkel, Erin; Ghandi, Mahmoud; Garraway, Levi A; Root, David E; Golub, Todd R; Boehm, Jesse S; Hahn, William C

    2017-07-27

    Most human epithelial tumors harbor numerous alterations, making it difficult to predict which genes are required for tumor survival. To systematically identify cancer dependencies, we analyzed 501 genome-scale loss-of-function screens performed in diverse human cancer cell lines. We developed DEMETER, an analytical framework that segregates on- from off-target effects of RNAi. 769 genes were differentially required in subsets of these cell lines at a threshold of six SDs from the mean. We found predictive models for 426 dependencies (55%) by nonlinear regression modeling considering 66,646 molecular features. Many dependencies fall into a limited number of classes, and unexpectedly, in 82% of models, the top biomarkers were expression based. We demonstrated the basis behind one such predictive model linking hypermethylation of the UBB ubiquitin gene to a dependency on UBC. Together, these observations provide a foundation for a cancer dependency map that facilitates the prioritization of therapeutic targets. Copyright © 2017 Elsevier Inc. All rights reserved.
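
    The selection rule in this abstract, genes differentially required in subsets of cell lines at six SDs from the mean, can be sketched as a simple outlier filter. DEMETER's actual scoring of RNAi data is far more involved; this only illustrates the thresholding step, on invented scores.

```python
from statistics import mean, stdev

def differential_lines(scores, n_sd=6.0):
    """Return indices of cell lines whose dependency score for one gene lies
    more than n_sd sample standard deviations from the mean across lines."""
    mu, sd = mean(scores), stdev(scores)
    return [i for i, s in enumerate(scores) if sd and abs(s - mu) > n_sd * sd]

# 99 lines with near-zero scores and one strongly depleted line.
scores = [0.1, -0.1] * 49 + [0.0, -3.0]
hits = differential_lines(scores, n_sd=6.0)  # only the depleted line is flagged
```

    Note that a single outlier can only reach a z-score of about (n-1)/sqrt(n), so a six-SD rule is meaningful only with many cell lines, as in the 501-screen dataset above.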

  11. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    Science.gov (United States)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic, reflecting uncertainty quantified through Monte Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches to portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
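
    The Monte Carlo idea behind probabilistic inundation maps can be sketched as follows. Real studies propagate sampled inputs through a hydraulic model; the toy rating curve, parameter ranges, and cell elevations below are assumptions for illustration only.

```python
import random

def inundation_probability(cell_elevations, n_runs=2000, seed=42):
    """Sample uncertain discharge and roughness, convert each draw to a flood
    stage with a toy rating curve, and count how often each cell is wet."""
    rng = random.Random(seed)
    wet_counts = [0] * len(cell_elevations)
    for _ in range(n_runs):
        discharge = rng.lognormvariate(5.0, 0.3)      # uncertain flow, m^3/s
        roughness = rng.uniform(0.03, 0.05)           # Manning's n range
        stage = 0.3 * (roughness * discharge) ** 0.6  # toy rating curve, m
        for i, z in enumerate(cell_elevations):
            if z < stage:
                wet_counts[i] += 1
    return [c / n_runs for c in wet_counts]

# Four cells at increasing elevation above the channel (m).
probs = inundation_probability([0.5, 1.5, 3.0, 10.0])
```

    The output replaces the binary in/out boundary with a per-cell probability of inundation, which is what the comparison against deterministic regulatory maps operates on.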

  12. Defining acceptable levels for ecological indicators: an approach for considering social values.

    Science.gov (United States)

    Smyth, Robyn L; Watzin, Mary C; Manning, Robert E

    2007-03-01

    Ecological indicators can facilitate an adaptive management approach, but only if acceptable levels for those indicators have been defined so that the data collected can be interpreted. Because acceptable levels are an expression of the desired state of the ecosystem, the process of establishing acceptable levels should incorporate not just ecological understanding but also societal values. The goal of this research was to explore an approach for defining acceptable levels of ecological indicators that explicitly considers social perspectives and values. We used a set of eight indicators that were related to issues of concern in the Lake Champlain Basin. Our approach was based on normative theory. Using a stakeholder survey, we measured respondent normative evaluations of varying levels of our indicators. Aggregated social norm curves were used to determine the level at which indicator values shifted from acceptable to unacceptable conditions. For seven of the eight indicators, clear preferences were interpretable from these norm curves. For example, closures of public beaches because of bacterial contamination and days of intense algae bloom went from acceptable to unacceptable at 7-10 days in a summer season. Survey respondents also indicated that the number of fish caught from Lake Champlain that could be safely consumed each month was unacceptably low and the number of streams draining into the lake that were impaired by storm water was unacceptably high. If indicators that translate ecological conditions into social consequences are carefully selected, we believe the normative approach has considerable merit for defining acceptable levels of valued ecological system components.
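
    Reading an acceptable level off an aggregated norm curve amounts to locating where the curve crosses zero. The curve values below are hypothetical, chosen to land in the 7-10 day range the survey reported for beach closures.

```python
def acceptability_threshold(levels, ratings):
    """Given indicator levels and mean normative ratings (positive = acceptable,
    negative = unacceptable), linearly interpolate where the social norm curve
    crosses zero, i.e. where conditions shift from acceptable to unacceptable."""
    for (x0, y0), (x1, y1) in zip(zip(levels, ratings), zip(levels[1:], ratings[1:])):
        if y0 > 0 >= y1:
            return x0 + (x1 - x0) * y0 / (y0 - y1)  # linear interpolation
    return None

# Hypothetical beach-closure norm curve: days closed vs. mean rating on a -4..+4 scale.
days = [0, 3, 7, 14, 30]
rating = [3.5, 2.0, 0.5, -1.5, -3.5]
threshold = acceptability_threshold(days, rating)  # crossing between 7 and 14 days
```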

  13. DyKOSMap: A framework for mapping adaptation between biomedical knowledge organization systems.

    Science.gov (United States)

    Dos Reis, Julio Cesar; Pruski, Cédric; Da Silveira, Marcos; Reynaud-Delaître, Chantal

    2015-06-01

    Knowledge Organization Systems (KOS) and their associated mappings play a central role in several decision support systems. However, by virtue of knowledge evolution, KOS entities are modified over time, impacting mappings and potentially rendering them invalid. This requires semi-automatic methods to keep such semantic correspondences up-to-date as the KOSs evolve. We define a complete and original framework based on formal heuristics that drives the adaptation of KOS mappings. Our approach takes into account the definition of established mappings, the evolution of KOS and the possible changes that can be applied to mappings. This study experimentally evaluates the proposed heuristics and the entire framework on realistic case studies borrowed from the biomedical domain, using official mappings between several biomedical KOSs. We demonstrate the overall performance of the approach over biomedical datasets of different characteristics and sizes. Our findings reveal the effectiveness in terms of precision, recall and F-measure of the suggested heuristics and methods defining the framework to adapt mappings affected by KOS evolution. The obtained results contribute to and improve the quality of mappings over time. The proposed framework can adapt mappings largely automatically, thus facilitating the maintenance task. The implemented algorithms and tools support and minimize the work of users in charge of KOS mapping maintenance. Copyright © 2015 Elsevier Inc. All rights reserved.
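
    The maintenance idea can be caricatured as a triage of mappings against a KOS change log: keep untouched mappings, drop those whose endpoint was deleted, flag the rest for adaptation. This is a drastic simplification of the paper's formal heuristics, and the concept identifiers are invented.

```python
def adapt_mappings(mappings, changes):
    """Triage (source, target) mappings against a {entity: change_kind} log."""
    kept, review, dropped = [], [], []
    for src, tgt in mappings:
        ops = {changes.get(src), changes.get(tgt)} - {None}
        if 'deleted' in ops:
            dropped.append((src, tgt))    # an endpoint no longer exists
        elif 'modified' in ops:
            review.append((src, tgt))     # candidate for mapping adaptation
        else:
            kept.append((src, tgt))       # still valid as-is
    return kept, review, dropped

mappings = [('icd:A01', 'snomed:42'), ('icd:B20', 'snomed:77'), ('icd:C15', 'snomed:99')]
changes = {'icd:B20': 'modified', 'snomed:99': 'deleted'}
kept, review, dropped = adapt_mappings(mappings, changes)
```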

  14. Energy-efficient virtual optical network mapping approaches over converged flexible bandwidth optical networks and data centers.

    Science.gov (United States)

    Chen, Bowen; Zhao, Yongli; Zhang, Jie

    2015-09-21

    In this paper, we develop a virtual link priority mapping (LPM) approach and a virtual node priority mapping (NPM) approach to improve the energy efficiency and to reduce the spectrum usage over converged flexible bandwidth optical networks and data centers. For comparison, the lower bound of the virtual optical network mapping is used as the benchmark solution. Simulation results show that the LPM approach achieves better performance in terms of power consumption, energy efficiency, spectrum usage, and the number of regenerators compared to the NPM approach.

  15. Optogenetic Approaches for Mesoscopic Brain Mapping.

    Science.gov (United States)

    Kyweriga, Michael; Mohajerani, Majid H

    2016-01-01

    Recent advances in identifying genetically unique neuronal proteins have revolutionized the study of brain circuitry. Researchers are now able to insert specific light-sensitive proteins (opsins) into a wide range of specific cell types via viral injections or by breeding transgenic mice. These opsins enable the activation, inhibition, or modulation of neuronal activity with millisecond control within distinct brain regions defined by genetic markers. Here we present a useful guide for implementing this technique in any lab. We first review the materials needed and practical considerations, and provide in-depth instructions for acute surgeries in mice. We conclude with all-optical mapping techniques for simultaneous recording and manipulation of the population activity of many neurons in vivo, combining arbitrary-point optogenetic stimulation with regional voltage-sensitive dye imaging. It is our intent to make these methods available to anyone wishing to use them.

  16. Force scanning: a rapid, high-resolution approach for spatial mechanical property mapping

    International Nuclear Information System (INIS)

    Darling, E M

    2011-01-01

    Atomic force microscopy (AFM) can be used to co-localize mechanical properties and topographical features through property mapping techniques. The most common approach for testing biological materials at the microscale and nanoscale is force mapping, which involves taking individual force curves at discrete sites across a region of interest. The limitations of force mapping include long testing times and low resolution. While newer AFM methodologies, like modulated scanning and torsional oscillation, circumvent this problem, their adoption for biological materials has been limited. This could be due to their need for specialized software algorithms and/or hardware. The objective of this study is to develop a novel force scanning technique using AFM to rapidly capture high-resolution topographical images of soft biological materials while simultaneously quantifying their mechanical properties. Force scanning is a straightforward methodology applicable to a wide range of materials and testing environments, requiring no special modification to standard AFMs. Essentially, if a contact-mode image can be acquired, then force scanning can be used to produce a spatial modulus map. The current study first validates this technique using agarose gels, comparing results to ones achieved by the standard force mapping approach. Biologically relevant demonstrations are then presented for high-resolution modulus mapping of individual cells, cell-cell interfaces, and articular cartilage tissue.
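
    Whichever AFM acquisition mode is used, force-indentation data are typically reduced to an elastic modulus with a contact model. A minimal sketch using the Hertz model for a spherical tip follows; the parameter values are illustrative and the paper's full processing pipeline is not reproduced.

```python
import math

def hertz_modulus(force_n, indent_m, tip_radius_m, poisson=0.5):
    """Young's modulus (Pa) from one force-indentation sample via the Hertz
    model for a spherical indenter: F = (4/3) * E/(1-v^2) * sqrt(R) * d**1.5.
    Force scanning fits many such samples per pixel; this inverts a single one."""
    return 3 * force_n * (1 - poisson ** 2) / (4 * math.sqrt(tip_radius_m) * indent_m ** 1.5)

# 1 nN of force at 100 nm indentation with a 2.5 um spherical tip
# (incompressible soft material, v = 0.5):
E = hertz_modulus(1e-9, 100e-9, 2.5e-6)  # about 11 kPa, a cell-like stiffness
```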

  17. Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.

    Science.gov (United States)

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A

    2012-08-01

    To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way for conducting dynamic mapping. Copyright © 2011 Elsevier Inc. All rights reserved.
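
    The evaluation against the gold standard uses standard precision and recall over term-to-concept assignments. A minimal sketch follows; the drug names and concept codes are invented, not RxNorm identifiers.

```python
def mapping_metrics(auto_map, gold_map):
    """Precision and recall of an automated term mapping against a gold
    standard. Both arguments are {source_term: target_concept}; terms the
    tool declined to map are simply absent from auto_map."""
    correct = sum(1 for t, c in auto_map.items() if gold_map.get(t) == c)
    precision = correct / len(auto_map) if auto_map else 0.0
    recall = correct / len(gold_map) if gold_map else 0.0
    return precision, recall

gold = {'aspirin 81mg tab': 'RX1', 'lisinopril 10mg': 'RX2',
        'heparin flush': 'RX3', 'warfarin 5mg': 'RX4'}
auto = {'aspirin 81mg tab': 'RX1', 'lisinopril 10mg': 'RX2',
        'warfarin 5mg': 'RX9'}   # one wrong concept, one term unmapped
p, r = mapping_metrics(auto, gold)
```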

  18. An Effective NoSQL-Based Vector Map Tile Management Approach

    Directory of Open Access Journals (Sweden)

    Lin Wan

    2016-11-01

    Full Text Available Within a digital map service environment, the rapid growth of Spatial Big-Data is driving new requirements for effective mechanisms for massive online vector map tile processing. The emergence of Not Only SQL (NoSQL) databases has resulted in a new data storage and management model for scalable spatial data deployments and fast tracking. They better suit the scenario of high-volume, low-latency network map services than traditional standalone high-performance computer (HPC) or relational databases. In this paper, we propose a flexible storage framework that provides feasible methods for tiled map data parallel clipping and retrieval operations within a distributed NoSQL database environment. We illustrate the parallel vector tile generation and querying algorithms with the MapReduce programming model. Three different processing approaches, including local caching, distributed file storage, and the NoSQL-based method, are compared by analyzing the concurrent load and calculation time. An online geological vector tile map service prototype was developed to embed our processing framework in the China Geological Survey Information Grid. Experimental results show that our NoSQL-based parallel tile management framework can support applications that process huge volumes of vector tile data and improve performance of the tiled map service.
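
    A common way to realize such tile storage in a key-value NoSQL store is a composite row key over layer, zoom level, and tile coordinates; zero-padding keeps one zoom level lexicographically contiguous for range scans. The sketch below uses a plain dict as a stand-in for the distributed store, and the key layout is an assumption, not the paper's schema.

```python
def tile_key(layer, z, x, y):
    """Composite, zero-padded row key for a vector tile."""
    return f"{layer}:{z:02d}:{x:07d}:{y:07d}"

class TileStore:
    """Dict-backed stand-in for a distributed key-value (NoSQL) store."""
    def __init__(self):
        self._kv = {}
    def put(self, layer, z, x, y, blob):
        self._kv[tile_key(layer, z, x, y)] = blob
    def get(self, layer, z, x, y):
        return self._kv.get(tile_key(layer, z, x, y))

store = TileStore()
store.put('geology', 12, 3421, 1604, b'<encoded vector tile>')
hit = store.get('geology', 12, 3421, 1604)
miss = store.get('geology', 12, 3421, 1605)   # neighboring tile, never stored
```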

  19. An approach to define semantics for BPM systems interoperability

    Science.gov (United States)

    Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María

    2015-04-01

    This article proposes defining semantics for Business Process Management systems interoperability through the ontology of Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The semantic model generated allows aligning enterprises' business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented, which is based on a strategy for discovering entity features whose interpretation depends on the context, and representing them for enriching the ontology. The proposed method complements ontology learning techniques that cannot infer semantic features not represented in data sources. In order to improve the representation of these entity features, the method proposes using widely accepted ontologies for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method proposes identifying whether the feature is simple or complex, and defines a strategy to be followed. An empirical validation of the approach has been performed through a case study.

  20. Noise pollution mapping approach and accuracy on landscape scales.

    Science.gov (United States)

    Iglesias Merchan, Carlos; Diaz-Balteiro, Luis

    2013-04-01

    Noise mapping allows the characterization of environmental variables, such as noise pollution or soundscape, depending on the task. Strategic noise mapping (as per Directive 2002/49/EC, 2002) is a tool intended for the assessment of noise pollution at the European level every five years. These maps are based on common methods and procedures intended for human exposure assessment in the European Union that could also be adapted for assessing environmental noise pollution in natural parks. However, given the size of such areas, there could be an alternative approach to soundscape characterization rather than using human noise exposure procedures. It is possible to optimize the size of the mapping grid used for such work by taking into account the attributes of the area to be studied and the desired outcome. This would then optimize the mapping time and the cost. This type of optimization is important in noise assessment as well as in the study of other environmental variables. This study compares 15 models, using different grid sizes, to assess the accuracy of the noise mapping of road traffic noise at a landscape scale, with respect to noise and landscape indicators. In a study area located in the Manzanares High River Basin Regional Park in Spain, different accuracy levels (Kappa index values from 0.725 to 0.987) were obtained depending on the terrain and noise source properties. The time taken for the calculations and the noise mapping accuracy results reveal the potential for setting the map resolution in line with decision-makers' criteria and budget considerations. Copyright © 2013 Elsevier B.V. All rights reserved.
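
    The accuracy comparison rests on the Kappa index. For two categorical maps flattened to label lists, Cohen's kappa (observed agreement corrected for chance agreement) can be computed as below; the toy labels stand in for the study's 15 gridded noise models.

```python
def cohens_kappa(map_a, map_b):
    """Cohen's kappa between two categorical rasters given as label lists:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e the agreement expected from the two marginal label frequencies."""
    assert len(map_a) == len(map_b)
    n = len(map_a)
    labels = set(map_a) | set(map_b)
    po = sum(a == b for a, b in zip(map_a, map_b)) / n
    pe = sum((map_a.count(l) / n) * (map_b.count(l) / n) for l in labels)
    return (po - pe) / (1 - pe) if pe != 1 else 1.0

ref  = ['quiet', 'quiet', 'noisy', 'noisy', 'noisy', 'quiet']
test = ['quiet', 'quiet', 'noisy', 'noisy', 'quiet', 'quiet']
kappa = cohens_kappa(ref, test)   # 5/6 raw agreement, kappa = 2/3
```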

  1. Approach to defining de minimis, intermediate, and other classes of radioactive waste

    International Nuclear Information System (INIS)

    Cohen, J.J.; Smith, C.F.

    1986-01-01

    This study has developed a framework within which the complete spectrum of radioactive wastes can be defined. The approach reflects two concerns, the degree of radioactivity and the persistence of the waste, within a radioactive waste classification system: the class of any radioactive waste stream depends on both. To be consistent with conventional systems, four waste classes are defined. In increasing order of concern due to radioactivity and/or duration, these are: 1. De Minimis Waste: waste with such a low content of radioactive material that it can be considered essentially nonradioactive and managed according to its nonradiological characteristics. 2. Low-Level Waste (LLW): waste whose maximum concentrations are prescribed in 10CFR61 as suitable for disposal by shallow land burial. 3. Intermediate-Level Waste (ILW): waste whose content exceeds Class C (10CFR61) levels yet does not pose a sufficient hazard to justify management as high-level waste (i.e., permanent isolation by deep geologic disposal). 4. High-Level Waste (HLW): the class posing the most serious management problem and requiring the most restrictive disposal methods; it is defined in the NWPA as waste derived from the reprocessing of nuclear fuel and/or highly radioactive waste that requires permanent isolation.
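
    The two-axis idea, class determined jointly by degree of radioactivity and persistence, can be sketched as a decision rule. All numeric thresholds below are hypothetical placeholders, not values from 10CFR61 or the NWPA.

```python
def waste_class(activity, persistence_years):
    """Illustrative classifier over the two axes the framework uses.
    'activity' is in arbitrary relative units; the cut-offs are invented."""
    if activity < 1e-3:
        return 'de minimis'                              # essentially nonradioactive
    if activity < 1e2 and persistence_years < 1e2:
        return 'low-level'                               # shallow land burial
    if activity < 1e5:
        return 'intermediate'                            # beyond Class C, below HLW
    return 'high-level'                                  # permanent deep isolation

classes = [waste_class(1e-4, 10), waste_class(1.0, 30),
           waste_class(1e3, 1e4), waste_class(1e7, 1e5)]
```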

  2. Data mining approach to bipolar cognitive map development and decision analysis

    Science.gov (United States)

    Zhang, Wen-Ran

    2002-03-01

    A data mining approach to cognitive mapping is presented based on bipolar logic, bipolar relations, and bipolar clustering. It is shown that a correlation network derived from a database can be converted to a bipolar cognitive map (or bipolar relation). A transitive, symmetric, and reflexive bipolar relation (equilibrium relation) can be used to identify focal links in decision analysis. It can also be used to cluster a set of events or itemsets into three different clusters: coalition sets, conflict sets, and harmony sets. The coalition sets are positively correlated events or itemsets; each conflict set is a negatively correlated set of two coalition subsets; and a harmony set consists of events that are both negatively and positively correlated. A cognitive map and the clusters can then be used for online decision analysis. This approach combines knowledge discovery with the views of decision makers and provides an effective means for online analytical processing (OLAP) and online analytical mining (OLAM).
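
    A loose, per-event simplification of the bipolar clustering idea: partition events by the signs of their correlation links. Note that the paper's conflict sets are richer objects (pairs of negatively correlated coalition subsets derived from an equilibrium relation); this sketch only captures the sign-based intuition, on invented correlations.

```python
def bipolar_clusters(corr):
    """Partition events by link sign: only-positive links -> coalition,
    only-negative -> conflict, both signs -> harmony."""
    coalition, conflict, harmony = set(), set(), set()
    events = {e for pair in corr for e in pair}
    for e in events:
        signs = {1 if w > 0 else -1
                 for pair, w in corr.items() if e in pair and w != 0}
        if signs == {1}:
            coalition.add(e)
        elif signs == {-1}:
            conflict.add(e)
        elif signs == {1, -1}:
            harmony.add(e)
    return coalition, conflict, harmony

# Correlation network mined from a database (toy values).
corr = {('a', 'b'): 0.8, ('b', 'c'): 0.6, ('c', 'd'): -0.7, ('f', 'g'): -0.9}
coalition, conflict, harmony = bipolar_clusters(corr)
```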

  3. Turkers in Africa: A Crowdsourcing Approach to Improving Agricultural Landcover Maps

    Science.gov (United States)

    Estes, L. D.; Caylor, K. K.; Choi, J.

    2012-12-01

    In the coming decades a substantial portion of Africa is expected to be transformed to agriculture. The scale of this conversion may match or exceed that which occurred in the Brazilian Cerrado and Argentinian Pampa in recent years. Tracking the rate and extent of this conversion will depend on having an accurate baseline of the current extent of croplands. Continent-wide baseline data do exist, but the accuracy of these relatively coarse resolution, remotely sensed assessments is suspect in many regions. To develop more accurate maps of the distribution and nature of African croplands, we develop a distributed "crowdsourcing" approach that harnesses human eyeballs and image interpretation capabilities. Our initial goal is to assess the accuracy of existing agricultural land cover maps, but ultimately we aim to generate "wall-to-wall" cropland maps that can be revisited and updated to track agricultural transformation. Our approach utilizes the freely available, high-resolution satellite imagery provided by Google Earth, combined with Amazon.com's Mechanical Turk platform, an online service that provides a large, global pool of workers (known as "Turkers") who perform "Human Intelligence Tasks" (HITs) for a fee. Using open-source R and python software, we select a random sample of 1 km2 cells from a grid placed over our study area, stratified by field density classes drawn from one of the coarse-scale land cover maps, and send these in batches to Mechanical Turk for processing. Each Turker is required to conduct an initial training session, on the basis of which they are assigned an accuracy score that determines whether the Turker is allowed to proceed with mapping tasks. Completed mapping tasks are automatically retrieved and processed on our server, and subject to two further quality control measures. The first of these is a measure of the spatial accuracy of Turker mapped areas compared to "gold standard" maps from selected locations that are randomly
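
    The sampling step, a random draw of 1 km2 grid cells stratified by field-density class, can be sketched as follows. The grid, the density classes, and the per-stratum batch size are illustrative; the study derives the strata from a coarse-scale land cover map.

```python
import random

def stratified_sample(cells, strata_of, per_stratum, seed=7):
    """Draw up to per_stratum cells from each stratum, without replacement."""
    rng = random.Random(seed)
    by_stratum = {}
    for cell in cells:
        by_stratum.setdefault(strata_of(cell), []).append(cell)
    sample = []
    for stratum, members in sorted(by_stratum.items()):
        sample.extend(rng.sample(members, min(per_stratum, len(members))))
    return sample

cells = [(i, j) for i in range(20) for j in range(20)]   # a 20 x 20 cell grid
density = lambda cell: ('high' if cell[0] < 5 else
                        'medium' if cell[0] < 12 else 'low')
batch = stratified_sample(cells, density, per_stratum=10)  # 10 cells per class
```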

  4. Fast and Accurate Approaches for Large-Scale, Automated Mapping of Food Diaries on Food Composition Tables

    Directory of Open Access Journals (Sweden)

    Marc Lamarine

    2018-05-01

    Full Text Available Aim of Study: The use of weighed food diaries in nutritional studies provides a powerful method to quantify food and nutrient intakes. Yet, mapping these records onto food composition tables (FCTs) is a challenging, time-consuming and error-prone process. Experts make this effort manually and no automation has been previously proposed. Our study aimed to assess automated approaches to map food items onto FCTs. Methods: We used food diaries (~170,000 records pertaining to 4,200 unique food items) from the DiOGenes randomized clinical trial. We attempted to map these items onto six FCTs available from the EuroFIR resource. Two approaches were tested: the first was based solely on food name similarity (fuzzy matching). The second used a machine learning approach (C5.0 classifier) combining both fuzzy matching and food energy. We tested mapping food items using their original names and also an English translation. Top matching pairs were reviewed manually to derive performance metrics: precision (the percentage of correctly mapped items) and recall (the percentage of mapped items). Results: The simpler approach, fuzzy matching, provided very good performance. Under a relaxed threshold (score > 50%), this approach enabled remapping of 99.49% of the items with a precision of 88.75%. With a slightly more stringent threshold (score > 63%), the precision could be significantly improved to 96.81% while keeping a recall rate > 95% (i.e., only 5% of the queried items would not be mapped). The machine learning approach did not lead to any improvements compared to the fuzzy matching. However, it could increase substantially the recall rate for food items without any clear equivalent in the FCTs (+7 and +20% when mapping items using their original or English-translated names). Our approaches have been implemented as R packages and are freely available from GitHub. Conclusion: This study is the first to provide automated approaches for large-scale food item mapping onto FCTs.
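
    The fuzzy-matching baseline can be sketched with the standard library's SequenceMatcher standing in for the paper's similarity scorer. The 0.63 threshold mirrors the stricter setting reported above; the food names are invented.

```python
from difflib import SequenceMatcher

def best_fct_match(item, fct_names, threshold=0.63):
    """Fuzzy-match a diary food item against FCT entry names; return the best
    candidate only if its similarity ratio clears the threshold."""
    scored = [(SequenceMatcher(None, item.lower(), name.lower()).ratio(), name)
              for name in fct_names]
    score, name = max(scored)
    return (name, score) if score >= threshold else (None, score)

fct = ['Bread, wholemeal', 'Bread, white, sliced',
       'Milk, semi-skimmed', 'Cheddar cheese']
match, score = best_fct_match('bread wholemeal', fct)        # clears the threshold
no_match, _ = best_fct_match('dragonfruit smoothie', fct)    # nothing close enough
```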

  5. A conformal mapping approach to a root-clustering problem

    International Nuclear Information System (INIS)

    Melnikov, Gennady I; Dudarenko, Nataly A; Melnikov, Vitaly G

    2014-01-01

    This paper presents a new approach for matrix root-clustering in sophisticated and multiply connected regions of the complex plane. The parametric sweeping method and a concept of the closed forbidden region covered by a set of modified three-parameter Cassini regions are used. A conformal mapping approach was applied to formulate the main results of the paper. An application of the developed method to the problem of matrix root-clustering in a multiply connected region is shown for illustration.

  6. Systematic approach for deriving feasible mappings of parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir; Imre, Kayhan M.

    2017-01-01

    The need for high-performance computing together with the increasing trend from single processor to parallel computer architectures has leveraged the adoption of parallel computing. To benefit from parallel computing power, usually parallel algorithms are defined that can be mapped and executed

  7. A multi-temporal analysis approach for land cover mapping in support of nuclear incident response

    Science.gov (United States)

    Sah, Shagan; van Aardt, Jan A. N.; McKeown, Donald M.; Messinger, David W.

    2012-06-01

    Remote sensing can be used to rapidly generate land use maps for assisting emergency response personnel with resource deployment decisions and impact assessments. In this study we focus on constructing accurate land cover maps to map the impacted area in the case of a nuclear material release. The proposed methodology involves integration of results from two different approaches to increase classification accuracy. The data used included RapidEye scenes over Nine Mile Point Nuclear Power Station (Oswego, NY). The first step was building a coarse-scale land cover map from freely available, high temporal resolution, MODIS data using a time-series approach. In the case of a nuclear accident, high spatial resolution commercial satellites such as RapidEye or IKONOS can acquire images of the affected area. Land use maps from the two image sources were integrated using a probability-based approach. Classification results were obtained for four land classes - forest, urban, water and vegetation - using Euclidean and Mahalanobis distances as metrics. Despite the coarse resolution of MODIS pixels, acceptable accuracies were obtained using time series features. The overall accuracies using the fusion based approach were in the neighborhood of 80%, when compared with GIS data sets from New York State. The classifications were augmented using this fused approach, with few supplementary advantages such as correction for cloud cover and independence from time of year. We concluded that this method would generate highly accurate land maps, using coarse spatial resolution time series satellite imagery and a single date, high spatial resolution, multi-spectral image.
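
    The minimum-distance classification step can be sketched as below. For brevity, the Mahalanobis variant uses a diagonal covariance (per-band variances) rather than the full covariance matrix, and the training pixel values are invented.

```python
import math

def class_stats(samples):
    """Per-band mean and variance for a set of training pixels."""
    bands = len(samples[0])
    means = [sum(p[b] for p in samples) / len(samples) for b in range(bands)]
    var = [sum((p[b] - means[b]) ** 2 for p in samples) / len(samples)
           for b in range(bands)]
    return means, [v or 1e-9 for v in var]   # guard against zero variance

def classify(pixel, stats, metric='euclidean'):
    """Assign a pixel to the nearest class centroid under the chosen metric."""
    best, best_d = None, float('inf')
    for label, (mu, var) in stats.items():
        if metric == 'euclidean':
            d = math.dist(pixel, mu)
        else:  # diagonal-covariance Mahalanobis distance
            d = math.sqrt(sum((x - m) ** 2 / v for x, m, v in zip(pixel, mu, var)))
        if d < best_d:
            best, best_d = label, d
    return best

training = {   # toy 3-band reflectance signatures per class
    'water':  [(5, 10, 80), (7, 12, 78), (6, 9, 82)],
    'forest': [(20, 60, 30), (25, 55, 28), (22, 58, 33)],
    'urban':  [(70, 65, 60), (75, 70, 62), (68, 66, 58)],
}
stats = {label: class_stats(px) for label, px in training.items()}
label_e = classify((21, 59, 31), stats, 'euclidean')
label_m = classify((6, 11, 79), stats, 'mahalanobis')
```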

  8. Approaches in Characterizing Genetic Structure and Mapping in a Rice Multiparental Population.

    Science.gov (United States)

    Raghavan, Chitra; Mauleon, Ramil; Lacorte, Vanica; Jubay, Monalisa; Zaw, Hein; Bonifacio, Justine; Singh, Rakesh Kumar; Huang, B Emma; Leung, Hei

    2017-06-07

    Multi-parent Advanced Generation Intercross (MAGIC) populations are fast becoming mainstream tools for research and breeding, along with the technology and tools for analysis. This paper demonstrates the analysis of a rice MAGIC population from data filtering to imputation and processing of genetic data to characterizing genomic structure, and finally quantitative trait loci (QTL) mapping. In this study, 1316 S6:8 indica MAGIC (MI) lines and the eight founders were sequenced using Genotyping by Sequencing (GBS). As the GBS approach often includes missing data, the first step was to impute the missing SNPs. The observable number of recombinations in the population was then explored. Based on this case study, a general outline of procedures for a MAGIC analysis workflow is provided, as well as for QTL mapping of agronomic traits and biotic and abiotic stress, using the results from both association and interval mapping approaches. QTL for agronomic traits (yield, flowering time, and plant height), physical (grain length and grain width) and cooking properties (amylose content) of the rice grain, abiotic stress (submergence tolerance), and biotic stress (brown spot disease) were mapped. Through presenting this extensive analysis in the MI population in rice, we highlight important considerations when choosing analytical approaches. The methods and results reported in this paper will provide a guide to future genetic analysis methods applied to multi-parent populations. Copyright © 2017 Raghavan et al.

  9. Approaches in Characterizing Genetic Structure and Mapping in a Rice Multiparental Population

    Directory of Open Access Journals (Sweden)

    Chitra Raghavan

    2017-06-01

    Full Text Available Multi-parent Advanced Generation Intercross (MAGIC) populations are fast becoming mainstream tools for research and breeding, along with the technology and tools for analysis. This paper demonstrates the analysis of a rice MAGIC population from data filtering to imputation and processing of genetic data to characterizing genomic structure, and finally quantitative trait loci (QTL) mapping. In this study, 1316 S6:8 indica MAGIC (MI) lines and the eight founders were sequenced using Genotyping by Sequencing (GBS). As the GBS approach often includes missing data, the first step was to impute the missing SNPs. The observable number of recombinations in the population was then explored. Based on this case study, a general outline of procedures for a MAGIC analysis workflow is provided, as well as for QTL mapping of agronomic traits and biotic and abiotic stress, using the results from both association and interval mapping approaches. QTL for agronomic traits (yield, flowering time, and plant height), physical properties (grain length and grain width) and cooking properties (amylose content) of the rice grain, abiotic stress (submergence tolerance), and biotic stress (brown spot disease) were mapped. Through presenting this extensive analysis in the MI population in rice, we highlight important considerations when choosing analytical approaches. The methods and results reported in this paper will provide a guide to future genetic analysis methods applied to multi-parent populations.

  10. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    Science.gov (United States)

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real-time and streaming data in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework using the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity analysis and qualitative results show a significant performance improvement. PMID:26305223
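The multi-key idea described above can be sketched with a toy, in-memory map/reduce: each mapper runs several algorithms over the same record and tags its intermediate key with an algorithm ID, so one job produces separately reducible outputs. The algorithm registry and key scheme below are illustrative stand-ins, not the MRPack API.

```python
from collections import defaultdict

# Toy in-memory map/reduce: several algorithms run inside one "job" by
# tagging each intermediate key with an algorithm ID (the multi-key idea).
ALGORITHMS = {
    "wordcount": lambda rec: [(w, 1) for w in rec.split()],
    "charcount": lambda rec: [("chars", len(rec))],
}

def map_phase(records):
    intermediate = defaultdict(list)
    for rec in records:
        for algo_id, mapper in ALGORITHMS.items():
            for key, value in mapper(rec):
                intermediate[(algo_id, key)].append(value)
    return intermediate

def reduce_phase(intermediate):
    results = defaultdict(dict)
    for (algo_id, key), values in intermediate.items():
        results[algo_id][key] = sum(values)  # per-algorithm reducer
    return dict(results)

out = reduce_phase(map_phase(["big data", "big jobs"]))
# out["wordcount"] holds word counts; out["charcount"]["chars"] totals characters.
```

In a real Hadoop job, the composite key would also drive partitioning, which is where the paper's skew-mitigation strategies come in.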

  11. Mapping of transport sensitive areas - Task 3.1

    DEFF Research Database (Denmark)

    Münier, Bernd

    ... and related sensitive areas in the EU deals with the operationalisation of the criteria for transport sensitivity and impacts, as defined in D2. This paper reports the findings of task 3.1, a review of spatial approaches, mapping examples and available data sets at EU level. The outcomes of this task ... and retrieval of data available for pan-European mapping exercises revealed a considerable number of high-resolution maps suitable for production of map examples. The results have been documented as a spreadsheet containing essential sets of metadata. Furthermore, it could be realised that the number and quality of map data available is constantly increasing, both with regard to coverage of existing maps and the release of new maps or maps harmonised from national mapping tasks. Main data gaps seem to be within data on meteorology and air quality, as they only exist in rather coarse spatial resolution...

  12. A taxonomy of behaviour change methods: an Intervention Mapping approach

    OpenAIRE

    Kok, Gerjo; Gottlieb, Nell H.; Peters, Gjalt-Jorn Y.; Mullen, Patricia Dolan; Parcel, Guy S.; Ruiter, Robert A.C.; Fernández, María E.; Markham, Christine; Bartholomew, L. Kay

    2015-01-01

    ABSTRACT In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters fo...

  13. Redefining the Indirect Approach, Defining Special Operations Forces (SOF) Power, and the Global Networking of SOF

    Directory of Open Access Journals (Sweden)

    Scott Morrison

    2014-07-01

    Full Text Available The current Defense Strategy assigns Special Operations Forces (SOF) a central role in countering terrorism, weapons of mass destruction, and irregular warfare. However, little has been published that defines the role of Special Operations alongside the air, land, and sea domains. The U.S. Special Operations Community struggles to define its own theoretical concepts, such as the direct approach and the indirect approach. U.S. SOF circles typically equate the direct approach with direct action and the indirect approach with foreign internal defense or security force assistance. Military theorist Liddell Hart viewed the indirect approach as a method to orient upon, target, and upset an adversary’s equilibrium in order to plan for and direct decisive blows. Today, the SOF indirect approach is arguably more applicable due to the prevalence of non-state threats and internal conflicts. Following Hart’s definition, precision raids are among the integral components of a broader application of the indirect approach. The approach also networks U.S. government power as a force when used in concert with allies and local partners. Global networking along with balanced precision raids will exponentially increase the utility of SOF power and position it to appropriately complement all domains to tackle 21st century challenges.

  14. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    Science.gov (United States)

    Mualchin, Lalliana

    2011-03-01

    results at that time. CDMG eventually published the second edition map in 1992 following the Governor's Board of Inquiry on the 1989 Loma Prieta earthquake and at the demand of Caltrans. The third edition map was published by Caltrans in 1996 utilizing GIS technology to manage data, including a simplified three-dimensional geometry of faults, and to facilitate efficient corrections and revisions of data and the map. The spatial relationship of fault hazards with highways, bridges or any other attribute can now be efficiently managed and analyzed in GIS at Caltrans. There has been great confidence in using DSHA in bridge engineering and other applications in California, and it can be confidently applied in any other earthquake-prone region. Earthquake hazards defined by DSHA are: (1) transparent and stable, with robust MCE moment magnitudes; (2) flexible in their application to design considerations; (3) able to easily incorporate advances in ground motion simulations; and (4) economical. DSHA and neo-DSHA have the same approach and applicability. The accuracy of DSHA has proven to be quite reasonable for practical applications within engineering design, always applied with professional judgment. In the final analysis, DSHA is a reality-check for public safety and PSHA results. Although PSHA has been acclaimed as a better approach for seismic hazard assessment, it is DSHA, not PSHA, that has actually been used in seismic hazard assessment for building and bridge engineering, particularly in California.

  15. A New Approach to High-accuracy Road Orthophoto Mapping Based on Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Ming Yang

    2011-12-01

    Full Text Available Existing orthophoto maps based on satellite and aerial photography are not precise enough for road marking. This paper proposes a new approach to high-accuracy orthophoto mapping. The approach uses an inverse perspective transformation to process the image information and generate orthophoto fragments. An offline interpolation algorithm processes the location information: it combines dead-reckoning and EKF location estimates, and uses the result to transform the fragments into the global coordinate system. Finally, a wavelet transform divides the image into two frequency bands, and a weighted median algorithm processes each band separately. Experimental results show that maps produced with this method have high accuracy.
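The final band-split-and-merge step can be sketched with a single-level Haar transform and a per-coefficient weighted median. The abstract does not specify the wavelet or the weighting scheme, so both choices below are illustrative.

```python
import numpy as np

def haar_split(x):
    """Single-level 1-D Haar transform: low (approximation) and high (detail)."""
    x = np.asarray(x, dtype=float)
    low = (x[0::2] + x[1::2]) / np.sqrt(2)
    high = (x[0::2] - x[1::2]) / np.sqrt(2)
    return low, high

def haar_merge(low, high):
    """Inverse single-level Haar transform."""
    x = np.empty(2 * low.size)
    x[0::2] = (low + high) / np.sqrt(2)
    x[1::2] = (low - high) / np.sqrt(2)
    return x

def weighted_median(values, weights):
    """Smallest value whose cumulative weight reaches half the total weight."""
    order = np.argsort(values)
    v, w = np.asarray(values, float)[order], np.asarray(weights, float)[order]
    cdf = np.cumsum(w)
    return v[np.searchsorted(cdf, 0.5 * cdf[-1])]

# Fuse two overlapping image rows band by band, then reconstruct.
a = np.array([10.0, 12.0, 14.0, 16.0])   # fragment carrying the larger weight
b = np.array([11.0, 11.0, 15.0, 15.0])
la, ha = haar_split(a)
lb, hb = haar_split(b)
low = np.array([weighted_median([la[i], lb[i]], [0.7, 0.3]) for i in range(la.size)])
high = np.array([weighted_median([ha[i], hb[i]], [0.7, 0.3]) for i in range(ha.size)])
fused = haar_merge(low, high)            # here equals a, the heavier fragment
```

A production pipeline would use a 2-D multi-level decomposition, but the split/median/merge pattern is the same.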

  16. Development and application of a conceptual approach for defining high-level waste

    International Nuclear Information System (INIS)

    Croff, A.G.; Forsberg, C.W.; Kocher, D.C.; Cohen, J.J.; Smith, C.F.; Miller, D.E.

    1986-01-01

    This paper presents a conceptual approach to defining high-level radioactive waste (HLW) and a preliminary quantitative definition obtained from an example implementation of the conceptual approach. On the basis of the description of HLW in the Nuclear Waste Policy Act of 1982, we have developed a conceptual model in which HLW has two attributes: HLW is (1) highly radioactive and (2) requires permanent isolation via deep geologic disposal. This conceptual model results in a two-dimensional waste categorization system in which one axis, related to "requires permanent isolation," is associated with long-term risks from waste disposal and the other axis, related to "highly radioactive," is associated with short-term risks from waste management and operations; this system also leads to the specification of categories of wastes that are not HLW. Implementation of the conceptual model for defining HLW was based primarily on health and safety considerations. Wastes requiring permanent isolation via deep geologic disposal were defined by estimating the maximum concentrations of radionuclides that would be acceptable for disposal using the next-best technology, i.e., greater confinement disposal (GCD) via intermediate-depth burial or engineered surface structures. Wastes that are highly radioactive were defined by adopting heat generation rate as the appropriate measure and examining levels of decay heat that necessitate special methods to control risks from operations in a variety of nuclear fuel-cycle situations. We determined that wastes having a power density >200 W/m³ should be considered highly radioactive. Thus, in the example implementation, the combination of maximum concentrations of long-lived radionuclides that are acceptable for GCD and a power density of 200 W/m³ provides boundaries for defining wastes that are HLW.
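The two-attribute categorization can be written down as a small decision rule. The 200 W/m³ threshold is taken from the abstract; the GCD concentration ratio and the category labels are illustrative.

```python
def classify_waste(power_density_w_m3, conc_ratio_to_gcd_limit):
    """Two-axis waste categorization sketch.

    power_density_w_m3: decay-heat power density; >200 W/m^3 marks waste
        as "highly radioactive" (threshold from the paper).
    conc_ratio_to_gcd_limit: long-lived-nuclide concentration divided by
        the maximum acceptable for greater confinement disposal (GCD);
        a ratio above 1 means the waste "requires permanent isolation".
    """
    highly_radioactive = power_density_w_m3 > 200.0
    needs_isolation = conc_ratio_to_gcd_limit > 1.0
    if highly_radioactive and needs_isolation:
        return "HLW"
    if needs_isolation:
        return "requires geologic disposal, not highly radioactive"
    if highly_radioactive:
        return "highly radioactive, acceptable for GCD"
    return "not HLW: neither attribute"
```

The four return values correspond to the four quadrants of the paper's two-dimensional categorization, with HLW occupying only the quadrant where both attributes hold.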

  17. My Family-Study, Early-Onset Substance use Prevention Program: An Application of Intervention Mapping Approach

    Directory of Open Access Journals (Sweden)

    Mehdi Mirzaei-Alavijeh

    2017-03-01

    Full Text Available Background and Objectives: Based on different studies, substance use is one of the health problems in Iranian society. The prevalence of substance use is on a growing trend; moreover, the age of onset of substance use has declined to early adolescence and even lower. Regarding this, the present study aimed to develop a family-based early-onset substance use prevention program for children (My Family-Study) using the intervention mapping approach. Materials and Methods: This study describes the research protocol in which the intervention mapping approach was used as a framework to develop My Family-Study. In this study, six steps of intervention mapping were completed. Interviews with experts and a literature review fulfilled the needs assessment. In the second step, the change objectives were written based on the intersection of the performance objectives and the associated determinants in the matrices. After designing the program and planning the implementation of the intervention, the evaluation plan of the program was completed. Results: The use of the intervention mapping approach facilitated the development of a systematic, theory- and evidence-based program. Moreover, this approach was helpful in the determination of outcomes, performance and change objectives, determinants, theoretical methods, practical applications, intervention, dissemination, and evaluation of the program. Conclusions: Intervention mapping provided a systematic, theory- and evidence-based approach to develop a quality continuing health promotion program.

  18. Global mapping of transposon location.

    Directory of Open Access Journals (Sweden)

    Abram Gabriel

    2006-12-01

    Full Text Available Transposable genetic elements are ubiquitous, yet their presence or absence at any given position within a genome can vary between individual cells, tissues, or strains. Transposable elements have profound impacts on host genomes by altering gene expression, assisting in genomic rearrangements, causing insertional mutations, and serving as sources of phenotypic variation. Characterizing a genome's full complement of transposons requires whole genome sequencing, precluding simple studies of the impact of transposition on interindividual variation. Here, we describe a global mapping approach for identifying transposon locations in any genome, using a combination of transposon-specific DNA extraction and microarray-based comparative hybridization analysis. We use this approach to map the repertoire of endogenous transposons in different laboratory strains of Saccharomyces cerevisiae and demonstrate that transposons are a source of extensive genomic variation. We also apply this method to mapping bacterial transposon insertion sites in a yeast genomic library. This unique whole genome view of transposon location will facilitate our exploration of transposon dynamics, as well as defining bases for individual differences and adaptive potential.

  19. Integrating Volcanic Hazard Data in a Systematic Approach to Develop Volcanic Hazard Maps in the Lesser Antilles

    Directory of Open Access Journals (Sweden)

    Jan M. Lindsay

    2018-04-01

    Full Text Available We report on the process of generating the first suite of integrated volcanic hazard zonation maps for the islands of Dominica, Grenada (including Kick ‘em Jenny and Ronde/Caille), Nevis, Saba, St. Eustatius, St. Kitts, Saint Lucia, and St Vincent in the Lesser Antilles. We developed a systematic approach that accommodated the range in prior knowledge of the volcanoes in the region. A first-order hazard assessment for each island was used to develop one or more scenario(s of likely future activity, for which scenario-based hazard maps were generated. For the most-likely scenario on each island we also produced a poster-sized integrated volcanic hazard zonation map, which combined the individual hazardous phenomena depicted in the scenario-based hazard maps into integrated hazard zones. We document the philosophy behind the generation of this suite of maps, and the method by which hazard information was combined to create integrated hazard zonation maps, and illustrate our approach through a case study of St. Vincent. We also outline some of the challenges we faced using this approach, and the lessons we have learned by observing how stakeholders have interacted with the maps over the past ~10 years. Based on our experience, we recommend that future map makers involve stakeholders in the entire map generation process, especially when making design choices such as type of base map, use of colour and gradational boundaries, and indeed what to depict on the map. We also recommend careful consideration of how to evaluate and depict offshore hazard of island volcanoes, and recommend computer-assisted modelling of all phenomena to generate more realistic hazard footprints. Finally, although our systematic approach to integrating individual hazard data into zones generally worked well, we suggest that a better approach might be to treat the integration of hazards on a case-by-case basis to ensure the final product meets map users' needs. We hope that
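The overlay step of integrating per-phenomenon hazard layers into a single zonation can be sketched as a cell-wise maximum over ordinal hazard levels. This is a simplification: the actual maps combined hazards with expert judgment and stakeholder input, and the layers below are invented for illustration.

```python
import numpy as np

# Each layer holds an ordinal hazard level (0 none .. 3 high) for one
# phenomenon on the same grid; the integrated zone is the cell-wise maximum.
pyroclastic = np.array([[3, 2, 0],
                        [2, 1, 0],
                        [1, 0, 0]])
ashfall = np.array([[2, 2, 2],
                    [1, 1, 1],
                    [1, 1, 1]])
lahar = np.array([[0, 3, 0],
                  [0, 3, 0],
                  [0, 3, 1]])
integrated = np.maximum.reduce([pyroclastic, ashfall, lahar])
```

Taking the maximum is conservative by design: a cell inherits the severity of its worst phenomenon, which is the usual convention for a combined zonation map.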

  20. A novel intra-operative, high-resolution atrial mapping approach.

    Science.gov (United States)

    Yaksh, Ameeta; van der Does, Lisette J M E; Kik, Charles; Knops, Paul; Oei, Frans B S; van de Woestijne, Pieter C; Bekkers, Jos A; Bogers, Ad J J C; Allessie, Maurits A; de Groot, Natasja M S

    2015-12-01

    A new technique is demonstrated for extensive high-resolution intra-operative atrial mapping that will facilitate the localization of atrial fibrillation (AF) sources and identification of the substrate perpetuating AF. Prior to the start of extracorporeal circulation, an 8 × 24-electrode array (2-mm inter-electrode distance) is sequentially placed on all right and left epicardial atrial sites, including Bachmann's bundle, for recording of unipolar electrograms during sinus rhythm and (induced) AF. AF is induced by high-frequency pacing at the right atrial free wall. A pacemaker wire stitched to the right atrium serves as a reference signal. The indifferent pole is connected to a steel wire fixed to subcutaneous tissue. Electrograms are recorded by a computerized mapping system and, after amplification (gain 1000), filtering (bandwidth 0.5-400 Hz), sampling (1 kHz) and analogue-to-digital conversion (16 bits), automatically stored on hard disk. During the mapping procedure, real-time visualization secures electrogram quality. Analysis will be performed offline. This technique was performed in 168 patients aged 18 years and older, with coronary and/or structural heart disease, with or without AF, electively scheduled for cardiac surgery and a ventricular ejection fraction above 40%. The mean duration of the entire mapping procedure including preparation time was 9 ± 2 min. Complications related to the mapping procedure during or after cardiac surgery were not observed. We introduce the first epicardial atrial mapping approach with a high resolution of ≥1728 recording sites which can be performed in a procedure time of only 9 ± 2 min. This mapping technique can potentially identify areas responsible for initiation and persistence of AF and hopefully can individualize both diagnosis and therapy of AF.

  1. Mapping community vulnerability to poaching: A whole-of-society approach

    CSIR Research Space (South Africa)

    Schmitz, Peter

    2017-01-01

    Full Text Available Mapping community vulnerability to poaching: a whole-of-society approach. Peter M.U. Schmitz, Duarte Gonçalves and Merin Jacob (CSIR Built Environment, Meiring Naude Rd, Brummeria, Pretoria, South Africa).

  2. Single-molecule approach to bacterial genomic comparisons via optical mapping.

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Shiguo [Univ. Wisc.-Madison; Kile, A. [Univ. Wisc.-Madison; Bechner, M. [Univ. Wisc.-Madison; Kvikstad, E. [Univ. Wisc.-Madison; Deng, W. [Univ. Wisc.-Madison; Wei, J. [Univ. Wisc.-Madison; Severin, J. [Univ. Wisc.-Madison; Runnheim, R. [Univ. Wisc.-Madison; Churas, C. [Univ. Wisc.-Madison; Forrest, D. [Univ. Wisc.-Madison; Dimalanta, E. [Univ. Wisc.-Madison; Lamers, C. [Univ. Wisc.-Madison; Burland, V. [Univ. Wisc.-Madison; Blattner, F. R. [Univ. Wisc.-Madison; Schwartz, David C. [Univ. Wisc.-Madison

    2004-01-01

    Modern comparative genomics has been established, in part, by the sequencing and annotation of a broad range of microbial species. To gain further insights, new sequencing efforts are now dealing with the variety of strains or isolates that gives a species definition and range; however, this number vastly outstrips our ability to sequence them. Given the availability of a large number of microbial species, new whole genome approaches must be developed to fully leverage this information at the level of strain diversity that maximize discovery. Here, we describe how optical mapping, a single-molecule system, was used to identify and annotate chromosomal alterations between bacterial strains represented by several species. Since whole-genome optical maps are ordered restriction maps, sequenced strains of Shigella flexneri serotype 2a (2457T and 301), Yersinia pestis (CO 92 and KIM), and Escherichia coli were aligned as maps to identify regions of homology and to further characterize them as possible insertions, deletions, inversions, or translocations. Importantly, an unsequenced Shigella flexneri strain (serotype Y strain AMC[328Y]) was optically mapped and aligned with two sequenced ones to reveal one novel locus implicated in serotype conversion and several other loci containing insertion sequence elements or phage-related gene insertions. Our results suggest that genomic rearrangements and chromosomal breakpoints are readily identified and annotated against a prototypic sequenced strain by using the tools of optical mapping.

  3. The CEGB approach to defining the commissioning tests for prime movers

    Energy Technology Data Exchange (ETDEWEB)

    Horne, B. E. [CEGB, Generation Development and Construction Division, Barnett Way, Barnwood, Gloucester GL4 7RS (United Kingdom)

    1986-02-15

    This paper describes the CEGB approach to demonstrating during commissioning the adequacy of the reliability of the large on-site essential electrical power sources installed in the CAGR power stations. In this approach the reliability requirements of the essential electrical supplies at the power stations are defined and then the reliability requirements of the particular gas turbine and diesel generator installation derived. The paper outlines the probabilistic methods used in arriving at the specific start and run test programmes which were subsequently carried out. The results achieved in these test programmes, demonstrating that the reliability requirements were satisfied, are presented in the paper. (author)

  4. The CEGB approach to defining the commissioning tests for prime movers

    International Nuclear Information System (INIS)

    Horne, B.E.

    1986-01-01

    This paper describes the CEGB approach to demonstrating during commissioning the adequacy of the reliability of the large on-site essential electrical power sources installed in the CAGR power stations. In this approach the reliability requirements of the essential electrical supplies at the power stations are defined and then the reliability requirements of the particular gas turbine and diesel generator installation derived. The paper outlines the probabilistic methods used in arriving at the specific start and run test programmes which were subsequently carried out. The results achieved in these test programmes, demonstrating that the reliability requirements were satisfied, are presented in the paper. (author)

  5. Systematic methods for defining coarse-grained maps in large biomolecules.

    Science.gov (United States)

    Zhang, Zhiyong

    2015-01-01

    Large biomolecules are involved in many important biological processes. It would be difficult to use large-scale atomistic molecular dynamics (MD) simulations to study the functional motions of these systems because of the computational expense. Therefore various coarse-grained (CG) approaches have attracted rapidly growing interest, which enable simulations of large biomolecules over longer effective timescales than all-atom MD simulations. The first issue in CG modeling is to construct CG maps from atomic structures. In this chapter, we review the recent development of a novel and systematic method for constructing CG representations of arbitrarily complex biomolecules, in order to preserve large-scale and functionally relevant essential dynamics (ED) at the CG level. In this ED-CG scheme, the essential dynamics can be characterized by principal component analysis (PCA) on a structural ensemble, or elastic network model (ENM) of a single atomic structure. Validation and applications of the method cover various biological systems, such as multi-domain proteins, protein complexes, and even biomolecular machines. The results demonstrate that the ED-CG method may serve as a very useful tool for identifying functional dynamics of large biomolecules at the CG level.
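The PCA ingredient of such an ED-CG-style scheme can be sketched on a synthetic ensemble: principal component analysis of coordinate fluctuations identifies the dominant essential mode, and atoms are grouped by their participation in it. The one-mode sign-based grouping below is a crude stand-in for the published ED-CG objective, not the method itself.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "ensemble": 200 snapshots of 6 one-dimensional atom coordinates.
# Atoms 0-2 move together; atoms 3-5 move opposite (two rigid domains).
collective = rng.normal(size=(200, 1))
ensemble = np.hstack([collective.repeat(3, axis=1),
                      -collective.repeat(3, axis=1)])
ensemble += 0.05 * rng.normal(size=ensemble.shape)

# PCA on the fluctuation covariance of the ensemble.
centered = ensemble - ensemble.mean(axis=0)
cov = centered.T @ centered / (len(centered) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues ascending
mode = eigvecs[:, -1]                       # dominant essential mode

# Crude CG map: group atoms whose components share a sign along the mode,
# recovering the two dynamical domains as two CG sites.
cg_groups = (mode > 0).astype(int)
```

For a real protein one would use all Cartesian coordinates and optimize the CG boundaries against the essential subspace, but the covariance-then-eigendecomposition step is the same.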

  6. Geographic Knowledge Extraction and Semantic Similarity in OpenStreetMap

    OpenAIRE

    Ballatore, Andrea; Bertolotto, Michela; Wilson, David C.

    2012-01-01

    In recent years, a web phenomenon known as Volunteered Geographic Information (VGI) has produced large crowdsourced geographic data sets. OpenStreetMap (OSM), the leading VGI project, aims at building an open-content world map through user contributions. OSM semantics consists of a set of properties (called 'tags') describing geographic classes, whose usage is defined by project contributors on a dedicated Wiki website. Because of its simple and open semantic structure, the OSM approach often...

  7. Improved regional-scale Brazilian cropping systems' mapping based on a semi-automatic object-based clustering approach

    Science.gov (United States)

    Bellón, Beatriz; Bégué, Agnès; Lo Seen, Danny; Lebourgeois, Valentine; Evangelista, Balbino Antônio; Simões, Margareth; Demonte Ferraz, Rodrigo Peçanha

    2018-06-01

    Cropping systems' maps at fine scale over large areas provide key information for further agricultural production and environmental impact assessments, and thus represent a valuable tool for effective land-use planning. There is, therefore, a growing interest in mapping cropping systems in an operational manner over large areas, and remote sensing approaches based on vegetation index time series analysis have proven to be an efficient tool. However, supervised pixel-based approaches are commonly adopted, requiring resource-consuming field campaigns to gather training data. In this paper, we present a new object-based unsupervised classification approach tested on an annual MODIS 16-day composite Normalized Difference Vegetation Index time series and a Landsat 8 mosaic of the State of Tocantins, Brazil, for the 2014-2015 growing season. Two variants of the approach are compared: a hyperclustering approach, and a landscape-clustering approach involving a prior stratification of the study area into landscape units on which the clustering is then performed. The main cropping systems of Tocantins, characterized by the crop types and cropping patterns, were efficiently mapped with the landscape-clustering approach. Results show that stratification prior to clustering significantly improves the classification accuracies for underrepresented and sparsely distributed cropping systems. This study illustrates the potential of unsupervised classification for large area cropping systems' mapping and contributes to the development of generic tools for supporting large-scale agricultural monitoring across regions.
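The unsupervised time-series clustering at the core of such an approach can be sketched with plain k-means on synthetic NDVI profiles. The farthest-point initialization and the two invented crop profiles are illustrative; the study's own clustering algorithm is not reproduced here.

```python
import numpy as np

def kmeans(X, k=2, iters=50):
    """Plain k-means with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        dist = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(dist)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

# Synthetic annual NDVI profiles (23 16-day composites): a single-cropping
# system with one green-up peak vs a double-cropping system with two.
t = np.linspace(0.0, 1.0, 23)
single = 0.2 + 0.6 * np.exp(-((t - 0.5) / 0.15) ** 2)
double = 0.2 + 0.5 * (np.exp(-((t - 0.3) / 0.08) ** 2)
                      + np.exp(-((t - 0.7) / 0.08) ** 2))
rng = np.random.default_rng(1)
X = np.vstack([single + 0.02 * rng.normal(size=(20, 23)),
               double + 0.02 * rng.normal(size=(20, 23))])
labels = kmeans(X, k=2)  # objects grouped by cropping-system profile
```

In the object-based variant, each row of `X` would be the mean NDVI time series of a segmented landscape object rather than a single pixel.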

  8. Concept mapping and network analysis: an analytic approach to measure ties among constructs.

    Science.gov (United States)

    Goldman, Alyssa W; Kane, Mary

    2014-12-01

    Group concept mapping is a mixed-methods approach that helps a group visually represent its ideas on a topic of interest through a series of related maps. The maps and additional graphics are useful for planning, evaluation and theory development. Group concept maps are typically described, interpreted and utilized through points, clusters and distances, and the implications of these features in understanding how constructs relate to one another. This paper focuses on the application of network analysis to group concept mapping to quantify the strength and directionality of relationships among clusters. The authors outline the steps of this analysis, and illustrate its practical use through an organizational strategic planning example. Additional benefits of this analysis to evaluation projects are also discussed, supporting the overall utility of this supplemental technique to the standard concept mapping methodology. Copyright © 2014 Elsevier Ltd. All rights reserved.
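One way to quantify tie strength between concept-map clusters is sketched below on a toy co-sorting similarity matrix. The matrix, cluster labels, and the mean-similarity measure are illustrative; the paper's own network measures, including its treatment of directionality, may differ.

```python
import numpy as np

# Toy point-to-point similarity matrix from a card-sorting exercise
# (entry = fraction of participants who sorted two statements together).
S = np.array([
    [1.0, 0.9, 0.8, 0.1, 0.2],
    [0.9, 1.0, 0.7, 0.2, 0.1],
    [0.8, 0.7, 1.0, 0.4, 0.3],
    [0.1, 0.2, 0.4, 1.0, 0.8],
    [0.2, 0.1, 0.3, 0.8, 1.0],
])
clusters = {"planning": [0, 1, 2], "evaluation": [3, 4]}

def tie_strength(S, a, b):
    """Mean statement-to-statement similarity between two clusters."""
    return S[np.ix_(a, b)].mean()

within = tie_strength(S, clusters["planning"], clusters["planning"])
between = tie_strength(S, clusters["planning"], clusters["evaluation"])
```

A well-formed cluster solution should show `within > between`; directional ties would require an asymmetric relation matrix in place of the symmetric co-sorting counts used here.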

  9. Map Archive Mining: Visual-Analytical Approaches to Explore Large Historical Map Collections

    Directory of Open Access Journals (Sweden)

    Johannes H. Uhl

    2018-04-01

    Full Text Available Historical maps are unique sources of retrospective geographical information. Recently, several map archives containing map series covering large spatial and temporal extents have been systematically scanned and made available to the public. The geographical information contained in such data archives makes it possible to extend geospatial analysis retrospectively beyond the era of digital cartography. However, given the large data volumes of such archives (e.g., more than 200,000 map sheets in the United States Geological Survey topographic map archive and the low graphical quality of older, manually-produced map sheets, the process to extract geographical information from these map archives needs to be automated to the highest degree possible. To understand the potential challenges (e.g., salient map characteristics and data quality variations in automating large-scale information extraction tasks for map archives, it is useful to efficiently assess spatio-temporal coverage, approximate map content, and spatial accuracy of georeferenced map sheets at different map scales. Such preliminary analytical steps are often neglected or ignored in the map processing literature but represent critical phases that lay the foundation for any subsequent computational processes including recognition. Exemplified for the United States Geological Survey topographic map and the Sanborn fire insurance map archives, we demonstrate how such preliminary analyses can be systematically conducted using traditional analytical and cartographic techniques, as well as visual-analytical data mining tools originating from machine learning and data science.

  10. Branched polynomial covering maps

    DEFF Research Database (Denmark)

    Hansen, Vagn Lundsgaard

    2002-01-01

    A Weierstrass polynomial with multiple roots in certain points leads to a branched covering map. With this as the guiding example, we formally define and study the notion of a branched polynomial covering map. We shall prove that many finite covering maps are polynomial outside a discrete branch set. Particular studies are made of branched polynomial covering maps arising from Riemann surfaces and from knots in the 3-sphere. (C) 2001 Elsevier Science B.V. All rights reserved.

  11. From symplectic integrator to Poincare map: Spline expansion of a map generator in Cartesian coordinates

    International Nuclear Information System (INIS)

    Warnock, R.L.; Ellison, J.A.; Univ. of New Mexico, Albuquerque, NM

    1997-08-01

    Data from orbits of a symplectic integrator can be interpolated so as to construct an approximation to the generating function of a Poincare map. The time required to compute an orbit of the symplectic map induced by the generator can be much less than the time to follow the same orbit by symplectic integration. The construction has been carried out previously for full-turn maps of large particle accelerators, and a big saving in time (for instance a factor of 60) has been demonstrated. A shortcoming of the work to date arose from the use of canonical polar coordinates, which precluded map construction in small regions of phase space near coordinate singularities. This paper shows that Cartesian coordinates can also be used, thus avoiding singularities. The generator is represented in a basis of tensor product B-splines. Under weak conditions the spline expansion converges uniformly as the mesh is refined, approaching the exact generator of the Poincare map as defined by the symplectic integrator, in some parallelepiped of phase space centered at the origin
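The tabulate-once-then-iterate idea can be illustrated in one dimension with plain linear interpolation. The actual construction interpolates a map generator in phase space with tensor-product B-splines; everything below, including the toy one-turn map, is a simplified stand-in.

```python
import numpy as np

# "Expensive" one-turn map: a stand-in for following one orbit turn by
# symplectic integration.
def one_turn(x):
    return x + 0.1 * np.sin(2.0 * np.pi * x)

# Tabulate the map once on a mesh, then iterate it via cheap interpolation.
mesh = np.linspace(0.0, 1.0, 201)
table = one_turn(mesh)

def one_turn_interp(x):
    return np.interp(x, mesh, table)

x_exact = x_interp = 0.31
for _ in range(100):
    x_exact = one_turn(x_exact)           # 100 "integrations"
    x_interp = one_turn_interp(x_interp)  # 100 table look-ups
# Both orbits approach the same stable fixed point at x = 0.5.
```

The time saving reported in the paper comes from exactly this trade: the expensive dynamics are evaluated only on the mesh, after which each turn costs one interpolation.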

  12. A regularized, model-based approach to phase-based conductivity mapping using MRI.

    Science.gov (United States)

    Ropella, Kathleen M; Noll, Douglas C

    2017-11-01

    To develop a novel regularized, model-based approach to phase-based conductivity mapping that uses structural information to improve the accuracy of conductivity maps. The inverse of the three-dimensional Laplacian operator is used to model the relationship between measured phase maps and the object conductivity in a penalized weighted least-squares optimization problem. Spatial masks based on structural information are incorporated into the problem to preserve data near boundaries. The proposed Inverse Laplacian method was compared against a restricted Gaussian filter in simulation, phantom, and human experiments. The Inverse Laplacian method resulted in lower reconstruction bias and error due to noise in simulations than the Gaussian filter. The Inverse Laplacian method also produced conductivity maps closer to the measured values in a phantom and with reduced noise in the human brain, as compared to the Gaussian filter. The Inverse Laplacian method calculates conductivity maps with less noise and more accurate values near boundaries. Improving the accuracy of conductivity maps is integral for advancing the applications of conductivity mapping. Magn Reson Med 78:2011-2021, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
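The penalized weighted least-squares formulation can be sketched in one dimension, with the inverse of a discrete Laplacian as the forward model and a spatial mask that exempts a known tissue boundary from the roughness penalty. The geometry, weights, and noise level are illustrative, not the MRI model.

```python
import numpy as np

n = 60
rng = np.random.default_rng(2)

# Forward model: the measured phase is the inverse discrete Laplacian
# applied to conductivity (a 1-D caricature of the phase-conductivity link).
L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))
A = np.linalg.inv(L)

sigma_true = np.where(np.arange(n) < n // 2, 0.5, 1.5)   # two "tissues"
phase = A @ sigma_true + 0.001 * rng.normal(size=n)

# Penalized weighted least squares with an edge-aware roughness penalty:
# the structural mask exempts the known tissue boundary from smoothing.
D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)             # first differences
mask = np.ones(n - 1)
mask[n // 2 - 1] = 0.0                                   # boundary unpenalized
lam = 1e-4
sigma_hat = np.linalg.solve(A.T @ A + lam * D.T @ (mask[:, None] * D),
                            A.T @ phase)
```

Because the true conductivity is piecewise constant and the only jump is masked out, the penalty introduces no bias here; with the mask removed, the reconstruction would blur the step, which is exactly the boundary error the masking is designed to prevent.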

  13. Labelling plants the Chernobyl way: A new approach for mapping rhizodeposition and biopore reuse

    Science.gov (United States)

    Banfield, Callum; Kuzyakov, Yakov

    2016-04-01

    A novel approach for mapping root distribution and rhizodeposition using 137Cs and 14C was applied. By immersing cut leaves into vials containing 137CsCl solution, the 137Cs label is taken up and partly released into the rhizosphere, where it strongly binds to soil particles, thus labelling the distribution of root channels in the long term. Reuse of root channels in crop rotations can be determined by labelling the first crop with 137Cs and the following crop with 14C. Imaging of the β⁻ radiation with strongly differing energies differentiates active roots growing in existing root channels (14C + 137Cs activity) from roots growing in bulk soil (14C activity only). The feasibility of the approach was shown in a pot experiment with ten plants of two species, Cichorium intybus L. and Medicago sativa L. The same plants were each labelled with 100 kBq of 137CsCl and, after one week, with 500 kBq of 14CO2. 96 h later pots were cut horizontally at 6 cm depth. After the first 137Cs + 14C imaging of the cut surface, imaging was repeated with three layers of plastic film between the cut surface and the plate for complete shielding of 14C β⁻ radiation to the background level, producing an image of the 137Cs distribution. Subtracting the second image from the first gave the 14C image. Both species allocated 18-22% of the 137Cs and about 30-40% of the 14C activity below ground. Intensities far above the detection limit suggest that this approach is applicable for mapping the root system by 137Cs and for obtaining root size distributions through image processing. The rhizosphere boundary was defined by the point at which rhizodeposited 14C activity declined to 5% of the activity of the root centre. Medicago showed 25% smaller rhizosphere extension than Cichorium, demonstrating that plant-specific rhizodeposition patterns can be distinguished. 
Our new approach is appropriate to visualise processes and hotspots on multiple scales: Heterogeneous rhizodeposition, as well as size and counts
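The two-exposure separation described above reduces to a simple image subtraction, sketched here with invented toy arrays: the first plate records both isotopes, the shielded re-exposure records 137Cs only, and the difference isolates the 14C signal; a channel is flagged as "reused" where both signals are present.

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64)
cs137 = np.zeros(shape); cs137[20:44, 30:34] = 5.0   # old root channel (137Cs)
c14   = np.zeros(shape); c14[10:54, 31:33] = 3.0     # active root (14C)

first_exposure  = cs137 + c14 + 0.05 * rng.random(shape)  # unshielded: both isotopes
second_exposure = cs137 + 0.05 * rng.random(shape)        # film shields weak 14C betas

c14_image = first_exposure - second_exposure              # rhizodeposition map
reuse_mask = (second_exposure > 1.0) & (c14_image > 1.0)  # old channel re-colonised
```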

  14. Concept Mapping as an Approach to Facilitate Participatory Intervention Building.

    Science.gov (United States)

    L Allen, Michele; Schaleben-Boateng, Dane; Davey, Cynthia S; Hang, Mikow; Pergament, Shannon

    2015-01-01

    A challenge to addressing community-defined need through community-based participatory intervention building is ensuring that all collaborators' opinions are represented. Concept mapping integrates the perspectives of individuals with differing experiences, interests, or expertise into a common visually depicted framework, and ranks composite views on importance and feasibility. To describe the use of concept mapping to facilitate participatory intervention building for a school-based, teacher-focused, positive youth development (PYD) promotion program for Latino, Hmong, and Somali youth. Participants were teachers, administrators, youth, parents, youth workers, and community and university researchers on the project's community collaborative board. We incorporated previously collected qualitative data into the process. In a mixed-methods process we 1) generated statements based on key informant interview and focus group data from youth workers, teachers, parents, and youth in multiple languages regarding ways teachers promote PYD for Somali, Latino and Hmong youth; 2) guided participants to individually sort statements into meaningful groupings and rate them by importance and feasibility; 3) mapped the statements based on their relation to each other using multivariate statistical analyses to identify concepts, and as a group identified labels for each concept; and 4) used labels and statement ratings to identify feasible and important concepts as priorities for intervention development. We identified 12 concepts related to PYD promotion in schools and prioritized 8 for intervention development. Concept mapping facilitated participatory intervention building by formally representing all participants' opinions, generating a visual representation of group thinking, and supporting priority setting. Use of prior qualitative work increased the diversity of viewpoints represented.
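The multivariate core of step 3 can be sketched as classical multidimensional scaling on a co-sorting similarity matrix (the sort data below is invented; pile labels are arbitrary per participant, and real concept-mapping software additionally clusters the resulting coordinates):

```python
import numpy as np

# One list per participant: pile id assigned to each of 6 statements.
sorts = [
    [0, 0, 1, 1, 2, 2],
    [1, 1, 0, 0, 2, 2],
    [0, 0, 2, 2, 1, 1],
]
n = len(sorts[0])
S = np.zeros((n, n))                       # co-sorting counts
for sort in sorts:
    for i in range(n):
        for j in range(n):
            S[i, j] += (sort[i] == sort[j])

D2 = (len(sorts) - S) ** 2                 # squared dissimilarities
J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
B = -0.5 * J @ D2 @ J                      # double-centred Gram matrix
vals, vecs = np.linalg.eigh(B)             # ascending eigenvalues
coords = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0))  # 2-D map

# Statements sorted together land close on the map.
d_same = np.linalg.norm(coords[0] - coords[1])   # always co-sorted
d_diff = np.linalg.norm(coords[0] - coords[5])   # never co-sorted
```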

  15. Defining European Wholesale Electricity Markets. An 'And/Or' Approach

    Energy Technology Data Exchange (ETDEWEB)

    Dijkgraaf, E. [Erasmus School of Economics, Erasmus University Rotterdam, Rotterdam (Netherlands); Janssen, M.C.W. [University of Vienna, Vienna (Austria)

    2009-09-15

    An important question in the dynamic European wholesale markets for electricity is whether to define the geographical market at the level of an individual member state or more broadly. We show that if we take the traditional approach, considering for each member state whether there is one single other country that provides a substitute for domestic production, then the market in each member state must still be considered a separate market. However, if we allow for the possibility that at different moments in time a different country provides a substitute for domestic production, then the conclusion is that certain member states do not constitute a separate geographical market. This is particularly true for Belgium, but also for The Netherlands and France, and to some extent for Germany and Austria. We call this alternative approach the 'and/or' approach.

  16. An automated approach for mapping persistent ice and snow cover over high latitude regions

    Science.gov (United States)

    Selkowitz, David J.; Forster, Richard R.

    2016-01-01

    We developed an automated approach for mapping persistent ice and snow cover (glaciers and perennial snowfields) from Landsat TM and ETM+ data across a variety of topography, glacier types, and climatic conditions at high latitudes (above ~65°N). Our approach exploits all available Landsat scenes acquired during the late summer (1 August–15 September) over a multi-year period and employs an automated cloud masking algorithm optimized for snow and ice covered mountainous environments. Pixels from individual Landsat scenes were classified as snow/ice covered or snow/ice free based on the Normalized Difference Snow Index (NDSI), and pixels consistently identified as snow/ice covered over a five-year period were classified as persistent ice and snow cover. The same NDSI and ratio of snow/ice-covered days to total days thresholds applied consistently across eight study regions resulted in persistent ice and snow cover maps that agreed closely in most areas with glacier area mapped for the Randolph Glacier Inventory (RGI), with a mean accuracy (agreement with the RGI) of 0.96, a mean precision (user’s accuracy of the snow/ice cover class) of 0.92, a mean recall (producer’s accuracy of the snow/ice cover class) of 0.86, and a mean F-score (a measure that considers both precision and recall) of 0.88. We also compared results from our approach to glacier area mapped from high spatial resolution imagery at four study regions and found similar results. Accuracy was lowest in regions with substantial areas of debris-covered glacier ice, suggesting that manual editing would still be required in these regions to achieve reasonable results. The similarity of our results to those from the RGI as well as glacier area mapped from high spatial resolution imagery suggests it should be possible to apply this approach across large regions to produce updated 30-m resolution maps of persistent ice and snow cover. In the short term, automated PISC maps can be used to rapidly
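The per-pixel rule described above can be sketched as follows (reflectance values and thresholds are illustrative, not the study's calibrated values): compute the NDSI per scene, threshold it, and classify a pixel as persistent ice/snow when it is snow/ice covered in a sufficient fraction of the clear-sky late-summer observations.

```python
import numpy as np

green = np.array([[0.6, 0.5], [0.2, 0.1]])   # per-pixel green reflectance, one scene
swir  = np.array([[0.1, 0.1], [0.3, 0.2]])   # shortwave-infrared reflectance

def ndsi_snow(green, swir, thresh=0.4):
    """Snow/ice mask from NDSI = (green - swir) / (green + swir)."""
    ndsi = (green - swir) / (green + swir)
    return ndsi > thresh

# Stack of boolean snow/ice masks over several acquisitions
masks = np.stack([ndsi_snow(green, swir),
                  ndsi_snow(green * 0.9, swir),
                  ndsi_snow(green * 0.3, swir)])   # e.g. a late-season melt scene
fraction = masks.mean(axis=0)                       # snow/ice days / total days
persistent = fraction >= 0.5                        # snow/ice in at least half of scenes
```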

  17. Tree Cover Mapping Tool—Documentation and user manual

    Science.gov (United States)

    Cotillon, Suzanne E.; Mathis, Melissa L.

    2016-06-02

    The Tree Cover Mapping (TCM) tool was developed by scientists at the U.S. Geological Survey Earth Resources Observation and Science Center to allow a user to quickly map tree cover density over large areas using visual interpretation of high resolution imagery within a geographic information system interface. The TCM tool uses a systematic sample grid to produce maps of tree cover. The TCM tool allows the user to define sampling parameters to estimate tree cover within each sample unit. This mapping method generated the first on-farm tree cover maps of vast regions of Niger and Burkina Faso. The approach contributes to implementing integrated landscape management to scale up re-greening and restore degraded land in the drylands of Africa. The TCM tool is easy to operate, practical, and can be adapted to many other applications such as crop mapping, settlements mapping, or other features. This user manual provides step-by-step instructions for installing and using the tool, and creating tree cover maps. Familiarity with ArcMap tools and concepts is helpful for using the tool.

  18. Conversion efficiency of implanted ions by confocal micro-luminescence mapping

    International Nuclear Information System (INIS)

    Deshko, Y.; Huang, Mengbing; Gorokhovsky, A.A.

    2013-01-01

    We report on the further development of the statistical approach to determine the conversion efficiency of implanted ions into emitting centers and present the measurement method based on the confocal micro-luminescence mapping. It involves the micro-luminescence mapping with a narrow-open confocal aperture, followed by the statistical analysis of the photoluminescence signal from an ensemble of emitting centers. The confocal mapping method has two important advantages compared to the recently discussed aperture-free method (J. Lumin. 131 (2011) 489): it is less sensitive to errors in the laser spot size and has a well defined useful area. The confocal mapping has been applied to the Xe center in diamond. The conversion efficiency has been found to be about 0.28, which is in good agreement with the results of the aperture-free method.

  19. Fast and accurate approaches for large-scale, automated mapping of food diaries on food composition tables

    DEFF Research Database (Denmark)

    Lamarine, Marc; Hager, Jörg; Saris, Wim H M

    2018-01-01

    the EuroFIR resource. Two approaches were tested: the first was based solely on food name similarity (fuzzy matching). The second used a machine learning approach (C5.0 classifier) combining both fuzzy matching and food energy. We tested mapping food items using their original names and also an English… not lead to any improvements compared to the fuzzy matching. However, it could increase substantially the recall rate for food items without any clear equivalent in the FCTs (+7 and +20% when mapping items using their original or English-translated names). Our approaches have been implemented as R packages… and are freely available from GitHub. Conclusion: This study is the first to provide automated approaches for large-scale food item mapping onto FCTs. We demonstrate that both high precision and recall can be achieved. Our solutions can be used with any FCT and do not require any programming background…
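The name-similarity step can be illustrated with the standard library's `difflib` (the authors' tools are R packages; the food names and cutoff below are invented examples of the general fuzzy-matching idea):

```python
import difflib

# Tiny stand-in for a food composition table (FCT)
fct = ["bread, whole wheat", "bread, white", "milk, semi-skimmed",
       "apple, raw", "cheese, cheddar"]

def map_item(diary_entry, table, cutoff=0.4):
    """Return the closest FCT entry by string similarity, or None."""
    hits = difflib.get_close_matches(diary_entry.lower(), table, n=1, cutoff=cutoff)
    return hits[0] if hits else None

match = map_item("Whole wheat bread", fct)
```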

  20. A new approach to the statistical treatment of 2D-maps in proteomics using fuzzy logic.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Gianotti, Valentina; Righetti, Pier Giorgio

    2003-01-01

    A new approach to the statistical treatment of 2D-maps has been developed. The method is based on fuzzy logic and takes into account the typically low reproducibility of 2D-maps. In this approach, the signal corresponding to the presence of proteins on the 2D-maps is replaced with probability functions centred on the signal itself. The standard deviations of the bidimensional Gaussian probability function employed to blur the signal make it possible to assign different uncertainties to the two electrophoretic dimensions. The effects of changing the standard deviation and the digitalisation resolution are investigated.
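The blurring step can be sketched as follows (spot positions and the two standard deviations are invented): each detected spot is replaced by a bivariate Gaussian with independent standard deviations for the two electrophoretic dimensions, so the less reproducible axis receives the larger uncertainty.

```python
import numpy as np

shape = (100, 100)
yy, xx = np.mgrid[0:shape[0], 0:shape[1]]

def fuzzy_map(spots, sigma_x=2.0, sigma_y=5.0):
    """Replace point spots with anisotropic Gaussian probability bumps."""
    m = np.zeros(shape)
    for (y, x) in spots:
        m += np.exp(-((xx - x) ** 2 / (2 * sigma_x ** 2)
                      + (yy - y) ** 2 / (2 * sigma_y ** 2)))
    return m

m = fuzzy_map([(50, 50), (20, 80)])
```

The anisotropy shows up directly: moving six pixels along the "uncertain" axis keeps most of the probability, while the same shift along the tighter axis loses nearly all of it.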

  1. Quantifying Spatial Variation in Ecosystem Services Demand : A Global Mapping Approach

    NARCIS (Netherlands)

    Wolff, S.; Schulp, C. J E; Kastner, T.; Verburg, P. H.

    2017-01-01

    Understanding the spatial-temporal variability in ecosystem services (ES) demand can help anticipate externalities of land use change. This study presents new operational approaches to quantify and map demand for three non-commodity ES on a global scale: animal pollination, wild medicinal plants and

  2. User Experience Design in Professional Map-Based Geo-Portals

    Directory of Open Access Journals (Sweden)

    Bastian Zimmer

    2013-10-01

    Full Text Available We have recently been witnessing the growing establishment of map-centered web-based geo-portals on national, regional and local levels. However, a particular issue with these geo-portals is that each instance has been implemented differently in terms of design, usability, functionality, interaction possibilities, map size and symbologies. In this paper, we tackle these shortcomings by analyzing and formalizing the requirements for map-based geo-portals in a user-experience-based approach. First, we propose a holistic definition of the term “geo-portal”. Then, we present our approach to user experience design for map-based geo-portals by defining the functional requirements of a geo-portal, by analyzing previous geo-portal developments, by distilling the results of our empirical user study into practically oriented user requirements, and finally by establishing a set of user experience design guidelines for the creation of map-based geo-portals. These design guidelines have been extracted for each of the main components of a geo-portal, i.e., the map, the search dialogue, the presentation of the search results, symbologies, and other aspects. These guidelines shall constitute the basis for future geo-portal developments to achieve standardization in the user-experience design of map-based geo-portals.

  3. ADVANCED EARTH OBSERVATION APPROACH FOR MULTISCALE FOREST ECOSYSTEM SERVICES MODELING AND MAPPING (MIMOSE

    Directory of Open Access Journals (Sweden)

    G. Chirici

    2014-04-01

    Full Text Available In the last decade, ecosystem services (ES) have been proposed as a method for quantifying the multifunctional role of forest ecosystems. Their spatial distribution over large areas is frequently limited by the lack of information, because field data collection with traditional methods requires much effort in terms of time and cost. In this contribution we propose a methodology (namely, MultIscale Mapping Of ecoSystem servicEs - MIMOSE) based on the integration of remotely sensed images and field observations to produce a wall-to-wall geodatabase of forest parcels accompanied by information useful as a basis for future trade-off analysis of different ES. Here, we present the application of the MIMOSE approach to a study area of 443,758 hectares coincident with the administrative region of Molise in Central Italy. The procedure is based on a local high-resolution forest types map integrated with information on the main forest management approaches. Through the non-parametric k-Nearest Neighbors technique, we produced a growing stock volume map integrating a local forest inventory with multispectral satellite IRS LISS III imagery. From the growing stock volume map we derived a forest age map for even-aged forest types. This information was later used to automatically create a vector forest parcel map by multidimensional image segmentation, which was finally populated with a number of attributes useful for ES spatial estimation. This contribution briefly introduces the MIMOSE methodology and presents the preliminary results we achieved, which constitute the basis for a future implementation of ES modeling.
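The k-NN imputation step can be sketched with plain NumPy (the spectral bands, plot volumes, and k below are invented): each pixel's growing stock volume is estimated as the mean of the k inventory plots nearest in spectral space.

```python
import numpy as np

rng = np.random.default_rng(2)
plots_x = rng.random((50, 3))                    # spectral bands at inventory plots
plots_y = 100 + 200 * plots_x[:, 0] + 10 * rng.standard_normal(50)  # volume, m3/ha

def knn_predict(pixels, ref_x, ref_y, k=5):
    """Unweighted k-NN regression in spectral feature space."""
    # Euclidean distance from each pixel to every reference plot
    d = np.linalg.norm(pixels[:, None, :] - ref_x[None, :, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    return ref_y[nearest].mean(axis=1)

pixels = rng.random((4, 3))                      # four pixels to impute
volume = knn_predict(pixels, plots_x, plots_y)
```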

  4. Defining ‘sensitive’ health status: a systematic approach using health code terminologies.

    Directory of Open Access Journals (Sweden)

    Andy Boyd

    2017-04-01

    We have demonstrated a systematic and partially interoperable approach to defining ‘sensitive’ health information. However, any such exercise is likely to include decisions which will be open to interpretation and open to change over time. As such, the application of this technique should be embedded within an appropriate governance framework which can accommodate misclassification while minimising potential patient harm.

  5. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
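A minimal sketch of the if-trigger-then-action idea (the class names, event fields, and example policy are invented, not the authors' notation): a storage system evaluates each file event against a list of trigger predicates and runs the matching actions.

```python
# Toy if-trigger-then-action rule engine
class Rule:
    def __init__(self, trigger, action):
        self.trigger = trigger      # predicate over an event dict
        self.action = action        # callable run when the trigger fires

class Storage:
    def __init__(self, rules):
        self.rules = rules
        self.log = []

    def on_event(self, event):
        for rule in self.rules:
            if rule.trigger(event):
                rule.action(event, self)

# Example policy: when a new .h5 file is created, queue it for indexing
rules = [Rule(trigger=lambda e: e["type"] == "create" and e["path"].endswith(".h5"),
              action=lambda e, s: s.log.append(("index", e["path"])))]
store = Storage(rules)
store.on_event({"type": "create", "path": "/data/run42.h5"})
store.on_event({"type": "create", "path": "/data/notes.txt"})   # no rule fires
```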

  6. Deep Mapping and Spatial Anthropology

    Directory of Open Access Journals (Sweden)

    Les Roberts

    2016-01-01

    Full Text Available This paper provides an introduction to the Humanities Special Issue on “Deep Mapping”. It sets out the rationale for the collection and explores the broad-ranging nature of perspectives and practices that fall within the “undisciplined” interdisciplinary domain of spatial humanities. Sketching a cross-current of ideas that have begun to coalesce around the concept of “deep mapping”, the paper argues that rather than attempting to outline a set of defining characteristics and “deep” cartographic features, a more instructive approach is to pay closer attention to the multivalent ways deep mapping is performatively put to work. Casting a critical and reflexive gaze over the developing discourse of deep mapping, it is argued that what deep mapping “is” cannot be reduced to the otherwise a-spatial and a-temporal fixity of the “deep map”. In this respect, as an undisciplined survey of this increasing expansive field of study and practice, the paper explores the ways in which deep mapping can engage broader discussion around questions of spatial anthropology.

  7. Symplectic Maps from Cluster Algebras

    Directory of Open Access Journals (Sweden)

    Allan P. Fordy

    2011-09-01

    Full Text Available We consider nonlinear recurrences generated from the iteration of maps that arise from cluster algebras. More precisely, starting from a skew-symmetric integer matrix, or its corresponding quiver, one can define a set of mutation operations, as well as a set of associated cluster mutations that are applied to a set of affine coordinates (the cluster variables. Fordy and Marsh recently provided a complete classification of all such quivers that have a certain periodicity property under sequences of mutations. This periodicity implies that a suitable sequence of cluster mutations is precisely equivalent to iteration of a nonlinear recurrence relation. Here we explain briefly how to introduce a symplectic structure in this setting, which is preserved by a corresponding birational map (possibly on a space of lower dimension. We give examples of both integrable and non-integrable maps that arise from this construction. We use algebraic entropy as an approach to classifying integrable cases. The degrees of the iterates satisfy a tropical version of the map.
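A concrete example of a nonlinear recurrence of cluster-algebra type is Somos-4, x(n+4)·x(n) = x(n+3)·x(n+1) + x(n+2)², which arises from mutations of a periodic quiver; despite the division at every step, each iterate is an integer (the Laurent phenomenon). A quick check with exact rational arithmetic:

```python
from fractions import Fraction

# Iterate Somos-4 from initial data (1, 1, 1, 1) with exact arithmetic.
x = [Fraction(1)] * 4
for n in range(20):
    x.append((x[-1] * x[-3] + x[-2] ** 2) / x[-4])

# Every denominator cancels: the sequence is integral.
integers = all(v.denominator == 1 for v in x)
```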

  8. Non-animal methods to predict skin sensitization (II): an assessment of defined approaches *.

    Science.gov (United States)

    Kleinstreuer, Nicole C; Hoffmann, Sebastian; Alépée, Nathalie; Allen, David; Ashikaga, Takao; Casey, Warren; Clouet, Elodie; Cluzel, Magalie; Desprez, Bertrand; Gellatly, Nichola; Göbel, Carsten; Kern, Petra S; Klaric, Martina; Kühnl, Jochen; Martinozzi-Teissier, Silvia; Mewes, Karsten; Miyazawa, Masaaki; Strickland, Judy; van Vliet, Erwin; Zang, Qingda; Petersohn, Dirk

    2018-05-01

    Skin sensitization is a toxicity endpoint of widespread concern, for which the mechanistic understanding and concurrent necessity for non-animal testing approaches have evolved to a critical juncture, with many available options for predicting sensitization without using animals. Cosmetics Europe and the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods collaborated to analyze the performance of multiple non-animal data integration approaches for the skin sensitization safety assessment of cosmetics ingredients. The Cosmetics Europe Skin Tolerance Task Force (STTF) collected and generated data on 128 substances in multiple in vitro and in chemico skin sensitization assays selected based on a systematic assessment by the STTF. These assays, together with certain in silico predictions, are key components of various non-animal testing strategies that have been submitted to the Organization for Economic Cooperation and Development as case studies for skin sensitization. Curated murine local lymph node assay (LLNA) and human skin sensitization data were used to evaluate the performance of six defined approaches, comprising eight non-animal testing strategies, for both hazard and potency characterization. Defined approaches examined included consensus methods, artificial neural networks, support vector machine models, Bayesian networks, and decision trees, most of which were reproduced using open source software tools. Multiple non-animal testing strategies incorporating in vitro, in chemico, and in silico inputs demonstrated equivalent or superior performance to the LLNA when compared to both animal and human data for skin sensitization.
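The simplest of the consensus-style defined approaches can be sketched as a fixed majority vote over three binary assay calls (the assay names in the comment are examples of commonly used information sources; the rule below is a generic illustration, not any specific submitted strategy):

```python
def consensus_call(calls):
    """Majority vote over binary sensitizer calls (True = sensitizer),
    e.g. from three assays such as DPRA, KeratinoSens and h-CLAT.
    No statistical fitting: the data interpretation procedure is fixed."""
    return sum(calls) >= 2

pred = consensus_call([True, True, False])
```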

  9. Mapping radon-prone areas - a geophysical approach

    Energy Technology Data Exchange (ETDEWEB)

    Shirav, M. [Geological Survey of Israel, Jerusalem (Israel); Vulkan, U. [Soreq Nuclear Research Center, Yavne (Israel)

    1997-06-01

    Radon-prone areas in Israel were mapped on the basis of direct measurements of radon ({sup 222}Rn) in the soil/rock gas of all exposed geological units, supported by the accumulated knowledge of local stratigraphy and sub-surface geology. Measurements were carried out by a modified alpha-track detection system, resulting in high radon levels mainly in rocks of the Senonian-Paleocene-aged Mount Scopus Group, comprised of chert-bearing marly chalks, rich in phosphorite which acts as the major uranium source. Issues of source depth, seasonal variations and comparison with indoor radon levels are addressed as well. This approach could be applied to other similar terrains, especially the Mediterranean Phosphate Belt. (orig.)

  10. Mapping radon-prone areas - a geophysical approach

    International Nuclear Information System (INIS)

    Shirav, M.; Vulkan, U.

    1997-01-01

    Radon-prone areas in Israel were mapped on the basis of direct measurements of radon ( 222 Rn) in the soil/rock gas of all exposed geological units, supported by the accumulated knowledge of local stratigraphy and sub-surface geology. Measurements were carried out by a modified alpha-track detection system, resulting in high radon levels mainly in rocks of the Senonian-Paleocene-aged Mount Scopus Group, comprised of chert-bearing marly chalks, rich in phosphorite which acts as the major uranium source. Issues of source depth, seasonal variations and comparison with indoor radon levels are addressed as well. This approach could be applied to other similar terrains, especially the Mediterranean Phosphate Belt. (orig.)

  11. CRISM Multispectral and Hyperspectral Mapping Data - A Global Data Set for Hydrated Mineral Mapping

    Science.gov (United States)

    Seelos, F. P.; Hash, C. D.; Murchie, S. L.; Lim, H.

    2017-12-01

    The Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) is a visible through short-wave infrared hyperspectral imaging spectrometer (VNIR S-detector: 364-1055 nm; IR L-detector: 1001-3936 nm; 6.55 nm sampling) that has been in operation on the Mars Reconnaissance Orbiter (MRO) since 2006. Over the course of the MRO mission, CRISM has acquired ~290,000 individual mapping observation segments (mapping strips) with a variety of observing modes and data characteristics (VNIR/IR; 100/200 m/pxl; multi-/hyper-spectral band selection) over a wide range of observing conditions (atmospheric state, observation geometry, instrument state). CRISM mapping data coverage density varies primarily with latitude and secondarily due to seasonal and operational considerations. The aggregate global IR mapping data coverage currently stands at ~85% (~80% at the equator with ~40% repeat sampling), which is sufficient spatial sampling density to support the assembly of empirically optimized, radiometrically consistent mapping mosaic products. The CRISM project has defined a number of mapping mosaic data products (e.g. Multispectral Reduced Data Record (MRDR) map tiles) with varying degrees of observation-specific processing and correction applied prior to mosaic assembly. A commonality among the mosaic products is the presence of inter-observation radiometric discrepancies which are traceable to variable observation circumstances or associated atmospheric/photometric correction residuals. The empirical approach to radiometric reconciliation leverages inter-observation spatial overlaps and proximal relationships to construct a graph that encodes the mosaic structure and radiometric discrepancies. The graph theory abstraction allows the underlying structure of the mosaic to be evaluated and the corresponding optimization problem configured so it is well-posed. 
Linear and non-linear least squares optimization is then employed to derive a set of observation- and wavelength-specific model
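The linear core of such an overlap-graph reconciliation can be sketched as follows (the graph, discrepancies, and purely additive correction model are invented simplifications): each observation gets a correction c_i, each overlap (i, j) contributes an equation c_i − c_j = d_ij, and a gauge row fixing the mean correction makes the least-squares problem well posed.

```python
import numpy as np

n_obs = 4
# (i, j, measured radiometric discrepancy between observations i and j)
overlaps = [(0, 1, 0.10), (1, 2, -0.05), (2, 3, 0.02), (0, 2, 0.05)]

A = np.zeros((len(overlaps) + 1, n_obs))
b = np.zeros(len(overlaps) + 1)
for row, (i, j, d) in enumerate(overlaps):
    A[row, i], A[row, j], b[row] = 1.0, -1.0, d
A[-1, :] = 1.0                      # gauge constraint: corrections sum to zero

c, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = A[:-1] @ c - b[:-1]      # remaining overlap discrepancies
```

With real data the overlap equations are inconsistent and the least-squares solution spreads the residual discrepancy over the whole graph; here the toy data are consistent, so the residual vanishes.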

  12. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    Science.gov (United States)

    Dhindsa, Harkirat S.; Makarimi-Kasim; Anderson, O. Roger

    2011-01-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample…

  13. Progress in national-scale landslide susceptibility mapping in Romania using a combined statistical-heuristical approach

    Science.gov (United States)

    Bălteanu, Dan; Micu, Mihai; Malet, Jean-Philippe; Jurchescu, Marta; Sima, Mihaela; Kucsicsa, Gheorghe; Dumitrică, Cristina; Petrea, Dănuţ; Mărgărint, Ciprian; Bilaşco, Ştefan; Văcăreanu, Radu; Georgescu, Sever; Senzaconi, Francisc

    2017-04-01

    Landslide processes represent a very widespread geohazard in Romania, affecting mainly the hilly and plateau regions as well as the mountain sectors developed on flysch formations. Two main projects provided the framework for improving the existing national landslide susceptibility map (Bălteanu et al. 2010): the ELSUS (Pan-European and nation-wide landslide susceptibility assessment, EC-CERG) and the RO-RISK (Disaster Risk Evaluation at National Level, ESF-POCA) projects. The latter, a flagship project aiming at strengthening risk prevention and management in Romania, focused on a national-level evaluation of the main risks in the country, including landslides. The strategy for modeling landslide susceptibility was designed based on the experience gained from continental and national level assessments conducted in the frame of the International Programme on Landslides (IPL) project IPL-162, the European Landslides Expert Group - JRC and the ELSUS project. The newly proposed landslide susceptibility model used as input a reduced set of landslide conditioning factor maps available at scales of 1:100,000 - 1:200,000 and consisting of lithology, slope angle and land cover. The input data were further differentiated for specific natural environments, defined here as morpho-structural units, in order to incorporate differences induced by elevation (vertical climatic zonation), morpho-structure and neotectonic features. In order to best discern the specific landslide conditioning elements, the analysis was carried out for a single process category, namely slides. The existence of a landslide inventory covering the whole country's territory (~30,000 records, Micu et al. 2014), although affected by incompleteness and lack of homogeneity, allowed for the application of a semi-quantitative, mixed statistical-heuristical approach having the advantage of combining the objectivity of statistics with expert knowledge in calibrating class and factor weights. The
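The heuristic half of such a mixed approach amounts to a weighted overlay: per-pixel class scores for each conditioning factor are combined with expert-calibrated factor weights. A toy sketch (all scores and weights below are invented, not the study's calibrated values):

```python
import numpy as np

# Per-pixel susceptibility scores (0..1) assigned to the classes of each
# conditioning factor, here on a tiny 2x2 grid
lithology = np.array([[0.8, 0.2], [0.6, 0.4]])
slope     = np.array([[0.9, 0.1], [0.5, 0.5]])
landcover = np.array([[0.7, 0.3], [0.7, 0.2]])

# Expert-calibrated factor weights (sum to 1)
weights = {"lithology": 0.4, "slope": 0.4, "landcover": 0.2}

susceptibility = (weights["lithology"] * lithology
                  + weights["slope"] * slope
                  + weights["landcover"] * landcover)
```

In the statistical half, class scores and weights would instead be derived from the landslide inventory (e.g. by frequency-ratio analysis) and then adjusted by expert judgement.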

  14. A filtering approach to edge preserving MAP estimation of images.

    Science.gov (United States)

    Humphrey, David; Taubman, David

    2011-05-01

    The authors present a computationally efficient technique for maximum a posteriori (MAP) estimation of images in the presence of both blur and noise. The image is divided into statistically independent regions. Each region is modelled with a WSS Gaussian prior. Classical Wiener filter theory is used to generate a set of convex sets in the solution space, with the solution to the MAP estimation problem lying at the intersection of these sets. The proposed algorithm uses an underlying segmentation of the image, and a means of determining the segmentation and refining it are described. The algorithm is suitable for a range of image restoration problems, as it provides a computationally efficient means to deal with the shortcomings of Wiener filtering without sacrificing the computational simplicity of the filtering approach. The algorithm is also of interest from a theoretical viewpoint as it provides a continuum of solutions between Wiener filtering and Inverse filtering depending upon the segmentation used. We do not attempt to show here that the proposed method is the best general approach to the image reconstruction problem. However, related work referenced herein shows excellent performance in the specific problem of demosaicing.
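The Wiener filtering building block that the region-wise MAP method generalises can be sketched in one dimension (signal, blur kernel, and noise-to-signal ratio below are invented): restore x from y = h * x + noise via X̂ = H*·Y / (|H|² + nsr).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 256
t = np.arange(n)
x = np.sin(2 * np.pi * t / 32) + 0.5 * np.sin(2 * np.pi * t / 16)  # toy signal

h = np.zeros(n); h[:5] = 1.0 / 5.0        # moving-average blur kernel
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))
y += 0.01 * rng.standard_normal(n)        # blurred, noisy observation

H = np.fft.fft(h)
nsr = 1e-3                                # noise-to-signal power ratio (regularizer)
X_hat = np.conj(H) * np.fft.fft(y) / (np.abs(H) ** 2 + nsr)
x_hat = np.real(np.fft.ifft(X_hat))

err_blurred  = np.mean((y - x) ** 2)
err_restored = np.mean((x_hat - x) ** 2)
```

As nsr → 0 this tends to the inverse filter, and large nsr suppresses restoration entirely; the continuum of solutions mentioned in the abstract interpolates between these regimes region by region.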

  15. Timed bisimulation and open maps

    DEFF Research Database (Denmark)

    Hune, Thomas; Nielsen, Mogens

    1998-01-01

    Open maps have been used for defining bisimulations for a range of models, but none of these have modelled real-time. We define a category of timed transition systems, and use the general framework of open maps to obtain a notion of bisimulation. We show this to be equivalent to the standard notion of timed bisimulation. Thus the abstract results from the theory of open maps apply, e.g. the existence of canonical models and characteristic logics. Here, we provide an alternative proof of decidability of bisimulation for finite timed transition systems in terms of open maps, and illustrate the use of open maps in presenting bisimulations.

  16. An Image Encryption Approach Using a Shuffling Map

    International Nuclear Information System (INIS)

    Xiao Yongliang; Xia Limin

    2009-01-01

A new image encryption approach is proposed. First, a sort transformation based on a nonlinear chaotic algorithm is used to shuffle the positions of image pixels. Then the states of hyper-chaos are used to change the grey values of the shuffled image, according to the difference in chaotic values at the same position between the original nonlinear chaotic sequence and the sorted chaotic sequence. The experimental results demonstrate that the image encryption scheme based on a shuffling map offers a large key space and high-level security. Compared with some other encryption algorithms, the suggested scheme is more secure. (general)
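The paper's exact chaotic sequences are not reproduced in this summary, but the position-shuffling stage (the "sort transformation") can be sketched with the logistic map; the key values below are purely illustrative and the sketch covers only the permutation step, not the grey-value substitution.

```python
import numpy as np

def logistic_sequence(x0, r, n):
    """Iterate the logistic map x -> r*x*(1-x) to produce a chaotic sequence."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def shuffle_pixels(image, key=(0.37, 3.99)):
    """Permute pixel positions by sorting a chaotic sequence; the key
    (initial value, control parameter) is an illustrative assumption."""
    flat = image.ravel()
    perm = np.argsort(logistic_sequence(*key, flat.size))
    return flat[perm].reshape(image.shape), perm

def unshuffle_pixels(shuffled, perm):
    """Invert the permutation to recover the original pixel positions."""
    flat = np.empty_like(shuffled.ravel())
    flat[perm] = shuffled.ravel()
    return flat.reshape(shuffled.shape)
```

Because the permutation is fully determined by the key, only the key needs to be shared for decryption.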

  17. A zeta function approach to the semiclassical quantization of maps

    International Nuclear Information System (INIS)

    Smilansky, Uzi.

    1993-11-01

The quantum analogue of an area preserving map on a compact phase space is a unitary (evolution) operator which can be represented by a matrix of dimension L ∝ ℏ⁻¹. The semiclassical theory for the spectrum of the evolution operator will be reviewed with special emphasis on developing a dynamical zeta function approach, similar to the one introduced recently for the semiclassical quantization of Hamiltonian systems. (author)

  18. Participatory Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    2016-01-01

In particular, mapping environmental damage, endangered species, and human-made disasters has become one focal point for environmental knowledge production in journalistic practice. This type of digital map has been highlighted as a processual turn in critical cartography, whereas in related computational journalism… of a geo-visualization within information mapping that enhances embodiment in the experience of the information. InfoAmazonia is defined as a digitally created map-space within which journalistic practice can be seen as dynamic, performative interactions between journalists, ecosystems, space, and species…

  19. Off-line mapping of multi-rate dependent task sets to many-core platforms

    DEFF Research Database (Denmark)

    Puffitsch, Wolfgang; Noulard, Eric; Pagetti, Claire

    2015-01-01

This paper presents an approach to execute safety-critical applications on multi- and many-core processors in a predictable manner. We investigate three concrete platforms: the Intel Single-chip Cloud Computer, the Texas Instruments TMS320C6678 and the Tilera TILEmpower-Gx36. We define an execution model to safely execute dependent periodic task sets on these platforms. The four rules of the execution model entail that an off-line mapping of the application to the platform must be computed. The paper details our approach to automatically compute a valid mapping. Furthermore, we evaluate our…

  20. Towards the XML schema measurement based on mapping between XML and OO domain

    Science.gov (United States)

    Rakić, Gordana; Budimac, Zoran; Heričko, Marjan; Pušnik, Maja

    2017-07-01

Measuring the quality of IT solutions is a priority in software engineering. Although numerous metrics for measuring object-oriented code already exist, measuring the quality of UML models or XML Schemas is still developing. One of the research questions in the overall research guided by the ideas described in this paper is whether already defined object-oriented design metrics can be applied to XML schemas based on predefined mappings. In this paper, basic ideas for the mentioned mapping are presented. This mapping is a prerequisite for a future approach to XML schema quality measurement with object-oriented metrics.

  1. Global land cover mapping at 30 m resolution: A POK-based operational approach

    Science.gov (United States)

    Chen, Jun; Chen, Jin; Liao, Anping; Cao, Xin; Chen, Lijun; Chen, Xuehong; He, Chaoying; Han, Gang; Peng, Shu; Lu, Miao; Zhang, Weiwei; Tong, Xiaohua; Mills, Jon

    2015-05-01

Global Land Cover (GLC) information is fundamental for environmental change studies, land resource management, sustainable development, and many other societal benefits. Although GLC data exist at spatial resolutions of 300 m and 1000 m, a 30 m resolution mapping approach is now a feasible option for the next generation of GLC products. Since most significant human impacts on the land system can be captured at this scale, a number of researchers are focusing on such products. This paper reports the operational approach used in such a project, which aims to deliver reliable data products. Over 10,000 Landsat-like satellite images are required to cover the entire Earth at 30 m resolution. To derive a GLC map from such a large volume of data necessitates the development of effective, efficient, economic and operational approaches. Automated approaches usually provide higher efficiency and thus more economic solutions, yet existing automated classification has been deemed ineffective because of the low classification accuracy achievable (typically below 65%) at global scale at 30 m resolution. As a result, an approach based on the integration of pixel- and object-based methods with knowledge (POK-based) has been developed. To handle the classification process of 10 land cover types, a split-and-merge strategy was employed: each class is first identified in a prioritized sequence, and the results are then merged together. For the identification of each class, a robust integration of pixel- and object-based classification was developed. To improve the quality of the classification results, a knowledge-based interactive verification procedure was developed with the support of web service technology. The performance of the POK-based approach was tested using eight selected areas with differing landscapes from five different continents. An overall classification accuracy of over 80% was achieved. This indicates that the developed POK-based approach is effective and feasible.
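The split-and-merge strategy described above, identifying each class in a prioritized sequence over still-unlabelled pixels and then merging the results, can be sketched as follows. The per-class decision functions are hypothetical stand-ins for the paper's pixel- and object-based classifiers.

```python
import numpy as np

def split_and_merge(classifiers, features):
    """Prioritized per-class classification.

    classifiers : list of (class_id, decision_fn) in priority order, where
                  decision_fn(features) returns a boolean mask per sample
    features    : (n_samples, ...) array of per-pixel features
    Returns integer labels; 0 marks pixels no classifier claimed.
    """
    labels = np.zeros(features.shape[0], dtype=int)  # 0 = unclassified
    for class_id, decide in classifiers:
        # Each class only claims pixels that earlier (higher-priority)
        # classes have not already taken; merging is implicit.
        mask = (labels == 0) & decide(features)
        labels[mask] = class_id
    return labels
```
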

  2. Defining biological assemblages (biotopes) of conservation interest in the submarine canyons of the South West Approaches (offshore United Kingdom) for use in marine habitat mapping

    Science.gov (United States)

    Davies, Jaime S.; Howell, Kerry L.; Stewart, Heather A.; Guinan, Janine; Golding, Neil

    2014-06-01

    In 2007, the upper part of a submarine canyon system located in water depths between 138 and 1165 m in the South West (SW) Approaches (North East Atlantic Ocean) was surveyed over a 2 week period. High-resolution multibeam echosounder data covering 1106 km2, and 44 ground-truthing video and image transects were acquired to characterise the biological assemblages of the canyons. The SW Approaches is an area of complex terrain, and intensive ground-truthing revealed the canyons to be dominated by soft sediment assemblages. A combination of multivariate analysis of seabed photographs (184-1059 m) and visual assessment of video ground-truthing identified 12 megabenthic assemblages (biotopes) at an appropriate scale to act as mapping units. Of these biotopes, 5 adhered to current definitions of habitats of conservation concern, 4 of which were classed as Vulnerable Marine Ecosystems. Some of the biotopes correspond to descriptions of communities from other megahabitat features (for example the continental shelf and seamounts), although it appears that the canyons host modified versions, possibly due to the inferred high rates of sedimentation in the canyons. Other biotopes described appear to be unique to canyon features, particularly the sea pen biotope consisting of Kophobelemnon stelliferum and cerianthids.

  3. Rapid Construction of Fe-Co-Ni Composition-Phase Map by Combinatorial Materials Chip Approach.

    Science.gov (United States)

    Xing, Hui; Zhao, Bingbing; Wang, Yujie; Zhang, Xiaoyi; Ren, Yang; Yan, Ningning; Gao, Tieren; Li, Jindong; Zhang, Lanting; Wang, Hong

    2018-03-12

One hundred nanometer thick Fe-Co-Ni material chips were prepared and isothermally annealed at 500, 600, and 700 °C, respectively. Pixel-by-pixel composition and structural mapping was performed by microbeam X-ray diffraction at a synchrotron light source. Diffraction images were recorded at a rate of 1 pattern/s. The XRD patterns were automatically processed, phase-identified, and categorized by a hierarchical clustering algorithm to construct the composition-phase map. The resulting maps are consistent with the corresponding isothermal sections reported in the ASM Alloy Phase Diagram Database, verifying the effectiveness of the present approach for phase diagram construction.
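The pattern-categorization step can be sketched with standard agglomerative clustering over the measured diffraction patterns. The cosine metric and average linkage below are assumptions for illustration, not necessarily the settings used in the study.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def cluster_patterns(patterns, n_phases):
    """Group 1-D diffraction patterns (one per row) into phase regions.

    patterns : (n_pixels, n_bins) array of intensity profiles
    n_phases : number of phase clusters to extract
    Returns a cluster label (1..n_phases) per pattern.
    """
    # Cosine distance compares peak positions regardless of overall intensity
    dists = pdist(patterns, metric="cosine")
    tree = linkage(dists, method="average")
    return fcluster(tree, t=n_phases, criterion="maxclust")
```

Mapping the returned labels back onto the chip's pixel grid yields the composition-phase map.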

  4. A simple method for combining genetic mapping data from multiple crosses and experimental designs.

    Directory of Open Access Journals (Sweden)

    Jeremy L Peirce

    Full Text Available BACKGROUND: Over the past decade many linkage studies have defined chromosomal intervals containing polymorphisms that modulate a variety of traits. Many phenotypes are now associated with enough mapping data that meta-analysis could help refine locations of known QTLs and detect many novel QTLs. METHODOLOGY/PRINCIPAL FINDINGS: We describe a simple approach to combining QTL mapping results for multiple studies and demonstrate its utility using two hippocampus weight loci. Using data taken from two populations, a recombinant inbred strain set and an advanced intercross population we demonstrate considerable improvements in significance and resolution for both loci. 1-LOD support intervals were improved 51% for Hipp1a and 37% for Hipp9a. We first generate locus-wise permuted P-values for association with the phenotype from multiple maps, which can be done using a permutation method appropriate to each population. These results are then assigned to defined physical positions by interpolation between markers with known physical and genetic positions. We then use Fisher's combination test to combine position-by-position probabilities among experiments. Finally, we calculate genome-wide combined P-values by generating locus-specific P-values for each permuted map for each experiment. These permuted maps are then sampled with replacement and combined. The distribution of best locus-specific P-values for each combined map is the null distribution of genome-wide adjusted P-values. CONCLUSIONS/SIGNIFICANCE: Our approach is applicable to a wide variety of segregating and non-segregating mapping populations, facilitates rapid refinement of physical QTL position, is complementary to other QTL fine mapping methods, and provides an appropriate genome-wide criterion of significance for combined mapping results.
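The core pooling step, Fisher's combination test over position-wise p-values from multiple experiments, can be sketched directly:

```python
import numpy as np
from scipy.stats import chi2

def fisher_combine(pvalues):
    """Fisher's method: under the null that the k p-values are independent
    uniforms, -2 * sum(log p) follows a chi-square with 2k degrees of
    freedom; returns the combined p-value."""
    p = np.asarray(pvalues, dtype=float)
    stat = -2.0 * np.log(p).sum()
    return chi2.sf(stat, df=2 * p.size)
```

Applied position by position along the interpolated physical map, this gives the combined evidence curve; the genome-wide significance of its peaks is then calibrated by the permutation resampling the abstract describes.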

  5. Deciphering the genomic architecture of the stickleback brain with a novel multilocus gene-mapping approach.

    Science.gov (United States)

    Li, Zitong; Guo, Baocheng; Yang, Jing; Herczeg, Gábor; Gonda, Abigél; Balázs, Gergely; Shikano, Takahito; Calboli, Federico C F; Merilä, Juha

    2017-03-01

Quantitative traits important to organismal function and fitness, such as brain size, are presumably controlled by many small-effect loci. Deciphering the genetic architecture of such traits with traditional quantitative trait locus (QTL) mapping methods is challenging. Here, we investigated the genetic architecture of brain size (and the size of five different brain parts) in nine-spined sticklebacks (Pungitius pungitius) with the aid of novel multilocus QTL-mapping approaches based on a de-biased LASSO method. Apart from having more statistical power to detect QTL and a reduced rate of false positives than conventional QTL-mapping approaches, the developed methods can handle large marker panels and provide estimates of genomic heritability. Single-locus analyses of an F2 interpopulation cross with 239 individuals and 15,198 fully informative single nucleotide polymorphisms (SNPs) uncovered 79 QTL associated with variation in stickleback brain size traits. Many of these loci were in strong linkage disequilibrium (LD) with each other, and consequently, a multilocus mapping of individual SNPs, accounting for LD structure in the data, recovered only four significant QTL. However, a multilocus mapping of SNPs grouped by linkage group (LG) identified 14 LGs (1-6 depending on the trait) that influence variation in brain traits. For instance, 17.6% of the variation in relative brain size was explainable by cumulative effects of SNPs distributed over six LGs, whereas 42% of the variation was accounted for by all 21 LGs. Hence, the results suggest that variation in stickleback brain traits is influenced by many small-effect loci. Apart from suggesting a moderately heritable (h² ≈ 0.15-0.42) multifactorial genetic architecture of brain traits, the results highlight the challenges in identifying the loci contributing to variation in quantitative traits. Nevertheless, the results demonstrate that the novel QTL-mapping approach developed here has distinctive advantages

  6. From a Reductionist to a Holistic Approach in Preventive Nutrition to Define New and More Ethical Paradigms.

    Science.gov (United States)

    Fardet, Anthony; Rock, Edmond

    2015-10-28

    This concept paper intends to define four new paradigms for improving nutrition research. First, the consequences of applying a reductionist versus a holistic approach to nutrition science will be discussed. The need for a more focused preventive nutrition approach, as opposed to a curative one, will then be presented on the basis of the 'healthy core metabolism' concept. This will lead us to propose a new classification of food products based on processing for future epidemiological studies. As a result of applying the holistic approach, health food potential will be redefined based on both food structure and nutrient density. These new paradigms should help define a more ethical preventive nutrition for humans to improve public recommendations while preserving the environment.

  7. Rendering Systems Visible for Design: Synthesis Maps as Constructivist Design Narratives

    Directory of Open Access Journals (Sweden)

    Peter Jones

    Full Text Available Synthesis maps integrate research evidence, system expertise, and design proposals into visual narratives. These narratives support communication and decision-making among stakeholders. Synthesis maps evolved from earlier visualization tools in systemics and design. They help stakeholders to understand design options for complex sociotechnical systems. Other visual approaches map complexity for effective collaboration across perspectives and knowledge domains. These help stakeholder groups to work in higher-order design contexts for sociotechnical or human-ecological systems. This article describes a constructivist pedagogy for collaborative learning in small teams of mixed-discipline designers. Synthesis mapping enables these teams to learn systems methods for design research in complex problem domains. Synthesis maps integrate knowledge from research cycles and iterative sensemaking to define a coherent design narrative. While synthesis maps may include formal system modeling techniques, they do not require them. Synthesis maps tangibly render research observations and design choices. As a hybrid system design method, synthesis maps are a contribution to the design genre of visual systems thinking.

  8. A Self-Adaptive Evolutionary Approach to the Evolution of Aesthetic Maps for a RTS Game

    OpenAIRE

    Lara-Cabrera, Raúl; Cotta, Carlos; Fernández-Leiva, Antonio J.

    2014-01-01

Procedural content generation (PCG) is a research field on the rise, with numerous papers devoted to this topic. This paper presents a PCG method based on a self-adaptive evolution strategy for the automatic generation of maps for the real-time strategy (RTS) game PlanetWars. These maps are generated in order to fulfill the aesthetic preferences of the user, as implied by her assessment of a collection of maps used as training set. A topological approach is used for the characterization of th...

  9. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.

    Science.gov (United States)

    Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon

    2018-02-28

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In the recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid's data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction.

  10. Crude oil and its’ distillation: an experimental approach in High School using conceptual maps

    Directory of Open Access Journals (Sweden)

    Dionísio Borsato

    2006-02-01

    Full Text Available Conceptual maps are representations of ideas organized in the form of bidimensional diagrams. In the present work the theme of oil fractional distillation was explored, and the conceptual maps were elaborated both before and after the activities by 43 students from the 1st and 3rd High School grades of a public school in Londrina – PR. The study was conducted theoretically and in practice, with a daily life approach. The use of the motivational theme and the opening text as previous organizers, enabled the establishment of a cognitive link between the students’ previous knowledge and the new concepts. Differences between the maps were verified before and after the activities as well as among the work groups. The students, stimulated by the technique, created better structured maps.

  11. An Approach of Dynamic Object Removing for Indoor Mapping Based on UGV SLAM

    Directory of Open Access Journals (Sweden)

    Jian Tang

    2015-07-01

Full Text Available The study of indoor mapping for Location Based Services (LBS) has become more and more popular in recent years. LiDAR SLAM based mapping seems to be a promising indoor mapping solution. However, dynamic objects such as pedestrians and indoor vehicles exist in the raw LiDAR range data and have to be removed for mapping purposes. In this paper, a new approach to dynamic object removal called Likelihood Grid Voting (LGV) is presented. It is a model-free method that takes full advantage of the high scanning rate of a LiDAR moving at relatively low speed in an indoor environment. In this method, a counting grid records how often each map position is occupied by laser scans; positions with low counter values are recognized as dynamic objects and their point clouds are removed from the map. This work is part of the algorithms in our self-developed Unmanned Ground Vehicle (UGV) Simultaneous Localization and Mapping (SLAM) system, NAVIS. Field tests were carried out in an indoor parking area with NAVIS to evaluate the effectiveness of the proposed method. The results show that small objects such as pedestrians are detected and removed quickly, while large objects such as cars are detected and partly removed.

  12. Mapping mountain torrent hazards in the Hexi Corridor using an evidential reasoning approach

    Science.gov (United States)

    Ran, Youhua; Liu, Jinpeng; Tian, Feng; Wang, Dekai

    2017-02-01

    The Hexi Corridor is an important part of the Silk Road Economic Belt and a crucial channel for westward development in China. Many important national engineering projects pass through the corridor, such as highways, railways, and the West-to-East Gas Pipeline. The frequent torrent disasters greatly impact the security of infrastructure and human safety. In this study, an evidential reasoning approach based on Dempster-Shafer theory is proposed for mapping mountain torrent hazards in the Hexi Corridor. A torrent hazard map for the Hexi Corridor was generated by integrating the driving factors of mountain torrent disasters including precipitation, terrain, flow concentration processes, and the vegetation fraction. The results show that the capability of the proposed method is satisfactory. The torrent hazard map shows that there is high potential torrent hazard in the central and southeastern Hexi Corridor. The results are useful for engineering planning support and resource protection in the Hexi Corridor. Further efforts are discussed for improving torrent hazard mapping and prediction.
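At the heart of the evidential reasoning approach is Dempster's rule of combination, which fuses the mass functions contributed by each driving factor. A minimal sketch follows; the hazard hypotheses and mass values are illustrative only, and the sketch assumes the sources are not totally conflicting.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    m1, m2 : dicts mapping frozenset focal elements to masses summing to 1
    Returns the combined mass function; mass assigned to conflicting
    (disjoint) pairs is renormalised away.
    """
    combined = {}
    conflict = 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    norm = 1.0 - conflict  # assumes conflict < 1
    return {k: v / norm for k, v in combined.items()}
```

Applying the rule cell by cell across the factor layers (precipitation, terrain, flow concentration, vegetation fraction) yields the fused hazard belief that such a map is built from.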

  13. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach

    Science.gov (United States)

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems. PMID:26167533

  14. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach.

    Science.gov (United States)

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Duguleana, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems.

  15. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach

    Directory of Open Access Journals (Sweden)

    Adrian-Valentin Nedelcu

    2015-01-01

Full Text Available Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems.

  16. A score-statistic approach for determining threshold values in QTL mapping.

    Science.gov (United States)

    Kao, Chen-Hung; Ho, Hsiang-An

    2012-06-01

Issues in determining the threshold values of QTL mapping have so far been investigated mostly for the backcross and F2 populations, which have relatively simple genome structures. Investigations of these issues in the progeny populations after F2 (advanced populations), with relatively more complicated genomes, are generally inadequate. As these advanced populations have been well implemented in QTL mapping, it is important to address these issues for them in more detail. Due to an increasing number of meiosis cycles, the genomes of the advanced populations can be very different from the backcross and F2 genomes. Therefore, special devices that consider the specific genome structures present in the advanced populations are required to resolve these issues. By considering the differences in genome structure between populations, we formulate more general score test statistics and Gaussian processes to evaluate their threshold values. In general, we found that, given a significance level and a genome size, threshold values for QTL detection are higher in the denser marker maps and in the more advanced populations. Simulations were performed to validate our approach.
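The population-specific score statistics are beyond a short sketch, but the general recipe for a genome-wide threshold, taking the null distribution of the maximum statistic over all markers under trait permutation, can be illustrated with a simple squared-correlation score standing in for the score statistic.

```python
import numpy as np

def permutation_threshold(phenotype, genotypes, n_perm=1000, alpha=0.05, seed=0):
    """Genome-wide significance threshold by permutation.

    phenotype : (n,) trait values
    genotypes : (n, n_markers) marker matrix
    Returns the (1 - alpha) quantile of the null distribution of the
    maximum per-marker squared correlation (a stand-in test statistic).
    """
    rng = np.random.default_rng(seed)
    y = phenotype - phenotype.mean()
    X = genotypes - genotypes.mean(axis=0)
    col_norms = np.linalg.norm(X, axis=0)

    def max_score(yv):
        # Pearson correlation of each (centered) marker with the trait
        r = X.T @ yv / (col_norms * np.linalg.norm(yv) + 1e-12)
        return np.max(r ** 2)

    # Permuting the trait breaks any genotype-phenotype association while
    # preserving the marker correlation (LD) structure.
    null = np.array([max_score(rng.permutation(y)) for _ in range(n_perm)])
    return np.quantile(null, 1.0 - alpha)
```

Because the maximum is taken over all markers within each permutation, the resulting threshold automatically reflects marker density, in line with the abstract's finding that denser maps require higher thresholds.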

  17. Apparent diffusion coefficient mapping in medulloblastoma predicts non-infiltrative surgical planes.

    Science.gov (United States)

    Marupudi, Neena I; Altinok, Deniz; Goncalves, Luis; Ham, Steven D; Sood, Sandeep

    2016-11-01

    An appropriate surgical approach for posterior fossa lesions is to start tumor removal from areas with a defined plane to where tumor is infiltrating the brainstem or peduncles. This surgical approach minimizes risk of damage to eloquent areas. Although magnetic resonance imaging (MRI) is the current standard preoperative imaging obtained for diagnosis and surgical planning of pediatric posterior fossa tumors, it offers limited information on the infiltrative planes between tumor and normal structures in patients with medulloblastomas. Because medulloblastomas demonstrate diffusion restriction on apparent diffusion coefficient map (ADC map) sequences, we investigated the role of ADC map in predicting infiltrative and non-infiltrative planes along the brain stem and/or cerebellar peduncles by medulloblastomas prior to surgery. Thirty-four pediatric patients with pathologically confirmed medulloblastomas underwent surgical resection at our facility from 2004 to 2012. An experienced pediatric neuroradiologist reviewed the brain MRIs/ADC map, assessing the planes between the tumor and cerebellar peduncles/brain stem. An independent evaluator documented surgical findings from operative reports for comparison to the radiographic findings. The radiographic findings were statistically compared to the documented intraoperative findings to determine predictive value of the test in identifying tumor infiltration of the brain stem cerebellar peduncles. Twenty-six patients had preoperative ADC mapping completed and thereby, met inclusion criteria. Mean age at time of surgery was 8.3 ± 4.6 years. Positive predictive value of ADC maps to predict tumor invasion of the brain stem and cerebellar peduncles ranged from 69 to 88 %; negative predictive values ranged from 70 to 89 %. Sensitivity approached 93 % while specificity approached 78 %. ADC maps are valuable in predicting the infiltrative and non-infiltrative planes along the tumor and brain stem interface in

  18. How Do Users Map Points Between Dissimilar Shapes?

    KAUST Repository

    Hecher, Michael

    2017-07-25

    Finding similar points in globally or locally similar shapes has been studied extensively through the use of various point descriptors or shape-matching methods. However, little work exists on finding similar points in dissimilar shapes. In this paper, we present the results of a study where users were given two dissimilar two-dimensional shapes and asked to map a given point in the first shape to the point in the second shape they consider most similar. We find that user mappings in this study correlate strongly with simple geometric relationships between points and shapes. To predict the probability distribution of user mappings between any pair of simple two-dimensional shapes, two distinct statistical models are defined using these relationships. We perform a thorough validation of the accuracy of these predictions and compare our models qualitatively and quantitatively to well-known shape-matching methods. Using our predictive models, we propose an approach to map objects or procedural content between different shapes in different design scenarios.

  19. A multi-model ensemble approach to seabed mapping

    Science.gov (United States)

    Diesing, Markus; Stephens, David

    2015-06-01

Seabed habitat mapping based on swath acoustic data and ground-truth samples is an emergent and active marine science discipline. Significant progress could be achieved by transferring techniques and approaches that have been successfully developed and employed in such fields as terrestrial land cover mapping. One such promising approach is the multiple classifier system, which aims at improving classification performance by combining the outputs of several classifiers. Here we present results of a multi-model ensemble applied to multibeam acoustic data covering more than 5000 km² of seabed in the North Sea with the aim to derive accurate spatial predictions of seabed substrate. A suite of six machine learning classifiers (k-Nearest Neighbour, Support Vector Machine, Classification Tree, Random Forest, Neural Network and Naïve Bayes) was trained with ground-truth sample data classified into seabed substrate classes, and their prediction accuracy was assessed with an independent set of samples. The three and five best performing models were combined into classifier ensembles. Both ensembles led to increased prediction accuracy as compared to the best performing single classifier. The improvements were, however, not statistically significant at the 5% level. Although the three-model ensemble did not perform significantly better than its individual component models, we noticed that the five-model ensemble did perform significantly better than three of its five component models. A classifier ensemble might therefore be an effective strategy to improve classification performance. Another advantage is the fact that the agreement in predicted substrate class between the individual models of the ensemble could be used as a measure of confidence. We propose a simple and spatially explicit measure of confidence that is based on model agreement and prediction accuracy.
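The combination scheme is not fully specified in this summary, but a plain majority vote with per-sample model agreement as a confidence measure, in the spirit described, can be sketched as:

```python
import numpy as np

def ensemble_predict(predictions):
    """Majority vote over per-model class predictions.

    predictions : (n_models, n_samples) array of class labels
    Returns (voted labels, agreement), where agreement is the fraction of
    models voting for the winning class; a simple spatially explicit
    confidence measure when computed per map cell.
    """
    preds = np.asarray(predictions)
    n_models, n_samples = preds.shape
    voted = np.empty(n_samples, dtype=preds.dtype)
    agreement = np.empty(n_samples)
    for j in range(n_samples):
        classes, counts = np.unique(preds[:, j], return_counts=True)
        k = np.argmax(counts)
        voted[j] = classes[k]
        agreement[j] = counts[k] / n_models
    return voted, agreement
```
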

  20. Flow Visualization with Quantified Spatial and Temporal Errors Using Edge Maps

    KAUST Repository

    Bhatia, H.; Jadhav, S.; Bremer, P.; Guoning Chen,; Levine, J. A.; Nonato, L. G.; Pascucci, V.

    2012-01-01

    Robust analysis of vector fields has been established as an important tool for deriving insights from the complex systems these fields model. Traditional analysis and visualization techniques rely primarily on computing streamlines through numerical integration. The inherent numerical errors of such approaches are usually ignored, leading to inconsistencies that cause unreliable visualizations and can ultimately prevent in-depth analysis. We propose a new representation for vector fields on surfaces that replaces numerical integration through triangles with maps from the triangle boundaries to themselves. This representation, called edge maps, permits a concise description of flow behaviors and is equivalent to computing all possible streamlines at a user-defined error threshold. Independent of this error, streamlines computed using edge maps are guaranteed to be consistent up to floating-point precision, enabling the stable extraction of features such as the topological skeleton. Furthermore, our representation explicitly stores spatial and temporal errors, which we use to produce more informative visualizations. This work describes the construction of edge maps, the error quantification, and a refinement procedure to adhere to a user-defined error bound. Finally, we introduce new visualizations using the additional information provided by edge maps to indicate the uncertainty involved in computing streamlines and topological structures. © 2012 IEEE.
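
    As a toy illustration of the idea (not the paper's implementation), an edge map can be modelled as a piecewise-linear function from an entry parameter in [0, 1] on a triangle's boundary to an exit parameter, and streamline tracing becomes function composition. The helper names and the two sample maps below are made up:

```python
def make_edge_map(links):
    """Build an edge map: a piecewise-linear function sending an entry
    parameter s on a triangle's boundary to its exit parameter.
    links is a list of ((s0, s1), (t0, t1)) linear segments."""
    def edge_map(s):
        for (s0, s1), (t0, t1) in links:
            if s0 <= s <= s1:
                u = (s - s0) / (s1 - s0)  # relative position in the segment
                return t0 + u * (t1 - t0)
        raise ValueError("parameter outside the map's domain")
    return edge_map

def trace(edge_maps, s):
    """Follow a streamline across triangles by composing edge maps.
    No numerical integration is involved, so repeated traces agree
    up to floating-point precision."""
    for edge_map in edge_maps:
        s = edge_map(s)
    return s

# Two hypothetical triangles: the first contracts the flow towards the
# middle of its exit edge, the second reverses orientation.
m1 = make_edge_map([((0.0, 1.0), (0.2, 0.8))])
m2 = make_edge_map([((0.0, 1.0), (1.0, 0.0))])
exit_param = trace([m1, m2], 0.25)
```

    In the paper the linear links additionally carry the quantified spatial and temporal error of the flow they approximate; that bookkeeping is omitted here.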

  1. Flow Visualization with Quantified Spatial and Temporal Errors Using Edge Maps

    KAUST Repository

    Bhatia, H.

    2012-09-01

    Robust analysis of vector fields has been established as an important tool for deriving insights from the complex systems these fields model. Traditional analysis and visualization techniques rely primarily on computing streamlines through numerical integration. The inherent numerical errors of such approaches are usually ignored, leading to inconsistencies that cause unreliable visualizations and can ultimately prevent in-depth analysis. We propose a new representation for vector fields on surfaces that replaces numerical integration through triangles with maps from the triangle boundaries to themselves. This representation, called edge maps, permits a concise description of flow behaviors and is equivalent to computing all possible streamlines at a user-defined error threshold. Independent of this error, streamlines computed using edge maps are guaranteed to be consistent up to floating-point precision, enabling the stable extraction of features such as the topological skeleton. Furthermore, our representation explicitly stores spatial and temporal errors, which we use to produce more informative visualizations. This work describes the construction of edge maps, the error quantification, and a refinement procedure to adhere to a user-defined error bound. Finally, we introduce new visualizations using the additional information provided by edge maps to indicate the uncertainty involved in computing streamlines and topological structures. © 2012 IEEE.

  2. Crop Biometric Maps: The Key to Prediction

    Directory of Open Access Journals (Sweden)

    Francisco Rovira-Más

    2013-09-01

    Full Text Available The sustainability of agricultural production in the twenty-first century, both in industrialized and developing countries, benefits from the integration of farm management with information technology such that individual plants, rows, or subfields may be endowed with a singular “identity.” This approach approximates the nature of agricultural processes to the engineering of industrial processes. In order to cope with the vast variability of nature and the uncertainties of agricultural production, the concept of crop biometrics is defined as the scientific analysis of agricultural observations confined to spaces of reduced dimensions and known position with the purpose of building prediction models. This article develops the idea of crop biometrics by setting its principles, discussing the selection and quantization of biometric traits, and analyzing the mathematical relationships among measured and predicted traits. Crop biometric maps were applied to the case of a wine-production vineyard, in which vegetation amount, relative altitude in the field, soil compaction, berry size, grape yield, juice pH, and grape sugar content were selected as biometric traits. The enological potential of grapes was assessed with a quality-index map defined as a combination of titratable acidity, sugar content, and must pH. Prediction models for yield and quality were developed for high and low resolution maps, showing the great potential of crop biometric maps as a strategic tool for vineyard growers as well as for crop managers in general, due to the wide versatility of the methodology proposed.

  3. Crop biometric maps: the key to prediction.

    Science.gov (United States)

    Rovira-Más, Francisco; Sáiz-Rubio, Verónica

    2013-09-23

    The sustainability of agricultural production in the twenty-first century, both in industrialized and developing countries, benefits from the integration of farm management with information technology such that individual plants, rows, or subfields may be endowed with a singular "identity." This approach approximates the nature of agricultural processes to the engineering of industrial processes. In order to cope with the vast variability of nature and the uncertainties of agricultural production, the concept of crop biometrics is defined as the scientific analysis of agricultural observations confined to spaces of reduced dimensions and known position with the purpose of building prediction models. This article develops the idea of crop biometrics by setting its principles, discussing the selection and quantization of biometric traits, and analyzing the mathematical relationships among measured and predicted traits. Crop biometric maps were applied to the case of a wine-production vineyard, in which vegetation amount, relative altitude in the field, soil compaction, berry size, grape yield, juice pH, and grape sugar content were selected as biometric traits. The enological potential of grapes was assessed with a quality-index map defined as a combination of titratable acidity, sugar content, and must pH. Prediction models for yield and quality were developed for high and low resolution maps, showing the great potential of crop biometric maps as a strategic tool for vineyard growers as well as for crop managers in general, due to the wide versatility of the methodology proposed.

  4. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yan-Rong; Wang, Jian-Min [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Road, Beijing 100049 (China); Bai, Jin-Ming, E-mail: liyanrong@mail.ihep.ac.cn [Yunnan Observatories, Chinese Academy of Sciences, Kunming 650011 (China)

    2016-11-10

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.
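
    The response-function family described here is easy to write down. The sketch below is an illustration of the general construction only, with made-up weights, centers and width (the quantities the authors infer from the RM data), not their fitting code: it builds a transfer function as a sum of relatively displaced Gaussians and convolves a continuum light curve with it to produce the echoed line light curve:

```python
import math

def transfer_function(tau, weights, centers, width):
    """Sum of relatively displaced Gaussian response functions at lag tau."""
    norm = width * math.sqrt(2.0 * math.pi)
    return sum(w * math.exp(-0.5 * ((tau - c) / width) ** 2) / norm
               for w, c in zip(weights, centers))

def line_light_curve(continuum, weights, centers, width, dt=1.0):
    """Emission-line light curve: discrete convolution of the continuum
    with the transfer function on a uniform time grid."""
    return [dt * sum(continuum[i - k] *
                     transfer_function(k * dt, weights, centers, width)
                     for k in range(i + 1))
            for i in range(len(continuum))]

# Made-up example: two Gaussian responses at lags of 5 and 15 days.
weights, centers, width = [1.0, 0.5], [5.0, 15.0], 2.0
```

    A delta-function continuum pulse then echoes back the transfer function itself, peaking at the 5-day lag of the stronger component.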

  5. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    International Nuclear Information System (INIS)

    Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming

    2016-01-01

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.

  6. Resting state cortico-cerebellar functional connectivity networks: A comparison of anatomical and self-organizing map approaches

    Directory of Open Access Journals (Sweden)

    Jessica A Bernard

    2012-08-01

    Full Text Available The cerebellum plays a role in a wide variety of complex behaviors. In order to better understand the role of the cerebellum in human behavior, it is important to know how this structure interacts with cortical and other subcortical regions of the brain. To date, several studies have investigated the cerebellum using resting-state functional connectivity magnetic resonance imaging (fcMRI; Buckner et al., 2011; Krienen & Buckner, 2009; O’Reilly et al., 2009). However, none of this work has taken an anatomically-driven approach. Furthermore, though detailed maps of cerebral cortex and cerebellum networks have been proposed using different network solutions based on the cerebral cortex (Buckner et al., 2011), it remains unknown whether or not an anatomical lobular breakdown best encompasses the networks of the cerebellum. Here, we used fcMRI to create an anatomically-driven cerebellar connectivity atlas. Timecourses were extracted from the lobules of the right hemisphere and vermis. We found distinct networks for the individual lobules with a clear division into motor and non-motor regions. We also used a self-organizing map algorithm to parcellate the cerebellum. This allowed us to investigate redundancy and independence of the anatomically identified cerebellar networks. We found that while anatomical boundaries in the anterior cerebellum provide functional subdivisions of a larger motor grouping defined using our self-organizing map algorithm, in the posterior cerebellum, the lobules were made up of sub-regions associated with distinct functional networks. Together, our results indicate that the lobular boundaries of the human cerebellum are not indicative of functional boundaries, though anatomical divisions can be useful, as is the case of the anterior cerebellum. Additionally, driving the analyses from the cerebellum is key to determining the complete picture of functional connectivity within the structure.

  7. Topological visual mapping in robotics.

    Science.gov (United States)

    Romero, Anna; Cazorla, Miguel

    2012-08-01

    A key problem in robotics is the construction of a map of the robot's environment. This map can be used in different tasks, like localization, recognition, obstacle avoidance, etc. Besides, the simultaneous localization and mapping (SLAM) problem has received a lot of interest in the robotics community. This paper presents a new method for visual mapping, using topological instead of metric information. For that purpose, we propose prior image segmentation into regions in order to group the extracted invariant features in a graph, so that each graph defines a single region of the image. Although other methods have been proposed for visual SLAM, our method is complete, in the sense that it covers the whole process: it presents a new method for image matching; it defines a way to build the topological map; and it also defines a matching criterion for loop-closing. The matching process takes into account visual features and their structure using the graph transformation matching (GTM) algorithm, which allows us to perform the matching and to remove the outliers. Then, using this image comparison method, we propose an algorithm for constructing topological maps. During the experimentation phase, we test the robustness of the method and its ability to construct topological maps. We have also introduced a new hysteresis behavior in order to solve some problems found when building the graph.

  8. Branched polynomial covering maps

    DEFF Research Database (Denmark)

    Hansen, Vagn Lundsgaard

    1999-01-01

    A Weierstrass polynomial with multiple roots in certain points leads to a branched covering map. With this as the guiding example, we formally define and study the notion of a branched polynomial covering map. We shall prove that many finite covering maps are polynomial outside a discrete branch set. Particular studies are made of branched polynomial covering maps arising from Riemann surfaces and from knots in the 3-sphere.

  9. Mobile Ground-Based Radar Sensor for Localization and Mapping: An Evaluation of two Approaches

    Directory of Open Access Journals (Sweden)

    Damien Vivet

    2013-08-01

    Full Text Available This paper is concerned with robotic applications using a ground-based radar sensor for simultaneous localization and mapping problems. In mobile robotics, radar technology is interesting because of its long range and the robustness of radar waves to atmospheric conditions, making these sensors well-suited for extended outdoor robotic applications. Two localization and mapping approaches using data obtained from a 360° field of view microwave radar sensor are presented and compared. The first method is a trajectory-oriented simultaneous localization and mapping technique, which makes no landmark assumptions and avoids the data association problem. The estimation of the ego-motion makes use of the Fourier-Mellin transform for registering radar images in a sequence, from which the rotation and translation of the sensor motion can be estimated. The second approach exploits the consequences of using a rotating range sensor in high-speed robotics. In such a situation, movement combinations create distortions in the collected data. Velocimetry is achieved here by explicitly analysing these measurement distortions. As a result, the trajectory of the vehicle and then the radar map of outdoor environments can be obtained. The two methods are evaluated on real-world data from a vehicle moving at 30 km/h over a 2.5 km course.

  10. Clustering of the Self-Organizing Map based Approach in Induction Machine Rotor Faults Diagnostics

    Directory of Open Access Journals (Sweden)

    Ahmed TOUMI

    2009-12-01

    Full Text Available Self-Organizing Maps (SOM) is an excellent method of analyzing multidimensional data. The SOM-based classification is attractive due to its unsupervised learning and topology-preserving properties. In this paper, the performance of the self-organizing methods is investigated in induction motor rotor fault detection and severity evaluation. The SOM is based on motor current signature analysis (MCSA). The agglomerative hierarchical algorithm using Ward’s method is applied to automatically divide the map into interesting, interpretable groups of map units that correspond to clusters in the input data. The results obtained with this approach make it possible to detect a rotor bar fault directly from the visualization results. The system is also able to estimate the extent of rotor faults.
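
    A toy version of this pipeline: train a tiny one-dimensional SOM on a scalar fault-severity feature, then divide the trained map units into groups. Everything here is illustrative; the data are synthetic, and the largest-gap split is a crude stand-in for the Ward-linkage hierarchical clustering used in the paper:

```python
import math
import random

def train_som_1d(data, n_units=10, epochs=60, lr0=0.5, radius0=3.0):
    """Minimal 1-D self-organizing map for a scalar feature."""
    rng = random.Random(0)
    w = [rng.random() for _ in range(n_units)]
    for e in range(epochs):
        lr = lr0 * (1.0 - e / epochs)                    # decaying learning rate
        radius = max(radius0 * (1.0 - e / epochs), 0.5)  # shrinking neighbourhood
        for x in data:
            bmu = min(range(n_units), key=lambda i: abs(w[i] - x))
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2.0 * radius ** 2))
                w[i] += lr * h * (x - w[i])
    return w

def split_at_largest_gap(weights):
    """Divide map units into two groups at the largest gap in their
    codebook values (a stand-in for Ward's method)."""
    order = sorted(weights)
    gaps = [order[i + 1] - order[i] for i in range(len(order) - 1)]
    cut = order[gaps.index(max(gaps))]
    return [0 if v <= cut else 1 for v in weights]

# Synthetic current-signature feature: healthy motors near 0.1, faulty near 0.9.
rng = random.Random(1)
data = [rng.gauss(0.1, 0.02) for _ in range(50)] + \
       [rng.gauss(0.9, 0.02) for _ in range(50)]
units = train_som_1d(data)
groups = split_at_largest_gap(units)
```

    After training, the codebook covers both clusters and the split recovers the healthy/faulty grouping directly, mirroring the "detect the fault from the visualization" claim.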

  11. The Conceptual Framework of Thematic Mapping in Case Conceptualization.

    Science.gov (United States)

    Ridley, Charles R; Jeffrey, Christina E

    2017-04-01

    This article, the 3rd in a series of 5, introduces the conceptual framework for thematic mapping, a novel approach to case conceptualization. The framework is transtheoretical in that it is not constrained by the tenets or concepts of any one therapeutic orientation and transdiagnostic in that it conceptualizes clients outside the constraints of diagnostic criteria. Thematic mapping comprises 4 components: a definition, foundational principles, defining features, and core concepts. These components of the framework, deemed building blocks, are explained in this article. Like the foundation of any structure, the heuristic value of the method requires that the building blocks have integrity, coherence, and sound anchoring. We assert that the conceptual framework provides a solid foundation, making thematic mapping a potential asset in mental health treatment. © 2017 Wiley Periodicals, Inc.

  12. A Photogrammetric Approach for Assessing Positional Accuracy of OpenStreetMap© Roads

    Directory of Open Access Journals (Sweden)

    Peter Doucette

    2013-04-01

    Full Text Available As open source volunteered geographic information continues to gain popularity, the user community and data contributions are expected to grow, e.g., CloudMade, Apple, and Ushahidi now provide OpenStreetMap© (OSM) as a base layer for some of their mapping applications. This, coupled with the lack of cartographic standards and the expectation to one day be able to use this vector data for more geopositionally sensitive applications, like GPS navigation, leaves potential users and researchers to question the accuracy of the database. This research takes a photogrammetric approach to determining the positional accuracy of OSM road features using stereo imagery and a vector adjustment model. The method applies rigorous analytical measurement principles to compute accurate real-world geolocations of OSM road vectors. The proposed approach was tested on several urban gridded city streets from the OSM database, with the results showing that the post-adjusted shape points improved positionally by 86%. Furthermore, the vector adjustment was able to recover 95% of the actual positional displacement present in the database. To demonstrate a practical application, a head-to-head positional accuracy assessment between OSM, the USGS National Map (TNM), and the United States Census Bureau’s Topologically Integrated Geographic Encoding and Referencing (TIGER) 2007 roads was conducted.

  13. Geosites and geoheritage representations - a cartographic approach

    Science.gov (United States)

    Rocha, Joao; Brilha, José

    2016-04-01

    In recent years, the increasing awareness of the importance of nature conservation, particularly towards the protection, conservation and promotion of geological sites, has resulted in a wide range of scientific studies. The majority of geodiversity studies, geoconservation strategies, geosite inventories and geoheritage assessment projects will, at some stage, use a cartographic representation - a map - of the most relevant geological and geomorphological features within the area of analysis. A wide range of geosite maps and geological heritage maps have been produced but, so far, a widely accepted conceptual cartographic framework with a specific symbology for cartographic representation has not been created. In this work we discuss the lack of a systematic conceptual framework to support geoheritage and geosite mapping, and propose a cartographic approach aiming at the conceptualization and definition of a nomenclature and symbology system to be used on both geosite and geoheritage maps. We define a symbology framework for geosite and geoheritage mapping addressed to the general public and to secondary school students, to be used as a geotouristic and a didactic tool, respectively. Three different approaches support the definition of the symbology framework: i) symbols that correlate geosites with the geological time scale; ii) symbols related to each of the 27 geological frameworks defined in the Portuguese geoheritage inventory; iii) symbols that represent groups of geosites sharing common geological and geomorphological features. The use of these symbols in a map allows a quick understanding of a set of relevant information, in addition to the usual geographical distribution of geosites in a certain area.

  14. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Science.gov (United States)

    2018-01-01

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction. PMID:29495599

  15. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Directory of Open Access Journals (Sweden)

    Agustín Zaballos

    2018-02-01

    Full Text Available Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction.

  16. Ordered and isomorphic mapping of periodic structures in the parametrically forced logistic map

    Energy Technology Data Exchange (ETDEWEB)

    Maranhão, Dariel M., E-mail: dariel@ifsp.edu.br [Departamento de Ciências e Matemática, Instituto Federal de Educação, Ciência e Tecnologia de São Paulo, São Paulo (Brazil); Diretoria de Informática, Universidade Nove de Julho, São Paulo (Brazil)

    2016-09-23

    Highlights: • A direct description of the internal structure of a periodic window in terms of winding numbers is proposed. • Periodic structures in parameter spaces are mapped in a recurrent and isomorphic way. • Sequences of winding numbers show global and local organization of periodic domains. - Abstract: We investigate the periodic domains found in the parametrically forced logistic map, the classical logistic map whose control parameter changes dynamically. Phase diagrams in two-parameter spaces reveal intricate periodic structures composed of patterns of intersecting superstable orbit curves, defining the cell of a periodic window. Cells appear multifoliated and ordered, and they are isomorphically mapped when one changes the map parameters. We also identify the characteristics of the simplest cell and apply them to other, more complex ones, discussing how the topography of the parameter space is affected. Using the winding number as defined in periodically forced oscillators, we show that the hierarchical organization of the periodic domains is manifested on global and local scales.
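
    The parametrically forced logistic map itself is straightforward to iterate. The sketch below assumes the simplest forcing, a periodic cycle of parameter values (the specific values are invented, not taken from the paper), and detects the period of the resulting attractor — the kind of periodic domain whose organization the paper maps out:

```python
def forced_logistic_orbit(r_cycle, x0=0.4, transient=5000, n=8):
    """Iterate x_{n+1} = r_n * x_n * (1 - x_n), the logistic map whose
    control parameter cycles through r_cycle, and return n post-transient points."""
    x = x0
    for i in range(transient):
        x = r_cycle[i % len(r_cycle)] * x * (1.0 - x)
    orbit = []
    for i in range(transient, transient + n):
        orbit.append(x)
        x = r_cycle[i % len(r_cycle)] * x * (1.0 - x)
    return orbit

def period(orbit, tol=1e-8):
    """Smallest p such that the orbit repeats with period p, or None."""
    for p in range(1, len(orbit)):
        if all(abs(orbit[i] - orbit[i - p]) < tol for i in range(p, len(orbit))):
            return p
    return None

# Alternating between two parameters that each lie in the stable fixed-point
# regime of the unforced map yields a period-2 attractor of the forced system.
orbit = forced_logistic_orbit([2.8, 2.5])
# period(orbit) == 2
```

    Sweeping the two parameter values over a grid and recording the detected period is the basic computation behind the phase diagrams the abstract describes.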

  17. The use of concept maps as an indicator of significant learning in Calculus

    Directory of Open Access Journals (Sweden)

    Naíma Soltau Ferrão

    2014-03-01

    Full Text Available This paper contains reflections and results of a research project that aimed to apply and analyze the use of concept maps in Higher Education as an indicator of significant learning concerning the derivative as a mathematical object, with students who had finished Differential and Integral Calculus. This is a qualitative approach, situated in the area of mathematics education, based on Ausubel's Theory of Meaningful Learning and on Novak's concept mapping technique. As data acquisition instruments, we used classroom observations, a questionnaire, brainstorming and digital concept mapping, carried out with students of an undergraduate physics course. For the analysis we defined four aspects to be observed in the maps constructed by the students: (i) validity of propositions formed with concepts, (ii) hierarchization, (iii) cross-links between the propositions, and (iv) the presence of applications. The identification of these elements, taken as reference to analyze the maps, allowed the collection of information about how each student structured and correlated the set of concepts learned about the derivative of a function along the course. Based on the results, we identified digital concept maps as effective tools to evaluate students in terms of meaningful learning of specific contents of Differential and Integral Calculus, through the hierarchy of concepts, progressive differentiation and integrative reconciliation as defined in the Theory of Meaningful Learning.

  18. Defining functional DNA elements in the human genome

    Science.gov (United States)

    Kellis, Manolis; Wold, Barbara; Snyder, Michael P.; Bernstein, Bradley E.; Kundaje, Anshul; Marinov, Georgi K.; Ward, Lucas D.; Birney, Ewan; Crawford, Gregory E.; Dekker, Job; Dunham, Ian; Elnitski, Laura L.; Farnham, Peggy J.; Feingold, Elise A.; Gerstein, Mark; Giddings, Morgan C.; Gilbert, David M.; Gingeras, Thomas R.; Green, Eric D.; Guigo, Roderic; Hubbard, Tim; Kent, Jim; Lieb, Jason D.; Myers, Richard M.; Pazin, Michael J.; Ren, Bing; Stamatoyannopoulos, John A.; Weng, Zhiping; White, Kevin P.; Hardison, Ross C.

    2014-01-01

    With the completion of the human genome sequence, attention turned to identifying and annotating its functional DNA elements. As a complement to genetic and comparative genomics approaches, the Encyclopedia of DNA Elements Project was launched to contribute maps of RNA transcripts, transcriptional regulator binding sites, and chromatin states in many cell types. The resulting genome-wide data reveal sites of biochemical activity with high positional resolution and cell type specificity that facilitate studies of gene regulation and interpretation of noncoding variants associated with human disease. However, the biochemically active regions cover a much larger fraction of the genome than do evolutionarily conserved regions, raising the question of whether nonconserved but biochemically active regions are truly functional. Here, we review the strengths and limitations of biochemical, evolutionary, and genetic approaches for defining functional DNA segments, potential sources for the observed differences in estimated genomic coverage, and the biological implications of these discrepancies. We also analyze the relationship between signal intensity, genomic coverage, and evolutionary conservation. Our results reinforce the principle that each approach provides complementary information and that we need to use combinations of all three to elucidate genome function in human biology and disease. PMID:24753594

  19. A novel matrix approach for controlling the invariant densities of chaotic maps

    International Nuclear Information System (INIS)

    Rogers, Alan; Shorten, Robert; Heffernan, Daniel M.

    2008-01-01

    Recent work on positive matrices has resulted in a new matrix method for generating chaotic maps with arbitrary piecewise constant invariant densities, sometimes known as the inverse Frobenius-Perron problem (IFPP). In this paper, we give an extensive introduction to the IFPP, review existing methods for solving it, and describe our new matrix approach.
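
    For context, the forward Frobenius-Perron problem — from a map to its invariant density — is easy to demonstrate numerically; the IFPP runs this logic in reverse. The sketch below illustrates only the forward problem, not the authors' matrix method: it estimates the invariant density of the fully chaotic logistic map by histogramming a long orbit and can be compared with the known analytic density ρ(x) = 1/(π√(x(1−x))):

```python
import math

def invariant_density(f, x0=0.3, n_iter=200_000, n_bins=20, burn=1000):
    """Estimate the invariant density of a 1-D map on [0, 1] by iterating
    it and histogramming the orbit (normalized to integrate to 1)."""
    x = x0
    counts = [0] * n_bins
    for i in range(burn + n_iter):
        x = f(x)
        if x <= 0.0 or x >= 1.0:      # guard against floating-point collapse
            x = 0.3 + 1e-9
        if i >= burn:
            counts[min(int(x * n_bins), n_bins - 1)] += 1
    return [c * n_bins / n_iter for c in counts]

logistic = lambda x: 4.0 * x * (1.0 - x)
rho = invariant_density(logistic)
# Analytic invariant density of the r = 4 logistic map: 1 / (pi * sqrt(x * (1 - x)))
```

    The estimated histogram reproduces the characteristic U-shape, with density piling up near 0 and 1; the IFPP asks for a map that produces a *prescribed* piecewise constant histogram.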

  20. A geomorphic approach to 100-year floodplain mapping for the Conterminous United States

    Science.gov (United States)

    Jafarzadegan, Keighobad; Merwade, Venkatesh; Saksena, Siddharth

    2018-06-01

    Floodplain mapping using hydrodynamic models is difficult in data-scarce regions. Additionally, using hydrodynamic models to map floodplains over large stream networks can be computationally challenging. Some of these limitations can be overcome by developing computationally efficient statistical methods that identify floodplains in large and ungauged watersheds using publicly available data. This paper proposes a geomorphic model to generate probabilistic 100-year floodplain maps for the Conterminous United States (CONUS). The proposed model first categorizes the watersheds in the CONUS into three classes based on the height of the water surface corresponding to the 100-year flood above the streambed. Next, the probability that any watershed in the CONUS belongs to one of these three classes is computed through supervised classification using watershed characteristics related to topography, hydrography, land use and climate. The result of this classification is then fed into a probabilistic threshold binary classifier (PTBC) to generate the probabilistic 100-year floodplain maps. The supervised classification algorithm is trained using the 100-year Flood Insurance Rate Maps (FIRMs) from the U.S. Federal Emergency Management Agency (FEMA). FEMA FIRMs are also used to validate the performance of the proposed model in areas not included in the training. Additionally, HEC-RAS-generated flood inundation extents are used to validate the model performance at fifteen sites that lack FEMA maps. Validation results show that the probabilistic 100-year floodplain maps generated by the proposed model match well with both the FEMA and HEC-RAS maps. On average, the error of predicted flood extents is around 14% across the CONUS. The high accuracy of the validation results shows the reliability of the geomorphic model as an alternative approach for fast and cost-effective delineation of 100-year floodplains for the CONUS.

  1. Riemannian metric optimization on surfaces (RMOS) for intrinsic brain mapping in the Laplace-Beltrami embedding space.

    Science.gov (United States)

    Gahm, Jin Kyu; Shi, Yonggang

    2018-05-01

    Surface mapping methods play an important role in various brain imaging studies from tracking the maturation of adolescent brains to mapping gray matter atrophy patterns in Alzheimer's disease. Popular surface mapping approaches based on spherical registration, however, have inherent numerical limitations when severe metric distortions are present during the spherical parameterization step. In this paper, we propose a novel computational framework for intrinsic surface mapping in the Laplace-Beltrami (LB) embedding space based on Riemannian metric optimization on surfaces (RMOS). Given a diffeomorphism between two surfaces, an isometry can be defined using the pullback metric, which in turn results in identical LB embeddings from the two surfaces. The proposed RMOS approach builds upon this mathematical foundation and achieves general feature-driven surface mapping in the LB embedding space by iteratively optimizing the Riemannian metric defined on the edges of triangular meshes. At the core of our framework is an optimization engine that converts an energy function for surface mapping into a distance measure in the LB embedding space, which can be effectively optimized using gradients of the LB eigen-system with respect to the Riemannian metrics. In the experimental results, we compare the RMOS algorithm with spherical registration using large-scale brain imaging data, and show that RMOS achieves superior performance in the prediction of hippocampal subfields and cortical gyral labels, and the holistic mapping of striatal surfaces for the construction of a striatal connectivity atlas from substantia nigra. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Defining the lung outline from a gamma camera transmission attenuation map

    International Nuclear Information System (INIS)

    Fleming, John S; Pitcairn, Gary; Newman, Stephen

    2006-01-01

    Segmentation of the lung outline from gamma camera transmission images of the thorax is useful in attenuation correction and quantitative image analysis. This paper describes and compares two threshold-based methods of segmentation. Simulated gamma camera transmission images of test objects were used to produce a knowledge base of how the threshold defining the lung outline varies with image resolution and chest wall thickness. Two segmentation techniques based on global (GT) and context-sensitive (CST) thresholds were developed and evaluated in simulated transmission images of realistic thoraces. The segmented lung volumes were compared to the true values used in the simulation, and the mean distances between the segmented and true lung surfaces were calculated. The techniques were also applied to three real human subject transmission images; the lung volumes were estimated and the segmentations were compared visually. The CST technique produced significantly better segmentations than the GT technique in the simulated data. In human subjects, the GT technique underestimated volumes by 13% compared to the CST technique and missed areas that clearly belonged to the lungs. In conclusion, both techniques segmented the lungs with reasonable accuracy and precision; the CST approach was superior, particularly in real human subject images.
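
    The difference between the two techniques can be sketched on a toy 1-D transmission profile. The counts, the hypothetical chest-wall thicknesses, and the threshold knowledge base are invented for illustration; the paper derives its thresholds from simulated test objects.

```python
# Global threshold (GT): one cutoff for the whole image.
# Context-sensitive threshold (CST): cutoff looked up per pixel from a
# knowledge base indexed by local context (here, chest-wall thickness).
profile = [10, 12, 50, 80, 85, 60, 11, 40, 45, 12]   # transmission counts
wall_thickness = [3, 3, 2, 1, 1, 2, 3, 2, 2, 3]      # cm, hypothetical

# thicker chest wall -> lower transmission through lung -> lower threshold
THRESHOLD_BY_THICKNESS = {1: 70, 2: 35, 3: 20}

def segment_global(profile, threshold=48):
    return [v > threshold for v in profile]

def segment_context(profile, wall_thickness):
    return [v > THRESHOLD_BY_THICKNESS[t] for v, t in zip(profile, wall_thickness)]

gt = segment_global(profile)                    # misses lung behind thick wall
cst = segment_context(profile, wall_thickness)  # recovers those pixels
```

    In this toy example the GT mask misses the attenuated lung pixels (counts 40 and 45) that the CST mask recovers, mirroring the underestimation reported for the GT technique.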

  3. A COGNITIVE APPROACH TO CORPORATE GOVERNANCE: A VISUALIZATION TEST OF MENTAL MODELS WITH THE COGNITIVE MAPPING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Garoui NASSREDDINE

    2012-01-01

    Full Text Available The idea of this paper is to determine the mental models of actors in the firm with respect to the cognitive approach of corporate governance. The paper takes a corporate governance perspective, discusses mental models, and uses cognitive maps to view the diagrams showing the ways of thinking and the conceptualization of the cognitive approach. In addition, it employs a cognitive mapping technique. Returning to the systematic exploration of the grids for each actor, it concludes that there is a balance of concepts expressing their cognitive orientation.

  4. Mapping topographic plant location properties using a dense matching approach

    Science.gov (United States)

    Niederheiser, Robert; Rutzinger, Martin; Lamprecht, Andrea; Bardy-Durchhalter, Manfred; Pauli, Harald; Winkler, Manuela

    2017-04-01

    Within the project MEDIALPS (Disentangling anthropogenic drivers of climate change impacts on alpine plant species: Alps vs. Mediterranean mountains) six regions in Alpine and Mediterranean mountain areas are investigated to assess how plant species respond to climate change. The project is embedded in the Global Observation Research Initiative in Alpine Environments (GLORIA), a well-established global monitoring initiative for systematic observation of changes in plant species composition and soil temperature on mountain summits worldwide, to discern accelerating climate change pressures on these fragile alpine ecosystems. Close-range sensing techniques such as terrestrial photogrammetry are well suited for mapping the terrain topography of small areas with high resolution. Lightweight equipment, flexible positioning for image acquisition in the field, and independence from weather conditions (i.e., wind) make this a feasible method for in-situ data collection. New developments in dense matching approaches allow high-quality 3D terrain mapping with fewer requirements for field set-up. However, challenges arise in post-processing and required data storage when many sites have to be mapped. Within MEDIALPS, dense matching is used for mapping high-resolution topography for 284 3 × 3 m plots, deriving information on vegetation coverage, roughness, slope, aspect and modelled solar radiation. This information helps identify types of topography-dependent ecological growing conditions and evaluate the potential of existing refugial locations for specific plant species under climate change. This research is conducted within the project MEDIALPS - Disentangling anthropogenic drivers of climate change impacts on alpine plant species: Alps vs. Mediterranean mountains, funded by the Earth System Sciences Programme of the Austrian Academy of Sciences.

  5. A Hybrid Color Mapping Approach to Fusing MODIS and Landsat Images for Forward Prediction

    OpenAIRE

    Chiman Kwan; Bence Budavari; Feng Gao; Xiaolin Zhu

    2018-01-01

    We present a new, simple, and efficient approach to fusing MODIS and Landsat images. It is well known that MODIS images have high temporal resolution and low spatial resolution, whereas Landsat images are just the opposite. Similar to earlier approaches, our goal is to fuse MODIS and Landsat images to yield high spatial and high temporal resolution images. Our approach consists of two steps. First, a mapping is established between two MODIS images, where one is at an earlier time, t1, and the...

  6. Cryptanalysis of a family of 1D unimodal maps

    Science.gov (United States)

    Md Said, Mohamad Rushdan; Hina, Aliyu Danladi; Banerjee, Santo

    2017-07-01

    In this paper, we propose a map topologically conjugate to the well-known logistic map. The constructed map is defined on the integer domain [0, 2^n) with a view to being used as a random number generator (RNG) based on an integer domain, as is required in classical cryptography. The maps were found to have a one-to-one correspondence between points in their respective defining intervals at n-bit precision. The dynamics of the proposed map are similar to those of the logistic map in terms of the Lyapunov exponent as a function of the control parameter; this similarity between the curves indicates topological conjugacy between the maps. With a view to application in cryptography as a pseudo-random number generator (PRNG), the complexity of the constructed map as a source of randomness is determined using both the permutation entropy (PE) and the Lempel-Ziv (LZ-76) complexity measures, and the results are compared with numerical simulations.
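
    As an illustrative check of the Lyapunov-exponent comparison described above, the exponent of the real-valued logistic map can be estimated numerically. This is the standard textbook computation, not the paper's integer-domain construction; the parameters below are arbitrary.

```python
import math

# Lyapunov exponent of the logistic map x_{n+1} = r*x*(1-x):
# lambda = (1/n) * sum_i ln|f'(x_i)|, with f'(x) = r*(1 - 2x).
def lyapunov_logistic(r, x0=0.3, n_transient=1000, n=20000):
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        total += math.log(abs(r * (1.0 - 2.0 * x)))
    return total / n

lam = lyapunov_logistic(4.0)   # analytic value for r = 4 is ln 2 ~ 0.693
```

    For a conjugate map the same curve of exponent versus control parameter should emerge, which is the similarity the paper uses as evidence of topological conjugacy.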

  7. Groundwater vulnerability maps for pesticides for Flanders

    Science.gov (United States)

    Dams, Jef; Joris, Ingeborg; Bronders, Jan; Van Looy, Stijn; Vanden Boer, Dirk; Heuvelmans, Griet; Seuntjens, Piet

    2017-04-01

    Pesticides are increasingly being detected in shallow groundwater and are one of the main causes of the poor chemical status of phreatic groundwater bodies in Flanders. There is a need for groundwater vulnerability maps in order to design monitoring strategies and land-use strategies for sensitive areas such as drinking water capture zones. This research focuses on the development of generic vulnerability maps for pesticides for Flanders and a tool to calculate substance-specific vulnerability maps at the scale of Flanders and at the local scale. (1) The generic vulnerability maps are constructed using an index-based method in which maps of the main factors in the soil and saturated zone contributing to high concentrations of pesticides in groundwater are classified and overlain. Different weights are assigned to the contributing factors according to the type of pesticide (low/high mobility, low/high persistence). Factors that are taken into account are the organic matter content and texture of the soil, depth of the unsaturated zone, organic carbon and redox potential of the phreatic groundwater, and thickness and conductivity of the phreatic layer. (2) Secondly, a tool is developed that calculates substance-specific vulnerability maps for Flanders using a hybrid approach, where the process-based leaching model GeoPEARL is combined with vulnerability indices that account for dilution in the phreatic layer. The GeoPEARL model is parameterized for Flanders in 1434 unique combinations of soil properties, climate and groundwater depth. Leaching is calculated over a 20-year period for each 50 x 50 m grid cell in Flanders. (3) At the local scale, finally, a fully process-based approach is applied combining GeoPEARL leaching calculations and flowline calculations of pesticide transport in the saturated zone to define critical zones in the capture zone of a receptor such as a drinking water well or a river segment. The three approaches are explained in more detail and illustrated.
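
    The index-based overlay of step (1) can be sketched as a weighted average of classified factor ratings. The factor names, ratings, and weights below are illustrative assumptions, not the values used for Flanders.

```python
# Index-overlay sketch: each contributing factor is classified to a 1-5
# vulnerability rating, then combined with pesticide-type-specific weights.
def vulnerability_index(ratings, weights):
    """Weighted average of factor ratings (1 = low, 5 = high vulnerability)."""
    total_w = sum(weights[f] for f in ratings)
    return sum(ratings[f] * weights[f] for f in ratings) / total_w

cell = {"soil_organic_matter": 4,   # low OM -> little sorption -> high rating
        "unsaturated_depth": 5,     # shallow water table
        "aquifer_conductivity": 2}

# A mobile, persistent pesticide might weight travel-time factors more heavily:
weights_mobile = {"soil_organic_matter": 1.0, "unsaturated_depth": 2.0,
                  "aquifer_conductivity": 1.0}

score = vulnerability_index(cell, weights_mobile)
```

    Computing this score for every grid cell, with a weight set per pesticide class, yields one generic vulnerability map per class.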

  8. An Automated Approach to Map Winter Cropped Area of Smallholder Farms across Large Scales Using MODIS Imagery

    Directory of Open Access Journals (Sweden)

    Meha Jain

    2017-06-01

    Full Text Available Fine-scale agricultural statistics are an important tool for understanding trends in food production and their associated drivers, yet these data are rarely collected in smallholder systems. These statistics are particularly important for smallholder systems given the large amount of fine-scale heterogeneity in production that occurs in these regions. To overcome the lack of ground data, satellite data are often used to map fine-scale agricultural statistics. However, doing so is challenging for smallholder systems because of (1) complex sub-pixel heterogeneity; (2) little to no available calibration data; and (3) high amounts of cloud cover, as most smallholder systems occur in the tropics. We develop an automated method termed the MODIS Scaling Approach (MSA) to map smallholder cropped area across large spatial and temporal scales using MODIS Enhanced Vegetation Index (EVI) satellite data. We use this method to map winter cropped area, a key measure of cropping intensity, across the Indian subcontinent annually from 2000-2001 to 2015-2016. The MSA defines a pixel as cropped based on winter growing season phenology and scales the percent of cropped area within a single MODIS pixel based on observed EVI values at peak phenology. We validated the result with eleven high-resolution scenes (spatial scale of 5 × 5 m² or finer) that we classified into cropped versus non-cropped maps using training data collected by visual inspection of the high-resolution imagery. The MSA had moderate to high accuracies when validated using these eleven scenes across India (R² ranging between 0.19 and 0.89, with an overall R² of 0.71 across all sites). This method requires no calibration data, making it easy to implement across large spatial and temporal scales, with 100% spatial coverage due to the compositing of EVI to generate cloud-free data sets. The accuracies found in this study are similar to those of other studies that map crop production using automated methods.
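
    The two MSA steps, flagging a pixel as winter-cropped from its phenology and scaling sub-pixel cropped area from peak EVI, can be sketched as below. The phenology test and the EVI endpoints (0.2 maps to 0% cropped, 0.8 to 100%) are illustrative assumptions, not the paper's fitted values.

```python
# Step 1: flag a MODIS pixel as winter-cropped if its EVI time series shows a
# mid-season peak well above the season minimum.
def is_winter_cropped(evi_series, min_amplitude=0.2):
    peak, low = max(evi_series), min(evi_series)
    peak_idx = evi_series.index(peak)
    return (peak - low) > min_amplitude and 0 < peak_idx < len(evi_series) - 1

# Step 2: scale percent cropped area within the pixel linearly from peak EVI.
def percent_cropped(peak_evi, evi0=0.2, evi100=0.8):
    frac = (peak_evi - evi0) / (evi100 - evi0)
    return 100.0 * min(1.0, max(0.0, frac))

series = [0.25, 0.35, 0.62, 0.40, 0.28]   # hypothetical Oct-Mar EVI composites
pct = percent_cropped(max(series)) if is_winter_cropped(series) else 0.0
```

    Applying this per pixel and summing the fractional areas gives district-level winter cropped area without any ground calibration data.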

  9. Quantitative Trait Loci Mapping Problem: An Extinction-Based Multi-Objective Evolutionary Algorithm Approach

    Directory of Open Access Journals (Sweden)

    Nicholas S. Flann

    2013-09-01

    Full Text Available The Quantitative Trait Loci (QTL) mapping problem aims to identify regions in the genome that are linked to phenotypic features of the developed organism that vary in degree. It is a principal step in determining targets for further genetic analysis and is key in decoding the role of specific genes that control quantitative traits within species. Applications include identifying genetic causes of disease, optimization of cross-breeding for desired traits, and understanding trait diversity in populations. In this paper a new multi-objective evolutionary algorithm (MOEA) method is introduced and is shown to increase the accuracy of QTL mapping identification for both independent and epistatic loci interactions. The MOEA method optimizes over the space of possible partial least squares (PLS) regression QTL models and considers the conflicting objectives of model simplicity versus model accuracy. By optimizing for minimal model complexity, MOEA has the advantage of solving the over-fitting problem of conventional PLS models. The effectiveness of the method is confirmed by comparing the new method with Bayesian Interval Mapping approaches over a series of test cases where the optimal solutions are known. This approach can be applied to many problems that arise in the analysis of genomic data sets where the number of features far exceeds the number of observations and where features can be highly correlated.

  10. Spatial MEG laterality maps for language: clinical applications in epilepsy.

    Science.gov (United States)

    D'Arcy, Ryan C N; Bardouille, Timothy; Newman, Aaron J; McWhinney, Sean R; Debay, Drew; Sadler, R Mark; Clarke, David B; Esser, Michael J

    2013-08-01

    Functional imaging is increasingly being used to provide a noninvasive alternative to intracarotid sodium amobarbital testing (i.e., the Wada test). Although magnetoencephalography (MEG) has shown significant potential in this regard, the resultant output is often reduced to a simplified estimate of laterality. Such estimates belie the richness of functional imaging data and consequently limit the potential value. We present a novel approach that utilizes MEG data to compute "complex laterality vectors" and consequently "laterality maps" for a given function. Language function was examined in healthy controls and in people with epilepsy. When compared with traditional laterality index (LI) approaches, the resultant maps provided critical information about the magnitude and spatial characteristics of lateralized function. Specifically, it was possible to more clearly define low LI scores resulting from strong bilateral activation, high LI scores resulting from weak unilateral activation, and most importantly, the spatial distribution of lateralized activation. We argue that the laterality concept is better presented with the inherent spatial sensitivity of activation maps, rather than being collapsed into a one-dimensional index. Copyright © 2012 Wiley Periodicals, Inc.
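
    The ambiguity the authors highlight is easy to demonstrate: a scalar LI conflates strong-bilateral with weak-unilateral activation, while a vector that keeps overall magnitude separates the two cases. The complex-number encoding below is an illustrative reading of the idea, not the paper's exact formulation.

```python
# Scalar laterality index: LI = (L - R) / (L + R), in [-1, 1].
def laterality_index(left, right):
    return (left - right) / (left + right)

# "Laterality vector" sketch: real part = total activation (magnitude),
# imaginary part = left-right difference (lateralization).
def laterality_vector(left, right):
    return complex(left + right, left - right)

strong_bilateral = (10.0, 9.0)   # LI near 0, but large total activation
weak_left = (1.1, 0.1)           # LI large, but small total activation

li_a = laterality_index(*strong_bilateral)        # ~0.05
li_b = laterality_index(*weak_left)               # ~0.83
mag_a = abs(laterality_vector(*strong_bilateral)) # large
mag_b = abs(laterality_vector(*weak_left))        # small
```

    The two cases are indistinguishable on the LI axis alone once thresholds are applied, but trivially separable once magnitude is retained per location.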

  11. Multi-Sensor Approach to Mapping Snow Cover Using Data From NASA's EOS Aqua and Terra Spacecraft

    Science.gov (United States)

    Armstrong, R. L.; Brodzik, M. J.

    2003-12-01

    Snow cover is an important variable for climate and hydrologic models due to its effects on energy and moisture budgets. Over the past several decades both optical and passive microwave satellite data have been utilized for snow mapping at the regional to global scale. For the period 1978 to 2002, we have shown earlier that both passive microwave and visible data sets indicate a similar pattern of inter-annual variability, although the maximum snow extents derived from the microwave data are, depending on season, less than those provided by the visible satellite data, and the visible data typically show higher monthly variability. Snow mapping using optical data is based on the magnitude of the surface reflectance, while microwave data can be used to identify snow cover because the microwave energy emitted by the underlying soil is scattered by the snow grains, resulting in a sharp decrease in brightness temperature and a characteristic negative spectral gradient. Our previous work has defined the respective advantages and disadvantages of these two types of satellite data for snow cover mapping, and it is clear that a blended product is optimal. We present a multi-sensor approach to snow mapping based both on historical data and on data from current NASA EOS sensors. For the period 1978 to 2002 we combine data from the NOAA weekly snow charts with passive microwave data from the SMMR and SSM/I brightness temperature record. For the current and future time period we blend MODIS and AMSR-E data sets. An example of validation at the brightness temperature level is provided through the comparison of AMSR-E with data from the well-calibrated heritage SSM/I sensor over a large homogeneous snow-covered surface (Dome C, Antarctica). Prototype snow cover maps from AMSR-E compare well with maps derived from SSM/I. Our current blended product is being developed in the 25 km EASE-Grid, while the MODIS data being used are in the Climate Modeling Grid (CMG) at approximately 5 km.
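
    A minimal rule-based sketch of the blending logic, assuming the complementary strengths described above: optical reflectance decides where the sky is clear, and the microwave negative spectral gradient fills cloud-obscured cells. The thresholds and variable names are illustrative, not the product's actual algorithm.

```python
# Optical (MODIS-like) flag: valid only under clear sky.
def snow_from_optical(reflectance, cloudy, threshold=0.4):
    return None if cloudy else reflectance > threshold

# Passive microwave (AMSR-E-like) flag: dry snow scatters 37 GHz emission,
# so Tb(19 GHz) - Tb(37 GHz) becomes strongly positive (negative gradient).
def snow_from_microwave(tb19, tb37, gradient_margin=5.0):
    return (tb19 - tb37) > gradient_margin

def blended_snow(reflectance, cloudy, tb19, tb37):
    optical = snow_from_optical(reflectance, cloudy)
    return snow_from_microwave(tb19, tb37) if optical is None else optical

clear_snow = blended_snow(0.7, False, 250.0, 240.0)   # optical answer used
cloudy_snow = blended_snow(0.0, True, 250.0, 240.0)   # microwave fills the gap
```

    The cloud-gap-filling behaviour is what gives the blended product full spatial coverage where an optical-only map would have missing cells.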

  12. A new multicriteria risk mapping approach based on a multiattribute frontier concept.

    Science.gov (United States)

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Downing, Marla; Sapio, Frank; Siltanen, Marty

    2013-09-01

    Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that typically require prior knowledge of the risk components' importance. Such information is often nonexistent for many invasive pests. This study proposes a new approach for building integrated risk maps using the principle of a multiattribute efficient frontier and analyzing the partial order of elements of a risk map as distributed in multidimensional criteria space. The integrated risks are estimated as subsequent multiattribute frontiers in dimensions of individual risk criteria. We demonstrate the approach with the example of Agrilus biguttatus Fabricius, a high-risk pest that may threaten North American oak forests in the near future. Drawing on U.S. and Canadian data, we compare the performance of the multiattribute ranking against a multicriteria linear weighted averaging technique in the presence of uncertainties, using the concept of robustness from info-gap decision theory. The results show major geographic hotspots where the consideration of tradeoffs between multiple risk components changes integrated risk rankings. Both methods delineate similar geographical regions of high and low risks. Overall, aggregation based on a delineation of multiattribute efficient frontiers can be a useful tool to prioritize risks for anticipated invasive pests, which usually have an extremely poor prior knowledge base. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
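
    The core ranking idea, peeling successive non-dominated (efficient) frontiers in the space of risk criteria without assuming criterion weights, can be sketched as below. The three criteria and the cell values are hypothetical.

```python
# Rank map cells by successive multiattribute efficient frontiers:
# cells on the first non-dominated frontier get rank 1, are removed,
# the next frontier gets rank 2, and so on. Criteria are "higher = riskier".
def dominates(a, b):
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def frontier_ranks(cells):
    ranks, remaining, rank = {}, dict(enumerate(cells)), 1
    while remaining:
        front = [i for i, c in remaining.items()
                 if not any(dominates(o, c) for j, o in remaining.items() if j != i)]
        for i in front:
            ranks[i] = rank
            del remaining[i]
        rank += 1
    return [ranks[i] for i in range(len(cells))]

# (climatic suitability, host abundance, introduction potential) per cell:
cells = [(0.9, 0.8, 0.7), (0.9, 0.8, 0.7), (0.5, 0.9, 0.2), (0.1, 0.2, 0.1)]
ranks = frontier_ranks(cells)
```

    Note that the third cell shares rank 1 despite lower overall values, because no other cell beats it on every criterion; this is exactly the tradeoff preservation that a weighted average would erase.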

  13. Linking retinotopic fMRI mapping and anatomical probability maps of human occipital areas V1 and V2.

    Science.gov (United States)

    Wohlschläger, A M; Specht, K; Lie, C; Mohlberg, H; Wohlschläger, A; Bente, K; Pietrzyk, U; Stöcker, T; Zilles, K; Amunts, K; Fink, G R

    2005-05-15

    Using functional MRI, we characterized field sign maps of the occipital cortex and created three-dimensional maps of these areas. By averaging the individual maps into group maps, probability maps of functionally defined V1 or V2 were determined and compared to anatomical probability maps of Brodmann areas BA17 and BA18 derived from cytoarchitectonic analysis (Amunts, K., Malikovic, A., Mohlberg, H., Schormann, T., Zilles, K., 2000. Brodmann's areas 17 and 18 brought into stereotaxic space-where and how variable? NeuroImage 11, 66-84). Comparison of areas BA17/V1 and BA18/V2 revealed good agreement of the anatomical and functional probability maps. Taking into account that our functional stimulation (due to constraints of the visual angle of stimulation achievable in the MR scanner) only identified parts of V1 and V2, for statistical evaluation of the spatial correlation of V1 and BA17, or V2 and BA18, respectively, the a priori measure kappa was calculated testing the hypothesis that a region can only be part of functionally defined V1 or V2 if it is also in anatomically defined BA17 or BA18, respectively. kappa = 1 means the hypothesis is fully true, kappa = 0 means functionally and anatomically defined visual areas are independent. When applying this measure to the probability maps, kappa was equal to 0.84 for both V1/BA17 and V2/BA18. The data thus show a good correspondence of functionally and anatomically derived segregations of early visual processing areas and serve as a basis for employing anatomical probability maps of V1 and V2 in group analyses to characterize functional activations of early visual processing areas.

  14. ReactionMap: an efficient atom-mapping algorithm for chemical reactions.

    Science.gov (United States)

    Fooshee, David; Andronico, Alessio; Baldi, Pierre

    2013-11-25

    Large databases of chemical reactions provide new data-mining opportunities and challenges. Key challenges result from the imperfect quality of the data and the fact that many of these reactions are not properly balanced or atom-mapped. Here, we describe ReactionMap, an efficient atom-mapping algorithm. Our approach uses a combination of maximum common chemical subgraph search and minimization of an assignment cost function derived empirically from training data. We use a set of over 259,000 balanced atom-mapped reactions from the SPRESI commercial database to train the system, and we validate it on random sets of 1000 and 17,996 reactions sampled from this pool. These large test sets represent a broad range of chemical reaction types, and ReactionMap correctly maps about 99% of the atoms and about 96% of the reactions, with a mean time per mapping of 2 s. Most correctly mapped reactions are mapped with high confidence. Mapping accuracy compares favorably with ChemAxon's AutoMapper, versions 5 and 6.1, and the DREAM Web tool. These approaches correctly map 60.7%, 86.5%, and 90.3% of the reactions, respectively, on the same data set. A ReactionMap server is available on the ChemDB Web portal at http://cdb.ics.uci.edu.
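
    The assignment-cost-minimization step can be illustrated in isolation. Real atom-mappers such as ReactionMap constrain the search with maximum common subgraphs and use an empirically trained cost function; the brute force over permutations and the cost matrix below are illustrative only, and feasible only for a handful of atoms.

```python
from itertools import permutations

# Choose the reactant-to-product atom correspondence minimizing total cost.
def best_assignment(cost):
    n = len(cost)
    best, best_perm = float("inf"), None
    for perm in permutations(range(n)):      # perm[i] = product atom for reactant atom i
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best:
            best, best_perm = total, perm
    return best_perm, best

# cost[i][j]: hypothetical penalty for mapping reactant atom i to product atom j
cost = [[0.1, 5.0, 5.0],
        [5.0, 5.0, 0.2],
        [5.0, 0.3, 5.0]]
mapping, total = best_assignment(cost)   # picks the three cheap pairings
```

    Production systems replace the factorial search with polynomial-time assignment algorithms (e.g. Hungarian-style matching) under the subgraph constraints.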

  15. Effective information flow through efficient supply chain management - Value stream mapping approach Case Outokumpu Tornio Works

    OpenAIRE

    Juvonen, Piia

    2012-01-01

    ABSTRACT Juvonen, Piia Suvi Päivikki 2012. Effective information flow through efficient supply chain management - Value stream mapping approach - Case Outokumpu Tornio Works. Master's Thesis. Kemi-Tornio University of Applied Sciences. Business and Culture. Pages 63. Appendices 2. The general aim of this thesis is to explore effective information flow through efficient supply chain management by following one of the lean management principles, value stream mapping. The specific research...

  16. An Optimization Approach to Improving Collections of Shape Maps

    DEFF Research Database (Denmark)

    Nguyen, Andy; Ben‐Chen, Mirela; Welnicka, Katarzyna

    2011-01-01

    pairwise map independently does not take full advantage of all existing information. For example, a notorious problem with computing shape maps is the ambiguity introduced by the symmetry problem — for two similar shapes which have reflectional symmetry there exist two maps which are equally favorable...... shape maps connecting our collection, we propose to add the constraint of global map consistency, requiring that any composition of maps between two shapes should be independent of the path chosen in the network. This requirement can help us choose among the equally good symmetric alternatives, or help...

  17. Integrative approach to produce hydrogen and polyhydroxybutyrate from biowaste using defined bacterial cultures.

    Science.gov (United States)

    Patel, Sanjay K S; Kumar, Prasun; Singh, Mamtesh; Lee, Jung-Kul; Kalia, Vipin C

    2015-01-01

    Biological production of hydrogen (H2) and polyhydroxybutyrate (PHB) from pea-shell slurry (PSS) was investigated using a defined mixed culture (MMC4, composed of Enterobacter, Proteus, and Bacillus spp.). Under batch culture, 19.0 L H2/kg of PSS (total solid, TS, 2% w/v) was evolved. Using effluent from the H2-producing stage, Bacillus cereus EGU43 could produce 12.4% (w/w) PHB. Dilutions of PSS hydrolysate containing glucose (0.5%, w/v) resulted in 45-75 L H2/kg TS fed and a PHB content of 19.1% (w/w). Under continuous culture, MMC4 immobilized on coconut coir (CC) led to an H2 yield of 54 L/kg TS fed and a PHB content of 64.7% (w/w). Improvements of 2- and 3.7-fold in H2 and PHB yields were achieved in comparison to the control. This integrative approach using a defined set of bacterial strains can prove effective in producing biomolecules from biowastes. Copyright © 2014. Published by Elsevier Ltd.

  18. Standardisation of defined approaches for skin sensitisation testing to support regulatory use and international adoption: position of the International Cooperation on Alternative Test Methods.

    Science.gov (United States)

    Casati, S; Aschberger, K; Barroso, J; Casey, W; Delgado, I; Kim, T S; Kleinstreuer, N; Kojima, H; Lee, J K; Lowit, A; Park, H K; Régimbald-Krnel, M J; Strickland, J; Whelan, M; Yang, Y; Zuang, Valérie

    2018-02-01

    Skin sensitisation is the regulatory endpoint that has been at the centre of concerted efforts to replace animal testing in recent years, as demonstrated by the Organisation for Economic Co-operation and Development (OECD) adoption of five non-animal methods addressing mechanisms under the first three key events of the skin sensitisation adverse outcome pathway. Nevertheless, the currently adopted methods, when used in isolation, are not sufficient to fulfil regulatory requirements on the skin sensitisation potential and potency of chemicals comparable to that provided by the regulatory animal tests. For this reason, a number of defined approaches integrating data from these methods with other relevant information have been proposed and documented by the OECD. With the aim to further enhance regulatory consideration and adoption of defined approaches, the European Union Reference Laboratory for Alternatives to Animal testing in collaboration with the International Cooperation on Alternative Test Methods hosted, on 4-5 October 2016, a workshop on the international regulatory applicability and acceptance of alternative non-animal approaches, i.e., defined approaches, to skin sensitisation assessment of chemicals used in a variety of sectors. The workshop convened representatives from more than 20 regulatory authorities from the European Union, United States, Canada, Japan, South Korea, Brazil and China. There was a general consensus among the workshop participants that to maximise global regulatory acceptance of data generated with defined approaches, international harmonisation and standardisation are needed. Potential assessment criteria were defined for a systematic evaluation of existing defined approaches that would facilitate their translation into international standards, e.g., into a performance-based Test Guideline. Informed by the discussions at the workshop, the ICATM members propose practical ways to further promote the regulatory use and facilitate

  19. MAP3D: a media processor approach for high-end 3D graphics

    Science.gov (United States)

    Darsa, Lucia; Stadnicki, Steven; Basoglu, Chris

    1999-12-01

    Equator Technologies, Inc. has used a software-first approach to produce several programmable and advanced VLIW processor architectures that have the flexibility to run both traditional systems tasks and an array of media-rich applications. For example, Equator's MAP1000A is the world's fastest single-chip programmable signal and image processor targeted for digital consumer and office automation markets. The Equator MAP3D is a proposal for the architecture of the next generation of the Equator MAP family. The MAP3D is designed to achieve high-end 3D performance and a variety of customizable special effects by combining special graphics features with a high-performance floating-point and media processor architecture. As a programmable media processor, it offers the advantages of a completely configurable 3D pipeline--allowing developers to experiment with different algorithms and to tailor their pipeline to achieve the highest performance for a particular application. With the support of Equator's advanced C compiler and toolkit, MAP3D programs can be written in a high-level language. This allows the compiler to find and exploit any parallelism in a programmer's code, thus decreasing the time to market of a given application. The ability to run an operating system makes it possible to run concurrent applications on the MAP3D chip, such as video decoding while executing the 3D pipelines, so that integration of applications is easily achieved--using real-time decoded imagery for texturing 3D objects, for instance. This novel architecture enables an affordable, integrated solution for high-performance 3D graphics.

  20. A multiscale approach to mapping seabed sediments.

    Directory of Open Access Journals (Sweden)

    Benjamin Misiuk

    Full Text Available Benthic habitat maps, including maps of seabed sediments, have become critical spatial-decision support tools for marine ecological management and conservation. Despite the increasing recognition that environmental variables should be considered at multiple spatial scales, variables used in habitat mapping are often implemented at a single scale. The objective of this study was to evaluate the potential for using environmental variables at multiple scales for modelling and mapping seabed sediments. Sixteen environmental variables were derived from multibeam echosounder data collected near Qikiqtarjuaq, Nunavut, Canada at eight spatial scales ranging from 5 to 275 m, and were tested as predictor variables for modelling seabed sediment distributions. Using grain size data obtained from grab samples, we tested which scales of each predictor variable contributed most to sediment models. Results showed that the default scale was often not the best. Out of 129 potential scale-dependent variables, 11 were selected to model the additive log-ratio of mud and sand at five different scales, and 15 were selected to model the additive log-ratio of gravel and sand, also at five different scales. Boosted Regression Tree models that explained between 46.4 and 56.3% of statistical deviance produced multiscale predictions of mud, sand, and gravel that were correlated with cross-validated test data (Spearman's ρ_mud = 0.77, ρ_sand = 0.71, ρ_gravel = 0.58). Predictions of individual size fractions were classified to produce a map of seabed sediments that is useful for marine spatial planning. Based on the scale-dependence of variables in this study, we concluded that spatial scale consideration is at least as important as variable selection in seabed mapping.
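
    The additive log-ratio (alr) targets used here are the standard compositional-data transform, with sand as the reference part; the sketch below shows the transform and its inverse, which recovers sediment fractions from model predictions. The example composition is illustrative.

```python
import math

# alr of a (mud, sand, gravel) composition, with sand as the reference part.
def alr(mud, sand, gravel):
    return math.log(mud / sand), math.log(gravel / sand)

# Inverse alr: map two log-ratio predictions back to fractions summing to 1.
def alr_inverse(a1, a2):
    mud, gravel = math.exp(a1), math.exp(a2)
    total = mud + 1.0 + gravel          # sand's contribution is exp(0) = 1
    return mud / total, 1.0 / total, gravel / total

a1, a2 = alr(0.2, 0.5, 0.3)
mud, sand, gravel = alr_inverse(a1, a2)   # round-trips to (0.2, 0.5, 0.3)
```

    Modelling the two unconstrained log-ratios instead of the three raw fractions guarantees the back-transformed predictions are positive and sum to one.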

  1. Conceptualizing Stakeholders' Perceptions of Ecosystem Services: A Participatory Systems Mapping Approach

    Science.gov (United States)

    Lopes, Rita; Videira, Nuno

    2015-12-01

    A participatory system dynamics modelling approach is advanced to support conceptualization of feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes systems mapping workshop and follow-up tasks aiming at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

  2. Conceptualizing Stakeholders’ Perceptions of Ecosystem Services: A Participatory Systems Mapping Approach

    Directory of Open Access Journals (Sweden)

    Lopes Rita

    2015-12-01

    Full Text Available A participatory system dynamics modelling approach is advanced to support conceptualization of feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes systems mapping workshop and follow-up tasks aiming at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

  3. Evaluation of various modelling approaches in flood routing simulation and flood area mapping

    Science.gov (United States)

    Papaioannou, George; Loukas, Athanasios; Vasiliades, Lampros; Aronica, Giuseppe

    2016-04-01

    An essential process of flood hazard analysis and mapping is floodplain modelling. The selection of the modelling approach, especially in complex riverine topographies such as urban and suburban areas, and in ungauged watersheds, may affect the accuracy of the outcomes in terms of flood depths and flood inundation area. In this study, a sensitivity analysis was implemented using several hydraulic-hydrodynamic modelling approaches (1D, 2D, 1D/2D), and the effect of the modelling approach on flood modelling and flood mapping was investigated. The digital terrain model (DTM) used in this study was generated from Terrestrial Laser Scanning (TLS) point cloud data. The modelling approaches included 1-dimensional hydraulic-hydrodynamic models (1D), 2-dimensional hydraulic-hydrodynamic models (2D) and coupled 1D/2D models. The 1D hydraulic-hydrodynamic models used were: HECRAS, MIKE11, LISFLOOD, XPSTORM. The 2D hydraulic-hydrodynamic models used were: MIKE21, MIKE21FM, HECRAS (2D), XPSTORM, LISFLOOD and FLO2d. The coupled 1D/2D models employed were: HECRAS (1D/2D), MIKE11/MIKE21 (MIKE FLOOD platform), MIKE11/MIKE21 FM (MIKE FLOOD platform), XPSTORM (1D/2D). The validation of the simulated flood extent was achieved with the use of 2x2 contingency tables between simulated and observed flooded areas for an extreme historical flash flood event. The Critical Success Index skill score was used in the validation process. The modelling approaches were also evaluated for simulation time and required computing power. The methodology was implemented in a suburban ungauged watershed of the Xerias river at Volos, Greece. The results of the analysis indicate the necessity of applying sensitivity analysis with different hydraulic-hydrodynamic modelling approaches, especially for areas with complex terrain.
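The Critical Success Index used for validation follows directly from the 2x2 contingency table of simulated versus observed flooded pixels; a minimal sketch (the flattened boolean-list data layout is an illustrative assumption):

```python
def contingency(sim, obs):
    """2x2 contingency cells for simulated vs. observed flooded pixels."""
    hits = sum(s and o for s, o in zip(sim, obs))
    false_alarms = sum(s and not o for s, o in zip(sim, obs))
    misses = sum(not s and o for s, o in zip(sim, obs))
    correct_negatives = sum(not s and not o for s, o in zip(sim, obs))
    return hits, false_alarms, misses, correct_negatives

def csi(hits, false_alarms, misses):
    """Critical Success Index; correct negatives (dry-dry pixels) are ignored."""
    return hits / (hits + false_alarms + misses)
```

Ignoring the dry-dry cell is what makes CSI suitable here: in a flood map most of the domain is correctly dry, and counting those pixels would inflate simpler accuracy scores.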

  4. A unified theoretical framework for mapping models for the multi-state Hamiltonian.

    Science.gov (United States)

    Liu, Jian

    2016-11-28

    We propose a new unified theoretical framework to construct equivalent representations of the multi-state Hamiltonian operator and present several approaches for the mapping onto the Cartesian phase space. After mapping an F-dimensional Hamiltonian onto an F+1 dimensional space, creation and annihilation operators are defined such that the F+1 dimensional space is complete for any combined excitation. Commutation and anti-commutation relations are then naturally derived, which show that the underlying degrees of freedom are neither bosons nor fermions. This sets the scene for developing equivalent expressions of the Hamiltonian operator in quantum mechanics and their classical/semiclassical counterparts. Six mapping models are presented as examples. The framework also offers a novel way to derive mapping models such as the well-known Meyer-Miller model.
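For reference, the classical Meyer-Miller mapping Hamiltonian mentioned above is commonly written in the following form (a sketch from the standard literature, not quoted from this paper; γ is the zero-point parameter, fixed at 1 in the exact boson-mapping limit but often treated as adjustable):

```latex
H_{\mathrm{MM}}(\mathbf{x},\mathbf{p})
  = \sum_{n,m=1}^{F} \tfrac{1}{2}\bigl(x_n x_m + p_n p_m - \gamma\,\delta_{nm}\bigr)\, H_{nm}
```

Each electronic state n is represented by a harmonic oscillator pair (x_n, p_n), so the F-state quantum problem becomes continuous classical dynamics on the mapped phase space.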

  5. A novel continuous colour mapping approach for visualization of facial skin hydration and transepidermal water loss for four ethnic groups.

    Science.gov (United States)

    Voegeli, R; Rawlings, A V; Seroul, P; Summers, B

    2015-12-01

    The aim of this exploratory study was to develop a novel colour mapping approach to visualize and interpret the complexity of facial skin hydration and barrier properties of four ethnic groups (Caucasians, Indians, Chinese and Black Africans) living in Pretoria, South Africa. We measured transepidermal water loss (TEWL) and skin capacitance on 30 pre-defined sites on the forehead, cheek, jaw and eye areas of sixteen women (four per ethnic group) and took digital images of their faces. Continuous colour maps were generated by interpolating between each measured value and superimposing the values on the digital images. The complexity of facial skin hydration and skin barrier properties is revealed by these measurements and visualized by the continuous colour maps of the digital images. Overall, the Caucasian subjects had the best barrier properties followed by the Black African subjects, Chinese subjects and Indian subjects. Nevertheless, the two more darkly pigmented ethnic groups had superior skin hydration properties. Subtle differences were seen when examining the different facial sites. There exist remarkable skin capacitance and TEWL gradients within short distances on selected areas of the face. These gradients are distinctive in the different ethnic groups. In contrast to other reports, we found that darkly pigmented skin does not always have a superior barrier function and differences in skin hydration values are complex on the different parts of the face among the different ethnic groups. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
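Continuous colour maps of this kind interpolate between sparse site measurements. The abstract does not name the interpolation scheme, so the sketch below uses simple inverse-distance weighting as a stand-in:

```python
def idw(x, y, sites, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from sparse measurements.

    sites: list of (xi, yi, value) tuples, e.g. TEWL readings at facial sites.
    """
    num = den = 0.0
    for xi, yi, v in sites:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v  # exactly at a measured site: return the measurement
        w = 1.0 / d2 ** (power / 2.0)  # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den
```

Evaluating this at every pixel of the face image and mapping the value to a colour scale yields the continuous overlay described in the abstract.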

  6. In Silico Design of Human IMPDH Inhibitors Using Pharmacophore Mapping and Molecular Docking Approaches

    Directory of Open Access Journals (Sweden)

    Rui-Juan Li

    2015-01-01

    Full Text Available Inosine 5′-monophosphate dehydrogenase (IMPDH) is one of the crucial enzymes in the de novo biosynthesis of guanosine nucleotides. It has served as an attractive target in immunosuppressive, anticancer, antiviral, and antiparasitic therapeutic strategies. In this study, pharmacophore mapping and molecular docking approaches were employed to discover novel Homo sapiens IMPDH (hIMPDH) inhibitors. The Güner-Henry (GH) scoring method was used to evaluate the quality of generated pharmacophore hypotheses. One of the generated pharmacophore hypotheses was found to possess a GH score of 0.67. Ten potential compounds were selected from the ZINC database using a pharmacophore mapping approach and docked into the IMPDH active site. We found two hits (i.e., ZINC02090792 and ZINC00048033) that match the optimal pharmacophore features used in this investigation well and form interactions with key residues of IMPDH. We propose that these two hits are lead compounds for the development of novel hIMPDH inhibitors.

  7. Defining Precision Medicine Approaches to Autism Spectrum Disorders: Concepts and Challenges.

    Science.gov (United States)

    Loth, Eva; Murphy, Declan G; Spooren, Will

    2016-01-01

    The tremendous clinical and etiological variability between individuals with autism spectrum disorder (ASD) has made precision medicine the most promising treatment approach. It aims to combine new pathophysiologically based treatments with objective tests (stratification biomarkers) to predict which treatment may be beneficial for a particular person. Here we discuss significant advances and current challenges for this approach: rare monogenic forms of ASD have provided a major breakthrough for the identification of treatment targets by providing a means to trace causal links from a gene to specific molecular alterations and biological pathways. To estimate whether treatment targets thus identified may be useful for larger patient groups we need a better understanding of whether different etiologies (i.e., genetic and environmental risk factors acting at different critical time points) lead to convergent or divergent molecular mechanisms, and how they map onto differences in circuit-level brain and cognitive development, and behavioral symptom profiles. Several recently failed clinical trials with syndromic forms of ASD provide valuable insights into conceptual and methodological issues linked to limitations in the translatability from animal models to humans, placebo effects, and a need for mechanistically plausible, objective outcome measures. To identify stratification biomarkers that enrich participant selection in clinical trials, large-scale multi-modal longitudinal observational studies are underway. Addressing these different factors in the next generation of research studies requires a translatable developmental perspective and multidisciplinary, collaborative efforts, with a commitment to sharing protocols and data, to increase transparency and reproducibility.

  8. Mapping enzymatic catalysis using the effective fragment molecular orbital method

    DEFF Research Database (Denmark)

    Svendsen, Casper Steinmann; Fedorov, Dmitri G.; Jensen, Jan Halborg

    2013-01-01

    We extend the Effective Fragment Molecular Orbital (EFMO) method to the frozen domain approach where only the geometry of an active part is optimized, while the many-body polarization effects are considered for the whole system. The new approach efficiently mapped out the entire reaction path of chorismate mutase in less than four days using 80 cores on 20 nodes, where the whole system containing 2398 atoms is treated in the ab initio fashion without using any force fields. The reaction path is constructed automatically with the only assumption of defining the reaction coordinate a priori. We...

  9. A high-throughput shotgun mutagenesis approach to mapping B-cell antibody epitopes.

    Science.gov (United States)

    Davidson, Edgar; Doranz, Benjamin J

    2014-09-01

    Characterizing the binding sites of monoclonal antibodies (mAbs) on protein targets, their 'epitopes', can aid in the discovery and development of new therapeutics, diagnostics and vaccines. However, the speed of epitope mapping techniques has not kept pace with the increasingly large numbers of mAbs being isolated. Obtaining detailed epitope maps for functionally relevant antibodies can be challenging, particularly for conformational epitopes on structurally complex proteins. To enable rapid epitope mapping, we developed a high-throughput strategy, shotgun mutagenesis, that enables the identification of both linear and conformational epitopes in a fraction of the time required by conventional approaches. Shotgun mutagenesis epitope mapping is based on large-scale mutagenesis and rapid cellular testing of natively folded proteins. Hundreds of mutant plasmids are individually cloned, arrayed in 384-well microplates, expressed within human cells, and tested for mAb reactivity. Residues are identified as a component of a mAb epitope if their mutation (e.g. to alanine) does not support candidate mAb binding but does support that of other conformational mAbs or allows full protein function. Shotgun mutagenesis is particularly suited for studying structurally complex proteins because targets are expressed in their native form directly within human cells. Shotgun mutagenesis has been used to delineate hundreds of epitopes on a variety of proteins, including G protein-coupled receptor and viral envelope proteins. The epitopes mapped on dengue virus prM/E represent one of the largest collections of epitope information for any viral protein, and results are being used to design better vaccines and drugs. © 2014 John Wiley & Sons Ltd.
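The epitope-calling rule described above (a residue belongs to the epitope if its mutation ablates candidate-mAb binding while control conformational mAbs still bind, showing the mutant is folded) can be sketched as follows; the data layout, mAb names, and threshold are illustrative assumptions:

```python
def epitope_residues(reactivity, candidate, controls, threshold=0.3):
    """Flag residues whose mutation ablates candidate-mAb binding while
    control conformational mAbs still bind (i.e. the mutant is natively folded).

    reactivity: {residue: {mab_name: normalized binding, 1.0 = wild type}}
    """
    hits = []
    for residue, binding in reactivity.items():
        lost_candidate = binding[candidate] < threshold
        still_folded = all(binding[c] >= threshold for c in controls)
        if lost_candidate and still_folded:
            hits.append(residue)
    return hits
```

The control check is what separates true epitope residues from mutations that simply misfold the whole protein and knock out every mAb at once.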

  10. One perspective on spatial variability in geologic mapping

    Science.gov (United States)

    Markewich, H.W.; Cooper, S.C.

    1991-01-01

    This paper discusses some of the differences between geologic mapping and soil mapping, and how the resultant maps are interpreted. The role of spatial variability in geologic mapping is addressed only indirectly because in geologic mapping there have been few attempts at quantification of spatial differences. This is largely because geologic maps deal with temporal as well as spatial variability and consider time, age, and origin, as well as composition and geometry. Both soil scientists and geologists use spatial variability to delineate mappable units; however, the classification systems from which these mappable units are defined differ greatly. Mappable soil units are derived from systematic, well-defined, highly structured sets of taxonomic criteria; whereas mappable geologic units are based on a more arbitrary hierarchy of categories that integrate many features without strict values or definitions. Soil taxonomy is a sorting tool used to reduce heterogeneity between soil units. Thus at the series level, soils in any one series are relatively homogeneous because their range of properties is small and well-defined. Soil maps show the distribution of soils on the land surface. Within a map area, soils, which are often less than 2 m thick, show a direct correlation to topography and to active surface processes as well as to parent material.

  11. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    Science.gov (United States)

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  12. Open Land-Use Map: A Regional Land-Use Mapping Strategy for Incorporating OpenStreetMap with Earth Observations

    Science.gov (United States)

    Yang, D.; Fu, C. S.; Binford, M. W.

    2017-12-01

    The southeastern United States has high landscape heterogeneity, with heavily managed forestlands, highly developed agricultural lands, and multiple metropolitan areas. Human activities are transforming and altering land patterns and structures in both negative and positive ways. A land-use map at the regional scale is a heavy computational task but is critical to most landowners, researchers, and decision makers, enabling them to make informed decisions for varying objectives. There are two major difficulties in generating classification maps at the regional scale: the necessity of large training point sets and the expensive computational cost, in terms of both money and time, of classifier modeling. Volunteered Geographic Information (VGI) opens a new era in mapping and visualizing our world, where the platform is open for collecting valuable georeferenced information by volunteer citizens, and the data are freely available to the public. As one of the most well-known VGI initiatives, OpenStreetMap (OSM) contributes not only road network distribution, but also the potential for using these data to justify land cover and land use classifications. Google Earth Engine (GEE) is a platform designed for cloud-based mapping with robust and fast computing power. Most large-scale and national mapping approaches confuse "land cover" and "land-use", or build up the land-use database based on modeled land cover datasets. Unlike most other large-scale approaches, we distinguish and differentiate land-use from land cover. By focusing on our prime objective of mapping land-use and management practices, a robust regional land-use mapping approach is developed by incorporating the OpenStreetMap dataset into Earth observation remote sensing imagery instead of the often-used land cover base maps.
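One way to realize the use/cover distinction above is to derive training labels only from OSM tags that describe how land is used (the `landuse` key) rather than what covers it (`natural`, `landcover`). The tag-to-class lookup below is a hypothetical illustration, not the authors' actual scheme:

```python
# Hypothetical lookup: OSM "landuse" values describe use, while
# "natural"/"landcover" values describe cover; keeping them separate
# avoids conflating the two concepts when building training labels.
LANDUSE_CLASSES = {
    "residential": "urban", "industrial": "urban",
    "farmland": "agriculture", "orchard": "agriculture",
    "forest": "managed_forest",
}

def label_from_tags(tags):
    """Return a land-USE training label for one OSM feature, or None
    when the feature carries only land-cover information."""
    return LANDUSE_CLASSES.get(tags.get("landuse"))
```

Features that return None (e.g. a `natural=wood` polygon with no `landuse` tag) would be withheld from the land-use training set rather than silently treated as a use class.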

  13. The cancer cell map initiative: defining the hallmark networks of cancer.

    Science.gov (United States)

    Krogan, Nevan J; Lippman, Scott; Agard, David A; Ashworth, Alan; Ideker, Trey

    2015-05-21

    Progress in DNA sequencing has revealed the startling complexity of cancer genomes, which typically carry thousands of somatic mutations. However, it remains unclear which are the key driver mutations or dependencies in a given cancer and how these influence pathogenesis and response to therapy. Although tumors of similar types and clinical outcomes can have patterns of mutations that are strikingly different, it is becoming apparent that these mutations recurrently hijack the same hallmark molecular pathways and networks. For this reason, it is likely that successful interpretation of cancer genomes will require comprehensive knowledge of the molecular networks under selective pressure in oncogenesis. Here we announce the creation of a new effort, The Cancer Cell Map Initiative (CCMI), aimed at systematically detailing these complex interactions among cancer genes and how they differ between diseased and healthy states. We discuss recent progress that enables creation of these cancer cell maps across a range of tumor types and how they can be used to target networks disrupted in individual patients, significantly accelerating the development of precision medicine. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. A fast and cost-effective approach to develop and map EST-SSR markers: oak as a case study

    Directory of Open Access Journals (Sweden)

    Cherubini Marcello

    2010-10-01

    Full Text Available Abstract Background Expressed Sequence Tags (ESTs) are a source of simple sequence repeats (SSRs) that can be used to develop molecular markers for genetic studies. The availability of ESTs for Quercus robur and Quercus petraea provided a unique opportunity to develop microsatellite markers to accelerate research aimed at studying adaptation of these long-lived species to their environment. As a first step toward the construction of a SSR-based linkage map of oak for quantitative trait locus (QTL) mapping, we describe the mining and survey of EST-SSRs as well as a fast and cost-effective approach (bin mapping) to assign these markers to an approximate map position. We also compared the level of polymorphism between genomic and EST-derived SSRs and address the transferability of EST-SSRs in Castanea sativa (chestnut). Results A catalogue of 103,000 Sanger ESTs was assembled into 28,024 unigenes, of which 18.6% presented one or more SSR motifs. More than 42% of these SSRs corresponded to trinucleotides. Primer pairs were designed for 748 putative unigenes. Overall, 37.7% (283) were found to amplify a single polymorphic locus in a reference full-sib pedigree of Quercus robur. The usefulness of these loci for establishing a genetic map was assessed using a bin mapping approach. Bin maps were constructed for the male and female parental trees, for which framework linkage maps based on AFLP markers were available. The bin set consisted of 14 highly informative offspring selected based on the number and position of crossover sites. The female and male maps comprised 44 and 37 bins, with an average bin length of 16.5 cM and 20.99 cM, respectively. A total of 256 EST-SSRs were assigned to bins and their map position was further validated by linkage mapping. EST-SSRs were found to be less polymorphic than genomic SSRs, but their transferability rate to chestnut, a phylogenetically related species to oak, was higher.
Conclusion We have generated a bin map for oak

  15. Region & Gateway Mapping

    OpenAIRE

    Schröter, Derik

    2007-01-01

    State-of-the-art robot mapping approaches are capable of acquiring impressively accurate 2D and 3D models of their environments. To the best of our knowledge, few of them represent structure or acquire models of task-relevant objects. In this work, a new approach to mapping of indoor environments is presented, in which the environment structure in terms of regions and gateways is automatically extracted, while the robot explores. Objects, both in 2D and 3D, are modeled explicitly in those map...

  16. Groundwater potentiality mapping using geoelectrical-based aquifer hydraulic parameters: A GIS-based multi-criteria decision analysis modeling approach

    Directory of Open Access Journals (Sweden)

    Kehinde Anthony Mogaji; Hwee San Lim

    2017-01-01

    Full Text Available This study conducted a robust analysis of acquired 2D resistivity imaging data and borehole pumping test (BPT) records to optimize groundwater potentiality mapping in Perak province, Malaysia using derived aquifer hydraulic properties. The transverse resistance (TR) parameter was determined from the interpreted 2D resistivity imaging data by applying the Dar-Zarrouk parameter equation. Linear regression and GIS techniques were used to regress the estimated TR values against the aquifer transmissivity values extracted from the geospatially produced BPT records-based aquifer transmissivity map, to develop the aquifer transmissivity parameter predictive (ATPP) model. The ATPP model, whose reliability was evaluated using the Theil inequality coefficient measurement approach, was used to establish geoelectrical-based hydraulic parameter (GHP) modeling equations for the modeling of transmissivity (Tr), hydraulic conductivity (K), storativity (St), and hydraulic diffusivity (D) properties. The GHP modeling equations were applied to the delineated aquifer media to produce aquifer potential conditioning factor maps for Tr, K, St, and D. The maps were modeled to develop an aquifer potential mapping index (APMI) model by applying the multi-criteria decision analysis-analytic hierarchy process principle. The area groundwater reservoir productivity potential map produced from the processed APMI model estimates in the GIS environment was found to be 71% accurate. This study establishes a good alternative approach to determining aquifer hydraulic parameters, even in areas where pumping test information is unavailable, using cost-effective geophysical data. The produced map can be explored for hydrological decision making.
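The Dar-Zarrouk transverse resistance for a layered geoelectrical section is the thickness-weighted sum of layer resistivities; a minimal sketch (the companion longitudinal conductance is included for completeness):

```python
def transverse_resistance(layers):
    """Dar-Zarrouk transverse resistance TR = sum(rho_i * h_i) in ohm.m^2.

    layers: list of (resistivity_ohm_m, thickness_m) tuples for one sounding,
    e.g. taken from an inverted 2D resistivity section.
    """
    return sum(rho * h for rho, h in layers)

def longitudinal_conductance(layers):
    """Companion Dar-Zarrouk parameter S = sum(h_i / rho_i) in siemens."""
    return sum(h / rho for rho, h in layers)
```

Regressing TR values computed this way against pumping-test transmissivities is the step the abstract describes as building the ATPP model.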

  17. Fusion Approaches for Land Cover Map Production Using High Resolution Image Time Series without Reference Data of the Corresponding Period

    Directory of Open Access Journals (Sweden)

    Benjamin Tardy

    2017-11-01

    Full Text Available Optical sensor time series images allow one to produce land cover maps at a large scale. Supervised classification algorithms have been shown to be the best way to produce such maps automatically with good accuracy. The main drawback of these methods is the need for reference data, the collection of which can introduce long production delays. Therefore, the maps are often available too late for some applications. Domain adaptation methods seem to be efficient for using past data for land cover map production. Following this idea, the main goal of this study is to propose several simple past-data fusion schemes to overcome current land cover map production delays. A single-classifier approach and three voting rules are considered to produce maps without reference data of the corresponding period. These four approaches reach an overall accuracy of around 80% with a 17-class nomenclature using Formosat-2 image time series. A study of the impact of the number of past periods used was also conducted; it shows that the overall accuracy increases with the number of periods used. The proposed methods require data from at least two or three previous years.
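A simple majority-vote fusion of per-pixel predictions from classifiers trained on past periods might look like the sketch below; the recency tie-break is an illustrative assumption, since the abstract does not specify the three voting rules:

```python
from collections import Counter

def majority_vote(predictions):
    """Fuse per-pixel class predictions from classifiers trained on past
    periods, ordered oldest to newest; ties go to the most recent period."""
    counts = Counter(predictions)
    best = max(counts.values())
    for label in reversed(predictions):  # newest first, so recency wins ties
        if counts[label] == best:
            return label

def fuse_maps(maps):
    """maps: list of equally sized per-period label lists -> fused label list."""
    return [majority_vote(pixel) for pixel in zip(*maps)]
```

A pixel's fused label is thus the class most of the past-period classifiers agree on, which is what lets a current-period map be issued before current-period reference data exist.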

  18. MAP estimators and their consistency in Bayesian nonparametric inverse problems

    KAUST Repository

    Dashti, M.

    2013-09-01

    We consider the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map G applied to u. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field μ0. We work under a natural set of conditions on the likelihood which implies the existence of a well-posed posterior measure, μy. Under these conditions, we show that the maximum a posteriori (MAP) estimator is well defined as the minimizer of an Onsager-Machlup functional defined on the Cameron-Martin space of the prior; thus, we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency for the MAP estimator. We also prove a similar result for the case where the observation of G(u) can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier-Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics. © 2013 IOP Publishing Ltd.
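In this Gaussian-prior setting the Onsager-Machlup functional minimized by the MAP estimator takes a Tikhonov-like form; a sketch of the structure (assuming, for concreteness, additive Gaussian observational noise with covariance Γ, which is one of the standard cases rather than a detail stated in the abstract):

```latex
I(u) \;=\; \Phi(u; y) \;+\; \tfrac{1}{2}\,\|u\|_{E}^{2},
\qquad
\Phi(u; y) \;=\; \tfrac{1}{2}\,\bigl|\,\Gamma^{-1/2}\bigl(y - G(u)\bigr)\bigr|^{2},
```

where E is the Cameron-Martin space of the prior μ0 and Φ is the negative log-likelihood. The data-misfit term plays the role of the least-squares functional and the Cameron-Martin norm plays the role of the regularizer, which is the promised link between probability and the calculus of variations.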

  19. MAP estimators and their consistency in Bayesian nonparametric inverse problems

    International Nuclear Information System (INIS)

    Dashti, M; Law, K J H; Stuart, A M; Voss, J

    2013-01-01

    We consider the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map G applied to u. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field μ0. We work under a natural set of conditions on the likelihood which implies the existence of a well-posed posterior measure, μy. Under these conditions, we show that the maximum a posteriori (MAP) estimator is well defined as the minimizer of an Onsager–Machlup functional defined on the Cameron–Martin space of the prior; thus, we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency for the MAP estimator. We also prove a similar result for the case where the observation of G(u) can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier–Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics. (paper)

  20. Karst groundwater protection: First application of a Pan-European Approach to vulnerability, hazard and risk mapping in the Sierra de Libar (Southern Spain)

    International Nuclear Information System (INIS)

    Andreo, Bartolome; Goldscheider, Nico; Vadillo, Inaki; Vias, Jesus Maria; Neukum, Christoph; Sinreich, Michael; Jimenez, Pablo; Brechenmacher, Julia; Carrasco, Francisco; Hoetzl, Heinz; Perles, Maria Jesus; Zwahlen, Francois

    2006-01-01

    The European COST action 620 proposed a comprehensive approach to karst groundwater protection, comprising methods of intrinsic and specific vulnerability mapping, validation of vulnerability maps, hazard and risk mapping. This paper presents the first application of all components of this Pan-European Approach to the Sierra de Libar, a karst hydrogeology system in Andalusia, Spain. The intrinsic vulnerability maps take into account the hydrogeological characteristics of the area but are independent from specific contaminant properties. Two specific vulnerability maps were prepared for faecal coliforms and BTEX. These maps take into account the specific properties of these two groups of contaminants and their interaction with the karst hydrogeological system. The vulnerability assessment was validated by means of tracing tests, hydrological, hydrochemical and isotope methods. The hazard map shows the localization of potential contamination sources resulting from human activities, and evaluates those according to their dangerousness. The risk of groundwater contamination depends on the hazards and the vulnerability of the aquifer system. The risk map for the Sierra de Libar was thus created by overlaying the hazard and vulnerability maps

  1. Karst groundwater protection: First application of a Pan-European Approach to vulnerability, hazard and risk mapping in the Sierra de Libar (Southern Spain)

    Energy Technology Data Exchange (ETDEWEB)

    Andreo, Bartolome [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain)]. E-mail: Andreo@uma.es; Goldscheider, Nico [Centre of Hydrogeology, University of Neuchatel, 11 rue Emile-Argand, CH-2007 Neuchatel (Switzerland); Vadillo, Inaki [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain); Vias, Jesus Maria [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain); Neukum, Christoph [Department of Applied Geology, University of Karlsruhe, Kaiserstrasse, 12, D-76128 Karlsruhe (Germany); Sinreich, Michael [Centre of Hydrogeology, University of Neuchatel, 11 rue Emile-Argand, CH-2007 Neuchatel (Switzerland); Jimenez, Pablo [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain); Brechenmacher, Julia [Department of Applied Geology, University of Karlsruhe, Kaiserstrasse, 12, D-76128 Karlsruhe (Germany); Carrasco, Francisco [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain); Hoetzl, Heinz [Department of Applied Geology, University of Karlsruhe, Kaiserstrasse, 12, D-76128 Karlsruhe (Germany); Perles, Maria Jesus [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain); Zwahlen, Francois [Centre of Hydrogeology, University of Neuchatel, 11 rue Emile-Argand, CH-2007 Neuchatel (Switzerland)

    2006-03-15

    The European COST action 620 proposed a comprehensive approach to karst groundwater protection, comprising methods of intrinsic and specific vulnerability mapping, validation of vulnerability maps, hazard and risk mapping. This paper presents the first application of all components of this Pan-European Approach to the Sierra de Libar, a karst hydrogeology system in Andalusia, Spain. The intrinsic vulnerability maps take into account the hydrogeological characteristics of the area but are independent from specific contaminant properties. Two specific vulnerability maps were prepared for faecal coliforms and BTEX. These maps take into account the specific properties of these two groups of contaminants and their interaction with the karst hydrogeological system. The vulnerability assessment was validated by means of tracing tests, hydrological, hydrochemical and isotope methods. The hazard map shows the localization of potential contamination sources resulting from human activities, and evaluates those according to their dangerousness. The risk of groundwater contamination depends on the hazards and the vulnerability of the aquifer system. The risk map for the Sierra de Libar was thus created by overlaying the hazard and vulnerability maps.

  2. Mapping quantitative trait loci in a selectively genotyped outbred population using a mixture model approach

    NARCIS (Netherlands)

    Johnson, David L.; Jansen, Ritsert C.; Arendonk, Johan A.M. van

    1999-01-01

    A mixture model approach is employed for the mapping of quantitative trait loci (QTL) for the situation where individuals, in an outbred population, are selectively genotyped. Maximum likelihood estimation of model parameters is obtained from an Expectation-Maximization (EM) algorithm facilitated by
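    As an illustration of the EM machinery the abstract refers to, here is a minimal two-component Gaussian mixture fitted by EM. It is a generic sketch with unit variances, not the authors' QTL-specific likelihood for selectively genotyped outbred populations:

```python
import math

def em_two_gaussians(data, iters=50):
    # Generic EM for a two-component Gaussian mixture (unit variances).
    # Illustrates the E- and M-steps only; the paper's model additionally
    # accounts for selective genotyping.
    mu1, mu2, w = min(data), max(data), 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each observation
        resp = []
        for x in data:
            p1 = w * math.exp(-0.5 * (x - mu1) ** 2)
            p2 = (1.0 - w) * math.exp(-0.5 * (x - mu2) ** 2)
            resp.append(p1 / (p1 + p2))
        # M-step: re-estimate the means and the mixing proportion
        s = sum(resp)
        mu1 = sum(r * x for r, x in zip(resp, data)) / s
        mu2 = sum((1.0 - r) * x for r, x in zip(resp, data)) / (len(data) - s)
        w = s / len(data)
    return mu1, mu2, w

# Two clusters of observations, around 0 and around 5:
print(em_two_gaussians([-0.2, 0.0, 0.1, 4.8, 5.0, 5.2]))
```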

  3. Mapping of multi-floor buildings: A barometric approach

    DEFF Research Database (Denmark)

    Özkil, Ali Gürcan; Fan, Zhun; Xiao, Jizhong

    2011-01-01

    This paper presents a new method for mapping multi-floor buildings. The method combines a laser range sensor for metric mapping with a barometric pressure sensor for detecting floor transitions and segmenting the map. We exploit the fact that the barometric pressure is a function of elevation, and that it varies between different floors. The method is tested with a real robot in a typical indoor environment, and the results show that physically consistent multi-floor representations are achievable.
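    The pressure-to-elevation relationship the method exploits can be sketched with the international barometric formula plus a simple threshold on elevation change. The floor height and the half-floor threshold below are illustrative assumptions, not the paper's parameters:

```python
def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    # International barometric formula (standard atmosphere):
    # maps pressure in hPa to altitude in metres.
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

def detect_floor_transitions(pressures, floor_height=3.0):
    # Segment a pressure log into floor labels: an elevation change of more
    # than half a floor height relative to the current floor's reference
    # altitude is taken as a floor transition (assumed thresholds).
    floors, current = [0], 0
    ref = pressure_to_altitude(pressures[0])
    for p in pressures[1:]:
        alt = pressure_to_altitude(p)
        if abs(alt - ref) > floor_height / 2:
            current += round((alt - ref) / floor_height)
            ref = alt
        floors.append(current)
    return floors

# Near sea level, ~3 m of ascent corresponds to a pressure drop of ~0.36 hPa:
print(detect_floor_transitions([1013.25, 1013.25, 1012.89, 1012.89]))
```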

  4. A Bac Library and Paired-PCR Approach to Mapping and Completing the Genome Sequence of Sulfolobus Solfataricus P2

    DEFF Research Database (Denmark)

    She, Qunxin; Confalonieri, F.; Zivanovic, Y.

    2000-01-01

    The original strategy used in the Sulfolobus solfataricus genome project was to sequence non-overlapping, or minimally overlapping, cosmid or lambda inserts without constructing a physical map. However, after only about two thirds of the genome sequence was completed, this approach became counter-productive because there was a high sequence bias in the cosmid and lambda libraries. Therefore, a new approach was devised for linking the sequenced regions which may be generally applicable. BAC libraries were constructed and terminal sequences of the clones were determined and used for both end mapping and PCR...

  5. The Peak Pairs algorithm for strain mapping from HRTEM images

    Energy Technology Data Exchange (ETDEWEB)

    Galindo, Pedro L. [Departamento de Lenguajes y Sistemas Informaticos, CASEM, Universidad de Cadiz, Pol. Rio San Pedro s/n. 11510, Puerto Real, Cadiz (Spain)], E-mail: pedro.galindo@uca.es; Kret, Slawomir [Institute of Physics, PAS, AL. Lotnikow 32/46, 02-668 Warsaw (Poland); Sanchez, Ana M. [Departamento de Ciencia de los Materiales e Ing. Metalurgica y Q. Inorganica, Facultad de Ciencias, Universidad de Cadiz, Pol. Rio San Pedro s/n. 11510, Puerto Real, Cadiz (Spain); Laval, Jean-Yves [Laboratoire de Physique du Solide, UPR5 CNRS-ESPCI, Paris (France); Yanez, Andres; Pizarro, Joaquin; Guerrero, Elisa [Departamento de Lenguajes y Sistemas Informaticos, CASEM, Universidad de Cadiz, Pol. Rio San Pedro s/n. 11510, Puerto Real, Cadiz (Spain); Ben, Teresa; Molina, Sergio I. [Departamento de Ciencia de los Materiales e Ing. Metalurgica y Q. Inorganica, Facultad de Ciencias, Universidad de Cadiz, Pol. Rio San Pedro s/n. 11510, Puerto Real, Cadiz (Spain)

    2007-11-15

    Strain mapping is defined as a numerical image-processing technique that measures the local shifts of image details around a crystal defect with respect to the ideal, defect-free, positions in the bulk. Algorithms to map elastic strains from high-resolution transmission electron microscopy (HRTEM) images may be classified into two categories: those based on the detection of peaks of intensity in real space and the Geometric Phase approach, calculated in Fourier space. In this paper, we discuss both categories and propose an alternative real space algorithm (Peak Pairs) based on the detection of pairs of intensity maxima in an affine transformed space dependent on the reference area. In spite of the fact that it is a real space approach, the Peak Pairs algorithm exhibits good behaviour at heavily distorted defect cores, e.g. interfaces and dislocations. Quantitative results are reported from experiments to determine local strain in different types of semiconductor heterostructures.
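    The real-space idea behind peak-based strain mapping can be illustrated in one dimension: strain is the relative deviation of measured peak spacings from the ideal, defect-free lattice spacing. This is a didactic simplification of the 2-D Peak Pairs algorithm, not the algorithm itself:

```python
def local_strain(peak_positions, ref_spacing):
    # 1-D simplification of real-space strain mapping: the strain between
    # neighbouring intensity peaks is the relative deviation of their
    # spacing from the ideal, defect-free spacing.
    return [
        (b - a - ref_spacing) / ref_spacing
        for a, b in zip(peak_positions, peak_positions[1:])
    ]

# A lattice with ideal spacing 10 px, locally stretched by 5% at one interval:
print(local_strain([0, 10, 20, 30.5, 40.5], 10.0))
```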

  6. Common Fixed Points of Mappings and Set-Valued Mappings in Symmetric Spaces with Application to Probabilistic Spaces

    OpenAIRE

    M. Aamri; A. Bassou; S. Bennani; D. El Moutawakil

    2007-01-01

    The main purpose of this paper is to give some common fixed point theorems of mappings and set-valued mappings of a symmetric space with some applications to probabilistic spaces. In order to get these results, we define the concept of E-weak compatibility between set-valued and single-valued mappings of a symmetric space.

  7. Defining precision medicine approaches to Autism Spectrum Disorders: concepts and challenges

    Directory of Open Access Journals (Sweden)

    Eva Loth

    2016-11-01

    Full Text Available The tremendous clinical and etiological variability between individuals with Autism Spectrum Disorder (ASD) has made precision medicine the most promising treatment approach. It aims to combine new pathophysiologically based treatments with objective tests (stratification biomarkers) to predict which treatment may be beneficial for a particular person. Here we discuss significant advances and current challenges for this approach: Rare monogenic forms of ASD have provided a major breakthrough for the identification of treatment targets by providing a means to trace causal links from a gene to specific molecular alterations and biological pathways. To estimate whether treatment targets thus identified may be useful for larger patient groups, we need a better understanding of whether different etiologies (i.e., genetic and environmental risk factors acting at different critical time points) lead to convergent or divergent molecular mechanisms, and how they map onto differences in circuit-level brain and cognitive development, and behavioural symptom profiles. Several recently failed clinical trials with syndromic forms of ASD provide valuable insights into conceptual and methodological issues linked to limitations in the translatability from animal models to humans, placebo effects, and a need for mechanistically plausible, objective outcome measures. To identify stratification biomarkers that enrich participant selection in clinical trials, large-scale multi-modal longitudinal observational studies are underway. Addressing these different factors in the next generation of research studies requires a translatable developmental perspective and multidisciplinary, collaborative efforts, with a commitment to sharing protocols and data, to increase transparency and reproducibility.

  8. Detailed debris flow hazard assessment in Andorra: A multidisciplinary approach

    Science.gov (United States)

    Hürlimann, Marcel; Copons, Ramon; Altimir, Joan

    2006-08-01

    In many mountainous areas, the rapid development of urbanisation and the limited space in the valley floors have created a need to construct buildings in zones potentially exposed to debris flow hazard. In these zones, a detailed and coherent hazard assessment is necessary to provide an adequate urban planning. This article presents a multidisciplinary procedure to evaluate the debris flow hazard at a local scale. Our four-step approach was successfully applied to five torrent catchments in the Principality of Andorra, located in the Pyrenees. The first step consisted of a comprehensive geomorphologic and geologic analysis providing an inventory map of the past debris flows, a magnitude-frequency relationship, and a geomorphologic-geologic map. These data were necessary to determine the potential initiation zones and volumes of future debris flows for each catchment. A susceptibility map and different scenarios were the principal outcome of the first step, as well as essential input data for the second step, the runout analysis. A one-dimensional numerical code was applied to analyse the scenarios previously defined. First, the critical channel sections in the fan area were evaluated, then the maximum runout of the debris flows on the fan was studied, and finally simplified intensity maps for each defined scenario were established. The third step of our hazard assessment was the hazard zonation and the compilation of all the results from the two previous steps in a final hazard map. The base of this hazard map was the hazard matrix, which combined the intensity of the debris flow with its probability of occurrence and determined a certain hazard degree. The fourth step referred to the hazard mitigation and included some recommendations for hazard reduction. In Andorra, this four-step approach is actually being applied to assess the debris flow hazard. The final hazard maps, at 1 : 2000 scale, provide an obligatory tool for local land use planning. Experience

  9. Software defined networks a comprehensive approach

    CERN Document Server

    Goransson, Paul

    2014-01-01

    Software Defined Networks discusses the historical networking environment that gave rise to SDN, as well as the latest advances in SDN technology. The book gives you the state of the art knowledge needed for successful deployment of an SDN, including: How to explain to the non-technical business decision makers in your organization the potential benefits, as well as the risks, in shifting parts of a network to the SDN modelHow to make intelligent decisions about when to integrate SDN technologies in a networkHow to decide if your organization should be developing its own SDN applications or

  10. Lunar UV-visible-IR mapping interferometric spectrometer

    Science.gov (United States)

    Smith, W. Hayden; Haskin, L.; Korotev, R.; Arvidson, R.; Mckinnon, W.; Hapke, B.; Larson, S.; Lucey, P.

    1992-01-01

    Ultraviolet-visible-infrared mapping digital array scanned interferometers for lunar compositional surveys were developed. The research has defined a no-moving-parts, low-weight and low-power, high-throughput, and electronically adaptable digital array scanned interferometer that achieves measurement objectives encompassing and improving upon all the requirements defined by the LEXSWIG for lunar mineralogical investigation. In addition, LUMIS provides a new, important ultraviolet spectral mapping capability, a high-spatial-resolution line scan camera, and multispectral camera capabilities. An instrument configuration optimized for spectral mapping and imaging of the lunar surface is described, together with spectral results in support of the instrument design.

  11. Non-invasive multiparametric qBOLD approach for robust mapping of the oxygen extraction fraction.

    Science.gov (United States)

    Domsch, Sebastian; Mie, Moritz B; Wenz, Frederik; Schad, Lothar R

    2014-09-01

    The quantitative blood oxygenation level-dependent (qBOLD) method has not become clinically established yet because long acquisition times are necessary to achieve an acceptable certainty of the parameter estimates. In this work, a non-invasive multiparametric (nimp) qBOLD approach based on a simple analytical model is proposed to facilitate robust oxygen extraction fraction (OEF) mapping within clinically acceptable acquisition times by using separate measurements. The protocol consisted of a gradient-echo sampled spin-echo sequence (GESSE), a T2-weighted Carr-Purcell-Meiboom-Gill (CPMG) sequence, and a T2(*)-weighted multi-slice multi-echo gradient echo (MMGE) sequence. The GESSE acquisition time was less than 5 minutes and the extra measurement time for CPMG/MMGE was below 2 minutes each. The proposed nimp-qBOLD approach was validated in healthy subjects (N = 5) and one patient. The proposed nimp-qBOLD approach facilitated more robust OEF mapping with significantly reduced inter- and intra-subject variability compared to the standard qBOLD method. Thereby, an average OEF in all subjects of 27±2% in white matter (WM) and 29±2% in gray matter (GM) using the nimp-qBOLD method was more stable compared to 41±10% (WM) and 46±10% (GM) with standard qBOLD. Moreover, the spatial variance in the image slice (i.e. standard deviation divided by mean) was on average reduced from 35% to 25%. In addition, the preliminary results of the patient are encouraging. The proposed nimp-qBOLD technique provides a promising tool for robust OEF mapping within clinically acceptable acquisition times and could therefore provide an important contribution for analyzing tumors or monitoring the success of radio and chemo therapies. Copyright © 2014. Published by Elsevier GmbH.
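    A hedged sketch of how an OEF estimate can be recovered from relaxation rates in the static-dephasing qBOLD picture: the reversible rate R2' (= R2* - R2) scales with the product of deoxygenated blood volume (DBV) and OEF. The constants used here (gyromagnetic ratio, susceptibility difference, hematocrit) are literature-typical values, not the parameters of this paper:

```python
import math

def oef_from_relaxation(r2_star, r2, dbv, b0=3.0, hct=0.42, dchi0=0.264e-6):
    # One common static-dephasing qBOLD relation:
    #   R2' = DBV * gamma * (4/3) * pi * dchi0 * Hct * OEF * B0
    # so OEF = R2' / (DBV * k). All constants are assumed literature values.
    gamma = 2.675e8  # proton gyromagnetic ratio, rad/s/T
    r2_prime = r2_star - r2  # reversible relaxation rate, 1/s
    k = gamma * (4.0 / 3.0) * math.pi * dchi0 * hct * b0
    return r2_prime / (dbv * k)

# With R2' = 3.35 1/s and an assumed DBV of 3%, OEF comes out near 30%:
print(oef_from_relaxation(r2_star=20.0, r2=16.65, dbv=0.03))
```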

  12. Non-invasive multiparametric qBOLD approach for robust mapping of the oxygen extraction fraction

    Energy Technology Data Exchange (ETDEWEB)

    Domsch, Sebastian; Mie, Moritz B.; Schad, Lothar R. [Heidelberg Univ., Medical Faculty Mannheim (Germany). Computer Assisted Clinical Medicine; Wenz, Frederik [Heidelberg Univ., Medical Faculty Mannheim (Germany). Dept. of Radiation Oncology

    2014-10-01

    Introduction: The quantitative blood oxygenation level-dependent (qBOLD) method has not become clinically established yet because long acquisition times are necessary to achieve an acceptable certainty of the parameter estimates. In this work, a non-invasive multiparametric (nimp) qBOLD approach based on a simple analytical model is proposed to facilitate robust oxygen extraction fraction (OEF) mapping within clinically acceptable acquisition times by using separate measurements. Methods: The protocol consisted of a gradient-echo sampled spin-echo sequence (GESSE), a T{sub 2}-weighted Carr-Purcell-Meiboom-Gill (CPMG) sequence, and a T{sub 2}{sup *}-weighted multi-slice multi-echo gradient echo (MMGE) sequence. The GESSE acquisition time was less than 5 minutes and the extra measurement time for CPMG / MMGE was below 2 minutes each. The proposed nimp-qBOLD approach was validated in healthy subjects (N = 5) and one patient. Results: The proposed nimp-qBOLD approach facilitated more robust OEF mapping with significantly reduced inter- and intra-subject variability compared to the standard qBOLD method. Thereby, an average OEF in all subjects of 27 ± 2 % in white matter (WM) and 29 ± 2 % in gray matter (GM) using the nimp-qBOLD method was more stable compared to 41 ± 10 % (WM) and 46 ± 10 % (GM) with standard qBOLD. Moreover, the spatial variance in the image slice (i.e. standard deviation divided by mean) was on average reduced from 35 % to 25 %. In addition, the preliminary results of the patient are encouraging. Conclusion: The proposed nimp-qBOLD technique provides a promising tool for robust OEF mapping within clinically acceptable acquisition times and could therefore provide an important contribution for analyzing tumors or monitoring the success of radio and chemo therapies. (orig.)

  13. An OO visual language definition approach supporting multiple views

    OpenAIRE

    Akehurst, David H.; I.E.E.E. Computer Society

    2000-01-01

    The formal approach to visual language definition is to use graph grammars and/or graph transformation techniques. These techniques focus on specifying the syntax and manipulation rules of the concrete representation. This paper presents a constraint- and object-oriented approach to defining visual languages that uses UML and OCL as a definition language. Visual language definitions specify a mapping between concrete and abstract models of possible visual sentences, which can subsequently be ...

  14. A trait based approach to defining valued mentoring qualities

    Science.gov (United States)

    Pendall, E.

    2012-12-01

    Graduate training in the sciences requires strong personal interactions among faculty, senior lab members and more junior members. Within the lab-group setting we learn to frame problems, to conduct research and to communicate findings. The result is that individual scientists are partly shaped by a few influential mentors. We have all been influenced by special relationships with mentors, and on reflection we may find that certain qualities have been especially influential in our career choices. In this presentation I will discuss favorable mentoring traits as determined from an informal survey of scientists in varying stages of careers and from diverse backgrounds. Respondents addressed questions about traits they value in their mentors in several categories: 1) personal qualities such as approachability, humor and encouragement; background including gender, ethnicity, and family status; 2) scientific qualities including discipline or specialization, perceived stature in discipline, seniority, breadth of perspective, and level of expectations; and 3) community-oriented qualities promoted by mentors, such as encouraging service contributions and peer-mentoring within the lab group. The results will be compared among respondents by gender, ethnicity, stage of career, type of work, and subdiscipline within the broadly defined Biogeoscience community. We hope to contribute to the growing discussion on building a diverse and balanced scientific workforce.

  15. A Remote Sensing Approach for Regional-Scale Mapping of Agricultural Land-Use Systems Based on NDVI Time Series

    Directory of Open Access Journals (Sweden)

    Beatriz Bellón

    2017-06-01

    Full Text Available In response to the need for generic remote sensing tools to support large-scale agricultural monitoring, we present a new approach for regional-scale mapping of agricultural land-use systems (ALUS) based on object-based Normalized Difference Vegetation Index (NDVI) time series analysis. The approach consists of two main steps. First, to obtain relatively homogeneous land units in terms of phenological patterns, a principal component analysis (PCA) is applied to an annual MODIS NDVI time series, and an automatic segmentation is performed on the resulting high-order principal component images. Second, the resulting land units are classified into the crop agriculture domain or the livestock domain based on their land-cover characteristics. The crop agriculture domain land units are further classified into different cropping systems based on the correspondence of their NDVI temporal profiles with the phenological patterns associated with the cropping systems of the study area. A map of the main ALUS of the Brazilian state of Tocantins was produced for the 2013–2014 growing season with the new approach, and a significant coherence was observed between the spatial distribution of the cropping systems in the final ALUS map and in a reference map extracted from the official agricultural statistics of the Brazilian Institute of Geography and Statistics (IBGE). This study shows the potential of remote sensing techniques to provide valuable baseline spatial information for supporting agricultural monitoring and for large-scale land-use systems analysis.
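    The second classification step, matching each land unit's NDVI temporal profile against reference phenological patterns, can be sketched with a simple correlation-based matcher. The reference profiles and class names below are invented for illustration; they are not the cropping systems of the study area:

```python
def pearson(a, b):
    # Pearson correlation between two equal-length sequences.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def classify_unit(ndvi_profile, reference_profiles):
    # Assign a land unit to the cropping system whose reference NDVI
    # phenological pattern correlates best with the unit's profile.
    return max(reference_profiles,
               key=lambda name: pearson(ndvi_profile, reference_profiles[name]))

# Hypothetical reference patterns: one growth peak vs. two peaks per season.
refs = {
    "single_crop": [0.2, 0.3, 0.6, 0.8, 0.5, 0.2],
    "double_crop": [0.2, 0.7, 0.3, 0.7, 0.3, 0.2],
}
print(classify_unit([0.25, 0.35, 0.65, 0.75, 0.45, 0.2], refs))
```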

  16. Identification of probabilistic approaches and map-based navigation ...

    Indian Academy of Sciences (India)

    B Madhevan

    2018-02-07

    Feb 7, 2018 ... consists of three processes: map learning (ML), localization and PP [73–76]. (i) ML ... [83] Thrun S 2001 A probabilistic online mapping algorithm for teams of ... for target tracking using fuzzy logic controller in game theoretic ...

  17. Exploiting Surroundedness for Saliency Detection: A Boolean Map Approach.

    Science.gov (United States)

    Zhang, Jianming; Sclaroff, Stan

    2016-05-01

    We demonstrate the usefulness of surroundedness for eye fixation prediction by proposing a Boolean Map based Saliency model (BMS). In our formulation, an image is characterized by a set of binary images, which are generated by randomly thresholding the image's feature maps in a whitened feature space. Based on a Gestalt principle of figure-ground segregation, BMS computes a saliency map by discovering surrounded regions via topological analysis of Boolean maps. Furthermore, we draw a connection between BMS and the Minimum Barrier Distance to provide insight into why and how BMS can properly capture the surroundedness cue via Boolean maps. The strength of BMS is verified by its simplicity, efficiency and superior performance compared with 10 state-of-the-art methods on seven eye tracking benchmark datasets.
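    The surroundedness test at the core of BMS can be sketched for a single Boolean map: regions of 1s connected to the image border are discarded, and interior (surrounded) regions are kept. Full BMS averages such maps over many random thresholds of whitened feature maps; this sketch shows only the topological step:

```python
def attention_map(bmap):
    # BMS-style surroundedness on one Boolean map: flood-fill from the
    # border erases every 1-region touching it, so only surrounded
    # (figure-on-ground) regions survive.
    h, w = len(bmap), len(bmap[0])
    keep = [list(row) for row in bmap]  # work on a copy
    # Seed the flood fill with every 1-cell on the border.
    stack = [(i, j) for i in range(h) for j in range(w)
             if (i in (0, h - 1) or j in (0, w - 1)) and bmap[i][j]]
    while stack:
        i, j = stack.pop()
        if 0 <= i < h and 0 <= j < w and keep[i][j]:
            keep[i][j] = 0
            stack.extend([(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)])
    return keep

# The top-left blob touches the border (erased); the centre blob is surrounded (kept).
bmap = [[1, 1, 0, 0, 0],
        [1, 0, 0, 1, 0],
        [0, 0, 0, 1, 0],
        [0, 0, 0, 0, 0]]
print(attention_map(bmap))
```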

  18. A Concept Map of Curiosity Literature

    OpenAIRE

    Bai, Zhen

    2018-01-01

    Curiosity is a commonly studied topic in psychology. I produced the following mind map to categorize and understand key contributions to curiosity literature, to inform the design of technology-enhanced learning technologies to evoke curiosity that we are presently undertaking. Just as the mind map categorizes the literature, the literature defines the shape and nature of the mind map presented herein.

  19. Characterizing semantic mappings adaptation via biomedical KOS evolution: a case study investigating SNOMED CT and ICD.

    Science.gov (United States)

    Dos Reis, Julio Cesar; Pruski, Cédric; Da Silveira, Marcos; Reynaud-Delaître, Chantal

    2013-01-01

    Mappings established between Knowledge Organization Systems (KOS) increase semantic interoperability between biomedical information systems. However, biomedical knowledge is highly dynamic and changes affecting KOS entities can potentially invalidate part or the totality of existing mappings. Understanding how mappings evolve and what the impacts of KOS evolution on mappings are is therefore crucial for the definition of an automatic approach to maintain mappings valid and up-to-date over time. In this article, we study variations of a specific KOS complex change (split) for two biomedical KOS (SNOMED CT and ICD-9-CM) through a rigorous method of investigation for identifying and refining complex changes, and for selecting representative cases. We empirically analyze and explain their influence on the evolution of associated mappings. Results point out the importance of considering various dimensions of the information described in KOS, like the semantic structure of concepts, the set of relevant information used to define the mappings and the change operations interfering with this set of information.

  20. Genetic fine mapping and genomic annotation defines causal mechanisms at type 2 diabetes susceptibility loci

    DEFF Research Database (Denmark)

    Gaulton, Kyle J; Ferreira, Teresa; Lee, Yeji

    2015-01-01

    We performed fine mapping of 39 established type 2 diabetes (T2D) loci in 27,206 cases and 57,574 controls of European ancestry. We identified 49 distinct association signals at these loci, including five mapping in or near KCNQ1. 'Credible sets' of the variants most likely to drive each distinct signal mapped predominantly to noncoding sequence, implying that association with T2D is mediated through gene regulation. Credible set variants were enriched for overlap with FOXA2 chromatin immunoprecipitation binding sites in human islet and liver cells, including at MTNR1B, where fine mapping implicated rs10830963 as driving T2D association. We confirmed that the T2D risk allele for this SNP increases FOXA2-bound enhancer activity in islet- and liver-derived cells. We observed allele-specific differences in NEUROD1 binding in islet-derived cells, consistent with evidence that the T2D risk allele...
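    The 'credible set' construction used in fine mapping can be sketched as follows, assuming per-variant posterior probabilities of driving the association signal are already available: rank variants by posterior and take the smallest set reaching the target probability mass. The variant names in the example are hypothetical:

```python
def credible_set(posteriors, mass=0.95):
    # Rank variants by posterior probability of being causal, then take the
    # smallest set whose cumulative posterior reaches the target mass.
    ranked = sorted(posteriors.items(), key=lambda kv: kv[1], reverse=True)
    chosen, total = [], 0.0
    for variant, p in ranked:
        chosen.append(variant)
        total += p
        if total >= mass:
            break
    return chosen

# Hypothetical posteriors for one association signal:
print(credible_set({"rs1": 0.6, "rs2": 0.3, "rs3": 0.08, "rs4": 0.02}))
```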

  1. Defining asthma in genetic studies

    NARCIS (Netherlands)

    Koppelman, GH; Postma, DS; Meijer, G.

    1999-01-01

    Genetic studies have been hampered by the lack of a gold standard to diagnose asthma. The complex nature of asthma makes it more difficult to identify asthma genes. Therefore, approaches to define phenotypes, which have been successful in other genetically complex diseases, may be applied to define

  2. New approaches to high-resolution mapping of marine vertical structures.

    Science.gov (United States)

    Robert, Katleen; Huvenne, Veerle A I; Georgiopoulou, Aggeliki; Jones, Daniel O B; Marsh, Leigh; D O Carter, Gareth; Chaumillon, Leo

    2017-08-21

    Vertical walls in marine environments can harbour high biodiversity and provide natural protection from bottom-trawling activities. However, traditional mapping techniques are usually restricted to down-looking approaches which cannot adequately replicate their 3D structure. We combined sideways-looking multibeam echosounder (MBES) data from an AUV, forward-looking MBES data from ROVs and ROV-acquired videos to examine walls from Rockall Bank and Whittard Canyon, Northeast Atlantic. High-resolution 3D point clouds were extracted from each sonar dataset and structure from motion photogrammetry (SfM) was applied to recreate 3D representations of video transects along the walls. With these reconstructions, it was possible to interact with extensive sections of video footage and precisely position individuals. Terrain variables were derived on scales comparable to those experienced by megabenthic individuals. These were used to show differences in environmental conditions between observed and background locations as well as explain spatial patterns in ecological characteristics. In addition, since the SfM 3D reconstructions retained colours, they were employed to separate and quantify live coral colonies versus dead framework. The combination of these new technologies allows us, for the first time, to map the physical 3D structure of previously inaccessible habitats and demonstrates the complexity and importance of vertical structures.

  3. Snapshots for Semantic Maps

    National Research Council Canada - National Science Library

    Nielsen, Curtis W; Ricks, Bob; Goodrich, Michael A; Bruemmer, David; Few, Doug; Walton, Miles

    2004-01-01

    Semantic maps are a relatively new approach to information presentation. Semantic maps provide more detail about an environment than typical maps because they are augmented by icons or symbols that provide meaning for places or objects of interest...

  4. Satellite SAR interferometric techniques applied to emergency mapping

    Science.gov (United States)

    Stefanova Vassileva, Magdalena; Riccardi, Paolo; Lecci, Daniele; Giulio Tonolo, Fabio; Boccardo Boccardo, Piero; Chiesa, Giuliana; Angeluccetti, Irene

    2017-04-01

    This paper aims to investigate the capabilities of the currently available SAR interferometric algorithms in the field of emergency mapping. Several tests have been performed exploiting the Copernicus Sentinel-1 data using the COTS software ENVI/SARscape 5.3. Emergency Mapping can be defined as "creation of maps, geo-information products and spatial analyses dedicated to providing situational awareness emergency management and immediate crisis information for response by means of extraction of reference (pre-event) and crisis (post-event) geographic information/data from satellite or aerial imagery". The conventional differential SAR interferometric technique (DInSAR) and the two currently available multi-temporal SAR interferometric approaches, i.e. Permanent Scatterer Interferometry (PSI) and Small BAseline Subset (SBAS), have been applied to provide crisis information useful for the emergency management activities. Depending on the considered Emergency Management phase, a distinction can be made between rapid mapping, i.e. fast provision of geospatial data regarding the area affected for the immediate emergency response, and monitoring mapping, i.e. detection of phenomena for risk prevention and mitigation activities. In order to evaluate the potential and limitations of the aforementioned SAR interferometric approaches for the specific rapid and monitoring mapping application, five main factors have been taken into account: crisis information extracted, input data required, processing time and expected accuracy. The results highlight that DInSAR has the capacity to delineate areas affected by large and sudden deformations and fulfills most of the immediate response requirements. The main limiting factor of interferometry is the availability of suitable SAR acquisition immediately after the event (e.g. Sentinel-1 mission characterized by 6-day revisiting time may not always satisfy the immediate emergency request). PSI and SBAS techniques are suitable to produce
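    The core DInSAR relation behind such deformation mapping converts an interferometric phase change into line-of-sight displacement: one full 2*pi fringe corresponds to half a wavelength of motion, because of the two-way radar path. A minimal sketch, assuming the Sentinel-1 C-band wavelength of about 5.55 cm (sign conventions vary between processors):

```python
import math

def los_displacement_cm(dphase_rad, wavelength_cm=5.55):
    # DInSAR phase-to-displacement conversion: d = dphi * lambda / (4*pi),
    # so a full 2*pi fringe equals lambda/2 of line-of-sight displacement.
    # Wavelength defaults to Sentinel-1 C-band (~5.55 cm, assumed value).
    return dphase_rad * wavelength_cm / (4.0 * math.pi)

# One full fringe corresponds to ~2.775 cm of line-of-sight displacement:
print(los_displacement_cm(2 * math.pi))
```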

  5. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    Science.gov (United States)

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluate its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map.
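    The travelling-salesman formulation of deletion mapping can be illustrated by ordering markers so that neighbours have similar presence/absence patterns across deletion mutants: physically adjacent markers tend to be lost together. The nearest-neighbour heuristic below is a stand-in for the exact solver the authors used, and the marker names and patterns are hypothetical:

```python
def hamming(a, b):
    # Number of deletion mutants in which the two markers disagree.
    return sum(x != y for x, y in zip(a, b))

def order_markers(patterns):
    # Deletion-mapping intuition: a good marker order is a shortest path
    # through presence/absence pattern space (a travelling-salesman
    # instance). Here: greedy nearest-neighbour from the first marker.
    names = list(patterns)
    order = [names.pop(0)]
    while names:
        nxt = min(names, key=lambda n: hamming(patterns[order[-1]], patterns[n]))
        names.remove(nxt)
        order.append(nxt)
    return order

# Each tuple: marker presence (1) or absence (0) in four deletion mutants.
patterns = {"A": (1, 1, 1, 0), "C": (0, 0, 1, 1), "B": (1, 1, 0, 0)}
print(order_markers(patterns))
```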

  6. The constellation of dietary factors in adolescent acne: a semantic connectivity map approach.

    Science.gov (United States)

    Grossi, E; Cazzaniga, S; Crotti, S; Naldi, L; Di Landro, A; Ingordo, V; Cusano, F; Atzori, L; Tripodi Cutrì, F; Musumeci, M L; Pezzarossa, E; Bettoli, V; Caproni, M; Bonci, A

    2016-01-01

    Different lifestyle and dietetic factors have been linked with the onset and severity of acne. Our objective was to assess the complex interconnection between dietetic variables and acne. This was a reanalysis of data from a case-control study by using a semantic connectivity map approach. 563 subjects, aged 10-24 years, involved in a case-control study of acne between March 2009 and February 2010, were considered in this study. The analysis evaluated the link between a moderate to severe acne and anthropometric variables, family history and dietetic factors. Analyses were conducted by relying on an artificial adaptive system, the Auto Semantic Connectivity Map (AutoCM). The AutoCM map showed that moderate-severe acne was closely associated with family history of acne in first degree relatives, obesity (BMI ≥ 30), and high consumption of milk, in particular skim milk, cheese/yogurt, sweets/cakes, chocolate, and a low consumption of fish, and limited intake of fruits/vegetables. Our analyses confirm the link between several dietetic items and acne. When providing care, dermatologists should also be aware of the complex interconnection between dietetic factors and acne. © 2014 European Academy of Dermatology and Venereology.

  7. Concept Mapping: An "Instagram" of Students' Thinking

    Science.gov (United States)

    Campbell, Laurie O.

    2016-01-01

    Minimal research has been accumulated in the field of social studies education for Novakian concept mapping, yet there are many benefits from adding this learning tool to a teacher's instructional toolbox. The article defines Novakian concept mapping and invites readers to adopt digital Novakian concept mapping into the social studies classroom as…

  8. Mapping the World's Intact Forest Landscapes by Remote Sensing

    Directory of Open Access Journals (Sweden)

    Peter Potapov

    2008-12-01

    Full Text Available Protection of large natural forest landscapes is a highly important task to help fulfill different international strategic initiatives to protect forest biodiversity, to reduce carbon emissions from deforestation and forest degradation, and to stimulate sustainable forest management practices. This paper introduces a new approach for mapping large intact forest landscapes (IFL), defined as an unbroken expanse of natural ecosystems within areas of current forest extent, without signs of significant human activity, and having an area of at least 500 km2. We have created a global IFL map using existing fine-scale maps and a global coverage of high spatial resolution satellite imagery. We estimate the global area of IFL within the current extent of forest ecosystems (forest zone) to be 13.1 million km2, or 23.5% of the forest zone. The vast majority of IFL are found in two biomes: Dense Tropical and Subtropical Forests (45.3%) and Boreal Forests (43.8%). The lowest proportion of IFL is found in Temperate Broadleaf and Mixed Forests. The IFL exist in 66 of the 149 countries that together make up the forest zone. Three of them - Canada, Russia, and Brazil - contain 63.8% of the total IFL area. Of the world's IFL area, 18.9% has some form of protection, but only 9.7% is strictly protected, i.e., belongs to IUCN protected areas categories I-III. The world IFL map presented here is intended to underpin the development of a general strategy for nature conservation at the global and regional scales. It also defines a baseline for monitoring deforestation and forest degradation that is well suited for use with operational and cost-effective satellite data. All project results and IFL maps are available on a dedicated web site (http://www.intactforests.org).

  9. SiSeRHMap v1.0: a simulator for mapped seismic response using a hybrid model

    Science.gov (United States)

    Grelle, Gerardo; Bonito, Laura; Lampasi, Alessandro; Revellino, Paola; Guerriero, Luigi; Sappa, Giuseppe; Guadagno, Francesco Maria

    2016-04-01

    The SiSeRHMap (simulator for mapped seismic response using a hybrid model) is a computerized methodology capable of elaborating prediction maps of seismic response in terms of acceleration spectra. It was realized on the basis of a hybrid model which combines different approaches and models in a new and non-conventional way. These approaches and models are organized in a code architecture composed of five interdependent modules. A GIS (geographic information system) cubic model (GCM), which is a layered computational structure based on the concept of lithodynamic units and zones, aims at reproducing a parameterized layered subsoil model. A meta-modelling process confers a hybrid nature on the methodology. In this process, the one-dimensional (1-D) linear equivalent analysis produces acceleration response spectra for a specified number of site profiles using one or more input motions. The shear wave velocity-thickness profiles, defined as trainers, are randomly selected in each zone. Subsequently, a numerical adaptive simulation model (Emul-spectra) is optimized on the above trainer acceleration response spectra by means of a dedicated evolutionary algorithm (EA) and the Levenberg-Marquardt algorithm (LMA) as the final optimizer. In the final step, the GCM maps executor module produces a serial map set of the stratigraphic seismic response at different periods by grid-solving the calibrated Emul-spectra model. In addition, the spectral topographic amplification is also computed by means of a 3-D validated numerical prediction model. This model is built to match the results of numerical simulations related to isolated reliefs using GIS morphometric data. In this way, different sets of seismic response maps are developed, on which maps of design acceleration response spectra are also defined by means of an enveloping technique.

  10. Torus Breakdown in Noninvertible Maps

    DEFF Research Database (Denmark)

    Maistrenko, V.; Maistrenko, Yu.; Mosekilde, Erik

    2003-01-01

    We propose a criterion for the destruction of a two-dimensional torus through the formation of an infinite set of cusp points on the closed invariant curves defining the resonance torus. This mechanism is specific to noninvertible maps. The cusp points arise when the tangent to the torus at the point of intersection with the critical curve L0 coincides with the eigendirection corresponding to the vanishing eigenvalue of the noninvertible map. Further parameter changes typically lead to the generation of loops (self-intersections of the invariant manifolds) followed by the transformation…
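The criterion above hinges on the critical curve L0, the locus where the Jacobian determinant of a noninvertible map vanishes. The following sketch is purely illustrative (the quadratic map T and all values are invented, not taken from the paper): it locates a point of L0 along a horizontal segment by bisection on the analytic Jacobian determinant.

```python
# Sketch: locating the critical curve L0 (where det J = 0) for a
# hypothetical noninvertible map T(x, y) = (x^2 + y, y^2 + x).
# The map and all parameter values are illustrative, not from the paper.

def T(x, y):
    return x * x + y, y * y + x

def jac_det(x, y):
    # J = [[2x, 1], [1, 2y]] -> det = 4xy - 1; L0 is the set det = 0.
    return 4.0 * x * y - 1.0

def find_L0_on_segment(y, x_lo, x_hi, tol=1e-12):
    """Bisect for the x where (x, y) lies on L0, given a sign change."""
    f_lo = jac_det(x_lo, y)
    for _ in range(200):
        x_mid = 0.5 * (x_lo + x_hi)
        f_mid = jac_det(x_mid, y)
        if abs(f_mid) < tol:
            return x_mid
        if f_lo * f_mid < 0:
            x_hi = x_mid
        else:
            x_lo, f_lo = x_mid, f_mid
    return 0.5 * (x_lo + x_hi)

x0 = find_L0_on_segment(y=0.5, x_lo=0.0, x_hi=2.0)
print(x0)  # on L0 at y = 0.5: 4*x*0.5 - 1 = 0, so x = 0.5
```

In a fuller analysis one would compare, at each intersection of the torus with L0, the torus tangent against the kernel direction of J; the sketch only shows the det J = 0 bookkeeping.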

  11. Employing Deceptive Dynamic Network Topology Through Software-Defined Networking

    Science.gov (United States)

    2014-03-01

    Full-text excerpts (front matter and acronym-list residue omitted): latency measurements such as a PING to 172.20.5.2 can be extremely useful in topology mapping through various latency-based geolocation methods [35], [36], [37]; …defined northbound Application Programming Interfaces (APIs); Figure 3.1: Software-Defined Network Architecture (from [8]); 3.3 SDN OpenFlow.

  12. Genetic fine mapping and genomic annotation defines causal mechanisms at type 2 diabetes susceptibility loci

    NARCIS (Netherlands)

    K. Gaulton (Kyle); T. Ferreira (Teresa); Y. Lee (Yeji); A. Raimondo (Anne); R. Mägi (Reedik); M.E. Reschen (Michael E.); A. Mahajan (Anubha); A. Locke (Adam); N.W. Rayner (Nigel William); N.R. Robertson (Neil); R.A. Scott (Robert); I. Prokopenko (Inga); L.J. Scott (Laura); T. Green (Todd); T. Sparsø (Thomas); D. Thuillier (Dorothee); L. Yengo (Loic); H. Grallert (Harald); S. Wahl (Simone); M. Frånberg (Mattias); R.J. Strawbridge (Rona); H. Kestler (Hans); H. Chheda (Himanshu); L. Eisele (Lewin); S. Gustafsson (Stefan); V. Steinthorsdottir (Valgerdur); G. Thorleifsson (Gudmar); L. Qi (Lu); L.C. Karssen (Lennart); E.M. van Leeuwen (Elisa); S.M. Willems (Sara); M. Li (Man); H. Chen (Han); C. Fuchsberger (Christian); P. Kwan (Phoenix); C. Ma (Clement); M. Linderman (Michael); Y. Lu (Yingchang); S.K. Thomsen (Soren K.); J.K. Rundle (Jana K.); N.L. Beer (Nicola L.); M. van de Bunt (Martijn); A. Chalisey (Anil); H.M. Kang (Hyun Min); B.F. Voight (Benjamin); G.R. Abecasis (Gonçalo); P. Almgren (Peter); D. Baldassarre (Damiano); B. Balkau (Beverley); R. Benediktsson (Rafn); M. Blüher (Matthias); H. Boeing (Heiner); L.L. Bonnycastle (Lori); E.P. Bottinger (Erwin P.); N.P. Burtt (Noël); J. Carey (Jason); G. Charpentier (Guillaume); P.S. Chines (Peter); M. Cornelis (Marilyn); D.J. Couper (David J.); A. Crenshaw (Andrew); R.M. van Dam (Rob); A.S.F. Doney (Alex); M. Dorkhan (Mozhgan); T. Edkins (Ted); J.G. Eriksson (Johan G.); T. Esko (Tõnu); E. Eury (Elodie); J. Fadista (João); J. Flannick (Jason); P. Fontanillas (Pierre); C.S. Fox (Caroline); P.W. Franks (Paul W.); K. Gertow (Karl); C. Gieger (Christian); B. Gigante (Bruna); R.F. Gottesman (Rebecca); G.B. Grant (George); N. Grarup (Niels); C.J. Groves (Christopher J.); M. Hassinen (Maija); C.T. Have (Christian T.); C. Herder (Christian); O.L. Holmen (Oddgeir); A.B. Hreidarsson (Astradur); S.E. Humphries (Steve E.); D.J. Hunter (David J.); A.U. Jackson (Anne); A. Jonsson (Anna); M.E. Jørgensen (Marit E.); T. 
Jørgensen (Torben); W.H.L. Kao (Wen); N.D. Kerrison (Nicola D.); L. Kinnunen (Leena); N. Klopp (Norman); A. Kong (Augustine); P. Kovacs (Peter); P. Kraft (Peter); J. Kravic (Jasmina); C. Langford (Cordelia); K. Leander (Karin); L. Liang (Liming); P. Lichtner (Peter); C.M. Lindgren (Cecilia M.); B. Lindholm (Bengt); A. Linneberg (Allan); C.-T. Liu (Ching-Ti); S. Lobbens (Stéphane); J. Luan (Jian'fan); V. Lyssenko (Valeriya); S. Männistö (Satu); O. McLeod (Olga); J. Meyer (Jobst); E. Mihailov (Evelin); G. Mirza (Ghazala); T.W. Mühleisen (Thomas); M. Müller-Nurasyid (Martina); C. Navarro (Carmen); M.M. Nöthen (Markus); N.N. Oskolkov (Nikolay N.); K.R. Owen (Katharine); D. Palli (Domenico); S. Pechlivanis (Sonali); L. Peltonen (Leena Johanna); J.R.B. Perry (John); C.P. Platou (Carl); M. Roden (Michael); D. Ruderfer (Douglas); D. Rybin (Denis); Y.T. Van Der Schouw (Yvonne T.); B. Sennblad (Bengt); G. Sigurosson (Gunnar); A. Stancáková (Alena); D. Steinbach; P. Storm (Petter); K. Strauch (Konstantin); H.M. Stringham (Heather); Q. Sun; B. Thorand (Barbara); E. Tikkanen (Emmi); A. Tönjes (Anke); J. Trakalo (Joseph); E. Tremoli (Elena); T. Tuomi (Tiinamaija); R. Wennauer (Roman); S. Wiltshire (Steven); A.R. Wood (Andrew); E. Zeggini (Eleftheria); I. Dunham (Ian); E. Birney (Ewan); L. Pasquali (Lorenzo); J. Ferrer (Jorge); R.J.F. Loos (Ruth); J. Dupuis (Josée); J.C. Florez (Jose); E.A. Boerwinkle (Eric); J.S. Pankow (James); C.M. van Duijn (Cornelia); E.J.G. Sijbrands (Eric); J.B. Meigs (James B.); F.B. Hu (Frank B.); U. Thorsteinsdottir (Unnur); J-A. Zwart (John-Anker); T.A. Lakka (Timo); R. Rauramaa (Rainer); M. Stumvoll (Michael); N.L. Pedersen (Nancy L.); L. Lind (Lars); S. Keinanen-Kiukaanniemi (Sirkka); E. Korpi-Hyövälti (Eeva); T. Saaristo (Timo); J. Saltevo (Juha); J. Kuusisto (Johanna); M. Laakso (Markku); A. Metspalu (Andres); R. Erbel (Raimund); K.-H. Jöckel (Karl-Heinz); S. Moebus (Susanne); S. Ripatti (Samuli); V. Salomaa (Veikko); E. Ingelsson (Erik); B.O. 
Boehm (Bernhard); R.N. Bergman (Richard N.); F.S. Collins (Francis S.); K.L. Mohlke (Karen L.); H. Koistinen (Heikki); J. Tuomilehto (Jaakko); K. Hveem (Kristian); I. Njølstad (Inger); P. Deloukas (Panagiotis); P.J. Donnelly (Peter J.); T.M. Frayling (Timothy); A.T. Hattersley (Andrew); U. de Faire (Ulf); A. Hamsten (Anders); T. Illig (Thomas); A. Peters (Annette); S. Cauchi (Stephane); R. Sladek (Rob); P. Froguel (Philippe); T. Hansen (Torben); O. Pedersen (Oluf); A.D. Morris (Andrew); C.N.A. Palmer (Collin N. A.); S. Kathiresan (Sekar); O. Melander (Olle); P.M. Nilsson (Peter M.); L. Groop (Leif); I.E. Barroso (Inês); C. Langenberg (Claudia); N.J. Wareham (Nicholas J.); C.A. O'Callaghan (Christopher A.); A.L. Gloyn (Anna); D. Altshuler (David); M. Boehnke (Michael); T.M. Teslovich (Tanya M.); M.I. McCarthy (Mark); A.P. Morris (Andrew)

    2015-01-01

    We performed fine mapping of 39 established type 2 diabetes (T2D) loci in 27,206 cases and 57,574 controls of European ancestry. We identified 49 distinct association signals at these loci, including five mapping in or near KCNQ1. 'Credible sets' of the variants most likely to drive each

  13. Genetic fine mapping and genomic annotation defines causal mechanisms at type 2 diabetes susceptibility loci

    NARCIS (Netherlands)

    Gaulton, Kyle J; Ferreira, Teresa; Lee, Yeji; Raimondo, Anne; Mägi, Reedik; Reschen, Michael E; Mahajan, Anubha; Locke, Adam; William Rayner, N; Robertson, Neil; Scott, Robert A; Prokopenko, Inga; Scott, Laura J; Green, Todd; Sparso, Thomas; Thuillier, Dorothee; Yengo, Loic; Grallert, Harald; Wahl, Simone; Frånberg, Mattias; Strawbridge, Rona J; Kestler, Hans; Chheda, Himanshu; Eisele, Lewin; Gustafsson, Stefan; Steinthorsdottir, Valgerdur; Thorleifsson, Gudmar; Qi, Lu; Karssen, Lennart C; van Leeuwen, Elisabeth M; Willems, Sara M; Li, Man; Chen, Han; Fuchsberger, Christian; Kwan, Phoenix; Ma, Clement; Linderman, Michael; Lu, Yingchang; Thomsen, Soren K; Rundle, Jana K; Beer, Nicola L; van de Bunt, Martijn; Chalisey, Anil; Kang, Hyun Min; Voight, Benjamin F; Abecasis, Gonçalo R; Almgren, Peter; Baldassarre, Damiano; Balkau, Beverley; Benediktsson, Rafn; Blüher, Matthias; Boeing, Heiner; Bonnycastle, Lori L; Bottinger, Erwin P; Burtt, Noël P; Carey, Jason; Charpentier, Guillaume; Chines, Peter S; Cornelis, Marilyn C; Couper, David J; Crenshaw, Andrew T; van Dam, Rob M; Doney, Alex S F; Dorkhan, Mozhgan; Edkins, Sarah; Eriksson, Johan G; Esko, Tonu; Eury, Elodie; Fadista, João; Flannick, Jason; Fontanillas, Pierre; Fox, Caroline; Franks, Paul W; Gertow, Karl; Gieger, Christian; Gigante, Bruna; Gottesman, Omri; Grant, George B; Grarup, Niels; Groves, Christopher J; Hassinen, Maija; Have, Christian T; Herder, Christian; Holmen, Oddgeir L; Hreidarsson, Astradur B; Humphries, Steve E; Hunter, David J; Jackson, Anne U; Jonsson, Anna; Jørgensen, Marit E; Jørgensen, Torben; Kao, Wen-Hong L; Kerrison, Nicola D; Kinnunen, Leena; Klopp, Norman; Kong, Augustine; Kovacs, Peter; Kraft, Peter; Kravic, Jasmina; Langford, Cordelia; Leander, Karin; Liang, Liming; Lichtner, Peter; Lindgren, Cecilia M; Lindholm, Eero; Linneberg, Allan; Liu, Ching-Ti; Lobbens, Stéphane; Luan, Jian'an; Lyssenko, Valeriya; Männistö, Satu; McLeod, Olga; Meyer, Julia; Mihailov, Evelin; Mirza, Ghazala; 
Mühleisen, Thomas W; Müller-Nurasyid, Martina; Navarro, Carmen; Nöthen, Markus M; Oskolkov, Nikolay N; Owen, Katharine R; Palli, Domenico; Pechlivanis, Sonali; Peltonen, Leena; Perry, John R B; Platou, Carl G P; Roden, Michael; Ruderfer, Douglas; Rybin, Denis; van der Schouw, Yvonne T; Sennblad, Bengt; Sigurðsson, Gunnar; Stančáková, Alena; Steinbach, Gerald; Storm, Petter; Strauch, Konstantin; Stringham, Heather M; Sun, Qi; Thorand, Barbara; Tikkanen, Emmi; Tonjes, Anke; Trakalo, Joseph; Tremoli, Elena; Tuomi, Tiinamaija; Wennauer, Roman; Wiltshire, Steven; Wood, Andrew R; Zeggini, Eleftheria; Dunham, Ian; Birney, Ewan; Pasquali, Lorenzo; Ferrer, Jorge; Loos, Ruth J F; Dupuis, Josée; Florez, Jose C; Boerwinkle, Eric; Pankow, James S; van Duijn, Cornelia; Sijbrands, Eric; Meigs, James B; Hu, Frank B; Thorsteinsdottir, Unnur; Stefansson, Kari; Lakka, Timo A; Rauramaa, Rainer; Stumvoll, Michael; Pedersen, Nancy L; Lind, Lars; Keinanen-Kiukaanniemi, Sirkka M; Korpi-Hyövälti, Eeva; Saaristo, Timo E; Saltevo, Juha; Kuusisto, Johanna; Laakso, Markku; Metspalu, Andres; Erbel, Raimund; Jöcke, Karl-Heinz; Moebus, Susanne; Ripatti, Samuli; Salomaa, Veikko; Ingelsson, Erik; Boehm, Bernhard O; Bergman, Richard N; Collins, Francis S; Mohlke, Karen L; Koistinen, Heikki; Tuomilehto, Jaakko; Hveem, Kristian; Njølstad, Inger; Deloukas, Panagiotis; Donnelly, Peter J; Frayling, Timothy M; Hattersley, Andrew T; de Faire, Ulf; Hamsten, Anders; Illig, Thomas; Peters, Annette; Cauchi, Stephane; Sladek, Rob; Froguel, Philippe; Hansen, Torben; Pedersen, Oluf; Morris, Andrew D; Palmer, Collin N A; Kathiresan, Sekar; Melander, Olle; Nilsson, Peter M; Groop, Leif C; Barroso, Inês; Langenberg, Claudia; Wareham, Nicholas J; O'Callaghan, Christopher A; Gloyn, Anna L; Altshuler, David; Boehnke, Michael; Teslovich, Tanya M; McCarthy, Mark I; Morris, Andrew P

    2015-01-01

    We performed fine mapping of 39 established type 2 diabetes (T2D) loci in 27,206 cases and 57,574 controls of European ancestry. We identified 49 distinct association signals at these loci, including five mapping in or near KCNQ1. 'Credible sets' of the variants most likely to drive each distinct

  14. General Galilei Covariant Gaussian Maps

    Science.gov (United States)

    Gasbarri, Giulio; Toroš, Marko; Bassi, Angelo

    2017-09-01

    We characterize general non-Markovian Gaussian maps which are covariant under Galilean transformations. In particular, we consider translational and Galilean covariant maps and show that they reduce to the known Holevo result in the Markovian limit. We apply the results to discuss measures of macroscopicity based on classicalization maps, specifically addressing dissipation, Galilean covariance and non-Markovianity. We further suggest a possible generalization of the macroscopicity measure defined by Nimmrichter and Hornberger [Phys. Rev. Lett. 110, 16 (2013)].

  15. Trace maps of general substitutional sequences

    International Nuclear Information System (INIS)

    Kolar, M.; Nori, F.

    1990-01-01

    It is shown that for arbitrary n, there exists a trace map for any n-letter substitutional sequence. Trace maps are explicitly obtained for the well-known circle and Rudin-Shapiro sequences which can be defined by means of substitution rules on three and four letters, respectively. The properties of the two trace maps and their consequences for various spectral properties are briefly discussed
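As a concrete (and deliberately simpler) illustration of a trace map, the sketch below uses the classic two-letter Fibonacci substitution (a -> ab, b -> a), whose half-traces obey x_{n+1} = 2 x_n x_{n-1} - x_{n-2} and conserve the Fricke invariant; the three- and four-letter circle and Rudin-Shapiro trace maps discussed in the record are more involved.

```python
# Illustrative only: the trace map of the two-letter Fibonacci substitution,
# not the three/four-letter circle or Rudin-Shapiro maps from the record.
# x_n = (1/2) Tr M_n for transfer matrices with M_{n+1} = M_n M_{n-1}.

def fibonacci_trace_orbit(x0, x1, x2, steps):
    xs = [x0, x1, x2]
    for _ in range(steps):
        xs.append(2.0 * xs[-1] * xs[-2] - xs[-3])  # x_{n+1} = 2 x_n x_{n-1} - x_{n-2}
    return xs

def fricke_invariant(a, b, c):
    # Conserved exactly along the orbit for SL(2, R) transfer matrices.
    return a * a + b * b + c * c - 2.0 * a * b * c - 1.0

orbit = fibonacci_trace_orbit(1.0, 1.2, 0.9, 10)
I0 = fricke_invariant(*orbit[0:3])
I9 = fricke_invariant(*orbit[-3:])
print(I0, I9)  # the invariant is preserved by the map (up to float rounding)
```

The conserved invariant is what constrains the orbit structure of the trace map and, in turn, the spectral properties of the underlying sequence.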

  16. Slope, Scarp and Sea Cliff Instability Susceptibility Mapping for Planning Regulations in Almada County, Portugal

    Science.gov (United States)

    Marques, Fernando; Queiroz, Sónia; Gouveia, Luís; Vasconcelos, Manuel

    2017-12-01

    In Portugal, the modifications introduced in 2008 and 2012 to the National Ecological Reserve law (REN) made the study of slope instability, including slopes, natural scarps, and sea cliffs, mandatory at the municipal or regional scale, with the purpose of keeping buildings and other structures out of hazardous zones. The law also prescribes specific methods for these studies, with different approaches for slope instability, natural scarps and sea cliffs. The methods used to produce the maps required by the REN law, with modifications and improvements to the methods specified in the law, were applied to the 71 km2 territory of Almada County and included: 1) Slope instability mapping using the statistically based Information Value method, validated against the landslide inventory using ROC curves, which yielded an AUC = 0.964; the higher-susceptibility zones covering at least 80% of the landslides in the inventory were included in the REN map, which was then generalized to overcome the drawbacks of a pixel-based approach. 2) Natural scarp mapping, including setback areas near the top, defined according to the law, and setback areas near the toe, defined by applying the shadow angle calibrated with the major rockfalls that occurred in the study area. 3) Sea cliff mapping, including two levels of setback zones near the top and one setback zone at the cliff toe, based on systematic inventories of cliff failures between 1947 and 2010 from a large-scale regional littoral monitoring project. The paper describes the methods used and the results obtained, which correspond to the final maps of areas to include in the REN. These results may be considered an example of good practice by municipal authorities in terms of solid, technically and scientifically supported regulation definitions, hazard prevention, and safe and sustainable land use management.
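The bivariate Information Value method mentioned above scores each class of a predictor map by the log-ratio of its landslide density to the overall landslide density. A minimal sketch (the slope classes and counts below are made-up numbers, not data from the Almada study):

```python
# Minimal sketch of the Information Value method for landslide
# susceptibility; class names and counts are hypothetical.
import math

def information_values(cells_per_class, landslide_cells_per_class):
    total_cells = sum(cells_per_class.values())
    total_slides = sum(landslide_cells_per_class.values())
    prior = total_slides / total_cells          # overall landslide density
    iv = {}
    for cls, n in cells_per_class.items():
        s = landslide_cells_per_class.get(cls, 0)
        # IV = ln(class landslide density / overall density);
        # classes with no landslides get -inf here (handled differently in practice).
        iv[cls] = math.log((s / n) / prior) if s > 0 else float("-inf")
    return iv

# Hypothetical slope-angle classes: total cell counts and landslide cell counts.
cells = {"0-10deg": 5000, "10-25deg": 3000, "25-45deg": 2000}
slides = {"0-10deg": 10, "10-25deg": 60, "25-45deg": 130}
iv = information_values(cells, slides)
print(iv)
```

A cell's susceptibility score is the sum of the IVs of its classes across all predictor maps; the scored map is then validated against the landslide inventory with ROC curves, as in the abstract.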

  17. BAC-end sequence-based SNPs and Bin mapping for rapid integration of physical and genetic maps in apple.

    Science.gov (United States)

    Han, Yuepeng; Chagné, David; Gasic, Ksenija; Rikkerink, Erik H A; Beever, Jonathan E; Gardiner, Susan E; Korban, Schuyler S

    2009-03-01

    A genome-wide BAC physical map of the apple, Malus x domestica Borkh., has recently been developed. Here, we report on integrating the physical and genetic maps of the apple using a SNP-based approach in conjunction with bin mapping. Briefly, BAC clones located at the ends of BAC contigs were selected and sequenced at both ends. The BAC end sequences (BESs) were used to identify candidate SNPs. Subsequently, these candidate SNPs were genetically mapped using a bin mapping strategy for the purpose of mapping the physical onto the genetic map. Using this approach, 52 (23%) of the 228 BESs tested were successfully exploited to develop SNPs. These SNPs anchored 51 contigs, spanning approximately 37 Mb in cumulative physical length, onto 14 linkage groups. The reliability of the integration of the physical and genetic maps using this SNP-based strategy is described, and the results confirm the feasibility of this approach for constructing an integrated physical and genetic map for apple.

  18. A novel chaotic particle swarm optimization approach using Henon map and implicit filtering local search for economic load dispatch

    International Nuclear Information System (INIS)

    Coelho, Leandro dos Santos; Mariani, Viviana Cocco

    2009-01-01

    Particle swarm optimization (PSO) is a population-based swarm intelligence algorithm driven by the simulation of a social psychological metaphor rather than by survival of the fittest. Based on chaotic systems theory, this paper proposes a novel chaotic PSO combined with an implicit filtering (IF) local search method to solve economic dispatch problems. Since chaotic mappings enjoy certainty, ergodicity and stochastic-like properties, the proposed PSO introduces chaos using Henon map sequences, which increases its convergence rate and resulting precision. The chaotic PSO approach is used to produce good potential solutions, and the IF method is used to fine-tune the final PSO solution. The hybrid methodology is validated on a test system consisting of 13 thermal units whose incremental fuel cost function takes into account the valve-point loading effects. Simulation results are promising and show the effectiveness of the proposed approach.
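The chaotic ingredient can be sketched in a few lines: a Henon-map sequence, rescaled to [0, 1], stands in for the uniform random numbers in the standard PSO velocity update. Parameter values and the min-max normalization below are illustrative choices, not necessarily those of the paper.

```python
# Sketch: Henon-map sequences normalized to [0, 1] replace rand() in the
# PSO velocity update. Parameters and normalization are illustrative.

def henon_sequence(n, x=0.1, y=0.1, a=1.4, b=0.3):
    vals = []
    for _ in range(n):
        x, y = 1.0 - a * x * x + y, b * x   # classic Henon iteration
        vals.append(x)
    lo, hi = min(vals), max(vals)
    return [(v - lo) / (hi - lo) for v in vals]  # rescale into [0, 1]

def pso_velocity(v, x, pbest, gbest, r1, r2, w=0.7, c1=1.5, c2=1.5):
    # Standard PSO update; r1, r2 come from the chaotic sequence.
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)

r = henon_sequence(100)
v_new = pso_velocity(v=0.0, x=2.0, pbest=1.0, gbest=0.5, r1=r[0], r2=r[1])
print(min(r), max(r), v_new)
```

The ergodicity of the chaotic sequence is what the paper credits for the improved exploration compared with pseudo-random draws.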

  19. Exploring teacher's perceptions of concept mapping as a teaching strategy in science: An action research approach

    Science.gov (United States)

    Marks Krpan, Catherine Anne

    In order to promote science literacy in the classroom, students need opportunities in which they can personalize their understanding of the concepts they are learning. Current literature supports the use of concept maps in enabling students to make personal connections in their learning of science. Because they involve creating explicit connections between concepts, concept maps can assist students in developing metacognitive strategies and assist educators in identifying misconceptions in students' thinking. The literature also notes that concept maps can improve student achievement and recall. Much of the current literature focuses primarily on concept mapping at the secondary and university levels, with limited focus on the elementary panel. The research rarely considers teachers' thoughts and ideas about the concept mapping process. In order to effectively explore concept mapping from the perspective of elementary teachers, I felt that an action research approach would be appropriate. Action research enabled educators to debate issues about concept mapping and test out ideas in their classrooms. It also afforded the participants opportunities to explore their own thinking, reflect on their personal journeys as educators and play an active role in their professional development. In an effort to explore concept mapping from the perspective of elementary educators, an action research group of 5 educators and myself was established and met regularly from September 1999 until June 2000. All of the educators taught in the Toronto area. These teachers were interested in exploring how concept mapping could be used as a learning tool in their science classrooms. In summary, this study explores the journey of five educators and myself as we engaged in collaborative action research. This study sets out to: (1) Explore how educators believe concept mapping can facilitate teaching and student learning in the science classroom. (2) Explore how educators implement concept

  20. A MapReduce approach to diminish imbalance parameters for big deoxyribonucleic acid dataset.

    Science.gov (United States)

    Kamal, Sarwar; Ripon, Shamim Hasnat; Dey, Nilanjan; Ashour, Amira S; Santhi, V

    2016-07-01

    In the age of the information superhighway, big data play a significant role in information processing, extraction, retrieval and management. In computational biology, the continuous challenge is to manage the biological data. Data mining techniques are sometimes inadequate for new space and time requirements. Thus, it is critical to process massive amounts of data to retrieve knowledge. The existing software and automated tools to handle big data sets are not sufficient. As a result, an expandable mining technique that exploits the large storage and processing capability of distributed or parallel processing platforms is essential. In this analysis, a contemporary distributed clustering methodology for imbalanced data reduction using a k-nearest neighbor (K-NN) classification approach is introduced. The pivotal objective of this work is to represent real training data sets with a reduced number of elements or instances. These reduced data sets ensure faster data classification and standard storage management with less sensitivity. However, general data reduction methods cannot manage very big data sets. To minimize these difficulties, a MapReduce-oriented framework is designed using various clusters of automated contents, comprising multiple algorithmic approaches. To test the proposed approach, a real DNA (deoxyribonucleic acid) dataset consisting of 90 million pairs was used. The proposed model reduces imbalanced data sets derived from large-scale data sets without loss of accuracy. The obtained results show that the MapReduce-based K-NN classifier provides accurate results for big DNA data.
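The MapReduce pattern behind a distributed K-NN classifier can be sketched in miniature: each mapper returns the k best candidates from its own data partition, and the reducer merges them and takes a majority vote. This toy (the 1-D data, absolute-difference distance, and k below are invented) shows the pattern only, not the paper's DNA pipeline.

```python
# Toy sketch of MapReduce-style K-NN: mappers emit local k-nearest
# candidates, the reducer merges and votes. All data are hypothetical.
from collections import Counter
from heapq import nsmallest

def mapper(partition, query, k):
    # Emit this partition's k nearest (distance, label) pairs.
    return nsmallest(k, ((abs(x - query), label) for x, label in partition))

def reducer(candidate_lists, k):
    # Merge all mappers' candidates, keep the global k best, majority-vote.
    merged = nsmallest(k, (c for cands in candidate_lists for c in cands))
    votes = Counter(label for _, label in merged)
    return votes.most_common(1)[0][0]

partitions = [
    [(0.1, "A"), (0.9, "B"), (0.2, "A")],
    [(0.85, "B"), (0.15, "A"), (0.95, "B")],
]
label = reducer([mapper(p, query=0.2, k=3) for p in partitions], k=3)
print(label)  # the nearest neighbours of 0.2 are all class 'A'
```

Because each mapper only ships k candidates rather than its whole partition, the shuffle stage stays small even when the partitions are huge, which is the point of the MapReduce formulation.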

  1. Evaluation of digital soil mapping approaches with large sets of environmental covariates

    Science.gov (United States)

    Nussbaum, Madlene; Spiess, Kay; Baltensweiler, Andri; Grob, Urs; Keller, Armin; Greiner, Lucie; Schaepman, Michael E.; Papritz, Andreas

    2018-01-01

    The spatial assessment of soil functions requires maps of basic soil properties. Unfortunately, these are either missing for many regions or are not available at the desired spatial resolution or down to the required soil depth. The field-based generation of large soil datasets and conventional soil maps remains costly. Meanwhile, legacy soil data and comprehensive sets of spatial environmental data are available for many regions. Digital soil mapping (DSM) approaches relating soil data (responses) to environmental data (covariates) face the challenge of building statistical models from large sets of covariates originating, for example, from airborne imaging spectroscopy or multi-scale terrain analysis. We evaluated six approaches for DSM in three study regions in Switzerland (Berne, Greifensee, ZH forest) by mapping the effective soil depth available to plants (SD), pH, soil organic matter (SOM), effective cation exchange capacity (ECEC), clay, silt, gravel content and fine fraction bulk density for four soil depths (totalling 48 responses). Models were built from 300-500 environmental covariates by selecting linear models through (1) grouped lasso and (2) an ad hoc stepwise procedure for robust external-drift kriging (georob). For (3) geoadditive models we selected penalized smoothing spline terms by component-wise gradient boosting (geoGAM). We further used two tree-based methods: (4) boosted regression trees (BRTs) and (5) random forest (RF). Lastly, we computed (6) weighted model averages (MAs) from the predictions obtained from methods 1-5. Lasso, georob and geoGAM successfully selected strongly reduced sets of covariates (subsets of 3-6 % of all covariates). Differences in predictive performance, tested on independent validation data, were mostly small and did not reveal a single best method for 48 responses. Nevertheless, RF was often the best among methods 1-5 (28 of 48 responses), but was outcompeted by MA for 14 of these 28 responses. 
RF tended to over
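Step (6) of the study, weighted model averaging, can be sketched as combining the predictions of the fitted methods with weights derived from their validation errors. The inverse-error weighting and all numbers below are illustrative assumptions, not necessarily the exact scheme or values used in the study.

```python
# Sketch of weighted model averaging (MA): combine several models'
# predictions with weights inversely proportional to validation MSE.
# Weighting rule and numbers are hypothetical.

def model_average(predictions, val_errors):
    """predictions: {model: prediction}; val_errors: {model: validation MSE}."""
    inv = {m: 1.0 / e for m, e in val_errors.items()}
    total = sum(inv.values())
    weights = {m: w / total for m, w in inv.items()}   # normalize to sum to 1
    return sum(weights[m] * predictions[m] for m in predictions), weights

pred, w = model_average(
    predictions={"lasso": 6.1, "georob": 6.4, "rf": 5.8},
    val_errors={"lasso": 0.50, "georob": 0.40, "rf": 0.25},
)
print(pred, w)  # rf gets the largest weight; pred is a convex combination
```

Such an average can outperform its best member when the individual models' errors are only weakly correlated, which is consistent with MA beating RF on a subset of the responses.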

  2. Development of Long Live Love+, a school-based online sexual health programme for young adults. An intervention mapping approach

    NARCIS (Netherlands)

    Mevissen, F.E.F.; Empelen, P. van; Watzeels, A.; Duin, G. van; Meijer, S.; Lieshout, S. van; Kok, G.

    2017-01-01

    This paper describes the development of a Dutch online programme called Long Live Love+ focusing on positive, coercion-free relationships, contraception use, and the prevention of STIs, using the Intervention Mapping (IM) approach. All six steps of the approach were followed. Step 1 confirmed the

  3. WOCAT mapping, GIS and the Góis municipality

    Science.gov (United States)

    Esteves, T. C. J.; Soares, J. A. A.; Ferreira, A. J. D.; Coelho, C. O. A.; Carreiras, M. A.; Lynden, G. V.

    2012-04-01

    In the scope of the goals of the association "The World Overview of Conservation Approaches and Technologies" (WOCAT), the established methodology intends to support the sustainable development of new techniques and the decision-making process in Sustainable Soil Management (SSM). Its main goal is to promote coexistence with nature, in order to assure the wellbeing of upcoming generations. SSM is defined as the use of terrestrial resources, including soil, water, fauna and flora, for the production of goods that fulfill human needs, while simultaneously guaranteeing the long-term productive potential of these resources and the maintenance of their environmental functions. The EU-funded DESIRE (Desertification Mitigation & Remediation of Land: a global approach for local solutions) project is centered on SSM, having as its main goal the development and study of promising conservation, soil use and management strategies, thereby contributing to the protection of vulnerable arid and semi-arid areas. In Portugal, one of the main agents of soil degradation and desertification is wildfire. There is consequently an urgent need to establish integrated conservation measures to reduce or prevent these occurrences. To this end, and for the DESIRE project, the WOCAT methodology was implemented; it comprises three major questionnaires: technologies (WOCAT Technologies), approaches (WOCAT Approaches) and mapping (WOCAT Mapping). The WOCAT Mapping methodology was created to address the questions associated with soil and water degradation, emphasizing the direct and socio-economic causes of this degradation. It evaluates what type of soil degradation is occurring, where, why, and what actions are in practice with respect to SSM. The association of this questionnaire with Geographical Information Systems (GIS) allows not only the production of maps, but also the calculation of areas, taking into account several aspects of soil degradation and

  4. Using fuzzy cognitive mapping as a participatory approach to analyze change, preferred states, and perceived resilience of social-ecological systems

    Directory of Open Access Journals (Sweden)

    Steven A. Gray

    2015-06-01

    Full Text Available There is a growing interest in the use of fuzzy cognitive mapping (FCM) as a participatory method for understanding social-ecological systems (SESs). In recent years, FCM has been used in a diverse set of contexts ranging from fisheries management to agricultural development, in an effort to generate transparent graphical models of complex systems that are useful for decision making, illuminate the core presumptions of environmental stakeholders, and structure environmental problems for scenario development. This increase in popularity is because of FCM's bottom-up approach and its ability to incorporate a range of individual, community-level, and expert knowledge into an accessible and standardized format. Although there has been an increase in the use of FCM as an environmental planning and learning tool, limited progress has been made with regard to the method's relationship to existing resilience frameworks and how the use of FCM compares with other participatory modeling approaches available. Using case study data developed from community-driven models of the bushmeat trade in Tanzania, we examine the usefulness of FCM for promoting resilience analysis among stakeholders in terms of identifying key state variables that comprise an SES, evaluating alternative SES equilibrium states, and defining desirable or undesirable state outcomes through scenario analysis.
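Running a fuzzy cognitive map amounts to repeatedly pushing concept activations through the weighted connection matrix and squashing with a sigmoid until the state settles into an equilibrium, which is then compared across scenarios. The three-concept system and its weights below are invented for illustration; they are not the Tanzanian bushmeat model.

```python
# Minimal sketch of FCM inference: iterate the state through the weight
# matrix with a sigmoid until a fixed point. Concepts/weights are invented.
import math

def step(state, weights, lam=1.0):
    # new_j = sigmoid( old_j + sum_{i != j} old_i * w[i][j] )
    n = len(state)
    nxt = []
    for j in range(n):
        total = state[j] + sum(state[i] * weights[i][j] for i in range(n) if i != j)
        nxt.append(1.0 / (1.0 + math.exp(-lam * total)))
    return nxt

def run_to_equilibrium(state, weights, tol=1e-6, max_iter=500):
    for _ in range(max_iter):
        nxt = step(state, weights)
        if max(abs(a - b) for a, b in zip(nxt, state)) < tol:
            return nxt
        state = nxt
    return state

# Hypothetical concepts: hunting pressure suppresses wildlife (-0.8),
# wildlife feeds trade supply (+0.7), trade supply drives hunting (+0.4).
W = [[0.0, -0.8, 0.0],
     [0.0,  0.0, 0.7],
     [0.4,  0.0, 0.0]]
eq = run_to_equilibrium([1.0, 0.5, 0.5], W)
print(eq)  # a fixed point: feeding eq back through step() barely changes it
```

Scenario analysis then means clamping one concept (say, enforcement) to a fixed value and re-running to see how the equilibrium of the other concepts shifts.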

  5. Exposing the Myths, Defining the Future

    International Nuclear Information System (INIS)

    Slavov, S.

    2013-01-01

    With this official statement, the WEC calls for policymakers and industry leaders to "get real": as a global energy body, the World Energy Council exposes the myths by informing the energy debate and defines a path to a more sustainable energy future. The World Energy Council urged stakeholders to take urgent and incisive action to develop and transform the global energy system. Failure to do so could put aspirations on the triple challenge of the WEC Energy Trilemma, defined by affordability, accessibility and environmental sustainability, at serious risk. Through its multi-year, in-depth global studies and issue-mapping, the WEC has found that the challenges the energy sector faces today are much more crucial than previously envisaged. The WEC's analysis has exposed a number of myths which influence our understanding of important aspects of the global energy landscape. If not challenged, these misconceptions will lead us down a path of complacency and missed opportunities. Much has been, and still is being, done to secure the energy future, but the WEC's studies reveal that current pathways fall short of delivering on global aspirations of energy access, energy security and environmental improvements. If we are to derive the full economic and social benefits from energy resources, then we must take incisive and urgent action to modify our steps towards energy solutions. The usual business approaches are not effective; business as usual is no longer a solution. The focus has moved from large universal solutions to an appreciation of regional and national contexts and sharply differentiated consumer expectations.(author)

  6. Closed forms and multi-moment maps

    DEFF Research Database (Denmark)

    Madsen, Thomas Bruun; Swann, Andrew Francis

    2013-01-01

    We extend the notion of multi-moment map to geometries defined by closed forms of arbitrary degree. We give fundamental existence and uniqueness results and discuss a number of essential examples, including geometries related to special holonomy. For forms of degree four, multi-moment maps are gu...

  7. Computerized mappings of the cerebral cortex: a multiresolution flattening method and a surface-based coordinate system

    Science.gov (United States)

    Drury, H. A.; Van Essen, D. C.; Anderson, C. H.; Lee, C. W.; Coogan, T. A.; Lewis, J. W.

    1996-01-01

    We present a new method for generating two-dimensional maps of the cerebral cortex. Our computerized, two-stage flattening method takes as its input any well-defined representation of a surface within the three-dimensional cortex. The first stage rapidly converts this surface to a topologically correct two-dimensional map, without regard for the amount of distortion introduced. The second stage reduces distortions using a multiresolution strategy that makes gross shape changes on a coarsely sampled map and further shape refinements on progressively finer resolution maps. We demonstrate the utility of this approach by creating flat maps of the entire cerebral cortex in the macaque monkey and by displaying various types of experimental data on such maps. We also introduce a surface-based coordinate system that has advantages over conventional stereotaxic coordinates and is relevant to studies of cortical organization in humans as well as non-human primates. Together, these methods provide an improved basis for quantitative studies of individual variability in cortical organization.

  8. The semantic connectivity map: an adapting self-organising knowledge discovery method in data bases. Experience in gastro-oesophageal reflux disease.

    Science.gov (United States)

    Buscema, Massimo; Grossi, Enzo

    2008-01-01

    We describe here a new mapping method able to uncover connectivity traces among variables thanks to an artificial adaptive system, the Auto Contractive Map (AutoCM), which defines the strength of the association of each variable with all the others in a dataset. After the training phase, the weights matrix of the AutoCM represents the map of the main connections between the variables. The example of a gastro-oesophageal reflux disease database illustrates how this new approach can help to re-design the overall structure of the factors involved in the description of complex and specific diseases.

  9. Data Assimilation with Optimal Maps

    Science.gov (United States)

    El Moselhy, T.; Marzouk, Y.

    2012-12-01

    We present a new approach to Bayesian inference that entirely avoids Markov chain simulation and sequential importance resampling, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. The map is written as a multivariate polynomial expansion and computed efficiently through the solution of a stochastic optimization problem. While our previous work [1] focused on static Bayesian inference problems, we now extend the map-based approach to sequential data assimilation, i.e., nonlinear filtering and smoothing. One scheme involves pushing forward a fixed reference measure to each filtered state distribution, while an alternative scheme computes maps that push forward the filtering distribution from one stage to the next. We compare the performance of these schemes and extend the former to problems of smoothing, using a map implementation of the forward-backward smoothing formula. Advantages of a map-based representation of the filtering and smoothing distributions include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent uniformly-weighted posterior samples without additional evaluations of the dynamical model. Perhaps the main advantage, however, is that the map approach inherently avoids issues of sample impoverishment, since it explicitly represents the posterior as the pushforward of a reference measure, rather than with a particular set of samples. The computational complexity of our algorithm is comparable to state-of-the-art particle filters. Moreover, the accuracy of the approach is controlled via the convergence criterion of the underlying optimization problem. We demonstrate the efficiency and accuracy of the map approach via data assimilation in
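As a toy illustration of the pushforward idea only (the abstract's method builds multivariate polynomial maps via stochastic optimization, which is not reproduced here), a 1-D monotone transport map can be built by matching empirical quantiles:

```python
import numpy as np

def monotone_transport_map(prior_samples, posterior_samples, query):
    """Toy 1-D transport map: send each query point to the posterior
    quantile that matches its empirical prior quantile."""
    ps = np.sort(prior_samples)
    qs = np.sort(posterior_samples)
    u = np.interp(query, ps, np.linspace(0.0, 1.0, len(ps)))  # prior CDF rank
    return np.interp(u, np.linspace(0.0, 1.0, len(qs)), qs)   # posterior quantile

rng = np.random.default_rng(0)
prior = rng.normal(0.0, 1.0, 5000)        # reference measure N(0, 1)
posterior = rng.normal(2.0, 0.5, 5000)    # target measure N(2, 0.25)
mapped = monotone_transport_map(prior, posterior, prior)
# pushing the prior samples through the map reproduces the target moments
```

Pushing the prior through the map yields samples distributed approximately as the posterior, which is the property the abstract exploits to draw cheap, uniformly weighted posterior samples.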

  10. A Ranking Analysis/An Interlinking Approach of New Triangular Fuzzy Cognitive Maps and Combined Effective Time Dependent Matrix

    Science.gov (United States)

    Adiga, Shreemathi; Saraswathi, A.; Praveen Prakash, A.

    2018-04-01

    This paper aims at an interlinking approach of new Triangular Fuzzy Cognitive Maps (TrFCM) and a Combined Effective Time Dependent (CETD) matrix to find the ranking of the problems of Transgenders. Section one begins with an introduction that briefly describes the scope of Triangular Fuzzy Cognitive Maps (TrFCM) and the CETD Matrix. Section two models the causes of the problems faced by Transgenders using the Triangular Fuzzy Cognitive Maps (TrFCM) method and performs the calculations using the data collected among the Transgenders. Section 3 discusses the reasons for the main causes of the problems of the Transgenders. Section 4 describes Charles Spearman's coefficient of rank correlation method, obtained by interlinking the Triangular Fuzzy Cognitive Maps (TrFCM) method and the CETD Matrix. Section 5 presents the results of our study.

  11. Mapping Urban Green Infrastructure: A Novel Landscape-Based Approach to Incorporating Land Use and Land Cover in the Mapping of Human-Dominated Systems

    Directory of Open Access Journals (Sweden)

    Matthew Dennis

    2018-01-01

    Full Text Available Common approaches to mapping green infrastructure in urbanised landscapes invariably focus on measures of land use or land cover and associated functional or physical traits. However, such one-dimensional perspectives do not accurately capture the character and complexity of the landscapes in which urban inhabitants live. The new approach presented in this paper demonstrates how open-source, high spatial and temporal resolution data with global coverage can be used to measure and represent the landscape qualities of urban environments. By going beyond simple metrics of quantity, such as percentage green and blue cover, it is now possible to explore the extent to which landscape quality helps to unpick the mixed evidence presented in the literature on the benefits of urban nature to human well-being. Here we present a landscape approach, employing remote sensing, GIS and data reduction techniques to map urban green infrastructure elements in a large U.K. city region. Comparison with existing urban datasets demonstrates considerable improvement in terms of coverage and thematic detail. The characterisation of landscapes, using census tracts as spatial units, and the subsequent exploration of associations with social–ecological attributes highlight the further detail that can be uncovered by the approach. For example, eight urban landscape types identified for the case study city exhibited associations with distinct socioeconomic conditions attributable not only to the quantities but also to the qualities of green and blue space. The identification of individual landscape features through simultaneous measures of land use and land cover demonstrated unique and significant associations between the former and indicators of human health and ecological condition. The approach may therefore provide a promising basis for developing further insight into processes and characteristics that affect human health and well-being in urban areas, both in the United

  12. Solving topological field theories on mapping tori

    International Nuclear Information System (INIS)

    Blau, M.; Jermyn, I.; Thompson, G.

    1996-05-01

    Using gauge theory and functional integral methods, we derive concrete expressions for the partition functions of BF theory and the U(1|1) model of Rozansky and Saleur on Σ × S^1, both directly and using equivalent two-dimensional theories. We also derive the partition function on a certain non-abelian generalization of the U(1|1) model on mapping tori and hence obtain explicit expressions for the Ray-Singer torsion on these manifolds. Extensions of these results to BF and Chern-Simons theories on mapping tori are also discussed. The topological field theory actions of the equivalent two-dimensional theories we find have the interesting property of depending explicitly on the diffeomorphism defining the mapping torus, while the quantum field theory is sensitive only to its isomorphism class, defining the mapping torus as a smooth manifold. (author). 20 refs

  13. Regional quantitative analysis of cortical surface maps of FDG PET images

    CERN Document Server

    Protas, H D; Hayashi, K M; Chin Lung, Yu; Bergsneider, M; Sung Cheng, Huang

    2006-01-01

    Cortical surface maps are advantageous for visualizing the 3D profile of cortical gray matter development and atrophy, and for integrating structural and functional images. In addition, cortical surface maps for PET data, when analyzed in conjunction with structural MRI data allow us to investigate, and correct for, partial volume effects. Here we compared quantitative regional PET values based on a 3D cortical surface modeling approach with values obtained directly from the 3D FDG PET images in various atlas-defined regions of interest (ROIs; temporal, parietal, frontal, and occipital lobes). FDG PET and 3D MR (SPGR) images were obtained and aligned to ICBM space for 15 normal subjects. Each image was further elastically warped in 2D parameter space of the cortical surface, to align major cortical sulci. For each point within a 15 mm distance of the cortex, the value of the PET intensity was averaged to give a cortical surface map of FDG uptake. The average PET values on the cortical surface map were calcula...

  14. Local Relation Map: A Novel Illumination Invariant Face Recognition Approach

    Directory of Open Access Journals (Sweden)

    Lian Zhichao

    2012-10-01

    Full Text Available In this paper, a novel illumination invariant face recognition approach is proposed. Different from most existing methods, an additive noise term is considered in the face model under varying illumination, in addition to a multiplicative illumination term. High frequency coefficients of the Discrete Cosine Transform (DCT) are discarded to eliminate the effect caused by noise. Based on the local characteristics of the human face, a simple but effective illumination invariant feature, the local relation map, is proposed. Experimental results on the Yale B, Extended Yale B and CMU PIE databases demonstrate the superior performance and lower computational burden of the proposed method compared to other existing methods. The results also demonstrate the validity of the proposed face model and the assumption on noise.
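The preprocessing step of discarding high-frequency DCT coefficients can be sketched as follows; the keep-block size and the 8x8 toy image are illustrative choices, not values from the paper:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix (n x n)."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    m = np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    m[0] *= np.sqrt(1.0 / n)
    m[1:] *= np.sqrt(2.0 / n)
    return m

def suppress_high_freq(img, keep):
    """2-D DCT of a square image, zero every coefficient outside the
    low-frequency keep x keep corner, then inverse transform."""
    d = dct_matrix(img.shape[0])
    coeff = d @ img @ d.T
    mask = np.zeros_like(coeff)
    mask[:keep, :keep] = 1.0
    return d.T @ (coeff * mask) @ d

face = np.add.outer(np.linspace(0, 1, 8), np.linspace(0, 1, 8))  # smooth toy "face"
noisy = face + 0.05 * np.sign(np.sin(37.0 * np.arange(64))).reshape(8, 8)
smooth = suppress_high_freq(noisy, keep=3)
```

Because the DCT matrix is orthonormal, keeping the full coefficient block reproduces the input exactly; shrinking `keep` trades detail for suppression of the high-frequency noise term in the face model.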

  15. Development of the SALdável programme to reduce salt intake among hypertensive Brazilian women: an intervention mapping approach.

    Science.gov (United States)

    Cornélio, Marilia Estevam; Godin, Gaston; Rodrigues, Roberta; Agondi, Rúbia; Spana, Thaís; Gallani, Maria-Cecilia

    2013-08-01

    Despite strong evidence for a relationship between high salt intake and hypertension, and the widespread recommendation of dietary salt restriction for hypertensive subjects, there are no nursing studies describing effective theory-based interventions. Our aim is to describe a systematic process for the development of a theory-based nursing intervention aimed at reducing salt intake among hypertensive women, by applying the 'intervention mapping' protocol. We developed our intervention following the six steps of the 'intervention mapping' protocol: assessing needs, creating a matrix of change objectives, selecting theoretical methods and practical applications, defining the intervention programme, organizing the adoption and implementation plan, and defining the evaluation plan. Addition of salt during cooking was identified as the main source of salt consumption, and women were identified as the people responsible for cooking meals at home. In our study, the motivational predictors of this behaviour were self-efficacy and habit. Guided practice, verbal persuasion, coping with barriers, consciousness-raising and counter-conditioning were the theoretical methods we selected for enhancing self-efficacy and promoting habit change, respectively. Brainstorming, role-playing, cookbook use, measuring-spoon use, label reading, hands-on skill-building activities and reinforcement phone calls were the chosen practical applications. We designed our intervention programme, and then organized the adoption and implementation plans. Finally, we generated a plan to evaluate our intervention. 'Intervention mapping' was a feasible methodological framework to guide the development of a theory-based nursing intervention for dietary salt reduction among hypertensive women.

  16. Regional geology mapping using satellite-based remote sensing approach in Northern Victoria Land, Antarctica

    Science.gov (United States)

    Pour, Amin Beiranvand; Park, Yongcheol; Park, Tae-Yoon S.; Hong, Jong Kuk; Hashim, Mazlan; Woo, Jusun; Ayoobi, Iman

    2018-06-01

    Satellite remote sensing imagery is especially useful for geological investigations in Antarctica because of its remoteness and extreme environmental conditions that constrain direct geological survey. The highest percentage of exposed rocks and soils in Antarctica occurs in Northern Victoria Land (NVL). Exposed rocks in NVL were part of the paleo-Pacific margin of East Gondwana during the Paleozoic time. This investigation provides a satellite-based remote sensing approach for regional geological mapping in the NVL, Antarctica. Landsat-8 and the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) datasets were used to extract lithological-structural and mineralogical information. Several spectral-band ratio indices were developed using Landsat-8 and ASTER bands and proposed for Antarctic environments to map spectral signatures of snow/ice, iron oxide/hydroxide minerals, Al-OH-bearing, Fe,Mg-OH and CO3 mineral zones, and quartz-rich felsic and mafic-to-ultramafic lithological units. The spectral-band ratio indices were tested and implemented on Level 1 terrain-corrected (L1T) products of Landsat-8 and ASTER datasets covering the NVL. The surface distribution of the mineral assemblages was mapped using the spectral-band ratio indices and verified by geological expeditions and laboratory analysis. The resultant image maps derived from the spectral-band ratio indices developed in this study are fairly accurate and correspond well with existing geological maps of the NVL. The spectral-band ratio indices developed in this study are especially useful for geological investigations of inaccessible locations and poorly exposed lithological units in Antarctic environments.
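Band-ratio indices of this kind reduce to per-pixel divisions of reflectance bands. A generic sketch follows; the band pairing and threshold are hypothetical, not the indices developed in the paper:

```python
import numpy as np

def band_ratio(numerator, denominator, eps=1e-6):
    """Per-pixel band ratio; eps guards against division by zero."""
    return numerator / (denominator + eps)

# toy 2x2 scene: a SWIR band over an NIR band, a common way to
# highlight OH-bearing alteration minerals (hypothetical pairing)
swir = np.array([[0.30, 0.10], [0.25, 0.05]])
nir = np.array([[0.10, 0.10], [0.05, 0.05]])
index = band_ratio(swir, nir)
anomaly = index > 2.0   # threshold chosen for illustration only
```

Thresholding the ratio image yields the kind of binary mineral-zone mask that can then be checked against field expeditions and laboratory analyses.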

  17. A new meshless approach to map electromagnetic loads for FEM analysis on DEMO TF coil system

    International Nuclear Information System (INIS)

    Biancolini, Marco Evangelos; Brutti, Carlo; Giorgetti, Francesco; Muzzi, Luigi; Turtù, Simonetta; Anemona, Alessandro

    2015-01-01

    Graphical abstract: - Highlights: • Generation and mapping of magnetic load on DEMO using radial basis function. • Good agreement between RBF interpolation and EM TOSCA computations. • Resultant forces are stable with respect to the target mesh used. • Stress results are robust and accurate even if a coarse cloud is used for RBF interpolation. - Abstract: Demonstration fusion reactors (DEMO) are being envisaged to be able to produce commercial electrical power. The design of the DEMO magnets and of the constituting conductors is a crucial issue in the overall engineering design of such a large fusion machine. In the frame of the EU roadmap of the so-called fast track approach, mechanical studies of preliminary DEMO toroidal field (TF) coil system conceptual designs are being enforced. The magnetic field load acting on the DEMO TF coil conductor has to be evaluated as input in the FEM model mesh, in order to evaluate the stresses on the mechanical structure. To gain flexibility, a novel approach based on the meshless method of radial basis functions (RBF) has been implemented. The present paper describes this original and flexible approach for the generation and mapping of magnetic load on DEMO TF coil system.
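The RBF idea, fitting radial kernels on a scattered source cloud and evaluating anywhere on the target mesh, can be sketched in a few lines; a Gaussian kernel and a toy 2-D "load" field stand in for the actual electromagnetic data:

```python
import numpy as np

def rbf_interpolate(src_pts, src_vals, tgt_pts, eps=1.0):
    """Solve for Gaussian RBF weights on the source points, then
    evaluate the interpolant at the target points. Kernel choice,
    shape parameter and units are illustrative assumptions."""
    d2 = ((src_pts[:, None, :] - src_pts[None, :, :]) ** 2).sum(-1)
    weights = np.linalg.solve(np.exp(-eps * d2), src_vals)
    d2t = ((tgt_pts[:, None, :] - src_pts[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps * d2t) @ weights

src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vals = src[:, 0] + 2.0 * src[:, 1]        # a smooth stand-in "load" field
tgt = np.array([[0.5, 0.5]])              # e.g. a node of the FEM mesh
mapped = rbf_interpolate(src, vals, tgt)
```

Because the interpolant passes exactly through the source data, the transferred field stays consistent with the source computation regardless of the target mesh, in line with the stability of resultant forces the highlights report.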

  18. A knowledge-based approach for C-factor mapping in Spain using Landsat TM and GIS

    DEFF Research Database (Denmark)

    Veihe (former Folly), Anita; Bronsveld, M.C.; Clavaux, M

    1996-01-01

    The cover and management factor (C) in the Universal Soil Loss Equation (USLE), is one of the most important parameters for assessing erosion. In this study it is shown how a knowledge-based approach can be used to optimize C-factor mapping in the Mediterranean region being characterized...... to the limitations of the USLE...

  19. Tools for mapping ecosystem services

    Science.gov (United States)

    Palomo, Ignacio; Adamescu, Mihai; Bagstad, Kenneth J.; Cazacu, Constantin; Klug, Hermann; Nedkov, Stoyan; Burkhard, Benjamin; Maes, Joachim

    2017-01-01

    Mapping tools have evolved impressively in recent decades. From early computerised mapping techniques to current cloud-based mapping approaches, we have witnessed a technological evolution that has facilitated the democratisation of Geographic Information

  20. Approach of automatic 3D geological mapping: the case of the Kovdor phoscorite-carbonatite complex, NW Russia.

    Science.gov (United States)

    Kalashnikov, A O; Ivanyuk, G Yu; Mikhailova, J A; Sokharev, V A

    2017-07-31

    We have developed an approach to automatic 3D geological mapping based on the conversion of the chemical composition of rocks to mineral composition by logical computation. It allows us to calculate mineral composition from bulk rock chemistry, interpolate the mineral composition in the same way as chemical composition, and, finally, build a 3D geological model. The approach was developed for the Kovdor phoscorite-carbonatite complex containing the Kovdor baddeleyite-apatite-magnetite deposit. We used four bulk rock chemistry analyses: Femagn, P2O5, CO2 and SiO2. We used four techniques for the prediction of rock types: calculation of normative mineral compositions (norms), multiple regression, an artificial neural network, and a newly developed logical evaluation method; the latter two performed best. As a result, we distinguished 14 types of phoscorites (forsterite-apatite-magnetite-carbonate rocks), carbonatite and host rocks. The results show good agreement with our petrographical studies of the deposit and with recent manually built maps. The proposed approach can be used as a tool for reconstructing a deposit's genesis and for preliminary geometallurgical modelling.
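The "logical computation" step amounts to a rule-based mapping from the four bulk-chemistry inputs to a rock label. A toy sketch with invented thresholds follows; the study's actual rules are not reproduced here:

```python
def classify_rock(fe_magn, p2o5, co2, sio2):
    """Toy logical classification from bulk chemistry (wt%).
    Thresholds are hypothetical, for illustration only."""
    if co2 > 20.0:
        return "carbonatite"
    if fe_magn > 15.0 and p2o5 > 5.0:
        return "apatite-magnetite phoscorite"
    if sio2 > 40.0:
        return "host rock"
    return "other phoscorite"

samples = [
    (5.0, 1.0, 35.0, 5.0),    # carbonate-dominated sample
    (25.0, 8.0, 5.0, 10.0),   # ore-grade phoscorite
    (3.0, 0.5, 1.0, 55.0),    # silicate host rock
]
labels = [classify_rock(*s) for s in samples]
```

Because the rules act on chemistry, the same function can label every cell of an interpolated 3D block model, which is what turns chemical interpolation into an automatic geological map.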

  1. Approaches to defining deltaic sustainability in the 21st century

    Science.gov (United States)

    Day, John W.; Agboola, Julius; Chen, Zhongyuan; D'Elia, Christopher; Forbes, Donald L.; Giosan, Liviu; Kemp, Paul; Kuenzer, Claudia; Lane, Robert R.; Ramachandran, Ramesh; Syvitski, James; Yañez-Arancibia, Alejandro

    2016-12-01

    Deltas are among the most productive and economically important of global ecosystems, but unfortunately they are also among the most threatened by human activities. Here we discuss deltas and human impacts on them, consider several approaches to defining deltaic sustainability, and present a sustainability ranking. Delta sustainability must be considered within the context of global biophysical and socioeconomic constraints that include thermodynamic limitations, scale and embeddedness, and constraints at the level of the biosphere/geosphere. The development, functioning, and sustainability of deltas are the result of external and internal inputs of energy and materials, such as sediments and nutrients, that include delta lobe development, channel switching, crevasse formation, river floods, storms and associated waves and storm surges, and tides and other ocean currents. Modern deltas developed over the past several thousand years with relatively stable global mean sea level, predictable material inputs from drainage basins and the sea, and as extremely open systems. Human activity has changed these conditions to make deltas less sustainable, in that they are unable to persist through time structurally or functionally. Deltaic sustainability can be considered from geomorphic, ecological, and economic perspectives, with functional processes at these three levels being highly interactive. Changes in this functioning can lead to either enhanced or diminished sustainability, but most changes have been detrimental. There is a growing understanding that the trajectories of global environmental change and the cost of energy will make achieving delta sustainability more challenging and limit options for management. Several delta types are identified in terms of sustainability, including those in arid regions, those with high and low energy-intensive management systems, deltas below sea level, tropical deltas, and Arctic deltas. Representative deltas are ranked on a sustainability range

  2. Mapping site-based construction workers’ motivation: Expectancy theory approach

    Directory of Open Access Journals (Sweden)

    Parviz Ghoddousi

    2014-03-01

    Full Text Available The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in the construction context and to confirm the validity of this model for the construction industry. The study drew upon data from 194 site-based construction workers in Iran to test the proposed model of motivation. To this end, the structural equation modelling (SEM) approach based on the confirmatory factor analysis (CFA) technique was deployed. The study reveals that the proposed model of expectancy theory incorporating five indicators (i.e. intrinsic instrumentality, extrinsic instrumentality, intrinsic valence, extrinsic valence and expectancy) is able to map the process of construction workers’ motivation. Nonetheless, the findings posit that intrinsic indicators could be more effective than extrinsic ones. This underlines the necessity of construction managers placing further focus on intrinsic motivators to motivate workers.

  4. A quantitative approach to measure road network information based on edge diversity

    Science.gov (United States)

    Wu, Xun; Zhang, Hong; Lan, Tian; Cao, Weiwei; He, Jing

    2015-12-01

    The measure of map information has been one of the key issues in assessing cartographic quality and map generalization algorithms. It is also important for developing efficient approaches to transferring geospatial information. The road network is the most common linear object in the real world. Approximate description of road network information will benefit road map generalization, navigation map production and urban planning. Most current approaches have focused on node diversities and supposed that all the edges are the same, which is inconsistent with real-life conditions, and thus show limitations in measuring network information. As real-life traffic flows are directed and of different quantities, the original undirected vector road map was first converted to a directed topographic connectivity map. Then, in consideration of preferential attachment in complex network studies and the rich-club phenomenon in social networks, the from and to weights of each edge were assigned. The from weight of a given edge is defined as the ratio of the connectivity of its end node to the sum of the connectivities of all the neighbors of the from node of the edge. After obtaining the from and to weights of each edge, the edge information, node information and whole-network structure information entropies can be obtained based on information theory. The approach has been applied to several 1-square-mile road network samples. Results show that information entropies based on edge diversities can successfully describe the structural differences of road networks. This approach is complementary to current map information measurements, and can be extended to measure other kinds of geographical objects.
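Reading the abstract's definition as a connectivity ratio (our interpretation; the paper's exact formula may differ), the edge weights and the resulting information entropy can be computed as follows for a toy directed road graph:

```python
import math

# toy road graph: node -> neighbours; connectivity = number of neighbours
adj = {"a": {"b", "c"}, "b": {"a", "c", "d"}, "c": {"a", "b"}, "d": {"b"}}
conn = {n: len(v) for n, v in adj.items()}

def from_weight(u, v):
    """Weight of edge u->v: connectivity of the end node v relative to
    the total connectivity of all neighbours of u."""
    total = sum(conn[w] for w in adj[u])
    return conn[v] / total

def node_entropy(u):
    """Shannon entropy (bits) of node u's outgoing weight distribution."""
    ps = [from_weight(u, v) for v in adj[u]]
    return -sum(p * math.log2(p) for p in ps)

h = node_entropy("a")   # weights over neighbours b (3/5) and c (2/5)
```

Summing such entropies over edges and nodes gives a whole-network information measure that distinguishes networks with identical node counts but different edge diversities.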

  5. Toward accelerating landslide mapping with interactive machine learning techniques

    Science.gov (United States)

    Stumpf, André; Lachiche, Nicolas; Malet, Jean-Philippe; Kerle, Norman; Puissant, Anne

    2013-04-01

    Despite important advances in the development of more automated methods for landslide mapping from optical remote sensing images, the elaboration of inventory maps after major triggering events still remains a tedious task. Image classification with expert-defined rules typically still requires significant manual labour for the elaboration and adaptation of rule sets for each particular case. Machine learning algorithms, on the contrary, have the ability to learn and identify complex image patterns from labelled examples, but may require relatively large amounts of training data. In order to reduce the amount of required training data, active learning has evolved as a key concept to guide the sampling for applications such as document classification, genetics and remote sensing. The general underlying idea of most active learning approaches is to initialize a machine learning model with a small training set, and to subsequently exploit the model state and/or the data structure to iteratively select the most valuable samples that should be labelled by the user and added to the training set. With relatively few queries and labelled samples, an active learning strategy should ideally yield at least the same accuracy as an equivalent classifier trained with many randomly selected samples. Our study was dedicated to the development of an active learning approach for landslide mapping from VHR remote sensing images with special consideration of the spatial distribution of the samples. The developed approach is a region-based query heuristic that guides the user's attention towards a few compact spatial batches rather than distributed points, resulting in time savings of 50% and more compared to standard active learning techniques. The approach was tested with multi-temporal and multi-sensor satellite images capturing recent large-scale triggering events in Brazil and China, and demonstrated balanced user's and producer's accuracies between 74% and 80%.
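A minimal uncertainty-sampling loop illustrates the general active-learning idea the abstract builds on; this sketch uses a nearest-centroid classifier on synthetic 2-D features, not the paper's region-based batch heuristic:

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic two-class pixels: "background" around (0,0), "landslide" around (3,3)
x = np.vstack([rng.normal(0.0, 1.0, (100, 2)), rng.normal(3.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

labeled = [0, 100]                       # one seed example per class
pool = [i for i in range(200) if i not in labeled]

for _ in range(10):                      # ten active-learning queries
    c0 = x[[i for i in labeled if y[i] == 0]].mean(axis=0)
    c1 = x[[i for i in labeled if y[i] == 1]].mean(axis=0)
    d0 = np.linalg.norm(x[pool] - c0, axis=1)
    d1 = np.linalg.norm(x[pool] - c1, axis=1)
    q = pool[int(np.argmin(np.abs(d0 - d1)))]  # least certain pool point
    labeled.append(q)                    # the "user" labels it via y[q]
    pool.remove(q)

pred = np.linalg.norm(x - c1, axis=1) < np.linalg.norm(x - c0, axis=1)
accuracy = float((pred == (y == 1)).mean())
```

With only a dozen labels the centroid model already separates the two classes well; the paper's contribution is to make such queries spatially compact, so the interpreter inspects batches instead of scattered pixels.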

  6. The use of mixed-integer programming for inverse treatment planning with pre-defined field segments

    International Nuclear Information System (INIS)

    Bednarz, Greg; Michalski, Darek; Houser, Chris; Huq, M. Saiful; Xiao Ying; Rani, Pramila Anne; Galvin, James M.

    2002-01-01

    Complex intensity patterns generated by traditional beamlet-based inverse treatment plans are often very difficult to deliver. In the approach presented in this work, the intensity maps are controlled by pre-defining the field segments to be used for dose optimization. A set of simple rules was used to define a pool of allowable delivery segments, and the mixed-integer programming (MIP) method was used to optimize segment weights. The optimization problem was formulated by combining real variables describing segment weights with a set of binary variables used to enumerate voxels in targets and critical structures. The MIP method was compared to the previously used Cimmino projection algorithm. The field segmentation approach was compared to an inverse planning system with traditional beamlet-based beam intensity optimization. In four complex cases of oropharyngeal cancer, the segmental inverse planning produced treatment plans that competed with traditional beamlet-based IMRT plans. The mixed-integer programming provided a mechanism for the imposition of dose-volume constraints and allowed for identification of the optimal solution for feasible problems. Additional advantages of the segmental technique presented here are simplified dosimetry, quality assurance and treatment delivery. (author)
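Setting the binary variables and dose-volume constraints aside, the core subproblem (nonnegative weights for pre-defined segments that approximate a prescribed dose) can be sketched with projected gradient descent; this is a deliberate simplification of the paper's mixed-integer formulation:

```python
import numpy as np

def fit_segment_weights(a, target, iters=2000):
    """Projected-gradient nonnegative least squares: w >= 0 with a @ w ~ target.
    a[i, j] = 1 if segment j irradiates voxel i (toy unit fluence)."""
    lr = 1.0 / np.linalg.norm(a, 2) ** 2     # step size from the spectral norm
    w = np.zeros(a.shape[1])
    for _ in range(iters):
        w -= lr * a.T @ (a @ w - target)     # gradient step on the residual
        w = np.maximum(w, 0.0)               # project onto the feasible set w >= 0
    return w

# rows: voxels, columns: pre-defined segments
a = np.array([[1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
target = np.array([2.0, 3.0, 3.0])           # prescribed voxel doses
w = fit_segment_weights(a, target)           # converges to about [1, 2, 1]
```

What the MIP adds on top of this sketch is the binary voxel enumeration that lets dose-volume constraints ("at most k voxels may exceed dose d") be imposed exactly.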

  7. The Facebook influence model: a concept mapping approach.

    Science.gov (United States)

    Moreno, Megan A; Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M

    2013-07-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts.

  8. The Facebook Influence Model: A Concept Mapping Approach

    Science.gov (United States)

    Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M.

    2013-01-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts. PMID:23621717

  9. Mapping cortical mesoscopic networks of single spiking cortical or sub-cortical neurons.

    Science.gov (United States)

    Xiao, Dongsheng; Vanni, Matthieu P; Mitelut, Catalin C; Chan, Allen W; LeDue, Jeffrey M; Xie, Yicheng; Chen, Andrew Cn; Swindale, Nicholas V; Murphy, Timothy H

    2017-02-04

    Understanding the basis of brain function requires knowledge of cortical operations over wide-spatial scales, but also within the context of single neurons. In vivo, wide-field GCaMP imaging and sub-cortical/cortical cellular electrophysiology were used in mice to investigate relationships between spontaneous single neuron spiking and mesoscopic cortical activity. We make use of a rich set of cortical activity motifs that are present in spontaneous activity in anesthetized and awake animals. A mesoscale spike-triggered averaging procedure allowed the identification of motifs that are preferentially linked to individual spiking neurons by employing genetically targeted indicators of neuronal activity. Thalamic neurons predicted and reported specific cycles of wide-scale cortical inhibition/excitation. In contrast, spike-triggered maps derived from single cortical neurons yielded spatio-temporal maps expected for regional cortical consensus function. This approach can define network relationships between any point source of neuronal spiking and mesoscale cortical maps.
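The mesoscale spike-triggered averaging procedure described above can be sketched as follows; `frames`, `spike_times`, and the window bounds are illustrative stand-ins for the paper's actual imaging data, not its implementation.

```python
import numpy as np

def spike_triggered_average(frames, spike_times, window=(-1, 2)):
    """Average imaging frames around each spike to reveal a cortical motif.

    frames: (T, H, W) wide-field imaging movie
    spike_times: frame indices at which the neuron fired
    window: (lo, hi) frame offsets around each spike, hi exclusive
    """
    lo, hi = window
    segments = [frames[t + lo: t + hi]
                for t in spike_times
                if t + lo >= 0 and t + hi <= len(frames)]
    # (hi - lo, H, W): the average cortical map at each offset from a spike
    return np.mean(segments, axis=0)
```

Cortical activity that reliably precedes or follows the cell's spikes survives the averaging, while unrelated activity cancels out.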

  10. Meeting the challenge of mapping peatlands with remotely sensed data

    Directory of Open Access Journals (Sweden)

    O. N. Krankina

    2008-12-01

    Full Text Available Boreal peatlands play a major role in carbon and water cycling and other global environmental processes, but understanding this role is constrained by inconsistent representation of peatlands on, or omission from, many global land cover maps. The comparison of several widely used global and continental-scale databases on peatland distribution with a detailed map for the St. Petersburg region of Russia showed significant under-reporting of peatland area, or even total omission. Analysis of the spatial agreement and disagreement with the detailed regional map indicated that the error of commission (overestimation) was significantly lower than the error of omission (underestimation), which means that, overall, peatlands were correctly classified as such in coarse resolution datasets but a large proportion (74–99%) was overlooked. The coarse map resolution alone caused significant omission of peatlands in the study region. In comparison to categorical maps, a continuous field mapping approach utilizing MODIS sensor data showed potential for a greatly improved representation of peatlands on coarse resolution maps. Analysis of spectral signatures of peatlands with different types of surface vegetation suggested that improved mapping of boreal peatlands on categorical maps is feasible. The lower reflectance of treeless peatlands in the near- and shortwave-infrared parts of the electromagnetic spectrum is consistent with the spectral signature of sphagnum mosses. However, when trees are present, the canopy architecture appears to be more important in defining the overall spectral reflectance of peatlands. A research focus on developing remote sensing methods for boreal peatlands is needed for adequate characterization of their global distribution.

  11. Development of "Long Live Love+," a School-Based Online Sexual Health Programme for Young Adults. An Intervention Mapping Approach

    Science.gov (United States)

    Mevissen, Fraukje E. F.; van Empelen, Pepijn; Watzeels, Anita; van Duin, Gee; Meijer, Suzanne; van Lieshout, Sanne; Kok, Gerjo

    2018-01-01

    This paper describes the development of a Dutch online programme called "Long Live Love+" focusing on positive, coercion-free relationships, contraception use, and the prevention of STIs, using the Intervention Mapping (IM) approach. All six steps of the approach were followed. Step 1 confirmed the need for a sexual health programme…

  12. Reliable Radiation Hybrid Maps: An Efficient Scalable Clustering-based Approach

    Science.gov (United States)

    The process of mapping markers from radiation hybrid mapping (RHM) experiments is equivalent to the traveling salesman problem and, thereby, has combinatorial complexity. As an additional problem, experiments typically result in some unreliable markers that reduce the overall quality of the map. We ...

  13. Mapping the Wolf-Hirschhorn syndrome phenotype outside the currently accepted WHS critical region and defining a new critical region, WHSCR-2.

    Science.gov (United States)

    Zollino, Marcella; Lecce, Rosetta; Fischetto, Rita; Murdolo, Marina; Faravelli, Francesca; Selicorni, Angelo; Buttè, Cinzia; Memo, Luigi; Capovilla, Giuseppe; Neri, Giovanni

    2003-03-01

    In an attempt to define the distinctive Wolf-Hirschhorn syndrome (WHS) phenotype, and to map its specific clinical manifestations, a total of eight patients carrying a 4p16.3 microdeletion were analyzed for their clinical phenotype and their respective genotypes. The extent of each individual deletion was established by fluorescence in situ hybridization, with a cosmid contig spanning the genomic region from MSX1 (distal half of 4p16.1) to the subtelomeric locus D4S3359. The deletions were 1.9-3.5 Mb, and all were terminal. All the patients presented with a mild phenotype, in which major malformations were usually absent. It is worth noting that head circumference was normal for height in two patients (those with the smallest deletions [1.9 and 2.2 Mb]). The currently accepted WHS critical region (WHSCR) was fully preserved in the patient with the 1.9-Mb deletion, in spite of a typical WHS phenotype. The deletion in this patient spanned the chromosome region from D4S3327 (190 b4 cosmid clone included) to the telomere. From a clinical point of view, the distinctive WHS phenotype is defined by the presence of typical facial appearance, mental retardation, growth delay, congenital hypotonia, and seizures. These signs represent the minimal diagnostic criteria for WHS. This basic phenotype maps distal to the currently accepted WHSCR. Here, we propose a new critical region for WHS, and we refer to this region as "WHSCR-2." It falls within a 300-600-kb interval in 4p16.3, between the loci D4S3327 and D4S98-D4S168. Among the candidate genes already described for WHS, LETM1 (leucine zipper/EF-hand-containing transmembrane) is likely to be pathogenetically involved in seizures. On the basis of genotype-phenotype correlation analysis, dividing the WHS phenotype into two distinct clinical entities, a "classical" and a "mild" form, is recommended for the purpose of proper genetic counseling.

  14. Comparing different stimulus configurations for population receptive field mapping in human fMRI

    Directory of Open Access Journals (Sweden)

    Ivan eAlvarez

    2015-02-01

    Full Text Available Population receptive field (pRF) mapping is a widely used approach to measuring aggregate human visual receptive field properties by recording non-invasive signals using functional MRI. Despite growing interest, no study to date has systematically investigated the effects of different stimulus configurations on pRF estimates from human visual cortex. Here we compared the effects of three different stimulus configurations on a model-based approach to pRF estimation: size-invariant bars and eccentricity-scaled bars defined in Cartesian coordinates and traveling along the cardinal axes, and a novel simultaneous ‘wedge and ring’ stimulus defined in polar coordinates, systematically covering polar and eccentricity axes. We found that the presence or absence of eccentricity scaling had a significant effect on goodness of fit and pRF size estimates. Further, variability in pRF size estimates was directly influenced by stimulus configuration, particularly for higher visual areas including V5/MT+. Finally, we compared eccentricity estimation between phase-encoded and model-based pRF approaches. We observed a tendency for more peripheral eccentricity estimates using phase-encoded methods, independent of stimulus size. We conclude that both eccentricity scaling and polar rather than Cartesian stimulus configuration are important considerations for optimal experimental design in pRF mapping. While all stimulus configurations produce adequate estimates, simultaneous wedge and ring stimulation produced higher fit reliability, with a significant advantage in reduced acquisition time.
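A minimal sketch of the model-based pRF estimation the abstract refers to, assuming an isotropic 2D Gaussian pRF and a coarse grid search over its parameters; the stimulus representation, parameter grids, and function names are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def prf_prediction(stim, x0, y0, sigma):
    # stim: (T, H, W) binary stimulus apertures; the pRF is modelled as an
    # isotropic 2D Gaussian with centre (x0, y0) and size sigma (pixels)
    T, H, W = stim.shape
    Y, X = np.mgrid[0:H, 0:W]
    g = np.exp(-((X - x0) ** 2 + (Y - y0) ** 2) / (2 * sigma ** 2))
    g /= g.sum()
    return stim.reshape(T, -1) @ g.ravel()  # stimulus/pRF overlap per frame

def fit_prf(stim, ts):
    # coarse grid search: keep the parameters whose predicted time course
    # correlates best with the measured voxel time series
    best, best_r = None, -np.inf
    T, H, W = stim.shape
    for x0 in range(W):
        for y0 in range(H):
            for sigma in (1.0, 2.0, 4.0):
                pred = prf_prediction(stim, x0, y0, sigma)
                if pred.std() == 0:
                    continue
                r = np.corrcoef(pred, ts)[0, 1]
                if r > best_r:
                    best, best_r = (x0, y0, sigma), r
    return best, best_r
```

In a real analysis the prediction would additionally be convolved with a haemodynamic response function before comparison with the BOLD signal.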

  15. Isthmus sites identified by Ripple Mapping are usually anatomically stable: A novel method to guide atrial substrate ablation?

    Science.gov (United States)

    Luther, Vishal; Qureshi, Norman; Lim, Phang Boon; Koa-Wing, Michael; Jamil-Copley, Shahnaz; Ng, Fu Siong; Whinnett, Zachary; Davies, D Wyn; Peters, Nicholas S; Kanagaratnam, Prapa; Linton, Nick

    2018-03-01

    Postablation reentrant ATs depend upon conducting isthmuses bordered by scar. Bipolar voltage maps highlight scar as sites of low voltage, but the voltage amplitude of an electrogram depends upon the myocardial activation sequence. Furthermore, a voltage threshold that defines atrial scar is unknown. We used Ripple Mapping (RM) to test whether these isthmuses were anatomically fixed between different activation vectors and atrial rates. We studied post-AF ablation ATs where >1 rhythm was mapped. Multipolar catheters were used with CARTO Confidense for high-density mapping. RM visualized the pattern of activation, and the voltage threshold below which no activation was seen. Isthmuses were characterized at this threshold between maps for each patient. Ten patients were studied (Map 1 was AT1; Map 2: sinus 1/10, LA paced 2/10, AT2 with reverse CS activation 3/10; AT2 CL difference 50 ± 30 ms). Point density was similar between maps (Map 1: 2,589 ± 1,330; Map 2: 2,214 ± 1,384; P  =  0.31). RM activation threshold was 0.16 ± 0.08 mV. Thirty-one isthmuses were identified in Map 1 (median 3 per map; width 27 ± 15 mm; 7 anterior; 6 roof; 8 mitral; 9 septal; 1 posterior). Importantly, 7 of 31 (23%) isthmuses were unexpectedly identified within regions without prior ablation. AT1 was treated following ablation of 11/31 (35%) isthmuses. Of the remaining 20 isthmuses, 14 of 16 isthmuses (88%) were consistent between the two maps (four were inadequately mapped). Wavefront collision caused variation in low voltage distribution in 2 of 16 (12%). The distribution of isthmuses and nonconducting tissue within the ablated left atrium, as defined by RM, appear concordant between rhythms. This could guide a substrate ablative approach. © 2018 Wiley Periodicals, Inc.

  16. Mapping in the cloud

    CERN Document Server

    Peterson, Michael P

    2014-01-01

    This engaging text provides a solid introduction to mapmaking in the era of cloud computing. It takes students through both the concepts and technology of modern cartography, geographic information systems (GIS), and Web-based mapping. Conceptual chapters delve into the meaning of maps and how they are developed, covering such topics as map layers, GIS tools, mobile mapping, and map animation. Methods chapters take a learn-by-doing approach to help students master application programming interfaces and build other technical skills for creating maps and making them available on the Internet. Th

  17. A Systematic Approach to Modified BCJR MAP Algorithms for Convolutional Codes

    Directory of Open Access Journals (Sweden)

    Patenaude François

    2006-01-01

    Full Text Available Since Berrou, Glavieux and Thitimajshima published their landmark paper in 1993, different modified BCJR MAP algorithms have appeared in the literature. The existence of a relatively large number of similar but different modified BCJR MAP algorithms, derived using the Markov chain properties of convolutional codes, naturally leads to the following questions. What is the relationship among the different modified BCJR MAP algorithms? What are their relative performance, computational complexities, and memory requirements? In this paper, we answer these questions. We derive systematically four major modified BCJR MAP algorithms from the BCJR MAP algorithm using simple mathematical transformations. The connections between the original and the four modified BCJR MAP algorithms are established. A detailed analysis of the different modified BCJR MAP algorithms shows that they have identical computational complexities and memory requirements. Computer simulations demonstrate that the four modified BCJR MAP algorithms all have identical performance to the BCJR MAP algorithm.
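At the core of the BCJR MAP algorithm is a forward-backward recursion over the code trellis. The toy below runs that recursion on a 2-state chain; the transition matrix, observation likelihoods, and per-step normalization are made-up illustrative choices, not values from the paper.

```python
import numpy as np

# 2-state trellis: A[i, j] = P(s_{t+1} = j | s_t = i)
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
# gamma[t, j] = P(obs_t | s_t = j), the branch/observation likelihoods
gamma = np.array([[0.8, 0.3],
                  [0.6, 0.4],
                  [0.1, 0.9]])
T, S = gamma.shape

alpha = np.zeros((T, S))            # forward recursion
alpha[0] = np.array([0.5, 0.5]) * gamma[0]
for t in range(1, T):
    alpha[t] = (alpha[t - 1] @ A) * gamma[t]
    alpha[t] /= alpha[t].sum()      # normalize to avoid numerical underflow

beta = np.ones((T, S))              # backward recursion
for t in range(T - 2, -1, -1):
    beta[t] = A @ (gamma[t + 1] * beta[t + 1])
    beta[t] /= beta[t].sum()

post = alpha * beta                 # a posteriori state probabilities (APPs)
post /= post.sum(axis=1, keepdims=True)
```

The modified BCJR variants discussed in the paper differ mainly in how these recursions are transformed (e.g. into the log domain), not in the posterior they compute.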

  18. A Probabilistic Approach for Improved Sequence Mapping in Metatranscriptomic Studies

    Science.gov (United States)

    Mapping millions of short DNA sequences a reference genome is a necessary step in many experiments designed to investigate the expression of genes involved in disease resistance. This is a difficult task in which several challenges often arise resulting in a suboptimal mapping. This mapping process ...

  19. Assessment of Above-Ground Biomass of Borneo Forests through a New Data-Fusion Approach Combining Two Pan-Tropical Biomass Maps

    Directory of Open Access Journals (Sweden)

    Andreas Langner

    2015-08-01

    Full Text Available This study investigates how two existing pan-tropical above-ground biomass (AGB) maps (Saatchi 2011, Baccini 2012) can be combined to derive forest ecosystem specific carbon estimates. Several data-fusion models which combine these AGB maps according to their local correlations with independent datasets, such as the spectral bands of SPOT VEGETATION imagery, are analyzed. Indeed, these spectral bands convey information about vegetation type and structure which can be related to biomass values. Our study area is the island of Borneo. The data-fusion models are evaluated against a reference AGB map available for two forest concessions in Sabah. The highest accuracy was achieved by a model which combines the AGB maps according to the mean of the local correlation coefficients calculated over different kernel sizes. Combining the resulting AGB map with a new Borneo land cover map (whose overall accuracy has been estimated at 86.5%) leads to average AGB estimates of 279.8 t/ha and 233.1 t/ha for forests and degraded forests, respectively. Lowland dipterocarp and mangrove forests have the highest and lowest AGB values (305.8 t/ha and 136.5 t/ha, respectively). The AGB of all natural forests amounts to 10.8 Gt, mainly stemming from lowland dipterocarp (66.4%), upper dipterocarp (10.9%) and peat swamp forests (10.2%). Degraded forests account for another 2.1 Gt of AGB. One main advantage of our approach is that, once the best fitting data-fusion model is selected, no further AGB reference dataset is required for implementing the data-fusion process. Furthermore, the local harmonization of AGB datasets leads to more spatially precise maps. This approach can easily be extended to other areas in Southeast Asia which are dominated by lowland dipterocarp forest, and can be repeated when newer or more accurate AGB maps become available.
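The local-correlation fusion idea can be sketched as below: each pixel takes a weighted mean of the two AGB maps, with weights given by the absolute local correlation of each map with an independent covariate raster (standing in for a SPOT VEGETATION band). The window size, fallback rule, and all names are illustrative assumptions, not the paper's actual model.

```python
import numpy as np
from itertools import product

def fuse_maps(agb1, agb2, covariate, k=5):
    """Fuse two AGB rasters pixel-wise, weighting each by its absolute
    local correlation with an independent covariate in a k x k window."""
    H, W = agb1.shape
    r = k // 2
    fused = np.empty((H, W))
    for i, j in product(range(H), range(W)):
        win = (slice(max(i - r, 0), i + r + 1),
               slice(max(j - r, 0), j + r + 1))
        w1 = abs(np.corrcoef(agb1[win].ravel(), covariate[win].ravel())[0, 1])
        w2 = abs(np.corrcoef(agb2[win].ravel(), covariate[win].ravel())[0, 1])
        if np.isnan(w1) or np.isnan(w2) or w1 + w2 == 0:
            fused[i, j] = 0.5 * (agb1[i, j] + agb2[i, j])  # fallback: plain mean
        else:
            fused[i, j] = (w1 * agb1[i, j] + w2 * agb2[i, j]) / (w1 + w2)
    return fused
```

Because the weights are non-negative and sum to one, each fused pixel stays between the two input estimates, so the fusion harmonizes rather than extrapolates.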

  20. Canonical, stable, general mapping using context schemes.

    Science.gov (United States)

    Novak, Adam M; Rosen, Yohei; Haussler, David; Paten, Benedict

    2015-11-15

    Sequence mapping is the cornerstone of modern genomics. However, most existing sequence mapping algorithms are insufficiently general. We introduce context schemes: a method that allows the unambiguous recognition of a reference base in a query sequence by testing the query for substrings from an algorithmically defined set. Context schemes only map when there is a unique best mapping, and define this criterion uniformly for all reference bases. Mappings under context schemes can also be made stable, so that extension of the query string (e.g. by increasing read length) will not alter the mapping of previously mapped positions. Context schemes are general in several senses. They natively support the detection of arbitrary complex, novel rearrangements relative to the reference. They can scale over orders of magnitude in query sequence length. Finally, they are trivially extensible to more complex reference structures, such as graphs, that incorporate additional variation. We demonstrate empirically the existence of high-performance context schemes, and present efficient context scheme mapping algorithms. The software test framework created for this study is available from https://registry.hub.docker.com/u/adamnovak/sequence-graphs/. anovak@soe.ucsc.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
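The uniqueness criterion can be illustrated with a toy scheme in which the context of a query base is simply the k-mer ending at it, and the base maps only when that context occurs exactly once in the reference. The real context schemes are algorithmically defined sets, not fixed-length k-mers; this sketch only shows the "map only when there is a unique best mapping" rule.

```python
def map_base(reference, query, i, k=8):
    """Map query base i to a reference position only if its context
    (here: the k-mer ending at i) occurs exactly once in the reference."""
    if i + 1 < k:
        return None                       # not enough context yet
    ctx = query[i + 1 - k: i + 1]
    hits = [j for j in range(len(reference) - k + 1)
            if reference[j: j + k] == ctx]
    if len(hits) == 1:                    # unique best mapping only
        return hits[0] + k - 1            # reference position of the base
    return None                           # ambiguous or absent: unmapped
```

Because the context here extends only leftward, appending characters to the query (e.g. a longer read) never changes the mapping of earlier positions, mirroring the stability property described in the abstract.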

  1. Possibilities of contactless control of web map applications by sight

    Directory of Open Access Journals (Sweden)

    Rostislav Netek

    2012-03-01

    Full Text Available This paper assesses the possibilities of a new approach to controlling map applications on the screen without using the locomotor system. A project on the usability of eye tracking systems in the geoinformatics and cartographic fields is running at the Department of Geoinformatics at Palacky University. An eye tracking system is a device for measuring eye/gaze positions and eye/gaze movements ("where we are looking"). A number of methods and outputs exist, but the most common are "heat maps" of intensity and/or time. Exactly this method was used in the first part of the project, in which a number of common web map portals were analyzed, especially the distribution of their tools and functions on the screen. The aim of the research is to use heat maps to localize the best distribution of control tools for map movement (the "pan" function), and to analyze how sensitive people are to the placement of control tools across different web pages and platforms. Comparing accurate survey data with personal interpretation and knowledge proved a valuable experience. Based on these results, the next step is the design of "control tools" commanded by the eye-tracking device. Rectangular areas (AOI, areas of interest) were placed on the edges of the map, each with a special function that has a defined time delay. When the user fixates one of these areas, the map automatically moves toward the edge on which the area is located, and the time delay prevents accidental movement. The technology for recording eye movements on the screen makes this possible: if the layout and function controls of the map are properly defined, only these two systems need to be connected. At the moment there is a technical constraint: the movement control solution is based on real-time data transmission between the eye-tracking device output and a converter, and real-time transfer is not supported by every device from SMI (SensoMotoric Instruments). More precisely it is the problem of money, because eye-tracking device and every
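The dwell-delay logic described for the edge AOIs might look like this in code; the coordinate convention, millisecond timestamps, and the 500 ms delay are illustrative assumptions, not values from the project.

```python
def pan_command(gaze_samples, aoi, dwell_ms=500):
    """Return a pan direction once gaze has dwelt inside an AOI long enough.

    gaze_samples: iterable of (timestamp_ms, x, y) screen coordinates
    aoi: (x0, y0, x1, y1, direction) rectangle on the map edge
    """
    x0, y0, x1, y1, direction = aoi
    start = None
    for t, x, y in gaze_samples:
        if x0 <= x <= x1 and y0 <= y <= y1:
            if start is None:
                start = t                 # gaze entered the AOI
            elif t - start >= dwell_ms:
                return direction          # dwelt long enough: trigger pan
        else:
            start = None                  # gaze left the AOI: reset the timer
    return None
```

The reset on leaving the rectangle is what implements the paper's point that the time delay prevents accidental movement from brief glances.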

  2. Improving Students' Creative Thinking and Achievement through the Implementation of Multiple Intelligence Approach with Mind Mapping

    Science.gov (United States)

    Widiana, I. Wayan; Jampel, I. Nyoman

    2016-01-01

    This classroom action research aimed to improve the students' creative thinking and achievement in learning science. It conducted through the implementation of multiple intelligences with mind mapping approach and describing the students' responses. The subjects of this research were the fifth grade students of SD 8 Tianyar Barat, Kubu, and…

  3. Mapping social networks in software process improvement

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte; Nielsen, Peter Axel

    2005-01-01

    Software process improvement in small, agile organizations is often problematic. Model-based approaches seem to overlook these problems. We have been seeking an alternative approach to overcome this through action research. Here we report on a piece of action research from which we developed an approach to map social networks and suggest how it can be used in software process improvement. We applied the mapping approach in a small software company to support the realization of new ways of improving software processes. The mapping approach was found useful in improving social networks, and thus furthers software process improvement.

  4. A Spatiotemporal Indexing Approach for Efficient Processing of Big Array-Based Climate Data with MapReduce

    Science.gov (United States)

    Li, Zhenlong; Hu, Fei; Schnase, John L.; Duffy, Daniel Q.; Lee, Tsengdar; Bowen, Michael K.; Yang, Chaowei

    2016-01-01

    Climate observations and model simulations are producing vast amounts of array-based spatiotemporal data. Efficient processing of these data is essential for assessing global challenges such as climate change, natural disasters, and diseases. This is challenging not only because of the large data volume, but also because of the intrinsic high-dimensional nature of geoscience data. To tackle this challenge, we propose a spatiotemporal indexing approach to efficiently manage and process big climate data with MapReduce in a highly scalable environment. Using this approach, big climate data are directly stored in a Hadoop Distributed File System in its original, native file format. A spatiotemporal index is built to bridge the logical array-based data model and the physical data layout, which enables fast data retrieval when performing spatiotemporal queries. Based on the index, a data-partitioning algorithm is applied to enable MapReduce to achieve high data locality, as well as balancing the workload. The proposed indexing approach is evaluated using the National Aeronautics and Space Administration (NASA) Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. The experimental results show that the index can significantly accelerate querying and processing (a 10-fold speedup compared to the baseline test using the same computing cluster), while keeping the index-to-data ratio small (0.0328). The applicability of the indexing approach is demonstrated by a climate anomaly detection deployed on a NASA Hadoop cluster. This approach is also able to support efficient processing of general array-based spatiotemporal data in various geoscience domains without special configuration on a Hadoop cluster.
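The idea of a lightweight index that bridges the logical array model and the physical file layout can be sketched as a grid-keyed lookup table. The key granularity and record fields below are illustrative assumptions, not MERRA's actual layout or the paper's index format.

```python
from collections import defaultdict

def build_index(records, cell=10.0):
    """Build a spatiotemporal index over raw files left in native format.

    records: (file_id, byte_offset, time_key, lat, lon) tuples; the key
    bridges the logical (time, space) view and the physical (file, offset)
    layout, so a query never scans whole files.
    """
    index = defaultdict(list)
    for file_id, offset, time_key, lat, lon in records:
        key = (time_key, int(lat // cell), int(lon // cell))
        index[key].append((file_id, offset))
    return index

def query(index, time_key, lat, lon, cell=10.0):
    """Return the (file, offset) pairs covering one space-time cell."""
    return index.get((time_key, int(lat // cell), int(lon // cell)), [])
```

In the MapReduce setting, grouping records by such keys is also what lets the partitioner co-locate map tasks with the data blocks they read.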

  5. Defining electricity markets. An arbitrage cost approach

    International Nuclear Information System (INIS)

    Kleit, A.N.

    2001-01-01

    Market definition is a crucial component of antitrust policy. There is, however, no universally accepted method of carrying out market definition. While several approaches have been presented in the literature, each has its share of drawbacks. This paper suggests that a modeling technique based upon the theory of arbitrage is well suited to answering this question. After the empirical approach is presented, it is used to calculate antitrust market definitions between electricity hubs in the American West

  6. Mapping small molecule binding data to structural domains.

    Science.gov (United States)

    Kruger, Felix A; Rostom, Raghd; Overington, John P

    2012-01-01

    Large-scale bioactivity/SAR Open Data has recently become available, and this has allowed new analyses and approaches to be developed to help address the productivity and translational gaps of current drug discovery. One of the current limitations of these data is the relative sparsity of reported interactions per protein target, and complexities in establishing clear relationships between bioactivity and targets using bioinformatics tools. We detail in this paper the indexing of targets by the structural domains that bind (or are likely to bind) the ligand within a full-length protein. Specifically, we present a simple heuristic to map small molecule binding to Pfam domains. This profiling can be applied to all proteins within a genome to give some indications of the potential pharmacological modulation and regulation of all proteins. In this implementation of our heuristic, ligand binding to protein targets from the ChEMBL database was mapped to structural domains as defined by profiles contained within the Pfam-A database. Our mapping suggests that the majority of assay targets within the current version of the ChEMBL database bind ligands through a small number of highly prevalent domains, and conversely the majority of Pfam domains sampled by our data play no currently established role in ligand binding. Validation studies, carried out firstly against Uniprot entries with expert binding-site annotation and secondly against entries in the wwPDB repository of crystallographic protein structures, demonstrate that our simple heuristic maps ligand binding to the correct domain in about 90 percent of all assessed cases. Using the mappings obtained with our heuristic, we have assembled ligand sets associated with each Pfam domain. Small molecule binding has been mapped to Pfam-A domains of protein targets in the ChEMBL bioactivity database. The result of this mapping is an enriched annotation of small molecule bioactivity data and a grouping of activity classes

  7. Applicability of vulnerability maps

    International Nuclear Information System (INIS)

    Andersen, L.J.; Gosk, E.

    1989-01-01

    A number of aspects to vulnerability maps are discussed: the vulnerability concept, mapping purposes, possible users, and applicability of vulnerability maps. Problems associated with general-type vulnerability mapping, including large-scale maps, universal pollutant, and universal pollution scenario are also discussed. An alternative approach to vulnerability assessment - specific vulnerability mapping for limited areas, specific pollutant, and predefined pollution scenario - is suggested. A simplification of the vulnerability concept is proposed in order to make vulnerability mapping more objective and by this means more comparable. An extension of the vulnerability concept to the rest of the hydrogeological cycle (lakes, rivers, and the sea) is proposed. Some recommendations regarding future activities are given

  8. Mapping of wine industry

    OpenAIRE

    Віліна Пересадько; Надія Максименко; Катерина Біла

    2016-01-01

    Having reviewed a variety of approaches to understanding the essence of the wine industry, having studied modern ideas about its future, and having analyzed more than 50 maps from the Internet, we have identified the following trends and special features of wine industry mapping in the world:
    - the vast majority of maps display the development of the industry at the regional or national level, whereas there are practically no world maps;
    - wine-growing regions are represented on maps very un...

  9. New Angle on the Parton Distribution Functions: Self-Organizing Maps

    International Nuclear Information System (INIS)

    Honkanen, H.; Liuti, S.

    2009-01-01

    Neural network (NN) algorithms have been recently applied to construct Parton Distribution Function (PDF) parametrizations, providing an alternative to standard global fitting procedures. Here we explore a novel technique using Self-Organizing Maps (SOMs). SOMs are a class of clustering algorithms based on competitive learning among spatially-ordered neurons. We train our SOMs with stochastically generated PDF samples. On every optimization iteration the PDFs are clustered on the SOM according to a user-defined feature and the most promising candidates are used as a seed for the subsequent iteration, using the topology of the map to guide the PDF generating process. Our goal is a fitting procedure that, at variance with the standard neural network approaches, will allow for an increased control of the systematic bias by enabling user interaction in the various stages of the process.
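A minimal SOM training loop of the kind described, in plain NumPy: on each iteration the best-matching neuron and its topological neighbours are pulled toward a sample. The grid size, learning-rate, and neighbourhood schedules are illustrative choices, not those of the paper.

```python
import numpy as np

def train_som(data, grid=(5, 5), iters=500, lr0=0.5, sigma0=2.0, seed=0):
    """Train a self-organizing map on row-vector samples in `data`."""
    rng = np.random.default_rng(seed)
    H, W = grid
    weights = rng.random((H, W, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(H), np.arange(W),
                                  indexing="ij"), axis=-1)
    for it in range(iters):
        x = data[rng.integers(len(data))]
        # best-matching unit: the neuron whose weight vector is closest to x
        d = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(d), (H, W))
        frac = it / iters
        lr = lr0 * (1 - frac)                    # decaying learning rate
        sigma = sigma0 * (1 - frac) + 1e-3       # shrinking neighbourhood
        # Gaussian neighbourhood: competitive learning among spatially
        # ordered neurons, the defining feature of a SOM
        dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
        h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
        weights += lr * h * (x - weights)
    return weights
```

After training, clustering candidate PDFs on such a map (and seeding the next iteration from the most promising cells) is the user-steered step the abstract describes.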

  10. Creating a conceptual hydrological soil response map for the ...

    African Journals Online (AJOL)

    2014-03-03

    Mar 3, 2014 ... a digital soil mapping (DSM) approach to soil mapping can speed up the mapping process and thereby extend soil map use in the field of ... This research uses an expert-knowledge DSM approach to create a soil map for Stevenson Hamilton .... the different bands of the Landsat and SPOT 5 images.

  11. Creating a conceptual hydrological soil response map for the ...

    African Journals Online (AJOL)

    The use of a digital soil mapping (DSM) approach to soil mapping can speed up the mapping process and thereby extend soil map use in the field of hydrology. This research uses an expert-knowledge DSM approach to create a soil map for Stevenson Hamilton Research Supersite within the Kruger National Park, South ...

  12. Mapping Trends in Pedagogical Approaches and Learning Technologies: Perspectives from the Canadian, International, and Military Education Contexts

    Science.gov (United States)

    Scoppio, Grazia; Covell, Leigha

    2016-01-01

    Increased technological advances, coupled with new learners' needs, have created new realities for higher education contexts. This study explored and mapped trends in pedagogical approaches and learning technologies in postsecondary education and identified how these innovations are affecting teaching and learning practices in higher education…

  13. Re-evaluation of a novel approach for quantitative myocardial oedema detection by analysing tissue inhomogeneity in acute myocarditis using T2-mapping.

    Science.gov (United States)

    Baeßler, Bettina; Schaarschmidt, Frank; Treutlein, Melanie; Stehning, Christian; Schnackenburg, Bernhard; Michels, Guido; Maintz, David; Bunck, Alexander C

    2017-12-01

    To re-evaluate a recently suggested approach of quantifying myocardial oedema and increased tissue inhomogeneity in myocarditis by T2-mapping. Cardiac magnetic resonance data of 99 patients with myocarditis were retrospectively analysed. Thirty healthy volunteers served as controls. T2-mapping data were acquired at 1.5 T using a gradient-spin-echo T2-mapping sequence. T2-maps were segmented according to the 16-segment AHA model. Segmental T2-values, segmental pixel-standard deviation (SD) and the derived parameters maxT2, maxSD and madSD were analysed and compared to the established Lake Louise criteria (LLC). A re-estimation of logistic regression models revealed that all models containing an SD-parameter were superior to any model containing global myocardial T2. Using a combined cut-off of 1.8 ms for madSD + 68 ms for maxT2 resulted in a diagnostic sensitivity of 75% and specificity of 80% and showed a similar diagnostic performance compared to LLC in receiver-operating-curve analyses. Combining madSD, maxT2 and late gadolinium enhancement (LGE) in a model resulted in a superior diagnostic performance compared to LLC (sensitivity 93%, specificity 83%). The results show that the novel T2-mapping-derived parameters exhibit an additional diagnostic value over LGE with the inherent potential to overcome the current limitations of T2-mapping. • A novel quantitative approach to myocardial oedema imaging in myocarditis was re-evaluated. • The T2-mapping-derived parameters maxT2 and madSD were compared to traditional Lake-Louise criteria. • Using maxT2 and madSD with dedicated cut-offs performs similarly to Lake-Louise criteria. • Adding maxT2 and madSD to LGE results in further increased diagnostic performance. • This novel approach has the potential to overcome the limitations of T2-mapping.

  14. Method and system for a network mapping service

    Science.gov (United States)

    Bynum, Leo

    2017-10-17

    A method and system of publishing a map includes providing access to a plurality of map data files or mapping services between at least one publisher and at least one subscriber; defining a map in a map context comprising parameters and descriptors to substantially duplicate a map by reference to mutually accessible data or mapping services; publishing the map to a channel in a table file on a server; accessing the channel by at least one subscriber; transmitting the map context from the server to the at least one subscriber; executing the map context by the at least one subscriber; and generating the map on display software associated with the at least one subscriber by reconstituting the map from the references and other data in the map context.
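The publish/subscribe flow in the abstract can be sketched as below. The context fields (layers, extent, services) and the in-memory channel table are illustrative assumptions; the patent defines its own schema and server-side table file.

```python
import json

channels = {}  # stand-in for the server's channel table file

def publish(channel, layers, extent, services):
    """Publish a map by reference: only the context is stored, not the data."""
    context = {"layers": layers, "extent": extent, "services": services}
    channels[channel] = json.dumps(context)

def subscribe(channel):
    """Fetch and decode the map context from a channel.

    A real subscriber would then resolve each referenced service and
    reconstitute the map in its own display software.
    """
    return json.loads(channels[channel])
```

Because only parameters and references travel over the channel, publisher and subscriber must share access to the underlying data or mapping services, exactly as the abstract requires.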

  15. Circle diffeomorphisms forced by expanding circle maps

    NARCIS (Netherlands)

    Homburg, A.J.

    2012-01-01

    We discuss the dynamics of skew product maps defined by circle diffeomorphisms forced by expanding circle maps. We construct an open class of such systems that are robustly topologically mixing and for which almost all points in the same fiber converge under iteration. This property follows from the

  16. Quantitative Architectural Analysis: A New Approach to Cortical Mapping

    Science.gov (United States)

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-01-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological…

  17. A dominance-based approach to map risks of ecological invasions in the presence of severe uncertainty

    Science.gov (United States)

    Denys Yemshanov; Frank H. Koch; D. Barry Lyons; Mark Ducey; Klaus Koehler

    2012-01-01

    Aim Uncertainty has been widely recognized as one of the most critical issues in predicting the expansion of ecological invasions. The uncertainty associated with the introduction and spread of invasive organisms influences how pest management decision makers respond to expanding incursions. We present a model-based approach to map risk of ecological invasions that...

  18. An Alternative Approach to Mapping Thermophysical Units from Martian Thermal Inertia and Albedo Data Using a Combination of Unsupervised Classification Techniques

    Directory of Open Access Journals (Sweden)

    Eriita Jones

    2014-06-01

    Full Text Available Thermal inertia and albedo provide information on the distribution of surface materials on Mars. These parameters have been mapped globally on Mars by the Thermal Emission Spectrometer (TES) onboard the Mars Global Surveyor. Two-dimensional clusters of thermal inertia and albedo reflect the thermophysical attributes of the dominant materials on the surface. In this paper three automated, non-deterministic, algorithmic classification methods are employed for defining thermophysical units: Expectation Maximisation of a Gaussian Mixture Model; Iterative Self-Organizing Data Analysis Technique (ISODATA); and Maximum Likelihood. We analyse the behaviour of the thermophysical classes resulting from the three classifiers, operating on the 2007 TES thermal inertia and albedo datasets. Producing a rigorous mapping of thermophysical classes at ~3 km/pixel resolution remains important for constraining the geologic processes that have shaped the Martian surface on a regional scale, and for choosing appropriate landing sites. The results from applying these algorithms are compared to geologic maps, surface data from lander missions, features derived from imaging, and previous classifications of thermophysical units which utilized manual (and potentially more time-consuming) classification methods. These comparisons comprise data suitable for validation of our classifications. Our work shows that a combination of the algorithms (ISODATA and Maximum Likelihood) optimises the sensitivity to the underlying dataspace, and that new information on Martian surface materials can be obtained by using these methods. We demonstrate that the algorithms used here can be applied to define a finer partitioning of albedo and thermal inertia for a more detailed mapping of surface materials, grain sizes and thermal behaviour of the Martian surface and shallow subsurface, at the ~3 km scale.
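
    As an illustration of clustering thermal inertia/albedo pairs into thermophysical units, the sketch below uses plain k-means as a minimal stand-in for the ISODATA and Maximum Likelihood classifiers named above; the two synthetic "dusty" and "rocky" units and all parameter values are invented for the example.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on 2-D (thermal inertia, albedo) points; a simplified
    stand-in for the ISODATA / maximum-likelihood classifiers."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                  + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        # recompute centers as cluster means (keep old center if empty)
        centers = [(sum(p[0] for p in cl) / len(cl),
                    sum(p[1] for p in cl) / len(cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Two synthetic thermophysical units: dusty terrain (low inertia, high
# albedo) and rocky terrain (high inertia, low albedo); units illustrative.
rng = random.Random(1)
dusty = [(rng.gauss(80, 10), rng.gauss(0.27, 0.02)) for _ in range(200)]
rocky = [(rng.gauss(400, 40), rng.gauss(0.12, 0.02)) for _ in range(200)]
centers, clusters = kmeans(dusty + rocky, k=2)
```

    Real ISODATA additionally splits and merges clusters during iteration, which is what makes it attractive when the number of thermophysical units is not known in advance.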

  19. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    Science.gov (United States)

    Dhindsa, Harkirat S.; Makarimi-Kasim; Roger Anderson, O.

    2011-04-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample of the study consisted of six classes (140 Form 3 students aged 13-15 years) selected from a typical coeducational school in Brunei. Three classes (40 boys and 30 girls) were taught using the TTA while three other classes (41 boys and 29 girls) used the CMA, enriched with PowerPoint presentations. After the interventions (lessons on magnetism), the students in both groups were asked to describe in writing their understanding of magnetism accrued from the lessons. Their written descriptions were analyzed using flow map analyses to assess their content knowledge and its organisation in memory as evidence of cognitive structure. The extent of CLE was measured using a published CLE survey. The results showed that the cognitive structures of the CMA students were more extensive, thematically organised and richer in interconnectedness of thoughts than those of TTA students. Moreover, CMA students perceived their classroom learning environment to be more constructivist than their counterparts. It is, therefore, recommended that teachers consider using the CMA teaching technique to help students enrich their understanding, especially for more complex or abstract scientific content.

  20. Fingerprinting Software Defined Networks and Controllers

    Science.gov (United States)

    2015-03-01

    rps (requests per second), RTT (Round-Trip Time), SDN (Software Defined Networking), SOM (Self-Organizing Map), STP (Spanning Tree Protocol), TRW-CB (Threshold Random...) ... Protocol (STP) updates), in which case the frame will be “punted” from the forwarding lookup process and processed by the route processor [9]. The act of... environment 20 to accomplish the needs of B4. In addition to Google, the SDN market is expected to grow beyond $35 billion by April 2018 [31]. The rate

  1. A Bayesian Network Approach to Ontology Mapping

    National Research Council Canada - National Science Library

    Pan, Rong; Ding, Zhongli; Yu, Yang; Peng, Yun

    2005-01-01

    This paper presents our ongoing effort on developing a principled methodology for automatic ontology mapping based on BayesOWL, a probabilistic framework we developed for modeling uncertainty in semantic web...

  2. MAP-MRF-Based Super-Resolution Reconstruction Approach for Coded Aperture Compressive Temporal Imaging

    Directory of Open Access Journals (Sweden)

    Tinghua Zhang

    2018-02-01

    Full Text Available Coded Aperture Compressive Temporal Imaging (CACTI) can afford low-cost temporal super-resolution (SR), but noise and the compression ratio impose limits on reconstruction quality. To utilize inter-frame redundant information from multiple observations and sparsity in multi-transform domains, a robust reconstruction approach based on a maximum a posteriori probability and Markov random field (MAP-MRF) model for CACTI is proposed. The proposed approach adopts a weighted 3D neighbor system (WNS) and the coordinate descent method to perform joint estimation of model parameters, to achieve robust super-resolution reconstruction. The proposed multi-reconstruction algorithm considers both total variation (TV) and the ℓ2,1 norm in the wavelet domain to address the minimization problem for compressive sensing, and solves it using an accelerated generalized alternating projection algorithm. The weighting coefficients for different regularizations and frames are resolved from the motion characteristics of pixels. The proposed approach can provide high visual quality in the foreground and background of a scene simultaneously and enhance the fidelity of the reconstruction results. Simulation results have verified the efficacy of our new optimization framework and the proposed reconstruction approach.

  3. High-definition mapping of retroviral integration sites defines the fate of allogeneic T cells after donor lymphocyte infusion.

    Directory of Open Access Journals (Sweden)

    Claudia Cattoglio

    2010-12-01

    Full Text Available The infusion of donor lymphocytes transduced with a retroviral vector expressing the HSV-TK suicide gene in patients undergoing hematopoietic stem cell transplantation for leukemia/lymphoma promotes immune reconstitution and prevents infections and graft-versus-host disease. Analysis of the clonal dynamics of genetically modified lymphocytes in vivo is of crucial importance to understand the potential genotoxic risk of this therapeutic approach. We used linear amplification-mediated PCR and pyrosequencing to build a genome-wide, high-definition map of retroviral integration sites in the genome of peripheral blood T cells from two different donors and used gene expression profiling and bioinformatics to associate integration clusters to transcriptional activity and to genetic and epigenetic features of the T cell genome. Comparison with matched random controls and with integrations obtained from CD34(+) hematopoietic stem/progenitor cells showed that integration clusters occur within chromatin regions bearing epigenetic marks associated with active promoters and regulatory elements in a cell-specific fashion. Analysis of integration sites in T cells obtained ex vivo two months after infusion showed no evidence of integration-related clonal expansion or dominance, but rather loss of cells harboring integration events interfering with RNA post-transcriptional processing. The study shows that high-definition maps of retroviral integration sites are a powerful tool to analyze the fate of genetically modified T cells in patients and the biological consequences of retroviral transduction.

  4. Atlas of the Colombian coal, Potential map and rank: Map 5-09

    International Nuclear Information System (INIS)

    Pulido Gonzalez, Orlando

    1999-01-01

    With the presentation of the Atlas of Coal at a scale of 1:500,000, the aim is to show, in broad outline, the location of the different coal-bearing areas in Colombia, associating them with the geologic units, the potential and the rank. In Map 5-09, the coal-bearing formations are defined as Umir, Guaduas, Limbo (Los Cuervos), San Fernando (Carbonera), labelled Kst, Ksgt and Pgt. For the potential, an arbitrary scale was established as follows: first, greater than 1000 million tons; between 1000 and 100; between 100 and 10; and lastly smaller than 10 million tons. These figures are represented on the map by coloured triangles corresponding to the figures mentioned above. Given the scale, it was decided to report the potential in the category of hypothetical resources; where resources or reserves have been established, they are also reported. As for the rank, it is indicated on the map by symbols that should be taken as the general domain or tendency of the coal in each area. The coal rank, understood as the transformation the coal has reached along its geologic evolution, is what is reported as anthracitic coal, semi-anthracitic, low-volatile bituminous, medium-volatile bituminous, high-volatile bituminous A, B and C, sub-bituminous and, lastly, lignite. For each map it is mentioned that there are determined

  5. Transition to Coherence in Populations of Coupled Chaotic Oscillators: A Linear Response Approach

    International Nuclear Information System (INIS)

    Topaj, Dmitri; Kye, Won-Ho; Pikovsky, Arkady

    2001-01-01

    We consider the collective dynamics in an ensemble of globally coupled chaotic maps. The transition to the coherent state with a macroscopic mean field is analyzed in the framework of linear response theory. The linear response function for the chaotic system is obtained using a perturbation approach to the Frobenius-Perron operator. The transition point is defined from this function by virtue of the self-excitation condition for the feedback loop. Analytical results for the coupled Bernoulli maps are confirmed numerically.
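
    A minimal numerical sketch of an ensemble of globally coupled chaotic maps follows. The doubling (Bernoulli) map and the mean-field coupling form used here are illustrative assumptions, not necessarily the exact scheme of the paper.

```python
import random

def bernoulli(x):
    """Doubling (Bernoulli) map on the unit interval."""
    return (2.0 * x) % 1.0

def step(xs, eps):
    """One update of N globally coupled maps:
    x_i <- (1 - eps) * f(x_i) + eps * mean_j f(x_j).
    (An illustrative global coupling; the paper's form may differ.)"""
    fx = [bernoulli(x) for x in xs]
    m = sum(fx) / len(fx)
    return [(1.0 - eps) * v + eps * m for v in fx]

random.seed(0)
xs = [random.random() for _ in range(1000)]
for _ in range(200):
    xs = step(xs, eps=0.3)
mean_field = sum(xs) / len(xs)
```

    Tracking `mean_field` over time as the coupling strength `eps` is varied is the numerical counterpart of locating the transition to coherence that the linear response analysis predicts.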

  6. Unified framework for recognition, localization and mapping using wearable cameras.

    Science.gov (United States)

    Vázquez-Martín, Ricardo; Bandera, Antonio

    2012-08-01

    Monocular approaches to simultaneous localization and mapping (SLAM) have recently addressed with success the challenging problem of fast computation of dense reconstructions from a single, moving camera. While these approaches initially relied on the detection of a reduced set of interest points to estimate the camera position and the map, they are currently able to reconstruct dense maps from a handheld camera while the camera coordinates are simultaneously computed. However, these maps of 3-dimensional points usually remain meaningless, that is, with no memorable items and without providing a way of encoding spatial relationships between objects and paths. In humans and in mobile robotics, landmarks play a key role in the internalization of a spatial representation of an environment. They are memorable cues that can serve to define a region of the space or the location of other objects. In a topological representation of the space, landmarks can be identified and located according to their structural, perceptive or semantic significance and distinctiveness. On the other hand, landmarks may be difficult to locate in a metric representation of the space. Restricted to the domain of visual landmarks, this work describes an approach where the map resulting from a point-based, monocular SLAM is annotated with the semantic information provided by a set of distinguished landmarks. Both features are obtained from the image. Hence, they can be linked by associating to each landmark all those point-based features that are superimposed on the landmark in a given image (key-frame). Visual landmarks will be obtained by means of an object-based, bottom-up attention mechanism, which will extract from the image a set of proto-objects. These proto-objects cannot always be associated with natural objects, but they will typically constitute significant parts of these scene objects and can be appropriately annotated with semantic information. Moreover, they will be

  7. The Effect of Concept Mapping-Guided Discovery Integrated Teaching Approach on Chemistry Students' Achievement and Retention

    Science.gov (United States)

    Fatokun, K. V. F.; Eniayeju, P. A.

    2014-01-01

    This study investigates the effects of Concept Mapping-Guided Discovery Integrated Teaching Approach on the achievement and retention of chemistry students. The sample comprised 162 Senior Secondary two (SS 2) students drawn from two Science Schools in Nasarawa State, Central Nigeria with equivalent mean scores of 9.68 and 9.49 in their pre-test.…

  8. Recursive definition of global cellular-automata mappings

    DEFF Research Database (Denmark)

    Feldberg, Rasmus; Knudsen, Carsten; Rasmussen, Steen

    1994-01-01

    A method for a recursive definition of global cellular-automata mappings is presented. The method is based on a graphical representation of global cellular-automata mappings. For a given cellular-automaton rule the recursive algorithm defines the change of the global cellular-automaton mapping ... as the number of lattice sites is incremented. A proof of lattice size invariance of global cellular-automata mappings is derived from an approximation to the exact recursive definition. The recursive definitions are applied to calculate the fractal dimension of the set of reachable states and of the set...
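
    The notion of a global cellular-automaton mapping, and of its set of reachable states, can be illustrated with a short sketch; elementary rule 110 on a small cyclic lattice is chosen here only for illustration.

```python
def global_map(rule, config):
    """Apply an elementary CA rule (0-255, Wolfram numbering) to a
    cyclic lattice: one step of the global mapping on configurations."""
    n = len(config)
    # bit i of the rule number gives the output for neighborhood value i,
    # where the neighborhood (left, center, right) is read as a 3-bit number
    table = [(rule >> i) & 1 for i in range(8)]
    return [table[(config[(i - 1) % n] << 2)
                  | (config[i] << 1)
                  | config[(i + 1) % n]]
            for i in range(n)]

# Reachable states of rule 110 on a 10-site lattice: iterate the global
# map over all 2^10 configurations and collect the images.
n = 10
images = set()
for c in range(2 ** n):
    config = [(c >> i) & 1 for i in range(n)]
    images.add(tuple(global_map(110, config)))
```

    Configurations outside `images` are the Garden-of-Eden states; how the count of reachable states scales with the lattice size is what the fractal-dimension calculation in the abstract quantifies.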

  9. Defining datasets and creating data dictionaries for quality improvement and research in chronic disease using routinely collected data: an ontology-driven approach

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2011-06-01

    Conclusion Adopting an ontology-driven approach to case finding could improve the quality of disease registers and of research based on routine data. It would offer considerable advantages over using limited datasets to define cases. This approach should be considered by those involved in research and quality improvement projects which utilise routine data.

  10. Map updates in a dynamic Voronoi data structure

    DEFF Research Database (Denmark)

    Mioc, Darka; Antón Castro, Francesc/François; Gold, C. M.

    2006-01-01

    In this paper we are using local and sequential map updates in the Voronoi data structure, which allows us to automatically record each event and performed map updates within the system. These map updates are executed through map construction commands that are composed of atomic actions (geometric...... algorithms for addition, deletion, and motion of spatial objects) on the dynamic Voronoi data structure. The formalization of map commands led to the development of a spatial language comprising a set of atomic operations or constructs on spatial primitives (points and lines), powerful enough to define...

  11. A Systematic Mapping on Supporting Approaches for Requirements Traceability in the Context of Software Projects

    Directory of Open Access Journals (Sweden)

    MALCHER, P R.C.

    2015-12-01

    Full Text Available Requirements traceability is seen as a quality factor in software development, being present in standards and quality models. In this context, several techniques, models, frameworks and tools have been used to support it. The purpose of this paper is to present a systematic mapping study carried out to find, in the literature, approaches that support requirements traceability in the context of software projects, and to categorize the data found in order to demonstrate, by means of a reliable, accurate and auditable method, how this area has developed and which main approaches are used to implement it.

  12. A new experimental approach to study the stability of logistic map

    International Nuclear Information System (INIS)

    Rani, Mamta; Agarwal, Rashi

    2009-01-01

    Remarkably benign looking logistic transformations x_{n+1} = r x_n(1 - x_n), with x_0 chosen between 0 and 1 and 0 < r ≤ 4, have found a celebrated place in chaos, fractals and discrete dynamics. The purpose of this paper is to enhance the capabilities of the logistic map via superior iterations. The stability of the logistic map has been studied by running computer programs. The logistic map is stable for 0 < r ≤ 3.2 in the Picard orbit. In the superior orbit, the range of stability of the logistic map increases drastically. Also, the chaotic behavior of the logistic map disappears in certain cases.
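
    The effect described above can be reproduced with a short experiment. Here the "superior" orbit is taken to be the Mann iteration x_{n+1} = (1 - β)x_n + β f(x_n), an assumption consistent with the superior-iteration literature, with illustrative values r = 3.4 and β = 0.5.

```python
def logistic(r, x):
    """One application of the logistic transformation f(x) = r*x*(1 - x)."""
    return r * x * (1.0 - x)

def orbit(r, x0, beta, n):
    """Iterate x <- (1 - beta)*x + beta*f(x).  beta = 1 gives the usual
    Picard orbit; beta < 1 is the 'superior' (Mann) iteration assumed
    here as the stability-extending scheme."""
    x = x0
    for _ in range(n):
        x = (1.0 - beta) * x + beta * logistic(r, x)
    return x

# At r = 3.4 the Picard orbit settles onto a period-2 cycle (two distinct
# consecutive iterates), while the superior orbit with beta = 0.5
# converges to the fixed point x* = 1 - 1/r.
picard_tail = {round(orbit(3.4, 0.3, 1.0, n), 6) for n in (4000, 4001)}
mann_tail = {round(orbit(3.4, 0.3, 0.5, n), 6) for n in (4000, 4001)}
```

    The averaging does not move the fixed point (x = (1-β)x + βf(x) holds exactly when x = f(x)); it only shrinks the magnitude of the multiplier at the fixed point, which is why the stability range grows.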

  13. Geospatial approach in mapping soil erodibility using CartoDEM – A ...

    Indian Academy of Sciences (India)

    unscientific management practices followed in the hilly regions. ... country. In the absence of a large-scale or detailed map, researchers use the small-scale soil map prepared ... tural development. ... mapping: An introductory perspective; Dev.

  14. A two-stage approach for improved prediction of residue contact maps

    Directory of Open Access Journals (Sweden)

    Pollastri Gianluca

    2006-03-01

    Full Text Available Abstract Background Protein topology representations such as residue contact maps are an important intermediate step towards ab initio prediction of protein structure. Although improvements have occurred over the last years, the problem of accurately predicting residue contact maps from primary sequences is still largely unsolved. Among the reasons for this are the unbalanced nature of the problem (with far fewer examples of contacts than non-contacts), the formidable challenge of capturing long-range interactions in the maps, and the intrinsic difficulty of mapping one-dimensional input sequences into two-dimensional output maps. In order to alleviate these problems and achieve improved contact map predictions, in this paper we split the task into two stages: the prediction of a map's principal eigenvector (PE) from the primary sequence, and the reconstruction of the contact map from the PE and primary sequence. Predicting the PE from the primary sequence consists in mapping a vector into a vector. This task is less complex than mapping vectors directly into two-dimensional matrices, since the size of the problem is drastically reduced and so is the scale length of interactions that need to be learned. Results We develop architectures composed of ensembles of two-layered bidirectional recurrent neural networks to classify the components of the PE in 2, 3 and 4 classes from protein primary sequence, predicted secondary structure, and hydrophobicity interaction scales. Our predictor, tested on a non-redundant set of 2171 proteins, achieves classification performances of up to 72.6%, 16% above a base-line statistical predictor. We design a system for the prediction of contact maps from the predicted PE. Our results show that predicting maps through the PE yields sizeable gains, especially for long-range contacts, which are particularly critical for accurate protein 3D reconstruction. The final predictor's accuracy on a non-redundant set of 327 targets is 35
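
    Since the approach hinges on a map's principal eigenvector, a minimal sketch of extracting the PE from a toy contact map via power iteration may help; the 5-residue map below is invented for illustration.

```python
def principal_eigenvector(m, iters=500):
    """Power iteration for the principal eigenvector of a symmetric
    non-negative matrix such as a residue contact map."""
    n = len(m)
    v = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy 5-residue contact map (symmetric; 1 = contact, incl. self-contacts).
cmap = [[1, 1, 0, 0, 1],
        [1, 1, 1, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 1, 1, 1],
        [1, 0, 0, 1, 1]]
pe = principal_eigenvector(cmap)
```

    By the Perron-Frobenius theorem the PE of a non-negative irreducible map can be taken entry-wise positive, which is what makes classifying its components into a few magnitude classes, as the paper does, well defined.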

  15. Developing clinical strength-of-evidence approach to define HIV-associated malignancies for cancer registration in Kenya.

    Science.gov (United States)

    Korir, Anne; Mauti, Nathan; Moats, Pamela; Gurka, Matthew J; Mutuma, Geoffrey; Metheny, Christine; Mwamba, Peter M; Oyiro, Peter O; Fisher, Melanie; Ayers, Leona W; Rochford, Rosemary; Mwanda, Walter O; Remick, Scot C

    2014-01-01

    Sub-Saharan Africa cancer registries are beset by an increasing cancer burden further exacerbated by the AIDS epidemic, where there are limited capabilities for cancer-AIDS match co-registration. We undertook a pilot study based on a "strength-of-evidence" approach using clinical data abstracted at the time of cancer registration for purposes of linking cancer diagnosis to AIDS diagnosis. The standard Nairobi Cancer Registry form was modified for registrars to abstract the following clinical data from medical records regarding HIV infection/AIDS in a hierarchical approach at the time of cancer registration, from highest to lowest strength-of-evidence: 1) documentation of positive HIV serology; 2) antiretroviral drug prescription; 3) CD4+ lymphocyte count; and 4) WHO HIV clinical stage or immune suppression syndrome (ISS), which is Kenyan terminology for AIDS. Between August 1 and October 31, 2011, a total of 1,200 cancer cases were registered. Of these, 171 cases (14.3%) met clinical strength-of-evidence criteria for association with HIV infection/AIDS; 69% (118 cases) were tumor types with known HIV association (Kaposi's sarcoma, cervical cancer, non-Hodgkin's and Hodgkin's lymphoma, and conjunctiva carcinoma) and 31% (53) were consistent with non-AIDS-defining cancers. Verifiable positive HIV serology was identified in 47 (27%) cases, for an absolute seroprevalence rate of 4% among the registered cancer cases, with an upper boundary of 14% among those meeting at least one of the strength-of-evidence criteria. This pilot demonstration of a hierarchical, clinical strength-of-evidence approach for cancer-AIDS registration in Kenya establishes feasibility; it is readily adaptable, pragmatic, and does not require additional resources for critically understaffed cancer registries. Cancer is an emerging public health challenge, and African nations need to develop well-designed population-based studies in order to better define the impact and spectrum of malignant disease in the
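
    The four-level hierarchy can be sketched as a simple classifier that returns the highest applicable criterion; the record field names below are illustrative, not the registry form's actual fields.

```python
def hiv_evidence_level(record):
    """Return the highest applicable strength-of-evidence level (1-4)
    for linking a cancer registration to HIV/AIDS, or None if no
    criterion is met.  Field names are hypothetical."""
    if record.get("hiv_serology_positive"):
        return 1  # documented positive HIV serology
    if record.get("arv_prescription"):
        return 2  # antiretroviral drug prescription
    if record.get("cd4_count") is not None:
        return 3  # CD4+ lymphocyte count recorded
    if record.get("who_stage") or record.get("iss"):
        return 4  # WHO clinical stage or ISS documented
    return None

level = hiv_evidence_level({"arv_prescription": True, "cd4_count": 210})
```

    Because the checks are ordered, a record carrying several indicators is always scored by its strongest one, which is the point of the hierarchical design.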

  16. Development of a dynamic web mapping service for vegetation productivity using earth observation and in situ sensors in a sensor web based approach

    NARCIS (Netherlands)

    Kooistra, L.; Bergsma, A.R.; Chuma, B.; Bruin, de S.

    2009-01-01

    This paper describes the development of a sensor web based approach which combines earth observation and in situ sensor data to derive typical information offered by a dynamic web mapping service (WMS). A prototype has been developed which provides daily maps of vegetation productivity for the

  17. [Defining AIDS terminology. A practical approach].

    Science.gov (United States)

    Locutura, Jaime; Almirante, Benito; Berenguer, Juan; Muñoz, Agustín; Peña, José María

    2003-01-01

    Since the appearance of AIDS, the study of this disease has generated a large amount of information and an extensive related vocabulary comprised of new terms or terms borrowed from other scientific fields. The urgent need to provide names for newly described phenomena and concepts in this field has resulted in the application of terms that are not always appropriate from the linguistic and scientific points of view. We discuss the difficulties in attempting to create adequate AIDS terminology in the Spanish language, considering both the general problems involved in building any scientific vocabulary and the specific problems inherent to this activity in a field whose defining illness has important social connotations. The pressure exerted by the predominance of the English language in reporting scientific knowledge is considered, and the inappropriate words most often found in a review of current literature are examined. Finally, attending to the two most important criteria for the creation of new scientific terms, accuracy and linguistic correction, we propose some well thought-out alternatives that conform to the essence of the Spanish language.

  18. Research on Topographic Map Updating

    Directory of Open Access Journals (Sweden)

    Ivana Javorović

    2013-04-01

    Full Text Available The investigation of the interpretability of a panchromatic IRS-1C satellite image integrated with a multispectral Landsat TM image, for the purpose of updating a topographic map sheet at the scale of 1:25 000, is described. The geocoding of the source map was based on the trigonometric points of the map sheet. The satellite images were geocoded using control points selected from the map. The map contents were vectorized and a topographic database designed. Digital image processing improved the interpretability of the images, after which the new contents were vectorized. Change detection for the forest and water areas was performed using unsupervised classification of spatially and spectrally merged images. The results were verified using corresponding aerial photographs. Although this methodology could not ensure the complete updating of the topographic map at the scale of 1:25 000, the database was updated with a large amount of data. Erdas Imagine 8.3 software was used.

  19. Review of the Space Mapping Approach to Engineering Optimization and Modeling

    DEFF Research Database (Denmark)

    Bakr, M. H.; Bandler, J. W.; Madsen, Kaj

    2000-01-01

    We review the Space Mapping (SM) concept and its applications in engineering optimization and modeling. The aim of SM is to avoid computationally expensive calculations encountered in simulating an engineering system. The existence of less accurate but fast physically-based models is exploited. ... Space Mapping-based Modeling (SMM). These include Space Derivative Mapping (SDM), Generalized Space Mapping (GSM) and Space Mapping-based Neuromodeling (SMN). Finally, we address open points for research and future development.

  20. Robust approach to f(R) gravity

    International Nuclear Information System (INIS)

    Jaime, Luisa G.; Patino, Leonardo; Salgado, Marcelo

    2011-01-01

    We consider metric f(R) theories of gravity without mapping them to their scalar-tensor counterpart, instead using the Ricci scalar itself as an "extra" degree of freedom. This approach then avoids the introduction of a scalar-field potential that might be ill defined (not single valued). In order to explicitly show the usefulness of this method, we focus on static and spherically symmetric spacetimes and deal with the recent controversy about the existence of extended relativistic objects in a certain class of f(R) models.

  1. Defining Abnormally Low Tenders

    DEFF Research Database (Denmark)

    Ølykke, Grith Skovgaard; Nyström, Johan

    2017-01-01

    The concept of an abnormally low tender is not defined in EU public procurement law. This article takes an interdisciplinary law and economics approach to examine a dataset consisting of Swedish and Danish judgments and verdicts concerning the concept of an abnormally low tender. The purpose...

  2. Maps on statistical manifolds exactly reduced from the Perron-Frobenius equations for solvable chaotic maps

    Science.gov (United States)

    Goto, Shin-itiro; Umeno, Ken

    2018-03-01

    Maps on a parameter space for expressing distribution functions are exactly derived from the Perron-Frobenius equations for a generalized Boole transform family. Here the generalized Boole transform family is a one-parameter family of maps defined on a subset of the real line, whose invariant probability distribution function is the Cauchy distribution with some parameters. With this reduction, some relations between the statistical picture and the orbital one are shown. From the viewpoint of information geometry, the parameter space can be identified with a statistical manifold, and it is shown that the derived maps can be characterized. Also, with a symplectic structure induced from the statistical structure, symplectic and information-geometric aspects of the derived maps are discussed.
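
    The key property behind the reduction, that a Boole-type transform preserves the Cauchy distribution, can be checked numerically. The sketch below uses the classical normalized Boole transform x ↦ (x - 1/x)/2, taken here as a representative member of the family.

```python
import math
import random

def boole(x):
    """Normalized Boole transform; preserves the standard Cauchy law."""
    return 0.5 * (x - 1.0 / x)

random.seed(42)
# Draw standard Cauchy samples via the inverse CDF: tan(pi*(u - 1/2)).
xs = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(200000)]
ys = sorted(boole(x) for x in xs if x != 0.0)

# The quartiles of Cauchy(0, 1) sit at -1 and +1; if the transform
# preserves the law, the empirical quartiles of the image stay there.
q1 = ys[len(ys) // 4]
q3 = ys[3 * len(ys) // 4]
```

    Quartiles (rather than the mean, which does not exist for the Cauchy distribution) are the natural empirical check here.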

  3. Land use map, Finney County, Kansas

    Science.gov (United States)

    Morain, S. A. (Principal Investigator); Williams, D. L.; Coiner, J. C.

    1973-01-01

    The author has identified the following significant results. Methods for the mapping of land use in agricultural regions are developed and applied to the preparation of a land use map of Finney County, Kansas. Six land use categories were identified from an MSS-5 image: (1) large-field irrigation; (2) small-field irrigation; (3) dryland cultivation; (4) rangeland; (5) cultural features; and (6) riverine land. The map is composed of basically homogeneous regions with definable mixtures of the six categories. Each region is bounded by an ocularly evident change in land use.

  4. Multi-stakeholder perspectives in defining health-services quality in cataract care.

    Science.gov (United States)

    Stolk-Vos, Aline C; van de Klundert, Joris J; Maijers, Niels; Zijlmans, Bart L M; Busschbach, Jan J V

    2017-08-01

    To develop a method to define a multi-stakeholder perspective on health-service quality that enables the expression of differences in systematically identified stakeholders' perspectives, and to pilot the approach for cataract care. Mixed-method study between 2014 and 2015. Cataract care in the Netherlands. Stakeholder representatives. We first identified and classified stakeholders using stakeholder theory. Participants established a multi-stakeholder perspective on quality of cataract care using concept mapping; this yielded a cluster map based on multivariate statistical analyses. Consensus-based quality dimensions were subsequently defined in a plenary stakeholder session. Stakeholders and multi-stakeholder perspective on health-service quality. Our analysis identified seven definitive stakeholders, as follows: the Dutch Ophthalmology Society, ophthalmologists, general practitioners, optometrists, health insurers, hospitals and private clinics. Patients, as dependent stakeholders, were considered by the other stakeholders to lack power; hence, they were not classified as definitive stakeholders. Overall, 18 stakeholders representing ophthalmologists, general practitioners, optometrists, health insurers, hospitals, private clinics, patients, patient federations and the Dutch Healthcare Institute sorted 125 systematically collected indicators into the seven following clusters: patient centeredness and accessibility, interpersonal conduct and expectations, experienced outcome, clinical outcome, process and structure, medical technical acting, and safety. Importance scores from stakeholders directly involved in the cataract service delivery process correlated strongly, as did scores from stakeholders not directly involved in this process. Using a case study on cataract care, the proposed methods enable different views among stakeholders concerning quality dimensions to be systematically revealed, and the stakeholders jointly agreed on these dimensions. The methods

  5. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    Science.gov (United States)

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

    Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. Then, we investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized; one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone that were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
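
    The core mechanics of a fuzzy cognitive map can be sketched briefly: concepts are nodes, signed edge weights in [-1, 1] encode perceived influence, and concept activations are iterated through a squashing function until they settle. The three-concept map below is hypothetical, not one of the maps drawn in the round-table discussions.

```python
# Minimal fuzzy cognitive map sketch (illustrative only).
import math

def fcm_step(state, weights):
    """One synchronous FCM update: s_j <- sigmoid(s_j + sum_i s_i * w_ij)."""
    n = len(state)
    nxt = []
    for j in range(n):
        total = state[j] + sum(state[i] * weights[i][j] for i in range(n))
        nxt.append(1.0 / (1.0 + math.exp(-total)))
    return nxt

# Hypothetical map: tourism -> coastal pressure -> water quality.
W = [
    [0.0,  0.8,  0.0],   # tourism increases coastal pressure
    [0.0,  0.0, -0.7],   # pressure degrades water quality
    [0.0,  0.0,  0.0],
]
state = [1.0, 0.0, 0.5]
for _ in range(20):      # iterate toward a fixed point
    state = fcm_step(state, W)
print([round(s, 2) for s in state])
```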

  6. ON DEFINING S-SPACES

    Directory of Open Access Journals (Sweden)

    Francesco Strati

    2013-05-01

    Full Text Available The present work is intended to be an introduction to the Superposition Theory of David Carfì. In particular I shall depict the meaning of his brand new theory, on the one hand in an informal fashion and on the other hand by giving a formal approach to the algebraic structure of the theory: the S-linear algebra. This kind of structure underpins the notion of S-spaces (or Carfì-spaces) by defining both its properties and its nature. Thus I shall define the S-triple as the fundamental principle upon which the S-linear algebra is built up.

  7. INTEGRATED GEOREFERENCING OF STEREO IMAGE SEQUENCES CAPTURED WITH A STEREOVISION MOBILE MAPPING SYSTEM – APPROACHES AND PRACTICAL RESULTS

    Directory of Open Access Journals (Sweden)

    H. Eugster

    2012-07-01

    Full Text Available Stereovision based mobile mapping systems enable the efficient capturing of directly georeferenced stereo pairs. With today's camera and onboard storage technologies imagery can be captured at high data rates resulting in dense stereo sequences. These georeferenced stereo sequences provide a highly detailed and accurate digital representation of the roadside environment which builds the foundation for a wide range of 3d mapping applications and image-based geo web-services. Georeferenced stereo images are ideally suited for the 3d mapping of street furniture and visible infrastructure objects, pavement inspection, asset management tasks or image based change detection. As in most mobile mapping systems, the georeferencing of the mapping sensors and observations – in our case of the imaging sensors – normally relies on direct georeferencing based on INS/GNSS navigation sensors. However, in urban canyons the achievable direct georeferencing accuracy of the dynamically captured stereo image sequences is often insufficient or at least degraded. Furthermore, many of the mentioned application scenarios require homogeneous georeferencing accuracy within a local reference frame over the entire mapping perimeter. To meet these demands, georeferencing approaches are presented and cost-efficient workflows are discussed which allow validating and updating the INS/GNSS based trajectory with independently estimated positions in cases of prolonged GNSS signal outages in order to increase the georeferencing accuracy up to the project requirements.

  8. Empirical PPGIS/PGIS mapping of ecosystem services

    DEFF Research Database (Denmark)

    Brown, Gregory G; Fagerholm, Nora

    2015-01-01

    … demonstrate high potential for the identification of ecosystem services, especially cultural services, there has been no review to evaluate the methods to identify best practice. Through examination of peer-reviewed, empirical PPGIS/PGIS studies, we describe the types of ecosystem services mapped, the spatial mapping methods, the sampling approaches and range of participants, the types of spatial analyses performed, and the methodological trade-offs associated with each PPGIS/PGIS mapping approach. We found that multiple methods were implemented in nearly 30 case studies worldwide with the mapping of cultural … of experimental design and long-term case studies where the influence of mapped ecosystem services on land use decisions can be assessed.

  9. A short note on nearly perfect maps of locales | Razafindrakoto ...

    African Journals Online (AJOL)

    We characterise compact locales in terms of nearly perfect maps. We show in particular that these maps are the natural pointfree version of Bourbaki's proper maps - when defined via any ultrafilter - and that they extend Herrlich's notion of nearly closed sublocales [10]. Mathematics Subject Classification (2010): 06A15, ...

  10. Recursive definition of global cellular-automata mappings

    International Nuclear Information System (INIS)

    Feldberg, R.; Knudsen, C.; Rasmussen, S.

    1994-01-01

    A method for a recursive definition of global cellular-automata mappings is presented. The method is based on a graphical representation of global cellular-automata mappings. For a given cellular-automaton rule the recursive algorithm defines the change of the global cellular-automaton mapping as the number of lattice sites is incremented. A proof of lattice size invariance of global cellular-automata mappings is derived from an approximation to the exact recursive definition. The recursive definitions are applied to calculate the fractal dimension of the set of reachable states and of the set of fixed points of cellular automata on an infinite lattice.
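
    For intuition, the global mapping induced by a local rule can be enumerated directly on a small lattice. The sketch below (elementary rule 90 on a 4-site periodic lattice) is illustrative only and does not reproduce the paper's recursive graphical construction.

```python
# Enumerate the global cellular-automaton mapping on a finite periodic
# lattice and count its fixed points (illustrative sketch).

def local_rule(rule, left, center, right):
    """Apply an elementary CA rule (Wolfram numbering) to one neighborhood."""
    return (rule >> (left * 4 + center * 2 + right)) & 1

def global_map(rule, n):
    """The induced map on all 2**n lattice configurations (periodic)."""
    def step(config):
        return tuple(local_rule(rule, config[i - 1], config[i],
                                config[(i + 1) % n]) for i in range(n))
    configs = [tuple((s >> i) & 1 for i in range(n)) for s in range(2 ** n)]
    return {c: step(c) for c in configs}

g = global_map(90, 4)                      # rule 90 (left XOR right), 4 sites
fixed = [c for c, image in g.items() if image == c]
print(len(g), len(fixed))
```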

  11. Functional mapping imprinted quantitative trait loci underlying developmental characteristics

    Directory of Open Access Journals (Sweden)

    Li Gengxin

    2008-03-01

    Full Text Available Abstract Background Genomic imprinting, a phenomenon referring to nonequivalent expression of alleles depending on their parental origins, has been widely observed in nature. It has been shown recently that the epigenetic modification of an imprinted gene can be detected through a genetic mapping approach. Such an approach is developed based on traditional quantitative trait loci (QTL) mapping focusing on single trait analysis. Recent studies have shown that most imprinted genes in mammals play an important role in controlling embryonic growth and post-natal development. For a developmental character such as growth, the current approach is less efficient in dissecting the dynamic genetic effect of imprinted genes during individual ontogeny. Results Functional mapping has been emerging as a powerful framework for mapping quantitative trait loci underlying complex traits showing developmental characteristics. To understand the genetic architecture of dynamic imprinted traits, we propose a mapping strategy by integrating the functional mapping approach with genomic imprinting. We demonstrate the approach through mapping imprinted QTL controlling growth trajectories in an inbred F2 population. The statistical behavior of the approach is shown through simulation studies, in which the parameters can be estimated with reasonable precision under different simulation scenarios. The utility of the approach is illustrated through real data analysis in an F2 family derived from LG/J and SM/J mouse strains. Three maternally imprinted QTLs are identified as regulating the growth trajectory of mouse body weight. Conclusion The functional iQTL mapping approach developed here provides a quantitative and testable framework for assessing the interplay between imprinted genes and a developmental process, and will have important implications for elucidating the genetic architecture of imprinted traits.

  12. Heat transfer analysis in internally-cooled fuel elements by means of a conformal mapping approach

    International Nuclear Information System (INIS)

    Sarmiento, G.S.; Laura, P.A.A.

    1981-01-01

    The present paper deals with an approximate solution of the steady-state heat conduction problem in internally cooled fuel elements of fast breeder reactors. Explicit expressions for the dimensionless temperature distribution in terms of the governing physical and geometrical parameters are determined by means of a coupled conformal mapping-variational approach. The results obtained are found to be in very good agreement with those calculated by means of a finite element code. (orig.)

  13. Evaluation of phenoxybenzamine in the CFA model of pain following gene expression studies and connectivity mapping.

    Science.gov (United States)

    Chang, Meiping; Smith, Sarah; Thorpe, Andrew; Barratt, Michael J; Karim, Farzana

    2010-09-16

    We have previously used the rat 4 day Complete Freund's Adjuvant (CFA) model to screen compounds with potential to reduce osteoarthritic pain. The aim of this study was to identify genes altered in this model of osteoarthritic pain and use this information to infer analgesic potential of compounds based on their own gene expression profiles using the Connectivity Map approach. Using microarrays, we identified differentially expressed genes in L4 and L5 dorsal root ganglia (DRG) from rats that had received intraplantar CFA for 4 days compared to matched, untreated control animals. Analysis of these data indicated that the two groups were distinguishable by differences in genes important in immune responses, nerve growth and regeneration. This list of differentially expressed genes defined a "CFA signature". We used the Connectivity Map approach to identify pharmacologic agents in the Broad Institute Build02 database that had gene expression signatures that were inversely related ('negatively connected') with our CFA signature. To test the predictive nature of the Connectivity Map methodology, we tested phenoxybenzamine (an alpha adrenergic receptor antagonist) - one of the most negatively connected compounds identified in this database - for analgesic activity in the CFA model. Our results indicate that at 10 mg/kg, phenoxybenzamine demonstrated analgesia comparable to that of Naproxen in this model. Evaluation of phenoxybenzamine-induced analgesia in the current study lends support to the utility of the Connectivity Map approach for identifying compounds with analgesic properties in the CFA model.
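
    As a toy illustration of "negative connectivity", the score below rewards compounds whose expression profile reverses a query signature. The real Connectivity Map uses a Kolmogorov-Smirnov-style enrichment statistic over full ranked gene lists; the simplified rank-based score and gene names here are hypothetical.

```python
# Toy connectivity score: a compound whose profile moves the query's
# up-genes down and down-genes up scores negative, i.e. is "negatively
# connected" with the disease signature (illustrative sketch).

def connectivity_score(up_genes, down_genes, ranked):
    """ranked: compound's genes ordered most up-regulated -> most down."""
    pos = {g: i for i, g in enumerate(ranked)}
    mid = (len(ranked) - 1) / 2.0
    score = 0.0
    for g in up_genes:      # up in disease and up in compound -> positive
        score += (mid - pos[g]) / mid
    for g in down_genes:    # down in disease and down in compound -> positive
        score += (pos[g] - mid) / mid
    return score / (len(up_genes) + len(down_genes))

ranked = ["g1", "g2", "g3", "g4", "g5"]          # hypothetical profile
print(connectivity_score(["g5"], ["g1"], ranked))  # inversely related: -1.0
```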

  14. A Pragmatic Approach to Guide Implementation Evaluation Research: Strategy Mapping for Complex Interventions

    Directory of Open Access Journals (Sweden)

    Alexis K. Huynh

    2018-05-01

    for producing valid and reliable process evaluation data, mapping implementation strategies has informed development of a pragmatic blueprint for implementation and longitudinal analyses and evaluation activities. Discussion: We update recent recommendations on specification of implementation strategies by considering the implications for multi-strategy frameworks and propose an approach for mapping the use of implementation strategies within complex, multi-level interventions, in support of rigorous evaluation. Developing pragmatic tools to aid in operationalizing the conduct of implementation and evaluation activities is essential to enacting sound implementation research.

  15. Conscious worst case definition for risk assessment, part I: a knowledge mapping approach for defining most critical risk factors in integrative risk management of chemicals and nanomaterials.

    Science.gov (United States)

    Sørensen, Peter B; Thomsen, Marianne; Assmuth, Timo; Grieger, Khara D; Baun, Anders

    2010-08-15

    This paper helps bridge the gap between scientists and other stakeholders in the areas of human and environmental risk management of chemicals and engineered nanomaterials. This connection is needed due to the evolution of stakeholder awareness and scientific progress related to human and environmental health which involves complex methodological demands on risk management. At the same time, the available scientific knowledge is also becoming more scattered across multiple scientific disciplines. Hence, the understanding of potentially risky situations is increasingly multifaceted, which again challenges risk assessors in terms of giving the 'right' relative priority to the multitude of contributing risk factors. A critical issue is therefore to develop procedures that can identify and evaluate worst case risk conditions which may be input to risk level predictions. Therefore, this paper suggests a conceptual modelling procedure that is able to define appropriate worst case conditions in complex risk management. The result of the analysis is an assembly of system models, denoted the Worst Case Definition (WCD) model, to set up and evaluate the conditions of multi-dimensional risk identification and risk quantification. The model can help optimize risk assessment planning by initial screening level analyses and guiding quantitative assessment in relation to knowledge needs for better decision support concerning environmental and human health protection or risk reduction. The WCD model facilitates the evaluation of fundamental uncertainty using knowledge mapping principles and techniques in a way that can improve a complete uncertainty analysis. Ultimately, the WCD is applicable for describing risk contributing factors in relation to many different types of risk management problems since it transparently and effectively handles assumptions and definitions and allows the integration of different forms of knowledge, thereby supporting the inclusion of multifaceted risk

  16. A review of monopolar motor mapping and a comprehensive guide to continuous dynamic motor mapping for resection of motor eloquent brain tumors.

    Science.gov (United States)

    Schucht, P; Seidel, K; Jilch, A; Beck, J; Raabe, A

    2017-06-01

    Monopolar mapping of motor function differs from the most commonly used method of intraoperative mapping, i.e. bipolar direct electrical stimulation at 50-60 Hz (Penfield technique mapping). Most importantly, the monopolar probe emits a radial, homogeneous electrical field different to the more focused inter-tip bipolar electrical field. Most users combine monopolar stimulation with the short train technique, also called high frequency stimulation, or train-of-five techniques. It consists of trains of four to nine monopolar rectangular electrical pulses of 200-500 μs pulse length with an interstimulus interval of 2-4 ms. High frequency short train stimulation triggers a time-locked motor-evoked potential response, which has a defined latency and an easily quantifiable amplitude. In this way, motor thresholds might be used to evaluate a current-to-distance relation. The homogeneous electrical field and the current-to-distance approximation provide the surgeon with an estimate of the remaining distance to the corticospinal tract, enabling the surgeon to adjust the speed of resection as the corticospinal tract is approached. Furthermore, this stimulation paradigm is associated with a lower incidence of intraoperative seizures, allowing continuous stimulation. Hence, monopolar mapping is increasingly used as part of a strategy of continuous dynamic mapping: ergonomically integrated into the surgeon's tools, the monopolar probe reliably provides continuous/uninterrupted feedback on motor function. As part of this strategy, motor mapping is no longer a time-consuming interruption of resection but rather a radar-like, real-time information system on the spatial relationship of the current resection site to eloquent motor structures. Copyright © 2017. Published by Elsevier Masson SAS.

  17. WebMapping at school

    Science.gov (United States)

    de Lange, Norbert

    2010-11-01

    The paper discusses the position of GIS in Geography as a school subject, especially at German schools. It points out that students only need simple GIS-functions in order to explore digital atlases or web-based data viewers. Furthermore it is widely accepted that learning achievements improve if students engage in self-directed, exploratory work with information they have produced themselves. These two arguments have led to the development of the WebMapping tool "kartografix_school". It allows users to generate maps with new and individually defined content on the internet. For that purpose the tool contains generalized outlines of all countries of the world as well as of German States. As these boundaries are given, users can assign new attribute data to these geoobjects. These data are transferred to a graphic presentation. It is possible to define the classification and colours for each class. Users can change and update all information (data as well as number of classes, definition of classes, colours) at any time. Moreover "kartografix_school" offers the possibility to produce maps which are composed of two layers. All data are stored on a server at the University of Osnabrück. "kartografix_school" is integrated in an e-Learning environment.

  18. Using the Hadoop/MapReduce approach for monitoring the CERN storage system and improving the ATLAS computing model

    CERN Document Server

    Russo, Stefano Alberto; Lamanna, M

    The processing of huge amounts of data, an already fundamental task for research in elementary particle physics, is becoming more and more important also for companies operating in the Information Technology (IT) industry. In this context, if conventional approaches are adopted, several problems arise, starting from the congestion of the communication channels. In the IT sector, one of the approaches designed to minimize this congestion is to exploit data locality, or in other words, to bring the computation as close as possible to where the data reside. The most common implementation of this concept is the Hadoop/MapReduce framework. In this thesis work I evaluate the usage of Hadoop/MapReduce in two areas: a standard one similar to typical IT analyses, and an innovative one related to high energy physics analyses. The first consists in monitoring the history of the storage cluster which stores the data generated by the LHC experiments, the second in the physics analysis of the latter, ...
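
    The MapReduce model referred to above can be sketched in-process: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group; in Hadoop, map tasks are additionally scheduled on the nodes holding the data blocks, which is the data-locality idea. The sketch below is a minimal single-machine illustration, not Hadoop's API.

```python
# In-process sketch of the MapReduce programming model (illustrative only).
from collections import defaultdict

def map_phase(record):
    """Mapper: emit (word, 1) for every word in a record."""
    for word in record.split():
        yield word, 1

def mapreduce(records, mapper, reducer):
    groups = defaultdict(list)
    for record in records:                 # map + shuffle (group by key)
        for key, value in mapper(record):
            groups[key].append(value)
    return {key: reducer(values) for key, values in groups.items()}

logs = ["error disk error", "disk ok"]     # hypothetical log records
counts = mapreduce(logs, map_phase, sum)
print(counts)
```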

  19. Map Usage in Virtual Environments

    National Research Council Canada - National Science Library

    Cevik, Helsin

    1998-01-01

    ... of map representation as an aid in performing navigation tasks. The approach taken was first to determine and then investigate the parameters that affect virtual map representation through an experiment designed specifically for this thesis...

  20. Crowd-Sourced Mobility Mapping for Location Tracking Using Unlabeled Wi-Fi Simultaneous Localization and Mapping

    Directory of Open Access Journals (Sweden)

    Mu Zhou

    2015-01-01

    Full Text Available Due to the increasing requirements of the seamless and round-the-clock Location-based services (LBSs), a growing interest in Wi-Fi network aided location tracking has been witnessed in the past decade. One of the significant problems of the conventional Wi-Fi location tracking approaches based on received signal strength (RSS) fingerprinting is the time-consuming and labor-intensive work involved in location fingerprint calibration. To solve this problem, a novel unlabeled Wi-Fi simultaneous localization and mapping (SLAM) approach is developed to avoid the location fingerprinting and additional inertial or vision sensors. In this approach, an unlabeled mobility map of the coverage area is first constructed by using the crowd-sourcing from a batch of sporadically recorded Wi-Fi RSS sequences based on the spectral cluster assembling. Then, the sequence alignment algorithm is applied to conduct location tracking and mobility map updating. Finally, the effectiveness of this approach is verified by the extensive experiments carried out in a campus-wide area.
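
    Alignment of RSS sequences can be illustrated with dynamic time warping: two walks along the same path align cheaply even if sampled at different paces, while different paths do not. This is a generic sketch with made-up readings, not the paper's specific alignment algorithm.

```python
# DTW cost between two Wi-Fi RSS sequences (illustrative sketch). Each
# sample is a vector of RSS readings (dBm) from the visible access points.

def dtw_distance(seq_a, seq_b):
    """Dynamic-time-warping cost between two sequences of RSS vectors."""
    inf = float("inf")
    n, m = len(seq_a), len(seq_b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = sum(abs(x - y) for x, y in zip(seq_a[i - 1], seq_b[j - 1]))
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

walk_1 = [(-40, -70), (-50, -60), (-70, -45)]   # RSS from two APs (dBm)
walk_2 = [(-41, -69), (-49, -61), (-71, -44)]   # same path, slightly noisy
walk_3 = [(-70, -45), (-50, -60), (-40, -70)]   # reverse direction
print(dtw_distance(walk_1, walk_2) < dtw_distance(walk_1, walk_3))
```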

  1. c-Jun controls the efficiency of MAP kinase signaling by transcriptional repression of MAP kinase phosphatases

    International Nuclear Information System (INIS)

    Sprowles, Amy; Robinson, Dan; Wu Yimi; Kung, H.-J.; Wisdom, Ron

    2005-01-01

    The mammalian JNK signaling pathway regulates the transcriptional response of cells to environmental stress, including UV irradiation. This signaling pathway is composed of a classical MAP kinase cascade; activation results in phosphorylation of the transcription factor substrates c-Jun and ATF2, and leads to changes in gene expression. The defining components of this pathway are conserved in the fission yeast S. pombe, where genetic studies have shown that the ability of the JNK homolog Spc1 to be activated in response to UV irradiation is dependent on the presence of the transcription factor substrate Atf1. We have used genetic analysis to define the role of c-Jun in activation of the mammalian JNK signaling pathway. Our results show that optimal activation of JNK requires the presence of its transcription factor substrate c-Jun. Mutational analysis shows that the ability of c-Jun to support efficient activation of JNK requires the ability of Jun to bind DNA, suggesting a transcriptional mechanism. Consistent with this, we show that c-Jun represses the expression of several MAP kinase phosphatases. In the absence of c-Jun, the increased expression of MAP kinase phosphatases leads to impaired activation of the ERK, JNK, and p38 MAP kinases after pathway activation. The results show that one function of c-Jun is to regulate the efficiency of signaling by the ERK, p38, and JNK MAP kinases, a function that is likely to affect cellular responses to many different stimuli.

  2. The peeling process of infinite Boltzmann planar maps

    DEFF Research Database (Denmark)

    Budd, Timothy George

    2016-01-01

    criterion has a very simple interpretation. The finite random planar maps under consideration were recently proved to possess a well-defined local limit known as the infinite Boltzmann planar map (IBPM). Inspired by recent work of Curien and Le Gall, we show that the peeling process on the IBPM can...

  3. Stimulating Graphical Summarization in Late Elementary Education: The Relationship between Two Instructional Mind-Map Approaches and Student Characteristics

    Science.gov (United States)

    Merchie, Emmelien; Van Keer, Hilde

    2016-01-01

    This study examined the effectiveness of two instructional mind-mapping approaches to stimulate fifth and sixth graders' graphical summarization skills. Thirty-five fifth- and sixth-grade teachers and 644 students from 17 different elementary schools participated. A randomized quasi-experimental repeated-measures design was set up with two…

  4. Use of Time-Frequency Analysis and Neural Networks for Mode Identification in a Wireless Software-Defined Radio Approach

    Directory of Open Access Journals (Sweden)

    Matteo Gandetto

    2004-09-01

    Full Text Available The use of time-frequency distributions is proposed as a nonlinear signal processing technique that is combined with a pattern recognition approach to identify superimposed transmission modes in a reconfigurable wireless terminal based on software-defined radio techniques. In particular, a software-defined radio receiver is described aiming at the identification of two coexistent communication modes: frequency hopping code division multiple access and direct sequence code division multiple access. As a case study, two standards, based on the previous modes and operating in the same band (industrial, scientific, and medical), are considered: IEEE WLAN 802.11b (direct sequence) and Bluetooth (frequency hopping). Neural classifiers are used to obtain identification results. A comparison between two different neural classifiers is made in terms of relative error frequency.
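
    The underlying idea, distinguishing a frequency hopper from a fixed carrier by how the dominant frequency moves across analysis windows, can be sketched with a plain windowed DFT. This is illustrative only: the paper uses time-frequency distributions and neural classifiers, neither of which is reproduced here.

```python
# Per-window dominant-frequency track (illustrative sketch): a frequency
# hopper moves its spectral peak between windows, a fixed carrier does not.
import cmath
import math

def dominant_bins(signal, win):
    """Dominant DFT bin (k < win/2) for each non-overlapping window."""
    bins = []
    for start in range(0, len(signal) - win + 1, win):
        seg = signal[start:start + win]
        mags = [abs(sum(seg[n] * cmath.exp(-2j * math.pi * k * n / win)
                        for n in range(win))) for k in range(win // 2)]
        bins.append(mags.index(max(mags)))
    return bins

win = 32
t = range(4 * win)
hopper = [math.cos(2 * math.pi * (2 if (n // win) % 2 else 7) * n / win)
          for n in t]                       # carrier hops between two bins
fixed_tone = [math.cos(2 * math.pi * 5 * n / win) for n in t]
print(dominant_bins(hopper, win), dominant_bins(fixed_tone, win))
```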

  5. Using a Curricular Vision to Define Entrustable Professional Activities for Medical Student Assessment.

    Science.gov (United States)

    Hauer, Karen E; Boscardin, Christy; Fulton, Tracy B; Lucey, Catherine; Oza, Sandra; Teherani, Arianne

    2015-09-01

    The new UCSF Bridges Curriculum aims to prepare students to succeed in today's health care system while simultaneously improving it. Curriculum redesign requires assessment strategies that ensure that graduates achieve competence in enduring and emerging skills for clinical practice. To design entrustable professional activities (EPAs) for assessment in a new curriculum and gather evidence of content validity. University of California, San Francisco, School of Medicine. Nineteen medical educators participated; 14 completed both rounds of a Delphi survey. Authors describe 5 steps for defining EPAs that encompass a curricular vision including refining the vision, defining draft EPAs, developing EPAs and assessment strategies, defining competencies and milestones, and mapping milestones to EPAs. A Q-sort activity and Delphi survey involving local medical educators created consensus and prioritization for milestones for each EPA. For 4 EPAs, most milestones had content validity indices (CVIs) of at least 0.78. For 2 EPAs, 2 to 4 milestones did not achieve CVIs of 0.78. We demonstrate a stepwise procedure for developing EPAs that capture essential physician work activities defined by a curricular vision. Structured procedures for soliciting faculty feedback and mapping milestones to EPAs provide content validity.
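
    The content validity index referenced above can be computed as the proportion of experts rating an item 3 or 4 on a 4-point relevance scale, with 0.78 a commonly used acceptance cutoff. The milestone names and ratings below are hypothetical, not the study's Delphi data.

```python
# Item-level content validity index (I-CVI) sketch with hypothetical data.

def item_cvi(ratings):
    """Proportion of experts rating the item 3 or 4 on a 4-point scale."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

milestone_ratings = {
    "gathers history": [4, 4, 3, 4, 3, 4, 3, 4, 4, 3, 4, 4, 3, 4],
    "orders tests":    [2, 3, 2, 4, 3, 2, 3, 2, 3, 2, 2, 3, 2, 3],
}
for name, ratings in milestone_ratings.items():
    cvi = item_cvi(ratings)
    print(name, round(cvi, 2), "retain" if cvi >= 0.78 else "revise")
```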

  6. Approaching multidimensional forms of knowledge through Personal Meaning Mapping in science integrating teaching outside the classroom

    DEFF Research Database (Denmark)

    Hartmeyer, Rikke; Bolling, Mads; Bentsen, Peter

    2017-01-01

    Current research points to Personal Meaning Mapping (PMM) as a method useful in investigating students' prior and current science knowledge. However, studies investigating PMM as a method for exploring specific knowledge dimensions are lacking. Ensuring that students are able to access specific knowledge dimensions is important, especially in science teaching outside the classroom, where "hands-on" approaches and experiments are often part of teaching and require procedural knowledge, among other things. Therefore, this study investigates PMM as a method for exploring specific knowledge dimensions in formal science education integrating teaching outside the classroom. We applied a case study design involving two schools and four sixth-grade classes. Data were collected from six students in each class who constructed personal meaning maps and were interviewed immediately after natural science …

  7. An approach for establishing the performance maps of the sc-CO_2 compressor: Development and qualification by means of CFD simulations

    International Nuclear Information System (INIS)

    Pham, H.S.; Alpy, N.; Ferrasse, J.H.; Boutin, O.; Tothill, M.; Quenaut, J.; Gastaldi, O.; Cadiou, T.; Saez, M.

    2016-01-01

    Highlights: • Ability of CFD to predict the performance of a sc-CO_2 test compressor is shown. • Risk of vapor pockets occurrence inside a scale 1:1 compressor is highlighted. • Limitation of previous performance maps approaches to model the real gas behavior is shown. • A performance maps approach for the sc-CO_2 compressor is proposed and validated. - Abstract: One of the challenges in the performance prediction of the supercritical CO_2 (sc-CO_2) compressor is the real gas behavior of the working fluid near the critical point. This study deals with the establishment of an approach that allows coping with this particularity by dressing compressor performance maps in adequate reduced coordinates (i.e., suitable dimensionless speed and flow parameters inputs and pressure ratio and enthalpy rise outputs), while using CFD for its validation. Two centrifugal compressor designs have been considered in this work. The first one corresponds to a 6 kW small scale component implemented in a test loop at Tokyo Institute of Technology. The second one corresponds to a 38 MW scale 1:1 design considered at an early stage of a project that investigates sc-CO_2 cycle for a Small Modular Reactor application. Numerical results on the former have been successfully confronted with the experimental data to qualify the ability of CFD to provide a performance database. Results on the latter have revealed a significant decrease in the static temperature and pressure during flow acceleration along the leading edge of the impeller blades. In this line, the increased risk of vapor pockets appearance inside a sc-CO_2 compressor has been highlighted and recommendations regarding the choice of the on-design inlet conditions and the compressor design have been given to overcome this concern. CFD results on the scale 1:1 compressor have then been used to evaluate the relevancy of some previous performance maps approaches for a sc-CO_2 compressor application. These include the conventional
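
    As background, the conventional ideal-gas reduced coordinates that the paper finds inadequate near the CO_2 critical point normalize speed and mass flow by inlet stagnation conditions relative to a reference state. The operating point below is hypothetical and the formulas are the textbook corrected parameters, not the paper's proposed coordinates.

```python
# Conventional corrected (reduced) compressor-map coordinates, shown only
# as the ideal-gas baseline (illustrative; values are hypothetical).

def corrected_speed(n_rpm, t_inlet, t_ref):
    """Shaft speed corrected to reference inlet temperature."""
    return n_rpm / (t_inlet / t_ref) ** 0.5

def corrected_flow(m_dot, t_inlet, p_inlet, t_ref, p_ref):
    """Mass flow corrected to reference inlet temperature and pressure."""
    return m_dot * (t_inlet / t_ref) ** 0.5 / (p_inlet / p_ref)

# Hypothetical operating point vs. reference inlet conditions.
print(round(corrected_speed(30000.0, 320.0, 305.0), 1),
      round(corrected_flow(12.0, 320.0, 8.0e6, 305.0, 7.5e6), 2))
```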

  8. Mapping and defining sources of variability in bioavailable strontium isotope ratios in the Eastern Mediterranean

    Science.gov (United States)

    Hartman, Gideon; Richards, Mike

    2014-02-01

    over time. Precipitation, the age of the bedrock and the overall Sr concentration must be taken into consideration when interpreting geographical variation in strontium isotopes throughout a region. Because these factors can change through time, we recommend that Sr data from time periods older than the Holocene be interpreted with caution. What is the range of variation in the 87Sr/86Sr ratios of vegetation within individual sampling locales? Are there differences in the 87Sr/86Sr ratios of ligneous (woody plants) and non-ligneous (herbaceous plants) within a single sampling location? What is the range of variability in the 87Sr/86Sr ratios of plants growing on marine sedimentary and volcanic geologies? How do the relative contributions of atmospheric Sr sources vary with geology, precipitation, distance from the sea, soil type, and vegetation type? Outlining Sr variability will enable the prediction of the Sr ratio of herbivores in various ecological niches as well as the mapping of bioavailable Sr ratios for a range of pre-Holocene landscapes. In contrast to previous mapping efforts in the region (Shewan, 2004; Perry et al., 2009), this study takes a systematic approach that examines the relative contribution of atmospherically deposited Sr and local weathered bedrock Sr sources to local bioavailable 87Sr/86Sr pools. This is based on the intensive sampling of plants and herbivorous invertebrates primarily from volcanic landscapes and marine sedimentary landscapes composed largely of limestone, dolomite, chalk and marl. The repeated sampling of individual locales, and comparisons between distinct locales of the same geological outcrops were initially planned to determine the degree of homogeneity of bioavailable 87Sr/86Sr ratios for the purpose of regional landscape mapping. This is important due to the current lack of data on microscale variation in bioavailable sources that might limit the degree of separation between different exposures.
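
    The balance between weathered-bedrock and atmospheric Sr sources can be expressed as a concentration-weighted mixing model: the 87Sr/86Sr ratio of a mixture is the mean of its end-member ratios weighted by each source's contribution of Sr mass. The end-member values below are hypothetical, not the study's measurements.

```python
# Concentration-weighted Sr isotope mixing sketch (hypothetical values).

def mixed_sr_ratio(sources):
    """sources: list of (mass_fraction, Sr_concentration, 87Sr/86Sr)."""
    sr_total = sum(f * c for f, c, _ in sources)
    return sum(f * c * r for f, c, r in sources) / sr_total

# Hypothetical end-members: weathered basalt vs. sea-spray / dust input.
bedrock = (0.7, 200.0, 0.7035)   # mass fraction, Sr ppm, isotope ratio
atmos = (0.3, 400.0, 0.7092)     # marine-derived Sr: higher ratio here
plant = mixed_sr_ratio([bedrock, atmos])
print(round(plant, 4))
```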

  9. a Conceptual Framework for Indoor Mapping by Using Grammars

    Science.gov (United States)

    Hu, X.; Fan, H.; Zipf, A.; Shang, J.; Gu, F.

    2017-09-01

Maps are the foundation of indoor location-based services. Many automatic indoor mapping approaches have been proposed, but they rely heavily on sensor data, such as point clouds and users' location traces. To address this issue, this paper presents a conceptual framework for representing the layout principles of research buildings by using grammars. This framework can benefit the indoor mapping process by improving the accuracy of generated maps and by dramatically reducing the volume of sensor data required by traditional reconstruction approaches. In addition, we present details of several core modules of the framework. An example using the proposed framework shows the generation process of a semantic map. This framework is part of ongoing research on the development of an approach for reconstructing semantic maps.

  10. Fermion to boson mappings revisited

    International Nuclear Information System (INIS)

    Ginocchio, J.N.; Johnson, C.W.

    1996-01-01

    We briefly review various mappings of fermion pairs to bosons, including those based on mapping operators, such as Belyaev-Zelevinskii, and those on mapping states, such as Marumori; in particular we consider the work of Otsuka-Arima-Iachello, aimed at deriving the Interacting Boson Model. We then give a rigorous and unified description of state-mapping procedures which allows one to systematically go beyond Otsuka-Arima-Iachello and related approaches, along with several exact results. (orig.)

  11. A map between corner and link operators in lattice gauge theories

    International Nuclear Information System (INIS)

    Bars, I.

    1979-01-01

A completely local gauge-invariant lattice gauge theory is formulated in terms of a new set of variables introduced earlier in the continuum. This theory uses local 'corner' variables defined on lattice sites only, as opposed to the conventional 'link' variables. It is shown via a map that the formulation gives identical results to the usual lattice gauge theory. The properties of the quantum commutators in the continuum limit are also discussed and contrasted for the two lattice approaches. In terms of the corner operators the quantized lattice theory is seen to be closely related to continuum QCD. (Auth.)

  12. Using fuzzy self-organising maps for safety critical systems

    International Nuclear Information System (INIS)

    Kurd, Zeshan; Kelly, Tim P.

    2007-01-01

This paper defines a type of constrained artificial neural network (ANN) that enables analytical certification arguments whilst retaining valuable performance characteristics. Previous work has defined a safety lifecycle for ANNs without detailing a specific neural model. Building on this previous work, the underpinning of the devised model is based upon an existing neuro-fuzzy system called the fuzzy self-organising map (FSOM). The FSOM is a type of 'hybrid' ANN which allows behaviour to be described qualitatively and quantitatively using meaningful expressions. Safety of the FSOM is argued through adherence to safety requirements, derived from hazard analysis and expressed as safety constraints. The approach enables the construction of compelling (product-based) arguments for the mitigation of potential failure modes associated with the FSOM. The constrained FSOM has been termed a 'safety critical artificial neural network' (SCANN). The SCANN can be used for non-linear function approximation and allows certified learning and generalisation for high-criticality roles. A discussion of benefits for real-world applications is also presented.

  13. Mapping carbon sequestration in forests at the regional scale - a climate biomonitoring approach by example of Germany

    Energy Technology Data Exchange (ETDEWEB)

    Schroeder, Winfried; Pesch, Roland [University of Vechta, Chair of Landscape Ecology, PO Box. 1553, Vechta (Germany)

    2011-12-15

The United Nations Framework Convention on Climate Change recognizes carbon (C) fixation in forests as an important contribution for the reduction of atmospheric pollution in terms of greenhouse gases. Spatial differentiation of C sequestration in forests either at the national or at the regional scale is therefore needed for forest planning purposes. Hence, within the framework of the Forest Focus regulation, the aim of this investigation was to statistically analyse factors influencing C fixation and to use the corresponding associations in a predictive mapping approach at the regional scale, by example of the German federal state North Rhine-Westphalia. The results of the methodical scheme outlined in this article should be compared with an already-published approach applied to the same data which were used in the investigation at hand. Site-specific data on C sequestration in humus, forest trees/dead wood and soil from two forest monitoring networks were intersected with available surface information on topography, soil, climate and forestal growing areas and districts. Next, the associations between C sequestration and the influencing factors were examined and modelled by linear regression analyses. The resulting regression equations were applied to the surface data to predictively map the C sequestration for the entire study area. The computations yielded an estimate of 146.7 million tonnes of C sequestered in the forests of North Rhine-Westphalia, corresponding to 168.6 t/ha. The calculated values agree well with figures reported in the literature. Furthermore, the results are almost identical to those of another pilot study where a different statistical methodology was applied to the same database. Nevertheless, the underlying regression models contribute only a low degree of explanation to the overall variance of the C fixation. This might mainly be due to data quality aspects and missing influence factors in the analyses. In another

  14. A Generalized Approach to Defining Item Discrimination for DCMs

    Science.gov (United States)

    Henson, Robert; DiBello, Lou; Stout, Bill

    2018-01-01

    Diagnostic classification models (DCMs, also known as cognitive diagnosis models) hold the promise of providing detailed classroom information about the skills a student has or has not mastered. Specifically, DCMs are special cases of constrained latent class models where classes are defined based on mastery/nonmastery of a set of attributes (or…

  15. PCR-Based EST Mapping in Wheat (Triticum aestivum L.)

    Directory of Open Access Journals (Sweden)

    J. PERRY GUSTAFSON

    2009-04-01

Full Text Available Mapping expressed sequence tags (ESTs) to hexaploid wheat aims to reveal the structure and function of the hexaploid wheat genome. Sixty-eight ESTs representing 26 genes were mapped to all seven homologous chromosome groups of wheat (Triticum aestivum L.) using a polymerase chain reaction technique. The majority of the ESTs were mapped to homologous chromosome group 2, and the fewest were mapped to homologous chromosome group 6. Comparative analysis between the EST map from this study and the EST map based on RFLPs showed that 14 genes mapped by both approaches were located on the same arm of the same homologous chromosome, which indicated that using PCR-based ESTs is a reliable approach to mapping ESTs in hexaploid wheat.

  16. Improvements in Off Design Aeroengine Performance Prediction Using Analytic Compressor Map Interpolation

    Science.gov (United States)

    Mist'e, Gianluigi Alberto; Benini, Ernesto

    2012-06-01

    Compressor map interpolation is usually performed through the introduction of auxiliary coordinates (β). In this paper, a new analytical bivariate β function definition to be used in compressor map interpolation is studied. The function has user-defined parameters that must be adjusted to properly fit to a single map. The analytical nature of β allows for rapid calculations of the interpolation error estimation, which can be used as a quantitative measure of interpolation accuracy and also as a valid tool to compare traditional β function interpolation with new approaches (artificial neural networks, genetic algorithms, etc.). The quality of the method is analyzed by comparing the error output to the one of a well-known state-of-the-art methodology. This comparison is carried out for two different types of compressor and, in both cases, the error output using the method presented in this paper is found to be consistently lower. Moreover, an optimization routine able to locally minimize the interpolation error by shape variation of the β function is implemented. Further optimization introducing other important criteria is discussed.
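The auxiliary-coordinate idea can be illustrated with a toy interpolation sketch. This is not the paper's analytical β function: the map tables, grid values, and the simple bilinear scheme below are hypothetical stand-ins showing only why β is useful, namely that mass flow and pressure ratio become single-valued functions of (speed, β).

```python
import numpy as np

# Hypothetical compressor map tabulated on a regular (speed, beta) grid.
# beta is an auxiliary coordinate in [0, 1] indexing position along each
# speed line, so both tables are single-valued in (speed, beta).
speeds = np.array([0.8, 0.9, 1.0])       # corrected speed (normalized)
betas = np.linspace(0.0, 1.0, 5)         # auxiliary beta coordinate

mass_flow = np.array([                   # one row per speed line
    [0.60, 0.65, 0.70, 0.74, 0.77],
    [0.75, 0.80, 0.85, 0.89, 0.92],
    [0.90, 0.95, 1.00, 1.04, 1.07],
])
pressure_ratio = np.array([
    [1.8, 1.7, 1.6, 1.5, 1.4],
    [2.4, 2.3, 2.2, 2.1, 2.0],
    [3.0, 2.9, 2.8, 2.7, 2.6],
])

def bilinear(table, speed, beta):
    """Bilinear interpolation of a (speed, beta) table."""
    i = int(np.clip(np.searchsorted(speeds, speed) - 1, 0, len(speeds) - 2))
    j = int(np.clip(np.searchsorted(betas, beta) - 1, 0, len(betas) - 2))
    ts = (speed - speeds[i]) / (speeds[i + 1] - speeds[i])
    tb = (beta - betas[j]) / (betas[j + 1] - betas[j])
    return ((1 - ts) * (1 - tb) * table[i, j]
            + ts * (1 - tb) * table[i + 1, j]
            + (1 - ts) * tb * table[i, j + 1]
            + ts * tb * table[i + 1, j + 1])

# An off-design operating point between two tabulated speed lines:
w = bilinear(mass_flow, 0.85, 0.125)
pr = bilinear(pressure_ratio, 0.85, 0.125)
```

Because both tables share the same (speed, β) grid, the interpolation error of a candidate β function can be estimated simply by comparing interpolated and tabulated values at held-out grid nodes, which is the quantitative accuracy measure the paper exploits.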

  17. Defining Survivorship Trajectories Across Patients With Solid Tumors: An Evidence-Based Approach.

    Science.gov (United States)

    Dood, Robert L; Zhao, Yang; Armbruster, Shannon D; Coleman, Robert L; Tworoger, Shelley; Sood, Anil K; Baggerly, Keith A

    2018-06-02

Survivorship involves a multidisciplinary approach to surveillance and management of comorbidities and secondary cancers, overseen by oncologists, surgeons, and primary care physicians. Optimal timing and coordination of care, however, are unclear and often based on arbitrary 5-year cutoffs. To determine high- and low-risk periods for all tumor types that could define when survivorship care might best be overseen by oncologists and when to transition to primary care physicians. In this pan-cancer, longitudinal, observational study, excess mortality hazard, calculated as an annualized mortality risk above a baseline population, was plotted over time. The time this hazard took to stabilize defined a high-risk period. The percent mortality elevation above age- and sex-matched controls in the latter low-risk period was reported as a mortality gap. The US population-based Surveillance, Epidemiology, and End Results database defined the cancer population, and the US Census life tables defined controls. Incident cases of patients with cancer were separated into tumor types based on International Classification of Diseases for Oncology definitions. Population-level data on incident cancer cases was compared with the general US population. Overall mortality and cause of death were reported on observed cancer cases. A total of 2 317 185 patients (median age, 63 years; 49.8% female) with 66 primary tumor types were evaluated. High-risk surveillance period durations ranged from less than 1 year (breast, prostate, lip, ocular, and parathyroid cancers) up to 19 years (unspecified gastrointestinal cancers). The annualized mortality gap, representing the excess mortality in the stable period, ranged from a median 0.26% to 9.33% excess annual mortality (thyroid and hypopharyngeal cancer populations, respectively). Cluster analysis produced 6 risk cluster groups: group 1, with median survival of 16.2 (5th to 95th percentile range [PR], 10.7-40.2) years and median high-risk period

  18. Issue Mapping for an Ageing Europe

    NARCIS (Netherlands)

    Rogers, R.; Sánchez-Querubín, N.; Kil, A.

    2015-01-01

    Issue Mapping for an Ageing Europe is a seminal guide to mapping social and political issues with digital methods. The issue at stake concerns the imminent crisis of an ageing Europe and its impact on the contemporary welfare state. The book brings together three leading approaches to issue mapping:

  19. Developing clinical strength-of-evidence approach to define HIV-associated malignancies for cancer registration in Kenya.

    Directory of Open Access Journals (Sweden)

    Anne Korir

Full Text Available Sub-Saharan African cancer registries are beset by an increasing cancer burden further exacerbated by the AIDS epidemic, where there are limited capabilities for cancer-AIDS match co-registration. We undertook a pilot study based on a "strength-of-evidence" approach using clinical data abstracted at the time of cancer registration for purposes of linking cancer diagnosis to AIDS diagnosis. The standard Nairobi Cancer Registry form was modified for registrars to abstract the following clinical data from medical records regarding HIV infection/AIDS in a hierarchical approach at the time of cancer registration, from highest to lowest strength of evidence: (1) documentation of positive HIV serology; (2) antiretroviral drug prescription; (3) CD4+ lymphocyte count; and (4) WHO HIV clinical stage or immune suppression syndrome (ISS), which is Kenyan terminology for AIDS. Between August 1 and October 31, 2011, a total of 1,200 cancer cases were registered. Of these, 171 cases (14.3%) met clinical strength-of-evidence criteria for association with HIV infection/AIDS; 69% (118 cases) were tumor types with known HIV association - Kaposi's sarcoma, cervical cancer, non-Hodgkin's and Hodgkin's lymphoma, and conjunctiva carcinoma - and 31% (53 cases) were consistent with non-AIDS-defining cancers. Verifiable positive HIV serology was identified in 47 (27%) cases, for an absolute seroprevalence rate of 4% among the registered cancer cases, with an upper boundary of 14% among those meeting at least one strength-of-evidence criterion. This pilot demonstration of a hierarchical, clinical strength-of-evidence approach for cancer-AIDS registration in Kenya establishes feasibility, is readily adaptable, pragmatic, and does not require additional resources for critically understaffed cancer registries.
Cancer is an emerging public health challenge, and African nations need to develop well-designed population-based studies in order to better define the impact and spectrum of malignant disease

  20. Computing fixed points of nonexpansive mappings by $\\alpha$-dense curves

    Directory of Open Access Journals (Sweden)

    G. García

    2017-08-01

Full Text Available Given a multivalued nonexpansive mapping defined on a convex and compact set of a Banach space, with values in the class of convex and compact subsets of its domain, we present an iteration scheme which (under suitable conditions) converges to a fixed point of such a mapping. This new iteration provides another method to approximate the fixed points of a single-valued nonexpansive mapping defined on a compact and convex set into itself. Moreover, the conditions for the single-valued case are less restrictive than for the multivalued case. Our main tool will be the so-called $\alpha$-dense curves, which allow us to construct such iterations. Some numerical examples are provided to illustrate our results.
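For context, the classical baseline for approximating fixed points of a nonexpansive map is the Krasnoselskii-Mann averaged iteration. The sketch below is not the paper's α-dense-curve scheme; the example map (a rotation about a point, a nonexpansive isometry with a known fixed point) is chosen purely for illustration.

```python
import numpy as np

def krasnoselskii_mann(T, x0, lam=0.5, tol=1e-10, max_iter=10000):
    """Averaged iteration x_{n+1} = (1-lam)*x_n + lam*T(x_n) for a
    nonexpansive map T (a classical baseline, not the paper's scheme)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_next = (1 - lam) * x + lam * T(x)
        if np.linalg.norm(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Example: a 90-degree rotation about the point c is nonexpansive (an
# isometry) on R^2 and has the unique fixed point c.  Plain iteration of
# T would orbit forever; the averaged iteration spirals into c.
c = np.array([1.0, 1.0])
R = np.array([[0.0, -1.0], [1.0, 0.0]])
T = lambda x: c + R @ (x - c)

fp = krasnoselskii_mann(T, np.array([5.0, -3.0]))
```

The rotation example also shows why averaging matters: T itself has no convergent orbits away from c, yet (I + T)/2 is a strict contraction there.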

  1. Mapping posttranscriptional regulation of the human glycome uncovers microRNA defining the glycocode

    OpenAIRE

    Agrawal, Praveen; Kurcon, Tomasz; Pilobello, Kanoelani T.; Rakus, John F.; Koppolu, Sujeethraj; Liu, Zhongyin; Batista, Bianca S.; Eng, William S.; Hsu, Ku-Lung; Liang, Yaxuan; Mahal, Lara K.

    2014-01-01

    Carbohydrates (glycans) are complex cell surface molecules that control multiple aspects of cell biology, including cell–cell communication, cancer metastasis, and inflammation. Glycan biosynthesis requires the coordination of many enzymes, but how this is regulated is not well understood. Herein we show that microRNA (miRNA), small noncoding RNA, are a major regulator of cell surface glycosylation. We map miRNA expression onto carbohydrate signatures obtained by using lectin microarrays, a g...

  2. A CONCEPTUAL FRAMEWORK FOR INDOOR MAPPING BY USING GRAMMARS

    Directory of Open Access Journals (Sweden)

    X. Hu

    2017-09-01

Full Text Available Maps are the foundation of indoor location-based services. Many automatic indoor mapping approaches have been proposed, but they rely heavily on sensor data, such as point clouds and users' location traces. To address this issue, this paper presents a conceptual framework for representing the layout principles of research buildings by using grammars. This framework can benefit the indoor mapping process by improving the accuracy of generated maps and by dramatically reducing the volume of sensor data required by traditional reconstruction approaches. In addition, we present details of several core modules of the framework. An example using the proposed framework shows the generation process of a semantic map. This framework is part of ongoing research on the development of an approach for reconstructing semantic maps.

  3. On Attribute Thresholding and Data Mapping Functions in a Supervised Connected Component Segmentation Framework

    Directory of Open Access Journals (Sweden)

    Christoff Fourie

    2015-06-01

    Full Text Available Search-centric, sample supervised image segmentation has been demonstrated as a viable general approach applicable within the context of remote sensing image analysis. Such an approach casts the controlling parameters of image processing—generating segments—as a multidimensional search problem resolvable via efficient search methods. In this work, this general approach is analyzed in the context of connected component segmentation. A specific formulation of connected component labeling, based on quasi-flat zones, allows for the addition of arbitrary segment attributes to contribute to the nature of the output. This is in addition to core tunable parameters controlling the basic nature of connected components. Additional tunable constituents may also be introduced into such a framework, allowing flexibility in the definition of connected component connectivity, either directly via defining connectivity differently or via additional processes such as data mapping functions. The relative merits of these two additional constituents, namely the addition of tunable attributes and data mapping functions, are contrasted in a general remote sensing image analysis setting. Interestingly, tunable attributes in such a context, conjectured to be safely useful in general settings, were found detrimental under cross-validated conditions. This is in addition to this constituent’s requiring substantially greater computing time. Casting connectivity definitions as a searchable component, here via the utilization of data mapping functions, proved more beneficial and robust in this context. The results suggest that further investigations into such a general framework could benefit more from focusing on the aspects of data mapping and modifiable connectivity as opposed to the utility of thresholding various geometric and spectral attributes.
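The quasi-flat-zone labeling underlying this formulation can be sketched directly. The following is a minimal illustration of the core definition only (neighboring pixels join a zone when their value difference is at most λ); it omits the tunable attributes, data mapping functions, and search machinery the paper investigates.

```python
import numpy as np
from collections import deque

def quasi_flat_zones(img, lam):
    """Label 4-connected quasi-flat zones: neighboring pixels belong to
    the same zone when their absolute value difference is <= lam.
    Note that chaining permits large total variation within one zone."""
    img = np.asarray(img)
    labels = np.full(img.shape, -1, dtype=int)
    current = 0
    for start in np.ndindex(img.shape):
        if labels[start] != -1:
            continue
        labels[start] = current
        queue = deque([start])          # breadth-first flood fill
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < img.shape[0] and 0 <= nc < img.shape[1]
                        and labels[nr, nc] == -1
                        and abs(int(img[nr, nc]) - int(img[r, c])) <= lam):
                    labels[nr, nc] = current
                    queue.append((nr, nc))
        current += 1
    return labels, current

img = np.array([[10, 11, 50],
                [10, 12, 51],
                [30, 31, 52]])
labels, n_zones = quasi_flat_zones(img, lam=2)   # three zones
```

In the paper's framework, λ is one of the core tunable parameters exposed to the search procedure, alongside the optional attribute filters and data mapping functions contrasted in the study.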

  4. The EnMAP-Box—A Toolbox and Application Programming Interface for EnMAP Data Processing

    Directory of Open Access Journals (Sweden)

    Sebastian van der Linden

    2015-09-01

Full Text Available The EnMAP-Box is a toolbox that is developed for the processing and analysis of data acquired by the German spaceborne imaging spectrometer EnMAP (Environmental Mapping and Analysis Program). It is developed with two aims in mind in order to guarantee full usage of future EnMAP data, i.e., (1) extending the EnMAP user community and (2) providing access to recent approaches for imaging spectroscopy data processing. The software is freely available and offers a range of tools and applications for the processing of spectral imagery, including classical processing tools for imaging spectroscopy data as well as powerful machine learning approaches or interfaces for the integration of methods available in scripting languages. A special developer version includes the full open source code, an application programming interface and an application wizard for easy integration and documentation of new developments. This paper gives an overview of the EnMAP-Box for users and developers, explains typical workflows along an application example and exemplifies the concept for making it a frequently used and constantly extended platform for imaging spectroscopy applications.

  5. The National Map: from geography to mapping and back again

    Science.gov (United States)

    Kelmelis, John A.; DeMulder, Mark L.; Ogrosky, Charles E.; Van Driel, J. Nicholas; Ryan, Barbara J.

    2003-01-01

    When the means of production for national base mapping were capital intensive, required large production facilities, and had ill-defined markets, Federal Government mapping agencies were the primary providers of the spatial data needed for economic development, environmental management, and national defense. With desktop geographic information systems now ubiquitous, source data available as a commodity from private industry, and the realization that many complex problems faced by society need far more and different kinds of spatial data for their solutions, national mapping organizations must realign their business strategies to meet growing demand and anticipate the needs of a rapidly changing geographic information environment. The National Map of the United States builds on a sound historic foundation of describing and monitoring the land surface and adds a focused effort to produce improved understanding, modeling, and prediction of land-surface change. These added dimensions bring to bear a broader spectrum of geographic science to address extant and emerging issues. Within the overarching construct of The National Map, the U.S. Geological Survey (USGS) is making a transition from data collector to guarantor of national data completeness; from producing paper maps to supporting an online, seamless, integrated database; and from simply describing the Nation’s landscape to linking these descriptions with increased scientific understanding. Implementing the full spectrum of geographic science addresses a myriad of public policy issues, including land and natural resource management, recreation, urban growth, human health, and emergency planning, response, and recovery. Neither these issues nor the science and technologies needed to deal with them are static. A robust research agenda is needed to understand these changes and realize The National Map vision. Initial successes have been achieved. These accomplishments demonstrate the utility of

  6. Mapping flood and flooding potential indices: a methodological approach to identifying areas susceptible to flood and flooding risk. Case study: the Prahova catchment (Romania)

    Science.gov (United States)

    Zaharia, Liliana; Costache, Romulus; Prăvălie, Remus; Ioana-Toroimac, Gabriela

    2017-04-01

Given that floods continue to cause significant human and material damage worldwide every year, flood risk mitigation is a key issue and a permanent challenge in developing policies and strategies at various spatial scales. A basic phase is therefore the elaboration of hazard and flood risk maps, documents which are an essential support for flood risk management. The aim of this paper is to develop an approach for identifying areas susceptible to flash-floods and flooding based on the computation and mapping of two indices: the FFPI (Flash-Flood Potential Index) and the FPI (Flooding Potential Index). These indices are obtained by integrating, in a GIS environment, several geographical variables which control runoff (in the case of the FFPI) and favour flooding (in the case of the FPI). The methodology was applied in the upper (mountainous) and middle (hilly) catchment of the Prahova River, a densely populated and socioeconomically well-developed area which has been affected repeatedly by water-related hazards over the past decades. The resulting maps showing the spatialization of the FFPI and FPI allow for the identification of areas with high susceptibility to flash-floods and flooding. This approach can provide useful mapped information, especially for (generally large) areas for which no flood/hazard risk maps exist. Moreover, the FFPI and FPI maps can constitute a preliminary step towards flood risk and vulnerability assessment.
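Susceptibility indices of this kind are commonly computed as a weighted per-cell combination of reclassified raster layers. A minimal sketch, with illustrative layers and weights that are not those of the Prahova study:

```python
import numpy as np

# Each input raster has been reclassified to a common 1-10 susceptibility
# scale (illustrative 2x2 grids standing in for full catchment rasters).
slope      = np.array([[9.0, 7.0], [4.0, 2.0]])
land_cover = np.array([[8.0, 6.0], [5.0, 3.0]])
soil       = np.array([[7.0, 7.0], [6.0, 4.0]])

layers  = np.stack([slope, land_cover, soil])   # shape (3, rows, cols)
weights = np.array([0.5, 0.3, 0.2])             # hypothetical; sum to 1

# Weighted per-cell average over the layer axis gives the index raster.
ffpi = np.tensordot(weights, layers, axes=1)
```

In a real GIS workflow the same arithmetic is applied to co-registered rasters of the full catchment, after which the index is classed (e.g. by quantiles) into susceptibility categories for mapping.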

  7. Inferring the most probable maps of underground utilities using Bayesian mapping model

    Science.gov (United States)

    Bilal, Muhammad; Khan, Wasiq; Muggleton, Jennifer; Rustighi, Emiliano; Jenks, Hugo; Pennock, Steve R.; Atkins, Phil R.; Cohn, Anthony

    2018-03-01

Mapping the Underworld (MTU), a major initiative in the UK, is focused on addressing the social, environmental and economic consequences of the inability to locate buried underground utilities (such as pipes and cables) by developing a multi-sensor mobile device. The aim of the MTU device is to locate different types of buried assets in real time with the use of automated data processing techniques and statutory records. The statutory records, even though typically inaccurate and incomplete, provide useful prior information on what is buried under the ground and where. However, the integration of information from multiple sensors (raw data) with these qualitative maps, and its visualization, is challenging and requires the implementation of robust machine learning/data fusion approaches. In this paper, an approach for the automated creation of revised maps was developed as a Bayesian mapping model by integrating the knowledge extracted from the sensors' raw data with the available statutory records. Statutory records were combined with hypotheses derived from the sensors to form an initial estimate of what might be found underground, and roughly where. The maps were (re)constructed using automated image segmentation techniques for hypothesis extraction and Bayesian classification techniques for segment-manhole connections. The model, consisting of an image segmentation algorithm and various Bayesian classification techniques (segment recognition and the expectation-maximization (EM) algorithm), provided robust performance on various simulated as well as real sites in terms of predicting linear/non-linear segments and constructing refined 2D/3D maps.
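The fusion of statutory-record priors with sensor evidence can be illustrated with a plain Bayes update. This is a toy sketch, not the MTU model; the hypotheses, prior, and likelihood values are invented for illustration.

```python
def posterior(prior, likelihoods):
    """Bayes update over a discrete hypothesis set.

    prior:       dict hypothesis -> P(h), e.g. from statutory records
    likelihoods: dict hypothesis -> P(sensor data | h)
    """
    unnorm = {h: prior[h] * likelihoods[h] for h in prior}
    z = sum(unnorm.values())
    return {h: v / z for h, v in unnorm.items()}

# Statutory records suggest a pipe is probably present at this location
# (prior 0.7), and the sensor response is three times as likely under
# the "pipe" hypothesis as under "no pipe".
prior = {"pipe": 0.7, "no_pipe": 0.3}
like  = {"pipe": 0.9, "no_pipe": 0.3}
post = posterior(prior, like)   # sensor evidence sharpens the prior
```

The same update, applied per extracted segment with likelihoods learned from labeled sensor data, is the essence of treating the inaccurate statutory map as a prior rather than as ground truth.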

  8. A topographical map approach to representing treatment efficacy: a focus on positive psychology interventions.

    Science.gov (United States)

    Gorlin, Eugenia I; Lee, Josephine; Otto, Michael W

    2018-01-01

A recent meta-analysis by Bolier et al. indicated that positive psychology interventions have overall small to moderate effects on well-being, but results were quite heterogeneous across intervention trials. Such meta-analytic research helps condense information on the efficacy of a broad psychosocial intervention by averaging across many effects; however, such global averages may provide limited navigational guidance for selecting among specific interventions. Here, we introduce a novel method for displaying qualitative and quantitative information on the efficacy of interventions using a topographical map approach. As an initial prototype for demonstrating this method, we mapped 50 positive psychology interventions targeting well-being (as captured in the Bolier et al. [2013] meta-analysis [Bolier, L., Haverman, M., Westerhof, G. J., Riper, H., Smit, F., & Bohlmeijer, E. (2013). Positive psychology interventions: A meta-analysis of randomized controlled studies. BMC Public Health, 13, 83]). Each intervention domain/subdomain was mapped according to its average effect size (indexed by vertical elevation), number of studies providing effect sizes (indexed by horizontal area), and therapist/client burden (indexed by shading). The geographical placement of intervention domains/subdomains was determined by their conceptual proximity, allowing viewers to gauge the general conceptual "direction" in which promising intervention effects can be found. The resulting graphical displays revealed several prominent features of the well-being intervention "landscape," such as more strongly and uniformly positive effects of future-focused interventions (including goal-pursuit and optimism training) compared to past/present-focused ones.

  9. Bioinformatics of genomic association mapping

    NARCIS (Netherlands)

    Vaez Barzani, Ahmad

    2015-01-01

    In this thesis we present an overview of bioinformatics-based approaches for genomic association mapping, with emphasis on human quantitative traits and their contribution to complex diseases. We aim to provide a comprehensive walk-through of the classic steps of genomic association mapping

  10. Universal recovery map for approximate Markov chains.

    Science.gov (United States)

    Sutter, David; Fawzi, Omar; Renner, Renato

    2016-02-01

A central question in quantum information theory is to determine how well lost information can be reconstructed. Crucially, the corresponding recovery operation should perform well without knowing the information to be reconstructed. In this work, we show that the quantum conditional mutual information measures the performance of such recovery operations. More precisely, we prove that the conditional mutual information I(A:C|B) of a tripartite quantum state ρ_ABC can be bounded from below by its distance to the closest recovered state R_{B→BC}(ρ_AB), where the C-part is reconstructed from the B-part only and the recovery map R_{B→BC} merely depends on ρ_BC. One particular application of this result implies the equivalence between two different approaches to define topological order in quantum systems.
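One standard way of writing a bound of this type, reconstructed from the abstract's description (with F denoting the fidelity and the recovery map acting on the B system only; the exact distance measure used in the paper may differ):

```latex
\[
  I(A:C|B)_{\rho} \;\geq\; -2 \log_2 F\!\left(\rho_{ABC},\,
      \mathcal{R}_{B \to BC}(\rho_{AB})\right)
\]
```

When ρ_ABC is an exact quantum Markov chain, I(A:C|B) = 0 and the inequality forces F = 1, i.e. perfect recovery of the C system from B alone; small conditional mutual information therefore certifies approximate recoverability.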

  11. A novel approach for monitoring writing interferences during navigated transcranial magnetic stimulation mappings of writing related cortical areas.

    Science.gov (United States)

    Rogić Vidaković, Maja; Gabelica, Dragan; Vujović, Igor; Šoda, Joško; Batarelo, Nikolina; Džimbeg, Andrija; Zmajević Schönwald, Marina; Rotim, Krešimir; Đogaš, Zoran

    2015-11-30

It has recently been shown that navigated repetitive transcranial magnetic stimulation (nTMS) is useful in preoperative neurosurgical mapping of motor and language brain areas. In TMS mapping of motor cortices, the evoked responses can be quantitatively monitored by electromyographic (EMG) recordings. No such setup exists for monitoring of writing during nTMS mappings of writing-related cortical areas. We present a novel approach for monitoring writing during nTMS mappings of motor writing-related cortical areas. To the best of our knowledge, this is the first demonstration of quantitative monitoring of motor evoked responses from the hand by EMG, and of pen-related activity during writing with our custom-made pen, together with the application of a chronometric TMS design and a patterned protocol of rTMS. The method was applied in four healthy subjects participating in writing during nTMS mapping of the premotor cortical area corresponding to BA 6 and close to the superior frontal sulcus. The results showed that stimulation impaired writing in all subjects. The spectral content of the measured signal related to writing movements was observed in the 0-20 Hz frequency band. Magnetic stimulation affected writing by suppressing the normal writing frequency band. The proposed setup for the monitoring of writing provides additional quantitative data for the monitoring and analysis of rTMS-induced modifications of the writing response. The setup can be useful for the investigation of the neurophysiologic mechanisms of writing, for studying the therapeutic effects of nTMS, and in preoperative mapping of language cortical areas in patients undergoing brain surgery. Copyright © 2015 Elsevier B.V. All rights reserved.
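The 0-20 Hz band analysis mentioned above can be illustrated with a basic spectral computation. This is a generic sketch on a synthetic signal, not the authors' processing pipeline; the sampling rate and signal model are invented for illustration.

```python
import numpy as np

fs = 200.0                                  # sampling rate in Hz (illustrative)
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic "writing movement" trace: a slow 5 Hz component plus noise.
signal = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)

# Power spectrum of the real-valued signal.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)

band = freqs <= 20.0                        # the 0-20 Hz writing band
fraction = spectrum[band].sum() / spectrum.sum()   # power concentrated in band
peak_freq = freqs[np.argmax(spectrum)]             # dominant movement frequency
```

A stimulation-induced suppression of the writing band would show up in such an analysis as a drop in the in-band power fraction relative to an unstimulated baseline trace.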

  12. Swath sonar mapping of Earth's submarine plate boundaries

    Science.gov (United States)

    Carbotte, S. M.; Ferrini, V. L.; Celnick, M.; Nitsche, F. O.; Ryan, W. B. F.

    2014-12-01

The recent loss of Malaysia Airlines flight MH370 in an area of the Indian Ocean where less than 5% of the seafloor is mapped with depth sounding data (Smith and Marks, EOS 2014) highlights the striking lack of detailed knowledge of the topography of the seabed for much of the world's oceans. Advances in swath sonar mapping technology over the past 30 years have led to dramatic improvements in our capability to map the seabed. However, the oceans are vast and only an estimated 10% of the seafloor has been mapped with these systems. Furthermore, the available coverage is highly heterogeneous and focused within areas of national strategic priority and community scientific interest. The major plate boundaries that encircle the globe, most of which are located in the submarine environment, have been a significant focus of marine geoscience research since the advent of swath sonar mapping. While the locations of these plate boundaries are well defined from satellite-derived bathymetry, significant regions remain unmapped at the high resolutions provided by swath sonars that are needed to study active volcanic and tectonic plate boundary processes. Within the plate interiors, some fossil plate boundary zones, major hotspot volcanoes, and other volcanic provinces have been the focus of dedicated research programs. Away from these major tectonic structures, swath mapping coverage is limited to sparse ocean transit lines which often reveal previously unknown deep-sea channels and other little-studied sedimentary structures not resolvable in existing low-resolution global compilations, highlighting the value of these data even in the tectonically quiet plate interiors. Here, we give an overview of multibeam swath sonar mapping of the major plate boundaries of the globe as extracted from public archives. Significant quantities of swath sonar data acquired from deep-sea regions are in restricted-access international archives. Open access to more of these data sets would

  13. Pressure pain sensitivity maps, self-reported musculoskeletal disorders and sickness absence among cleaners

    DEFF Research Database (Denmark)

    Binderup, Asbjørn Thalund; Holtermann, Andreas; Søgaard, Karen

    2011-01-01

    BACKGROUND: Pressure pain threshold mapping is a valuable method for the identification of distinct zones of mechanical pain sensitivity. Such an approach was applied for the first time in relation to self-reported musculoskeletal disorders and long-term sickness absence (LTSA) within the last 12 months among cleaners. METHODS: About 29 cleaners filled out a self-administered questionnaire regarding health, work-related measures and musculoskeletal disorders. Subsequently, PPTs were measured at (1) tibialis anterior (control location, 1 point), (2) the neck-shoulder (48 points) and (3) the low back regions (27 points). LTSA was defined as ten or more consecutive workdays with sick leave. RESULTS: The PPT maps revealed the spatial heterogeneity in mechanical sensitivity among cleaners. The level of pain in the neck and dominant shoulder and upper back within the last 7 days correlated …

  14. Mapping sequences by parts

    Directory of Open Access Journals (Sweden)

    Guziolowski Carito

    2007-09-01

    Full Text Available Abstract Background: We present the N-map method, a pairwise and asymmetrical approach which allows us to compare sequences by taking into account evolutionary events that produce shuffled, reversed or repeated elements. Basically, the optimal N-map of a sequence s over a sequence t is the best way of partitioning the first sequence into N parts and placing them, possibly complementary reversed, over the second sequence in order to maximize the sum of their gapless alignment scores. Results: We introduce an algorithm computing an optimal N-map with time complexity O(|s| × |t| × N) using O(|s| × |t| × N) memory space. Among all the numbers of parts taken in a reasonable range, we select the value N for which the optimal N-map has the most significant score. To evaluate this significance, we study the empirical distributions of the scores of optimal N-maps and show that they can be approximated by normal distributions with a reasonable accuracy. We test the functionality of the approach over random sequences on which we apply artificial evolutionary events. Practical Application: The method is illustrated with four case studies of pairs of sequences involving non-standard evolutionary events.
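The partition-and-place idea can be sketched as a small dynamic program. This is an illustrative simplification, not the authors' implementation: it omits the complementary-reversed placements, uses a toy +1/−1 gapless scoring, and all function names are hypothetical.

```python
def gapless_score(part, t):
    # Best gapless alignment of `part` at any offset of `t`
    # (+1 match, -1 mismatch: a toy scoring scheme).
    best = float("-inf")
    for off in range(len(t) - len(part) + 1):
        score = sum(1 if part[i] == t[off + i] else -1
                    for i in range(len(part)))
        best = max(best, score)
    return best

def optimal_n_map(s, t, n):
    # D[i][k] = best total score when s[:i] is split into k parts,
    # each placed gaplessly somewhere over t.
    NEG = float("-inf")
    D = [[NEG] * (n + 1) for _ in range(len(s) + 1)]
    D[0][0] = 0
    back = {}
    for k in range(1, n + 1):
        for i in range(k, len(s) + 1):
            for j in range(k - 1, i):
                if D[j][k - 1] == NEG:
                    continue
                cand = D[j][k - 1] + gapless_score(s[j:i], t)
                if cand > D[i][k]:
                    D[i][k] = cand
                    back[(i, k)] = j
    # Trace back the partition of s.
    parts, i = [], len(s)
    for k in range(n, 0, -1):
        j = back[(i, k)]
        parts.append(s[j:i])
        i = j
    return D[len(s)][n], parts[::-1]

score, parts = optimal_n_map("ACGTTT", "TTTACG", 2)
```

Here the best 2-map shuffles the two halves of the first sequence onto the second, illustrating how the method rewards block rearrangements that a standard global alignment would penalize.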

  15. Prediction of Poly(A) Sites by Poly(A) Read Mapping.

    Directory of Open Access Journals (Sweden)

    Thomas Bonfert

    Full Text Available RNA-seq reads containing part of the poly(A) tail of transcripts (denoted as poly(A) reads) provide the most direct evidence for the position of poly(A) sites in the genome. However, due to reduced coverage of poly(A) tails by reads, poly(A) reads are not routinely identified during RNA-seq mapping. Nevertheless, recent studies for several herpesviruses successfully employed mapping of poly(A) reads to identify herpesvirus poly(A) sites using different strategies and customized programs. To more easily allow such analyses without requiring additional programs, we integrated poly(A) read mapping and prediction of poly(A) sites into our RNA-seq mapping program ContextMap 2. The implemented approach essentially generalizes previously used poly(A) read mapping approaches and combines them with the context-based approach of ContextMap 2 to take into account information provided by other reads aligned to the same location. Poly(A) read mapping using ContextMap 2 was evaluated on real-life data from the ENCODE project and compared against a competing approach based on transcriptome assembly (KLEAT). This showed a high positive predictive value for our approach, evidenced also by the presence of poly(A) signals, and a considerably lower runtime than KLEAT. Although sensitivity is low for both methods, we show that this is in part due to a high extent of spurious results in the gold-standard set derived from RNA-PET data. Sensitivity improves for poly(A) sites of known transcripts or determined with a more specific poly(A) sequencing protocol and increases with read coverage on transcript ends. Finally, we illustrate the usefulness of the approach in a high read coverage scenario by a re-analysis of published data for herpes simplex virus 1. Thus, with current trends towards increasing sequencing depth and read length, poly(A) read mapping will prove to be increasingly useful and can now be performed automatically during RNA-seq mapping with ContextMap 2.
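The core idea of poly(A) read mapping can be sketched in a few lines. This is a toy stand-in for ContextMap 2, not its actual algorithm: exact substring search replaces real alignment, and the minimum tail length of five A's is an arbitrary assumption for illustration.

```python
def find_polya_reads(reads, reference, min_tail=5):
    # Candidate poly(A) sites: reads ending in a run of >= min_tail A's
    # whose trimmed prefix maps (substring search stands in for a real
    # aligner) to the reference; the 3' end position marks the site.
    sites = []
    for read in reads:
        trimmed = read.rstrip("A")
        tail_len = len(read) - len(trimmed)
        if tail_len < min_tail or not trimmed:
            continue
        pos = reference.find(trimmed)
        if pos != -1:
            sites.append(pos + len(trimmed))  # cleavage position
    return sites

ref = "GGCATTACCGTTGCAT"
reads = ["ACCGTTAAAAAA", "GGCATT", "TTGCATAAAAA"]
sites = find_polya_reads(reads, ref)
```

The second read carries no untemplated A tail and is therefore ignored; only reads whose trailing A run exceeds the threshold contribute candidate sites.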

  16. Defining and Classifying Interest Groups

    DEFF Research Database (Denmark)

    Baroni, Laura; Carroll, Brendan; Chalmers, Adam

    2014-01-01

    The interest group concept is defined in many different ways in the existing literature and a range of different classification schemes are employed. This complicates comparisons between different studies and their findings. One of the important tasks faced by interest group scholars engaged in large-N studies is therefore to define the concept of an interest group and to determine which classification scheme to use for different group types. After reviewing the existing literature, this article sets out to compare different approaches to defining and classifying interest groups with a sample … in the organizational attributes of specific interest group types. As expected, our comparison of coding schemes reveals a closer link between group attributes and group type in narrower classification schemes based on group organizational characteristics than those based on a behavioral definition of lobbying.

  17. Iterated-map approach to die tossing

    DEFF Research Database (Denmark)

    Feldberg, Rasmus; Szymkat, Maciej; Knudsen, Carsten

    1990-01-01

    Nonlinear dissipative mapping is applied to determine the trajectory of a two-dimensional die thrown onto an elastic table. The basins of attraction for different outcomes are obtained and their distribution in the space of initial conditions discussed. The system has certain properties in common with chaotic systems. However, a die falls to rest after a finite number of impacts, and therefore the system has a finite sensitivity to the initial conditions. Quantitative measures of this sensitivity are proposed and their variations with the initial momentum and orientation of the die investigated.

  18. Impact of visual impairment on the lives of young adults in the Netherlands: a concept-mapping approach.

    Science.gov (United States)

    Elsman, Ellen Bernadette Maria; van Rens, Gerardus Hermanus Maria Bartholomeus; van Nispen, Ruth Marie Antoinette

    2017-12-01

    While the impact of visual impairments on specific aspects of young adults' lives is well recognised, a systematic understanding of its impact on all life aspects is lacking. This study aims to provide an overview of life aspects affected by visual impairment in young adults (aged 18-25 years) using a concept-mapping approach. Visually impaired young adults (n = 22) and rehabilitation professionals (n = 16) participated in online concept-mapping workshops (brainstorm procedure), to explore how having a visual impairment influences the lives of young adults. Statements were categorised based on similarity and importance. Using multidimensional scaling, concept maps were produced and interpreted. A total of 59 and 260 statements were generated by young adults and professionals, respectively, resulting in 99 individual statements after checking and deduplication. The combined concept map revealed 11 clusters: work, study, information and regulations, social skills, living independently, computer, social relationships, sport and activities, mobility, leisure time, and hobby. The concept maps provided useful insight into activities influenced by visual impairments in young adults, which can be used by rehabilitation centres to improve their services. This might help in goal setting, rehabilitation referral and successful transition to adult life, ultimately increasing participation and quality of life. Implications for rehabilitation Having a visual impairment affects various life-aspects related to participation, including activities related to work, study, social skills and relationships, activities of daily living, leisure time and mobility. Concept-mapping helped to identify the life aspects affected by low vision, and quantify these aspects in terms of importance according to young adults and low vision rehabilitation professionals. Low vision rehabilitation centres should focus on all life aspects found in this study when identifying the needs of young

  19. Defined PEG smears as an alternative approach to enhance the search for crystallization conditions and crystal-quality improvement in reduced screens

    Energy Technology Data Exchange (ETDEWEB)

    Chaikuad, Apirat, E-mail: apirat.chaikuad@sgc.ox.ac.uk [University of Oxford, Old Road Campus Research Building, Roosevelt Drive, Headington, Oxford OX3 7DQ (United Kingdom); Knapp, Stefan [University of Oxford, Old Road Campus Research Building, Roosevelt Drive, Headington, Oxford OX3 7DQ (United Kingdom); Johann Wolfgang Goethe-University, Building N240 Room 3.03, Max-von-Laue-Strasse 9, 60438 Frankfurt am Main (Germany); Delft, Frank von, E-mail: apirat.chaikuad@sgc.ox.ac.uk [University of Oxford, Old Road Campus Research Building, Roosevelt Drive, Headington, Oxford OX3 7DQ (United Kingdom)

    2015-07-28

    An alternative strategy for PEG sampling is suggested through the use of four newly defined PEG smears to enhance chemical space in reduced screens with a benefit towards protein crystallization. The quest for an optimal limited set of effective crystallization conditions remains a challenge in macromolecular crystallography, an issue that is complicated by the large number of chemicals which have been deemed to be suitable for promoting crystal growth. The lack of rational approaches towards the selection of successful chemical space and representative combinations has led to significant overlapping conditions, which are currently present in a multitude of commercially available crystallization screens. Here, an alternative approach to the sampling of widely used PEG precipitants is suggested through the use of PEG smears, which are mixtures of different PEGs with a requirement of either neutral or cooperatively positive effects of each component on crystal growth. Four newly defined smears were classified by molecular-weight groups and enabled the preservation of specific properties related to different polymer sizes. These smears not only allowed a wide coverage of properties of these polymers, but also reduced PEG variables, enabling greater sampling of other parameters such as buffers and additives. The efficiency of the smear-based screens was evaluated on more than 220 diverse recombinant human proteins, which overall revealed a good initial crystallization success rate of nearly 50%. In addition, in several cases successful crystallizations were only obtained using PEG smears, while various commercial screens failed to yield crystals. The defined smears therefore offer an alternative approach towards PEG sampling, which will benefit the design of crystallization screens sampling a wide chemical space of this key precipitant.

  20. Viking lander tracking contributions to Mars mapping

    International Nuclear Information System (INIS)

    Michael, W.H. Jr.

    1979-01-01

    The major recent advances in planetary mapping have been accomplished through use of photography from orbiting satellites, as is the case for Mars with Mariner and Viking photographs. The requirement for greater precision demands that inputs to the photogrammetric process be more precisely defined. This paper describes how analyses of Doppler and ranging data from the Viking landers are contributing to more precise mapping of Mars in several specific areas. (Auth.)

  1. Proportional Symbol Mapping in R

    Directory of Open Access Journals (Sweden)

    Susumu Tanimura

    2006-01-01

    Full Text Available Visualization of spatial data on a map aids not only in data exploration but also in communication to impart spatial conception or ideas to others. Although recent cartographic functions in R are rapidly becoming richer, proportional symbol mapping, which is one of the common mapping approaches, has not been packaged thus far. Based on the theories of proportional symbol mapping developed in cartography, the authors developed some functions for proportional symbol mapping using R, including mathematical and perceptual scaling. An example of these functions demonstrated the new expressive power and options available in R, particularly for the visualization of conceptual point data.
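The two scaling rules mentioned, mathematical and perceptual, are easy to state outside R as well. A minimal sketch in Python, assuming a maximum symbol radius and the commonly cited Flannery exponent of about 0.5716 (both values are illustrative choices, not the package's defaults):

```python
def symbol_radii(values, r_max=20.0, perceptual=False):
    # Mathematical scaling: radius proportional to sqrt(value), so the
    # symbol *area* is proportional to the data value. Perceptual
    # (Flannery) scaling uses an exponent of ~0.5716 to exaggerate size
    # differences, compensating for viewers underestimating the
    # magnitude of larger circles.
    v_max = max(values)
    exp = 0.5716 if perceptual else 0.5
    return [r_max * (v / v_max) ** exp for v in values]

radii = symbol_radii([25, 100], r_max=20.0)
```

With mathematical scaling a value one quarter of the maximum gets half the maximum radius; Flannery scaling shrinks that smaller symbol slightly further.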

  2. Enriching the national map database for multi-scale use: Introducing the visibilityfilter attribution

    Science.gov (United States)

    Stauffer, Andrew J.; Webinger, Seth; Roche, Brittany

    2016-01-01

    The US Geological Survey’s (USGS) National Geospatial Technical Operations Center is prototyping and evaluating the ability to filter data through a range of scales using 1:24,000-scale The National Map (TNM) datasets as the source. A “VisibilityFilter” attribute that can be added to all TNM vector data themes is under evaluation; it will permit filtering of data to eight target scales between 1:24,000 and 1:5,000,000, thus defining each feature’s smallest applicable scale-of-use. For the prototype implementation, map specifications for the 1:100,000- and 1:250,000-scale USGS Topographic Map Series are being utilized to define the feature content appropriate at fixed mapping scales and to guide generalization decisions, which are documented in a ScaleMaster diagram. This paper defines the VisibilityFilter attribute, the generalization decisions made for each TNM data theme, and how these decisions are embedded into the data to support efficient data filtering.
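Filtering on such an attribute reduces to a per-feature threshold test. A sketch under an assumed encoding, where the attribute stores the denominator of the feature's smallest applicable scale; the feature names, attribute key, and values below are illustrative, not the USGS schema:

```python
def visible_at(features, scale_denominator):
    # Keep a feature if its smallest applicable scale is at least as
    # small (i.e., its denominator at least as large) as the target
    # scale's denominator.
    return [f["name"] for f in features
            if f["visibility_filter"] >= scale_denominator]

# Hypothetical features: a trail shown only at 1:24,000, a river kept
# through 1:1,000,000, an interstate kept through 1:5,000,000.
features = [
    {"name": "trail", "visibility_filter": 24_000},
    {"name": "river", "visibility_filter": 1_000_000},
    {"name": "interstate", "visibility_filter": 5_000_000},
]

shown = visible_at(features, 250_000)
```

At the 1:250,000 target scale only the river and the interstate survive the filter; at 1:24,000 all three features would be drawn.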

  3. Mapping social values of ecosystem services: What is behind the map?

    Directory of Open Access Journals (Sweden)

    Laura Nahuelhual

    2016-09-01

    Full Text Available A growing interest in mapping the social value of ecosystem services (ES) is not yet methodologically aligned with what is actually being mapped. We critically examine aspects of the social value mapping process that might influence map outcomes and limit their practical use in decision making. We rely on an empirical case of participatory mapping, for a single ES (recreation opportunities), which involves diverse stakeholders such as planners, researchers, and community representatives. Value elicitation relied on an individual open-ended interview and a mapping exercise. Interpretation of the narratives and GIS calculations of proximity, centrality, and dispersion helped in exploring the factors driving participants' answers. Narratives reveal diverse value types. Whereas planners highlighted utilitarian and aesthetic values, the answers from researchers revealed naturalistic values as well. In turn, community representatives acknowledged symbolic values. When transferred to the map, these values were constrained to statements about a much narrower set of features of the physical (e.g., volcanoes) and built (e.g., roads) landscape. The results suggest that mapping, as an instrumental approach toward social valuation, may capture only a subset of relevant assigned values. This outcome is the interplay between participants' characteristics, including their acquaintance with the territory and their ability with maps, and the mapping procedure itself, including the proxies used to represent the ES and the value typology chosen, the elicitation question, the cartographic features displayed on the base map, and the spatial scale.

  4. [The intervention mapping protocol: A structured process to develop, implement and evaluate health promotion programs].

    Science.gov (United States)

    Fassier, J-B; Lamort-Bouché, M; Sarnin, P; Durif-Bruckert, C; Péron, J; Letrilliart, L; Durand, M-J

    2016-02-01

    Health promotion programs are expected to improve population health and reduce social inequalities in health. However, their theoretical foundations are frequently ill-defined, and their implementation faces many obstacles. The aim of this article is to describe the intervention mapping protocol for health promotion program planning, used recently in several countries. The challenges of planning health promotion programs are presented, and the six steps of the intervention mapping protocol are described with an example. Based on a literature review, the use of this protocol, its requirements and potential limitations are discussed. The intervention mapping protocol has four essential characteristics: an ecological perspective (person-environment), a participative approach, the use of theoretical models in human and social sciences, and the use of scientific evidence. It comprises six steps: conduct a health needs assessment; define change objectives; select theory-based change techniques and practical applications; organize techniques and applications into an intervention program (logic model); plan for program adoption, implementation, and sustainability; and generate an evaluation plan. This protocol has been used in different countries and domains such as obesity, tobacco, physical activity, cancer and occupational health. Although its utilization requires resources and a critical stance, this protocol has been used to develop interventions whose efficacy was demonstrated. The intervention mapping protocol is an integrated process that fits the scientific and practical challenges of health promotion. It could be tested in France as it was used in other countries, in particular to reduce social inequalities in health. Copyright © 2016. Published by Elsevier Masson SAS.

  5. A Rule-Based Spatial Reasoning Approach for OpenStreetMap Data Quality Enrichment; Case Study of Routing and Navigation

    Science.gov (United States)

    2017-01-01

    Finding relevant geospatial information is increasingly critical because of the growing volume of geospatial data available within the emerging “Big Data” era. Users are expecting that the availability of massive datasets will create more opportunities to uncover hidden information and answer more complex queries. This is especially the case with routing and navigation services, where the ability to retrieve points of interest and landmarks makes the routing service personalized, precise, and relevant. In this paper, we propose a new geospatial information approach that enables the retrieval of implicit information, i.e., geospatial entities that do not exist explicitly in the available source. We present an information broker that uses a rule-based spatial reasoning algorithm to detect topological relations. The information broker is embedded into a framework where annotations and mappings between OpenStreetMap data attributes and external resources, such as taxonomies, support the enrichment of queries to improve the ability of the system to retrieve information. Our method is tested with two case studies that lead to enriching the completeness of OpenStreetMap data with footway crossing points of interest as well as building entrances for routing and navigation purposes. It is concluded that the proposed approach can uncover implicit entities and contribute to extracting required information from the existing datasets. PMID:29088125

  6. A Rule-Based Spatial Reasoning Approach for OpenStreetMap Data Quality Enrichment; Case Study of Routing and Navigation

    Directory of Open Access Journals (Sweden)

    Amin Mobasheri

    2017-10-01

    Full Text Available Finding relevant geospatial information is increasingly critical because of the growing volume of geospatial data available within the emerging “Big Data” era. Users are expecting that the availability of massive datasets will create more opportunities to uncover hidden information and answer more complex queries. This is especially the case with routing and navigation services, where the ability to retrieve points of interest and landmarks makes the routing service personalized, precise, and relevant. In this paper, we propose a new geospatial information approach that enables the retrieval of implicit information, i.e., geospatial entities that do not exist explicitly in the available source. We present an information broker that uses a rule-based spatial reasoning algorithm to detect topological relations. The information broker is embedded into a framework where annotations and mappings between OpenStreetMap data attributes and external resources, such as taxonomies, support the enrichment of queries to improve the ability of the system to retrieve information. Our method is tested with two case studies that lead to enriching the completeness of OpenStreetMap data with footway crossing points of interest as well as building entrances for routing and navigation purposes. It is concluded that the proposed approach can uncover implicit entities and contribute to extracting required information from the existing datasets.
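A rule-based topological check of the kind described, for example deciding whether a candidate entrance point lies inside a building footprint, can be illustrated with a ray-casting point-in-polygon predicate. This is a generic sketch of one topological relation test, not the paper's information broker:

```python
def point_in_polygon(pt, polygon):
    # Ray-casting test: count how many polygon edges a horizontal ray
    # from `pt` crosses; an odd count means the point is inside.
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's y level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical building footprint and candidate entrance coordinates.
building = [(0, 0), (4, 0), (4, 3), (0, 3)]
entrance_inside = point_in_polygon((1, 1), building)
stray_point = point_in_polygon((5, 1), building)
```

A reasoning rule could then assert, say, "a node tagged as an entrance that falls inside a building polygon is that building's entrance", turning a geometric predicate into an enrichment rule.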

  7. SU-F-T-301: Planar Dose Pass Rate Inflation Due to the MapCHECK Measurement Uncertainty Function

    International Nuclear Information System (INIS)

    Bailey, D; Spaans, J; Kumaraswamy, L; Podgorsak, M

    2016-01-01

    Purpose: To quantify the effect of the Measurement Uncertainty function on planar dosimetry pass rates, as analyzed with Sun Nuclear Corporation analytic software (“MapCHECK” or “SNC Patient”). This optional function is toggled on by default upon software installation, and automatically increases the user-defined dose percent difference (%Diff) tolerance for each planar dose comparison. Methods: Dose planes from 109 IMRT fields and 40 VMAT arcs were measured with the MapCHECK 2 diode array, and compared to calculated planes from a commercial treatment planning system. Pass rates were calculated within the SNC analytic software using varying calculation parameters, including Measurement Uncertainty on and off. By varying the %Diff criterion for each dose comparison performed with Measurement Uncertainty turned off, an effective %Diff criterion was defined for each field/arc corresponding to the pass rate achieved with MapCHECK Uncertainty turned on. Results: For 3%/3mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.8–1.1% average, depending on plan type and calculation technique, for an average pass rate increase of 1.0–3.5% (maximum +8.7%). For 2%, 2 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.7–1.2% average, for an average pass rate increase of 3.5–8.1% (maximum +14.2%). The largest increases in pass rate are generally seen with poorly-matched planar dose comparisons; the MapCHECK Uncertainty effect is markedly smaller as pass rates approach 100%. Conclusion: The Measurement Uncertainty function may substantially inflate planar dose comparison pass rates for typical IMRT and VMAT planes. The types of uncertainties incorporated into the function (and their associated quantitative estimates) as described in the software user’s manual may not accurately estimate realistic measurement uncertainty for the user’s measurement conditions. Pass rates listed in published
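The effect described can be reproduced with a toy dose-difference comparison: widening the user-defined %Diff tolerance by about 1% (as the Measurement Uncertainty function effectively does) raises the pass rate. This sketch ignores the distance-to-agreement component of a real gamma-style analysis, and the dose values are invented:

```python
def pass_rate(measured, calculated, pct_tol):
    # Fraction of points whose percent difference (relative to the
    # calculated dose) falls within pct_tol. Dose-difference only; a
    # real planar analysis also applies a distance-to-agreement test.
    passed = sum(
        1 for m, c in zip(measured, calculated)
        if abs(m - c) / c * 100.0 <= pct_tol
    )
    return 100.0 * passed / len(measured)

calc = [100.0] * 5
meas = [101.0, 102.5, 103.5, 99.0, 96.5]

rate_3pct = pass_rate(meas, calc, 3.0)           # user-defined 3% tolerance
rate_widened = pass_rate(meas, calc, 3.0 + 1.0)  # ~1% "uncertainty" added
```

Points sitting just outside the nominal tolerance flip to passing once the tolerance is silently widened, which is exactly why poorly matched planes show the largest inflation.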

  8. SU-F-T-301: Planar Dose Pass Rate Inflation Due to the MapCHECK Measurement Uncertainty Function

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, D [Northside Hospital Cancer Institute, Atlanta, GA (United States); Spaans, J; Kumaraswamy, L; Podgorsak, M [Roswell Park Cancer Institute, Buffalo, NY (United States)

    2016-06-15

    Purpose: To quantify the effect of the Measurement Uncertainty function on planar dosimetry pass rates, as analyzed with Sun Nuclear Corporation analytic software (“MapCHECK” or “SNC Patient”). This optional function is toggled on by default upon software installation, and automatically increases the user-defined dose percent difference (%Diff) tolerance for each planar dose comparison. Methods: Dose planes from 109 IMRT fields and 40 VMAT arcs were measured with the MapCHECK 2 diode array, and compared to calculated planes from a commercial treatment planning system. Pass rates were calculated within the SNC analytic software using varying calculation parameters, including Measurement Uncertainty on and off. By varying the %Diff criterion for each dose comparison performed with Measurement Uncertainty turned off, an effective %Diff criterion was defined for each field/arc corresponding to the pass rate achieved with MapCHECK Uncertainty turned on. Results: For 3%/3mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.8–1.1% average, depending on plan type and calculation technique, for an average pass rate increase of 1.0–3.5% (maximum +8.7%). For 2%, 2 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.7–1.2% average, for an average pass rate increase of 3.5–8.1% (maximum +14.2%). The largest increases in pass rate are generally seen with poorly-matched planar dose comparisons; the MapCHECK Uncertainty effect is markedly smaller as pass rates approach 100%. Conclusion: The Measurement Uncertainty function may substantially inflate planar dose comparison pass rates for typical IMRT and VMAT planes. The types of uncertainties incorporated into the function (and their associated quantitative estimates) as described in the software user’s manual may not accurately estimate realistic measurement uncertainty for the user’s measurement conditions. Pass rates listed in published

  9. Algorithms for mapping high-throughput DNA sequences

    DEFF Research Database (Denmark)

    Frellsen, Jes; Menzel, Peter; Krogh, Anders

    2014-01-01

    … of data generation, new bioinformatics approaches have been developed to cope with the large amount of sequencing reads obtained in these experiments. In this chapter, we first introduce HTS technologies and their usage in molecular biology and discuss the problem of mapping sequencing reads to their genomic origin. We then describe in detail two approaches that offer very fast heuristics to solve the mapping problem in a feasible runtime. In particular, we describe the BLAT algorithm, and we give an introduction to the Burrows-Wheeler Transform and the mapping algorithms based on this transformation.
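The Burrows-Wheeler Transform mentioned above, and the backward search that read mappers build on it, can be sketched compactly. This illustrative version uses naive O(n) rank queries in place of the precomputed FM-index tables that real mappers rely on:

```python
def bwt(text):
    # Burrows-Wheeler Transform: last column of the sorted rotations.
    text += "$"  # unique end-of-string sentinel
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(r[-1] for r in rotations)

def count_occurrences(bwt_str, pattern):
    # Backward search: shrink the half-open interval [lo, hi) of sorted
    # rotations prefixed by the growing suffix of the pattern.
    first_col = sorted(bwt_str)

    def c(ch):  # rotations starting with a character smaller than ch
        return first_col.index(ch) if ch in first_col else len(first_col)

    def rank(ch, i):  # occurrences of ch in bwt_str[:i]
        return bwt_str.count(ch, 0, i)

    lo, hi = 0, len(bwt_str)
    for ch in reversed(pattern):
        lo = c(ch) + rank(ch, lo)
        hi = c(ch) + rank(ch, hi)
        if lo >= hi:
            return 0
    return hi - lo

transform = bwt("banana")
hits = count_occurrences(transform, "ana")
```

Each backward-search step costs only a pair of rank queries, which is why FM-index mappers can count pattern occurrences without ever reconstructing the text.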

  10. Mapping ecosystem types by means of ecological species groups

    NARCIS (Netherlands)

    Witte, J.P.M.; Meijden, van der R.

    2000-01-01

    A method is presented to deduce nation-wide maps of ecosystem types from FLORBASE. This national database contains data, per km2, on the presence of indigenous plant species that grow in the wild. The ecosystem types on the maps are defined on the basis of abiotic factors that determine the plant

  11. Combined visualization for noise mapping of industrial facilities based on ray-tracing and thin plate splines

    Science.gov (United States)

    Ovsiannikov, Mikhail; Ovsiannikov, Sergei

    2017-01-01

    The paper presents a combined approach to noise mapping and visualization of industrial facilities' sound pollution using a forward ray-tracing method and thin-plate spline interpolation. It is suggested to cluster the industrial area into separate zones with similar sound levels. An equivalent local source is defined for range computation of sanitary zones based on a ray-tracing algorithm. Computation of sound pressure levels within the clustered zones is based on two-dimensional spline interpolation of data measured on the perimeter and inside each zone.

  12. MareyMap Online: A User-Friendly Web Application and Database Service for Estimating Recombination Rates Using Physical and Genetic Maps.

    Science.gov (United States)

    Siberchicot, Aurélie; Bessy, Adrien; Guéguen, Laurent; Marais, Gabriel A B

    2017-10-01

    Given the importance of meiotic recombination in biology, there is a need to develop robust methods to estimate meiotic recombination rates. A popular approach, called the Marey map approach, relies on comparing genetic and physical maps of a chromosome to estimate local recombination rates. In the past, we have implemented this approach in an R package called MareyMap, which includes many functionalities useful to get reliable recombination rate estimates in a semi-automated way. MareyMap has been used repeatedly in studies looking at the effect of recombination on genome evolution. Here, we propose a simpler user-friendly web service version of MareyMap, called MareyMap Online, which allows a user to get recombination rates from her/his own data or from a publicly available database that we offer in a few clicks. When the analysis is done, the user is asked whether her/his curated data can be placed in the database and shared with other users, which we hope will make meta-analysis on recombination rates including many species easy in the future. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
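The Marey map idea, recombination rate as the local slope of genetic position against physical position, can be illustrated with simple finite differences. MareyMap itself fits smoothing curves (for example loess or splines) to real marker data; the markers below are invented:

```python
def local_recombination_rates(physical_mb, genetic_cm):
    # Marey map principle: plot genetic position (cM) against physical
    # position (Mb); the local slope is the recombination rate in
    # cM/Mb. Finite differences between consecutive markers stand in
    # for the smoothing that MareyMap applies to noisy real data.
    rates = []
    for i in range(1, len(physical_mb)):
        d_phys = physical_mb[i] - physical_mb[i - 1]
        d_gen = genetic_cm[i] - genetic_cm[i - 1]
        rates.append(d_gen / d_phys)
    return rates

phys = [0.0, 10.0, 20.0, 30.0]  # marker positions on the physical map, Mb
gen = [0.0, 5.0, 25.0, 27.0]    # the same markers on the genetic map, cM
rates = local_recombination_rates(phys, gen)
```

The middle interval's steep slope marks a recombination hotspot, while the flat final interval suggests a recombination-poor region such as a centromere's surroundings.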

  13. A unified modeling approach for physical experiment design and optimization in laser driven inertial confinement fusion

    Energy Technology Data Exchange (ETDEWEB)

    Li, Haiyan [Mechatronics Engineering School of Guangdong University of Technology, Guangzhou 510006 (China); Huang, Yunbao, E-mail: Huangyblhy@gmail.com [Mechatronics Engineering School of Guangdong University of Technology, Guangzhou 510006 (China); Jiang, Shaoen, E-mail: Jiangshn@vip.sina.com [Laser Fusion Research Center, China Academy of Engineering Physics, Mianyang 621900 (China); Jing, Longfei, E-mail: scmyking_2008@163.com [Laser Fusion Research Center, China Academy of Engineering Physics, Mianyang 621900 (China); Tianxuan, Huang; Ding, Yongkun [Laser Fusion Research Center, China Academy of Engineering Physics, Mianyang 621900 (China)

    2015-11-15

    Highlights: • A unified modeling approach for physical experiment design is presented. • Any laser facility can be flexibly defined and included with two scripts. • Complex targets and laser beams can be parametrically modeled for optimization. • Automatically mapping of laser beam energy facilitates targets shape optimization. - Abstract: Physical experiment design and optimization is very essential for laser driven inertial confinement fusion due to the high cost of each shot. However, only limited experiments with simple structure or shape on several laser facilities can be designed and evaluated in available codes, and targets are usually defined by programming, which may lead to it difficult for complex shape target design and optimization on arbitrary laser facilities. A unified modeling approach for physical experiment design and optimization on any laser facilities is presented in this paper. Its core idea includes: (1) any laser facility can be flexibly defined and included with two scripts, (2) complex shape targets and laser beams can be parametrically modeled based on features, (3) an automatically mapping scheme of laser beam energy onto discrete mesh elements of targets enable targets or laser beams be optimized without any additional interactive modeling or programming, and (4) significant computation algorithms are additionally presented to efficiently evaluate radiation symmetry on the target. Finally, examples are demonstrated to validate the significance of such unified modeling approach for physical experiments design and optimization in laser driven inertial confinement fusion.

  14. Local activation time sampling density for atrial tachycardia contact mapping: how much is enough?

    Science.gov (United States)

    Williams, Steven E; Harrison, James L; Chubb, Henry; Whitaker, John; Kiedrowicz, Radek; Rinaldi, Christopher A; Cooklin, Michael; Wright, Matthew; Niederer, Steven; O'Neill, Mark D

    2018-02-01

    Local activation time (LAT) mapping forms the cornerstone of atrial tachycardia diagnosis. Although anatomic and positional accuracy of electroanatomic mapping (EAM) systems have been validated, the effect of electrode sampling density on LAT map reconstruction is not known. Here, we study the effect of chamber geometry and activation complexity on optimal LAT sampling density using a combined in silico and in vivo approach. In vivo 21 atrial tachycardia maps were studied in three groups: (1) focal activation, (2) macro-re-entry, and (3) localized re-entry. In silico activation was simulated on a 4×4cm atrial monolayer, sampled randomly at 0.25-10 points/cm2 and used to re-interpolate LAT maps. Activation patterns were studied in the geometrically simple porcine right atrium (RA) and complex human left atrium (LA). Activation complexity was introduced into the porcine RA by incomplete inter-caval linear ablation. In all cases, optimal sampling density was defined as the highest density resulting in minimal further error reduction in the re-interpolated maps. Optimal sampling densities for LA tachycardias were 0.67 ± 0.17 points/cm2 (focal activation), 1.05 ± 0.32 points/cm2 (macro-re-entry) and 1.23 ± 0.26 points/cm2 (localized re-entry), P = 0.0031. Increasing activation complexity was associated with increased optimal sampling density both in silico (focal activation 1.09 ± 0.14 points/cm2; re-entry 1.44 ± 0.49 points/cm2; spiral-wave 1.50 ± 0.34 points/cm2, P density (0.61 ± 0.22 points/cm2 vs. 1.0 ± 0.34 points/cm2, P = 0.0015). Optimal sampling densities can be identified to maximize diagnostic yield of LAT maps. Greater sampling density is required to correctly reveal complex activation and represent activation across complex geometries. Overall, the optimal sampling density for LAT map interpolation defined in this study was ∼1.0-1.5 points/cm2. Published on behalf of the European Society of
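
    The in silico part of this workflow, sampling an activation map at a given density and re-interpolating the LAT map, can be sketched as follows (a minimal sketch: the focal activation pattern, grid resolution and conduction parameters are illustrative, not the paper's monolayer simulation):

    ```python
    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(0)

    # Ground truth: focal activation spreading radially from the centre of a
    # 4 x 4 cm monolayer (the conduction velocity here is illustrative).
    x, y = np.meshgrid(np.linspace(0, 4, 80), np.linspace(0, 4, 80))
    lat_true = 12.5 * np.hypot(x - 2.0, y - 2.0)        # LAT in ms

    def reinterpolation_rmse(density):
        """Sample at `density` points/cm2, re-interpolate, and return RMSE."""
        n = max(4, int(density * 16))                   # 16 cm2 mapping area
        px, py = rng.uniform(0, 4, n), rng.uniform(0, 4, n)
        lat_hat = griddata((px, py), 12.5 * np.hypot(px - 2.0, py - 2.0),
                           (x, y), method="linear")
        inside = ~np.isnan(lat_hat)                     # NaN outside sample hull
        return float(np.sqrt(np.mean((lat_hat[inside] - lat_true[inside]) ** 2)))

    errors = {d: reinterpolation_rmse(d) for d in (0.25, 1.0, 4.0)}
    for d, e in errors.items():
        print(f"{d:5.2f} points/cm2 -> re-interpolation RMSE {e:.3f} ms")
    ```

    As in the study, the re-interpolation error falls as sampling density rises, and an "optimal" density can be read off where further sampling stops reducing the error.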

  15. Soil organic carbon content assessment in a heterogeneous landscape: comparison of digital soil mapping and visible and near Infrared spectroscopy approaches

    Science.gov (United States)

    Michot, Didier; Fouad, Youssef; Pascal, Pichelin; Viaud, Valérie; Soltani, Inès; Walter, Christian

    2017-04-01

    The aims of this study are: i) to assess the SOC content distribution according to the Global Soil Map (GSM) project recommendations in a heterogeneous landscape; ii) to compare the prediction performance of digital soil mapping (DSM) and visible and near-infrared (Vis-NIR) spectroscopy approaches. The study area of 140 ha, located at Plancoët, surrounds the unique mineral spring water of Brittany (Western France). It is a hillock characterized by a heterogeneous landscape mosaic with different types of forest, permanent pastures and wetlands along a small coastal river. We acquired two independent datasets: j) 50 points selected using conditioned Latin hypercube sampling (cLHS); jj) 254 points corresponding to the GSM grid. Soil samples were collected in three layers (0-5, 20-25 and 40-50 cm) for both sampling strategies. SOC content was measured only in the cLHS soil samples, while Vis-NIR spectra were measured on all the collected samples. For the DSM approach, a machine-learning algorithm (Cubist) was applied to the cLHS calibration data to build rule-based models linking soil carbon content in the different layers with environmental covariates derived from a digital elevation model, geological variables, land use data and existing large-scale soil maps. For the spectroscopy approach, we used two calibration datasets: k) the local cLHS; kk) a subset selected from the regional spectral database of Brittany after a PCA with hierarchical clustering analysis, spiked with local cLHS spectra. The PLS regression algorithm with leave-one-out cross-validation was performed for both calibration datasets. SOC contents for the 3 layers of the GSM grid were predicted using the different approaches and compared with each other. Their prediction performance was evaluated by the following parameters: R2, RMSE and RPD. Both approaches led to satisfactory predictions of SOC content, with an advantage for the spectral approach, particularly as regards the pertinence of the variation
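
    The spectroscopy calibration step, PLS regression with leave-one-out cross-validation scored by RMSE and RPD, can be sketched with scikit-learn (all data below are simulated stand-ins for the Vis-NIR measurements, not the study's spectra):

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import LeaveOneOut

    rng = np.random.default_rng(1)

    # Synthetic stand-in for Vis-NIR spectra: 50 cLHS samples x 200 wavelengths,
    # with SOC content encoded in two absorption features plus noise.
    n_samples, n_bands = 50, 200
    soc = rng.uniform(5.0, 60.0, n_samples)            # g/kg, illustrative range
    spectra = rng.normal(0.0, 0.01, (n_samples, n_bands))
    spectra[:, 40] += 0.002 * soc
    spectra[:, 120] -= 0.001 * soc

    # Leave-one-out cross-validation of a PLS regression, as in the paper.
    predicted = np.empty(n_samples)
    for train, test in LeaveOneOut().split(spectra):
        pls = PLSRegression(n_components=5)
        pls.fit(spectra[train], soc[train])
        predicted[test] = pls.predict(spectra[test]).ravel()

    rmse = float(np.sqrt(np.mean((predicted - soc) ** 2)))
    rpd = float(np.std(soc, ddof=1) / rmse)            # RPD = SD(y) / RMSE
    print(f"LOO RMSE = {rmse:.2f} g/kg, RPD = {rpd:.2f}")
    ```

    RPD (the ratio of the standard deviation of the reference values to the RMSE) is a common yardstick for spectroscopic calibrations: values above ~2 are usually considered good quantitative models.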

  16. Cost-Effectiveness of Seven Approaches to Map Vegetation Communities — A Case Study from Northern Australia’s Tropical Savannas

    Directory of Open Access Journals (Sweden)

    Stuart Phinn

    2013-01-01

    Full Text Available Vegetation communities are traditionally mapped from aerial photography interpretation. Other semi-automated methods include pixel- and object-based image analysis. While these methods have been used for decades, there is a lack of comparative research. We evaluated the cost-effectiveness of seven approaches to map vegetation communities in a tropical savanna environment in northern Australia. The seven approaches included: (1) aerial photography interpretation; (2) pixel-based image-only classification (Maximum Likelihood Classifier); (3) pixel-based integrated classification (Maximum Likelihood Classifier); (4) object-based image-only classification (nearest neighbor classifier); (5) object-based integrated classification (nearest neighbor classifier); (6) object-based image-only classification (step-wise ruleset); and (7) object-based integrated classification (step-wise ruleset). Approach 1 was applied to 1:50,000 aerial photography and approaches 2-7 were applied to SPOT5 and Landsat5 TM multispectral data. The integrated approaches (3, 5 and 7) included ancillary data (a digital elevation model, slope model, normalized difference vegetation index and hydrology information). The cost-effectiveness was assessed taking into consideration the accuracy and costs associated with each classification approach and image dataset. Accuracy was assessed in terms of overall accuracy and the costs were evaluated using four main components: field data acquisition and preparation, image data acquisition and preparation, image classification and accuracy assessment. Overall accuracy ranged from 28%, for the image-only pixel-based approach, to 67%, for the aerial photography interpretation, while total costs ranged from AU$338,000 to AU$388,180 (Australian dollars) for the pixel-based image-only classification and aerial photography interpretation respectively. The most labor-intensive component was field data acquisition and preparation, followed by image data

  17. Collaborative and Multilingual Approach to Learn Database Topics Using Concept Maps

    Science.gov (United States)

    Calvo, Iñaki

    2014-01-01

    Authors report on a study using the concept mapping technique in computer engineering education for learning theoretical introductory database topics. In addition, the learning of multilingual technical terminology by means of the collaborative drawing of a concept map is also pursued in this experiment. The main characteristics of a study carried out in the database subject at the University of the Basque Country during the 2011/2012 academic year are described. This study contributes to the field of concept mapping, as these kinds of cognitive tools have proved to be valid for supporting learning in computer engineering education. It contributes to the field of computer engineering education by providing a technique that can be incorporated with several educational purposes within the discipline. Results reveal the potential that a collaborative concept map editor offers to fulfil the above-mentioned objectives. PMID:25538957

  18. A concept mapping approach to guide and understand dissemination and implementation.

    Science.gov (United States)

    Green, Amy E; Fettes, Danielle L; Aarons, Gregory A

    2012-10-01

    Many efforts to implement evidence-based programs do not reach their full potential or fail due to the variety of challenges inherent in dissemination and implementation. This article describes the use of concept mapping (a mixed-method strategy) to study implementation of behavioral health innovations and evidence-based practice (EBP). The application of concept mapping to implementation research represents a practical and concise way to identify and quantify factors affecting implementation, develop conceptual models of implementation, target areas to address as part of implementation readiness and active implementation, and foster communication among stakeholders. Concept mapping is described and a case example is provided to illustrate its use in an implementation study. Implications for the use of concept mapping methods in both research and applied settings towards the dissemination and implementation of behavioral health services are discussed.

  19. Wild boar mapping using population-density statistics: From polygons to high resolution raster maps.

    Science.gov (United States)

    Pittiglio, Claudia; Khomenko, Sergei; Beltran-Alcrudo, Daniel

    2018-01-01

    The wild boar is an important crop raider as well as a reservoir and agent of spread of swine diseases. Due to increasing densities and expanding ranges worldwide, the related economic losses in livestock and agricultural sectors are significant and on the rise. Its management and control would strongly benefit from accurate and detailed spatial information on species distribution and abundance, which are often available only for small areas. Data are commonly available at aggregated administrative units with little or no information about the distribution of the species within the unit. In this paper, a four-step geostatistical downscaling approach is presented and used to disaggregate wild boar population density statistics from administrative units of different shape and size (polygons) to 5 km resolution raster maps by incorporating auxiliary fine scale environmental variables. 1) First a stratification method was used to define homogeneous bioclimatic regions for the analysis; 2) Under a geostatistical framework, the wild boar densities at administrative units, i.e. subnational areas, were decomposed into trend and residual components for each bioclimatic region. Quantitative relationships between wild boar data and environmental variables were estimated through multiple regression and used to derive trend components at 5 km spatial resolution. Next, the residual components (i.e., the differences between the trend components and the original wild boar data at administrative units) were downscaled at 5 km resolution using area-to-point kriging. The trend and residual components obtained at 5 km resolution were finally added to generate fine scale wild boar estimates for each bioclimatic region. 3) These maps were then mosaicked to produce a final output map of predicted wild boar densities across most of Eurasia. 4) Model accuracy was assessed at each different step using input as well as independent data. 
We discuss advantages and limits of the method and its
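
    The core of step 2 above, a fine-scale trend fitted by regression plus a disaggregated residual that preserves the reported unit totals, can be sketched in a toy setting (a uniform residual spread stands in for the paper's area-to-point kriging, and all values are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy setting: 4 administrative units, each covering four 5 km cells, with a
    # fine-scale environmental covariate (e.g. forest cover) known per cell.
    units = np.repeat(np.arange(4), 4)
    covariate = rng.uniform(0.0, 1.0, 16)
    unit_density = np.array([2.0, 5.0, 7.0, 11.0])   # reported densities per unit

    # Trend: regress unit densities on unit-mean covariates (a simple linear
    # model stands in for the paper's multiple regression on many variables).
    cov_unit = np.array([covariate[units == u].mean() for u in range(4)])
    slope, intercept = np.polyfit(cov_unit, unit_density, 1)
    trend = intercept + slope * covariate            # trend at 5 km resolution

    # Residual: unit value minus unit-mean trend; spread uniformly here, whereas
    # the paper downscales residuals with area-to-point kriging.
    trend_unit = np.array([trend[units == u].mean() for u in range(4)])
    residual_unit = unit_density - trend_unit
    fine = trend + residual_unit[units]

    # Mass preservation: fine-scale estimates average back to the reported values.
    recovered = np.array([fine[units == u].mean() for u in range(4)])
    print(np.round(recovered, 6))
    ```

    The coherence property, that the downscaled cells average back to the original administrative-unit values, holds by construction and is the reason the residual component is carried separately from the trend.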

  20. A national scale flood hazard mapping methodology: The case of Greece - Protection and adaptation policy approaches.

    Science.gov (United States)

    Kourgialas, Nektarios N; Karatzas, George P

    2017-12-01

    The present work introduces a national-scale flood hazard assessment methodology using multi-criteria analysis and artificial neural network (ANN) techniques in a GIS environment. The proposed methodology was applied in Greece, where flash floods are a relatively frequent phenomenon that has become more intense over the last decades, causing significant damage in rural and urban sectors. In order to identify the areas most prone to flooding, seven factor maps directly related to flood generation were combined in a GIS environment. These factor maps are: a) the Flow accumulation (F), b) the Land use (L), c) the Altitude (A), d) the Slope (S), e) the soil Erodibility (E), f) the Rainfall intensity (R), and g) the available water Capacity (C), giving the proposed method its name, "FLASERC". The flood hazard for each of these factors is classified into five categories: very low, low, moderate, high, and very high. The above factors are combined and processed using the appropriate ANN algorithm tool. For the ANN training process, the spatial distribution of historical flooded points in Greece within the five flood hazard categories of the aforementioned seven factor maps was used. In this way, the overall flood hazard map for Greece was determined. The final results are verified using additional historical flood events that have occurred in Greece over the last 100 years. In addition, an overview of flood protection measures and adaptation policy approaches is proposed for agricultural and urban areas located in very high flood hazard zones. Copyright © 2017 Elsevier B.V. All rights reserved.
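
    The classification step, feeding the five-category factor scores into an ANN trained on historical flood points, can be sketched with scikit-learn (the factor values and label rule below are synthetic stand-ins, not the FLASERC data):

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(3)

    # Synthetic stand-in for the seven FLASERC factors (F, L, A, S, E, R, C),
    # each cell already scored 1-5 (very low .. very high hazard).
    n_cells = 600
    factors = rng.integers(1, 6, size=(n_cells, 7)).astype(float)
    # Toy label rule: a cell is historically "flooded" when its mean score is high.
    flooded = (factors.mean(axis=1) > 3.2).astype(int)

    ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    ann.fit(factors, flooded)
    print(f"training accuracy: {ann.score(factors, flooded):.2f}")
    ```

    In the real workflow the trained network is then applied to every grid cell of the seven factor maps to produce the national hazard surface.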

  1. A semi-automated approach for mapping geomorphology of El Bardawil Lake, Northern Sinai, Egypt, using integrated remote sensing and GIS techniques

    Directory of Open Access Journals (Sweden)

    Nabil Sayed Embabi

    2014-06-01

    Full Text Available Among the other coastal lakes of the Mediterranean northern coast of Egypt, Bardawil Lake is a unique lagoon, as it is fed only by seawater. The lagoon is composed of two main basins, and several other internal small basins interconnected to one another. Although the general geomorphologic characteristics are treated in some regional studies, we used a semi-automated approach based on a wide variety of digital image processing for mapping the major geomorphological landforms of the lake on a medium scale of 1:250,000. The approach is based primarily on data fusion of Landsat ETM+ imagery, and validated by other ancillary spatial data (e.g. topographic maps, Google images and GPS in situ data). Interpretations of high-resolution space images by Google Earth and the large-scale topographic maps (1:25,000), in particular, revealed new microforms and some detailed geomorphologic aspects with the aid of GPS measurements. Small sand barriers, submerged sand dunes, tidal channels, fans and flats, and micro-lagoons are the recurrent forms in the lake. The approach used in this study could be widely applied to study the low-lying coastal lands along the Nile Delta. However, it is concluded from geological data and geomorphologic aspects that Bardawil Lake is of tectonic origin; it was much deeper than it is currently, and has been filled with sediments mostly since the Flandrian transgression (∼8-6 ka BP).

  2. Data Mining Approaches for Landslide Susceptibility Mapping in Umyeonsan, Seoul, South Korea

    Directory of Open Access Journals (Sweden)

    Sunmin Lee

    2017-07-01

    Full Text Available The application of data mining models has become increasingly popular in recent years in assessments of a variety of natural hazards such as landslides and floods. Data mining techniques are useful for understanding the relationships between events and their influencing variables. Because landslides are influenced by a combination of factors, including geomorphological and meteorological factors, data mining techniques are helpful in elucidating the mechanisms by which these complex factors affect landslide events. In this study, spatial data mining approaches based on data on landslide locations in a geographic information system environment were investigated. The topographical factors of slope, aspect, curvature, topographic wetness index, stream power index, slope length factor, standardized height, valley depth, and downslope distance gradient were determined from topographical maps. Additional soil and forest variables obtained from national soil and forest maps were also investigated. A total of 17 variables affecting the frequency of landslide occurrence were selected to construct a spatial database, and support vector machine (SVM) and artificial neural network (ANN) models were applied to predict landslide susceptibility from the selected factors. In the SVM model, linear, polynomial, radial basis function, and sigmoid kernels were applied in sequence; the model yielded 72.41%, 72.83%, 77.17% and 72.79% accuracy, respectively. The ANN model yielded a validation accuracy of 78.41%. The results of this study are useful for guiding effective strategies for the prevention and management of landslides in urban areas.
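
    The SVM kernel comparison can be sketched with scikit-learn (synthetic factors and a toy susceptibility rule stand in for the real 17-variable spatial database):

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(4)

    # Synthetic stand-in for the 17 landslide conditioning factors.
    n_cells = 400
    X = rng.normal(0.0, 1.0, (n_cells, 17))
    # Toy susceptibility rule with a nonlinear term, so the kernels can differ.
    y = ((X[:, 0] + X[:, 1] ** 2 - X[:, 2]) > 0.5).astype(int)

    # The four kernels applied in sequence, as in the study's SVM model.
    accs = {}
    for kernel in ("linear", "poly", "rbf", "sigmoid"):
        accs[kernel] = cross_val_score(SVC(kernel=kernel), X, y, cv=5).mean()
        print(f"{kernel:8s} 5-fold accuracy: {accs[kernel]:.3f}")
    ```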

  3. Evaluation of a practical expert defined approach to patient population segmentation: a case study in Singapore

    Directory of Open Access Journals (Sweden)

    Lian Leng Low

    2017-11-01

    Full Text Available Abstract Background Segmenting the population into groups that are relatively homogeneous in healthcare characteristics or needs is crucial to facilitate integrated care and resource planning. We aimed to evaluate the feasibility of segmenting the population into discrete, non-overlapping groups using a practical, expert- and literature-driven approach. We hypothesized that this approach is feasible utilizing the electronic health record (EHR) in SingHealth. Methods In addition to the well-defined segments of “Mostly healthy”, “Serious acute illness but curable” and “End of life” that are also present in the Ministry of Health Singapore framework, patients with chronic diseases were segmented into “Stable chronic disease”, “Complex chronic diseases without frequent hospital admissions”, and “Complex chronic diseases with frequent hospital admissions”. Using the EHR, we applied this framework to all adult patients who had a healthcare encounter in the Singapore Health Services Regional Health System in 2012. ICD-9, ICD-10 and polyclinic codes were used to define chronic diseases, with a comprehensive look-back period of 5 years. Outcomes (hospital admissions, emergency attendances, specialist outpatient clinic attendances and mortality) were analyzed for the years 2012 to 2015. Results A total of 825,874 patients were included in this study, with the majority being healthy without chronic diseases. The most common chronic disease was hypertension. The “Complex chronic diseases with frequent hospital admissions” segment represented 0.6% of the eligible population but accounted for the highest hospital admissions (4.33 ± 2.12 admissions; p < 0.001) and emergency department (ED) attendances (3.21 ± 3.16 ED visits; p < 0.001) per patient, and a high mortality rate (16%). Patients with metastatic disease accounted for the highest specialist outpatient
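
    The segment assignment itself is an ordered rule set; a hypothetical sketch follows (the rule order and the "frequent admissions" threshold of 3+ per year are assumptions for illustration, not taken from the paper):

    ```python
    # Illustrative ordered rules for the expert-defined segments.
    def assign_segment(n_chronic: int, admissions: int, end_of_life: bool,
                       serious_acute: bool) -> str:
        if end_of_life:
            return "End of life"
        if serious_acute:
            return "Serious acute illness but curable"
        if n_chronic == 0:
            return "Mostly healthy"
        if n_chronic == 1:
            return "Stable chronic disease"
        if admissions >= 3:                      # assumed "frequent" threshold
            return "Complex chronic diseases with frequent hospital admissions"
        return "Complex chronic diseases without frequent hospital admissions"

    print(assign_segment(n_chronic=3, admissions=4,
                         end_of_life=False, serious_acute=False))
    ```

    Because the rules are evaluated in order and each patient matches exactly one branch, the resulting segments are discrete and non-overlapping, which is the property the study set out to achieve.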

  4. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC
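
    The Monte Carlo treatment of criterion-weight uncertainty in a weighted linear combination can be sketched as follows (the weights, criteria and noise level are illustrative, not from the Urmia basin study):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Toy landslide-susceptibility setting: 5 criteria scored 0-1 for 100 cells.
    criteria = rng.uniform(0.0, 1.0, (100, 5))
    weights = np.array([0.35, 0.25, 0.20, 0.12, 0.08])   # e.g. AHP-derived

    # Monte Carlo simulation: perturb the weights, renormalize, recompute the
    # weighted linear combination, and measure the per-cell output spread.
    n_runs = 2000
    scores = np.empty((n_runs, criteria.shape[0]))
    for i in range(n_runs):
        w = np.clip(weights + rng.normal(0.0, 0.05, weights.size), 1e-6, None)
        w /= w.sum()
        scores[i] = criteria @ w

    susceptibility = scores.mean(axis=0)      # WLC map under weight uncertainty
    uncertainty = scores.std(axis=0)          # error propagated from the weights
    print(f"mean susceptibility {susceptibility.mean():.3f}, "
          f"max weight-induced std {uncertainty.max():.3f}")
    ```

    The per-cell standard deviation is exactly the kind of error-propagation surface the article derives from MCS: cells whose ranking flips under plausible weight perturbations are the ones whose susceptibility class should be treated with caution.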

  5. Irreducible complexity of iterated symmetric bimodal maps

    Directory of Open Access Journals (Sweden)

    J. P. Lampreia

    2005-01-01

    Full Text Available We introduce a tree structure for the iterates of symmetric bimodal maps and identify a subset which we prove to be isomorphic to the family of unimodal maps. This subset is used as a second factor for a ∗-product that we define in the space of bimodal kneading sequences. Finally, we give some properties for this product and study the ∗-product induced on the associated Markov shifts.

  6. A Neurogenetics Approach to Defining Differential Susceptibility to Institutional Care

    Science.gov (United States)

    Brett, Zoe H.; Sheridan, Margaret; Humphreys, Kate; Smyke, Anna; Gleason, Mary Margaret; Fox, Nathan; Zeanah, Charles; Nelson, Charles; Drury, Stacy

    2015-01-01

    An individual's neurodevelopmental and cognitive sequelae of negative early experiences may, in part, be explained by genetic susceptibility. We examined whether extreme differences in the early caregiving environment, defined as exposure to severe psychosocial deprivation associated with institutional care compared to normative rearing,…

  7. Linear Programming Approaches for Power Savings in Software-defined Networks

    NARCIS (Netherlands)

    Moghaddam, F.A.; Grosso, P.

    2016-01-01

    Software-defined networks have been proposed as a viable solution to decrease the power consumption of the networking component in data center networks. Still the question remains on which scheduling algorithms are most suited to achieve this goal. We propose 4 different linear programming

  8. Using knowledge rules for pharmacy mapping.

    Science.gov (United States)

    Shakib, Shaun C; Che, Chengjian; Lau, Lee Min

    2006-01-01

    The 3M Health Information Systems (HIS) Healthcare Data Dictionary (HDD) is used to encode and structure patient medication data for the Electronic Health Record (EHR) of the Department of Defense's (DoD's) Armed Forces Health Longitudinal Technology Application (AHLTA). HDD Subject Matter Experts (SMEs) are responsible for initial and maintenance mapping of disparate, standalone medication master files from all 100 DoD host sites worldwide to a single concept-based vocabulary, to accomplish semantic interoperability. To achieve higher levels of automation, SMEs began defining a growing set of knowledge rules. These knowledge rules were implemented in a pharmacy mapping tool, which enhanced consistency through automation and increased the mapping rate by 29%.

  9. Defining functional distances over Gene Ontology

    Directory of Open Access Journals (Sweden)

    del Pozo Angela

    2008-01-01

    Full Text Available Abstract Background A fundamental problem when trying to define the functional relationships between proteins is the difficulty in quantifying functional similarities, even when well-structured ontologies exist regarding the activity of proteins (i.e. the Gene Ontology, GO). However, functional metrics can overcome the problems in comparing and evaluating functional assignments and predictions. As a reference of proximity, previous approaches to comparing GO terms considered linkage within the ontology, weighted by a probability distribution that balances the non-uniform 'richness' of different parts of the directed acyclic graph. Here, we have followed a different approach to quantify functional similarities between GO terms. Results We propose a new method to derive 'functional distances' between GO terms that is based on the simultaneous occurrence of terms in the same set of InterPro entries, instead of relying on the structure of the GO. The coincidence of GO terms reveals natural biological links between the GO functions and defines a distance model Df which fulfils the properties of a metric space. The distances obtained in this way can be represented as a hierarchical 'Functional Tree'. Conclusion The method proposed provides a new definition of distance that enables the similarity between GO terms to be quantified. Additionally, the 'Functional Tree' defines groups with biological meaning, enhancing its utility for protein function comparison and prediction. Finally, this approach could be used for function-based protein searches in databases, and for analysing the gene clusters produced by DNA array experiments.
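
    The co-occurrence idea can be illustrated with a Jaccard-style distance over shared InterPro entries (a toy stand-in: the paper's Df is defined differently, and the term-to-entry mappings below are illustrative, not taken from InterPro):

    ```python
    # Each GO term maps to the set of InterPro entries in which it occurs;
    # distance is taken here as the Jaccard distance of those sets, which
    # satisfies the metric-space properties the paper requires of Df.
    go_to_interpro = {
        "GO:0016301": {"IPR000719", "IPR011009", "IPR017441"},  # kinase activity
        "GO:0004672": {"IPR000719", "IPR011009", "IPR008271"},  # protein kinase
        "GO:0003677": {"IPR001025", "IPR009057"},               # DNA binding
    }

    def functional_distance(term_a: str, term_b: str) -> float:
        a, b = go_to_interpro[term_a], go_to_interpro[term_b]
        return 1.0 - len(a & b) / len(a | b)

    d_close = functional_distance("GO:0016301", "GO:0004672")
    d_far = functional_distance("GO:0016301", "GO:0003677")
    print(d_close, d_far)  # the two kinase terms are closer than DNA binding
    ```

    Terms that annotate overlapping sets of entries end up close together, so clustering on this distance yields the kind of 'Functional Tree' grouping described in the conclusion.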

  10. Application of CPL with Interference Mapping Lithography to generate random contact reticle designs for the 65-nm node

    Science.gov (United States)

    Van Den Broeke, Douglas J.; Laidig, Thomas L.; Chen, J. Fung; Wampler, Kurt E.; Hsu, Stephen D.; Shi, Xuelong; Socha, Robert J.; Dusa, Mircea V.; Corcoran, Noel P.

    2004-08-01

    Imaging contact and via layers continues to be one of the major challenges to be overcome for 65nm node lithography. Initial results of using ASML MaskTools' CPL Technology to print contact arrays through pitch have demonstrated the potential to further extend contact imaging to a k1 near 0.30. While there are advantages and disadvantages for any potential RET, the benefits of not having to solve the phase assignment problem (which can lead to unresolvable phase conflicts), of it being a single reticle - single exposure technique, and its application to multiple layers within a device (clear field and dark field) make CPL an attractive, cost-effective solution to low k1 imaging. However, real semiconductor circuit designs consist of much more than regular arrays of contact holes, and a method to define the CPL reticle design for a full chip circuit pattern is required in order for this technique to be feasible in volume manufacturing. Interference Mapping Lithography (IML) is a novel approach for defining optimum reticle patterns based on the imaging conditions that will be used when the wafer is exposed. Figure 1 shows an interference map for an isolated contact simulated using ASML /1150 settings of 0.75NA and 0.92/0.72/30deg Quasar illumination. This technique provides a model-based approach for placing all types of features (scattering bars, anti-scattering bars, non-printing assist features, phase shifted and non-phase shifted) for the purpose of enhancing the resolution of the target pattern, and it can be applied to any reticle type including binary (COG), attenuated phase shifting mask (attPSM), alternating aperture phase shifting mask (altPSM), and CPL. In this work, we investigate the application of IML to generate CPL reticle designs for random contact patterns that are typical for 65nm node logic devices.
We examine the critical issues related to using CPL with Interference Mapping Lithography including controlling side lobe printing, contact patterns with

  11. Integration of value stream map and strategic layout planning into DMAIC approach to improve carpeting process

    Energy Technology Data Exchange (ETDEWEB)

    Nagi, A.; Altarazi, S.

    2017-07-01

    This paper presents an implementation of the Six Sigma DMAIC approach, using lean tools and facilities layout techniques, to reduce the occurrence of different types of nonconformities in the carpeting process. Such a carpeting process can be found in several industries such as construction, aviation, and automotive. Design/methodology/approach: The improvement process was built through a sequential implementation of appropriate interconnected tools at each phase of the DMAIC approach. Utilized tools included: Pareto analysis, control charts, Ishikawa chart, 5-whys, failure mode and effect analysis, process capability ratio, value stream mapping, and strategic layout planning. Findings: The carpeting process capability, quality of the product, customer satisfaction, and cost of poor quality were significantly improved. Explicitly, the sigma level was improved from 2.297 to 2.886 and the defects per million opportunities (DPMO) was reduced from 21615 to 3905. Originality/value: This paper has demonstrated the capability of the Six Sigma DMAIC approach to analyze, investigate, and remove the root causes of the carpeting (preparation-installation) process nonconformities.
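
    The DPMO figure and a corresponding sigma level can be computed as follows (a sketch only: sigma-level conventions vary, and the common 1.5-sigma shift used here does not reproduce the paper's exact 2.297/2.886 figures, which evidently follow a different convention):

    ```python
    from scipy.stats import norm

    def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
        """Defects per million opportunities."""
        return 1e6 * defects / (units * opportunities_per_unit)

    def sigma_level(dpmo_value: float, shift: float = 1.5) -> float:
        """Sigma level from DPMO using the common 1.5-sigma shift convention."""
        return float(norm.ppf(1.0 - dpmo_value / 1e6) + shift)

    # The paper's before/after DPMO values, under this convention:
    before, after = sigma_level(21615), sigma_level(3905)
    print(f"sigma before: {before:.2f}, after: {after:.2f}")
    ```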

  12. Integration of value stream map and strategic layout planning into DMAIC approach to improve carpeting process

    Directory of Open Access Journals (Sweden)

    Ayman Nagi

    2017-04-01

    Full Text Available Purpose: This paper presents an implementation of the Six Sigma DMAIC approach, using lean tools and facilities layout techniques, to reduce the occurrence of different types of nonconformities in the carpeting process. Such a carpeting process can be found in several industries such as construction, aviation, and automotive. Design/methodology/approach: The improvement process was built through a sequential implementation of appropriate interconnected tools at each phase of the DMAIC approach. Utilized tools included: Pareto analysis, control charts, Ishikawa chart, 5-whys, failure mode and effect analysis, process capability ratio, value stream mapping, and strategic layout planning. Findings: The carpeting process capability, quality of the product, customer satisfaction, and cost of poor quality were significantly improved. Explicitly, the sigma level was improved from 2.297 to 2.886 and the defects per million opportunities (DPMO) was reduced from 21615 to 3905. Originality/value: This paper has demonstrated the capability of the Six Sigma DMAIC approach to analyze, investigate, and remove the root causes of the carpeting (preparation-installation) process nonconformities.

  13. Integration of value stream map and strategic layout planning into DMAIC approach to improve carpeting process

    International Nuclear Information System (INIS)

    Nagi, A.; Altarazi, S.

    2017-01-01

    This paper presents an implementation of the Six Sigma DMAIC approach, using lean tools and facilities layout techniques, to reduce the occurrence of different types of nonconformities in the carpeting process. Such a carpeting process can be found in several industries such as construction, aviation, and automotive. Design/methodology/approach: The improvement process was built through a sequential implementation of appropriate interconnected tools at each phase of the DMAIC approach. Utilized tools included: Pareto analysis, control charts, Ishikawa chart, 5-whys, failure mode and effect analysis, process capability ratio, value stream mapping, and strategic layout planning. Findings: The carpeting process capability, quality of the product, customer satisfaction, and cost of poor quality were significantly improved. Explicitly, the sigma level was improved from 2.297 to 2.886 and the defects per million opportunities (DPMO) was reduced from 21615 to 3905. Originality/value: This paper has demonstrated the capability of the Six Sigma DMAIC approach to analyze, investigate, and remove the root causes of the carpeting (preparation-installation) process nonconformities.

  14. Wet Snow Mapping in Southern Ontario with Sentinel-1A Observations

    Science.gov (United States)

    Chen, H.; Kelly, R. E. J.

    2017-12-01

    Wet snow is defined as snow with liquid water present in an ice-water mix. It can be an indicator of the onset of the snowmelt period. Knowledge about the extent of wet snow area can be of great importance for the monitoring of seasonal snowmelt runoff, with climate-induced changes in snowmelt duration having implications for operational hydrological and ecological applications. Spaceborne microwave remote sensing has been used to observe seasonal snow under all-weather conditions. Active microwave observations of snow at C-band are sensitive to wet snow due to the high dielectric contrast with non-wet snow surfaces, and synthetic aperture radar (SAR) data are now openly available to identify and map wet snow areas globally at relatively fine spatial resolutions (∼100 m). In this study, a semi-automated workflow is developed from the change detection method of Nagler et al. (2016) using multi-temporal Sentinel-1A (S1A) dual-polarization observations of Southern Ontario. Weather station data and visible-infrared satellite observations are used to refine the wet snow area estimates. Wet snow information from the National Operational Hydrologic Remote Sensing Center (NOHRSC) is used for comparison with the S1A estimates. A time series of wet snow maps shows the variations in backscatter from wet snow on a pixel basis. Different land cover types in Southern Ontario are assessed with respect to their impacts on wet snow estimates. While forests and complex land surfaces can impact the ability to map wet snow, the approach taken is robust and illustrates the strong sensitivity of the approach to wet snow backscattering characteristics. The results indicate the feasibility of the change detection method over non-mountainous large areas and demonstrate the usefulness of Sentinel-1A data for wet snow mapping.
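
    The change-detection core of the Nagler-style approach, thresholding the ratio (a difference in dB) of a melt-season acquisition against a snow-free reference, can be sketched as follows (all imagery below is synthetic; the ~-2 dB threshold is the commonly cited wet-snow value):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Toy backscatter images (dB): a snow-free reference acquisition and a
    # melt-season acquisition. Wet snow strongly absorbs at C-band, so its
    # backscatter drops relative to the reference.
    reference_db = rng.normal(-8.0, 1.0, (50, 50))
    melt_db = reference_db + rng.normal(0.0, 0.5, (50, 50))
    melt_db[10:30, 10:30] -= 6.0          # a wet-snow patch: ~6 dB drop

    # Change detection: the ratio image (a difference in dB) is thresholded.
    ratio_db = melt_db - reference_db
    wet_snow = ratio_db < -2.0

    print(f"wet-snow pixels detected: {int(wet_snow.sum())}")
    ```

    In the real workflow the S1A VV and VH ratio images are combined and speckle-filtered before thresholding, and the resulting mask is refined with weather-station and visible-infrared data as described above.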

  15. Supporting autonomous vehicles by creating HD maps

    Directory of Open Access Journals (Sweden)

    Arpad Barsi

    2017-10-01

    Full Text Available Maps are constantly developing; notably, the newly defined High Definition (HD) maps increase map content remarkably. They are based on three-dimensional surveys, such as laser scanning, and are stored in a fully new structure to support modern-day vehicles. Beyond the traditional lane-based map content, they contain information about the road's neighbourhood. The goal of these maps is twofold. First, they store the connections along which vehicles can travel, together with a description of the road environment. Second, they efficiently support exact vehicle positioning. The paper demonstrates the first results of a pilot study on the creation of an HD map of an urban and a rural environment. The applied data collection technology was terrestrial laser scanning, and the obtained point cloud was then evaluated. Data storage was solved with an in-house developed information storage model with the ability to support vehicle control processes.

  16. Mappings on Neutrosophic Soft Classes

    Directory of Open Access Journals (Sweden)

    Shawkat Alkhazaleh

    2014-03-01

    Full Text Available In 1995 Smarandache introduced the concept of the neutrosophic set, a mathematical tool for handling problems involving imprecise, indeterminate, and inconsistent data. In 2013 Maji introduced the concept of neutrosophic soft set theory as a general mathematical tool for dealing with uncertainty. In this paper we define the notion of a mapping on neutrosophic soft classes, where neutrosophic soft classes are collections of neutrosophic soft sets. We also define and study the properties of neutrosophic soft images and neutrosophic soft inverse images of neutrosophic soft sets.

  17. Gamut mapping in a high-dynamic-range color space

    Science.gov (United States)

    Preiss, Jens; Fairchild, Mark D.; Ferwerda, James A.; Urban, Philipp

    2014-01-01

    In this paper, we present a novel approach of tone mapping as gamut mapping in a high-dynamic-range (HDR) color space. High- and low-dynamic-range (LDR) images as well as device gamut boundaries can simultaneously be represented within such a color space. This enables a unified transformation of the HDR image into the gamut of an output device (in this paper called HDR gamut mapping). An additional aim of this paper is to investigate the suitability of a specific HDR color space to serve as a working color space for the proposed HDR gamut mapping. For the HDR gamut mapping, we use a recent approach that iteratively minimizes an image-difference metric subject to in-gamut images. A psychophysical experiment on an HDR display shows that the standard reproduction workflow of two subsequent transformations - tone mapping and then gamut mapping - may be improved by HDR gamut mapping.

  18. Application of fuzzy logic approach for wind erosion hazard mapping in Laghouat region (Algeria) using remote sensing and GIS

    Science.gov (United States)

    Saadoud, Djouher; Hassani, Mohamed; Martin Peinado, Francisco José; Guettouche, Mohamed Saïd

    2018-06-01

    Wind erosion is one of the most serious environmental problems in Algeria, threatening human activities and socio-economic development. The main goal of this study is to apply a fuzzy logic approach to wind erosion sensitivity mapping in the Laghouat region, Algeria. Six causative factors, obtained by applying fuzzy membership functions to each parameter, are considered: soil, vegetation cover, wind factor, soil dryness, land topography and land cover sensitivity. Different fuzzy operators (AND, OR, SUM, PRODUCT, and GAMMA) are applied to generate the wind-erosion hazard map. Success rate curves reveal that the fuzzy gamma (γ) operator, with γ equal to 0.9, gives the best prediction accuracy, with an area under the curve of 85.2%. The resulting wind-erosion sensitivity map delineates the area into five relative sensitivity classes: very high, high, moderate, low and very low. The estimated result was verified by field measurements and a highly statistically significant chi-square test.
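The fuzzy GAMMA operator combines the fuzzy algebraic sum and the fuzzy algebraic product of the factor memberships. A minimal per-pixel sketch with γ = 0.9 as in the study; the six membership values are invented for illustration:

```python
import numpy as np

def fuzzy_gamma(memberships, gamma=0.9):
    """Fuzzy GAMMA operator: (fuzzy algebraic sum)^gamma * (fuzzy product)^(1-gamma).
    `memberships` holds one membership value in [0, 1] per causative factor."""
    m = np.asarray(memberships, dtype=float)
    fuzzy_sum = 1.0 - np.prod(1.0 - m)      # fuzzy algebraic sum
    fuzzy_product = np.prod(m)              # fuzzy algebraic product
    return fuzzy_sum**gamma * fuzzy_product**(1.0 - gamma)

# Six illustrative factor memberships for one pixel
print(round(fuzzy_gamma([0.8, 0.6, 0.7, 0.5, 0.9, 0.4]), 3))
```

The result always lies between the fuzzy product and the fuzzy sum; γ near 1 leans toward the (increasive) sum, γ near 0 toward the (decreasive) product.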

  19. An integrated approach to shoreline mapping for spill response planning

    International Nuclear Information System (INIS)

    Owens, E.H.; LeBlanc, S.R.; Percy, R.J.

    1996-01-01

    A desktop mapping package was introduced which has the capability to provide consistent and standardized application of mapping and data collection/generation techniques. Its application in oil spill cleanup was discussed. The database can be updated easily as new information becomes available. This provides a response team with access to a wide range of information that would otherwise be difficult to obtain. Standard terms and definitions and shoreline segmentation procedures are part of the system to describe the shore-zone character and shore-zone oiling conditions. The program that is in place for Atlantic Canada involves the integration of (1) Environment Canada's SCAT methodology in pre-spill data generation, (2) shoreline segmentation, (3) response management by objectives, (4) Environment Canada's national sensitivity mapping program, and (5) Environment Canada's field guide for the protection and treatment of oiled shorelines. 7 refs., 6 figs

  20. Color maps of Arp 146

    Science.gov (United States)

    Schultz, A. B.; Spight, L. D.; Colegrove, P. T.; Disanti, M. A.; Fink, U.

    1990-01-01

    Four color maps of Arp 146 are given. The structure and color of the ring galaxy and its companion show evidence of a bridge of material between the companion and the remnant nucleus of the original galaxy now forming the ring. Broad band spatial coverage clearly defines regions of starburst occurrence.

  1. Spectrally based bathymetric mapping of a dynamic, sand‐bedded channel: Niobrara River, Nebraska, USA

    Science.gov (United States)

    Dilbone, Elizabeth; Legleiter, Carl; Alexander, Jason S.; McElroy, Brandon

    2018-01-01

    Methods for spectrally based mapping of river bathymetry have been developed and tested in clear‐flowing, gravel‐bed channels, with limited application to turbid, sand‐bed rivers. This study used hyperspectral images and field surveys from the dynamic, sandy Niobrara River to evaluate three depth retrieval methods. The first regression‐based approach, optimal band ratio analysis (OBRA), paired in situ depth measurements with image pixel values to estimate depth. The second approach used ground‐based field spectra to calibrate an OBRA relationship. The third technique, image‐to‐depth quantile transformation (IDQT), estimated depth by linking the cumulative distribution function (CDF) of depth to the CDF of an image‐derived variable. OBRA yielded the lowest depth retrieval mean error (0.005 m) and highest observed versus predicted R2 (0.817). Although misalignment between field and image data did not compromise the performance of OBRA in this study, poor georeferencing could limit regression‐based approaches such as OBRA in dynamic, sand‐bedded rivers. Field spectroscopy‐based depth maps exhibited a mean error with a slight shallow bias (0.068 m) but provided reliable estimates for most of the study reach. IDQT had a strong deep bias but provided informative relative depth maps. Overprediction of depth by IDQT highlights the need for an unbiased sampling strategy to define the depth CDF. Although each of the techniques we tested demonstrated potential to provide accurate depth estimates in sand‐bed rivers, each method also was subject to certain constraints and limitations.
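OBRA's regression step can be sketched as an exhaustive search over band pairs for the log-ratio predictor that best explains field-measured depths. The implementation below is a simplified sketch using synthetic data (a Beer-Lambert-like attenuation on one band), not the authors' code:

```python
import numpy as np

def obra(depths, spectra):
    """Optimal Band Ratio Analysis: for every band pair (i, j), regress
    depth against X = ln(band_i / band_j) and keep the pair with the
    highest R^2. `spectra` has shape (n_samples, n_bands)."""
    n_bands = spectra.shape[1]
    best = (0.0, None, None, None, None)        # (R^2, i, j, slope, intercept)
    for i in range(n_bands):
        for j in range(n_bands):
            if i == j:
                continue
            x = np.log(spectra[:, i] / spectra[:, j])
            slope, intercept = np.polyfit(x, depths, 1)
            pred = slope * x + intercept
            r2 = 1 - np.sum((depths - pred)**2) / np.sum((depths - depths.mean())**2)
            if r2 > best[0]:
                best = (r2, i, j, slope, intercept)
    return best

# Synthetic demo: band 0 attenuates with depth, bands 1-2 carry no depth signal
rng = np.random.default_rng(0)
depths = rng.uniform(0.2, 2.0, 50)
b1 = rng.uniform(0.05, 0.20, 50)
b0 = b1 * np.exp(-0.8 * depths)                 # depth-dependent attenuation
b2 = rng.uniform(0.05, 0.20, 50)
r2, i, j, slope, intercept = obra(depths, np.column_stack([b0, b1, b2]))
print(i, j, round(r2, 3))
```

The search correctly identifies the pair involving the attenuated band; with real imagery the calibration would use in situ depths paired to image pixels, which is why georeferencing accuracy matters for this approach.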

  2. Intra-operative mapping of the atria: the first step towards individualization of atrial fibrillation therapy?

    Science.gov (United States)

    Kik, Charles; Mouws, Elisabeth M J P; Bogers, Ad J J C; de Groot, Natasja M S

    2017-07-01

    Atrial fibrillation (AF), an age-related progressive disease, is becoming a worldwide epidemic with a prevalence rate of 33 million. Areas covered: In this expert review, an overview of important results obtained from previous intra-operative mapping studies is provided. In addition, our novel intra-operative high resolution mapping studies, its surgical considerations and data analyses are discussed. Furthermore, the importance of high resolution mapping studies of both sinus rhythm and AF for the development of future AF therapy is underlined by our most recent results. Expert commentary: Progression of AF is determined by the extensiveness of electropathology which is defined as conduction disorders caused by structural damage of atrial tissue. The severity of electropathology is a major determinant of therapy failure. At present, we do not have any diagnostic tool to determine the degree of electropathology in the individual patient and we can thus not select the most optimal treatment modality for the individual patient. An intra-operative, high resolution scale, epicardial mapping approach combined with quantification of electrical parameters may serve as a diagnostic tool to stage AF in the individual patient and to provide patient tailored therapy.

  3. MR vascular fingerprinting: A new approach to compute cerebral blood volume, mean vessel radius, and oxygenation maps in the human brain.

    Science.gov (United States)

    Christen, T; Pannetier, N A; Ni, W W; Qiu, D; Moseley, M E; Schuff, N; Zaharchuk, G

    2014-04-01

    In the present study, we describe a fingerprinting approach to analyze the time evolution of the MR signal and retrieve quantitative information about the microvascular network. We used a Gradient Echo Sampling of the Free Induction Decay and Spin Echo (GESFIDE) sequence and defined a fingerprint as the ratio of signals acquired pre- and post-injection of an iron-based contrast agent. We then simulated the same experiment with an advanced numerical tool that takes a virtual voxel containing blood vessels as input, then computes microscopic magnetic fields and water diffusion effects, and eventually derives the expected MR signal evolution. The parameter inputs of the simulations (cerebral blood volume [CBV], mean vessel radius [R], and blood oxygen saturation [SO2]) were varied to obtain a dictionary of all possible signal evolutions. The best fit between the observed fingerprint and the dictionary was then determined by using least square minimization. This approach was evaluated in 5 normal subjects and the results were compared to those obtained by using more conventional MR methods, steady-state contrast imaging for CBV and R and a global measure of oxygenation obtained from the superior sagittal sinus for SO2. The fingerprinting method enabled the creation of high-resolution parametric maps of the microvascular network showing expected contrast and fine details. Numerical values in gray matter (CBV=3.1±0.7%, R=12.6±2.4μm, SO2=59.5±4.7%) are consistent with literature reports and correlated with conventional MR approaches. SO2 values in white matter (53.0±4.0%) were slightly lower than expected. Numerous improvements can easily be made and the method should be useful to study brain pathologies. Copyright © 2013 Elsevier Inc. All rights reserved.
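The dictionary-matching step described here (least-squares minimization over all simulated signal evolutions) reduces to a nearest-entry search. A minimal sketch with an invented three-entry dictionary:

```python
import numpy as np

def match_fingerprint(observed, dictionary, params):
    """Return the (CBV, R, SO2) entry whose simulated signal evolution
    minimizes the sum of squared differences to the observed fingerprint.
    `dictionary` is (n_entries, n_timepoints); `params` is (n_entries, 3)."""
    residuals = np.sum((dictionary - observed)**2, axis=1)
    return params[np.argmin(residuals)]

# Tiny illustrative dictionary: three simulated signal evolutions
dictionary = np.array([[1.0, 0.8, 0.6],
                       [1.0, 0.6, 0.3],
                       [1.0, 0.4, 0.1]])
params = np.array([[2.0, 10.0, 55.0],    # (CBV %, R um, SO2 %)
                   [3.1, 12.6, 59.5],
                   [4.0, 15.0, 65.0]])
best = match_fingerprint(np.array([0.98, 0.62, 0.28]), dictionary, params)
print(best)
```

A real dictionary would contain thousands of entries spanning the (CBV, R, SO2) grid, with one fingerprint matched per voxel to produce the parametric maps.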

  4. IceMap250—Automatic 250 m Sea Ice Extent Mapping Using MODIS Data

    Directory of Open Access Journals (Sweden)

    Charles Gignac

    2017-01-01

    Full Text Available The sea ice cover in the North evolves at a rapid rate. To adequately monitor this evolution, tools with high temporal and spatial resolution are needed. This paper presents IceMap250, an automatic sea ice extent mapping algorithm using MODIS reflective/emissive bands. Hybrid cloud-masking using both the MOD35 mask and a visibility mask, combined with downscaling of Bands 3–7 to 250 m, are utilized to delineate sea ice extent using a decision tree approach. IceMap250 was tested on scenes from the freeze-up, stable cover, and melt seasons in the Hudson Bay complex, in Northeastern Canada. IceMap250's first product is a daily composite sea ice presence map at 250 m. Validation based on comparisons with photo-interpreted ground-truth shows the ability of the algorithm to achieve high classification accuracy, with kappa values systematically over 90%. IceMap250's second product is a weekly clear sky map that provides a synthesis of 7 days of daily composite maps. This map, produced using a majority filter, makes the sea ice presence map even more accurate by filtering out the effects of isolated classification errors. The synthesis maps show spatial consistency through time when compared to passive microwave and national ice service maps.
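The weekly synthesis step, a per-pixel majority filter over seven daily composite maps, can be sketched as follows (class codes and toy maps are illustrative):

```python
import numpy as np

def weekly_majority(daily_maps):
    """Per-pixel majority vote over a stack of daily class maps
    (shape (n_days, rows, cols); integer labels, e.g. 0=water, 1=ice)."""
    stack = np.asarray(daily_maps)
    n_classes = int(stack.max()) + 1
    # Count votes per class along the time axis, then take the winner.
    counts = np.stack([(stack == c).sum(axis=0) for c in range(n_classes)])
    return counts.argmax(axis=0)

# Seven illustrative daily 1x3 maps; two pixels carry one-day classification errors
week = [[[1, 0, 1]], [[1, 0, 1]], [[1, 0, 0]], [[1, 0, 1]],
        [[1, 1, 1]], [[1, 0, 1]], [[1, 0, 1]]]
print(weekly_majority(week))
```

The vote suppresses isolated misclassifications, which is exactly the effect the abstract attributes to the weekly clear-sky product.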

  5. Simulation of speckle patterns with pre-defined correlation distributions

    Science.gov (United States)

    Song, Lipei; Zhou, Zhen; Wang, Xueyan; Zhao, Xing; Elson, Daniel S.

    2016-01-01

    We put forward a method to easily generate a single or a sequence of fully developed speckle patterns with pre-defined correlation distribution by utilizing the principle of coherent imaging. The few-to-one mapping between the input correlation matrix and the correlation distribution between simulated speckle patterns is realized and there is a simple square relationship between the values of these two correlation coefficient sets. This method is demonstrated both theoretically and experimentally. The square relationship enables easy conversion from any desired correlation distribution. Since the input correlation distribution can be defined by a digital matrix or a gray-scale image acquired experimentally, this method provides a convenient way to simulate real speckle-related experiments and to evaluate data processing techniques. PMID:27231589
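The square relationship mentioned in the abstract can be checked numerically: for fully developed speckle arising from circular complex Gaussian fields with field correlation c, the intensity correlation coefficient is c². A quick Monte Carlo sketch (the value c = 0.8 is an arbitrary illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
c = 0.8   # desired field correlation

# Two correlated circular complex Gaussian fields of unit mean intensity
e1 = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
e2 = c * e1 + np.sqrt(1 - c**2) * noise

# Fully developed speckle intensities and their correlation coefficient
i1, i2 = np.abs(e1)**2, np.abs(e2)**2
rho = np.corrcoef(i1, i2)[0, 1]
print(rho)   # close to c**2 = 0.64
```

This is the classical second-order statistics result for speckle; the paper exploits the same square law to map a desired correlation distribution onto the underlying field correlations.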

  6. REST-MapReduce: An Integrated Interface but Differentiated Service

    Directory of Open Access Journals (Sweden)

    Jong-Hyuk Park

    2014-01-01

    Full Text Available With the fast deployment of cloud computing, MapReduce architectures are becoming the major technologies for mobile cloud computing. The concept of MapReduce was first introduced as a novel programming model and implementation for processing on a large set of computing devices. In this research, we propose a novel concept of REST-MapReduce, enabling users to use only the REST interface without using the MapReduce architecture. This approach provides a higher level of abstraction by integrating the two types of access interface, REST API and MapReduce. The motivation of this research stems from the slower response time for accessing a simple RDBMS on Hadoop compared with direct access to the RDBMS, caused by the overhead of job scheduling, initiating, starting, tracking, and management during MapReduce-based parallel execution. Our framework therefore provides good performance for both REST Open API services and MapReduce. This is very useful for constructing REST Open API services on Hadoop hosting services, for example, Amazon AWS (Macdonald, 2005) or IBM Smart Cloud. To evaluate the performance of our REST-MapReduce framework, we conducted experiments with a Jersey REST web server and Hadoop. Experimental results show that our approach outperforms conventional approaches.
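The MapReduce programming model referenced above can be illustrated with the classic word-count example; this is a toy single-process sketch of the model, unrelated to the authors' REST-MapReduce framework:

```python
from itertools import groupby

def mapper(line):
    """Map phase: emit a (key, value) pair for each word."""
    return [(word, 1) for word in line.split()]

def reducer(key, values):
    """Reduce phase: combine all values emitted for one key."""
    return key, sum(values)

lines = ["map reduce map", "reduce map"]
# Shuffle/sort: gather all emitted pairs and order them by key
pairs = sorted(p for line in lines for p in mapper(line))
counts = dict(reducer(k, [v for _, v in group])
              for k, group in groupby(pairs, key=lambda p: p[0]))
print(counts)
```

In a real Hadoop deployment the map and reduce phases run in parallel across workers, and it is precisely the scheduling and tracking of those distributed tasks that creates the overhead the abstract describes.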

  7. The mapping approach in the path integral formalism applied to curve-crossing systems

    International Nuclear Information System (INIS)

    Novikov, Alexey; Kleinekathoefer, Ulrich; Schreiber, Michael

    2004-01-01

    The path integral formalism in a combined phase-space and coherent-state representation is applied to the problem of curve-crossing dynamics. The system of interest is described by two coupled one-dimensional harmonic potential energy surfaces interacting with a heat bath consisting of harmonic oscillators. The mapping approach is used to rewrite the Lagrangian function of the electronic part of the system. Using the Feynman-Vernon influence-functional method the bath is eliminated whereas the non-Gaussian part of the path integral is treated using the generating functional for the electronic trajectories. The dynamics of a Gaussian wave packet is analyzed along a one-dimensional reaction coordinate within a perturbative treatment for a small coordinate shift between the potential energy surfaces

  8. Reconnaissance geologic mapping of a portion of the rain‐forest‐covered Guiana Shield, Northwestern Brazil, using SIR-B and digital aeromagnetic data

    Science.gov (United States)

    Pellon de Miranda, Fernando; McCafferty, Anne E.; Taranik, James V.

    1994-01-01

    This paper documents the result of an integrated analysis of spaceborne radar (SIR-B) and digital aeromagnetic data carried out in the heavily forested Guiana Shield. The objective of the research is to interpret the geophysical database to its limit to produce a reconnaissance geologic map as an aid to ground work planning in a worst‐case setting. Linear geomorphic features were identified based on the interpretation of the SIR-B image. Digital manipulation of aeromagnetic data allowed the development of a color‐shaded relief map of reduced‐to‐pole magnetic anomalies, a terrace‐magnetization map, and a map showing the location of maximum values of the horizontal component of the pseudogravity gradient (magnetization boundary lines). The resultant end product was a reconnaissance geologic map where broad terrane categories were delineated and geologic faults with both topographic and magnetic expression were defined. The availability of global spaceborne radar coverage in the 1990s and the large number of existing digital aeromagnetic surveys in northwestern Brazil indicate that this approach can be potentially useful for reconnaissance geologic mapping elsewhere in the Guiana Shield.

  9. Relationship Marketing results: proposition of a cognitive mapping model

    Directory of Open Access Journals (Sweden)

    Iná Futino Barreto

    2015-12-01

    Full Text Available Objective – This research sought to develop a cognitive model that expresses how marketing professionals understand the relationships between the constructs that define relationship marketing (RM). It also sought to understand, using the obtained model, how objectives in this field are achieved. Design/methodology/approach – Through cognitive mapping, we traced 35 individual mental maps, highlighting how each respondent understands the interactions between RM elements. Based on the views of these individuals, we established an aggregate mental map. Theoretical foundation – The topic is based on a literature review that explores the RM concept and its main elements. Based on this review, we listed eleven main constructs. Findings – We established an aggregate mental map that represents the RM structural model. Model analysis identified that CLV is understood as the final result of RM. We also observed that the impact of most of the RM elements on CLV is brokered by loyalty. Personalization and quality, on the other hand, proved to be process input elements, and are the ones that most strongly impact others. Finally, we highlight that elements that punish customers are much less effective than elements that benefit them. Contributions – The model was able to incorporate core elements of RM that are absent from most formal models: CLV and customization. The analysis allowed us to understand the interactions between the RM elements and how the end result of RM (CLV) is formed. This understanding improves knowledge on the subject and helps guide, assess and correct actions.

  10. Mapping Habitats and Developing Baselines in Offshore Marine Reserves with Little Prior Knowledge: A Critical Evaluation of a New Approach.

    Directory of Open Access Journals (Sweden)

    Emma Lawrence

    Full Text Available The recently declared Australian Commonwealth Marine Reserve (CMR) Network covers a total of 3.1 million km2 of continental shelf, slope, and abyssal habitat. Managing and conserving the biodiversity values within this network requires knowledge of the physical and biological assets that lie within its boundaries. Unfortunately very little is known about the habitats and biological assemblages of the continental shelf within the network, where diversity is richest and anthropogenic pressures are greatest. Effective management of the CMR estate into the future requires this knowledge gap to be filled efficiently and quantitatively. The challenge is particularly great for the shelf as multibeam echosounder (MBES) mapping, a key tool for identifying and quantifying habitat distribution, is time consuming in shallow depths, so full coverage mapping of the CMR shelf assets is unrealistic in the medium-term. Here we report on the results of a study undertaken in the Flinders Commonwealth Marine Reserve (southeast Australia) designed to test the benefits of two approaches to characterising shelf habitats: (i) MBES mapping of a continuous (~30 km2) area selected on the basis of its potential to include a range of seabed habitats that are potentially representative of the wider area, versus (ii) a novel approach that uses targeted mapping of a greater number of smaller, but spatially balanced, locations using a Generalized Random Tessellation Stratified sample design. We present the first quantitative estimates of habitat type and sessile biological communities on the shelf of the Flinders reserve, the former based on three MBES analysis techniques. We contrast the quality of information that both survey approaches offer in combination with the three MBES analysis methods.
The GRTS approach enables design based estimates of habitat types and sessile communities and also identifies potential biodiversity hotspots in the northwest corner of the reserve's IUCN

  11. Mapping Habitats and Developing Baselines in Offshore Marine Reserves with Little Prior Knowledge: A Critical Evaluation of a New Approach.

    Science.gov (United States)

    Lawrence, Emma; Hayes, Keith R; Lucieer, Vanessa L; Nichol, Scott L; Dambacher, Jeffrey M; Hill, Nicole A; Barrett, Neville; Kool, Johnathan; Siwabessy, Justy

    2015-01-01

    The recently declared Australian Commonwealth Marine Reserve (CMR) Network covers a total of 3.1 million km2 of continental shelf, slope, and abyssal habitat. Managing and conserving the biodiversity values within this network requires knowledge of the physical and biological assets that lie within its boundaries. Unfortunately very little is known about the habitats and biological assemblages of the continental shelf within the network, where diversity is richest and anthropogenic pressures are greatest. Effective management of the CMR estate into the future requires this knowledge gap to be filled efficiently and quantitatively. The challenge is particularly great for the shelf as multibeam echosounder (MBES) mapping, a key tool for identifying and quantifying habitat distribution, is time consuming in shallow depths, so full coverage mapping of the CMR shelf assets is unrealistic in the medium-term. Here we report on the results of a study undertaken in the Flinders Commonwealth Marine Reserve (southeast Australia) designed to test the benefits of two approaches to characterising shelf habitats: (i) MBES mapping of a continuous (~30 km2) area selected on the basis of its potential to include a range of seabed habitats that are potentially representative of the wider area, versus; (ii) a novel approach that uses targeted mapping of a greater number of smaller, but spatially balanced, locations using a Generalized Random Tessellation Stratified sample design. We present the first quantitative estimates of habitat type and sessile biological communities on the shelf of the Flinders reserve, the former based on three MBES analysis techniques. We contrast the quality of information that both survey approaches offer in combination with the three MBES analysis methods. 
The GRTS approach enables design based estimates of habitat types and sessile communities and also identifies potential biodiversity hotspots in the northwest corner of the reserve's IUCN zone IV, and in

  12. Some issues in data model mapping

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Alsabbagh, Jamal R.

    1985-01-01

    Numerous data models have been reported in the literature since the early 1970's. They have been used as database interfaces and as conceptual design tools. The mapping between schemas expressed according to the same data model or according to different models is interesting for theoretical and practical purposes. This paper addresses some of the issues involved in such a mapping. Of special interest are the identification of the mapping parameters and some current approaches for handling the various situations that require a mapping.

  13. SLAMM: Visual monocular SLAM with continuous mapping using multiple maps.

    Directory of Open Access Journals (Sweden)

    Hayyan Afeef Daoud

    Full Text Available This paper presents the concept of Simultaneous Localization and Multi-Mapping (SLAMM). It is a system that ensures continuous mapping and information preservation despite failures in tracking due to corrupted frames or sensor malfunction, making it suitable for real-world applications. It works with single or multiple robots. In a single-robot scenario the algorithm generates a new map at the time of tracking failure, and later merges the maps at the event of loop closure. Similarly, maps generated from multiple robots are merged without prior knowledge of their relative poses, which makes this algorithm flexible. The system works in real time at frame-rate speed. The proposed approach was tested on the KITTI and TUM RGB-D public datasets and showed superior results compared to the state of the art in calibrated visual monocular keyframe-based SLAM. The mean tracking time is around 22 milliseconds. The initialization is twice as fast as in ORB-SLAM, and the retrieved map can contain up to 90 percent more information depending on tracking loss and loop closure events. For the benefit of the community, the source code, along with a framework to be run with the Bebop drone, is made available at https://github.com/hdaoud/ORBSLAMM.

  14. Construction of harmonic maps between pseudo-Riemannian spheres and hyperbolic spaces

    International Nuclear Information System (INIS)

    Konderak, J.

    1988-09-01

    Defined here is an orthogonal multiplication for vector spaces with indefinite nondegenerate scalar product. This is then used, via the Hopf construction, to obtain harmonic maps between pseudo-Riemannian spheres and hyperbolic spaces. Examples of harmonic maps are constructed using Clifford algebras. (author). 6 refs
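For reference, the Hopf construction alluded to can be stated in its standard Riemannian form as follows; the paper's contribution is the indefinite-signature analogue, which is not reproduced here.

```latex
% An orthogonal multiplication is a bilinear map
%   F : \mathbb{R}^p \times \mathbb{R}^q \to \mathbb{R}^n
% satisfying |F(x,y)| = |x|\,|y|. Its Hopf construction is
\varphi(x,y) = \bigl(\, |x|^2 - |y|^2,\; 2\,F(x,y) \,\bigr),
% which restricts to a map S^{p+q-1} \to S^{n}, since on the unit sphere
% (|x|^2 - |y|^2)^2 + 4\,|F(x,y)|^2 = (|x|^2 + |y|^2)^2 = 1;
% for orthogonal F the restricted map is harmonic.
```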

  15. Assessing IT Projects Success with Extended Fuzzy Cognitive Maps & Neutrosophic Cognitive Maps in comparison to Fuzzy Cognitive Maps

    Directory of Open Access Journals (Sweden)

    Kanika Bhutani

    2016-08-01

    Full Text Available IT projects hold huge importance for economic growth. Today, half of all capital investments are in IT technology. IT systems and projects are extensive and time consuming, implying that their failure is not affordable, so a proper feasibility study assessing project success factors is required. A current methodology, Fuzzy Cognitive Maps (FCM), has been applied to identifying and evaluating the success factors in IT projects, but this technique has certain limitations. This paper discusses two new approaches to evaluating IT project success: Extended Fuzzy Cognitive Maps (E-FCM) and Neutrosophic Cognitive Maps (NCM). The limitations of FCM, namely its inability to consider nonlinear, conditional, and time-delay weights as well as indeterminate relations, are targeted using E-FCM and NCM in this paper.
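For context, a conventional FCM inference step (which E-FCM and NCM extend) repeatedly squashes the weighted sum of concept activations until the state settles. The concepts and weight matrix below are invented for illustration:

```python
import numpy as np

def fcm_infer(state, weights, n_iter=20):
    """Iterate a conventional Fuzzy Cognitive Map: each concept's next
    activation is a squashed weighted sum of the others' activations."""
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    a = np.asarray(state, dtype=float)
    for _ in range(n_iter):
        a = sigmoid(a @ weights)   # weights[i, j]: influence of concept i on j
    return a

# Three illustrative concepts: budget, schedule pressure, project success
w = np.array([[ 0.0,  0.4,  0.6],
              [-0.3,  0.0, -0.5],
              [ 0.0,  0.0,  0.0]])
final = fcm_infer([1.0, 0.5, 0.0], w)
print(final)
```

E-FCM generalizes the fixed weights above to nonlinear, conditional, and time-delayed ones, while NCM additionally admits indeterminate (I-valued) edges; neither extension is sketched here.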

  16. Electrocorticographic Temporal Alteration Mapping: A Clinical Technique for Mapping the Motor Cortex with Movement-Related Cortical Potentials

    Directory of Open Access Journals (Sweden)

    Zehan Wu

    2017-06-01

    Full Text Available We propose electrocorticographic temporal alteration mapping (ETAM) for motor cortex mapping by utilizing movement-related cortical potentials (MRCPs) within the low-frequency band [0.05-3] Hz. This MRCP waveform-based temporal domain approach was compared with the state-of-the-art electrocorticographic frequency alteration mapping (EFAM), which is based on frequency spectrum dynamics. Five patients (two epilepsy cases and three tumor cases) were enrolled in the study. Each patient underwent an intraoperative direct electrocortical stimulation (DECS) procedure for motor cortex localization. Moreover, the patients were required to perform a simple brisk wrist extension task during awake craniotomy surgery. Cross-validation results showed that the proposed ETAM method had high sensitivity (81.8%) and specificity (94.3%) in identifying sites which exhibited positive DECS motor responses. Moreover, although the sensitivity of the ETAM and EFAM approaches was not significantly different, ETAM had greater specificity compared with EFAM (94.3% vs. 86.1%). These results indicate that for intraoperative functional brain mapping, ETAM is a promising novel approach for motor cortex localization with the potential to reduce the need for cortical electrical stimulation.

  17. Dynamics of Open Systems with Affine Maps

    International Nuclear Information System (INIS)

    Zhang Da-Jian; Liu Chong-Long; Tong Dian-Min

    2015-01-01

    Many quantum systems of interest are initially correlated with their environments, and the reduced dynamics of open systems are an interesting yet challenging topic. Affine maps, as an extension of completely positive maps, are a useful tool to describe the reduced dynamics of open systems with initial correlations. However, it is unclear what kind of initial state admits an affine map. In this study, we give a sufficient condition on initial states under which the reduced dynamics can always be described by an affine map. Our result shows that if the initial states of the combined system constitute a convex set, and if the correspondence between the initial states of the open system and those of the combined system, defined by taking the partial trace, is a bijection, then the reduced dynamics of the open system can be described by an affine map. (paper)
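For context, an affine map on states can be written in the standard form below (a common presentation; the paper's sufficient condition concerns which initial-state sets guarantee such a description):

```latex
\Phi(\rho) \;=\; \mathcal{L}(\rho) + \tau,
\qquad \mathcal{L}\ \text{linear},
\qquad \operatorname{Tr}\tau = 0,
```

so that $\Phi$ preserves the trace whenever $\mathcal{L}$ does; the completely positive, trace-preserving case is recovered when $\tau = 0$ and $\mathcal{L}$ admits a Kraus form $\mathcal{L}(\rho) = \sum_k K_k \rho K_k^{\dagger}$.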

  18. Comparison of Pixel-Based and Object-Based Classification Using Parameters and Non-Parameters Approach for the Pattern Consistency of Multi Scale Landcover

    Science.gov (United States)

    Juniati, E.; Arrofiqoh, E. N.

    2017-09-01

    Information extraction from remote sensing data, especially land cover, can be obtained by digital classification. In practice, some people are more comfortable using visual interpretation to retrieve land cover information. However, it is highly influenced by the subjectivity and knowledge of the interpreter, and the process takes time. Digital classification can be done in several ways, depending on the defined mapping approach and the assumptions on data distribution. This study compared several classification methods for different data types at the same location. The data used were Landsat 8 satellite imagery, SPOT 6 imagery, and orthophotos. In practice, these data are used to produce land cover maps at 1:50,000 scale for Landsat, 1:25,000 scale for SPOT, and 1:5,000 scale for orthophotos, but using visual interpretation to retrieve information. A maximum likelihood classifier (MLC), a pixel-based parametric approach, was applied to these data, as was an artificial neural network classifier, a pixel-based non-parametric approach. Moreover, this study applied object-based classifiers to the data. The classification system implemented is the land cover classification of the Indonesian topographic map. The classification was applied to each data source, with the aim of recognizing the patterns and assessing the consistency of the land cover maps produced from each. Furthermore, the study analyses the benefits and limitations of the use of these methods.
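The pixel-based parametric approach (MLC) assigns each pixel to the class maximizing a Gaussian log-likelihood fitted per class. A minimal sketch with synthetic two-band training data (the class distributions are invented):

```python
import numpy as np

def mlc_train(samples, labels):
    """Fit per-class mean vectors and covariance matrices (the 'parametric' step)."""
    stats = {}
    for c in np.unique(labels):
        x = samples[labels == c]
        stats[c] = (x.mean(axis=0), np.cov(x, rowvar=False))
    return stats

def mlc_classify(pixels, stats):
    """Assign each pixel to the class with the maximal Gaussian log-likelihood."""
    classes = sorted(stats)
    scores = []
    for c in classes:
        mean, cov = stats[c]
        inv = np.linalg.inv(cov)
        logdet = np.linalg.slogdet(cov)[1]
        d = pixels - mean
        scores.append(-0.5 * (logdet + np.einsum('ij,jk,ik->i', d, inv, d)))
    return np.array(classes)[np.argmax(np.stack(scores), axis=0)]

# Two illustrative, well-separated spectral classes (2 bands, 50 samples each)
rng = np.random.default_rng(1)
train = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
labels = np.array([0] * 50 + [1] * 50)
stats = mlc_train(train, labels)
pred = mlc_classify(np.array([[0.2, -0.1], [5.1, 4.8]]), stats)
print(pred)
```

A non-parametric classifier such as an artificial neural network skips the Gaussian assumption entirely and learns the decision boundary directly from the training samples.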

  19. Mapping wood density globally using remote sensing and climatological data

    Science.gov (United States)

    Moreno, A.; Camps-Valls, G.; Carvalhais, N.; Kattge, J.; Robinson, N.; Reichstein, M.; Allred, B. W.; Running, S. W.

    2017-12-01

    Wood density (WD), defined as oven-dry mass divided by fresh volume, varies between individuals and describes the carbon investment per unit volume of stem. WD has proven to be a key functional trait in carbon cycle research and correlates with numerous morphological, mechanical, physiological, and ecological properties. In spite of the utility and importance of this trait, an operational framework for spatializing plant WD measurements at the global scale has been lacking. In this work, we present a consistent modular processing chain to derive global maps (500 m) of WD with modern machine learning techniques, using optical remote sensing data (MODIS/Landsat) and climate data on the Google Earth Engine platform. The developed approach uses a hierarchical Bayesian model to fill gaps in the measured plant WD data set and maximize its global representativeness. Measured WD values are then aggregated from plant species to Plant Functional Types (PFT). The spatial abundance of each PFT at 500 m spatial resolution (MODIS) is calculated using a high-resolution (30 m) PFT map developed from Landsat data. Based on these PFT abundances, representative WD values are estimated for each MODIS pixel with nearby measured data. Finally, random forests are used to estimate WD globally from these MODIS pixels using remote sensing and climate data. The validation and assessment of the applied methods indicate that the model explains more than 72% of the spatial variance of the calculated community-aggregated WD estimates, with virtually unbiased estimates and low RMSE (<15%). The maps thus offer new opportunities to study and analyze the global patterns of variation of WD at an unprecedented spatial coverage and spatial resolution.
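The final regression step described above can be sketched as follows. This is a synthetic-data illustration of fitting a random forest from pixel-level predictors to WD, not the paper's actual pipeline; the predictor names and the response function are invented for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-ins for per-pixel predictors: two reflectance bands and
# two climate variables (names and values are purely illustrative).
n = 2000
X = rng.uniform(size=(n, 4))  # e.g. red, NIR, mean temperature, precipitation
# Hypothetical response: wood density (g/cm^3) as a smooth function plus noise.
y = 0.4 + 0.3 * X[:, 1] - 0.2 * X[:, 2] + rng.normal(scale=0.02, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X_tr, y_tr)
print(f"explained variance (R^2) on held-out pixels: {rf.score(X_te, y_te):.2f}")
```

The trained model is then applied to every pixel's predictor stack to produce the gridded WD map.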

  20. The development of flood map in Malaysia

    Science.gov (United States)

    Zakaria, Siti Fairus; Zin, Rosli Mohamad; Mohamad, Ismail; Balubaid, Saeed; Mydin, Shaik Hussein; MDR, E. M. Roodienyanto

    2017-11-01

    In Malaysia, flash floods are common occurrences throughout the year in flood-prone areas. In terms of flood extent, flash floods affect smaller areas, but because of their tendency to occur in densely urbanized areas, the value of damaged property is high and the disruption to traffic flow and businesses is substantial. In river floods, however, especially the river floods of Kelantan and Pahang, the flood extent is widespread and can extend over 1,000 square kilometers. Although the value of property and the density of the affected population are lower, the damage inflicted by these floods can also be high because the affected area is large. To combat these floods, various flood mitigation measures have been carried out. Structural flood mitigation alone can only provide protection for events of 10- to 100-year Average Recurrence Intervals (ARI). One economically effective non-structural approach to flood mitigation and flood management is geospatial technology, which involves flood forecasting and warning services for flood-prone areas. This approach, which involves the use of a geographical information flood forecasting system, also includes the generation of a series of flood maps. There are three types of flood maps, namely the Flood Hazard Map, the Flood Risk Map and the Flood Evacuation Map. The Flood Hazard Map is used to determine areas susceptible to flooding when discharge from a stream exceeds the bank-full stage. Early warnings of incoming flood events enable flood victims to prepare themselves before flooding occurs. Properties and lives can be saved by keeping movable property above the flood levels and, if necessary, by an early evacuation from the area. With respect to flood fighting, an early warning, with reference to a series of flood maps including the flood hazard map, flood risk map and flood evacuation map of the approaching flood, should be able to alert the organization in charge of flood fighting actions and the authority to

  1. Auxin molecular field maps define AUX1 selectivity: many auxin herbicides are not substrates

    Czech Academy of Sciences Publication Activity Database

    Hoyerová, Klára; Hošek, Petr; Quareshy, M.; Li, J.; Klíma, Petr; Kubeš, Martin; Yemm, A. A.; Neve, P.; Tripathi, A.; Bennett, M.J.; Napier, R. M.

    2018-01-01

    Roč. 217, č. 4 (2018), s. 1625-1639 ISSN 0028-646X R&D Projects: GA ČR(CZ) GA16-19557S; GA MŠk LD15137 Grant - others:OPPK(XE) CZ.2.16/3.1.00/21519 Institutional support: RVO:61389030 Keywords : auxin transport * cheminformatics * herbicide * herbicide resistance * molecular field maps * pharmacophore * structure–activity relationship * uptake carrier Subject RIV: ED - Physiology OBOR OECD: Cell biology Impact factor: 7.330, year: 2016

  2. Map of the Physical Sciences

    Energy Technology Data Exchange (ETDEWEB)

    Boyack, Kevin W.

    1999-07-02

    Various efforts to map the structure of science have been undertaken over the years. Using a new tool, VxInsight™, we have mapped and displayed 3000 journals in the physical sciences. This map is navigable and interactively reveals the structure of science at many different levels. Science mapping studies are typically focused at either the macro- or micro-level. At the macro-level, such studies seek to determine the basic structural units of science and their interrelationships. The majority of studies are performed at the discipline or specialty level, and seek to inform science policy and technical decision makers. Studies at both levels probe the dynamic nature of science and the implications of its changes. A variety of databases and methods have been used for these studies. Primary among the databases are the citation indices (SCI and SSCI) from the Institute for Scientific Information, which have gained widespread acceptance for bibliometric studies. Maps are most often based on computed similarities between journal articles (co-citation), keywords or topics (co-occurrence or co-classification), or journals (journal-journal citation counts). Once the similarity matrix is defined, algorithms are used to cluster the data.
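The final step described in the abstract, clustering journals from a similarity matrix, can be sketched as below. The journal names and citation counts are invented, and hierarchical clustering stands in for whatever algorithm the original work used.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

# Toy journal-journal citation counts (symmetric; values are illustrative).
journals = ["PhysRevA", "PhysRevB", "ApJ", "AJ"]
C = np.array([[0, 90, 5, 2],
              [90, 0, 3, 1],
              [5, 3, 0, 80],
              [2, 1, 80, 0]], dtype=float)

# Convert counts to a distance: journals that cite each other heavily are "close".
D = 1.0 / (1.0 + C)
np.fill_diagonal(D, 0.0)

# Average-linkage hierarchical clustering on the condensed distance matrix.
Z = linkage(squareform(D), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(dict(zip(journals, labels)))  # physics journals vs astronomy journals
```

The same pipeline scales to the thousands of journals mentioned in the abstract; only the similarity definition and clustering algorithm would change.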

  3. Mapping behavioral landscapes for animal movement: a finite mixture modeling approach

    Science.gov (United States)

    Tracey, Jeff A.; Zhu, Jun; Boydston, Erin E.; Lyren, Lisa M.; Fisher, Robert N.; Crooks, Kevin R.

    2013-01-01

    Because of its role in many ecological processes, movement of animals in response to landscape features is an important subject in ecology and conservation biology. In this paper, we develop models of animal movement in relation to objects or fields in a landscape. We take a finite mixture modeling approach in which the component densities are conceptually related to different choices for movement in response to a landscape feature, and the mixing proportions are related to the probability of selecting each response as a function of one or more covariates. We combine particle swarm optimization and an Expectation-Maximization (EM) algorithm to obtain maximum likelihood estimates of the model parameters. We use this approach to analyze data for movement of three bobcats in relation to urban areas in southern California, USA. A behavioral interpretation of the models revealed similarities and differences in bobcat movement response to urbanization. All three bobcats avoided urbanization by moving either parallel to urban boundaries or toward less urban areas as the proportion of urban land cover in the surrounding area increased. However, one bobcat, a male with a dispersal-like large-scale movement pattern, avoided urbanization at lower densities and responded strictly by moving parallel to the urban edge. The other two bobcats, which were both residents and occupied similar geographic areas, avoided urban areas using a combination of movements parallel to the urban edge and movement toward areas of less urbanization. However, the resident female appeared to exhibit greater repulsion at lower levels of urbanization than the resident male, consistent with empirical observations of bobcats in southern California. Using the parameterized finite mixture models, we mapped behavioral states to geographic space, creating a representation of a behavioral landscape. 
This approach can provide guidance for conservation planning based on analysis of animal movement data using
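The core finite-mixture idea can be illustrated with a minimal EM fit. The sketch below uses scikit-learn's GaussianMixture on synthetic one-dimensional step-length data; it omits the paper's covariate-dependent mixing proportions and the particle swarm optimization step, and all numbers are invented.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Synthetic 1-D movement data: two latent behavioral states with
# different characteristic step lengths (all numbers invented).
steps_local = rng.normal(loc=50.0, scale=10.0, size=300)    # e.g. foraging
steps_travel = rng.normal(loc=400.0, scale=60.0, size=200)  # e.g. directed travel
steps = np.concatenate([steps_local, steps_travel]).reshape(-1, 1)

# EM fit of a two-component Gaussian mixture.
gm = GaussianMixture(n_components=2, random_state=0).fit(steps)
means = sorted(gm.means_.ravel())
print(f"recovered component means: {means[0]:.0f}, {means[1]:.0f}")

# Posterior component memberships play the role of behavioral states,
# which the paper then maps back to geographic space.
states = gm.predict(steps)
```

In the paper, each component density corresponds to a movement response to a landscape feature, and the mixing proportions depend on covariates such as the surrounding proportion of urban land cover.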

  4. Genetic k-means clustering approach for mapping human vulnerability to chemical hazards in the industrialized city: a case study of Shanghai, China.

    Science.gov (United States)

    Shi, Weifang; Zeng, Weihua

    2013-06-20

    Reducing human vulnerability to chemical hazards in the industrialized city is a matter of great urgency. Vulnerability mapping is an alternative approach for providing vulnerability-reducing interventions in a region. This study presents a method for mapping human vulnerability to chemical hazards by using clustering analysis for effective vulnerability reduction. Taking the city of Shanghai as the study area, we measure human exposure to chemical hazards by using the proximity model, additionally considering the toxicity of hazardous substances, and we capture sensitivity and coping capacity with corresponding indicators. We apply an improved k-means clustering approach based on a genetic algorithm, using a 500 m × 500 m geographical grid as the basic spatial unit. The sum of squared errors and the silhouette coefficient are combined to measure the quality of clustering and to determine the optimal number of clusters. The clustering result reveals a set of six typical human vulnerability patterns that show distinct combinations of vulnerability dimensions. The vulnerability mapping of the study area reflects cluster-specific vulnerability characteristics and their spatial distribution. Finally, we suggest specific points that can provide new insights into rationally allocating the limited funds for vulnerability reduction in each cluster.
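The model-selection loop described in the abstract, combining SSE and the silhouette coefficient to pick the number of clusters, can be sketched as below. Standard k-means++ stands in for the paper's genetic-algorithm-improved k-means, and the indicator data are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Synthetic grid-cell indicators (e.g. exposure, sensitivity, coping capacity)
# drawn around six well-separated centres -- purely illustrative data.
centers = np.array([[0, 0, 0], [8, 0, 0], [0, 8, 0],
                    [0, 0, 8], [8, 8, 0], [8, 0, 8]], dtype=float)
X, _ = make_blobs(n_samples=600, centers=centers, cluster_std=1.0, random_state=7)

best_k, best_sil = None, -1.0
for k in range(2, 10):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    sil = silhouette_score(X, km.labels_)  # cluster-separation quality
    sse = km.inertia_                      # the paper's sum of squared errors
    print(f"k={k}: silhouette={sil:.3f}, SSE={sse:.0f}")
    if sil > best_sil:
        best_k, best_sil = k, sil
print(f"chosen number of clusters: {best_k}")
```

In the paper, the resulting cluster labels are mapped back onto the 500 m grid to produce the vulnerability map.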

  5. Genetic k-Means Clustering Approach for Mapping Human Vulnerability to Chemical Hazards in the Industrialized City: A Case Study of Shanghai, China

    Directory of Open Access Journals (Sweden)

    Weihua Zeng

    2013-06-01

    Full Text Available Reducing human vulnerability to chemical hazards in the industrialized city is a matter of great urgency. Vulnerability mapping is an alternative approach for providing vulnerability-reducing interventions in a region. This study presents a method for mapping human vulnerability to chemical hazards by using clustering analysis for effective vulnerability reduction. Taking the city of Shanghai as the study area, we measure human exposure to chemical hazards by using the proximity model, additionally considering the toxicity of hazardous substances, and we capture sensitivity and coping capacity with corresponding indicators. We apply an improved k-means clustering approach based on a genetic algorithm, using a 500 m × 500 m geographical grid as the basic spatial unit. The sum of squared errors and the silhouette coefficient are combined to measure the quality of clustering and to determine the optimal number of clusters. The clustering result reveals a set of six typical human vulnerability patterns that show distinct combinations of vulnerability dimensions. The vulnerability mapping of the study area reflects cluster-specific vulnerability characteristics and their spatial distribution. Finally, we suggest specific points that can provide new insights into rationally allocating the limited funds for vulnerability reduction in each cluster.

  6. The Power of Visual Approaches in Qualitative Inquiry: The Use of Collage Making and Concept Mapping in Experiential Research

    Directory of Open Access Journals (Sweden)

    Lynn Butler-Kisber

    2010-01-01

    Full Text Available The burgeoning interest in arts-informed research and the increasing variety of visual possibilities as a result of new technologies have paved the way for researchers to explore and use visual forms of inquiry. This article investigates how collage making and concept mapping are useful visual approaches that can inform qualitative research. They are experiential ways of doing/knowing that help to get at tacit aspects of both understanding and process and to make these more explicit to the researcher and more accessible to audiences. It outlines specific ways that each approach can be used with examples to illustrate how the approach informs the researcher's experience and that of the audience. The two approaches are compared and contrasted and issues that can arise in the work are discussed.

  7. Designing a workplace return-to-work program for occupational low back pain: an intervention mapping approach

    Directory of Open Access Journals (Sweden)

    Ammendolia Carlo

    2009-06-01

    Full Text Available Abstract Background Despite over two decades of research, the ability to prevent work-related low back pain (LBP) and disability remains elusive. Recent research suggests that interventions that are focused at the workplace and incorporate the principles of participatory ergonomics and return-to-work (RTW) coordination can improve RTW and reduce disability following a work-related back injury. Workplace interventions or programs to improve RTW are difficult to design and implement given the various individuals and environments involved, each with their own unique circumstances. Intervention mapping provides a framework for designing and implementing complex interventions or programs. The objective of this study is to design a best-evidence RTW program for occupational LBP tailored to the Ontario setting using an intervention mapping approach. Methods We used a qualitative synthesis based on the intervention mapping methodology. Best evidence from systematic reviews, practice guidelines and key articles on the prognosis and management of LBP and on improving RTW was combined with theoretical models for managing LBP and changing behaviour. This was then systematically operationalized into a RTW program using consensus among experts and stakeholders. The RTW program was further refined following feedback from nine focus groups with various stakeholders. Results A detailed five-step RTW program was developed. The key features of the program include: having trained personnel coordinate the RTW process; identifying and ranking barriers and solutions to RTW from the perspective of all important stakeholders; mediating practical solutions at the workplace; and empowering the injured worker in RTW decision-making. Conclusion Intervention mapping provided a useful framework to develop a comprehensive RTW program tailored to the Ontario setting.

  8. Designing a workplace return-to-work program for occupational low back pain: an intervention mapping approach

    Science.gov (United States)

    Ammendolia, Carlo; Cassidy, David; Steensta, Ivan; Soklaridis, Sophie; Boyle, Eleanor; Eng, Stephanie; Howard, Hamer; Bhupinder, Bains; Côté, Pierre

    2009-01-01

    Background Despite over two decades of research, the ability to prevent work-related low back pain (LBP) and disability remains elusive. Recent research suggests that interventions that are focused at the workplace and incorporate the principles of participatory ergonomics and return-to-work (RTW) coordination can improve RTW and reduce disability following a work-related back injury. Workplace interventions or programs to improve RTW are difficult to design and implement given the various individuals and environments involved, each with their own unique circumstances. Intervention mapping provides a framework for designing and implementing complex interventions or programs. The objective of this study is to design a best-evidence RTW program for occupational LBP tailored to the Ontario setting using an intervention mapping approach. Methods We used a qualitative synthesis based on the intervention mapping methodology. Best evidence from systematic reviews, practice guidelines and key articles on the prognosis and management of LBP and on improving RTW was combined with theoretical models for managing LBP and changing behaviour. This was then systematically operationalized into a RTW program using consensus among experts and stakeholders. The RTW program was further refined following feedback from nine focus groups with various stakeholders. Results A detailed five-step RTW program was developed. The key features of the program include: having trained personnel coordinate the RTW process; identifying and ranking barriers and solutions to RTW from the perspective of all important stakeholders; mediating practical solutions at the workplace; and empowering the injured worker in RTW decision-making. Conclusion Intervention mapping provided a useful framework to develop a comprehensive RTW program tailored to the Ontario setting. PMID:19508728

  9. Neighborhood size of training data influences soil map disaggregation

    Science.gov (United States)

    Soil class mapping relies on the ability of sample locations to represent portions of the landscape with similar soil types; however, most digital soil mapping (DSM) approaches intersect sample locations with one raster pixel per covariate layer regardless of pixel size. This approach does not take ...

  10. Landslide susceptibility map: from research to application

    Science.gov (United States)

    Fiorucci, Federica; Reichenbach, Paola; Ardizzone, Francesca; Rossi, Mauro; Felicioni, Giulia; Antonini, Guendalina

    2014-05-01

    A susceptibility map is an essential tool in environmental planning, for evaluating landslide hazard and risk and for the correct and responsible management of territory. Landslide susceptibility is the likelihood of a landslide occurring in an area on the basis of local terrain conditions. It can be expressed as the probability that a given region will be affected by landslides, i.e. an estimate of "where" landslides are likely to occur. In this work we present two examples of landslide susceptibility maps prepared for the Umbria Region and for the Perugia Municipality. These two maps were produced following official requests from the regional and municipal governments to the Research Institute for Hydrogeological Protection (CNR-IRPI). The susceptibility map prepared for the Umbria Region represents the development of previous agreements focused on preparing: i) a landslide inventory map that was included in the Urban Territorial Planning (PUT) and ii) a series of maps for the Regional Plan for Multi-risk Prevention. The activities carried out for the Umbria Region focused on defining and applying methods and techniques for landslide susceptibility zonation. Susceptibility maps were prepared by exploiting a multivariate statistical model (linear discriminant analysis) for the five Civil Protection Alert Zones defined in the regional territory. The five resulting maps were tested and validated using the spatial distribution of recent landslide events that occurred in the region. The susceptibility map for the Perugia Municipality was prepared to be integrated as one of the cartographic products in the municipal development plan (PRG - Piano Regolatore Generale), as required by existing legislation. At the strategic level, one of the main objectives of the PRG is to establish a framework of knowledge and legal aspects for the management of geo-hydrological risk.
    At the national level, most of the susceptibility maps prepared for the PRG were, and still are, obtained

  11. Text Maps: Helping Students Navigate Informational Texts.

    Science.gov (United States)

    Spencer, Brenda H.

    2003-01-01

    Notes that a text map is an instructional approach designed to help students gain fluency in reading content area materials. Discusses how the goal is to teach students about the important features of the material and how the maps can be used to build new understandings. Presents the procedures for preparing and using a text map. (SG)

  12. Grafting, pruning, and the antipodal map on measured laminations

    OpenAIRE

    Dumas, David

    2006-01-01

    Grafting a measured lamination on a hyperbolic surface defines a self-map of Teichmuller space, which is a homeomorphism by a result of Scannell and Wolf. In this paper we study the large-scale behavior of pruning, which is the inverse of grafting. Specifically, for each conformal structure $X \in \T(S)$, pruning $X$ gives a map $\ML(S) \to \T(S)$. We show that this map extends to the Thurston compactification of $\T(S)$, and that its boundary values are the natural antipodal involution relat...

  13. Radiation hybrid mapping of genes in the lithium-sensitive wnt signaling pathway.

    Science.gov (United States)

    Rhoads, A R; Karkera, J D; Detera-Wadleigh, S D

    1999-09-01

    Lithium, an effective drug in the treatment of bipolar disorder, has been proposed to disrupt the Wnt signaling pathway. To facilitate analysis of the possible involvement of elements of the Wnt pathway in human bipolar disorder, a high resolution radiation hybrid mapping (RHM) of these genes was performed. A fine physical location has been obtained for Wnt 7A, frizzled 3, 4 and 5, dishevelled 1, 2 and 3, GSK3beta, axin, alpha-catenin, the Armadillo repeat-containing genes (delta-catenin and ARVCF), and a frizzled-like protein (frpHE) using the Stanford Human Genome Center (SHGC) G3 panel. Most of these genes were previously mapped by fluorescence in situ hybridization (FISH). Frizzled 4, axin and frpHE did not have a previous chromosomal assignment and were linked by RHM to chromosome markers, SHGC-35131 at 11q22.1, NIB1488 at 16p13.3 and D7S2919 at 7p15.2, respectively. Interestingly, some of these genes were found to map within potential regions underlying susceptibility to bipolar disorder and schizophrenia as well as disorders of neurodevelopmental origin. This alternative approach of establishing the precise location of selected genetic components of a candidate pathway and determining if they map within previously defined susceptibility loci should help to identify plausible candidate genes that warrant further analysis through association and mutational scanning.

  14. An automatic approach for rice mapping in temperate region using time series of MODIS imagery: first results for Mediterranean environment

    Science.gov (United States)

    Boschetti, M.; Nelson, A.; Manfrom, G.; Brivio, P. A.

    2012-04-01

    Timely and accurate information on crop typology and status is required to support suitable actions to better manage agricultural production and reduce food insecurity. More specifically, regional crop masking and phenological information are important inputs for spatialized crop growth models in yield forecasting systems. Digital cartographic data available at the global/regional scale, such as GLC2000, GLOBCOVER or the MODIS land cover product (MOD12), are often not adequate for this crop modeling application. For this reason, there is a need to develop and test methods that can provide such information for specific crops using automated classification techniques. In this framework, we focused our analysis on the detection of rice cultivation areas, owing to the importance of this crop. Rice is a staple food for half of the world's population (FAO 2004). Over 90% of the world's rice is produced and consumed in Asia, and the region is home to 70% of the world's poor, most of whom depend on rice for their livelihoods and/or food security. Several initiatives are being promoted at the international level to provide maps of rice cultivated areas in South and South East Asia, using different approaches available in the literature for rice mapping in tropical regions. We contribute to these efforts by proposing an automatic method to detect rice cultivated areas in temperate regions, exploiting MODIS 8-day composites of surface reflectance at 500 m spatial resolution (the MOD09A1 product). Temperate rice is cultivated worldwide in more than 20 countries, covering around 16 M ha for a total production of about 65 M tons of paddy per year. The proposed method is based on a common approach in the literature that first identifies flood conditions that can be related to rice agronomic practice and then checks for vegetation growth.
    The method presents innovative aspects related both to flood detection, exploiting Short Wave Infrared spectral information, and to crop growth monitoring, analyzing
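The two-stage rule sketched in the abstract, flood detection followed by a vegetation-growth check, might look like the following per-pixel sketch. Water absorbs strongly in the SWIR, so a low SWIR reflectance is a plausible flood signal; the thresholds, window and time series below are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def detect_rice(ndvi, swir, flood_thresh=0.12, ndvi_thresh=0.5, window=6):
    """
    Flag a pixel as rice if a flood signal (low SWIR reflectance) is followed
    within `window` composites by vigorous vegetation growth (high NDVI).
    Thresholds are illustrative, not the paper's calibrated values.
    """
    flooded = np.flatnonzero(swir < flood_thresh)
    for t in flooded:
        if np.any(ndvi[t + 1 : t + 1 + window] > ndvi_thresh):
            return True
    return False

# Toy 8-day composite time series for one season (illustrative numbers):
# a flood signal early on, then a crop green-up.
ndvi = np.array([0.15, 0.10, 0.12, 0.25, 0.45, 0.65, 0.72, 0.60])
swir = np.array([0.25, 0.10, 0.11, 0.18, 0.22, 0.24, 0.25, 0.23])
print(detect_rice(ndvi, swir))
```

Applied to every MOD09A1 pixel over a season, such a rule yields a binary rice mask for the region.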

  15. An Automated Approach to Map the History of Forest Disturbance from Insect Mortality and Harvest with Landsat Time-Series Data

    Directory of Open Access Journals (Sweden)

    Christopher S.R. Neigh

    2014-03-01

    Full Text Available Forests contain a majority of the aboveground carbon (C) found in ecosystems, and understanding biomass lost from disturbance is essential to improve our C-cycle knowledge. Our study region in the Wisconsin and Minnesota Laurentian Forest had a strong decline in Normalized Difference Vegetation Index (NDVI) from 1982 to 2007, observed with the National Ocean and Atmospheric Administration’s (NOAA) series of Advanced Very High Resolution Radiometer (AVHRR). To understand the potential role of disturbances in the terrestrial C-cycle, we developed an algorithm to map forest disturbances from either harvest or insect outbreak for Landsat time-series stacks. We merged two image analysis approaches into one algorithm to monitor forest change that included: (1) multiple disturbance index thresholds to capture clear-cut harvest; and (2) a spectral trajectory-based image analysis with multiple confidence interval thresholds to map insect outbreak. We produced 20 maps and evaluated classification accuracy with air-photos and insect air-survey data to understand the performance of our algorithm. We achieved overall accuracies ranging from 65% to 75%, with an average accuracy of 72%. The producer’s and user’s accuracy ranged from a maximum of 32% to 70% for insect disturbance, 60% to 76% for insect mortality and 82% to 88% for harvested forest, which was the dominant disturbance agent. Forest disturbances accounted for 22% of total forested area (7349 km2). Our algorithm provides a basic approach to map disturbance history where large impacts to forest stands have occurred and highlights the limited spectral sensitivity of Landsat time-series to outbreaks of defoliating insects. We found that only harvest and insect mortality events can be mapped with adequate accuracy with a non-annual Landsat time-series. This limited our land cover understanding of NDVI decline drivers. We demonstrate that to capture more subtle disturbances with spectral trajectories

  16. An Automated Approach to Map the History of Forest Disturbance from Insect Mortality and Harvest with Landsat Time-Series Data

    Science.gov (United States)

    Rudasill-Neigh, Christopher S.; Bolton, Douglas K.; Diabate, Mouhamad; Williams, Jennifer J.; Carvalhais, Nuno

    2014-01-01

    Forests contain a majority of the aboveground carbon (C) found in ecosystems, and understanding biomass lost from disturbance is essential to improve our C-cycle knowledge. Our study region in the Wisconsin and Minnesota Laurentian Forest had a strong decline in Normalized Difference Vegetation Index (NDVI) from 1982 to 2007, observed with the National Ocean and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometer (AVHRR). To understand the potential role of disturbances in the terrestrial C-cycle, we developed an algorithm to map forest disturbances from either harvest or insect outbreak for Landsat time-series stacks. We merged two image analysis approaches into one algorithm to monitor forest change that included: (1) multiple disturbance index thresholds to capture clear-cut harvest; and (2) a spectral trajectory-based image analysis with multiple confidence interval thresholds to map insect outbreak. We produced 20 maps and evaluated classification accuracy with air-photos and insect air-survey data to understand the performance of our algorithm. We achieved overall accuracies ranging from 65% to 75%, with an average accuracy of 72%. The producer's and user's accuracy ranged from a maximum of 32% to 70% for insect disturbance, 60% to 76% for insect mortality and 82% to 88% for harvested forest, which was the dominant disturbance agent. Forest disturbances accounted for 22% of total forested area (7349 km2). Our algorithm provides a basic approach to map disturbance history where large impacts to forest stands have occurred and highlights the limited spectral sensitivity of Landsat time-series to outbreaks of defoliating insects. We found that only harvest and insect mortality events can be mapped with adequate accuracy with a non-annual Landsat time-series. This limited our land cover understanding of NDVI decline drivers. 
We demonstrate that to capture more subtle disturbances with spectral trajectories, future observations
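The trajectory-based component of such an algorithm can be illustrated with a per-pixel sketch: fit a trend to a spectral-index time series and flag observations falling outside a confidence band. The z-score band below stands in for the paper's confidence-interval thresholds, and the threshold and series are invented.

```python
import numpy as np

def flag_disturbance(index_series, years, z_thresh=2.5):
    """
    Trajectory-based detection sketch: fit a linear trend to a per-pixel
    spectral index time series and flag years whose residual falls outside
    a z-score band. A simplified analogue of the paper's confidence-interval
    thresholds; z_thresh is illustrative.
    """
    coef = np.polyfit(years, index_series, deg=1)
    resid = index_series - np.polyval(coef, years)
    z = resid / resid.std(ddof=1)
    return years[np.abs(z) > z_thresh]

years = np.arange(2000, 2012)
ndvi = 0.75 - 0.002 * (years - 2000)  # slow decline typical of an intact stand
ndvi = ndvi.copy()
ndvi[7] = 0.35                        # abrupt drop, e.g. clear-cut harvest
print(flag_disturbance(ndvi, years))
```

The clear-cut branch of the algorithm would instead compare a disturbance index against fixed thresholds, which is why abrupt harvest events are easier to map than gradual defoliation.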

  17. Accurate atom-mapping computation for biochemical reactions.

    Science.gov (United States)

    Latendresse, Mario; Malerich, Jeremiah P; Travers, Mike; Karp, Peter D

    2012-11-26

    The complete atom mapping of a chemical reaction is a bijection of the reactant atoms to the product atoms that specifies the terminus of each reactant atom. Atom mapping of biochemical reactions is useful for many applications of systems biology, in particular for metabolic engineering, where synthesizing new biochemical pathways has to account for the number of carbon atoms from a source compound that are conserved in the synthesis of a target compound. Rapid, accurate computation of the atom mapping(s) of a biochemical reaction remains elusive despite significant work on this topic. In particular, past researchers did not validate the accuracy of their mapping algorithms. We introduce a new method for computing atom mappings called the minimum weighted edit-distance (MWED) metric. The metric is based on the propensity of bonds to react and computes biochemically valid atom mappings for a large percentage of biochemical reactions. MWED models can be formulated efficiently as Mixed-Integer Linear Programs (MILPs). We have demonstrated this approach on 7501 reactions of the MetaCyc database, for which 87% of the models could be solved in less than 10 s. For 2.1% of the reactions, we found multiple optimal atom mappings. We show that the error rate is 0.9% (22 reactions) by comparing these atom mappings to 2446 atom mappings of the manually curated Kyoto Encyclopedia of Genes and Genomes (KEGG) RPAIR database. To our knowledge, our computational atom-mapping approach is the most accurate and among the fastest published to date. The atom-mapping data will be available in the MetaCyc database later in 2012; the atom-mapping software will be available within the Pathway Tools software later in 2012.
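As a simplified stand-in for the MWED formulation, the core subproblem of finding a minimum-weight bijection between reactant and product atoms can be illustrated with the Hungarian algorithm. The real MWED model is a MILP over bond edits; the cost values below are invented, and element-incompatible pairs (e.g. carbon to oxygen) are simply given a prohibitive cost.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Toy cost matrix: cost[i][j] is a hypothetical "edit weight" for mapping
# reactant atom i to product atom j; infeasible pairs get a huge cost.
BIG = 1e9
cost = np.array([
    [0.0, 2.0, BIG],   # reactant C1
    [2.0, 0.5, BIG],   # reactant C2
    [BIG, BIG, 0.0],   # reactant O1 can only map to the product O
])

rows, cols = linear_sum_assignment(cost)  # minimum-weight bijection
mapping = dict(zip(rows.tolist(), cols.tolist()))
total = cost[rows, cols].sum()
print(mapping, total)  # {0: 0, 1: 1, 2: 2} with total weight 0.5
```

The MILP in the paper generalizes this by scoring bond breakages and formations (weighted by bond propensity to react) rather than just per-atom assignment costs.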

  18. AERIAL TERRAIN MAPPING USING UNMANNED AERIAL VEHICLE APPROACH

    Directory of Open Access Journals (Sweden)

    K. N. Tahar

    2012-08-01

    Full Text Available This paper looks into the latest achievements in low-cost Unmanned Aerial Vehicle (UAV) technology and its capacity to map semi-developed areas. The objectives of this study are to establish a new methodology, or a new algorithm, for image registration during the interior orientation process, and to determine the accuracy of the photogrammetric products derived from UAV images. Recently, UAV technology has been used in several applications such as mapping, agriculture and surveillance. The performance of the low-cost UAV mapping study was established on a study area with two image processing methods so that the results could be compared. A non-metric camera was attached to the bottom of the UAV and was used to capture images at both sites after it went through several calibration steps. Calibration was carried out to determine the focal length, principal distance, radial lens distortion, tangential lens distortion and affinity. A new method of image registration for a non-metric camera, forming part of the new methodology of this study, is discussed in this paper. This method used the UAV's onboard Global Positioning System (GPS) to register the UAV images for the interior orientation process. Check points were established randomly at both sites using rapid-static GPS. Ground control points were used for the exterior orientation process, and check points were used for the accuracy assessment of the photogrammetric products. All acquired images were processed in photogrammetric software. Two methods of image registration were applied in this study, namely, GPS onboard registration and ground control point registration. Both registrations were processed using photogrammetric software and the results are discussed. Two products were generated in this study: the digital orthophoto and the digital terrain model.
These results were analyzed by using the root

  19. Gaussian Multiple Instance Learning Approach for Mapping the Slums of the World Using Very High Resolution Imagery

    Energy Technology Data Exchange (ETDEWEB)

    Vatsavai, Raju [ORNL]

    2013-01-01

    In this paper, we present a computationally efficient algorithm based on multiple instance learning for mapping informal settlements (slums) using very high-resolution remote sensing imagery. From a remote sensing perspective, informal settlements share unique spatial characteristics that distinguish them from other urban structures like industrial, commercial, and formal residential settlements. However, regular pattern recognition and machine learning methods, which are predominantly single-instance or per-pixel classifiers, often fail to accurately map informal settlements as they do not capture the complex spatial patterns. To overcome these limitations we employed a multiple instance based machine learning approach, where groups of contiguous pixels (image patches) are modeled as generated by a Gaussian distribution. We have conducted several experiments on very high-resolution satellite imagery, representing four unique geographic regions across the world. Our method showed consistent improvement in accurately identifying informal settlements.

  20. An overview of animal science research 1945-2011 through science mapping analysis.

    Science.gov (United States)

    Rodriguez-Ledesma, A; Cobo, M J; Lopez-Pujalte, C; Herrera-Viedma, E

    2015-12-01

    The conceptual structure of the field of Animal Science (AS) research is examined by means of a longitudinal science mapping analysis. The whole of the AS research field is analysed, revealing its conceptual evolution. To this end, an automatic approach to detecting and visualizing hidden themes or topics and their evolution across a consecutive span of years was applied to AS publications of the JCR category 'Agriculture, Dairy & Animal Science' during the period 1945-2011. This automatic approach was based on a coword analysis and combines performance analysis and science mapping. To observe the conceptual evolution of AS, six consecutive periods were defined: 1945-1969, 1970-1979, 1980-1989, 1990-1999, 2000-2005 and 2006-2011. Research in AS was identified as having focused on ten main thematic areas: ANIMAL-FEEDING, SMALL-RUMINANTS, ANIMAL-REPRODUCTION, DAIRY-PRODUCTION, MEAT-QUALITY, SWINE-PRODUCTION, GENETICS-AND-ANIMAL-BREEDING, POULTRY, ANIMAL-WELFARE and GROWTH-FACTORS-AND-FATTY-ACIDS. The results show how genomic studies have gained in importance and become integrated with other thematic areas. The whole of AS research has become oriented towards an overall framework in which animal welfare, sustainable management and human health play a major role. All this would affect the future structure and management of livestock farming. © 2014 Blackwell Verlag GmbH.

  1. Cropland Mapping over Sahelian and Sudanian Agrosystems: A Knowledge-Based Approach Using PROBA-V Time Series at 100-m

    Directory of Open Access Journals (Sweden)

    Marie-Julie Lambert

    2016-03-01

    Full Text Available Early warning systems for food security require accurate and up-to-date information on the location of major crops in order to prevent hazards. A recent systematic analysis of existing cropland maps identified priority areas for cropland mapping and highlighted a major need for the Sahelian and Sudanian agrosystems. This paper proposes a knowledge-based approach to map cropland in the Sahelian and Sudanian agrosystems that benefits from the 100-m spatial resolution of the recent PROBA-V sensor. The methodology uses five temporal features characterizing crop development throughout the vegetative season to optimize cropland discrimination. A feature importance analysis validates the efficiency of using a diversity of temporal features. The fully-automated method offers the first cropland map at 100-m using the PROBA-V sensor with an overall accuracy of 84% and an F-score for the cropland class of 74%. The improvements observed compared to existing cropland products are related to the hectometric resolution, to the methodology and to the quality of the labeling layer from which reliable training samples were automatically extracted. Classification errors are mainly explained by data availability and landscape fragmentation. Further improvements are expected with the upcoming enhanced cloud screening of the PROBA-V sensor.

  2. An Approach for Predicting Essential Genes Using Multiple Homology Mapping and Machine Learning Algorithms.

    Science.gov (United States)

    Hua, Hong-Li; Zhang, Fa-Zhan; Labena, Abraham Alemayehu; Dong, Chuan; Jin, Yan-Ting; Guo, Feng-Biao

    Investigation of essential genes is important for comprehending the minimal gene set of a cell and for discovering potential drug targets. In this study, a novel approach based on multiple homology mapping and machine learning methods was introduced to predict essential genes. We focused on 25 bacteria whose essential genes have been characterized. The predictions yielded the highest area under the receiver operating characteristic (ROC) curve (AUC) of 0.9716 through a tenfold cross-validation test. Proper features were utilized to construct models to make predictions in distantly related bacteria. The accuracy of the predictions was evaluated via the consistency between the predictions and the known essential genes of the target species. The highest AUC of 0.9552 and an average AUC of 0.8314 were achieved when making predictions across organisms. An independent dataset from Synechococcus elongatus, which was released recently, was obtained for further assessment of the performance of our model. The AUC score of these predictions is 0.7855, which is higher than that of other methods. This research shows that features obtained by homology mapping alone can achieve results as good as, or even better than, those of integrated features. Meanwhile, the work indicates t