WorldWideScience

Sample records for random mapping approach

  1. A Stochastic Simulation Framework for the Prediction of Strategic Noise Mapping and Occupational Noise Exposure Using the Random Walk Approach

    Science.gov (United States)

    Haron, Zaiton; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri

    2015-01-01

Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels, and it assesses the impact of noise on the workers and the surrounding environment. For validation, three case studies were conducted to check the accuracy of the predictions and to determine the efficiency and effectiveness of this approach. The results showed high prediction accuracy, with most absolute differences below 2 dBA, and the predicted noise doses mostly fell within the range of the measurements. The random walk approach was therefore effective in dealing with environmental noise and can predict strategic noise mapping to facilitate noise monitoring and noise control in workplaces. PMID:25875019
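    As a toy illustration of the idea (not the RW-eNMS framework itself): treat a worker as a random walker in a hall with fixed noise sources and accumulate an energy-equivalent exposure level along the walk. The geometry, source levels, and attenuation rule below are all invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical workplace: source positions (m) and levels (dBA at 1 m).
    sources = [((5.0, 5.0), 95.0), ((15.0, 10.0), 88.0)]

    def level_at(pos):
        """Combine source contributions with simple spherical spreading (-20*log10(r))."""
        p = sum(10 ** ((L - 20 * np.log10(max(np.hypot(pos[0] - sx, pos[1] - sy), 1.0))) / 10)
                for (sx, sy), L in sources)
        return 10 * np.log10(p)

    def random_walk_dose(steps=480, dt_min=1.0):
        """One worker shift as a 2D random walk; returns an equivalent level Leq in dBA."""
        pos = np.array([10.0, 10.0])
        acc = 0.0
        for _ in range(steps):
            pos += rng.normal(0, 1.0, size=2)           # random movement per minute
            pos = np.clip(pos, 0.0, 20.0)               # stay inside the 20 m x 20 m hall
            acc += 10 ** (level_at(pos) / 10) * dt_min  # energy-equivalent accumulation
        return 10 * np.log10(acc / (steps * dt_min))

    print(f"simulated shift Leq ~ {random_walk_dose():.1f} dBA")
    ```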

  2. Creating groups with similar expected behavioural response in randomized controlled trials: a fuzzy cognitive map approach.

    Science.gov (United States)

    Giabbanelli, Philippe J; Crutzen, Rik

    2014-12-12

Controlling bias is key to successful randomized controlled trials for behaviour change. Bias can be generated at multiple points during a study, for example, when participants are allocated to different groups. Several allocation methods exist to randomly distribute participants over the groups such that their prognostic factors (e.g., socio-demographic variables) are similar, in an effort to keep participants' outcomes comparable at baseline. Since it is challenging to create such groups when all prognostic factors are taken together, these factors are often balanced in isolation or only the ones deemed most relevant are balanced. However, the complex interactions among prognostic factors may lead to a poor estimate of behaviour, causing unbalanced groups at baseline, which may introduce accidental bias. We present a novel computational approach for allocating participants to different groups. Our approach automatically uses participants' experiences to model (the interactions among) their prognostic factors and infer how their behaviour is expected to change under a given intervention. Participants are then allocated based on their inferred behaviour rather than on selected prognostic factors. In order to assess the potential of our approach, we collected two datasets regarding the behaviour of participants (n = 430 and n = 187). The potential of the approach on larger sample sizes was examined using synthetic data. All three datasets highlighted that our approach could lead to groups with similar expected behavioural changes. The computational approach proposed here can complement existing statistical approaches when behaviours involve numerous complex relationships, and quantitative data is not readily available to model these relationships. The software implementing our approach and commonly used alternatives is provided at no charge to assist practitioners in the design of their own studies and to compare participants' allocations.
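    The abstract does not spell out the inference rule, but a standard fuzzy cognitive map iterates concept activations through a signed weight matrix and a squashing function; below is a minimal sketch with invented concepts and weights.

    ```python
    import numpy as np

    def fcm_infer(W, a0, steps=50):
        """Iterate a fuzzy cognitive map: a_{t+1} = sigmoid(W @ a_t)."""
        a = np.asarray(a0, dtype=float)
        for _ in range(steps):
            a = 1.0 / (1.0 + np.exp(-(W @ a)))  # logistic squashing keeps activations in (0, 1)
        return a

    # Hypothetical 3-concept map: [self-efficacy, social support, physical activity].
    # W[i, j] is the influence of concept j on concept i.
    W = np.array([[0.0, 0.4, 0.0],
                  [0.0, 0.0, 0.0],
                  [0.6, 0.3, 0.0]])
    print(fcm_infer(W, [0.8, 0.5, 0.2]))  # inferred activation profile
    ```

    Groups would then be balanced on the inferred activation of the behaviour concept rather than on individual prognostic factors.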

  3. Improve ground-level PM2.5 concentration mapping using a random forests-based geostatistical approach.

    Science.gov (United States)

    Liu, Ying; Cao, Guofeng; Zhao, Naizhuo; Mulligan, Kevin; Ye, Xinyue

    2018-01-04

Accurate measurements of ground-level PM2.5 (particulate matter with aerodynamic diameters equal to or less than 2.5 μm) concentrations are critically important to human and environmental health studies. In this regard, satellite-derived gridded PM2.5 datasets, particularly those datasets derived from chemical transport models (CTM), have demonstrated unique attractiveness in terms of their geographic and temporal coverage. The CTM-based approaches, however, often yield results with a coarse spatial resolution (typically at 0.1° of spatial resolution) and tend to ignore or simplify the impact of geographic and socioeconomic factors on PM2.5 concentrations. In this study, with a focus on the long-term PM2.5 distribution in the contiguous United States, we adopt a random forests-based geostatistical (regression kriging) approach to improve one of the most commonly used satellite-derived, gridded PM2.5 datasets with a refined spatial resolution (0.01°) and enhanced accuracy. By combining the random forests machine learning method and the kriging family of methods, the geostatistical approach effectively integrates ground-based PM2.5 measurements and related geographic variables while accounting for the non-linear interactions and the complex spatial dependence. The accuracy and advantages of the proposed approach are demonstrated by comparing the results with existing PM2.5 datasets. This manuscript also highlights the effectiveness of the geographical variables in long-term PM2.5 mapping, including brightness of nighttime lights, normalized difference vegetation index and elevation, and discusses the contribution of each of these variables to the spatial distribution of PM2.5 concentrations.

  4. Efficient $\chi^2$ Kernel Linearization via Random Feature Maps.

    Science.gov (United States)

    Yuan, Xiao-Tong; Wang, Zhenzhen; Deng, Jiankang; Liu, Qingshan

    2016-11-01

Explicit feature mapping is an appealing way to linearize additive kernels, such as the χ² kernel, for training large-scale support vector machines (SVMs). Although accurate in approximation, feature mapping can pose computational challenges in high-dimensional settings, as it expands the original features to a higher-dimensional space. To handle this issue in the context of χ² kernel SVM learning, we introduce a simple yet efficient method to approximately linearize the χ² kernel through random feature maps. The main idea is to use sparse random projection to reduce the dimensionality of the feature maps while preserving their approximation capability with respect to the original kernel. We provide an approximation error bound for the proposed method. Furthermore, we extend our method to χ² multiple kernel SVM learning. Extensive experiments on large-scale image classification tasks confirm that the proposed approach is able to significantly speed up the training process of χ² kernel SVMs with almost no loss of testing accuracy.
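    Both ingredients named in the abstract (an explicit additive-χ² feature map and a sparse random projection) have off-the-shelf counterparts in scikit-learn, so the shape of the pipeline can be sketched as follows; the parameters are arbitrary, and this mirrors the pipeline structure, not the paper's specific construction or its error bound.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.kernel_approximation import AdditiveChi2Sampler
    from sklearn.linear_model import SGDClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.random_projection import SparseRandomProjection

    X, y = make_classification(n_samples=2000, n_features=300, random_state=0)
    X = abs(X)  # chi-squared kernels assume non-negative (histogram-like) features

    clf = make_pipeline(
        AdditiveChi2Sampler(sample_steps=2),          # explicit chi2 feature map (3x dim.)
        SparseRandomProjection(n_components=256,      # compress the expanded features
                               random_state=0),
        SGDClassifier(loss="hinge", random_state=0),  # linear SVM on the projected map
    )
    print(clf.fit(X, y).score(X, y))
    ```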

  5. Maps of random walks on complex networks reveal community structure.

    Science.gov (United States)

    Rosvall, Martin; Bergstrom, Carl T

    2008-01-29

To comprehend the multipartite organization of large-scale biological and social systems, we introduce an information theoretic approach that reveals community structure in weighted and directed networks. We use the probability flow of random walks on a network as a proxy for information flows in the real system and decompose the network into modules by compressing a description of the probability flow. The result is a map that both simplifies and highlights the regularities in the structure and their relationships. We illustrate the method by making a map of scientific communication as captured in the citation patterns of >6,000 journals. We discover a multicentric organization with fields that vary dramatically in size and degree of integration into the network of science. Along the backbone of the network (including physics, chemistry, molecular biology, and medicine), information flows bidirectionally, but the map reveals a directional pattern of citation from the applied fields to the basic sciences.
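    For reference, the description length minimized by this approach is the map equation, with an index codebook term for movements between modules and one codebook term per module (amssymb arrow symbols):

    ```latex
    L(\mathsf{M}) \;=\; q_{\curvearrowright}\, H(\mathcal{Q})
    \;+\; \sum_{i=1}^{m} p_{\circlearrowright}^{\,i}\, H(\mathcal{P}^{i})
    ```

    Here q↷ is the probability that the random walker switches modules, H(Q) is the entropy of the module names, and H(P^i) is the entropy of movements within module i; minimizing L(M) over partitions M of the network yields the map.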

  6. Random mappings with a single absorbing center and combinatorics of discretizations of the logistic mapping

    Directory of Open Access Journals (Sweden)

    A. Klemm

    1999-01-01

Distributions of basic characteristics of random mappings with a single absorbing center are calculated. Results explain some phenomena occurring in computer simulations of the logistic mapping.

  7. A Bayesian Network Approach to Ontology Mapping

    National Research Council Canada - National Science Library

    Pan, Rong; Ding, Zhongli; Yu, Yang; Peng, Yun

    2005-01-01

.... In this approach, the source and target ontologies are first translated into Bayesian networks (BN); the concept mapping between the two ontologies is treated as evidential reasoning between the two translated BNs...

  8. A random matrix approach to language acquisition

    Science.gov (United States)

    Nicolaidis, A.; Kosmidis, Kosmas; Argyrakis, Panos

    2009-12-01

Since language is tied to cognition, we expect the linguistic structures to reflect patterns that we encounter in nature and are analyzed by physics. Within this realm we investigate the process of lexicon acquisition, using analytical and tractable methods developed within physics. A lexicon is a mapping between sounds and referents of the perceived world. This mapping is represented by a matrix, and the linguistic interaction among individuals is described by a random matrix model. There are two essential parameters in our approach: the strength of the linguistic interaction β, which is considered a genetically determined ability, and the number N of sounds employed (the lexicon size). Our model of linguistic interaction is analytically studied using methods of statistical physics and simulated by Monte Carlo techniques. The analysis reveals an intricate relationship between the innate propensity for language acquisition β and the lexicon size N, N ~ exp(β). Thus a small increase of the genetically determined β may lead to an incredible lexical explosion. Our approximate scheme offers an explanation for the biological affinity of different species and their simultaneous linguistic disparity.

  9. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

Observations from sums of random variables with a random number of summands, known as random, compound or stopped sums, arise within many areas of engineering and science. Quite often it is desirable to infer properties of the distribution of the terms in the random sum. In the present paper we ... review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably ...

  10. MAGNETIC VT study: a prospective, multicenter, post-market randomized controlled trial comparing VT ablation outcomes using remote magnetic navigation-guided substrate mapping and ablation versus manual approach in a low LVEF population.

    Science.gov (United States)

    Di Biase, Luigi; Tung, Roderick; Szili-Torok, Tamás; Burkhardt, J David; Weiss, Peter; Tavernier, Rene; Berman, Adam E; Wissner, Erik; Spear, William; Chen, Xu; Neužil, Petr; Skoda, Jan; Lakkireddy, Dhanunjaya; Schwagten, Bruno; Lock, Ken; Natale, Andrea

    2017-04-01

    Patients with ischemic cardiomyopathy (ICM) are prone to scar-related ventricular tachycardia (VT). The success of VT ablation depends on accurate arrhythmogenic substrate localization, followed by optimal delivery of energy provided by constant electrode-tissue contact. Current manual and remote magnetic navigation (RMN)-guided ablation strategies aim to identify a reentry circuit and to target a critical isthmus through activation and entrainment mapping during ongoing tachycardia. The MAGNETIC VT trial will assess if VT ablation using the Niobe™ ES magnetic navigation system results in superior outcomes compared to a manual approach in subjects with ischemic scar VT and low ejection fraction. This is a randomized, single-blind, prospective, multicenter post-market study. A total of 386 subjects (193 per group) will be enrolled and randomized 1:1 between treatment with the Niobe ES system and treatment via a manual procedure at up to 20 sites. The study population will consist of patients with ischemic cardiomyopathy with left ventricular ejection fraction (LVEF) of ≤35% and implantable cardioverter defibrillator (ICD) who have sustained monomorphic VT. The primary study endpoint is freedom from any recurrence of VT through 12 months. The secondary endpoints are acute success; freedom from any VT at 1 year in a large-scar subpopulation; procedure-related major adverse events; and mortality rate through 12-month follow-up. Follow-up will consist of visits at 3, 6, 9, and 12 months, all of which will include ICD interrogation. The MAGNETIC VT trial will help determine whether substrate-based ablation of VT with RMN has clinical advantages over manual catheter manipulation. Clinicaltrials.gov identifier: NCT02637947.

  11. Neurodynamics in Randomly Coupled Circle Maps

    Science.gov (United States)

    Matsuno, Tetsuya; Toko, Kiyoshi; Yamafuji, Kaoru

    1996-05-01

The dynamics of retrieval processes in a system composed of coupled circle maps is studied by means of a statistical method and numerical simulations. Phase patterns are embedded in the coupling parameters so that the system may work as an associative memory. A parameter, an amplification factor applied to all the coupling strengths, is introduced to investigate the effect of the strength of the coupling nonlinearity on the behavior of the system. The statistical method provides a set of time-evolution equations representing the macroscopic behavior. It is found that the storage capacity is considerably enhanced by the introduced amplification factor. It is also shown that the system exhibits macroscopic chaotic oscillations when the strength of the coupling is sufficiently large. Moreover, clustering is observed, as in other types of globally coupled nonlinear systems.

  12. Improving the pseudo-randomness properties of chaotic maps using deep-zoom.

    Science.gov (United States)

    Machicao, Jeaneth; Bruno, Odemir M

    2017-05-01

A generalized method is proposed to compose new orbits from a given chaotic map. The method provides an approach to examine discrete-time chaotic maps in a "deep-zoom" manner, by looking at the digits k places to the right of the decimal separator of a given point of the underlying chaotic map. Interesting phenomena have been identified. Rapid randomization was observed, i.e., chaotic patterns tend to become indistinguishable when compared to the original orbits of the underlying chaotic map. Our results were presented using different graphical analyses (i.e., time evolution, bifurcation diagram, Lyapunov exponent, Poincaré diagram, and frequency distribution). Moreover, taking advantage of this randomization improvement, we propose a Pseudo-Random Number Generator (PRNG) based on the k-logistic map. The proposed PRNG successfully passed both the DIEHARD and NIST test suites, and was comparable with traditional PRNGs such as the Mersenne Twister. The results suggest that simple maps such as the logistic map can be considered good PRNG methods.
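    A minimal sketch of the deep-zoom construction as described: iterate the logistic map at high precision, drop the first k decimal digits of each orbit point, and keep the remainder as the pseudo-random output. The parameters (r, k, the seed) are illustrative, and this is not the paper's reference implementation.

    ```python
    from decimal import Decimal, getcontext

    getcontext().prec = 60  # enough precision that dropping k digits stays meaningful

    def k_logistic(x0="0.141592653589793", r="3.9999", k=6, n=5):
        """Yield deep-zoomed orbit points: the fractional part of x * 10**k."""
        x, r = Decimal(x0), Decimal(r)
        shift = Decimal(10) ** k
        for _ in range(n):
            x = r * x * (1 - x)            # logistic map step
            yield float((x * shift) % 1)   # keep digits k+1, k+2, ... of x

    print(list(k_logistic()))
    ```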

  13. Selecting a phoneme-to-grapheme mapping: Random or weighted selection?

    Directory of Open Access Journals (Sweden)

    Binna Lee

    2015-05-01

Our findings demonstrate that random selection underestimates MOA's PG correspondences, whereas weighted selection predicts higher PG correspondences than he produces. To explain his intermediate spelling performance on PPEs, we will test additional approaches to weighting the relative probability of PG mappings, including using log frequencies, separating consonant and vowel status, and considering the number of grapheme options for each phoneme.
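    The two selection schemes being contrasted fit in a few lines; the grapheme options and frequencies below are invented for illustration.

    ```python
    import random

    random.seed(1)
    # Hypothetical phoneme /f/ with candidate graphemes and corpus frequencies.
    graphemes = ["f", "ph", "ff", "gh"]
    freqs     = [78,  12,   8,    2]   # relative frequency of each PG mapping

    random_pick   = random.choice(graphemes)                     # uniform: each option 25%
    weighted_pick = random.choices(graphemes, weights=freqs)[0]  # frequency-weighted
    print(random_pick, weighted_pick)
    ```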

  14. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    Science.gov (United States)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

digital soil mapping methods and sets of ancillary variables for producing the most accurate spatial prediction of texture classes in a given area of interest. Both legacy and recently collected data on PSD were used as reference information. The predictor variable data set consisted of a digital elevation model and its derivatives, lithology, land use maps, as well as various bands and indices of satellite images. Two conceptually different approaches can be applied in the mapping process. Textural classification can be carried out after the particle size data have been spatially extended by a proper geostatistical method; alternatively, the textural classification is carried out first, followed by spatial extension through a suitable data mining method. Following the first approach, maps of sand, silt and clay percentages were computed through regression kriging (RK). Since the three maps are compositional (their sum must be 100%), we applied the Additive Log-Ratio (alr) transformation instead of kriging them independently (see the sketch below). Finally, the texture class map was compiled according to the USDA categories from the three maps. Different combinations of reference and training soil data and auxiliary covariables resulted in several different maps. Following the second approach, the PSD data were first classified into the USDA categories, and the texture class maps were then compiled directly by data mining methods (classification trees and random forests). The various results were compared to each other as well as to the RK maps. The performance of the different methods and data sets was examined by testing the accuracy of the geostatistically computed and the directly classified results, to identify the most predictive and accurate method. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
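    The alr step mentioned above is compact enough to show directly. A sketch with clay as the (arbitrary) denominator component; in practice the two alr coordinates would be kriged and then back-transformed:

    ```python
    import numpy as np

    def alr(parts):
        """Additive log-ratio transform of sand/silt/clay rows summing to 100."""
        p = np.asarray(parts, float) / 100.0
        return np.log(p[:, :2] / p[:, 2:3])       # log(sand/clay), log(silt/clay)

    def alr_inv(z):
        """Back-transform (kriged) alr coordinates to percentages summing to 100."""
        e = np.exp(np.hstack([z, np.zeros((len(z), 1))]))
        return 100.0 * e / e.sum(axis=1, keepdims=True)

    comps = np.array([[40.0, 40.0, 20.0], [10.0, 30.0, 60.0]])
    print(alr_inv(alr(comps)))                     # round-trips to the input
    ```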

  15. Concepts Map Approach in e-Classroom

    Directory of Open Access Journals (Sweden)

    Gabriel ZAMFIR

    2012-01-01

This article is designed as an educational research study focused on the e-Classroom as a medium of instruction, based on assisted didactics design and teacher-assisted learning, in order to develop interactive applications integrating the concepts map approach. In this context, the paper proposes a specific conceptual framework applied in a theoretical model, as the base of an analytical framework used in a case study. Such a paradigm defines classwork as the basic activity of the student, connecting fieldwork and deskwork, and finally it develops the basic and specific competencies of the individual according to the educational objectives.

  16. Random-breakage mapping method applied to human DNA sequences

    Science.gov (United States)

    Lobrich, M.; Rydberg, B.; Cooper, P. K.; Chatterjee, A. (Principal Investigator)

    1996-01-01

The random-breakage mapping method [Game et al. (1990) Nucleic Acids Res., 18, 4453-4461] was applied to DNA sequences in human fibroblasts. The methodology involves NotI restriction endonuclease digestion of DNA from irradiated cells, followed by pulsed-field gel electrophoresis, Southern blotting and hybridization with DNA probes recognizing the single-copy sequences of interest. The Southern blots show a band for the unbroken restriction fragments and a smear below this band due to radiation-induced random breaks. This smear pattern contains two discontinuities in intensity at positions that correspond to the distance of the hybridization site to each end of the restriction fragment. By analyzing the positions of these discontinuities we confirmed the previously mapped position of the probe DXS1327 within a NotI fragment on the X chromosome, thus demonstrating the validity of the technique. We were also able to position the probes D21S1 and D21S15 with respect to the ends of their corresponding NotI fragments on chromosome 21. A third chromosome 21 probe, D21S11, has previously been reported to be close to D21S1, although an uncertainty about a second possible location existed. Since both probes D21S1 and D21S11 hybridized to a single NotI fragment and yielded a similar smear pattern, this uncertainty is removed by the random-breakage mapping method.

  17. RAPID Outcome Mapping Approach Guide now online | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-04-27

RAPID Outcome Mapping Approach Guide now online. April 27, 2016. After a decade of work developing the Outcome Mapping Approach, IDRC has played an important role in the development of ROMA, an adaptation of IDRC's original outcome mapping research. ROMA: A guide to policy ...

  18. Hydrochromic Approaches to Mapping Human Sweat Pores.

    Science.gov (United States)

    Park, Dong-Hoon; Park, Bum Jun; Kim, Jong-Man

    2016-06-21

colorimetric change near body temperature. This feature enables the use of this technique to generate high-quality images of sweat pores. This Account also focuses on the results of the most recent phase of this investigation, which led to the development of a simple yet efficient and reliable technique for sweat pore mapping. The method utilizes a hydrophilic polymer composite film containing fluorescein, a commercially available dye that undergoes a fluorometric response as a result of water-dependent interconversion between its ring-closed spirolactone (nonfluorescent) and ring-opened fluorone (fluorescent) forms. Surface-modified carbon nanodots (CDs) have also been found to be efficient for hydrochromic mapping of human sweat pores. The results reported by Lou et al. [Adv. Mater. 2015, 27, 1389] are also included in this Account. Sweat pore maps obtained from fingertips using these materials were found to be useful for fingerprint analysis. In addition, this hydrochromism-based approach is sufficiently sensitive to enable differentiation between sweat-secreting active pores and inactive pores. As a result, the techniques can be applied to clinical diagnosis of malfunctioning sweat pores. The directions that future research in this area will follow are also discussed.

  19. The Poincaré Map of Randomly Perturbed Periodic Motion

    Science.gov (United States)

    Hitczenko, Pawel; Medvedev, Georgi S.

    2013-10-01

    A system of autonomous differential equations with a stable limit cycle and perturbed by small white noise is analyzed in this work. In the vicinity of the limit cycle of the unperturbed deterministic system, we define, construct, and analyze the Poincaré map of the randomly perturbed periodic motion. We show that the time of the first exit from a small neighborhood of the fixed point of the map, which corresponds to the unperturbed periodic orbit, is well approximated by the geometric distribution. The parameter of the geometric distribution tends to zero together with the noise intensity. Therefore, our result can be interpreted as an estimate of the stability of periodic motion to random perturbations. In addition, we show that the geometric distribution of the first exit times translates into statistical properties of solutions of important differential equation models in applications. To this end, we demonstrate three distinct examples from mathematical neuroscience featuring complex oscillatory patterns characterized by the geometric distribution. We show that in each of these models the statistical properties of emerging oscillations are fully explained by the general properties of randomly perturbed periodic motions identified in this paper.
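    In symbols (our notation): if τ is the number of iterates of the Poincaré map before the first exit from the small neighborhood of the fixed point, the result states that, approximately,

    ```latex
    \mathbb{P}(\tau = n) \;\approx\; (1-p)^{\,n-1} p, \qquad n = 1, 2, \ldots,
    \qquad p = p(\sigma) \to 0 \ \text{as the noise intensity } \sigma \to 0,
    ```

    so the expected number of near-periodic cycles before an excursion grows like 1/p.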

  20. Rapid Land Cover Map Updates Using Change Detection and Robust Random Forest Classifiers

    Directory of Open Access Journals (Sweden)

    Konrad J. Wessels

    2016-10-01

The paper evaluated the Landsat Automated Land Cover Update Mapping (LALCUM) system, designed to rapidly update a land cover map to a desired nominal year using a pre-existing reference land cover map. The system uses Iteratively Reweighted Multivariate Alteration Detection (IRMAD) to identify areas of change and no change. The system then automatically generates large amounts of training samples (n > 1 million) in the no-change areas as input to an optimized Random Forest classifier. Experiments were conducted in the KwaZulu-Natal Province of South Africa using a reference land cover map from 2008, a change mask between 2008 and 2011 and Landsat ETM+ data for 2011. The entire system took 9.5 h to process. We expected that the use of the change mask would improve classification accuracy by reducing the number of mislabeled training data caused by land cover change between 2008 and 2011. However, this was not the case, due to the exceptional robustness of the Random Forest classifier to mislabeled training samples. The system achieved an overall accuracy of 65%–67% using 22 detailed classes and 72%–74% using 12 aggregated national classes. “Water”, “Plantations”, “Plantations—clearfelled”, “Orchards—trees”, “Sugarcane”, “Built-up/dense settlement”, “Cultivation—Irrigated” and “Forest (indigenous)” had user’s accuracies above 70%. Other detailed classes (e.g., “Low density settlements”, “Mines and Quarries”, and “Cultivation, subsistence, drylands”), which are required for operational, provincial-scale land use planning and are usually mapped using manual image interpretation, could not be mapped using Landsat spectral data alone. However, the system was able to map the 12 national classes at a sufficiently high level of accuracy for national-scale land cover monitoring. This update approach and the highly automated, scalable LALCUM system can improve the efficiency and update rate of regional land
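    The core of the update step (sample training pixels where the change mask says the old labels still hold, fit a random forest, reclassify the new image) can be sketched as follows; all arrays are synthetic stand-ins.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Stand-ins: 2011 image bands, 2008 labels, and an IRMAD-style no-change mask.
    bands_2011 = rng.random((200, 200, 6))           # rows x cols x spectral bands
    labels_2008 = rng.integers(0, 12, (200, 200))    # reference land cover map
    no_change = rng.random((200, 200)) > 0.2         # True where no change was detected

    # Harvest training samples only where the 2008 label is still trusted.
    rows, cols = np.nonzero(no_change)
    idx = rng.choice(len(rows), size=20_000, replace=False)
    X_train = bands_2011[rows[idx], cols[idx]]
    y_train = labels_2008[rows[idx], cols[idx]]

    rf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
    rf.fit(X_train, y_train)
    updated_map = rf.predict(bands_2011.reshape(-1, 6)).reshape(200, 200)
    ```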

  1. A tale of two "forests": random forest machine learning aids tropical forest carbon mapping.

    Directory of Open Access Journals (Sweden)

    Joseph Mascaro

Accurate and spatially-explicit maps of tropical forest carbon stocks are needed to implement carbon offset mechanisms such as REDD+ (Reduced Deforestation and Degradation Plus). The Random Forest machine learning algorithm may aid carbon mapping applications using remotely-sensed data. However, Random Forest has never been compared to traditional and potentially more reliable techniques such as regionally stratified sampling and upscaling, and it has rarely been employed with spatial data. Here, we evaluated the performance of Random Forest in upscaling airborne LiDAR (Light Detection and Ranging)-based carbon estimates compared to the stratification approach over a 16-million hectare focal area of the Western Amazon. We considered two runs of Random Forest, both with and without spatial contextual modeling, by including, in the latter case, x and y position directly in the model. In each case, we set aside 8 million hectares (i.e., half of the focal area) for validation; this rigorous test of Random Forest went above and beyond the internal validation normally compiled by the algorithm (i.e., called "out-of-bag"), which proved insufficient for this spatial application. In this heterogeneous region of Northern Peru, the model with spatial context was the best performing run of Random Forest, and explained 59% of LiDAR-based carbon estimates within the validation area, compared to 37% for stratification or 43% by Random Forest without spatial context. With the 60% improvement in explained variation, RMSE against validation LiDAR samples improved from 33 to 26 Mg C ha(-1) when using Random Forest with spatial context. Our results suggest that spatial context should be considered when using Random Forest, and that doing so may result in substantially improved carbon stock modeling for purposes of climate change mitigation.

  2. A tale of two "forests": random forest machine learning aids tropical forest carbon mapping.

    Science.gov (United States)

    Mascaro, Joseph; Asner, Gregory P; Knapp, David E; Kennedy-Bowdoin, Ty; Martin, Roberta E; Anderson, Christopher; Higgins, Mark; Chadwick, K Dana

    2014-01-01

Accurate and spatially-explicit maps of tropical forest carbon stocks are needed to implement carbon offset mechanisms such as REDD+ (Reduced Deforestation and Degradation Plus). The Random Forest machine learning algorithm may aid carbon mapping applications using remotely-sensed data. However, Random Forest has never been compared to traditional and potentially more reliable techniques such as regionally stratified sampling and upscaling, and it has rarely been employed with spatial data. Here, we evaluated the performance of Random Forest in upscaling airborne LiDAR (Light Detection and Ranging)-based carbon estimates compared to the stratification approach over a 16-million hectare focal area of the Western Amazon. We considered two runs of Random Forest, both with and without spatial contextual modeling, by including, in the latter case, x and y position directly in the model. In each case, we set aside 8 million hectares (i.e., half of the focal area) for validation; this rigorous test of Random Forest went above and beyond the internal validation normally compiled by the algorithm (i.e., called "out-of-bag"), which proved insufficient for this spatial application. In this heterogeneous region of Northern Peru, the model with spatial context was the best performing run of Random Forest, and explained 59% of LiDAR-based carbon estimates within the validation area, compared to 37% for stratification or 43% by Random Forest without spatial context. With the 60% improvement in explained variation, RMSE against validation LiDAR samples improved from 33 to 26 Mg C ha(-1) when using Random Forest with spatial context. Our results suggest that spatial context should be considered when using Random Forest, and that doing so may result in substantially improved carbon stock modeling for purposes of climate change mitigation.
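    Operationally, the "spatial context" run amounts to appending the x and y coordinates to the predictor matrix. A minimal sketch with synthetic stand-ins for the LiDAR carbon target; note that the paper's validation held out a spatially contiguous half of the area, which is stricter than the random split used here.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    n = 5000
    xy = rng.uniform(0, 1, (n, 2))                    # plot coordinates
    env = rng.random((n, 5))                          # elevation, geology, etc. (stand-ins)
    carbon = 80 * xy[:, 0] + 30 * env[:, 0] + rng.normal(0, 10, n)  # toy target, Mg C/ha

    half = n // 2                                     # hold out half the samples
    for name, X in [("no context", env), ("with x,y", np.hstack([env, xy]))]:
        rf = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
        rf.fit(X[:half], carbon[:half])
        print(f"{name}: R^2 = {rf.score(X[half:], carbon[half:]):.2f}")
    ```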

  3. A Bayesian Network Approach to Ontology Mapping

    National Research Council Canada - National Science Library

    Pan, Rong; Ding, Zhongli; Yu, Yang; Peng, Yun

    2005-01-01

    This paper presents our ongoing effort on developing a principled methodology for automatic ontology mapping based on BayesOWL, a probabilistic framework we developed for modeling uncertainty in semantic web...

  4. Random Wandering Around Homoclinic-like Manifolds in Symplectic Map Chain

    OpenAIRE

    Goto, Shin-itiro; Nozaki, Kazuhiro; Yamada, Hiroyasu

    2001-01-01

We present a method to construct a symplecticity preserving renormalization group map of a chain of weakly nonlinear symplectic maps and obtain a general reduced symplectic map describing its long-time behaviour. It is found that the modulational instability in the reduced map triggers random wandering of orbits around some homoclinic-like manifolds, which is understood as Bernoulli shifts.

  5. Random Wandering around Homoclinic-Like Manifolds in a Symplectic Map Chain

    OpenAIRE

Goto, Shin-itiro; Nozaki, Kazuhiro; Yamada, Hiroyasu (Department of Physics, Nagoya University)

    2002-01-01

    We present a method to construct a symplecticity preserving renormalization group map of a chain of weakly nonlinear symplectic maps and obtain a general reduced symplectic map describing its long-time behavior. It is found that the modulational instability in the reduced map triggers random wandering of orbits around some homoclinic-like manifolds. This behavior is understood as Bernoulli shifts.

  6. Random matrix approach to categorical data analysis

    Science.gov (United States)

    Patil, Aashay; Santhanam, M. S.

    2015-09-01

Correlation and similarity measures are widely used in all the areas of sciences and social sciences. Often the variables are not numbers but are instead qualitative descriptors called categorical data. We define and study the similarity matrix, as a measure of similarity, for the case of categorical data. This is of interest due to a deluge of categorical data, such as movie ratings, top-10 rankings, and data from social media, in the public domain that require analysis. We show that the statistical properties of the spectra of similarity matrices, constructed from categorical data, follow random matrix predictions, with the dominant eigenvalue being an exception. We demonstrate this approach by applying it to data for Indian general elections and sea level pressures in the North Atlantic ocean.
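    The core check is easy to reproduce on a null model: for independent (structureless) categorical data, the eigenvalues of the standardized similarity matrix should stay inside the Marchenko-Pastur support, and eigenvalues escaping it on real data signal genuine structure. Sizes and encoding below are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 100, 400                       # variables x observations
    q = N / T

    # Null model: independent categorical data, encoded numerically and standardized.
    data = rng.integers(0, 5, (N, T)).astype(float)
    data = (data - data.mean(axis=1, keepdims=True)) / data.std(axis=1, keepdims=True)
    C = data @ data.T / T                 # empirical similarity (correlation) matrix

    evals = np.linalg.eigvalsh(C)
    lo, hi = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2   # Marchenko-Pastur support
    print(f"MP support: [{lo:.2f}, {hi:.2f}]")
    print(f"eigenvalues outside support: {np.sum((evals < lo) | (evals > hi))}")
    ```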

  7. An Optimization Approach to Improving Collections of Shape Maps

    DEFF Research Database (Denmark)

    Nguyen, Andy; Ben‐Chen, Mirela; Welnicka, Katarzyna

    2011-01-01

Finding an informative, structure‐preserving map between two shapes has been a long‐standing problem in geometry processing, involving a variety of solution approaches and applications. However, in many cases, we are given not only two related shapes, but a collection of them, and considering each ..., and no intrinsic mapping algorithm can distinguish between them based on these two shapes alone. Another prominent issue with shape mapping algorithms is their relative sensitivity to how “similar” two shapes are — good maps are much easier to obtain when shapes are very similar. Given the context of additional ... shape maps connecting our collection, we propose to add the constraint of global map consistency, requiring that any composition of maps between two shapes should be independent of the path chosen in the network. This requirement can help us choose among the equally good symmetric alternatives, or help ...

  8. Iterated-map approach to die tossing

    DEFF Research Database (Denmark)

    Feldberg, Rasmus; Szymkat, Maciej; Knudsen, Carsten

    1990-01-01

Nonlinear dissipative mapping is applied to determine the trajectory of a two-dimensional die thrown onto an elastic table. The basins of attraction for different outcomes are obtained and their distribution in the space of initial conditions discussed. The system has certain properties in common ... with chaotic systems. However, a die falls to rest after a finite number of impacts, and therefore the system has a finite sensitivity to the initial conditions. Quantitative measures of this sensitivity are proposed and their variations with the initial momentum and orientation of the die investigated ...

  9. A FISH approach for mapping the human genome using Bacterial Artificial Chromosomes (BACs)

    Energy Technology Data Exchange (ETDEWEB)

Hubert, R.S.; Chen, X.N.; Mitchell, S. [Univ. of Los Angeles, CA (United States)] [and others]

    1994-09-01

    As the Human Genome Project progresses, large insert cloning vectors such as BACs, P1, and P1 Artificial Chromosomes (PACs) will be required to complement the YAC mapping efforts. The value of the BAC vector for physical mapping lies in the stability of the inserts, the lack of chimerism, the length of inserts (up to 300 kb), the ability to obtain large amounts of pure clone DNA and the ease of BAC manipulation. These features helped us design two approaches for generating physical mapping reagents for human genetic studies. The first approach is a whole genome strategy in which randomly selected BACs are mapped, using FISH, to specific chromosomal bands. To date, 700 BACs have been mapped to single chromosome bands at a resolution of 2-5 Mb in addition to BACs mapped to 14 different centromeres. These BACs represent more than 90 Mb of the genome and include >70% of all human chromosome bands at the 350-band level. These data revealed that >97% of the BACs were non-chimeric and have a genomic distribution covering most gaps in the existing YAC map with excellent coverage of gene-rich regions. In the second approach, we used YACs to identify BACs on chromosome 21. A 1.5 Mb contig between D21S339 and D21S220 nears completion within the Down syndrome congenital heart disease (DS-CHD) region. Seventeen BACs ranging in size from 80 kb to 240 kb were ordered using 14 STSs with FISH confirmation. We have also used 40 YACs spanning 21q to identify, on average, >1 BAC/Mb to provide molecular cytogenetic reagents and anchor points for further mapping. The contig generated on chromosome 21 will be helpful in isolating the genes for DS-CHD. The physical mapping reagents generated using the whole genome approach will provide cytogenetic markers and mapped genomic fragments that will facilitate positional cloning efforts and the identification of genes within most chromosomal bands.

  10. An algebraic geometric approach to integrable maps of the plane

    Energy Technology Data Exchange (ETDEWEB)

    Jogia, Danesh [School of Mathematics, University of New South Wales, Sydney, NSW 2052 (Australia); Roberts, John A G [School of Mathematics, University of New South Wales, Sydney, NSW 2052 (Australia); Vivaldi, Franco [School of Mathematical Sciences, Queen Mary, University of London, London E1 4NS (United Kingdom)

    2006-02-03

    We show that the dynamics of a birational map on an elliptic curve over a field is, typically, conjugate to addition by a point (under the associated group law). When the field is taken to be the function field of rational complex functions of one variable, this amounts to an algebraic geometric version of the Arnold-Liouville integrability theorem for planar integrable maps. By-products of this approach are that birational maps preserving foliations are necessarily the composition of two involutions, and that relationships between birational maps preserving the same foliation can be described in terms of the respective points they add on the corresponding Weierstrass curves. When the result is applied to finite fields, it helps explain some universal features of the periodic orbit distribution function for the reductions of integrable maps.

  11. Mapping Smart Regions. An Exploratory Approach

    Directory of Open Access Journals (Sweden)

    Sylvie Occelli

    2014-05-01

The paper presents the results of an exploratory approach aimed at extending the ranking procedures normally used in studying the socioeconomic determinants of smart growth at the regional level. Most of these studies adopt a methodological procedure which essentially consists of the following steps: (a) identification of the pertinent elementary indicators according to the study objectives; (b) data selection and processing; (c) combination of the elementary indicators by multivariate statistical techniques aimed at obtaining a robust synthetic index to rank the observation units. In this procedure, a relational dimension is mainly subsumed in the system-oriented perspective adopted in selecting the indicators which would best represent the system determinants, depending on the goals of the analysis (step a). In order to gain deeper insights into the smartness profile of the European regions, this study makes an effort to account for the relational dimension also in steps (b) and (c) of the procedure. The novelties of the proposed approach are twofold. First, by computing region-to-region distances associated with the selected indicators, it extends the conventional ranking procedure (step c). Second, it uses a relational database (step b), dealing with regional participation in FP7-ICT projects, to modify the distances and investigate its impact on the interpretation of the regional positioning. The main results of this exercise seem to suggest that regional collaborations would have a positive role in the regional convergence process. By providing an opportunity to establish contacts with areas endowed with a comparatively more robust smartness profile, regions may have a chance to enhance their own smartness profile.

  12. Random field estimation approach to robot dynamics

    Science.gov (United States)

    Rodriguez, Guillermo

    1990-01-01

    The difference equations of Kalman filtering and smoothing recursively factor and invert the covariance of the output of a linear state-space system driven by a white-noise process. Here it is shown that similar recursive techniques factor and invert the inertia matrix of a multibody robot system. The random field models are based on the assumption that all of the inertial (D'Alembert) forces in the system are represented by a spatially distributed white-noise model. They are easier to describe than the models based on classical mechanics, which typically require extensive derivation and manipulation of equations of motion for complex mechanical systems. With the spatially random models, more primitive locally specified computations result in a global collective system behavior equivalent to that obtained with deterministic models. The primary goal of applying random field estimation is to provide a concise analytical foundation for solving robot control and motion planning problems.

  13. A Random Matrix Approach to Credit Risk

    Science.gov (United States)

    Guhr, Thomas

    2014-01-01

    We estimate generic statistical properties of a structural credit risk model by considering an ensemble of correlation matrices. This ensemble is set up by Random Matrix Theory. We demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio if the correlations are not identically zero. The existence of correlations alters the tails of the loss distribution considerably, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided. PMID:24853864

  14. A random matrix approach to credit risk.

    Directory of Open Access Journals (Sweden)

    Michael C Münnix

We estimate generic statistical properties of a structural credit risk model by considering an ensemble of correlation matrices. This ensemble is set up by Random Matrix Theory. We demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio if the correlations are not identically zero. The existence of correlations alters the tails of the loss distribution considerably, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided.

  15. Mapping Deforestation in North Korea Using Phenology-Based Multi-Index and Random Forest

    Directory of Open Access Journals (Sweden)

    Yihua Jin

    2016-12-01

Phenology-based multi-index with the random forest (RF) algorithm can be used to overcome the shortcomings of traditional deforestation mapping that involves pixel-based classification, such as ISODATA or decision trees, and single images. The purpose of this study was to investigate methods to identify specific types of deforestation in North Korea, and to increase the accuracy of classification, using phenological characteristics extracted with multi-index and random forest algorithms. The mapping of deforestation area based on RF was carried out by merging phenology-based multi-indices (i.e., normalized difference vegetation index (NDVI), normalized difference water index (NDWI), and normalized difference soil index (NDSI)) derived from MODIS (Moderate Resolution Imaging Spectroradiometer) products and topographical variables. Our results showed an overall classification accuracy of 89.38%, with a corresponding kappa coefficient of 0.87. In particular, for forest and farmland categories with similar phenological characteristics (e.g., paddy, plateau vegetation, unstocked forest, hillside field), this approach improved the classification accuracy in comparison with pixel-based methods and other classes. The deforestation types were identified by incorporating point data from high-resolution imagery, outcomes of image classification, and slope data. Our study demonstrated that the proposed methodology could be used for deciding on the restoration priority and monitoring the expansion of deforestation areas.

  16. A random approach to the Lebesgue integral

    Science.gov (United States)

    Grahl, Jack

    2008-04-01

    We construct an integral of a measurable real function using randomly chosen Riemann sums and show that it converges in probability to the Lebesgue integral where this exists. We then prove some conditions for the almost sure convergence of this integral.
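    The construction is easy to try numerically: partition [0, 1] into n subintervals, pick one uniformly random tag point in each, and form the usual Riemann sum. For an illustrative integrand (our choice, ∫x² dx = 1/3) the sums settle on the Lebesgue integral:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def random_riemann_sum(f, n):
        """Riemann sum over n equal subintervals of [0, 1] with random tags."""
        edges = np.linspace(0.0, 1.0, n + 1)
        tags = rng.uniform(edges[:-1], edges[1:])   # one random point per subinterval
        return np.sum(f(tags)) / n

    f = lambda x: x ** 2
    for n in (10, 100, 10_000):
        print(n, random_riemann_sum(f, n))          # -> 1/3 in probability
    ```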

  17. Land cover mapping based on random forest classification of multitemporal spectral and thermal images.

    Science.gov (United States)

    Eisavi, Vahid; Homayouni, Saeid; Yazdi, Ahmad Maleknezhad; Alimohammadi, Abbas

    2015-05-01

Thematic mapping of complex landscapes, with various phenological patterns, from satellite imagery is a particularly challenging task. However, supplementary information, such as multitemporal data and/or land surface temperature (LST), has the potential to improve the land cover classification accuracy and efficiency. In this paper, in order to map land covers, we evaluated the potential of multitemporal Landsat 8 spectral and thermal imagery using a random forest (RF) classifier. We used a grid search approach based on the out-of-bag (OOB) estimate of error to optimize the RF parameters. Four different scenarios were considered in this research: (1) RF classification of multitemporal spectral images, (2) RF classification of multitemporal LST images, (3) RF classification of all multitemporal LST and spectral images, and (4) RF classification of selected important or optimum features. The study area in this research was Naghadeh city and its surrounding region, located in West Azerbaijan Province, northwest of Iran. The overall accuracies of the first, second, third, and fourth scenarios were 86.48, 82.26, 90.63, and 91.82%, respectively. The quantitative assessments of the results demonstrated that the most important or optimum features increase the class separability, while the spectral and thermal features produced a more moderate increase in the land cover mapping accuracy. In addition, the contribution of the multitemporal thermal information led to a considerable increase in the user and producer accuracies of classes with rapid temporal change behavior, such as crops and vegetation.
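    scikit-learn exposes the out-of-bag estimate directly, so an OOB-driven grid search of the kind described above can be sketched as follows (synthetic data, arbitrary parameter grid):

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=3000, n_features=20, n_classes=4,
                               n_informative=8, random_state=0)

    best = None
    for n_trees in (100, 300):
        for mtry in (2, 4, 8):                    # max_features, the usual RF tuning knob
            rf = RandomForestClassifier(n_estimators=n_trees, max_features=mtry,
                                        oob_score=True, random_state=0, n_jobs=-1)
            rf.fit(X, y)
            oob_error = 1.0 - rf.oob_score_       # out-of-bag estimate of error
            if best is None or oob_error < best[0]:
                best = (oob_error, n_trees, mtry)

    print(f"OOB error {best[0]:.3f} at n_estimators={best[1]}, max_features={best[2]}")
    ```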

  18. Complexity and properties of a multidimensional Cat-Hadamard map for pseudo random number generation

    Science.gov (United States)

    Kim Hue, Ta Thi; Hoang, Thang Manh

    2017-07-01

This paper presents a novel method to extend the Cat map from two dimensions to higher dimensions using the fast pseudo Hadamard transform; the resulting maps are called Cat-Hadamard maps. The complexity and properties of Cat-Hadamard maps are investigated from the point of view of cryptographic applications. In addition, we propose a method for constructing a pseudo-random number generator using a novel design concept of the high-dimensional Cat map. The simulation results show that the proposed generator passed all the statistical tests of NIST SP 800-90A.

  19. A linear programming approach for optimal contrast-tone mapping.

    Science.gov (United States)

    Wu, Xiaolin

    2011-05-01

This paper proposes a novel algorithmic approach to image enhancement via optimal contrast-tone mapping. In a fundamental departure from the current practice of histogram equalization for contrast enhancement, the proposed approach maximizes expected contrast gain subject to an upper limit on tone distortion and, optionally, to other constraints that suppress artifacts. The underlying contrast-tone optimization problem can be solved efficiently by linear programming. This new constrained optimization approach to image enhancement is general, and the user can add and fine-tune the constraints to achieve desired visual effects. Experimental results demonstrate clearly superior performance of the new approach over histogram equalization and its variants.
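    A sketch of the optimization in scipy, under one common reading of the constraints: maximize histogram-weighted contrast gain subject to the output dynamic range, with a windowed constraint standing in for the paper's tone-distortion limit. The histogram and window size are invented.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    hist = rng.random(256); p = hist / hist.sum()   # stand-in image histogram

    L_out, d = 255, 8                    # output range; tone-distortion window (assumed form)
    # Variables s_j >= 0: output increment between input levels j-1 and j.
    # Maximize expected contrast gain  sum_j p_j s_j  ==  minimize  -p @ s.
    A, b = [np.ones(256)], [L_out]       # total increments fit the output range
    for j in range(256 - d + 1):         # any d consecutive levels keep >= 1 output level
        row = np.zeros(256); row[j:j + d] = -1.0
        A.append(row); b.append(-1.0)

    res = linprog(-p, A_ub=np.array(A), b_ub=np.array(b), bounds=(0, None), method="highs")
    transfer = np.cumsum(res.x)          # the optimal contrast-tone mapping curve
    print(res.status, transfer[-1])      # 0 == optimal; curve ends at <= 255
    ```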

  20. DYNAMIC STRAIN MAPPING AND REAL-TIME DAMAGE STATE ESTIMATION UNDER BIAXIAL RANDOM FATIGUE LOADING

    Data.gov (United States)

National Aeronautics and Space Administration — Dynamic Strain Mapping and Real-Time Damage State Estimation under Biaxial Random Fatigue Loading. Subhasish Mohanty, Aditi Chattopadhyay, John N. Rajadas, and Clyde...

  1. Effect of concept mapping approach on students' achievement in ...

    African Journals Online (AJOL)

The study was carried out to determine the effect of the concept mapping approach on students' achievement in Mathematics in secondary schools in NgorOkpala Local Government Area of Imo State. Based on the objective of the study, three hypotheses guided the study. The quasi-experimental research design was used in ...

  2. An automated approach to map a French terminology to UMLS.

    Science.gov (United States)

    Merabti, Tayeb; Massari, Philipe; Joubert, Michel; Sadou, Eric; Lecroq, Thierry; Abdoune, Hocine; Rodrigues, Jean-Marie; Darmoni, Stefan J

    2010-01-01

CCAM is a French terminology for coding clinical procedures. CCAM is a multi-hierarchical structured classification for procedures, used in France for reimbursement in health care, which is external to UMLS. The objective of this work is to describe a French lexical approach for mapping CCAM procedures to the UMLS Metathesaurus, to achieve interoperability with multiple international terminologies. This approach used a preliminary step intended to take only the significant characters used to code CCAM, corresponding to the anatomy and action axes. Of the 7,926 CCAM codes used in this study, 5,212 possible matches (exact matching, single-to-multiple matching, partial matching) were found using the French CCAM-to-UMLS mapping; 65% of the corresponding anatomical terms in the CCAM codes were mapped to at least one UMLS Concept, and 37% of the corresponding action terms were mapped to at least one UMLS Concept. Of all the exact matches found (n=200), 91% were rated by a human expert as narrower than the mapped UMLS Concepts, while only 3% were irrelevant.

  3. An automated approach to mapping external terminologies to the UMLS.

    Science.gov (United States)

    Taboada, María; Lalín, Rosario; Martínez, Diego

    2009-06-01

Nowadays, providing interoperability between different biomedical terminologies is a critical issue for efficient information sharing. One problem making interoperability difficult is the lack of automated methods simplifying the mapping process. In this study, we propose an automated approach to mapping external terminologies to the Unified Medical Language System (UMLS). Our approach applies a sequential combination of two basic matching methods classically used in ontology matching. First, a lexical technique identifies similar strings between the external terminology and the UMLS. Second, a structure-based technique validates, in part, the lexical alignment by computing paths to top-level concepts and checking the compatibility of these top-level concepts across the external terminology and the UMLS. The method was applied to the mapping of the large-scale biomedical thesaurus EMTREE to the complete UMLS Metathesaurus. In total, 47.9% coverage of EMTREE terms was reached, leading to 80% coverage of EMTREE concepts. Our method revealed high compatibility in 6 out of 15 top-level categories across terminologies. The validation of lexical mappings covers 75.8% of the total lexical alignment. Overall, the method rules out a total of 6927 (7.9%) lexical mappings, with a global precision of 78%.
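    A toy rendering of the two-step idea (not the authors' implementation; the terminologies, threshold, and category table are invented): a lexical pass proposes candidate pairs, and a structural pass rules out pairs whose top-level categories are incompatible.

    ```python
    from difflib import SequenceMatcher

    # Toy fragments of an external terminology and the UMLS (top-level ancestor given).
    external = {"myocardial infarction": "disorders", "heart": "anatomy"}
    umls     = {"myocardial infarction": "Disorders", "heart": "Anatomy",
                "hearth": "Objects"}  # a near-string with an incompatible category

    compatible = {("disorders", "Disorders"), ("anatomy", "Anatomy")}

    for term, cat in external.items():
        for concept, sem in umls.items():
            if SequenceMatcher(None, term, concept).ratio() > 0.9:   # lexical step
                ok = (cat, sem) in compatible                        # structural check
                print(f"{term!r} -> {concept!r}: {'kept' if ok else 'ruled out'}")
    ```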

  4. Mapping Soil Properties of Africa at 250 m Resolution: Random Forests Significantly Improve Current Predictions

    Science.gov (United States)

    Hengl, Tomislav; Heuvelink, Gerard B. M.; Kempen, Bas; Leenaars, Johan G. B.; Walsh, Markus G.; Shepherd, Keith D.; Sila, Andrew; MacMillan, Robert A.; Mendes de Jesus, Jorge; Tamene, Lulseged; Tondoh, Jérôme E.

    2015-01-01

    80% of arable land in Africa has low soil fertility and suffers from physical soil problems. Additionally, significant amounts of nutrients are lost every year due to unsustainable soil management practices. This is partially the result of insufficient use of soil management knowledge. To help bridge the soil information gap in Africa, the Africa Soil Information Service (AfSIS) project was established in 2008. Over the period 2008–2014, the AfSIS project compiled two point data sets: the Africa Soil Profiles (legacy) database and the AfSIS Sentinel Site database. These data sets contain over 28 thousand sampling locations and represent the most comprehensive soil sample data sets of the African continent to date. Utilizing these point data sets in combination with a large number of covariates, we have generated a series of spatial predictions of soil properties relevant to the agricultural management—organic carbon, pH, sand, silt and clay fractions, bulk density, cation-exchange capacity, total nitrogen, exchangeable acidity, Al content and exchangeable bases (Ca, K, Mg, Na). We specifically investigate differences between two predictive approaches: random forests and linear regression. Results of 5-fold cross-validation demonstrate that the random forests algorithm consistently outperforms the linear regression algorithm, with average decreases of 15–75% in Root Mean Squared Error (RMSE) across soil properties and depths. Fitting and running random forests models takes an order of magnitude more time and the modelling success is sensitive to artifacts in the input data, but as long as quality-controlled point data are provided, an increase in soil mapping accuracy can be expected. Results also indicate that globally predicted soil classes (USDA Soil Taxonomy, especially Alfisols and Mollisols) help improve continental scale soil property mapping, and are among the most important predictors. This indicates a promising potential for transferring pedological
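    The comparison protocol (though not, of course, the AfSIS data or its result) is easy to reproduce. A minimal sketch on synthetic data with a deliberately non-linear target, where the random forest should come out ahead:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import KFold, cross_val_score

    rng = np.random.default_rng(0)
    X = rng.uniform(-2, 2, (2000, 6))                     # covariate stand-ins
    y = 50*np.sin(3*X[:, 0]) + 30*X[:, 1]**2 + 10*X[:, 2] + rng.normal(0, 5, 2000)

    cv = KFold(n_splits=5, shuffle=True, random_state=0)
    for name, model in [("linear regression", LinearRegression()),
                        ("random forest", RandomForestRegressor(n_estimators=200,
                                                                random_state=0, n_jobs=-1))]:
        rmse = -cross_val_score(model, X, y, cv=cv,
                                scoring="neg_root_mean_squared_error").mean()
        print(f"{name}: 5-fold RMSE = {rmse:.1f}")
    ```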

  5. A Randomization Approach for Stochastic Workflow Scheduling in Clouds

    Directory of Open Access Journals (Sweden)

    Wei Zheng

    2016-01-01

In cloud systems consisting of heterogeneous distributed resources, scheduling plays a key role in obtaining good performance when complex applications are run. However, there is unavoidable error in predicting individual task execution times and data transmission times. When this error is not negligible, deterministic scheduling approaches (i.e., scheduling based on accurate time predictions) may suffer. In this paper, we assume the error in time predictions is modelled in a stochastic manner, and a novel randomization approach making use of the properties of random variables is proposed to improve deterministic scheduling. The randomization approach is applied to a classic deterministic scheduling heuristic, but its applicability is not limited to this one heuristic. Evaluation results obtained from extensive simulation show that the randomized scheduling approach can significantly outperform its static counterpart, and that the extra overhead introduced is both controllable and acceptable.
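    As a toy illustration of why treating predicted times as random variables changes scheduling decisions (this is our example, not the paper's heuristic): a resource with a slightly worse mean runtime but much lower variance can be almost as likely to finish first, which a deterministic scheduler cannot see.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Predicted runtimes of one task on two resources: mean and prediction-error std.
    mu = np.array([100.0, 104.0])
    sigma = np.array([30.0, 5.0])

    # Deterministic scheduling trusts the means and always picks resource 0.
    det_choice = int(np.argmin(mu))

    # Using the random variables: P(T0 < T1) under independent normal errors.
    p0_faster = norm.cdf((mu[1] - mu[0]) / np.hypot(sigma[0], sigma[1]))
    print(f"deterministic -> resource {det_choice}; P(resource 0 faster) = {p0_faster:.2f}")
    # A randomized scheduler can hedge by sampling the assignment with this probability,
    # or prefer the low-variance resource when the means are close.
    ```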

  6. Continuous Mapping of Soil pH Using Digital Soil Mapping Approach in Europe

    Directory of Open Access Journals (Sweden)

    Ciro Gardi

    2012-07-01

Soil pH is one of the most important chemical parameters of soil, playing an essential role in agricultural production and in the distribution of plants and soil biota communities. It is an expression of soil genesis, which in turn is a function of the soil-forming factors, and it influences all the chemical, physical and biological processes that occur in the soil; thus it shapes the entire soil ecosystem. For all of the above reasons, mapping soil pH is very important for providing harmonised soil pH data to policy makers, public bodies and researchers. In order to obtain a continuous map of soil pH for Europe using the digital soil mapping approach, a set of continuously distributed covariates, highly correlated with pH, was selected. The estimation of soil pH was realized using a regression procedure coupled with kriging of the residuals. More than 30,000 points on topsoil pH (CaCl2) were used, and 27 covariates were tested as predictors. A similar approach was applied in 2008 with 12,333 samples to produce a pH map of Europe using European Soil Profile Data, which compiles several databases from 11 different sources (Reuter et al. 2008). Our study was conducted to update the previous data and maps based on LUCAS (the EUROSTAT Land Use/Cover Area frame statistical Survey), BIOSOIL (Hiederer and Durrant, 2010), and the merged database that was used to produce the previous soil pH map of Europe (Reuter et al. 2008). We used a compilation of more than 30,000 soil pH measurements from 13 different sources to create a continuous map of soil pH across Europe using a geostatistical approach based on regression-kriging. The regression was based on 27 covariates in the form of raster maps at 1 km resolution, used to explain the differences in the distribution of soil pH in CaCl2, and we added the kriged map of the residuals from the regression model.
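    A minimal regression-kriging sketch, using scikit-learn's Gaussian process regressor as a stand-in for the kriging step (GP regression with a stationary covariance is the same interpolator as simple kriging); the coordinates, covariates, and toy pH surface are synthetic.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n = 400
    coords = rng.uniform(0, 100, (n, 2))                  # sample locations (km)
    covars = rng.random((n, 3))                           # climate/land-cover stand-ins
    pH = 5.5 + 1.2*covars[:, 0] + 0.1*np.sin(coords[:, 0]/10) + rng.normal(0, 0.1, n)

    trend = LinearRegression().fit(covars, pH)            # step 1: regression on covariates
    resid = pH - trend.predict(covars)

    gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(0.01))
    gp.fit(coords, resid)                                 # step 2: "krige" the residuals

    new_xy, new_cov = np.array([[50.0, 50.0]]), np.array([[0.5, 0.5, 0.5]])
    print(trend.predict(new_cov) + gp.predict(new_xy))    # regression-kriging prediction
    ```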

  7. A Heuristic Approach to Global Landslide Susceptibility Mapping

    Science.gov (United States)

    Stanley, Thomas; Kirschbaum, Dalia B.

    2017-01-01

    Landslides can have significant and pervasive impacts on life and property around the world. Several attempts have been made to predict the geographic distribution of landslide activity at continental and global scales. These efforts shared common traits such as resolution, modeling approach, and explanatory variables. The lessons learned from prior research have been applied to build a new global susceptibility map from existing and previously unavailable data. Data on slope, faults, geology, forest loss, and road networks were combined using a heuristic fuzzy approach. The map was evaluated with a Global Landslide Catalog developed at the National Aeronautics and Space Administration, as well as several local landslide inventories. Comparisons to similar susceptibility maps suggest that the subjective methods commonly used at this scale are, for the most part, reproducible. However, comparisons of landslide susceptibility across spatial scales must take into account the susceptibility of the local subset relative to the larger study area. The new global landslide susceptibility map is intended for use in disaster planning, situational awareness, and incorporation into global decision support systems.
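
    To illustrate what a heuristic fuzzy combination can look like in practice, here is a toy sketch using a fuzzy gamma overlay; the input layers, membership functions and gamma value are illustrative assumptions, not the authors' calibration.

```python
# Toy fuzzy gamma overlay of susceptibility layers (all inputs synthetic).
import numpy as np

rng = np.random.default_rng(1)
slope = rng.uniform(0, 60, (100, 100))        # degrees
dist_fault = rng.uniform(0, 5000, (100, 100)) # metres
forest_loss = rng.uniform(0, 1, (100, 100))   # fraction lost

# fuzzy memberships in [0, 1]: steeper, closer to faults, more loss -> higher
m_slope = np.clip(slope / 45.0, 0, 1)
m_fault = np.clip(1 - dist_fault / 3000.0, 0, 1)
m_loss = forest_loss

# fuzzy gamma operator: compromise between fuzzy product and fuzzy sum
layers = np.stack([m_slope, m_fault, m_loss])
prod = layers.prod(axis=0)
fsum = 1 - (1 - layers).prod(axis=0)
gamma = 0.7
susceptibility = fsum ** gamma * prod ** (1 - gamma)
print("range:", susceptibility.min(), susceptibility.max())
```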

  8. Diffusion MRI noise mapping using random matrix theory

    Science.gov (United States)

    Veraart, Jelle; Fieremans, Els; Novikov, Dmitry S.

    2016-01-01

    Purpose: To estimate the spatially varying noise map using a redundant magnitude MR series. Methods: We exploit redundancy in non-Gaussian multi-directional diffusion MRI data by identifying its noise-only principal components, based on the theory of noisy covariance matrices. The bulk of PCA eigenvalues, arising due to noise, is described by the universal Marchenko-Pastur distribution, parameterized by the noise level. This allows us to estimate the noise level in a local neighborhood based on the singular value decomposition of a matrix combining neighborhood voxels and diffusion directions. Results: We present a model-independent local noise mapping method capable of estimating the noise level down to about 1% error. In contrast to current state-of-the-art techniques, the resultant noise maps do not show artifactual anatomical features that often reflect physiological noise, the presence of sharp edges, or a lack of adequate a priori knowledge of the expected form of the MR signal. Conclusions: Simulations and experiments show that typical diffusion MRI data exhibit sufficient redundancy to enable accurate, precise, and robust estimation of the local noise level by interpreting the PCA eigenspectrum in terms of the Marchenko-Pastur distribution. PMID:26599599
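
    A condensed sketch of the core estimator on a single local patch, assuming the simplest form of the Marchenko-Pastur bulk criterion (the published algorithm includes refinements omitted here):

```python
# Simplified Marchenko-Pastur noise estimation on one local data matrix.
import numpy as np

def mp_noise_level(X):
    """X: (N voxels, M volumes) magnitude data from a local neighbourhood."""
    N, M = X.shape
    lam = np.sort(np.linalg.svd(X / np.sqrt(N), compute_uv=False) ** 2)[::-1]
    for p in range(M - 1):            # p = number of signal components
        bulk = lam[p:]
        sigma2 = bulk.mean()          # noise variance if the bulk is pure noise
        # MP bulk width for an (N x (M - p)) pure-noise matrix
        width = 4 * sigma2 * np.sqrt((M - p) / N)
        if bulk[0] - bulk[-1] < width:
            return np.sqrt(sigma2), p
    return np.sqrt(lam.mean()), M - 1

rng = np.random.default_rng(2)
signal = np.outer(rng.normal(size=500), rng.normal(size=30))  # rank-1 signal
noisy = signal + 0.5 * rng.normal(size=(500, 30))
sigma, rank = mp_noise_level(noisy)
print(f"sigma ~ {sigma:.3f} (true 0.5), retained components: {rank}")
```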

  9. Comparison of complementary and alternative medicine with conventional mind-body therapies for chronic back pain: protocol for the Mind-body Approaches to Pain (MAP) randomized controlled trial.

    Science.gov (United States)

    Cherkin, Daniel C; Sherman, Karen J; Balderson, Benjamin H; Turner, Judith A; Cook, Andrea J; Stoelb, Brenda; Herman, Patricia M; Deyo, Richard A; Hawkes, Rene J

    2014-06-07

    The self-reported health and functional status of persons with back pain in the United States have declined in recent years, despite greatly increased medical expenditures due to this problem. Although patient psychosocial factors such as pain-related beliefs, thoughts and coping behaviors have been demonstrated to affect how well patients respond to treatments for back pain, few patients receive treatments that address these factors. Cognitive-behavioral therapy (CBT), which addresses psychosocial factors, has been found to be effective for back pain, but access to qualified therapists is limited. Another treatment option with potential for addressing psychosocial issues, mindfulness-based stress reduction (MBSR), is increasingly available. MBSR has been found to be helpful for various mental and physical conditions, but it has not been well-studied for application with chronic back pain patients. In this trial, we will seek to determine whether MBSR is an effective and cost-effective treatment option for persons with chronic back pain, compare its effectiveness and cost-effectiveness with those of CBT, and explore the psychosocial variables that may mediate the effects of MBSR and CBT on patient outcomes. In this trial, we will randomize 397 adults with nonspecific chronic back pain to CBT, MBSR or usual care arms (99 per group). Both interventions will consist of eight weekly 2-hour group sessions supplemented by home practice. The MBSR protocol also includes an optional 6-hour retreat. Interviewers masked to treatment assignments will assess outcomes 5, 10, 26 and 52 weeks postrandomization. The primary outcomes will be pain-related functional limitations (based on the Roland Disability Questionnaire) and symptom bothersomeness (rated on a 0 to 10 numerical rating scale) at 26 weeks. If MBSR is found to be an effective and cost-effective treatment option for patients with chronic back pain, it will become a valuable addition to the limited treatment options

  10. Effects of Cooperative Concept Mapping Teaching Approach on Secondary School Students' Motivation in Biology in Gucha District, Kenya

    Science.gov (United States)

    Keraro, Fred Nyabuti; Wachanga, Samuel W.; Orora, William

    2007-01-01

    This study investigated the effects of using the cooperative concept mapping (CCM) teaching approach on secondary school students' motivation in biology. A nonequivalent control group design within quasi-experimental research was used, with a random sample of four co-educational secondary schools. The four schools were randomly…

  11. Physico-empirical approach for mapping soil hydraulic behaviour

    Directory of Open Access Journals (Sweden)

    G. D'Urso

    1997-01-01

    Full Text Available Pedo-transfer functions are largely used in the soil hydraulic characterisation of large areas. The use of physico-empirical approaches for deriving soil hydraulic parameters from disturbed sample data can be greatly enhanced if a characterisation performed on undisturbed cores of the same type of soil is available. In this study, an experimental procedure for deriving maps of soil hydraulic behaviour is discussed with reference to its application in an irrigation district (30 km2) in southern Italy. The main steps of the proposed procedure are: (i) the precise identification of soil hydraulic functions from undisturbed sampling of main horizons in representative profiles for each soil map unit; (ii) the determination of pore-size distribution curves from larger disturbed sampling data sets within the same soil map unit; (iii) the calibration of physico-empirical methods for retrieving soil hydraulic parameters from particle-size data and undisturbed soil sample analysis; (iv) the definition of functional hydraulic properties from water balance output; and (v) the delimitation of soil hydraulic map units based on functional properties.

  12. Development of erosion risk map using fuzzy logic approach

    Directory of Open Access Journals (Sweden)

    Fauzi Manyuk

    2017-01-01

    Full Text Available Erosion-hazard assessment is an important aspect of the management of a river basin such as the Siak River Basin, Riau Province, Indonesia. This study presents an application of a fuzzy logic approach to develop an erosion risk map based on a geographic information system. Fuzzy logic is a computing approach based on “degrees of truth” rather than the usual “true or false” (1 or 0) Boolean logic on which the modern computer is based. The results of the erosion risk map were verified using field measurements. The verification shows that the soil-erodibility parameter (K) is in good agreement with field measurement data. The classifications of soil erodibility (K) resulting from validation were: very low (0.0-0.1), medium (0.21-0.32), high (0.44-0.55) and very high (0.56-0.64). The results obtained from this study show that the erosion risk map of the Siak River Basin was dominantly classified at the medium level, covering about 68.54% of the basin; the high and very low erosion levels cover about 28.84% and 2.61%, respectively.

  13. Changing energy-related behavior: An Intervention Mapping approach

    Energy Technology Data Exchange (ETDEWEB)

    Kok, Gerjo, E-mail: g.kok@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Lo, Siu Hing, E-mail: siu-hing.lo@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Peters, Gjalt-Jorn Y., E-mail: gj.peters@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Ruiter, Robert A.C., E-mail: r.ruiter@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands)

    2011-09-15

    This paper's objective is to apply Intervention Mapping, a planning process for the systematic development of theory- and evidence-based health promotion interventions, to the development of interventions that promote energy conservation behavior. Intervention Mapping (IM) consists of six steps: needs assessment, program objectives, methods and applications, program development, planning for program implementation, and planning for program evaluation. Examples from the energy conservation field are provided to illustrate the activities associated with these steps. It is concluded that applying IM in the energy conservation field may help the development of effective behavior change interventions, and thus build a domain-specific knowledge base for effective intervention design. Highlights: Intervention Mapping (IM) is a planning process for developing evidence-based interventions. IM takes a problem-driven rather than theory-driven approach. IM can be applied to the promotion of energy conservation in a multilevel approach. IM helps identify determinants of behaviors and environmental conditions. IM helps select appropriate theory-based methods and practical applications.

  14. Sparsity constrained deconvolution approaches for acoustic source mapping.

    Science.gov (United States)

    Yardibi, Tarik; Li, Jian; Stoica, Petre; Cattafesta, Louis N

    2008-05-01

    Using microphone arrays for estimating source locations and strengths has become common practice in aeroacoustic applications. The classical delay-and-sum approach suffers from low resolution and high sidelobes, and the resulting beamforming maps are difficult to interpret. The deconvolution approach for the mapping of acoustic sources (DAMAS) recovers the actual source levels from the contaminated delay-and-sum results by defining an inverse problem that can be represented as a linear system of equations. In this paper, the deconvolution problem is carried into the sparse signal representation domain, and a sparsity constrained deconvolution approach (SC-DAMAS) is presented for solving the DAMAS inverse problem. A sparsity preserving covariance matrix fitting approach (CMF) is also presented to overcome the drawbacks of the DAMAS inverse problem. The proposed algorithms are convex optimization problems. Our simulations show that CMF and SC-DAMAS outperform DAMAS, and that as the noise in the measurements increases, CMF works better than both DAMAS and SC-DAMAS. It is observed that the proposed algorithms converge faster than DAMAS. A modification to SC-DAMAS is also provided which makes it significantly faster than DAMAS and CMF. For the correlated source case, the CMF-C algorithm is proposed and compared with DAMAS-C. Improvements in performance are obtained similar to the uncorrelated case.
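
    The flavor of a sparsity-constrained deconvolution can be shown in a few lines: projected ISTA for min ||Ax - b||^2 + lambda*||x||_1 with x >= 0, on synthetic data. This is a generic solver in the same spirit, not the SC-DAMAS implementation.

```python
# Nonnegative, l1-regularized deconvolution via projected ISTA (toy data).
import numpy as np

rng = np.random.default_rng(3)
n_grid, n_meas = 200, 64
A = np.abs(rng.normal(size=(n_meas, n_grid)))      # stand-in propagation matrix
x_true = np.zeros(n_grid)
x_true[[20, 90, 150]] = [3.0, 1.5, 2.0]            # three sparse sources
b = A @ x_true + 0.01 * rng.normal(size=n_meas)    # noisy beamforming data

step = 1.0 / np.linalg.norm(A, 2) ** 2             # 1/L, L = Lipschitz constant
lam = 0.05
x = np.zeros(n_grid)
for _ in range(2000):
    grad = A.T @ (A @ x - b)
    x = np.maximum(x - step * (grad + lam), 0.0)   # soft-threshold + nonnegativity

print("recovered support:", np.nonzero(x > 0.1)[0])
```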

  15. Constructing Binomial Trees Via Random Maps for Analysis of Financial Assets

    Directory of Open Access Journals (Sweden)

    Antonio Airton Carneiro de Freitas

    2010-04-01

    Full Text Available Random maps can be constructed from a priori knowledge of the financial assets. The reverse problem is also addressed: from an empirical stationary probability density function, we set up a random map that naturally leads to an implied binomial tree, allowing the adjustment of models, including the ability to incorporate jumps. An application related to the options market is presented. It is emphasized that the model's ability to incorporate a priori knowledge of the financial asset may be affected, for example, by the skewed vision of the analyst.
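
    For readers unfamiliar with the target object, a standard Cox-Ross-Rubinstein binomial tree for a European call is sketched below. This is only the conventional baseline that implied trees generalize, not the paper's random-map construction.

```python
# Standard CRR binomial tree for a European call (baseline illustration).
import math

def crr_call(S0, K, r, sigma, T, n):
    """Price a European call on a Cox-Ross-Rubinstein binomial tree."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1 / u                             # down factor
    q = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * dt)
    # terminal payoffs, then backward induction through the tree
    values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    for step in range(n, 0, -1):
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(step)]
    return values[0]

print(round(crr_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=200), 4))
```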

  16. The Effects of Reciprocal Teaching and Direct Instruction Approaches on Knowledge Map (k-map) Generation Skill

    Science.gov (United States)

    Gorgen, Izzet

    2014-01-01

    The primary purpose of the present study is to investigate whether the reciprocal teaching approach or the direct instruction approach is more effective in the teaching of k-map generation skill. The secondary purpose of the study is to determine which of the k-map generation principles are more challenging for students to apply. The results of the study…

  17. Turkers in Africa: A Crowdsourcing Approach to Improving Agricultural Landcover Maps

    Science.gov (United States)

    Estes, L. D.; Caylor, K. K.; Choi, J.

    2012-12-01

    In the coming decades a substantial portion of Africa is expected to be transformed to agriculture. The scale of this conversion may match or exceed that which occurred in the Brazilian Cerrado and Argentinian Pampa in recent years. Tracking the rate and extent of this conversion will depend on having an accurate baseline of the current extent of croplands. Continent-wide baseline data do exist, but the accuracy of these relatively coarse resolution, remotely sensed assessments is suspect in many regions. To develop more accurate maps of the distribution and nature of African croplands, we develop a distributed "crowdsourcing" approach that harnesses human eyeballs and image interpretation capabilities. Our initial goal is to assess the accuracy of existing agricultural land cover maps, but ultimately we aim to generate "wall-to-wall" cropland maps that can be revisited and updated to track agricultural transformation. Our approach utilizes the freely available, high-resolution satellite imagery provided by Google Earth, combined with Amazon.com's Mechanical Turk platform, an online service that provides a large, global pool of workers (known as "Turkers") who perform "Human Intelligence Tasks" (HITs) for a fee. Using open-source R and python software, we select a random sample of 1 km2 cells from a grid placed over our study area, stratified by field density classes drawn from one of the coarse-scale land cover maps, and send these in batches to Mechanical Turk for processing. Each Turker is required to complete an initial training session, on the basis of which they are assigned an accuracy score that determines whether the Turker is allowed to proceed with mapping tasks. Completed mapping tasks are automatically retrieved and processed on our server, and subjected to two further quality control measures. The first of these is a measure of the spatial accuracy of Turker-mapped areas compared to "gold standard" maps from selected locations that are randomly
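
    The stratified sampling step reads naturally as a few lines of pandas; the class raster, class proportions and per-stratum sample size below are invented for illustration.

```python
# Stratified random sample of 1-km2 grid cells by field-density class.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
cells = pd.DataFrame({
    "cell_id": np.arange(100_000),
    "density_class": rng.choice(["none", "low", "medium", "high"],
                                size=100_000, p=[0.55, 0.25, 0.15, 0.05]),
})
n_per_class = 50   # cells sent to Mechanical Turk per stratum, per batch

batch = (cells.groupby("density_class", group_keys=False)
              .apply(lambda g: g.sample(min(len(g), n_per_class),
                                        random_state=4)))
print(batch["density_class"].value_counts())
```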

  18. Agricultural cropland mapping using black-and-white aerial photography, Object-Based Image Analysis and Random Forests

    Science.gov (United States)

    Vogels, M. F. A.; de Jong, S. M.; Sterk, G.; Addink, E. A.

    2017-02-01

    Land-use and land-cover (LULC) conversions have an important impact on land degradation, erosion and water availability. Information on historical land cover (change) is crucial for studying and modelling land- and ecosystem degradation. During the past decades major LULC conversions occurred in Africa, Southeast Asia and South America as a consequence of a growing population and economy. Most distinct is the conversion of natural vegetation into cropland. Historical LULC information can be derived from satellite imagery, but such imagery only dates back to approximately 1972. Before the emergence of satellite imagery, landscapes were monitored by black-and-white (B&W) aerial photography. This photography is usually interpreted visually, which is a very time-consuming approach. This study presents an innovative, semi-automated method to map cropland acreage from B&W photography. Cropland acreage was mapped at two study sites, in Ethiopia and in The Netherlands. For this purpose we used Geographic Object-Based Image Analysis (GEOBIA) and a Random Forest classification on a set of variables comprising texture, shape, slope, neighbour and spectral information. Overall mapping accuracies attained are 90% and 96% for the two study areas, respectively. This mapping method extends the timeline over which historical cropland expansion can be mapped, purely from brightness information in B&W photography, back to the 1930s, which is beneficial for regions where historical land-use statistics are mostly absent.
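
    A condensed sketch of the texture-plus-Random-Forest step, assuming scikit-image's GLCM utilities; the patches and labels are mock data, and a real GEOBIA workflow would operate on image segments rather than square patches.

```python
# GLCM texture features from B&W patches feeding a Random Forest (mock data).
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier

def texture_features(patch):
    """Contrast/homogeneity/energy/correlation from one 8-bit image patch."""
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(5)
patches = rng.integers(0, 256, size=(400, 32, 32), dtype=np.uint8)
labels = rng.integers(0, 2, size=400)              # 1 = cropland (mock labels)

X = np.array([texture_features(p) for p in patches])
clf = RandomForestClassifier(n_estimators=300).fit(X[:300], labels[:300])
print("held-out accuracy:", clf.score(X[300:], labels[300:]))
```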

  19. An effective Hamiltonian approach to quantum random walk

    Indian Academy of Sciences (India)

    In this article we present an effective Hamiltonian approach for discrete-time quantum random walk. A form of the Hamiltonian ... (Tarun Kanti Ghosh, Inter-University Centre for Astronomy and Astrophysics, Pune; Department of Physics, Indian Institute of Technology, Kanpur)

  20. Mapping topographic plant location properties using a dense matching approach

    Science.gov (United States)

    Niederheiser, Robert; Rutzinger, Martin; Lamprecht, Andrea; Bardy-Durchhalter, Manfred; Pauli, Harald; Winkler, Manuela

    2017-04-01

    Within the project MEDIALPS (Disentangling anthropogenic drivers of climate change impacts on alpine plant species: Alps vs. Mediterranean mountains), six regions in Alpine and Mediterranean mountain ranges are investigated to assess how plant species respond to climate change. The project is embedded in the Global Observation Research Initiative in Alpine Environments (GLORIA), a well-established global monitoring initiative for systematic observation of changes in plant species composition and soil temperature on mountain summits worldwide, intended to discern accelerating climate change pressures on these fragile alpine ecosystems. Close-range sensing techniques such as terrestrial photogrammetry are well suited to mapping the terrain topography of small areas at high resolution. Lightweight equipment, flexible positioning for image acquisition in the field, and relative independence from weather conditions (i.e. wind) make this a feasible method for in-situ data collection. New developments in dense matching approaches allow high-quality 3D terrain mapping with fewer requirements for field set-up. However, challenges arise in post-processing and required data storage when many sites have to be mapped. Within MEDIALPS, dense matching is used to map high-resolution topography for 284 plots of 3 x 3 meters, deriving information on vegetation coverage, roughness, slope, aspect and modelled solar radiation. This information helps identify types of topography-dependent ecological growing conditions and evaluate the potential for existing refugial locations for specific plant species under climate change. This research is conducted within the project MEDIALPS, funded by the Earth System Sciences Programme of the Austrian Academy of Sciences.

  1. Analysis of family-wise error rates in statistical parametric mapping using random field theory.

    Science.gov (United States)

    Flandin, Guillaume; Friston, Karl J

    2017-11-01

    This technical report revisits the analysis of family-wise error rates in statistical parametric mapping using random field theory, as reported by Eklund et al. (arXiv:1511.01863). Contrary to the understandable spin that these sorts of analyses attract, a review of their results suggests that they endorse the use of parametric assumptions, and random field theory, in the analysis of functional neuroimaging data. We briefly rehearse the advantages parametric analyses offer over nonparametric alternatives and then unpack the implications of Eklund et al. for parametric procedures. Hum Brain Mapp, 2017. © 2017 The Authors. Human Brain Mapping published by Wiley Periodicals, Inc.
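
    For background, the random-field-theory device at issue is the expected-Euler-characteristic approximation for family-wise error control (standard SPM material rather than a result of this report):

$$
P(\mathrm{FWE}) \;\approx\; \mathbb{E}\bigl[\chi(A_u)\bigr] \;=\; \sum_{d=0}^{D} R_d\,\rho_d(u),
$$

    where $A_u$ is the excursion set of the statistic image above threshold $u$, $R_d$ are the resel counts of the search volume, and $\rho_d$ are the Euler-characteristic densities; for a Gaussian field, $\rho_0(u) = 1 - \Phi(u)$ and $\rho_1(u) = \sqrt{4\ln 2}\,(2\pi)^{-1} e^{-u^2/2}$.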

  2. Aerial Terrain Mapping Using Unmanned Aerial Vehicle Approach

    Directory of Open Access Journals (Sweden)

    K. N. Tahar

    2012-08-01

    Full Text Available This paper looks into the latest achievements of low-cost Unmanned Aerial Vehicle (UAV) technology and its capacity to map semi-developed areas. The objectives of this study are to establish a new methodology, or a new algorithm, for image registration during the interior orientation process, and to determine the accuracy of the photogrammetric products derived from UAV images. Recently, UAV technology has been used in several applications such as mapping, agriculture and surveillance. The aim of this study is to scrutinize the usage of UAVs to map semi-developed areas. The performance of the low-cost UAV mapping study was established on a study area with two image processing methods so that the results could be comparable. A non-metric camera was attached to the bottom of the UAV and used to capture images at both sites after going through several calibration steps. Calibration was carried out to determine focal length, principal distance, radial lens distortion, tangential lens distortion and affinity. A new method of image registration for a non-metric camera is discussed in this paper as part of the new methodology of this study. This method uses the UAV's onboard Global Positioning System (GPS) to register the UAV image for the interior orientation process. Check points were established randomly at both sites using rapid static GPS. Ground control points are used for the exterior orientation process, and check points are used for accuracy assessment of the photogrammetric products. All acquired images were processed in photogrammetric software. Two methods of image registration were applied in this study, namely GPS onboard registration and ground control point registration; both were processed using photogrammetric software and the results are discussed. Two products resulted from this study: the digital orthophoto and the digital terrain model. These were analyzed using the root mean square error.
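
    The accuracy assessment mentioned at the end is conventionally a root-mean-square-error computation over the check points; a small illustrative sketch follows (all coordinates invented).

```python
# RMSE of photogrammetric coordinates against GPS-surveyed check points.
import numpy as np

# (E, N, H) coordinates: surveyed check points vs. photogrammetric product
surveyed = np.array([[512034.1, 9121455.7, 43.2],
                     [512210.6, 9121390.2, 44.8],
                     [512160.3, 9121502.9, 42.5]])   # illustrative values
derived = surveyed + np.array([[0.12, -0.08, 0.30],
                               [-0.05, 0.10, -0.22],
                               [0.09, 0.04, 0.18]])  # simulated residuals

rmse = np.sqrt(((derived - surveyed) ** 2).mean(axis=0))
print("RMSE E/N/H (m):", rmse.round(3))
```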

  4. A mathematical theory of stochastic microlensing. I. Random time delay functions and lensing maps

    Science.gov (United States)

    Petters, A. O.; Rider, B.; Teguia, A. M.

    2009-07-01

    Stochastic microlensing is a central tool in probing dark matter on galactic scales. From first principles, we initiate the development of a mathematical theory of stochastic microlensing. Beginning with the random time delay function and associated lensing map, we determine exact expressions for the mean and variance of these transformations. In addition, we derive the probability density function (pdf) of a random point-mass potential, which forms a constituent of a stochastic microlens potential. We characterize the exact pdf of a normalized random time delay function at the origin, showing that it is a shifted gamma distribution; this also holds at leading order in the limit of a large number of point masses when the normalized time delay function is evaluated at a general point of the lens plane. In the limit of a large number of point masses, we also prove that the asymptotic pdf of the random lensing map under a specified scaling converges to a bivariate normal distribution. We show analytically that the pdf of the random scaled lensing map at leading order depends on the magnitude of the scaled bending angle due purely to point masses, and demonstrate explicitly how this radial symmetry is broken at the next order. Interestingly, we find at leading order a formula linking the expectation and variance of the normalized random time delay function to the first Betti number of its domain. We also determine an asymptotic pdf for the random bending angle vector and find an integral expression for the probability of a lens plane point being near a fixed point. Lastly, we show explicitly how the results are affected by location in the lens plane. The results of this paper are relevant to the theory of random fields and provide a platform for further generalizations as well as analytical limits for checking astrophysical studies of stochastic microlensing.
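
    For orientation, the objects being randomized have the standard microlensing form (the stochastic assumptions placed on the point masses are the paper's contribution):

$$
T_{\mathbf{y}}(\mathbf{x}) \;=\; \frac{|\mathbf{x}-\mathbf{y}|^{2}}{2} \;-\; \psi(\mathbf{x}),
\qquad
\eta(\mathbf{x}) \;=\; \mathbf{x} - \nabla\psi(\mathbf{x}),
$$

    where $\psi$ is the lensing potential, e.g. a sum of point-mass terms $m_i \ln|\mathbf{x} - \mathbf{x}_i|$; images of a source at $\mathbf{y}$ are the critical points of $T_{\mathbf{y}}$, i.e. the solutions of $\eta(\mathbf{x}) = \mathbf{y}$.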

  5. Localization of canine brachycephaly using an across breed mapping approach.

    Directory of Open Access Journals (Sweden)

    Danika Bannasch

    2010-03-01

    Full Text Available The domestic dog, Canis familiaris, exhibits profound phenotypic diversity and is an ideal model organism for the genetic dissection of simple and complex traits. However, some of the most interesting phenotypes are fixed in particular breeds and are therefore less tractable to genetic analysis using classical segregation-based mapping approaches. We implemented an across-breed mapping approach using a moderately dense SNP array, a low number of animals, and breeds carefully selected for the phenotypes of interest to identify genetic variants responsible for breed-defining characteristics. Using a modest number of affected (10-30) and control (20-60) samples from multiple breeds, the correct chromosomal assignment was identified in a proof-of-concept experiment using three previously defined loci: hyperuricosuria, white spotting and chondrodysplasia. Genome-wide association was performed in a similar manner for one of the most striking morphological traits in dogs: brachycephalic head type. Although candidate gene approaches based on comparable phenotypes in mice and humans have been utilized for this trait, the causative gene has remained elusive using this method. Samples from nine affected breeds and thirteen control breeds identified strong genome-wide associations for brachycephalic head type on Cfa 1. Two independent datasets identified the same genomic region. Levels of relative heterozygosity in the associated region indicate that it has been subjected to a selective sweep, consistent with it being a breed-defining morphological characteristic. Genotyping additional dogs in the region confirmed the association. To date, the genetic structure of dog breeds has primarily been exploited for genome-wide association for segregating traits. These results demonstrate that non-segregating traits under strong selection are equally tractable to genetic analysis using small sample numbers.

  6. A Multi-Dasymetric Mapping Approach for Tourism

    Directory of Open Access Journals (Sweden)

    Eric Vaz

    2013-12-01

    Full Text Available Measuring tourism density at the municipal level has been a daunting task for both statisticians and geographers. This is because administrative areas, such as municipalities, tend to be large spatial units, with a large asymmetry of tourist demand within each municipality. Geographic characteristics such as the coastline, climate and vegetation play a crucial role in tourist offer, leading to the conclusion that traditional censuses at the administrative level are simply not enough to interpret the true distribution of tourism data. A more quantifiable method is necessary to assess the distribution of socio-economic data; this is developed here by means of a dasymetric approach, adding the advantages of multi-temporal comparison. This paper adopts a dasymetric approach for defining tourism density per land use type using the CORINE Land Cover dataset. A density map for tourism is calculated, creating a modified areal weighting (MAW) approach to assess the distribution of tourism density per administrative municipality. This distribution is then assessed as a bidirectional layer on the land use datasets for two temporal stamps, 2000 and 2006, which leads to (i) a consistent map of a more accurate distribution of tourism in the Algarve, (ii) the calculation of tourism density surfaces, and (iii) a multi-locational and temporal assessment through density cross-tabulation. Finally, a geovisual interpretation of the locational analysis of tourism change in the Algarve over the last decade is created. This integrative spatial methodology offers unique characteristics for more accurate decision making at the regional level, bringing an integrative methodology to the forefront of linking tourism with the spatio-temporal clusters formed in rapidly changing economic regions.
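
    The core of modified areal weighting fits in a few lines: municipal totals are split over land-use polygons in proportion to area times a class weight. The classes, weights and totals below are invented for illustration.

```python
# Toy modified-areal-weighting (dasymetric) redistribution of tourism counts.
import pandas as pd

landuse = pd.DataFrame({
    "municipality": ["Faro", "Faro", "Faro", "Lagos", "Lagos"],
    "clc_class": ["urban", "beach", "forest", "urban", "beach"],
    "area_km2": [12.0, 3.0, 25.0, 8.0, 5.0],
})
totals = {"Faro": 40_000, "Lagos": 25_000}            # tourists per municipality
weight = {"urban": 1.0, "beach": 4.0, "forest": 0.1}  # relative tourist intensity

landuse["w_area"] = landuse["clc_class"].map(weight) * landuse["area_km2"]
share = landuse["w_area"] / landuse.groupby("municipality")["w_area"].transform("sum")
landuse["tourists"] = share * landuse["municipality"].map(totals)
landuse["density"] = landuse["tourists"] / landuse["area_km2"]
print(landuse[["municipality", "clc_class", "tourists", "density"]])
```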

  7. Fractal generator for efficient production of random planar patterns and symbols in digital mapping

    Science.gov (United States)

    Chen, Qiyu; Liu, Gang; Ma, Xiaogang; Li, Xinchuan; He, Zhenwen

    2017-08-01

    In digital cartography, the automatic generation of random planar patterns and symbols is still an ongoing challenge. Such patterns and symbols have randomly varied configurations and boundaries, and their generating algorithms are constrained by shape features, cartographic standards and many other conditions. Fractal geometry offers favorable solutions for simulating random boundaries and patterns. In the work presented in this paper, we used both fractal theory and random Iterated Function Systems (IFS) to develop a method for the automatic generation of random planar patterns and symbols. The marshland and trough cross-bedding patterns were used as two case studies for the implementation of the method. We first analyzed the morphological characteristics of those two planar patterns. Then we designed algorithms and implementation schemes addressing the features of each pattern. Finally, we ran the algorithms to generate the patterns and symbols, and compared them with the requirements of a few digital cartographic standards. The method presented in this paper has already been deployed in a digital mapping system for practical use. The flexibility of the method also allows it to be reused and/or adapted in various software platforms for digital mapping.
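
    To make the random-IFS idea concrete, here is the classic chaos-game construction with affine maps chosen at random each step; the Barnsley-fern coefficients stand in for the pattern-specific maps that the paper designs per cartographic symbol.

```python
# Random IFS via the chaos game (Barnsley-fern coefficients as an example).
import numpy as np

# each row: a, b, c, d, e, f for (x, y) -> (a x + b y + e, c x + d y + f)
maps = np.array([
    [0.00,  0.00,  0.00, 0.16, 0.0, 0.00],
    [0.85,  0.04, -0.04, 0.85, 0.0, 1.60],
    [0.20, -0.26,  0.23, 0.22, 0.0, 1.60],
    [-0.15, 0.28,  0.26, 0.24, 0.0, 0.44],
])
probs = [0.01, 0.85, 0.07, 0.07]   # probability of picking each map

rng = np.random.default_rng(6)
pts = np.empty((50_000, 2))
x = y = 0.0
for i in range(len(pts)):
    a, b, c, d, e, f = maps[rng.choice(4, p=probs)]
    x, y = a * x + b * y + e, c * x + d * y + f
    pts[i] = (x, y)
print("bounding box:", pts.min(axis=0).round(2), pts.max(axis=0).round(2))
```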

  8. Mapping the distribution of malaria: current approaches and future directions

    Science.gov (United States)

    Johnson, Leah R.; Lafferty, Kevin D.; McNally, Amy; Mordecai, Erin A.; Paaijmans, Krijn P.; Pawar, Samraat; Ryan, Sadie J.; Chen, Dongmei; Moulin, Bernard; Wu, Jianhong

    2015-01-01

    Mapping the distribution of malaria has received substantial attention because the disease is a major source of illness and mortality in humans, especially in developing countries. It also has a defined temporal and spatial distribution. The distribution of malaria is most influenced by its mosquito vector, which is sensitive to extrinsic environmental factors such as rainfall and temperature. Temperature also affects the development rate of the malaria parasite in the mosquito. Here, we review the range of approaches used to model the distribution of malaria, from spatially explicit to implicit, mechanistic to correlative. Although current methods have significantly improved our understanding of the factors influencing malaria transmission, significant gaps remain, particularly in incorporating nonlinear responses to temperature and temperature variability. We highlight new methods to tackle these gaps and to integrate new data with models.
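
    As one concrete example of the nonlinear temperature responses the review highlights, mechanistic models often use a Briere-type unimodal trait curve; the sketch below uses placeholder parameters, not fitted mosquito or parasite values.

```python
# Illustrative Briere thermal-response curve (placeholder parameters).
import numpy as np

def briere(T, c, T0, Tm):
    """Unimodal trait response: zero outside (T0, Tm), peaked in between."""
    T = np.asarray(T, dtype=float)
    out = c * T * (T - T0) * np.sqrt(np.clip(Tm - T, 0, None))
    return np.where((T > T0) & (T < Tm), out, 0.0)

temps = np.linspace(10, 40, 7)
print(briere(temps, c=2.5e-4, T0=10.0, Tm=38.0).round(4))
```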

  9. Local Relation Map: A Novel Illumination Invariant Face Recognition Approach

    Directory of Open Access Journals (Sweden)

    Lian Zhichao

    2012-10-01

    Full Text Available In this paper, a novel illumination-invariant face recognition approach is proposed. Unlike most existing methods, an additive noise term is considered in the face model under varying illumination, in addition to a multiplicative illumination term. High-frequency coefficients of the Discrete Cosine Transform (DCT) are discarded to eliminate the effect caused by noise. Based on the local characteristics of the human face, a simple but effective illumination-invariant feature, the local relation map, is proposed. Experimental results on the Yale B, Extended Yale B and CMU PIE databases demonstrate the superior performance and lower computational burden of the proposed method compared to other existing methods. The results also demonstrate the validity of the proposed face model and the assumption on noise.
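
    The noise-suppression step described above can be sketched with SciPy's n-dimensional DCT; the cutoff (keeping only a low-frequency block) is an illustrative assumption.

```python
# Discard high-frequency 2-D DCT coefficients and invert the transform.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(7)
face = rng.random((64, 64))                  # stand-in for a grayscale face

coef = dctn(face, norm="ortho")
keep = 16                                    # keep only the lowest 16x16 block
mask = np.zeros_like(coef)
mask[:keep, :keep] = 1.0
smoothed = idctn(coef * mask, norm="ortho")  # noise-reduced reconstruction
print("reconstruction error:", np.abs(face - smoothed).mean().round(4))
```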

  10. Mapping site-based construction workers’ motivation: Expectancy theory approach

    Directory of Open Access Journals (Sweden)

    Parviz Ghoddousi

    2014-03-01

    Full Text Available The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in the construction context and to confirm the validity of this model for the construction industry. The study drew upon data from 194 site-based construction workers in Iran to test the proposed model of motivation. To this end, the structural equation modelling (SEM) approach based on the confirmatory factor analysis (CFA) technique was deployed. The study reveals that the proposed expectancy-theory model incorporating five indicators (i.e. intrinsic instrumentality, extrinsic instrumentality, intrinsic valence, extrinsic valence and expectancy) is able to map the process of construction workers' motivation. Nonetheless, the findings suggest that intrinsic indicators could be more effective than extrinsic ones. This implies that construction managers should place further focus on intrinsic motivators to motivate workers.

  11. UAV Remote Sensing for Urban Vegetation Mapping Using Random Forest and Texture Analysis

    Directory of Open Access Journals (Sweden)

    Quanlong Feng

    2015-01-01

    Full Text Available Unmanned aerial vehicle (UAV) remote sensing has great potential for vegetation mapping in complex urban landscapes due to the ultra-high-resolution imagery acquired at low altitudes. Because of payload capacity restrictions, off-the-shelf digital cameras are widely used on medium and small sized UAVs. The limitation of the low spectral resolution of digital cameras for vegetation mapping can be reduced by incorporating texture features and robust classifiers. Random Forest has been widely used in satellite remote sensing applications, but its usage in UAV image classification has not been well documented. The objectives of this paper were to propose a hybrid method using Random Forest and texture analysis to accurately differentiate land covers of urban vegetated areas, and to analyze how classification accuracy changes with texture window size. The six least correlated second-order texture measures were calculated at nine different window sizes and added to the original Red-Green-Blue (RGB) images as ancillary data. A Random Forest classifier consisting of 200 decision trees was used for classification in the spectral-textural feature space. Results indicated the following: (1) Random Forest outperformed the traditional Maximum Likelihood classifier and showed similar performance to object-based image analysis in urban vegetation classification; (2) the inclusion of texture features improved classification accuracy significantly; (3) classification accuracy followed an inverted-U relationship with texture window size. The results demonstrate that UAVs provide an efficient and ideal platform for urban vegetation mapping. The hybrid method proposed in this paper shows good performance in urban vegetation mapping. The drawbacks of off-the-shelf digital cameras can be reduced by adopting Random Forest and texture analysis at the same time.

  12. Comparison between WorldView-2 and SPOT-5 images in mapping the bracken fern using the random forest algorithm

    Science.gov (United States)

    Odindi, John; Adam, Elhadi; Ngubane, Zinhle; Mutanga, Onisimo; Slotow, Rob

    2014-01-01

    Plant species invasion is known to be a major threat to socioeconomic and ecological systems. Due to the high cost and limited extent of urban green spaces, high mapping accuracy is necessary to optimize the management of such spaces. We compare the performance of the new-generation WorldView-2 (WV-2) and SPOT-5 images in mapping the bracken fern [Pteridium aquilinum (L.) Kuhn] in a conserved urban landscape. Using the random forest algorithm, grid-search approaches based on the out-of-bag error estimate were used to determine the optimal ntree and mtry combinations. The variable importance and backward feature elimination techniques were further used to determine the influence of the image bands on mapping accuracy. Additionally, the value of commonly used vegetation indices in enhancing classification accuracy was tested on the better-performing image data. Results show that the performance of the new WV-2 bands was better than that of the traditional bands. Overall classification accuracies of 84.72 and 72.22% were achieved for the WV-2 and SPOT images, respectively. Use of selected indices from the WV-2 bands increased the overall classification accuracy to 91.67%. The findings of this study show the suitability of the new generation of sensors for mapping the bracken fern within often vulnerable urban natural vegetation cover types.
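
    In scikit-learn terms, the ntree/mtry grid search with out-of-bag error looks roughly like this (synthetic data; n_estimators and max_features play the roles of ntree and mtry):

```python
# Grid search over Random Forest hyperparameters using out-of-bag error.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=8, random_state=0)

best = None
for n_trees in [100, 300, 500]:                 # ntree candidates
    for mtry in [2, 3, 4]:                      # mtry = features per split
        rf = RandomForestClassifier(n_estimators=n_trees, max_features=mtry,
                                    oob_score=True, random_state=0).fit(X, y)
        oob_err = 1.0 - rf.oob_score_
        if best is None or oob_err < best[0]:
            best = (oob_err, n_trees, mtry)
print("best OOB error %.3f at ntree=%d, mtry=%d" % best)
```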

  13. Local random quantum circuits: Ensemble completely positive maps and swap algebras

    Energy Technology Data Exchange (ETDEWEB)

    Zanardi, Paolo [Department of Physics and Astronomy, and Center for Quantum Information Science and Technology, University of Southern California, Los Angeles, California 90089-0484, USA and Centre for Quantum Technologies, National University of Singapore, 2 Science Drive 3, Singapore 117542 (Singapore)

    2014-08-15

    We define different classes of local random quantum circuits (L-RQC) and show that (a) statistical properties of L-RQC are encoded into an associated family of completely positive maps and (b) average purity dynamics can be described by the action of these maps on operator algebras of permutations (swap algebras). An exactly solvable one-dimensional case is analyzed to illustrate the power of the swap algebra formalism. More generally, we prove short-time area-law bounds on average purity for uncorrelated L-RQC and infinite-time results for both the uncorrelated and correlated cases.

  15. Extension of the Multipole Approach to Random Metamaterials

    Directory of Open Access Journals (Sweden)

    A. Chipouline

    2012-01-01

    Full Text Available The influence of short-range lateral disorder in meta-atom positioning on the effective parameters of metamaterials is investigated theoretically using the multipole approach. Random variation of the near-field quasi-static interaction between meta-atoms in the form of double wires is shown to be the reason for the changes in effective permittivity and permeability. The obtained analytical results are compared with known experimental ones.

  16. Random materials modeling : Statistical approach proposal for recycling materials

    OpenAIRE

    Jeong, Jena; Wang, L.; Schmidt, Franziska; LEKLOU, NORDINE; Ramezani, Hamidreza

    2015-01-01

    The current paper aims to promote the application of demolition waste in civil construction. To achieve this aim, two main physical properties of the recycled aggregates, i.e. dry density and water absorption, were chosen and studied in the first stage. The material moduli of the recycled materials, i.e. the Lamé coefficients, strongly depend on the porosity. Moreover, recycled materials should be considered as random materials. As a result, the statistical approach...

  17. Randomized comparison between objective-based lectures and outcome-based concept mapping for teaching neurological care to nursing students.

    Science.gov (United States)

    Hsu, Li-Ling; Pan, Hui-Ching; Hsieh, Suh-Ing

    2016-02-01

    Pre-registration programs have been found to insufficiently prepare nurses for working in the neurosciences specialism. Effective approaches to neurology education are important, not only to enhance motivation to learn, but also for learners to develop basic competence in handling patients with neurological problems. The aim was to demonstrate that an outcome-based course design using concept mapping would bring about significant differences in nursing students' competency, cognitive load, and learning satisfaction with the neurological care course. A two-group pretest and post-test experimental study was conducted. Two of the four clusters of participants were randomly assigned to the experimental group, which experienced an outcome-based course design using concept mapping, and the rest were designated the control group and given objective-based lectures only. The Competency Inventory of Nursing Students, the Cognitive Load Scale of Neurological Nursing, and the Learning Satisfaction Scale of Neurological Nursing were used in this study for the students to rate their own performance. In addition, the Concept Map Scoring Scale was used in the experimental group to examine students' concept mapping ability. Significant increases in mean nursing competency scores from pre-test to post-test were found in both groups. There was no statistically significant difference in mean nursing competency score between the experimental and control groups at post-test. The mean cognitive load score of the experimental group was lower than that of the control group at post-test, and the mean learning satisfaction scores of the experimental group were higher than those of the control group. This article suggests that outcome-based concept mapping as an educational method could encourage a group of nursing students to take a bio-psycho-social approach to medicine, which might ultimately result in better nursing care quality. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    Science.gov (United States)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

    For many years, the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data generated every day by remote sensors raises more challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
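
    The MapReduce pattern for distributed classification is easy to sketch as a Hadoop-Streaming-style mapper: each mapper applies a pre-trained model to the records in its input split. The linear weights and the comma-separated input format are invented for illustration and are unrelated to the ICP internals.

```python
# Hadoop-Streaming-style mapper: classify each record with a fixed model.
import sys

WEIGHTS = [0.8, -0.4, 0.1]   # pre-trained linear SVM weights (illustrative)
BIAS = -0.2

for line in sys.stdin:                      # one record per line: id,f1,f2,f3
    rec_id, *feats = line.strip().split(",")
    score = sum(w * float(v) for w, v in zip(WEIGHTS, feats)) + BIAS
    label = 1 if score > 0 else 0
    print(f"{label}\t{rec_id}")             # key\tvalue pairs for the reducer
```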

  20. A Joint Land Cover Mapping and Image Registration Algorithm Based on a Markov Random Field Model

    Directory of Open Access Journals (Sweden)

    Apisit Eiumnoh

    2013-10-01

    Full Text Available Traditionally, image registration of multi-modal and multi-temporal images is performed satisfactorily before land cover mapping. However, since multi-modal and multi-temporal images are likely to be obtained from different satellite platforms and/or acquired at different times, perfect alignment is very difficult to achieve. As a result, a proper land cover mapping algorithm must be able to correct registration errors as well as perform an accurate classification. In this paper, we propose a joint classification and registration technique based on a Markov random field (MRF) model to simultaneously align two or more images and obtain a land cover map (LCM) of the scene. The expectation maximization (EM) algorithm is employed to solve the joint image classification and registration problem by iteratively estimating the map parameters and approximate posterior probabilities. Then, the maximum a posteriori (MAP) criterion is used to produce an optimum land cover map. We conducted experiments on a set of four simulated images and one pair of remotely sensed images to investigate the effectiveness and robustness of the proposed algorithm. Our results show that, with proper selection of a critical MRF parameter, the resulting LCMs derived from an unregistered image pair can achieve an accuracy that is as high as when the images are perfectly aligned. Furthermore, the registration error can be greatly reduced.
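
    To give a feel for MAP labeling under an MRF prior, here is a toy iterated-conditional-modes (ICM) optimizer with a Potts smoothness term; it is a much simpler stand-in for the paper's EM scheme, and the registration component is omitted entirely.

```python
# Toy MAP labeling under a Potts MRF prior via iterated conditional modes.
import numpy as np

rng = np.random.default_rng(11)
H, W, K = 40, 40, 3
true = (np.arange(H * W) * K // (H * W)).reshape(H, W)   # K horizontal bands
obs = true + 0.8 * rng.normal(size=(H, W))               # noisy class "image"

beta = 1.5                                               # MRF smoothness weight
labels = np.argmin([(obs - k) ** 2 for k in range(K)], axis=0)
for _ in range(10):                                      # ICM sweeps
    for i in range(H):
        for j in range(W):
            costs = np.empty(K)
            for k in range(K):
                data = (obs[i, j] - k) ** 2              # Gaussian data term
                nbrs = [labels[x, y] for x, y in
                        ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= x < H and 0 <= y < W]
                smooth = sum(k != n for n in nbrs)       # Potts penalty
                costs[k] = data + beta * smooth
            labels[i, j] = costs.argmin()
print("agreement with truth:", (labels == true).mean())
```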

  1. Pure P2P mediation system: A mappings discovery approach

    Science.gov (United States)

    selma, El yahyaoui El idrissi; Zellou, Ahmed; Idri, Ali

    2015-02-01

    The aim of information integration systems is to offer a uniform interface providing access to a set of autonomous and distributed information sources. The most important advantage of such a system is that it allows users to specify what they want, rather than thinking about how to get the responses. Work in this area has led to two major classes of integration systems: mediation systems based on the mediator/adapter paradigm, and peer-to-peer (P2P) systems. The combination of the two has led to a third type: P2P mediation systems. P2P systems are large-scale, self-organized and distributed; they allow resource management in a completely decentralized way. However, the integration of structured, heterogeneous and distributed information sources proves to be a complex problem. The objective of this work is to propose an approach to resolve conflicts and establish mappings between the heterogeneous elements. This approach is based on clustering, which groups similar peers that share common information in the same subnet. Thus, to accommodate heterogeneity, we introduced three additional layers into our hierarchy of peers: internal schema, external schema and schema directory peer. We used linguistic techniques, specifically the name correspondence technique, which is based on the similarity of names, to propose correspondences.
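
    Name-based correspondence can be illustrated with a few lines of standard-library Python; the element names and the 0.6 acceptance threshold are illustrative assumptions.

```python
# Name-correspondence matching between two peers' schema elements.
from difflib import SequenceMatcher

peer_a = ["hotelName", "roomPrice", "checkInDate"]
peer_b = ["hotel_name", "price_per_room", "arrival_date"]

def similarity(s, t):
    """String-similarity ratio in [0, 1], case-insensitive."""
    return SequenceMatcher(None, s.lower(), t.lower()).ratio()

for a in peer_a:
    best = max(peer_b, key=lambda b: similarity(a, b))
    score = similarity(a, best)
    if score >= 0.6:                      # accept mapping above threshold
        print(f"{a} <-> {best}  ({score:.2f})")
```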

  2. ConMap: Investigating new computer-based approaches to assessing conceptual knowledge structure in physics

    Science.gov (United States)

    Beatty, Ian D.

    2000-06-01

    There is a growing consensus among educational researchers that traditional problem-based assessments are not effective tools for diagnosing a student's knowledge state and for guiding pedagogical intervention, and that new tools grounded in the results of cognitive science research are needed. The ConMap ("Conceptual Mapping") project, described in this dissertation, proposed and investigated some novel methods for assessing the conceptual knowledge structure of physics students. A set of brief computer-administered tasks for eliciting students' conceptual associations was designed. The basic approach of the tasks was to elicit spontaneous term associations from subjects by presenting them with a prompt term, problem, or topic area, and having them type a set of response terms. Each response was recorded along with the time spent thinking of and typing it. Several studies were conducted in which data were collected on introductory physics students' performance on the tasks. A detailed statistical description of the data was compiled. Phenomenological characterization of the data (description and statistical summary of observed patterns) provided insight into the way students respond to the tasks, and uncovered some notable features to guide modeling efforts. Possible correlations were investigated, some among different aspects of the ConMap data, others between aspects of the data and students' in-course exam scores. Several correlations were found which suggest that the ConMap tasks can successfully reveal information about students' knowledge structuring and level of expertise. Similarity was observed between data from one of the tasks and results from a traditional concept map task. Two rudimentary quantitative models for the temporal aspects of student performance on one of the tasks were constructed, one based on random probability distributions and the other on a detailed deterministic representation of conceptual knowledge structure. Both models were

  3. Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness

    OpenAIRE

    Chelouche, Doron; Nuñez, Francisco Pozo; Zucker, Shay

    2017-01-01

    A class of methods for measuring time delays between astronomical time series, based on measures of randomness or complexity of the data, is introduced in the context of quasar reverberation mapping. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann's mean-square successive-difference estimator are found ...
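
    The von Neumann estimator named above is simple enough to sketch end-to-end: shift one light curve by a trial lag, merge the two series in time order, and take the lag that minimizes the mean-square successive difference of the combined curve (the light curves below are synthetic).

```python
# Time-lag recovery by minimizing the von Neumann "randomness" measure.
import numpy as np

def von_neumann(times, fluxes):
    """Mean-square successive difference of fluxes ordered by time."""
    order = np.argsort(times)
    f = fluxes[order]
    return np.mean(np.diff(f) ** 2)

rng = np.random.default_rng(8)
signal = lambda t: np.sin(t / 10.0)
t1 = np.sort(rng.uniform(0, 200, 300))
f1 = signal(t1) + 0.05 * rng.normal(size=t1.size)
t2 = np.sort(rng.uniform(0, 200, 300))
f2 = signal(t2 - 15.0) + 0.05 * rng.normal(size=t2.size)   # true lag: 15

lags = np.linspace(0, 40, 161)
scores = [von_neumann(np.concatenate([t1, t2 - lag]),
                      np.concatenate([f1, f2])) for lag in lags]
print("estimated lag:", lags[int(np.argmin(scores))])
```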

  4. A geostatistical approach to mapping site response spectral amplifications

    Science.gov (United States)

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Tanaka, Y.; Tanaka, H.

    2010-01-01

    If quantitative estimates of the seismic properties do not exist at a location of interest, then the site response spectral amplifications must be estimated from data collected at other locations. Currently, the most common approach employs correlations of site class with maps of surficial geology. Analogously, correlations of site class with topographic slope can be employed where the surficial geology is unknown. Our goal is to identify and validate a method to estimate site response with greater spatial resolution and accuracy for regions where additional effort is warranted. This method consists of three components: region-specific data collection, a spatial model for interpolating seismic properties, and a theoretical method for computing spectral amplifications from the interpolated seismic properties. We consider three spatial interpolation schemes: correlations with surficial geology, termed the geologic trend (GT), ordinary kriging (OK), and kriging with a trend (KT). We estimate the spectral amplifications from seismic properties using the square root of impedance method, thereby linking the frequency-dependent spectral amplifications to the depth-dependent seismic properties. Thus, the range of periods for which this method is applicable is limited by the depth of exploration. A dense survey of near-surface S-wave slowness (Ss) throughout Kobe, Japan shows that the geostatistical methods give more accurate estimates of Ss than the topographic slope and GT methods, and the OK and KT methods perform equally well. We prefer the KT model because it can be seamlessly integrated with geologic maps that cover larger regions. Empirical spectral amplifications show that the region-specific data achieve more accurate estimates of observed median short-period amplifications than the topographic slope method. © 2010 Elsevier B.V.

  5. Adverse Effects of Electronic Cigarette Use: A Concept Mapping Approach.

    Science.gov (United States)

    Soule, Eric K; Nasim, Aashir; Rosas, Scott

    2016-05-01

    Electronic cigarette (ECIG) use has grown rapidly in popularity within a short period of time. As ECIG products continue to evolve and more individuals begin using ECIGs, it is important to understand the potential adverse effects that are associated with ECIG use. The purpose of this study was to examine and describe the acute adverse effects associated with ECIG use. This study used an integrated, mixed-method participatory approach called concept mapping (CM). Experienced ECIG users (n = 85) provided statements that answered the focus prompt "A specific negative or unpleasant effect (i.e., physical or psychological) that I have experienced either during or immediately after using an electronic cigarette device is…" in an online program. Participants sorted these statements into piles of common themes and rated each statement. Using multidimensional scaling and hierarchical cluster analysis, a concept map of the adverse effects statements was created. Participants generated 79 statements that completed the focus prompt and were retained by researchers. Analysis generated a map containing five clusters that characterized perceived adverse effects of ECIG use: Stigma, Worry/Guilt, Addiction Signs, Physical Effects, and Device/Vapor Problems. ECIG use is associated with adverse effects that should be monitored as ECIGs continue to grow in popularity. If ECIGs are to be regulated, policies should be created that minimize the likelihood of user-identified adverse effects. This article provides a list of adverse effects reported by experienced ECIG users. This article organizes these effects into a conceptual model that may be useful for better understanding the adverse outcomes associated with ECIG use. These identified adverse effects may be useful for health professionals and policy makers. Health professionals should be aware of potential negative health effects that may be associated with ECIG use and policy makers could design ECIG regulations that minimize the
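
    The analysis pipeline named above (multidimensional scaling followed by hierarchical clustering of the sorted statements) can be sketched with scikit-learn and SciPy; the similarity matrix here is synthetic rather than actual sorting data.

```python
# Concept-mapping analysis sketch: MDS of a co-sorting similarity matrix,
# then hierarchical (Ward) clustering of the resulting 2-D point map.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.manifold import MDS

rng = np.random.default_rng(9)
n_statements = 20
sim = rng.random((n_statements, n_statements))      # co-sort proportions
sim = (sim + sim.T) / 2                             # symmetrize
np.fill_diagonal(sim, 1.0)
dist = 1.0 - sim                                    # similarity -> distance

xy = MDS(n_components=2, dissimilarity="precomputed",
         random_state=0).fit_transform(dist)        # 2-D statement map
clusters = fcluster(linkage(xy, method="ward"), t=5, criterion="maxclust")
print("cluster sizes:", np.bincount(clusters)[1:])
```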

  6. A random-permutations-based approach to fast read alignment.

    Science.gov (United States)

    Lederman, Roy

    2013-01-01

    Read alignment is a computational bottleneck in some sequencing projects. Most of the existing software packages for read alignment are based on two algorithmic approaches: prefix-trees and hash-tables. We propose a new approach to read alignment using random permutations of strings. We present a prototype implementation and experiments performed with simulated and real reads of human DNA. Our experiments indicate that this permutations-based prototype is several times faster than comparable programs for fast read alignment and that it aligns more reads correctly. This approach may lead to improved speed, sensitivity, and accuracy in read alignment. The algorithm can also be used for specialized alignment applications and it can be extended to other related problems, such as assembly. More information: http://alignment.commons.yale.edu.
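
    The abstract does not spell out the algorithm, but the core trick of permutation-based indexing can be illustrated. A minimal sketch under that assumption, with a toy reference; the names build_index and query are illustrative, not the prototype's API:

        import bisect
        import random

        def build_index(ref, k, seed):
            """Index all k-mers of ref under one random permutation of positions."""
            perm = list(range(k))
            random.Random(seed).shuffle(perm)
            entries = sorted(
                ("".join(ref[i + p] for p in perm), i)
                for i in range(len(ref) - k + 1)
            )
            keys = [key for key, _ in entries]
            return perm, keys, entries

        def query(read, index, window=2):
            """Loci whose permuted k-mer is lexicographically near the read's;
            with several independent permutations, a read with a few mismatches
            is likely to share a long permuted prefix under at least one."""
            perm, keys, entries = index
            j = bisect.bisect_left(keys, "".join(read[p] for p in perm))
            lo, hi = max(0, j - window), min(len(entries), j + window)
            return [pos for _, pos in entries[lo:hi]]

        ref = "ACGTACGGTACGATCGATCGGATC"
        indexes = [build_index(ref, k=8, seed=s) for s in range(4)]
        candidates = {loc for idx in indexes for loc in query("ACGGTACG", idx)}
        print(sorted(candidates))   # contains 4, the true locus in ref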

  7. A topographical map approach to representing treatment efficacy: a focus on positive psychology interventions.

    Science.gov (United States)

    Gorlin, Eugenia I; Lee, Josephine; Otto, Michael W

    2018-01-01

    A recent meta-analysis by Bolier et al. indicated that positive psychology interventions have overall small to moderate effects on well-being, but results were quite heterogeneous across intervention trials. Such meta-analytic research helps condense information on the efficacy of a broad psychosocial intervention by averaging across many effects; however, such global averages may provide limited navigational guidance for selecting among specific interventions. Here, we introduce a novel method for displaying qualitative and quantitative information on the efficacy of interventions using a topographical map approach. As an initial prototype for demonstrating this method, we mapped 50 positive psychology interventions targeting well-being (as captured in the Bolier et al. [2013] meta-analysis, [Bolier, L., Haverman, M., Westerhof, G. J., Riper, H., Smit, F., & Bohlmeijer, E. (2013). Positive psychology interventions: A meta-analysis of randomized controlled studies. BMC Public Health, 13, 83]). Each intervention domain/subdomain was mapped according to its average effect size (indexed by vertical elevation), number of studies providing effect sizes (indexed by horizontal area), and therapist/client burden (indexed by shading). The geographical placement of intervention domains/subdomains was determined by their conceptual proximity, allowing viewers to gauge the general conceptual "direction" in which promising intervention effects can be found. The resulting graphical displays revealed several prominent features of the well-being intervention "landscape," such as more strongly and uniformly positive effects of future-focused interventions (including goal-pursuit and optimism training) compared to past/present-focused ones.

  8. An optimization approach for extracting and encoding consistent maps in a shape collection

    KAUST Repository

    Huang, Qi-Xing

    2012-11-01

    We introduce a novel approach for computing high quality point-to-point maps among a collection of related shapes. The proposed approach takes as input a sparse set of imperfect initial maps between pairs of shapes and builds a compact data structure which implicitly encodes an improved set of maps between all pairs of shapes. These maps align well with point correspondences selected from initial maps; they map neighboring points to neighboring points; and they provide cycle-consistency, so that map compositions along cycles approximate the identity map. The proposed approach is motivated by the fact that a complete set of maps between all pairs of shapes that admits nearly perfect cycle-consistency is highly redundant and can be represented by compositions of maps through a single base shape. In general, multiple base shapes are needed to adequately cover a diverse collection. Our algorithm sequentially extracts such a small collection of base shapes and creates correspondences from each of these base shapes to all other shapes. These correspondences are found by global optimization on candidate correspondences obtained by diffusing initial maps. These are then used to create a compact graphical data structure from which globally optimal cycle-consistent maps can be extracted using simple graph algorithms. Experimental results on benchmark datasets show that the proposed approach yields significantly better results than state-of-the-art data-driven shape matching methods. © 2012 ACM.
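
    Cycle-consistency, the key property exploited here, is easy to state in code: composing maps along a cycle should approximately return each point to itself. A minimal sketch with maps represented as index arrays (toy data, not the paper's data structure):

        import numpy as np

        def cycle_consistency(f_ab, f_bc, f_ca):
            """Fraction of points on shape A returned to themselves by the
            composition of maps along the cycle A -> B -> C -> A."""
            round_trip = f_ca[f_bc[f_ab]]
            return np.mean(round_trip == np.arange(len(f_ab)))

        rng = np.random.default_rng(0)
        n = 100
        f_ab = rng.permutation(n)           # toy point-to-point map A -> B
        f_bc = rng.permutation(n)           # toy map B -> C
        f_ca = np.argsort(f_bc[f_ab])       # C -> A chosen to close the cycle
        print(cycle_consistency(f_ab, f_bc, f_ca))   # 1.0: perfectly consistent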

  9. Spectral similarity approach for mapping turbidity of an inland waterbody

    Science.gov (United States)

    Garg, Vaibhav; Senthil Kumar, A.; Aggarwal, S. P.; Kumar, Vinay; Dhote, Pankaj R.; Thakur, Praveen K.; Nikam, Bhaskar R.; Sambare, Rohit S.; Siddiqui, Asfa; Muduli, Pradipta R.; Rastogi, Gurdeep

    2017-07-01

    Turbidity is an important quality parameter of water from its optical property point of view. It varies spatio-temporally over large waterbodies, and its well-distributed measurement in the field is tedious and time consuming. Generally, normalized difference turbidity index (NDTI), band ratio, or regression analysis between turbidity concentration and band reflectance approaches have been adopted to retrieve turbidity using multispectral remote sensing data. These techniques usually provide qualitative rather than quantitative estimates of turbidity. However, in the present study, spectral similarity analysis between the spectral characteristics of spaceborne hyperspectral remote sensing data and a spectral library generated in the field was carried out to quantify turbidity in part of Chilika Lake, Odisha, India. The spatial spectral contextual image analysis technique, spectral angle mapper (SAM), was evaluated for the same. The SAM spectral matching technique has been widely used in geological applications (mineral mapping); however, the application of this kind of technique is limited in water quality studies due to the non-availability of reference spectral libraries. A spectral library was generated in the field for different concentrations of turbidity using well-calibrated instruments such as a field spectro-radiometer, a turbidity meter and a handheld global positioning system. The field spectra were classified into 7 classes of turbidity concentration (up to >100 NTU) for analysis. The analysis reveals that at each location in the lake under consideration, the field spectra matched the image spectra with a SAM score of 0.8 or more. The observed turbidity at each location also fell within the estimated turbidity class range. It was observed that the spectral similarity approach provides a more quantitative estimate of turbidity compared to NDTI.
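
    The spectral angle mapper itself reduces to the angle between a pixel spectrum and a reference spectrum (note the study quotes a similarity score where higher is better, whereas the classic SAM measure is an angle where smaller is better). A minimal sketch with an illustrative two-entry library; the class labels and spectra are invented for the example:

        import numpy as np

        def spectral_angle(s, r):
            """Angle (radians) between pixel spectrum s and reference spectrum r;
            a smaller angle means a closer spectral match."""
            s, r = np.asarray(s, float), np.asarray(r, float)
            cos = s @ r / (np.linalg.norm(s) * np.linalg.norm(r))
            return float(np.arccos(np.clip(cos, -1.0, 1.0)))

        def classify(pixel, library):
            """Assign the turbidity class whose library spectrum is closest."""
            return min(library, key=lambda cls: spectral_angle(pixel, library[cls]))

        # Illustrative library; real libraries hold one field spectrum per class.
        library = {"low turbidity": [0.02, 0.05, 0.04],
                   "high turbidity": [0.10, 0.18, 0.12]}
        print(classify([0.09, 0.16, 0.11], library))   # -> high turbidity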

  10. Artificial Neural Network Approach for Mapping Contrasting Tillage Practices

    Directory of Open Access Journals (Sweden)

    Terry Howell

    2010-02-01

    Tillage information is crucial for environmental modeling as it directly affects evapotranspiration, infiltration, runoff, carbon sequestration, and soil losses due to wind and water erosion from agricultural fields. However, collecting this information can be time consuming and costly. Remote sensing approaches are promising for rapid collection of tillage information on individual fields over large areas. Numerous regression-based models are available to derive tillage information from remote sensing data. However, these models require information about the complex nature of underlying watershed characteristics and processes. Unlike regression-based models, an Artificial Neural Network (ANN) provides an efficient alternative to map complex nonlinear relationships between input and output datasets without requiring a detailed knowledge of underlying physical relationships. Limited or no information currently exists quantifying the ability of ANN models to identify contrasting tillage practices from remote sensing data. In this study, a set of Landsat TM-based ANN models was developed to identify contrasting tillage practices in the Texas High Plains. Observed tillage data from Moore and Ochiltree Counties were used to develop and evaluate the models, respectively. The overall classification accuracy for the 15 models developed with the Moore County dataset varied from 74% to 91%. Statistical evaluation of these models against the Ochiltree County dataset produced results with an overall classification accuracy varying from 66% to 80%. The ANN models based on TM band 5 or indices of TM band 5 may provide consistent and accurate tillage information when applied to the Texas High Plains.

  11. Mapping of multi-floor buildings: A barometric approach

    DEFF Research Database (Denmark)

    Özkil, Ali Gürcan; Fan, Zhun; Xiao, Jizhong

    2011-01-01

    This paper presents a new method for mapping multi-floor buildings. The method combines a laser range sensor for metric mapping and a barometric pressure sensor for detecting floor transitions and map segmentation. We exploit the fact that the barometric pressure is a function of the elevation, and it varies between different floors. The method is tested with a real robot in a typical indoor environment, and the results show that physically consistent multi-floor representations are achievable.
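
    The floor-transition idea rests on the hypsometric relation between pressure and elevation. A minimal sketch, assuming an isothermal atmosphere and a nominal 3.5 m floor height (both assumptions for illustration, not values from the paper):

        import math

        R, M, G = 8.314, 0.0289644, 9.80665   # gas constant, molar mass of air, g

        def elevation_change(p_ref, p, temp_k=293.0):
            """Hypsometric relation: height difference implied by two pressures."""
            return (R * temp_k) / (G * M) * math.log(p_ref / p)

        def floor_of(p_ground, p, floor_height=3.5):
            """Quantize the barometric elevation into a floor index."""
            return round(elevation_change(p_ground, p) / floor_height)

        print(floor_of(101325.0, 101200.0))    # about 10.6 m up -> floor 3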

  12. A Hierarchical and Distributed Approach for Mapping Large Applications to Heterogeneous Grids using Genetic Algorithms

    Science.gov (United States)

    Sanyal, Soumya; Jain, Amit; Das, Sajal K.; Biswas, Rupak

    2003-01-01

    In this paper, we propose a distributed approach for mapping a single large application to a heterogeneous grid environment. To minimize the execution time of the parallel application, we distribute the mapping overhead to the available nodes of the grid. This approach not only provides a fast mapping of tasks to resources but is also scalable. We adopt a hierarchical grid model and accomplish the job of mapping tasks to this topology using a scheduler tree. Results show that our three-phase algorithm provides high quality mappings, and is fast and scalable.

  13. Combinatorial theory of the semiclassical evaluation of transport moments. I. Equivalence with the random matrix approach

    Energy Technology Data Exchange (ETDEWEB)

    Berkolaiko, G., E-mail: berko@math.tamu.edu [Department of Mathematics, Texas A and M University, College Station, Texas 77843-3368 (United States); Kuipers, J., E-mail: Jack.Kuipers@physik.uni-regensburg.de [Institut für Theoretische Physik, Universität Regensburg, D-93040 Regensburg (Germany)

    2013-11-15

    To study electronic transport through chaotic quantum dots, there are two main theoretical approaches. One involves substituting the quantum system with a random scattering matrix and performing appropriate ensemble averaging. The other treats the transport in the semiclassical approximation and studies correlations among sets of classical trajectories. There are established evaluation procedures within the semiclassical evaluation that, for several linear and nonlinear transport moments to which they were applied, have always resulted in the agreement with random matrix predictions. We prove that this agreement is universal: any semiclassical evaluation within the accepted procedures is equivalent to the evaluation within random matrix theory. The equivalence is shown by developing a combinatorial interpretation of the trajectory sets as ribbon graphs (maps) with certain properties and exhibiting systematic cancellations among their contributions. Remaining trajectory sets can be identified with primitive (palindromic) factorisations whose number gives the coefficients in the corresponding expansion of the moments of random matrices. The equivalence is proved for systems with and without time reversal symmetry.

  15. Curriculum Mapping across the Disciplines: Differences, Approaches, and Strategies

    Science.gov (United States)

    Rawle, Fiona; Bowen, Tracey; Murck, Barbara; Hong, Rosa Junghwa

    2017-01-01

    Curriculum mapping can be used to document, align, visualize, and assess curricular data, such as learning outcomes, assessment materials, instructional techniques, and student pre- and post-testing scores. A cross-disciplinary Curriculum Mapping Initiative currently underway at the University of Toronto Mississauga aims to: (1) develop guidelines…

  16. RAPID Outcome Mapping Approach Guide now online

    International Development Research Centre (IDRC) Digital Library (Canada)

    27 Apr. 2016 ... of ROMA, an adaptation of IDRC's original outcome mapping research. ROMA: A guide to policy engagement and influence has been released by the Overseas Development Institute (ODI). The publication is the latest result of phase three of IDRC's Outcome Mapping Virtual Learning Community project.

  17. MAP-MRF-Based Super-Resolution Reconstruction Approach for Coded Aperture Compressive Temporal Imaging

    Directory of Open Access Journals (Sweden)

    Tinghua Zhang

    2018-02-01

    Coded Aperture Compressive Temporal Imaging (CACTI) can afford low-cost temporal super-resolution (SR), but limits are imposed by noise and compression ratio on reconstruction quality. To utilize inter-frame redundant information from multiple observations and sparsity in multi-transform domains, a robust reconstruction approach based on maximum a posteriori probability and a Markov random field (MAP-MRF) model for CACTI is proposed. The proposed approach adopts a weighted 3D neighbor system (WNS) and the coordinate descent method to perform joint estimation of model parameters, to achieve the robust super-resolution reconstruction. The proposed multi-reconstruction algorithm considers both total variation (TV) and the ℓ2,1 norm in the wavelet domain to address the minimization problem for compressive sensing, and solves it using an accelerated generalized alternating projection algorithm. The weighting coefficient for different regularizations and frames is resolved by the motion characteristics of pixels. The proposed approach can provide high visual quality in the foreground and background of a scene simultaneously and enhance the fidelity of the reconstruction results. Simulation results have verified the efficacy of our new optimization framework and the proposed reconstruction approach.

  18. Treatment noncompliance in randomized experiments: statistical approaches and design issues.

    Science.gov (United States)

    Sagarin, Brad J; West, Stephen G; Ratnikov, Alexander; Homan, William K; Ritchie, Timothy D; Hansen, Edward J

    2014-09-01

    Treatment noncompliance in randomized experiments threatens the validity of causal inference and the interpretability of treatment effects. This article provides a nontechnical review of 7 approaches: 3 traditional and 4 newer statistical analysis strategies. Traditional approaches include (a) intention-to-treat analysis (which estimates the effects of treatment assignment irrespective of treatment received), (b) as-treated analysis (which reassigns participants to groups reflecting the treatment they actually received), and (c) per-protocol analysis (which drops participants who did not comply with their assigned treatment). Newer approaches include (d) the complier average causal effect (which estimates the effect of treatment on the subpopulation of those who would comply with their assigned treatment), (e) dose-response estimation (which uses degree of compliance to stratify participants, producing an estimate of a dose-response relationship), (f) propensity score analysis (which uses covariates to estimate the probability that individual participants will comply, enabling estimates of treatment effects at different propensities), and (g) treatment effect bounding (which calculates a range of possible treatment effects applicable to both compliers and noncompliers). The discussion considers the areas of application, the quantity estimated, the underlying assumptions, and the strengths and weaknesses of each approach. PsycINFO Database Record © 2014 APA, all rights reserved.

  19. Stakeholder approach, Stakeholders mental model: A visualization test with cognitive mapping technique

    OpenAIRE

    Garoui Nassreddine; Jarboui Anis

    2012-01-01

    The idea of this paper is to determine the mental models of actors in the firm with respect to the stakeholder approach to corporate governance. Cognitive maps are used to visualize these mental models and to show the actors' ways of thinking about, and conceptualization of, the stakeholder approach. The paper takes a corporate governance perspective and discusses the stakeholder model, employing a cognitive mapping technique.

  20. Monte Carlo Random-Walk Experiments as a Test of Chaotic Orbits of Maps of the Interval

    Science.gov (United States)

    Arneodo, A.; Sornette, D.

    1984-05-01

    We have performed Monte Carlo random-walk experiments on a one-dimensional periodic lattice with a trapping site using the logistic map as a generator of pseudorandom numbers. Comparison with analytical results indicates that, when it exhibits sensitive dependence on initial conditions, this map provides a true pseudorandom generator.
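
    A minimal sketch of the experiment's ingredients: iterate the logistic map in its fully chaotic regime (mu = 4) and threshold each iterate to choose the step of a walk on a periodic lattice with a trap. The threshold at 0.5 and the lattice size are illustrative choices, not the paper's settings:

        def logistic(x, mu=4.0):
            """Fully chaotic logistic map, used as a pseudorandom source."""
            while True:
                x = mu * x * (1.0 - x)
                yield x

        def steps_until_trapped(x0=0.1234, size=20, trap=0, limit=10**6):
            """Walk +/-1 on a periodic lattice until the trapping site is hit,
            with each step's sign chosen by thresholding a logistic iterate."""
            pos, n = size // 2, 0
            for x in logistic(x0):
                pos = (pos + (1 if x > 0.5 else -1)) % size
                n += 1
                if pos == trap or n >= limit:
                    break
            return n

        print(steps_until_trapped())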

  1. A mapping approach for distortion correction in sinusoidally scanned images

    Science.gov (United States)

    Khoury, J.; Woods, C. L.; Haji-saeed, Bahareh; Pyburn, Dana; Sengupta, Sandip K.; Kierstead, J.

    2006-04-01

    We have developed a mapping algorithm for correcting the distortions in sinusoidally scanned images. Our algorithm is based on an approximate relationship between linear and sinusoidal scanning. Straightforward implementation of this algorithm showed that the mapped image has either missing lines or redundant lines. The missing lines were filled by fusing the mapped image with its median-filtered version. The implementation of this algorithm shows that it is possible to retrieve up to 96.43% of the original image, as measured by the recovered energy.
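
    One common form of this remap, assumed here rather than taken from the paper, inverts the scan in closed form: if the beam position follows half a cosine period across the width, the recorded column holding linear position u is roughly (w/pi)*arccos(1 - 2u/w). A minimal gather-style sketch; note that the forward, scatter-style version of the same map is what produces the missing or redundant lines the authors fill by median fusion:

        import numpy as np

        def desinusoid(img):
            """Resample columns of a sinusoidally scanned image onto a linear
            grid, assuming the sweep spans half a cosine period of the width."""
            h, w = img.shape
            u = np.arange(w)                   # target (linearly spaced) columns
            src = ((w - 1) / np.pi) * np.arccos(1.0 - 2.0 * u / (w - 1))
            src = np.clip(np.rint(src).astype(int), 0, w - 1)
            return img[:, src]                 # gather nearest recorded columns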

  2. On the design of henon and logistic map-based random number generator

    Science.gov (United States)

    Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah

    2017-10-01

    The key sequence is one of the main elements in a cryptosystem. The True Random Number Generator (TRNG) method is one approach to generating the key sequence. The randomness sources of TRNGs are divided into three main groups, i.e. electrical-noise based, jitter based and chaos based. The chaos-based approach utilizes a non-linear dynamic system (continuous time or discrete time) as an entropy source. In this study, a new TRNG design based on a discrete-time chaotic system is proposed, which is then simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as a harvester method to obtain the series of random bits. Without any post-processing, the proposed design generated a random bit sequence with a high entropy value and passed all NIST 800.22 statistical tests.
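
    One plausible reading of the comparator harvester, an assumption rather than the paper's exact wiring, is to run the two chaotic maps side by side and emit a bit according to which signal is larger. A minimal sketch with the classic Hénon parameters (a = 1.4, b = 0.3) and a fully chaotic logistic map:

        def henon(x=0.1, y=0.3, a=1.4, b=0.3):
            while True:
                x, y = 1.0 - a * x * x + y, b * x
                yield x

        def logistic(z=0.357, mu=4.0):
            while True:
                z = mu * z * (1.0 - z)
                yield z

        def harvest(n):
            """Comparator harvester: 1 when the rescaled Henon signal exceeds
            the logistic signal, else 0 (rescaling [-1.5, 1.5] -> [0, 1])."""
            h, l = henon(), logistic()
            return [1 if (next(h) + 1.5) / 3.0 > next(l) else 0 for _ in range(n)]

        print("".join(map(str, harvest(64))))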

  3. Multielectrode vs. point-by-point mapping for ventricular tachycardia substrate ablation: a randomized study.

    Science.gov (United States)

    Acosta, Juan; Penela, Diego; Andreu, David; Cabrera, Mario; Carlosena, Alicia; Vassanelli, Francesca; Alarcón, Francisco; Soto-Iglesias, David; Korshunov, Viatcheslav; Borras, Roger; Linhart, Markus; Martínez, Mikel; Fernández-Armenta, Juan; Mont, Lluis; Berruezo, Antonio

    2017-01-08

    Ventricular tachycardia (VT) substrate ablation is based on detailed electroanatomical maps (EAM). This study analyses whether high-density multielectrode mapping (MEM) is superior to conventional point-by-point mapping (PPM) in guiding VT substrate ablation procedures. This was a randomized controlled study (NCT02083016). Twenty consecutive ischemic patients undergoing VT substrate ablation were randomized to either group A [n = 10; substrate mapping performed first by PPM (Navistar) and secondly by MEM (PentaRay); ablation guided by PPM] or group B [n = 10; substrate mapping performed first by MEM and second by PPM; ablation guided by MEM]. Ablation was performed according to the scar-dechanneling technique. Late potential (LP) pairs were defined as a Navistar-LP and a PentaRay-LP located within a three-dimensional distance of ≤ 3 mm. Data obtained from EAM, procedure time, radiofrequency time, and post-ablation VT inducibility were compared between groups. Larger bipolar scar areas were obtained with MEM (55.7±31.7 vs. 50.5±26.6 cm²; P = 0.017). Substrate mapping time was similar with MEM (19.7±7.9 minutes) and PPM (25±9.2 minutes); P = 0.222. No differences were observed in the number of LPs identified within the scar by MEM vs. PPM (73±50 vs. 76±52 LPs per patient, respectively; P = 0.965). A total of 1104 LP pairs were analysed. Using PentaRay, the far-field/LP ratio was significantly lower (0.58±0.4 vs. 1.64±1.1; P = 0.01) and radiofrequency time was shorter [median (interquartile range) 12 (7-20) vs. 22 (17-33) minutes; P = 0.023]. No differences were observed in VT inducibility after the procedure. MEM with the PentaRay catheter provided better discrimination of LPs due to a lower sensitivity for far-field signals. Ablation guided by MEM was associated with a shorter radiofrequency time. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2017. For permissions, please email

  4. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    Science.gov (United States)

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

    Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. Then, we investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized; one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone that were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.
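
    The fuzzy cognitive map machinery behind such an analysis is compact: concepts hold activation levels, and each update squashes the weighted influence of connected concepts through a sigmoid. A minimal sketch with an invented three-concept map (the weights and concept names are illustrative, not the study's):

        import numpy as np

        def fcm_step(a, W, lam=1.0):
            """One fuzzy cognitive map update: each concept's new activation is
            a sigmoid-squashed weighted sum of its incoming influences."""
            return 1.0 / (1.0 + np.exp(-lam * (W.T @ a)))

        # Invented example: governance -> infrastructure -> environment.
        W = np.array([[0.0, 0.7, 0.0],
                      [0.0, 0.0, 0.6],
                      [0.0, 0.0, 0.0]])
        a = np.array([1.0, 0.0, 0.0])
        for _ in range(20):                    # iterate to a steady activation state
            a = fcm_step(a, W)
        print(a.round(3))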

  5. Transboundary aquifer mapping and management in Africa: a harmonised approach

    Science.gov (United States)

    Altchenko, Yvan; Villholth, Karen G.

    2013-11-01

    Recent attention to transboundary aquifers (TBAs) in Africa reflects the growing importance of these resources for development in the continent. However, relatively little research on these aquifers and their best management strategies has been published. This report recapitulates progress on mapping and management frameworks for TBAs in Africa. The world map on transboundary aquifers presented at the 6th World Water Forum in 2012 identified 71 TBA systems in Africa. This report presents an updated African TBA map including 80 shared aquifers and aquifer systems superimposed on 63 international river basins. Furthermore, it proposes a new nomenclature for the mapping based on three sub-regions, reflecting the leading regional development communities. The map shows that TBAs represent approximately 42 % of the continental area and 30 % of the population. Finally, a brief review of current international law, specific bi- or multilateral treaties, and TBA management practice in Africa reveals few documented international conflicts over TBAs. The existing or upcoming international river and lake basin organisations offer a harmonised institutional base for TBA management while alternative or supportive models involving the regional development communities are also required. The proposed map and geographical classification scheme for TBAs facilitates identification of options for joint institutional setups.

  6. How Albot0 finds its way home: a novel approach to cognitive mapping using robots.

    Science.gov (United States)

    Yeap, Wai K

    2011-10-01

    Much of what we know about cognitive mapping comes from observing how biological agents behave in their physical environments, and several of these ideas were implemented on robots, imitating such a process. In this paper a novel approach to cognitive mapping is presented whereby robots are treated as a species of their own and their cognitive mapping is investigated. Such robots are referred to as Albots. The design of the first Albot, Albot0, is presented. Albot0 computes an imprecise map and employs a novel method to find its way home. Both the map and the return-home algorithm exhibited characteristics commonly found in biological agents. What we have learned from Albot0's cognitive mapping is discussed. One major lesson is that the spatiality in a cognitive map affords us rich and useful information and this argues against recent suggestions that the notion of a cognitive map is not a useful one. Copyright © 2011 Cognitive Science Society, Inc.

  7. A Random Walk Approach to Query Informative Constraints for Clustering.

    Science.gov (United States)

    Abin, Ahmad Ali

    2017-08-09

    This paper presents a random walk approach to the problem of querying informative constraints for clustering. The proposed method is based on the properties of the commute time, that is, the expected time taken for a random walk to travel between two nodes and return, on the adjacency graph of the data. Commute time has the nice property that the more short paths connect two given nodes in a graph, the more similar those nodes are. Since computing the commute time takes the Laplacian eigenspectrum into account, we use this property in a recursive fashion to query informative constraints for clustering. At each recursion, the proposed method constructs the adjacency graph of the data and utilizes the spectral properties of the commute time matrix to bipartition the adjacency graph. Thereafter, the proposed method benefits from the commute time distance on the graph to query informative constraints between partitions. This process iterates for each partition until the stop condition becomes true. Experiments on real-world data show the efficiency of the proposed method for constraint selection.
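
    The commute time can be computed directly from the Moore-Penrose pseudoinverse of the graph Laplacian: C_ij = vol(G) * (L+_ii + L+_jj - 2 L+_ij). A minimal sketch on a toy adjacency matrix:

        import numpy as np

        def commute_times(A):
            """Commute-time matrix of a graph from its adjacency matrix A,
            via the pseudoinverse of the Laplacian:
            C_ij = vol(G) * (L+_ii + L+_jj - 2 * L+_ij)."""
            d = A.sum(axis=1)
            L = np.diag(d) - A
            Lp = np.linalg.pinv(L)             # Moore-Penrose pseudoinverse
            diag = np.diag(Lp)
            return d.sum() * (diag[:, None] + diag[None, :] - 2.0 * Lp)

        A = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], float)
        print(commute_times(A).round(2))       # short commute time = similar nodes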

  8. A martingale approach for the elephant random walk

    Science.gov (United States)

    Bercu, Bernard

    2018-01-01

    The purpose of this paper is to establish, via a martingale approach, some refinements on the asymptotic behavior of the one-dimensional elephant random walk (ERW). The asymptotic behavior of the ERW mainly depends on a memory parameter p which lies between zero and one. This behavior is totally different in the diffusive regime 0 ≤ p < 3/4, the critical regime p = 3/4, and the superdiffusive regime 3/4 < p ≤ 1. In the diffusive and critical regimes, we establish some new results on the almost sure asymptotic behavior of the ERW, such as the quadratic strong law and the law of the iterated logarithm. In the superdiffusive regime, we provide the first rigorous mathematical proof that the limiting distribution of the ERW is not Gaussian.
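
    The walk itself is simple to simulate: each new step repeats a uniformly chosen past step with probability p and reverses it with probability 1 - p. A minimal sketch (the first-step bias q and the sample sizes are illustrative):

        import random

        def elephant_walk(n, p, q=0.5):
            """Position after n steps of an elephant random walk with memory p."""
            steps = [1 if random.random() < q else -1]   # first step: +1 w.p. q
            for _ in range(n - 1):
                past = random.choice(steps)              # recall a uniform past step
                steps.append(past if random.random() < p else -past)
            return sum(steps)

        # Superdiffusive regime p > 3/4: displacements grow faster than sqrt(n).
        print([elephant_walk(10000, 0.9) for _ in range(5)])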

  9. Soil map disaggregation improved by soil-landscape relationships, area-proportional sampling and random forest implementation

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Malone, Brendan P.; Odgers, Nathan

    … of European Communities (CEC, 1985), respectively, both using the FAO 1974 classification. Furthermore, the effects of implementing soil-landscape relationships, using area-proportional sampling instead of per-polygon sampling, and replacing the default C5.0 classification tree algorithm with a random forest algorithm were evaluated. The resulting maps were validated on 777 soil profiles situated in a grid covering Denmark. The experiments showed that the results obtained with Jacobsen's map were more accurate than the results obtained with the CEC map, despite a nominally coarser scale of 1:2,000,000 vs. 1:1,000,000. This finding is probably related to the fact that Jacobsen's map was more detailed, with a larger number of polygons, soil map units and soil types, despite its coarser scale. The results showed that the implementation of soil-landscape relationships, area-proportional sampling and the random forest…

  10. Comparative Assessment of Three Nonlinear Approaches for Landslide Susceptibility Mapping in a Coal Mine Area

    Directory of Open Access Journals (Sweden)

    Qiaomei Su

    2017-07-01

    Landslide susceptibility mapping is the first and most important step involved in landslide hazard assessment. The purpose of the present study is to compare three nonlinear approaches for landslide susceptibility mapping and test whether coal mining has a significant impact on landslide occurrence in coal mine areas. Landslide data collected by the Bureau of Land and Resources are represented by the X, Y coordinates of their central points; causative factors were calculated from topographic and geologic maps, as well as satellite imagery. The five-fold cross-validation method was adopted and the landslide/non-landslide datasets were randomly split into a ratio of 80:20. From this, five subsets were acquired 20 times for training and validating models by GIS geostatistical analysis methods, and all of the subsets were employed in a spatially balanced sample design. Three landslide models were built using support vector machine (SVM), logistic regression (LR), and artificial neural network (ANN) models by selecting the median of the performance measures. Then, the three fitted models were compared using the area under the receiver operating characteristic (ROC) curves (AUC) and the performance measures. The results show that the prediction accuracies are between 73.43% and 87.45% in the training stage, and 67.16% to 73.13% in the validating stage for the three models. AUCs vary from 0.807 to 0.906 and 0.753 to 0.944 in the two stages, respectively. Additionally, three landslide susceptibility maps were obtained by classifying the range of landslide probabilities into four classes representing low (0–0.02), medium (0.02–0.1), high (0.1–0.85), and very high (0.85–1) probabilities of landslides. For the distributions of landslide and area percentages under different susceptibility standards, the SVM model has more relative balance in the four classes compared to the LR and the ANN models. The result reveals that the SVM model possesses better
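
    The modelling comparison described here, three classifiers evaluated on a held-out 20% split by AUC, is straightforward to reproduce in outline. A minimal sketch on synthetic stand-in data (not the study's landslide inventory), using scikit-learn:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import roc_auc_score

        # Synthetic stand-in for causative factors (slope, lithology, ...).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 8))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.8, 1000) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                                  random_state=0)
        models = {
            "SVM": SVC(probability=True, random_state=0),
            "LR": LogisticRegression(max_iter=1000),
            "ANN": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                 random_state=0),
        }
        for name, model in models.items():
            prob = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
            print(name, "AUC:", round(roc_auc_score(y_te, prob), 3))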

  11. Evaluation of digital soil mapping approaches with large sets of environmental covariates

    Directory of Open Access Journals (Sweden)

    M. Nussbaum

    2018-01-01

    The spatial assessment of soil functions requires maps of basic soil properties. Unfortunately, these are either missing for many regions or are not available at the desired spatial resolution or down to the required soil depth. The field-based generation of large soil datasets and conventional soil maps remains costly. Meanwhile, legacy soil data and comprehensive sets of spatial environmental data are available for many regions. Digital soil mapping (DSM) approaches relating soil data (responses) to environmental data (covariates) face the challenge of building statistical models from large sets of covariates originating, for example, from airborne imaging spectroscopy or multi-scale terrain analysis. We evaluated six approaches for DSM in three study regions in Switzerland (Berne, Greifensee, ZH forest) by mapping the effective soil depth available to plants (SD), pH, soil organic matter (SOM), effective cation exchange capacity (ECEC), clay, silt, gravel content and fine fraction bulk density for four soil depths (totalling 48 responses). Models were built from 300–500 environmental covariates by selecting linear models through (1) grouped lasso and (2) an ad hoc stepwise procedure for robust external-drift kriging (georob). For (3) geoadditive models we selected penalized smoothing spline terms by component-wise gradient boosting (geoGAM). We further used two tree-based methods: (4) boosted regression trees (BRTs) and (5) random forest (RF). Lastly, we computed (6) weighted model averages (MAs) from the predictions obtained from methods 1–5. Lasso, georob and geoGAM successfully selected strongly reduced sets of covariates (subsets of 3–6 % of all covariates). Differences in predictive performance, tested on independent validation data, were mostly small and did not reveal a single best method for 48 responses. Nevertheless, RF was often the best among methods 1–5 (28 of 48 responses), but was outcompeted by MA for 14 of these 28 responses.

  13. ELEMENTARY APPROACH TO SELF-ASSEMBLY AND ELASTIC PROPERTIES OF RANDOM COPOLYMERS

    Energy Technology Data Exchange (ETDEWEB)

    S. M. CHITANVIS

    2000-10-01

    The authors have mapped the physics of a system of random copolymers onto a time-dependent density functional-type field theory using techniques of functional integration. Time in the theory is merely a label for the location of a given monomer along the extent of a flexible chain. We derive heuristically within this approach a non-local constraint which prevents segments on chains in the system from straying too far from each other, and leads to self-assembly. The structure factor is then computed in a straightforward fashion. The long wave-length limit of the structure factor is used to obtain the elastic modulus of the network. It is shown that there is a surprising competition between the degree of micro-phase separation and the elastic moduli of the system.

  14. Partnering with Youth to Map Their Neighborhood Environments: A Multi-Layered GIS Approach

    Science.gov (United States)

    Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T.; Raleigh, Kevin; Miller-Francis, Jenni

    2014-01-01

    Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multi-layered approach for gaining local knowledge of neighborhood environments that engages youth as co-researchers and active knowledge producers. By integrating geographic information systems (GIS) with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youth. Youth report safety and a lack of recreational resources as inhibiting physical activity. Maps reflecting youth perceptions aid policy-makers in making place-based improvements for youth neighborhood environments. PMID:25423245

  15. Partnering with youth to map their neighborhood environments: a multilayered GIS approach.

    Science.gov (United States)

    Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T; Raleigh, Kevin; Miller-Francis, Jenni

    2015-01-01

    Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multilayered approach for gaining local knowledge of neighborhood environments that engages youths as coresearchers and active knowledge producers. By integrating geographic information systems with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youths. Youths report safety and a lack of recreational resources as inhibiting physical activity. Maps reflecting youth perceptions aid policy makers in making place-based improvements for youth neighborhood environments.

  16. Raman mapping of oral buccal mucosa: a spectral histopathology approach

    Science.gov (United States)

    Behl, Isha; Kukreja, Lekha; Deshmukh, Atul; Singh, S. P.; Mamgain, Hitesh; Hole, Arti R.; Krishna, C. Murali

    2014-12-01

    Oral cancer is one of the most common cancers worldwide. One-fifth of the world's oral cancer subjects are from India and other South Asian countries. The present Raman mapping study was carried out to understand biochemical variations in normal and malignant oral buccal mucosa. Data were acquired using a WITec alpha 300R instrument from 10 normal and 10 tumor unstained tissue sections. Raman maps of normal sections could resolve the layers of epithelium, i.e. basal, intermediate, and superficial. Inflammatory, tumor, and stromal regions are distinctly depicted on Raman maps of tumor sections. Mean and difference spectra of basal and inflammatory cells suggest an abundance of DNA and carotenoid features. Strong cytochrome bands are observed in the intermediate layers of normal sections and the stromal regions of tumor sections. Epithelium and stromal regions of normal sections are classified by principal component analysis. Classification among cellular components of normal and tumor sections is also observed. Thus, the findings of the study further support the applicability of Raman mapping for providing molecular-level insights in normal and malignant conditions.

  17. The Facebook Influence Model: A Concept Mapping Approach

    Science.gov (United States)

    Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M.

    2013-01-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts. PMID:23621717

  18. Cognitions of Expert Supervisors in Academe: A Concept Mapping Approach

    Science.gov (United States)

    Kemer, Gülsah; Borders, L. DiAnne; Willse, John

    2014-01-01

    Eighteen expert supervisors reported their thoughts while preparing for, conducting, and evaluating their supervision sessions. Concept mapping (Kane & Trochim, 2007) yielded 195 cognitions classified into 25 cognitive categories organized into 5 supervision areas: conceptualization of supervision, supervisee assessment,…

  19. A National Approach to Map and Quantify Terrestrial Vertebrate Biodiversity

    Science.gov (United States)

    Biodiversity is crucial for the functioning of ecosystems and the products and services from which we transform natural assets of the Earth for human survival, security, and well-being. The ability to assess, report, map, and forecast the life support functions of ecosystems is a...

  20. Mapping of health facilities in Jimeta Metropolis: a digital approach ...

    African Journals Online (AJOL)

    Two sets of data were acquired and used: cartographic data and attribute data. Scanning was carried out using CorelDraw 12, while georeferencing and digitizing were done using the ILWIS 3.1 Academic software package. A digital map showing the spatial distribution of health facilities in Jimeta metropolis was produced; this ...

  1. Mapping the Dabus Wetlands, Ethiopia, Using Random Forest Classification of Landsat, PALSAR and Topographic Data

    Directory of Open Access Journals (Sweden)

    Pierre Dubeau

    2017-10-01

    The Dabus Wetland complex in the highlands of Ethiopia is within the headwaters of the Nile Basin and is home to significant ecological communities and rare or endangered species. Its many interrelated wetland types undergo seasonal and longer-term changes due to weather and climate variations as well as anthropogenic land use such as grazing and burning. Mapping and monitoring of these wetlands has not been previously undertaken due primarily to their relative isolation and lack of resources. This study investigated the potential of remote sensing based classification for mapping the primary vegetation groups in the Dabus Wetlands using a combination of dry and wet season data, including optical (Landsat spectral bands and derived vegetation and wetness indices), radar (ALOS PALSAR L-band backscatter), and elevation (SRTM-derived DEM and other terrain metrics) as inputs to the non-parametric Random Forest (RF) classifier. Eight wetland types and three terrestrial/upland classes were mapped using field samples of observed plant community composition and structure groupings as reference information. Various tests to compare results using different RF input parameters and data types were conducted. A combination of multispectral optical, radar and topographic variables provided the best overall classification accuracy, 94.4% and 92.9% for the dry and wet season, respectively. Spectral and topographic data (radar data excluded) performed nearly as well, while accuracies using only radar and topographic data were 82–89%. Relatively homogeneous classes such as Papyrus Swamps, Forested Wetland, and Wet Meadow yielded the highest accuracies while spatially complex classes such as Emergent Marsh were more difficult to accurately classify. The methods and results presented in this paper can serve as a basis for development of long-term mapping and monitoring of these and other non-forested wetlands in Ethiopia and other similar environmental settings.

  2. Soil erodibility mapping using three approaches in the Tangiers province –Northern Morocco

    Directory of Open Access Journals (Sweden)

    Hamza Iaaich

    2016-09-01

    Soil erodibility is a key factor in assessing soil loss rates. In fact, soil loss is the most common form of land degradation in Morocco, affecting vulnerable rural and urban areas. This work deals with large scale mapping of soil erodibility using three mapping approaches: (i) the CORINE approach developed for Europe by the JRC; (ii) the UNEP/FAO approach developed within the frame of the United Nations Environmental Program for the Mediterranean area; (iii) the Universal Soil Loss Equation (USLE) K factor. Our study zone is the province of Tangiers, North-West of Morocco. For each approach, we mapped and analyzed different erodibility factors in terms of parent material, topography and soil attributes. The thematic maps were then integrated using a Geographic Information System to elaborate a soil erodibility map for each of the three approaches. Finally, the validity of each approach was checked in the field, focusing on highly eroded areas, by confronting the estimated soil erodibility and the erosion state as observed in the field. We used three statistical indicators for validation: overall accuracy, weighted Kappa factor and omission/commission errors. We found that the UNEP/FAO approach, based principally on lithofacies and topography as mapping inputs, is the best adapted for the case of our study zone, followed by the CORINE approach. The USLE K factor underestimated the soil erodibility, especially for highly eroded areas.

  3. Random Vector and Matrix Theories: A Renormalization Group Approach

    Science.gov (United States)

    Zinn-Justin, Jean

    2014-09-01

    Random matrices in the large N expansion and the so-called double scaling limit can be used as toy models for quantum gravity: 2D quantum gravity coupled to conformal matter. This has generated a tremendous expansion of random matrix theory, tackled with increasingly sophisticated mathematical methods, and a number of matrix models have been solved exactly. However, the somewhat paradoxical situation is that either models can be solved exactly or little can be said. Since the solved models display critical points and universal properties, it is tempting to use renormalization group ideas to determine universal properties, without solving models explicitly. Initiated by Brézin and Zinn-Justin, the approach has led to encouraging results, first for matrix integrals and then quantum mechanics with matrices, but has not yet become a universal tool as initially hoped. In particular, general quantum field theories with matrix fields require more detailed investigations. To better understand some of the encountered difficulties, we first apply analogous ideas to the simpler O(N) symmetric vector models, models that can be solved quite generally in the large N limit. Unlike other attempts, our method is a close extension of Brézin and Zinn-Justin. Discussing vector and matrix models with a similar approximation scheme, we notice that in all cases (vector and matrix integrals, vector and matrix path integrals in the local approximation), at leading order, non-trivial fixed points satisfy the same universal algebraic equation, and this is the main result of this work. However, its precise meaning and role have still to be better understood.

  4. Comparison of validity of mapping between drug indications and ICD-10. Direct and indirect terminology based approaches

    National Research Council Canada - National Science Library

    Choi, Y; Jung, C; Chae, Y; Kang, M; Kim, J; Joung, K; Lim, J; Cho, S; Sung, S; Lee, E; Kim, S

    2014-01-01

    … This study was undertaken to compare the validity of a direct mapping approach and an indirect terminology-based mapping approach of drug indications against the gold standard drawn from the results...

  5. DREAM - A Novel Approach for Robust, Ultra-Fast, Multi-Slice B1 Mapping

    NARCIS (Netherlands)

    Nehrke, K.; Boernert, P.

    2012-01-01

    Fast and robust in vivo B1 mapping is an essential prerequisite for quantitative MRI or multi-element transmit applications like RF-shimming or accelerated multi-dimensional RF pulses. However, especially at higher field strength, the acquisition speed of current B1-mapping approaches is

  6. A Time Sequence-Oriented Concept Map Approach to Developing Educational Computer Games for History Courses

    Science.gov (United States)

    Chu, Hui-Chun; Yang, Kai-Hsiang; Chen, Jing-Hong

    2015-01-01

    Concept maps have been recognized as an effective tool for students to organize their knowledge; however, in history courses, it is important for students to learn and organize historical events according to the time of their occurrence. Therefore, in this study, a time sequence-oriented concept map approach is proposed for developing a game-based…

  7. Evaluating the Use of an Object-Based Approach to Lithological Mapping in Vegetated Terrain

    Directory of Open Access Journals (Sweden)

    Stephen Grebby

    2016-10-01

    Remote sensing-based approaches to lithological mapping are traditionally pixel-oriented, with classification performed on either a per-pixel or sub-pixel basis with complete disregard for contextual information about neighbouring pixels. However, intra-class variability due to heterogeneous surface cover (i.e., vegetation and soil) or regional variations in mineralogy and chemical composition can result in the generation of unrealistic, generalised lithological maps that exhibit the "salt-and-pepper" artefact of spurious pixel classifications, as well as poorly defined contacts. In this study, an object-based image analysis (OBIA) approach to lithological mapping is evaluated with respect to its ability to overcome these issues by instead classifying groups of contiguous pixels (i.e., objects). Due to significant vegetation cover in the study area, the OBIA approach incorporates airborne multispectral and LiDAR data to indirectly map lithologies by exploiting associations with both topography and vegetation type. The resulting lithological maps were assessed both in terms of their thematic accuracy and ability to accurately delineate lithological contacts. The OBIA approach is found to be capable of generating maps with an overall accuracy of 73.5% through integrating spectral and topographic input variables. When compared to equivalent per-pixel classifications, the OBIA approach achieved thematic accuracy increases of up to 13.1%, whilst also reducing the "salt-and-pepper" artefact to produce more realistic maps. Furthermore, the OBIA approach was also generally capable of mapping lithological contacts more accurately. The importance of optimising the segmentation stage of the OBIA approach is also highlighted. Overall, this study clearly demonstrates the potential of OBIA for lithological mapping applications, particularly in significantly vegetated and heterogeneous terrain.

  8. Concept Mapping as an Approach to Facilitate Participatory Intervention Building.

    Science.gov (United States)

    L Allen, Michele; Schaleben-Boateng, Dane; Davey, Cynthia S; Hang, Mikow; Pergament, Shannon

    2015-01-01

    A challenge to addressing community-defined need through community-based participatory intervention building is ensuring that all collaborators' opinions are represented. Concept mapping integrates perspectives of individuals with differing experiences, interests, or expertise into a common visually depicted framework, and ranks composite views on importance and feasibility. This article describes the use of concept mapping to facilitate participatory intervention building for a school-based, teacher-focused, positive youth development (PYD) promotion program for Latino, Hmong, and Somali youth. Participants were teachers, administrators, youth, parents, youth workers, and community and university researchers on the project's community collaborative board. We incorporated previously collected qualitative data into the process. In a mixed-methods process we 1) generated statements based on key informant interview and focus group data from youth workers, teachers, parents, and youth in multiple languages regarding ways teachers promote PYD for Somali, Latino and Hmong youth; 2) guided participants to individually sort statements into meaningful groupings and rate them by importance and feasibility; 3) mapped the statements based on their relation to each other using multivariate statistical analyses to identify concepts, and as a group identified labels for each concept; and 4) used labels and statement ratings to identify feasible and important concepts as priorities for intervention development. We identified 12 concepts related to PYD promotion in schools and prioritized 8 for intervention development. Concept mapping facilitated participatory intervention building by formally representing all participants' opinions, generating a visual representation of group thinking, and supporting priority setting. Use of prior qualitative work increased the diversity of viewpoints represented.

  9. Treatment decisions for localized prostate cancer: a concept mapping approach.

    Science.gov (United States)

    McFall, Stephanie L; Mullen, Patricia D; Byrd, Theresa L; Cantor, Scott B; Le, Yen-Chi; Torres-Vigil, Isabel; Pettaway, Curtis; Volk, Robert J

    2015-12-01

    Few decision aids emphasize active surveillance (AS) for localized prostate cancer. Concept mapping was used to produce a conceptual framework incorporating AS and treatment. Fifty-four statements about what men need to make a decision for localized prostate cancer were derived from focus groups with African American, Latino and white men previously screened for prostate cancer and partners (n = 80). In the second phase, 89 participants sorted and rated the importance of statements. An eight-cluster map was produced for the overall sample. Clusters were labelled Doctor-patient exchange, Big picture comparisons, Weighing the options, Seeking and using information, Spirituality and inner strength, Related to active treatment, Side-effects and Family concerns. A major division was between medical and home-based clusters. Ethnic groups and genders had similar sorting, but some variation in importance. Latinos rated Big picture comparisons as less important. African Americans saw Spirituality and inner strength most important, followed by Latinos, then whites. Ethnic- and gender-specific concept maps were not analysed because of high similarity in their sorting patterns. We identified a conceptual framework for management of early-stage prostate cancer that included coverage of AS. Eliciting the conceptual framework is an important step in constructing decision aids which will address gaps related to AS. © 2014 John Wiley & Sons Ltd.

  10. A local segmentation parameter optimization approach for mapping heterogeneous urban environments using VHR imagery

    Science.gov (United States)

    Grippa, Tais; Georganos, Stefanos; Lennert, Moritz; Vanhuysse, Sabine; Wolff, Eléonore

    2017-10-01

    Mapping large heterogeneous urban areas using object-based image analysis (OBIA) remains challenging, especially with respect to the segmentation process. This can be explained both by the complex arrangement of heterogeneous land-cover classes and by the high diversity of urban patterns encountered throughout the scene. In this context, a single segmentation parameter may be unable to produce satisfying segmentation results for the whole scene. Nonetheless, it is possible to subdivide the whole city into smaller local zones that are rather homogeneous in their urban pattern. These zones can then be used to optimize the segmentation parameter locally, instead of using the whole image or a single representative spatial subset. This paper assesses the contribution of a local approach to segmentation parameter optimization compared with a global approach. Ouagadougou, located in sub-Saharan Africa, is used as the case study. First, the whole scene is segmented using a single globally optimized segmentation parameter. Second, the city is subdivided into 283 local zones, homogeneous in terms of building size and building density. Each local zone is then segmented using a locally optimized segmentation parameter. Unsupervised segmentation parameter optimization (USPO), relying on an optimization function that tends to maximize both intra-object homogeneity and inter-object heterogeneity, is used to select the segmentation parameter automatically for both approaches. Finally, a land-use/land-cover classification is performed using the Random Forest (RF) classifier. The results reveal that the local approach outperforms the global one, especially by limiting confusion between buildings and their bare-soil neighbors.
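
    As a rough illustration of the USPO idea, the sketch below (Python; uspo_select and its inputs are hypothetical names, and the candidate label maps are assumed to have been produced beforehand by any OBIA segmenter at several parameter values) scores each candidate by combining normalized intra-object homogeneity with inter-object heterogeneity:

        import numpy as np

        def uspo_select(image, candidates):
            # candidates: {parameter value: integer label map, same shape as image}
            params, intra, inter = [], [], []
            for p, labels in candidates.items():
                ids = np.unique(labels)
                sizes = np.array([(labels == i).sum() for i in ids])
                variances = np.array([image[labels == i].var() for i in ids])
                means = np.array([image[labels == i].mean() for i in ids])
                params.append(p)
                intra.append((sizes * variances).sum() / sizes.sum())  # weighted variance
                inter.append(means.var())                              # spread of object means
            norm = lambda a: (a - a.min()) / (a.max() - a.min() + 1e-12)
            score = (1 - norm(np.array(intra))) + norm(np.array(inter))
            return params[int(np.argmax(score))]

    In a local variant of this selection, the same scoring is simply repeated per zone rather than once for the whole scene.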

  11. Stakeholder approach, Stakeholders mental model: A visualization test with cognitive mapping technique

    Directory of Open Access Journals (Sweden)

    Garoui Nassreddine

    2012-04-01

    Full Text Available The aim of this paper is to determine the mental models of actors in the firm with respect to the stakeholder approach to corporate governance. Cognitive maps are used to visualize these models and to show how actors think about and conceptualize the stakeholder approach. The paper takes a corporate governance perspective and discusses the stakeholder model, using a cognitive mapping technique.

  12. A new fate mapping system reveals context-dependent random or clonal expansion of microglia.

    Science.gov (United States)

    Tay, Tuan Leng; Mai, Dominic; Dautzenberg, Jana; Fernández-Klett, Francisco; Lin, Gen; Sagar; Datta, Moumita; Drougard, Anne; Stempfl, Thomas; Ardura-Fabregat, Alberto; Staszewski, Ori; Margineanu, Anca; Sporbert, Anje; Steinmetz, Lars M; Pospisilik, J Andrew; Jung, Steffen; Priller, Josef; Grün, Dominic; Ronneberger, Olaf; Prinz, Marco

    2017-06-01

    Microglia constitute a highly specialized network of tissue-resident immune cells that is important for the control of tissue homeostasis and the resolution of diseases of the CNS. Little is known about how their spatial distribution is established and maintained in vivo. Here we establish a new multicolor fluorescence fate mapping system to monitor microglial dynamics during steady state and disease. Our findings suggest that microglia establish a dense network with regional differences, and the high regional turnover rates we found challenge the concept of universal microglial longevity. Microglial self-renewal under steady-state conditions constitutes a stochastic process. During pathology this randomness shifts to selected clonal microglial expansion. In the resolution phase, excess disease-associated microglia are removed by a dual mechanism of cell egress and apoptosis to re-establish the stable microglial network. This study unravels the dynamic yet discrete self-organization of mature microglia in the healthy and diseased CNS.

  13. Comparative Performance Analysis of a Hyper-Temporal Ndvi Analysis Approach and a Landscape-Ecological Mapping Approach

    Science.gov (United States)

    Ali, A.; de Bie, C. A. J. M.; Scarrott, R. G.; Ha, N. T. T.; Skidmore, A. K.

    2012-07-01

    Both agricultural area expansion and intensification are necessary to cope with the growing demand for food, and the growing threat of food insecurity which is rapidly engulfing poor and under-privileged sections of the global population. Therefore, it is of paramount importance to have the ability to accurately estimate crop area and spatial distribution. Remote sensing has become a valuable tool for estimating and mapping cropland areas, useful in food security monitoring. This work contributes to addressing this broad issue, focusing on the comparative performance analysis of two mapping approaches: (i) a hyper-temporal Normalized Difference Vegetation Index (NDVI) analysis approach and (ii) a Landscape-ecological approach. The hyper-temporal NDVI analysis approach utilized SPOT 10-day NDVI imagery from April 1998-December 2008, whilst the Landscape-ecological approach used multitemporal Landsat-7 ETM+ imagery acquired intermittently between 1992 and 2002. Pixels in the time-series NDVI dataset were clustered using an ISODATA clustering algorithm adapted to determine the optimal number of pixel clusters to successfully generalize hyper-temporal datasets. Clusters were then characterized with crop cycle information and flooding information to produce an NDVI unit map of rice classes with flood regime and NDVI profile information. A Landscape-ecological map was generated using a combination of digitized homogeneous map units in the Landsat-7 ETM+ imagery, a Land use map 2005 of the Mekong delta, and supplementary datasets on the region's terrain, geomorphology and flooding depths. The output maps were validated using reported crop statistics, and regression analyses were used to ascertain the relationship between land use area estimated from maps, and those reported in district crop statistics. The regression analysis showed that the hyper-temporal NDVI analysis approach explained 74% and 76% of the variability in reported crop statistics in two rice crop and three

  15. Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.

    Science.gov (United States)

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A

    2012-08-01

    To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way for conducting dynamic mapping. Copyright © 2011 Elsevier Inc. All rights reserved.
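
    The reported evaluation boils down to precision and recall of automated term mappings against a manually reviewed gold standard. A minimal sketch of that computation (Python; the normalization step and the dictionary format are illustrative assumptions, not MTERMS itself):

        def normalize(term):
            # crude lexical normalization stand-in for the NLP step
            return " ".join(term.lower().replace(",", " ").split())

        def precision_recall(predicted, gold):
            # predicted, gold: {local drug term: RxNorm concept id}
            norm_gold = {normalize(t): c for t, c in gold.items()}
            hits = sum(1 for t, c in predicted.items() if norm_gold.get(normalize(t)) == c)
            precision = hits / len(predicted) if predicted else 0.0
            recall = hits / len(gold) if gold else 0.0
            return precision, recall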

  16. The Utility of AISA Eagle Hyperspectral Data and Random Forest Classifier for Flower Mapping

    Directory of Open Access Journals (Sweden)

    Elfatih M. Abdel-Rahman

    2015-10-01

    Full Text Available Knowledge of the floral cycle and the spatial distribution and abundance of flowering plants is important for bee health studies to understand the relationship between landscape and bee hive productivity and honey flow. The key objective of this study was to show how AISA Eagle hyperspectral data and random forest (RF) can be optimally utilized to produce flowering and spatially explicit land use/land cover (LULC) maps for a study site in Kenya. AISA Eagle imagery was captured at the early flowering period (January 2014) and at the peak flowering season (February 2013). Data on white and yellow flowering trees as well as LULC classes in the study area were collected and used as ground-truth points. We utilized all 64 AISA Eagle bands and also used variable importance in RF to identify the most important bands in both AISA Eagle data sets. The results showed that flowering was most accurately mapped using the AISA Eagle data from the peak flowering period (85.71%–88.15% overall accuracy for the peak flowering season imagery versus 80.82%–83.67% for the early flowering season). The variable optimization (i.e., variable selection) analysis showed that less than half of the AISA bands (n = 26 for the February 2013 data and n = 21 for the January 2014 data) were important to attain relatively reliable classification accuracies. Our study is an important first step towards the development of operational flower mapping routines and for understanding the relationship between flowering and bees’ foraging behavior.
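
    The band-selection step described above can be reproduced in outline with scikit-learn's random forest and its impurity-based variable importance; the sketch below uses synthetic stand-in data in place of the 64-band AISA Eagle matrix:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        X = rng.random((500, 64))     # pixels x 64 AISA Eagle bands (synthetic here)
        y = rng.integers(0, 3, 500)   # white flowers / yellow flowers / other LULC

        rf = RandomForestClassifier(n_estimators=500, oob_score=True,
                                    random_state=0).fit(X, y)
        top = np.argsort(rf.feature_importances_)[::-1][:26]  # keep most important bands
        rf_sel = RandomForestClassifier(n_estimators=500, oob_score=True,
                                        random_state=0).fit(X[:, top], y)
        print(rf.oob_score_, rf_sel.oob_score_)  # all bands vs. selected bands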

  17. Purposive versus random sampling for map validation: a case study on ecotope maps of floodplains in the Netherlands

    NARCIS (Netherlands)

    Knotters, M.; Brus, D.J.

    2013-01-01

    The quality of ecotope maps of five districts of main water courses in the Netherlands was assessed on the basis of independent validation samples of field observations. The overall proportion of area correctly classified, and user's and producer's accuracy for each map unit were estimated. In four
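
    The quality measures named here are the standard ones derived from a confusion matrix; for reference, a minimal sketch (function name hypothetical):

        import numpy as np

        def map_accuracy(confusion):
            # rows: mapped classes, columns: classes observed in the validation sample
            c = np.asarray(confusion, dtype=float)
            overall = np.trace(c) / c.sum()
            users = np.diag(c) / c.sum(axis=1)      # per mapped class (commission)
            producers = np.diag(c) / c.sum(axis=0)  # per observed class (omission)
            return overall, users, producers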

  18. Concept mapping and network analysis: an analytic approach to measure ties among constructs.

    Science.gov (United States)

    Goldman, Alyssa W; Kane, Mary

    2014-12-01

    Group concept mapping is a mixed-methods approach that helps a group visually represent its ideas on a topic of interest through a series of related maps. The maps and additional graphics are useful for planning, evaluation and theory development. Group concept maps are typically described, interpreted and utilized through points, clusters and distances, and the implications of these features in understanding how constructs relate to one another. This paper focuses on the application of network analysis to group concept mapping to quantify the strength and directionality of relationships among clusters. The authors outline the steps of this analysis, and illustrate its practical use through an organizational strategic planning example. Additional benefits of this analysis to evaluation projects are also discussed, supporting the overall utility of this supplemental technique to the standard concept mapping methodology. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Supplementary Appendix for: Constrained Perturbation Regularization Approach for Signal Estimation Using Random Matrix Theory

    KAUST Repository

    Suliman, Mohamed

    2016-01-01

    In this supplementary appendix we provide proofs and additional simulation results that complement the paper "Constrained Perturbation Regularization Approach for Signal Estimation Using Random Matrix Theory".

  20. Orthogonal matching pursuit applied to the deconvolution approach for the mapping of acoustic sources inverse problem.

    Science.gov (United States)

    Padois, Thomas; Berry, Alain

    2015-12-01

    Microphone arrays and beamforming have become a standard method to localize aeroacoustic sources. Deconvolution techniques have been developed to improve spatial resolution of beamforming maps. The deconvolution approach for the mapping of acoustic sources (DAMAS) is a standard deconvolution technique, which has been enhanced via a sparsity approach called sparsity constrained deconvolution approach for the mapping of acoustic sources (SC-DAMAS). In this paper, the DAMAS inverse problem is solved using the orthogonal matching pursuit (OMP) and compared with beamforming and SC-DAMAS. The resulting noise source maps show that OMP-DAMAS is an efficient source localization technique in the case of uncorrelated or correlated acoustic sources. Moreover, the computation time is clearly reduced as compared to SC-DAMAS.
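
    As a flavor of the technique, here is a generic orthogonal matching pursuit sketch for a linear system of the DAMAS form b = Ax (A: array point-spread-function matrix, b: beamforming map, x: source powers); the nonnegativity constraint used in DAMAS is omitted for brevity, so this is an illustration of OMP rather than the paper's exact solver:

        import numpy as np

        def omp(A, b, k, tol=1e-8):
            # greedily select at most k columns of A that best explain b
            residual, support = b.astype(float).copy(), []
            x = np.zeros(A.shape[1])
            coef = np.zeros(0)
            for _ in range(k):
                j = int(np.argmax(np.abs(A.T @ residual)))  # most correlated column
                if j not in support:
                    support.append(j)
                coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
                residual = b - A[:, support] @ coef
                if np.linalg.norm(residual) < tol:
                    break
            x[support] = coef
            return x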

  1. Engineering a robotic approach to mapping exposed volcanic fissures

    Science.gov (United States)

    Parcheta, C. E.; Parness, A.; Mitchell, K. L.

    2014-12-01

    Field geology provides a framework for advanced computer models and theoretical calculations of volcanic systems. Some field terrains, though, are poorly preserved or accessible, making documentation, quantification, and investigation impossible. Over 200 volcanologists at the 2012 Kona Chapman Conference on volcanology agreed that an important step forward in the field over the next 100 years should address the realistic size and shape of volcanic conduits. The 1969 Mauna Ulu eruption of Kīlauea provides a unique opportunity to document volcanic fissure conduits; thus, we have an ideal location to begin addressing this topic and provide data on these geometries. Exposed fissures can be mapped with robotics using machine vision. In order to test the hypothesis that fissures have irregularities with depth that will influence their fluid dynamical behavior, we must first map the fissure vents and shallow conduit to deci- or centimeter scale. We have designed, constructed, and field-tested the first version of a robotic device that will image an exposed volcanic fissure in three dimensions. The design phase included three steps: 1) create the payload harness and protective shell to prevent damage to the electronics and robot, 2) construct a circuit board to have the electronics communicate with a surface-based computer, and 3) prototype wheel shapes that can handle a variety of volcanic rock textures. The robot's mechanical parts were built using 3D printing, milling, casting and laser cutting techniques, and the electronics were assembled from off-the-shelf components. The testing phase took place at Mauna Ulu, Kīlauea, Hawai'i, from May 5-9, 2014. Many valuable design lessons were learned during the week, and the first-ever 3D map from inside a volcanic fissure was successfully collected. Three vents had between 25% and 95% of their internal surfaces imaged. A fourth location, a non-eruptive crack (possibly a fault line) had two transects imaging the textures

  2. Mapping national capacity to engage in health promotion: overview of issues and approaches.

    Science.gov (United States)

    Mittelmark, Maurice B; Wise, Marilyn; Nam, Eun Woo; Santos-Burgoa, Carlos; Fosse, Elisabeth; Saan, Hans; Hagard, Spencer; Tang, Kwok Cho

    2006-12-01

    This paper reviews approaches to the mapping of resources needed to engage in health promotion at the country level. There is not a single way, or a best way, to make a capacity map, since it should speak to the needs of its users as they define their needs. Health promotion capacity mapping is therefore approached in various ways. At the national level, the objective is usually to learn the extent to which essential policies, institutions, programmes and practices are in place to guide recommendations about what remedial measures are desirable. In Europe, capacity mapping has been undertaken at the national level by the WHO for a decade. A complementary capacity mapping approach, HP-Source.net, has been undertaken since 2000 by a consortium of European organizations including the EC, WHO, International Union for Health Promotion and Education, Health Development Agency (of England) and various European university research centres. The European approach emphasizes the need for multiple methods and the principle of triangulation. In North America, Canadian approaches have included large- and small-scale international collaborations to map capacity for sustainable development. US efforts include state-level mapping of capacity to prevent chronic diseases and reduce risk factor levels. In Australia, two decades of mapping national health promotion capacity began with systems needed by the health sector to design and deliver effective, efficient health promotion, and has now expanded to include community-level capacity and policy review. In Korea and Japan, capacity mapping is newly developing in collaboration with European efforts, illustrating the usefulness of international health promotion networks. Mapping capacity for health promotion is a practical and vital aspect of developing capacity for health promotion. The new context for health promotion contains both old and new challenges, but also new opportunities. A large scale, highly collaborative approach to capacity

  3. Conjecture Mapping: An Approach to Systematic Educational Design Research

    Science.gov (United States)

    Sandoval, William

    2014-01-01

    Design research is strongly associated with the learning sciences community, and in the 2 decades since its conception it has become broadly accepted. Yet within and without the learning sciences there remains confusion about how to do design research, with most scholarship on the approach describing what it is rather than how to do it. This…

  4. ODI launches RAPID Outcome Mapping Approach online guide ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2014-06-17

    Jun 17, 2014 ... The UK-based Overseas Development Institute has released its online guide to understanding, engaging with, and influencing policy. ... The RAPID team's focus on learning and evolution remains a central part of the approach, with a chapter devoted to the design of monitoring and evaluation activities that ...

  5. Comparison of four Vulnerability Approaches to Mapping of Shallow Aquifers of Eastern Dahomey Basin of Nigeria

    Science.gov (United States)

    Oke, Saheed; Vermeulen, Danie

    2016-04-01

    This study presents the outcome of vulnerability mapping of the shallow aquifers of the eastern Dahomey Basin of southwestern Nigeria. The basin is a coastal transboundary aquifer extending from eastern Ghana to southwestern Nigeria. The study aimed to identify the most suitable method for mapping the basin's shallow aquifers by comparing the results of four different vulnerability approaches. This matters because vulnerability methods differ in their assessment parameters and approaches and often yield different results for the same aquifer. The methodology involves vulnerability techniques that assess the intrinsic properties of the aquifer. Two methods from the travel-time approach (AVI and RTt) and two from the index approach (DRASTIC and PI) were employed in the mapping of the basin. The results show that AVI has the fewest mapping parameters, classifying 75% of the basin as very high vulnerability and 25% as high vulnerability. The DRASTIC mapping shows 18% as low vulnerability, 61% as moderate vulnerability and 21% as high vulnerability. Mapping with the PI method, which has the most parameters, shows 66% of the aquifer as low vulnerability and 34% as moderate vulnerability. The RTt method shows 18% as very high vulnerability, 8% as high vulnerability, 64% as moderate vulnerability and 10% as very low vulnerability. Further analysis involving correlation plots shows the highest correlation, 62%, between the RTt and DRASTIC methods, higher than between any other pair of methods. The analysis shows that the PI method is the mildest of all the vulnerability methods, while the AVI method is the strictest of the methods considered in this mapping. Using four different approaches to map the shallow aquifers of the eastern Dahomey Basin will guide the recommendation of the best vulnerability method for subsequent assessments of this and other shallow aquifers. Keywords: Aquifer vulnerability, Dahomey Basin

  6. Using Concept Mapping in Community-Based Participatory Research: A Mixed Methods Approach.

    Science.gov (United States)

    Windsor, Liliane Cambraia

    2013-07-01

    Community-based participatory research (CBPR) has been identified as a useful approach to increasing community involvement in research. Developing rigorous methods in conducting CBPR is an important step in gaining more support for this approach. The current article argues that concept mapping, a structured mixed methods approach, is useful in the initial development of a rigorous CBPR program of research aiming to develop culturally tailored and community-based health interventions for vulnerable populations. A research project examining social dynamics and consequences of alcohol and substance use in Newark, New Jersey, is described to illustrate the use of concept mapping methodology in CBPR. A total of 75 individuals participated in the study.

  7. A National Approach to Quantify and Map Biodiversity ...

    Science.gov (United States)

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human welfare. The degradation of natural ecosystems and climate variation impact the environment and society by affecting ecological integrity and ecosystems’ capacity to provide critical services (i.e., the contributions of ecosystems to human well-being). These challenges will require complex management decisions that can often involve significant trade-offs between societal desires and environmental needs. Evaluating trade-offs in terms of ecosystem services and human well-being provides an intuitive and comprehensive way to assess the broad implications of our decisions and to help shape policies that enhance environmental and social sustainability. In answer to this challenge, the U.S. government has created a partnership among the U.S. Environmental Protection Agency, other Federal agencies, academic institutions, and, Non-Governmental Organizations to develop the EnviroAtlas, an online Decision Support Tool that allows users (e.g., planners, policy-makers, resource managers, NGOs, private indu

  8. Flow in Random Microstructures: a Multilevel Monte Carlo Approach

    KAUST Repository

    Icardi, Matteo

    2016-01-06

    In this work we are interested in the fast estimation of effective parameters of random heterogeneous materials using Multilevel Monte Carlo (MLMC). MLMC is an efficient and flexible solution for the propagation of uncertainties in complex models, where an explicit parametrisation of the input randomness is not available or too expensive. We propose a general-purpose algorithm and computational code for the solution of Partial Differential Equations (PDEs) on random heterogeneous materials. We make use of the key idea of MLMC, based on different discretization levels, extending it in a more general context, making use of a hierarchy of physical resolution scales, solvers, models and other numerical/geometrical discretisation parameters. Modifications of the classical MLMC estimators are proposed to further reduce variance in cases where analytical convergence rates and asymptotic regimes are not available. Spheres, ellipsoids and general convex-shaped grains are placed randomly in the domain with different placing/packing algorithms and the effective properties of the heterogeneous medium are computed. These are, for example, effective diffusivities, conductivities, and reaction rates. The implementation of the Monte-Carlo estimators, the statistical samples and each single solver is done efficiently in parallel. The method is tested and applied for pore-scale simulations of random sphere packings.
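
    The core of MLMC is the telescoping identity E[P_L] = E[P_0] + Σ_l E[P_l - P_{l-1}], sampled with level-dependent sample counts. A minimal sketch of the plain estimator (the coupled sampler is an assumed user-supplied function returning a fine/coarse pair for one random microstructure; variance-reduction refinements from the paper are not reproduced):

        import numpy as np

        def mlmc_estimate(sampler, n_samples):
            # sampler(l) -> (P_l, P_{l-1}) for one random input; P_{-1} is defined as 0
            estimate = 0.0
            for level, n in enumerate(n_samples):
                diffs = [fine - coarse
                         for fine, coarse in (sampler(level) for _ in range(n))]
                estimate += float(np.mean(diffs))
            return estimate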

  9. Non-linear dynamics of operant behavior: a new approach via the extended return map.

    Science.gov (United States)

    Li, Jay-Shake; Huston, Joseph P

    2002-01-01

    Previous efforts to apply non-linear dynamic tools to the analysis of operant behavior revealed some promise for this kind of approach, but also some doubts, since the complexity of animal behavior seemed to be beyond the analyzing ability of the available tools. We here outline a series of studies based on a novel approach. We modified the so-called 'return map' and developed a new method, the 'extended return map' (ERM), to extract information from the highly irregular time-series data, the inter-response times (IRTs), generated by Skinner-box experiments. We applied the ERM to operant lever-pressing data from rats using the four fundamental reinforcement schedules: fixed interval (FI), fixed ratio (FR), variable interval (VI) and variable ratio (VR). Our results revealed interesting patterns in all experiment groups. In particular, the FI and VI groups exhibited well-organized clusters of data points. We calculated the fractal dimension of these patterns and compared experimental data with surrogate data sets that were generated by randomly shuffling the sequential order of the original IRTs. This comparison supported the finding that patterns in the ERM reflect the dynamics of the operant behaviors under study. We then built two models to simulate the functional mechanisms of the FI schedule. Both models can produce similar distributions of IRTs and the stereotypical 'scalloped' curve characteristic of FI responding. However, they differ in one important feature of their formulation: while one model uses a continuous function to describe the probability of occurrence of an operant behavior, the other employs an abrupt switch of behavioral state. Comparison of ERMs showed that only the latter was able to produce patterns similar to the experimental results, indicative of the operation of an abrupt switch from one behavioral state to another over the course of the inter-reinforcement period. This example demonstrated the ERM to be a useful tool for the analysis of
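
    The basic return-map construction that the ERM extends, together with the shuffled-surrogate comparison used above, can be written in a few lines (a sketch of the idea only; the paper's exact ERM modification is not reproduced here):

        import numpy as np

        def return_map(irts, lag=1):
            # pairs (IRT_n, IRT_{n+lag}) for a scatter plot
            irts = np.asarray(irts, dtype=float)
            return np.column_stack([irts[:-lag], irts[lag:]])

        def surrogate(irts, seed=0):
            # destroys sequential order while preserving the IRT distribution
            return np.random.default_rng(seed).permutation(irts)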

  10. Mapping Transcription Factors on Extended DNA: A Single Molecule Approach

    Science.gov (United States)

    Ebenstein, Yuval; Gassman, Natalie; Weiss, Shimon

    The ability to determine the precise loci and distribution of nucleic acid binding proteins is instrumental to our detailed understanding of cellular processes such as transcription, replication, and chromatin reorganization. Traditional molecular biology approaches, and above all chromatin immunoprecipitation (ChIP) based methods, have provided a wealth of information regarding protein-DNA interactions. Nevertheless, existing techniques can only provide average properties of these interactions, since they are based on the accumulation of data from numerous protein-DNA complexes analyzed at the ensemble level. We propose a single molecule approach for direct visualization of DNA binding proteins bound specifically to their recognition sites along a long stretch of DNA such as genomic DNA. Fluorescent quantum dots are used to tag proteins bound to DNA, and the complex is deposited on a glass substrate by extending the DNA to a linear form. The sample is then imaged optically to determine the precise location of the protein binding site. The method is demonstrated by detecting individual quantum-dot-tagged T7 RNA polymerase enzymes on the bacteriophage T7 genomic DNA and assessing the relative occupancy of the different promoters.

  11. A taxonomy of behaviour change methods: an Intervention Mapping approach.

    Science.gov (United States)

    Kok, Gerjo; Gottlieb, Nell H; Peters, Gjalt-Jorn Y; Mullen, Patricia Dolan; Parcel, Guy S; Ruiter, Robert A C; Fernández, María E; Markham, Christine; Bartholomew, L Kay

    2016-09-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. Ideally, the taxonomy should be readily usable for this goal; but alternatively, it should be

  12. A Community Resource Map to Support Clinical-Community Linkages in a Randomized Controlled Trial of Childhood Obesity, Eastern Massachusetts, 2014-2016.

    Science.gov (United States)

    Fiechtner, Lauren; Puente, Gabriella C; Sharifi, Mona; Block, Jason P; Price, Sarah; Marshall, Richard; Blossom, Jeff; Gerber, Monica W; Taveras, Elsie M

    2017-07-06

    Novel approaches to health care delivery that leverage community resources could improve outcomes for children at high risk for obesity. We describe the process by which we created an online interactive community resources map for use in the Connect for Health randomized controlled trial. The trial was conducted in the 6 pediatric practices that cared for the highest percentage of children with overweight or obesity within a large multi-specialty group practice in eastern Massachusetts. By using semistructured interviews with parents and community partners and geographic information systems (GIS), we created and validated a community resource map for use in a randomized controlled trial for childhood obesity. We conducted semistructured interviews with 11 parents and received stakeholder feedback from 5 community partners, 2 pediatricians, and 3 obesity-built environment experts to identify community resources that could support behavior change. We used GIS databases to identify the location of resources. After the resources were validated, we created an online, interactive searchable map. We evaluated parent resource empowerment at baseline and follow-up, examined if the participant families went to new locations for physical activity and food shopping, and evaluated how satisfied the families were with the information they received. Parents, community partners, and experts identified several resources to be included in the map, including farmers markets, supermarkets, parks, and fitness centers. Parents expressed the need for affordable activities. Parent resource empowerment increased by 0.25 units (95% confidence interval, 0.21-0.30) over the 1-year intervention period; 76.2% of participants were physically active at new places, 57.1% of participant families shopped at new locations; and 71.8% reported they were very satisfied with the information they received. Parents and community partners identified several community resources that could help support

  13. Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness

    Energy Technology Data Exchange (ETDEWEB)

    Chelouche, Doron; Pozo-Nuñez, Francisco [Department of Physics, Faculty of Natural Sciences, University of Haifa, Haifa 3498838 (Israel); Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il [Department of Geosciences, Raymond and Beverly Sackler Faculty of Exact Sciences, Tel Aviv University, Tel Aviv 6997801 (Israel)

    2017-08-01

    A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann’s mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size–luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.
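
    In outline, the von Neumann scheme merges the two light curves at a trial shift and picks the lag that minimizes the mean-square successive difference of the combined curve; a minimal sketch (assumes the two curves are already flux-normalized to a common scale):

        import numpy as np

        def von_neumann_lag(t1, f1, t2, f2, trial_lags):
            scores = []
            for lag in trial_lags:
                t = np.concatenate([t1, t2 - lag])  # shift the echo curve back by the lag
                f = np.concatenate([f1, f2])[np.argsort(t)]
                scores.append(np.mean(np.diff(f) ** 2) / np.var(f))  # von Neumann ratio
            return trial_lags[int(np.argmin(scores))]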

  15. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti

    Directory of Open Access Journals (Sweden)

    Wampler Peter J

    2013-01-01

    Full Text Available Background: A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground-based surveys to collect household location data and can be performed in a rapid, cost-effective manner. Methods: The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel, which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. Results: A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. Conclusions: The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only

  17. Development and Comparison of Techniques for Generating Permeability Maps using Independent Experimental Approaches

    Science.gov (United States)

    Hingerl, Ferdinand; Romanenko, Konstantin; Pini, Ronny; Balcom, Bruce; Benson, Sally

    2014-05-01

    We have developed and evaluated methods for creating voxel-based 3D permeability maps of a heterogeneous sandstone sample using independent experimental data from single-phase flow (Magnetic Resonance Imaging, MRI) and two-phase flow (X-ray Computed Tomography, CT) measurements. Fluid velocities computed from the generated permeability maps using computational fluid dynamics simulations fit measured velocities very well and significantly outperform empirical porosity-permeability relations, such as the Kozeny-Carman equation. Acquiring images of porous rocks on the meso-scale using MRI has until recently been a great challenge, due to short spin relaxation times and large field gradients within the sample. The combination of the 13-interval Alternating-Pulsed-Gradient Stimulated-Echo (APGSTE) scheme with three-dimensional Single Point Ramped Imaging with T1 Enhancement (SPRITE) - a technique recently developed at the UNB MRI Center - can overcome these challenges and enables obtaining quantitative three-dimensional maps of porosities and fluid velocities. Using porosity and (single-phase) velocity maps from MRI and (multi-phase) saturation maps from CT measurements, we employed three different techniques to obtain permeability maps. In the first approach, we applied the Kozeny-Carman relationship to porosities measured using MRI. In the second approach, we computed permeabilities using a J-Leverett scaling method, which is based on saturation maps obtained from N2-H2O multi-phase experiments. The third set of permeabilities was generated using a new inverse iterative-updating technique, which is based on porosities and measured velocities obtained in single-phase flow experiments. The resulting three permeability maps then provided input for computational fluid dynamics simulations - employing the Stanford CFD code AD-GPRS - to generate velocity maps, which were compared to velocity maps measured by MRI. The J-Leverett scaling method and the iterative-updating method
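
    The first of the three techniques is a direct application of the Kozeny-Carman relation to the MRI porosity map; a voxel-wise sketch (the grain diameter is a hypothetical placeholder value):

        import numpy as np

        def kozeny_carman(porosity, grain_diameter=2.5e-4):
            # permeability in m^2 from the packed-sphere Kozeny-Carman form
            phi = np.clip(porosity, 1e-3, 0.999)  # guard against division by zero
            return grain_diameter ** 2 * phi ** 3 / (180.0 * (1.0 - phi) ** 2)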

  18. RENNSH: a novel α-helix identification approach for intermediate resolution electron density maps.

    Science.gov (United States)

    Ma, Lingyu; Reisert, Marco; Burkhardt, Hans

    2012-01-01

    Accurate identification of protein secondary structures is beneficial for understanding the three-dimensional structures of biological macromolecules. In this paper, a novel refined classification framework is proposed, which treats α-helix identification as a machine learning problem by representing each voxel in the density map with its Spherical Harmonic Descriptors (SHD). An energy function is defined to provide statistical analysis of its identification performance, which can be applied to all α-helix identification approaches. Compared with other existing α-helix identification methods for intermediate-resolution electron density maps, the experimental results demonstrate that our approach gives the best identification accuracy and is more robust to noise.

  19. Ontology Mapping: An Information Retrieval and Interactive Activation Network Based Approach

    Science.gov (United States)

    Mao, Ming

    Ontology mapping is the task of finding semantic correspondences between similar elements of different ontologies. It is critical for achieving semantic interoperability in the WWW. This paper proposes a new generic and scalable ontology mapping approach based on propagation theory, information retrieval techniques and an artificial intelligence model. The approach utilizes both linguistic and structural information, measures the similarity of different elements of ontologies in a vector space model, and deals with constraints using an interactive activation network. The results of a pilot study, PRIOR, are promising and scalable.
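
    The linguistic, vector-space part of such an approach can be sketched with TF-IDF cosine similarity over element labels (the structural propagation and the interactive activation network of PRIOR are not reproduced here; names are illustrative):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        def best_matches(labels_a, labels_b, threshold=0.5):
            vec = TfidfVectorizer().fit(labels_a + labels_b)
            sim = cosine_similarity(vec.transform(labels_a), vec.transform(labels_b))
            for i, label in enumerate(labels_a):
                j = sim[i].argmax()                 # closest element of the other ontology
                if sim[i, j] >= threshold:
                    yield label, labels_b[j], float(sim[i, j])

        print(list(best_matches(["postal address", "phone"],
                                ["address", "telephone number"])))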

  20. Streamlined approach to mapping the magnetic induction of skyrmionic materials

    Energy Technology Data Exchange (ETDEWEB)

    Chess, Jordan J., E-mail: jchess@uoregon.edu [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Montoya, Sergio A. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); Harvey, Tyler R. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Ophus, Colin [National Center for Electron Microscopy, Molecular Foundry, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); McMorran, Benjamin J. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States)

    2017-06-15

    Highlights: • A method to reconstruct the phase of electrons after passing through a sample, requiring only a single defocused image, is presented. • Restrictions on when it is appropriate to apply this method are described. • The relative error associated with this method is compared to conventional transport of intensity equation analysis. - Abstract: Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.
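
    Under the uniform-intensity assumption for a thin film, the TIE reduces to a Poisson equation for the phase, which a single defocused image can feed; a minimal Fourier-space sketch of that simplification (sign and scaling conventions vary between implementations, so treat this as an illustration rather than the authors' exact method):

        import numpy as np

        def tie_phase(i_defocus, dz, wavelength, pixel):
            i0 = i_defocus.mean()  # uniform in-focus intensity assumed
            rhs = -2 * np.pi / (wavelength * i0) * (i_defocus - i0) / dz  # laplacian(phi)
            ky = 2 * np.pi * np.fft.fftfreq(rhs.shape[0], d=pixel)
            kx = 2 * np.pi * np.fft.fftfreq(rhs.shape[1], d=pixel)
            k2 = kx[None, :] ** 2 + ky[:, None] ** 2
            k2[0, 0] = 1.0                    # avoid dividing by zero at DC
            phi_hat = -np.fft.fft2(rhs) / k2  # inverse Laplacian in Fourier space
            phi_hat[0, 0] = 0.0               # phase is defined up to a constant
            return np.real(np.fft.ifft2(phi_hat))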

  1. Behavioral approach with or without surgical intervention to the vulvar vestibulitis syndrome : A prospective randomized and non randomized study

    NARCIS (Netherlands)

    Schultz, WCMW; Gianotten, WL; vanderMeijden, WI; vandeWiel, HBM; Blindeman, L; Chadha, S; Drogendijk, AC

    This article describes the outcome of a behavioral approach with or without preceding surgical intervention in 48 women with the vulvar vestibulitis syndrome. In the first part of the study, 14 women with the vulvar vestibulitis syndrome were randomly assigned to one of two treatment programs:

  2. Rapid land cover map updates using change detection and robust random forest classifiers

    CSIR Research Space (South Africa)

    Wessels, Konrad J

    2016-01-01

    Full Text Available The paper evaluated the Landsat Automated Land Cover Update Mapping (LALCUM) system designed to rapidly update a land cover map to a desired nominal year using a pre-existing reference land cover map. The system uses the Iteratively Reweighted...

  3. Benthic habitat mapping in a Portuguese Marine Protected Area using EUNIS: An integrated approach

    Science.gov (United States)

    Henriques, Victor; Guerra, Miriam Tuaty; Mendes, Beatriz; Gaudêncio, Maria José; Fonseca, Paulo

    2015-06-01

    A growing demand for seabed and habitat mapping has emerged over the past years to support the maritime integrated policies at EU and national levels aiming at the sustainable use of sea resources. This study presents the results of applying the hierarchical European Nature Information System (EUNIS) to classify and map the benthic habitats of the Luiz Saldanha Marine Park, a marine protected area (MPA) located on the southwest coast of mainland Portugal, in the Iberian Peninsula. The habitat map was modelled by applying a methodology based on EUNIS to merge biotic and abiotic key habitat drivers. The modelling in this approach focused on predicting the association of different data types: substrate, bathymetry, light intensity, wave and current energy, sediment grain size and benthic macrofauna into a common framework. The resulting seamless medium-scale habitat map discriminates twenty-six distinct sublittoral habitats, including eight with no match in the current classification, which may be regarded as new potential habitat classes and therefore will be submitted to EUNIS. A discussion is provided examining the suitability of the current EUNIS scheme as a standardized approach to classify marine benthic habitats and map their spatial distribution at medium scales on the Portuguese coast. In addition, the factors that most affected the results of the predictive habitat map and the role of the environmental factors in macrofaunal assemblage composition and distribution are outlined.

  4. Tropical land use land cover mapping in Pará (Brazil) using discriminative Markov random fields and multi-temporal TerraSAR-X data

    Science.gov (United States)

    Hagensieker, Ron; Roscher, Ribana; Rosentreter, Johannes; Jakimow, Benjamin; Waske, Björn

    2017-12-01

    Remote sensing satellite data offer the unique possibility to map land use land cover transformations by providing spatially explicit information. However, detection of short-term processes and land use patterns of high spatial-temporal variability is a challenging task. We present a novel framework using multi-temporal TerraSAR-X data and machine learning techniques, namely discriminative Markov random fields with spatio-temporal priors, and import vector machines, in order to advance the mapping of land cover characterized by short-term changes. Our study region covers a current deforestation frontier in the Brazilian state Pará with land cover dominated by primary forests, different types of pasture land and secondary vegetation, and land use dominated by short-term processes such as slash-and-burn activities. The data set comprises multi-temporal TerraSAR-X imagery acquired over the course of the 2014 dry season, as well as optical data (RapidEye, Landsat) for reference. Results show that land use land cover is reliably mapped, resulting in spatially adjusted overall accuracies of up to 79% in a five class setting, yet limitations for the differentiation of different pasture types remain. The proposed method is applicable on multi-temporal data sets, and constitutes a feasible approach to map land use land cover in regions that are affected by high-frequent temporal changes.
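
    To give a flavor of how a spatial MRF prior regularizes a per-pixel classification, below is a minimal iterated-conditional-modes sketch with a Potts prior (the paper's discriminative spatio-temporal model and import vector machines are considerably richer than this; the periodic border handling is a simplification):

        import numpy as np

        def icm_potts(unary, beta=1.0, n_iter=5):
            # unary: (H, W, C) per-pixel class costs, e.g. -log class probabilities
            labels = unary.argmin(axis=2)
            C = unary.shape[2]
            for _ in range(n_iter):
                cost = unary.copy()
                for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nb = np.roll(labels, shift, axis=(0, 1))  # 4-neighbour labels
                    cost += beta * (nb[:, :, None] != np.arange(C))  # Potts disagreement
                labels = cost.argmin(axis=2)
            return labels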

  5. Predictive Mapping of Dwarf Shrub Vegetation in an Arid High Mountain Ecosystem Using Remote Sensing and Random Forests

    Directory of Open Access Journals (Sweden)

    Kim André Vanselow

    2014-07-01

    Full Text Available In many arid mountains, dwarf shrubs represent the most important fodder and firewood resources; therefore, they are intensely used. For the Eastern Pamirs (Tajikistan), they are assumed to be overused. However, empirical evidence on this issue is lacking. We aim to provide a method capable of mapping vegetation in this mountain desert. We used random forest models based on remote sensing data (RapidEye, ASTER GDEM) and 359 plots to predictively map total vegetative cover and the distribution of the most important firewood plants, K. ceratoides and A. leucotricha. These species were mapped as present in 33.8% of the study area (accuracy 90.6%). The total cover of the dwarf shrub communities ranged from 0.5% to 51% (per pixel). Areas with very low cover were limited to the vicinity of roads and settlements. The model could explain 80.2% of the total variance. The most important predictor across the models was MSAVI2 (a spectral vegetation index designed specifically for low-cover areas). We conclude that the combination of statistical models and remote sensing data worked well to map vegetation in an arid mountainous environment. With this approach, we were able to provide tangible data on dwarf shrub resources in the Eastern Pamirs and to put previous reports about their extensive depletion into perspective.
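
    MSAVI2 itself is closed-form and needs only the red and near-infrared reflectance bands; a one-function sketch:

        import numpy as np

        def msavi2(nir, red):
            # reflectance arrays in [0, 1]; the argument of the square root equals
            # (2*nir - 1)**2 + 8*red and is therefore non-negative for valid reflectances
            return (2 * nir + 1 - np.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2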

  6. Density Functional Approach and Random Matrix Theory in Proteogenesis

    Science.gov (United States)

    Yamanaka, Masanori

    2017-02-01

    We study the energy-level statistics of amino acids by random matrix theory. The molecular orbital and the Kohn-Sham orbital energies are calculated using ab initio and density-functional formalisms for 20 different amino acids. To generate statistical data, we performed a multipoint calculation on 10000 molecular structures produced via a molecular dynamics simulation. For the valence orbitals, the energy-level statistics exhibit repulsion, but the universality in the random matrix cannot be determined. For the unoccupied orbitals, the energy-level statistics indicate an intermediate distribution between the Gaussian orthogonal ensemble and the semi-Poisson statistics for all 20 different amino acids. These amino acids are considered to be in a type of critical state.
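
    The quoted comparison rests on the nearest-neighbour level-spacing distribution; a minimal sketch with a crude unfolding (normalizing to unit mean spacing) alongside the three standard reference forms:

        import numpy as np

        def spacings(energies):
            s = np.diff(np.sort(energies))
            return s / s.mean()  # crude unfolding to unit mean spacing

        wigner_goe = lambda s: (np.pi / 2) * s * np.exp(-np.pi * s ** 2 / 4)  # GOE surmise
        poisson = lambda s: np.exp(-s)                   # uncorrelated levels
        semi_poisson = lambda s: 4 * s * np.exp(-2 * s)  # intermediate statistics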

  7. An Effective NoSQL-Based Vector Map Tile Management Approach

    Directory of Open Access Journals (Sweden)

    Lin Wan

    2016-11-01

    Full Text Available Within a digital map service environment, the rapid growth of Spatial Big Data is driving new requirements for effective mechanisms for massive online vector map tile processing. The emergence of Not Only SQL (NoSQL) databases has resulted in a new data storage and management model for scalable spatial data deployments and fast tracking. They better suit the scenario of high-volume, low-latency network map services than traditional standalone high-performance computers (HPC) or relational databases. In this paper, we propose a flexible storage framework that provides feasible methods for tiled map data parallel clipping and retrieval operations within a distributed NoSQL database environment. We illustrate the parallel vector tile generation and querying algorithms with the MapReduce programming model. Three different processing approaches, including local caching, distributed file storage, and the NoSQL-based method, are compared by analyzing the concurrent load and calculation time. An online geological vector tile map service prototype was developed to embed our processing framework in the China Geological Survey Information Grid. Experimental results show that our NoSQL-based parallel tile management framework can support applications that process huge volumes of vector tile data and improve the performance of the tiled map service.
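
    A NoSQL tile store of this kind typically keys each clipped tile by its zoom/column/row index; a minimal sketch of the keying (a plain dict stands in for the distributed table, and the slippy-map key scheme is illustrative rather than the paper's exact one):

        import math

        def tile_key(lon, lat, zoom):
            # standard z/x/y web-mercator tile index used as the row key
            n = 2 ** zoom
            x = int((lon + 180.0) / 360.0 * n)
            y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
            return f"{zoom}/{x}/{y}"

        store = {}  # stand-in for the NoSQL table: key -> encoded vector tile blob
        store[tile_key(114.3, 30.6, 12)] = b"<encoded vector tile>"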

  8. A regularized, model-based approach to phase-based conductivity mapping using MRI.

    Science.gov (United States)

    Ropella, Kathleen M; Noll, Douglas C

    2017-11-01

    To develop a novel regularized, model-based approach to phase-based conductivity mapping that uses structural information to improve the accuracy of conductivity maps. The inverse of the three-dimensional Laplacian operator is used to model the relationship between measured phase maps and the object conductivity in a penalized weighted least-squares optimization problem. Spatial masks based on structural information are incorporated into the problem to preserve data near boundaries. The proposed Inverse Laplacian method was compared against a restricted Gaussian filter in simulation, phantom, and human experiments. The Inverse Laplacian method resulted in lower reconstruction bias and error due to noise in simulations than the Gaussian filter. The Inverse Laplacian method also produced conductivity maps closer to the measured values in a phantom and with reduced noise in the human brain, as compared to the Gaussian filter. The Inverse Laplacian method calculates conductivity maps with less noise and more accurate values near boundaries. Improving the accuracy of conductivity maps is integral for advancing the applications of conductivity mapping. Magn Reson Med 78:2011-2021, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
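    A schematic rendering of the optimization described above, with all operators and weights treated as assumptions since the paper's exact formulation is not reproduced here, might read:

```latex
% Schematic penalized weighted least-squares problem (all notation assumed):
% \phi measured phase map, \sigma conductivity, \mathbf{L}^{-1} inverse 3-D
% Laplacian forward model, W data weighting, R roughness penalty restricted
% by the structural masks, \beta regularization parameter.
\hat{\sigma} \;=\; \arg\min_{\sigma}\;
  \bigl\lVert W\bigl(\phi - \mathbf{L}^{-1}\sigma\bigr) \bigr\rVert_2^2
  \;+\; \beta\,\bigl\lVert R\,\sigma \bigr\rVert_2^2
```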

  9. An Approach of Dynamic Object Removing for Indoor Mapping Based on UGV SLAM

    Directory of Open Access Journals (Sweden)

    Jian Tang

    2015-07-01

    The study of indoor mapping for Location Based Services (LBS) has become more and more popular in recent years. LiDAR-SLAM-based mapping appears to be a promising indoor mapping solution. However, dynamic objects such as pedestrians and indoor vehicles are present in the raw LiDAR range data and have to be removed for mapping purposes. In this paper, a new approach to dynamic object removal called Likelihood Grid Voting (LGV) is presented. It is a model-free method that takes full advantage of the high scanning rate of a LiDAR moving at relatively low speed in an indoor environment. In this method, a counting grid records how often each map position is occupied by laser scans; positions with low counter values are recognized as dynamic objects, and the corresponding points are removed from the map. This work is part of the algorithms in our self-developed Unmanned Ground Vehicle (UGV) Simultaneous Localization and Mapping (SLAM) system, NAVIS. Field tests were carried out in an indoor parking area with NAVIS to evaluate the effectiveness of the proposed method. The results show that small objects such as pedestrians can be detected and removed quickly, while large objects such as cars can be detected and partially removed.
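    The abstract gives only a prose description of Likelihood Grid Voting; a minimal sketch of the counting-grid idea, with an invented grid resolution and vote threshold rather than the paper's values, might look like this:

```python
# Sketch of the counting-grid idea: every scan votes for the map cells its
# points fall in; cells hit in only a few scans are treated as dynamic and
# their points are dropped. CELL and MIN_HITS are illustrative assumptions.
from collections import defaultdict

CELL = 0.1          # grid resolution in metres (assumed)
MIN_HITS = 20       # minimum scans that must observe a cell to keep it (assumed)

counts = defaultdict(int)

def cell_of(x, y):
    return (int(x // CELL), int(y // CELL))

def integrate_scan(points_world):
    """points_world: (x, y) laser returns already in the map frame."""
    for c in {cell_of(x, y) for x, y in points_world}:  # one vote per scan per cell
        counts[c] += 1

def static_points(points_world):
    """Keep only points whose cells were seen in enough scans."""
    return [(x, y) for x, y in points_world if counts[cell_of(x, y)] >= MIN_HITS]

for _ in range(30):                                 # a wall seen in every scan
    integrate_scan([(1.0, 2.0), (1.05, 2.0)])
integrate_scan([(3.0, 4.0)])                        # a pedestrian seen once
print(static_points([(1.0, 2.0), (3.0, 4.0)]))      # pedestrian point is dropped
```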

  10. A Random Finite Set Approach to Space Junk Tracking and Identification

    Science.gov (United States)

    2014-09-03

    Final report, covering 31 January 2013 to 29 April 2014, under contract FA2386-13… Title: A Random Finite Set Approach to Space Junk Tracking and Identification. Authors: Ba-Ngu Vo and Ba-Tuong Vo, Department of …

  11. A Visual-Based Approach for Indoor Radio Map Construction Using Smartphones.

    Science.gov (United States)

    Liu, Tao; Zhang, Xing; Li, Qingquan; Fang, Zhixiang

    2017-08-04

    Localization of users in indoor spaces is a common issue in many applications. Among various technologies, Wi-Fi fingerprinting based localization has attracted much attention, since it can be easily deployed using existing off-the-shelf mobile devices and wireless networks. However, collecting the Wi-Fi radio map is quite labor-intensive, which limits its potential for large-scale application. In this paper, a visual-based approach is proposed for the construction of a radio map in anonymous indoor environments. This approach collects multi-sensor data (e.g., Wi-Fi signals, video frames, and inertial readings) while people walk through indoor environments with smartphones in their hands. It then spatially recovers the trajectories of the people using both visual and inertial information. Finally, it estimates the locations of fingerprints from the trajectories and constructs a Wi-Fi radio map. Experimental results show that the average location error of the fingerprints is about 0.53 m. A weighted k-nearest neighbor method is also used to evaluate the constructed radio map. The average localization error is about 3.2 m, indicating that the quality of the constructed radio map is at the same level as that of radio maps constructed by site surveying. However, this approach greatly reduces the human labor cost, which increases its potential for application to large indoor environments.
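    The abstract mentions a weighted k-nearest-neighbor evaluation; a minimal WKNN localizer over a fingerprint database, using the common inverse-distance weighting (an assumption, since the paper's exact weighting is not given), could be sketched as:

```python
# Sketch: weighted k-nearest-neighbour (WKNN) localization on a radio map.
import numpy as np

def wknn_locate(rss_query, fingerprints, positions, k=3, eps=1e-6):
    """
    rss_query    : (n_ap,) RSS vector measured by the phone
    fingerprints : (n_ref, n_ap) RSS vectors of the radio map
    positions    : (n_ref, 2) x/y coordinates of the reference points
    """
    d = np.linalg.norm(fingerprints - rss_query, axis=1)   # signal-space distance
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + eps)                               # closer fingerprints weigh more
    return (positions[idx] * w[:, None]).sum(axis=0) / w.sum()

fp = np.array([[-60., -70., -80.], [-65., -62., -75.], [-80., -60., -65.]])
pos = np.array([[0., 0.], [5., 0.], [5., 5.]])
print(wknn_locate(np.array([-63., -66., -77.]), fp, pos, k=2))
```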

  12. Physical Mapping of Bread Wheat Chromosome 5A: An Integrated Approach

    Directory of Open Access Journals (Sweden)

    Delfina Barabaschi

    2015-11-01

    The huge size, redundancy, and highly repetitive nature of the bread wheat (Triticum aestivum L.) genome make it among the most difficult species to sequence. To overcome these limitations, a strategy based on the separation of individual chromosomes or chromosome arms and the subsequent production of physical maps was established within the frame of the International Wheat Genome Sequencing Consortium (IWGSC). A total of 95,812 bacterial artificial chromosome (BAC) clones of short-arm chromosome 5A (5AS) and long-arm chromosome 5A (5AL) arm-specific BAC libraries were fingerprinted and assembled into contigs by complementary analytical approaches based on the FingerPrinted Contig (FPC) and Linear Topological Contig (LTC) tools. Combined anchoring approaches based on polymerase chain reaction (PCR) marker screening, microarray analysis, and sequence homology searches applied to several genomic tools (i.e., genetic maps, deletion bin map, neighbor maps, BAC end sequences (BESs), genome zipper, and chromosome survey sequences) allowed the development of a high-quality physical map with an anchored physical coverage of 75% for 5AS and 53% for 5AL, with high portions (64 and 48%, respectively) of contigs ordered along the chromosome. In the genomes of other grasses, namely Brachypodium distachyon (L.) Beauv., rice (Oryza sativa L.), and sorghum (Sorghum bicolor (L.) Moench), homologs of genes on wheat chromosome 5A are separated into syntenic blocks on different chromosomes as a result of translocations and inversions during evolution. The physical map presented here represents an essential resource for fine genetic mapping and map-based cloning of agronomically relevant traits and a reference for the 5A sequencing projects.

  13. Determination of contact maps in proteins: A combination of structural and chemical approaches

    Energy Technology Data Exchange (ETDEWEB)

    Wołek, Karol; Cieplak, Marek, E-mail: mc@ifpan.edu.pl [Institute of Physics, Polish Academy of Science, Al. Lotników 32/46, 02-668 Warsaw (Poland); Gómez-Sicilia, Àngel [Instituto Cajal, Consejo Superior de Investigaciones Cientificas (CSIC), Av. Doctor Arce, 37, 28002 Madrid (Spain); Instituto Madrileño de Estudios Avanzados en Nanociencia (IMDEA-Nanociencia), C/Faraday 9, 28049 Cantoblanco (Madrid) (Spain)

    2015-12-28

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.
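    The rCSU validity rule quoted above, that attractive atomic contacts must outnumber repulsive ones, is simple enough to state in code. The atom-pair labels here are placeholders for the chemical classification of CSU/rCSU, which is not reproduced.

```python
# Sketch of the rCSU validity criterion for a residue-residue contact.
def contact_is_valid(atom_pairs):
    """atom_pairs: labels 'attractive' / 'repulsive' / 'neutral' for all
    atom-atom contacts between two residues (classification assumed)."""
    attractive = sum(1 for p in atom_pairs if p == "attractive")
    repulsive = sum(1 for p in atom_pairs if p == "repulsive")
    return attractive > repulsive                  # the rCSU rule quoted above

print(contact_is_valid(["attractive", "attractive", "repulsive"]))  # True
print(contact_is_valid(["attractive", "repulsive", "repulsive"]))   # False
```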

  14. Random matrix approach to the distribution of genomic distance.

    Science.gov (United States)

    Alexeev, Nikita; Zograf, Peter

    2014-08-01

    The cycle graph introduced by Bafna and Pevzner is an important tool for evaluating the distance between two genomes, that is, the minimal number of rearrangements needed to transform one genome into another. We interpret this distance in topological terms and relate it to the random matrix theory. Namely, the number of genomes at a given 2-break distance from a fixed one (the Hultman number) is represented by a coefficient in the genus expansion of a matrix integral over the space of complex matrices with the Gaussian measure. We study generating functions for the Hultman numbers and prove that the two-break distance distribution is asymptotically normal.

  15. Random Matrix Theory Approach to Chaotic Coherent Perfect Absorbers

    Science.gov (United States)

    Li, Huanan; Suwunnarat, Suwun; Fleischmann, Ragnar; Schanz, Holger; Kottos, Tsampikos

    2017-01-01

    We employ random matrix theory in order to investigate coherent perfect absorption (CPA) in lossy systems with complex internal dynamics. The loss strength γ_CPA and energy E_CPA for which a CPA occurs are expressed in terms of the eigenmodes of the isolated cavity, thus carrying over the information about the chaotic nature of the target, and their coupling to a finite number of scattering channels. Our results are tested against numerical calculations using complex networks of resonators and chaotic graphs as CPA cavities.

  16. Spectral rigidity of vehicular streams (random matrix theory approach)

    Energy Technology Data Exchange (ETDEWEB)

    Krbalek, Milan [Faculty of Nuclear Sciences and Physical Engineering, Czech Technical University, Prague (Czech Republic); Seba, Petr [Doppler Institute for Mathematical Physics and Applied Mathematics, Faculty of Nuclear Sciences and Physical Engineering, Czech Technical University, Prague (Czech Republic)

    2009-08-28

    Using a method originally developed for random matrix theory, we derive an approximate mathematical formula for the number variance Δ_N(L) describing the rigidity of particle ensembles with a power-law repulsion. The resulting relation is compared with the relevant statistics of single-vehicle data measured on the Dutch freeway A9. The detected value of the inverse temperature β, which can be identified as a coefficient of the mental strain of the car drivers, is then discussed in detail in relation to the traffic density ρ and the flow J.

  17. Impact of a concept map teaching approach on nursing students' critical thinking skills.

    Science.gov (United States)

    Kaddoura, Mahmoud; Van-Dyke, Olga; Yang, Qing

    2016-09-01

    Nurses confront complex problems and decisions that require critical thinking in order to identify patient needs and implement best practices. An active strategy for teaching students the skills to think critically is the concept map. This study explores the development of critical thinking among nursing students in a required pathophysiology and pharmacology course during the first year of a Bachelor of Science in Nursing program, in response to concept mapping as an interventional strategy, using the Health Education Systems, Incorporated (HESI) critical thinking test. A two-group experimental study with a pretest and posttest design was used. Participants were randomly divided into a control group (n = 42) taught by traditional didactic lecturing alone, and an intervention group (n = 41) taught by traditional didactic lecturing with concept mapping. Students in the concept mapping group performed significantly better on the HESI critical thinking test than students in the control group. It is recommended that deans, program directors, and nursing faculty evaluate their curricula to integrate concept map teaching strategies, in order to develop critical thinking abilities in their students. © 2016 John Wiley & Sons Australia, Ltd.

  18. The Criterion-Related Validity of a Computer-Based Approach for Scoring Concept Maps

    Science.gov (United States)

    Clariana, Roy B.; Koul, Ravinder; Salehi, Roya

    2006-01-01

    This investigation seeks to confirm a computer-based approach that can be used to score concept maps (Poindexter & Clariana, 2004) and then describes the concurrent criterion-related validity of these scores. Participants enrolled in two graduate courses (n=24) were asked to read about and research online the structure and function of the heart…

  19. An innovative multimodality approach for sentinel node mapping and biopsy in head and neck malignancies

    NARCIS (Netherlands)

    Borbón-Arce, M.; Brouwer, O. R.; van den Berg, N. S.; Mathéron, H.; Klop, W. M. C.; Balm, A. J. M.; van Leeuwen, F. W. B.; Valdés-Olmos, R. A.

    2014-01-01

    Purpose: Recent innovations such as preoperative SPECT/CT, intraoperative imaging using portable devices and a hybrid tracer were evaluated in a multimodality approach for sentinel node (SN) mapping and biopsy in head and neck malignancies. Material and methods: The evaluation included 25

  20. Orbital stability during the mapping and approach phases of the MarcoPolo-R spacecraft

    Science.gov (United States)

    Wickhusen, K.; Hussmann, H.; Oberst, J.; Luedicke, F.

    2012-09-01

    In support of the Marco-Polo-R mission we are analyzing the motion of the spacecraft in the vicinity of its primary target, the binary asteroid system 175706 (1996 FG3). We ran simulations in order to support the general mapping, the approach, and the sampling phase

  1. METHODICAL APPROACHES TO ECOLOGICAL MAPPING OF CITY TERRITORY (ON THE EXAMPLE OF KHABAROVSK)

    Directory of Open Access Journals (Sweden)

    L. P. Mayorova

    2015-01-01

    This article considers methodical approaches to ecological mapping and proposes a layer structure for a map of the ecological condition of Khabarovsk. Examples of the 'Dump' and 'Landfill and Waste Processors' layers in QGIS are presented.

  2. Testing a Landsat-based approach for mapping disturbance causality in U.S. forests

    Science.gov (United States)

    Todd A. Schroeder; Karen G. Schleeweis; Gretchen G. Moisen; Chris Toney; Warren B. Cohen; Elizabeth A. Freeman; Zhiqiang Yang; Chengquan Huang

    2017-01-01

    In light of Earth's changing climate and growing human population, there is an urgent need to improve monitoring of the natural and anthropogenic disturbances which affect forests' ability to sequester carbon and provide other ecosystem services. In this study, a two-step modeling approach was used to map the type and timing of forest disturbances occurring...

  3. Improving Students' Creative Thinking and Achievement through the Implementation of Multiple Intelligence Approach with Mind Mapping

    Science.gov (United States)

    Widiana, I. Wayan; Jampel, I. Nyoman

    2016-01-01

    This classroom action research aimed to improve students' creative thinking and achievement in learning science. It was conducted through the implementation of a multiple intelligences approach with mind mapping, and by describing the students' responses. The subjects of this research were the fifth grade students of SD 8 Tianyar Barat, Kubu, and…

  4. Mapping community vulnerability to poaching: A whole-of-society approach

    CSIR Research Space (South Africa)

    Schmitz, Peter MU

    2017-01-01

    Authors: Peter M.U. Schmitz, Duarte Gonçalves, and Merin Jacob (CSIR Built Environment, Meiring Naude Rd, Brummeria, Pretoria, South Africa).

  5. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    Science.gov (United States)

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  6. Mapping quantitative trait loci in a selectively genotyped outbred population using a mixture model approach

    NARCIS (Netherlands)

    Johnson, David L.; Jansen, Ritsert C.; Arendonk, Johan A.M. van

    1999-01-01

    A mixture model approach is employed for the mapping of quantitative trait loci (QTL) for the situation where individuals, in an outbred population, are selectively genotyped. Maximum likelihood estimation of model parameters is obtained from an Expectation-Maximization (EM) algorithm facilitated by

  7. Constrained Perturbation Regularization Approach for Signal Estimation Using Random Matrix Theory

    Science.gov (United States)

    Suliman, Mohamed; Ballal, Tarig; Kammoun, Abla; Al-Naffouri, Tareq Y.

    2016-12-01

    In this supplementary appendix we provide proofs and additional extensive simulations that complement the analysis of the main paper (constrained perturbation regularization approach for signal estimation using random matrix theory).

  8. Remarks on the Radiative Transfer Approach to Scattering of Electromagnetic Waves in Layered Random Media

    Science.gov (United States)

    2010-03-01

    Technical report AFRL-RY-HS-TR-2010-0029, an in-house effort.

  9. A hybrid partial least squares and random forest approach to ...

    African Journals Online (AJOL)

    Up to date forest inventory data has become increasingly essential for sustainable planning and management of a commercial forest plantation. Forest inventory data may be collected in the form of traditional field based approaches or using remote sensing techniques. The aim of this study was to examine the utility of the ...

  10. Rational and random approaches to adenoviral vector engineering

    NARCIS (Netherlands)

    Uil, Taco Gilles

    2011-01-01

    The overall aim of this thesis is to contribute to the engineering of more selective and effective oncolytic Adenovirus (Ad) vectors. Two general approaches are taken for this purpose: (i) genetic capsid modification to achieve Ad retargeting (Chapters 2 to 4), and (ii) directed evolution to improve

  11. An effective Hamiltonian approach to quantum random walk

    Indian Academy of Sciences (India)

    2017-02-09

    We showed that in the case of a two-step walk, the time evolution operator can effectively take a multiplicative form. In the case of a square lattice, the quantum walk has been studied computationally for different coins, and the results for the additive and the multiplicative approaches have been compared.

  12. Improved spatial accuracy of functional maps in the rat olfactory bulb using supervised machine learning approach.

    Science.gov (United States)

    Murphy, Matthew C; Poplawsky, Alexander J; Vazquez, Alberto L; Chan, Kevin C; Kim, Seong-Gi; Fukuda, Mitsuhiro

    2016-08-15

    Functional MRI (fMRI) is a popular and important tool for noninvasive mapping of neural activity. As fMRI measures the hemodynamic response, the resulting activation maps do not perfectly reflect the underlying neural activity. The purpose of this work was to design a data-driven model to improve the spatial accuracy of fMRI maps in the rat olfactory bulb. This system is an ideal choice for this investigation since the bulb circuit is well characterized, allowing for an accurate definition of activity patterns in order to train the model. We generated models for both cerebral blood volume weighted (CBVw) and blood oxygen level dependent (BOLD) fMRI data. The results indicate that the spatial accuracy of the activation maps is either significantly improved or at worst not significantly different when using the learned models compared to a conventional general linear model approach, particularly for BOLD images and activity patterns involving deep layers of the bulb. Furthermore, the activation maps computed from CBVw and BOLD data show increased agreement when using the learned models, lending more confidence to their accuracy. The models presented here could have an immediate impact on studies of the olfactory bulb, but perhaps more importantly, demonstrate the potential for similar flexible, data-driven models to improve the quality of activation maps calculated using fMRI data. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Single-molecule approach to bacterial genomic comparisons via optical mapping.

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Shiguo [Univ. Wisc.-Madison; Kile, A. [Univ. Wisc.-Madison; Bechner, M. [Univ. Wisc.-Madison; Kvikstad, E. [Univ. Wisc.-Madison; Deng, W. [Univ. Wisc.-Madison; Wei, J. [Univ. Wisc.-Madison; Severin, J. [Univ. Wisc.-Madison; Runnheim, R. [Univ. Wisc.-Madison; Churas, C. [Univ. Wisc.-Madison; Forrest, D. [Univ. Wisc.-Madison; Dimalanta, E. [Univ. Wisc.-Madison; Lamers, C. [Univ. Wisc.-Madison; Burland, V. [Univ. Wisc.-Madison; Blattner, F. R. [Univ. Wisc.-Madison; Schwartz, David C. [Univ. Wisc.-Madison

    2004-01-01

    Modern comparative genomics has been established, in part, by the sequencing and annotation of a broad range of microbial species. To gain further insights, new sequencing efforts are now dealing with the variety of strains or isolates that gives a species definition and range; however, this number vastly outstrips our ability to sequence them. Given the availability of a large number of microbial species, new whole genome approaches must be developed to fully leverage this information at the level of strain diversity that maximize discovery. Here, we describe how optical mapping, a single-molecule system, was used to identify and annotate chromosomal alterations between bacterial strains represented by several species. Since whole-genome optical maps are ordered restriction maps, sequenced strains of Shigella flexneri serotype 2a (2457T and 301), Yersinia pestis (CO 92 and KIM), and Escherichia coli were aligned as maps to identify regions of homology and to further characterize them as possible insertions, deletions, inversions, or translocations. Importantly, an unsequenced Shigella flexneri strain (serotype Y strain AMC[328Y]) was optically mapped and aligned with two sequenced ones to reveal one novel locus implicated in serotype conversion and several other loci containing insertion sequence elements or phage-related gene insertions. Our results suggest that genomic rearrangements and chromosomal breakpoints are readily identified and annotated against a prototypic sequenced strain by using the tools of optical mapping.

  14. Random matrix theory approach to vibrations near the jamming transition

    Science.gov (United States)

    Beltukov, Y. M.

    2015-03-01

    It has been shown that the dynamical matrix M describing harmonic oscillations in granular media can be represented in the form M = AAᵀ, where the rows of the matrix A correspond to the degrees of freedom of individual granules and its columns correspond to elastic contacts between granules. Such a representation of the dynamical matrix makes it possible to estimate the density of vibrational states using random matrix theory. The resulting density of vibrational states is approximately constant in a wide frequency range ω₋ < ω < ω₊, which is determined by the ratio of the number of degrees of freedom to the total number of contacts in the system, in good agreement with the results of numerical experiments.
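    A quick numerical illustration of the M = AAᵀ construction follows; this is generic random-matrix numerics under assumed sizes, not the granular-contact model itself.

```python
# Sketch: spectrum of M = A A^T for a random rectangular A, illustrating how
# the ratio of degrees of freedom (rows) to contacts (columns) shapes it.
import numpy as np

n_dof, n_contacts = 1000, 1500                    # illustrative sizes
A = np.random.default_rng(2).normal(size=(n_dof, n_contacts)) / np.sqrt(n_contacts)
M = A @ A.T                                       # positive semi-definite by construction
eigvals = np.linalg.eigvalsh(M)
omega = np.sqrt(np.clip(eigvals, 0, None))        # vibrational frequencies ω = √λ
hist, edges = np.histogram(omega, bins=50, density=True)
print("frequency range:", omega.min(), "to", omega.max())
```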

  15. Large-extent digital soil mapping approaches for total soil depth

    Science.gov (United States)

    Mulder, Titia; Lacoste, Marine; Saby, Nicolas P. A.; Arrouays, Dominique

    2015-04-01

    Total soil depth (SDt) plays a key role in supporting various ecosystem services and properties, including plant growth, water availability, and carbon stocks. Therefore, predictive mapping of SDt has been included as one of the deliverables of the GlobalSoilMap project. In this work, SDt was predicted for France following the directions of GlobalSoilMap, which requires modelling at 90 m resolution. The first method, further referred to as DM, consisted of modelling the deterministic trend in SDt using data mining, followed by a bias correction and ordinary kriging of the residuals. Considering the total surface area of France, about 540,000 km², the employed methods must be able to deal with large data sets. Therefore, a second method, multi-resolution kriging (MrK) for large datasets, was implemented. This method consisted of modelling the deterministic trend with a linear model, followed by interpolation of the residuals. For both methods, the general trend was assumed to be explained by the biotic and abiotic environmental conditions, as described by the soil-landscape paradigm. The mapping accuracy was evaluated by internal validation and by its concordance with previous soil maps. In addition, the prediction interval for DM and the confidence interval for MrK were determined. Finally, the opportunities and limitations of both approaches were evaluated. The results showed consistency in the mapped spatial patterns and a good prediction of mean values. DM was better at predicting extreme values thanks to the bias correction, and was more powerful in capturing the deterministic trend than the linear model of the MrK approach. However, MrK was found to be more straightforward and flexible in delivering spatially explicit uncertainty measures. The validation indicated that DM was more accurate than MrK. Improvements for DM may be expected by predicting soil depth classes. MrK shows potential for modelling beyond the country level, at high

  16. Letting youths choose for themselves: concept mapping as a participatory approach for program and service planning.

    Science.gov (United States)

    Minh, Anita; Patel, Sejal; Bruce-Barrett, Cindy; OʼCampo, Patricia

    2015-01-01

    Ensuring that the voices of youths are heard is key in creating services that align with the needs and goals of youths. Concept mapping, a participatory mixed-methods approach, was used to engage youths, families, and service providers in an assessment of service gaps facing youth in an underserviced neighborhood in Toronto, Canada. We describe 6 phases of concept mapping: preparation, brainstorming, sorting and rating, analysis, interpretation, and utilization. Results demonstrate that youths and service providers vary in their conceptualizations of youth service needs and priorities. Implications for service planning and for youth engagement in research are discussed.

  17. An automated approach for mapping persistent ice and snow cover over high latitude regions

    Science.gov (United States)

    Selkowitz, David J.; Forster, Richard R.

    2016-01-01

    We developed an automated approach for mapping persistent ice and snow cover (glaciers and perennial snowfields) from Landsat TM and ETM+ data across a variety of topography, glacier types, and climatic conditions at high latitudes (above ~65°N). Our approach exploits all available Landsat scenes acquired during the late summer (1 August–15 September) over a multi-year period and employs an automated cloud masking algorithm optimized for snow and ice covered mountainous environments. Pixels from individual Landsat scenes were classified as snow/ice covered or snow/ice free based on the Normalized Difference Snow Index (NDSI), and pixels consistently identified as snow/ice covered over a five-year period were classified as persistent ice and snow cover. The same NDSI and ratio of snow/ice-covered days to total days thresholds applied consistently across eight study regions resulted in persistent ice and snow cover maps that agreed closely in most areas with glacier area mapped for the Randolph Glacier Inventory (RGI), with a mean accuracy (agreement with the RGI) of 0.96, a mean precision (user’s accuracy of the snow/ice cover class) of 0.92, a mean recall (producer’s accuracy of the snow/ice cover class) of 0.86, and a mean F-score (a measure that considers both precision and recall) of 0.88. We also compared results from our approach to glacier area mapped from high spatial resolution imagery at four study regions and found similar results. Accuracy was lowest in regions with substantial areas of debris-covered glacier ice, suggesting that manual editing would still be required in these regions to achieve reasonable results. The similarity of our results to those from the RGI as well as glacier area mapped from high spatial resolution imagery suggests it should be possible to apply this approach across large regions to produce updated 30-m resolution maps of persistent ice and snow cover. In the short term, automated PISC maps can be used to rapidly
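    A compressed sketch of the per-pixel rule follows: NDSI thresholding per scene, then a persistence test across the multi-year stack. The NDSI definition is standard; the two thresholds are illustrative stand-ins for the values tuned in the study.

```python
# Sketch: persistent ice/snow classification from a stack of late-summer scenes.
import numpy as np

def ndsi(green, swir):
    """Normalized Difference Snow Index: (green - SWIR) / (green + SWIR)."""
    return (green - swir) / (green + swir + 1e-9)

def persistent_ice_snow(green_stack, swir_stack, usable_mask_stack,
                        ndsi_thresh=0.4, ratio_thresh=0.95):
    """Stacks are (n_scenes, rows, cols); usable_mask is True where cloud-free.
    Both thresholds are assumed values, not those tuned in the study."""
    snow = (ndsi(green_stack, swir_stack) > ndsi_thresh) & usable_mask_stack
    usable = usable_mask_stack.sum(axis=0).clip(min=1)
    return snow.sum(axis=0) / usable >= ratio_thresh   # persistent ice/snow map
```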

  18. An Automated Approach for Mapping Persistent Ice and Snow Cover over High Latitude Regions

    Directory of Open Access Journals (Sweden)

    David J. Selkowitz

    2015-12-01

    We developed an automated approach for mapping persistent ice and snow cover (glaciers and perennial snowfields) from Landsat TM and ETM+ data across a variety of topography, glacier types, and climatic conditions at high latitudes (above ~65°N). Our approach exploits all available Landsat scenes acquired during the late summer (1 August–15 September) over a multi-year period and employs an automated cloud masking algorithm optimized for snow and ice covered mountainous environments. Pixels from individual Landsat scenes were classified as snow/ice covered or snow/ice free based on the Normalized Difference Snow Index (NDSI), and pixels consistently identified as snow/ice covered over a five-year period were classified as persistent ice and snow cover. The same NDSI and ratio of snow/ice-covered days to total days thresholds applied consistently across eight study regions resulted in persistent ice and snow cover maps that agreed closely in most areas with glacier area mapped for the Randolph Glacier Inventory (RGI), with a mean accuracy (agreement with the RGI) of 0.96, a mean precision (user's accuracy of the snow/ice cover class) of 0.92, a mean recall (producer's accuracy of the snow/ice cover class) of 0.86, and a mean F-score (a measure that considers both precision and recall) of 0.88. We also compared results from our approach to glacier area mapped from high spatial resolution imagery at four study regions and found similar results. Accuracy was lowest in regions with substantial areas of debris-covered glacier ice, suggesting that manual editing would still be required in these regions to achieve reasonable results. The similarity of our results to those from the RGI as well as glacier area mapped from high spatial resolution imagery suggests it should be possible to apply this approach across large regions to produce updated 30-m resolution maps of persistent ice and snow cover. In the short term, automated PISC maps can be used to

  19. Information rich mapping requirement to product architecture through functional system deployment: The multi entity domain approach

    DEFF Research Database (Denmark)

    Hauksdóttir, Dagny; Mortensen, Niels Henrik

    2017-01-01

    for mapping requirements to architecture. These approaches do not fully support the steps and information created during product design synthesis. Design specifications used to guide the design are often documented in text-based documents, outside the design models. This results in a lack of traceability, which… The results suggest that it is possible to present design information in structural domain views, presenting more elaborate information about the design synthesis than provided by previous approaches. However, further validation in a practical project setting is required to validate the approach.

  20. Fractional calculus approach to the statistical characterization of random variables and vectors

    Science.gov (United States)

    Cottone, Giulio; Di Paola, Mario; Metzler, Ralf

    2010-03-01

    Fractional moments have been investigated by many authors to represent the density of univariate and bivariate random variables in different contexts. Fractional moments are indeed important when the density of the random variable has inverse power-law tails and, consequently, it lacks integer order moments. In this paper, starting from the Mellin transform of the characteristic function and by fractional calculus method we present a new perspective on the statistics of random variables. Introducing the class of complex moments, that include both integer and fractional moments, we show that every random variable can be represented within this approach, even if its integer moments diverge. Applications to the statistical characterization of raw data and in the representation of both random variables and vectors are provided, showing that the good numerical convergence makes the proposed approach a good and reliable tool also for practical data analysis.
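    For a positive random variable, the standard Mellin-transform identity underlying this line of work is the following; the paper's complex-moment machinery generalizes it to variables on the whole real line via the characteristic function.

```latex
% Mellin transform of the density equals the complex moment of order s-1,
% with the usual Mellin inversion recovering the density:
\mathcal{M}\{f_X\}(s) \;=\; \int_0^{\infty} x^{\,s-1} f_X(x)\,dx
  \;=\; \mathbb{E}\!\left[X^{\,s-1}\right],
\qquad
f_X(x) \;=\; \frac{1}{2\pi i}\int_{c-i\infty}^{c+i\infty}
  x^{-s}\,\mathcal{M}\{f_X\}(s)\,ds .
```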

  1. Exploring multicollinearity using a random matrix theory approach.

    Science.gov (United States)

    Feher, Kristen; Whelan, James; Müller, Samuel

    2012-01-01

    Clustering of gene expression data is often done with the latent aim of dimension reduction, by finding groups of genes that have a common response to potentially unknown stimuli. However, what is poorly understood to date is the behaviour of a low dimensional signal embedded in high dimensions. This paper introduces a multicollinear model which is based on random matrix theory results, and shows potential for the characterisation of a gene cluster's correlation matrix. This model projects a one dimensional signal into many dimensions and is based on the spiked covariance model, but rather characterises the behaviour of the corresponding correlation matrix. The eigenspectrum of the correlation matrix is empirically examined by simulation, under the addition of noise to the original signal. The simulation results are then used to propose a dimension estimation procedure of clusters from data. Moreover, the simulation results warn against considering pairwise correlations in isolation, as the model provides a mechanism whereby a pair of genes with `low' correlation may simply be due to the interaction of high dimension and noise. Instead, collective information about all the variables is given by the eigenspectrum.
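    A small simulation in the spirit of the model described above: project a one-dimensional signal into p dimensions, add noise, and inspect the correlation-matrix eigenspectrum. All sizes and the noise level are arbitrary illustration choices.

```python
# Sketch: eigenspectrum of a correlation matrix with an embedded 1-D signal.
import numpy as np

rng = np.random.default_rng(3)
n, p, noise = 500, 50, 0.8
signal = rng.normal(size=(n, 1))                   # shared one-dimensional driver
loadings = rng.normal(size=(1, p))
X = signal @ loadings + noise * rng.normal(size=(n, p))

corr = np.corrcoef(X, rowvar=False)
eigs = np.sort(np.linalg.eigvalsh(corr))[::-1]
print("top eigenvalue:", eigs[0])                  # carries the 1-D signal
print("next few:", eigs[1:5])                      # remainder behaves like noise bulk
```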

  2. Mapping CORINE Land Cover from Sentinel-1A SAR and SRTM Digital Elevation Model Data using Random Forests

    Directory of Open Access Journals (Sweden)

    Heiko Balzter

    2015-11-01

    The European CORINE land cover mapping scheme is a standardized classification system with 44 land cover and land use classes. It is used by the European Environment Agency to report large-scale land cover change with a minimum mapping unit of 5 ha every six years and is operationally mapped by its member states. The most commonly applied method to map CORINE land cover change is visual interpretation of optical/near-infrared satellite imagery. The Sentinel-1A satellite carries a C-band Synthetic Aperture Radar (SAR) and was launched in 2014 by the European Space Agency as the first operational Copernicus mission. This study is the first investigation of Sentinel-1A for CORINE land cover mapping. Two of the first Sentinel-1A images acquired during its ramp-up phase in May and December 2014 over Thuringia in Germany are analysed. 27 hybrid level 2/3 CORINE classes are defined; 17 of these were present at the study site and classified based on a stratified random sample of training pixels from the polygon-eroded CORINE 2006 map. Sentinel-1A logarithmic radar backscatter at HH and HV polarisation (May acquisition), VV and VH polarisation (December acquisition), and the HH image texture are used as input bands to the classification. In addition, a Digital Terrain Model (DTM), a Canopy Height Model (CHM), and slope and aspect maps from the Shuttle Radar Topography Mission (SRTM) are used as input bands to account for geomorphological features of the landscape. In future, elevation data will be delivered for areas with sufficiently high coherence from the Sentinel-1A Interferometric Wide-Swath Mode itself. When augmented by elevation data from radar interferometry, Sentinel-1A is able to discriminate several CORINE land cover classes, making it useful for monitoring of cloud-covered regions. A bistatic Sentinel-1 Convoy mission would enable single-pass interferometric acquisitions without temporal decorrelation.

  3. Clustering of the Self-Organizing Map based Approach in Induction Machine Rotor Faults Diagnostics

    Directory of Open Access Journals (Sweden)

    Ahmed TOUMI

    2009-12-01

    Self-Organizing Maps (SOM) are an excellent method for analyzing multidimensional data. SOM-based classification is attractive due to its unsupervised learning and topology-preserving properties. In this paper, the performance of self-organizing methods is investigated in induction motor rotor fault detection and severity evaluation. The SOM is based on motor current signature analysis (MCSA). An agglomerative hierarchical algorithm using Ward's method is applied to automatically divide the map into interpretable groups of map units that correspond to clusters in the input data. The results obtained with this approach make it possible to detect a rotor bar fault directly from the visualization results. The system is also able to estimate the extent of rotor faults.

  4. Random matrix approach to the dynamics of stock inventory variations

    Science.gov (United States)

    Zhou, Wei-Xing; Mu, Guo-Hua; Kertész, János

    2012-09-01

    It is well accepted that investors can be classified into groups owing to distinct trading strategies, which forms the basic assumption of many agent-based models for financial markets when agents are not zero-intelligent. However, empirical tests of these assumptions are still very rare due to the lack of order flow data. Here we adopt the order flow data of Chinese stocks to tackle this problem by investigating the dynamics of inventory variations for individual and institutional investors that contain rich information about the trading behavior of investors and have a crucial influence on price fluctuations. We find that the distributions of the cross-correlation coefficients C_ij have power-law forms in the bulk that are followed by exponential tails, and there are more positive coefficients than negative ones. In addition, it is more likely that two individuals or two institutions have a stronger inventory variation correlation than one individual and one institution. We find that the largest and the second largest eigenvalues (λ_1 and λ_2) of the correlation matrix cannot be explained by random matrix theory and the projections of investors' inventory variations on the first eigenvector u(λ_1) are linearly correlated with stock returns, where individual investors play a dominating role. The investors are classified into three categories based on the cross-correlation coefficients C_VR between inventory variations and stock returns. A strong Granger causality is unveiled from stock returns to inventory variations, which means that a large proportion of individuals hold the reversing trading strategy and a small part of individuals hold the trending strategy. Our empirical findings have scientific significance in the understanding of investors' trading behavior and in the construction of agent-based models for emerging stock markets.

  5. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

    This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the $\binom{n}{k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed-form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to select properly $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can be thus implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and sustain the efficiency of the proposed blind methods in reaching the performances of channel-aware algorithms.

  6. Combining participatory and socioeconomic approaches to map fishing effort in small-scale fisheries.

    Directory of Open Access Journals (Sweden)

    Lauric Thiault

    Mapping the spatial allocation of fishing effort while including key stakeholders in the decision-making process is essential for effective fisheries management, but is difficult to implement in complex small-scale fisheries that are diffuse, informal, and multifaceted. Here we present a standardized but flexible approach that combines participatory mapping approaches (fishers' spatial preference for fishing grounds, or fishing suitability) with socioeconomic approaches (spatial extrapolation of social surrogates, or fishing capacity) to generate a comprehensive map of predicted fishing effort. Using a real-world case study in Moorea, French Polynesia, we showed that high predicted fishing effort is not simply located in front of, or close to, the main fishing villages with high dependence on marine resources; it also occurs where resource dependency is moderate, and generally in near-shore areas and reef passages. The integrated approach we developed can help address the recurrent lack of spatial data on fishing effort through the participation of key stakeholders (i.e., resource users). It can be tailored to a wide range of social, ecological, and data availability contexts, and should help improve place-based management of natural resources.

  7. A DEM-based approach for large-scale floodplain mapping in ungauged watersheds

    Science.gov (United States)

    Jafarzadegan, Keighobad; Merwade, Venkatesh

    2017-07-01

    Binary threshold classifiers are a simple form of supervised classification that can be used in floodplain mapping. In these methods, a given watershed is examined as a grid of cells, each carrying a particular morphologic value. A reference map is a grid of cells labeled as flood and non-flood from hydraulic modeling or remote sensing observations. Using the reference map, a threshold on the morphologic feature is determined to label the unknown cells as flood or non-flood (binary classification). The main limitation of these methods is the threshold transferability assumption, in which homogeneous geomorphological and hydrological behavior is assumed for the entire region and the same threshold derived from the reference map (training area) is used for other locations (ungauged watersheds) inside the study area. In order to overcome this limitation and account for threshold variability inside a large region, regression modeling is used in this paper to predict the threshold by relating it to watershed characteristics. Application of this approach to North Carolina shows that the threshold is related to main stream slope, average watershed elevation, and average watershed slope. Using the Fitness (F) and Correct (C) criteria of C > 0.9 and F > 0.6, results show the threshold prediction and the corresponding floodplain for the 100-year design flow are comparable to those from the Federal Emergency Management Agency's (FEMA) Flood Insurance Rate Maps (FIRMs) in the region. However, the floodplains from the proposed model are underpredicted and overpredicted in the flat (average watershed slope 20%). Overall, the proposed approach provides an alternative way of mapping floodplains in data-scarce regions.
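    A toy version of the binary threshold classifier and its evaluation follows. The overlap-based definitions of C and F used here are common choices and are assumed, since the abstract does not spell out the paper's exact formulas; the feature field and reference map are synthetic.

```python
# Sketch: threshold a morphologic feature grid, then score against a reference
# map. Assumed definitions: C = hits / reference flood cells, F = hits / union.
import numpy as np

def classify(feature, threshold):
    return feature < threshold                     # low feature value -> flood

def scores(pred, ref):
    hits = np.logical_and(pred, ref).sum()
    c = hits / ref.sum()                           # Correct criterion (assumed form)
    f = hits / np.logical_or(pred, ref).sum()      # Fitness criterion (assumed form)
    return c, f

rng = np.random.default_rng(4)
feature = rng.uniform(0, 10, size=(200, 200))      # e.g. a height-above-stream surrogate
reference = feature + rng.normal(0, 1, feature.shape) < 3.0   # stand-in reference map
for t in (2.0, 3.0, 4.0):
    print(t, scores(classify(feature, t), reference))
```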

  8. Automated mapping of glacial overdeepenings beneath contemporary ice sheets: Approaches and potential applications

    Science.gov (United States)

    Patton, Henry; Swift, Darrel A.; Clark, Chris D.; Livingstone, Stephen J.; Cook, Simon J.; Hubbard, Alun

    2015-03-01

    Awareness is growing on the significance of overdeepenings in ice sheet systems. However, a complete understanding of overdeepening formation is lacking, meaning observations of overdeepening location and morphometry are urgently required to motivate process understanding. Subject to the development of appropriate mapping approaches, high resolution subglacial topography data sets covering the whole of Antarctica and Greenland offer significant potential to acquire such observations and to relate overdeepening characteristics to ice sheet parameters. We explore a possible method for mapping overdeepenings beneath the Antarctic and Greenland ice sheets and illustrate a potential application of this approach by testing a possible relationship between overdeepening elongation ratio and ice sheet flow velocity. We find that hydrological and terrain filtering approaches are unsuited to mapping overdeepenings and develop a novel rule-based GIS methodology that delineates overdeepening perimeters by analysis of closed-contour properties. We then develop GIS procedures that provide information on overdeepening morphology and topographic context. Limitations in the accuracy and resolution of bed-topography data sets mean that application to glaciological problems requires consideration of quality-control criteria to (a) remove potentially spurious depressions and (b) reduce uncertainties that arise from the inclusion of depressions of nonglacial origin, or those in regions where empirical data are sparse. To address the problem of overdeepening elongation, potential quality control criteria are introduced; and discussion of this example serves to highlight the limitations that mapping approaches - and applications of such approaches - must confront. We predict that improvements in bed-data quality will reduce the need for quality control procedures and facilitate increasingly robust insights from empirical data.

  9. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce.

    Science.gov (United States)

    Idris, Muhammad; Hussain, Shujaat; Siddiqi, Muhammad Hameed; Hassan, Waseem; Syed Muhammad Bilal, Hafiz; Lee, Sungyoung

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real time and streaming data in variety of formats. These characteristics give rise to challenges in its modeling, computation, and processing. Hadoop MapReduce (MR) is a well known data-intensive distributed processing framework using the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement.

  10. Benchmarking for research-related competencies - a curricular mapping approach at medical faculties in Germany.

    Science.gov (United States)

    Lammerding-Koeppel, Maria; Fritze, Olaf; Giesler, Marianne; Narciss, Elisabeth; Steffens, Sandra; Wosnik, Annette; Griewatz, Jan

    2017-11-16

    Internationally, scientific and research-related competencies need to be sufficiently targeted as core outcomes in many undergraduate medical curricula. Since 2015, standards have been recommended for Germany in the National Competency-based Learning Objective Catalogue in Medicine (NKLM). The aim of this study is to develop a multi-center mapping approach for curricular benchmarking against national standards and against other medical faculties. A total of 277 faculty members from four German medical faculties mapped their local curricula against the scientific and research-related NKLM objectives, using consented procedures, metrics, and tools. The number of mapping citations of each objective is used as an indicator of its weighting in the local curriculum. Achieved competency levels after the five-year education are compared. All four programs fulfill the NKLM standards, with each emphasizing different sub-competencies explicitly in writing (Scholar: 17-41% of all courses; Medical Scientific Skills: 14-37% of all courses). Faculties show major or full agreement in objective weighting: Scholar 44%, scientific skills 79%. The given NKLM competency level is met or even outperformed in 78-100% of the courses. The multi-center mapping approach provides an informative dataset allowing curricular diagnosis by external benchmarking and guidance for the optimization of local curricula.

  11. Mobile Ground-Based Radar Sensor for Localization and Mapping: An Evaluation of two Approaches

    Directory of Open Access Journals (Sweden)

    Damien Vivet

    2013-08-01

    This paper is concerned with robotic applications using a ground-based radar sensor for simultaneous localization and mapping problems. In mobile robotics, radar technology is interesting because of its long range and the robustness of radar waves to atmospheric conditions, making these sensors well suited for extended outdoor robotic applications. Two localization and mapping approaches using data obtained from a 360° field-of-view microwave radar sensor are presented and compared. The first is a trajectory-oriented simultaneous localization and mapping technique, which makes no landmark assumptions and avoids the data association problem. The estimation of the ego-motion makes use of the Fourier-Mellin transform for registering radar images in a sequence, from which the rotation and translation of the sensor motion can be estimated. The second approach exploits a consequence of using a rotating range sensor in high-speed robotics: in such a situation, movement combinations create distortions in the collected data. Velocimetry is achieved here by explicitly analysing these measurement distortions. As a result, the trajectory of the vehicle and then the radar map of outdoor environments can be obtained. An evaluation of the experimental results obtained by the two methods is presented on real-world data from a vehicle moving at 30 km/h over a 2.5 km course.
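    The translational core of Fourier-Mellin registration is phase correlation; a minimal sketch follows (recovering rotation and scale via the log-polar resampling of the full Fourier-Mellin method is omitted for brevity).

```python
# Sketch: phase correlation for the translational part of image registration.
import numpy as np

def phase_correlation_shift(img_a, img_b):
    """Estimate the integer (dy, dx) shift taking img_b to img_a."""
    Fa, Fb = np.fft.fft2(img_a), np.fft.fft2(img_b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12                 # keep only the phase
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = img_a.shape                             # wrap indices past Nyquist
    if dy > h // 2: dy -= h
    if dx > w // 2: dx -= w
    return dy, dx

a = np.zeros((64, 64)); a[20:30, 20:30] = 1.0
b = np.roll(np.roll(a, 5, axis=0), -3, axis=1)     # shifted copy of a
print(phase_correlation_shift(b, a))               # expect roughly (5, -3)
```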

  12. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    Science.gov (United States)

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real time and streaming data in variety of formats. These characteristics give rise to challenges in its modeling, computation, and processing. Hadoop MapReduce (MR) is a well known data-intensive distributed processing framework using the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223

  13. Mapping urban impervious surfaces from an airborne hyperspectral imagery using the object-oriented classification approach

    Directory of Open Access Journals (Sweden)

    Aguejdad Rahim

    2017-01-01

    The objective of this research is to explore the capabilities of hyperspectral imagery in mapping urban impervious objects and identifying surface materials using an object-oriented approach. The application is conducted for the city of Toulouse (France) within the HYEP research project, which is concerned with using hyperspectral imagery for environmental urban planning. The method uses multi-resolution segmentation and classification algorithms. The first results highlight the high potential of hyperspectral imagery for land cover mapping of the urban environment, especially the extraction of impervious surfaces. They also illustrate that the object-oriented approach, by means of a fuzzy logic classifier, yields promising results in distinguishing the main roofing materials based only on spectral information. In contrast to red clay tiles and metal roofs, which are easily identified, concrete, gravel, and asphalt roofs are still confused with roads.

  14. Conceptualizing Stakeholders' Perceptions of Ecosystem Services: A Participatory Systems Mapping Approach

    Science.gov (United States)

    Lopes, Rita; Videira, Nuno

    2015-12-01

    A participatory system dynamics modelling approach is advanced to support conceptualization of the feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes a systems mapping workshop and follow-up tasks aimed at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

  15. An efficient approach to the travelling salesman problem using self-organizing maps.

    Science.gov (United States)

    Vieira, Frederico Carvalho; Dória Neto, Adrião Duarte; Costa, José Alfredo Ferreira

    2003-04-01

    This paper presents an approach to the well-known Travelling Salesman Problem (TSP) using Self-Organizing Maps (SOM). The SOM algorithm provides interesting topological information about its neuron configuration in Cartesian space, which can be used to solve optimization problems. Aspects of initialization, parameter adaptation, and complexity analysis of the proposed SOM-based algorithm are discussed. The results show an average deviation of 3.7% from the optimal tour length for a set of 12 TSP instances.
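
    The elastic-ring variant of the SOM heuristic for the TSP is compact enough to sketch. The code below is a generic illustration of the technique, not the authors' algorithm; the neuron count, learning rate and decay schedule are arbitrary choices.

```python
# SOM "elastic ring" for the TSP: a closed ring of neurons is repeatedly
# pulled toward randomly chosen cities; reading the cities off in ring
# order yields a tour.
import numpy as np

def som_tsp(cities, n_epochs=2000, seed=0):
    rng = np.random.default_rng(seed)
    n = 3 * len(cities)                       # ring of neurons in the unit square
    neurons = rng.random((n, 2))
    lr, radius = 0.8, n / 2
    for _ in range(n_epochs):
        city = cities[rng.integers(len(cities))]
        winner = np.argmin(np.linalg.norm(neurons - city, axis=1))
        d = np.abs(np.arange(n) - winner)     # distance along the ring,
        d = np.minimum(d, n - d)              # wrapping around the cycle
        h = np.exp(-d**2 / (2 * max(radius, 1) ** 2))
        neurons += lr * h[:, None] * (city - neurons)
        lr, radius = lr * 0.9997, radius * 0.9997
    winners = [np.argmin(np.linalg.norm(neurons - c, axis=1)) for c in cities]
    return np.argsort(winners)                # visit cities in ring order

cities = np.random.default_rng(1).random((12, 2))
print(som_tsp(cities))
```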

  16. An Approach to Dynamic Fusion of the Knowledge Maps of an Activities Process: Application on Healthcare

    OpenAIRE

    Menaouer Brahami; Baghdad Atmani; Nada Matta

    2015-01-01

    The interest of companies in a greater valuation of their information, knowledge and competency is increasing. These companies hold knowledge capital (tacit and explicit) that is often poorly exploited. These information resources include the knowledge and information useful and necessary for the execution of business processes, which can be captured and formalized by using knowledge engineering methods, such as knowledge mapping techniques. In this context, the authors present a new approach to d...

  17. Mapping audiovisual translation investigations: research approaches and the role of technology

    OpenAIRE

    Matamala, Anna

    2017-01-01

    This article maps audiovisual translation research by analysing in a contrastive way the abstracts presented at three audiovisual translation conferences ten years ago and nowadays. The comparison deals with the audiovisual transfer modes and topics under discussion, and the approach taken by the authors in their abstracts. The article then shifts the focus to the role of technology in audiovisual translation research, as it is considered an element that is impacting and will continue to impa...

  18. Area-Based Approach for Mapping and Monitoring Riverine Vegetation Using Mobile Laser Scanning

    Directory of Open Access Journals (Sweden)

    Harri Kaartinen

    2013-10-01

    Full Text Available Vegetation plays an important role in stabilizing the soil and decreasing fluvial erosion. In certain cases, vegetation increases the accumulation of fine sediments. Efficient and accurate methods are required for mapping and monitoring changes in the fluvial environment. Here, we develop an area-based approach for mapping and monitoring the vegetation structure along a river channel. First, a 2 × 2 m grid was placed over the study area. Metrics describing vegetation density and height were derived from mobile laser-scanning (MLS) data and used to predict the variables in the nearest-neighbor (NN) estimations. The training data were obtained from aerial images. The vegetation cover type was classified into the following four classes: bare ground, field layer, shrub layer, and canopy layer. Multi-temporal MLS data sets were applied to the change detection of riverine vegetation. This approach successfully classified vegetation cover with an overall classification accuracy of 72.6%; classification accuracies for bare ground, field layer, shrub layer, and canopy layer were 79.5%, 35.0%, 45.2% and 100.0%, respectively. Vegetation changes were detected primarily in outer river bends. These results proved that our approach was suitable for mapping riverine vegetation.
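
    The area-based workflow reduces to three steps: compute per-cell metrics from the laser points, train a nearest-neighbour classifier on cells labelled from the aerial images, and predict the remaining cells. A hedged toy sketch follows; the features, training cells and thresholds are invented, and scikit-learn's k-NN stands in for the paper's NN estimation.

```python
# Per-cell density/height metrics feeding a nearest-neighbour classifier.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

CLASSES = ["bare ground", "field layer", "shrub layer", "canopy layer"]

def cell_metrics(heights):
    """Simple metrics for the laser returns inside one 2 x 2 m grid cell."""
    h = np.asarray(heights, dtype=float)
    return [h.mean(), h.max(), h.std(), (h > 0.5).mean()]

# Toy training cells, one per class, labelled from aerial images.
train_cells = [[0.0, 0.1], [0.3, 0.6], [1.5, 2.0], [8.0, 12.0]]
X_train = np.array([cell_metrics(c) for c in train_cells])
y_train = np.arange(len(CLASSES))

knn = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)
print(CLASSES[knn.predict([cell_metrics([7.5, 10.0])])[0]])  # "canopy layer"
```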

  19. Putting people on the map through an approach that integrates social data in conservation planning.

    Science.gov (United States)

    Stephanson, Sheri L; Mascia, Michael B

    2014-10-01

    Conservation planning is integral to strategic and effective operations of conservation organizations. Drawing upon biological sciences, conservation planning has historically made limited use of social data. We offer an approach for integrating data on social well-being into conservation planning that captures and places into context the spatial patterns and trends in human needs and capacities. This hierarchical approach provides a nested framework for characterizing and mapping data on social well-being in 5 domains: economic well-being, health, political empowerment, education, and culture. These 5 domains each have multiple attributes; each attribute may be characterized by one or more indicators. Through existing or novel data that display spatial and temporal heterogeneity in social well-being, conservation scientists, planners, and decision makers may measure, benchmark, map, and integrate these data within conservation planning processes. Selecting indicators and integrating these data into conservation planning is an iterative, participatory process tailored to the local context and planning goals. Social well-being data complement biophysical and threat-oriented social data within conservation planning processes to inform decisions regarding where and how to conserve biodiversity, to provide a structure for exploring socioecological relationships, and to foster adaptive management. Building upon existing conservation planning methods and insights from multiple disciplines, this approach to putting people on the map can readily merge with current planning practices to facilitate more rigorous decision making. © 2014 Society for Conservation Biology.

  20. A Systematic Mapping on Supporting Approaches for Requirements Traceability in the Context of Software Projects

    Directory of Open Access Journals (Sweden)

    MALCHER, P R.C.

    2015-12-01

    Full Text Available Requirements traceability is seen as a quality factor with regard to software development, being present in standards and quality models. In this context, several techniques, models, frameworks and tools have been used to support it. Thus, the purpose of this paper is to present a systematic mapping carried out in order to find in the literature approaches that support requirements traceability in the context of software projects, and to categorize the data found in order to demonstrate, by means of a reliable, accurate and auditable method, how this area has developed and which main approaches are used to implement it.

  1. An assessment of a collaborative mapping approach for exploring land use patterns for several European metropolises

    Science.gov (United States)

    Jokar Arsanjani, Jamal; Vaz, Eric

    2015-03-01

    Until recently, land surveys and digital interpretation of remotely sensed imagery have been used to generate land use inventories. These techniques, however, are often cumbersome and costly, incurring large technical and temporal costs. The technological advances of web 2.0 have brought a wide array of technological achievements, stimulating the participatory role in collaborative and crowdsourced mapping products. This has been fostered by GPS-enabled devices and accessible tools that enable visual interpretation of high resolution satellite images/air photos provided in collaborative mapping projects. Such technologies offer an integrative approach to geography by means of promoting public participation and allowing accurate assessment and classification of land use as well as geographical features. OpenStreetMap (OSM) has supported the evolution of such techniques, contributing to the existence of a large inventory of spatial land use information. This paper explores the introduction of this novel participatory phenomenon for land use classification in Europe's metropolitan regions. We adopt a positivistic approach to comparatively assess the accuracy of OSM contributions for land use classification in seven large European metropolitan regions. Thematic accuracy and degree of completeness of OSM data were compared to available Global Monitoring for Environment and Security Urban Atlas (GMESUA) datasets for the chosen metropolises. We further situate our land use findings within a novel framework for geography, arguing that volunteered geographic information (VGI) sources are of great benefit for land use mapping depending on location and degree of VGI dynamism and offer a great alternative to traditional mapping techniques for metropolitan regions throughout Europe. Evaluation of several land use types at the local level suggests that a number of OSM classes (such as anthropogenic land use, agricultural and some natural environment

  2. MAP3D: a media processor approach for high-end 3D graphics

    Science.gov (United States)

    Darsa, Lucia; Stadnicki, Steven; Basoglu, Chris

    1999-12-01

    Equator Technologies, Inc. has used a software-first approach to produce several programmable and advanced VLIW processor architectures that have the flexibility to run both traditional systems tasks and an array of media-rich applications. For example, Equator's MAP1000A is the world's fastest single-chip programmable signal and image processor targeted for digital consumer and office automation markets. The Equator MAP3D is a proposal for the architecture of the next generation of the Equator MAP family. The MAP3D is designed to achieve high-end 3D performance and a variety of customizable special effects by combining special graphics features with a high-performance floating-point and media processor architecture. As a programmable media processor, it offers the advantages of a completely configurable 3D pipeline--allowing developers to experiment with different algorithms and to tailor their pipeline to achieve the highest performance for a particular application. With the support of Equator's advanced C compiler and toolkit, MAP3D programs can be written in a high-level language. This allows the compiler to successfully find and exploit any parallelism in a programmer's code, thus decreasing the time to market of a given application. The ability to run an operating system makes it possible to run concurrent applications in the MAP3D chip, such as video decoding while executing the 3D pipelines, so that integration of applications is easily achieved--using real-time decoded imagery for texturing 3D objects, for instance. This novel architecture enables an affordable, integrated solution for high performance 3D graphics.

  3. Information rich mapping requirement to product architecture through functional system deployment: The multi entity domain approach

    DEFF Research Database (Denmark)

    Hauksdóttir, Dagny; Mortensen, Niels Henrik

    2017-01-01

    Successful transformation of design information from customer requirements to design implementation is critical for engineering design. As systems become complex, tracking how customer requirements are implemented becomes difficult. Existing approaches suggest so-called domain modelling for mapping requirements to architecture. These approaches do not fully support the steps and information created during product design synthesis. Design specifications used to guide the design are often documented in text-based documents, outside the design models. This results in a lack of traceability, which may impede the ability to evolve, maintain or reuse systems. In this paper the Multi Entity Domain Approach (MEDA) is presented. The approach combines different design information within the domain views, incorporates both software and hardware design and supports iterative requirements definition...

  4. Whole-Genome Restriction Mapping by "Subhaploid"-Based RAD Sequencing: An Efficient and Flexible Approach for Physical Mapping and Genome Scaffolding.

    Science.gov (United States)

    Dou, Jinzhuang; Dou, Huaiqian; Mu, Chuang; Zhang, Lingling; Li, Yangping; Wang, Jia; Li, Tianqi; Li, Yuli; Hu, Xiaoli; Wang, Shi; Bao, Zhenmin

    2017-07-01

    Assembly of complex genomes using short reads remains a major challenge, which usually yields highly fragmented assemblies. Generation of ultradense linkage maps is promising for anchoring such assemblies, but traditional linkage mapping methods are hindered by the infrequency and unevenness of meiotic recombination that limit attainable map resolution. Here we develop a sequencing-based "in vitro" linkage mapping approach (called RadMap), where chromosome breakage and segregation are realized by generating hundreds of "subhaploid" fosmid/bacterial-artificial-chromosome clone pools, and by restriction site-associated DNA sequencing of these clone pools to produce an ultradense whole-genome restriction map to facilitate genome scaffolding. A bootstrap-based minimum spanning tree algorithm is developed for grouping and ordering of genome-wide markers and is implemented in a user-friendly, integrated software package (AMMO). We perform extensive analyses to validate the power and accuracy of our approach in the model plant Arabidopsis thaliana and in humans. We also demonstrate the utility of RadMap for enhancing the contiguity of a variety of whole-genome shotgun assemblies generated using either short Illumina reads (300 bp) or long PacBio reads (6-14 kb), with up to 15-fold improvement of N50 (∼816 kb-3.7 Mb) and high scaffolding accuracy (98.1-98.5%). RadMap outperforms BioNano and Hi-C when the input assembly is highly fragmented (contig N50 = 54 kb). RadMap can capture wide-range contiguity information and provides an efficient and flexible tool for high-resolution physical mapping and scaffolding of highly fragmented assemblies. Copyright © 2017 Dou et al.
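
    The grouping/ordering core of such a pipeline, stripped of the bootstrap layer and the AMMO packaging, is a minimum spanning tree over a marker distance matrix followed by a tree traversal. A minimal scipy sketch with toy distances:

```python
# Order markers by building an MST over pairwise distances and walking it.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, depth_first_order

dist = np.array([[0, 2, 9, 4],        # toy pairwise marker distances,
                 [2, 0, 6, 1],        # e.g. derived from co-segregation
                 [9, 6, 0, 5],        # across "subhaploid" clone pools
                 [4, 1, 5, 0]], dtype=float)

mst = minimum_spanning_tree(dist)               # sparse MST of the marker graph
order, _ = depth_first_order(mst, i_start=0, directed=False)
print(order)                                    # a linear ordering of markers
```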

  5. The Effectiveness of Web-Based Asthma Self-Management System, My Asthma Portal (MAP): A Pilot Randomized Controlled Trial.

    Science.gov (United States)

    Ahmed, Sara; Ernst, Pierre; Bartlett, Susan J; Valois, Marie-France; Zaihra, Tasneem; Paré, Guy; Grad, Roland; Eilayyan, Owis; Perreault, Robert; Tamblyn, Robyn

    2016-12-01

    Whether Web-based technologies can improve disease self-management is uncertain. My Asthma Portal (MAP) is a Web-based self-management support system that couples evidence-based behavioral change components (self-monitoring of symptoms, physical activity, and medication adherence) with real-time monitoring, feedback, and support from a nurse case manager. The aim of this study was to compare the impact of 6 months of access to a Web-based asthma self-management patient portal linked to a case-management system (MAP) with that of usual care on asthma control and quality of life. A multicenter, parallel, 2-arm, pilot, randomized controlled trial was conducted with 100 adults with a confirmed diagnosis of asthma from 2 specialty clinics. Asthma control was measured using an algorithm based on overuse of fast-acting bronchodilators and emergency department visits, and asthma-related quality of life was assessed using the Mini-Asthma Quality of Life Questionnaire (MAQLQ). Secondary mediating outcomes included asthma symptoms, depressive symptoms, self-efficacy, and beliefs about medication. Process evaluations were also included. A total of 49 individuals were randomized to MAP and 51 to usual care. Compared with usual care, participants in the intervention group reported significantly higher asthma quality of life (mean change 0.61, 95% CI 0.03 to 1.19), and the change in asthma quality of life for the intervention group between baseline and 3 months (mean change 0.66, 95% CI 0.35 to 0.98) was not seen in the control group. No significant differences in asthma quality of life were found between the intervention and control groups at 6 (mean change 0.46, 95% CI -0.12 to 1.05) and 9 months (mean change 0.39, 95% CI -0.2 to 0.98). For poor control status, there was no significant effect of group, time, or group by time. For all self-reported measures, the intervention group had a significantly higher proportion of individuals, demonstrating a minimal clinically

  6. Extreme rainfall distribution mapping: Comparison of two approaches in West Africa

    Science.gov (United States)

    Panthou, G.; Vischel, T.; Lebel, T.; Blanchet, J.; Quantin, G.; Ali, A.

    2012-12-01

    In a world where populations are increasingly exposed to natural hazards, extreme rainfall mapping remains an important subject of research. Extreme rainfall maps are required for both flood risk management and civil engineering structure design, the challenge being to take into account the local information provided by point rainfall series as well as the necessity of some regional coherency. Such coherency is not guaranteed when extreme value distributions are fitted separately to individual maximum rainfall series. Two approaches based on extreme value theory (Block Maxima Analysis) are compared here, with an application to extreme rainfall mapping in West Africa. Annual daily rainfall maxima are extracted from rain-gauge series and modeled over the study region by GEV (Generalized Extreme Value) distributions. The two approaches are the following: (i) the Local Fit and Interpolation (LFI) approach, which consists of a spatial interpolation of the GEV distribution parameters estimated independently at each rain-gauge series; (ii) the Spatial Maximum Likelihood Estimation (SMLE), which directly estimates the GEV distribution over the entire region by a single maximum likelihood fit, jointly using all measurements combined with spatial covariates. Five LFI and three SMLE methods are considered, using the information provided by 126 daily rainfall series covering the period 1950-1990. The methods are first evaluated in calibration. Then the predictive skill and robustness are assessed through cross validation and an independent network validation process. The SMLE approach, especially when using the mean annual rainfall as a covariate, performs better for most of the computed scores. Using a reference series of 104 years of daily data recorded at Niamey (Niger), it is also shown that the SMLE approach can deal more efficiently with the effect of local outliers by using the spatial information provided by nearby stations.
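
    The LFI building block, a local GEV fit to one station's annual maxima, takes a few lines with scipy; the SMLE step differs in fitting all stations jointly with spatial covariates. A sketch on synthetic data follows (note that scipy's genextreme parameterises the shape with the opposite sign to the usual GEV xi):

```python
# Local GEV fit to annual daily-rainfall maxima and a 100-year return level.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
annual_maxima = genextreme.rvs(c=-0.1, loc=60, scale=15, size=41,
                               random_state=rng)   # synthetic 1950-1990 maxima

c, loc, scale = genextreme.fit(annual_maxima)      # maximum likelihood fit
t100 = genextreme.ppf(1 - 1 / 100, c, loc, scale)  # 100-year daily rainfall
print(f"estimated 100-year return level: {t100:.1f} mm")
```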

  7. Geologic Map of the Olympia Cavi Region of Mars (MTM 85200): A Summary of Tactical Approaches

    Science.gov (United States)

    Skinner, J. A., Jr.; Herkenhoff, K.

    2010-01-01

    The 1:500K-scale geologic map of MTM 85200 - the Olympia Cavi region of Mars - has been submitted for peer review [1]. Physiographically, the quadrangle includes portions of Olympia Rupes, a set of sinuous scarps which elevate Planum Boreum 800 meters above Olympia Planum. The region includes the high-standing, spiral troughs of Boreales Scopuli, the rugged and deep depressions of Olympia Cavi, and the vast dune fields of Olympia Undae. Geologically, the mapped units and landforms reflect the recent history of repeated accumulation and degradation. The widespread occurrence of both weakly and strongly stratified units implicates the drape-like accumulation of ice, dust, and sand through climatic variations. Similarly, the occurrence of layer truncations, particularly at unit boundaries, implicates punctuated periods of both localized and regional erosion and surface deflation, whereby underlying units were exhumed and their material transported and re-deposited. Herein, we focus on the iterative mapping approaches that not only accommodated the burgeoning variety and volume of data sets but also facilitated the efficient presentation of map information. Unit characteristics and their geologic history are detailed in past abstracts [2-3].

  8. Concept mapping as an approach for expert-guided model building: The example of health literacy.

    Science.gov (United States)

    Soellner, Renate; Lenartz, Norbert; Rudinger, Georg

    2017-02-01

    Concept mapping served as the starting point for capturing the comprehensive structure of the construct of 'health literacy.' Ideas about health literacy were generated by 99 experts and resulted in 105 statements that were subsequently organized by 27 experts in an unstructured card sorting. Multidimensional scaling was applied to the sorting data, and two- and three-dimensional solutions were computed. The three-dimensional solution was used in the subsequent cluster analysis and resulted in a concept map of nine "clusters": (1) self-regulation, (2) self-perception, (3) proactive approach to health, (4) basic literacy and numeracy skills, (5) information appraisal, (6) information search, (7) health care system knowledge and acting, (8) communication and cooperation, and (9) beneficial personality traits. Subsequently, this concept map served as a starting point for developing a "qualitative" structural model of health literacy and a questionnaire for the measurement of health literacy. On the basis of the questionnaire data, a "quantitative" structural model was created by first applying exploratory factor analyses (EFA) and then cross-validating the model with confirmatory factor analyses (CFA). Concept mapping proved to be a highly valuable tool for the process of model building up to translational research in the "real world". Copyright © 2016 Elsevier Ltd. All rights reserved.
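
    The computational chain described here (card sorts to a co-occurrence matrix, multidimensional scaling, cluster analysis) can be sketched with scikit-learn. The sorting data below are invented for a four-statement example and only illustrate the mechanics:

```python
# Card-sort co-occurrence -> MDS embedding -> hierarchical clustering.
import numpy as np
from sklearn.manifold import MDS
from sklearn.cluster import AgglomerativeClustering

sorts = np.array([[0, 0, 1, 1],     # each row: one expert's pile labels
                  [0, 1, 1, 1],     # for the same four statements
                  [0, 0, 1, 1]])
n = sorts.shape[1]
co = np.array([[(sorts[:, i] == sorts[:, j]).mean() for j in range(n)]
               for i in range(n)])  # share of experts sorting i and j together
dist = 1.0 - co                     # turn co-occurrence into a dissimilarity

xy = MDS(n_components=2, dissimilarity="precomputed",
         random_state=0).fit_transform(dist)
print(AgglomerativeClustering(n_clusters=2).fit_predict(xy))
```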

  9. Mapping mountain torrent hazards in the Hexi Corridor using an evidential reasoning approach

    Science.gov (United States)

    Ran, Youhua; Liu, Jinpeng; Tian, Feng; Wang, Dekai

    2017-02-01

    The Hexi Corridor is an important part of the Silk Road Economic Belt and a crucial channel for westward development in China. Many important national engineering projects pass through the corridor, such as highways, railways, and the West-to-East Gas Pipeline. The frequent torrent disasters greatly impact the security of infrastructure and human safety. In this study, an evidential reasoning approach based on Dempster-Shafer theory is proposed for mapping mountain torrent hazards in the Hexi Corridor. A torrent hazard map for the Hexi Corridor was generated by integrating the driving factors of mountain torrent disasters including precipitation, terrain, flow concentration processes, and the vegetation fraction. The results show that the capability of the proposed method is satisfactory. The torrent hazard map shows that there is high potential torrent hazard in the central and southeastern Hexi Corridor. The results are useful for engineering planning support and resource protection in the Hexi Corridor. Further efforts are discussed for improving torrent hazard mapping and prediction.
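
    At the heart of such an evidential reasoning scheme is Dempster's rule for combining basic mass assignments from independent evidence sources. A minimal sketch over a two-hypothesis frame, with masses invented purely for illustration:

```python
# Dempster's rule of combination over the frame {hazard, safe}.
def combine(m1, m2):
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = frozenset(a) & frozenset(b)
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb          # mass assigned to contradictions
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

m_rain = {("hazard",): 0.6, ("hazard", "safe"): 0.4}                   # rainfall
m_slope = {("hazard",): 0.5, ("safe",): 0.2, ("hazard", "safe"): 0.3}  # terrain
print(combine(m_rain, m_slope))
```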

  10. Assessment of landslide distribution map reliability in Niigata prefecture - Japan using frequency ratio approach

    Science.gov (United States)

    Rahardianto, Trias; Saputra, Aditya; Gomez, Christopher

    2017-07-01

    Research on landslide susceptibility has evolved rapidly over the last few decades thanks to the availability of large databases. Landslide research used to focus on discrete events, but the use of large inventory datasets has become a central pillar of landslide susceptibility, hazard, and risk assessment. Extracting meaningful information from these large databases is now at the forefront of geoscientific research, following the big-data research trend. The more comprehensive the information on past landslides in a particular area, the better the produced map will support effective decision making, planning, and engineering practice. The landslide inventory data that are freely accessible online give many researchers and decision makers an opportunity to prevent casualties and economic loss caused by future landslides. These data are advantageous especially for areas with poor landslide historical records. Since the construction criteria for landslide inventory maps and their quality evaluation remain poorly defined, an assessment of the reliability of open-source landslide inventory maps is required. The present contribution aims to assess the reliability of open-source landslide inventory data based on the particular topographical setting of the observed area in Niigata prefecture, Japan. A Geographic Information System (GIS) platform and a statistical approach are applied to analyze the data. The frequency ratio method is utilized to model and assess the landslide map. The generated model showed unsatisfactory results, with an AUC value of 0.603, indicating the low prediction accuracy and unreliability of the model.
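
    The frequency ratio statistic itself is simple: for each class of a conditioning factor, divide the share of landslide cells falling in that class by the share of all cells in that class; values above 1 mark classes over-represented among landslides. A toy sketch (the rasters are invented):

```python
# Frequency ratio per factor class on a flattened toy raster.
import numpy as np

slope_class = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 2])   # conditioning factor
landslide = np.array([0, 0, 1, 1, 0, 1, 0, 0, 0, 0])     # inventory mask

for c in np.unique(slope_class):
    in_class = slope_class == c
    fr = (landslide[in_class].sum() / landslide.sum()) / in_class.mean()
    print(f"class {c}: FR = {fr:.2f}")
```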

  11. Mapping the distributions of C3 and C4 grasses in the mixed-grass prairies of southwest Oklahoma using the Random Forest classification algorithm

    Science.gov (United States)

    Yan, Dong; de Beurs, Kirsten M.

    2016-05-01

    The objective of this paper is to demonstrate a new method to map the distributions of C3 and C4 grasses at 30 m resolution and over a 25-year period (1988-2013) by combining the Random Forest (RF) classification algorithm and patch stable areas identified using the spatial pattern analysis software FRAGSTATS. Predictor variables for the RF classifications consisted of ten spectral variables, four soil edaphic variables and three topographic variables. We provided a confidence score for obtaining pure land cover at each pixel location by retrieving the classification tree votes. Classification accuracy assessments and predictor variable importance evaluations were conducted based on a repeated stratified sampling approach. Results show that patch stable areas obtained from larger patches are more appropriate for use as sample data pools to train and validate RF classifiers for historical land cover mapping purposes, and that it is more reasonable to use patch stable areas as sample pools to map land cover in a year closer to the present than in years further back in time. The percentage of high confidence prediction pixels across the study area ranges from 71.18% in 1988 to 73.48% in 2013. The repeated stratified sampling approach is necessary for reducing the positive bias in the estimated classification accuracy caused by the possible selection of training and validation pixels from the same patch stable areas. The RF classification algorithm was able to identify the important environmental factors affecting the distributions of C3 and C4 grasses in our study area, such as elevation, soil pH, soil organic matter and soil texture.
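
    The confidence score described above, the fraction of tree votes received by the winning class, maps directly onto predict_proba in scikit-learn's RandomForestClassifier. A hedged sketch with synthetic stand-ins for the 17 predictors:

```python
# Per-pixel winning-class vote share from a Random Forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 17))                  # 10 spectral + 4 soil + 3 topographic
y = (X[:, 0] + X[:, 5] > 1.0).astype(int)  # toy labels: 0 = C3, 1 = C4

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
confidence = rf.predict_proba(X[:5]).max(axis=1)   # vote share, winning class
print(confidence)
print("top predictors:", np.argsort(rf.feature_importances_)[::-1][:3])
```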

  12. An optimization approach for mapping and measuring the divergence and correspondence between paths.

    Science.gov (United States)

    Mueller, Shane T; Perelman, Brandon S; Veinott, Elizabeth S

    2016-03-01

    Many domains of empirical research produce or analyze spatial paths as a measure of behavior. Previously, approaches for measuring the similarity or deviation between two paths have either required timing information or have used ad hoc or manual coding schemes. In this paper, we describe an optimization approach for robustly measuring the area-based deviation between two paths, which we call ALCAMP (Algorithm for finding the Least-Cost Areal Mapping between Paths). ALCAMP measures the deviation between two paths and produces a mapping between corresponding points on the two paths. The method is robust to a number of aspects of real path data, such as crossovers, self-intersections, differences in path segmentation, and partial or incomplete paths. Unlike similar algorithms that produce distance metrics between trajectories (i.e., paths that include timing information), this algorithm uses only the order of observed path segments to determine the mapping. We describe the algorithm and show its results on a number of sample problems and data sets, and demonstrate its effectiveness for assessing human memory for paths. We also describe available software code written in the R statistical computing language that implements the algorithm to enable data analysis.

  13. Mapping wildfire vulnerability in Mediterranean Europe. Testing a stepwise approach for operational purposes.

    Science.gov (United States)

    Oliveira, Sandra; Félix, Fernando; Nunes, Adélia; Lourenço, Luciano; Laneve, Giovanni; Sebastián-López, Ana

    2017-10-20

    Vulnerability assessment is a vital component of wildfire management. This research focused on the development of a framework to measure and map vulnerability levels in several areas within Mediterranean Europe, where wildfires are a major concern. The framework followed a stepwise approach to evaluate its main components, expressed by exposure, sensitivity and coping capacity. Data on population density, fuel types, protected areas location, road infrastructure and surveillance activities, among others, were integrated to create composite indices representing each component and articulated in multiple dimensions. Maps were created for several test areas, in northwest Portugal, southwest Sardinia in Italy and northeast Corsica in France, with the contribution of local participants from civil protection institutions and forest services. Results showed the influence of fuel sensitivity levels, population distribution and protected area coverage on the overall vulnerability classes. The validation procedure showed reasonable levels of accuracy for the maps, with an overall match above 72% across the several sites. The systematic and flexible approach allowed for adjustments to local circumstances with regard to data availability and fire management procedures, without compromising its consistency and with substantial operational capabilities. The results obtained and the positive feedback of end-users encourage its further application as a means to improve wildfire management strategies at multiple levels with the latest scientific outputs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. A Photogrammetric Approach for Assessing Positional Accuracy of OpenStreetMap© Roads

    Directory of Open Access Journals (Sweden)

    Peter Doucette

    2013-04-01

    Full Text Available As open source volunteered geographic information continues to gain popularity, the user community and data contributions are expected to grow, e.g., CloudMade, Apple, and Ushahidi now provide OpenStreetMap© (OSM) as a base layer for some of their mapping applications. This, coupled with the lack of cartographic standards and the expectation to one day be able to use this vector data for more geopositionally sensitive applications, like GPS navigation, leads potential users and researchers to question the accuracy of the database. This research takes a photogrammetric approach to determining the positional accuracy of OSM road features using stereo imagery and a vector adjustment model. The method applies rigorous analytical measurement principles to compute accurate real-world geolocations of OSM road vectors. The proposed approach was tested on several urban gridded city streets from the OSM database, with the results showing that the post-adjustment shape points improved positionally by 86%. Furthermore, the vector adjustment was able to recover 95% of the actual positional displacement present in the database. To demonstrate a practical application, a head-to-head positional accuracy assessment between OSM, the USGS National Map (TNM), and the United States Census Bureau's Topologically Integrated Geographic Encoding and Referencing (TIGER) 2007 roads was conducted.

  15. Synchronous slowing down in coupled logistic maps via random network topology

    Science.gov (United States)

    Wang, Sheng-Jun; Du, Ru-Hai; Jin, Tao; Wu, Xing-Sen; Qu, Shi-Xian

    2016-03-01

    The speed and paths of synchronization play a key role in the function of a system, but have not received enough attention up to now. In this work, we study the synchronization process of coupled logistic maps, which reveals the common features of low-dimensional dissipative systems. A slowing down of the synchronization process is observed, which is a novel phenomenon. The results show that there are two typical kinds of transient process before the system reaches complete synchronization, demonstrated by both coupled multiple-period maps and coupled multiple-band chaotic maps. When the coupling is weak, the evolution of the system is governed mainly by the local dynamics, i.e., the node states are attracted by the stable orbits or chaotic attractors of the single map and evolve toward the synchronized orbit in a less coherent way. When the coupling is strong, the node states evolve in a highly coherent way toward the stable orbit on the synchronized manifold, where the collective dynamics dominates the evolution. At intermediate coupling strengths, the interplay between the two paths is responsible for the slowing down. The existence of different synchronization paths is also proven by the finite-time Lyapunov exponent and its distribution.
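
    The system studied here is easy to reproduce numerically: logistic maps coupled diffusively over a random network, with the standard deviation of the node states tracking the synchronization error. Parameter values in the sketch below are ours, chosen only to show the mechanics:

```python
# Logistic maps coupled over an Erdos-Renyi random network.
import numpy as np

rng = np.random.default_rng(0)
N, p, eps, r = 100, 0.1, 0.35, 4.0
A = np.triu((rng.random((N, N)) < p).astype(float), 1)
A += A.T                                     # symmetric adjacency, no self-loops
deg = A.sum(axis=1).clip(min=1)

f = lambda x: r * x * (1 - x)                # local logistic dynamics
x = rng.random(N)
for t in range(2001):
    x = (1 - eps) * f(x) + eps * (A @ f(x)) / deg
    if t % 500 == 0:
        print(t, x.std())                    # synchronization error over time
```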

  16. A Machine Learning Approach to Mapping Agricultural Fields Across Sub-Saharan Africa

    Science.gov (United States)

    Debats, S. R.; Fuchs, T. J.; Thompson, D. R.; Estes, L. D.; Evans, T. P.; Caylor, K. K.

    2013-12-01

    Food production in sub-Saharan Africa is dominated by smallholder agriculture. Rainfed farming practices and the prevailing dryland conditions render crop yields vulnerable to increasing climatic variability. As a result, smallholder farmers are among the poorest and most food insecure groups among the region's population. Quantifying the distribution of smallholder agriculture across sub-Saharan Africa would greatly assist efforts to boost food security. Existing agricultural land cover data sets are limited to estimating the percentage of cropland within a coarse grid cell. The goal of this research is to develop a statistical machine learning algorithm to map individual agricultural fields, mirroring the accuracy of hand-digitization. For the algorithm, a random forest pixel-wise classifier learns by example from training data to distinguish between fields and non-fields. The algorithm then applies this training to classify previously unseen data. These classifications can then be smoothed into coherent regions corresponding to agricultural fields. Our training data set consists of hand-digitized boundaries of agricultural fields in South Africa, commissioned by its government in 2008. Working with 1 km x 1 km scenes across South Africa, the hand-digitized field boundaries are matched with satellite images extracted from Google Maps. To highlight different information contained within the images, several image processing filters are applied. The inclusion of Landsat images for additional training information is also explored. After training and testing the algorithm in South Africa, we aim to expand our mapping efforts across sub-Saharan Africa. Through Princeton's Mapping Africa project, crowdsourcing will produce additional training data sets of hand-digitized field boundaries in new areas of interest. This algorithm and the resulting data sets will provide previously unavailable information at an unprecedented level of detail on the largest and most

  17. A GIS based method for soil mapping in Sardinia, Italy: a geomatic approach.

    Science.gov (United States)

    Vacca, A; Loddo, S; Melis, M T; Funedda, A; Puddu, R; Verona, M; Fanni, S; Fantola, F; Madrau, S; Marrone, V A; Serra, G; Tore, C; Manca, D; Pasci, S; Puddu, M R; Schirru, P

    2014-06-01

    A new project was recently initiated for the realization of the "Land Unit and Soil Capability Map of Sardinia" at a scale of 1:50,000 to support land use planning. In this study, we outline the general structure of the project and the methods used in the activities conducted thus far. A GIS approach was used. We used the soil-landscape paradigm for the prediction of soil classes and their spatial distribution, or the prediction of soil properties, based on landscape features. The work is divided into two main phases. In the first phase, the available digital data on land cover, geology and topography were processed and classified according to their influence on weathering processes and soil properties. The methods used in the interpretation are based on consolidated and generalized knowledge about the influence of geology, topography and land cover on soil properties. The existing soil data (areal and point data) were collected, reviewed, validated and standardized according to international and national guidelines. Point data considered to be usable were input into a specific database created for the project. Using expert interpretation, all digital data were merged to produce a first draft of the Land Unit Map. During the second phase, this map will be refined with the existing soil data and, where needed, verified in the field through new soil data collection, and the final Land Unit Map will be produced. The Land Unit and Soil Capability Map will then be produced by classifying the land units using a reference matching table of land capability classes created for this project. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. The impact of image integration on catheter ablation of atrial fibrillation using electroanatomic mapping: a prospective randomized study.

    Science.gov (United States)

    Kistler, Peter M; Rajappan, Kim; Harris, Stuart; Earley, Mark J; Richmond, Laura; Sporton, Simon C; Schilling, Richard J

    2008-12-01

    A detailed appreciation of the left atrial/pulmonary venous (LA/PV) anatomy may be important in improving the safety and success of catheter ablation for AF. The aim of this randomized study was to determine the impact of computed tomographic (CT) image integration into an electroanatomic mapping (EAM) system on clinical outcome in patients undergoing catheter ablation for atrial fibrillation (AF). Eighty patients with AF were randomized to undergo first-time wide encirclement of ipsilateral PV pairs using EAM alone (40 patients) or with CT integration (40 patients, Cartomerge). Wide encirclement of the pulmonary veins was performed using irrigated radiofrequency ablation with the electrophysiological endpoint of electrical isolation (EI). The primary endpoint was single-procedure success at 6-month follow-up. Acute and long-term procedural outcomes were also determined. There was no significant difference in single-procedure success between the EAM (56%) and CT integration (CTI) (50%) groups (P = 0.9). Acute procedural outcomes (EI, PV reconnection, sinus rhythm restored by ablation in persistent AF), fluoroscopy, and procedure durations (EI of right PVs, EI of left PVs, total) did not differ significantly between the EAM and CTI groups. Image integration to guide catheter ablation for AF did not significantly improve the clinical outcome. Achieving PV EI is the critical determinant of procedural success rather than the mapping tools used to achieve it.

  19. Distance expanding random mappings, thermodynamical formalism, Gibbs measures and fractal geometry

    CERN Document Server

    Mayer, Volker; Skorulski, Bartlomiej

    2011-01-01

    The theory of random dynamical systems originated from stochastic differential equations. It is intended to provide a framework and techniques to describe and analyze the evolution of dynamical systems when the input and output data are known only approximately, according to some probability distribution. The development of this field, in both theory and applications, has gone in many directions. In this manuscript we introduce measurable expanding random dynamical systems, develop the thermodynamical formalism and establish, in particular, the exponential decay of correlations and analyticity of the expected pressure, although the spectral gap property does not hold. This theory is then used to investigate fractal properties of conformal random systems. We prove Bowen's formula and develop the multifractal formalism of the Gibbs states. Depending on the behavior of the Birkhoff sums of the pressure function we arrive at a natural classification of the systems into two classes: quasi-deterministic syst...

  20. Mendelian Randomization as an Approach to Assess Causality Using Observational Data.

    Science.gov (United States)

    Sekula, Peggy; Del Greco M, Fabiola; Pattaro, Cristian; Köttgen, Anna

    2016-11-01

    Mendelian randomization refers to an analytic approach to assess the causality of an observed association between a modifiable exposure or risk factor and a clinically relevant outcome. It presents a valuable tool, especially when randomized controlled trials to examine causality are not feasible and observational studies provide biased associations because of confounding or reverse causality. These issues are addressed by using genetic variants as instrumental variables for the tested exposure: the alleles of this exposure-associated genetic variant are randomly allocated and not subject to reverse causation. This, together with the wide availability of published genetic associations to screen for suitable genetic instrumental variables, makes Mendelian randomization a time- and cost-efficient approach and contributes to its increasing popularity for assessing and screening for potentially causal associations. An observed association between the genetic instrumental variable and the outcome supports the hypothesis that the exposure in question is causally related to the outcome. This review provides an overview of the Mendelian randomization method, addresses assumptions and implications, and includes illustrative examples. We also discuss special issues in nephrology, such as inverse risk factor associations in advanced disease, and outline opportunities to design Mendelian randomization studies around kidney function and disease. Copyright © 2016 by the American Society of Nephrology.
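
    With a single genetic instrument, the basic Mendelian randomization estimate is the Wald ratio: the SNP-outcome association divided by the SNP-exposure association. A minimal sketch with invented summary statistics:

```python
# Wald-ratio Mendelian randomization estimate from summary statistics.
beta_zx, se_zx = 0.30, 0.05      # SNP -> exposure association (and SE)
beta_zy, se_zy = 0.12, 0.04      # SNP -> outcome association (and SE)

wald_ratio = beta_zy / beta_zx   # estimated causal effect of exposure on outcome
# First-order standard error; ignores se_zx, which is common practice
# when the instrument is strongly associated with the exposure.
se = se_zy / abs(beta_zx)
print(f"causal effect: {wald_ratio:.2f} (95% CI +/- {1.96 * se:.2f})")
```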

  1. A New Approach to Liquefaction Potential Mapping Using Remote Sensing and Machine Learning

    Science.gov (United States)

    Oommen, T.; Baise, L. G.

    2007-12-01

    learning capabilities of a human brain and make appropriate predictions that involve intuitive judgments and a high degree of nonlinearity. The accuracy of the developed liquefaction potential map was tested using independent testing data that was not used for the model development. The results show that the developed liquefaction potential map has an overall classification accuracy of 84%, indicating that the combination of remote sensing data and other relevant spatial data together with machine learning can be a promising approach for liquefaction potential mapping.

  2. Alternative SERRS probes for the immunochemical localization of ovalbumin in paintings: an advanced mapping detection approach.

    Science.gov (United States)

    Sciutto, Giorgia; Litti, Lucio; Lofrumento, Cristiana; Prati, Silvia; Ricci, Marilena; Gobbo, Marina; Roda, Aldo; Castellucci, Emilio; Meneghetti, Moreno; Mazzeo, Rocco

    2013-08-21

    In the field of analytical chemistry, many scientific efforts have been devoted to developing experimental procedures for the characterization of organic substances present in heterogeneous artwork samples, due to their challenging identification. In particular, the performance of immunochemical techniques has recently been investigated, optimizing ad hoc systems for the identification of proteins. Among the different immunochemical approaches, the use of metal nanoparticles - for surface enhanced Raman scattering (SERS) detection - remains one of the most powerful methods, yet it has still not been explored enough for the analysis of artistic artefacts. For this reason, the present research work was aimed at proposing a new optimized and highly efficient indirect immunoassay for the detection of ovalbumin. In particular, the study proposed a new SERRS probe composed of gold nanoparticles (AuNPs) functionalised with Nile Blue A and produced with an excellent green and cheap alternative to traditional chemical nanoparticle synthesis: laser ablation synthesis in solution (LASiS). This procedure allows us to obtain stable nanoparticles which can be easily functionalized without any ligand exchange reaction or extensive purification procedures. Moreover, the present research work also focused on the development of a comprehensive analytical approach, based on combining the potential of immunochemical methods and Raman analysis, for the simultaneous identification of the target protein and the different organic and inorganic substances present in the paint matrix. An advanced mapping detection system was proposed to achieve the exact spatial location of all the components through the creation of false colour chemical maps.

  3. Polymer Dynamics in Random Media, Replica Theory, Ternary Systems: Mappings and Equivalences

    NARCIS (Netherlands)

    U. Ebert (Ute); L. Schäfer

    1994-01-01

    For polymer dynamics in quenched random media a renormalizability proof is lacking and calculations are lengthy. We here propose and define the static and the ergodic limit of the dynamic theory, and show that these limits are equivalent to well-known renormalizable static polymer

  4. Conceptualizing Stakeholders’ Perceptions of Ecosystem Services: A Participatory Systems Mapping Approach

    Directory of Open Access Journals (Sweden)

    Lopes Rita

    2015-12-01

    Full Text Available A participatory system dynamics modelling approach is advanced to support conceptualization of the feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes a systems mapping workshop and follow-up tasks aiming at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

  5. Integration of spectral, thermal, and textural features of ASTER data using Random Forests classification for lithological mapping

    Science.gov (United States)

    Masoumi, Feizollah; Eslamkish, Taymour; Abkar, Ali Akbar; Honarmand, Mehdi; Harris, Jeff R.

    2017-05-01

    The ensemble classifier, Random Forests (RF), is assessed for mapping lithology using Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) imagery over an area in southern Iran. The study area, in the northern part of Rabor in the Kerman Cenozoic magmatic arc (KCMA), is well exposed and contains some copper mineralization occurrences. In this research, the following six groups of ASTER datasets were used for RF classification: nine spectral bands in the VNIR and SWIR, five thermal bands in the TIR, all 14 bands (VNIR, SWIR, and TIR), band ratios, texture features, and principal components (PCs). The results showed that band ratios and all ASTER bands discriminated rock units more efficiently than the PC and texture images. The overall classification accuracies achieved were 62.58%, 55.40%, 65.04%, 67.12%, 54.54%, and 53.99% for the nine VNIR/SWIR bands, five TIR bands, all ASTER bands, band ratios, textural, and PCs datasets, respectively. Four datasets, comprising all ASTER bands, band ratios, textural, and PCs datasets (37 bands), were combined as one group and applied in a second RF classification, which increased the overall accuracy to 81.52%. Based on the four classified maps, an uncertainty map was produced to identify areas of variable (uncertain) classification results, which revealed that approximately 21.43% of all pixels on the classified map were highly uncertain. The RF algorithm identified 12 of the predictors as most important in the classification process. These predictors were used in a third RF classification, which resulted in an overall classification accuracy of 77.21%; restricting the classification to these predictors thus slightly decreased the accuracy. Field observations were used to validate our classification results.

  6. Analogies between colored Lévy noise and random channel approach to disordered kinetics

    Science.gov (United States)

    Vlad, Marcel O.; Velarde, Manuel G.; Ross, John

    2004-02-01

    We point out some interesting analogies between colored Lévy noise and the random channel approach to disordered kinetics. These analogies are due to the fact that the probability density of the Lévy noise source plays a similar role as the probability density of rate coefficients in disordered kinetics. Although the equations for the two approaches are not identical, the analogies can be used for deriving new, useful results for both problems. The random channel approach makes it possible to generalize the fractional Uhlenbeck-Ornstein processes (FUO) for space- and time-dependent colored noise. We describe the properties of colored noise in terms of characteristic functionals, which are evaluated by using a generalization of Huber's approach to complex relaxation [Phys. Rev. B 31, 6070 (1985)]. We start out by investigating the properties of symmetrical white noise and then define the Lévy colored noise in terms of a Langevin equation with a Lévy white noise source. We derive exact analytical expressions for the various characteristic functionals, which characterize the noise, and a functional fractional Fokker-Planck equation for the probability density functional of the noise at a given moment in time. Second, by making an analogy between the theory of colored noise and the random channel approach to disordered kinetics, we derive fractional equations for the evolution of the probability densities of the random rate coefficients in disordered kinetics. These equations serve as a basis for developing methods for the evaluation of the statistical properties of the random rate coefficients from experimental data. Special attention is paid to the analysis of systems for which the observed kinetic curves can be described by linear or nonlinear stretched exponential kinetics.
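
    The construction sketched in the abstract, colored noise generated by filtering a Lévy white-noise source through a linear Langevin equation, can be written compactly (the notation below is ours, not the authors'):

```latex
% Colored Levy noise \eta(t) as a linear (Ornstein-Uhlenbeck-type) filter
% of a Levy-stable white-noise source \xi_\alpha(t) with relaxation rate
% \lambda; notation is ours, not the paper's.
\begin{equation}
  \frac{\mathrm{d}\eta(t)}{\mathrm{d}t} = -\lambda\,\eta(t) + \xi_\alpha(t)
\end{equation}
```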

  7. Urban Flood Mapping Based on Unmanned Aerial Vehicle Remote Sensing and Random Forest Classifier—A Case of Yuyao, China

    Directory of Open Access Journals (Sweden)

    Quanlong Feng

    2015-03-01

    Full Text Available Flooding is a severe natural hazard, which poses a great threat to human life and property, especially in densely-populated urban areas. As one of the fastest developing fields in remote sensing applications, an unmanned aerial vehicle (UAV) can provide high-resolution data with a great potential for fast and accurate detection of inundated areas under complex urban landscapes. In this research, optical imagery was acquired by a mini-UAV to monitor the serious urban waterlogging in Yuyao, China. Texture features derived from the gray-level co-occurrence matrix were included to increase the separability of different ground objects. A Random Forest classifier, consisting of 200 decision trees, was used to extract flooded areas in the spectral-textural feature space. A confusion matrix was used to assess the accuracy of the proposed method. Results indicated the following: (1) Random Forest showed good performance in urban flood mapping, with an overall accuracy of 87.3% and a Kappa coefficient of 0.746; (2) the inclusion of texture features improved classification accuracy significantly; (3) Random Forest outperformed maximum likelihood and artificial neural network classifiers, and showed a similar performance to the support vector machine. The results demonstrate that a UAV can provide an ideal platform for urban flood monitoring and that the proposed method shows great capability for the accurate extraction of inundated areas.
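
    The texture step can be sketched with scikit-image's GLCM utilities (graycomatrix/graycoprops, spelled greycomatrix/greycoprops in releases before 0.19). The toy patch below stands in for a UAV image tile; in practice the features would be computed in a moving window and stacked with the spectral bands before classification:

```python
# GLCM texture features for one image patch.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

patch = (np.random.default_rng(0).random((32, 32)) * 8).astype(np.uint8)
glcm = graycomatrix(patch, distances=[1], angles=[0], levels=8,
                    symmetric=True, normed=True)

features = {prop: float(graycoprops(glcm, prop)[0, 0])
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
print(features)
```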

  8. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach

    Directory of Open Access Journals (Sweden)

    Adrian-Valentin Nedelcu

    2015-01-01

    Full Text Available Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems.

  9. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach

    Science.gov (United States)

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems. PMID:26167533

  10. A Novel Approach on Designing Augmented Fuzzy Cognitive Maps Using Fuzzified Decision Trees

    Science.gov (United States)

    Papageorgiou, Elpiniki I.

    This paper proposes a new methodology for designing Fuzzy Cognitive Maps using crisp decision trees that have been fuzzified. A fuzzy cognitive map is a knowledge-based technique that works as an artificial cognitive network, inheriting the main aspects of cognitive maps and artificial neural networks. Decision trees, on the other hand, are well-known intelligent techniques that extract rules from both symbolic and numeric data. Fuzzy theoretical techniques are used to fuzzify crisp decision trees in order to soften the decision boundaries at the decision nodes inherent in this type of tree. Comparisons between crisp decision trees and fuzzified decision trees suggest that the latter are significantly more robust and produce more balanced decision making. The approach proposed in this paper can incorporate any type of fuzzy decision tree. Through this methodology, new linguistic weights were determined in the FCM model, thus producing an augmented FCM tool. The framework consists of a new fuzzy algorithm to generate the linguistic weights that describe the cause-effect relationships among the concepts of the FCM model from induced fuzzy decision trees.
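
    Whatever the source of the weights (here, the fuzzified decision trees), FCM inference itself is a small iteration: concept activations are pushed through the signed weight matrix and a squashing function until they settle. A generic three-concept sketch with illustrative weights:

```python
# Fuzzy cognitive map inference with a sigmoid squashing function.
import numpy as np

W = np.array([[0.0, 0.7, -0.4],    # W[i, j]: causal influence of concept i on j
              [0.0, 0.0, 0.6],
              [0.3, 0.0, 0.0]])
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

A = np.array([0.8, 0.2, 0.5])      # initial concept activations
for _ in range(20):
    A = sigmoid(A @ W + A)         # common variant that retains the old value
print(A.round(3))                  # converged activation levels
```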

  11. Dissection of a Complex Disease Susceptibility Region Using a Bayesian Stochastic Search Approach to Fine Mapping.

    Directory of Open Access Journals (Sweden)

    Chris Wallace

    2015-06-01

    Full Text Available Identification of candidate causal variants in regions associated with risk of common diseases is complicated by linkage disequilibrium (LD) and multiple association signals. Nonetheless, accurate maps of these variants are needed, both to fully exploit detailed cell-specific chromatin annotation data to highlight disease-causal mechanisms and cells, and for the design of the functional studies that will ultimately be required to confirm causal mechanisms. We adapted a Bayesian evolutionary stochastic search algorithm to the fine mapping problem, and demonstrated its improved performance over conventional stepwise and regularised regression through simulation studies. We then applied it to fine map the established multiple sclerosis (MS) and type 1 diabetes (T1D) associations in the IL-2RA (CD25) gene region. For T1D, both stepwise and stochastic search approaches identified four T1D association signals, with the major effect tagged by the single nucleotide polymorphism rs12722496. In contrast, for MS, the stochastic search found two distinct competing models: a single candidate causal variant, tagged by rs2104286 and reported previously using stepwise analysis; and a more complex model with two association signals, one of which was tagged by the major T1D-associated rs12722496 and the other by rs56382813. There is low to moderate LD between rs2104286 and both rs12722496 and rs56382813 (r² ≈ 0.3), and our two-SNP model could not be recovered through a forward stepwise search after conditioning on rs2104286. Both signals in the two-variant model for MS affect CD25 expression on distinct subpopulations of CD4+ T cells, which are key cells in the autoimmune process. The results support a shared causal variant for T1D and MS. Our study illustrates the benefit of using a purposely designed model search strategy for fine mapping and the advantage of combining disease and protein expression data.

  12. The field line map approach for simulations of magnetically confined plasmas

    CERN Document Server

    Stegmeir, Andreas; Maj, Omar; Hallatschek, Klaus; Lackner, Karl

    2015-01-01

    In the presented field line map approach the simulation domain of a tokamak is covered with a cylindrical grid, which is Cartesian within poloidal planes. Standard finite-difference methods can be used for the discretisation of perpendicular (w.r.t. magnetic field lines) operators. The characteristic flute mode property $k_{\parallel} \ll k_{\perp}$ of structures is exploited computationally by a grid sparsification in the toroidal direction. A field-line-following discretisation of parallel operators is then required, which is achieved via a finite difference along magnetic field lines. This includes field line tracing and interpolation or integration. The main emphasis of this paper is on the discretisation of the parallel diffusion operator. Based on the support operator method, a scheme is constructed which exhibits only very low numerical perpendicular diffusion. The schemes are implemented in the new code GRILLIX, and extensive benchmarks are presented which show the validity of the approach ...
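
    The parallel finite difference is easiest to see in a minimal sketch: interpolate the field on the neighbouring poloidal planes at the footpoints of the traced field line, then difference along the line. The tracer below uses a hypothetical circular-flux-surface field with an assumed safety factor, not the GRILLIX implementation:

```python
import numpy as np

def trace_to_next_plane(R, Z, direction, dphi=2*np.pi/16):
    """Hypothetical field line tracer: one step of the field line
    equations on a circular-flux-surface model field (major radius 1,
    assumed safety factor q = 3); a stand-in for a real equilibrium."""
    q = 3.0
    r = np.hypot(R - 1.0, Z)
    theta = np.arctan2(Z, R - 1.0) + direction * dphi / q
    return 1.0 + r * np.cos(theta), r * np.sin(theta)

def bilinear(f, Rg, Zg, R, Z):
    """Bilinearly interpolate grid data f (indexed [iR, jZ]) at (R, Z)."""
    i = np.clip(np.searchsorted(Rg, R) - 1, 0, len(Rg) - 2)
    j = np.clip(np.searchsorted(Zg, Z) - 1, 0, len(Zg) - 2)
    tR = (R - Rg[i]) / (Rg[i+1] - Rg[i])
    tZ = (Z - Zg[j]) / (Zg[j+1] - Zg[j])
    return ((1-tR)*(1-tZ)*f[i, j] + tR*(1-tZ)*f[i+1, j]
            + (1-tR)*tZ*f[i, j+1] + tR*tZ*f[i+1, j+1])

def parallel_gradient(f_prev, f_next, Rg, Zg, R, Z, ds):
    """Centred finite difference along the magnetic field line: values
    on the neighbouring poloidal planes are interpolated at the traced
    footpoints of the field line through (R, Z); ds is the parallel
    arc length between planes."""
    Rp, Zp = trace_to_next_plane(R, Z, +1)
    Rm, Zm = trace_to_next_plane(R, Z, -1)
    return (bilinear(f_next, Rg, Zg, Rp, Zp)
            - bilinear(f_prev, Rg, Zg, Rm, Zm)) / (2.0 * ds)

Rg = np.linspace(0.5, 1.5, 64); Zg = np.linspace(-0.5, 0.5, 64)
f = np.sin(Rg)[:, None] * np.cos(Zg)[None, :]  # same field on all planes
print(parallel_gradient(f, f, Rg, Zg, 1.1, 0.1, ds=0.4))
```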

  13. A new multicriteria risk mapping approach based on a multiattribute frontier concept.

    Science.gov (United States)

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Downing, Marla; Sapio, Frank; Siltanen, Marty

    2013-09-01

    Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that typically require prior knowledge of the risk components' importance. Such information is often nonexistent for many invasive pests. This study proposes a new approach for building integrated risk maps using the principle of a multiattribute efficient frontier and analyzing the partial order of elements of a risk map as distributed in multidimensional criteria space. The integrated risks are estimated as subsequent multiattribute frontiers in dimensions of individual risk criteria. We demonstrate the approach with the example of Agrilus biguttatus Fabricius, a high-risk pest that may threaten North American oak forests in the near future. Drawing on U.S. and Canadian data, we compare the performance of the multiattribute ranking against a multicriteria linear weighted averaging technique in the presence of uncertainties, using the concept of robustness from info-gap decision theory. The results show major geographic hotspots where the consideration of tradeoffs between multiple risk components changes integrated risk rankings. Both methods delineate similar geographical regions of high and low risks. Overall, aggregation based on a delineation of multiattribute efficient frontiers can be a useful tool to prioritize risks for anticipated invasive pests, which usually have an extremely poor prior knowledge base. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
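
    The frontier-peeling idea, ranking map cells by successively removing non-dominated sets in the space of risk criteria, can be sketched as follows (criteria values are hypothetical, and this is a simplified stand-in for the paper's procedure):

```python
import numpy as np

def dominates(a, b):
    """a dominates b if a is at least as risky in every criterion and
    strictly riskier in at least one (higher value = higher risk)."""
    return np.all(a >= b) and np.any(a > b)

def frontier_ranks(points):
    """Assign each cell the index of the multiattribute frontier it
    falls on: rank 1 = outermost (highest-risk) non-dominated set."""
    remaining = list(range(len(points)))
    ranks = np.zeros(len(points), dtype=int)
    rank = 1
    while remaining:
        frontier = [i for i in remaining
                    if not any(dominates(points[j], points[i])
                               for j in remaining if j != i)]
        for i in frontier:
            ranks[i] = rank
        remaining = [i for i in remaining if i not in frontier]
        rank += 1
    return ranks

# Each row: (climatic suitability, host abundance, introduction potential)
cells = np.array([[0.9, 0.8, 0.7], [0.5, 0.9, 0.2],
                  [0.4, 0.3, 0.1], [0.8, 0.2, 0.9]])
print(frontier_ranks(cells))  # lower rank = earlier frontier = higher risk
```

    Ranking by frontier membership, rather than by a weighted sum, is what removes the need for prior knowledge of the criteria weights.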

  14. Efficacy of the transtemporal approach with awake brain mapping to reach the dominant posteromedial temporal lesions.

    Science.gov (United States)

    Iijima, Kentaro; Motomura, Kazuya; Chalise, Lushun; Hirano, Masaki; Natsume, Atsushi; Wakabayashi, Toshihiko

    2017-01-01

    Surgeries for lesions in the dominant hippocampal and parahippocampal gyri involving the posteromedial temporal regions are challenging to perform because these lesions lie close to Wernicke's area, white matter fibers related to language, the optic radiations, and critical neurovascular structures. We performed a transtemporal approach with awake functional mapping for lesions affecting the dominant posteromedial temporal regions. The aim of this study was to assess the feasibility, safety, and efficacy of awake craniotomy for these lesions. We retrospectively reviewed four consecutive patients with tumors or cavernous angiomas located in the left hippocampal and parahippocampal gyrus and extending to the posteromedial temporal regions, who underwent awake surgery between December 2014 and January 2016. In all four patients, cortical and subcortical eloquent areas were identified via direct electrical stimulation. This allowed determination of the optimal surgical route to the angioma or tumor, even in the language-dominant hippocampal and parahippocampal gyrus. In particular, this approach enabled access to the upper part of posteromedial temporal lesions while protecting subcortical language-related fibers such as the superior longitudinal fasciculus. This study revealed that awake brain mapping can enable the safe resection of dominant posteromedial temporal lesions while protecting cortical and subcortical eloquent areas, and our experience with these four patients demonstrates the feasibility, safety, and efficacy of awake surgery for such lesions.

  15. Cross-section comparisons of cloaks designed by transformation optical and optical conformal mapping approaches

    Science.gov (United States)

    Urzhumov, Yaroslav A.; Kundtz, Nathan B.; Smith, David R.; Pendry, John B.

    2011-02-01

    We review several approaches to optical invisibility designed using transformation optics (TO) and optical conformal mapping (CM) techniques. TO is a general framework for solving inverse scattering problems based on mimicking spatial coordinate transformations with distributions of material properties. There are two essential steps in the design of TO media: first, a coordinate transformation that achieves some desired functionality, resulting in a continuous spatial distribution of constitutive parameters that are generally anisotropic; and, second, the reduction of the derived continuous constitutive parameters to a metamaterial that serves as a stepwise approximation. We focus here on the first step, discussing the merits of various TO strategies proposed for the long-sought 'invisibility cloak'—a structure that renders opaque objects invisible. We also evaluate the cloaking capabilities of structures designed by the related CM approach, which makes use of conformal mapping to achieve index-only material distributions. The performance of the various cloaks is evaluated and compared using a universal measure—the total (all-angle) scattering cross section.

  16. In Silico Design of Human IMPDH Inhibitors Using Pharmacophore Mapping and Molecular Docking Approaches

    Directory of Open Access Journals (Sweden)

    Rui-Juan Li

    2015-01-01

    Full Text Available Inosine 5′-monophosphate dehydrogenase (IMPDH) is one of the crucial enzymes in the de novo biosynthesis of guanosine nucleotides. It has served as an attractive target in immunosuppressive, anticancer, antiviral, and antiparasitic therapeutic strategies. In this study, pharmacophore mapping and molecular docking approaches were employed to discover novel Homo sapiens IMPDH (hIMPDH) inhibitors. The Güner-Henry (GH) scoring method was used to evaluate the quality of the generated pharmacophore hypotheses; one of them was found to possess a GH score of 0.67. Ten potential compounds were selected from the ZINC database using a pharmacophore mapping approach and docked into the IMPDH active site. We found two hits (i.e., ZINC02090792 and ZINC00048033) that match well the optimal pharmacophore features used in this investigation, and they form interactions with key residues of IMPDH. We propose these two hits as lead compounds for the development of novel hIMPDH inhibitors.
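
    For reference, the GH ("goodness of hit") score is commonly computed from the composition of the database and the hit list as sketched below; this is the form usually cited in the pharmacophore screening literature, and whether the paper used this exact variant is an assumption here:

```python
def gh_score(D, A, Ht, Ha):
    """Guner-Henry 'goodness of hit' score in its commonly cited form:
    D  = total compounds in the screened database
    A  = active compounds in the database
    Ht = total hits retrieved by the pharmacophore query
    Ha = active compounds among the hits
    Combines the yield of actives with recall, penalized for false
    positives; values approaching 1 indicate an ideal query."""
    yield_term = Ha * (3 * A + Ht) / (4.0 * Ht * A)
    penalty = 1.0 - (Ht - Ha) / float(D - A)
    return yield_term * penalty

# Hypothetical screen: 1000 compounds, 50 actives, 60 hits, 35 active hits
print(round(gh_score(1000, 50, 60, 35), 2))  # ~0.6
```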

  17. HAGR-D: A Novel Approach for Gesture Recognition with Depth Maps.

    Science.gov (United States)

    Santos, Diego G; Fernandes, Bruno J T; Bezerra, Byron L D

    2015-11-12

    The hand is an important part of the body used to express information through gestures, and its movements can be used in dynamic gesture recognition systems based on computer vision, with practical applications in areas such as medicine, games, and sign language. Although depth sensors have led to great progress in gesture recognition, hand gesture recognition is still an open problem because of its complexity, which is due to the large number of small articulations in a hand. This paper proposes a novel approach for hand gesture recognition with depth maps generated by the Microsoft Kinect sensor (Microsoft, Redmond, WA, USA), using a variation of the CIPBR (convex invariant position based on RANSAC) algorithm and a hybrid classifier composed of dynamic time warping (DTW) and hidden Markov models (HMM), called the hybrid approach for gesture recognition with depth maps (HAGR-D). The experiments show that the proposed model outperforms other algorithms presented in the literature on hand gesture recognition tasks, achieving a classification rate of 97.49% on the MSRGesture3D dataset and 98.43% on the RPPDI dynamic gesture dataset.
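
    The DTW half of the hybrid classifier can be sketched in a few lines; a gesture would, for example, be assigned to the template with the smallest DTW cost before the HMM stage (the feature sequences below are hypothetical stand-ins for CIPBR descriptors):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping between two feature sequences.
    a, b: arrays of shape (length, n_features); returns the cumulative
    cost of the optimal monotone alignment of the two sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

# Two hypothetical gesture descriptor sequences of different lengths
g1 = np.random.rand(40, 16)
g2 = np.random.rand(55, 16)
print(dtw_distance(g1, g2))
```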

  18. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    Science.gov (United States)

    Dhindsa, Harkirat S.; Makarimi-Kasim; Roger Anderson, O.

    2011-04-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent to which a constructivist learning environment (CLE) was created in their classes. The sample consisted of six classes (140 Form 3 students aged 13-15 years) selected from a typical coeducational school in Brunei. Three classes (40 boys and 30 girls) were taught using the TTA, while three other classes (41 boys and 29 girls) used the CMA, enriched with PowerPoint presentations. After the interventions (lessons on magnetism), the students in both groups were asked to describe in writing their understanding of magnetism accrued from the lessons. Their written descriptions were analyzed using flow map analyses to assess their content knowledge and its organisation in memory as evidence of cognitive structure. The extent of the CLE was measured using a published CLE survey. The results showed that the cognitive structures of the CMA students were more extensive, more thematically organised and richer in interconnectedness of thoughts than those of the TTA students. Moreover, the CMA students also perceived their classroom learning environment to be more constructivist than did their counterparts. It is therefore recommended that teachers consider using the CMA teaching technique to help students enrich their understanding, especially for more complex or abstract scientific content.

  19. Hyperspectral Data for Mangrove Species Mapping: A Comparison of Pixel-Based and Object-Based Approach

    Directory of Open Access Journals (Sweden)

    Muhammad Kamal

    2011-10-01

    Full Text Available Visual image interpretation and digital image classification have been used to map and monitor mangrove extent and composition for decades. The availability of high-spatial-resolution hyperspectral sensors can potentially improve our ability to differentiate mangrove species. However, little research has explored the use of pixel-based and object-based approaches on high-spatial-resolution hyperspectral datasets for this purpose. This study assessed the ability of CASI-2 data for mangrove species mapping using pixel-based and object-based approaches at the mouth of the Brisbane River area, southeast Queensland, Australia. Three mapping techniques were used in this study: spectral angle mapper (SAM) and linear spectral unmixing (LSU) for the pixel-based approaches, and multi-scale segmentation for the object-based image analysis (OBIA). The endmembers for the pixel-based approaches were collected based on an existing vegetation community map. Nine targeted classes were mapped in the study area by each approach, including three mangrove species: Avicennia marina, Rhizophora stylosa, and Ceriops australis. The mapping results showed that SAM produced accurate class polygons with only a few unclassified pixels (overall accuracy 69%, Kappa 0.57), LSU resulted in a patchy polygon pattern with many unclassified pixels (overall accuracy 56%, Kappa 0.41), and the object-based mapping produced the most accurate results (overall accuracy 76%, Kappa 0.67). Our results demonstrated that the object-based approach, which combined a rule-based and nearest-neighbor classification method, was the best classifier for mapping mangrove species and their adjacent environments.
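
    The SAM rule itself is compact enough to sketch: each pixel is assigned to the endmember with the smallest spectral angle, with an angle threshold producing the unclassified pixels mentioned above (the threshold and toy cube here are hypothetical):

```python
import numpy as np

def spectral_angle(pixel, endmember):
    """Spectral angle mapper (SAM): the angle (radians) between a pixel
    spectrum and a reference endmember spectrum. A smaller angle means
    a closer spectral match, independent of overall brightness."""
    cos = np.dot(pixel, endmember) / (
        np.linalg.norm(pixel) * np.linalg.norm(endmember))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def sam_classify(cube, endmembers, max_angle=0.1):
    """Assign each pixel to the endmember with the smallest angle;
    pixels whose best angle exceeds max_angle stay unclassified (-1)."""
    h, w, bands = cube.shape
    pixels = cube.reshape(-1, bands)
    angles = np.stack([[spectral_angle(p, e) for e in endmembers]
                       for p in pixels])
    labels = angles.argmin(axis=1)
    labels[angles.min(axis=1) > max_angle] = -1
    return labels.reshape(h, w)

# Hypothetical 3-band toy cube with two endmember spectra
cube = np.random.rand(4, 4, 3)
ends = np.array([[0.2, 0.5, 0.8], [0.7, 0.4, 0.1]])
print(sam_classify(cube, ends, max_angle=0.5))
```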

  20. On the statistical implications of certain Random permutations in Markovian Arrival Processes (MAPs) and second order self-similar processes

    DEFF Research Database (Denmark)

    Andersen, Allan T.; Nielsen, Bo Friis

    2000-01-01

    In this paper, we examine the implications of certain random permutations in an arrival process that have gained considerable interest in recent literature. The so-called internal and external shuffling have been used to explain phenomena observed in traffic traces from LANs. Loosely, the internal shuffling can be viewed as a way of performing local permutations in the arrival stream, while the external shuffling is a way of performing global permutations. We derive formulas for the correlation structures of the shuffled processes in terms of the original arrival process in great generality. The implications for the correlation structure when shuffling an exactly second-order self-similar process are examined. We apply the Markovian arrival process (MAP) as a tool to investigate whether general conclusions can be made with regard to the statistical implications of the shuffling experiments.
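
    Loosely following the definitions above, internal and external shuffling of an interarrival sequence can be sketched as follows (the block length is a hypothetical choice; the paper works with general formal definitions):

```python
import numpy as np

rng = np.random.default_rng(0)

def internal_shuffle(arrivals, block=100):
    """Internal shuffling (local permutations): permute interarrival
    times within consecutive blocks, leaving the block order intact."""
    out = arrivals.copy()
    for start in range(0, len(out), block):
        rng.shuffle(out[start:start + block])
    return out

def external_shuffle(arrivals, block=100):
    """External shuffling (global permutations): keep each block's
    internal order but permute the order of the blocks themselves."""
    blocks = [arrivals[s:s + block] for s in range(0, len(arrivals), block)]
    rng.shuffle(blocks)
    return np.concatenate(blocks)

x = rng.exponential(1.0, size=10_000)  # toy interarrival times
print(internal_shuffle(x)[:5], external_shuffle(x)[:5])
```

    Comparing the empirical autocorrelation of the two shuffled streams against the original is the kind of experiment whose statistical implications the paper analyses.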

  1. Integrating legacy soil information in a Digital Soil Mapping approach based on a modified conditioned Latin Hypercube Sampling design

    Science.gov (United States)

    Stumpf, Felix; Schmidt, Karsten; Behrens, Thorsten; Schoenbrodt-Stitt, Sarah; Scholten, Thomas

    2014-05-01

    One crucial component of a Digital Soil Mapping (DSM) framework is geo-referenced soil observations. Nevertheless, highly informative legacy soil information, acquired by traditional soil surveys, is often neglected because it does not accord with specific statistical DSM designs. The focus of this study is to integrate legacy data into a state-of-the-art DSM approach based on a modified conditioned Latin Hypercube Sampling (cLHS) design and Random Forest. Furthermore, by means of the cLHS modification, the scope of the otherwise unique cLHS sampling locations is widened in order to compensate for limited accessibility in the field, without diluting the maximally stratified cLHS design. The target variables of the modelling are the sand and clay fractions. The study site is a small mountainous hydrological catchment of 4.2 km² in the reservoir area of the Three Gorges Dam in Central China. The modification is accomplished by demarcating the histogram borders of each cLHS stratum, which are based on the multivariate cLHS feature space. Thereby, all potential sample locations per stratum are identified. This provides a possibility to integrate legacy data samples that match one of the newly created sample locations, and flexibility with respect to field accessibility. Consequently, six legacy data samples, out of a total sample size of n = 30, were integrated into the sampling design, and for all strata several potential sample locations were identified. The comparability of the modified and standard cLHS data sets is confirmed by (i) identifying their feature space coverage with respect to the cLHS stratifying variables, and (ii) assessing the Random Forest accuracy estimates.

  2. Integration of value stream map and strategic layout planning into DMAIC approach to improve carpeting process

    Directory of Open Access Journals (Sweden)

    Ayman Nagi

    2017-04-01

    Full Text Available Purpose: This paper presents an implementation of the Six Sigma DMAIC approach, applying lean tools and facilities layout techniques to reduce the occurrence of different types of nonconformities in the carpeting process. Such carpeting processes can be found in several industries, such as construction, aviation, and automotive. Design/methodology/approach: The improvement process was built through a sequential implementation of appropriate interconnected tools at each phase of the DMAIC approach. The tools used included Pareto analysis, control charts, the Ishikawa chart, 5-whys, failure mode and effect analysis, the process capability ratio, value stream mapping, and strategic layout planning. Findings: The carpeting process capability, the quality of the product, customer satisfaction, and the cost of poor quality were significantly improved. Specifically, the sigma level was improved from 2.297 to 2.886 and the defects per million opportunities (DPMO) was reduced from 21615 to 3905. Originality/value: This paper demonstrates the capability of the Six Sigma DMAIC approach to analyze, investigate, and remove the root causes of nonconformities in the carpeting (preparation-installation) process.

  3. Remote Sensing in Mapping Mangrove Ecosystems — An Object-Based Approach

    Directory of Open Access Journals (Sweden)

    Quoc Tuan Vo

    2013-01-01

    Full Text Available Over the past few decades, clearing for shrimp farming has caused severe losses of mangroves in the Mekong Delta (MD) of Vietnam. Although the increasing importance of shrimp aquaculture in Vietnam has brought significant financial benefits to local communities, the rapid and largely uncontrolled increase in aquacultural area has contributed to a considerable loss of mangrove forests and to environmental degradation. Although different approaches have been used for mangrove classification, no approach to date has addressed the challenges of the special conditions found in the aquaculture-mangrove system of the Ca Mau province of the MD. This paper presents an object-based classification approach for estimating the percentage of mangroves in mixed mangrove-aquaculture farming systems, to assist the government in monitoring the extent of the shrimp farming area. The method comprises multi-resolution segmentation and classification of SPOT5 data using a decision tree approach, as well as local knowledge from the region of interest. The results show accuracies higher than 75% for certain classes at the object level. Furthermore, we successfully detect areas with mixed aquaculture-mangrove land cover with high accuracies. Based on these results, mangrove development, especially within shrimp farming-mangrove systems, can be monitored. However, the mangrove forest cover fraction per object is affected by image segmentation and thus does not always correspond to the real farm boundaries. It therefore remains a serious challenge to accurately map mangrove forest cover within mixed systems.

  4. ANALYSIS OF FUZZY QUEUES: PARAMETRIC PROGRAMMING APPROACH BASED ON RANDOMNESS - FUZZINESS CONSISTENCY PRINCIPLE

    Directory of Open Access Journals (Sweden)

    Dhruba Das

    2015-04-01

    Full Text Available In this article, based on Zadeh's extension principle, we apply the parametric programming approach to construct the membership functions of the performance measures when the interarrival time and the service time are fuzzy numbers, following Baruah's Randomness-Fuzziness Consistency Principle. The Randomness-Fuzziness Consistency Principle leads to defining a normal law of fuzziness using two different laws of randomness. Two fuzzy queues, FM/M/1 and M/FM/1, are studied, and the membership functions of their system characteristics are constructed based on the aforesaid principle. The former represents a queue with fuzzy exponential arrivals and a crisp exponential service rate, while the latter represents a queue with a crisp exponential arrival rate and a fuzzy exponential service rate.
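
    The parametric-programming step can be sketched for a simple performance measure: at each α-cut the fuzzy arrival rate becomes an interval, and the bounds of the measure follow from optimizing over that interval (a minimal sketch with a hypothetical triangular fuzzy arrival rate; the paper's membership construction differs in detail):

```python
import numpy as np

def alpha_cut_triangular(a, b, c, alpha):
    """Interval of a triangular fuzzy number (a, b, c) at level alpha."""
    return a + alpha * (b - a), c - alpha * (c - b)

def mm1_mean_system_size(lam, mu):
    """Mean number in system for a stable M/M/1 queue: L = rho/(1-rho)."""
    rho = lam / mu
    return rho / (1.0 - rho)

mu = 10.0                          # crisp service rate
for alpha in np.linspace(0.0, 1.0, 5):
    lo, hi = alpha_cut_triangular(4.0, 5.0, 6.0, alpha)  # fuzzy lambda cut
    # L is increasing in lambda, so the interval ends give the bounds;
    # in general one would solve min/max programs over the cut.
    print(f"alpha={alpha:.2f}  L in [{mm1_mean_system_size(lo, mu):.3f},"
          f" {mm1_mean_system_size(hi, mu):.3f}]")
```

    Stacking these intervals over all α-levels traces out the membership function of the performance measure.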

  5. Clinical medication review tool for polypharmacy: Mapping approach for pharmacotherapeutic classifications.

    Science.gov (United States)

    Mizokami, Fumihiro; Mizuno, Tomohiro; Mori, Tomoyo; Nagamatsu, Tadashi; Endo, Hideharu; Hirashita, Tomoyuki; Ichino, Takanobu; Akishita, Masahiro; Furuta, Katsunori

    2017-11-01

    Polypharmacy is an extremely important problem, because it increases the risk of adverse drug reactions. The aim of the current study was to create a clinical medication review tool to detect inappropriate medication use, and to assess this new method with elderly Japanese patients. The new method involves optimizing prescription drugs by indication, based on the chronic disease-anatomical therapeutic class code list. The present study investigated the prevalence of potentially inappropriate medications in 5667 Japanese patients aged ≥65 years with polypharmacy (≥5 drugs), in comparison with the Beers criteria 2012. We propose a new method called the Mapping Approach for Pharmacotherapeutic Classifications: (i) identify the chronic disease-anatomical therapeutic class code assigned to the prescription drugs; (ii) identify the chronic disease-anatomical therapeutic class code corresponding to the patient's chronic disease; (iii) compare the prescription drug and patient chronic disease codes; and (iv) identify the appropriateness of medication use based on the comparison (appropriate use is defined as matching codes). The mean number of potentially inappropriate medications detected differed significantly between the mapping approach and the Beers criteria 2012 (3.1 ± 2.6 vs 0.6 ± 0.8 drugs, respectively). The Mapping Approach for Pharmacotherapeutic Classifications is highly dependent on the chronic condition; pharmacists should confirm the chronic condition with the treating physician before reducing a patient's medications. We hope this process will further influence prescribing patterns, and decrease the inappropriate use of medications and associated adverse drug reactions in older adults. Geriatr Gerontol Int 2017; 17: 2025-2033. © 2017 Japan Geriatrics Society.
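
    Steps (i)-(iv) amount to a set comparison between drug codes and disease codes, as in this minimal sketch (the code list and drug names are hypothetical illustrations, not the study's actual list):

```python
# Hypothetical chronic disease-ATC code list mapping drugs to indications
PRESCRIPTION_TO_CODE = {"metformin": "E11-A10BA",
                        "amlodipine": "I10-C08CA",
                        "ticlopidine": "I63-B01AC"}

def review_medications(prescriptions, patient_disease_codes):
    """Flag a drug as potentially inappropriate when its chronic
    disease-ATC code does not match any code for the patient's
    documented chronic diseases (matching codes = appropriate use)."""
    flagged = []
    for drug in prescriptions:
        code = PRESCRIPTION_TO_CODE.get(drug)     # step (i)
        if code not in patient_disease_codes:     # steps (ii)-(iii)
            flagged.append(drug)                  # step (iv)
    return flagged

# Patient with diabetes and hypertension codes, but no stroke diagnosis
print(review_medications(["metformin", "amlodipine", "ticlopidine"],
                         {"E11-A10BA", "I10-C08CA"}))  # -> ['ticlopidine']
```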

  6. A MapReduce approach to diminish imbalance parameters for big deoxyribonucleic acid dataset.

    Science.gov (United States)

    Kamal, Sarwar; Ripon, Shamim Hasnat; Dey, Nilanjan; Ashour, Amira S; Santhi, V

    2016-07-01

    In the age of the information superhighway, big data play a significant role in information processing, extraction, retrieval and management. In computational biology, the continuous challenge is to manage the biological data. Data mining techniques sometimes fall short of new space and time requirements, so it is critical to process massive amounts of data to retrieve knowledge. The existing software and automated tools to handle big data sets are not sufficient. As a result, an expandable mining technique that exploits the large storage and processing capability of distributed or parallel processing platforms is essential. In this analysis, a contemporary distributed clustering methodology for imbalance data reduction using a k-nearest neighbor (K-NN) classification approach is introduced. The pivotal objective of this work is to represent real training data sets with a reduced number of elements or instances. These reduced data sets ensure faster data classification and standard storage management with less sensitivity. However, general data reduction methods cannot manage very big data sets. To minimize these difficulties, a MapReduce-oriented framework is designed using various clusters of automated contents, comprising multiple algorithmic approaches. To test the proposed approach, a real DNA (deoxyribonucleic acid) dataset consisting of 90 million pairs was used. The proposed model reduces the imbalanced data sets from large-scale data sets without loss of accuracy. The obtained results show that the MapReduce-based K-NN classifier provided accurate results for big DNA data. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
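
    The MapReduce split of a K-NN query can be sketched as a map phase that finds local candidates per data chunk and a reduce phase that merges them and votes (a toy stand-in; the paper's framework adds clustering-based data reduction on top):

```python
import heapq
import numpy as np

def knn_map(chunk, query, k):
    """Map phase: each worker scans only its chunk of the training set
    and emits its local k nearest neighbours to the query."""
    dists = np.linalg.norm(chunk[:, :-1] - query, axis=1)
    idx = np.argsort(dists)[:k]
    return [(dists[i], chunk[i, -1]) for i in idx]  # (distance, label)

def knn_reduce(partials, k):
    """Reduce phase: merge the per-chunk candidate lists and vote."""
    best = heapq.nsmallest(k, (p for part in partials for p in part))
    labels = [label for _, label in best]
    return max(set(labels), key=labels.count)

# Toy data: 2 features + label column, split into 4 'mapper' chunks
rng = np.random.default_rng(1)
data = np.hstack([rng.normal(size=(400, 2)),
                  rng.integers(0, 2, size=(400, 1))])
chunks = np.array_split(data, 4)
query = np.array([0.1, -0.2])
print(knn_reduce([knn_map(c, query, 5) for c in chunks], 5))
```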

  7. Development of "Long Live Love+," a School-Based Online Sexual Health Programme for Young Adults. An Intervention Mapping Approach

    Science.gov (United States)

    Mevissen, Fraukje E. F.; van Empelen, Pepijn; Watzeels, Anita; van Duin, Gee; Meijer, Suzanne; van Lieshout, Sanne; Kok, Gerjo

    2018-01-01

    This paper describes the development of a Dutch online programme called "Long Live Love+" focusing on positive, coercion-free relationships, contraception use, and the prevention of STIs, using the Intervention Mapping (IM) approach. All six steps of the approach were followed. Step 1 confirmed the need for a sexual health programme…

  8. Resistance of a 1D random chain: Hamiltonian version of the transfer matrix approach

    Science.gov (United States)

    Dossetti-Romero, V.; Izrailev, F. M.; Krokhin, A. A.

    2004-01-01

    We study some mesoscopic properties of electron transport by employing one-dimensional chains and the Anderson tight-binding model. Principal attention is paid to the resistance of finite-length chains with a disordered white-noise potential. We develop a new version of the transfer matrix approach based on the equivalence of a discrete Schrödinger equation and a two-dimensional Hamiltonian map describing a parametric kicked oscillator. In the two limiting cases of the ballistic and localized regimes, we demonstrate how analytical results for the mean resistance and its second moment can be derived directly from averaging over classical trajectories of the Hamiltonian map. We also discuss the implications of the single parameter scaling hypothesis for the resistance.
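
    The equivalence the authors exploit can be reproduced numerically: the discrete Schrödinger equation ψ_{n+1} = (E − ε_n)ψ_n − ψ_{n−1} is iterated as a two-dimensional map, and the average log-growth of the trajectory gives the inverse localization length that controls the resistance in the localized regime (the disorder strength below is a hypothetical choice, and the paper's kicked-oscillator parametrization differs):

```python
import numpy as np

rng = np.random.default_rng(42)

def lyapunov_exponent(E, W, N=200_000):
    """Iterate the 2D map equivalent to the 1D Anderson model,
    psi_{n+1} = (E - eps_n) psi_n - psi_{n-1}, with white-noise site
    energies eps_n uniform on [-W/2, W/2]. The average log-growth of
    the trajectory estimates the inverse localization length."""
    psi_prev, psi = 1.0, 1.0
    total = 0.0
    for eps in rng.uniform(-W / 2, W / 2, size=N):
        psi_prev, psi = psi, (E - eps) * psi - psi_prev
        norm = np.hypot(psi, psi_prev)
        total += np.log(norm)   # accumulate growth, then renormalize
        psi /= norm
        psi_prev /= norm
    return total / N

gamma = lyapunov_exponent(E=0.0, W=1.0)
# In the localized regime the typical resistance grows ~ exp(2*gamma*L)
print(f"inverse localization length ~ {gamma:.4f}")
```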

  9. The land morphology approach to flood risk mapping: An application to Portugal.

    Science.gov (United States)

    Cunha, N S; Magalhães, M R; Domingos, T; Abreu, M M; Küpfer, C

    2017-05-15

    In the last decades, the increasing vulnerability of floodplains has been linked to societal changes such as population density growth, land use changes, and water use patterns, among other factors. Land morphology directly influences surface water flow, the transport of sediments, soil genesis, local climate, and vegetation distribution. Therefore, land morphology, land use, and land management directly influence the genesis of flood risk. However, attention is not always given to the underlying geomorphological and ecological processes that influence the dynamics of rivers and their floodplains. Floodplains are considered part of a larger system called the Wet System (WS). The WS includes permanent and temporary streams, water bodies, wetlands, and valley bottoms. Valley bottom is a broad concept which comprehends not only floodplains but also flat and concave areas, contiguous to streams, in which the slope is less than 5%. This is addressed through a consistent method, based on a land morphology approach, that classifies landforms according to their hydrological position in the watershed; the method relies on flat areas (slopes less than 5%), surface curvature, and hydrological features. The comparison between the WS and flood risk data from the Portuguese Environmental Agency for the main rivers of mainland Portugal showed that in the downstream areas of watersheds, valley bottoms coincide with floodplains modelled by hydrological methods. Mapping the WS is of particular interest for analysing the position and function of river ecosystems in the landscape, from upstream to downstream areas of the watershed. This morphological approach demands less data and is less time-consuming than hydrological methods, and it can be used as a preliminary delimitation of floodplains and potential flood risk areas where no hydrological data are available. The results were also compared with the national land use/cover map and examined in detail for the Trancão river basin, in the Lisbon region.
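
    The flat-and-concave criterion for valley bottoms is straightforward to sketch on a raster DEM (the cell size and toy DEM are hypothetical; the full method also uses hydrological position in the watershed):

```python
import numpy as np

def valley_bottom_mask(dem, cell=30.0, slope_thresh=0.05):
    """Flag candidate valley-bottom cells: slope below 5% and locally
    concave (positive discrete Laplacian, i.e. the cell sits lower than
    the mean of its neighbours). A stand-in for the full landform
    classification, which also uses hydrological features."""
    gy, gx = np.gradient(dem, cell)
    slope = np.hypot(gx, gy)                  # rise/run, dimensionless
    lap = (np.roll(dem, 1, 0) + np.roll(dem, -1, 0) +
           np.roll(dem, 1, 1) + np.roll(dem, -1, 1) - 4 * dem)
    return (slope < slope_thresh) & (lap > 0)

# Toy DEM: a broad valley running through a gently tilted plane
y, x = np.mgrid[0:100, 0:100]
dem = 0.02 * x * 30.0 + 5.0 * np.abs(y - 50) / 50.0
print(valley_bottom_mask(dem).sum(), "candidate valley-bottom cells")
```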

  10. Comparison of Flood Inundation Mapping Techniques between Different Modeling Approaches and Satellite Imagery

    Science.gov (United States)

    Zhang, J.; Munasinghe, D.; Huang, Y. F.; Lin, P.; Fang, N. Z.; Cohen, S.; Tsang, Y. P.

    2016-12-01

    Flood inundation extent serves as a crucial information source for both hydrologists and decision makers. Accurate and timely inundation mapping can potentially improve flood risk management and reduce flood damage. In this study, the authors applied two modeling approaches to estimate the flood inundation area for a large flooding event that occurred in May 2016 on the Brazos River: the Height Above Nearest Drainage method combined with the National Hydrography Dataset (NHD-HAND), and the International River Interface Cooperative - Flow and Sediment Transport with Morphological Evolution of Channels (iRIC-FaSTMECH). NHD-HAND features a terrain model that simplifies the dynamic flood inundation mapping process, while iRIC-FaSTMECH is a hydrodynamic model that simulates flood extent under a quasi-steady approximation. In terms of data sources, HAND and iRIC utilized the National Water Model (NWM) output and United States Geological Survey (USGS) stream gage data, respectively. The flood inundation extents generated from these two approaches were validated against Landsat 8 satellite imagery. Four remote sensing classification techniques were used to provide alternative observations: supervised, unsupervised, normalized difference water index, and delta-cue change detection of water. According to the quantitative analysis comparing simulated areas with the different remote sensing classifications, the advanced fitness index of the iRIC simulation ranges from 57.5% to 69.9%, while that of HAND ranges from 49.4% to 55.5%. We found that even though HAND captures some details of the inundation extent better than iRIC, it has problems in certain areas where subcatchments are not behaving independently, especially for extreme flooding events. The iRIC model performs better in this case; however, we cannot simply conclude that iRIC is a better-suited approach than HAND, considering the uncertainties in the remote sensing observations and the iRIC model parameters. Further research will include more ...

  11. An Approach to Assessment of Relief Formats for Hardcopy Topographic Maps

    Science.gov (United States)

    1979-04-01

    ... determine what types of relief information must be extracted from maps by representative users, we studied and analyzed Army Field Manual FM 21-16, "Map ..." ... extensive review material ... (a) study of one map format might facilitate (or hinder) use of another map format. TRAINING NECESSARY FOR TEST USE: This test ...

  12. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-03-04

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties associated with MCDA techniques may significantly impact the results, which can sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods, in conjunction with Monte Carlo simulation (MCS) and Dempster-Shafer theory, are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely the analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster-Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage, based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA revealed better performance in the reliability assessment. The WLC operation ...
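
    The MCS-plus-WLC building block can be sketched as follows: perturb the criterion weights, renormalize, and examine the per-cell spread of the resulting susceptibility maps (the weights and perturbation level below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)

def wlc_susceptibility(criteria, weights):
    """Weighted linear combination: susceptibility = sum_k w_k * x_k,
    with criteria standardized to [0, 1] and weights summing to 1."""
    return np.tensordot(weights, criteria, axes=1)

def monte_carlo_wlc(criteria, base_weights, sd=0.05, n_runs=500):
    """Propagate weight uncertainty: perturb the criterion weights,
    renormalize, and collect the WLC map for each realization. The
    per-cell spread of the stack measures output uncertainty."""
    maps = []
    for _ in range(n_runs):
        w = np.abs(base_weights + rng.normal(0.0, sd, base_weights.size))
        maps.append(wlc_susceptibility(criteria, w / w.sum()))
    stack = np.array(maps)
    return stack.mean(axis=0), stack.std(axis=0)

# Toy stack of 3 standardized criteria over a 50x50 grid
criteria = rng.random((3, 50, 50))
mean_map, sd_map = monte_carlo_wlc(criteria, np.array([0.5, 0.3, 0.2]))
print(mean_map.shape, sd_map.max())
```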

  13. Random spectrum loading of dental implants: An alternative approach to functional performance assessment.

    Science.gov (United States)

    Shemtov-Yona, K; Rittel, D

    2016-09-01

    The fatigue performance of dental implants is usually assessed on the basis of cyclic S/N curves. This neither provides information on the anticipated service performance of the implant, nor allows detailed comparisons between implants unless a thorough statistical analysis is performed, of a kind not currently required by certification standards. The notion of an endurance limit is deemed to be of limited applicability, given the unavoidable stress concentrations and random load excursions that characterize dental implants and their service conditions. We propose a completely different approach, based on random spectrum loading, as long used in aeronautical design. The implant is randomly loaded by a sequence of loads encompassing all load levels it would endure during its service life. This approach provides a quantitative and comparable estimate of performance in terms of lifetime, based on the very fact that the implant will fracture sooner or later, instead of defining a fatigue endurance limit of limited practical application. Five commercial monolithic Ti-6Al-4V implants were tested under cyclic loading, and another five under spectrum loading conditions, in dry air at room temperature. The failure modes and fracture planes were identical for all implants. The approach is discussed, including its potential applications for systematic, straightforward and reliable comparisons of various implant designs and environments, without the need for cumbersome statistical analyses. It is believed that spectrum loading can be considered for the generation of new standardization procedures and design applications. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. A new LPV modeling approach using PCA-based parameter set mapping to design a PSS.

    Science.gov (United States)

    Jabali, Mohammad B Abolhasani; Kazemi, Mohammad H

    2017-01-01

    This paper presents a new methodology for the modeling and control of power systems based on an uncertain polytopic linear parameter-varying (LPV) approach using parameter set mapping with principal component analysis (PCA). An LPV representation of the power system dynamics is generated by linearization of its differential-algebraic equations about the transient operating points for some given specific faults, capturing the system's nonlinear properties. The time response of the output signal in the transient state plays the role of the scheduling signal used to construct the LPV model. A set of sample points of the dynamic response is formed to generate an initial LPV model. PCA-based parameter set mapping is then used to reduce the number of models and generate a reduced LPV model. This model is used to design a robust pole placement controller that assigns the poles of the power system to a linear matrix inequality (LMI) region, such that the response of the power system has a proper damping ratio for all of the different oscillation modes. The proposed scheme is applied to controller synthesis of a power system stabilizer, and its performance is compared with a tuned standard conventional PSS using nonlinear simulation of a multi-machine power network. The results under various conditions show the robust performance of the proposed controller.
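
    The parameter set mapping step can be sketched with plain PCA: the raw scheduling trajectories are projected onto a few dominant directions, which then schedule the reduced LPV model (the trajectories below are hypothetical):

```python
import numpy as np

def pca_parameter_mapping(theta, n_keep=2):
    """PCA-based parameter set mapping: project the trajectories of the
    raw scheduling parameters onto the first principal components, so
    the LPV model is scheduled on a few dominant directions instead of
    the full parameter set. theta: array of shape (n_samples, n_params)."""
    mean = theta.mean(axis=0)
    centered = theta - mean
    # SVD of the centered data gives the principal directions
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_keep]                # (n_keep, n_params)
    rho = centered @ basis.T           # reduced scheduling signal
    return rho, basis, mean

# Hypothetical scheduling trajectories sampled from a transient response
rng = np.random.default_rng(3)
t = np.linspace(0, 5, 200)
theta = np.column_stack([np.sin(t),
                         0.9 * np.sin(t) + 0.05 * rng.normal(size=t.size),
                         np.cos(t)])
rho, basis, mean = pca_parameter_mapping(theta, n_keep=2)
print(rho.shape)   # (200, 2): two dominant scheduling directions
```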

  15. Policy, Research and Residents' Perspectives on Built Environments Implicated in Heart Disease: A Concept Mapping Approach.

    Science.gov (United States)

    Stankov, Ivana; Howard, Natasha J; Daniel, Mark; Cargo, Margaret

    2017-02-09

    An underrepresentation of stakeholder perspectives within urban health research arguably limits our understanding of what is a multi-dimensional and complex relationship between the built environment and health. By engaging a wide range of stakeholders using a participatory concept mapping approach, this study aimed to achieve a more holistic and nuanced understanding of the built environments shaping disease risk, specifically cardiometabolic risk (CMR). Moreover, this study aimed to ascertain the importance and changeability of identified environments through government action. Through the concept mapping process, community members, researchers, government and non-government stakeholders collectively identified eleven clusters encompassing 102 built environmental domains related to CMR, a number of which are underrepresented within the literature. Among the identified built environments, open space, public transportation and pedestrian environments were highlighted as key targets for policy intervention. Whilst there was substantive convergence in stakeholder groups' perspectives concerning the built environment and CMR, there were disparities in the level of importance government stakeholders and community members respectively assigned to pedestrian environments and street connectivity. These findings support the role of participatory methods in strengthening how urban health issues are understood and in affording novel insights into points of action for public health and policy intervention.

  16. Policy, Research and Residents’ Perspectives on Built Environments Implicated in Heart Disease: A Concept Mapping Approach

    Directory of Open Access Journals (Sweden)

    Ivana Stankov

    2017-02-01

    Full Text Available An underrepresentation of stakeholder perspectives within urban health research arguably limits our understanding of what is a multi-dimensional and complex relationship between the built environment and health. By engaging a wide range of stakeholders using a participatory concept mapping approach, this study aimed to achieve a more holistic and nuanced understanding of the built environments shaping disease risk, specifically cardiometabolic risk (CMR). Moreover, this study aimed to ascertain the importance and changeability of identified environments through government action. Through the concept mapping process, community members, researchers, government and non-government stakeholders collectively identified eleven clusters encompassing 102 built environmental domains related to CMR, a number of which are underrepresented within the literature. Among the identified built environments, open space, public transportation and pedestrian environments were highlighted as key targets for policy intervention. Whilst there was substantive convergence in stakeholder groups' perspectives concerning the built environment and CMR, there were disparities in the level of importance government stakeholders and community members respectively assigned to pedestrian environments and street connectivity. These findings support the role of participatory methods in strengthening how urban health issues are understood and in affording novel insights into points of action for public health and policy intervention.

  17. Geographical information system approaches for hazard mapping of dilute lahars on Montserrat, West Indies

    Science.gov (United States)

    Darnell, A. R.; Barclay, J.; Herd, R. A.; Phillips, J. C.; Lovett, A. A.; Cole, P.

    2012-08-01

    Many research tools for lahar hazard assessment have proved wholly unsuitable for practical application to an active volcanic system where field measurements are challenging to obtain. Two simple routing models, with minimal data demands and implemented in a geographical information system (GIS), were applied to dilute lahars originating from the Soufrière Hills Volcano, Montserrat. Single-direction flow routing by path of steepest descent, commonly used for simulating normal stream-flow, was tested against LAHARZ, an established lahar model calibrated for debris flows, for its ability to replicate the main flow routes. Comparing the ways in which these models capture observed changes, and how the different modelled paths deviate, can also indicate where dilute lahars do not follow the behaviour expected from single-phase flow models. Data were collected over two field seasons and provide (1) an overview of gross morphological change after one rainy season, (2) details of dominant channels at the time of measurement, and (3) order-of-magnitude estimates of individual flow volumes. The modelling results suggested that both GIS-based predictive tools had associated benefits. Dominant flow routes observed in the field were generally well predicted using the hydrological approach with a consideration of elevation error, while LAHARZ was comparatively more successful at mapping lahar dispersion and was better suited to long-term hazard assessment. This research suggests that end-member models can have utility for first-order dilute lahar hazard mapping.
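
    The single-direction routing model is essentially the classic D8 rule, sketched below on a toy volcano DEM (the DEM and tie-breaking tilt are hypothetical; real applications would condition the DEM and handle elevation error, as the study does):

```python
import numpy as np

def steepest_descent_path(dem, start, max_steps=10_000):
    """Single-direction (D8) flow routing: from the start cell, always
    step to the lowest of the 8 neighbours until a pit or the grid
    edge is reached."""
    moves = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]
    r, c = start
    path = [(r, c)]
    for _ in range(max_steps):
        nbrs = [(dem[r + dr, c + dc], (r + dr, c + dc))
                for dr, dc in moves
                if 0 <= r + dr < dem.shape[0] and 0 <= c + dc < dem.shape[1]]
        z, (nr, nc) = min(nbrs)
        if z >= dem[r, c]:          # pit: no lower neighbour
            break
        r, c = nr, nc
        path.append((r, c))
    return path

# Toy cone-shaped volcano DEM; route a flow from near the summit
y, x = np.mgrid[0:60, 0:60]
dem = -np.hypot(x - 30, y - 30) + 0.01 * x   # slight tilt breaks ties
print(steepest_descent_path(dem, (29, 29))[:5])
```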

  18. Policy, Research and Residents’ Perspectives on Built Environments Implicated in Heart Disease: A Concept Mapping Approach

    Science.gov (United States)

    Stankov, Ivana; Howard, Natasha J.; Daniel, Mark; Cargo, Margaret

    2017-01-01

    An underrepresentation of stakeholder perspectives within urban health research arguably limits our understanding of what is a multi-dimensional and complex relationship between the built environment and health. By engaging a wide range of stakeholders using a participatory concept mapping approach, this study aimed to achieve a more holistic and nuanced understanding of the built environments shaping disease risk, specifically cardiometabolic risk (CMR). Moreover, this study aimed to ascertain the importance and changeability of identified environments through government action. Through the concept mapping process, community members, researchers, government and non-government stakeholders collectively identified eleven clusters encompassing 102 built environmental domains related to CMR, a number of which are underrepresented within the literature. Among the identified built environments, open space, public transportation and pedestrian environments were highlighted as key targets for policy intervention. Whilst there was substantive convergence in stakeholder groups’ perspectives concerning the built environment and CMR, there were disparities in the level of importance government stakeholders and community members respectively assigned to pedestrian environments and street connectivity. These findings support the role of participatory methods in strengthening how urban health issues are understood and in affording novel insights into points of action for public health and policy intervention. PMID:28208786

  19. Reasons for electronic cigarette use beyond cigarette smoking cessation: A concept mapping approach.

    Science.gov (United States)

    Soule, Eric K; Rosas, Scott R; Nasim, Aashir

    2016-05-01

    Electronic cigarettes (ECIGs) continue to grow in popularity; however, limited research has examined reasons for ECIG use. This study used an integrated, mixed-method participatory research approach called concept mapping (CM) to characterize and describe adults' reasons for using ECIGs. A total of 108 adults completed a multi-module online CM study that consisted of brainstorming statements about their reasons for ECIG use, sorting each statement into conceptually similar categories, and then rating each statement based on whether it represented a reason why they had used an ECIG in the past month. Participants brainstormed a total of 125 unique statements related to their reasons for ECIG use. Multivariate analyses generated a map revealing 11 interrelated components or domains that characterized their reasons for use. Importantly, reasons related to Cessation Methods, Perceived Health Benefits, Private Regard, Convenience and Conscientiousness were rated significantly higher than other categories of reasons for ECIG use (p<.05). There were also significant differences in participants' endorsement of reasons based on their demography and ECIG behaviors. This study shows that ECIG users are motivated to use ECIGs for many reasons. ECIG regulations should address these reasons for ECIG use in addition to smoking cessation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. An efficient unsupervised index based approach for mapping urban vegetation from IKONOS imagery

    Science.gov (United States)

    Anchang, Julius Y.; Ananga, Erick O.; Pu, Ruiliang

    2016-08-01

    Despite the increased availability of high-resolution satellite image data, their operational use for mapping urban land cover in Sub-Saharan Africa continues to be limited by a lack of computational resources and technical expertise. As such, there is a need for simple and efficient image classification techniques. Using Bamenda in North West Cameroon as a test case, we investigated two completely unsupervised pixel-based approaches to extract tree/shrub (TS) and ground vegetation (GV) cover from an IKONOS-derived soil adjusted vegetation index. These included: (1) a simple Jenks Natural Breaks classification and (2) a two-step technique that combined the Jenks algorithm with agglomerative hierarchical clustering. Both techniques were compared with each other and with a non-linear support vector machine (SVM) for classification performance. While overall classification accuracy was generally high for all techniques (>90%), One-Way Analysis of Variance tests revealed that the two-step technique outperformed the simple Jenks classification in predicting the GV class. It also outperformed the SVM in predicting the TS class. We conclude that the unsupervised methods are technically as good and practically superior for efficient urban vegetation mapping in budget- and technically-constrained regions such as Sub-Saharan Africa.
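
    A natural-breaks style classification of a vegetation index can be sketched with a 1D within-class-variance minimizer; the k-means stand-in below pursues the same objective as the Jenks optimization but is not the exact algorithm, and the SAVI values are synthetic:

```python
import numpy as np

def natural_breaks_1d(values, n_classes=3, n_iter=50):
    """Class breaks for a 1D index (e.g., SAVI) by minimizing
    within-class variance: a 1D k-means stand-in for Jenks Natural
    Breaks, which pursues the same goodness-of-variance objective."""
    centers = np.quantile(values, np.linspace(0.1, 0.9, n_classes))
    for _ in range(n_iter):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]),
                           axis=1)
        centers = np.array([values[labels == k].mean()
                            for k in range(n_classes)])
    c = np.sort(centers)
    # break points: midpoints between adjacent (sorted) class centers
    return (c[:-1] + c[1:]) / 2.0

savi = np.concatenate([np.random.normal(0.10, 0.03, 500),   # bare ground
                       np.random.normal(0.35, 0.05, 300),   # ground veg.
                       np.random.normal(0.65, 0.05, 200)])  # tree/shrub
print(natural_breaks_1d(savi, n_classes=3))
```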

  1. Data Mining Approaches for Landslide Susceptibility Mapping in Umyeonsan, Seoul, South Korea

    Directory of Open Access Journals (Sweden)

    Sunmin Lee

    2017-07-01

    Full Text Available The application of data mining models has become increasingly popular in recent years in assessments of a variety of natural hazards such as landslides and floods. Data mining techniques are useful for understanding the relationships between events and their influencing variables. Because landslides are influenced by a combination of factors, including geomorphological and meteorological factors, data mining techniques are helpful in elucidating the mechanisms by which these complex factors affect landslide events. In this study, spatial data mining approaches based on data on landslide locations in a geographic information system environment were investigated. The topographical factors of slope, aspect, curvature, topographic wetness index, stream power index, slope length factor, standardized height, valley depth, and downslope distance gradient were determined using topographical maps. Additional soil and forest variables were derived from national soil and forest maps. A total of 17 variables affecting the frequency of landslide occurrence were selected to construct a spatial database, and support vector machine (SVM) and artificial neural network (ANN) models were applied to predict landslide susceptibility from the selected factors. In the SVM model, linear, polynomial, radial basis function, and sigmoid kernels were applied in sequence; the model yielded 72.41%, 72.83%, 77.17% and 72.79% accuracy, respectively. The ANN model yielded a validation accuracy of 78.41%. The results of this study are useful for guiding effective strategies for the prevention and management of landslides in urban areas.
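
    The kernel comparison reported above can be reproduced in outline with scikit-learn (synthetic features stand in for the 17-variable landslide database):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 17))          # 17 conditioning factors
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 0)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_tr)

# Apply the four kernels in sequence, as in the study
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel).fit(scaler.transform(X_tr), y_tr)
    acc = clf.score(scaler.transform(X_te), y_te)
    print(f"{kernel:8s} accuracy: {acc:.4f}")
```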

  2. A new LPV modeling approach using PCA-based parameter set mapping to design a PSS

    Directory of Open Access Journals (Sweden)

    Mohammad B. Abolhasani Jabali

    2017-01-01

    Full Text Available This paper presents a new methodology for the modeling and control of power systems based on an uncertain polytopic linear parameter-varying (LPV) approach using parameter set mapping with principal component analysis (PCA). An LPV representation of the power system dynamics is generated by linearization of its differential-algebraic equations about the transient operating points for some given specific faults, capturing the system's nonlinear properties. The time response of the output signal in the transient state plays the role of the scheduling signal used to construct the LPV model. A set of sample points of the dynamic response is formed to generate an initial LPV model. PCA-based parameter set mapping is then used to reduce the number of models and generate a reduced LPV model. This model is used to design a robust pole placement controller that assigns the poles of the power system to a linear matrix inequality (LMI) region, such that the response of the power system has a proper damping ratio for all of the different oscillation modes. The proposed scheme is applied to controller synthesis of a power system stabilizer, and its performance is compared with a tuned standard conventional PSS using nonlinear simulation of a multi-machine power network. The results under various conditions show the robust performance of the proposed controller.

  3. Multi-scale hierarchical approach for parametric mapping: assessment on multi-compartmental models.

    Science.gov (United States)

    Rizzo, G; Turkheimer, F E; Bertoldo, A

    2013-02-15

    This paper investigates a new hierarchical method to apply basis functions to mono- and multi-compartmental models (Hierarchical-Basis Function Method, H-BFM) at the voxel level. This method identifies the parameters of the compartmental model in its nonlinearized version, integrating information derived at the region of interest (ROI) level by segmenting the cerebral volume based on anatomical definition or functional clustering. We present the results obtained by using a two-tissue, four-rate-constant model with two different tracers ([(11)C]FLB457 and [carbonyl-(11)C]WAY100635), one of the most complex models used in receptor studies, especially at the voxel level. H-BFM is robust, and its application to both [(11)C]FLB457 and [carbonyl-(11)C]WAY100635 allows accurate and precise parameter estimates, good quality parametric maps, and a low percentage of voxels with estimates outside physiological bounds. In particular, different from other proposed approaches, this method can also be used when the linearization of the model is not appropriate. We expect that applying it to clinical data will generate reliable parametric maps. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. Epitope mapping by random peptide phage display reveals essential residues for vaccinia extracellular enveloped virion spread

    Directory of Open Access Journals (Sweden)

    He Yong

    2012-09-01

    Full Text Available Abstract Background A33 is a type II integral membrane protein expressed on the extracellular enveloped form of vaccinia virus (VACV). Passive transfer of A33-directed monoclonal antibodies or vaccination with an A33 subunit vaccine confers protection against lethal poxvirus challenge in animal models. Homologs of A33 are highly conserved among members of the Orthopoxvirus genus and are potential candidates for inclusion in vaccines or assays targeting extracellular enveloped virus activity. One monoclonal antibody directed against VACV A33, MAb-1G10, has been shown to target a conformation-dependent epitope. Interestingly, while it recognizes VACV A33 as well as the corresponding variola homolog, it does not bind to the monkeypox homolog. In this study, we utilized a random phage display library to investigate the epitope recognized by MAb-1G10 that is critical for facilitating cell-to-cell spread of vaccinia virus. Results By screening with linear or conformational random phage libraries, we found that phages binding to MAb-1G10 display the consensus motif CEPLC, with a disulfide bond formed between the two cysteine residues required for MAb-1G10 binding. Although the phage motif contained no linear sequences homologous to VACV A33, structure modeling and analysis suggested that residue D115 is important to form the minimal epitope core. A panel of point mutants expressing the ectodomain of the A33 protein was generated and analyzed either by binding assays such as ELISA and immunoprecipitation, or by a functional assessment blocking MAb-1G10-mediated comet inhibition in cell culture. Conclusions These results confirm L118 as a component of the MAb-1G10 binding epitope, and further identify D115 as an essential residue. By defining the minimum conformational structure, as well as the conformational arrangement of a short peptide sequence recognized by MAb-1G10, these results introduce the possibility of designing small molecule mimetics that may ...

  5. The D4Science Approach toward Grid Resource Sharing: The Species Occurrence Maps Generation Case

    Science.gov (United States)

    Candela, Leonardo; Pagano, Pasquale

    Nowadays science is highly multidisciplinary and requires innovative research environments. Such research environments, also known as Virtual Research Environments, should be powerful and flexible enough to help researchers in all disciplines manage the complex range of tasks involved in carrying out eScience activities. They should support computationally intensive, data-intensive and collaboration-intensive tasks on both small and large scales. The community to be served by a specific research environment is expected to be potentially distributed across multiple organizational domains and institutions. This paper discusses the approach put in place in the context of the D4Science EU project to enable on-demand production of Virtual Research Environments by relying on an innovative, grid-based Infrastructure. In particular, the foundational principles, the enabling technology and the concrete experience resulting from developing (i) a production Infrastructure and (ii) a Virtual Research Environment for generating predictive species distribution maps are described.

  6. F-MAP: A Bayesian approach to infer the gene regulatory network using external hints.

    Science.gov (United States)

    Shahdoust, Maryam; Pezeshk, Hamid; Mahjub, Hossein; Sadeghi, Mehdi

    2017-01-01

    The common topological features of the gene regulatory networks of related species suggest reconstructing the network of one species using further information from the gene expression profiles of related species. We present an algorithm, named F-MAP, to reconstruct the gene regulatory network, which applies knowledge about gene interactions from related species. Our algorithm sets up a Bayesian framework to estimate the precision matrix of one species' microarray gene expression dataset, in order to infer the Gaussian graphical model of the network. The conjugate Wishart prior is used, and information from related species is applied to estimate the hyperparameters of the prior distribution using factor analysis. Applying the proposed algorithm to six related species of Drosophila shows that the precision of the reconstructed networks is improved considerably compared with that of networks constructed by other Bayesian approaches.

  7. An Improved Map-Matching Technique Based on the Fréchet Distance Approach for Pedestrian Navigation Services

    Directory of Open Access Journals (Sweden)

    Yoonsik Bang

    2016-10-01

    Full Text Available Wearable and smartphone technology innovations have propelled the growth of Pedestrian Navigation Services (PNS). PNS need a map-matching process to project a user's locations onto maps. Many map-matching techniques have been developed for vehicle navigation services. These techniques are inappropriate for PNS because pedestrians move, stop, and turn in different ways compared to vehicles. In addition, the base map data for pedestrians are more complicated than for vehicles. This article proposes a new map-matching method for locating Global Positioning System (GPS) trajectories of pedestrians onto road network datasets. The theory underlying this approach is based on the Fréchet distance, one of the measures of geometric similarity between two curves. The Fréchet distance approach can provide reasonable matching results because two linear trajectories are parameterized with the time variable. We then improved the method to be adaptive to the positional error of the GPS signal, using an adaptation coefficient to adjust the search range for every input signal, based on the assumption of auto-correlation between consecutive GPS points. To reduce errors in matching, a reliability index was evaluated in real time for each match. To test the proposed map-matching method, we applied it to GPS trajectories of pedestrians and road network data, and assessed the performance by comparing the results with reference datasets. Our proposed method performed better with test data when compared to a conventional map-matching technique for vehicles.
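
    The discrete variant of the Fréchet distance, a common computational surrogate for the continuous definition used in such matching, can be sketched as follows (the GPS trace and road polyline are hypothetical):

```python
import numpy as np
from functools import lru_cache

def discrete_frechet(P, Q):
    """Discrete Frechet distance between two polylines P and Q (lists
    of 2D points): the minimal 'leash length' needed for two walkers
    to traverse the curves monotonically."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)

    @lru_cache(maxsize=None)
    def c(i, j):
        d = np.linalg.norm(P[i] - Q[j])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(c(0, j - 1), d)
        if j == 0:
            return max(c(i - 1, 0), d)
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)

    return c(len(P) - 1, len(Q) - 1)

# A noisy GPS trace vs. a candidate road segment polyline
gps = [(0, 0), (1, 0.2), (2, -0.1), (3, 0.1)]
road = [(0, 0), (1.5, 0), (3, 0)]
print(round(discrete_frechet(gps, road), 3))  # small value = good match
```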

  9. Mapping Inundation and Changes in Wetland Extent with L-band SAR: A Combined Data and Modeling Approach

    Science.gov (United States)

    Galantowicz, J. F.; Samanta, A.

    2011-12-01

    Accurate mapping of seasonal and inter-annual changes in inundation and wetland extent is a key requisite for the estimation of greenhouse gas (GHG, e.g., methane) emissions from land surfaces to the atmosphere. This task would benefit from the 1- to 3-km spatial resolution L-band synthetic aperture radar (SAR) and 3-day revisit time of NASA's Soil Moisture Active Passive (SMAP) mission, planned for launch in 2014. With a view to utilizing this unique capability, we propose a method for mapping the fraction of area inundated using a combination of semi-empirical models of radar backscatter and L-band SAR data. Inundation exhibits a characteristic radar backscatter that is affected by a set of factors, including the roughness of soil and water surfaces and the presence of vegetation cover. Further, the impact of vegetation cover on radar backscatter from the underlying soil and/or water surface depends on biome type. The effects of these factors on both the like-polarized (HH, VV) and cross-polarized (HV) radar backscatter were investigated using semi-empirical models. A key step in devising an inundation fraction retrieval algorithm is to benchmark and calibrate the backscatter simulated with semi-empirical models against SAR data. This task was undertaken using data from the Phased Array L-band Synthetic Aperture Radar (PALSAR) instrument onboard Japan's Earth Resources Satellite (JERS). The calibration was performed in the following way. First, using a Monte Carlo type of approach, a large set of random backscatter samples was extracted from different landcover classes, including dry forests and clear-cut areas, inundated forests (wetlands), and open water. Second, mean backscatter was calculated at varying spatial resolutions: 100 m, 500 m, 1 km, 2 km, 3 km and 10 km. Third, the mean model backscatter was set to the mean PALSAR backscatter for each landcover class, but the model dispersion was retained. Finally, using these calibrated values
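
    The mean-matching calibration described in the third step (set the simulated class mean to the observed PALSAR class mean while keeping the simulated dispersion) amounts to one shift per landcover class. A schematic sketch with synthetic stand-ins for the two sample sets:

    import numpy as np

    def calibrate_class(model_db, palsar_db):
        """Shift simulated backscatter samples (dB) so their mean matches the
        observed PALSAR mean for the same landcover class; the dispersion of
        the simulated samples is retained."""
        return model_db + (palsar_db.mean() - model_db.mean())

    rng = np.random.default_rng(1)
    model_wetland  = rng.normal(-7.5, 1.2, 1000)  # simulated HH backscatter (dB)
    palsar_wetland = rng.normal(-6.8, 1.5, 500)   # observed class samples (dB)
    calibrated = calibrate_class(model_wetland, palsar_wetland)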

  10. Novel fine-scale aerial mapping approach quantifies grassland weed cover dynamics and response to management.

    Science.gov (United States)

    Malmstrom, Carolyn M; Butterfield, H Scott; Planck, Laura; Long, Christopher W; Eviner, Valerie T

    2017-01-01

    Invasive weeds threaten the biodiversity and forage productivity of grasslands worldwide. However, management of these weeds is constrained by the practical difficulty of detecting small-scale infestations across large landscapes and by limits in understanding of landscape-scale invasion dynamics, including mechanisms that enable patches to expand, contract, or remain stable. While high-end hyperspectral remote sensing systems can effectively map vegetation cover, these systems are currently too costly and limited in availability for most land managers. We demonstrate application of a more accessible and cost-effective remote sensing approach, based on simple aerial imagery, for quantifying weed cover dynamics over time. In California annual grasslands, the target communities of interest include invasive weedy grasses (Aegilops triuncialis and Elymus caput-medusae) and desirable forage grass species (primarily Avena spp. and Bromus spp.). Detecting invasion of annual grasses into an annual-dominated community is particularly challenging, but we were able to consistently characterize these two communities based on their phenological differences in peak growth and senescence using maximum likelihood supervised classification of imagery acquired twice per year (in mid- and end-of-season). This approach permitted us to map weed-dominated cover at a 1-m scale (correctly detecting 93% of weed patches across the landscape) and to evaluate weed cover change over time. We found that weed cover was more pervasive and persistent in management units that had no significant grazing for several years than in those that were grazed, whereas forage cover was more abundant and stable in the grazed units. This application demonstrates the power of this method for assessing fine-scale vegetation transitions across heterogeneous landscapes. It thus provides means for small-scale early detection of invasive species and for testing fundamental questions about landscape dynamics.
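
    Maximum likelihood supervised classification of the kind used here assigns each pixel to the class (weed- vs. forage-dominated) whose fitted multivariate Gaussian gives the highest likelihood for the stacked mid- and end-of-season band values. A bare-bones sketch with hypothetical class statistics:

    import numpy as np
    from scipy.stats import multivariate_normal

    def ml_classify(pixels, class_stats):
        """Assign each pixel (rows = stacked two-date band values) to the class
        with the highest Gaussian log-likelihood."""
        scores = np.column_stack([
            multivariate_normal(mean=m, cov=c).logpdf(pixels)
            for m, c in class_stats
        ])
        return scores.argmax(axis=1)

    # hypothetical per-class (mean, covariance) fitted from training polygons
    stats = [(np.array([0.3, 0.6]), np.eye(2) * 0.01),   # weed-dominated
             (np.array([0.5, 0.2]), np.eye(2) * 0.01)]   # forage-dominated
    labels = ml_classify(np.array([[0.32, 0.58], [0.49, 0.22]]), stats)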

  11. A universal airborne LiDAR approach for tropical forest carbon mapping.

    Science.gov (United States)

    Asner, Gregory P; Mascaro, Joseph; Muller-Landau, Helene C; Vieilledent, Ghislain; Vaudry, Romuald; Rasamoelina, Maminiaina; Hall, Jefferson S; van Breugel, Michiel

    2012-04-01

    Airborne light detection and ranging (LiDAR) is fast turning the corner from demonstration technology to a key tool for assessing carbon stocks in tropical forests. With its ability to penetrate tropical forest canopies and detect three-dimensional forest structure, LiDAR may prove to be a major component of international strategies to measure and account for carbon emissions from and uptake by tropical forests. To date, however, basic ecological information such as height-diameter allometry and stand-level wood density have not been mechanistically incorporated into methods for mapping forest carbon at regional and global scales. A better incorporation of these structural patterns in forests may reduce the considerable time needed to calibrate airborne data with ground-based forest inventory plots, which presently necessitate exhaustive measurements of tree diameters and heights, as well as tree identifications for wood density estimation. Here, we develop a new approach that can facilitate rapid LiDAR calibration with minimal field data. Throughout four tropical regions (Panama, Peru, Madagascar, and Hawaii), we were able to predict aboveground carbon density estimated in field inventory plots using a single universal LiDAR model (r² = 0.80, RMSE = 27.6 Mg C ha⁻¹). This model is comparable in predictive power to locally calibrated models, but relies on limited inputs of basal area and wood density information for a given region, rather than on traditional plot inventories. With this approach, we propose to radically decrease the time required to calibrate airborne LiDAR data and thus increase the output of high-resolution carbon maps, supporting tropical forest conservation and climate mitigation policy.
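
    The general shape of such a model can be written down. Schematically, and hedging on the exact parameterization used by the authors, the regional inputs enter a power law in LiDAR top-of-canopy height (TCH):

    \widehat{\mathrm{ACD}} = a \cdot \mathrm{TCH}^{b_1} \cdot \mathrm{BA}^{b_2} \cdot \rho_{\mathrm{BA}}^{\,b_3}

    where ACD is aboveground carbon density, BA is the regional basal area, rho_BA is the basal-area-weighted wood density, and a, b1-b3 are regression coefficients fitted once across regions rather than per site.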

  12. A coherent geostatistical approach for combining choropleth map and field data in the spatial interpolation of soil properties.

    Science.gov (United States)

    Goovaerts, P

    2011-06-01

    Information available for mapping continuous soil attributes often includes point field data and choropleth maps (e.g. soil or geological maps) that model the spatial distribution of soil attributes as the juxtaposition of polygons (areas) with constant values. This paper presents two approaches to incorporate both point and areal data in the spatial interpolation of continuous soil attributes. In the first instance, area-to-point kriging is used to map the variability within soil units while ensuring the coherence of the prediction so that the average of disaggregated estimates is equal to the original areal datum. The resulting estimates are then used as local means in residual kriging. The second approach proceeds in one step and capitalizes on: 1) a general formulation of kriging that allows the combination of both point and areal data through the use of area-to-area, area-to-point, and point-to-point covariances in the kriging system, 2) the availability of GIS to discretize polygons of irregular shape and size, and 3) knowledge of the point-support variogram model that can be inferred directly from point measurements, thereby eliminating the need for deconvolution procedures. The two approaches are illustrated using the geological map and heavy metal concentrations recorded in the topsoil of the Swiss Jura. Sensitivity analysis indicates that the new procedures improve prediction over ordinary kriging and traditional residual kriging based on the assumption that the local mean is constant within each mapping unit.
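
    Schematically, the one-step approach estimates the attribute at a location u as a single weighted combination of both data types,

    \hat{z}(u) = \sum_{i=1}^{n} \lambda_i \, z(u_i) + \sum_{j=1}^{m} \nu_j \, \bar{z}(A_j),

    with the weights solving one kriging system whose entries are the point-to-point covariances C(u_i, u_k), the area-to-point covariances \bar{C}(A_j, u_i) = \frac{1}{|A_j|} \int_{A_j} C(s, u_i) \, ds, and the analogous area-to-area covariances, all derived from the point-support variogram model by averaging over the GIS-discretized polygons.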

  13. A Systematic Approach to Modified BCJR MAP Algorithms for Convolutional Codes

    Directory of Open Access Journals (Sweden)

    Patenaude François

    2006-01-01

    Since Berrou, Glavieux and Thitimajshima published their landmark paper in 1993, different modified BCJR MAP algorithms have appeared in the literature. The existence of a relatively large number of similar but different modified BCJR MAP algorithms, derived using the Markov chain properties of convolutional codes, naturally leads to the following questions. What is the relationship among the different modified BCJR MAP algorithms? What are their relative performance, computational complexities, and memory requirements? In this paper, we answer these questions. We derive systematically four major modified BCJR MAP algorithms from the BCJR MAP algorithm using simple mathematical transformations. The connections between the original and the four modified BCJR MAP algorithms are established. A detailed analysis of the different modified BCJR MAP algorithms shows that they have identical computational complexities and memory requirements. Computer simulations demonstrate that the four modified BCJR MAP algorithms all have identical performance to the BCJR MAP algorithm.
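
    All of these variants build on the same forward-backward structure. In standard notation (trellis states s, received symbols y_k, information bits u_k), the branch, forward and backward quantities are

    \gamma_k(s', s) = p(s_k = s, \, y_k \mid s_{k-1} = s'), \qquad
    \alpha_k(s) = \sum_{s'} \alpha_{k-1}(s') \, \gamma_k(s', s), \qquad
    \beta_{k-1}(s') = \sum_{s} \beta_k(s) \, \gamma_k(s', s),

    and the a posteriori log-likelihood ratio of bit u_k is

    L(u_k) = \ln \frac{\sum_{(s', s) : u_k = +1} \alpha_{k-1}(s') \, \gamma_k(s', s) \, \beta_k(s)}{\sum_{(s', s) : u_k = -1} \alpha_{k-1}(s') \, \gamma_k(s', s) \, \beta_k(s)}.

    The modified algorithms typically differ in how these quantities are normalized or moved to the log domain, which is consistent with the paper's finding of identical complexity and memory requirements.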

  14. Labelling plants the Chernobyl way: A new approach for mapping rhizodeposition and biopore reuse

    Science.gov (United States)

    Banfield, Callum; Kuzyakov, Yakov

    2016-04-01

    A novel approach for mapping root distribution and rhizodeposition using 137Cs and 14C was applied. By immersing cut leaves into vials containing 137CsCl solution, the 137Cs label is taken up and partly released into the rhizosphere, where it strongly binds to soil particles, thus labelling the distribution of root channels in the long term. Reuse of root channels in crop rotations can be determined by labelling the first crop with 137Cs and the following crop with 14C. Imaging of the β⁻ radiation with strongly differing energies differentiates active roots growing in existing root channels (14C + 137Cs activity) from roots growing in bulk soil (14C activity only). The feasibility of the approach was shown in a pot experiment with ten plants of two species, Cichorium intybus L., and Medicago sativa L. The same plants were each labelled with 100 kBq of 137CsCl and after one week with 500 kBq of 14CO2. 96 h later pots were cut horizontally at 6 cm depth. After the first 137Cs + 14C imaging of the cut surface, imaging was repeated with three layers of plastic film between the cut surface and the plate for complete shielding of 14C β⁻ radiation to the background level, producing an image of the 137Cs distribution. Subtracting the second image from the first gave the 14C image. Both species allocated 18-22% of the 137Cs and about 30-40% of 14C activity below ground. Intensities far above the detection limit suggest that this approach is applicable to map the root system by 137Cs and to obtain root size distributions through image processing. The rhizosphere boundary was defined by the point at which rhizodeposited 14C activity declined to 5% of the activity of the root centre. Medicago showed 25% smaller rhizosphere extension than Cichorium, demonstrating that plant-specific rhizodeposition patterns can be distinguished. Our new approach is appropriate to visualise processes and hotspots on multiple scales: Heterogeneous rhizodeposition, as well as size and counts

  15. New machine learning tools for predictive vegetation mapping after climate change: Bagging and Random Forest perform better than Regression Tree Analysis

    Science.gov (United States)

    L.R. Iverson; A.M. Prasad; A. Liaw

    2004-01-01

    More and better machine learning tools are becoming available for landscape ecologists to aid in understanding species-environment relationships and to map probable species occurrence now and potentially into the future. To that end, we evaluated three statistical models: Regression Tree Analysis (RTA), Bagging Trees (BT) and Random Forest (RF) for their utility in...
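
    The three models are easy to compare side by side. A generic sketch with scikit-learn stand-ins for the implementations used in the paper, on synthetic species-occurrence data:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # synthetic stand-in for a species presence/absence vs. environment table
    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    models = {
        "RTA": DecisionTreeClassifier(random_state=0),
        "BT":  BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                                 random_state=0),
        "RF":  RandomForestClassifier(n_estimators=100, random_state=0),
    }
    for name, model in models.items():
        print(name, cross_val_score(model, X, y, cv=5).mean())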

  16. A new approach of mapping soils in the Alps - Challenges of deriving soil information and creating soil maps for sustainable land use. An example from South Tyrol (Italy)

    Science.gov (United States)

    Baruck, Jasmin; Gruber, Fabian E.; Geitner, Clemens

    2015-04-01

    Nowadays sustainable land use management is gaining importance because intensive land use leads to increasing soil degradation. Sustainable land use management is especially important in mountainous regions like the Alps, where topography limits land use. It requires a database containing detailed information on soil characteristics; however, information on soil properties is far from comprehensive. The project "ReBo - Terrain classification based on airborne laser scanning data to support soil mapping in the Alps", funded by the Autonomous Province of Bolzano, aims at developing a methodical framework for obtaining soil data. The approach combines geomorphometric analysis and soil mapping to generate modern medium-scale soil maps in a time- and cost-efficient way. In this study the open-source GRASS GIS extension module r.geomorphon (Jasiewicz and Stepinski, 2013) is used to derive topographically homogeneous landform units from high-resolution DTMs at a scale of 1:5,000. For terrain segmentation and classification we additionally use medium-scale datasets (geology, parent material, land use, etc.). As the Alps are characterized by a great variety of topography, parent material, and moisture regimes, obtaining reliable soil data is difficult. Additionally, geomorphic activity (debris flows, landslides, etc.) leads to natural disturbances. Thus, soil properties are highly diverse and largely scale dependent. Obtaining soil information on anthropogenically influenced soils is an added challenge, since intensive cultivation techniques often break the natural link between the soil-forming factors. South Tyrol hosts the largest pome-producing area in Europe. Normally, the annual precipitation is not sufficient for intensive orcharding, so irrigation strategies are in use. However, as knowledge about the small-scale heterogeneous soil properties is mostly lacking, overwatering and modifications of the

  17. A Random Matrix Approach for Quantifying Model-Form Uncertainties in Turbulence Modeling

    CERN Document Server

    Xiao, Heng; Ghanem, Roger G

    2016-01-01

    With the ever-increasing use of Reynolds-Averaged Navier-Stokes (RANS) simulations in mission-critical applications, the quantification of model-form uncertainty in RANS models has attracted attention in the turbulence modeling community. Recently, a physics-based, nonparametric approach for quantifying model-form uncertainty in RANS simulations has been proposed, where Reynolds stresses are projected to physically meaningful dimensions and perturbations are introduced only in the physically realizable limits. However, a challenge associated with this approach is to assess the amount of information introduced in the prior distribution and to avoid imposing unwarranted constraints. In this work we propose a random matrix approach for quantifying model-form uncertainties in RANS simulations with the realizability of the Reynolds stress guaranteed. Furthermore, the maximum entropy principle is used to identify the probability distribution that satisfies the constraints from available information but without int...
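
    The Wishart construction is the standard way to realize a maximum-entropy distribution on symmetric positive-definite matrices with a prescribed mean. A toy sketch (not the authors' code) that draws realizable perturbations around a baseline Reynolds stress tensor:

    import numpy as np

    def sample_spd(mean_matrix, n_dof, rng):
        """Random SPD matrix with expectation mean_matrix; a Wishart-type
        construction where larger n_dof means less dispersion."""
        L = np.linalg.cholesky(mean_matrix / n_dof)
        G = rng.standard_normal((mean_matrix.shape[0], n_dof))
        A = L @ G
        return A @ A.T          # sum of rank-1 SPD terms, E[A A^T] = mean_matrix

    rng = np.random.default_rng(2)
    tau = np.array([[2.0, 0.3, 0.0],    # baseline Reynolds stress (symmetric PD)
                    [0.3, 1.5, 0.1],
                    [0.0, 0.1, 1.0]])
    samples = [sample_spd(tau, n_dof=30, rng=rng) for _ in range(100)]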

  18. Estimating a DIF decomposition model using a random-weights linear logistic test model approach.

    Science.gov (United States)

    Paek, Insu; Fukuhara, Hirotaka

    2015-09-01

    A differential item functioning (DIF) decomposition model separates a testlet item DIF into two sources: item-specific differential functioning and testlet-specific differential functioning. This article provides an alternative model-building framework and estimation approach for a DIF decomposition model that was proposed by Beretvas and Walker (2012). Although their model is formulated under multilevel modeling with the restricted pseudolikelihood estimation method, our approach illustrates DIF decomposition modeling that is directly built upon the random-weights linear logistic test model framework with the marginal maximum likelihood estimation method. In addition to demonstrating our approach's performance, we provide detailed information on how to implement this new DIF decomposition model using an item response theory software program; using DIF decomposition may be challenging for practitioners, yet practical information on how to implement it has previously been unavailable in the measurement literature.

  19. A fast and cost-effective approach to develop and map EST-SSR markers: oak as a case study

    Directory of Open Access Journals (Sweden)

    Cherubini Marcello

    2010-10-01

    Background: Expressed Sequence Tags (ESTs) are a source of simple sequence repeats (SSRs) that can be used to develop molecular markers for genetic studies. The availability of ESTs for Quercus robur and Quercus petraea provided a unique opportunity to develop microsatellite markers to accelerate research aimed at studying the adaptation of these long-lived species to their environment. As a first step toward the construction of a SSR-based linkage map of oak for quantitative trait locus (QTL) mapping, we describe the mining and survey of EST-SSRs as well as a fast and cost-effective approach (bin mapping) to assign these markers to an approximate map position. We also compared the level of polymorphism between genomic and EST-derived SSRs and address the transferability of EST-SSRs to Castanea sativa (chestnut). Results: A catalogue of 103,000 Sanger ESTs was assembled into 28,024 unigenes, of which 18.6% presented one or more SSR motifs. More than 42% of these SSRs corresponded to trinucleotides. Primer pairs were designed for 748 putative unigenes. Overall, 37.7% (283) were found to amplify a single polymorphic locus in a reference full-sib pedigree of Quercus robur. The usefulness of these loci for establishing a genetic map was assessed using a bin mapping approach. Bin maps were constructed for the male and female parental trees, for which framework linkage maps based on AFLP markers were available. The bin set consisted of 14 highly informative offspring selected based on the number and position of crossover sites. The female and male maps comprised 44 and 37 bins, with average bin lengths of 16.5 cM and 20.99 cM, respectively. A total of 256 EST-SSRs were assigned to bins and their map positions were further validated by linkage mapping. EST-SSRs were found to be less polymorphic than genomic SSRs, but their transferability rate to chestnut, a species phylogenetically related to oak, was higher. Conclusion: We have generated a bin map for oak
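
    Conceptually, bin mapping reduces to matching genotype signatures: each new marker is scored only on the selected informative offspring and assigned to the bin whose joint genotype pattern it reproduces. A sketch with hypothetical 0/1 genotype vectors over a 14-offspring bin set:

    import numpy as np

    def assign_to_bin(marker_genotypes, bin_signatures):
        """Return the bin whose 0/1 genotype signature across the bin-set
        offspring equals the marker's, or None (e.g., genotyping error)."""
        for bin_id, signature in bin_signatures.items():
            if np.array_equal(marker_genotypes, signature):
                return bin_id
        return None

    bins = {"bin_12": np.array([0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1]),
            "bin_13": np.array([0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 1])}
    print(assign_to_bin(np.array([0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 1]), bins))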

  20. A COGNITIVE APPROACH TO CORPORATE GOVERNANCE: A VISUALIZATION TEST OF MENTAL MODELS WITH THE COGNITIVE MAPPING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Garoui NASSREDDINE

    2012-01-01

    The idea of this paper is to determine the mental models of actors in the firm with respect to the cognitive approach of corporate governance. The paper takes a corporate governance perspective, discusses mental models, and uses cognitive maps to view the diagrams showing the ways of thinking and the conceptualization of the cognitive approach. In addition, it employs a cognitive mapping technique. Returning to the systematic exploration of grids for each actor, it concludes that there is a balance of concepts expressing their cognitive orientation.

  1. Developing a model for effective leadership in healthcare: a concept mapping approach.

    Science.gov (United States)

    Hargett, Charles William; Doty, Joseph P; Hauck, Jennifer N; Webb, Allison Mb; Cook, Steven H; Tsipis, Nicholas E; Neumann, Julie A; Andolsek, Kathryn M; Taylor, Dean C

    2017-01-01

    Despite increasing awareness of the importance of leadership in healthcare, our understanding of the competencies of effective leadership remains limited. We used a concept mapping approach (a blend of qualitative and quantitative analysis of group processes to produce a visual composite of the group's ideas) to identify stakeholders' mental model of effective healthcare leadership, clarifying the underlying structure and importance of leadership competencies. Literature review, focus groups, and consensus meetings were used to derive a representative set of healthcare leadership competency statements. Study participants subsequently sorted and rank-ordered these statements based on their perceived importance in contributing to effective healthcare leadership in real-world settings. Hierarchical cluster analysis of individual sortings was used to develop a coherent model of effective leadership in healthcare. A diverse group of 92 faculty and trainees individually rank-sorted 33 leadership competency statements. The highest rated statements were "Acting with Personal Integrity", "Communicating Effectively", "Acting with Professional Ethical Values", "Pursuing Excellence", "Building and Maintaining Relationships", and "Thinking Critically". Combining the results from hierarchical cluster analysis with our qualitative data led to a healthcare leadership model based on the core principle of Patient Centeredness and the core competencies of Integrity, Teamwork, Critical Thinking, Emotional Intelligence, and Selfless Service. Using a mixed qualitative-quantitative approach, we developed a graphical representation of a shared leadership model derived in the healthcare setting. This model may enhance learning, teaching, and patient care in this important area, as well as guide future research.
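
    The quantitative core of concept mapping, turning individual sortings into clusters, can be sketched in a few lines: count how often each pair of statements is sorted together, convert the counts to distances, and cut a hierarchical clustering tree. This is a schematic analogue, not the software used in the study:

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    def cluster_statements(sorts, n_statements, n_clusters):
        """sorts: one sorting per participant, each a list of sets of statement ids."""
        co = np.zeros((n_statements, n_statements))
        for sort in sorts:
            for group in sort:
                for i in group:
                    for j in group:
                        co[i, j] += 1
        dist = squareform(len(sorts) - co, checks=False)  # co-sorted often = close
        return fcluster(linkage(dist, method="average"),
                        n_clusters, criterion="maxclust")

    sorts = [[{0, 1}, {2, 3, 4}], [{0, 1, 2}, {3, 4}]]    # two participants
    print(cluster_statements(sorts, n_statements=5, n_clusters=2))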

  2. Factors Influencing Seasonal Influenza Vaccination Uptake in Emergency Medical Services Workers: A Concept Mapping Approach.

    Science.gov (United States)

    Subramaniam, Dipti P; Baker, Elizabeth A; Zelicoff, Alan P; Elliott, Michael B

    2016-08-01

    Seasonal influenza has serious impacts on morbidity and mortality and has a significant economic toll through lost workforce time and strains on the health system. Health workers, particularly emergency medical services (EMS) workers have the potential to transmit influenza to those in their care, yet little is known of the factors that influence EMS workers' decisions regarding seasonal influenza vaccination (SIV) uptake, a key factor in reducing potential for transmitting disease. This study utilizes a modified Theory of Planned Behavior (TPB) model as a guiding framework to explore the factors that influence SIV uptake in EMS workers. Concept mapping, which consists of six-stages (preparation, generation, structuring, representation, interpretation, and utilization) that use quantitative and qualitative approaches, was used to identify participants' perspectives towards SIV. This study identified nine EMS-conceptualized factors that influence EMS workers' vaccination intent and behavior. The EMS-conceptualized factors align with the modified TPB model and suggest the need to consider community-wide approaches that were not initially conceptualized in the model. Additionally, the expansion of non-pharmaceutical measures went above and beyond original conceptualization. Overall, this study demonstrates the need to develop customized interventions such as messages highlighting the importance of EMS workers receiving SIV as the optimum solution. EMS workers who do not intend to receive the SIV should be provided with accurate information on the SIV to dispel misconceptions. Finally, EMS workers should also receive interventions which promote voluntary vaccination, encouraging them to be proactive in the health decisions they make for themselves.

  3. KNN-MDR: a learning approach for improving interactions mapping performances in genome wide association studies.

    Science.gov (United States)

    Abo Alchamlat, Sinan; Farnir, Frédéric

    2017-03-21

    Finding epistatic interactions in large association studies like genome-wide association studies (GWAS) with the nowadays-available large volume of genomic data is a challenging and largely unsolved issue. Few previous studies could handle genome-wide data due to the intractable difficulties met in searching a combinatorial explosive search space and statistically evaluating epistatic interactions given a limited number of samples. Our work is a contribution to this field. We propose a novel approach combining K-Nearest Neighbors (KNN) and Multi Dimensional Reduction (MDR) methods for detecting gene-gene interactions as a possible alternative to existing algorithms, especially in situations where the number of involved determinants is high. After describing the approach, a comparison of our method (KNN-MDR) to a set of the other most performing methods (i.e., MDR, BOOST, BHIT, MegaSNPHunter and AntEpiSeeker) is carried out to detect interactions using simulated data as well as real genome-wide data. Experimental results on both simulated data and real genome-wide data show that KNN-MDR has interesting properties in terms of accuracy and power, and that, in many cases, it significantly outperforms its recent competitors. The presented methodology (KNN-MDR) is valuable in the context of loci and interactions mapping and can be seen as an interesting addition to the arsenal used in complex traits analyses.
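
    The MDR half of the combination is simple to state: pool the samples into multi-locus genotype cells and call a cell high-risk when its case/control ratio exceeds the sample-wide ratio (KNN-MDR then replaces this hard cell assignment with a K-nearest-neighbour vote among neighbouring genotypes). A stylized sketch with hypothetical 0/1/2 genotypes:

    import numpy as np

    def mdr_cells(g1, g2, case):
        """Label each 3x3 two-locus genotype cell high (1) or low (0) risk."""
        odds = case.mean() / (1 - case.mean())   # sample-wide case:control odds
        labels = np.zeros((3, 3), dtype=int)
        for a in range(3):
            for b in range(3):
                mask = (g1 == a) & (g2 == b)
                if mask.any():
                    cases = case[mask].sum()
                    controls = mask.sum() - cases
                    labels[a, b] = int(cases > odds * max(controls, 1))
        return labels

    rng = np.random.default_rng(3)
    g1, g2 = rng.integers(0, 3, 200), rng.integers(0, 3, 200)
    case = rng.integers(0, 2, 200)               # 1 = case, 0 = control
    print(mdr_cells(g1, g2, case))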

  4. Review of the Space Mapping Approach to Engineering Optimization and Modeling

    DEFF Research Database (Denmark)

    Bakr, M. H.; Bandler, J. W.; Madsen, Kaj

    2000-01-01

    We review the Space Mapping (SM) concept and its applications in engineering optimization and modeling. The aim of SM is to avoid the computationally expensive calculations encountered in simulating an engineering system. The existence of less accurate but fast physically-based models is exploited. ... Space Mapping-based Modeling (SMM). These include Space Derivative Mapping (SDM), Generalized Space Mapping (GSM) and Space Mapping-based Neuromodeling (SMN). Finally, we address open points for research and future development.
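
    The central SM step is parameter extraction: find the coarse-model parameters whose response reproduces the fine-model response, so that subsequent optimization can run on the cheap model. A generic sketch with hypothetical quadratic stand-ins for the fine and coarse models:

    import numpy as np
    from scipy.optimize import least_squares

    def fine(x):     # expensive, accurate model (placeholder)
        return np.array([x[0] ** 2 + 0.1 * x[1], x[1] ** 2 - 0.05 * x[0]])

    def coarse(p):   # fast, physically-based surrogate (placeholder)
        return np.array([p[0] ** 2, p[1] ** 2])

    def extract(x):
        """Parameter extraction: coarse parameters p with coarse(p) ~ fine(x)."""
        return least_squares(lambda p: coarse(p) - fine(x), x0=np.asarray(x)).x

    p = extract([1.0, 2.0])   # p is the space-mapped image of the fine design x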

  5. Mapping science through bibliometric triangulation: an experimental approach applied to water research

    NARCIS (Netherlands)

    Wen, Bei; Horlings, Edwin; van der Zouwen, Marielle; Van den Besselaar, P.A.A.

    2016-01-01

    The idea of constructing science maps based on bibliographic data has intrigued researchers for decades, and various techniques have been developed to map the structure of research disciplines. Most science mapping studies use a single method. However, as research fields have various properties, a

  6. Reliable Radiation Hybrid Maps: An Efficient Scalable Clustering-based Approach

    Science.gov (United States)

    The process of mapping markers from radiation hybrid mapping (RHM) experiments is equivalent to the traveling salesman problem and, thereby, has combinatorial complexity. As an additional problem, experiments typically result in some unreliable markers that reduce the overall quality of the map. We ...
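
    Because marker ordering is a traveling salesman problem over retention-pattern distances, even a greedy nearest-neighbour tour conveys the idea; the paper's contribution is a scalable clustering front-end that also filters unreliable markers first. A toy sketch:

    import numpy as np

    def nn_order(D):
        """Greedy nearest-neighbour tour through a marker distance matrix D."""
        order, left = [0], set(range(1, len(D)))
        while left:
            nxt = min(left, key=lambda j: D[order[-1], j])
            order.append(nxt)
            left.remove(nxt)
        return order

    # D[i, j] = fraction of hybrids in which markers i and j differ in retention
    retention = np.random.default_rng(4).integers(0, 2, size=(6, 90))
    D = (retention[:, None, :] != retention[None, :, :]).mean(axis=2)
    print(nn_order(D))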

  7. Towards a virtual hub approach for landscape assessment and multimedia ecomuseum using multitemporal-maps

    Directory of Open Access Journals (Sweden)

    R. Brumana

    2015-08-01

    Landscapes are dynamic entities, stretching and transforming across space and time, and need to be safeguarded as living places for the future, with interaction of human, social and economic dimensions. A comprehensive landscape evaluation needs several open datasets, each characterized by its own protocol and service interface, which limits or impedes interoperability and integration. Indeed, nowadays the development of websites targeted at landscape assessment and touristic purposes requires many resources in terms of time, cost and IT skills, so these applications are limited to a few cases, mainly focusing on world-famous touristic sites. The capability to spread the development of web-based multimedia virtual museums based on geospatial data relies on the possibility of discovering the needed geo-spatial data through a single point of access in a homogeneous way. The innovative approach proposed in this paper may facilitate access to open data in a homogeneous way by means of specific components (the brokers) performing the interoperability actions required to interconnect heterogeneous data sources. In the specific case study analysed here, an interface was implemented to migrate a geo-swat chart based on local and regional geographic information into a user-friendly Google Earth©-based infrastructure, integrating ancient cadastres and modern cartography, accessible by professionals and tourists via the web and also via portable devices like tablets and smartphones. The general aim of this work on the case study of the Lake of Como (Tremezzina municipality) is to boost the integration of assessment methodologies with digital geo-based technologies of map correlation for the multimedia ecomuseum system accessible via web. The developed WebGIS system integrates multi-scale and multi-temporal maps with different information (cultural, historical

  9. Evolution of syncarpy and other morphological characters in African Annonaceae: a posterior mapping approach.

    Science.gov (United States)

    Couvreur, T L P; Richardson, J E; Sosef, M S M; Erkens, R H J; Chatrou, L W

    2008-04-01

    The congenital fusion of carpels, or syncarpy, is considered a key innovation as it is found in more than 80% of angiosperms. Within the magnoliids however, syncarpy has rarely evolved. Two alternative evolutionary origins of syncarpy were suggested in order to explain the evolution of this feature: multiplication of a single carpel vs. fusion of a moderate number of carpels. The magnoliid family Annonaceae provides an ideal situation to test these hypotheses as two African genera, Isolona and Monodora, are syncarpous in an otherwise apocarpous family with multicarpellate and unicarpellate genera. In addition to syncarpy, the evolution of six other morphological characters was studied. Well-supported phylogenetic relationships of African Annonaceae and in particular those of Isolona and Monodora were reconstructed. Six plastid regions were sequenced and analyzed using maximum parsimony and Bayesian inference methods. The Bayesian posterior mapping approach to study character evolution was used as it accounts for both mapping and phylogenetic uncertainty, and also allows multiple state changes along the branches. Our phylogenetic analyses recovered a fully resolved clade comprising twelve genera endemic to Africa, including Isolona and Monodora, which was nested within the so-called long-branch clade. This is the largest and most species-rich clade of African genera identified to date within Annonaceae. The two syncarpous genera were inferred with maximum support to be sister to a clade characterized by genera with multicarpellate apocarpous gynoecia, supporting the hypothesis that syncarpy arose by fusion of a moderate number of carpels. This hypothesis was also favoured when studying the floral anatomy of both genera. Annonaceae provide the only case of a clear evolution of syncarpy within an otherwise apocarpous magnoliid family. The results presented here offer a better understanding of the evolution of syncarpy in Annonaceae and within angiosperms in general.

  10. Mapping Spatial Distribution of Larch Plantations from Multi-Seasonal Landsat-8 OLI Imagery and Multi-Scale Textures Using Random Forests

    Directory of Open Access Journals (Sweden)

    Tian Gao

    2015-02-01

    The knowledge of the spatial distribution of plantation forests is critical for forest management, monitoring programs and functional assessment. This study demonstrates the potential of multi-seasonal (spring, summer, autumn and winter) Landsat-8 Operational Land Imager imagery with random forests (RF) modeling to map larch plantations (LP) in a typical plantation forest landscape in North China. The spectral bands and two types of textures were used to create 675 input variables for the RF. An accuracy of 92.7% for LP, with a Kappa coefficient of 0.834, was attained using the RF model. A RF-based importance assessment reveals that the spectral bands and the bivariate textural features calculated by the pseudo-cross variogram (PC) strongly promoted forest class-separability, whereas the univariate textural features contributed only weakly. A feature selection strategy eliminated 93% of the variables, generating a subset of the 47 most essential variables. In this subset, PC textures derived from summer and winter appeared most frequently, suggesting that variability between the peak growing season and the non-growing season can effectively enhance forest class-separability. A RF classifier applied to the subset led to 91.9% accuracy for LP, with a Kappa coefficient of 0.829. This study provides an insight into approaches for discriminating plantation forests with phenological behaviors.
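
    The importance-based pruning reported here (675 variables reduced to 47) follows the usual RF recipe: rank variables by importance, keep the head of the ranking, and refit. A generic scikit-learn sketch with a synthetic stand-in for the spectral/textural stack:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=600, n_features=100, n_informative=10,
                               random_state=0)
    rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
    keep = np.argsort(rf.feature_importances_)[::-1][:15]  # top-ranked variables
    rf_small = RandomForestClassifier(n_estimators=300, random_state=0)
    rf_small.fit(X[:, keep], y)                            # refit on the subset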

  11. Mapping Natural Terroir Units using a multivariate approach and legacy data

    Science.gov (United States)

    Priori, Simone; Barbetti, Roberto; L'Abate, Giovanni; Bucelli, Piero; Storchi, Paolo; Costantini, Edoardo A. C.

    2014-05-01

    The vineyard areas of the Siena province were subdivided into 9 Natural Terroir Units (NTUs), statistically differentiated by the variables used. The study demonstrated the strength of a multivariate approach for NTU mapping at the province scale (1:125,000) using viticultural legacy data. Identification and mapping of terroir diversity within the DOC and DOCG at the province scale suggest the adoption of viticultural subzones. The subzones, based on the NTUs, could lead to wine-production systems that enhance the peculiarities of the terroir.

  12. The Randomized CRM: An Approach to Overcoming the Long-Memory Property of the CRM.

    Science.gov (United States)

    Koopmeiners, Joseph S; Wey, Andrew

    2017-03-01

    The primary objective of a Phase I clinical trial is to determine the maximum tolerated dose (MTD). Typically, the MTD is identified using a dose-escalation study, where initial subjects are treated at the lowest dose level and subsequent subjects are treated at progressively higher dose levels until the MTD is identified. The continual reassessment method (CRM) is a popular model-based dose-escalation design, which utilizes a formal model for the relationship between dose and toxicity to guide dose finding. Recently, it was shown that the CRM has a tendency to get "stuck" on a dose level, with little escalation or de-escalation in the late stages of the trial, due to the long-memory property of the CRM. We propose the randomized CRM (rCRM), which introduces random escalation and de-escalation into the standard CRM dose-finding algorithm, as well as a hybrid approach that incorporates escalation and de-escalation only when certain criteria are met. Our simulation results show that both the rCRM and the hybrid approach reduce the trial-to-trial variability in the number of cohorts treated at the MTD, but that the hybrid approach has a more favorable tradeoff with respect to the average number treated at the MTD.
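
    The modification is easy to state in code: run an ordinary CRM posterior update, then, with some probability, replace the model's recommendation by a forced one-level move. The following is a stylized sketch with a one-parameter power ("empiric") model and a grid posterior, not the authors' implementation; in the hybrid variant the random move would be applied only when their stated criteria are met.

    import numpy as np

    def crm_next_dose(doses_given, tox_seen, skeleton, rng, eps=0.2, target=0.25):
        """One rCRM-style step: CRM recommendation plus random (de-)escalation."""
        theta = np.linspace(0.2, 3.0, 200)            # grid over model parameter
        prior = np.exp(-0.5 * np.log(theta) ** 2)     # vague prior on theta
        p = skeleton[doses_given][None, :] ** theta[:, None]  # P(tox) per subject
        like = np.prod(np.where(tox_seen, p, 1 - p), axis=1)
        post = prior * like
        post /= post.sum()
        p_hat = (skeleton[None, :] ** theta[:, None] * post[:, None]).sum(axis=0)
        dose = int(np.argmin(np.abs(p_hat - target)))  # model recommendation
        if rng.random() < eps:                         # random perturbation step
            dose = int(np.clip(dose + rng.choice([-1, 1]), 0, len(skeleton) - 1))
        return dose

    rng = np.random.default_rng(5)
    skeleton = np.array([0.05, 0.12, 0.25, 0.40])      # prior toxicity guesses
    print(crm_next_dose(np.array([0, 0, 1]), np.array([0, 0, 1]), skeleton, rng))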

  13. Seeking mathematics success for college students: a randomized field trial of an adapted approach

    Science.gov (United States)

    Gula, Taras; Hoessler, Carolyn; Maciejewski, Wes

    2015-11-01

    Many students enter the Canadian college system with insufficient mathematical ability and leave the system with little improvement. Those students who enter with poor mathematics ability typically take a developmental mathematics course as their first and possibly only mathematics course. The educational experiences that comprise a developmental mathematics course vary widely and are, too often, ineffective at improving students' ability. This trend is concerning, since low mathematics ability is known to be related to lower rates of success in subsequent courses. To date, little attention has been paid to the selection of an instructional approach to consistently apply across developmental mathematics courses. Prior research suggests that an appropriate instructional method would involve explicit instruction and practising mathematical procedures linked to a mathematical concept. This study reports on a randomized field trial of a developmental mathematics approach at a college in Ontario, Canada. The new approach is an adaptation of the JUMP Math program, an explicit instruction method designed for primary and secondary school curricula, to the college learning environment. In this study, a subset of courses was assigned to JUMP Math and the remainder was taught in the same style as in the previous years. We found consistent, modest improvement in the JUMP Math sections compared to the non-JUMP sections, after accounting for potential covariates. The findings from this randomized field trial, along with prior research on effective education for developmental mathematics students, suggest that JUMP Math is a promising way to improve college student outcomes.

  14. Multivariate non-normally distributed random variables in climate research – introduction to the copula approach

    Directory of Open Access Journals (Sweden)

    P. Friederichs

    2008-10-01

    Probability distributions of multivariate random variables are generally more complex than their univariate counterparts, owing to possible nonlinear dependence between the random variables. One approach to this problem is the use of copulas, which have become popular over recent years, especially in fields like econometrics, finance, risk management and insurance. Since this newly emerging field includes various practices, a controversial discussion, and a vast field of literature, it is difficult to get an overview. The aim of this paper is therefore to provide a brief overview of copulas for application in meteorology and climate research. We examine the advantages and disadvantages compared to alternative approaches like, e.g., mixture models, summarize the current problem of goodness-of-fit (GOF) tests for copulas, and discuss the connection with multivariate extremes. An application to station data shows the simplicity and the capabilities as well as the limitations of this approach. Observations of daily precipitation and temperature are fitted to a bivariate model and demonstrate that copulas are a valuable complement to the commonly used methods.
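
    The basic construction is easy to demonstrate: generate dependence with a copula on uniform margins, then map through arbitrary marginal quantile functions. A minimal Gaussian-copula sketch for a precipitation-temperature-like pair (the paper itself fits copulas to station observations rather than simulating):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    rho = 0.6                                  # copula dependence parameter
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=5000)
    u = stats.norm.cdf(z)                      # uniform margins, Gaussian dependence
    precip = stats.gamma(a=2.0, scale=3.0).ppf(u[:, 0])    # skewed margin
    temp   = stats.norm(loc=12.0, scale=4.0).ppf(u[:, 1])  # symmetric margin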

  15. Mapping behavioral landscapes for animal movement: a finite mixture modeling approach

    Science.gov (United States)

    Tracey, Jeff A.; Zhu, Jun; Boydston, Erin E.; Lyren, Lisa M.; Fisher, Robert N.; Crooks, Kevin R.

    2013-01-01

    Because of its role in many ecological processes, movement of animals in response to landscape features is an important subject in ecology and conservation biology. In this paper, we develop models of animal movement in relation to objects or fields in a landscape. We take a finite mixture modeling approach in which the component densities are conceptually related to different choices for movement in response to a landscape feature, and the mixing proportions are related to the probability of selecting each response as a function of one or more covariates. We combine particle swarm optimization and an Expectation-Maximization (EM) algorithm to obtain maximum likelihood estimates of the model parameters. We use this approach to analyze data for movement of three bobcats in relation to urban areas in southern California, USA. A behavioral interpretation of the models revealed similarities and differences in bobcat movement response to urbanization. All three bobcats avoided urbanization by moving either parallel to urban boundaries or toward less urban areas as the proportion of urban land cover in the surrounding area increased. However, one bobcat, a male with a dispersal-like large-scale movement pattern, avoided urbanization at lower densities and responded strictly by moving parallel to the urban edge. The other two bobcats, which were both residents and occupied similar geographic areas, avoided urban areas using a combination of movements parallel to the urban edge and movement toward areas of less urbanization. However, the resident female appeared to exhibit greater repulsion at lower levels of urbanization than the resident male, consistent with empirical observations of bobcats in southern California. Using the parameterized finite mixture models, we mapped behavioral states to geographic space, creating a representation of a behavioral landscape. This approach can provide guidance for conservation planning based on analysis of animal movement data using

  16. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    Science.gov (United States)

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluate its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map.

  17. Soil Moisture Mapping in an Arid Area Using a Land Unit Area (LUA) Sampling Approach and Geostatistical Interpolation Techniques

    Directory of Open Access Journals (Sweden)

    Saeid Gharechelou

    2016-03-01

    Soil moisture (SM) plays a key role in many environmental processes and has high spatial and temporal variability. Collecting sample SM data through field surveys (e.g., for validation of remote sensing-derived products) can be very expensive and time consuming if a study area is large, and producing accurate SM maps from the sample point data is a difficult task as well. In this study, geospatial processing techniques are used to combine several geo-environmental layers relevant to SM (soil, geology, rainfall, land cover, etc.) into a land unit area (LUA) map, which delineates regions with relatively homogeneous geological/geomorphological, land use/land cover, and climate characteristics. This LUA map is used to guide the collection of sample SM data in the field, and the field data are finally spatially interpolated to create a wall-to-wall map of SM in the study area (Garmsar, Iran). The main goal of this research is to create a SM map in an arid area, using the LUA approach to obtain the most appropriate sample locations for collecting SM field data. Several environmental GIS layers, which have an impact on SM, were combined to generate the LUA map, and field surveying was done in each class of the LUA map. A SM map was produced based on the LUA, remote sensing data indexes, and spatial interpolation of the field survey sample data. Several interpolation methods (inverse distance weighting, kriging, and co-kriging) were evaluated for generating SM maps from the sample data. The produced maps were compared to each other and validated using ground truth data. The results show that the LUA approach is a reasonable method to create homogeneous fields for introducing representative samples for field soil surveying. The geostatistical SM map achieved adequate accuracy; however, trend analysis and the distribution of the soil sample point locations within the LUA types should be further investigated to achieve even better results. Co
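
    Of the interpolators compared, inverse distance weighting is the simplest to sketch; the kriging variants replace these fixed weights with weights derived from a fitted variogram. A compact version with hypothetical sample coordinates and SM values:

    import numpy as np

    def idw(xy_samples, sm_values, xy_targets, power=2.0):
        """Inverse-distance-weighted interpolation of soil moisture."""
        d = np.linalg.norm(xy_targets[:, None, :] - xy_samples[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, 1e-9) ** power
        return (w * sm_values).sum(axis=1) / w.sum(axis=1)

    pts  = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # LUA-guided samples
    sm   = np.array([0.12, 0.18, 0.25])                    # volumetric SM
    grid = np.array([[0.5, 0.5], [0.2, 0.8]])              # prediction points
    print(idw(pts, sm, grid))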

  18. Implementation of random forest algorithm for crop mapping across an aridic to ustic area of Indian states

    Science.gov (United States)

    Shukla, Gaurav; Garg, Rahul Dev; Srivastava, Hari Shanker; Garg, Pradeep Kumar

    2017-04-01

    The purpose of this study is to effectively implement the random forest algorithm for crop classification of large areas and to check the classification capability of different variables. To incorporate the dependency of crops on different variables, namely texture, phenology, parent material and soil, soil moisture, topography, vegetation, and climate, 35 digital layers were prepared using different satellite data (ALOS DEM, Landsat-8, MODIS NDVI, RISAT-1, and Sentinel-1A) and climatic data (precipitation and temperature). The importance of the variables was calculated based on the mean decrease in accuracy and the mean decrease in Gini score. The importance and capabilities of the variables for crop mapping are discussed. Variables associated with spectral responses showed greater importance than topographic and climate variables. The spectral range (0.85 to 0.88 μm) of the near-infrared band was the most useful variable, with the highest scores. The topographic variable elevation ranked second in both scores. This indicates the importance of spectral responses as well as of topography in model development. Climate variables did not show as much importance as the others, but in association with the others they decreased the out-of-bag (OOB) error rate. In addition to the OOB data, a 20% independent dataset of training samples was used to evaluate the RF model. Results show that RF has good capability for crop classification.

  19. Using a constructivist approach with online concept maps: relationship between theory and nursing education.

    Science.gov (United States)

    Conceição, Simone C O; Taylor, Linda D

    2007-01-01

    Concept maps have been used in nursing education as a method for students to organize and analyze data. This article describes an online course that used concept maps and self-reflective journals to assess students' thinking processes. The self-reflective journals of 21 students collected over two semesters were qualitatively examined. Three major themes emerged from students' use of concept maps: 1) factors influencing the map creation, 2) developmental learning process over time, and 3) validation of existing knowledge and construction of new knowledge. The use of concept maps with reflective journaling provided a learning experience that allowed students to integrate content consistent with a constructivist paradigm. This integration is a developmental process influenced by the personal preferences of students, concept map design, and content complexity. This developmental process provides early evidence that the application of concept mapping in the online environment, along with reflective journaling, allows students to make new connections, integrate previous knowledge, and validate existing knowledge.

  20. A randomized control study of instructional approaches for struggling adult readers.

    Science.gov (United States)

    Greenberg, Daphne; Wise, Justin; Morris, Robin; Fredrick, Laura; Nanda, Alice O; Pae, Hye-K

    2011-01-01

    This study measured the effectiveness of various instructional approaches on the reading outcomes of 198 adults who read single words at the 3.0 through 5.9 grade equivalency levels. The students were randomly assigned to one of the following interventions: Decoding and Fluency; Decoding, Comprehension, and Fluency; Decoding, Comprehension, Fluency, and Extensive Reading; Extensive Reading; and a Control/Comparison approach. The Control/Comparison approach employed a curriculum common to community-based adult literacy programs, and the Extensive Reading approach focused on wide exposure to literature. The Fluency component was a guided repeated oral reading approach, and the Decoding/Comprehension components were SRA/McGraw-Hill Direct Instruction Corrective Reading Programs. Results indicated continued weaknesses in and poor integration of participants' skills. Although students made significant gains independent of reading instruction group, all improvements were associated with small effect sizes. When reading instruction group was considered, only one significant finding was detected, with the Comparison/Control group, the Decoding and Fluency group, and the Decoding, Comprehension, Extensive Reading and Fluency group showing stronger word attack outcomes than the Extensive Reading group.

  1. An approach for mapping large-area impervious surfaces: Synergistic use of Landsat-7 ETM+ and high spatial resolution imagery

    Science.gov (United States)

    Yang, Limin; Huang, Chengquan; Homer, Collin G.; Wylie, Bruce K.; Coan, Michael

    2003-01-01

    A wide range of urban ecosystem studies, including urban hydrology, urban climate, land use planning, and resource management, require current and accurate geospatial data of urban impervious surfaces. We developed an approach to quantify urban impervious surfaces as a continuous variable by using multisensor and multisource datasets. Subpixel percent impervious surfaces at 30-m resolution were mapped using a regression tree model. The utility, practicality, and affordability of the proposed method for large-area imperviousness mapping were tested over three spatial scales (Sioux Falls, South Dakota, Richmond, Virginia, and the Chesapeake Bay areas of the United States). Average error of predicted versus actual percent impervious surface ranged from 8.8 to 11.4%, with correlation coefficients from 0.82 to 0.91. The approach is being implemented to map impervious surfaces for the entire United States as one of the major components of the circa 2000 national land cover database.
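
    The regression tree step treats subpixel percent imperviousness as a continuous response to the Landsat band values, with training targets derived from the high-resolution imagery. A schematic scikit-learn analogue on synthetic data (the original work used a dedicated regression tree package):

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(7)
    bands = rng.uniform(0.0, 0.5, size=(2000, 6))       # ETM+ band values (stand-in)
    pct_imperv = np.clip(200 * bands[:, 2] - 30 * bands[:, 3]
                         + rng.normal(0, 5, 2000), 0, 100)  # synthetic 0-100% target
    tree = DecisionTreeRegressor(min_samples_leaf=20).fit(bands, pct_imperv)
    predicted = tree.predict(bands[:5])                 # percent impervious surface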

  2. Hierarchical Bayesian approach for estimating physical properties in spiral galaxies: Age Maps for M74

    Science.gov (United States)

    Sánchez Gil, M. Carmen; Berihuete, Angel; Alfaro, Emilio J.; Pérez, Enrique; Sarro, Luis M.

    2015-09-01

    One of the fundamental goals of modern astronomy is to estimate the physical parameters of galaxies from images in different spectral bands. We present a hierarchical Bayesian model for obtaining age maps from images in the Hα line (taken with the Taurus Tunable Filter, TTF), the ultraviolet band (far UV or FUV, from GALEX) and the infrared bands (24, 70 and 160 microns, from Spitzer). As shown in [1], we present the burst ages for young stellar populations in the nearby, nearly face-on galaxy M74. As shown in previous work, the Hα to FUV flux ratio is a good relative indicator of very recent star formation history (SFH). As a nascent star-forming region evolves, the Hα line emission declines earlier than the UV continuum, leading to a decrease in the Hα/FUV ratio. Through a specific star-forming galaxy model (Starburst99, SB99), we can obtain the corresponding theoretical Hα/FUV ratio to compare with our observed flux ratios, and thus estimate the ages of the observed regions. Due to the nature of the problem, a model of high complexity is necessary to take into account the mean uncertainties and the interrelationship between parameters when the Hα/FUV flux ratio is obtained. To address this complexity, we propose a Bayesian hierarchical model, where a joint probability distribution is defined to determine the parameters (age, metallicity, IMF) from the observed data, in this case the observed Hα/FUV flux ratios. The joint distribution of the parameters is described through i.i.d. (independent and identically distributed) random variables generated through MCMC (Markov Chain Monte Carlo) techniques.

  3. Historical maintenance relevant information road-map for a self-learning maintenance prediction procedural approach

    Science.gov (United States)

    Morales, Francisco J.; Reyes, Antonio; Cáceres, Noelia; Romero, Luis M.; Benitez, Francisco G.; Morgado, Joao; Duarte, Emanuel; Martins, Teresa

    2017-09-01

    A large percentage of transport infrastructures are composed of linear assets, such as roads and rail tracks. The large social and economic relevance of these constructions forces stakeholders to ensure a prolonged health/durability. Even so, inevitable malfunctions, breakdowns, and out-of-service periods arise randomly during the life cycle of the infrastructure. Predictive maintenance techniques tend to diminish the appearance of unpredicted failures and the execution of needed corrective interventions, anticipating the adequate interventions to be conducted before failures show up. This communication presents: i) a procedural approach for collecting the relevant information on the evolving condition of the assets involved in all maintenance interventions; this reported and stored information constitutes a rich historical database with which to train machine learning algorithms to generate reliable predictions of the interventions to be carried out in future scenarios; ii) a schematic flow chart of the automatic learning procedure; iii) self-learning rules derived automatically from false positives/negatives. The description, testing, and automatic learning approach of a pilot case and its outcomes are presented; finally, some conclusions are outlined regarding the proposed methodology for improving the self-learning predictive capability.
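
    A minimal sketch of the prediction core under stated assumptions: the feature set (asset age, traffic load, months since the last intervention, condition score) and the gradient-boosting learner are illustrative choices, not the paper's specification:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical historical records: each row is an asset/period, the label says
# whether a corrective intervention was needed in the following period.
rng = np.random.default_rng(7)
X = rng.normal(size=(5000, 4))  # age, traffic load, months since intervention, condition
y = (0.8 * X[:, 0] + 0.6 * X[:, 2] + rng.normal(size=5000) > 1.0).astype(int)

clf = GradientBoostingClassifier().fit(X[:4000], y[:4000])
# Self-learning step: periodically refit as new false positives/negatives are reported.
print("held-out accuracy:", clf.score(X[4000:], y[4000:]))
```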

  4. The episodic random utility model unifies time trade-off and discrete choice approaches in health state valuation

    NARCIS (Netherlands)

    B.M. Craig (Benjamin); J.J. van Busschbach (Jan)

    2009-01-01

    ABSTRACT: BACKGROUND: To present an episodic random utility model that unifies time trade-off and discrete choice approaches in health state valuation. METHODS: First, we introduce two alternative random utility models (RUMs) for health preferences: the episodic RUM and the more common

  5. Radio polarization maps of shell-type supernova remnants - I. Effects of a random magnetic field component and thin-shell models

    Science.gov (United States)

    Bandiera, R.; Petruk, O.

    2016-06-01

    The maps of intensity and polarization of the radio synchrotron emission from shell-type supernova remnants (SNRs) contain a considerable amount of information, although they are not easy to interpret. With the aim of deriving constraints on the 3D spatial distribution of the emissivity, as well as on the structure of both ordered and random magnetic fields (MFs), we present here a scheme to model maps of the emission and polarization in SNRs. We first generalize the classical treatment of the synchrotron emission to the case in which the MF is composed of an ordered MF plus an isotropic random component, with arbitrary relative strengths. For a power-law particle energy distribution, we derive analytic formulae that formally resemble those for the classical case. We also treat the shock compression of a fully random upstream field and we predict that the polarization fraction in this case should be higher than typically measured in SNRs. We implement the above treatment into a code, which simulates the observed polarized emission of an emitting shell, also taking into account the effect of internal Faraday rotation. Finally, we show simulated maps for different orientations with respect to the observer, levels of the turbulent MF component, Faraday rotation levels, distributions of the emissivity (either barrel-shaped or limited to polar caps) and geometries for the ordered MF component (either tangential to the shell or radial). Their analysis allows us to outline properties useful for the interpretation of radio intensity and polarization maps.
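
    As a reference point for the generalized formulae mentioned above, the classical limit they must recover when the random component vanishes is the standard synchrotron result:

```latex
% For a power-law particle energy distribution N(E) \propto E^{-p} in a fully
% ordered magnetic field, the intrinsic polarization fraction is
\[
  \Pi_{\max} = \frac{p+1}{p+7/3},
\]
% e.g. p = 2.2 gives \Pi_{\max} \approx 0.71; an isotropic random component
% dilutes the net polarization below this ceiling.
```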

  6. Tailoring online information retrieval to user's needs based on a logical semantic approach to natural language processing and UMLS mapping.

    Science.gov (United States)

    Kossman, Susan; Jones, Josette; Brennan, Patricia Flatley

    2007-10-11

    Depression can derail teenagers' lives and cause serious chronic health problems. Acquiring pertinent knowledge and skills supports care management, but retrieving appropriate information can be difficult. This poster presents a strategy to tailor online information to user attributes using a logical semantic approach to natural language processing (NLP) and mapping propositions to UMLS terms. This approach capitalizes on existing NLM resources and presents a potentially sustainable plan for meeting consumers' and providers' information needs.

  7. Agricultural Land Use mapping by multi-sensor approach for hydrological water quality monitoring

    Science.gov (United States)

    Brodsky, Lukas; Kodesova, Radka; Kodes, Vit

    2010-05-01

    The main objective of this study is to demonstrate the potential of operational use of high and medium resolution remote sensing data for hydrological water quality monitoring by mapping agricultural intensity and crop structures, in particular the use of remote sensing mapping to optimize pesticide monitoring. The agricultural mapping task is tackled by means of medium spatial and high temporal resolution ESA Envisat MERIS FR images together with a single high spatial resolution IRS AWiFS image covering the whole area of interest (the Czech Republic). High resolution data (e.g. SPOT, ALOS, Landsat) are often used for agricultural land use classification, but usually only at regional or local level due to data availability and financial constraints. AWiFS data (nominal spatial resolution 56 m), thanks to the wide satellite swath, seem more suitable for use at national level. Nevertheless, one of the critical issues for such a classification is to have sufficient image acquisitions over the whole vegetation period to describe crop development in an appropriate way. ESA MERIS medium-resolution data have been used in several studies for crop classification. The high temporal and spectral resolution of MERIS data is an indisputable advantage for crop classification. However, the 300 m spatial resolution results in mixed signals within a single pixel. AWiFS-MERIS data synergy brings new perspectives to agricultural land use mapping. The developed methodology is also fully compatible with future use of ESA (GMES) Sentinel satellite images. The applied hybrid multi-sensor approach consists of these main stages: a/ parcel segmentation and spectral pre-classification of the high resolution image (AWiFS); b/ ingestion of medium resolution (MERIS) vegetation spectro-temporal features; c/ vegetation signature unmixing; and d/ semantic object-oriented classification of vegetation classes into the final classification scheme. These crop groups were selected to be
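
    Stage c/ (vegetation signature unmixing) can be sketched as a constrained least-squares problem; the endmember spectra and pixel values below are invented for illustration, and non-negative least squares stands in for the unmixing algorithm actually used:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical endmember spectra (columns: crop, bare soil, water) in four bands.
E = np.array([[0.05, 0.25, 0.02],
              [0.08, 0.30, 0.03],
              [0.45, 0.35, 0.02],
              [0.30, 0.40, 0.01]])

pixel = np.array([0.20, 0.22, 0.38, 0.33])  # mixed medium-resolution pixel spectrum
frac, _ = nnls(E, pixel)                    # non-negative abundance estimates
frac /= frac.sum()                          # enforce sum-to-one abundances
print(np.round(frac, 2))                    # roughly half crop, half bare soil
```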

  8. Making sense of human ecology mapping: an overview of approaches to integrating socio-spatial data into environmental planning

    Science.gov (United States)

    Rebecca McLain; Melissa R. Poe; Kelly Biedenweg; Lee K. Cerveny; Diane Besser; Dale J. Blahna

    2013-01-01

    Ecosystem-based planning and management have stimulated the need to gather sociocultural values and human uses of land in formats accessible to diverse planners and researchers. Human Ecology Mapping (HEM) approaches offer promising spatial data gathering and analytical tools, while also addressing important questions about human-landscape connections. This article...

  9. Using an empirical and rule-based modeling approach to map cause of disturbance in U.S

    Science.gov (United States)

    Todd A. Schroeder; Gretchen G. Moisen; Karen Schleeweis; Chris Toney; Warren B. Cohen; Zhiqiang Yang; Elizabeth A. Freeman

    2015-01-01

    Recently completing over a decade of research, the NASA/NACP funded North American Forest Dynamics (NAFD) project has led to several important advancements in the way U.S. forest disturbance dynamics are mapped at regional and continental scales. One major contribution has been the development of an empirical and rule-based modeling approach which addresses two of the...

  10. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Full Text Available Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.
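
    A minimal GA in the spirit of the paper, with a toy cost function standing in for the COP testability measure; the population size, selection, crossover, and mutation settings are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
N_INPUTS, POP, GENS = 16, 40, 60

def cost(probs):
    # Toy proxy for testability: reward input probabilities with high expected
    # signal activity (the paper's cost is built from the COP measure instead).
    return -np.sum(probs * (1.0 - probs))

pop = rng.random((POP, N_INPUTS))
for _ in range(GENS):
    fit = np.array([cost(ind) for ind in pop])
    parents = pop[np.argsort(fit)[:POP // 2]]        # truncation selection
    kids = parents.copy()
    for i in range(0, POP // 2 - 1, 2):              # one-point crossover
        c = rng.integers(1, N_INPUTS)
        kids[i, c:], kids[i + 1, c:] = parents[i + 1, c:], parents[i, c:].copy()
    kids = np.clip(kids + rng.normal(0, 0.05, kids.shape), 0, 1)  # mutation
    pop = np.vstack([parents, kids])

best = pop[np.argmin([cost(ind) for ind in pop])]
print("optimized input signal probabilities:", np.round(best, 2))
```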

  11. A new approach for inversion of large random matrices in massive MIMO systems.

    Directory of Open Access Journals (Sweden)

    Muhammad Ali Raza Anjum

    Full Text Available We report a novel approach for inversion of large random matrices in massive Multiple-Input Multiple-Output (MIMO) systems. It is based on the concept of inverse vectors, in which an inverse vector is defined for each column of the principal matrix. Such an inverse vector has to satisfy two constraints. Firstly, it has to be in the null-space of all the remaining columns; we call this the null-space problem. Secondly, it has to form a projection of value equal to one in the direction of the selected column; we term this the normalization problem. The process essentially decomposes the inversion problem and distributes it over columns. Each column can be thought of as a node in a network or a particle in a swarm seeking its own solution, the inverse vector, which lightens the computational load on it. Another benefit of this approach is its applicability to all three cases pertaining to a linear system: the fully determined, the over-determined, and the under-determined case. It eliminates the need to form the generalized inverse for the last two cases by providing a new way to solve the least squares problem and the Moore-Penrose pseudoinverse problem. The approach makes no assumption regarding the size, structure, or sparsity of the matrix. This makes it fully applicable to the large random matrices now in vogue in massive MIMO systems. Also, the null-space problem opens the door for the plethora of null-space computation methods available in the literature to enter the realm of matrix inversion. There is even the flexibility of finding an exact or approximate inverse depending on the null-space method employed. We employ Householder's null-space method for the exact solution and present a complete exposition of the new approach. A detailed comparison with well-established matrix inversion methods in the literature is also given.
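
    A compact sketch of the inverse-vector idea for the fully determined square case; an SVD-based null-space routine (scipy.linalg.null_space) is used here in place of the Householder method the authors employ:

```python
import numpy as np
from scipy.linalg import null_space

def inverse_via_inverse_vectors(A):
    """Invert a square matrix column by column via 'inverse vectors'."""
    n = A.shape[0]
    X = np.empty((n, n))
    for i in range(n):
        others = np.delete(A, i, axis=1)   # all columns except the i-th
        v = null_space(others.T)[:, 0]     # null-space problem: v orthogonal to them
        X[i] = v / (A[:, i] @ v)           # normalization problem: projection = 1
    return X

A = np.random.default_rng(0).normal(size=(5, 5))
assert np.allclose(inverse_via_inverse_vectors(A) @ A, np.eye(5))
```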

  12. A double SIMEX approach for bivariate random-effects meta-analysis of diagnostic accuracy studies

    Directory of Open Access Journals (Sweden)

    Annamaria Guolo

    2017-01-01

    Full Text Available Abstract Background Bivariate random-effects models represent a widely accepted and recommended approach for meta-analysis of test accuracy studies. Standard likelihood methods routinely used for inference are prone to several drawbacks. Small sample sizes can give rise to unreliable inferential conclusions, and convergence issues make the approach unappealing. This paper suggests a different methodology to address such difficulties. Methods A SIMEX methodology is proposed. The method is a simulation-based technique originally developed as a correction strategy within the measurement error literature. It suits the meta-analysis framework, as the diagnostic accuracy measures provided by each study are prone to measurement error. SIMEX can be straightforwardly adapted to cover different measurement error structures and to deal with covariates. The effortless implementation with standard software is an interesting feature of the method. Results Extensive simulation studies highlight the improvement provided by SIMEX over the likelihood approach in terms of empirical coverage probabilities of confidence intervals under different scenarios, independently of the sample size and the value of the correlation between sensitivity and specificity. A remarkable amelioration is obtained in the case of deviations from the normality assumption for the random-effects distribution. From a computational point of view, the application of SIMEX is shown to be neither involved nor subject to the convergence issues affecting likelihood-based alternatives. Application of the method to a diagnostic review of the performance of transesophageal echocardiography for assessing ascending aorta atherosclerosis makes it possible to overcome limitations of the likelihood procedure. Conclusions The SIMEX methodology represents an interesting alternative to likelihood-based procedures for inference in meta-analysis of diagnostic accuracy studies. The approach can provide more accurate inferential
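
    The SIMEX idea in miniature, shown for a simple errors-in-variables regression rather than the bivariate meta-analysis model: re-estimate under progressively inflated measurement error, then extrapolate the trend back to the no-error case (lambda = -1):

```python
import numpy as np

rng = np.random.default_rng(0)
n, tau = 500, 0.8                       # sample size and known measurement-error SD
x = rng.normal(size=n)                  # true covariate
w = x + rng.normal(scale=tau, size=n)   # observed, error-prone version
y = 2.0 * x + rng.normal(scale=0.5, size=n)

lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
est = []
for lam in lambdas:
    # Average the naive slope over re-simulations with added error of variance lam*tau^2.
    slopes = [np.polyfit(w + rng.normal(scale=np.sqrt(lam) * tau, size=n), y, 1)[0]
              for _ in range(200)]
    est.append(np.mean(slopes))

coef = np.polyfit(lambdas, est, 2)           # quadratic extrapolant
print("SIMEX slope:", np.polyval(coef, -1))  # close to the true slope 2.0
```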

  13. A new approach for inversion of large random matrices in massive MIMO systems.

    Science.gov (United States)

    Anjum, Muhammad Ali Raza; Ahmed, Muhammad Mansoor

    2014-01-01

    We report a novel approach for inversion of large random matrices in massive Multiple-Input Multiple Output (MIMO) systems. It is based on the concept of inverse vectors in which an inverse vector is defined for each column of the principal matrix. Such an inverse vector has to satisfy two constraints. Firstly, it has to be in the null-space of all the remaining columns. We call it the null-space problem. Secondly, it has to form a projection of value equal to one in the direction of selected column. We term it as the normalization problem. The process essentially decomposes the inversion problem and distributes it over columns. Each column can be thought of as a node in the network or a particle in a swarm seeking its own solution, the inverse vector, which lightens the computational load on it. Another benefit of this approach is its applicability to all three cases pertaining to a linear system: the fully-determined, the over-determined, and the under-determined case. It eliminates the need of forming the generalized inverse for the last two cases by providing a new way to solve the least squares problem and the Moore and Penrose's pseudoinverse problem. The approach makes no assumption regarding the size, structure or sparsity of the matrix. This makes it fully applicable to much in vogue large random matrices arising in massive MIMO systems. Also, the null-space problem opens the door for a plethora of methods available in literature for null-space computation to enter the realm of matrix inversion. There is even a flexibility of finding an exact or approximate inverse depending on the null-space method employed. We employ the Householder's null-space method for exact solution and present a complete exposition of the new approach. A detailed comparison with well-established matrix inversion methods in literature is also given.

  14. A Combined Random Forest and OBIA Classification Scheme for Mapping Smallholder Agriculture at Different Nomenclature Levels Using Multisource Data (Simulated Sentinel-2 Time Series, VHSR and DEM)

    Directory of Open Access Journals (Sweden)

    Valentine Lebourgeois

    2017-03-01

    Full Text Available Sentinel-2 images are expected to improve global crop monitoring even in challenging tropical smallholder agricultural systems that are characterized by high intra- and inter-field spatial variability and where satellite observations are disturbed by the presence of clouds. To overcome these constraints, we analyzed and optimized the performance of a combined Random Forest (RF) classifier/object-based approach and applied it to multisource satellite data to produce land use maps of a smallholder agricultural zone in Madagascar at five different nomenclature levels. The RF classifier was first optimized by reducing the number of input variables. Experiments were then carried out to (i) test cropland masking prior to the classification of more detailed nomenclature levels, (ii) analyze the importance of each data source (a high spatial resolution (HSR) time series, a very high spatial resolution (VHSR) coverage, and a digital elevation model (DEM)) and data type (spectral, textural, or other), and (iii) quantify their contributions to classification accuracy levels. The results show that RF classifier optimization allowed for a reduction in the number of variables by 1.5- to 6-fold (depending on the classification level) and thus a reduction in the data processing time. Classification results were improved via the hierarchical approach at all classification levels, achieving an overall accuracy of 91.7% and 64.4% for the cropland and crop subclass levels, respectively. Spectral variables derived from an HSR time series were shown to be the most discriminating, with a better score for spectral indices than for the reflectances. VHSR data were only found to be essential when implementing the segmentation of the area into objects and not for the spectral or textural features they can provide during classification.
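
    A minimal sketch of the variable-reduction step, assuming scikit-learn and placeholder object features; ranking by Random Forest importance and refitting on the top variables mirrors the 1.5- to 6-fold reduction described above:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Placeholder features: 60 spectral/textural/DEM variables, a few of them informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(3000, 60))
y = (X[:, 0] + X[:, 5] > 0).astype(int) + (X[:, 12] > 1)  # toy crop-class labels

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
rank = np.argsort(rf.feature_importances_)[::-1]
keep = rank[:15]                                          # 4-fold fewer variables
rf_small = RandomForestClassifier(n_estimators=300, random_state=0).fit(X[:, keep], y)
print("top-ranked variables:", sorted(keep[:5].tolist()))  # 0, 5, 12 should dominate
```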

  15. Mapping Agricultural Fields in Sub-Saharan Africa with a Computer Vision Approach

    Science.gov (United States)

    Debats, S. R.; Luo, D.; Estes, L. D.; Fuchs, T.; Caylor, K. K.

    2014-12-01

    Sub-Saharan Africa is an important focus for food security research, because it is experiencing unprecedented population growth, agricultural activities are largely dominated by smallholder production, and the region is already home to 25% of the world's undernourished. One of the greatest challenges to monitoring and improving food security in this region is obtaining an accurate accounting of the spatial distribution of agriculture. Households are the primary units of agricultural production in smallholder communities and typically rely on small fields of less than 2 hectares. Field sizes are directly related to household crop productivity, management choices, and adoption of new technologies. As population and agriculture expand, it becomes increasingly important to understand both the distribution of field sizes as well as how agricultural communities are spatially embedded in the landscape. In addition, household surveys, a common tool for tracking agricultural productivity in Sub-Saharan Africa, would greatly benefit from spatially explicit accounting of fields. Current gridded land cover data sets do not provide information on individual agricultural fields or the distribution of field sizes. Therefore, we employ cutting-edge approaches from the field of computer vision to map fields across Sub-Saharan Africa, including semantic segmentation, discriminative classifiers, and automatic feature selection. Our approach aims not only to improve the binary classification accuracy of cropland, but also to isolate distinct fields, thereby capturing crucial information on size and geometry. Our research focuses on the development of descriptive features across scales to increase the accuracy and geographic range of our computer vision algorithm. Relevant data sets include high-resolution remote sensing imagery and Landsat (30-m) multi-spectral imagery. Training data for field boundaries are derived from hand-digitized data sets as well as crowdsourcing.

  16. Development of Return Period Inundation Maps In A Changing Climate Using a Systems of Systems Approach

    Science.gov (United States)

    Bilskie, M. V.; Hagen, S. C.; Alizad, K.; Passeri, D. L.; Irish, J. L.

    2016-12-01

    Worldwide, coastal land margins are experiencing increased vulnerability from natural and manmade disasters [Nicholls et al., 1999]. Specifically, coastal flooding is expected to increase due to the effects of climate change, and sea level rise (SLR) in particular. A systems of systems (SoS) approach has been implemented to better understand the dynamic and nonlinear effects of SLR on tropical cyclone-induced coastal flooding along a low-gradient coastal landscape (northern Gulf of Mexico [NGOM]). The backbone of the SoS framework is a high-resolution, physics-based, tide, wind-wave, and hurricane storm surge model [Bilskie et al., 2016a] that includes systems of SLR scenarios [Parris et al., 2012], shoreline morphology [Passeri et al., 2016; Plant et al., 2016], marsh evolution [Alizad et al., 2016], and population dynamics driven by carbon emission scenarios [Bilskie et al., 2016b]. Prior to considering future conditions, the storm surge model was comprehensively validated for present-day conditions [Bilskie et al., 2016a]. The present-day model was then modified to represent the potential future landscape based on four SLR scenarios prescribed by Parris et al. [2012] linked to carbon emissions scenarios for the year 2100. Numerical simulations forced by hundreds of synthetic tropical cyclones were performed and the results facilitate the development of return period inundation maps across the NGOM that reflect changes to the coastal landscape. The SoS approach allows new patterns and properties to emerge (i.e. nonlinear and dynamic effects of SLR) that would otherwise be unobserved using a static SLR model.

  17. Developing a model for effective leadership in healthcare: a concept mapping approach

    Science.gov (United States)

    Hargett, Charles William; Doty, Joseph P; Hauck, Jennifer N; Webb, Allison MB; Cook, Steven H; Tsipis, Nicholas E; Neumann, Julie A; Andolsek, Kathryn M; Taylor, Dean C

    2017-01-01

    Purpose Despite increasing awareness of the importance of leadership in healthcare, our understanding of the competencies of effective leadership remains limited. We used a concept mapping approach (a blend of qualitative and quantitative analysis of group processes to produce a visual composite of the group’s ideas) to identify stakeholders’ mental model of effective healthcare leadership, clarifying the underlying structure and importance of leadership competencies. Methods Literature review, focus groups, and consensus meetings were used to derive a representative set of healthcare leadership competency statements. Study participants subsequently sorted and rank-ordered these statements based on their perceived importance in contributing to effective healthcare leadership in real-world settings. Hierarchical cluster analysis of individual sortings was used to develop a coherent model of effective leadership in healthcare. Results A diverse group of 92 faculty and trainees individually rank-sorted 33 leadership competency statements. The highest rated statements were “Acting with Personal Integrity”, “Communicating Effectively”, “Acting with Professional Ethical Values”, “Pursuing Excellence”, “Building and Maintaining Relationships”, and “Thinking Critically”. Combining the results from hierarchical cluster analysis with our qualitative data led to a healthcare leadership model based on the core principle of Patient Centeredness and the core competencies of Integrity, Teamwork, Critical Thinking, Emotional Intelligence, and Selfless Service. Conclusion Using a mixed qualitative-quantitative approach, we developed a graphical representation of a shared leadership model derived in the healthcare setting. This model may enhance learning, teaching, and patient care in this important area, as well as guide future research. PMID:29355249

  18. 3D mapping of airway wall thickening in asthma with MSCT: a level set approach

    Science.gov (United States)

    Fetita, Catalin; Brillet, Pierre-Yves; Hartley, Ruth; Grenier, Philippe A.; Brightling, Christopher

    2014-03-01

    Assessing airway wall thickness in multi-slice computed tomography (MSCT) as an image marker for airway disease phenotyping, such as asthma and COPD, is a current trend and challenge for the scientific community working in lung imaging. This paper addresses the same problem from a different point of view: considering the expected wall-thickness-to-lumen-radius ratio for a normal subject as known and constant throughout the whole airway tree, the aim is to build up a 3D map of airway wall regions of larger thickness and to define an overall score able to highlight a pathological status. In this respect, the local dimension (caliber) of the previously segmented airway lumen is obtained at each point by exploiting the granulometry morphological operator. A level set function is defined based on this caliber information and on the expected wall thickness ratio, which allows obtaining a good estimate of the airway wall throughout all segmented lumen generations. Next, the vascular (or mediastinal dense tissue) contact regions are automatically detected and excluded from analysis. For the remaining airway wall border points, the real wall thickness is estimated based on tissue density analysis in the airway radial direction; thick wall points are highlighted on a 3D representation of the airways and several quantification scores are defined. The proposed approach is fully automatic and was evaluated (proof of concept) on a patient selection drawn from different databases, including mild and severe asthmatics and normal cases. This preliminary evaluation confirms the discriminative power of the proposed approach regarding different phenotypes and is currently being extended to larger cohorts.
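
    A 2-D toy version of the caliber-from-granulometry step (the paper works on the 3-D segmented lumen): the local caliber is taken as the largest opening radius whose result still covers the pixel:

```python
import numpy as np
from scipy import ndimage

def disk(r):
    y, x = np.ogrid[-r:r + 1, -r:r + 1]
    return x * x + y * y <= r * r

def granulometric_caliber(mask, r_max=20):
    """Local caliber of a binary lumen mask via openings of increasing radius."""
    caliber = np.zeros(mask.shape, dtype=int)
    for r in range(1, r_max + 1):
        opened = ndimage.binary_opening(mask, structure=disk(r))
        if not opened.any():
            break
        caliber[opened] = r   # largest radius whose opening still covers the pixel
    return caliber

mask = np.zeros((64, 64), dtype=bool)
mask[28:36, :] = True                     # an 8-pixel-wide "airway" strip
print(granulometric_caliber(mask).max())  # 3: a radius-3 disk (diameter 7) still fits
```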

  19. A hybrid model for mapping simplified seismic response via a GIS-metamodel approach

    Science.gov (United States)

    Grelle, G.; Bonito, L.; Revellino, P.; Guerriero, L.; Guadagno, F. M.

    2014-07-01

    In earthquake-prone areas, the site seismic response due to the lithostratigraphic sequence plays a key role in seismic hazard assessment. A hybrid model, consisting of GIS and metamodel (model of model) procedures, was introduced with the aim of estimating the 1-D spatial seismic site response in accordance with the spatial variability of sediment parameters. Inputs and outputs are provided and processed by means of an appropriate GIS model, named the GIS Cubic Model (GCM). This consists of a block-layered parametric structure aimed at resolving a predicted metamodel by means of pixel-by-pixel vertical computing. The metamodel, suitably calibrated, is able to emulate the classic shape of the spectral acceleration response in relation to the main physical parameters that characterize the spectrum itself. Therefore, via the GCM structure and the metamodel, the hybrid model provides maps of normalized acceleration response spectra. The hybrid model was applied and tested on the built-up area of the San Giorgio del Sannio village, located in a high-risk seismic zone of southern Italy. Efficiency tests showed a good correspondence between the spectral values resulting from the proposed approach and those of 1-D physical computational models. Supported by lithology and geophysical data and a corresponding accurate interpretation for modelling, the hybrid model can be an efficient tool for assessing seismic hazard/risk in urban planning.

  20. Multimodality approach to optical early detection and mapping of oral neoplasia

    Science.gov (United States)

    Ahn, Yeh-Chan; Chung, Jungrae; Wilder-Smith, Petra; Chen, Zhongping

    2011-07-01

    Early detection of cancer remains the best way to ensure patient survival and quality of life. Squamous cell carcinoma is usually preceded by dysplasia presenting as white, red, or mixed red and white epithelial lesions on the oral mucosa (leukoplakia, erythroplakia). Dysplastic lesions in the form of erythroplakia can carry a risk for malignant conversion of 90%. A noninvasive diagnostic modality would enable monitoring of these lesions at regular intervals and detection of treatment needs at a very early, relatively harmless stage. The specific aim of this work was to test a multimodality approach [three-dimensional optical coherence tomography (OCT) and polarimetry] to noninvasive diagnosis of oral premalignancy and malignancy using the hamster cheek pouch model (nine hamsters). The results were compared to tissue histopathology. During carcinogenesis, epithelial downgrowth, eventual loss of basement membrane integrity, and subepithelial invasion were clearly visible with OCT. Polarimetry techniques identified a four- to five-times increased retardance in sites with squamous cell carcinoma, and two- to three-times greater retardance in dysplastic sites than in normal tissues. These techniques were particularly useful for mapping areas of field cancerization with multiple lesions, as well as lesion margins.

  1. A new map of global ecological land units—An ecophysiographic stratification approach

    Science.gov (United States)

    Sayre, Roger; Dangermond, Jack; Frye, Charlie; Vaughan, Randy; Aniello, Peter; Breyer, Sean; Cribbs, Douglas; Hopkins, Dabney; Nauman, Richard; Derrenbacher, William; Wright, Dawn; Brown, Clint; Convis, Charles; Smith, Jonathan H.; Benson, Laurence; Van Sistine, Darren; Warner, Harumi; Cress, Jill Janene; Danielson, Jeffrey J.; Hamann, Sharon L.; Cecere, Thomas; Reddy, Ashwan D.; Burton, Devon; Grosse, Andrea; True, Diane; Metzger, Marc; Hartmann, Jens; Moosdorf, Nils; Durr, Hans; Paganini, Marc; Defourny, Pierre; Arino, Olivier; Maynard, Simone; Anderson, Mark; Comer, Patrick

    2014-01-01

    In response to the need, and an intergovernmental commission, for a high-resolution, data-derived global ecosystem map, land surface elements of global ecological pattern were characterized in an ecophysiographic stratification of the planet. The stratification produced 3,923 terrestrial ecological land units (ELUs) at a base resolution of 250 meters. The ELUs were derived from data on land surface features in a three-step approach. The first step involved acquiring or developing four global raster datalayers representing the primary components of ecosystem structure: bioclimate, landform, lithology, and land cover. These datasets generally represent the most accurate, current, globally comprehensive, and finest spatial and thematic resolution data available for each of the four inputs. The second step involved a spatial combination of the four inputs into a single, new integrated raster dataset in which every cell represents a combination of values from the bioclimate, landform, lithology, and land cover datalayers. This foundational global raster datalayer, called ecological facets (EFs), contains 47,650 unique combinations of the four inputs. The third step involved an aggregation of the EFs into the 3,923 ELUs. This subdivision of the Earth's surface into relatively fine ecological land areas is designed to be useful for various types of ecosystem research and management applications, including assessments of climate change impacts on ecosystems, economic and non-economic valuation of ecosystem services, and conservation planning.
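
    A sketch of the step-two combination, assuming toy categorical rasters in place of the global 250-m datalayers; each cell's four values are folded into a single facet code whose unique values play the role of the 47,650 EFs:

```python
import numpy as np

# Toy categorical rasters on one grid: bioclimate (10 classes), landform (8),
# lithology (12), and land cover (6); the real inputs are global 250-m layers.
rng = np.random.default_rng(1)
shape = (400, 400)
bioclim = rng.integers(0, 10, shape)
landform = rng.integers(0, 8, shape)
lith = rng.integers(0, 12, shape)
cover = rng.integers(0, 6, shape)

# Mixed-radix encoding gives every cell a unique code for its 4-tuple of values.
facets = ((bioclim * 8 + landform) * 12 + lith) * 6 + cover
print(len(np.unique(facets)), "unique ecological facets")
```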

  2. Peyton's four-step approach for teaching complex spinal manipulation techniques - a prospective randomized trial.

    Science.gov (United States)

    Gradl-Dietsch, Gertraud; Lübke, Cavan; Horst, Klemens; Simon, Melanie; Modabber, Ali; Sönmez, Tolga T; Münker, Ralf; Nebelung, Sven; Knobe, Matthias

    2016-11-03

    The objectives of this prospective randomized trial were to assess the impact of Peyton's four-step approach on the acquisition of complex psychomotor skills and to examine the influence of gender on learning outcomes. We randomly assigned 95 third to fifth year medical students to an intervention group which received instructions according to Peyton (PG) or a control group, which received conventional teaching (CG). Both groups attended four sessions on the principles of manual therapy and specific manipulative and diagnostic techniques for the spine. We assessed differences in theoretical knowledge (multiple choice (MC) exam) and practical skills (Objective Structured Practical Examination (OSPE)) with respect to type of intervention and gender. Participants took a second OSPE 6 months after completion of the course. There were no differences between groups with respect to the MC exam. Students in the PG group scored significantly higher in the OSPE. Gender had no additional impact. Results of the second OSPE showed a significant decline in competency regardless of gender and type of intervention. Peyton's approach is superior to standard instruction for teaching complex spinal manipulation skills regardless of gender. Skills retention was equally low for both techniques.

  3. An Effective NoSQL-Based Vector Map Tile Management Approach

    OpenAIRE

    Lin Wan; Zhou Huang; Xia Peng

    2016-01-01

    Within a digital map service environment, the rapid growth of Spatial Big-Data is driving new requirements for effective mechanisms for massive online vector map tile processing. The emergence of Not Only SQL (NoSQL) databases has resulted in a new data storage and management model for scalable spatial data deployments and fast tracking. They better suit the scenario of high-volume, low-latency network map services than traditional standalone high-performance computer (HPC) or relational data...

  4. Integration of value stream map and strategic layout planning into DMAIC approach to improve carpeting process

    National Research Council Canada - National Science Library

    Ayman Nagi; Safwan Altarazi

    2017-01-01

    .... Utilized tools included: Pareto analysis, control charts, Ishikawa chart, 5-whys, failure mode and effect analysis, process capability ratio, value stream mapping, and strategic layout planning. Findings...

  5. INTEGRATED GEOREFERENCING OF STEREO IMAGE SEQUENCES CAPTURED WITH A STEREOVISION MOBILE MAPPING SYSTEM – APPROACHES AND PRACTICAL RESULTS

    Directory of Open Access Journals (Sweden)

    H. Eugster

    2012-07-01

    Full Text Available Stereovision-based mobile mapping systems enable the efficient capturing of directly georeferenced stereo pairs. With today's camera and onboard storage technologies, imagery can be captured at high data rates, resulting in dense stereo sequences. These georeferenced stereo sequences provide a highly detailed and accurate digital representation of the roadside environment which builds the foundation for a wide range of 3D mapping applications and image-based geo web-services. Georeferenced stereo images are ideally suited for the 3D mapping of street furniture and visible infrastructure objects, pavement inspection, asset management tasks, or image-based change detection. As in most mobile mapping systems, the georeferencing of the mapping sensors and observations – in our case the imaging sensors – normally relies on direct georeferencing based on INS/GNSS navigation sensors. However, in urban canyons the achievable direct georeferencing accuracy of the dynamically captured stereo image sequences is often insufficient or at least degraded. Furthermore, many of the mentioned application scenarios require homogeneous georeferencing accuracy within a local reference frame over the entire mapping perimeter. To meet these demands, georeferencing approaches are presented and cost-efficient workflows are discussed which allow validating and updating the INS/GNSS-based trajectory with independently estimated positions in cases of prolonged GNSS signal outages, in order to raise the georeferencing accuracy to the project requirements.

  6. Integrated Georeferencing of Stereo Image Sequences Captured with a Stereovision Mobile Mapping System - Approaches and Practical Results

    Science.gov (United States)

    Eugster, H.; Huber, F.; Nebiker, S.; Gisi, A.

    2012-07-01

    Stereovision-based mobile mapping systems enable the efficient capturing of directly georeferenced stereo pairs. With today's camera and onboard storage technologies, imagery can be captured at high data rates, resulting in dense stereo sequences. These georeferenced stereo sequences provide a highly detailed and accurate digital representation of the roadside environment which builds the foundation for a wide range of 3D mapping applications and image-based geo web-services. Georeferenced stereo images are ideally suited for the 3D mapping of street furniture and visible infrastructure objects, pavement inspection, asset management tasks, or image-based change detection. As in most mobile mapping systems, the georeferencing of the mapping sensors and observations - in our case the imaging sensors - normally relies on direct georeferencing based on INS/GNSS navigation sensors. However, in urban canyons the achievable direct georeferencing accuracy of the dynamically captured stereo image sequences is often insufficient or at least degraded. Furthermore, many of the mentioned application scenarios require homogeneous georeferencing accuracy within a local reference frame over the entire mapping perimeter. To meet these demands, georeferencing approaches are presented and cost-efficient workflows are discussed which allow validating and updating the INS/GNSS-based trajectory with independently estimated positions in cases of prolonged GNSS signal outages, in order to raise the georeferencing accuracy to the project requirements.

  7. Minimum intervention dentistry approach to managing early childhood caries: a randomized control trial.

    Science.gov (United States)

    Arrow, Peter; Klobas, Elizabeth

    2015-12-01

    A pragmatic randomized control trial was undertaken to compare the minimum intervention dentistry (MID) approach, based on the atraumatic restorative treatment procedures (MID-ART: Test), against the standard care approach (Control) to treat early childhood caries in a primary care setting. Consenting parent/child dyads were allocated to the Test or Control group using stratified block randomization. Inclusion and exclusion criteria were applied. Participants were examined at baseline and at follow-up by two calibrated examiners blind to group allocation status (κ = 0.77), and parents completed a questionnaire at baseline and follow-up. Dental therapists trained in MID-ART provided treatment to the Test group, and dentists treated the Control group using standard approaches. The primary outcome of interest was the number of children who were referred for specialist pediatric care. Secondary outcomes were the number of teeth treated, changes in child oral health-related quality of life and dental anxiety, and parental perceptions of care received. Data were analyzed on an intention-to-treat basis; risk ratio for referral for specialist care, test of proportions, Wilcoxon rank test, and logistic regression were used. Three hundred and seventy parents/carers were initially screened; 273 children were examined at baseline and 254 were randomized (Test = 127; Control = 127): mean age = 3.8 years, SD 0.90; 59% male; mean dmft = 4.9, SD 4.0. There was no statistically significant difference in age, sex, baseline caries experience, or child oral health-related quality of life between the Test and Control groups. At follow-up (mean interval 11.4 months, SD 3.1 months), 220 children were examined: Test = 115, Control = 105. Case-notes review of 231 children showed that Test = 6 (5%) and Control = 53 (49%) were referred for specialist care. The MID-ART approach significantly reduced the likelihood of referral for specialist care, and more children and teeth were

  8. Using an intervention mapping approach to develop a discharge protocol for intensive care patients.

    Science.gov (United States)

    van Mol, Margo; Nijkamp, Marjan; Markham, Christine; Ista, Erwin

    2017-12-19

    Admission into an intensive care unit (ICU) may result in long-term physical, cognitive, and emotional consequences for patients and their relatives. The care of the critically ill patient does not end upon ICU discharge; therefore, integrated and ongoing care during and after transition to the follow-up ward is pivotal. This study described the development of an intervention that responds to this need. Intervention Mapping (IM), a six-step theory- and evidence-based approach, was used to guide intervention development. The first step, a problem analysis, comprised a literature review, six semi-structured telephone interviews with former ICU patients and their relatives, and seven qualitative roundtable meetings for all eligible nurses (i.e., 135 specialized and 105 general ward nurses). Performance and change objectives were formulated in step two. In step three, theory-based methods and practical applications were selected and directed at the desired behaviors and the identified barriers. Step four designed a revised discharge protocol taking into account existing interventions. Adoption, implementation, and evaluation of the new discharge protocol (IM steps five and six) are in progress and were not included in this study. Four former ICU patients and two relatives underlined the need for effective discharge information and supportive written material. They also reported a lack of knowledge regarding the consequences of ICU admission. Forty-two ICU and 19 general ward nurses identified benefits and barriers regarding discharge procedures using three vignettes framed by the literature. Some discrepancies were found. For example, ICU nurses were skeptical about the impact of writing a lay summary despite extensive evidence of the known benefits for the patients. ICU nurses anticipated having insufficient skills, not knowing the patient well enough, and fearing legal consequences of their writings. The intervention was designed to target the knowledge

  9. Structure and Evolution of Mediterranean Forest Research: A Science Mapping Approach.

    Science.gov (United States)

    Nardi, Pierfrancesco; Di Matteo, Giovanni; Palahi, Marc; Scarascia Mugnozza, Giuseppe

    2016-01-01

    This study aims at conducting the first science mapping analysis of Mediterranean forest research in order to elucidate its research structure and evolution. We applied a science mapping approach based on co-term and citation analyses to a set of scientific publications retrieved from Elsevier's Scopus database over the period 1980-2014. The Scopus search retrieved 2,698 research papers and reviews published by 159 peer-reviewed journals. The total number of publications was around 1% (N = 17) during the period 1980-1989 and reached 3% (N = 69) in the time slice 1990-1994. Since 1995, the number of publications increased exponentially, reaching 55% (N = 1,476) during the period 2010-2014. Within the thirty-four years considered, the retrieved publications were published by 88 countries. Among them, Spain was the most productive country, publishing 44% (N = 1,178) of total publications, followed by Italy (18%, N = 482) and France (12%, N = 336). These countries also host the ten most productive scientific institutions in terms of number of publications on Mediterranean forest subjects. Forest Ecology and Management and Annals of Forest Science were the most active journals in publishing research on Mediterranean forests. During the period 1980-1994, the research topics were poorly characterized, but they became better defined during the time slice 1995-1999. Since the 2000s, the clusters have become well defined by research topics. The current status of Mediterranean forest research (2009-2014) was represented by four clusters, in which different research topics such as biodiversity and conservation, land use and degradation, climate change effects on ecophysiological responses, and soil were identified. Basic research in Mediterranean forest ecosystems is mainly conducted through ecophysiological research. Applied research was mainly represented by the land-use and degradation, biodiversity and conservation, and fire research topics. The citation analyses revealed highly

  10. Structure and Evolution of Mediterranean Forest Research: A Science Mapping Approach.

    Directory of Open Access Journals (Sweden)

    Pierfrancesco Nardi

    Full Text Available This study aims at conducting the first science mapping analysis of Mediterranean forest research in order to elucidate its research structure and evolution. We applied a science mapping approach based on co-term and citation analyses to a set of scientific publications retrieved from Elsevier's Scopus database over the period 1980-2014. The Scopus search retrieved 2,698 research papers and reviews published by 159 peer-reviewed journals. The total number of publications was around 1% (N = 17) during the period 1980-1989 and reached 3% (N = 69) in the time slice 1990-1994. Since 1995, the number of publications increased exponentially, reaching 55% (N = 1,476) during the period 2010-2014. Within the thirty-four years considered, the retrieved publications were published by 88 countries. Among them, Spain was the most productive country, publishing 44% (N = 1,178) of total publications, followed by Italy (18%, N = 482) and France (12%, N = 336). These countries also host the ten most productive scientific institutions in terms of number of publications on Mediterranean forest subjects. Forest Ecology and Management and Annals of Forest Science were the most active journals in publishing research on Mediterranean forests. During the period 1980-1994, the research topics were poorly characterized, but they became better defined during the time slice 1995-1999. Since the 2000s, the clusters have become well defined by research topics. The current status of Mediterranean forest research (2009-2014) was represented by four clusters, in which different research topics such as biodiversity and conservation, land use and degradation, climate change effects on ecophysiological responses, and soil were identified. Basic research in Mediterranean forest ecosystems is mainly conducted through ecophysiological research. Applied research was mainly represented by the land-use and degradation, biodiversity and conservation, and fire research topics. The citation analyses

  11. Mapping Urban Green Infrastructure: A Novel Landscape-Based Approach to Incorporating Land Use and Land Cover in the Mapping of Human-Dominated Systems

    Directory of Open Access Journals (Sweden)

    Matthew Dennis

    2018-01-01

    Full Text Available Common approaches to mapping green infrastructure in urbanised landscapes invariably focus on measures of land use or land cover and associated functional or physical traits. However, such one-dimensional perspectives do not accurately capture the character and complexity of the landscapes in which urban inhabitants live. The new approach presented in this paper demonstrates how open-source, high spatial and temporal resolution data with global coverage can be used to measure and represent the landscape qualities of urban environments. Through going beyond simple metrics of quantity, such as percentage green and blue cover, it is now possible to explore the extent to which landscape quality helps to unpick the mixed evidence presented in the literature on the benefits of urban nature to human well-being. Here we present a landscape approach, employing remote sensing, GIS and data reduction techniques to map urban green infrastructure elements in a large U.K. city region. Comparison with existing urban datasets demonstrates considerable improvement in terms of coverage and thematic detail. The characterisation of landscapes, using census tracts as spatial units, and subsequent exploration of associations with social–ecological attributes highlights the further detail that can be uncovered by the approach. For example, eight urban landscape types identified for the case study city exhibited associations with distinct socioeconomic conditions attributable not only to quantities but also to qualities of green and blue space. The identification of individual landscape features through simultaneous measures of land use and land cover demonstrated unique and significant associations between the former and indicators of human health and ecological condition. The approach may therefore provide a promising basis for developing further insight into processes and characteristics that affect human health and well-being in urban areas, both in the United

  12. Educational Approach to Seismic Risk Mitigation in Indian Himalayas -Hazard Map Making Workshops at High Schools-

    Science.gov (United States)

    Koketsu, K.; Oki, S.; Kimura, M.; Chadha, R. K.; Davuluri, S.

    2014-12-01

    How can we encourage people to take preventive measures against damage risks and empower them to take the right actions in emergencies to save their lives? The conventional approach taken by scientists had been disseminating intelligible information on up-to-date seismological knowledge. However, it has been proven that knowledge alone does not have enough impact to modify people's behaviors in emergencies (Oki and Nakayachi, 2012). On the other hand, the conventional approach taken by practitioners had been to conduct emergency drills at schools or workplaces. The loss of many lives from the 2011 Tohoku earthquake has proven that these emergency drills were not enough to save people's lives, unless they were empowered to assess the given situation on their own and react flexibly. Our challenge is to bridge the gap between knowledge and practice. With reference to best practices observed in Tohoku, such as The Miracles of Kamaishi, our endeavor is to design an effective Disaster Preparedness Education Program that is applicable to other disaster-prone regions in the world, even with different geological, socio-economical and cultural backgrounds. The key concepts for this new approach are 1) empowering individuals to take preventive actions to save their lives, 2) granting community-based understanding of disaster risks and 3) building a sense of reality and relevancy to disasters. With these in mind, we held workshops at some high schools in the Lesser Himalayan Region, combining lectures with an activity called "Hazard Map Making" where students proactively identify and assess the hazards around their living areas and learn practical strategies on how to manage risks. We observed the change of awareness of the students by conducting a preliminary questionnaire survey and interviews after each session. Results strongly implied that the significant change of students' attitudes towards disaster preparedness occurred not by the lectures of scientific knowledge, but

  13. Digital soil mapping at pilot sites in the northwest coast of Egypt: A multinomial logistic regression approach

    Directory of Open Access Journals (Sweden)

    Fawzy Hassan Abdel-Kader

    2011-06-01

    Full Text Available The study examines a digital soil mapping approach for the production of soil maps by using multinomial logistic regression on soil and terrain information at pilot sites in the Northwestern Coastal region of Egypt. The aim is to reproduce the original map and predict soil distribution in the adjacent landscape. Reference soil maps produced by conventional methods at the Omayed and Nagamish areas were used. Spectral and terrain parameters were calculated, logit models of the soil classes were developed, and maps of the predicted soil classes were produced. The IDRISI, SAGA, STATISTICA, and SPSS software packages were used. The terrain and spectral parameters were found to be significantly influential, and the selection of the land-surface predictors was satisfactory. The McFadden pseudo R-squares ranged from 0.473 to 0.496. The most significant terrain parameters influencing the spatial distribution of the soil classes were elevation, valley depth, the multiresolution ridge-top flatness index, the multiresolution valley-bottom flatness index, and the SAGA wetness index. The most influential spectral parameters were the first two principal components of the six Landsat Enhanced Thematic Mapper bands. The overall accuracy of the predicted soil maps ranged from 72% to 74%, with a Kappa index ranging from 0.62 to 0.64. The developed probability models were successfully used to predict the spatial distribution of the soil mapping units at pixel resolutions of 28.5 m × 28.5 m and 90 m × 90 m at adjacent unvisited areas at Matrouh and Alamin. The developed methodology could contribute to the allocation, digital soil mapping, and management of new expansion sites in remote desert areas of Egypt.
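
    A minimal version of the modeling step, with random placeholder values for the terrain and spectral predictors listed above (scikit-learn's default lbfgs solver fits a multinomial logit when there are multiple classes):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

# Placeholder pixels: elevation, valley depth, ridge-top and valley-bottom
# flatness, SAGA wetness index, and two ETM principal components (all invented).
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 7))
y = np.argmax(X[:, :5] @ rng.normal(size=(5, 5)), axis=1)  # five toy soil classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
pred = logit.predict(X_te)
print("overall accuracy: %.2f" % accuracy_score(y_te, pred))
print("kappa:            %.2f" % cohen_kappa_score(y_te, pred))
```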

  14. A Remote Sensing Approach for Regional-Scale Mapping of Agricultural Land-Use Systems Based on NDVI Time Series

    Directory of Open Access Journals (Sweden)

    Beatriz Bellón

    2017-06-01

    Full Text Available In response to the need for generic remote sensing tools to support large-scale agricultural monitoring, we present a new approach for regional-scale mapping of agricultural land-use systems (ALUS based on object-based Normalized Difference Vegetation Index (NDVI time series analysis. The approach consists of two main steps. First, to obtain relatively homogeneous land units in terms of phenological patterns, a principal component analysis (PCA is applied to an annual MODIS NDVI time series, and an automatic segmentation is performed on the resulting high-order principal component images. Second, the resulting land units are classified into the crop agriculture domain or the livestock domain based on their land-cover characteristics. The crop agriculture domain land units are further classified into different cropping systems based on the correspondence of their NDVI temporal profiles with the phenological patterns associated with the cropping systems of the study area. A map of the main ALUS of the Brazilian state of Tocantins was produced for the 2013–2014 growing season with the new approach, and a significant coherence was observed between the spatial distribution of the cropping systems in the final ALUS map and in a reference map extracted from the official agricultural statistics of the Brazilian Institute of Geography and Statistics (IBGE. This study shows the potential of remote sensing techniques to provide valuable baseline spatial information for supporting agricultural monitoring and for large-scale land-use systems analysis.
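
    A sketch of the first step under stated assumptions: PCA condenses the annual NDVI time series, and k-means is used below merely as a stand-in for the object segmentation actually applied to the principal-component images:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

# Toy annual NDVI stack: 23 composites for a 200 x 200 tile (values invented).
rng = np.random.default_rng(5)
stack = rng.random((200 * 200, 23))

pcs = PCA(n_components=3).fit_transform(stack)   # phenology summarized by PCs
units = KMeans(n_clusters=8, n_init=10, random_state=5).fit_predict(pcs)
units = units.reshape(200, 200)
# The second step would label each unit by matching its mean NDVI profile to
# the phenological patterns of the known cropping systems.
```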

  15. Groundwater potential mapping using C5.0, random forest, and multivariate adaptive regression spline models in GIS.

    Science.gov (United States)

    Golkarian, Ali; Naghibi, Seyed Amir; Kalantar, Bahareh; Pradhan, Biswajeet

    2018-02-17

    Ever-increasing demand for water resources for different purposes makes it essential to have a better understanding and knowledge of water resources. Groundwater is one of the main water resources, especially in countries with arid climatic conditions. Thus, this study seeks to provide groundwater potential maps (GPMs) employing new algorithms. Accordingly, this study aims to validate the performance of the C5.0, random forest (RF), and multivariate adaptive regression splines (MARS) algorithms for generating GPMs in the eastern part of the Mashhad Plain, Iran. For this purpose, a dataset was produced consisting of spring locations as the indicator and groundwater-conditioning factors (GCFs) as input. In this research, 13 GCFs were selected, including altitude, slope aspect, slope angle, plan curvature, profile curvature, topographic wetness index (TWI), slope length, distance from rivers and faults, river and fault density, land use, and lithology. The dataset was divided into training and validation classes with 70 and 30% of the springs, respectively. Then, the C5.0, RF, and MARS algorithms were employed using the R statistical software, and the final values were transformed into GPMs. Finally, two evaluation criteria were calculated: Kappa and the area under the receiver operating characteristic curve (AUC-ROC). According to the findings of this research, MARS had the best performance, with an AUC-ROC of 84.2%, followed by the RF and C5.0 algorithms with AUC-ROC values of 79.7 and 77.3%, respectively. The results indicated that the AUC-ROC values for the employed models are more than 70%, which shows their acceptable performance. In conclusion, the produced methodology could be used in other geographical areas. GPMs could be used by water resource managers and related organizations to accelerate and facilitate water resource exploitation.
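
    A compact stand-in for the RF branch of the comparison, assuming scikit-learn and synthetic conditioning factors; the AUC-ROC evaluation mirrors the validation criterion used in the paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Toy dataset: 13 groundwater-conditioning factors, label 1 = spring present.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 13))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y,
                                          random_state=42)
rf = RandomForestClassifier(n_estimators=500, random_state=42).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
print("AUC-ROC: %.3f" % auc)   # the paper reports 0.797 for RF on its real data
```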

  16. Ontology Mapping Neural Network: An Approach to Learning and Inferring Correspondences among Ontologies

    Science.gov (United States)

    Peng, Yefei

    2010-01-01

    An ontology mapping neural network (OMNN) is proposed in order to learn and infer correspondences among ontologies. It extends the Identical Elements Neural Network (IENN)'s ability to represent and map complex relationships. The learning dynamics of simultaneous (interlaced) training of similar tasks interact at the shared connections of the…

  17. The Use of Concept Maps to Assess Preservice Teacher Understanding: A Formative Approach in Mathematics Education

    Science.gov (United States)

    Brakoniecki, Aaron; Shah, Fahmil

    2017-01-01

    The research reported in this article explored the methods by which concept maps served as formative assessment by capturing changes in the ways preservice mathematics teachers represented their understanding of algebra. The participants were enrolled in a course on high school algebra for teachers and created the maps on the first and last day of…

  18. Concept Mapping: An Approach for Evaluating a Public Alternative School Program

    Science.gov (United States)

    Streeter, Calvin L.; Franklin, Cynthia; Kim, Johnny S.; Tripodi, Stephen J.

    2011-01-01

    This article describes how concept mapping techniques were applied to evaluate the development of a solution-focused, public alternative school program. Concept Systems software was used to create 15 cluster maps based on statements generated from students, teachers, and school staff. In addition, pattern matches were analyzed to examine the…

  19. A Different Approach to Preparing Novakian Concept Maps: The Indexing Method

    Science.gov (United States)

    Turan Oluk, Nurcan; Ekmekci, Güler

    2016-01-01

    People who claim that applying Novakian concept maps in Turkish is problematic base their arguments largely upon the structural differences between the English and Turkish languages. This study aims to introduce the indexing method to eliminate problems encountered in Turkish applications of Novakian maps and to share the preliminary results of…

  20. Predicting Childhood Sexual or Physical Abuse: A Logistic Regression Geo-mapping Approach to Prevention

    OpenAIRE

    Tadoum, Roland K.; Smolij, Kamila; Lyn, Michelle A.; Johnson, Craig W.

    2005-01-01

    This study investigates the degree to which gender, ethnicity, relationship to perpetrator, and geo-mapped socio-economic factors significantly predict the incidence of childhood sexual abuse, physical abuse, and non-abuse. These variables are then linked to geographic identifiers using geographic information system (GIS) technology to develop a geo-mapping framework for child sexual and physical abuse prevention.

  1. Multiple QTL mapping in related plant populations via a pedigree-analysis approach

    NARCIS (Netherlands)

    Bink, M.C.A.M.; Uimari, P.; Sillanpää, M.J.; Janss, L.L.G.; Jansen, R.C.

    2002-01-01

    QTL mapping experiments in plant breeding may involve multiple populations or pedigrees that are related through their ancestors. These known relationships have often been ignored for the sake of statistical analysis, despite their potential increase in power of mapping. We describe here a Bayesian

  2. Multiple QTL mapping in related plant populations via a pedigree-analysis approach

    NARCIS (Netherlands)

    Bink, M.C.A.M.; Uimari, P.; Sillanpää, M.J.; Janss, L.L.G.; Jansen, R.C.

    QTL mapping experiments in plant breeding may involve multiple populations or pedigrees that are related through their ancestors. These known relationships have often been ignored for the sake of statistical analysis, despite their potential increase in power of mapping. We describe here a Bayesian

  3. A two-stage approach for improved prediction of residue contact maps

    Directory of Open Access Journals (Sweden)

    Pollastri Gianluca

    2006-03-01

    Background Protein topology representations such as residue contact maps are an important intermediate step towards ab initio prediction of protein structure. Although improvements have occurred over the last years, the problem of accurately predicting residue contact maps from primary sequences is still largely unsolved. Among the reasons for this are the unbalanced nature of the problem (with far fewer examples of contacts than non-contacts), the formidable challenge of capturing long-range interactions in the maps, and the intrinsic difficulty of mapping one-dimensional input sequences into two-dimensional output maps. In order to alleviate these problems and achieve improved contact map predictions, in this paper we split the task into two stages: the prediction of a map's principal eigenvector (PE) from the primary sequence, and the reconstruction of the contact map from the PE and primary sequence. Predicting the PE from the primary sequence consists in mapping a vector into a vector. This task is less complex than mapping vectors directly into two-dimensional matrices, since the size of the problem is drastically reduced and so is the scale length of interactions that need to be learned. Results We develop architectures composed of ensembles of two-layered bidirectional recurrent neural networks to classify the components of the PE into 2, 3 and 4 classes from protein primary sequence, predicted secondary structure, and hydrophobicity interaction scales. Our predictor, tested on a non-redundant set of 2171 proteins, achieves classification performances of up to 72.6%, 16% above a baseline statistical predictor. We design a system for the prediction of contact maps from the predicted PE. Our results show that predicting maps through the PE yields sizeable gains, especially for long-range contacts, which are particularly critical for accurate protein 3D reconstruction. The final predictor's accuracy on a non-redundant set of 327 targets is 35
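
    The quantity targeted in stage one can be made concrete: for a symmetric contact map, the PE is the eigenvector associated with the largest eigenvalue. A minimal sketch on a toy map (not the paper's sequence-based predictor):

```python
# Stage-one target: the principal eigenvector of a symmetric residue contact map.
import numpy as np

def principal_eigenvector(contact_map: np.ndarray) -> np.ndarray:
    # eigh returns eigenvalues in ascending order for symmetric matrices.
    vals, vecs = np.linalg.eigh(contact_map)
    pe = vecs[:, -1]                       # eigenvector of the largest eigenvalue
    return pe if pe.sum() >= 0 else -pe    # fix an arbitrary sign convention

# Toy symmetric binary contact map for a 50-residue "protein":
rng = np.random.default_rng(2)
upper = np.triu((rng.random((50, 50)) > 0.9).astype(float), 1)
contact_map = upper + upper.T
print(principal_eigenvector(contact_map)[:5])
```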

  4. Self-organizing maps of document collections: A new approach to interactive exploration

    Energy Technology Data Exchange (ETDEWEB)

    Lagus, K.; Honkela, T.; Kaski, S.; Kohonen, T. [Helsinki Univ. of Technology (Finland)

    1996-12-31

    Powerful methods for interactive exploration and search from collections of free-form textual documents are needed to manage the ever-increasing flood of digital information. In this article we present a method, WEBSOM, for automatic organization of full-text document collections using the self-organizing map (SOM) algorithm. The document collection is ordered onto a map in an unsupervised manner utilizing statistical information of short word contexts. The resulting ordered map, where similar documents lie near each other, thus presents a general view of the document space. With the aid of a suitable (WWW-based) interface, documents in interesting areas of the map can be browsed. The browsing can also be interactively extended to related topics, which appear in nearby areas on the map. Along with the method we present a case study of its use.
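
    The underlying SOM update can be sketched compactly; the following is a from-scratch toy version (not the WEBSOM implementation), with grid size, learning rate, and neighborhood schedule chosen arbitrarily:

```python
# Minimal self-organizing map in the spirit of WEBSOM: document vectors are
# ordered onto a 2-D grid so that similar documents end up on nearby units.
import numpy as np

def train_som(docs, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    gy, gx = np.mgrid[0:grid[0], 0:grid[1]]
    coords = np.stack([gy.ravel(), gx.ravel()], axis=1).astype(float)
    weights = rng.random((grid[0] * grid[1], docs.shape[1]))
    n_steps, step = epochs * len(docs), 0
    for _ in range(epochs):
        for d in docs[rng.permutation(len(docs))]:
            lr = lr0 * (1 - step / n_steps)                    # decaying learning rate
            sigma = sigma0 * (1 - step / n_steps) + 0.5        # shrinking neighborhood
            bmu = np.argmin(((weights - d) ** 2).sum(axis=1))  # best-matching unit
            dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-dist2 / (2 * sigma ** 2))              # neighborhood kernel
            weights += lr * h[:, None] * (d - weights)
            step += 1
    return weights.reshape(grid[0], grid[1], -1)

som = train_som(np.random.default_rng(3).random((200, 30)))
print(som.shape)  # (10, 10, 30): one 30-dim prototype per map unit
```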

  5. An integrated approach to flood hazard assessment on alluvial fans using numerical modeling, field mapping, and remote sensing

    Science.gov (United States)

    Pelletier, J.D.; Mayer, L.; Pearthree, P.A.; House, P.K.; Demsey, K.A.; Klawon, J.K.; Vincent, K.R.

    2005-01-01

    Millions of people in the western United States live near the dynamic, distributary channel networks of alluvial fans where flood behavior is complex and poorly constrained. Here we test a new comprehensive approach to alluvial-fan flood hazard assessment that uses four complementary methods: two-dimensional raster-based hydraulic modeling, satellite-image change detection, field-based mapping of recent flood inundation, and surficial geologic mapping. Each of these methods provides spatial detail lacking in the standard method and each provides critical information for a comprehensive assessment. Our numerical model simultaneously solves the continuity equation and Manning's equation (Chow, 1959) using an implicit numerical method. It provides a robust numerical tool for predicting flood flows using the large, high-resolution Digital Elevation Models (DEMs) necessary to resolve the numerous small channels on the typical alluvial fan. Inundation extents and flow depths of historic floods can be reconstructed with the numerical model and validated against field- and satellite-based flood maps. A probabilistic flood hazard map can also be constructed by modeling multiple flood events with a range of specified discharges. This map can be used in conjunction with a surficial geologic map to further refine floodplain delineation on fans. To test the accuracy of the numerical model, we compared model predictions of flood inundation and flow depths against field- and satellite-based flood maps for two recent extreme events on the southern Tortolita and Harquahala piedmonts in Arizona. Model predictions match the field- and satellite-based maps closely. Probabilistic flood hazard maps based on the 10 yr, 100 yr, and maximum floods were also constructed for the study areas using stream gage records and paleoflood deposits. The resulting maps predict spatially complex flood hazards that strongly reflect small-scale topography and are consistent with surficial geology. In
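
    Manning's equation, which the model couples with continuity, is simple enough to state as a worked example; the channel values below are illustrative, not from the study:

```python
# Manning's equation, the closure used by the hydraulic model (SI units):
# V = (1/n) * R^(2/3) * S^(1/2), and discharge Q = V * A.
def manning_velocity(n: float, r: float, s: float) -> float:
    """n: roughness coefficient, r: hydraulic radius [m], s: channel slope [-]."""
    return (1.0 / n) * r ** (2.0 / 3.0) * s ** 0.5

area = 12.0                                    # flow cross-section [m^2], illustrative
v = manning_velocity(n=0.035, r=0.8, s=0.01)   # illustrative channel parameters
print(f"V = {v:.2f} m/s, Q = {v * area:.1f} m^3/s")
```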

  6. Effect of Retraining Approach-Avoidance Tendencies on an Exercise Task: A Randomized Controlled Trial.

    Science.gov (United States)

    Cheval, Boris; Sarrazin, Philippe; Pelletier, Luc; Friese, Malte

    2016-12-01

    Promoting regular physical activity (PA) and lessening sedentary behaviors (SB) constitute a public health priority. Recent evidence suggests that PA and SB are not only related to reflective processes (eg, behavioral intentions), but also to impulsive approach-avoidance tendencies (IAAT). This study aims to test the effect of a computerized IAAT intervention on an exercise task. Participants (N = 115) were randomly assigned to 1 of 3 experimental conditions, in which they were either trained to approach PA and avoid SB (ApPA-AvSB condition), to approach SB and avoid PA (ApSB-AvPA condition), or to approach and avoid PA and SB equally often (active control condition). The main outcome variable was the time spent carrying out a moderate intensity exercise task. IAAT toward PA decreased in the ApSB-AvPA condition, tended to increase in the ApPA-AvSB condition, and remained stable in the control condition. Most importantly, the ApPA-AvSB manipulation led to more time spent exercising than the ApSB-AvPA condition. Sensitivity analyses excluding individuals who were highly physically active further revealed that participants in the ApPA-AvSB condition spent more time exercising than participants in the control condition. These findings provide preliminary evidence that a single intervention session can successfully change impulsive approach tendencies toward PA and can increase the time devoted to an exercise task, especially among individuals who need to be more physically active. Potential implications for health behavior theories and behavior change interventions are outlined.

  7. Letter to the editor: Generation of self organized critical connectivity network map (SOCCNM) of randomly situated water bodies during flooding process

    Directory of Open Access Journals (Sweden)

    B. S. Daya Sagar

    2001-01-01

    This letter presents a brief framework based on nonlinear morphological transformations to generate a self-organized critical connectivity network map (SOCCNM) in 2-dimensional space. This simple and elegant framework is implemented on a section that contains a few simulated water bodies to generate the SOCCNM. It is based on the postulate that randomly situated surface water bodies of various sizes and shapes self-organize during the flooding process.

  8. A METHOD TO ESTIMATE TEMPORAL INTERACTION IN A CONDITIONAL RANDOM FIELD BASED APPROACH FOR CROP RECOGNITION

    Directory of Open Access Journals (Sweden)

    P. M. A. Diaz

    2016-06-01

    This paper presents a method to estimate the temporal interaction in a Conditional Random Field (CRF)-based approach for crop recognition from multitemporal remote sensing image sequences. This approach models the phenology of different crop types as a CRF. Interaction potentials are assumed to depend only on the class labels of an image site at two consecutive epochs. In the proposed method, the estimation of temporal interaction parameters is considered as an optimization problem whose goal is to find the transition matrix that maximizes the CRF performance upon a set of labelled data. The objective functions underlying the optimization procedure can be formulated in terms of different accuracy metrics, such as overall and average class accuracy per crop or phenological stage. To validate the proposed approach, experiments were carried out on a dataset consisting of 12 co-registered LANDSAT images of a region in the southeast of Brazil. Pattern Search was used as the optimization algorithm. The experimental results demonstrated that the proposed method was able to substantially outperform estimates related to joint or conditional class transition probabilities, which rely on training samples.
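
    The optimization idea, searching over row-stochastic transition matrices for the one that maximizes an accuracy metric, can be sketched as follows; random search stands in for the Pattern Search optimizer, and a placeholder objective replaces the full CRF evaluation:

```python
# Sketch of the estimation idea: find the class-transition matrix that
# maximizes an accuracy objective. `crf_accuracy` is a placeholder for a
# full CRF evaluation on labelled data; random search stands in for
# the Pattern Search optimizer used in the paper.
import numpy as np

def crf_accuracy(transition: np.ndarray) -> float:
    # Placeholder objective: reward self-transitions (phenology tends to persist).
    return float(np.trace(transition) / transition.shape[0])

def search_transition_matrix(n_classes=4, iters=1000, seed=0):
    rng = np.random.default_rng(seed)
    best, best_acc = None, -np.inf
    for _ in range(iters):
        m = rng.random((n_classes, n_classes))
        m /= m.sum(axis=1, keepdims=True)   # each row is a probability distribution
        acc = crf_accuracy(m)
        if acc > best_acc:
            best, best_acc = m, acc
    return best, best_acc

matrix, acc = search_transition_matrix()
print(f"best objective: {acc:.3f}")
```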

  9. When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

    KAUST Repository

    Liu, Xiao

    2017-03-21

    Privacy risks of recommender systems have attracted increasing attention. Users’ private data is often collected by a possibly untrusted recommender system in order to provide high-quality recommendations. Meanwhile, malicious attackers may utilize recommendation results to make inferences about other users’ private data. Existing approaches focus either on keeping users’ private data protected during recommendation computation or on preventing the inference of any single user’s data from the recommendation result. However, none is designed both to hide users’ private data and to prevent privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems that combines differential privacy (DP) with randomized perturbation (RP). We theoretically show that the noise added by RP has a limited effect on recommendation accuracy and that the noise added by DP can be well controlled based on the sensitivity analysis of functions on the perturbed data. Extensive experiments on three large-scale real-world datasets show that the hybrid approach generally provides more privacy protection with acceptable recommendation accuracy loss, and surprisingly sometimes achieves better privacy without sacrificing accuracy, thus validating its feasibility in practice.
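
    The two noise ingredients can be illustrated on a toy rating vector; the sketch below shows only the generic RP and Laplace-mechanism building blocks with assumed parameters, not the paper's exact mechanism:

```python
# Toy illustration of the two ingredients the hybrid approach combines:
# randomized perturbation of raw ratings, then Laplace noise calibrated to
# sensitivity/epsilon for differential privacy on a released aggregate.
import numpy as np

rng = np.random.default_rng(4)
ratings = rng.integers(1, 6, size=1000).astype(float)   # private 1..5-star ratings

# Randomized perturbation: add zero-mean noise, then clip back to the rating range.
perturbed = np.clip(ratings + rng.uniform(-1.0, 1.0, size=ratings.shape), 1.0, 5.0)

# Differential privacy on the released mean via the Laplace mechanism.
epsilon = 0.5
sensitivity = 4.0 / len(ratings)   # one clipped rating moves the mean by at most 4/n
dp_mean = perturbed.mean() + rng.laplace(0.0, sensitivity / epsilon)
print(f"true mean {ratings.mean():.3f}, released mean {dp_mean:.3f}")
```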

  10. Treatment comparison in randomized clinical trials with nonignorable missingness: A reverse regression approach.

    Science.gov (United States)

    Zhang, Zhiwei; Cheon, Kyeongmi

    2017-04-01

    A common problem in randomized clinical trials is nonignorable missingness, namely that the clinical outcome(s) of interest can be missing in a way that is not fully explained by the observed quantities. This happens when the continued participation of patients depends on the current outcome after adjusting for the observed history. Standard methods for handling nonignorable missingness typically require specification of the response mechanism, which can be difficult in practice. This article proposes a reverse regression approach that does not require a model for the response mechanism. Instead, the proposed approach relies on the assumption that missingness is independent of treatment assignment upon conditioning on the relevant outcome(s). This conditional independence assumption is motivated by the observation that, when patients are effectively masked to the assigned treatment, their decision to either stay in the trial or drop out cannot depend on the assigned treatment directly. Under this assumption, one can estimate parameters in the reverse regression model, test for the presence of a treatment effect, and in some cases estimate the outcome distributions. The methodology can be extended to longitudinal outcomes under natural conditions. The proposed approach is illustrated with real data from a cardiovascular study.

  11. Water-sanitation-hygiene mapping: an improved approach for data collection at local level.

    Science.gov (United States)

    Giné-Garriga, Ricard; de Palencia, Alejandro Jiménez-Fernández; Pérez-Foguet, Agustí

    2013-10-01

    Strategic planning and appropriate development and management of water and sanitation services are strongly supported by accurate and accessible data. If adequately exploited, these data might assist water managers with performance monitoring, benchmarking comparisons, policy progress evaluation, resources allocation, and decision making. A variety of tools and techniques are in place to collect such information. However, some methodological weaknesses arise when developing an instrument for routine data collection, particularly at the local level: i) comparability problems due to heterogeneity of indicators, ii) poor reliability of collected data, iii) inadequate combination of different information sources, and iv) statistical validity of produced estimates when disaggregated into small geographic subareas. This study proposes an improved approach for water, sanitation and hygiene (WASH) data collection at the decentralised level in low-income settings, as an attempt to overcome the previous shortcomings. The ultimate aim is to provide local policymakers with strong evidence to inform their planning decisions. The survey design takes Water Point Mapping (WPM) as a starting point to record all available water sources at a particular location. This information is then linked to data produced by a household survey. Different survey instruments are implemented to collect reliable data by employing a variety of techniques, such as structured questionnaires, direct observation and water quality testing. The collected data is finally validated through simple statistical analysis, which in turn produces valuable outputs that might feed into the decision-making process. In order to demonstrate the applicability of the method, outcomes produced from three different case studies (Homa Bay District, Kenya; Kibondo District, Tanzania; and the Municipality of Manhiça, Mozambique) are presented. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Coastal system mapping: a new approach to formalising and conceptualising the connectivity of large-scale coastal systems

    Science.gov (United States)

    French, J.; Burningham, H.; Whitehouse, R.

    2010-12-01

    The concept of the coastal sediment cell has proved invaluable as a basis for estimating sediment budgets and as a framework for coastal management. However, whilst coastal sediment cells are readily identified on compartmentalised coastlines dominated by beach-grade material, the cell concept is less suited to handling broader linkages between estuarine, coastal and offshore systems, and for incorporating longer-range suspended sediment transport. We present a new approach to the conceptualisation of large-scale coastal geomorphic systems based on a hierarchical classification of component landforms and management interventions and mapping of the interactions between them. Coastal system mapping is founded on a classification that identifies high-level landform features, low-level landform elements and engineering interventions. Geomorphic features define the large-scale organisation of a system and include landforms that define gross coastal configuration (e.g. headland, bay) as well as fluvial, estuarine and offshore sub-systems that exchange sediment with and influence the open coast. Detailed system structure is mapped out with reference to a larger set of geomorphic elements (e.g. cliff, dune, beach ridge). Element-element interactions define cross-shore linkages (conceptualised as hinterland, backshore and foreshore zones) and alongshore system structure. Both structural and non-structural engineering interventions are also represented at this level. Element-level mapping is rationalised to represent alongshore variation using as few elements as possible. System linkages include both sediment transfer pathways and influences not associated with direct mass transfer (e.g. effect of a jetty at an inlet). A formal procedure for capturing and graphically representing coastal system structure has been developed around free concept mapping software, CmapTools (http://cmap.ihmc.us). Appended meta-data allow geographic coordinates, data, images and literature

  13. Developing a model for effective leadership in healthcare: a concept mapping approach

    Directory of Open Access Journals (Sweden)

    Hargett CW

    2017-08-01

    Purpose: Despite increasing awareness of the importance of leadership in healthcare, our understanding of the competencies of effective leadership remains limited. We used a concept mapping approach (a blend of qualitative and quantitative analysis of group processes to produce a visual composite of the group’s ideas) to identify stakeholders’ mental model of effective healthcare leadership, clarifying the underlying structure and importance of leadership competencies. Methods: Literature review, focus groups, and consensus meetings were used to derive a representative set of healthcare leadership competency statements. Study participants subsequently sorted and rank-ordered these statements based on their perceived importance in contributing to effective healthcare leadership in real-world settings. Hierarchical cluster analysis of individual sortings was used to develop a coherent model of effective leadership in healthcare. Results: A diverse group of 92 faculty and trainees individually rank-sorted 33 leadership competency statements. The highest rated statements were “Acting with Personal Integrity”, “Communicating Effectively”, “Acting with Professional Ethical Values”, “Pursuing Excellence”, “Building and Maintaining Relationships”, and “Thinking Critically”. Combining the results from hierarchical cluster analysis with our qualitative data led to a healthcare leadership model based on the core principle of Patient

  14. Relation of project managers' personality and project performance: An approach based on value stream mapping

    Directory of Open Access Journals (Sweden)

    Maurizio Bevilacqua

    2014-09-01

    Purpose: This work investigates the influence of project managers’ personality on the success of projects in a multinational corporation. The methodology proposed for analyzing the project managers’ personality is based on the Myers-Briggs Type Indicator. Design/methodology/approach: Forty projects carried out in 2012 by a multinational corporation, concerning new product development (NPD), were analyzed, comparing the profiles of project managers with the results obtained in terms of traditional performance indexes (time delay and over-budget of projects) and performance indexes usually used in the “Lean Production” sector (waste time and type of “wastes”). A detailed analysis of the most important “wastes” during project development was carried out using the Value Stream Mapping (VSM) technique. Findings and Originality/value: Relying on the Myers–Briggs personality instrument, results show that extroverted managers (as opposed to introverted managers) carry out projects that show lower delay and lower waste time. Introverted managers often make “Over-processing” and “Defect” types of waste. Moreover, lower delay and over-budget have been shown by perceiving managers. Research limitations: Regarding the limitations of this work, it is necessary to highlight that we collected data from project managers retrospectively. While we believe that several aspects of our data collection effort helped enhance the accuracy of the results, future research could conduct real-time case study research to get more detailed insights into the proposed relationships and avoid retrospective bias. Moreover, we focused on a single respondent, the project manager. This helped us ensure that their interpretations played an important role in product development, but we could not examine the opinions of team members, which could differ from the project managers’ opinions on some questions. Originality/value: This research provides insight useful

  15. Active music therapy approach in amyotrophic lateral sclerosis: a randomized-controlled trial.

    Science.gov (United States)

    Raglio, Alfredo; Giovanazzi, Elena; Pain, Debora; Baiardi, Paola; Imbriani, Chiara; Imbriani, Marcello; Mora, Gabriele

    2016-12-01

    This randomized controlled study assessed the efficacy of active music therapy (AMT) on anxiety, depression, and quality of life in amyotrophic lateral sclerosis (ALS). Communication and relationship during AMT treatment were also evaluated. Thirty patients were assigned randomly to experimental [AMT plus standard of care (SC)] or control (SC) groups. AMT consisted of 12 sessions (three times a week), whereas the SC treatment was based on physical and speech rehabilitation sessions, occupational therapy, and psychological support. ALS Functional Rating Scale-Revised, Hospital Anxiety and Depression Scale, McGill Quality of Life Questionnaire, and Music Therapy Rating Scale were administered to assess functional, psychological, and music therapy outcomes. The AMT group improved significantly in McGill Quality of Life Questionnaire global scores (P=0.035) and showed a positive trend in nonverbal and sonorous-music relationship during the treatment. Further studies involving larger samples in a longer AMT intervention are needed to confirm the effectiveness of this approach in ALS.

  16. Bayesian and variational Bayesian approaches for flows in heterogeneous random media

    Science.gov (United States)

    Yang, Keren; Guha, Nilabja; Efendiev, Yalchin; Mallick, Bani K.

    2017-09-01

    In this paper, we study porous media flows in heterogeneous stochastic media. We propose an efficient forward simulation technique that is tailored for variational Bayesian inversion. As a starting point, the proposed forward simulation technique decomposes the solution into the sum of separable functions (with respect to randomness and the space), where each term is calculated based on a variational approach. This is similar to Proper Generalized Decomposition (PGD). Next, we apply a multiscale technique to solve for each term (as in [1]) and, further, decompose the random function into 1D fields. As a result, our proposed method provides an approximation hierarchy for the solution as we increase the number of terms in the expansion and, also, increase the spatial resolution of each term. We use the hierarchical solution distributions in a variational Bayesian approximation to perform uncertainty quantification in the inverse problem. We conduct a detailed numerical study to explore the performance of the proposed uncertainty quantification technique and show the theoretical posterior concentration.
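
    One plausible form of the separable expansion described above, in notation of my own choosing (the paper's exact decomposition may differ):

```latex
% PGD-style separable expansion used by the forward solver (notation mine):
% randomness \omega and space x are separated term by term, and each random
% factor is further decomposed into one-dimensional fields.
u(x,\omega) \;\approx\; \sum_{i=1}^{N} \Phi_i(x)\,\Xi_i(\omega),
\qquad
\Xi_i(\omega) \;\approx\; \prod_{k=1}^{d} \xi_{ik}(\omega_k).
% The approximation hierarchy tightens as N grows and as the spatial
% resolution of each \Phi_i increases.
```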

  17. A stochastic control approach to Slotted-ALOHA random access protocol

    Science.gov (United States)

    Pietrabissa, Antonio

    2013-12-01

    ALOHA random access protocols are distributed protocols based on transmission probabilities, that is, each node decides upon packet transmissions according to a transmission probability value. In the literature, ALOHA protocols are analysed by giving necessary and sufficient conditions for the stability of the queues of the node buffers under a control vector (whose elements are the transmission probabilities assigned to the nodes), given an arrival rate vector (whose elements represent the rates of the packets arriving in the node buffers). The innovation of this work is that, given an arrival rate vector, it computes the optimal control vector by defining and solving a stochastic control problem aimed at maximising the overall transmission efficiency, while keeping a grade of fairness among the nodes. Furthermore, a more general case in which the arrival rate vector changes in time is considered. The increased efficiency of the proposed solution with respect to the standard ALOHA approach is evaluated by means of numerical simulations.
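
    A useful baseline behind this control problem is the classical slotted-ALOHA efficiency: with n nodes each transmitting with probability p, a slot succeeds when exactly one node transmits, and the success probability is maximized at p = 1/n. A quick check:

```python
# Classical slotted-ALOHA per-slot success probability for n symmetric nodes:
# S(p) = n * p * (1 - p)**(n - 1), maximized at p = 1/n, approaching 1/e.
import math

def success_probability(n: int, p: float) -> float:
    return n * p * (1 - p) ** (n - 1)

n = 20
print(f"S(1/n) = {success_probability(n, 1.0 / n):.3f}")  # ~0.377 for n = 20
print(f"limit 1/e = {1 / math.e:.3f}")                    # large-n limit
```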

  18. A new approach to analyze strategy map using an integrated BSC and FUZZY DEMATEL

    Directory of Open Access Journals (Sweden)

    Seyed Abdollah Heydariyeh

    2012-01-01

    Today, with ever-increasing competition in global economic conditions, effective implementation of the strategy map has become an inevitable necessity. The strategy map represents a general and structured framework for strategic objectives and plays an important role in forming competitive advantages for organizations. It is important to find the factors influencing the strategy map and to prioritize them using suitable criteria. In this paper, we propose an integration of the BSC and fuzzy DEMATEL techniques to rank different items influencing the strategy of a production plan. The proposed technique is implemented for a real-world case study of glass production.

  19. A concept mapping approach to guide and understand dissemination and implementation.

    Science.gov (United States)

    Green, Amy E; Fettes, Danielle L; Aarons, Gregory A

    2012-10-01

    Many efforts to implement evidence-based programs do not reach their full potential or fail due to the variety of challenges inherent in dissemination and implementation. This article describes the use of concept mapping-a mixed method strategy-to study implementation of behavioral health innovations and evidence-based practice (EBP). The application of concept mapping to implementation research represents a practical and concise way to identify and quantify factors affecting implementation, develop conceptual models of implementation, target areas to address as part of implementation readiness and active implementation, and foster communication among stakeholders. Concept mapping is described and a case example is provided to illustrate its use in an implementation study. Implications for the use of concept mapping methods in both research and applied settings towards the dissemination and implementation of behavioral health services are discussed.

  20. First level seismic microzonation map of Chennai city – a GIS approach

    National Research Council Canada - National Science Library

    Ganapathy, G. P

    2011-01-01

    ...) to Moderate Seismic Hazard (Zone III)-(BIS: 1893 (2001)). In this connection, a first level seismic microzonation map of Chennai city has been produced with a GIS platform using the themes, viz, Peak Ground Acceleration (PGA...

  1. A Systematic Approach to Modified BCJR MAP Algorithms for Convolutional Codes

    National Research Council Canada - National Science Library

    Wang, Sichun; Patenaude, François

    2006-01-01

    .... The existence of a relatively large number of similar but different modified BCJR MAP algorithms, derived using the Markov chain properties of convolutional codes, naturally leads to the following questions...

  2. Wave propagation properties in oscillatory chains with cubic nonlinearities via nonlinear map approach

    Energy Technology Data Exchange (ETDEWEB)

    Romeo, Francesco [Dipartimento di Ingegneria Strutturale e Geotecnica, Universita di Roma 'La Sapienza', Via Gramsci 53, 00197 Rome (Italy)] e-mail: francesco.romeo@uniromal.it; Rega, Giuseppe [Dipartimento di Ingegneria Strutturale e Geotecnica, Universita di Roma 'La Sapienza', Via Gramsci 53, 00197 Rome (Italy)] e-mail: giuseppe.rega@uniromal.it

    2006-02-01

    Free wave propagation properties in one-dimensional chains of nonlinear oscillators are investigated by means of nonlinear maps. In this realm, the governing difference equations are regarded as symplectic nonlinear transformations relating the amplitudes in adjacent chain sites (n, n + 1) thereby considering a dynamical system where the location index n plays the role of the discrete time. Thus, wave propagation becomes synonymous of stability: finding regions of propagating wave solutions is equivalent to finding regions of linearly stable map solutions. Mechanical models of chains of linearly coupled nonlinear oscillators are investigated. Pass- and stop-band regions of the mono-coupled periodic system are analytically determined for period-q orbits as they are governed by the eigenvalues of the linearized 2D map arising from linear stability analysis of periodic orbits. Then, equivalent chains of nonlinear oscillators in complex domain are tackled. Also in this case, where a 4D real map governs the wave transmission, the nonlinear pass- and stop-bands for periodic orbits are analytically determined by extending the 2D map analysis. The analytical findings concerning the propagation properties are then compared with numerical results obtained through nonlinear map iteration.
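
    The pass/stop-band criterion for the mono-coupled case follows from a standard property of 2D symplectic maps (determinant one), summarized below in notation of my own:

```latex
% Stability criterion behind the pass/stop-band analysis: the linearized
% 2-D map M is symplectic (\det M = 1), so its eigenvalues solve
\lambda^2 - (\operatorname{tr} M)\,\lambda + 1 = 0,
\qquad
\lambda_{1,2} = \frac{\operatorname{tr} M \pm \sqrt{(\operatorname{tr} M)^2 - 4}}{2}.
% |tr M| < 2 gives complex eigenvalues with |\lambda| = 1: bounded,
% propagating solutions (pass band); |tr M| > 2 gives a real eigenvalue
% with |\lambda| > 1: spatial attenuation (stop band).
```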

  3. The syntax-phonology mapping of intonational phrases in complex sentences: A flexible approach

    Directory of Open Access Journals (Sweden)

    Fatima Hamlaoui

    2017-06-01

    In this paper, we extend to complex sentences the proposal that the notion of 'clause' in ALIGN/MATCH constraints related to the syntax-prosody mapping of the intonational phrase should be determined in each language (and each construction) by making reference to the highest syntactic phrase whose head is overtly filled by the verb (or verbal material) (Hamlaoui & Szendrői 2015). We propose that while root clauses have a privileged status from the syntax-to-prosody mapping perspective, all clauses are equal in the prosody-to-syntax mapping. In the spirit of the Minimalist Program (Chomsky 2005), we bring in extragrammatical motivation for the proposed mapping principles from parsing and learnability. This allows us to account for the fact that, whereas in many languages like Bàsàá (Bantu) and Hungarian (Finno-Ugric), only root clauses normally map onto intonational phrases, additional intonational phrase edges can be found under the pressure of high-ranked prosodic, processing or information-structural requirements. This is the case with Hungarian embedded foci and Bàsàá embedded topics where, we argue, embedded 'ι' edges are meant to satisfy STRESSFOCUS and ALIGNTOPIC, respectively. In languages where embedded clauses seem to map onto their own intonational phrases more generally, such as Japanese or Luganda, further independent constraints should be evoked. This article is part of the special collection: Prosody and Constituent Structure

  4. Conditional random slope: A new approach for estimating individual child growth velocity in epidemiological research.

    Science.gov (United States)

    Leung, Michael; Bassani, Diego G; Racine-Poon, Amy; Goldenberg, Anna; Ali, Syed Asad; Kang, Gagandeep; Premkumar, Prasanna S; Roth, Daniel E

    2017-09-10

    Conditioning child growth measures on baseline accounts for regression to the mean (RTM). Here, we present the "conditional random slope" (CRS) model, based on a linear-mixed effects model that incorporates a baseline-time interaction term that can accommodate multiple data points for a child while also directly accounting for RTM. In two birth cohorts, we applied five approaches to estimate child growth velocities from 0 to 12 months to assess the effect of increasing data density (number of measures per child) on the magnitude of RTM of unconditional estimates, and the correlation and concordance between the CRS and four alternative metrics. Further, we demonstrated the differential effect of the choice of velocity metric on the magnitude of the association between infant growth and stunting at 2 years. RTM was minimally attenuated by increasing data density for unconditional growth modeling approaches. CRS and classical conditional models gave nearly identical estimates with two measures per child. Compared to the CRS estimates, unconditional metrics had moderate correlation (r = 0.65-0.91), but poor agreement in the classification of infants with relatively slow growth (kappa = 0.38-0.78). Estimates of the velocity-stunting association were the same for CRS and classical conditional models but differed substantially between conditional versus unconditional metrics. The CRS can leverage the flexibility of linear mixed models while addressing RTM in longitudinal analyses. © 2017 The Authors American Journal of Human Biology Published by Wiley Periodicals, Inc.
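
    A minimal sketch of fitting such a model, assuming statsmodels and hypothetical column names (the paper's CRS specification may include additional terms):

```python
# Sketch of the conditional-random-slope idea: a linear mixed model with a
# baseline-by-time interaction and a random slope per child. Column names
# (`length`, `age`, `baseline`, `child_id`) and all data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_children, n_visits = 100, 4
child = np.repeat(np.arange(n_children), n_visits)
age = np.tile(np.linspace(0, 12, n_visits), n_children)        # months
baseline = np.repeat(rng.normal(50, 2, n_children), n_visits)  # birth length [cm]
slope = np.repeat(rng.normal(1.5, 0.2, n_children), n_visits)  # growth velocity [cm/month]
length = baseline + slope * age + rng.normal(0, 0.5, len(age))

df = pd.DataFrame({"child_id": child, "age": age,
                   "baseline": baseline, "length": length})
model = smf.mixedlm("length ~ age * baseline", df,
                    groups=df["child_id"], re_formula="~age")
result = model.fit()
print(result.params)  # per-child velocity = fixed + random slope for age
```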

  5. Similarity and accuracy of mental models formed during nursing handovers: A concept mapping approach.

    Science.gov (United States)

    Drach-Zahavy, Anat; Broyer, Chaya; Dagan, Efrat

    2017-09-01

    Shared mental models are crucial for constructing mutual understanding of the patient's condition during a clinical handover. Yet scant research, if any, has empirically explored the mental models of the parties involved in a clinical handover. This study aimed to examine the similarities among the mental models of incoming and outgoing nurses, and to test their accuracy by comparing them with the mental models of expert nurses. A cross-sectional study explored nurses' mental models via the concept mapping technique across 40 clinical handovers. Data were collected via concept mapping of the incoming, outgoing, and expert nurses' mental models (a total of 120 concept maps). Similarity and accuracy indexes for concepts and associations were calculated to compare the different maps. About one fifth of the concepts emerged in both outgoing and incoming nurses' concept maps (concept similarity = 23% ± 10.6). Concept accuracy indexes were 35% ± 18.8 for incoming and 62% ± 19.6 for outgoing nurses' maps. Although incoming nurses absorbed a smaller number of concepts and associations (23% and 12%, respectively), they partially closed the gap (35% and 22%, respectively) relative to the expert nurses' maps. The correlations between concept similarities and incoming as well as outgoing nurses' concept accuracy were significant (r = 0.43, p …). Compared with the expert nurses' maps, outgoing nurses added information concerning the processes enacted during the shift, beyond the expert nurses' gold standard. Two seemingly contradictory processes in the handover were identified: "information loss", captured by the low similarity indexes among the mental models of incoming and outgoing nurses; and "information restoration", based on the accuracy indexes of the mental models of the incoming nurses. Based on mental model theory, we propose possible explanations for these processes and derive implications for how to improve a clinical handover. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Visualizing the topical structure of the medical sciences: a self-organizing map approach.

    Science.gov (United States)

    Skupin, André; Biberstine, Joseph R; Börner, Katy

    2013-01-01

    We implement a high-resolution visualization of the medical knowledge domain using the self-organizing map (SOM) method, based on a corpus of over two million publications. While self-organizing maps have been used for document visualization for some time, (1) little is known about how to deal with truly large document collections in conjunction with a large number of SOM neurons, (2) post-training geometric and semiotic transformations of the SOM tend to be limited, and (3) no user studies have been conducted with domain experts to validate the utility and readability of the resulting visualizations. Our study makes key contributions to all of these issues. Documents extracted from Medline and Scopus are analyzed on the basis of indexer-assigned MeSH terms. Initial dimensionality is reduced to include only the top 10% most frequent terms and the resulting document vectors are then used to train a large SOM consisting of over 75,000 neurons. The resulting two-dimensional model of the high-dimensional input space is then transformed into a large-format map by using geographic information system (GIS) techniques and cartographic design principles. This map is then annotated and evaluated by ten experts stemming from the biomedical and other domains. Study results demonstrate that it is possible to transform a very large document corpus into a map that is visually engaging and conceptually stimulating to subject experts from both inside and outside of the particular knowledge domain. The challenges of dealing with a truly large corpus come to the fore and require embracing parallelization and use of supercomputing resources to solve otherwise intractable computational tasks. Among the envisaged future efforts are the creation of a highly interactive interface and the elaboration of the notion of this map of medicine acting as a base map, onto which other knowledge artifacts could be overlaid.

  7. Visualizing the topical structure of the medical sciences: a self-organizing map approach.

    Directory of Open Access Journals (Sweden)

    André Skupin

    We implement a high-resolution visualization of the medical knowledge domain using the self-organizing map (SOM) method, based on a corpus of over two million publications. While self-organizing maps have been used for document visualization for some time, (1) little is known about how to deal with truly large document collections in conjunction with a large number of SOM neurons, (2) post-training geometric and semiotic transformations of the SOM tend to be limited, and (3) no user studies have been conducted with domain experts to validate the utility and readability of the resulting visualizations. Our study makes key contributions to all of these issues. Documents extracted from Medline and Scopus are analyzed on the basis of indexer-assigned MeSH terms. Initial dimensionality is reduced to include only the top 10% most frequent terms and the resulting document vectors are then used to train a large SOM consisting of over 75,000 neurons. The resulting two-dimensional model of the high-dimensional input space is then transformed into a large-format map by using geographic information system (GIS) techniques and cartographic design principles. This map is then annotated and evaluated by ten experts stemming from the biomedical and other domains. Study results demonstrate that it is possible to transform a very large document corpus into a map that is visually engaging and conceptually stimulating to subject experts from both inside and outside of the particular knowledge domain. The challenges of dealing with a truly large corpus come to the fore and require embracing parallelization and use of supercomputing resources to solve otherwise intractable computational tasks. Among the envisaged future efforts are the creation of a highly interactive interface and the elaboration of the notion of this map of medicine acting as a base map, onto which other knowledge artifacts could be overlaid.

  8. Topographic mapping on large-scale tidal flats with an iterative approach on the waterline method

    Science.gov (United States)

    Kang, Yanyan; Ding, Xianrong; Xu, Fan; Zhang, Changkuan; Ge, Xiaoping

    2017-05-01

    Tidal flats, which are both a natural ecosystem and a type of landscape, are of significant importance to ecosystem function and land resource potential. Morphologic monitoring of tidal flats has become increasingly important with respect to achieving sustainable development targets. Remote sensing is an established technique for the measurement of topography over tidal flats; of the available methods, the waterline method is particularly effective for constructing a digital elevation model (DEM) of intertidal areas. However, application of the waterline method is more limited in large-scale, shifting tidal flat areas, where the tides are not synchronized and the waterline is not a quasi-contour line. For this study, a topographic map of the intertidal regions within the Radial Sand Ridges (RSR) along the Jiangsu Coast, China, was generated using an iterative approach on the waterline method. A series of 21 multi-temporal satellite images (18 HJ-1A/B CCD and three Landsat TM/OLI) of the RSR area, collected at different water levels within a five-month period (31 December 2013-28 May 2014), was used to extract waterlines based on feature extraction techniques and further manual modification. These 'remotely-sensed waterlines' were combined with the corresponding water levels from the 'model waterlines' simulated by a hydrodynamic model with an initial generalized DEM of exposed tidal flats. Based on the 21 heighted 'remotely-sensed waterlines', a DEM was constructed using the ANUDEM interpolation method. Using this new DEM as the input data, it was re-entered into the hydrodynamic model, and a new round of water level assignment of waterlines was performed. A third and final output DEM was generated, covering an area of approximately 1900 km2 of tidal flats in the RSR. The water level simulation accuracy of the hydrodynamic model was within 0.15 m based on five real-time tide stations, and the height accuracy (root mean square error) of the final DEM was 0.182 m.
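
    The heighting-and-interpolation core of the waterline method can be sketched generically; scipy's griddata stands in for the ANUDEM interpolation used in the study, and all coordinates and levels are synthetic:

```python
# Waterline method, schematically: give each extracted waterline point the
# water level modeled for its acquisition time, then interpolate the heighted
# points onto a DEM grid. griddata stands in for ANUDEM; data is synthetic.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(6)
points, heights = [], []
for level in np.linspace(-2.0, 2.0, 21):      # 21 images at different tide levels
    xy = rng.random((200, 2)) * 1000.0        # waterline points for this image [m]
    points.append(xy)
    heights.append(np.full(200, level))       # assigned modeled water level [m]
points = np.vstack(points)
heights = np.concatenate(heights)

gx, gy = np.mgrid[0:1000:200j, 0:1000:200j]
dem = griddata(points, heights, (gx, gy), method="linear")
print(dem.shape)  # (200, 200) tidal-flat DEM
```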

  9. A Multi-Disciplinary Approach Combining Geological, Geomorphological and Geophysical Data for Mapping the Susceptibility to Sinkholes

    Science.gov (United States)

    Margiotta, Stefano; Negri, Sergio; Quarta, Tatiana A. M.; Parise, Mario

    2013-04-01

    The Salento region of southern Italy has a great number of active sinkholes, related to both natural and anthropogenic cavities. The presence of sinkholes is at the origin of several problems for the built-up environment, due to increasing population growth and development pressures. In such a context, the detection of cavities, and therefore the assessment of the sinkhole hazard, presents numerous difficulties. A multidisciplinary approach, comprising geological, geomorphological and geophysical analyses, is therefore necessary to obtain comprehensive knowledge of the complex phenomena in karstic areas. Geophysical methods can also be of great help in identifying and mapping the areas at higher risk of collapse. In this case it is important to identify the features related to the underground voids, likely to evolve into sinkholes, through contrasts in physical properties (such as density and electrical resistivity) with the surrounding sediments. At the same time, identification of the presence of sinkholes by geophysical methods has to adapt to the different geological conditions, so that the same techniques cannot be used everywhere. To this aim, the present paper illustrates the advantages of integrating geological and geomorphological surveys with surface geophysical techniques such as seismic, geoelectrical and ground penetrating radar methods for the identification of sinkhole-prone areas. The present work illustrates the results concerning a sinkhole system at Nociglia (inland Salento, southeastern Italy), where shallow phreatic speleogenesis operates close to the water table level with formation of karst conduits and proto-caves whose evolution occurs through successive roof collapse, formation of wide caverns and sinkhole development at the surface. All of this creates serious problems for the nearby infrastructure, including a province road that has often been threatened by the sinkhole development. Geological and geomorphological

  10. Comparing Pixel and Object-Based Approaches to Map an Understorey Invasive Shrub in Tropical Mixed Forests

    Directory of Open Access Journals (Sweden)

    Madhura Niphadkar

    2017-05-01

    The establishment of invasive alien species in varied habitats across the world is now recognized as a genuine threat to the preservation of biodiversity. Specifically, plant invasions in understory tropical forests are detrimental to the persistence of healthy ecosystems. Monitoring such invasions using Very High Resolution (VHR) satellite remote sensing has been shown to be valuable in designing management interventions for conservation of native habitats. Object-based classification methods are very helpful in identifying invasive plants in various habitats, by their inherent nature of imitating the ability of the human brain in pattern recognition. However, these methods have not been tested adequately in dense tropical mixed forests where invasion occurs in the understorey. This study compares a pixel-based and object-based classification method for mapping the understorey invasive shrub Lantana camara (Lantana) in a tropical mixed forest habitat in the Western Ghats biodiversity hotspot in India. Overall, a hierarchical approach of mapping top canopy at first, and then further processing for the understorey shrub, using measures such as texture and vegetation indices, proved effective in separating out Lantana from other cover types. In the first method, we implement a simple parametric supervised classification for mapping cover types, and then process within these types for Lantana delineation. In the second method, we use an object-based segmentation algorithm to map cover types, and then perform further processing for separating Lantana. The improved ability of the object-based approach to delineate structurally distinct objects with characteristic spectral and spatial characteristics of their own, as well as with reference to their surroundings, allows for much flexibility in identifying invasive understorey shrubs among the complex vegetation of the tropical forest than that provided by the parametric classifier. Conservation practices

  11. Comparing Pixel and Object-Based Approaches to Map an Understorey Invasive Shrub in Tropical Mixed Forests

    Science.gov (United States)

    Niphadkar, Madhura; Nagendra, Harini; Tarantino, Cristina; Adamo, Maria; Blonda, Palma

    2017-01-01

    The establishment of invasive alien species in varied habitats across the world is now recognized as a genuine threat to the preservation of biodiversity. Specifically, plant invasions in understory tropical forests are detrimental to the persistence of healthy ecosystems. Monitoring such invasions using Very High Resolution (VHR) satellite remote sensing has been shown to be valuable in designing management interventions for conservation of native habitats. Object-based classification methods are very helpful in identifying invasive plants in various habitats, by their inherent nature of imitating the ability of the human brain in pattern recognition. However, these methods have not been tested adequately in dense tropical mixed forests where invasion occurs in the understorey. This study compares a pixel-based and object-based classification method for mapping the understorey invasive shrub Lantana camara (Lantana) in a tropical mixed forest habitat in the Western Ghats biodiversity hotspot in India. Overall, a hierarchical approach of mapping top canopy at first, and then further processing for the understorey shrub, using measures such as texture and vegetation indices proved effective in separating out Lantana from other cover types. In the first method, we implement a simple parametric supervised classification for mapping cover types, and then process within these types for Lantana delineation. In the second method, we use an object-based segmentation algorithm to map cover types, and then perform further processing for separating Lantana. The improved ability of the object-based approach to delineate structurally distinct objects with characteristic spectral and spatial characteristics of their own, as well as with reference to their surroundings, allows for much flexibility in identifying invasive understorey shrubs among the complex vegetation of the tropical forest than that provided by the parametric classifier. Conservation practices in tropical mixed

  12. A novel approach for monitoring writing interferences during navigated transcranial magnetic stimulation mappings of writing related cortical areas.

    Science.gov (United States)

    Rogić Vidaković, Maja; Gabelica, Dragan; Vujović, Igor; Šoda, Joško; Batarelo, Nikolina; Džimbeg, Andrija; Zmajević Schönwald, Marina; Rotim, Krešimir; Đogaš, Zoran

    2015-11-30

    It has recently been shown that navigated repetitive transcranial magnetic stimulation (nTMS) is useful in preoperative neurosurgical mapping of motor and language brain areas. In TMS mapping of motor cortices, the evoked responses can be quantitatively monitored by electromyographic (EMG) recordings. No such setup exists for monitoring writing during nTMS mapping of writing-related cortical areas. We present a novel approach for monitoring writing during nTMS mapping of motor writing-related cortical areas. To the best of our knowledge, this is the first demonstration of quantitative monitoring of motor evoked responses from the hand by EMG, and of pen-related activity during writing with our custom-made pen, together with the application of a chronometric TMS design and a patterned rTMS protocol. The method was applied in four healthy subjects writing during nTMS mapping of the premotor cortical area corresponding to BA 6 and close to the superior frontal sulcus. The results showed that stimulation impaired writing in all subjects. The spectra of the measured signal related to writing movements were concentrated in the 0-20 Hz frequency band. Magnetic stimulation affected writing by suppressing the normal writing frequency band. The proposed setup for monitoring writing provides additional quantitative data for the monitoring and analysis of rTMS-induced writing response modifications. The setup can be useful for investigating the neurophysiologic mechanisms of writing, for assessing therapeutic effects of nTMS, and in preoperative mapping of language cortical areas in patients undergoing brain surgery. Copyright © 2015 Elsevier B.V. All rights reserved.
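
    One way to quantify the reported 0-20 Hz writing band is band power from a Welch power spectral density estimate; the sketch below uses a synthetic signal and an assumed sampling rate, not the study's recordings:

```python
# Band power in the 0-20 Hz writing band via a Welch PSD estimate.
# Synthetic pen/EMG-like signal; the 1 kHz sampling rate is an assumption.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 1000.0                                   # assumed sampling rate [Hz]
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(7)
signal = np.sin(2 * np.pi * 8 * t) + 0.3 * rng.normal(size=t.size)  # 8 Hz "writing" motion

freqs, psd = welch(signal, fs=fs, nperseg=2048)
band = (freqs >= 0) & (freqs <= 20)
band_power = trapezoid(psd[band], freqs[band])    # integrate PSD over the band
total_power = trapezoid(psd, freqs)
print(f"0-20 Hz fraction of power: {band_power / total_power:.2f}")
```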

  13. Impact of visual impairment on the lives of young adults in the Netherlands: a concept-mapping approach.

    Science.gov (United States)

    Elsman, Ellen Bernadette Maria; van Rens, Gerardus Hermanus Maria Bartholomeus; van Nispen, Ruth Marie Antoinette

    2017-12-01

    While the impact of visual impairment on specific aspects of young adults' lives is well recognised, a systematic understanding of its impact on all life aspects is lacking. This study aims to provide an overview of life aspects affected by visual impairment in young adults (aged 18-25 years) using a concept-mapping approach. Visually impaired young adults (n = 22) and rehabilitation professionals (n = 16) participated in online concept-mapping workshops (brainstorm procedure) to explore how having a visual impairment influences the lives of young adults. Statements were categorised based on similarity and importance. Using multidimensional scaling, concept maps were produced and interpreted. A total of 59 and 260 statements were generated by young adults and professionals, respectively, resulting in 99 individual statements after checking and deduplication. The combined concept map revealed 11 clusters: work, study, information and regulations, social skills, living independently, computer, social relationships, sport and activities, mobility, leisure time, and hobby. The concept maps provided useful insight into activities influenced by visual impairments in young adults, which can be used by rehabilitation centres to improve their services. This might help in goal setting, rehabilitation referral and successful transition to adult life, ultimately increasing participation and quality of life. Implications for rehabilitation: Having a visual impairment affects various life aspects related to participation, including activities related to work, study, social skills and relationships, activities of daily living, leisure time and mobility. Concept mapping helped to identify the life aspects affected by low vision, and to quantify these aspects in terms of importance according to young adults and low vision rehabilitation professionals. Low vision rehabilitation centres should focus on all life aspects found in this study when identifying the needs of young

  14. Microzonation Mapping Of The Yanbu Industrial City, Western Saudi Arabia: A Multicriteria Decision Analysis Approach

    Science.gov (United States)

    Moustafa, Sayed, Sr.; Alarifi, Nassir S.; Lashin, Aref A.

    2016-04-01

    Urban areas along the western coast of Saudi Arabia are susceptible to natural disasters and environmental damage due to a lack of planning. To produce a site-specific microzonation map of the rapidly growing Yanbu industrial city, the spatial distribution of different hazard entities is assessed using the Analytic Hierarchy Process (AHP) together with a Geographical Information System (GIS). For this purpose six hazard parameter layers are considered, namely: fundamental frequency, site amplification, soil strength in terms of effective shear-wave velocity, overburden sediment thickness, seismic vulnerability index, and peak ground acceleration. Weight and rank values are determined during the AHP and assigned to each layer and its corresponding classes, respectively. An integrated seismic microzonation map was derived using the GIS platform. Based on the derived map, the study area is classified into five hazard categories: very low, low, moderate, high, and very high. The western and central parts of the study area, as indicated by the derived microzonation map, are categorized as high hazard zones compared with the surrounding areas. The produced microzonation map is envisaged as a first-level assessment of site-specific hazards in the Yanbu city area, which can be used as a platform by different stakeholders in future land-use planning and environmental hazard management.
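
    The weighted-overlay step at the heart of such an AHP/GIS integration can be sketched in a few lines. The following Python fragment is illustrative only: the six layer names match the abstract, but the weights, grid, and class breaks are invented for the example, not taken from the study.

```python
# A minimal sketch of an AHP-style weighted overlay, assuming each hazard
# parameter has already been rasterized, rank-scored, and normalized to [0, 1].
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)  # raster grid of the study area (stand-in data)

layers = {
    "fundamental_frequency": rng.random(shape),
    "site_amplification":    rng.random(shape),
    "shear_wave_velocity":   rng.random(shape),
    "sediment_thickness":    rng.random(shape),
    "vulnerability_index":   rng.random(shape),
    "peak_ground_accel":     rng.random(shape),
}
# AHP pairwise comparisons yield one weight per layer; the weights sum to 1.
weights = {
    "fundamental_frequency": 0.10,
    "site_amplification":    0.20,
    "shear_wave_velocity":   0.20,
    "sediment_thickness":    0.10,
    "vulnerability_index":   0.15,
    "peak_ground_accel":     0.25,
}

# Weighted linear combination of the evidential layers.
hazard = sum(weights[name] * layer for name, layer in layers.items())

# Classify the continuous index into five microzonation categories.
bins = np.quantile(hazard, [0.2, 0.4, 0.6, 0.8])
categories = np.digitize(hazard, bins)  # 0 = very low ... 4 = very high
```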

  15. Brain mapping in a patient with congenital blindness – a case for multimodal approaches

    Directory of Open Access Journals (Sweden)

    Jarod L Roland

    2013-07-01

    Full Text Available Recent advances in basic neuroscience research across a wide range of methodologies have contributed significantly to our understanding of human cortical electrophysiology and functional brain imaging. Translation of this research into clinical neurosurgery has opened doors for advanced mapping of functionality that previously was prohibitively difficult, if not impossible. Here we present the case of a unique individual with congenital blindness and medically refractory epilepsy who underwent neurosurgical treatment of her seizures. Pre-operative evaluation presented the challenge of accurately and robustly mapping the cerebral cortex of an individual with a high probability of significant cortical re-organization. Additionally, a blind individual has unique priorities, such as the ability to read Braille by touch and to sense the environment primarily by sound, compared with a sighted person. For these reasons we took additional measures to map sensory, motor, speech, language, and auditory perception using a number of cortical electrophysiologic mapping and functional magnetic resonance imaging methods. Our data show promising results in the application of these adjunctive methods in the pre-operative mapping of otherwise difficult to localize, and highly variable, functional cortical areas.

  16. Remote sensing approach to map riparian vegetation of the Colorado River Ecosystem, Grand Canyon area, Arizona

    Science.gov (United States)

    Nguyen, U.; Glenn, E.; Nagler, P. L.; Sankey, J. B.

    2015-12-01

    Riparian zones in the southwestern U.S. are usually a mosaic of vegetation types at varying states of succession in response to past floods or droughts. Human impacts also affect riparian vegetation patterns. Human-induced changes include the introduction of exotic species, diversion of water for human use, channelization of the river to protect property, and other land use changes that can lead to deterioration of the riparian ecosystem. This study explored the use of remote sensing to map an iconic stretch of the Colorado River in Grand Canyon National Park, Arizona. The pre-dam riparian zone in the Grand Canyon was affected by annual floods from spring run-off in the watersheds of the Green, Colorado, and San Juan Rivers. A pixel-based vegetation map of the riparian zone in the Grand Canyon, Arizona, was produced from high-resolution aerial imagery. The map was calibrated and validated with ground survey data. A seven-step image processing and classification procedure was developed based on a suite of vegetation indices and classification subroutines available in the ENVI Image Processing and Analysis software. The result was a quantitative species-level vegetation map that could be more accurate than the qualitative, polygon-based maps presently used on the Lower Colorado River. The dominant woody species in the Grand Canyon are now saltcedar, arrowweed and mesquite, reflecting stress-tolerant forms adapted to the altered flow regimes associated with river regulation.

  17. A SPREADSHEET MAPPING APPROACH FOR ERROR CHECKING AND SHARING COLLECTION POINT DATA

    Directory of Open Access Journals (Sweden)

    Desmond Foley

    2010-11-01

    Full Text Available The ready availability of online maps of plant and animal collection locations has drawn attention to the need for georeference accuracy. Many obvious georeference errors, for example those that map land animals over sea, in the wrong hemisphere, or in the wrong country, may be avoided if collectors and data providers could easily map their data points prior to publication. Various tools are available for quality control of georeference data, but many involve an investment of time to learn the software involved. This paper presents a method for the rapid map display of longitude and latitude data using the chart function in Microsoft Office Excel®, arguably the most ubiquitous spreadsheet software. Advantages of this method include immediate visual feedback on data point accuracy and results that can be easily shared with others. Methods for making custom Excel chart maps are given, and we provide free charts for the world and a selection of countries at http://www.vectormap.org/resources.htm.
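
    The same quick-look check is easy to reproduce outside Excel. A minimal Python sketch (with made-up sample points) plots longitude against latitude so that swapped coordinates, wrong hemispheres, or points over the sea stand out immediately:

```python
# Quick visual screening of georeference data, analogous to the Excel chart
# method described above. The records below are invented examples.
import matplotlib.pyplot as plt

records = [
    ("site_A",  -9.13,  38.72),   # (name, longitude, latitude)
    ("site_B", 144.96, -37.81),
    ("site_C",  38.72,  -9.13),   # suspicious: looks like swapped lon/lat
]
lons = [r[1] for r in records]
lats = [r[2] for r in records]

fig, ax = plt.subplots()
ax.scatter(lons, lats)
for name, lon, lat in records:
    ax.annotate(name, (lon, lat))   # label each point for quick review
ax.set_xlim(-180, 180)
ax.set_ylim(-90, 90)
ax.set_xlabel("longitude")
ax.set_ylabel("latitude")
plt.show()
```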

  18. A nuclear reload optimization approach using a real coded genetic algorithm with random keys

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Alan M.M. de; Schirru, Roberto; Medeiros, Jose A.C.C., E-mail: alan@lmp.ufrj.b, E-mail: schirru@lmp.ufrj.b, E-mail: canedo@lmp.ufrj.b [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Engenharia Nuclear

    2009-07-01

    The fuel reload of a Pressurized Water Reactor is carried out whenever the burnup of the fuel assemblies in the reactor core reaches a value at which it is no longer possible to sustain a critical reactor producing energy at nominal power. The fuel reload optimization problem consists in determining the positioning of the fuel assemblies within the core so as to minimize the ratio of fuel assembly cost to maximum burnup, while satisfying symmetry and safety restrictions. The difficulty of the problem grows exponentially with the number of fuel assemblies in the core. For decades the problem was solved manually by experts who used their knowledge and experience to build core configurations and tested them to verify that the safety restrictions of the plant were satisfied. To reduce this burden, several optimization techniques have been applied, including the binary-coded genetic algorithm. In this work we present a real-valued coding of the genetic algorithm, with different recombination methods, together with a transformation mechanism called random keys that converts the real-valued genes of each chromosome into a discrete arrangement of fuel assemblies for evaluation of the reload. Four recombination methods were tested: discrete recombination, intermediate recombination, linear recombination and extended linear recombination. For each of the four recombination methods, 10 tests using different seeds for the random number generator were conducted, totaling 40 tests. The results of applying the real-coded genetic algorithm to the fuel reload problem of the Angra 1 PWR plant are shown. Since the best results in the literature for this problem were obtained by the parallel PSO, we use it for comparison
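
    The random-keys trick is simple to demonstrate: real-valued genes are decoded into a discrete loading order by sorting, so any real-valued recombination still produces a valid permutation without repair operators. The sketch below is a schematic illustration, not the authors' code:

```python
# A minimal sketch of the random-keys decoding idea; names are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n_assemblies = 8

def decode(keys: np.ndarray) -> np.ndarray:
    """Sort the real-valued keys; the argsort is the assembly permutation."""
    return np.argsort(keys)

parent_a = rng.random(n_assemblies)
parent_b = rng.random(n_assemblies)

# Intermediate recombination on the real keys (one of the four variants tested).
alpha = rng.random(n_assemblies)
child = alpha * parent_a + (1 - alpha) * parent_b

print("child keys   :", np.round(child, 3))
print("loading order:", decode(child))  # valid permutation, no repair needed
```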

  19. Mapping Speech Spectra from Throat Microphone to Close-Speaking Microphone: A Neural Network Approach

    Directory of Open Access Journals (Sweden)

    Yegnanarayana B

    2007-01-01

    Full Text Available Speech recorded from a throat microphone is robust to the surrounding noise, but sounds unnatural unlike the speech recorded from a close-speaking microphone. This paper addresses the issue of improving the perceptual quality of the throat microphone speech by mapping the speech spectra from the throat microphone to the close-speaking microphone. A neural network model is used to capture the speaker-dependent functional relationship between the feature vectors (cepstral coefficients of the two speech signals. A method is proposed to ensure the stability of the all-pole synthesis filter. Objective evaluations indicate the effectiveness of the proposed mapping scheme. The advantage of this method is that the model gives a smooth estimate of the spectra of the close-speaking microphone speech. No distortions are perceived in the reconstructed speech. This mapping technique is also used for bandwidth extension of telephone speech.

  20. Fuzzy outranking approach: A knowledge-driven method for mineral prospectivity mapping

    Science.gov (United States)

    Abedi, Maysam; Norouzi, Gholam-Hossain; Fathianpour, Nader

    2013-04-01

    This paper describes the application of a new multi-criteria decision making (MCDM) technique called fuzzy outranking to map prospectivity for porphyry Cu–Mo deposits. Various raster-based evidential layers involving geological, geophysical, and geochemical geo-data sets are integrated for mineral prospectivity mapping (MPM). In a case study, 13 layers of the Now Chun deposit located in the Kerman province of Iran are used to explore the region of interest. The outputs are validated using 21 boreholes drilled in this area. Comparison of the output prospectivity map with concentrations of Cu and Mo in the boreholes indicates that the fuzzy outranking MCDM is a useful tool for MPM. The proposed method shows a high performance for MPM, thereby reducing the cost of exploratory drilling in the study area.

  1. Acoustical source mapping based on deconvolution approaches for circular microphone arrays

    DEFF Research Database (Denmark)

    Tiana Roig, Elisabet; Jacobsen, Finn

    2011-01-01

    Recently, the aeroacoustic community has examined various methods based on deconvolution to improve the visualization of acoustic fields scanned with planar arrays of microphones. These methods are based on the assumption that the beamforming map in an observation plane parallel to the array can...... be approximated by a convolution of the actual sources and the beamformer's point spread function, i.e., the beamformer's response to a point source. By deconvolving the resulting map, the resolution is improved and the side-lobe effects are reduced or even eliminated compared to conventional beamforming. Even...... though these methods were originally designed for planar sparse arrays, they can be adapted to uniform circular arrays for mapping the sound over 360º. Such a geometry has the advantage that the beamforming response always has the same shape around the focusing direction, or in other words...
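
    The convolution assumption behind these methods can be illustrated with a toy one-dimensional example: a beamforming map is synthesized as point sources convolved with a point spread function, then sharpened with a Richardson-Lucy-style multiplicative update. This is only a sketch of the general idea, not the DAMAS or CLEAN-SC algorithms of the aeroacoustic literature; all values are invented.

```python
# Toy 1-D deconvolution of a beamforming map modeled as sources * PSF.
import numpy as np

n = 64
grid = np.arange(n)
psf = np.exp(-0.5 * ((grid - n // 2) / 3.0) ** 2)  # Gaussian stand-in PSF
psf /= psf.sum()

truth = np.zeros(n)
truth[20], truth[40] = 1.0, 0.6                    # two point sources

def convolve(x):
    """Circular convolution with the (centered) PSF via FFT."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(np.fft.ifftshift(psf))))

beam_map = convolve(truth)                         # what the beamformer "sees"

estimate = np.full(n, beam_map.mean())             # flat non-negative start
for _ in range(200):                               # multiplicative updates
    ratio = beam_map / np.maximum(convolve(estimate), 1e-12)
    estimate *= convolve(ratio)                    # symmetric PSF: corr == conv

print("recovered peaks near indices:", sorted(np.argsort(estimate)[-2:]))
```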

  2. Automatic mapping of event landslides at basin scale in Taiwan using a Monte Carlo approach and synthetic land cover fingerprints

    Science.gov (United States)

    Mondini, Alessandro C.; Chang, Kang-Tsung; Chiang, Shou-Hao; Schlögel, Romy; Notarnicola, Claudia; Saito, Hitoshi

    2017-12-01

    We propose a framework to systematically generate event landslide inventory maps from satellite images in southern Taiwan, where landslides are frequent and abundant. The spectral information is used to assess each pixel's land cover class membership probability through a Maximum Likelihood classifier trained with randomly generated synthetic land cover spectral fingerprints, which are obtained from an independent set of training images. Pixels are classified as landslides when the calculated landslide class membership probability, weighted by a susceptibility model, is higher than the membership probabilities of the other classes. We generated synthetic fingerprints from two FORMOSAT-2 images acquired in 2009 and tested the procedure on two other images, one from 2005 and the other from 2009. We also obtained two landslide maps through manual interpretation. The agreement between the two sets of inventories is given by Cohen's kappa coefficients of 0.62 and 0.64, respectively. This procedure can now classify a new FORMOSAT-2 image automatically, facilitating the production of landslide inventory maps.
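
    The decision rule described above, Gaussian maximum-likelihood class memberships with the landslide class weighted by a susceptibility prior, can be sketched as follows. All fingerprints and numbers are invented placeholders, not the study's trained model:

```python
# Schematic pixel classification with synthetic Gaussian spectral fingerprints.
import numpy as np
from scipy.stats import multivariate_normal

# (mean reflectance, covariance) per land-cover class, for 4 spectral bands
classes = {
    "landslide": (np.array([0.30, 0.28, 0.25, 0.40]), np.eye(4) * 0.002),
    "forest":    (np.array([0.05, 0.08, 0.05, 0.45]), np.eye(4) * 0.002),
    "river":     (np.array([0.10, 0.12, 0.15, 0.05]), np.eye(4) * 0.002),
}

pixel = np.array([0.28, 0.27, 0.24, 0.38])  # spectral vector of one pixel
susceptibility = 0.7                        # landslide prior at this location

scores = {}
for name, (mu, cov) in classes.items():
    likelihood = multivariate_normal(mean=mu, cov=cov).pdf(pixel)
    # only the landslide membership is weighted by the susceptibility model
    scores[name] = likelihood * susceptibility if name == "landslide" else likelihood

print(max(scores, key=scores.get))  # 'landslide' only if the weighted score wins
```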

  3. Identifying patients with chronic conditions using pharmacy data in Switzerland: an updated mapping approach to the classification of medications.

    Science.gov (United States)

    Huber, Carola A; Szucs, Thomas D; Rapold, Roland; Reich, Oliver

    2013-10-30

    Quantifying population health is important for public health policy. Since national disease registers recording clinical diagnoses are often not available, pharmacy data have frequently been used to identify chronic conditions (CCs) in populations. However, most approaches mapping prescribed drugs to CCs are outdated or ambiguous. The aim of this study was to provide an improved and updated mapping approach to the classification of medications. Furthermore, we aimed to give an overview of the proportions of patients with CCs in Switzerland using this new mapping approach. The database included medical and pharmacy claims data (2011) from patients aged 18 years or older. Based on prescription drug data and using the Anatomical Therapeutic Chemical (ATC) classification system, patients with CCs were identified by a medical expert review. Proportions of patients with CCs were calculated by sex and age group. We constructed multiple logistic regression models to assess the association between patient characteristics and having a CC, as well as between risk factors (diabetes, hyperlipidemia) for cardiovascular disease (CVD) and CVD as one of the most prevalent CCs. A total of 22 CCs were identified. In 2011, 62% of the 932,612 subjects enrolled had been prescribed a drug for the treatment of at least one CC. Rheumatologic conditions, CVD and pain were the most frequent CCs. 29% of the persons had CVD, 10% both CVD and hyperlipidemia, 4% CVD and diabetes, and 2% suffered from all three conditions. The regression model showed that diabetes and hyperlipidemia were strongly associated with CVD. Using pharmacy claims data, we developed an updated and improved approach for a feasible and efficient measure of patients' chronic disease status. Pharmacy drug data may be a valuable source for measuring a population's burden of disease when clinical data are missing. This approach may contribute to health policy debates about health services sources and risk adjustment
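
    The core of such a mapping approach is a lookup from ATC code prefixes to chronic conditions. The sketch below illustrates the mechanics with a tiny invented prefix table; it is not the study's actual 22-condition mapping:

```python
# Flag chronic conditions from a patient's prescribed ATC codes by prefix.
# The prefix table is a small illustrative excerpt, not the published mapping.
ATC_TO_CC = {
    "A10": "diabetes",               # drugs used in diabetes
    "C10": "hyperlipidemia",         # lipid-modifying agents
    "C0":  "cardiovascular disease", # cardiovascular chapters C01-C09
}

def chronic_conditions(atc_codes):
    """Return the set of chronic conditions suggested by a patient's ATC codes."""
    found = set()
    for code in atc_codes:
        for prefix, condition in ATC_TO_CC.items():
            if code.startswith(prefix):
                found.add(condition)
    return found

print(chronic_conditions(["A10BA02", "C10AA05"]))  # {'diabetes', 'hyperlipidemia'}
```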

  4. HyDEn: a hybrid steganocryptographic approach for data encryption using randomized error-correcting DNA codes.

    Science.gov (United States)

    Tulpan, Dan; Regoui, Chaouki; Durand, Guillaume; Belliveau, Luc; Léger, Serge

    2013-01-01

    This paper presents a novel hybrid DNA encryption (HyDEn) approach that uses randomized assignments of unique error-correcting DNA Hamming code words for single characters in the extended ASCII set. HyDEn relies on custom-built quaternary codes and a private key used in the randomized assignment of code words and the cyclic permutations applied on the encoded message. Along with its ability to detect and correct errors, HyDEn equals or outperforms existing cryptographic methods and represents a promising in silico DNA steganographic approach.

  5. HyDEn: A Hybrid Steganocryptographic Approach for Data Encryption Using Randomized Error-Correcting DNA Codes

    Directory of Open Access Journals (Sweden)

    Dan Tulpan

    2013-01-01

    Full Text Available This paper presents a novel hybrid DNA encryption (HyDEn approach that uses randomized assignments of unique error-correcting DNA Hamming code words for single characters in the extended ASCII set. HyDEn relies on custom-built quaternary codes and a private key used in the randomized assignment of code words and the cyclic permutations applied on the encoded message. Along with its ability to detect and correct errors, HyDEn equals or outperforms existing cryptographic methods and represents a promising in silico DNA steganographic approach.
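
    The key-driven randomized assignment of code words can be illustrated schematically. In the sketch below, plain 4-letter DNA words stand in for HyDEn's error-correcting quaternary Hamming code words, and the cyclic-permutation step is omitted; it shows only the randomized codebook idea:

```python
# Schematic of a private-key-driven randomized DNA codebook (illustrative only).
import itertools
import random

ALPHABET = "ACGT"
words = ["".join(w) for w in itertools.product(ALPHABET, repeat=4)]  # 256 words

def make_codebook(private_key: int):
    """Shuffle the code words with a key-seeded RNG and assign one per character."""
    rng = random.Random(private_key)
    shuffled = words[:]
    rng.shuffle(shuffled)
    return {chr(i): shuffled[i] for i in range(256)}  # extended ASCII set

key = 20131                     # hypothetical private key
book = make_codebook(key)
inverse = {v: k for k, v in book.items()}

message = "DNA"
encoded = "".join(book[c] for c in message)
decoded = "".join(inverse[encoded[i:i + 4]] for i in range(0, len(encoded), 4))
assert decoded == message
print(encoded)
```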

  6. Understanding Users' Meaning Constructions of IS Adoption and Use - A Cognitive Mapping Approach

    DEFF Research Database (Denmark)

    Jensen, Tina Blegind; Kjærgaard, Annemette

    adoptive use behavior. We study the post-adoption process of a larger Electronic Patient Record (EPR) project, focusing specifically on how two different professional groups, doctors and nurses, make sense of the technology as it becomes integrated into their work processes. We use cognitive mapping...... to represent the users' perceptions of the information system in order to study their sensemaking. In continuation of present findings, we aim to develop the use of cognitive mapping further in order to move from using it as a static snapshot understanding of the users' perceptions to a dynamic tool...

  7. Mapping the electrical properties of semiconductor junctions - the electron holographic approach

    DEFF Research Database (Denmark)

    Twitchett-Harrison, A.C.; Dunin-Borkowski, Rafal E.; Midgley, P.A.

    2008-01-01

    The need to determine the electrical properties of semiconductor junctions with high spatial resolution is as pressing now as ever. One technique that offers the possibility of quantitative high-resolution mapping of two- and three-dimensional electrostatic potential distributions is off-axis electron holography...

  8. Mapping of rock types using a joint approach by combining the multivariate statistics, self-organizing map and Bayesian neural networks: an example from IODP 323 site

    Science.gov (United States)

    Karmakar, Mampi; Maiti, Saumen; Singh, Amrita; Ojha, Maheswar; Maity, Bhabani Sankar

    2017-07-01

    Modeling and classification of the subsurface lithology are very important for understanding the evolution of the earth system. However, precise classification and mapping of lithology in a single framework are difficult due to the complexity and nonlinearity of the problem, driven by limited core sample information. Here, we implement a joint approach combining unsupervised and supervised methods in a single framework for better classification and mapping of rock types. As unsupervised methods, we use principal component analysis (PCA), K-means cluster analysis, dendrogram analysis, Fuzzy C-means (FCM) cluster analysis and the self-organizing map (SOM). As supervised methods, we use Bayesian neural networks (BNN) optimized by the Hybrid Monte Carlo (HMC) (BNN-HMC) and the scaled conjugate gradient (SCG) (BNN-SCG) techniques. We use P-wave velocity, density, neutron porosity, resistivity and gamma ray logs of the well U1343E of the Integrated Ocean Drilling Program (IODP) Expedition 323 in the Bering Sea slope region. While the SOM algorithm allows us to visualize the clustering results in the spatial domain, the combined classification schemes (supervised and unsupervised) uncover different lithologic patterns such as clayey-silt, diatom-silt and silty-clay from an un-cored section of the drilled hole. In addition, the BNN approach is capable of estimating uncertainty in the predictive modeling of the three rock types over the entire lithology section at site U1343. Alternating succession of clayey-silt, diatom-silt and silty-clay may be representative of crustal inhomogeneity in general and thus could be a basis for a detailed study related to the productivity of methane gas in the oceans worldwide. Moreover, at 530 m depth below seafloor (DSF), the transition from Pliocene to Pleistocene could be linked to lithological alternation between the clayey-silt and the diatom-silt. The present results could provide the basis for

  9. Exploring a New Simulation Approach to Improve Clinical Reasoning Teaching and Assessment: Randomized Trial Protocol.

    Science.gov (United States)

    Pennaforte, Thomas; Moussa, Ahmed; Loye, Nathalie; Charlin, Bernard; Audétat, Marie-Claude

    2016-02-17

    Helping trainees develop appropriate clinical reasoning abilities is a challenging goal in an environment where clinical situations are marked by high levels of complexity and unpredictability. The benefit of simulation-based education for assessing clinical reasoning skills has rarely been reported. More specifically, it is unclear whether clinical reasoning is better acquired if the instructor's input occurs entirely after the scenario or is integrated during it. Based on educational principles of the dual-process theory of clinical reasoning, a new simulation approach called simulation with iterative discussions (SID) is introduced. The instructor interrupts the flow of the scenario at three key moments of the reasoning process (data gathering, integration, and confirmation). After each stop, the scenario is continued where it was interrupted. Finally, a brief general debriefing ends the session. The System-1 process of clinical reasoning is assessed by verbalization during management of the case, and System-2 during the iterative discussions, without providing feedback. The aim of this study is to evaluate the effectiveness of simulation with iterative discussions versus the classical approach to simulation in developing the reasoning skills of General Pediatrics and Neonatal-Perinatal Medicine residents. This will be a prospective exploratory, randomized study conducted at Sainte-Justine hospital in Montreal, QC, between January and March 2016. All postgraduate year (PGY) 1 to 6 residents will be invited to complete one 30-minute, audio-video-recorded, complex high-fidelity simulation (either SID or classical) covering a similar neonatology topic. Pre- and post-simulation questionnaires will be completed and a semistructured interview will be conducted after each simulation. Data analyses will use the SPSS and NVivo software packages. This study is in its preliminary stages and the results are expected to be made available by April 2016. This will be the first study to explore a new

  10. Novel head and neck cancer survival analysis approach: random survival forests versus Cox proportional hazards regression.

    Science.gov (United States)

    Datema, Frank R; Moya, Ana; Krause, Peter; Bäck, Thomas; Willmes, Lars; Langeveld, Ton; Baatenburg de Jong, Robert J; Blom, Henk M

    2012-01-01

    Electronic patient files generate an enormous amount of medical data. These data can be used for research, such as prognostic modeling. Automatization of statistical prognostication processes allows automatic updating of models when new data are gathered. The increased statistical power behind an automated prognostic model makes its predictive capability more reliable. Cox proportional hazards regression is most frequently used in prognostication. Automatization of a Cox model is possible, but we expect the updating process to be time-consuming. A possible solution lies in an alternative modeling technique called random survival forests (RSFs). RSF is easily automated and is known to handle the proportionality assumption coherently and automatically. The performance of RSF has not yet been tested on a large head and neck oncological dataset. This study investigates the performance of RSF models for head and neck overall survival. Performances are compared to a Cox model as the "gold standard." RSF might be an interesting alternative modeling approach for automatization when performances are similar. RSF models were created in R (Cox also in SPSS). Four RSF splitting rules were used: log-rank, conservation of events, log-rank score, and log-rank approximation. Models were based on historical data of 1371 patients with primary head and neck cancer, diagnosed between 1981 and 1998. Models contain 8 covariates: tumor site, T classification, N classification, M classification, age, sex, prior malignancies, and comorbidity. Model performances were determined by Harrell's concordance error rate, in which 33% of the original data served as a validation sample. RSF and Cox models delivered similar error rates. The Cox model performed slightly better (error rate, 0.2826). The log-rank splitting approach gave the best RSF performance (error rate, 0.2873). In accord with the Cox and RSF models, high T classification, high N classification, and severe comorbidity are very important covariates in the
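
    A comparison of this kind can be reproduced with, for example, the scikit-survival package. The sketch below uses simulated data with 8 covariates and a 33% hold-out in loose analogy to the study design; Harrell's concordance error rate is one minus the concordance index. Only the default log-rank splitting rule of the forest is exercised here:

```python
# A minimal sketch, assuming the scikit-survival package; data are stand-ins
# for the 1371-patient cohort, not the study's dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sksurv.ensemble import RandomSurvivalForest
from sksurv.linear_model import CoxPHSurvivalAnalysis
from sksurv.metrics import concordance_index_censored
from sksurv.util import Surv

rng = np.random.default_rng(0)
n, p = 400, 8                               # 8 covariates, as in the study
X = rng.normal(size=(n, p))
beta = np.array([1.0, -0.5, 0.3, 0.0, 0.0, 0.0, 0.8, 0.0])
time = rng.exponential(scale=np.exp(-X @ beta))
event = rng.random(n) < 0.7                 # roughly 30% right-censoring
time = np.where(event, time, time * rng.random(n))
y = Surv.from_arrays(event=event, time=time)

# 33% of the data serves as the validation sample, as in the paper.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=1)
event_field, time_field = y_te.dtype.names  # event indicator first, then time

for name, model in [("Cox", CoxPHSurvivalAnalysis()),
                    ("RSF", RandomSurvivalForest(n_estimators=200, random_state=1))]:
    model.fit(X_tr, y_tr)
    risk = model.predict(X_te)              # higher score = higher predicted risk
    c_index = concordance_index_censored(y_te[event_field], y_te[time_field], risk)[0]
    print(f"{name}: concordance error rate = {1 - c_index:.4f}")
```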

  11. Mapping of the Multiple Sclerosis Walking Scale (MSWS-12) to five-dimension EuroQol (EQ-5D) health outcomes: an independent validation in a randomized control cohort

    Directory of Open Access Journals (Sweden)

    Sidovar MF

    2016-02-01

    Full Text Available Matthew F Sidovar,1 Brendan L Limone,2 Craig I Coleman2 1Clinical Development and Medical Affairs, Acorda Therapeutics, Ardsley, NY, 2Department of Pharmacy Practice, University of Connecticut School of Pharmacy, Storrs, CT, USA Background: Mapping of patient-reported outcomes to the five-dimension EuroQol (EQ-5D) health index is increasingly being used for understanding the relationship of outcomes to health states and for predicting utilities that have application in economic evaluations. The 12-item Multiple Sclerosis Walking Scale (MSWS-12) is a patient-reported outcome that assesses the impact of walking impairment in people with MS. An equation for mapping the MSWS-12 to the EQ-5D was previously developed and validated using a North American Research Committee on MS (NARCOMS) registry cohort. Materials and methods: This analysis retested the validity of the equation mapping the MSWS-12 to the three-level EQ-5D (EQ-5D-3L) by using an independent cohort of patients with MS enrolled in a randomized controlled trial. Mapping was evaluated at two separate time points (baseline and week 4) during the clinical trial. The mapping equation's performance was subsequently assessed with the mean absolute error (MAE) and root-mean-square error (RMSE) by comparing equation-based estimates to values elicited in the trial using the actual EQ-5D-3L questionnaire. Results: The mapping equation predicted EQ-5D-3L values in this external cohort with reasonable precision at both time points (MAE 0.116 and RMSE 0.155 at baseline; MAE 0.105 and RMSE 0.138 at week 4), similar to that reported in the original NARCOMS cohort (MAE 0.109 and RMSE 0.145). Also as observed in the original NARCOMS cohort, the mapping equation performed best in patients with EQ-5D-3L values between 0.50 and 0.75, and poorly in patients with values <0.50. Conclusion: The mapping equation performed similarly in this external cohort as in the original derivation cohort, including a poorer
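
    The two performance measures are straightforward to compute. A minimal sketch with invented utility values in place of the trial data:

```python
# Score mapped EQ-5D-3L utilities against questionnaire-derived values.
import numpy as np

predicted = np.array([0.71, 0.63, 0.55, 0.80, 0.47])  # from the MSWS-12 mapping
observed  = np.array([0.69, 0.74, 0.52, 0.85, 0.60])  # actual EQ-5D-3L values

mae = np.mean(np.abs(predicted - observed))           # mean absolute error
rmse = np.sqrt(np.mean((predicted - observed) ** 2))  # root-mean-square error
print(f"MAE = {mae:.3f}, RMSE = {rmse:.3f}")
```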

  12. Approaching multidimensional forms of knowledge through Personal Meaning Mapping in science integrating teaching outside the classroom

    DEFF Research Database (Denmark)

    Hartmeyer, Rikke; Bolling, Mads; Bentsen, Peter

    2017-01-01

    and technology lessons in the classroom, and again after experiencing teaching outside the classroom. Maps and interviews were analysed drawing on Bloom’s revised taxonomy of educational objectives. Our findings show that PMM is highly useful for identifying and activating factual knowledge, conceptual knowledge...

  13. A Multi-Objective Approach to Visualize Proportions and Similarities Between Individuals by Rectangular Maps

    DEFF Research Database (Denmark)

    Carrizosa, Emilio; Guerrero, Vanesa; Morales, Dolores Romero

    In this paper we address the problem of visualizing the proportions and the similarities attached to a set of individuals. We represent this information using a rectangular map, i.e., a subdivision of a rectangle into rectangular portions so that each portion is associated with one individual, th...

  14. Pathways to bridge the biophysical realism gap in ecosystem services mapping approaches

    NARCIS (Netherlands)

    Lavorel, Sandra; Bayer, Anita; Bondeau, Alberte; Lautenbach, Sven; Ruiz-Frau, Ana; Schulp, Nynke; Seppelt, Ralf; Verburg, P.H.; Teeffelen, Astrid van; Vannier, Clémence; Arneth, Almut; Cramer, Wolfgang; Marba, Nuria

    2017-01-01

    The mapping of ecosystem service supply has become quite common in ecosystem service assessment practice for terrestrial ecosystems, but land cover remains the most common indicator of ecosystems' ability to deliver ecosystem services. For marine ecosystems, practice is even less advanced, with a

  15. An Educational Data Mining Approach to Concept Map Construction for Web based Learning

    Directory of Open Access Journals (Sweden)

    Anal ACHARYA

    2017-01-01

    Full Text Available The aim of this article is to study the use of Educational Data Mining (EDM) techniques in constructing concept maps for organizing knowledge in web-based learning systems, and to study their synergistic effects in enhancing learning. The article first provides a tutorial-style introduction to EDM. The applicability of web-based learning systems in enhancing the efficiency of EDM techniques in a real-time environment is investigated. Web-based learning systems often use a tool for organizing knowledge; this article explores the use of one such tool, the concept map, for this purpose. The pioneering works of various researchers who proposed web-based learning systems in personalized and collaborative environments are then presented. A set of parameters is proposed based on which personalized and collaborative learning applications may be generalized and their performances compared. It is found that personalized learning environments use EDM techniques more exhaustively than collaborative learning environments for concept map construction in a web-based setting. This article can serve as a starting point for newcomers who would like to use EDM techniques for concept map construction for web-based learning purposes.

  16. A collaborative approach to mapping value of fisheries resources in the North Sea (Part 1: Methodology)

    NARCIS (Netherlands)

    Hintzen, N.T.; Coers, A.; Hamon, K.

    2013-01-01

    IMARES and LEI are both contracted on occasion to perform VMS analyses to produce maps of fishing activity or the economic value of fisheries in particular areas of the North Sea. Until now, IMARES and LEI have each used their own methodology, shaped mostly by their own unique data availability

  17. Prototype of interactive Web Maps: an approach based on open sources

    Directory of Open Access Journals (Sweden)

    Jürgen Philips

    2004-07-01

    Full Text Available To explore the potential of the World Wide Web (WWW), a prototype of an interactive Web map was built using standardized codes and open sources, such as eXtensible Markup Language (XML), Scalable Vector Graphics (SVG), the Document Object Model (DOM), the script languages ECMAScript/JavaScript and "PHP: Hypertext Preprocessor", and PostgreSQL with its extension PostGIS, to disseminate information related to the urban real estate register. Data from the City Hall of São José, Santa Catarina, referring to the Campinas district, were used. Using the client/server model, a prototype Web map with standardized codes and open sources was implemented, allowing a user to visualize Web maps in his/her browser with only Adobe's Viewer 3.0 plug-in. Aiming at a good cartographic project for the Web, rules of graphic representation were followed and different interactive functionalities were implemented, such as interactive legends, symbolization and dynamic scale. From the results, the use of standardized codes and open sources in interactive Web mapping projects can be recommended. With the use of open source code in public and private administration, the possibility of technological development is amplified and, consequently, expenses for software acquisition are reduced. Besides, it stimulates the development of computer applications targeting specific demands and requirements.

  18. New approaches to forest planning: inventorying and mapping place values in the Pacific Northwest Region

    Science.gov (United States)

    Troy E. Hall; Jennifer O. Farnum; Terry C. Slider; Kathy Ludlow

    2009-01-01

    This report chronicles a large-scale effort to map place values across the Pacific Northwest Region (Washington and Oregon) of the U.S. Forest Service. Through workshops held with Forest Service staff, 485 socioculturally meaningful places were identified. Staff also generated corresponding descriptions of the places’ unique social and biophysical elements—in other...

  19. Effects of Concept Mapping Instruction Approach on Students' Achievement in Basic Science

    Science.gov (United States)

    Ogonnaya, Ukpai Patricia; Okafor, Gabriel; Abonyi, Okechukwu S.; Ugama, J. O.

    2016-01-01

    The study investigated the effects of concept mapping on students' achievement in basic science. The study was carried out in Ebonyi State of Nigeria. The study employed a quasi-experimental design. Specifically the pretest posttest non-equivalent control group research design was used. The sample was 122 students selected from two secondary…

  20. Mapping and Quantifying Biodiversity and Ecosystem Services Related to Terrestrial Vertebrates: A National Approach

    Science.gov (United States)

    Biodiversity is crucial for the functioning of ecosystems and the products and services from which we transform natural assets of the Earth for human survival, security, and well-being. The ability to assess, report, map, and forecast the life support functions of ecosystems is a...

  1. Systematic approach for deriving feasible mappings of parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir; Imre, Kayhan M.

    2017-01-01

    The need for high-performance computing together with the increasing trend from single processor to parallel computer architectures has leveraged the adoption of parallel computing. To benefit from parallel computing power, usually parallel algorithms are defined that can be mapped and executed

  2. Towards Exploring Vast MPSoC Mapping Design Spaces using a Bias-Elitist Evolutionary Approach

    NARCIS (Netherlands)

    Quan, W.; Pimentel, A.D.

    2014-01-01

    The problem of optimally mapping a set of tasks onto a set of given heterogeneous processors for maximal throughput has been known, in general, to be NP-complete. Previous research has shown that Genetic Algorithms (GA) typically are a good choice to solve this problem when the solution space is

  3. Mapping a Mutation in "Caenorhabditis elegans" Using a Polymerase Chain Reaction-Based Approach

    Science.gov (United States)

    Myers, Edith M.

    2014-01-01

    Many single nucleotide polymorphisms (SNPs) have been identified within the "Caenorhabditis elegans" genome. SNPs present in the genomes of two isogenic "C. elegans" strains have been routinely used as a tool in forward genetics to map a mutation to a particular chromosome. This article describes a laboratory exercise in which…

  4. Towards local implementation of Dutch health policy guidelines: a concept-mapping approach.

    Science.gov (United States)

    Kuunders, Theo J M; van Bon-Martens, Marja J H; van de Goor, Ien A M; Paulussen, Theo G W M; van Oers, Hans A M

    2017-02-22

    To develop a targeted implementation strategy for a municipal health policy guideline, the implementation targets of two guideline users [Regional Health Services (RHSs)] and of the guideline developers at leading national health institutes were made explicit. To this end, characteristics of successful implementation of the guideline were identified, and differences and similarities in the perceptions of these characteristics between RHSs and developers were explored. Separate concept-mapping procedures were executed in two RHSs, one with representatives from partner local health organizations and municipalities, the second with RHS members only. A third mapping procedure was conducted with the developers of the guideline. All mapping procedures followed the same design, from the generation of statements to the interpretation of results with participants. Concept mapping, as a practical implementation tool, is discussed in the context of the international research literature on guideline implementation in public health. Guideline developers consider implementation successful when the substantive components (health issues) of the guideline's content are visible in local policy practice. RHSs, local organizations and municipalities view the implementation process itself, within and between organizations, as more relevant, and state that usability of the guideline for municipal policy and commitment by officials and municipal managers are critical targets for successful implementation. Between the RHSs, differences in implementation targets were smaller than between RHSs and guideline developers. For successful implementation, RHSs tend to focus on process targets while developers focus more on the thematic contents of the guideline. Implications of these different orientations for implementation strategies are dealt with in the discussion. © The Author 2017. Published by Oxford University Press.

  5. Integrated GIS Based Approach in Mapping Groundwater Potential Zones in Kota Kinabalu, Sabah, Malaysia

    Directory of Open Access Journals (Sweden)

    Zulherry Isnain

    2017-03-01

    Full Text Available DOI: 10.17014/ijog.4.2.111-120 The shortage of clean water occurs almost everywhere around the world. The demand for water supply increases over time due to various factors such as development, population growth, pollution, global warming, agricultural activities, and logging. This study was conducted in the vicinity of Kota Kinabalu, Sabah, using a Geographic Information System (GIS) to map groundwater potential zones. The main objective was to generate a predictive map of groundwater potential zones in the studied area through the integration of various thematic maps using GIS. The study comprises five stages, namely collection and preparation of basic data, data analyses, spatial database development, spatial analyses, and spatial integration. Eleven parameters are used, namely rainfall, drainage, soil type, land use, lithology, lineament density, topography, slope steepness, the ratio of sand and clay, major fault zones, and syncline zones. Using a heuristic method, the final map of groundwater potential zones in the studied area is divided into five classes: very low, low, moderate, high, and very high.

  6. Beyond the single species climate envelope: A multifaceted approach to mapping climate change vulnerability

    Science.gov (United States)

    Christopher S. Balzotti; Stanley G. Kitchen; Clinton McCarthy

    2016-01-01

    Federal land management agencies and conservation organizations have begun incorporating climate change vulnerability assessments (CCVAs) as an important component in the management and conservation of landscapes. It is often a challenge to translate that knowledge into management plans and actions, even when research infers species risk. Predictive maps can...

  7. A National Approach to Map and Quantify Terrestrial Vertebrate Biodiversity within an Ecosystem Services Framework

    Science.gov (United States)

    Biodiversity is crucial for the functioning of ecosystems and the products and services from which we transform natural assets of the Earth for human survival, security, and well-being. The ability to assess, report, map, and forecast the life support functions of ecosystems is a...

  8. Predicting childhood sexual or physical abuse: a logistic regression geo-mapping approach to prevention.

    Science.gov (United States)

    Tadoum, Roland K; Smolij, Kamila; Lyn, Michelle A; Johnson, Craig W

    2005-01-01

    This study investigates the degree to which gender, ethnicity, relationship to perpetrator, and geomapped socio-economic factors significantly predict the incidence of childhood sexual abuse, physical abuse and non-abuse. These variables are then linked to geographic identifiers using geographic information system (GIS) technology to develop a geo-mapping framework for child sexual and physical abuse prevention.

  9. A new approach for deriving Flood hazard maps from SAR data and global hydrodynamic models

    Science.gov (United States)

    Matgen, P.; Hostache, R.; Chini, M.; Giustarini, L.; Pappenberger, F.; Bally, P.

    2014-12-01

    With flood consequences likely to amplify because of the growing population and the ongoing accumulation of assets in flood-prone areas, global flood hazard and risk maps are needed for improving flood preparedness at large scale. At the same time, with the rapidly growing archives of SAR images of floods, there is high potential in making use of these images for global and regional flood management. In this framework, an original method that integrates global flood inundation modeling and microwave remote sensing is presented. It takes advantage of the combination of the temporal and spatial continuity of a global inundation model with the high spatial resolution of satellite observations. The availability of model simulations over a long time period offers the opportunity to estimate flood non-exceedance probabilities in a robust way. These probabilities can be attributed to historical satellite observations. Time series of SAR-derived flood extent maps and their associated non-exceedance probabilities can then be combined to generate flood hazard maps with a spatial resolution equal to that of the satellite images, which is usually higher than that of a global inundation model. In principle, this can be done for any area of interest in the world, provided that a sufficient number of relevant remote sensing images are available. As a test case we applied the method to the Severn River (UK) and the Zambezi River (Mozambique), where large archives of Envisat flood images can be exploited. The global ECMWF flood inundation model is used for computing the statistics of extreme events. A comparison with flood hazard maps estimated from in situ measured discharge is carried out. The first results confirm the potential of the method. However, further developments on two aspects are required to improve the quality of the hazard map and to ensure the acceptability of the product by potential end-user organizations. On the one hand, it is of paramount importance to
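
    The central statistical step, attributing a non-exceedance probability to each satellite-observed flood by ranking it within a long model simulation, can be sketched as follows. The flood-extent values are invented, and the real method operates on model state variables rather than a single scalar:

```python
# Empirical non-exceedance probabilities for SAR-observed flood events,
# derived from a long simulated record (plotting-position estimator).
import numpy as np

rng = np.random.default_rng(7)
simulated_extents = rng.gamma(shape=2.0, scale=50.0, size=10_000)  # long model run
sar_event_extents = np.array([160.0, 95.0, 240.0])  # extents at SAR image dates

def non_exceedance(value, record):
    """Estimate P(X <= value) from the simulated record."""
    return (np.sum(record <= value) + 1) / (len(record) + 1)

for extent in sar_event_extents:
    p = non_exceedance(extent, simulated_extents)
    print(f"extent {extent:6.1f} km2 -> non-exceedance probability {p:.3f}")
```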

  10. Effectiveness of the Comprehensive Approach to Rehabilitation (CARe) methodology: design of a cluster randomized controlled trial.

    Science.gov (United States)

    Bitter, Neis A; Roeg, Diana P K; van Nieuwenhuizen, Chijs; van Weeghel, Jaap

    2015-07-22

    There is an increasing amount of evidence for the effectiveness of rehabilitation interventions for people with severe mental illness (SMI). In the Netherlands, a rehabilitation methodology that is well known and often applied is the Comprehensive Approach to Rehabilitation (CARe) methodology. The overall goal of the CARe methodology is to improve the client's quality of life by supporting the client in realizing his/her goals and wishes, handling his/her vulnerability and improving the quality of his/her social environment. The methodology is strongly influenced by the concept of 'personal recovery' and the 'strengths case management model'. No controlled effect studies have been conducted on the CARe methodology hitherto. This study is a two-armed cluster randomized controlled trial (RCT) that will be executed in teams from three organizations for sheltered and supported housing, which provide services to people with long-term severe mental illness. Teams in the intervention group will receive the multiple-day CARe methodology training from a specialized institute and start working according to the CARe methodology guideline. Teams in the control group will continue working in their usual way. Standardized questionnaires will be completed at baseline (T0), and 10 (T1) and 20 months (T2) post baseline. Primary outcomes are recovery, social functioning and quality of life. The model fidelity of the CARe methodology will be assessed at T1 and T2. This study is the first controlled effect study on the CARe methodology and one of the few RCTs on a broad rehabilitation method or strengths-based approach. This study is relevant because mental health care organizations have become increasingly interested in recovery and rehabilitation-oriented care. The trial registration number is ISRCTN77355880.

  11. Randomized trial of a warfarin communication protocol for nursing homes: an SBAR-based approach.

    Science.gov (United States)

    Field, Terry S; Tjia, Jennifer; Mazor, Kathleen M; Donovan, Jennifer L; Kanaan, Abir O; Harrold, Leslie R; Reed, George; Doherty, Peter; Spenard, Ann; Gurwitz, Jerry H

    2011-02-01

    More than 1.6 million Americans currently reside in nursing homes. As many as 12% of them receive long-term anticoagulant therapy with warfarin. Prior research has demonstrated compelling evidence of safety problems with warfarin therapy in this setting, often associated with suboptimal communication between nursing home staff and prescribing physicians. We conducted a randomized trial of a warfarin management protocol using facilitated telephone communication between nurses and physicians in 26 nursing homes in Connecticut in 2007-2008. Intervention facilities received a warfarin management communication protocol using the "Situation, Background, Assessment, and Recommendation" (SBAR) approach. The protocol included an SBAR template to standardize telephone communication about residents on warfarin by requiring information about the situation triggering the call, the background, the nurse's assessment, and recommendations. There were 435 residents who received warfarin therapy during the study period, for 55,167 resident-days in the intervention homes and 53,601 in the control homes. In intervention homes, residents' international normalized ratio (INR) values were in the therapeutic range a statistically significant 4.50% more of the time than in control homes (95% confidence interval [CI], 0.31%-8.69%). There was no difference in obtaining a follow-up INR within 3 days after an INR value ≥4.5 (odds ratio 1.02; 95% CI, 0.44-2.4). Rates of preventable adverse warfarin-related events were lower in intervention homes, although this result was not statistically significant: the incidence rate ratio for any preventable adverse warfarin-related event was 0.87 (95% CI, 0.54-1.4). Facilitated telephone communication between nurses and physicians using the SBAR approach modestly improves the quality of warfarin management for nursing home residents. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. A novel approach to assess the treatment response using Gaussian random field in PET

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Mengdie [Department of Biomedical Engineering, Tsinghua University, Beijing 100084, China and Center for Advanced Medical Imaging Science, Division of Nuclear Medicine and Molecular Imaging, Massachusetts General Hospital, Boston, Massachusetts 02114 (United States); Guo, Ning [Center for Advanced Medical Imaging Science, Division of Nuclear Medicine and Molecular Imaging, Massachusetts General Hospital, Boston, Massachusetts 02114 (United States); Hu, Guangshu; Zhang, Hui, E-mail: hzhang@mail.tsinghua.edu.cn, E-mail: li.quanzheng@mgh.harvard.edu [Department of Biomedical Engineering, Tsinghua University, Beijing 100084 (China); El Fakhri, Georges; Li, Quanzheng, E-mail: hzhang@mail.tsinghua.edu.cn, E-mail: li.quanzheng@mgh.harvard.edu [Center for Advanced Medical Imaging Science, Division of Nuclear Medicine and Molecular Imaging, Massachusetts General Hospital, Boston, Massachusetts 02114 and Department of Radiology, Harvard Medical School, Boston, Massachusetts 02115 (United States)

    2016-02-15

    Purpose: The assessment of early therapeutic response to anticancer therapy is vital for treatment planning and patient management in the clinic. With the development of personalized treatment plans, assessing early treatment response, especially before any anatomically apparent changes appear, has become an urgent clinical need. Positron emission tomography (PET) imaging plays an important role in clinical oncology for tumor detection, staging, and therapy response assessment. Many studies of therapy response involve interpreting differences between two PET images, usually in terms of standardized uptake values (SUVs). However, the quantitative accuracy of this measurement is limited. This work proposes a statistically robust approach for therapy response assessment based on the Gaussian random field (GRF), to provide a statistically more meaningful scale on which to evaluate therapy effects. Methods: The authors propose a new criterion for therapeutic assessment by incorporating image noise into the traditional SUV method. An analytical method based on approximate expressions of the Fisher information matrix was applied to model the variance of individual pixels in reconstructed images. A zero-mean, unit-variance GRF under the null hypothesis (no response to therapy) was obtained by normalizing each pixel of the post-therapy image with the mean and standard deviation of the pre-therapy image. The performance of the proposed method was evaluated by Monte Carlo simulation, in which XCAT phantoms (128 × 128 pixels) with lesions of various diameters (2-6 mm), multiple tumor-to-background contrasts (3-10), and different changes in intensity (6.25%-30%) were used. The receiver operating characteristic curves and the corresponding areas under the curve were computed for both the proposed method and the traditional methods, whose figure of merit is the percentage change of SUVs. The formula for the false positive rate (FPR) estimation was developed for the proposed therapy response
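
    The normalization that yields the null-hypothesis GRF can be written in a few lines. The sketch below uses scalar pre-therapy statistics for simplicity, whereas the paper derives pixel-wise variances from approximate expressions of the Fisher information matrix; the images are toy arrays:

```python
# Standardize a post-therapy image by the pre-therapy statistics so that,
# under the null hypothesis (no response), the result is ~ N(0, 1) per pixel.
import numpy as np

rng = np.random.default_rng(3)
pre_mean, pre_std = 100.0, 8.0        # modeled pre-therapy pixel statistics
post = rng.normal(loc=pre_mean, scale=pre_std, size=(128, 128))
post[40:46, 40:46] -= 20.0            # a responding lesion (intensity drop)

z_map = (post - pre_mean) / pre_std   # zero-mean, unit-variance GRF under null
print("min z in lesion region:", round(float(z_map[40:46, 40:46].min()), 2))
```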

  13. Global financial indices and twitter sentiment: A random matrix theory approach

    Science.gov (United States)

    García, A.

    2016-11-01

    We use a Random Matrix Theory (RMT) approach to analyze the correlation matrix structure of a collection of public tweets and the corresponding return time series associated with 20 global financial indices over 7 trading months of 2014. In order to quantify the collection of tweets, we constructed daily polarity time series from public tweets via sentiment analysis. The results of the RMT analysis support the existence of true correlations between financial indices, polarities, and the mixture of the two. Moreover, we found good agreement between the temporal behavior of the extreme eigenvalues of both empirical data sets, and similar results were found when computing the inverse participation ratio, which provides evidence of the emergence of common factors in global financial information whether we use the return or the polarity data as a source. In addition, we found a very strong indication that polarity Granger-causes returns of an Indonesian index over a long range of lag trading days, whereas for Israel, South Korea, Australia, and Japan, the predictive information for returns is also present but weaker. Our results suggest that incorporating polarity as a financial indicator may open up new insights to understand the collective and even individual behavior of global financial indices.
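
    The two RMT diagnostics mentioned, eigenvalues compared against the Marchenko-Pastur band and the inverse participation ratio (IPR) of the eigenvectors, are compact to compute. The series below are random stand-ins for the return and polarity data:

```python
# RMT diagnostics on a correlation matrix of stand-in time series.
import numpy as np

rng = np.random.default_rng(11)
T, N = 140, 20                       # ~7 trading months, 20 indices
returns = rng.standard_normal((T, N))

C = np.corrcoef(returns, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)  # ascending eigenvalues, column vectors

# Marchenko-Pastur bounds for a pure-noise correlation matrix with q = T/N.
q = T / N
lam_min = (1 - np.sqrt(1 / q)) ** 2
lam_max = (1 + np.sqrt(1 / q)) ** 2
informative = eigvals[(eigvals < lam_min) | (eigvals > lam_max)]
print("eigenvalues outside the Marchenko-Pastur band:", np.round(informative, 2))

# IPR ~ 1/N for a delocalized (market-wide) mode, ~1 for a localized one.
ipr = np.sum(eigvecs ** 4, axis=0)
print("IPR of largest-eigenvalue mode:", round(float(ipr[-1]), 3))
```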

  14. Reconstruction of compressive multispectral sensing data using a multilayered conditional random field approach

    Science.gov (United States)

    Kazemzadeh, Farnoud; Shafiee, Mohammad J.; Wong, Alexander; Clausi, David A.

    2014-09-01

    The prevalence of compressive sensing is continually growing in all facets of imaging science. Compressive sensing allows for the capture and reconstruction of an entire signal from a sparse (under-sampled), yet sufficient, set of measurements that is representative of the target being observed. This compressive sensing strategy reduces the duration of the data capture, the size of the acquired data, and the cost and complexity of the imaging hardware, while preserving the necessary underlying information. Compressive sensing systems require advanced reconstruction algorithms to reconstruct complete signals from the sparse measurements made. Here, a new reconstruction algorithm is introduced specifically for the reconstruction of compressive multispectral (MS) sensing data that allows for high-quality reconstruction from acquisitions at sub-Nyquist rates. We propose a multilayered conditional random field (MCRF) model, which extends the CRF model by incorporating two joint layers of certainty and estimated states. The proposed algorithm treats the reconstruction of each spectral channel as an MCRF given the sparse MS measurements. Since the observations are incomplete, the MCRF incorporates an extra layer determining the certainty of the measurements. The proposed MCRF approach was evaluated using simulated compressive MS data acquisitions, and is shown to enable fast acquisition of MS sensing data with reduced imaging hardware cost and complexity.

  15. Singularities in primate orientation maps.

    Science.gov (United States)

    Obermayer, K; Blasdel, G G

    1997-04-01

    We report the results of an analysis of orientation maps in primate striate cortex with a focus on singularities and their distribution. Data were obtained from squirrel monkeys and macaque monkeys of different ages. We find that approximately 80% of the singularities that are nearest neighbors have opposite signs, and that the spatial distribution of singularities differs significantly from a random distribution of points. We do not find evidence for consistent geometric patterns that singularities may form across the cortex. Except for a different overall alignment of orientation bands and different periods of repetition, maps obtained from different animals and different ages are found to be similar with respect to the measures used. Orientation maps are then compared with two different pattern models that are currently discussed in the literature: bandpass-filtered white noise, which accounts very well for the overall map structure, and the field analogy model, which specifies the orientation map by the location of singularities and their properties. The bandpass-filtered noise approach to orientation patterns correctly predicts the sign correlations between singularities and accounts for the deviations of the spatial distribution of singularities away from a random dot pattern. The field analogy model can account for the structure of certain local patches of the orientation map but not for the whole map. Neither of the models is completely satisfactory, and the structure of the orientation map remains to be fully explained.

  16. An improved data-driven fuzzy mineral prospectivity mapping procedure; cosine amplitude-based similarity approach to delineate exploration targets

    Science.gov (United States)

    Parsa, Mohammad; Maghsoudi, Abbas; Yousefi, Mahyar

    2017-06-01

    Weighting and synthesizing exploration evidence criteria for mineral prospectivity mapping (MPM) are affected by the complexity and ambiguity of ore mineralization processes. In this regard, fuzziness can facilitate the modeling of such vague processes for MPM. Furthermore, imprecise selection of the exploration criteria used in MPM has a negative influence on the efficiency of the generated prospectivity models. In this paper, a coherent set of exploration features was selected from various exploration criteria by using distance distribution analysis. Then, a cosine amplitude-based similarity procedure was adapted as a data-driven fuzzy logic approach for predictive mapping of porphyry-Cu prospectivity in the Arasbaran metallogenic zone, NW Iran. In addition, a conventional data-driven fuzzy prospectivity model was generated for comparison purposes. Comparison of the two models demonstrated the superiority of the cosine amplitude-based fuzzy procedure for MPM.
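
    The cosine amplitude similarity at the heart of the procedure is r_ij = (sum_k x_ik x_jk) / sqrt((sum_k x_ik^2)(sum_k x_jk^2)). A small vectorized sketch, with invented fuzzy evidence vectors standing in for the 13 evidential layers:

```python
# Cosine amplitude similarity between each cell's evidence vector and a
# reference signature (e.g. derived from known deposits); data are invented.
import numpy as np

rng = np.random.default_rng(5)
evidence = rng.random((1000, 13))      # 13 fuzzy evidential layers per cell
deposit_signature = rng.random(13)     # composite signature of known deposits

num = evidence @ deposit_signature
den = np.sqrt((evidence ** 2).sum(axis=1) * (deposit_signature @ deposit_signature))
prospectivity = num / den              # similarity in [0, 1] per cell

targets = prospectivity > np.quantile(prospectivity, 0.95)  # top 5% as targets
print(targets.sum(), "cells delineated as exploration targets")
```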

  17. Gaussian Multiple Instance Learning Approach for Mapping the Slums of the World Using Very High Resolution Imagery

    Energy Technology Data Exchange (ETDEWEB)

    Vatsavai, Raju [ORNL]

    2013-01-01

    In this paper, we present a computationally efficient algorithm based on multiple instance learning for mapping informal settlements (slums) using very high-resolution remote sensing imagery. From a remote sensing perspective, informal settlements share unique spatial characteristics that distinguish them from other urban structures like industrial, commercial, and formal residential settlements. However, regular pattern recognition and machine learning methods, which are predominantly single-instance or per-pixel classifiers, often fail to accurately map the informal settlements as they do not capture the complex spatial patterns. To overcome these limitations, we employed a multiple instance based machine learning approach, where groups of contiguous pixels (image patches) are modeled as generated by a Gaussian distribution. We have conducted several experiments on very high-resolution satellite imagery, representing four unique geographic regions across the world. Our method showed consistent improvement in accurately identifying informal settlements.
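
    As a sketch of the bag-of-pixels idea described above (not the authors' exact estimator), one can fit a Gaussian over pixel features per class and label a patch by the class with the highest average log-likelihood over its pixels. Feature values and class means below are synthetic.

      import numpy as np
      from scipy.stats import multivariate_normal

      rng = np.random.default_rng(2)

      # Hypothetical per-pixel features (e.g., two spectral bands) for training areas
      slum_pixels   = rng.normal([0.4, 0.3], 0.05, size=(5000, 2))
      formal_pixels = rng.normal([0.6, 0.5], 0.05, size=(5000, 2))

      # Model each class as a single Gaussian over pixel features
      classes = {
          "informal": multivariate_normal(slum_pixels.mean(0), np.cov(slum_pixels.T)),
          "formal":   multivariate_normal(formal_pixels.mean(0), np.cov(formal_pixels.T)),
      }

      def classify_patch(patch_pixels):
          # A patch (bag of contiguous pixels) gets the class with the highest
          # mean log-likelihood over all of its pixels.
          scores = {c: g.logpdf(patch_pixels).mean() for c, g in classes.items()}
          return max(scores, key=scores.get)

      test_patch = rng.normal([0.41, 0.31], 0.05, size=(100, 2))
      print(classify_patch(test_patch))   # -> "informal"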

  18. Identification and mapping the high nature value farmland by the comparison of a combined and species approaches in Tuscany, Italy

    Directory of Open Access Journals (Sweden)

    Giulio Lazzerini

    2015-09-01

    Full Text Available Low-intensity farming systems play a crucial role in nature conservation by preserving 50% of the habitats, flora and fauna occurring in Europe. For this reason the identification, classification and mapping of high nature value farmlands (HNVfs) is becoming an overriding concern. In this study, two different approaches, namely a combined approach and a species-based approach, were used to spatially identify HNVfs (types 1, 2 and 3) across the Tuscany region (Italy). The first approach calculated different indicators (extensive practices indicator, crop diversity indicator, landscape element indicator) at 1×1 km grid cell spatial resolution using pre-existing spatial datasets integrated within a geographic information system (GIS) environment. The species-based approach, in contrast, relied on a pre-existing regional naturalistic inventory. All indicators and the resulting HNVfs derived from the two approaches were aggregated at the municipality level. Despite some differences, the two adopted approaches spatially intercepted the same HNVf areas, accounting for 35% of the total utilised agricultural area of the region. Only 16% of HNVfs were located inside protected areas, and thus under current conservation and protection management actions. Finally, HNVfs of the Tuscany region were spatially aggregated into four relevant agro-ecosystems by taking into consideration the cropping systems and the landscape elements' characteristics peculiar to the region.

  19. Prediction of the indoor temperatures of an urban area with an in-time regression mapping approach.

    Science.gov (United States)

    Smargiassi, Audrey; Fournier, Michel; Griot, Chloé; Baudouin, Yves; Kosatsky, Tom

    2008-05-01

    Excess mortality has been noted during high ambient temperature episodes. During such episodes, individuals are not likely to be uniformly exposed to temperatures within cities. Exposure of individuals to high temperatures is likely to fluctuate with the micro-urban variation of outdoor temperatures (heat island effect) and with factors linked to building properties. In this paper, a GIS-based regression mapping approach is proposed to model urban spatial patterns of indoor temperatures in time, for all residential buildings of an urban area. In July 2005, the hourly indoor temperature was measured with data loggers for 31 consecutive days, concurrently in 75 dwellings in Montreal. The generalized estimating equation (GEE) model developed to predict indoor temperatures integrates the temporal variability of outdoor temperatures (and their 24 h moving average) with geo-referenced determinants available for the entire city, such as surface temperatures at each site (from a satellite image) and building characteristics (from the Montreal Property Assessment database). The proportion of the variability of the indoor temperatures explained increases from 20%, using only outdoor temperatures, to 54% with the full model. Using this model, high-resolution maps of indoor temperatures can be provided across an entire urban area. The model developed adds a temporal dimension to similar regression mapping approaches used to estimate exposure for population health studies based on spatial predictors, and can thus be used to predict exposure to indoor temperatures under various outdoor temperature scenarios. It is thus concluded that such a model might be used as a means of mapping indoor temperatures to inform urban planning and housing strategies to mitigate the effects of climate change, to orient public health interventions, or as a basis for assessing exposure as part of epidemiological studies.
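
    A model of this shape can be fitted with the GEE implementation in statsmodels. The sketch below uses entirely synthetic data and invented coefficients, with dwellings as the grouping factor and an exchangeable working correlation, which is a plausible reading of the abstract rather than the authors' exact specification.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n_dwell, n_hours = 75, 24 * 31

      # Hypothetical panel: hourly records for 75 dwellings over 31 days
      d = pd.DataFrame({
          "dwelling": np.repeat(np.arange(n_dwell), n_hours),
          "t_out": np.tile(20 + 8 * rng.random(n_hours), n_dwell),
      })
      d["t_out_ma24"] = d.groupby("dwelling")["t_out"].transform(
          lambda s: s.rolling(24, min_periods=1).mean())
      d["surf_temp"] = np.repeat(25 + 5 * rng.random(n_dwell), n_hours)  # per-site value
      d["t_in"] = (5 + 0.3 * d["t_out"] + 0.4 * d["t_out_ma24"]
                   + 0.2 * d["surf_temp"] + rng.normal(0, 1, len(d)))

      X = sm.add_constant(d[["t_out", "t_out_ma24", "surf_temp"]])
      gee = sm.GEE(d["t_in"], X, groups=d["dwelling"],
                   family=sm.families.Gaussian(),
                   cov_struct=sm.cov_struct.Exchangeable())
      print(gee.fit().summary())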

  20. Development of a dynamic web mapping service for vegetation productivity using earth observation and in situ sensors in a sensor web based approach

    NARCIS (Netherlands)

    Kooistra, L.; Bergsma, A.R.; Chuma, B.; Bruin, de S.

    2009-01-01

    This paper describes the development of a sensor web based approach which combines earth observation and in situ sensor data to derive typical information offered by a dynamic web mapping service (WMS). A prototype has been developed which provides daily maps of vegetation productivity for the

  1. Mapping Aquatic Vegetation in a Large, Shallow Eutrophic Lake: A Frequency-Based Approach Using Multiple Years of MODIS Data

    Directory of Open Access Journals (Sweden)

    Xiaohan Liu

    2015-08-01

    Full Text Available Aquatic vegetation serves many important ecological and socioeconomic functions in lake ecosystems. The presence of floating algae poses difficulties for accurately estimating the distribution of aquatic vegetation in eutrophic lakes. We present an approach to map the distribution of aquatic vegetation in Lake Taihu (a large, shallow eutrophic lake in China) and reduce the influence of floating algae on aquatic vegetation mapping. Our approach involved a frequency analysis over a 2003–2013 time series of the floating algal index (FAI) based on moderate-resolution imaging spectroradiometer (MODIS) data. Three phenological periods were defined based on the vegetation presence frequency (VPF) and the growth of algae and aquatic vegetation: December and January composed the period of wintering aquatic vegetation; February and March composed the period of prolonged coexistence of algal blooms and wintering aquatic vegetation; and June to October was the peak period of the coexistence of algal blooms and aquatic vegetation. By comparing and analyzing the satellite-derived aquatic vegetation distribution and 244 in situ measurements made in 2013, we established a FAI threshold of −0.025 and VPF thresholds of 0.55, 0.45 and 0.85 for the three phenological periods. We validated the accuracy of our approach by comparing the results between the satellite-derived maps and the in situ results obtained from 2008–2012. The overall classification accuracy was 87%, 81%, 77%, 88% and 73% in the five years from 2008–2012, respectively. We then applied the approach to the MODIS images from 2003–2013 and obtained the total area of the aquatic vegetation, which varied from 265.94 km2 in 2007 to 503.38 km2 in 2008, with an average area of 359.62 ± 69.20 km2 over the 11 years. Our findings suggest that (1) the proposed approach can be used to map the distribution of aquatic vegetation in eutrophic algae-rich waters and (2) dramatic changes occurred in the
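
    The core decision rule reads as a per-pixel frequency thresholding, which the following sketch imitates for a single phenological period: count how often FAI exceeds −0.025, then keep pixels whose vegetation presence frequency reaches the period's VPF threshold. The synthetic FAI stack, and the assumption that persistent (high-VPF) signals are vegetation while transient ones are algal blooms, are illustrative readings of the abstract; the paper's full logic combines three periods with separate thresholds.

      import numpy as np

      rng = np.random.default_rng(4)
      # fai: stack of FAI images for one phenological period, shape (T, H, W);
      # synthetic stand-in for MODIS-derived values
      fai = rng.normal(-0.05, 0.05, size=(40, 100, 100))

      FAI_T = -0.025      # FAI threshold from the paper
      VPF_T = 0.85        # VPF threshold for the June-October period in the paper

      vpf = (fai > FAI_T).mean(axis=0)    # vegetation presence frequency per pixel
      vegetation_mask = vpf >= VPF_T      # persistent signal -> aquatic vegetation
      print("vegetation pixels:", vegetation_mask.sum())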

  2. An Automated Approach to Map Winter Cropped Area of Smallholder Farms across Large Scales Using MODIS Imagery

    Directory of Open Access Journals (Sweden)

    Meha Jain

    2017-06-01

    Full Text Available Fine-scale agricultural statistics are an important tool for understanding trends in food production and their associated drivers, yet these data are rarely collected in smallholder systems. These statistics are particularly important for smallholder systems given the large amount of fine-scale heterogeneity in production that occurs in these regions. To overcome the lack of ground data, satellite data are often used to map fine-scale agricultural statistics. However, doing so is challenging for smallholder systems because of (1) complex sub-pixel heterogeneity; (2) little to no available calibration data; and (3) high amounts of cloud cover, as most smallholder systems occur in the tropics. We develop an automated method termed the MODIS Scaling Approach (MSA) to map smallholder cropped area across large spatial and temporal scales using MODIS Enhanced Vegetation Index (EVI) satellite data. We use this method to map winter cropped area, a key measure of cropping intensity, across the Indian subcontinent annually from 2000–2001 to 2015–2016. The MSA defines a pixel as cropped based on winter growing season phenology and scales the percent of cropped area within a single MODIS pixel based on observed EVI values at peak phenology. We validated the result with eleven high-resolution scenes (spatial scale of 5 × 5 m² or finer) that we classified into cropped versus non-cropped maps using training data collected by visual inspection of the high-resolution imagery. The MSA had moderate to high accuracies when validated using these eleven scenes across India (R² ranging between 0.19 and 0.89, with an overall R² of 0.71 across all sites). This method requires no calibration data, making it easy to implement across large spatial and temporal scales, with 100% spatial coverage due to the compositing of EVI to generate cloud-free data sets. The accuracies found in this study are similar to those of other studies that map crop production using automated methods.
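
    The two MSA ingredients named above, a phenology-based cropped/non-cropped decision and a linear scaling of sub-pixel cropped fraction from peak EVI, can be sketched as follows. The EVI cutoff and scaling endpoints are invented placeholders; the paper derives its own values.

      import numpy as np

      rng = np.random.default_rng(5)
      # evi: MODIS EVI time series for the winter season, shape (T, H, W); synthetic
      evi = np.clip(rng.normal(0.35, 0.15, size=(12, 50, 50)), 0, 1)

      peak = evi.max(axis=0)            # EVI at peak winter phenology, per pixel
      cropped = peak > 0.30             # hypothetical phenology-based cutoff

      # Scale per-pixel cropped fraction linearly between EVI values assumed to
      # correspond to 0% and 100% crop cover within a MODIS pixel.
      EVI_MIN, EVI_MAX = 0.30, 0.70     # illustrative endpoints, not the paper's
      frac = np.clip((peak - EVI_MIN) / (EVI_MAX - EVI_MIN), 0, 1) * cropped

      PIXEL_AREA_HA = 250 * 250 / 1e4   # nominal 250 m MODIS pixel, in hectares
      print("winter cropped area (ha):", (frac * PIXEL_AREA_HA).sum())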

  3. Using a conceptual approach with concept mapping to promote critical thinking.

    Science.gov (United States)

    Vacek, Jenny E

    2009-01-01

    Promoting the development of critical thinking is crucial to nursing education for two reasons. First, the National League for Nursing and the American Association of Colleges of Nursing consider critical thinking an outcome criterion for baccalaureate nursing education. Second, and significantly more important, professional nursing practice requires critical thinking skills and problem solving abilities. Too often, teaching is not directed at specifically designed activities that foster critical thinking. Various teaching strategies have been proposed that promote critical thinking, including service learning, role playing, reflective learning, the critical incident conference, videotaped vignettes, preceptorship, and concept mapping. This article focuses on the use of assimilation theory and concept maps to facilitate critical thinking experiences in nursing education.

  4. Mouse whole-body organ mapping by non-rigid registration approach

    Science.gov (United States)

    Xiao, Di; Zahra, David; Bourgeat, Pierrick; Berghofer, Paula; Acosta Tamayo, Oscar; Green, Heather; Gregoire, Marie Claude; Salvado, Olivier

    2011-03-01

    Automatic small-animal whole-body organ registration is challenging because of differences in subjects' joint structure, posture and position, and the loss of reference features. In this paper, an improved 3D shape context based non-rigid registration method is applied for mouse whole-body skeleton registration and lung registration. A geodesic path based non-rigid registration method is proposed for mouse torso skin registration. Based on the above registration methods, a novel non-rigid registration framework is proposed for mouse whole-body organ mapping from an atlas to newly scanned CT data. A preliminary experiment was performed to test the method on lung and skin registration. A whole-body organ mapping was performed on three target datasets, and the selected organs were compared with manual outlining results. The robustness of the method has been demonstrated.

  5. The development of an adolescent smoking cessation intervention—an Intervention Mapping approach to planning

    Science.gov (United States)

    Dalum, Peter; Schaalma, Herman; Kok, Gerjo

    2012-01-01

    The objective of this project was to develop a theory- and evidence-based adolescent smoking cessation intervention using both new and existing materials. We used the Intervention Mapping framework for planning health promotion programmes. Based on a needs assessment, we identified important and changeable determinants of cessation behaviour, specified change objectives for the intervention programme, selected theoretical change methods for accomplishing intervention objectives and finally operationalized change methods into practical intervention strategies. We found that guided practice, modelling, self-monitoring, coping planning, consciousness raising, dramatic relief and decisional balance were suitable methods for adolescent smoking cessation. We selected behavioural journalism, guided practice and Motivational Interviewing as strategies in our intervention. Intervention Mapping helped us to develop a systematic adolescent smoking cessation intervention with a clear link between behavioural goals, theoretical methods, practical strategies and materials and with a strong focus on implementation and recruitment. This paper does not present evaluation data. PMID:21730251

  6. The development of an adolescent smoking cessation intervention--an Intervention Mapping approach to planning.

    Science.gov (United States)

    Dalum, Peter; Schaalma, Herman; Kok, Gerjo

    2012-02-01

    The objective of this project was to develop a theory- and evidence-based adolescent smoking cessation intervention using both new and existing materials. We used the Intervention Mapping framework for planning health promotion programmes. Based on a needs assessment, we identified important and changeable determinants of cessation behaviour, specified change objectives for the intervention programme, selected theoretical change methods for accomplishing intervention objectives and finally operationalized change methods into practical intervention strategies. We found that guided practice, modelling, self-monitoring, coping planning, consciousness raising, dramatic relief and decisional balance were suitable methods for adolescent smoking cessation. We selected behavioural journalism, guided practice and Motivational Interviewing as strategies in our intervention. Intervention Mapping helped us to develop a systematic adolescent smoking cessation intervention with a clear link between behavioural goals, theoretical methods, practical strategies and materials and with a strong focus on implementation and recruitment. This paper does not present evaluation data.

  7. Effective stakeholder participation in setting research priorities using a Global Evidence Mapping approach.

    Science.gov (United States)

    Clavisi, Ornella; Bragge, Peter; Tavender, Emma; Turner, Tari; Gruen, Russell L

    2013-05-01

    We present a multistep process for identifying priority research areas in rehabilitation and long-term care of traumatic brain-injured (TBI) patients. In particular, we aimed to (1) identify which stakeholders should be involved; (2) identify what methods are appropriate; (3) examine different criteria for the generation of research priority areas; and (4) test the feasibility of linkage and exchange among researchers, decision makers, and other potential users of the research. Potential research questions were identified and developed using an initial scoping meeting and preliminary literature search, followed by a facilitated mapping workshop and an online survey. Identified research questions were then prioritized against specific criteria (clinical importance, novelty, and controversy). Existing evidence was then mapped to the high-priority questions using usual processes for search, screening, and selection. A broad range of stakeholders were then brought together at a forum to identify priority research themes for future research investment. Using clinical and research leaders, smaller targeted planning workshops prioritized specific research projects for each of the identified themes. Twenty-six specific questions about TBI rehabilitation were generated, 14 of which were high priority. No one method identified all high-priority questions. Methods that relied solely on the views of clinicians and researchers identified fewer high-priority questions compared with methods that used broader stakeholder engagement. Evidence maps of these high-priority questions yielded a number of evidence gaps. Priority questions and evidence maps were then used to inform a research forum, which identified 12 priority themes for future research. Our research demonstrates the value of a multistep and multimethod process involving many different types of stakeholders for prioritizing research to improve the rehabilitation outcomes of people who have suffered TBI. Enhancing

  8. Transfer map approach to and optical effects of energy degraders on the performance of fragment separators.

    Energy Technology Data Exchange (ETDEWEB)

    Erdelyi, B.; Bandura, L.; Nolen, J.; Physics

    2009-01-01

    A second order analytical and an arbitrary order numerical procedure is developed for the computation of transfer maps of energy degraders. The incorporation of the wedges into the optics of fragment separators for next-generation exotic beam facilities, their optical effects, and the optimization of their performance is studied in detail. It is shown how to place and shape the degraders in the system such that aberrations are minimized and resolving powers are maximized.
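
    As a generic illustration of what a transfer map is (not the degrader maps computed in the paper), the sketch below propagates a phase-space vector through a map truncated at second order, z_out_i = sum_j R[i,j] z_j + sum_{j,k} T[i,j,k] z_j z_k, with an invented aberration tensor T.

      import numpy as np

      rng = np.random.default_rng(6)
      # Phase-space coordinate z = (x, x')
      R = np.array([[1.0, 0.5],                 # linear (first-order) part, e.g. a drift
                    [0.0, 1.0]])
      T = 1e-3 * rng.normal(size=(2, 2, 2))     # illustrative second-order aberrations

      def apply_map(z):
          # z_out_i = sum_j R[i,j] z_j + sum_{j,k} T[i,j,k] z_j z_k
          return R @ z + np.einsum("ijk,j,k->i", T, z, z)

      z0 = np.array([1e-3, 2e-4])               # initial offset and angle
      print(apply_map(z0))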

  9. Transfer map approach to and optical effects of energy degraders in fragment separators

    Directory of Open Access Journals (Sweden)

    B. Erdelyi

    2009-01-01

    Full Text Available A second order analytical and an arbitrary order numerical procedure is developed for the computation of transfer maps of energy degraders. The incorporation of the wedges into the optics of fragment separators for next-generation exotic beam facilities, their optical effects, and the optimization of their performance is studied in detail. It is shown how to place and shape the degraders in the system such that aberrations are minimized and resolving powers are maximized.

  10. Transfer map approach to and optical effects of energy degraders in fragment separators

    Science.gov (United States)

    Erdelyi, B.; Bandura, L.; Nolen, J.

    2009-01-01

    A second order analytical and an arbitrary order numerical procedure is developed for the computation of transfer maps of energy degraders. The incorporation of the wedges into the optics of fragment separators for next-generation exotic beam facilities, their optical effects, and the optimization of their performance is studied in detail. It is shown how to place and shape the degraders in the system such that aberrations are minimized and resolving powers are maximized.

  11. Depth-Resolved Mapping of Tissue Mechanical Properties Using a Novel Optical Approach

    OpenAIRE

    Hajjarian, Zeinab; Nadkarni, Seemantni K.

    2011-01-01

    Progression of most diseases, such as atherosclerosis, cancer, neurodegenerative disease and osteoarthritis, is accompanied by drastic changes in the biomechanics of tissue. Hence, non-contact and non-invasive technologies for 3-dimensional mapping of tissue biomechanics are invaluable for diagnostic purposes. Laser Speckle Microrheology (LSM) has been developed in our lab to enable high-resolution mechanical evaluation of tissue. To this end, the tissue sample is illuminated by a coherent and focused...

  12. Compression map, functional groups and fossilization: A chemometric approach (Pennsylvanian neuropteroid foliage, Canada)

    Science.gov (United States)

    D'Angelo, J. A.; Zodrow, E.L.; Mastalerz, Maria

    2012-01-01

    Nearly all of the spectrochemical studies involving Carboniferous foliage of seed-ferns are based on a limited number of pinnules, mainly compressions. In contrast, in this paper we illustrate working with a larger pinnate segment, i.e., a 22-cm long neuropteroid specimen, compression-preserved with cuticle: the compression map. The objective is to study preservation variability on a larger scale, where observation of the transparency/opacity of constituent pinnules is used as a first approximation for assessing the degree of pinnule coalification/fossilization. Spectrochemical methods by Fourier transform infrared (FTIR) spectrometry furnish semi-quantitative data for principal component analysis. The compression map shows a high degree of preservation variability, which ranges from comparatively more coalified pinnules to less coalified pinnules that resemble fossilized-cuticles, noting that the pinnule midveins are preserved more like fossilized-cuticles. A general overall trend from coalified pinnules towards fossilized-cuticles, i.e., variable chemistry, is inferred from the semi-quantitative FTIR data, as higher contents of aromatic compounds occur in the visually more opaque upper location of the compression map. The latter also shows a higher condensation of the aromatic nuclei along with some variation in both ring size and degree of aromatic substitution. From principal component analysis we infer a correspondence between transparency/opacity observations and chemical information, which correlates to varying degrees with the fossilization/coalification of the pinnules. © 2011 Elsevier B.V.

  13. A new approach to concordance in mid-infrared spectromicroscopy mapping of malignant tumors.

    Science.gov (United States)

    Ali, Kaiser; Reichert, Todd; Gomez, Daniel; Lu, Yanjie; Jan, Alexander; Christensen, Colleen

    2010-10-01

    Mid-infrared spectromicroscopy studies on biological tissue sections require accurate identification of tumor-bearing areas in histology-stained and infrared-unstained tissue sections. Concordance was achieved as follows: paired stained and unstained thin (5 μm) human brain tumor cryosections mounted on slides were scanned with a Nikon Coolscan 4000 film scanner at 4000 dpi, edited with Adobe Photoshop CS2 software, and both digital images saved. A digital tractile grid, developed in our laboratory, was overlaid onto both images. Boundaries of tumor-containing areas in stained sections were identified by light microscopy, and a digital boundary map constructed. The map was transferred onto the unstained spectromicroscopy tissue image, and finally layered onto the gridded, equisized, spectromicroscope-generated overview image prior to Fourier transform infrared spectromicroscopy. Accurate identification of tumor-bearing areas, normal brain tissue and transitional zones allowed for meaningful interpretation of respective spectral patterns in detecting subtle differences within biochemical profiles. This is the first reported method of a standardized technique for ensuring concordance in mapping of malignant tumors by mid-infrared spectromicroscopy. This technique is applicable to all biological thin tissue sections, and serves to enhance accuracy of concordance between globar- and synchrotron-light generated infrared data with that obtained by conventional light microscopy.

  14. a Virtual Hub Brokering Approach for Integration of Historical and Modern Maps

    Science.gov (United States)

    Bruno, N.; Previtali, M.; Barazzetti, L.; Brumana, R.; Roncella, R.

    2016-06-01

    Geospatial data are today more and more widespread. Many different institutions, such as geographical institutes, public administrations, collaborative communities (e.g., OSM) and web companies, nowadays make available a large number of maps. Besides this cartography, projects for digitizing, georeferencing and web publication of historical maps have increasingly spread in recent years. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required, and problems of interconnection between different data sources and their restricted interoperability limit the wide utilization of available geo-data. This paper aims to describe some actions performed to assure interoperability between data, in particular spatial and geographic data, gathered from different data providers, with different features and referring to different historical periods. The article summarizes and exemplifies how, starting from projects of historical map digitizing and Historical GIS implementation, respectively for Lombardy and for the city of Parma, interoperability is possible in the framework of the ENERGIC OD project. The European project ENERGIC OD, thanks to a specific component - the virtual hub - based on a brokering framework, copes with the previously listed problems and allows the interoperability between different data sources.

  15. Statistical Approaches in Analysis of Variance: from Random Arrangements to Latin Square Experimental Design

    OpenAIRE

    Radu E. SESTRAŞ; Lorentz JÄNTSCHI; Sorana D. BOLBOACĂ

    2009-01-01

    Background: The choices of experimental design as well as of statistical analysis are of huge importance in field experiments. These choices must be made correctly in order to obtain the best possible precision of the results. The random arrangements, randomized blocks and Latin square designs were reviewed and analyzed from the statistical perspective of error analysis. Material and Method: Random arrangements, randomized block and Latin squares experimental designs were used as field experiments. ...

  16. Permutation Test Approach for Ordered Alternatives in Randomized Complete Block Design: A Comparative Study

    OpenAIRE

    GOKPINAR, Esra; GUL, Hasan; GOKPINAR, Fikri; BAYRAK, Hülya; OZONUR, Deniz

    2013-01-01

    Randomized complete block design is one of the most widely used experimental designs in statistical analysis. For testing ordered alternatives in a randomized complete block design, parametric tests are used if the random samples are drawn from a Normal distribution. If the normality assumption is not met, nonparametric methods are used. In this study, we are interested in nonparametric tests, and we briefly introduce such tests for ordered alternatives, namely the Page, Modified Page and Hollander tests. We also give Permutat...
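
    For ordered alternatives, Page's statistic L = sum_j j * R_j (treatments indexed in the hypothesized order, R_j the rank sum of treatment j across blocks) has a natural permutation version: re-randomize observations within each block and recompute L. A minimal sketch, with synthetic data and an invented effect size:

      import numpy as np
      from scipy.stats import rankdata

      def page_L(blocks):
          # Rank within each block, then weight column rank sums by treatment order
          ranks = np.apply_along_axis(rankdata, 1, blocks)
          return np.sum(np.arange(1, blocks.shape[1] + 1) * ranks.sum(axis=0))

      def permutation_page_test(blocks, n_perm=10000, seed=0):
          rng = np.random.default_rng(seed)
          L_obs = page_L(blocks)
          count = 0
          for _ in range(n_perm):
              perm = np.apply_along_axis(rng.permutation, 1, blocks)  # within-block shuffle
              if page_L(perm) >= L_obs:
                  count += 1
          return L_obs, (count + 1) / (n_perm + 1)   # one-sided permutation p-value

      # Hypothetical randomized complete block data: 8 blocks x 4 ordered treatments
      rng = np.random.default_rng(7)
      data = rng.normal(size=(8, 4)) + np.arange(4) * 0.5   # built-in increasing trend
      print(permutation_page_test(data))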

  17. Fractional calculus approach to the statistical characterization of random variables and vectors

    OpenAIRE

    Cottone, D. ; Paola, M.D.

    2015-01-01

    Fractional moments have been investigated by many authors to represent the density of univariate and bivariate random variables in different contexts. Fractional moments are indeed important when the density of the random variable has inverse power-law tails and, consequently, it lacks integer order moments. In this paper, starting from the Mellin transform of the characteristic function and by fractional calculus method we present a new perspective on the statistics of random variables. Intr...
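
    The practical point above, that heavy-tailed densities lack integer-order moments while fractional-order moments remain finite, is easy to see numerically. The sketch below uses a Pareto sample with tail exponent 1.5, so moments of order q < 1.5 exist but the variance does not; the parameters are illustrative only.

      import numpy as np

      rng = np.random.default_rng(8)
      alpha = 1.5
      # Pareto-tailed sample: E[X^q] is finite only for q < alpha
      x = rng.pareto(alpha, size=200_000) + 1.0

      for q in [0.25, 0.5, 0.75, 1.0, 1.25]:
          # Fractional absolute moment E[X^q]; stays usable where q = 2 is not
          print(f"E[X^{q}] ~ {np.mean(x ** q):.3f}")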

  18. Approach of automatic 3D geological mapping: the case of the Kovdor phoscorite-carbonatite complex, NW Russia.

    Science.gov (United States)

    Kalashnikov, A O; Ivanyuk, G Yu; Mikhailova, J A; Sokharev, V A

    2017-07-31

    We have developed an approach for automatic 3D geological mapping based on the conversion of the chemical composition of rocks to mineral composition by logical computation. It allows one to calculate mineral composition from bulk rock chemistry, interpolate the mineral composition in the same way as chemical composition, and, finally, build a 3D geological model. The approach was developed for the Kovdor phoscorite-carbonatite complex containing the Kovdor baddeleyite-apatite-magnetite deposit. We used four bulk rock chemistry analyses: Fe_magn, P2O5, CO2 and SiO2. We used four techniques for the prediction of rock types: calculation of normative mineral compositions (norms), multiple regression, an artificial neural network, and a procedure we developed based on logical evaluation. The latter two performed best. As a result, we distinguished 14 types of phoscorites (forsterite-apatite-magnetite-carbonate rocks), carbonatite and host rocks. The results show good convergence with our petrographic studies of the deposit and with recent manually built maps. The proposed approach can be used as a tool for reconstructing deposit genesis and for preliminary geometallurgical modelling.
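
    As a toy illustration of "conversion of chemical composition to rock type by logical computation" (with invented thresholds, not the Kovdor classification rules), a logical evaluation can be as simple as an ordered cascade of chemical tests:

      def classify_rock(fe_magn, p2o5, co2, sio2):
          # Toy logical evaluation: map bulk chemistry (wt%) to a rock type.
          # All thresholds are hypothetical placeholders for illustration.
          if co2 > 20:
              return "carbonatite"
          if fe_magn > 25 and p2o5 > 5:
              return "apatite-magnetite phoscorite"
          if fe_magn > 25:
              return "magnetite-rich phoscorite"
          if sio2 > 45:
              return "host silicate rock"
          return "forsterite-apatite-carbonate rock"

      # Applied to every interpolated grid node, this yields a 3D rock-type model
      samples = [(30.0, 6.5, 4.0, 10.0), (5.0, 1.0, 28.0, 3.0)]
      print([classify_rock(*s) for s in samples])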

  19. One step versus two step approach for gestational diabetes screening: systematic review and meta-analysis of the randomized trials.

    Science.gov (United States)

    Saccone, Gabriele; Caissutti, Claudia; Khalifeh, Adeeb; Meltzer, Sara; Scifres, Christina; Simhan, Hyagriv N; Kelekci, Sefa; Sevket, Osman; Berghella, Vincenzo

    2017-12-03

    To compare the prevalence of gestational diabetes mellitus (GDM) as well as maternal and neonatal outcomes between the one-step and two-step screening approaches. Electronic databases were searched from their inception until June 2017. We included all randomized controlled trials (RCTs) comparing the one-step with the two-step approach for the screening and diagnosis of GDM. The primary outcome was the incidence of GDM. Three RCTs (n = 2333 participants) were included in the meta-analysis: 910 women were randomized to the one-step approach (75 g, 2 h) and 1423 to the two-step approach. No significant difference in the incidence of GDM was found comparing the one-step versus the two-step approach (8.4 versus 4.3%; relative risk (RR) 1.64, 95%CI 0.77-3.48). Women screened with the one-step approach had a significantly lower risk of preterm birth (PTB) (3.7 versus 7.6%; RR 0.49, 95%CI 0.27-0.88), cesarean delivery (16.3 versus 22.0%; RR 0.74, 95%CI 0.56-0.99), macrosomia (2.9 versus 6.9%; RR 0.43, 95%CI 0.22-0.82), neonatal hypoglycemia (1.7 versus 4.5%; RR 0.38, 95%CI 0.16-0.90), and admission to the neonatal intensive care unit (NICU) (4.4 versus 9.0%; RR 0.49, 95%CI 0.29-0.84), compared with those randomized to screening with the two-step approach. The one-step and two-step approaches were not associated with a significant difference in the incidence of GDM. However, the one-step approach was associated with better maternal and perinatal outcomes.

  20. A Spatiotemporal Indexing Approach for Efficient Processing of Big Array-Based Climate Data with MapReduce

    Science.gov (United States)

    Li, Zhenlong; Hu, Fei; Schnase, John L.; Duffy, Daniel Q.; Lee, Tsengdar; Bowen, Michael K.; Yang, Chaowei

    2016-01-01

    Climate observations and model simulations are producing vast amounts of array-based spatiotemporal data. Efficient processing of these data is essential for assessing global challenges such as climate change, natural disasters, and diseases. This is challenging not only because of the large data volume, but also because of the intrinsic high-dimensional nature of geoscience data. To tackle this challenge, we propose a spatiotemporal indexing approach to efficiently manage and process big climate data with MapReduce in a highly scalable environment. Using this approach, big climate data are directly stored in a Hadoop Distributed File System in their original, native file format. A spatiotemporal index is built to bridge the logical array-based data model and the physical data layout, which enables fast data retrieval when performing spatiotemporal queries. Based on the index, a data-partitioning algorithm is applied to enable MapReduce to achieve high data locality, as well as balancing the workload. The proposed indexing approach is evaluated using the National Aeronautics and Space Administration (NASA) Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. The experimental results show that the index can significantly accelerate querying and processing (10× speedup compared to the baseline test using the same computing cluster), while keeping the index-to-data ratio small (0.0328). The applicability of the indexing approach is demonstrated by a climate anomaly detection deployed on a NASA Hadoop cluster. This approach is also able to support efficient processing of general array-based spatiotemporal data in various geoscience domains without special configuration on a Hadoop cluster.
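
    The essential data structure is a lookup from logical array coordinates to physical byte ranges, so that map tasks read only local, relevant chunks. A minimal in-memory sketch of such an index follows; the key layout, file names and offsets are all invented for illustration.

      from collections import defaultdict

      # Each entry maps a (variable, time step, lat cell, lon cell) key to the
      # physical location of that array chunk: (file, byte offset, length).
      index = defaultdict(list)

      def register(var, t, lat_cell, lon_cell, path, offset, nbytes):
          index[(var, t, lat_cell, lon_cell)].append((path, offset, nbytes))

      def query(var, t_range, lat_range, lon_range):
          # Return chunk locations intersecting a spatiotemporal bounding box
          hits = []
          for t in range(*t_range):
              for la in range(*lat_range):
                  for lo in range(*lon_range):
                      hits.extend(index.get((var, t, la, lo), []))
          return hits

      register("T2M", 0, 3, 7, "merra_1979.nc4", 8192, 65536)
      print(query("T2M", (0, 2), (3, 4), (7, 8)))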

  1. A Bayesian meta-analytic approach for safety signal detection in randomized clinical trials.

    Science.gov (United States)

    Odani, Motoi; Fukimbara, Satoru; Sato, Tosiya

    2017-04-01

    Meta-analyses are frequently performed on adverse event data and are primarily used for improving statistical power to detect safety signals. However, in the evaluation of drug safety for New Drug Applications, simple pooling of adverse event data from multiple clinical trials is still commonly used. We sought to propose a new Bayesian hierarchical meta-analytic approach based on consideration of a hierarchical structure of reported individual adverse event data from multiple randomized clinical trials. To develop our meta-analysis model, we extended an existing three-stage Bayesian hierarchical model by including an additional stage of the clinical trial level in the hierarchical model; this generated a four-stage Bayesian hierarchical model. We applied the proposed Bayesian meta-analysis models to published adverse event data from three premarketing randomized clinical trials of tadalafil and to a simulation study motivated by the case example to evaluate the characteristics of three alternative models. Comparison of the results from the Bayesian meta-analysis model with those from Fisher's exact test after simple pooling showed that 6 out of 10 adverse events were the same within a top 10 ranking of individual adverse events with regard to association with treatment. However, more individual adverse events were detected in the Bayesian meta-analysis model than in Fisher's exact test under the body system "Musculoskeletal and connective tissue disorders." Moreover, comparison of the overall trend of estimates between the Bayesian model and the standard approach (odds ratios after simple pooling methods) revealed that the posterior median odds ratios for the Bayesian model for most adverse events shrank toward values for no association. Based on the simulation results, the Bayesian meta-analysis model could balance the false detection rate and power to a better extent than Fisher's exact test. For example, when the threshold value of the posterior probability for

  2. A pattern-mixture model approach for handling missing continuous outcome data in longitudinal cluster randomized trials.

    Science.gov (United States)

    Fiero, Mallorie H; Hsu, Chiu-Hsieh; Bell, Melanie L

    2017-11-20

    We extend the pattern-mixture approach to handle missing continuous outcome data in longitudinal cluster randomized trials, which randomize groups of individuals to treatment arms, rather than the individuals themselves. Individuals who drop out at the same time point are grouped into the same dropout pattern. We approach extrapolation of the pattern-mixture model by applying multilevel multiple imputation, which imputes missing values while appropriately accounting for the hierarchical data structure found in cluster randomized trials. To assess parameters of interest under various missing data assumptions, imputed values are multiplied by a sensitivity parameter, k, which increases or decreases imputed values. Using simulated data, we show that estimates of parameters of interest can vary widely under differing missing data assumptions. We conduct a sensitivity analysis using real data from a cluster randomized trial by increasing k until the treatment effect inference changes. By performing a sensitivity analysis for missing data, researchers can assess whether certain missing data assumptions are reasonable for their cluster randomized trial. Copyright © 2017 John Wiley & Sons, Ltd.
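
    The sensitivity-parameter idea is easy to demonstrate in miniature: impute the missing outcomes, multiply the imputed values by k over a grid, and watch when the estimated treatment effect changes. The sketch below replaces multilevel multiple imputation with a crude single arm-mean imputation, so it illustrates only the role of k, not the paper's full procedure; all data are synthetic.

      import numpy as np

      rng = np.random.default_rng(9)
      n = 200
      arm = rng.integers(0, 2, n)                 # 0 = control, 1 = treatment
      y = 1.0 * arm + rng.normal(0, 1, n)         # true treatment effect = 1.0
      missing = rng.random(n) < 0.3               # 30% dropout

      y_imp = y.copy()
      # Stand-in for multiple imputation: impute dropouts from the observed
      # mean of their own arm (the real method also models clustering).
      for a in (0, 1):
          m = missing & (arm == a)
          y_imp[m] = y[(~missing) & (arm == a)].mean()

      # Multiply imputed values by k; the k at which inference flips is the
      # tipping point of the sensitivity analysis.
      for k in [0.6, 0.8, 1.0, 1.2, 1.4]:
          y_k = y_imp.copy()
          y_k[missing] *= k
          effect = y_k[arm == 1].mean() - y_k[arm == 0].mean()
          print(f"k = {k:.1f}: estimated treatment effect = {effect:.2f}")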

  3. A random generation approach to pattern library creation for full chip lithographic simulation

    Science.gov (United States)

    Zou, Elain; Hong, Sid; Liu, Limei; Huang, Lucas; Yang, Legender; Kabeel, Aliaa; Madkour, Kareem; ElManhawy, Wael; Kwan, Joe; Du, Chunshan; Hu, Xinyi; Wan, Qijian; Zhang, Recoo

    2017-04-01

    As technology advances, the need for running lithographic (litho) checking for early detection of hotspots before tapeout has become essential. This process is important at all levels—from designing standard cells and small blocks to large intellectual property (IP) and full chip layouts. Litho simulation provides high accuracy for detecting printability issues due to problematic geometries, but it has the disadvantage of slow performance on large designs and blocks [1]. Foundries have found a good compromise solution for running litho simulation on full chips by filtering out potential candidate hotspot patterns using pattern matching (PM), and then performing simulation on the matched locations. The challenge has always been how to easily create a PM library of candidate patterns that provides both comprehensive coverage for litho problems and fast runtime performance. This paper presents a new strategy for generating candidate real design patterns through a random generation approach using a layout schema generator (LSG) utility. The output patterns from the LSG are simulated, and then classified by a scoring mechanism that categorizes patterns according to the severity of the hotspots, probability of their presence in the design, and the likelihood of the pattern causing a hotspot. The scoring output helps to filter out the yield problematic patterns that should be removed from any standard cell design, and also to define potential problematic patterns that must be simulated within a bigger context to decide whether or not they represent an actual hotspot. This flow is demonstrated on SMIC 14nm technology, creating a candidate hotspot pattern library that can be used in full chip simulation with very high coverage and robust performance.

  4. Modeling eye movements in visual agnosia with a saliency map approach: bottom-up guidance or top-down strategy?

    Science.gov (United States)

    Foulsham, Tom; Barton, Jason J S; Kingstone, Alan; Dewhurst, Richard; Underwood, Geoffrey

    2011-08-01

    Two recent papers (Foulsham, Barton, Kingstone, Dewhurst, & Underwood, 2009; Mannan, Kennard, & Husain, 2009) report that neuropsychological patients with a profound object recognition problem (visual agnosic subjects) show differences from healthy observers in the way their eye movements are controlled when looking at images. The interpretation of these papers is that eye movements can be modeled as the selection of points on a saliency map, and that agnosic subjects show an increased reliance on visual saliency, i.e., brightness and contrast in low-level stimulus features. Here we review this approach and present new data from our own experiments with an agnosic patient that quantify the relationship between saliency and fixation location. In addition, we consider whether the perceptual difficulties of individual patients might be modeled by selectively weighting the different features involved in a saliency map. Our data indicate that saliency is not always a good predictor of fixation in agnosia: even for our agnosic subject, as for normal observers, the saliency-fixation relationship varied as a function of the task. This means that top-down processes still have a significant effect on the earliest stages of scanning in the setting of visual agnosia, indicating severe limitations for the saliency map model. Top-down, active strategies, which are the hallmark of our human visual system, play a vital role in eye movement control, whether we know what we are looking at or not. Copyright © 2011 Elsevier Ltd. All rights reserved.
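
    Quantifying a saliency-fixation relationship typically means comparing saliency at fixated locations against saliency at control locations, often summarized as an AUC. The sketch below does this with a crude difference-of-Gaussians saliency map and random stand-in fixations; it is a generic illustration, not the specific model or data of the paper.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(10)
      img = rng.random((256, 256))                # stand-in grayscale image

      # Crude center-surround saliency: difference of Gaussians on intensity
      saliency = np.abs(gaussian_filter(img, 2) - gaussian_filter(img, 16))
      saliency /= saliency.max()

      # Saliency at fixated locations versus random control locations
      fix = rng.integers(0, 256, size=(50, 2))    # stand-in fixation coordinates
      ctrl = rng.integers(0, 256, size=(5000, 2))
      s_fix = saliency[fix[:, 0], fix[:, 1]]
      s_ctrl = saliency[ctrl[:, 0], ctrl[:, 1]]

      # AUC: probability a fixated point is more salient than a control point
      auc = (s_fix[:, None] > s_ctrl[None, :]).mean()
      print(f"saliency-fixation AUC = {auc:.2f}   (0.5 = chance)")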

  5. A novel continuous colour mapping approach for visualization of facial skin hydration and transepidermal water loss for four ethnic groups.

    Science.gov (United States)

    Voegeli, R; Rawlings, A V; Seroul, P; Summers, B

    2015-12-01

    The aim of this exploratory study was to develop a novel colour mapping approach to visualize and interpret the complexity of the facial skin hydration and barrier properties of four ethnic groups (Caucasians, Indians, Chinese and Black Africans) living in Pretoria, South Africa. We measured transepidermal water loss (TEWL) and skin capacitance on 30 pre-defined sites on the forehead, cheek, jaw and eye areas of sixteen women (four per ethnic group) and took digital images of their faces. Continuous colour maps were generated by interpolating between each measured value and superimposing the values on the digital images. The complexity of facial skin hydration and skin barrier properties is revealed by these measurements and visualized by the continuous colour maps of the digital images. Overall, the Caucasian subjects had the best barrier properties, followed by the Black African subjects, Chinese subjects and Indian subjects. Nevertheless, the two more darkly pigmented ethnic groups had superior skin hydration properties. Subtle differences were seen when examining the different facial sites. There exist remarkable skin capacitance and TEWL gradients within short distances on selected areas of the face. These gradients are distinctive in the different ethnic groups. In contrast to other reports, we found that darkly pigmented skin does not always have a superior barrier function, and differences in skin hydration values are complex on the different parts of the face among the different ethnic groups. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
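
    The map-generation step, interpolating 30 point measurements into a continuous overlay, can be sketched with standard scattered-data interpolation; the site coordinates and TEWL values below are synthetic placeholders.

      import numpy as np
      from scipy.interpolate import griddata

      rng = np.random.default_rng(11)

      # 30 measurement sites in (x, y) image coordinates, with TEWL values
      sites = rng.random((30, 2)) * [400, 500]    # stand-in site positions
      tewl = rng.normal(12, 3, 30)                # stand-in TEWL values (g/m2/h)

      # Interpolate between sites onto a dense grid to form the continuous map
      gx, gy = np.meshgrid(np.arange(400), np.arange(500))
      tewl_map = griddata(sites, tewl, (gx, gy), method="cubic")

      # tewl_map can then be superimposed on the facial photograph, e.g.:
      # plt.imshow(photo); plt.imshow(tewl_map, alpha=0.5, cmap="jet")
      print(np.nanmin(tewl_map), np.nanmax(tewl_map))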

  6. A New Perspective on Formation of Haze-Fog: The Fuzzy Cognitive Map and Its Approaches to Data Mining

    Directory of Open Access Journals (Sweden)

    Zhen Peng

    2017-02-01

    Full Text Available Haze-fog has seriously hindered the sustainable development of the ecological environment and caused great harm to the physical and mental health of residents in China. It is therefore important to probe the formation of haze-fog for its early warning and prevention. The formation of haze-fog is, in fact, a fuzzy nonlinear process, and it is so complex that it is difficult to simulate its dynamic evolution using traditional methods, mainly because of their lack of consideration of nonlinear relationships. It is, therefore, essential to explore new perspectives on the formation of haze-fog. In this work, previous research on haze-fog formation is summarized first. Second, a new perspective is proposed on the application of a fuzzy cognitive map (FCM) to the formation of haze-fog. Third, a data mining method based on a genetic algorithm is used to discover the causality values of the FCM for haze-fog formation. Finally, simulation results are obtained through an experiment using the fuzzy cognitive map and its data mining method for the formation of haze-fog. The validity of this approach is determined by the definition of a simple rule and by Kappa values. Thus, this research not only provides a new idea for modeling the formation of haze-fog with FCMs, but also demonstrates an effective FCM data mining method for capturing the nonlinear dynamics of haze-fog formation.
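
    Once the causal weight matrix of an FCM is known (here invented; in the paper it is mined from data with a genetic algorithm), inference is a simple iteration: each concept's activation is the squashed sum of its own state and the weighted states of its causes. A minimal sketch with hypothetical haze-fog concepts:

      import numpy as np

      def sigmoid(x, lam=1.0):
          return 1.0 / (1.0 + np.exp(-lam * x))

      # Hypothetical concepts: [emissions, wind speed, humidity, inversion, haze-fog]
      # W[j, i] = causal weight of concept j on concept i (invented values)
      W = np.array([[0.0, 0.0, 0.0,  0.0,  0.8],
                    [0.0, 0.0, 0.0, -0.3, -0.6],
                    [0.0, 0.0, 0.0,  0.0,  0.4],
                    [0.0, 0.0, 0.0,  0.0,  0.7],
                    [0.0, 0.0, 0.0,  0.0,  0.0]])

      A = np.array([0.9, 0.2, 0.7, 0.8, 0.5])   # initial activation levels
      for _ in range(20):                        # iterate toward a fixed point
          A = sigmoid(A + W.T @ A)               # A_i <- f(A_i + sum_j w_ji A_j)
      print("steady-state haze-fog activation:", round(A[-1], 3))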

  7. Random Matrix Theoretic Approaches to Sensor Fusion for Sensing and Surveillance in Highly Cluttered Environments

    Science.gov (United States)

    2015-08-24

    ...Mean Squared Error (MSE) tracking performance for direction-of-arrival estimation in the presence of noise and missing data; see Fig. 5. 6) We have ... scatter in random directions, thereby hindering its passage. As the thickness of a slab of highly scattering random medium increases, this effect...

  8. Random-matrix-theory approach to mesoscopic fluctuations of heat current

    Science.gov (United States)

    Schmidt, Martin; Kottos, Tsampikos; Shapiro, Boris

    2013-08-01

    We consider an ensemble of fully connected networks of N oscillators coupled harmonically with random springs and show, using random-matrix-theory considerations, that both the average phonon heat current and its variance are scale invariant and take universal values in the large N limit. These anomalous mesoscopic fluctuations are the hallmark of strong correlations between normal modes.

  9. A Unified Approach to Power Calculation and Sample Size Determination for Random Regression Models

    Science.gov (United States)

    Shieh, Gwowen

    2007-01-01

    The underlying statistical models for multiple regression analysis are typically attributed to two types of modeling: fixed and random. The procedures for calculating power and sample size under the fixed regression models are well known. However, the literature on random regression models is limited and has been confined to the case of all…

  10. Local lattice relaxations in random metallic alloys: Effective tetrahedron model and supercell approach

    DEFF Research Database (Denmark)

    Ruban, Andrei; Simak, S.I.; Shallcross, S.

    2003-01-01

    We present a simple effective tetrahedron model for local lattice relaxation effects in random metallic alloys on simple primitive lattices. A comparison with direct ab initio calculations for supercells representing random Ni0.50Pt0.50 and Cu0.25Au0.75 alloys as well as the dilute limit of Au-ri...

  11. Monitoring the Brazilian pasturelands: A new mapping approach based on the Landsat 8 spectral and temporal domains

    Science.gov (United States)

    Parente, Leandro; Ferreira, Laerte; Faria, Adriano; Nogueira, Sérgio; Araújo, Fernando; Teixeira, Lana; Hagen, Stephen

    2017-10-01

    In a world marked by rapid population expansion and an unprecedented increase in per capita income and consumption, sustainable food production is certainly the most pressing issue affecting mankind. Within this context, the Brazilian pasturelands, the main land-use form in the country, constitute a particularly important asset as a land reserve, which, through improved land-use strategies and intensification, can meet food security goals and contribute to the mitigation of greenhouse gas emissions. In this study, we utilized the entire set of Landsat 8 images available for Brazil in 2015, from which dozens of seasonal metrics were derived, to produce, through objective criteria and automated classification strategies, a new pasture map for the country. Based on the Random Forest algorithm, individually modelled and applied to each of the 380 Landsat scenes covering the Brazilian territory, our map showed an overall accuracy of 87%. Another result of this study was a thorough spatial and temporal assessment of Landsat 8 data availability in Brazil, which indicated that about 80% of the country had 12 or fewer observations free of clouds or cloud shadows in 2015.
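
    A per-scene Random Forest over seasonal metrics, as described above, maps directly onto the scikit-learn API. The sketch below trains one such model on synthetic per-pixel metrics; the feature meanings and the pasture labeling rule are invented for illustration.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(12)

      # Hypothetical per-pixel seasonal metrics derived from the Landsat 8 time
      # series (e.g., NDVI percentiles, band medians for wet and dry seasons)
      X = rng.random((5000, 12))
      y = (X[:, 0] + 0.5 * X[:, 3]
           + rng.normal(0, 0.2, 5000) > 0.9).astype(int)   # 1 = pasture (synthetic)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      rf = RandomForestClassifier(n_estimators=200, random_state=0)
      rf.fit(X_tr, y_tr)               # in the paper, one model per Landsat scene
      print("accuracy:", rf.score(X_te, y_te))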

  12. Mapping suitability of rice production systems for mitigation: Strategic approach for prioritizing improved irrigation management across scales

    Science.gov (United States)

    Wassmann, Reiner; Sander, Bjoern Ole

    2016-04-01

    After the successful conclusion of COP21 in Paris, many developing countries are now embracing the task of reducing emissions with much more vigor than previously. In many countries of South and South-East Asia, the agriculture sector constitutes a vast share of the national GHG budget, which can mainly be attributed to methane emissions from flooded rice production. Thus, rice growing countries are now looking for tangible and easily accessible information on how to reduce emissions from rice production in an efficient manner. Given present and future food demand, mitigation options will have to comply with the aim of increasing productivity. At the same time, limited financial resources demand strategic planning of potential mitigation projects based on cost-benefit ratios. At this point, the most promising approach for mitigating methane emissions from rice is an irrigation technique called Alternate Wetting and Drying (AWD). AWD was initially developed for saving water and, subsequently, represents an adaptation strategy in its own right by coping with less rainfall. Moreover, AWD also reduces methane emissions by 30-70%. However, AWD is not universally suitable. It is attractive to farmers who have to pump water and may save fuel under AWD, but it offers limited incentives in situations where there is no real pressing water scarcity. Thus, planning for AWD adoption at larger scales, e.g. for country-wide programs, should be based on a systematic prioritization of target environments. This presentation encompasses a new methodology for mapping the suitability of water-saving in rice production - as a means for planning adaptation and mitigation programs - alongside preliminary results. The latter comprise three new GIS maps on the climate-driven suitability of AWD in major rice growing countries (Philippines, Vietnam, Bangladesh). These maps have been derived from high-resolution data of the areal and temporal extent of rice production that are now

  13. Comparing Kriging and Regression Approaches for Mapping Soil Clay Content in a diverse Danish Landscape

    DEFF Research Database (Denmark)

    Adhikari, Kabindra; Bou Kheir, Rania; Greve, Mette Balslev

    2013-01-01

    Information on the spatial variability of soil texture including soil clay content in a landscape is very important for agricultural and environmental use. Different prediction techniques are available to assess and map spatial variability of soil properties, but selecting the most suitable......, and residual prediction deviation (RPD) for comparison. Among all the prediction methods, the highest R2 (i.e., 0.74) and lowest RMSE (i.e., 0.28) were associated with the RKrr model, which also had an RPD value of 2.2, confirming RKrr as the best prediction method. Stratification of samples slightly improved...

  14. Hyperspectral VNIR and photogrammetric data fusion approach for urban luminance map generation

    Directory of Open Access Journals (Sweden)

    L. Pipia

    2016-12-01

    Full Text Available This paper puts forward a methodology for the generation of high-resolution luminance maps from simultaneous hyperspectral VNIR and photogrammetric imagery. The integration of hyperspectral radiance at ground level, properly weighted by photopic coefficients, plus a sensor fusion strategy, provides for the first time a quantitative description of the luminous flux at high spatial resolution and with multi-angle geometry. Accordingly, this methodology makes it possible to follow up any strategic policy aimed at improving urban illumination management and to quantify its effects in terms of energy efficiency.

  15. Comparative study between computed tomography guided superior hypogastric plexus block and the classic posterior approach: A prospective randomized study

    Directory of Open Access Journals (Sweden)

    Ayman A Ghoneim

    2014-01-01

    Full Text Available Context: The classic posterior approach to superior hypogastric plexus block (SHPB) is sometimes hindered by the iliac crest or a prominent transverse process of L5. The computed tomography (CT)-guided anterior approach might overcome these difficulties. Aims: This prospective, comparative, randomized study was aimed at comparing the CT-guided anterior approach versus the classic posterior approach. Settings and Design: Controlled randomized study. Materials and Methods: A total of 30 patients with chronic pelvic cancer pain were randomized into either the classic or the CT group, where the classic posterior approach or the CT-guided anterior approach was done, respectively. Visual analog score, daily analgesic morphine consumption and patient satisfaction were assessed just before the procedure, then after 24 h, 1 week and monthly for 2 months after the procedure. Duration of the procedure was also recorded. Adverse effects associated with the procedure were closely observed and recorded. Statistical Analysis Used: Student's t-test was used for comparison between groups. Results: Visual analog scale and morphine consumption decreased significantly in both groups at the measured times after the block compared with the baseline in the same group, with no significant difference between the groups. The procedure was carried out in a significantly shorter duration in the CT group than in the classic group. The mean patient satisfaction scale increased significantly in both groups at the measured times after the block compared with the baseline in the same group. The patients in the CT group were significantly more satisfied than those in the classic group from day one after the procedure until the end of the study. Conclusions: The CT-guided approach for SHPB is easier, faster, safer and more effective, with fewer side-effects than the classic approach.

  16. Mapping sequences by parts

    Directory of Open Access Journals (Sweden)

    Guziolowski Carito

    2007-09-01

    Full Text Available Abstract Background: We present the N-map method, a pairwise and asymmetrical approach which allows us to compare sequences by taking into account evolutionary events that produce shuffled, reversed or repeated elements. Basically, the optimal N-map of a sequence s over a sequence t is the best way of partitioning the first sequence into N parts and placing them, possibly complementary reversed, over the second sequence in order to maximize the sum of their gapless alignment scores. Results: We introduce an algorithm computing an optimal N-map with time complexity O(|s| × |t| × N) using O(|s| × |t| × N) memory space. Among all the numbers of parts taken in a reasonable range, we select the value N for which the optimal N-map has the most significant score. To evaluate this significance, we study the empirical distributions of the scores of optimal N-maps and show that they can be approximated by normal distributions with reasonable accuracy. We test the functionality of the approach over random sequences on which we apply artificial evolutionary events. Practical Application: The method is illustrated with four case studies of pairs of sequences involving non-standard evolutionary events.
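
    From the definition above, a brute-force reference implementation is straightforward: a dynamic program over prefixes of s and part counts, where each part is scored by its best gapless placement on t, forward or reverse-complemented. The sketch below is deliberately naive (it re-scores every part from scratch) and exists only to make the definition concrete; the paper's algorithm achieves O(|s| × |t| × N). A DNA alphabet and unit match/mismatch scores are assumed.

      def gapless_best(sub, t, match=1, mismatch=-1):
          # Best gapless alignment score of `sub` at any offset of `t`,
          # forward or reverse-complemented (DNA alphabet assumed)
          comp = str.maketrans("ACGT", "TGCA")
          best = float("-inf")
          for cand in (sub, sub.translate(comp)[::-1]):
              for off in range(len(t) - len(cand) + 1):
                  s = sum(match if a == b else mismatch
                          for a, b in zip(cand, t[off:off + len(cand)]))
                  best = max(best, s)
          return best

      def n_map(s, t, N):
          # D[i][n]: best score partitioning s[:i] into n independently placed parts
          INF = float("-inf")
          D = [[INF] * (N + 1) for _ in range(len(s) + 1)]
          D[0][0] = 0.0
          for i in range(1, len(s) + 1):
              for n in range(1, min(i, N) + 1):
                  D[i][n] = max(D[j][n - 1] + gapless_best(s[j:i], t)
                                for j in range(n - 1, i))
          return D[len(s)][N]

      s, t = "ACGTTTAC", "GTAAACGT"    # toy sequences
      print([n_map(s, t, N) for N in (1, 2, 3)])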

  17. Designing a workplace return-to-work program for occupational low back pain: an intervention mapping approach.

    Science.gov (United States)

    Ammendolia, Carlo; Cassidy, David; Steensta, Ivan; Soklaridis, Sophie; Boyle, Eleanor; Eng, Stephanie; Howard, Hamer; Bhupinder, Bains; Côté, Pierre

    2009-06-09

    Despite over 2 decades of research, the ability to prevent work-related low back pain (LBP) and disability remains elusive. Recent research suggests that interventions that are focused at the workplace and incorporate the principles of participatory ergonomics and return-to-work (RTW) coordination can improve RTW and reduce disability following a work-related back injury. Workplace interventions or programs to improve RTW are difficult to design and implement given the various individuals and environments involved, each with their own unique circumstances. Intervention mapping provides a framework for designing and implementing complex interventions or programs. The objective of this study is to design a best-evidence RTW program for occupational LBP tailored to the Ontario setting using an intervention mapping approach. We used a qualitative synthesis based on the intervention mapping methodology. Best evidence from systematic reviews, practice guidelines and key articles on the prognosis and management of LBP and improving RTW was combined with theoretical models for managing LBP and changing behaviour. This was then systematically operationalized into a RTW program using consensus among experts and stakeholders. The RTW program was further refined following feedback from nine focus groups with various stakeholders. A detailed five-step RTW program was developed. The key features of the program include: having trained personnel coordinate the RTW process, identifying and ranking barriers and solutions to RTW from the perspective of all important stakeholders, mediating practical solutions at the workplace and empowering the injured worker in RTW decision-making. Intervention mapping provided a useful framework to develop a comprehensive RTW program tailored to the Ontario setting.

  18. Designing a workplace return-to-work program for occupational low back pain: an intervention mapping approach

    Directory of Open Access Journals (Sweden)

    Ammendolia Carlo

    2009-06-01

    Full Text Available Abstract Background Despite over 2 decades of research, the ability to prevent work-related low back pain (LBP) and disability remains elusive. Recent research suggests that interventions that are focused at the workplace and incorporate the principles of participatory ergonomics and return-to-work (RTW) coordination can improve RTW and reduce disability following a work-related back injury. Workplace interventions or programs to improve RTW are difficult to design and implement given the various individuals and environments involved, each with their own unique circumstances. Intervention mapping provides a framework for designing and implementing complex interventions or programs. The objective of this study is to design a best-evidence RTW program for occupational LBP tailored to the Ontario setting using an intervention mapping approach. Methods We used a qualitative synthesis based on the intervention mapping methodology. Best evidence from systematic reviews, practice guidelines and key articles on the prognosis and management of LBP and improving RTW was combined with theoretical models for managing LBP and changing behaviour. This was then systematically operationalized into an RTW program using consensus among experts and stakeholders. The RTW program was further refined following feedback from nine focus groups with various stakeholders. Results A detailed five-step RTW program was developed. The key features of the program include: having trained personnel coordinate the RTW process, identifying and ranking barriers and solutions to RTW from the perspective of all important stakeholders, mediating practical solutions at the workplace, and empowering the injured worker in RTW decision-making. Conclusion Intervention mapping provided a useful framework to develop a comprehensive RTW program tailored to the Ontario setting.

  19. Progress in national-scale landslide susceptibility mapping in Romania using a combined statistical-heuristical approach

    Science.gov (United States)

    Bălteanu, Dan; Micu, Mihai; Malet, Jean-Philippe; Jurchescu, Marta; Sima, Mihaela; Kucsicsa, Gheorghe; Dumitrică, Cristina; Petrea, Dănuţ; Mărgărint, Ciprian; Bilaşco, Ştefan; Văcăreanu, Radu; Georgescu, Sever; Senzaconi, Francisc

    2017-04-01

    Landslide processes represent a very widespread geohazard in Romania, affecting mainly the hilly and plateau regions as well as the mountain sectors developed on flysch formations. Two main projects provided the framework for improving the existing national landslide susceptibility map (Bălteanu et al. 2010): the ELSUS (Pan-European and nation-wide landslide susceptibility assessment, EC-CERG) and the RO-RISK (Disaster Risk Evaluation at National Level, ESF-POCA) projects. The latter, a flagship project aiming at strengthening risk prevention and management in Romania, focused on a national-level evaluation of the main risks in the country, including landslides. The strategy for modeling landslide susceptibility was designed based on the experience gained from continental and national level assessments conducted in the frame of the International Programme on Landslides (IPL) project IPL-162, the European Landslides Expert Group - JRC and the ELSUS project. The newly proposed landslide susceptibility model used as input a reduced set of landslide conditioning factor maps available at scales of 1:100,000 - 1:200,000 and consisting of lithology, slope angle and land cover. The input data was further differentiated for specific natural environments, defined here as morpho-structural units, in order to incorporate differences induced by elevation (vertical climatic zonation), morpho-structure as well as neotectonic features. In order to best discern the specific landslide conditioning elements, the analysis has been carried out for one single process category, namely slides. The existence of a landslide inventory covering the whole country's territory (~30,000 records, Micu et al. 2014), although affected by incompleteness and lack of homogeneity, allowed for the application of a semi-quantitative, mixed statistical-heuristical approach, which has the advantage of combining the objectivity of statistics with expert knowledge in calibrating class and factor weights.
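
    As a concrete illustration of the mixed statistical-heuristical idea, the sketch below computes frequency-ratio class weights from a toy landslide inventory and combines them with expert-assigned factor weights into a susceptibility index. All class names, weights and data are invented for illustration; they are not values from the Romanian model.

        from collections import Counter

        def frequency_ratios(cells, landslide_cells, factor):
            # Share of a class among landslide cells divided by its share overall.
            total = Counter(c[factor] for c in cells)
            slides = Counter(c[factor] for c in landslide_cells)
            n_tot, n_sld = len(cells), len(landslide_cells)
            return {cls: (slides.get(cls, 0) / n_sld) / (count / n_tot)
                    for cls, count in total.items()}

        def susceptibility(cell, ratios, expert_weights):
            # Weighted sum of class frequency ratios, one term per factor.
            return sum(w * ratios[f].get(cell[f], 0.0)
                       for f, w in expert_weights.items())

        # Toy grid: each cell records its class for the three conditioning factors.
        cells = [
            {"lithology": "flysch", "slope": "steep", "landcover": "pasture"},
            {"lithology": "flysch", "slope": "gentle", "landcover": "forest"},
            {"lithology": "limestone", "slope": "steep", "landcover": "forest"},
            {"lithology": "limestone", "slope": "gentle", "landcover": "arable"},
        ]
        landslides = cells[:1]  # toy inventory: a landslide observed in the first cell

        ratios = {f: frequency_ratios(cells, landslides, f)
                  for f in ("lithology", "slope", "landcover")}
        weights = {"lithology": 0.4, "slope": 0.4, "landcover": 0.2}  # expert choice
        print([round(susceptibility(c, ratios, weights), 2) for c in cells])
        # -> [2.4, 0.8, 0.8, 0.0]: highest where flysch, steep slope and pasture coincide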

  20. Assessment of the presumed mapping function approach for the stationary laminar flamelet modelling of reacting double scalar mixing layers

    Science.gov (United States)

    El Sayed, Ahmad; Mortensen, Mikael; Wen, John Z.

    2014-09-01

    This paper assesses the Presumed Mapping Function (PMF) approach in the context of the Stationary Laminar Flamelet Modelling (SLFM) of a reacting Double Scalar Mixing Layer (DSML). Starting from a prescribed Gaussian reference field, the PMF approach provides a presumed Probability Density Function (PDF) for the mixture fraction that is subsequently employed to close the Conditional Scalar Dissipation Rate (CSDR) upon doubly-integrating the homogeneous PDF transport equation. The PMF approach is unique in its ability to yield PDF and CSDR distributions that capture the effect of multiple fuel injections of different composition. This distinct feature overcomes the shortcomings of the classical SLFM closures (the β-distribution for the PDF and the counterflow solution for the CSDR). The current study analyses the impact of the binary (two-stream) and trinary (three-stream) PMF approaches on the structure of laminar flamelets in a DSML formed by the mixing of a fuel stream and an oxidiser stream separated by a pilot. The conditions of a partially-premixed methane/air piloted jet flame are considered. A parametric assessment is performed by varying the local mixing statistics and the findings are compared to those of the classical SLFM approach. Further, the influence of the PMF approach on flamelet extinction and transport by means of differential diffusion is thoroughly investigated. It is shown that the trinary PMF approach captures the influence of the pilot stream as it is capable of yielding bimodal CSDR and trimodal PDF distributions. It is further demonstrated that, when the influence of the pilot is significant, flamelets generated using the trinary CSDR closure extinguish at higher strain levels compared to flamelets generated using the binary and counterflow closures. Lastly, it is shown that the trinary PMF approach can be critical for accurate SLFM computations of DSMLs when differential diffusion effects are important.
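
    The change-of-variables step at the heart of a presumed mapping function can be illustrated in a few lines: a Gaussian reference variable is mapped monotonically onto the mixture fraction, and the mixture-fraction PDF follows from dividing the Gaussian density by the Jacobian of the mapping. The Python sketch below uses a generic error-function mapping with placeholder parameters; it is not the calibrated binary or trinary PMF closure of the paper, only the generic mechanism.

        import numpy as np
        from scipy.optimize import brentq
        from scipy.special import erf

        def z_of_eta(eta, a, b):
            # Monotonic placeholder mapping from the Gaussian reference eta to Z.
            return 0.5 * (1.0 + erf((eta - a) / (np.sqrt(2.0) * b)))

        def pdf_z(z, a, b):
            # p_Z(z) = p_eta(eta(z)) / |dZ/deta|, with eta ~ N(0, 1).
            eta = brentq(lambda e: z_of_eta(e, a, b) - z, -8.0, 8.0)
            p_eta = np.exp(-0.5 * eta**2) / np.sqrt(2.0 * np.pi)
            dz_deta = np.exp(-((eta - a) ** 2) / (2.0 * b**2)) / (b * np.sqrt(2.0 * np.pi))
            return p_eta / dz_deta

        # Evaluate the presumed mixture-fraction PDF at a few interior points.
        print([round(pdf_z(z, a=0.3, b=0.8), 3) for z in np.linspace(0.05, 0.95, 5)])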

  1. Weight gain prevention in young adults: design of the study of novel approaches to weight gain prevention (SNAP) randomized controlled trial

    National Research Council Canada - National Science Library

    Wing, Rena R; Tate, Deborah; Espeland, Mark; Gorin, Amy; LaRose, Jessica Gokee; Robichaud, Erica Ferguson; Erickson, Karen; Perdue, Letitia; Bahnson, Judy; Lewis, Cora E

    2013-01-01

    ... (Study of Novel Approaches to Weight Gain Prevention) is an NIH-funded randomized clinical trial examining the efficacy of two novel self-regulation approaches to weight gain prevention in young adults compared to a minimal treatment control...

  2. Deriving pathway maps from automated text analysis using a grammar-based approach.

    Science.gov (United States)

    Olsson, Björn; Gawronska, Barbara; Erlendsson, Björn

    2006-04-01

    We demonstrate how automated text analysis can be used to support the large-scale analysis of metabolic and regulatory pathways by deriving pathway maps from textual descriptions found in the scientific literature. The main assumption is that correct syntactic analysis combined with domain-specific heuristics provides a good basis for relation extraction. Our method uses an algorithm that searches through the syntactic trees produced by a parser based on a Referent Grammar formalism, identifies relations mentioned in the sentence, and classifies them with respect to their semantic class and epistemic status (facts, counterfactuals, hypotheses). The semantic categories used in the classification are based on the relation set used in KEGG (Kyoto Encyclopedia of Genes and Genomes), so that pathway maps using KEGG notation can be automatically generated. We present the current version of the relation extraction algorithm and an evaluation based on a corpus of abstracts obtained from PubMed. The results indicate that the method is able to combine reasonable coverage with high accuracy. We found that 61% of all sentences were parsed, and 97% of the parse trees were judged to be correct. The extraction algorithm was tested on a sample of 300 parse trees and was found to produce correct extractions in 90.5% of the cases.
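
    The extraction step can be caricatured in a few lines of Python: walk a toy syntactic tree, pull out subject-verb-object triples whose verb appears in a small domain lexicon, and downgrade the epistemic status when hedging cues occur in the sentence. The tree encoding, verb lexicon and cue list below are invented stand-ins; the actual system uses a full Referent Grammar parser and the KEGG relation set.

        RELATION_VERBS = {"activates": "activation", "inhibits": "inhibition",
                          "phosphorylates": "phosphorylation"}
        HEDGE_CUES = {"may", "might", "could", "suggest", "suggests"}

        def leaves(tree):
            # Yield the terminal strings of a (label, children...) tuple tree.
            for child in tree[1:]:
                if isinstance(child, tuple):
                    yield from leaves(child)
                else:
                    yield child

        def extract_relations(tree):
            # At each sentence node, scan its tokens for lexicon verbs flanked
            # by candidate entity tokens; elsewhere, recurse into subtrees.
            label, *children = tree
            if label == "S":
                tokens = list(leaves(tree))
                status = "hypothesis" if HEDGE_CUES & set(tokens) else "fact"
                return [(tokens[i - 1], RELATION_VERBS[tok], tokens[i + 1], status)
                        for i, tok in enumerate(tokens)
                        if tok in RELATION_VERBS and 0 < i < len(tokens) - 1]
            rels = []
            for child in children:
                if isinstance(child, tuple):
                    rels += extract_relations(child)
            return rels

        sent = ("S", ("NP", "Raf"), ("VP", "phosphorylates", ("NP", "MEK")))
        print(extract_relations(sent))  # [('Raf', 'phosphorylation', 'MEK', 'fact')]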

  3. Evolutionary Feature Selection for Big Data Classification: A MapReduce Approach

    Directory of Open Access Journals (Sweden)

    Daniel Peralta

    2015-01-01

    Full Text Available Nowadays, many disciplines have to deal with big datasets that additionally involve a high number of features. Feature selection methods aim at eliminating noisy, redundant, or irrelevant features that may deteriorate classification performance. However, traditional methods lack the scalability to cope with datasets of millions of instances and to extract successful results in a delimited time. This paper presents a feature selection algorithm based on evolutionary computation that uses the MapReduce paradigm to obtain subsets of features from big datasets. The algorithm decomposes the original dataset into blocks of instances to learn from them in the map phase; then, the reduce phase merges the obtained partial results into a final vector of feature weights, which allows a flexible application of the feature selection procedure using a threshold to determine the selected subset of features. The feature selection method is evaluated using three well-known classifiers (SVM, Logistic Regression, and Naive Bayes) implemented within the Spark framework to address big data problems. In the experiments, datasets of up to 67 million instances and up to 2000 attributes have been managed, showing that this is a suitable framework for evolutionary feature selection, improving both classification accuracy and runtime when dealing with big data problems.
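
    The map/reduce decomposition described above can be sketched with plain Python standing in for Spark: each map task learns a feature-weight vector from its block of instances, the reduce step merges the partial vectors, and a threshold selects the final subset. The per-block learner below is a simple correlation stub rather than the paper's genetic algorithm, so the example stays self-contained; all names, data and the threshold are illustrative.

        def learn_block_weights(block):
            # Map phase stand-in: one weight per feature from a single block.
            # The paper runs an evolutionary search here; this stub scores each
            # feature by |Pearson correlation| with the class label instead.
            xs, ys = zip(*block)
            my = sum(ys) / len(ys)
            weights = []
            for f in range(len(xs[0])):
                col = [x[f] for x in xs]
                mx = sum(col) / len(col)
                cov = sum((a - mx) * (b - my) for a, b in zip(col, ys))
                var = (sum((a - mx) ** 2 for a in col) *
                       sum((b - my) ** 2 for b in ys)) ** 0.5
                weights.append(abs(cov / var) if var else 0.0)
            return weights

        def select_features(blocks, threshold=0.5):
            # Reduce phase: average the partial weight vectors, then threshold.
            partials = [learn_block_weights(b) for b in blocks]
            merged = [sum(ws) / len(partials) for ws in zip(*partials)]
            return [f for f, w in enumerate(merged) if w >= threshold]

        # Two blocks of (features, label): feature 0 tracks the label, feature 1 is noise.
        blocks = [
            [((0, 1), 0), ((1, 1), 1), ((0, 0), 0), ((1, 0), 1)],
            [((1, 0), 1), ((0, 0), 0), ((1, 1), 1), ((0, 1), 0)],
        ]
        print(select_features(blocks))  # -> [0]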

  4. A population genetic approach to mapping neurological disorder genes using deep resequencing.

    Directory of Open Access Journals (Sweden)

    Rachel A Myers

    2011-02-01

    Full Text Available Deep resequencing of functional regions in human genomes is key to identifying potentially causal rare variants for complex disorders. Here, we present the results from a large-sample resequencing (n = 285 patients) study of candidate genes coupled with population genetics and statistical methods to identify rare variants associated with Autism Spectrum Disorder and Schizophrenia. Three genes, MAP1A, GRIN2B, and CACNA1F, were consistently identified by different methods as having a significant excess of rare missense mutations in either one or both disease cohorts. In a broader context, we also found that the overall site frequency spectrum of variation in these cases is best explained by population models of both selection and complex demography rather than neutral models.
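
    The phrase "significant excess of rare missense mutations" describes a burden-style comparison: for a given gene, count rare missense carriers among cases and controls and test the difference. A minimal sketch with invented counts (not data from the study) follows.

        from scipy.stats import fisher_exact

        def rare_variant_burden(case_carriers, n_cases, ctrl_carriers, n_controls):
            # 2x2 table: carriers vs non-carriers of rare missense variants.
            table = [[case_carriers, n_cases - case_carriers],
                     [ctrl_carriers, n_controls - ctrl_carriers]]
            return fisher_exact(table, alternative="greater")  # (odds ratio, p-value)

        # Invented counts for one gene: 14/285 case carriers vs 4/285 controls.
        print(rare_variant_burden(14, 285, 4, 285))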