WorldWideScience

Sample records for random mapping approach

  1. Mapping important nucleotides in the peptidyl transferase centre of 23 S rRNA using a random mutagenesis approach

    DEFF Research Database (Denmark)

    Porse, B T; Garrett, R A

    1995-01-01

    Random mutations were generated in the lower half of the peptidyl transferase loop in domain V of 23 S rRNA from Escherichia coli using a polymerase chain reaction (PCR) approach, a rapid procedure for identifying mutants and a plasmid-based expression system. The effects of 21 single-site mutati...

  2. Pseudo-Random Number Generator Based on Coupled Map Lattices

    Science.gov (United States)

    Lü, Huaping; Wang, Shihong; Hu, Gang

    A one-way coupled chaotic map lattice is used to generate pseudo-random numbers. It is shown that, with suitable cooperative application of both chaotic and conventional approaches, the output of the spatiotemporally chaotic system can readily meet the practical requirements of random numbers, i.e., excellent statistical properties, long periods in computer realizations, and fast generation speed. This pseudo-random number generator can serve as an ideal synchronous and self-synchronizing stream cipher for secure communications.
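    As an illustration of the idea, here is a minimal Python sketch of a one-way coupled logistic-map lattice whose sites are post-processed into 32-bit words; the lattice size, coupling strength, and bit-extraction rule are illustrative choices, not the parameters of Lü, Wang and Hu.

```python
import numpy as np

def cml_prng(n_words, L=8, eps=0.95, warmup=1000):
    """One-way coupled logistic-map lattice turned into 32-bit words.
    Lattice size, coupling and bit extraction are illustrative choices."""
    f = lambda x: 4.0 * x * (1.0 - x)                 # chaotic local map
    x = (0.123456789 + 0.01 * np.arange(L)) % 1.0     # lattice initial state
    out = []
    for n in range(warmup + n_words):
        fx = f(x)
        x = (1.0 - eps) * fx + eps * np.roll(fx, 1)   # site i driven by site i-1
        if n >= warmup:
            # "conventional" post-processing: mix two distant sites into one word
            out.append((int(x[0] * 2**32) ^ int(x[L // 2] * 2**32)) & 0xFFFFFFFF)
    return out

print(cml_prng(4))
```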

  3. ON RANDOM ITERATED FUNCTION SYSTEMS WITH GREYSCALE MAPS

    Directory of Open Access Journals (Sweden)

    Matthew Demers

    2012-05-01

    In the theory of Iterated Function Systems (IFSs), it is known that one can find an IFS with greyscale maps (IFSM) to approximate any target signal or image with arbitrary precision, and a systematic approach for doing so has been described. In this paper, we extend these ideas to the framework of random IFSM operators. We consider the situation where one has many noisy observations of a particular target signal and show that the greyscale map parameters for each individual observation inherit the noise distribution of the observation. We provide illustrative examples.

  4. Random unitary maps for quantum state reconstruction

    International Nuclear Information System (INIS)

    Merkel, Seth T.; Riofrio, Carlos A.; Deutsch, Ivan H.; Flammia, Steven T.

    2010-01-01

    We study the possibility of performing quantum state reconstruction from a measurement record that is obtained as a sequence of expectation values of a Hermitian operator evolving under repeated application of a single random unitary map, U_0. We show that while this single-parameter orbit in operator space is not informationally complete, it can be used to yield surprisingly high-fidelity reconstruction. For a d-dimensional Hilbert space with the initial observable in su(d), the measurement record lacks information about a matrix subspace of dimension ≥ d−2 out of the total dimension d²−1. We determine the conditions on U_0 under which this bound is saturated, and show that they are achieved by almost all pseudorandom unitary matrices. When we further impose the constraint that the physical density matrix must be positive, we obtain even higher fidelity than that predicted from the missing subspace. With prior knowledge that the state is pure, the reconstruction will be perfect (in the limit of vanishing noise); for arbitrary mixed states, the fidelity is over 0.96 even for small d, reaching F > 0.99 for d > 9. We also study the implementation of this protocol based on the relationship between random matrices and quantum chaos. We show that the Floquet operator of the quantum kicked top provides a means of generating the required type of measurement record, with implications for the relationship between quantum chaos and information gain.
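    The measurement record itself is easy to simulate. The sketch below (numpy, with a Haar-random U_0 drawn via QR of a Ginibre matrix) generates the sequence of expectation values of an su(d) observable evolved under repeated application of U_0; the reconstruction step (least squares with a positivity constraint) is omitted, and all dimensions and seeds are illustrative.

```python
import numpy as np

def haar_unitary(d, rng):
    """Haar-random unitary via QR decomposition of a complex Ginibre matrix."""
    z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2.0)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))      # fix column phases

rng = np.random.default_rng(42)
d = 5
U0 = haar_unitary(d, rng)

O = np.diag(np.arange(d)).astype(complex)
O -= (np.trace(O) / d) * np.eye(d)                    # traceless Hermitian: O in su(d)

psi = rng.standard_normal(d) + 1j * rng.standard_normal(d)
psi /= np.linalg.norm(psi)
rho = np.outer(psi, psi.conj())                       # pure initial state

record, Ot = [], O.copy()
for _ in range(3 * d * d):                            # repeated application of U0
    record.append(np.real(np.trace(rho @ Ot)))
    Ot = U0.conj().T @ Ot @ U0                        # Heisenberg-picture evolution
print(np.round(record[:5], 4))
```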

  5. Effective Perron-Frobenius eigenvalue for a correlated random map

    Science.gov (United States)

    Pool, Roman R.; Cáceres, Manuel O.

    2010-09-01

    We investigate the evolution of random positive linear maps with various types of disorder by analytic perturbation theory and direct simulation. Our theoretical results indicate that the statistics of a random linear map can be successfully described at long times by the mean-value vector state. The growth rate can be characterized by an effective Perron-Frobenius eigenvalue that depends strongly on the type of correlation between the elements of the projection matrix. We apply this approach to an age-structured population dynamics model. We show that the asymptotic mean-value vector state characterizes the population growth rate when the age-structured model has random vital parameters. In this case our approach reveals the nontrivial dependence of the effective growth rate on cross correlations. The problem reduces to calculating the smallest positive root of a secular polynomial, which can be obtained perturbatively using a Green's-function diagrammatic technique built with noncommutative cumulants for arbitrary n-point correlations.
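    A direct-simulation counterpart of the effective growth rate is straightforward: multiply random age-structured (Leslie) matrices and track the logarithmic growth of the population vector. The vital-rate distributions below are illustrative placeholders, not the paper's correlated-disorder model.

```python
import numpy as np

def effective_growth_rate(T=20_000, seed=0):
    """Monte Carlo estimate of the asymptotic growth rate of a random
    age-structured (Leslie) model; the paper characterizes this rate by
    an effective Perron-Frobenius eigenvalue. Vital-rate statistics here
    are illustrative placeholders."""
    rng = np.random.default_rng(seed)
    v = np.ones(3)
    log_growth = 0.0
    for _ in range(T):
        f = rng.uniform(0.8, 1.2, size=3)      # random fertilities
        s = rng.uniform(0.4, 0.6, size=2)      # random survival probabilities
        A = np.array([[f[0], f[1], f[2]],
                      [s[0], 0.0,  0.0 ],
                      [0.0,  s[1], 0.0 ]])
        v = A @ v
        n = np.linalg.norm(v)
        log_growth += np.log(n)
        v /= n                                 # renormalize to avoid overflow
    return np.exp(log_growth / T)              # effective eigenvalue

print(effective_growth_rate())
```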

  6. Pseudo-random bit generator based on Chebyshev map

    Science.gov (United States)

    Stoyanov, B. P.

    2013-10-01

    In this paper, we study a pseudo-random bit generator based on two Chebyshev polynomial maps. The novel derivative algorithm shows excellent statistical properties, established by a battery of statistical tests.
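    For concreteness, a minimal sketch of a bit generator driven by two Chebyshev maps T_p and T_q follows; the compare-and-output combination rule is an illustrative choice and not necessarily Stoyanov's derivative algorithm.

```python
import numpy as np

def chebyshev_prbg(n_bits, p=4, q=5, x=0.3, y=0.7):
    """Bit generator driven by two Chebyshev maps T_p, T_q on [-1, 1]
    (chaotic for degree >= 2); the compare-and-output rule is an
    illustrative combination, not necessarily the paper's algorithm."""
    bits = []
    for _ in range(n_bits):
        x = np.cos(p * np.arccos(x))   # T_p(x)
        y = np.cos(q * np.arccos(y))   # T_q(y)
        bits.append(1 if x > y else 0)
    return bits

print(chebyshev_prbg(32))
```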

  7. A random matrix approach to language acquisition

    Science.gov (United States)

    Nicolaidis, A.; Kosmidis, Kosmas; Argyrakis, Panos

    2009-12-01

    Since language is tied to cognition, we expect linguistic structures to reflect patterns that we encounter in nature and that are analyzed by physics. Within this realm we investigate the process of lexicon acquisition, using analytical and tractable methods developed within physics. A lexicon is a mapping between sounds and referents of the perceived world. This mapping is represented by a matrix, and the linguistic interaction among individuals is described by a random matrix model. There are two essential parameters in our approach: the strength of the linguistic interaction β, which is considered a genetically determined ability, and the number N of sounds employed (the lexicon size). Our model of linguistic interaction is studied analytically using methods of statistical physics and simulated by Monte Carlo techniques. The analysis reveals an intricate relationship between the innate propensity for language acquisition β and the lexicon size N, N ~ exp(β). Thus a small increase in the genetically determined β may lead to a dramatic lexical explosion. Our approximate scheme offers an explanation for the biological affinity of different species and their simultaneous linguistic disparity.

  8. A random matrix approach to language acquisition

    International Nuclear Information System (INIS)

    Nicolaidis, A; Kosmidis, Kosmas; Argyrakis, Panos

    2009-01-01

    Since language is tied to cognition, we expect linguistic structures to reflect patterns that we encounter in nature and that are analyzed by physics. Within this realm we investigate the process of lexicon acquisition, using analytical and tractable methods developed within physics. A lexicon is a mapping between sounds and referents of the perceived world. This mapping is represented by a matrix, and the linguistic interaction among individuals is described by a random matrix model. There are two essential parameters in our approach: the strength of the linguistic interaction β, which is considered a genetically determined ability, and the number N of sounds employed (the lexicon size). Our model of linguistic interaction is studied analytically using methods of statistical physics and simulated by Monte Carlo techniques. The analysis reveals an intricate relationship between the innate propensity for language acquisition β and the lexicon size N, N ∼ exp(β). Thus a small increase in the genetically determined β may lead to a dramatic lexical explosion. Our approximate scheme offers an explanation for the biological affinity of different species and their simultaneous linguistic disparity.

  9. Stochastic perturbations in open chaotic systems: random versus noisy maps.

    Science.gov (United States)

    Bódai, Tamás; Altmann, Eduardo G; Endler, Antonio

    2013-04-01

    We investigate the effects of random perturbations on fully chaotic open systems. Perturbations can be applied to each trajectory independently (white noise) or simultaneously to all trajectories (random map). We compare these two scenarios by generalizing the theory of open chaotic systems and introducing a time-dependent conditionally-map-invariant measure. For the same perturbation strength we show that the escape rate of the random map is always larger than that of the noisy map. In random maps we show that the escape rate κ and dimensions D of the relevant fractal sets often depend nonmonotonically on the intensity of the random perturbation. We discuss the accuracy (bias) and precision (variance) of finite-size estimators of κ and D, and show that the improvement of the precision of the estimations with the number of trajectories N is extremely slow (∝ 1/ln N). We also argue that the finite-size D estimators are typically biased. General theoretical results are combined with analytical calculations and numerical simulations in area-preserving baker maps.
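    The random-map versus noisy-map distinction is easy to reproduce numerically: the same kick is either shared by all trajectories or drawn independently per trajectory. The sketch below estimates the escape rate κ for an open doubling map with a hole; the map, hole and noise amplitude are illustrative.

```python
import numpy as np

def escape_rate(random_map, N=100_000, T=60, xi=0.01, seed=0):
    """Escape rate of an open doubling map x -> 2x mod 1 with a hole in
    [0.45, 0.55]. random_map=True: one shared kick per time step;
    False: independent kicks per trajectory (noisy map). Illustrative."""
    rng = np.random.default_rng(seed)
    x = rng.random(N)
    survivors = [N]
    for _ in range(T):
        kick = rng.uniform(-xi, xi) if random_map else rng.uniform(-xi, xi, x.size)
        x = (2.0 * x + kick) % 1.0
        x = x[(x < 0.45) | (x > 0.55)]   # trajectories entering the hole escape
        survivors.append(x.size)
    s = np.log(np.maximum(np.array(survivors, float), 1.0))
    return -np.polyfit(np.arange(T + 1), s, 1)[0]   # kappa from exponential decay

print(escape_rate(True), escape_rate(False))
```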

  10. A tiered approach for ecosystem services mapping

    OpenAIRE

    Grêt-Regamey, Adrienne; Weibel, Bettina; Rabe, Sven-Erik; Burkhard, Benjamin

    2017-01-01

    Mapping ecosystem services delivers essential insights into the spatial characteristics of the flows of various goods and services from nature to human society. It has become a central topic of science, policy, business and society – all depending on functioning ecosystems. This textbook summarises the current state-of-the-art of ecosystem services mapping, related theory and methods, different ecosystem service quantification and modelling approaches, as well as practical applications. The book...

  11. MAGNETIC VT study: a prospective, multicenter, post-market randomized controlled trial comparing VT ablation outcomes using remote magnetic navigation-guided substrate mapping and ablation versus manual approach in a low LVEF population.

    Science.gov (United States)

    Di Biase, Luigi; Tung, Roderick; Szili-Torok, Tamás; Burkhardt, J David; Weiss, Peter; Tavernier, Rene; Berman, Adam E; Wissner, Erik; Spear, William; Chen, Xu; Neužil, Petr; Skoda, Jan; Lakkireddy, Dhanunjaya; Schwagten, Bruno; Lock, Ken; Natale, Andrea

    2017-04-01

    Patients with ischemic cardiomyopathy (ICM) are prone to scar-related ventricular tachycardia (VT). The success of VT ablation depends on accurate arrhythmogenic substrate localization, followed by optimal delivery of energy provided by constant electrode-tissue contact. Current manual and remote magnetic navigation (RMN)-guided ablation strategies aim to identify a reentry circuit and to target a critical isthmus through activation and entrainment mapping during ongoing tachycardia. The MAGNETIC VT trial will assess if VT ablation using the Niobe™ ES magnetic navigation system results in superior outcomes compared to a manual approach in subjects with ischemic scar VT and low ejection fraction. This is a randomized, single-blind, prospective, multicenter post-market study. A total of 386 subjects (193 per group) will be enrolled and randomized 1:1 between treatment with the Niobe ES system and treatment via a manual procedure at up to 20 sites. The study population will consist of patients with ischemic cardiomyopathy with left ventricular ejection fraction (LVEF) of ≤35% and implantable cardioverter defibrillator (ICD) who have sustained monomorphic VT. The primary study endpoint is freedom from any recurrence of VT through 12 months. The secondary endpoints are acute success; freedom from any VT at 1 year in a large-scar subpopulation; procedure-related major adverse events; and mortality rate through 12-month follow-up. Follow-up will consist of visits at 3, 6, 9, and 12 months, all of which will include ICD interrogation. The MAGNETIC VT trial will help determine whether substrate-based ablation of VT with RMN has clinical advantages over manual catheter manipulation. Clinicaltrials.gov identifier: NCT02637947.

  12. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    Observations from sums of random variables with a random number of summands, known as random, compound, or stopped sums, arise in many areas of engineering and science. Quite often it is desirable to infer properties of the distribution of the terms in the random sum. In the present paper we ... review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably ...

  13. Improving the pseudo-randomness properties of chaotic maps using deep-zoom

    Science.gov (United States)

    Machicao, Jeaneth; Bruno, Odemir M.

    2017-05-01

    A generalized method is proposed to compose new orbits from a given chaotic map. The method provides an approach to examine discrete-time chaotic maps in a "deep-zoom" manner, by using the k digits to the right of the decimal separator of a given point of the underlying chaotic map. Interesting phenomena have been identified. Rapid randomization was observed, i.e., chaotic patterns tend to become indistinguishable when compared to the original orbits of the underlying chaotic map. Our results were presented using different graphical analyses (i.e., time evolution, bifurcation diagram, Lyapunov exponent, Poincaré diagram, and frequency distribution). Moreover, taking advantage of this randomization improvement, we propose a Pseudo-Random Number Generator (PRNG) based on the k-logistic map. The pseudo-random qualities of the proposed PRNG successfully passed both the DIEHARD and NIST test suites and were comparable to those of traditional PRNGs such as the Mersenne Twister. The results suggest that simple maps such as the logistic map can be considered good PRNG methods.
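    The deep-zoom transformation itself is a one-liner: shift the iterate k decimal places and keep the fractional part. A hedged sketch, with illustrative parameters (for large k one would iterate in multiple precision rather than float64):

```python
import numpy as np

def k_logistic(n, k=6, mu=3.9999, x0=0.1, warmup=500):
    """Deep-zoom of a logistic orbit: keep the digits lying k places past
    the decimal separator, i.e. x -> frac(x * 10**k). Parameters are
    illustrative; for large k one would iterate in multiple precision
    instead of float64."""
    x = x0
    out = np.empty(n)
    for i in range(warmup + n):
        x = mu * x * (1.0 - x)                   # underlying chaotic map
        if i >= warmup:
            out[i - warmup] = (x * 10**k) % 1.0  # deep-zoomed orbit
    return out

print(k_logistic(5))
```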

  14. Cancerous tissue mapping from random lasing emission spectra

    International Nuclear Information System (INIS)

    Polson, R C; Vardeny, Z V

    2010-01-01

    Random lasing emission spectra have been collected from both healthy and cancerous tissues. The two types of tissue with optical gain have different light scattering properties, as obtained from an average power Fourier transform of their random lasing emission spectra. The difference in the power Fourier transform leads to a contrast between cancerous and benign tissues, which is utilized for mapping healthy and cancerous regions in patients.
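    The average power Fourier transform used for the contrast can be sketched in a few lines; the input layout and normalization are illustrative assumptions.

```python
import numpy as np

def average_power_ft(spectra):
    """Average power Fourier transform over a set of random-lasing
    emission spectra (rows = individual spectra). Peaks in this average
    relate to dominant cavity lengths; layout and normalization are
    illustrative assumptions."""
    spectra = np.asarray(spectra, dtype=float)
    spectra -= spectra.mean(axis=1, keepdims=True)   # remove the baseline
    return np.mean(np.abs(np.fft.rfft(spectra, axis=1)) ** 2, axis=0)
```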

  15. Decay of random correlation functions for unimodal maps

    Science.gov (United States)

    Baladi, Viviane; Benedicks, Michael; Maume-Deschamps, Véronique

    2000-10-01

    Since the pioneering results of Jakobson and subsequent work by Benedicks-Carleson and others, it is known that quadratic maps f_a(x) = a − x² admit a unique absolutely continuous invariant measure for a positive measure set of parameters a. For topologically mixing f_a, Young and Keller-Nowicki independently proved exponential decay of correlation functions for this a.c.i.m. and smooth observables. We consider random compositions of small perturbations f + ω_t, with f = f_a or another unimodal map satisfying certain nonuniform hyperbolicity axioms, and ω_t chosen independently and identically in [−ɛ, ɛ]. Baladi-Viana showed exponential mixing of the associated Markov chain, i.e., averaging over all random itineraries. We obtain stretched exponential bounds for the random correlation functions of Lipschitz observables for the sample measure μ_ω of almost every itinerary.

  16. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    Science.gov (United States)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

    Our aim was to compare various digital soil mapping methods and sets of ancillary variables for producing the most accurate spatial prediction of texture classes in a given area of interest. Both legacy and recently collected data on particle size distribution (PSD) were used as reference information. The predictor variable data set consisted of a digital elevation model and its derivatives, lithology and land use maps, as well as various bands and indices of satellite images. Two conceptually different approaches can be applied in the mapping process. Textural classification can be carried out after the particle size data have been spatially extended by a proper geostatistical method. Alternatively, the textural classification is carried out first, followed by spatial extension through a suitable data mining method. Following the first approach, maps of sand, silt and clay percentages were computed through regression kriging (RK). Since the three maps are compositional (their sum must be 100%), we applied the Additive Log-Ratio (alr) transformation instead of kriging them independently. Finally, the texture class map was compiled from the three maps according to the USDA categories. Different combinations of reference and training soil data and auxiliary covariables resulted in several different maps. Following the second approach, the PSD data were first classified into the USDA categories, and the texture class maps were then compiled directly by data mining methods (classification trees and random forests). The various results were compared to each other as well as to the RK maps. The performance of the different methods and data sets was examined by testing the accuracy of the geostatistically computed and the directly classified results, to identify the most predictive and accurate method. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
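    A minimal sketch of the alr transform and its inverse for sand/silt/clay compositions follows (clay is used as the arbitrary denominator part); interpolating the two alr coordinates, e.g. by regression kriging, and back-transforming guarantees positive fractions that sum to 100%.

```python
import numpy as np

def alr(comp):
    """Additive log-ratio transform of (sand, silt, clay) percentages,
    with clay as the (arbitrary) denominator part."""
    comp = np.asarray(comp, dtype=float)
    return np.log(comp[:, :2] / comp[:, 2:3])

def alr_inv(z):
    """Back-transform to percentages that are positive and sum to 100."""
    e = np.exp(np.column_stack([z, np.zeros(len(z))]))
    return 100.0 * e / e.sum(axis=1, keepdims=True)

x = np.array([[40.0, 40.0, 20.0]])
print(alr_inv(alr(x)))   # -> [[40. 40. 20.]]
```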

  17. Pseudo random number generator based on quantum chaotic map

    Science.gov (United States)

    Akhshani, A.; Akhavan, A.; Mobaraki, A.; Lim, S.-C.; Hassan, Z.

    2014-01-01

    For many years dissipative quantum maps have been widely used as informative models of quantum chaos. In this paper, a new scheme for generating good pseudo-random numbers (PRNG), based on the quantum logistic map, is proposed. Note that the PRNG relies merely on the equations used in the quantum chaotic map. The algorithm is not complex, so it does not impose high requirements on computer hardware and computation speed is fast. In order to face the challenge of using the proposed PRNG in quantum cryptography and other practical applications, the proposed PRNG is subjected to statistical tests using well-known test suites such as NIST, DIEHARD, ENT and TestU01. The results of the statistical tests were promising: the proposed PRNG successfully passed all these tests. Moreover, the degree of non-periodicity of the chaotic sequences of the quantum map is investigated with the scale index technique. The obtained result shows that the sequences are highly non-periodic. From these results it can be concluded that the new scheme can generate a high percentage of usable pseudo-random numbers for simulation and other applications in scientific computing.

  18. Cryptographic pseudo-random sequence from the spatial chaotic map

    International Nuclear Information System (INIS)

    Sun Fuyan; Liu Shutang

    2009-01-01

    A scheme for pseudo-random binary sequence generation based on the spatial chaotic map is proposed. In order to face the challenge of using the proposed PRBS in cryptography, it is subjected to the well-known FIPS-140-1 statistical tests, and the correlation properties of the proposed sequences are investigated. The proposed PRBS successfully passes all these tests. The results of the statistical testing are encouraging and suggest that the sequences are strong candidates for cryptographic applications.

  19. Inverse problems for random differential equations using the collage method for random contraction mappings

    Science.gov (United States)

    Kunze, H. E.; La Torre, D.; Vrscay, E. R.

    2009-01-01

    In this paper we are concerned with differential equations with random coefficients, which will be considered as random fixed point equations of the form T(ω, x(ω)) = x(ω), ω ∈ Ω. Here T : Ω × X → X is a random integral operator, Ω is a probability space and X is a complete metric space. We consider the following inverse problem for such equations: given a set of realizations of the fixed point of T (possibly the interpolations of different observational data sets), determine the operator T or the mean value of its random components, as appropriate. We solve the inverse problem for this class of equations by using the collage theorem for contraction mappings.
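    To make the collage idea concrete, here is a deterministic toy version: for x'(t) = λx(t), written as the fixed-point equation (Tx)(t) = x(0) + λ∫₀ᵗ x(s)ds, the collage distance ‖x − Tx‖ is quadratic in λ, so the inverse problem reduces to linear least squares. The target value λ = −0.7 is an arbitrary test case, not taken from the paper.

```python
import numpy as np

# Observed trajectory of x'(t) = lam * x(t) with unknown lam (here -0.7).
t = np.linspace(0.0, 2.0, 201)
dt = t[1] - t[0]
x_obs = np.exp(-0.7 * t)

# Running trapezoidal integral I(t) = integral_0^t x(s) ds.
I = np.concatenate(([0.0], np.cumsum((x_obs[1:] + x_obs[:-1]) * dt / 2.0)))

# Collage distance ||x - Tx||^2 = sum_t (x(t) - x(0) - lam*I(t))^2 is
# quadratic in lam, so the minimizer is a one-line least-squares solution.
lam = I @ (x_obs - x_obs[0]) / (I @ I)
print(lam)   # close to -0.7
```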

  20. Rapid Land Cover Map Updates Using Change Detection and Robust Random Forest Classifiers

    Directory of Open Access Journals (Sweden)

    Konrad J. Wessels

    2016-10-01

    The paper evaluated the Landsat Automated Land Cover Update Mapping (LALCUM) system, designed to rapidly update a land cover map to a desired nominal year using a pre-existing reference land cover map. The system uses Iteratively Reweighted Multivariate Alteration Detection (IRMAD) to identify areas of change and no change. The system then automatically generates large amounts of training samples (n > 1 million) in the no-change areas as input to an optimized Random Forest classifier. Experiments were conducted in the KwaZulu-Natal Province of South Africa using a reference land cover map from 2008, a change mask between 2008 and 2011 and Landsat ETM+ data for 2011. The entire system took 9.5 h to process. We expected that the use of the change mask would improve classification accuracy by reducing the number of mislabeled training data caused by land cover change between 2008 and 2011. However, this was not the case, owing to the exceptional robustness of the Random Forest classifier to mislabeled training samples. The system achieved an overall accuracy of 65%–67% using 22 detailed classes and 72%–74% using 12 aggregated national classes. “Water”, “Plantations”, “Plantations—clearfelled”, “Orchards—trees”, “Sugarcane”, “Built-up/dense settlement”, “Cultivation—Irrigated” and “Forest (indigenous)” had user’s accuracies above 70%. Other detailed classes (e.g., “Low density settlements”, “Mines and Quarries”, and “Cultivation, subsistence, drylands”), which are required for operational, provincial-scale land use planning and are usually mapped using manual image interpretation, could not be mapped using Landsat spectral data alone. However, the system was able to map the 12 national classes at a sufficiently high level of accuracy for national-scale land cover monitoring. This update approach and the highly automated, scalable LALCUM system can improve the efficiency and update rate of regional land
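    The core update logic reduces to a few lines of scikit-learn; the arrays below are toy stand-ins for the Landsat bands, the 2008 reference map and the IRMAD change mask.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy stand-ins for the 2011 Landsat bands, the 2008 reference labels
# and the IRMAD change mask (True where nothing changed).
rng = np.random.default_rng(0)
bands_2011 = rng.random((10_000, 6))        # pixels x spectral bands
labels_2008 = rng.integers(0, 12, 10_000)   # reference land cover classes
no_change = rng.random(10_000) > 0.2        # change-detection mask

# Train only on pixels whose old labels are still trusted, predict all.
clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
clf.fit(bands_2011[no_change], labels_2008[no_change])
labels_2011 = clf.predict(bands_2011)       # updated land cover map
```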

  1. A tale of two "forests": random forest machine learning aids tropical forest carbon mapping.

    Directory of Open Access Journals (Sweden)

    Joseph Mascaro

    Accurate and spatially-explicit maps of tropical forest carbon stocks are needed to implement carbon offset mechanisms such as REDD+ (Reduced Deforestation and Degradation Plus). The Random Forest machine learning algorithm may aid carbon mapping applications using remotely-sensed data. However, Random Forest has never been compared to traditional and potentially more reliable techniques such as regionally stratified sampling and upscaling, and it has rarely been employed with spatial data. Here, we evaluated the performance of Random Forest in upscaling airborne LiDAR (Light Detection and Ranging)-based carbon estimates compared to the stratification approach over a 16-million hectare focal area of the Western Amazon. We considered two runs of Random Forest, both with and without spatial contextual modeling, by including, in the latter case, x and y position directly in the model. In each case, we set aside 8 million hectares (i.e., half of the focal area) for validation; this rigorous test of Random Forest went above and beyond the internal validation normally compiled by the algorithm (the so-called "out-of-bag" validation), which proved insufficient for this spatial application. In this heterogeneous region of Northern Peru, the model with spatial context was the best performing run of Random Forest, and explained 59% of LiDAR-based carbon estimates within the validation area, compared to 37% for stratification or 43% for Random Forest without spatial context. With the 60% improvement in explained variation, the RMSE against validation LiDAR samples improved from 33 to 26 Mg C ha^-1 when using Random Forest with spatial context. Our results suggest that spatial context should be considered when using Random Forest, and that doing so may result in substantially improved carbon stock modeling for purposes of climate change mitigation.

  2. A tale of two "forests": random forest machine learning aids tropical forest carbon mapping.

    Science.gov (United States)

    Mascaro, Joseph; Asner, Gregory P; Knapp, David E; Kennedy-Bowdoin, Ty; Martin, Roberta E; Anderson, Christopher; Higgins, Mark; Chadwick, K Dana

    2014-01-01

    Accurate and spatially-explicit maps of tropical forest carbon stocks are needed to implement carbon offset mechanisms such as REDD+ (Reduced Deforestation and Degradation Plus). The Random Forest machine learning algorithm may aid carbon mapping applications using remotely-sensed data. However, Random Forest has never been compared to traditional and potentially more reliable techniques such as regionally stratified sampling and upscaling, and it has rarely been employed with spatial data. Here, we evaluated the performance of Random Forest in upscaling airborne LiDAR (Light Detection and Ranging)-based carbon estimates compared to the stratification approach over a 16-million hectare focal area of the Western Amazon. We considered two runs of Random Forest, both with and without spatial contextual modeling, by including, in the latter case, x and y position directly in the model. In each case, we set aside 8 million hectares (i.e., half of the focal area) for validation; this rigorous test of Random Forest went above and beyond the internal validation normally compiled by the algorithm (the so-called "out-of-bag" validation), which proved insufficient for this spatial application. In this heterogeneous region of Northern Peru, the model with spatial context was the best performing run of Random Forest, and explained 59% of LiDAR-based carbon estimates within the validation area, compared to 37% for stratification or 43% for Random Forest without spatial context. With the 60% improvement in explained variation, the RMSE against validation LiDAR samples improved from 33 to 26 Mg C ha^-1 when using Random Forest with spatial context. Our results suggest that spatial context should be considered when using Random Forest, and that doing so may result in substantially improved carbon stock modeling for purposes of climate change mitigation.
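    The role of spatial context and of a genuine spatial holdout can be sketched as follows; the synthetic data, the west/east split and all model settings are illustrative, not the paper's setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
xy = rng.uniform(0, 100, (5_000, 2))     # map coordinates (km)
covars = rng.random((5_000, 5))          # remote-sensing covariates
carbon = (50 + 20 * np.sin(xy[:, 0] / 15)
          + 10 * covars[:, 0] + rng.normal(0, 5, 5_000))  # synthetic Mg C/ha

X = np.hstack([covars, xy])              # "spatial context": x, y as predictors
west = xy[:, 0] < 50                     # spatially held-out validation half

rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(X[west], carbon[west])
pred = rf.predict(X[~west])
print("held-out RMSE:", np.sqrt(np.mean((pred - carbon[~west]) ** 2)))
```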

  3. Strong disorder RG approach of random systems

    International Nuclear Information System (INIS)

    Igloi, Ferenc; Monthus, Cecile

    2005-01-01

    There is a large variety of quantum and classical systems in which the quenched disorder plays a dominant role over quantum, thermal, or stochastic fluctuations: these systems display strong spatial heterogeneities, and many averaged observables are actually governed by rare regions. A unifying approach to treat the dynamical and/or static singularities of these systems has emerged recently, following the pioneering RG idea of Ma and Dasgupta and the detailed analysis by Fisher, who showed that the Ma-Dasgupta RG rules yield asymptotically exact results if the broadness of the disorder grows indefinitely at large scales. Here we report these new developments, starting with an introduction to the main ingredients of the strong disorder RG method. We describe the basic properties of infinite disorder fixed points, which are realized at critical points, and of strong disorder fixed points, which control the singular behavior in the Griffiths phases. We then review in detail applications of the RG method to various disordered models, either (i) quantum models, such as random spin chains, ladders and higher dimensional spin systems, or (ii) classical models, such as diffusion in a random potential, equilibrium at low temperature and coarsening dynamics of classical random spin chains, trap models, the delocalization transition of a random polymer from an interface, driven lattice gases and reaction-diffusion models in the presence of quenched disorder. For several one-dimensional systems, the Ma-Dasgupta RG rules yield very detailed analytical results, whereas for other, mainly higher dimensional problems, the RG rules have to be implemented numerically. Where available, the strong disorder RG results are compared with other exact or numerical calculations.

  4. A symbolic dynamics approach for the complexity analysis of chaotic pseudo-random sequences

    International Nuclear Information System (INIS)

    Xiao Fanghong

    2004-01-01

    By considering a chaotic pseudo-random sequence as a symbolic sequence, the authors present a symbolic dynamics approach for the complexity analysis of chaotic pseudo-random sequences. The method is applied to the logistic map and a one-way coupled map lattice to demonstrate how it works, and a comparison is made with the approximate entropy method. The results show that this method can distinguish the complexities of different chaotic pseudo-random sequences, and that it is superior to the approximate entropy method.
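    One standard symbolic-dynamics complexity measure is the block entropy of the symbol sequence; the sketch below binarizes a logistic-map orbit with the generating partition at x = 1/2 and is an illustrative stand-in for the paper's analysis.

```python
import numpy as np
from collections import Counter

def block_entropy(symbols, L=4):
    """Shannon entropy (bits) of length-L blocks of a symbol sequence."""
    blocks = Counter(tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1))
    p = np.array(list(blocks.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

# Binarize a fully chaotic logistic orbit with the generating partition at 1/2.
x, orbit = 0.3, []
for _ in range(20_000):
    x = 4.0 * x * (1.0 - x)
    orbit.append(1 if x > 0.5 else 0)
print(block_entropy(orbit) / 4)   # ~1 bit/symbol for the fully chaotic map
```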

  5. A Bayesian Network Approach to Ontology Mapping

    National Research Council Canada - National Science Library

    Pan, Rong; Ding, Zhongli; Yu, Yang; Peng, Yun

    2005-01-01

    This paper presents our ongoing effort on developing a principled methodology for automatic ontology mapping based on BayesOWL, a probabilistic framework we developed for modeling uncertainty in semantic web...

  6. Random phase approximation in relativistic approach

    International Nuclear Information System (INIS)

    Ma Zhongyu; Yang Ding; Tian Yuan; Cao Ligang

    2009-01-01

    Some special issues of the random phase approximation (RPA) in the relativistic approach are reviewed. Full consistency and a proper treatment of the coupling to the continuum are responsible for the successful application of the RPA to the description of dynamical properties of finite nuclei. The fully consistent relativistic RPA (RRPA) requires that the relativistic mean field (RMF) wave function of the nucleus and the RRPA correlations be calculated with the same effective Lagrangian, together with a consistent treatment of the Dirac sea of negative energy states. The proper treatment of the single-particle continuum with scattering asymptotic conditions in the RMF and RRPA is discussed. The full continuum spectrum can be described by the single-particle Green's function, and the relativistic continuum RPA is thus established. A separable form of the pairing force is introduced in the relativistic quasi-particle RPA.

  7. A multiscale approach to mapping seabed sediments.

    Directory of Open Access Journals (Sweden)

    Benjamin Misiuk

    Benthic habitat maps, including maps of seabed sediments, have become critical spatial-decision support tools for marine ecological management and conservation. Despite the increasing recognition that environmental variables should be considered at multiple spatial scales, variables used in habitat mapping are often implemented at a single scale. The objective of this study was to evaluate the potential for using environmental variables at multiple scales for modelling and mapping seabed sediments. Sixteen environmental variables were derived from multibeam echosounder data collected near Qikiqtarjuaq, Nunavut, Canada at eight spatial scales ranging from 5 to 275 m, and were tested as predictor variables for modelling seabed sediment distributions. Using grain size data obtained from grab samples, we tested which scales of each predictor variable contributed most to sediment models. Results showed that the default scale was often not the best. Out of 129 potential scale-dependent variables, 11 were selected to model the additive log-ratio of mud and sand at five different scales, and 15 were selected to model the additive log-ratio of gravel and sand, also at five different scales. Boosted Regression Tree models that explained between 46.4 and 56.3% of statistical deviance produced multiscale predictions of mud, sand, and gravel that were correlated with cross-validated test data (Spearman's ρ_mud = 0.77, ρ_sand = 0.71, ρ_gravel = 0.58). Predictions of individual size fractions were classified to produce a map of seabed sediments that is useful for marine spatial planning. Based on the scale-dependence of variables in this study, we concluded that spatial scale consideration is at least as important as variable selection in seabed mapping.

  8. A multi-model ensemble approach to seabed mapping

    Science.gov (United States)

    Diesing, Markus; Stephens, David

    2015-06-01

    Seabed habitat mapping based on swath acoustic data and ground-truth samples is an emergent and active marine science discipline. Significant progress could be achieved by transferring techniques and approaches that have been successfully developed and employed in such fields as terrestrial land cover mapping. One such promising approach is the multiple classifier system, which aims at improving classification performance by combining the outputs of several classifiers. Here we present results of a multi-model ensemble applied to multibeam acoustic data covering more than 5000 km² of seabed in the North Sea, with the aim of deriving accurate spatial predictions of seabed substrate. A suite of six machine learning classifiers (k-Nearest Neighbour, Support Vector Machine, Classification Tree, Random Forest, Neural Network and Naïve Bayes) was trained with ground-truth sample data classified into seabed substrate classes, and their prediction accuracy was assessed with an independent set of samples. The three and five best performing models were combined into classifier ensembles. Both ensembles led to increased prediction accuracy as compared to the best performing single classifier. The improvements were however not statistically significant at the 5% level. Although the three-model ensemble did not perform significantly better than its individual component models, we noticed that the five-model ensemble did perform significantly better than three of the five component models. A classifier ensemble might therefore be an effective strategy to improve classification performance. Another advantage is the fact that the agreement in predicted substrate class between the individual models of the ensemble could be used as a measure of confidence. We propose a simple and spatially explicit measure of confidence that is based on model agreement and prediction accuracy.
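    A minimal version of such an ensemble, with model agreement as a per-sample confidence measure, can be sketched with scikit-learn on toy data (the classifiers and data are placeholders for the acoustic case):

```python
import numpy as np
from scipy import stats
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=2_000, n_informative=6, n_classes=3,
                           random_state=0)                 # toy "acoustic" data
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

models = [KNeighborsClassifier(), SVC(), RandomForestClassifier(random_state=0)]
preds = np.array([m.fit(Xtr, ytr).predict(Xte) for m in models])

vote, count = stats.mode(preds, axis=0, keepdims=False)    # majority vote
agreement = count / len(models)                            # per-sample confidence
print("ensemble accuracy:", (vote == yte).mean())
print("mean model agreement:", agreement.mean())
```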

  9. New GIS approaches to wild land mapping in Europe

    Science.gov (United States)

    Steffen Fritz; Steve Carver; Linda See

    2000-01-01

    This paper outlines modifications and new approaches to wild land mapping developed specifically for the United Kingdom and European areas. In particular, national level reconnaissance and local level mapping of wild land in the UK and Scotland are presented. A national level study for the UK is undertaken, and a local study focuses on the Cairngorm Mountains in...

  10. Exploiting Surroundedness for Saliency Detection: A Boolean Map Approach.

    Science.gov (United States)

    Zhang, Jianming; Sclaroff, Stan

    2016-05-01

    We demonstrate the usefulness of surroundedness for eye fixation prediction by proposing a Boolean Map based Saliency model (BMS). In our formulation, an image is characterized by a set of binary images, which are generated by randomly thresholding the image's feature maps in a whitened feature space. Based on a Gestalt principle of figure-ground segregation, BMS computes a saliency map by discovering surrounded regions via topological analysis of Boolean maps. Furthermore, we draw a connection between BMS and the Minimum Barrier Distance to provide insight into why and how BMS can properly capture the surroundedness cue via Boolean maps. The strength of BMS is verified by its simplicity, efficiency and superior performance compared with 10 state-of-the-art methods on seven eye tracking benchmark datasets.
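    A stripped-down sketch of the Boolean-map idea follows: threshold a feature map at random levels and mark connected regions that do not touch the image border as surrounded. The whitening and post-processing steps of the full BMS model are omitted, and all parameters are illustrative.

```python
import numpy as np
from scipy import ndimage

def bms_saliency(feature_maps, n_thresh=12, seed=0):
    """Boolean-map sketch: threshold each feature map at random levels and
    mark connected regions not touching the image border as 'surrounded'."""
    rng = np.random.default_rng(seed)
    sal = np.zeros(feature_maps[0].shape)
    for fm in feature_maps:
        for thr in rng.uniform(fm.min(), fm.max(), n_thresh):
            for bmap in (fm > thr, fm <= thr):     # Boolean map and its complement
                lab, _ = ndimage.label(bmap)
                border = np.unique(np.concatenate(
                    [lab[0], lab[-1], lab[:, 0], lab[:, -1]]))
                sal += bmap & ~np.isin(lab, border)  # surrounded regions only
    return sal / max(sal.max(), 1e-12)

img = np.zeros((64, 64))
img[20:40, 25:45] = 1.0              # an enclosed (surrounded) bright square
print(bms_saliency([img]).max())     # the square is detected as salient
```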

  11. Towards Technological Approaches for Concept Maps Mining from Text

    OpenAIRE

    Camila Zacche Aguiar; Davidson Cury; Amal Zouaq

    2018-01-01

    Concept maps are resources for the representation and construction of knowledge. They allow showing, through concepts and relationships, how knowledge about a subject is organized. Technological advances have boosted the development of approaches for the automatic construction of a concept map, to facilitate and provide the benefits of that resource more broadly. Due to the need to better identify and analyze the functionalities and characteristics of those approaches, we conducted a detailed...

  12. Iterated-map approach to die tossing

    DEFF Research Database (Denmark)

    Feldberg, Rasmus; Szymkat, Maciej; Knudsen, Carsten

    1990-01-01

    Nonlinear dissipative mapping is applied to determine the trajectory of a two-dimensional die thrown onto an elastic table. The basins of attraction for different outcomes are obtained and their distribution in the space of initial conditions discussed. The system has certain properties in common ... with chaotic systems. However, a die falls to rest after a finite number of impacts, and therefore the system has a finite sensitivity to the initial conditions. Quantitative measures of this sensitivity are proposed and their variations with the initial momentum and orientation of the die investigated ...

  13. Clustering of color map pixels: an interactive approach

    Science.gov (United States)

    Moon, Yiu Sang; Luk, Franklin T.; Yuen, K. N.; Yeung, Hoi Wo

    2003-12-01

    The demand for digital maps continues to rise as mobile electronic devices become more popular. Instead of creating the entire map from scratch, we may convert a scanned paper map into a digital one. Color clustering is the first step of the conversion process. Currently, most existing clustering algorithms are fully automatic. They are fast and efficient but may not work well in map conversion because of the numerous ambiguities associated with printed maps. Here we introduce two interactive approaches to color clustering on the map: color clustering with pre-calculated index colors (PCIC) and color clustering with pre-calculated color ranges (PCCR). We also introduce a memory model that can enhance and integrate different image processing techniques for fine-tuning the clustering results. Problems and examples of the algorithms are discussed in the paper.

  14. Approach of simultaneous localization and mapping based on local maps for robot

    Institute of Scientific and Technical Information of China (English)

    CHEN Bai-fan; CAI Zi-xing; HU De-wen

    2006-01-01

    An extended Kalman filter approach to simultaneous localization and mapping (SLAM) based on local maps was proposed. A local frame of reference is established periodically at the position of the robot, and the observations of the robot and landmarks are then fused into the global frame of reference. Because of the independence of the local maps, the approach does not accumulate the estimation and calculation errors that are produced when SLAM uses a Kalman filter directly. At the same time, it reduces the computational complexity. The method is shown to be correct and feasible in simulation experiments.

  15. Optogenetic Approaches for Mesoscopic Brain Mapping.

    Science.gov (United States)

    Kyweriga, Michael; Mohajerani, Majid H

    2016-01-01

    Recent advances in identifying genetically unique neuronal proteins have revolutionized the study of brain circuitry. Researchers are now able to insert specific light-sensitive proteins (opsins) into a wide range of specific cell types via viral injections or by breeding transgenic mice. These opsins enable the activation, inhibition, or modulation of neuronal activity with millisecond control within distinct brain regions defined by genetic markers. Here we present a useful guide to implement this technique into any lab. We first review the materials needed and practical considerations and provide in-depth instructions for acute surgeries in mice. We conclude with all-optical mapping techniques for simultaneous recording and manipulation of population activity of many neurons in vivo by combining arbitrary point optogenetic stimulation and regional voltage-sensitive dye imaging. It is our intent to make these methods available to anyone wishing to use them.

  16. Soil map disaggregation improved by soil-landscape relationships, area-proportional sampling and random forest implementation

    DEFF Research Database (Denmark)

    Møller, Anders Bjørn; Malone, Brendan P.; Odgers, Nathan

    Detailed soil information is often needed to support agricultural practices, environmental protection and policy decisions. Several digital approaches can be used to map soil properties based on field observations. When soil observations are sparse or missing, an alternative approach ... is to disaggregate existing conventional soil maps. At present, the DSMART algorithm represents the most sophisticated approach for disaggregating conventional soil maps (Odgers et al., 2014). The algorithm relies on classification trees trained from resampled points, which are assigned classes according ... implementation generally improved the algorithm’s ability to predict the correct soil class. The implementation of soil-landscape relationships and area-proportional sampling generally increased the calculation time, while the random forest implementation reduced the calculation time. In the most successful ...

  17. A random matrix approach to VARMA processes

    International Nuclear Information System (INIS)

    Burda, Zdzislaw; Jarosz, Andrzej; Nowak, Maciej A; Snarska, Malgorzata

    2010-01-01

    We apply random matrix theory to derive the spectral density of large sample covariance matrices generated by multivariate VMA(q), VAR(q) and VARMA(q_1, q_2) processes. In particular, we consider a limit where the number of random variables N and the number of consecutive time measurements T are large but the ratio N/T is fixed. In this regime, the underlying random matrices are asymptotically equivalent to free random variables (FRV). We apply the FRV calculus to calculate the eigenvalue density of the sample covariance for several VARMA-type processes. We explicitly solve the VARMA(1, 1) case and demonstrate perfect agreement between the analytical result and the spectra obtained by Monte Carlo simulations. The proposed method is purely algebraic and can be easily generalized to q_1 > 1 and q_2 > 1.

  18. Mapping Deforestation in North Korea Using Phenology-Based Multi-Index and Random Forest

    Directory of Open Access Journals (Sweden)

    Yihua Jin

    2016-12-01

    Phenology-based multi-index with the random forest (RF) algorithm can be used to overcome the shortcomings of traditional deforestation mapping, which relies on pixel-based classification (e.g., ISODATA or decision trees) of single images. The purpose of this study was to investigate methods to identify specific types of deforestation in North Korea, and to increase the accuracy of classification, using phenological characteristics extracted with multi-index and random forest algorithms. The mapping of the deforestation area based on RF was carried out by merging phenology-based multi-indices (i.e., the normalized difference vegetation index (NDVI), the normalized difference water index (NDWI), and the normalized difference soil index (NDSI)) derived from MODIS (Moderate Resolution Imaging Spectroradiometer) products with topographical variables. Our results showed an overall classification accuracy of 89.38%, with a corresponding kappa coefficient of 0.87. In particular, for forest and farmland categories with similar phenological characteristics (e.g., paddy, plateau vegetation, unstocked forest, hillside field), this approach improved the classification accuracy in comparison with pixel-based methods and other classes. The deforestation types were identified by incorporating point data from high-resolution imagery, the outcomes of image classification, and slope data. Our study demonstrated that the proposed methodology can be used for deciding on restoration priorities and for monitoring the expansion of deforestation areas.

  19. Concept maps and nursing theory: a pedagogical approach.

    Science.gov (United States)

    Hunter Revell, Susan M

    2012-01-01

    Faculty seek to teach nursing students how to link clinical and theoretical knowledge with the intent of improving patient outcomes. The author discusses an innovative 9-week concept mapping activity as a pedagogical approach to teach nursing theory in a graduate theory course. Weekly concept map building increased student engagement and fostered theoretical thinking. Unexpectedly, this activity also benefited students through group work and its ability to enhance theory-practice knowledge.

  20. DYNAMIC STRAIN MAPPING AND REAL-TIME DAMAGE STATE ESTIMATION UNDER BIAXIAL RANDOM FATIGUE LOADING

    Data.gov (United States)

    National Aeronautics and Space Administration — DYNAMIC STRAIN MAPPING AND REAL-TIME DAMAGE STATE ESTIMATION UNDER BIAXIAL RANDOM FATIGUE LOADING SUBHASISH MOHANTY*, ADITI CHATTOPADHYAY, JOHN N. RAJADAS, AND CLYDE...

  1. Mapping Smart Regions. An Exploratory Approach

    Directory of Open Access Journals (Sweden)

    Sylvie Occelli

    2014-05-01

    The paper presents the results of an exploratory approach aimed at extending the ranking procedures normally used in studying the socioeconomic determinants of smart growth at the regional level. Most of these studies adopt a methodological procedure consisting essentially of the following steps: (a) identification of the pertinent elementary indicators according to the study objectives; (b) data selection and processing; (c) combination of the elementary indicators by multivariate statistical techniques aimed at obtaining a robust synthetic index to rank the observation units. In this procedure, a relational dimension is mainly subsumed in the system-oriented perspective adopted in selecting the indicators that best represent the system determinants, depending on the goals of the analysis (step a). In order to gain deeper insights into the smartness profile of the European regions, this study makes an effort to account for the relational dimension in steps (b) and (c) of the procedure as well. The novelties of the proposed approach are twofold. First, by computing region-to-region distances associated with the selected indicators, it extends the conventional ranking procedure (step c). Second, it uses a relational database (step b), describing regional participation in FP7-ICT projects, to modify the distances and investigate the impact on the interpretation of regional positioning. The main results of this exercise suggest that regional collaborations may play a positive role in the regional convergence process. By providing an opportunity to establish contacts with areas endowed with a comparatively more robust smartness profile, regions may have a chance to enhance their own smartness profile.

  2. A random matrix approach to credit risk.

    Science.gov (United States)

    Münnix, Michael C; Schäfer, Rudi; Guhr, Thomas

    2014-01-01

    We estimate generic statistical properties of a structural credit risk model by considering an ensemble of correlation matrices. This ensemble is set up by Random Matrix Theory. We demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio if the correlations are not identically zero. The existence of correlations alters the tails of the loss distribution considerably, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided.

  3. A random matrix approach to credit risk.

    Directory of Open Access Journals (Sweden)

    Michael C Münnix

    We estimate generic statistical properties of a structural credit risk model by considering an ensemble of correlation matrices. This ensemble is set up by Random Matrix Theory. We demonstrate analytically that the presence of correlations severely limits the effect of diversification in a credit portfolio if the correlations are not identically zero. The existence of correlations alters the tails of the loss distribution considerably, even if their average is zero. Under the assumption of randomly fluctuating correlations, a lower bound for the estimation of the loss distribution is provided.

  4. Towards Technological Approaches for Concept Maps Mining from Text

    Directory of Open Access Journals (Sweden)

    Camila Zacche Aguiar

    2018-04-01

    Concept maps are resources for the representation and construction of knowledge. They allow showing, through concepts and relationships, how knowledge about a subject is organized. Technological advances have boosted the development of approaches for the automatic construction of concept maps, to facilitate and provide the benefits of that resource more broadly. Given the need to better identify and analyze the functionalities and characteristics of those approaches, we conducted a detailed study of technological approaches for the automatic construction of concept maps published between 1994 and 2016 in the IEEE Xplore, ACM and Elsevier Science Direct databases. From this study, we elaborated a categorization defined on two perspectives, Data Source and Graphic Representation, with fourteen categories. The study collected 30 relevant articles, which were mapped to the proposed categorization to identify the main features and limitations of each approach. A detailed view of these approaches, their characteristics and techniques is presented, enabling a quantitative analysis. In addition, the categorization gave us objective conditions to establish new specification requirements for a new technological approach aimed at mining concept maps from texts.

  5. A universal algorithm to generate pseudo-random numbers based on uniform mapping as homeomorphism

    International Nuclear Information System (INIS)

    Fu-Lai, Wang

    2010-01-01

    A specific uniform map is constructed as a homeomorphism mapping chaotic time series into [0,1], to obtain sequences with a standard uniform distribution. Under the uniform map, a chaotic orbit and the sequence obtained from it are topologically equivalent, so the map preserves most dynamic properties of the chaotic system, such as permutation entropy. Based on the uniform map, a universal algorithm to generate pseudo-random numbers is proposed, and the pseudo-random series is tested to follow the standard 0–1 random distribution both theoretically and experimentally. The algorithm is not complex, so it does not impose high requirements on computer hardware and computation speed is fast. The method not only extends the parameter space but also avoids the drawback of a small function space caused by the constraints on chaotic maps used to generate pseudo-random numbers. The algorithm can be applied to any chaotic system and can produce pseudo-random sequences of high quality, and thus can serve as a good universal pseudo-random number generator.

  6. A universal algorithm to generate pseudo-random numbers based on uniform mapping as homeomorphism

    Science.gov (United States)

    Wang, Fu-Lai

    2010-09-01

    A specific uniform map is constructed as a homeomorphism mapping chaotic time series into [0,1], to obtain sequences with a standard uniform distribution. Under the uniform map, a chaotic orbit and the sequence obtained from it are topologically equivalent, so the map preserves most dynamic properties of the chaotic system, such as permutation entropy. Based on the uniform map, a universal algorithm to generate pseudo-random numbers is proposed, and the pseudo-random series is tested to follow the standard 0-1 random distribution both theoretically and experimentally. The algorithm is not complex, so it does not impose high requirements on computer hardware and computation speed is fast. The method not only extends the parameter space but also avoids the drawback of a small function space caused by the constraints on chaotic maps used to generate pseudo-random numbers. The algorithm can be applied to any chaotic system and can produce pseudo-random sequences of high quality, and thus can serve as a good universal pseudo-random number generator.
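    For the fully chaotic logistic map the uniform map can be written in closed form, since the invariant density 1/(π√(x(1−x))) has the CDF u = (2/π)arcsin(√x). The sketch below uses this special case for illustration; the paper's construction applies to general chaotic systems.

```python
import numpy as np

def logistic_uniform(n, x0=0.2, warmup=100):
    """Uniform-map special case: for x -> 4x(1-x) the invariant density is
    1/(pi*sqrt(x(1-x))), so u = (2/pi)*arcsin(sqrt(x)) (its CDF) sends the
    orbit to a sequence that is uniform on [0, 1]."""
    x = x0
    out = np.empty(n)
    for i in range(warmup + n):
        x = 4.0 * x * (1.0 - x)
        if i >= warmup:
            out[i - warmup] = (2.0 / np.pi) * np.arcsin(np.sqrt(x))
    return out

u = logistic_uniform(100_000)
print(u.mean(), u.var())   # ~0.5 and ~1/12 for a uniform law
```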

  7. Mapping Soil Properties of Africa at 250 m Resolution: Random Forests Significantly Improve Current Predictions.

    Directory of Open Access Journals (Sweden)

    Tomislav Hengl

    80% of arable land in Africa has low soil fertility and suffers from physical soil problems. Additionally, significant amounts of nutrients are lost every year due to unsustainable soil management practices. This is partially the result of insufficient use of soil management knowledge. To help bridge the soil information gap in Africa, the Africa Soil Information Service (AfSIS) project was established in 2008. Over the period 2008-2014, the AfSIS project compiled two point data sets: the Africa Soil Profiles (legacy) database and the AfSIS Sentinel Site database. These data sets contain over 28 thousand sampling locations and represent the most comprehensive soil sample data sets of the African continent to date. Utilizing these point data sets in combination with a large number of covariates, we have generated a series of spatial predictions of soil properties relevant to agricultural management: organic carbon, pH, sand, silt and clay fractions, bulk density, cation-exchange capacity, total nitrogen, exchangeable acidity, Al content and exchangeable bases (Ca, K, Mg, Na). We specifically investigate differences between two predictive approaches: random forests and linear regression. Results of 5-fold cross-validation demonstrate that the random forests algorithm consistently outperforms the linear regression algorithm, with average decreases of 15-75% in Root Mean Squared Error (RMSE) across soil properties and depths. Fitting and running random forests models takes an order of magnitude more time, and the modelling success is sensitive to artifacts in the input data, but as long as quality-controlled point data are provided, an increase in soil mapping accuracy can be expected. Results also indicate that globally predicted soil classes (USDA Soil Taxonomy, especially Alfisols and Mollisols) help improve continental-scale soil property mapping, and are among the most important predictors. This indicates a promising potential for transferring

  8. Pseudo-Random Sequences Generated by a Class of One-Dimensional Smooth Map

    Science.gov (United States)

    Wang, Xing-Yuan; Qin, Xue; Xie, Yi-Xin

    2011-08-01

    We extend a class of one-dimensional smooth maps. We make sure that for each desired interval of the parameter, the map's Lyapunov exponent is positive. We then propose a novel parameter perturbation method based on the good properties of the extended one-dimensional smooth map. We perturb the parameter r in each iteration by the real number x_i generated by the iteration. Finally, the auto-correlation function and the NIST statistical test suite are used to illustrate the method's randomness. We provide an application of this method in image encryption. Experiments show that the pseudo-random sequences are suitable for this application.

  9. Pseudo-Random Sequences Generated by a Class of One-Dimensional Smooth Map

    International Nuclear Information System (INIS)

    Wang Xing-Yuan; Qin Xue; Xie Yi-Xin

    2011-01-01

    We extend a class of one-dimensional smooth maps. We ensure that for each desired parameter interval the map's Lyapunov exponent is positive. We then propose a novel parameter perturbation method based on this good property of the extended one-dimensional smooth map: the parameter r is perturbed at each iteration by the real number x_i generated by the iteration itself. The auto-correlation function and the NIST statistical test suite are used to illustrate the method's randomness. Finally, we provide an application of this method to image encryption. Experiments show that the pseudo-random sequences are well suited to this application. (general)

  10. The frequency-domain approach for apparent density mapping

    Science.gov (United States)

    Tong, T.; Guo, L.

    2017-12-01

    Apparent density mapping is a technique to estimate the density distribution in a subsurface layer from observed gravity data. It has been widely applied for geologic mapping, tectonic studies and mineral exploration for decades. Apparent density mapping usually models the density layer as a collection of vertical, juxtaposed prisms in both horizontal directions, whose top and bottom surfaces are assumed to be horizontal or variable-depth, and then inverts or deconvolves the gravity anomalies to determine the density of each prism. Conventionally, the frequency-domain approach, which assumes that both the top and bottom surfaces of the layer are horizontal, is utilized for fast density mapping. However, this assumption is not always valid in the real world, since either the top surface or the bottom surface may be variable-depth. Here, we present a frequency-domain approach for apparent density mapping that permits both the top and bottom surfaces of the layer to be variable-depth. We first derive the formula for forward calculation of the gravity anomalies caused by a density layer whose top and bottom surfaces are variable-depth, and the formula for inversion of the gravity anomalies for the density distribution. We then propose a procedure for density mapping based on both the inversion and forward-calculation formulas. We tested the approach on synthetic data, which verified its effectiveness. We also tested the approach on real Bouguer gravity anomaly data from central South China. The top surface was assumed to be flat at sea level, and the bottom surface was taken as the Moho surface. The result presents the crustal density distribution, which coincides well with the basic tectonic features of the study area.
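    The abstract's formulas are not reproduced, but the classical frequency-domain building block for such forward calculations is Parker's (1973) expansion for a layer of density rho bounded by topography h(x) relative to a reference depth z0 (sign conventions vary with the coordinate choice); a layer with both surfaces variable can be treated as the difference of two such series:

```latex
% Parker's (1973) expansion (one bounding surface variable):
\mathcal{F}[\Delta g](\mathbf{k})
  = 2\pi G \rho \, e^{-|\mathbf{k}| z_0}
    \sum_{n=1}^{\infty} \frac{|\mathbf{k}|^{n-1}}{n!}\,
    \mathcal{F}\!\left[ h^{n} \right](\mathbf{k})
```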

  11. Random fixed point equations and inverse problems using "collage method" for contraction mappings

    Science.gov (United States)

    Kunze, H. E.; La Torre, D.; Vrscay, E. R.

    2007-10-01

    In this paper we are interested in the direct and inverse problems for the following class of random fixed point equations: T(w,x(w)) = x(w), where T : Ω × X → X is a given operator, Ω is a probability space and X is a Polish metric space. The inverse problem is solved by recourse to the collage theorem for contractive maps. We then consider two applications: (i) random integral equations, and (ii) random iterated function systems with greyscale maps (RIFSM), for which noise is added to the classical IFSM.
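    The collage theorem invoked here states that for a contraction T with factor c, the fixed point x_bar satisfies d(x, x_bar) <= d(x, Tx)/(1-c), so an inverse problem can be attacked by minimizing the collage distance d(x, Tx). A minimal numeric check for a deterministic 1-D contraction (not the paper's random operator):

```python
import math

# Check the collage bound d(x, x_bar) <= d(x, Tx)/(1-c) for the
# contraction T(x) = 0.5*cos(x), whose contraction factor is c <= 0.5.
c = 0.5
T = lambda x: 0.5 * math.cos(x)

x_bar = 0.0
for _ in range(100):          # locate the fixed point by Banach iteration
    x_bar = T(x_bar)

x = 0.7                       # an arbitrary "target" point
print(abs(x - x_bar), abs(x - T(x)) / (1 - c))   # the bound holds
```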

  12. An automated approach to mapping corn from Landsat imagery

    Science.gov (United States)

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as ‘highly likely corn,’ ‘likely corn’ or ‘unlikely corn.’ To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data.

  13. Noise pollution mapping approach and accuracy on landscape scales.

    Science.gov (United States)

    Iglesias Merchan, Carlos; Diaz-Balteiro, Luis

    2013-04-01

    Noise mapping allows the characterization of environmental variables, such as noise pollution or soundscape, depending on the task. Strategic noise mapping (as per Directive 2002/49/EC, 2002) is a tool intended for the assessment of noise pollution at the European level every five years. These maps are based on common methods and procedures intended for human exposure assessment in the European Union that could also be adapted for assessing environmental noise pollution in natural parks. However, given the size of such areas, there could be an alternative approach to soundscape characterization rather than using human noise exposure procedures. It is possible to optimize the size of the mapping grid used for such work by taking into account the attributes of the area to be studied and the desired outcome, thereby optimizing the mapping time and cost. This type of optimization is important in noise assessment as well as in the study of other environmental variables. This study compares 15 models, using different grid sizes, to assess the accuracy of noise mapping of road traffic noise at a landscape scale, with respect to noise and landscape indicators. In a study area located in the Manzanares High River Basin Regional Park in Spain, different accuracy levels (Kappa index values from 0.725 to 0.987) were obtained depending on the terrain and noise source properties. The time taken for the calculations and the noise mapping accuracy results reveal the potential for setting the map resolution in line with decision-makers' criteria and budget considerations.

  14. Update on the use of random 10-mers in mapping and fingerprinting genomes

    International Nuclear Information System (INIS)

    Sinibaldi, R.M.

    2001-01-01

    The use of Randomly Amplified Polymorphic DNA (RAPD) markers has continued to grow for the last several years. A quick assessment of their use can be made by searching PubMed at the National Library of Medicine with the acronym RAPD. Since their first report in 1990, the number of citations mentioning RAPD has increased from 12 in 1990, to 45 in 1991, to 112 in 1993, to 130 in 1994, to 223 in 1995, to 258 in 1996, to 236 in 1997, to 316 in 1998, and to 196 to date (August 31) in 1999. The utilization of 10-mers for mapping or fingerprinting has many advantages. These include relatively low cost, no use of radioactivity, easy adaptation to automation, the requirement for only very small amounts of input DNA, rapid results, existing databases for many organisms, and low-cost equipment requirements. In conjunction with a derived technology such as SCARs (sequence characterized amplified regions), it can provide cost-effective and thorough methods for mapping and fingerprinting any genome. Newer methods based on microarray technology may offer powerful but expensive alternative approaches to determining genetic diversity. The cost of arrays should come down with time and improved production methods. In the meantime, RAPDs remain a competent and cost-effective method for genome characterization. (author)

  15. Comparison of complementary and alternative medicine with conventional mind-body therapies for chronic back pain: protocol for the Mind-body Approaches to Pain (MAP) randomized controlled trial.

    Science.gov (United States)

    Cherkin, Daniel C; Sherman, Karen J; Balderson, Benjamin H; Turner, Judith A; Cook, Andrea J; Stoelb, Brenda; Herman, Patricia M; Deyo, Richard A; Hawkes, Rene J

    2014-06-07

    The self-reported health and functional status of persons with back pain in the United States have declined in recent years, despite greatly increased medical expenditures due to this problem. Although patient psychosocial factors such as pain-related beliefs, thoughts and coping behaviors have been demonstrated to affect how well patients respond to treatments for back pain, few patients receive treatments that address these factors. Cognitive-behavioral therapy (CBT), which addresses psychosocial factors, has been found to be effective for back pain, but access to qualified therapists is limited. Another treatment option with potential for addressing psychosocial issues, mindfulness-based stress reduction (MBSR), is increasingly available. MBSR has been found to be helpful for various mental and physical conditions, but it has not been well studied for application with chronic back pain patients. In this trial, we will seek to determine whether MBSR is an effective and cost-effective treatment option for persons with chronic back pain, compare its effectiveness and cost-effectiveness with those of CBT, and explore the psychosocial variables that may mediate the effects of MBSR and CBT on patient outcomes. We will randomize 397 adults with nonspecific chronic back pain to CBT, MBSR or usual care arms (99 per group). Both interventions will consist of eight weekly 2-hour group sessions supplemented by home practice. The MBSR protocol also includes an optional 6-hour retreat. Interviewers masked to treatment assignments will assess outcomes 5, 10, 26 and 52 weeks post-randomization. The primary outcomes will be pain-related functional limitations (based on the Roland Disability Questionnaire) and symptom bothersomeness (rated on a 0 to 10 numerical rating scale) at 26 weeks. If MBSR is found to be an effective and cost-effective treatment option for patients with chronic back pain, it will become a valuable addition to the limited treatment options

  16. A Pseudo-Random Number Generator Employing Multiple Rényi Maps

    Science.gov (United States)

    Lui, Oi-Yan; Yuen, Ching-Hung; Wong, Kwok-Wo

    2013-11-01

    The rapid development of multimedia data transmission, and the risks that accompany it, have raised serious concerns about data security. A good pseudo-random number generator is an essential tool in cryptography. In this paper, we propose a novel pseudo-random number generator based on the controlled combination of the outputs of several digitized chaotic Rényi maps. The generated pseudo-random sequences have passed both the NIST 800-22 Revision 1a and the DIEHARD tests. Moreover, simulation results show that the proposed pseudo-random number generator requires less operation time than existing generators and is highly sensitive to the seed.
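    A minimal sketch of a generator of this flavour: each Rényi map iterates x -> beta*x mod 1, one bit is extracted per iterate, and several streams are combined. The XOR combination, the seeds, and the beta values below are illustrative assumptions; the paper's controlled combination and digitization are more elaborate.

```python
import numpy as np

def renyi_bit_stream(x0, beta, n):
    """Digitized Rényi map x -> beta*x mod 1; emit one bit per iterate."""
    x, bits = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = (beta * x) % 1.0
        bits[i] = 1 if x >= 0.5 else 0
    return bits

# Hypothetical combination rule: XOR three streams with distinct seeds
# and betas.
b = (renyi_bit_stream(0.141, 3.7, 10_000)
     ^ renyi_bit_stream(0.592, 5.3, 10_000)
     ^ renyi_bit_stream(0.653, 7.1, 10_000))
print(b.mean())   # close to 0.5 for a balanced stream
```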

  17. Maps help protect sensitive areas from spills : an integrated approach to environmental mapping

    International Nuclear Information System (INIS)

    Laflamme, A.; Leblanc, S.R.; Percy, R.J.

    2001-01-01

    The Atlantic Sensitivity Mapping Program (ASMP) is underway in Canada's Atlantic Region to develop and maintain the best possible sensitivity mapping system to provide planners and managers with the full range of information they would need in the event of a coastal oil spill drill or spill incident. This initiative also provides recommendations concerning resource protection at the time of a spill. ASMP has become a powerful tool, providing a consistent and standardized terminology throughout the range of spill planning, preparedness and real-time response activities. The desktop mapping system provides an easy-to-use approach for a wide range of technical and support data and information stored in various databases. The data and information are based on a consistent set of terms and definitions that describe the character of the shore zone, the objective and strategies for a specific response, and the methods for achieving those objectives. The data are linked with other resource information in a GIS-based system and can be updated quickly and easily as new information becomes available. The mapping program keeps evolving to better serve the needs of environmental emergency responders. In addition, all components will soon be integrated into a web-based mapping format for broader accessibility. Future work will focus on developing a pre-spill database for Labrador. 3 refs., 8 figs

  18. A zeta function approach to the semiclassical quantization of maps

    International Nuclear Information System (INIS)

    Smilansky, Uzi.

    1993-11-01

    The quantum analogue of an area-preserving map on a compact phase space is a unitary (evolution) operator which can be represented by a matrix of dimension L ∝ ℎ⁻¹. The semiclassical theory for the spectrum of the evolution operator is reviewed, with special emphasis on developing a dynamical zeta function approach, similar to the one introduced recently for the semiclassical quantization of Hamiltonian systems. (author)

  19. Stimulating Graphical Summarization in Late Elementary Education: The Relationship between Two Instructional Mind-Map Approaches and Student Characteristics

    Science.gov (United States)

    Merchie, Emmelien; Van Keer, Hilde

    2016-01-01

    This study examined the effectiveness of two instructional mind-mapping approaches to stimulate fifth and sixth graders' graphical summarization skills. Thirty-five fifth- and sixth-grade teachers and 644 students from 17 different elementary schools participated. A randomized quasi-experimental repeated-measures design was set up with two…

  20. Tropical forest carbon assessment: integrating satellite and airborne mapping approaches

    International Nuclear Information System (INIS)

    Asner, Gregory P

    2009-01-01

    Large-scale carbon mapping is needed to support the UNFCCC program to reduce deforestation and forest degradation (REDD). Managers of forested land can potentially increase their carbon credits via detailed monitoring of forest cover, loss and gain (hectares), and periodic estimates of changes in forest carbon density (tons ha⁻¹). Satellites provide an opportunity to monitor changes in forest carbon caused by deforestation and degradation, but only after initial carbon densities have been assessed. New airborne approaches, especially light detection and ranging (LiDAR), provide a means to estimate forest carbon density over large areas, which greatly assists in the development of practical baselines. Here I present an integrated satellite-airborne mapping approach that supports high-resolution carbon stock assessment and monitoring in tropical forest regions. The approach yields a spatially resolved, regional state-of-the-forest carbon baseline, followed by high-resolution monitoring of forest cover and disturbance to estimate carbon emissions. Rapid advances and decreasing costs in the satellite and airborne mapping sectors are already making high-resolution carbon stock and emissions assessments viable anywhere in the world.

  1. Constructing Binomial Trees Via Random Maps for Analysis of Financial Assets

    Directory of Open Access Journals (Sweden)

    Antonio Airton Carneiro de Freitas

    2010-04-01

    Full Text Available Random maps can be constructed from a priori knowledge of the financial assets. The reverse problem is also addressed: starting from an empirical stationary probability density function, we set up a random map that naturally leads to an implied binomial tree, allowing models to be adjusted, including the ability to incorporate jumps. An application related to the options market is presented. It is emphasized that the quality with which the model incorporates a priori knowledge of the financial asset may be affected, for example, by the skewed vision of the analyst.
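    For readers unfamiliar with the target structure, a standard Cox-Ross-Rubinstein binomial tree for a European call is sketched below as a conventional baseline; the paper's contribution, obtaining such trees as implied trees from random maps, is not reproduced here.

```python
import math

def crr_call_price(S0, K, r, sigma, T, n):
    """Cox-Ross-Rubinstein binomial tree for a European call option."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1.0 / u                           # down factor
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up-probability
    disc = math.exp(-r * dt)
    # terminal payoffs, indexed by the number of up moves j
    values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    # backward induction through the tree
    for step in range(n, 0, -1):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(step)]
    return values[0]

print(crr_call_price(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=200))
# converges toward the Black-Scholes value (~10.45)
```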

  2. A conformal mapping approach to a root-clustering problem

    International Nuclear Information System (INIS)

    Melnikov, Gennady I; Dudarenko, Nataly A; Melnikov, Vitaly G

    2014-01-01

    This paper presents a new approach for matrix root-clustering in sophisticated and multiply connected regions of the complex plane. The parametric sweeping method and the concept of a closed forbidden region covered by a set of modified three-parameter Cassini regions are used. A conformal mapping approach was applied to formulate the main results of the paper. An application of the developed method to the problem of matrix root-clustering in a multiply connected region is shown for illustration

  3. A random probabilistic approach to seismic nuclear power plant analysis

    International Nuclear Information System (INIS)

    Romo, M.P.

    1985-01-01

    A probabilistic method for the seismic analysis of structures, which takes into account the random nature of earthquakes and of the soil parameter uncertainties, is presented in this paper. The method was developed by combining elements of perturbation theory, random vibration theory and the complex response method. The probabilistic method is evaluated by comparing the responses of a single-degree-of-freedom system computed with this approach and with the Monte Carlo method. (orig.)

  4. Physico-empirical approach for mapping soil hydraulic behaviour

    Directory of Open Access Journals (Sweden)

    G. D'Urso

    1997-01-01

    Full Text Available Pedo-transfer functions are largely used in the soil hydraulic characterisation of large areas. The use of physico-empirical approaches for the derivation of soil hydraulic parameters from disturbed sample data can be greatly enhanced if a characterisation performed on undisturbed cores of the same type of soil is available. In this study, an experimental procedure for deriving maps of soil hydraulic behaviour is discussed with reference to its application in an irrigation district (30 km²) in southern Italy. The main steps of the proposed procedure are: (i) the precise identification of soil hydraulic functions from undisturbed sampling of main horizons in representative profiles for each soil map unit; (ii) the determination of pore-size distribution curves from larger disturbed sampling data sets within the same soil map unit; (iii) the calibration of physico-empirical methods for retrieving soil hydraulic parameters from particle-size data and undisturbed soil sample analysis; (iv) the definition of functional hydraulic properties from water balance output; and (v) the delimitation of soil hydraulic map units based on functional properties.

  5. Development of erosion risk map using fuzzy logic approach

    Directory of Open Access Journals (Sweden)

    Fauzi Manyuk

    2017-01-01

    Full Text Available Erosion-hazard assessment is an important aspect of the management of a river basin such as the Siak River Basin, Riau Province, Indonesia. This study presents an application of a fuzzy logic approach to develop an erosion risk map based on a geographic information system. Fuzzy logic is a computing approach based on "degrees of truth" rather than the usual "true or false" (1 or 0) Boolean logic on which the modern computer is based. The results of the erosion risk map were verified using field measurements. The verification shows that the soil-erodibility parameter (K) agrees well with the field measurement data. The validated soil-erodibility (K) classes were: very low (0.0-0.1), medium (0.21-0.32), high (0.44-0.55) and very high (0.56-0.64). The results obtained from this study show that the erosion risk map of the Siak River Basin is dominantly classified as medium level, covering about 68.54% of the area. The other classes were high and very low erosion levels, covering about 28.84% and 2.61% respectively.

  6. Changing energy-related behavior: An Intervention Mapping approach

    International Nuclear Information System (INIS)

    Kok, Gerjo; Lo, Siu Hing; Peters, Gjalt-Jorn Y.; Ruiter, Robert A.C.

    2011-01-01

    This paper's objective is to apply Intervention Mapping, a planning process for the systematic development of theory- and evidence-based health promotion interventions, to the development of interventions to promote energy conservation behavior. Intervention Mapping (IM) consists of six steps: needs assessment, program objectives, methods and applications, program development, planning for program implementation, and planning for program evaluation. Examples from the energy conservation field are provided to illustrate the activities associated with these steps. It is concluded that applying IM in the energy conservation field may help the development of effective behavior change interventions, and thus develop a domain-specific knowledge base for effective intervention design. - Highlights: Intervention Mapping (IM) is a planning process for developing evidence-based interventions. IM takes a problem-driven rather than theory-driven approach. IM can be applied to the promotion of energy conservation in a multilevel approach. IM helps identify determinants of behaviors and environmental conditions. IM helps select appropriate theory-based methods and practical applications.

  7. Changing energy-related behavior: An Intervention Mapping approach

    Energy Technology Data Exchange (ETDEWEB)

    Kok, Gerjo, E-mail: g.kok@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Lo, Siu Hing, E-mail: siu-hing.lo@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Peters, Gjalt-Jorn Y., E-mail: gj.peters@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands); Ruiter, Robert A.C., E-mail: r.ruiter@maastrichtuniversity.nl [Department of Work and Social Psychology, Maastricht University, P.O. Box 616, 6200 MD Maastricht (Netherlands)

    2011-09-15

    This paper's objective is to apply Intervention Mapping, a planning process for the systematic development of theory- and evidence-based health promotion interventions, to the development of interventions to promote energy conservation behavior. Intervention Mapping (IM) consists of six steps: needs assessment, program objectives, methods and applications, program development, planning for program implementation, and planning for program evaluation. Examples from the energy conservation field are provided to illustrate the activities associated with these steps. It is concluded that applying IM in the energy conservation field may help the development of effective behavior change interventions, and thus develop a domain-specific knowledge base for effective intervention design. - Highlights: Intervention Mapping (IM) is a planning process for developing evidence-based interventions. IM takes a problem-driven rather than theory-driven approach. IM can be applied to the promotion of energy conservation in a multilevel approach. IM helps identify determinants of behaviors and environmental conditions. IM helps select appropriate theory-based methods and practical applications.

  8. New approach on seismic hazard isoseismal map for Romania

    International Nuclear Information System (INIS)

    Marmureanu, Gheorghe; Cioflan, Carmen Ortanza; Marmureanu, Alexandru

    2008-01-01

    The seismicity of Romania comes from the energy released by crustal earthquakes, with focal depths of no more than 40 km, and by intermediate-depth earthquakes originating in the Vrancea region (a unique case in Europe) at depths between 60 and 200 km. The authors developed the concept of a 'control earthquake' and equations to obtain the banana-shaped attenuation curves of the macroseismic intensity I (along directions defined by azimuth Az) for a Vrancea earthquake at depths between 80 and 160 km. Both deterministic and probabilistic approaches, linear and nonlinear, were used. The final map is an isoseismal map in MMI intensity for the maximum possible Vrancea earthquake, of Richter magnitude M_GR = 7.5. This will avoid drawbacks for civil structural designers and for insurance companies, which pay for all damage and loss of life as a function of earthquake intensity. (authors)

  9. Mapping Sub-Antarctic Cushion Plants Using Random Forests to Combine Very High Resolution Satellite Imagery and Terrain Modelling

    Science.gov (United States)

    Bricher, Phillippa K.; Lucieer, Arko; Shaw, Justine; Terauds, Aleks; Bergstrom, Dana M.

    2013-01-01

    Monitoring changes in the distribution and density of plant species often requires accurate and high-resolution baseline maps of those species. Detecting such change at the landscape scale is often problematic, particularly in remote areas. We examine a new technique to improve accuracy and objectivity in mapping vegetation, combining species distribution modelling and satellite image classification on a remote sub-Antarctic island. In this study, we combine spectral data from very high resolution WorldView-2 satellite imagery and terrain variables from a high resolution digital elevation model to improve mapping accuracy, in both pixel- and object-based classifications. Random forest classification was used to explore the effectiveness of these approaches on mapping the distribution of the critically endangered cushion plant Azorella macquariensis Orchard (Apiaceae) on sub-Antarctic Macquarie Island. Both pixel- and object-based classifications of the distribution of Azorella achieved very high overall validation accuracies (91.6–96.3%, κ = 0.849–0.924). Both two-class and three-class classifications were able to accurately and consistently identify the areas where Azorella was absent, indicating that these maps provide a suitable baseline for monitoring expected change in the distribution of the cushion plants. Detecting such change is critical given the threats this species is currently facing under altering environmental conditions. The method presented here has applications to monitoring a range of species, particularly in remote and isolated environments. PMID:23940805

  10. Analysis of family-wise error rates in statistical parametric mapping using random field theory.

    Science.gov (United States)

    Flandin, Guillaume; Friston, Karl J

    2017-11-01

    This technical report revisits the analysis of family-wise error rates in statistical parametric mapping, using random field theory, reported in (Eklund et al. []: arXiv 1511.01863). Contrary to the understandable spin that these sorts of analyses attract, a review of their results suggests that they endorse the use of parametric assumptions, and random field theory, in the analysis of functional neuroimaging data. We briefly rehearse the advantages parametric analyses offer over nonparametric alternatives and then unpack the implications of (Eklund et al. []: arXiv 1511.01863) for parametric procedures.

  11. Random Matrix Approach for Primal-Dual Portfolio Optimization Problems

    Science.gov (United States)

    Tada, Daichi; Yamamoto, Hisashi; Shinzato, Takashi

    2017-12-01

    In this paper, we revisit the portfolio optimization problems of the minimization/maximization of investment risk under constraints of budget and investment concentration (primal problem) and the maximization/minimization of investment concentration under constraints of budget and investment risk (dual problem), for the case that the variances of the return rates of the assets are identical. We analyze both optimization problems by the Lagrange multiplier method and the random matrix approach. Thereafter, we compare the results obtained from our proposed approach with those obtained in previous work. Moreover, we use numerical experiments to validate the results obtained from the replica approach and the random matrix approach as methods for analyzing both the primal and dual portfolio optimization problems.
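    As a small concrete instance of the primal problem, minimizing the investment risk w'Cw under the budget constraint w'1 = 1 via a Lagrange multiplier gives the closed form w = C⁻¹1 / (1'C⁻¹1). A numpy sketch with an arbitrary synthetic covariance matrix (the paper's identical-variance setting and concentration constraint are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(250, 5))
C = A.T @ A / 250 + 0.1 * np.eye(5)   # well-conditioned covariance
ones = np.ones(5)

w = np.linalg.solve(C, ones)          # proportional to C^{-1} 1
w /= ones @ w                         # enforce the budget constraint
print(w, w.sum())                     # weights sum to 1
```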

  12. An Image Encryption Approach Using a Shuffling Map

    International Nuclear Information System (INIS)

    Xiao Yongliang; Xia Limin

    2009-01-01

    A new image encryption approach is proposed. First, a sort transformation based on a nonlinear chaotic algorithm is used to shuffle the positions of the image pixels. Then the states of a hyper-chaotic system are used to change the grey values of the shuffled image, according to the difference between the chaotic values at the same position in the original nonlinear chaotic sequence and in the sorted sequence. The experimental results demonstrate that the image encryption scheme based on a shuffling map offers a large key space and a high level of security. Compared with some existing encryption algorithms, the suggested scheme is more secure. (general)
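    A common way to realize the "sort transformation" stage is to iterate a chaotic map and use the argsort of the orbit as a pixel permutation. The sketch below follows that pattern with a logistic map; the paper's own nonlinear chaotic algorithm and the hyper-chaotic grey-value diffusion stage are not reproduced.

```python
import numpy as np

def chaotic_permutation(n, x0=0.3, r=3.99):
    """Iterate the logistic map n times and use the argsort of the orbit
    as a position permutation (sort transformation)."""
    x, orbit = x0, np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        orbit[i] = x
    return np.argsort(orbit)

img = np.arange(64, dtype=np.uint8).reshape(8, 8)   # toy "image"
perm = chaotic_permutation(img.size)
shuffled = img.ravel()[perm].reshape(img.shape)

# decryption inverts the permutation
restored = np.empty_like(shuffled.ravel())
restored[perm] = shuffled.ravel()
print(np.array_equal(restored.reshape(img.shape), img))   # True
```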

  13. An effective Hamiltonian approach to quantum random walk

    Indian Academy of Sciences (India)

    2017-02-09

    Abstract. In this article we present an effective Hamiltonian approach for the discrete-time quantum random walk. A form of the Hamiltonian for the one-dimensional quantum walk has been prescribed, utilizing the fact that Hamiltonians are generators of time translations. Then an attempt has been made to ...

  14. Turkers in Africa: A Crowdsourcing Approach to Improving Agricultural Landcover Maps

    Science.gov (United States)

    Estes, L. D.; Caylor, K. K.; Choi, J.

    2012-12-01

    In the coming decades a substantial portion of Africa is expected to be transformed to agriculture. The scale of this conversion may match or exceed that which occurred in the Brazilian Cerrado and Argentinian Pampa in recent years. Tracking the rate and extent of this conversion will depend on having an accurate baseline of the current extent of croplands. Continent-wide baseline data do exist, but the accuracy of these relatively coarse-resolution, remotely sensed assessments is suspect in many regions. To develop more accurate maps of the distribution and nature of African croplands, we develop a distributed "crowdsourcing" approach that harnesses human eyeballs and image interpretation capabilities. Our initial goal is to assess the accuracy of existing agricultural land cover maps, but ultimately we aim to generate "wall-to-wall" cropland maps that can be revisited and updated to track agricultural transformation. Our approach utilizes the freely available, high-resolution satellite imagery provided by Google Earth, combined with Amazon.com's Mechanical Turk platform, an online service that provides a large, global pool of workers (known as "Turkers") who perform "Human Intelligence Tasks" (HITs) for a fee. Using open-source R and Python software, we select a random sample of 1 km² cells from a grid placed over our study area, stratified by field density classes drawn from one of the coarse-scale land cover maps, and send these in batches to Mechanical Turk for processing. Each Turker is required to complete an initial training session, on the basis of which they are assigned an accuracy score that determines whether the Turker is allowed to proceed with mapping tasks. Completed mapping tasks are automatically retrieved and processed on our server, and are subject to two further quality control measures. The first of these is a measure of the spatial accuracy of Turker-mapped areas compared to "gold standard" maps from selected locations that are randomly

  15. Mapping topographic plant location properties using a dense matching approach

    Science.gov (United States)

    Niederheiser, Robert; Rutzinger, Martin; Lamprecht, Andrea; Bardy-Durchhalter, Manfred; Pauli, Harald; Winkler, Manuela

    2017-04-01

    Within the project MEDIALPS (Disentangling anthropogenic drivers of climate change impacts on alpine plant species: Alps vs. Mediterranean mountains) six regions in Alpine and in Mediterranean mountain regions are investigated to assess how plant species respond to climate change. The project is embedded in the Global Observation Research Initiative in Alpine Environments (GLORIA), which is a well-established global monitoring initiative for systematic observation of changes in the plant species composition and soil temperature on mountain summits worldwide to discern accelerating climate change pressures on these fragile alpine ecosystems. Close-range sensing techniques such as terrestrial photogrammetry are well suited for mapping terrain topography of small areas with high resolution. Lightweight equipment, flexible positioning for image acquisition in the field, and independence on weather conditions (i.e. wind) make this a feasible method for in-situ data collection. New developments of dense matching approaches allow high quality 3D terrain mapping with less requirements for field set-up. However, challenges occur in post-processing and required data storage if many sites have to be mapped. Within MEDIALPS dense matching is used for mapping high resolution topography for 284 3x3 meter plots deriving information on vegetation coverage, roughness, slope, aspect and modelled solar radiation. This information helps identifying types of topography-dependent ecological growing conditions and evaluating the potential for existing refugial locations for specific plant species under climate change. This research is conducted within the project MEDIALPS - Disentangling anthropogenic drivers of climate change impacts on alpine plant species: Alps vs. Mediterranean mountains funded by the Earth System Sciences Programme of the Austrian Academy of Sciences.

  16. A filtering approach to edge preserving MAP estimation of images.

    Science.gov (United States)

    Humphrey, David; Taubman, David

    2011-05-01

    The authors present a computationally efficient technique for maximum a posteriori (MAP) estimation of images in the presence of both blur and noise. The image is divided into statistically independent regions, each modelled with a wide-sense stationary (WSS) Gaussian prior. Classical Wiener filter theory is used to generate a set of convex sets in the solution space, with the solution to the MAP estimation problem lying at the intersection of these sets. The proposed algorithm uses an underlying segmentation of the image, and a means of determining the segmentation and refining it is described. The algorithm is suitable for a range of image restoration problems, as it provides a computationally efficient means of dealing with the shortcomings of Wiener filtering without sacrificing the computational simplicity of the filtering approach. The algorithm is also of interest from a theoretical viewpoint, as it provides a continuum of solutions between Wiener filtering and inverse filtering, depending upon the segmentation used. We do not attempt to show here that the proposed method is the best general approach to the image reconstruction problem. However, related work referenced herein shows excellent performance in the specific problem of demosaicing.
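    The classical Wiener filter that underlies the approach restores a blurred, noisy signal in the frequency domain as X_hat = conj(H)·Y / (|H|² + 1/SNR). A minimal 1-D numpy sketch under a scalar-SNR assumption (the paper applies region-wise priors rather than one global filter):

```python
import numpy as np

def wiener_deconvolve(y, h, snr):
    """Frequency-domain Wiener restoration of a circularly blurred,
    noisy 1-D signal, assuming a single scalar SNR."""
    H = np.fft.fft(h, n=len(y))
    G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)   # Wiener filter
    return np.real(np.fft.ifft(G * np.fft.fft(y)))

rng = np.random.default_rng(1)
x = np.repeat([0.0, 1.0, 0.0, 2.0, 0.0], 40)        # piecewise "image row"
h = np.ones(5) / 5.0                                # moving-average blur
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, n=len(x))))
y += 0.01 * rng.normal(size=len(x))                 # circular blur + noise

x_hat = wiener_deconvolve(y, h, snr=1e4)
print(np.mean((x_hat - x)**2) < np.mean((y - x)**2))   # restoration helps
```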

  17. AERIAL TERRAIN MAPPING USING UNMANNED AERIAL VEHICLE APPROACH

    Directory of Open Access Journals (Sweden)

    K. N. Tahar

    2012-08-01

    Full Text Available This paper looks into the latest achievements of low-cost Unmanned Aerial Vehicle (UAV) technology and its capacity to map semi-developed areas. The objectives of this study are to establish a new methodology, or a new algorithm, for image registration during the interior orientation process, and to determine the accuracy of the photogrammetric products derived from UAV images. Recently, UAV technology has been used in several applications, such as mapping, agriculture and surveillance. The aim of this study is to scrutinize the usage of UAVs to map semi-developed areas. The performance of the low-cost UAV mapping study was established on a study area with two image processing methods so that the results could be compared. A non-metric camera was attached to the bottom of the UAV and used to capture images at both sites after going through several calibration steps. Calibration was carried out to determine the focal length, principal distance, radial lens distortion, tangential lens distortion and affinity. A new method of image registration for a non-metric camera is discussed in this paper as part of the new methodology of this study. This method uses the UAV's onboard Global Positioning System (GPS) to register the UAV images for the interior orientation process. Check points were established randomly at both sites using rapid-static Global Positioning System. Ground control points were used for the exterior orientation process, and check points were used for the accuracy assessment of the photogrammetric products. All acquired images were processed in photogrammetric software. Two methods of image registration were applied in this study, namely GPS onboard registration and ground control point registration. Both registrations were processed using photogrammetric software and the results are discussed. Two products were generated in this study: the digital orthophoto and the digital terrain model. These results were analyzed by using the root

  18. UAV Remote Sensing for Urban Vegetation Mapping Using Random Forest and Texture Analysis

    Directory of Open Access Journals (Sweden)

    Quanlong Feng

    2015-01-01

    Full Text Available Unmanned aerial vehicle (UAV) remote sensing has great potential for vegetation mapping in complex urban landscapes due to the ultra-high resolution imagery acquired at low altitudes. Because of payload capacity restrictions, off-the-shelf digital cameras are widely used on medium and small sized UAVs. The limitation of the low spectral resolution of digital cameras for vegetation mapping can be reduced by incorporating texture features and robust classifiers. Random Forest has been widely used in satellite remote sensing applications, but its usage in UAV image classification has not been well documented. The objectives of this paper were to propose a hybrid method using Random Forest and texture analysis to accurately differentiate land covers of urban vegetated areas, and to analyze how classification accuracy changes with texture window size. Six least-correlated second-order texture measures were calculated at nine different window sizes and added to the original Red-Green-Blue (RGB) images as ancillary data. A Random Forest classifier consisting of 200 decision trees was used for classification in the spectral-textural feature space. Results indicated the following: (1) Random Forest outperformed the traditional Maximum Likelihood classifier and showed similar performance to object-based image analysis in urban vegetation classification; (2) the inclusion of texture features improved classification accuracy significantly; (3) classification accuracy followed an inverted-U relationship with texture window size. The results demonstrate that UAVs provide an efficient and ideal platform for urban vegetation mapping. The hybrid method proposed in this paper shows good performance in urban vegetation mapping. The drawbacks of off-the-shelf digital cameras can be reduced by adopting Random Forest and texture analysis at the same time.
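    A minimal sketch of the feature-stacking idea: compute a second-order texture measure in sliding windows of several sizes and append it to the RGB bands before classification. Local variance is used below as a simple stand-in for the paper's texture measures, and the window sizes are arbitrary.

```python
import numpy as np

def local_variance(band, w):
    """Texture proxy: variance in a w-by-w window around every pixel,
    so the result can be stacked with the spectral bands."""
    pad = w // 2
    p = np.pad(band.astype(float), pad, mode="reflect")
    out = np.empty(band.shape, dtype=float)
    for i in range(band.shape[0]):
        for j in range(band.shape[1]):
            out[i, j] = p[i:i + w, j:j + w].var()
    return out

rng = np.random.default_rng(0)
rgb = rng.integers(0, 255, size=(64, 64, 3))   # stand-in for a UAV tile
features = np.dstack([rgb] + [local_variance(rgb[..., 0], w)
                              for w in (3, 9, 15)])
print(features.shape)   # (64, 64, 6): spectral + multi-window texture
```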

  19. Effects of randomness on chaos and order of coupled logistic maps

    International Nuclear Information System (INIS)

    Savi, Marcelo A.

    2007-01-01

    Natural systems are essentially nonlinear, being neither completely ordered nor completely random. These nonlinearities are responsible for a great variety of possibilities that include chaos. On this basis, the effect of randomness on the chaos and order of nonlinear dynamical systems is an important feature to be understood. This Letter considers randomness as fluctuations and uncertainties due to noise and investigates its influence on the nonlinear dynamical behavior of coupled logistic maps. The noise effect is included by adding random variations either to parameters or to state variables. Besides, coupling uncertainty is investigated by assuming tiny values for the connection parameters, representing the idea that all of Nature is, in some sense, weakly connected. Results from numerical simulations show situations where noise alters the system's nonlinear dynamics.
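    A minimal sketch of this setting: two linearly coupled logistic maps whose parameter receives a small random fluctuation at each step. The coupling strength and noise level below are arbitrary choices, not the Letter's values.

```python
import numpy as np

rng = np.random.default_rng(2)
r, eps, noise = 3.8, 0.05, 1e-3      # base parameter, coupling, noise level
x, y = 0.3, 0.7
traj = []
for _ in range(5000):
    rx = r + noise * rng.normal()    # random fluctuation of the parameter
    ry = r + noise * rng.normal()
    fx, fy = rx * x * (1 - x), ry * y * (1 - y)
    # symmetric linear coupling of the two maps
    x, y = (1 - eps) * fx + eps * fy, (1 - eps) * fy + eps * fx
    traj.append((x, y))
print(traj[-1])
```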

  20. Robustification of a One-Dimensional Generic Sigmoidal Chaotic Map with Application of True Random Bit Generation

    Directory of Open Access Journals (Sweden)

    Nattagit Jiteurtragool

    2018-02-01

    Full Text Available The search for approaches that generate robust chaos has received considerable attention due to potential applications in cryptography and secure communications. This paper concerns a 1-D sigmoidal chaotic map that has never been distinctly investigated. It introduces a generic form of the sigmoidal chaotic map with three terms, i.e., x_{n+1} = ∓A·f_NL(B·x_n) ± C·x_n ± D, where A, B, C, and D are real constants. The unification of modified sigmoid and hyperbolic tangent (tanh) functions reveals the existence of a "unified sigmoidal chaotic map" that generically fulfills the three terms, with robust chaos partially appearing in some parameter ranges. A simplified generic form, i.e., x_{n+1} = ∓f_NL(B·x_n) ± C·x_n, realized through various S-shaped functions, has recently led to the possibility of linearization using (i) hardtanh and (ii) signum functions. This study finds a linearized sigmoidal chaotic map that potentially offers robust chaos over an entire range of parameters. The chaotic dynamics are described in terms of chaotic waveforms, histograms, cobweb plots, fixed points, Jacobians, and a bifurcation structure diagram based on Lyapunov exponents. As a practical example, a true random bit generator using the linearized sigmoidal chaotic map is demonstrated. The resulting output is evaluated using the NIST SP800-22 test suite and TestU01.
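    The signum-linearized case is easy to sketch: for x_{n+1} = B·x_n − sgn(x_n) the slope has magnitude B everywhere, so the Lyapunov exponent is ln B > 0 for every B in (1, 2), which is the robust-chaos property. The parameter and bit-extraction rule below are illustrative assumptions (a true random bit generator would also need a physical entropy source):

```python
import numpy as np

def signum_map_bits(B=1.5, n=100_000, x0=0.123):
    """Piecewise-linear 'signum' map x -> B*x - sgn(x): constant slope
    magnitude B, hence Lyapunov exponent ln(B) > 0 for all B in (1, 2)."""
    x, bits = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = B * x - np.sign(x)
        bits[i] = 1 if x > 0 else 0
    return bits

bits = signum_map_bits()
print(f"Lyapunov exponent = ln(1.5) ~ {np.log(1.5):.3f}; "
      f"bit balance = {bits.mean():.3f}")
```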

  1. Localization of canine brachycephaly using an across breed mapping approach.

    Directory of Open Access Journals (Sweden)

    Danika Bannasch

    2010-03-01

    Full Text Available The domestic dog, Canis familiaris, exhibits profound phenotypic diversity and is an ideal model organism for the genetic dissection of simple and complex traits. However, some of the most interesting phenotypes are fixed in particular breeds and are therefore less tractable to genetic analysis using classical segregation-based mapping approaches. We implemented an across-breed mapping approach using a moderately dense SNP array, a low number of animals and breeds carefully selected for the phenotypes of interest to identify genetic variants responsible for breed-defining characteristics. Using a modest number of affected (10-30) and control (20-60) samples from multiple breeds, the correct chromosomal assignment was identified in a proof-of-concept experiment using three previously defined loci: hyperuricosuria, white spotting and chondrodysplasia. Genome-wide association was performed in a similar manner for one of the most striking morphological traits in dogs: brachycephalic head type. Although candidate gene approaches based on comparable phenotypes in mice and humans have been utilized for this trait, the causative gene has remained elusive using this method. Samples from nine affected breeds and thirteen control breeds identified strong genome-wide associations for brachycephalic head type on Cfa 1. Two independent datasets identified the same genomic region. Levels of relative heterozygosity in the associated region indicate that it has been subjected to a selective sweep, consistent with it being a breed-defining morphological characteristic. Genotyping additional dogs in the region confirmed the association. To date, the genetic structure of dog breeds has primarily been exploited for genome-wide association for segregating traits. These results demonstrate that non-segregating traits under strong selection are equally tractable to genetic analysis using small sample numbers.

  2. A Joint Land Cover Mapping and Image Registration Algorithm Based on a Markov Random Field Model

    Directory of Open Access Journals (Sweden)

    Apisit Eiumnoh

    2013-10-01

    Full Text Available Traditionally, image registration of multi-modal and multi-temporal images is performed satisfactorily before land cover mapping. However, since multi-modal and multi-temporal images are likely to be obtained from different satellite platforms and/or acquired at different times, perfect alignment is very difficult to achieve. As a result, a proper land cover mapping algorithm must be able to correct registration errors as well as perform an accurate classification. In this paper, we propose a joint classification and registration technique based on a Markov random field (MRF) model to simultaneously align two or more images and obtain a land cover map (LCM) of the scene. The expectation maximization (EM) algorithm is employed to solve the joint image classification and registration problem by iteratively estimating the map parameters and approximate posterior probabilities. Then, the maximum a posteriori (MAP) criterion is used to produce an optimum land cover map. We conducted experiments on a set of four simulated images and one pair of remotely sensed images to investigate the effectiveness and robustness of the proposed algorithm. Our results show that, with proper selection of a critical MRF parameter, the resulting LCMs derived from an unregistered image pair can achieve an accuracy that is as high as when images are perfectly aligned. Furthermore, the registration error can be greatly reduced.

  3. Pseudo-random number generator based on mixing of three chaotic maps

    Science.gov (United States)

    François, M.; Grosges, T.; Barchiesi, D.; Erra, R.

    2014-04-01

    A secure pseudo-random number generator, three-mixer, is proposed. The principle of the method is to mix three chaotic maps produced from an input initial vector. The algorithm uses permutations whose positions are computed and indexed by a standard chaotic function and a linear congruence. The performance of the scheme is evaluated through statistical analysis. The cryptosystem exhibits significant cryptographic qualities and a high level of security.

  4. Extension of the Multipole Approach to Random Metamaterials

    Directory of Open Access Journals (Sweden)

    A. Chipouline

    2012-01-01

    Full Text Available The influence of short-range lateral disorder in the positioning of meta-atoms on the effective parameters of metamaterials is investigated theoretically using the multipole approach. Random variation of the near-field quasi-static interaction between meta-atoms in the form of double wires is shown to be the reason for changes in the effective permittivity and permeability. The obtained analytical results are compared with known experimental ones.

  5. Mapping radon-prone areas - a geophysical approach

    Energy Technology Data Exchange (ETDEWEB)

    Shirav, M. [Geological Survey of Israel, Jerusalem (Israel); Vulkan, U. [Soreq Nuclear Research Center, Yavne (Israel)

    1997-06-01

    Radon-prone areas in Israel were mapped on the basis of direct measurements of radon (²²²Rn) in the soil/rock gas of all exposed geological units, supported by the accumulated knowledge of local stratigraphy and sub-surface geology. Measurements were carried out by a modified alpha-track detection system, resulting in high radon levels mainly in rocks of the Senonian-Paleocene-aged Mount Scopus Group, comprised of chert-bearing marly chalks, rich in phosphorite which acts as the major uranium source. Issues of source depth, seasonal variations and comparison with indoor radon levels are addressed as well. This approach could be applied to other similar terrains, especially the Mediterranean Phosphate Belt. (orig.)

  6. Mapping radon-prone areas - a geophysical approach

    International Nuclear Information System (INIS)

    Shirav, M.; Vulkan, U.

    1997-01-01

    Radon-prone areas in Israel were mapped on the basis of direct measurements of radon (²²²Rn) in the soil/rock gas of all exposed geological units, supported by the accumulated knowledge of local stratigraphy and sub-surface geology. Measurements were carried out by a modified alpha-track detection system, resulting in high radon levels mainly in rocks of the Senonian-Paleocene-aged Mount Scopus Group, comprised of chert-bearing marly chalks, rich in phosphorite which acts as the major uranium source. Issues of source depth, seasonal variations and comparison with indoor radon levels are addressed as well. This approach could be applied to other similar terrains, especially the Mediterranean Phosphate Belt. (orig.)

  7. Local Relation Map: A Novel Illumination Invariant Face Recognition Approach

    Directory of Open Access Journals (Sweden)

    Lian Zhichao

    2012-10-01

    Full Text Available In this paper, a novel illumination-invariant face recognition approach is proposed. Unlike most existing methods, an additive noise term is considered in the face model under varying illumination, in addition to a multiplicative illumination term. High-frequency coefficients of the Discrete Cosine Transform (DCT) are discarded to eliminate the effect caused by noise. Based on the local characteristics of the human face, a simple but effective illumination-invariant feature, the local relation map, is proposed. Experimental results on the Yale B, Extended Yale B and CMU PIE databases demonstrate the superior performance and lower computational burden of the proposed method compared to other existing methods. The results also demonstrate the validity of the proposed face model and the assumption on noise.
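    The DCT step is straightforward to sketch: take the 2-D DCT, zero the high-frequency coefficients attributed to the additive noise term, and invert. The cutoff below is an arbitrary choice, and the random array merely stands in for a face image; scipy.fft.dctn/idctn are available in SciPy 1.4 and later.

```python
import numpy as np
from scipy.fft import dctn, idctn   # SciPy >= 1.4

def suppress_noise_dct(img, keep=16):
    """Zero all but the lowest keep x keep block of 2-D DCT coefficients,
    discarding the high-frequency content attributed to noise."""
    coeffs = dctn(img, norm="ortho")
    mask = np.zeros_like(coeffs)
    mask[:keep, :keep] = 1.0
    return idctn(coeffs * mask, norm="ortho")

rng = np.random.default_rng(0)
face = rng.normal(size=(64, 64))              # stand-in for a face image
denoised = suppress_noise_dct(face + 0.1 * rng.normal(size=face.shape))
print(denoised.shape)
```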

  8. Mapping site-based construction workers’ motivation: Expectancy theory approach

    Directory of Open Access Journals (Sweden)

    Parviz Ghoddousi

    2014-03-01

    Full Text Available The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in the construction context and to confirm the validity of this model for the construction industry. The study drew upon data from 194 site-based construction workers in Iran to test the proposed model of motivation. To this end, the structural equation modelling (SEM) approach based on the confirmatory factor analysis (CFA) technique was deployed. The study reveals that the proposed model of expectancy theory, incorporating five indicators (i.e. intrinsic instrumentality, extrinsic instrumentality, intrinsic valence, extrinsic valence and expectancy), is able to map the process of construction workers' motivation. Nonetheless, the findings posit that intrinsic indicators could be more effective than extrinsic ones. This proffers the necessity of construction managers placing further focus on intrinsic motivators to motivate workers.

  9. Mapping site-based construction workers’ motivation: Expectancy theory approach

    Directory of Open Access Journals (Sweden)

    Parviz Ghoddousi

    2014-03-01

    Full Text Available The aim of this study is to apply a recently proposed model of motivation based on expectancy theory to site-based workers in the construction context and to confirm the validity of this model for the construction industry. The study drew upon data from 194 site-based construction workers in Iran to test the proposed model of motivation. To this end, the structural equation modelling (SEM) approach based on the confirmatory factor analysis (CFA) technique was deployed. The study reveals that the proposed model of expectancy theory, incorporating five indicators (i.e. intrinsic instrumentality, extrinsic instrumentality, intrinsic valence, extrinsic valence and expectancy), is able to map the process of construction workers' motivation. Nonetheless, the findings posit that intrinsic indicators could be more effective than extrinsic ones. This proffers the necessity of construction managers placing further focus on intrinsic motivators to motivate workers.

  10. Map Archive Mining: Visual-Analytical Approaches to Explore Large Historical Map Collections

    Directory of Open Access Journals (Sweden)

    Johannes H. Uhl

    2018-04-01

    Full Text Available Historical maps are unique sources of retrospective geographical information. Recently, several map archives containing map series covering large spatial and temporal extents have been systematically scanned and made available to the public. The geographical information contained in such data archives makes it possible to extend geospatial analysis retrospectively beyond the era of digital cartography. However, given the large data volumes of such archives (e.g., more than 200,000 map sheets in the United States Geological Survey topographic map archive) and the low graphical quality of older, manually produced map sheets, the process of extracting geographical information from these map archives needs to be automated to the highest degree possible. To understand the potential challenges (e.g., salient map characteristics and data quality variations) in automating large-scale information extraction tasks for map archives, it is useful to efficiently assess spatio-temporal coverage, approximate map content, and spatial accuracy of georeferenced map sheets at different map scales. Such preliminary analytical steps are often neglected or ignored in the map processing literature but represent critical phases that lay the foundation for any subsequent computational processes including recognition. Exemplified for the United States Geological Survey topographic map and the Sanborn fire insurance map archives, we demonstrate how such preliminary analyses can be systematically conducted using traditional analytical and cartographic techniques, as well as visual-analytical data mining tools originating from machine learning and data science.

  11. CLASSIFICATION ALGORITHMS FOR BIG DATA ANALYSIS, A MAP REDUCE APPROACH

    Directory of Open Access Journals (Sweden)

    V. A. Ayma

    2015-03-01

    Full Text Available For many years, the scientific community has been concerned with increasing the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data generated every day by remote sensors raises further challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.

  12. An Optimization Approach to Improving Collections of Shape Maps

    DEFF Research Database (Denmark)

    Nguyen, Andy; Ben‐Chen, Mirela; Welnicka, Katarzyna

    2011-01-01

    pairwise map independently does not take full advantage of all existing information. For example, a notorious problem with computing shape maps is the ambiguity introduced by the symmetry problem — for two similar shapes which have reflectional symmetry there exist two maps which are equally favorable...... shape maps connecting our collection, we propose to add the constraint of global map consistency, requiring that any composition of maps between two shapes should be independent of the path chosen in the network. This requirement can help us choose among the equally good symmetric alternatives, or help...

  13. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    International Nuclear Information System (INIS)

    Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming

    2016-01-01

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.
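    In schematic form (the notation below is ours, not quoted from the paper), the approach models the line light curve as the continuum convolved with a transfer function expanded in relatively displaced Gaussians:

```latex
% Schematic RM model: line light curve l(t) as the continuum c(t)
% convolved with the transfer function Psi(tau), itself expanded in
% relatively displaced Gaussian response functions of common width omega.
\begin{align}
  l(t) &= \int \Psi(\tau)\, c(t - \tau)\, \mathrm{d}\tau, \\
  \Psi(\tau) &= \sum_{k=1}^{K} f_k
    \exp\!\left[-\frac{(\tau - \tau_k)^2}{2\omega^2}\right].
\end{align}
```

    Here the amplitudes f_k are inferred from the data while the centres tau_k lie on a fixed grid, which is how arbitrary response shapes can be represented without presuming a specific BLR geometry.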

  14. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yan-Rong; Wang, Jian-Min [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Road, Beijing 100049 (China); Bai, Jin-Ming, E-mail: liyanrong@mail.ihep.ac.cn [Yunnan Observatories, Chinese Academy of Sciences, Kunming 650011 (China)

    2016-11-10

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.

  15. Exploring the Influence of Neighborhood Characteristics on Burglary Risks: A Bayesian Random Effects Modeling Approach

    Directory of Open Access Journals (Sweden)

    Hongqiang Liu

    2016-06-01

    Full Text Available A Bayesian random effects modeling approach was used to examine the influence of neighborhood characteristics on burglary risks in Jianghan District, Wuhan, China. This random effects model is essentially spatial; a spatially structured random effects term and an unstructured random effects term are added to the traditional non-spatial Poisson regression model. Based on social disorganization and routine activity theories, five covariates extracted from the available data at the neighborhood level were used in the modeling. Three regression models were fitted and compared by the deviance information criterion to identify which model best fit our data. A comparison of the results from the three models indicates that the Bayesian random effects model is superior to the non-spatial models in fitting the data and estimating regression coefficients. Our results also show that neighborhoods with above average bar density and department store density have higher burglary risks. Neighborhood-specific burglary risks and posterior probabilities of neighborhoods having a burglary risk greater than 1.0 were mapped, indicating the neighborhoods that should warrant more attention and be prioritized for crime intervention and reduction. Implications and limitations of the study are discussed in our concluding section.
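    The model structure described, a Poisson regression augmented with spatially structured and unstructured random effects, is commonly written in the Besag-York-Mollié form (notation assumed, not quoted from the paper), where delta_i denotes the set of neighbours of area i:

```latex
% Poisson regression for neighbourhood burglary counts y_i with expected
% counts E_i, covariates x_ij, a spatially structured CAR effect u_i and
% an unstructured effect v_i.
\begin{align}
  y_i &\sim \mathrm{Poisson}(E_i \theta_i), \\
  \log \theta_i &= \beta_0 + \textstyle\sum_j \beta_j x_{ij} + u_i + v_i, \\
  u_i \mid u_{-i} &\sim \mathcal{N}\!\bigl(\bar{u}_{\delta_i},\ \sigma_u^2 / n_{\delta_i}\bigr),
  \qquad v_i \sim \mathcal{N}(0, \sigma_v^2).
\end{align}
```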

  16. A Probabilistic Approach for Improved Sequence Mapping in Metatranscriptomic Studies

    Science.gov (United States)

Mapping millions of short DNA sequences to a reference genome is a necessary step in many experiments designed to investigate the expression of genes involved in disease resistance. This is a difficult task in which several challenges often arise, resulting in a suboptimal mapping. This mapping process ...

  17. A topographical map approach to representing treatment efficacy: a focus on positive psychology interventions.

    Science.gov (United States)

    Gorlin, Eugenia I; Lee, Josephine; Otto, Michael W

    2018-01-01

A recent meta-analysis by Bolier et al. indicated that positive psychology interventions have overall small to moderate effects on well-being, but results were quite heterogeneous across intervention trials. Such meta-analytic research helps condense information on the efficacy of a broad psychosocial intervention by averaging across many effects; however, such global averages may provide limited navigational guidance for selecting among specific interventions. Here, we introduce a novel method for displaying qualitative and quantitative information on the efficacy of interventions using a topographical map approach. As an initial prototype for demonstrating this method, we mapped 50 positive psychology interventions targeting well-being (as captured in the Bolier et al. [2013] meta-analysis [Bolier, L., Haverman, M., Westerhof, G. J., Riper, H., Smit, F., & Bohlmeijer, E. (2013). Positive psychology interventions: A meta-analysis of randomized controlled studies. BMC Public Health, 13, 83]). Each intervention domain/subdomain was mapped according to its average effect size (indexed by vertical elevation), number of studies providing effect sizes (indexed by horizontal area), and therapist/client burden (indexed by shading). The geographical placement of intervention domains/subdomains was determined by their conceptual proximity, allowing viewers to gauge the general conceptual "direction" in which promising intervention effects can be found. The resulting graphical displays revealed several prominent features of the well-being intervention "landscape," such as more strongly and uniformly positive effects of future-focused interventions (including goal-pursuit and optimism training) compared to past/present-focused ones.

  18. Time delay correlations in chaotic scattering and random matrix approach

    International Nuclear Information System (INIS)

    Lehmann, N.; Savin, D.V.; Sokolov, V.V.; Sommers, H.J.

    1994-01-01

We study the correlations in the time delay in a model of chaotic resonance scattering based on the random matrix approach. Analytical formulae, valid for an arbitrary number of open channels and arbitrary coupling strength between resonances and channels, are obtained by the supersymmetry method. The time delay correlation function, though not a Lorentzian, is characterized, similarly to that of the scattering matrix, by the gap between the cloud of complex poles of the S-matrix and the real energy axis. 28 refs.; 4 figs

  19. Fast periodic stimulation (FPS): a highly effective approach in fMRI brain mapping.

    Science.gov (United States)

    Gao, Xiaoqing; Gentile, Francesco; Rossion, Bruno

    2018-03-03

Defining the neural basis of perceptual categorization in a rapidly changing natural environment with low-temporal resolution methods such as functional magnetic resonance imaging (fMRI) is challenging. Here, we present a novel fast periodic stimulation (FPS)-fMRI approach to define face-selective brain regions with natural images. Human observers are presented with a dynamic stream of widely variable natural object images alternating at a fast rate (6 images/s). Every 9 s, a short burst of variable face images contrasting with object images in pairs induces an objective face-selective neural response at 0.111 Hz. A model-free Fourier analysis achieves a twofold increase in signal-to-noise ratio compared to a conventional block-design approach with identical stimuli and scanning duration, allowing us to derive a comprehensive map of face-selective areas in the ventral occipito-temporal cortex, including the anterior temporal lobe (ATL), in all individual brains. Critically, the periodicity of the desired category contrast and the random variability among widely diverse images effectively eliminate the contribution of low-level visual cues and lead to the highest values (80-90%) of test-retest reliability in the spatial activation map yet reported in imaging higher level visual functions. FPS-fMRI opens a new avenue for understanding brain function with low-temporal resolution methods.
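    The 0.111 Hz target frequency is simply the inverse of the 9 s burst period. A minimal sketch of the model-free Fourier analysis, computing the signal-to-noise ratio at the stimulation frequency against neighbouring bins on a synthetic voxel time series (the repetition time, run length and bin counts are illustrative assumptions, not the paper's values):

```python
# Minimal sketch of a model-free Fourier analysis for FPS designs:
# amplitude at the 0.111 Hz stimulation frequency vs. neighbouring bins.
import numpy as np

TR = 1.0                     # assumed repetition time (s); illustrative only
n_vols = 360                 # 6-minute run, illustrative
t = np.arange(n_vols) * TR
f_target = 1.0 / 9.0         # face bursts every 9 s -> 0.111 Hz

# synthetic voxel time series: periodic response plus noise
rng = np.random.default_rng(1)
signal = 0.5 * np.sin(2 * np.pi * f_target * t) + rng.normal(size=n_vols)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(n_vols, d=TR)
k = np.argmin(np.abs(freqs - f_target))   # bin closest to 0.111 Hz

# SNR: target-bin amplitude over the mean of 20 surrounding bins
neighbours = np.r_[spectrum[k - 10:k], spectrum[k + 1:k + 11]]
print("SNR at 0.111 Hz:", spectrum[k] / neighbours.mean())
```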

  20. Fast and Accurate Approaches for Large-Scale, Automated Mapping of Food Diaries on Food Composition Tables

    Directory of Open Access Journals (Sweden)

    Marc Lamarine

    2018-05-01

Full Text Available Aim of Study: The use of weighed food diaries in nutritional studies provides a powerful method to quantify food and nutrient intakes. Yet, mapping these records onto food composition tables (FCTs) is a challenging, time-consuming and error-prone process. Experts make this effort manually and no automation has been previously proposed. Our study aimed to assess automated approaches to map food items onto FCTs. Methods: We used food diaries (~170,000 records pertaining to 4,200 unique food items) from the DiOGenes randomized clinical trial. We attempted to map these items onto six FCTs available from the EuroFIR resource. Two approaches were tested: the first was based solely on food name similarity (fuzzy matching). The second used a machine learning approach (C5.0 classifier) combining both fuzzy matching and food energy. We tested mapping food items using their original names and also an English translation. Top matching pairs were reviewed manually to derive performance metrics: precision (the percentage of correctly mapped items) and recall (the percentage of mapped items). Results: The simpler approach, fuzzy matching, provided very good performance. Under a relaxed threshold (score > 50%), this approach remapped 99.49% of the items with a precision of 88.75%. With a slightly more stringent threshold (score > 63%), the precision could be significantly improved to 96.81% while keeping a recall rate > 95% (i.e., only 5% of the queried items would not be mapped). The machine learning approach did not lead to any improvements compared to the fuzzy matching. However, it could increase substantially the recall rate for food items without any clear equivalent in the FCTs (+7% and +20% when mapping items using their original or English-translated names, respectively). Our approaches have been implemented as R packages and are freely available from GitHub. Conclusion: This study is the first to provide automated approaches for large-scale food item mapping onto FCTs. We
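    A minimal sketch of threshold-based fuzzy name matching in the spirit of the study, using the standard library's difflib as a stand-in for the (unspecified) string matcher; the food names are invented and the thresholds mirror the 50%/63% cut-offs discussed above:

```python
# Sketch of fuzzy matching of food-diary items onto an FCT with the two
# score thresholds discussed above; difflib is a stand-in matcher.
import difflib

fct = ["whole milk", "semi-skimmed milk", "wheat bread, white", "butter"]

def best_match(item, candidates):
    """Return (score, entry) for the highest-scoring FCT candidate."""
    scored = [(difflib.SequenceMatcher(None, item.lower(), c.lower()).ratio(), c)
              for c in candidates]
    return max(scored)

for diary_item in ["wholemilk", "white bread", "pizza"]:
    score, match = best_match(diary_item, fct)
    if score > 0.63:          # stricter threshold: high precision
        print(f"{diary_item!r} -> {match!r} (score {score:.2f})")
    elif score > 0.50:        # relaxed threshold: higher recall
        print(f"{diary_item!r} -> {match!r} (score {score:.2f}, review manually)")
    else:
        print(f"{diary_item!r}: no mapping")
```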

  1. An optimization approach for extracting and encoding consistent maps in a shape collection

    KAUST Repository

    Huang, Qi-Xing

    2012-11-01

We introduce a novel approach for computing high quality point-to-point maps among a collection of related shapes. The proposed approach takes as input a sparse set of imperfect initial maps between pairs of shapes and builds a compact data structure which implicitly encodes an improved set of maps between all pairs of shapes. These maps align well with point correspondences selected from initial maps; they map neighboring points to neighboring points; and they provide cycle-consistency, so that map compositions along cycles approximate the identity map. The proposed approach is motivated by the fact that a complete set of maps between all pairs of shapes that admits nearly perfect cycle-consistency is highly redundant and can be represented by compositions of maps through a single base shape. In general, multiple base shapes are needed to adequately cover a diverse collection. Our algorithm sequentially extracts such a small collection of base shapes and creates correspondences from each of these base shapes to all other shapes. These correspondences are found by global optimization on candidate correspondences obtained by diffusing initial maps. These are then used to create a compact graphical data structure from which globally optimal cycle-consistent maps can be extracted using simple graph algorithms. Experimental results on benchmark datasets show that the proposed approach yields significantly better results than state-of-the-art data-driven shape matching methods. © 2012 ACM.
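    The cycle-consistency requirement and the base-shape factorisation it enables can be written compactly (notation ours, not quoted from the paper):

```latex
% Cycle-consistency: compositions of maps along any cycle of shapes
% approximate the identity, so every map factors through a base shape B.
\begin{align}
  m_{k \to i} \circ m_{j \to k} \circ m_{i \to j} &\approx \mathrm{id}_{S_i}
  \quad \text{for all cycles } (i, j, k), \\
  m_{i \to j} &= m_{B \to j} \circ m_{i \to B}.
\end{align}
```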

  2. Mapping of multi-floor buildings: A barometric approach

    DEFF Research Database (Denmark)

    Özkil, Ali Gürcan; Fan, Zhun; Xiao, Jizhong

    2011-01-01

This paper presents a new method for mapping multi-floor buildings. The method combines a laser range sensor for metric mapping with a barometric pressure sensor for detecting floor transitions and map segmentation. We exploit the fact that the barometric pressure is a function of the elevation......, and it varies between different floors. The method is tested with a real robot in a typical indoor environment, and the results show that physically consistent multi-floor representations are achievable....
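    A minimal sketch of the barometric idea, assuming the international barometric formula and an illustrative storey height; this is not the authors' implementation, and the pressure readings below are invented:

```python
# Sketch of barometric floor-transition detection: convert pressure to
# altitude with the international barometric formula, then report a
# transition whenever the altitude shifts by roughly one storey.
P0 = 101_325.0                 # sea-level standard pressure (Pa)

def altitude_m(p_pa: float) -> float:
    """International barometric formula (valid in the troposphere)."""
    return 44_330.0 * (1.0 - (p_pa / P0) ** (1.0 / 5.255))

def floor_transitions(pressures_pa, floor_height_m=3.0):
    """Yield (sample index, floor) whenever the estimated floor changes."""
    base = altitude_m(pressures_pa[0])
    floor = 0
    for i, p in enumerate(pressures_pa[1:], start=1):
        new_floor = round((altitude_m(p) - base) / floor_height_m)
        if new_floor != floor:
            floor = new_floor
            yield i, floor

readings = [101_325, 101_320, 101_289, 101_252, 101_250]  # Pa, going upstairs
print(list(floor_transitions(readings)))   # [(2, 1), (3, 2)]
```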

  3. Evaluation of the Achieve Mapping Catheter in cryoablation for atrial fibrillation: a prospective randomized trial.

    Science.gov (United States)

    Gang, Yi; Gonna, Hanney; Domenichini, Giulia; Sampson, Michael; Aryan, Niloufar; Norman, Mark; Behr, Elijah R; Zuberi, Zia; Dhillon, Paramdeep; Gallagher, Mark M

    2016-03-01

The purpose of this study is to establish the role of the Achieve Mapping Catheter in cryoablation for paroxysmal atrial fibrillation (PAF) in a randomized trial. A total of 102 patients undergoing their first ablation for PAF were randomized 2:1 to an Achieve- or Lasso-guided procedure. Study patients were systematically followed up for 12 months with Holter monitoring. The primary study endpoint was acute procedural success. The secondary endpoint was clinical outcome, assessed by freedom from AF at 6 and 12 months after the procedure. Acute procedural success was achieved in 99 % of the 102 participants. In a subgroup analysis of procedures performed by experienced operators, procedure duration was significantly shorter in the Achieve-guided group than in the Lasso-guided group (118 ± 18 vs. 129 ± 21 min, p < 0.05), as was fluoroscopy duration (17 ± 5 vs. 20 ± 7 min, p < 0.05). Across the whole study population, procedure and fluoroscopic durations were similar in the Achieve- (n = 68) and Lasso-guided (n = 34) groups. Transient phrenic nerve weakening was equally prevalent with the Achieve and Lasso. No association was found between clinical outcomes and the mapping catheter used. The use of the second-generation cryoballoon (n = 68) reduced procedure time significantly compared to the first-generation balloon (n = 34); more patients were free of AF in the former than in the latter group during follow-up. The use of the Achieve Mapping Catheter can reduce procedure and fluoroscopic durations compared with Lasso catheters in cryoablation for PAF once operators have gained sufficient experience. The type of mapping catheter used does not affect procedure efficiency or safety with either cryoballoon model.

  4. On the design of henon and logistic map-based random number generator

    Science.gov (United States)

    Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah

    2017-10-01

The key sequence is one of the main elements in a cryptosystem. True Random Number Generators (TRNGs) are one approach to generating the key sequence. The randomness sources of TRNGs divide into three main groups: electrical noise based, jitter based and chaos based. The chaos-based group utilizes a non-linear dynamic system (continuous time or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete time chaotic system is proposed, which is then simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as a harvester method to obtain the series of random bits. Without any post-processing, the proposed design generated a random bit sequence with high entropy value and passed all NIST 800.22 statistical tests.
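    A schematic of the combined 2D/1D design in pure Python, with textbook Hénon and logistic parameters and a simple comparator; the paper's exact parameters and harvesting details are not reproduced here, so everything below is illustrative:

```python
# Schematic chaotic bit generator: iterate a 2D Henon map and a 1D
# logistic map in parallel and harvest bits with a comparator.
# Parameters are standard textbook values, not the paper's.
def henon_logistic_bits(n, x=0.1, y=0.3, z=0.41, a=1.4, b=0.3, r=4.0):
    bits = []
    for _ in range(n):
        x, y = 1.0 - a * x * x + y, b * x   # Henon map (2D), classic a, b
        z = r * z * (1.0 - z)               # logistic map (1D), fully chaotic
        # comparator: compare the two states on a common [0, 1] scale
        bits.append(1 if (x + 1.5) / 3.0 > z else 0)   # x roughly in [-1.5, 1.5]
    return bits

stream = henon_logistic_bits(64)
print("".join(map(str, stream)))
```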

  5. Generation of pseudo-random numbers from given probabilistic distribution with the use of chaotic maps

    Science.gov (United States)

    Lawnik, Marcin

    2018-01-01

The scope of the paper is the presentation of a new method of generating numbers from a given distribution. The method uses the inverse cumulative distribution function and a method of flattening probabilistic distributions. On the grounds of these methods, a new construction of chaotic maps was derived which generates values from a given distribution. The analysis of the new method was conducted on the example of a newly constructed chaotic recurrence, based on the Box-Muller transformation and the quantile function of the exponential distribution. The obtained results certify that the proposed method may be successfully applied to the construction of generators of pseudo-random numbers.
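    A sketch in the spirit of this construction, assuming the fully chaotic logistic map as the source: for r = 4 the transformation u = (2/pi)·arcsin(sqrt(x)) flattens the map's invariant density to uniform, and the exponential quantile function then yields exponentially distributed values. The paper's own recurrence is Box-Muller based; this simpler exponential example only illustrates the principle:

```python
# Flatten the logistic map's invariant density to uniform, then push the
# result through the exponential quantile function F^{-1}(u) = -ln(1-u)/lam.
import math

def chaotic_exponential(n, lam=1.0, x=0.3):
    out = []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)                        # logistic map, r = 4
        u = (2.0 / math.pi) * math.asin(math.sqrt(x))  # uniform on (0, 1)
        out.append(-math.log(1.0 - u) / lam)           # exponential variate
    return out

samples = chaotic_exponential(10_000)
print(sum(samples) / len(samples))   # close to 1.0 for lam = 1
```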

  6. A Hierarchical and Distributed Approach for Mapping Large Applications to Heterogeneous Grids using Genetic Algorithms

    Science.gov (United States)

    Sanyal, Soumya; Jain, Amit; Das, Sajal K.; Biswas, Rupak

    2003-01-01

    In this paper, we propose a distributed approach for mapping a single large application to a heterogeneous grid environment. To minimize the execution time of the parallel application, we distribute the mapping overhead to the available nodes of the grid. This approach not only provides a fast mapping of tasks to resources but is also scalable. We adopt a hierarchical grid model and accomplish the job of mapping tasks to this topology using a scheduler tree. Results show that our three-phase algorithm provides high quality mappings, and is fast and scalable.

  7. Combinatorial theory of the semiclassical evaluation of transport moments. I. Equivalence with the random matrix approach

    Energy Technology Data Exchange (ETDEWEB)

    Berkolaiko, G., E-mail: berko@math.tamu.edu [Department of Mathematics, Texas A and M University, College Station, Texas 77843-3368 (United States); Kuipers, J., E-mail: Jack.Kuipers@physik.uni-regensburg.de [Institut für Theoretische Physik, Universität Regensburg, D-93040 Regensburg (Germany)

    2013-11-15

To study electronic transport through chaotic quantum dots, there are two main theoretical approaches. One involves substituting the quantum system with a random scattering matrix and performing appropriate ensemble averaging. The other treats the transport in the semiclassical approximation and studies correlations among sets of classical trajectories. There are established evaluation procedures within the semiclassical evaluation that, for several linear and nonlinear transport moments to which they were applied, have always resulted in agreement with random matrix predictions. We prove that this agreement is universal: any semiclassical evaluation within the accepted procedures is equivalent to the evaluation within random matrix theory. The equivalence is shown by developing a combinatorial interpretation of the trajectory sets as ribbon graphs (maps) with certain properties and exhibiting systematic cancellations among their contributions. Remaining trajectory sets can be identified with primitive (palindromic) factorisations whose number gives the coefficients in the corresponding expansion of the moments of random matrices. The equivalence is proved for systems with and without time reversal symmetry.

  8. Identification of probabilistic approaches and map-based navigation ...

    Indian Academy of Sciences (India)

    B Madhevan

    2018-02-07

Feb 7, 2018 ... The approach consists of three processes: map learning (ML), localization and path planning (PP) [73–76].

  9. Quantitative Architectural Analysis: A New Approach to Cortical Mapping

    Science.gov (United States)

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-01-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological…

  10. MAP-MRF-Based Super-Resolution Reconstruction Approach for Coded Aperture Compressive Temporal Imaging

    Directory of Open Access Journals (Sweden)

    Tinghua Zhang

    2018-02-01

Full Text Available Coded Aperture Compressive Temporal Imaging (CACTI) can afford low-cost temporal super-resolution (SR), but limits are imposed by noise and compression ratio on reconstruction quality. To utilize inter-frame redundant information from multiple observations and sparsity in multi-transform domains, a robust reconstruction approach based on a maximum a posteriori probability and Markov random field (MAP-MRF) model for CACTI is proposed. The proposed approach adopts a weighted 3D neighbor system (WNS) and the coordinate descent method to perform joint estimation of model parameters, to achieve robust super-resolution reconstruction. The proposed multi-reconstruction algorithm considers both total variation (TV) and the ℓ2,1 norm in the wavelet domain to address the minimization problem for compressive sensing, and solves it using an accelerated generalized alternating projection algorithm. The weighting coefficient for different regularizations and frames is resolved by the motion characteristics of pixels. The proposed approach can provide high visual quality in the foreground and background of a scene simultaneously and enhance the fidelity of the reconstruction results. Simulation results have verified the efficacy of our new optimization framework and the proposed reconstruction approach.

  11. Comparison of spatial association approaches for landscape mapping of soil organic carbon stocks

    Science.gov (United States)

    Miller, B. A.; Koszinski, S.; Wehrhan, M.; Sommer, M.

    2015-03-01

    The distribution of soil organic carbon (SOC) can be variable at small analysis scales, but consideration of its role in regional and global issues demands the mapping of large extents. There are many different strategies for mapping SOC, among which is to model the variables needed to calculate the SOC stock indirectly or to model the SOC stock directly. The purpose of this research is to compare direct and indirect approaches to mapping SOC stocks from rule-based, multiple linear regression models applied at the landscape scale via spatial association. The final products for both strategies are high-resolution maps of SOC stocks (kg m-2), covering an area of 122 km2, with accompanying maps of estimated error. For the direct modelling approach, the estimated error map was based on the internal error estimations from the model rules. For the indirect approach, the estimated error map was produced by spatially combining the error estimates of component models via standard error propagation equations. We compared these two strategies for mapping SOC stocks on the basis of the qualities of the resulting maps as well as the magnitude and distribution of the estimated error. The direct approach produced a map with less spatial variation than the map produced by the indirect approach. The increased spatial variation represented by the indirect approach improved R2 values for the topsoil and subsoil stocks. Although the indirect approach had a lower mean estimated error for the topsoil stock, the mean estimated error for the total SOC stock (topsoil + subsoil) was lower for the direct approach. For these reasons, we recommend the direct approach to modelling SOC stocks be considered a more conservative estimate of the SOC stocks' spatial distribution.
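    The "standard error propagation equations" referred to above take, for a stock computed as a product of independently modelled components, the familiar first-order form (notation ours, and the component list is an assumption about how such stocks are typically computed):

```latex
% First-order (Gaussian) error propagation for an indirectly modelled SOC
% stock, assumed here to be the product of independently modelled SOC
% concentration c, bulk density rho and depth d.
\begin{align}
  \mathrm{SOC} &= c\, \rho\, d, \\
  \frac{\sigma_{\mathrm{SOC}}}{\mathrm{SOC}} &=
    \sqrt{\left(\frac{\sigma_c}{c}\right)^{2}
        + \left(\frac{\sigma_\rho}{\rho}\right)^{2}
        + \left(\frac{\sigma_d}{d}\right)^{2}}.
\end{align}
```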

  12. Volatility of an Indian stock market: A random matrix approach

    International Nuclear Information System (INIS)

    Kulkarni, V.; Deo, N.

    2006-07-01

We examine the volatility of an Indian stock market in terms of aspects like participation, synchronization of stocks and quantification of volatility using the random matrix approach. The volatility pattern of the market is found using the BSE index for the three-year period 2000-2002. Random matrix analysis is carried out using daily returns of 70 stocks for several time windows of 85 days in 2001 to (i) perform a brief comparative analysis of the statistics of eigenvalues and eigenvectors of the matrix C of correlations between price fluctuations in time regimes of different volatilities. While the bulk of eigenvalues falls within RMT bounds in all the time periods, we see that the largest (deviating) eigenvalue correlates well with the volatility of the index, and the corresponding eigenvector clearly shows a shift in the distribution of its components from volatile to less volatile periods, verifying the qualitative association between participation and volatility; (ii) observe that the inverse participation ratio for the last eigenvector is sensitive to market fluctuations (the two quantities are observed to anti-correlate significantly); and (iii) set up a variability index, V, whose temporal evolution is found to be significantly correlated with the volatility of the overall market index. (author)

  13. A Random Walk Approach to Query Informative Constraints for Clustering.

    Science.gov (United States)

    Abin, Ahmad Ali

    2017-08-09

This paper presents a random walk approach to the problem of querying informative constraints for clustering. The proposed method is based on the properties of the commute time, that is, the expected time taken for a random walk on the adjacency graph of the data to travel between two nodes and return. Commute time has the nice property that the more short paths connect two given nodes in a graph, the more similar those nodes are. Since computing the commute time takes the Laplacian eigenspectrum into account, we use this property in a recursive fashion to query informative constraints for clustering. At each recursion, the proposed method constructs the adjacency graph of the data and utilizes the spectral properties of the commute time matrix to bipartition the adjacency graph. Thereafter, the proposed method uses the commute-time distance on the graph to query informative constraints between partitions. This process iterates for each partition until the stop condition becomes true. Experiments on real-world data show the efficiency of the proposed method for constraint selection.
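    The commute time admits a standard spectral expression in terms of the Laplacian eigenpairs (a textbook identity, not quoted from the paper):

```latex
% Spectral form of the commute time on a graph G: lambda_k, u_k are the
% eigenvalues/eigenvectors of the graph Laplacian (lambda_1 = 0 excluded)
% and vol(G) is the sum of node degrees.
\begin{equation}
  C(i, j) = \mathrm{vol}(G) \sum_{k \ge 2}
    \frac{1}{\lambda_k}\, \bigl(u_k(i) - u_k(j)\bigr)^{2}.
\end{equation}
```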

  14. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    Science.gov (United States)

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. Then, we investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized; one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and for understanding complex relationships. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Mapping the Dabus Wetlands, Ethiopia, Using Random Forest Classification of Landsat, PALSAR and Topographic Data

    Directory of Open Access Journals (Sweden)

    Pierre Dubeau

    2017-10-01

    Full Text Available The Dabus Wetland complex in the highlands of Ethiopia is within the headwaters of the Nile Basin and is home to significant ecological communities and rare or endangered species. Its many interrelated wetland types undergo seasonal and longer-term changes due to weather and climate variations as well as anthropogenic land use such as grazing and burning. Mapping and monitoring of these wetlands has not been previously undertaken due primarily to their relative isolation and lack of resources. This study investigated the potential of remote sensing based classification for mapping the primary vegetation groups in the Dabus Wetlands using a combination of dry and wet season data, including optical (Landsat spectral bands and derived vegetation and wetness indices, radar (ALOS PALSAR L-band backscatter, and elevation (SRTM derived DEM and other terrain metrics as inputs to the non-parametric Random Forest (RF classifier. Eight wetland types and three terrestrial/upland classes were mapped using field samples of observed plant community composition and structure groupings as reference information. Various tests to compare results using different RF input parameters and data types were conducted. A combination of multispectral optical, radar and topographic variables provided the best overall classification accuracy, 94.4% and 92.9% for the dry and wet season, respectively. Spectral and topographic data (radar data excluded performed nearly as well, while accuracies using only radar and topographic data were 82–89%. Relatively homogeneous classes such as Papyrus Swamps, Forested Wetland, and Wet Meadow yielded the highest accuracies while spatially complex classes such as Emergent Marsh were more difficult to accurately classify. The methods and results presented in this paper can serve as a basis for development of long-term mapping and monitoring of these and other non-forested wetlands in Ethiopia and other similar environmental settings.
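    A minimal sketch of the per-pixel Random Forest workflow with stacked optical, radar and terrain variables, using scikit-learn on synthetic data as a stand-in for the authors' implementation; feature counts and class numbers follow the record above, everything else is illustrative:

```python
# Sketch of RF wetland classification from stacked multi-source features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_pixels = 5_000
X = np.column_stack([
    rng.normal(size=(n_pixels, 6)),   # Landsat bands / indices (synthetic)
    rng.normal(size=(n_pixels, 2)),   # PALSAR HH/HV backscatter (synthetic)
    rng.normal(size=(n_pixels, 3)),   # DEM-derived terrain metrics (synthetic)
])
y = rng.integers(0, 11, size=n_pixels)  # 8 wetland + 3 upland classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("overall accuracy:", rf.score(X_te, y_te))
print("most important variables:", np.argsort(rf.feature_importances_)[::-1][:5])
```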

  16. Object-based random forest classification of Landsat ETM+ and WorldView-2 satellite imagery for mapping lowland native grassland communities in Tasmania, Australia

    Science.gov (United States)

    Melville, Bethany; Lucieer, Arko; Aryal, Jagannath

    2018-04-01

This paper presents a random forest classification approach for identifying and mapping three types of lowland native grassland communities found in the Tasmanian Midlands region. Due to the high conservation priority assigned to these communities, there has been an increasing need to identify appropriate datasets that can be used to derive accurate and frequently updateable maps of community extent. Therefore, this paper proposes a method employing repeat classification and statistical significance testing as a means of identifying the most appropriate dataset for mapping these communities. Two datasets were acquired and analysed: a Landsat ETM+ scene and a WorldView-2 scene, both from 2010. Training and validation data were randomly subset using a k-fold (k = 50) approach from a pre-existing field dataset. Poa labillardierei, Themeda triandra and lowland native grassland complex communities were identified in addition to dry woodland and agriculture. For each subset of randomly allocated points, a random forest model was trained on each dataset and then used to classify the corresponding imagery. Validation was performed using the reciprocal points from the independent subset that had not been used to train the model. Final training and classification accuracies were reported as per-class means for each satellite dataset. Analysis of Variance (ANOVA) was undertaken to determine whether classification accuracy differed between the two datasets, as well as between classifications. Results showed mean class accuracies between 54% and 87%. Class accuracy only differed significantly between datasets for the dry woodland and Themeda grassland classes, with the WorldView-2 dataset showing higher mean classification accuracies. The results of this study indicate that remote sensing is a viable method for the identification of lowland native grassland communities in the Tasmanian Midlands, and that repeat classification and statistical significance testing can be

  17. An integrated approach to shoreline mapping for spill response planning

    International Nuclear Information System (INIS)

    Owens, E.H.; LeBlanc, S.R.; Percy, R.J.

    1996-01-01

    A desktop mapping package was introduced which has the capability to provide consistent and standardized application of mapping and data collection/generation techniques. Its application in oil spill cleanup was discussed. The data base can be updated easily as new information becomes available. This provides a response team with access to a wide range of information that would otherwise be difficult to obtain. Standard terms and definitions and shoreline segmentation procedures are part of the system to describe the shore-zone character and shore-zone oiling conditions. The program that is in place for Atlantic Canada involves the integration of (1) Environment Canada's SCAT methodology in pre-spill data generation, (2) shoreline segmentation, (3) response management by objectives, (4) Environment Canada's national sensitivity mapping program, and (5) Environment Canada's field guide for the protection and treatment of oiled shorelines. 7 refs., 6 figs

  18. Comparative Assessment of Three Nonlinear Approaches for Landslide Susceptibility Mapping in a Coal Mine Area

    Directory of Open Access Journals (Sweden)

    Qiaomei Su

    2017-07-01

Full Text Available Landslide susceptibility mapping is the first and most important step involved in landslide hazard assessment. The purpose of the present study is to compare three nonlinear approaches for landslide susceptibility mapping and test whether coal mining has a significant impact on landslide occurrence in coal mine areas. Landslide data collected by the Bureau of Land and Resources are represented by the X, Y coordinates of their central points; causative factors were calculated from topographic and geologic maps, as well as satellite imagery. The five-fold cross-validation method was adopted and the landslide/non-landslide datasets were randomly split into a ratio of 80:20. From this, five subsets were acquired 20 times for training and validating models by GIS geostatistical analysis methods, and all of the subsets were employed in a spatially balanced sample design. Three landslide models were built using support vector machine (SVM), logistic regression (LR), and artificial neural network (ANN) models by selecting the median of the performance measures. Then, the three fitted models were compared using the area under the receiver operating characteristic (ROC) curve (AUC) and the performance measures. The results show that the prediction accuracies are between 73.43% and 87.45% in the training stage, and 67.16% to 73.13% in the validating stage for the three models. AUCs vary from 0.807 to 0.906 and 0.753 to 0.944 in the two stages, respectively. Additionally, three landslide susceptibility maps were obtained by classifying the range of landslide probabilities into four classes representing low (0–0.02), medium (0.02–0.1), high (0.1–0.85), and very high (0.85–1) probabilities of landslides. For the distributions of landslide and area percentages under different susceptibility standards, the SVM model has more relative balance in the four classes compared to the LR and the ANN models. The result reveals that the SVM model possesses better
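    A minimal sketch of such a three-model comparison with five-fold cross-validated AUC, using scikit-learn stand-ins on synthetic data (not the authors' dataset, tuning or GIS workflow):

```python
# Sketch of comparing SVM, LR and ANN landslide models by cross-validated AUC.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# synthetic landslide/non-landslide table with nine causative factors
X, y = make_classification(n_samples=600, n_features=9, random_state=0)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC(probability=True)),
    "LR":  make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "ANN": make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0)),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: AUC = {auc.mean():.3f} +/- {auc.std():.3f}")
```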

19. A systematic mapping review of Randomized Controlled Trials (RCTs) in care homes

    Directory of Open Access Journals (Sweden)

    Gordon Adam L

    2012-06-01

Full Text Available Abstract Background A thorough understanding of the literature generated from research in care homes is required to support evidence-based commissioning and delivery of healthcare. So far this research has not been compiled or described. We set out to describe the extent of the evidence base derived from randomized controlled trials conducted in care homes. Methods A systematic mapping review was conducted of the randomized controlled trials (RCTs) conducted in care homes. Medline was searched for “Nursing Home”, “Residential Facilities” and “Homes for the Aged”; CINAHL for “nursing homes”, “residential facilities” and “skilled nursing facilities”; AMED for “Nursing homes”, “Long term care”, “Residential facilities” and “Randomized controlled trial”; and BNI for “Nursing Homes”, “Residential Care” and “Long-term care”. Articles were classified against a keywording strategy describing: year and country of publication; randomization, stratification and blinding methodology; target of intervention; intervention and control treatments; number of subjects and/or clusters; outcome measures; and results. Results 3226 abstracts were identified and 291 articles reviewed in full. Most were recent (median age 6 years) and from the United States. A wide range of targets and interventions were identified. Studies were mostly functional (44 behaviour, 20 prescribing and 20 malnutrition studies) rather than disease-based. Over a quarter focussed on mental health. Conclusions This study is the first to collate data from all RCTs conducted in care homes and represents an important resource for those providing and commissioning healthcare for this sector. The evidence-base is rapidly developing. Several areas (influenza, falls, mobility, fractures, osteoporosis) are appropriate for systematic review. For other topics, researchers need to focus on outcome measures that can be compared and collated.

  20. Evaluation of digital soil mapping approaches with large sets of environmental covariates

    Science.gov (United States)

    Nussbaum, Madlene; Spiess, Kay; Baltensweiler, Andri; Grob, Urs; Keller, Armin; Greiner, Lucie; Schaepman, Michael E.; Papritz, Andreas

    2018-01-01

    The spatial assessment of soil functions requires maps of basic soil properties. Unfortunately, these are either missing for many regions or are not available at the desired spatial resolution or down to the required soil depth. The field-based generation of large soil datasets and conventional soil maps remains costly. Meanwhile, legacy soil data and comprehensive sets of spatial environmental data are available for many regions. Digital soil mapping (DSM) approaches relating soil data (responses) to environmental data (covariates) face the challenge of building statistical models from large sets of covariates originating, for example, from airborne imaging spectroscopy or multi-scale terrain analysis. We evaluated six approaches for DSM in three study regions in Switzerland (Berne, Greifensee, ZH forest) by mapping the effective soil depth available to plants (SD), pH, soil organic matter (SOM), effective cation exchange capacity (ECEC), clay, silt, gravel content and fine fraction bulk density for four soil depths (totalling 48 responses). Models were built from 300-500 environmental covariates by selecting linear models through (1) grouped lasso and (2) an ad hoc stepwise procedure for robust external-drift kriging (georob). For (3) geoadditive models we selected penalized smoothing spline terms by component-wise gradient boosting (geoGAM). We further used two tree-based methods: (4) boosted regression trees (BRTs) and (5) random forest (RF). Lastly, we computed (6) weighted model averages (MAs) from the predictions obtained from methods 1-5. Lasso, georob and geoGAM successfully selected strongly reduced sets of covariates (subsets of 3-6 % of all covariates). Differences in predictive performance, tested on independent validation data, were mostly small and did not reveal a single best method for 48 responses. Nevertheless, RF was often the best among methods 1-5 (28 of 48 responses), but was outcompeted by MA for 14 of these 28 responses. RF tended to over

  1. Application of CPL with Interference Mapping Lithography to generate random contact reticle designs for the 65-nm node

    Science.gov (United States)

    Van Den Broeke, Douglas J.; Laidig, Thomas L.; Chen, J. Fung; Wampler, Kurt E.; Hsu, Stephen D.; Shi, Xuelong; Socha, Robert J.; Dusa, Mircea V.; Corcoran, Noel P.

    2004-08-01

Imaging contact and via layers continues to be one of the major challenges to be overcome for 65nm node lithography. Initial results of using ASML MaskTools' CPL Technology to print contact arrays through pitch have demonstrated the potential to further extend contact imaging to a k1 near 0.30. While there are advantages and disadvantages for any potential RET, the benefits of not having to solve the phase assignment problem (which can lead to unresolvable phase conflicts), of it being a single-reticle, single-exposure technique, and its application to multiple layers within a device (clear field and dark field) make CPL an attractive, cost-effective solution to low-k1 imaging. However, real semiconductor circuit designs consist of much more than regular arrays of contact holes, and a method to define the CPL reticle design for a full-chip circuit pattern is required in order for this technique to be feasible in volume manufacturing. Interference Mapping Lithography (IML) is a novel approach for defining optimum reticle patterns based on the imaging conditions that will be used when the wafer is exposed. Figure 1 shows an interference map for an isolated contact simulated using ASML /1150 settings of 0.75NA and 0.92/0.72/30deg Quasar illumination. This technique provides a model-based approach for placing all types of features (scattering bars, anti-scattering bars, non-printing assist features, phase shifted and non-phase shifted) for the purpose of enhancing the resolution of the target pattern, and it can be applied to any reticle type including binary (COG), attenuated phase shifting mask (attPSM), alternating aperture phase shifting mask (altPSM), and CPL. In this work, we investigate the application of IML to generate CPL reticle designs for random contact patterns that are typical for 65nm node logic devices. We examine the critical issues related to using CPL with Interference Mapping Lithography, including controlling side lobe printing, contact patterns with

  2. Control of spatio-temporal on-off intermittency in random driving diffusively coupled map lattices

    International Nuclear Information System (INIS)

    Ziabakhsh Deilami, M.; Rahmani Cherati, Z.; Jahed Motlagh, M.R.

    2009-01-01

In this paper, we propose feedback methods for controlling spatio-temporal on-off intermittency, which is an aperiodic switching between an 'off' state and an 'on' state. A diffusively coupled map lattice with spatially non-uniform random driving is used to exhibit spatio-temporal on-off intermittency. For this purpose, we apply three different feedbacks. First, we use a linear feedback, which is a simple method but has a long transient time. To overcome this problem, two nonlinear feedbacks based on a prediction strategy are proposed. An important advantage of the methods is that the feedback signal vanishes once control is achieved. Simulation results show that all methods suppress the chaotic behavior.
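    A minimal sketch of the controlled system, assuming a ring lattice of logistic maps with site-dependent parameters as the spatially non-uniform random driving and a linear feedback toward the 'off' state x* = 0; gains, coupling strength and lattice size are illustrative choices, not the paper's:

```python
# Diffusively coupled logistic map lattice with site-dependent random
# parameters, stabilised by a linear feedback x <- y - k*(y - x*).
import numpy as np

N, eps, k, steps = 64, 0.3, 0.8, 200
rng = np.random.default_rng(0)
r = rng.uniform(3.6, 4.0, size=N)      # spatially non-uniform random driving
x = rng.uniform(size=N)
x_star = 0.0                            # laminar 'off' state to stabilise

for _ in range(steps):
    fx = r * x * (1.0 - x)                                           # local maps
    y = (1.0 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
    x = y - k * (y - x_star)            # linear feedback toward x*

print("max deviation from x* after control:", np.abs(x - x_star).max())
```

    With these values the linearised multiplier near x* is (1 - k) * r < 1 at every site, so the lattice settles onto the 'off' state and the feedback term itself decays to zero, mirroring the vanishing-signal property noted above.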

  3. Analytic degree distributions of horizontal visibility graphs mapped from unrelated random series and multifractal binomial measures

    Science.gov (United States)

    Xie, Wen-Jie; Han, Rui-Qi; Jiang, Zhi-Qiang; Wei, Lijian; Zhou, Wei-Xing

    2017-08-01

Complex networks are not only a powerful tool for the analysis of complex systems, but also a promising way to analyze time series. The algorithm of the horizontal visibility graph (HVG) maps time series into graphs, whose degree distributions are numerically and analytically investigated for certain time series. We derive the degree distributions of HVGs through an iterative construction process of HVGs. The degree distributions of the HVG and the directed HVG for random series are derived to be exponential, which confirms the analytical results from other methods. We also obtained the analytical expressions of degree distributions of HVGs and of in-degree and out-degree distributions of directed HVGs transformed from multifractal binomial measures, which agree excellently with numerical simulations.
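    For an i.i.d. random series, the exponential law mentioned above is known in closed form (Luque et al. 2009), independent of the marginal distribution of the data:

```latex
% Exact HVG degree distribution for an i.i.d. random series.
\begin{equation}
  P(k) = \frac{1}{3} \left(\frac{2}{3}\right)^{k-2}, \qquad k = 2, 3, \dots
\end{equation}
```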

  4. A random walk approach to stochastic neutron transport

    International Nuclear Information System (INIS)

    Mulatier, Clelia de

    2015-01-01

    One of the key goals of nuclear reactor physics is to determine the distribution of the neutron population within a reactor core. This population indeed fluctuates due to the stochastic nature of the interactions of the neutrons with the nuclei of the surrounding medium: scattering, emission of neutrons from fission events and capture by nuclear absorption. Due to these physical mechanisms, the stochastic process performed by neutrons is a branching random walk. For most applications, the neutron population considered is very large, and all physical observables related to its behaviour, such as the heat production due to fissions, are well characterised by their average values. Generally, these mean quantities are governed by the classical neutron transport equation, called linear Boltzmann equation. During my PhD, using tools from branching random walks and anomalous diffusion, I have tackled two aspects of neutron transport that cannot be approached by the linear Boltzmann equation. First, thanks to the Feynman-Kac backward formalism, I have characterised the phenomenon of 'neutron clustering' that has been highlighted for low-density configuration of neutrons and results from strong fluctuations in space and time of the neutron population. Then, I focused on several properties of anomalous (non-exponential) transport, that can model neutron transport in strongly heterogeneous and disordered media, such as pebble-bed reactors. One of the novel aspects of this work is that problems are treated in the presence of boundaries. Indeed, even though real systems are finite (confined geometries), most of previously existing results were obtained for infinite systems. (author) [fr

  5. Flow in Random Microstructures: a Multilevel Monte Carlo Approach

    KAUST Repository

    Icardi, Matteo; Tempone, Raul

    2016-01-01

    , where an explicit parametrisation of the input randomness is not available or too expensive. We propose a general-purpose algorithm and computational code for the solution of Partial Differential Equations (PDEs) on random heterogeneous materials. We

  6. Mapping embedded applications on MPSoCs : the MNEMEE approach

    NARCIS (Netherlands)

    Baloukas, C.; Papadopoulos, L.; Soudris, D.; Stuijk, S.; Jovanovic, O.; Schmoll, F.; Cordes, D.; Pyka, A.; Mallik, A.; Mamagkakis, S.; Capman, F.; Collet, S.; Mitas, N.; Kritharidis, D.

    2010-01-01

    As embedded systems are becoming the center of our digital life, system design becomes progressively harder. The integration of multiple features on devices with limited resources requires careful and exhaustive exploration of the design search space in order to efficiently map modern applications

  7. The Facebook Influence Model: A Concept Mapping Approach

    Science.gov (United States)

    Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M.

    2013-01-01

    Abstract Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts. PMID:23621717

  8. Mapping of health facilities in Jimeta Metropolis: a digital approach ...

    African Journals Online (AJOL)

In planning for any suitable development in any field, the primary requirement is the relevant data and maps. This is one of the major problems hindering the proper planning and monitoring of the various health facilities located in Jimeta metropolis. Survey techniques were employed for the acquisition of data, GPS was ...

  9. The Facebook influence model: a concept mapping approach.

    Science.gov (United States)

    Moreno, Megan A; Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M

    2013-07-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts.

  10. A National Approach to Map and Quantify Terrestrial Vertebrate Biodiversity

    Science.gov (United States)

Biodiversity is crucial for the functioning of ecosystems and for the products and services into which we transform the natural assets of the Earth for human survival, security, and well-being. The ability to assess, report, map, and forecast the life support functions of ecosystems is a...

  11. Effect of concept mapping approach on students' achievement in ...

    African Journals Online (AJOL)

The quasi-experimental research design was used in carrying out the study, adopting the pre-test – post-test control type. The sample consists of 180 Senior Secondary One (SS1) students, comprising 88 males and 92 females. In each ... The experimental group was taught mathematical concepts using concept mapping ...

  12. Mediastinal lymph node detection and station mapping on chest CT using spatial priors and random forest

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Jiamin; Hoffman, Joanne; Zhao, Jocelyn; Yao, Jianhua; Lu, Le; Kim, Lauren; Turkbey, Evrim B.; Summers, Ronald M., E-mail: rms@nih.gov [Imaging Biomarkers and Computer-aided Diagnosis Laboratory, Radiology and Imaging Sciences, National Institutes of Health Clinical Center Building, 10 Room 1C224 MSC 1182, Bethesda, Maryland 20892-1182 (United States)

    2016-07-15

    Purpose: To develop an automated system for mediastinal lymph node detection and station mapping for chest CT. Methods: The contextual organs, trachea, lungs, and spine are first automatically identified to locate the region of interest (ROI) (mediastinum). The authors employ shape features derived from Hessian analysis, local object scale, and circular transformation that are computed per voxel in the ROI. Eight more anatomical structures are simultaneously segmented by multiatlas label fusion. Spatial priors are defined as the relative multidimensional distance vectors corresponding to each structure. Intensity, shape, and spatial prior features are integrated and parsed by a random forest classifier for lymph node detection. The detected candidates are then segmented by the following curve evolution process. Texture features are computed on the segmented lymph nodes and a support vector machine committee is used for final classification. For lymph node station labeling, based on the segmentation results of the above anatomical structures, the textual definitions of mediastinal lymph node map according to the International Association for the Study of Lung Cancer are converted into patient-specific color-coded CT image, where the lymph node station can be automatically assigned for each detected node. Results: The chest CT volumes from 70 patients with 316 enlarged mediastinal lymph nodes are used for validation. For lymph node detection, their system achieves 88% sensitivity at eight false positives per patient. For lymph node station labeling, 84.5% of lymph nodes are correctly assigned to their stations. Conclusions: Multiple-channel shape, intensity, and spatial prior features aggregated by a random forest classifier improve mediastinal lymph node detection on chest CT. Using the location information of segmented anatomic structures from the multiatlas formulation enables accurate identification of lymph node stations.

  13. Mediastinal lymph node detection and station mapping on chest CT using spatial priors and random forest

    International Nuclear Information System (INIS)

    Liu, Jiamin; Hoffman, Joanne; Zhao, Jocelyn; Yao, Jianhua; Lu, Le; Kim, Lauren; Turkbey, Evrim B.; Summers, Ronald M.

    2016-01-01

    Purpose: To develop an automated system for mediastinal lymph node detection and station mapping for chest CT. Methods: The contextual organs, trachea, lungs, and spine are first automatically identified to locate the region of interest (ROI) (mediastinum). The authors employ shape features derived from Hessian analysis, local object scale, and circular transformation that are computed per voxel in the ROI. Eight more anatomical structures are simultaneously segmented by multiatlas label fusion. Spatial priors are defined as the relative multidimensional distance vectors corresponding to each structure. Intensity, shape, and spatial prior features are integrated and parsed by a random forest classifier for lymph node detection. The detected candidates are then segmented by the following curve evolution process. Texture features are computed on the segmented lymph nodes and a support vector machine committee is used for final classification. For lymph node station labeling, based on the segmentation results of the above anatomical structures, the textual definitions of mediastinal lymph node map according to the International Association for the Study of Lung Cancer are converted into patient-specific color-coded CT image, where the lymph node station can be automatically assigned for each detected node. Results: The chest CT volumes from 70 patients with 316 enlarged mediastinal lymph nodes are used for validation. For lymph node detection, their system achieves 88% sensitivity at eight false positives per patient. For lymph node station labeling, 84.5% of lymph nodes are correctly assigned to their stations. Conclusions: Multiple-channel shape, intensity, and spatial prior features aggregated by a random forest classifier improve mediastinal lymph node detection on chest CT. Using the location information of segmented anatomic structures from the multiatlas formulation enables accurate identification of lymph node stations.

  14. Partnering with Youth to Map Their Neighborhood Environments: A Multi-Layered GIS Approach

    Science.gov (United States)

    Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T.; Raleigh, Kevin; Miller-Francis, Jenni

    2014-01-01

    Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multi-layered approach for gaining local knowledge of neighborhood environments that engages youth as co-researchers and active knowledge producers. By integrating geographic information systems (GIS) with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youth. Youth report safety and a lack of recreational resources as inhibiting physical activity. Maps reflecting youth perceptions aid policy-makers in making place-based improvements for youth neighborhood environments. PMID:25423245

  15. Soil erodibility mapping using three approaches in the Tangiers province – Northern Morocco

    Directory of Open Access Journals (Sweden)

    Hamza Iaaich

    2016-09-01

    Full Text Available Soil erodibility is a key factor in assessing soil loss rates. In fact, soil loss is the most common form of land degradation in Morocco, affecting vulnerable rural and urban areas. This work deals with large scale mapping of soil erodibility using three mapping approaches: (i) the CORINE approach developed for Europe by the JRC; (ii) the UNEP/FAO approach developed within the frame of the United Nations Environmental Program for the Mediterranean area; (iii) the Universal Soil Loss Equation (USLE) K factor. Our study zone is the province of Tangiers, North-West of Morocco. For each approach, we mapped and analyzed different erodibility factors in terms of parent material, topography and soil attributes. The thematic maps were then integrated using a Geographic Information System to elaborate a soil erodibility map for each of the three approaches. Finally, the validity of each approach was checked in the field, focusing on highly eroded areas, by comparing the estimated soil erodibility with the erosion state observed in the field. We used three statistical indicators for validation: overall accuracy, weighted Kappa factor and omission/commission errors. We found that the UNEP/FAO approach, based principally on lithofacies and topography as mapping inputs, is the best adapted to our study zone, followed by the CORINE approach. The USLE K factor underestimated soil erodibility, especially for highly eroded areas.
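
    The validation step above can be made concrete. The snippet below computes the three named indicators (overall accuracy, weighted Kappa, omission/commission errors) for hypothetical per-site class labels; the label values are invented for illustration only.

      from sklearn.metrics import accuracy_score, cohen_kappa_score, confusion_matrix

      # Hypothetical erodibility classes per validation site (0=low, 1=medium,
      # 2=high): one approach's estimate vs. the field-observed erosion state.
      estimated = [2, 1, 0, 2, 1, 2, 0, 1, 2, 0]
      observed  = [2, 2, 0, 2, 1, 1, 0, 1, 2, 1]
      print("overall accuracy:", accuracy_score(observed, estimated))
      print("weighted kappa:  ", cohen_kappa_score(observed, estimated, weights="linear"))
      cm = confusion_matrix(observed, estimated)
      omission   = 1 - cm.diagonal() / cm.sum(axis=1)   # per-class omission error
      commission = 1 - cm.diagonal() / cm.sum(axis=0)   # per-class commission error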

  16. A taxonomy of behaviour change methods: an Intervention Mapping approach

    OpenAIRE

    Kok, Gerjo; Gottlieb, Nell H.; Peters, Gjalt-Jorn Y.; Mullen, Patricia Dolan; Parcel, Guy S.; Ruiter, Robert A.C.; Fernández, María E.; Markham, Christine; Bartholomew, L. Kay

    2015-01-01

    ABSTRACT In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters fo...

  17. Concept Mapping as an Approach to Facilitate Participatory Intervention Building.

    Science.gov (United States)

    L Allen, Michele; Schaleben-Boateng, Dane; Davey, Cynthia S; Hang, Mikow; Pergament, Shannon

    2015-01-01

    A challenge to addressing community-defined need through community-based participatory intervention building is ensuring that all collaborators' opinions are represented. Concept mapping integrates perspectives of individuals with differing experiences, interests, or expertise into a common visually depicted framework, and ranks composite views on importance and feasibility. To describe the use of concept mapping to facilitate participatory intervention building for a school-based, teacher-focused, positive youth development (PYD) promotion program for Latino, Hmong, and Somali youth. Participants were teachers, administrators, youth, parents, youth workers, and community and university researchers on the project's community collaborative board. We incorporated previously collected qualitative data into the process. In a mixed-methods process we 1) generated statements based on key informant interview and focus group data from youth workers, teachers, parents, and youth in multiple languages regarding ways teachers promote PYD for Somali, Latino and Hmong youth; 2) guided participants to individually sort statements into meaningful groupings and rate them by importance and feasibility; 3) mapped the statements based on their relation to each other using multivariate statistical analyses to identify concepts, and as a group identified labels for each concept; and 4) used labels and statement ratings to identify feasible and important concepts as priorities for intervention development. We identified 12 concepts related to PYD promotion in schools and prioritized 8 for intervention development. Concept mapping facilitated participatory intervention building by formally representing all participants' opinions, generating visual representation of group thinking, and supporting priority setting. Use of prior qualitative work increased the diversity of viewpoints represented.

  18. Evaluating the Use of an Object-Based Approach to Lithological Mapping in Vegetated Terrain

    Directory of Open Access Journals (Sweden)

    Stephen Grebby

    2016-10-01

    Full Text Available Remote sensing-based approaches to lithological mapping are traditionally pixel-oriented, with classification performed on either a per-pixel or sub-pixel basis with complete disregard for contextual information about neighbouring pixels. However, intra-class variability due to heterogeneous surface cover (i.e., vegetation and soil) or regional variations in mineralogy and chemical composition can result in the generation of unrealistic, generalised lithological maps that exhibit the “salt-and-pepper” artefact of spurious pixel classifications, as well as poorly defined contacts. In this study, an object-based image analysis (OBIA) approach to lithological mapping is evaluated with respect to its ability to overcome these issues by instead classifying groups of contiguous pixels (i.e., objects). Due to significant vegetation cover in the study area, the OBIA approach incorporates airborne multispectral and LiDAR data to indirectly map lithologies by exploiting associations with both topography and vegetation type. The resulting lithological maps were assessed both in terms of their thematic accuracy and ability to accurately delineate lithological contacts. The OBIA approach is found to be capable of generating maps with an overall accuracy of 73.5% through integrating spectral and topographic input variables. When compared to equivalent per-pixel classifications, the OBIA approach achieved thematic accuracy increases of up to 13.1%, whilst also reducing the “salt-and-pepper” artefact to produce more realistic maps. Furthermore, the OBIA approach was also generally capable of mapping lithological contacts more accurately. The importance of optimising the segmentation stage of the OBIA approach is also highlighted. Overall, this study clearly demonstrates the potential of OBIA for lithological mapping applications, particularly in significantly vegetated and heterogeneous terrain.

  19. Stakeholder approach, Stakeholders mental model: A visualization test with cognitive mapping technique

    Directory of Open Access Journals (Sweden)

    Garoui Nassreddine

    2012-04-01

    Full Text Available The aim of this paper is to determine the mental models of actors in the firm with respect to the stakeholder approach to corporate governance. Cognitive maps are used to visualize these mental models and to show the ways of thinking and conceptualization of the stakeholder approach. The paper takes a corporate governance perspective and discusses the stakeholder model, applying a cognitive mapping technique.

  20. Mapping Partners Master Drug Dictionary to RxNorm using an NLP-based approach.

    Science.gov (United States)

    Zhou, Li; Plasek, Joseph M; Mahoney, Lisa M; Chang, Frank Y; DiMaggio, Dana; Rocha, Roberto A

    2012-08-01

    To develop an automated method based on natural language processing (NLP) to facilitate the creation and maintenance of a mapping between RxNorm and a local medication terminology for interoperability and meaningful use purposes. We mapped 5961 terms from Partners Master Drug Dictionary (MDD) and 99 of the top prescribed medications to RxNorm. The mapping was conducted at both term and concept levels using an NLP tool, called MTERMS, followed by a manual review conducted by domain experts who created a gold standard mapping. The gold standard was used to assess the overall mapping between MDD and RxNorm and evaluate the performance of MTERMS. Overall, 74.7% of MDD terms and 82.8% of the top 99 terms had an exact semantic match to RxNorm. Compared to the gold standard, MTERMS achieved a precision of 99.8% and a recall of 73.9% when mapping all MDD terms, and a precision of 100% and a recall of 72.6% when mapping the top prescribed medications. The challenges and gaps in mapping MDD to RxNorm are mainly due to unique user or application requirements for representing drug concepts and the different modeling approaches inherent in the two terminologies. An automated approach based on NLP followed by human expert review is an efficient and feasible way for conducting dynamic mapping. Copyright © 2011 Elsevier Inc. All rights reserved.

  1. Supplementary Appendix for: Constrained Perturbation Regularization Approach for Signal Estimation Using Random Matrix Theory

    KAUST Repository

    Suliman, Mohamed Abdalla Elhag; Ballal, Tarig; Kammoun, Abla; Alnaffouri, Tareq Y.

    2016-01-01

    In this supplementary appendix we provide proofs and additional simulation results that complement the paper (constrained perturbation regularization approach for signal estimation using random matrix theory).

  2. Using a Similarity Matrix Approach to Evaluate the Accuracy of Rescaled Maps

    Directory of Open Access Journals (Sweden)

    Peijun Sun

    2018-03-01

    Full Text Available Rescaled maps have been extensively utilized to provide data at the appropriate spatial resolution for use in various Earth science models. However, a simple and easy way to evaluate these rescaled maps has not been developed. We propose a similarity matrix approach using a contingency table to compute three measures: overall similarity (OS), omission error (OE), and commission error (CE) to evaluate the rescaled maps. The Majority Rule Based aggregation (MRB) method was employed to produce the upscaled maps to demonstrate this approach. In addition, previously created, coarser resolution land cover maps from other research projects were also available for comparison. The question of which is better, a map initially produced at coarse resolution or a fine resolution map rescaled to a coarse resolution, has not been quantitatively investigated. To address these issues, we selected study sites at three different extent levels. First, we selected twelve regions covering the continental USA, then we selected nine states (from the whole continental USA), and finally we selected nine Agriculture Statistical Districts (ASDs) (from within the nine selected states) as study sites. Crop/non-crop maps derived from the USDA Crop Data Layer (CDL) at 30 m as base maps were used for the upscaling, and existing maps at 250 m and 1 km were utilized for the comparison. The results showed that a similarity matrix can effectively provide the map user with the information needed to assess the rescaling. Additionally, the upscaled maps can provide higher accuracy and better represent landscape pattern compared to the existing coarser maps. Therefore, we strongly recommend that an evaluation of the upscaled map and the existing coarser resolution map using a similarity matrix should be conducted before deciding which dataset to use for the modelling. Overall, extending our understanding on how to perform an evaluation of the rescaled map and investigation of the applicability
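
    A minimal sketch of the proposed similarity matrix, assuming co-registered label arrays for the base and rescaled maps (a simplification of the actual 30 m / 250 m / 1 km setting):

      import numpy as np

      def similarity_matrix(base, rescaled, n_classes=2):
          # Contingency table between co-registered base-map and rescaled-map
          # labels (e.g., 0 = non-crop, 1 = crop), and the three measures.
          table = np.zeros((n_classes, n_classes))
          for b, r in zip(base.ravel(), rescaled.ravel()):
              table[b, r] += 1
          os_ = np.trace(table) / table.sum()           # overall similarity (OS)
          oe = 1 - np.diag(table) / table.sum(axis=1)   # omission error per class
          ce = 1 - np.diag(table) / table.sum(axis=0)   # commission error per class
          return table, os_, oe, ce

      base = np.random.randint(0, 2, (100, 100))
      rescaled = np.random.randint(0, 2, (100, 100))
      table, os_, oe, ce = similarity_matrix(base, rescaled)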

  3. An image encryption approach based on chaotic maps

    International Nuclear Information System (INIS)

    Zhang Linhua; Liao Xiaofeng; Wang Xuebing

    2005-01-01

    It is well known that images differ from texts in many aspects, such as high redundancy and correlation, local structure, and amplitude-frequency characteristics. As a result, conventional encryption methods are not directly applicable to images. In this paper, we improve the confusion and diffusion properties in terms of discrete exponential chaotic maps, and design a key scheme for resistance to statistical attack, differential attack and grey code attack. Experimental and theoretical results also show that our scheme is efficient and very secure.
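
    As a hedged illustration of the keystream/diffusion idea, the sketch below substitutes the standard logistic map for the paper's discrete exponential chaotic maps and shows only the diffusion (XOR) stage; the confusion (permutation) stage and key schedule are omitted.

      import numpy as np

      def keystream(x0, r, n):
          # Logistic map x -> r*x*(1-x) as an illustrative chaotic source;
          # the paper itself uses discrete exponential chaotic maps.
          xs, x = np.empty(n), x0
          for i in range(n):
              x = r * x * (1.0 - x)
              xs[i] = x
          return (xs * 256).astype(np.uint8)

      def encrypt(img, x0=0.3141, r=3.9999):
          ks = keystream(x0, r, img.size).reshape(img.shape)
          return img ^ ks            # XOR diffusion; decryption reuses the key

      img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
      assert np.array_equal(encrypt(encrypt(img)), img)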

  4. Concept mapping and network analysis: an analytic approach to measure ties among constructs.

    Science.gov (United States)

    Goldman, Alyssa W; Kane, Mary

    2014-12-01

    Group concept mapping is a mixed-methods approach that helps a group visually represent its ideas on a topic of interest through a series of related maps. The maps and additional graphics are useful for planning, evaluation and theory development. Group concept maps are typically described, interpreted and utilized through points, clusters and distances, and the implications of these features in understanding how constructs relate to one another. This paper focuses on the application of network analysis to group concept mapping to quantify the strength and directionality of relationships among clusters. The authors outline the steps of this analysis, and illustrate its practical use through an organizational strategic planning example. Additional benefits of this analysis to evaluation projects are also discussed, supporting the overall utility of this supplemental technique to the standard concept mapping methodology. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. A Structural Modeling Approach to a Multilevel Random Coefficients Model.

    Science.gov (United States)

    Rovine, Michael J.; Molenaar, Peter C. M.

    2000-01-01

    Presents a method for estimating the random coefficients model using covariance structure modeling and allowing one to estimate both fixed and random effects. The method is applied to real and simulated data, including marriage data from J. Belsky and M. Rovine (1990). (SLD)

  6. A Computerized Approach to Trickle-Process, Random Assignment.

    Science.gov (United States)

    Braucht, G. Nicholas; Reichardt, Charles S.

    1993-01-01

    Procedures for implementing random assignment with trickle processing and ways they can be corrupted are described. A computerized method for implementing random assignment with trickle processing is presented as a desirable alternative in many situations and a way of protecting against threats to assignment validity. (SLD)
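
    The abstract describes the design rather than code; one simple way to realize trickle-process random assignment that resists corruption is a pre-generated, seeded, blocked assignment list, sketched below as an assumption rather than the authors' program.

      import random

      def make_assignment_list(n_blocks, conditions=("treatment", "control"), seed=42):
          # Pre-generate a seeded, blocked randomization list so that subjects
          # arriving one at a time (trickle processing) receive assignments
          # that cannot be predicted or steered by the recruiter.
          rng = random.Random(seed)
          sequence = []
          for _ in range(n_blocks):
              block = list(conditions)
              rng.shuffle(block)
              sequence.extend(block)
          return sequence

      assignments = make_assignment_list(50)
      print(assignments[0])   # condition for the first participant to arrive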

  7. Mass spectrometry imaging enriches biomarker discovery approaches with candidate mapping.

    Science.gov (United States)

    Scott, Alison J; Jones, Jace W; Orschell, Christie M; MacVittie, Thomas J; Kane, Maureen A; Ernst, Robert K

    2014-01-01

    Integral to the characterization of radiation-induced tissue damage is the identification of unique biomarkers. Biomarker discovery is a challenging and complex endeavor requiring both sophisticated experimental design and accessible technology. The resources within the National Institute of Allergy and Infectious Diseases (NIAID)-sponsored Consortium, Medical Countermeasures Against Radiological Threats (MCART), allow for leveraging robust animal models with novel molecular imaging techniques. One such imaging technique, MALDI (matrix-assisted laser desorption ionization) mass spectrometry imaging (MSI), allows for the direct spatial visualization of lipids, proteins, small molecules, and drugs/drug metabolites, or biomarkers, in an unbiased manner. MALDI-MSI acquires mass spectra directly from an intact tissue slice in discrete locations across an x, y grid that are then rendered into a spatial distribution map composed of ion mass and intensity. The unique mass signals can be plotted to generate a spatial map of biomarkers that reflects pathology and molecular events. The crucial unanswered questions that can be addressed with MALDI-MSI include identification of biomarkers for radiation damage that reflect the response to radiation dose over time and the efficacy of therapeutic interventions. Techniques in MALDI-MSI also enable integration of biomarker identification among diverse animal models. Analysis of early, sublethally irradiated tissue injury samples from diverse mouse tissues (lung and ileum) shows membrane phospholipid signatures correlated with histological features of these unique tissues. This paper will discuss the application of MALDI-MSI for use in a larger biomarker discovery pipeline.

  8. Landslide Inventory Mapping from Bitemporal 10 m SENTINEL-2 Images Using Change Detection Based Markov Random Field

    Science.gov (United States)

    Qin, Y.; Lu, P.; Li, Z.

    2018-04-01

    Landslide inventory mapping is essential for hazard assessment and mitigation. In most previous studies, landslide mapping was achieved by visual interpretation of aerial photos and remote sensing images. However, such method is labor-intensive and time-consuming, especially over large areas. Although a number of semi-automatic landslide mapping methods have been proposed over the past few years, limitations remain in terms of their applicability over different study areas and data, and there is large room for improvement in terms of the accuracy and automation degree. For these reasons, we developed a change detection-based Markov Random Field (CDMRF) method for landslide inventory mapping. The proposed method mainly includes two steps: 1) change detection-based multi-threshold for training samples generation and 2) MRF for landslide inventory mapping. Compared with the previous methods, the proposed method in this study has three advantages: 1) it combines multiple image difference techniques with multi-threshold method to generate reliable training samples; 2) it takes the spectral characteristics of landslides into account; and 3) it is highly automatic with little parameter tuning. The proposed method was applied for regional landslides mapping from 10 m Sentinel-2 images in Western China. Results corroborated the effectiveness and applicability of the proposed method especially the capability of rapid landslide mapping. Some directions for future research are offered. This study to our knowledge is the first attempt to map landslides from free and medium resolution satellite (i.e., Sentinel-2) images in China.
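
    A hedged sketch of the training-sample step (image differencing plus multi-threshold labelling) is given below; the band choice, thresholds, and toy data are assumptions, and the subsequent MRF step is only indicated in comments.

      import numpy as np

      # Image differencing between pre- and post-event NIR bands, with
      # multi-threshold rules labelling confident landslide / non-landslide
      # pixels as training samples (toy data, assumed thresholds).
      pre = np.random.rand(200, 200)
      post = np.clip(pre - 0.2 * (np.random.rand(200, 200) > 0.95), 0, 1)
      diff = pre - post                      # landslides lower NIR reflectance
      landslide_seed = diff > 0.15           # confident change -> positive samples
      stable_seed = np.abs(diff) < 0.02      # confident no-change -> negatives
      # The MRF step would then propagate these labels, balancing per-pixel
      # evidence against spatial smoothness of the final landslide map.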

  9. Random matrix approach to plasmon resonances in the random impedance network model of disordered nanocomposites

    Science.gov (United States)

    Olekhno, N. A.; Beltukov, Y. M.

    2018-05-01

    Random impedance networks are widely used as a model to describe plasmon resonances in disordered metal-dielectric and other two-component nanocomposites. In the present work, the spectral properties of resonances in random networks are studied within the framework of random matrix theory. We have shown that the appropriate ensemble of random matrices for the considered problem is the Jacobi ensemble (the MANOVA ensemble). The obtained analytical expressions for the density of states in such resonant networks show good agreement with the results of numerical simulations over a wide range of metal filling fractions.

  10. Expectation-based approach for one-dimensional randomly disordered phononic crystals

    International Nuclear Information System (INIS)

    Wu, Feng; Gao, Qiang; Xu, Xiaoming; Zhong, Wanxie

    2014-01-01

    An expectation-based statistical approach is proposed for the one-dimensional randomly disordered phononic crystal. In the proposed approach, the expectations of the random eigenstates of randomly disordered phononic crystals are investigated. In terms of the expectations of the random eigenstates, the wave propagation and localization phenomena in the random phononic crystal can be understood from a statistical perspective. Using the proposed approach, it is proved that for a randomly disordered phononic crystal, the Bloch theorem holds in the sense of expectation. A one-dimensional randomly disordered binary phononic crystal consisting of two materials with random geometry size or random physical parameters is addressed using the proposed approach. From the results, it can be observed that as the disorder degree increases, the localization of the expectations of the eigenstates is strengthened. The effect of the random disorder on the eigenstates at higher frequencies is more significant than at lower frequencies. Furthermore, after introducing random disorder into phononic crystals, some random divergent eigenstates are changed to localized eigenstates in the expectation sense.

  11. A cluster expansion approach to exponential random graph models

    International Nuclear Information System (INIS)

    Yin, Mei

    2012-01-01

    The exponential family of random graphs is among the most widely studied network models. We show that any exponential random graph model may alternatively be viewed as a lattice gas model with a finite Banach space norm. The system may then be treated using cluster expansion methods from statistical mechanics. In particular, we derive a convergent power series expansion for the limiting free energy in the case of small parameters. Since the free energy is the generating function for the expectations of other random variables, this characterizes the structure and behavior of the limiting network in this parameter region.

  12. Characteristics of quantum open systems: free random variables approach

    International Nuclear Information System (INIS)

    Gudowska-Nowak, E.; Papp, G.; Brickmann, J.

    1998-01-01

    Random Matrix Theory provides an interesting tool for modelling a number of phenomena where noises (fluctuations) play a prominent role. Various applications range from the theory of mesoscopic systems in nuclear and atomic physics to biophysical models, like Hopfield-type models of neural networks and protein folding. Random Matrix Theory is also used to study dissipative systems with broken time-reversal invariance, providing a setup for analysis of dynamic processes in condensed, disordered media. In the paper we use Random Matrix Theory (RMT) within the formalism of Free Random Variables (alias Blue's functions), which allows one to characterize the spectral properties of non-Hermitean "Hamiltonians". The relevance of the Blue's function method is discussed in connection with the application of non-Hermitean operators to various problems of physical chemistry. (author)

  13. Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness

    Energy Technology Data Exchange (ETDEWEB)

    Chelouche, Doron; Pozo-Nuñez, Francisco [Department of Physics, Faculty of Natural Sciences, University of Haifa, Haifa 3498838 (Israel); Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il [Department of Geosciences, Raymond and Beverly Sackler Faculty of Exact Sciences, Tel Aviv University, Tel Aviv 6997801 (Israel)

    2017-08-01

    A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann’s mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size–luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.
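
    The core of the optimized scheme is easy to state in code: shift the echo light curve by a trial lag, merge it with the driving curve, and pick the lag that minimizes the von Neumann mean-square successive-difference of the combined series. The sketch below assumes both curves are already normalized to zero mean and unit variance.

      import numpy as np

      def von_neumann(t1, f1, t2, f2, lag):
          # Combine the driving light curve with the lag-shifted echo curve,
          # order by time, and measure the mean-square successive difference:
          # the true lag minimizes this randomness measure.
          t = np.concatenate([t1, t2 - lag])
          f = np.concatenate([f1, f2])[np.argsort(t)]
          return np.mean(np.diff(f) ** 2)

      def best_lag(t1, f1, t2, f2, lags):
          return min(lags, key=lambda lag: von_neumann(t1, f1, t2, f2, lag))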

  14. Conjecture Mapping: An Approach to Systematic Educational Design Research

    Science.gov (United States)

    Sandoval, William

    2014-01-01

    Design research is strongly associated with the learning sciences community, and in the 2 decades since its conception it has become broadly accepted. Yet within and without the learning sciences there remains confusion about how to do design research, with most scholarship on the approach describing what it is rather than how to do it. This…

  15. Mind Map Marketing: A Creative Approach in Developing Marketing Skills

    Science.gov (United States)

    Eriksson, Lars Torsten; Hauer, Amie M.

    2004-01-01

    In this conceptual article, the authors describe an alternative course structure that joins learning key marketing concepts to creative problem solving. The authors describe an approach using a convergent-divergent-convergent (CDC) process: key concepts are first derived from case material to be organized in a marketing matrix, which is then used…

  16. Exploring teacher's perceptions of concept mapping as a teaching strategy in science: An action research approach

    Science.gov (United States)

    Marks Krpan, Catherine Anne

    In order to promote science literacy in the classroom, students need opportunities in which they can personalize their understanding of the concepts they are learning. Current literature supports the use of concept maps in enabling students to make personal connections in their learning of science. Because they involve creating explicit connections between concepts, concept maps can assist students in developing metacognitive strategies and assist educators in identifying misconceptions in students' thinking. The literature also notes that concept maps can improve student achievement and recall. Much of the current literature focuses primarily on concept mapping at the secondary and university levels, with limited focus on the elementary panel. The research rarely considers teachers' thoughts and ideas about the concept mapping process. In order to effectively explore concept mapping from the perspective of elementary teachers, I felt that an action research approach would be appropriate. Action research enabled educators to debate issues about concept mapping and test out ideas in their classrooms. It also afforded the participants opportunities to explore their own thinking, reflect on their personal journeys as educators and play an active role in their professional development. In an effort to explore concept mapping from the perspective of elementary educators, an action research group of 5 educators and myself was established and met regularly from September 1999 until June 2000. All of the educators taught in the Toronto area. These teachers were interested in exploring how concept mapping could be used as a learning tool in their science classrooms. In summary, this study explores the journey of five educators and myself as we engaged in collaborative action research. This study sets out to: (1) Explore how educators believe concept mapping can facilitate teaching and student learning in the science classroom. (2) Explore how educators implement concept

  17. A National Approach to Quantify and Map Biodiversity ...

    Science.gov (United States)

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national interests for integrating ecology with economics to help understand the effects of human policies and actions and their subsequent impacts on both ecosystem function and human welfare. The degradation of natural ecosystems and climate variation impact the environment and society by affecting ecological integrity and ecosystems’ capacity to provide critical services (i.e., the contributions of ecosystems to human well-being). These challenges will require complex management decisions that can often involve significant trade-offs between societal desires and environmental needs. Evaluating trade-offs in terms of ecosystem services and human well-being provides an intuitive and comprehensive way to assess the broad implications of our decisions and to help shape policies that enhance environmental and social sustainability. In answer to this challenge, the U.S. government has created a partnership among the U.S. Environmental Protection Agency, other Federal agencies, academic institutions, and Non-Governmental Organizations to develop the EnviroAtlas, an online Decision Support Tool that allows users (e.g., planners, policy-makers, resource managers, NGOs, and private industry…

  18. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    Science.gov (United States)

    Dhindsa, Harkirat S.; Makarimi-Kasim; Anderson, O. Roger

    2011-01-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample…

  19. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti.

    Science.gov (United States)

    Wampler, Peter J; Rediske, Richard R; Molla, Azizur R

    2013-01-18

    A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only rarely was local knowledge required to identify and locate households. This…

  20. Using ArcMap, Google Earth, and Global Positioning Systems to select and locate random households in rural Haiti

    Directory of Open Access Journals (Sweden)

    Wampler Peter J

    2013-01-01

    Full Text Available Abstract Background A remote sensing technique was developed which combines a Geographic Information System (GIS), Google Earth, and Microsoft Excel to identify home locations for a random sample of households in rural Haiti. The method was used to select homes for ethnographic and water quality research in a region of rural Haiti located within 9 km of a local hospital and source of health education in Deschapelles, Haiti. The technique does not require access to governmental records or ground based surveys to collect household location data and can be performed in a rapid, cost-effective manner. Methods The random selection of households and the location of these households during field surveys were accomplished using GIS, Google Earth, Microsoft Excel, and handheld Garmin GPSmap 76CSx GPS units. Homes were identified and mapped in Google Earth, exported to ArcMap 10.0, and a random list of homes was generated using Microsoft Excel which was then loaded onto handheld GPS units for field location. The development and use of a remote sensing method was essential to the selection and location of random households. Results A total of 537 homes initially were mapped and a randomized subset of 96 was identified as potential survey locations. Over 96% of the homes mapped using Google Earth imagery were correctly identified as occupied dwellings. Only 3.6% of the occupants of mapped homes visited declined to be interviewed. 16.4% of the homes visited were not occupied at the time of the visit due to work away from the home or market days. A total of 55 households were located using this method during the 10 days of fieldwork in May and June of 2012. Conclusions The method used to generate and field locate random homes for surveys and water sampling was an effective means of selecting random households in a rural environment lacking geolocation infrastructure. The success rate for locating households using a handheld GPS was excellent and only…
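
    The selection step itself reduces to a seeded random draw from the digitized home list. The sketch below is an assumption-laden stand-in (a hypothetical CSV export and Python in place of the Excel workflow the authors used):

      import csv, random

      def select_households(path, n, seed=2012):
          # 'path' is a hypothetical CSV export (id, lat, lon) of the homes
          # digitized in Google Earth; the study itself used Excel here.
          with open(path) as fh:
              homes = list(csv.DictReader(fh))
          return random.Random(seed).sample(homes, n)

      # e.g., draw the randomized subset of 96 from the 537 mapped homes and
      # print waypoints for loading onto the handheld GPS units.
      for home in select_households("mapped_homes.csv", 96):
          print(home["id"], home["lat"], home["lon"])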

  1. Mapping between Classical Risk Management and Game Theoretical Approaches

    OpenAIRE

    Rajbhandari , Lisa; Snekkenes , Einar ,

    2011-01-01

    In a typical classical risk assessment approach, the probabilities are usually guessed and not much guidance is provided on how to get the probabilities right. When coming up with probabilities, people are generally not well calibrated. History may not always be a very good teacher. Hence, in this paper, we explain how game theory can be integrated into classical risk management. Game theory puts emphasis on collecting representative data on h...

  2. Flow in Random Microstructures: a Multilevel Monte Carlo Approach

    KAUST Repository

    Icardi, Matteo

    2016-01-06

    In this work we are interested in the fast estimation of effective parameters of random heterogeneous materials using Multilevel Monte Carlo (MLMC). MLMC is an efficient and flexible solution for the propagation of uncertainties in complex models, where an explicit parametrisation of the input randomness is not available or is too expensive. We propose a general-purpose algorithm and computational code for the solution of Partial Differential Equations (PDEs) on random heterogeneous materials. We make use of the key idea of MLMC, based on different discretization levels, extending it in a more general context, making use of a hierarchy of physical resolution scales, solvers, models and other numerical/geometrical discretisation parameters. Modifications of the classical MLMC estimators are proposed to further reduce variance in cases where analytical convergence rates and asymptotic regimes are not available. Spheres, ellipsoids and general convex-shaped grains are placed randomly in the domain with different placing/packing algorithms and the effective properties of the heterogeneous medium are computed. These are, for example, effective diffusivities, conductivities, and reaction rates. The Monte Carlo estimators, the statistical sampling and each individual solver are implemented efficiently in parallel. The method is tested and applied for pore-scale simulations of random sphere packings.
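
    A minimal sketch of the MLMC telescoping estimator, with a coupled toy sampler standing in for the PDE solves on random geometries:

      import numpy as np

      def mlmc_estimate(sampler, n_samples):
          # Telescoping MLMC sum: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}],
          # with most samples drawn on the cheap coarse levels.
          est = 0.0
          for level, n in enumerate(n_samples):
              fine, coarse = sampler(level, n)   # coupled draws at levels l, l-1
              est += np.mean(fine - coarse)
          return est

      rng = np.random.default_rng(0)
      def toy_sampler(level, n):
          # Coupled toy model standing in for a solver on a random geometry:
          # both levels see the same underlying randomness w.
          w = rng.normal(0.0, 1.0, n)
          fine = 1.0 + w * 2.0 ** -(level + 1)
          coarse = np.zeros(n) if level == 0 else 1.0 + w * 2.0 ** -level
          return fine, coarse

      print(mlmc_estimate(toy_sampler, [1000, 400, 100]))   # approx. 1.0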

  3. MODIS 250m burned area mapping based on an algorithm using change point detection and Markov random fields.

    Science.gov (United States)

    Mota, Bernardo; Pereira, Jose; Campagnolo, Manuel; Killick, Rebecca

    2013-04-01

    Area burned in tropical savannas of Brazil was mapped using MODIS-AQUA daily 250 m resolution imagery by adapting one of the European Space Agency fire_CCI project burned area algorithms, based on change point detection and Markov random fields. The study area covers 1.44 Mkm² and the analysis used data from 2005. The daily 1000 m image quality layer was used for cloud and cloud shadow screening. The algorithm addresses each pixel as a time series and detects changes in the statistical properties of NIR reflectance values to identify potential burning dates. The first step of the algorithm is robust filtering, to exclude outlier observations, followed by application of the Pruned Exact Linear Time (PELT) change point detection technique. Near-infrared (NIR) spectral reflectance changes between time segments, and post-change NIR reflectance values, are combined into a fire likelihood score. Change points corresponding to an increase in reflectance are dismissed as potential burn events, as are those occurring outside of a pre-defined fire season. In the last step of the algorithm, monthly burned area probability maps and detection date maps are converted to dichotomous (burned-unburned) maps using Markov random fields, which take into account both spatial and temporal relations in the potential burned area maps. A preliminary assessment of our results is performed by comparison with data from the MODIS 1 km active fires and the 500 m burned area products, taking into account differences in spatial resolution between the two sensors.
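
    The per-pixel change point step can be illustrated with the third-party ruptures package, which implements PELT; the penalty value and synthetic series below are assumptions.

      import numpy as np
      import ruptures as rpt   # third-party package implementing PELT

      # Hypothetical NIR reflectance time series for a single pixel: a sharp
      # drop in the mean marks a candidate burn date.
      rng = np.random.default_rng(1)
      nir = np.concatenate([rng.normal(0.35, 0.02, 120),
                            rng.normal(0.18, 0.02, 80)])
      breakpoints = rpt.Pelt(model="l2").fit(nir).predict(pen=1.0)
      print(breakpoints)   # segment end indices; expect a break near 120
      # The drop in mean NIR between segments would then feed the fire
      # likelihood score, and the MRF step enforces spatio-temporal coherence.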

  4. Random matrix approach to cross correlations in financial data

    Science.gov (United States)

    Plerou, Vasiliki; Gopikrishnan, Parameswaran; Rosenow, Bernd; Amaral, Luís A.; Guhr, Thomas; Stanley, H. Eugene

    2002-06-01

    We analyze cross correlations between price fluctuations of different stocks using methods of random matrix theory (RMT). Using two large databases, we calculate cross-correlation matrices C of returns constructed from (i) 30-min returns of 1000 US stocks for the 2-yr period 1994-1995, (ii) 30-min returns of 881 US stocks for the 2-yr period 1996-1997, and (iii) 1-day returns of 422 US stocks for the 35-yr period 1962-1996. We test the statistics of the eigenvalues λi of C against a ``null hypothesis'' - a random correlation matrix constructed from mutually uncorrelated time series. We find that a majority of the eigenvalues of C fall within the RMT bounds [λ-,λ+] for the eigenvalues of random correlation matrices. We test the eigenvalues of C within the RMT bound for universal properties of random matrices and find good agreement with the results for the Gaussian orthogonal ensemble of random matrices-implying a large degree of randomness in the measured cross-correlation coefficients. Further, we find that the distribution of eigenvector components for the eigenvectors corresponding to the eigenvalues outside the RMT bound display systematic deviations from the RMT prediction. In addition, we find that these ``deviating eigenvectors'' are stable in time. We analyze the components of the deviating eigenvectors and find that the largest eigenvalue corresponds to an influence common to all stocks. Our analysis of the remaining deviating eigenvectors shows distinct groups, whose identities correspond to conventionally identified business sectors. Finally, we discuss applications to the construction of portfolios of stocks that have a stable ratio of risk to return.
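
    The null-hypothesis test at the heart of this analysis is easy to reproduce: compare the eigenvalue spectrum of an empirical correlation matrix against the RMT bounds for mutually uncorrelated series. The sketch below runs the pure-noise case; with real returns, the eigenvalues escaping the bounds carry the market and sector structure.

      import numpy as np

      T, N = 2000, 400                      # T returns for each of N stocks
      R = np.random.normal(size=(T, N))     # null model: uncorrelated series
      R = (R - R.mean(axis=0)) / R.std(axis=0)
      C = R.T @ R / T                       # empirical cross-correlation matrix
      eigvals = np.linalg.eigvalsh(C)

      Q = T / N                             # RMT bounds for random correlations
      lam_min, lam_max = (1 - Q ** -0.5) ** 2, (1 + Q ** -0.5) ** 2
      n_out = np.sum((eigvals < lam_min) | (eigvals > lam_max))
      print(f"{n_out} of {N} eigenvalues outside [{lam_min:.2f}, {lam_max:.2f}]")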

  5. Comparing the performance of cluster random sampling and integrated threshold mapping for targeting trachoma control, using computer simulation.

    Directory of Open Access Journals (Sweden)

    Jennifer L Smith

    Full Text Available Implementation of trachoma control strategies requires reliable district-level estimates of trachomatous inflammation-follicular (TF, generally collected using the recommended gold-standard cluster randomized surveys (CRS. Integrated Threshold Mapping (ITM has been proposed as an integrated and cost-effective means of rapidly surveying trachoma in order to classify districts according to treatment thresholds. ITM differs from CRS in a number of important ways, including the use of a school-based sampling platform for children aged 1-9 and a different age distribution of participants. This study uses computerised sampling simulations to compare the performance of these survey designs and evaluate the impact of varying key parameters.Realistic pseudo gold standard data for 100 districts were generated that maintained the relative risk of disease between important sub-groups and incorporated empirical estimates of disease clustering at the household, village and district level. To simulate the different sampling approaches, 20 clusters were selected from each district, with individuals sampled according to the protocol for ITM and CRS. Results showed that ITM generally under-estimated the true prevalence of TF over a range of epidemiological settings and introduced more district misclassification according to treatment thresholds than did CRS. However, the extent of underestimation and resulting misclassification was found to be dependent on three main factors: (i the district prevalence of TF; (ii the relative risk of TF between enrolled and non-enrolled children within clusters; and (iii the enrollment rate in schools.Although in some contexts the two methodologies may be equivalent, ITM can introduce a bias-dependent shift as prevalence of TF increases, resulting in a greater risk of misclassification around treatment thresholds. In addition to strengthening the evidence base around choice of trachoma survey methodologies, this study illustrates

  6. Effective information flow through efficient supply chain management - Value stream mapping approach Case Outokumpu Tornio Works

    OpenAIRE

    Juvonen, Piia

    2012-01-01

    ABSTRACT Juvonen, Piia Suvi Päivikki 2012. Effective information flow through efficient supply chain management - Value stream mapping approach - Case Outokumpu Tornio Works. Master's Thesis. Kemi-Tornio University of Applied Sciences. Business and Culture. Pages 63. Appendices 2. The general aim of this thesis is to explore effective information flow through efficient supply chain management by following one of the lean management principles, value stream mapping. The specific research...

  7. Analytic mappings: a new approach in particle production by accelerated observers

    International Nuclear Information System (INIS)

    Sanchez, N.

    1982-01-01

    This is a summary of the author's recent results on the physical consequences of analytic mappings in space-time. Classically, the mapping defines an accelerated frame. At the quantum level, it gives rise to particle production. Statistically, the real singularities of the mapping have associated temperatures. This concerns a new approach to Q.F.T. as formulated in accelerated frames. It has been considered as a first step in understanding the deep connection that could exist between the structure (geometry and topology) of space-time and thermodynamics, mainly motivated by the works of Hawking since 1975. (Auth.)

  8. A Self-Adaptive Evolutionary Approach to the Evolution of Aesthetic Maps for a RTS Game

    OpenAIRE

    Lara-Cabrera, Raúl; Cotta, Carlos; Fernández-Leiva, Antonio J.

    2014-01-01

    Procedural content generation (PCG) is a research field on the rise, with numerous papers devoted to this topic. This paper presents a PCG method based on a self-adaptive evolution strategy for the automatic generation of maps for the real-time strategy (RTS) game PlanetWars. These maps are generated in order to fulfill the aesthetic preferences of the user, as implied by her assessment of a collection of maps used as a training set. A topological approach is used for the characterization of th...

  9. Rapid Construction of Fe-Co-Ni Composition-Phase Map by Combinatorial Materials Chip Approach.

    Science.gov (United States)

    Xing, Hui; Zhao, Bingbing; Wang, Yujie; Zhang, Xiaoyi; Ren, Yang; Yan, Ningning; Gao, Tieren; Li, Jindong; Zhang, Lanting; Wang, Hong

    2018-03-12

    One hundred nanometer thick Fe-Co-Ni material chips were prepared and isothermally annealed at 500, 600, and 700 °C, respectively. Pixel-by-pixel composition and structural mapping was performed by microbeam X-ray diffraction at a synchrotron light source. Diffraction images were recorded at a rate of 1 pattern/s. The XRD patterns were automatically processed, phase-identified, and categorized by a hierarchical clustering algorithm to construct the composition-phase map. The resulting maps are consistent with the corresponding isothermal sections reported in the ASM Alloy Phase Diagram Database, verifying the effectiveness of the present approach in phase diagram construction.
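
    The clustering step translates directly into a few lines of SciPy; the pattern stack and cluster count below are placeholders, not the study's data or settings.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster

      # Hypothetical stack of background-subtracted, normalised XRD patterns,
      # one per composition pixel on the Fe-Co-Ni chip.
      rng = np.random.default_rng(2)
      patterns = rng.random((500, 1024))
      Z = linkage(patterns, method="ward")             # hierarchical clustering
      labels = fcluster(Z, t=4, criterion="maxclust")  # e.g., four phase regions
      # Plotting 'labels' against the measured per-pixel compositions yields
      # the composition-phase map; the cluster count here is an assumption.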

  10. A taxonomy of behaviour change methods: an Intervention Mapping approach.

    Science.gov (United States)

    Kok, Gerjo; Gottlieb, Nell H; Peters, Gjalt-Jorn Y; Mullen, Patricia Dolan; Parcel, Guy S; Ruiter, Robert A C; Fernández, María E; Markham, Christine; Bartholomew, L Kay

    2016-09-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. Ideally, the taxonomy should be readily usable for this goal; but alternatively, it should be

  11. Predictive Mapping of Dwarf Shrub Vegetation in an Arid High Mountain Ecosystem Using Remote Sensing and Random Forests

    Directory of Open Access Journals (Sweden)

    Kim André Vanselow

    2014-07-01

    Full Text Available In many arid mountains, dwarf shrubs represent the most important fodder and firewood resources; therefore, they are intensely used. For the Eastern Pamirs (Tajikistan), they are assumed to be overused. However, empirical evidence on this issue is lacking. We aim to provide a method capable of mapping vegetation in this mountain desert. We used random forest models based on remote sensing data (RapidEye, ASTER GDEM) and 359 plots to predictively map total vegetative cover and the distribution of the most important firewood plants, K. ceratoides and A. leucotricha. These species were mapped as present in 33.8% of the study area (accuracy 90.6%). The total cover of the dwarf shrub communities ranged from 0.5% to 51% (per pixel). Areas with very low cover were limited to the vicinity of roads and settlements. The model could explain 80.2% of the total variance. The most important predictor across the models was MSAVI2 (a spectral vegetation index specifically designed for low-cover areas). We conclude that the combination of statistical models and remote sensing data worked well to map vegetation in an arid mountainous environment. With this approach, we were able to provide tangible data on dwarf shrub resources in the Eastern Pamirs and to put into perspective previous reports about their extensive depletion.
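
    MSAVI2 has a closed form, so the predictor construction is easy to sketch; the plot data below are random stand-ins for the 359 field plots.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      def msavi2(nir, red):
          # Modified Soil-Adjusted Vegetation Index 2, suited to sparse cover.
          return (2 * nir + 1 - np.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

      # Hypothetical stand-ins for the field plots: RapidEye-like bands plus
      # an elevation predictor, regressed against observed total cover.
      rng = np.random.default_rng(3)
      nir, red = rng.random(359), rng.random(359) * 0.5
      elevation = rng.uniform(3500, 4500, 359)
      X = np.column_stack([msavi2(nir, red), elevation])
      cover = rng.uniform(0.5, 51.0, 359)
      model = RandomForestRegressor(n_estimators=500, oob_score=True).fit(X, cover)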

  12. Tropical land use land cover mapping in Pará (Brazil) using discriminative Markov random fields and multi-temporal TerraSAR-X data

    Science.gov (United States)

    Hagensieker, Ron; Roscher, Ribana; Rosentreter, Johannes; Jakimow, Benjamin; Waske, Björn

    2017-12-01

    Remote sensing satellite data offer the unique possibility to map land use land cover transformations by providing spatially explicit information. However, detection of short-term processes and land use patterns of high spatial-temporal variability is a challenging task. We present a novel framework using multi-temporal TerraSAR-X data and machine learning techniques, namely discriminative Markov random fields with spatio-temporal priors, and import vector machines, in order to advance the mapping of land cover characterized by short-term changes. Our study region covers a current deforestation frontier in the Brazilian state Pará with land cover dominated by primary forests, different types of pasture land and secondary vegetation, and land use dominated by short-term processes such as slash-and-burn activities. The data set comprises multi-temporal TerraSAR-X imagery acquired over the course of the 2014 dry season, as well as optical data (RapidEye, Landsat) for reference. Results show that land use land cover is reliably mapped, resulting in spatially adjusted overall accuracies of up to 79% in a five class setting, yet limitations for the differentiation of different pasture types remain. The proposed method is applicable on multi-temporal data sets, and constitutes a feasible approach to map land use land cover in regions that are affected by high-frequent temporal changes.

  13. Algorithms for random generation and counting a Markov chain approach

    CERN Document Server

    Sinclair, Alistair

    1993-01-01

    This monograph studies two classical computational problems: counting the elements of a finite set of combinatorial structures, and generating them at random from some probability distribution. Apart from their intrinsic interest, these problems arise naturally in many branches of mathematics and the natural sciences.

  14. Effects of prenatal X-irradiation on open-field behavior in rats: application of randomized fostering technique and mapping results

    International Nuclear Information System (INIS)

    Tachibana, T.

    1986-01-01

    Sprague-Dawley (SD) rats were given X-irradiation (150 R) on Day 17 of gestation. After birth, all male pups were pooled once and then assigned randomly to irradiated mothers and control mothers. Offspring were administered an open-field test at about 7 weeks of age. The analysis was performed on the basis of two approaches: in the per-subject approach, individual subject data (aggregated across Day 2 through Day 4) were treated as the basic unit of statistical analysis; in the per-litter approach, double aggregation (across Day 2 through Day 4 for each subject and across subjects within each litter) was used. The per-subject approach was slightly more sensitive to the treatment effect, but it induced a reduction in the magnitude of eta squared. A principal component analysis was performed using the eta squared values together with those of several reference groups. Results were plotted on a map constructed from component scores. The characteristics of behavior in X-irradiated rats were very similar to those of the earlier stage of trials in terms of their location on the map. The postnatal maternal effect on open-field behavior was not serious and was adequately negligible in practice. A new fostering procedure was proposed and its advantages discussed.

  15. Streamlined approach to mapping the magnetic induction of skyrmionic materials

    International Nuclear Information System (INIS)

    Chess, Jordan J.; Montoya, Sergio A.; Harvey, Tyler R.; Ophus, Colin; Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E.; McMorran, Benjamin J.

    2017-01-01

    Highlights: • A method to reconstruct the phase of electrons after passing through a sample, requiring only a single defocused image, is presented. • Restrictions on when it is appropriate to apply this method are described. • The relative error associated with this method is compared to that of conventional transport of intensity equation analysis. - Abstract: Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.

  16. Streamlined approach to mapping the magnetic induction of skyrmionic materials

    Energy Technology Data Exchange (ETDEWEB)

    Chess, Jordan J., E-mail: jchess@uoregon.edu [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Montoya, Sergio A. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); Harvey, Tyler R. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States); Ophus, Colin [National Center for Electron Microscopy, Molecular Foundry, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E. [Center for Memory and Recording Research, University of California, San Diego, CA 92093 (United States); Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093 (United States); McMorran, Benjamin J. [Department of Physics, University of Oregon, Eugene, OR 97403 (United States)

    2017-06-15

    Highlights: • A method to reconstruct the phase of electrons after passing through a sample, requiring only a single defocused image, is presented. • Restrictions on when it is appropriate to apply this method are described. • The relative error associated with this method is compared to that of conventional transport of intensity equation analysis. - Abstract: Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach.
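
    A minimal numerical sketch of the simplified single-image TIE inversion under the uniform-intensity assumption follows; the parameter names are assumptions, and in LTEM the in-plane magnetic induction would then follow from the gradient of the recovered phase.

      import numpy as np

      def tie_phase_single_image(defocused, i0, dz, wavelength, px):
          # Simplified single-image TIE for a uniform thin film: treat the
          # in-focus intensity as the constant i0 and invert the Laplacian
          # in Fourier space (a sketch, not the authors' implementation).
          k = 2 * np.pi / wavelength
          laplacian_phi = -k * (defocused / i0 - 1.0) / dz
          fy = np.fft.fftfreq(defocused.shape[0], px) * 2 * np.pi
          fx = np.fft.fftfreq(defocused.shape[1], px) * 2 * np.pi
          qy, qx = np.meshgrid(fy, fx, indexing="ij")
          q2 = qx ** 2 + qy ** 2
          q2[0, 0] = 1.0                 # avoid dividing by zero at DC
          phi_hat = np.fft.fft2(laplacian_phi) / -q2
          phi_hat[0, 0] = 0.0            # constant phase offset is arbitrary
          return np.real(np.fft.ifft2(phi_hat))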

  17. Random Valued Impulse Noise Removal Using Region Based Detection Approach

    Directory of Open Access Journals (Sweden)

    S. Banerjee

    2017-12-01

    Full Text Available Removal of random-valued noisy pixels is extremely challenging when the noise density is above 50%. Existing filters are generally incapable of eliminating such noise when the density is above 70%. In this paper, a region-wise, density-based detection algorithm for random-valued impulse noise is proposed. On the basis of their intensity values, the pixels of a particular window are sorted and then stored into four regions. The highest-density region is considered for stepwise detection of noisy pixels. With this detection scheme a maximum of 75% of noisy pixels can be detected. For this purpose, this paper proposes a unique noise removal algorithm. Experiments show that the proposed algorithm not only performs exceptionally well in visual, qualitative judgments of standard images, but the filter combination also outperforms existing algorithms in terms of MSE, PSNR and SSIM, even up to a 70% noise density level.
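
    The abstract gives only the outline of the detector, so the sketch below is one plausible reading of it: sort each window into four intensity regions, treat the most tightly packed region as the trustworthy reference, and flag centre pixels that fall far from it. Window size and threshold are assumptions, not the paper's values.

```python
import numpy as np

def detect_impulse_pixels(image, win=5, dev_thresh=20):
    """Illustrative region-based detector for random-valued impulse
    noise. win and dev_thresh are assumed parameters."""
    pad = win // 2
    padded = np.pad(image.astype(float), pad, mode='reflect')
    noisy = np.zeros(image.shape, dtype=bool)
    for y in range(image.shape[0]):
        for x in range(image.shape[1]):
            block = np.sort(padded[y:y + win, x:x + win].ravel())
            regions = np.array_split(block, 4)          # four sorted regions
            spreads = [r[-1] - r[0] for r in regions]
            dense = regions[int(np.argmin(spreads))]    # most tightly packed region
            centre = padded[y + pad, x + pad]
            noisy[y, x] = abs(centre - np.median(dense)) > dev_thresh
    return noisy
```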

  18. LANDSLIDE INVENTORY MAPPING FROM BITEMPORAL 10 m SENTINEL-2 IMAGES USING CHANGE DETECTION BASED MARKOV RANDOM FIELD

    Directory of Open Access Journals (Sweden)

    Y. Qin

    2018-04-01

    Full Text Available Landslide inventory mapping is essential for hazard assessment and mitigation. In most previous studies, landslide mapping was achieved by visual interpretation of aerial photos and remote sensing images. However, such methods are labor-intensive and time-consuming, especially over large areas. Although a number of semi-automatic landslide mapping methods have been proposed over the past few years, limitations remain in terms of their applicability over different study areas and data, and there is large room for improvement in accuracy and degree of automation. For these reasons, we developed a change detection-based Markov Random Field (CDMRF) method for landslide inventory mapping. The proposed method mainly includes two steps: (1) change detection-based multi-thresholding for training sample generation and (2) MRF for landslide inventory mapping. Compared with previous methods, the proposed method has three advantages: (1) it combines multiple image differencing techniques with a multi-threshold method to generate reliable training samples; (2) it takes the spectral characteristics of landslides into account; and (3) it is highly automatic, with little parameter tuning. The proposed method was applied to regional landslide mapping from 10 m Sentinel-2 images in Western China. Results corroborated its effectiveness and applicability, especially its capability for rapid landslide mapping. Some directions for future research are offered. To our knowledge, this study is the first attempt to map landslides from free, medium-resolution satellite (i.e., Sentinel-2) images in China.
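
    Step (1) of the workflow, generating training samples by image differencing with multiple thresholds, can be sketched as below. The k values defining the confident and unlabelled bands are illustrative, and the paper combines several difference images rather than the single one used here.

```python
import numpy as np

def cd_training_samples(pre, post, k_low=1.0, k_high=2.5):
    """Sketch of change-detection-based training sample generation:
    pixels far above the mean absolute difference become landslide
    samples, pixels well below it become background, and the band
    between is left unlabelled for the MRF to resolve."""
    diff = np.abs(post.astype(float) - pre.astype(float))
    mu, sd = diff.mean(), diff.std()
    labels = np.full(diff.shape, -1, dtype=int)   # -1 = unlabelled
    labels[diff > mu + k_high * sd] = 1           # confident landslide
    labels[diff < mu + k_low * sd] = 0            # confident background
    return labels
```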

  19. An Odometry-free Approach for Simultaneous Localization and Online Hybrid Map Building

    Directory of Open Access Journals (Sweden)

    Wei Hong Chin

    2016-11-01

    Full Text Available In this paper, a new approach is proposed for simultaneous mobile robot localization and hybrid map building without using any odometry hardware. The proposed method, termed Genetic Bayesian ARAM, comprises two main components: (1) a steady-state genetic algorithm (SSGA) for self-localization and occupancy grid map building, and (2) Bayesian Adaptive Resonance Associative Memory (ARAM) for online topological map building. The model of the explored environment is formed as a hybrid representation, both topological and grid-based, and is incrementally constructed during the exploration process. During occupancy map building, the robot's estimated self-position is updated by the SSGA. At the same time, the estimated self-position is transmitted to the Bayesian ARAM for topological map building and localization. The effectiveness of the proposed approach is validated on a number of standardized benchmark datasets and by real experiments carried out on a mobile robot. The benchmark datasets are used to verify that the proposed method is capable of generating topological maps under different environmental conditions; the real-robot experiment verifies that the method can be implemented in the real world.

  20. Force scanning: a rapid, high-resolution approach for spatial mechanical property mapping

    International Nuclear Information System (INIS)

    Darling, E M

    2011-01-01

    Atomic force microscopy (AFM) can be used to co-localize mechanical properties and topographical features through property mapping techniques. The most common approach for testing biological materials at the microscale and nanoscale is force mapping, which involves taking individual force curves at discrete sites across a region of interest. The limitations of force mapping include long testing times and low resolution. While newer AFM methodologies, like modulated scanning and torsional oscillation, circumvent this problem, their adoption for biological materials has been limited. This could be due to their need for specialized software algorithms and/or hardware. The objective of this study is to develop a novel force scanning technique using AFM to rapidly capture high-resolution topographical images of soft biological materials while simultaneously quantifying their mechanical properties. Force scanning is a straightforward methodology applicable to a wide range of materials and testing environments, requiring no special modification to standard AFMs. Essentially, if a contact-mode image can be acquired, then force scanning can be used to produce a spatial modulus map. The current study first validates this technique using agarose gels, comparing results to ones achieved by the standard force mapping approach. Biologically relevant demonstrations are then presented for high-resolution modulus mapping of individual cells, cell-cell interfaces, and articular cartilage tissue.

  1. A highly efficient approach to protein interactome mapping based on collaborative filtering framework.

    Science.gov (United States)

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-09

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient to identify numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized-recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, where the feature-vectors of involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins, which drives the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly.
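
    A minimal sketch of the neighbourhood-based CF idea, using plain cosine similarity in place of the paper's rescaled cosine coefficient: each protein's row in the interactome weight matrix W serves as its feature vector, and candidate interactions are scored by similarity-weighted votes of neighbours.

```python
import numpy as np

def predict_interactions(W):
    """Neighbourhood-based CF sketch for binary interactome mapping.
    W is a protein-by-protein weight matrix whose rows serve as
    feature vectors; plain cosine similarity stands in for the
    paper's rescaled cosine coefficient."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    norms[norms == 0] = 1.0
    unit = W / norms
    sim = unit @ unit.T                         # pairwise cosine similarity
    np.fill_diagonal(sim, 0.0)
    # CF step: score each candidate pair by similarity-weighted votes
    denom = np.abs(sim).sum(axis=1, keepdims=True) + 1e-12
    return (sim @ W) / denom
```

    Thresholding the returned score matrix then yields the binary interactome map.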

  2. A multi-temporal analysis approach for land cover mapping in support of nuclear incident response

    Science.gov (United States)

    Sah, Shagan; van Aardt, Jan A. N.; McKeown, Donald M.; Messinger, David W.

    2012-06-01

    Remote sensing can be used to rapidly generate land use maps for assisting emergency response personnel with resource deployment decisions and impact assessments. In this study we focus on constructing accurate land cover maps to map the impacted area in the case of a nuclear material release. The proposed methodology involves integration of results from two different approaches to increase classification accuracy. The data used included RapidEye scenes over Nine Mile Point Nuclear Power Station (Oswego, NY). The first step was building a coarse-scale land cover map from freely available, high temporal resolution, MODIS data using a time-series approach. In the case of a nuclear accident, high spatial resolution commercial satellites such as RapidEye or IKONOS can acquire images of the affected area. Land use maps from the two image sources were integrated using a probability-based approach. Classification results were obtained for four land classes - forest, urban, water and vegetation - using Euclidean and Mahalanobis distances as metrics. Despite the coarse resolution of MODIS pixels, acceptable accuracies were obtained using time series features. The overall accuracies using the fusion-based approach were in the neighborhood of 80%, when compared with GIS data sets from New York State. The classifications were augmented using this fused approach, with a few supplementary advantages such as correction for cloud cover and independence from time of year. We concluded that this method would generate highly accurate land cover maps, using coarse spatial resolution time series satellite imagery and a single date, high spatial resolution, multi-spectral image.
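
    The per-pixel assignment step can be illustrated as follows. This is a generic minimum-Mahalanobis-distance classifier over the four classes, not the study's full fusion pipeline; class_stats would hold the (mean, covariance) of each class's training samples.

```python
import numpy as np

def mahalanobis_classify(pixels, class_stats):
    """Assign each pixel (a row of spectral/temporal features) to the
    land-cover class with the smallest Mahalanobis distance.
    class_stats: {class_name: (mean_vector, covariance_matrix)},
    estimated from training samples of each class."""
    labels = list(class_stats)
    dists = []
    for name in labels:
        mean, cov = class_stats[name]
        icov = np.linalg.inv(cov)
        d = pixels - mean
        dists.append(np.einsum('ij,jk,ik->i', d, icov, d))
    return np.array(labels)[np.argmin(np.vstack(dists), axis=0)]
```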

  3. An Approach of Dynamic Object Removing for Indoor Mapping Based on UGV SLAM

    Directory of Open Access Journals (Sweden)

    Jian Tang

    2015-07-01

    Full Text Available The study of indoor mapping for Location Based Services (LBS) has become more and more popular in recent years. LiDAR-SLAM-based mapping seems to be a promising indoor mapping solution. However, dynamic objects such as pedestrians and indoor vehicles exist in the raw LiDAR range data and have to be removed for mapping purposes. In this paper, a new dynamic-object removal approach called Likelihood Grid Voting (LGV) is presented. It is a model-free method that takes full advantage of the high scanning rate of a LiDAR moving at relatively low speed in an indoor environment. In this method, a counting grid records how often each map position is occupied by laser scans; positions with low counter values are recognized as dynamic objects, and the corresponding point clouds are removed from the map. This work is part of the algorithms in our self-developed Unmanned Ground Vehicle (UGV) Simultaneous Localization and Mapping (SLAM) system, NAVIS. Field tests were carried out in an indoor parking area with NAVIS to evaluate the effectiveness of the proposed method. The results show that small objects like pedestrians can be detected and removed quickly, while large objects like cars can be detected and partly removed.
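
    A hedged sketch of the counting-grid idea (parameter values are assumptions): every scan votes once for each grid cell it occupies, and points whose cells collected too few votes over the whole run are treated as dynamic and dropped.

```python
import numpy as np

def filter_dynamic_points(scans, resolution=0.1, min_hits=30):
    """LGV-style sketch. `scans` is a list of (N, 2) point arrays
    already registered in the map frame; resolution is the grid cell
    size in metres and min_hits the vote threshold (both assumed)."""
    cell_ids = [np.floor(pts / resolution).astype(int) for pts in scans]
    counts = {}
    for ids in cell_ids:
        for key in {tuple(k) for k in ids}:      # one vote per scan per cell
            counts[key] = counts.get(key, 0) + 1
    static_scans = []
    for pts, ids in zip(scans, cell_ids):
        keep = np.array([counts[tuple(k)] >= min_hits for k in ids], dtype=bool)
        static_scans.append(pts[keep])
    return static_scans
```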

  4. An Effective NoSQL-Based Vector Map Tile Management Approach

    Directory of Open Access Journals (Sweden)

    Lin Wan

    2016-11-01

    Full Text Available Within a digital map service environment, the rapid growth of Spatial Big-Data is driving new requirements for effective mechanisms for massive online vector map tile processing. The emergence of Not Only SQL (NoSQL) databases has resulted in a new data storage and management model for scalable spatial data deployments and fast tracking. They better suit the scenario of high-volume, low-latency network map services than traditional standalone high-performance computer (HPC) or relational databases. In this paper, we propose a flexible storage framework that provides feasible methods for tiled map data parallel clipping and retrieval operations within a distributed NoSQL database environment. We illustrate the parallel vector tile generation and querying algorithms with the MapReduce programming model. Three different processing approaches, including local caching, distributed file storage, and the NoSQL-based method, are compared by analyzing the concurrent load and calculation time. An online geological vector tile map service prototype was developed to embed our processing framework in the China Geological Survey Information Grid. Experimental results show that our NoSQL-based parallel tile management framework can support applications that process huge volumes of vector tile data and improve performance of the tiled map service.

  5. A regularized, model-based approach to phase-based conductivity mapping using MRI.

    Science.gov (United States)

    Ropella, Kathleen M; Noll, Douglas C

    2017-11-01

    To develop a novel regularized, model-based approach to phase-based conductivity mapping that uses structural information to improve the accuracy of conductivity maps. The inverse of the three-dimensional Laplacian operator is used to model the relationship between measured phase maps and the object conductivity in a penalized weighted least-squares optimization problem. Spatial masks based on structural information are incorporated into the problem to preserve data near boundaries. The proposed Inverse Laplacian method was compared against a restricted Gaussian filter in simulation, phantom, and human experiments. The Inverse Laplacian method resulted in lower reconstruction bias and error due to noise in simulations than the Gaussian filter. The Inverse Laplacian method also produced conductivity maps closer to the measured values in a phantom and with reduced noise in the human brain, as compared to the Gaussian filter. The Inverse Laplacian method calculates conductivity maps with less noise and more accurate values near boundaries. Improving the accuracy of conductivity maps is integral for advancing the applications of conductivity mapping. Magn Reson Med 78:2011-2021, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
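
    A minimal sketch of the inverse problem's structure, under the standard phase-based EPT relation sigma ≈ laplacian(phi)/(mu0*omega): rather than differentiating the noisy phase, fit sigma so that mu0*omega*L^(-1)*sigma matches the measured phase in a penalized weighted least-squares sense, with the weights carrying the structural masks. The paper's exact operator, weighting, and penalty differ; this only shows the shape of the problem.

```python
import numpy as np
from scipy.sparse import diags, identity, kron
from scipy.sparse.linalg import LinearOperator, lsqr, splu

def _lap1d(n):
    return diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n), format='csc')

def fit_conductivity(phase, weights, omega, lam=1e-2):
    """Sketch: minimize ||W^(1/2)(mu0*omega*L^(-1)sigma - phase)||^2
    + lam*||sigma||^2, with L a 2-D finite-difference Laplacian."""
    mu0 = 4e-7 * np.pi
    ny, nx = phase.shape
    n = ny * nx
    L = (kron(identity(nx, format='csc'), _lap1d(ny)) +
         kron(_lap1d(nx), identity(ny, format='csc'))).tocsc()
    inv_lap = splu(L).solve                   # applies L^(-1) to a vector
    w = np.sqrt(weights).ravel()              # structural-mask weights

    A = LinearOperator(
        (n, n),
        matvec=lambda s: w * (mu0 * omega * inv_lap(s)),
        rmatvec=lambda r: mu0 * omega * inv_lap(w * r),  # L is symmetric
    )
    sigma = lsqr(A, w * phase.ravel(), damp=np.sqrt(lam))[0]
    return sigma.reshape(ny, nx)
```

    The damped LSQR call realizes the penalized least-squares fit without ever forming the dense inverse Laplacian.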

  6. Crude oil and its distillation: an experimental approach in High School using conceptual maps

    Directory of Open Access Journals (Sweden)

    Dionísio Borsato

    2006-02-01

    Full Text Available Conceptual maps are representations of ideas organized in the form of bidimensional diagrams. In the present work the theme of oil fractional distillation was explored, and the conceptual maps were elaborated both before and after the activities by 43 students from the 1st and 3rd High School grades of a public school in Londrina – PR. The study was conducted theoretically and in practice, with a daily life approach. The use of the motivational theme and the opening text as advance organizers enabled the establishment of a cognitive link between the students' previous knowledge and the new concepts. Differences between the maps were verified before and after the activities as well as among the work groups. The students, stimulated by the technique, created better structured maps.

  7. Lyapunov exponent of the random frequency oscillator: cumulant expansion approach

    International Nuclear Information System (INIS)

    Anteneodo, C; Vallejos, R O

    2010-01-01

    We consider a one-dimensional harmonic oscillator with a random frequency, focusing on both the standard and the generalized Lyapunov exponents, λ and λ*, respectively. We discuss the numerical difficulties that arise in the numerical calculation of λ* in the case of strong intermittency. When the frequency corresponds to an Ornstein-Uhlenbeck process, we compute λ* analytically by using a cumulant expansion including up to the fourth order. Connections with the problem of finding an analytical estimate for the largest Lyapunov exponent of a many-body system with smooth interactions are discussed.
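
    For orientation, the two exponents compared in the abstract are conventionally defined as below. This is a hedged reconstruction from the standard definitions for a tangent vector z(t), not a quotation of the paper's equations.

```latex
% Standard and generalized Lyapunov exponents of a tangent vector z(t)
\lambda     = \lim_{t\to\infty} \frac{1}{t}\,\bigl\langle \ln \lVert z(t)\rVert \bigr\rangle,
\qquad
\lambda^{*} = \lim_{t\to\infty} \frac{1}{2t}\,\ln \bigl\langle \lVert z(t)\rVert^{2} \bigr\rangle .

% Cumulant expansion: with \Lambda(t) = \ln\lVert z(t)\rVert and cumulants
% \kappa_n(t) of \Lambda(t), truncating at n = 4 gives a fourth-order estimate.
\ln \bigl\langle e^{2\Lambda(t)} \bigr\rangle
  = \sum_{n\ge 1} \frac{2^{n}}{n!}\,\kappa_{n}(t)
\quad\Longrightarrow\quad
\lambda^{*} \simeq \lim_{t\to\infty} \frac{1}{2t} \sum_{n=1}^{4} \frac{2^{n}}{n!}\,\kappa_{n}(t).
```

    The first cumulant alone recovers λ, so the gap λ* − λ measures the intermittency that makes the numerics difficult.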

  8. Matrix product approach for the asymmetric random average process

    International Nuclear Information System (INIS)

    Zielen, F; Schadschneider, A

    2003-01-01

    We consider the asymmetric random average process which is a one-dimensional stochastic lattice model with nearest-neighbour interaction but continuous and unbounded state variables. First, the explicit functional representations, so-called beta densities, of all local interactions leading to steady states of product measure form are rigorously derived. This also completes an outstanding proof given in a previous publication. Then we present an alternative solution for the processes with factorized stationary states by using a matrix product ansatz. Due to continuous state variables we obtain a matrix algebra in the form of a functional equation which can be solved exactly

  9. Evaluation of random temperature fluctuation problems with frequency response approach

    International Nuclear Information System (INIS)

    Lejeail, Yves; Kasahara, Naoto

    2000-01-01

    Since thermal striping is a coupled thermohydraulic and thermomechanical phenomenon, sodium mock-up tests were usually required to confirm structural integrity. The authors have developed the frequency response function to establish a design-by-analysis methodology for this phenomenon. The applicability of this method to sinusoidal fluctuations was validated through two benchmark problems with the FAENA and TIFFSS facilities under an EJCC contract. This report describes the extension of the frequency response method to random fluctuations. As an example of application, the fatigue strength of a Tee junction of the PHENIX secondary piping system was investigated. (author)

  10. Physical Mapping of Bread Wheat Chromosome 5A: An Integrated Approach

    Directory of Open Access Journals (Sweden)

    Delfina Barabaschi

    2015-11-01

    Full Text Available The huge size, redundancy, and highly repetitive nature of the bread wheat [Triticum aestivum (L.)] genome make it among the most difficult species to be sequenced. To overcome these limitations, a strategy based on the separation of individual chromosomes or chromosome arms and the subsequent production of physical maps was established within the frame of the International Wheat Genome Sequencing Consortium (IWGSC). A total of 95,812 bacterial artificial chromosome (BAC) clones of short-arm chromosome 5A (5AS) and long-arm chromosome 5A (5AL) arm-specific BAC libraries were fingerprinted and assembled into contigs by complementary analytical approaches based on the FingerPrinted Contig (FPC) and Linear Topological Contig (LTC) tools. Combined anchoring approaches based on polymerase chain reaction (PCR) marker screening, microarray, and sequence homology searches applied to several genomic tools (i.e., genetic maps, deletion bin map, neighbor maps, BAC end sequences (BESs), genome zipper, and chromosome survey sequences) allowed the development of a high-quality physical map with an anchored physical coverage of 75% for 5AS and 53% for 5AL, with high portions (64 and 48%, respectively) of contigs ordered along the chromosome. In the genomes of the grasses Brachypodium [Brachypodium distachyon (L.) Beauv.], rice (Oryza sativa L.), and sorghum [Sorghum bicolor (L.) Moench], homologs of genes on wheat chromosome 5A were separated into syntenic blocks on different chromosomes as a result of translocations and inversions during evolution. The physical map presented represents an essential resource for fine genetic mapping and map-based cloning of agronomically relevant traits and a reference for the 5A sequencing projects.

  11. Deciphering the genomic architecture of the stickleback brain with a novel multilocus gene-mapping approach.

    Science.gov (United States)

    Li, Zitong; Guo, Baocheng; Yang, Jing; Herczeg, Gábor; Gonda, Abigél; Balázs, Gergely; Shikano, Takahito; Calboli, Federico C F; Merilä, Juha

    2017-03-01

    Quantitative traits important to organismal function and fitness, such as brain size, are presumably controlled by many small-effect loci. Deciphering the genetic architecture of such traits with traditional quantitative trait locus (QTL) mapping methods is challenging. Here, we investigated the genetic architecture of brain size (and the size of five different brain parts) in nine-spined sticklebacks (Pungitius pungitius) with the aid of novel multilocus QTL-mapping approaches based on a de-biased LASSO method. Apart from having more statistical power to detect QTL and reduced rate of false positives than conventional QTL-mapping approaches, the developed methods can handle large marker panels and provide estimates of genomic heritability. Single-locus analyses of an F2 interpopulation cross with 239 individuals and 15,198 fully informative single nucleotide polymorphisms (SNPs) uncovered 79 QTL associated with variation in stickleback brain size traits. Many of these loci were in strong linkage disequilibrium (LD) with each other, and consequently, a multilocus mapping of individual SNPs, accounting for LD structure in the data, recovered only four significant QTL. However, a multilocus mapping of SNPs grouped by linkage group (LG) identified 14 LGs (1-6 depending on the trait) that influence variation in brain traits. For instance, 17.6% of the variation in relative brain size was explainable by cumulative effects of SNPs distributed over six LGs, whereas 42% of the variation was accounted for by all 21 LGs. Hence, the results suggest that variation in stickleback brain traits is influenced by many small-effect loci. Apart from suggesting moderately heritable (h² ≈ 0.15-0.42) multifactorial genetic architecture of brain traits, the results highlight the challenges in identifying the loci contributing to variation in quantitative traits. Nevertheless, the results demonstrate that the novel QTL-mapping approach developed here has distinctive advantages
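
    The multilocus idea can be sketched with an ordinary cross-validated LASSO (the paper's de-biased LASSO and its significance machinery are not reproduced here): fit all SNPs jointly, then pool the selected effects by linkage group to approximate each group's share of phenotypic variance.

```python
import numpy as np
from sklearn.linear_model import LassoCV

def multilocus_scan(genotypes, phenotype, linkage_groups):
    """Multilocus QTL sketch. genotypes: (samples, snps) matrix;
    linkage_groups: per-SNP linkage-group labels. Returns a rough
    per-LG share of phenotypic variance from the selected SNPs."""
    beta = LassoCV(cv=5).fit(genotypes, phenotype).coef_
    shares = {}
    for lg in np.unique(linkage_groups):
        cols = np.where((linkage_groups == lg) & (beta != 0))[0]
        if cols.size:
            partial = genotypes[:, cols] @ beta[cols]
            shares[lg] = np.var(partial) / np.var(phenotype)
    return shares
```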

  12. ANALYSIS OF FUZZY QUEUES: PARAMETRIC PROGRAMMING APPROACH BASED ON RANDOMNESS - FUZZINESS CONSISTENCY PRINCIPLE

    OpenAIRE

    Dhruba Das; Hemanta K. Baruah

    2015-01-01

    In this article, based on Zadeh's extension principle, we apply the parametric programming approach to construct the membership functions of the performance measures when the interarrival time and the service time are fuzzy numbers, based on Baruah's Randomness-Fuzziness Consistency Principle. The Randomness-Fuzziness Consistency Principle leads to defining a normal law of fuzziness using two different laws of randomness. In this article, two fuzzy queues FM...

  13. Mapping SOC (Soil Organic Carbon) using LiDAR-derived vegetation indices in a random forest regression model

    Science.gov (United States)

    Will, R. M.; Glenn, N. F.; Benner, S. G.; Pierce, J. L.; Spaete, L.; Li, A.

    2015-12-01

    Quantifying SOC (Soil Organic Carbon) storage in complex terrain is challenging due to high spatial variability. Generally, the challenge is met by transforming point data to the entire landscape using surrogate, spatially distributed variables like elevation or precipitation. In many ecosystems, remotely sensed information on above-ground vegetation (e.g. NDVI) is a good predictor of below-ground carbon stocks. In this project, we are attempting to improve this predictive method by incorporating LiDAR-derived vegetation indices. LiDAR provides a mechanism for improved characterization of aboveground vegetation by providing structural parameters such as vegetation height and biomass. In this study, a random forest model is used to predict SOC using a suite of LiDAR-derived vegetation indices as predictor variables. The Reynolds Creek Experimental Watershed (RCEW) is an ideal location for a study of this type since it encompasses a strong elevation/precipitation gradient that supports lower biomass sagebrush ecosystems at low elevations and forests with more biomass at higher elevations. Sagebrush ecosystems composed of Wyoming, Low and Mountain Sagebrush have SOC values ranging from 0.4 to 1% (top 30 cm), while higher biomass ecosystems composed of aspen, juniper and fir have SOC values approaching 4% (top 30 cm). Large differences in SOC have been observed between canopy and interspace locations and high resolution vegetation information is likely to explain plot scale variability in SOC. Mapping of the SOC reservoir will help identify underlying controls on SOC distribution and provide insight into which processes are most important in determining SOC in semi-arid mountainous regions. In addition, airborne LiDAR has the potential to characterize vegetation communities at a high resolution and could be a tool for improving estimates of SOC at larger scales.
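
    A skeleton of the regression set-up, with random stand-ins for the plot data and illustrative feature names (the study's exact predictor set and samples are not reproduced):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Illustrative LiDAR-derived predictors; names are assumptions.
features = ["veg_height", "canopy_cover", "biomass_proxy", "elevation"]
rng = np.random.default_rng(0)
X_train = rng.random((200, len(features)))   # stand-ins for plot measurements
y_train = rng.random(200) * 4.0              # SOC (%) in the top 30 cm

rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X_train, y_train)
print("out-of-bag R^2:", rf.oob_score_)
print("importances:", dict(zip(features, np.round(rf.feature_importances_, 3))))
# rf.predict(...) would then be applied to every landscape cell's indices.
```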

  14. Data mining approach to bipolar cognitive map development and decision analysis

    Science.gov (United States)

    Zhang, Wen-Ran

    2002-03-01

    A data mining approach to cognitive mapping is presented based on bipolar logic, bipolar relations, and bipolar clustering. It is shown that a correlation network derived from a database can be converted to a bipolar cognitive map (or bipolar relation). A transitive, symmetric, and reflexive bipolar relation (equilibrium relation) can be used to identify focal links in decision analysis. It can also be used to cluster a set of events or itemsets into three different clusters: coalition sets, conflict sets, and harmony sets. The coalition sets are positively correlated events or itemsets; each conflict set is a negatively correlated set of two coalition subsets; and a harmony set consists of events that are both negatively and positively correlated. A cognitive map and the clusters can then be used for online decision analysis. This approach combines knowledge discovery with the views of decision makers and provides an effective means for online analytical processing (OLAP) and online analytical mining (OLAM).

  15. A New Approach to High-accuracy Road Orthophoto Mapping Based on Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Ming Yang

    2011-12-01

    Full Text Available Existing orthophoto maps based on satellite and aerial photography are not precise enough for road marking. This paper proposes a new approach to high-accuracy orthophoto mapping. The approach uses an inverse perspective transformation to process the image information and generate orthophoto fragments. An offline interpolation algorithm processes the location information: it combines the dead reckoning and EKF location estimates, and uses the result to transform the fragments into the global coordinate system. Finally, a wavelet transform divides the image into two frequency bands, which are handled separately by a weighted median algorithm. Experimental results show that the map produced with this method has high accuracy.
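
    The inverse perspective step can be illustrated with a standard homography warp. The pixel coordinates below are placeholders that camera calibration would provide in practice.

```python
import cv2
import numpy as np

# Four road-plane correspondences define the homography between the
# forward-looking image and the top-down orthophoto fragment.
src = np.float32([[420, 480], [860, 480], [1180, 720], [100, 720]])   # image
dst = np.float32([[0, 0], [400, 0], [400, 600], [0, 600]])            # ground
H = cv2.getPerspectiveTransform(src, dst)
frame = cv2.imread("road_frame.png")                   # placeholder input frame
fragment = cv2.warpPerspective(frame, H, (400, 600))   # orthophoto fragment
```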

  16. Impact of a concept map teaching approach on nursing students' critical thinking skills.

    Science.gov (United States)

    Kaddoura, Mahmoud; Van-Dyke, Olga; Yang, Qing

    2016-09-01

    Nurses confront complex problems and decisions that require critical thinking in order to identify patient needs and implement best practices. An active strategy for teaching students the skills to think critically is the concept map. This study explores the development of critical thinking among nursing students in a required pathophysiology and pharmacology course during the first year of a Bachelor of Science in Nursing in response to concept mapping as an interventional strategy, using the Health Education Systems, Incorporated critical thinking test. A two-group experimental study with a pretest and posttest design was used. Participants were randomly divided into a control group (n = 42) taught by traditional didactic lecturing alone, and an intervention group (n = 41), taught by traditional didactic lecturing with concept mapping. Students in the concept mapping group performed much better on the Health Education Systems, Incorporated than students in the control group. It is recommended that deans, program directors, and nursing faculties evaluate their curricula to integrate concept map teaching strategies in courses in order to develop critical thinking abilities in their students. © 2016 John Wiley & Sons Australia, Ltd.

  17. Determination of contact maps in proteins: A combination of structural and chemical approaches

    Energy Technology Data Exchange (ETDEWEB)

    Wołek, Karol; Cieplak, Marek, E-mail: mc@ifpan.edu.pl [Institute of Physics, Polish Academy of Science, Al. Lotników 32/46, 02-668 Warsaw (Poland); Gómez-Sicilia, Àngel [Instituto Cajal, Consejo Superior de Investigaciones Cientificas (CSIC), Av. Doctor Arce, 37, 28002 Madrid (Spain); Instituto Madrileño de Estudios Avanzados en Nanociencia (IMDEA-Nanociencia), C/Faraday 9, 28049 Cantoblanco (Madrid) (Spain)

    2015-12-28

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.

  18. A geomorphic approach to 100-year floodplain mapping for the Conterminous United States

    Science.gov (United States)

    Jafarzadegan, Keighobad; Merwade, Venkatesh; Saksena, Siddharth

    2018-06-01

    Floodplain mapping using hydrodynamic models is difficult in data scarce regions. Additionally, using hydrodynamic models to map floodplain over large stream network can be computationally challenging. Some of these limitations of floodplain mapping using hydrodynamic modeling can be overcome by developing computationally efficient statistical methods to identify floodplains in large and ungauged watersheds using publicly available data. This paper proposes a geomorphic model to generate probabilistic 100-year floodplain maps for the Conterminous United States (CONUS). The proposed model first categorizes the watersheds in the CONUS into three classes based on the height of the water surface corresponding to the 100-year flood from the streambed. Next, the probability that any watershed in the CONUS belongs to one of these three classes is computed through supervised classification using watershed characteristics related to topography, hydrography, land use and climate. The result of this classification is then fed into a probabilistic threshold binary classifier (PTBC) to generate the probabilistic 100-year floodplain maps. The supervised classification algorithm is trained by using the 100-year Flood Insurance Rated Maps (FIRM) from the U.S. Federal Emergency Management Agency (FEMA). FEMA FIRMs are also used to validate the performance of the proposed model in areas not included in the training. Additionally, HEC-RAS model generated flood inundation extents are used to validate the model performance at fifteen sites that lack FEMA maps. Validation results show that the probabilistic 100-year floodplain maps, generated by proposed model, match well with both FEMA and HEC-RAS generated maps. On average, the error of predicted flood extents is around 14% across the CONUS. The high accuracy of the validation results shows the reliability of the geomorphic model as an alternative approach for fast and cost effective delineation of 100-year floodplains for the CONUS.
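
    A hedged sketch of the two-step structure: a supervised classifier yields class-membership probabilities for a watershed, and the probabilistic threshold binary classifier mixes one per-class binary flood map. Here the binary maps come from thresholding a height-above-nearest-drainage (HAND) raster, which is one plausible reading of the water-surface-height classes; the classifier choice and thresholds are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def probabilistic_floodplain(feats, train_feats, train_class, hand, taus):
    """(1) Predict P(watershed belongs to class c) from watershed
    characteristics; (2) mix binary maps hand <= taus[c], weighted by
    those probabilities. taus: {class: metres}, assumed values."""
    clf = RandomForestClassifier(n_estimators=300, random_state=0)
    clf.fit(train_feats, train_class)
    probs = clf.predict_proba([feats])[0]               # P(class c | watershed)
    flood_prob = np.zeros_like(hand, dtype=float)
    for p, c in zip(probs, clf.classes_):
        flood_prob += p * (hand <= taus[c])             # probability-weighted mix
    return flood_prob
```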

  19. Mapping quantitative trait loci in a selectively genotyped outbred population using a mixture model approach

    NARCIS (Netherlands)

    Johnson, David L.; Jansen, Ritsert C.; Arendonk, Johan A.M. van

    1999-01-01

    A mixture model approach is employed for the mapping of quantitative trait loci (QTL) for the situation where individuals, in an outbred population, are selectively genotyped. Maximum likelihood estimation of model parameters is obtained from an Expectation-Maximization (EM) algorithm facilitated by

  20. Mapping community vulnerability to poaching: A whole-of-society approach

    CSIR Research Space (South Africa)

    Schmitz, Peter

    2017-01-01

    Full Text Available

  1. Improving Students' Creative Thinking and Achievement through the Implementation of Multiple Intelligence Approach with Mind Mapping

    Science.gov (United States)

    Widiana, I. Wayan; Jampel, I. Nyoman

    2016-01-01

    This classroom action research aimed to improve the students' creative thinking and achievement in learning science. It conducted through the implementation of multiple intelligences with mind mapping approach and describing the students' responses. The subjects of this research were the fifth grade students of SD 8 Tianyar Barat, Kubu, and…

  2. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    Science.gov (United States)

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  3. A novel matrix approach for controlling the invariant densities of chaotic maps

    International Nuclear Information System (INIS)

    Rogers, Alan; Shorten, Robert; Heffernan, Daniel M.

    2008-01-01

    Recent work on positive matrices has resulted in a new matrix method for generating chaotic maps with arbitrary piecewise constant invariant densities, sometimes known as the inverse Frobenius-Perron problem (IFPP). In this paper, we give an extensive introduction to the IFPP, describing existing methods for solving it, and we describe our new matrix approach for solving the IFPP

  4. Quantifying Spatial Variation in Ecosystem Services Demand : A Global Mapping Approach

    NARCIS (Netherlands)

    Wolff, S.; Schulp, C. J E; Kastner, T.; Verburg, P. H.

    2017-01-01

    Understanding the spatial-temporal variability in ecosystem services (ES) demand can help anticipate externalities of land use change. This study presents new operational approaches to quantify and map demand for three non-commodity ES on a global scale: animal pollination, wild medicinal plants and

  5. Constrained Perturbation Regularization Approach for Signal Estimation Using Random Matrix Theory

    KAUST Repository

    Suliman, Mohamed Abdalla Elhag; Ballal, Tarig; Kammoun, Abla; Al-Naffouri, Tareq Y.

    2016-01-01

    random matrix theory are applied to derive the near-optimum regularizer that minimizes the mean-squared error of the estimator. Simulation results demonstrate that the proposed approach outperforms a set of benchmark regularization methods for various

  6. Mapping Habitats and Developing Baselines in Offshore Marine Reserves with Little Prior Knowledge: A Critical Evaluation of a New Approach.

    Directory of Open Access Journals (Sweden)

    Emma Lawrence

    Full Text Available The recently declared Australian Commonwealth Marine Reserve (CMR) Network covers a total of 3.1 million km2 of continental shelf, slope, and abyssal habitat. Managing and conserving the biodiversity values within this network requires knowledge of the physical and biological assets that lie within its boundaries. Unfortunately very little is known about the habitats and biological assemblages of the continental shelf within the network, where diversity is richest and anthropogenic pressures are greatest. Effective management of the CMR estate into the future requires this knowledge gap to be filled efficiently and quantitatively. The challenge is particularly great for the shelf as multibeam echosounder (MBES) mapping, a key tool for identifying and quantifying habitat distribution, is time consuming in shallow depths, so full coverage mapping of the CMR shelf assets is unrealistic in the medium-term. Here we report on the results of a study undertaken in the Flinders Commonwealth Marine Reserve (southeast Australia) designed to test the benefits of two approaches to characterising shelf habitats: (i) MBES mapping of a continuous (~30 km2) area selected on the basis of its potential to include a range of seabed habitats that are potentially representative of the wider area, versus; (ii) a novel approach that uses targeted mapping of a greater number of smaller, but spatially balanced, locations using a Generalized Random Tessellation Stratified sample design. We present the first quantitative estimates of habitat type and sessile biological communities on the shelf of the Flinders reserve, the former based on three MBES analysis techniques. We contrast the quality of information that both survey approaches offer in combination with the three MBES analysis methods. The GRTS approach enables design based estimates of habitat types and sessile communities and also identifies potential biodiversity hotspots in the northwest corner of the reserve's IUCN

  7. Mapping Habitats and Developing Baselines in Offshore Marine Reserves with Little Prior Knowledge: A Critical Evaluation of a New Approach.

    Science.gov (United States)

    Lawrence, Emma; Hayes, Keith R; Lucieer, Vanessa L; Nichol, Scott L; Dambacher, Jeffrey M; Hill, Nicole A; Barrett, Neville; Kool, Johnathan; Siwabessy, Justy

    2015-01-01

    The recently declared Australian Commonwealth Marine Reserve (CMR) Network covers a total of 3.1 million km2 of continental shelf, slope, and abyssal habitat. Managing and conserving the biodiversity values within this network requires knowledge of the physical and biological assets that lie within its boundaries. Unfortunately very little is known about the habitats and biological assemblages of the continental shelf within the network, where diversity is richest and anthropogenic pressures are greatest. Effective management of the CMR estate into the future requires this knowledge gap to be filled efficiently and quantitatively. The challenge is particularly great for the shelf as multibeam echosounder (MBES) mapping, a key tool for identifying and quantifying habitat distribution, is time consuming in shallow depths, so full coverage mapping of the CMR shelf assets is unrealistic in the medium-term. Here we report on the results of a study undertaken in the Flinders Commonwealth Marine Reserve (southeast Australia) designed to test the benefits of two approaches to characterising shelf habitats: (i) MBES mapping of a continuous (~30 km2) area selected on the basis of its potential to include a range of seabed habitats that are potentially representative of the wider area, versus; (ii) a novel approach that uses targeted mapping of a greater number of smaller, but spatially balanced, locations using a Generalized Random Tessellation Stratified sample design. We present the first quantitative estimates of habitat type and sessile biological communities on the shelf of the Flinders reserve, the former based on three MBES analysis techniques. We contrast the quality of information that both survey approaches offer in combination with the three MBES analysis methods. The GRTS approach enables design based estimates of habitat types and sessile communities and also identifies potential biodiversity hotspots in the northwest corner of the reserve's IUCN zone IV, and in

  8. A Markov random field approach for microstructure synthesis

    International Nuclear Information System (INIS)

    Kumar, A; Nguyen, L; DeGraef, M; Sundararaghavan, V

    2016-01-01

    We test the notion that many microstructures have an underlying stationary probability distribution. The stationary probability distribution is ubiquitous: we know that different windows taken from a polycrystalline microstructure are generally ‘statistically similar’. To enable computation of such a probability distribution, microstructures are represented in the form of undirected probabilistic graphs called Markov Random Fields (MRFs). In the model, pixels take up integer or vector states and interact with multiple neighbors over a window. Using this lattice structure, algorithms are developed to sample the conditional probability density for the state of each pixel given the known states of its neighboring pixels. The sampling is performed using reference experimental images. 2D microstructures are artificially synthesized using the sampled probabilities. Statistical features such as grain size distribution and autocorrelation functions closely match with those of the experimental images. The mechanical properties of the synthesized microstructures were computed using the finite element method and were also found to match the experimental values. (paper)
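
    A deliberately minimal, causal variant of the sample-the-conditional idea for a two-phase microstructure (the paper uses larger lattice windows, vector states, and reference-image sampling; this shows only the mechanism):

```python
import numpy as np

def synthesize_binary(ref, out_shape, rng=None):
    """Estimate P(pixel | left and upper neighbours) from a reference
    two-phase image, then draw a new image in raster order from that
    conditional distribution (Laplace-smoothed)."""
    rng = rng or np.random.default_rng(0)
    ref = ref.astype(int)
    counts = np.zeros((2, 2, 2))               # [left, up, value]
    for i in range(1, ref.shape[0]):
        for j in range(1, ref.shape[1]):
            counts[ref[i, j - 1], ref[i - 1, j], ref[i, j]] += 1
    p_one = (counts[..., 1] + 1) / (counts.sum(axis=-1) + 2)
    out = np.zeros(out_shape, dtype=int)
    out[0, :] = rng.integers(0, 2, out_shape[1])   # random seed row/column
    out[:, 0] = rng.integers(0, 2, out_shape[0])
    for i in range(1, out_shape[0]):
        for j in range(1, out_shape[1]):
            out[i, j] = rng.random() < p_one[out[i, j - 1], out[i - 1, j]]
    return out
```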

  9. Single-molecule approach to bacterial genomic comparisons via optical mapping.

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Shiguo [Univ. Wisc.-Madison; Kile, A. [Univ. Wisc.-Madison; Bechner, M. [Univ. Wisc.-Madison; Kvikstad, E. [Univ. Wisc.-Madison; Deng, W. [Univ. Wisc.-Madison; Wei, J. [Univ. Wisc.-Madison; Severin, J. [Univ. Wisc.-Madison; Runnheim, R. [Univ. Wisc.-Madison; Churas, C. [Univ. Wisc.-Madison; Forrest, D. [Univ. Wisc.-Madison; Dimalanta, E. [Univ. Wisc.-Madison; Lamers, C. [Univ. Wisc.-Madison; Burland, V. [Univ. Wisc.-Madison; Blattner, F. R. [Univ. Wisc.-Madison; Schwartz, David C. [Univ. Wisc.-Madison

    2004-01-01

    Modern comparative genomics has been established, in part, by the sequencing and annotation of a broad range of microbial species. To gain further insights, new sequencing efforts are now dealing with the variety of strains or isolates that gives a species definition and range; however, this number vastly outstrips our ability to sequence them. Given the availability of a large number of microbial species, new whole genome approaches must be developed to fully leverage this information at the level of strain diversity that maximize discovery. Here, we describe how optical mapping, a single-molecule system, was used to identify and annotate chromosomal alterations between bacterial strains represented by several species. Since whole-genome optical maps are ordered restriction maps, sequenced strains of Shigella flexneri serotype 2a (2457T and 301), Yersinia pestis (CO 92 and KIM), and Escherichia coli were aligned as maps to identify regions of homology and to further characterize them as possible insertions, deletions, inversions, or translocations. Importantly, an unsequenced Shigella flexneri strain (serotype Y strain AMC[328Y]) was optically mapped and aligned with two sequenced ones to reveal one novel locus implicated in serotype conversion and several other loci containing insertion sequence elements or phage-related gene insertions. Our results suggest that genomic rearrangements and chromosomal breakpoints are readily identified and annotated against a prototypic sequenced strain by using the tools of optical mapping.

  10. Engineering geological mapping in Wallonia (Belgium) : present state and recent computerized approach

    Science.gov (United States)

    Delvoie, S.; Radu, J.-P.; Ruthy, I.; Charlier, R.

    2012-04-01

    An engineering geological map can be defined as a geological map with a generalized representation of all the components of a geological environment which are strongly required for spatial planning, design, construction and maintenance of civil engineering buildings. In Wallonia (Belgium) 24 engineering geological maps have been developed between the 70s and the 90s at 1/5,000 or 1/10,000 scale covering some areas of the most industrialized and urbanized cities (Liège, Charleroi and Mons). They were based on soil and subsoil data point (boring, drilling, penetration test, geophysical test, outcrop…). Some displayed data present the depth (with isoheights) or the thickness (with isopachs) of the different subsoil layers up to about 50 m depth. Information about geomechanical properties of each subsoil layer, useful for engineers and urban planners, is also synthesized. However, these maps were built up only on paper and progressively needed to be updated with new soil and subsoil data. The Public Service of Wallonia and the University of Liège have recently initiated a study to evaluate the feasibility to develop engineering geological mapping with a computerized approach. Numerous and various data (about soil and subsoil) are stored into a georelational database (the geotechnical database - using Access, Microsoft®). All the data are geographically referenced. The database is linked to a GIS project (using ArcGIS, ESRI®). Both the database and GIS project consist of a powerful tool for spatial data management and analysis. This approach involves a methodology using interpolation methods to update the previous maps and to extent the coverage to new areas. The location (x, y, z) of each subsoil layer is then computed from data point. The geomechanical data of these layers are synthesized in an explanatory booklet joined to maps.

  11. Exploring multicollinearity using a random matrix theory approach.

    Science.gov (United States)

    Feher, Kristen; Whelan, James; Müller, Samuel

    2012-01-01

    Clustering of gene expression data is often done with the latent aim of dimension reduction, by finding groups of genes that have a common response to potentially unknown stimuli. However, what is poorly understood to date is the behaviour of a low dimensional signal embedded in high dimensions. This paper introduces a multicollinear model which is based on random matrix theory results, and shows potential for the characterisation of a gene cluster's correlation matrix. This model projects a one dimensional signal into many dimensions and is based on the spiked covariance model, but rather characterises the behaviour of the corresponding correlation matrix. The eigenspectrum of the correlation matrix is empirically examined by simulation, under the addition of noise to the original signal. The simulation results are then used to propose a dimension estimation procedure of clusters from data. Moreover, the simulation results warn against considering pairwise correlations in isolation, as the model provides a mechanism whereby a pair of genes with 'low' correlation may simply be due to the interaction of high dimension and noise. Instead, collective information about all the variables is given by the eigenspectrum.
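
    The core simulation is easy to reproduce in outline: project a one-dimensional signal into p dimensions, add noise, and compare the correlation-matrix eigenvalues with the Marchenko-Pastur upper edge (1 + sqrt(p/n))^2. The sample sizes and signal strength below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, snr = 500, 100, 2.0
signal = rng.standard_normal(n)                 # hidden 1-D driver
loadings = rng.standard_normal(p)
X = snr * np.outer(signal, loadings) + rng.standard_normal((n, p))

corr = np.corrcoef(X, rowvar=False)             # p x p correlation matrix
eig = np.sort(np.linalg.eigvalsh(corr))[::-1]
mp_edge = (1 + np.sqrt(p / n)) ** 2             # Marchenko-Pastur upper edge
print("top eigenvalues:", eig[:3])
print("MP edge:", mp_edge, "-> eigenvalues above edge:", int((eig > mp_edge).sum()))
```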

  12. Approaches in Characterizing Genetic Structure and Mapping in a Rice Multiparental Population.

    Science.gov (United States)

    Raghavan, Chitra; Mauleon, Ramil; Lacorte, Vanica; Jubay, Monalisa; Zaw, Hein; Bonifacio, Justine; Singh, Rakesh Kumar; Huang, B Emma; Leung, Hei

    2017-06-07

    Multi-parent Advanced Generation Intercross (MAGIC) populations are fast becoming mainstream tools for research and breeding, along with the technology and tools for analysis. This paper demonstrates the analysis of a rice MAGIC population from data filtering to imputation and processing of genetic data to characterizing genomic structure, and finally quantitative trait loci (QTL) mapping. In this study, 1316 S6:8 indica MAGIC (MI) lines and the eight founders were sequenced using Genotyping by Sequencing (GBS). As the GBS approach often includes missing data, the first step was to impute the missing SNPs. The observable number of recombinations in the population was then explored. Based on this case study, a general outline of procedures for a MAGIC analysis workflow is provided, as well as for QTL mapping of agronomic traits and biotic and abiotic stress, using the results from both association and interval mapping approaches. QTL for agronomic traits (yield, flowering time, and plant height), physical (grain length and grain width) and cooking properties (amylose content) of the rice grain, abiotic stress (submergence tolerance), and biotic stress (brown spot disease) were mapped. Through presenting this extensive analysis in the MI population in rice, we highlight important considerations when choosing analytical approaches. The methods and results reported in this paper will provide a guide to future genetic analysis methods applied to multi-parent populations. Copyright © 2017 Raghavan et al.

  13. Approaches in Characterizing Genetic Structure and Mapping in a Rice Multiparental Population

    Directory of Open Access Journals (Sweden)

    Chitra Raghavan

    2017-06-01

    Full Text Available Multi-parent Advanced Generation Intercross (MAGIC) populations are fast becoming mainstream tools for research and breeding, along with the technology and tools for analysis. This paper demonstrates the analysis of a rice MAGIC population from data filtering to imputation and processing of genetic data to characterizing genomic structure, and finally quantitative trait loci (QTL) mapping. In this study, 1316 S6:8 indica MAGIC (MI) lines and the eight founders were sequenced using Genotyping by Sequencing (GBS). As the GBS approach often includes missing data, the first step was to impute the missing SNPs. The observable number of recombinations in the population was then explored. Based on this case study, a general outline of procedures for a MAGIC analysis workflow is provided, as well as for QTL mapping of agronomic traits and biotic and abiotic stress, using the results from both association and interval mapping approaches. QTL for agronomic traits (yield, flowering time, and plant height), physical (grain length and grain width) and cooking properties (amylose content) of the rice grain, abiotic stress (submergence tolerance), and biotic stress (brown spot disease) were mapped. Through presenting this extensive analysis in the MI population in rice, we highlight important considerations when choosing analytical approaches. The methods and results reported in this paper will provide a guide to future genetic analysis methods applied to multi-parent populations.

  14. An automated approach for mapping persistent ice and snow cover over high latitude regions

    Science.gov (United States)

    Selkowitz, David J.; Forster, Richard R.

    2016-01-01

    We developed an automated approach for mapping persistent ice and snow cover (glaciers and perennial snowfields) from Landsat TM and ETM+ data across a variety of topography, glacier types, and climatic conditions at high latitudes (above ~65°N). Our approach exploits all available Landsat scenes acquired during the late summer (1 August–15 September) over a multi-year period and employs an automated cloud masking algorithm optimized for snow and ice covered mountainous environments. Pixels from individual Landsat scenes were classified as snow/ice covered or snow/ice free based on the Normalized Difference Snow Index (NDSI), and pixels consistently identified as snow/ice covered over a five-year period were classified as persistent ice and snow cover. The same NDSI and ratio of snow/ice-covered days to total days thresholds applied consistently across eight study regions resulted in persistent ice and snow cover maps that agreed closely in most areas with glacier area mapped for the Randolph Glacier Inventory (RGI), with a mean accuracy (agreement with the RGI) of 0.96, a mean precision (user’s accuracy of the snow/ice cover class) of 0.92, a mean recall (producer’s accuracy of the snow/ice cover class) of 0.86, and a mean F-score (a measure that considers both precision and recall) of 0.88. We also compared results from our approach to glacier area mapped from high spatial resolution imagery at four study regions and found similar results. Accuracy was lowest in regions with substantial areas of debris-covered glacier ice, suggesting that manual editing would still be required in these regions to achieve reasonable results. The similarity of our results to those from the RGI as well as glacier area mapped from high spatial resolution imagery suggests it should be possible to apply this approach across large regions to produce updated 30-m resolution maps of persistent ice and snow cover. In the short term, automated PISC maps can be used to rapidly
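
    A sketch of the persistence rule (threshold values are assumptions, not the study's tuned parameters): classify each late-summer scene by NDSI, then keep pixels flagged in a sufficient fraction of cloud-free observations.

```python
import numpy as np

def persistent_snow_ice(green_stack, swir_stack, ndsi_thresh=0.4,
                        ratio_thresh=0.9):
    """Stacks are (scenes, rows, cols) reflectance arrays with NaN for
    cloud-masked pixels. NDSI = (green - SWIR) / (green + SWIR); a
    pixel is persistent ice/snow if flagged in at least ratio_thresh
    of its valid observations."""
    ndsi = (green_stack - swir_stack) / (green_stack + swir_stack)
    snow = ndsi > ndsi_thresh                   # NaN comparisons yield False
    valid = ~np.isnan(ndsi)
    ratio = snow.sum(axis=0) / np.maximum(valid.sum(axis=0), 1)
    return ratio >= ratio_thresh
```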

  15. A Hybrid Color Mapping Approach to Fusing MODIS and Landsat Images for Forward Prediction

    OpenAIRE

    Chiman Kwan; Bence Budavari; Feng Gao; Xiaolin Zhu

    2018-01-01

    We present a new, simple, and efficient approach to fusing MODIS and Landsat images. It is well known that MODIS images have high temporal resolution and low spatial resolution, whereas Landsat images are just the opposite. Similar to earlier approaches, our goal is to fuse MODIS and Landsat images to yield high spatial and high temporal resolution images. Our approach consists of two steps. First, a mapping is established between two MODIS images, where one is at an earlier time, t1, and the...

  16. Random matrix approach to the dynamics of stock inventory variations

    International Nuclear Information System (INIS)

    Zhou Weixing; Mu Guohua; Kertész, János

    2012-01-01

    It is well accepted that investors can be classified into groups owing to distinct trading strategies, which forms the basic assumption of many agent-based models for financial markets when agents are not zero-intelligent. However, empirical tests of these assumptions are still very rare due to the lack of order flow data. Here we adopt the order flow data of Chinese stocks to tackle this problem by investigating the dynamics of inventory variations for individual and institutional investors that contain rich information about the trading behavior of investors and have a crucial influence on price fluctuations. We find that the distributions of cross-correlation coefficient Cij have power-law forms in the bulk that are followed by exponential tails, and there are more positive coefficients than negative ones. In addition, it is more likely that two individuals or two institutions have a stronger inventory variation correlation than one individual and one institution. We find that the largest and the second largest eigenvalues (λ1 and λ2) of the correlation matrix cannot be explained by random matrix theory and the projections of investors' inventory variations on the first eigenvector u(λ1) are linearly correlated with stock returns, where individual investors play a dominating role. The investors are classified into three categories based on the cross-correlation coefficients CVR between inventory variations and stock returns. A strong Granger causality is unveiled from stock returns to inventory variations, which means that a large proportion of individuals hold the reversing trading strategy and a small part of individuals hold the trending strategy. Our empirical findings have scientific significance in the understanding of investors' trading behavior and in the construction of agent-based models for emerging stock markets. (paper)

  17. Blind Measurement Selection: A Random Matrix Theory Approach

    KAUST Repository

    Elkhalil, Khalil

    2016-12-14

    This paper considers the problem of selecting a set of $k$ measurements from $n$ available sensor observations. The selected measurements should minimize a certain error function assessing the error in estimating a certain $m$ dimensional parameter vector. The exhaustive search inspecting each of the $\binom{n}{k}$ possible choices would require a very high computational complexity and as such is not practical for large $n$ and $k$. Alternative methods with low complexity have recently been investigated but their main drawbacks are that 1) they require perfect knowledge of the measurement matrix and 2) they need to be applied at the pace of change of the measurement matrix. To overcome these issues, we consider the asymptotic regime in which $k$, $n$ and $m$ grow large at the same pace. Tools from random matrix theory are then used to approximate in closed-form the most important error measures that are commonly used. The asymptotic approximations are then leveraged to select properly $k$ measurements exhibiting low values for the asymptotic error measures. Two heuristic algorithms are proposed: the first one merely consists in applying the convex optimization artifice to the asymptotic error measure. The second algorithm is a low-complexity greedy algorithm that attempts to look for a sufficiently good solution for the original minimization problem. The greedy algorithm can be applied to both the exact and the asymptotic error measures and can be thus implemented in blind and channel-aware fashions. We present two potential applications where the proposed algorithms can be used, namely antenna selection for uplink transmissions in large scale multi-user systems and sensor selection for wireless sensor networks. Numerical results are also presented and sustain the efficiency of the proposed blind methods in reaching the performances of channel-aware algorithms.
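
    The channel-aware greedy variant can be sketched as below, using trace((I + A_S^T A_S)^(-1)) as the error proxy and a rank-one update for the gain of each candidate row; the blind variant in the paper replaces this exact measure with its random-matrix approximation.

```python
import numpy as np

def greedy_select(A, k):
    """Greedily pick k rows of A (sensor observations) to minimize the
    estimation-error proxy trace((I + A_S^T A_S)^(-1)). The trace
    decrease from adding row a is (a^T M^(-2) a) / (1 + a^T M^(-1) a),
    computed here from the current inverse."""
    n, m = A.shape
    chosen = []
    M = np.eye(m)                       # current I + A_S^T A_S
    for _ in range(k):
        Minv = np.linalg.inv(M)
        best, best_gain = None, -np.inf
        for i in set(range(n)) - set(chosen):
            a = A[i]
            Ma = Minv @ a
            gain = (Ma @ Ma) / (1.0 + a @ Ma)   # rank-one trace decrease
            if gain > best_gain:
                best, best_gain = i, gain
        chosen.append(best)
        M += np.outer(A[best], A[best])
    return chosen
```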

  18. Random matrix approach to the dynamics of stock inventory variations

    Science.gov (United States)

    Zhou, Wei-Xing; Mu, Guo-Hua; Kertész, János

    2012-09-01

    It is well accepted that investors can be classified into groups according to their distinct trading strategies, an assumption underlying many agent-based models of financial markets in which agents are not zero-intelligent. However, empirical tests of these assumptions are still very rare owing to the scarcity of order flow data. Here we adopt the order flow data of Chinese stocks to tackle this problem by investigating the dynamics of inventory variations of individual and institutional investors, which contain rich information about the trading behavior of investors and have a crucial influence on price fluctuations. We find that the distributions of the cross-correlation coefficients Cij have power-law forms in the bulk followed by exponential tails, and that there are more positive coefficients than negative ones. In addition, two individuals or two institutions are more likely to have a strong inventory-variation correlation than one individual and one institution. We find that the largest and second largest eigenvalues (λ1 and λ2) of the correlation matrix cannot be explained by random matrix theory, and that the projections of investors' inventory variations onto the first eigenvector u(λ1) are linearly correlated with stock returns, with individual investors playing the dominant role. The investors are classified into three categories based on the cross-correlation coefficients CVR between inventory variations and stock returns. A strong Granger causality is unveiled from stock returns to inventory variations, which means that a large proportion of individuals adopt a reversing trading strategy while a small fraction adopt a trending strategy. Our empirical findings are significant for understanding investors' trading behavior and for constructing agent-based models of emerging stock markets.
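
    The random matrix test at the heart of this record reduces to comparing the eigenvalues of an empirical correlation matrix with the Marchenko-Pastur upper edge (1 + sqrt(N/T))^2, which bounds the spectrum of pure-noise correlations. A minimal sketch with synthetic data (all sizes are placeholders, not the paper's sample):

```python
import numpy as np

T, N = 1000, 60                        # observations x investors (synthetic)
X = np.random.randn(N, T)              # stand-in for inventory variation series
X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
C = X @ X.T / T                        # empirical cross-correlation matrix
eigvals = np.linalg.eigvalsh(C)

lam_max = (1 + np.sqrt(N / T)) ** 2    # Marchenko-Pastur upper edge for pure noise
print("three largest eigenvalues:", np.sort(eigvals)[-3:])
print("RMT upper bound:", lam_max)     # eigenvalues above this carry real structure
```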

  19. Clustering of the Self-Organizing Map based Approach in Induction Machine Rotor Faults Diagnostics

    Directory of Open Access Journals (Sweden)

    Ahmed TOUMI

    2009-12-01

    Full Text Available Self-Organizing Maps (SOM) are an excellent method of analyzing multidimensional data. The SOM-based classification is attractive due to its unsupervised learning and topology-preserving properties. In this paper, the performance of self-organizing methods is investigated in induction motor rotor fault detection and severity evaluation. The SOM is based on motor current signature analysis (MCSA). An agglomerative hierarchical algorithm using Ward's method is applied to automatically divide the map into interesting, interpretable groups of map units that correspond to clusters in the input data. The results obtained with this approach make it possible to detect a rotor bar fault directly from the visualization results. The system is also able to estimate the extent of rotor faults.
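
    A hedged sketch of the SOM-plus-Ward pipeline, assuming the third-party minisom package and SciPy; the MCSA feature extraction itself is out of scope here, so random vectors stand in for current-signature features and the cluster count is a placeholder:

```python
import numpy as np
from minisom import MiniSom                          # third-party package (assumption)
from scipy.cluster.hierarchy import linkage, fcluster

X = np.random.rand(500, 8)                           # stand-in for MCSA feature vectors
som = MiniSom(10, 10, input_len=8, sigma=1.0, learning_rate=0.5)
som.train_random(X, 2000)                            # unsupervised SOM training

codebook = som.get_weights().reshape(-1, 8)          # 100 map-unit prototypes
Z = linkage(codebook, method="ward")                 # Ward's agglomerative clustering
labels = fcluster(Z, t=3, criterion="maxclust")      # e.g. healthy / mild / severe fault
print(labels.reshape(10, 10))                        # cluster id of each map unit
```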

  20. Reducing approach bias to achieve smoking cessation: A pilot randomized placebo-controlled trial

    NARCIS (Netherlands)

    Baird, S.O.; Rinck, M.; Rosenfield, D.; Davis, M.L.; Fisher, J.R.; Becker, E.S.; Powers, M.B.; Smits, J.A.J.

    2017-01-01

    This study aimed to provide a preliminary test of the efficacy of a brief cognitive bias modification program for reducing approach bias in adult smokers motivated to quit. Participants were 52 smokers who were randomly assigned to four sessions of approach bias modification training (AAT) or sham

  1. ADVANCED EARTH OBSERVATION APPROACH FOR MULTISCALE FOREST ECOSYSTEM SERVICES MODELING AND MAPPING (MIMOSE

    Directory of Open Access Journals (Sweden)

    G. Chirici

    2014-04-01

    Full Text Available In the last decade ecosystem services (ES have been proposed as a method for quantifying the multifunctional role of forest ecosystems. Their spatial distribution on large areas is frequently limited by the lack of information, because field data collection with traditional methods requires much effort in terms of time and cost.  In this contribution we propose a methodology (namely, MultIscale Mapping Of ecoSystem servicEs - MIMOSE based on the integration of remotely sensed images and field observation to produce a wall-to-wall geodatabase of forest parcels accompanied with several information useful as a basis for future trade-off analysis of different ES. Here, we present the application of the MIMOSE approach to a study area of 443,758 hectares  coincident with administrative Molise Region in Central Italy. The procedure is based on a local high resolution forest types map integrated with information on the main forest management approaches. Through the non-parametric k-Nearest Neighbors techniques, we produced a growing stock volume map integrating a local forest inventory with a multispectral satellite IRS LISS III imagery. With the growing stock volume map we derived a forest age map for even-aged forest types. Later these information were used to automatically create a vector forest parcels map by multidimensional image segmentation that were finally populated with a number of information useful for ES spatial estimation. The contribution briefly introduce to the MIMOSE methodology presenting the preliminary results we achieved which constitute the basis for a future implementation of ES modeling.

  2. A practical and automated approach to large area forest disturbance mapping with remote sensing.

    Directory of Open Access Journals (Sweden)

    Mutlu Ozdogan

    Full Text Available In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: (i) creating masks for water, non-forested areas, clouds, and cloud shadows; (ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; (iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, (iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of misclassification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for
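
    Step (ii) above reduces to a simple thresholding rule on the SWIR difference image. A minimal sketch, using global statistics for brevity where the paper uses local windows; the n_std parameter and masks are hypothetical:

```python
import numpy as np

def candidate_training_pixels(swir_diff, mask, n_std=2.0):
    """Label pixels whose SWIR-difference value lies beyond n_std standard
    deviations of the masked mean: +1 likely disturbance, -1 likely stable,
    0 unlabeled. Global statistics are used here for brevity."""
    vals = swir_diff[mask]
    mu, sd = vals.mean(), vals.std()
    labels = np.zeros(swir_diff.shape, dtype=np.int8)
    labels[(swir_diff > mu + n_std * sd) & mask] = 1
    labels[(swir_diff < mu - n_std * sd) & mask] = -1
    return labels

swir_diff = np.random.randn(100, 100)           # date-2 minus date-1 SWIR band
mask = np.ones_like(swir_diff, dtype=bool)      # forested, cloud-free pixels
labels = candidate_training_pixels(swir_diff, mask)
```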

  3. A novel intra-operative, high-resolution atrial mapping approach.

    Science.gov (United States)

    Yaksh, Ameeta; van der Does, Lisette J M E; Kik, Charles; Knops, Paul; Oei, Frans B S; van de Woestijne, Pieter C; Bekkers, Jos A; Bogers, Ad J J C; Allessie, Maurits A; de Groot, Natasja M S

    2015-12-01

    A new technique is demonstrated for extensive high-resolution intra-operative atrial mapping that will facilitate the localization of atrial fibrillation (AF) sources and identification of the substrate perpetuating AF. Prior to the start of extra-corporal circulation, an 8 × 24-electrode array (2-mm inter-electrode distance) is placed sequentially on all the right and left epicardial atrial sites, including Bachmann's bundle, for recording of unipolar electrograms during sinus rhythm and (induced) AF. AF is induced by high-frequency pacing at the right atrial free wall. A pacemaker wire stitched to the right atrium serves as a reference signal. The indifferent pole is connected to a steel wire fixed to subcutaneous tissue. Electrograms are recorded by a computerized mapping system and, after amplification (gain 1000), filtering (bandwidth 0.5-400 Hz), sampling (1 kHz) and analogue-to-digital conversion (16 bits), automatically stored on hard disk. During the mapping procedure, real-time visualization secures electrogram quality. Analysis is performed offline. This technique was performed in 168 patients aged 18 years and older, with coronary and/or structural heart disease, with or without AF, electively scheduled for cardiac surgery, and with a ventricular ejection fraction above 40%. The mean duration of the entire mapping procedure, including preparation time, was 9 ± 2 min. Complications related to the mapping procedure during or after cardiac surgery were not observed. We introduce the first epicardial atrial mapping approach with a high resolution of ≥1728 recording sites which can be performed in a procedure time of only 9 ± 2 min. This mapping technique can potentially identify areas responsible for initiation and persistence of AF and hopefully can individualize both diagnosis and therapy of AF.

  4. Constrained Perturbation Regularization Approach for Signal Estimation Using Random Matrix Theory

    KAUST Repository

    Suliman, Mohamed Abdalla Elhag

    2016-10-06

    In this work, we propose a new regularization approach for linear least-squares problems with random model matrices. In the proposed constrained perturbation regularization approach, an artificial perturbation matrix with a bounded norm is forced into the system model matrix. This perturbation is introduced to improve the singular-value structure of the model matrix and, hence, the solution of the estimation problem. Relying on the randomness of the model matrix, a number of deterministic equivalents from random matrix theory are applied to derive the near-optimum regularizer that minimizes the mean-squared error of the estimator. Simulation results demonstrate that the proposed approach outperforms a set of benchmark regularization methods for various characteristics of the estimated signal. In addition, simulations show that our approach is robust in the presence of model uncertainty.
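
    The closed-form regularizer derived in the paper is specific to its asymptotic analysis; as a hedged stand-in, the sketch below shows the underlying idea of tuning a ridge-type regularizer on a random model matrix to minimize the estimator's mean-squared error, here by a numeric sweep rather than the paper's deterministic equivalents:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 80, 60
A = rng.standard_normal((n, m))            # random model matrix
x = rng.standard_normal(m)
y = A @ x + 0.3 * rng.standard_normal(n)   # noisy observations

def mse_for(gamma):
    # Regularized LS estimate: x_hat = (A^T A + gamma*I)^-1 A^T y
    x_hat = np.linalg.solve(A.T @ A + gamma * np.eye(m), A.T @ y)
    return np.mean((x_hat - x) ** 2)

gammas = np.logspace(-3, 2, 30)
best = min(gammas, key=mse_for)            # numeric stand-in for the closed form
print(f"best gamma ~ {best:.3f}, MSE = {mse_for(best):.4f}")
```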

  5. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    Science.gov (United States)

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity, as real-time and streaming data, in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses the distributed file system (DFS) for Big Data. Current implementations of MR only support execution of a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew-mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity analysis and qualitative results show significant performance improvement. PMID:26305223

  6. Mobile Ground-Based Radar Sensor for Localization and Mapping: An Evaluation of two Approaches

    Directory of Open Access Journals (Sweden)

    Damien Vivet

    2013-08-01

    Full Text Available This paper is concerned with robotic applications using a ground-based radar sensor for simultaneous localization and mapping problems. In mobile robotics, radar technology is interesting because of its long range and the robustness of radar waves to atmospheric conditions, making these sensors well-suited for extended outdoor robotic applications. Two localization and mapping approaches using data obtained from a 360° field-of-view microwave radar sensor are presented and compared. The first method is a trajectory-oriented simultaneous localization and mapping technique, which makes no landmark assumptions and avoids the data association problem. The estimation of the ego-motion makes use of the Fourier-Mellin transform for registering radar images in a sequence, from which the rotation and translation of the sensor motion can be estimated. The second approach exploits a consequence of using a rotating range sensor in high-speed robotics: in such a situation, movement combinations create distortions in the collected data. Velocimetry is achieved here by explicitly analysing these measurement distortions. As a result, the trajectory of the vehicle and then the radar map of outdoor environments can be obtained. The evaluation of the experimental results obtained by the two methods is presented on real-world data from a vehicle moving at 30 km/h over a 2.5 km course.
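
    The Fourier-Mellin registration step can be sketched with scikit-image: a rotation of the image becomes a shift along the angle axis of the polar-resampled Fourier magnitude spectrum, recoverable by phase correlation (the log-polar variant additionally recovers scale). Function names assume a recent scikit-image release; this is an illustration under those assumptions, not the authors' implementation:

```python
import numpy as np
from scipy import ndimage
from skimage.transform import warp_polar
from skimage.registration import phase_cross_correlation

ref = np.random.rand(256, 256)                    # stand-in for a radar image
mov = ndimage.rotate(ref, angle=17.0, reshape=False)

# A rotation of the image becomes a shift along the angle axis of the
# polar-resampled Fourier magnitude spectrum.
F_ref = np.abs(np.fft.fftshift(np.fft.fft2(ref)))
F_mov = np.abs(np.fft.fftshift(np.fft.fft2(mov)))
P_ref = warp_polar(F_ref, radius=100)
P_mov = warp_polar(F_mov, radius=100)

shift, _, _ = phase_cross_correlation(P_ref, P_mov)
angle = shift[0] * 360.0 / P_ref.shape[0]         # rows span 0..360 degrees
print(f"estimated rotation: {angle:.1f} deg")
```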

  7. A prospective randomized peri- and post-operative comparison of the minimally invasive anterolateral approach versus the lateral approach

    OpenAIRE

    Stefan Landgraeber; Henning Quitmann; Sebastian Güth; Marcel Haversath; Wojciech Kowalczyk; Andrés Kecskeméthy; Hansjörg Heep; Marcus Jäger

    2013-01-01

    There is still controversy as to whether minimally invasive total hip arthroplasty enhances the postoperative outcome. The aim of this study was to compare the outcome of patients who underwent total hip replacement through an anterolateral minimally invasive (MIS) or a conventional lateral approach (CON). We performed a randomized, prospective study of 75 patients with primary hip arthritis, who underwent hip replacement through the MIS (n=36) or CON (n=39) approach. The Western Ontario and ...

  8. Global land cover mapping at 30 m resolution: A POK-based operational approach

    Science.gov (United States)

    Chen, Jun; Chen, Jin; Liao, Anping; Cao, Xin; Chen, Lijun; Chen, Xuehong; He, Chaoying; Han, Gang; Peng, Shu; Lu, Miao; Zhang, Weiwei; Tong, Xiaohua; Mills, Jon

    2015-05-01

    Global Land Cover (GLC) information is fundamental for environmental change studies, land resource management, sustainable development, and many other societal benefits. Although GLC data exist at spatial resolutions of 300 m and 1000 m, a 30 m resolution mapping approach is now a feasible option for the next generation of GLC products. Since most significant human impacts on the land system can be captured at this scale, a number of researchers are focusing on such products. This paper reports the operational approach used in such a project, which aims to deliver reliable data products. Over 10,000 Landsat-like satellite images are required to cover the entire Earth at 30 m resolution. Deriving a GLC map from such a large volume of data necessitates the development of effective, efficient, economic and operational approaches. Automated approaches usually provide higher efficiency and thus more economic solutions, yet existing automated classification has been deemed ineffective because of the low classification accuracy achievable (typically below 65%) at global scale at 30 m resolution. As a result, an approach based on the integration of pixel- and object-based methods with knowledge (POK-based) has been developed. To handle the classification process of 10 land cover types, a split-and-merge strategy was employed: first each class is identified in a prioritized sequence, and then the results are merged together. For the identification of each class, a robust integration of pixel- and object-based classification was developed. To improve the quality of the classification results, a knowledge-based interactive verification procedure was developed with the support of web service technology. The performance of the POK-based approach was tested using eight selected areas with differing landscapes from five different continents. An overall classification accuracy of over 80% was achieved. This indicates that the developed POK-based approach is effective and feasible.

  9. Recognition of building group patterns in topographic maps based on graph partitioning and random forest

    Science.gov (United States)

    He, Xianjin; Zhang, Xinchang; Xin, Qinchuan

    2018-02-01

    Recognition of building group patterns (i.e., the arrangement and form exhibited by a collection of buildings at a given mapping scale) is important to the understanding and modeling of geographic space and is hence essential to a wide range of downstream applications such as map generalization. Most of the existing methods develop rigid rules based on the topographic relationships between building pairs to identify building group patterns, and thus their applications are often limited. This study proposes a method to identify a variety of building group patterns that allows for map generalization. The method first identifies building group patterns from potential building clusters based on a machine-learning algorithm and further partitions the building clusters with no recognized patterns based on the graph partitioning method. The proposed method is applied to the datasets of three cities that are representative of the complex urban environment in Southern China. Assessment of the results based on the reference data suggests that the proposed method is able to recognize both regular (e.g., the collinear, curvilinear, and rectangular patterns) and irregular (e.g., the L-shaped, H-shaped, and high-density patterns) building group patterns well, given that the correctness values are consistently nearly 90% and the completeness values are all above 91% for the three study areas. The proposed method shows promise in the automated recognition of building group patterns in support of map generalization.
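
    The machine-learning step can be sketched with scikit-learn's random forest; the per-cluster features and pattern labels below are hypothetical placeholders for those engineered in the paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-cluster features: cluster size, mean spacing, orientation
# spread, path straightness, mean building area, shape compactness.
rng = np.random.default_rng(1)
X = rng.random((600, 6))
y = rng.integers(0, 4, 600)       # e.g. collinear / curvilinear / rectangular / other

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X[:500], y[:500])
print("held-out accuracy:", clf.score(X[500:], y[500:]))
print("feature importances:", clf.feature_importances_.round(2))
```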

  10. Energy-efficient virtual optical network mapping approaches over converged flexible bandwidth optical networks and data centers.

    Science.gov (United States)

    Chen, Bowen; Zhao, Yongli; Zhang, Jie

    2015-09-21

    In this paper, we develop a virtual link priority mapping (LPM) approach and a virtual node priority mapping (NPM) approach to improve energy efficiency and reduce spectrum usage over converged flexible bandwidth optical networks and data centers. For comparison, the lower bound of the virtual optical network mapping is used as the benchmark solution. Simulation results show that the LPM approach achieves better performance in terms of power consumption, energy efficiency, spectrum usage, and the number of regenerators compared to the NPM approach.

  11. Conceptualizing Stakeholders' Perceptions of Ecosystem Services: A Participatory Systems Mapping Approach

    Science.gov (United States)

    Lopes, Rita; Videira, Nuno

    2015-12-01

    A participatory system dynamics modelling approach is advanced to support conceptualization of the feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes a systems mapping workshop and follow-up tasks aimed at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

  12. Heat transfer analysis in internally-cooled fuel elements by means of a conformal mapping approach

    International Nuclear Information System (INIS)

    Sarmiento, G.S.; Laura, P.A.A.

    1981-01-01

    The present paper deals with an approximate solution of the steady-state heat conduction problem in internally cooled fuel elements of fast breeder reactors. Explicit expressions for the dimensionless temperature distribution in terms of the governing physical and geometrical parameters are determined by means of a coupled conformal mapping-variational approach. The results obtained are found to be in very good agreement with those calculated by means of a finite element code. (orig.)

  13. A Systematic Mapping on Supporting Approaches for Requirements Traceability in the Context of Software Projects

    Directory of Open Access Journals (Sweden)

    MALCHER, P R.C.

    2015-12-01

    Full Text Available Requirements traceability is seen as a quality factor in software development and is present in standards and quality models. In this context, several techniques, models, frameworks and tools have been used to support it. The purpose of this paper is to present a systematic mapping carried out to find approaches in the literature that support requirements traceability in the context of software projects, and to categorize the data found in order to demonstrate, by means of a reliable, accurate and auditable method, how this area has developed and which main approaches are used to implement it.

  14. MAP3D: a media processor approach for high-end 3D graphics

    Science.gov (United States)

    Darsa, Lucia; Stadnicki, Steven; Basoglu, Chris

    1999-12-01

    Equator Technologies, Inc. has used a software-first approach to produce several programmable and advanced VLIW processor architectures that have the flexibility to run both traditional systems tasks and an array of media-rich applications. For example, Equator's MAP1000A is the world's fastest single-chip programmable signal and image processor targeted at digital consumer and office automation markets. The Equator MAP3D is a proposal for the architecture of the next generation of the Equator MAP family. The MAP3D is designed to achieve high-end 3D performance and a variety of customizable special effects by combining special graphics features with a high-performance floating-point and media processor architecture. As a programmable media processor, it offers the advantages of a completely configurable 3D pipeline, allowing developers to experiment with different algorithms and to tailor their pipeline to achieve the highest performance for a particular application. With the support of Equator's advanced C compiler and toolkit, MAP3D programs can be written in a high-level language. This allows the compiler to find and exploit any parallelism in a programmer's code, thus decreasing the time to market of a given application. The ability to run an operating system makes it possible to run concurrent applications on the MAP3D chip, such as video decoding while executing the 3D pipelines, so that integration of applications is easily achieved, for instance using real-time decoded imagery for texturing 3D objects. This novel architecture enables an affordable, integrated solution for high-performance 3D graphics.

  15. Assessment of landslide distribution map reliability in Niigata prefecture - Japan using frequency ratio approach

    Science.gov (United States)

    Rahardianto, Trias; Saputra, Aditya; Gomez, Christopher

    2017-07-01

    Research on landslide susceptibility has evolved rapidly over the last few decades thanks to the availability of large databases. Landslide research used to focus on discrete events, but the use of large inventory datasets has become a central pillar of landslide susceptibility, hazard, and risk assessment. Extracting meaningful information from large databases is now at the forefront of geoscientific research, following the big-data research trend. The more comprehensive the information on past landslides available in a particular area, the better the produced map will support effective decision making, planning, and engineering practice. Landslide inventory data that are freely accessible online give many researchers and decision makers an opportunity to prevent casualties and economic loss caused by future landslides. These data are advantageous especially for areas with poor landslide historical records. Since the construction criteria of landslide inventory maps and their quality evaluation remain poorly defined, an assessment of the reliability of open-source landslide inventory maps is required. The present contribution aims to assess the reliability of open-source landslide inventory data based on the particular topographical setting of the observed area in Niigata prefecture, Japan. A Geographic Information System (GIS) platform and a statistical approach are applied to analyze the data. The frequency ratio method is utilized to model and assess the landslide map. The generated model showed unsatisfactory results, with an AUC value of 0.603 indicating the low prediction accuracy and unreliability of the model.
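
    The frequency ratio statistic used here is simple to compute: for each class of a conditioning factor, FR is the share of landslide pixels falling in the class divided by the share of all pixels in the class. A minimal sketch with synthetic rasters; the factor binning is a placeholder:

```python
import numpy as np

def frequency_ratio(factor_classes, landslide_mask):
    """FR per class = (share of landslide pixels in the class) /
    (share of all pixels in the class); FR > 1 flags higher susceptibility."""
    fr = {}
    total, total_ls = factor_classes.size, landslide_mask.sum()
    for c in np.unique(factor_classes):
        in_class = factor_classes == c
        fr[c] = (landslide_mask[in_class].sum() / total_ls) / (in_class.sum() / total)
    return fr

slope_class = np.random.randint(1, 6, size=(200, 200))   # e.g. five slope bins
landslides = np.random.rand(200, 200) < 0.02             # inventory raster (stand-in)
print(frequency_ratio(slope_class, landslides))
```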

  16. Mapping mountain torrent hazards in the Hexi Corridor using an evidential reasoning approach

    Science.gov (United States)

    Ran, Youhua; Liu, Jinpeng; Tian, Feng; Wang, Dekai

    2017-02-01

    The Hexi Corridor is an important part of the Silk Road Economic Belt and a crucial channel for westward development in China. Many important national engineering projects pass through the corridor, such as highways, railways, and the West-to-East Gas Pipeline. The frequent torrent disasters greatly impact the security of infrastructure and human safety. In this study, an evidential reasoning approach based on Dempster-Shafer theory is proposed for mapping mountain torrent hazards in the Hexi Corridor. A torrent hazard map for the Hexi Corridor was generated by integrating the driving factors of mountain torrent disasters including precipitation, terrain, flow concentration processes, and the vegetation fraction. The results show that the capability of the proposed method is satisfactory. The torrent hazard map shows that there is high potential torrent hazard in the central and southeastern Hexi Corridor. The results are useful for engineering planning support and resource protection in the Hexi Corridor. Further efforts are discussed for improving torrent hazard mapping and prediction.
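
    At the core of the evidential reasoning approach is Dempster's rule of combination, which fuses independent bodies of evidence while normalizing out conflict. A minimal sketch, with hypothetical mass assignments for two driving factors over hazard levels (not the paper's factors or numbers):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts keyed by frozenset focal elements)
    with Dempster's rule, normalizing out the conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

H, L = frozenset({"high"}), frozenset({"low"})
U = H | L                                    # ignorance: either hazard level
m_precip = {H: 0.6, U: 0.4}                  # evidence from precipitation
m_terrain = {H: 0.5, L: 0.2, U: 0.3}         # evidence from terrain
print(dempster_combine(m_precip, m_terrain))
```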

  17. A Spectral Approach for Quenched Limit Theorems for Random Expanding Dynamical Systems

    Science.gov (United States)

    Dragičević, D.; Froyland, G.; González-Tokman, C.; Vaienti, S.

    2018-01-01

    We prove quenched versions of (i) a large deviations principle (LDP), (ii) a central limit theorem (CLT), and (iii) a local central limit theorem for non-autonomous dynamical systems. A key advance is the extension of the spectral method, commonly used in limit laws for deterministic maps, to the general random setting. We achieve this via multiplicative ergodic theory and the development of a general framework to control the regularity of Lyapunov exponents of twisted transfer operator cocycles with respect to a twist parameter. While some versions of the LDP and CLT have previously been proved with other techniques, the local central limit theorem is, to our knowledge, a completely new result, and one that demonstrates the strength of our method. Applications include non-autonomous (piecewise) expanding maps, defined by random compositions of the form $T_{\sigma^{n-1}\omega} \circ \cdots \circ T_{\sigma\omega} \circ T_{\omega}$. An important aspect of our results is that we only assume ergodicity and invertibility of the random driving $\sigma: \Omega \to \Omega$; in particular, no expansivity or mixing properties are required.

  18. Quantitative Trait Loci Mapping Problem: An Extinction-Based Multi-Objective Evolutionary Algorithm Approach

    Directory of Open Access Journals (Sweden)

    Nicholas S. Flann

    2013-09-01

    Full Text Available The Quantitative Trait Loci (QTL) mapping problem aims to identify regions in the genome that are linked to phenotypic features of the developed organism that vary in degree. It is a principal step in determining targets for further genetic analysis and is key to decoding the role of specific genes that control quantitative traits within species. Applications include identifying genetic causes of disease, optimizing cross-breeding for desired traits, and understanding trait diversity in populations. In this paper a new multi-objective evolutionary algorithm (MOEA) method is introduced and is shown to increase the accuracy of QTL mapping identification for both independent and epistatic loci interactions. The MOEA method optimizes over the space of possible partial least squares (PLS) regression QTL models and considers the conflicting objectives of model simplicity versus model accuracy. By optimizing for minimal model complexity, MOEA has the advantage of solving the over-fitting problem of conventional PLS models. The effectiveness of the method is confirmed by comparing the new method with Bayesian Interval Mapping approaches over a series of test cases where the optimal solutions are known. This approach can be applied to many problems that arise in the analysis of genomic data sets where the number of features far exceeds the number of observations and where features can be highly correlated.
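
    The multi-objective selection at the heart of the MOEA reduces to Pareto dominance between (complexity, error) pairs: a model survives if no other model is at least as good in both objectives and strictly better in one. A minimal sketch with hypothetical PLS model scores:

```python
def dominates(a, b):
    """a, b = (complexity, error), both minimized; True if a Pareto-dominates b."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(models):
    return [m for m in models
            if not any(dominates(o, m) for o in models if o is not m)]

# (number of loci in the PLS model, cross-validated prediction error)
models = [(2, 0.41), (3, 0.30), (4, 0.29), (5, 0.29), (3, 0.35), (6, 0.28)]
print(pareto_front(models))   # the simplest model at each accuracy level survives
```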

  19. A Photogrammetric Approach for Assessing Positional Accuracy of OpenStreetMap© Roads

    Directory of Open Access Journals (Sweden)

    Peter Doucette

    2013-04-01

    Full Text Available As open-source volunteered geographic information continues to gain popularity, the user community and data contributions are expected to grow, e.g., CloudMade, Apple, and Ushahidi now provide OpenStreetMap© (OSM) as a base layer for some of their mapping applications. This, coupled with the lack of cartographic standards and the expectation of one day being able to use this vector data for more geopositionally sensitive applications, like GPS navigation, leaves potential users and researchers to question the accuracy of the database. This research takes a photogrammetric approach to determining the positional accuracy of OSM road features using stereo imagery and a vector adjustment model. The method applies rigorous analytical measurement principles to compute accurate real-world geolocations of OSM road vectors. The proposed approach was tested on several urban gridded city streets from the OSM database, with the results showing that the post-adjustment shape points improved positionally by 86%. Furthermore, the vector adjustment was able to recover 95% of the actual positional displacement present in the database. To demonstrate a practical application, a head-to-head positional accuracy assessment between OSM, the USGS National Map (TNM), and the United States Census Bureau's Topologically Integrated Geographic Encoding and Referencing (TIGER) 2007 roads was conducted.

  20. Urban Flood Mapping Based on Unmanned Aerial Vehicle Remote Sensing and Random Forest Classifier—A Case of Yuyao, China

    Directory of Open Access Journals (Sweden)

    Quanlong Feng

    2015-03-01

    Full Text Available Flooding is a severe natural hazard, which poses a great threat to human life and property, especially in densely-populated urban areas. As one of the fastest developing fields in remote sensing applications, an unmanned aerial vehicle (UAV) can provide high-resolution data with a great potential for fast and accurate detection of inundated areas under complex urban landscapes. In this research, optical imagery was acquired by a mini-UAV to monitor the serious urban waterlogging in Yuyao, China. Texture features derived from the gray-level co-occurrence matrix were included to increase the separability of different ground objects. A Random Forest classifier, consisting of 200 decision trees, was used to extract flooded areas in the spectral-textural feature space. A confusion matrix was used to assess the accuracy of the proposed method. Results indicated the following: (1) Random Forest showed good performance in urban flood mapping, with an overall accuracy of 87.3% and a Kappa coefficient of 0.746; (2) the inclusion of texture features improved classification accuracy significantly; (3) Random Forest outperformed maximum likelihood and artificial neural network classifiers, and showed a similar performance to the support vector machine. The results demonstrate that a UAV can provide an ideal platform for urban flood monitoring and that the proposed method shows great capability for the accurate extraction of inundated areas.
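
    A hedged sketch of the texture step: gray-level co-occurrence features computed per image window and later appended to the spectral bands fed to the Random Forest. Function names assume scikit-image ≥ 0.19 (earlier releases spell them greycomatrix/greycoprops), and the window size and quantization are placeholders:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19

def glcm_features(patch, levels=32):
    """Contrast/homogeneity/energy of a quantized patch, averaged over
    four directions at distance 1."""
    q = (patch / 256 * levels).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, symmetric=True, normed=True)
    return [graycoprops(glcm, p).mean()
            for p in ("contrast", "homogeneity", "energy")]

patch = np.random.randint(0, 256, (32, 32))   # one window of the UAV image
print(glcm_features(patch))                   # appended to the spectral features
```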

  1. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    Science.gov (United States)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, without specifying when or within what time interval the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example dams and nuclear power plants, continued to challenge the map(s). The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published

  2. Approaching multidimensional forms of knowledge through Personal Meaning Mapping in science integrating teaching outside the classroom

    DEFF Research Database (Denmark)

    Hartmeyer, Rikke; Bolling, Mads; Bentsen, Peter

    2017-01-01

    Current research points to Personal Meaning Mapping (PMM) as a method useful in investigating students' prior and current science knowledge. However, studies investigating PMM as a method for exploring specific knowledge dimensions are lacking. Ensuring that students are able to access specific knowledge dimensions is important, especially in science teaching outside the classroom, where “hands-on” approaches and experiments are often part of teaching and require procedural knowledge, among other things. Therefore, this study investigates PMM as a method for exploring specific knowledge dimensions in formal science education integrating teaching outside the classroom. We applied a case study design involving two schools and four sixth-grade classes. Data were collected from six students in each class who constructed personal meaning maps and were interviewed immediately after natural science...

  3. A high-throughput shotgun mutagenesis approach to mapping B-cell antibody epitopes.

    Science.gov (United States)

    Davidson, Edgar; Doranz, Benjamin J

    2014-09-01

    Characterizing the binding sites of monoclonal antibodies (mAbs) on protein targets, their 'epitopes', can aid in the discovery and development of new therapeutics, diagnostics and vaccines. However, the speed of epitope mapping techniques has not kept pace with the increasingly large numbers of mAbs being isolated. Obtaining detailed epitope maps for functionally relevant antibodies can be challenging, particularly for conformational epitopes on structurally complex proteins. To enable rapid epitope mapping, we developed a high-throughput strategy, shotgun mutagenesis, that enables the identification of both linear and conformational epitopes in a fraction of the time required by conventional approaches. Shotgun mutagenesis epitope mapping is based on large-scale mutagenesis and rapid cellular testing of natively folded proteins. Hundreds of mutant plasmids are individually cloned, arrayed in 384-well microplates, expressed within human cells, and tested for mAb reactivity. Residues are identified as a component of a mAb epitope if their mutation (e.g. to alanine) does not support candidate mAb binding but does support that of other conformational mAbs or allows full protein function. Shotgun mutagenesis is particularly suited for studying structurally complex proteins because targets are expressed in their native form directly within human cells. Shotgun mutagenesis has been used to delineate hundreds of epitopes on a variety of proteins, including G protein-coupled receptor and viral envelope proteins. The epitopes mapped on dengue virus prM/E represent one of the largest collections of epitope information for any viral protein, and results are being used to design better vaccines and drugs. © 2014 John Wiley & Sons Ltd.
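
    The epitope-calling rule described above is essentially a boolean filter over a mutant-by-mAb reactivity matrix: a mutated residue is called part of the epitope when the candidate mAb loses binding while most control mAbs retain it. A minimal sketch with hypothetical data and thresholds:

```python
import numpy as np

# reactivity[i, j] in [0, 1]: binding of mAb j to point mutant i (hypothetical)
rng = np.random.default_rng(2)
reactivity = rng.random((100, 8))             # 100 mutants x 8 mAbs
candidate = 0                                 # column of the mAb being mapped

lost_candidate = reactivity[:, candidate] < 0.3           # binding abolished
others = np.delete(reactivity, candidate, axis=1)
still_folded = (others > 0.7).mean(axis=1) >= 0.5         # control mAbs still bind

epitope_residues = np.where(lost_candidate & still_folded)[0]
print(epitope_residues)                       # mutated positions called as epitope
```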

  4. Regional geology mapping using satellite-based remote sensing approach in Northern Victoria Land, Antarctica

    Science.gov (United States)

    Pour, Amin Beiranvand; Park, Yongcheol; Park, Tae-Yoon S.; Hong, Jong Kuk; Hashim, Mazlan; Woo, Jusun; Ayoobi, Iman

    2018-06-01

    Satellite remote sensing imagery is especially useful for geological investigations in Antarctica because of its remoteness and the extreme environmental conditions that constrain direct geological surveys. The highest percentage of exposed rocks and soils in Antarctica occurs in Northern Victoria Land (NVL). Exposed rocks in NVL were part of the paleo-Pacific margin of East Gondwana during Paleozoic time. This investigation provides a satellite-based remote sensing approach for regional geological mapping in NVL, Antarctica. Landsat-8 and Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) datasets were used to extract lithological-structural and mineralogical information. Several spectral band-ratio indices were developed using Landsat-8 and ASTER bands and proposed for Antarctic environments to map spectral signatures of snow/ice, iron oxide/hydroxide minerals, Al-OH-bearing and Fe,Mg-OH and CO3 mineral zones, and quartz-rich felsic and mafic-to-ultramafic lithological units. The spectral band-ratio indices were tested on and applied to Level 1 terrain-corrected (L1T) products of the Landsat-8 and ASTER datasets covering NVL. The surface distribution of the mineral assemblages was mapped using the spectral band-ratio indices and verified by geological expeditions and laboratory analysis. The resultant image maps derived from the spectral band-ratio indices developed in this study are fairly accurate and correspond well with existing geological maps of NVL. The spectral band-ratio indices developed in this study are especially useful for geological investigations in inaccessible locations and poorly exposed lithological units in Antarctic environments.
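
    A band-ratio index is just a per-pixel division of two bands, optionally thresholded to delineate a mineral zone. A minimal generic sketch; the band pairing, variable names, and threshold are placeholders, not the indices defined in the paper:

```python
import numpy as np

def band_ratio(num, den, eps=1e-6):
    """Per-pixel band ratio; larger values flag the target spectral feature."""
    return num / (den + eps)

# Hypothetical reflectance bands (2-D arrays); not the paper's band pairings
b4 = np.random.rand(100, 100)
b6 = np.random.rand(100, 100)
index = band_ratio(b4, b6)                   # e.g. a stand-in Al-OH indicator
zone = index > np.percentile(index, 95)      # threshold to map a candidate zone
```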

  5. Evaluation of various modelling approaches in flood routing simulation and flood area mapping

    Science.gov (United States)

    Papaioannou, George; Loukas, Athanasios; Vasiliades, Lampros; Aronica, Giuseppe

    2016-04-01

    An essential process of flood hazard analysis and mapping is floodplain modelling. The selection of the modelling approach, especially in complex riverine topographies such as urban and suburban areas and ungauged watersheds, may affect the accuracy of the outcomes in terms of flood depths and flood inundation area. In this study, a sensitivity analysis was implemented using several hydraulic-hydrodynamic modelling approaches (1D, 2D, 1D/2D), and the effect of the modelling approach on flood modelling and flood mapping was investigated. The digital terrain model (DTM) used in this study was generated from Terrestrial Laser Scanning (TLS) point cloud data. The modelling approaches included 1-dimensional (1D), 2-dimensional (2D), and coupled 1D/2D hydraulic-hydrodynamic models. The 1D models used were HEC-RAS, MIKE11, LISFLOOD and XPSTORM; the 2D models were MIKE21, MIKE21 FM, HEC-RAS (2D), XPSTORM, LISFLOOD and FLO-2D; and the coupled 1D/2D models were HEC-RAS (1D/2D), MIKE11/MIKE21 (MIKE FLOOD platform), MIKE11/MIKE21 FM (MIKE FLOOD platform), and XPSTORM (1D/2D). Validation of the simulated flood extent was achieved using 2 × 2 contingency tables comparing simulated and observed flooded areas for an extreme historical flash flood event, with the Critical Success Index used as the skill score. The modelling approaches were also evaluated for simulation time and required computing power. The methodology was implemented in a suburban ungauged watershed of the Xerias river at Volos, Greece. The results of the analysis indicate the necessity of sensitivity analysis with different hydraulic-hydrodynamic modelling approaches, especially for areas with complex terrain.
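
    The Critical Success Index used for validation follows directly from the 2 × 2 contingency table: CSI = hits / (hits + misses + false alarms). A minimal sketch with boolean flood-extent rasters standing in for the simulated and observed maps:

```python
import numpy as np

def critical_success_index(sim, obs):
    """CSI = hits / (hits + misses + false alarms) from boolean flood rasters."""
    hits = (sim & obs).sum()
    misses = (~sim & obs).sum()
    false_alarms = (sim & ~obs).sum()
    return hits / (hits + misses + false_alarms)

obs = np.random.rand(200, 200) < 0.2    # observed flood extent (stand-in)
sim = np.random.rand(200, 200) < 0.2    # simulated flood extent (stand-in)
print(f"CSI = {critical_success_index(sim, obs):.2f}")
```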

  6. Color image encryption using random transforms, phase retrieval, chaotic maps, and diffusion

    Science.gov (United States)

    Annaby, M. H.; Rushdi, M. A.; Nehary, E. A.

    2018-04-01

    The recent tremendous proliferation of color imaging applications has been accompanied by growing research in data encryption to secure color images against adversary attacks. While recent color image encryption techniques perform reasonably well, they still exhibit vulnerabilities and deficiencies in terms of statistical security measures due to image data redundancy and inherent weaknesses. This paper proposes two encryption algorithms that largely treat these deficiencies and boost the security strength through novel integration of the random fractional Fourier transforms, phase retrieval algorithms, as well as chaotic scrambling and diffusion. We show through detailed experiments and statistical analysis that the proposed enhancements significantly improve security measures and immunity to attacks.
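
    A hedged sketch of the chaotic scrambling-plus-diffusion building block (not the paper's full scheme, which also integrates random fractional Fourier transforms and phase retrieval): a single logistic-map orbit supplies both a pixel permutation and an XOR keystream, and the initial condition x0 acts as the key:

```python
import numpy as np

def logistic_keystream(n, x0=0.7, r=3.99):
    """Iterate the chaotic logistic map x -> r*x*(1-x); return a byte
    keystream and a permutation derived from the same orbit."""
    xs, x = np.empty(n), x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return (xs * 255).astype(np.uint8), np.argsort(xs)

img = np.random.randint(0, 256, 64 * 64, dtype=np.uint8)   # flattened channel
ks, perm = logistic_keystream(img.size)
cipher = img[perm] ^ ks              # scrambling (permutation) + diffusion (XOR)

plain = np.empty_like(cipher)
plain[perm] = cipher ^ ks            # invert: un-XOR, then un-permute
assert np.array_equal(plain, img)
```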

  7. A new meshless approach to map electromagnetic loads for FEM analysis on DEMO TF coil system

    International Nuclear Information System (INIS)

    Biancolini, Marco Evangelos; Brutti, Carlo; Giorgetti, Francesco; Muzzi, Luigi; Turtù, Simonetta; Anemona, Alessandro

    2015-01-01

    Graphical abstract: - Highlights: • Generation and mapping of magnetic loads on DEMO using radial basis functions. • Good agreement between RBF interpolation and EM TOSCA computations. • Resultant forces are stable with respect to the target mesh used. • Stress results are robust and accurate even if a coarse cloud is used for RBF interpolation. - Abstract: Demonstration fusion reactors (DEMO) are being envisaged to be able to produce commercial electrical power. The design of the DEMO magnets and of the constituting conductors is a crucial issue in the overall engineering design of such a large fusion machine. In the frame of the EU roadmap of the so-called fast-track approach, mechanical studies of preliminary DEMO toroidal field (TF) coil system conceptual designs are being carried out. The magnetic field load acting on the DEMO TF coil conductor has to be evaluated as input to the FEM model mesh, in order to evaluate the stresses on the mechanical structure. To gain flexibility, a novel approach based on the meshless method of radial basis functions (RBF) has been implemented. The present paper describes this original and flexible approach for the generation and mapping of magnetic loads on the DEMO TF coil system.
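
    With SciPy ≥ 1.7 the meshless transfer can be sketched directly via RBFInterpolator: loads known on the electromagnetic solver's point cloud are interpolated onto the FEM mesh nodes without any intermediate mesh. The data, point counts, and kernel choice below are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator   # SciPy >= 1.7

rng = np.random.default_rng(3)
em_points = rng.random((400, 3))                # EM solver point cloud (x, y, z)
em_load = np.sin(6.0 * em_points[:, :1])        # stand-in nodal load values

rbf = RBFInterpolator(em_points, em_load, kernel="thin_plate_spline")

fem_nodes = rng.random((5000, 3))               # target FEM mesh nodes
mapped_load = rbf(fem_nodes)                    # meshless load transfer
print(mapped_load.shape)                        # (5000, 1)
```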

  8. Non-invasive multiparametric qBOLD approach for robust mapping of the oxygen extraction fraction.

    Science.gov (United States)

    Domsch, Sebastian; Mie, Moritz B; Wenz, Frederik; Schad, Lothar R

    2014-09-01

    The quantitative blood oxygenation level-dependent (qBOLD) method has not become clinically established yet because long acquisition times are necessary to achieve an acceptable certainty of the parameter estimates. In this work, a non-invasive multiparametric (nimp) qBOLD approach based on a simple analytical model is proposed to facilitate robust oxygen extraction fraction (OEF) mapping within clinically acceptable acquisition times by using separate measurements. The protocol consisted of a gradient-echo sampled spin-echo sequence (GESSE), a T2-weighted Carr-Purcell-Meiboom-Gill (CPMG) sequence, and a T2(*)-weighted multi-slice multi-echo gradient echo (MMGE) sequence. The GESSE acquisition time was less than 5 minutes and the extra measurement time for CPMG/MMGE was below 2 minutes each. The proposed nimp-qBOLD approach was validated in healthy subjects (N = 5) and one patient. The proposed nimp-qBOLD approach facilitated more robust OEF mapping with significantly reduced inter- and intra-subject variability compared to the standard qBOLD method. Thereby, an average OEF in all subjects of 27±2% in white matter (WM) and 29±2% in gray matter (GM) using the nimp-qBOLD method was more stable compared to 41±10% (WM) and 46±10% (GM) with standard qBOLD. Moreover, the spatial variance in the image slice (i.e. standard deviation divided by mean) was on average reduced from 35% to 25%. In addition, the preliminary results of the patient are encouraging. The proposed nimp-qBOLD technique provides a promising tool for robust OEF mapping within clinically acceptable acquisition times and could therefore provide an important contribution for analyzing tumors or monitoring the success of radio and chemo therapies. Copyright © 2014. Published by Elsevier GmbH.

  9. Non-invasive multiparametric qBOLD approach for robust mapping of the oxygen extraction fraction

    Energy Technology Data Exchange (ETDEWEB)

    Domsch, Sebastian; Mie, Moritz B.; Schad, Lothar R. [Heidelberg Univ., Medical Faculty Mannheim (Germany). Computer Assisted Clinical Medicine; Wenz, Frederik [Heidelberg Univ., Medical Faculty Mannheim (Germany). Dept. of Radiation Oncology

    2014-10-01

    Introduction: The quantitative blood oxygenation level-dependent (qBOLD) method has not become clinically established yet because long acquisition times are necessary to achieve an acceptable certainty of the parameter estimates. In this work, a non-invasive multiparametric (nimp) qBOLD approach based on a simple analytical model is proposed to facilitate robust oxygen extraction fraction (OEF) mapping within clinically acceptable acquisition times by using separate measurements. Methods: The protocol consisted of a gradient-echo sampled spin-echo sequence (GESSE), a T2-weighted Carr-Purcell-Meiboom-Gill (CPMG) sequence, and a T2*-weighted multi-slice multi-echo gradient echo (MMGE) sequence. The GESSE acquisition time was less than 5 minutes and the extra measurement time for CPMG/MMGE was below 2 minutes each. The proposed nimp-qBOLD approach was validated in healthy subjects (N = 5) and one patient. Results: The proposed nimp-qBOLD approach facilitated more robust OEF mapping with significantly reduced inter- and intra-subject variability compared to the standard qBOLD method. Thereby, an average OEF in all subjects of 27 ± 2% in white matter (WM) and 29 ± 2% in gray matter (GM) using the nimp-qBOLD method was more stable compared to 41 ± 10% (WM) and 46 ± 10% (GM) with standard qBOLD. Moreover, the spatial variance in the image slice (i.e. standard deviation divided by mean) was on average reduced from 35% to 25%. In addition, the preliminary results of the patient are encouraging. Conclusion: The proposed nimp-qBOLD technique provides a promising tool for robust OEF mapping within clinically acceptable acquisition times and could therefore provide an important contribution for analyzing tumors or monitoring the success of radio- and chemotherapies. (orig.)

  10. Pixel-based dust-extinction mapping in nearby galaxies: A new approach to lifting the veil of dust

    Science.gov (United States)

    Tamura, Kazuyuki

    In the first part of this dissertation, I explore a new approach to mapping dust extinction in galaxies, using the observed and estimated dust-free flux ratios of optical V-band and mid-IR 3.6 μm emission. Inferred missing V-band flux is then converted into an estimate of dust extinction. While dust features are not clearly evident in the observed ground-based images of NGC 0959, the target of my pilot study, the dust map created with this method clearly traces the distribution of dust seen in higher-resolution Hubble images. Stellar populations are then analyzed through various pixel Color-Magnitude Diagrams and pixel Color-Color Diagrams (pCCDs), both before and after extinction correction. The (B − 3.6 μm) versus (far-UV − U) pCCD proves particularly powerful for distinguishing pixels that are dominated by different types of, or mixtures of, stellar populations. Mapping these pixel-groups onto a pixel-coordinate map shows that they are not distributed randomly, but follow genuine galactic structures, such as a previously unrecognized bar. I show that selecting pixel-groups is not meaningful when using uncorrected colors, and that pixel-based extinction correction is crucial to reveal the true spatial variations in stellar populations. This method is then applied to a sample of late-type galaxies to study the distribution of dust and stellar populations as a function of their morphological type and absolute magnitude. In each galaxy, I find that dust extinction does not simply decrease radially, but is concentrated in localized clumps throughout the galaxy. I also find some cases where star-formation regions are not associated with dust. In the second part, I describe the application of astronomical image analysis tools for medical purposes. In particular, Source Extractor is used to detect nerve fibers in the basement-membrane images of human skin biopsies of obese subjects. While more development and testing is necessary for this kind of work
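
    The flux-ratio method of the first part can be sketched in a few lines: the 3.6 μm image predicts the dust-free V-band flux through an assumed intrinsic flux ratio, and the shortfall of the observed V flux gives the extinction in magnitudes. All arrays and the ratio value below are hypothetical:

```python
import numpy as np

# Observed V-band and 3.6 um fluxes per pixel (hypothetical arrays)
f_v = np.random.rand(100, 100) + 0.1
f_36 = np.random.rand(100, 100) + 0.1

ratio_dustfree = 1.8                  # assumed intrinsic V / 3.6 um flux ratio
f_v_pred = ratio_dustfree * f_36      # V flux expected in the absence of dust
a_v = -2.5 * np.log10(np.clip(f_v / f_v_pred, 1e-3, None))   # extinction (mag)
a_v = np.clip(a_v, 0.0, None)         # negative values -> no measurable extinction
```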

  11. The Effectiveness of Web-Based Asthma Self-Management System, My Asthma Portal (MAP): A Pilot Randomized Controlled Trial.

    Science.gov (United States)

    Ahmed, Sara; Ernst, Pierre; Bartlett, Susan J; Valois, Marie-France; Zaihra, Tasneem; Paré, Guy; Grad, Roland; Eilayyan, Owis; Perreault, Robert; Tamblyn, Robyn

    2016-12-01

    Whether Web-based technologies can improve disease self-management is uncertain. My Asthma Portal (MAP) is a Web-based self-management support system that couples evidence-based behavioral change components (self-monitoring of symptoms, physical activity, and medication adherence) with real-time monitoring, feedback, and support from a nurse case manager. The aim of this study was to compare the impact of six months of access to a Web-based asthma self-management patient portal linked to a case-management system (MAP) with that of usual care on asthma control and quality of life. A multicenter, parallel, 2-arm, pilot, randomized controlled trial was conducted with 100 adults with a confirmed diagnosis of asthma from 2 specialty clinics. Asthma control was measured using an algorithm based on overuse of fast-acting bronchodilators and emergency department visits, and asthma-related quality of life was assessed using the Mini-Asthma Quality of Life Questionnaire (MAQLQ). Secondary mediating outcomes included asthma symptoms, depressive symptoms, self-efficacy, and beliefs about medication. Process evaluations were also included. A total of 49 individuals were randomized to MAP and 51 to usual care. Compared with usual care, participants in the intervention group reported significantly higher asthma quality of life (mean change 0.61, 95% CI 0.03 to 1.19), and the change in asthma quality of life for the intervention group between baseline and 3 months (mean change 0.66, 95% CI 0.35 to 0.98) was not seen in the control group. No significant differences in asthma quality of life were found between the intervention and control groups at 6 (mean change 0.46, 95% CI -0.12 to 1.05) and 9 months (mean change 0.39, 95% CI -0.2 to 0.98). For poor control status, there was no significant effect of group, time, or group by time. For all self-reported measures, the intervention group had a significantly higher proportion of individuals demonstrating a minimal clinically

  12. Cryptanalysis and improvement of an optical image encryption scheme using a chaotic Baker map and double random phase encoding

    International Nuclear Information System (INIS)

    Chen, Jun-Xin; Fu, Chong; Zhu, Zhi-Liang; Zhang, Li-Bo; Zhang, Yushu

    2014-01-01

    In this paper, we evaluate the security of an enhanced double random phase encoding (DRPE) image encryption scheme (2013 J. Lightwave Technol. 31 2533). The original system employs a chaotic Baker map prior to DRPE to provide more protection to the plain image and hence, as claimed, promote the security level of DRPE. However, cryptanalysis shows that this scheme is vulnerable to a chosen-plaintext attack, and the plaintext can be precisely recovered. The corresponding improvement is subsequently reported, on the basic premise that no extra equipment or computational complexity is required. The simulation results and security analyses prove its effectiveness and security. The proposed achievements are applicable to all cryptosystems that employ a permutation followed by the DRPE architecture, and we hope that our work can motivate further research on optical image encryption. (paper)

  13. Cryptanalysis and improvement of an optical image encryption scheme using a chaotic Baker map and double random phase encoding

    Science.gov (United States)

    Chen, Jun-Xin; Zhu, Zhi-Liang; Fu, Chong; Zhang, Li-Bo; Zhang, Yushu

    2014-12-01

    In this paper, we evaluate the security of an enhanced double random phase encoding (DRPE) image encryption scheme (2013 J. Lightwave Technol. 31 2533). The original system employs a chaotic Baker map prior to DRPE to provide more protection to the plain image and hence, as claimed, promote the security level of DRPE. However, cryptanalysis shows that this scheme is vulnerable to a chosen-plaintext attack, and the plaintext can be precisely recovered. The corresponding improvement is subsequently reported, on the basic premise that no extra equipment or computational complexity is required. The simulation results and security analyses prove its effectiveness and security. The proposed achievements are applicable to all cryptosystems that employ a permutation followed by the DRPE architecture, and we hope that our work can motivate further research on optical image encryption.
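
    The vulnerability exploited here is generic to permutation-only preprocessing: a chosen-plaintext attacker can encrypt delta images and read the permutation off directly, after which the scrambling adds nothing on top of DRPE. A minimal demonstration with a stand-in permutation (not the actual discretized Baker map or the full DRPE pipeline):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 16
secret_perm = rng.permutation(n * n)    # stands in for the Baker-map scrambling

def permute_only_encrypt(img):
    return img.reshape(-1)[secret_perm].reshape(n, n)

# Chosen-plaintext attack: encrypting one delta image per pixel
# reveals where that pixel lands, i.e. the whole permutation.
recovered = np.empty(n * n, dtype=int)
for k in range(n * n):
    delta = np.zeros(n * n)
    delta[k] = 1.0
    recovered[np.argmax(permute_only_encrypt(delta.reshape(n, n)))] = k

test = rng.random((n, n))
assert np.allclose(test.reshape(-1)[recovered],
                   permute_only_encrypt(test).reshape(-1))
```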

  14. Analogies between colored Lévy noise and random channel approach to disordered kinetics

    Science.gov (United States)

    Vlad, Marcel O.; Velarde, Manuel G.; Ross, John

    2004-02-01

    We point out some interesting analogies between colored Lévy noise and the random channel approach to disordered kinetics. These analogies are due to the fact that the probability density of the Lévy noise source plays a similar role as the probability density of rate coefficients in disordered kinetics. Although the equations for the two approaches are not identical, the analogies can be used for deriving new, useful results for both problems. The random channel approach makes it possible to generalize the fractional Uhlenbeck-Ornstein processes (FUO) for space- and time-dependent colored noise. We describe the properties of colored noise in terms of characteristic functionals, which are evaluated by using a generalization of Huber's approach to complex relaxation [Phys. Rev. B 31, 6070 (1985)]. We start out by investigating the properties of symmetrical white noise and then define the Lévy colored noise in terms of a Langevin equation with a Lévy white noise source. We derive exact analytical expressions for the various characteristic functionals, which characterize the noise, and a functional fractional Fokker-Planck equation for the probability density functional of the noise at a given moment in time. Second, by making an analogy between the theory of colored noise and the random channel approach to disordered kinetics, we derive fractional equations for the evolution of the probability densities of the random rate coefficients in disordered kinetics. These equations serve as a basis for developing methods for the evaluation of the statistical properties of the random rate coefficients from experimental data. Special attention is paid to the analysis of systems for which the observed kinetic curves can be described by linear or nonlinear stretched exponential kinetics.
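
    A sample path of the Langevin equation with a Lévy white-noise source, the building block of the colored Lévy noise discussed above, can be sketched with SciPy's stable distribution; the parameter values and the Euler-Maruyama discretization (with increments scaled as dt^(1/alpha)) are illustrative assumptions:

```python
import numpy as np
from scipy.stats import levy_stable

# Langevin equation dX = -gamma*X dt + dL, with an alpha-stable noise source
alpha, beta, gamma, dt, n = 1.5, 0.0, 1.0, 1e-3, 20000
dL = levy_stable.rvs(alpha, beta, size=n) * dt ** (1.0 / alpha)  # stable increments

x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = x[i - 1] - gamma * x[i - 1] * dt + dL[i]   # Euler-Maruyama step
```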

  15. A novel approach to generate random surface thermal loads in piping

    Energy Technology Data Exchange (ETDEWEB)

    Costa Garrido, Oriol, E-mail: oriol.costa@ijs.si; El Shawish, Samir; Cizelj, Leon

    2014-07-01

    Highlights: • Approach for generating continuous and time-dependent random thermal fields. • Temperature fields simulate fluid mixing thermal loads at the fluid–wall interface. • Through plane-wave decomposition, experimental temperature statistics are reproduced. • Validation of the approach with a case study from the literature. • Random surface thermal load generation for future thermal fatigue analyses of piping. - Abstract: There is a need to perform three-dimensional mechanical analyses of pipes subjected to complex thermo-mechanical loadings, such as the ones evolving from turbulent fluid mixing in a T-junction. A novel approach is proposed in this paper for fast and reliable generation of random thermal loads at the pipe surface. The resulting continuous and time-dependent temperature fields simulate the fluid mixing thermal loads at the fluid–wall interface. The approach is based on reproducing discrete fluid temperature statistics, from experimental readings or computational fluid dynamics (CFD) simulation results, at interface locations through plane-wave decomposition of temperature fluctuations. The obtained random thermal fields contain large-scale instabilities such as cold and hot spots traveling at flow velocities. These low-frequency instabilities are believed to be among the major causes of thermal fatigue in T-junction configurations. A case study from the literature has been used to demonstrate the generation of random surface thermal loads. The thermal fields generated with the proposed approach are statistically equivalent (within the first two moments) to those from CFD simulation results of similar characteristics. The fields maintain the input data at field locations for a large set of parameters used to generate the thermal loads. This feature will be of great advantage in future sensitivity fatigue analyses of three-dimensional pipe structures.
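
    The plane-wave decomposition idea can be illustrated with a short sketch: superpose randomly parametrized plane waves convected at an assumed flow velocity, then rescale the field so its first two moments match target surface-temperature statistics. This is a toy version of the method under simplifying assumptions (one spatial dimension; the wavenumber range and the convection speed `U` are hypothetical).

```python
import numpy as np

def random_thermal_field(x, t, mean, std, n_waves=200, U=1.0, seed=0):
    """Superpose random plane waves convected at speed U, then rescale so the
    field matches a target surface-temperature mean and standard deviation.

    x : 1-D array of positions along the pipe surface, t : 1-D array of times.
    """
    rng = np.random.default_rng(seed)
    k = rng.uniform(0.5, 20.0, n_waves)            # wavenumbers (assumed range)
    phase = rng.uniform(0.0, 2 * np.pi, n_waves)
    amp = rng.normal(0.0, 1.0, n_waves) / np.sqrt(n_waves)
    X, T = np.meshgrid(x, t, indexing="ij")
    field = sum(a * np.cos(ki * (X - U * T) + p)   # traveling hot/cold spots
                for a, ki, p in zip(amp, k, phase))
    field = (field - field.mean()) / field.std()   # normalize, then match targets
    return mean + std * field
```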

  16. A novel approach to generate random surface thermal loads in piping

    International Nuclear Information System (INIS)

    Costa Garrido, Oriol; El Shawish, Samir; Cizelj, Leon

    2014-01-01

    Highlights: • Approach for generating continuous and time-dependent random thermal fields. • Temperature fields simulate fluid mixing thermal loads at the fluid–wall interface. • Through plane-wave decomposition, experimental temperature statistics are reproduced. • Validation of the approach with a case study from the literature. • Random surface thermal load generation for future thermal fatigue analyses of piping. - Abstract: There is a need to perform three-dimensional mechanical analyses of pipes subjected to complex thermo-mechanical loadings, such as the ones evolving from turbulent fluid mixing in a T-junction. A novel approach is proposed in this paper for fast and reliable generation of random thermal loads at the pipe surface. The resulting continuous and time-dependent temperature fields simulate the fluid mixing thermal loads at the fluid–wall interface. The approach is based on reproducing discrete fluid temperature statistics, from experimental readings or computational fluid dynamics (CFD) simulation results, at interface locations through plane-wave decomposition of temperature fluctuations. The obtained random thermal fields contain large-scale instabilities such as cold and hot spots traveling at flow velocities. These low-frequency instabilities are believed to be among the major causes of thermal fatigue in T-junction configurations. A case study from the literature has been used to demonstrate the generation of random surface thermal loads. The thermal fields generated with the proposed approach are statistically equivalent (within the first two moments) to those from CFD simulation results of similar characteristics. The fields maintain the input data at field locations for a large set of parameters used to generate the thermal loads. This feature will be of great advantage in future sensitivity fatigue analyses of three-dimensional pipe structures.

  17. Conceptualizing Stakeholders’ Perceptions of Ecosystem Services: A Participatory Systems Mapping Approach

    Directory of Open Access Journals (Sweden)

    Lopes Rita

    2015-12-01

    Full Text Available A participatory system dynamics modelling approach is advanced to support the conceptualization of feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes a systems mapping workshop and follow-up tasks aimed at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

  18. Urban Land Cover Mapping Accuracy Assessment - A Cost-benefit Analysis Approach

    Science.gov (United States)

    Xiao, T.

    2012-12-01

    One of the most important components in urban land cover mapping is mapping accuracy assessment. Many statistical models have been developed to help design simple schemes based on both accuracy and confidence levels. It is intuitive that an increased number of samples increases the accuracy as well as the cost of an assessment. Understanding cost and sample size is crucial to implementing efficient and effective field data collection. Few studies have included a cost calculation component as part of the assessment. In this study, a cost-benefit sampling analysis model was created by combining sample size design and sampling cost calculation. The sampling cost included transportation cost, field data collection cost, and laboratory data analysis cost. Simple Random Sampling (SRS) and Modified Systematic Sampling (MSS) methods were used to design sample locations and to extract land cover data in ArcGIS. High-resolution land cover data layers of Denver, CO and Sacramento, CA, street networks, and parcel GIS data layers were used in this study to test and verify the model. The relationship between cost and accuracy was used to determine the effectiveness of each sampling method. The results of this study can be applied to other environmental studies that require spatial sampling.
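
    A minimal sketch of how a sample-size formula can be paired with a cost model, assuming the standard binomial sample-size expression for accuracy assessment and entirely hypothetical per-site cost figures (the actual model in the study breaks transportation, field and laboratory costs down in more detail):

```python
import math

def srs_sample_size(p=0.85, z=1.96, margin=0.05):
    """Binomial sample size for estimating overall map accuracy p to +/-margin."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

def assessment_cost(n, travel=12.0, field=8.0, lab=3.0):
    """Total cost per design: transportation + field collection + lab analysis."""
    return n * (travel + field + lab)

n = srs_sample_size()         # ~196 samples for the default accuracy/margin
print(n, assessment_cost(n))  # trade the accuracy margin off against the budget
```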

  19. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach.

    Science.gov (United States)

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Duguleana, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems.

  20. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach

    Science.gov (United States)

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems. PMID:26167533

  1. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach

    Directory of Open Access Journals (Sweden)

    Adrian-Valentin Nedelcu

    2015-01-01

    Full Text Available Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems.

  2. A score-statistic approach for determining threshold values in QTL mapping.

    Science.gov (United States)

    Kao, Chen-Hung; Ho, Hsiang-An

    2012-06-01

    Issues in determining the threshold values for QTL mapping have so far been investigated mainly for backcross and F2 populations, which have relatively simple genome structures. The investigation of these issues in the progeny populations after F2 (advanced populations), which have relatively more complicated genomes, is generally inadequate. As these advanced populations have been well implemented in QTL mapping, it is important to address these issues for them in more detail. Due to the increasing number of meiosis cycles, the genomes of the advanced populations can be very different from the backcross and F2 genomes. Therefore, special devices that consider the specific genome structures present in the advanced populations are required to resolve these issues. By considering the differences in genome structure between populations, we formulate more general score test statistics and Gaussian processes to evaluate their threshold values. In general, we found that, given a significance level and a genome size, threshold values for QTL detection are higher for denser marker maps and for more advanced populations. Simulations were performed to validate our approach.
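
    The paper derives thresholds analytically from score statistics and Gaussian processes; as a simpler point of comparison, the sketch below estimates a genome-wide threshold empirically by permutation (the Churchill-Doerge approach), using the maximum squared marker-trait correlation as a stand-in single-QTL scan statistic. The names and the statistic choice are ours, not the authors'.

```python
import numpy as np

def genomewide_threshold(genos, pheno, n_perm=1000, alpha=0.05, seed=0):
    """Permutation estimate of the genome-wide threshold for a max statistic.

    genos : (n_individuals, n_markers) genotype codes (polymorphic markers);
    pheno : (n_individuals,) trait values.
    """
    rng = np.random.default_rng(seed)
    g = (genos - genos.mean(0)) / genos.std(0)     # standardize markers once

    def max_stat(y):
        yc = (y - y.mean()) / y.std()
        return np.max((g.T @ yc / len(y)) ** 2)    # max r^2 over all markers

    null = [max_stat(rng.permutation(pheno)) for _ in range(n_perm)]
    return np.quantile(null, 1 - alpha)            # genome-wide critical value
```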

  3. The constellation of dietary factors in adolescent acne: a semantic connectivity map approach.

    Science.gov (United States)

    Grossi, E; Cazzaniga, S; Crotti, S; Naldi, L; Di Landro, A; Ingordo, V; Cusano, F; Atzori, L; Tripodi Cutrì, F; Musumeci, M L; Pezzarossa, E; Bettoli, V; Caproni, M; Bonci, A

    2016-01-01

    Different lifestyle and dietetic factors have been linked with the onset and severity of acne. To assess the complex interconnection between dietetic variables and acne, we performed a reanalysis of data from a case-control study using a semantic connectivity map approach. 563 subjects, aged 10-24 years, involved in a case-control study of acne between March 2009 and February 2010, were considered in this study. The analysis evaluated the link between moderate-to-severe acne and anthropometric variables, family history and dietetic factors. Analyses were conducted by relying on an artificial adaptive system, the Auto Semantic Connectivity Map (AutoCM). The AutoCM map showed that moderate-severe acne was closely associated with a family history of acne in first-degree relatives, obesity (BMI ≥ 30), high consumption of milk (in particular skim milk), cheese/yogurt, sweets/cakes and chocolate, low consumption of fish, and limited intake of fruits/vegetables. Our analyses confirm the link between several dietetic items and acne. When providing care, dermatologists should also be aware of the complex interconnection between dietetic factors and acne. © 2014 European Academy of Dermatology and Venereology.

  4. Application of the random vibration approach in the seismic analysis of LMFBR structures

    International Nuclear Information System (INIS)

    Preumont, A.

    1988-01-01

    The first part discusses the general topic of the spectral analysis of linear multi-degree-of-freedom structures subjected to a stationary random field. Particular attention is given to structures with non-classical damping and hereditary characteristics. The method is implemented in the computer programme RANDOM. Next, the same concepts are applied to multi-supported structures subjected to a stationary seismic excitation. The method is implemented in the computer programme SEISME. Two related problems are dealt with in the next two chapters: (i) the relation between the input of the random vibration analysis and the traditional ground motion specification for seismic analysis (the Design Response Spectra) and (ii) the application of random vibration techniques to the direct generation of floor response spectra. Finally, the problem of extracting information from costly time-history analyses is addressed. This study has mainly been concerned with the methodology and the development of appropriate software. Some qualitative conclusions have been drawn regarding the expected benefit of the approach. They have been judged promising enough to motivate a benchmark exercise. Specifically, the random vibration approach will be compared to the current approximate methods (response spectrum) and time-history analyses (considered as representative of the true response) for a set of typical structures. The hope is that some of the flaws of the current approximate methods can be removed.

  5. Degree distributions of the visibility graphs mapped from fractional Brownian motions and multifractal random walks

    International Nuclear Information System (INIS)

    Ni Xiaohui; Jiang Zhiqiang; Zhou Weixing

    2009-01-01

    The dynamics of a complex system is usually recorded in the form of time series, which can be studied through its visibility graph from a complex network perspective. We investigate the visibility graphs extracted from fractional Brownian motions and multifractal random walks, and find that the degree distributions exhibit power-law behaviors, in which the power-law exponent α is a linear function of the Hurst index H of the time series. We also find that the degree distribution of the visibility graph is mainly determined by the temporal correlation of the original time series with minor influence from the possible multifractal nature. As an example, we study the visibility graphs constructed from three Chinese stock market indexes and unveil that the degree distributions have power-law tails, where the tail exponents of the visibility graphs and the Hurst indexes of the indexes are close to the α∼H linear relationship.
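
    A minimal sketch of the pipeline described here: build the natural visibility graph of a series (two points are linked if the chord between them clears every intermediate point) and estimate the degree-distribution tail exponent α with a crude log-log fit. This is illustrative only; the O(n²) construction, the `kmin` cutoff and the fitting method are simplifying assumptions.

```python
import numpy as np

def visibility_degrees(y):
    """Degree sequence of the natural visibility graph of series y (O(n^2))."""
    n = len(y)
    deg = np.zeros(n, dtype=int)
    for a in range(n):
        for b in range(a + 1, n):
            ts = np.arange(a + 1, b)
            chord = y[a] + (y[b] - y[a]) * (ts - a) / (b - a)
            if np.all(y[ts] < chord):    # every point in between lies below the chord
                deg[a] += 1
                deg[b] += 1
    return deg

def tail_exponent(deg, kmin=5):
    """Crude power-law exponent from a log-log fit of the degree histogram."""
    ks, counts = np.unique(deg[deg >= kmin], return_counts=True)
    slope = np.polyfit(np.log(ks), np.log(counts / counts.sum()), 1)[0]
    return -slope

# e.g. ordinary Brownian motion (H = 0.5) as the input series
walk = np.cumsum(np.random.default_rng(0).normal(size=1000))
print(tail_exponent(visibility_degrees(walk)))
```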

  6. GuiTope: an application for mapping random-sequence peptides to protein sequences.

    Science.gov (United States)

    Halperin, Rebecca F; Stafford, Phillip; Emery, Jack S; Navalkar, Krupa Arun; Johnston, Stephen Albert

    2012-01-03

    Random-sequence peptide libraries are a commonly used tool to identify novel ligands for binding antibodies, other proteins, and small molecules. It is often of interest to compare the selected peptide sequences to the natural protein binding partners to infer the exact binding site or the importance of particular residues. The ability to search a set of sequences for similarity to a set of peptides may sometimes enable the prediction of an antibody epitope or a novel binding partner. We have developed a software application designed specifically for this task. GuiTope provides a graphical user interface for aligning peptide sequences to protein sequences. All alignment parameters are accessible to the user including the ability to specify the amino acid frequency in the peptide library; these frequencies often differ significantly from those assumed by popular alignment programs. It also includes a novel feature to align di-peptide inversions, which we have found improves the accuracy of antibody epitope prediction from peptide microarray data and shows utility in analyzing phage display datasets. Finally, GuiTope can randomly select peptides from a given library to estimate a null distribution of scores and calculate statistical significance. GuiTope provides a convenient method for comparing selected peptide sequences to protein sequences, including flexible alignment parameters, novel alignment features, ability to search a database, and statistical significance of results. The software is available as an executable (for PC) at http://www.immunosignature.com/software and ongoing updates and source code will be available at sourceforge.net.
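
    GuiTope itself is the executable referenced above; the sketch below merely illustrates the core idea of scoring an ungapped peptide-to-protein alignment with library-specific amino-acid frequencies, so that matches on residues that are rare in the library carry more weight. The scoring scheme and names are our assumptions, not GuiTope's implementation.

```python
import math

def best_ungapped_hit(peptide, protein, lib_freq, match=2.0):
    """Slide the peptide along the protein; score identities by log-odds
    against the library frequencies, so rare library residues count more.

    lib_freq : dict residue -> frequency in the random peptide library.
    """
    best_score, best_off = float("-inf"), -1
    for off in range(len(protein) - len(peptide) + 1):
        window = protein[off:off + len(peptide)]
        score = sum(match - math.log(20 * lib_freq.get(p, 0.05))
                    for p, q in zip(peptide, window) if p == q)
        if score > best_score:
            best_score, best_off = score, off
    return best_score, best_off

freqs = {"A": 0.09, "G": 0.09, "W": 0.01}        # toy library composition
print(best_ungapped_hit("AWG", "MKAWGQLAAA", freqs))
```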

  7. GuiTope: an application for mapping random-sequence peptides to protein sequences

    Directory of Open Access Journals (Sweden)

    Halperin Rebecca F

    2012-01-01

    Full Text Available Abstract Background Random-sequence peptide libraries are a commonly used tool to identify novel ligands for binding antibodies, other proteins, and small molecules. It is often of interest to compare the selected peptide sequences to the natural protein binding partners to infer the exact binding site or the importance of particular residues. The ability to search a set of sequences for similarity to a set of peptides may sometimes enable the prediction of an antibody epitope or a novel binding partner. We have developed a software application designed specifically for this task. Results GuiTope provides a graphical user interface for aligning peptide sequences to protein sequences. All alignment parameters are accessible to the user including the ability to specify the amino acid frequency in the peptide library; these frequencies often differ significantly from those assumed by popular alignment programs. It also includes a novel feature to align di-peptide inversions, which we have found improves the accuracy of antibody epitope prediction from peptide microarray data and shows utility in analyzing phage display datasets. Finally, GuiTope can randomly select peptides from a given library to estimate a null distribution of scores and calculate statistical significance. Conclusions GuiTope provides a convenient method for comparing selected peptide sequences to protein sequences, including flexible alignment parameters, novel alignment features, ability to search a database, and statistical significance of results. The software is available as an executable (for PC at http://www.immunosignature.com/software and ongoing updates and source code will be available at sourceforge.net.

  8. A new approach to the statistical treatment of 2D-maps in proteomics using fuzzy logic.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Gianotti, Valentina; Righetti, Pier Giorgio

    2003-01-01

    A new approach to the statistical treatment of 2D-maps has been developed. This method is based on the use of fuzzy logic and makes it possible to take into consideration the typically low reproducibility of 2D-maps. In this approach, the signal corresponding to the presence of proteins on the 2D-maps is substituted with probability functions centred on the signal itself. The standard deviation of the two-dimensional Gaussian probability function employed to blur the signal makes it possible to assign different uncertainties to the two electrophoretic dimensions. The effects of changing the standard deviation and the digitalisation resolution are investigated.
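
    A minimal sketch of the fuzzification step, assuming the spot positions are already known: each spot is replaced by an anisotropic two-dimensional Gaussian membership function, with different standard deviations along the two electrophoretic dimensions. The function name and default sigmas are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fuzzify_map(spots, shape, sigma_pI=1.5, sigma_Mr=4.0):
    """Replace each detected spot with an anisotropic 2-D Gaussian membership.

    spots : iterable of (row, col) spot positions; sigma_Mr and sigma_pI encode
    different uncertainties along the mass (rows) and pI (columns) dimensions.
    """
    img = np.zeros(shape)
    for r, c in spots:
        img[r, c] = 1.0
    fuzzy = gaussian_filter(img, sigma=(sigma_Mr, sigma_pI))
    return fuzzy / fuzzy.max()            # memberships rescaled into [0, 1]

print(fuzzify_map([(20, 30), (40, 10)], (64, 64)).max())   # -> 1.0
```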

  9. Dissection of a Complex Disease Susceptibility Region Using a Bayesian Stochastic Search Approach to Fine Mapping.

    Directory of Open Access Journals (Sweden)

    Chris Wallace

    2015-06-01

    Full Text Available Identification of candidate causal variants in regions associated with risk of common diseases is complicated by linkage disequilibrium (LD) and multiple association signals. Nonetheless, accurate maps of these variants are needed, both to fully exploit detailed cell-specific chromatin annotation data to highlight disease causal mechanisms and cells, and for design of the functional studies that will ultimately be required to confirm causal mechanisms. We adapted a Bayesian evolutionary stochastic search algorithm to the fine mapping problem, and demonstrated its improved performance over conventional stepwise and regularised regression through simulation studies. We then applied it to fine map the established multiple sclerosis (MS) and type 1 diabetes (T1D) associations in the IL-2RA (CD25) gene region. For T1D, both stepwise and stochastic search approaches identified four T1D association signals, with the major effect tagged by the single nucleotide polymorphism rs12722496. In contrast, for MS, the stochastic search found two distinct competing models: a single candidate causal variant, tagged by rs2104286 and reported previously using stepwise analysis; and a more complex model with two association signals, one of which was tagged by the major T1D-associated rs12722496 and the other by rs56382813. There is low to moderate LD between rs2104286 and both rs12722496 and rs56382813 (r² ≃ 0.3), and our two-SNP model could not be recovered through a forward stepwise search after conditioning on rs2104286. Both signals in the two-variant model for MS affect CD25 expression on distinct subpopulations of CD4+ T cells, which are key cells in the autoimmune process. The results support a shared causal variant for T1D and MS. Our study illustrates the benefit of using a purposely designed model search strategy for fine mapping and the advantage of combining disease and protein expression data.

  10. Determining Coastal Hazards Risk Perception to Enhance Local Mitigation Planning through a Participatory Mapping Approach

    Science.gov (United States)

    Bethel, M.; Braud, D.; Lambeth, T.; Biber, P.; Wu, W.

    2017-12-01

    Coastal community leaders, government officials, and natural resource managers must be able to accurately assess and predict a given coastal landscape's sustainability and/or vulnerability as coastal habitat continues to undergo rapid and dramatic changes associated with natural and anthropogenic activities such as accelerated relative sea level rise (SLR). To help address this information need, a multi-disciplinary project team conducted Sea Grant sponsored research in Louisiana and Mississippi with traditional ecosystem users and natural resource managers to determine a method for producing localized vulnerability and sustainability maps for projected SLR and storm surge impacts, and determine how and whether the results of such an approach can provide more useful information to enhance hazard mitigation planning. The goals of the project are to develop and refine SLR visualization tools for local implementation in areas experiencing subsidence and erosion, and discover the different ways stakeholder groups evaluate risk and plan mitigation strategies associated with projected SLR and storm surge. Results from physical information derived from data and modeling of subsidence, erosion, engineered restoration and coastal protection features, historical land loss, and future land projections under SLR are integrated with complimentary traditional ecological knowledge (TEK) offered by the collaborating local ecosystem users for these assessments. The data analysis involves interviewing stakeholders, coding the interviews for themes, and then converting the themes into vulnerability and sustainability factors. Each factor is weighted according to emphasis by the TEK experts and number of experts who mention it to determine which factors are the highest priority. The priority factors are then mapped with emphasis on the perception of contributing to local community vulnerability or sustainability to SLR and storm surge. The maps are used by the collaborators to benefit

  11. A national scale flood hazard mapping methodology: The case of Greece - Protection and adaptation policy approaches.

    Science.gov (United States)

    Kourgialas, Nektarios N; Karatzas, George P

    2017-12-01

    The present work introduces a national-scale flood hazard assessment methodology, using multi-criteria analysis and artificial neural network (ANN) techniques in a GIS environment. The proposed methodology was applied in Greece, where flash floods are a relatively frequent phenomenon that has become more intense over the last decades, causing significant damage in rural and urban sectors. In order for the areas most prone to flooding to be identified, seven factor maps (that are directly related to flood generation) were combined in a GIS environment. These factor maps are: a) the Flow accumulation (F), b) the Land use (L), c) the Altitude (A), d) the Slope (S), e) the soil Erodibility (E), f) the Rainfall intensity (R), and g) the available water Capacity (C), giving the proposed method its name, "FLASERC". The flood hazard for each one of these factors is classified into five categories: very low, low, moderate, high, and very high. The above factors are combined and processed using the appropriate ANN algorithm tool. For the ANN training process, the spatial distribution of historical flooded points in Greece within the five flood hazard categories of the aforementioned seven factor maps was used. In this way, the overall flood hazard map for Greece was determined. The final results are verified using additional historical flood events that have occurred in Greece over the last 100 years. In addition, an overview of flood protection measures and adaptation policy approaches is proposed for agricultural and urban areas located in very high flood hazard areas. Copyright © 2017 Elsevier B.V. All rights reserved.
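
    A compact sketch of the FLASERC-style combination step, assuming the seven factor maps are available as co-registered rasters coded into five classes and that historical flood points are given as a boolean mask. It uses scikit-learn's MLPClassifier as a generic ANN; the network architecture and the hazard-class breakpoints are our assumptions, not the paper's.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def flood_hazard_map(factors, flooded_mask):
    """Combine seven factor rasters into a 5-class hazard map via a small ANN.

    factors : dict with keys "F","L","A","S","E","R","C", each a 2-D raster
    coded into five classes (1..5); flooded_mask : boolean raster marking
    historical flood points (must contain both flooded and dry cells).
    """
    names = ["F", "L", "A", "S", "E", "R", "C"]
    X = np.stack([factors[n].ravel() for n in names], axis=1).astype(float)
    y = flooded_mask.ravel().astype(int)
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    clf.fit(X, y)
    hazard = clf.predict_proba(X)[:, 1].reshape(flooded_mask.shape)
    return np.digitize(hazard, [0.2, 0.4, 0.6, 0.8]) + 1  # 1 = very low ... 5 = very high
```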

  12. Random-walk approach to the d -dimensional disordered Lorentz gas

    Science.gov (United States)

    Adib, Artur B.

    2008-02-01

    A correlated random walk approach to diffusion is applied to the disordered nonoverlapping Lorentz gas. By invoking the Lu-Torquato theory for chord-length distributions in random media [J. Chem. Phys. 98, 6472 (1993)], an analytic expression for the diffusion constant in arbitrary number of dimensions d is obtained. The result corresponds to an Enskog-like correction to the Boltzmann prediction, being exact in the dilute limit, and better or nearly exact in comparison to renormalized kinetic theory predictions for all allowed densities in d=2,3 . Extensive numerical simulations were also performed to elucidate the role of the approximations involved.

  13. Instanton Approach to the Langevin Motion of a Particle in a Random Potential

    International Nuclear Information System (INIS)

    Lopatin, A. V.; Vinokur, V. M.

    2001-01-01

    We develop an instanton approach to the nonequilibrium dynamics in one-dimensional random environments. The long time behavior is controlled by rare fluctuations of the disorder potential and, accordingly, by the tail of the distribution function for the time a particle needs to propagate along the system (the delay time). The proposed method allows us to find the tail of the delay time distribution function and delay time moments, providing thus an exact description of the long time dynamics. We analyze arbitrary environments covering different types of glassy dynamics: dynamics in a short-range random field, creep, and Sinai's motion

  14. Integrating Volcanic Hazard Data in a Systematic Approach to Develop Volcanic Hazard Maps in the Lesser Antilles

    Directory of Open Access Journals (Sweden)

    Jan M. Lindsay

    2018-04-01

    Full Text Available We report on the process of generating the first suite of integrated volcanic hazard zonation maps for the islands of Dominica, Grenada (including Kick ‘em Jenny and Ronde/Caille), Nevis, Saba, St. Eustatius, St. Kitts, Saint Lucia, and St Vincent in the Lesser Antilles. We developed a systematic approach that accommodated the range in prior knowledge of the volcanoes in the region. A first-order hazard assessment for each island was used to develop one or more scenario(s) of likely future activity, for which scenario-based hazard maps were generated. For the most-likely scenario on each island we also produced a poster-sized integrated volcanic hazard zonation map, which combined the individual hazardous phenomena depicted in the scenario-based hazard maps into integrated hazard zones. We document the philosophy behind the generation of this suite of maps, and the method by which hazard information was combined to create integrated hazard zonation maps, and illustrate our approach through a case study of St. Vincent. We also outline some of the challenges we faced using this approach, and the lessons we have learned by observing how stakeholders have interacted with the maps over the past ~10 years. Based on our experience, we recommend that future map makers involve stakeholders in the entire map generation process, especially when making design choices such as type of base map, use of colour and gradational boundaries, and indeed what to depict on the map. We also recommend careful consideration of how to evaluate and depict offshore hazard of island volcanoes, and recommend computer-assisted modelling of all phenomena to generate more realistic hazard footprints. Finally, although our systematic approach to integrating individual hazard data into zones generally worked well, we suggest that a better approach might be to treat the integration of hazards on a case-by-case basis to ensure the final product meets map users' needs. We hope that

  15. A novel linear programming approach to fluence map optimization for intensity modulated radiation therapy treatment planning

    International Nuclear Information System (INIS)

    Romeijn, H Edwin; Ahuja, Ravindra K; Dempsey, James F; Kumar, Arvind; Li, Jonathan G

    2003-01-01

    We present a novel linear programming (LP) based approach for efficiently solving the intensity modulated radiation therapy (IMRT) fluence-map optimization (FMO) problem to global optimality. Our model overcomes the apparent limitations of a linear-programming approach by approximating any convex objective function by a piecewise linear convex function. This approach allows us to retain the flexibility offered by general convex objective functions, while allowing us to formulate the FMO problem as a LP problem. In addition, a novel type of partial-volume constraint that bounds the tail averages of the differential dose-volume histograms of structures is imposed while retaining linearity as an alternative approach to improve dose homogeneity in the target volumes, and to attempt to spare as many critical structures as possible. The goal of this work is to develop a very rapid global optimization approach that finds high quality dose distributions. Implementation of this model has demonstrated excellent results. We found globally optimal solutions for eight 7-beam head-and-neck cases in less than 3 min of computational time on a single processor personal computer without the use of partial-volume constraints. Adding such constraints increased the running times by a factor of 2-3, but improved the sparing of critical structures. All cases demonstrated excellent target coverage (>95%), target homogeneity (<10% overdosing and <7% underdosing) and organ sparing using at least one of the two models
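
    A toy version of the piecewise-linear-convex trick, under heavy simplification: the convex voxel penalty f(d) is written as the maximum of a few linear segments, which turns the fluence-map problem into an LP solvable with scipy's linprog. The dose-deposition matrix A and the segment coefficients are assumed inputs; the real model adds structure-specific objectives and the partial-volume constraints described above.

```python
import numpy as np
from scipy.optimize import linprog

def fmo_lp(A, segments):
    """Toy fluence-map LP. Dose d = A @ x; the convex voxel penalty is
    f(d) = max_k (a_k * d + b_k), so minimizing sum_j t_j subject to
    t_j >= a_k * d_j + b_k (all k) and x >= 0 is a linear programme.

    A : (n_voxels, n_beamlets) dose matrix; segments : list of (a_k, b_k),
    e.g. [(-1, 60), (0, 0), (1, -65)] penalizes dose outside a 60-65 window.
    """
    n_vox, n_beam = A.shape
    c = np.concatenate([np.zeros(n_beam), np.ones(n_vox)])    # minimize sum(t)
    rows = [np.hstack([a_k * A, -np.eye(n_vox)]) for a_k, b_k in segments]
    rhs = [-b_k * np.ones(n_vox) for a_k, b_k in segments]    # a_k*Ax - t <= -b_k
    res = linprog(c, A_ub=np.vstack(rows), b_ub=np.concatenate(rhs),
                  bounds=[(0, None)] * (n_beam + n_vox))
    return res.x[:n_beam]                                     # beamlet weights
```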

  16. A new multicriteria risk mapping approach based on a multiattribute frontier concept.

    Science.gov (United States)

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Downing, Marla; Sapio, Frank; Siltanen, Marty

    2013-09-01

    Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that typically require prior knowledge of the risk components' importance. Such information is often nonexistent for many invasive pests. This study proposes a new approach for building integrated risk maps using the principle of a multiattribute efficient frontier and analyzing the partial order of elements of a risk map as distributed in multidimensional criteria space. The integrated risks are estimated as subsequent multiattribute frontiers in dimensions of individual risk criteria. We demonstrate the approach with the example of Agrilus biguttatus Fabricius, a high-risk pest that may threaten North American oak forests in the near future. Drawing on U.S. and Canadian data, we compare the performance of the multiattribute ranking against a multicriteria linear weighted averaging technique in the presence of uncertainties, using the concept of robustness from info-gap decision theory. The results show major geographic hotspots where the consideration of tradeoffs between multiple risk components changes integrated risk rankings. Both methods delineate similar geographical regions of high and low risks. Overall, aggregation based on a delineation of multiattribute efficient frontiers can be a useful tool to prioritize risks for anticipated invasive pests, which usually have an extremely poor prior knowledge base. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.
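
    The multiattribute-frontier ranking can be sketched as iterative Pareto "peeling": rank 1 is the set of non-dominated map cells, which are then removed, and the process repeats on the remainder. Below is a minimal implementation under the assumption that larger criterion values mean higher risk; the O(n²) dominance test is illustrative, not the authors' algorithm.

```python
import numpy as np

def frontier_ranks(risk):
    """Rank map cells by successive multiattribute (Pareto) frontiers.

    risk : (n_cells, n_criteria) array, larger = riskier. Rank 1 is the
    outermost efficient frontier; peeling repeats on what remains.
    """
    n = len(risk)
    ranks = np.zeros(n, dtype=int)
    remaining = np.arange(n)
    r = 1
    while remaining.size:
        sub = risk[remaining]
        # cell i is dominated if some cell is >= on all criteria and > on one
        dominated = np.array([
            np.any(np.all(sub >= sub[i], axis=1) & np.any(sub > sub[i], axis=1))
            for i in range(len(sub))
        ])
        ranks[remaining[~dominated]] = r
        remaining = remaining[dominated]
        r += 1
    return ranks
```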

  17. Geospatial approach in mapping soil erodibility using CartoDEM – A ...

    Indian Academy of Sciences (India)

    unscientific management practices followed in the hilly regions … country. In the absence of a large-scale or detailed map, researchers use the small-scale soil map prepared … tural development. … mapping: An introductory perspective; Dev.

  18. In Silico Design of Human IMPDH Inhibitors Using Pharmacophore Mapping and Molecular Docking Approaches

    Directory of Open Access Journals (Sweden)

    Rui-Juan Li

    2015-01-01

    Full Text Available Inosine 5′-monophosphate dehydrogenase (IMPDH) is one of the crucial enzymes in the de novo biosynthesis of guanosine nucleotides. It has served as an attractive target in immunosuppressive, anticancer, antiviral, and antiparasitic therapeutic strategies. In this study, pharmacophore mapping and molecular docking approaches were employed to discover novel Homo sapiens IMPDH (hIMPDH) inhibitors. The Güner-Henry (GH) scoring method was used to evaluate the quality of the generated pharmacophore hypotheses. One of the generated pharmacophore hypotheses was found to possess a GH score of 0.67. Ten potential compounds were selected from the ZINC database using a pharmacophore mapping approach and docked into the IMPDH active site. We find two hits (i.e., ZINC02090792 and ZINC00048033) that match the optimal pharmacophore features used in this investigation well, and it is found that they form interactions with key residues of IMPDH. We propose these two hits as lead compounds for the development of novel hIMPDH inhibitors.

  19. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    Science.gov (United States)

    Dhindsa, Harkirat S.; Makarimi-Kasim; Roger Anderson, O.

    2011-04-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent to which a constructivist learning environment (CLE) was created in their classes. The sample of the study consisted of six classes (140 Form 3 students aged 13-15 years) selected from a typical coeducational school in Brunei. Three classes (40 boys and 30 girls) were taught using the TTA while three other classes (41 boys and 29 girls) used the CMA, enriched with PowerPoint presentations. After the interventions (lessons on magnetism), the students in both groups were asked to describe in writing their understanding of magnetism accrued from the lessons. Their written descriptions were analyzed using flow map analyses to assess their content knowledge and its organisation in memory as evidence of cognitive structure. The extent of the CLE was measured using a published CLE survey. The results showed that the cognitive structures of the CMA students were more extensive, thematically organised and richer in interconnectedness of thoughts than those of the TTA students. Moreover, CMA students also perceived their classroom learning environment to be more constructivist than their counterparts. It is, therefore, recommended that teachers consider using the CMA teaching technique to help students enrich their understanding, especially for more complex or abstract scientific content.

  20. Hyperspectral Data for Mangrove Species Mapping: A Comparison of Pixel-Based and Object-Based Approach

    Directory of Open Access Journals (Sweden)

    Muhammad Kamal

    2011-10-01

    Full Text Available Visual image interpretation and digital image classification have been used to map and monitor mangrove extent and composition for decades. The presence of a high-spatial-resolution hyperspectral sensor can potentially improve our ability to differentiate mangrove species. However, little research has explored the use of pixel-based and object-based approaches on high-spatial hyperspectral datasets for this purpose. This study assessed the ability of CASI-2 data for mangrove species mapping using pixel-based and object-based approaches at the mouth of the Brisbane River area, southeast Queensland, Australia. Three mapping techniques were used in this study: spectral angle mapper (SAM) and linear spectral unmixing (LSU) for the pixel-based approaches, and multi-scale segmentation for the object-based image analysis (OBIA). The endmembers for the pixel-based approach were collected based on an existing vegetation community map. Nine targeted classes were mapped in the study area with each approach, including three mangrove species: Avicennia marina, Rhizophora stylosa, and Ceriops australis. The mapping results showed that SAM produced accurate class polygons with only a few unclassified pixels (overall accuracy 69%, Kappa 0.57), the LSU resulted in a patchy polygon pattern with many unclassified pixels (overall accuracy 56%, Kappa 0.41), and the object-based mapping produced the most accurate results (overall accuracy 76%, Kappa 0.67). Our results demonstrated that the object-based approach, which combined a rule-based and nearest-neighbor classification method, was the best classifier for mapping mangrove species and their adjacent environments.

  1. ANALYSIS OF FUZZY QUEUES: PARAMETRIC PROGRAMMING APPROACH BASED ON RANDOMNESS - FUZZINESS CONSISTENCY PRINCIPLE

    Directory of Open Access Journals (Sweden)

    Dhruba Das

    2015-04-01

    Full Text Available In this article, based on Zadeh's extension principle, we apply the parametric programming approach to construct the membership functions of the performance measures when the interarrival time and the service time are fuzzy numbers, following Baruah's Randomness-Fuzziness Consistency Principle. The Randomness-Fuzziness Consistency Principle leads to defining a normal law of fuzziness using two different laws of randomness. Two fuzzy queues, FM/M/1 and M/FM/1, are studied, and the membership functions of their system characteristics are constructed on the basis of the aforesaid principle. The former represents a queue with fuzzy exponential arrivals and exponential service rate, while the latter represents a queue with exponential arrival rate and fuzzy exponential service rate.
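
    A minimal sketch of the α-cut/parametric-programming mechanics for an FM/M/1-type queue, assuming a triangular fuzzy arrival rate and a crisp service rate: because the crisp waiting-time formula Wq = λ/(μ(μ−λ)) is monotone increasing in λ, the bounds of each α-cut are attained at the interval endpoints. The names and the triangular shape are our assumptions.

```python
import numpy as np

def fuzzy_mm1_wait(lam_tfn, mu, alphas=np.linspace(0.0, 1.0, 11)):
    """alpha-cuts of the expected queue wait Wq = lam/(mu*(mu - lam)) for a
    triangular fuzzy arrival rate lam_tfn = (l, m, u) and crisp mu.

    Requires u < mu so the queue is stable over the whole support.
    """
    l, m, u = lam_tfn
    wq = lambda lam: lam / (mu * (mu - lam))
    # lower/upper bound of each cut sits at the lam-interval endpoints
    return [(a, wq(l + a * (m - l)), wq(u - a * (u - m))) for a in alphas]

for a, lo, hi in fuzzy_mm1_wait((2.0, 3.0, 4.0), mu=5.0):
    print(f"alpha={a:.1f}  Wq in [{lo:.3f}, {hi:.3f}]")
```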

  2. Mapas de taxas epidemiológicas: uma abordagem Bayesiana Maps of epidemiological rates: a Bayesian approach

    Directory of Open Access Journals (Sweden)

    Renato Martins Assunção

    1998-10-01

    Full Text Available This article presents statistical methods recently developed for the analysis of maps of disease rates when the geographic units have small populations at risk. They adopt the Bayesian approach and use computationally intensive methods for estimating the risk in each area. The objective of the methods is to separate the variability of the rates due to differences in underlying risk between regions from that due to pure random fluctuation. The risk estimates have a smaller total mean squared error than the usual estimates. We apply these new methods to estimate the risk of infant mortality in the municipalities of the State of Minas Gerais in 1994.

  3. A Pragmatic Approach to Guide Implementation Evaluation Research: Strategy Mapping for Complex Interventions

    Directory of Open Access Journals (Sweden)

    Alexis K. Huynh

    2018-05-01

    for producing valid and reliable process evaluation data, mapping implementation strategies has informed development of a pragmatic blueprint for implementation and longitudinal analyses and evaluation activities. Discussion: We update recent recommendations on specification of implementation strategies by considering the implications for multi-strategy frameworks and propose an approach for mapping the use of implementation strategies within complex, multi-level interventions, in support of rigorous evaluation. Developing pragmatic tools to aid in operationalizing the conduct of implementation and evaluation activities is essential to enacting sound implementation research.

  4. Integration of value stream map and strategic layout planning into DMAIC approach to improve carpeting process

    Directory of Open Access Journals (Sweden)

    Ayman Nagi

    2017-04-01

    Full Text Available Purpose: This paper presents an implementation of the Six Sigma DMAIC approach applying lean tools and facilities layout techniques to reduce the occurrence of different types of nonconformities in the carpeting process. Such a carpeting process can be found in several industries, such as construction, aviation, and automotive. Design/methodology/approach: The improvement process was built through a sequential implementation of appropriate interconnected tools at each phase of the DMAIC approach. Utilized tools included: Pareto analysis, control charts, Ishikawa chart, 5-whys, failure mode and effect analysis, process capability ratio, value stream mapping, and strategic layout planning. Findings: The carpeting process capability, quality of the product, customer satisfaction, and cost of poor quality were significantly improved. Explicitly, the sigma level was improved from 2.297 to 2.886 and the defects per million opportunities (DPMO) was reduced from 21615 to 3905. Originality/value: This paper has demonstrated the capability of the Six Sigma DMAIC approach to analyze, investigate, and remove the root causes of the carpeting (preparation-installation) process nonconformities.

  5. Integration of value stream map and strategic layout planning into DMAIC approach to improve carpeting process

    Energy Technology Data Exchange (ETDEWEB)

    Nagi, A.; Altarazi, S.

    2017-07-01

    This paper presents an implementation of the Six Sigma DMAIC approach applying lean tools and facilities layout techniques to reduce the occurrence of different types of nonconformities in the carpeting process. Such a carpeting process can be found in several industries, such as construction, aviation, and automotive. Design/methodology/approach: The improvement process was built through a sequential implementation of appropriate interconnected tools at each phase of the DMAIC approach. Utilized tools included: Pareto analysis, control charts, Ishikawa chart, 5-whys, failure mode and effect analysis, process capability ratio, value stream mapping, and strategic layout planning. Findings: The carpeting process capability, quality of the product, customer satisfaction, and cost of poor quality were significantly improved. Explicitly, the sigma level was improved from 2.297 to 2.886 and the defects per million opportunities (DPMO) was reduced from 21615 to 3905. Originality/value: This paper has demonstrated the capability of the Six Sigma DMAIC approach to analyze, investigate, and remove the root causes of the carpeting (preparation-installation) process nonconformities.

  6. A comparison of two different approaches for mapping potential ozone damage to vegetation. A model study

    International Nuclear Information System (INIS)

    Simpson, D.; Ashmore, M.R.; Emberson, L.; Tuovinen, J.-P.

    2007-01-01

    Two very different types of approaches are currently in use for indicating the risk of ozone damage to vegetation in Europe. One approach is the so-called AOTX (accumulated exposure over threshold of X ppb) index, which is based upon ozone concentrations only. The second type of approach entails an estimate of the amount of ozone entering via the stomates of vegetation, the AFstY approach (accumulated stomatal flux over threshold of Y nmol m⁻² s⁻¹). The EMEP chemical transport model is used to map these different indicators of ozone damage across Europe for two illustrative vegetation types, wheat and beech forests. The results show that exceedances of critical levels for either type of indicator are widespread, but that the indicators give very different spatial patterns across Europe. Model simulations for year 2020 scenarios suggest reductions in the risk of vegetation damage whichever indicator is used, but suggest that AOT40 is much more sensitive to emission control than AFstY values. - Model calculations of AOT40 and AFstY show very different spatial variations in the risks of ozone damage to vegetation

  7. Integration of value stream map and strategic layout planning into DMAIC approach to improve carpeting process

    International Nuclear Information System (INIS)

    Nagi, A.; Altarazi, S.

    2017-01-01

    This paper presents an implementation of the Six Sigma DMAIC approach applying lean tools and facilities layout techniques to reduce the occurrence of different types of nonconformities in the carpeting process. Such a carpeting process can be found in several industries, such as construction, aviation, and automotive. Design/methodology/approach: The improvement process was built through a sequential implementation of appropriate interconnected tools at each phase of the DMAIC approach. Utilized tools included: Pareto analysis, control charts, Ishikawa chart, 5-whys, failure mode and effect analysis, process capability ratio, value stream mapping, and strategic layout planning. Findings: The carpeting process capability, quality of the product, customer satisfaction, and cost of poor quality were significantly improved. Explicitly, the sigma level was improved from 2.297 to 2.886 and the defects per million opportunities (DPMO) was reduced from 21615 to 3905. Originality/value: This paper has demonstrated the capability of the Six Sigma DMAIC approach to analyze, investigate, and remove the root causes of the carpeting (preparation-installation) process nonconformities.

  8. Random Process Theory Approach to Geometric Heterogeneous Surfaces: Effective Fluid-Solid Interaction

    Science.gov (United States)

    Khlyupin, Aleksey; Aslyamov, Timur

    2017-06-01

    Realistic fluid-solid interaction potentials are essential in the description of confined fluids, especially in the case of geometrically heterogeneous surfaces. A correlated random field is considered as a model of a random surface with high geometric roughness. We provide a general theory of the effective coarse-grained fluid-solid potential by proper averaging of the free energy of fluid molecules which interact with the solid medium. This procedure is largely based on the theory of random processes. We apply the first-passage-time probability problem and assume local Markov properties of the random surfaces. A general expression for the effective fluid-solid potential is obtained. In the case of small surface irregularities, an analytical approximation for the effective potential is proposed. Both amorphous materials with large surface roughness and crystalline solids with several types of fcc lattices are considered. It is shown that the wider the lattice spacing in terms of the molecular diameter of the fluid, the more the obtained potentials differ from the classical ones. A comparison with published Monte-Carlo simulations is discussed. The work provides a promising approach to explore how random geometric heterogeneity affects the thermodynamic properties of fluids.

  9. Application of the random vibration approach in the seismic analysis of LMFBR structures - Benchmark calculations

    International Nuclear Information System (INIS)

    Preumont, A.; Shilab, S.; Cornaggia, L.; Reale, M.; Labbe, P.; Noe, H.

    1992-01-01

    This benchmark exercise is the continuation of the state-of-the-art review (EUR 11369 EN), which concluded that the random vibration approach could be an effective tool in the seismic analysis of nuclear power plants, with potential advantages over time-history and response spectrum techniques. As compared to the latter, the random vibration method provides an accurate treatment of multisupport excitations, non-classical damping as well as the combination of high-frequency modal components. With respect to the former, the random vibration method offers direct information on statistical variability (probability distribution) and cheaper computations. The disadvantages of the random vibration method are that it is based on stationary results and requires a power spectral density input instead of a response spectrum. A benchmark exercise was made to compare the three methods, with respect to the various aspects mentioned above, on one or several simple structures. The following aspects have been covered with the simplest possible models: (i) statistical variability, (ii) multisupport excitation, (iii) non-classical damping. The random vibration method is therefore concluded to be a reliable method of analysis. Its use is recommended, particularly for preliminary design, owing to its computational advantage over multiple time-history analyses.

  10. A MapReduce approach to diminish imbalance parameters for big deoxyribonucleic acid dataset.

    Science.gov (United States)

    Kamal, Sarwar; Ripon, Shamim Hasnat; Dey, Nilanjan; Ashour, Amira S; Santhi, V

    2016-07-01

    In the age of the information superhighway, big data play a significant role in information processing, extraction, retrieval and management. In computational biology, the continuous challenge is to manage the biological data. Data mining techniques are sometimes inadequate for new space and time requirements. Thus, it is critical to process massive amounts of data to retrieve knowledge. The existing software and automated tools to handle big data sets are not sufficient. As a result, an expandable mining technique that enfolds the large storage and processing capability of distributed or parallel processing platforms is essential. In this analysis, a contemporary distributed clustering methodology for imbalanced data reduction using a k-nearest neighbor (K-NN) classification approach has been introduced. The pivotal objective of this work is to represent real training data sets with a reduced number of elements or instances. These reduced data sets will ensure faster data classification and standard storage management with less sensitivity. However, general data reduction methods cannot manage very big data sets. To minimize these difficulties, a MapReduce-oriented framework is designed using various clusters of automated contents, comprising multiple algorithmic approaches. To test the proposed approach, a real DNA (deoxyribonucleic acid) dataset that consists of 90 million pairs has been used. The proposed model reduces the imbalanced data sets derived from large-scale data sets without loss of accuracy. The obtained results show that the MapReduce-based K-NN classifier provided accurate results for big DNA data. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
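
    A minimal map/reduce-style sketch of nearest-neighbour-based data reduction, assuming the classic condensed-nearest-neighbour rule as the per-partition reducer and Python's multiprocessing pool as a stand-in for a Hadoop cluster; the actual pipeline in the paper is considerably richer.

```python
import numpy as np
from multiprocessing import Pool

def condense(chunk):
    """Map step: condensed nearest neighbour; keep a point only when the
    current prototype set misclassifies it under 1-NN, shrinking the data."""
    X, y = chunk
    keep = [0]
    for i in range(1, len(X)):
        d = np.linalg.norm(X[keep] - X[i], axis=1)
        if y[keep[int(np.argmin(d))]] != y[i]:
            keep.append(i)
    return X[keep], y[keep]

def mapreduce_reduce_dataset(X, y, n_parts=8):
    """Partition the data, condense each partition in parallel (map), then
    concatenate the prototypes (reduce). Call this under
    `if __name__ == "__main__":` so worker processes can start cleanly."""
    chunks = [(X[i::n_parts], y[i::n_parts]) for i in range(n_parts)]
    with Pool(n_parts) as pool:
        parts = pool.map(condense, chunks)
    return np.vstack([p[0] for p in parts]), np.concatenate([p[1] for p in parts])
```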

  11. Resistance of a 1D random chain: Hamiltonian version of the transfer matrix approach

    Science.gov (United States)

    Dossetti-Romero, V.; Izrailev, F. M.; Krokhin, A. A.

    2004-01-01

    We study some mesoscopic properties of electron transport by employing one-dimensional chains and Anderson tight-binding model. Principal attention is paid to the resistance of finite-length chains with disordered white-noise potential. We develop a new version of the transfer matrix approach based on the equivalency of a discrete Schrödinger equation and a two-dimensional Hamiltonian map describing a parametric kicked oscillator. In the two limiting cases of ballistic and localized regime we demonstrate how analytical results for the mean resistance and its second moment can be derived directly from the averaging over classical trajectories of the Hamiltonian map. We also discuss the implication of the single parameter scaling hypothesis to the resistance.
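
    For concreteness, here is a standard transfer-matrix sketch for the 1-D Anderson model with white-noise disorder: the product of 2x2 transfer matrices yields the Lyapunov exponent (the inverse localization length), and in the localized regime the logarithm of the resistance grows roughly as 2γL. This is the textbook transfer-matrix formulation, not the Hamiltonian-map variant developed in the paper.

```python
import numpy as np

def lyapunov_anderson(L=100_000, W=1.0, E=0.0, seed=0):
    """Lyapunov exponent gamma for the 1-D Anderson model
    psi_{n+1} + psi_{n-1} + eps_n * psi_n = E * psi_n
    from the product of 2x2 transfer matrices with uniform site energies."""
    rng = np.random.default_rng(seed)
    v = np.array([1.0, 0.0])
    log_growth = 0.0
    for _ in range(L):
        eps = rng.uniform(-W / 2, W / 2)            # white-noise site potential
        v = np.array([(E - eps) * v[0] - v[1], v[0]])  # apply transfer matrix
        norm = np.hypot(v[0], v[1])
        log_growth += np.log(norm)
        v /= norm                                   # renormalize to avoid overflow
    return log_growth / L   # localization length xi = 1/gamma; <ln R> ~ 2*gamma*L

print(lyapunov_anderson())  # small positive gamma at E = 0 for weak disorder
```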

  12. Resistance of a 1D random chain: Hamiltonian version of the transfer matrix approach

    International Nuclear Information System (INIS)

    Dossetti-Romero, V.; Izrailev, F.M.; Krokhin, A.A.

    2004-01-01

    We study some mesoscopic properties of electron transport by employing one-dimensional chains and Anderson tight-binding model. Principal attention is paid to the resistance of finite-length chains with disordered white-noise potential. We develop a new version of the transfer matrix approach based on the equivalency of a discrete Schrödinger equation and a two-dimensional Hamiltonian map describing a parametric kicked oscillator. In the two limiting cases of ballistic and localized regime we demonstrate how analytical results for the mean resistance and its second moment can be derived directly from the averaging over classical trajectories of the Hamiltonian map. We also discuss the implication of the single parameter scaling hypothesis to the resistance.

  13. Development of Long Live Love+, a school-based online sexual health programme for young adults. An intervention mapping approach

    NARCIS (Netherlands)

    Mevissen, F.E.F.; Empelen, P. van; Watzeels, A.; Duin, G. van; Meijer, S.; Lieshout, S. van; Kok, G.

    2017-01-01

    This paper describes the development of a Dutch online programme called Long Live Love+ focusing on positive, coercion-free relationships, contraception use, and the prevention of STIs, using the Intervention Mapping (IM) approach. All six steps of the approach were followed. Step 1 confirmed the

  14. Development of "Long Live Love+," a School-Based Online Sexual Health Programme for Young Adults. An Intervention Mapping Approach

    Science.gov (United States)

    Mevissen, Fraukje E. F.; van Empelen, Pepijn; Watzeels, Anita; van Duin, Gee; Meijer, Suzanne; van Lieshout, Sanne; Kok, Gerjo

    2018-01-01

    This paper describes the development of a Dutch online programme called "Long Live Love+" focusing on positive, coercion-free relationships, contraception use, and the prevention of STIs, using the Intervention Mapping (IM) approach. All six steps of the approach were followed. Step 1 confirmed the need for a sexual health programme…

  15. An effective approach to attenuate random noise based on compressive sensing and curvelet transform

    International Nuclear Information System (INIS)

    Liu, Wei; Cao, Siyuan; Zu, Shaohuan; Chen, Yangkang

    2016-01-01

    Random noise attenuation is an important step in seismic data processing. In this paper, we propose a novel denoising approach based on compressive sensing and the curvelet transform. We formulate the random noise attenuation problem as an L1-norm regularized optimization problem. We propose to use the curvelet transform as the sparse transform in the optimization problem to regularize the sparse coefficients in order to separate signal and noise, and to use the gradient projection for sparse reconstruction (GPSR) algorithm to solve the formulated optimization problem with an easy implementation and a fast convergence. We tested the performance of our proposed approach on both synthetic and field seismic data. Numerical results show that the proposed approach can effectively suppress the distortion near the edge of seismic events during the noise attenuation process and has high computational efficiency compared with the traditional curvelet thresholding and iterative soft thresholding based denoising methods. In addition, compared with f-x deconvolution, the proposed denoising method is capable of eliminating the random noise more effectively while preserving more useful signals. (paper)
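
    GPSR itself is not reproduced here; as a simpler stand-in that solves the same L1-regularized problem, the sketch below uses iterative soft thresholding (ISTA) with an orthonormal DCT in place of the curvelet transform. The data and parameters are hypothetical.

        import numpy as np
        from scipy.fft import dct, idct

        def soft(x, t):
            # Soft thresholding: the proximal operator of the L1 norm.
            return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

        def ista_denoise(y, lam, n_iter=50):
            # Minimize 0.5*||y - idct(c)||^2 + lam*||c||_1 by ISTA. For an
            # orthonormal transform one thresholding step suffices, but the loop
            # form carries over to redundant frames such as curvelets.
            c = dct(y, norm='ortho')
            for _ in range(n_iter):
                r = y - idct(c, norm='ortho')                 # data-domain residual
                c = soft(c + dct(r, norm='ortho'), lam)       # gradient step + shrinkage
            return idct(c, norm='ortho')

        t = np.linspace(0.0, 1.0, 512)
        clean = np.sin(40 * t) * np.exp(-4 * t)               # toy "seismic" trace
        noisy = clean + 0.3 * np.random.default_rng(1).normal(size=t.size)
        denoised = ista_denoise(noisy, lam=0.1)
        print(np.std(noisy - clean), np.std(denoised - clean))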

  16. Random spectrum loading of dental implants: An alternative approach to functional performance assessment.

    Science.gov (United States)

    Shemtov-Yona, K; Rittel, D

    2016-09-01

    The fatigue performance of dental implants is usually assessed on the basis of cyclic S/N curves. This neither provides information on the anticipated service performance of the implant, nor does it allow for detailed comparisons between implants unless a thorough statistical analysis is performed, of the kind not currently required by certification standards. The notion of an endurance limit is deemed to be of limited applicability, given the unavoidable stress concentrations and random load excursions which characterize dental implants and their service conditions. We propose a completely different approach, based on random spectrum loading, as long used in aeronautical design. The implant is randomly loaded by a sequence of loads encompassing all load levels it would endure during its service life. This approach provides a quantitative and comparable estimate of its performance in terms of lifetime, based on the very fact that the implant will fracture sooner or later, instead of defining a fatigue endurance limit of limited practical application. Five commercial monolithic Ti-6Al-4V implants were tested under cyclic loading, and another five under spectrum loading, at room temperature in dry air. The failure modes and fracture planes were identical for all implants. The approach is discussed, including its potential applications, for systematic, straightforward and reliable comparisons of various implant designs and environments, without the need for cumbersome statistical analyses. It is believed that spectrum loading can be considered for the generation of new standardization procedures and design applications. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. New machine learning tools for predictive vegetation mapping after climate change: Bagging and Random Forest perform better than Regression Tree Analysis

    Science.gov (United States)

    L.R. Iverson; A.M. Prasad; A. Liaw

    2004-01-01

    More and better machine learning tools are becoming available for landscape ecologists to aid in understanding species-environment relationships and to map probable species occurrence now and potentially into the future. To that end, we evaluated three statistical models: Regression Tree Analysis (RTA), Bagging Trees (BT) and Random Forest (RF) for their utility in...
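
    The model comparison can be reproduced in outline with scikit-learn; the sketch below scores a single tree against bagged trees and a random forest on synthetic data standing in for the species-environment matrix (all sizes and settings are hypothetical).

        from sklearn.datasets import make_classification
        from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        # synthetic stand-in for presence/absence vs. environmental predictors
        X, y = make_classification(n_samples=2000, n_features=20,
                                   n_informative=8, random_state=0)

        models = {
            "single tree (RTA)": DecisionTreeClassifier(random_state=0),
            "bagging trees (BT)": BaggingClassifier(n_estimators=100, random_state=0),
            "random forest (RF)": RandomForestClassifier(n_estimators=100, random_state=0),
        }
        for name, model in models.items():
            print(name, cross_val_score(model, X, y, cv=5).mean().round(3))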

  18. Fast and accurate approaches for large-scale, automated mapping of food diaries on food composition tables

    DEFF Research Database (Denmark)

    Lamarine, Marc; Hager, Jörg; Saris, Wim H M

    2018-01-01

    the EuroFIR resource. Two approaches were tested: the first was based solely on food name similarity (fuzzy matching). The second used a machine learning approach (C5.0 classifier) combining both fuzzy matching and food energy. We tested mapping food items using their original names and also an English...... not lead to any improvements compared to the fuzzy matching. However, it could increase substantially the recall rate for food items without any clear equivalent in the FCTs (+7 and +20% when mapping items using their original or English-translated names). Our approaches have been implemented as R packages...... and are freely available from GitHub. Conclusion: This study is the first to provide automated approaches for large-scale food item mapping onto FCTs. We demonstrate that both high precision and recall can be achieved. Our solutions can be used with any FCT and do not require any programming background...
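
    The published tools are R packages; purely as a language-neutral illustration of the fuzzy name-matching step, the Python sketch below ranks hypothetical food-composition-table (FCT) entries by string similarity to a food-diary item.

        from difflib import SequenceMatcher

        fct_entries = ["bread, wholemeal", "bread, white", "milk, whole",
                       "milk, semi-skimmed", "apple, raw"]   # hypothetical FCT names

        def fuzzy_match(diary_item, candidates, cutoff=0.6):
            # Rank candidate FCT entries by similarity to the diary-item name.
            scored = [(SequenceMatcher(None, diary_item.lower(), c.lower()).ratio(), c)
                      for c in candidates]
            return sorted((s for s in scored if s[0] >= cutoff), reverse=True)

        print(fuzzy_match("bread white", fct_entries))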

  19. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC
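
    The weighted-linear-combination step with Monte Carlo weight perturbation can be sketched as follows; the criteria grids, base weights and error model are hypothetical stand-ins for the study's AHP-derived inputs.

        import numpy as np

        rng = np.random.default_rng(42)
        criteria = rng.random((50, 50, 5))                 # standardized criterion layers
        base_w = np.array([0.30, 0.25, 0.20, 0.15, 0.10])  # e.g. AHP-derived weights

        def wlc(criteria, w):
            # Weighted linear combination: susceptibility = sum_i w_i * criterion_i.
            return np.tensordot(criteria, w, axes=([2], [0]))

        # Monte Carlo simulation: perturb the weights, renormalize, propagate.
        maps = []
        for _ in range(500):
            w = np.clip(base_w + rng.normal(0.0, 0.05, size=5), 0.0, None)
            maps.append(wlc(criteria, w / w.sum()))
        maps = np.stack(maps)

        mean_map = maps.mean(axis=0)   # expected susceptibility per cell
        std_map = maps.std(axis=0)     # per-cell uncertainty from weight error
        print(mean_map.shape, float(std_map.max()))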

  20. Policy, Research and Residents’ Perspectives on Built Environments Implicated in Heart Disease: A Concept Mapping Approach

    Directory of Open Access Journals (Sweden)

    Ivana Stankov

    2017-02-01

    Full Text Available An underrepresentation of stakeholder perspectives within urban health research arguably limits our understanding of what is a multi-dimensional and complex relationship between the built environment and health. By engaging a wide range of stakeholders using a participatory concept mapping approach, this study aimed to achieve a more holistic and nuanced understanding of the built environments shaping disease risk, specifically cardiometabolic risk (CMR). Moreover, this study aimed to ascertain the importance and changeability of identified environments through government action. Through the concept mapping process, community members, researchers, government and non-government stakeholders collectively identified eleven clusters encompassing 102 built environmental domains related to CMR, a number of which are underrepresented within the literature. Among the identified built environments, open space, public transportation and pedestrian environments were highlighted as key targets for policy intervention. Whilst there was substantive convergence in stakeholder groups’ perspectives concerning the built environment and CMR, there were disparities in the level of importance government stakeholders and community members respectively assigned to pedestrian environments and street connectivity. These findings support the role of participatory methods in strengthening how urban health issues are understood and in affording novel insights into points of action for public health and policy intervention.

  1. New approaches to high-resolution mapping of marine vertical structures.

    Science.gov (United States)

    Robert, Katleen; Huvenne, Veerle A I; Georgiopoulou, Aggeliki; Jones, Daniel O B; Marsh, Leigh; D O Carter, Gareth; Chaumillon, Leo

    2017-08-21

    Vertical walls in marine environments can harbour high biodiversity and provide natural protection from bottom-trawling activities. However, traditional mapping techniques are usually restricted to down-looking approaches which cannot adequately replicate their 3D structure. We combined sideways-looking multibeam echosounder (MBES) data from an AUV, forward-looking MBES data from ROVs and ROV-acquired videos to examine walls from Rockall Bank and Whittard Canyon, Northeast Atlantic. High-resolution 3D point clouds were extracted from each sonar dataset and structure from motion photogrammetry (SfM) was applied to recreate 3D representations of video transects along the walls. With these reconstructions, it was possible to interact with extensive sections of video footage and precisely position individuals. Terrain variables were derived on scales comparable to those experienced by megabenthic individuals. These were used to show differences in environmental conditions between observed and background locations as well as explain spatial patterns in ecological characteristics. In addition, since the SfM 3D reconstructions retained colours, they were employed to separate and quantify live coral colonies versus dead framework. The combination of these new technologies allows us, for the first time, to map the physical 3D structure of previously inaccessible habitats and demonstrates the complexity and importance of vertical structures.

  2. Data Mining Approaches for Landslide Susceptibility Mapping in Umyeonsan, Seoul, South Korea

    Directory of Open Access Journals (Sweden)

    Sunmin Lee

    2017-07-01

    Full Text Available The application of data mining models has become increasingly popular in recent years in assessments of a variety of natural hazards such as landslides and floods. Data mining techniques are useful for understanding the relationships between events and their influencing variables. Because landslides are influenced by a combination of factors including geomorphological and meteorological factors, data mining techniques are helpful in elucidating the mechanisms by which these complex factors affect landslide events. In this study, spatial data mining approaches based on data on landslide locations in the geographic information system environment were investigated. The topographical factors of slope, aspect, curvature, topographic wetness index, stream power index, slope length factor, standardized height, valley depth, and downslope distance gradient were determined using topographical maps. Additional soil and forest variables using information obtained from national soil and forest maps were also investigated. A total of 17 variables affecting the frequency of landslide occurrence were selected to construct a spatial database, and support vector machine (SVM) and artificial neural network (ANN) models were applied to predict landslide susceptibility from the selected factors. In the SVM model, linear, polynomial, radial basis function, and sigmoid kernels were applied in sequence; the model yielded 72.41%, 72.83%, 77.17% and 72.79% accuracy, respectively. The ANN model yielded a validation accuracy of 78.41%. The results of this study are useful in guiding effective strategies for the prevention and management of landslides in urban areas.
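
    The kernel comparison is straightforward to replicate in outline; the sketch below cross-validates the four SVM kernels named in the abstract plus a small neural network on a synthetic stand-in for the 17 conditioning factors (all data and settings are hypothetical).

        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=1500, n_features=17,
                                   n_informative=9, random_state=0)

        for kernel in ("linear", "poly", "rbf", "sigmoid"):
            clf = make_pipeline(StandardScaler(), SVC(kernel=kernel))
            print("SVM", kernel, cross_val_score(clf, X, y, cv=5).mean().round(3))

        ann = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(32,),
                                          max_iter=1000, random_state=0))
        print("ANN", cross_val_score(ann, X, y, cv=5).mean().round(3))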

  3. An Approach for Predicting Essential Genes Using Multiple Homology Mapping and Machine Learning Algorithms.

    Science.gov (United States)

    Hua, Hong-Li; Zhang, Fa-Zhan; Labena, Abraham Alemayehu; Dong, Chuan; Jin, Yan-Ting; Guo, Feng-Biao

    Investigation of essential genes is significant for comprehending the minimal gene set of a cell and discovering potential drug targets. In this study, a novel approach based on multiple homology mapping and machine learning methods was introduced to predict essential genes. We focused on 25 bacteria which have characterized essential genes. The predictions yielded the highest area under the receiver operating characteristic (ROC) curve (AUC) of 0.9716 through a tenfold cross-validation test. Proper features were utilized to construct models to make predictions in distantly related bacteria. The accuracy of predictions was evaluated via the consistency of predictions and known essential genes of target species. The highest AUC of 0.9552 and an average AUC of 0.8314 were achieved when making predictions across organisms. An independent dataset from Synechococcus elongatus, which was released recently, was obtained for further assessment of the performance of our model. The AUC score of the predictions is 0.7855, which is higher than that of other methods. This research shows that features obtained by homology mapping alone can achieve results as good as or even better than integrated features. Meanwhile, the work indicates that machine learning-based methods can assign more efficient weight coefficients than empirical formulas based on biological knowledge.

  4. My Family-Study, Early-Onset Substance use Prevention Program: An Application of Intervention Mapping Approach

    Directory of Open Access Journals (Sweden)

    Mehdi Mirzaei-Alavijeh

    2017-03-01

    Full Text Available Background and Objectives: Based on different studies, substance use is one of the health problems in Iranian society. The prevalence of substance use is on a growing trend; moreover, the age of onset of substance use has declined to early adolescence and even lower. Regarding this, the present study aimed to develop a family-based early-onset substance use prevention program for children (My Family-Study) by using the intervention mapping approach. Materials and Methods: This study describes the research protocol during which the intervention mapping approach was used as a framework to develop My Family-Study. In this study, six steps of intervention mapping were completed. Interviews with experts and a literature review fulfilled the needs assessment. In the second step, the change objectives were written based on the intersection of the performance objectives and the associated determinants in the matrices. After designing the program and planning the implementation of the intervention, the evaluation plan of the program was completed. Results: The use of the intervention mapping approach facilitated the development of a systematic as well as theory- and evidence-based program. Moreover, this approach was helpful in the determination of outcomes, performance and change objectives, determinants, theoretical methods, practical applications, intervention, dissemination, and the evaluation program. Conclusions: Intervention mapping provided a systematic as well as theory- and evidence-based approach to develop a quality continuing health promotion program.

  5. Mapping an appropriate health promotion approach for crèches in an informal settlement.

    Science.gov (United States)

    Brijlal, P; Gordon, N

    2005-02-01

    People living in informal settlements in South Africa experience the double burden of poverty and ill health. Wallacedene, an informal settlement was highlighted in the media as being a socially and otherwise deprived community, with many accompanying health problems. It was against this background that this study was conducted to gain a better understanding of the health and oral health status of children attending crèches in Wallacedene. It was designed to inform the mapping of an appropriate approach to develop a health promotion programme for crèches. Baseline data were collected through oral and general health examinations, site observations, a structured questionnaire and interviews with key people working with the children at two crèches. The results indicate poor oral and general health. Gingival inflammation (82.8%), caries (81.5%), and moderate to abundant plaque deposits (95.7%), fungal infections (33.9%), runny nose (51.4%), lymphadenopathy (45.7%) and itchy skin (5.7%) were found. Caregivers were not well informed about oral health. However, they were enthusiastic to engage in new interventions. The community was impoverished; public health interventions were limited with minimal resources such as health centres and voluntary service providers. The limited resources were not coordinated and did not adequately address the health and educational needs of the children. A multi-sectoral approach focusing on community development is an appropriate approach to address the needs of crèche children in this community.

  6. The mapping approach in the path integral formalism applied to curve-crossing systems

    International Nuclear Information System (INIS)

    Novikov, Alexey; Kleinekathoefer, Ulrich; Schreiber, Michael

    2004-01-01

    The path integral formalism in a combined phase-space and coherent-state representation is applied to the problem of curve-crossing dynamics. The system of interest is described by two coupled one-dimensional harmonic potential energy surfaces interacting with a heat bath consisting of harmonic oscillators. The mapping approach is used to rewrite the Lagrangian function of the electronic part of the system. Using the Feynman-Vernon influence-functional method the bath is eliminated whereas the non-Gaussian part of the path integral is treated using the generating functional for the electronic trajectories. The dynamics of a Gaussian wave packet is analyzed along a one-dimensional reaction coordinate within a perturbative treatment for a small coordinate shift between the potential energy surfaces
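
    The mapping alluded to here, in the spirit of the Meyer-Miller-Stock-Thoss representation, replaces the discrete electronic states by harmonic-oscillator operators; schematically (in LaTeX),

        % mapping of N discrete electronic states onto N harmonic oscillators
        |n\rangle\langle m| \;\longrightarrow\; \hat{a}_n^{\dagger}\hat{a}_m,
        \qquad
        |n\rangle \;\longrightarrow\; |0_1,\dots,1_n,\dots,0_N\rangle,
        % so the electronic Hamiltonian becomes quadratic in the mapping variables:
        \hat{H}_{\mathrm{el}} = \sum_{n,m} h_{nm}\,|n\rangle\langle m|
        \;\longrightarrow\;
        \hat{H}_{\mathrm{map}} = \sum_{n,m} h_{nm}\,\hat{a}_n^{\dagger}\hat{a}_m,

    which makes the electronic part of the Lagrangian amenable to the continuous path integral treatment described in the abstract.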

  7. Improved regional-scale Brazilian cropping systems' mapping based on a semi-automatic object-based clustering approach

    Science.gov (United States)

    Bellón, Beatriz; Bégué, Agnès; Lo Seen, Danny; Lebourgeois, Valentine; Evangelista, Balbino Antônio; Simões, Margareth; Demonte Ferraz, Rodrigo Peçanha

    2018-06-01

    Cropping systems' maps at fine scale over large areas provide key information for further agricultural production and environmental impact assessments, and thus represent a valuable tool for effective land-use planning. There is, therefore, a growing interest in mapping cropping systems in an operational manner over large areas, and remote sensing approaches based on vegetation index time series analysis have proven to be an efficient tool. However, supervised pixel-based approaches are commonly adopted, requiring resource-consuming field campaigns to gather training data. In this paper, we present a new object-based unsupervised classification approach tested on an annual MODIS 16-day composite Normalized Difference Vegetation Index time series and a Landsat 8 mosaic of the State of Tocantins, Brazil, for the 2014-2015 growing season. Two variants of the approach are compared: a hyperclustering approach, and a landscape-clustering approach involving a prior stratification of the study area into landscape units on which the clustering is then performed. The main cropping systems of Tocantins, characterized by the crop types and cropping patterns, were efficiently mapped with the landscape-clustering approach. Results show that stratification prior to clustering significantly improves the classification accuracies for underrepresented and sparsely distributed cropping systems. This study illustrates the potential of unsupervised classification for large area cropping systems' mapping and contributes to the development of generic tools for supporting large-scale agricultural monitoring across regions.
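
    The two clustering variants can be contrasted in a few lines; the sketch below (with invented data sizes) clusters hypothetical per-object NDVI time series globally (hyperclustering) and then separately within each landscape stratum (landscape-clustering).

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        ndvi = rng.random((4000, 23))              # per-object NDVI time series
        stratum = rng.integers(0, 3, size=4000)    # landscape unit of each object

        # hyperclustering: one global clustering over all objects
        global_labels = KMeans(n_clusters=12, n_init=10,
                               random_state=0).fit_predict(ndvi)

        # landscape-clustering: cluster separately within each stratum
        strat_labels = np.empty(4000, dtype=int)
        for s in np.unique(stratum):
            mask = stratum == s
            km = KMeans(n_clusters=4, n_init=10,
                        random_state=0).fit_predict(ndvi[mask])
            strat_labels[mask] = 100 * s + km      # labels unique across strata
        print(len(np.unique(global_labels)), len(np.unique(strat_labels)))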

  8. A Systematic Approach to Modified BCJR MAP Algorithms for Convolutional Codes

    Directory of Open Access Journals (Sweden)

    Patenaude François

    2006-01-01

    Full Text Available Since Berrou, Glavieux and Thitimajshima published their landmark paper in 1993, different modified BCJR MAP algorithms have appeared in the literature. The existence of a relatively large number of similar but different modified BCJR MAP algorithms, derived using the Markov chain properties of convolutional codes, naturally leads to the following questions. What is the relationship among the different modified BCJR MAP algorithms? What are their relative performance, computational complexities, and memory requirements? In this paper, we answer these questions. We derive systematically four major modified BCJR MAP algorithms from the BCJR MAP algorithm using simple mathematical transformations. The connections between the original and the four modified BCJR MAP algorithms are established. A detailed analysis of the different modified BCJR MAP algorithms shows that they have identical computational complexities and memory requirements. Computer simulations demonstrate that the four modified BCJR MAP algorithms all have identical performance to the BCJR MAP algorithm.
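
    At the heart of every BCJR variant is a forward-backward recursion over the trellis; the sketch below shows a generic, normalized forward-backward computation of posterior state probabilities on a hypothetical two-state trellis (it is not any of the specific modified algorithms derived in the paper).

        import numpy as np

        def forward_backward(trans, emit, init):
            # alpha/beta recursions with per-step normalization (avoids underflow);
            # returns posterior state probabilities, the core quantity of BCJR MAP.
            T, S = emit.shape
            alpha = np.zeros((T, S))
            beta = np.ones((T, S))
            alpha[0] = init * emit[0]
            alpha[0] /= alpha[0].sum()
            for t in range(1, T):
                alpha[t] = emit[t] * (alpha[t - 1] @ trans)
                alpha[t] /= alpha[t].sum()
            for t in range(T - 2, -1, -1):
                beta[t] = trans @ (emit[t + 1] * beta[t + 1])
                beta[t] /= beta[t].sum()
            post = alpha * beta
            return post / post.sum(axis=1, keepdims=True)

        trans = np.array([[0.9, 0.1], [0.2, 0.8]])     # hypothetical state transitions
        emit = np.array([[0.8, 0.3], [0.1, 0.7],       # hypothetical channel likelihoods
                         [0.5, 0.5], [0.9, 0.2]])
        print(forward_backward(trans, emit, init=np.array([0.5, 0.5])))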

  9. Mapping Spatial Distribution of Larch Plantations from Multi-Seasonal Landsat-8 OLI Imagery and Multi-Scale Textures Using Random Forests

    Directory of Open Access Journals (Sweden)

    Tian Gao

    2015-02-01

    Full Text Available The knowledge of the spatial distribution of plantation forests is critical for forest management, monitoring programs and functional assessment. This study demonstrates the potential of multi-seasonal (spring, summer, autumn and winter) Landsat-8 Operational Land Imager imagery with random forests (RF) modeling to map larch plantations (LP) in a typical plantation forest landscape in North China. The spectral bands and two types of textures were used to create 675 input variables for the RF. An accuracy of 92.7% for LP, with a Kappa coefficient of 0.834, was attained using the RF model. An RF-based importance assessment reveals that the spectral bands and bivariate textural features calculated by the pseudo-cross variogram (PC) strongly promoted forest class-separability, whereas the univariate textural features contributed only weakly. A feature selection strategy eliminated 93% of the variables, and a subset of the 47 most essential variables was generated. In this subset, PC textures derived from summer and winter appeared most frequently, suggesting that the variability between the growing peak season and the non-growing season can effectively enhance forest class-separability. An RF classifier applied to the subset led to 91.9% accuracy for LP, with a Kappa coefficient of 0.829. This study provides insight into approaches for discriminating plantation forests with phenological behaviors.
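
    The importance-based variable reduction reads, in outline, as below (scikit-learn, with synthetic data standing in for the 675 spectral and textural variables; the threshold is an invented example, not the authors' rule).

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.feature_selection import SelectFromModel
        from sklearn.model_selection import cross_val_score

        X, y = make_classification(n_samples=1200, n_features=675,
                                   n_informative=40, random_state=0)

        rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
        selector = SelectFromModel(rf, threshold="2*mean", prefit=True)
        X_sub = selector.transform(X)            # keep only the most important variables
        print("kept", X_sub.shape[1], "of", X.shape[1], "variables")

        rf2 = RandomForestClassifier(n_estimators=300, random_state=0)
        print("accuracy on subset:",
              cross_val_score(rf2, X_sub, y, cv=5).mean().round(3))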

  10. Labelling plants the Chernobyl way: A new approach for mapping rhizodeposition and biopore reuse

    Science.gov (United States)

    Banfield, Callum; Kuzyakov, Yakov

    2016-04-01

    A novel approach for mapping root distribution and rhizodeposition using 137Cs and 14C was applied. By immersing cut leaves into vials containing 137CsCl solution, the 137Cs label is taken up and partly released into the rhizosphere, where it strongly binds to soil particles, thus labelling the distribution of root channels in the long term. Reuse of root channels in crop rotations can be determined by labelling the first crop with 137Cs and the following crop with 14C. Imaging of the β⁻ radiation with strongly differing energies differentiates active roots growing in existing root channels (14C + 137Cs activity) from roots growing in bulk soil (14C activity only). The feasibility of the approach was shown in a pot experiment with ten plants of two species, Cichorium intybus L., and Medicago sativa L. The same plants were each labelled with 100 kBq of 137CsCl and after one week with 500 kBq of 14CO2. 96 h later pots were cut horizontally at 6 cm depth. After the first 137Cs + 14C imaging of the cut surface, imaging was repeated with three layers of plastic film between the cut surface and the plate for complete shielding of 14C β⁻ radiation to the background level, producing an image of the 137Cs distribution. Subtracting the second image from the first gave the 14C image. Both species allocated 18-22% of the 137Cs and about 30-40% of 14C activity below ground. Intensities far above the detection limit suggest that this approach is applicable to map the root system by 137Cs and to obtain root size distributions through image processing. The rhizosphere boundary was defined by the point at which rhizodeposited 14C activity declined to 5% of the activity of the root centre. Medicago showed 25% smaller rhizosphere extension than Cichorium, demonstrating that plant-specific rhizodeposition patterns can be distinguished. Our new approach is appropriate to visualise processes and hotspots on multiple scales: Heterogeneous rhizodeposition, as well as size and counts

  11. A fast and cost-effective approach to develop and map EST-SSR markers: oak as a case study

    Directory of Open Access Journals (Sweden)

    Cherubini Marcello

    2010-10-01

    Full Text Available Abstract Background Expressed Sequence Tags (ESTs) are a source of simple sequence repeats (SSRs) that can be used to develop molecular markers for genetic studies. The availability of ESTs for Quercus robur and Quercus petraea provided a unique opportunity to develop microsatellite markers to accelerate research aimed at studying adaptation of these long-lived species to their environment. As a first step toward the construction of an SSR-based linkage map of oak for quantitative trait locus (QTL) mapping, we describe the mining and survey of EST-SSRs as well as a fast and cost-effective approach (bin mapping) to assign these markers to an approximate map position. We also compared the level of polymorphism between genomic and EST-derived SSRs and address the transferability of EST-SSRs to Castanea sativa (chestnut). Results A catalogue of 103,000 Sanger ESTs was assembled into 28,024 unigenes, of which 18.6% presented one or more SSR motifs. More than 42% of these SSRs corresponded to trinucleotides. Primer pairs were designed for 748 putative unigenes. Overall 37.7% (283) were found to amplify a single polymorphic locus in a reference full-sib pedigree of Quercus robur. The usefulness of these loci for establishing a genetic map was assessed using a bin mapping approach. Bin maps were constructed for the male and female parental trees, for which framework linkage maps based on AFLP markers were available. The bin set consisted of 14 highly informative offspring selected based on the number and position of crossover sites. The female and male maps comprised 44 and 37 bins, with an average bin length of 16.5 cM and 20.99 cM, respectively. A total of 256 EST-SSRs were assigned to bins and their map position was further validated by linkage mapping. EST-SSRs were found to be less polymorphic than genomic SSRs, but their transferability rate to chestnut, a phylogenetically related species to oak, was higher. Conclusion We have generated a bin map for oak

  12. Review of the Space Mapping Approach to Engineering Optimization and Modeling

    DEFF Research Database (Denmark)

    Bakr, M. H.; Bandler, J. W.; Madsen, Kaj

    2000-01-01

    We review the Space Mapping (SM) concept and its applications in engineering optimization and modeling. The aim of SM is to avoid computationally expensive calculations encountered in simulating an engineering system. The existence of less accurate but fast physically-based models is exploited...... Space Mapping-based Modeling (SMM). These include Space Derivative Mapping (SDM), Generalized Space Mapping (GSM) and Space Mapping-based Neuromodeling (SMN). Finally, we address open points for research and future development....
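
    The core idea can be conveyed with a toy aggressive-space-mapping loop: optimize the cheap coarse model once, then repeatedly align it to sparse fine-model evaluations by parameter extraction. The one-dimensional models below are invented purely for illustration.

        import numpy as np
        from scipy.optimize import minimize_scalar

        fine = lambda x: np.tanh(x - 2.0)      # "expensive", accurate response
        coarse = lambda z: np.tanh(z - 1.4)    # cheap surrogate, parameter-shifted
        # design goal: drive the response to zero

        z_star = minimize_scalar(lambda z: coarse(z) ** 2).x   # coarse-model optimum

        x = z_star                             # initial fine-model design
        for _ in range(5):
            fx = fine(x)                       # one expensive evaluation per step
            if abs(fx) < 1e-6:
                break
            # parameter extraction: find z whose coarse response matches the fine one
            z = minimize_scalar(lambda z: (coarse(z) - fx) ** 2).x
            x = x - (z - z_star)               # space-mapping update
        print("fine-model optimum:", round(x, 4))   # ~2.0 for these toy models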

  13. Wide-field optical mapping of neural activity and brain haemodynamics: considerations and novel approaches

    Science.gov (United States)

    Ma, Ying; Shaik, Mohammed A.; Kozberg, Mariel G.; Thibodeaux, David N.; Zhao, Hanzhi T.; Yu, Hang

    2016-01-01

    Although modern techniques such as two-photon microscopy can now provide cellular-level three-dimensional imaging of the intact living brain, the speed and fields of view of these techniques remain limited. Conversely, two-dimensional wide-field optical mapping (WFOM), a simpler technique that uses a camera to observe large areas of the exposed cortex under visible light, can detect changes in both neural activity and haemodynamics at very high speeds. Although WFOM may not provide single-neuron or capillary-level resolution, it is an attractive and accessible approach to imaging large areas of the brain in awake, behaving mammals at speeds fast enough to observe widespread neural firing events, as well as their dynamic coupling to haemodynamics. Although such wide-field optical imaging techniques have a long history, the advent of genetically encoded fluorophores that can report neural activity with high sensitivity, as well as modern technologies such as light emitting diodes and sensitive and high-speed digital cameras have driven renewed interest in WFOM. To facilitate the wider adoption and standardization of WFOM approaches for neuroscience and neurovascular coupling research, we provide here an overview of the basic principles of WFOM, considerations for implementation of wide-field fluorescence imaging of neural activity, spectroscopic analysis and interpretation of results. This article is part of the themed issue ‘Interpreting BOLD: a dialogue between cognitive and cellular neuroscience’. PMID:27574312

  14. Wide-field optical mapping of neural activity and brain haemodynamics: considerations and novel approaches.

    Science.gov (United States)

    Ma, Ying; Shaik, Mohammed A; Kim, Sharon H; Kozberg, Mariel G; Thibodeaux, David N; Zhao, Hanzhi T; Yu, Hang; Hillman, Elizabeth M C

    2016-10-05

    Although modern techniques such as two-photon microscopy can now provide cellular-level three-dimensional imaging of the intact living brain, the speed and fields of view of these techniques remain limited. Conversely, two-dimensional wide-field optical mapping (WFOM), a simpler technique that uses a camera to observe large areas of the exposed cortex under visible light, can detect changes in both neural activity and haemodynamics at very high speeds. Although WFOM may not provide single-neuron or capillary-level resolution, it is an attractive and accessible approach to imaging large areas of the brain in awake, behaving mammals at speeds fast enough to observe widespread neural firing events, as well as their dynamic coupling to haemodynamics. Although such wide-field optical imaging techniques have a long history, the advent of genetically encoded fluorophores that can report neural activity with high sensitivity, as well as modern technologies such as light emitting diodes and sensitive and high-speed digital cameras have driven renewed interest in WFOM. To facilitate the wider adoption and standardization of WFOM approaches for neuroscience and neurovascular coupling research, we provide here an overview of the basic principles of WFOM, considerations for implementation of wide-field fluorescence imaging of neural activity, spectroscopic analysis and interpretation of results. This article is part of the themed issue 'Interpreting BOLD: a dialogue between cognitive and cellular neuroscience'. © 2016 The Authors.

  15. Seeing the whole picture: A comprehensive imaging approach to functional mapping of circuits in behaving zebrafish.

    Science.gov (United States)

    Feierstein, C E; Portugues, R; Orger, M B

    2015-06-18

    In recent years, the zebrafish has emerged as an appealing model system to tackle questions relating to the neural circuit basis of behavior. This can be attributed not just to the growing use of genetically tractable model organisms, but also in large part to the rapid advances in optical techniques for neuroscience, which are ideally suited for application to the small, transparent brain of the larval fish. Many characteristic features of vertebrate brains, from gross anatomy down to particular circuit motifs and cell-types, as well as conserved behaviors, can be found in zebrafish even just a few days post fertilization, and, at this early stage, the physical size of the brain makes it possible to analyze neural activity in a comprehensive fashion. In a recent study, we used a systematic and unbiased imaging method to record the pattern of activity dynamics throughout the whole brain of larval zebrafish during a simple visual behavior, the optokinetic response (OKR). This approach revealed the broadly distributed network of neurons that were active during the behavior and provided insights into the fine-scale functional architecture in the brain, inter-individual variability, and the spatial distribution of behaviorally relevant signals. Combined with mapping anatomical and functional connectivity, targeted electrophysiological recordings, and genetic labeling of specific populations, this comprehensive approach in zebrafish provides an unparalleled opportunity to study complete circuits in a behaving vertebrate animal. Copyright © 2014. Published by Elsevier Ltd.

  16. A BAC Library and Paired-PCR Approach to Mapping and Completing the Genome Sequence of Sulfolobus solfataricus P2

    DEFF Research Database (Denmark)

    She, Qunxin; Confalonieri, F.; Zivanovic, Y.

    2000-01-01

    The original strategy used in the Sulfolobus solfataricus genome project was to sequence non-overlapping, or minimally overlapping, cosmid or lambda inserts without constructing a physical map. However, after only about two thirds of the genome sequence was completed, this approach became counter-productive because there was a high sequence bias in the cosmid and lambda libraries. Therefore, a new approach was devised for linking the sequenced regions which may be generally applicable. BAC libraries were constructed and terminal sequences of the clones were determined and used for both end mapping and PCR

  17. A COGNITIVE APPROACH TO CORPORATE GOVERNANCE: A VISUALIZATION TEST OF MENTAL MODELS WITH THE COGNITIVE MAPPING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Garoui NASSREDDINE

    2012-01-01

    Full Text Available The idea of this paper is to determine the mental models of actors in the firm with respect to the cognitive approach of corporate governance. The paper takes a corporate governance perspective, discusses mental models and uses the cognitive map to view the diagrams showing the ways of thinking and the conceptualization of the cognitive approach. In addition, it employs a cognitive mapping technique. Returning to the systematic exploration of grids for each actor, it concludes that there is a balance of concepts expressing their cognitive orientation.

  18. Reliable Radiation Hybrid Maps: An Efficient Scalable Clustering-based Approach

    Science.gov (United States)

    The process of mapping markers from radiation hybrid mapping (RHM) experiments is equivalent to the traveling salesman problem and, thereby, has combinatorial complexity. As an additional problem, experiments typically result in some unreliable markers that reduce the overall quality of the map. We ...

  19. The Randomized CRM: An Approach to Overcoming the Long-Memory Property of the CRM.

    Science.gov (United States)

    Koopmeiners, Joseph S; Wey, Andrew

    2017-01-01

    The primary objective of a Phase I clinical trial is to determine the maximum tolerated dose (MTD). Typically, the MTD is identified using a dose-escalation study, where initial subjects are treated at the lowest dose level and subsequent subjects are treated at progressively higher dose levels until the MTD is identified. The continual reassessment method (CRM) is a popular model-based dose-escalation design, which utilizes a formal model for the relationship between dose and toxicity to guide dose finding. Recently, it was shown that the CRM has a tendency to get "stuck" on a dose level, with little escalation or de-escalation in the late stages of the trial, due to the long-memory property of the CRM. We propose the randomized CRM (rCRM), which introduces random escalation and de-escalation into the standard CRM dose-finding algorithm, as well as a hybrid approach that incorporates escalation and de-escalation only when certain criteria are met. Our simulation results show that both the rCRM and the hybrid approach reduce the trial-to-trial variability in the number of cohorts treated at the MTD but that the hybrid approach has a more favorable tradeoff with respect to the average number treated at the MTD.

  20. Fusion Approaches for Land Cover Map Production Using High Resolution Image Time Series without Reference Data of the Corresponding Period

    Directory of Open Access Journals (Sweden)

    Benjamin Tardy

    2017-11-01

    Full Text Available Optical sensor time series images allow one to produce land cover maps at a large scale. Supervised classification algorithms have been shown to be the best for producing maps automatically with good accuracy. The main drawback of these methods is the need for reference data, the collection of which can introduce long production delays. Therefore, the maps are often available too late for some applications. Domain adaptation methods seem to be efficient for using past data for land cover map production. Following this idea, the main goal of this study is to propose several simple past-data fusion schemes to override the current land cover map production delays. A single classifier approach and three voting rules are considered to produce maps without reference data of the corresponding period. These four approaches reach an overall accuracy of around 80% with a 17-class nomenclature using Formosat-2 image time series. A study of the impact of the number of past periods used shows that the overall accuracy increases with the number of periods used.
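
    The simplest of the voting rules can be sketched as a per-pixel majority vote over label maps produced by classifiers trained on past periods (map sizes and class count are placeholders, and SciPy >= 1.9 is assumed for the keepdims argument).

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # hypothetical label maps from classifiers trained on three past periods
        past_maps = rng.integers(0, 17, size=(3, 100, 100))   # 17-class nomenclature

        # majority-vote fusion: per pixel, keep the most frequent past label
        fused, counts = stats.mode(past_maps, axis=0, keepdims=False)
        print(fused.shape, int(counts.min()))                 # (100, 100)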

  1. Towards a virtual hub approach for landscape assessment and multimedia ecomuseum using multitemporal-maps

    Directory of Open Access Journals (Sweden)

    R. Brumana

    2015-08-01

    Full Text Available Landscapes are dynamic entities, stretching and transforming across space and time, and need to be safeguarded as living places for the future, with interaction of human, social and economic dimensions. A comprehensive landscape evaluation requires several open data sets, each one characterized by its own protocol and service interface, limiting or impeding interoperability and their integration. Indeed, nowadays the development of websites targeted at landscape assessment and touristic purposes requires many resources in terms of time, cost and IT skills, to be implemented at different scales. For this reason these applications are limited to a few cases, mainly focusing on worldwide-known touristic sites. The capability to spread the development of web-based multimedia virtual museums based on geospatial data relies, for the future, on the possibility of discovering the needed geo-spatial data through a single point of access in a homogeneous way. The innovative approach proposed in this paper may facilitate access to open data in a homogeneous way by means of specific components (the brokers) performing the interoperability actions required to interconnect heterogeneous data sources. In the specific case study analysed here, an interface has been implemented to migrate a geo-swat chart based on local and regional geographic information into a user-friendly Google Earth©-based infrastructure, integrating ancient cadastres and modern cartography, accessible by professionals and tourists via the web and also via portable devices like tablets and smartphones. The general aim of this work on the case study of the Lake of Como (Tremezzina municipality) is to boost the integration of assessment methodologies with digital geo-based technologies of map correlation for the multimedia ecomuseum system accessible via the web. The developed WebGIS system integrates multi-scale and multi-temporal maps with different information (cultural, historical

  2. Towards a virtual hub approach for landscape assessment and multimedia ecomuseum using multitemporal-maps

    Science.gov (United States)

    Brumana, R.; Santana Quintero, M.; Barazzetti, L.; Previtali, M.; Banfi, F.; Oreni, D.; Roels, D.; Roncoroni, F.

    2015-08-01

    Landscapes are dynamic entities, stretching and transforming across space and time, and need to be safeguarded as living places for the future, with interaction of human, social and economic dimensions. A comprehensive landscape evaluation requires several open data sets, each one characterized by its own protocol and service interface, limiting or impeding interoperability and their integration. Indeed, nowadays the development of websites targeted at landscape assessment and touristic purposes requires many resources in terms of time, cost and IT skills, to be implemented at different scales. For this reason these applications are limited to a few cases, mainly focusing on worldwide-known touristic sites. The capability to spread the development of web-based multimedia virtual museums based on geospatial data relies, for the future, on the possibility of discovering the needed geo-spatial data through a single point of access in a homogeneous way. The innovative approach proposed in this paper may facilitate access to open data in a homogeneous way by means of specific components (the brokers) performing interoperability actions required to interconnect heterogeneous data sources. In the specific case study analysed here, an interface has been implemented to migrate a geo-swat chart based on local and regional geographic information into a user-friendly Google Earth©-based infrastructure, integrating ancient cadastres and modern cartography, accessible by professionals and tourists via the web and also via portable devices like tablets and smartphones. The general aim of this work on the case study of the Lake of Como (Tremezzina municipality) is to boost the integration of assessment methodologies with digital geo-based technologies of map correlation for the multimedia ecomuseum system accessible via the web. The developed WebGIS system integrates multi-scale and multi-temporal maps with different information (cultural, historical, landscape levels

  3. A new experimental approach to study the stability of logistic map

    International Nuclear Information System (INIS)

    Rani, Mamta; Agarwal, Rashi

    2009-01-01

    Remarkably benign-looking logistic transformations x_{n+1} = r x_n (1 - x_n), for x_0 chosen between 0 and 1 and 0 < r ≤ 4, have found a celebrated place in chaos, fractals and discrete dynamics. The purpose of this paper is to enhance the capabilities of the logistic map via superior iterations. The stability of the logistic map has been studied by running computer programs. The logistic map is stable for 0 < r ≤ 3.2 in the Picard orbit. In the superior orbit, we see that the range of stability of the logistic map increases drastically. Also, the chaotic behavior of the logistic map disappears in certain cases.
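
    "Superior iterations" here are Mann-type averaged orbits, x_{n+1} = (1 - beta)*x_n + beta*f(x_n); a minimal numerical check of the stabilizing effect, with illustrative parameters, is below. At r = 3.4 the Picard orbit settles on a period-2 cycle, while the superior orbit converges to the fixed point 1 - 1/r.

        def picard(r, x0, n=200):
            x = x0
            for _ in range(n):
                x = r * x * (1 - x)                           # classical Picard orbit
            return x

        def superior(r, x0, beta=0.5, n=200):
            x = x0
            for _ in range(n):
                x = (1 - beta) * x + beta * r * x * (1 - x)   # Mann (superior) orbit
            return x

        r, x0 = 3.4, 0.3
        print("Picard:  ", round(picard(r, x0, 200), 4), round(picard(r, x0, 201), 4))
        print("superior:", round(superior(r, x0, n=200), 4),
              round(superior(r, x0, n=201), 4))
        print("fixed point 1 - 1/r =", round(1 - 1 / r, 4))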

  4. Mapping behavioral landscapes for animal movement: a finite mixture modeling approach

    Science.gov (United States)

    Tracey, Jeff A.; Zhu, Jun; Boydston, Erin E.; Lyren, Lisa M.; Fisher, Robert N.; Crooks, Kevin R.

    2013-01-01

    Because of its role in many ecological processes, movement of animals in response to landscape features is an important subject in ecology and conservation biology. In this paper, we develop models of animal movement in relation to objects or fields in a landscape. We take a finite mixture modeling approach in which the component densities are conceptually related to different choices for movement in response to a landscape feature, and the mixing proportions are related to the probability of selecting each response as a function of one or more covariates. We combine particle swarm optimization and an Expectation-Maximization (EM) algorithm to obtain maximum likelihood estimates of the model parameters. We use this approach to analyze data for movement of three bobcats in relation to urban areas in southern California, USA. A behavioral interpretation of the models revealed similarities and differences in bobcat movement response to urbanization. All three bobcats avoided urbanization by moving either parallel to urban boundaries or toward less urban areas as the proportion of urban land cover in the surrounding area increased. However, one bobcat, a male with a dispersal-like large-scale movement pattern, avoided urbanization at lower densities and responded strictly by moving parallel to the urban edge. The other two bobcats, which were both residents and occupied similar geographic areas, avoided urban areas using a combination of movements parallel to the urban edge and movement toward areas of less urbanization. However, the resident female appeared to exhibit greater repulsion at lower levels of urbanization than the resident male, consistent with empirical observations of bobcats in southern California. Using the parameterized finite mixture models, we mapped behavioral states to geographic space, creating a representation of a behavioral landscape. This approach can provide guidance for conservation planning based on analysis of animal movement data using

  5. Randomized trial of two swallowing assessment approaches in patients with acquired brain injury

    DEFF Research Database (Denmark)

    Kjaersgaard, Annette; Nielsen, Lars Hedemann; Sjölund, Bengt H.

    2014-01-01

    trial. SETTING: Specialized, national neurorehabilitation centre. SUBJECTS: Adult patients with acquired brain injury. Six hundred and seventy-nine patients were assessed for eligibility and 138 were randomly allocated between June 2009 and April 2011. INTERVENTIONS: Assessment by Facial-Oral Tract....... Seven patients were left for analysis, 4 of whom developed aspiration pneumonia within 10 days after initiating oral intake (1 control/3 interventions). CONCLUSION: In the presence of a structured clinical assessment with the Facial-Oral Tract Therapy approach, it is unnecessary to undertake...

  6. Scope of Various Random Number Generators in ant System Approach for TSP

    Science.gov (United States)

    Sen, S. K.; Shaykhian, Gholam Ali

    2007-01-01

    Several quasi- and pseudo-random number generators are experimented with in a heuristic based on an ant system approach for the traveling salesman problem. This experiment explores whether any particular generator is most desirable. Such an experiment on large samples has the potential to rank the performance of the generators for the foregoing heuristic. This is mainly to seek an answer to the controversial issue "which generator is the best in terms of quality of the result (accuracy) as well as cost of producing the result (time/computational complexity) in a probabilistic/statistical sense."

  7. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    Science.gov (United States)

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluate its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map.

  8. A Ranking Analysis/An Interlinking Approach of New Triangular Fuzzy Cognitive Maps and Combined Effective Time Dependent Matrix

    Science.gov (United States)

    Adiga, Shreemathi; Saraswathi, A.; Praveen Prakash, A.

    2018-04-01

    This paper presents an interlinking approach of new Triangular Fuzzy Cognitive Maps (TrFCM) and the Combined Effective Time Dependent (CETD) matrix to rank the problems of transgenders. Section one begins with an introduction that briefly describes the scope of Triangular Fuzzy Cognitive Maps (TrFCM) and the CETD matrix. Section two analyses the causes of the problems faced by transgenders using the TrFCM method and performs the calculations using the data collected among transgenders. Section 3 identifies the main causes of the problems of transgenders. Section 4 describes Charles Spearman's coefficient of rank correlation method, interlinking the TrFCM method and the CETD matrix. Section 5 presents the results based on our study.

  9. A Prospective Randomized Peri- and Post-Operative Comparison of the Minimally Invasive Anterolateral Approach Versus the Lateral Approach

    Science.gov (United States)

    Landgraeber, Stefan; Quitmann, Henning; Güth, Sebastian; Haversath, Marcel; Kowalczyk, Wojciech; Kecskeméthy, Andrés; Heep, Hansjörg; Jäger, Marcus

    2013-01-01

    There is still controversy as to whether minimally invasive total hip arthroplasty enhances the postoperative outcome. The aim of this study was to compare the outcome of patients who underwent total hip replacement through an anterolateral minimally invasive (MIS) or a conventional lateral approach (CON). We performed a randomized, prospective study of 75 patients with primary hip arthritis, who underwent hip replacement through the MIS (n=36) or CON (n=39) approach. The Western Ontario and McMaster Universities Osteoarthritis Index and Harris Hip score (HHS) were evaluated at frequent intervals during the early postoperative follow-up period and then after 3.5 years. Pain sensations were recorded. Serological and radiological analyses were performed. In the MIS group the patients had smaller skin incisions and there was a significantly lower rate of patients with a positive Trendelenburg sign after six weeks postoperatively. After six weeks the HHS was 6.85 points higher in the MIS group (P=0.045). However, when calculating the mean difference between the baseline and the six-week HHS, we found no significant differences. Blood loss was greater and the duration of surgery was longer in the MIS group. The other parameters, especially after the twelfth week, did not differ significantly. Radiographs showed the inclination of the acetabular component to be significantly higher in the MIS group, but on average it was within the same permitted tolerance range as in the CON group. Both approaches are adequate for hip replacement. Given the data, there appears to be no significant long term advantage to the MIS approach, as described in this study. PMID:24191179

  10. A prospective randomized peri- and post-operative comparison of the minimally invasive anterolateral approach versus the lateral approach

    Directory of Open Access Journals (Sweden)

    Stefan Landgraeber

    2013-07-01

    Full Text Available There is still controversy as to whether minimally invasive total hip arthroplasty enhances the postoperative outcome. The aim of this study was to compare the outcome of patients who underwent total hip replacement through an anterolateral minimally invasive (MIS) or a conventional lateral approach (CON). We performed a randomized, prospective study of 75 patients with primary hip arthritis, who underwent hip replacement through the MIS (n=36) or CON (n=39) approach. The Western Ontario and McMaster Universities Osteoarthritis Index and Harris Hip score (HHS) were evaluated at frequent intervals during the early postoperative follow-up period and then after 3.5 years. Pain sensations were recorded. Serological and radiological analyses were performed. In the MIS group the patients had smaller skin incisions and there was a significantly lower rate of patients with a positive Trendelenburg sign after six weeks postoperatively. After six weeks the HHS was 6.85 points higher in the MIS group (P=0.045). However, when calculating the mean difference between the baseline and the six-week HHS, we found no significant differences. Blood loss was greater and the duration of surgery was longer in the MIS group. The other parameters, especially after the twelfth week, did not differ significantly. Radiographs showed the inclination of the acetabular component to be significantly higher in the MIS group, but on average it was within the same permitted tolerance range as in the CON group. Both approaches are adequate for hip replacement. Given the data, there appears to be no significant long term advantage to the MIS approach, as described in this study.

  11. A random sampling approach for robust estimation of tissue-to-plasma ratio from extremely sparse data.

    Science.gov (United States)

    Chu, Hui-May; Ette, Ene I

    2005-09-02

    This study was performed to develop a new nonparametric approach for the estimation of a robust tissue-to-plasma ratio from extremely sparsely sampled paired data (i.e., one sample each from plasma and tissue per subject). The tissue-to-plasma ratio was estimated from paired/unpaired experimental data using the independent time points approach, area under the curve (AUC) values calculated with the naïve data averaging approach, and AUC values calculated using sampling-based approaches (e.g., the pseudoprofile-based bootstrap [PpbB] approach and the random sampling approach [our proposed approach]). The random sampling approach involves the use of a 2-phase algorithm. The convergence of the sampling/resampling approaches was investigated, as well as the robustness of the estimates produced by the different approaches. To evaluate the latter, new data sets were generated by introducing outlier(s) into the real data set. One to 2 concentration values were inflated by 10% to 40% from their original values to produce the outliers. Tissue-to-plasma ratios computed using the independent time points approach varied between 0 and 50 across time points. The ratio obtained from AUC values acquired using the naïve data averaging approach was not associated with any measure of uncertainty or variability. Calculating the ratio without regard to pairing yielded poorer estimates. The random sampling and pseudoprofile-based bootstrap approaches yielded tissue-to-plasma ratios with uncertainty and variability. However, the random sampling approach, because of the 2-phase nature of its algorithm, yielded more robust estimates and required fewer replications. Therefore, a 2-phase random sampling approach is proposed for the robust estimation of tissue-to-plasma ratio from extremely sparsely sampled data.
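
    A rough sketch of the subject-level resampling idea (not the authors' exact two-phase algorithm) is below: each subject contributes one paired plasma/tissue observation, subjects are bootstrapped, and the tissue-to-plasma ratio is formed from trapezoidal AUCs of the pooled pseudo-profiles. All kinetic parameters are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        # one plasma and one tissue sample per subject, spread over a time grid
        times = np.tile(np.array([0.5, 1.0, 2.0, 4.0, 8.0]), 6)     # 30 subjects
        plasma = 10 * np.exp(-0.5 * times) * rng.lognormal(0, 0.2, times.size)
        tissue = 25 * np.exp(-0.5 * times) * rng.lognormal(0, 0.2, times.size)

        def auc(t, c):
            order = np.argsort(t)
            return np.trapz(c[order], t[order])   # trapezoidal AUC of pooled profile

        ratios = []
        for _ in range(2000):                     # bootstrap subjects, keeping pairing
            idx = rng.integers(0, times.size, times.size)
            ratios.append(auc(times[idx], tissue[idx]) / auc(times[idx], plasma[idx]))

        print("tissue-to-plasma ratio: %.2f (90%% CI %.2f-%.2f)"
              % (np.mean(ratios), *np.percentile(ratios, [5, 95])))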

  12. A Copula Based Approach for Design of Multivariate Random Forests for Drug Sensitivity Prediction.

    Science.gov (United States)

    Haider, Saad; Rahman, Raziur; Ghosh, Souparno; Pal, Ranadip

    2015-01-01

    Modeling sensitivity to drugs based on genetic characterizations is a significant challenge in the area of systems medicine. Ensemble-based approaches such as Random Forests have been shown to perform well in both individual sensitivity prediction studies and team science based prediction challenges. However, Random Forests generate a deterministic predictive model for each drug based on the genetic characterization of the cell lines and ignore the relationship between different drug sensitivities during model generation. This application motivates the need for multivariate ensemble learning techniques that can increase prediction accuracy and improve variable importance ranking by incorporating the relationships between different output responses. In this article, we propose a novel cost criterion that captures the dissimilarity in the output response structure between the training data and node samples as the difference in the two empirical copulas. We illustrate that copulas are suitable for capturing the multivariate structure of output responses independent of the marginal distributions, and that the copula-based multivariate random forest framework can provide higher prediction accuracy and improved variable selection. The proposed framework has been validated on the Genomics of Drug Sensitivity in Cancer and the Cancer Cell Line Encyclopedia databases.
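
    The proposed node cost, the difference between two empirical copulas, can be sketched directly. The following is an illustrative reduction, not the published implementation: responses are rank-transformed to pseudo-observations, and a candidate node is scored by the mean squared discrepancy between its empirical copula and that of the full training set on a fixed grid; the bivariate restriction and the grid_n parameter are simplifications for brevity.

      import numpy as np
      from scipy.stats import rankdata

      def pseudo_obs(Y):
          # Rank-transform each response column to (0, 1) pseudo-observations
          n = Y.shape[0]
          return np.column_stack([rankdata(Y[:, j]) / (n + 1)
                                  for j in range(Y.shape[1])])

      def empirical_copula(U, grid):
          # C(u) = fraction of pseudo-observations componentwise <= u
          return np.array([np.mean(np.all(U <= u, axis=1)) for u in grid])

      def copula_split_cost(Y_train, Y_node, grid_n=10):
          # Score a candidate node by how far its output-dependence structure
          # (empirical copula) drifts from that of the full training set.
          assert Y_train.shape[1] == 2, "bivariate responses, for brevity"
          g = np.linspace(0.1, 0.9, grid_n)
          grid = np.array([(a, b) for a in g for b in g])
          c_train = empirical_copula(pseudo_obs(Y_train), grid)
          c_node = empirical_copula(pseudo_obs(Y_node), grid)
          return float(np.mean((c_train - c_node) ** 2))

      # Toy usage: two correlated drug-sensitivity responses
      rng = np.random.default_rng(11)
      Y = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=200)
      print(copula_split_cost(Y, Y[:50]))        # a random subset scores low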

  13. A novel chaotic particle swarm optimization approach using Henon map and implicit filtering local search for economic load dispatch

    International Nuclear Information System (INIS)

    Coelho, Leandro dos Santos; Mariani, Viviana Cocco

    2009-01-01

    Particle swarm optimization (PSO) is a population-based swarm intelligence algorithm driven by the simulation of a social psychological metaphor instead of the survival of the fittest individual. Based on chaotic systems theory, this paper proposes a novel chaotic PSO combined with an implicit filtering (IF) local search method to solve economic dispatch problems. Since chaotic mappings enjoy certainty, ergodicity and stochasticity, the proposed PSO introduces chaotic sequences generated by the Henon map, which increase its convergence rate and resulting precision. The chaotic PSO approach is used to produce good potential solutions, and the IF is used to fine-tune the final solution of the PSO. The hybrid methodology is validated for a test system consisting of 13 thermal units whose incremental fuel cost function takes into account the valve-point loading effects. Simulation results are promising and show the effectiveness of the proposed approach.
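
    The core device, replacing the uniform random numbers of the PSO velocity update with a Henon-map sequence, can be sketched in a few lines. This is a minimal sketch under assumed values (the classical Henon coefficients a = 1.4 and b = 0.3, plus textbook PSO constants), not the authors' tuned implementation:

      import numpy as np

      def henon_sequence(n, a=1.4, b=0.3, x0=0.1, y0=0.1):
          # Generate n values in (0, 1) by rescaling the Henon map's x-orbit
          xs = np.empty(n)
          x, y = x0, y0
          for i in range(n):
              x, y = 1.0 - a * x * x + y, b * x
              xs[i] = x
          return (xs + 1.5) / 3.0          # x stays within about (-1.5, 1.5)

      def chaotic_pso_step(pos, vel, pbest, gbest, chaos, w=0.72, c1=1.49, c2=1.49):
          # One PSO update with chaotic numbers in place of rand()
          n, d = pos.shape
          r1 = chaos[:n * d].reshape(n, d)
          r2 = chaos[n * d:2 * n * d].reshape(n, d)
          vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
          return pos + vel, vel

      # Usage on a toy quadratic objective (a stand-in for the dispatch cost)
      rng = np.random.default_rng(1)
      n, d, steps = 20, 2, 50
      pos = rng.uniform(-5, 5, (n, d))
      vel = np.zeros_like(pos)
      pbest, pbest_val = pos.copy(), (pos ** 2).sum(axis=1)
      stream = henon_sequence(steps * 2 * n * d)   # one long chaotic stream
      for it in range(steps):
          gbest = pbest[pbest_val.argmin()]
          chunk = stream[it * 2 * n * d:(it + 1) * 2 * n * d]
          pos, vel = chaotic_pso_step(pos, vel, pbest, gbest, chunk)
          val = (pos ** 2).sum(axis=1)
          better = val < pbest_val
          pbest[better], pbest_val[better] = pos[better], val[better]
      print("best objective value found:", float(pbest_val.min()))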

  14. Historical maintenance relevant information road-map for a self-learning maintenance prediction procedural approach

    Science.gov (United States)

    Morales, Francisco J.; Reyes, Antonio; Cáceres, Noelia; Romero, Luis M.; Benitez, Francisco G.; Morgado, Joao; Duarte, Emanuel; Martins, Teresa

    2017-09-01

    A large percentage of transport infrastructures are composed of linear assets, such as roads and rail tracks. The great social and economic relevance of these constructions forces stakeholders to ensure their prolonged health and durability. Even so, inevitable malfunctions, breakdowns, and out-of-service periods arise randomly during the life cycle of the infrastructure. Predictive maintenance techniques diminish the occurrence of unpredicted failures and the execution of corrective interventions by anticipating the adequate interventions to be conducted before failures show up. This communication presents: i) a procedural approach for collecting the relevant information regarding the evolving condition of the assets involved in all maintenance interventions; this reported and stored information constitutes a rich historical database for training Machine Learning algorithms to generate reliable predictions of the interventions to be carried out in future scenarios; ii) a schematic flow chart of the automatic learning procedure; iii) self-learning rules derived automatically from false positives/negatives. The description, testing and automatic learning approach of a pilot case and its outcomes are presented; finally, some conclusions are outlined regarding the methodology proposed for improving the self-learning predictive capability.

  15. Fast approach to evaluate MAP reconstruction for lesion detection and localization

    International Nuclear Information System (INIS)

    Qi, Jinyi; Huesman, Ronald H.

    2004-01-01

    Lesion detection is an important task in emission tomography. Localization ROC (LROC) studies are often used to analyze lesion detection and localization performance. Most researchers rely on Monte Carlo reconstruction samples to obtain LROC curves, which can be very time-consuming for iterative algorithms. In this paper we develop a fast approach to obtain LROC curves that does not require Monte Carlo reconstructions. We use a channelized Hotelling observer model to search for lesions, and the results can be easily extended to other numerical observers. We theoretically analyzed the mean and covariance of the observer output. Assuming the observer outputs are multivariate Gaussian random variables, an LROC curve can be directly generated by integrating the conditional probability density functions. The high-dimensional integrals are calculated using a Monte Carlo method. The proposed approach is very fast because no iterative reconstruction is involved. Computer simulations show that the results of the proposed method match well with those obtained using the traditional LROC analysis.
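
    Under the multivariate-Gaussian assumption on the observer outputs, an LROC curve can indeed be traced without reconstructing a single image. The sketch below is a simplified stand-in with invented means, covariance and candidate-location count: it samples observer outputs under lesion-present and lesion-absent conditions and sweeps a threshold to pair the false-positive fraction with the probability of correct detection-and-localization.

      import numpy as np

      rng = np.random.default_rng(2)
      L = 9            # candidate lesion locations (hypothetical)
      true_loc = 4     # index of the location actually containing the lesion
      n_mc = 20000     # Monte Carlo draws of the observer-output vector

      # Hypothetical observer-output statistics: correlated Gaussian responses,
      # with an elevated mean at the true location when the lesion is present.
      cov = 0.3 * np.ones((L, L)) + 0.7 * np.eye(L)
      mu_absent = np.zeros(L)
      mu_present = np.zeros(L)
      mu_present[true_loc] = 1.5

      z_absent = rng.multivariate_normal(mu_absent, cov, n_mc)
      z_present = rng.multivariate_normal(mu_present, cov, n_mc)

      # Sweep a threshold on the maximum observer output over locations.
      max_a = z_absent.max(axis=1)
      max_p = z_present.max(axis=1)
      hit = z_present.argmax(axis=1) == true_loc      # correct localization
      ts = np.quantile(np.concatenate([max_a, max_p]), np.linspace(0, 1, 101))
      fpf = np.array([(max_a > t).mean() for t in ts])    # false-positive fraction
      pcl = np.array([((max_p > t) & hit).mean() for t in ts])

      area = abs(np.sum((pcl[1:] + pcl[:-1]) / 2 * np.diff(fpf)))
      print(f"area under the LROC curve ≈ {area:.3f}")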

  16. Identifying reports of randomized controlled trials (RCTs) via a hybrid machine learning and crowdsourcing approach.

    Science.gov (United States)

    Wallace, Byron C; Noel-Storr, Anna; Marshall, Iain J; Cohen, Aaron M; Smalheiser, Neil R; Thomas, James

    2017-11-01

    Identifying all published reports of randomized controlled trials (RCTs) is an important aim, but it requires extensive manual effort to separate RCTs from non-RCTs, even using current machine learning (ML) approaches. We aimed to make this process more efficient via a hybrid approach using both crowdsourcing and ML. We trained a classifier to discriminate between citations that describe RCTs and those that do not. We then adopted a simple strategy of automatically excluding citations deemed very unlikely to be RCTs by the classifier and deferring to crowdworkers otherwise. Combining ML and crowdsourcing provides a highly sensitive RCT identification strategy (our estimates suggest 95%-99% recall) with substantially less effort (we observed a reduction of around 60%-80%) than relying on manual screening alone. Hybrid crowd-ML strategies warrant further exploration for biomedical curation/annotation tasks. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
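
    The triage rule itself reduces to a few lines: automatically exclude citations the classifier scores as very unlikely to be RCTs, and defer everything else to crowdworkers. The sketch below is a minimal illustration; the classifier interface, the keyword stand-in and the threshold value are placeholders, not the calibration used in the study.

      from typing import Iterable, List, Tuple

      def triage(citations: Iterable[str], clf, threshold: float = 0.06
                 ) -> Tuple[List[str], List[str]]:
          # clf.predict_proba(text) is assumed to return P(citation is an RCT);
          # the threshold is a placeholder chosen to keep recall near 1.
          auto_excluded, to_crowd = [], []
          for c in citations:
              p_rct = clf.predict_proba(c)
              (auto_excluded if p_rct < threshold else to_crowd).append(c)
          return auto_excluded, to_crowd

      class KeywordScore:
          # Toy stand-in classifier: scores by presence of trial-like keywords
          def predict_proba(self, text: str) -> float:
              kws = ("randomized", "randomised", "placebo", "double-blind")
              return min(1.0, 0.05 + 0.3 * sum(k in text.lower() for k in kws))

      batch = ["A randomized, double-blind, placebo-controlled trial of ...",
               "A retrospective cohort study of ..."]
      excluded, crowd = triage(batch, KeywordScore())
      print(len(excluded), "auto-excluded;", len(crowd), "sent to the crowd")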

  17. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Full Text Available Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm-based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.

  18. Mapping Trends in Pedagogical Approaches and Learning Technologies: Perspectives from the Canadian, International, and Military Education Contexts

    Science.gov (United States)

    Scoppio, Grazia; Covell, Leigha

    2016-01-01

    Increased technological advances, coupled with new learners' needs, have created new realities for higher education contexts. This study explored and mapped trends in pedagogical approaches and learning technologies in postsecondary education and identified how these innovations are affecting teaching and learning practices in higher education…

  19. A dominance-based approach to map risks of ecological invasions in the presence of severe uncertainty

    Science.gov (United States)

    Denys Yemshanov; Frank H. Koch; D. Barry Lyons; Mark Ducey; Klaus Koehler

    2012-01-01

    Aim: Uncertainty has been widely recognized as one of the most critical issues in predicting the expansion of ecological invasions. The uncertainty associated with the introduction and spread of invasive organisms influences how pest management decision makers respond to expanding incursions. We present a model-based approach to map risk of ecological invasions that...

  20. The Effect of Concept Mapping-Guided Discovery Integrated Teaching Approach on Chemistry Students' Achievement and Retention

    Science.gov (United States)

    Fatokun, K. V. F.; Eniayeju, P. A.

    2014-01-01

    This study investigates the effects of Concept Mapping-Guided Discovery Integrated Teaching Approach on the achievement and retention of chemistry students. The sample comprised 162 Senior Secondary two (SS 2) students drawn from two Science Schools in Nasarawa State, Central Nigeria with equivalent mean scores of 9.68 and 9.49 in their pre-test.…

  1. A knowledge-based approach for C-factor mapping in Spain using Landsat TM and GIS

    DEFF Research Database (Denmark)

    Veihe (former Folly), Anita; Bronsveld, M.C.; Clavaux, M

    1996-01-01

    The cover and management factor (C) in the Universal Soil Loss Equation (USLE) is one of the most important parameters for assessing erosion. In this study it is shown how a knowledge-based approach can be used to optimize C-factor mapping in the Mediterranean region, being characterized...... to the limitations of the USLE...

  2. Peyton's four-step approach for teaching complex spinal manipulation techniques - a prospective randomized trial.

    Science.gov (United States)

    Gradl-Dietsch, Gertraud; Lübke, Cavan; Horst, Klemens; Simon, Melanie; Modabber, Ali; Sönmez, Tolga T; Münker, Ralf; Nebelung, Sven; Knobe, Matthias

    2016-11-03

    The objectives of this prospective randomized trial were to assess the impact of Peyton's four-step approach on the acquisition of complex psychomotor skills and to examine the influence of gender on learning outcomes. We randomly assigned 95 third to fifth year medical students to an intervention group which received instructions according to Peyton (PG) or a control group, which received conventional teaching (CG). Both groups attended four sessions on the principles of manual therapy and specific manipulative and diagnostic techniques for the spine. We assessed differences in theoretical knowledge (multiple choice (MC) exam) and practical skills (Objective Structured Practical Examination (OSPE)) with respect to type of intervention and gender. Participants took a second OSPE 6 months after completion of the course. There were no differences between groups with respect to the MC exam. Students in the PG group scored significantly higher in the OSPE. Gender had no additional impact. Results of the second OSPE showed a significant decline in competency regardless of gender and type of intervention. Peyton's approach is superior to standard instruction for teaching complex spinal manipulation skills regardless of gender. Skills retention was equally low for both techniques.

  3. Rethinking clinical language mapping approaches: discordant receptive and expressive hemispheric language dominance in epilepsy surgery candidates.

    Science.gov (United States)

    Gage, Nicole M; Eliashiv, Dawn S; Isenberg, Anna L; Fillmore, Paul T; Kurelowech, Lacey; Quint, Patti J; Chung, Jeffrey M; Otis, Shirley M

    2011-06-01

    Neuroimaging studies have shed light on cortical language organization, with findings implicating the left and right temporal lobes in speech processing converging to a left-dominant pattern. Findings highlight the fact that the state of theoretical language knowledge is ahead of current clinical language mapping methods, motivating a rethinking of these approaches. The authors used magnetoencephalography and multiple tasks in seven candidates for resective epilepsy surgery to investigate language organization. The authors scanned 12 control subjects to investigate the time course of bilateral receptive speech processes. Laterality indices were calculated for left and right hemisphere late fields ∼150 to 400 milliseconds. The authors report that (1) in healthy adults, speech processes activated superior temporal regions bilaterally converging to a left-dominant pattern, (2) in four of six patients, this was reversed, with bilateral processing converging to a right-dominant pattern, and (3) in three of four of these patients, receptive and expressive language processes were laterally discordant. Results provide evidence that receptive and expressive language may have divergent hemispheric dominance. Right-sided receptive language dominance in epilepsy patients emphasizes the need to assess both receptive and expressive language. Findings indicate that it is critical to use multiple tasks tapping separable aspects of language function to provide sensitive and specific estimates of language localization in surgical patients.

  4. Developing a model for effective leadership in healthcare: a concept mapping approach

    Science.gov (United States)

    Hargett, Charles William; Doty, Joseph P; Hauck, Jennifer N; Webb, Allison MB; Cook, Steven H; Tsipis, Nicholas E; Neumann, Julie A; Andolsek, Kathryn M; Taylor, Dean C

    2017-01-01

    Purpose Despite increasing awareness of the importance of leadership in healthcare, our understanding of the competencies of effective leadership remains limited. We used a concept mapping approach (a blend of qualitative and quantitative analysis of group processes to produce a visual composite of the group’s ideas) to identify stakeholders’ mental model of effective healthcare leadership, clarifying the underlying structure and importance of leadership competencies. Methods Literature review, focus groups, and consensus meetings were used to derive a representative set of healthcare leadership competency statements. Study participants subsequently sorted and rank-ordered these statements based on their perceived importance in contributing to effective healthcare leadership in real-world settings. Hierarchical cluster analysis of individual sortings was used to develop a coherent model of effective leadership in healthcare. Results A diverse group of 92 faculty and trainees individually rank-sorted 33 leadership competency statements. The highest rated statements were “Acting with Personal Integrity”, “Communicating Effectively”, “Acting with Professional Ethical Values”, “Pursuing Excellence”, “Building and Maintaining Relationships”, and “Thinking Critically”. Combining the results from hierarchical cluster analysis with our qualitative data led to a healthcare leadership model based on the core principle of Patient Centeredness and the core competencies of Integrity, Teamwork, Critical Thinking, Emotional Intelligence, and Selfless Service. Conclusion Using a mixed qualitative-quantitative approach, we developed a graphical representation of a shared leadership model derived in the healthcare setting. This model may enhance learning, teaching, and patient care in this important area, as well as guide future research. PMID:29355249

  5. BAUM: Improving genome assembly by adaptive unique mapping and local overlap-layout-consensus approach.

    Science.gov (United States)

    Wang, Anqi; Wang, Zhanyu; Li, Zheng; Li, Lei M

    2018-01-15

    It is highly desirable to assemble genomes of high continuity and consistency at low cost. The current bottleneck of draft genome continuity using Second Generation Sequencing (SGS) reads is primarily caused by uncertainty among repetitive sequences. Even though Single-Molecule Real-Time sequencing technology is very promising for overcoming the uncertainty issue, its relatively high cost and error rate add a burden on budget or computation. Many long-read assemblers take the overlap-layout-consensus (OLC) paradigm, which is less sensitive to sequencing errors, heterozygosity and variability of coverage. However, current assemblers of SGS data do not sufficiently take advantage of the OLC approach. Aiming at minimizing uncertainty, the proposed method, BAUM, breaks the whole genome into regions by adaptive unique mapping; then local OLC is used to assemble each region in parallel. BAUM can: (1) perform reference-assisted assembly based on the genome of a close species; (2) improve the results of existing assemblies that are obtained from short or long sequencing reads. Tests on two eukaryote genomes, a wild rice Oryza longistaminata and a parrot Melopsittacus undulatus, show that BAUM achieved substantial improvement in genome size and continuity. Besides, BAUM reconstructed a considerable amount of repetitive regions that failed to be assembled by existing short read assemblers. We also propose statistical approaches to control the uncertainty in different steps of BAUM. http://www.zhanyuwang.xin/wordpress/index.php/2017/07/21/baum. lilei@amss.ac.cn. Supplementary data are available at Bioinformatics online. © The Author (2018). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  6. Relationship between accuracy and number of samples on statistical quantity and contour map of environmental gamma-ray dose rate. Example of random sampling

    International Nuclear Information System (INIS)

    Matsuda, Hideharu; Minato, Susumu

    2002-01-01

    The accuracy of statistical quantities such as the mean value and the contour map obtained by measurement of the environmental gamma-ray dose rate was evaluated by random sampling of 5 different model distribution maps generated with the mean slope, -1.3, of the power spectra calculated from actually measured values. The values were derived from 58 natural gamma dose rate data sets reported worldwide, with means ranging over 10-100 Gy/h and areas of 10⁻³-10⁷ km². The accuracy of the mean value was around ±7% even for 60 or 80 samplings (the most frequent number), and the standard deviation had an accuracy of less than 1/4-1/3 of the mean. The correlation coefficient of the frequency distribution was 0.860 or more for 200-400 samplings (the most frequent number), but that of the contour map was 0.502-0.770. (K.H.)
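
    The skeleton of such an experiment, generating a random field whose power spectrum has the prescribed mean slope of -1.3 and checking how the accuracy of the sample mean changes with the number of random samplings, can be reproduced in outline. All sizes and the field construction below are illustrative assumptions, not the study's actual model distribution maps.

      import numpy as np

      rng = np.random.default_rng(3)

      def power_law_field(n=256, slope=-1.3):
          # 2-D random field whose power spectrum falls off as k**slope
          kx = np.fft.fftfreq(n)[:, None]
          ky = np.fft.fftfreq(n)[None, :]
          k = np.sqrt(kx**2 + ky**2)
          k[0, 0] = 1.0                      # avoid division by zero at k = 0
          amp = k ** (slope / 2.0)           # |F(k)|**2 ~ k**slope
          amp[0, 0] = 0.0                    # zero-mean field
          phase = np.exp(2j * np.pi * rng.random((n, n)))
          f = np.fft.ifft2(amp * phase).real
          return (f - f.mean()) / f.std()

      field = power_law_field()
      for m in (60, 80, 200, 400):           # sampling numbers discussed above
          idx = rng.integers(0, field.size, m)
          err = field.ravel()[idx].mean() - field.mean()
          print(f"{m:4d} samples: error of the mean = {err:+.3f}")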

  7. INTEGRATED GEOREFERENCING OF STEREO IMAGE SEQUENCES CAPTURED WITH A STEREOVISION MOBILE MAPPING SYSTEM – APPROACHES AND PRACTICAL RESULTS

    Directory of Open Access Journals (Sweden)

    H. Eugster

    2012-07-01

    Full Text Available Stereovision based mobile mapping systems enable the efficient capturing of directly georeferenced stereo pairs. With today's camera and onboard storage technologies, imagery can be captured at high data rates, resulting in dense stereo sequences. These georeferenced stereo sequences provide a highly detailed and accurate digital representation of the roadside environment which builds the foundation for a wide range of 3d mapping applications and image-based geo web-services. Georeferenced stereo images are ideally suited for the 3d mapping of street furniture and visible infrastructure objects, pavement inspection, asset management tasks or image based change detection. As in most mobile mapping systems, the georeferencing of the mapping sensors and observations – in our case of the imaging sensors – normally relies on direct georeferencing based on INS/GNSS navigation sensors. However, in urban canyons the achievable direct georeferencing accuracy of the dynamically captured stereo image sequences is often insufficient or at least degraded. Furthermore, many of the mentioned application scenarios require homogeneous georeferencing accuracy within a local reference frame over the entire mapping perimeter. To meet these demands, georeferencing approaches are presented and cost-efficient workflows are discussed which allow validating and updating the INS/GNSS-based trajectory with independently estimated positions during prolonged GNSS signal outages, in order to raise the georeferencing accuracy to the project requirements.

  8. Approximating prediction uncertainty for random forest regression models

    Science.gov (United States)

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forest are increasingly used for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches it offers no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
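
    A common way to approximate per-prediction uncertainty for a random forest regressor, in the spirit of (though not necessarily identical to) the estimator developed in this work, is to exploit the spread of the individual tree predictions. A minimal scikit-learn sketch on synthetic data:

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(4)
      X = rng.uniform(0, 10, (500, 3))
      y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.3, 500)

      rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)

      X_new = rng.uniform(0, 10, (5, 3))
      # Collect every individual tree's prediction: shape (n_trees, n_points)
      per_tree = np.stack([tree.predict(X_new) for tree in rf.estimators_])
      mean = per_tree.mean(axis=0)
      sd = per_tree.std(axis=0)          # crude per-point uncertainty proxy
      for m, s in zip(mean, sd):
          print(f"prediction {m:6.2f}  ± {2 * s:5.2f}")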

  9. Mapping Urban Green Infrastructure: A Novel Landscape-Based Approach to Incorporating Land Use and Land Cover in the Mapping of Human-Dominated Systems

    Directory of Open Access Journals (Sweden)

    Matthew Dennis

    2018-01-01

    Full Text Available Common approaches to mapping green infrastructure in urbanised landscapes invariably focus on measures of land use or land cover and associated functional or physical traits. However, such one-dimensional perspectives do not accurately capture the character and complexity of the landscapes in which urban inhabitants live. The new approach presented in this paper demonstrates how open-source, high spatial and temporal resolution data with global coverage can be used to measure and represent the landscape qualities of urban environments. Through going beyond simple metrics of quantity, such as percentage green and blue cover, it is now possible to explore the extent to which landscape quality helps to unpick the mixed evidence presented in the literature on the benefits of urban nature to human well-being. Here we present a landscape approach, employing remote sensing, GIS and data reduction techniques to map urban green infrastructure elements in a large U.K. city region. Comparison with existing urban datasets demonstrates considerable improvement in terms of coverage and thematic detail. The characterisation of landscapes, using census tracts as spatial units, and subsequent exploration of associations with social–ecological attributes highlights the further detail that can be uncovered by the approach. For example, eight urban landscape types identified for the case study city exhibited associations with distinct socioeconomic conditions accountable not only to quantities but also qualities of green and blue space. The identification of individual landscape features through simultaneous measures of land use and land cover demonstrated unique and significant associations between the former and indicators of human health and ecological condition. The approach may therefore provide a promising basis for developing further insight into processes and characteristics that affect human health and well-being in urban areas, both in the United

  10. Comparative genomics and association mapping approaches for blast resistant genes in finger millet using SSRs.

    Directory of Open Access Journals (Sweden)

    B Kalyana Babu

    Full Text Available The major limiting factor for production and productivity of the finger millet crop is blast disease caused by Magnaporthe grisea. Since the genome sequence information available for finger millet is scarce, comparative genomics plays a very important role in the identification of genes/QTLs linked to blast resistance using SSR markers. In the present study, a total of 58 genic SSRs were developed for use in genetic analysis of a global collection of 190 finger millet genotypes. The 58 SSRs yielded ninety-five scorable alleles, and the polymorphism information content varied from 0.186 to 0.677 with an average of 0.385. The gene diversity was in the range of 0.208 to 0.726 with an average of 0.487. Association mapping for blast resistance was done using 104 SSR markers, which identified four QTLs for finger blast and one QTL for neck blast resistance. The genomic marker RM262 and the genic marker FMBLEST32 were linked to finger blast disease at a P value of 0.007 and explained phenotypic variances (R²) of 10% and 8%, respectively. The genomic marker UGEP81 was associated with finger blast at a P value of 0.009 and explained an R² of 7.5%. The QTL for neck blast was associated with the genomic SSR marker UGEP18 at a P value of 0.01 and explained an R² of 11%. Three QTLs for blast resistance were found in common using both the GLM and MLM approaches. The resistant alleles were found to be present mostly in the exotic genotypes. Among the genotypes of the NW Himalayan region of India, VHC3997, VHC3996 and VHC3930 were found highly resistant and may be effectively used as parents for developing blast-resistant cultivars in that region. The markers linked to the QTLs for blast resistance identified in the present study can be further used for cloning of the full-length genes, fine mapping, and marker-assisted introgression of blast resistance alleles into locally adapted cultivars.

  11. Comparative genomics and association mapping approaches for blast resistant genes in finger millet using SSRs.

    Science.gov (United States)

    Babu, B Kalyana; Dinesh, Pandey; Agrawal, Pawan K; Sood, S; Chandrashekara, C; Bhatt, Jagadish C; Kumar, Anil

    2014-01-01

    The major limiting factor for production and productivity of the finger millet crop is blast disease caused by Magnaporthe grisea. Since the genome sequence information available for finger millet is scarce, comparative genomics plays a very important role in the identification of genes/QTLs linked to blast resistance using SSR markers. In the present study, a total of 58 genic SSRs were developed for use in genetic analysis of a global collection of 190 finger millet genotypes. The 58 SSRs yielded ninety-five scorable alleles, and the polymorphism information content varied from 0.186 to 0.677 with an average of 0.385. The gene diversity was in the range of 0.208 to 0.726 with an average of 0.487. Association mapping for blast resistance was done using 104 SSR markers, which identified four QTLs for finger blast and one QTL for neck blast resistance. The genomic marker RM262 and the genic marker FMBLEST32 were linked to finger blast disease at a P value of 0.007 and explained phenotypic variances (R²) of 10% and 8%, respectively. The genomic marker UGEP81 was associated with finger blast at a P value of 0.009 and explained an R² of 7.5%. The QTL for neck blast was associated with the genomic SSR marker UGEP18 at a P value of 0.01 and explained an R² of 11%. Three QTLs for blast resistance were found in common using both the GLM and MLM approaches. The resistant alleles were found to be present mostly in the exotic genotypes. Among the genotypes of the NW Himalayan region of India, VHC3997, VHC3996 and VHC3930 were found highly resistant and may be effectively used as parents for developing blast-resistant cultivars in that region. The markers linked to the QTLs for blast resistance identified in the present study can be further used for cloning of the full-length genes, fine mapping, and marker-assisted introgression of blast resistance alleles into locally adapted cultivars.

  12. Using an intervention mapping approach to develop a discharge protocol for intensive care patients.

    Science.gov (United States)

    van Mol, Margo; Nijkamp, Marjan; Markham, Christine; Ista, Erwin

    2017-12-19

    Admission into an intensive care unit (ICU) may result in long-term physical, cognitive, and emotional consequences for patients and their relatives. The care of the critically ill patient does not end upon ICU discharge; therefore, integrated and ongoing care during and after transition to the follow-up ward is pivotal. This study described the development of an intervention that responds to this need. Intervention Mapping (IM), a six-step theory- and evidence-based approach, was used to guide intervention development. The first step, a problem analysis, comprised a literature review, six semi-structured telephone interviews with former ICU patients and their relatives, and seven qualitative roundtable meetings for all eligible nurses (i.e., 135 specialized and 105 general ward nurses). Performance and change objectives were formulated in step two. In step three, theory-based methods and practical applications were selected and directed at the desired behaviors and the identified barriers. Step four designed a revised discharge protocol taking into account existing interventions. Adoption, implementation and evaluation of the new discharge protocol (IM steps five and six) are in progress and were not included in this study. Four former ICU patients and two relatives underlined the importance of the need for effective discharge information and supportive written material. They also reported a lack of knowledge regarding the consequences of ICU admission. Forty-two ICU and 19 general ward nurses identified benefits and barriers regarding discharge procedures using three vignettes framed by literature. Some discrepancies were found. For example, ICU nurses were skeptical about the impact of writing a lay summary despite extensive evidence of the known benefits for the patients. ICU nurses anticipated having insufficient skills, not knowing the patient well enough, and fearing legal consequences of their writings. The intervention was designed to target the knowledge

  13. Educational Approach to Seismic Risk Mitigation in Indian Himalayas -Hazard Map Making Workshops at High Schools-

    Science.gov (United States)

    Koketsu, K.; Oki, S.; Kimura, M.; Chadha, R. K.; Davuluri, S.

    2014-12-01

    How can we encourage people to take preventive measures against damage risks and empower them to take the right actions in emergencies to save their lives? The conventional approach taken by scientists had been disseminating intelligible information on up-to-date seismological knowledge. However, it has been proven that knowledge alone does not have enough impact to modify people's behaviors in emergencies (Oki and Nakayachi, 2012). On the other hand, the conventional approach taken by practitioners had been to conduct emergency drills at schools or workplaces. The loss of many lives from the 2011 Tohoku earthquake has proven that these emergency drills were not enough to save people's lives, unless they were empowered to assess the given situation on their own and react flexibly. Our challenge is to bridge the gap between knowledge and practice. With reference to best practices observed in Tohoku, such as The Miracles of Kamaishi, our endeavor is to design an effective Disaster Preparedness Education Program that is applicable to other disaster-prone regions in the world, even with different geological, socio-economical and cultural backgrounds. The key concepts for this new approach are 1) empowering individuals to take preventive actions to save their lives, 2) granting community-based understanding of disaster risks and 3) building a sense of reality and relevancy to disasters. With these in mind, we held workshops at some high schools in the Lesser Himalayan Region, combining lectures with an activity called "Hazard Map Making" where students proactively identify and assess the hazards around their living areas and learn practical strategies on how to manage risks. We observed the change of awareness of the students by conducting a preliminary questionnaire survey and interviews after each session. Results strongly implied that the significant change of students' attitudes towards disaster preparedness occurred not by the lectures of scientific knowledge, but

  14. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    Science.gov (United States)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.

  15. Mapping Annual Precipitation across Mainland China in the Period 2001–2010 from TRMM3B43 Product Using Spatial Downscaling Approach

    Directory of Open Access Journals (Sweden)

    Yuli Shi

    2015-05-01

    Full Text Available Spatially explicit precipitation data are often responsible for the prediction accuracy of hydrological and ecological models. Several statistical downscaling approaches have been developed to map precipitation at a high spatial resolution, mainly based on valid conjugations between satellite-derived precipitation data and geospatial predictors. The performance of the existing approaches should first be evaluated before applying them to larger spatial extents with complex terrain across different climate zones. In this paper, we investigate statistical downscaling algorithms to derive high-spatial-resolution maps of precipitation over continental China using satellite datasets, including the Normalized Difference Vegetation Index (NDVI) from the Moderate Resolution Imaging Spectroradiometer (MODIS), the Global Digital Elevation Model (GDEM) from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), and the rainfall product from the Tropical Rainfall Measuring Mission (TRMM). We compare three statistical techniques (multiple linear regression, exponential regression, and Random Forest regression trees) for modeling precipitation to better understand how the selected model type affects the prediction accuracy. These models are then used to downscale the original TRMM product (3B43; 0.25° resolution) onto finer grids (1 × 1 km²) of precipitation. Finally, we validate the downscaled annual precipitation (a wet year, 2001, and a dry year, 2010) against ground rainfall observations from 596 rain gauge stations over continental China. The results indicate that the downscaling algorithm based on Random Forest regression outperforms the linear regression and the exponential regression. They also show that the addition of the residual terms does not significantly improve the accuracy of the results for the RF model. The analysis of the variable importance reveals the NDVI related predictors
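
    The downscaling workflow, fitting precipitation against environmental predictors at the coarse TRMM resolution and then applying the model with fine-resolution predictors, can be sketched with scikit-learn. Synthetic arrays stand in for the TRMM, NDVI and GDEM grids; this illustrates the shape of the pipeline, not the paper's calibrated model.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(5)

      # Synthetic stand-ins for coarse (TRMM-like) grids of predictors/precipitation
      ndvi_c = rng.random(400)
      dem_c = rng.random(400) * 3000.0
      precip_c = 800 + 900 * ndvi_c - 0.1 * dem_c + rng.normal(0, 40, 400)

      # 1. Fit precipitation ~ (NDVI, elevation) at the coarse resolution
      rf = RandomForestRegressor(n_estimators=300, random_state=0)
      rf.fit(np.column_stack([ndvi_c, dem_c]), precip_c)

      # 2. Coarse-scale residuals (the study found adding them changed little
      #    for the RF model, so they are computed but not reapplied here)
      resid_c = precip_c - rf.predict(np.column_stack([ndvi_c, dem_c]))

      # 3. Predict at the fine (1 km-like) resolution with fine-scale predictors
      ndvi_f = rng.random(10000)
      dem_f = rng.random(10000) * 3000.0
      precip_f = rf.predict(np.column_stack([ndvi_f, dem_f]))
      print("downscaled sample:", precip_f[:5].round(1))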

  16. A Remote Sensing Approach for Regional-Scale Mapping of Agricultural Land-Use Systems Based on NDVI Time Series

    Directory of Open Access Journals (Sweden)

    Beatriz Bellón

    2017-06-01

    Full Text Available In response to the need for generic remote sensing tools to support large-scale agricultural monitoring, we present a new approach for regional-scale mapping of agricultural land-use systems (ALUS based on object-based Normalized Difference Vegetation Index (NDVI time series analysis. The approach consists of two main steps. First, to obtain relatively homogeneous land units in terms of phenological patterns, a principal component analysis (PCA is applied to an annual MODIS NDVI time series, and an automatic segmentation is performed on the resulting high-order principal component images. Second, the resulting land units are classified into the crop agriculture domain or the livestock domain based on their land-cover characteristics. The crop agriculture domain land units are further classified into different cropping systems based on the correspondence of their NDVI temporal profiles with the phenological patterns associated with the cropping systems of the study area. A map of the main ALUS of the Brazilian state of Tocantins was produced for the 2013–2014 growing season with the new approach, and a significant coherence was observed between the spatial distribution of the cropping systems in the final ALUS map and in a reference map extracted from the official agricultural statistics of the Brazilian Institute of Geography and Statistics (IBGE. This study shows the potential of remote sensing techniques to provide valuable baseline spatial information for supporting agricultural monitoring and for large-scale land-use systems analysis.

  17. Ontology Mapping Neural Network: An Approach to Learning and Inferring Correspondences among Ontologies

    Science.gov (United States)

    Peng, Yefei

    2010-01-01

    An ontology mapping neural network (OMNN) is proposed in order to learn and infer correspondences among ontologies. It extends the Identical Elements Neural Network (IENN)'s ability to represent and map complex relationships. The learning dynamics of simultaneous (interlaced) training of similar tasks interact at the shared connections of the…

  18. A two-stage approach for improved prediction of residue contact maps

    Directory of Open Access Journals (Sweden)

    Pollastri Gianluca

    2006-03-01

    Full Text Available Abstract Background Protein topology representations such as residue contact maps are an important intermediate step towards ab initio prediction of protein structure. Although improvements have occurred over the last years, the problem of accurately predicting residue contact maps from primary sequences is still largely unsolved. Among the reasons for this are the unbalanced nature of the problem (with far fewer examples of contacts than non-contacts), the formidable challenge of capturing long-range interactions in the maps, and the intrinsic difficulty of mapping one-dimensional input sequences into two-dimensional output maps. In order to alleviate these problems and achieve improved contact map predictions, in this paper we split the task into two stages: the prediction of a map's principal eigenvector (PE) from the primary sequence; and the reconstruction of the contact map from the PE and the primary sequence. Predicting the PE from the primary sequence consists in mapping a vector into a vector. This task is less complex than mapping vectors directly into two-dimensional matrices, since the size of the problem is drastically reduced and so is the scale length of the interactions that need to be learned. Results We develop architectures composed of ensembles of two-layered bidirectional recurrent neural networks to classify the components of the PE in 2, 3 and 4 classes from the protein primary sequence, predicted secondary structure, and hydrophobicity interaction scales. Our predictor, tested on a non-redundant set of 2171 proteins, achieves classification performances of up to 72.6%, 16% above a baseline statistical predictor. We design a system for the prediction of contact maps from the predicted PE. Our results show that predicting maps through the PE yields sizeable gains, especially for long-range contacts, which are particularly critical for accurate protein 3D reconstruction. The final predictor's accuracy on a non-redundant set of 327 targets is 35
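
    As a toy illustration of why the principal eigenvector is a useful intermediate, the sketch below reconstructs a crude contact map from the PE of a synthetic map via a thresholded rank-one approximation, the starting point that the paper's second stage refines with sequence information. The map, its density and the threshold rule are invented for illustration only.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 60
      # Toy symmetric binary "contact map" (random, for mechanics only)
      M = (rng.random((n, n)) < 0.08).astype(float)
      M = np.triu(M, 1)
      M = M + M.T

      w, V = np.linalg.eigh(M)
      lam, pe = w[-1], V[:, -1]              # principal eigenpair

      # Rank-one approximation from the PE, thresholded to the true density
      approx = lam * np.outer(pe, pe)
      thresh = np.quantile(approx, 1 - M.mean())
      recon = (approx > thresh).astype(float)
      recovered = (recon * M).sum() / M.sum()
      print(f"fraction of true contacts recovered: {recovered:.2f}")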

  19. From medium heterogeneity to flow and transport: A time-domain random walk approach

    Science.gov (United States)

    Hakoun, V.; Comolli, A.; Dentz, M.

    2017-12-01

    The prediction of flow and transport processes in heterogeneous porous media is based on the qualitative and quantitative understanding of the interplay between 1) spatial variability of hydraulic conductivity, 2) groundwater flow and 3) solute transport. Using a stochastic modeling approach, we study this interplay through direct numerical simulations of Darcy flow and advective transport in heterogeneous media. First, we study flow in correlated hydraulic permeability fields and shed light on the relationship between the statistics of log-hydraulic conductivity, a medium attribute, and the flow statistics. Second, we determine relationships between Eulerian and Lagrangian velocity statistics, that is, between flow and transport attributes. We show how Lagrangian statistics, and thus transport behaviors such as late particle arrival times, are influenced by the medium heterogeneity on one hand and the initial particle velocities on the other. We find that equidistantly sampled Lagrangian velocities can be described by a Markov process that evolves on the characteristic heterogeneity length scale. We employ a stochastic relaxation model for the equidistantly sampled particle velocities, which is parametrized by the velocity correlation length. This description results in a time-domain random walk model for the particle motion, whose spatial transitions are characterized by the velocity correlation length and whose temporal transitions are characterized by the particle velocities. This approach relates the statistical medium and flow properties to large scale transport, and allows for conditioning on the initial particle velocities and thus on the medium properties in the injection region. The approach is tested against direct numerical simulations.
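
    The resulting model can be sketched as a one-dimensional time-domain random walk: log-velocities relax as a Markov chain over the correlation length, and each fixed space increment costs a time increment of distance over velocity. The Ornstein-Uhlenbeck-style relaxation and all parameter values below are illustrative assumptions, not fitted values.

      import numpy as np

      rng = np.random.default_rng(7)

      def tdrw_arrival_times(n_particles=1000, domain_len=100.0, ds=1.0,
                             ell_c=5.0, mu_lnv=0.0, sigma_lnv=1.0):
          # Arrival times at x = domain_len for a 1-D time-domain random walk:
          # log-velocities relax toward a stationary normal law over the
          # correlation length ell_c (Ornstein-Uhlenbeck in distance).
          rho = np.exp(-ds / ell_c)              # step-to-step correlation
          n_steps = int(domain_len / ds)
          w = rng.normal(size=n_particles)       # standardized log-velocity
          t = np.zeros(n_particles)
          for _ in range(n_steps):
              v = np.exp(mu_lnv + sigma_lnv * w) # current velocity
              t += ds / v                        # time to cross the increment ds
              w = rho * w + np.sqrt(1 - rho**2) * rng.normal(size=n_particles)
          return t

      times = tdrw_arrival_times()
      print("median arrival time:", round(float(np.median(times)), 2))
      print("95th percentile (late arrivals):", round(float(np.percentile(times, 95)), 2))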

  20. A METHOD TO ESTIMATE TEMPORAL INTERACTION IN A CONDITIONAL RANDOM FIELD BASED APPROACH FOR CROP RECOGNITION

    Directory of Open Access Journals (Sweden)

    P. M. A. Diaz

    2016-06-01

    Full Text Available This paper presents a method to estimate the temporal interaction in a Conditional Random Field (CRF) based approach for crop recognition from multitemporal remote sensing image sequences. This approach models the phenology of different crop types as a CRF. Interaction potentials are assumed to depend only on the class labels of an image site at two consecutive epochs. In the proposed method, the estimation of the temporal interaction parameters is cast as an optimization problem whose goal is to find the transition matrix that maximizes the CRF performance on a set of labelled data. The objective functions underlying the optimization procedure can be formulated in terms of different accuracy metrics, such as overall and average class accuracy per crop or phenological stage. To validate the proposed approach, experiments were carried out on a dataset consisting of 12 co-registered LANDSAT images of a region in the southeast of Brazil. Pattern Search was used as the optimization algorithm. The experimental results demonstrated that the proposed method substantially outperformed estimates based on joint or conditional class transition probabilities, which rely on training samples.
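
    The estimation step can be framed exactly as described: search over transition matrices for the one that maximizes accuracy on labelled data. The stand-in below reduces the CRF to per-epoch class scores plus a pairwise temporal term decoded by Viterbi, and uses scipy's Nelder-Mead in place of the Pattern Search algorithm used in the paper; all data and scores are synthetic.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(8)
      C, E, N = 3, 4, 500          # classes, epochs, image sites (synthetic)

      # Synthetic ground truth with temporal persistence, plus noisy
      # per-epoch class scores standing in for the association potentials
      true = np.zeros((N, E), dtype=int)
      true[:, 0] = rng.integers(0, C, N)
      for e in range(1, E):
          stay = rng.random(N) < 0.9           # crops mostly persist
          true[:, e] = np.where(stay, true[:, e - 1], rng.integers(0, C, N))
      unary = np.full((N, E, C), np.log(0.15))
      unary[np.arange(N)[:, None], np.arange(E)[None, :], true] = np.log(0.7)

      def viterbi_accuracy(theta):
          # Decode every site with the transition matrix softmax(theta)
          T = theta.reshape(C, C)
          logT = T - np.log(np.exp(T).sum(axis=1, keepdims=True))
          score = unary[:, 0, :].copy()
          back = np.zeros((E, N, C), dtype=int)
          for e in range(1, E):
              cand = score[:, :, None] + logT[None, :, :] + unary[:, e, None, :]
              back[e] = cand.argmax(axis=1)
              score = cand.max(axis=1)
          labels = np.zeros((N, E), dtype=int)
          labels[:, -1] = score.argmax(axis=1)
          for e in range(E - 1, 0, -1):
              labels[:, e - 1] = back[e, np.arange(N), labels[:, e]]
          return (labels == true).mean()

      # Maximize labelled-data accuracy over the transition parameters
      # (Nelder-Mead as a stand-in for the Pattern Search used in the paper)
      res = minimize(lambda th: -viterbi_accuracy(th), np.zeros(C * C),
                     method="Nelder-Mead", options={"maxiter": 300})
      print("best decoding accuracy:", round(-res.fun, 3))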

  1. When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

    KAUST Repository

    Liu, Xiao

    2017-03-21

    Privacy risks of recommender systems have attracted increasing attention. Users' private data is often collected by a possibly untrusted recommender system in order to provide high-quality recommendations. Meanwhile, malicious attackers may utilize recommendation results to make inferences about other users' private data. Existing approaches focus either on keeping users' private data protected during recommendation computation or on preventing the inference of any single user's data from the recommendation result. However, none is designed for both hiding users' private data and preventing privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems that combines differential privacy (DP) with randomized perturbation (RP). We theoretically show that the noise added by RP has limited effect on recommendation accuracy and that the noise added by DP can be well controlled based on a sensitivity analysis of the functions on the perturbed data. Extensive experiments on three large-scale real-world datasets show that the hybrid approach generally provides more privacy protection with acceptable loss of recommendation accuracy, and surprisingly sometimes achieves better privacy without sacrificing accuracy, thus validating its feasibility in practice.
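
    The combination can be illustrated at the data level: randomized perturbation masks each user's raw ratings before they leave the client, and a calibrated Laplace mechanism makes an aggregate computation differentially private. The noise scales, the uniform RP distribution and the per-item mean aggregate below are hypothetical choices for illustration, not the paper's full protocol.

      import numpy as np

      rng = np.random.default_rng(9)

      def rp_perturb(ratings, scale=0.5):
          # Randomized perturbation: mask raw ratings on the client side
          return ratings + rng.uniform(-scale, scale, size=ratings.shape)

      def dp_release(values, sensitivity, epsilon):
          # Laplace mechanism on an aggregate with known sensitivity
          return values + rng.laplace(0.0, sensitivity / epsilon,
                                      size=np.shape(values))

      # Toy data: 100 users x 20 items on a 1-5 scale, 0 marks "not rated"
      R = rng.integers(0, 6, (100, 20)).astype(float)
      R_protected = np.where(R > 0, rp_perturb(R), 0.0)

      counts = np.maximum((R_protected > 0).sum(axis=0), 1)
      item_mean = R_protected.sum(axis=0) / counts
      # Changing one rating in [1, 5] moves a per-item mean by at most 4/counts
      private_mean = dp_release(item_mean, sensitivity=4.0 / counts, epsilon=1.0)
      print("DP-protected item means:", private_mean[:5].round(2))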

  2. Water-sanitation-hygiene mapping: an improved approach for data collection at local level.

    Science.gov (United States)

    Giné-Garriga, Ricard; de Palencia, Alejandro Jiménez-Fernández; Pérez-Foguet, Agustí

    2013-10-01

    Strategic planning and appropriate development and management of water and sanitation services are strongly supported by accurate and accessible data. If adequately exploited, these data might assist water managers with performance monitoring, benchmarking comparisons, policy progress evaluation, resources allocation, and decision making. A variety of tools and techniques are in place to collect such information. However, some methodological weaknesses arise when developing an instrument for routine data collection, particularly at local level: i) comparability problems due to heterogeneity of indicators, ii) poor reliability of collected data, iii) inadequate combination of different information sources, and iv) statistical validity of produced estimates when disaggregated into small geographic subareas. This study proposes an improved approach for water, sanitation and hygiene (WASH) data collection at decentralised level in low income settings, as an attempt to overcome previous shortcomings. The ultimate aim is to provide local policymakers with strong evidence to inform their planning decisions. The survey design takes the Water Point Mapping (WPM) as a starting point to record all available water sources at a particular location. This information is then linked to data produced by a household survey. Different survey instruments are implemented to collect reliable data by employing a variety of techniques, such as structured questionnaires, direct observation and water quality testing. The collected data is finally validated through simple statistical analysis, which in turn produces valuable outputs that might feed into the decision-making process. In order to demonstrate the applicability of the method, outcomes produced from three different case studies (Homa Bay District, Kenya; Kibondo District, Tanzania; and the Municipality of Manhiça, Mozambique) are presented. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Depletion benchmarks calculation of random media using explicit modeling approach of RMC

    International Nuclear Information System (INIS)

    Liu, Shichang; She, Ding; Liang, Jin-gang; Wang, Kan

    2016-01-01

    Highlights: • Explicit modeling of RMC is applied to a depletion benchmark for the HTGR fuel element. • Explicit modeling can provide detailed burnup distribution and burnup heterogeneity. • The results would serve as a supplement for the HTGR fuel depletion benchmark. • The method of combining adjacent burnup regions is proposed for full-core problems. • The combination method can reduce the memory footprint while keeping the computing accuracy. - Abstract: The Monte Carlo method plays an important role in the accurate simulation of random media, owing to its advantages of flexible geometry modeling and the use of continuous-energy nuclear cross sections. Three stochastic geometry modeling methods, including the Random Lattice Method, Chord Length Sampling and an explicit modeling approach with a mesh acceleration technique, have been implemented in RMC to simulate particle transport in dispersed fuels, among which the explicit modeling method is regarded as the best choice. In this paper, the explicit modeling method is applied to the depletion benchmark for the HTGR fuel element, and a method of combining adjacent burnup regions has been proposed and investigated. The results show that explicit modeling can provide detailed burnup distributions of individual TRISO particles, and this work would serve as a supplement for the HTGR fuel depletion benchmark calculations. The combination of adjacent burnup regions can effectively reduce the memory footprint while keeping the computational accuracy.

  4. Random Forest Approach to QSPR Study of Fluorescence Properties Combining Quantum Chemical Descriptors and Solvent Conditions.

    Science.gov (United States)

    Chen, Chia-Hsiu; Tanaka, Kenichi; Funatsu, Kimito

    2018-04-22

    A Quantitative Structure-Property Relationship (QSPR) approach was applied to study the fluorescence absorption and emission wavelengths of 413 fluorescent dyes in different solvent conditions. The dyes included chromophore derivatives of cyanine, xanthene, coumarin, pyrene, naphthalene, anthracene, etc., with wavelengths ranging from 250 nm to 800 nm. An ensemble method, random forest (RF), was employed to construct nonlinear prediction models, compared with the results of linear partial least squares and nonlinear support vector machine regression models. Quantum chemical descriptors derived from the density functional theory method and solvent information were also used in constructing the models. The best prediction results were obtained from the RF model, with squared correlation coefficients (R²) of 0.940 and 0.905 for λabs and λem, respectively. The descriptors used in the models are discussed in detail in this report by comparing the feature importance of the RF.

  5. Dynamic Load Balanced Clustering using Elitism based Random Immigrant Genetic Approach for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    K. Mohaideen Pitchai

    2017-07-01

    Full Text Available A Wireless Sensor Network (WSN) consists of a large number of small sensors with restricted energy. Prolonged network lifespan, scalability, node mobility and load balancing are important needs for several WSN applications. Clustering the sensor nodes is an efficient technique to reach these goals. WSNs have the characteristic of topology dynamics because of factors like energy conservation and node movement, which leads to the Dynamic Load Balanced Clustering Problem (DLBCP). In this paper, an Elitism based Random Immigrant Genetic Approach (ERIGA) is proposed to solve the DLBCP, which adapts to topology dynamics. ERIGA uses dynamic Genetic Algorithm (GA) components for solving the DLBCP. The performance of the load balanced clustering process is enhanced with the help of this dynamic GA. As a result, ERIGA elects suitable cluster heads which balance the network load and increase the lifespan of the network.
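
    The two GA components named in the title, elitism and random immigrants, are standard devices for dynamic environments and can be sketched generically. The sketch below is not the paper's clustering encoding; the bit-string chromosomes, the placeholder fitness and the rates are all assumptions.

      import numpy as np

      rng = np.random.default_rng(10)

      def erig_step(pop, fitness, elite_frac=0.1, immigrant_frac=0.2, mut_rate=0.02):
          # One generation: keep elites, refill by fitness-proportionate
          # selection plus mutation, and inject random immigrants so the
          # population can track a changing (dynamic) environment.
          n, L = pop.shape
          order = np.argsort(fitness)[::-1]            # maximizing fitness
          n_elite = max(1, int(elite_frac * n))
          n_imm = int(immigrant_frac * n)
          elites = pop[order[:n_elite]]
          p = fitness - fitness.min() + 1e-9
          idx = rng.choice(n, size=n - n_elite - n_imm, p=p / p.sum())
          children = pop[idx].copy()
          flips = rng.random(children.shape) < mut_rate
          children ^= flips                            # bit-flip mutation
          immigrants = rng.integers(0, 2, (n_imm, L), dtype=pop.dtype)
          return np.vstack([elites, children, immigrants])

      # Toy run on bit-string chromosomes with a placeholder fitness
      pop = rng.integers(0, 2, (50, 30), dtype=np.int64)
      for gen in range(20):
          fitness = pop.sum(axis=1).astype(float)      # placeholder objective
          pop = erig_step(pop, fitness)
      print("best placeholder fitness:", int(pop.sum(axis=1).max()))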

  6. Relation of project managers' personality and project performance: An approach based on value stream mapping

    Directory of Open Access Journals (Sweden)

    Maurizio Bevilacqua

    2014-09-01

    Full Text Available Purpose: This work investigates the influence of project managers' personality on the success of a project in a Multinational Corporation. The methodology proposed for analyzing the project managers' personality is based on the Myers-Briggs Type Indicator. Design/methodology/approach: Forty projects carried out in 2012 by a multinational corporation, concerning new product development (NPD), have been analyzed, comparing the profile of project managers with results obtained in terms of traditional performance indexes (time delay and over-budget of projects) and performance indexes usually used in the "Lean Production" sector (waste time and type of "wastes"). A detailed analysis of the most important "wastes" during project development is carried out using the Value Stream Mapping (VSM) technique. Findings and Originality/value: Relying on the Myers-Briggs personality instrument, results show that extroverted managers (as opposed to introverted managers) carry out projects that show lower delay and less waste time. Introverted managers often produce "Over-processing" and "Defect" types of waste. Moreover, projects run by perceiving managers showed lower delay and over-budget. Research limitations: Regarding the limitations of this work, it is necessary to highlight that we collected data from project managers retrospectively. While we believe that several aspects of our data collection effort helped enhance the accuracy of the results, future research could conduct real-time case study research to get more detailed insights into the proposed relationships and avoid retrospective bias. Moreover, we focused on a single respondent, the project manager. This helped us ensure that their interpretations played an important role in product development, but we could not examine the opinions of team members, which could differ from the project managers' opinions on some questions. Originality/value: This research provides insight useful

  7. Elearning approaches to prevent weight gain in young adults: A randomized controlled study.

    Science.gov (United States)

    Nikolaou, Charoula Konstantia; Hankey, Catherine Ruth; Lean, Michael Ernest John

    2015-12-01

    Preventing obesity among young adults should be a preferred public health approach given the limited efficacy of treatment interventions. This study examined whether weight gain can be prevented by online approaches using two different behavioral models, one overtly directed at obesity and the other covertly. A three-group parallel randomized controlled intervention was conducted in 2012-2013; 20,975 young adults were allocated a priori to one control and two "treatment" groups. The two treatment groups were offered online courses over 19 weeks on (1) personal weight control ("Not the Ice Cream Van," NTICV) and (2) political, environmental, and social issues around food ("Goddess Demetra," "GD"). The control group received no contact. The primary outcome was weight change over 40 weeks. Forty-week weight changes differed between groups (P < 0.001): control (n = 2,134): +2.0 kg (95% CI = 1.5, 2.3); NTICV (n = 1,810): -1.0 kg (95% CI = -1.3, -0.5); and GD (n = 2,057): -1.35 kg (95% CI = -1.4, -0.7). Relative risks for weight gain were 0.13 for NTICV (95% CI = 0.10, 0.15; P < 0.0001) and 0.07 for GD (95% CI = 0.05, 0.10; P < 0.0001) versus control. Both interventions were associated with prevention of the weight gain observed among control subjects. This low-cost intervention could be widely transferable as one tool against the obesity epidemic. Outside the randomized controlled trial setting, it could be enhanced using supporting advertising and social media. © 2015 The Obesity Society.

  8. A new approach to analyze strategy map using an integrated BSC and FUZZY DEMATEL

    Directory of Open Access Journals (Sweden)

    Seyed Abdollah Heydariyeh

    2012-01-01

    Full Text Available Today, with ever-increasing competition in global economic conditions, the effective implementation of a strategy map has become inevitable and necessary. The strategy map represents a general and structured framework for strategic objectives and plays an important role in forming competitive advantages for organizations. It is important to find the factors influencing the strategy map and to prioritize them on a suitable basis. In this paper, we propose an integration of the BSC and the Fuzzy DEMATEL technique to rank the different items influencing the strategy of a production plan. The proposed technique is implemented for a real-world case study of glass production.
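
    The crisp core of DEMATEL (without the fuzzy extension used in the paper) is a pair of matrix operations: normalize the direct-influence matrix, form the total-relation matrix T = N(I - N)^-1, and read off each factor's prominence (r + c) and net cause/effect role (r - c). A small sketch with an invented four-factor influence matrix:

      import numpy as np

      # Hypothetical direct-influence matrix among four strategy-map factors
      D = np.array([[0, 3, 2, 1],
                    [1, 0, 3, 2],
                    [2, 1, 0, 3],
                    [1, 2, 1, 0]], dtype=float)

      # Normalize so that the largest row/column sum is scaled to 1
      N = D / max(D.sum(axis=1).max(), D.sum(axis=0).max())

      # Total-relation matrix T = N (I - N)^-1 sums the whole influence chain
      T = N @ np.linalg.inv(np.eye(len(D)) - N)

      r = T.sum(axis=1)          # influence each factor dispatches
      c = T.sum(axis=0)          # influence each factor receives
      for i, (p, q) in enumerate(zip(r + c, r - c)):
          print(f"factor {i}: prominence {p:.2f}, net cause/effect {q:+.2f}")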

  9. Comparing Kriging and Regression Approaches for Mapping Soil Clay Content in a diverse Danish Landscape

    DEFF Research Database (Denmark)

    Adhikari, Kabindra; Bou Kheir, Rania; Greve, Mette Balslev

    2013-01-01

    Information on the spatial variability of soil texture, including soil clay content, in a landscape is very important for agricultural and environmental use. Different prediction techniques are available to assess and map the spatial variability of soil properties, but selecting the most suitable technique at a given site has always been a major issue in all soil mapping applications. We studied the prediction performance of ordinary kriging (OK), stratified OK (OKst), regression trees (RT), and rule-based regression kriging (RKrr) for digital mapping of soil clay content at 30.4-m grid size using 6... the prediction in OKst compared with that in OK, whereas RT showed the lowest performance of all (R2 = 0.52; RMSE = 0.52; and RPD = 1.17). We found RKrr to be an effective prediction method and recommend this method for any future soil mapping activities in Denmark.

  10. Comparing human and automatic thesaurus mapping approaches in the agricultural domain

    NARCIS (Netherlands)

    Lauser, B.; Johannsen, G.; Caracciolo, C.; Hage, W.R. van; Keizer, J.; Mayr, P.

    2008-01-01

    Knowledge organization systems (KOS), like thesauri and other controlled vocabularies, are used to provide subject access to information systems across the web. Due to the heterogeneity of these systems, mapping between vocabularies becomes crucial for retrieving relevant information. However,

  11. A National Approach to Quantify and Map Biodiversity Conservation Metrics within an Ecosystem Services Framework

    Science.gov (United States)

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have be...

  12. An optimization approach for extracting and encoding consistent maps in a shape collection

    KAUST Repository

    Huang, Qi-Xing; Zhang, Guo-Xin; Gao, Lin; Hu, Shi-Min; Butscher, Adrian; Guibas, Leonidas

    2012-01-01

    ... by compositions of maps through a single base shape. In general, multiple base shapes are needed to adequately cover a diverse collection. Our algorithm sequentially extracts such a small collection of base shapes and creates correspondences from each ...

  13. A National Approach for Mapping and Quantifying Habitat-based Biodiversity Metrics Across Multiple Spatial Scales

    Science.gov (United States)

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national inte...

  14. Collaborative and Multilingual Approach to Learn Database Topics Using Concept Maps

    Science.gov (United States)

    Calvo, Iñaki

    2014-01-01

    The authors report on a study using the concept mapping technique in computer engineering education for learning theoretical introductory database topics. In addition, the learning of multilingual technical terminology by means of the collaborative drawing of a concept map is also pursued in this experiment. The main characteristics of a study carried out in the database subject at the University of the Basque Country during the 2011/2012 academic year are described. This study contributes to the field of concept mapping, as these kinds of cognitive tools have proved valid for supporting learning in computer engineering education. It also contributes to the field of computer engineering education, providing a technique that can be incorporated with several educational purposes within the discipline. Results reveal the potential that a collaborative concept map editor offers to fulfil the above-mentioned objectives. PMID:25538957

  15. A concept mapping approach to guide and understand dissemination and implementation.

    Science.gov (United States)

    Green, Amy E; Fettes, Danielle L; Aarons, Gregory A

    2012-10-01

    Many efforts to implement evidence-based programs do not reach their full potential or fail due to the variety of challenges inherent in dissemination and implementation. This article describes the use of concept mapping (a mixed-method strategy) to study the implementation of behavioral health innovations and evidence-based practice (EBP). The application of concept mapping to implementation research represents a practical and concise way to identify and quantify factors affecting implementation, develop conceptual models of implementation, target areas to address as part of implementation readiness and active implementation, and foster communication among stakeholders. Concept mapping is described and a case example is provided to illustrate its use in an implementation study. Implications for the use of concept mapping methods in both research and applied settings towards the dissemination and implementation of behavioral health services are discussed.

  16. Similarity and accuracy of mental models formed during nursing handovers: A concept mapping approach.

    Science.gov (United States)

    Drach-Zahavy, Anat; Broyer, Chaya; Dagan, Efrat

    2017-09-01

    Shared mental models are crucial for constructing mutual understanding of the patient's condition during a clinical handover. Yet scant research, if any, has empirically explored the mental models of the parties involved in a clinical handover. This study aimed to examine the similarities among the mental models of incoming and outgoing nurses, and to test their accuracy by comparing them with the mental models of expert nurses. A cross-sectional study explored nurses' mental models via the concept mapping technique across 40 clinical handovers. Data were collected via concept mapping of the incoming, outgoing, and expert nurses' mental models (a total of 120 concept maps). Similarity and accuracy indexes for concepts and associations were calculated to compare the different maps. About one fifth of the concepts emerged in both outgoing and incoming nurses' concept maps (concept similarity=23%±10.6). Concept accuracy indexes were 35%±18.8 for incoming and 62%±19.6 for outgoing nurses' maps. Although incoming nurses absorbed a smaller number of concepts and associations (23% and 12%, respectively), they partially closed the gap (35% and 22%, respectively) relative to the expert nurses' maps. The correlations between concept similarities and incoming as well as outgoing nurses' concept accuracy were significant (r=0.43, p<0.01; r=0.68, p<0.01, respectively). Finally, in 90% of the maps, outgoing nurses added information concerning the processes enacted during the shift, beyond the expert nurses' gold standard. Two seemingly contradictory processes in the handover were identified: "information loss", captured by the low similarity indexes among the mental models of incoming and outgoing nurses; and "information restoration", based on the accuracy indexes among the mental models of the incoming nurses. Based on mental model theory, we propose possible explanations for these processes and derive implications for how to improve a clinical handover. Copyright © 2017 Elsevier Ltd. All rights reserved.
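
    A Jaccard-style overlap of concept sets gives a rough feel for the similarity index reported above; the exact index used in the study may differ, and the concept sets below are purely illustrative. A minimal Python sketch:

        # Share of concepts common to two handover concept maps (illustrative).
        def concept_similarity(map_a: set, map_b: set) -> float:
            # concepts appearing in both maps, relative to all concepts mentioned
            return len(map_a & map_b) / len(map_a | map_b)

        outgoing = {"fever", "antibiotics", "pain score", "family updated"}
        incoming = {"fever", "antibiotics", "mobility"}
        print(f"concept similarity: {concept_similarity(outgoing, incoming):.0%}")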

  17. A hierarchical approach of hybrid image classification for land use and land cover mapping

    Directory of Open Access Journals (Sweden)

    Rahdari Vahid

    2018-01-01

    Full Text Available Remote sensing data analysis can provide thematic maps describing land-use and land-cover (LULC) in a short period. Using a proper image classification method in an area is important to overcome the possible limitations of satellite imagery for producing land-use and land-cover maps. In the present study, a hierarchical hybrid image classification method was used to produce LULC maps from Landsat Thematic Mapper (TM) imagery for 1998 and Operational Land Imager (OLI) imagery for 2016. Images were classified using the proposed hybrid image classification method, a vegetation cover crown percentage map from the normalized difference vegetation index, Fisher supervised classification and object-based image classification methods. Accuracy assessment results showed that the hybrid classification method produced maps with an overall accuracy of up to 84 percent and a kappa statistic of 0.81. Results of this study showed that the proposed classification method worked better with the OLI sensor than with TM. Although OLI has a higher radiometric resolution than TM, the LULC map produced from TM is almost as accurate as that from OLI, owing to the LULC definitions and image classification methods used.

  18. Inguinal hernia repair: totally preperitoneal laparoscopic approach versus Stoppa operation: randomized trial of 100 cases.

    Science.gov (United States)

    Champault, G G; Rizk, N; Catheline, J M; Turner, R; Boutelier, P

    1997-12-01

    In a prospective randomized trial comparing the totally preperitoneal (TPP) laparoscopic approach and the Stoppa procedure (open), 100 patients with inguinal hernias (Nyhus IIIA, IIIB, IV) were followed over a 3-year period. Both groups were epidemiologically comparable. In the laparoscopic group, operating time was significantly longer (p = 0.01), but hospital stay (3.2 vs. 7.3 days) and delay in return to work (17 vs. 35 days) were significantly reduced (p = 0.01). Postoperative comfort (less pain) was better (p = 0.001) after laparoscopy. In this group, morbidity was also reduced (4 vs. 20%; p = 0.02). The mean follow-up was 605 days, and 93% of the patients were reviewed at 3 years. There were three (6%) recurrences after TPP, especially at the beginning of the surgeon's learning curve, versus one for the Stoppa procedure (NS). For bilateral hernias, the authors suggest the use of a large prosthesis rather than two small ones to minimize the likelihood of recurrence. In the conditions described, the laparoscopic (TPP) approach to inguinal hernia treatment appears to have the same long-term recurrence rate as the open (Stoppa) procedure but a real advantage in the early postoperative period.

  19. Conditional random slope: A new approach for estimating individual child growth velocity in epidemiological research.

    Science.gov (United States)

    Leung, Michael; Bassani, Diego G; Racine-Poon, Amy; Goldenberg, Anna; Ali, Syed Asad; Kang, Gagandeep; Premkumar, Prasanna S; Roth, Daniel E

    2017-09-10

    Conditioning child growth measures on baseline accounts for regression to the mean (RTM). Here, we present the "conditional random slope" (CRS) model, based on a linear mixed-effects model that incorporates a baseline-time interaction term and can accommodate multiple data points for a child while also directly accounting for RTM. In two birth cohorts, we applied five approaches to estimate child growth velocities from 0 to 12 months to assess the effect of increasing data density (number of measures per child) on the magnitude of RTM of unconditional estimates, and the correlation and concordance between the CRS and four alternative metrics. Further, we demonstrated the differential effect of the choice of velocity metric on the magnitude of the association between infant growth and stunting at 2 years. RTM was minimally attenuated by increasing data density for unconditional growth modeling approaches. CRS and classical conditional models gave nearly identical estimates with two measures per child. Compared to the CRS estimates, unconditional metrics had moderate correlation (r = 0.65-0.91), but poor agreement in the classification of infants with relatively slow growth (kappa = 0.38-0.78). Estimates of the velocity-stunting association were the same for CRS and classical conditional models but differed substantially between conditional and unconditional metrics. The CRS can leverage the flexibility of linear mixed models while addressing RTM in longitudinal analyses. © 2017 The Authors American Journal of Human Biology Published by Wiley Periodicals, Inc.
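
    As an illustration of this kind of model, the sketch below fits a linear mixed-effects model with a baseline-by-age interaction and a random slope per child, using statsmodels on synthetic data; all column names and values are hypothetical stand-ins, not the cohort data:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n_children, n_visits = 50, 4
        df = pd.DataFrame({
            "child_id": np.repeat(np.arange(n_children), n_visits),
            "age_months": np.tile(np.linspace(0, 12, n_visits), n_children),
        })
        baseline = rng.normal(3.2, 0.4, n_children)        # birth weight, kg
        slope = rng.normal(0.5, 0.05, n_children)          # true kg per month
        df["baseline_weight"] = baseline[df["child_id"]]
        df["weight"] = (df["baseline_weight"]
                        + slope[df["child_id"]] * df["age_months"]
                        + rng.normal(0, 0.1, len(df)))

        # Fixed effects include the baseline-by-time interaction that absorbs
        # regression to the mean; the random slope on age is each child's velocity.
        model = smf.mixedlm("weight ~ age_months * baseline_weight", df,
                            groups=df["child_id"], re_formula="~age_months")
        print(model.fit().summary())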

  20. Minimum intervention dentistry approach to managing early childhood caries: a randomized control trial.

    Science.gov (United States)

    Arrow, Peter; Klobas, Elizabeth

    2015-12-01

    A pragmatic randomized control trial was undertaken to compare the minimum intervention dentistry (MID) approach, based on the atraumatic restorative treatment procedures (MID-ART: Test), against the standard care approach (Control) to treat early childhood caries in a primary care setting. Consenting parent/child dyads were allocated to the Test or Control group using stratified block randomization. Inclusion and exclusion criteria were applied. Participants were examined at baseline and at follow-up by two calibrated examiners blind to group allocation status (κ = 0.77), and parents completed a questionnaire at baseline and follow-up. Dental therapists trained in MID-ART provided treatment to the Test group and dentists treated the Control group using standard approaches. The primary outcome of interest was the number of children who were referred for specialist pediatric care. Secondary outcomes were the number of teeth treated, changes in child oral health-related quality of life and dental anxiety and parental perceptions of care received. Data were analyzed on an intention to treat basis; risk ratio for referral for specialist care, test of proportions, Wilcoxon rank test and logistic regression were used. Three hundred and seventy parents/carers were initially screened; 273 children were examined at baseline and 254 were randomized (Test = 127; Control = 127): mean age = 3.8 years, SD 0.90; 59% male, mean dmft = 4.9, SD 4.0. There was no statistically significant difference in age, sex, baseline caries experience or child oral health-related quality of life between the Test and Control group. At follow-up (mean interval 11.4 months, SD 3.1 months), 220 children were examined: Test = 115, Control = 105. Case-notes review of 231 children showed Test = 6 (5%) and Control = 53 (49%) were referred for specialist care, P < 0.0001. More teeth were filled in the Test group (mean = 2.93, SD 2.48) than in the Control group (mean = 1.54, SD

  1. Visualizing the topical structure of the medical sciences: a self-organizing map approach.

    Directory of Open Access Journals (Sweden)

    André Skupin

    Full Text Available We implement a high-resolution visualization of the medical knowledge domain using the self-organizing map (SOM) method, based on a corpus of over two million publications. While self-organizing maps have been used for document visualization for some time, (1) little is known about how to deal with truly large document collections in conjunction with a large number of SOM neurons, (2) post-training geometric and semiotic transformations of the SOM tend to be limited, and (3) no user studies have been conducted with domain experts to validate the utility and readability of the resulting visualizations. Our study makes key contributions to all of these issues. Documents extracted from Medline and Scopus are analyzed on the basis of indexer-assigned MeSH terms. Initial dimensionality is reduced to include only the top 10% most frequent terms and the resulting document vectors are then used to train a large SOM consisting of over 75,000 neurons. The resulting two-dimensional model of the high-dimensional input space is then transformed into a large-format map by using geographic information system (GIS) techniques and cartographic design principles. This map is then annotated and evaluated by ten experts stemming from the biomedical and other domains. Study results demonstrate that it is possible to transform a very large document corpus into a map that is visually engaging and conceptually stimulating to subject experts from both inside and outside of the particular knowledge domain. The challenges of dealing with a truly large corpus come to the fore and require embracing parallelization and use of supercomputing resources to solve otherwise intractable computational tasks. Among the envisaged future efforts are the creation of a highly interactive interface and the elaboration of the notion of this map of medicine acting as a base map, onto which other knowledge artifacts could be overlaid.
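
    The core SOM training loop described above can be sketched in a few lines of NumPy; the toy document vectors and the small 20 x 20 grid below stand in for the MeSH-term vectors and the 75,000-neuron map of the study:

        import numpy as np

        rng = np.random.default_rng(42)
        docs = rng.random((1000, 50))            # 1000 documents x 50 term weights
        rows, cols = 20, 20
        weights = rng.random((rows, cols, docs.shape[1]))
        grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                    indexing="ij"), axis=-1)

        for t in range(2000):
            lr = 0.5 * np.exp(-t / 1000)         # decaying learning rate
            sigma = (rows / 2) * np.exp(-t / 1000)   # shrinking neighbourhood
            x = docs[rng.integers(len(docs))]
            # best-matching unit: neuron whose codebook vector is closest to x
            bmu = np.unravel_index(
                np.argmin(np.linalg.norm(weights - x, axis=2)), (rows, cols))
            # Gaussian neighbourhood pulls nearby neurons toward the document
            h = np.exp(-((grid - np.array(bmu)) ** 2).sum(-1) / (2 * sigma ** 2))
            weights += lr * h[..., None] * (x - weights)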

  2. The stereotactic approach for mapping epileptic networks: a prospective study of 200 patients.

    Science.gov (United States)

    Serletis, Demitre; Bulacio, Juan; Bingaman, William; Najm, Imad; González-Martínez, Jorge

    2014-11-01

    Stereoelectroencephalography (SEEG) is a methodology that permits accurate 3D in vivo electroclinical recordings of epileptiform activity. Among other general indications for invasive intracranial electroencephalography (EEG) monitoring, its advantages include access to deep cortical structures, its ability to localize the epileptogenic zone when subdural grids have failed to do so, and its utility in the context of possible multifocal seizure onsets with the need for bihemispheric explorations. In this context, the authors present a brief historical overview of the technique and report on their experience with 2 SEEG techniques (conventional Leksell frame-based stereotaxy and frameless stereotaxy under robotic guidance) for the purpose of invasively monitoring difficult-to-localize refractory focal epilepsy. Over a period of 4 years, the authors prospectively identified 200 patients with refractory epilepsy who collectively underwent 2663 tailored SEEG electrode implantations for invasive intracranial EEG monitoring and extraoperative mapping. The first 122 patients underwent conventional Leksell frame-based SEEG electrode placement; the remaining 78 patients underwent frameless stereotaxy under robotic guidance, following acquisition of a stereotactic ROSA robotic device at the authors' institution. Electrodes were placed according to a preimplantation hypothesis of the presumed epileptogenic zone, based on a standardized preoperative workup including video-EEG monitoring, MRI, PET, ictal SPECT, and neuropsychological assessment. Demographic features, seizure semiology, number and location of implanted SEEG electrodes, and location of the epileptogenic zone were recorded and analyzed for all patients. For patients undergoing subsequent craniotomy for resection, the type of resection and procedure-related complications were prospectively recorded. These results were analyzed and correlated with pathological diagnosis and postoperative seizure outcomes. The

  3. Topographic mapping on large-scale tidal flats with an iterative approach on the waterline method

    Science.gov (United States)

    Kang, Yanyan; Ding, Xianrong; Xu, Fan; Zhang, Changkuan; Ge, Xiaoping

    2017-05-01

    Tidal flats, which are both a natural ecosystem and a type of landscape, are of significant importance to ecosystem function and land resource potential. Morphologic monitoring of tidal flats has become increasingly important with respect to achieving sustainable development targets. Remote sensing is an established technique for the measurement of topography over tidal flats; of the available methods, the waterline method is particularly effective for constructing a digital elevation model (DEM) of intertidal areas. However, application of the waterline method is more limited in large-scale, shifting tidal flat areas, where the tides are not synchronized and the waterline is not a quasi-contour line. For this study, a topographic map of the intertidal regions within the Radial Sand Ridges (RSR) along the Jiangsu Coast, China, was generated using an iterative approach to the waterline method. A series of 21 multi-temporal satellite images (18 HJ-1A/B CCD and three Landsat TM/OLI) of the RSR area collected at different water levels within a five-month period (31 December 2013-28 May 2014) was used to extract waterlines based on feature extraction techniques and further manual modification. These 'remotely-sensed waterlines' were combined with the corresponding water levels from the 'model waterlines' simulated by a hydrodynamic model with an initial generalized DEM of exposed tidal flats. Based on the 21 heighted 'remotely-sensed waterlines', a DEM was constructed using the ANUDEM interpolation method. Using this new DEM as the input data, it was re-entered into the hydrodynamic model, and a new round of water level assignment of waterlines was performed. A third and final output DEM was generated covering an area of approximately 1900 km2 of tidal flats in the RSR. The water level simulation accuracy of the hydrodynamic model was within 0.15 m based on five real-time tide stations, and the height accuracy (root mean square error) of the final DEM was 0.182 m.
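
    The height-assignment and interpolation step at the heart of each iteration can be sketched as follows; scipy's griddata stands in for the ANUDEM interpolation used in the study, and the waterline coordinates and tide levels are synthetic:

        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(1)
        xy, z = [], []
        for water_level in np.linspace(-2.0, 2.0, 21):   # one waterline per image
            pts = rng.random((200, 2)) * 1000            # x, y along the line (m)
            xy.append(pts)
            z.append(np.full(len(pts), water_level))     # height = tide level
        xy, z = np.vstack(xy), np.concatenate(z)

        # Interpolate the heighted waterlines onto a regular grid to form the DEM;
        # in the iterative scheme this DEM is fed back into the hydrodynamic model
        # to re-assign water levels until the surface stabilises.
        gx, gy = np.mgrid[0:1000:100j, 0:1000:100j]
        dem = griddata(xy, z, (gx, gy), method="linear")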

  4. Computational-experimental approach to drug-target interaction mapping: A case study on kinase inhibitors.

    Directory of Open Access Journals (Sweden)

    Anna Cichonska

    2017-08-01

    Full Text Available Due to the relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process in terms of predicting the most potent compound-target interactions for subsequent verification. However, most of the model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to a more challenging task of predicting target interactions for such a new candidate drug compound that lacks prior binding profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with a currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications. These results demonstrate that the kernel
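
    As a rough illustration of the prediction step, a generic kernel ridge regressor on joint compound-kinase descriptors can play the role of the study's kernel-based regression algorithm; the features and affinities below are synthetic stand-ins, not the profiling data:

        import numpy as np
        from sklearn.kernel_ridge import KernelRidge
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        X = rng.random((500, 64))           # joint compound + kinase descriptors
        y = X[:, :8].sum(axis=1) + rng.normal(0, 0.1, 500)   # toy affinities

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1).fit(X_tr, y_tr)
        # correlation between predicted and measured affinities on held-out pairs
        print(np.corrcoef(model.predict(X_te), y_te)[0, 1])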

  5. Brain mapping in a patient with congenital blindness – a case for multimodal approaches

    Directory of Open Access Journals (Sweden)

    Jarod L Roland

    2013-07-01

    Full Text Available Recent advances in basic neuroscience research across a wide range of methodologies have contributed significantly to our understanding of human cortical electrophysiology and functional brain imaging. Translation of this research into clinical neurosurgery has opened doors for advanced mapping of functionality that previously was prohibitively difficult, if not impossible. Here we present the case of a unique individual with congenital blindness and medically refractory epilepsy who underwent neurosurgical treatment of her seizures. Pre-operative evaluation presented the challenge of accurately and robustly mapping the cerebral cortex of an individual with a high probability of significant cortical re-organization. Additionally, a blind individual has unique priorities, such as the ability to read Braille by touch and to sense the environment primarily by sound, compared with a non-vision-impaired person. For these reasons we employed additional measures to map sensory, motor, speech, language, and auditory perception, using a number of cortical electrophysiologic mapping and functional magnetic resonance imaging methods. Our data show promising results in the application of these adjunctive methods in the pre-operative mapping of otherwise difficult to localize, and highly variable, functional cortical areas.

  6. An interdisciplinary approach to mapping through scientific cartography, design and artistic expression

    Science.gov (United States)

    Gardener, Joanna; Cartwright, William; Duxbury, Lesley

    2018-05-01

    This paper reports on the initial findings of an interdisciplinary study exploring perceptions of space and place through alternate ways of mapping. The research project aims to bring depth and meaning to places by utilising a combination of diverse influences and responses, including emotional, sensory, memory and imaginary. It investigates mapping from a designer's perspective, with further narration from both the cartographic science and fine art perspectives. It examines the role of design and artistic expression in the cartographic process, and its capacity to affect and transform the appearance, reading and meaning of the final cartographic outcome (Robinson 2010). The crossover between the cartographic sciences and the work of artists who explore space and place enables an interrogation of where these fields collide or alternatively merge, in order to challenge the definition of a map. By exploring cartography through the overlapping of the distinct fields of science and art, this study challenges and questions the tipping point of when a map ceases to be a map and becomes art.

  7. A new integrated statistical approach to the diagnostic use of two-dimensional maps.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Gianotti, Valentina; Righetti, Pier Giorgio; Cecconi, Daniela; Domenici, Enrico

    2003-01-01

    Two-dimensional (2-D) electrophoresis is a very useful technique for the analysis of proteins in biological tissues. The complexity of the 2-D maps obtained causes many difficulties in the comparison of different samples. A new method is proposed for comparing different 2-D maps, based on five steps: (i) the digitalisation of the image; (ii) the transformation of the digitalised map into a fuzzy entity, in order to account for the variability of the 2-D electrophoretic separation; (iii) the calculation of a similarity index for each pair of maps; (iv) the analysis by multidimensional scaling of the previously obtained similarity matrix; (v) the analysis by classification or cluster analysis techniques of the resulting map co-ordinates. The method was first tested on simulated samples in order to evaluate its sensitivity to small changes in spot position and size. The optimal setting of the method parameters was also investigated. Finally, the method was successfully applied to a series of real samples corresponding to the two-dimensional electrophoretic analysis of sera from normal and nicotine-treated rats. Multidimensional scaling allowed the separation of the two classes of samples without any misclassification.
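
    Steps (ii)-(iv) can be sketched as follows: a Gaussian blur fuzzifies each digitised map, an overlap ratio serves as the similarity index, and multidimensional scaling embeds the similarity matrix. The particular blur and index below are illustrative stand-ins for the method's exact definitions, and the spot images are synthetic:

        import numpy as np
        from scipy.ndimage import gaussian_filter
        from sklearn.manifold import MDS

        rng = np.random.default_rng(3)
        maps = rng.random((10, 64, 64)) < 0.02          # ten binary spot images
        fuzzy = np.array([gaussian_filter(m.astype(float), sigma=2) for m in maps])

        n = len(fuzzy)
        sim = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                # overlap-based similarity index between the fuzzified maps
                sim[i, j] = (np.minimum(fuzzy[i], fuzzy[j]).sum()
                             / np.maximum(fuzzy[i], fuzzy[j]).sum())

        # multidimensional scaling of the dissimilarity matrix gives coordinates
        # that can then be fed to classification or cluster analysis
        coords = MDS(n_components=2, dissimilarity="precomputed",
                     random_state=0).fit_transform(1 - sim)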

  8. Online Problem Solving for Adolescent Brain Injury: A Randomized Trial of 2 Approaches.

    Science.gov (United States)

    Wade, Shari L; Taylor, Hudson Gerry; Yeates, Keith Owen; Kirkwood, Michael; Zang, Huaiyu; McNally, Kelly; Stancin, Terry; Zhang, Nanhua

    Adolescent traumatic brain injury (TBI) contributes to deficits in executive functioning and behavior, but few evidence-based treatments exist. We conducted a randomized clinical trial comparing Teen Online Problem Solving with Family (TOPS-Family) against Teen Online Problem Solving with Teen Only (TOPS-TO) and an access to Internet Resources Comparison (IRC) group. Children aged 11 to 18 years who sustained a complicated mild-to-severe TBI in the previous 18 months were randomly assigned to the TOPS-Family (49), TOPS-TO (51), or IRC (52) group. Parent and self-report measures of externalizing behaviors and executive functioning were completed before treatment and 6 months later. Treatment effects were examined using linear regression models, adjusting for baseline symptom levels. Age, maternal education, and family stresses were examined as moderators. The TOPS-Family group had lower levels of parent-reported executive dysfunction at follow-up than the TOPS-TO group, and differences between the TOPS-Family and IRC groups approached significance. Maternal education moderated improvements in parent-reported externalizing behaviors, with less educated parents in the TOPS-Family group reporting fewer symptoms. On the self-report Behavior Rating Inventory of Executive Functions, treatment efficacy varied with the level of parental stresses. The TOPS-Family group reported greater improvements at low stress levels, whereas the TOPS-TO group reported greater improvement at high stress levels. The TOPS-TO group did not have significantly lower symptoms than the IRC group on any comparison. Findings support the efficacy of online family problem solving to address executive dysfunction and improve externalizing behaviors among youth with TBI from less advantaged households. Treatment with the teen alone may be indicated in high-stress families.

  9. A randomized clinical study of two interceptive approaches to palatally displaced canines.

    Science.gov (United States)

    Baccetti, Tiziano; Leonardi, Maria; Armi, Pamela

    2008-08-01

    This study evaluated the effectiveness of two interceptive approaches to palatally displaced canines (PDC), i.e. extraction of the primary canines alone or in association with the use of a cervical-pull headgear. The randomized prospective design comprised 75 subjects with PDC (92 maxillary canines) who were randomly assigned to three groups: extraction of the primary canine only (EG), extraction of the primary canine and cervical-pull headgear (EHG), and an untreated control group (CG). Panoramic radiographs were evaluated at the time of initial observation (T1) and after an average period of 18 months (T2). At T2, an evaluation of the success of canine eruption was undertaken. Between-group statistical comparisons (Kruskal-Wallis tests with Bonferroni correction) were performed on the T1-T2 changes of the diagnostic parameters on panoramic radiographs and the prevalence rates of success in canine eruption. A superimposition study on lateral cephalograms at T1 and T2 was carried out to evaluate the changes in the sagittal position of the upper molars in the three groups. The removal of the primary canine as an isolated measure to intercept palatal displacement of maxillary canines showed a success rate of 65.2 per cent, which was significantly greater than that in the untreated controls (36 per cent). The additional use of a headgear resulted in successful eruption in 87.5 per cent of the subjects, with a significant improvement in the measurements for intraosseous canine position. The cephalometric superimposition study showed a significant mesial movement of the upper first molars in the CG and EG when compared with the EHG.

  10. A nuclear reload optimization approach using a real coded genetic algorithm with random keys

    International Nuclear Information System (INIS)

    Lima, Alan M.M. de; Schirru, Roberto; Medeiros, Jose A.C.C.

    2009-01-01

    The fuel reload of a Pressurized Water Reactor is made whenever the burn-up of the fuel assemblies in the nucleus of the reactor reaches a value such that it is no longer possible to maintain a critical reactor producing energy at nominal power. The fuel reload optimization problem consists of determining the positioning of the fuel assemblies within the nucleus of the reactor in an optimized way, minimizing the cost-benefit relationship of fuel assembly cost per maximum burn-up while satisfying symmetry and safety restrictions. The difficulty of the fuel reload optimization problem grows exponentially with the number of fuel assemblies in the nucleus of the reactor. For decades the fuel reload optimization problem was solved manually by experts who used their knowledge and experience to build configurations of the reactor nucleus and tested them to verify whether the safety restrictions of the plant were satisfied. To reduce this burden, several optimization techniques have been used, including the binary-coded genetic algorithm. In this work we show the use of a real-valued coded approach of the genetic algorithm, with different recombination methods, together with a transformation mechanism called random keys that transforms the real values of the genes of each chromosome into a combination of discrete fuel assemblies for evaluation of the reload. Four different recombination methods were tested: discrete recombination, intermediate recombination, linear recombination and extended linear recombination. For each of the 4 recombination methods, 10 tests using different seeds for the random number generator were conducted, totaling 40 tests. The results of the application of the genetic algorithm are shown with the real-number formulation for the nuclear reload problem of the Angra 1 PWR plant. Since the best results in the literature for this problem were found by the parallel PSO, we use it for comparison.
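
    The random-keys trick itself is compact: sorting the real-valued genes of a chromosome yields a permutation of discrete fuel-assembly positions, so any real-valued recombination still decodes to a feasible loading pattern. A minimal sketch, with a toy problem size:

        import numpy as np

        rng = np.random.default_rng(5)
        n_positions = 21                       # toy number of core positions
        chromosome = rng.random(n_positions)   # real-coded genes in [0, 1)

        # argsort turns the real keys into a valid permutation, so any real-valued
        # recombination (discrete, intermediate, linear, extended linear) still
        # decodes to a feasible discrete loading pattern for evaluation
        loading_pattern = np.argsort(chromosome)
        print(loading_pattern)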

  11. A novel approach for monitoring writing interferences during navigated transcranial magnetic stimulation mappings of writing related cortical areas.

    Science.gov (United States)

    Rogić Vidaković, Maja; Gabelica, Dragan; Vujović, Igor; Šoda, Joško; Batarelo, Nikolina; Džimbeg, Andrija; Zmajević Schönwald, Marina; Rotim, Krešimir; Đogaš, Zoran

    2015-11-30

    It has recently been shown that navigated repetitive transcranial magnetic stimulation (nTMS) is useful in preoperative neurosurgical mapping of motor and language brain areas. In TMS mapping of motor cortices the evoked responses can be quantitatively monitored by electromyographic (EMG) recordings. No such setup exists for monitoring of writing during nTMS mappings of writing-related cortical areas. We present a novel approach for monitoring writing during nTMS mappings of motor writing-related cortical areas. To the best of our knowledge, this is the first demonstration of quantitative monitoring of motor evoked responses from the hand by EMG, and of pen-related activity during writing with our custom-made pen, together with the application of a chronometric TMS design and a patterned protocol of rTMS. The method was applied in four healthy subjects participating in writing during nTMS mapping of the premotor cortical area corresponding to BA 6 and close to the superior frontal sulcus. The results showed that stimulation impaired writing in all subjects. The corresponding spectrum of the measured signal related to writing movements was observed in the 0-20 Hz frequency band. Magnetic stimulation affected writing by suppressing the normal writing frequency band. The proposed setup for monitoring of writing provides additional quantitative data for monitoring and the analysis of rTMS-induced writing response modifications. The setup can be useful for investigation of the neurophysiologic mechanisms of writing, for therapeutic effects of nTMS, and in preoperative mapping of language cortical areas in patients undergoing brain surgery. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Impact of visual impairment on the lives of young adults in the Netherlands: a concept-mapping approach.

    Science.gov (United States)

    Elsman, Ellen Bernadette Maria; van Rens, Gerardus Hermanus Maria Bartholomeus; van Nispen, Ruth Marie Antoinette

    2017-12-01

    While the impact of visual impairments on specific aspects of young adults' lives is well recognised, a systematic understanding of its impact on all life aspects is lacking. This study aims to provide an overview of life aspects affected by visual impairment in young adults (aged 18-25 years) using a concept-mapping approach. Visually impaired young adults (n = 22) and rehabilitation professionals (n = 16) participated in online concept-mapping workshops (brainstorm procedure), to explore how having a visual impairment influences the lives of young adults. Statements were categorised based on similarity and importance. Using multidimensional scaling, concept maps were produced and interpreted. A total of 59 and 260 statements were generated by young adults and professionals, respectively, resulting in 99 individual statements after checking and deduplication. The combined concept map revealed 11 clusters: work, study, information and regulations, social skills, living independently, computer, social relationships, sport and activities, mobility, leisure time, and hobby. The concept maps provided useful insight into activities influenced by visual impairments in young adults, which can be used by rehabilitation centres to improve their services. This might help in goal setting, rehabilitation referral and successful transition to adult life, ultimately increasing participation and quality of life. Implications for rehabilitation Having a visual impairment affects various life-aspects related to participation, including activities related to work, study, social skills and relationships, activities of daily living, leisure time and mobility. Concept-mapping helped to identify the life aspects affected by low vision, and quantify these aspects in terms of importance according to young adults and low vision rehabilitation professionals. Low vision rehabilitation centres should focus on all life aspects found in this study when identifying the needs of young

  13. Mapping of Cold-Water Coral Carbonate Mounds Based on Geomorphometric Features: An Object-Based Approach

    Directory of Open Access Journals (Sweden)

    Markus Diesing

    2018-01-01

    Full Text Available Cold-water coral reefs are rich, yet fragile ecosystems found in colder oceanic waters. Knowledge of their spatial distribution on continental shelves, slopes, seamounts and ridge systems is vital for marine spatial planning and conservation. Cold-water corals frequently form conspicuous carbonate mounds of varying sizes, which are identifiable from multibeam echosounder bathymetry and derived geomorphometric attributes. However, the often-large number of mounds makes manual interpretation and mapping a tedious process. We present a methodology that combines image segmentation and random forest spatial prediction with the aim to derive maps of carbonate mounds and an associated measure of confidence. We demonstrate our method based on multibeam echosounder data from Iverryggen on the mid-Norwegian shelf. We identified the image-object mean planar curvature as the most important predictor. The presence and absence of carbonate mounds are mapped with high accuracy. Spatially explicit confidence in the predictions is derived from the predicted probability and whether the predictions are within or outside the modelled range of values, and is generally high. We plan to apply the showcased method to other areas of the Norwegian continental shelf and slope where multibeam echosounder data have been collected, with the aim to provide crucial information for marine spatial planning.
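
    The prediction step can be sketched with scikit-learn's random forest, using the class probability as the per-object confidence measure; the geomorphometric features and labels below are synthetic stand-ins for the image-object attributes:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(11)
        X = rng.normal(size=(2000, 6))         # six geomorphometric attributes
        y = (X[:, 0] > 0.5).astype(int)        # 1 = carbonate mound present

        rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
        proba = rf.predict_proba(X)[:, 1]      # probability of the "mound" class
        mound = proba >= 0.5
        confidence = np.where(mound, proba, 1 - proba)   # confidence in mapped class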

  14. HyDEn: A Hybrid Steganocryptographic Approach for Data Encryption Using Randomized Error-Correcting DNA Codes

    Directory of Open Access Journals (Sweden)

    Dan Tulpan

    2013-01-01

    Full Text Available This paper presents a novel hybrid DNA encryption (HyDEn) approach that uses randomized assignments of unique error-correcting DNA Hamming code words for single characters in the extended ASCII set. HyDEn relies on custom-built quaternary codes and a private key used in the randomized assignment of code words and the cyclic permutations applied on the encoded message. Along with its ability to detect and correct errors, HyDEn equals or outperforms existing cryptographic methods and represents a promising in silico DNA steganographic approach.
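
    A toy sketch of the key-dependent codeword assignment and cyclic permutation follows; plain length-4 DNA words replace HyDEn's custom error-correcting Hamming codes, so this simplified version has no error-correction ability:

        import itertools
        import random

        def build_codebook(key: int) -> dict:
            # 4^4 = 256 length-4 DNA words, one per extended-ASCII character
            words = ["".join(w) for w in itertools.product("ACGT", repeat=4)]
            random.Random(key).shuffle(words)  # key-dependent random assignment
            return {chr(i): words[i] for i in range(256)}

        def encrypt(message: str, key: int) -> str:
            book = build_codebook(key)
            strand = "".join(book[c] for c in message)
            shift = key % len(strand)          # cyclic permutation of the strand
            return strand[shift:] + strand[:shift]

        print(encrypt("map", key=1234))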

  15. Mapping Speech Spectra from Throat Microphone to Close-Speaking Microphone: A Neural Network Approach

    Directory of Open Access Journals (Sweden)

    B. Yegnanarayana

    2007-01-01

    Full Text Available Speech recorded from a throat microphone is robust to surrounding noise, but sounds unnatural, unlike speech recorded from a close-speaking microphone. This paper addresses the issue of improving the perceptual quality of the throat microphone speech by mapping the speech spectra from the throat microphone to the close-speaking microphone. A neural network model is used to capture the speaker-dependent functional relationship between the feature vectors (cepstral coefficients) of the two speech signals. A method is proposed to ensure the stability of the all-pole synthesis filter. Objective evaluations indicate the effectiveness of the proposed mapping scheme. The advantage of this method is that the model gives a smooth estimate of the spectra of the close-speaking microphone speech. No distortions are perceived in the reconstructed speech. This mapping technique is also used for bandwidth extension of telephone speech.
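
    A minimal sketch of the speaker-dependent mapping using a small feed-forward network; the cepstral features below are synthetic stand-ins for time-aligned throat- and close-microphone frames:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(9)
        throat = rng.normal(size=(5000, 13))   # 13 cepstral coefficients per frame
        mix = rng.normal(size=(13, 13))
        close = np.tanh(throat @ mix)          # toy nonlinear throat-to-close map

        net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300,
                           random_state=0).fit(throat, close)
        estimated_close = net.predict(throat)  # smooth estimate of close-mic spectra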

  16. The Power of Visual Approaches in Qualitative Inquiry: The Use of Collage Making and Concept Mapping in Experiential Research

    Directory of Open Access Journals (Sweden)

    Lynn Butler-Kisber

    2010-01-01

    Full Text Available The burgeoning interest in arts-informed research and the increasing variety of visual possibilities as a result of new technologies have paved the way for researchers to explore and use visual forms of inquiry. This article investigates how collage making and concept mapping are useful visual approaches that can inform qualitative research. They are experiential ways of doing/knowing that help to get at tacit aspects of both understanding and process and to make these more explicit to the researcher and more accessible to audiences. It outlines specific ways that each approach can be used with examples to illustrate how the approach informs the researcher's experience and that of the audience. The two approaches are compared and contrasted and issues that can arise in the work are discussed.

  17. Domain growth in weakly disordered magnets: A coupled map lattice approach

    International Nuclear Information System (INIS)

    Biswal, B.; Puri, S.; Chowdhury, D.

    1994-06-01

    We developed a novel numerical method for studying domain growth in Ising-type models. Using this method we investigate the laws of domain growth in the random-exchange Ising model. (author). 23 refs, 5 figs
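
    A generic coupled map lattice with quenched random couplings illustrates the kind of model involved; the abstract does not give the study's specific local map or coupling, so the choices below are assumptions:

        import numpy as np

        rng = np.random.default_rng(13)
        N, eps = 128, 0.3
        s = rng.uniform(-1, 1, (N, N))                   # local order parameter
        J = 1 + 0.1 * rng.standard_normal((N, N))        # weak quenched disorder

        for _ in range(500):
            f = np.tanh(2.0 * s)                         # local map with two stable phases
            neigh = (np.roll(f, 1, 0) + np.roll(f, -1, 0)
                     + np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4
            s = (1 - eps) * f + eps * J * neigh          # coupled map lattice update
        # domains of s > 0 and s < 0 coarsen over time, as in Ising-type dynamics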

  18. Systematic pain assessment in nursing homes: a cluster-randomized trial using mixed-methods approach.

    Science.gov (United States)

    Mamhidir, Anna-Greta; Sjölund, Britt-Marie; Fläckman, Birgitta; Wimo, Anders; Sköldunger, Anders; Engström, Maria

    2017-02-28

    Chronic pain affects nursing home residents' daily life. Pain assessment is central to adequate pain management. The overall aim was to investigate the effects of a pain management intervention on nursing home residents and to describe staff's experiences of the intervention. A cluster-randomized trial and a mixed-methods approach were used; nursing homes were randomly assigned to the intervention or comparison group. The intervention group, after theoretical and practical training sessions, performed systematic pain assessments using predominantly observational scales, with external and internal facilitators supporting the implementation. No measures were taken in the comparison group; pain management continued as before, but corresponding training was provided after the study. Resident data were collected at baseline and at two follow-ups using validated scales and record reviews. Nurse group interviews were carried out twice. Primary outcome measures were wellbeing and proxy-measured pain. Secondary outcome measures were ADL-dependency and pain documentation. Using both non-parametric statistics on the residential level and generalized estimating equation (GEE) models to take clustering effects into account, the results revealed non-significant interaction effects for the primary outcome measures, while for ADL-dependency measured with Katz-ADL there was a significant interaction effect. In the comparison group (n = 66 residents), Katz-ADL values showed increased dependency over time, while the intervention group (n = 98) demonstrated no significant change over time. In the intervention group, 13/44 residents showed decreased pain scores over the period and 14/44 had no pain score changes ≥ 30% in either direction, measured with Doloplus-2. Furthermore, 17/44 residents showed increased pain scores ≥ 30% over time, indicating pain/risk for pain; 8 were identified at the first assessment and 9 were new, i.e. developed pain over time. No significant changes in the use of drugs were found in any of

  19. A Multi-Objective Approach to Visualize Proportions and Similarities Between Individuals by Rectangular Maps

    DEFF Research Database (Denmark)

    Carrizosa, Emilio; Guerrero, Vanesa; Morales, Dolores Romero

    In this paper we address the problem of visualizing the proportions and the similarities attached to a set of individuals. We represent this information using a rectangular map, i.e., a subdivision of a rectangle into rectangular portions so that each portion is associated with one individual...... area and adjacency requirements, this visualization problem is formulated as a three-objective Mixed Integer Nonlinear Problem. The first objective seeks to maximize the number of true adjacencies that the rectangular map is able to reproduce, the second one is to minimize the number of false...

  20. Exploring a New Simulation Approach to Improve Clinical Reasoning Teaching and Assessment: Randomized Trial Protocol.

    Science.gov (United States)

    Pennaforte, Thomas; Moussa, Ahmed; Loye, Nathalie; Charlin, Bernard; Audétat, Marie-Claude

    2016-02-17

    Helping trainees develop appropriate clinical reasoning abilities is a challenging goal in an environment where clinical situations are marked by high levels of complexity and unpredictability. The benefit of simulation-based education for assessing clinical reasoning skills has rarely been reported. More specifically, it is unclear if clinical reasoning is better acquired if the instructor's input occurs entirely after or is integrated during the scenario. Based on educational principles of the dual-process theory of clinical reasoning, a new simulation approach called simulation with iterative discussions (SID) is introduced. The instructor interrupts the flow of the scenario at three key moments of the reasoning process (data gathering, integration, and confirmation). After each stop, the scenario is continued where it was interrupted. Finally, a brief general debriefing ends the session. The System-1 process of clinical reasoning is assessed by verbalization during management of the case, and System-2 during the iterative discussions, without providing feedback. The aim of this study is to evaluate the effectiveness of simulation with iterative discussions versus the classical approach of simulation in developing the reasoning skills of General Pediatrics and Neonatal-Perinatal Medicine residents. This will be a prospective exploratory, randomized study conducted at Sainte-Justine hospital in Montreal, QC, between January and March 2016. All post-graduate year (PGY) 1 to 6 residents will be invited to complete one 30-minute, audio/video-recorded, complex high-fidelity simulation (SID or classical approach) covering a similar neonatology topic. Pre- and post-simulation questionnaires will be completed and a semistructured interview will be conducted after each simulation. Data analyses will use SPSS and NVivo software. This study is in its preliminary stages and the results are expected to be made available by April, 2016. This will be the first study to explore a new

  1. A new approach to quantify and map carbon stored, sequestered and emissions avoided by urban forests

    Science.gov (United States)

    E. Gregory McPherson; Qingfu Xiao; Elena Aguaron

    2013-01-01

    This paper describes the use of field surveys, biometric information for urban tree species and remote sensing to quantify and map carbon (C) storage, sequestration and avoided emissions from energy savings. Its primary contribution is methodological: the derivation and application of urban tree canopy (UTC) based transfer functions (t C ha⁻¹ UTC). Findings for Los...

  2. An Educational Data Mining Approach to Concept Map Construction for Web based Learning

    Directory of Open Access Journals (Sweden)

    Anal ACHARYA

    2017-01-01

    Full Text Available The aim of this article is to study the use of Educational Data Mining (EDM) techniques in constructing concept maps for organizing knowledge in web-based learning systems, and thereby to study their synergistic effects in enhancing learning. This article first provides a tutorial-based introduction to EDM. The applicability of web-based learning systems in enhancing the efficiency of EDM techniques in a real-time environment is investigated. Web-based learning systems often use a tool for organizing knowledge. This article explores the use of one such tool, called the concept map, for this purpose. The pioneering works by various researchers who proposed web-based learning systems in personalized and collaborative environments in this arena are next presented. A set of parameters is proposed based on which personalized and collaborative learning applications may be generalized and their performances compared. It is found that personalized learning environments use EDM techniques more exhaustively compared to collaborative learning for concept map construction in a web-based environment. This article can be used as a starting point for newcomers who would like to use EDM techniques for concept map construction for web-based learning purposes.

  3. A PDCA-based approach to Environmental Value Stream Mapping (E-VSM)

    DEFF Research Database (Denmark)

    Garza-Reyes, Jose Arturo; Torres Romero, Joseth; Govindan, Kannan

    2018-01-01

    Research into the application of Value Stream Mapping (VSM) as a tool to enhance the environmental sustainability performance of operations has been confined to a handful of studies only. Research on this green lean research stream is therefore limited, especially when compared to the vast amount...

  4. Mapping and Quantifying Biodiversity and Ecosystem Services Related to Terrestrial Vertebrates: A National Approach

    Science.gov (United States)

    Biodiversity is crucial for the functioning of ecosystems and the products and services from which we transform natural assets of the Earth for human survival, security, and well-being. The ability to assess, report, map, and forecast the life support functions of ecosystems is a...

  5. A National Approach to Map and Quantify Terrestrial Vertebrate Biodiversity within an Ecosystem Services Framework

    Science.gov (United States)

    Biodiversity is crucial for the functioning of ecosystems and the products and services from which we transform natural assets of the Earth for human survival, security, and well-being. The ability to assess, report, map, and forecast the life support functions of ecosystems is a...

  6. New geospatial approaches for efficiently mapping forest biomass logistics at high resolution over large areas

    Science.gov (United States)

    John Hogland; Nathaniel Anderson; Woodam Chung

    2018-01-01

    Adequate biomass feedstock supply is an important factor in evaluating the financial feasibility of alternative site locations for bioenergy facilities and for maintaining profitability once a facility is built. We used newly developed spatial analysis and logistics software to model the variables influencing feedstock supply and to estimate and map two components of...

  7. Systematic approach for deriving feasible mappings of parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir; Imre, Kayhan M.

    2017-01-01

    The need for high-performance computing together with the increasing trend from single processor to parallel computer architectures has leveraged the adoption of parallel computing. To benefit from parallel computing power, usually parallel algorithms are defined that can be mapped and executed

  8. Prototype of interactive Web Maps: an approach based on open sources

    Directory of Open Access Journals (Sweden)

    Jürgen Philips

    2004-07-01

    Full Text Available To explore the potentialities available in the World Wide Web (WWW), a prototype of an interactive Web map was developed using standardized codes and open sources, such as eXtensible Markup Language (XML), Scalable Vector Graphics (SVG), the Document Object Model (DOM), the script languages ECMAScript/JavaScript and "PHP: Hypertext Preprocessor", and PostgreSQL with its extension PostGIS, to disseminate information related to the urban real estate register. Data from the City Hall of São José, Santa Catarina, referring to the Campinas district, were used. Using the client/server model, a prototype of a Web map with standardized codes and open sources was implemented, allowing a user to visualize Web maps using only Adobe's Viewer 3.0 plug-in in the browser. Aiming at a good cartographic design for the Web, rules of graphic translation were followed and different interactive functionalities, such as interactive legends, symbolization and dynamic scale, were implemented. From the results, the use of standardized codes and open sources in interactive Web mapping projects can be recommended. It is understood that, with the use of open source code in public and private administration, the possibility of technological development is amplified and, consequently, expenses for the acquisition of software are reduced. Besides, it stimulates the development of computer applications targeting specific demands and requirements.

  9. Mapping air temperature using time series analysis of LST : The SINTESI approach

    NARCIS (Netherlands)

    Alfieri, S.M.; De Lorenzi, F.; Menenti, M.

    2013-01-01

    This paper presents a new procedure to map time series of air temperature (Ta) at fine spatial resolution using time series analysis of satellite-derived land surface temperature (LST) observations. The method assumes that air temperature is known at a single (reference) location such as in gridded

  10. An integrated approach to consumer representation and involvement in a multicentre randomized controlled trial.

    Science.gov (United States)

    Langston, Anne L; McCallum, Marilyn; Campbell, Marion K; Robertson, Clare; Ralston, Stuart H

    2005-01-01

    Although consumer involvement in individual studies is often limited, their involvement in guiding health research is generally considered to be beneficial. This paper outlines our experiences of an integrated relationship between the organisers of a clinical trial and a consumer organisation. The PRISM trial is a UK multicentre, randomized controlled trial comparing treatment strategies for Paget's disease of the bone. The National Association for the Relief of Paget's Disease (NARPD) is the only UK support group for sufferers of Paget's disease and has worked closely with the PRISM team from the outset. NARPD involvement is integral to the conduct of the trial and specific roles have included: peer-review; trial steering committee membership; provision of advice to participants; and promotion of the trial amongst Paget's disease patients. The integrated relationship has yielded benefits to both the trial and the consumer organisation. The benefits for the trial have included: recruitment of participants via NARPD contacts; well-informed participants; unsolicited patient advocacy of the trial; and interested and pro-active collaborators. For the NARPD and Paget's disease sufferers, benefits have included: increased awareness of Paget's disease; increased access to relevant health research; increased awareness of the NARPD services; and wider transfer of diagnosis and management knowledge to/from health care professionals. Our experience has shown that an integrated approach between a trial team and a consumer organisation is worthwhile. Adoption of such an approach in other trials may yield significant improvements in recruitment and quality of participant information flow. There are, however, resource implications for both parties.

  11. A randomized controlled trial testing an Internet delivered cost-benefit approach to weight loss maintenance.

    Science.gov (United States)

    Leahey, Tricia M; Fava, Joseph L; Seiden, Andrew; Fernandes, Denise; Doyle, Caroline; Kent, Kimberly; La Rue, Molly; Mitchell, Marc; Wing, Rena R

    2016-11-01

    Weight loss maintenance is a significant challenge in obesity treatment. During maintenance the "costs" of adhering to weight management behaviors may outweigh the "benefits." This study examined the efficacy of a novel approach to weight loss maintenance based on modifying the cost-benefit ratio. Individuals who achieved a 5% weight loss (N=75) were randomized to one of three 10-month maintenance interventions. All interventions were delivered primarily via the Internet. The Standard arm received traditional weight maintenance strategies. To increase benefits, or rewards, for maintenance behaviors, the two cost-benefit intervention conditions received weekly monetary rewards for self-monitoring and social reinforcement via e-coaching. To decrease behavioral costs (boredom) and increase novelty, participants in the cost-benefit conditions also monitored different evidence-based behaviors every two weeks (e.g., Weeks 1 & 2: steps; Weeks 3 & 4: red foods). The primary difference between the cost-benefit interventions was the type of e-coach providing social reinforcement: Professional (CB Pro) or Peer (CB Peer). Study procedures took place in Providence, RI from 2013 to 2014. Retention was 99%. There were significant group differences in weight regain (p=.01). The Standard arm gained 3.5±5.7kg. In contrast, participants in CB Pro and CB Peer lost an additional 1.8±7.0kg and 0.5±6.4kg, respectively. These results suggest that an Internet-delivered cost-benefit approach to weight loss maintenance may be effective for long-term weight control. In addition, using peer coaches to provide reinforcement may be a particularly economic alternative to professionals. These data are promising and provide support for a larger, longer trial. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Multidisciplinary approach to management of maternal asthma (MAMMA [copyright]): the PROTOCOL for a randomized controlled trial.

    Science.gov (United States)

    Lim, Angelina; Stewart, Kay; Abramson, Michael J; Walker, Susan P; George, Johnson

    2012-12-19

    Uncontrolled asthma during pregnancy is associated with the maternal hazards of disease exacerbation, and perinatal hazards including intrauterine growth restriction and preterm birth. Interventions directed at achieving better asthma control during pregnancy should be considered a high priority in order to optimise both maternal and perinatal outcomes. Poor compliance with prescribed asthma medications during pregnancy and suboptimal prescribing patterns to pregnant women have both been shown to be contributing factors that jeopardise asthma control. The aim is to design and evaluate an intervention involving multidisciplinary care for women experiencing asthma in pregnancy. A pilot single-blinded parallel-group randomized controlled trial testing a Multidisciplinary Approach to Management of Maternal Asthma (MAMMA©), which involves education and regular monitoring. Pregnant women with asthma will be recruited from antenatal clinics in Victoria, Australia. Recruited participants, stratified by disease severity, will be allocated to the intervention or the usual care group in a 1:1 ratio. Both groups will be followed prospectively throughout pregnancy and outcomes will be compared between groups at three and six months after recruitment to evaluate the effectiveness of this intervention. Outcome measures include Asthma Control Questionnaire (ACQ) scores, oral corticosteroid use, asthma exacerbations, asthma-related hospital admissions, days off work, and preventer-to-reliever ratio, along with pregnancy and neonatal adverse events at delivery. The use of FEV1/FEV6 will also be investigated during this trial as a marker for asthma control. If successful, this model of care could be widely implemented in clinical practice and justify more funding for support services and resources for these women. This intervention will also promote awareness of the risks of poorly controlled asthma and the need for a collaborative, multidisciplinary approach to asthma

  13. A novel approach to assess the treatment response using Gaussian random field in PET

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Mengdie [Department of Biomedical Engineering, Tsinghua University, Beijing 100084, China and Center for Advanced Medical Imaging Science, Division of Nuclear Medicine and Molecular Imaging, Massachusetts General Hospital, Boston, Massachusetts 02114 (United States); Guo, Ning [Center for Advanced Medical Imaging Science, Division of Nuclear Medicine and Molecular Imaging, Massachusetts General Hospital, Boston, Massachusetts 02114 (United States); Hu, Guangshu; Zhang, Hui, E-mail: hzhang@mail.tsinghua.edu.cn, E-mail: li.quanzheng@mgh.harvard.edu [Department of Biomedical Engineering, Tsinghua University, Beijing 100084 (China); El Fakhri, Georges; Li, Quanzheng, E-mail: hzhang@mail.tsinghua.edu.cn, E-mail: li.quanzheng@mgh.harvard.edu [Center for Advanced Medical Imaging Science, Division of Nuclear Medicine and Molecular Imaging, Massachusetts General Hospital, Boston, Massachusetts 02114 and Department of Radiology, Harvard Medical School, Boston, Massachusetts 02115 (United States)

    2016-02-15

    Purpose: The assessment of early therapeutic response to anticancer therapy is vital for treatment planning and patient management in the clinic. With the development of personalized treatment plans, assessing early treatment response, especially before any anatomically apparent changes occur after treatment, has become an urgent clinical need. Positron emission tomography (PET) imaging serves an important role in clinical oncology for tumor detection, staging, and therapy response assessment. Many studies on therapy response involve interpretation of differences between two PET images, usually in terms of standardized uptake values (SUVs). However, the quantitative accuracy of this measurement is limited. This work proposes a statistically robust approach for therapy response assessment based on Gaussian random fields (GRFs) to provide a statistically more meaningful scale for evaluating therapy effects. Methods: The authors propose a new criterion for therapeutic assessment by incorporating image noise into the traditional SUV method. An analytical method based on approximate expressions of the Fisher information matrix was applied to model the variance of individual pixels in reconstructed images. A zero-mean, unit-variance GRF under the null hypothesis (no response to therapy) was obtained by normalizing each pixel of the post-therapy image with the mean and standard deviation of the pretherapy image. The performance of the proposed method was evaluated by Monte Carlo simulation, where XCAT phantoms (128×128 pixels) with lesions of various diameters (2–6 mm), multiple tumor-to-background contrasts (3–10), and different changes in intensity (6.25%–30%) were used. The receiver operating characteristic curves and the corresponding areas under the curve were computed for both the proposed method and the traditional methods whose figure of merit is the percentage change of SUVs. The formula for the false positive rate (FPR) estimation was developed for the proposed therapy response
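
    The core normalization step lends itself to a short illustration. The sketch below (not the authors' implementation) forms the null-hypothesis z-field by normalizing a post-therapy image with the pre-therapy mean image and a per-pixel noise map; in the paper the pixel standard deviations are derived from an approximate Fisher information matrix, whereas here they are simply assumed as an input.

```python
import numpy as np

def therapy_response_zmap(pre_mean, post_img, pre_std):
    """Zero-mean, unit-variance field under the null hypothesis of no
    response: each post-therapy pixel is normalized with the
    pre-therapy mean and its estimated standard deviation."""
    return (post_img - pre_mean) / np.maximum(pre_std, 1e-12)

# Toy example: a 128x128 phantom whose "lesion" uptake drops by 20%.
rng = np.random.default_rng(0)
pre = np.full((128, 128), 10.0)
pre[60:66, 60:66] = 30.0                 # lesion of ~6 pixels across
std = 0.1 * np.sqrt(pre)                 # assumed pixel-noise model
post = pre.copy()
post[60:66, 60:66] *= 0.8                # 20% intensity change
post += rng.normal(0.0, std)             # measurement noise

z = therapy_response_zmap(pre, post, std)
print("max |z| inside the lesion:", np.abs(z[60:66, 60:66]).max().round(1))
```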

  14. Numerical optimization approach for resonant electromagnetic vibration transducer designed for random vibration

    International Nuclear Information System (INIS)

    Spreemann, Dirk; Hoffmann, Daniel; Folkmer, Bernd; Manoli, Yiannos

    2008-01-01

    This paper presents a design and optimization strategy for resonant electromagnetic vibration energy harvesting devices. An analytic expression for the magnetic field of cylindrical permanent magnets is used to build up an electromagnetic subsystem model. This subsystem is used to find the optimal resting position of the oscillating mass and to optimize the geometrical parameters (shape and size) of the magnet and coil. The objective function to be investigated is thereby the maximum voltage output of the transducer. An additional mechanical subsystem model based on well-known equations describing the dynamics of spring–mass–damper systems is established to simulate both nonlinear spring characteristics and the effect of internal limit stops. The mechanical subsystem enables the identification of optimal spring characteristics for realistic operation conditions such as stochastic vibrations. With the overall transducer model, a combination of both subsystems connected to a simple electrical circuit, a virtual operation of the optimized vibration transducer excited by a measured random acceleration profile can be performed. It is shown that the optimization approach results in an appreciable increase of the converter performance
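
    As a rough illustration of the mechanical subsystem described above, the following sketch simulates a single-degree-of-freedom spring-mass-damper oscillator under a stochastic base acceleration, with hard limit stops, and uses relative velocity as a proxy for the induced voltage. All parameter values and the linear transduction factor are assumptions of this sketch, not values from the paper.

```python
import numpy as np

m, c, k = 2e-3, 0.05, 80.0      # mass [kg], damping [N·s/m], stiffness [N/m]
kt = 1.0                        # transduction factor [V·s/m] (assumed)
z_lim = 1e-3                    # internal limit stops at +/- 1 mm

rng = np.random.default_rng(1)
dt, n = 1e-4, 200_000
a_base = rng.normal(0.0, 5.0, n)    # random base acceleration [m/s^2]

z, v = 0.0, 0.0                     # relative displacement and velocity
voltage = np.empty(n)
for i in range(n):
    a = -(c * v + k * z) / m - a_base[i]   # m z'' + c z' + k z = -m a_base
    v += a * dt
    z += v * dt
    if abs(z) > z_lim:                     # crude inelastic limit stop
        z = np.sign(z) * z_lim
        v = 0.0
    voltage[i] = kt * v                    # EMF ~ relative velocity

print(f"RMS output voltage: {np.sqrt(np.mean(voltage**2)):.3f} V")
```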

  15. Investigating the causal effect of vitamin D on serum adiponectin using a mendelian randomization approach

    DEFF Research Database (Denmark)

    Husemoen, L. L. N.; Skaaby, T.; Martinussen, Torben

    2014-01-01

    Background/Objectives: The aim was to examine the causal effect of vitamin D on serum adiponectin using a multiple instrument Mendelian randomization approach. Subjects/Methods: Serum 25-hydroxy vitamin D (25(OH)D) and serum total or high molecular weight (HMW) adiponectin were measured in two...... doubling of 25(OH)D was 4.78, 95% CI: 1.96, 7.68, P...... Using the vitamin D-binding protein gene and the filaggrin gene as instrumental variables, the causal effect in % was estimated to 61.46, 95% CI: 17.51, 120.28, P=0.003, higher adiponectin per doubling of 25(OH)D. In the MONICA10...... effect estimate in % per doubling of 25(OH)D was 37.13, 95% CI: -3.67, 95.20, P=0.080. Conclusions: The results indicate a possible causal association between serum 25(OH)D and total adiponectin. However, the association was not replicated for HMW adiponectin. Thus, further studies are needed to confirm...
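
    For orientation, the generic calculation behind multiple-instrument Mendelian randomization estimates of this kind is the Wald ratio per genetic instrument, pooled by inverse-variance weighting. The sketch below uses invented effect sizes purely for illustration; it is not the study's data or code.

```python
import numpy as np

# Hypothetical per-SNP associations (log-scale effects and SEs):
beta_exposure = np.array([0.10, 0.08])   # instrument -> log2 25(OH)D
beta_outcome  = np.array([0.05, 0.03])   # instrument -> log adiponectin
se_outcome    = np.array([0.020, 0.015])

wald = beta_outcome / beta_exposure           # causal estimate per instrument
se_wald = se_outcome / np.abs(beta_exposure)  # first-order delta method
w = 1.0 / se_wald**2
ivw = np.sum(w * wald) / np.sum(w)            # inverse-variance-weighted pool
se_ivw = 1.0 / np.sqrt(np.sum(w))

print(f"IVW causal estimate: {ivw:.3f} +/- {1.96 * se_ivw:.3f} (95% half-width)")
```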

  16. The Regional Land Cover Monitoring System: Building regional capacity through innovative land cover mapping approaches

    Science.gov (United States)

    Saah, D.; Tenneson, K.; Hanh, Q. N.; Aekakkararungroj, A.; Aung, K. S.; Goldstein, J.; Cutter, P. G.; Maus, P.; Markert, K. N.; Anderson, E.; Ellenburg, W. L.; Ate, P.; Flores Cordova, A. I.; Vadrevu, K.; Potapov, P.; Phongsapan, K.; Chishtie, F.; Clinton, N.; Ganz, D.

    2017-12-01

    Earth observation and Geographic Information System (GIS) tools, products, and services are vital to support environmental decision making by governmental institutions, non-governmental agencies, and the general public. At the heart of environmental decision making is the monitoring of land cover and land use change (LCLUC) for land resource planning and for ecosystem services, including biodiversity conservation and resilience to climate change. A major challenge for monitoring LCLUC in developing regions, such as Southeast Asia, is inconsistent data products at inconsistent intervals that have different typologies across the region and are typically made without stakeholder engagement or input. Here we present the Regional Land Cover Monitoring System (RLCMS), a novel land cover mapping effort for Southeast Asia, implemented by SERVIR-Mekong, a joint NASA-USAID initiative that brings Earth observations to improve environmental decision making in developing countries. The RLCMS focuses on mapping biophysical variables (e.g. canopy cover, tree height, or percent surface water) at an annual interval and in turn using those biophysical variables to develop land cover maps based on stakeholder definitions of land cover classes. This allows for flexible and consistent land cover classifications that can meet the needs of different institutions across the region. Another component of the RLCMS production is stakeholder engagement through co-development. Institutions that directly benefit from this system have helped drive the development for regional needs, leading to services for their specific uses. Examples of services for regional stakeholders include using the RLCMS to develop maps using the IPCC classification scheme for GHG emission reporting and developing custom annual maps as an input to hydrologic modeling/flood forecasting systems. In addition to the implementation of this system and the service stemming from the RLCMS in Southeast Asia, it is

  17. A Karnaugh map based approach towards systemic reviews and meta-analysis.

    Science.gov (United States)

    Hassan, Abdul Wahab; Hassan, Ahmad Kamal

    2016-01-01

    The study of meta-analyses and systemic reviews has long helped us draw conclusions from numerous parallel or conflicting studies. Existing studies are presented in tabulated forms which contain appropriate information for specific cases, yet they are difficult to visualize. In meta-analysis of data, this can lead to absorption and subsumption errors, which in turn have the undesirable potential of consecutive misunderstandings in social and operational methodologies. The purpose of this study is to investigate an alternative forum for meta-data presentation that relies on humans' strong pictorial perception capability. Analysis of big data is assumed to be a complex and daunting task often reserved for the computational powers of machines, yet there exist mapping tools which allow such data to be analyzed by hand. Data analysis on such a scale can benefit from the use of statistical tools like Karnaugh maps, where all studies can be put together in a graph-based mapping. Such a formulation can lead to more control in observing patterns in the research community and in analyzing further for uncertainty and reliability metrics. We present a methodological process of converting a well-established study in health care to its equivalent binary representation, followed by furnishing the values onto a Karnaugh map. The data used for the studies presented herein are from Burns et al (J Publ Health 34(1):138-148, 2011), consisting of retrospectively collected data sets from various studies on clinical coding data accuracy. Using a customized filtration process, a total of 25 studies were selected for review, with no, partial, or complete knowledge of six independent variables, thus forming 64 independent cells on a Karnaugh map. The study concluded that this pictorial graphing, as expected, helped simplify the overview of meta-analyses and systemic reviews.
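
    To make the mapping step concrete, the sketch below codes each study as a 6-bit vector (one bit per independent variable) and places it on an 8×8 Karnaugh grid whose axes follow a Gray-code ordering, so adjacent cells differ in exactly one variable. The study codes are invented; only the placement logic is illustrated.

```python
import numpy as np

def gray(n):
    """Standard binary-to-Gray-code conversion."""
    return n ^ (n >> 1)

axis = [gray(i) for i in range(8)]   # 3-bit Gray order: 0,1,3,2,6,7,5,4

def kmap_cell(code):
    """Map a 6-bit study code (variables A..F) to (row, col):
    A,B,C index the row, D,E,F the column."""
    row_bits, col_bits = code >> 3, code & 0b111
    return axis.index(row_bits), axis.index(col_bits)

# Hypothetical studies, each coded by knowledge (1) / no knowledge (0)
# of six independent variables:
studies = [0b101100, 0b101101, 0b000111, 0b111111]
grid = np.zeros((8, 8), dtype=int)
for s in studies:
    r, c = kmap_cell(s)
    grid[r, c] += 1
print(grid)           # counts of studies per Karnaugh-map cell
```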

  18. Radiographic cup position following posterior and lateral approach to total hip arthroplasty. An explorative randomized controlled trial.

    Science.gov (United States)

    Kruse, Christine; Rosenlund, Signe; Broeng, Leif; Overgaard, Søren

    2018-01-01

    The two most common surgical approaches to total hip arthroplasty are the posterior approach and the lateral approach. The surgical approach may influence cup positioning and restoration of the offset, which may affect the biomechanical properties of the hip joint. The primary aim was to compare cup position between the posterior approach and the lateral approach. Secondary aims were to compare femoral offset, abductor moment arm and leg length discrepancy between the two approaches. Eighty patients with primary hip osteoarthritis were included in a randomized controlled trial and assigned to total hip arthroplasty using the posterior approach or the lateral approach. Postoperative radiographs from 38 patients in each group were included in this study for measurement of cup anteversion and inclination. Femoral offset, cup offset, total offset, abductor moment arm and leg length discrepancy were measured on preoperative and postoperative radiographs in 28 patients in each group. We found that mean anteversion was 5° larger in the posterior approach group (95% CI, -8.1 to -1.4; p = 0.006), while mean inclination was 5° less steep (95% CI, 2.7 to 7.2; p ...): larger cup anteversion but less steep cup inclination in the posterior approach group compared with the lateral approach group. Femoral offset and abductor moment arm were restored after total hip arthroplasty using the lateral approach but significantly increased when using the posterior approach.

  19. Gaussian Multiple Instance Learning Approach for Mapping the Slums of the World Using Very High Resolution Imagery

    Energy Technology Data Exchange (ETDEWEB)

    Vatsavai, Raju [ORNL

    2013-01-01

    In this paper, we present a computationally efficient algorithm based on multiple instance learning for mapping informal settlements (slums) using very high-resolution remote sensing imagery. From a remote sensing perspective, informal settlements share unique spatial characteristics that distinguish them from other urban structures like industrial, commercial, and formal residential settlements. However, regular pattern recognition and machine learning methods, which are predominantly single-instance or per-pixel classifiers, often fail to accurately map the informal settlements as they do not capture the complex spatial patterns. To overcome these limitations we employed a multiple instance based machine learning approach, where groups of contiguous pixels (image patches) are modeled as generated by a Gaussian distribution. We have conducted several experiments on very high-resolution satellite imagery, representing four unique geographic regions across the world. Our method showed consistent improvement in accurately identifying informal settlements.
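
    A minimal sketch of the patch-as-Gaussian idea follows: each patch is summarized by a feature vector and each class is modeled by a multivariate Gaussian fitted to training patches, with patches labeled by the higher log-likelihood. This compresses the paper's multiple-instance formulation (which scores groups of contiguous pixels jointly) down to the Gaussian scoring step; the features and numbers are invented.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(2)
# Invented 2-D patch features for two classes of settlements:
informal_train = rng.normal([0.6, 0.2], 0.1, size=(200, 2))
formal_train   = rng.normal([0.3, 0.5], 0.1, size=(200, 2))

def fit_gaussian(x):
    return multivariate_normal(mean=x.mean(axis=0), cov=np.cov(x.T))

g_informal = fit_gaussian(informal_train)
g_formal = fit_gaussian(formal_train)

def classify_patch(feature_vec):
    """Assign the class with the higher Gaussian log-likelihood."""
    if g_informal.logpdf(feature_vec) > g_formal.logpdf(feature_vec):
        return "informal"
    return "formal"

print(classify_patch([0.55, 0.25]))   # -> informal
```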

  20. Mapping the Hot Spots: A Zoning Approach to Space Analysis and Design

    Science.gov (United States)

    Bunnell, Adam; Carpenter, Russell; Hensley, Emily; Strong, Kelsey; Williams, ReBecca; Winter, Rachel

    2016-01-01

    This article examines a preliminary approach to space design developed and implemented in Eastern Kentucky University's Noel Studio for Academic Creativity. The approach discussed here is entitled "hot spots," which has allowed the research team to observe trends in space usage and composing activities among students. This approach has…

  1. Preoperative Radiotherapy in Resectable Rectal Cancer: A Prospective Randomized Study of Two Different Approaches

    International Nuclear Information System (INIS)

    EITTA, M.A.; EL-WAHIDI, G.F.; FOUDA, M.A.; ABO EL-NAGA, E.M.; GAD EL-HAK, N.

    2010-01-01

    Preoperative radiotherapy in resectable rectal cancer has a number of potential advantages, most importantly reducing local recurrence, increasing survival and its down-staging effect. Purpose: This prospective study was designed to compare two different approaches to preoperative radiotherapy, either short course or long course radiotherapy. The primary endpoint was to evaluate the local recurrence rate, overall survival (OS) and disease-free survival (DFS). The secondary endpoint was to evaluate down-staging, treatment toxicity and the ability to perform a sphincter sparing procedure (SSP), aiming to help in the choice of the optimal treatment modality. Patients and Methods: This is a prospective randomized study of patients with resectable rectal cancer who presented to the department of Clinical Oncology and Nuclear Medicine, Mansoura University, during the time period between June 2007 and September 2009. These patients received preoperative radiotherapy and were randomized into two arms: Arm 1, short course radiotherapy (SCRT), 25 Gy in 5 fractions over one week, followed by surgery within one week; and arm 2, long course preoperative radiotherapy (LCRT), 45 Gy in 25 fractions over 5 weeks, followed by surgery after 4-6 weeks. Adjuvant chemotherapy was given 4-6 weeks after surgery according to the postoperative pathology. Results: After a median follow-up of 18 months (range 6 to 28 months), we studied the patterns of recurrence. Three patients experienced local recurrence (LR), two out of 14 (14.2%) in arm 1 and one out of 15 patients (6.7%) in arm 2 (p=0.598). Three patients developed distant metastases [two in arm 1 (14.2%) and one in arm 2 (6.7%), p=0.598]. The two-year OS rate was 64±3% and 66±2% (p=0.389), and the 2-year DFS rate was 61±2% and 83±2% for arms 1 and 2, respectively (p=0.83). Tumor (T) down-staging was more often achieved in the LCRT arm, with a statistically significant difference, but the difference did not reach statistical significance for node (N) down-staging. SSP was more available in LCRT but with no

  2. Identification and mapping the high nature value farmland by the comparison of a combined and species approaches in Tuscany, Italy

    Directory of Open Access Journals (Sweden)

    Giulio Lazzerini

    2015-09-01

    Full Text Available Low-intensity farming systems play a crucial role in nature conservation by preserving 50% of the habitats, flora and fauna occurring in Europe. For this reason, the identification, classification and mapping of high nature value farmlands (HNVfs) is becoming an overriding concern. In this study, two different approaches, namely a combined approach and a species-based approach, were used to spatially identify HNVfs (types 1, 2 and 3) across the Tuscany region (Italy). The first approach calculated different indicators (extensive practices indicator, crop diversity indicator, landscape element indicator) at 1×1 km grid cell spatial resolution using pre-existent spatial datasets integrated within a geographic information system environment. The species-based approach, by contrast, relied on a pre-existent regional naturalistic inventory. All indicators and the resulting HNVfs derived from the two approaches were aggregated at municipality level. Despite some differences, the two adopted approaches spatially intercepted the same HNVf areas, accounting for 35% of the total utilised agricultural area of the region. Only 16% of HNVfs were located inside protected areas, and thus under current conservation and protection management actions. Finally, HNVfs of the Tuscany region were spatially aggregated into four relevant agro-ecosystems by taking into consideration the cropping systems and the landscape elements' characteristics peculiar to the region.

  3. Development of a dynamic web mapping service for vegetation productivity using earth observation and in situ sensors in a sensor web based approach

    NARCIS (Netherlands)

    Kooistra, L.; Bergsma, A.R.; Chuma, B.; Bruin, de S.

    2009-01-01

    This paper describes the development of a sensor web based approach which combines earth observation and in situ sensor data to derive typical information offered by a dynamic web mapping service (WMS). A prototype has been developed which provides daily maps of vegetation productivity for the

  4. Mapping Aquatic Vegetation in a Large, Shallow Eutrophic Lake: A Frequency-Based Approach Using Multiple Years of MODIS Data

    Directory of Open Access Journals (Sweden)

    Xiaohan Liu

    2015-08-01

    Full Text Available Aquatic vegetation serves many important ecological and socioeconomic functions in lake ecosystems. The presence of floating algae poses difficulties for accurately estimating the distribution of aquatic vegetation in eutrophic lakes. We present an approach to map the distribution of aquatic vegetation in Lake Taihu (a large, shallow eutrophic lake in China) and reduce the influence of floating algae on aquatic vegetation mapping. Our approach involved a frequency analysis over a 2003–2013 time series of the floating algal index (FAI) based on moderate-resolution imaging spectroradiometer (MODIS) data. Three phenological periods were defined based on the vegetation presence frequency (VPF) and the growth of algae and aquatic vegetation: December and January composed the period of wintering aquatic vegetation; February and March composed the period of prolonged coexistence of algal blooms and wintering aquatic vegetation; and June to October was the peak period of the coexistence of algal blooms and aquatic vegetation. By comparing and analyzing the satellite-derived aquatic vegetation distribution and 244 in situ measurements made in 2013, we established a FAI threshold of −0.025 and VPF thresholds of 0.55, 0.45 and 0.85 for the three phenological periods. We validated the accuracy of our approach by comparing the results between the satellite-derived maps and the in situ results obtained from 2008–2012. The overall classification accuracy was 87%, 81%, 77%, 88% and 73% in the five years from 2008–2012, respectively. We then applied the approach to the MODIS images from 2003–2013 and obtained the total area of the aquatic vegetation, which varied from 265.94 km2 in 2007 to 503.38 km2 in 2008, with an average area of 359.62 ± 69.20 km2 over the 11 years. Our findings suggest that (1) the proposed approach can be used to map the distribution of aquatic vegetation in eutrophic algae-rich waters and (2) dramatic changes occurred in the
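
    The thresholding logic reported in the abstract can be sketched directly: for each pixel, the vegetation presence frequency (VPF) is the fraction of valid FAI observations in a phenological period exceeding the FAI threshold, and the pixel is labeled vegetation when the VPF passes the period's threshold. The direction of the FAI comparison and the handling of clouds are assumptions of this sketch.

```python
import numpy as np

FAI_THRESHOLD = -0.025
VPF_THRESHOLDS = {"wintering": 0.55, "coexistence": 0.45, "peak": 0.85}

def vegetation_mask(fai_stack, period):
    """fai_stack: (n_dates, H, W) FAI values, NaN where cloud-covered."""
    presence = fai_stack > FAI_THRESHOLD          # assumed comparison sense
    valid = ~np.isnan(fai_stack)
    vpf = presence.sum(axis=0) / np.maximum(valid.sum(axis=0), 1)
    return vpf >= VPF_THRESHOLDS[period]

fai = np.random.default_rng(3).normal(-0.03, 0.02, size=(20, 4, 4))
print(vegetation_mask(fai, "peak").astype(int))
```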

  5. An Automated Approach to Map Winter Cropped Area of Smallholder Farms across Large Scales Using MODIS Imagery

    Directory of Open Access Journals (Sweden)

    Meha Jain

    2017-06-01

    Full Text Available Fine-scale agricultural statistics are an important tool for understanding trends in food production and their associated drivers, yet these data are rarely collected in smallholder systems. These statistics are particularly important for smallholder systems given the large amount of fine-scale heterogeneity in production that occurs in these regions. To overcome the lack of ground data, satellite data are often used to map fine-scale agricultural statistics. However, doing so is challenging for smallholder systems because of (1) complex sub-pixel heterogeneity; (2) little to no available calibration data; and (3) high amounts of cloud cover, as most smallholder systems occur in the tropics. We develop an automated method termed the MODIS Scaling Approach (MSA) to map smallholder cropped area across large spatial and temporal scales using MODIS Enhanced Vegetation Index (EVI) satellite data. We use this method to map winter cropped area, a key measure of cropping intensity, across the Indian subcontinent annually from 2000–2001 to 2015–2016. The MSA defines a pixel as cropped based on winter growing season phenology and scales the percent of cropped area within a single MODIS pixel based on observed EVI values at peak phenology. We validated the result with eleven high-resolution scenes (spatial scale of 5 × 5 m2 or finer) that we classified into cropped versus non-cropped maps using training data collected by visual inspection of the high-resolution imagery. The MSA had moderate to high accuracies when validated using these eleven scenes across India (R2 ranging between 0.19 and 0.89, with an overall R2 of 0.71 across all sites). This method requires no calibration data, making it easy to implement across large spatial and temporal scales, with 100% spatial coverage due to the compositing of EVI to generate cloud-free data sets. The accuracies found in this study are similar to those of other studies that map crop production using automated methods
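
    A compressed sketch of the two MSA steps is given below: a pixel is flagged as winter-cropped from its winter-season EVI peak, and the within-pixel cropped fraction is scaled linearly between assumed pure-noncrop and pure-crop peak-EVI endpoints. The endpoint values are placeholders; the paper scales against observed EVI at peak phenology rather than fixed constants.

```python
import numpy as np

EVI_NONCROP, EVI_FULLCROP = 0.25, 0.65     # assumed scaling endpoints

def winter_cropped_fraction(evi_winter):
    """evi_winter: (n_dates, H, W) EVI composites over the winter season.
    Returns per-pixel cropped-area fraction in [0, 1]."""
    peak = evi_winter.max(axis=0)          # peak-of-season EVI per pixel
    frac = (peak - EVI_NONCROP) / (EVI_FULLCROP - EVI_NONCROP)
    return np.clip(frac, 0.0, 1.0)

evi = np.random.default_rng(4).uniform(0.1, 0.8, size=(12, 3, 3))
print(winter_cropped_fraction(evi).round(2))
```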

  6. The development of an adolescent smoking cessation intervention--an Intervention Mapping approach to planning.

    Science.gov (United States)

    Dalum, Peter; Schaalma, Herman; Kok, Gerjo

    2012-02-01

    The objective of this project was to develop a theory- and evidence-based adolescent smoking cessation intervention using both new and existing materials. We used the Intervention Mapping framework for planning health promotion programmes. Based on a needs assessment, we identified important and changeable determinants of cessation behaviour, specified change objectives for the intervention programme, selected theoretical change methods for accomplishing intervention objectives and finally operationalized change methods into practical intervention strategies. We found that guided practice, modelling, self-monitoring, coping planning, consciousness raising, dramatic relief and decisional balance were suitable methods for adolescent smoking cessation. We selected behavioural journalism, guided practice and Motivational Interviewing as strategies in our intervention. Intervention Mapping helped us to develop a systematic adolescent smoking cessation intervention with a clear link between behavioural goals, theoretical methods, practical strategies and materials, and with a strong focus on implementation and recruitment. This paper does not present evaluation data.

  7. Transfer map approach to and optical effects of energy degraders on the performance of fragment separators

    International Nuclear Information System (INIS)

    Erdelyi, B.; Bandura, L.; Nolen, J.

    2009-01-01

    A second order analytical and an arbitrary order numerical procedure is developed for the computation of transfer maps of energy degraders. The incorporation of the wedges into the optics of fragment separators for next-generation exotic beam facilities, their optical effects, and the optimization of their performance is studied in detail. It is shown how to place and shape the degraders in the system such that aberrations are minimized and resolving powers are maximized.

  8. Transfer map approach to and optical effects of energy degraders in fragment separators

    Directory of Open Access Journals (Sweden)

    B. Erdelyi

    2009-01-01

    Full Text Available A second order analytical and an arbitrary order numerical procedure is developed for the computation of transfer maps of energy degraders. The incorporation of the wedges into the optics of fragment separators for next-generation exotic beam facilities, their optical effects, and the optimization of their performance is studied in detail. It is shown how to place and shape the degraders in the system such that aberrations are minimized and resolving powers are maximized.
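
    The object being computed here, a second-order transfer map, acts on a phase-space vector as x_f,i = Σ_j R_ij x_j + Σ_jk T_ijk x_j x_k. The sketch below only shows how such a map is applied to a ray; the R and T arrays are random placeholders, whereas a real degrader or separator map would come from the analytical or numerical procedures of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
dim = 4                                   # e.g. (x, x', y, y')
R = np.eye(dim) + 0.01 * rng.normal(size=(dim, dim))   # first-order matrix
T = 1e-3 * rng.normal(size=(dim, dim, dim))            # second-order tensor

def apply_second_order_map(x):
    """x_f = R x + T : x x, with einsum evaluating the quadratic part."""
    return R @ x + np.einsum("ijk,j,k->i", T, x, x)

ray = np.array([1e-3, 2e-3, 0.0, 1e-3])  # initial phase-space coordinates
print(apply_second_order_map(ray))
```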

  9. Approaching the O⁻ photodetachment threshold with velocity-map imaging

    Energy Technology Data Exchange (ETDEWEB)

    Cavanagh, S J; Gibson, S T; Lewis, B R, E-mail: Steven.Cavanagh@anu.edu.au, E-mail: Stephen.Gibson@anu.edu.au [Research School of Physics and Engineering, Australian National University, Canberra ACT 0200 (Australia)]

    2009-11-01

    A series of photodetachment spectra from O⁻ has been measured from near threshold to several eV using the technique of velocity-map imaging. With a resolving power of ΔE/E ≤ 0.38%, the energy and angular dependences for the six fine-structure transitions have been determined. For the first time, the energy and angular dependences of the cross section within a few meV of threshold have been determined.

  10. Determination of αs(Mτ²): conformal mapping approach

    Czech Academy of Sciences Publication Activity Database

    Caprini, I.; Fischer, Jan

    2011-01-01

    Roč. 218, - (2011), s. 128-133 ISSN 0920-5632. [11th International Workshop on Tau Lepton Physics. Manchester, 13.09.2010-17.09.2010] R&D Projects: GA MŠk LC527; GA MŠk LA08015 Institutional research plan: CEZ:AV0Z10100502 Keywords : perturbative QCD expansion * conformal mapping * hadronic decay of the tau lepton Subject RIV: BF - Elementary Particles and High Energy Physics

  11. A Concept Mapping Approach to Guide and Understand Dissemination and Implementation

    OpenAIRE

    Green, Amy E.; Fettes, Danielle L.; Aarons, Gregory A.

    2012-01-01

    Many efforts to implement evidence-based programs do not reach their full potential or fail due to the variety of challenges inherent in dissemination and implementation. This article describes the use of concept mapping—a mixed method strategy—to study implementation of behavioral health innovations and evidence-based practice (EBP). The application of concept mapping to implementation research represents a practical and concise way to identify and quantify factors affecting implementation, ...

  12. Compression map, functional groups and fossilization: A chemometric approach (Pennsylvanian neuropteroid foliage, Canada)

    Science.gov (United States)

    D'Angelo, J. A.; Zodrow, E.L.; Mastalerz, Maria

    2012-01-01

    Nearly all of the spectrochemical studies involving Carboniferous foliage of seed-ferns are based on a limited number of pinnules, mainly compressions. In contrast, in this paper we illustrate working with a larger pinnate segment, i.e., a 22-cm long neuropteroid specimen, compression-preserved with cuticle: the compression map. The objective is to study preservation variability on a larger scale, where observation of the transparency/opacity of constituent pinnules is used as a first approximation for assessing the degree of pinnule coalification/fossilization. Spectrochemical methods by Fourier transform infrared spectrometry furnish semi-quantitative data for principal component analysis. The compression map shows a high degree of preservation variability, which ranges from comparatively more coalified pinnules to less coalified pinnules that resemble fossilized cuticles, noting that the pinnule midveins are preserved more like fossilized cuticles. A general overall trend of coalified pinnules towards fossilized cuticles, i.e., variable chemistry, is inferred from the semi-quantitative FTIR data, as higher contents of aromatic compounds occur in the visually more opaque upper location of the compression map. The latter also shows a higher condensation of the aromatic nuclei along with some variation in both ring size and degree of aromatic substitution. From principal component analysis we infer a correspondence between transparency/opacity observations and chemical information, which correlates to a varying degree with the fossilization/coalification among pinnules. © 2011 Elsevier B.V.

  13. Structural-genetic approach to analysis and mapping of Chernobyl's radionuclide contamination field

    International Nuclear Information System (INIS)

    Proskura, N.I.; Bujkov, M.; Nagorsky, V.A.; Tepikin, V.; Poletaev, V.; Solyanke, E.G.; Shkvorets, O.Y.; Shestopalov, V.M.; Skvortsov, V.

    1997-01-01

    Reliable and validated detailed-scale maps of contamination density could serve as the main tool for revealing and interpreting the internal structure of the radionuclide contamination field around the Chernobyl NPP. Such maps should have, on the one hand, a sufficiently high density of initial observation points (not less than 1 to 10 points per 1 sq.cm of the final map) and, on the other hand, a high representativeness of each observation point, i.e. reliable characterization of its vicinity (0.1 to 1 sq.km). The available analytical data files of soil sampling in the exclusion zone conform to neither the first requirement nor the second: the real density of sampling does not exceed 0.2 to 0.5 points per 1 sq.km, and the representativeness of the obtained results has a typical variation from medium values (in the neighbourhood of 0.1 to 1 sq.km) to 3 to 5 times

  14. One step versus two step approach for gestational diabetes screening: systematic review and meta-analysis of the randomized trials.

    Science.gov (United States)

    Saccone, Gabriele; Caissutti, Claudia; Khalifeh, Adeeb; Meltzer, Sara; Scifres, Christina; Simhan, Hyagriv N; Kelekci, Sefa; Sevket, Osman; Berghella, Vincenzo

    2017-12-03

    To compare the prevalence of gestational diabetes mellitus (GDM), as well as maternal and neonatal outcomes, with either the one-step or the two-step screening approach. Electronic databases were searched from their inception until June 2017. We included all randomized controlled trials (RCTs) comparing the one-step with the two-step approach for the screening and diagnosis of GDM. The primary outcome was the incidence of GDM. Three RCTs (n = 2333 participants) were included in the meta-analysis: 910 women were randomized to the one-step approach (75 g, 2 h) and 1423 to the two-step approach. No significant difference in the incidence of GDM was found comparing the one-step versus the two-step approach (8.4 versus 4.3%; relative risk (RR) 1.64, 95% CI 0.77-3.48). Women screened with the one-step approach had a significantly lower risk of preterm birth (PTB) (3.7 versus 7.6%; RR 0.49, 95% CI 0.27-0.88), cesarean delivery (16.3 versus 22.0%; RR 0.74, 95% CI 0.56-0.99), macrosomia (2.9 versus 6.9%; RR 0.43, 95% CI 0.22-0.82), neonatal hypoglycemia (1.7 versus 4.5%; RR 0.38, 95% CI 0.16-0.90), and admission to the neonatal intensive care unit (NICU) (4.4 versus 9.0%; RR 0.49, 95% CI 0.29-0.84), compared with those randomized to screening with the two-step approach. The one-step and two-step approaches were not associated with a significant difference in the incidence of GDM. However, the one-step approach was associated with better maternal and perinatal outcomes.
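
    The summary statistics quoted above come from standard pooling of relative risks across trials. The sketch below shows the generic fixed-effect (inverse-variance) calculation on invented 2×2 counts; it is not the review's data.

```python
import numpy as np

# (events, total) in the one-step and two-step arms of three invented trials:
trials = [((10, 300), (9, 450)), ((25, 310), (22, 480)), ((30, 300), (20, 493))]

logs, weights = [], []
for (e1, n1), (e2, n2) in trials:
    rr = (e1 / n1) / (e2 / n2)
    var = 1/e1 - 1/n1 + 1/e2 - 1/n2      # variance of log(RR)
    logs.append(np.log(rr))
    weights.append(1.0 / var)

pooled = np.exp(np.average(logs, weights=weights))
se = 1.0 / np.sqrt(np.sum(weights))
lo, hi = pooled * np.exp(-1.96 * se), pooled * np.exp(1.96 * se)
print(f"pooled RR = {pooled:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```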

  15. Approach of automatic 3D geological mapping: the case of the Kovdor phoscorite-carbonatite complex, NW Russia.

    Science.gov (United States)

    Kalashnikov, A O; Ivanyuk, G Yu; Mikhailova, J A; Sokharev, V A

    2017-07-31

    We have developed an approach for automatic 3D geological mapping based on the conversion of the chemical composition of rocks to mineral composition by logical computation. It allows the mineral composition to be calculated from bulk-rock chemistry, the mineral composition to be interpolated in the same way as chemical composition, and, finally, a 3D geological model to be built. The approach was developed for the Kovdor phoscorite-carbonatite complex containing the Kovdor baddeleyite-apatite-magnetite deposit. We used four bulk-rock chemistry analytes: Fe_magn, P2O5, CO2 and SiO2. We used four techniques for the prediction of rock types: calculation of normative mineral compositions (norms), multiple regression, an artificial neural network, and a logical-evaluation method developed by us. The two latter performed best. As a result, we distinguished 14 types of phoscorites (forsterite-apatite-magnetite-carbonate rocks), carbonatite and host rocks. The results show good agreement with our petrographical studies of the deposit and with recent manually built maps. The proposed approach can be used as a tool for reconstructing deposit genesis and for preliminary geometallurgical modelling.
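
    The "logical computation" step can be pictured as a nested rule set that turns bulk-rock chemistry into a rock label. The thresholds and class names below are invented stand-ins; the paper's scheme distinguishes 14 phoscorite types plus carbonatite and host rocks from Fe_magn, P2O5, CO2 and SiO2.

```python
def classify_rock(fe_magn, p2o5, co2, sio2):
    """Toy rule-based classifier from bulk chemistry (wt%), with
    hypothetical thresholds chosen only for illustration."""
    if co2 > 20.0:
        return "carbonatite"
    if sio2 > 40.0:
        return "host rock"
    # Phoscorite varieties: name by the dominant ore component.
    components = sorted([("magnetite", fe_magn), ("apatite", p2o5),
                         ("carbonate", co2)], key=lambda kv: -kv[1])
    return f"phoscorite ({components[0][0]}-dominant)"

print(classify_rock(fe_magn=25.0, p2o5=8.0, co2=5.0, sio2=12.0))
```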

  16. Mapping sequences by parts

    Directory of Open Access Journals (Sweden)

    Guziolowski Carito

    2007-09-01

    Full Text Available Abstract Background: We present the N-map method, a pairwise and asymmetrical approach which allows us to compare sequences by taking into account evolutionary events that produce shuffled, reversed or repeated elements. Basically, the optimal N-map of a sequence s over a sequence t is the best way of partitioning the first sequence into N parts and placing them, possibly complementary reversed, over the second sequence in order to maximize the sum of their gapless alignment scores. Results: We introduce an algorithm computing an optimal N-map with time complexity O(|s| × |t| × N) using O(|s| × |t| × N) memory space. Among all the numbers of parts taken in a reasonable range, we select the value N for which the optimal N-map has the most significant score. To evaluate this significance, we study the empirical distributions of the scores of optimal N-maps and show that they can be approximated by normal distributions with reasonable accuracy. We test the functionality of the approach over random sequences on which we apply artificial evolutionary events. Practical Application: The method is illustrated with four case studies of pairs of sequences involving non-standard evolutionary events.

  17. Case studies combined with or without concept maps improve critical thinking in hospital-based nurses: a randomized-controlled trial.

    Science.gov (United States)

    Huang, Yu-Chuan; Chen, Hsing-Hsia; Yeh, Mei-Ling; Chung, Yu-Chu

    2012-06-01

    Critical thinking (CT) is essential to the exercise of professional judgment. As nurses face increasingly complex health-care situations, critical thinking can promote appropriate clinical decision-making and improve the quality of nursing care. This study aimed to evaluate the effects of a program of case studies, alone (CS) or combined with concept maps (CSCM), on improving CT in clinical nurses. The study was a randomized controlled trial with a multistage randomization process used to select and assign participants, ultimately resulting in 67 nurses in each group. The experimental group participated in a 16-week CSCM program, whereas the control group participated in a CS program of equal duration. Data were collected before and after the program using the California Critical Thinking Skill Test (CCTST) and the California Critical Thinking Disposition Inventory (CCTDI). After the programs, there were significant differences between the two groups in the critical thinking skills of analysis, evaluation, inference, deduction, and induction. There was also an overall significant difference, and a significant difference in the specific disposition of open-mindedness. This study supports the application of case studies combined with concept maps as a hospital-based teaching strategy to promote the development of critical thinking skills and encourage favourable dispositions toward critical thinking in nurses. The CSCM program resulted in greater improvements in all critical thinking skills, as well as in the overall and open-mindedness affective dispositions toward critical thinking, compared with the case studies alone. The most obvious improvement among CSCM participants was in analytic skill and disposition. Further longitudinal studies and data collection from multisite evaluations in a range of geographic locales are warranted. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Text messaging approach improves weight loss in patients with nonalcoholic fatty liver disease: A randomized study.

    Science.gov (United States)

    Axley, Page; Kodali, Sudha; Kuo, Yong-Fang; Ravi, Sujan; Seay, Toni; Parikh, Nina M; Singal, Ashwani K

    2018-05-01

    Nonalcoholic fatty liver disease (NAFLD) is emerging as the most common liver disease. The only effective treatment is 7%-10% weight loss. Mobile technology is increasingly used in weight management. This study was performed to evaluate the effects of text messaging intervention on weight loss in patients with NAFLD. Thirty well-defined NAFLD patients (mean age 52 years, 67% females, mean BMI 38) were randomized 1:1 to control group: counselling on healthy diet and exercise, or intervention group: text messages in addition to healthy life style counselling. NAFLD text messaging program sent weekly messages for 22 weeks on healthy life style education. Primary outcome was change in weight. Secondary outcomes were changes in liver enzymes and lipid profile. Intervention group lost an average of 6.9 lbs. (P = .03) compared to gain of 1.8 lbs. in the control group (P = .45). Intervention group also showed a decrease in ALT level (-12.5 IU/L, P = .035) and improvement in serum triglycerides (-28 mg/dL, P = .048). There were no changes in the control group on serum ALT level (-6.1 IU/L, P = .46) and on serum triglycerides (-20.3 mg/dL P = .27). Using one-way analysis of variance, change in outcomes in intervention group compared to control group was significant for weight (P = .02) and BMI (P = .02). Text messaging on healthy life style is associated with reduction in weight in NAFLD patients. Larger studies are suggested to examine benefits on liver histology, and assess long-term impact of this approach in patients with NAFLD. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. Reliability of mechanisms with periodic random modal frequencies using an extreme value-based approach

    International Nuclear Information System (INIS)

    Savage, Gordon J.; Zhang, Xufang; Son, Young Kap; Pandey, Mahesh D.

    2016-01-01

    Resonance in a dynamic system is to be avoided since it often leads to impaired performance, overstressing, fatigue fracture and adverse human reactions. Thus, it is necessary to know the modal frequencies and ensure they do not coincide with any applied periodic loadings. For a rotating planar mechanism, the coefficients in the mass and stiffness matrices are periodically varying, and if the underlying geometry and material properties are treated as random variables then the modal frequencies are both position-dependent and probabilistic. The avoidance of resonance is now a complex problem. Herein, free vibration analysis helps determine ranges of modal frequencies that in turn, identify the running speeds of the mechanism to be avoided. This paper presents an efficient and accurate sample-based approach to determine probabilistic minimum and maximum extremes of the fundamental frequencies and the angular positions of their occurrence. Then, given critical lower and upper frequency constraints it is straightforward to determine reliability in terms of probability of exceedance. The novelty of the proposed approach is that the original expensive and implicit mechanistic model is replaced by an explicit meta-model that captures the tolerances of the design variables over the entire range of angular positions: position-dependent eigenvalues can be found easily and quickly. Extreme-value statistics of the modal frequencies and extreme-value statistics of the angular positions are readily computed through MCS. Limit-state surfaces that connect the frequencies to the design variables may be easily constructed. Error analysis identifies three errors and the paper presents ways to control them so the methodology can be sufficiently accurate. A numerical example of a flexible four-bar linkage shows the proposed methodology has engineering applications. The impact of the proposed methodology is two-fold: it presents a safe-side analysis based on free vibration methods to
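
    The extreme-value idea can be conveyed with a Monte Carlo toy problem: for each random draw of the design variables, sweep a position-dependent stiffness through a full cycle, record the minimum and maximum fundamental frequency, and count violations of a critical frequency band. The one-degree-of-freedom model and all numbers below are illustrative stand-ins for the paper's four-bar-linkage eigenproblem and meta-model.

```python
import numpy as np

rng = np.random.default_rng(6)
theta = np.linspace(0.0, 2.0 * np.pi, 181)   # angular positions over a cycle
F_LO, F_HI = 9.0, 16.0                       # critical frequency band [Hz]

n_mc, violations = 20_000, 0
for _ in range(n_mc):
    k0 = rng.normal(8_000.0, 400.0)          # random nominal stiffness [N/m]
    m = rng.normal(1.0, 0.02)                # random mass [kg]
    k = k0 * (1.0 + 0.2 * np.cos(2.0 * theta))   # periodic stiffness variation
    f = np.sqrt(k / m) / (2.0 * np.pi)           # frequency vs. position
    if f.min() < F_LO or f.max() > F_HI:
        violations += 1

print(f"P(frequency extremes leave the band) ~ {violations / n_mc:.4f}")
```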

  20. What kind of cyber security? Theorising cyber security and mapping approaches

    OpenAIRE

    Laura Fichtner

    2018-01-01

    Building on conceptual work on security and cyber security, the paper explores how different approaches to cyber security are constructed. It outlines structural components and presents four common approaches. Each of them suggests a different role for the actors involved and is motivated and justified by different values such as privacy, economic order and national security. When a cyber security policy or initiative is chosen by policymakers, the analysis of the underlying approach enhances...

  1. A Spatiotemporal Indexing Approach for Efficient Processing of Big Array-Based Climate Data with MapReduce

    Science.gov (United States)

    Li, Zhenlong; Hu, Fei; Schnase, John L.; Duffy, Daniel Q.; Lee, Tsengdar; Bowen, Michael K.; Yang, Chaowei

    2016-01-01

    Climate observations and model simulations are producing vast amounts of array-based spatiotemporal data. Efficient processing of these data is essential for assessing global challenges such as climate change, natural disasters, and diseases. This is challenging not only because of the large data volume, but also because of the intrinsic high-dimensional nature of geoscience data. To tackle this challenge, we propose a spatiotemporal indexing approach to efficiently manage and process big climate data with MapReduce in a highly scalable environment. Using this approach, big climate data are directly stored in a Hadoop Distributed File System in its original, native file format. A spatiotemporal index is built to bridge the logical array-based data model and the physical data layout, which enables fast data retrieval when performing spatiotemporal queries. Based on the index, a data-partitioning algorithm is applied to enable MapReduce to achieve high data locality, as well as balancing the workload. The proposed indexing approach is evaluated using the National Aeronautics and Space Administration (NASA) Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. The experimental results show that the index can significantly accelerate querying and processing (10× speedup compared to the baseline test using the same computing cluster), while keeping the index-to-data ratio small (0.0328). The applicability of the indexing approach is demonstrated by a climate anomaly detection deployed on a NASA Hadoop cluster. This approach is also able to support efficient processing of general array-based spatiotemporal data in various geoscience domains without special configuration on a Hadoop cluster.
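
    The essence of the index is a lookup from a logical (variable, time) slice to a physical (file, offset, length) location, so a spatiotemporal query touches only the bytes it needs and MapReduce tasks can be scheduled where those bytes live. The miniature file layout below is made up for illustration; it is not the MERRA format.

```python
import os

# Create a stand-in "native format" file: two 64-byte time slices.
with open("demo.bin", "wb") as f:
    f.write(b"A" * 64 + b"B" * 64)

# Spatiotemporal index: (variable, date) -> (file, byte offset, length).
index = {("T2M", "2015-07-01"): ("demo.bin", 0, 64),
         ("T2M", "2015-07-02"): ("demo.bin", 64, 64)}

def read_slice(var, date):
    """Resolve the logical slice through the index, then read only
    that byte range from the file kept in its original format."""
    path, offset, length = index[(var, date)]
    with open(path, "rb") as fh:
        fh.seek(offset)
        return fh.read(length)

print(read_slice("T2M", "2015-07-02")[:8])   # -> b'BBBBBBBB'
os.remove("demo.bin")
```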

  2. Modeling eye movements in visual agnosia with a saliency map approach: bottom-up guidance or top-down strategy?

    Science.gov (United States)

    Foulsham, Tom; Barton, Jason J S; Kingstone, Alan; Dewhurst, Richard; Underwood, Geoffrey

    2011-08-01

    Two recent papers (Foulsham, Barton, Kingstone, Dewhurst, & Underwood, 2009; Mannan, Kennard, & Husain, 2009) report that neuropsychological patients with a profound object recognition problem (visual agnosic subjects) show differences from healthy observers in the way their eye movements are controlled when looking at images. The interpretation of these papers is that eye movements can be modeled as the selection of points on a saliency map, and that agnosic subjects show an increased reliance on visual saliency, i.e., brightness and contrast in low-level stimulus features. Here we review this approach and present new data from our own experiments with an agnosic patient that quantify the relationship between saliency and fixation location. In addition, we consider whether the perceptual difficulties of individual patients might be modeled by selectively weighting the different features involved in a saliency map. Our data indicate that saliency is not always a good predictor of fixation in agnosia: even for our agnosic subject, as for normal observers, the saliency-fixation relationship varied as a function of the task. This means that top-down processes still have a significant effect on the earliest stages of scanning in the setting of visual agnosia, indicating severe limitations for the saliency map model. Top-down, active strategies, which are the hallmark of our human visual system, play a vital role in eye movement control, whether we know what we are looking at or not. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Application of fuzzy logic approach for wind erosion hazard mapping in Laghouat region (Algeria) using remote sensing and GIS

    Science.gov (United States)

    Saadoud, Djouher; Hassani, Mohamed; Martin Peinado, Francisco José; Guettouche, Mohamed Saïd

    2018-06-01

    Wind erosion is one of the most serious environmental problems in Algeria, threatening human activities and socio-economic development. The main goal of this study is to apply a fuzzy logic approach to wind erosion sensitivity mapping in the Laghouat region, Algeria. Six causative factors, obtained by applying fuzzy membership functions to each parameter used, are considered: soil, vegetation cover, wind factor, soil dryness, land topography and land cover sensitivity. Different fuzzy operators (AND, OR, SUM, PRODUCT, and GAMMA) are applied to generate the wind-erosion hazard map, as sketched below. Success rate curves reveal that the fuzzy gamma (γ) operator, with γ equal to 0.9, gives the best prediction accuracy, with an area under curve of 85.2%. The resulting wind-erosion sensitivity map delineates the area into zones of five relative sensitivity classes: very high, high, moderate, low and very low. The estimated result was verified by field measurements and a highly statistically significant chi-square test.
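
    The fuzzy overlay operators named above have closed forms: for memberships u_1..u_n, AND is the minimum, OR the maximum, PRODUCT the product, SUM equals 1 - Π(1 - u_i), and GAMMA equals SUM^γ · PRODUCT^(1-γ). A small sketch with placeholder membership grids:

```python
import numpy as np

def fuzzy_overlay(memberships, op="gamma", gamma=0.9):
    """memberships: list of arrays in [0, 1], one per causative factor."""
    u = np.stack(memberships)                 # (n_factors, H, W)
    prod = np.prod(u, axis=0)
    fsum = 1.0 - np.prod(1.0 - u, axis=0)     # fuzzy algebraic sum
    ops = {"and": u.min(axis=0), "or": u.max(axis=0),
           "product": prod, "sum": fsum,
           "gamma": fsum**gamma * prod**(1.0 - gamma)}
    return ops[op]

# Six placeholder membership grids (soil, vegetation, wind, dryness,
# topography, land cover), each already scaled to [0, 1]:
rng = np.random.default_rng(7)
factors = [rng.uniform(0.0, 1.0, (4, 4)) for _ in range(6)]
hazard = fuzzy_overlay(factors, op="gamma", gamma=0.9)  # γ = 0.9, as in the study
print(hazard.round(2))
```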

  4. A novel continuous colour mapping approach for visualization of facial skin hydration and transepidermal water loss for four ethnic groups.

    Science.gov (United States)

    Voegeli, R; Rawlings, A V; Seroul, P; Summers, B

    2015-12-01

    The aim of this exploratory study was to develop a novel colour mapping approach to visualize and interpret the complexity of the facial skin hydration and barrier properties of four ethnic groups (Caucasians, Indians, Chinese and Black Africans) living in Pretoria, South Africa. We measured transepidermal water loss (TEWL) and skin capacitance on 30 pre-defined sites on the forehead, cheek, jaw and eye areas of sixteen women (four per ethnic group) and took digital images of their faces. Continuous colour maps were generated by interpolating between each measured value and superimposing the values on the digital images. The complexity of facial skin hydration and skin barrier properties is revealed by these measurements and visualized by the continuous colour maps of the digital images. Overall, the Caucasian subjects had the best barrier properties, followed by the Black African, Chinese and Indian subjects. Nevertheless, the two more darkly pigmented ethnic groups had superior skin hydration properties. Subtle differences were seen when examining the different facial sites. There exist remarkable skin capacitance and TEWL gradients within short distances on selected areas of the face. These gradients are distinctive in the different ethnic groups. In contrast to other reports, we found that darkly pigmented skin does not always have a superior barrier function and that differences in skin hydration values are complex on the different parts of the face among the different ethnic groups. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  5. Approaches to vegetation mapping and ecophysiological hypothesis testing using combined information from TIMS, AVIRIS, and AIRSAR

    Science.gov (United States)

    Oren, R.; Vane, G.; Zimmermann, R.; Carrere, V.; Realmuto, V.; Zebker, Howard A.; Schoeneberger, P.; Schoeneberger, M.

    1991-01-01

    The Tropical Rainforest Ecology Experiment (TREE) had two primary objectives: (1) to design a method for mapping vegetation in tropical regions using remote sensing and determine whether the result improves on available vegetation maps; and (2) to test a specific hypothesis on plant/water relations. Both objectives were thought achievable with the combined information from the Thermal Infrared Multispectral Scanner (TIMS), Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), and Airborne Synthetic Aperture Radar (AIRSAR). Implicitly, two additional objectives were: (1) to ascertain that the range within each variable potentially measurable with the three instruments is large enough in the site, relative to the sensitivity of the instruments, so that differences between ecological groups may be detectable; and (2) to determine the ability of the three systems to quantify different variables and sensitivities. We found that the ranges in values of foliar nitrogen concentration, water availability, stand structure and species composition, and plant/water relations were large, even within the upland broadleaf vegetation type. The range was larger when other vegetation types were considered. Unfortunately, cloud cover and navigation errors compromised the utility of the TIMS and AVIRIS data. Nevertheless, the AIRSAR data alone appear to have improved on the available vegetation map for the study area. An example from an area converted to a farm is given to demonstrate how the combined information from AIRSAR, TIMS, and AVIRIS can uniquely identify distinct classes of land use. The example alludes to the potential utility of the three instruments for identifying vegetation at an ecological scale finer than vegetation types.

  6. Karst groundwater protection: First application of a Pan-European Approach to vulnerability, hazard and risk mapping in the Sierra de Libar (Southern Spain)

    Energy Technology Data Exchange (ETDEWEB)

    Andreo, Bartolome [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain)]. E-mail: Andreo@uma.es; Goldscheider, Nico [Centre of Hydrogeology, University of Neuchatel, 11 rue Emile-Argand, CH-2007 Neuchatel (Switzerland); Vadillo, Inaki [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain); Vias, Jesus Maria [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain); Neukum, Christoph [Department of Applied Geology, University of Karlsruhe, Kaiserstrasse, 12, D-76128 Karlsruhe (Germany); Sinreich, Michael [Centre of Hydrogeology, University of Neuchatel, 11 rue Emile-Argand, CH-2007 Neuchatel (Switzerland); Jimenez, Pablo [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain); Brechenmacher, Julia [Department of Applied Geology, University of Karlsruhe, Kaiserstrasse, 12, D-76128 Karlsruhe (Germany); Carrasco, Francisco [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain); Hoetzl, Heinz [Department of Applied Geology, University of Karlsruhe, Kaiserstrasse, 12, D-76128 Karlsruhe (Germany); Perles, Maria Jesus [Group of Hydrogeology, Faculty of Science, University of Malaga, Campus de Teatinos, E-29071 Malaga (Spain); Zwahlen, Francois [Centre of Hydrogeology, University of Neuchatel, 11 rue Emile-Argand, CH-2007 Neuchatel (Switzerland)

    2006-03-15

    The European COST action 620 proposed a comprehensive approach to karst groundwater protection, comprising methods of intrinsic and specific vulnerability mapping, validation of vulnerability maps, hazard and risk mapping. This paper presents the first application of all components of this Pan-European Approach to the Sierra de Libar, a karst hydrogeology system in Andalusia, Spain. The intrinsic vulnerability maps take into account the hydrogeological characteristics of the area but are independent from specific contaminant properties. Two specific vulnerability maps were prepared for faecal coliforms and BTEX. These maps take into account the specific properties of these two groups of contaminants and their interaction with the karst hydrogeological system. The vulnerability assessment was validated by means of tracing tests, hydrological, hydrochemical and isotope methods. The hazard map shows the localization of potential contamination sources resulting from human activities, and evaluates those according to their dangerousness. The risk of groundwater contamination depends on the hazards and the vulnerability of the aquifer system. The risk map for the Sierra de Libar was thus created by overlaying the hazard and vulnerability maps.

  7. Karst groundwater protection: First application of a Pan-European Approach to vulnerability, hazard and risk mapping in the Sierra de Libar (Southern Spain)

    International Nuclear Information System (INIS)

    Andreo, Bartolome; Goldscheider, Nico; Vadillo, Inaki; Vias, Jesus Maria; Neukum, Christoph; Sinreich, Michael; Jimenez, Pablo; Brechenmacher, Julia; Carrasco, Francisco; Hoetzl, Heinz; Perles, Maria Jesus; Zwahlen, Francois

    2006-01-01

    The European COST action 620 proposed a comprehensive approach to karst groundwater protection, comprising methods of intrinsic and specific vulnerability mapping, validation of vulnerability maps, hazard and risk mapping. This paper presents the first application of all components of this Pan-European Approach to the Sierra de Libar, a karst hydrogeology system in Andalusia, Spain. The intrinsic vulnerability maps take into account the hydrogeological characteristics of the area but are independent from specific contaminant properties. Two specific vulnerability maps were prepared for faecal coliforms and BTEX. These maps take into account the specific properties of these two groups of contaminants and their interaction with the karst hydrogeological system. The vulnerability assessment was validated by means of tracing tests, hydrological, hydrochemical and isotope methods. The hazard map shows the localization of potential contamination sources resulting from human activities, and evaluates those according to their dangerousness. The risk of groundwater contamination depends on the hazards and the vulnerability of the aquifer system. The risk map for the Sierra de Libar was thus created by overlaying the hazard and vulnerability maps

  8. Coupled map lattice (CML) approach to power reactor dynamics (I) - preservation of normality

    International Nuclear Information System (INIS)

    Konno, H.

    1996-01-01

    An application of a coupled map lattice (CML) model for simulating power fluctuations in nuclear power reactors is presented. (1) Preservation of Gaussianity in the point model is studied in a chaotic-force-driven Langevin equation in conjunction with the Gaussian-white-noise-driven Langevin equation. (2) Preservation of Gaussianity is also studied in the space-dependent model with the use of a CML model near the onset of the Hopf bifurcation point. It is shown that the spatial dimensionality decreases as the maximum eigenvalue of the system increases. The result is consistent with the observation of neutron fluctuation in a BWR. (author)
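
    As a rough illustration of the coupled map lattice concept invoked above, the following Python sketch iterates a diffusively coupled lattice of logistic maps; the local map, coupling scheme, and parameter values are generic textbook choices, not the specific reactor model of the paper.

```python
import numpy as np

# Diffusively coupled logistic-map lattice (illustrative parameters only).
N, steps = 64, 500       # lattice sites, time steps
eps, r = 0.3, 3.9        # coupling strength, logistic-map parameter

f = lambda x: r * x * (1.0 - x)   # local chaotic map

rng = np.random.default_rng(0)
x = rng.random(N)                 # random initial state

for _ in range(steps):
    fx = f(x)
    # Each site mixes its own map output with its neighbours' (periodic boundary).
    x = (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))

print(x.mean(), x.std())          # crude statistics of the spatial fluctuation
```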

  9. Information rich mapping requirement to product architecture through functional system deployment: The multi entity domain approach

    DEFF Research Database (Denmark)

    Hauksdóttir, Dagný; Mortensen, Niels Henrik

    2017-01-01

    may impede the ability to evolve, maintain or reuse systems. In this paper the Multi Entity Domain Approach (MEDA) is presented. The approach combines different design information within the domain views, incorporates both Software and Hardware design and supports iterative requirements definition...

  10. Mapping suitability of rice production systems for mitigation: Strategic approach for prioritizing improved irrigation management across scales

    Science.gov (United States)

    Wassmann, Reiner; Sander, Bjoern Ole

    2016-04-01

    After the successful conclusion of the COP21 in Paris, many developing countries are now embracing the task of reducing emissions with much more vigor than previously. In many countries of South and South-East Asia, the agriculture sector constitutes a vast share of the national GHG budget, which can mainly be attributed to methane emissions from flooded rice production. Thus, rice growing countries are now looking for tangible and easily accessible information on how to reduce emissions from rice production in an efficient manner. Given present and future food demand, mitigation options will have to comply with the aim of increasing productivity. At the same time, limited financial resources demand strategic planning of potential mitigation projects based on cost-benefit ratios. At this point, the most promising approach for mitigating methane emissions from rice is an irrigation technique called Alternate Wetting and Drying (AWD). AWD was initially developed for saving water and, consequently, represents an adaptation strategy in its own right by coping with less rainfall. Moreover, AWD also reduces methane emissions by 30-70%. However, AWD is not universally suitable. It is attractive to farmers who have to pump water and may save fuel under AWD, but it offers limited incentives in situations where there is no pressing water scarcity. Thus, planning for AWD adoption at larger scales, e.g. for country-wide programs, should be based on a systematic prioritization of target environments. This presentation encompasses a new methodology for mapping the suitability of water-saving in rice production - as a means for planning adaptation and mitigation programs - alongside preliminary results. The latter comprise three new GIS maps on climate-driven suitability of AWD in major rice growing countries (Philippines, Vietnam, Bangladesh). These maps have been derived from high-resolution data of the areal and temporal extent of rice production that are now

  11. Randomness-Based Scale-Chromatic Image Analysis for Interactive Mapping on Satellite-Roadway-Vehicle Network

    Directory of Open Access Journals (Sweden)

    Kohji Kamejima

    2007-08-01

    Full Text Available A new framework is presented for integrating satellite/avionics sensors with onboard vision to support information-intensive maneuvering. Real-time binding of the bird's eye observation and the driver's view via GPS provides an as-is basis for perception and decision. A randomness-based roadway pattern model is implemented by a fractal coding scheme associating the bird's eye and frontal views. The feasibility of the framework, with requirements for the vision system, is discussed through concept modeling and experimental studies.

  13. Designing a workplace return-to-work program for occupational low back pain: an intervention mapping approach

    Directory of Open Access Journals (Sweden)

    Ammendolia Carlo

    2009-06-01

    Full Text Available Abstract Background Despite over 2 decades of research, the ability to prevent work-related low back pain (LBP) and disability remains elusive. Recent research suggests that interventions that are focused at the workplace and incorporate the principles of participatory ergonomics and return-to-work (RTW) coordination can improve RTW and reduce disability following a work-related back injury. Workplace interventions or programs to improve RTW are difficult to design and implement given the various individuals and environments involved, each with their own unique circumstances. Intervention mapping provides a framework for designing and implementing complex interventions or programs. The objective of this study is to design a best-evidence RTW program for occupational LBP tailored to the Ontario setting using an intervention mapping approach. Methods We used a qualitative synthesis based on the intervention mapping methodology. Best evidence from systematic reviews, practice guidelines and key articles on the prognosis and management of LBP and improving RTW was combined with theoretical models for managing LBP and changing behaviour. This was then systematically operationalized into a RTW program using consensus among experts and stakeholders. The RTW program was further refined following feedback from nine focus groups with various stakeholders. Results A detailed five-step RTW program was developed. The key features of the program include: having trained personnel coordinate the RTW process, identifying and ranking barriers and solutions to RTW from the perspective of all important stakeholders, mediating practical solutions at the workplace, and empowering the injured worker in RTW decision-making. Conclusion Intervention mapping provided a useful framework to develop a comprehensive RTW program tailored to the Ontario setting.

  14. Designing a workplace return-to-work program for occupational low back pain: an intervention mapping approach

    Science.gov (United States)

    Ammendolia, Carlo; Cassidy, David; Steensta, Ivan; Soklaridis, Sophie; Boyle, Eleanor; Eng, Stephanie; Howard, Hamer; Bhupinder, Bains; Côté, Pierre

    2009-01-01

    Background Despite over 2 decades of research, the ability to prevent work-related low back pain (LBP) and disability remains elusive. Recent research suggests that interventions that are focused at the workplace and incorporate the principles of participatory ergonomics and return-to-work (RTW) coordination can improve RTW and reduce disability following a work-related back injury. Workplace interventions or programs to improve RTW are difficult to design and implement given the various individuals and environments involved, each with their own unique circumstances. Intervention mapping provides a framework for designing and implementing complex interventions or programs. The objective of this study is to design a best-evidence RTW program for occupational LBP tailored to the Ontario setting using an intervention mapping approach. Methods We used a qualitative synthesis based on the intervention mapping methodology. Best evidence from systematic reviews, practice guidelines and key articles on the prognosis and management of LBP and improving RTW was combined with theoretical models for managing LBP and changing behaviour. This was then systematically operationalized into a RTW program using consensus among experts and stakeholders. The RTW program was further refined following feedback from nine focus groups with various stakeholders. Results A detailed five-step RTW program was developed. The key features of the program include: having trained personnel coordinate the RTW process, identifying and ranking barriers and solutions to RTW from the perspective of all important stakeholders, mediating practical solutions at the workplace, and empowering the injured worker in RTW decision-making. Conclusion Intervention mapping provided a useful framework to develop a comprehensive RTW program tailored to the Ontario setting. PMID:19508728

  15. Snapshots for Semantic Maps

    National Research Council Canada - National Science Library

    Nielsen, Curtis W; Ricks, Bob; Goodrich, Michael A; Bruemmer, David; Few, Doug; Walton, Miles

    2004-01-01

    .... Semantic maps are a relatively new approach to information presentation. Semantic maps provide more detail about an environment than typical maps because they are augmented by icons or symbols that provide meaning for places or objects of interest...

  16. Local lattice relaxations in random metallic alloys: Effective tetrahedron model and supercell approach

    DEFF Research Database (Denmark)

    Ruban, Andrei; Simak, S.I.; Shallcross, S.

    2003-01-01

    We present a simple effective tetrahedron model for local lattice relaxation effects in random metallic alloys on simple primitive lattices. A comparison with direct ab initio calculations for supercells representing random Ni0.50Pt0.50 and Cu0.25Au0.75 alloys as well as the dilute limit of Au-rich CuAu alloys shows that the model yields a quantitatively accurate description of the relaxation energies in these systems. Finally, we discuss the bond length distribution in random alloys....

  17. Progress in national-scale landslide susceptibility mapping in Romania using a combined statistical-heuristical approach

    Science.gov (United States)

    Bălteanu, Dan; Micu, Mihai; Malet, Jean-Philippe; Jurchescu, Marta; Sima, Mihaela; Kucsicsa, Gheorghe; Dumitrică, Cristina; Petrea, Dănuţ; Mărgărint, Ciprian; Bilaşco, Ştefan; Văcăreanu, Radu; Georgescu, Sever; Senzaconi, Francisc

    2017-04-01

    Landslide processes represent a very widespread geohazard in Romania, affecting mainly the hilly and plateau regions as well as the mountain sectors developed on flysch formations. Two main projects provided the framework for improving the existing national landslide susceptibility map (Bălteanu et al. 2010): the ELSUS (Pan-European and nation-wide landslide susceptibility assessment, EC-CERG) and the RO-RISK (Disaster Risk Evaluation at National Level, ESF-POCA) projects. The latter, a flagship project aiming at strengthening risk prevention and management in Romania, focused on a national-level evaluation of the main risks in the country, including landslides. The strategy for modeling landslide susceptibility was designed based on the experience gained from continental and national level assessments conducted in the frame of the International Programme on Landslides (IPL) project IPL-162, the European Landslides Expert Group - JRC and the ELSUS project. The newly proposed landslide susceptibility model used as input a reduced set of landslide conditioning factor maps available at scales of 1:100,000 - 1:200,000 and consisting of lithology, slope angle and land cover. The input data were further differentiated for specific natural environments, defined here as morpho-structural units, in order to incorporate differences induced by elevation (vertical climatic zonation), morpho-structure and neotectonic features. In order to best discern the specific landslide conditioning elements, the analysis was carried out for one single process category, namely slides. The existence of a landslide inventory covering the whole country's territory (~30,000 records; Micu et al. 2014), although affected by incompleteness and lack of homogeneity, allowed for the application of a semi-quantitative, mixed statistical-heuristical approach having the advantage of combining the objectivity of statistics with expert knowledge in calibrating class and factor weights. The

  18. A computational approach for functional mapping of quantitative trait loci that regulate thermal performance curves.

    Directory of Open Access Journals (Sweden)

    John Stephen Yap

    2007-06-01

    Full Text Available Whether and how the thermal reaction norm is under genetic control is fundamental to understanding the mechanistic basis of adaptation to novel thermal environments. However, the genetic study of the thermal reaction norm is difficult because it is often expressed as a continuous function or curve. Here we derive a statistical model for dissecting thermal performance curves into individual quantitative trait loci (QTL) with the aid of a genetic linkage map. The model is constructed within the maximum likelihood context and implemented with the EM algorithm. It integrates the biological principle of responses to temperature into a framework for genetic mapping through rigorous mathematical functions established to describe the pattern and shape of thermal reaction norms. The biological advantages of the model lie in the decomposition of the genetic causes for the thermal reaction norm into its biologically interpretable modes, such as hotter-colder, faster-slower and generalist-specialist, as well as the formulation of a series of hypotheses at the interface between genetic actions/interactions and temperature-dependent sensitivity. The model is also meritorious in statistics because the precision of parameter estimation and power of QTL detection can be increased by modeling the mean-covariance structure with a small set of parameters. The results from simulation studies suggest that the model displays favorable statistical properties and can be robust in practical genetic applications. The model provides a conceptual platform for testing many ecologically relevant hypotheses regarding organismic adaptation within the Eco-Devo paradigm.

  19. An object-based approach for tree species extraction from digital orthophoto maps

    Science.gov (United States)

    Jamil, Akhtar; Bayram, Bulent

    2018-05-01

    Tree segmentation is an active and ongoing research area in the field of photogrammetry and remote sensing. It is particularly challenging due to both intra-class and inter-class similarities among various tree species. In this study, we exploited various statistical features for the extraction of hazelnut trees from 1:5000-scale digital orthophoto maps. Initially, the non-vegetation areas were eliminated using the traditional normalized difference vegetation index (NDVI), followed by the application of mean shift segmentation to transform the pixels into meaningful homogeneous objects. In order to eliminate false positives, morphological opening and closing were employed on candidate objects. A number of heuristics, based for example on shadow effects and bounding-box aspect ratios, were also derived to eliminate unwanted objects before passing them into the classification stage. Finally, a knowledge-based decision tree was constructed to distinguish the hazelnut trees from the rest of the objects, which include man-made objects and other types of vegetation. We evaluated the proposed methodology on 10 sample orthophoto maps obtained from Giresun province in Turkey. The manually digitized hazelnut tree boundaries were taken as reference data for accuracy assessment. Both the manually digitized and the segmented tree borders were converted into binary images and the differences were calculated. According to the obtained results, the proposed methodology achieved an overall accuracy of more than 85% for all sample images.
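
    The NDVI masking step described above is simple to express in code. Below is a minimal sketch assuming NIR and red reflectance arrays; the 0.2 threshold is an illustrative assumption, since the paper does not report its exact cut-off.

```python
import numpy as np

def ndvi_mask(nir: np.ndarray, red: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Boolean vegetation mask from NIR and red reflectance bands."""
    ndvi = (nir - red) / (nir + red + 1e-9)   # epsilon avoids division by zero
    return ndvi > threshold

# Tiny synthetic example: True where vegetation is likely.
nir = np.array([[0.5, 0.1], [0.6, 0.2]])
red = np.array([[0.1, 0.1], [0.2, 0.3]])
print(ndvi_mask(nir, red))
```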

  20. Evolutionary Feature Selection for Big Data Classification: A MapReduce Approach

    Directory of Open Access Journals (Sweden)

    Daniel Peralta

    2015-01-01

    Full Text Available Nowadays, many disciplines have to deal with big datasets that additionally involve a high number of features. Feature selection methods aim at eliminating noisy, redundant, or irrelevant features that may deteriorate the classification performance. However, traditional methods lack enough scalability to cope with datasets of millions of instances and extract successful results in a delimited time. This paper presents a feature selection algorithm based on evolutionary computation that uses the MapReduce paradigm to obtain subsets of features from big datasets. The algorithm decomposes the original dataset into blocks of instances to learn from them in the map phase; then, the reduce phase merges the obtained partial results into a final vector of feature weights, which allows a flexible application of the feature selection procedure using a threshold to determine the selected subset of features. The feature selection method is evaluated by using three well-known classifiers (SVM, Logistic Regression, and Naive Bayes) implemented within the Spark framework to address big data problems. In the experiments, datasets of up to 67 million instances and up to 2000 attributes have been managed, showing that this is a suitable framework to perform evolutionary feature selection, improving both the classification accuracy and the runtime when dealing with big data problems.
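
    The map/reduce structure of the algorithm can be sketched as follows. This is a toy stand-in, not the authors' Spark implementation: absolute correlation replaces the per-block evolutionary search and a plain mean replaces their merge, but the flow (partial weights per block, merged and thresholded) matches the description above.

```python
import numpy as np

def map_phase(block_X, block_y):
    # Stand-in for the evolutionary search on one block of instances:
    # score each feature by absolute correlation with the label.
    return np.array([abs(np.corrcoef(block_X[:, j], block_y)[0, 1])
                     for j in range(block_X.shape[1])])

def reduce_phase(partial_weights, threshold=0.5):
    merged = np.mean(partial_weights, axis=0)   # merge partial weight vectors
    return np.where(merged > threshold)[0]      # indices of selected features

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = (X[:, 0] + 0.1 * rng.normal(size=300) > 0).astype(int)  # feature 0 informative

blocks = np.array_split(np.arange(300), 3)
partials = [map_phase(X[b], y[b]) for b in blocks]
print(reduce_phase(partials))   # expected: [0]
```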

  1. Charting a course to competency: an approach to mapping public health core competencies to existing trainings.

    Science.gov (United States)

    Neiworth, Latrissa L; Allan, Susan; D'Ambrosio, Luann; Coplen-Abrahamson, Marlene

    2014-03-01

    Consistent with other professional fields, the goals of public health training have moved from a focus on knowledge transfer to the development of skills or competencies. At least six national competency sets have been developed in the past decade pertaining to public health professionals. State and local public health agencies are increasingly using competency sets as frameworks for staff development and assessment. Mapping competencies to training has potential for enhancing the value of public health training during resource-constrained times by directly linking training content to the desired skills. For existing public health trainings, the challenge is how to identify competencies addressed in those courses in a manner that is not burdensome and that produces valid results. This article describes a process for mapping competencies to the learning objectives, assignments, and assessments of existing trainings. The process presented could be used by any training center or organization that seeks to connect public health workforce competencies to previously developed instruction. Public health practice can be strengthened more effectively if trainings can be selected for the desired practice skills or competencies.

  2. New Geospatial Approaches for Efficiently Mapping Forest Biomass Logistics at High Resolution over Large Areas

    Directory of Open Access Journals (Sweden)

    John Hogland

    2018-04-01

    Full Text Available Adequate biomass feedstock supply is an important factor in evaluating the financial feasibility of alternative site locations for bioenergy facilities and for maintaining profitability once a facility is built. We used newly developed spatial analysis and logistics software to model the variables influencing feedstock supply and to estimate and map two components of the supply chain for a bioenergy facility: (1) the total biomass stocks available within an economically efficient transportation distance; (2) the cost of logistics to move the required stocks from the forest to the facility. Both biomass stocks and flows have important spatiotemporal dynamics that affect procurement costs and project viability. Though seemingly straightforward, these two components can be difficult to quantify and map accurately in a useful and spatially explicit manner. For an 8 million hectare study area, we used raster-based methods and tools to quantify and visualize these supply metrics at 10 m² spatial resolution. The methodology and software leverage a novel raster-based least-cost path modeling algorithm that quantifies off-road and on-road transportation and other logistics costs. The results of the case study highlight the efficiency, flexibility, fine resolution, and spatial complexity of model outputs developed for facility siting and procurement planning.
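
    The raster least-cost routing at the core of the logistics model can be illustrated with a generic Dijkstra traversal of a cost grid. This is a standard sketch under assumed simplifications (4-connected neighbourhood, toy cost surface), not the authors' software.

```python
import heapq
import numpy as np

def least_cost(cost: np.ndarray, start, goal) -> float:
    """Accumulated cost of the cheapest 4-connected path through a cost raster."""
    rows, cols = cost.shape
    dist = np.full(cost.shape, np.inf)
    dist[start] = cost[start]
    heap = [(cost[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist[r, c]:
            continue          # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and d + cost[nr, nc] < dist[nr, nc]:
                dist[nr, nc] = d + cost[nr, nc]
                heapq.heappush(heap, (dist[nr, nc], (nr, nc)))
    return float("inf")

terrain = np.array([[1, 1, 5],
                    [5, 1, 5],
                    [5, 1, 1]], dtype=float)
print(least_cost(terrain, (0, 0), (2, 2)))   # follows the cheap corridor: 5.0
```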

  3. What kind of cyber security? Theorising cyber security and mapping approaches

    Directory of Open Access Journals (Sweden)

    Laura Fichtner

    2018-05-01

    Full Text Available Building on conceptual work on security and cyber security, the paper explores how different approaches to cyber security are constructed. It outlines structural components and presents four common approaches. Each of them suggests a different role for the actors involved and is motivated and justified by different values such as privacy, economic order and national security. When a cyber security policy or initiative is chosen by policymakers, the analysis of the underlying approach enhances our understanding of how this shapes relationships between actors and of the values prioritised, promoted and inscribed into the concerned technologies.

  4. Application of a new genetic classification and semi-automated geomorphic mapping approach in the Perth submarine canyon, Australia

    Science.gov (United States)

    Picard, K.; Nanson, R.; Huang, Z.; Nichol, S.; McCulloch, M.

    2017-12-01

    The acquisition of high-resolution marine geophysical data has intensified in recent years (e.g. multibeam echo-sounding, sub-bottom profiling). This progress provides the opportunity to classify and map the seafloor in greater detail, using new methods that preserve the links between processes and morphology. Geoscience Australia has developed a new genetic classification approach, nested within the Harris et al. (2014) global seafloor mapping framework. The approach divides parent units into sub-features based on established classification schemes and feature descriptors defined by Bradwell et al. (2016: http://nora.nerc.ac.uk/), the International Hydrographic Organization (https://www.iho.int) and the Coastal Marine and Ecological Classification Standard (https://www.cmecscatalog.org). Owing to the ecological significance of submarine canyon systems in particular, much recent attention has focused on defining their variation in form and process, whereby they can be classified using a range of topographic metrics, fluvial dis/connection and shelf-incising status. The Perth Canyon is incised into the continental slope and shelf of southwest Australia, covering an area of >1500 km² and extending from 4700 m water depth to the shelf break in 170 m. The canyon sits within a Marine Protected Area, incorporating a Marine National Park and Habitat Protection Zone in recognition of its benthic and pelagic biodiversity values. However, detailed information on the spatial patterns of the seabed habitats that influence this biodiversity is lacking. Here we use 20 m resolution bathymetry and acoustic backscatter data acquired in 2015 by the Schmidt Ocean Institute, plus sub-bottom datasets and sediment samples collected by Geoscience Australia in 2005, to apply the new geomorphic classification system to the Perth Canyon. This presentation will show the results of the geomorphic feature mapping of the canyon and its application to better defining potential benthic habitats.

  5. An alternative approach to treating lateral epicondylitis. A randomized, placebo-controlled, double-blinded study

    NARCIS (Netherlands)

    Nourbakhsh, Mohammad Reza; Fearon, Frank J.

    Objective: To investigate the effect of noxious-level electrical stimulation on pain, grip strength and functional abilities in subjects with chronic lateral epicondylitis. Design: Randomized, placebo-controlled, double-blinded study. Setting: Physical Therapy Department, North Georgia College and

  6. Using the Hadoop/MapReduce approach for monitoring the CERN storage system and improving the ATLAS computing model

    CERN Document Server

    Russo, Stefano Alberto; Lamanna, M

    The processing of huge amounts of data, already a fundamental task for research in the elementary particle physics field, is becoming more and more important also for companies operating in the Information Technology (IT) industry. In this context, if conventional approaches are adopted several problems arise, starting with the congestion of the communication channels. In the IT sector, one of the approaches designed to minimize this congestion is to exploit data locality, or in other words, to bring the computation as close as possible to where the data resides. The most common implementation of this concept is the Hadoop/MapReduce framework. In this thesis work I evaluate the usage of Hadoop/MapReduce in two areas: a standard one similar to typical IT analyses, and an innovative one related to high energy physics analyses. The first consists in monitoring the history of the storage cluster which stores the data generated by the LHC experiments, the second in the physics analysis of the latter, ...

  7. Scaling Watershed Models: Modern Approaches to Science Computation with MapReduce, Parallelization, and Cloud Optimization

    Science.gov (United States)

    Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...

  8. Subtraction of random coincidences in γ-ray spectroscopy: A new approach

    International Nuclear Information System (INIS)

    Pattabiraman, N.S.; Ghugre, S.S.; Basu, S.K.; Garg, U.; Ray, S.; Sinha, A.K.; Zhu, S.

    2006-01-01

    A new analytical method for estimation and subsequent subtraction of random coincidences has been developed. It utilizes the knowledge of the counts in the main diagonal of a background-subtracted symmetric data set for the estimation of the events originating from random coincidences. This procedure has been successfully applied to several data sets. It could be a valuable tool for low-fold data sets, especially for low-cross-section events
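
    The diagonal-based estimate can be sketched under the common assumption that random coincidences factorize as an outer product of the singles spectrum, with the main diagonal fixing the overall scale. The snippet illustrates that principle only; it is not the authors' exact procedure.

```python
import numpy as np

def subtract_randoms(matrix: np.ndarray) -> np.ndarray:
    """Subtract an outer-product randoms estimate scaled to the main diagonal."""
    singles = matrix.sum(axis=0)                    # projection ~ singles spectrum
    randoms = np.outer(singles, singles)
    scale = np.trace(matrix) / np.trace(randoms)    # match the diagonal counts
    return matrix - scale * randoms

sym = np.array([[4.0, 2.0, 1.0],
                [2.0, 6.0, 2.0],
                [1.0, 2.0, 4.0]])
print(subtract_randoms(sym))
```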

  9. Analyzing the impact of social factors on homelessness: a fuzzy cognitive map approach.

    Science.gov (United States)

    Mago, Vijay K; Morden, Hilary K; Fritz, Charles; Wu, Tiankuang; Namazi, Sara; Geranmayeh, Parastoo; Chattopadhyay, Rakhi; Dabbaghian, Vahid

    2013-08-23

    The forces which affect homelessness are complex and often interactive in nature. Social forces such as addictions, family breakdown, and mental illness are compounded by structural forces such as lack of available low-cost housing, poor economic conditions, and insufficient mental health services. Together these factors impact levels of homelessness through their dynamic relations. Historic models, which are static in nature, have only been marginally successful in capturing these relationships. Fuzzy Logic (FL) and fuzzy cognitive maps (FCMs) are particularly suited to the modeling of complex social problems, such as homelessness, due to their inherent ability to model intricate, interactive systems often described in vague conceptual terms and then organize them into a specific, concrete form (i.e., the FCM) which can be readily understood by social scientists and others. Using FL we converted information, taken from recently published, peer-reviewed articles, for a select group of factors related to homelessness and then calculated the strength of influence (weights) for pairs of factors. We then used these weighted relationships in a FCM to test the effects of increasing or decreasing individual or groups of factors. Results of these trials were explainable according to current empirical knowledge related to homelessness. Prior graphic maps of homelessness have been of limited use due to the dynamic nature of the concepts related to homelessness. The FCM technique captures greater degrees of dynamism and complexity than static models, allowing relevant concepts to be manipulated and their interactions explored. This, in turn, allows for a much more realistic picture of homelessness. Through network analysis of the FCM we determined that Education exerts the greatest force in the model and hence impacts the dynamism and complexity of a social problem such as homelessness. The FCM built to model the complex social system of homelessness reasonably represented reality for the
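
    The core FCM iteration is compact enough to sketch. The concepts, weight matrix, and the self-memory update rule below are illustrative assumptions, not the weights derived in the paper.

```python
import numpy as np

concepts = ["education", "addiction", "housing_cost", "homelessness"]
W = np.array([             # W[i, j]: influence of concept i on concept j
    [0.0, -0.4, 0.0, -0.5],
    [0.0,  0.0, 0.0,  0.6],
    [0.0,  0.0, 0.0,  0.7],
    [0.0,  0.3, 0.0,  0.0],
])

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

a = np.array([0.8, 0.3, 0.6, 0.5])   # initial activation scenario
for _ in range(50):                   # iterate toward a fixed point
    a = sigmoid(W.T @ a + a)          # incoming influence plus self-memory
print(dict(zip(concepts, np.round(a, 3))))
```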

  10. Dynamical zeta functions and dynamical determinants for hyperbolic maps a functional approach

    CERN Document Server

    Baladi, Viviane

    2018-01-01

    The spectra of transfer operators associated to dynamical systems, when acting on suitable Banach spaces, contain key information about the ergodic properties of the systems. Focusing on expanding and hyperbolic maps, this book gives a self-contained account on the relation between zeroes of dynamical determinants, poles of dynamical zeta functions, and the discrete spectra of the transfer operators. In the hyperbolic case, the first key step consists in constructing a suitable Banach space of anisotropic distributions. The first part of the book is devoted to the easier case of expanding endomorphisms, showing how the (isotropic) function spaces relevant there can be studied via Paley–Littlewood decompositions, and allowing easier access to the construction of the anisotropic spaces which is performed in the second part. This is the first book describing the use of anisotropic spaces in dynamics. Aimed at researchers and graduate students, it presents results and techniques developed since the beginning of...

  11. DOES GENDER EQUALITY LEAD TO BETTER-PERFORMING ECONOMIES? A BAYESIAN CAUSAL MAP APPROACH

    Directory of Open Access Journals (Sweden)

    Yelda YÜCEL

    2017-01-01

    Full Text Available This study explores the existence of relationships between gender inequalities –represented by the components of the World Economic Forum (WEF) Global Gender Gap Index– and the major macroeconomic indicators. The relationships within gender inequalities in education, the labour market, health and the political arena, and between gender inequalities and gross macroeconomic aggregates were modelled with the Bayesian Causal Map, an effective tool that is used to analyze cause-effect relations and conditional dependencies between variables. A data set of 128 countries during the period 2007–2011 is used. Findings reveal that some inequalities have high levels of interaction with each other. In addition, eradicating gender inequalities is found to be associated with better economic performance, mainly in the form of higher gross domestic product growth, investment, and competitiveness.

  12. Processing parameters for the mechanical working of 9 Cr-1 Mo steel: processing maps approach

    Energy Technology Data Exchange (ETDEWEB)

    Sivaprasad, P.V.; Mannan, S.L.; Prasad, Y.V.R.K. [Indira Ghandi Centre for Atomic Research, Tamilnadu (India)

    2004-12-15

    Processing and instability maps using a dynamic materials model have been developed for 9Cr-1Mo steel in the temperature range 850 to 1200 °C and strain rate range 0.001-100 s⁻¹ with a view to optimising its hot workability. The efficiency of power dissipation increased with increase in temperature and decrease in strain rate. The 9Cr-1Mo material exhibited two dynamic recrystallisation domains, one with a peak efficiency of 37% occurring at 950 °C and 0.001 s⁻¹ and the other with a peak efficiency of 35% occurring at 1200 °C and 0.1 s⁻¹. These results are in good agreement with those found in industry. (author)

  13. An AIDS risk reduction program for Dutch drug users: an intervention mapping approach to planning.

    Science.gov (United States)

    van Empelen, Pepijn; Kok, Gerjo; Schaalma, Herman P; Bartholomew, L Kay

    2003-10-01

    This article presents the development of a theory- and evidence-based AIDS prevention program targeting Dutch drug users and aimed at promoting condom use. The emphasis is on the development of the program using a five-step intervention development protocol called intervention mapping (IM). Preceding Step 1 of the IM process, an assessment of the HIV problem among drug users was conducted. The product of IM Step 1 was a series of program objectives specifying what drug users should learn in order to use condoms consistently. In Step 2, theoretical methods for influencing the most important determinants were chosen and translated into practical strategies that fit the program objectives. The main strategy chosen was behavioral journalism. In Step 3, leaflets with role-model stories based on authentic interviews with drug users were developed and pilot tested. Finally, the need for cooperation with program users is discussed in IM Steps 4 and 5.

  14. Review of Control Techniques for HVAC Systems—Nonlinearity Approaches Based on Fuzzy Cognitive Maps

    Directory of Open Access Journals (Sweden)

    Farinaz Behrooz

    2018-02-01

    Full Text Available Heating, Ventilating, and Air Conditioning (HVAC) systems are the major energy-consuming devices in buildings. Nowadays, due to the high demand for HVAC system installation in buildings, designing an effective controller that decreases the energy consumption of the devices while meeting the thermal comfort demands is one of the most important goals of control designers. The purpose of this article is to investigate the different control methods for Heating, Ventilating, and Air Conditioning and Refrigeration (HVAC & R) systems. The advantages and disadvantages of each control method are discussed, and finally the Fuzzy Cognitive Map (FCM) method is introduced as a new strategy for HVAC systems. The FCM method is an intelligent and advanced control technique to address the nonlinearity, Multiple-Input and Multiple-Output (MIMO), complexity, and coupling-effect features of these systems. The significance of this method and the improvements it offers are compared with other methods.

  15. Adjusting for sampling variability in sparse data: geostatistical approaches to disease mapping.

    Science.gov (United States)

    Hampton, Kristen H; Serre, Marc L; Gesink, Dionne C; Pilcher, Christopher D; Miller, William C

    2011-10-06

    Disease maps of crude rates from routinely collected health data indexed at a small geographical resolution pose specific statistical problems due to the sparse nature of the data. Spatial smoothers allow areas to borrow strength from neighboring regions to produce a more stable estimate of the areal value. Geostatistical smoothers are able to quantify the uncertainty in smoothed rate estimates without a high computational burden. In this paper, we introduce a uniform model extension of Bayesian Maximum Entropy (UMBME) and compare its performance to that of Poisson kriging in measures of smoothing strength and estimation accuracy as applied to simulated data and the real data example of HIV infection in North Carolina. The aim is to produce more reliable maps of disease rates in small areas to improve identification of spatial trends at the local level. In all data environments, Poisson kriging exhibited greater smoothing strength than UMBME. With the simulated data where the true latent rate of infection was known, Poisson kriging resulted in greater estimation accuracy with data that displayed low spatial autocorrelation, while UMBME provided more accurate estimators with data that displayed higher spatial autocorrelation. With the HIV data, UMBME performed slightly better than Poisson kriging in cross-validatory predictive checks, with both models performing better than the observed data model with no smoothing. Smoothing methods have different advantages depending upon both internal model assumptions that affect smoothing strength and external data environments, such as spatial correlation of the observed data. Further model comparisons in different data environments are required to provide public health practitioners with guidelines needed in choosing the most appropriate smoothing method for their particular health dataset.
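
    As a far simpler illustration of "borrowing strength" than either of the models compared here, the sketch below applies empirical Bayes shrinkage of crude small-area rates toward the global mean. The counts, populations, and prior strength are invented, and the method is named plainly as a stand-in for UMBME or Poisson kriging, not a substitute for them.

```python
import numpy as np

cases = np.array([0, 2, 1, 15, 4])         # counts per small area
pop = np.array([50, 400, 80, 3000, 900])   # populations at risk

crude = cases / pop
global_rate = cases.sum() / pop.sum()

# Shrink each area toward the global rate; sparse areas shrink the most.
prior_strength = 500.0                     # illustrative pseudo-population
smoothed = (cases + prior_strength * global_rate) / (pop + prior_strength)

print(np.round(crude, 4))
print(np.round(smoothed, 4))
```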

  16. Adjusting for sampling variability in sparse data: geostatistical approaches to disease mapping

    Directory of Open Access Journals (Sweden)

    Pilcher Christopher D

    2011-10-01

    Full Text Available Abstract Background Disease maps of crude rates from routinely collected health data indexed at a small geographical resolution pose specific statistical problems due to the sparse nature of the data. Spatial smoothers allow areas to borrow strength from neighboring regions to produce a more stable estimate of the areal value. Geostatistical smoothers are able to quantify the uncertainty in smoothed rate estimates without a high computational burden. In this paper, we introduce a uniform model extension of Bayesian Maximum Entropy (UMBME) and compare its performance to that of Poisson kriging in measures of smoothing strength and estimation accuracy as applied to simulated data and the real data example of HIV infection in North Carolina. The aim is to produce more reliable maps of disease rates in small areas to improve identification of spatial trends at the local level. Results In all data environments, Poisson kriging exhibited greater smoothing strength than UMBME. With the simulated data where the true latent rate of infection was known, Poisson kriging resulted in greater estimation accuracy with data that displayed low spatial autocorrelation, while UMBME provided more accurate estimators with data that displayed higher spatial autocorrelation. With the HIV data, UMBME performed slightly better than Poisson kriging in cross-validatory predictive checks, with both models performing better than the observed data model with no smoothing. Conclusions Smoothing methods have different advantages depending upon both internal model assumptions that affect smoothing strength and external data environments, such as spatial correlation of the observed data. Further model comparisons in different data environments are required to provide public health practitioners with guidelines needed in choosing the most appropriate smoothing method for their particular health dataset.

  17. Impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via multiple integral approach.

    Science.gov (United States)

    Chandrasekar, A; Rakkiyappan, R; Cao, Jinde

    2015-10-01

    This paper studies the impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via a multiple integral approach. The array of neural networks is coupled in a random fashion governed by a Bernoulli random variable. The aim of this paper is to obtain synchronization criteria suitable for both exactly known and partly unknown transition probabilities such that the coupled neural network is synchronized with mixed time-delay. The considered impulsive effects can be synchronized at partly unknown transition probabilities. Besides, a multiple integral approach is also proposed to strengthen the Markovian jumping randomly coupled neural networks with partly unknown transition probabilities. By making use of the Kronecker product and some useful integral inequalities, a novel Lyapunov-Krasovskii functional was designed for handling the coupled neural network with mixed delay, and then the impulsive synchronization criteria are solvable in a set of linear matrix inequalities. Finally, numerical examples are presented to illustrate the effectiveness and advantages of the theoretical results.

  18. ReactionMap: an efficient atom-mapping algorithm for chemical reactions.

    Science.gov (United States)

    Fooshee, David; Andronico, Alessio; Baldi, Pierre

    2013-11-25

    Large databases of chemical reactions provide new data-mining opportunities and challenges. Key challenges result from the imperfect quality of the data and the fact that many of these reactions are not properly balanced or atom-mapped. Here, we describe ReactionMap, an efficient atom-mapping algorithm. Our approach uses a combination of maximum common chemical subgraph search and minimization of an assignment cost function derived empirically from training data. We use a set of over 259,000 balanced atom-mapped reactions from the SPRESI commercial database to train the system, and we validate it on random sets of 1000 and 17,996 reactions sampled from this pool. These large test sets represent a broad range of chemical reaction types, and ReactionMap correctly maps about 99% of the atoms and about 96% of the reactions, with a mean time per mapping of 2 s. Most correctly mapped reactions are mapped with high confidence. Mapping accuracy compares favorably with ChemAxon's AutoMapper, versions 5 and 6.1, and the DREAM Web tool. These approaches correctly map 60.7%, 86.5%, and 90.3% of the reactions, respectively, on the same data set. A ReactionMap server is available on the ChemDB Web portal at http://cdb.ics.uci.edu .
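
    The assignment-cost minimization component can be illustrated with SciPy's Hungarian solver on a toy cost matrix. The cost values here are invented; ReactionMap learns its cost function empirically from training reactions, so this is only a sketch of the optimization step, not the tool's implementation.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

cost = np.array([          # rows: reactant atoms, columns: product atoms
    [0.1, 2.0, 3.0],
    [2.5, 0.2, 1.8],
    [3.0, 1.5, 0.3],
])

rows, cols = linear_sum_assignment(cost)   # minimum-cost one-to-one mapping
for r, c in zip(rows, cols):
    print(f"reactant atom {r} -> product atom {c}")
print("total assignment cost:", cost[rows, cols].sum())
```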

  19. A behavioral approach to shared mapping of peripersonal space between oneself and others.

    Science.gov (United States)

    Teramoto, Wataru

    2018-04-03

    Recent physiological studies have shown that some visuotactile brain areas respond to others' peripersonal spaces (PPS) as they would to their own. This study investigates this PPS remapping phenomenon in terms of human behavior. Participants placed their left hands on a tabletop screen where visual stimuli were projected. A vibrotactile stimulator was attached to the tip of their index finger. While a white disk approached or receded from the hand in the participant's near or far space, the participant was instructed to quickly detect a target (vibrotactile stimulation, a change in the moving disk's color, or both). When performing this task alone, the participants exhibited shorter detection times when the disk approached the hand in their near space. In contrast, when performing the task with a partner across the table, the participants exhibited shorter detection times both when the disk approached their own hand in their near space and when it approached the partner's hand in the partner's near space but the participants' far space. This phenomenon was also observed when the body parts from which the visual stimuli approached/receded differed between the participant and partner. These results suggest that humans can share PPS representations and/or body-derived attention/arousal mechanisms with others.

  20. An automatic approach for rice mapping in temperate region using time series of MODIS imagery: first results for Mediterranean environment

    Science.gov (United States)

    Boschetti, M.; Nelson, A.; Manfrom, G.; Brivio, P. A.

    2012-04-01

    Timely and accurate information on crop typology and status is required to support suitable action to better manage agricultural production and reduce food insecurity. More specifically, regional crop masking and phenological information are important inputs for spatialized crop growth models in yield forecasting systems. Digital cartographic data available at the global/regional scale, such as GLC2000, GLOBCOVER or MODIS land cover products (MOD12), are often not adequate for this crop modeling application. For this reason, there is a need to develop and test methods that can provide such information for specific crops using automated classification techniques. In this framework we focused our analysis on the detection of rice cultivation areas due to the importance of this crop. Rice is a staple food for half of the world's population (FAO 2004). Over 90% of the world's rice is produced and consumed in Asia, and the region is home to 70% of the world's poor, most of whom depend on rice for their livelihoods and/or food security. Several initiatives are being promoted at the international level to provide maps of rice cultivated areas in South and South East Asia using different approaches available in the literature for rice mapping in tropical regions. We contribute to these efforts by proposing an automatic method to detect rice cultivated areas in temperate regions exploiting MODIS 8-Day composites of Surface Reflectance at 500 m spatial resolution (MOD09A1 product). Temperate rice is cultivated worldwide in more than 20 countries, covering around 16M ha for a total production of about 65M tons of paddy per year. The proposed method is based on a common approach available in the literature that first identifies flood conditions that can be related to rice agronomic practice and then checks for vegetation growth. The method presents innovative aspects related both to the flood detection, exploiting Short Wave Infrared spectral information, and to the crop growth monitoring, analyzing

  1. Mapping carbon sequestration in forests at the regional scale - a climate biomonitoring approach by example of Germany

    Energy Technology Data Exchange (ETDEWEB)

    Schroeder, Winfried; Pesch, Roland [University of Vechta, Chair of Landscape Ecology, PO Box. 1553, Vechta (Germany)

    2011-12-15

    The United Nations Framework Convention on Climate Change recognizes carbon (C) fixation in forests as an important contribution to the reduction of atmospheric pollution in terms of greenhouse gases. Spatial differentiation of C sequestration in forests, either at the national or at the regional scale, is therefore needed for forest planning purposes. Hence, within the framework of the Forest Focus regulation, the aim of this investigation was to statistically analyse factors influencing C fixation and to use the corresponding associations in a predictive mapping approach at the regional scale, taking the German federal state of North Rhine-Westphalia as an example. The results of the methodical scheme outlined in this article should be compared with an already-published approach applied to the same data used in the investigation at hand. Site-specific data on C sequestration in humus, forest trees/dead wood and soil from two forest monitoring networks were intersected with available surface information on topography, soil, climate and forestal growing areas and districts. Next, the associations between C sequestration and the influencing factors were examined and modelled by linear regression analyses. The resulting regression equations were applied to the surface data to predictively map C sequestration for the entire study area. The computations yielded an estimate of 146.7 million t of C sequestered in the forests of North Rhine-Westphalia, corresponding to 168.6 t/ha. The calculated values correspond well to figures given in the literature. Furthermore, the results are almost identical to those of another pilot study in which a different statistical methodology was applied to the same database. Nevertheless, the underlying regression models explain only a small part of the overall variance of the C fixation. This might mainly be due to data quality aspects and missing influence factors in the analyses. In another

  2. Prediction of soil CO2 flux in sugarcane management systems using the Random Forest approach

    Directory of Open Access Journals (Sweden)

    Rose Luiza Moraes Tavares

    Full Text Available ABSTRACT: The Random Forest algorithm is a data mining technique used for classifying attributes in order of importance to explain the variation in a target attribute, such as soil CO2 flux. This study aimed to identify variables that predict soil CO2 flux in sugarcane management systems through the machine-learning algorithm called Random Forest. Two different management areas of sugarcane in the state of São Paulo, Brazil, were selected: burned and green. In each area, we assembled a sampling grid with 81 georeferenced points to assess soil CO2 flux through an automated portable soil gas chamber with measuring spectroscopy in the infrared during the dry season of 2011 and the rainy season of 2012. In addition, we sampled the soil to evaluate physical, chemical, and microbiological attributes. For data interpretation, we used the Random Forest algorithm, based on the combination of predicted decision trees (machine learning algorithms in which every tree depends on the values of a random vector sampled independently, with the same distribution, for all the trees of the forest). The results indicated that clay content in the soil was the most important attribute for explaining the CO2 flux in the areas studied during the evaluated period. The use of the Random Forest algorithm produced a model with a good fit (R² = 0.80) for predicted and observed values.
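
    The analysis pattern, ranking soil attributes by Random Forest importance for predicting CO2 flux, can be reproduced on synthetic data with scikit-learn. The attribute names and the built-in dependence on clay below are placeholders, not the paper's measurements.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 81                                    # one value per grid point, as above
clay = rng.uniform(10, 60, n)             # % clay (synthetic)
moisture = rng.uniform(5, 40, n)          # % soil moisture (synthetic)
organic_c = rng.uniform(0.5, 3.0, n)      # % organic carbon (synthetic)
flux = 0.05 * clay + 0.01 * moisture + rng.normal(0, 0.2, n)  # clay dominates

X = np.column_stack([clay, moisture, organic_c])
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, flux)

for name, imp in zip(["clay", "moisture", "organic_c"], model.feature_importances_):
    print(f"{name}: {imp:.2f}")           # clay should rank first
```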

  3. Bilateral robotic priming before task-oriented approach in subacute stroke rehabilitation: a pilot randomized controlled trial.

    Science.gov (United States)

    Hsieh, Yu-Wei; Wu, Ching-Yi; Wang, Wei-En; Lin, Keh-Chung; Chang, Ku-Chou; Chen, Chih-Chi; Liu, Chien-Ting

    2017-02-01

    To investigate the treatment effects of bilateral robotic priming combined with the task-oriented approach on motor impairment, disability, daily function, and quality of life in patients with subacute stroke. A randomized controlled trial. Occupational therapy clinics in medical centers. Thirty-one subacute stroke patients were recruited. Participants were randomly assigned to receive bilateral priming combined with the task-oriented approach (i.e., primed group) or the task-oriented approach alone (i.e., unprimed group) for 90 minutes/day, 5 days/week for 4 weeks. The primed group began with the bilateral priming technique by using a bimanual robot-aided device. Motor impairments were assessed by the Fugl-Meyer Assessment, grip strength, and the Box and Block Test. Disability and daily function were measured by the modified Rankin Scale, the Functional Independence Measure, and actigraphy. Quality of life was examined by the Stroke Impact Scale. The primed and unprimed groups improved significantly on most outcomes over time. The primed group demonstrated significantly better improvement on the Stroke Impact Scale strength subscale (p = 0.012) and a trend for greater improvement on the modified Rankin Scale (p = 0.065) than the unprimed group. Bilateral priming combined with the task-oriented approach elicited more improvements in self-reported strength and disability degrees than the task-oriented approach by itself. Further large-scale research with at least 31 participants in each intervention group is suggested to confirm the study findings.

  4. 52 Million Points and Counting: A New Stratification Approach for Mapping Global Marine Ecosystems

    Science.gov (United States)

    Wright, D. J.; Sayre, R.; Breyer, S.; Butler, K. A.; VanGraafeiland, K.; Goodin, K.; Kavanaugh, M.; Costello, M. J.; Cressie, N.; Basher, Z.; Harris, P. T.; Guinotte, J. M.

    2016-12-01

    We report progress on the Ecological Marine Units (EMU) project, a new undertaking commissioned by the Group on Earth Observations (GEO) as a means of developing a standardized and practical global ecosystems classification and map for the oceans, and thus a key outcome of the GEO Biodiversity Observation Network (GEO BON). The project is one of four components of the new GI-14 GEO Ecosystems Initiative within the GEO 2016 Transitional Work plan, and for eventual use by the Global Earth Observation System of Systems (GEOSS). The project is also the follow-on to a comprehensive Ecological Land Units (ELU) project, also commissioned by GEO. The EMU is comprised of a global point-mesh framework created from 52,487,233 points from the NOAA World Ocean Atlas; spatial resolution is ¼° by ¼° by varying depth; temporal resolution is currently decadal; each point has x, y, z, as well as six attributes of chemical and physical oceanographic structure (temperature, salinity, dissolved oxygen, nitrate, silicate, phosphate) that are likely drivers of many ecosystem responses. We implemented a k-means statistical clustering of the point mesh (using the pseudo-F statistic to help determine the number of clusters), allowing us to identify and map 37 environmentally distinct 3D regions (candidate 'ecosystems') within the water column. These units can be attributed according to their productivity, direction and velocity of currents, species abundance, global seafloor geomorphology (from Harris et al.), and much more. A series of data products for open access will share the 3D point mesh and EMU clusters at the surface, bottom, and within the water column, as well as 2D and 3D web apps for exploration of the EMUs and the original World Ocean Atlas data. Future plans include a global delineation of Ecological Coastal Units (ECU) at a much finer spatial resolution (not yet commenced), as well as global ecological freshwater ecosystems (EFUs; in earliest planning stages). We will
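
    The clustering step can be sketched with scikit-learn's k-means on standardized attributes. The toy point mesh, uniform value ranges, and k = 5 below are assumptions; the project itself clustered roughly 52 million points into 37 units.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 10_000
mesh = np.column_stack([
    rng.uniform(-2, 30, n),    # temperature (deg C)
    rng.uniform(30, 38, n),    # salinity
    rng.uniform(0, 9, n),      # dissolved oxygen
    rng.uniform(0, 45, n),     # nitrate
    rng.uniform(0, 180, n),    # silicate
    rng.uniform(0, 3.5, n),    # phosphate
])

X = StandardScaler().fit_transform(mesh)   # put the six attributes on one scale
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))                 # points per candidate 'ecosystem'
```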

  5. Corporate Sustainability integration : development of a framework to map supporting approaches

    NARCIS (Netherlands)

    Witjes, S.; Vermeulen, W.J.V.; Cramer, J.M.

    2015-01-01

    Companies have become more aware of the impact they generate on society. Some companies take up the challenge of converting this awareness into added value for their core business activities. There is an extensive range of Corporate Sustainability approaches (tools, instruments and initiatives)...

  6. THE CONTRIBUTION OF GIS IN FLOOD MAPPING: TWO APPROACHES USING OPEN SOURCE GRASS GIS SOFTWARE

    Directory of Open Access Journals (Sweden)

    R. Marzocchi

    2014-01-01

    In this work we present a comparison between the two models mentioned above and analyse the possibility of integrating the two approaches. We intend to use the 1D model, GIS-embedded if possible, to calculate the water-surface profile along the river axis, and the 2D numerical model to analyse inundation beside the river levees.
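
    As a hedged illustration of the 1D side of such a workflow (the paper's own 1D model is not specified here), the sketch below integrates a gradually-varied-flow water-surface profile upstream from a known downstream depth in a rectangular channel, using Manning's equation for the friction slope. All channel parameters are invented for the example.

```python
# A gradually-varied-flow (GVF) backwater profile for a rectangular channel,
# marched upstream from a downstream control with explicit Euler steps.
# Parameters are illustrative, not from the paper.
import numpy as np

g, n = 9.81, 0.03            # gravity (m/s^2), Manning roughness
Q, b, S0 = 40.0, 10.0, 1e-4  # discharge (m^3/s), channel width (m), bed slope

def dydx(y):
    A = b * y                                     # flow area
    R = A / (b + 2 * y)                           # hydraulic radius
    Sf = (n * Q) ** 2 / (A ** 2 * R ** (4 / 3))   # friction slope (Manning)
    Fr2 = Q ** 2 / (g * b ** 2 * y ** 3)          # Froude number squared
    return (S0 - Sf) / (1 - Fr2)                  # GVF equation

y, dx = 4.0, -10.0            # downstream control depth (m), upstream step (m)
for _ in range(500):          # march 5 km upstream (subcritical flow)
    y += dydx(y) * dx
print(f"depth 5 km upstream: {y:.2f} m")
```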

  7. Mapping the Demographic Landscape of Characters in Recent Dutch Prose : A Quantitative Approach to Literary Representation

    NARCIS (Netherlands)

    van der Deijl, Lucas; Pieterse, S.A.; Prinse, Marion; Smeets, Roel

    2016-01-01

    The lack of ethnic and gender diversity in the Dutch literary domain has recently been the subject of public debate. In the academic context, questions regarding diversity are studied either on a literary-sociological level (institutional approaches) or on the level of the individual...

  8. An empirical InSAR-optical fusion approach to mapping vegetation canopy height

    Science.gov (United States)

    Wayne S. Walker; Josef M. Kellndorfer; Elizabeth LaPoint; Michael Hoppus; James Westfall

    2007-01-01

    Exploiting synergies afforded by a host of recently available national-scale data sets derived from interferometric synthetic aperture radar (InSAR) and passive optical remote sensing, this paper describes the development of a novel empirical approach for the provision of regional- to continental-scale estimates of vegetation canopy height. Supported by data from the...
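
    The abstract does not detail the regression used, so the sketch below shows one plausible form of such an empirical fusion: a random-forest regression that predicts canopy height from an InSAR-derived height proxy plus an optical predictor, trained against reference heights. All data and feature names are synthetic assumptions.

```python
# Hedged sketch of an empirical InSAR-optical fusion: regress reference canopy
# height on an InSAR height proxy and an optical index. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
insar_height = rng.uniform(0, 30, n)   # InSAR scattering-phase-center height proxy (m)
ndvi = rng.uniform(0.1, 0.9, n)        # optical predictor (e.g., NDVI)
canopy_height = 0.8 * insar_height + 5 * ndvi + rng.normal(0, 2, n)  # synthetic truth

X = np.column_stack([insar_height, ndvi])
X_tr, X_te, y_tr, y_te = train_test_split(X, canopy_height, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"held-out R^2: {model.score(X_te, y_te):.2f}")
```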

  9. Random walks on a fluctuating lattice: A renormalization group approach applied in one dimension

    International Nuclear Information System (INIS)

    Levermore, C.D.; Nadler, W.; Stein, D.L.

    1995-01-01

    We study the problem of a random walk on a lattice in which bonds connecting nearest-neighbor sites open and close randomly in time, a situation often encountered in fluctuating media. We present a simple renormalization group technique to solve for the effective diffusive behavior at long times. For one-dimensional lattices we obtain better quantitative agreement with simulation data than earlier effective-medium results. Our technique works in principle in any dimension, although the amount of computation required rises with the dimensionality of the lattice.
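
    A direct Monte Carlo check of the model is easy to set up. The sketch below simulates a simplified fast-fluctuation (annealed) limit, where each attempted step succeeds only if the bond being crossed happens to be open, and estimates the effective diffusion constant from the mean-squared displacement; it simulates the model rather than reproducing the paper's renormalization-group calculation.

```python
# 1D random walk over bonds that open (probability p) or close at every time
# step; effective diffusion constant estimated from <x^2> = 2 D t.
import numpy as np

rng = np.random.default_rng(2)
p, steps, walkers = 0.4, 1000, 5000   # bond-open probability, time steps, ensemble size
x = np.zeros(walkers, dtype=int)

for _ in range(steps):
    attempt = rng.choice([-1, 1], size=walkers)  # proposed step direction
    bond_open = rng.random(walkers) < p          # state of the bond being crossed
    x += attempt * bond_open                     # step succeeds only through an open bond

D_eff = np.mean(x.astype(float) ** 2) / (2 * steps)
print(f"effective D = {D_eff:.3f} (annealed limit is p/2 = {p / 2:.3f})")
```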

  10. Spectral SP: A New Approach to Mapping Reservoir Flow and Permeability

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Donald M. [Univ. of Hawaii, Honolulu, HI (United States). Hawaii Inst. of Geophysics; Lienert, Barry R. [Univ. of Hawaii, Honolulu, HI (United States). Hawaii Inst. of Geophysics; Wallin, Erin L. [Univ. of Hawaii, Honolulu, HI (United States); Gasperikova, Erika [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-05-27

    Our objectives for the current project were to develop an innovative inversion and analysis procedure for magnetotelluric field data and time-variable self-potentials that would enable us to map not only the subsurface resistivity structure of a geothermal prospect but also the permeability distribution within the field. Hence, the ultimate objective was to provide better targeting information for exploratory and development drilling of a geothermal prospect. Field data were collected and analyzed from the Kilauea Summit, the Kilauea East Rift Zone, and the Humuula Saddle between Mauna Loa and Mauna Kea volcanoes. All of these areas were known or suspected to have geothermal activity of varying intensities. Although our results provided evidence for significant long-term coordinated changes in spontaneous potential that could be associated with subsurface flows, significant interference was encountered from surface environmental changes (rainfall, temperature), rendering it nearly impossible to unequivocally distinguish between deep fluid-flow changes and environmental effects. Further, the analysis of the inferred spontaneous-potential changes in terms of signal depth, and hence permeability horizons, could not be completed in the time available.
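
    The abstract does not say how (or whether) the environmental interference could be removed; one common first-pass approach, sketched below purely for illustration, is to regress the self-potential record on rainfall and temperature proxies and inspect the residual for a flow-related signal. All data below are synthetic.

```python
# Least-squares removal of environmental (rainfall, temperature) effects from a
# self-potential record; the residual retains an unexplained drift standing in
# for a flow-related signal. Entirely synthetic example.
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(1000.0)                        # time (hours)
rain = rng.exponential(1.0, t.size)          # rainfall proxy
temp = 20 + 5 * np.sin(2 * np.pi * t / 24)   # diurnal temperature
sp = 0.8 * rain - 0.3 * temp + 0.002 * t + rng.normal(0, 0.5, t.size)  # SP record

A = np.column_stack([np.ones_like(t), rain, temp])  # design matrix: intercept + proxies
coef, *_ = np.linalg.lstsq(A, sp, rcond=None)
residual = sp - A @ coef                            # what the proxies do not explain
print(f"residual trend = {np.polyfit(t, residual, 1)[0]:.4f} per hour")
```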

  11. MAPPING OF FLOOD SUSCEPTIBILITY IN CAMPINA GRANDE COUNTY - PB: A SPATIAL MULTICRITERIA APPROACH

    Directory of Open Access Journals (Sweden)

    Priscila Barros Ramalho Alves

    The social and economic impacts caused by floods in urban areas are diverse and increase as the land becomes progressively impervious. Given the increasing urbanization of cities, it is necessary to improve the planning process and to optimize the management and occupation of urban spaces; the government therefore needs reliable, useful data for decision-making, and GIS plays an important role among urban planning instruments. Given the current situation in Campina Grande County, Paraiba State, Brazil - an area continually facing disturbances caused by occasional, concentrated rainfall - the current study aims to map the areas most susceptible to floods, using a GIS-based Multi-Criteria Decision Analysis (MCDA) model. Five quantitative criteria are considered in the analysis: slope, altitude, roads with drainage infrastructure, distance from water bodies, and land use. The analysis is performed pixel by pixel, based on predetermined assumptions: fuzzy membership functions were developed and overlay operations were performed. The results were consistent with historical records and with previous studies of the county, adding reliability to the model, which can be considered a potential management instrument for the case-study area as well as for cities facing similar issues.
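
    The pixel-by-pixel MCDA step lends itself to a compact sketch: each criterion raster is rescaled to [0, 1] with a fuzzy membership function and the layers are combined in a weighted overlay. The weights, membership breakpoints, and subset of criteria below are invented for the example, not those of the Campina Grande study.

```python
# Fuzzy-membership rescaling of criterion rasters followed by a weighted
# overlay, producing a flood-susceptibility surface. Toy data throughout.
import numpy as np

rng = np.random.default_rng(3)
shape = (100, 100)                        # toy raster grid
slope = rng.uniform(0, 30, shape)         # percent slope
altitude = rng.uniform(500, 700, shape)   # metres
dist_water = rng.uniform(0, 2000, shape)  # metres from water bodies

def fuzzy_linear_decreasing(x, lo, hi):
    """Membership 1 at/below lo, 0 at/above hi, linear in between."""
    return np.clip((hi - x) / (hi - lo), 0.0, 1.0)

# Flatter, lower, and closer to water => more flood-susceptible.
m_slope = fuzzy_linear_decreasing(slope, 2, 15)
m_alt = fuzzy_linear_decreasing(altitude, 520, 650)
m_dist = fuzzy_linear_decreasing(dist_water, 100, 1000)

weights = np.array([0.4, 0.3, 0.3])       # illustrative criterion weights
susceptibility = weights[0] * m_slope + weights[1] * m_alt + weights[2] * m_dist
print(f"mean susceptibility: {susceptibility.mean():.2f}")
```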

  12. A noninvasive approach to quantitative functional brain mapping with H₂¹⁵O and positron emission tomography

    International Nuclear Information System (INIS)

    Fox, P.T.; Mintun, M.A.; Raichle, M.E.; Herscovitch, P.

    1984-01-01

    Positron emission tomographic (PET) measurements of regional cerebral blood flow (rCBF) with intravenously administered ¹⁵O-labeled water and an adaptation of the Kety autoradiographic model are well suited to the study of functional-anatomical correlations within the human brain. This model requires arterial blood sampling to determine rCBF from the regional tissue radiotracer concentration (Cr) recorded by the tomograph. Based upon the well-defined, nearly linear relation between Cr and rCBF inherent in the model, we have developed a method for estimating changes in rCBF from changes in Cr without calculating true rCBF, and thus without arterial sampling. This study demonstrates that quantitative functional brain mapping does not require the determination of rCBF from Cr when regional neuronal activation is expressed as the change in rCBF from an initial, resting-state measurement. Patterned-flash visual stimulation was used to produce a wide range of increases in rCBF within the striate cortex. Changes in occipital rCBF were found to be accurately estimated directly from Cr over a series of 56 measurements on eight subjects. This adaptation of the PET/autoradiographic method simplifies its application and makes it more acceptable to the subject.
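
    Numerically, the method reduces to reading activation as a fractional change in counts. The minimal sketch below, with entirely synthetic values, estimates the percent change in rCBF directly from the percent change in Cr, with no arterial input function.

```python
# Because Cr is nearly linear in rCBF over the physiological range, the change
# in rCBF from a resting baseline can be estimated from the change in counts.
# All values are synthetic.
import numpy as np

cr_rest = np.array([100.0, 98.0, 103.0])   # resting-state regional counts
cr_stim = np.array([118.0, 111.0, 125.0])  # counts during visual stimulation

delta_rcbf_pct = 100.0 * (cr_stim - cr_rest) / cr_rest  # % change in rCBF ~ % change in Cr
print(delta_rcbf_pct)  # e.g. increases of ~18, ~13, and ~21 percent
```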

  13. Integrative computational and experimental approaches to establish a post-myocardial infarction knowledge map.

    Directory of Open Access Journals (Sweden)

    Nguyen T Nguyen

    2014-03-01

    Vast research efforts have been devoted to providing clinical diagnostic markers of myocardial infarction (MI), leading to over one million abstracts associated with "MI" and "Cardiovascular Diseases" in PubMed. The accumulation of these results poses a challenge for their integration and interpretation. To address this problem and to better understand how the left ventricle (LV) remodels post-MI at both the molecular and cellular levels, we propose here an integrative framework that couples computational methods and experimental data. We selected an initial set of MI-related proteins from published human studies and constructed an MI-specific protein-protein interaction network (MIPIN). Structural and functional analysis of the MIPIN showed that the post-MI LV exhibited increased representation of proteins involved in transcriptional activity, inflammatory response, and extracellular matrix (ECM) remodeling. Known plasma or serum expression changes of the MIPIN proteins in patients with MI were acquired by data mining of the PubMed and UniProt knowledgebases, and served as a training set to predict unlabeled MIPIN protein changes post-MI. The predictions were validated with published r...
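
    As a schematic illustration of the network-construction step (with a toy edge list, not the curated MI-specific interactions used in the study), the sketch below builds a small protein-protein interaction graph and ranks proteins by degree and betweenness centrality to flag structural hubs.

```python
# Build a toy protein-protein interaction graph and rank proteins by degree
# and betweenness centrality. Edge list is an illustrative placeholder.
import networkx as nx

edges = [("TNF", "IL6"), ("TNF", "NFKB1"), ("IL6", "STAT3"),
         ("STAT3", "MMP9"), ("NFKB1", "MMP9"), ("MMP9", "COL1A1")]
G = nx.Graph(edges)

degree = dict(G.degree())
betweenness = nx.betweenness_centrality(G)
hubs = sorted(G.nodes, key=lambda p: (degree[p], betweenness[p]), reverse=True)
for p in hubs[:3]:
    print(f"{p}: degree={degree[p]}, betweenness={betweenness[p]:.2f}")
```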